Research Data Leeds Repository
Learning manipulation planning from VR human demonstrations
Citation
Hasan, Mohamed and Warburton, Matthew and Agboh, Wisdom C. and Dogar, Mehmet R. and Leonetti, Matteo and Wang, He and Mushtaq, Faisal and Mon-Williams, Mark and Cohn, Anthony G. (2020) Learning manipulation planning from VR human demonstrations. University of Leeds. [Dataset] https://doi.org/10.5518/780
Dataset description
The objective of this project is to learn high-level manipulation planning skills from humans and to transfer these skills to robot planners. We used virtual reality to generate data from human participants whilst they reached for objects on a cluttered tabletop. From this, we devised a qualitative representation of the task space to abstract human decisions, irrespective of the number of objects in the way. Based on this representation, the human demonstrations were segmented and used to train decision classifiers. Using these classifiers, our planner produces a list of waypoints in the task space. These waypoints provide a high-level plan that can be transferred to an arbitrary robot model. The VR dataset is released here.
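To illustrate the learn-then-plan pipeline described above, the sketch below shows one possible way to train a decision classifier on qualitatively encoded demonstration states and roll it out into a waypoint list. The feature encoding, labels, and the plan_waypoints helper are hypothetical illustrations under assumed conventions, not the released dataset's format or the authors' implementation.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Toy qualitative state features: [clutter_left_of_target, clutter_right_of_target, path_clear].
# Each labelled example pairs one abstract state with the waypoint region the
# demonstrator moved to next (as segmented from a VR trajectory).
X_demo = np.array([
    [1, 0, 0],
    [0, 1, 0],
    [0, 0, 1],
])
y_demo = np.array(["go_around_left", "go_around_right", "reach_target"])

clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(X_demo, y_demo)


def plan_waypoints(state, max_steps=10):
    """Roll the classifier out into a list of high-level waypoints.

    The toy update below simply marks the path as clear after each manoeuvre;
    a real planner would recompute the qualitative state from the scene.
    """
    waypoints = []
    for _ in range(max_steps):
        action = clf.predict(np.array([state]))[0]
        waypoints.append(action)
        if action == "reach_target":
            break
        state = [0, 0, 1]
    return waypoints


print(plan_waypoints([1, 0, 0]))  # e.g. ['go_around_left', 'reach_target']
```

The waypoint labels stand in for abstract regions of the task space; grounding them as motion targets for a specific robot model would be a separate, robot-dependent step.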
Subjects:
  H000 - Engineering > H600 - Electronic & electrical engineering > H670 - Robotics & cybernetics > H671 - Robotics
  I000 - Computer sciences > I400 - Artificial intelligence > I460 - Machine learning
  C000 - Biological sciences > C800 - Psychology
  H000 - Engineering > H600 - Electronic & electrical engineering > H670 - Robotics & cybernetics > H674 - Virtual reality engineering
Divisions: Faculty of Engineering and Physical Sciences > School of Computing
Related resources:
License: Creative Commons Attribution 4.0 International (CC BY 4.0)
Date deposited: 01 May 2020 11:05
URI: https://archive.researchdata.leeds.ac.uk/id/eprint/664