Research Data Leeds Repository

Study on the Semantics of Spatial Language - dataset

Citation

Richard-Bollans, Adam (2020) Study on the Semantics of Spatial Language - dataset. University of Leeds. [Dataset] https://doi.org/10.5518/764

This item is part of the Spatial Prepositions and Situated Dialogue collection.

Dataset description

Building on a previous study to collect rich data on spatial prepositions, we have conducted a study to gather annotations related to spatial language in 3D virtual environments. The data collection environment is built using the Unity3D modelling software and game engine. The study comprised two tasks: a Preposition Selection Task and a Comparative Task. The Preposition Selection Task allows for the collection of categorical data, while the Comparative Task provides typicality judgements.

In the Preposition Selection Task, participants are shown a figure-ground pair (highlighted and with a text description) and asked to select all prepositions in the list which fit the configuration. Participants may select 'None of the above' if they deem none of the prepositions appropriate. In the Comparative Task, a description is given with a single preposition and ground object, leaving the figure ambiguous. Participants are asked to select the object in the scene which best fits the description. Again, participants can select none if they deem none of the objects appropriate.

In both tasks, participants are given a first-person view of an indoor scene which they can navigate using the mouse and keyboard. To allow for easy selection, objects in the scene are indivisible entities, e.g. a table in the scene can be selected but not a particular table leg. We limited both tasks to the prepositions 'in', 'inside', 'on', 'on top of', 'against', 'over', 'above', 'under' and 'below'.

The current dataset is from an online study in which participants were recruited via internal mailing lists and through friends and family. For the study, 67 separate scenes were created in order to capture a variety of tabletop configurations. Each participant first performed the Preposition Selection Task on 10 randomly selected scenes and then the Comparative Task on 10 randomly selected scenes, which took participants roughly 10 minutes. Some scenes were removed towards the end of the study to make sure each scene was completed at least 3 times for each task. 32 native English speakers participated in the Preposition Selection Task, providing 635 annotations, and 29 participated in the Comparative Task, providing 1379 annotations.

As the study was hosted online, participants were first asked to demonstrate basic competence. This was assessed by showing participants two simple scenes with an unambiguous description of an object; participants are asked to select the object which best fits the description, in a similar way to the Comparative Task. If the participant makes an incorrect guess in either scene, they are taken back to the start menu.
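To make the distinction between the two kinds of data concrete, the sketch below shows one way Comparative Task selections could be turned into graded typicality scores (the proportion of participants choosing each candidate figure for a given description). The record layout, field names and example values are hypothetical illustrations only; they are not taken from the dataset or from its process_data.py analysis script.

```python
from collections import Counter, defaultdict

# Hypothetical Comparative Task annotations: each record says which object a
# participant picked (or None) for a "<preposition> the <ground>" description
# in a given scene. Field names and values are invented for illustration.
comparative_annotations = [
    {"scene": "scene12", "preposition": "on", "ground": "table", "selection": "mug"},
    {"scene": "scene12", "preposition": "on", "ground": "table", "selection": "mug"},
    {"scene": "scene12", "preposition": "on", "ground": "table", "selection": "book"},
    {"scene": "scene12", "preposition": "on", "ground": "table", "selection": None},
]

def typicality_scores(annotations):
    """Proportion of participants selecting each candidate figure for each
    (scene, preposition, ground) description. 'None' selections count towards
    the denominator but are not treated as candidate figures."""
    totals = Counter()
    picks = defaultdict(Counter)
    for a in annotations:
        key = (a["scene"], a["preposition"], a["ground"])
        totals[key] += 1
        if a["selection"] is not None:
            picks[key][a["selection"]] += 1
    return {
        key: {obj: n / totals[key] for obj, n in counts.items()}
        for key, counts in picks.items()
    }

print(typicality_scores(comparative_annotations))
# {('scene12', 'on', 'table'): {'mug': 0.5, 'book': 0.25}}
```

Graded scores of this kind are what distinguish the Comparative Task's typicality judgements from the categorical accept/reject data collected in the Preposition Selection Task.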

Additional information: Correction: A minor error in process_data.py, in the code that reads the list of prepositions associated with each annotation, caused some inaccuracies when calculating and comparing annotator agreement. This has been corrected and the corresponding results have been updated. The directory structure of the archive has also been improved so that the analysis can be run without modifying the directory structure. (Change made 10/11/2020)
Keywords: Spatial Prepositions, Spatial Semantics, Human Robot Interaction, Natural Language Processing, Referring Expression Generation
Subjects: I000 - Computer sciences > I400 - Artificial intelligence
Divisions: Faculty of Engineering and Physical Sciences > School of Computing
Related resources:
http://eprints.whiterose.ac.uk/156866/ (Publication)
https://doi.org/10.3233/FAIA200341 (Publication)
https://doi.org/10.1016/j.cogsys.2022.09.004 (Publication)
https://eprints.whiterose.ac.uk/194132/ (Publication)
https://etheses.whiterose.ac.uk/28893/ (Ethesis)
Date deposited: 14 Feb 2020 18:34
URI: https://archive.researchdata.leeds.ac.uk/id/eprint/643

Files

Documentation
Data
Program
