Omnidirectional gaze data: feedback from the creation process of a new 360° video, head and gaze dataset

David Erwan (1), Coutrot Antoine (2), Perreira Da Silva Matthieu (1), Gutiérrez Jesus (1), Le Callet Patrick (1)

1 - Laboratoire des Sciences du Numérique de Nantes (France); 2 - Laboratoire des Sciences du Numérique de Nantes (France)

Recent advances in virtual reality Head-Mounted Displays (HMDs) and embedded eye-tracking systems have opened new opportunities for the study of visual attention. A VR headset's strongest feature is the real-time display of omnidirectional content: users experience full 360° scenes thanks to rotation and translation tracking of the HMD, and coupled with powerful eye-tracking technology this allows head and eye movements to be studied precisely. In a recent free-viewing experiment, participants wearing a VR headset explored dynamic 360° stimuli. They watched videos lasting 20 seconds each, starting each viewing trial at either longitude 0° or 180° (the center of the equirectangular content or the opposite side). Gaze and head rotation data were collected. The gathered data, processed into scanpaths (gaze data) and trajectories (head data), were released publicly together with saliency maps, 19 stimuli, and a toolbox implementing similarity measures for saliency maps and for scanpaths/trajectories. This dataset will help the visual attention community understand the deployment of visual attention in dynamic 360° scenes; it is also more ecologically valid than usual setups, which constrain the neck and display content on a flat screen. Visual attention in VR has applications in content encoding, compression and transmission, quality evaluation, foveated rendering, etc. We propose to reflect on the development of this dataset and on the theoretical and practical issues that arose in relation to head and gaze data processing and to similarity measures in the 360° domain.
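
To make the link between gaze data and equirectangular frames concrete, the following is a minimal Python sketch of mapping a 3D gaze direction onto pixel coordinates of an equirectangular frame. It is illustrative only and not the released toolbox's API; the function name, the axis convention (x toward longitude 0°, z toward the north pole), and the frame size are assumptions.

```python
import numpy as np

def gaze_to_equirectangular(direction, width, height):
    """Map a 3D gaze direction (unit vector) to pixel coordinates
    in an equirectangular frame of size (width, height).

    Assumed convention (hypothetical): x points toward longitude 0°,
    y toward longitude 90°, z toward the north pole.
    """
    x, y, z = direction / np.linalg.norm(direction)
    lon = np.arctan2(y, x)                    # longitude in [-pi, pi]
    lat = np.arcsin(np.clip(z, -1.0, 1.0))    # latitude in [-pi/2, pi/2]
    # Equirectangular mapping: longitude -> column, latitude -> row.
    u = (lon / (2.0 * np.pi) + 0.5) * width
    v = (0.5 - lat / np.pi) * height
    return u, v

# A gaze straight at longitude 0° lands at the frame center:
# gaze_to_equirectangular(np.array([1.0, 0.0, 0.0]), 2048, 1024) -> (1024.0, 512.0)
```

Under this convention, the two starting conditions of the experiment correspond to viewing directions (1, 0, 0) for longitude 0° and (-1, 0, 0) for longitude 180°.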
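
The similarity measures mentioned above hide a subtlety of the 360° domain: the equirectangular projection oversamples polar regions, so a plain pixel-wise metric over-weights the poles. A common correction is to weight each row by the cosine of its latitude. The sketch below applies this idea to a latitude-weighted Pearson correlation (CC) between two saliency maps; weighted_cc is a hypothetical name, and this is one possible correction, not necessarily the one implemented in the released toolbox.

```python
import numpy as np

def weighted_cc(map_a, map_b):
    """Latitude-weighted Pearson correlation between two
    equirectangular saliency maps of identical shape.

    Each row is weighted by cos(latitude) to compensate for the
    oversampling of polar regions in the equirectangular projection.
    (Illustrative sketch, not the toolbox's actual implementation.)
    """
    assert map_a.shape == map_b.shape
    h, w = map_a.shape
    # Latitude of each row center, from +pi/2 (top) to -pi/2 (bottom).
    lat = (0.5 - (np.arange(h) + 0.5) / h) * np.pi
    weights = np.repeat(np.cos(lat)[:, None], w, axis=1)

    def wmean(m):
        return np.sum(weights * m) / np.sum(weights)

    a = map_a - wmean(map_a)
    b = map_b - wmean(map_b)
    cov = np.sum(weights * a * b)
    var_a = np.sum(weights * a * a)
    var_b = np.sum(weights * b * b)
    return cov / np.sqrt(var_a * var_b)
```

The same weighting applies to other saliency metrics (e.g., KL divergence or NSS) whenever maps are compared in equirectangular coordinates rather than on the sphere itself.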