PyNeon
PyNeon is a lightweight Python package designed to streamline the processing and analysis of multimodal eye-tracking data from the Neon eye-tracking system (Pupil Labs GmbH). This community-driven effort provides a versatile set of tools to work with Neon’s rich data, including gaze, eye states, IMU, video, events, and more.
Currently, PyNeon supports the Timeseries Data or Timeseries Data + Scene Video formats of data downloaded from Pupil Cloud. For reading data in the native format, please refer to the pl-neon-recording project, which inspired the design of PyNeon.
Documentation for PyNeon is available at https://ncc-brain.github.io/PyNeon/, which includes detailed references for classes and functions, as well as step-by-step tutorials presented as Jupyter notebooks.
Key Features
(Tutorial) Easy API for reading in datasets and recordings. Quick access to various modalities of data.
(Tutorial) Various preprocessing functions, including data cropping, interpolation, concatenation, etc.
(Tutorial) Flexible epoching of data for trial-based analysis.
(Tutorial) Methods for working with scene video, including scanpath estimation and AprilTags-based mapping.
(Tutorial) Export to the Motion-BIDS (and forthcoming Eye-Tracking-BIDS) format for interoperability across the cognitive neuroscience community.
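The trial-based epoching idea above can be sketched in plain pandas. This is a minimal illustration of the concept, not PyNeon's API; the Neon-style column names, sampling rate, trial onsets, and epoch window below are all assumed for the example:

```python
import numpy as np
import pandas as pd

# Synthetic gaze-like timeseries: 10 s at an assumed 200 Hz, with
# integer nanosecond timestamps in the style of Neon exports.
ts = np.arange(0, 10_000_000_000, 5_000_000)  # step = 5 ms in ns
rng = np.random.default_rng(0)
gaze = pd.DataFrame({
    "time [ns]": ts,
    "gaze x [px]": rng.normal(800, 50, ts.size),  # made-up signal
})

# Hypothetical trial onsets (ns); epoch from -0.2 s to +1.0 s around each.
onsets = [2_000_000_000, 5_000_000_000, 8_000_000_000]
before, after = 200_000_000, 1_000_000_000

epochs = []
for i, onset in enumerate(onsets):
    mask = (gaze["time [ns]"] >= onset - before) & (gaze["time [ns]"] < onset + after)
    epoch = gaze.loc[mask].copy()
    epoch["trial"] = i
    epoch["time [ns]"] -= onset  # re-reference time to trial onset
    epochs.append(epoch)

epoched = pd.concat(epochs, ignore_index=True)
```

Each epoch here spans 1.2 s (240 samples at 200 Hz), with timestamps re-referenced so that 0 marks the trial onset; the tutorials show how PyNeon performs this on real recordings.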
Installation
To install PyNeon, clone the PyNeon repository from ncc-brain/PyNeon and, from its root directory, run:
pip install .
PyPI and conda releases are planned for the future.
Citing PyNeon
If you use PyNeon in your research, please cite the accompanying paper as follows:
@misc{pyneon,
  title={PyNeon: a Python package for the analysis of Neon multimodal mobile eye-tracking data},
  url={osf.io/preprints/psyarxiv/y5jmg_v1},
  doi={10.31234/osf.io/y5jmg_v1},
  publisher={PsyArXiv},
  author={Chu, Qian and Hartel, Jan-Gabriel and Lepauvre, Alex and Melloni, Lucia},
  year={2025},
  month={Jun}
}