PyNeon
PyNeon is a lightweight Python package designed to streamline the processing and analysis of multimodal data from the Neon eye-tracking system (Pupil Labs GmbH). This community-driven effort provides a versatile set of tools to work with Neon’s rich data, including gaze, eye states, IMU, video, events, and more.
PyNeon supports both the native data format (as stored on the Companion device) and the Pupil Cloud data format. We acknowledge the pupil-labs/pl-neon-recording project, which inspired the design of PyNeon.
Documentation for PyNeon is available at https://ncc-brain.github.io/PyNeon/, which includes detailed references for classes and functions, as well as step-by-step tutorials presented as Jupyter notebooks.
Key Features
(Tutorial) Easy API for reading in datasets, recordings, or individual modalities of data.
(Tutorial) Various preprocessing functions, including data cropping, interpolation, and concatenation.
(Tutorial) Flexible epoching of data for trial-based analysis.
(Tutorial) Methods for working with scene video, including scanpath estimation and AprilTags-based mapping.
(Tutorial) Export to Motion-BIDS (and, in the future, Eye-Tracking-BIDS) format for interoperability across the cognitive neuroscience community.
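To give a feel for the epoching workflow listed above, the sketch below slices a timestamped gaze stream into fixed windows around event onsets using synthetic data and plain pandas. This is a conceptual illustration only, not the PyNeon API; the sampling rate, window bounds, and column names are all assumptions chosen for the example.

```python
# Conceptual illustration (NOT the PyNeon API) of trial-based epoching:
# cut a continuous gaze stream into windows around event onsets.
import numpy as np
import pandas as pd

# Simulated gaze samples at 200 Hz for 10 s, with integer nanosecond
# timestamps (Neon recordings also timestamp data in nanoseconds).
ts = np.arange(0, 10_000_000_000, 5_000_000)
gaze = pd.DataFrame({"timestamp": ts, "x": np.random.rand(ts.size)})

# Hypothetical event onsets (e.g., stimulus presentations), in ns
onsets = [2_000_000_000, 5_000_000_000, 8_000_000_000]

# Epoch window: 100 ms before to 500 ms after each onset
before, after = 100_000_000, 500_000_000
epochs = [
    gaze[(gaze["timestamp"] >= t - before) & (gaze["timestamp"] < t + after)]
    for t in onsets
]
print(len(epochs), len(epochs[0]))  # 3 epochs of 120 samples each
```

PyNeon's own epoching methods additionally handle cross-modality alignment and metadata; see the epoching tutorial in the documentation for the actual interface.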
Installation
To install the development version of PyNeon:
pip install git+https://github.com/ncc-brain/PyNeon.git
A PyPI release is planned for the future.
Citing PyNeon
If you use PyNeon in your research, please cite the accompanying paper as follows:
@misc{pyneon,
  title={PyNeon: A Python package for the analysis of Neon multimodal mobile eye-tracking data},
  url={osf.io/preprints/psyarxiv/y5jmg_v2},
  DOI={10.31234/osf.io/y5jmg_v2},
  publisher={PsyArXiv},
  author={Chu, Qian and Hartel, Jan-Gabriel and Lepauvre, Alex and Melloni, Lucia},
  year={2025},
  month={August}
}