NivLink is a lightweight toolkit originally built to support eyetracking and pupillometry experiments in the Niv Lab. The overarching goal is to develop fast, flexible, and robust software for preprocessing EyeLink data collected as part of ongoing and future experiments. The design goals for this package are straightforward: fast, flexible, and robust preprocessing of EyeLink data.
With that in mind, we are open to contributions. NivLink is meant to benefit anyone analyzing EyeLink data, and as such, we welcome enhancements that will benefit users of the package. Before starting new code, we highly recommend opening an issue on the NivLink GitHub to discuss potential changes.
NivLink follows a simple organizational layout. The nivlink package itself comprises two primary modules, Screen and preprocessing, and secondary modules particular to specific datasets, e.g. fht.
Functions for representing experimental stimuli and their associated spatial areas of interest belong in the Screen module. Functions for preprocessing eyetracking data that are likely to generalize across experiments belong in the preprocessing module.
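To make the division of labor concrete, here is a minimal sketch of the kind of area-of-interest (AoI) lookup the Screen module is responsible for. The function name, signature, and rectangular-AoI representation are illustrative assumptions, not the package's actual API.

```python
import numpy as np

def assign_aoi(gaze_xy, aois):
    """Hypothetical AoI lookup: label each (x, y) gaze sample with the
    index of the rectangular AoI containing it, or -1 if it falls
    outside every AoI.

    gaze_xy : (n_samples, 2) array of gaze coordinates in pixels.
    aois    : list of (xmin, xmax, ymin, ymax) tuples in pixels.
    """
    gaze_xy = np.asarray(gaze_xy, dtype=float)
    labels = np.full(len(gaze_xy), -1, dtype=int)
    x, y = gaze_xy[:, 0], gaze_xy[:, 1]
    for k, (xmin, xmax, ymin, ymax) in enumerate(aois):
        # Half-open bounds so adjacent AoIs do not double-claim a pixel.
        inside = (x >= xmin) & (x < xmax) & (y >= ymin) & (y < ymax)
        labels[inside] = k
    return labels
```

Because the mapping from pixels to AoIs depends only on the stimulus layout, it fits naturally in Screen, while anything that touches the raw sample stream (blink interpolation, epoching, etc.) belongs in preprocessing.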
Preprocessing functions required by and particular to specific datasets (e.g. fht_epoching) should be stored in a separate *.py file. This organizational structure helps to demarcate which tools are appropriate for some or all of the datasets collected by the lab.
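As an illustration, a dataset-specific module might look like the sketch below. The body of fht_epoching here is an assumption (onsets given as sample indices, a fixed window in samples); only the function's name comes from the text above.

```python
import numpy as np

def fht_epoching(samples, onsets, tmin, tmax):
    """Hypothetical sketch of a dataset-specific epoching step.

    Slice a continuous recording into fixed-length trial epochs.

    samples : 1d array of continuous eyetracking samples.
    onsets  : event onset indices, in samples.
    tmin, tmax : epoch window relative to each onset, in samples
                 (half-open, so each epoch has tmax - tmin samples).
    """
    samples = np.asarray(samples)
    return np.stack([samples[i + tmin : i + tmax] for i in onsets])
```

Keeping a function like this in its own fht module (rather than in preprocessing) signals that its assumptions about trial structure apply only to that dataset.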
Screen and preprocessing must have unit tests executable with pytest and Travis-CI. Unit tests should be placed in a test_*.py file in the test folder.
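A new test file only needs to follow pytest's conventions: a test_*.py filename and plain assert statements in functions whose names start with test_. The sketch below uses a toy stand-in function so it is self-contained; the real tests would import from nivlink instead.

```python
# test_epoching.py -- minimal pytest-style example with a toy function.
import numpy as np

def epoch(data, onsets, n_samples):
    """Toy stand-in for a preprocessing function under test."""
    return np.stack([data[o:o + n_samples] for o in onsets])

def test_epoch_shape():
    # pytest discovers any test_* function and runs its bare asserts.
    data = np.arange(10.0)
    epochs = epoch(data, [0, 5], 3)
    assert epochs.shape == (2, 3)
    assert list(epochs[1]) == [5.0, 6.0, 7.0]
```

Running pytest from the repository root will collect and execute every test_*.py file in the test folder, and Travis-CI can run the same command on each push.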