This repository contains code for LENS - Locational Encoding with Neuromorphic Systems. LENS combines neuromorphic algorithms, sensors, and hardware to perform accurate, real-time robotic localization using visual place recognition (VPR).
LENS performs VPR with the SynSense Speck™ development kits, which pair a dynamic vision sensor with a neuromorphic System-on-Chip processor for real-time, energy-efficient localization.
LENS can also be used with conventional CPU, GPU, and Apple Silicon (MPS) devices to perform event-based VPR thanks to the Sinabs spiking network architecture.
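To illustrate the event-based computation a spiking network performs, here is a minimal leaky integrate-and-fire (LIF) neuron sketch in plain Python. This is a generic illustration only, not the LENS or Sinabs implementation; the parameter values are placeholders.

```python
def lif_forward(events, tau=0.9, threshold=1.5):
    """Simulate one leaky integrate-and-fire (LIF) neuron over a
    sequence of binary input events. Illustrative only -- the
    dynamics and parameters are generic, not taken from LENS."""
    v = 0.0                 # membrane potential
    spikes = []
    for x in events:
        v = tau * v + x     # leaky integration of input events
        if v >= threshold:  # fire and reset when threshold is crossed
            spikes.append(1)
            v = 0.0
        else:
            spikes.append(0)
    return spikes

# A burst of events must accumulate before the neuron fires:
print(lif_forward([1, 1, 0, 0, 1, 1]))  # → [0, 1, 0, 0, 0, 1]
```

Because spikes are only emitted when enough input accumulates, computation is sparse and event-driven, which is what makes this style of network efficient on neuromorphic hardware.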
For more information, please visit the LENS Documentation.
For reproducibility and simplicity, we use pixi for package management and installation. If not already installed, please run the following command in your terminal:
curl -fsSL https://pixi.sh/install.sh | bash
You will be prompted to restart your terminal once installed. For more information, please refer to the pixi documentation.
Run the following in your terminal to clone the LENS repository and navigate to the project directory:
git clone [email protected]:AdamDHines/LENS.git
cd LENS
For alternative package and dependency installation, please see the LENS documentation.
Get started with our demo dataset and pre-trained model to evaluate the system. Run the following in your terminal to see the demo:
pixi run demo
Train and evaluate a new model with our ultra-fast learning method on the provided demo dataset by running the following in your terminal:
pixi run train
pixi run evaluate
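Under the hood, VPR evaluation typically compares query descriptors against a reference database and checks whether the best match is the correct place. Here is a minimal NumPy sketch of that idea; it is a generic illustration, not the LENS evaluation code, and the descriptor shapes are arbitrary.

```python
import numpy as np

def recall_at_1(query_desc, ref_desc, ground_truth):
    """Fraction of queries whose nearest reference descriptor
    (by cosine similarity) is the ground-truth place index.
    Illustrative only; LENS's own metrics may differ."""
    q = query_desc / np.linalg.norm(query_desc, axis=1, keepdims=True)
    r = ref_desc / np.linalg.norm(ref_desc, axis=1, keepdims=True)
    sim = q @ r.T                  # cosine similarity matrix
    best = np.argmax(sim, axis=1)  # best-matching reference per query
    return float(np.mean(best == np.asarray(ground_truth)))

rng = np.random.default_rng(0)
refs = rng.standard_normal((5, 8))                     # 5 places, 8-dim descriptors
queries = refs + 0.05 * rng.standard_normal((5, 8))    # noisy revisit of each place
print(recall_at_1(queries, refs, [0, 1, 2, 3, 4]))     # high recall expected
```

Recall@1 is a common VPR metric: it rewards a system only when its single best match is the true place, which is the regime a robot localizing in real time cares about.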
For a full guide on training and evaluating your own datasets, please visit the LENS documentation.
To get the best localization performance on benchmark or custom datasets, you can tune your network hyperparameters using Weights & Biases through our convenient optimizer script:
pixi run optimizer
For detailed instructions on setting up Weights & Biases and the optimizer, please refer to the LENS documentation.
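A Weights & Biases sweep is driven by a configuration object like the one below. This is a hypothetical example: the parameter names and ranges are placeholders, not the hyperparameters LENS actually tunes; refer to the LENS documentation for the real setup.

```python
# Hypothetical W&B sweep configuration. The metric and parameter
# names here are illustrative placeholders, not LENS's actual
# hyperparameters.
sweep_config = {
    "method": "bayes",  # Bayesian hyperparameter search
    "metric": {"name": "recall_at_1", "goal": "maximize"},
    "parameters": {
        "learning_rate": {"min": 1e-4, "max": 1e-1},
        "threshold": {"values": [0.5, 1.0, 1.5]},
    },
}

# With wandb installed and configured, a sweep of this shape
# would be launched via:
#   sweep_id = wandb.sweep(sweep_config, project="lens-tuning")
#   wandb.agent(sweep_id, function=train_fn)
```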
LENS was developed using a SynSense Speck2fDevKit. If you have one of these kits, deploying to it is simple. Try out LENS using our pre-trained model and datasets by deploying simulated event streams on-chip:
pixi run sim-speck
Additionally, models can be deployed onto the Speck2fDevKit for low-latency, energy-efficient VPR with real-time sequence matching:
pixi run on-speck
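Sequence matching improves single-frame VPR by scoring short temporal sequences of matches rather than individual frames. The following NumPy sketch shows the core idea, summing along diagonals of a query-reference distance matrix; it is a generic illustration, not the on-chip LENS implementation.

```python
import numpy as np

def sequence_match(dist, seq_len=3):
    """For each query, find the reference index whose straight
    diagonal of length `seq_len` through the distance matrix has
    the lowest summed distance. Illustrative only."""
    n_q, n_r = dist.shape
    matches = []
    for q in range(n_q - seq_len + 1):
        best_cost, best_r = np.inf, -1
        for r in range(n_r - seq_len + 1):
            # cost of matching queries q..q+L-1 to references r..r+L-1
            cost = sum(dist[q + i, r + i] for i in range(seq_len))
            if cost < best_cost:
                best_cost, best_r = cost, r
        matches.append(best_r)
    return matches

# A distance matrix with a zero-cost diagonal starting at (0, 1):
d = np.ones((4, 5))
for i in range(4):
    d[i, i + 1] = 0.0
print(sequence_match(d, seq_len=3))  # → [1, 2]
```

Aggregating over a sequence suppresses single-frame mismatches, which is why sequence matching is a common robustness step in VPR pipelines.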
For more details on deployment to the Speck2fDevKit, please visit the LENS documentation.
All data relating to our manuscript is archived in a permanent repository at https://zenodo.org/records/15392412, and is also included in this repository in the ./lens/data folder.
We acknowledge the Brisbane-Event-VPR dataset from https://zenodo.org/records/4302805.
This repository is licensed under the permissive MIT License. If you use our code, please cite our paper:
@article{hines2025lens,
  title={A compact neuromorphic system for ultra-energy-efficient, on-device robot localization},
  author={Adam D. Hines and Michael Milford and Tobias Fischer},
  journal={},
  year={2025},
  volume={},
  number={},
  doi={},
  url={},
}
If you encounter problems while running the code, or have a suggestion for a feature or improvement, please report it as an issue.