🎉 ASReview LAB v3 is here! 🎉
Cleaner screening, smarter data handling, and more control over your reviews.
Automatic duplicate detection, editable tags, and a streamlined workflow.
ASReview LAB is an open-source machine learning tool for efficient, transparent, and interactive screening of large textual datasets. It is widely used for systematic reviews, meta-analyses, and any scenario requiring systematic text screening.
The key features of ASReview LAB are:
- Active Learning: Interactively prioritize records using AI models that learn from your labeling decisions.
- Scientifically validated: ASReview LAB has been scientifically validated and published in Nature Machine Intelligence.
- Flexible AI Models: Choose from pre-configured ELAS models or build your own with custom components.
- Simulation toolkit: Assess model performance on fully labeled datasets.
- Label Management: All decisions are saved automatically; easily change labels at any time.
- User-Centric Design: Humans are the oracle; the interface is transparent and customizable.
- Privacy First: Everything is open source and no usage or user data is collected.
- Automatic Duplicate Hiding: Records with duplicate titles and texts are automatically hidden during screening, keeping your workflow clean and tidy. Need those records back? No problem: you can choose to include them when you export your data.
- Editable Tags in Collection: Manage and edit tags directly from the Collection screen, giving you more control over your data extraction and classification.
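The active-learning loop behind the first feature above can be illustrated with a toy word-count scorer. This is an illustrative sketch only, not ASReview's actual model or API; the record texts, the scoring heuristic, and the `screen` helper are all hypothetical:

```python
# Illustrative sketch of an active-learning screening loop, in the spirit of
# ASReview (NOT the actual ASReview implementation or API). A tiny additive
# word-count scorer ranks unlabeled records; the human "oracle" labels the
# top-ranked record, and the scorer is updated with that decision.
from collections import Counter

RECORDS = [
    "deep learning for systematic review screening",
    "active learning reduces screening workload",
    "a cookbook of pasta recipes",
    "gardening tips for spring",
    "machine learning models for text classification",
]
TRUE_LABELS = [1, 1, 0, 0, 1]  # ground truth the oracle would provide

def score(text, rel_words, irr_words):
    """Higher score = more likely relevant (word-count heuristic)."""
    words = text.split()
    return sum(rel_words[w] for w in words) - sum(irr_words[w] for w in words)

def screen(records, oracle, n_priors=2):
    rel_words, irr_words = Counter(), Counter()
    labeled, unlabeled = {}, list(range(len(records)))
    # Prior knowledge: the oracle labels the first records in dataset order.
    order = unlabeled[:n_priors]
    while unlabeled:
        idx = order.pop(0) if order else max(
            unlabeled, key=lambda i: score(records[i], rel_words, irr_words))
        label = oracle(idx)                       # human labeling decision
        labeled[idx] = label
        unlabeled.remove(idx)
        # Learn from the decision: update the relevant/irrelevant vocabulary.
        (rel_words if label else irr_words).update(records[idx].split())
    return labeled  # insertion order = screening order

result = screen(RECORDS, oracle=lambda i: TRUE_LABELS[i])
print(list(result))  # relevant records tend to surface early
```

After the two prior records, the scorer pulls the remaining relevant record forward, so all relevant records are screened before the irrelevant ones. ASReview's real models (and its stopping heuristics) are far more sophisticated, but the oracle-in-the-loop structure is the same.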
Requires Python 3.10 or later. Install with pip:

```shell
pip install asreview
```

Upgrade:

```shell
pip install --upgrade asreview
```

For Docker and advanced installation, see the installation guide.
A typical workflow in the latest version of ASReview LAB:
- Import Data: Load your dataset (CSV, RIS, XLSX, etc.).
- Create Project: Set up a new review or simulation project.
- Select Prior Knowledge: Optionally provide records you already know are relevant or not relevant.
- Start Screening: Label records as Relevant or Not Relevant; the AI model continuously improves.
- Monitor Progress: Use the dashboard to track your progress and decide when to stop.
- Export Results: Download your labeled dataset or project file.
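The Simulation toolkit mentioned above replays a fully labeled dataset to assess how quickly a model would have surfaced the relevant records. A toy sketch of the kind of metric such a run reports, assuming a hypothetical model ranking (this is not the ASReview simulate API):

```python
# Toy sketch of a simulation-style metric (not the ASReview simulate API):
# given the order in which a model presents records and the true labels,
# report cumulative recall after each screened record.
def recall_curve(screening_order, true_labels):
    """Return the fraction of relevant records found after each screen."""
    total_relevant = sum(true_labels)
    found, curve = 0, []
    for idx in screening_order:
        found += true_labels[idx]
        curve.append(found / total_relevant)
    return curve

true_labels = [1, 0, 0, 1, 0, 0, 0, 1, 0, 0]   # 3 relevant out of 10
model_order = [0, 3, 7, 1, 2, 4, 5, 6, 8, 9]   # hypothetical model ranking
dataset_order = list(range(10))                # naive front-to-back screening

model_curve = recall_curve(model_order, true_labels)
naive_curve = recall_curve(dataset_order, true_labels)
# Here the model ranking reaches 100% recall after 3 records instead of 8.
print(model_curve.index(1.0) + 1, naive_curve.index(1.0) + 1)
```

Curves like this, computed on benchmark datasets, are how screening models are compared and how "when to stop" decisions can be grounded.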
If you wish to cite the underlying methodology of the ASReview software, please use the following publication in Nature Machine Intelligence:
van de Schoot, R., de Bruin, J., Schram, R. et al. An open source machine learning framework for efficient and transparent systematic reviews. Nat Mach Intell 3, 125β133 (2021). https://doi.org/10.1038/s42256-020-00287-7
For citing the software itself, please refer to the specific release of the ASReview software on Zenodo: https://doi.org/10.5281/zenodo.3345592. The export menu on the Zenodo page provides the citation format you need.
For more scientific publications on the ASReview software, go to asreview.ai/papers.
The best resources to find an answer to your question or ways to get in contact with the team are:
- Newsletter
- FAQ
- Community events
- Issues or feature requests
- Donate to ASReview
- Contact (asreview@uu.nl)
The ASReview software is released under the Apache 2.0 license. The ASReview team accepts no responsibility or liability for the use of the ASReview tool or any direct or indirect damages arising from the application of the tool.
