This interface allows researchers to explore and compare tools for manual linguistic annotation. Its goal is to make it easier to identify platforms that meet the requirements of corpus annotation projects. Users can browse and filter tools according to a wide range of features, including supported annotation layers, collaboration capabilities, and export formats.
The interface is based on a structured inventory of 21 widely used annotation tools and more than 50 documented features. It was developed within the COST Action UniDive – Universality, Diversity and Idiosyncrasy in Language Technology. Particular attention was given to tools supporting multilingual annotation frameworks such as Universal Dependencies (UD) and PARSEME, while also covering tools used for other annotation layers across the broader linguistic annotation workflow.
The surveyed tools appear on the left of the interface and the surveyed features on the right, divided into sections. You can resize the two panels by dragging the separator between them. When you click on a specific feature, all tools that exhibit that feature move to the top of the list. You can combine multiple selections to extend the list of matching tools. Clicking the "plus" button on a tool card shows more information about that tool and highlights its features in the features section (their border color changes). Use the reset button to clear all filters and start from scratch.
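The filtering behaviour described above can be sketched as follows. This is a minimal Python illustration: the tool names, feature names, and data structures are placeholders, not the interface's actual data or code.

```python
# Sketch of the union-based feature filtering described above.
# Tool and feature names are illustrative placeholders, not the survey's data.

# Each tool maps to the set of features it exhibits.
tools = {
    "ToolA": {"UD support", "collaboration", "CoNLL-U export"},
    "ToolB": {"collaboration", "XML export"},
    "ToolC": {"UD support"},
}

def matching_tools(selected_features):
    """Return tools exhibiting at least one selected feature first,
    followed by the remaining tools, mirroring the interface's list."""
    selected = set(selected_features)
    matched = [name for name, feats in tools.items() if feats & selected]
    rest = [name for name in tools if name not in matched]
    return matched + rest

# Selecting more features extends the set of matched tools (union semantics).
print(matching_tools(["UD support"]))
print(matching_tools(["UD support", "XML export"]))
```

Combining selections can only grow the matched set, which is why adding a feature extends rather than narrows the list shown on top.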
The survey dataset and the interactive interface are maintained in an open GitHub repository. The repository contains the structured descriptions of the surveyed annotation tools as well as the code used to generate the interface.
To ensure that the inventory can evolve over time, the repository provides a protocol for adding new tools or updating existing entries. Contributors can submit new tools using the provided template and open a pull request, following the instructions described in the repository README.
After review and approval by the maintainers, the submitted information is integrated into the dataset and automatically reflected in the interface.
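The contribution steps might look roughly like the following shell session. This is a hypothetical sketch only: the repository URL, branch name, and file paths are placeholders, and the repository README is the authoritative source for the actual protocol.

```shell
# Hypothetical contribution workflow — all paths and URLs are placeholders;
# follow the instructions in the repository README for the real protocol.
git clone https://github.com/<org>/<repo>.git
cd <repo>
git checkout -b add-new-tool

# Copy the provided template and fill in the tool's description and features.
cp <template-file> <tools-dir>/new-tool-entry
$EDITOR <tools-dir>/new-tool-entry

git add <tools-dir>/new-tool-entry
git commit -m "Add new annotation tool to the survey"
git push origin add-new-tool
# Then open a pull request on GitHub; after maintainer review and approval,
# the entry is merged and automatically reflected in the interface.
```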
If you would like to suggest a new tool or update an existing entry, you can contribute directly through the GitHub repository.
If you use this resource, please cite:
Pannitto, L., Dobrovoljc, K., & Guillaume, B. (2026).
Survey of Tools for Manual Linguistic Annotation: Supporting Diversity through Interactive Exploration.
Proceedings of LREC 2026.
BibTeX
@inproceedings{pannitto2026survey,
  title = "Survey of Tools for Manual Linguistic Annotation: Supporting Diversity through Interactive Exploration",
  author = "Pannitto, Ludovica and
    Dobrovoljc, Kaja and
    Guillaume, Bruno",
  booktitle = "Proceedings of the Fifteenth Language Resources and Evaluation Conference",
  year = "2026",
  publisher = "European Language Resources Association",
}
This survey was conducted within the COST Action UniDive – Universality, Diversity and Idiosyncrasy in Language Technology. We thank the maintainers of the surveyed annotation tools and all contributors who responded to the survey and shared information about the tools. Their input made this interface possible.