IBM/unitxt


In the dynamic landscape of generative NLP, traditional text processing pipelines limit research flexibility and reproducibility, as they are tailored to specific dataset, task, and model combinations. The escalating complexity, involving system prompts, model-specific formats, instructions, and more, calls for a shift to a structured, modular, and customizable solution.

Addressing this need, we present Unitxt, an innovative library for customizable textual data preparation and evaluation, tailored to generative language models. Unitxt natively integrates with common libraries such as HuggingFace and LM-eval-harness, and it deconstructs processing flows into modular components, enabling easy customization and sharing between practitioners. These components encompass model-specific formats, task prompts, and other dataset-processing definitions. The Unitxt-Catalog centralizes these components, fostering collaboration and exploration in modern textual data workflows. Beyond being a tool, Unitxt is a community-driven platform, empowering users to build, share, and advance their pipelines collaboratively.
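To illustrate the modular, component-based configuration style described above, the sketch below composes a data-preparation recipe as a comma-separated string of catalog component assignments. The specific card and template names (`cards.wnli`, `templates.classification.multi_class.relation.default`) and the parameter names (`num_demos`, `demos_pool_size`) are illustrative assumptions, not a guaranteed part of the catalog; check the Unitxt-Catalog and documentation for the components that actually exist.

```python
# Sketch: composing a Unitxt-style recipe string from catalog components.
# All component and parameter names below are illustrative assumptions.
components = {
    "card": "cards.wnli",  # dataset card (task data + preprocessing)
    "template": "templates.classification.multi_class.relation.default",
    "num_demos": 5,          # number of in-context examples
    "demos_pool_size": 100,  # pool from which demos are sampled
}

# Each component becomes a key=value pair, joined into one recipe string.
recipe = ",".join(f"{key}={value}" for key, value in components.items())
print(recipe)
```

In our understanding, such a recipe string is then handed to Unitxt's loading entry point (or passed through the HuggingFace `datasets` integration) to produce a ready-to-use dataset; verify the exact API call against the Unitxt documentation before relying on it.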



🦄 Currently on the Unitxt Catalog

NLP Tasks | Dataset Cards | Templates | Formats | Metrics

🦄 Run the Unitxt Exploration Dashboard

To launch the Unitxt graphical user interface, first install Unitxt with the UI requirements:

pip install "unitxt[ui]"

Then launch the UI by running:

unitxt-explore

🦄 Contributors

To contribute, please install Unitxt from source:

git clone git@github.com:IBM/unitxt.git
cd unitxt
pip install -e ".[dev]"
pre-commit install

🦄 Citation

If you use Unitxt in your research, please cite our paper:

@misc{unitxt,
      title={Unitxt: Flexible, Shareable and Reusable Data Preparation and Evaluation for Generative AI},
      author={Elron Bandel and Yotam Perlitz and Elad Venezian and Roni Friedman-Melamed and Ofir Arviv and Matan Orbach and Shachar Don-Yehyia and Dafna Sheinwald and Ariel Gera and Leshem Choshen and Michal Shmueli-Scheuer and Yoav Katz},
      year={2024},
      eprint={2401.14019},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}