Welcome to the FARM!
Framework for Adapting Representation Models
What is it?
FARM makes transfer learning with BERT & Co. simple, fast, and enterprise-ready. It's built upon transformers and provides additional features to simplify developers' lives: parallelized preprocessing, a highly modular design, multi-task learning, experiment tracking, easy debugging, and close integration with AWS SageMaker.
With FARM you can build fast proofs of concept for tasks like text classification, NER, or question answering and easily transfer them into production.
Core features
Easy fine-tuning of language models to your task and domain language
Speed: AMP optimizers (~35% faster) and parallel preprocessing (16 CPU cores => ~16x faster)
Modular design of language model and prediction heads
Switch between heads or just combine them for multitask learning
Full compatibility with transformers' models and the model hub
Smooth upgrading to newer language models
Integration of custom datasets via the Processor class
Powerful experiment tracking & execution
Checkpointing & Caching to resume training and reduce costs with spot instances
Simple deployment and visualization to showcase your model
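The modular design and multi-task learning mentioned above can be illustrated with a minimal, framework-free sketch. The class names below (`AdaptiveModel`, prediction heads, a language model) mirror FARM's concepts, but the implementation is a toy stand-in for illustration, not FARM's actual code:

```python
# Illustrative sketch of FARM's modular design: one shared language model,
# interchangeable prediction heads, and an adaptive model that combines them
# for multi-task learning. Names mirror FARM's concepts; the logic is a toy
# stand-in, not FARM's real implementation.

class ToyLanguageModel:
    """Stand-in encoder: maps text to a fixed-size 'embedding'."""
    def forward(self, text):
        # Toy featurization: word count and average word length.
        words = text.split()
        return [len(words), sum(len(w) for w in words) / max(len(words), 1)]

class ClassificationHead:
    """Toy head: predicts 'long' vs 'short' from the shared embedding."""
    def forward(self, embedding):
        return "long" if embedding[0] > 3 else "short"

class RegressionHead:
    """Toy head: predicts a numeric score from the shared embedding."""
    def forward(self, embedding):
        return embedding[1]

class AdaptiveModel:
    """Shared language model + any number of prediction heads."""
    def __init__(self, language_model, prediction_heads):
        self.language_model = language_model
        self.prediction_heads = prediction_heads

    def forward(self, text):
        # Encode once, then feed every head -> one output per task.
        embedding = self.language_model.forward(text)
        return [head.forward(embedding) for head in self.prediction_heads]

# Swap heads, or combine them for multi-task learning:
model = AdaptiveModel(ToyLanguageModel(),
                      [ClassificationHead(), RegressionHead()])
print(model.forward("FARM makes transfer learning simple"))
# → ['long', 6.2]
```

Because the heads only depend on the embedding interface, upgrading the language model or adding a new task means replacing one component rather than rewriting the pipeline, which is the point of the design.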
Task | BERT | RoBERTa | XLNet | ALBERT | DistilBERT | XLMRoBERTa
---|---|---|---|---|---|---
Text classification | x | x | x | x | x | x
NER | x | x | x | x | x | x
Question Answering | x | x | x | x | x | x
Language Model Fine-tuning | x | | | | |
Text Regression | x | x | x | x | x | x
Multilabel Text classif. | x | x | x | x | x | x
Extracting embeddings | x | x | x | x | x | x
LM from scratch (beta) | x | | | | |
Getting Started
Concepts