Collection: Adapters

Adapters are lightweight modules inserted into pre-trained neural networks, allowing them to adapt to new tasks without retraining the entire model. Instead of updating all of the model's parameters, only the small adapter layers are trained, which makes this approach much more efficient in training time and compute.

They work by "adapting" the knowledge of a large pre-trained model (such as BERT) to a specific downstream task, an approach widely used in natural language processing (NLP) and other machine learning fields.
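As a rough illustration, the sketch below shows one common adapter design (a bottleneck of a down-projection, nonlinearity, and up-projection with a residual connection) in PyTorch; the source does not specify a particular architecture, and the sizes here (hidden size 768, bottleneck 64) are illustrative assumptions. The pre-trained layer's weights are frozen so that only the adapter's parameters receive gradient updates.

```python
import torch
import torch.nn as nn

class Adapter(nn.Module):
    """Bottleneck adapter: down-project, apply a nonlinearity, up-project,
    then add the result back to the input via a residual connection."""
    def __init__(self, hidden_size: int, bottleneck_size: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_size, bottleneck_size)
        self.up = nn.Linear(bottleneck_size, hidden_size)
        self.act = nn.GELU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.up(self.act(self.down(x)))


# Illustrative setup: a stand-in "pre-trained" layer plus one adapter.
hidden_size = 768  # e.g. the hidden size of BERT-base (assumed here)
encoder = nn.TransformerEncoderLayer(d_model=hidden_size, nhead=12, batch_first=True)
adapter = Adapter(hidden_size)

for p in encoder.parameters():   # pre-trained weights stay fixed
    p.requires_grad = False

# Only the adapter's parameters are passed to the optimizer.
optimizer = torch.optim.AdamW(adapter.parameters(), lr=1e-4)

x = torch.randn(2, 16, hidden_size)  # (batch, sequence, hidden)
out = adapter(encoder(x))
```

Because only the adapter weights are trained, each new task adds just a small set of parameters on top of the shared frozen model.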
