<aside> <img src="/icons/burst_gray.svg" alt="/icons/burst_gray.svg" width="40px" />
Domains: NLP, Deep Learning, Transformers, Model Deployment, Pre-processing
</aside>
https://github.com/sarayusapa/t5_grammarator
This project focuses on fine-tuning T5 (Text-to-Text Transfer Transformer), a transformer model, to perform Grammar Error Correction (GEC). The end goal is a model that rewrites grammatically incorrect sentences into their correct forms.
| Term | Definition |
| --- | --- |
| Neural Network | A model inspired by the human brain that processes complex data using layers of interconnected artificial “neurons”. |
| Deep Learning | A subfield of Machine Learning that uses large neural networks to perform complex tasks. |
| Transformer | A neural network architecture that processes sequences (such as text) and is able to model context. |
| Fine-Tuning | Training a pretrained model further on a smaller, task-specific dataset to specialize it. |
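Since T5 casts every task as text-to-text, fine-tuning it for GEC comes down to formatting each (incorrect, correct) sentence pair as an input/target string pair. A minimal pre-processing sketch is below; the `"grammar: "` task prefix and the field names are illustrative assumptions, not necessarily the exact format used in the repository.

```python
# Sketch of preparing GEC training pairs for a text-to-text model like T5.
# The "grammar: " prefix is a hypothetical task prefix, not this repo's
# confirmed convention.

def make_example(incorrect: str, correct: str, prefix: str = "grammar: ") -> dict:
    """Format one (incorrect, correct) sentence pair as a T5 input/target pair."""
    return {"input_text": prefix + incorrect, "target_text": correct}

pairs = [
    ("She go to school every day.", "She goes to school every day."),
    ("I has two cats.", "I have two cats."),
]

examples = [make_example(src, tgt) for src, tgt in pairs]
for ex in examples:
    print(ex["input_text"], "->", ex["target_text"])
```

During fine-tuning, the tokenized `input_text` is fed to the encoder and the model is trained to generate `target_text`; at inference time, the same prefix is prepended to any new sentence before decoding the correction.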