# distilbert-fine-tuning

Here are 10 public repositories matching this topic...


Built an end-to-end NLP service that flags potentially manipulative language in text for content moderation workflows. Fine-tuned a transformer model and deployed it behind a FastAPI inference API with a Streamlit web interface, demonstrating production-style ML deployment, model inference, and API-driven integration. A minimal serving sketch follows this entry.

  • Updated Jan 4, 2026
  • Python
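
A minimal sketch of what serving a fine-tuned classifier behind FastAPI can look like, assuming a Hugging Face `text-classification` pipeline; the checkpoint path `./manipulative-language-model`, the endpoint name, and the label names are illustrative placeholders, not the repository's actual code.

```python
# Sketch: serve a fine-tuned DistilBERT classifier behind a FastAPI endpoint.
# "./manipulative-language-model" is a hypothetical checkpoint directory.
from fastapi import FastAPI
from pydantic import BaseModel
from transformers import pipeline

app = FastAPI(title="Manipulative-language detector (sketch)")

# Load the fine-tuned checkpoint once at startup; the pipeline handles tokenization.
classifier = pipeline("text-classification", model="./manipulative-language-model")

class TextIn(BaseModel):
    text: str

@app.post("/predict")
def predict(payload: TextIn):
    # The pipeline returns a list like [{"label": "...", "score": 0.97}].
    result = classifier(payload.text)[0]
    return {"label": result["label"], "score": float(result["score"])}
```

Run with `uvicorn app:app` and POST JSON such as `{"text": "..."}` to `/predict`; a Streamlit front end would simply call this endpoint over HTTP.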

Developed a multiclass text classification model by fine-tuning a pretrained DistilBERT transformer model to classify luxury apparel items into their respective categories, e.g. pants, accessories, underwear, and shoes. A fine-tuning sketch follows this entry.

  • Updated Dec 31, 2024
  • Jupyter Notebook
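
A minimal fine-tuning sketch using the Hugging Face `Trainer` API, assuming a CSV dataset with `description` and `category` columns; the file name, column names, and hyperparameters are assumptions for illustration, not the repository's actual setup.

```python
# Sketch: fine-tune DistilBERT for multiclass text classification.
# "apparel.csv" and its columns ("description", "category") are hypothetical.
from datasets import load_dataset
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          TrainingArguments, Trainer)

dataset = load_dataset("csv", data_files="apparel.csv")["train"].train_test_split(test_size=0.2)
labels = sorted(set(dataset["train"]["category"]))
label2id = {label: i for i, label in enumerate(labels)}

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")

def preprocess(batch):
    # Tokenize the text and map string categories to integer label ids.
    enc = tokenizer(batch["description"], truncation=True, padding="max_length", max_length=128)
    enc["labels"] = [label2id[c] for c in batch["category"]]
    return enc

tokenized = dataset.map(preprocess, batched=True)

model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased",
    num_labels=len(labels),
    id2label={i: label for label, i in label2id.items()},
    label2id=label2id,
)

args = TrainingArguments(
    output_dir="distilbert-apparel",
    num_train_epochs=3,
    per_device_train_batch_size=16,
)

trainer = Trainer(model=model, args=args,
                  train_dataset=tokenized["train"], eval_dataset=tokenized["test"])
trainer.train()
```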

Built a multiclass text classification model by fine-tuning a pretrained DistilBERT transformer model to classify statements into mental health statuses such as anxiety, stress, and personality disorder, reaching 77% accuracy. An inference and accuracy sketch follows this entry.

  • Updated Jan 6, 2025
  • Jupyter Notebook
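
A short sketch of batch inference and accuracy computation with a fine-tuned checkpoint; the checkpoint path `./distilbert-mental-health`, the example texts, and the ground-truth labels are assumptions, not data from the repository.

```python
# Sketch: run a fine-tuned DistilBERT classifier on a few texts and compute accuracy.
# "./distilbert-mental-health" is a hypothetical checkpoint directory.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("./distilbert-mental-health")
model = AutoModelForSequenceClassification.from_pretrained("./distilbert-mental-health")
model.eval()

texts = ["I can't stop worrying about everything lately.",
         "Deadlines at work are crushing me this month."]
true_labels = ["Anxiety", "Stress"]  # illustrative ground truth

inputs = tokenizer(texts, truncation=True, padding=True, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Map predicted class ids back to label names stored in the model config.
pred_ids = logits.argmax(dim=-1).tolist()
preds = [model.config.id2label[i] for i in pred_ids]
accuracy = sum(p == t for p, t in zip(preds, true_labels)) / len(true_labels)
print(preds, f"accuracy={accuracy:.2%}")
```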
