rbouaf/nlp-language-transfer

This project evaluates five NER models: LSTM-CRF, Hidden Markov Models, Brown Clustering, a Decision Tree Classifier, and DistilBERT, across seven languages: English, French, Chinese, Arabic, Farsi, Finnish, and Swahili. It first establishes baseline performance on monolingual datasets, then applies few-shot learning at 5%, 10%, and 20% of the training data to study transfer learning from high-resource to low-resource languages, offering insights into each model's effectiveness in cross-lingual transfer.
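As a rough sketch of the few-shot setup described above, the snippet below samples a fixed fraction (5%, 10%, or 20%) of a target-language training set before fine-tuning. Everything in it, including the `few_shot_subset` helper and the toy `target_train` data, is illustrative and not taken from this repository's code.

```python
import random

def few_shot_subset(sentences, fraction, seed=0):
    """Return a reproducible random sample covering `fraction` of the training sentences."""
    rng = random.Random(seed)
    k = max(1, round(len(sentences) * fraction))
    return rng.sample(sentences, k)

# Hypothetical target-language training set: (tokens, BIO tags) pairs.
target_train = [
    (["Amina", "alisafiri", "Nairobi"], ["B-PER", "O", "B-LOC"]),
    (["Juma", "anaishi", "Mombasa"], ["B-PER", "O", "B-LOC"]),
] * 50  # padded out so the fractions below are non-trivial

# Build the 5%, 10%, and 20% few-shot splits.
splits = {f: few_shot_subset(target_train, f) for f in (0.05, 0.10, 0.20)}
for f, subset in splits.items():
    print(f"{f:.0%} split: {len(subset)} sentences")
```

Each split would then be used to fine-tune a model pretrained on a high-resource language, with the same held-out test set for every fraction so the scores stay comparable.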