
Masked word prediction in 103 languages. Give it a sentence in another language with one of the words replaced by "[MASK]", and it fills in the blank. It works with English too, obviously, but that defeats the point of the demo.
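A minimal usage sketch with the transformers fill-mask pipeline (the example sentence is illustrative):

```python
from transformers import pipeline

# Load the model from the Hub (repo name taken from this card).
fill_mask = pipeline(
    "fill-mask",
    model="zzzotop/zero-shot-cross-lingual-transfer-demo-masked",
)

# German sentence with one word masked (illustrative example).
for prediction in fill_mask("Der Himmel ist heute sehr [MASK]."):
    print(prediction["token_str"], round(prediction["score"], 3))
```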

distilbert-base-multilingual-cased finetuned for English masked language modelling on 50,000 examples from the r/explainlikeimfive subset of the ELI5 dataset. The finetuning data is English only, so all knowledge of the target languages comes from pretraining: transfer to them is zero-shot.

Hyperparameters: 3 epochs, learning rate 2e-5, batch size 8, weight decay 0.01, Adam optimizer with betas=(0.9, 0.999) and epsilon=1e-8.
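A sketch of how the finetuning could look with the transformers Trainer API, assuming that API was used; the dataset stand-in, output path, and 15% masking probability are illustrative assumptions, while the hyperparameters come from this card:

```python
from datasets import Dataset
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

model_name = "distilbert-base-multilingual-cased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

# Tiny illustrative stand-in for the 50,000 ELI5 examples;
# the card does not describe the actual preprocessing.
texts = ["Why is the sky blue?", "How do magnets work?"]
dataset = Dataset.from_dict({"text": texts})
tokenized = dataset.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=128),
    batched=True,
    remove_columns=["text"],
)

# Hyperparameters from this card. The Adam betas (0.9, 0.999) and
# epsilon 1e-8 are transformers' defaults, so no override is needed.
training_args = TrainingArguments(
    output_dir="zero-shot-cross-lingual-transfer-demo-masked",
    num_train_epochs=3,
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    weight_decay=0.01,
)

# Randomly masks 15% of tokens in each batch for the MLM objective
# (15% is the common default, not stated in the card).
data_collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=tokenized,
    data_collator=data_collator,
)
trainer.train()
```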

Final model perplexity: 10.22.
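The card does not say how perplexity was computed; the usual convention is exp of the mean evaluation cross-entropy loss, e.g. continuing the sketch above:

```python
import math

# Perplexity = exp(mean cross-entropy loss), by the usual convention.
metrics = trainer.evaluate(eval_dataset=tokenized)
print(f"Perplexity: {math.exp(metrics['eval_loss']):.2f}")
```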
