
GPT2 Ukrainian

A generative language model for Ukrainian (malteos/gpt2-uk) following the GPT-2 architecture (124M parameters).

  • hidden size: 768
  • number of heads: 12
  • number of layers: 12
  • seq length: 1024
  • tokens: 11,238,113,280 (3 epochs)
  • steps: 57167
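The hyperparameters above match GPT-2 small, and the stated 124M figure can be sanity-checked from them. The sketch below assumes GPT-2's standard vocabulary of 50,257 tokens and tied input/output embeddings; neither is stated in this card.

```python
# Sanity-check the ~124M parameter count from the card's hyperparameters.
# Assumptions (not stated in the card): GPT-2's standard vocab size of
# 50257 and tied input/output embeddings.
hidden = 768
layers = 12
seq_len = 1024
vocab = 50257  # assumed, standard GPT-2 BPE vocabulary

embeddings = vocab * hidden + seq_len * hidden  # token + position embeddings

per_layer = (
    (hidden * 3 * hidden + 3 * hidden)    # attention QKV projection
    + (hidden * hidden + hidden)          # attention output projection
    + (hidden * 4 * hidden + 4 * hidden)  # MLP up-projection
    + (4 * hidden * hidden + hidden)      # MLP down-projection
    + 4 * hidden                          # two LayerNorms (scale + bias)
)

total = embeddings + layers * per_layer + 2 * hidden  # + final LayerNorm
print(f"{total / 1e6:.1f}M parameters")  # 124.4M
```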

Training data

  • OSCAR
  • Wikimedia dumps

License

MIT
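As a standard GPT-2 checkpoint, the model can be loaded with the Hugging Face transformers text-generation pipeline. A minimal sketch (the Ukrainian prompt is an arbitrary example):

```python
from transformers import pipeline

# Load the checkpoint into a standard text-generation pipeline.
generator = pipeline("text-generation", model="malteos/gpt2-uk")

# Generate a continuation for a Ukrainian prompt
# ("Київ - столиця" = "Kyiv is the capital").
result = generator("Київ - столиця", max_new_tokens=30, do_sample=True)
print(result[0]["generated_text"])
```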

Safetensors checkpoint: 137M params (tensor types F32, U8)
