350BT sample is much smaller than advertised

#53
by DavidNemeskey - opened

Hi,

I downloaded the 350BT sample to experiment with it and found that it is actually much smaller. The exact token count depends on the tokenizer, of course, but most tokenizers I experimented with (including GPT-2) return around 140B tokens. Even the "tokens" field in the metadata backs this up, summing to a little over 141B.
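For reference, this is roughly how I summed the per-document token counts from the metadata: a minimal sketch using the `datasets` library in streaming mode, assuming the column holding the count is named `token_count` (the exact column name may differ in the config you load).

```python
from datasets import load_dataset

# Stream the sample so we don't need the full download to count tokens.
ds = load_dataset(
    "HuggingFaceFW/fineweb",
    name="sample-350BT",
    split="train",
    streaming=True,
)

# Sum the per-document token counts stored in the metadata.
# Assumption: the column is called "token_count"; adjust if needed.
total = sum(row["token_count"] for row in ds)
print(f"Total tokens according to metadata: {total:,}")
```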

The dataset page even has a graph showing how FineWeb performs compared to other datasets, with its x-axis going up to 350B tokens. So I assume a proper 350B sample does exist?

@guipenedo would it be possible to upload the real 350B sample instead of the current, much smaller sample-350BT? Thank you!
