
Dataset: DoggiAI/Taiwan-Netizen

Modalities: Text
Formats: parquet
Size: 10K - 100K
Libraries: Datasets, pandas
License:
Taiwan-Netizen / prepare (908 kB, 2 contributors, 10 commits)
Latest commit: hibana2077, "Enhance convert.py to save DataFrame as CSV; add load_test.py for testing Parquet file" (1f9bc59, 12 months ago)
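The commit messages describe convert.py as writing a DataFrame out as CSV, with a later commit adding duplicate removal and index resetting. A minimal sketch of that cleanup-and-export step, assuming a pandas pipeline; the function name, column name, and output path are illustrative assumptions, not the repository's actual code:

```python
import pandas as pd

# Hypothetical sketch of convert.py's cleanup-and-export step.
# The "text" column and output file name are assumptions.
def convert(records: list[str]) -> pd.DataFrame:
    """Deduplicate text records and return a clean-indexed DataFrame."""
    df = pd.DataFrame({"text": records})
    # Drop exact duplicates, then renumber rows from 0 as the
    # "drop duplicates and reset index" commit message describes.
    return df.drop_duplicates().reset_index(drop=True)

df = convert(["a", "b", "a"])
df.to_csv("train.csv", index=False)  # CSV copy, as in the commit message
```

The actual repository also maintains a Parquet copy (`train.parquet`), which `DataFrame.to_parquet` would produce from the same frame.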
  • convert.py (2.09 kB): Enhance convert.py to save DataFrame as CSV; add load_test.py for testing Parquet file (12 months ago)
  • dcard.txt (34 kB): Add token counting and text processing scripts with sample data files (12 months ago)
  • discord.txt (21 kB): Add additional commentary and personal reflections to discord.txt (12 months ago)
  • est.py (1.52 kB): Update token goal to 1 billion and improve remaining tokens calculation; add text encoding test script (12 months ago)
  • fb.txt (9.89 kB): Add token counting and text processing scripts with sample data files (12 months ago)
  • ptt.txt (796 kB): Clean up ptt.txt by removing unnecessary blank lines (12 months ago)
  • tbrain.txt (9.73 kB): Add token counting and text processing scripts with sample data files (12 months ago)
  • test.py (322 Bytes): Update token goal to 1 billion and improve remaining tokens calculation; add text encoding test script (12 months ago)
  • threads.txt (33.3 kB): Remove old train.parquet file and update new file with larger size; enhance convert.py to drop duplicates and reset index; add PTT article crawler script (12 months ago)