Hugging Face
Dataset: DoggiAI/Taiwan-Netizen
Modalities: Text
Formats: Parquet
Size: 10K - 100K
Libraries: Datasets, pandas, Croissant, +1
License: MIT
Branch: main
Path: Taiwan-Netizen/prepare (908 kB, 2 contributors, 10 commits)

Latest commit by hibana2077 (1f9bc59, 12 months ago): Enhance convert.py to save DataFrame as CSV; add load_test.py for testing Parquet file
Files:

convert.py (2.09 kB, 12 months ago): Enhance convert.py to save DataFrame as CSV; add load_test.py for testing Parquet file
dcard.txt (34 kB, 12 months ago): Add token counting and text processing scripts with sample data files
discord.txt (21 kB, 12 months ago): Add additional commentary and personal reflections to discord.txt
est.py (1.52 kB, 12 months ago): Update token goal to 1 billion and improve remaining tokens calculation; add text encoding test script
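The est.py commit messages mention a 1-billion-token goal and a remaining-tokens calculation. A minimal sketch of that arithmetic, assuming a rough characters-per-token heuristic (the constant, default ratio, and function names here are illustrative, not taken from the actual est.py):

```python
TOKEN_GOAL = 1_000_000_000  # target mentioned in the est.py commit message

def estimate_tokens(text: str, chars_per_token: float = 2.0) -> int:
    """Rough token estimate from character count.

    The 2 chars/token ratio is an assumption; a real estimate would
    use the target model's tokenizer instead.
    """
    return int(len(text) / chars_per_token)

def remaining_tokens(collected: int, goal: int = TOKEN_GOAL) -> int:
    """Tokens still needed to reach the goal (clamped at zero)."""
    return max(goal - collected, 0)

print(remaining_tokens(estimate_tokens("你好" * 500)))
```

A tokenizer-based count would replace `estimate_tokens` entirely; the clamp in `remaining_tokens` just keeps the figure sensible once the goal is exceeded.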
fb.txt (9.89 kB, 12 months ago): Add token counting and text processing scripts with sample data files
ptt.txt (796 kB, 12 months ago): Clean up ptt.txt by removing unnecessary blank lines
tbrain.txt (9.73 kB, 12 months ago): Add token counting and text processing scripts with sample data files
test.py (322 Bytes, 12 months ago): Update token goal to 1 billion and improve remaining tokens calculation; add text encoding test script
threads.txt (33.3 kB, 12 months ago): Remove old train.parquet file and update new file with larger size; enhance convert.py to drop duplicates and reset index; add PTT article crawler script
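Taken together, the commit messages describe convert.py merging the raw .txt sources (one record per line), dropping duplicates, resetting the index, and saving the result as CSV. A minimal sketch of that pipeline, assuming one record per line; the function name and column names are illustrative, not taken from the actual convert.py:

```python
import pandas as pd

def build_dataset(text_blobs: dict[str, str]) -> pd.DataFrame:
    """Merge raw text sources into one deduplicated DataFrame.

    text_blobs maps a source name (e.g. "ptt") to its raw text,
    one record per line. Mirrors the commit messages: skip blank
    lines, drop duplicates, reset the index.
    """
    rows = []
    for source, blob in text_blobs.items():
        for line in blob.splitlines():
            line = line.strip()
            if line:  # skip blanks (cf. the ptt.txt cleanup commit)
                rows.append({"source": source, "text": line})
    df = pd.DataFrame(rows)
    return df.drop_duplicates(subset="text").reset_index(drop=True)

# Save as CSV, as the latest convert.py commit describes
df = build_dataset({"ptt": "hello\n\nhello\nworld\n"})
df.to_csv("train.csv", index=False)
```

Writing Parquet instead would be a one-line swap to `df.to_parquet(...)` (which needs pyarrow installed), matching the train.parquet file the threads.txt commit mentions.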