rendy saputra (10bodrex)
0 followers · 9 following
AI & ML interests: None yet
Recent Activity
- reacted to IlyasMoutawwakil's post with 🔥, 11 days ago:
Transformers v5 just landed! 🚀 It significantly unifies and reduces modeling code across architectures, while opening the door to a whole new class of performance optimizations.

My favorite new feature? 🤔 The new dynamic weight loader + converter. Here's why 👇

Over the last few months, the core Transformers maintainers built an incredibly fast weight loader, capable of converting tensors on the fly while loading them in parallel threads. This means we're no longer constrained by how parameters are laid out inside the safetensors weight files. In practice, this unlocks two big things:

- Much more modular modeling code. You can now clearly see how architectures build on top of each other (DeepSeek v2 → v3, Qwen v2 → v3 → MoE, etc.). This makes shared bottlenecks obvious and lets us optimize the right building blocks once, for all model families.
- Performance optimizations beyond what torch.compile can do alone. torch.compile operates on the computation graph, but it can't change parameter layouts. With the new loader, we can restructure weights at load time: fusing MoE expert projections, merging attention QKV projections, and enabling more compute-dense kernels that simply weren't possible before.

Personally, I'm honored to have contributed in this direction, including the work on optimizing MoE implementations and making modeling code more torch-exportable, so these optimizations can be ported cleanly across runtimes.

Overall, Transformers v5 is a strong signal of where the community and industry are converging: Modularity and Performance, without sacrificing Flexibility. Transformers v5 makes its signature from_pretrained an entrypoint where you can mix and match:

- Parallelism
- Quantization
- Custom kernels
- Flash/Paged attention
- Continuous batching
- ...

Kudos to everyone involved! I highly recommend:

- Release notes: https://github.com/huggingface/transformers/releases/tag/v5.0.0
- Blog post: https://huggingface.co/blog/transformers-v5
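The mix-and-match from_pretrained entrypoint the post describes can be sketched roughly like this. The checkpoint name and option values below are illustrative assumptions, not taken from the post, and actually running it requires downloading the model weights:

```python
# Minimal sketch of combining loading options in a single
# from_pretrained call (checkpoint name is a hypothetical example).
import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "Qwen/Qwen2.5-0.5B",             # assumed example checkpoint
    dtype=torch.bfloat16,            # load weights in bf16
    device_map="auto",               # automatic device placement
    attn_implementation="sdpa",      # pick an attention backend
)
```

Options such as quantization_config or a different attn_implementation (e.g. flash attention, where supported) slot into the same call, which is the "mix and match" point the post is making.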
- published a dataset 16 days ago: 10bodrex/DATA.69.00.Shot
- published a model 16 days ago: 10bodrex/Mydellin_Stunding
Organizations: None yet
10bodrex's datasets (116)
- 10bodrex/aimaru-V0.12 · Updated Dec 28, 2025
- 10bodrex/Pandawaru · Updated Dec 27, 2025
- 10bodrex/Nicholsoon · Updated Dec 27, 2025
- 10bodrex/MilaGros_Cisnr · Updated Dec 27, 2025
- 10bodrex/Sperfast · Updated Dec 26, 2025 · 1
- 10bodrex/Dara_Lope · Updated Dec 26, 2025
- 10bodrex/DjiampT_100M · Updated Dec 26, 2025
- 10bodrex/Jayak_MoL · Updated Dec 25, 2025
- 10bodrex/xMorin_new · Updated Dec 25, 2025 · 1
- 10bodrex/Vormelon_701 · Updated Dec 25, 2025 · 1
- 10bodrex/Somplack · Updated Dec 24, 2025 · 1
- 10bodrex/Bitorick_sAw · Updated Dec 24, 2025
- 10bodrex/Pierching · Updated Dec 24, 2025
- 10bodrex/ArmoRy · Updated Dec 23, 2025 · 1
- 10bodrex/YaMantul · Updated Dec 23, 2025
- 10bodrex/Power_Sableng · Updated Dec 23, 2025
- 10bodrex/Boker_Law · Updated Dec 22, 2025
- 10bodrex/Farmy_owN · Updated Dec 22, 2025
- 10bodrex/motoRol · Updated Dec 22, 2025
- 10bodrex/Dragonfang · Updated Dec 21, 2025
- 10bodrex/iRon_theForce · Updated Dec 21, 2025
- 10bodrex/dhyo_Hw · Updated Dec 21, 2025
- 10bodrex/Gwendolyn_VauGH · Updated Dec 20, 2025
- 10bodrex/the_Moong · Updated Dec 20, 2025
- 10bodrex/Luc_PeTrov · Updated Dec 20, 2025
- 10bodrex/Holaho_PE · Updated Dec 19, 2025
- 10bodrex/Riko_MP · Updated Dec 19, 2025
- 10bodrex/kloDfi_Go · Updated Dec 19, 2025 · 1
- 10bodrex/MarjInaL_Ku · Updated Dec 18, 2025 · 1
- 10bodrex/djaRuUm_UpR · Updated Dec 18, 2025 · 1