Hugging Face
Kutches / Lwx
Likes: 0 · Tags: GGUF, conversational
Branch: main · Repository size: 104 GB · 1 contributor · History: 26 commits
Latest commit by Kutches: c5fee4b (verified, 2 days ago): "Upload gemma-3-12b-it-IQ4_XS.gguf with huggingface_hub"
Files and versions (all last updated 2 days ago; each file's commit message reads "Upload <filename> with huggingface_hub", and the .gitattributes update shares the gemma-3-12b-it-IQ4_XS.gguf upload commit):

File                                                                          Size      Safety  Storage
.gitattributes                                                                2.48 kB   -       -
LTX-2.3 - Better Female Nudity v2_rank64.safetensors                          1.26 GB   Safe    xet
LTX-2.3 - She Rubs Her Pussy v2.11.safetensors                                1.29 GB   Safe    xet
LTX23_audio_vae_bf16.safetensors                                              365 MB    Safe    xet
LTX23_video_vae_bf16.safetensors                                              1.45 GB   Safe    xet
Ltx2.3-Licon-VBVR-I2V-240K-R32.safetensors                                    554 MB    Safe    xet
Ltx2.3-Licon-VBVR-I2V-390K-R32.safetensors                                    554 MB    Safe    xet
gemma-3-12b-it-IQ4_XS.gguf                                                    6.55 GB   Safe    xet
gemma-3-12b-it-heretic-x.Q3_K_S.gguf                                          5.46 GB   -       xet
gemma-3-12b-it-heretic-x.Q4_K_S.gguf                                          6.94 GB   -       xet
gemma-3-12b-it-qat-Q4_K_S.gguf                                                6.94 GB   Safe    xet
gemma-3-12b-it-qat-abliterated.q4_k_m.gguf                                    7.3 GB    -       xet
gemma-3-12b-it-qat-abliterated.q5_k_m.gguf                                    8.45 GB   -       xet
gemma-3-12b-it-qat-q4_0-unquantized.Q4_K_M.gguf                               7.3 GB    Safe    xet
gemma-3-12b-it-qat-q4_0-unquantized.Q4_K_S.gguf                               6.94 GB   Safe    xet
gemma-3-12b-it-ultra-heretic.i1-IQ3_S.gguf                                    5.46 GB   -       xet
gemma-3-12b-it-ultra-heretic.i1-Q4_K_S.gguf                                   6.94 GB   -       xet
ltx-2.3-22b-distilled-1.1-Q3_K_M.gguf                                         10.6 GB   -       xet
ltx-2.3-22b-distilled-1.1-Q3_K_S.gguf                                         9.74 GB   -       xet
ltx-2.3-22b-distilled-1.1_lora-dynamic_fro09_avg_rank_111_bf16.safetensors    2.74 GB   Safe    xet
ltx-2.3-22b-distilled-lora-1.1_fro90_ceil36.safetensors                       739 MB    Safe    xet
ltx-2.3-22b-distilled-lora-1.1_fro90_ceil72_condsafe.safetensors              662 MB    Safe    xet
ltx-2.3-22b-distilled-lora-fro90_ceil72.safetensors                           1.4 GB    Safe    xet
ltx-2.3-spatial-upscaler-x1.5-1.0.safetensors                                 1.09 GB   Safe    xet
ltx-2.3_text_projection_bf16.safetensors                                      2.31 GB   Safe    xet
mmproj-BF16.gguf                                                              854 MB    Safe    xet