Undownloadable files #1
opened by BayKedy
Hello,
I'm encountering a persistent 403 (Access Denied) error when trying to download
the following files from the Lamapi/next-4b repository:
- tokenizer.json
- tokenizer.model
All other files (including large safetensors and GGUF files) download correctly.
I have tried:
- `hf download` (huggingface_hub CLI)
- `git clone` with git-lfs
- `wget` / `curl`
- Direct download from the Hugging Face web UI
- VPN and proxy
- Logged-in HF token with read permissions
In all cases, these two files fail with a 403 error from the HF LFS CDN
(cdn-lfs-us-1.hf.co), which returns an AccessDenied XML response.
This suggests the files may be misconfigured as LFS objects, or that their
permissions on the CDN side are invalid or expired.
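For anyone triaging this, a minimal stdlib-only sketch that reproduces the symptom by HEAD-requesting the standard `/resolve/` download endpoint. The repo id and filenames come from the report above; `resolve_url`, `head_status`, and `check_files` are illustrative helper names, not part of any library:

```python
# Reproduction sketch (stdlib only). Assumes the public repo Lamapi/next-4b
# and the standard huggingface.co /resolve/ file-download endpoint.
import urllib.error
import urllib.request

REPO = "Lamapi/next-4b"
FILES = ["tokenizer.json", "tokenizer.model", "config.json"]  # config.json as a control

def resolve_url(repo: str, filename: str, revision: str = "main") -> str:
    """Build the standard huggingface.co resolve URL for a file in a repo."""
    return f"https://huggingface.co/{repo}/resolve/{revision}/{filename}"

def head_status(url: str) -> int:
    """Return the final HTTP status of a HEAD request, following redirects."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code  # e.g. 403 returned by the LFS CDN

def check_files() -> None:
    """Print the status for each file; the reported bug shows as 403s here."""
    for name in FILES:
        print(f"{name}: {head_status(resolve_url(REPO, name))}")
```

If the CDN misconfiguration is still present, `check_files()` should print 200 for the control file and 403 for the two tokenizer files.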
Could you please:
- Re-upload tokenizer.json and tokenizer.model, or
- Check their LFS / permissions configuration?
Thanks in advance.