FarziBuilder/havenTry1
Tags: Text Generation · Transformers · PyTorch · mpt · custom_code · text-generation-inference
Files and versions: havenTry1 at commit abd216d · 13.3 GB
1 contributor · History: 8 commits
Latest commit abd216d by FarziBuilder: "Upload 11 files", almost 3 years ago
.gitattributes                    Safe    1.52 kB    initial commit          almost 3 years ago
adapt_tokenizer.py                Safe    1.75 kB    Upload 11 files         almost 3 years ago
attention.py                      Safe    23.7 kB    Upload 11 files         almost 3 years ago
blocks.py                         Safe    2.65 kB    Upload 11 files         almost 3 years ago
config.json                       Safe    1.32 kB    Upload MPTForCausalLM   almost 3 years ago
configuration_mpt.py              Safe    9.08 kB    Upload 11 files         almost 3 years ago
flash_attn_triton.py              Safe    28.2 kB    Upload 11 files         almost 3 years ago
generation_config.json            Safe    96 Bytes   Upload MPTForCausalLM   almost 3 years ago
hf_prefixlm_converter.py          Safe    27.2 kB    Upload 11 files         almost 3 years ago
is_torch_version.py               Safe    2.39 kB    Upload 11 files         almost 3 years ago
meta_init_context.py              Safe    3.64 kB    Upload 11 files         almost 3 years ago
modeling_mpt.py                   Safe    20.3 kB    Upload 11 files         almost 3 years ago
norm.py                           Safe    2.56 kB    Upload 11 files         almost 3 years ago
param_init_fns (1).py             Safe    12.6 kB    Upload 11 files         almost 3 years ago
pytorch_model-00001-of-00002.bin  pickle  9.94 GB    Upload MPTForCausalLM   almost 3 years ago
  Detected Pickle imports (3): torch.HalfStorage, collections.OrderedDict, torch._utils._rebuild_tensor_v2
pytorch_model-00002-of-00002.bin  pickle  3.36 GB    Upload MPTForCausalLM   almost 3 years ago
  Detected Pickle imports (3): torch._utils._rebuild_tensor_v2, torch.HalfStorage, collections.OrderedDict
pytorch_model.bin.index.json      Safe    16 kB      Upload MPTForCausalLM   almost 3 years ago
special_tokens_map.json           Safe    131 Bytes  Upload tokenizer        almost 3 years ago
tokenizer.json                    Safe    2.11 MB    Upload tokenizer        almost 3 years ago
tokenizer_config.json             Safe    237 Bytes  Upload tokenizer        almost 3 years ago
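The "Detected Pickle imports" lines on the two `.bin` weight shards come from the Hub's pickle scanner: a pickle stream names the module-level objects it will import at load time, and those imports are exactly where unpickling untrusted files becomes dangerous. The same idea can be sketched with the standard-library `pickletools` module; the helper name `list_pickle_imports` is hypothetical, and the STACK_GLOBAL handling is a simplification (it assumes the module and attribute names appear as the two most recent string opcodes, which does not cover memoized strings).

```python
import collections
import pickle
import pickletools


def list_pickle_imports(data: bytes) -> set:
    """Collect the 'module.name' globals a pickle stream would import.

    GLOBAL and STACK_GLOBAL opcodes name the objects pickle resolves by
    importing at load time; everything else in the stream is plain data.
    """
    imports = set()
    strings_seen = []  # recent string arguments, for STACK_GLOBAL
    for opcode, arg, _pos in pickletools.genops(data):
        if opcode.name == "GLOBAL":
            # GLOBAL's argument is "module name" in one string
            module, _, name = arg.partition(" ")
            imports.add(f"{module}.{name}")
        elif opcode.name == "STACK_GLOBAL":
            # module and attribute were pushed as the two prior strings
            # (simplified: ignores memo-fetched strings)
            if len(strings_seen) >= 2:
                imports.add(f"{strings_seen[-2]}.{strings_seen[-1]}")
        if isinstance(arg, str):
            strings_seen.append(arg)
    return imports


# Example: an OrderedDict pickles via an import of its class,
# just like the torch classes flagged in the listing above.
blob = pickle.dumps(collections.OrderedDict(a=1))
print(sorted(list_pickle_imports(blob)))
```

This only lists the imports; actually loading an untrusted pickle still executes whatever those globals do, which is why the scanner surfaces them before you download.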