microsoft/Phi-3-mini-4k-instruct
1.39k likes · Microsoft (18.4k followers)
Tags: Text Generation · Transformers · Safetensors · English · French · phi3 · nlp · code · conversational · custom_code · text-generation-inference · License: MIT
Branch: refs/pr/73
Phi-3-mini-4k-instruct · 7.64 GB · 10 contributors · History: 21 commits
Latest commit: Update README.md by dkleine (9ef4036, verified, over 1 year ago)
| File | Size | Last commit | Updated |
|---|---|---|---|
| .gitattributes | 1.52 kB | initial commit | almost 2 years ago |
| CODE_OF_CONDUCT.md | 444 Bytes | chore(root): Initial files upload. | almost 2 years ago |
| LICENSE | 1.08 kB | chore(root): Initial files upload. | almost 2 years ago |
| NOTICE.md | 1.77 kB | chore(root): Initial files upload. | almost 2 years ago |
| README.md | 18.4 kB | Update README.md | over 1 year ago |
| SECURITY.md | 2.66 kB | chore(root): Initial files upload. | almost 2 years ago |
| added_tokens.json | 293 Bytes | fix(readme): Adds information about placeholder tokens. | almost 2 years ago |
| config.json | 931 Bytes | Add attention_bias to make TGI work (#64) | over 1 year ago |
| configuration_phi3.py | 10.4 kB | chore(root): Initial files upload. | almost 2 years ago |
| generation_config.json | 172 Bytes | chore(root): Initial files upload. | almost 2 years ago |
| model-00001-of-00002.safetensors (xet) | 4.97 GB | chore(root): Initial files upload. | almost 2 years ago |
| model-00002-of-00002.safetensors (xet) | 2.67 GB | chore(root): Initial files upload. | almost 2 years ago |
| model.safetensors.index.json | 16.3 kB | chore(root): Initial files upload. | almost 2 years ago |
| modeling_phi3.py | 73.8 kB | Fix typo (#58) | almost 2 years ago |
| sample_finetune.py | 6.19 kB | Update sample_finetune.py (#65) | over 1 year ago |
| special_tokens_map.json | 568 Bytes | chore(root): Initial files upload. | almost 2 years ago |
| tokenizer.json | 1.84 MB | fix(tokenizer): Also updates tokenizer.json to prevent any mismatch. | almost 2 years ago |
| tokenizer.model (xet) | 500 kB | chore(root): Initial files upload. | almost 2 years ago |
| tokenizer_config.json | 3.17 kB | fix(tokenizer_config): Adjusts `rstrip` of special tokens. (#53) | almost 2 years ago |