---
language:
- en
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- accuracy
model-index:
- name: yujiepan/bert-base-uncased-sst2-int8-unstructured80
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: GLUE SST2
type: glue
config: sst2
split: validation
args: sst2
metrics:
- name: Accuracy
type: accuracy
value: 0.91284
---
# Joint magnitude pruning, quantization and distillation on BERT-base/SST-2
This model applies unstructured magnitude pruning, quantization, and distillation simultaneously to BERT-base while fine-tuning on the GLUE SST-2 dataset.
It achieves the following results on the evaluation set:
- Torch accuracy: 0.9128
- OpenVINO IR accuracy: 0.9128
- Sparsity in transformer block linear layers: 0.80
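For intuition, unstructured magnitude pruning zeroes the individual weights with the smallest absolute values until a target sparsity is reached. Below is a minimal framework-free sketch; the `magnitude_prune` helper is illustrative only and is not part of the training script:

```python
def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude fraction of a flat weight list.

    With sparsity=0.8, 80% of the entries become exactly zero, mirroring
    the 0.80 sparsity reported above for the transformer linear layers.
    """
    k = int(len(weights) * sparsity)
    # Indices of the k entries with the smallest absolute value.
    order = sorted(range(len(weights)), key=lambda i: abs(weights[i]))
    pruned = list(weights)
    for i in order[:k]:
        pruned[i] = 0.0
    return pruned
```

In the actual run, NNCF applies this criterion gradually over the course of training rather than in a single shot.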
## Setup
```bash
conda install pytorch torchvision torchaudio pytorch-cuda=11.6 -c pytorch -c nvidia
pip install optimum[openvino,nncf]==1.7.0
pip install datasets sentencepiece scipy scikit-learn protobuf evaluate
pip install wandb # optional
```
## Training script
See https://gist.github.com/yujiepan-work/5d7e513a47b353db89f6e1b512d7c080
## Run
We train on a single GPU.
```bash
NNCFCFG=/path/to/nncf_config.json
python run_glue.py \
--lr_scheduler_type cosine_with_restarts \
--cosine_lr_scheduler_cycles 11 6 \
--record_best_model_after_epoch 9 \
--load_best_model_at_end True \
--metric_for_best_model accuracy \
--model_name_or_path textattack/bert-base-uncased-SST-2 \
--teacher_model_or_path yoshitomo-matsubara/bert-large-uncased-sst2 \
--distillation_temperature 2 \
--task_name sst2 \
--nncf_compression_config $NNCFCFG \
--distillation_weight 0.95 \
--output_dir /tmp/bert-base-uncased-sst2-int8-unstructured80 \
--overwrite_output_dir \
--run_name bert-base-uncased-sst2-int8-unstructured80 \
--do_train \
--do_eval \
--max_seq_length 128 \
--per_device_train_batch_size 32 \
--per_device_eval_batch_size 32 \
--learning_rate 5e-05 \
--optim adamw_torch \
--num_train_epochs 17 \
--logging_steps 1 \
--evaluation_strategy steps \
--eval_steps 250 \
--save_strategy steps \
--save_steps 250 \
--save_total_limit 1 \
--fp16 \
--seed 1
```
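The `--distillation_temperature 2` and `--distillation_weight 0.95` flags blend a cross-entropy against the teacher's softened distribution with the ordinary hard-label loss. A rough pure-Python sketch of that blend, for illustration only (the training stack computes this internally):

```python
import math

def softmax(logits, temperature=1.0):
    # Numerically stable softmax over temperature-scaled logits.
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    z = sum(exps)
    return [e / z for e in exps]

def distillation_loss(student_logits, teacher_logits, hard_loss,
                      temperature=2.0, weight=0.95):
    """Blend soft-target cross-entropy with the hard-label loss.

    The temperature**2 factor keeps gradient magnitudes comparable
    across temperatures, as is conventional in distillation.
    """
    p_teacher = softmax(teacher_logits, temperature)
    log_p_student = [math.log(p) for p in softmax(student_logits, temperature)]
    kd = -sum(t * s for t, s in zip(p_teacher, log_p_student)) * temperature ** 2
    return weight * kd + (1 - weight) * hard_loss
```

With `weight=0.95`, the soft teacher signal dominates; the hard labels contribute only 5% of the total loss.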
### Framework versions
- Transformers 4.26.0
- PyTorch 1.13.1+cu116
- Datasets 2.8.0
- Tokenizers 0.13.2
- Optimum 1.6.3
- Optimum-intel 1.7.0
- NNCF 2.4.0
# LongLoRA: Efficient Fine-tuning of Long-Context Large Language Models
<font size=6><div align='center' > <a href=http://arxiv.org/abs/2309.12307>**Paper**</a> | <a href="https://huggingface.co/Yukang">**Models**</a> | <a href="https://github.com/dvlab-research/LongLoRA">**Code**</a> </div></font>
**LongLoRA: Efficient Fine-tuning of Long-Context Large Language Models [[Paper](http://arxiv.org/abs/2309.12307)]** <br />
[Yukang Chen](https://scholar.google.com/citations?user=6p0ygKUAAAAJ&hl=en),
[Shengju Qian](https://scholar.google.com/citations?user=QNnWmasAAAAJ),
[Haotian Tang](https://scholar.google.com/citations?user=WxL13BAAAAAJ&hl),
[Xin Lai](https://scholar.google.com/citations?user=tqNDPA4AAAAJ&hl=zh-CN),
[Zhijian Liu](https://scholar.google.com/citations?user=3coYSTUAAAAJ&hl=en),
[Song Han](https://scholar.google.com/citations?user=E0iCaa4AAAAJ&hl=zh-CN),
[Jiaya Jia](https://scholar.google.com/citations?user=XPAkzTEAAAAJ&hl=en)<br />
## Abstract
We present LongLoRA, an efficient fine-tuning approach that extends the context sizes of pre-trained large language models (LLMs), with limited computation cost.
Typically, training LLMs with long context sizes is computationally expensive, requiring extensive training hours and GPU resources.
In this paper, we speed up the context extension of LLMs in two aspects. On the one hand, although dense global attention is needed during inference, fine-tuning the model can be done effectively and efficiently with sparse local attention. The proposed shifted short attention effectively enables context extension, leading to non-trivial computation savings with performance similar to fine-tuning with vanilla attention. On the other hand, we find that LoRA for context extension works well under the premise of trainable embedding and normalization. LongLoRA demonstrates strong empirical results on various tasks with LLaMA2 models from 7B/13B to 70B. LongLoRA extends LLaMA2 7B from 4k context to 100k, or LLaMA2 70B to 32k, on a single 8x A100 machine. LongLoRA extends models' context while retaining their original architectures, and is compatible with most existing techniques, like FlashAttention-2. In addition, to make LongLoRA practical, we collect a dataset, LongQA, for supervised fine-tuning. It contains more than 3k long-context question-answer pairs. For more details, please refer to the [paper](http://arxiv.org/abs/2309.12307).
## Highlights
**LongLoRA** speeds up the context extension of pre-trained large language models at both the attention level and the weight level.
1. The proposed shifted short attention is easy to implement, compatible with Flash-Attention, and not required during inference.
2. We release all our models, ranging from 7B to 70B with context lengths from 8k to 100k, including [LLaMA2-LongLoRA-7B-100k](https://huggingface.co/Yukang/Llama-2-7b-longlora-100k-ft), [LLaMA2-LongLoRA-13B-64k](https://huggingface.co/Yukang/Llama-2-13b-longlora-64k), and [LLaMA2-LongLoRA-70B-32k](https://huggingface.co/Yukang/Llama-2-70b-longlora-32k).
3. We build up a long-context QA dataset, LongQA, for supervised fine-tuning (SFT). We release 13B and 70B 32k models with SFT, [Llama-2-13b-chat-longlora-32k-sft](https://huggingface.co/Yukang/Llama-2-13b-chat-longlora-32k-sft) and [Llama-2-70b-chat-longlora-32k-sft](https://huggingface.co/Yukang/Llama-2-70b-chat-longlora-32k-sft). We will further release the dataset next week.
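The grouping pattern behind shifted short attention can be illustrated with a tiny standalone sketch: tokens attend only within fixed-size groups, and in half of the attention heads the group boundaries are shifted by half a group (wrapping around), so information flows between neighbouring groups. The `group_tokens` helper below is purely illustrative and not taken from the released code:

```python
def group_tokens(seq_len, group_size, shift=False):
    """Assign each token position to a local-attention group.

    With shift=True the group boundaries move by half a group size
    (wrapping around the sequence), as in shifted short attention.
    """
    offset = group_size // 2 if shift else 0
    return [((i + offset) % seq_len) // group_size for i in range(seq_len)]
```

Because the shifted heads regroup tokens across the unshifted boundaries, stacking layers lets information propagate across the full sequence despite each head attending only locally.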
## Released models
### Models with supervised fine-tuning
| Model | Size | Context | Train | Link |
|:----------------------------------|------|---------|---------|-------------------------------------------------------------------------|
| Llama-2-13b-chat-longlora-32k-sft | 13B | 32768 | LoRA+ | [link](https://huggingface.co/Yukang/Llama-2-13b-chat-longlora-32k-sft) |
| Llama-2-70b-chat-longlora-32k-sft | 70B | 32768 | LoRA+ | [link](https://huggingface.co/Yukang/Llama-2-70b-chat-longlora-32k-sft) |
### Models with context extension via fully fine-tuning
| Model | Size | Context | Train | Link |
|:----------------------------|------|---------|-------|-------------------------------------------------------------------|
| Llama-2-7b-longlora-8k-ft | 7B | 8192 | Full FT | [link](https://huggingface.co/Yukang/Llama-2-7b-longlora-8k-ft) |
| Llama-2-7b-longlora-16k-ft | 7B | 16384 | Full FT | [link](https://huggingface.co/Yukang/Llama-2-7b-longlora-16k-ft) |
| Llama-2-7b-longlora-32k-ft | 7B | 32768 | Full FT | [link](https://huggingface.co/Yukang/Llama-2-7b-longlora-32k-ft) |
| Llama-2-7b-longlora-100k-ft | 7B | 100000 | Full FT | [link](https://huggingface.co/Yukang/Llama-2-7b-longlora-100k-ft) |
| Llama-2-13b-longlora-8k-ft | 13B | 8192 | Full FT | [link](https://huggingface.co/Yukang/Llama-2-13b-longlora-8k-ft) |
| Llama-2-13b-longlora-16k-ft | 13B | 16384 | Full FT | [link](https://huggingface.co/Yukang/Llama-2-13b-longlora-16k-ft) |
| Llama-2-13b-longlora-32k-ft | 13B | 32768 | Full FT | [link](https://huggingface.co/Yukang/Llama-2-13b-longlora-32k-ft) |
### Models with context extension via improved LoRA fine-tuning
| Model | Size | Context | Train | Link |
|:----------------------------|------|---------|-------|-------------------------------------------------------------------|
| Llama-2-7b-longlora-8k | 7B | 8192 | LoRA+ | [link](https://huggingface.co/Yukang/Llama-2-7b-longlora-8k) |
| Llama-2-7b-longlora-16k | 7B | 16384 | LoRA+ | [link](https://huggingface.co/Yukang/Llama-2-7b-longlora-16k) |
| Llama-2-7b-longlora-32k | 7B | 32768 | LoRA+ | [link](https://huggingface.co/Yukang/Llama-2-7b-longlora-32k) |
| Llama-2-13b-longlora-8k | 13B | 8192 | LoRA+ | [link](https://huggingface.co/Yukang/Llama-2-13b-longlora-8k) |
| Llama-2-13b-longlora-16k | 13B | 16384 | LoRA+ | [link](https://huggingface.co/Yukang/Llama-2-13b-longlora-16k) |
| Llama-2-13b-longlora-32k | 13B | 32768 | LoRA+ | [link](https://huggingface.co/Yukang/Llama-2-13b-longlora-32k) |
| Llama-2-13b-longlora-64k | 13B | 65536 | LoRA+ | [link](https://huggingface.co/Yukang/Llama-2-13b-longlora-64k) |
| Llama-2-70b-longlora-32k | 70B | 32768 | LoRA+ | [link](https://huggingface.co/Yukang/Llama-2-70b-longlora-32k) |
| Llama-2-70b-chat-longlora-32k | 70B | 32768 | LoRA+ | [link](https://huggingface.co/Yukang/Llama-2-70b-chat-longlora-32k) |
## Citation
If you find this project useful in your research, please consider citing:
```bibtex
@article{longlora,
title={LongLoRA: Efficient Fine-tuning of Long-Context Large Language Models},
author={Yukang Chen and Shengju Qian and Haotian Tang and Xin Lai and Zhijian Liu and Song Han and Jiaya Jia},
journal={arXiv:2309.12307},
year={2023}
}
```
## Acknowledgement
- This work builds upon [LLaMA2](https://ai.meta.com/llama) as the pre-trained model.
- This work is based on [DeepSpeed](https://github.com/microsoft/DeepSpeed), [peft](https://github.com/huggingface/peft), and [Flash-Attention2](https://github.com/Dao-AILab/flash-attention) for acceleration.
- The perplexity evaluation code is adapted from [Landmark Attention](https://github.com/epfml/landmark-attention).
- We use [LongChat](https://github.com/DachengLi1/LongChat) for the retrieval evaluation.
0.01116180419921875,
0.009674072265625,
0.0055084228515625,
-0.007518768310546875,
-0.02783203125,
-0.0288238525390625,
0.04046630859375,
-0.038177490234375,
-0.03125,
-0.0246734619140625,
-0.0207977294921875,
-0.037506103515625,
-0.0089263916015625,
-0.0240020751953125,
-0.030517578125,
-0.044647216796875,
-0.0084686279296875,
0.05462646484375,
0.039031982421875,
0.005138397216796875,
0.025390625,
-0.04046630859375,
0.0231781005859375,
0.0275421142578125,
0.03173828125,
-0.0006928443908691406,
-0.043121337890625,
-0.0163726806640625,
0.017059326171875,
-0.01557159423828125,
-0.054351806640625,
0.04327392578125,
0.0186614990234375,
0.01242828369140625,
0.035125732421875,
-0.0197601318359375,
0.08819580078125,
-0.025054931640625,
0.055267333984375,
0.0195465087890625,
-0.06536865234375,
0.046722412109375,
-0.051177978515625,
0.0210113525390625,
0.02789306640625,
0.004497528076171875,
-0.0309906005859375,
0.0010232925415039062,
-0.0379638671875,
-0.06304931640625,
0.051727294921875,
0.018524169921875,
0.0018663406372070312,
0.00833892822265625,
0.03948974609375,
-0.00714111328125,
0.0045928955078125,
-0.0638427734375,
-0.0247802734375,
-0.0031147003173828125,
-0.003082275390625,
-0.02313232421875,
-0.024261474609375,
-0.0192718505859375,
-0.04931640625,
0.042816162109375,
-0.0298614501953125,
0.0106658935546875,
0.012298583984375,
-0.010528564453125,
-0.01371002197265625,
0.00963592529296875,
0.06732177734375,
0.052001953125,
-0.01453399658203125,
-0.020751953125,
0.038543701171875,
-0.0152587890625,
-0.005645751953125,
0.0022869110107421875,
-0.0016756057739257812,
-0.0156402587890625,
0.033721923828125,
0.074462890625,
0.03936767578125,
-0.046966552734375,
0.030059814453125,
0.007335662841796875,
-0.00040841102600097656,
-0.0224609375,
0.01236724853515625,
0.016510009765625,
0.0261077880859375,
0.01103973388671875,
-0.021087646484375,
-0.0026302337646484375,
-0.0491943359375,
0.004650115966796875,
0.03741455078125,
-0.019012451171875,
-0.03729248046875,
0.039093017578125,
0.0069732666015625,
0.002964019775390625,
0.01158905029296875,
-0.0056304931640625,
-0.03875732421875,
0.05487060546875,
0.040069580078125,
0.034912109375,
-0.0226593017578125,
-0.00736236572265625,
0.045684814453125,
-0.0069732666015625,
-0.0089263916015625,
0.02117919921875,
0.0008101463317871094,
-0.0293121337890625,
-0.01953125,
-0.065673828125,
0.00954437255859375,
0.03155517578125,
-0.0362548828125,
0.026397705078125,
-0.029510498046875,
-0.0307159423828125,
-0.004184722900390625,
0.0389404296875,
-0.0523681640625,
0.01316070556640625,
0.00566864013671875,
0.07659912109375,
-0.0352783203125,
0.087158203125,
0.03619384765625,
-0.022979736328125,
-0.06427001953125,
-0.016021728515625,
-0.003726959228515625,
-0.066162109375,
0.045745849609375,
0.0167694091796875,
0.00004279613494873047,
-0.0104217529296875,
-0.05145263671875,
-0.0908203125,
0.10595703125,
0.0254364013671875,
-0.042022705078125,
-0.0105743408203125,
0.000823974609375,
0.056976318359375,
-0.0243988037109375,
0.01222991943359375,
0.0538330078125,
0.045684814453125,
0.0035915374755859375,
-0.09783935546875,
0.0262908935546875,
-0.036651611328125,
0.0027866363525390625,
0.0082550048828125,
-0.1011962890625,
0.07958984375,
-0.0151519775390625,
-0.009185791015625,
0.02886962890625,
0.061798095703125,
0.039520263671875,
0.006397247314453125,
0.039215087890625,
0.056640625,
0.03619384765625,
0.00024700164794921875,
0.0723876953125,
-0.0214691162109375,
0.0308990478515625,
0.060211181640625,
0.0026912689208984375,
0.065673828125,
0.03369140625,
-0.0171966552734375,
0.032928466796875,
0.0606689453125,
0.00795745849609375,
0.0189056396484375,
0.01561737060546875,
-0.0023632049560546875,
-0.01373291015625,
-0.00647735595703125,
-0.052093505859375,
0.0250091552734375,
0.02801513671875,
-0.0173797607421875,
-0.0018548965454101562,
-0.0147552490234375,
0.0295867919921875,
-0.0166015625,
-0.0225830078125,
0.048980712890625,
0.0214385986328125,
-0.05462646484375,
0.07763671875,
-0.00124359130859375,
0.08056640625,
-0.0350341796875,
0.00916290283203125,
-0.0268402099609375,
0.0212860107421875,
-0.0233917236328125,
-0.047515869140625,
-0.0004417896270751953,
0.00730133056640625,
0.00958251953125,
-0.0034885406494140625,
0.044586181640625,
-0.0254669189453125,
-0.04107666015625,
0.04107666015625,
0.019683837890625,
0.00814056396484375,
-0.0033416748046875,
-0.062042236328125,
0.0191650390625,
0.004810333251953125,
-0.056427001953125,
0.03826904296875,
0.0265655517578125,
-0.0204315185546875,
0.052490234375,
0.051177978515625,
0.00734710693359375,
0.0111083984375,
0.00035262107849121094,
0.08319091796875,
-0.058319091796875,
-0.0316162109375,
-0.060302734375,
0.033203125,
-0.01239013671875,
-0.0343017578125,
0.062286376953125,
0.0291595458984375,
0.040740966796875,
0.0086212158203125,
0.023712158203125,
0.002613067626953125,
0.04296875,
-0.039886474609375,
0.059356689453125,
-0.06756591796875,
0.005443572998046875,
-0.03240966796875,
-0.06689453125,
-0.02203369140625,
0.0372314453125,
-0.01410675048828125,
0.0128631591796875,
0.0306549072265625,
0.047607421875,
-0.012115478515625,
-0.026123046875,
-0.0008487701416015625,
0.0209197998046875,
0.033843994140625,
0.07757568359375,
0.03704833984375,
-0.045562744140625,
0.0164031982421875,
-0.032562255859375,
-0.004085540771484375,
-0.048797607421875,
-0.0648193359375,
-0.0816650390625,
-0.047882080078125,
-0.0162506103515625,
-0.0189056396484375,
-0.00991058349609375,
0.07000732421875,
0.06243896484375,
-0.055206298828125,
-0.019012451171875,
0.01328277587890625,
0.0107269287109375,
-0.00960540771484375,
-0.016204833984375,
0.0540771484375,
-0.004177093505859375,
-0.06842041015625,
0.025909423828125,
-0.00217437744140625,
0.02520751953125,
0.0006189346313476562,
-0.0289306640625,
-0.0130615234375,
-0.0005555152893066406,
0.06005859375,
0.045440673828125,
-0.06341552734375,
-0.0245208740234375,
-0.006252288818359375,
-0.014984130859375,
0.0079345703125,
0.016693115234375,
-0.042144775390625,
-0.01535797119140625,
0.03338623046875,
0.0138397216796875,
0.044342041015625,
0.010406494140625,
0.008331298828125,
-0.04241943359375,
0.0462646484375,
-0.00033020973205566406,
0.0312347412109375,
0.0200042724609375,
-0.0232696533203125,
0.060516357421875,
-0.00569915771484375,
-0.03240966796875,
-0.0787353515625,
0.010101318359375,
-0.10394287109375,
-0.01433563232421875,
0.09283447265625,
-0.01165008544921875,
-0.046112060546875,
0.034576416015625,
-0.03155517578125,
0.0187530517578125,
-0.032135009765625,
0.052215576171875,
0.035491943359375,
-0.0098419189453125,
-0.00579833984375,
-0.032684326171875,
0.061279296875,
0.03802490234375,
-0.0794677734375,
0.0017595291137695312,
0.032745361328125,
0.0262451171875,
0.0285491943359375,
0.048309326171875,
-0.004344940185546875,
0.0170745849609375,
-0.046112060546875,
-0.0069732666015625,
0.0009222030639648438,
-0.01055908203125,
-0.0205535888671875,
-0.01503753662109375,
-0.00795745849609375,
0.0054931640625
]
] |
squeezebert/squeezebert-uncased | 2020-12-11T22:02:17.000Z | [
"transformers",
"pytorch",
"squeezebert",
"arxiv:2006.11316",
"arxiv:1904.00962",
"endpoints_compatible",
"region:us"
] | null | squeezebert | null | null | squeezebert/squeezebert-uncased | 0 | 7,183 | transformers | 2022-03-02T23:29:05 | language: en
license: bsd
datasets:
- bookcorpus
- wikipedia
---
# SqueezeBERT pretrained model
This model, `squeezebert-uncased`, is a pretrained model for the English language using a masked language modeling (MLM) and Sentence Order Prediction (SOP) objective.
SqueezeBERT was introduced in [this paper](https://arxiv.org/abs/2006.11316). This model is case-insensitive. The model architecture is similar to BERT-base, but with the pointwise fully-connected layers replaced with [grouped convolutions](https://blog.yani.io/filter-group-tutorial/).
The authors found that SqueezeBERT is 4.3x faster than `bert-base-uncased` on a Google Pixel 3 smartphone.
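The speedup comes largely from the parameter and FLOP reduction of grouped pointwise operations. The sketch below is illustrative only: the actual SqueezeBERT layer shapes and group counts are given in the paper, and the feed-forward dimensions (768 → 3072) and group count (4) here are assumptions chosen to match a BERT-base-like layer.

```python
def pointwise_params(c_in, c_out, groups=1):
    """Parameter count of a 1x1 (pointwise) convolution with grouped channels.

    Each group independently maps c_in/groups input channels to
    c_out/groups output channels; biases are one per output channel.
    With groups=1 this is exactly a fully-connected layer.
    """
    assert c_in % groups == 0 and c_out % groups == 0
    weights = groups * (c_in // groups) * (c_out // groups)
    return weights + c_out  # weight matrix entries + bias terms

# BERT-base-like feed-forward expansion, illustrative shapes only
dense = pointwise_params(768, 3072)              # ordinary fully-connected layer
grouped = pointwise_params(768, 3072, groups=4)  # grouped pointwise convolution
print(dense, grouped, dense / grouped)
```

With `groups=4`, the weight count shrinks by roughly a factor of 4, which is the basic mechanism behind the latency savings.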
## Pretraining
### Pretraining data
- [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of thousands of unpublished books
- [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia)
### Pretraining procedure
The model is pretrained using the Masked Language Model (MLM) and Sentence Order Prediction (SOP) tasks.
(Author's note: If you decide to pretrain your own model, and you prefer to train with MLM only, that should work too.)
From the SqueezeBERT paper:
> We pretrain SqueezeBERT from scratch (without distillation) using the [LAMB](https://arxiv.org/abs/1904.00962) optimizer, and we employ the hyperparameters recommended by the LAMB authors: a global batch size of 8192, a learning rate of 2.5e-3, and a warmup proportion of 0.28. Following the LAMB paper's recommendations, we pretrain for 56k steps with a maximum sequence length of 128 and then for 6k steps with a maximum sequence length of 512.
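For intuition, the warmup proportion above can be read as a linear ramp over the first 28% of training steps. The sketch below assumes a simple linear warmup followed by linear decay to zero; the exact decay shape used for SqueezeBERT pretraining is not specified here, so treat this as illustrative only.

```python
def lr_at(step, total_steps=56_000, peak_lr=2.5e-3, warmup_proportion=0.28):
    """Illustrative LR schedule: linear warmup to peak_lr, then linear decay."""
    warmup_steps = int(total_steps * warmup_proportion)  # 15,680 steps here
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    return peak_lr * (total_steps - step) / (total_steps - warmup_steps)
```

Under these assumptions the learning rate peaks at 2.5e-3 around step 15,680 and reaches zero at step 56,000, after which the second phase (6k steps at sequence length 512) would begin with its own schedule.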
## Finetuning
The SqueezeBERT paper reports results from two approaches to finetuning the model:
- "finetuning without bells and whistles" -- after pretraining the SqueezeBERT model, finetune it on each GLUE task
- "finetuning with bells and whistles" -- after pretraining the SqueezeBERT model, finetune it on a MNLI with distillation from a teacher model. Then, use the MNLI-finetuned SqueezeBERT model as a student model to finetune on each of the other GLUE tasks (e.g. RTE, MRPC, …) with distillation from a task-specific teacher model.
A detailed discussion of the hyperparameters used for finetuning is provided in the appendix of the [SqueezeBERT paper](https://arxiv.org/abs/2006.11316).
Note that finetuning SqueezeBERT with distillation is not yet implemented in this repo. If the author (Forrest Iandola - forrest.dnn@gmail.com) gets enough encouragement from the user community, he will add example code to Transformers for finetuning SqueezeBERT with distillation.
This model, `squeezebert/squeezebert-uncased`, has been pretrained but not finetuned. For most text classification tasks, we recommend using `squeezebert-mnli-headless` as a starting point.
### How to finetune
To try finetuning SqueezeBERT on the [MRPC](https://www.microsoft.com/en-us/download/details.aspx?id=52398) text classification task, you can run the following command:
```
./utils/download_glue_data.py
python examples/text-classification/run_glue.py \
--model_name_or_path squeezebert-base-headless \
--task_name mrpc \
--data_dir ./glue_data/MRPC \
--output_dir ./models/squeezebert_mrpc \
--overwrite_output_dir \
--do_train \
--do_eval \
--num_train_epochs 10 \
--learning_rate 3e-05 \
--per_device_train_batch_size 16 \
--save_steps 20000
```
## BibTeX entry and citation info
```
@article{2020_SqueezeBERT,
author = {Forrest N. Iandola and Albert E. Shaw and Ravi Krishna and Kurt W. Keutzer},
title = {{SqueezeBERT}: What can computer vision teach NLP about efficient neural networks?},
journal = {arXiv:2006.11316},
year = {2020}
}
```
| 3,698 | [
[
… (768 embedding values elided) …
]
] |
eachadea/vicuna-7b-1.1 | 2023-05-02T09:08:12.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | eachadea | null | null | eachadea/vicuna-7b-1.1 | 109 | 7,182 | transformers | 2023-04-13T03:45:52 | ---
license: apache-2.0
inference: true
---
**delta v1.1 merge**
# Vicuna Model Card
## Model details
**Model type:**
Vicuna is an open-source chatbot trained by fine-tuning LLaMA on user-shared conversations collected from ShareGPT.
It is an auto-regressive language model, based on the transformer architecture.
**Model date:**
Vicuna was trained between March 2023 and April 2023.
**Organizations developing the model:**
The Vicuna team with members from UC Berkeley, CMU, Stanford, and UC San Diego.
**Paper or resources for more information:**
https://vicuna.lmsys.org/
**License:**
Apache License 2.0
**Where to send questions or comments about the model:**
https://github.com/lm-sys/FastChat/issues
## Intended use
**Primary intended uses:**
The primary use of Vicuna is research on large language models and chatbots.
**Primary intended users:**
The primary intended users of the model are researchers and hobbyists in natural language processing, machine learning, and artificial intelligence.
## Training dataset
70K conversations collected from ShareGPT.com.
## Evaluation dataset
A preliminary evaluation of the model quality is conducted by creating a set of 80 diverse questions and utilizing GPT-4 to judge the model outputs. See https://vicuna.lmsys.org/ for more details.
## Major updates of weights v1.1
- Refactor the tokenization and separator. In Vicuna v1.1, the separator has been changed from `"###"` to the EOS token `"</s>"`. This change makes it easier to determine the generation stop criteria and enables better compatibility with other libraries.
- Fix the supervised fine-tuning loss computation for better model quality. | 1,669 | [
[
… (embedding values elided) …
-0.062347412109375,
-0.06756591796875,
-0.00481414794921875,
-0.0303192138671875,
0.00021028518676757812,
0.01404571533203125,
0.01454925537109375,
-0.037994384765625,
0.08013916015625,
-0.0246734619140625,
-0.028411865234375,
-0.00531005859375,
0.00917816162109375,
0.00485992431640625,
0.034393310546875,
0.04913330078125,
-0.0460205078125,
-0.029083251953125,
0.0019989013671875,
-0.059722900390625,
0.003818511962890625,
-0.01337432861328125,
-0.0281829833984375,
0.03253173828125,
0.03411865234375,
-0.044708251953125,
0.03070068359375,
0.050872802734375,
-0.042816162109375,
0.038055419921875,
-0.0022068023681640625,
0.00983428955078125,
-0.1060791015625,
-0.006366729736328125,
-0.0006966590881347656,
-0.0198974609375,
-0.039794921875,
0.01000213623046875,
-0.0009937286376953125,
0.0220947265625,
-0.056671142578125,
0.063232421875,
-0.0296783447265625,
0.016204833984375,
-0.035308837890625,
-0.0269775390625,
-0.0168304443359375,
0.03985595703125,
-0.0096435546875,
0.042755126953125,
0.04058837890625,
-0.07806396484375,
0.03472900390625,
-0.0012369155883789062,
-0.0086669921875,
0.0160980224609375,
-0.053497314453125,
0.0185699462890625,
-0.0078277587890625,
0.0171051025390625,
-0.07159423828125,
-0.00511932373046875,
0.0484619140625,
-0.03857421875,
0.013458251953125,
-0.00952911376953125,
-0.042144775390625,
-0.016204833984375,
-0.00835418701171875,
0.005359649658203125,
0.03167724609375,
-0.03790283203125,
0.037200927734375,
0.0253143310546875,
0.006717681884765625,
-0.03704833984375,
-0.043792724609375,
0.00801849365234375,
-0.0236663818359375,
-0.021697998046875,
-0.004253387451171875,
-0.0127716064453125,
-0.0205078125,
0.0037326812744140625,
0.0046539306640625,
-0.0219268798828125,
0.006587982177734375,
0.01593017578125,
0.0263671875,
-0.0122222900390625,
0.018707275390625,
0.0083160400390625,
0.0008549690246582031,
-0.01549530029296875,
-0.0155792236328125,
0.07366943359375,
-0.028533935546875,
0.00228118896484375,
-0.050262451171875,
-0.00821685791015625,
0.053802490234375,
0.00907135009765625,
0.08685302734375,
0.043182373046875,
-0.0268707275390625,
0.005435943603515625,
-0.0482177734375,
-0.01629638671875,
-0.03485107421875,
0.03759765625,
-0.02978515625,
-0.059600830078125,
0.05322265625,
0.0234527587890625,
0.0396728515625,
0.042755126953125,
0.0567626953125,
0.0043182373046875,
0.022186279296875,
0.06488037109375,
0.0020198822021484375,
0.07098388671875,
-0.024932861328125,
-0.006153106689453125,
-0.043212890625,
-0.0244903564453125,
-0.042938232421875,
0.0029354095458984375,
-0.06280517578125,
-0.04168701171875,
-0.0119476318359375,
0.01227569580078125,
-0.0305023193359375,
0.07275390625,
-0.045501708984375,
0.0133056640625,
0.0430908203125,
0.009979248046875,
0.01222991943359375,
-0.01416015625,
0.023406982421875,
0.00738525390625,
-0.0498046875,
-0.05169677734375,
0.08245849609375,
0.045379638671875,
0.0357666015625,
0.0201873779296875,
0.050628662109375,
0.0277099609375,
0.044891357421875,
-0.058135986328125,
0.037628173828125,
0.017425537109375,
-0.0648193359375,
-0.05224609375,
-0.03302001953125,
-0.08135986328125,
0.029632568359375,
-0.0017833709716796875,
-0.043060302734375,
0.0193939208984375,
0.0133514404296875,
-0.00409698486328125,
0.01898193359375,
-0.05706787109375,
0.07049560546875,
-0.01305389404296875,
-0.01502227783203125,
0.0038585662841796875,
-0.0181884765625,
0.03717041015625,
-0.00853729248046875,
-0.0028972625732421875,
-0.00878143310546875,
-0.006256103515625,
0.039154052734375,
-0.052581787109375,
0.0758056640625,
-0.0147552490234375,
-0.0302581787109375,
0.02001953125,
0.00859832763671875,
0.01885986328125,
-0.0149688720703125,
0.00829315185546875,
0.042938232421875,
0.00621795654296875,
-0.04534912109375,
-0.04376220703125,
0.044769287109375,
-0.076171875,
-0.02398681640625,
-0.030364990234375,
-0.0176849365234375,
0.01291656494140625,
0.0111541748046875,
0.039947509765625,
-0.0029354095458984375,
-0.0225677490234375,
0.003509521484375,
0.0203094482421875,
-0.01222991943359375,
0.016265869140625,
0.046356201171875,
-0.0278778076171875,
-0.04150390625,
0.04119873046875,
0.008148193359375,
0.0079498291015625,
0.0250091552734375,
-0.0028057098388671875,
-0.023284912109375,
-0.00688934326171875,
-0.01110076904296875,
0.0198974609375,
-0.0511474609375,
-0.01404571533203125,
-0.045745849609375,
-0.020233154296875,
-0.04150390625,
0.0205841064453125,
-0.069580078125,
-0.0276947021484375,
-0.018096923828125,
-0.00823211669921875,
0.036041259765625,
0.039764404296875,
0.01230621337890625,
0.05108642578125,
-0.050537109375,
0.00957489013671875,
0.01358795166015625,
0.0276031494140625,
-0.00679779052734375,
-0.049102783203125,
-0.035675048828125,
0.004901885986328125,
-0.0203857421875,
-0.07086181640625,
0.04095458984375,
-0.0255126953125,
0.044464111328125,
0.0278778076171875,
0.0029544830322265625,
0.057769775390625,
-0.00794219970703125,
0.057830810546875,
0.01407623291015625,
-0.05548095703125,
0.04010009765625,
-0.0246734619140625,
0.02520751953125,
0.042755126953125,
0.031494140625,
-0.058380126953125,
-0.033782958984375,
-0.0560302734375,
-0.0677490234375,
0.03948974609375,
0.0273895263671875,
0.007434844970703125,
-0.00560760498046875,
0.02093505859375,
0.01617431640625,
0.01482391357421875,
-0.049530029296875,
-0.046905517578125,
-0.020843505859375,
-0.01006317138671875,
-0.01126861572265625,
-0.0191192626953125,
0.0066986083984375,
-0.0198974609375,
0.05657958984375,
0.006313323974609375,
0.031341552734375,
0.005512237548828125,
0.005706787109375,
-0.00847625732421875,
0.0128631591796875,
0.042633056640625,
0.033172607421875,
-0.0299835205078125,
-0.02801513671875,
-0.004749298095703125,
-0.033172607421875,
-0.005161285400390625,
0.0119781494140625,
-0.007244110107421875,
0.0250701904296875,
0.0097198486328125,
0.08612060546875,
0.006046295166015625,
-0.023284912109375,
0.007068634033203125,
-0.046661376953125,
-0.019134521484375,
-0.05780029296875,
0.0155792236328125,
-0.0031986236572265625,
0.027435302734375,
0.0170745849609375,
-0.0050048828125,
0.0015344619750976562,
-0.047271728515625,
-0.01190185546875,
0.0145721435546875,
-0.036712646484375,
-0.01102447509765625,
0.04730224609375,
0.0131683349609375,
-0.042510986328125,
0.041473388671875,
0.006137847900390625,
-0.0245513916015625,
0.043060302734375,
0.003391265869140625,
0.089599609375,
-0.028106689453125,
0.00937652587890625,
0.045135498046875,
0.035552978515625,
-0.01503753662109375,
0.01444244384765625,
-0.00724029541015625,
-0.05706787109375,
-0.002765655517578125,
-0.025360107421875,
-0.05194091796875,
0.031890869140625,
-0.0552978515625,
0.04986572265625,
-0.0296173095703125,
-0.036834716796875,
-0.0223846435546875,
0.0159759521484375,
-0.07147216796875,
0.006198883056640625,
0.0078582763671875,
0.06939697265625,
-0.0689697265625,
0.0784912109375,
0.043548583984375,
-0.052032470703125,
-0.060150146484375,
-0.01148223876953125,
-0.00850677490234375,
-0.0307769775390625,
0.0263671875,
-0.001010894775390625,
0.0087127685546875,
-0.00518798828125,
-0.0511474609375,
-0.05230712890625,
0.10552978515625,
0.035919189453125,
-0.054595947265625,
-0.006313323974609375,
0.0020732879638671875,
0.047607421875,
-0.0037841796875,
0.0352783203125,
0.032318115234375,
0.023406982421875,
0.00803375244140625,
-0.092529296875,
0.0025081634521484375,
-0.026458740234375,
-0.0007333755493164062,
-0.004253387451171875,
-0.070068359375,
0.0784912109375,
0.0124969482421875,
-0.01421356201171875,
0.0155181884765625,
0.062347412109375,
0.036407470703125,
0.0134124755859375,
0.033447265625,
0.0159759521484375,
0.0819091796875,
-0.00662994384765625,
0.0814208984375,
-0.0253753662109375,
0.0281524658203125,
0.09429931640625,
0.0086212158203125,
0.057037353515625,
0.03515625,
-0.00019550323486328125,
0.03668212890625,
0.060302734375,
0.0159759521484375,
0.023590087890625,
-0.00537109375,
-0.00823211669921875,
-0.004718780517578125,
0.01080322265625,
-0.044708251953125,
0.0288238525390625,
0.007541656494140625,
-0.027191162109375,
0.005096435546875,
-0.006359100341796875,
0.0189056396484375,
-0.0244598388671875,
-0.01062774658203125,
0.048431396484375,
-0.0009589195251464844,
-0.050018310546875,
0.05706787109375,
0.01025390625,
0.0631103515625,
-0.052978515625,
-0.00020575523376464844,
-0.040985107421875,
0.0241851806640625,
0.00742340087890625,
-0.0237579345703125,
0.00850677490234375,
0.003902435302734375,
0.0037364959716796875,
0.0040435791015625,
0.029876708984375,
-0.02801513671875,
-0.0222625732421875,
0.0263519287109375,
0.031219482421875,
0.0335693359375,
-0.01739501953125,
-0.054595947265625,
0.02789306640625,
-0.00994110107421875,
-0.0272064208984375,
0.00855255126953125,
0.041748046875,
-0.01120758056640625,
0.06805419921875,
0.047149658203125,
0.007358551025390625,
0.01039886474609375,
0.02142333984375,
0.0732421875,
-0.048004150390625,
-0.04656982421875,
-0.058868408203125,
0.03985595703125,
-0.002948760986328125,
-0.0345458984375,
0.06097412109375,
0.051025390625,
0.054534912109375,
-0.0087890625,
0.05645751953125,
0.0033092498779296875,
0.0157623291015625,
-0.031036376953125,
0.0628662109375,
-0.04840087890625,
0.0205078125,
-0.014984130859375,
-0.07208251953125,
-0.0081329345703125,
0.035125732421875,
-0.001575469970703125,
0.0041351318359375,
0.050323486328125,
0.05743408203125,
0.016204833984375,
-0.0120697021484375,
0.038055419921875,
0.016082763671875,
0.04888916015625,
0.0264739990234375,
0.03936767578125,
-0.0677490234375,
0.039825439453125,
-0.01593017578125,
-0.0128326416015625,
-0.03863525390625,
-0.04364013671875,
-0.0677490234375,
-0.058929443359375,
-0.025909423828125,
-0.03985595703125,
0.022918701171875,
0.0758056640625,
0.04345703125,
-0.0352783203125,
-0.033660888671875,
-0.0277252197265625,
-0.014678955078125,
-0.016510009765625,
-0.019744873046875,
0.01360321044921875,
-0.02020263671875,
-0.079833984375,
0.00009417533874511719,
-0.0120391845703125,
0.01702880859375,
-0.034912109375,
-0.027984619140625,
-0.0021610260009765625,
0.012054443359375,
0.0207061767578125,
0.050506591796875,
-0.0478515625,
0.0037937164306640625,
0.0011720657348632812,
-0.03863525390625,
0.0034999847412109375,
0.03369140625,
-0.041595458984375,
0.0296478271484375,
0.0272674560546875,
0.0075836181640625,
0.0379638671875,
-0.01238250732421875,
0.039886474609375,
-0.0306854248046875,
0.029541015625,
0.0114593505859375,
0.0301971435546875,
0.02105712890625,
-0.0301361083984375,
0.03167724609375,
-0.006023406982421875,
-0.0277862548828125,
-0.06610107421875,
-0.0025844573974609375,
-0.07574462890625,
-0.0239105224609375,
0.09454345703125,
0.0128173828125,
-0.0380859375,
-0.01181793212890625,
-0.03564453125,
0.057159423828125,
-0.0262603759765625,
0.06268310546875,
0.0458984375,
0.028350830078125,
-0.037811279296875,
-0.040313720703125,
0.0345458984375,
0.007785797119140625,
-0.06768798828125,
0.00445556640625,
0.0290679931640625,
0.0238189697265625,
0.000042498111724853516,
0.08642578125,
-0.00690460205078125,
0.032318115234375,
0.01204681396484375,
0.058380126953125,
-0.03131103515625,
-0.02838134765625,
-0.026123046875,
-0.0171966552734375,
0.0104217529296875,
-0.04400634765625
]
] |
anas-awadalla/mpt-1b-redpajama-200b-hf-style | 2023-09-02T06:41:10.000Z | [
"transformers",
"pytorch",
"mosaic_gpt",
"text-generation",
"custom_code",
"dataset:togethercomputer/RedPajama-Data-1T",
"arxiv:2302.13971",
"arxiv:2205.14135",
"arxiv:2108.12409",
"license:apache-2.0",
"region:us"
] | text-generation | anas-awadalla | null | null | anas-awadalla/mpt-1b-redpajama-200b-hf-style | 0 | 7,179 | transformers | 2023-09-02T04:42:19 | ---
license: apache-2.0
datasets:
- togethercomputer/RedPajama-Data-1T
---
# MPT-1b-RedPajama-200b
MPT-1b-RedPajama-200b is a 1.3 billion parameter decoder-only transformer trained on the [RedPajama dataset](https://huggingface.co/datasets/togethercomputer/RedPajama-Data-1T).
The model was trained for 200B tokens by sampling from the subsets of the RedPajama dataset in the same proportions as were used by the [Llama series of models](https://arxiv.org/abs/2302.13971).
This model was trained by [MosaicML](https://www.mosaicml.com) and follows a modified decoder-only transformer architecture.
## Model Date
April 20, 2023
## How to Use
Note: This model requires that `trust_remote_code=True` be passed to the `from_pretrained` method.
This is because we use a custom model architecture `MosaicGPT` that is not yet part of the `transformers` package.
`MosaicGPT` includes options for many training efficiency features such as [FlashAttention (Dao et al. 2022)](https://arxiv.org/pdf/2205.14135.pdf), [ALIBI](https://arxiv.org/abs/2108.12409), QK LayerNorm, and more.
```python
import transformers
model = transformers.AutoModelForCausalLM.from_pretrained('mosaicml/mpt-1b-redpajama-200b', trust_remote_code=True)
```
To use the optimized triton implementation of FlashAttention, you can load with `attn_impl='triton'` and move the model to `bfloat16` like so:
```python
import torch
import transformers

model = transformers.AutoModelForCausalLM.from_pretrained('mosaicml/mpt-1b-redpajama-200b', trust_remote_code=True, attn_impl='triton')
model.to(device='cuda:0', dtype=torch.bfloat16)
```
## Model Description
This model uses the MosaicML LLM codebase, which can be found in the [MosaicML Examples Repository](https://github.com/mosaicml/examples/tree/v0.0.4/examples/llm).
The architecture is a modification of a standard decoder-only transformer.
The transformer has 24 layers, 16 attention heads, and width 2048.
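As a rough sanity check, these dimensions are consistent with the 1.3 billion parameter figure. The sketch below assumes a 4x MLP expansion and the padded gpt-neox-20b vocabulary size of 50432 (both assumptions, not stated in this card) and ignores the small layer-norm contribution:

```python
# Back-of-envelope parameter count from the stated dimensions.
# Assumptions: 4x MLP expansion, vocabulary size 50432 (padded
# gpt-neox-20b vocab); layer norms ignored; no biases, per the card.
d_model, n_layers, vocab = 2048, 24, 50432

attn = 4 * d_model * d_model       # Q, K, V and output projections
mlp = 2 * d_model * (4 * d_model)  # up- and down-projections
total = n_layers * (attn + mlp) + vocab * d_model

print(f"{total / 1e9:.2f}B parameters")  # ≈ 1.31B
```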
The model has been modified from a standard transformer in the following ways:
* It uses ALiBi and does not use positional embeddings.
* It uses QK LayerNorm.
* It does not use biases.
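ALiBi replaces positional embeddings with a per-head linear bias on attention scores. A minimal sketch of the idea (slope schedule from the ALiBi paper; the model's actual implementation lives in the MosaicGPT code):

```python
# ALiBi adds a head-specific linear penalty, -slope * distance, to
# attention scores instead of using positional embeddings. For a head
# count that is a power of two, slopes form a geometric sequence
# starting at 2**(-8 / n_heads).
def alibi_slopes(n_heads):
    start = 2 ** (-8 / n_heads)
    return [start ** (i + 1) for i in range(n_heads)]

def alibi_bias(seq_len, slope):
    # bias[i][j] = -slope * (i - j); positions j > i would be removed
    # by the causal attention mask, so only j <= i matters.
    return [[-slope * (i - j) for j in range(seq_len)] for i in range(seq_len)]

slopes = alibi_slopes(16)              # this model uses 16 heads
print(round(slopes[0], 4))             # 0.7071 (= 2**-0.5)
print(alibi_bias(4, slopes[0])[3][0])  # most distant key gets the largest penalty
```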
## Training Data
The model was trained for 200B tokens (batch size 2200, sequence length 2048). It was trained on the following data mix:
* 67% RedPajama Common Crawl
* 15% [C4](https://huggingface.co/datasets/c4)
* 4.5% RedPajama GitHub
* 4.5% RedPajama Wikipedia
* 4.5% RedPajama Books
* 2.5% RedPajama Arxiv
* 2% RedPajama StackExchange
This is the same mix of data as was used in the [Llama series of models](https://arxiv.org/abs/2302.13971).
Each sample was chosen from one of the datasets, with the dataset selected with the probability specified above.
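The per-sample dataset selection described above can be sketched as weighted sampling over the stated mix (a hedged illustration, not the actual MosaicML data loader):

```python
# Weighted per-sample dataset selection using the mix from this card.
import random

weights = {
    'redpajama_common_crawl': 0.67,
    'c4': 0.15,
    'redpajama_github': 0.045,
    'redpajama_wikipedia': 0.045,
    'redpajama_books': 0.045,
    'redpajama_arxiv': 0.025,
    'redpajama_stackexchange': 0.02,
}
assert abs(sum(weights.values()) - 1.0) < 1e-9  # the mix covers everything

random.seed(0)
names = list(weights)
picks = random.choices(names, weights=[weights[n] for n in names], k=10_000)
print(picks.count('redpajama_common_crawl') / len(picks))  # close to 0.67
```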
The examples were shuffled within each dataset.
Each example was constructed from as many sequences from that dataset as were necessary to fill the 2048 sequence length.
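The sequence-packing step can be sketched as follows (a simplified illustration: concatenate tokenized sequences from one dataset and cut fixed 2048-token examples; the real pipeline may handle remainders differently):

```python
# Pack variable-length token sequences into fixed-length examples.
def pack(sequences, seq_len=2048):
    packed, buf = [], []
    for seq in sequences:
        buf.extend(seq)
        while len(buf) >= seq_len:
            packed.append(buf[:seq_len])
            buf = buf[seq_len:]
    return packed  # any trailing partial buffer is dropped here

examples = pack([[1] * 1500, [2] * 1500, [3] * 1500])
print(len(examples), len(examples[0]))  # 2 2048
```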
The data was tokenized using the [EleutherAI/gpt-neox-20b](https://huggingface.co/EleutherAI/gpt-neox-20b) tokenizer.
## Training Configuration
This model was trained on 440 A100-40GBs for about half a day using the [MosaicML Platform](https://www.mosaicml.com/platform). The model was trained with sharded data parallelism using FSDP.
## Acknowledgements
This model builds on the work of [Together](https://www.together.xyz), which created the RedPajama dataset with the goal of mimicking the training data used to create the Llama series of models.
We gratefully acknowledge the hard work of the team that put together this dataset, and we hope this model serves as a useful companion to that work.
We also gratefully acknowledge the work of the researchers who created the Llama series of models, which was the impetus for our efforts, and of those who worked on the RedPajama project.
| 3,710 | [
[
-0.038482666015625,
-0.0206146240234375,
0.0192718505859375,
0.03619384765625,
-0.0340576171875,
-0.0024623870849609375,
-0.0006194114685058594,
-0.032989501953125,
0.021728515625,
0.03887939453125,
-0.049896240234375,
-0.04107666015625,
-0.056060791015625,
0.015655517578125,
-0.03594970703125,
0.07476806640625,
-0.00505828857421875,
-0.00859832763671875,
0.005496978759765625,
0.00024819374084472656,
-0.0205230712890625,
-0.0187530517578125,
-0.031463623046875,
-0.0260772705078125,
0.0270538330078125,
0.016937255859375,
0.055267333984375,
0.0491943359375,
0.038909912109375,
0.0204010009765625,
-0.0162200927734375,
0.0107269287109375,
-0.036163330078125,
-0.03363037109375,
0.00885009765625,
-0.045623779296875,
-0.0350341796875,
0.01192474365234375,
0.038421630859375,
0.021514892578125,
-0.01538848876953125,
0.043792724609375,
-0.0186309814453125,
0.0259552001953125,
-0.03253173828125,
-0.0011854171752929688,
-0.033203125,
0.0157012939453125,
-0.0092010498046875,
0.00015175342559814453,
-0.035400390625,
-0.0294036865234375,
0.0016584396362304688,
-0.04888916015625,
0.01568603515625,
0.0024814605712890625,
0.07183837890625,
0.03387451171875,
-0.03271484375,
0.01064300537109375,
-0.05023193359375,
0.06707763671875,
-0.043792724609375,
0.0118560791015625,
0.031005859375,
0.0242919921875,
0.005550384521484375,
-0.0750732421875,
-0.048309326171875,
-0.001590728759765625,
-0.01285552978515625,
0.0170440673828125,
-0.02874755859375,
-0.0255889892578125,
0.033966064453125,
0.0208282470703125,
-0.035736083984375,
-0.0201568603515625,
-0.029449462890625,
0.00447845458984375,
0.042449951171875,
0.029815673828125,
0.021636962890625,
-0.0250244140625,
-0.056121826171875,
-0.0255584716796875,
-0.0499267578125,
-0.0129547119140625,
0.0205078125,
-0.00685882568359375,
-0.03973388671875,
0.040679931640625,
0.0005621910095214844,
0.030975341796875,
0.006763458251953125,
-0.01409912109375,
0.0238189697265625,
-0.046173095703125,
-0.0221710205078125,
-0.01531982421875,
0.0748291015625,
0.0271759033203125,
0.021209716796875,
0.0024623870849609375,
-0.0082244873046875,
-0.0162200927734375,
0.0271759033203125,
-0.068115234375,
-0.0139923095703125,
0.012420654296875,
-0.03509521484375,
-0.00873565673828125,
0.00971221923828125,
-0.036865234375,
-0.01343536376953125,
-0.01366424560546875,
0.03973388671875,
-0.04620361328125,
-0.03131103515625,
0.0196990966796875,
-0.0158538818359375,
0.018096923828125,
0.01451873779296875,
-0.05364990234375,
0.0153656005859375,
0.03643798828125,
0.07403564453125,
-0.006855010986328125,
-0.0413818359375,
0.02447509765625,
0.0086669921875,
-0.0010004043579101562,
0.04034423828125,
-0.02264404296875,
-0.021484375,
-0.03271484375,
0.0157623291015625,
-0.019866943359375,
-0.03814697265625,
0.0101165771484375,
-0.045562744140625,
0.0230255126953125,
-0.01422119140625,
-0.0164031982421875,
-0.03753662109375,
0.0153656005859375,
-0.04791259765625,
0.06396484375,
0.031494140625,
-0.062744140625,
0.021820068359375,
-0.05963134765625,
-0.01334381103515625,
-0.006317138671875,
0.0239105224609375,
-0.06378173828125,
-0.01113128662109375,
0.0218353271484375,
0.0239105224609375,
-0.03448486328125,
0.014556884765625,
-0.00933074951171875,
-0.059600830078125,
0.0262451171875,
-0.032806396484375,
0.082763671875,
0.01207733154296875,
-0.04669189453125,
-0.0037136077880859375,
-0.057159423828125,
-0.0154876708984375,
0.04034423828125,
-0.03265380859375,
0.00934600830078125,
-0.029998779296875,
0.005523681640625,
0.0208892822265625,
0.01152801513671875,
-0.0413818359375,
0.0292816162109375,
-0.014129638671875,
0.0275115966796875,
0.03662109375,
0.0101165771484375,
0.0172882080078125,
-0.0408935546875,
0.043975830078125,
0.01666259765625,
0.035186767578125,
-0.01438140869140625,
-0.059234619140625,
-0.06341552734375,
-0.027862548828125,
0.024444580078125,
0.027984619140625,
-0.041961669921875,
0.0139312744140625,
-0.0222625732421875,
-0.05181884765625,
-0.053863525390625,
-0.0175628662109375,
0.030731201171875,
0.018768310546875,
0.051727294921875,
-0.022369384765625,
-0.0634765625,
-0.067626953125,
0.00864410400390625,
0.00315093994140625,
-0.001270294189453125,
0.0206146240234375,
0.060516357421875,
-0.04205322265625,
0.06494140625,
-0.02349853515625,
-0.003162384033203125,
-0.0160064697265625,
0.0134124755859375,
0.048248291015625,
0.032623291015625,
0.036712646484375,
-0.053466796875,
-0.04522705078125,
-0.01386260986328125,
-0.043914794921875,
0.00739288330078125,
-0.01824951171875,
-0.00615692138671875,
0.0081634521484375,
-0.00487518310546875,
-0.06329345703125,
0.036376953125,
0.04718017578125,
-0.0276031494140625,
0.03350830078125,
0.0033397674560546875,
0.01490020751953125,
-0.093505859375,
0.01751708984375,
-0.013275146484375,
-0.0174713134765625,
-0.045562744140625,
-0.0085296630859375,
0.01477813720703125,
0.010223388671875,
-0.06304931640625,
0.0228271484375,
-0.0257110595703125,
-0.0157623291015625,
-0.025299072265625,
-0.0268096923828125,
-0.00736236572265625,
0.054107666015625,
0.0159912109375,
0.07122802734375,
0.0220794677734375,
-0.036956787109375,
0.0184478759765625,
0.036163330078125,
-0.0255126953125,
0.0034332275390625,
-0.051727294921875,
0.00991058349609375,
0.018280029296875,
0.02496337890625,
-0.06390380859375,
-0.00267791748046875,
0.0255889892578125,
-0.02593994140625,
0.0192108154296875,
-0.023712158203125,
-0.03265380859375,
-0.043792724609375,
-0.01531982421875,
0.049041748046875,
0.05419921875,
-0.058929443359375,
0.03656005859375,
0.02093505859375,
0.02484130859375,
-0.06396484375,
-0.05731201171875,
0.0097503662109375,
-0.021209716796875,
-0.052337646484375,
0.03485107421875,
0.0032806396484375,
0.0019016265869140625,
-0.0132904052734375,
0.00853729248046875,
0.0122222900390625,
0.01335906982421875,
0.04449462890625,
0.0272369384765625,
-0.00879669189453125,
-0.02093505859375,
-0.01751708984375,
-0.031646728515625,
0.00804901123046875,
-0.016937255859375,
0.0777587890625,
-0.0216522216796875,
-0.028778076171875,
-0.0550537109375,
0.007404327392578125,
0.0447998046875,
0.0011768341064453125,
0.08062744140625,
0.064208984375,
-0.00960540771484375,
0.006320953369140625,
-0.034271240234375,
-0.005401611328125,
-0.033447265625,
0.0189666748046875,
-0.00732421875,
-0.04302978515625,
0.043548583984375,
0.0147705078125,
-0.011444091796875,
0.041595458984375,
0.06060791015625,
-0.004817962646484375,
0.0645751953125,
0.035369873046875,
0.0081634521484375,
0.038665771484375,
-0.057708740234375,
-0.00580596923828125,
-0.07745361328125,
-0.0283050537109375,
-0.00986480712890625,
-0.036163330078125,
-0.0460205078125,
-0.0465087890625,
0.023681640625,
-0.01519775390625,
-0.055511474609375,
0.0643310546875,
-0.046234130859375,
0.037445068359375,
0.0621337890625,
0.023223876953125,
0.01561737060546875,
-0.0113983154296875,
0.0085601806640625,
0.00859832763671875,
-0.0523681640625,
-0.027984619140625,
0.10260009765625,
0.036346435546875,
0.045440673828125,
0.00704193115234375,
0.060028076171875,
-0.00922393798828125,
0.037628173828125,
-0.0299835205078125,
0.039642333984375,
0.006748199462890625,
-0.048431396484375,
0.001132965087890625,
-0.0321044921875,
-0.06689453125,
0.01715087890625,
-0.0204315185546875,
-0.031524658203125,
0.024627685546875,
-0.006343841552734375,
-0.03289794921875,
0.03509521484375,
-0.04498291015625,
0.051666259765625,
-0.005859375,
-0.0136260986328125,
-0.00237274169921875,
-0.05096435546875,
0.04644775390625,
-0.0160369873046875,
-0.014190673828125,
0.00295257568359375,
-0.0006399154663085938,
0.06671142578125,
-0.0287017822265625,
0.0675048828125,
-0.00873565673828125,
0.0070343017578125,
0.0286712646484375,
-0.004486083984375,
0.04150390625,
0.006855010986328125,
0.00977325439453125,
0.051788330078125,
0.0008263587951660156,
-0.0257568359375,
-0.0032444000244140625,
0.02423095703125,
-0.0894775390625,
-0.0465087890625,
-0.032012939453125,
-0.051910400390625,
0.0106658935546875,
0.0077056884765625,
0.03863525390625,
-0.009735107421875,
0.0161895751953125,
0.0222625732421875,
0.035736083984375,
-0.0302581787109375,
0.05633544921875,
0.031280517578125,
-0.0060882568359375,
-0.043731689453125,
0.0504150390625,
-0.00354766845703125,
0.0229644775390625,
0.01849365234375,
0.00409698486328125,
-0.017333984375,
-0.043487548828125,
-0.0135498046875,
0.039886474609375,
-0.039703369140625,
-0.033447265625,
-0.05462646484375,
-0.0260009765625,
-0.01528167724609375,
0.006450653076171875,
-0.048980712890625,
-0.037261962890625,
-0.036865234375,
0.005039215087890625,
0.02593994140625,
0.051513671875,
0.005970001220703125,
0.047119140625,
-0.0626220703125,
0.02484130859375,
0.026153564453125,
0.021728515625,
0.00080108642578125,
-0.06640625,
-0.025238037109375,
0.009124755859375,
-0.03179931640625,
-0.05487060546875,
0.0457763671875,
-0.0146484375,
0.0204925537109375,
0.0111083984375,
-0.014068603515625,
0.059783935546875,
-0.024505615234375,
0.07373046875,
0.0203857421875,
-0.058563232421875,
0.03656005859375,
-0.025482177734375,
0.030426025390625,
0.017120361328125,
0.04168701171875,
-0.032684326171875,
-0.0189971923828125,
-0.0679931640625,
-0.053985595703125,
0.0809326171875,
0.03192138671875,
0.0033779144287109375,
-0.00745391845703125,
0.0287322998046875,
0.00351715087890625,
0.0064849853515625,
-0.09564208984375,
-0.023223876953125,
-0.0347900390625,
-0.010467529296875,
0.0011692047119140625,
-0.0201416015625,
-0.0228424072265625,
-0.0230865478515625,
0.055267333984375,
0.0015649795532226562,
0.0509033203125,
-0.0018749237060546875,
-0.018890380859375,
-0.0227203369140625,
-0.0119476318359375,
0.055572509765625,
0.0421142578125,
-0.024627685546875,
-0.0036716461181640625,
0.0312347412109375,
-0.06201171875,
0.015167236328125,
-0.003284454345703125,
-0.016357421875,
-0.01396942138671875,
0.04241943359375,
0.06793212890625,
0.01175689697265625,
-0.0244903564453125,
0.052093505859375,
-0.0250244140625,
-0.014373779296875,
-0.0250091552734375,
0.0211639404296875,
0.02362060546875,
0.041778564453125,
0.0240478515625,
0.014862060546875,
-0.018798828125,
-0.01800537109375,
0.0183868408203125,
0.0248565673828125,
-0.0016279220581054688,
-0.0306549072265625,
0.06329345703125,
-0.0150604248046875,
-0.024627685546875,
0.05755615234375,
0.004703521728515625,
-0.02655029296875,
0.06634521484375,
0.0675048828125,
0.058258056640625,
-0.01348114013671875,
0.0146026611328125,
0.04693603515625,
0.02349853515625,
-0.021240234375,
0.0106964111328125,
-0.00589752197265625,
-0.048309326171875,
-0.0286102294921875,
-0.061798095703125,
-0.02874755859375,
0.000021457672119140625,
-0.041015625,
0.0263519287109375,
-0.028289794921875,
-0.020416259765625,
-0.03314208984375,
0.005550384521484375,
-0.052398681640625,
0.01180267333984375,
0.023193359375,
0.07769775390625,
-0.06512451171875,
0.07086181640625,
0.0306549072265625,
-0.041961669921875,
-0.06634521484375,
-0.0089874267578125,
-0.0174407958984375,
-0.09588623046875,
0.0426025390625,
0.01512908935546875,
0.0128173828125,
0.0015277862548828125,
-0.048004150390625,
-0.0863037109375,
0.127197265625,
0.034820556640625,
-0.03948974609375,
0.007480621337890625,
0.03662109375,
0.03753662109375,
-0.0305633544921875,
0.05047607421875,
0.050628662109375,
0.037750244140625,
0.01702880859375,
-0.05413818359375,
0.01342010498046875,
-0.01416778564453125,
0.01082611083984375,
0.0142822265625,
-0.06829833984375,
0.078369140625,
-0.023406982421875,
-0.0267486572265625,
0.01073455810546875,
0.037139892578125,
0.032623291015625,
0.0102996826171875,
0.030548095703125,
0.06689453125,
0.0294952392578125,
-0.02056884765625,
0.10443115234375,
-0.0284271240234375,
0.048004150390625,
0.06396484375,
0.0276031494140625,
0.04443359375,
0.037933349609375,
-0.03546142578125,
0.03350830078125,
0.05706787109375,
-0.0208892822265625,
0.044525146484375,
-0.01558685302734375,
-0.004100799560546875,
-0.016998291015625,
0.00811767578125,
-0.0360107421875,
0.0283355712890625,
0.00904083251953125,
-0.04241943359375,
-0.0010824203491210938,
-0.003620147705078125,
0.0010967254638671875,
-0.039886474609375,
-0.0115509033203125,
0.04974365234375,
0.01007843017578125,
-0.04132080078125,
0.06451416015625,
-0.00725555419921875,
0.05169677734375,
-0.034423828125,
0.009674072265625,
-0.032318115234375,
0.0010728836059570312,
-0.0230560302734375,
-0.06390380859375,
0.029541015625,
-0.0032444000244140625,
-0.00611114501953125,
-0.0212249755859375,
0.0220184326171875,
-0.019073486328125,
-0.0288238525390625,
0.022705078125,
0.010040283203125,
0.01049041748046875,
-0.006500244140625,
-0.0589599609375,
-0.0018358230590820312,
0.01486968994140625,
-0.03851318359375,
0.024078369140625,
0.007236480712890625,
0.0157928466796875,
0.0528564453125,
0.05487060546875,
-0.0017652511596679688,
0.012969970703125,
0.0024394989013671875,
0.0650634765625,
-0.07135009765625,
-0.0188751220703125,
-0.06585693359375,
0.054901123046875,
0.01435089111328125,
-0.030426025390625,
0.05133056640625,
0.030120849609375,
0.055419921875,
-0.01080322265625,
0.019439697265625,
0.006732940673828125,
0.0172576904296875,
-0.0281982421875,
0.048187255859375,
-0.035736083984375,
0.03314208984375,
-0.01459503173828125,
-0.09014892578125,
-0.026092529296875,
0.044342041015625,
-0.018310546875,
0.012725830078125,
0.04388427734375,
0.061614990234375,
-0.01393890380859375,
0.01446533203125,
0.00923919677734375,
0.019378662109375,
0.01593017578125,
0.06494140625,
0.0733642578125,
-0.06854248046875,
0.043792724609375,
-0.037811279296875,
-0.0021915435791015625,
-0.016571044921875,
-0.05438232421875,
-0.0643310546875,
-0.032470703125,
-0.0212249755859375,
-0.0206298828125,
-0.016876220703125,
0.0648193359375,
0.05670166015625,
-0.0491943359375,
-0.00849151611328125,
-0.0065460205078125,
0.0016307830810546875,
-0.016387939453125,
-0.01166534423828125,
0.0234375,
0.00809478759765625,
-0.05499267578125,
0.018707275390625,
0.0155487060546875,
0.028533935546875,
-0.005413055419921875,
-0.00749969482421875,
-0.0310516357421875,
-0.004009246826171875,
0.0239105224609375,
0.0102081298828125,
-0.037139892578125,
-0.025665283203125,
-0.006786346435546875,
-0.008026123046875,
0.027618408203125,
0.0323486328125,
-0.05841064453125,
0.0032634735107421875,
0.01377105712890625,
0.030517578125,
0.07818603515625,
0.00569915771484375,
0.0182647705078125,
-0.04638671875,
0.00860595703125,
0.01397705078125,
0.038055419921875,
0.01482391357421875,
-0.027984619140625,
0.05517578125,
0.02642822265625,
-0.0447998046875,
-0.05419921875,
-0.0002646446228027344,
-0.0794677734375,
-0.0159454345703125,
0.09185791015625,
-0.00539398193359375,
-0.033935546875,
0.00893402099609375,
-0.00661468505859375,
0.034149169921875,
-0.0003483295440673828,
0.05963134765625,
0.026397705078125,
-0.01198577880859375,
-0.035491943359375,
-0.01190185546875,
0.01611328125,
0.026885986328125,
-0.04290771484375,
-0.01096343994140625,
-0.0019855499267578125,
0.039459228515625,
0.03326416015625,
0.0298919677734375,
-0.0141448974609375,
0.03839111328125,
0.0022220611572265625,
0.021697998046875,
-0.03204345703125,
-0.0242462158203125,
-0.02642822265625,
0.0218658447265625,
-0.031463623046875,
-0.01154327392578125
]
] |
Envoid/Libra-19B | 2023-09-14T12:51:30.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"license:cc-by-nc-4.0",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text-generation | Envoid | null | null | Envoid/Libra-19B | 1 | 7,179 | transformers | 2023-09-10T07:47:25 | ---
license: cc-by-nc-4.0
---
## Warning: This model may output adult content.
# Libra-19B
This model is made using [chargoddard's mergekit](https://github.com/cg123/mergekit/tree/main).
In this experiment I started with a very 'free' model, [MLewd V2-2](https://huggingface.co/Undi95/MLewd-L2-13B-v2-2), and ran the following bakllama script:
```
layer_slices:
- model: Undi95_MLewd-L2-13B-v2-2
start: 0
end: 40
- model: NousResearch_Llama-2-13b-chat-hf
start: 19
end: 20
- model: NousResearch_Llama-2-13b-chat-hf
start: 18
end: 19
- model: NousResearch_Llama-2-13b-chat-hf
start: 17
end: 18
- model: NousResearch_Llama-2-13b-chat-hf
start: 16
end: 17
- model: NousResearch_Llama-2-13b-chat-hf
start: 15
end: 16
- model: NousResearch_Llama-2-13b-chat-hf
start: 14
end: 15
- model: NousResearch_Llama-2-13b-chat-hf
start: 13
end: 14
- model: NousResearch_Llama-2-13b-chat-hf
start: 12
end: 13
- model: NousResearch_Llama-2-13b-chat-hf
start: 11
end: 12
- model: NousResearch_Llama-2-13b-chat-hf
start: 10
end: 11
- model: NousResearch_Llama-2-13b-chat-hf
start: 9
end: 10
- model: NousResearch_Llama-2-13b-chat-hf
start: 8
end: 9
- model: NousResearch_Llama-2-13b-chat-hf
start: 7
end: 8
- model: NousResearch_Llama-2-13b-chat-hf
start: 6
end: 7
- model: NousResearch_Llama-2-13b-chat-hf
start: 5
end: 6
- model: NousResearch_Llama-2-13b-chat-hf
start: 4
end: 5
- model: NousResearch_Llama-2-13b-chat-hf
start: 3
end: 4
- model: NousResearch_Llama-2-13b-chat-hf
start: 2
end: 3
- model: NousResearch_Llama-2-13b-chat-hf
start: 1
end: 2
- model: NousResearch_Llama-2-13b-chat-hf
start: 0
end: 1
```
This takes the lower 20 layers of [NousResearch/Llama-2-13b-chat-hf](https://huggingface.co/NousResearch/Llama-2-13b-chat-hf) and stacks them onto MLewd in reverse order, so that their orientation relative to the higher layers of the original model is, in a way, symmetrical.
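The repetitive per-layer slices in the script above can be generated programmatically; a minimal sketch (using the same model names as the script):

```python
# Build the layer_slices list: MLewd layers 0-40 first,
# then Llama-2-13b-chat layers 19 down to 0, one slice each (reversed order).
slices = [{"model": "Undi95_MLewd-L2-13B-v2-2", "start": 0, "end": 40}]
for i in range(19, -1, -1):
    slices.append({"model": "NousResearch_Llama-2-13b-chat-hf", "start": i, "end": i + 1})

# Emit the YAML fragment shown above.
for s in slices:
    print(f"  - model: {s['model']}\n    start: {s['start']}\n    end: {s['end']}")
```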
I then used QLoRA to apply a very strong LoRA to the model: 2 epochs at a 1e-6 learning rate on 60 megabytes of raw corpus, with a rank and alpha of 1024, using a single A100 in 4-bit.
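For illustration only, those hyperparameters could be expressed in an axolotl-style QLoRA config roughly like the following (this is a hypothetical sketch, not the config actually used, and field names follow axolotl's conventions rather than the qlora script's):

```yaml
adapter: qlora
load_in_4bit: true
lora_r: 1024
lora_alpha: 1024
learning_rate: 0.000001   # 1e-6
num_epochs: 2
```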
I wasn't able to load the model in text-generation-webui afterwards, so all subsequent testing was done in koboldcpp after converting it to GGUF.
The model does not respond as well to the Llama-2 format as it does to Alpaca-instruct-style formatting. Optimal formatting and generation parameters for roleplay are still a work in progress, but it shows a decent attention to detail with regard to tavern cards. It's very good at instruct-style prompts and can even write simple Python scripts.
## Example:
```
### Instruction:
Tell me about "the love languages" and which one resonates with you the most.
### Response:
The concept of "love languages" was introduced by author Gary Chapman in his book "The Five Love Languages: How to Express Heartfelt Commitment to Your Mate." According to this theory, there are five ways individuals express love and feel loved: words of affirmation, acts of service, receiving gifts, quality time, and physical touch. Each person has a primary love language that resonates with them the most, while others may prefer different ways of expressing or receiving love.
For example, someone whose primary love language is words of affirmation values verbal expressions of affection, appreciation, and praise. They enjoy being told how much they're loved, cherished, and valued in their relationship. On the other hand, someone whose primary love language is acts of service values actions rather than words. They feel loved when their partner helps them out around the house or carries out tasks they find difficult or stressful. Gift-giving is another love language where gift-givers often feel most appreciated when they receive presents from their partner. Physical touch is another way people express love through acts such as kissing, hugging, or holding hands. Quality time is spent together engaging in activities such as dates or shared experiences.
My own primary love language is words of affirmation. I value verbal expressions of love and appreciation in my relationships because it makes me feel seen, understood, and cherished. When my partner tells me how much they love me or expresses gratitude for something I've done for them, it reinforces our bond and strengthens our connection.
``` | 4,535 | [
[
-0.02508544921875,
-0.054046630859375,
0.0236663818359375,
0.03436279296875,
-0.0279083251953125,
-0.01312255859375,
-0.0032596588134765625,
-0.044891357421875,
0.026641845703125,
0.038238525390625,
-0.04742431640625,
-0.039398193359375,
-0.029571533203125,
-0.01197052001953125,
-0.02813720703125,
0.06854248046875,
0.0193023681640625,
0.00017058849334716797,
-0.00937652587890625,
-0.01514434814453125,
-0.02337646484375,
-0.04254150390625,
-0.05450439453125,
-0.0279693603515625,
0.05096435546875,
0.031280517578125,
0.061676025390625,
0.048248291015625,
0.04254150390625,
0.0218963623046875,
-0.0241241455078125,
0.02337646484375,
-0.043121337890625,
-0.01079559326171875,
-0.002460479736328125,
-0.0400390625,
-0.0352783203125,
0.0019855499267578125,
0.0268707275390625,
0.0323486328125,
-0.01959228515625,
0.0156402587890625,
-0.008514404296875,
0.066650390625,
-0.03887939453125,
0.003383636474609375,
-0.03424072265625,
0.0013437271118164062,
-0.01715087890625,
0.007354736328125,
-0.017730712890625,
-0.0233612060546875,
0.00592803955078125,
-0.047607421875,
-0.00577545166015625,
0.005321502685546875,
0.09552001953125,
0.0027370452880859375,
-0.049285888671875,
-0.033050537109375,
-0.04266357421875,
0.054931640625,
-0.07269287109375,
0.032135009765625,
0.0552978515625,
0.0087432861328125,
-0.01421356201171875,
-0.042938232421875,
-0.041290283203125,
-0.02972412109375,
-0.020904541015625,
-0.0008788108825683594,
-0.023681640625,
-0.01349639892578125,
0.0195159912109375,
0.03436279296875,
-0.050506591796875,
0.010986328125,
-0.043914794921875,
-0.009307861328125,
0.060394287109375,
0.005680084228515625,
0.04052734375,
-0.01568603515625,
-0.024688720703125,
-0.00696563720703125,
-0.041900634765625,
0.0263824462890625,
0.04693603515625,
0.015045166015625,
-0.038787841796875,
0.051910400390625,
-0.01104736328125,
0.037994384765625,
0.0238037109375,
-0.0310516357421875,
0.0187530517578125,
-0.0251007080078125,
-0.0269927978515625,
-0.0145416259765625,
0.0733642578125,
0.04681396484375,
0.01261138916015625,
0.01451873779296875,
0.0029964447021484375,
0.0277557373046875,
0.0172576904296875,
-0.05926513671875,
-0.01593017578125,
0.0298919677734375,
-0.0418701171875,
-0.038970947265625,
-0.014495849609375,
-0.03924560546875,
-0.0134124755859375,
0.009613037109375,
0.029632568359375,
-0.0419921875,
-0.0260162353515625,
0.0160675048828125,
0.00508880615234375,
0.0279083251953125,
0.0262451171875,
-0.0848388671875,
0.02484130859375,
0.031585693359375,
0.050750732421875,
0.00197601318359375,
-0.0186614990234375,
-0.0157012939453125,
-0.01515960693359375,
-0.03216552734375,
0.0303802490234375,
-0.0253448486328125,
-0.03643798828125,
-0.01493072509765625,
0.0150146484375,
-0.0021820068359375,
-0.0418701171875,
0.047027587890625,
-0.03411865234375,
0.04180908203125,
-0.031982421875,
-0.033203125,
-0.0290374755859375,
0.0034198760986328125,
-0.049560546875,
0.094970703125,
0.014312744140625,
-0.0660400390625,
0.00994110107421875,
-0.047119140625,
-0.024627685546875,
-0.0089569091796875,
-0.0026702880859375,
-0.039276123046875,
-0.017578125,
0.0180206298828125,
0.0281524658203125,
-0.007038116455078125,
0.0000883340835571289,
-0.028106689453125,
-0.02197265625,
0.0184478759765625,
-0.004177093505859375,
0.0711669921875,
0.0181732177734375,
-0.035888671875,
-0.0012388229370117188,
-0.0518798828125,
-0.00958251953125,
0.0404052734375,
-0.0245819091796875,
-0.0259552001953125,
-0.0009918212890625,
0.001522064208984375,
0.0271148681640625,
0.0345458984375,
-0.0374755859375,
0.0272674560546875,
-0.045989990234375,
0.017669677734375,
0.04071044921875,
-0.00806427001953125,
0.04522705078125,
-0.044830322265625,
0.031463623046875,
0.021697998046875,
0.0210113525390625,
-0.00551605224609375,
-0.0499267578125,
-0.06927490234375,
-0.0066070556640625,
-0.0010099411010742188,
0.033721923828125,
-0.034820556640625,
0.04376220703125,
0.008575439453125,
-0.058685302734375,
-0.05291748046875,
0.01885986328125,
0.04522705078125,
0.020843505859375,
0.0218505859375,
-0.041534423828125,
-0.039398193359375,
-0.0814208984375,
0.0100250244140625,
-0.029083251953125,
0.01514434814453125,
0.037872314453125,
0.03564453125,
-0.027557373046875,
0.048675537109375,
-0.030120849609375,
-0.01293182373046875,
-0.017120361328125,
-0.01415252685546875,
0.02923583984375,
0.04229736328125,
0.0650634765625,
-0.06719970703125,
-0.031890869140625,
0.0227508544921875,
-0.058929443359375,
0.0023670196533203125,
-0.000637054443359375,
-0.031982421875,
0.007007598876953125,
0.005855560302734375,
-0.05682373046875,
0.03204345703125,
0.050567626953125,
-0.046722412109375,
0.045501708984375,
-0.0115814208984375,
0.006870269775390625,
-0.07550048828125,
0.0128326416015625,
-0.0076751708984375,
-0.0023097991943359375,
-0.059844970703125,
0.01751708984375,
0.0008349418640136719,
0.0164031982421875,
-0.0416259765625,
0.061553955078125,
-0.039764404296875,
-0.01509857177734375,
-0.0247802734375,
0.0019931793212890625,
0.002185821533203125,
0.04217529296875,
-0.0108489990234375,
0.04541015625,
0.041290283203125,
-0.040740966796875,
0.036865234375,
0.029266357421875,
-0.0171661376953125,
0.023284912109375,
-0.037841796875,
0.0142059326171875,
-0.0004439353942871094,
0.008331298828125,
-0.0765380859375,
-0.0271453857421875,
0.032989501953125,
-0.04217529296875,
0.0038928985595703125,
-0.01519012451171875,
-0.0211944580078125,
-0.0323486328125,
-0.02301025390625,
0.0287628173828125,
0.037353515625,
-0.0426025390625,
0.031158447265625,
0.030548095703125,
0.00597381591796875,
-0.061981201171875,
-0.05340576171875,
-0.00933837890625,
-0.0187530517578125,
-0.06915283203125,
0.0178680419921875,
-0.021240234375,
-0.01349639892578125,
0.0028133392333984375,
-0.007152557373046875,
-0.012359619140625,
0.0234832763671875,
0.03271484375,
0.026763916015625,
-0.0173492431640625,
-0.01320648193359375,
0.00960540771484375,
0.02294921875,
-0.0114288330078125,
0.00862884521484375,
0.0728759765625,
-0.0203399658203125,
-0.034271240234375,
-0.03997802734375,
0.019500732421875,
0.05755615234375,
-0.01959228515625,
0.0711669921875,
0.07220458984375,
-0.03155517578125,
0.003570556640625,
-0.0531005859375,
-0.0008149147033691406,
-0.0384521484375,
0.0150604248046875,
-0.0218505859375,
-0.07159423828125,
0.05694580078125,
0.0285186767578125,
0.00586700439453125,
0.03955078125,
0.0455322265625,
-0.0218658447265625,
0.081787109375,
0.0533447265625,
-0.017669677734375,
0.040740966796875,
-0.0400390625,
0.00632476806640625,
-0.06671142578125,
-0.0220489501953125,
-0.01430511474609375,
-0.03179931640625,
-0.055816650390625,
-0.0217132568359375,
0.00804901123046875,
0.029541015625,
-0.0169677734375,
0.037994384765625,
-0.03955078125,
0.0184783935546875,
0.0421142578125,
0.017730712890625,
0.0189208984375,
0.00302886962890625,
-0.01421356201171875,
-0.008514404296875,
-0.0462646484375,
-0.040283203125,
0.0736083984375,
0.031036376953125,
0.0479736328125,
0.0313720703125,
0.06268310546875,
0.006969451904296875,
0.0115814208984375,
-0.0477294921875,
0.050048828125,
-0.01751708984375,
-0.058013916015625,
-0.02044677734375,
-0.0236358642578125,
-0.08587646484375,
0.031463623046875,
-0.00933837890625,
-0.0709228515625,
0.023468017578125,
-0.0014905929565429688,
-0.018402099609375,
0.01511383056640625,
-0.04400634765625,
0.05645751953125,
-0.0252532958984375,
-0.02587890625,
-0.013458251953125,
-0.042877197265625,
0.06298828125,
0.006839752197265625,
0.0200958251953125,
-0.02294921875,
0.004489898681640625,
0.0565185546875,
-0.0281829833984375,
0.062469482421875,
0.00008440017700195312,
-0.02044677734375,
0.020416259765625,
0.0186004638671875,
0.0193939208984375,
-0.0033397674560546875,
0.00672149658203125,
-0.004058837890625,
0.006488800048828125,
-0.0230255126953125,
-0.0288238525390625,
0.062286376953125,
-0.07598876953125,
-0.04351806640625,
-0.0279541015625,
-0.0258331298828125,
0.0195465087890625,
0.026763916015625,
0.01284027099609375,
0.017364501953125,
-0.0198974609375,
0.015045166015625,
0.0261383056640625,
-0.0253753662109375,
0.0261077880859375,
0.031280517578125,
-0.060760498046875,
-0.031463623046875,
0.050079345703125,
-0.003021240234375,
0.022918701171875,
0.0170135498046875,
0.01358795166015625,
-0.0216064453125,
-0.009552001953125,
-0.02655029296875,
0.053192138671875,
-0.04205322265625,
-0.0007696151733398438,
-0.057830810546875,
-0.0289459228515625,
-0.03955078125,
-0.015594482421875,
-0.0298614501953125,
-0.0239715576171875,
-0.0350341796875,
-0.0078277587890625,
0.03436279296875,
0.051300048828125,
-0.0034084320068359375,
0.03057861328125,
-0.055450439453125,
0.024993896484375,
0.032318115234375,
0.00914764404296875,
0.00505828857421875,
-0.037261962890625,
-0.0116119384765625,
0.01885986328125,
-0.0250396728515625,
-0.0731201171875,
0.052764892578125,
0.0026836395263671875,
0.03497314453125,
0.044708251953125,
-0.0195465087890625,
0.0550537109375,
-0.00994110107421875,
0.06805419921875,
0.01262664794921875,
-0.06292724609375,
0.041473388671875,
-0.036224365234375,
0.0117034912109375,
0.0230865478515625,
0.03472900390625,
-0.0625,
-0.0203399658203125,
-0.06622314453125,
-0.0677490234375,
0.06378173828125,
0.04595947265625,
0.0106353759765625,
0.0009064674377441406,
0.0248260498046875,
-0.0139617919921875,
0.0226287841796875,
-0.07647705078125,
-0.03955078125,
-0.007015228271484375,
-0.0001728534698486328,
0.0008487701416015625,
-0.0296783447265625,
-0.0135498046875,
-0.024658203125,
0.0482177734375,
0.0079345703125,
0.035736083984375,
0.011688232421875,
-0.003704071044921875,
-0.026214599609375,
0.00955963134765625,
0.0584716796875,
0.034027099609375,
-0.02740478515625,
0.00203704833984375,
0.009185791015625,
-0.0386962890625,
0.0097503662109375,
-0.0164337158203125,
0.0034465789794921875,
0.0007658004760742188,
0.03265380859375,
0.06964111328125,
0.017669677734375,
-0.038970947265625,
0.04522705078125,
-0.0025730133056640625,
-0.00677490234375,
-0.0215911865234375,
0.011993408203125,
0.043304443359375,
0.01091766357421875,
0.0160980224609375,
-0.00901031494140625,
-0.002025604248046875,
-0.0628662109375,
0.004909515380859375,
0.031890869140625,
-0.0083770751953125,
-0.03179931640625,
0.0496826171875,
-0.0027904510498046875,
-0.02178955078125,
0.0399169921875,
-0.0230255126953125,
-0.0418701171875,
0.06109619140625,
0.044158935546875,
0.056640625,
-0.0310516357421875,
0.015899658203125,
0.036407470703125,
0.024505615234375,
-0.0009493827819824219,
0.03167724609375,
-0.0002079010009765625,
-0.06927490234375,
-0.0079498291015625,
-0.061248779296875,
-0.0273284912109375,
0.028411865234375,
-0.04888916015625,
0.02667236328125,
-0.034423828125,
-0.0193328857421875,
-0.0009007453918457031,
0.00827789306640625,
-0.05322265625,
0.013427734375,
0.0161285400390625,
0.05291748046875,
-0.063720703125,
0.06292724609375,
0.0516357421875,
-0.040283203125,
-0.072998046875,
-0.033905029296875,
0.008148193359375,
-0.0670166015625,
0.047607421875,
0.0204315185546875,
-0.006145477294921875,
-0.0059661865234375,
-0.055755615234375,
-0.07757568359375,
0.0911865234375,
0.007419586181640625,
-0.038787841796875,
-0.02423095703125,
0.002674102783203125,
0.05279541015625,
-0.041107177734375,
0.042266845703125,
0.033599853515625,
0.0335693359375,
-0.000010013580322265625,
-0.083251953125,
0.00457763671875,
-0.03997802734375,
0.0042266845703125,
0.0020465850830078125,
-0.06842041015625,
0.07958984375,
-0.0133514404296875,
-0.0255584716796875,
0.016204833984375,
0.037445068359375,
0.0335693359375,
0.021331787109375,
0.04168701171875,
0.05035400390625,
0.037353515625,
0.00469207763671875,
0.0640869140625,
-0.0220947265625,
0.036041259765625,
0.071044921875,
-0.0185699462890625,
0.06890869140625,
0.023681640625,
-0.022430419921875,
0.0419921875,
0.04742431640625,
-0.004627227783203125,
0.01812744140625,
-0.00153350830078125,
-0.007709503173828125,
-0.0015115737915039062,
-0.0101470947265625,
-0.031036376953125,
0.04656982421875,
0.029571533203125,
-0.037445068359375,
-0.00592041015625,
-0.004764556884765625,
0.0234832763671875,
-0.007534027099609375,
0.0032196044921875,
0.05181884765625,
0.009002685546875,
-0.03509521484375,
0.053466796875,
0.002956390380859375,
0.06634521484375,
-0.0474853515625,
0.0029277801513671875,
-0.0272979736328125,
0.0010223388671875,
-0.034912109375,
-0.057342529296875,
0.00624847412109375,
-0.0059356689453125,
-0.007625579833984375,
0.0004978179931640625,
0.05816650390625,
-0.049468994140625,
-0.031402587890625,
0.055633544921875,
0.046234130859375,
0.0233612060546875,
0.00962066650390625,
-0.05908203125,
0.004077911376953125,
0.0213623046875,
-0.0188751220703125,
0.029205322265625,
0.0157470703125,
-0.00400543212890625,
0.07073974609375,
0.08294677734375,
0.01739501953125,
0.01425933837890625,
-0.0034313201904296875,
0.07281494140625,
-0.052276611328125,
-0.032989501953125,
-0.056121826171875,
0.037078857421875,
0.00787353515625,
-0.022552490234375,
0.062103271484375,
0.039215087890625,
0.060394287109375,
-0.00543975830078125,
0.032989501953125,
-0.0178680419921875,
0.032867431640625,
-0.0279998779296875,
0.0654296875,
-0.061676025390625,
0.016632080078125,
-0.01959228515625,
-0.06829833984375,
-0.001667022705078125,
0.058868408203125,
-0.017120361328125,
-0.0013055801391601562,
0.03680419921875,
0.07183837890625,
-0.01256561279296875,
-0.0213775634765625,
0.037841796875,
0.0169525146484375,
0.0234832763671875,
0.044464111328125,
0.0767822265625,
-0.043609619140625,
0.034149169921875,
-0.053924560546875,
0.0003018379211425781,
-0.033172607421875,
-0.037200927734375,
-0.065185546875,
-0.049224853515625,
-0.0101470947265625,
-0.0284271240234375,
-0.00858306884765625,
0.0750732421875,
0.0479736328125,
-0.046142578125,
-0.043212890625,
0.0186767578125,
0.0035305023193359375,
-0.01035308837890625,
-0.023223876953125,
0.021514892578125,
0.00380706787109375,
-0.048370361328125,
0.0384521484375,
0.0182037353515625,
0.010467529296875,
-0.0269927978515625,
-0.00423431396484375,
-0.041290283203125,
0.0025348663330078125,
0.0389404296875,
0.0263519287109375,
-0.0628662109375,
-0.0252685546875,
0.00850677490234375,
-0.0306549072265625,
0.0120391845703125,
0.0224761962890625,
-0.03265380859375,
0.02374267578125,
0.032501220703125,
0.0112762451171875,
0.055908203125,
0.0030765533447265625,
0.027679443359375,
-0.03887939453125,
0.014007568359375,
0.009765625,
0.0313720703125,
0.0246124267578125,
-0.04541015625,
0.040283203125,
0.01544189453125,
-0.050689697265625,
-0.07025146484375,
0.0167999267578125,
-0.09027099609375,
0.00420379638671875,
0.0836181640625,
-0.00826263427734375,
-0.027923583984375,
0.007808685302734375,
-0.036895751953125,
0.0184783935546875,
-0.037689208984375,
0.043304443359375,
0.05718994140625,
-0.0112762451171875,
-0.01548004150390625,
-0.0308837890625,
0.032806396484375,
0.0323486328125,
-0.0487060546875,
-0.01226043701171875,
0.0214996337890625,
0.03497314453125,
0.02667236328125,
0.06427001953125,
-0.009429931640625,
0.0294647216796875,
-0.0168914794921875,
0.0185394287109375,
0.0096588134765625,
-0.008575439453125,
-0.0278472900390625,
-0.00553131103515625,
-0.0002359151840209961,
-0.01183319091796875
]
] |
TheBloke/Mistral-7B-OpenOrca-AWQ | 2023-10-08T15:53:43.000Z | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"en",
"dataset:Open-Orca/OpenOrca",
"arxiv:2306.02707",
"arxiv:2301.13688",
"license:apache-2.0",
"text-generation-inference",
"region:us"
] | text-generation | TheBloke | null | null | TheBloke/Mistral-7B-OpenOrca-AWQ | 27 | 7,163 | transformers | 2023-10-02T14:27:49 | ---
base_model: Open-Orca/Mistral-7B-OpenOrca
datasets:
- Open-Orca/OpenOrca
inference: false
language:
- en
library_name: transformers
license: apache-2.0
model_creator: OpenOrca
model_name: Mistral 7B OpenOrca
model_type: mistral
pipeline_tag: text-generation
prompt_template: '<|im_start|>system
{system_message}<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant
'
quantized_by: TheBloke
---
<!-- header start -->
<!-- 200823 -->
<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://i.imgur.com/EBdldam.jpg" alt="TheBlokeAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<div style="display: flex; justify-content: space-between; width: 100%;">
<div style="display: flex; flex-direction: column; align-items: flex-start;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://discord.gg/theblokeai">Chat & support: TheBloke's Discord server</a></p>
</div>
<div style="display: flex; flex-direction: column; align-items: flex-end;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://www.patreon.com/TheBlokeAI">Want to contribute? TheBloke's Patreon page</a></p>
</div>
</div>
<div style="text-align:center; margin-top: 0em; margin-bottom: 0em"><p style="margin-top: 0.25em; margin-bottom: 0em;">TheBloke's LLM work is generously supported by a grant from <a href="https://a16z.com">andreessen horowitz (a16z)</a></p></div>
<hr style="margin-top: 1.0em; margin-bottom: 1.0em;">
<!-- header end -->
# Mistral 7B OpenOrca - AWQ
- Model creator: [OpenOrca](https://huggingface.co/Open-Orca)
- Original model: [Mistral 7B OpenOrca](https://huggingface.co/Open-Orca/Mistral-7B-OpenOrca)
<!-- description start -->
## Description
This repo contains AWQ model files for [OpenOrca's Mistral 7B OpenOrca](https://huggingface.co/Open-Orca/Mistral-7B-OpenOrca).
### About AWQ
AWQ is an efficient, accurate and blazing-fast low-bit weight quantization method, currently supporting 4-bit quantization. Compared to GPTQ, it offers faster Transformers-based inference.
It is also now supported by continuous batching server [vLLM](https://github.com/vllm-project/vllm), allowing use of Llama AWQ models for high-throughput concurrent inference in multi-user server scenarios.
As of September 25th 2023, preliminary Llama-only AWQ support has also been added to [Huggingface Text Generation Inference (TGI)](https://github.com/huggingface/text-generation-inference).
Note that, at the time of writing, overall throughput is still lower than running vLLM or TGI with unquantised models, however using AWQ enables using much smaller GPUs which can lead to easier deployment and overall cost savings. For example, a 70B model can be run on 1 x 48GB GPU instead of 2 x 80GB.
<!-- description end -->
<!-- repositories-available start -->
## Repositories available
* [AWQ model(s) for GPU inference.](https://huggingface.co/TheBloke/Mistral-7B-OpenOrca-AWQ)
* [GPTQ models for GPU inference, with multiple quantisation parameter options.](https://huggingface.co/TheBloke/Mistral-7B-OpenOrca-GPTQ)
* [2, 3, 4, 5, 6 and 8-bit GGUF models for CPU+GPU inference](https://huggingface.co/TheBloke/Mistral-7B-OpenOrca-GGUF)
* [OpenOrca's original unquantised fp16 model in pytorch format, for GPU inference and for further conversions](https://huggingface.co/Open-Orca/Mistral-7B-OpenOrca)
<!-- repositories-available end -->
<!-- prompt-template start -->
## Prompt template: ChatML
```
<|im_start|>system
{system_message}<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant
```
<!-- prompt-template end -->
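For reference, a small Python helper (a hypothetical function name, not part of any library) that builds this ChatML template:

```python
def chatml_prompt(system_message: str, prompt: str) -> str:
    """Build a ChatML-formatted prompt string matching the template above."""
    return (
        f"<|im_start|>system\n{system_message}<|im_end|>\n"
        f"<|im_start|>user\n{prompt}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

print(chatml_prompt("You are a helpful assistant.", "Tell me about AI"))
```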
<!-- README_AWQ.md-provided-files start -->
## Provided files, and AWQ parameters
For my first release of AWQ models, I am releasing 128g models only. I will consider adding 32g as well if there is interest, and once I have done perplexity and evaluation comparisons, but at this time 32g models are still not fully tested with AutoAWQ and vLLM.
Models are released as sharded safetensors files.
| Branch | Bits | GS | AWQ Dataset | Seq Len | Size |
| ------ | ---- | -- | ----------- | ------- | ---- |
| [main](https://huggingface.co/TheBloke/Mistral-7B-OpenOrca-AWQ/tree/main) | 4 | 128 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 32768 | 4.15 GB |
<!-- README_AWQ.md-provided-files end -->
<!-- README_AWQ.md-use-from-vllm start -->
## Serving this model from vLLM
Documentation on installing and using vLLM [can be found here](https://vllm.readthedocs.io/en/latest/).
Note: at the time of writing, vLLM has not yet done a new release with AWQ support.
If you try the vLLM examples below and get an error about `quantization` being unrecognised, or other AWQ-related issues, please install vLLM from Github source.
When using vLLM as a server, pass the `--quantization awq` parameter, for example:
```shell
python3 -m vllm.entrypoints.api_server --model TheBloke/Mistral-7B-OpenOrca-AWQ --quantization awq --dtype half
```
When using vLLM from Python code, pass the `quantization=awq` parameter, for example:
```python
from vllm import LLM, SamplingParams
prompts = [
"Hello, my name is",
"The president of the United States is",
"The capital of France is",
"The future of AI is",
]
sampling_params = SamplingParams(temperature=0.8, top_p=0.95)
llm = LLM(model="TheBloke/Mistral-7B-OpenOrca-AWQ", quantization="awq", dtype="half")
outputs = llm.generate(prompts, sampling_params)
# Print the outputs.
for output in outputs:
prompt = output.prompt
generated_text = output.outputs[0].text
print(f"Prompt: {prompt!r}, Generated text: {generated_text!r}")
```
<!-- README_AWQ.md-use-from-vllm start -->
<!-- README_AWQ.md-use-from-tgi start -->
## Serving this model from Text Generation Inference (TGI)
Use TGI version 1.1.0 or later. The official Docker container is: `ghcr.io/huggingface/text-generation-inference:1.1.0`
Example Docker parameters:
```shell
--model-id TheBloke/Mistral-7B-OpenOrca-AWQ --port 3000 --quantize awq --max-input-length 3696 --max-total-tokens 4096 --max-batch-prefill-tokens 4096
```
Example Python code for interfacing with TGI (requires huggingface-hub 0.17.0 or later):
```shell
pip3 install huggingface-hub
```
```python
from huggingface_hub import InferenceClient
endpoint_url = "https://your-endpoint-url-here"
prompt = "Tell me about AI"
system_message = "You are a helpful assistant."  # example system prompt for the template below
prompt_template=f'''<|im_start|>system
{system_message}<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant
'''
client = InferenceClient(endpoint_url)
response = client.text_generation(prompt,
max_new_tokens=128,
do_sample=True,
temperature=0.7,
top_p=0.95,
top_k=40,
repetition_penalty=1.1)
print(f"Model output: {response}")
```
<!-- README_AWQ.md-use-from-tgi end -->
<!-- README_AWQ.md-use-from-python start -->
## How to use this AWQ model from Python code
### Install the necessary packages
Requires: [AutoAWQ](https://github.com/casper-hansen/AutoAWQ) 0.1.1 or later
```shell
pip3 install autoawq
```
If you have problems installing [AutoAWQ](https://github.com/casper-hansen/AutoAWQ) using the pre-built wheels, install it from source instead:
```shell
pip3 uninstall -y autoawq
git clone https://github.com/casper-hansen/AutoAWQ
cd AutoAWQ
pip3 install .
```
### You can then try the following example code
```python
from awq import AutoAWQForCausalLM
from transformers import AutoTokenizer
model_name_or_path = "TheBloke/Mistral-7B-OpenOrca-AWQ"
# Load model
model = AutoAWQForCausalLM.from_quantized(model_name_or_path, fuse_layers=True,
trust_remote_code=False, safetensors=True)
tokenizer = AutoTokenizer.from_pretrained(model_name_or_path, trust_remote_code=False)
prompt = "Tell me about AI"
system_message = "You are a helpful assistant."  # example system prompt for the template below
prompt_template=f'''<|im_start|>system
{system_message}<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant
'''
print("\n\n*** Generate:")
tokens = tokenizer(
prompt_template,
return_tensors='pt'
).input_ids.cuda()
# Generate output
generation_output = model.generate(
tokens,
do_sample=True,
temperature=0.7,
top_p=0.95,
top_k=40,
max_new_tokens=512
)
print("Output: ", tokenizer.decode(generation_output[0]))
"""
# Inference should be possible with transformers pipeline as well in future
# But currently this is not yet supported by AutoAWQ (correct as of September 25th 2023)
from transformers import pipeline
print("*** Pipeline:")
pipe = pipeline(
"text-generation",
model=model,
tokenizer=tokenizer,
max_new_tokens=512,
do_sample=True,
temperature=0.7,
top_p=0.95,
top_k=40,
repetition_penalty=1.1
)
print(pipe(prompt_template)[0]['generated_text'])
"""
```
<!-- README_AWQ.md-use-from-python end -->
<!-- README_AWQ.md-compatibility start -->
## Compatibility
The files provided are tested to work with:
- [AutoAWQ](https://github.com/casper-hansen/AutoAWQ)
- [vLLM](https://github.com/vllm-project/vllm)
- [Huggingface Text Generation Inference (TGI)](https://github.com/huggingface/text-generation-inference)
TGI merged AWQ support on September 25th, 2023: [TGI PR #1054](https://github.com/huggingface/text-generation-inference/pull/1054). Use the `:latest` Docker container until the next TGI release is made.
<!-- README_AWQ.md-compatibility end -->
<!-- footer start -->
<!-- 200823 -->
## Discord
For further support, and discussions on these models and AI in general, join us at:
[TheBloke AI's Discord server](https://discord.gg/theblokeai)
## Thanks, and how to contribute
Thanks to the [chirper.ai](https://chirper.ai) team!
Thanks to Clay from [gpus.llm-utils.org](https://gpus.llm-utils.org)!
I've had a lot of people ask if they can contribute. I enjoy providing models and helping people, and would love to be able to spend even more time doing it, as well as expanding into new projects like fine tuning/training.
If you're able and willing to contribute it will be most gratefully received and will help me to keep providing more models, and to start work on new AI projects.
Donaters will get priority support on any and all AI/LLM/model questions and requests, access to a private Discord room, plus other benefits.
* Patreon: https://patreon.com/TheBlokeAI
* Ko-Fi: https://ko-fi.com/TheBlokeAI
**Special thanks to**: Aemon Algiz.
**Patreon special mentions**: Pierre Kircher, Stanislav Ovsiannikov, Michael Levine, Eugene Pentland, Andrey, 준교 김, Randy H, Fred von Graf, Artur Olbinski, Caitlyn Gatomon, terasurfer, Jeff Scroggin, James Bentley, Vadim, Gabriel Puliatti, Harry Royden McLaughlin, Sean Connelly, Dan Guido, Edmond Seymore, Alicia Loh, subjectnull, AzureBlack, Manuel Alberto Morcote, Thomas Belote, Lone Striker, Chris Smitley, Vitor Caleffi, Johann-Peter Hartmann, Clay Pascal, biorpg, Brandon Frisco, sidney chen, transmissions 11, Pedro Madruga, jinyuan sun, Ajan Kanaga, Emad Mostaque, Trenton Dambrowitz, Jonathan Leane, Iucharbius, usrbinkat, vamX, George Stoitzev, Luke Pendergrass, theTransient, Olakabola, Swaroop Kallakuri, Cap'n Zoog, Brandon Phillips, Michael Dempsey, Nikolai Manek, danny, Matthew Berman, Gabriel Tamborski, alfie_i, Raymond Fosdick, Tom X Nguyen, Raven Klaugh, LangChain4j, Magnesian, Illia Dulskyi, David Ziegler, Mano Prime, Luis Javier Navarrete Lozano, Erik Bjäreholt, 阿明, Nathan Dryer, Alex, Rainer Wilmers, zynix, TL, Joseph William Delisle, John Villwock, Nathan LeClaire, Willem Michiel, Joguhyik, GodLy, OG, Alps Aficionado, Jeffrey Morgan, ReadyPlayerEmma, Tiffany J. Kim, Sebastain Graf, Spencer Kim, Michael Davis, webtim, Talal Aujan, knownsqashed, John Detwiler, Imad Khwaja, Deo Leter, Jerry Meng, Elijah Stavena, Rooh Singh, Pieter, SuperWojo, Alexandros Triantafyllidis, Stephen Murray, Ai Maven, ya boyyy, Enrico Ros, Ken Nordquist, Deep Realms, Nicholas, Spiking Neurons AB, Elle, Will Dee, Jack West, RoA, Luke @flexchar, Viktor Bowallius, Derek Yates, Subspace Studios, jjj, Toran Billups, Asp the Wyvern, Fen Risland, Ilya, NimbleBox.ai, Chadd, Nitin Borwankar, Emre, Mandus, Leonard Tan, Kalila, K, Trailburnt, S_X, Cory Kujawski
Thank you to all my generous patrons and donaters!
And thank you again to a16z for their generous grant.
<!-- footer end -->
# Original model card: OpenOrca's Mistral 7B OpenOrca
<p><h1>🐋 TBD 🐋</h1></p>

[<img src="https://raw.githubusercontent.com/OpenAccess-AI-Collective/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/OpenAccess-AI-Collective/axolotl)
# OpenOrca - Mistral - 7B - 8k
We have used our own [OpenOrca dataset](https://huggingface.co/datasets/Open-Orca/OpenOrca) to fine-tune on top of [Mistral 7B](https://huggingface.co/mistralai/Mistral-7B-v0.1).
This dataset is our attempt to reproduce the dataset generated for Microsoft Research's [Orca Paper](https://arxiv.org/abs/2306.02707).
We use [OpenChat](https://huggingface.co/openchat) packing, trained with [Axolotl](https://github.com/OpenAccess-AI-Collective/axolotl).
This release is trained on a curated filtered subset of most of our GPT-4 augmented data.
It is the same subset of our data as was used in our [OpenOrcaxOpenChat-Preview2-13B model](https://huggingface.co/Open-Orca/OpenOrcaxOpenChat-Preview2-13B).
HF Leaderboard evals place this model as #2 for all models smaller than 30B at release time, outperforming all but one 13B model.
TBD
Want to visualize our full (pre-filtering) dataset? Check out our [Nomic Atlas Map](https://atlas.nomic.ai/map/c1b88b47-2d9b-47e0-9002-b80766792582/2560fd25-52fe-42f1-a58f-ff5eccc890d2).
[<img src="https://huggingface.co/Open-Orca/OpenOrca-Preview1-13B/resolve/main/OpenOrca%20Nomic%20Atlas.png" alt="Atlas Nomic Dataset Map" width="400" height="400" />](https://atlas.nomic.ai/map/c1b88b47-2d9b-47e0-9002-b80766792582/2560fd25-52fe-42f1-a58f-ff5eccc890d2)
We are in the process of training more models, so keep a lookout on our org for releases coming soon with exciting partners.
We will also give sneak-peek announcements on our Discord, which you can find here:
https://AlignmentLab.ai
or on the OpenAccess AI Collective Discord, where you can find more information about the Axolotl trainer:
https://discord.gg/5y8STgB3P3
# Prompt Template
We used [OpenAI's Chat Markup Language (ChatML)](https://github.com/openai/openai-python/blob/main/chatml.md) format, with `<|im_start|>` and `<|im_end|>` tokens added to support this.
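The layout above can be sketched in a few lines. This is an illustrative helper, not code from the model release; the system/user text is placeholder content.

```python
# Illustrative sketch of the ChatML prompt layout described above.
# The system and user strings below are placeholders, not from the model card.
def build_chatml_prompt(system: str, user: str) -> str:
    """Wrap a system message and one user turn in ChatML delimiters,
    leaving the prompt open at the assistant turn for generation."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

prompt = build_chatml_prompt(
    "You are a helpful assistant.",
    "What is the capital of France?",
)
print(prompt)
```

The trailing, unclosed `<|im_start|>assistant` turn is what cues the model to generate its reply.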
## Example Prompt Exchange
TBD
# Evaluation
We have evaluated using the methodology and tools for the HuggingFace Leaderboard, and find that we have significantly improved upon the base model.
TBD
## HuggingFaceH4 Open LLM Leaderboard Performance
TBD
## GPT4ALL Leaderboard Performance
TBD
# Dataset
We used a curated, filtered selection of most of the GPT-4 augmented data from our OpenOrca dataset, which aims to reproduce the Orca Research Paper dataset.
# Training
We trained with 8x A6000 GPUs for 62 hours, completing 4 epochs of full fine-tuning on our dataset in one training run.
Commodity cost was ~$400.
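As a back-of-the-envelope sanity check on that figure (our arithmetic, not from the card):

```python
# Rough check of the quoted training cost, assuming commodity GPU rental pricing.
gpus = 8
hours = 62
total_cost_usd = 400  # approximate figure quoted above

gpu_hours = gpus * hours                      # 496 GPU-hours
cost_per_gpu_hour = total_cost_usd / gpu_hours
print(f"{gpu_hours} GPU-hours, ~${cost_per_gpu_hour:.2f}/GPU-hour")
```

That works out to roughly $0.81 per A6000-hour, which is in line with commodity rental rates.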
# Citation
```bibtex
@misc{mukherjee2023orca,
title={Orca: Progressive Learning from Complex Explanation Traces of GPT-4},
author={Subhabrata Mukherjee and Arindam Mitra and Ganesh Jawahar and Sahaj Agarwal and Hamid Palangi and Ahmed Awadallah},
year={2023},
eprint={2306.02707},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
@misc{longpre2023flan,
title={The Flan Collection: Designing Data and Methods for Effective Instruction Tuning},
author={Shayne Longpre and Le Hou and Tu Vu and Albert Webson and Hyung Won Chung and Yi Tay and Denny Zhou and Quoc V. Le and Barret Zoph and Jason Wei and Adam Roberts},
year={2023},
eprint={2301.13688},
archivePrefix={arXiv},
primaryClass={cs.AI}
}
```
<!-- dataset row: cardiffnlp/xlm-roberta-base-tweet-sentiment-pt · text-classification · author: cardiffnlp · library: transformers · created 2023-03-06, last modified 2023-03-22 -->

# `cardiffnlp/xlm-roberta-base-tweet-sentiment-pt`
This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on the
[cardiffnlp/tweet_sentiment_multilingual](https://huggingface.co/datasets/cardiffnlp/tweet_sentiment_multilingual) dataset (Portuguese).
The following metrics are computed on the `test` split of
[cardiffnlp/tweet_sentiment_multilingual](https://huggingface.co/datasets/cardiffnlp/tweet_sentiment_multilingual) (Portuguese).
| | eval_f1_micro | eval_recall_micro | eval_precision_micro | eval_f1_macro | eval_recall_macro | eval_precision_macro | eval_accuracy |
|---:|----------------:|--------------------:|-----------------------:|----------------:|--------------------:|-----------------------:|----------------:|
| 0 | 70.69 | 70.69 | 70.69 | 70.73 | 70.69 | 70.78 | 70.69 |
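The micro-averaged columns all equal the accuracy column because, for single-label multiclass classification, micro-F1 reduces to accuracy. A minimal pure-Python check on toy labels (not the actual eval data):

```python
# For single-label multiclass predictions, every misclassification is counted
# once as a false positive (for the predicted class) and once as a false
# negative (for the true class), so micro precision = micro recall = accuracy.
def micro_f1(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == p)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != p)
    fn = fp  # same mistakes, counted from the true-class side
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

y_true = [0, 1, 2, 2, 1, 0, 1]
y_pred = [0, 2, 2, 2, 1, 0, 0]
accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
assert abs(micro_f1(y_true, y_pred) - accuracy) < 1e-12
print(accuracy)
```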
Check the result file [here](https://huggingface.co/cardiffnlp/xlm-roberta-base-tweet-sentiment-pt/raw/main/eval.json).
0.006084442138671875,
-0.0254058837890625,
0.0237579345703125,
0.01450347900390625,
-0.0265655517578125,
-0.0023288726806640625,
0.049835205078125,
-0.006099700927734375,
0.031646728515625,
0.0002143383026123047,
0.0251007080078125,
-0.060760498046875,
0.0199432373046875,
0.0249176025390625,
0.042694091796875,
0.028564453125,
-0.02032470703125,
0.0455322265625,
0.0191192626953125,
-0.0237579345703125,
-0.06134033203125,
-0.0101318359375,
-0.1124267578125,
-0.0037670135498046875,
0.0858154296875,
-0.0101318359375,
-0.0294952392578125,
0.034759521484375,
-0.0062103271484375,
0.02947998046875,
-0.056884765625,
0.061981201171875,
0.050323486328125,
-0.00032711029052734375,
-0.01129913330078125,
-0.0232696533203125,
0.026214599609375,
0.0220947265625,
-0.03814697265625,
-0.0278167724609375,
0.01800537109375,
0.0455322265625,
0.0123748779296875,
0.050628662109375,
-0.0203704833984375,
0.0093994140625,
-0.0242462158203125,
0.005458831787109375,
0.0108642578125,
-0.005229949951171875,
-0.0099639892578125,
0.0193634033203125,
-0.009002685546875,
-0.015777587890625
]
] |
keremberke/yolov5n-license-plate | 2023-01-01T09:59:54.000Z | [
"yolov5",
"tensorboard",
"yolo",
"vision",
"object-detection",
"pytorch",
"dataset:keremberke/license-plate-object-detection",
"model-index",
"has_space",
"region:us"
] | object-detection | keremberke | null | null | keremberke/yolov5n-license-plate | 12 | 7,156 | yolov5 | 2023-01-01T03:02:44 |
---
tags:
- yolov5
- yolo
- vision
- object-detection
- pytorch
library_name: yolov5
library_version: 7.0.6
inference: false
datasets:
- keremberke/license-plate-object-detection
model-index:
- name: keremberke/yolov5n-license-plate
results:
- task:
type: object-detection
dataset:
type: keremberke/license-plate-object-detection
name: keremberke/license-plate-object-detection
split: validation
metrics:
- type: precision # since mAP@0.5 is not available on hf.co/metrics
value: 0.9783431294995892 # min: 0.0 - max: 1.0
name: mAP@0.5
---
<div align="center">
<img width="640" alt="keremberke/yolov5n-license-plate" src="https://huggingface.co/keremberke/yolov5n-license-plate/resolve/main/sample_visuals.jpg">
</div>
### How to use
- Install [yolov5](https://github.com/fcakyon/yolov5-pip):
```bash
pip install -U yolov5
```
- Load model and perform prediction:
```python
import yolov5
# load model
model = yolov5.load('keremberke/yolov5n-license-plate')
# set model parameters
model.conf = 0.25 # NMS confidence threshold
model.iou = 0.45 # NMS IoU threshold
model.agnostic = False # NMS class-agnostic
model.multi_label = False # NMS multiple labels per box
model.max_det = 1000 # maximum number of detections per image
# set image
img = 'https://github.com/ultralytics/yolov5/raw/master/data/images/zidane.jpg'
# perform inference
results = model(img, size=640)
# inference with test time augmentation
results = model(img, augment=True)
# parse results
predictions = results.pred[0]
boxes = predictions[:, :4] # x1, y1, x2, y2
scores = predictions[:, 4]
categories = predictions[:, 5]
# show detection bounding boxes on image
results.show()
# save results into "results/" folder
results.save(save_dir='results/')
```
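The `predictions` tensor parsed above has one row per detection in `[x1, y1, x2, y2, score, class]` order. A small post-processing sketch — the `filter_boxes` helper below is hypothetical, not part of the `yolov5` package — that keeps only confident detections and rounds their corners to integer pixel coordinates suitable for cropping plate regions:

```python
# Hypothetical helper for post-processing parsed YOLOv5 detections.
# Each input row is [x1, y1, x2, y2, score, class]; rows below the
# confidence threshold are dropped, and the surviving box corners are
# rounded to integers so they can be used directly as crop coordinates.

def filter_boxes(predictions, min_score=0.5):
    """Return integer (x1, y1, x2, y2) boxes for confident detections."""
    boxes = []
    for x1, y1, x2, y2, score, cls in predictions:
        if score >= min_score:
            boxes.append((int(round(x1)), int(round(y1)),
                          int(round(x2)), int(round(y2))))
    return boxes

# Illustrative rows only (not real model output):
rows = [[10.4, 20.6, 110.2, 60.9, 0.91, 0.0],
        [5.0, 5.0, 30.0, 30.0, 0.12, 0.0]]
print(filter_boxes(rows))  # the low-score row is dropped
```

With real results, something like `filter_boxes(predictions.tolist())` should work, since the parsed tensor converts to nested lists.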
- Finetune the model on your custom dataset:
```bash
yolov5 train --data data.yaml --img 640 --batch 16 --weights keremberke/yolov5n-license-plate --epochs 10
```
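The `--data data.yaml` argument points to a YOLOv5 dataset config file. A minimal sketch of what such a file typically contains — the paths and class name here are assumptions for illustration, not taken from this repository:

```yaml
# Hypothetical dataset config for `yolov5 train`; adjust paths to your data.
path: ./license-plate-dataset   # dataset root (assumed layout)
train: images/train             # training images, relative to path
val: images/val                 # validation images, relative to path

nc: 1                           # number of classes
names: ['license_plate']        # class names (assumed label)
```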
**More models available at: [awesome-yolov5-models](https://github.com/keremberke/awesome-yolov5-models)** | 2,082 | [
[
-0.06756591796875,
-0.0267791748046875,
0.033782958984375,
-0.020263671875,
-0.030059814453125,
-0.0287933349609375,
0.008331298828125,
-0.044586181640625,
0.01206207275390625,
0.02239990234375,
-0.0452880859375,
-0.05584716796875,
-0.031280517578125,
-0.01123809814453125,
0.01494598388671875,
0.05657958984375,
0.019989013671875,
0.0111541748046875,
-0.005767822265625,
-0.0194091796875,
-0.04388427734375,
0.00988006591796875,
0.00325775146484375,
-0.041839599609375,
0.0274505615234375,
0.00975799560546875,
0.0364990234375,
0.0699462890625,
0.0322265625,
0.0279693603515625,
0.00121307373046875,
-0.008575439453125,
-0.00897216796875,
0.01380157470703125,
0.001190185546875,
-0.0273895263671875,
-0.0362548828125,
0.00592041015625,
0.029144287109375,
0.02301025390625,
0.00412750244140625,
0.021759033203125,
-0.037139892578125,
0.01141357421875,
-0.052734375,
0.0204620361328125,
-0.038604736328125,
-0.0003440380096435547,
-0.0152435302734375,
-0.00897216796875,
-0.02496337890625,
-0.033050537109375,
0.0254669189453125,
-0.048370361328125,
0.00513458251953125,
0.01177978515625,
0.09161376953125,
0.0180206298828125,
-0.02490234375,
0.01513671875,
-0.026641845703125,
0.06256103515625,
-0.09246826171875,
0.01214599609375,
0.0244903564453125,
0.035369873046875,
-0.00466156005859375,
-0.05828857421875,
-0.03875732421875,
-0.008544921875,
-0.0039825439453125,
0.007724761962890625,
-0.00251007080078125,
-0.0196990966796875,
0.025482177734375,
0.01120758056640625,
-0.03973388671875,
0.00047206878662109375,
-0.04833984375,
-0.0159454345703125,
0.043853759765625,
0.03521728515625,
0.0114288330078125,
-0.002368927001953125,
-0.042572021484375,
-0.026458740234375,
-0.003314971923828125,
0.008880615234375,
0.032135009765625,
0.02374267578125,
-0.0231475830078125,
0.0306854248046875,
-0.034759521484375,
0.07550048828125,
0.01445770263671875,
-0.0294036865234375,
0.07330322265625,
-0.015472412109375,
-0.03179931640625,
0.0097198486328125,
0.0775146484375,
0.057861328125,
-0.004817962646484375,
0.0185089111328125,
-0.01471710205078125,
0.00597381591796875,
0.01910400390625,
-0.06793212890625,
-0.0260009765625,
0.0227813720703125,
-0.012786865234375,
-0.05322265625,
0.008697509765625,
-0.05511474609375,
-0.0243682861328125,
-0.000019490718841552734,
0.057708740234375,
-0.04583740234375,
-0.0207672119140625,
0.028900146484375,
-0.021636962890625,
0.0386962890625,
0.01158905029296875,
-0.0273590087890625,
-0.01041412353515625,
-0.001117706298828125,
0.04644775390625,
0.0215911865234375,
-0.01018524169921875,
-0.0309600830078125,
-0.0189056396484375,
-0.024139404296875,
0.0556640625,
-0.0220489501953125,
-0.02154541015625,
-0.01690673828125,
0.032440185546875,
0.0186920166015625,
-0.00261688232421875,
0.04705810546875,
-0.07037353515625,
0.0308990478515625,
-0.01508331298828125,
-0.0447998046875,
-0.035919189453125,
0.04437255859375,
-0.04150390625,
0.044677734375,
0.0006842613220214844,
-0.0799560546875,
0.042388916015625,
-0.02264404296875,
-0.0229339599609375,
0.02142333984375,
0.00876617431640625,
-0.08184814453125,
-0.003997802734375,
0.004825592041015625,
0.052490234375,
0.00485992431640625,
0.00666046142578125,
-0.07281494140625,
-0.01385498046875,
0.0128021240234375,
-0.017364501953125,
0.05072021484375,
0.0039520263671875,
-0.0273895263671875,
0.01049041748046875,
-0.08740234375,
0.0302276611328125,
0.038360595703125,
-0.0276031494140625,
-0.0227203369140625,
-0.0311737060546875,
0.01410675048828125,
0.01209259033203125,
0.0177459716796875,
-0.05633544921875,
0.0292205810546875,
-0.045013427734375,
-0.0033435821533203125,
0.050140380859375,
-0.00000476837158203125,
0.0252838134765625,
-0.02001953125,
0.0304107666015625,
0.033416748046875,
-0.01229095458984375,
-0.0101470947265625,
-0.034820556640625,
-0.028350830078125,
0.0179290771484375,
0.0309600830078125,
0.00811767578125,
-0.048095703125,
0.049163818359375,
-0.021209716796875,
-0.05657958984375,
-0.03448486328125,
-0.036590576171875,
0.019378662109375,
0.05657958984375,
0.036407470703125,
-0.042572021484375,
-0.042694091796875,
-0.0712890625,
0.0244903564453125,
0.0254974365234375,
0.0252838134765625,
-0.0037631988525390625,
0.062347412109375,
0.002124786376953125,
0.059112548828125,
-0.06378173828125,
-0.02947998046875,
-0.0227813720703125,
0.006900787353515625,
0.0311126708984375,
0.0386962890625,
0.05322265625,
-0.04180908203125,
-0.058868408203125,
0.0033130645751953125,
-0.039581298828125,
0.00543975830078125,
0.01541900634765625,
0.01168060302734375,
0.01273345947265625,
0.0035037994384765625,
-0.00592803955078125,
0.044281005859375,
0.01934814453125,
-0.046844482421875,
0.07525634765625,
-0.03192138671875,
0.0162200927734375,
-0.0980224609375,
-0.00447845458984375,
0.0386962890625,
-0.04791259765625,
-0.040252685546875,
-0.01251220703125,
0.01180267333984375,
0.02056884765625,
-0.037078857421875,
0.0207366943359375,
-0.0168609619140625,
0.0012340545654296875,
-0.0215911865234375,
-0.0236358642578125,
0.011749267578125,
0.0195770263671875,
-0.01467132568359375,
0.05023193359375,
0.06463623046875,
-0.030517578125,
0.038299560546875,
0.0196990966796875,
-0.04937744140625,
0.042724609375,
-0.052734375,
-0.004802703857421875,
-0.028106689453125,
0.005672454833984375,
-0.07275390625,
-0.055572509765625,
0.02618408203125,
-0.044677734375,
0.039093017578125,
-0.026275634765625,
-0.020477294921875,
-0.035736083984375,
-0.029388427734375,
0.0070037841796875,
0.0478515625,
-0.0181884765625,
0.026336669921875,
0.0254974365234375,
0.040283203125,
-0.053497314453125,
-0.056365966796875,
-0.03143310546875,
-0.0199737548828125,
-0.032379150390625,
0.037384033203125,
-0.0072479248046875,
0.002964019775390625,
0.0259246826171875,
0.003955841064453125,
-0.010284423828125,
-0.0024852752685546875,
0.0245208740234375,
0.05078125,
-0.02105712890625,
0.0025348663330078125,
-0.0304107666015625,
-0.02105712890625,
0.0094146728515625,
-0.036376953125,
0.068115234375,
-0.0372314453125,
0.0006642341613769531,
-0.055267333984375,
-0.02484130859375,
0.057708740234375,
-0.0168914794921875,
0.07928466796875,
0.0775146484375,
-0.01364898681640625,
0.0018634796142578125,
-0.035186767578125,
0.0013790130615234375,
-0.03314208984375,
0.0262603759765625,
-0.0330810546875,
-0.0026988983154296875,
0.034637451171875,
0.0229034423828125,
-0.0016803741455078125,
0.05792236328125,
0.042022705078125,
-0.0245361328125,
0.08782958984375,
0.03082275390625,
-0.004985809326171875,
0.025146484375,
-0.058441162109375,
-0.025177001953125,
-0.057403564453125,
-0.0301055908203125,
-0.01363372802734375,
-0.0312347412109375,
-0.03875732421875,
-0.0072479248046875,
0.036865234375,
-0.0335693359375,
-0.02679443359375,
0.046600341796875,
-0.04736328125,
0.0394287109375,
0.04571533203125,
0.02178955078125,
0.001781463623046875,
-0.009429931640625,
-0.0241241455078125,
-0.01557159423828125,
-0.036651611328125,
-0.0143280029296875,
0.090576171875,
0.02130126953125,
0.056793212890625,
-0.0175323486328125,
0.041229248046875,
0.012908935546875,
0.006320953369140625,
-0.045501708984375,
0.0245361328125,
0.015777587890625,
-0.07415771484375,
-0.00011974573135375977,
-0.030029296875,
-0.06292724609375,
0.0100860595703125,
-0.036407470703125,
-0.0535888671875,
0.0018148422241210938,
0.00432586669921875,
-0.03155517578125,
0.049713134765625,
-0.033538818359375,
0.06103515625,
-0.0038013458251953125,
-0.046173095703125,
0.01959228515625,
-0.05877685546875,
0.037750244140625,
0.01116943359375,
0.00998687744140625,
-0.025909423828125,
0.0177459716796875,
0.0545654296875,
-0.044464111328125,
0.0645751953125,
-0.032012939453125,
0.01517486572265625,
0.0204315185546875,
-0.0206146240234375,
0.0322265625,
-0.0086669921875,
-0.0201263427734375,
-0.0177001953125,
0.0181732177734375,
-0.0171661376953125,
-0.0180816650390625,
0.05084228515625,
-0.0640869140625,
-0.004482269287109375,
-0.0467529296875,
-0.047607421875,
0.00965118408203125,
0.035675048828125,
0.051605224609375,
0.055145263671875,
0.0087738037109375,
0.01080322265625,
0.047943115234375,
-0.0168914794921875,
0.025390625,
0.01018524169921875,
-0.0218353271484375,
-0.04638671875,
0.0679931640625,
0.020416259765625,
0.0138702392578125,
-0.0152587890625,
0.03497314453125,
-0.047821044921875,
-0.032684326171875,
-0.035186767578125,
0.0104217529296875,
-0.0650634765625,
-0.03289794921875,
-0.040863037109375,
-0.01473236083984375,
-0.03936767578125,
-0.004535675048828125,
-0.033447265625,
-0.0038623809814453125,
-0.0181121826171875,
0.00885772705078125,
0.0307159423828125,
0.051361083984375,
-0.00916290283203125,
0.03973388671875,
-0.027801513671875,
0.00844573974609375,
0.01157379150390625,
0.04339599609375,
-0.003055572509765625,
-0.0633544921875,
-0.0026493072509765625,
-0.00495147705078125,
-0.0413818359375,
-0.055633544921875,
0.05633544921875,
-0.016510009765625,
0.0273284912109375,
0.028106689453125,
0.01335906982421875,
0.057342529296875,
-0.01009368896484375,
0.0340576171875,
0.048675537109375,
-0.044342041015625,
0.03167724609375,
-0.032867431640625,
0.04205322265625,
0.045013427734375,
0.044952392578125,
-0.0002340078353881836,
0.00518798828125,
-0.056549072265625,
-0.0567626953125,
0.05633544921875,
0.0004601478576660156,
-0.01041412353515625,
0.031402587890625,
0.01239776611328125,
0.007259368896484375,
0.005046844482421875,
-0.08514404296875,
-0.03607177734375,
-0.021270751953125,
-0.0245361328125,
0.01140594482421875,
0.005519866943359375,
0.01419830322265625,
-0.04888916015625,
0.071533203125,
-0.01250457763671875,
0.0193023681640625,
0.0101776123046875,
0.01328277587890625,
-0.01461029052734375,
0.01065826416015625,
0.048126220703125,
0.0294952392578125,
-0.058258056640625,
-0.0036983489990234375,
0.023956298828125,
-0.02325439453125,
0.0250701904296875,
0.007434844970703125,
-0.01355743408203125,
-0.001255035400390625,
0.0238037109375,
0.05401611328125,
-0.0135345458984375,
-0.0016880035400390625,
0.037322998046875,
-0.00634002685546875,
-0.036102294921875,
-0.0421142578125,
0.026641845703125,
0.0138702392578125,
0.0267486572265625,
0.0034732818603515625,
0.033447265625,
-0.003063201904296875,
-0.0258636474609375,
0.02490234375,
0.0333251953125,
-0.05322265625,
-0.03155517578125,
0.07843017578125,
-0.020050048828125,
0.0099334716796875,
0.0225372314453125,
-0.050537109375,
-0.038330078125,
0.072998046875,
0.031707763671875,
0.04290771484375,
0.00861358642578125,
-0.00601959228515625,
0.06500244140625,
-0.005237579345703125,
-0.005878448486328125,
0.0239410400390625,
0.033660888671875,
-0.051727294921875,
-0.0243072509765625,
-0.05267333984375,
0.00807952880859375,
0.045867919921875,
-0.055938720703125,
0.037078857421875,
-0.037445068359375,
-0.044189453125,
0.037445068359375,
0.0193023681640625,
-0.0762939453125,
0.0340576171875,
0.01910400390625,
0.0616455078125,
-0.045440673828125,
0.06414794921875,
0.048095703125,
-0.04718017578125,
-0.06549072265625,
-0.00875091552734375,
0.0123443603515625,
-0.061370849609375,
0.045166015625,
0.042755126953125,
0.01202392578125,
0.029541015625,
-0.0692138671875,
-0.058197021484375,
0.090576171875,
-0.023223876953125,
-0.03338623046875,
0.0144500732421875,
-0.016845703125,
0.010162353515625,
-0.03680419921875,
0.03369140625,
0.03607177734375,
0.05279541015625,
0.032440185546875,
-0.036376953125,
-0.0002682209014892578,
-0.0030765533447265625,
-0.01517486572265625,
0.0202484130859375,
-0.019622802734375,
0.05035400390625,
-0.0262603759765625,
0.011749267578125,
0.00399017333984375,
0.04010009765625,
0.007526397705078125,
0.0152587890625,
0.046356201171875,
0.07623291015625,
0.025634765625,
-0.0189056396484375,
0.06927490234375,
-0.002410888671875,
0.059478759765625,
0.084228515625,
-0.01174163818359375,
0.0280303955078125,
0.005828857421875,
-0.008697509765625,
0.035369873046875,
0.04144287109375,
-0.05023193359375,
0.07159423828125,
0.0015420913696289062,
0.0042724609375,
-0.0252532958984375,
-0.00848388671875,
-0.049468994140625,
0.042236328125,
0.01039886474609375,
-0.0257110595703125,
-0.0265045166015625,
0.004673004150390625,
-0.0177001953125,
-0.03204345703125,
-0.0169525146484375,
0.04107666015625,
-0.028656005859375,
-0.025238037109375,
0.045867919921875,
0.01207733154296875,
0.058258056640625,
-0.0401611328125,
0.0167083740234375,
0.0218658447265625,
0.0240478515625,
-0.01195526123046875,
-0.07415771484375,
0.024688720703125,
-0.0276031494140625,
-0.0016355514526367188,
0.01032257080078125,
0.06512451171875,
-0.026397705078125,
-0.055572509765625,
0.00634002685546875,
0.01549530029296875,
0.0015249252319335938,
0.017486572265625,
-0.04803466796875,
0.02471923828125,
0.0150299072265625,
-0.048309326171875,
0.01251220703125,
0.01153564453125,
0.032623291015625,
0.054229736328125,
0.045623779296875,
0.019500732421875,
0.001361846923828125,
-0.0166015625,
0.07684326171875,
-0.048675537109375,
-0.022186279296875,
-0.060760498046875,
0.07049560546875,
-0.017181396484375,
-0.04095458984375,
0.040283203125,
0.0316162109375,
0.064208984375,
-0.0259552001953125,
0.06298828125,
-0.03143310546875,
0.004344940185546875,
-0.0179290771484375,
0.06207275390625,
-0.06744384765625,
-0.00582122802734375,
-0.0083465576171875,
-0.02117919921875,
-0.01403045654296875,
0.0421142578125,
-0.041717529296875,
0.00569915771484375,
0.01519012451171875,
0.05078125,
-0.039886474609375,
-0.018280029296875,
0.043243408203125,
0.0256805419921875,
0.0164337158203125,
0.031463623046875,
0.0372314453125,
-0.0570068359375,
0.0297393798828125,
-0.06549072265625,
0.00722503662109375,
-0.026031494140625,
-0.054962158203125,
-0.06988525390625,
-0.046356201171875,
-0.04010009765625,
-0.039306640625,
-0.0328369140625,
0.07684326171875,
0.07940673828125,
-0.0469970703125,
0.0181732177734375,
-0.0037097930908203125,
-0.0096435546875,
0.00749969482421875,
-0.0225067138671875,
0.0138092041015625,
-0.016937255859375,
-0.054718017578125,
0.01031494140625,
0.0005097389221191406,
0.044586181640625,
-0.019378662109375,
0.01468658447265625,
-0.0248260498046875,
-0.023468017578125,
0.00524139404296875,
-0.0020351409912109375,
-0.03631591796875,
-0.01042938232421875,
-0.0148162841796875,
-0.0104217529296875,
0.03228759765625,
-0.01006317138671875,
-0.07696533203125,
0.053802490234375,
0.04742431640625,
0.0218658447265625,
0.04736328125,
-0.0011472702026367188,
0.034454345703125,
-0.035308837890625,
0.0197296142578125,
0.00726318359375,
0.040252685546875,
0.011077880859375,
-0.01418304443359375,
0.0201873779296875,
0.029144287109375,
-0.02685546875,
-0.04888916015625,
0.00936126708984375,
-0.06768798828125,
-0.0240631103515625,
0.043060302734375,
-0.01403045654296875,
-0.051361083984375,
-0.0099945068359375,
0.0118560791015625,
0.0223846435546875,
-0.029815673828125,
0.036865234375,
0.0271759033203125,
0.0214996337890625,
-0.01284027099609375,
-0.06793212890625,
0.01296234130859375,
0.01148223876953125,
-0.0472412109375,
-0.0458984375,
0.016571044921875,
0.060760498046875,
0.0361328125,
0.0068206787109375,
0.00302886962890625,
0.01555633544921875,
0.01079559326171875,
0.03021240234375,
-0.039093017578125,
-0.00598907470703125,
-0.0218353271484375,
0.022613525390625,
-0.0191650390625,
-0.056884765625
]
] |
OpenBuddy/openbuddy-mistral-7b-v13.1 | 2023-10-11T15:55:09.000Z | [
"transformers",
"pytorch",
"mistral",
"text-generation",
"zh",
"en",
"fr",
"de",
"ja",
"ko",
"it",
"ru",
"license:apache-2.0",
"text-generation-inference",
"region:us"
] | text-generation | OpenBuddy | null | null | OpenBuddy/openbuddy-mistral-7b-v13.1 | 13 | 7,155 | transformers | 2023-10-11T15:26:48 | ---
language:
- zh
- en
- fr
- de
- ja
- ko
- it
- ru
pipeline_tag: text-generation
inference: false
library_name: transformers
license: apache-2.0
---
# OpenBuddy - Open Multilingual Chatbot
GitHub and Usage Guide: [https://github.com/OpenBuddy/OpenBuddy](https://github.com/OpenBuddy/OpenBuddy)
Website and Demo: [https://openbuddy.ai](https://openbuddy.ai)
Evaluation result of this model: [Evaluation.txt](Evaluation.txt)

# Copyright Notice
Base model: https://huggingface.co/mistralai/Mistral-7B-v0.1
License: Apache 2.0
## Disclaimer
All OpenBuddy models have inherent limitations and may potentially produce outputs that are erroneous, harmful, offensive, or otherwise undesirable. Users should not use these models in critical or high-stakes situations that may lead to personal injury, property damage, or significant losses. Examples of such scenarios include, but are not limited to, the medical field, controlling software and hardware systems that may cause harm, and making important financial or legal decisions.
OpenBuddy is provided "as-is" without any warranty of any kind, either express or implied, including, but not limited to, the implied warranties of merchantability, fitness for a particular purpose, and non-infringement. In no event shall the authors, contributors, or copyright holders be liable for any claim, damages, or other liabilities, whether in an action of contract, tort, or otherwise, arising from, out of, or in connection with the software or the use or other dealings in the software.
By using OpenBuddy, you agree to these terms and conditions, and acknowledge that you understand the potential risks associated with its use. You also agree to indemnify and hold harmless the authors, contributors, and copyright holders from any claims, damages, or liabilities arising from your use of OpenBuddy.
## 免责声明
所有OpenBuddy模型均存在固有的局限性,可能产生错误的、有害的、冒犯性的或其他不良的输出。用户在关键或高风险场景中应谨慎行事,不要使用这些模型,以免导致人身伤害、财产损失或重大损失。此类场景的例子包括但不限于医疗领域、可能导致伤害的软硬件系统的控制以及进行重要的财务或法律决策。
OpenBuddy按“原样”提供,不附带任何种类的明示或暗示的保证,包括但不限于适销性、特定目的的适用性和非侵权的暗示保证。在任何情况下,作者、贡献者或版权所有者均不对因软件或使用或其他软件交易而产生的任何索赔、损害赔偿或其他责任(无论是合同、侵权还是其他原因)承担责任。
使用OpenBuddy即表示您同意这些条款和条件,并承认您了解其使用可能带来的潜在风险。您还同意赔偿并使作者、贡献者和版权所有者免受因您使用OpenBuddy而产生的任何索赔、损害赔偿或责任的影响。 | 2,330 | [
[
-0.0259857177734375,
-0.07427978515625,
0.01348114013671875,
0.038665771484375,
-0.02093505859375,
-0.01303863525390625,
-0.0175018310546875,
-0.033721923828125,
0.00960540771484375,
0.0295867919921875,
-0.0190277099609375,
-0.041717529296875,
-0.033538818359375,
-0.0193023681640625,
-0.0028228759765625,
0.0750732421875,
-0.017608642578125,
-0.005001068115234375,
-0.0013093948364257812,
-0.01593017578125,
-0.0496826171875,
-0.019500732421875,
-0.03857421875,
-0.00811767578125,
0.00362396240234375,
0.0274505615234375,
0.064208984375,
-0.0012960433959960938,
0.04571533203125,
0.028167724609375,
0.0034198760986328125,
-0.005710601806640625,
-0.04339599609375,
0.0113983154296875,
0.004302978515625,
-0.0309600830078125,
-0.049560546875,
-0.0119171142578125,
0.01415252685546875,
0.0300140380859375,
-0.02667236328125,
0.02703857421875,
0.0049896240234375,
0.051910400390625,
-0.0577392578125,
0.0308837890625,
-0.0078887939453125,
0.003665924072265625,
-0.009063720703125,
-0.0224761962890625,
-0.01538848876953125,
-0.05810546875,
-0.00830841064453125,
-0.042694091796875,
-0.007762908935546875,
0.00885009765625,
0.0823974609375,
-0.0011768341064453125,
-0.02337646484375,
-0.01444244384765625,
-0.057220458984375,
0.042816162109375,
-0.06036376953125,
0.0288848876953125,
0.02203369140625,
0.0545654296875,
-0.019500732421875,
-0.04791259765625,
-0.039215087890625,
-0.0108489990234375,
-0.003070831298828125,
0.03021240234375,
-0.025177001953125,
-0.005687713623046875,
0.01537322998046875,
0.041015625,
-0.056976318359375,
-0.006732940673828125,
-0.044464111328125,
-0.0037555694580078125,
0.0297393798828125,
0.0106964111328125,
0.045928955078125,
-0.021942138671875,
-0.03863525390625,
-0.0053253173828125,
-0.02813720703125,
0.034942626953125,
0.0306549072265625,
0.020233154296875,
-0.05157470703125,
0.05743408203125,
-0.0188140869140625,
0.032257080078125,
-0.00411224365234375,
-0.0287628173828125,
0.041473388671875,
-0.0325927734375,
-0.02789306640625,
-0.0017910003662109375,
0.08148193359375,
0.045928955078125,
0.0257110595703125,
0.007396697998046875,
-0.0086822509765625,
-0.007686614990234375,
0.01056671142578125,
-0.06353759765625,
-0.0213470458984375,
0.050048828125,
-0.053131103515625,
-0.0199127197265625,
0.0106658935546875,
-0.06744384765625,
-0.01204681396484375,
-0.0025119781494140625,
0.027374267578125,
-0.050323486328125,
-0.04974365234375,
0.016448974609375,
-0.00611114501953125,
-0.0034637451171875,
0.0182342529296875,
-0.036529541015625,
0.0196990966796875,
0.0152435302734375,
0.0833740234375,
0.0224151611328125,
-0.01471710205078125,
-0.005828857421875,
0.019073486328125,
-0.017730712890625,
0.04083251953125,
-0.010955810546875,
-0.04083251953125,
0.00457763671875,
0.008636474609375,
0.0022830963134765625,
-0.01483154296875,
0.0259552001953125,
-0.0162811279296875,
0.0457763671875,
0.0265960693359375,
-0.01303863525390625,
-0.0309295654296875,
0.0014219284057617188,
-0.040252685546875,
0.06988525390625,
0.010894775390625,
-0.06982421875,
0.01023101806640625,
-0.07366943359375,
-0.028045654296875,
0.0020160675048828125,
-0.012603759765625,
-0.032562255859375,
-0.002376556396484375,
0.0107421875,
0.033294677734375,
-0.0162200927734375,
0.0128936767578125,
-0.042999267578125,
-0.0158843994140625,
0.0221405029296875,
-0.0235748291015625,
0.10418701171875,
0.0184478759765625,
-0.00792694091796875,
0.0372314453125,
-0.04498291015625,
0.0115966796875,
0.038787841796875,
-0.0268096923828125,
-0.03216552734375,
-0.01739501953125,
0.0131378173828125,
0.0187835693359375,
0.026641845703125,
-0.049591064453125,
0.0181884765625,
-0.039947509765625,
0.034881591796875,
0.05474853515625,
0.006832122802734375,
0.026641845703125,
-0.037322998046875,
0.054168701171875,
0.00821685791015625,
0.039581298828125,
-0.0280609130859375,
-0.057525634765625,
-0.038848876953125,
-0.04296875,
0.00757598876953125,
0.0584716796875,
-0.043670654296875,
0.04705810546875,
-0.01392364501953125,
-0.054473876953125,
-0.056854248046875,
-0.004878997802734375,
0.031219482421875,
0.02008056640625,
0.026641845703125,
-0.01544189453125,
-0.026611328125,
-0.041748046875,
-0.005672454833984375,
-0.02581787109375,
-0.00960540771484375,
0.0310516357421875,
0.045379638671875,
-0.01050567626953125,
0.061767578125,
-0.056915283203125,
-0.034423828125,
0.005542755126953125,
0.0020427703857421875,
0.0225372314453125,
0.053466796875,
0.06732177734375,
-0.05535888671875,
-0.049835205078125,
0.00637054443359375,
-0.066650390625,
0.01493072509765625,
-0.0018186569213867188,
-0.026458740234375,
0.027801513671875,
0.01837158203125,
-0.059326171875,
0.07159423828125,
0.0484619140625,
-0.03521728515625,
0.0562744140625,
-0.0291900634765625,
0.0189208984375,
-0.10333251953125,
0.019439697265625,
-0.0107421875,
-0.01430511474609375,
-0.036285400390625,
0.021820068359375,
0.00843048095703125,
-0.01739501953125,
-0.039947509765625,
0.050079345703125,
-0.0268096923828125,
0.02386474609375,
-0.001800537109375,
0.017364501953125,
-0.01385498046875,
0.035491943359375,
-0.0150909423828125,
0.04473876953125,
0.0419921875,
-0.031951904296875,
0.0406494140625,
0.0272369384765625,
-0.0255126953125,
0.04248046875,
-0.072509765625,
-0.01039886474609375,
-0.004680633544921875,
0.01499176025390625,
-0.08514404296875,
-0.0243072509765625,
0.05328369140625,
-0.06903076171875,
0.0175933837890625,
-0.006938934326171875,
-0.043487548828125,
-0.033477783203125,
-0.0299072265625,
0.009124755859375,
0.04534912109375,
-0.0256500244140625,
0.030517578125,
0.020904541015625,
-0.020751953125,
-0.042694091796875,
-0.049407958984375,
-0.023529052734375,
-0.0105743408203125,
-0.06915283203125,
0.01238250732421875,
-0.0133819580078125,
-0.00206756591796875,
0.00792694091796875,
0.00875091552734375,
-0.01617431640625,
-0.004486083984375,
0.046295166015625,
0.0285797119140625,
-0.01161956787109375,
0.0063018798828125,
0.00399017333984375,
-0.0079498291015625,
-0.01038360595703125,
0.0045928955078125,
0.042755126953125,
-0.020172119140625,
-0.038665771484375,
-0.0259552001953125,
0.037322998046875,
0.042205810546875,
-0.015838623046875,
0.062103271484375,
0.056915283203125,
-0.0367431640625,
0.00875091552734375,
-0.032196044921875,
-0.0013408660888671875,
-0.03656005859375,
0.014862060546875,
-0.03375244140625,
-0.06268310546875,
0.0528564453125,
0.01226806640625,
0.028106689453125,
0.0193939208984375,
0.057861328125,
-0.0020656585693359375,
0.072998046875,
0.050384521484375,
0.01381683349609375,
0.0299835205078125,
-0.01068878173828125,
0.02520751953125,
-0.046112060546875,
-0.0259857177734375,
-0.04791259765625,
-0.0103607177734375,
-0.055206298828125,
-0.0245361328125,
0.0263519287109375,
0.0301361083984375,
-0.03997802734375,
0.02166748046875,
-0.0543212890625,
0.023834228515625,
0.058502197265625,
0.0214385986328125,
0.0161285400390625,
-0.01059722900390625,
-0.0224456787109375,
0.0156707763671875,
-0.03826904296875,
-0.040252685546875,
0.07366943359375,
0.0262603759765625,
0.06585693359375,
0.0323486328125,
0.047882080078125,
-0.0111846923828125,
0.01202392578125,
-0.054779052734375,
0.0309906005859375,
0.0166778564453125,
-0.0716552734375,
-0.035980224609375,
-0.023529052734375,
-0.09600830078125,
0.020355224609375,
-0.0014963150024414062,
-0.07745361328125,
0.00830841064453125,
0.006313323974609375,
-0.0174713134765625,
0.03363037109375,
-0.062744140625,
0.06658935546875,
-0.01355743408203125,
-0.02496337890625,
-0.0067138671875,
-0.04736328125,
0.0400390625,
-0.0023345947265625,
0.0298309326171875,
-0.019287109375,
-0.005321502685546875,
0.0304412841796875,
-0.046966552734375,
0.0657958984375,
-0.02001953125,
0.007503509521484375,
0.02569580078125,
0.0283660888671875,
0.01073455810546875,
0.0182647705078125,
0.0251617431640625,
0.042755126953125,
0.023040771484375,
-0.038238525390625,
-0.029266357421875,
0.056488037109375,
-0.0697021484375,
-0.03668212890625,
-0.038543701171875,
-0.02362060546875,
0.005939483642578125,
0.0362548828125,
0.0167999267578125,
0.0175018310546875,
-0.013641357421875,
0.02288818359375,
0.00669097900390625,
-0.052734375,
0.031768798828125,
0.044677734375,
-0.04205322265625,
-0.039276123046875,
0.06060791015625,
0.00021755695343017578,
0.01178741455078125,
0.0087890625,
0.01739501953125,
-0.014007568359375,
-0.02789306640625,
-0.0340576171875,
0.01873779296875,
-0.044677734375,
-0.0239410400390625,
-0.030670166015625,
0.002902984619140625,
-0.056854248046875,
-0.01409149169921875,
-0.01372528076171875,
-0.0306549072265625,
-0.005687713623046875,
-0.00012189149856567383,
0.043914794921875,
0.018341064453125,
-0.022674560546875,
0.0095672607421875,
-0.07989501953125,
0.0413818359375,
-0.004791259765625,
0.057159423828125,
0.0011768341064453125,
-0.0175933837890625,
-0.026824951171875,
0.0142059326171875,
-0.03704833984375,
-0.078369140625,
0.033203125,
-0.01983642578125,
0.048614501953125,
0.04583740234375,
0.027374267578125,
0.050079345703125,
-0.029052734375,
0.06036376953125,
0.05853271484375,
-0.0504150390625,
0.05816650390625,
-0.0438232421875,
0.0276641845703125,
0.0287933349609375,
0.060821533203125,
-0.040771484375,
-0.02508544921875,
-0.043426513671875,
-0.060699462890625,
0.06561279296875,
0.0299835205078125,
0.00656890869140625,
0.001983642578125,
-0.010955810546875,
0.002574920654296875,
0.0227508544921875,
-0.05828857421875,
-0.0310516357421875,
-0.03607177734375,
-0.0107574462890625,
0.01232147216796875,
0.0016326904296875,
-0.0187530517578125,
-0.01076507568359375,
0.050689697265625,
0.01099395751953125,
0.0390625,
0.005336761474609375,
0.005229949951171875,
-0.0273284912109375,
0.02166748046875,
0.04351806640625,
0.05291748046875,
-0.041046142578125,
-0.0252532958984375,
-0.01459503173828125,
-0.0355224609375,
-0.0031147003173828125,
0.0161895751953125,
-0.0175323486328125,
0.0032482147216796875,
0.0096588134765625,
0.056732177734375,
0.0141448974609375,
-0.05224609375,
0.0516357421875,
-0.007732391357421875,
0.00855255126953125,
-0.041473388671875,
-0.00421142578125,
0.0152130126953125,
0.0196075439453125,
0.0016622543334960938,
0.0120849609375,
0.00637054443359375,
-0.039337158203125,
-0.01416015625,
0.020233154296875,
-0.037506103515625,
-0.014984130859375,
0.058990478515625,
0.0233612060546875,
-0.041229248046875,
0.04351806640625,
-0.0019464492797851562,
-0.01120758056640625,
0.043701171875,
0.0227508544921875,
0.0731201171875,
-0.04412841796875,
0.00847625732421875,
0.049713134765625,
0.029327392578125,
0.0173797607421875,
0.053619384765625,
0.010223388671875,
-0.044677734375,
-0.0296478271484375,
-0.0274200439453125,
-0.03424072265625,
0.015411376953125,
-0.059326171875,
0.0367431640625,
-0.03692626953125,
-0.029571533203125,
0.006984710693359375,
-0.0249176025390625,
-0.04559326171875,
-0.0081024169921875,
-0.00606536865234375,
0.069580078125,
-0.036376953125,
0.042999267578125,
0.06884765625,
-0.0709228515625,
-0.046295166015625,
-0.01490020751953125,
0.00453948974609375,
-0.051177978515625,
0.0280609130859375,
0.01537322998046875,
0.006591796875,
-0.02923583984375,
-0.036956787109375,
-0.05267333984375,
0.07354736328125,
0.0113983154296875,
-0.0199432373046875,
-0.013214111328125,
-0.0027008056640625,
0.023712158203125,
0.0003101825714111328,
0.05072021484375,
-0.00695037841796875,
0.037994384765625,
-0.01067352294921875,
-0.10760498046875,
0.028961181640625,
-0.0277557373046875,
-0.00833892822265625,
0.013763427734375,
-0.06512451171875,
0.07440185546875,
-0.03424072265625,
-0.01019287109375,
0.009490966796875,
0.033660888671875,
0.0243377685546875,
0.028533935546875,
0.0306396484375,
0.026458740234375,
0.035888671875,
-0.01397705078125,
0.07318115234375,
-0.03570556640625,
0.037200927734375,
0.06915283203125,
0.006160736083984375,
0.061981201171875,
0.015411376953125,
-0.03302001953125,
0.0489501953125,
0.037994384765625,
0.0016164779663085938,
0.018768310546875,
0.00205230712890625,
-0.004062652587890625,
-0.001956939697265625,
0.008941650390625,
-0.046478271484375,
0.0247039794921875,
0.0287933349609375,
-0.0192108154296875,
-0.017913818359375,
0.015533447265625,
0.005199432373046875,
-0.00966644287109375,
-0.00511932373046875,
0.05413818359375,
0.00103759765625,
-0.029388427734375,
0.055267333984375,
0.00867462158203125,
0.038482666015625,
-0.061553955078125,
-0.004093170166015625,
-0.007568359375,
0.019287109375,
-0.02532958984375,
-0.059234619140625,
0.0037212371826171875,
-0.005214691162109375,
-0.0011243820190429688,
-0.0012044906616210938,
0.058197021484375,
-0.00543975830078125,
-0.0175018310546875,
0.026458740234375,
0.046630859375,
0.020538330078125,
-0.0029315948486328125,
-0.06671142578125,
-0.0009608268737792969,
-0.00408935546875,
-0.042449951171875,
0.019012451171875,
0.03985595703125,
0.003925323486328125,
0.0667724609375,
0.05328369140625,
0.0051727294921875,
-0.00476837158203125,
-0.002208709716796875,
0.0693359375,
-0.049285888671875,
-0.05755615234375,
-0.04095458984375,
0.06292724609375,
-0.0034008026123046875,
-0.028472900390625,
0.0648193359375,
0.05291748046875,
0.0726318359375,
-0.0175933837890625,
0.0709228515625,
-0.0185699462890625,
0.051116943359375,
-0.019622802734375,
0.0577392578125,
-0.050689697265625,
-0.027679443359375,
-0.03533935546875,
-0.04705810546875,
-0.01168060302734375,
0.06231689453125,
-0.0164337158203125,
0.0173492431640625,
0.048187255859375,
0.048187255859375,
-0.0028438568115234375,
0.0118560791015625,
0.0200653076171875,
0.0298004150390625,
0.01282501220703125,
0.039886474609375,
0.04888916015625,
-0.03277587890625,
0.06817626953125,
-0.02825927734375,
-0.0386962890625,
-0.0316162109375,
-0.034698486328125,
-0.0849609375,
-0.036224365234375,
-0.033172607421875,
-0.042449951171875,
-0.0076141357421875,
0.06781005859375,
0.05828857421875,
-0.066162109375,
-0.033111572265625,
0.017425537109375,
0.005523681640625,
-0.03204345703125,
-0.02569580078125,
0.02392578125,
-0.006256103515625,
-0.0692138671875,
0.00220489501953125,
0.0119781494140625,
0.01526641845703125,
-0.0252532958984375,
0.0004949569702148438,
-0.0095977783203125,
0.0011196136474609375,
0.046600341796875,
0.024566650390625,
-0.05560302734375,
-0.0167999267578125,
-0.0086669921875,
0.0005655288696289062,
0.0022449493408203125,
0.029876708984375,
-0.04150390625,
0.049407958984375,
0.052032470703125,
0.003185272216796875,
0.0283203125,
-0.013153076171875,
0.0200042724609375,
-0.037139892578125,
0.0250701904296875,
0.007732391357421875,
0.038299560546875,
-0.0014200210571289062,
-0.0212249755859375,
0.04974365234375,
0.0103912353515625,
-0.0386962890625,
-0.0675048828125,
0.00605010986328125,
-0.07659912109375,
-0.03741455078125,
0.08062744140625,
-0.025482177734375,
0.0006494522094726562,
-0.006984710693359375,
-0.038330078125,
0.034423828125,
-0.058197021484375,
0.051910400390625,
0.041900634765625,
-0.01776123046875,
-0.0013589859008789062,
-0.0615234375,
0.00704193115234375,
-0.006610870361328125,
-0.057525634765625,
-0.00917816162109375,
0.048095703125,
0.0173797607421875,
0.0268096923828125,
0.058197021484375,
-0.0208892822265625,
0.0299072265625,
0.003467559814453125,
0.03436279296875,
-0.0273284912109375,
-0.0006747245788574219,
-0.011474609375,
0.0233612060546875,
-0.0240478515625,
-0.03662109375
]
] |
sentence-transformers/bert-base-nli-stsb-mean-tokens | 2022-06-15T20:01:00.000Z | [
"sentence-transformers",
"pytorch",
"tf",
"jax",
"bert",
"feature-extraction",
"sentence-similarity",
"transformers",
"arxiv:1908.10084",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | sentence-similarity | sentence-transformers | null | null | sentence-transformers/bert-base-nli-stsb-mean-tokens | 1 | 7,154 | sentence-transformers | 2022-03-02T23:29:05 | ---
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- transformers
license: apache-2.0
---
**⚠️ This model is deprecated. Please don't use it as it produces sentence embeddings of low quality. You can find recommended sentence embedding models here: [SBERT.net - Pretrained Models](https://www.sbert.net/docs/pretrained_models.html)**
# sentence-transformers/bert-base-nli-stsb-mean-tokens
This is a [sentence-transformers](https://www.SBERT.net) model: it maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for tasks like clustering or semantic search.
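As a minimal illustration of how such dense vectors support semantic search (a sketch using toy 4-dimensional vectors standing in for the real 768-dimensional embeddings you would get from `model.encode`), cosine similarity ranks candidate sentences against a query:

```python
import numpy as np

def cosine_similarity(a, b):
    # Cosine similarity between two 1-D vectors
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy stand-ins for model.encode(...) output (real embeddings are 768-dim)
query = np.array([0.1, 0.3, -0.2, 0.4])
candidates = {
    "similar": np.array([0.12, 0.28, -0.18, 0.41]),
    "unrelated": np.array([-0.4, 0.05, 0.3, -0.1]),
}

# Rank candidates by similarity to the query; highest score wins
scores = {name: cosine_similarity(query, vec) for name, vec in candidates.items()}
best = max(scores, key=scores.get)
print(best)  # -> "similar"
```

The same ranking logic applies unchanged to the model's actual embeddings.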
## Usage (Sentence-Transformers)
Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:
```
pip install -U sentence-transformers
```
Then you can use the model like this:
```python
from sentence_transformers import SentenceTransformer
sentences = ["This is an example sentence", "Each sentence is converted"]
model = SentenceTransformer('sentence-transformers/bert-base-nli-stsb-mean-tokens')
embeddings = model.encode(sentences)
print(embeddings)
```
## Usage (HuggingFace Transformers)
Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: first, you pass your input through the transformer model, then you apply the right pooling operation on top of the contextualized word embeddings.
```python
from transformers import AutoTokenizer, AutoModel
import torch
#Mean Pooling - Take attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
token_embeddings = model_output[0] #First element of model_output contains all token embeddings
input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)
# Sentences we want sentence embeddings for
sentences = ['This is an example sentence', 'Each sentence is converted']
# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('sentence-transformers/bert-base-nli-stsb-mean-tokens')
model = AutoModel.from_pretrained('sentence-transformers/bert-base-nli-stsb-mean-tokens')
# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# Compute token embeddings
with torch.no_grad():
model_output = model(**encoded_input)
# Perform pooling. In this case, mean pooling.
sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask'])
print("Sentence embeddings:")
print(sentence_embeddings)
```
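The mean-pooling step above can be sanity-checked in isolation. This sketch mirrors its masked-average logic with NumPy on a hypothetical toy batch (dummy numbers, not real model output): padded positions are zeroed out before averaging, so they don't dilute the sentence embedding:

```python
import numpy as np

# Dummy token embeddings: batch of 1 sentence, 3 tokens, hidden size 2
token_embeddings = np.array([[[1.0, 2.0],
                              [3.0, 4.0],
                              [5.0, 6.0]]])
# Attention mask: the last token is padding and must be ignored
attention_mask = np.array([[1, 1, 0]])

mask = attention_mask[..., None].astype(float)   # (1, 3, 1), broadcastable
summed = (token_embeddings * mask).sum(axis=1)   # sum over real tokens only
counts = np.clip(mask.sum(axis=1), 1e-9, None)   # avoid division by zero
sentence_embedding = summed / counts

print(sentence_embedding)  # mean of the first two tokens only: [[2. 3.]]
```

The padded third token contributes nothing, exactly as in the `mean_pooling` function above.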
## Evaluation Results
For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name=sentence-transformers/bert-base-nli-stsb-mean-tokens)
## Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
)
```
## Citing & Authors
This model was trained by [sentence-transformers](https://www.sbert.net/).
If you find this model helpful, feel free to cite our publication [Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks](https://arxiv.org/abs/1908.10084):
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "http://arxiv.org/abs/1908.10084",
}
``` | 3,972 | [
[
-0.01751708984375,
-0.058319091796875,
0.018280029296875,
0.0310821533203125,
-0.03289794921875,
-0.032470703125,
-0.022918701171875,
-0.007720947265625,
0.0175933837890625,
0.026763916015625,
-0.04095458984375,
-0.03070068359375,
-0.053924560546875,
0.00446319580078125,
-0.031494140625,
0.0634765625,
-0.010040283203125,
0.00568389892578125,
-0.0197296142578125,
-0.01007080078125,
-0.0253448486328125,
-0.03656005859375,
-0.02630615234375,
-0.019775390625,
0.015533447265625,
0.0058135986328125,
0.034942626953125,
0.0268707275390625,
0.02117919921875,
0.032135009765625,
-0.00708770751953125,
0.01111602783203125,
-0.0282440185546875,
-0.008697509765625,
0.0074920654296875,
-0.021087646484375,
-0.005580902099609375,
0.02471923828125,
0.044097900390625,
0.038909912109375,
-0.01042938232421875,
0.004695892333984375,
0.0003418922424316406,
0.0213165283203125,
-0.03485107421875,
0.031097412109375,
-0.043121337890625,
0.01352691650390625,
0.00943756103515625,
0.00525665283203125,
-0.0477294921875,
-0.01361083984375,
0.0272064208984375,
-0.031097412109375,
0.006988525390625,
0.01430511474609375,
0.08721923828125,
0.027252197265625,
-0.0193023681640625,
-0.0268707275390625,
-0.0209503173828125,
0.0665283203125,
-0.07269287109375,
0.0225830078125,
0.0185546875,
-0.0055084228515625,
-0.001682281494140625,
-0.07666015625,
-0.056121826171875,
-0.01154327392578125,
-0.0322265625,
0.016265869140625,
-0.0304412841796875,
0.0021915435791015625,
0.007244110107421875,
0.017822265625,
-0.049774169921875,
-0.006458282470703125,
-0.031463623046875,
-0.0087127685546875,
0.038604736328125,
-0.0019779205322265625,
0.0266265869140625,
-0.046356201171875,
-0.034698486328125,
-0.0237579345703125,
-0.01544952392578125,
-0.00873565673828125,
0.01047515869140625,
0.0155487060546875,
-0.021484375,
0.057098388671875,
0.006160736083984375,
0.04071044921875,
-0.0012254714965820312,
0.0234832763671875,
0.0517578125,
-0.0265960693359375,
-0.026519775390625,
-0.006435394287109375,
0.080322265625,
0.031524658203125,
0.0301361083984375,
-0.00913238525390625,
-0.01309967041015625,
0.0019931793212890625,
0.0224151611328125,
-0.05859375,
-0.0275115966796875,
0.01232147216796875,
-0.03143310546875,
-0.023345947265625,
0.01235198974609375,
-0.0445556640625,
0.0006270408630371094,
0.0055694580078125,
0.055633544921875,
-0.04974365234375,
0.0014944076538085938,
0.0207366943359375,
-0.0207672119140625,
0.0148162841796875,
-0.0217437744140625,
-0.053863525390625,
0.01529693603515625,
0.01953125,
0.06878662109375,
0.00734710693359375,
-0.034698486328125,
-0.014373779296875,
-0.0112457275390625,
-0.0008740425109863281,
0.0457763671875,
-0.022705078125,
-0.011932373046875,
0.01361083984375,
0.0179443359375,
-0.041839599609375,
-0.0243377685546875,
0.04473876953125,
-0.02734375,
0.05303955078125,
0.00771331787109375,
-0.06622314453125,
-0.01397705078125,
0.01024627685546875,
-0.037841796875,
0.08062744140625,
0.01279449462890625,
-0.07061767578125,
0.01032257080078125,
-0.06158447265625,
-0.0247955322265625,
-0.01190185546875,
0.0092315673828125,
-0.052154541015625,
0.01317596435546875,
0.0352783203125,
0.0521240234375,
0.01525115966796875,
0.03753662109375,
-0.0180511474609375,
-0.03692626953125,
0.0294342041015625,
-0.03070068359375,
0.0892333984375,
0.010223388671875,
-0.0235748291015625,
0.011383056640625,
-0.038238525390625,
-0.006130218505859375,
0.0229949951171875,
-0.01258087158203125,
-0.0164337158203125,
0.00433349609375,
0.025970458984375,
0.0177001953125,
0.0167236328125,
-0.056610107421875,
0.007762908935546875,
-0.04931640625,
0.072021484375,
0.047821044921875,
0.0013570785522460938,
0.0391845703125,
-0.0213165283203125,
0.005756378173828125,
0.02532958984375,
0.0003769397735595703,
-0.0150299072265625,
-0.032989501953125,
-0.07659912109375,
-0.02459716796875,
0.0278778076171875,
0.04034423828125,
-0.052764892578125,
0.08636474609375,
-0.03814697265625,
-0.03375244140625,
-0.055572509765625,
-0.006103515625,
0.005077362060546875,
0.0268707275390625,
0.049102783203125,
-0.00833892822265625,
-0.052001953125,
-0.0711669921875,
-0.000701904296875,
-0.004405975341796875,
0.00458526611328125,
0.02020263671875,
0.056121826171875,
-0.035888671875,
0.080322265625,
-0.04632568359375,
-0.03173828125,
-0.03790283203125,
0.02337646484375,
0.0193023681640625,
0.0517578125,
0.040618896484375,
-0.04638671875,
-0.0212249755859375,
-0.0521240234375,
-0.052032470703125,
0.0026721954345703125,
-0.018341064453125,
-0.01229095458984375,
0.0178985595703125,
0.035125732421875,
-0.0623779296875,
0.0283355712890625,
0.046051025390625,
-0.039581298828125,
0.0233001708984375,
-0.0209503173828125,
-0.0184783935546875,
-0.10296630859375,
0.003719329833984375,
0.006061553955078125,
-0.016876220703125,
-0.032806396484375,
0.0033397674560546875,
0.00943756103515625,
-0.010650634765625,
-0.03619384765625,
0.035247802734375,
-0.0254669189453125,
0.01149749755859375,
0.0008597373962402344,
0.0301055908203125,
0.001461029052734375,
0.05743408203125,
-0.003387451171875,
0.05218505859375,
0.03375244140625,
-0.041900634765625,
0.0198822021484375,
0.049163818359375,
-0.040618896484375,
0.005615234375,
-0.0660400390625,
-0.0014247894287109375,
-0.0030155181884765625,
0.0347900390625,
-0.08282470703125,
0.0008587837219238281,
0.02593994140625,
-0.04541015625,
0.01488494873046875,
0.0279998779296875,
-0.051971435546875,
-0.04522705078125,
-0.0305938720703125,
0.010101318359375,
0.043853759765625,
-0.044677734375,
0.043975830078125,
0.0198822021484375,
-0.001476287841796875,
-0.04296875,
-0.08935546875,
0.0019359588623046875,
-0.007587432861328125,
-0.051239013671875,
0.041900634765625,
-0.005237579345703125,
0.0174713134765625,
0.0275115966796875,
0.020294189453125,
-0.0003979206085205078,
0.001155853271484375,
0.0026302337646484375,
0.0191192626953125,
-0.00530242919921875,
0.020172119140625,
0.0131988525390625,
-0.00732421875,
0.00421905517578125,
-0.0163421630859375,
0.0538330078125,
-0.0130157470703125,
-0.00927734375,
-0.0347900390625,
0.0146331787109375,
0.0276031494140625,
-0.0196685791015625,
0.0836181640625,
0.07666015625,
-0.03436279296875,
-0.0046539306640625,
-0.042572021484375,
-0.0223236083984375,
-0.034393310546875,
0.050872802734375,
-0.01036834716796875,
-0.07647705078125,
0.0255584716796875,
0.01568603515625,
0.004638671875,
0.047515869140625,
0.03924560546875,
-0.01331329345703125,
0.059326171875,
0.044677734375,
-0.0164031982421875,
0.04022216796875,
-0.047271728515625,
0.027801513671875,
-0.07177734375,
-0.001953125,
-0.01531219482421875,
-0.02288818359375,
-0.053436279296875,
-0.032257080078125,
0.010223388671875,
-0.007015228271484375,
-0.025604248046875,
0.04241943359375,
-0.0411376953125,
0.01042938232421875,
0.050262451171875,
0.014678955078125,
-0.01291656494140625,
0.0028247833251953125,
-0.0304718017578125,
-0.0059356689453125,
-0.05126953125,
-0.041473388671875,
0.06298828125,
0.038299560546875,
0.033233642578125,
-0.00930023193359375,
0.052032470703125,
0.006160736083984375,
0.0038661956787109375,
-0.052520751953125,
0.044677734375,
-0.0306854248046875,
-0.0380859375,
-0.0245819091796875,
-0.02508544921875,
-0.064453125,
0.02813720703125,
-0.015777587890625,
-0.05780029296875,
0.01039886474609375,
-0.01788330078125,
-0.0216064453125,
0.0220794677734375,
-0.06475830078125,
0.07879638671875,
0.004932403564453125,
-0.00046634674072265625,
-0.01122283935546875,
-0.05224609375,
0.0113677978515625,
0.0198974609375,
0.0020809173583984375,
-0.00029158592224121094,
0.00005710124969482422,
0.06817626953125,
-0.020538330078125,
0.080810546875,
-0.01776123046875,
0.020660400390625,
0.03106689453125,
-0.0286865234375,
0.0205535888671875,
-0.00677490234375,
-0.003803253173828125,
0.011688232421875,
-0.01497650146484375,
-0.0275726318359375,
-0.036895751953125,
0.050933837890625,
-0.07666015625,
-0.0277099609375,
-0.03570556640625,
-0.04205322265625,
-0.004955291748046875,
0.01285552978515625,
0.028900146484375,
0.032928466796875,
-0.0182647705078125,
0.0340576171875,
0.0357666015625,
-0.02935791015625,
0.059844970703125,
0.007537841796875,
0.0024967193603515625,
-0.042083740234375,
0.04864501953125,
0.006336212158203125,
-0.0027332305908203125,
0.032470703125,
0.013336181640625,
-0.033905029296875,
-0.0173187255859375,
-0.0264739990234375,
0.032440185546875,
-0.043548583984375,
-0.01404571533203125,
-0.07879638671875,
-0.0428466796875,
-0.049285888671875,
-0.00433349609375,
-0.01593017578125,
-0.034515380859375,
-0.044158935546875,
-0.024749755859375,
0.0245819091796875,
0.034515380859375,
-0.0031719207763671875,
0.03253173828125,
-0.053985595703125,
0.007160186767578125,
0.0128021240234375,
0.0143585205078125,
-0.0020580291748046875,
-0.05401611328125,
-0.0283966064453125,
-0.00006783008575439453,
-0.028961181640625,
-0.06280517578125,
0.050872802734375,
0.018402099609375,
0.045654296875,
0.0113677978515625,
0.01032257080078125,
0.0452880859375,
-0.044219970703125,
0.073974609375,
0.005096435546875,
-0.081787109375,
0.034820556640625,
-0.0014133453369140625,
0.0299530029296875,
0.034271240234375,
0.02264404296875,
-0.03399658203125,
-0.0330810546875,
-0.054412841796875,
-0.0799560546875,
0.049346923828125,
0.032623291015625,
0.048492431640625,
-0.031829833984375,
0.0212249755859375,
-0.021026611328125,
0.0153045654296875,
-0.0904541015625,
-0.02655029296875,
-0.034271240234375,
-0.047149658203125,
-0.02374267578125,
-0.02862548828125,
0.01776123046875,
-0.02764892578125,
0.060791015625,
0.00713348388671875,
0.061859130859375,
0.02850341796875,
-0.042572021484375,
0.01206207275390625,
0.0177459716796875,
0.038848876953125,
0.01474761962890625,
-0.01528167724609375,
0.00937652587890625,
0.021881103515625,
-0.026397705078125,
-0.0037212371826171875,
0.038421630859375,
-0.010009765625,
0.017791748046875,
0.03179931640625,
0.07763671875,
0.04254150390625,
-0.03643798828125,
0.05987548828125,
-0.004428863525390625,
-0.0209197998046875,
-0.03375244140625,
-0.01103973388671875,
0.0197906494140625,
0.0196685791015625,
0.0228118896484375,
-0.00045943260192871094,
-0.0006680488586425781,
-0.02362060546875,
0.0256805419921875,
0.0187835693359375,
-0.03521728515625,
-0.004978179931640625,
0.048583984375,
0.012237548828125,
-0.0109405517578125,
0.07745361328125,
-0.0214080810546875,
-0.0545654296875,
0.0278472900390625,
0.05035400390625,
0.0758056640625,
0.004314422607421875,
0.0223388671875,
0.040191650390625,
0.0301971435546875,
0.0012235641479492188,
-0.0012006759643554688,
0.01055908203125,
-0.0721435546875,
-0.02374267578125,
-0.04461669921875,
0.00719451904296875,
0.002071380615234375,
-0.04034423828125,
0.0156707763671875,
-0.009185791015625,
-0.01209259033203125,
-0.016815185546875,
0.0008144378662109375,
-0.046356201171875,
0.00913238525390625,
0.008453369140625,
0.06463623046875,
-0.0762939453125,
0.058135986328125,
0.04998779296875,
-0.051422119140625,
-0.051239013671875,
-0.0037784576416015625,
-0.0297698974609375,
-0.058685302734375,
0.041717529296875,
0.0391845703125,
0.01678466796875,
0.0179290771484375,
-0.04705810546875,
-0.05926513671875,
0.09698486328125,
0.01556396484375,
-0.0264739990234375,
-0.0186920166015625,
0.0049896240234375,
0.0372314453125,
-0.03887939453125,
0.0267486572265625,
0.0245513916015625,
0.0239105224609375,
-0.006214141845703125,
-0.04736328125,
0.016021728515625,
-0.0238037109375,
0.020538330078125,
-0.0142059326171875,
-0.037933349609375,
0.06976318359375,
-0.00640106201171875,
-0.0175628662109375,
0.0158843994140625,
0.0689697265625,
0.021240234375,
-0.006443023681640625,
0.037567138671875,
0.06573486328125,
0.041778564453125,
-0.0104522705078125,
0.0693359375,
-0.02203369140625,
0.051300048828125,
0.073974609375,
0.006679534912109375,
0.08514404296875,
0.033935546875,
-0.0030040740966796875,
0.06329345703125,
0.0423583984375,
-0.02764892578125,
0.051971435546875,
0.018280029296875,
0.007030487060546875,
-0.00016999244689941406,
0.00873565673828125,
-0.01459503173828125,
0.035400390625,
0.01507568359375,
-0.0562744140625,
-0.00170135498046875,
0.01276397705078125,
0.0036563873291015625,
-0.0023708343505859375,
0.01052093505859375,
0.044891357421875,
0.00986480712890625,
-0.031646728515625,
0.0293731689453125,
0.01511383056640625,
0.0787353515625,
-0.0290679931640625,
0.012420654296875,
-0.0011196136474609375,
0.022125244140625,
0.005062103271484375,
-0.042999267578125,
0.0266265869140625,
-0.0089874267578125,
-0.00211334228515625,
-0.0181121826171875,
0.045928955078125,
-0.0452880859375,
-0.046478271484375,
0.027557373046875,
0.03973388671875,
0.0016937255859375,
0.0077972412109375,
-0.0770263671875,
-0.00048661231994628906,
-0.0019435882568359375,
-0.039276123046875,
0.0100555419921875,
0.0213165283203125,
0.029937744140625,
0.041595458984375,
0.027862548828125,
-0.01348114013671875,
0.008514404296875,
0.0141754150390625,
0.0643310546875,
-0.0474853515625,
-0.043731689453125,
-0.06842041015625,
0.056121826171875,
-0.01556396484375,
-0.0238494873046875,
0.04522705078125,
0.038726806640625,
0.0657958984375,
-0.0228729248046875,
0.04205322265625,
-0.01195526123046875,
0.0188751220703125,
-0.04010009765625,
0.06500244140625,
-0.03436279296875,
-0.0058135986328125,
-0.01812744140625,
-0.06744384765625,
-0.0254058837890625,
0.08526611328125,
-0.0258026123046875,
0.013885498046875,
0.06787109375,
0.05645751953125,
-0.003204345703125,
-0.0014209747314453125,
0.01000213623046875,
0.032562255859375,
0.01654052734375,
0.035430908203125,
0.0360107421875,
-0.0626220703125,
0.047607421875,
-0.036102294921875,
-0.0039215087890625,
-0.01218414306640625,
-0.06353759765625,
-0.07513427734375,
-0.06085205078125,
-0.032806396484375,
-0.018402099609375,
-0.000029325485229492188,
0.08319091796875,
0.04974365234375,
-0.05609130859375,
-0.00795745849609375,
-0.0204315185546875,
-0.0167694091796875,
-0.00960540771484375,
-0.02374267578125,
0.0404052734375,
-0.045379638671875,
-0.060638427734375,
0.0119476318359375,
-0.0110321044921875,
0.01092529296875,
-0.0290679931640625,
0.01113128662109375,
-0.052520751953125,
0.01456451416015625,
0.0452880859375,
-0.0237579345703125,
-0.062286376953125,
-0.026214599609375,
0.004451751708984375,
-0.0253753662109375,
-0.0086669921875,
0.0246734619140625,
-0.0528564453125,
0.0188751220703125,
0.0254974365234375,
0.04583740234375,
0.04656982421875,
-0.017425537109375,
0.0369873046875,
-0.0653076171875,
0.016754150390625,
0.00995635986328125,
0.05194091796875,
0.033905029296875,
-0.0185546875,
0.0426025390625,
0.016326904296875,
-0.035797119140625,
-0.05010986328125,
-0.0149383544921875,
-0.0780029296875,
-0.02587890625,
0.0804443359375,
-0.031890869140625,
-0.0258636474609375,
0.0156707763671875,
-0.0128173828125,
0.03924560546875,
-0.02606201171875,
0.05352783203125,
0.06646728515625,
0.005397796630859375,
-0.0225830078125,
-0.02239990234375,
0.011688232421875,
0.033172607421875,
-0.03948974609375,
-0.01271820068359375,
0.020355224609375,
0.0182037353515625,
0.02325439453125,
0.02978515625,
-0.007022857666015625,
-0.002288818359375,
0.0026111602783203125,
0.012939453125,
-0.01441192626953125,
0.00408172607421875,
-0.0251312255859375,
0.003875732421875,
-0.029815673828125,
-0.032470703125
]
] |
sijunhe/nezha-cn-base | 2022-06-24T03:53:56.000Z | [
"transformers",
"pytorch",
"nezha",
"fill-mask",
"arxiv:1909.00204",
"license:afl-3.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | fill-mask | sijunhe | null | null | sijunhe/nezha-cn-base | 9 | 7,140 | transformers | 2022-06-18T16:39:15 | ---
license: afl-3.0
---
**Please use the `Bert`-related tokenizer classes and the `Nezha`-related model classes.**
[NEZHA: Neural Contextualized Representation for Chinese Language Understanding](https://arxiv.org/abs/1909.00204)
Junqiu Wei, Xiaozhe Ren, Xiaoguang Li, Wenyong Huang, Yi Liao, Yasheng Wang, Jiashu Lin, Xin Jiang, Xiao Chen and Qun Liu.
The original checkpoints can be found [here](https://github.com/huawei-noah/Pretrained-Language-Model/tree/master/NEZHA-PyTorch)
## Example Usage
```python
from transformers import BertTokenizer, NezhaModel
tokenizer = BertTokenizer.from_pretrained('sijunhe/nezha-cn-base')
model = NezhaModel.from_pretrained("sijunhe/nezha-cn-base")
text = "我爱北京天安门"
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
``` | 782 | [
[
-0.01033782958984375,
-0.03741455078125,
0.0173492431640625,
0.0341796875,
-0.0360107421875,
-0.019317626953125,
-0.019927978515625,
-0.0052337646484375,
-0.0015859603881835938,
0.0154266357421875,
-0.03515625,
-0.029632568359375,
-0.0400390625,
-0.0113067626953125,
-0.01235198974609375,
0.07293701171875,
-0.0016546249389648438,
0.0452880859375,
0.03363037109375,
-0.001857757568359375,
-0.028350830078125,
-0.0482177734375,
-0.061859130859375,
-0.041473388671875,
0.033538818359375,
0.0034503936767578125,
0.0234832763671875,
0.0201263427734375,
-0.0020961761474609375,
0.0240936279296875,
-0.0202178955078125,
-0.014190673828125,
-0.0113677978515625,
-0.0184173583984375,
0.0042266845703125,
-0.0195465087890625,
-0.041168212890625,
-0.00838470458984375,
0.06805419921875,
0.035247802734375,
0.01922607421875,
0.004459381103515625,
0.00792694091796875,
0.01052093505859375,
-0.03265380859375,
0.0245208740234375,
-0.0443115234375,
0.0175323486328125,
-0.0059967041015625,
0.0005364418029785156,
-0.04339599609375,
-0.028533935546875,
0.0277862548828125,
-0.0498046875,
0.0003726482391357422,
-0.01206207275390625,
0.10223388671875,
-0.007843017578125,
-0.036651611328125,
-0.006855010986328125,
-0.0258636474609375,
0.07293701171875,
-0.0828857421875,
0.027191162109375,
0.0195159912109375,
0.029296875,
-0.011383056640625,
-0.07733154296875,
-0.0528564453125,
-0.0207977294921875,
-0.0067596435546875,
0.00811004638671875,
0.007904052734375,
0.007389068603515625,
0.02239990234375,
0.01617431640625,
-0.0289154052734375,
0.002460479736328125,
-0.0225677490234375,
-0.034698486328125,
0.0167999267578125,
0.0008029937744140625,
0.04107666015625,
-0.0289764404296875,
-0.028564453125,
-0.027740478515625,
-0.03558349609375,
0.00989532470703125,
0.009857177734375,
0.027923583984375,
-0.01806640625,
0.0167083740234375,
0.007415771484375,
0.071044921875,
0.00598907470703125,
0.0004487037658691406,
0.04693603515625,
-0.0254669189453125,
-0.033843994140625,
-0.005764007568359375,
0.0562744140625,
-0.0024967193603515625,
0.0129852294921875,
-0.0012407302856445312,
-0.01470947265625,
-0.04095458984375,
-0.0083160400390625,
-0.0653076171875,
-0.0220794677734375,
0.01145172119140625,
-0.05841064453125,
-0.00589752197265625,
0.03045654296875,
-0.036285400390625,
0.00582122802734375,
-0.01496124267578125,
0.06109619140625,
-0.04571533203125,
-0.0216827392578125,
-0.01413726806640625,
-0.0156707763671875,
0.054443359375,
0.0151214599609375,
-0.08984375,
-0.00016200542449951172,
0.034759521484375,
0.052764892578125,
0.0421142578125,
-0.04840087890625,
-0.0311126708984375,
-0.0056304931640625,
-0.017242431640625,
0.02655029296875,
-0.002658843994140625,
0.00843048095703125,
0.0189056396484375,
0.0219268798828125,
-0.0265960693359375,
-0.0271148681640625,
0.045135498046875,
-0.042327880859375,
0.030548095703125,
-0.02117919921875,
-0.0299835205078125,
-0.044586181640625,
0.01763916015625,
-0.027679443359375,
0.0594482421875,
0.021636962890625,
-0.06512451171875,
0.024871826171875,
-0.04437255859375,
-0.0355224609375,
0.01496124267578125,
-0.005466461181640625,
-0.034088134765625,
-0.00030350685119628906,
0.0113372802734375,
0.035308837890625,
0.0157012939453125,
0.026153564453125,
-0.001995086669921875,
-0.044219970703125,
0.00266265869140625,
-0.023773193359375,
0.087158203125,
0.00431060791015625,
-0.0477294921875,
0.04229736328125,
-0.0443115234375,
0.01084136962890625,
0.0044403076171875,
-0.037750244140625,
-0.0146484375,
0.007305145263671875,
0.052093505859375,
0.02178955078125,
0.0487060546875,
-0.042694091796875,
0.0011043548583984375,
-0.044403076171875,
0.0439453125,
0.06878662109375,
-0.0243988037109375,
0.03375244140625,
-0.004299163818359375,
0.007205963134765625,
0.0177154541015625,
0.00023221969604492188,
-0.01108551025390625,
-0.020751953125,
-0.0809326171875,
0.015625,
0.0268096923828125,
0.03924560546875,
-0.07293701171875,
0.052459716796875,
-0.0158843994140625,
-0.03472900390625,
-0.006366729736328125,
-0.010009765625,
0.009368896484375,
0.0243377685546875,
0.033111572265625,
-0.015350341796875,
-0.042022705078125,
-0.049102783203125,
0.00205230712890625,
-0.0291290283203125,
-0.0119171142578125,
0.0170745849609375,
0.037445068359375,
-0.0308685302734375,
0.051116943359375,
-0.020355224609375,
-0.0267181396484375,
-0.034759521484375,
0.01367950439453125,
0.04229736328125,
0.065673828125,
0.062225341796875,
-0.03131103515625,
-0.036712646484375,
0.007282257080078125,
-0.028167724609375,
0.005962371826171875,
-0.01702880859375,
-0.0112762451171875,
0.013519287109375,
0.0203399658203125,
-0.00954437255859375,
0.019744873046875,
0.0293426513671875,
-0.037322998046875,
0.047760009765625,
-0.0168914794921875,
-0.01468658447265625,
-0.09454345703125,
0.00875091552734375,
-0.0216064453125,
-0.0142822265625,
-0.056549072265625,
0.01139068603515625,
0.0141143798828125,
0.01030731201171875,
-0.044342041015625,
0.046875,
-0.0232696533203125,
-0.00212860107421875,
-0.0250396728515625,
0.0065765380859375,
-0.01186370849609375,
0.05010986328125,
0.016448974609375,
0.036163330078125,
0.05712890625,
-0.05712890625,
0.040557861328125,
0.032501220703125,
-0.020050048828125,
-0.0160369873046875,
-0.0643310546875,
0.0131683349609375,
-0.003337860107421875,
0.01885986328125,
-0.0482177734375,
-0.00525665283203125,
0.058837890625,
-0.04840087890625,
0.0207366943359375,
-0.00981903076171875,
-0.040618896484375,
-0.007610321044921875,
-0.01526641845703125,
0.039520263671875,
0.03851318359375,
-0.0660400390625,
0.06280517578125,
-0.0037517547607421875,
0.005374908447265625,
-0.06103515625,
-0.066162109375,
0.00039315223693847656,
0.00031113624572753906,
-0.037261962890625,
0.03692626953125,
-0.0036983489990234375,
0.0118865966796875,
-0.003173828125,
0.009368896484375,
-0.028839111328125,
0.002620697021484375,
-0.0008006095886230469,
0.0274200439453125,
-0.0369873046875,
0.0112152099609375,
-0.003993988037109375,
-0.01519012451171875,
0.01084136962890625,
-0.0240020751953125,
0.07708740234375,
-0.0159912109375,
-0.0138702392578125,
-0.043853759765625,
-0.006961822509765625,
0.0160064697265625,
-0.01474761962890625,
0.06976318359375,
0.0948486328125,
-0.038665771484375,
-0.00603485107421875,
-0.0164031982421875,
-0.023712158203125,
-0.034271240234375,
0.0513916015625,
-0.016448974609375,
-0.053985595703125,
0.050811767578125,
0.016204833984375,
0.0277099609375,
0.044158935546875,
0.037200927734375,
0.0165863037109375,
0.051361083984375,
0.044342041015625,
-0.0298919677734375,
0.0672607421875,
-0.0211639404296875,
0.0258331298828125,
-0.06878662109375,
-0.005863189697265625,
-0.03924560546875,
0.005931854248046875,
-0.035186767578125,
-0.020172119140625,
-0.0092315673828125,
0.005245208740234375,
-0.0340576171875,
0.047698974609375,
-0.0301055908203125,
0.01422119140625,
0.061859130859375,
0.0009188652038574219,
-0.021759033203125,
0.002750396728515625,
-0.0182647705078125,
0.0014123916625976562,
-0.04840087890625,
-0.041259765625,
0.057220458984375,
0.03912353515625,
0.0288848876953125,
0.001415252685546875,
0.052520751953125,
-0.034881591796875,
0.0115966796875,
-0.0587158203125,
0.040740966796875,
-0.028778076171875,
-0.047210693359375,
-0.0171051025390625,
-0.0313720703125,
-0.06414794921875,
0.0296478271484375,
-0.0059661865234375,
-0.06494140625,
0.020233154296875,
-0.0113067626953125,
-0.03009033203125,
0.0269927978515625,
-0.037261962890625,
0.087890625,
-0.0095977783203125,
-0.006572723388671875,
-0.0101165771484375,
-0.044647216796875,
0.0244140625,
0.0110321044921875,
-0.0038928985595703125,
0.0005564689636230469,
0.027801513671875,
0.07281494140625,
-0.0277099609375,
0.067138671875,
-0.033721923828125,
0.00836181640625,
0.01016998291015625,
-0.0309600830078125,
0.014251708984375,
0.00864410400390625,
0.0022487640380859375,
0.0170745849609375,
0.01244354248046875,
-0.044586181640625,
-0.03338623046875,
0.038116455078125,
-0.0743408203125,
-0.0260162353515625,
-0.053985595703125,
-0.03350830078125,
0.026397705078125,
0.03619384765625,
0.0292816162109375,
0.042999267578125,
-0.005077362060546875,
-0.0002751350402832031,
0.0288848876953125,
-0.04254150390625,
0.059539794921875,
0.0266571044921875,
-0.0172119140625,
-0.037689208984375,
0.07464599609375,
0.01186370849609375,
0.004802703857421875,
0.026153564453125,
0.017303466796875,
-0.007717132568359375,
-0.04949951171875,
-0.0290374755859375,
0.01049041748046875,
-0.06884765625,
0.007472991943359375,
-0.031768798828125,
-0.055938720703125,
-0.049102783203125,
0.01043701171875,
0.00556182861328125,
-0.01270294189453125,
-0.045654296875,
0.0081024169921875,
0.00173187255859375,
0.037567138671875,
-0.014923095703125,
0.044586181640625,
-0.073974609375,
0.0138702392578125,
0.0233306884765625,
0.0035572052001953125,
0.032196044921875,
-0.053741455078125,
-0.0465087890625,
-0.0017251968383789062,
-0.0194854736328125,
-0.06414794921875,
0.04742431640625,
-0.0013561248779296875,
0.0626220703125,
0.047698974609375,
-0.005634307861328125,
0.0496826171875,
-0.0257415771484375,
0.06719970703125,
-0.00868988037109375,
-0.10699462890625,
0.0243377685546875,
-0.005435943603515625,
0.0282135009765625,
0.030059814453125,
-0.0034465789794921875,
-0.064208984375,
-0.002948760986328125,
-0.0287322998046875,
-0.07098388671875,
0.07440185546875,
0.011077880859375,
0.039398193359375,
0.004947662353515625,
0.0213165283203125,
-0.00881195068359375,
-0.003231048583984375,
-0.115966796875,
-0.029266357421875,
-0.0465087890625,
-0.01654052734375,
0.0013256072998046875,
-0.027008056640625,
0.0246734619140625,
-0.03448486328125,
0.07574462890625,
-0.004497528076171875,
0.0230712890625,
0.025665283203125,
-0.018035888671875,
0.00981903076171875,
0.00738525390625,
0.04974365234375,
0.03863525390625,
-0.0274658203125,
0.00867462158203125,
0.020782470703125,
-0.047210693359375,
-0.01207733154296875,
0.0166015625,
-0.021575927734375,
0.0288238525390625,
0.0377197265625,
0.06829833984375,
0.0088348388671875,
-0.02032470703125,
0.0298614501953125,
-0.01007080078125,
-0.00506591796875,
-0.046630859375,
-0.01013946533203125,
0.013092041015625,
0.0014047622680664062,
0.030517578125,
-0.00266265869140625,
-0.007266998291015625,
-0.0253753662109375,
0.002765655517578125,
0.0250244140625,
-0.040618896484375,
-0.020263671875,
0.03515625,
0.0166015625,
-0.0086517333984375,
0.0706787109375,
-0.020782470703125,
-0.0689697265625,
0.05859375,
0.045196533203125,
0.08148193359375,
-0.012939453125,
0.0026264190673828125,
0.06591796875,
0.027679443359375,
0.01071929931640625,
0.016754150390625,
-0.016876220703125,
-0.08062744140625,
-0.0277099609375,
-0.040679931640625,
-0.019622802734375,
0.018829345703125,
-0.032623291015625,
0.02264404296875,
-0.045928955078125,
-0.006565093994140625,
-0.015106201171875,
0.00975799560546875,
-0.0187225341796875,
0.007213592529296875,
0.0005450248718261719,
0.059173583984375,
-0.0537109375,
0.1041259765625,
0.05218505859375,
-0.04595947265625,
-0.08807373046875,
0.0203094482421875,
-0.032318115234375,
-0.049530029296875,
0.08221435546875,
0.0282440185546875,
0.01490020751953125,
-0.00045943260192871094,
-0.0298004150390625,
-0.044769287109375,
0.0692138671875,
0.005695343017578125,
-0.0183563232421875,
-0.006786346435546875,
-0.00852203369140625,
0.0255279541015625,
-0.01401519775390625,
0.031219482421875,
0.0234222412109375,
0.04736328125,
0.0031909942626953125,
-0.05548095703125,
0.003154754638671875,
-0.0225982666015625,
0.0010633468627929688,
-0.00858306884765625,
-0.0360107421875,
0.1046142578125,
-0.00614166259765625,
-0.0278167724609375,
0.0275115966796875,
0.06292724609375,
0.0239105224609375,
0.0158843994140625,
0.033721923828125,
0.02178955078125,
0.0185699462890625,
-0.0159454345703125,
0.0362548828125,
-0.016510009765625,
0.036468505859375,
0.08050537109375,
0.0079803466796875,
0.054901123046875,
0.03125,
-0.0025157928466796875,
0.06024169921875,
0.064208984375,
-0.0310821533203125,
0.027252197265625,
0.01812744140625,
-0.0203704833984375,
-0.0128631591796875,
0.0084686279296875,
-0.04693603515625,
0.014068603515625,
0.01151275634765625,
-0.04315185546875,
-0.01465606689453125,
-0.015869140625,
0.01245880126953125,
-0.0194854736328125,
-0.0156097412109375,
0.03924560546875,
0.0014467239379882812,
-0.0273284912109375,
0.0457763671875,
0.033355712890625,
0.07489013671875,
-0.05194091796875,
-0.00978851318359375,
0.0018215179443359375,
0.0146942138671875,
-0.010528564453125,
-0.0199737548828125,
0.01270294189453125,
-0.0301971435546875,
0.0062255859375,
0.00115966796875,
0.048828125,
-0.03619384765625,
-0.030517578125,
0.03363037109375,
0.0162353515625,
0.0248260498046875,
0.00850677490234375,
-0.0523681640625,
-0.01503753662109375,
0.0104827880859375,
-0.01073455810546875,
0.01470184326171875,
0.01102447509765625,
0.01401519775390625,
0.0203704833984375,
0.045684814453125,
-0.00856781005859375,
0.022491455078125,
0.024261474609375,
0.0579833984375,
-0.059112548828125,
-0.033966064453125,
-0.06854248046875,
0.0226898193359375,
-0.000644683837890625,
-0.03338623046875,
0.06732177734375,
0.028533935546875,
0.0635986328125,
-0.0289764404296875,
0.034210205078125,
-0.00873565673828125,
-0.004108428955078125,
-0.03369140625,
0.080810546875,
-0.0265045166015625,
-0.0128021240234375,
-0.028106689453125,
-0.0738525390625,
-0.0208740234375,
0.08355712890625,
-0.01496124267578125,
0.010040283203125,
0.049530029296875,
0.042449951171875,
0.0157012939453125,
-0.0158233642578125,
0.0229034423828125,
0.02874755859375,
0.0272369384765625,
0.066162109375,
0.06500244140625,
-0.05889892578125,
0.01520538330078125,
-0.0274505615234375,
-0.01180267333984375,
-0.006206512451171875,
-0.055694580078125,
-0.0845947265625,
-0.054351806640625,
-0.01157379150390625,
-0.0289306640625,
-0.0195159912109375,
0.07830810546875,
0.06488037109375,
-0.07183837890625,
-0.0277099609375,
-0.019317626953125,
0.03173828125,
-0.0164031982421875,
-0.0196533203125,
0.04888916015625,
-0.046234130859375,
-0.08258056640625,
0.0109405517578125,
-0.007740020751953125,
0.031402587890625,
-0.0182342529296875,
-0.0118560791015625,
-0.017425537109375,
-0.00811004638671875,
0.025054931640625,
0.020172119140625,
-0.07208251953125,
-0.0213623046875,
-0.01148223876953125,
-0.01412200927734375,
0.00814056396484375,
0.0340576171875,
-0.05926513671875,
0.0092620849609375,
0.0677490234375,
0.0267486572265625,
0.0294952392578125,
-0.030059814453125,
0.045623779296875,
-0.030364990234375,
0.034423828125,
0.007389068603515625,
0.0469970703125,
0.0218658447265625,
-0.0133056640625,
0.01043701171875,
0.00600433349609375,
-0.037200927734375,
-0.046051025390625,
0.0046844482421875,
-0.0692138671875,
-0.0245513916015625,
0.060516357421875,
-0.03717041015625,
-0.032073974609375,
-0.01549530029296875,
-0.03216552734375,
0.045806884765625,
-0.00917816162109375,
0.0389404296875,
0.057830810546875,
0.0249481201171875,
-0.0132293701171875,
-0.02197265625,
0.027374267578125,
0.047698974609375,
-0.049041748046875,
-0.00827789306640625,
0.0084686279296875,
0.024444580078125,
0.02349853515625,
0.0304107666015625,
-0.003582000732421875,
0.0247344970703125,
0.0035552978515625,
0.0286102294921875,
0.0034027099609375,
-0.0017757415771484375,
-0.03631591796875,
-0.0088348388671875,
0.01137542724609375,
-0.039825439453125
]
] |
kamalkraj/BioELECTRA-PICO | 2023-03-15T19:50:40.000Z | [
"transformers",
"pytorch",
"electra",
"token-classification",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | token-classification | kamalkraj | null | null | kamalkraj/BioELECTRA-PICO | 5 | 7,139 | transformers | 2022-03-02T23:29:05 | ---
widget:
- text: "Those in the aspirin group experienced reduced duration of headache compared to those in the placebo arm (P<0.05)"
---
# BioELECTRA-PICO
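PICO extraction is a token-classification task: each token is tagged as part of a Population, Intervention, Comparator, or Outcome span, or as outside any span. A toy sketch of BIO-style tagging on the widget sentence — the label names here are illustrative, not necessarily this model's actual label set:

```python
# Toy BIO-style PICO tagging; the label vocabulary here is illustrative only.
tokens = ["Those", "in", "the", "aspirin", "group", "experienced",
          "reduced", "duration", "of", "headache"]
labels = ["O", "O", "O", "B-INT", "I-INT", "O",
          "B-OUT", "I-OUT", "I-OUT", "I-OUT"]

# Collect labeled spans from the BIO tags.
spans = []
for tok, lab in zip(tokens, labels):
    if lab.startswith("B-"):
        spans.append((lab[2:], [tok]))       # start a new span
    elif lab.startswith("I-") and spans:
        spans[-1][1].append(tok)             # extend the current span
phrases = [(tag, " ".join(words)) for tag, words in spans]
print(phrases)
```

Decoding BIO tags into phrases like this yields the Intervention ("aspirin group") and Outcome ("reduced duration of headache") spans from the example sentence.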
Cite our paper using the citation below:
```
@inproceedings{kanakarajan-etal-2021-bioelectra,
title = "{B}io{ELECTRA}:Pretrained Biomedical text Encoder using Discriminators",
author = "Kanakarajan, Kamal raj and
Kundumani, Bhuvana and
Sankarasubbu, Malaikannan",
booktitle = "Proceedings of the 20th Workshop on Biomedical Language Processing",
month = jun,
year = "2021",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.bionlp-1.16",
doi = "10.18653/v1/2021.bionlp-1.16",
pages = "143--154",
abstract = "Recent advancements in pretraining strategies in NLP have shown a significant improvement in the performance of models on various text mining tasks. We apply {`}replaced token detection{'} pretraining technique proposed by ELECTRA and pretrain a biomedical language model from scratch using biomedical text and vocabulary. We introduce BioELECTRA, a biomedical domain-specific language encoder model that adapts ELECTRA for the Biomedical domain. We evaluate our model on the BLURB and BLUE biomedical NLP benchmarks. BioELECTRA outperforms the previous models and achieves state of the art (SOTA) on all the 13 datasets in BLURB benchmark and on all the 4 Clinical datasets from BLUE Benchmark across 7 different NLP tasks. BioELECTRA pretrained on PubMed and PMC full text articles performs very well on Clinical datasets as well. BioELECTRA achieves new SOTA 86.34{\%}(1.39{\%} accuracy improvement) on MedNLI and 64{\%} (2.98{\%} accuracy improvement) on PubMedQA dataset.",
}
``` | 1,770 | [
[
-0.006168365478515625,
-0.036102294921875,
0.037139892578125,
0.0082855224609375,
-0.024078369140625,
0.0208892822265625,
0.00395965576171875,
-0.053741455078125,
0.0274505615234375,
0.01212310791015625,
-0.01142120361328125,
-0.03863525390625,
-0.0396728515625,
0.031280517578125,
0.00704193115234375,
0.05609130859375,
-0.0017137527465820312,
0.0333251953125,
-0.03094482421875,
-0.0206451416015625,
-0.009002685546875,
-0.054046630859375,
-0.03387451171875,
-0.034210205078125,
0.033966064453125,
0.0049591064453125,
0.01410675048828125,
0.0101165771484375,
0.04119873046875,
0.0264434814453125,
-0.01806640625,
0.013427734375,
0.003040313720703125,
-0.00788116455078125,
-0.018646240234375,
-0.03228759765625,
-0.032806396484375,
-0.007503509521484375,
0.043731689453125,
0.0546875,
0.0022220611572265625,
-0.0111236572265625,
-0.009979248046875,
0.030853271484375,
-0.035552978515625,
0.003444671630859375,
-0.0540771484375,
-0.009979248046875,
-0.031280517578125,
-0.01427459716796875,
-0.045440673828125,
-0.03704833984375,
-0.006542205810546875,
-0.045501708984375,
0.006832122802734375,
0.007617950439453125,
0.10162353515625,
0.0137786865234375,
-0.005527496337890625,
0.00801849365234375,
-0.035247802734375,
0.05218505859375,
-0.053497314453125,
0.04541015625,
0.040618896484375,
-0.011932373046875,
0.0082550048828125,
-0.0811767578125,
-0.044525146484375,
-0.004505157470703125,
0.003692626953125,
0.0156097412109375,
-0.0246734619140625,
0.0125732421875,
0.0192413330078125,
0.01371002197265625,
-0.0303497314453125,
0.01904296875,
-0.042633056640625,
-0.025665283203125,
0.026824951171875,
-0.0196380615234375,
0.018707275390625,
-0.012603759765625,
-0.02191162109375,
-0.00885009765625,
-0.0516357421875,
-0.0096893310546875,
0.01446533203125,
0.0162506103515625,
0.0004317760467529297,
0.03961181640625,
0.0101165771484375,
0.038665771484375,
0.0057525634765625,
0.0066986083984375,
0.08648681640625,
-0.04046630859375,
-0.0119781494140625,
0.019378662109375,
0.09033203125,
-0.0031585693359375,
0.01031494140625,
-0.029144287109375,
0.0232696533203125,
0.004062652587890625,
0.0286102294921875,
-0.075439453125,
-0.002658843994140625,
0.022003173828125,
-0.035125732421875,
-0.0093841552734375,
-0.005035400390625,
-0.05877685546875,
-0.00620269775390625,
0.0067596435546875,
0.0283203125,
-0.058197021484375,
-0.006626129150390625,
0.0291900634765625,
0.007843017578125,
0.0190277099609375,
-0.0134735107421875,
-0.06591796875,
0.0161285400390625,
0.02191162109375,
0.07745361328125,
-0.0109710693359375,
-0.0137176513671875,
-0.01593017578125,
0.0078125,
-0.003368377685546875,
0.08477783203125,
-0.0191650390625,
-0.0193634033203125,
-0.015777587890625,
0.022674560546875,
-0.0290985107421875,
-0.044830322265625,
0.056640625,
-0.0360107421875,
0.00421905517578125,
-0.0085601806640625,
-0.039581298828125,
-0.0028133392333984375,
-0.0170745849609375,
-0.04656982421875,
0.05133056640625,
-0.0010061264038085938,
-0.0728759765625,
0.041168212890625,
-0.0665283203125,
-0.05047607421875,
0.03448486328125,
-0.00629425048828125,
-0.054412841796875,
-0.0220489501953125,
0.00826263427734375,
0.0171661376953125,
-0.016754150390625,
0.0305938720703125,
-0.0255584716796875,
-0.01448822021484375,
0.0201416015625,
0.00662994384765625,
0.071533203125,
0.0272979736328125,
-0.008819580078125,
0.01059722900390625,
-0.0693359375,
-0.0014781951904296875,
-0.005828857421875,
-0.0166168212890625,
-0.045745849609375,
0.00836944580078125,
0.00799560546875,
0.01251983642578125,
0.0133056640625,
-0.0906982421875,
-0.003383636474609375,
-0.059173583984375,
0.050750732421875,
0.028167724609375,
0.004608154296875,
0.00965118408203125,
-0.010040283203125,
0.039276123046875,
0.0062255859375,
0.003734588623046875,
0.0017375946044921875,
-0.033416748046875,
-0.026580810546875,
-0.03594970703125,
0.045928955078125,
0.04705810546875,
-0.06243896484375,
0.06536865234375,
-0.0132904052734375,
-0.041778564453125,
-0.062744140625,
-0.0126190185546875,
0.0201873779296875,
0.056396484375,
0.06719970703125,
-0.0163726806640625,
-0.055206298828125,
-0.0601806640625,
-0.0161895751953125,
-0.00763702392578125,
-0.0021209716796875,
0.01409912109375,
0.0297088623046875,
-0.056396484375,
0.070068359375,
-0.0212249755859375,
-0.011474609375,
-0.02947998046875,
0.01406097412109375,
0.038604736328125,
0.0594482421875,
0.034027099609375,
-0.04803466796875,
-0.0177459716796875,
0.005489349365234375,
-0.06365966796875,
-0.02606201171875,
-0.0007271766662597656,
-0.01497650146484375,
0.0046234130859375,
0.044647216796875,
-0.051177978515625,
0.019195556640625,
0.0109405517578125,
-0.039947509765625,
0.042633056640625,
-0.0206298828125,
0.01032257080078125,
-0.0986328125,
0.04730224609375,
-0.0010251998901367188,
-0.0284576416015625,
-0.0328369140625,
-0.01227569580078125,
0.00020992755889892578,
-0.00405120849609375,
-0.041015625,
0.038177490234375,
-0.01311492919921875,
0.024932861328125,
-0.0026073455810546875,
-0.0242462158203125,
0.008056640625,
0.011932373046875,
0.00665283203125,
0.0562744140625,
0.0384521484375,
-0.048492431640625,
0.0029621124267578125,
0.059326171875,
-0.028594970703125,
0.01212310791015625,
-0.0894775390625,
0.00847625732421875,
0.00893402099609375,
0.005489349365234375,
-0.042510986328125,
0.0187225341796875,
0.0068359375,
-0.026580810546875,
0.0249786376953125,
0.0189361572265625,
-0.055999755859375,
-0.031585693359375,
-0.01503753662109375,
0.0291290283203125,
0.031982421875,
-0.03131103515625,
0.030914306640625,
0.03961181640625,
-0.0254669189453125,
-0.024810791015625,
-0.048431396484375,
-0.00908660888671875,
0.00565338134765625,
-0.022857666015625,
0.07586669921875,
-0.0191192626953125,
0.0105743408203125,
-0.01605224609375,
0.00408172607421875,
-0.01239776611328125,
-0.00982666015625,
0.006587982177734375,
0.0380859375,
-0.00749969482421875,
0.03082275390625,
0.0246734619140625,
-0.01514434814453125,
-0.0011816024780273438,
0.0034923553466796875,
0.052490234375,
-0.0055084228515625,
-0.0055694580078125,
-0.0482177734375,
0.025634765625,
0.0220489501953125,
-0.02496337890625,
0.060333251953125,
0.03753662109375,
-0.0186004638671875,
0.029541015625,
-0.0262298583984375,
-0.0017042160034179688,
-0.03765869140625,
0.0418701171875,
-0.01424407958984375,
-0.05316162109375,
0.0295257568359375,
-0.032562255859375,
-0.01404571533203125,
0.037994384765625,
0.039459228515625,
-0.01059722900390625,
0.06146240234375,
0.04876708984375,
-0.0023365020751953125,
0.038177490234375,
-0.01384735107421875,
0.018707275390625,
-0.0670166015625,
-0.032196044921875,
-0.059661865234375,
-0.006694793701171875,
-0.0419921875,
-0.01322174072265625,
0.00981903076171875,
0.00826263427734375,
-0.02691650390625,
0.03509521484375,
-0.04046630859375,
0.005718231201171875,
0.037872314453125,
0.038665771484375,
0.016448974609375,
0.00992584228515625,
-0.04827880859375,
0.0007457733154296875,
-0.06268310546875,
-0.03857421875,
0.08343505859375,
0.02496337890625,
0.047393798828125,
-0.0185546875,
0.054931640625,
-0.00024509429931640625,
0.01367950439453125,
-0.042572021484375,
0.031402587890625,
-0.012359619140625,
-0.041015625,
0.007541656494140625,
-0.0219879150390625,
-0.1038818359375,
-0.0038776397705078125,
-0.034942626953125,
-0.055877685546875,
0.04541015625,
-0.0057220458984375,
-0.03009033203125,
0.015838623046875,
-0.042633056640625,
0.0692138671875,
-0.0101470947265625,
-0.036376953125,
-0.006916046142578125,
-0.043670654296875,
-0.0180206298828125,
-0.018310546875,
0.00774383544921875,
0.01485443115234375,
0.007190704345703125,
0.082763671875,
-0.045379638671875,
0.0535888671875,
-0.007297515869140625,
0.026580810546875,
0.00756072998046875,
-0.04400634765625,
0.013885498046875,
-0.0063323974609375,
-0.01959228515625,
0.044647216796875,
0.01690673828125,
-0.02386474609375,
0.01129150390625,
0.036407470703125,
-0.059112548828125,
-0.0034847259521484375,
-0.048675537109375,
-0.0157012939453125,
-0.034210205078125,
0.034088134765625,
0.06719970703125,
0.0254974365234375,
-0.00768280029296875,
0.0133514404296875,
0.067138671875,
-0.04266357421875,
0.03753662109375,
0.02825927734375,
0.00799560546875,
-0.0236053466796875,
0.052886962890625,
-0.01025390625,
-0.003246307373046875,
0.036285400390625,
0.00148773193359375,
-0.022796630859375,
-0.052215576171875,
-0.007648468017578125,
0.045501708984375,
-0.02667236328125,
-0.03448486328125,
-0.07867431640625,
-0.02777099609375,
-0.0330810546875,
-0.0037784576416015625,
-0.033660888671875,
-0.004489898681640625,
-0.04229736328125,
-0.0196380615234375,
0.033050537109375,
0.01462554931640625,
-0.0037136077880859375,
0.0272674560546875,
-0.06781005859375,
0.0285491943359375,
0.005954742431640625,
0.0236053466796875,
-0.02825927734375,
-0.030517578125,
-0.029052734375,
-0.01129913330078125,
-0.01312255859375,
-0.07745361328125,
0.049102783203125,
0.01995849609375,
0.070068359375,
0.002826690673828125,
-0.016326904296875,
0.03558349609375,
-0.047088623046875,
0.039825439453125,
0.0237884521484375,
-0.06292724609375,
0.06396484375,
-0.0031223297119140625,
0.031280517578125,
0.07635498046875,
0.04986572265625,
0.0115203857421875,
-0.0125274658203125,
-0.0660400390625,
-0.09185791015625,
0.0262603759765625,
0.00024580955505371094,
-0.01099395751953125,
-0.0158233642578125,
0.024078369140625,
0.0186309814453125,
0.0193939208984375,
-0.06634521484375,
-0.0206298828125,
-0.00601959228515625,
-0.0322265625,
0.0039825439453125,
-0.016021728515625,
0.006565093994140625,
-0.051025390625,
0.061859130859375,
-0.0072479248046875,
0.047637939453125,
0.0333251953125,
-0.0252685546875,
0.00881195068359375,
0.0174713134765625,
0.01557159423828125,
0.046356201171875,
-0.024078369140625,
-0.0180511474609375,
-0.0008831024169921875,
-0.0758056640625,
-0.002197265625,
0.045928955078125,
-0.0205535888671875,
0.0248260498046875,
0.0276641845703125,
0.047454833984375,
0.01396942138671875,
-0.05047607421875,
0.0269927978515625,
0.0031871795654296875,
-0.0169219970703125,
-0.009033203125,
-0.024078369140625,
-0.00662994384765625,
0.00612640380859375,
0.0265045166015625,
0.023773193359375,
0.01387786865234375,
-0.018585205078125,
0.0244140625,
0.0199127197265625,
-0.045501708984375,
-0.03216552734375,
0.060791015625,
0.0185546875,
-0.003337860107421875,
0.0303955078125,
-0.01070404052734375,
-0.03326416015625,
0.0321044921875,
0.0634765625,
0.06732177734375,
-0.0299835205078125,
0.00934600830078125,
0.050750732421875,
0.005771636962890625,
0.0108795166015625,
0.052032470703125,
0.0031299591064453125,
-0.0799560546875,
-0.049957275390625,
-0.07427978515625,
-0.01500701904296875,
0.00934600830078125,
-0.035064697265625,
0.022796630859375,
-0.027374267578125,
-0.0224456787109375,
0.0274200439453125,
-0.0009403228759765625,
-0.042449951171875,
0.01251983642578125,
-0.00673675537109375,
0.0521240234375,
-0.060943603515625,
0.0888671875,
0.07135009765625,
-0.045867919921875,
-0.04608154296875,
-0.03271484375,
-0.04437255859375,
-0.04254150390625,
0.056396484375,
-0.0090789794921875,
-0.000690460205078125,
0.00849151611328125,
-0.035552978515625,
-0.049285888671875,
0.07794189453125,
0.02490234375,
-0.06365966796875,
0.0127105712890625,
-0.0256500244140625,
0.046539306640625,
-0.0170745849609375,
0.016448974609375,
0.031768798828125,
0.0211944580078125,
0.007778167724609375,
-0.07537841796875,
0.006267547607421875,
-0.046234130859375,
-0.0174713134765625,
0.0004723072052001953,
-0.039703369140625,
0.04888916015625,
-0.01232147216796875,
-0.0025196075439453125,
0.00569915771484375,
0.03460693359375,
0.044769287109375,
0.0239715576171875,
0.01245880126953125,
0.04754638671875,
0.07220458984375,
0.0067291259765625,
0.0750732421875,
-0.0220184326171875,
0.0180511474609375,
0.05865478515625,
-0.0244903564453125,
0.06658935546875,
0.038909912109375,
-0.044097900390625,
0.07379150390625,
0.045440673828125,
-0.01067352294921875,
0.055938720703125,
-0.004802703857421875,
-0.0229644775390625,
-0.00878143310546875,
-0.0082855224609375,
-0.052886962890625,
0.0279693603515625,
0.024749755859375,
-0.04888916015625,
0.01763916015625,
0.0215911865234375,
0.0147705078125,
-0.00543975830078125,
-0.0228424072265625,
0.048828125,
0.011688232421875,
-0.01097869873046875,
0.033905029296875,
-0.02532958984375,
0.0792236328125,
-0.06512451171875,
0.002300262451171875,
-0.00019359588623046875,
0.020751953125,
-0.01465606689453125,
-0.005054473876953125,
0.01448822021484375,
-0.00333404541015625,
-0.01485443115234375,
-0.005733489990234375,
0.067138671875,
-0.007476806640625,
-0.05426025390625,
0.0203399658203125,
0.02386474609375,
0.0191497802734375,
0.0134124755859375,
-0.04901123046875,
-0.010955810546875,
0.004787445068359375,
-0.0012979507446289062,
0.027801513671875,
0.0009264945983886719,
0.0194549560546875,
0.0372314453125,
0.0572509765625,
0.043975830078125,
-0.0105133056640625,
0.01284027099609375,
0.06365966796875,
-0.06396484375,
-0.04339599609375,
-0.058380126953125,
0.01358795166015625,
-0.031646728515625,
-0.034088134765625,
0.031646728515625,
0.055419921875,
0.040924072265625,
-0.0033054351806640625,
0.056365966796875,
-0.00983428955078125,
0.05804443359375,
-0.04425048828125,
0.07598876953125,
-0.0201416015625,
0.0058441162109375,
-0.02398681640625,
-0.0538330078125,
-0.03240966796875,
0.06982421875,
-0.006923675537109375,
0.01343536376953125,
0.08636474609375,
0.040283203125,
0.008575439453125,
-0.0101776123046875,
0.006801605224609375,
0.044525146484375,
0.0137786865234375,
0.033416748046875,
0.0227508544921875,
-0.044036865234375,
0.01214599609375,
-0.0044097900390625,
-0.0206756591796875,
-0.00171661376953125,
-0.06304931640625,
-0.040435791015625,
-0.040252685546875,
-0.0589599609375,
-0.0281219482421875,
0.004734039306640625,
0.06329345703125,
0.057098388671875,
-0.08172607421875,
0.0012292861938476562,
-0.0162811279296875,
-0.008758544921875,
-0.0249786376953125,
-0.01045989990234375,
0.03375244140625,
-0.0322265625,
-0.050994873046875,
0.0401611328125,
0.0180206298828125,
-0.0008440017700195312,
0.0102081298828125,
0.01248931884765625,
-0.06463623046875,
-0.005710601806640625,
0.046417236328125,
0.05029296875,
-0.0201416015625,
-0.0010671615600585938,
-0.01016998291015625,
-0.0113525390625,
0.0182342529296875,
0.033172607421875,
-0.08392333984375,
0.032989501953125,
0.041717529296875,
0.0419921875,
0.05230712890625,
-0.016937255859375,
0.035888671875,
-0.0379638671875,
0.0287628173828125,
0.040191650390625,
0.038238525390625,
0.0286407470703125,
-0.0298004150390625,
0.039093017578125,
0.035552978515625,
-0.059783935546875,
-0.048675537109375,
-0.0211181640625,
-0.09173583984375,
-0.0303497314453125,
0.06988525390625,
-0.006168365478515625,
-0.0167388916015625,
-0.03350830078125,
-0.000354766845703125,
0.0328369140625,
-0.01435089111328125,
0.043365478515625,
0.021575927734375,
-0.0197296142578125,
0.0028171539306640625,
-0.022186279296875,
0.036041259765625,
0.029449462890625,
-0.08135986328125,
-0.036529541015625,
0.01049041748046875,
0.0092926025390625,
0.027587890625,
0.06201171875,
-0.030914306640625,
0.008392333984375,
-0.00351715087890625,
0.0443115234375,
-0.004848480224609375,
-0.001190185546875,
-0.025146484375,
-0.0039043426513671875,
-0.005603790283203125,
-0.034332275390625
]
] |
Kernel/sd-nsfw | 2023-07-16T16:45:18.000Z | [
"diffusers",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-image",
"arxiv:2207.12598",
"arxiv:2112.10752",
"arxiv:2103.00020",
"arxiv:2205.11487",
"arxiv:1910.09700",
"license:creativeml-openrail-m",
"endpoints_compatible",
"has_space",
"diffusers:StableDiffusionPipeline",
"region:us"
] | text-to-image | Kernel | null | null | Kernel/sd-nsfw | 10 | 7,135 | diffusers | 2023-07-15T06:24:43 | ---
license: creativeml-openrail-m
tags:
- stable-diffusion
- stable-diffusion-diffusers
- text-to-image
inference: true
extra_gated_prompt: |-
This model is open access and available to all, with a CreativeML OpenRAIL-M license further specifying rights and usage.
The CreativeML OpenRAIL License specifies:
1. You can't use the model to deliberately produce nor share illegal or harmful outputs or content
2. CompVis claims no rights on the outputs you generate, you are free to use them and are accountable for their use which must not go against the provisions set in the license
3. You may re-distribute the weights and use the model commercially and/or as a service. If you do, please be aware you have to include the same use restrictions as the ones in the license and share a copy of the CreativeML OpenRAIL-M to all your users (please read the license entirely and carefully)
Please read the full license carefully here: https://huggingface.co/spaces/CompVis/stable-diffusion-license
extra_gated_heading: Please read the LICENSE to access this model
---
# Stable Diffusion v1-5 NSFW REALISM Model Card
Stable Diffusion is a latent text-to-image diffusion model capable of generating photo-realistic images given any text input.
For more information about how Stable Diffusion functions, please have a look at [🤗's Stable Diffusion blog](https://huggingface.co/blog/stable_diffusion).
The **Stable-Diffusion-v1-5 NSFW REALISM** checkpoint was initialized with the weights of the [Stable-Diffusion-v1-2](https://huggingface.co/CompVis/stable-diffusion-v1-2)
checkpoint and subsequently fine-tuned for 595k steps at resolution 512x512 on "laion-aesthetics v2 5+", with 10% dropping of the text-conditioning to improve [classifier-free guidance sampling](https://arxiv.org/abs/2207.12598).
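The classifier-free guidance mentioned above can be sketched in a few lines: at sampling time the final noise estimate extrapolates from the unconditional prediction toward the text-conditioned one, scaled by a guidance weight. The function name and toy values below are illustrative, not part of the diffusers API:

```python
# Classifier-free guidance: eps = eps_uncond + g * (eps_cond - eps_uncond).
# Dropping the text conditioning for ~10% of training steps is what makes
# the unconditional prediction available at sampling time.
def cfg_combine(eps_uncond, eps_cond, g=7.5):
    """g=1 recovers the conditional prediction; g>1 strengthens the prompt."""
    return [u + g * (c - u) for u, c in zip(eps_uncond, eps_cond)]

# Toy 3-component "noise predictions":
print(cfg_combine([0.0, 0.2, -0.1], [0.1, 0.0, -0.3], g=2.0))
```

In diffusers this combination happens inside the pipeline's denoising loop, controlled by the `guidance_scale` argument.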
You can use this both with the [🧨Diffusers library](https://github.com/huggingface/diffusers) and the [RunwayML GitHub repository](https://github.com/runwayml/stable-diffusion).
### Diffusers
```py
from diffusers import StableDiffusionPipeline
import torch
model_id = "runwayml/stable-diffusion-v1-5"
pipe = StableDiffusionPipeline.from_pretrained(model_id, torch_dtype=torch.float16)
pipe = pipe.to("cuda")
prompt = "a photo of an astronaut riding a horse on mars"
image = pipe(prompt).images[0]
image.save("astronaut_rides_horse.png")
```
For more detailed instructions, use-cases, and examples in JAX, follow the instructions [here](https://github.com/huggingface/diffusers#text-to-image-generation-with-stable-diffusion).
### Original GitHub Repository
1. Download the weights
   - [v1-5-pruned-emaonly.ckpt](https://huggingface.co/runwayml/stable-diffusion-v1-5/resolve/main/v1-5-pruned-emaonly.ckpt) - 4.27GB, EMA-only weights. Uses less VRAM; suitable for inference.
   - [v1-5-pruned.ckpt](https://huggingface.co/runwayml/stable-diffusion-v1-5/resolve/main/v1-5-pruned.ckpt) - 7.7GB, EMA + non-EMA weights. Uses more VRAM; suitable for fine-tuning.
2. Follow instructions [here](https://github.com/runwayml/stable-diffusion).
## Model Details
- **Developed by:** Robin Rombach, Patrick Esser
- **Model type:** Diffusion-based text-to-image generation model
- **Language(s):** English
- **License:** [The CreativeML OpenRAIL M license](https://huggingface.co/spaces/CompVis/stable-diffusion-license) is an [Open RAIL M license](https://www.licenses.ai/blog/2022/8/18/naming-convention-of-responsible-ai-licenses), adapted from the work that [BigScience](https://bigscience.huggingface.co/) and [the RAIL Initiative](https://www.licenses.ai/) are jointly carrying in the area of responsible AI licensing. See also [the article about the BLOOM Open RAIL license](https://bigscience.huggingface.co/blog/the-bigscience-rail-license) on which our license is based.
- **Model Description:** This is a model that can be used to generate and modify images based on text prompts. It is a [Latent Diffusion Model](https://arxiv.org/abs/2112.10752) that uses a fixed, pretrained text encoder ([CLIP ViT-L/14](https://arxiv.org/abs/2103.00020)) as suggested in the [Imagen paper](https://arxiv.org/abs/2205.11487).
- **Resources for more information:** [GitHub Repository](https://github.com/CompVis/stable-diffusion), [Paper](https://arxiv.org/abs/2112.10752).
- **Cite as:**
```
@InProceedings{Rombach_2022_CVPR,
    author    = {Rombach, Robin and Blattmann, Andreas and Lorenz, Dominik and Esser, Patrick and Ommer, Bj\"orn},
    title     = {High-Resolution Image Synthesis With Latent Diffusion Models},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2022},
    pages     = {10684-10695}
}
```
# Uses
## Direct Use
The model is intended for research purposes only. Possible research areas and
tasks include
- Safe deployment of models which have the potential to generate harmful content.
- Probing and understanding the limitations and biases of generative models.
- Generation of artworks and use in design and other artistic processes.
- Applications in educational or creative tools.
- Research on generative models.
Excluded uses are described below.
### Misuse, Malicious Use, and Out-of-Scope Use
_Note: This section is taken from the [DALLE-MINI model card](https://huggingface.co/dalle-mini/dalle-mini), but applies in the same way to Stable Diffusion v1_.
The model should not be used to intentionally create or disseminate images that create hostile or alienating environments for people. This includes generating images that people would foreseeably find disturbing, distressing, or offensive; or content that propagates historical or current stereotypes.
#### Out-of-Scope Use
The model was not trained to produce factual or true representations of people or events; using the model to generate such content is therefore out of scope for its abilities.
#### Misuse and Malicious Use
Using the model to generate content that is cruel to individuals is a misuse of this model. This includes, but is not limited to:
- Generating demeaning, dehumanizing, or otherwise harmful representations of people or their environments, cultures, religions, etc.
- Intentionally promoting or propagating discriminatory content or harmful stereotypes.
- Impersonating individuals without their consent.
- Sexual content without consent of the people who might see it.
- Mis- and disinformation.
- Representations of egregious violence and gore.
- Sharing of copyrighted or licensed material in violation of its terms of use.
- Sharing content that is an alteration of copyrighted or licensed material in violation of its terms of use.
## Limitations and Bias
### Limitations
- The model does not achieve perfect photorealism.
- The model cannot render legible text.
- The model does not perform well on more difficult tasks which involve compositionality, such as rendering an image corresponding to “A red cube on top of a blue sphere”.
- Faces and people in general may not be generated properly.
- The model was trained mainly with English captions and will not work as well in other languages.
- The autoencoding part of the model is lossy.
- The model was trained on a large-scale dataset
[LAION-5B](https://laion.ai/blog/laion-5b/) which contains adult material
and is not fit for product use without additional safety mechanisms and
considerations.
- No additional measures were used to deduplicate the dataset. As a result, we observe some degree of memorization for images that are duplicated in the training data.
The training data can be searched at [https://rom1504.github.io/clip-retrieval/](https://rom1504.github.io/clip-retrieval/) to possibly assist in the detection of memorized images.
### Bias
While the capabilities of image generation models are impressive, they can also reinforce or exacerbate social biases.
Stable Diffusion v1 was trained on subsets of [LAION-2B (en)](https://laion.ai/blog/laion-5b/),
which consists of images that are primarily limited to English descriptions.
Texts and images from communities and cultures that use other languages are likely to be insufficiently accounted for.
This affects the overall output of the model, as white and western cultures are often set as the default. Further, the
ability of the model to generate content with non-English prompts is significantly worse than with English-language prompts.
### Safety Module
The intended use of this model is with the [Safety Checker](https://github.com/huggingface/diffusers/blob/main/src/diffusers/pipelines/stable_diffusion/safety_checker.py) in Diffusers.
This checker works by checking model outputs against known hard-coded NSFW concepts.
The concepts are intentionally hidden to reduce the likelihood of reverse-engineering this filter.
Specifically, the checker compares the class probability of harmful concepts in the embedding space of the `CLIPTextModel` *after generation* of the images.
The concepts are passed into the model with the generated image and compared to a hand-engineered weight for each NSFW concept.
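As an illustrative sketch only (the real checker's concept embeddings and thresholds are intentionally hidden, so the values below are invented), a checker of this kind compares an image embedding against each concept embedding by similarity and flags the image when any threshold is exceeded:

```python
import numpy as np

def cosine_sim(a, b):
    # Cosine similarity between two embedding vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def is_flagged(image_embedding, concept_embeddings, thresholds):
    """Flag an image if its embedding is too close to any NSFW concept.

    `concept_embeddings` and `thresholds` are hypothetical stand-ins for
    the hidden, hand-engineered values used by the real safety checker.
    """
    return any(
        cosine_sim(image_embedding, concept) > threshold
        for concept, threshold in zip(concept_embeddings, thresholds)
    )

# Toy example with 4-dimensional embeddings:
concepts = [np.array([1.0, 0.0, 0.0, 0.0])]
thresholds = [0.9]
safe_image = np.array([0.0, 1.0, 0.0, 0.0])      # orthogonal to the concept
unsafe_image = np.array([0.99, 0.01, 0.0, 0.0])  # nearly parallel to it

print(is_flagged(safe_image, concepts, thresholds))    # False
print(is_flagged(unsafe_image, concepts, thresholds))  # True
```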
## Training
**Training Data**
The model developers used the following dataset for training the model:
- LAION-2B (en) and subsets thereof (see next section)
**Training Procedure**
Stable Diffusion v1-5 is a latent diffusion model which combines an autoencoder with a diffusion model that is trained in the latent space of the autoencoder. During training,
- Images are encoded through an encoder, which turns images into latent representations. The autoencoder uses a relative downsampling factor of f = 8 and maps images of shape H x W x 3 to latents of shape H/f x W/f x 4.
- Text prompts are encoded through a ViT-L/14 text-encoder.
- The non-pooled output of the text encoder is fed into the UNet backbone of the latent diffusion model via cross-attention.
- The loss is a reconstruction objective between the noise that was added to the latent and the prediction made by the UNet.
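The steps above can be sketched numerically. This is a minimal illustration of the epsilon-prediction objective using NumPy arrays in place of real latents and a trivial stand-in for the UNet (which in reality conditions on the timestep and the text embedding); the shapes follow the f = 8 downsampling described above:

```python
import numpy as np

rng = np.random.default_rng(0)

# A 512x512 RGB image maps to a 64x64x4 latent (downsampling factor f = 8).
H = W = 512
f = 8
latent = rng.standard_normal((H // f, W // f, 4))

# Forward diffusion: add Gaussian noise to the latent.
noise = rng.standard_normal(latent.shape)
alpha = 0.7  # toy noise-schedule coefficient for a single timestep
noisy_latent = np.sqrt(alpha) * latent + np.sqrt(1 - alpha) * noise

# Stand-in "UNet": a real model would predict the noise from
# (noisy_latent, timestep, text embedding); here we fake a prediction.
predicted_noise = noise + 0.1 * rng.standard_normal(latent.shape)

# Reconstruction objective: MSE between the added and the predicted noise.
loss = float(np.mean((predicted_noise - noise) ** 2))
print(latent.shape)  # (64, 64, 4)
print(round(loss, 3))
```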
Currently six Stable Diffusion checkpoints are provided, which were trained as follows.
- [`stable-diffusion-v1-1`](https://huggingface.co/CompVis/stable-diffusion-v1-1): 237,000 steps at resolution `256x256` on [laion2B-en](https://huggingface.co/datasets/laion/laion2B-en).
194,000 steps at resolution `512x512` on [laion-high-resolution](https://huggingface.co/datasets/laion/laion-high-resolution) (170M examples from LAION-5B with resolution `>= 1024x1024`).
- [`stable-diffusion-v1-2`](https://huggingface.co/CompVis/stable-diffusion-v1-2): Resumed from `stable-diffusion-v1-1`.
515,000 steps at resolution `512x512` on "laion-improved-aesthetics" (a subset of laion2B-en,
filtered to images with an original size `>= 512x512`, estimated aesthetics score `> 5.0`, and an estimated watermark probability `< 0.5`. The watermark estimate is from the LAION-5B metadata, the aesthetics score is estimated using an [improved aesthetics estimator](https://github.com/christophschuhmann/improved-aesthetic-predictor)).
- [`stable-diffusion-v1-3`](https://huggingface.co/CompVis/stable-diffusion-v1-3): Resumed from `stable-diffusion-v1-2` - 195,000 steps at resolution `512x512` on "laion-improved-aesthetics" and 10% dropping of the text-conditioning to improve [classifier-free guidance sampling](https://arxiv.org/abs/2207.12598).
- [`stable-diffusion-v1-4`](https://huggingface.co/CompVis/stable-diffusion-v1-4): Resumed from `stable-diffusion-v1-2` - 225,000 steps at resolution `512x512` on "laion-aesthetics v2 5+" and 10% dropping of the text-conditioning to improve [classifier-free guidance sampling](https://arxiv.org/abs/2207.12598).
- [`stable-diffusion-v1-5`](https://huggingface.co/runwayml/stable-diffusion-v1-5): Resumed from `stable-diffusion-v1-2` - 595,000 steps at resolution `512x512` on "laion-aesthetics v2 5+" and 10% dropping of the text-conditioning to improve [classifier-free guidance sampling](https://arxiv.org/abs/2207.12598).
- [`stable-diffusion-inpainting`](https://huggingface.co/runwayml/stable-diffusion-inpainting): Resumed from `stable-diffusion-v1-5` - then 440,000 steps of inpainting training at resolution `512x512` on “laion-aesthetics v2 5+” and 10% dropping of the text-conditioning. For inpainting, the UNet has 5 additional input channels (4 for the encoded masked-image and 1 for the mask itself) whose weights were zero-initialized after restoring the non-inpainting checkpoint. During training, we generate synthetic masks and, in 25% of cases, mask everything.
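The weight surgery described for the inpainting checkpoint — widening the UNet's first convolution from 4 to 9 input channels (4 latent + 4 encoded masked-image + 1 mask), with the new channels zero-initialized — can be sketched as follows. The 320-output-channel shape and the single-position convolution check are illustrative assumptions, not code from the actual training script:

```python
import numpy as np

rng = np.random.default_rng(0)

# First conv of the non-inpainting UNet: (out_channels, in_channels, kH, kW).
old_weight = rng.standard_normal((320, 4, 3, 3))

# Inpainting UNet takes 9 input channels:
# 4 latent + 4 encoded masked-image + 1 mask.
new_weight = np.zeros((320, 9, 3, 3))
new_weight[:, :4] = old_weight  # restore the pretrained channels
# Channels 4..8 stay zero, so at step 0 the widened conv computes exactly
# what the original conv computed, ignoring the new inputs.

latent = rng.standard_normal((4, 64, 64))
extra = rng.standard_normal((5, 64, 64))
full_input = np.concatenate([latent, extra], axis=0)

def conv_center(weight, x):
    # Evaluate the convolution at a single spatial position (no padding),
    # just to check that the two weight tensors agree on the old channels.
    patch = x[: weight.shape[1], :3, :3]
    return np.einsum("oikl,ikl->o", weight, patch)

out_old = conv_center(old_weight, latent)
out_new = conv_center(new_weight, full_input)
print(np.allclose(out_old, out_new))  # True
```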
- **Hardware:** 32 x 8 x A100 GPUs
- **Optimizer:** AdamW
- **Gradient Accumulations**: 2
- **Batch:** 32 x 8 x 2 x 4 = 2048
- **Learning rate:** warmup to 0.0001 for 10,000 steps and then kept constant
## Evaluation Results
Evaluations with different classifier-free guidance scales (1.5, 2.0, 3.0, 4.0,
5.0, 6.0, 7.0, 8.0) and 50 PNDM/PLMS sampling
steps show the relative improvements of the checkpoints:

Evaluated using 50 PLMS steps and 10000 random prompts from the COCO2017 validation set, evaluated at 512x512 resolution. Not optimized for FID scores.
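Classifier-free guidance, referenced in the training notes and swept across scales in this evaluation, combines the unconditional and conditional noise predictions at each sampling step. A hedged sketch of the combination rule (array shapes are illustrative):

```python
import numpy as np

def guided_noise(eps_uncond, eps_cond, guidance_scale):
    """Classifier-free guidance: move from the unconditional prediction
    toward (and, for scales > 1, past) the conditional one."""
    return eps_uncond + guidance_scale * (eps_cond - eps_uncond)

rng = np.random.default_rng(0)
eps_uncond = rng.standard_normal((4, 64, 64))  # UNet output without the prompt
eps_cond = rng.standard_normal((4, 64, 64))    # UNet output with the prompt

# A scale of 1.0 recovers the plain conditional prediction; the evaluation
# above sweeps scales from 1.5 to 8.0.
print(np.allclose(guided_noise(eps_uncond, eps_cond, 1.0), eps_cond))  # True
```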
## Environmental Impact
**Stable Diffusion v1** **Estimated Emissions**

Based on the information below, we estimate the following CO2 emissions using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). The hardware, runtime, cloud provider, and compute region were used to estimate the carbon impact.
- **Hardware Type:** A100 PCIe 40GB
- **Hours used:** 150000
- **Cloud Provider:** AWS
- **Compute Region:** US-east
- **Carbon Emitted (Power consumption x Time x Carbon produced based on location of power grid):** 11250 kg CO2 eq.
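The stated formula (power consumption x time x carbon intensity of the power grid) can be reproduced with assumed inputs. The GPU power draw and grid intensity below are illustrative guesses that happen to match the reported figure; they are not values stated in this card:

```python
# Reproducing the emissions formula with assumed inputs.
hours_used = 150_000          # from this card
gpu_power_kw = 0.25           # assumed: A100 PCIe 40GB draws ~250 W
carbon_intensity = 0.3        # assumed: kg CO2-eq per kWh (US-east grid)

energy_kwh = hours_used * gpu_power_kw
emissions_kg = energy_kwh * carbon_intensity
print(energy_kwh)    # 37500.0
print(emissions_kg)  # matches the ~11250 kg CO2 eq. reported above
```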
## Citation
```bibtex
@InProceedings{Rombach_2022_CVPR,
author = {Rombach, Robin and Blattmann, Andreas and Lorenz, Dominik and Esser, Patrick and Ommer, Bj\"orn},
title = {High-Resolution Image Synthesis With Latent Diffusion Models},
booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
month = {June},
year = {2022},
pages = {10684-10695}
}
```
*This model card was written by: Robin Rombach and Patrick Esser and is based on the [DALL-E Mini model card](https://huggingface.co/dalle-mini/dalle-mini).*
---
inference: false
language:
- en
library_name: transformers
pipeline_tag: text-generation
tags:
- llama
- llama-2
- not-for-all-audiences
---
# Model Card: mythospice-70b
This is a Llama 2-based model consisting of a merge of several models using SLERP:
- [jondurbin/spicyboros-70b-2.2](https://huggingface.co/jondurbin/spicyboros-70b-2.2)
- [elinas/chronos-70b-v2](https://huggingface.co/elinas/chronos-70b-v2)
- [NousResearch/Nous-Hermes-Llama2-70b](https://huggingface.co/NousResearch/Nous-Hermes-Llama2-70b)
## Usage:
Because this is a merge of multiple models, several prompt formats may work, but you can try the Alpaca instruction format:
```
### Instruction:
<prompt>
### Input:
<additional context>
### Response:
<leave a newline blank for model to respond>
```
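A small helper for assembling that format (hypothetical convenience code, not part of the model release):

```python
def alpaca_prompt(instruction, context=""):
    """Assemble an Alpaca-style prompt; the Input section is optional."""
    parts = ["### Instruction:", instruction, ""]
    if context:
        parts += ["### Input:", context, ""]
    parts += ["### Response:", ""]  # leave the response blank for the model
    return "\n".join(parts)

print(alpaca_prompt("Summarize the plot of Hamlet in two sentences."))
```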
## Bias, Risks, and Limitations
The model will show biases similar to those observed in niche roleplaying forums on the Internet, in addition to those exhibited by the base model. It is not intended to supply factual information or advice in any form.
## Training Details
This model is a merge. Please refer to the linked repositories of the merged models for details.
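As a sketch of the merge technique named above, spherical linear interpolation (SLERP) interpolates between two weight tensors along the arc between them rather than along the chord. This is an illustration of the general method, not the exact script used to produce this model:

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherically interpolate between two flattened weight vectors.

    Falls back to linear interpolation when the vectors are nearly
    colinear, where the spherical formula is numerically unstable.
    """
    v0n = v0 / np.linalg.norm(v0)
    v1n = v1 / np.linalg.norm(v1)
    dot = np.clip(np.dot(v0n, v1n), -1.0, 1.0)
    if abs(dot) > 1.0 - eps:
        return (1.0 - t) * v0 + t * v1  # near-colinear: plain lerp
    theta = np.arccos(dot)
    return (np.sin((1.0 - t) * theta) * v0
            + np.sin(t * theta) * v1) / np.sin(theta)

# Toy "layer weights" from two models; a real merge would apply this
# tensor by tensor across both checkpoints.
w_a = np.array([1.0, 0.0])
w_b = np.array([0.0, 1.0])

mid = slerp(0.5, w_a, w_b)
print(mid)  # halfway along the arc between the two vectors
```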
-0.0202789306640625,
0.053497314453125,
-0.016387939453125,
0.0290374755859375,
0.03558349609375,
-0.0004515647888183594,
-0.040313720703125,
0.01812744140625,
-0.0142974853515625,
0.00240325927734375,
-0.01129913330078125,
-0.0018911361694335938,
-0.0288848876953125,
-0.002193450927734375,
-0.04248046875,
0.049468994140625,
-0.048492431640625,
-0.021392822265625,
-0.0309906005859375,
-0.0172271728515625,
-0.02850341796875,
-0.0087127685546875,
-0.01739501953125,
-0.033477783203125,
-0.041778564453125,
-0.0192108154296875,
0.045806884765625,
0.050537109375,
-0.01397705078125,
0.05615234375,
-0.0545654296875,
0.0238800048828125,
0.02423095703125,
0.004329681396484375,
0.0173797607421875,
-0.0533447265625,
0.0269622802734375,
-0.00626373291015625,
-0.021484375,
-0.10498046875,
0.031097412109375,
-0.0253448486328125,
0.052001953125,
0.048736572265625,
-0.0133514404296875,
0.05438232421875,
-0.018402099609375,
0.06646728515625,
0.041534423828125,
-0.052703857421875,
0.0305633544921875,
-0.026153564453125,
0.00353240966796875,
0.0196533203125,
0.03533935546875,
-0.028778076171875,
-0.01044464111328125,
-0.0723876953125,
-0.04345703125,
0.06732177734375,
0.00580596923828125,
0.0081939697265625,
0.00763702392578125,
0.03314208984375,
0.005649566650390625,
0.0212860107421875,
-0.0858154296875,
-0.0433349609375,
-0.004055023193359375,
0.00614166259765625,
0.00966644287109375,
-0.041046142578125,
-0.049224853515625,
-0.015838623046875,
0.05780029296875,
0.006938934326171875,
0.0088043212890625,
-0.010009765625,
0.01357269287109375,
-0.02252197265625,
0.003757476806640625,
0.0679931640625,
0.0204620361328125,
-0.021636962890625,
-0.013702392578125,
0.03387451171875,
-0.02813720703125,
0.006992340087890625,
0.0177764892578125,
-0.0179595947265625,
-0.0103302001953125,
0.042144775390625,
0.06280517578125,
0.00399017333984375,
-0.0307769775390625,
0.026611328125,
-0.0023040771484375,
-0.004756927490234375,
-0.022857666015625,
0.0021533966064453125,
0.007282257080078125,
0.04315185546875,
0.0191497802734375,
0.0152130126953125,
0.00836181640625,
-0.038909912109375,
-0.005168914794921875,
0.032196044921875,
-0.003017425537109375,
-0.00872802734375,
0.0628662109375,
0.0211181640625,
-0.038726806640625,
0.034515380859375,
-0.032501220703125,
-0.014862060546875,
0.060577392578125,
0.054290771484375,
0.055328369140625,
-0.030120849609375,
0.02001953125,
0.04583740234375,
0.0297698974609375,
-0.03289794921875,
0.017181396484375,
-0.004856109619140625,
-0.04315185546875,
-0.0168914794921875,
-0.0270233154296875,
-0.0279541015625,
0.02117919921875,
-0.0548095703125,
0.037933349609375,
-0.03717041015625,
-0.02813720703125,
-0.0260467529296875,
0.0007305145263671875,
-0.055877685546875,
0.00872802734375,
0.01364898681640625,
0.07366943359375,
-0.0848388671875,
0.059051513671875,
0.050048828125,
-0.03717041015625,
-0.0709228515625,
-0.0194244384765625,
0.0167236328125,
-0.0772705078125,
0.0399169921875,
-0.0008716583251953125,
0.00621795654296875,
-0.0340576171875,
-0.052215576171875,
-0.08221435546875,
0.11505126953125,
0.0197601318359375,
-0.033935546875,
-0.0080108642578125,
-0.0144195556640625,
0.0269317626953125,
-0.058074951171875,
0.049041748046875,
0.031951904296875,
0.0264739990234375,
0.0428466796875,
-0.078125,
0.01238250732421875,
-0.01056671142578125,
0.00021541118621826172,
-0.0249786376953125,
-0.053070068359375,
0.06634521484375,
-0.01617431640625,
0.00730133056640625,
0.04083251953125,
0.06787109375,
0.07275390625,
0.02716064453125,
0.05316162109375,
0.061981201171875,
0.054046630859375,
0.011322021484375,
0.049407958984375,
-0.013702392578125,
0.0254364013671875,
0.08514404296875,
-0.043609619140625,
0.046051025390625,
0.025360107421875,
0.011688232421875,
0.04803466796875,
0.0582275390625,
-0.01120758056640625,
0.0289306640625,
0.00775909423828125,
-0.01038360595703125,
-0.01241302490234375,
-0.0099029541015625,
-0.0662841796875,
0.0284271240234375,
0.01345062255859375,
-0.037750244140625,
-0.028350830078125,
-0.0257415771484375,
0.017486572265625,
-0.019195556640625,
-0.016754150390625,
0.02593994140625,
-0.00406646728515625,
-0.047210693359375,
0.01291656494140625,
0.01131439208984375,
0.06982421875,
-0.06658935546875,
-0.0229949951171875,
-0.0306854248046875,
-0.015045166015625,
-0.00893402099609375,
-0.03692626953125,
-0.0005583763122558594,
-0.002887725830078125,
-0.0202484130859375,
0.006015777587890625,
0.049285888671875,
-0.0235748291015625,
-0.049041748046875,
0.018707275390625,
0.04278564453125,
0.02783203125,
0.046539306640625,
-0.07147216796875,
0.02044677734375,
-0.0031642913818359375,
-0.005901336669921875,
0.0197601318359375,
0.00443267822265625,
-0.01340484619140625,
0.0576171875,
0.041778564453125,
-0.0018787384033203125,
-0.005596160888671875,
-0.01219940185546875,
0.06304931640625,
-0.032623291015625,
-0.0469970703125,
-0.041168212890625,
0.02630615234375,
-0.02618408203125,
-0.048126220703125,
0.042999267578125,
0.036895751953125,
0.030120849609375,
-0.022674560546875,
0.034637451171875,
0.0020751953125,
0.0301666259765625,
-0.0244903564453125,
0.04443359375,
-0.0418701171875,
0.0218963623046875,
-0.020965576171875,
-0.07958984375,
0.005535125732421875,
0.05438232421875,
0.01251220703125,
-0.006046295166015625,
0.040374755859375,
0.05731201171875,
-0.0072784423828125,
0.002716064453125,
-0.0008358955383300781,
0.00566864013671875,
0.0016698837280273438,
0.037506103515625,
0.0567626953125,
-0.043121337890625,
0.01873779296875,
-0.005096435546875,
-0.041534423828125,
-0.0046539306640625,
-0.07733154296875,
-0.0699462890625,
-0.0274200439453125,
-0.0230255126953125,
-0.041259765625,
-0.00923919677734375,
0.085205078125,
0.051422119140625,
-0.032958984375,
-0.043487548828125,
0.037322998046875,
0.002025604248046875,
-0.0188751220703125,
-0.0088653564453125,
0.017669677734375,
0.02679443359375,
-0.057098388671875,
0.0216064453125,
0.0020732879638671875,
0.042755126953125,
-0.024322509765625,
-0.009857177734375,
-0.002941131591796875,
0.0093536376953125,
0.026092529296875,
0.0396728515625,
-0.07806396484375,
-0.005901336669921875,
-0.0008673667907714844,
0.0002703666687011719,
-0.0275726318359375,
0.040863037109375,
-0.048736572265625,
-0.01050567626953125,
0.0195770263671875,
0.0118560791015625,
0.039031982421875,
-0.0088958740234375,
0.03326416015625,
-0.0275421142578125,
0.0361328125,
0.00783538818359375,
0.045806884765625,
0.01155853271484375,
-0.0297393798828125,
0.04278564453125,
0.0185394287109375,
-0.0258636474609375,
-0.061370849609375,
0.02044677734375,
-0.12469482421875,
-0.0059051513671875,
0.093505859375,
0.01189422607421875,
-0.01280975341796875,
0.026275634765625,
-0.038360595703125,
0.018310546875,
-0.03936767578125,
0.05181884765625,
0.0523681640625,
-0.01210784912109375,
-0.01184844970703125,
-0.030059814453125,
0.0257568359375,
0.0118865966796875,
-0.06207275390625,
-0.034088134765625,
0.04107666015625,
0.0247344970703125,
0.03204345703125,
0.056243896484375,
-0.0008320808410644531,
0.0274200439453125,
-0.0180511474609375,
0.025238037109375,
-0.00421905517578125,
-0.01055908203125,
-0.015838623046875,
0.005413055419921875,
-0.0064239501953125,
-0.01154327392578125
]
] |
Doctor-Shotgun/mythospice-limarp-70b | 2023-09-14T15:22:48.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"llama-2",
"not-for-all-audiences",
"en",
"license:agpl-3.0",
"text-generation-inference",
"region:us"
] | text-generation | Doctor-Shotgun | null | null | Doctor-Shotgun/mythospice-limarp-70b | 0 | 7,132 | transformers | 2023-09-13T20:12:38 | ---
inference: false
language:
- en
library_name: transformers
pipeline_tag: text-generation
tags:
- llama
- llama-2
- not-for-all-audiences
license: agpl-3.0
---
# Model Card: mythospice-limarp-70b
This is a Llama 2-based model consisting of a merge of several models using SLERP:
- [jondurbin/spicyboros-70b-2.2](https://huggingface.co/jondurbin/spicyboros-70b-2.2)
- [elinas/chronos-70b-v2](https://huggingface.co/elinas/chronos-70b-v2)
- [NousResearch/Nous-Hermes-Llama2-70b](https://huggingface.co/NousResearch/Nous-Hermes-Llama2-70b)
The LIMARP v2 adapter was subsequently merged onto the model at a weight of 0.66:
- [lemonilia/limarp-llama2-v2](https://huggingface.co/lemonilia/limarp-llama2-v2)
## Usage:
Because this is a merge of multiple models, different prompt formats may work; a good starting point is the Alpaca instruction format used by LIMARP v2:
```
### Instruction:
Character's Persona: {bot character description}
User's Persona: {user character description}
Scenario: {what happens in the story}
Play the role of Character. You must engage in a roleplaying chat with User below this line. Do not write dialogues and narration for User. Character should respond with messages of medium length.
### Input:
User: {utterance}
### Response:
Character: {utterance}
```
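The prompt layout above can be assembled programmatically before passing it to the model. Below is a minimal sketch of a helper that builds the Alpaca-style LIMARP v2 prompt; the function name and the persona/scenario strings are illustrative, not part of the model's release.

```python
def build_limarp_prompt(persona: str, user_persona: str, scenario: str,
                        history: list) -> str:
    """Assemble an Alpaca-style LIMARP v2 prompt.

    `history` is a list of (speaker, utterance) pairs, where speaker is
    either "User" or "Character".
    """
    prompt = (
        "### Instruction:\n"
        f"Character's Persona: {persona}\n"
        f"User's Persona: {user_persona}\n"
        f"Scenario: {scenario}\n"
        "Play the role of Character. You must engage in a roleplaying chat "
        "with User below this line. Do not write dialogues and narration "
        "for User. Character should respond with messages of medium length.\n"
    )
    for speaker, utterance in history:
        section = "### Input:" if speaker == "User" else "### Response:"
        prompt += f"{section}\n{speaker}: {utterance}\n"
    # End with an open Response header so the model continues as Character.
    prompt += "### Response:\nCharacter:"
    return prompt
```

The returned string can then be tokenized and passed to `model.generate` (or a `text-generation` pipeline) in the usual transformers workflow.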
## Bias, Risks, and Limitations
The model will exhibit biases similar to those observed in niche roleplaying forums on the Internet, in addition to those inherited from the base model. It is not intended to supply factual information or advice in any form.
## Training Details
This model is a merge. Please refer to the linked repositories of the merged models for details. |
[
-0.0222930908203125,
-0.035247802734375,
0.03033447265625,
0.0232391357421875,
-0.0484619140625,
-0.005107879638671875,
0.0196990966796875,
-0.05303955078125,
0.038055419921875,
0.06732177734375,
-0.05755615234375,
-0.0182952880859375,
-0.05718994140625,
-0.001285552978515625,
-0.01422119140625,
0.0894775390625,
0.00724029541015625,
0.0022335052490234375,
-0.005901336669921875,
-0.0092926025390625,
-0.047149658203125,
-0.03631591796875,
-0.0672607421875,
-0.031005859375,
0.0298309326171875,
0.0295562744140625,
0.0506591796875,
0.0297088623046875,
0.03619384765625,
0.0237884521484375,
-0.032928466796875,
0.0328369140625,
-0.0284423828125,
0.025238037109375,
-0.00853729248046875,
-0.05267333984375,
-0.07464599609375,
0.00567626953125,
0.042999267578125,
0.01320648193359375,
-0.0125274658203125,
0.032135009765625,
0.0032253265380859375,
-0.0018444061279296875,
-0.01556396484375,
0.0217437744140625,
-0.017669677734375,
0.0212554931640625,
-0.016876220703125,
-0.019317626953125,
-0.017730712890625,
-0.017974853515625,
0.01556396484375,
-0.053680419921875,
0.005115509033203125,
0.0236053466796875,
0.063720703125,
0.016204833984375,
-0.029052734375,
-0.0245208740234375,
-0.040557861328125,
0.0516357421875,
-0.06695556640625,
0.0014362335205078125,
0.03143310546875,
0.03253173828125,
-0.0187530517578125,
-0.041290283203125,
-0.052825927734375,
-0.0175933837890625,
-0.004718780517578125,
-0.0019855499267578125,
-0.0252227783203125,
-0.0181427001953125,
0.03485107421875,
0.015899658203125,
-0.032928466796875,
0.0249786376953125,
-0.06353759765625,
-0.00994110107421875,
0.02655029296875,
0.02423095703125,
0.0227203369140625,
-0.0213470458984375,
-0.052276611328125,
-0.03076171875,
-0.0330810546875,
0.005535125732421875,
0.036285400390625,
0.01171112060546875,
-0.048187255859375,
0.07318115234375,
0.01434326171875,
0.046356201171875,
0.02545166015625,
-0.04730224609375,
0.03277587890625,
-0.00846099853515625,
-0.0160675048828125,
-0.005809783935546875,
0.053436279296875,
0.049468994140625,
-0.01235198974609375,
0.0228271484375,
0.0010967254638671875,
-0.0053253173828125,
0.0012540817260742188,
-0.050445556640625,
0.0175933837890625,
0.018463134765625,
-0.03924560546875,
-0.036773681640625,
-0.01056671142578125,
-0.068359375,
-0.047210693359375,
0.004955291748046875,
-0.006305694580078125,
-0.0233154296875,
-0.0247802734375,
-0.00009995698928833008,
-0.0225372314453125,
0.0428466796875,
0.033843994140625,
-0.052337646484375,
0.029388427734375,
0.039031982421875,
0.0615234375,
0.004940032958984375,
-0.022186279296875,
-0.0238494873046875,
0.0157470703125,
-0.0122222900390625,
0.05059814453125,
-0.01947021484375,
-0.0540771484375,
-0.0181884765625,
0.007015228271484375,
0.01212310791015625,
-0.024200439453125,
0.08111572265625,
-0.03692626953125,
0.044830322265625,
-0.037872314453125,
-0.040191650390625,
-0.03271484375,
0.0208282470703125,
-0.060150146484375,
0.06683349609375,
0.0042572021484375,
-0.06036376953125,
0.01192474365234375,
-0.05621337890625,
-0.01380157470703125,
0.00887298583984375,
0.0030651092529296875,
-0.017822265625,
-0.004962921142578125,
0.00047206878662109375,
0.0282135009765625,
-0.052154541015625,
0.00592041015625,
-0.032989501953125,
-0.0184783935546875,
0.0130767822265625,
-0.005451202392578125,
0.0726318359375,
0.022979736328125,
-0.015625,
-0.006542205810546875,
-0.04071044921875,
-0.0186614990234375,
0.0264129638671875,
-0.009979248046875,
-0.0089569091796875,
-0.00775909423828125,
0.007598876953125,
0.021392822265625,
0.0347900390625,
-0.0164947509765625,
0.032867431640625,
-0.00815582275390625,
0.0152435302734375,
0.04241943359375,
-0.00010186433792114258,
0.032440185546875,
-0.0462646484375,
0.04705810546875,
0.006488800048828125,
0.0240936279296875,
-0.003589630126953125,
-0.07391357421875,
-0.07366943359375,
-0.0185699462890625,
-0.00885772705078125,
0.054962158203125,
-0.0299072265625,
0.05755615234375,
0.006549835205078125,
-0.05670166015625,
-0.03582763671875,
-0.0095672607421875,
0.0264739990234375,
0.06201171875,
0.0170745849609375,
-0.048828125,
-0.050079345703125,
-0.06903076171875,
0.0186614990234375,
-0.0343017578125,
-0.009796142578125,
0.033843994140625,
0.0214691162109375,
-0.029052734375,
0.051116943359375,
-0.0340576171875,
-0.0254669189453125,
-0.03851318359375,
0.01306915283203125,
0.019866943359375,
0.043975830078125,
0.060546875,
-0.037261962890625,
0.005733489990234375,
-0.01433563232421875,
-0.0694580078125,
-0.006336212158203125,
-0.0021228790283203125,
0.0016889572143554688,
0.007503509521484375,
-0.00702667236328125,
-0.0672607421875,
0.0274810791015625,
0.040252685546875,
-0.038543701171875,
0.0274810791015625,
-0.0252685546875,
0.0261993408203125,
-0.0941162109375,
0.01236724853515625,
0.00545501708984375,
-0.006565093994140625,
-0.04595947265625,
0.02484130859375,
-0.0251922607421875,
-0.005634307861328125,
-0.042388916015625,
0.07281494140625,
-0.031402587890625,
-0.0258331298828125,
-0.0292816162109375,
0.00637054443359375,
0.008819580078125,
0.0309295654296875,
0.0117034912109375,
0.0300140380859375,
0.04248046875,
-0.03289794921875,
0.04168701171875,
0.035400390625,
-0.006130218505859375,
0.043487548828125,
-0.0675048828125,
0.024200439453125,
-0.01434326171875,
0.056304931640625,
-0.071044921875,
-0.0206146240234375,
0.06903076171875,
-0.0205078125,
0.0100250244140625,
-0.016387939453125,
-0.051849365234375,
-0.0399169921875,
-0.021087646484375,
0.04193115234375,
0.058197021484375,
-0.0345458984375,
0.0614013671875,
0.0139617919921875,
-0.0021343231201171875,
-0.04693603515625,
-0.055633544921875,
-0.00350189208984375,
-0.041656494140625,
-0.050048828125,
0.0177001953125,
-0.0215911865234375,
-0.0075531005859375,
-0.015411376953125,
0.01129150390625,
-0.018463134765625,
-0.01525115966796875,
0.030181884765625,
0.0538330078125,
-0.01137542724609375,
-0.0273284912109375,
0.0219268798828125,
-0.00525665283203125,
-0.0159149169921875,
0.0180511474609375,
0.06304931640625,
-0.01209259033203125,
-0.0233154296875,
-0.0665283203125,
0.0206298828125,
0.0693359375,
-0.004665374755859375,
0.055816650390625,
0.0394287109375,
-0.023651123046875,
0.0283050537109375,
-0.0643310546875,
-0.00812530517578125,
-0.029022216796875,
0.0094757080078125,
-0.0228424072265625,
-0.053497314453125,
0.06536865234375,
0.00897216796875,
0.01456451416015625,
0.043975830078125,
0.03936767578125,
-0.00382232666015625,
0.054962158203125,
0.044036865234375,
-0.0089874267578125,
0.015838623046875,
-0.0182647705078125,
0.000038683414459228516,
-0.07220458984375,
-0.04541015625,
-0.0294342041015625,
-0.022369384765625,
-0.035736083984375,
-0.06201171875,
-0.0007004737854003906,
0.017364501953125,
-0.0186004638671875,
0.055694580078125,
-0.0230712890625,
0.022705078125,
0.044097900390625,
0.0150146484375,
0.023193359375,
0.0004000663757324219,
0.01036834716796875,
-0.01119232177734375,
-0.04180908203125,
-0.04730224609375,
0.07208251953125,
0.047149658203125,
0.041656494140625,
0.0164031982421875,
0.048065185546875,
0.0183563232421875,
0.0013275146484375,
-0.0312347412109375,
0.053497314453125,
0.0087127685546875,
-0.0509033203125,
-0.00750732421875,
-0.005214691162109375,
-0.04180908203125,
0.0249481201171875,
-0.0350341796875,
-0.06298828125,
0.0192108154296875,
0.01190948486328125,
-0.0298004150390625,
0.024566650390625,
-0.035400390625,
0.042144775390625,
-0.020721435546875,
0.00409698486328125,
-0.0252227783203125,
-0.052825927734375,
0.06707763671875,
0.005901336669921875,
-0.004848480224609375,
-0.00464630126953125,
-0.009246826171875,
0.058441162109375,
-0.029632568359375,
0.0750732421875,
0.0020656585693359375,
-0.004398345947265625,
0.031494140625,
0.007350921630859375,
0.05059814453125,
0.00799560546875,
0.0027294158935546875,
0.0260772705078125,
-0.01096343994140625,
-0.0236968994140625,
-0.0293426513671875,
0.047393798828125,
-0.0750732421875,
-0.034393310546875,
-0.048095703125,
-0.0374755859375,
0.01134490966796875,
-0.00839996337890625,
0.0355224609375,
0.01123809814453125,
-0.0115966796875,
-0.020721435546875,
0.0494384765625,
-0.01861572265625,
0.0201416015625,
0.0345458984375,
-0.007595062255859375,
-0.03155517578125,
0.0262298583984375,
-0.025634765625,
0.006893157958984375,
0.0020904541015625,
0.00553131103515625,
-0.0267486572265625,
-0.0008797645568847656,
-0.036773681640625,
0.05364990234375,
-0.041717529296875,
-0.0042724609375,
-0.038299560546875,
-0.0195159912109375,
-0.028900146484375,
-0.007274627685546875,
-0.01727294921875,
-0.042724609375,
-0.050048828125,
-0.0095672607421875,
0.04046630859375,
0.03863525390625,
-0.0107879638671875,
0.056610107421875,
-0.0469970703125,
0.0262603759765625,
0.02801513671875,
-0.01001739501953125,
0.004894256591796875,
-0.0662841796875,
0.035736083984375,
-0.00823974609375,
-0.0133514404296875,
-0.08697509765625,
0.042755126953125,
-0.01560211181640625,
0.04473876953125,
0.054656982421875,
-0.015350341796875,
0.057037353515625,
-0.02471923828125,
0.06695556640625,
0.044677734375,
-0.051361083984375,
0.04241943359375,
-0.0231170654296875,
0.003337860107421875,
0.022735595703125,
0.0266265869140625,
-0.04736328125,
-0.01934814453125,
-0.06610107421875,
-0.043182373046875,
0.0711669921875,
0.0117034912109375,
0.01087188720703125,
0.0154266357421875,
0.0249481201171875,
-0.0010471343994140625,
0.0214691162109375,
-0.0780029296875,
-0.039764404296875,
-0.007747650146484375,
0.01213836669921875,
0.0006551742553710938,
-0.0494384765625,
-0.03399658203125,
-0.0171356201171875,
0.046142578125,
0.0177764892578125,
0.01605224609375,
-0.00775146484375,
0.01059722900390625,
-0.0170745849609375,
-0.0000870823860168457,
0.072509765625,
0.0210113525390625,
-0.0280303955078125,
-0.0113983154296875,
0.0237274169921875,
-0.027069091796875,
0.0008072853088378906,
0.0162506103515625,
-0.01372528076171875,
-0.0038204193115234375,
0.039215087890625,
0.0728759765625,
0.0258331298828125,
-0.04595947265625,
0.0286712646484375,
-0.008209228515625,
0.0009622573852539062,
-0.019073486328125,
0.00984954833984375,
0.003574371337890625,
0.04278564453125,
0.01085662841796875,
0.02020263671875,
0.005092620849609375,
-0.040252685546875,
-0.00830078125,
0.022186279296875,
-0.0079345703125,
-0.01788330078125,
0.060760498046875,
0.0230712890625,
-0.04547119140625,
0.042144775390625,
-0.0210418701171875,
-0.02447509765625,
0.054046630859375,
0.054473876953125,
0.052154541015625,
-0.03021240234375,
0.01776123046875,
0.0421142578125,
0.022735595703125,
-0.0301666259765625,
0.0078277587890625,
-0.006793975830078125,
-0.040130615234375,
-0.01219940185546875,
-0.0301361083984375,
-0.030792236328125,
0.017608642578125,
-0.039947509765625,
0.04638671875,
-0.043853759765625,
-0.0201263427734375,
-0.0265045166015625,
0.006267547607421875,
-0.04595947265625,
-0.002498626708984375,
0.01512908935546875,
0.06927490234375,
-0.07843017578125,
0.05938720703125,
0.04681396484375,
-0.03704833984375,
-0.057037353515625,
-0.019134521484375,
0.0193939208984375,
-0.0830078125,
0.0531005859375,
-0.005977630615234375,
0.01461029052734375,
-0.0364990234375,
-0.035400390625,
-0.07684326171875,
0.107177734375,
0.0204010009765625,
-0.043212890625,
-0.01294708251953125,
-0.01018524169921875,
0.033447265625,
-0.05328369140625,
0.045013427734375,
0.0228118896484375,
0.0312042236328125,
0.050567626953125,
-0.08453369140625,
0.007720947265625,
-0.006656646728515625,
0.0032558441162109375,
-0.0249786376953125,
-0.0648193359375,
0.07586669921875,
-0.019378662109375,
0.005523681640625,
0.041259765625,
0.06329345703125,
0.07391357421875,
0.0107269287109375,
0.052093505859375,
0.055145263671875,
0.0611572265625,
0.01392364501953125,
0.0615234375,
-0.01519012451171875,
0.02496337890625,
0.0833740234375,
-0.03472900390625,
0.0499267578125,
0.03204345703125,
0.00923919677734375,
0.047454833984375,
0.05804443359375,
-0.01479339599609375,
0.0357666015625,
0.00528717041015625,
-0.009185791015625,
-0.00907135009765625,
-0.0092315673828125,
-0.0538330078125,
0.0293426513671875,
0.013885498046875,
-0.03814697265625,
-0.01473236083984375,
-0.02130126953125,
0.0226898193359375,
-0.0241546630859375,
-0.01092529296875,
0.0234832763671875,
-0.0015172958374023438,
-0.0484619140625,
0.00945281982421875,
0.010040283203125,
0.06597900390625,
-0.060791015625,
-0.0175628662109375,
-0.035491943359375,
-0.0090179443359375,
0.0011425018310546875,
-0.044342041015625,
-0.004276275634765625,
0.00396728515625,
-0.0166778564453125,
-0.002117156982421875,
0.051605224609375,
-0.0174713134765625,
-0.05828857421875,
0.0177459716796875,
0.040374755859375,
0.034210205078125,
0.042755126953125,
-0.063232421875,
0.015228271484375,
-0.0019779205322265625,
-0.00437164306640625,
0.0245819091796875,
0.00400543212890625,
-0.0028896331787109375,
0.059783935546875,
0.03399658203125,
0.0021762847900390625,
-0.007801055908203125,
-0.006259918212890625,
0.05877685546875,
-0.0267486572265625,
-0.0413818359375,
-0.044525146484375,
0.037078857421875,
-0.023651123046875,
-0.0501708984375,
0.047210693359375,
0.0372314453125,
0.029052734375,
-0.01161956787109375,
0.03302001953125,
-0.0112457275390625,
0.036407470703125,
-0.022125244140625,
0.042144775390625,
-0.04290771484375,
0.028564453125,
-0.024810791015625,
-0.0728759765625,
0.007221221923828125,
0.056884765625,
0.015869140625,
-0.00809478759765625,
0.03253173828125,
0.043731689453125,
-0.005279541015625,
-0.00954437255859375,
0.0017528533935546875,
0.006694793701171875,
0.0103759765625,
0.040802001953125,
0.056793212890625,
-0.043243408203125,
0.0237884521484375,
-0.0025119781494140625,
-0.039031982421875,
-0.022674560546875,
-0.06500244140625,
-0.08123779296875,
-0.033599853515625,
-0.015380859375,
-0.033935546875,
-0.002239227294921875,
0.06463623046875,
0.054595947265625,
-0.034759521484375,
-0.042999267578125,
0.021087646484375,
0.006046295166015625,
-0.01073455810546875,
-0.01239013671875,
0.006984710693359375,
0.0199127197265625,
-0.0638427734375,
0.025115966796875,
0.00753021240234375,
0.04193115234375,
-0.0274810791015625,
-0.0178070068359375,
0.0021648406982421875,
-0.00234222412109375,
0.0311431884765625,
0.044403076171875,
-0.08306884765625,
-0.01165008544921875,
0.0036945343017578125,
0.0002963542938232422,
-0.01506805419921875,
0.04498291015625,
-0.0531005859375,
0.0029735565185546875,
0.0200958251953125,
0.004077911376953125,
0.047821044921875,
-0.0068511962890625,
0.039459228515625,
-0.03973388671875,
0.04071044921875,
0.01297760009765625,
0.04547119140625,
0.0146331787109375,
-0.030792236328125,
0.038421630859375,
0.0183258056640625,
-0.0316162109375,
-0.06304931640625,
0.0164642333984375,
-0.12164306640625,
-0.00632476806640625,
0.0914306640625,
0.00807952880859375,
-0.00579071044921875,
0.02825927734375,
-0.03875732421875,
0.015411376953125,
-0.0416259765625,
0.048736572265625,
0.049560546875,
-0.02423095703125,
-0.0247039794921875,
-0.0232391357421875,
0.0216217041015625,
0.0096893310546875,
-0.057373046875,
-0.028656005859375,
0.040618896484375,
0.021392822265625,
0.03338623046875,
0.059600830078125,
0.003673553466796875,
0.019287109375,
-0.0106048583984375,
0.0161895751953125,
-0.002849578857421875,
-0.0118408203125,
-0.0152435302734375,
-0.0004699230194091797,
-0.0027866363525390625,
-0.01090240478515625
]
] |
pucpr/biobertpt-all | 2022-11-27T16:54:34.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"fill-mask",
"pt",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | fill-mask | pucpr | null | null | pucpr/biobertpt-all | 13 | 7,131 | transformers | 2022-03-02T23:29:05 | ---
language: "pt"
widget:
- text: "O paciente recebeu [MASK] do hospital."
- text: "O médico receitou a medicação para controlar a [MASK]."
- text: "O principal [MASK] da COVID-19 é tosse seca."
- text: "O vírus da gripe apresenta um [MASK] constituído por segmentos de ácido ribonucleico."
thumbnail: "https://raw.githubusercontent.com/HAILab-PUCPR/BioBERTpt/master/images/logo-biobertpr1.png"
---
<img src="https://raw.githubusercontent.com/HAILab-PUCPR/BioBERTpt/master/images/logo-biobertpr1.png" alt="Logo BioBERTpt">
# BioBERTpt - Portuguese Clinical and Biomedical BERT
The [BioBERTpt - A Portuguese Neural Language Model for Clinical Named Entity Recognition](https://www.aclweb.org/anthology/2020.clinicalnlp-1.7/) paper describes clinical and biomedical BERT-based models for the Portuguese language, initialized from BERT-Multilingual-Cased and trained on clinical notes and biomedical literature.
This model card describes BioBERTpt(all), the full version trained on both clinical narratives and biomedical literature in Portuguese.
## How to use the model
Load the model via the transformers library:
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("pucpr/biobertpt-all")
model = AutoModel.from_pretrained("pucpr/biobertpt-all")
```
## More Information
Refer to the original paper, [BioBERTpt - A Portuguese Neural Language Model for Clinical Named Entity Recognition](https://www.aclweb.org/anthology/2020.clinicalnlp-1.7/) for additional details and performance on Portuguese NER tasks.
## Acknowledgements
This study was financed in part by the Coordenação de Aperfeiçoamento de Pessoal de Nível Superior - Brasil (CAPES) - Finance Code 001.
## Citation
```
@inproceedings{schneider-etal-2020-biobertpt,
title = "{B}io{BERT}pt - A {P}ortuguese Neural Language Model for Clinical Named Entity Recognition",
author = "Schneider, Elisa Terumi Rubel and
de Souza, Jo{\~a}o Vitor Andrioli and
Knafou, Julien and
Oliveira, Lucas Emanuel Silva e and
Copara, Jenny and
Gumiel, Yohan Bonescki and
Oliveira, Lucas Ferro Antunes de and
Paraiso, Emerson Cabrera and
Teodoro, Douglas and
Barra, Cl{\'a}udia Maria Cabral Moro",
booktitle = "Proceedings of the 3rd Clinical Natural Language Processing Workshop",
month = nov,
year = "2020",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/2020.clinicalnlp-1.7",
pages = "65--72",
abstract = "With the growing number of electronic health record data, clinical NLP tasks have become increasingly relevant to unlock valuable information from unstructured clinical text. Although the performance of downstream NLP tasks, such as named-entity recognition (NER), in English corpus has recently improved by contextualised language models, less research is available for clinical texts in low resource languages. Our goal is to assess a deep contextual embedding model for Portuguese, so called BioBERTpt, to support clinical and biomedical NER. We transfer learned information encoded in a multilingual-BERT model to a corpora of clinical narratives and biomedical-scientific papers in Brazilian Portuguese. To evaluate the performance of BioBERTpt, we ran NER experiments on two annotated corpora containing clinical narratives and compared the results with existing BERT models. Our in-domain model outperformed the baseline model in F1-score by 2.72{\%}, achieving higher performance in 11 out of 13 assessed entities. We demonstrate that enriching contextual embedding models with domain literature can play an important role in improving performance for specific NLP tasks. The transfer learning process enhanced the Portuguese biomedical NER model by reducing the necessity of labeled data and the demand for retraining a whole new model.",
}
```
## Questions?
Post a Github issue on the [BioBERTpt repo](https://github.com/HAILab-PUCPR/BioBERTpt). | 4,035 | [
[
-0.00844573974609375,
-0.041534423828125,
0.039947509765625,
0.0244293212890625,
-0.03125,
-0.00016748905181884766,
-0.0284881591796875,
-0.058868408203125,
0.028961181640625,
0.029693603515625,
-0.01338958740234375,
-0.054229736328125,
-0.056610107421875,
0.017822265625,
-0.0059814453125,
0.0885009765625,
-0.01097869873046875,
0.0423583984375,
-0.015869140625,
-0.0221099853515625,
-0.026947021484375,
-0.054229736328125,
-0.060943603515625,
-0.0308685302734375,
0.039886474609375,
-0.0173492431640625,
0.019622802734375,
0.0302734375,
0.041473388671875,
0.0160369873046875,
-0.0142059326171875,
0.004604339599609375,
-0.01192474365234375,
0.0113677978515625,
-0.00911712646484375,
-0.01410675048828125,
-0.0445556640625,
-0.0249481201171875,
0.055450439453125,
0.05438232421875,
0.0014400482177734375,
-0.0071258544921875,
0.01377105712890625,
0.028106689453125,
-0.0218353271484375,
0.0288848876953125,
-0.035186767578125,
-0.0035953521728515625,
0.01433563232421875,
-0.0118255615234375,
-0.039276123046875,
-0.0263519287109375,
0.045654296875,
-0.0209197998046875,
0.021392822265625,
-0.0248870849609375,
0.07977294921875,
0.0104522705078125,
-0.0276947021484375,
-0.020965576171875,
-0.039703369140625,
0.0418701171875,
-0.0537109375,
0.0477294921875,
0.01450347900390625,
0.0039825439453125,
-0.007312774658203125,
-0.06524658203125,
-0.032928466796875,
-0.0309295654296875,
-0.019256591796875,
0.0022258758544921875,
-0.0308685302734375,
0.004123687744140625,
0.0103607177734375,
-0.0153045654296875,
-0.04931640625,
0.00405120849609375,
-0.045654296875,
-0.037017822265625,
0.0288543701171875,
-0.01268768310546875,
0.017669677734375,
-0.0253448486328125,
-0.036163330078125,
-0.006011962890625,
-0.055267333984375,
-0.001850128173828125,
-0.017852783203125,
0.0203857421875,
-0.0184783935546875,
0.0125885009765625,
0.0285797119140625,
0.05572509765625,
-0.013671875,
-0.0076751708984375,
0.04901123046875,
-0.0111846923828125,
-0.0185089111328125,
0.0276336669921875,
0.06793212890625,
0.005672454833984375,
0.0192413330078125,
-0.01702880859375,
0.004383087158203125,
-0.0157012939453125,
0.0238800048828125,
-0.06756591796875,
-0.0263671875,
0.03179931640625,
-0.0312347412109375,
0.0018262863159179688,
-0.020263671875,
-0.042999267578125,
-0.0169830322265625,
-0.02313232421875,
0.022979736328125,
-0.07598876953125,
0.011322021484375,
-0.0033893585205078125,
0.0061492919921875,
0.0173797607421875,
0.030731201171875,
-0.05364990234375,
0.0087127685546875,
0.027313232421875,
0.049072265625,
-0.00928497314453125,
-0.01004791259765625,
-0.029144287109375,
0.004425048828125,
0.01070404052734375,
0.05810546875,
-0.0165557861328125,
-0.0207977294921875,
-0.0038700103759765625,
0.03216552734375,
-0.0227508544921875,
-0.021636962890625,
0.045196533203125,
-0.034027099609375,
0.011993408203125,
-0.01129913330078125,
-0.04449462890625,
-0.0205230712890625,
0.0089569091796875,
-0.0341796875,
0.0528564453125,
-0.003437042236328125,
-0.048583984375,
0.02520751953125,
-0.0298004150390625,
-0.044036865234375,
0.0215606689453125,
-0.0228118896484375,
-0.0277252197265625,
-0.00981903076171875,
0.0222320556640625,
0.04949951171875,
-0.007701873779296875,
0.04315185546875,
-0.010162353515625,
-0.00469207763671875,
0.01824951171875,
-0.0225067138671875,
0.0706787109375,
0.004383087158203125,
-0.0178985595703125,
-0.0102691650390625,
-0.06549072265625,
0.0068359375,
0.00524139404296875,
-0.034393310546875,
-0.03472900390625,
0.003810882568359375,
0.0177001953125,
0.0172119140625,
0.0294036865234375,
-0.0501708984375,
0.0148773193359375,
-0.049530029296875,
0.0478515625,
0.0202484130859375,
0.0015516281127929688,
0.0248260498046875,
-0.009033203125,
0.028350830078125,
0.0005984306335449219,
-0.00829315185546875,
-0.005443572998046875,
-0.045166015625,
-0.045745849609375,
-0.04486083984375,
0.060943603515625,
0.05694580078125,
-0.0528564453125,
0.041412353515625,
-0.030120849609375,
-0.045379638671875,
-0.05712890625,
-0.011199951171875,
0.041290283203125,
0.059814453125,
0.0679931640625,
-0.03204345703125,
-0.04083251953125,
-0.07928466796875,
0.0006499290466308594,
-0.00855255126953125,
-0.006893157958984375,
0.03607177734375,
0.04901123046875,
-0.042633056640625,
0.049774169921875,
-0.0257110595703125,
-0.0335693359375,
-0.0238800048828125,
0.0182037353515625,
0.029754638671875,
0.05096435546875,
0.050750732421875,
-0.0345458984375,
-0.0300750732421875,
-0.0059661865234375,
-0.06036376953125,
-0.00937652587890625,
-0.005565643310546875,
0.00920867919921875,
0.0267791748046875,
0.0260772705078125,
-0.03253173828125,
0.0263214111328125,
0.0335693359375,
-0.0147705078125,
0.014801025390625,
-0.050262451171875,
-0.005489349365234375,
-0.08154296875,
0.024658203125,
-0.007556915283203125,
-0.0194549560546875,
-0.04833984375,
-0.01038360595703125,
0.020050048828125,
0.004627227783203125,
-0.0494384765625,
0.05389404296875,
-0.039825439453125,
0.0243377685546875,
0.003795623779296875,
0.01409149169921875,
-0.0005846023559570312,
0.037933349609375,
0.041656494140625,
0.024505615234375,
0.044219970703125,
-0.0491943359375,
0.00215911865234375,
0.04827880859375,
-0.0160369873046875,
0.029388427734375,
-0.0716552734375,
0.0016937255859375,
-0.02227783203125,
0.029632568359375,
-0.058349609375,
0.00022292137145996094,
0.016937255859375,
-0.04815673828125,
0.051513671875,
-0.0231170654296875,
-0.037017822265625,
-0.0109710693359375,
-0.0019245147705078125,
0.0198516845703125,
0.038421630859375,
-0.031402587890625,
0.03662109375,
0.0311279296875,
-0.0218505859375,
-0.04833984375,
-0.0672607421875,
0.003093719482421875,
0.00405120849609375,
-0.03839111328125,
0.06451416015625,
0.0078887939453125,
0.028961181640625,
-0.006351470947265625,
0.0204010009765625,
-0.0192413330078125,
-0.00965118408203125,
0.006298065185546875,
0.0261993408203125,
-0.0225067138671875,
0.044525146484375,
0.02825927734375,
-0.0099334716796875,
0.005279541015625,
-0.007572174072265625,
0.04620361328125,
-0.0152740478515625,
-0.0029239654541015625,
-0.0214996337890625,
0.0206756591796875,
0.0227508544921875,
-0.021728515625,
0.0740966796875,
0.055267333984375,
-0.045440673828125,
0.0076446533203125,
-0.034820556640625,
-0.01532745361328125,
-0.026275634765625,
0.026824951171875,
-0.0002193450927734375,
-0.061004638671875,
0.037139892578125,
0.0019168853759765625,
-0.0118560791015625,
0.044158935546875,
0.05694580078125,
-0.01538848876953125,
0.054229736328125,
0.032318115234375,
0.008056640625,
0.0129241943359375,
-0.0187835693359375,
0.016082763671875,
-0.07568359375,
-0.0193023681640625,
-0.0361328125,
-0.01384735107421875,
-0.05731201171875,
-0.0260467529296875,
0.03167724609375,
0.00539398193359375,
-0.0091400146484375,
0.057769775390625,
-0.0587158203125,
-0.0012540817260742188,
0.03070068359375,
0.0252685546875,
0.0149993896484375,
0.0029926300048828125,
-0.048187255859375,
-0.02752685546875,
-0.057281494140625,
-0.05084228515625,
0.074462890625,
0.0303497314453125,
0.0229949951171875,
0.012115478515625,
0.0654296875,
-0.0013608932495117188,
0.038116455078125,
-0.039764404296875,
0.037689208984375,
-0.0141754150390625,
-0.03955078125,
-0.0022678375244140625,
-0.0081939697265625,
-0.1138916015625,
0.016143798828125,
-0.037261962890625,
-0.0743408203125,
0.0445556640625,
0.0091094970703125,
-0.0472412109375,
0.0014019012451171875,
-0.03533935546875,
0.0687255859375,
-0.034942626953125,
-0.023040771484375,
0.00502777099609375,
-0.069091796875,
-0.00653839111328125,
-0.00708770751953125,
0.00514984130859375,
0.005855560302734375,
-0.0033168792724609375,
0.059844970703125,
-0.0204925537109375,
0.06585693359375,
-0.004726409912109375,
0.0032215118408203125,
0.0110626220703125,
-0.005939483642578125,
0.02252197265625,
0.019195556640625,
0.0038013458251953125,
0.044281005859375,
0.0304107666015625,
-0.01068115234375,
-0.01551055908203125,
0.052276611328125,
-0.054229736328125,
-0.013946533203125,
-0.05633544921875,
-0.00839996337890625,
-0.007595062255859375,
0.0233001708984375,
0.035980224609375,
0.035003662109375,
-0.0311431884765625,
0.0181427001953125,
0.04541015625,
-0.06396484375,
0.0187530517578125,
0.06256103515625,
0.003986358642578125,
-0.0309295654296875,
0.055419921875,
0.004932403564453125,
0.0024280548095703125,
0.049224853515625,
-0.008209228515625,
-0.029022216796875,
-0.04791259765625,
-0.0247955322265625,
0.04595947265625,
-0.05230712890625,
-0.0134124755859375,
-0.08447265625,
-0.031707763671875,
-0.0452880859375,
-0.01166534423828125,
-0.0158538818359375,
-0.0296783447265625,
-0.0452880859375,
0.000020682811737060547,
0.0236968994140625,
0.011810302734375,
-0.01345062255859375,
0.01422119140625,
-0.084228515625,
0.00017142295837402344,
-0.0142822265625,
0.00949859619140625,
-0.00875091552734375,
-0.04583740234375,
-0.00786590576171875,
0.0009374618530273438,
-0.02471923828125,
-0.08447265625,
0.07830810546875,
0.0180206298828125,
0.05230712890625,
0.01202392578125,
-0.01444244384765625,
0.03179931640625,
-0.061737060546875,
0.038909912109375,
-0.0025634765625,
-0.048919677734375,
0.053619384765625,
-0.019378662109375,
0.017303466796875,
0.0596923828125,
0.0626220703125,
-0.031494140625,
-0.0153350830078125,
-0.076416015625,
-0.08026123046875,
0.032257080078125,
0.01436614990234375,
0.005321502685546875,
-0.0245361328125,
0.0221710205078125,
-0.0031948089599609375,
0.0250396728515625,
-0.0789794921875,
-0.0216522216796875,
0.0123291015625,
-0.0164642333984375,
-0.0016374588012695312,
-0.01593017578125,
0.0025177001953125,
-0.045074462890625,
0.0494384765625,
-0.004253387451171875,
0.0733642578125,
0.049896240234375,
-0.0218505859375,
0.017547607421875,
0.0283660888671875,
0.05438232421875,
0.0253143310546875,
-0.029632568359375,
-0.003643035888671875,
0.0251922607421875,
-0.037506103515625,
-0.01898193359375,
0.0487060546875,
-0.005802154541015625,
0.02349853515625,
0.0498046875,
0.051300048828125,
0.0333251953125,
-0.05694580078125,
0.037017822265625,
-0.0054931640625,
-0.0115509033203125,
-0.038604736328125,
-0.0135955810546875,
0.0086669921875,
0.01708984375,
0.0001430511474609375,
0.0038852691650390625,
0.0107421875,
-0.0216827392578125,
0.0225067138671875,
0.02569580078125,
-0.04827880859375,
-0.017059326171875,
0.06317138671875,
0.01129913330078125,
-0.02398681640625,
0.056915283203125,
-0.00870513916015625,
-0.04608154296875,
0.047515869140625,
0.0380859375,
0.054229736328125,
-0.0159912109375,
0.0158843994140625,
0.05450439453125,
0.0221099853515625,
0.00405120849609375,
0.025390625,
-0.0002465248107910156,
-0.06622314453125,
-0.0257720947265625,
-0.057159423828125,
-0.01708984375,
0.00965118408203125,
-0.03961181640625,
0.03460693359375,
-0.0259857177734375,
-0.02301025390625,
0.0207672119140625,
-0.0269927978515625,
-0.061737060546875,
0.0162811279296875,
0.0208892822265625,
0.04486083984375,
-0.033721923828125,
0.084716796875,
0.051605224609375,
-0.046356201171875,
-0.04791259765625,
-0.025238037109375,
-0.024993896484375,
-0.06927490234375,
0.0679931640625,
0.00577545166015625,
0.035430908203125,
-0.004840850830078125,
-0.0286712646484375,
-0.071044921875,
0.0760498046875,
0.0218505859375,
-0.0435791015625,
-0.016265869140625,
0.01174163818359375,
0.08984375,
-0.0298309326171875,
0.03485107421875,
0.025177001953125,
0.0306396484375,
0.006580352783203125,
-0.07183837890625,
0.01140594482421875,
-0.01556396484375,
-0.017333984375,
0.01218414306640625,
-0.029754638671875,
0.055419921875,
-0.0269927978515625,
-0.006862640380859375,
0.00007891654968261719,
0.034515380859375,
0.029388427734375,
0.00972747802734375,
0.0021648406982421875,
0.03009033203125,
0.0521240234375,
-0.00714111328125,
0.09002685546875,
-0.044525146484375,
0.03509521484375,
0.08697509765625,
-0.0182342529296875,
0.069091796875,
0.026641845703125,
-0.01403045654296875,
0.060516357421875,
0.039520263671875,
-0.0235748291015625,
0.040313720703125,
0.01345062255859375,
-0.01065826416015625,
-0.004459381103515625,
-0.0029010772705078125,
-0.0419921875,
0.037994384765625,
0.03814697265625,
-0.07684326171875,
-0.0112762451171875,
0.004150390625,
0.0191650390625,
-0.0191650390625,
0.0085296630859375,
0.0421142578125,
0.0234222412109375,
-0.057769775390625,
0.06536865234375,
0.0016374588012695312,
0.0301971435546875,
-0.048675537109375,
-0.0115509033203125,
0.0088653564453125,
0.026092529296875,
0.0003020763397216797,
-0.040130615234375,
0.0240020751953125,
-0.01332855224609375,
0.00833892822265625,
-0.0207366943359375,
0.05682373046875,
-0.042449951171875,
-0.045013427734375,
0.045196533203125,
0.03948974609375,
0.0258636474609375,
0.01800537109375,
-0.07464599609375,
-0.0232086181640625,
-0.00263214111328125,
-0.007213592529296875,
0.031280517578125,
0.019775390625,
0.00165557861328125,
0.036712646484375,
0.037811279296875,
0.023162841796875,
-0.023590087890625,
0.0032749176025390625,
0.0477294921875,
-0.039886474609375,
-0.0246429443359375,
-0.06573486328125,
0.034027099609375,
-0.006374359130859375,
-0.0343017578125,
0.0301361083984375,
0.03173828125,
0.04742431640625,
-0.0221405029296875,
0.040374755859375,
-0.00699615478515625,
0.04461669921875,
-0.0214691162109375,
0.072509765625,
-0.0396728515625,
0.0180511474609375,
-0.0379638671875,
-0.041046142578125,
-0.039764404296875,
0.06463623046875,
-0.0242767333984375,
0.00029850006103515625,
0.0684814453125,
0.04107666015625,
0.01605224609375,
-0.0157928466796875,
0.01439666748046875,
0.0360107421875,
0.0162353515625,
0.0634765625,
0.0205230712890625,
-0.042816162109375,
0.0166168212890625,
-0.01187896728515625,
-0.01273345947265625,
-0.01300048828125,
-0.0814208984375,
-0.072021484375,
-0.041046142578125,
-0.034942626953125,
-0.039154052734375,
0.01457977294921875,
0.0872802734375,
0.0299224853515625,
-0.0765380859375,
0.00952911376953125,
0.007732391357421875,
-0.005603790283203125,
-0.0145721435546875,
-0.0117340087890625,
0.0518798828125,
-0.006336212158203125,
-0.03955078125,
0.0247955322265625,
0.011016845703125,
0.01239776611328125,
-0.00386810302734375,
0.01244354248046875,
-0.04046630859375,
-0.0030269622802734375,
0.05120849609375,
0.0318603515625,
-0.04510498046875,
-0.00891876220703125,
0.0017871856689453125,
-0.021881103515625,
0.0190582275390625,
0.03790283203125,
-0.083984375,
0.0256195068359375,
0.0360107421875,
0.04827880859375,
0.043121337890625,
-0.036773681640625,
0.03814697265625,
-0.0494384765625,
0.00737762451171875,
0.04083251953125,
0.051239013671875,
0.021484375,
-0.030792236328125,
0.046905517578125,
0.0238037109375,
-0.04107666015625,
-0.048858642578125,
-0.0181884765625,
-0.0946044921875,
-0.017608642578125,
0.08660888671875,
-0.00592041015625,
-0.00945281982421875,
-0.0081634521484375,
-0.01436614990234375,
0.0301971435546875,
-0.033966064453125,
0.04718017578125,
0.037567138671875,
-0.01413726806640625,
-0.004238128662109375,
-0.0533447265625,
0.034637451171875,
0.0293121337890625,
-0.047088623046875,
-0.03253173828125,
0.0200653076171875,
0.0255126953125,
0.02362060546875,
0.058624267578125,
-0.019134521484375,
0.015350341796875,
-0.042877197265625,
0.032958984375,
0.0193634033203125,
-0.0110321044921875,
-0.0269927978515625,
0.00226593017578125,
-0.0233306884765625,
0.01336669921875
]
] |
Yukang/Llama-2-7b-longlora-16k-ft | 2023-09-24T09:40:07.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"arxiv:2309.12307",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text-generation | Yukang | null | null | Yukang/Llama-2-7b-longlora-16k-ft | 0 | 7,113 | transformers | 2023-09-12T09:40:52 | # LongLoRA: Efficient Fine-tuning of Long-Context Large Language Models
<font size=6><div align='center' > <a href=http://arxiv.org/abs/2309.12307>**Paper**</a> | <a href="https://huggingface.co/Yukang">**Models**</a> | <a href="https://github.com/dvlab-research/LongLoRA">**Code**</a> </div></font>
**LongLoRA: Efficient Fine-tuning of Long-Context Large Language Models [[Paper](http://arxiv.org/abs/2309.12307)]** <br />
[Yukang Chen](https://scholar.google.com/citations?user=6p0ygKUAAAAJ&hl=en),
[Shengju Qian](https://scholar.google.com/citations?user=QNnWmasAAAAJ),
[Haotian Tang](https://scholar.google.com/citations?user=WxL13BAAAAAJ&hl),
[Xin Lai](https://scholar.google.com/citations?user=tqNDPA4AAAAJ&hl=zh-CN),
[Zhijian Liu](https://scholar.google.com/citations?user=3coYSTUAAAAJ&hl=en),
[Song Han](https://scholar.google.com/citations?user=E0iCaa4AAAAJ&hl=zh-CN),
[Jiaya Jia](https://scholar.google.com/citations?user=XPAkzTEAAAAJ&hl=en)<br />
## Abstract
We present LongLoRA, an efficient fine-tuning approach that extends the context sizes of pre-trained large language models (LLMs), with limited computation cost.
Typically, training LLMs with long context sizes is computationally expensive, requiring extensive training hours and GPU resources.
In this paper, we speed up the context extension of LLMs in two aspects. On the one hand, although dense global attention is needed during inference, fine-tuning the model can be done effectively and efficiently with sparse local attention. The proposed shifted short attention effectively enables context extension, leading to non-trivial computation savings with performance similar to fine-tuning with vanilla attention. On the other hand, we find that LoRA for context extension works well under the premise of trainable embedding and normalization layers. LongLoRA demonstrates strong empirical results on various tasks with LLaMA2 models from 7B/13B to 70B. LongLoRA extends LLaMA2 7B from 4k context to 100k, or LLaMA2 70B to 32k, on a single 8x A100 machine. LongLoRA extends models' context while retaining their original architectures, and is compatible with most existing techniques, like FlashAttention-2. In addition, to make LongLoRA practical, we collect a dataset, LongQA, for supervised fine-tuning. It contains more than 3k long-context question-answer pairs. For more details, please refer to the [paper](http://arxiv.org/abs/2309.12307).
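The shifted short attention described above can be illustrated with a minimal NumPy sketch. This is not the released implementation — it is a simplified, single-batch illustration under assumed tensor shapes: the sequence is split into groups, half of the heads are rolled by half a group so information flows across neighbouring groups, attention is computed only within each group, and the shift is undone afterwards.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def s2_attn(q, k, v, group_size):
    """Sketch of shifted short attention.

    q, k, v: (num_heads, seq_len, head_dim), with seq_len
    divisible by group_size.  Illustrative only (no masking,
    no batching, no Flash-Attention kernel).
    """
    h, n, d = q.shape
    shift = group_size // 2

    def shift_half(x, back=False):
        # Roll the tokens of the second half of the heads so that
        # their groups straddle the boundaries of the other heads.
        x = x.copy()
        s = shift if back else -shift
        x[h // 2:] = np.roll(x[h // 2:], s, axis=1)
        return x

    q, k, v = shift_half(q), shift_half(k), shift_half(v)
    g = n // group_size
    # Attend only within each local group of tokens.
    q = q.reshape(h, g, group_size, d)
    k = k.reshape(h, g, group_size, d)
    v = v.reshape(h, g, group_size, d)
    attn = softmax(q @ k.transpose(0, 1, 3, 2) / np.sqrt(d))
    out = (attn @ v).reshape(h, n, d)
    # Undo the shift so outputs line up with the input positions.
    return shift_half(out, back=True)
```

Because attention is restricted to groups of `group_size` tokens, the cost per layer is linear in sequence length rather than quadratic, which is the source of the training-time savings.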
## Highlights
**LongLoRA** speeds up the context extension of pre-trained large language models at both the attention level and the weight level.
1. The proposed shifted short attention is easy to implement, compatible with Flash-Attention, and not required during inference.
2. We release all our models, including models from 7B to 70B, context length from 8k to 100k, including [LLaMA2-LongLoRA-7B-100k](https://huggingface.co/Yukang/Llama-2-7b-longlora-100k-ft), [LLaMA2-LongLoRA-13B-64k](https://huggingface.co/Yukang/Llama-2-13b-longlora-64k), and [LLaMA2-LongLoRA-70B-32k](https://huggingface.co/Yukang/Llama-2-70b-longlora-32k).
3. We build a long-context QA dataset, LongQA, for supervised fine-tuning (SFT). We release 13B and 70B 32k models with SFT, [Llama-2-13b-chat-longlora-32k-sft](https://huggingface.co/Yukang/Llama-2-13b-chat-longlora-32k-sft) and [Llama-2-70b-chat-longlora-32k-sft](https://huggingface.co/Yukang/Llama-2-70b-chat-longlora-32k-sft). We will release the dataset next week.
## Released models
### Models with supervised fine-tuning
| Model | Size | Context | Train | Link |
|:----------------------------------|------|---------|---------|-------------------------------------------------------------------------|
| Llama-2-13b-chat-longlora-32k-sft | 13B | 32768 | LoRA+ | [link](https://huggingface.co/Yukang/Llama-2-13b-chat-longlora-32k-sft) |
| Llama-2-70b-chat-longlora-32k-sft | 70B | 32768 | LoRA+ | [link](https://huggingface.co/Yukang/Llama-2-70b-chat-longlora-32k-sft) |
### Models with context extension via fully fine-tuning
| Model | Size | Context | Train | Link |
|:----------------------------|------|---------|-------|-------------------------------------------------------------------|
| Llama-2-7b-longlora-8k-ft | 7B | 8192 | Full FT | [link](https://huggingface.co/Yukang/Llama-2-7b-longlora-8k-ft) |
| Llama-2-7b-longlora-16k-ft | 7B | 16384 | Full FT | [link](https://huggingface.co/Yukang/Llama-2-7b-longlora-16k-ft) |
| Llama-2-7b-longlora-32k-ft | 7B | 32768 | Full FT | [link](https://huggingface.co/Yukang/Llama-2-7b-longlora-32k-ft) |
| Llama-2-7b-longlora-100k-ft | 7B | 100000 | Full FT | [link](https://huggingface.co/Yukang/Llama-2-7b-longlora-100k-ft) |
| Llama-2-13b-longlora-8k-ft | 13B | 8192 | Full FT | [link](https://huggingface.co/Yukang/Llama-2-13b-longlora-8k-ft) |
| Llama-2-13b-longlora-16k-ft | 13B | 16384 | Full FT | [link](https://huggingface.co/Yukang/Llama-2-13b-longlora-16k-ft) |
| Llama-2-13b-longlora-32k-ft | 13B | 32768 | Full FT | [link](https://huggingface.co/Yukang/Llama-2-13b-longlora-32k-ft) |
### Models with context extension via improved LoRA fine-tuning
| Model | Size | Context | Train | Link |
|:----------------------------|------|---------|-------|-------------------------------------------------------------------|
| Llama-2-7b-longlora-8k | 7B | 8192 | LoRA+ | [link](https://huggingface.co/Yukang/Llama-2-7b-longlora-8k) |
| Llama-2-7b-longlora-16k | 7B | 16384 | LoRA+ | [link](https://huggingface.co/Yukang/Llama-2-7b-longlora-16k) |
| Llama-2-7b-longlora-32k | 7B | 32768 | LoRA+ | [link](https://huggingface.co/Yukang/Llama-2-7b-longlora-32k) |
| Llama-2-13b-longlora-8k | 13B | 8192 | LoRA+ | [link](https://huggingface.co/Yukang/Llama-2-13b-longlora-8k) |
| Llama-2-13b-longlora-16k | 13B | 16384 | LoRA+ | [link](https://huggingface.co/Yukang/Llama-2-13b-longlora-16k) |
| Llama-2-13b-longlora-32k | 13B | 32768 | LoRA+ | [link](https://huggingface.co/Yukang/Llama-2-13b-longlora-32k) |
| Llama-2-13b-longlora-64k | 13B | 65536 | LoRA+ | [link](https://huggingface.co/Yukang/Llama-2-13b-longlora-64k) |
| Llama-2-70b-longlora-32k | 70B | 32768 | LoRA+ | [link](https://huggingface.co/Yukang/Llama-2-70b-longlora-32k) |
| Llama-2-70b-chat-longlora-32k | 70B | 32768 | LoRA+ | [link](https://huggingface.co/Yukang/Llama-2-70b-chat-longlora-32k) |
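Since the fully fine-tuned checkpoints above keep the original LLaMA2 architecture, they can be loaded like any causal LM. Below is a minimal usage sketch, assuming the Hugging Face `transformers` library is installed and a GPU with enough memory for a 7B model is available; the prompt and `max_new_tokens` value are placeholders.

```python
MODEL_ID = "Yukang/Llama-2-7b-longlora-16k-ft"

def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Load the 16k-context checkpoint and generate a completion.

    The import is kept inside the function so the sketch can be read
    without `transformers` installed; in real code it would sit at
    the top of the module.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(out[0], skip_special_tokens=True)
```

Note that prompts may use the full extended context (up to 16,384 tokens for this checkpoint), which is the point of the fine-tuning.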
## Citation
If you find this project useful in your research, please consider citing:
```
@article{longlora,
title={LongLoRA: Efficient Fine-tuning of Long-Context Large Language Models},
author={Yukang Chen and Shengju Qian and Haotian Tang and Xin Lai and Zhijian Liu and Song Han and Jiaya Jia},
journal={arXiv:2309.12307},
year={2023}
}
```
## Acknowledgement
- This work is built upon [LLaMA2](https://ai.meta.com/llama) as the pre-trained model.
- This work is based on [DeepSpeed](https://github.com/microsoft/DeepSpeed), [peft](https://github.com/huggingface/peft), and [Flash-Attention2](https://github.com/Dao-AILab/flash-attention) for acceleration.
- The perplexity evaluation code is adapted from [Landmark Attention](https://github.com/epfml/landmark-attention).
- We use [LongChat](https://github.com/DachengLi1/LongChat) for the retrieval evaluation.
| 7,594 | [
[
-0.052581787109375,
-0.058319091796875,
0.0194091796875,
0.03411865234375,
-0.0299224853515625,
-0.0236968994140625,
-0.036590576171875,
-0.0631103515625,
0.03802490234375,
0.0259857177734375,
-0.0484619140625,
-0.04229736328125,
-0.03515625,
0.015167236328125,
-0.00847625732421875,
0.07867431640625,
-0.0018014907836914062,
-0.03778076171875,
0.022216796875,
-0.0280303955078125,
-0.03338623046875,
-0.0219268798828125,
-0.045501708984375,
-0.01380157470703125,
0.05926513671875,
0.0218353271484375,
0.049957275390625,
0.045562744140625,
0.0298614501953125,
0.020843505859375,
-0.035614013671875,
0.02642822265625,
-0.040863037109375,
-0.0163421630859375,
-0.0009598731994628906,
-0.0155487060546875,
-0.08062744140625,
-0.01079559326171875,
0.051788330078125,
0.037200927734375,
0.004428863525390625,
0.031494140625,
0.0173187255859375,
0.058837890625,
-0.0282745361328125,
0.0063018798828125,
-0.0244903564453125,
-0.01132965087890625,
-0.0278167724609375,
-0.006580352783203125,
-0.0019140243530273438,
-0.0232696533203125,
-0.0018167495727539062,
-0.04931640625,
-0.0172576904296875,
-0.00922393798828125,
0.0889892578125,
0.03411865234375,
-0.03802490234375,
-0.007183074951171875,
-0.0140380859375,
0.0633544921875,
-0.07366943359375,
0.0213165283203125,
0.025146484375,
0.01181793212890625,
-0.0246734619140625,
-0.03472900390625,
-0.040313720703125,
-0.00078582763671875,
-0.0209503173828125,
0.003986358642578125,
-0.0173797607421875,
-0.00864410400390625,
0.034332275390625,
0.0279388427734375,
-0.0246124267578125,
0.02203369140625,
-0.0213775634765625,
0.005313873291015625,
0.05889892578125,
0.0005288124084472656,
0.0204010009765625,
-0.00862884521484375,
-0.03082275390625,
-0.0016231536865234375,
-0.06329345703125,
0.026824951171875,
0.0146942138671875,
0.019989013671875,
-0.0419921875,
0.03729248046875,
-0.0269012451171875,
0.060455322265625,
0.018951416015625,
-0.033355712890625,
0.0391845703125,
-0.033538818359375,
-0.0242156982421875,
-0.0170745849609375,
0.05059814453125,
0.031158447265625,
0.0036678314208984375,
0.01678466796875,
-0.00839996337890625,
0.0010194778442382812,
-0.0221710205078125,
-0.07183837890625,
0.01509857177734375,
0.0174102783203125,
-0.0298004150390625,
-0.01806640625,
-0.0075836181640625,
-0.0626220703125,
-0.0038280487060546875,
-0.02313232421875,
0.00989532470703125,
-0.0289459228515625,
-0.022918701171875,
0.0233154296875,
0.0193634033203125,
0.0261383056640625,
0.035858154296875,
-0.0416259765625,
0.01180267333984375,
0.0400390625,
0.045745849609375,
-0.01464080810546875,
-0.0247802734375,
-0.032470703125,
0.005931854248046875,
-0.020416259765625,
0.039337158203125,
-0.013580322265625,
-0.0096282958984375,
-0.01519012451171875,
0.0235748291015625,
-0.00621795654296875,
-0.0151519775390625,
0.04156494140625,
-0.0237274169921875,
0.0015172958374023438,
-0.035736083984375,
-0.036285400390625,
-0.009674072265625,
0.0167999267578125,
-0.055938720703125,
0.081787109375,
0.018829345703125,
-0.06561279296875,
0.023101806640625,
-0.065185546875,
-0.01447296142578125,
-0.0260467529296875,
0.01561737060546875,
-0.0401611328125,
-0.018951416015625,
0.033905029296875,
0.042144775390625,
-0.024383544921875,
0.0002034902572631836,
-0.0258636474609375,
-0.032684326171875,
0.0140380859375,
0.0008463859558105469,
0.06072998046875,
0.0226898193359375,
-0.045623779296875,
0.026702880859375,
-0.0615234375,
0.0087890625,
0.0225982666015625,
-0.03570556640625,
-0.007663726806640625,
-0.0164337158203125,
0.00555419921875,
0.0263824462890625,
0.033355712890625,
-0.0178375244140625,
0.033966064453125,
-0.03436279296875,
0.04083251953125,
0.0494384765625,
-0.00986480712890625,
0.0198822021484375,
-0.0262603759765625,
0.034332275390625,
0.01019287109375,
0.0175018310546875,
-0.005031585693359375,
-0.034088134765625,
-0.0780029296875,
-0.034515380859375,
0.0162811279296875,
0.020050048828125,
-0.0423583984375,
0.06500244140625,
-0.038482666015625,
-0.03656005859375,
-0.040618896484375,
0.0360107421875,
0.041015625,
0.02899169921875,
0.0291290283203125,
-0.0194091796875,
-0.0333251953125,
-0.06597900390625,
-0.0006165504455566406,
0.01194000244140625,
0.01371002197265625,
0.032012939453125,
0.048126220703125,
-0.03387451171875,
0.059844970703125,
-0.037445068359375,
-0.0266571044921875,
-0.016632080078125,
-0.025299072265625,
0.044097900390625,
0.03985595703125,
0.07379150390625,
-0.05010986328125,
-0.04766845703125,
0.01198577880859375,
-0.04486083984375,
-0.0038604736328125,
0.0108184814453125,
-0.0238037109375,
0.04327392578125,
0.028076171875,
-0.060882568359375,
0.042266845703125,
0.0477294921875,
-0.049957275390625,
0.0322265625,
-0.0050811767578125,
0.0015707015991210938,
-0.09527587890625,
0.0269012451171875,
-0.0007519721984863281,
-0.02325439453125,
-0.03936767578125,
0.0258636474609375,
0.0078887939453125,
0.019683837890625,
-0.044891357421875,
0.06756591796875,
-0.043060302734375,
-0.003444671630859375,
-0.019866943359375,
0.01198577880859375,
-0.0008463859558105469,
0.06170654296875,
-0.005794525146484375,
0.062469482421875,
0.0364990234375,
-0.037994384765625,
0.0235443115234375,
0.0179901123046875,
-0.03338623046875,
0.0316162109375,
-0.05010986328125,
0.0217132568359375,
0.0092315673828125,
0.056060791015625,
-0.052093505859375,
-0.03216552734375,
0.01763916015625,
-0.0137176513671875,
0.0164642333984375,
-0.005695343017578125,
-0.038116455078125,
-0.04290771484375,
-0.046356201171875,
0.04083251953125,
0.033111572265625,
-0.05511474609375,
0.00414276123046875,
0.015350341796875,
0.011077880859375,
-0.0487060546875,
-0.033477783203125,
-0.002532958984375,
-0.0484619140625,
-0.0577392578125,
0.0286865234375,
-0.0166778564453125,
-0.0028285980224609375,
-0.01849365234375,
0.013519287109375,
0.007740020751953125,
0.00916290283203125,
0.0207977294921875,
0.01019287109375,
-0.0262908935546875,
0.0024929046630859375,
-0.0101470947265625,
-0.0008625984191894531,
-0.0280303955078125,
0.0016021728515625,
0.0555419921875,
-0.03375244140625,
-0.01267242431640625,
-0.0562744140625,
0.01262664794921875,
0.039520263671875,
-0.019744873046875,
0.051849365234375,
0.06695556640625,
-0.0201416015625,
-0.0007386207580566406,
-0.0504150390625,
-0.004398345947265625,
-0.035736083984375,
0.01207733154296875,
-0.035430908203125,
-0.0821533203125,
0.059844970703125,
0.009613037109375,
0.005645751953125,
0.0487060546875,
0.035858154296875,
0.018951416015625,
0.07122802734375,
0.049346923828125,
-0.040313720703125,
0.048248291015625,
-0.03985595703125,
-0.0008425712585449219,
-0.073974609375,
-0.0005092620849609375,
-0.01470947265625,
-0.033966064453125,
-0.051177978515625,
-0.0440673828125,
0.0258636474609375,
0.0304412841796875,
-0.0309600830078125,
0.045745849609375,
-0.038604736328125,
0.025604248046875,
0.0289764404296875,
0.0211334228515625,
0.01222991943359375,
-0.0093841552734375,
0.019287109375,
0.0018405914306640625,
-0.02996826171875,
-0.0220947265625,
0.0697021484375,
0.03729248046875,
0.03857421875,
0.025726318359375,
0.048248291015625,
-0.002994537353515625,
0.0150909423828125,
-0.04949951171875,
0.0465087890625,
0.00896453857421875,
-0.039093017578125,
-0.037933349609375,
-0.022247314453125,
-0.083251953125,
0.018463134765625,
-0.00736236572265625,
-0.068115234375,
0.01129150390625,
0.005893707275390625,
-0.0362548828125,
0.0196075439453125,
-0.03912353515625,
0.057861328125,
-0.01041412353515625,
-0.035980224609375,
-0.021087646484375,
-0.05029296875,
0.04052734375,
-0.00394439697265625,
0.007564544677734375,
-0.019561767578125,
-0.0034694671630859375,
0.0599365234375,
-0.0489501953125,
0.06591796875,
-0.00510406494140625,
-0.041595458984375,
0.035552978515625,
-0.0155487060546875,
0.052581787109375,
0.00817108154296875,
-0.007740020751953125,
0.00087738037109375,
0.00017702579498291016,
-0.0394287109375,
-0.032989501953125,
0.0667724609375,
-0.055908203125,
-0.043609619140625,
-0.0210113525390625,
-0.033447265625,
-0.01105499267578125,
0.021331787109375,
0.01457977294921875,
0.006694793701171875,
0.006622314453125,
0.021728515625,
0.035614013671875,
-0.01922607421875,
0.038818359375,
0.0204315185546875,
-0.025054931640625,
-0.0243072509765625,
0.055389404296875,
-0.004558563232421875,
0.01116180419921875,
0.009674072265625,
0.0055084228515625,
-0.007518768310546875,
-0.02783203125,
-0.0288238525390625,
0.04046630859375,
-0.038177490234375,
-0.03125,
-0.0246734619140625,
-0.0207977294921875,
-0.037506103515625,
-0.0089263916015625,
-0.0240020751953125,
-0.030517578125,
-0.044647216796875,
-0.0084686279296875,
0.05462646484375,
0.039031982421875,
0.005138397216796875,
0.025390625,
-0.04046630859375,
0.0231781005859375,
0.0275421142578125,
0.03173828125,
-0.0006928443908691406,
-0.043121337890625,
-0.0163726806640625,
0.017059326171875,
-0.01557159423828125,
-0.054351806640625,
0.04327392578125,
0.0186614990234375,
0.01242828369140625,
0.035125732421875,
-0.0197601318359375,
0.08819580078125,
-0.025054931640625,
0.055267333984375,
0.0195465087890625,
-0.06536865234375,
0.046722412109375,
-0.051177978515625,
0.0210113525390625,
0.02789306640625,
0.004497528076171875,
-0.0309906005859375,
0.0010232925415039062,
-0.0379638671875,
-0.06304931640625,
0.051727294921875,
0.018524169921875,
0.0018663406372070312,
0.00833892822265625,
0.03948974609375,
-0.00714111328125,
0.0045928955078125,
-0.0638427734375,
-0.0247802734375,
-0.0031147003173828125,
-0.003082275390625,
-0.02313232421875,
-0.024261474609375,
-0.0192718505859375,
-0.04931640625,
0.042816162109375,
-0.0298614501953125,
0.0106658935546875,
0.012298583984375,
-0.010528564453125,
-0.01371002197265625,
0.00963592529296875,
0.06732177734375,
0.052001953125,
-0.01453399658203125,
-0.020751953125,
0.038543701171875,
-0.0152587890625,
-0.005645751953125,
0.0022869110107421875,
-0.0016756057739257812,
-0.0156402587890625,
0.033721923828125,
0.074462890625,
0.03936767578125,
-0.046966552734375,
0.030059814453125,
0.007335662841796875,
-0.00040841102600097656,
-0.0224609375,
0.01236724853515625,
0.016510009765625,
0.0261077880859375,
0.01103973388671875,
-0.021087646484375,
-0.0026302337646484375,
-0.0491943359375,
0.004650115966796875,
0.03741455078125,
-0.019012451171875,
-0.03729248046875,
0.039093017578125,
0.0069732666015625,
0.002964019775390625,
0.01158905029296875,
-0.0056304931640625,
-0.03875732421875,
0.05487060546875,
0.040069580078125,
0.034912109375,
-0.0226593017578125,
-0.00736236572265625,
0.045684814453125,
-0.0069732666015625,
-0.0089263916015625,
0.02117919921875,
0.0008101463317871094,
-0.0293121337890625,
-0.01953125,
-0.065673828125,
0.00954437255859375,
0.03155517578125,
-0.0362548828125,
0.026397705078125,
-0.029510498046875,
-0.0307159423828125,
-0.004184722900390625,
0.0389404296875,
-0.0523681640625,
0.01316070556640625,
0.00566864013671875,
0.07659912109375,
-0.0352783203125,
0.087158203125,
0.03619384765625,
-0.022979736328125,
-0.06427001953125,
-0.016021728515625,
-0.003726959228515625,
-0.066162109375,
0.045745849609375,
0.0167694091796875,
0.00004279613494873047,
-0.0104217529296875,
-0.05145263671875,
-0.0908203125,
0.10595703125,
0.0254364013671875,
-0.042022705078125,
-0.0105743408203125,
0.000823974609375,
0.056976318359375,
-0.0243988037109375,
0.01222991943359375,
0.0538330078125,
0.045684814453125,
0.0035915374755859375,
-0.09783935546875,
0.0262908935546875,
-0.036651611328125,
0.0027866363525390625,
0.0082550048828125,
-0.1011962890625,
0.07958984375,
-0.0151519775390625,
-0.009185791015625,
0.02886962890625,
0.061798095703125,
0.039520263671875,
0.006397247314453125,
0.039215087890625,
0.056640625,
0.03619384765625,
0.00024700164794921875,
0.0723876953125,
-0.0214691162109375,
0.0308990478515625,
0.060211181640625,
0.0026912689208984375,
0.065673828125,
0.03369140625,
-0.0171966552734375,
0.032928466796875,
0.0606689453125,
0.00795745849609375,
0.0189056396484375,
0.01561737060546875,
-0.0023632049560546875,
-0.01373291015625,
-0.00647735595703125,
-0.052093505859375,
0.0250091552734375,
0.02801513671875,
-0.0173797607421875,
-0.0018548965454101562,
-0.0147552490234375,
0.0295867919921875,
-0.0166015625,
-0.0225830078125,
0.048980712890625,
0.0214385986328125,
-0.05462646484375,
0.07763671875,
-0.00124359130859375,
0.08056640625,
-0.0350341796875,
0.00916290283203125,
-0.0268402099609375,
0.0212860107421875,
-0.0233917236328125,
-0.047515869140625,
-0.0004417896270751953,
0.00730133056640625,
0.00958251953125,
-0.0034885406494140625,
0.044586181640625,
-0.0254669189453125,
-0.04107666015625,
0.04107666015625,
0.019683837890625,
0.00814056396484375,
-0.0033416748046875,
-0.062042236328125,
0.0191650390625,
0.004810333251953125,
-0.056427001953125,
0.03826904296875,
0.0265655517578125,
-0.0204315185546875,
0.052490234375,
0.051177978515625,
0.00734710693359375,
0.0111083984375,
0.00035262107849121094,
0.08319091796875,
-0.058319091796875,
-0.0316162109375,
-0.060302734375,
0.033203125,
-0.01239013671875,
-0.0343017578125,
0.062286376953125,
0.0291595458984375,
0.040740966796875,
0.0086212158203125,
0.023712158203125,
0.002613067626953125,
0.04296875,
-0.039886474609375,
0.059356689453125,
-0.06756591796875,
0.005443572998046875,
-0.03240966796875,
-0.06689453125,
-0.02203369140625,
0.0372314453125,
-0.01410675048828125,
0.0128631591796875,
0.0306549072265625,
0.047607421875,
-0.012115478515625,
-0.026123046875,
-0.0008487701416015625,
0.0209197998046875,
0.033843994140625,
0.07757568359375,
0.03704833984375,
-0.045562744140625,
0.0164031982421875,
-0.032562255859375,
-0.004085540771484375,
-0.048797607421875,
-0.0648193359375,
-0.0816650390625,
-0.047882080078125,
-0.0162506103515625,
-0.0189056396484375,
-0.00991058349609375,
0.07000732421875,
0.06243896484375,
-0.055206298828125,
-0.019012451171875,
0.01328277587890625,
0.0107269287109375,
-0.00960540771484375,
-0.016204833984375,
0.0540771484375,
-0.004177093505859375,
-0.06842041015625,
0.025909423828125,
-0.00217437744140625,
0.02520751953125,
0.0006189346313476562,
-0.0289306640625,
-0.0130615234375,
-0.0005555152893066406,
0.06005859375,
0.045440673828125,
-0.06341552734375,
-0.0245208740234375,
-0.006252288818359375,
-0.014984130859375,
0.0079345703125,
0.016693115234375,
-0.042144775390625,
-0.01535797119140625,
0.03338623046875,
0.0138397216796875,
0.044342041015625,
0.010406494140625,
0.008331298828125,
-0.04241943359375,
0.0462646484375,
-0.00033020973205566406,
0.0312347412109375,
0.0200042724609375,
-0.0232696533203125,
0.060516357421875,
-0.00569915771484375,
-0.03240966796875,
-0.0787353515625,
0.010101318359375,
-0.10394287109375,
-0.01433563232421875,
0.09283447265625,
-0.01165008544921875,
-0.046112060546875,
0.034576416015625,
-0.03155517578125,
0.0187530517578125,
-0.032135009765625,
0.052215576171875,
0.035491943359375,
-0.0098419189453125,
-0.00579833984375,
-0.032684326171875,
0.061279296875,
0.03802490234375,
-0.0794677734375,
0.0017595291137695312,
0.032745361328125,
0.0262451171875,
0.0285491943359375,
0.048309326171875,
-0.004344940185546875,
0.0170745849609375,
-0.046112060546875,
-0.0069732666015625,
0.0009222030639648438,
-0.01055908203125,
-0.0205535888671875,
-0.01503753662109375,
-0.00795745849609375,
0.0054931640625
]
] |
garage-bAInd/Platypus2-13B | 2023-08-15T01:47:05.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"en",
"dataset:garage-bAInd/Open-Platypus",
"arxiv:2308.07317",
"arxiv:2307.09288",
"license:cc-by-nc-sa-4.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | garage-bAInd | null | null | garage-bAInd/Platypus2-13B | 17 | 7,107 | transformers | 2023-08-05T00:12:12 | ---
license: cc-by-nc-sa-4.0
language:
- en
datasets:
- garage-bAInd/Open-Platypus
---
# Platypus2-13B
Platypus2-13B is an instruction fine-tuned model based on the LLaMA 2-13B transformer architecture.

### Benchmark Metrics
| Metric | Value |
|-----------------------|-------|
| MMLU (5-shot) | 56.70 |
| ARC (25-shot) | 61.26 |
| HellaSwag (10-shot) | 82.56 |
| TruthfulQA (0-shot) | 44.86 |
| Avg. | 61.35 |
We use the state-of-the-art [Language Model Evaluation Harness](https://github.com/EleutherAI/lm-evaluation-harness) to run the benchmark tests above, pinned to the same version used by the Hugging Face LLM Leaderboard. Please see below for detailed instructions on reproducing the benchmark results.
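The reported average can be checked directly from the per-task scores in the table above:

```python
# Per-task scores copied from the benchmark table above.
scores = {
    "MMLU (5-shot)": 56.70,
    "ARC (25-shot)": 61.26,
    "HellaSwag (10-shot)": 82.56,
    "TruthfulQA (0-shot)": 44.86,
}

# Simple arithmetic mean over the four tasks.
average = sum(scores.values()) / len(scores)
print(f"Average: {average:.2f}")
```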
### Model Details
* **Trained by**: Cole Hunter & Ariel Lee
* **Model type:** **Platypus2-13B** is an auto-regressive language model based on the LLaMA2 transformer architecture.
* **Language(s)**: English
* **License for base weights**: Non-Commercial Creative Commons license ([CC BY-NC-4.0](https://creativecommons.org/licenses/by-nc/4.0/))
### Prompt Template
```
### Instruction:
<prompt> (without the <>)
### Response:
```
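For programmatic use, the template above can be applied with a small helper. Exact whitespace (e.g. blank lines between sections) is an assumption here; adjust it if generations look off:

```python
def build_prompt(instruction: str) -> str:
    """Format a user instruction with the Platypus prompt template above.

    The blank-line layout is an assumption, not confirmed by the card.
    """
    return f"### Instruction:\n{instruction}\n\n### Response:\n"

prompt = build_prompt("Explain magnitude pruning in one sentence.")
print(prompt)
```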
### Training Dataset
`garage-bAInd/Platypus2-13B` was trained using the STEM- and logic-based dataset [`garage-bAInd/Open-Platypus`](https://huggingface.co/datasets/garage-bAInd/Open-Platypus).
Please see our [paper](https://arxiv.org/abs/2308.07317) and [project webpage](https://platypus-llm.github.io) for additional information.
### Training Procedure
`garage-bAInd/Platypus2-13B` was instruction fine-tuned using LoRA on 1 A100 80GB. For training details and inference instructions please see the [Platypus2](https://github.com/arielnlee/Platypus) GitHub repo.
### Reproducing Evaluation Results
Install LM Evaluation Harness:
```
# clone repository
git clone https://github.com/EleutherAI/lm-evaluation-harness.git
# change to repo directory
cd lm-evaluation-harness
# check out the correct commit
git checkout b281b0921b636bc36ad05c0b0b0763bd6dd43463
# install
pip install -e .
```
Each task was evaluated on 1 A100 80GB GPU.
ARC:
```
python main.py --model hf-causal-experimental --model_args pretrained=garage-bAInd/Platypus2-13B --tasks arc_challenge --batch_size 1 --no_cache --write_out --output_path results/Platypus2-13B/arc_challenge_25shot.json --device cuda --num_fewshot 25
```
HellaSwag:
```
python main.py --model hf-causal-experimental --model_args pretrained=garage-bAInd/Platypus2-13B --tasks hellaswag --batch_size 1 --no_cache --write_out --output_path results/Platypus2-13B/hellaswag_10shot.json --device cuda --num_fewshot 10
```
MMLU:
```
python main.py --model hf-causal-experimental --model_args pretrained=garage-bAInd/Platypus2-13B --tasks hendrycksTest-* --batch_size 1 --no_cache --write_out --output_path results/Platypus2-13B/mmlu_5shot.json --device cuda --num_fewshot 5
```
TruthfulQA:
```
python main.py --model hf-causal-experimental --model_args pretrained=garage-bAInd/Platypus2-13B --tasks truthfulqa_mc --batch_size 1 --no_cache --write_out --output_path results/Platypus2-13B/truthfulqa_0shot.json --device cuda
```
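After the four commands above have run, each task's JSON file under `results/Platypus2-13B/` can be collected with a short script. This is an illustrative sketch: the exact JSON schema written by the pinned harness commit is an assumption and should be checked against an actual output file.

```python
import json
from pathlib import Path


def load_results(results_dir: str) -> dict:
    """Collect the `results` section of every harness output JSON in a directory.

    File names follow the --output_path flags used above; the top-level
    "results" key is an assumption about the harness output format.
    """
    metrics = {}
    for path in Path(results_dir).glob("*.json"):
        with open(path) as f:
            data = json.load(f)
        metrics[path.stem] = data.get("results", {})
    return metrics
```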
### Limitations and bias
Llama 2 and its fine-tuned variants are a new technology that carries risks with use. Testing conducted to date has been in English and has not covered, nor could it cover, all scenarios. For these reasons, as with all LLMs, the potential outputs of Llama 2 and any fine-tuned variant cannot be predicted in advance, and the model may in some instances produce inaccurate, biased, or otherwise objectionable responses to user prompts. Therefore, before deploying any applications of Llama 2 variants, developers should perform safety testing and tuning tailored to their specific applications of the model.
Please see the Responsible Use Guide available at https://ai.meta.com/llama/responsible-use-guide/
### Citations
```bibtex
@article{platypus2023,
title={Platypus: Quick, Cheap, and Powerful Refinement of LLMs},
author={Ariel N. Lee and Cole J. Hunter and Nataniel Ruiz},
  journal={arXiv preprint arXiv:2308.07317},
year={2023}
}
```
```bibtex
@misc{touvron2023llama,
title={Llama 2: Open Foundation and Fine-Tuned Chat Models},
  author={Hugo Touvron and Louis Martin and Kevin Stone and Peter Albert and Amjad Almahairi and Yasmine Babaei and Nikolay Bashlykov and others},
  year={2023},
eprint={2307.09288},
archivePrefix={arXiv},
}
```
```bibtex
@inproceedings{
hu2022lora,
title={Lo{RA}: Low-Rank Adaptation of Large Language Models},
author={Edward J Hu and Yelong Shen and Phillip Wallis and Zeyuan Allen-Zhu and Yuanzhi Li and Shean Wang and Lu Wang and Weizhu Chen},
booktitle={International Conference on Learning Representations},
year={2022},
url={https://openreview.net/forum?id=nZeVKeeFYf9}
}
``` | 4,873 | [
[
-0.023284912109375,
-0.06451416015625,
0.022552490234375,
0.032012939453125,
-0.02276611328125,
-0.0036334991455078125,
-0.0252838134765625,
-0.0364990234375,
0.0005736351013183594,
0.022857666015625,
-0.041839599609375,
-0.027008056640625,
-0.050872802734375,
-0.00595855712890625,
-0.0057525634765625,
0.07366943359375,
-0.02691650390625,
-0.01416778564453125,
-0.00855255126953125,
-0.01369476318359375,
-0.048797607421875,
-0.033233642578125,
-0.03033447265625,
-0.034088134765625,
0.02032470703125,
0.0283355712890625,
0.041595458984375,
0.042877197265625,
0.051116943359375,
0.0225982666015625,
-0.015777587890625,
0.0206146240234375,
-0.046478271484375,
-0.01236724853515625,
0.01180267333984375,
-0.040802001953125,
-0.04296875,
0.0101776123046875,
0.032867431640625,
0.0281982421875,
-0.013580322265625,
0.03564453125,
0.0145263671875,
0.024749755859375,
-0.04534912109375,
0.0282440185546875,
-0.041778564453125,
-0.01348876953125,
-0.022216796875,
-0.014739990234375,
-0.028839111328125,
-0.0177154541015625,
-0.01268768310546875,
-0.054779052734375,
-0.0003681182861328125,
0.007106781005859375,
0.08563232421875,
0.042938232421875,
-0.018524169921875,
-0.0121917724609375,
-0.0264434814453125,
0.072265625,
-0.061737060546875,
0.00896453857421875,
0.0277252197265625,
0.006855010986328125,
-0.0300750732421875,
-0.053985595703125,
-0.04095458984375,
-0.02276611328125,
-0.00537109375,
0.0122222900390625,
-0.0198516845703125,
-0.0022106170654296875,
0.023956298828125,
0.031890869140625,
-0.02911376953125,
0.04010009765625,
-0.036895751953125,
-0.011688232421875,
0.055877685546875,
0.01509857177734375,
0.0031375885009765625,
-0.01145172119140625,
-0.040069580078125,
-0.0307159423828125,
-0.05328369140625,
0.025848388671875,
0.0311279296875,
0.0113525390625,
-0.032379150390625,
0.050689697265625,
-0.008026123046875,
0.029296875,
0.007221221923828125,
-0.038665771484375,
0.04351806640625,
-0.0198516845703125,
-0.024444580078125,
-0.006961822509765625,
0.0706787109375,
0.034454345703125,
0.0016269683837890625,
0.0067901611328125,
-0.0164031982421875,
0.0240020751953125,
-0.006237030029296875,
-0.06207275390625,
-0.01247406005859375,
0.02056884765625,
-0.0289764404296875,
-0.018280029296875,
-0.01303863525390625,
-0.045501708984375,
-0.02716064453125,
-0.00870513916015625,
0.032562255859375,
-0.034637451171875,
-0.0291900634765625,
0.015380859375,
0.0032024383544921875,
0.03985595703125,
0.016510009765625,
-0.052947998046875,
0.033660888671875,
0.049468994140625,
0.064208984375,
-0.0288848876953125,
-0.04559326171875,
-0.026031494140625,
-0.00553131103515625,
-0.020355224609375,
0.061614990234375,
-0.0104217529296875,
-0.0170440673828125,
-0.0224151611328125,
0.0144195556640625,
-0.01538848876953125,
-0.04986572265625,
0.0369873046875,
-0.0198822021484375,
0.01245880126953125,
-0.01708984375,
-0.035797119140625,
-0.027069091796875,
-0.007244110107421875,
-0.031646728515625,
0.10345458984375,
0.01277923583984375,
-0.056915283203125,
0.004405975341796875,
-0.05133056640625,
-0.0249176025390625,
-0.01177978515625,
0.0112152099609375,
-0.042388916015625,
-0.0026092529296875,
0.00865936279296875,
0.033905029296875,
-0.03924560546875,
0.02313232421875,
-0.0172271728515625,
-0.031585693359375,
0.01285552978515625,
-0.0084381103515625,
0.078125,
0.017303466796875,
-0.04486083984375,
0.007495880126953125,
-0.04803466796875,
-0.01080322265625,
0.036224365234375,
-0.02593994140625,
-0.00945281982421875,
-0.005100250244140625,
-0.00862884521484375,
0.007537841796875,
0.0323486328125,
-0.03546142578125,
0.0075225830078125,
-0.0261383056640625,
0.04473876953125,
0.056365966796875,
-0.0041351318359375,
0.01837158203125,
-0.03680419921875,
0.029022216796875,
0.0028228759765625,
0.02294921875,
-0.0008068084716796875,
-0.052978515625,
-0.07635498046875,
-0.0229339599609375,
0.0041046142578125,
0.064208984375,
-0.0257720947265625,
0.042266845703125,
0.001316070556640625,
-0.04913330078125,
-0.043914794921875,
0.0258026123046875,
0.04248046875,
0.040771484375,
0.038726806640625,
-0.0276031494140625,
-0.045928955078125,
-0.0673828125,
-0.0075836181640625,
-0.027008056640625,
0.01181793212890625,
0.0245819091796875,
0.05169677734375,
-0.027099609375,
0.04510498046875,
-0.035308837890625,
-0.02130126953125,
-0.0214385986328125,
-0.0005984306335449219,
0.0256805419921875,
0.0487060546875,
0.037322998046875,
-0.01450347900390625,
-0.0152587890625,
-0.018341064453125,
-0.0594482421875,
-0.0219573974609375,
-0.0049285888671875,
-0.018798828125,
0.034576416015625,
0.01654052734375,
-0.06439208984375,
0.02423095703125,
0.037109375,
-0.017181396484375,
0.03759765625,
-0.01160430908203125,
-0.01342010498046875,
-0.062744140625,
0.0103912353515625,
0.001819610595703125,
0.0016880035400390625,
-0.033538818359375,
0.0167083740234375,
-0.00013053417205810547,
0.0087432861328125,
-0.04400634765625,
0.047943115234375,
-0.036163330078125,
-0.01149749755859375,
-0.0111846923828125,
0.00931549072265625,
-0.00843048095703125,
0.051971435546875,
-0.004756927490234375,
0.06658935546875,
0.035736083984375,
-0.04315185546875,
0.0170745849609375,
0.03082275390625,
-0.030914306640625,
0.018768310546875,
-0.06744384765625,
0.01485443115234375,
0.01276397705078125,
0.02093505859375,
-0.0791015625,
-0.01270294189453125,
0.022613525390625,
-0.02276611328125,
0.0236968994140625,
0.01056671142578125,
-0.053802490234375,
-0.0340576171875,
-0.03564453125,
0.0160980224609375,
0.06658935546875,
-0.04656982421875,
0.0229034423828125,
0.0307769775390625,
0.003429412841796875,
-0.04473876953125,
-0.062469482421875,
-0.0176544189453125,
-0.0275115966796875,
-0.055572509765625,
0.0161590576171875,
-0.01114654541015625,
-0.0157012939453125,
-0.022003173828125,
-0.011932373046875,
0.00881195068359375,
0.0187530517578125,
0.039031982421875,
0.030059814453125,
-0.0132293701171875,
-0.01021575927734375,
-0.0008077621459960938,
-0.0182037353515625,
0.004772186279296875,
0.0130157470703125,
0.0487060546875,
-0.027008056640625,
-0.01206207275390625,
-0.061309814453125,
-0.00046706199645996094,
0.031951904296875,
-0.020233154296875,
0.0523681640625,
0.0487060546875,
-0.01383209228515625,
0.0093231201171875,
-0.05804443359375,
-0.0169219970703125,
-0.038604736328125,
0.029876708984375,
-0.017822265625,
-0.055145263671875,
0.048095703125,
-0.0031681060791015625,
0.0146636962890625,
0.050506591796875,
0.061309814453125,
0.00004190206527709961,
0.05670166015625,
0.0408935546875,
0.00818634033203125,
0.03094482421875,
-0.04986572265625,
0.007843017578125,
-0.07916259765625,
-0.0303802490234375,
-0.029693603515625,
-0.031402587890625,
-0.04833984375,
-0.04022216796875,
0.0098724365234375,
0.0210723876953125,
-0.046417236328125,
0.038848876953125,
-0.0391845703125,
0.0177764892578125,
0.045806884765625,
0.0079345703125,
0.01213836669921875,
0.0039043426513671875,
-0.00937652587890625,
-0.0010547637939453125,
-0.0447998046875,
-0.044769287109375,
0.08624267578125,
0.0433349609375,
0.06719970703125,
-0.003509521484375,
0.04937744140625,
-0.007595062255859375,
0.0196075439453125,
-0.051055908203125,
0.0501708984375,
-0.00815582275390625,
-0.034027099609375,
-0.0008063316345214844,
-0.01422882080078125,
-0.06915283203125,
0.0191192626953125,
0.00039839744567871094,
-0.05963134765625,
0.0165863037109375,
0.0028018951416015625,
-0.0310821533203125,
0.024017333984375,
-0.0714111328125,
0.06103515625,
-0.032135009765625,
-0.030975341796875,
-0.01953125,
-0.05224609375,
0.050567626953125,
-0.0022792816162109375,
0.0019474029541015625,
-0.0206756591796875,
-0.012969970703125,
0.0791015625,
-0.0445556640625,
0.0731201171875,
-0.0205230712890625,
-0.0015802383422851562,
0.036834716796875,
-0.0011157989501953125,
0.0413818359375,
0.004283905029296875,
-0.0017337799072265625,
0.0311737060546875,
-0.00011920928955078125,
-0.0224761962890625,
-0.01168060302734375,
0.0601806640625,
-0.09814453125,
-0.05072021484375,
-0.037994384765625,
-0.055389404296875,
0.0024738311767578125,
0.00707244873046875,
0.014739990234375,
0.000194549560546875,
0.0188446044921875,
0.01177215576171875,
0.047088623046875,
-0.039031982421875,
0.042449951171875,
0.038726806640625,
-0.00191497802734375,
-0.024566650390625,
0.055206298828125,
0.003307342529296875,
0.0186767578125,
0.01277923583984375,
0.00814056396484375,
-0.018707275390625,
-0.032012939453125,
-0.0137176513671875,
0.045806884765625,
-0.042388916015625,
-0.039886474609375,
-0.03955078125,
-0.0227508544921875,
-0.0157928466796875,
0.003070831298828125,
-0.0390625,
-0.035186767578125,
-0.050079345703125,
-0.006847381591796875,
0.0487060546875,
0.040283203125,
-0.01044464111328125,
0.051605224609375,
-0.0122222900390625,
0.026641845703125,
0.01934814453125,
0.0245819091796875,
-0.0016498565673828125,
-0.0599365234375,
0.00879669189453125,
0.0003871917724609375,
-0.049072265625,
-0.05279541015625,
0.0281219482421875,
0.0130615234375,
0.053466796875,
0.013275146484375,
0.0002655982971191406,
0.06768798828125,
-0.0176239013671875,
0.06390380859375,
0.015472412109375,
-0.062164306640625,
0.05377197265625,
-0.0031032562255859375,
0.0036983489990234375,
0.0303802490234375,
0.0194549560546875,
-0.01043701171875,
-0.025665283203125,
-0.05450439453125,
-0.062469482421875,
0.0699462890625,
0.02001953125,
-0.005916595458984375,
0.0164031982421875,
0.033111572265625,
0.0131988525390625,
0.0032958984375,
-0.057769775390625,
-0.029022216796875,
-0.0250091552734375,
-0.0029468536376953125,
-0.00897216796875,
-0.0261383056640625,
-0.01110076904296875,
-0.0279388427734375,
0.056488037109375,
0.00041747093200683594,
0.039031982421875,
0.01102447509765625,
-0.0279388427734375,
-0.016754150390625,
0.003055572509765625,
0.04931640625,
0.0433349609375,
-0.034088134765625,
-0.004161834716796875,
0.0235443115234375,
-0.04852294921875,
0.0120086669921875,
0.01678466796875,
-0.00446319580078125,
-0.0158843994140625,
0.03125,
0.08441162109375,
0.007770538330078125,
-0.0474853515625,
0.034942626953125,
-0.0008187294006347656,
-0.0157470703125,
-0.020233154296875,
0.01508331298828125,
0.01206207275390625,
0.026702880859375,
0.022430419921875,
-0.0009050369262695312,
-0.0188446044921875,
-0.02978515625,
-0.0084228515625,
0.028839111328125,
0.0093994140625,
-0.031280517578125,
0.06280517578125,
0.004985809326171875,
-0.021392822265625,
0.043212890625,
-0.01305389404296875,
-0.020843505859375,
0.057037353515625,
0.053436279296875,
0.044403076171875,
-0.019561767578125,
-0.000629425048828125,
0.035186767578125,
0.041168212890625,
-0.0172119140625,
0.035858154296875,
0.0158233642578125,
-0.0367431640625,
-0.028656005859375,
-0.048828125,
-0.0197906494140625,
0.033599853515625,
-0.040985107421875,
0.0295867919921875,
-0.049896240234375,
-0.0193023681640625,
-0.01097869873046875,
0.0352783203125,
-0.0504150390625,
-0.006359100341796875,
0.00966644287109375,
0.0811767578125,
-0.069091796875,
0.056915283203125,
0.050567626953125,
-0.035736083984375,
-0.07086181640625,
-0.03338623046875,
-0.004482269287109375,
-0.08880615234375,
0.044403076171875,
0.017364501953125,
0.00762939453125,
-0.006160736083984375,
-0.05413818359375,
-0.07745361328125,
0.11444091796875,
0.049713134765625,
-0.047943115234375,
0.0157470703125,
0.00720977783203125,
0.038421630859375,
-0.0183563232421875,
0.02716064453125,
0.060272216796875,
0.0380859375,
0.0012798309326171875,
-0.08734130859375,
0.0184326171875,
-0.0192413330078125,
0.01024627685546875,
-0.004913330078125,
-0.0814208984375,
0.0826416015625,
-0.0290985107421875,
-0.008544921875,
0.02490234375,
0.0452880859375,
0.0577392578125,
0.0146942138671875,
0.02593994140625,
0.06695556640625,
0.062164306640625,
-0.007450103759765625,
0.08856201171875,
-0.027557373046875,
0.03668212890625,
0.07745361328125,
-0.01031494140625,
0.072265625,
0.042205810546875,
-0.032562255859375,
0.049285888671875,
0.067138671875,
-0.0110321044921875,
0.0440673828125,
0.00836944580078125,
0.0160064697265625,
-0.00860595703125,
-0.0030670166015625,
-0.037872314453125,
0.027618408203125,
0.0278778076171875,
-0.01371002197265625,
-0.00867462158203125,
-0.01276397705078125,
0.016265869140625,
-0.02545166015625,
-0.01172637939453125,
0.043121337890625,
0.020416259765625,
-0.050506591796875,
0.0838623046875,
0.007549285888671875,
0.0654296875,
-0.04095458984375,
0.0154571533203125,
-0.038604736328125,
0.02252197265625,
-0.02618408203125,
-0.05255126953125,
0.0021991729736328125,
0.0019016265869140625,
0.00496673583984375,
-0.00445556640625,
0.04644775390625,
-0.0006661415100097656,
-0.02862548828125,
0.034637451171875,
0.0255584716796875,
0.0222930908203125,
0.01450347900390625,
-0.05523681640625,
0.02447509765625,
-0.00853729248046875,
-0.034088134765625,
0.02435302734375,
0.0090484619140625,
-0.015167236328125,
0.050323486328125,
0.046905517578125,
-0.004474639892578125,
0.0203857421875,
-0.011749267578125,
0.0726318359375,
-0.0301513671875,
-0.026397705078125,
-0.057403564453125,
0.034820556640625,
0.01314544677734375,
-0.043670654296875,
0.05401611328125,
0.041961669921875,
0.053955078125,
0.007198333740234375,
0.041168212890625,
-0.0120391845703125,
0.0281219482421875,
-0.0280914306640625,
0.035858154296875,
-0.036865234375,
0.029205322265625,
-0.0062103271484375,
-0.07489013671875,
-0.01119232177734375,
0.05859375,
-0.02947998046875,
-0.007129669189453125,
0.063232421875,
0.0701904296875,
-0.01326751708984375,
-0.0180816650390625,
-0.01055908203125,
0.040985107421875,
0.0154876708984375,
0.07000732421875,
0.063720703125,
-0.05548095703125,
0.04229736328125,
-0.040985107421875,
-0.0265350341796875,
-0.0191192626953125,
-0.058013916015625,
-0.07586669921875,
-0.033050537109375,
-0.037017822265625,
-0.029205322265625,
-0.0003075599670410156,
0.057037353515625,
0.041229248046875,
-0.066162109375,
-0.03973388671875,
-0.0032024383544921875,
0.004425048828125,
-0.0175933837890625,
-0.013763427734375,
0.03778076171875,
-0.0215911865234375,
-0.032196044921875,
0.017242431640625,
0.004253387451171875,
0.011016845703125,
-0.0241241455078125,
-0.02490234375,
-0.0211639404296875,
-0.00814056396484375,
0.034881591796875,
0.0268402099609375,
-0.07159423828125,
-0.0066986083984375,
0.0016803741455078125,
-0.00888824462890625,
0.0182342529296875,
0.031036376953125,
-0.061798095703125,
-0.0009784698486328125,
0.0268402099609375,
0.033660888671875,
0.050567626953125,
-0.0113372802734375,
0.0082244873046875,
-0.038909912109375,
0.037353515625,
-0.0054779052734375,
0.03094482421875,
0.0333251953125,
-0.0208282470703125,
0.04815673828125,
0.02752685546875,
-0.044677734375,
-0.07806396484375,
-0.0093841552734375,
-0.089599609375,
-0.0112152099609375,
0.11065673828125,
-0.01319122314453125,
-0.0338134765625,
0.0208740234375,
-0.0200347900390625,
0.0302886962890625,
-0.03338623046875,
0.05450439453125,
0.027130126953125,
-0.0165863037109375,
-0.0172271728515625,
-0.056488037109375,
0.0212554931640625,
0.0281829833984375,
-0.064208984375,
-0.00988006591796875,
0.0164947509765625,
0.03759765625,
0.0135650634765625,
0.0355224609375,
0.00028324127197265625,
0.01715087890625,
-0.01319122314453125,
0.005794525146484375,
-0.01506805419921875,
-0.002147674560546875,
-0.0282745361328125,
-0.0169525146484375,
0.00524139404296875,
-0.0181121826171875
]
] |
caisarl76/Mistral-7B-OpenOrca-Guanaco-accu16 | 2023-10-11T07:37:13.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"MindsAndCompany",
"llama-2",
"en",
"arxiv:2306.02707",
"license:llama2",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text-generation | caisarl76 | null | null | caisarl76/Mistral-7B-OpenOrca-Guanaco-accu16 | 0 | 7,096 | transformers | 2023-10-06T12:53:08 | ---
pipeline_tag: text-generation
license: llama2
language:
- en
library_name: transformers
tags:
- MindsAndCompany
- llama-2
---
## Model Details
* **Developed by**: [Minds And Company](https://mnc.ai/)
* **Backbone Model**: [Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1)
* **Library**: [HuggingFace Transformers](https://github.com/huggingface/transformers)
## Dataset Details
### Used Datasets
- Orca-style dataset
- Alpaca-style dataset
### Prompt Template
- Llama Prompt Template
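The template can be applied with a small helper. The `[INST]`/`<<SYS>>` layout below is an assumption based on the "Llama Prompt Template" note above; verify it against the model's tokenizer configuration before relying on it.

```python
def format_llama_prompt(user_message: str, system_prompt: str = "") -> str:
    """Wrap a user message in the Llama-2 chat template.

    The exact template is an assumption, not confirmed by this card.
    """
    if system_prompt:
        return (
            f"<s>[INST] <<SYS>>\n{system_prompt}\n<</SYS>>\n\n"
            f"{user_message} [/INST]"
        )
    return f"<s>[INST] {user_message} [/INST]"

prompt = format_llama_prompt("Summarize the Orca training approach.")
```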
## Limitations & Biases:
Llama 2 and its fine-tuned variants are a new technology that carries risks with use. Testing conducted to date has been in English and has not covered, nor could it cover, all scenarios. For these reasons, as with all LLMs, the potential outputs of Llama 2 and any fine-tuned variant cannot be predicted in advance, and the model may in some instances produce inaccurate, biased, or otherwise objectionable responses to user prompts. Therefore, before deploying any applications of Llama 2 variants, developers should perform safety testing and tuning tailored to their specific applications of the model.
Please see the Responsible Use Guide available at https://ai.meta.com/llama/responsible-use-guide/
## License Disclaimer:
This model is bound by the license and usage restrictions of the original Llama 2 model, and it comes with no warranty or guarantees of any kind.
## Contact Us
- [Minds And Company](https://mnc.ai/)
## Citation:
Please kindly cite using the following BibTeX:
```bibtex
@misc{mukherjee2023orca,
title={Orca: Progressive Learning from Complex Explanation Traces of GPT-4},
author={Subhabrata Mukherjee and Arindam Mitra and Ganesh Jawahar and Sahaj Agarwal and Hamid Palangi and Ahmed Awadallah},
year={2023},
eprint={2306.02707},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
```bibtex
@misc{Orca-best,
title = {Orca-best: A filtered version of orca gpt4 dataset.},
author = {Shahul Es},
year = {2023},
publisher = {HuggingFace},
journal = {HuggingFace repository},
howpublished = {\url{https://huggingface.co/datasets/shahules786/orca-best/}},
}
```
```bibtex
@software{touvron2023llama2,
title={Llama 2: Open Foundation and Fine-Tuned Chat Models},
author={Hugo Touvron, Louis Martin, Kevin Stone, Peter Albert, Amjad Almahairi, Yasmine Babaei, Nikolay Bashlykov, Soumya Batra, Prajjwal Bhargava,
Shruti Bhosale, Dan Bikel, Lukas Blecher, Cristian Canton Ferrer, Moya Chen, Guillem Cucurull, David Esiobu, Jude Fernandes, Jeremy Fu, Wenyin Fu, Brian Fuller,
Cynthia Gao, Vedanuj Goswami, Naman Goyal, Anthony Hartshorn, Saghar Hosseini, Rui Hou, Hakan Inan, Marcin Kardas, Viktor Kerkez, Madian Khabsa, Isabel Kloumann,
Artem Korenev, Punit Singh Koura, Marie-Anne Lachaux, Thibaut Lavril, Jenya Lee, Diana Liskovich, Yinghai Lu, Yuning Mao, Xavier Martinet, Todor Mihaylov,
Pushkar Mishra, Igor Molybog, Yixin Nie, Andrew Poulton, Jeremy Reizenstein, Rashi Rungta, Kalyan Saladi, Alan Schelten, Ruan Silva, Eric Michael Smith,
Ranjan Subramanian, Xiaoqing Ellen Tan, Binh Tang, Ross Taylor, Adina Williams, Jian Xiang Kuan, Puxin Xu, Zheng Yan, Iliyan Zarov, Yuchen Zhang, Angela Fan,
Melanie Kambadur, Sharan Narang, Aurelien Rodriguez, Robert Stojnic, Sergey Edunov, Thomas Scialom},
year={2023}
}
```
> Readme format: [Riiid/sheep-duck-llama-2-70b-v1.1](https://huggingface.co/Riiid/sheep-duck-llama-2-70b-v1.1) | 3,401 | [
[
-0.03173828125,
-0.048919677734375,
0.0102386474609375,
0.0197296142578125,
-0.023834228515625,
0.00592041015625,
0.01303863525390625,
-0.054443359375,
0.0179901123046875,
0.025848388671875,
-0.05462646484375,
-0.041900634765625,
-0.046478271484375,
-0.004131317138671875,
-0.0233001708984375,
0.08074951171875,
-0.005146026611328125,
-0.03363037109375,
0.00201416015625,
-0.0133819580078125,
-0.0288543701171875,
-0.0171966552734375,
-0.054779052734375,
-0.0260009765625,
0.0198516845703125,
0.018646240234375,
0.056060791015625,
0.043426513671875,
0.03558349609375,
0.0221405029296875,
-0.0335693359375,
0.017730712890625,
-0.03472900390625,
-0.0168914794921875,
0.005832672119140625,
-0.033905029296875,
-0.07769775390625,
-0.0037593841552734375,
0.0294952392578125,
0.0215911865234375,
-0.0178070068359375,
0.0257568359375,
0.0181427001953125,
0.041046142578125,
-0.019744873046875,
0.026275634765625,
-0.028717041015625,
0.0015859603881835938,
-0.0295257568359375,
-0.006526947021484375,
-0.0018339157104492188,
-0.027984619140625,
0.0028400421142578125,
-0.051849365234375,
0.0031681060791015625,
-0.0037555694580078125,
0.0950927734375,
0.0250244140625,
-0.036468505859375,
-0.007320404052734375,
-0.0281524658203125,
0.05584716796875,
-0.06646728515625,
0.024322509765625,
0.021575927734375,
0.0204315185546875,
-0.035430908203125,
-0.0638427734375,
-0.052703857421875,
0.0005054473876953125,
-0.01088714599609375,
0.0178985595703125,
-0.0281524658203125,
-0.007266998291015625,
0.01320648193359375,
0.0293426513671875,
-0.041839599609375,
0.01552581787109375,
-0.043731689453125,
-0.0211944580078125,
0.048614501953125,
0.01531219482421875,
0.0120697021484375,
-0.01428985595703125,
-0.045806884765625,
-0.0251007080078125,
-0.057037353515625,
0.03582763671875,
0.032684326171875,
0.00567626953125,
-0.05914306640625,
0.050750732421875,
-0.01320648193359375,
0.033111572265625,
0.0086822509765625,
-0.033599853515625,
0.044952392578125,
-0.039581298828125,
-0.01520538330078125,
-0.0204925537109375,
0.058502197265625,
0.0303802490234375,
0.00957489013671875,
0.0233306884765625,
-0.00717926025390625,
0.0022258758544921875,
-0.004779815673828125,
-0.05828857421875,
-0.006160736083984375,
0.03167724609375,
-0.03253173828125,
-0.0235748291015625,
-0.006717681884765625,
-0.0731201171875,
-0.020782470703125,
-0.01247406005859375,
0.0100250244140625,
-0.01255035400390625,
-0.03924560546875,
0.01515960693359375,
0.0106201171875,
0.047454833984375,
0.00992584228515625,
-0.059295654296875,
0.024505615234375,
0.036773681640625,
0.06048583984375,
-0.01279449462890625,
-0.006805419921875,
-0.004535675048828125,
0.006404876708984375,
-0.0209197998046875,
0.06329345703125,
-0.021453857421875,
-0.032501220703125,
-0.0106048583984375,
0.00652313232421875,
0.00577545166015625,
-0.027984619140625,
0.046722412109375,
-0.025421142578125,
0.0174560546875,
-0.0236968994140625,
-0.0134735107421875,
-0.0304718017578125,
0.0135345458984375,
-0.035552978515625,
0.0904541015625,
0.0110931396484375,
-0.056732177734375,
0.019073486328125,
-0.04388427734375,
-0.0128173828125,
-0.0254058837890625,
-0.0103759765625,
-0.060638427734375,
-0.0261383056640625,
0.0260162353515625,
0.0270538330078125,
-0.032684326171875,
0.0217132568359375,
-0.031951904296875,
-0.017547607421875,
0.003063201904296875,
-0.017364501953125,
0.0751953125,
0.0157928466796875,
-0.05120849609375,
0.018402099609375,
-0.054229736328125,
-0.0133514404296875,
0.030914306640625,
-0.0198516845703125,
0.00957489013671875,
0.00010377168655395508,
-0.01348876953125,
0.0199432373046875,
0.028289794921875,
-0.0299224853515625,
0.0174713134765625,
-0.025238037109375,
0.040435791015625,
0.057159423828125,
-0.003753662109375,
0.0157623291015625,
-0.042877197265625,
0.04620361328125,
0.00992584228515625,
0.04388427734375,
0.00576019287109375,
-0.061431884765625,
-0.06561279296875,
-0.030029296875,
-0.006412506103515625,
0.049713134765625,
-0.030364990234375,
0.04058837890625,
-0.0133819580078125,
-0.052490234375,
-0.023712158203125,
0.007732391357421875,
0.028656005859375,
0.04071044921875,
0.033203125,
-0.031280517578125,
-0.04266357421875,
-0.07391357421875,
0.0033664703369140625,
-0.0245361328125,
0.004917144775390625,
0.029510498046875,
0.033447265625,
-0.0278778076171875,
0.07177734375,
-0.030975341796875,
-0.0289306640625,
-0.0125885009765625,
-0.01494598388671875,
0.0279541015625,
0.03802490234375,
0.05584716796875,
-0.052276611328125,
-0.0194549560546875,
-0.01169586181640625,
-0.054656982421875,
-0.0106201171875,
0.0078887939453125,
-0.028411865234375,
0.01366424560546875,
0.025054931640625,
-0.055633544921875,
0.05548095703125,
0.057403564453125,
-0.032318115234375,
0.04449462890625,
0.005023956298828125,
-0.0013408660888671875,
-0.0830078125,
0.01000213623046875,
0.00469207763671875,
-0.00829315185546875,
-0.033905029296875,
-0.0085906982421875,
-0.006542205810546875,
0.00910186767578125,
-0.03240966796875,
0.04815673828125,
-0.0302581787109375,
-0.0008039474487304688,
-0.004169464111328125,
0.01454925537109375,
-0.0013360977172851562,
0.04779052734375,
-0.016387939453125,
0.05499267578125,
0.04248046875,
-0.030181884765625,
0.01432037353515625,
0.03375244140625,
-0.0258941650390625,
0.036163330078125,
-0.0731201171875,
0.01551055908203125,
0.0029048919677734375,
0.048370361328125,
-0.09735107421875,
-0.0245361328125,
0.040313720703125,
-0.030609130859375,
0.0298004150390625,
0.003154754638671875,
-0.0211944580078125,
-0.041656494140625,
-0.040679931640625,
0.0298004150390625,
0.046417236328125,
-0.036468505859375,
0.03997802734375,
0.033660888671875,
-0.00897979736328125,
-0.05450439453125,
-0.05596923828125,
-0.0112457275390625,
-0.037811279296875,
-0.05572509765625,
0.0298004150390625,
-0.0203094482421875,
0.0014944076538085938,
-0.01284027099609375,
-0.0160980224609375,
0.00457000732421875,
0.0036907196044921875,
0.0303955078125,
0.032562255859375,
-0.01259613037109375,
-0.0236358642578125,
0.0139923095703125,
-0.015838623046875,
-0.002422332763671875,
0.0021419525146484375,
0.04388427734375,
-0.01322174072265625,
-0.029510498046875,
-0.048980712890625,
0.00434112548828125,
0.032806396484375,
-0.0228118896484375,
0.044891357421875,
0.05145263671875,
-0.0228118896484375,
0.01116943359375,
-0.053741455078125,
-0.0166778564453125,
-0.040313720703125,
0.0185546875,
-0.0284423828125,
-0.07318115234375,
0.0692138671875,
0.00524139404296875,
0.025421142578125,
0.057647705078125,
0.048095703125,
0.00954437255859375,
0.060302734375,
0.044921875,
0.0007777214050292969,
0.04022216796875,
-0.03155517578125,
0.00045800209045410156,
-0.0760498046875,
-0.051025390625,
-0.035858154296875,
-0.03179931640625,
-0.043060302734375,
-0.037017822265625,
0.029632568359375,
0.017852783203125,
-0.048309326171875,
0.028076171875,
-0.05255126953125,
0.007747650146484375,
0.0252227783203125,
0.01422882080078125,
0.0201568603515625,
0.0033416748046875,
-0.01555633544921875,
0.006267547607421875,
-0.032745361328125,
-0.044281005859375,
0.0894775390625,
0.036895751953125,
0.04913330078125,
0.0269317626953125,
0.034698486328125,
0.0034542083740234375,
0.021392822265625,
-0.03570556640625,
0.038818359375,
0.01280975341796875,
-0.06201171875,
-0.01277923583984375,
-0.0153656005859375,
-0.09271240234375,
0.0079803466796875,
0.000043272972106933594,
-0.07159423828125,
0.03546142578125,
0.0012445449829101562,
-0.030609130859375,
0.0181884765625,
-0.034576416015625,
0.04351806640625,
-0.003963470458984375,
-0.00432586669921875,
-0.005828857421875,
-0.061004638671875,
0.048980712890625,
-0.004978179931640625,
0.01345062255859375,
-0.0199127197265625,
-0.0228271484375,
0.05633544921875,
-0.03387451171875,
0.0797119140625,
-0.005062103271484375,
-0.0182952880859375,
0.044158935546875,
-0.003360748291015625,
0.05389404296875,
0.0169525146484375,
-0.00693511962890625,
0.034515380859375,
-0.01137542724609375,
-0.0262298583984375,
-0.0173187255859375,
0.044189453125,
-0.0867919921875,
-0.055419921875,
-0.0291900634765625,
-0.01212310791015625,
0.005275726318359375,
0.006748199462890625,
0.0248870849609375,
0.0230865478515625,
0.0186767578125,
0.0127410888671875,
0.03759765625,
-0.019927978515625,
0.0297393798828125,
0.03326416015625,
-0.01251220703125,
-0.03900146484375,
0.03765869140625,
0.0082855224609375,
0.0251007080078125,
0.0111846923828125,
0.00786590576171875,
-0.029754638671875,
-0.037445068359375,
-0.0239715576171875,
0.03497314453125,
-0.042083740234375,
-0.0307769775390625,
-0.04071044921875,
-0.023284912109375,
-0.0224761962890625,
0.0021991729736328125,
-0.03656005859375,
-0.025054931640625,
-0.045989990234375,
-0.0157623291015625,
0.04852294921875,
0.045989990234375,
-0.0069580078125,
0.021728515625,
-0.0279541015625,
0.01557159423828125,
0.0208892822265625,
0.018829345703125,
0.007572174072265625,
-0.06182861328125,
0.007450103759765625,
0.01306915283203125,
-0.05633544921875,
-0.05438232421875,
0.0246429443359375,
0.01395416259765625,
0.05181884765625,
0.0162353515625,
-0.0032520294189453125,
0.0689697265625,
-0.015899658203125,
0.08184814453125,
0.022796630859375,
-0.058685302734375,
0.044189453125,
-0.03851318359375,
0.0093841552734375,
0.0188446044921875,
0.0251617431640625,
-0.0185089111328125,
-0.016632080078125,
-0.058258056640625,
-0.0733642578125,
0.045989990234375,
0.0265960693359375,
0.0130462646484375,
0.00432586669921875,
0.03961181640625,
0.01267242431640625,
0.0048828125,
-0.054656982421875,
-0.040191650390625,
-0.027923583984375,
-0.0013341903686523438,
-0.00019741058349609375,
-0.025054931640625,
-0.01338958740234375,
-0.0172576904296875,
0.048431396484375,
-0.004329681396484375,
0.037750244140625,
0.0233154296875,
0.02294921875,
-0.01470947265625,
0.0010356903076171875,
0.0645751953125,
0.045379638671875,
-0.01439666748046875,
-0.01399993896484375,
0.023406982421875,
-0.046905517578125,
-0.00440216064453125,
0.0120849609375,
0.00417327880859375,
-0.0201263427734375,
0.0295867919921875,
0.060272216796875,
0.006336212158203125,
-0.0287628173828125,
0.034149169921875,
0.004741668701171875,
-0.016845703125,
-0.040863037109375,
0.01216888427734375,
0.0158233642578125,
0.051055908203125,
0.03289794921875,
0.01013946533203125,
0.0022487640380859375,
-0.028656005859375,
0.0045166015625,
0.0203399658203125,
-0.0156402587890625,
-0.03839111328125,
0.07403564453125,
0.0023136138916015625,
-0.01424407958984375,
0.034881591796875,
-0.0038604736328125,
-0.0294952392578125,
0.054229736328125,
0.0302886962890625,
0.0458984375,
-0.022796630859375,
0.002880096435546875,
0.03912353515625,
0.0161590576171875,
-0.00811767578125,
0.037750244140625,
0.0117645263671875,
-0.04345703125,
-0.0255889892578125,
-0.03851318359375,
-0.0229339599609375,
0.0265045166015625,
-0.04571533203125,
0.038543701171875,
-0.03900146484375,
-0.035308837890625,
-0.0166473388671875,
0.01441192626953125,
-0.05987548828125,
0.0031795501708984375,
0.0017251968383789062,
0.06671142578125,
-0.04833984375,
0.051300048828125,
0.039276123046875,
-0.033172607421875,
-0.08465576171875,
-0.0225830078125,
0.0219879150390625,
-0.07147216796875,
0.0289459228515625,
-0.0014066696166992188,
-0.0022640228271484375,
0.004241943359375,
-0.052886962890625,
-0.0833740234375,
0.11956787109375,
0.0279693603515625,
-0.0380859375,
0.0107574462890625,
0.0015115737915039062,
0.034759521484375,
-0.01025390625,
0.042327880859375,
0.050811767578125,
0.038787841796875,
0.0255889892578125,
-0.08905029296875,
0.020538330078125,
-0.030517578125,
-0.006336212158203125,
-0.0018033981323242188,
-0.0948486328125,
0.0814208984375,
-0.022613525390625,
-0.005359649658203125,
0.028900146484375,
0.060302734375,
0.057037353515625,
0.016571044921875,
0.028533935546875,
0.042816162109375,
0.05224609375,
-0.00899505615234375,
0.0848388671875,
-0.0087127685546875,
0.0361328125,
0.060577392578125,
0.004955291748046875,
0.060638427734375,
0.018310546875,
-0.041778564453125,
0.05938720703125,
0.07476806640625,
0.0021800994873046875,
0.03912353515625,
0.01116943359375,
0.005405426025390625,
-0.0032939910888671875,
-0.0031280517578125,
-0.059051513671875,
0.029388427734375,
0.035491943359375,
-0.0213775634765625,
-0.007633209228515625,
-0.0244903564453125,
0.029083251953125,
-0.0236663818359375,
0.0005888938903808594,
0.044403076171875,
0.0184478759765625,
-0.028228759765625,
0.07806396484375,
-0.01313018798828125,
0.0640869140625,
-0.05413818359375,
0.002410888671875,
-0.040069580078125,
0.00638580322265625,
-0.0307769775390625,
-0.058349609375,
0.0108795166015625,
-0.0010223388671875,
0.0022182464599609375,
0.01442718505859375,
0.043426513671875,
-0.0011167526245117188,
-0.026580810546875,
0.027923583984375,
0.020233154296875,
0.028472900390625,
0.0238494873046875,
-0.07330322265625,
0.018798828125,
0.0097808837890625,
-0.06976318359375,
0.0228118896484375,
0.031158447265625,
-0.01432037353515625,
0.06536865234375,
0.05438232421875,
-0.007495880126953125,
0.00849151611328125,
-0.007465362548828125,
0.0882568359375,
-0.0308837890625,
-0.0271759033203125,
-0.059112548828125,
0.057342529296875,
-0.0038604736328125,
-0.052703857421875,
0.049285888671875,
0.030914306640625,
0.049560546875,
0.019744873046875,
0.04400634765625,
-0.0120086669921875,
0.025482177734375,
-0.0178375244140625,
0.04327392578125,
-0.055694580078125,
0.0276031494140625,
-0.0185699462890625,
-0.07830810546875,
-0.021209716796875,
0.056549072265625,
-0.01149749755859375,
0.0186920166015625,
0.04010009765625,
0.07354736328125,
0.0004391670227050781,
-0.0006985664367675781,
-0.0081329345703125,
0.026947021484375,
0.05072021484375,
0.04669189453125,
0.038360595703125,
-0.046661376953125,
0.052154541015625,
-0.021148681640625,
-0.0268096923828125,
-0.0218658447265625,
-0.07342529296875,
-0.07305908203125,
-0.040252685546875,
-0.0286102294921875,
-0.03570556640625,
-0.0084075927734375,
0.05322265625,
0.057647705078125,
-0.05706787109375,
-0.01499176025390625,
-0.0007901191711425781,
-0.000003814697265625,
-0.021759033203125,
-0.0106353759765625,
0.046875,
0.01515960693359375,
-0.05133056640625,
0.0170745849609375,
0.007038116455078125,
0.033416748046875,
-0.024078369140625,
-0.01502227783203125,
-0.007747650146484375,
0.00617218017578125,
0.0279998779296875,
0.03570556640625,
-0.06536865234375,
-0.033935546875,
-0.007076263427734375,
-0.0146026611328125,
0.01213836669921875,
0.007904052734375,
-0.050262451171875,
0.0139923095703125,
0.03363037109375,
0.01210784912109375,
0.05108642578125,
0.00194549560546875,
0.01258087158203125,
-0.0290069580078125,
0.029205322265625,
-0.00176239013671875,
0.0210113525390625,
0.0149688720703125,
-0.0246124267578125,
0.05792236328125,
0.0162200927734375,
-0.043609619140625,
-0.058837890625,
0.003948211669921875,
-0.1063232421875,
0.0094146728515625,
0.09747314453125,
-0.0194244384765625,
-0.0133819580078125,
-0.004550933837890625,
-0.025482177734375,
0.034210205078125,
-0.037750244140625,
0.058990478515625,
0.035186767578125,
-0.00882720947265625,
-0.01317596435546875,
-0.05181884765625,
0.038543701171875,
0.00977325439453125,
-0.06292724609375,
-0.0211029052734375,
0.00675201416015625,
0.0307464599609375,
0.01442718505859375,
0.04058837890625,
-0.0227813720703125,
0.004878997802734375,
-0.003936767578125,
0.00945281982421875,
-0.01540374755859375,
0.004787445068359375,
-0.0036640167236328125,
-0.0107879638671875,
-0.0136871337890625,
-0.00945281982421875
]
] |
microsoft/bloom-deepspeed-inference-fp16 | 2022-10-11T18:28:26.000Z | [
"transformers",
"bloom",
"feature-extraction",
"license:bigscience-bloom-rail-1.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | feature-extraction | microsoft | null | null | microsoft/bloom-deepspeed-inference-fp16 | 10 | 7,094 | transformers | 2022-08-17T21:01:05 | ---
license: bigscience-bloom-rail-1.0
---
This is a copy of the original [BLOOM weights](https://huggingface.co/bigscience/bloom) that is more efficient to use with [DeepSpeed-MII](https://github.com/microsoft/deepspeed-mii) and [DeepSpeed-Inference](https://www.deepspeed.ai/tutorials/inference-tutorial/). In this repo, the original tensors are split into 8 shards to target 8 GPUs, which allows the model to be run with DeepSpeed-Inference tensor parallelism.
For specific details about the BLOOM model itself, please see the [original BLOOM model card](https://huggingface.co/bigscience/bloom).
For examples of how to use this repo, please see the following:
* https://github.com/huggingface/transformers-bloom-inference
* https://github.com/microsoft/DeepSpeed-MII
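The tensor-parallel idea behind the 8-way sharding can be illustrated with a small pure-Python sketch (toy matrix sizes; no DeepSpeed or GPUs involved):

```python
# Toy illustration of tensor-parallel sharding: a linear layer's weight
# matrix is split column-wise across 8 "ranks", each rank computes its
# slice of the output, and the slices are concatenated at the end.

N_RANKS = 8

def matmul(x, w):
    # x: (rows x inner), w: (inner x cols)
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*w)] for row in x]

x = [[float(i + j) for j in range(16)] for i in range(4)]               # activations
w = [[float((i * 31 + j) % 7) for j in range(32)] for i in range(16)]   # full weight

# Shard the weight column-wise: rank r owns columns [r*4, (r+1)*4).
cols_per_rank = len(w[0]) // N_RANKS
shards = [[row[r * cols_per_rank:(r + 1) * cols_per_rank] for row in w]
          for r in range(N_RANKS)]

# Each rank multiplies against its own shard; outputs are concatenated.
partials = [matmul(x, shard) for shard in shards]
y_parallel = [sum((p[i] for p in partials), []) for i in range(len(x))]

assert y_parallel == matmul(x, w)
print("sharded result matches full matmul")
```

On real hardware, each rank holds only its own shard in GPU memory, and the final concatenation is an all-gather collective across the 8 GPUs.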
| 781 | [
[
-0.018707275390625,
-0.03179931640625,
0.034149169921875,
0.045074462890625,
-0.0016813278198242188,
-0.0027332305908203125,
-0.005573272705078125,
-0.0213470458984375,
0.0192413330078125,
0.0212860107421875,
-0.0648193359375,
-0.006992340087890625,
-0.0259552001953125,
-0.0187530517578125,
-0.022918701171875,
0.0731201171875,
-0.007411956787109375,
0.034698486328125,
-0.0155487060546875,
-0.00103759765625,
-0.020782470703125,
0.01163482666015625,
-0.039398193359375,
-0.0506591796875,
0.043701171875,
0.027496337890625,
0.054534912109375,
0.03424072265625,
0.078857421875,
0.0221710205078125,
-0.0190582275390625,
-0.0159149169921875,
-0.048583984375,
-0.0169677734375,
0.0027942657470703125,
-0.0269317626953125,
-0.03094482421875,
0.002193450927734375,
0.05633544921875,
0.0275115966796875,
-0.0177764892578125,
0.0198516845703125,
-0.010528564453125,
0.0506591796875,
-0.07208251953125,
-0.006092071533203125,
-0.020843505859375,
0.0231781005859375,
-0.017120361328125,
0.0185546875,
-0.0238494873046875,
0.02008056640625,
-0.009796142578125,
-0.0635986328125,
0.048858642578125,
-0.0075836181640625,
0.07452392578125,
0.01165008544921875,
0.005313873291015625,
0.007617950439453125,
-0.071044921875,
0.06170654296875,
-0.0386962890625,
0.0250091552734375,
-0.01114654541015625,
0.053619384765625,
0.01971435546875,
-0.07879638671875,
-0.030059814453125,
0.01432037353515625,
0.006320953369140625,
0.0311431884765625,
-0.0156097412109375,
0.0028667449951171875,
0.029632568359375,
0.0855712890625,
-0.04180908203125,
-0.004894256591796875,
-0.03424072265625,
0.0029811859130859375,
0.043975830078125,
0.00327301025390625,
0.01422882080078125,
0.005725860595703125,
-0.0648193359375,
-0.040008544921875,
-0.040191650390625,
-0.045562744140625,
0.006526947021484375,
0.01551055908203125,
-0.0282135009765625,
0.02099609375,
0.00012874603271484375,
0.038543701171875,
-0.0011615753173828125,
0.0005125999450683594,
0.04339599609375,
-0.00696563720703125,
-0.031585693359375,
-0.0252532958984375,
0.03778076171875,
0.03741455078125,
0.00576019287109375,
-0.0265655517578125,
-0.00991058349609375,
-0.00010293722152709961,
0.035797119140625,
-0.0863037109375,
-0.0265960693359375,
0.010498046875,
-0.036224365234375,
-0.0188751220703125,
0.0154266357421875,
-0.067138671875,
0.00040078163146972656,
-0.0162506103515625,
0.0389404296875,
-0.04412841796875,
-0.022674560546875,
0.0122528076171875,
-0.042694091796875,
0.0207061767578125,
0.015777587890625,
-0.050994873046875,
0.0291748046875,
0.0738525390625,
0.09222412109375,
-0.0204315185546875,
-0.0419921875,
-0.0178375244140625,
0.00972747802734375,
0.0037784576416015625,
0.036590576171875,
-0.00839996337890625,
-0.044464111328125,
-0.01425933837890625,
0.002292633056640625,
-0.0008034706115722656,
-0.0288238525390625,
0.0633544921875,
-0.060760498046875,
-0.01445770263671875,
-0.042755126953125,
-0.05255126953125,
-0.01198577880859375,
0.00975799560546875,
-0.058990478515625,
0.060302734375,
0.031097412109375,
-0.052154541015625,
0.042816162109375,
-0.062286376953125,
-0.0087738037109375,
0.0305938720703125,
0.00958251953125,
-0.045806884765625,
0.02117919921875,
0.0034542083740234375,
-0.005695343017578125,
-0.0016222000122070312,
-0.0281524658203125,
-0.044403076171875,
-0.0309906005859375,
-0.0257110595703125,
-0.02105712890625,
0.049530029296875,
0.05938720703125,
-0.0170745849609375,
0.01285552978515625,
-0.0484619140625,
-0.007671356201171875,
0.02069091796875,
-0.0062255859375,
-0.0001920461654663086,
-0.01222991943359375,
0.023529052734375,
-0.0257720947265625,
0.01004791259765625,
-0.05389404296875,
0.01197052001953125,
-0.006237030029296875,
0.034820556640625,
0.043182373046875,
-0.0051116943359375,
0.0369873046875,
-0.03643798828125,
0.0238800048828125,
0.00894927978515625,
0.029937744140625,
-0.0157318115234375,
-0.04364013671875,
-0.036102294921875,
-0.06207275390625,
0.007671356201171875,
0.0174407958984375,
-0.0283355712890625,
-0.005352020263671875,
-0.033294677734375,
-0.059295654296875,
-0.0263519287109375,
0.0030307769775390625,
0.0181732177734375,
0.0103759765625,
0.04022216796875,
-0.0205841064453125,
-0.037841796875,
-0.04925537109375,
-0.01435089111328125,
0.01163482666015625,
0.0085906982421875,
0.0244293212890625,
0.0450439453125,
-0.019989013671875,
0.062103271484375,
-0.06561279296875,
-0.00432586669921875,
0.01227569580078125,
0.01091766357421875,
0.0262298583984375,
0.06396484375,
0.06256103515625,
-0.05419921875,
-0.0248260498046875,
-0.01910400390625,
-0.0294036865234375,
0.00730133056640625,
0.006175994873046875,
-0.03948974609375,
-0.03948974609375,
0.01139068603515625,
-0.0631103515625,
0.02301025390625,
0.035858154296875,
-0.032867431640625,
0.04345703125,
-0.0232391357421875,
0.0221710205078125,
-0.0823974609375,
0.011566162109375,
0.00545501708984375,
-0.0252532958984375,
-0.0288238525390625,
0.064208984375,
0.00911712646484375,
-0.007266998291015625,
-0.02880859375,
0.040557861328125,
-0.03759765625,
-0.0176849365234375,
-0.0258331298828125,
-0.04119873046875,
0.01032257080078125,
0.01214599609375,
-0.0119476318359375,
0.09716796875,
0.031982421875,
-0.02886962890625,
0.0182037353515625,
0.0242156982421875,
-0.0241241455078125,
0.0299072265625,
-0.06939697265625,
0.0031299591064453125,
-0.01416778564453125,
-0.0015716552734375,
-0.0579833984375,
-0.010040283203125,
-0.003387451171875,
-0.016845703125,
0.034637451171875,
-0.01097869873046875,
-0.027862548828125,
-0.041534423828125,
-0.0205841064453125,
0.045440673828125,
0.06329345703125,
-0.056488037109375,
0.035064697265625,
0.0216522216796875,
0.04046630859375,
-0.0259552001953125,
-0.08843994140625,
-0.01611328125,
-0.0228118896484375,
-0.027862548828125,
0.06182861328125,
-0.008270263671875,
-0.0170135498046875,
-0.006237030029296875,
0.0211334228515625,
0.0027065277099609375,
0.009521484375,
0.0296783447265625,
0.01505279541015625,
-0.043701171875,
0.0027027130126953125,
-0.01629638671875,
-0.00955963134765625,
-0.0079345703125,
-0.027099609375,
0.040985107421875,
-0.0140228271484375,
-0.039276123046875,
-0.033782958984375,
0.0302276611328125,
0.053070068359375,
-0.0036869049072265625,
0.058990478515625,
0.078857421875,
-0.0270538330078125,
0.00359344482421875,
-0.052886962890625,
-0.0350341796875,
-0.036834716796875,
0.01507568359375,
-0.0310821533203125,
-0.04962158203125,
0.0249786376953125,
0.00130462646484375,
0.00940704345703125,
0.06268310546875,
0.042266845703125,
0.00698089599609375,
0.05426025390625,
0.042083740234375,
-0.039459228515625,
0.043212890625,
-0.047515869140625,
-0.015380859375,
-0.065185546875,
0.0099945068359375,
-0.0256500244140625,
-0.049652099609375,
-0.004528045654296875,
-0.05340576171875,
0.001041412353515625,
0.0284271240234375,
-0.043914794921875,
0.03851318359375,
-0.039398193359375,
0.0217132568359375,
0.0638427734375,
0.0251312255859375,
0.0169525146484375,
0.036041259765625,
0.0137786865234375,
-0.00736236572265625,
-0.04351806640625,
-0.01081085205078125,
0.057647705078125,
0.0250701904296875,
0.04730224609375,
0.0161285400390625,
0.033599853515625,
0.0350341796875,
0.059814453125,
-0.034637451171875,
0.0038242340087890625,
-0.0118255615234375,
-0.065185546875,
0.00905609130859375,
-0.048675537109375,
-0.05609130859375,
0.0102996826171875,
-0.03619384765625,
-0.048431396484375,
0.005603790283203125,
0.0254058837890625,
-0.036041259765625,
0.054534912109375,
-0.066650390625,
0.0740966796875,
0.01253509521484375,
-0.0321044921875,
-0.03765869140625,
-0.004093170166015625,
0.0258331298828125,
0.03387451171875,
-0.005645751953125,
-0.0146942138671875,
0.00984954833984375,
0.043792724609375,
-0.056671142578125,
0.053619384765625,
-0.021575927734375,
0.027099609375,
0.053436279296875,
-0.0177764892578125,
0.02655029296875,
0.0078887939453125,
-0.01024627685546875,
0.027008056640625,
-0.0032138824462890625,
-0.0389404296875,
-0.02557373046875,
0.0626220703125,
-0.07135009765625,
-0.01239776611328125,
-0.01085662841796875,
-0.044952392578125,
0.010528564453125,
0.0297698974609375,
0.0242156982421875,
0.0233001708984375,
-0.017547607421875,
0.006092071533203125,
0.025299072265625,
0.0211181640625,
0.035552978515625,
0.02886962890625,
-0.0247650146484375,
-0.045867919921875,
0.0548095703125,
-0.0187225341796875,
0.01523590087890625,
0.043426513671875,
0.0252838134765625,
0.0027790069580078125,
-0.046112060546875,
-0.0416259765625,
0.03753662109375,
-0.0170135498046875,
-0.0257415771484375,
-0.03179931640625,
-0.0178375244140625,
-0.046051025390625,
-0.0115203857421875,
-0.031280517578125,
-0.0060577392578125,
-0.034271240234375,
-0.0038585662841796875,
0.06597900390625,
0.03436279296875,
-0.0218505859375,
0.06890869140625,
-0.06549072265625,
0.00501251220703125,
0.0232391357421875,
0.0245819091796875,
0.01433563232421875,
-0.036590576171875,
-0.01123809814453125,
0.0052642822265625,
-0.039642333984375,
-0.056915283203125,
0.0231170654296875,
0.0193634033203125,
0.0259552001953125,
0.052459716796875,
-0.03155517578125,
0.07464599609375,
-0.048858642578125,
0.046661376953125,
0.03076171875,
-0.043304443359375,
0.006214141845703125,
-0.0203399658203125,
0.032501220703125,
0.0599365234375,
0.0252532958984375,
-0.0227203369140625,
0.005146026611328125,
-0.057159423828125,
-0.05419921875,
0.03985595703125,
0.0295257568359375,
0.006404876708984375,
0.0071258544921875,
0.047882080078125,
-0.0030727386474609375,
0.0034618377685546875,
-0.0291748046875,
-0.0278778076171875,
-0.004962921142578125,
-0.024627685546875,
-0.0240325927734375,
-0.05670166015625,
-0.00968170166015625,
-0.00862884521484375,
0.060943603515625,
-0.0081634521484375,
0.0298614501953125,
0.037750244140625,
-0.0122528076171875,
-0.0162506103515625,
-0.032501220703125,
0.0305938720703125,
0.0242156982421875,
-0.03228759765625,
-0.0105743408203125,
0.01476287841796875,
-0.064453125,
-0.0296173095703125,
0.008270263671875,
-0.02728271484375,
-0.00891876220703125,
0.0291595458984375,
0.0765380859375,
0.022552490234375,
-0.0291595458984375,
0.048248291015625,
0.00597381591796875,
-0.0099029541015625,
-0.0256195068359375,
0.00650787353515625,
0.006137847900390625,
0.036895751953125,
0.012939453125,
0.0293426513671875,
0.031158447265625,
-0.01959228515625,
0.041412353515625,
0.044586181640625,
-0.046356201171875,
-0.0283355712890625,
0.06329345703125,
0.01076507568359375,
-0.03533935546875,
0.051727294921875,
-0.020294189453125,
-0.01092529296875,
0.0300445556640625,
0.033935546875,
0.0654296875,
-0.0022792816162109375,
0.01038360595703125,
0.053009033203125,
0.00463104248046875,
-0.0092010498046875,
0.0287017822265625,
-0.037261962890625,
-0.05419921875,
-0.057708740234375,
-0.0675048828125,
-0.02001953125,
-0.009246826171875,
-0.058746337890625,
0.0217132568359375,
-0.039703369140625,
-0.027679443359375,
0.0108642578125,
-0.01250457763671875,
-0.048828125,
0.01203155517578125,
0.037872314453125,
0.10052490234375,
-0.060089111328125,
0.0298614501953125,
0.040191650390625,
-0.019927978515625,
-0.046295166015625,
-0.0299072265625,
-0.0009889602661132812,
-0.0794677734375,
0.016998291015625,
0.011474609375,
0.01428985595703125,
-0.00408935546875,
-0.024627685546875,
-0.07000732421875,
0.09393310546875,
0.025054931640625,
-0.0304412841796875,
-0.005558013916015625,
-0.004474639892578125,
0.027496337890625,
-0.00655364990234375,
0.057647705078125,
0.0264739990234375,
0.0160675048828125,
0.046356201171875,
-0.0308685302734375,
-0.013275146484375,
-0.01407623291015625,
0.0140380859375,
0.01015472412109375,
-0.07318115234375,
0.07940673828125,
-0.0087432861328125,
-0.005054473876953125,
0.0278778076171875,
0.08697509765625,
0.0183258056640625,
0.01082611083984375,
0.048187255859375,
0.09405517578125,
0.02117919921875,
-0.02069091796875,
0.08087158203125,
-0.044158935546875,
0.0679931640625,
0.07879638671875,
-0.011810302734375,
0.05242919921875,
0.037811279296875,
-0.044677734375,
0.037811279296875,
0.036041259765625,
-0.027801513671875,
0.0247039794921875,
0.00408935546875,
-0.0251007080078125,
-0.00815582275390625,
0.01953125,
-0.04364013671875,
0.0072784423828125,
0.0026149749755859375,
-0.033538818359375,
-0.036834716796875,
-0.01427459716796875,
0.004711151123046875,
-0.0271759033203125,
-0.034759521484375,
0.032867431640625,
-0.00537109375,
-0.04742431640625,
0.00376129150390625,
-0.004993438720703125,
0.059844970703125,
-0.054595947265625,
0.0026035308837890625,
-0.0008873939514160156,
0.046661376953125,
-0.01546478271484375,
-0.058319091796875,
0.039764404296875,
-0.0088958740234375,
-0.0322265625,
-0.0229339599609375,
0.048248291015625,
-0.00643157958984375,
-0.06597900390625,
0.01132965087890625,
0.0002104043960571289,
0.0211639404296875,
-0.0227203369140625,
-0.06500244140625,
-0.00018703937530517578,
-0.0249481201171875,
-0.0167236328125,
-0.0024566650390625,
-0.01197052001953125,
0.0408935546875,
0.0292205810546875,
0.049896240234375,
0.005855560302734375,
0.01358795166015625,
0.032958984375,
0.04693603515625,
-0.0227203369140625,
-0.032958984375,
-0.06170654296875,
0.0203094482421875,
-0.037750244140625,
-0.047760009765625,
0.07666015625,
0.07366943359375,
0.041290283203125,
-0.02734375,
0.00891876220703125,
0.0036907196044921875,
0.0241241455078125,
-0.0264739990234375,
0.046142578125,
-0.056884765625,
-0.03131103515625,
-0.038116455078125,
-0.06707763671875,
-0.01654052734375,
0.041778564453125,
-0.01003265380859375,
0.029754638671875,
0.03155517578125,
0.05340576171875,
-0.01345062255859375,
0.0266876220703125,
0.00888824462890625,
0.00838470458984375,
0.0153656005859375,
0.031097412109375,
0.0194854736328125,
-0.04351806640625,
0.01253509521484375,
-0.03424072265625,
-0.0226898193359375,
-0.0084686279296875,
-0.06011962890625,
-0.0306396484375,
-0.0203399658203125,
-0.039031982421875,
-0.036712646484375,
-0.01374053955078125,
0.0380859375,
0.09716796875,
-0.06787109375,
-0.00031876564025878906,
-0.05078125,
-0.010467529296875,
-0.035186767578125,
-0.021240234375,
0.0161285400390625,
-0.0294036865234375,
-0.07757568359375,
0.01436614990234375,
0.01666259765625,
0.0196533203125,
-0.0212554931640625,
0.0016450881958007812,
0.0023345947265625,
-0.0224151611328125,
0.0218963623046875,
0.0187835693359375,
-0.004138946533203125,
-0.01439666748046875,
-0.006786346435546875,
0.00988006591796875,
0.004634857177734375,
0.075927734375,
-0.043487548828125,
0.0157470703125,
0.046112060546875,
0.0577392578125,
0.07696533203125,
0.0022029876708984375,
0.025421142578125,
-0.0254974365234375,
0.0262298583984375,
0.0059967041015625,
0.053436279296875,
0.0205535888671875,
-0.0209197998046875,
0.035308837890625,
0.046539306640625,
-0.041595458984375,
-0.045135498046875,
0.01531982421875,
-0.0806884765625,
-0.00865936279296875,
0.08636474609375,
-0.0028972625732421875,
-0.053375244140625,
0.060211181640625,
-0.006275177001953125,
0.01157379150390625,
-0.0179595947265625,
0.035980224609375,
0.020538330078125,
0.034698486328125,
-0.0057220458984375,
-0.01168060302734375,
0.0235443115234375,
0.045318603515625,
-0.0261993408203125,
-0.0163726806640625,
0.00121307373046875,
0.017974853515625,
0.0318603515625,
0.02374267578125,
-0.0177154541015625,
0.0035419464111328125,
0.0254058837890625,
0.0283355712890625,
-0.0023441314697265625,
-0.005611419677734375,
-0.01392364501953125,
0.00997161865234375,
-0.006221771240234375,
-0.0384521484375
]
] |
budecosystem/boomer-1b | 2023-10-09T15:16:19.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"en",
"license:apache-2.0",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text-generation | budecosystem | null | null | budecosystem/boomer-1b | 2 | 7,093 | transformers | 2023-10-03T17:33:10 | ---
license: apache-2.0
language:
- en
library_name: transformers
---
<div align="center"><img src="https://accubits-assests.s3.ap-south-1.amazonaws.com/boomer/Boomer-Png.png" width=200></div>
<p align="center"><i>Democratizing access to LLMs for the open-source community.<br>Let's advance AI, together. </i></p>
----
## Introduction 🎉
We are open-sourcing one of our early experiments in pretraining with a custom architecture and datasets. This 1.1B-parameter model is pre-trained from scratch on a custom-curated dataset of 41B tokens. The architectural experiments include the addition of flash attention and a larger intermediate dimension in the MLP layer. The dataset is a combination of wiki, stories, arXiv, math, and code data. The model is available on Hugging Face as [Boomer1B](https://huggingface.co/budecosystem/boomer-1b).
<div align="center"><img src="https://accubits-assests.s3.ap-south-1.amazonaws.com/boomer/boomer-arch.jpg" width=500></div>
## Getting Started on GitHub 💻
Ready to dive in? Here's how you can get started with our models on GitHub.
Install the necessary dependencies with the following command:
```bash
pip install -r requirements.txt
```
### Generate responses
You can generate responses using our generate.py script, which loads the model from the Hugging Face model hub and runs inference on a specified input. Here's an example of usage:
```bash
python generate.py --base_model 'budecosystem/boomer-1b' --prompt="the president of India is"
```
### Fine-tuning 🎯
You can further improve the model by fine-tuning it on your own data using our provided training script. Here's an example command:
```bash
torchrun --nproc_per_node 4 train.py \
--base_model budecosystem/boomer-1b \
--data_path dataset.json \
--output_dir output \
--per_device_train_batch_size 2 \
--gradient_accumulation_steps 2 \
--num_train_epochs 1 \
--learning_rate 2e-5 \
--fp16 True \
--logging_steps 10 \
--deepspeed ds_config.json
```
## Model details
| Parameters | Value |
| :------------- | :----: |
| n_layers | 4 |
| n_heads | 32 |
| d_model | 4096 |
| vocab size | 32000 |
| sequence length | 4096 |
| Intermediate size | 11008 |
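As a sanity check, the dimensions in the table roughly reproduce the stated 1.1B parameter count. The back-of-the-envelope estimate below assumes a LLaMA-style layer (untied embedding and LM head, standard Q/K/V/O attention projections, and a gated three-matrix MLP) — an assumption, since the card does not spell out the layer layout:

```python
n_layers, d_model, vocab, d_ff = 4, 4096, 32000, 11008

embed = vocab * d_model          # token embeddings
lm_head = vocab * d_model        # output projection (assumed untied)
attn = 4 * d_model * d_model     # Q, K, V, O projections per layer
mlp = 3 * d_model * d_ff         # gate, up, down projections (LLaMA-style)
per_layer = attn + mlp

total = embed + lm_head + n_layers * per_layer
print(f"{total / 1e9:.2f}B parameters")  # ≈ 1.07B, consistent with "1.1B"
```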
### Tokenizer
We used the SentencePiece tokenizer during training. This tokenizer is known for its ability to handle open-vocabulary language tasks efficiently.
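The subword splitting that makes open-vocabulary handling possible can be sketched with a toy greedy longest-match tokenizer over a tiny hand-picked vocabulary. This only illustrates the idea; the real SentencePiece tokenizer learns its 32000-piece vocabulary from data using BPE/unigram segmentation:

```python
# Toy greedy longest-match subword tokenizer (illustration only).
def greedy_tokenize(text, vocab):
    tokens, i = [], 0
    while i < len(text):
        for j in range(len(text), i, -1):   # try the longest piece first
            if text[i:j] in vocab:
                tokens.append(text[i:j])
                i = j
                break
        else:
            tokens.append(text[i])          # unknown: fall back to one char
            i += 1
    return tokens

vocab = {"token", "iz", "ation"}
print(greedy_tokenize("tokenization", vocab))  # ['token', 'iz', 'ation']
```

Because unseen words fall back to smaller pieces (down to single characters), no input ever maps to an out-of-vocabulary token.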
### Training details
The model was trained on 4 A100 80GB GPUs for approximately 250 hours.
| Hyperparameters | Value |
| :----------------------------| :-----: |
| per_device_train_batch_size | 2 |
| gradient_accumulation_steps | 2 |
| learning_rate | 2e-4 |
| optimizer | adamw |
| beta | 0.9, 0.95 |
| fp16 | True |
| GPU | 4 A100 80GB |
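For reference, the effective global batch size implied by these settings (assuming one optimizer update per full accumulation cycle, and sequences packed to the full 4096-token length) works out to:

```python
per_device_batch = 2
grad_accum_steps = 2
n_gpus = 4
seq_len = 4096

global_batch = per_device_batch * grad_accum_steps * n_gpus
tokens_per_step = global_batch * seq_len

print(global_batch)     # 16 sequences per optimizer step
print(tokens_per_step)  # 65536 tokens per step at full sequence length
```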
## Evaluations
We have evaluated the pre-trained model on a few standard benchmarks:
| Model Name | ARC | MMLU | Human Eval | Hellaswag | BBH | DROP | GSM8K |
|:----------:|:--------:|:----:|:----------:|:---------:|:-----: |:-----:|:----:|
| Boomer1B | 22.35 | 25.92| 6.1 | 31.66 | 28.65 | 6.13 | 1.5 |
### Why use BOOMER?
- Retrieval augmentation
- Inference at the edge
- Language modeling use cases
### Final thought on Boomer!
This isn't the end. It's just the beginning of a journey towards creating more advanced, more efficient, and more accessible language models. We invite you to join us on this exciting journey.
### Acknowledgements
We'd like to thank the open-source community and the researchers whose foundational work laid the path for BOOMER. Special shoutout to our dedicated team who have worked relentlessly to curate the dataset and fine-tune the model to perfection. | 3,878 | [
[
-0.04815673828125,
-0.0660400390625,
0.005550384521484375,
0.00794219970703125,
-0.01318359375,
-0.024078369140625,
-0.018890380859375,
-0.0294189453125,
-0.003902435302734375,
0.0257568359375,
-0.0650634765625,
-0.0343017578125,
-0.044525146484375,
-0.0088958740234375,
-0.0112152099609375,
0.08978271484375,
0.0223541259765625,
-0.0024166107177734375,
0.003265380859375,
0.013916015625,
-0.025146484375,
-0.024017333984375,
-0.053192138671875,
-0.0256195068359375,
0.0251007080078125,
0.0182952880859375,
0.054168701171875,
0.037567138671875,
0.03387451171875,
0.0248565673828125,
-0.0180511474609375,
-0.0134124755859375,
-0.04669189453125,
-0.01030731201171875,
0.008209228515625,
-0.0210418701171875,
-0.04248046875,
0.00920867919921875,
0.045684814453125,
0.043670654296875,
-0.0023059844970703125,
0.0248870849609375,
-0.004619598388671875,
0.049713134765625,
-0.0364990234375,
0.00435638427734375,
-0.043731689453125,
0.00028705596923828125,
-0.0139312744140625,
-0.00601959228515625,
-0.020599365234375,
-0.0286407470703125,
0.01227569580078125,
-0.04193115234375,
0.0228271484375,
0.00923919677734375,
0.0924072265625,
0.030242919921875,
-0.019317626953125,
0.0018091201782226562,
-0.04949951171875,
0.055023193359375,
-0.06866455078125,
0.034423828125,
0.0220947265625,
0.025146484375,
-0.0138702392578125,
-0.060089111328125,
-0.051483154296875,
-0.02264404296875,
0.0001081228256225586,
0.003978729248046875,
-0.0239715576171875,
0.0133056640625,
0.02545166015625,
0.048004150390625,
-0.053314208984375,
0.0195465087890625,
-0.04925537109375,
-0.0142974853515625,
0.04998779296875,
-0.0003745555877685547,
0.01125335693359375,
-0.0125274658203125,
-0.042694091796875,
-0.018798828125,
-0.034027099609375,
0.01280975341796875,
0.024139404296875,
0.03277587890625,
-0.0269927978515625,
0.038726806640625,
-0.0253753662109375,
0.056121826171875,
0.014068603515625,
0.0032672882080078125,
0.041412353515625,
-0.0187225341796875,
-0.030792236328125,
-0.020416259765625,
0.079833984375,
0.01702880859375,
0.01702880859375,
0.00327301025390625,
-0.017669677734375,
-0.01493072509765625,
0.0154876708984375,
-0.08154296875,
-0.006683349609375,
0.032928466796875,
-0.037841796875,
-0.0267791748046875,
-0.0000858306884765625,
-0.052276611328125,
0.0034236907958984375,
-0.0122222900390625,
0.047088623046875,
-0.054901123046875,
-0.029388427734375,
0.019012451171875,
-0.007015228271484375,
0.03631591796875,
0.007442474365234375,
-0.0733642578125,
0.00693511962890625,
0.043365478515625,
0.05999755859375,
0.00977325439453125,
-0.0297088623046875,
-0.0180206298828125,
-0.0197906494140625,
-0.01525115966796875,
0.04718017578125,
-0.025146484375,
-0.02801513671875,
-0.007617950439453125,
-0.0007352828979492188,
-0.018157958984375,
-0.038360595703125,
0.035369873046875,
-0.047210693359375,
0.04229736328125,
0.00460052490234375,
-0.05596923828125,
-0.0173492431640625,
0.0170440673828125,
-0.036712646484375,
0.09783935546875,
0.0183258056640625,
-0.056793212890625,
0.0258331298828125,
-0.062744140625,
-0.01654052734375,
0.00811004638671875,
-0.00278472900390625,
-0.04168701171875,
0.0018606185913085938,
0.022857666015625,
0.044036865234375,
-0.010040283203125,
0.0135345458984375,
-0.0232391357421875,
-0.032257080078125,
0.01380157470703125,
-0.0219268798828125,
0.08111572265625,
0.0197601318359375,
-0.0308990478515625,
0.0194549560546875,
-0.0750732421875,
0.002971649169921875,
0.0219879150390625,
-0.0280303955078125,
0.0174713134765625,
-0.0229644775390625,
0.0203704833984375,
0.0287017822265625,
0.026092529296875,
-0.030029296875,
0.01568603515625,
-0.042938232421875,
0.0231170654296875,
0.0572509765625,
0.000029087066650390625,
0.020965576171875,
-0.030426025390625,
0.049224853515625,
0.0128936767578125,
0.023712158203125,
-0.0271759033203125,
-0.03668212890625,
-0.060089111328125,
-0.013275146484375,
0.0276031494140625,
0.030120849609375,
-0.01276397705078125,
0.056793212890625,
-0.0244140625,
-0.046630859375,
-0.0494384765625,
-0.002910614013671875,
0.0284271240234375,
0.04058837890625,
0.036651611328125,
-0.0283966064453125,
-0.03924560546875,
-0.063720703125,
-0.0031757354736328125,
-0.0142364501953125,
-0.0170135498046875,
0.0305328369140625,
0.039703369140625,
-0.017913818359375,
0.0716552734375,
-0.04083251953125,
-0.0242919921875,
-0.021514892578125,
0.003925323486328125,
0.024749755859375,
0.05291748046875,
0.039306640625,
-0.04852294921875,
-0.01306915283203125,
-0.0148162841796875,
-0.041900634765625,
0.009429931640625,
0.000823974609375,
-0.0145111083984375,
0.00943756103515625,
0.045684814453125,
-0.061004638671875,
0.0162811279296875,
0.0360107421875,
-0.030426025390625,
0.058319091796875,
-0.0037078857421875,
-0.00754547119140625,
-0.09503173828125,
0.02972412109375,
0.0040283203125,
-0.00553131103515625,
-0.043182373046875,
0.005153656005859375,
-0.015472412109375,
-0.01470947265625,
-0.043212890625,
0.05047607421875,
-0.028594970703125,
0.005435943603515625,
-0.01029205322265625,
-0.0009007453918457031,
-0.011322021484375,
0.043060302734375,
-0.0052490234375,
0.0562744140625,
0.05877685546875,
-0.04229736328125,
0.0016498565673828125,
0.0313720703125,
-0.041534423828125,
0.0218353271484375,
-0.06658935546875,
0.00634002685546875,
-0.0134429931640625,
0.02239990234375,
-0.0697021484375,
-0.0233917236328125,
0.028472900390625,
-0.049713134765625,
0.031951904296875,
-0.014862060546875,
-0.020294189453125,
-0.0418701171875,
-0.01678466796875,
0.00931549072265625,
0.05804443359375,
-0.0297088623046875,
0.034332275390625,
0.020355224609375,
-0.007488250732421875,
-0.05413818359375,
-0.0390625,
-0.0102386474609375,
-0.0150909423828125,
-0.05029296875,
0.032318115234375,
-0.013824462890625,
0.0028076171875,
-0.01259613037109375,
0.0143585205078125,
-0.003208160400390625,
0.0028133392333984375,
0.022735595703125,
0.036224365234375,
-0.0310516357421875,
0.00724029541015625,
-0.00473785400390625,
-0.0175933837890625,
-0.001659393310546875,
-0.0119781494140625,
0.05291748046875,
-0.01898193359375,
-0.012451171875,
-0.049041748046875,
-0.0004711151123046875,
0.046234130859375,
-0.030975341796875,
0.0538330078125,
0.0740966796875,
-0.025054931640625,
-0.003143310546875,
-0.0548095703125,
-0.0306854248046875,
-0.040252685546875,
0.039459228515625,
-0.026519775390625,
-0.07012939453125,
0.033966064453125,
-0.004116058349609375,
0.01419830322265625,
0.04571533203125,
0.04827880859375,
0.0096893310546875,
0.07275390625,
0.06158447265625,
-0.018798828125,
0.051422119140625,
-0.031280517578125,
0.0007128715515136719,
-0.06048583984375,
-0.02020263671875,
-0.0290679931640625,
-0.0184478759765625,
-0.05224609375,
-0.02587890625,
0.0181732177734375,
0.01129150390625,
-0.044677734375,
0.0170440673828125,
-0.041412353515625,
0.0244598388671875,
0.043670654296875,
0.0117034912109375,
0.004871368408203125,
-0.01509857177734375,
0.00411224365234375,
0.01495361328125,
-0.054901123046875,
-0.03826904296875,
0.0772705078125,
0.04058837890625,
0.03631591796875,
-0.0183258056640625,
0.04718017578125,
-0.007053375244140625,
0.017425537109375,
-0.0509033203125,
0.04705810546875,
0.005641937255859375,
-0.07659912109375,
-0.01715087890625,
-0.0294189453125,
-0.07012939453125,
0.0275115966796875,
-0.022216796875,
-0.061004638671875,
0.01019287109375,
0.021240234375,
-0.04205322265625,
0.01861572265625,
-0.044677734375,
0.09552001953125,
-0.01334381103515625,
-0.0141754150390625,
-0.0127105712890625,
-0.07318115234375,
0.035064697265625,
0.0176239013671875,
0.006893157958984375,
-0.02850341796875,
0.006298065185546875,
0.061370849609375,
-0.032928466796875,
0.06781005859375,
-0.0215606689453125,
0.0140228271484375,
0.020294189453125,
-0.01528167724609375,
0.0163421630859375,
-0.01983642578125,
-0.0135040283203125,
0.022430419921875,
-0.005458831787109375,
-0.033447265625,
-0.020111083984375,
0.039703369140625,
-0.0701904296875,
-0.03582763671875,
-0.0277862548828125,
-0.043609619140625,
-0.007293701171875,
0.03515625,
0.027923583984375,
0.0196075439453125,
-0.0283050537109375,
0.00984954833984375,
0.035736083984375,
-0.0192718505859375,
0.035614013671875,
0.03448486328125,
-0.03240966796875,
-0.0240478515625,
0.061492919921875,
-0.002155303955078125,
0.015960693359375,
0.015869140625,
0.019744873046875,
-0.024627685546875,
-0.0458984375,
-0.03961181640625,
0.026641845703125,
-0.039825439453125,
-0.0101318359375,
-0.056884765625,
-0.0262603759765625,
-0.0386962890625,
-0.002307891845703125,
-0.037322998046875,
-0.0236968994140625,
-0.0390625,
-0.007671356201171875,
0.034393310546875,
0.04656982421875,
0.003681182861328125,
0.0394287109375,
-0.045684814453125,
0.005924224853515625,
0.01421356201171875,
0.031768798828125,
0.00209808349609375,
-0.068115234375,
-0.0153961181640625,
0.0019006729125976562,
-0.03839111328125,
-0.04217529296875,
0.024017333984375,
0.0251312255859375,
0.0265655517578125,
0.0435791015625,
-0.009033203125,
0.0511474609375,
-0.039154052734375,
0.0684814453125,
0.004810333251953125,
-0.052337646484375,
0.05218505859375,
-0.029541015625,
0.03363037109375,
0.044769287109375,
0.0499267578125,
-0.0239410400390625,
-0.022216796875,
-0.06463623046875,
-0.0689697265625,
0.062286376953125,
0.0030364990234375,
0.0205841064453125,
0.005870819091796875,
0.0257568359375,
0.00402069091796875,
0.01082611083984375,
-0.061767578125,
-0.0254669189453125,
-0.016387939453125,
-0.007442474365234375,
-0.00814056396484375,
-0.00928497314453125,
0.006717681884765625,
-0.031158447265625,
0.06842041015625,
-0.005237579345703125,
0.013824462890625,
-0.006496429443359375,
-0.00035881996154785156,
-0.006801605224609375,
0.0167694091796875,
0.03057861328125,
0.041900634765625,
-0.0355224609375,
-0.01439666748046875,
0.0111236572265625,
-0.030426025390625,
-0.006160736083984375,
0.03619384765625,
-0.0213165283203125,
-0.003017425537109375,
0.0128936767578125,
0.087158203125,
0.025146484375,
-0.05303955078125,
0.04632568359375,
0.0064849853515625,
-0.007762908935546875,
-0.0252838134765625,
0.00812530517578125,
0.01406097412109375,
0.019439697265625,
0.030181884765625,
0.00640106201171875,
-0.0024738311767578125,
-0.0264739990234375,
0.0182342529296875,
0.0452880859375,
-0.02825927734375,
-0.033538818359375,
0.06561279296875,
0.0035381317138671875,
-0.0297088623046875,
0.05267333984375,
-0.0034503936767578125,
-0.03570556640625,
0.03863525390625,
0.04400634765625,
0.06298828125,
-0.0294342041015625,
0.0096588134765625,
0.048858642578125,
0.0167388916015625,
-0.001499176025390625,
0.0161590576171875,
-0.0009708404541015625,
-0.052520751953125,
-0.022857666015625,
-0.072509765625,
-0.016998291015625,
0.0297088623046875,
-0.051025390625,
0.03912353515625,
-0.026885986328125,
-0.0214385986328125,
0.01439666748046875,
0.00103759765625,
-0.0634765625,
0.037384033203125,
0.011474609375,
0.0782470703125,
-0.057373046875,
0.052947998046875,
0.04229736328125,
-0.0604248046875,
-0.05535888671875,
-0.0023288726806640625,
-0.01032257080078125,
-0.0721435546875,
0.035308837890625,
0.038299560546875,
0.007419586181640625,
0.028045654296875,
-0.043212890625,
-0.069580078125,
0.0850830078125,
0.0217437744140625,
-0.03985595703125,
-0.01480865478515625,
0.0004076957702636719,
0.037109375,
-0.0178070068359375,
0.046966552734375,
0.041412353515625,
0.0269775390625,
0.0067901611328125,
-0.07122802734375,
0.00922393798828125,
-0.02557373046875,
0.0082550048828125,
0.00012934207916259766,
-0.057281494140625,
0.07049560546875,
-0.03253173828125,
-0.018798828125,
-0.00215911865234375,
0.063232421875,
0.0204925537109375,
0.00836181640625,
0.04339599609375,
0.058319091796875,
0.042144775390625,
0.0063934326171875,
0.08563232421875,
-0.03863525390625,
0.035736083984375,
0.06689453125,
0.01470947265625,
0.0751953125,
0.0254974365234375,
-0.044586181640625,
0.0423583984375,
0.06329345703125,
-0.00537872314453125,
0.018768310546875,
0.0080413818359375,
-0.0157318115234375,
-0.01247406005859375,
0.0009746551513671875,
-0.045623779296875,
0.0305633544921875,
0.0050506591796875,
-0.027923583984375,
-0.00531005859375,
0.00151824951171875,
0.01244354248046875,
-0.0302734375,
-0.00989532470703125,
0.05755615234375,
0.00273895263671875,
-0.047210693359375,
0.06512451171875,
-0.005218505859375,
0.06756591796875,
-0.056884765625,
0.01885986328125,
-0.0096588134765625,
0.0229034423828125,
-0.0139923095703125,
-0.035552978515625,
0.00749969482421875,
-0.0195770263671875,
-0.011932373046875,
-0.007427215576171875,
0.049102783203125,
-0.035919189453125,
-0.052337646484375,
0.0143890380859375,
0.0404052734375,
0.024169921875,
-0.006160736083984375,
-0.0714111328125,
0.018707275390625,
-0.0054473876953125,
-0.05401611328125,
0.028900146484375,
0.01434326171875,
0.0283966064453125,
0.053375244140625,
0.052520751953125,
0.000522613525390625,
0.0199127197265625,
0.01049041748046875,
0.08221435546875,
-0.04656982421875,
-0.03204345703125,
-0.05621337890625,
0.05743408203125,
0.0017299652099609375,
-0.05657958984375,
0.053924560546875,
0.046783447265625,
0.08282470703125,
-0.006618499755859375,
0.05242919921875,
-0.0204315185546875,
0.0361328125,
-0.0207672119140625,
0.0570068359375,
-0.05047607421875,
0.0008873939514160156,
-0.0143890380859375,
-0.062469482421875,
-0.01531219482421875,
0.05401611328125,
-0.02728271484375,
0.034423828125,
0.02587890625,
0.08343505859375,
-0.0171661376953125,
0.0031490325927734375,
0.0127410888671875,
0.0284271240234375,
0.0150909423828125,
0.03314208984375,
0.040069580078125,
-0.04791259765625,
0.04632568359375,
-0.0321044921875,
-0.03253173828125,
-0.0237274169921875,
-0.049346923828125,
-0.06329345703125,
-0.051513671875,
-0.02813720703125,
-0.0230560302734375,
0.0005674362182617188,
0.08203125,
0.061187744140625,
-0.0565185546875,
-0.019012451171875,
-0.0102996826171875,
-0.016387939453125,
-0.01806640625,
-0.0215301513671875,
0.0262451171875,
-0.038909912109375,
-0.060577392578125,
0.0283355712890625,
-0.005023956298828125,
0.01222991943359375,
-0.03216552734375,
-0.0044403076171875,
-0.027862548828125,
-0.00019407272338867188,
0.03656005859375,
0.0261383056640625,
-0.05267333984375,
-0.01116943359375,
-0.0019245147705078125,
0.00035381317138671875,
0.019317626953125,
0.041717529296875,
-0.06121826171875,
0.0285491943359375,
0.0225982666015625,
0.037078857421875,
0.056884765625,
0.00644683837890625,
0.0305633544921875,
-0.044769287109375,
0.018280029296875,
0.022979736328125,
0.0184173583984375,
0.0308380126953125,
-0.016082763671875,
0.02099609375,
0.01052093505859375,
-0.057708740234375,
-0.06768798828125,
0.009765625,
-0.0750732421875,
-0.0182952880859375,
0.07733154296875,
-0.005619049072265625,
-0.0225830078125,
0.01340484619140625,
-0.01259613037109375,
0.0279083251953125,
-0.021942138671875,
0.06292724609375,
0.0308685302734375,
-0.01303863525390625,
-0.0180206298828125,
-0.04150390625,
0.04241943359375,
0.03802490234375,
-0.047515869140625,
-0.0014123916625976562,
0.0372314453125,
0.035797119140625,
0.004169464111328125,
0.06341552734375,
-0.01047515869140625,
0.01340484619140625,
0.009552001953125,
0.02618408203125,
-0.01529693603515625,
-0.0027923583984375,
-0.035308837890625,
0.01520538330078125,
-0.001739501953125,
-0.0267181396484375
]
] |
openaccess-ai-collective/jackalope-7b | 2023-10-12T08:21:34.000Z | [
"transformers",
"pytorch",
"mistral",
"text-generation",
"en",
"dataset:Open-Orca/OpenOrca",
"dataset:LDJnr/LessWrong-Amplify-Instruct",
"dataset:LDJnr/Pure-Dove",
"dataset:LDJnr/Verified-Camel",
"dataset:PygmalionAI/PIPPA",
"dataset:meta-math/MetaMathQA",
"dataset:riddle_sense",
"arxiv:2306.02707",
"arxiv:2301.13688",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | openaccess-ai-collective | null | null | openaccess-ai-collective/jackalope-7b | 24 | 7,092 | transformers | 2023-10-07T16:08:42 | ---
datasets:
- Open-Orca/OpenOrca
- LDJnr/LessWrong-Amplify-Instruct
- LDJnr/Pure-Dove
- LDJnr/Verified-Camel
- PygmalionAI/PIPPA
- meta-math/MetaMathQA
- riddle_sense
language:
- en
library_name: transformers
pipeline_tag: text-generation
license: apache-2.0
---
<p><h1>🐰🦌 Jackalope 7B 🐰🦌</h1></p>

[<img src="https://raw.githubusercontent.com/OpenAccess-AI-Collective/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/OpenAccess-AI-Collective/axolotl)
# Jackalope 7B
We have used the [SlimOrca dataset](https://huggingface.co/datasets/Open-Orca/SlimOrca), PIPPA, and various other open datasets
to fine-tune on top of [Mistral 7B](https://huggingface.co/mistralai/Mistral-7B-v0.1).
This dataset is our attempt to reproduce the dataset generated for Microsoft Research's [Orca Paper](https://arxiv.org/abs/2306.02707).
We use [OpenChat](https://huggingface.co/openchat) packing, trained with [Axolotl](https://github.com/OpenAccess-AI-Collective/axolotl).
This release highlights the efficiency of SlimOrca while improving the model's multi-turn chat ability.
HF Leaderboard evals place this model only slightly below the MistralOrca release, a reasonable
tradeoff for a more general model that can handle multi-turn chat.
If you'd like to try the model now, we have it running on fast GPUs unquantized: https://huggingface.co/spaces/openaccess-ai-collective/jackalope-7b
Join the OpenAccess AI Collective Discord for more information about the Axolotl trainer and other OAAIC models here:
https://discord.gg/5y8STgB3P3
Also join the AlignmentLab Discord for sneak-peek announcements:
https://AlignmentLab.ai
# Quantized Models
Quantized versions of this model are generously made available by [TheBloke](https://huggingface.co/TheBloke).
- AWQ: https://huggingface.co/TheBloke/Jackalope-7B-AWQ
- GPTQ: https://huggingface.co/TheBloke/Jackalope-7B-GPTQ
- GGUF: https://huggingface.co/TheBloke/Jackalope-7B-GGUF
# Prompt Template
We used [OpenAI's Chat Markup Language (ChatML)](https://github.com/openai/openai-python/blob/main/chatml.md) format, with `<|im_start|>` and `<|im_end|>` tokens added to support this.
This means that, e.g., in [oobabooga](https://github.com/oobabooga/text-generation-webui/) the "`MPT-Chat`" instruction template should work, as it also uses ChatML.
This formatting is also available via a pre-defined [Transformers chat template](https://huggingface.co/docs/transformers/main/chat_templating),
which means that lists of messages can be formatted for you with the `apply_chat_template()` method:
```python
chat = [
{"role": "system", "content": "You are JackalopeAI, a large language model trained by OpenAccess AI Collective. Write out your reasoning step-by-step to be sure you get the right answers!"}
{"role": "user", "content": "How are you?"},
{"role": "assistant", "content": "I am doing well!"},
{"role": "user", "content": "Please tell me about the mythical creatures called jackalopes."},
]
tokenizer.apply_chat_template(chat, tokenize=False, add_generation_prompt=True)
```
which will yield:
```
<|im_start|>system
You are JackalopeAI, a large language model trained by OpenAccess AI Collective. Write out your reasoning step-by-step to be sure you get the right answers!<|im_end|>
<|im_start|>user
How are you?<|im_end|>
<|im_start|>assistant
I am doing well!<|im_end|>
<|im_start|>user
Please tell me about the mythical creatures called jackalopes.<|im_end|>
<|im_start|>assistant
```
If you use `tokenize=True` and `return_tensors="pt"` instead, then you will get a tokenized
and formatted conversation ready to pass to `model.generate()`.
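If the Transformers chat-template machinery is unavailable, the ChatML layout shown above is simple enough to reproduce by hand. A minimal sketch (`chatml_format` is a hypothetical helper, not a library function, and may differ from the model's bundled template in whitespace details):

```python
# Minimal ChatML formatter mirroring the template shown above.
# `chatml_format` is a hypothetical helper, not part of any library;
# prefer tokenizer.apply_chat_template() when Transformers is available.
def chatml_format(messages, add_generation_prompt=True):
    out = ""
    for m in messages:
        out += f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n"
    if add_generation_prompt:
        # Open an assistant turn so the model continues from here.
        out += "<|im_start|>assistant\n"
    return out

chat = [
    {"role": "user", "content": "How are you?"},
    {"role": "assistant", "content": "I am doing well!"},
]
print(chatml_format(chat))
```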
# Evaluation
## HuggingFace Leaderboard Performance

| Metric | Value |
|-----------------------|--|
| MMLU (5-shot) | 63.63 |
| ARC (25-shot) | 63.31 |
| HellaSwag (10-shot) | 83.29 |
| TruthfulQA (0-shot) | 49.99 |
| Avg. | 65.06 |
We use [Language Model Evaluation Harness](https://github.com/EleutherAI/lm-evaluation-harness) to run the benchmark tests above, using the same version as the HuggingFace LLM Leaderboard.
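The reported average is simply the arithmetic mean of the four benchmark scores shown, which is easy to verify:

```python
# Verify that the reported "Avg." is the mean of the four scores above.
scores = {"MMLU": 63.63, "ARC": 63.31, "HellaSwag": 83.29, "TruthfulQA": 49.99}
avg = sum(scores.values()) / len(scores)
print(f"average = {avg:.3f}")  # the card reports 65.06
```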
# Dataset
We used a verified, curated, filtered selection of most of the GPT-4 augmented data from the OpenOrca dataset.
Additionally, we include multi-turn chat from PIPPA, various datasets
by LDJ from Nous Research, MetaMathQA, and Chain-of-Thought augmented data from the train split of RiddleSense.
- [Open-Orca/OpenOrca](https://huggingface.co/datasets/Open-Orca/OpenOrca)
- [LDJnr/LessWrong-Amplify-Instruct](https://huggingface.co/datasets/LDJnr/LessWrong-Amplify-Instruct)
- [LDJnr/Pure-Dove](https://huggingface.co/datasets/LDJnr/Pure-Dove)
- [LDJnr/Verified-Camel](https://huggingface.co/datasets/LDJnr/Verified-Camel)
- [PygmalionAI/PIPPA](https://huggingface.co/datasets/PygmalionAI/PIPPA)
- [meta-math/MetaMathQA](https://huggingface.co/datasets/meta-math/MetaMathQA)
- [riddle_sense](https://huggingface.co/datasets/riddle_sense)
# Training
We trained with 8x A6000 GPUs for 96 hours, completing 4 epochs of full fine-tuning on our dataset in one training run.
Commodity cost was ~$650.
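Those figures imply a commodity rate of roughly $0.85 per GPU-hour:

```python
# Implied commodity GPU-hour rate from the training figures above.
gpus, hours, cost_usd = 8, 96, 650
gpu_hours = gpus * hours           # 768 GPU-hours total
rate = cost_usd / gpu_hours
print(f"~${rate:.2f}/GPU-hour")    # roughly $0.85
```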
# Citation
```bibtex
@software{lian2023jackalope,
title = {Jackalope 7B: Mistral-7B Model Multi-Turn Chat tuned on Filtered OpenOrcaV1 GPT-4 Dataset},
author = {Wing Lian and Bleys Goodson and Guan Wang and Eugene Pentland and Austin Cook and Chanvichet Vong and "Teknium"},
year = {2023},
publisher = {HuggingFace},
journal = {HuggingFace repository},
  howpublished = {\url{openaccess-ai-collective/jackalope-7b}},
}
@misc{mukherjee2023orca,
title={Orca: Progressive Learning from Complex Explanation Traces of GPT-4},
author={Subhabrata Mukherjee and Arindam Mitra and Ganesh Jawahar and Sahaj Agarwal and Hamid Palangi and Ahmed Awadallah},
year={2023},
eprint={2306.02707},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
@misc{longpre2023flan,
title={The Flan Collection: Designing Data and Methods for Effective Instruction Tuning},
author={Shayne Longpre and Le Hou and Tu Vu and Albert Webson and Hyung Won Chung and Yi Tay and Denny Zhou and Quoc V. Le and Barret Zoph and Jason Wei and Adam Roberts},
year={2023},
eprint={2301.13688},
archivePrefix={arXiv},
primaryClass={cs.AI}
}
```
| 6,546 | [
[
-0.036590576171875,
-0.0618896484375,
-0.00580596923828125,
0.02001953125,
-0.005828857421875,
-0.00566864013671875,
-0.0275421142578125,
-0.049530029296875,
0.0266876220703125,
0.005596160888671875,
-0.040191650390625,
-0.035797119140625,
-0.03387451171875,
0.002288818359375,
0.00049591064453125,
0.0771484375,
0.00313568115234375,
-0.0224151611328125,
-0.0038127899169921875,
-0.0316162109375,
-0.041748046875,
-0.0423583984375,
-0.07537841796875,
-0.02618408203125,
0.033111572265625,
0.0153045654296875,
0.055755615234375,
0.042572021484375,
0.01418304443359375,
0.0270538330078125,
-0.0276336669921875,
0.027587890625,
-0.040679931640625,
-0.00572967529296875,
0.00894927978515625,
-0.041259765625,
-0.067138671875,
0.01026153564453125,
0.0179443359375,
0.0216827392578125,
-0.019622802734375,
0.0221405029296875,
0.0115814208984375,
0.023162841796875,
-0.043975830078125,
0.044647216796875,
-0.02984619140625,
-0.002689361572265625,
-0.01276397705078125,
0.01111602783203125,
-0.0184478759765625,
-0.029449462890625,
0.00969696044921875,
-0.0626220703125,
-0.007904052734375,
0.00726318359375,
0.08251953125,
-0.0028057098388671875,
-0.01093292236328125,
-0.023101806640625,
-0.0251007080078125,
0.0576171875,
-0.058258056640625,
0.0239410400390625,
0.0325927734375,
0.02276611328125,
-0.01904296875,
-0.05426025390625,
-0.05499267578125,
-0.00238037109375,
-0.006511688232421875,
0.027740478515625,
-0.0210113525390625,
-0.006816864013671875,
0.01485443115234375,
0.037811279296875,
-0.049560546875,
0.00910186767578125,
-0.037506103515625,
-0.0222320556640625,
0.056793212890625,
0.00019598007202148438,
0.02447509765625,
-0.00861358642578125,
-0.021087646484375,
-0.0306549072265625,
-0.0279083251953125,
0.0201263427734375,
0.0268402099609375,
0.0297088623046875,
-0.044403076171875,
0.038238525390625,
-0.0003592967987060547,
0.042724609375,
0.0090484619140625,
-0.037109375,
0.041259765625,
-0.026885986328125,
-0.0225982666015625,
-0.0072784423828125,
0.0838623046875,
0.0260467529296875,
0.007354736328125,
0.007610321044921875,
-0.00583648681640625,
0.007335662841796875,
-0.003620147705078125,
-0.056427001953125,
-0.0214385986328125,
0.03173828125,
-0.026947021484375,
-0.022979736328125,
-0.007236480712890625,
-0.05267333984375,
-0.018341064453125,
-0.005214691162109375,
0.03662109375,
-0.05975341796875,
-0.03961181640625,
0.0179595947265625,
-0.01496124267578125,
0.0265960693359375,
0.0303192138671875,
-0.06512451171875,
0.01194000244140625,
0.038909912109375,
0.083984375,
0.006011962890625,
-0.015716552734375,
-0.024932861328125,
-0.0205078125,
-0.01385498046875,
0.0404052734375,
-0.0231475830078125,
-0.01410675048828125,
-0.0106658935546875,
-0.0067291259765625,
-0.0203399658203125,
-0.033416748046875,
0.04449462890625,
-0.015045166015625,
0.051422119140625,
-0.020233154296875,
-0.0264892578125,
-0.0244293212890625,
0.0257415771484375,
-0.035980224609375,
0.0775146484375,
0.00809478759765625,
-0.061676025390625,
0.0204315185546875,
-0.0628662109375,
-0.01605224609375,
-0.0101776123046875,
-0.003997802734375,
-0.0450439453125,
-0.0228271484375,
0.044403076171875,
0.0305328369140625,
-0.0202789306640625,
-0.0034809112548828125,
-0.0274200439453125,
-0.0172882080078125,
0.0205841064453125,
-0.0143890380859375,
0.08056640625,
0.0205841064453125,
-0.04095458984375,
0.004756927490234375,
-0.046600341796875,
-0.004688262939453125,
0.02716064453125,
-0.0133209228515625,
-0.01947021484375,
-0.01654052734375,
-0.01285552978515625,
0.0218353271484375,
0.018218994140625,
-0.038818359375,
0.036773681640625,
-0.0268096923828125,
0.044281005859375,
0.056793212890625,
-0.01334381103515625,
0.0197296142578125,
-0.03729248046875,
0.040618896484375,
0.0096588134765625,
0.042388916015625,
-0.0200958251953125,
-0.06793212890625,
-0.0697021484375,
-0.032073974609375,
0.029388427734375,
0.03564453125,
-0.0645751953125,
0.050872802734375,
-0.0104217529296875,
-0.046630859375,
-0.053680419921875,
-0.00678253173828125,
0.04010009765625,
0.03521728515625,
0.0347900390625,
-0.050262451171875,
-0.0266571044921875,
-0.04656982421875,
-0.0006384849548339844,
-0.032684326171875,
-0.0099334716796875,
0.0271453857421875,
0.03424072265625,
-0.008270263671875,
0.071533203125,
-0.0316162109375,
-0.033782958984375,
-0.0214385986328125,
0.0035877227783203125,
0.02264404296875,
0.04510498046875,
0.0609130859375,
-0.044403076171875,
-0.0251007080078125,
-0.0013399124145507812,
-0.0631103515625,
0.00431060791015625,
-0.00948333740234375,
-0.025360107421875,
0.021087646484375,
0.0135498046875,
-0.065673828125,
0.037872314453125,
0.045196533203125,
-0.037078857421875,
0.0286407470703125,
0.004520416259765625,
0.00725555419921875,
-0.0802001953125,
0.015655517578125,
0.01161956787109375,
-0.0131378173828125,
-0.043060302734375,
0.005687713623046875,
-0.01129913330078125,
0.00238037109375,
-0.03314208984375,
0.0445556640625,
-0.0361328125,
0.01108551025390625,
0.00980377197265625,
0.032012939453125,
-0.004886627197265625,
0.06256103515625,
-0.01004791259765625,
0.053802490234375,
0.05157470703125,
-0.0404052734375,
0.031097412109375,
0.04669189453125,
-0.006988525390625,
0.038604736328125,
-0.06640625,
0.022064208984375,
-0.00847625732421875,
0.032257080078125,
-0.064208984375,
-0.0189361572265625,
0.053863525390625,
-0.053955078125,
0.02740478515625,
-0.019744873046875,
-0.03839111328125,
-0.035552978515625,
-0.034332275390625,
0.017242431640625,
0.05078125,
-0.051910400390625,
0.04840087890625,
0.025634765625,
0.007175445556640625,
-0.0509033203125,
-0.043304443359375,
-0.00726318359375,
-0.03497314453125,
-0.053497314453125,
0.0134429931640625,
-0.0071563720703125,
-0.0007996559143066406,
-0.0203704833984375,
-0.00412750244140625,
0.01142120361328125,
0.0010194778442382812,
0.03192138671875,
0.016143798828125,
-0.01561737060546875,
0.0031490325927734375,
-0.006195068359375,
0.0017452239990234375,
-0.0025882720947265625,
-0.034576416015625,
0.05999755859375,
-0.03692626953125,
-0.008575439453125,
-0.050201416015625,
-0.007717132568359375,
0.037109375,
-0.031951904296875,
0.08135986328125,
0.057464599609375,
-0.01715087890625,
0.010894775390625,
-0.036956787109375,
-0.0168304443359375,
-0.038665771484375,
-0.00022923946380615234,
-0.02899169921875,
-0.05584716796875,
0.052154541015625,
0.021728515625,
0.036956787109375,
0.04498291015625,
0.03802490234375,
0.01506805419921875,
0.07598876953125,
0.0299224853515625,
-0.01837158203125,
0.0355224609375,
-0.03912353515625,
0.004543304443359375,
-0.05548095703125,
-0.035369873046875,
-0.03179931640625,
-0.033843994140625,
-0.0518798828125,
-0.027435302734375,
0.03399658203125,
0.02264404296875,
-0.027740478515625,
0.03961181640625,
-0.051513671875,
0.0120391845703125,
0.050201416015625,
0.0206146240234375,
0.016845703125,
-0.0061798095703125,
-0.0027713775634765625,
0.00960540771484375,
-0.069091796875,
-0.03515625,
0.07952880859375,
0.04119873046875,
0.0484619140625,
0.02197265625,
0.042877197265625,
-0.01385498046875,
0.02764892578125,
-0.040618896484375,
0.040435791015625,
0.022674560546875,
-0.0443115234375,
-0.0217742919921875,
-0.040618896484375,
-0.08746337890625,
0.021331787109375,
-0.0248565673828125,
-0.0704345703125,
0.0077972412109375,
0.0179443359375,
-0.03009033203125,
0.0212860107421875,
-0.0589599609375,
0.07421875,
-0.01111602783203125,
-0.0134429931640625,
0.00872802734375,
-0.058074951171875,
0.0426025390625,
0.01303863525390625,
0.005283355712890625,
-0.0033016204833984375,
0.005680084228515625,
0.06390380859375,
-0.04266357421875,
0.07000732421875,
-0.0129547119140625,
-0.00270843505859375,
0.036376953125,
-0.01016998291015625,
0.02105712890625,
0.020233154296875,
0.0034427642822265625,
0.03173828125,
0.0029964447021484375,
-0.038818359375,
-0.04144287109375,
0.044403076171875,
-0.08355712890625,
-0.0231781005859375,
-0.04058837890625,
-0.016265869140625,
-0.0010986328125,
0.0079345703125,
0.0254669189453125,
0.035614013671875,
0.0088348388671875,
-0.0002980232238769531,
0.036773681640625,
-0.038665771484375,
0.0297088623046875,
0.017425537109375,
-0.0299224853515625,
-0.037139892578125,
0.06451416015625,
-0.0028095245361328125,
0.01473236083984375,
0.01314544677734375,
0.0189666748046875,
-0.024078369140625,
-0.019073486328125,
-0.03887939453125,
0.03668212890625,
-0.0285491943359375,
-0.022430419921875,
-0.059722900390625,
-0.0161285400390625,
-0.044891357421875,
0.013092041015625,
-0.036468505859375,
-0.040802001953125,
-0.03271484375,
0.00576019287109375,
0.050445556640625,
0.031341552734375,
-0.0027618408203125,
0.0243988037109375,
-0.043060302734375,
0.01197052001953125,
0.0289459228515625,
0.0221405029296875,
0.004306793212890625,
-0.048309326171875,
-0.007808685302734375,
0.0257110595703125,
-0.04583740234375,
-0.050506591796875,
0.033599853515625,
0.0230865478515625,
0.032928466796875,
0.035125732421875,
0.01213836669921875,
0.0728759765625,
-0.01593017578125,
0.0748291015625,
0.0235443115234375,
-0.061859130859375,
0.042572021484375,
-0.0229949951171875,
0.0228424072265625,
0.026947021484375,
0.04217529296875,
-0.04046630859375,
-0.037872314453125,
-0.06597900390625,
-0.059783935546875,
0.0626220703125,
0.046478271484375,
0.00665283203125,
0.0009083747863769531,
0.041412353515625,
0.0011110305786132812,
0.0158538818359375,
-0.05999755859375,
-0.031219482421875,
-0.03839111328125,
-0.006961822509765625,
-0.0002682209014892578,
0.00977325439453125,
-0.00664520263671875,
-0.0285186767578125,
0.05657958984375,
-0.018218994140625,
0.048095703125,
0.00992584228515625,
0.01230621337890625,
0.0028133392333984375,
-0.004642486572265625,
0.048095703125,
0.042816162109375,
-0.0157318115234375,
-0.0178680419921875,
0.01276397705078125,
-0.038299560546875,
-0.02337646484375,
0.0117340087890625,
-0.00048065185546875,
-0.004974365234375,
0.03826904296875,
0.0791015625,
-0.01522064208984375,
-0.04693603515625,
0.04840087890625,
-0.0264739990234375,
-0.0102996826171875,
-0.0283966064453125,
0.0174102783203125,
0.005146026611328125,
0.032470703125,
0.018463134765625,
0.00281524658203125,
-0.01520538330078125,
-0.052032470703125,
-0.000797271728515625,
0.0261077880859375,
-0.0189666748046875,
-0.045440673828125,
0.0692138671875,
0.01186370849609375,
-0.017608642578125,
0.0584716796875,
-0.015838623046875,
-0.040618896484375,
0.034881591796875,
0.0233154296875,
0.048095703125,
-0.0219879150390625,
0.0162200927734375,
0.044464111328125,
0.0200958251953125,
-0.0037670135498046875,
0.0212860107421875,
0.0007829666137695312,
-0.0457763671875,
-0.027679443359375,
-0.04345703125,
-0.0246734619140625,
0.0254364013671875,
-0.04254150390625,
0.0343017578125,
-0.035614013671875,
-0.025909423828125,
-0.0007648468017578125,
-0.0024356842041015625,
-0.050506591796875,
0.0070343017578125,
0.0027790069580078125,
0.06732177734375,
-0.052093505859375,
0.0423583984375,
0.048553466796875,
-0.05084228515625,
-0.07611083984375,
-0.01244354248046875,
0.0063629150390625,
-0.0631103515625,
0.01812744140625,
0.0256805419921875,
0.0178680419921875,
-0.0174407958984375,
-0.04962158203125,
-0.05462646484375,
0.09552001953125,
0.025177001953125,
-0.0293731689453125,
-0.01328277587890625,
-0.0043182373046875,
0.0523681640625,
-0.01491546630859375,
0.0601806640625,
0.04754638671875,
0.0281524658203125,
0.01904296875,
-0.09625244140625,
0.0028514862060546875,
-0.028900146484375,
-0.00843048095703125,
0.005748748779296875,
-0.0838623046875,
0.07470703125,
-0.0078887939453125,
-0.015960693359375,
0.005413055419921875,
0.06414794921875,
0.0235748291015625,
0.016998291015625,
0.0430908203125,
0.044677734375,
0.05303955078125,
-0.01377105712890625,
0.07855224609375,
-0.01611328125,
0.033660888671875,
0.055572509765625,
0.01541900634765625,
0.047821044921875,
0.00968170166015625,
-0.01305389404296875,
0.035064697265625,
0.059417724609375,
0.015869140625,
0.0207061767578125,
-0.00039887428283691406,
-0.002490997314453125,
-0.001361846923828125,
0.0017957687377929688,
-0.036590576171875,
0.045440673828125,
0.024688720703125,
-0.00849151611328125,
0.000728607177734375,
0.006359100341796875,
0.0293731689453125,
-0.0151214599609375,
-0.0040435791015625,
0.06121826171875,
0.0094146728515625,
-0.05328369140625,
0.08892822265625,
0.00682830810546875,
0.0693359375,
-0.04644775390625,
0.008148193359375,
-0.04193115234375,
0.0126800537109375,
-0.0168304443359375,
-0.040924072265625,
0.0026683807373046875,
-0.01268768310546875,
0.0298309326171875,
-0.0081329345703125,
0.02880859375,
-0.03253173828125,
-0.016021728515625,
0.007610321044921875,
0.03289794921875,
0.022979736328125,
0.0009589195251464844,
-0.0718994140625,
0.01776123046875,
0.0003654956817626953,
-0.0287322998046875,
0.02813720703125,
0.030731201171875,
-0.010986328125,
0.05682373046875,
0.045806884765625,
-0.0037631988525390625,
0.0006566047668457031,
-0.01390838623046875,
0.08355712890625,
-0.03253173828125,
-0.03997802734375,
-0.05352783203125,
0.04119873046875,
-0.01922607421875,
-0.054840087890625,
0.062469482421875,
0.044342041015625,
0.066650390625,
0.0124664306640625,
0.04559326171875,
-0.0318603515625,
0.0281982421875,
-0.035552978515625,
0.05364990234375,
-0.047821044921875,
0.01995849609375,
-0.0306243896484375,
-0.0653076171875,
-0.019622802734375,
0.06158447265625,
-0.00749969482421875,
0.010467529296875,
0.0297088623046875,
0.058563232421875,
-0.01515960693359375,
0.00858306884765625,
0.0017337799072265625,
0.018646240234375,
0.0423583984375,
0.05975341796875,
0.052154541015625,
-0.055023193359375,
0.050933837890625,
-0.0263519287109375,
-0.03887939453125,
-0.01275634765625,
-0.045806884765625,
-0.09686279296875,
-0.046844482421875,
-0.03839111328125,
-0.045440673828125,
0.005672454833984375,
0.08203125,
0.0552978515625,
-0.05303955078125,
-0.017364501953125,
0.01006317138671875,
0.0012311935424804688,
-0.039459228515625,
-0.01776123046875,
0.031494140625,
-0.00640869140625,
-0.065185546875,
0.0119171142578125,
0.00409698486328125,
0.01436614990234375,
-0.01369476318359375,
-0.0117645263671875,
-0.013092041015625,
-0.0003361701965332031,
0.04827880859375,
0.045501708984375,
-0.048614501953125,
-0.031158447265625,
0.0016527175903320312,
0.00001615285873413086,
0.019195556640625,
0.02716064453125,
-0.059783935546875,
0.0289306640625,
0.020965576171875,
0.0260772705078125,
0.05389404296875,
0.00850677490234375,
0.0278778076171875,
-0.055938720703125,
0.0265045166015625,
0.00986480712890625,
0.01476287841796875,
0.022735595703125,
-0.0098876953125,
0.053314208984375,
0.021759033203125,
-0.038665771484375,
-0.0587158203125,
-0.0078887939453125,
-0.08807373046875,
-0.0014476776123046875,
0.08795166015625,
-0.0198822021484375,
-0.0261077880859375,
0.0011501312255859375,
-0.033447265625,
0.0198822021484375,
-0.050384521484375,
0.06494140625,
0.0278472900390625,
-0.0178375244140625,
-0.0211334228515625,
-0.03375244140625,
0.034759521484375,
0.0168304443359375,
-0.060272216796875,
-0.0169677734375,
0.0284423828125,
0.010101318359375,
0.0322265625,
0.047821044921875,
-0.006465911865234375,
0.01052093505859375,
-0.007568359375,
0.0091400146484375,
-0.01296234130859375,
-0.007472991943359375,
-0.0130462646484375,
0.004558563232421875,
-0.01093292236328125,
-0.027587890625
]
] |
petals-team/falcon-rw-1b | 2023-09-03T12:56:43.000Z | [
"transformers",
"safetensors",
"falcon",
"text-generation",
"custom_code",
"en",
"dataset:tiiuae/falcon-refinedweb",
"arxiv:2306.01116",
"arxiv:2005.14165",
"arxiv:2108.12409",
"arxiv:2205.14135",
"license:apache-2.0",
"text-generation-inference",
"region:us"
] | text-generation | petals-team | null | null | petals-team/falcon-rw-1b | 1 | 7,083 | transformers | 2023-09-03T12:55:54 | ---
datasets:
- tiiuae/falcon-refinedweb
language:
- en
inference: false
license: apache-2.0
---
# Falcon-RW-1B
**Falcon-RW-1B is a 1B-parameter causal decoder-only model built by [TII](https://www.tii.ae) and trained on 350B tokens of [RefinedWeb](https://huggingface.co/datasets/tiiuae/falcon-refinedweb). It is made available under the Apache 2.0 license.**
See the 📓 [paper on arXiv](https://arxiv.org/abs/2306.01116) for more details.
RefinedWeb is a high-quality web dataset built by leveraging stringent filtering and large-scale deduplication. Falcon-RW-1B, trained on RefinedWeb only, matches or outperforms comparable models trained on curated data.
⚠️ Falcon is now available as a core model in the `transformers` library! To use the in-library version, please install the latest version of `transformers` with `pip install git+https://github.com/huggingface/transformers.git`, then simply remove the `trust_remote_code=True` argument from `from_pretrained()`.
⚠️ This model is intended for use as a **research artifact**, to study the influence of training on web data alone. **If you are interested in state-of-the-art models, we recommend using Falcon-[7B](https://huggingface.co/tiiuae/falcon-7b)/[40B](https://huggingface.co/tiiuae/falcon-40b), both trained on >1,000 billion tokens.**
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import transformers
import torch
model = "tiiuae/falcon-rw-1b"
tokenizer = AutoTokenizer.from_pretrained(model)
pipeline = transformers.pipeline(
"text-generation",
model=model,
tokenizer=tokenizer,
torch_dtype=torch.bfloat16,
device_map="auto",
)
sequences = pipeline(
    "Girafatron is obsessed with giraffes, the most glorious animal on the face of this Earth. Girafatron believes all other animals are irrelevant when compared to the glorious majesty of the giraffe.\nDaniel: Hello, Girafatron!\nGirafatron:",
max_length=200,
do_sample=True,
top_k=10,
num_return_sequences=1,
eos_token_id=tokenizer.eos_token_id,
)
for seq in sequences:
print(f"Result: {seq['generated_text']}")
```
💥 **Falcon LLMs require PyTorch 2.0 for use with `transformers`!**
# Model Card for Falcon-RW-1B
## Model Details
### Model Description
- **Developed by:** [https://www.tii.ae](https://www.tii.ae);
- **Model type:** Causal decoder-only;
- **Language(s) (NLP):** English;
- **License:** Apache 2.0.
### Model Source
- **Paper:** [https://arxiv.org/abs/2306.01116](https://arxiv.org/abs/2306.01116).
## Uses
### Direct Use
Research on large language models, specifically the influence of adequately filtered and deduplicated web data on the properties of large language models (fairness, safety, limitations, capabilities, etc.).
### Out-of-Scope Use
Production use without adequate assessment of risks and mitigation; any use cases which may be considered irresponsible or harmful.
Broadly speaking, we would recommend Falcon-[7B](https://huggingface.co/tiiuae/falcon-7b)/[40B](https://huggingface.co/tiiuae/falcon-40b) for any use not directly related to research on web data pipelines.
## Bias, Risks, and Limitations
Falcon-RW-1B is trained on English data only, and will not generalize appropriately to other languages. Furthermore, as it is trained on large-scale corpora representative of the web, it will carry the stereotypes and biases commonly encountered online.
### Recommendations
We recommend that users of Falcon-RW-1B consider finetuning it for the specific set of tasks of interest, and that guardrails and appropriate precautions be taken for any production use.
## How to Get Started with the Model
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import transformers
import torch
model = "tiiuae/falcon-rw-1b"
tokenizer = AutoTokenizer.from_pretrained(model)
pipeline = transformers.pipeline(
"text-generation",
model=model,
tokenizer=tokenizer,
torch_dtype=torch.bfloat16,
device_map="auto",
)
sequences = pipeline(
    "Girafatron is obsessed with giraffes, the most glorious animal on the face of this Earth. Girafatron believes all other animals are irrelevant when compared to the glorious majesty of the giraffe.\nDaniel: Hello, Girafatron!\nGirafatron:",
max_length=200,
do_sample=True,
top_k=10,
num_return_sequences=1,
eos_token_id=tokenizer.eos_token_id,
)
for seq in sequences:
print(f"Result: {seq['generated_text']}")
```
## Training Details
### Training Data
Falcon-RW-1B was trained on 350B tokens of [RefinedWeb](https://huggingface.co/datasets/tiiuae/falcon-refinedweb), a high-quality filtered and deduplicated web dataset. The data was tokenized with the GPT-2 tokenizer.
### Training Procedure
Falcon-RW-1B was trained on 32 A100 40GB GPUs, using only data parallelism with ZeRO.
#### Training Hyperparameters
Hyperparameters were adapted from the GPT-3 paper ([Brown et al., 2020](https://arxiv.org/abs/2005.14165)).
| **Hyperparameter** | **Value** | **Comment** |
|--------------------|------------|-------------------------------------------|
| Precision | `bfloat16` | |
| Optimizer | AdamW | |
| Learning rate | 2e-4 | 500M tokens warm-up, cosine decay to 2e-5 |
| Weight decay | 1e-1 | |
| Batch size | 512 | 4B tokens ramp-up |
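The learning-rate row above (500M-token warm-up to 2e-4, cosine decay to 2e-5) can be sketched as a small function. This is an illustrative reconstruction from the table, not the actual training code; the function name and token-based parameterization are our own:

```python
import math

def lr_at(tokens_seen: float,
          peak_lr: float = 2e-4,
          min_lr: float = 2e-5,
          warmup_tokens: float = 500e6,
          total_tokens: float = 350e9) -> float:
    """Warm-up then cosine decay, as described in the hyperparameter table."""
    if tokens_seen < warmup_tokens:
        # Linear warm-up over the first 500M tokens.
        return peak_lr * tokens_seen / warmup_tokens
    # Cosine decay from peak_lr down to min_lr over the remaining token budget.
    progress = (tokens_seen - warmup_tokens) / (total_tokens - warmup_tokens)
    return min_lr + 0.5 * (peak_lr - min_lr) * (1.0 + math.cos(math.pi * progress))
```

For example, `lr_at(0.0)` is 0, the rate peaks at 2e-4 once 500M tokens have been seen, and it ends at 2e-5 after the full 350B-token budget.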
#### Speeds, Sizes, Times
Training happened in early December 2022 and took about six days.
## Evaluation
See the 📓 [paper on arXiv](https://arxiv.org/abs/2306.01116) for in-depth evaluation.
## Technical Specifications
### Model Architecture and Objective
Falcon-RW-1B is a causal decoder-only model trained on a causal language modeling task (i.e., predict the next token).
The architecture is adapted from the GPT-3 paper ([Brown et al., 2020](https://arxiv.org/abs/2005.14165)), but uses ALiBi ([Press et al., 2021](https://arxiv.org/abs/2108.12409)) and FlashAttention ([Dao et al., 2022](https://arxiv.org/abs/2205.14135)).
| **Hyperparameter** | **Value** | **Comment** |
|--------------------|-----------|----------------------------------------|
| Layers | 24 | |
| `d_model` | 2048 | |
| `head_dim` | 64 | Reduced to optimise for FlashAttention |
| Vocabulary | 50304 | |
| Sequence length | 2048 | |
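ALiBi, used in place of learned position embeddings, adds a per-head linear bias to attention scores so that attention decays with distance. A minimal, self-contained sketch of the slopes and bias matrix (illustrative only; the model's real kernels are fused and far more efficient):

```python
def alibi_slopes(n_heads: int) -> list:
    # For power-of-two head counts, ALiBi uses the geometric sequence
    # 2^(-8/n), 2^(-16/n), ..., 2^(-8) as the per-head slopes.
    return [2.0 ** (-8.0 * (i + 1) / n_heads) for i in range(n_heads)]

def alibi_bias(seq_len: int, slope: float) -> list:
    # Each query position q penalizes key position k <= q by slope * (q - k),
    # so attention to distant tokens is linearly discouraged
    # (future positions are handled by the causal mask).
    return [[-slope * (q - k) for k in range(q + 1)] for q in range(seq_len)]

slopes = alibi_slopes(32)  # Falcon-RW-1B: d_model 2048 / head_dim 64 = 32 heads
```

Because the bias depends only on relative distance, models trained this way can extrapolate to sequence lengths longer than those seen in training.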
### Compute Infrastructure
#### Hardware
Falcon-RW-1B was trained on AWS SageMaker, on 32 A100 40GB GPUs in P4d instances.
#### Software
Falcon-RW-1B was trained on a custom distributed training codebase, Gigatron. It uses a 3D parallelism approach combined with ZeRO and high-performance Triton kernels (FlashAttention, etc.).
## Citation
```
@article{refinedweb,
title={The {R}efined{W}eb dataset for {F}alcon {LLM}: outperforming curated corpora with web data, and web data only},
author={Guilherme Penedo and Quentin Malartic and Daniel Hesslow and Ruxandra Cojocaru and Alessandro Cappelli and Hamza Alobeidli and Baptiste Pannier and Ebtesam Almazrouei and Julien Launay},
journal={arXiv preprint arXiv:2306.01116},
eprint={2306.01116},
eprinttype = {arXiv},
url={https://arxiv.org/abs/2306.01116},
year={2023}
}
```
## Contact
falconllm@tii.ae
| 7,579 | [
[
-0.042724609375,
-0.07073974609375,
-0.0008492469787597656,
0.018951416015625,
-0.01139068603515625,
-0.007503509521484375,
-0.00446319580078125,
-0.042816162109375,
0.01386260986328125,
0.01503753662109375,
-0.048736572265625,
-0.0295562744140625,
-0.057098388671875,
0.01149749755859375,
-0.0323486328125,
0.082275390625,
0.00047850608825683594,
-0.00823211669921875,
0.006694793701171875,
-0.0011806488037109375,
-0.0026092529296875,
-0.033203125,
-0.0709228515625,
-0.00930023193359375,
0.041290283203125,
0.0236968994140625,
0.0465087890625,
0.08001708984375,
0.053955078125,
0.021331787109375,
-0.00525665283203125,
0.01293182373046875,
-0.03790283203125,
-0.01776123046875,
-0.006404876708984375,
-0.01424407958984375,
-0.0179901123046875,
0.00484466552734375,
0.041534423828125,
0.044647216796875,
-0.00460052490234375,
0.029266357421875,
-0.0013532638549804688,
0.0399169921875,
-0.03826904296875,
0.0341796875,
-0.0408935546875,
0.0134735107421875,
-0.0170745849609375,
-0.003383636474609375,
-0.0287933349609375,
0.001613616943359375,
-0.0128936767578125,
-0.07244873046875,
0.030548095703125,
0.019775390625,
0.10186767578125,
0.0255279541015625,
-0.032379150390625,
-0.006587982177734375,
-0.037017822265625,
0.054962158203125,
-0.0533447265625,
0.037872314453125,
0.0229339599609375,
0.0168304443359375,
-0.016632080078125,
-0.07794189453125,
-0.040130615234375,
-0.007778167724609375,
-0.002437591552734375,
0.0266571044921875,
-0.024200439453125,
0.0009207725524902344,
0.043212890625,
0.00185394287109375,
-0.046905517578125,
0.01361083984375,
-0.03192138671875,
-0.0196685791015625,
0.04229736328125,
0.009368896484375,
0.0087127685546875,
-0.017303466796875,
-0.0341796875,
-0.027862548828125,
-0.025390625,
0.021484375,
0.034027099609375,
0.0253448486328125,
-0.01873779296875,
0.03155517578125,
-0.0208282470703125,
0.045867919921875,
0.0255889892578125,
-0.0037288665771484375,
0.0284881591796875,
-0.0197296142578125,
-0.0242919921875,
-0.01366424560546875,
0.07952880859375,
0.013336181640625,
0.01500701904296875,
-0.012664794921875,
-0.007534027099609375,
-0.007396697998046875,
0.01032257080078125,
-0.07830810546875,
0.0091705322265625,
0.0180511474609375,
-0.0364990234375,
-0.0196075439453125,
0.032745361328125,
-0.056182861328125,
0.0029811859130859375,
-0.006805419921875,
0.01062774658203125,
-0.0240631103515625,
-0.033660888671875,
0.0163421630859375,
-0.0205535888671875,
0.019866943359375,
-0.0118255615234375,
-0.07061767578125,
0.02197265625,
0.05487060546875,
0.06378173828125,
0.0029926300048828125,
-0.0577392578125,
-0.05316162109375,
0.0082855224609375,
-0.0215301513671875,
0.0391845703125,
-0.0212860107421875,
-0.02703857421875,
-0.005645751953125,
0.024139404296875,
-0.01715087890625,
-0.007793426513671875,
0.055694580078125,
-0.035797119140625,
0.007228851318359375,
-0.032135009765625,
-0.03973388671875,
-0.0272369384765625,
0.00408935546875,
-0.04473876953125,
0.06939697265625,
0.00295257568359375,
-0.08831787109375,
0.0262603759765625,
-0.052520751953125,
-0.03466796875,
-0.011962890625,
-0.0030689239501953125,
-0.039764404296875,
-0.0130767822265625,
0.03082275390625,
0.048370361328125,
-0.031585693359375,
0.0321044921875,
-0.04229736328125,
-0.044769287109375,
-0.0056915283203125,
-0.022735595703125,
0.061737060546875,
0.041748046875,
-0.04571533203125,
-0.0007123947143554688,
-0.050140380859375,
-0.00797271728515625,
0.03094482421875,
-0.004734039306640625,
0.0108795166015625,
-0.00824737548828125,
0.02008056640625,
0.0229949951171875,
0.01067352294921875,
-0.04217529296875,
0.0086212158203125,
-0.03680419921875,
0.042327880859375,
0.036102294921875,
0.0002243518829345703,
0.0096282958984375,
-0.029815673828125,
0.03631591796875,
0.027587890625,
0.0267486572265625,
-0.00678253173828125,
-0.0389404296875,
-0.07171630859375,
-0.01467132568359375,
0.0113372802734375,
0.035125732421875,
-0.043304443359375,
0.03271484375,
-0.0216827392578125,
-0.0506591796875,
-0.0219879150390625,
-0.0138397216796875,
0.03582763671875,
0.0562744140625,
0.040313720703125,
0.010589599609375,
-0.0565185546875,
-0.07080078125,
-0.007785797119140625,
-0.0243377685546875,
0.02154541015625,
0.01361846923828125,
0.055633544921875,
-0.03338623046875,
0.0633544921875,
-0.034149169921875,
-0.01158905029296875,
-0.00614166259765625,
0.02703857421875,
0.0207977294921875,
0.03985595703125,
0.0430908203125,
-0.040252685546875,
-0.020050048828125,
0.0010585784912109375,
-0.0709228515625,
0.004734039306640625,
-0.00951385498046875,
-0.003963470458984375,
0.03643798828125,
0.042694091796875,
-0.0489501953125,
0.01528167724609375,
0.028076171875,
-0.0219573974609375,
0.0322265625,
-0.0074920654296875,
0.007503509521484375,
-0.08203125,
0.03021240234375,
0.01885986328125,
0.0088653564453125,
-0.030548095703125,
0.016326904296875,
0.001796722412109375,
-0.0079345703125,
-0.042755126953125,
0.0557861328125,
-0.0321044921875,
-0.00423431396484375,
-0.0093994140625,
-0.01479339599609375,
-0.005336761474609375,
0.059234619140625,
0.0101165771484375,
0.0709228515625,
0.045745849609375,
-0.038116455078125,
-0.0022563934326171875,
0.026458740234375,
-0.00302886962890625,
0.007610321044921875,
-0.0618896484375,
0.006317138671875,
-0.0094757080078125,
0.0276947021484375,
-0.056915283203125,
-0.0157470703125,
0.0260772705078125,
-0.04388427734375,
0.0225982666015625,
-0.012908935546875,
-0.032623291015625,
-0.0494384765625,
-0.0121612548828125,
0.01503753662109375,
0.0304718017578125,
-0.043670654296875,
0.024139404296875,
0.01934814453125,
0.0077667236328125,
-0.0738525390625,
-0.0667724609375,
-0.0006084442138671875,
-0.014312744140625,
-0.053985595703125,
0.0258941650390625,
0.00278472900390625,
0.00579071044921875,
0.0020389556884765625,
0.004314422607421875,
0.0023937225341796875,
0.01102447509765625,
0.03057861328125,
0.011993408203125,
-0.014739990234375,
-0.00785064697265625,
0.004528045654296875,
-0.020599365234375,
0.0101165771484375,
-0.03607177734375,
0.031341552734375,
-0.038787841796875,
-0.0214080810546875,
-0.041534423828125,
0.0224761962890625,
0.044525146484375,
-0.0152435302734375,
0.055755615234375,
0.08123779296875,
-0.0296630859375,
0.0088653564453125,
-0.03826904296875,
-0.006855010986328125,
-0.036468505859375,
0.043121337890625,
-0.034027099609375,
-0.05609130859375,
0.04730224609375,
0.01317596435546875,
0.00466156005859375,
0.0704345703125,
0.01629638671875,
0.004718780517578125,
0.0870361328125,
0.0284576416015625,
-0.01541900634765625,
0.0300750732421875,
-0.05706787109375,
-0.00846099853515625,
-0.05975341796875,
-0.019744873046875,
-0.049163818359375,
-0.01073455810546875,
-0.055633544921875,
-0.00843048095703125,
0.00843048095703125,
0.018402099609375,
-0.06695556640625,
0.0272674560546875,
-0.039154052734375,
0.0147247314453125,
0.039215087890625,
0.0028667449951171875,
-0.0066986083984375,
0.004825592041015625,
-0.0243377685546875,
0.012542724609375,
-0.06341552734375,
-0.0257415771484375,
0.08660888671875,
0.0355224609375,
0.042694091796875,
-0.014129638671875,
0.0552978515625,
0.00505828857421875,
0.004177093505859375,
-0.043487548828125,
0.032684326171875,
-0.01172637939453125,
-0.0389404296875,
-0.01262664794921875,
-0.0413818359375,
-0.0728759765625,
0.00717926025390625,
-0.01152801513671875,
-0.04559326171875,
0.005596160888671875,
-0.006244659423828125,
-0.01044464111328125,
0.02874755859375,
-0.0611572265625,
0.066650390625,
-0.0161590576171875,
-0.036865234375,
0.0007886886596679688,
-0.05145263671875,
0.027130126953125,
-0.003307342529296875,
0.0159149169921875,
0.01629638671875,
-0.002941131591796875,
0.08026123046875,
-0.0443115234375,
0.04949951171875,
-0.032562255859375,
0.0211029052734375,
0.0222625732421875,
-0.0190582275390625,
0.07098388671875,
0.00811004638671875,
-0.018829345703125,
0.04327392578125,
0.00789642333984375,
-0.034820556640625,
-0.028778076171875,
0.056427001953125,
-0.091552734375,
-0.046844482421875,
-0.051361083984375,
-0.036468505859375,
-0.01561737060546875,
0.0205535888671875,
0.035858154296875,
0.027557373046875,
0.0014410018920898438,
0.0149078369140625,
0.021881103515625,
-0.0179901123046875,
0.0545654296875,
0.035858154296875,
-0.01336669921875,
-0.0408935546875,
0.06402587890625,
0.01099395751953125,
-0.0015726089477539062,
0.022552490234375,
0.0162811279296875,
-0.043365478515625,
-0.03948974609375,
-0.03973388671875,
0.029876708984375,
-0.051544189453125,
-0.0213623046875,
-0.077392578125,
-0.0377197265625,
-0.032623291015625,
-0.01123809814453125,
-0.02545166015625,
-0.03033447265625,
-0.04925537109375,
-0.01032257080078125,
0.040740966796875,
0.04180908203125,
-0.00537109375,
0.0225830078125,
-0.051544189453125,
0.01244354248046875,
0.005611419677734375,
0.020751953125,
0.01534271240234375,
-0.052764892578125,
-0.02398681640625,
0.0274810791015625,
-0.02264404296875,
-0.032501220703125,
0.029144287109375,
0.01351165771484375,
0.046966552734375,
0.03363037109375,
-0.0083770751953125,
0.07281494140625,
-0.01898193359375,
0.06011962890625,
0.0203094482421875,
-0.06097412109375,
0.0293121337890625,
-0.041748046875,
0.020355224609375,
0.036102294921875,
0.03900146484375,
-0.0352783203125,
-0.030181884765625,
-0.0634765625,
-0.045166015625,
0.077392578125,
0.0121612548828125,
-0.00948333740234375,
-0.014312744140625,
0.03179931640625,
-0.0020503997802734375,
-0.006893157958984375,
-0.04339599609375,
-0.03021240234375,
-0.047943115234375,
-0.0118408203125,
-0.0093231201171875,
0.0001780986785888672,
0.01314544677734375,
-0.036285400390625,
0.06866455078125,
-0.0117645263671875,
0.043121337890625,
0.0126800537109375,
-0.01068878173828125,
-0.0099639892578125,
-0.00820159912109375,
0.05078125,
0.0399169921875,
-0.0217742919921875,
-0.00506591796875,
0.00978851318359375,
-0.052581787109375,
0.00429534912109375,
0.0322265625,
-0.0165252685546875,
-0.0150299072265625,
0.03253173828125,
0.07562255859375,
0.0083465576171875,
-0.0182342529296875,
0.03619384765625,
-0.008148193359375,
-0.028533935546875,
-0.010650634765625,
0.007389068603515625,
0.006542205810546875,
0.013397216796875,
0.03082275390625,
0.00466156005859375,
0.0059814453125,
-0.0164947509765625,
0.0241851806640625,
0.024871826171875,
-0.0175018310546875,
-0.0138092041015625,
0.07598876953125,
0.01206207275390625,
-0.0131683349609375,
0.046966552734375,
-0.0259246826171875,
-0.0308380126953125,
0.072021484375,
0.0589599609375,
0.07275390625,
0.0178375244140625,
0.0196533203125,
0.06451416015625,
0.0211639404296875,
-0.015838623046875,
0.0119476318359375,
0.01366424560546875,
-0.03533935546875,
-0.014129638671875,
-0.058929443359375,
-0.021453857421875,
0.014984130859375,
-0.049652099609375,
0.033599853515625,
-0.042266845703125,
-0.0128326416015625,
0.006839752197265625,
0.032745361328125,
-0.06396484375,
0.0225677490234375,
-0.00795745849609375,
0.07305908203125,
-0.04986572265625,
0.06524658203125,
0.047393798828125,
-0.0687255859375,
-0.0780029296875,
0.00022101402282714844,
-0.01091766357421875,
-0.0673828125,
0.06298828125,
0.026336669921875,
-0.004337310791015625,
0.0312347412109375,
-0.0268707275390625,
-0.0732421875,
0.0894775390625,
0.023468017578125,
-0.049102783203125,
0.0006937980651855469,
0.0256805419921875,
0.03680419921875,
-0.01477813720703125,
0.052520751953125,
0.022674560546875,
0.04345703125,
0.0172576904296875,
-0.048309326171875,
0.0162506103515625,
-0.0335693359375,
0.005840301513671875,
0.007537841796875,
-0.07470703125,
0.066650390625,
-0.019989013671875,
-0.0167083740234375,
-0.0124969482421875,
0.06195068359375,
0.0216064453125,
0.01227569580078125,
0.0279388427734375,
0.04766845703125,
0.057098388671875,
-0.005985260009765625,
0.07672119140625,
-0.045318603515625,
0.04974365234375,
0.05865478515625,
-0.0008745193481445312,
0.052581787109375,
0.026336669921875,
-0.0115814208984375,
0.02777099609375,
0.059906005859375,
-0.017242431640625,
0.027679443359375,
-0.00339508056640625,
0.018402099609375,
-0.00951385498046875,
0.00041675567626953125,
-0.0501708984375,
0.0341796875,
0.006198883056640625,
-0.0148773193359375,
-0.01129913330078125,
0.00569915771484375,
0.023223876953125,
-0.024627685546875,
-0.00742340087890625,
0.036224365234375,
-0.0033321380615234375,
-0.06353759765625,
0.07659912109375,
0.0202484130859375,
0.0628662109375,
-0.053253173828125,
0.01129913330078125,
-0.03350830078125,
0.0086669921875,
-0.0218353271484375,
-0.044647216796875,
0.027618408203125,
0.0091400146484375,
-0.01473236083984375,
-0.006649017333984375,
0.03424072265625,
-0.0092010498046875,
-0.061431884765625,
0.02490234375,
0.018341064453125,
0.0086822509765625,
0.000034809112548828125,
-0.06805419921875,
0.0167999267578125,
-0.016693115234375,
-0.0279693603515625,
0.0164337158203125,
0.01224517822265625,
0.001621246337890625,
0.055633544921875,
0.05633544921875,
-0.008331298828125,
0.0032291412353515625,
0.0036334991455078125,
0.05145263671875,
-0.061492919921875,
-0.03271484375,
-0.047119140625,
0.024749755859375,
0.003910064697265625,
-0.039794921875,
0.0535888671875,
0.060150146484375,
0.06561279296875,
-0.0025386810302734375,
0.05853271484375,
-0.01012420654296875,
0.01483917236328125,
-0.033172607421875,
0.06085205078125,
-0.051513671875,
0.004421234130859375,
-0.02691650390625,
-0.04693603515625,
-0.0107269287109375,
0.04107666015625,
-0.0021114349365234375,
0.0165557861328125,
0.0667724609375,
0.07159423828125,
-0.01183319091796875,
0.0237579345703125,
0.00592041015625,
0.0207977294921875,
0.030242919921875,
0.052276611328125,
0.04339599609375,
-0.06854248046875,
0.049713134765625,
-0.0244293212890625,
-0.011566162109375,
-0.01337432861328125,
-0.062042236328125,
-0.0684814453125,
-0.051849365234375,
-0.02703857421875,
-0.02056884765625,
0.00910186767578125,
0.068359375,
0.0552978515625,
-0.056549072265625,
-0.020355224609375,
-0.03680419921875,
-0.00867462158203125,
-0.037750244140625,
-0.0159759521484375,
0.042816162109375,
-0.03778076171875,
-0.050567626953125,
0.0089569091796875,
-0.004390716552734375,
0.0036182403564453125,
-0.014923095703125,
-0.02325439453125,
-0.035858154296875,
0.0004208087921142578,
0.0311737060546875,
0.0284881591796875,
-0.056732177734375,
-0.02899169921875,
0.0182647705078125,
-0.0185699462890625,
-0.0027484893798828125,
0.0267486572265625,
-0.04644775390625,
0.0145721435546875,
0.0242767333984375,
0.055267333984375,
0.0692138671875,
-0.00673675537109375,
0.01009368896484375,
-0.031646728515625,
0.0267181396484375,
-0.010650634765625,
0.033599853515625,
0.006195068359375,
-0.033843994140625,
0.045684814453125,
0.03875732421875,
-0.036376953125,
-0.05047607421875,
-0.01094818115234375,
-0.08599853515625,
-0.018402099609375,
0.10467529296875,
-0.013580322265625,
-0.0191650390625,
0.00980377197265625,
-0.007244110107421875,
0.049041748046875,
-0.02862548828125,
0.043731689453125,
0.044647216796875,
0.0033893585205078125,
0.0019083023071289062,
-0.0298919677734375,
0.026824951171875,
0.014984130859375,
-0.06829833984375,
-0.006298065185546875,
0.02044677734375,
0.0289459228515625,
-0.0072174072265625,
0.02734375,
0.00455474853515625,
0.014862060546875,
0.0207672119140625,
-0.0020656585693359375,
-0.056793212890625,
-0.0233612060546875,
-0.0158538818359375,
0.010040283203125,
-0.020233154296875,
-0.0224456787109375
]
] |
admruul/anything-v3.0 | 2023-05-16T09:40:18.000Z | [
"diffusers",
"safetensors",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-image",
"en",
"license:creativeml-openrail-m",
"endpoints_compatible",
"has_space",
"diffusers:StableDiffusionPipeline",
"region:us"
] | text-to-image | admruul | null | null | admruul/anything-v3.0 | 30 | 7,080 | diffusers | 2022-12-25T09:53:44 | ---
language:
- en
license: creativeml-openrail-m
tags:
- stable-diffusion
- stable-diffusion-diffusers
- text-to-image
- diffusers
inference: true
duplicated_from: Linaqruf/anything-v3.0
---
# Anything V3
Welcome to Anything V3 - a latent diffusion model for weebs. This model is intended to produce high-quality, highly detailed anime-style images with just a few prompts. Like other anime-style Stable Diffusion models, it also supports danbooru tags for generating images.
e.g. **_1girl, white hair, golden eyes, beautiful eyes, detail, flower meadow, cumulonimbus clouds, lighting, detailed sky, garden_**
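Since prompts here are just comma-separated danbooru-style tag lists, they are easy to assemble programmatically. A tiny hypothetical helper (the `build_prompt` name is ours, not part of any library):

```python
def build_prompt(*tags: str) -> str:
    # Join danbooru-style tags into a comma-separated prompt,
    # dropping blanks and case-insensitive duplicates while preserving order.
    seen, ordered = set(), []
    for tag in tags:
        t = tag.strip()
        if t and t.lower() not in seen:
            seen.add(t.lower())
            ordered.append(t)
    return ", ".join(ordered)

prompt = build_prompt("1girl", "white hair", "golden eyes", "white hair", "detail")
```

Here `prompt` becomes `"1girl, white hair, golden eyes, detail"`, ready to pass to the pipeline shown below.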
## Gradio
We support a [Gradio](https://github.com/gradio-app/gradio) Web UI to run Anything-V3.0:
[Open in Spaces](https://huggingface.co/spaces/akhaliq/anything-v3.0)
## 🧨 Diffusers
This model can be used just like any other Stable Diffusion model. For more information,
please have a look at the [Stable Diffusion documentation](https://huggingface.co/docs/diffusers/api/pipelines/stable_diffusion).
You can also export the model to [ONNX](https://huggingface.co/docs/diffusers/optimization/onnx), [MPS](https://huggingface.co/docs/diffusers/optimization/mps) and/or FLAX/JAX.
```python
from diffusers import StableDiffusionPipeline
import torch
model_id = "Linaqruf/anything-v3.0"
branch_name= "diffusers"
pipe = StableDiffusionPipeline.from_pretrained(model_id, revision=branch_name, torch_dtype=torch.float16)
pipe = pipe.to("cuda")
prompt = "pikachu"
image = pipe(prompt).images[0]
image.save("./pikachu.png")
```
## Examples
Below are some examples of images generated using this model:
**Anime Girl:**

```
1girl, brown hair, green eyes, colorful, autumn, cumulonimbus clouds, lighting, blue sky, falling leaves, garden
Steps: 50, Sampler: DDIM, CFG scale: 12
```
**Anime Boy:**

```
1boy, medium hair, blonde hair, blue eyes, bishounen, colorful, autumn, cumulonimbus clouds, lighting, blue sky, falling leaves, garden
Steps: 50, Sampler: DDIM, CFG scale: 12
```
**Scenery:**

```
scenery, shibuya tokyo, post-apocalypse, ruins, rust, sky, skyscraper, abandoned, blue sky, broken window, building, cloud, crane machine, outdoors, overgrown, pillar, sunset
Steps: 50, Sampler: DDIM, CFG scale: 12
```
## License
This model is open access and available to all, with a CreativeML OpenRAIL-M license further specifying rights and usage.
The CreativeML OpenRAIL License specifies:
1. You can't use the model to deliberately produce or share illegal or harmful outputs or content
2. The authors claim no rights to the outputs you generate; you are free to use them and are accountable for their use, which must not go against the provisions set in the license
3. You may re-distribute the weights and use the model commercially and/or as a service. If you do, please be aware you have to include the same use restrictions as the ones in the license and share a copy of the CreativeML OpenRAIL-M with all your users (please read the license entirely and carefully)
[Please read the full license here](https://huggingface.co/spaces/CompVis/stable-diffusion-license) | 3,328 | [
[ [ …768-dimensional embedding vector (float values) omitted… ] ] |
LinkSoul/Chinese-Llama-2-7b | 2023-08-16T03:22:56.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"zh",
"en",
"dataset:LinkSoul/instruction_merge_set",
"license:openrail",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | LinkSoul | null | null | LinkSoul/Chinese-Llama-2-7b | 274 | 7,077 | transformers | 2023-07-20T08:23:15 | ---
license: openrail
datasets:
- LinkSoul/instruction_merge_set
language:
- zh
- en
widget:
- text: "[INST] <<SYS>>\nYou are a helpful, respectful and honest assistant. Always answer as helpfully as possible, while being safe. Your answers should not include any harmful, unethical, racist, sexist, toxic, dangerous, or illegal content. Please ensure that your responses are socially unbiased and positive in nature.\n If a question does not make any sense, or is not factually coherent, explain why instead of answering something not correct. If you don't know the answer to a question, please don't share false information.\n<</SYS>>\n\n用中文回答,When is the best time to visit Beijing, and do you have any suggestions for me? [/INST]"
example_title: "北京"
- text: "[INST] <<SYS>>\nYou are a helpful, respectful and honest assistant. Always answer as helpfully as possible, while being safe. Your answers should not include any harmful, unethical, racist, sexist, toxic, dangerous, or illegal content. Please ensure that your responses are socially unbiased and positive in nature.\n If a question does not make any sense, or is not factually coherent, explain why instead of answering something not correct. If you don't know the answer to a question, please don't share false information.\n<</SYS>>\n\n用英文回答,特朗普是谁? [/INST]"
example_title: "特朗普是谁"
---
# Chinese Llama 2 7B
A fully open-source, fully commercially usable **Chinese Llama2 model with Chinese/English SFT datasets**. The input format strictly follows the *llama-2-chat* format, so the model is compatible with all optimizations targeting the original *llama-2-chat* model.

## Basic Demo

## Try It Online
> Talk is cheap. Show you the demo.
- [Demo / HuggingFace Spaces](https://huggingface.co/spaces/LinkSoul/Chinese-Llama-2-7b)
- [One-click Colab launch](#) // in preparation
## Downloads
- Model: [Chinese Llama2 Chat Model](https://huggingface.co/LinkSoul/Chinese-Llama-2-7b)
- 4-bit quantized: [Chinese Llama2 4bit Chat Model](https://huggingface.co/LinkSoul/Chinese-Llama-2-7b-4bit)
> We fine-tuned on Chinese and English SFT datasets totaling 10 million samples.
- Dataset: [https://huggingface.co/datasets/LinkSoul/instruction_merge_set](https://huggingface.co/datasets/LinkSoul/instruction_merge_set)
- Training and inference code: [https://github.com/LinkSoul-AI/Chinese-Llama-2-7b](https://github.com/LinkSoul-AI/Chinese-Llama-2-7b)
## Quick Test
```python
from transformers import AutoTokenizer, AutoModelForCausalLM, TextStreamer

model_path = "LinkSoul/Chinese-Llama-2-7b"

# Load the tokenizer and the model (fp16 on GPU); stream tokens as they are generated
tokenizer = AutoTokenizer.from_pretrained(model_path, use_fast=False)
model = AutoModelForCausalLM.from_pretrained(model_path).half().cuda()
streamer = TextStreamer(tokenizer, skip_prompt=True, skip_special_tokens=True)

# The prompt must follow the llama-2-chat format
instruction = """[INST] <<SYS>>\nYou are a helpful, respectful and honest assistant. Always answer as helpfully as possible, while being safe. Your answers should not include any harmful, unethical, racist, sexist, toxic, dangerous, or illegal content. Please ensure that your responses are socially unbiased and positive in nature.
            If a question does not make any sense, or is not factually coherent, explain why instead of answering something not correct. If you don't know the answer to a question, please don't share false information.\n<</SYS>>\n\n{} [/INST]"""

prompt = instruction.format("用英文回答,什么是夫妻肺片?")
generate_ids = model.generate(tokenizer(prompt, return_tensors='pt').input_ids.cuda(), max_new_tokens=4096, streamer=streamer)
```
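Because the model strictly follows the *llama-2-chat* input format, multi-turn prompts can be assembled with a small helper. The sketch below is illustrative only (the `build_prompt` helper and its defaults are our own, not part of this repo); it reproduces the `[INST] <<SYS>> … <</SYS>> … [/INST]` template shown in the example above:

```python
B_INST, E_INST = "[INST]", "[/INST]"
B_SYS, E_SYS = "<<SYS>>\n", "\n<</SYS>>\n\n"

def build_prompt(query, history=None, system=""):
    # Assemble a llama-2-chat style prompt.
    # `history` is a list of (user, assistant) turn pairs; `system` is the
    # system prompt, which is folded into the first user turn.
    prompt = ""
    for i, (user, answer) in enumerate(history or []):
        if i == 0 and system:
            user = B_SYS + system + E_SYS + user
        prompt += f"{B_INST} {user} {E_INST} {answer} "
    if not history and system:
        query = B_SYS + system + E_SYS + query
    prompt += f"{B_INST} {query} {E_INST}"
    return prompt
```

For single-turn use, `build_prompt(question, system=...)` yields a prompt in the same shape as the `instruction.format(...)` string used in the quick test.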
## Related Projects
- [Llama2](https://ai.meta.com/llama/)
## License
[Apache-2.0 license](https://github.com/LinkSoul-AI/Chinese-Llama-2-7b/blob/main/LICENSE)
## WeChat Group
You are welcome to join our [WeChat group](.github/QRcode.jpg)
| 3,542 | [
[ [ …768-dimensional embedding vector (float values) omitted… ] ] |
timm/efficientnet_b4.ra2_in1k | 2023-04-27T21:10:49.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"arxiv:2110.00476",
"arxiv:1905.11946",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/efficientnet_b4.ra2_in1k | 0 | 7,073 | timm | 2022-12-12T23:56:57 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
datasets:
- imagenet-1k
---
# Model card for efficientnet_b4.ra2_in1k
An EfficientNet image classification model. Trained on ImageNet-1k in `timm` using the recipe template described below.
Recipe details:
* RandAugment `RA2` recipe. Inspired by and evolved from EfficientNet RandAugment recipes. Published as `B` recipe in [ResNet Strikes Back](https://arxiv.org/abs/2110.00476).
* RMSProp (TF 1.0 behaviour) optimizer, EMA weight averaging
* Step (exponential decay w/ staircase) LR schedule with warmup
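The step (staircase exponential) decay schedule with warmup mentioned above can be sketched as follows; the base LR, warmup length, decay rate, and decay interval here are illustrative assumptions, not the values used to train this checkpoint.

```python
def lr_at(step, base_lr=0.256, warmup_steps=5, decay_rate=0.97, decay_every=10):
    """Staircase exponential decay with linear warmup (illustrative values)."""
    if step < warmup_steps:
        # linear warmup from 0 up to base_lr
        return base_lr * step / warmup_steps
    # decay in discrete "staircase" jumps every `decay_every` steps
    return base_lr * decay_rate ** ((step - warmup_steps) // decay_every)
```
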
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 19.3
- GMACs: 3.1
- Activations (M): 34.8
- Image size: train = 320 x 320, test = 384 x 384
- **Papers:**
- EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks: https://arxiv.org/abs/1905.11946
- ResNet strikes back: An improved training procedure in timm: https://arxiv.org/abs/2110.00476
- **Dataset:** ImageNet-1k
- **Original:** https://github.com/huggingface/pytorch-image-models
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('efficientnet_b4.ra2_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'efficientnet_b4.ra2_in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g.:
# torch.Size([1, 24, 160, 160])
# torch.Size([1, 32, 80, 80])
# torch.Size([1, 56, 40, 40])
# torch.Size([1, 160, 20, 20])
# torch.Size([1, 448, 10, 10])
print(o.shape)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'efficientnet_b4.ra2_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 1792, 10, 10) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
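Once pooled embeddings are extracted as above, a common next step is comparing them with cosine similarity. The sketch below is a dependency-free illustration operating on plain Python lists rather than the `(1, num_features)` tensors the model returns; with real outputs you would use the tensor equivalents.

```python
import math

def cosine_similarity(a, b):
    # dot product of the two embedding vectors
    dot = sum(x * y for x, y in zip(a, b))
    # L2 norms; identical directions give similarity 1.0
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)
```
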
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@inproceedings{tan2019efficientnet,
title={Efficientnet: Rethinking model scaling for convolutional neural networks},
author={Tan, Mingxing and Le, Quoc},
booktitle={International conference on machine learning},
pages={6105--6114},
year={2019},
organization={PMLR}
}
```
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
```bibtex
@inproceedings{wightman2021resnet,
title={ResNet strikes back: An improved training procedure in timm},
author={Wightman, Ross and Touvron, Hugo and Jegou, Herve},
booktitle={NeurIPS 2021 Workshop on ImageNet: Past, Present, and Future}
}
```
| 4,741 | [
[
-0.0280609130859375,
-0.035888671875,
-0.00801849365234375,
0.005382537841796875,
-0.014556884765625,
-0.03472900390625,
-0.023529052734375,
-0.0307159423828125,
0.0189208984375,
0.0322265625,
-0.0325927734375,
-0.040191650390625,
-0.05682373046875,
-0.00833892822265625,
-0.01222991943359375,
0.0635986328125,
-0.003459930419921875,
0.00267791748046875,
-0.0109405517578125,
-0.048492431640625,
-0.0142974853515625,
-0.0152587890625,
-0.080810546875,
-0.040557861328125,
0.0335693359375,
0.0272979736328125,
0.042877197265625,
0.05035400390625,
0.054656982421875,
0.036865234375,
-0.0038356781005859375,
0.01384735107421875,
-0.0187225341796875,
-0.005825042724609375,
0.028961181640625,
-0.048858642578125,
-0.0301666259765625,
0.0159912109375,
0.05108642578125,
0.0278472900390625,
-0.0044097900390625,
0.035247802734375,
0.00910186767578125,
0.051025390625,
-0.0246124267578125,
0.009124755859375,
-0.03436279296875,
0.01409912109375,
-0.006084442138671875,
0.007579803466796875,
-0.0270843505859375,
-0.026123046875,
0.00720977783203125,
-0.043121337890625,
0.03143310546875,
0.0001690387725830078,
0.09637451171875,
0.0231781005859375,
-0.001590728759765625,
-0.0051116943359375,
-0.0164031982421875,
0.05841064453125,
-0.062347412109375,
0.0126953125,
0.01702880859375,
0.0244293212890625,
-0.004032135009765625,
-0.09173583984375,
-0.041656494140625,
-0.01453399658203125,
-0.0156402587890625,
-0.0045623779296875,
-0.0283660888671875,
0.0007548332214355469,
0.0278472900390625,
0.0179443359375,
-0.031036376953125,
0.01444244384765625,
-0.034881591796875,
-0.01092529296875,
0.0386962890625,
0.00402069091796875,
0.026702880859375,
-0.01058197021484375,
-0.035125732421875,
-0.03662109375,
-0.0301666259765625,
0.0242462158203125,
0.022125244140625,
0.022613525390625,
-0.041961669921875,
0.0283050537109375,
0.008941650390625,
0.0477294921875,
0.007541656494140625,
-0.0214385986328125,
0.0435791015625,
0.0012607574462890625,
-0.031829833984375,
-0.014434814453125,
0.0780029296875,
0.029541015625,
0.01444244384765625,
0.0119171142578125,
-0.01331329345703125,
-0.033203125,
0.0021114349365234375,
-0.095458984375,
-0.0283660888671875,
0.0203857421875,
-0.05206298828125,
-0.0284576416015625,
0.00984954833984375,
-0.038360595703125,
-0.0092010498046875,
-0.004058837890625,
0.043975830078125,
-0.033966064453125,
-0.03045654296875,
-0.0013904571533203125,
-0.0160064697265625,
0.0184173583984375,
0.01824951171875,
-0.0443115234375,
0.01605224609375,
0.0259246826171875,
0.0892333984375,
0.01215362548828125,
-0.031280517578125,
-0.021148681640625,
-0.032257080078125,
-0.0212860107421875,
0.0289459228515625,
-0.007259368896484375,
0.00783538818359375,
-0.0255126953125,
0.0242462158203125,
-0.00864410400390625,
-0.05780029296875,
0.0198822021484375,
-0.024200439453125,
0.00759124755859375,
-0.004032135009765625,
-0.0186614990234375,
-0.04632568359375,
0.027496337890625,
-0.0433349609375,
0.08905029296875,
0.0228271484375,
-0.06878662109375,
0.0187530517578125,
-0.044464111328125,
-0.009490966796875,
-0.022552490234375,
-0.00115203857421875,
-0.0809326171875,
-0.007293701171875,
0.00849151611328125,
0.05615234375,
-0.0294952392578125,
0.0019893646240234375,
-0.041259765625,
-0.014678955078125,
0.0247039794921875,
-0.0024852752685546875,
0.0806884765625,
0.020294189453125,
-0.0372314453125,
0.017608642578125,
-0.047271728515625,
0.0208282470703125,
0.0396728515625,
-0.0177001953125,
-0.0037822723388671875,
-0.043487548828125,
0.01079559326171875,
0.0219573974609375,
0.0079498291015625,
-0.03912353515625,
0.01465606689453125,
-0.0149383544921875,
0.03900146484375,
0.043914794921875,
-0.0098876953125,
0.0267333984375,
-0.03106689453125,
0.0210418701171875,
0.013763427734375,
0.0190887451171875,
-0.005298614501953125,
-0.03668212890625,
-0.0631103515625,
-0.03631591796875,
0.028076171875,
0.0277557373046875,
-0.03717041015625,
0.0301361083984375,
-0.01392364501953125,
-0.059906005859375,
-0.033599853515625,
0.006114959716796875,
0.040496826171875,
0.045440673828125,
0.0239105224609375,
-0.03497314453125,
-0.03851318359375,
-0.07623291015625,
-0.002178192138671875,
0.006603240966796875,
0.00439453125,
0.0302276611328125,
0.047637939453125,
-0.00445556640625,
0.043365478515625,
-0.029693603515625,
-0.0194091796875,
-0.0241241455078125,
0.004405975341796875,
0.032989501953125,
0.064453125,
0.058807373046875,
-0.05059814453125,
-0.043548583984375,
-0.0094451904296875,
-0.06927490234375,
0.01519775390625,
-0.00888824462890625,
-0.0127410888671875,
0.01236724853515625,
0.01552581787109375,
-0.041900634765625,
0.035064697265625,
0.018798828125,
-0.016204833984375,
0.02984619140625,
-0.01922607421875,
0.0168914794921875,
-0.08624267578125,
0.01342010498046875,
0.0290374755859375,
-0.01000213623046875,
-0.03863525390625,
0.0177001953125,
0.006656646728515625,
-0.00748443603515625,
-0.03515625,
0.047271728515625,
-0.045074462890625,
-0.01641845703125,
-0.0133209228515625,
-0.021820068359375,
0.002185821533203125,
0.05242919921875,
-0.01413726806640625,
0.0266265869140625,
0.060333251953125,
-0.030975341796875,
0.034881591796875,
0.0191192626953125,
-0.0179443359375,
0.025238037109375,
-0.056732177734375,
0.017822265625,
0.00469970703125,
0.022369384765625,
-0.07879638671875,
-0.0176849365234375,
0.028594970703125,
-0.0477294921875,
0.052215576171875,
-0.036224365234375,
-0.03485107421875,
-0.035675048828125,
-0.0328369140625,
0.0297088623046875,
0.0482177734375,
-0.0570068359375,
0.034454345703125,
0.0167694091796875,
0.02618408203125,
-0.047515869140625,
-0.07061767578125,
-0.010833740234375,
-0.02960205078125,
-0.058563232421875,
0.0225677490234375,
0.009918212890625,
0.001972198486328125,
0.0096435546875,
0.0005583763122558594,
-0.01239013671875,
-0.000823974609375,
0.0384521484375,
0.01348114013671875,
-0.020538330078125,
-0.006702423095703125,
-0.0236053466796875,
-0.003940582275390625,
0.002216339111328125,
-0.0263214111328125,
0.039764404296875,
-0.0174407958984375,
-0.006175994873046875,
-0.06695556640625,
-0.0030364990234375,
0.036651611328125,
-0.004486083984375,
0.0631103515625,
0.08349609375,
-0.042083740234375,
-0.00617218017578125,
-0.03387451171875,
-0.02984619140625,
-0.037200927734375,
0.038543701171875,
-0.0250091552734375,
-0.03369140625,
0.06439208984375,
-0.004398345947265625,
0.0108184814453125,
0.04986572265625,
0.0264434814453125,
-0.0036334991455078125,
0.048980712890625,
0.046417236328125,
0.0205841064453125,
0.055633544921875,
-0.084228515625,
-0.01428985595703125,
-0.06781005859375,
-0.0322265625,
-0.031219482421875,
-0.059722900390625,
-0.04730224609375,
-0.0247802734375,
0.03485107421875,
0.01538848876953125,
-0.0362548828125,
0.034393310546875,
-0.064453125,
0.003814697265625,
0.051971435546875,
0.044097900390625,
-0.032073974609375,
0.025970458984375,
-0.015655517578125,
-0.0010442733764648438,
-0.06671142578125,
-0.012847900390625,
0.0838623046875,
0.031951904296875,
0.04339599609375,
-0.006710052490234375,
0.051971435546875,
-0.01557159423828125,
0.0303955078125,
-0.044769287109375,
0.0428466796875,
-0.0162353515625,
-0.035980224609375,
-0.0171966552734375,
-0.037933349609375,
-0.0814208984375,
0.012176513671875,
-0.0196990966796875,
-0.04534912109375,
0.014892578125,
0.0188751220703125,
-0.0191192626953125,
0.0596923828125,
-0.0638427734375,
0.068115234375,
-0.00629425048828125,
-0.034942626953125,
-0.005096435546875,
-0.054534912109375,
0.02374267578125,
0.01898193359375,
-0.01390838623046875,
-0.002605438232421875,
0.00492095947265625,
0.08026123046875,
-0.052825927734375,
0.06634521484375,
-0.04034423828125,
0.035125732421875,
0.039825439453125,
-0.01036834716796875,
0.0310516357421875,
-0.0092010498046875,
-0.01751708984375,
0.0266265869140625,
-0.01061248779296875,
-0.036102294921875,
-0.046539306640625,
0.04742431640625,
-0.07305908203125,
-0.02203369140625,
-0.016937255859375,
-0.03436279296875,
0.0212860107421875,
0.008941650390625,
0.04168701171875,
0.05767822265625,
0.0246429443359375,
0.02667236328125,
0.04046630859375,
-0.0311431884765625,
0.035247802734375,
-0.00164031982421875,
-0.0029659271240234375,
-0.0401611328125,
0.0577392578125,
0.0290374755859375,
0.015655517578125,
0.011962890625,
0.019866943359375,
-0.017730712890625,
-0.046539306640625,
-0.0267791748046875,
0.0179595947265625,
-0.054901123046875,
-0.046478271484375,
-0.0523681640625,
-0.03179931640625,
-0.0265655517578125,
-0.005767822265625,
-0.04229736328125,
-0.032257080078125,
-0.0272674560546875,
0.01702880859375,
0.054931640625,
0.04119873046875,
-0.0202789306640625,
0.042388916015625,
-0.03436279296875,
0.0080108642578125,
0.00867462158203125,
0.02947998046875,
0.010589599609375,
-0.06640625,
-0.024993896484375,
-0.004375457763671875,
-0.032684326171875,
-0.048248291015625,
0.03643798828125,
0.0190277099609375,
0.038238525390625,
0.0261077880859375,
-0.0122528076171875,
0.048858642578125,
-0.00472259521484375,
0.038726806640625,
0.04052734375,
-0.03265380859375,
0.04248046875,
0.004669189453125,
0.01268768310546875,
0.00882720947265625,
0.025177001953125,
-0.017913818359375,
-0.0030269622802734375,
-0.0721435546875,
-0.061309814453125,
0.06378173828125,
0.0041961669921875,
0.002727508544921875,
0.02203369140625,
0.06243896484375,
0.00467681884765625,
-0.00528717041015625,
-0.054779052734375,
-0.03936767578125,
-0.020965576171875,
-0.0173187255859375,
0.002349853515625,
-0.00904083251953125,
-0.00710296630859375,
-0.048095703125,
0.05108642578125,
-0.0033054351806640625,
0.0540771484375,
0.02484130859375,
-0.00047898292541503906,
-0.006591796875,
-0.0306854248046875,
0.032867431640625,
0.022369384765625,
-0.01922607421875,
0.011688232421875,
0.01348876953125,
-0.04071044921875,
0.01183319091796875,
0.00849151611328125,
-0.00350189208984375,
-0.0022335052490234375,
0.040802001953125,
0.071533203125,
0.0015392303466796875,
0.00823974609375,
0.026702880859375,
-0.007518768310546875,
-0.031463623046875,
-0.020965576171875,
0.01407623291015625,
-0.00019288063049316406,
0.03741455078125,
0.02142333984375,
0.037200927734375,
-0.006404876708984375,
-0.0186614990234375,
0.0234527587890625,
0.038330078125,
-0.0180206298828125,
-0.0243988037109375,
0.048675537109375,
-0.01282501220703125,
-0.0202789306640625,
0.06640625,
-0.0131378173828125,
-0.032867431640625,
0.090087890625,
0.037261962890625,
0.07196044921875,
0.00101470947265625,
-0.0022754669189453125,
0.07208251953125,
0.0221710205078125,
-0.006313323974609375,
0.00957489013671875,
0.01287078857421875,
-0.06182861328125,
0.0029277801513671875,
-0.034942626953125,
0.00439453125,
0.022705078125,
-0.0411376953125,
0.0190887451171875,
-0.055572509765625,
-0.035675048828125,
0.01387786865234375,
0.0307769775390625,
-0.0733642578125,
0.01465606689453125,
-0.01093292236328125,
0.07061767578125,
-0.05255126953125,
0.05853271484375,
0.066162109375,
-0.038177490234375,
-0.0869140625,
-0.01227569580078125,
0.0014333724975585938,
-0.06842041015625,
0.054931640625,
0.03558349609375,
0.01111602783203125,
0.0078582763671875,
-0.064453125,
-0.05224609375,
0.10992431640625,
0.043182373046875,
-0.01335906982421875,
0.02508544921875,
-0.015655517578125,
0.0160980224609375,
-0.036041259765625,
0.0404052734375,
0.0092620849609375,
0.032470703125,
0.0217742919921875,
-0.043914794921875,
0.0255889892578125,
-0.0278472900390625,
0.00699615478515625,
0.01136016845703125,
-0.0694580078125,
0.07012939453125,
-0.0421142578125,
-0.01061248779296875,
0.0030517578125,
0.05145263671875,
0.01155853271484375,
0.01473236083984375,
0.044036865234375,
0.071533203125,
0.0418701171875,
-0.0175323486328125,
0.0736083984375,
-0.00109100341796875,
0.041595458984375,
0.04962158203125,
0.0307159423828125,
0.040618896484375,
0.0210723876953125,
-0.0179901123046875,
0.026702880859375,
0.08062744140625,
-0.0249176025390625,
0.0239105224609375,
0.0216217041015625,
0.00785064697265625,
-0.006114959716796875,
0.00833892822265625,
-0.03204345703125,
0.037628173828125,
0.01049041748046875,
-0.041015625,
-0.018524169921875,
0.0035953521728515625,
0.003284454345703125,
-0.021270751953125,
-0.0172882080078125,
0.035736083984375,
0.004573822021484375,
-0.02978515625,
0.07049560546875,
0.0135955810546875,
0.06658935546875,
-0.03179931640625,
0.000018417835235595703,
-0.0249481201171875,
0.01861572265625,
-0.02703857421875,
-0.053436279296875,
0.0242156982421875,
-0.0212249755859375,
-0.005840301513671875,
0.002788543701171875,
0.05108642578125,
-0.022918701171875,
-0.033782958984375,
0.0168304443359375,
0.0176544189453125,
0.0394287109375,
0.0091705322265625,
-0.0972900390625,
0.015625,
0.0023040771484375,
-0.051666259765625,
0.0269317626953125,
0.033782958984375,
0.01016998291015625,
0.0582275390625,
0.041473388671875,
-0.007495880126953125,
0.00591278076171875,
-0.0109100341796875,
0.0609130859375,
-0.0285797119140625,
-0.0181427001953125,
-0.0595703125,
0.0423583984375,
-0.01056671142578125,
-0.04168701171875,
0.03643798828125,
0.041015625,
0.057891845703125,
0.002685546875,
0.030364990234375,
-0.02252197265625,
-0.00567626953125,
-0.0300750732421875,
0.0567626953125,
-0.060577392578125,
0.0006461143493652344,
-0.003620147705078125,
-0.050048828125,
-0.027191162109375,
0.05242919921875,
-0.01153564453125,
0.033935546875,
0.034942626953125,
0.0782470703125,
-0.0265655517578125,
-0.029266357421875,
0.01265716552734375,
0.01256561279296875,
0.00933837890625,
0.0301666259765625,
0.0218505859375,
-0.057525634765625,
0.0248565673828125,
-0.04962158203125,
-0.0207672119140625,
-0.012603759765625,
-0.05328369140625,
-0.06103515625,
-0.0655517578125,
-0.044891357421875,
-0.050628662109375,
-0.00859832763671875,
0.07037353515625,
0.08306884765625,
-0.048065185546875,
-0.01122283935546875,
-0.0005507469177246094,
0.01271820068359375,
-0.028350830078125,
-0.0171661376953125,
0.050018310546875,
-0.0228424072265625,
-0.050872802734375,
-0.0246124267578125,
0.00211334228515625,
0.0171661376953125,
0.0011796951293945312,
-0.017578125,
-0.01486968994140625,
-0.019256591796875,
0.01503753662109375,
0.02215576171875,
-0.04425048828125,
-0.01380157470703125,
-0.0201568603515625,
-0.01203155517578125,
0.024139404296875,
0.038330078125,
-0.03271484375,
0.024932861328125,
0.0295562744140625,
0.032470703125,
0.05828857421875,
-0.0302276611328125,
0.004726409912109375,
-0.0634765625,
0.043914794921875,
-0.01076507568359375,
0.0350341796875,
0.033843994140625,
-0.0294036865234375,
0.05035400390625,
0.0267791748046875,
-0.03436279296875,
-0.065185546875,
-0.00571441650390625,
-0.0772705078125,
-0.01486968994140625,
0.070068359375,
-0.038238525390625,
-0.03643798828125,
0.04168701171875,
0.005184173583984375,
0.050872802734375,
-0.006591796875,
0.033538818359375,
0.01495361328125,
-0.01108551025390625,
-0.049072265625,
-0.043975830078125,
0.0305938720703125,
0.01532745361328125,
-0.04168701171875,
-0.028350830078125,
-0.0014133453369140625,
0.050628662109375,
0.01485443115234375,
0.03851318359375,
-0.0080108642578125,
0.01018524169921875,
0.01111602783203125,
0.040618896484375,
-0.040924072265625,
-0.007598876953125,
-0.0239715576171875,
0.006923675537109375,
-0.0027484893798828125,
-0.046112060546875
]
] |
beomi/KcELECTRA-base-v2022 | 2023-04-03T14:37:57.000Z | [
"transformers",
"pytorch",
"electra",
"pretraining",
"korean",
"ko",
"en",
"license:mit",
"endpoints_compatible",
"region:us"
] | null | beomi | null | null | beomi/KcELECTRA-base-v2022 | 4 | 7,068 | transformers | 2022-03-24T05:38:50 | ---
language:
- ko
- en
tags:
- electra
- korean
license: "mit"
---
# 🚨 Important Note: This repo is DEPRECATED since KcELECTRA-base v2023 was released 🚨
## USE `https://huggingface.co/beomi/KcELECTRA-base` (and the `v2022` revision if needed).
---
# KcELECTRA: Korean comments ELECTRA
** Updates on 2022.10.08 **
- The KcELECTRA-base-v2022 model (formerly v2022-dev) has been renamed.
- Detailed scores for this model have been added.
- Compared to the previous KcELECTRA-base (v2021), it shows roughly a 1%p performance improvement on most downstream tasks.
---
Most publicly released Korean Transformer-family models are trained on well-curated data such as Korean Wikipedia, news articles, and books. In practice, however, user-generated noisy text datasets such as NSMC are uncurated: they are colloquial, contain many neologisms, and frequently include typos and other expressions that never appear in formal writing.
KcELECTRA is a pretrained ELECTRA model built for datasets with exactly these characteristics: comments and replies were collected from Naver News, and both the tokenizer and the ELECTRA model were trained from scratch on that data.
Compared to the earlier KcBERT, performance has improved substantially through a larger dataset and an expanded vocab.
KcELECTRA can be loaded conveniently through Huggingface's Transformers library. (No separate file download is required.)
```
💡 NOTE 💡
KoELECTRA, which was trained on a general corpus, is likely to perform better on general-purpose tasks.
KcBERT/KcELECTRA are PLMs that work better on user-generated, noisy text.
```
## KcELECTRA Performance
- Finetune code is available at https://github.com/Beomi/KcBERT-finetune
- Detailed per-step scores can be found in each checkpoint folder of that repo.
| | Size | **NSMC**<br/>(acc) | **Naver NER**<br/>(F1) | **PAWS**<br/>(acc) | **KorNLI**<br/>(acc) | **KorSTS**<br/>(spearman) | **Question Pair**<br/>(acc) | **KorQuaD (Dev)**<br/>(EM/F1) |
| :----------------- | :-------------: | :----------------: | :--------------------: | :----------------: | :------------------: | :-----------------------: | :-------------------------: | :---------------------------: |
| **KcELECTRA-base-v2022** | 475M | **91.97** | 87.35 | 76.50 | 82.12 | 83.67 | 95.12 | 69.00 / 90.40 |
| **KcELECTRA-base** | 475M | 91.71 | 86.90 | 74.80 | 81.65 | 82.65 | **95.78** | 70.60 / 90.11 |
| KcBERT-Base | 417M | 89.62 | 84.34 | 66.95 | 74.85 | 75.57 | 93.93 | 60.25 / 84.39 |
| KcBERT-Large | 1.2G | 90.68 | 85.53 | 70.15 | 76.99 | 77.49 | 94.06 | 62.16 / 86.64 |
| KoBERT | 351M | 89.63 | 86.11 | 80.65 | 79.00 | 79.64 | 93.93 | 52.81 / 80.27 |
| XLM-Roberta-Base | 1.03G | 89.49 | 86.26 | 82.95 | 79.92 | 79.09 | 93.53 | 64.70 / 88.94 |
| HanBERT | 614M | 90.16 | 87.31 | 82.40 | 80.89 | 83.33 | 94.19 | 78.74 / 92.02 |
| KoELECTRA-Base | 423M | 90.21 | 86.87 | 81.90 | 80.85 | 83.21 | 94.20 | 61.10 / 89.59 |
| KoELECTRA-Base-v2 | 423M | 89.70 | 87.02 | 83.90 | 80.61 | 84.30 | 94.72 | 84.34 / 92.58 |
| KoELECTRA-Base-v3 | 423M | 90.63 | **88.11** | **84.45** | **82.24** | **85.53** | 95.25 | **84.83 / 93.45** |
| DistilKoBERT | 108M | 88.41 | 84.13 | 62.55 | 70.55 | 73.21 | 92.48 | 54.12 / 77.80 |
\*The size of HanBERT includes both the BERT model and the tokenizer DB.
\***Results were obtained with the config settings as-is; additional hyperparameter tuning may yield better performance.**
## How to use
### Requirements
- `pytorch ~= 1.8.0`
- `transformers ~= 4.11.3`
- `emoji ~= 0.6.0`
- `soynlp ~= 0.0.493`
### Default usage
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("beomi/KcELECTRA-base")
model = AutoModel.from_pretrained("beomi/KcELECTRA-base")
```
> 💡 If your existing KcBERT-based code uses `AutoTokenizer` and `AutoModel`, you can switch immediately by changing `.from_pretrained("beomi/kcbert-base")` to `.from_pretrained("beomi/KcELECTRA-base")`.
### Pretrain & Finetune Colab links
#### Pretrain Data
- The data used for KcBERT training, plus comments collected through early March 2021
- About 17GB
- Documents constructed from grouped comment-reply threads
#### Pretrain Code
- Pretrained via the https://github.com/KLUE-benchmark/KLUE-ELECTRA repo
#### Finetune Code
- Finetuning and score comparison via the https://github.com/Beomi/KcBERT-finetune repo
#### Finetune Samples
- NSMC with PyTorch-Lightning 1.3.0, GPU, Colab <a href="https://colab.research.google.com/drive/1Hh63kIBAiBw3Hho--BvfdUWLu-ysMFF0?usp=sharing">
<img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/>
</a>
## Train Data & Preprocessing
### Raw Data
The training data consists of all **comments and replies** collected from news articles (**the most-commented articles, or all articles**) written between 2019.01.01 and 2021.03.09.
The dataset is **about 17.3GB of extracted text, comprising more than 180 million sentences**.
> KcBERT was trained on text from 2019.01-2020.06, about 90 million sentences after cleaning.
### Preprocessing
The preprocessing steps for PLM training were as follows:
1. Korean, English, special characters, and even emojis (🥳)!
Using regular expressions, Korean, English, special characters, and even emojis were included in the training data.
The Korean character range was set to `ㄱ-ㅎ가-힣`, excluding the hanja contained in `ㄱ-힣`.
2. Collapsing repeated characters in comments
Repeated characters such as `ㅋㅋㅋㅋㅋ` were merged into shorter forms such as `ㅋㅋ`.
3. Cased model
KcBERT is a cased model that preserves upper/lower case for English.
4. Removing texts of 10 characters or fewer
Texts of fewer than 10 characters were removed, as they often consist of a single word.
5. Deduplication
To remove duplicated comments, exactly identical comments were merged into one.
6. Removing `OOO`
Naver comments mask profanity as `OOO` through its own filter. These were replaced with whitespace.
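Steps 2, 5, and 6 above can be sketched in plain Python as below; this is an illustrative approximation, not the actual preprocessing code (which uses `soynlp`'s `repeat_normalize` for step 2).

```python
import re

def collapse_repeats(text, num_repeats=2):
    # step 2: collapse runs of more than `num_repeats` identical characters
    return re.sub(r'(.)\1{%d,}' % num_repeats, r'\1' * num_repeats, text)

def strip_masked_profanity(text):
    # step 6: Naver masks profanity as OOO; replace it with whitespace
    return re.sub(r'OOO', ' ', text)

def deduplicate(comments):
    # step 5: keep only the first occurrence of each exact duplicate
    seen = set()
    unique = []
    for c in comments:
        if c not in seen:
            seen.add(c)
            unique.append(c)
    return unique
```
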
Installing via pip with the command below and cleaning your text with the `clean` function improves downstream task performance (fewer `[UNK]` tokens).
```bash
pip install soynlp emoji
```
Apply the `clean` function below to your text data.
```python
import re
import emoji
from soynlp.normalizer import repeat_normalize

pattern = re.compile('[^ .,?!/@$%~%·∼()\x00-\x7Fㄱ-ㅣ가-힣]+')
url_pattern = re.compile(
    r'https?:\/\/(www\.)?[-a-zA-Z0-9@:%._\+~#=]{1,256}\.[a-zA-Z0-9()]{1,6}\b([-a-zA-Z0-9()@:%_\+.~#?&//=]*)')

def clean(x):
    x = pattern.sub(' ', x)
    x = emoji.replace_emoji(x, replace='')  # remove emojis
    x = url_pattern.sub('', x)
    x = x.strip()
    x = repeat_normalize(x, num_repeats=2)
    return x
```
> 💡 The `clean` function above was not applied when computing the finetune scores.
### Cleaned Data
- Additional data beyond KcBERT will be released after cleanup.
## Tokenizer & Model Training
The tokenizer was trained with Huggingface's [Tokenizers](https://github.com/huggingface/tokenizers) library.
Specifically, `BertWordPieceTokenizer` was used, with a vocab size of `30000`.
The tokenizer was trained on the full dataset; to handle general downstream tasks, the tokens from KoELECTRA's vocab that did not overlap with ours were added as well. (The two models actually overlapped in about 5000 tokens.)
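The vocab-extension step described above — appending the tokens of a second vocab that the base vocab lacks — can be sketched as follows; the toy vocabularies in the test are illustrative, not the real 30k-token vocabs.

```python
def extend_vocab(base_vocab, extra_vocab):
    # append tokens from the extra vocab that the base vocab lacks,
    # preserving the base vocab's token order (and thus its token IDs)
    merged = list(base_vocab)
    existing = set(base_vocab)
    for token in extra_vocab:
        if token not in existing:
            merged.append(token)
            existing.add(token)
    return merged
```
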
Training ran for about 10 days on a TPU `v3-8`, and the model weights currently published on Huggingface correspond to 848k training steps.
(Performance was evaluated at checkpoints every 100k steps; see the `KcBERT-finetune` repo for details.)
The training loss drops sharply within roughly the first 100-200k steps and then continues to decrease steadily until the end of training.

### KcELECTRA downstream task performance by pretrain step
> 💡 The table below shows results for only a subset of checkpoints, not all of them.

- As shown above, KcELECTRA-base outperforms KcBERT-base and KcBERT-large **on all datasets**.
- KcELECTRA's performance also improves progressively as the number of pretraining steps increases.
## Citation
Please cite KcELECTRA using the form below.
```
@misc{lee2021kcelectra,
author = {Junbum Lee},
title = {KcELECTRA: Korean comments ELECTRA},
year = {2021},
publisher = {GitHub},
journal = {GitHub repository},
howpublished = {\url{https://github.com/Beomi/KcELECTRA}}
}
```
For uses other than citation in a paper, please include the MIT license notice. ☺️
## Acknowledgement
The GCP/TPU environment for training the KcELECTRA model was supported by the [TFRC](https://www.tensorflow.org/tfrc?hl=ko) program.
Thanks to [Monologg](https://github.com/monologg/) for the many pieces of advice during model training :)
## Reference
### Github Repos
- [KcBERT by Beomi](https://github.com/Beomi/KcBERT)
- [BERT by Google](https://github.com/google-research/bert)
- [KoBERT by SKT](https://github.com/SKTBrain/KoBERT)
- [KoELECTRA by Monologg](https://github.com/monologg/KoELECTRA/)
- [Transformers by Huggingface](https://github.com/huggingface/transformers)
- [Tokenizers by Huggingface](https://github.com/huggingface/tokenizers)
- [ELECTRA train code by KLUE](https://github.com/KLUE-benchmark/KLUE-ELECTRA)
### Blogs
- [Monologg's KoELECTRA training notes](https://monologg.kr/categories/NLP/ELECTRA/)
- [Training BERT from scratch on a Colab TPU - Tensorflow/Google ver.](https://beomi.github.io/2020/02/26/Train-BERT-from-scratch-on-colab-TPU-Tensorflow-ver/)
| 9,644 | [
[
-0.045654296875,
-0.041656494140625,
0.0227813720703125,
0.0297698974609375,
-0.0295562744140625,
0.0022258758544921875,
-0.005657196044921875,
-0.00860595703125,
0.042999267578125,
0.02001953125,
-0.035369873046875,
-0.044342041015625,
-0.044647216796875,
0.0155029296875,
0.0040435791015625,
0.05010986328125,
-0.0178375244140625,
0.0092926025390625,
0.00848388671875,
-0.0043182373046875,
-0.04925537109375,
-0.030487060546875,
-0.040008544921875,
-0.026947021484375,
0.0014400482177734375,
0.0343017578125,
0.03839111328125,
0.027984619140625,
0.03076171875,
0.03082275390625,
-0.00930023193359375,
-0.00782012939453125,
-0.011749267578125,
-0.004638671875,
0.0189971923828125,
-0.037445068359375,
-0.0175323486328125,
-0.014251708984375,
0.0253143310546875,
0.036834716796875,
0.0026702880859375,
0.00885009765625,
0.0031909942626953125,
0.0628662109375,
-0.036041259765625,
0.01027679443359375,
-0.00940704345703125,
0.01399993896484375,
0.0029506683349609375,
-0.00858306884765625,
-0.00212860107421875,
-0.04296875,
-0.0037136077880859375,
-0.048492431640625,
0.00722503662109375,
0.0016050338745117188,
0.0958251953125,
0.00307464599609375,
-0.0180816650390625,
-0.021942138671875,
-0.03765869140625,
0.0654296875,
-0.055755615234375,
0.03125,
0.0291748046875,
0.01264190673828125,
-0.01548004150390625,
-0.056854248046875,
-0.0517578125,
-0.0128326416015625,
-0.02337646484375,
0.032196044921875,
-0.01047515869140625,
-0.018707275390625,
0.0226287841796875,
0.0252532958984375,
-0.030303955078125,
-0.006763458251953125,
-0.016876220703125,
-0.02166748046875,
0.046173095703125,
-0.0005807876586914062,
0.03759765625,
-0.040191650390625,
-0.03765869140625,
-0.01467132568359375,
-0.0267486572265625,
0.0252532958984375,
0.0266265869140625,
0.0009813308715820312,
-0.0300140380859375,
0.047088623046875,
-0.0225677490234375,
0.0291748046875,
0.0256805419921875,
-0.0070648193359375,
0.06121826171875,
-0.04150390625,
-0.03515625,
0.012969970703125,
0.0777587890625,
0.03570556640625,
0.016632080078125,
0.002368927001953125,
0.0024261474609375,
-0.00878143310546875,
-0.00017583370208740234,
-0.06689453125,
-0.033599853515625,
0.03387451171875,
-0.045562744140625,
-0.0270233154296875,
0.0038299560546875,
-0.07958984375,
0.0109710693359375,
-0.02178955078125,
0.044647216796875,
-0.048248291015625,
-0.02996826171875,
0.0014104843139648438,
-0.01267242431640625,
0.00850677490234375,
0.0230865478515625,
-0.05181884765625,
0.0184783935546875,
0.00852203369140625,
0.057861328125,
0.0172576904296875,
-0.0030460357666015625,
0.007843017578125,
0.0028133392333984375,
-0.0200042724609375,
0.041900634765625,
0.0007543563842773438,
-0.03466796875,
-0.00868988037109375,
0.0254058837890625,
-0.0245361328125,
-0.0233306884765625,
0.044921875,
-0.020782470703125,
0.00182342529296875,
-0.028564453125,
-0.0306396484375,
-0.0110015869140625,
0.0144500732421875,
-0.04473876953125,
0.084228515625,
0.0113372802734375,
-0.0635986328125,
0.02001953125,
-0.043548583984375,
-0.03131103515625,
-0.0083770751953125,
-0.0025310516357421875,
-0.048828125,
-0.01446533203125,
0.0294036865234375,
0.0399169921875,
-0.006954193115234375,
-0.0182952880859375,
0.002841949462890625,
-0.022064208984375,
0.0185699462890625,
-0.007724761962890625,
0.0865478515625,
0.03216552734375,
-0.038177490234375,
0.0016031265258789062,
-0.076904296875,
0.022613525390625,
0.032470703125,
-0.035797119140625,
-0.01171875,
-0.0325927734375,
0.004482269287109375,
0.0297393798828125,
0.02362060546875,
-0.04937744140625,
0.0004143714904785156,
-0.0340576171875,
0.041778564453125,
0.0732421875,
0.00588226318359375,
0.0270233154296875,
-0.031280517578125,
0.040618896484375,
0.0128936767578125,
0.006595611572265625,
-0.00403594970703125,
-0.0251312255859375,
-0.0697021484375,
-0.0265045166015625,
0.0284271240234375,
0.04754638671875,
-0.047607421875,
0.06494140625,
-0.0081787109375,
-0.054229736328125,
-0.04638671875,
-0.0023746490478515625,
0.0205535888671875,
0.033233642578125,
0.03875732421875,
-0.00030231475830078125,
-0.0521240234375,
-0.055206298828125,
-0.01187896728515625,
-0.004764556884765625,
-0.0012922286987304688,
0.041015625,
0.059967041015625,
-0.016265869140625,
0.06414794921875,
-0.051025390625,
-0.02227783203125,
-0.02606201171875,
-0.00376129150390625,
0.044769287109375,
0.060150146484375,
0.055267333984375,
-0.06695556640625,
-0.07110595703125,
0.0030345916748046875,
-0.055419921875,
-0.003223419189453125,
-0.006855010986328125,
-0.0170745849609375,
0.01396942138671875,
0.034698486328125,
-0.059722900390625,
0.0394287109375,
0.022247314453125,
-0.039947509765625,
0.06549072265625,
-0.021209716796875,
0.031402587890625,
-0.0921630859375,
0.027130126953125,
-0.005535125732421875,
-0.004985809326171875,
-0.04168701171875,
-0.01111602783203125,
0.00498199462890625,
0.0021457672119140625,
-0.0438232421875,
0.03662109375,
-0.044769287109375,
0.023834228515625,
0.00728607177734375,
-0.000446319580078125,
-0.0007357597351074219,
0.050384521484375,
-0.01629638671875,
0.04547119140625,
0.03643798828125,
-0.050262451171875,
0.0255584716796875,
0.022308349609375,
-0.03802490234375,
0.0205078125,
-0.0567626953125,
-0.0033626556396484375,
-0.0080108642578125,
0.01202392578125,
-0.09600830078125,
-0.0200347900390625,
0.04595947265625,
-0.049224853515625,
0.028167724609375,
-0.007251739501953125,
-0.02593994140625,
-0.047332763671875,
-0.038238525390625,
0.01436614990234375,
0.04150390625,
-0.03387451171875,
0.042816162109375,
0.0124359130859375,
-0.007534027099609375,
-0.043853759765625,
-0.05255126953125,
-0.01922607421875,
-0.0091552734375,
-0.045379638671875,
0.05322265625,
-0.01385498046875,
-0.0024890899658203125,
-0.010467529296875,
-0.01419830322265625,
-0.01239776611328125,
-0.006099700927734375,
0.0235595703125,
0.0223388671875,
-0.00399017333984375,
-0.00830841064453125,
0.002956390380859375,
-0.01263427734375,
0.0037994384765625,
0.0092315673828125,
0.072265625,
-0.0101470947265625,
-0.018768310546875,
-0.06201171875,
0.01264190673828125,
0.05157470703125,
0.00013387203216552734,
0.06146240234375,
0.053436279296875,
-0.0096588134765625,
0.0164794921875,
-0.0384521484375,
-0.0010204315185546875,
-0.04058837890625,
0.018829345703125,
-0.03350830078125,
-0.0426025390625,
0.049652099609375,
-0.014312744140625,
-0.01995849609375,
0.062164306640625,
0.042999267578125,
-0.018035888671875,
0.08428955078125,
0.0311279296875,
-0.011566162109375,
0.035369873046875,
-0.054534912109375,
0.02239990234375,
-0.060089111328125,
-0.0462646484375,
-0.041351318359375,
-0.021453857421875,
-0.0528564453125,
-0.01369476318359375,
0.022857666015625,
0.0221710205078125,
-0.028594970703125,
0.036285400390625,
-0.06268310546875,
0.00247955322265625,
0.0302886962890625,
0.0288848876953125,
-0.005558013916015625,
-0.0020294189453125,
-0.034820556640625,
0.0022678375244140625,
-0.047454833984375,
-0.0233917236328125,
0.0787353515625,
0.024383544921875,
0.04437255859375,
-0.001251220703125,
0.05023193359375,
0.009002685546875,
-0.005126953125,
-0.05169677734375,
0.041595458984375,
0.01338958740234375,
-0.0386962890625,
-0.029022216796875,
-0.02178955078125,
-0.07421875,
0.02886962890625,
-0.0168914794921875,
-0.06884765625,
0.0282745361328125,
0.005428314208984375,
-0.0247955322265625,
0.0257568359375,
-0.0579833984375,
0.071533203125,
-0.0205230712890625,
-0.0274200439453125,
0.0019893646240234375,
-0.05548095703125,
0.0080718994140625,
0.00860595703125,
0.0180206298828125,
-0.01245880126953125,
0.0115814208984375,
0.066650390625,
-0.07763671875,
0.03753662109375,
-0.01806640625,
0.008544921875,
0.041748046875,
-0.0208892822265625,
0.050994873046875,
0.006168365478515625,
-0.004367828369140625,
0.0156402587890625,
-0.0018510818481445312,
-0.0287322998046875,
-0.0158233642578125,
0.06610107421875,
-0.072998046875,
-0.0303955078125,
-0.04052734375,
-0.0120391845703125,
0.01806640625,
0.0294036865234375,
0.050445556640625,
0.007354736328125,
0.002368927001953125,
0.0177764892578125,
0.029205322265625,
-0.0274810791015625,
0.05352783203125,
0.01251983642578125,
-0.000080108642578125,
-0.040191650390625,
0.06072998046875,
0.022186279296875,
0.007488250732421875,
0.023651123046875,
0.019287109375,
-0.0308837890625,
-0.0246734619140625,
-0.032196044921875,
0.0154571533203125,
-0.043975830078125,
-0.0178985595703125,
-0.059722900390625,
-0.007396697998046875,
-0.053009033203125,
-0.0264892578125,
-0.025909423828125,
-0.0399169921875,
-0.02301025390625,
-0.0122222900390625,
0.042999267578125,
0.018646240234375,
-0.01016998291015625,
0.02587890625,
-0.04278564453125,
0.0238800048828125,
0.0090179443359375,
0.014984130859375,
0.011627197265625,
-0.026519775390625,
-0.013671875,
0.00447845458984375,
-0.0322265625,
-0.07763671875,
0.04705810546875,
-0.027191162109375,
0.037994384765625,
0.02618408203125,
-0.0030460357666015625,
0.055084228515625,
-0.029144287109375,
0.06329345703125,
0.0447998046875,
-0.072021484375,
0.0567626953125,
-0.0202484130859375,
0.0218353271484375,
0.046112060546875,
0.04754638671875,
-0.036712646484375,
-0.00469970703125,
-0.046722412109375,
-0.078857421875,
0.06072998046875,
0.03936767578125,
-0.007068634033203125,
0.00769805908203125,
0.008087158203125,
-0.0172271728515625,
0.0193939208984375,
-0.055877685546875,
-0.059112548828125,
-0.039337158203125,
-0.014923095703125,
0.0120391845703125,
-0.0006585121154785156,
0.005889892578125,
-0.05206298828125,
0.0628662109375,
0.021942138671875,
0.05096435546875,
0.03936767578125,
-0.011444091796875,
-0.003765106201171875,
0.0200042724609375,
0.029144287109375,
0.04486083984375,
-0.0251617431640625,
-0.0150146484375,
0.0297088623046875,
-0.05828857421875,
0.0260009765625,
0.0004930496215820312,
-0.03045654296875,
0.018798828125,
0.00732421875,
0.0521240234375,
0.00766754150390625,
-0.03216552734375,
0.038116455078125,
-0.0006318092346191406,
-0.02093505859375,
-0.045135498046875,
-0.0009188652038574219,
0.00734710693359375,
-0.0002989768981933594,
0.027374267578125,
0.0093536376953125,
-0.00921630859375,
-0.037689208984375,
0.00634765625,
0.0186920166015625,
-0.032318115234375,
-0.00547027587890625,
0.0474853515625,
0.003597259521484375,
-0.0192718505859375,
0.034881591796875,
-0.004886627197265625,
-0.060546875,
0.08209228515625,
0.04595947265625,
0.056610107421875,
-0.036529541015625,
0.01158905029296875,
0.06591796875,
0.01519775390625,
0.00714111328125,
0.040924072265625,
0.00885772705078125,
-0.044891357421875,
-0.005096435546875,
-0.056549072265625,
0.007625579833984375,
0.036102294921875,
-0.037994384765625,
0.023529052734375,
-0.046966552734375,
-0.024017333984375,
0.00963592529296875,
0.01001739501953125,
-0.058624267578125,
0.036865234375,
-0.01788330078125,
0.057281494140625,
-0.06591796875,
0.04736328125,
0.055755615234375,
-0.048309326171875,
-0.06829833984375,
-0.00009250640869140625,
-0.01971435546875,
-0.0557861328125,
0.0546875,
0.0145263671875,
0.0112762451171875,
0.0066375732421875,
-0.0305328369140625,
-0.07806396484375,
0.10015869140625,
-0.004711151123046875,
-0.03668212890625,
0.0240478515625,
0.0174102783203125,
0.031158447265625,
-0.004852294921875,
0.035125732421875,
0.03607177734375,
0.049835205078125,
0.003086090087890625,
-0.060821533203125,
0.036163330078125,
-0.02850341796875,
-0.01091766357421875,
0.0222320556640625,
-0.08251953125,
0.08001708984375,
-0.0130157470703125,
0.0023193359375,
0.004608154296875,
0.039398193359375,
0.0198211669921875,
0.01490020751953125,
0.0408935546875,
0.045501708984375,
0.039764404296875,
-0.01386260986328125,
0.0672607421875,
-0.0269927978515625,
0.0421142578125,
0.035614013671875,
0.0155181884765625,
0.032958984375,
0.034942626953125,
-0.04803466796875,
0.04559326171875,
0.043243408203125,
-0.032196044921875,
0.050994873046875,
0.007534027099609375,
-0.0286865234375,
-0.007434844970703125,
0.006587982177734375,
-0.042755126953125,
0.0231475830078125,
0.01483917236328125,
-0.041107177734375,
0.0082855224609375,
0.00957489013671875,
0.0176849365234375,
-0.01371002197265625,
-0.03753662109375,
0.0305328369140625,
0.004993438720703125,
-0.04754638671875,
0.052398681640625,
0.00406646728515625,
0.04901123046875,
-0.0380859375,
0.002742767333984375,
0.0066070556640625,
0.0155792236328125,
-0.03057861328125,
-0.054473876953125,
-0.0057220458984375,
-0.00988006591796875,
-0.017974853515625,
-0.0018901824951171875,
0.06365966796875,
-0.005916595458984375,
-0.053436279296875,
0.0292205810546875,
0.014404296875,
0.0187225341796875,
0.010467529296875,
-0.067138671875,
0.0033626556396484375,
0.0201873779296875,
-0.03192138671875,
0.0264739990234375,
0.0157928466796875,
0.018280029296875,
0.046356201171875,
0.058685302734375,
0.00101470947265625,
0.017730712890625,
-0.0021877288818359375,
0.0765380859375,
-0.052703857421875,
-0.04449462890625,
-0.06982421875,
0.051361083984375,
-0.0172882080078125,
-0.031402587890625,
0.06854248046875,
0.05499267578125,
0.0611572265625,
-0.022491455078125,
0.078369140625,
-0.035888671875,
0.028533935546875,
-0.037445068359375,
0.07122802734375,
-0.0435791015625,
-0.0106353759765625,
-0.0364990234375,
-0.04486083984375,
-0.0005764961242675781,
0.0670166015625,
-0.018768310546875,
0.0234375,
0.050140380859375,
0.05023193359375,
0.0021514892578125,
-0.0015811920166015625,
0.0027027130126953125,
0.03717041015625,
0.0076904296875,
0.04327392578125,
0.0335693359375,
-0.065185546875,
0.049835205078125,
-0.054718017578125,
-0.0012445449829101562,
-0.03765869140625,
-0.040130615234375,
-0.06756591796875,
-0.0190582275390625,
-0.03125,
-0.0276641845703125,
-0.01486968994140625,
0.06866455078125,
0.037322998046875,
-0.0726318359375,
-0.0143280029296875,
-0.0254058837890625,
0.0166015625,
-0.03021240234375,
-0.0237274169921875,
0.06298828125,
-0.01284027099609375,
-0.0655517578125,
0.0113067626953125,
-0.01094818115234375,
0.0216217041015625,
0.01361846923828125,
-0.01070404052734375,
-0.041412353515625,
-0.0058746337890625,
0.04425048828125,
0.0218963623046875,
-0.0390625,
-0.01611328125,
-0.0058135986328125,
-0.0092620849609375,
0.020172119140625,
0.0277557373046875,
-0.04290771484375,
0.0205230712890625,
0.05718994140625,
0.0017480850219726562,
0.054107666015625,
0.002349853515625,
0.0175018310546875,
-0.032958984375,
0.01499176025390625,
0.00644683837890625,
0.036102294921875,
-0.005218505859375,
-0.03369140625,
0.043487548828125,
0.034149169921875,
-0.049896240234375,
-0.052734375,
-0.0150604248046875,
-0.08636474609375,
-0.037994384765625,
0.07568359375,
-0.01275634765625,
-0.02801513671875,
-0.011749267578125,
-0.033111572265625,
0.04339599609375,
-0.0360107421875,
0.0452880859375,
0.046295166015625,
-0.0183258056640625,
0.006824493408203125,
-0.05157470703125,
0.039276123046875,
0.041046142578125,
-0.05255126953125,
-0.0321044921875,
0.00829315185546875,
0.038177490234375,
0.0321044921875,
0.06024169921875,
-0.0006475448608398438,
0.01751708984375,
-0.0001996755599975586,
0.017333984375,
-0.0018215179443359375,
0.003765106201171875,
-0.006214141845703125,
0.00901031494140625,
-0.035491943359375,
-0.050079345703125
]
] |
adept/persimmon-8b-base | 2023-10-11T15:05:41.000Z | [
"transformers",
"pytorch",
"persimmon",
"text-generation",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | text-generation | adept | null | null | adept/persimmon-8b-base | 22 | 7,067 | transformers | 2023-09-07T19:39:16 | ---
license: apache-2.0
---
At Adept, we’re working towards an AI agent that can help people do anything they need to do on a computer. We’re not in the business of shipping isolated language models (LMs)—this was an early output of the model scaling program that will support our products.
We trained it from scratch using a context size of 16K. Many LM use cases are context-bound; our model has 4 times the context size of LLaMA2 and 8 times that of GPT-3, MPT, etc.
See https://www.adept.ai/blog/persimmon-8b for more info | 529 | [
[
-0.016204833984375,
-0.037750244140625,
0.060394287109375,
0.01070404052734375,
0.003871917724609375,
-0.004184722900390625,
-0.014404296875,
-0.0531005859375,
-0.0009565353393554688,
0.043243408203125,
-0.033660888671875,
-0.0114898681640625,
-0.037811279296875,
-0.0067138671875,
-0.0200958251953125,
0.064208984375,
0.01404571533203125,
-0.0026416778564453125,
-0.004024505615234375,
-0.02130126953125,
-0.05499267578125,
-0.019866943359375,
-0.0723876953125,
-0.03619384765625,
0.037750244140625,
0.06683349609375,
0.072021484375,
0.06231689453125,
0.0390625,
0.0269622802734375,
0.0103302001953125,
-0.003936767578125,
-0.0374755859375,
0.01509857177734375,
-0.0132904052734375,
-0.034332275390625,
-0.05535888671875,
0.00505828857421875,
0.039642333984375,
0.035919189453125,
-0.016265869140625,
0.0125579833984375,
-0.01491546630859375,
0.041534423828125,
-0.019866943359375,
0.0078582763671875,
-0.043975830078125,
-0.020294189453125,
-0.01470947265625,
0.0206451416015625,
-0.030792236328125,
0.0198974609375,
-0.00733184814453125,
-0.062286376953125,
-0.01103973388671875,
0.0020751953125,
0.0596923828125,
0.056854248046875,
-0.042083740234375,
0.0014638900756835938,
-0.055938720703125,
0.062286376953125,
-0.052276611328125,
0.02484130859375,
0.03558349609375,
0.044464111328125,
-0.01453399658203125,
-0.07366943359375,
-0.036895751953125,
-0.01629638671875,
-0.005161285400390625,
-0.0258331298828125,
-0.017547607421875,
0.0182037353515625,
0.021209716796875,
0.0174407958984375,
-0.0225067138671875,
0.0147247314453125,
-0.021331787109375,
-0.01485443115234375,
0.0601806640625,
0.0233917236328125,
0.042388916015625,
0.025787353515625,
-0.01397705078125,
0.0036296844482421875,
-0.0452880859375,
-0.0010318756103515625,
-0.006816864013671875,
0.0284271240234375,
-0.0108795166015625,
0.04400634765625,
-0.0034847259521484375,
0.0660400390625,
0.01192474365234375,
-0.0151214599609375,
-0.005519866943359375,
-0.01047515869140625,
-0.0177764892578125,
0.0143585205078125,
0.043731689453125,
0.0027923583984375,
-0.0016832351684570312,
-0.01122283935546875,
-0.041229248046875,
-0.006603240966796875,
0.057159423828125,
-0.04547119140625,
-0.0034427642822265625,
0.00922393798828125,
-0.01287841796875,
0.00698089599609375,
-0.01145172119140625,
-0.0079345703125,
-0.04827880859375,
-0.036529541015625,
0.04400634765625,
-0.0214996337890625,
-0.0174713134765625,
-0.011993408203125,
0.004169464111328125,
0.0272674560546875,
0.0401611328125,
-0.05670166015625,
0.02899169921875,
0.052978515625,
0.053375244140625,
-0.008453369140625,
-0.02459716796875,
-0.04559326171875,
0.0006513595581054688,
-0.018096923828125,
0.0576171875,
-0.033935546875,
-0.0177459716796875,
-0.0256500244140625,
0.0001798868179321289,
0.005916595458984375,
-0.03057861328125,
0.0286865234375,
-0.059326171875,
-0.03546142578125,
-0.005252838134765625,
-0.0345458984375,
-0.006572723388671875,
0.036529541015625,
-0.053619384765625,
0.06890869140625,
0.03399658203125,
-0.0214996337890625,
0.019317626953125,
-0.055999755859375,
-0.034576416015625,
0.036346435546875,
-0.01250457763671875,
-0.022369384765625,
0.01335906982421875,
0.00785064697265625,
0.033935546875,
-0.022430419921875,
0.0008425712585449219,
-0.00891876220703125,
-0.0229949951171875,
0.0050506591796875,
-0.006183624267578125,
0.044464111328125,
0.030792236328125,
-0.02349853515625,
0.0200958251953125,
-0.043121337890625,
0.0033473968505859375,
0.0034122467041015625,
-0.043731689453125,
-0.0115509033203125,
-0.0112152099609375,
-0.0009555816650390625,
-0.011505126953125,
0.04791259765625,
-0.041473388671875,
0.04620361328125,
-0.0121307373046875,
0.0250701904296875,
0.04595947265625,
-0.0184173583984375,
0.050628662109375,
-0.0270233154296875,
0.06195068359375,
-0.0166015625,
-0.005107879638671875,
-0.005001068115234375,
-0.03338623046875,
-0.056182861328125,
-0.0313720703125,
0.004482269287109375,
0.028289794921875,
-0.05279541015625,
0.011993408203125,
-0.0281829833984375,
-0.05230712890625,
-0.030303955078125,
0.0258331298828125,
0.0044097900390625,
0.01239776611328125,
0.01120758056640625,
-0.0142974853515625,
-0.037078857421875,
-0.070068359375,
-0.029327392578125,
-0.0142364501953125,
-0.0270233154296875,
0.0233612060546875,
0.01515960693359375,
-0.031402587890625,
0.0947265625,
-0.02386474609375,
-0.048126220703125,
-0.03497314453125,
-0.013580322265625,
0.02191162109375,
0.029296875,
0.0184173583984375,
-0.06451416015625,
-0.0533447265625,
0.0340576171875,
-0.056060791015625,
-0.0030517578125,
-0.017547607421875,
0.0042572021484375,
0.00004410743713378906,
0.0511474609375,
-0.050933837890625,
0.025848388671875,
0.04302978515625,
-0.036773681640625,
0.018157958984375,
-0.02557373046875,
-0.01126861572265625,
-0.08184814453125,
-0.01080322265625,
-0.00728607177734375,
-0.049041748046875,
-0.0509033203125,
0.00978851318359375,
0.006351470947265625,
-0.0217437744140625,
-0.0426025390625,
0.04449462890625,
-0.062164306640625,
-0.0200958251953125,
-0.0245208740234375,
0.00043201446533203125,
-0.02392578125,
0.02105712890625,
-0.0001380443572998047,
0.08697509765625,
0.036468505859375,
-0.0552978515625,
0.0241241455078125,
0.01013946533203125,
-0.0204925537109375,
0.0556640625,
-0.05865478515625,
0.0272674560546875,
0.01531219482421875,
0.00038361549377441406,
-0.04827880859375,
-0.041412353515625,
-0.01175689697265625,
-0.0340576171875,
0.03228759765625,
-0.0031642913818359375,
-0.0204620361328125,
-0.008270263671875,
-0.0023517608642578125,
0.042266845703125,
0.03143310546875,
-0.04718017578125,
0.06640625,
0.0631103515625,
0.007381439208984375,
-0.032257080078125,
-0.043212890625,
0.037872314453125,
-0.026702880859375,
-0.050994873046875,
0.017425537109375,
-0.0048980712890625,
-0.05120849609375,
-0.004138946533203125,
0.01605224609375,
-0.012786865234375,
0.02593994140625,
0.027679443359375,
-0.0107879638671875,
-0.0279693603515625,
0.0181732177734375,
-0.00412750244140625,
-0.0009732246398925781,
-0.01410675048828125,
-0.0243072509765625,
0.026123046875,
-0.046142578125,
-0.01971435546875,
-0.04400634765625,
0.01175689697265625,
0.06097412109375,
0.0020236968994140625,
0.063232421875,
0.02398681640625,
-0.025360107421875,
-0.0187835693359375,
-0.0223388671875,
-0.0063934326171875,
-0.0426025390625,
0.0247955322265625,
-0.0286712646484375,
-0.06610107421875,
0.03302001953125,
0.00499725341796875,
0.0155181884765625,
0.03143310546875,
0.04278564453125,
0.022003173828125,
0.0784912109375,
0.0843505859375,
-0.029449462890625,
0.045806884765625,
-0.01338958740234375,
0.016357421875,
-0.064208984375,
0.0190582275390625,
-0.0223388671875,
-0.045989990234375,
-0.0193023681640625,
-0.032318115234375,
0.0592041015625,
0.002895355224609375,
-0.036407470703125,
0.047760009765625,
-0.0504150390625,
0.045135498046875,
0.048980712890625,
0.0038356781005859375,
0.0208892822265625,
-0.0272674560546875,
0.025543212890625,
0.019134521484375,
-0.059722900390625,
-0.058746337890625,
0.08782958984375,
0.0374755859375,
0.05242919921875,
0.027679443359375,
0.029449462890625,
0.050537109375,
0.05377197265625,
-0.08380126953125,
0.050048828125,
0.02099609375,
-0.060089111328125,
-0.052215576171875,
-0.00826263427734375,
-0.0821533203125,
-0.011138916015625,
0.015228271484375,
-0.06219482421875,
-0.01312255859375,
0.00872802734375,
-0.01904296875,
0.0316162109375,
-0.033782958984375,
0.057220458984375,
-0.0225677490234375,
-0.0220947265625,
-0.0235748291015625,
-0.059112548828125,
0.01220703125,
0.0022716522216796875,
0.006069183349609375,
-0.01470947265625,
-0.029632568359375,
0.047210693359375,
-0.051971435546875,
0.072021484375,
-0.005413055419921875,
-0.0205535888671875,
0.0278778076171875,
0.0217132568359375,
0.02960205078125,
-0.0025653839111328125,
-0.01419830322265625,
0.01093292236328125,
-0.00511932373046875,
-0.040008544921875,
-0.0236663818359375,
0.055267333984375,
-0.08245849609375,
-0.056182861328125,
-0.0250091552734375,
-0.06353759765625,
-0.01039886474609375,
0.030609130859375,
0.00799560546875,
0.01299285888671875,
-0.0277252197265625,
0.01509857177734375,
0.0350341796875,
0.018157958984375,
0.043365478515625,
0.05291748046875,
0.0101318359375,
-0.005970001220703125,
0.054962158203125,
0.004413604736328125,
0.003124237060546875,
0.03338623046875,
0.00998687744140625,
0.00013554096221923828,
-0.0290985107421875,
-0.056182861328125,
0.01297760009765625,
-0.049835205078125,
-0.0244903564453125,
-0.0290679931640625,
-0.003932952880859375,
-0.038848876953125,
-0.0297088623046875,
-0.046051025390625,
-0.0401611328125,
-0.039703369140625,
-0.00344085693359375,
0.0304718017578125,
0.09295654296875,
-0.006542205810546875,
0.0601806640625,
-0.0635986328125,
-0.012786865234375,
0.028961181640625,
0.00916290283203125,
-0.0054473876953125,
-0.039520263671875,
-0.0384521484375,
-0.005535125732421875,
-0.044158935546875,
-0.039459228515625,
0.0279388427734375,
0.035491943359375,
0.046478271484375,
0.050506591796875,
0.015167236328125,
0.021270751953125,
-0.035369873046875,
0.08758544921875,
-0.015838623046875,
-0.06768798828125,
0.0215911865234375,
-0.044830322265625,
0.04412841796875,
0.033935546875,
0.031585693359375,
-0.0517578125,
-0.03057861328125,
-0.038970947265625,
-0.0494384765625,
0.058837890625,
-0.0074615478515625,
0.02752685546875,
0.007488250732421875,
0.050537109375,
-0.0013895034790039062,
0.005062103271484375,
-0.046478271484375,
-0.0068817138671875,
0.012451171875,
-0.023101806640625,
-0.0288848876953125,
-0.0166015625,
-0.01068878173828125,
-0.0100250244140625,
0.059417724609375,
-0.02374267578125,
0.02020263671875,
0.01033782958984375,
-0.028594970703125,
-0.00400543212890625,
0.007061004638671875,
0.045440673828125,
0.054534912109375,
0.0233917236328125,
0.0218353271484375,
0.01800537109375,
-0.045867919921875,
0.019866943359375,
-0.0146636962890625,
0.000049591064453125,
-0.03802490234375,
0.0421142578125,
0.052490234375,
-0.00917816162109375,
-0.068359375,
0.021270751953125,
-0.01218414306640625,
-0.0026073455810546875,
-0.04754638671875,
0.023193359375,
0.0096435546875,
0.0322265625,
0.023468017578125,
-0.0248260498046875,
0.0004563331604003906,
-0.0262908935546875,
0.01274871826171875,
0.0274658203125,
-0.05078125,
-0.041900634765625,
0.06756591796875,
0.0192413330078125,
-0.060150146484375,
0.06414794921875,
-0.0029315948486328125,
-0.047149658203125,
0.048828125,
0.0401611328125,
0.0487060546875,
-0.0158843994140625,
0.003017425537109375,
0.0023097991943359375,
0.03179931640625,
-0.01430511474609375,
-0.006809234619140625,
-0.027008056640625,
-0.06842041015625,
-0.0165863037109375,
-0.02813720703125,
-0.039215087890625,
-0.006526947021484375,
-0.03143310546875,
0.0229034423828125,
-0.044158935546875,
-0.0220794677734375,
-0.014739990234375,
0.01265716552734375,
-0.0380859375,
0.0039043426513671875,
0.00023794174194335938,
0.07769775390625,
-0.044830322265625,
0.06597900390625,
0.0904541015625,
-0.0229034423828125,
-0.0830078125,
-0.019989013671875,
0.0171966552734375,
-0.06744384765625,
0.039215087890625,
0.020050048828125,
-0.006328582763671875,
0.0082244873046875,
-0.055511474609375,
-0.061920166015625,
0.0948486328125,
0.021881103515625,
-0.0281829833984375,
0.01116943359375,
0.01528167724609375,
0.048828125,
-0.036956787109375,
0.0029048919677734375,
0.06329345703125,
0.04541015625,
-0.0079803466796875,
-0.08294677734375,
-0.005023956298828125,
-0.0289459228515625,
-0.00724029541015625,
0.015472412109375,
-0.043670654296875,
0.0706787109375,
-0.0173187255859375,
-0.0006308555603027344,
0.01763916015625,
0.057586669921875,
-0.01190185546875,
0.0164031982421875,
0.048675537109375,
0.04791259765625,
0.06158447265625,
0.017669677734375,
0.09698486328125,
-0.0254364013671875,
0.007549285888671875,
0.08966064453125,
-0.0222930908203125,
0.0479736328125,
0.03765869140625,
-0.0028839111328125,
0.036407470703125,
0.0487060546875,
0.03228759765625,
0.007785797119140625,
-0.0060272216796875,
-0.0419921875,
-0.025634765625,
-0.0213165283203125,
0.0020847320556640625,
0.02593994140625,
0.03216552734375,
-0.0037841796875,
0.003021240234375,
0.0095672607421875,
0.007144927978515625,
0.01953125,
-0.035736083984375,
0.06884765625,
0.013916015625,
-0.049224853515625,
0.036346435546875,
0.00885772705078125,
0.02081298828125,
-0.04833984375,
0.011383056640625,
-0.0174102783203125,
0.04925537109375,
-0.002197265625,
-0.04888916015625,
0.026824951171875,
-0.0192108154296875,
0.0013418197631835938,
-0.02337646484375,
0.051849365234375,
-0.0189208984375,
-0.0229949951171875,
0.00434112548828125,
0.015228271484375,
0.032257080078125,
-0.02166748046875,
-0.0290985107421875,
0.0085906982421875,
-0.0181427001953125,
-0.03338623046875,
0.02630615234375,
0.040191650390625,
-0.02734375,
0.0784912109375,
0.01056671142578125,
0.0032939910888671875,
-0.00981903076171875,
-0.0289154052734375,
0.0692138671875,
-0.0272064208984375,
-0.03076171875,
-0.041839599609375,
0.034698486328125,
0.0011997222900390625,
-0.0213623046875,
0.0728759765625,
0.037353515625,
0.043212890625,
0.00785064697265625,
0.037841796875,
-0.016326904296875,
0.019500732421875,
-0.0002722740173339844,
0.052581787109375,
-0.06658935546875,
0.0311279296875,
0.0140228271484375,
-0.0550537109375,
-0.0310516357421875,
0.032196044921875,
-0.0259857177734375,
-0.01447296142578125,
0.025970458984375,
0.05517578125,
-0.006549835205078125,
-0.01403045654296875,
0.050750732421875,
0.023468017578125,
0.0428466796875,
0.0269622802734375,
0.055816650390625,
-0.0004177093505859375,
0.0223236083984375,
0.00691986083984375,
-0.028411865234375,
-0.017791748046875,
-0.067138671875,
-0.059051513671875,
-0.032867431640625,
-0.0038967132568359375,
-0.01763916015625,
-0.01654052734375,
0.07012939453125,
0.066162109375,
-0.035858154296875,
-0.037994384765625,
-0.00913238525390625,
0.016876220703125,
-0.0250701904296875,
-0.0083465576171875,
-0.00396728515625,
-0.0274810791015625,
-0.054107666015625,
0.058380126953125,
0.007480621337890625,
-0.0032100677490234375,
-0.03826904296875,
-0.02117919921875,
-0.00913238525390625,
0.01409149169921875,
0.035186767578125,
0.037506103515625,
-0.0143890380859375,
-0.043548583984375,
-0.00733184814453125,
-0.0026187896728515625,
0.006420135498046875,
0.0692138671875,
-0.047698974609375,
0.01265716552734375,
0.016265869140625,
0.06591796875,
0.0277099609375,
-0.003673553466796875,
0.0675048828125,
-0.052459716796875,
0.03204345703125,
0.0277252197265625,
0.00931549072265625,
0.0247955322265625,
-0.05517578125,
0.0426025390625,
-0.0204315185546875,
-0.0673828125,
-0.046600341796875,
0.0196075439453125,
-0.06890869140625,
-0.0251617431640625,
0.08343505859375,
-0.004245758056640625,
-0.029510498046875,
-0.00684356689453125,
-0.0305023193359375,
-0.002315521240234375,
-0.03338623046875,
0.037750244140625,
0.03668212890625,
-0.01210784912109375,
-0.004070281982421875,
-0.0264129638671875,
0.026702880859375,
0.0191192626953125,
-0.0501708984375,
0.0003185272216796875,
0.034423828125,
0.0186614990234375,
0.00884246826171875,
0.063720703125,
0.02142333984375,
0.0175323486328125,
0.01125335693359375,
0.0127716064453125,
-0.014190673828125,
-0.0340576171875,
-0.031585693359375,
-0.00415802001953125,
0.039520263671875,
-0.0016117095947265625
]
] |
stablediffusionapi/the-talos-project | 2023-10-17T06:35:20.000Z | [
"diffusers",
"stablediffusionapi.com",
"stable-diffusion-api",
"text-to-image",
"ultra-realistic",
"license:creativeml-openrail-m",
"endpoints_compatible",
"diffusers:StableDiffusionXLPipeline",
"region:us"
] | text-to-image | stablediffusionapi | null | null | stablediffusionapi/the-talos-project | 4 | 7,067 | diffusers | 2023-10-17T06:31:04 | ---
license: creativeml-openrail-m
tags:
- stablediffusionapi.com
- stable-diffusion-api
- text-to-image
- ultra-realistic
pinned: true
---
# The Talos Project API Inference

## Get API Key
Get your API key from [Stable Diffusion API](http://stablediffusionapi.com/); no payment is needed.
Replace the key in the code below and set **model_id** to "the-talos-project".
Coding in PHP/Node/Java etc.? Have a look at the docs for more code examples: [View docs](https://stablediffusionapi.com/docs)
Try model for free: [Generate Images](https://stablediffusionapi.com/models/the-talos-project)
Model link: [View model](https://stablediffusionapi.com/models/the-talos-project)
Credits: [View credits](https://civitai.com/?query=The%20Talos%20Project)
View all models: [View Models](https://stablediffusionapi.com/models)
```python
import requests
import json

url = "https://stablediffusionapi.com/api/v4/dreambooth"

payload = json.dumps({
    "key": "your_api_key",
    "model_id": "the-talos-project",
    "prompt": "ultra realistic close up portrait ((beautiful pale cyberpunk female with heavy black eyeliner)), blue eyes, shaved side haircut, hyper detail, cinematic lighting, magic neon, dark red city, Canon EOS R3, nikon, f/1.4, ISO 200, 1/160s, 8K, RAW, unedited, symmetrical balance, in-frame, 8K",
    "negative_prompt": "painting, extra fingers, mutated hands, poorly drawn hands, poorly drawn face, deformed, ugly, blurry, bad anatomy, bad proportions, extra limbs, cloned face, skinny, glitchy, double torso, extra arms, extra hands, mangled fingers, missing lips, ugly face, distorted face, extra legs, anime",
    "width": "512",
    "height": "512",
    "samples": "1",
    "num_inference_steps": "30",
    "safety_checker": "no",
    "enhance_prompt": "yes",
    "seed": None,
    "guidance_scale": 7.5,
    "multi_lingual": "no",
    "panorama": "no",
    "self_attention": "no",
    "upscale": "no",
    "embeddings": "embeddings_model_id",
    "lora": "lora_model_id",
    "webhook": None,
    "track_id": None
})

headers = {
    'Content-Type': 'application/json'
}

response = requests.request("POST", url, headers=headers, data=payload)

print(response.text)
```
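The response body is JSON. Below is a minimal sketch of parsing it offline; the `status` and `output` field names here are assumptions for illustration, so check the API docs for the actual schema:

```python
import json

# Hypothetical response body; the "status" and "output" fields are
# assumptions for illustration, not confirmed from the API docs.
body = '{"status": "success", "output": ["https://example.com/img.png"]}'

# Parse the JSON string and pull out the generated image URLs on success.
data = json.loads(body)
if data.get("status") == "success":
    image_urls = data["output"]
else:
    image_urls = []

print(image_urls)
```

With a live call you would use `response.json()` from `requests` instead of parsing a string by hand.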
> Use this coupon code to get 25% off **DMGG0RBN** | 2,496 | [
[
-0.031585693359375,
-0.0526123046875,
0.036834716796875,
0.00872802734375,
-0.039764404296875,
0.01386260986328125,
0.0218505859375,
-0.03875732421875,
0.04022216796875,
0.042083740234375,
-0.056060791015625,
-0.06964111328125,
-0.041046142578125,
0.0042266845703125,
-0.0162200927734375,
0.04962158203125,
0.0014438629150390625,
-0.011199951171875,
0.005374908447265625,
0.0108795166015625,
-0.027923583984375,
0.00000852346420288086,
-0.052642822265625,
-0.00705718994140625,
0.00862884521484375,
0.0034961700439453125,
0.046478271484375,
0.045166015625,
0.025360107421875,
0.0209808349609375,
-0.023284912109375,
-0.001605987548828125,
-0.0287017822265625,
-0.00910186767578125,
-0.0123138427734375,
-0.04730224609375,
-0.04010009765625,
-0.005504608154296875,
0.0310211181640625,
0.02044677734375,
0.005611419677734375,
0.04412841796875,
-0.0007548332214355469,
0.054229736328125,
-0.049652099609375,
0.01403045654296875,
-0.0251617431640625,
0.021270751953125,
0.0007381439208984375,
-0.0012826919555664062,
-0.01751708984375,
-0.02423095703125,
-0.00827789306640625,
-0.07293701171875,
0.031585693359375,
0.010650634765625,
0.0987548828125,
0.0173187255859375,
-0.0245513916015625,
-0.0153961181640625,
-0.0384521484375,
0.06903076171875,
-0.06494140625,
0.0242919921875,
0.032501220703125,
0.0072784423828125,
-0.006011962890625,
-0.05926513671875,
-0.040679931640625,
0.0110931396484375,
0.01248931884765625,
0.0129852294921875,
-0.0249481201171875,
-0.0038471221923828125,
0.0146942138671875,
0.0310211181640625,
-0.03875732421875,
-0.025115966796875,
-0.03814697265625,
-0.010986328125,
0.037841796875,
0.0100860595703125,
0.01238250732421875,
-0.03350830078125,
-0.0360107421875,
-0.0262908935546875,
-0.043701171875,
0.01922607421875,
0.042144775390625,
0.031585693359375,
-0.057403564453125,
0.04766845703125,
-0.0247802734375,
0.055908203125,
0.01447296142578125,
-0.0111541748046875,
0.049835205078125,
-0.01294708251953125,
-0.0249786376953125,
-0.0187835693359375,
0.06884765625,
0.0430908203125,
-0.01461029052734375,
0.0118408203125,
-0.013885498046875,
0.0030517578125,
0.00449371337890625,
-0.07391357421875,
0.007793426513671875,
0.058502197265625,
-0.048736572265625,
-0.0467529296875,
0.0084228515625,
-0.07952880859375,
-0.0123443603515625,
0.005588531494140625,
0.031341552734375,
-0.0198211669921875,
-0.0264739990234375,
0.026397705078125,
-0.0143890380859375,
0.003997802734375,
0.0134735107421875,
-0.047943115234375,
0.009490966796875,
0.03326416015625,
0.07562255859375,
0.00970458984375,
-0.000217437744140625,
0.0008220672607421875,
0.0248260498046875,
-0.035308837890625,
0.05902099609375,
-0.01296234130859375,
-0.040985107421875,
-0.00980377197265625,
0.0228271484375,
-0.0011167526245117188,
-0.030731201171875,
0.04888916015625,
-0.041961669921875,
0.0030059814453125,
-0.0217437744140625,
-0.0220794677734375,
-0.026641845703125,
0.0187835693359375,
-0.043426513671875,
0.052093505859375,
0.011688232421875,
-0.0587158203125,
0.016754150390625,
-0.06622314453125,
-0.01220703125,
-0.00612640380859375,
0.0007052421569824219,
-0.0310516357421875,
-0.00521087646484375,
0.005855560302734375,
0.015716552734375,
-0.02410888671875,
0.004688262939453125,
-0.036407470703125,
-0.02691650390625,
0.0174560546875,
-0.0013942718505859375,
0.076171875,
0.0457763671875,
-0.022552490234375,
0.005161285400390625,
-0.0489501953125,
0.0024318695068359375,
0.047637939453125,
-0.0164642333984375,
-0.0020542144775390625,
-0.018829345703125,
0.00417327880859375,
-0.006561279296875,
0.02154541015625,
-0.0479736328125,
0.0022068023681640625,
-0.03338623046875,
0.0240478515625,
0.042755126953125,
0.01178741455078125,
0.019195556640625,
-0.031524658203125,
0.0616455078125,
0.00424957275390625,
0.03387451171875,
0.00885009765625,
-0.045257568359375,
-0.039154052734375,
-0.029632568359375,
0.01505279541015625,
0.0291595458984375,
-0.044219970703125,
0.018280029296875,
-0.0099334716796875,
-0.04595947265625,
-0.0310211181640625,
-0.013458251953125,
0.040618896484375,
0.04010009765625,
0.0052337646484375,
-0.02142333984375,
-0.04107666015625,
-0.061248779296875,
0.0010538101196289062,
-0.01214599609375,
-0.00830078125,
0.024322509765625,
0.040435791015625,
-0.01473236083984375,
0.06488037109375,
-0.05865478515625,
-0.0146484375,
-0.003955841064453125,
0.0007009506225585938,
0.0538330078125,
0.052093505859375,
0.06622314453125,
-0.06756591796875,
-0.0198211669921875,
-0.0218353271484375,
-0.055633544921875,
0.00927734375,
0.0240325927734375,
-0.0141754150390625,
0.0012388229370117188,
0.0003743171691894531,
-0.0615234375,
0.04095458984375,
0.03814697265625,
-0.05511474609375,
0.045379638671875,
-0.015869140625,
0.051849365234375,
-0.105224609375,
0.006160736083984375,
0.0162200927734375,
-0.0128173828125,
-0.0310211181640625,
0.03485107421875,
-0.0116119384765625,
-0.013031005859375,
-0.0543212890625,
0.04937744140625,
-0.031585693359375,
0.0036945343017578125,
-0.01247406005859375,
-0.003177642822265625,
0.0214080810546875,
0.021575927734375,
-0.00356292724609375,
0.04437255859375,
0.050323486328125,
-0.0287017822265625,
0.029449462890625,
0.02337646484375,
-0.023681640625,
0.03271484375,
-0.0628662109375,
0.01522064208984375,
0.00400543212890625,
0.0233001708984375,
-0.0794677734375,
-0.041351318359375,
0.039215087890625,
-0.05059814453125,
-0.002124786376953125,
-0.061065673828125,
-0.046295166015625,
-0.049774169921875,
-0.0335693359375,
0.0269775390625,
0.055755615234375,
-0.0263519287109375,
0.05670166015625,
0.023193359375,
0.0116119384765625,
-0.056610107421875,
-0.0673828125,
-0.014678955078125,
-0.03094482421875,
-0.04736328125,
0.0292510986328125,
0.0005750656127929688,
-0.02325439453125,
0.006832122802734375,
0.010650634765625,
-0.0114898681640625,
-0.0093231201171875,
0.043487548828125,
0.0416259765625,
-0.017669677734375,
-0.01412200927734375,
0.0167694091796875,
-0.00452423095703125,
-0.0014095306396484375,
-0.025543212890625,
0.06585693359375,
-0.01922607421875,
-0.0394287109375,
-0.06585693359375,
0.0021209716796875,
0.05755615234375,
-0.0018301010131835938,
0.043487548828125,
0.03173828125,
-0.049102783203125,
0.01059722900390625,
-0.04705810546875,
-0.0172271728515625,
-0.03662109375,
0.0166473388671875,
-0.0380859375,
-0.028778076171875,
0.07666015625,
0.0025196075439453125,
-0.003520965576171875,
0.051666259765625,
0.02935791015625,
-0.00550079345703125,
0.08087158203125,
0.025543212890625,
0.01708984375,
0.0273895263671875,
-0.06640625,
-0.01082611083984375,
-0.0540771484375,
-0.0230255126953125,
-0.03179931640625,
-0.01849365234375,
-0.0196075439453125,
-0.035980224609375,
0.01015472412109375,
0.01242828369140625,
-0.03173828125,
0.0261688232421875,
-0.048065185546875,
0.03271484375,
0.037078857421875,
0.0257568359375,
0.0145111083984375,
0.01200103759765625,
-0.013946533203125,
0.01067352294921875,
-0.02606201171875,
-0.042755126953125,
0.078369140625,
0.0174713134765625,
0.05584716796875,
0.009674072265625,
0.050079345703125,
0.025482177734375,
-0.010406494140625,
-0.032684326171875,
0.03515625,
0.0030727386474609375,
-0.07769775390625,
0.007030487060546875,
-0.0189208984375,
-0.07574462890625,
0.015869140625,
-0.0285797119140625,
-0.06634521484375,
0.044342041015625,
0.015625,
-0.0537109375,
0.04461669921875,
-0.059967041015625,
0.057403564453125,
0.0052032470703125,
-0.0478515625,
-0.0220947265625,
-0.037689208984375,
0.03216552734375,
-0.00001609325408935547,
0.046905517578125,
-0.034912109375,
-0.00827789306640625,
0.064208984375,
-0.036041259765625,
0.076904296875,
-0.0303497314453125,
0.006763458251953125,
0.058502197265625,
0.01934814453125,
0.0299224853515625,
0.0206756591796875,
-0.01580810546875,
0.0163116455078125,
0.031768798828125,
-0.0416259765625,
-0.0146484375,
0.059234619140625,
-0.062347412109375,
-0.033538818359375,
-0.0226898193359375,
-0.041290283203125,
-0.0015430450439453125,
0.0209808349609375,
0.03369140625,
0.030029296875,
0.0038280487060546875,
-0.011260986328125,
0.03558349609375,
-0.0083465576171875,
0.0318603515625,
0.0146484375,
-0.042144775390625,
-0.051422119140625,
0.046783447265625,
-0.01401519775390625,
0.026947021484375,
-0.0017118453979492188,
0.0259552001953125,
-0.0367431640625,
-0.049774169921875,
-0.050445556640625,
0.028167724609375,
-0.0513916015625,
-0.0318603515625,
-0.04541015625,
0.0119781494140625,
-0.053436279296875,
-0.02264404296875,
-0.055450439453125,
-0.0294647216796875,
-0.040863037109375,
-0.00885772705078125,
0.05517578125,
0.044158935546875,
-0.01139068603515625,
0.0270233154296875,
-0.062225341796875,
0.030242919921875,
0.0027179718017578125,
0.0196075439453125,
0.0107879638671875,
-0.050750732421875,
-0.0027561187744140625,
0.0231170654296875,
-0.032318115234375,
-0.071044921875,
0.043487548828125,
-0.021575927734375,
0.01995849609375,
0.05517578125,
0.00974273681640625,
0.061279296875,
0.0084686279296875,
0.06756591796875,
0.028778076171875,
-0.06292724609375,
0.0670166015625,
-0.050811767578125,
0.007015228271484375,
0.043487548828125,
0.0166168212890625,
-0.0196380615234375,
-0.01904296875,
-0.054901123046875,
-0.06781005859375,
0.050994873046875,
0.01995849609375,
0.0217437744140625,
0.0003364086151123047,
0.039337158203125,
0.0006856918334960938,
0.00988006591796875,
-0.06951904296875,
-0.024688720703125,
-0.005847930908203125,
-0.014190673828125,
0.031524658203125,
-0.00891876220703125,
-0.0289154052734375,
-0.016510009765625,
0.06396484375,
-0.002872467041015625,
0.0244293212890625,
0.0266571044921875,
0.0182952880859375,
-0.0270233154296875,
-0.0017452239990234375,
0.032440185546875,
0.0509033203125,
-0.040008544921875,
-0.006435394287109375,
0.0021381378173828125,
-0.03839111328125,
0.01178741455078125,
0.00756072998046875,
-0.0279693603515625,
-0.0009751319885253906,
0.0136260986328125,
0.048492431640625,
-0.01898193359375,
-0.041046142578125,
0.041656494140625,
-0.01032257080078125,
-0.0254058837890625,
-0.0302886962890625,
0.0015573501586914062,
0.0204925537109375,
0.049774169921875,
0.04168701171875,
0.0187530517578125,
0.01372528076171875,
-0.029052734375,
-0.0143890380859375,
0.030303955078125,
-0.02569580078125,
-0.03179931640625,
0.07415771484375,
-0.004886627197265625,
-0.03192138671875,
0.0225067138671875,
-0.0306243896484375,
-0.00811004638671875,
0.0701904296875,
0.0540771484375,
0.060638427734375,
-0.0016231536865234375,
0.0210113525390625,
0.0584716796875,
0.002452850341796875,
-0.01232147216796875,
0.055450439453125,
0.012847900390625,
-0.045623779296875,
-0.01922607421875,
-0.060211181640625,
-0.01279449462890625,
0.0085601806640625,
-0.048736572265625,
0.04541015625,
-0.0526123046875,
-0.039154052734375,
-0.0232391357421875,
-0.012786865234375,
-0.04010009765625,
0.028350830078125,
0.00726318359375,
0.060791015625,
-0.061279296875,
0.042449951171875,
0.0467529296875,
-0.040557861328125,
-0.07183837890625,
-0.0191650390625,
0.0176544189453125,
-0.0599365234375,
0.037506103515625,
-0.0005388259887695312,
-0.0162200927734375,
0.006832122802734375,
-0.04595947265625,
-0.0799560546875,
0.0908203125,
0.033050537109375,
-0.0243377685546875,
-0.00189971923828125,
0.004032135009765625,
0.0252227783203125,
-0.031402587890625,
0.02069091796875,
0.004634857177734375,
0.02642822265625,
0.0335693359375,
-0.044952392578125,
0.017822265625,
-0.022979736328125,
-0.00554656982421875,
-0.0071258544921875,
-0.05242919921875,
0.057586669921875,
-0.047149658203125,
-0.010650634765625,
0.0292205810546875,
0.040496826171875,
0.0526123046875,
0.034515380859375,
0.047027587890625,
0.0703125,
0.03277587890625,
-0.0108184814453125,
0.07958984375,
-0.031768798828125,
0.06146240234375,
0.052764892578125,
0.004276275634765625,
0.05926513671875,
0.039031982421875,
-0.02911376953125,
0.043548583984375,
0.07745361328125,
-0.01739501953125,
0.0570068359375,
0.00493621826171875,
-0.02532958984375,
-0.0013523101806640625,
-0.0106201171875,
-0.053131103515625,
0.0218963623046875,
0.0175933837890625,
-0.0350341796875,
0.00441741943359375,
0.00119781494140625,
-0.0081787109375,
-0.01824951171875,
-0.01279449462890625,
0.021484375,
-0.00400543212890625,
-0.0166473388671875,
0.052459716796875,
-0.0041351318359375,
0.0682373046875,
-0.04559326171875,
0.003086090087890625,
-0.006923675537109375,
0.01024627685546875,
-0.0182342529296875,
-0.046356201171875,
0.0174102783203125,
0.0036945343017578125,
-0.00579071044921875,
0.0019989013671875,
0.041839599609375,
0.0181121826171875,
-0.060699462890625,
0.0250701904296875,
0.02996826171875,
0.0261688232421875,
-0.0005612373352050781,
-0.0634765625,
0.016357421875,
0.0139923095703125,
-0.0318603515625,
-0.004119873046875,
0.0108489990234375,
0.0309906005859375,
0.046356201171875,
0.059600830078125,
0.0136260986328125,
0.0023288726806640625,
0.003948211669921875,
0.049530029296875,
-0.042236328125,
-0.0382080078125,
-0.06439208984375,
0.040008544921875,
-0.005023956298828125,
-0.0259246826171875,
0.034454345703125,
0.06915283203125,
0.05242919921875,
-0.036956787109375,
0.048553466796875,
-0.023681640625,
0.032623291015625,
-0.03173828125,
0.0587158203125,
-0.060089111328125,
0.006496429443359375,
-0.033538818359375,
-0.052581787109375,
-0.019866943359375,
0.0394287109375,
-0.018218994140625,
0.01381683349609375,
0.0303192138671875,
0.066650390625,
-0.0186767578125,
-0.001110076904296875,
0.00576019287109375,
0.011627197265625,
0.007198333740234375,
0.023284912109375,
0.058837890625,
-0.04364013671875,
0.032989501953125,
-0.049774169921875,
-0.01194000244140625,
0.00611114501953125,
-0.056121826171875,
-0.049957275390625,
-0.03192138671875,
-0.03778076171875,
-0.05194091796875,
0.0037288665771484375,
0.06634521484375,
0.0755615234375,
-0.06585693359375,
-0.021453857421875,
-0.01073455810546875,
0.00504302978515625,
-0.0094146728515625,
-0.0192718505859375,
0.032257080078125,
0.020843505859375,
-0.0792236328125,
0.0117950439453125,
-0.0030689239501953125,
0.04656982421875,
0.01000213623046875,
0.003406524658203125,
-0.0058746337890625,
-0.004119873046875,
0.0189361572265625,
0.017822265625,
-0.0673828125,
-0.0085601806640625,
-0.0021724700927734375,
0.00984954833984375,
0.017059326171875,
0.0210418701171875,
-0.042999267578125,
0.0228118896484375,
0.04461669921875,
0.00431060791015625,
0.051025390625,
-0.0027637481689453125,
0.0072784423828125,
-0.024566650390625,
0.03668212890625,
0.0079193115234375,
0.047943115234375,
0.0133514404296875,
-0.04583740234375,
0.040557861328125,
0.034454345703125,
-0.021270751953125,
-0.064208984375,
0.00281524658203125,
-0.0887451171875,
-0.0305023193359375,
0.07415771484375,
-0.01296234130859375,
-0.0355224609375,
0.01393890380859375,
-0.01477813720703125,
0.0295867919921875,
-0.027587890625,
0.06024169921875,
0.02850341796875,
-0.00872802734375,
-0.001934051513671875,
-0.05474853515625,
0.0024700164794921875,
0.0166778564453125,
-0.05670166015625,
-0.01422119140625,
0.032928466796875,
0.0408935546875,
0.0399169921875,
0.033721923828125,
-0.041259765625,
0.010406494140625,
0.0194244384765625,
0.037384033203125,
-0.002407073974609375,
0.0180511474609375,
-0.011749267578125,
0.0116729736328125,
0.0007681846618652344,
-0.04815673828125
]
] |
uukuguy/speechless-codellama-34b-v2.0 | 2023-10-13T11:46:44.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"llama-2",
"code",
"en",
"dataset:jondurbin/airoboros-2.2",
"dataset:Open-Orca/OpenOrca",
"dataset:garage-bAInd/Open-Platypus",
"dataset:WizardLM/WizardLM_evol_instruct_V2_196k",
"arxiv:2308.12950",
"license:llama2",
"model-index",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text-generation | uukuguy | null | null | uukuguy/speechless-codellama-34b-v2.0 | 6 | 7,066 | transformers | 2023-10-04T09:56:38 | ---
language:
- en
library_name: transformers
pipeline_tag: text-generation
datasets:
- jondurbin/airoboros-2.2
- Open-Orca/OpenOrca
- garage-bAInd/Open-Platypus
- WizardLM/WizardLM_evol_instruct_V2_196k
tags:
- llama-2
- code
license: llama2
model-index:
- name: SpeechlessCoder
results:
- task:
type: text-generation
dataset:
type: openai_humaneval
name: HumanEval
metrics:
- name: pass@1
type: pass@1
value: 75.61
verified: false
---
<p><h1> speechless-codellama-34b-v2.0 </h1></p>
* [AWQ model(s) for GPU inference.](https://huggingface.co/TheBloke/speechless-codellama-34b-v2.0-AWQ)
* [GPTQ models for GPU inference, with multiple quantisation parameter options.](https://huggingface.co/TheBloke/speechless-codellama-34b-v2.0-GPTQ)
* [2, 3, 4, 5, 6 and 8-bit GGUF models for CPU+GPU inference](https://huggingface.co/TheBloke/speechless-codellama-34b-v2.0-GGUF)
The following datasets were used to fine-tune codellama/CodeLlama-34B in order to improve the model's reasoning and planning capabilities.
A total of 153,013 samples:
- jondurbin/airoboros-2.2: categories related to coding, reasoning and planning. 23,462 samples.
- Open-Orca/OpenOrca: the 'cot' category of the 1M GPT4 dataset. 74,440 samples.
- garage-bAInd/Open-Platypus: used in full (100%). 24,926 samples.
- WizardLM/WizardLM_evol_instruct_V2_196k: the coding conversation part. 30,185 samples.
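The per-source filtering described above can be sketched in plain Python. The category labels and record shapes below are illustrative assumptions; the exact filter criteria used to build the 153,013-sample mix are not published.

```python
# Hedged sketch: combine several instruction datasets, keeping only
# categories relevant to coding, reasoning and planning.
# The category names here are assumptions for illustration only.

KEEP_CATEGORIES = {"coding", "reasoning", "plan"}

def filter_samples(samples):
    """Keep samples whose 'category' field matches a wanted topic."""
    return [s for s in samples if s.get("category") in KEEP_CATEGORIES]

def combine(*sources):
    """Concatenate already-filtered sources into one fine-tuning mix."""
    mix = []
    for src in sources:
        mix.extend(src)
    return mix

# Toy stand-ins for the real datasets:
airoboros = [{"category": "coding", "text": "..."},
             {"category": "roleplay", "text": "..."}]  # dropped by the filter
platypus = [{"category": "reasoning", "text": "..."}]  # Open-Platypus is used at 100%

mix = combine(filter_samples(airoboros), platypus)
print(len(mix))  # 2
```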
## HumanEval
| human-eval | pass@1 |
| --- | --- |
| humaneval-python | 75.61 |
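For reference, pass@k on HumanEval is usually reported with the unbiased estimator from the HumanEval paper; with a single sample per task (n = k = 1) it reduces to the plain fraction of problems solved. A minimal sketch:

```python
from math import comb

def pass_at_k(n: int, c: int, k: int) -> float:
    """Unbiased pass@k estimator: n generated samples per task, c correct."""
    if n - c < k:
        return 1.0  # too few failures for any size-k draw to miss
    return 1.0 - comb(n - c, k) / comb(n, k)

# Averaged over tasks; with n == k == 1 this is just the solve rate.
scores = [pass_at_k(1, 1, 1), pass_at_k(1, 0, 1)]  # one solved, one failed task
print(sum(scores) / len(scores))  # 0.5
```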
[Big Code Models Leaderboard](https://huggingface.co/spaces/bigcode/bigcode-models-leaderboard)
| Models | pass@1 |
|------ | ------ |
| Phind-CodeLlama-34B-v2| 71.95|
| WizardCoder-Python-34B-V1.0| 70.73|
| Phind-CodeLlama-34B-Python-v1| 70.22|
| Phind-CodeLlama-34B-v1| 65.85|
| WizardCoder-Python-13B-V1.0| 62.19|
| WizardCoder-15B-V1.0| 58.12|
| CodeLlama-34B-Python| 53.29|
| CodeLlama-34B-Instruct| 50.79|
| CodeLlama-13B-Instruct| 50.6|
| CodeLlama-34B| 45.11|
| CodeLlama-13B-Python| 42.89|
| CodeLlama-13B| 35.07|
## NL2SQL
SQL-EVAL: 125/175 (71.43%)
- Average rate of exact match: 67.43%
- Average correct rate: 71.43%

For comparison:
- GPT4: 130/175 (74.29%)
- GPT-3.5-Turbo-0613: 105/175 (60.00%)
## lm-evaluation-harness
[Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
| Metric | Value |
| --- | --- |
| ARC | 54.35 |
| HellaSwag | 75.65 |
| MMLU | 54.67 |
| TruthfulQA | 45.21 |
| Average | 57.47 |
## Training Environment
- H800-80G x 2
- transformers==4.33.0
- flash-attn==2.1.0
- bitsandbytes==0.41.1
- peft==0.5.0
## Training Arguments
| | |
|------ | ------ |
| lr | 2e-4 |
| lr_scheduler_type | cosine |
| weight_decay | 0.0 |
| optim | paged_adamw_8bit |
| flash_attention | True |
| rerope | False |
| max_new_tokens | 8192 |
| num_train_epochs | 3 |
| bits | 4 |
| lora_r | 64 |
| lora_alpha | 16 |
| lora_dropout | 0.05 |
| double_quant | True |
| quant_type | nf4 |
| dataset_format | airoboros |
| mini_batch_size | 4 |
| gradient_accumulation_steps | 16 |
| bf16 | True |
## Training Results
| | |
|------ | ------ |
| epoch | 3.0 |
| train_loss | 0.4261 |
| train_runtime | 1 day, 14:42:57.87 |
| train_samples_per_second | 3.227 |
| train_steps_per_second | 0.025 |
| eval_loss | 0.4537 |
| eval_runtime | 0:00:36.19 |
| eval_samples_per_second | 5.525 |
| eval_steps_per_second | 2.763 |
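From the training arguments above, the effective global batch size can be derived as mini-batch size × gradient accumulation steps × number of GPUs (assuming both H800 cards are used for data parallelism, which is not stated explicitly):

```python
# Effective batch size under the assumption of 2-way data parallelism.
mini_batch_size = 4
gradient_accumulation_steps = 16
num_gpus = 2  # H800-80G x 2

effective_batch_size = mini_batch_size * gradient_accumulation_steps * num_gpus
print(effective_batch_size)  # 128
```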
# **Code Llama**
Code Llama is a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 34 billion parameters. This is the repository for the base 34B version in the Hugging Face Transformers format. This model is designed for general code synthesis and understanding. Links to other models can be found in the index at the bottom.
| | Base Model | Python | Instruct |
| --- | ----------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------------- |
| 7B | [codellama/CodeLlama-7b-hf](https://huggingface.co/codellama/CodeLlama-7b-hf) | [codellama/CodeLlama-7b-Python-hf](https://huggingface.co/codellama/CodeLlama-7b-Python-hf) | [codellama/CodeLlama-7b-Instruct-hf](https://huggingface.co/codellama/CodeLlama-7b-Instruct-hf) |
| 13B | [codellama/CodeLlama-13b-hf](https://huggingface.co/codellama/CodeLlama-13b-hf) | [codellama/CodeLlama-13b-Python-hf](https://huggingface.co/codellama/CodeLlama-13b-Python-hf) | [codellama/CodeLlama-13b-Instruct-hf](https://huggingface.co/codellama/CodeLlama-13b-Instruct-hf) |
| 34B | [codellama/CodeLlama-34b-hf](https://huggingface.co/codellama/CodeLlama-34b-hf) | [codellama/CodeLlama-34b-Python-hf](https://huggingface.co/codellama/CodeLlama-34b-Python-hf) | [codellama/CodeLlama-34b-Instruct-hf](https://huggingface.co/codellama/CodeLlama-34b-Instruct-hf) |
## Model Use
To use this model, please make sure to install transformers from `main` until the next version is released:
```bash
pip install git+https://github.com/huggingface/transformers.git@main accelerate
```
Model capabilities:
- [x] Code completion.
- [x] Infilling.
- [ ] Instructions / chat.
- [ ] Python specialist.
```python
from transformers import AutoTokenizer
import transformers
import torch
model = "codellama/CodeLlama-34b-hf"
tokenizer = AutoTokenizer.from_pretrained(model)
pipeline = transformers.pipeline(
"text-generation",
model=model,
torch_dtype=torch.float16,
device_map="auto",
)
sequences = pipeline(
'import socket\n\ndef ping_exponential_backoff(host: str):',
do_sample=True,
top_k=10,
temperature=0.1,
top_p=0.95,
num_return_sequences=1,
eos_token_id=tokenizer.eos_token_id,
max_length=200,
)
for seq in sequences:
print(f"Result: {seq['generated_text']}")
```
## Model Details
**Note:** Use of this model is governed by the Meta license. Meta developed and publicly released the Code Llama family of large language models (LLMs).
**Model Developers** Meta
**Variations** Code Llama comes in three model sizes and three variants:
* Code Llama: base models designed for general code synthesis and understanding
* Code Llama - Python: designed specifically for Python
* Code Llama - Instruct: for instruction following and safer deployment
All variants are available in sizes of 7B, 13B and 34B parameters.
**This repository contains the base version of the 34B-parameter model.**
**Input** Models input text only.
**Output** Models generate text only.
**Model Architecture** Code Llama is an auto-regressive language model that uses an optimized transformer architecture.
**Model Dates** Code Llama and its variants have been trained between January 2023 and July 2023.
**Status** This is a static model trained on an offline dataset. Future versions of Code Llama - Instruct will be released as we improve model safety with community feedback.
**License** A custom commercial license is available at: [https://ai.meta.com/resources/models-and-libraries/llama-downloads/](https://ai.meta.com/resources/models-and-libraries/llama-downloads/)
**Research Paper** More information can be found in the paper "[Code Llama: Open Foundation Models for Code](https://ai.meta.com/research/publications/code-llama-open-foundation-models-for-code/)" or its [arXiv page](https://arxiv.org/abs/2308.12950).
## Intended Use
**Intended Use Cases** Code Llama and its variants are intended for commercial and research use in English and relevant programming languages. The base model Code Llama can be adapted for a variety of code synthesis and understanding tasks; Code Llama - Python is designed specifically to handle the Python programming language; and Code Llama - Instruct is intended to be safer to use for code assistant and generation applications.
**Out-of-Scope Uses** Use in any manner that violates applicable laws or regulations (including trade compliance laws). Use in languages other than English. Use in any other way that is prohibited by the Acceptable Use Policy and Licensing Agreement for Code Llama and its variants.
## Hardware and Software
**Training Factors** We used custom training libraries. The training and fine-tuning of the released models have been performed on Meta’s Research Super Cluster.
**Carbon Footprint** In aggregate, training all 9 Code Llama models required 400K GPU hours of computation on hardware of type A100-80GB (TDP of 350-400W). Estimated total emissions were 65.3 tCO2eq, 100% of which were offset by Meta’s sustainability program.
## Training Data
All experiments reported here and the released models have been trained and fine-tuned using the same data as Llama 2 with different weights (see Section 2 and Table 1 in the [research paper](https://ai.meta.com/research/publications/code-llama-open-foundation-models-for-code/) for details).
## Evaluation Results
See evaluations for the main models and detailed ablations in Section 3 and safety evaluations in Section 4 of the research paper.
## Ethical Considerations and Limitations
Code Llama and its variants are a new technology that carries risks with use. Testing conducted to date has been in English, and has not covered, nor could it cover all scenarios. For these reasons, as with all LLMs, Code Llama’s potential outputs cannot be predicted in advance, and the model may in some instances produce inaccurate or objectionable responses to user prompts. Therefore, before deploying any applications of Code Llama, developers should perform safety testing and tuning tailored to their specific applications of the model.
Please see the Responsible Use Guide available at [https://ai.meta.com/llama/responsible-user-guide](https://ai.meta.com/llama/responsible-user-guide).
| 10,154 | [
[
-0.033233642578125,
-0.0601806640625,
0.016265869140625,
0.037689208984375,
-0.00865936279296875,
-0.00286102294921875,
-0.0162353515625,
-0.03155517578125,
0.02044677734375,
0.0266571044921875,
-0.03240966796875,
-0.05218505859375,
-0.03887939453125,
0.0146942138671875,
-0.024261474609375,
0.07659912109375,
-0.006237030029296875,
-0.0174713134765625,
-0.0056915283203125,
-0.0014133453369140625,
-0.01971435546875,
-0.04180908203125,
-0.02630615234375,
-0.0303497314453125,
0.016021728515625,
0.017730712890625,
0.052703857421875,
0.0460205078125,
0.037017822265625,
0.02581787109375,
-0.0180511474609375,
0.005313873291015625,
-0.02349853515625,
-0.02276611328125,
0.0168914794921875,
-0.036041259765625,
-0.05975341796875,
-0.0002994537353515625,
0.029815673828125,
0.0236053466796875,
-0.0229034423828125,
0.0259246826171875,
-0.0012798309326171875,
0.04705810546875,
-0.023162841796875,
0.020599365234375,
-0.04034423828125,
0.0014019012451171875,
-0.0033969879150390625,
-0.01389312744140625,
-0.01715087890625,
-0.035888671875,
-0.00457763671875,
-0.03936767578125,
0.009765625,
0.0006070137023925781,
0.09527587890625,
0.0267333984375,
-0.0169677734375,
-0.015869140625,
-0.026153564453125,
0.05523681640625,
-0.07098388671875,
0.006618499755859375,
0.0303497314453125,
-0.00304412841796875,
-0.0151214599609375,
-0.05902099609375,
-0.050201416015625,
-0.018646240234375,
-0.0144805908203125,
0.012054443359375,
-0.030242919921875,
0.0038890838623046875,
0.03692626953125,
0.043609619140625,
-0.048858642578125,
0.002552032470703125,
-0.0274658203125,
-0.016021728515625,
0.06427001953125,
0.01268768310546875,
0.031707763671875,
-0.024749755859375,
-0.02337646484375,
-0.01007843017578125,
-0.05096435546875,
0.01418304443359375,
0.0298919677734375,
-0.004451751708984375,
-0.045562744140625,
0.04583740234375,
-0.0224761962890625,
0.05523681640625,
0.00818634033203125,
-0.0304107666015625,
0.0457763671875,
-0.031402587890625,
-0.02874755859375,
-0.015625,
0.08056640625,
0.035308837890625,
0.020477294921875,
0.004909515380859375,
-0.01342010498046875,
0.0182342529296875,
0.0009794235229492188,
-0.0677490234375,
-0.0204010009765625,
0.023834228515625,
-0.0430908203125,
-0.037841796875,
-0.01239776611328125,
-0.0611572265625,
0.00304412841796875,
-0.0117645263671875,
0.0275115966796875,
-0.038238525390625,
-0.025390625,
0.019744873046875,
-0.0064544677734375,
0.0390625,
0.01358795166015625,
-0.061798095703125,
0.0094146728515625,
0.033233642578125,
0.061248779296875,
0.0030364990234375,
-0.0288848876953125,
-0.00417327880859375,
-0.01229095458984375,
-0.02130126953125,
0.047210693359375,
-0.0227508544921875,
-0.0248870849609375,
-0.0158233642578125,
0.00963592529296875,
-0.01202392578125,
-0.032470703125,
0.026519775390625,
-0.0214691162109375,
0.0157012939453125,
0.0030269622802734375,
-0.038238525390625,
-0.0240020751953125,
0.00510406494140625,
-0.044586181640625,
0.0875244140625,
0.0178985595703125,
-0.05706787109375,
0.002323150634765625,
-0.04449462890625,
-0.0233306884765625,
-0.0194244384765625,
-0.0069732666015625,
-0.054351806640625,
-0.0117645263671875,
0.030853271484375,
0.036865234375,
-0.0279998779296875,
0.0229034423828125,
-0.0112152099609375,
-0.032806396484375,
0.0163421630859375,
-0.017730712890625,
0.08203125,
0.0206298828125,
-0.04345703125,
0.01050567626953125,
-0.062744140625,
0.0086212158203125,
0.034027099609375,
-0.03253173828125,
0.01113128662109375,
-0.01218414306640625,
0.0034999847412109375,
0.00960540771484375,
0.029815673828125,
-0.0213623046875,
0.03759765625,
-0.03643798828125,
0.0557861328125,
0.048004150390625,
-0.0009899139404296875,
0.0275115966796875,
-0.0428466796875,
0.050201416015625,
-0.00301361083984375,
0.0158843994140625,
-0.0182952880859375,
-0.061248779296875,
-0.06768798828125,
-0.03118896484375,
0.01491546630859375,
0.049407958984375,
-0.0443115234375,
0.056304931640625,
-0.0096282958984375,
-0.055999755859375,
-0.04718017578125,
0.00762939453125,
0.0367431640625,
0.03009033203125,
0.035675048828125,
-0.01012420654296875,
-0.05474853515625,
-0.0626220703125,
0.00041794776916503906,
-0.02374267578125,
0.007427215576171875,
0.0198974609375,
0.058135986328125,
-0.0310516357421875,
0.0640869140625,
-0.046844482421875,
-0.0165863037109375,
-0.0236968994140625,
-0.0154571533203125,
0.048370361328125,
0.046966552734375,
0.0556640625,
-0.048095703125,
-0.033599853515625,
-0.0009241104125976562,
-0.0682373046875,
-0.004032135009765625,
-0.01214599609375,
-0.00946807861328125,
0.02679443359375,
0.0222625732421875,
-0.052398681640625,
0.043701171875,
0.048431396484375,
-0.024261474609375,
0.051239013671875,
-0.0142669677734375,
0.0021648406982421875,
-0.0836181640625,
0.0176544189453125,
0.003505706787109375,
-0.001560211181640625,
-0.03631591796875,
0.02349853515625,
0.01153564453125,
0.01050567626953125,
-0.0274658203125,
0.035186767578125,
-0.03985595703125,
0.006763458251953125,
-0.00011646747589111328,
-0.0026111602783203125,
0.0017366409301757812,
0.061370849609375,
0.0018787384033203125,
0.07147216796875,
0.04998779296875,
-0.03497314453125,
0.031707763671875,
0.0272064208984375,
-0.028961181640625,
0.01474761962890625,
-0.06732177734375,
0.01087188720703125,
0.01000213623046875,
0.0277099609375,
-0.070068359375,
-0.0184783935546875,
0.0272064208984375,
-0.048553466796875,
0.01467132568359375,
-0.01435089111328125,
-0.0321044921875,
-0.042999267578125,
-0.0282135009765625,
0.0251922607421875,
0.06158447265625,
-0.041717529296875,
0.0306396484375,
0.023101806640625,
0.010040283203125,
-0.048583984375,
-0.050201416015625,
0.0024471282958984375,
-0.023101806640625,
-0.05712890625,
0.0325927734375,
-0.0174102783203125,
-0.0029926300048828125,
-0.010955810546875,
0.0022029876708984375,
-0.0051422119140625,
0.0077972412109375,
0.030731201171875,
0.041107177734375,
-0.0191650390625,
-0.019500732421875,
-0.010986328125,
-0.00618743896484375,
0.000759124755859375,
-0.00566864013671875,
0.051544189453125,
-0.031707763671875,
-0.0149383544921875,
-0.04522705078125,
0.006107330322265625,
0.03863525390625,
-0.0185089111328125,
0.0518798828125,
0.043487548828125,
-0.0245819091796875,
-0.00618743896484375,
-0.0447998046875,
0.0012912750244140625,
-0.040496826171875,
0.0247344970703125,
-0.029296875,
-0.055999755859375,
0.047271728515625,
0.0159149169921875,
0.0016021728515625,
0.04449462890625,
0.05419921875,
0.00833892822265625,
0.0653076171875,
0.040557861328125,
-0.02935791015625,
0.03009033203125,
-0.053466796875,
0.0098724365234375,
-0.0611572265625,
-0.028533935546875,
-0.0411376953125,
-0.0097808837890625,
-0.05267333984375,
-0.03558349609375,
0.0196685791015625,
0.0184326171875,
-0.032379150390625,
0.052093505859375,
-0.06744384765625,
0.026397705078125,
0.039520263671875,
0.0167999267578125,
0.0214691162109375,
0.0089874267578125,
-0.011444091796875,
0.00807952880859375,
-0.0423583984375,
-0.041259765625,
0.09405517578125,
0.0266265869140625,
0.06842041015625,
-0.00044155120849609375,
0.053466796875,
0.00498199462890625,
0.01690673828125,
-0.04827880859375,
0.041015625,
0.0140228271484375,
-0.03863525390625,
-0.005878448486328125,
-0.032470703125,
-0.065673828125,
0.0194244384765625,
-0.0053558349609375,
-0.06317138671875,
0.00691986083984375,
0.0089569091796875,
-0.01824951171875,
0.0256500244140625,
-0.057403564453125,
0.061370849609375,
-0.00616455078125,
-0.01334381103515625,
-0.004566192626953125,
-0.043212890625,
0.036834716796875,
0.0049285888671875,
0.00826263427734375,
-0.00627899169921875,
-0.0034656524658203125,
0.051849365234375,
-0.0460205078125,
0.05975341796875,
0.0038299560546875,
-0.020294189453125,
0.03759765625,
-0.00806427001953125,
0.038421630859375,
0.005222320556640625,
-0.01654052734375,
0.0389404296875,
-0.0003368854522705078,
-0.0264129638671875,
-0.0145263671875,
0.052398681640625,
-0.07818603515625,
-0.04833984375,
-0.038238525390625,
-0.0278167724609375,
0.012237548828125,
0.01071929931640625,
0.04241943359375,
0.02294921875,
0.0076141357421875,
0.0111236572265625,
0.04071044921875,
-0.035797119140625,
0.046844482421875,
0.0254058837890625,
-0.0174102783203125,
-0.044586181640625,
0.06964111328125,
-0.00852203369140625,
0.014129638671875,
0.022430419921875,
-0.0018224716186523438,
-0.0159149169921875,
-0.034515380859375,
-0.036346435546875,
0.03045654296875,
-0.038726806640625,
-0.039215087890625,
-0.05230712890625,
-0.036224365234375,
-0.03558349609375,
-0.01396942138671875,
-0.0162353515625,
-0.025115966796875,
-0.042205810546875,
-0.005092620849609375,
0.055023193359375,
0.04437255859375,
0.0027313232421875,
0.0239715576171875,
-0.042633056640625,
0.03338623046875,
0.0117645263671875,
0.026031494140625,
0.0012722015380859375,
-0.0517578125,
-0.01213836669921875,
0.0035190582275390625,
-0.04193115234375,
-0.0693359375,
0.0489501953125,
0.0171661376953125,
0.039703369140625,
0.01288604736328125,
-0.004528045654296875,
0.060272216796875,
-0.034027099609375,
0.06817626953125,
0.02349853515625,
-0.08099365234375,
0.04998779296875,
-0.0218658447265625,
0.01239776611328125,
0.009521484375,
0.023162841796875,
-0.033294677734375,
-0.0269622802734375,
-0.052642822265625,
-0.062103271484375,
0.05792236328125,
0.03118896484375,
0.00182342529296875,
0.0059661865234375,
0.028533935546875,
-0.01055145263671875,
0.01239013671875,
-0.0743408203125,
-0.037445068359375,
-0.02447509765625,
-0.0222625732421875,
-0.0012865066528320312,
-0.004718780517578125,
0.0001195073127746582,
-0.039581298828125,
0.050323486328125,
-0.0086669921875,
0.05010986328125,
0.0206298828125,
-0.01520538330078125,
-0.0071868896484375,
0.003047943115234375,
0.04644775390625,
0.047027587890625,
-0.01102447509765625,
-0.0121612548828125,
0.0218048095703125,
-0.045654296875,
0.016265869140625,
0.00566864013671875,
-0.0154266357421875,
-0.00485992431640625,
0.036529541015625,
0.058441162109375,
0.0030002593994140625,
-0.044952392578125,
0.0452880859375,
-0.002651214599609375,
-0.0223388671875,
-0.03582763671875,
0.0180206298828125,
0.0223236083984375,
0.0177001953125,
0.0248565673828125,
0.0007076263427734375,
-0.0038814544677734375,
-0.0304107666015625,
0.010101318359375,
0.026641845703125,
0.0013179779052734375,
-0.0256195068359375,
0.06915283203125,
-0.0009937286376953125,
-0.0175933837890625,
0.03863525390625,
-0.007415771484375,
-0.046142578125,
0.08209228515625,
0.044158935546875,
0.054412841796875,
-0.019012451171875,
0.0108184814453125,
0.051361083984375,
0.0316162109375,
-0.0067138671875,
0.0306243896484375,
0.00724029541015625,
-0.039276123046875,
-0.0261993408203125,
-0.06201171875,
-0.01116180419921875,
0.01030731201171875,
-0.043975830078125,
0.0283203125,
-0.039154052734375,
-0.014129638671875,
-0.0046234130859375,
0.0151214599609375,
-0.050323486328125,
0.007709503173828125,
0.005329132080078125,
0.07647705078125,
-0.051788330078125,
0.07537841796875,
0.048004150390625,
-0.0531005859375,
-0.07177734375,
-0.0123443603515625,
-0.00608062744140625,
-0.0775146484375,
0.040283203125,
0.014801025390625,
0.01299285888671875,
0.004547119140625,
-0.060394287109375,
-0.07305908203125,
0.09527587890625,
0.027679443359375,
-0.038604736328125,
-0.0111236572265625,
0.004505157470703125,
0.043060302734375,
-0.0218505859375,
0.042205810546875,
0.049468994140625,
0.0274658203125,
-0.007549285888671875,
-0.08319091796875,
0.0246429443359375,
-0.03399658203125,
0.018524169921875,
-0.005046844482421875,
-0.07623291015625,
0.0804443359375,
-0.029266357421875,
-0.0030345916748046875,
0.0245361328125,
0.052093505859375,
0.03546142578125,
0.00977325439453125,
0.0177459716796875,
0.05401611328125,
0.04193115234375,
-0.015594482421875,
0.0755615234375,
-0.03533935546875,
0.050933837890625,
0.053985595703125,
0.00299072265625,
0.056060791015625,
0.019012451171875,
-0.04486083984375,
0.049560546875,
0.05755615234375,
-0.01934814453125,
0.0260467529296875,
0.018585205078125,
-0.00659942626953125,
-0.00746917724609375,
0.0172576904296875,
-0.055877685546875,
0.0240020751953125,
0.0283660888671875,
-0.0267791748046875,
0.00312042236328125,
-0.010498046875,
0.0210723876953125,
-0.009552001953125,
-0.0050506591796875,
0.050537109375,
0.00957489013671875,
-0.044403076171875,
0.08721923828125,
0.004779815673828125,
0.071044921875,
-0.036407470703125,
-0.006641387939453125,
-0.0306243896484375,
0.0106964111328125,
-0.03759765625,
-0.051483154296875,
0.005741119384765625,
0.01007843017578125,
-0.004726409912109375,
-0.00629425048828125,
0.03448486328125,
-0.0240631103515625,
-0.036956787109375,
0.0279998779296875,
0.0147857666015625,
0.016754150390625,
0.0003643035888671875,
-0.06292724609375,
0.0189361572265625,
0.0179595947265625,
-0.034698486328125,
0.0214691162109375,
0.02349853515625,
0.01407623291015625,
0.05877685546875,
0.04888916015625,
0.001880645751953125,
0.0156707763671875,
-0.01412200927734375,
0.0789794921875,
-0.0618896484375,
-0.027130126953125,
-0.0643310546875,
0.0380859375,
-0.0009908676147460938,
-0.03717041015625,
0.05535888671875,
0.0439453125,
0.05902099609375,
-0.009552001953125,
0.058349609375,
-0.025115966796875,
0.009033203125,
-0.038726806640625,
0.057769775390625,
-0.052642822265625,
0.0181427001953125,
-0.039215087890625,
-0.06634521484375,
-0.0168914794921875,
0.067138671875,
-0.00791168212890625,
0.005718231201171875,
0.04473876953125,
0.07830810546875,
0.01346588134765625,
-0.0017023086547851562,
0.0079193115234375,
0.02349853515625,
0.02386474609375,
0.07293701171875,
0.055206298828125,
-0.060821533203125,
0.0504150390625,
-0.042388916015625,
-0.0214385986328125,
-0.0267791748046875,
-0.059356689453125,
-0.06866455078125,
-0.045989990234375,
-0.034210205078125,
-0.043304443359375,
-0.01197052001953125,
0.0748291015625,
0.049774169921875,
-0.054534912109375,
-0.032012939453125,
-0.005401611328125,
0.0210113525390625,
-0.02117919921875,
-0.019500732421875,
0.035491943359375,
-0.01751708984375,
-0.061676025390625,
0.0168609619140625,
0.0015954971313476562,
0.004940032958984375,
-0.00794219970703125,
-0.0205230712890625,
-0.018463134765625,
-0.0008168220520019531,
0.039276123046875,
0.0285186767578125,
-0.06280517578125,
-0.00601959228515625,
0.008880615234375,
-0.0263214111328125,
0.02288818359375,
0.0262603759765625,
-0.050323486328125,
0.00844573974609375,
0.0305328369140625,
0.027923583984375,
0.039794921875,
-0.00917816162109375,
0.015838623046875,
-0.033782958984375,
0.0250701904296875,
-0.0012359619140625,
0.037200927734375,
0.00904083251953125,
-0.034210205078125,
0.053466796875,
0.0291748046875,
-0.051239013671875,
-0.0594482421875,
-0.005985260009765625,
-0.08990478515625,
-0.0177001953125,
0.09503173828125,
-0.01261138916015625,
-0.02923583984375,
0.0150146484375,
-0.029388427734375,
0.0219573974609375,
-0.036834716796875,
0.048492431640625,
0.027099609375,
-0.00917816162109375,
-0.0025386810302734375,
-0.0374755859375,
0.03515625,
0.0208892822265625,
-0.07220458984375,
-0.01265716552734375,
0.0311431884765625,
0.032989501953125,
0.009521484375,
0.057708740234375,
-0.008636474609375,
0.015838623046875,
-0.0016078948974609375,
0.028961181640625,
-0.0007042884826660156,
-0.002826690673828125,
-0.026824951171875,
-0.0006465911865234375,
-0.012451171875,
-0.0117645263671875
]
] |
uukuguy/speechless-codellama-34b-v1.9 | 2023-10-08T22:50:40.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"llama-2",
"code",
"en",
"dataset:jondurbin/airoboros-2.2",
"dataset:Open-Orca/OpenOrca",
"dataset:garage-bAInd/Open-Platypus",
"dataset:WizardLM/WizardLM_evol_instruct_V2_196k",
"arxiv:2308.12950",
"license:llama2",
"model-index",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text-generation | uukuguy | null | null | uukuguy/speechless-codellama-34b-v1.9 | 0 | 7,065 | transformers | 2023-10-04T11:06:04 | ---
language:
- en
library_name: transformers
pipeline_tag: text-generation
datasets:
- jondurbin/airoboros-2.2
- Open-Orca/OpenOrca
- garage-bAInd/Open-Platypus
- WizardLM/WizardLM_evol_instruct_V2_196k
tags:
- llama-2
- code
license: llama2
model-index:
- name: SpeechlessCoder
results:
- task:
type: text-generation
dataset:
type: openai_humaneval
name: HumanEval
metrics:
- name: pass@1
type: pass@1
value: 70.73
verified: false
---
<p><h1> speechless-codellama-34b-v1.9 </h1></p>
The following datasets were used to fine-tune codellama/CodeLlama-34B to improve the model's reasoning and planning abilities.
122,828 samples in total:
- jondurbin/airoboros-2.2: filtered to categories related to coding, reasoning and planning; 23,462 samples.
- Open-Orca/OpenOrca: filtered to the 'cot' category of the 1M GPT4 dataset; 74,440 samples.
- garage-bAInd/Open-Platypus: used in full; 24,926 samples.
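As a minimal sketch of the category filtering described above (the `category` field name and the keep-lists are illustrative assumptions, not the exact script used for this model):

```python
def filter_by_category(samples, keep):
    """Keep only samples whose 'category' field is in `keep` (illustrative)."""
    return [s for s in samples if s.get("category") in keep]

# Toy stand-in for the real datasets.
toy = [
    {"category": "coding", "instruction": "Write a sort."},
    {"category": "trivia", "instruction": "Capital of France?"},
    {"category": "cot", "instruction": "Reason step by step."},
]
print(len(filter_by_category(toy, {"coding", "reasoning", "plan", "cot"})))  # 2
```

In practice the same predicate would be applied with `datasets.Dataset.filter` over the full corpora.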
## HumanEval
| Metric | Value |
| --- | --- |
| humaneval-python | 70.73 |
[Big Code Models Leaderboard](https://huggingface.co/spaces/bigcode/bigcode-models-leaderboard)
- CodeLlama-34B-Python: 53.29
- CodeLlama-34B-Instruct: 50.79
- CodeLlama-13B-Instruct: 50.6
- CodeLlama-34B: 45.11
- CodeLlama-13B-Python: 42.89
- CodeLlama-13B: 35.07
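The pass@1 figure above follows the standard HumanEval metric. As a sketch, the unbiased pass@k estimator can be computed as below; the 116-of-164 split is inferred from the reported 70.73% and is an assumption, not stated in this card:

```python
from math import comb

def pass_at_k(n, c, k):
    """Unbiased pass@k estimator: n samples generated per problem, c correct."""
    if n - c < k:
        return 1.0
    return 1.0 - comb(n - c, k) / comb(n, k)

# With one sample per problem (n=1, k=1), pass@1 is just the fraction of the
# 164 HumanEval problems solved, e.g. 116/164:
score = 100 * sum(pass_at_k(1, c, 1) for c in [1] * 116 + [0] * 48) / 164
print(round(score, 2))  # 70.73
```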
## lm-evaluation-harness
[Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
| Metric | Value |
| --- | --- |
| ARC | 54.27 |
| HellaSwag | 75.2|
| MMLU | 56.12 |
| TruthfulQA | 43.92 |
| Average | 57.38 |
## Training Arguments
| | |
|------ | ------ |
| lr | 2e-4 |
| lr_scheduler_type | cosine |
| weight_decay | 0.0 |
| optim | paged_adamw_8bit |
| flash_attention | True |
| rerope | False |
| max_new_tokens | 8192 |
| num_train_epochs | 3 |
| bits | 4 |
| lora_r | 64 |
| lora_alpha | 16 |
| lora_dropout | 0.05 |
| double_quant | True |
| quant_type | nf4 |
| dataset_format | airoboros |
| mini_batch_size | 2 |
| gradient_accumulation_steps | 32 |
| bf16 | True |
A40-48G x 2
| | |
|------ | ------ |
| epoch | 3.0 |
| train_loss | 0.5073 |
| train_runtime | 4 days, 1:52:09.56 |
| train_samples_per_second | 1.025 |
| train_steps_per_second | 0.008 |
| eval_loss | 0.5269 |
| eval_runtime | 0:01:25.66 |
| eval_samples_per_second | 2.335 |
| eval_steps_per_second | 1.167 |
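As a sanity check, the reported throughput is consistent with the training arguments above, assuming simple data parallelism over the two A40 GPUs:

```python
# Sanity-check arithmetic relating the training arguments to the reported
# throughput. The 2-GPU data-parallel assumption comes from "A40-48G x 2".
mini_batch_size = 2
gradient_accumulation_steps = 32
num_gpus = 2
effective_batch = mini_batch_size * gradient_accumulation_steps * num_gpus  # 128

total_samples = 122_828
steps_per_epoch = total_samples / effective_batch      # ~960
total_steps = 3 * steps_per_epoch                      # ~2879 over 3 epochs

runtime_s = 4 * 24 * 3600 + 1 * 3600 + 52 * 60 + 9.56  # "4 days, 1:52:09.56"
print(round(total_steps / runtime_s, 3))               # ~0.008 steps/s
```

This matches the `train_steps_per_second` value in the table.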
# **Code Llama**
Code Llama is a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 34 billion parameters. This is the repository for the base 34B version in the Hugging Face Transformers format. This model is designed for general code synthesis and understanding. Links to other models can be found in the index at the bottom.
| | Base Model | Python | Instruct |
| --- | ----------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------------- |
| 7B | [codellama/CodeLlama-7b-hf](https://huggingface.co/codellama/CodeLlama-7b-hf) | [codellama/CodeLlama-7b-Python-hf](https://huggingface.co/codellama/CodeLlama-7b-Python-hf) | [codellama/CodeLlama-7b-Instruct-hf](https://huggingface.co/codellama/CodeLlama-7b-Instruct-hf) |
| 13B | [codellama/CodeLlama-13b-hf](https://huggingface.co/codellama/CodeLlama-13b-hf) | [codellama/CodeLlama-13b-Python-hf](https://huggingface.co/codellama/CodeLlama-13b-Python-hf) | [codellama/CodeLlama-13b-Instruct-hf](https://huggingface.co/codellama/CodeLlama-13b-Instruct-hf) |
| 34B | [codellama/CodeLlama-34b-hf](https://huggingface.co/codellama/CodeLlama-34b-hf) | [codellama/CodeLlama-34b-Python-hf](https://huggingface.co/codellama/CodeLlama-34b-Python-hf) | [codellama/CodeLlama-34b-Instruct-hf](https://huggingface.co/codellama/CodeLlama-34b-Instruct-hf) |
## Model Use
To use this model, please make sure to install transformers from `main` until the next version is released:
```bash
pip install git+https://github.com/huggingface/transformers.git@main accelerate
```
Model capabilities:
- [x] Code completion.
- [x] Infilling.
- [ ] Instructions / chat.
- [ ] Python specialist.
```python
from transformers import AutoTokenizer
import transformers
import torch
model = "codellama/CodeLlama-34b-hf"
tokenizer = AutoTokenizer.from_pretrained(model)
pipeline = transformers.pipeline(
"text-generation",
model=model,
torch_dtype=torch.float16,
device_map="auto",
)
sequences = pipeline(
'import socket\n\ndef ping_exponential_backoff(host: str):',
do_sample=True,
top_k=10,
temperature=0.1,
top_p=0.95,
num_return_sequences=1,
eos_token_id=tokenizer.eos_token_id,
max_length=200,
)
for seq in sequences:
print(f"Result: {seq['generated_text']}")
```
## Model Details
*Note: Use of this model is governed by the Meta license.* Meta developed and publicly released the Code Llama family of large language models (LLMs).
**Model Developers** Meta
**Variations** Code Llama comes in three model sizes and three variants:
* Code Llama: base models designed for general code synthesis and understanding
* Code Llama - Python: designed specifically for Python
* Code Llama - Instruct: for instruction following and safer deployment
All variants are available in sizes of 7B, 13B and 34B parameters.
**This repository contains the base version of the 34B parameters model.**
**Input** Models input text only.
**Output** Models generate text only.
**Model Architecture** Code Llama is an auto-regressive language model that uses an optimized transformer architecture.
**Model Dates** Code Llama and its variants have been trained between January 2023 and July 2023.
**Status** This is a static model trained on an offline dataset. Future versions of Code Llama - Instruct will be released as we improve model safety with community feedback.
**License** A custom commercial license is available at: [https://ai.meta.com/resources/models-and-libraries/llama-downloads/](https://ai.meta.com/resources/models-and-libraries/llama-downloads/)
**Research Paper** More information can be found in the paper "[Code Llama: Open Foundation Models for Code](https://ai.meta.com/research/publications/code-llama-open-foundation-models-for-code/)" or its [arXiv page](https://arxiv.org/abs/2308.12950).
## Intended Use
**Intended Use Cases** Code Llama and its variants are intended for commercial and research use in English and relevant programming languages. The base model Code Llama can be adapted for a variety of code synthesis and understanding tasks, Code Llama - Python is designed specifically to handle the Python programming language, and Code Llama - Instruct is intended to be safer to use for code assistant and generation applications.
**Out-of-Scope Uses** Use in any manner that violates applicable laws or regulations (including trade compliance laws). Use in languages other than English. Use in any other way that is prohibited by the Acceptable Use Policy and Licensing Agreement for Code Llama and its variants.
## Hardware and Software
**Training Factors** We used custom training libraries. The training and fine-tuning of the released models have been performed on Meta’s Research Super Cluster.
**Carbon Footprint** In aggregate, training all 9 Code Llama models required 400K GPU hours of computation on hardware of type A100-80GB (TDP of 350-400W). Estimated total emissions were 65.3 tCO2eq, 100% of which were offset by Meta’s sustainability program.
## Training Data
All experiments reported here and the released models have been trained and fine-tuned using the same data as Llama 2 with different weights (see Section 2 and Table 1 in the [research paper](https://ai.meta.com/research/publications/code-llama-open-foundation-models-for-code/) for details).
## Evaluation Results
See evaluations for the main models and detailed ablations in Section 3 and safety evaluations in Section 4 of the research paper.
## Ethical Considerations and Limitations
Code Llama and its variants are a new technology that carries risks with use. Testing conducted to date has been in English, and has not covered, nor could it cover all scenarios. For these reasons, as with all LLMs, Code Llama’s potential outputs cannot be predicted in advance, and the model may in some instances produce inaccurate or objectionable responses to user prompts. Therefore, before deploying any applications of Code Llama, developers should perform safety testing and tuning tailored to their specific applications of the model.
Please see the Responsible Use Guide available at [https://ai.meta.com/llama/responsible-user-guide](https://ai.meta.com/llama/responsible-user-guide).
| 9,138 | [
[
-0.032012939453125,
-0.05242919921875,
0.0208587646484375,
0.035400390625,
-0.00891876220703125,
0.00533294677734375,
-0.0183563232421875,
-0.03680419921875,
0.020416259765625,
0.031829833984375,
-0.03411865234375,
-0.048919677734375,
-0.0423583984375,
0.01336669921875,
-0.0250244140625,
0.0811767578125,
-0.00423431396484375,
-0.0214385986328125,
-0.007755279541015625,
0.00325775146484375,
-0.0258026123046875,
-0.043487548828125,
-0.0250091552734375,
-0.03717041015625,
0.018829345703125,
0.02178955078125,
0.051361083984375,
0.0452880859375,
0.03961181640625,
0.0262908935546875,
-0.0251617431640625,
0.007434844970703125,
-0.026580810546875,
-0.02349853515625,
0.011260986328125,
-0.035308837890625,
-0.061676025390625,
-0.0035686492919921875,
0.0309600830078125,
0.0238037109375,
-0.018951416015625,
0.0258636474609375,
-0.004077911376953125,
0.04559326171875,
-0.02166748046875,
0.021636962890625,
-0.04638671875,
-0.00528717041015625,
0.004520416259765625,
-0.01360321044921875,
-0.0145263671875,
-0.031951904296875,
-0.00839996337890625,
-0.036163330078125,
0.003414154052734375,
-0.0029354095458984375,
0.09442138671875,
0.030792236328125,
-0.0136566162109375,
-0.0191802978515625,
-0.0286712646484375,
0.058746337890625,
-0.07196044921875,
0.00519561767578125,
0.0271759033203125,
-0.00467681884765625,
-0.01224517822265625,
-0.0579833984375,
-0.051971435546875,
-0.0206298828125,
-0.01175689697265625,
0.00959014892578125,
-0.0269927978515625,
0.00043129920959472656,
0.03314208984375,
0.03717041015625,
-0.039764404296875,
0.00855255126953125,
-0.0309600830078125,
-0.017852783203125,
0.06683349609375,
0.01654052734375,
0.02801513671875,
-0.0243988037109375,
-0.0205078125,
-0.0118560791015625,
-0.053375244140625,
0.01197052001953125,
0.031646728515625,
-0.001468658447265625,
-0.045166015625,
0.0478515625,
-0.0221099853515625,
0.049072265625,
0.005458831787109375,
-0.032867431640625,
0.045166015625,
-0.031585693359375,
-0.0247344970703125,
-0.01422119140625,
0.0673828125,
0.036224365234375,
0.0199432373046875,
0.007434844970703125,
-0.01788330078125,
0.0200958251953125,
0.0046234130859375,
-0.06884765625,
-0.01983642578125,
0.025146484375,
-0.04571533203125,
-0.045318603515625,
-0.013336181640625,
-0.062225341796875,
0.0004935264587402344,
-0.01532745361328125,
0.022003173828125,
-0.0301361083984375,
-0.031494140625,
0.0234527587890625,
-0.00330352783203125,
0.034088134765625,
0.01328277587890625,
-0.066650390625,
0.0029449462890625,
0.03472900390625,
0.06243896484375,
0.005126953125,
-0.0321044921875,
-0.002483367919921875,
-0.008941650390625,
-0.022369384765625,
0.0482177734375,
-0.0298919677734375,
-0.027557373046875,
-0.0174713134765625,
0.00841522216796875,
-0.0097198486328125,
-0.03570556640625,
0.017974853515625,
-0.0225830078125,
0.0081329345703125,
0.003986358642578125,
-0.0299072265625,
-0.024627685546875,
0.00897216796875,
-0.041656494140625,
0.0859375,
0.018157958984375,
-0.05548095703125,
0.001880645751953125,
-0.04974365234375,
-0.0229949951171875,
-0.021453857421875,
-0.0012750625610351562,
-0.05450439453125,
-0.01081085205078125,
0.0293731689453125,
0.039306640625,
-0.0263671875,
0.02960205078125,
-0.00897216796875,
-0.0262451171875,
0.01412200927734375,
-0.0146636962890625,
0.0772705078125,
0.0231781005859375,
-0.038970947265625,
0.017852783203125,
-0.06512451171875,
0.0075225830078125,
0.030181884765625,
-0.036224365234375,
0.01250457763671875,
-0.01025390625,
0.002674102783203125,
0.007419586181640625,
0.032196044921875,
-0.0205230712890625,
0.037445068359375,
-0.0364990234375,
0.060760498046875,
0.052520751953125,
0.00484466552734375,
0.0261688232421875,
-0.043792724609375,
0.048614501953125,
-0.001331329345703125,
0.01454925537109375,
-0.022216796875,
-0.055084228515625,
-0.07476806640625,
-0.027099609375,
0.0116424560546875,
0.045928955078125,
-0.043609619140625,
0.05810546875,
-0.00870513916015625,
-0.05877685546875,
-0.036102294921875,
0.011749267578125,
0.0311737060546875,
0.0280609130859375,
0.035369873046875,
-0.01251220703125,
-0.058563232421875,
-0.056488037109375,
0.007709503173828125,
-0.0244598388671875,
0.00974273681640625,
0.0128326416015625,
0.066162109375,
-0.037322998046875,
0.0653076171875,
-0.0443115234375,
-0.01128387451171875,
-0.026947021484375,
-0.01776123046875,
0.048675537109375,
0.049835205078125,
0.052093505859375,
-0.047637939453125,
-0.0253143310546875,
0.0006685256958007812,
-0.064208984375,
-0.00397491455078125,
-0.015228271484375,
-0.00823211669921875,
0.0245208740234375,
0.02874755859375,
-0.049072265625,
0.04083251953125,
0.0501708984375,
-0.0242767333984375,
0.0513916015625,
-0.01287841796875,
-0.004764556884765625,
-0.08160400390625,
0.02008056640625,
-0.00261688232421875,
0.0016431808471679688,
-0.035400390625,
0.0219268798828125,
0.00592803955078125,
0.006732940673828125,
-0.030548095703125,
0.032470703125,
-0.034423828125,
0.00713348388671875,
-0.005985260009765625,
-0.0091400146484375,
0.0025310516357421875,
0.0604248046875,
0.00360107421875,
0.067138671875,
0.04901123046875,
-0.04351806640625,
0.0298614501953125,
0.0268096923828125,
-0.0295257568359375,
0.016876220703125,
-0.0682373046875,
0.01468658447265625,
0.006359100341796875,
0.0287628173828125,
-0.0692138671875,
-0.017669677734375,
0.024505615234375,
-0.044830322265625,
0.01488494873046875,
-0.005828857421875,
-0.03436279296875,
-0.041839599609375,
-0.0218353271484375,
0.02252197265625,
0.05877685546875,
-0.043670654296875,
0.031951904296875,
0.022216796875,
0.010162353515625,
-0.044952392578125,
-0.0499267578125,
0.005764007568359375,
-0.028778076171875,
-0.056121826171875,
0.033966064453125,
-0.02191162109375,
-0.0036067962646484375,
-0.01044464111328125,
0.002429962158203125,
0.0005993843078613281,
0.01306915283203125,
0.0297088623046875,
0.038848876953125,
-0.01465606689453125,
-0.01175689697265625,
-0.0092620849609375,
-0.0118408203125,
0.001956939697265625,
-0.006256103515625,
0.056854248046875,
-0.0306549072265625,
-0.0120849609375,
-0.043853759765625,
0.005100250244140625,
0.0423583984375,
-0.022857666015625,
0.052581787109375,
0.038665771484375,
-0.02301025390625,
-0.006175994873046875,
-0.0440673828125,
-0.0023899078369140625,
-0.04156494140625,
0.027862548828125,
-0.0263671875,
-0.05657958984375,
0.046112060546875,
0.009246826171875,
0.00811004638671875,
0.040496826171875,
0.059967041015625,
0.00798797607421875,
0.06121826171875,
0.053192138671875,
-0.0294647216796875,
0.031524658203125,
-0.05853271484375,
0.009429931640625,
-0.0625,
-0.031707763671875,
-0.04241943359375,
-0.01074981689453125,
-0.048370361328125,
-0.035064697265625,
0.0240325927734375,
0.0156097412109375,
-0.0357666015625,
0.051361083984375,
-0.06695556640625,
0.0290679931640625,
0.035430908203125,
0.0144805908203125,
0.021209716796875,
0.006488800048828125,
-0.006023406982421875,
0.01486968994140625,
-0.0445556640625,
-0.046722412109375,
0.09130859375,
0.034423828125,
0.0635986328125,
-0.003925323486328125,
0.058929443359375,
-0.0012798309326171875,
0.0208587646484375,
-0.04779052734375,
0.04327392578125,
0.015625,
-0.037445068359375,
-0.007232666015625,
-0.027191162109375,
-0.0650634765625,
0.016937255859375,
-0.005344390869140625,
-0.0621337890625,
0.00765228271484375,
0.0081329345703125,
-0.0173492431640625,
0.02960205078125,
-0.05352783203125,
0.05657958984375,
-0.00937652587890625,
-0.01263427734375,
-0.0088348388671875,
-0.04034423828125,
0.03271484375,
0.002147674560546875,
0.0119476318359375,
-0.007793426513671875,
-0.006183624267578125,
0.05303955078125,
-0.043487548828125,
0.0709228515625,
0.0028247833251953125,
-0.0263214111328125,
0.041534423828125,
-0.00994873046875,
0.03607177734375,
0.0007009506225585938,
-0.02032470703125,
0.041961669921875,
-0.004909515380859375,
-0.022705078125,
-0.0152740478515625,
0.049407958984375,
-0.07830810546875,
-0.050628662109375,
-0.031494140625,
-0.0299835205078125,
0.01139068603515625,
0.01044464111328125,
0.036834716796875,
0.0170135498046875,
0.01245880126953125,
0.0156707763671875,
0.035980224609375,
-0.04205322265625,
0.04901123046875,
0.02606201171875,
-0.0167999267578125,
-0.0426025390625,
0.07025146484375,
-0.0053253173828125,
0.0185394287109375,
0.0219268798828125,
0.00409698486328125,
-0.01468658447265625,
-0.0361328125,
-0.031829833984375,
0.0304107666015625,
-0.042633056640625,
-0.044464111328125,
-0.05255126953125,
-0.0364990234375,
-0.035064697265625,
-0.0174713134765625,
-0.0189056396484375,
-0.0244903564453125,
-0.045806884765625,
-0.0109710693359375,
0.053680419921875,
0.047760009765625,
-0.0016813278198242188,
0.0295867919921875,
-0.043609619140625,
0.0298004150390625,
0.00830841064453125,
0.0282440185546875,
0.0083770751953125,
-0.046356201171875,
-0.016448974609375,
0.0028324127197265625,
-0.0457763671875,
-0.06439208984375,
0.048065185546875,
0.01053619384765625,
0.040191650390625,
0.0066680908203125,
-0.005767822265625,
0.0567626953125,
-0.0291595458984375,
0.06585693359375,
0.0265045166015625,
-0.08447265625,
0.052642822265625,
-0.0228271484375,
0.0120086669921875,
0.00690460205078125,
0.0210723876953125,
-0.031494140625,
-0.0280303955078125,
-0.054168701171875,
-0.0638427734375,
0.0572509765625,
0.0278778076171875,
0.006656646728515625,
0.008087158203125,
0.0298614501953125,
-0.01300048828125,
0.0177764892578125,
-0.0784912109375,
-0.0263671875,
-0.028289794921875,
-0.0180511474609375,
-0.003314971923828125,
-0.007015228271484375,
0.00008612871170043945,
-0.033660888671875,
0.0482177734375,
-0.0102081298828125,
0.043060302734375,
0.01904296875,
-0.0089874267578125,
-0.01181793212890625,
0.005451202392578125,
0.044219970703125,
0.04693603515625,
-0.00815582275390625,
-0.0113525390625,
0.024505615234375,
-0.0413818359375,
0.0182342529296875,
-0.00611114501953125,
-0.01031494140625,
-0.013916015625,
0.04058837890625,
0.05438232421875,
0.0013265609741210938,
-0.047271728515625,
0.037750244140625,
0.008331298828125,
-0.023193359375,
-0.03326416015625,
0.0238037109375,
0.0239105224609375,
0.021331787109375,
0.018798828125,
0.0032501220703125,
-0.010467529296875,
-0.03253173828125,
0.007080078125,
0.022491455078125,
0.00800323486328125,
-0.023529052734375,
0.06634521484375,
0.006191253662109375,
-0.0185546875,
0.037078857421875,
-0.0015010833740234375,
-0.05047607421875,
0.0888671875,
0.046295166015625,
0.054840087890625,
-0.0164031982421875,
0.01097869873046875,
0.05133056640625,
0.0386962890625,
-0.0047607421875,
0.0279693603515625,
0.004856109619140625,
-0.0435791015625,
-0.0260009765625,
-0.06109619140625,
-0.01427459716796875,
0.01126861572265625,
-0.039093017578125,
0.03240966796875,
-0.044708251953125,
-0.00815582275390625,
-0.01222991943359375,
0.01558685302734375,
-0.05352783203125,
0.010040283203125,
0.0033512115478515625,
0.07147216796875,
-0.054351806640625,
0.07208251953125,
0.0435791015625,
-0.051910400390625,
-0.07568359375,
-0.0150604248046875,
-0.007198333740234375,
-0.079833984375,
0.043731689453125,
0.0191192626953125,
0.01050567626953125,
0.005901336669921875,
-0.064208984375,
-0.07861328125,
0.09698486328125,
0.023193359375,
-0.043487548828125,
-0.00797271728515625,
0.005748748779296875,
0.04864501953125,
-0.027130126953125,
0.038543701171875,
0.0499267578125,
0.03533935546875,
-0.0037899017333984375,
-0.0816650390625,
0.023406982421875,
-0.0321044921875,
0.017181396484375,
-0.00934600830078125,
-0.08087158203125,
0.07940673828125,
-0.03448486328125,
-0.0034637451171875,
0.02532958984375,
0.052398681640625,
0.036346435546875,
0.01222991943359375,
0.0241851806640625,
0.05010986328125,
0.04351806640625,
-0.00708770751953125,
0.0732421875,
-0.036102294921875,
0.046234130859375,
0.05029296875,
-0.0006918907165527344,
0.053863525390625,
0.0223541259765625,
-0.037200927734375,
0.05419921875,
0.055267333984375,
-0.0232696533203125,
0.0256195068359375,
0.0236663818359375,
-0.0006852149963378906,
-0.00856781005859375,
0.01061248779296875,
-0.05645751953125,
0.02825927734375,
0.028533935546875,
-0.0258026123046875,
0.0015535354614257812,
-0.00911712646484375,
0.0218658447265625,
-0.008331298828125,
-0.00608062744140625,
0.0487060546875,
0.00865936279296875,
-0.045989990234375,
0.0855712890625,
0.00849151611328125,
0.07000732421875,
-0.039581298828125,
-0.00872039794921875,
-0.0264739990234375,
0.00962066650390625,
-0.042633056640625,
-0.047393798828125,
0.0123138427734375,
0.0126953125,
-0.0058746337890625,
-0.013153076171875,
0.0340576171875,
-0.0189666748046875,
-0.03924560546875,
0.02667236328125,
0.01355743408203125,
0.0150146484375,
0.005584716796875,
-0.05877685546875,
0.028167724609375,
0.0215606689453125,
-0.03375244140625,
0.0205230712890625,
0.019287109375,
0.01448822021484375,
0.058746337890625,
0.053375244140625,
0.0007047653198242188,
0.007171630859375,
-0.01403045654296875,
0.08660888671875,
-0.0609130859375,
-0.031494140625,
-0.06317138671875,
0.044525146484375,
0.005504608154296875,
-0.03619384765625,
0.04937744140625,
0.037261962890625,
0.06036376953125,
-0.007534027099609375,
0.056793212890625,
-0.0245208740234375,
0.01116943359375,
-0.043853759765625,
0.052032470703125,
-0.050018310546875,
0.02801513671875,
-0.0408935546875,
-0.06964111328125,
-0.01593017578125,
0.06634521484375,
-0.01145172119140625,
0.006999969482421875,
0.046417236328125,
0.08221435546875,
0.017486572265625,
-0.0024089813232421875,
0.009674072265625,
0.022735595703125,
0.03533935546875,
0.0709228515625,
0.0625,
-0.049896240234375,
0.050567626953125,
-0.0423583984375,
-0.0206756591796875,
-0.0216522216796875,
-0.06683349609375,
-0.072021484375,
-0.043670654296875,
-0.031707763671875,
-0.038116455078125,
-0.0164947509765625,
0.07232666015625,
0.05255126953125,
-0.050140380859375,
-0.03448486328125,
-0.00997161865234375,
0.023406982421875,
-0.0163116455078125,
-0.0189666748046875,
0.03131103515625,
-0.019256591796875,
-0.059661865234375,
0.018157958984375,
-0.0007958412170410156,
0.007137298583984375,
-0.0145721435546875,
-0.017913818359375,
-0.014007568359375,
-0.0016613006591796875,
0.037933349609375,
0.0202789306640625,
-0.056304931640625,
-0.014007568359375,
0.006336212158203125,
-0.0157928466796875,
0.0184478759765625,
0.03369140625,
-0.049285888671875,
0.004779815673828125,
0.0276336669921875,
0.0272216796875,
0.03558349609375,
-0.01238250732421875,
0.0136260986328125,
-0.03277587890625,
0.0256195068359375,
0.0009965896606445312,
0.03411865234375,
0.01045989990234375,
-0.037628173828125,
0.05511474609375,
0.0257568359375,
-0.051361083984375,
-0.06005859375,
0.0013256072998046875,
-0.08367919921875,
-0.0203704833984375,
0.09466552734375,
-0.0108642578125,
-0.027740478515625,
0.01387786865234375,
-0.032196044921875,
0.0174102783203125,
-0.036285400390625,
0.052734375,
0.031219482421875,
-0.00959014892578125,
-0.002002716064453125,
-0.0299072265625,
0.0251617431640625,
0.0244140625,
-0.07415771484375,
-0.00885009765625,
0.03228759765625,
0.032623291015625,
0.013641357421875,
0.056640625,
-0.00815582275390625,
0.01204681396484375,
0.002460479736328125,
0.0281829833984375,
-0.006267547607421875,
-0.01436614990234375,
-0.025238037109375,
0.00020754337310791016,
-0.01318359375,
-0.00661468505859375
]
] |
cerebras/Cerebras-GPT-6.7B | 2023-04-07T13:51:31.000Z | [
"transformers",
"pytorch",
"gpt2",
"causal-lm",
"text-generation",
"en",
"dataset:the_pile",
"arxiv:2304.03208",
"arxiv:2203.15556",
"arxiv:2101.00027",
"license:apache-2.0",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | cerebras | null | null | cerebras/Cerebras-GPT-6.7B | 61 | 7,047 | transformers | 2023-03-20T20:45:13 | ---
language:
- en
inference: false
tags:
- pytorch
- causal-lm
license: apache-2.0
datasets:
- the_pile
pipeline_tag: text-generation
---
# Cerebras-GPT 6.7B
Check out our [Blog Post](https://www.cerebras.net/cerebras-gpt) and [arXiv paper](https://arxiv.org/abs/2304.03208)!
## Model Description
The Cerebras-GPT family is released to facilitate research into LLM scaling laws using open architectures and data sets, and to demonstrate the simplicity and scalability of training LLMs on the Cerebras software and hardware stack. All Cerebras-GPT models are available on Hugging Face.
The family includes 111M, 256M, 590M, 1.3B, 2.7B, 6.7B, and 13B models.
All models in the Cerebras-GPT family have been trained in accordance with [Chinchilla scaling laws](https://arxiv.org/abs/2203.15556) (20 tokens per model parameter) which is compute-optimal.
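The 20-tokens-per-parameter rule implies a concrete token budget for each model; as a rough sketch (the named sizes are rounded, so the figures are approximate):

```python
# Implied compute-optimal token budgets under the Chinchilla rule cited above.
params = {"111M": 111e6, "256M": 256e6, "590M": 590e6,
          "1.3B": 1.3e9, "2.7B": 2.7e9, "6.7B": 6.7e9, "13B": 13e9}
tokens = {name: 20 * p for name, p in params.items()}
print(f"6.7B model -> {tokens['6.7B'] / 1e9:.0f}B training tokens")  # ~134B
```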
These models were trained on the [Andromeda](https://www.cerebras.net/andromeda/) AI supercomputer comprised of 16 CS-2 wafer scale systems. Cerebras' [weight streaming technology](https://www.cerebras.net/blog/linear-scaling-made-possible-with-weight-streaming) simplifies the training of LLMs by disaggregating compute from model storage. This allowed for efficient scaling of training across nodes using simple data parallelism.
Cerebras systems for pre-training and fine-tuning are available in the cloud via the [Cerebras Model Studio](https://www.cerebras.net/product-cloud/). Cerebras CS-2 compatible checkpoints are available in the [Cerebras Model Zoo](https://github.com/Cerebras/modelzoo).
## Model Details
* Developed by: [Cerebras Systems](https://www.cerebras.net/)
* License: Apache 2.0
* Model type: Transformer-based Language Model
* Architecture: GPT-3 style architecture
* Data set: The Pile
* Tokenizer: Byte Pair Encoding
* Vocabulary Size: 50257
* Sequence Length: 2048
* Optimizer: AdamW, (β1, β2) = (0.9, 0.95), adam_eps = 1e-8 (1e-9 for larger models)
* Positional Encoding: Learned
* Language: English
* Learn more: Dense Scaling Laws Paper for training procedure, config files, and details on how to use.
**Contact**: To ask questions about Cerebras-GPT models, join the [Cerebras Discord](https://discord.gg/q6bZcMWJVu).
This is the standard parameterization version of Cerebras-GPT with **6.7B** parameters.
Related models: [Cerebras-GPT Models](https://huggingface.co/models?sort=downloads&search=cerebras-gpt)
<br><br>
| Model | Parameters | Layers | d_model | Heads | d_head | d_ffn | LR | BS (seq) | BS (tokens) |
|---------------|------------|--------|---------|-------|--------|--------|----------|----------|----------------|
| Cerebras-GPT | 111M | 10 | 768 | 12 | 64 | 3072 | 6.0E-04 | 120 | 246K |
| Cerebras-GPT | 256M | 14 | 1088 | 17 | 64 | 4352 | 6.0E-04 | 264 | 541K |
| Cerebras-GPT | 590M | 18 | 1536 | 12 | 128 | 6144 | 2.0E-04 | 264 | 541K |
| Cerebras-GPT | 1.3B | 24 | 2048 | 16 | 128 | 8192 | 2.0E-04 | 528 | 1.08M |
| Cerebras-GPT | 2.7B | 32 | 2560 | 20 | 128 | 10240 | 2.0E-04 | 528 | 1.08M |
| Cerebras-GPT | 6.7B | 32 | 4096 | 32 | 128 | 16384 | 1.2E-04 | 1040 | 2.13M |
| Cerebras-GPT | 13B | 40 | 5120 | 40 | 128 | 20480 | 1.2E-04 | 720 → 1080 | 1.47M → 2.21M |
<br><br>
## Quickstart
This model can be easily loaded using the AutoModelForCausalLM functionality:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
tokenizer = AutoTokenizer.from_pretrained("cerebras/Cerebras-GPT-6.7B")
model = AutoModelForCausalLM.from_pretrained("cerebras/Cerebras-GPT-6.7B")
text = "Generative AI is "
```
It can also be used with Hugging Face pipelines:
```python
from transformers import pipeline
pipe = pipeline("text-generation", model=model, tokenizer=tokenizer)
generated_text = pipe(text, max_length=50, do_sample=False, no_repeat_ngram_size=2)[0]
print(generated_text['generated_text'])
```
or with `model.generate()`
```python
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, num_beams=5,
max_new_tokens=50, early_stopping=True,
no_repeat_ngram_size=2)
text_output = tokenizer.batch_decode(outputs, skip_special_tokens=True)
print(text_output[0])
```
<br><br>
## Training data
Cerebras-GPT is trained using [the Pile](https://pile.eleuther.ai) dataset from [EleutherAI](https://www.eleuther.ai). See the [Pile paper](https://arxiv.org/abs/2101.00027) for a more detailed breakdown of data sources and methodology. The Pile was cleaned using the ftfy library to normalize the text, then filtered using scripts provided by Eleuther.
We tokenized the data using byte-pair encoding using the GPT-2 vocabulary. Our tokenized version of the Pile has 371B tokens. We include more details about the training dataset preprocessing in Appendix A.1 of our paper.
Recent works find significant duplicate data present in the Pile. Eleuther’s Pythia applies a deduplication process to reduce replicated data, decreasing the Pile dataset size. Pythia was trained on both the standard dataset and deduplicated dataset to characterize the impact. Our models are trained on the standard Pile without deduplication, which may present an opportunity for further improvement with the deduplicated data set.
<br><br>
## Training procedure
We use the GPT-3 style model architecture. All of our layers use full attention, as opposed to the GPT-3 style sparse banded attention. The model shapes were selected to either follow an aspect ratio of 80 or match the shapes of GPT-3 models. The learning rate was warmed up for 375M tokens (1,500 steps for the 111M and 256M models) and then decayed by 10x with a cosine schedule. No dropout was used and weight decay was set to 0.1. All models were trained with a maximum sequence length (MSL) of 2048.
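The warmup-then-cosine schedule described above can be sketched as a small function. This is an illustrative reconstruction, not the exact training code; the peak learning rate, warmup steps, and total steps come from the tables in this card, and the linear-warmup shape is an assumption.

```python
import math

def lr_schedule(step, peak_lr, warmup_steps, total_steps):
    # Linear warmup to the peak learning rate (assumed shape).
    if step < warmup_steps:
        return peak_lr * (step + 1) / warmup_steps
    # Cosine decay from peak_lr down to peak_lr / 10 (the "10x" decay).
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    min_lr = peak_lr / 10
    return min_lr + 0.5 * (peak_lr - min_lr) * (1.0 + math.cos(math.pi * progress))

# 111M model: peak LR 6.0E-04, 1500 warmup steps, 9037 total steps.
lr_mid_warmup = lr_schedule(749, 6.0e-4, 1500, 9037)   # half of peak
lr_final = lr_schedule(9037, 6.0e-4, 1500, 9037)       # peak_lr / 10
```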
All models were trained to the Chinchilla point: 20 tokens per model parameter. The number of steps was chosen based on the optimal batch size (which varied by model) and a fixed sequence length (2048). See the training table below for details.
<br>
Model Params | Sequence Length | Batch Size | Number of Steps | Tokens | Tokens per Parameter | Flops
------------ | -------------- | ---------- | --------------- | ------ | -------------------- | -----
111M | 2048 | 120 | 9037 | 2.22E+09 | 20 | 2.6E+18
256M | 2048 | 264 | 9468 | 5.12E+09 | 20 | 1.3E+19
590M | 2048 | 264 | 21836 | 1.18E+10 | 20 | 6.1E+19
1.3B | 2048 | 528 | 24334 | 2.63E+10 | 20 | 2.8E+20
2.7B | 2048 | 528 | 49041 | 5.30E+10 | 20 | 1.1E+21
6.7B | 2048 | 1040 | 62522 | 1.33E+11 | 20 | 6.3E+21
13B | 2048 | 720 | 174335 | 2.57E+11 | 20 | 2.3E+22
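The table's token budgets follow directly from the 20-tokens-per-parameter rule, and the FLOPs are roughly consistent with the common 6·N·D estimate for dense transformer training. A minimal check (the 6·N·D figure is a rule of thumb and a rough lower bound; the table's FLOPs appear to include additional terms):

```python
def chinchilla_tokens(params):
    # Chinchilla-style compute-optimal budget: ~20 training tokens per parameter.
    return 20 * params

def approx_train_flops(params, tokens):
    # Rule-of-thumb training cost for dense transformers: 6 FLOPs per
    # parameter per token. Treat as a rough lower bound on the table values.
    return 6 * params * tokens

tokens_6b7 = chinchilla_tokens(6.7e9)            # ~1.34e11, matching 1.33E+11 above
flops_6b7 = approx_train_flops(6.7e9, 1.33e11)   # ~5.3e21 vs 6.3E+21 in the table
```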
<br><br>
## Evaluations
We trained models from smallest to largest and fit a power law as we went along. The power law was helpful for extrapolating the validation loss of the next largest model we trained and provided confidence about whether the training run was going well.
We performed upstream (pre-training) evaluations of text prediction cross-entropy using the Pile validation and test splits. We performed downstream evaluations of text generation accuracy on standardized tasks using the [Eleuther lm-evaluation-harness](https://github.com/EleutherAI/lm-evaluation-harness). Results are compared against many publicly available large language models in Section 3 of the paper.
#### 0-shot Evaluation
| Model | Params | Training FLOPs | PILE test xent | Hella-Swag | PIQA | Wino-Grande | Lambada | ARC-e | ARC-c | OpenBookQA | Downstream Average |
| ------- | ----- | -------------- | -------------- | ---------- | ----- | ----------- | ------- | ----- | ----- | ---------- | ------------------ |
| Cerebras-GPT | 111M | 2.6E+18 | 2.566 | 0.268 | 0.594 | 0.488 | 0.194 | 0.380 | 0.166 | 0.118 | 0.315 |
| Cerebras-GPT | 256M | 1.3E+19 | 2.299 | 0.274 | 0.613 | 0.511 | 0.293 | 0.410 | 0.170 | 0.158 | 0.347 |
| Cerebras-GPT | 590M | 6.1E+19 | 2.184 | 0.291 | 0.627 | 0.498 | 0.366 | 0.464 | 0.190 | 0.158 | 0.370 |
| Cerebras-GPT | 1.3B | 2.8E+20 | 1.996 | 0.325 | 0.664 | 0.521 | 0.462 | 0.508 | 0.224 | 0.166 | 0.410 |
| Cerebras-GPT | 2.7B | 1.1E+21 | 1.834 | 0.386 | 0.701 | 0.559 | 0.567 | 0.571 | 0.246 | 0.206 | 0.462 |
| Cerebras-GPT | 6.7B | 6.3E+21 | 1.704 | 0.447 | 0.739 | 0.602 | 0.636 | 0.643 | 0.282 | 0.238 | 0.512 |
| Cerebras-GPT | 13B | 2.3E+22 | 1.575 | 0.513 | 0.766 | 0.646 | 0.696 | 0.714 | 0.367 | 0.286 | 0.570 |
#### 5-shot Evaluation
| Model | Params | Hella-Swag | PIQA | Wino-Grande | Lambada | ARC-e | ARC-c | OpenBookQA |
| -------- | ----- | ----------| ----- | ----------- | -------| ----- | ----- | ---------- |
| Cerebras-GPT | 111M | 0.267 | 0.588 | 0.475 | 0.158 | 0.356 | 0.166 | 0.136 |
| Cerebras-GPT | 256M | 0.278 | 0.606 | 0.522 | 0.225 | 0.422 | 0.183 | 0.164 |
| Cerebras-GPT | 590M | 0.291 | 0.634 | 0.479 | 0.281 | 0.475 | 0.206 | 0.152 |
| Cerebras-GPT | 1.3B | 0.326 | 0.668 | 0.536 | 0.395 | 0.529 | 0.241 | 0.174 |
| Cerebras-GPT | 2.7B | 0.382 | 0.697 | 0.543 | 0.487 | 0.590 | 0.267 | 0.224 |
| Cerebras-GPT | 6.7B | 0.444 | 0.736 | 0.590 | 0.591 | 0.667 | 0.314 | 0.270 |
| Cerebras-GPT | 13B | 0.514 | 0.768 | 0.674 | 0.655 | 0.743 | 0.398 | 0.318 |
<br><br>
## Uses and Limitations
### Intended Use
The primary intended use is to further research into large language models. These models can be used as a foundation model for NLP, applications, ethics, and alignment research. Our primary intended users are researchers who are working to improve LLMs and practitioners seeking reference implementations, training setups, hyperparameters, or pre-trained models. We release these models with a fully permissive Apache license for the community to use freely.
You may fine-tune and adapt Cerebras-GPT models for deployment via either Cerebras [Model Studio](https://www.cerebras.net/product-cloud/) or third-party libraries. Further safety-related testing and mitigations should be applied before using the Cerebras-GPT model family in production downstream applications.
Due to financial and compute budget constraints, Cerebras-GPT models were only trained and evaluated following the approaches described in the paper.
### Out of Scope Use
Cerebras-GPT models are trained on the Pile, an English-only dataset, and are not suitable for machine translation tasks.
Cerebras-GPT models have not been tuned for human-facing dialog applications like chatbots and will not respond to prompts in a similar way to models that have received instruction tuning or reinforcement learning from human feedback (RLHF) like Flan-T5 or ChatGPT. Cerebras-GPT models can be tuned using those methods.
### Risk, Bias, Ethical Considerations
* **Data**: The Pile dataset has been thoroughly analyzed from various ethical standpoints, such as toxicity analysis, gender bias, pejorative content, and racially sensitive content. Please refer to the Pile dataset references.
* **Human life**: The outputs from this model may or may not align with human values. The risk needs to be thoroughly investigated before deploying this model in a production environment where it can directly impact human life.
* **Risks and harms**: There can be distributional bias in the Pile dataset that can manifest in various forms in the downstream model deployment. There are other risks associated with large language models such as amplifying stereotypes, memorizing training data, or revealing private or secure information.
* **Mitigations**: Only mitigations in standard Pile dataset pre-processing were employed when pre-training Cerebras-GPT.
<br><br>
## Acknowledgements
We are thankful to all Cerebras engineers, past and present, that made this work possible. | 12,576 | [
[
-0.0276336669921875,
-0.047271728515625,
0.018218994140625,
0.0130157470703125,
-0.0193634033203125,
-0.015411376953125,
-0.01580810546875,
-0.032379150390625,
0.0136871337890625,
0.0217132568359375,
-0.02862548828125,
-0.0308837890625,
-0.055572509765625,
-0.015655517578125,
-0.030670166015625,
0.0849609375,
-0.00537872314453125,
0.0031147003173828125,
0.01041412353515625,
-0.005863189697265625,
-0.01477813720703125,
-0.042510986328125,
-0.05804443359375,
-0.0305633544921875,
0.03594970703125,
-0.0008606910705566406,
0.056976318359375,
0.05975341796875,
0.026519775390625,
0.022064208984375,
-0.0288238525390625,
-0.00487518310546875,
-0.0236663818359375,
-0.0237274169921875,
0.01042938232421875,
-0.0185089111328125,
-0.04119873046875,
-0.007762908935546875,
0.051239013671875,
0.048126220703125,
-0.0268096923828125,
0.018524169921875,
0.02459716796875,
0.05487060546875,
-0.037017822265625,
0.01268768310546875,
-0.037384033203125,
0.000659942626953125,
-0.018707275390625,
0.0008950233459472656,
-0.0217437744140625,
-0.01496124267578125,
0.0016145706176757812,
-0.039764404296875,
0.0218963623046875,
-0.003345489501953125,
0.09552001953125,
0.01739501953125,
-0.03228759765625,
-0.020172119140625,
-0.0309600830078125,
0.053314208984375,
-0.057464599609375,
0.029083251953125,
0.01293182373046875,
-0.00015747547149658203,
-0.0024127960205078125,
-0.06390380859375,
-0.038909912109375,
-0.0166168212890625,
-0.0157928466796875,
0.01161956787109375,
-0.015625,
0.004390716552734375,
0.034332275390625,
0.039520263671875,
-0.05902099609375,
0.0157012939453125,
-0.03704833984375,
-0.01812744140625,
0.051422119140625,
0.010955810546875,
0.015777587890625,
-0.0266876220703125,
-0.031768798828125,
-0.029754638671875,
-0.0372314453125,
0.0247344970703125,
0.0309600830078125,
0.0153350830078125,
-0.0318603515625,
0.029296875,
-0.0128326416015625,
0.046173095703125,
0.02203369140625,
-0.00730133056640625,
0.041748046875,
-0.0217437744140625,
-0.033294677734375,
-0.004589080810546875,
0.078369140625,
0.01317596435546875,
0.0123443603515625,
0.00710296630859375,
-0.01508331298828125,
-0.0108489990234375,
0.0006966590881347656,
-0.0814208984375,
-0.0263519287109375,
0.01352691650390625,
-0.0439453125,
-0.0294342041015625,
0.00238800048828125,
-0.05218505859375,
-0.01548004150390625,
-0.030181884765625,
0.03692626953125,
-0.036956787109375,
-0.0253448486328125,
0.007122039794921875,
0.0019702911376953125,
0.03326416015625,
0.019683837890625,
-0.0888671875,
0.020751953125,
0.0294952392578125,
0.0633544921875,
0.003597259521484375,
-0.0283050537109375,
-0.017974853515625,
-0.0014886856079101562,
-0.01068878173828125,
0.03594970703125,
-0.002696990966796875,
-0.0269927978515625,
-0.016754150390625,
0.0096893310546875,
-0.03277587890625,
-0.026702880859375,
0.037994384765625,
-0.0258636474609375,
0.0165863037109375,
-0.01065826416015625,
-0.0400390625,
-0.028839111328125,
0.01262664794921875,
-0.0418701171875,
0.08380126953125,
0.01406097412109375,
-0.06982421875,
0.019866943359375,
-0.03497314453125,
-0.0184783935546875,
-0.00616455078125,
-0.0105438232421875,
-0.048065185546875,
-0.01158905029296875,
0.03277587890625,
0.043243408203125,
-0.0244140625,
0.0259246826171875,
-0.0178375244140625,
-0.0214385986328125,
-0.0065460205078125,
-0.037994384765625,
0.08795166015625,
0.0210418701171875,
-0.046295166015625,
0.0003440380096435547,
-0.055511474609375,
0.0104522705078125,
0.026275634765625,
-0.030181884765625,
0.00794219970703125,
-0.0164337158203125,
0.00855255126953125,
0.01898193359375,
0.0274505615234375,
-0.0200653076171875,
0.014617919921875,
-0.03411865234375,
0.039764404296875,
0.052947998046875,
0.0034942626953125,
0.022613525390625,
-0.0235748291015625,
0.033172607421875,
0.00681304931640625,
0.018463134765625,
-0.01076507568359375,
-0.03936767578125,
-0.0565185546875,
-0.01812744140625,
0.032928466796875,
0.04107666015625,
-0.034027099609375,
0.03759765625,
-0.022613525390625,
-0.059844970703125,
-0.0166778564453125,
0.005523681640625,
0.035369873046875,
0.04034423828125,
0.033233642578125,
-0.0197906494140625,
-0.036529541015625,
-0.072021484375,
-0.0057830810546875,
-0.0183258056640625,
-0.00292205810546875,
0.01490020751953125,
0.0582275390625,
-0.004154205322265625,
0.053314208984375,
-0.0347900390625,
-0.004604339599609375,
-0.006008148193359375,
0.014007568359375,
0.033905029296875,
0.046875,
0.0465087890625,
-0.057373046875,
-0.04083251953125,
0.0015764236450195312,
-0.06060791015625,
0.0091552734375,
-0.01480865478515625,
0.0035858154296875,
0.0227203369140625,
0.032867431640625,
-0.054595947265625,
0.028106689453125,
0.0487060546875,
-0.0251312255859375,
0.0477294921875,
-0.020538330078125,
-0.0004410743713378906,
-0.08050537109375,
0.0231781005859375,
0.0108489990234375,
-0.0029354095458984375,
-0.044647216796875,
0.00498199462890625,
0.01885986328125,
0.0034427642822265625,
-0.04595947265625,
0.040435791015625,
-0.045623779296875,
-0.0004410743713378906,
-0.00013434886932373047,
0.00957489013671875,
-0.0070343017578125,
0.06402587890625,
0.007175445556640625,
0.05133056640625,
0.046173095703125,
-0.04791259765625,
0.00907135009765625,
0.011627197265625,
-0.0167083740234375,
0.0267486572265625,
-0.062103271484375,
0.0021800994873046875,
-0.00235748291015625,
0.02679443359375,
-0.05377197265625,
-0.0130462646484375,
0.0184783935546875,
-0.044647216796875,
0.037353515625,
-0.0202178955078125,
-0.0311431884765625,
-0.048309326171875,
-0.022491455078125,
0.0248870849609375,
0.052490234375,
-0.043609619140625,
0.04266357421875,
0.0191650390625,
-0.0022258758544921875,
-0.04986572265625,
-0.05377197265625,
-0.003574371337890625,
-0.0301666259765625,
-0.0640869140625,
0.038726806640625,
-0.005954742431640625,
-0.0002574920654296875,
-0.01479339599609375,
0.0034427642822265625,
0.0029163360595703125,
0.00237274169921875,
0.0227813720703125,
0.02099609375,
-0.01021575927734375,
-0.00794219970703125,
0.0010709762573242188,
-0.006229400634765625,
0.006061553955078125,
-0.025787353515625,
0.052520751953125,
-0.0293731689453125,
-0.0176544189453125,
-0.03955078125,
-0.0123138427734375,
0.0438232421875,
-0.013763427734375,
0.06317138671875,
0.06121826171875,
-0.040069580078125,
0.012420654296875,
-0.035491943359375,
-0.002338409423828125,
-0.037353515625,
0.03643798828125,
-0.0293121337890625,
-0.053497314453125,
0.0556640625,
0.022918701171875,
0.0084686279296875,
0.063232421875,
0.05657958984375,
0.00885772705078125,
0.083740234375,
0.0288848876953125,
-0.01470184326171875,
0.03717041015625,
-0.05242919921875,
-0.00010025501251220703,
-0.0714111328125,
-0.0197906494140625,
-0.033355712890625,
-0.01386260986328125,
-0.05255126953125,
-0.0217742919921875,
0.018524169921875,
0.0266571044921875,
-0.0501708984375,
0.0372314453125,
-0.055450439453125,
0.0167083740234375,
0.0357666015625,
0.01386260986328125,
0.0059967041015625,
0.0007462501525878906,
-0.0243377685546875,
0.0002028942108154297,
-0.05316162109375,
-0.035064697265625,
0.0921630859375,
0.041351318359375,
0.03375244140625,
-0.008087158203125,
0.058624267578125,
-0.0015506744384765625,
0.0293121337890625,
-0.0465087890625,
0.033294677734375,
-0.006256103515625,
-0.046783447265625,
-0.0238800048828125,
-0.043243408203125,
-0.075439453125,
0.038665771484375,
0.0013484954833984375,
-0.0740966796875,
0.0197296142578125,
0.00909423828125,
-0.033935546875,
0.0445556640625,
-0.04388427734375,
0.070068359375,
-0.0198516845703125,
-0.0284576416015625,
-0.01099395751953125,
-0.054779052734375,
0.03607177734375,
-0.001857757568359375,
0.01788330078125,
0.0104522705078125,
0.006130218505859375,
0.0714111328125,
-0.050628662109375,
0.054290771484375,
-0.02520751953125,
-0.01202392578125,
0.041015625,
-0.0097198486328125,
0.057647705078125,
-0.0007171630859375,
-0.0052490234375,
0.0188140869140625,
0.0005283355712890625,
-0.0305328369140625,
-0.019439697265625,
0.05755615234375,
-0.0806884765625,
-0.035125732421875,
-0.03948974609375,
-0.038177490234375,
0.005283355712890625,
0.01245880126953125,
0.0382080078125,
0.0294647216796875,
0.0029773712158203125,
0.028778076171875,
0.048248291015625,
-0.01363372802734375,
0.051361083984375,
0.0217437744140625,
-0.01727294921875,
-0.046844482421875,
0.0616455078125,
0.0224761962890625,
0.0188751220703125,
0.01445770263671875,
0.00850677490234375,
-0.0303802490234375,
-0.0460205078125,
-0.04205322265625,
0.024169921875,
-0.047698974609375,
-0.01015472412109375,
-0.060150146484375,
-0.03204345703125,
-0.035186767578125,
-0.00977325439453125,
-0.0245361328125,
-0.030303955078125,
-0.0244598388671875,
-0.006103515625,
0.0265045166015625,
0.038726806640625,
-0.007389068603515625,
0.028411865234375,
-0.05535888671875,
0.00655364990234375,
0.022613525390625,
0.00914764404296875,
0.01502227783203125,
-0.07275390625,
-0.0265045166015625,
0.00890350341796875,
-0.04791259765625,
-0.06158447265625,
0.0443115234375,
-0.00420379638671875,
0.03460693359375,
0.023162841796875,
-0.0211944580078125,
0.05419921875,
-0.0212249755859375,
0.072265625,
0.0253753662109375,
-0.071533203125,
0.038665771484375,
-0.045562744140625,
0.01617431640625,
0.03167724609375,
0.027862548828125,
-0.037567138671875,
-0.014007568359375,
-0.07452392578125,
-0.07403564453125,
0.055938720703125,
0.0264434814453125,
-0.0015888214111328125,
0.0118865966796875,
0.034759521484375,
-0.01358795166015625,
0.01197052001953125,
-0.07781982421875,
-0.0208282470703125,
-0.022491455078125,
-0.01471710205078125,
-0.0027980804443359375,
0.0031070709228515625,
0.010894775390625,
-0.035858154296875,
0.065185546875,
-0.0084075927734375,
0.01800537109375,
0.01953125,
-0.01396942138671875,
-0.00977325439453125,
-0.003662109375,
0.0396728515625,
0.04205322265625,
-0.011627197265625,
-0.0198516845703125,
0.032928466796875,
-0.056060791015625,
0.002735137939453125,
0.022705078125,
-0.02703857421875,
-0.0092926025390625,
0.0185699462890625,
0.07025146484375,
0.01441192626953125,
-0.0232391357421875,
0.034454345703125,
0.00226593017578125,
-0.04266357421875,
-0.0300750732421875,
0.0007476806640625,
0.015838623046875,
0.01508331298828125,
0.0275726318359375,
-0.0008101463317871094,
0.00162506103515625,
-0.0217742919921875,
0.00991058349609375,
0.026580810546875,
-0.0219268798828125,
-0.0202178955078125,
0.0714111328125,
-0.0021038055419921875,
-0.006832122802734375,
0.05078125,
-0.01299285888671875,
-0.036956787109375,
0.07550048828125,
0.0243072509765625,
0.062744140625,
-0.020965576171875,
0.00991058349609375,
0.06170654296875,
0.028839111328125,
-0.0202789306640625,
0.00379180908203125,
0.006992340087890625,
-0.036956787109375,
-0.0214080810546875,
-0.060333251953125,
-0.01544189453125,
0.0253448486328125,
-0.053314208984375,
0.036712646484375,
-0.037841796875,
-0.00853729248046875,
-0.006702423095703125,
0.02508544921875,
-0.056396484375,
0.0299224853515625,
0.0211029052734375,
0.06427001953125,
-0.0628662109375,
0.0687255859375,
0.03900146484375,
-0.055084228515625,
-0.08953857421875,
-0.005397796630859375,
-0.0015316009521484375,
-0.063232421875,
0.039947509765625,
0.02252197265625,
0.01715087890625,
0.01479339599609375,
-0.038970947265625,
-0.0887451171875,
0.11920166015625,
0.0185699462890625,
-0.054473876953125,
-0.014801025390625,
0.005413055419921875,
0.042236328125,
-0.00884246826171875,
0.03900146484375,
0.039459228515625,
0.034393310546875,
0.0016908645629882812,
-0.07940673828125,
0.0198211669921875,
-0.02252197265625,
0.00798797607421875,
0.02203369140625,
-0.08056640625,
0.08966064453125,
-0.00978851318359375,
-0.0018491744995117188,
0.0092315673828125,
0.055267333984375,
0.0406494140625,
0.01081085205078125,
0.042633056640625,
0.062164306640625,
0.0626220703125,
-0.006351470947265625,
0.08660888671875,
-0.044952392578125,
0.05389404296875,
0.06610107421875,
0.003154754638671875,
0.05511474609375,
0.0318603515625,
-0.030609130859375,
0.046875,
0.06939697265625,
-0.01267242431640625,
0.0196685791015625,
0.01983642578125,
-0.0039520263671875,
-0.0077972412109375,
0.01499176025390625,
-0.045928955078125,
0.01067352294921875,
0.0211181640625,
-0.038970947265625,
-0.0091705322265625,
-0.0022373199462890625,
0.019500732421875,
-0.013671875,
-0.0318603515625,
0.0293121337890625,
0.0120849609375,
-0.045013427734375,
0.0687255859375,
0.007602691650390625,
0.053619384765625,
-0.03948974609375,
0.0233917236328125,
-0.011962890625,
0.0166778564453125,
-0.0264739990234375,
-0.048553466796875,
0.007595062255859375,
0.001129150390625,
-0.0027866363525390625,
-0.01568603515625,
0.040924072265625,
-0.0162200927734375,
-0.037353515625,
0.0311737060546875,
0.0272369384765625,
0.01403045654296875,
-0.0116729736328125,
-0.07177734375,
-0.00836181640625,
0.005962371826171875,
-0.065185546875,
0.031494140625,
0.026153564453125,
-0.006206512451171875,
0.045928955078125,
0.0452880859375,
-0.002231597900390625,
0.0094757080078125,
0.009490966796875,
0.0740966796875,
-0.0465087890625,
-0.0321044921875,
-0.06396484375,
0.0511474609375,
-0.001861572265625,
-0.041595458984375,
0.05548095703125,
0.0487060546875,
0.05865478515625,
0.01088714599609375,
0.047393798828125,
-0.022857666015625,
0.01800537109375,
-0.04473876953125,
0.051422119140625,
-0.044586181640625,
0.0106964111328125,
-0.020965576171875,
-0.07293701171875,
-0.0096282958984375,
0.04425048828125,
-0.03570556640625,
0.03472900390625,
0.059326171875,
0.063232421875,
0.004230499267578125,
0.00479888916015625,
0.00492095947265625,
0.0227813720703125,
0.0230865478515625,
0.063232421875,
0.0357666015625,
-0.0645751953125,
0.0572509765625,
-0.03094482421875,
-0.014801025390625,
-0.01084136962890625,
-0.051910400390625,
-0.05767822265625,
-0.038970947265625,
-0.032440185546875,
-0.03131103515625,
-0.0025806427001953125,
0.05865478515625,
0.05224609375,
-0.049652099609375,
-0.0197296142578125,
-0.029510498046875,
-0.0137176513671875,
-0.0169219970703125,
-0.0205230712890625,
0.049835205078125,
-0.02001953125,
-0.05621337890625,
0.005550384521484375,
-0.007297515869140625,
0.021697998046875,
-0.023193359375,
-0.0275115966796875,
-0.0160980224609375,
-0.0002841949462890625,
0.025054931640625,
0.024993896484375,
-0.043365478515625,
-0.0174102783203125,
-0.00359344482421875,
-0.0238494873046875,
0.00879669189453125,
0.0338134765625,
-0.047149658203125,
-0.00011801719665527344,
0.033905029296875,
0.022918701171875,
0.07135009765625,
-0.009307861328125,
0.0160980224609375,
-0.037353515625,
0.01678466796875,
0.0080718994140625,
0.042633056640625,
0.0169830322265625,
-0.030792236328125,
0.04925537109375,
0.0292205810546875,
-0.05859375,
-0.060546875,
-0.00656890869140625,
-0.07244873046875,
-0.014801025390625,
0.08282470703125,
-0.01074981689453125,
-0.0301666259765625,
0.0177764892578125,
-0.01342010498046875,
0.027679443359375,
-0.01885986328125,
0.04559326171875,
0.05255126953125,
-0.003673553466796875,
-0.01371002197265625,
-0.053497314453125,
0.027679443359375,
0.040283203125,
-0.05419921875,
-0.0007719993591308594,
0.0220794677734375,
0.031036376953125,
0.0151214599609375,
0.050811767578125,
-0.0227203369140625,
0.01561737060546875,
0.00794219970703125,
0.021514892578125,
-0.0007843971252441406,
-0.005863189697265625,
-0.041473388671875,
0.0115966796875,
-0.00527191162109375,
-0.004299163818359375
]
] |
sentence-transformers/paraphrase-MiniLM-L12-v2 | 2022-06-15T20:18:46.000Z | [
"sentence-transformers",
"pytorch",
"tf",
"bert",
"feature-extraction",
"sentence-similarity",
"transformers",
"arxiv:1908.10084",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] | sentence-similarity | sentence-transformers | null | null | sentence-transformers/paraphrase-MiniLM-L12-v2 | 4 | 7,036 | sentence-transformers | 2022-03-02T23:29:05 | ---
pipeline_tag: sentence-similarity
license: apache-2.0
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- transformers
---
# sentence-transformers/paraphrase-MiniLM-L12-v2
This is a [sentence-transformers](https://www.SBERT.net) model: It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for tasks like clustering or semantic search.
## Usage (Sentence-Transformers)
Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:
```
pip install -U sentence-transformers
```
Then you can use the model like this:
```python
from sentence_transformers import SentenceTransformer
sentences = ["This is an example sentence", "Each sentence is converted"]
model = SentenceTransformer('sentence-transformers/paraphrase-MiniLM-L12-v2')
embeddings = model.encode(sentences)
print(embeddings)
```
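The returned embeddings can be compared with cosine similarity for semantic search or clustering. A minimal NumPy helper is sketched below; in practice `a` and `b` would be rows of the `embeddings` array above (sentence-transformers also ships a ready-made `util.cos_sim`):

```python
import numpy as np

def cos_sim(a, b):
    # Cosine similarity between two embedding vectors, in [-1, 1].
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Vectors pointing the same way score 1.0; orthogonal vectors score 0.0.
same = cos_sim([1.0, 2.0, 3.0], [2.0, 4.0, 6.0])
orth = cos_sim([1.0, 0.0], [0.0, 1.0])
```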
## Usage (HuggingFace Transformers)
Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: First, you pass your input through the transformer model, then you apply the right pooling operation on top of the contextualized word embeddings.
```python
from transformers import AutoTokenizer, AutoModel
import torch
#Mean Pooling - Take attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
token_embeddings = model_output[0] #First element of model_output contains all token embeddings
input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)
# Sentences we want sentence embeddings for
sentences = ['This is an example sentence', 'Each sentence is converted']
# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('sentence-transformers/paraphrase-MiniLM-L12-v2')
model = AutoModel.from_pretrained('sentence-transformers/paraphrase-MiniLM-L12-v2')
# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# Compute token embeddings
with torch.no_grad():
model_output = model(**encoded_input)
# Perform pooling. In this case, mean pooling.
sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask'])
print("Sentence embeddings:")
print(sentence_embeddings)
```
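The masked averaging in `mean_pooling` can be checked on toy arrays. The NumPy re-implementation below mirrors the arithmetic of the PyTorch version above and shows that padding positions are excluded from the average (the toy values are illustrative):

```python
import numpy as np

def mean_pooling_np(token_embeddings, attention_mask):
    # token_embeddings: (batch, seq_len, dim); attention_mask: (batch, seq_len).
    mask = attention_mask[..., None].astype(float)      # broadcast over hidden dim
    summed = (token_embeddings * mask).sum(axis=1)      # sum only the real tokens
    counts = np.clip(mask.sum(axis=1), 1e-9, None)      # avoid division by zero
    return summed / counts

emb = np.ones((2, 4, 3))
emb[0, 2:] = 99.0                       # padding rows that must be ignored
mask = np.array([[1, 1, 0, 0],
                 [1, 1, 1, 1]])
pooled = mean_pooling_np(emb, mask)     # both rows average to all-ones vectors
```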
## Evaluation Results
For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name=sentence-transformers/paraphrase-MiniLM-L12-v2)
## Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
)
```
## Citing & Authors
This model was trained by [sentence-transformers](https://www.sbert.net/).
If you find this model helpful, feel free to cite our publication [Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks](https://arxiv.org/abs/1908.10084):
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "http://arxiv.org/abs/1908.10084",
}
``` | 3,698 | [
[
-0.0172119140625,
-0.0546875,
0.028839111328125,
0.018768310546875,
-0.027618408203125,
-0.0325927734375,
-0.0113067626953125,
0.0016317367553710938,
0.01000213623046875,
0.032135009765625,
-0.041595458984375,
-0.0225067138671875,
-0.044464111328125,
0.00920867919921875,
-0.0399169921875,
0.07073974609375,
-0.005237579345703125,
-0.001529693603515625,
-0.0290374755859375,
-0.020172119140625,
-0.0158843994140625,
-0.0277252197265625,
-0.03216552734375,
-0.0198822021484375,
0.0184783935546875,
0.019775390625,
0.044281005859375,
0.03558349609375,
0.02642822265625,
0.033172607421875,
-0.0013170242309570312,
0.005336761474609375,
-0.0211944580078125,
-0.01041412353515625,
0.00160980224609375,
-0.031402587890625,
-0.003143310546875,
0.0104217529296875,
0.0423583984375,
0.02978515625,
-0.0083770751953125,
0.0211334228515625,
0.017120361328125,
0.0157623291015625,
-0.019622802734375,
0.034210205078125,
-0.053070068359375,
0.004680633544921875,
0.001956939697265625,
0.00565338134765625,
-0.03411865234375,
-0.0125732421875,
0.020416259765625,
-0.0278167724609375,
0.0242462158203125,
0.0260162353515625,
0.076904296875,
0.03143310546875,
-0.0124664306640625,
-0.0306396484375,
-0.0144195556640625,
0.0672607421875,
-0.061492919921875,
0.00945281982421875,
0.0237579345703125,
0.0032863616943359375,
0.0083770751953125,
-0.09051513671875,
-0.057281494140625,
-0.01222991943359375,
-0.048736572265625,
0.000033795833587646484,
-0.0311431884765625,
-0.0014095306396484375,
0.016143798828125,
0.01467132568359375,
-0.053253173828125,
-0.019012451171875,
-0.0323486328125,
-0.0191802978515625,
0.025970458984375,
0.0130462646484375,
0.029205322265625,
-0.052337646484375,
-0.0323486328125,
-0.025299072265625,
-0.0155181884765625,
-0.0010519027709960938,
-0.004734039306640625,
0.01812744140625,
-0.0285491943359375,
0.058563232421875,
-0.002933502197265625,
0.03326416015625,
0.0014581680297851562,
0.0086212158203125,
0.03662109375,
-0.04296875,
-0.0228271484375,
-0.012664794921875,
0.08355712890625,
0.0263671875,
0.01172637939453125,
0.0025196075439453125,
-0.01314544677734375,
-0.00029754638671875,
0.0013751983642578125,
-0.059600830078125,
-0.041595458984375,
0.00403594970703125,
-0.0380859375,
-0.027862548828125,
-0.000804901123046875,
-0.06536865234375,
-0.006107330322265625,
0.0033855438232421875,
0.056060791015625,
-0.044189453125,
0.0229949951171875,
0.0033969879150390625,
-0.0272369384765625,
0.01549530029296875,
-0.0213775634765625,
-0.050628662109375,
0.023712158203125,
0.017974853515625,
0.08135986328125,
0.0080413818359375,
-0.040618896484375,
-0.0247650146484375,
-0.0018510818481445312,
0.01316070556640625,
0.05487060546875,
-0.026275634765625,
-0.0018901824951171875,
0.0012750625610351562,
0.01458740234375,
-0.0474853515625,
-0.036163330078125,
0.04815673828125,
-0.0217742919921875,
0.0455322265625,
0.00533294677734375,
-0.055633544921875,
-0.0017185211181640625,
0.0008153915405273438,
-0.037872314453125,
0.08587646484375,
0.004730224609375,
-0.07293701171875,
-0.002880096435546875,
-0.050140380859375,
-0.0132904052734375,
-0.00804901123046875,
-0.004024505615234375,
-0.049041748046875,
0.00931549072265625,
0.039764404296875,
0.043792724609375,
-0.002895355224609375,
0.0105438232421875,
-0.0202178955078125,
-0.031463623046875,
0.0240936279296875,
-0.022613525390625,
0.0863037109375,
0.01042938232421875,
-0.0283355712890625,
0.01251220703125,
-0.03125,
-0.006427764892578125,
0.0268707275390625,
-0.004322052001953125,
-0.0230255126953125,
-0.0116424560546875,
0.018157958984375,
0.029296875,
0.026519775390625,
-0.04400634765625,
0.0012388229370117188,
-0.0377197265625,
0.0693359375,
0.04638671875,
0.009765625,
0.04986572265625,
-0.03411865234375,
0.0171356201171875,
0.0131072998046875,
0.00652313232421875,
-0.008697509765625,
-0.0396728515625,
-0.0753173828125,
-0.0224456787109375,
0.023101806640625,
0.04437255859375,
-0.074951171875,
0.06036376953125,
-0.03826904296875,
-0.039398193359375,
-0.06304931640625,
0.0116424560546875,
0.0151824951171875,
0.03533935546875,
0.055572509765625,
0.0123443603515625,
-0.0484619140625,
-0.0740966796875,
-0.01273345947265625,
-0.0022411346435546875,
0.00020706653594970703,
0.014862060546875,
0.0574951171875,
-0.0272674560546875,
0.0745849609375,
-0.043365478515625,
-0.036895751953125,
-0.043701171875,
0.016998291015625,
0.0162811279296875,
0.043548583984375,
0.042083740234375,
-0.05224609375,
-0.04254150390625,
-0.044403076171875,
-0.05645751953125,
-0.00569915771484375,
-0.01995849609375,
-0.020111083984375,
0.0006389617919921875,
0.042236328125,
-0.06866455078125,
0.026336669921875,
0.038116455078125,
-0.031524658203125,
0.0196990966796875,
-0.0257415771484375,
-0.0223388671875,
-0.08758544921875,
-0.0030498504638671875,
-0.004657745361328125,
-0.0182342529296875,
-0.029388427734375,
0.01251983642578125,
0.0162200927734375,
-0.010833740234375,
-0.036529541015625,
0.040435791015625,
-0.028961181640625,
0.01361846923828125,
-0.005584716796875,
0.035888671875,
-0.0019388198852539062,
0.0535888671875,
-0.01287078857421875,
0.056915283203125,
0.032745361328125,
-0.042266845703125,
0.028961181640625,
0.048858642578125,
-0.03643798828125,
0.01122283935546875,
-0.06610107421875,
0.00630950927734375,
0.004199981689453125,
0.029052734375,
-0.08502197265625,
-0.002532958984375,
0.0240478515625,
-0.033233642578125,
-0.004062652587890625,
0.0164337158203125,
-0.0584716796875,
-0.048858642578125,
-0.039825439453125,
0.013946533203125,
0.056884765625,
-0.039215087890625,
0.03863525390625,
0.01837158203125,
-0.01116943359375,
-0.0239715576171875,
-0.0806884765625,
0.0013637542724609375,
-0.0242462158203125,
-0.04925537109375,
0.0377197265625,
-0.0135040283203125,
0.007781982421875,
0.01507568359375,
0.0170135498046875,
0.0008225440979003906,
-0.0142059326171875,
-0.008026123046875,
0.012115478515625,
-0.0037441253662109375,
0.00746917724609375,
0.02337646484375,
-0.009521484375,
0.0008554458618164062,
-0.0105438232421875,
0.055572509765625,
-0.01971435546875,
-0.003437042236328125,
-0.03790283203125,
0.0172882080078125,
0.03131103515625,
-0.007366180419921875,
0.083984375,
0.068359375,
-0.0277862548828125,
-0.0038623809814453125,
-0.02947998046875,
-0.0294189453125,
-0.038299560546875,
0.0345458984375,
-0.023681640625,
-0.058807373046875,
0.0306854248046875,
0.0234527587890625,
0.0017871856689453125,
0.054473876953125,
0.045074462890625,
-0.02484130859375,
0.05999755859375,
0.043487548828125,
-0.0037212371826171875,
0.03826904296875,
-0.041412353515625,
0.0150909423828125,
-0.0648193359375,
-0.0012493133544921875,
-0.022308349609375,
-0.02362060546875,
-0.045654296875,
-0.040252685546875,
0.024871826171875,
-0.003631591796875,
-0.015106201171875,
0.050384521484375,
-0.034698486328125,
0.01416015625,
0.054351806640625,
0.016204833984375,
-0.0008606910705566406,
0.009033203125,
-0.04034423828125,
-0.01422882080078125,
-0.06170654296875,
-0.043853759765625,
0.06195068359375,
0.021575927734375,
0.037261962890625,
-0.01071929931640625,
0.06060791015625,
0.01294708251953125,
0.002201080322265625,
-0.0433349609375,
0.0545654296875,
-0.0209808349609375,
-0.0333251953125,
-0.028961181640625,
-0.024139404296875,
-0.06134033203125,
0.037261962890625,
-0.0017642974853515625,
-0.050445556640625,
0.00872802734375,
-0.007610321044921875,
-0.032196044921875,
0.01219940185546875,
-0.058563232421875,
0.0841064453125,
0.0119171142578125,
-0.0007848739624023438,
-0.004978179931640625,
-0.0615234375,
0.016082763671875,
0.001171112060546875,
0.017791748046875,
-0.00424957275390625,
-0.0139617919921875,
0.07391357421875,
-0.032012939453125,
0.0682373046875,
-0.01113128662109375,
0.0269927978515625,
0.0290374755859375,
-0.0186004638671875,
0.033416748046875,
-0.007480621337890625,
-0.004665374755859375,
0.000728607177734375,
0.004177093505859375,
-0.036041259765625,
-0.04095458984375,
0.0523681640625,
-0.06634521484375,
-0.030120849609375,
-0.03277587890625,
-0.04638671875,
-0.0008134841918945312,
0.0153961181640625,
0.03704833984375,
0.022613525390625,
0.0011682510375976562,
0.04193115234375,
0.036224365234375,
-0.0203857421875,
0.0562744140625,
0.0009279251098632812,
-0.005462646484375,
-0.03778076171875,
0.051971435546875,
0.00699615478515625,
0.00811004638671875,
0.0343017578125,
0.027435302734375,
-0.031768798828125,
-0.016143798828125,
-0.0274505615234375,
0.041595458984375,
-0.050994873046875,
-0.0162200927734375,
-0.0791015625,
-0.034759521484375,
-0.0477294921875,
0.001987457275390625,
-0.0101318359375,
-0.031707763671875,
-0.03314208984375,
-0.0172576904296875,
0.02569580078125,
0.0296630859375,
-0.00009322166442871094,
0.0340576171875,
-0.05401611328125,
0.0224456787109375,
0.0224761962890625,
-0.0190277099609375,
-0.006092071533203125,
-0.072265625,
-0.027984619140625,
0.005970001220703125,
-0.029541015625,
-0.059326171875,
0.049530029296875,
0.0248260498046875,
0.0438232421875,
0.0019664764404296875,
0.0157623291015625,
0.055511474609375,
-0.04766845703125,
0.06707763671875,
-0.00048351287841796875,
-0.08099365234375,
0.033599853515625,
-0.002964019775390625,
0.02520751953125,
0.040435791015625,
0.0124053955078125,
-0.0311126708984375,
-0.037200927734375,
-0.06158447265625,
-0.06878662109375,
0.05615234375,
0.045989990234375,
0.042724609375,
-0.016021728515625,
0.017303466796875,
-0.016021728515625,
0.01450347900390625,
-0.08697509765625,
-0.036651611328125,
-0.0211639404296875,
-0.050018310546875,
-0.0245513916015625,
-0.023895263671875,
0.003215789794921875,
-0.039581298828125,
0.049041748046875,
0.0011472702026367188,
0.064453125,
0.0156097412109375,
-0.0394287109375,
0.02008056640625,
0.007213592529296875,
0.03857421875,
0.0157928466796875,
-0.005474090576171875,
0.0233306884765625,
0.031707763671875,
-0.0201568603515625,
0.0020694732666015625,
0.0310211181640625,
-0.0202178955078125,
0.02252197265625,
0.035400390625,
0.065185546875,
0.03802490234375,
-0.037109375,
0.058441162109375,
-0.0037212371826171875,
-0.01300811767578125,
-0.024078369140625,
-0.01227569580078125,
0.0207977294921875,
0.0266265869140625,
0.020751953125,
0.003910064697265625,
0.0026073455810546875,
-0.029632568359375,
0.0292205810546875,
0.0158843994140625,
-0.021575927734375,
-0.005901336669921875,
0.057525634765625,
-0.00475311279296875,
-0.01427459716796875,
0.06439208984375,
-0.016082763671875,
-0.04608154296875,
0.031982421875,
0.043853759765625,
0.0699462890625,
0.005794525146484375,
0.0168609619140625,
0.023193359375,
0.034210205078125,
-0.0034961700439453125,
-0.0033702850341796875,
0.0026187896728515625,
-0.057830810546875,
-0.0018968582153320312,
-0.0516357421875,
0.0036067962646484375,
-0.0002655982971191406,
-0.044769287109375,
0.019866943359375,
-0.00978851318359375,
0.00014650821685791016,
-0.0089111328125,
-0.00672149658203125,
-0.056793212890625,
0.001956939697265625,
-0.001129150390625,
0.061279296875,
-0.07476806640625,
0.0771484375,
0.04833984375,
-0.053070068359375,
-0.0472412109375,
0.005748748779296875,
-0.025909423828125,
-0.0694580078125,
0.039337158203125,
0.024322509765625,
0.01398468017578125,
0.0149383544921875,
-0.040740966796875,
-0.06549072265625,
0.10357666015625,
0.022796630859375,
-0.017913818359375,
-0.030670166015625,
0.005100250244140625,
0.041412353515625,
-0.032379150390625,
0.01873779296875,
0.041168212890625,
0.019439697265625,
-0.00820159912109375,
-0.0538330078125,
0.01995849609375,
-0.01306915283203125,
0.020782470703125,
-0.0184478759765625,
-0.044097900390625,
0.078125,
0.002513885498046875,
-0.006542205810546875,
0.03045654296875,
0.06854248046875,
0.026031494140625,
0.002349853515625,
0.033416748046875,
0.049530029296875,
0.04010009765625,
-0.0005884170532226562,
0.07672119140625,
-0.0230712890625,
0.0633544921875,
0.08062744140625,
0.01366424560546875,
0.0811767578125,
0.04400634765625,
-0.01446533203125,
0.055084228515625,
0.03607177734375,
-0.01146697998046875,
0.06646728515625,
0.0033512115478515625,
0.0015153884887695312,
-0.00139617919921875,
0.01551055908203125,
-0.0144500732421875,
0.0224456787109375,
0.0166778564453125,
-0.053619384765625,
-0.01110076904296875,
0.017059326171875,
-0.0011014938354492188,
-0.0024890899658203125,
0.0003504753112792969,
0.044403076171875,
0.0237884521484375,
-0.029693603515625,
0.0291748046875,
0.0157928466796875,
0.07159423828125,
-0.029876708984375,
0.01763916015625,
-0.0148773193359375,
0.0284423828125,
0.0077667236328125,
-0.0399169921875,
0.03314208984375,
-0.007793426513671875,
-0.00420379638671875,
-0.0245361328125,
0.050079345703125,
-0.047515869140625,
-0.0496826171875,
0.0254669189453125,
0.04217529296875,
0.0089569091796875,
0.00283050537109375,
-0.092529296875,
-0.002887725830078125,
0.007335662841796875,
-0.03314208984375,
0.0232696533203125,
0.027496337890625,
0.0269012451171875,
0.0413818359375,
0.0270538330078125,
-0.0184173583984375,
0.0265350341796875,
-0.0023975372314453125,
0.059173583984375,
-0.0452880859375,
-0.039642333984375,
-0.08026123046875,
0.04205322265625,
-0.0185089111328125,
-0.0157470703125,
0.070556640625,
0.0391845703125,
0.059967041015625,
-0.0254974365234375,
0.04058837890625,
-0.01439666748046875,
0.0271759033203125,
-0.03875732421875,
0.0594482421875,
-0.037506103515625,
-0.0038623809814453125,
-0.021514892578125,
-0.0660400390625,
-0.0163421630859375,
0.0823974609375,
-0.0297698974609375,
0.00618743896484375,
0.08026123046875,
0.0634765625,
-0.01335906982421875,
-0.011016845703125,
0.016265869140625,
0.025390625,
0.01435089111328125,
0.03497314453125,
0.03314208984375,
-0.068115234375,
0.07049560546875,
-0.05194091796875,
-0.002376556396484375,
-0.013580322265625,
-0.051361083984375,
-0.0736083984375,
-0.05535888671875,
-0.02740478515625,
-0.02386474609375,
-0.01373291015625,
0.07379150390625,
0.036285400390625,
-0.05975341796875,
-0.009674072265625,
-0.01947021484375,
-0.00899505615234375,
-0.012969970703125,
-0.02362060546875,
0.0440673828125,
-0.0411376953125,
-0.0667724609375,
0.013916015625,
-0.007080078125,
0.00659942626953125,
-0.0123443603515625,
0.00827789306640625,
-0.0438232421875,
0.0163421630859375,
0.03985595703125,
-0.016632080078125,
-0.062164306640625,
-0.0241546630859375,
-0.00510406494140625,
-0.040985107421875,
-0.01119232177734375,
0.03204345703125,
-0.046966552734375,
0.01088714599609375,
0.031982421875,
0.04010009765625,
0.052764892578125,
-0.017578125,
0.0234832763671875,
-0.05938720703125,
0.02520751953125,
0.004657745361328125,
0.05584716796875,
0.026580810546875,
-0.0118560791015625,
0.035552978515625,
0.03350830078125,
-0.036956787109375,
-0.059173583984375,
-0.01485443115234375,
-0.072021484375,
-0.01413726806640625,
0.08642578125,
-0.0250091552734375,
-0.0230255126953125,
0.006893157958984375,
-0.0225372314453125,
0.03375244140625,
-0.0207061767578125,
0.04168701171875,
0.059234619140625,
-0.006008148193359375,
-0.025634765625,
-0.0271148681640625,
0.0249176025390625,
0.0399169921875,
-0.039398193359375,
-0.01242828369140625,
0.01073455810546875,
0.0283660888671875,
0.01479339599609375,
0.037322998046875,
0.0008878707885742188,
0.0004801750183105469,
0.01082611083984375,
-0.00457763671875,
-0.005550384521484375,
0.001117706298828125,
-0.0311431884765625,
0.01361846923828125,
-0.028228759765625,
-0.0305633544921875
]
] |
zarakiquemparte/zarablend-1.1-l2-7b | 2023-08-26T10:30:54.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"llama2",
"license:other",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text-generation | zarakiquemparte | null | null | zarakiquemparte/zarablend-1.1-l2-7b | 1 | 7,034 | transformers | 2023-08-26T02:51:59 | ---
license: other
tags:
- llama2
---
# Model Card: Zarablend 1.1 L2 7b
This model uses [Nous Hermes Llama2 7b](https://huggingface.co/NousResearch/Nous-Hermes-llama-2-7b) (66%) as a base, merged with [Airoboros L2 7B GPT4 2.0](https://huggingface.co/jondurbin/airoboros-l2-7b-gpt4-2.0) (34%); the result of that merge was then combined with the [LimaRP LLama2 7B Lora, version of 07/23/2023](https://huggingface.co/lemonilia/limarp-llama2).
The model merge (Hermes and Airoboros) was done with this [script](https://github.com/zarakiquemparte/zaraki-tools/blob/main/merge-cli.py)

The LoRA was applied to the merged model with this [script](https://github.com/zarakiquemparte/zaraki-tools/blob/main/apply-lora.py)
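The 66%/34% split above is the usual linear interpolation of model weights. A minimal sketch of the idea, with plain Python dicts standing in for the actual PyTorch state dicts (the real script linked above operates on tensors, so the names and details here are illustrative only):

```python
def merge_state_dicts(base, other, base_weight=0.66):
    """Linearly interpolate two "state dicts":
    base_weight * base + (1 - base_weight) * other."""
    if base.keys() != other.keys():
        raise ValueError("state dicts must share the same parameter names")
    return {
        name: base_weight * base[name] + (1.0 - base_weight) * other[name]
        for name in base
    }

# Toy example with scalar "parameters":
hermes = {"w": 1.0, "b": 0.0}
airoboros = {"w": 0.0, "b": 1.0}
merged = merge_state_dicts(hermes, airoboros, base_weight=0.66)
# merged["w"] ≈ 0.66, merged["b"] ≈ 0.34
```

With real models, the same loop runs over `model.state_dict()` tensors of identical shapes.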
Merge illustration:

## Usage:
Since this is a merge between Nous Hermes, Airoboros and LimaRP, the following instruction formats should work:
Alpaca 2:
```
### Instruction:
<prompt>
### Response:
<leave a newline blank for model to respond>
```
LimaRP instruction format:
```
<<SYSTEM>>
<character card and system prompt>
<<USER>>
<prompt>
<<AIBOT>>
<leave a newline blank for model to respond>
```
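For programmatic use, the Alpaca 2 template above can be assembled with a small helper. This is a sketch (the function name is illustrative); the resulting string is what gets passed to the model as the prompt:

```python
def build_alpaca2_prompt(instruction: str) -> str:
    """Wrap a user instruction in the Alpaca 2 template,
    leaving the response section open for the model to complete."""
    return (
        "### Instruction:\n"
        f"{instruction}\n\n"
        "### Response:\n"
    )

prompt = build_alpaca2_prompt("Summarize the plot of Hamlet in one sentence.")
```

The LimaRP format can be built the same way, substituting the `<<SYSTEM>>` / `<<USER>>` / `<<AIBOT>>` markers.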
## Bias, Risks, and Limitations
This model is not intended to supply factual information or advice in any form.
## Training Details
This model is merged and can be reproduced using the tools mentioned above. Please refer to all provided links for extra model-specific details. | 1,452 | [
[
-0.0298309326171875,
-0.035675048828125,
0.0183563232421875,
0.025970458984375,
-0.035400390625,
-0.02880859375,
0.024200439453125,
-0.0460205078125,
0.037872314453125,
0.0645751953125,
-0.06341552734375,
-0.0308990478515625,
-0.037567138671875,
-0.0146942138671875,
-0.031219482421875,
0.09393310546875,
0.001617431640625,
-0.0221710205078125,
0.0156097412109375,
-0.021453857421875,
-0.02313232421875,
-0.039794921875,
-0.055511474609375,
-0.039642333984375,
0.052947998046875,
0.02801513671875,
0.05572509765625,
0.034637451171875,
0.043304443359375,
0.027496337890625,
-0.02801513671875,
0.015899658203125,
-0.0223388671875,
-0.01329803466796875,
-0.003147125244140625,
-0.02587890625,
-0.0850830078125,
0.00423431396484375,
0.0325927734375,
0.037506103515625,
-0.034423828125,
0.012420654296875,
-0.006778717041015625,
0.035858154296875,
-0.03167724609375,
-0.00762939453125,
-0.02252197265625,
0.012420654296875,
0.007099151611328125,
-0.003887176513671875,
-0.009246826171875,
-0.0428466796875,
0.004077911376953125,
-0.053802490234375,
0.0023670196533203125,
-0.0184783935546875,
0.07366943359375,
0.019378662109375,
-0.035552978515625,
-0.0172576904296875,
-0.037811279296875,
0.046051025390625,
-0.068359375,
0.0017690658569335938,
0.0165863037109375,
0.0201873779296875,
-0.02593994140625,
-0.041595458984375,
-0.046142578125,
0.005619049072265625,
0.0101470947265625,
0.0012836456298828125,
-0.040740966796875,
-0.0118865966796875,
0.032867431640625,
0.02703857421875,
-0.0277862548828125,
0.019989013671875,
-0.069580078125,
-0.0027408599853515625,
0.046966552734375,
0.014739990234375,
0.0092620849609375,
-0.00982666015625,
-0.047119140625,
-0.0274200439453125,
-0.039520263671875,
-0.00653076171875,
0.043060302734375,
0.0218353271484375,
-0.059478759765625,
0.07183837890625,
-0.017852783203125,
0.05712890625,
0.01812744140625,
-0.0235443115234375,
0.04095458984375,
0.0006499290466308594,
-0.0391845703125,
-0.0009326934814453125,
0.05706787109375,
0.04071044921875,
-0.01166534423828125,
0.016571044921875,
-0.00438690185546875,
-0.0005850791931152344,
-0.0006365776062011719,
-0.06134033203125,
0.01380157470703125,
0.02972412109375,
-0.03515625,
-0.040313720703125,
0.005161285400390625,
-0.039642333984375,
-0.0211029052734375,
0.003314971923828125,
0.029205322265625,
-0.021270751953125,
-0.0215606689453125,
0.0237884521484375,
-0.0124664306640625,
0.04925537109375,
0.03887939453125,
-0.056915283203125,
0.0186767578125,
0.037017822265625,
0.04296875,
0.0234832763671875,
-0.01558685302734375,
-0.031402587890625,
0.0123138427734375,
-0.0232391357421875,
0.059112548828125,
0.0053863525390625,
-0.042755126953125,
-0.0053558349609375,
0.0020275115966796875,
-0.004978179931640625,
-0.037933349609375,
0.050018310546875,
-0.030792236328125,
0.029815673828125,
-0.0160369873046875,
-0.0169677734375,
-0.036346435546875,
0.02117919921875,
-0.062286376953125,
0.077880859375,
0.0308990478515625,
-0.06072998046875,
-0.0111846923828125,
-0.06219482421875,
-0.014251708984375,
-0.01824951171875,
0.00368499755859375,
-0.04290771484375,
-0.00612640380859375,
-0.00457763671875,
0.0234832763671875,
-0.0249786376953125,
-0.0206451416015625,
-0.0286865234375,
-0.0254974365234375,
0.0146026611328125,
-0.0009226799011230469,
0.07061767578125,
0.0198516845703125,
-0.012664794921875,
0.0019855499267578125,
-0.059600830078125,
0.0126495361328125,
0.0206756591796875,
-0.03472900390625,
0.004436492919921875,
-0.026397705078125,
0.014801025390625,
0.01476287841796875,
0.040283203125,
-0.0211944580078125,
0.05523681640625,
-0.01739501953125,
0.01470184326171875,
0.043426513671875,
-0.0041046142578125,
0.019561767578125,
-0.047576904296875,
0.032501220703125,
0.0016260147094726562,
0.0227508544921875,
0.002536773681640625,
-0.06219482421875,
-0.08074951171875,
-0.01493072509765625,
-0.0028553009033203125,
0.0406494140625,
-0.0263824462890625,
0.055755615234375,
-0.00588226318359375,
-0.059600830078125,
-0.0194244384765625,
-0.00010955333709716797,
0.0195465087890625,
0.036163330078125,
0.021270751953125,
-0.0273895263671875,
-0.0494384765625,
-0.0701904296875,
0.019287109375,
-0.02288818359375,
0.0056610107421875,
0.005031585693359375,
0.044586181640625,
-0.04473876953125,
0.043548583984375,
-0.037841796875,
-0.0218353271484375,
-0.03436279296875,
0.01690673828125,
0.0509033203125,
0.051055908203125,
0.0584716796875,
-0.031982421875,
-0.018829345703125,
0.005161285400390625,
-0.05523681640625,
-0.005611419677734375,
0.0198211669921875,
-0.01302337646484375,
0.018157958984375,
0.0030765533447265625,
-0.06597900390625,
0.055389404296875,
0.05194091796875,
-0.0364990234375,
0.0153961181640625,
-0.0282440185546875,
0.00981903076171875,
-0.10064697265625,
0.0257568359375,
-0.015960693359375,
0.0103759765625,
-0.0513916015625,
0.0182952880859375,
-0.00035381317138671875,
0.0007176399230957031,
-0.044464111328125,
0.06060791015625,
-0.0272216796875,
-0.01238250732421875,
-0.02642822265625,
-0.0109100341796875,
0.014739990234375,
0.0266876220703125,
-0.001087188720703125,
0.02191162109375,
0.0260772705078125,
-0.0426025390625,
0.046112060546875,
0.037384033203125,
-0.01406097412109375,
0.034210205078125,
-0.049957275390625,
0.0203399658203125,
0.0013523101806640625,
0.031951904296875,
-0.06427001953125,
-0.02880859375,
0.055023193359375,
-0.0180816650390625,
-0.003414154052734375,
-0.0179290771484375,
-0.0323486328125,
-0.0308990478515625,
-0.0265655517578125,
0.0271148681640625,
0.05804443359375,
-0.04632568359375,
0.06988525390625,
-0.01168060302734375,
0.0007777214050292969,
-0.057373046875,
-0.057373046875,
-0.024505615234375,
-0.0272369384765625,
-0.058837890625,
0.0202484130859375,
-0.02374267578125,
-0.011871337890625,
0.002368927001953125,
-0.004669189453125,
-0.0209503173828125,
-0.00833892822265625,
0.033905029296875,
0.05145263671875,
-0.0229644775390625,
-0.03436279296875,
0.007350921630859375,
-0.0021190643310546875,
-0.021820068359375,
0.01678466796875,
0.052154541015625,
0.007381439208984375,
-0.03497314453125,
-0.0489501953125,
0.0175628662109375,
0.06427001953125,
-0.01123046875,
0.07501220703125,
0.038848876953125,
-0.0253753662109375,
0.004428863525390625,
-0.066650390625,
0.00951385498046875,
-0.0312347412109375,
0.01088714599609375,
-0.0185546875,
-0.023681640625,
0.06549072265625,
0.0256500244140625,
0.00283050537109375,
0.04669189453125,
0.028045654296875,
0.004405975341796875,
0.057159423828125,
0.05926513671875,
-0.0170440673828125,
0.035614013671875,
-0.05731201171875,
0.01148223876953125,
-0.07781982421875,
-0.043487548828125,
-0.03570556640625,
-0.01507568359375,
-0.0316162109375,
-0.039642333984375,
0.010498046875,
0.0294189453125,
-0.0237884521484375,
0.04803466796875,
-0.02764892578125,
0.0220794677734375,
0.024261474609375,
0.0101470947265625,
0.0245819091796875,
0.02374267578125,
0.0118255615234375,
0.0287628173828125,
-0.043426513671875,
-0.0276641845703125,
0.08416748046875,
0.024627685546875,
0.067626953125,
0.02880859375,
0.06768798828125,
0.0016870498657226562,
0.0254974365234375,
-0.03955078125,
0.04669189453125,
0.00019741058349609375,
-0.03448486328125,
0.0012845993041992188,
-0.018829345703125,
-0.07122802734375,
0.0277862548828125,
-0.0099029541015625,
-0.048797607421875,
0.01904296875,
0.0009908676147460938,
-0.057159423828125,
0.0248870849609375,
-0.042694091796875,
0.044586181640625,
-0.01202392578125,
-0.01458740234375,
-0.012847900390625,
-0.032928466796875,
0.036834716796875,
0.00392913818359375,
0.00365447998046875,
-0.0157012939453125,
-0.008880615234375,
0.0665283203125,
-0.049896240234375,
0.058013916015625,
0.00492095947265625,
-0.03704833984375,
0.0251312255859375,
-0.017578125,
0.0372314453125,
-0.0199737548828125,
-0.0168304443359375,
0.012939453125,
-0.0103759765625,
-0.0208587646484375,
-0.03936767578125,
0.06182861328125,
-0.0770263671875,
-0.023590087890625,
-0.044097900390625,
-0.0279693603515625,
0.016632080078125,
0.0017042160034179688,
0.03204345703125,
0.028411865234375,
-0.005657196044921875,
-0.007747650146484375,
0.046356201171875,
-0.01087188720703125,
0.023590087890625,
0.062042236328125,
-0.02685546875,
-0.044342041015625,
0.033935546875,
-0.01270294189453125,
0.0174407958984375,
0.00848388671875,
-0.001819610595703125,
-0.0144500732421875,
-0.00484466552734375,
-0.0256195068359375,
0.046142578125,
-0.06292724609375,
-0.02288818359375,
-0.0240020751953125,
-0.04095458984375,
-0.035369873046875,
-0.006465911865234375,
-0.0201263427734375,
-0.04132080078125,
-0.032257080078125,
-0.004131317138671875,
0.044525146484375,
0.0662841796875,
-0.030426025390625,
0.053070068359375,
-0.055633544921875,
0.0300750732421875,
0.025299072265625,
0.005344390869140625,
-0.0009655952453613281,
-0.06292724609375,
0.0018863677978515625,
0.01091766357421875,
-0.030670166015625,
-0.0780029296875,
0.03759765625,
-0.00992584228515625,
0.04388427734375,
0.03961181640625,
-0.014617919921875,
0.058563232421875,
-0.005218505859375,
0.04345703125,
0.029083251953125,
-0.0504150390625,
0.05535888671875,
-0.035308837890625,
0.006519317626953125,
0.003894805908203125,
0.0246429443359375,
-0.038848876953125,
0.003387451171875,
-0.06524658203125,
-0.04693603515625,
0.06866455078125,
0.0224151611328125,
0.0012416839599609375,
0.01202392578125,
0.01995849609375,
-0.016510009765625,
0.01087188720703125,
-0.061492919921875,
-0.0325927734375,
-0.0024890899658203125,
-0.005992889404296875,
-0.0005259513854980469,
-0.0173187255859375,
-0.032196044921875,
-0.01122283935546875,
0.057342529296875,
0.00792694091796875,
0.02197265625,
0.007152557373046875,
0.02301025390625,
-0.025482177734375,
0.012847900390625,
0.05572509765625,
0.016632080078125,
-0.0426025390625,
-0.0287322998046875,
0.00826263427734375,
-0.01558685302734375,
0.00283050537109375,
0.0229339599609375,
0.00730133056640625,
-0.006725311279296875,
0.04180908203125,
0.061920166015625,
0.00933074951171875,
-0.032470703125,
0.031585693359375,
-0.002208709716796875,
-0.022186279296875,
-0.0165557861328125,
0.022613525390625,
0.0077056884765625,
0.02911376953125,
0.014251708984375,
0.0222320556640625,
-0.00894927978515625,
-0.0562744140625,
-0.0255889892578125,
0.033111572265625,
0.0013284683227539062,
-0.01422882080078125,
0.03717041015625,
0.0187530517578125,
-0.01546478271484375,
0.050537109375,
0.005870819091796875,
-0.031402587890625,
0.067626953125,
0.036773681640625,
0.046875,
-0.036407470703125,
-0.00496673583984375,
0.04132080078125,
0.01200103759765625,
-0.0206451416015625,
0.04388427734375,
0.0019969940185546875,
-0.0523681640625,
-0.008392333984375,
-0.0318603515625,
-0.03143310546875,
0.035247802734375,
-0.053985595703125,
0.033172607421875,
-0.040802001953125,
-0.01751708984375,
-0.013885498046875,
0.01088714599609375,
-0.039154052734375,
0.012176513671875,
0.015869140625,
0.07427978515625,
-0.0789794921875,
0.06756591796875,
0.047821044921875,
-0.048065185546875,
-0.080078125,
-0.030731201171875,
-0.003925323486328125,
-0.07391357421875,
0.059722900390625,
0.0086822509765625,
-0.00911712646484375,
-0.0265960693359375,
-0.040069580078125,
-0.0770263671875,
0.0997314453125,
0.028076171875,
-0.02154541015625,
0.0006361007690429688,
0.00017452239990234375,
0.021087646484375,
-0.0338134765625,
0.0325927734375,
0.023834228515625,
0.0290069580078125,
0.042388916015625,
-0.09173583984375,
-0.008148193359375,
-0.030426025390625,
-0.00830841064453125,
-0.0016326904296875,
-0.061614990234375,
0.09844970703125,
-0.031097412109375,
0.004856109619140625,
0.071533203125,
0.04058837890625,
0.043426513671875,
-0.0032196044921875,
0.0211944580078125,
0.073486328125,
0.047515869140625,
0.0010890960693359375,
0.060943603515625,
-0.01385498046875,
0.03533935546875,
0.0748291015625,
-0.041351318359375,
0.07904052734375,
0.03021240234375,
-0.0017261505126953125,
0.0692138671875,
0.039764404296875,
-0.0087432861328125,
0.0308837890625,
-0.0025806427001953125,
-0.02288818359375,
0.001270294189453125,
0.00507354736328125,
-0.0789794921875,
0.0345458984375,
0.01641845703125,
-0.032470703125,
-0.02581787109375,
-0.0267333984375,
-0.004154205322265625,
-0.0286865234375,
-0.019775390625,
0.03662109375,
-0.01503753662109375,
-0.0513916015625,
0.050750732421875,
0.01885986328125,
0.052642822265625,
-0.0753173828125,
-0.02703857421875,
-0.045562744140625,
0.0269927978515625,
-0.0333251953125,
-0.03802490234375,
0.0011739730834960938,
0.0003695487976074219,
-0.0164031982421875,
0.012115478515625,
0.037200927734375,
-0.0220184326171875,
-0.05108642578125,
0.03167724609375,
0.027496337890625,
0.0206298828125,
0.0180206298828125,
-0.050994873046875,
0.039947509765625,
0.00482940673828125,
-0.009735107421875,
0.024993896484375,
0.01312255859375,
0.00980377197265625,
0.0673828125,
0.0487060546875,
0.0003712177276611328,
0.0007557868957519531,
-0.0023937225341796875,
0.0765380859375,
-0.03607177734375,
-0.03472900390625,
-0.0413818359375,
0.04095458984375,
-0.0001341104507446289,
-0.0203094482421875,
0.04522705078125,
0.0546875,
0.031219482421875,
-0.01247406005859375,
0.053863525390625,
-0.01300811767578125,
0.03375244140625,
-0.043182373046875,
0.052947998046875,
-0.045623779296875,
0.022125244140625,
-0.01323699951171875,
-0.07708740234375,
0.00591278076171875,
0.07696533203125,
0.0022792816162109375,
0.00762939453125,
0.035400390625,
0.0703125,
-0.01824951171875,
-0.0194244384765625,
0.0094451904296875,
0.01079559326171875,
0.006481170654296875,
0.051849365234375,
0.06927490234375,
-0.07293701171875,
0.0291900634765625,
-0.0220947265625,
-0.015716552734375,
-0.02618408203125,
-0.07147216796875,
-0.06988525390625,
-0.0092010498046875,
-0.0265960693359375,
-0.03375244140625,
-0.006740570068359375,
0.07293701171875,
0.056365966796875,
-0.0364990234375,
-0.045684814453125,
0.018890380859375,
0.002559661865234375,
0.002895355224609375,
-0.00922393798828125,
0.017242431640625,
0.0248260498046875,
-0.048919677734375,
0.025177001953125,
0.0005092620849609375,
0.0484619140625,
-0.004413604736328125,
-0.0232391357421875,
-0.0046234130859375,
0.00931549072265625,
0.025299072265625,
0.043304443359375,
-0.07122802734375,
0.005279541015625,
-0.022369384765625,
-0.00966644287109375,
0.0019779205322265625,
0.0260467529296875,
-0.0498046875,
-0.0152435302734375,
0.0224456787109375,
-0.005084991455078125,
0.0260772705078125,
-0.0017414093017578125,
0.0179901123046875,
-0.034271240234375,
0.035552978515625,
-0.0224609375,
0.044921875,
0.024505615234375,
-0.01371002197265625,
0.049102783203125,
0.005603790283203125,
-0.018829345703125,
-0.05694580078125,
0.0008783340454101562,
-0.10931396484375,
-0.0020351409912109375,
0.0738525390625,
0.00531768798828125,
-0.018035888671875,
0.033935546875,
-0.045623779296875,
0.00946807861328125,
-0.02734375,
0.0341796875,
0.0283050537109375,
-0.02716064453125,
0.00015056133270263672,
-0.0074920654296875,
0.00494384765625,
0.007190704345703125,
-0.06439208984375,
-0.01490020751953125,
0.023681640625,
0.0428466796875,
0.0345458984375,
0.051513671875,
0.008575439453125,
0.03143310546875,
-0.00815582275390625,
0.032073974609375,
-0.00911712646484375,
-0.0004372596740722656,
-0.027862548828125,
-0.01177978515625,
-0.0013151168823242188,
-0.01009368896484375
]
] |
Helsinki-NLP/opus-tatoeba-fi-en | 2023-10-10T14:15:32.000Z | [
"transformers",
"pytorch",
"tf",
"safetensors",
"marian",
"text2text-generation",
"translation",
"fi",
"en",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | translation | Helsinki-NLP | null | null | Helsinki-NLP/opus-tatoeba-fi-en | 2 | 7,028 | transformers | 2022-03-02T23:29:04 | ---
language:
- fi
- en
tags:
- translation
license: apache-2.0
---
### fi-en
* source group: Finnish
* target group: English
* OPUS readme: [fin-eng](https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/fin-eng/README.md)
* model: transformer-align
* source language(s): fin
* target language(s): eng
* pre-processing: normalization + SentencePiece (spm32k,spm32k)
* download original weights: [opusTCv20210807+bt-2021-08-25.zip](https://object.pouta.csc.fi/Tatoeba-MT-models/fin-eng/opusTCv20210807+bt-2021-08-25.zip)
* test set translations: [opusTCv20210807+bt-2021-08-25.test.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/fin-eng/opusTCv20210807+bt-2021-08-25.test.txt)
* test set scores: [opusTCv20210807+bt-2021-08-25.eval.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/fin-eng/opusTCv20210807+bt-2021-08-25.eval.txt)
## Benchmarks
| testset | BLEU | chr-F | #sent | #words | BP |
|---------|-------|-------|-------|--------|----|
| newsdev2015-enfi.fin-eng | 27.1 | 0.550 | 1500 | 32104 | 0.988 |
| newstest2015-enfi.fin-eng | 28.5 | 0.560 | 1370 | 27356 | 0.980 |
| newstest2016-enfi.fin-eng | 31.7 | 0.586 | 3000 | 63043 | 1.000 |
| newstest2017-enfi.fin-eng | 34.6 | 0.610 | 3002 | 61936 | 0.988 |
| newstest2018-enfi.fin-eng | 25.4 | 0.530 | 3000 | 62325 | 0.981 |
| newstest2019-fien.fin-eng | 30.6 | 0.577 | 1996 | 36227 | 0.994 |
| newstestB2016-enfi.fin-eng | 25.8 | 0.538 | 3000 | 63043 | 0.987 |
| newstestB2017-enfi.fin-eng | 29.6 | 0.572 | 3002 | 61936 | 0.999 |
| newstestB2017-fien.fin-eng | 29.6 | 0.572 | 3002 | 61936 | 0.999 |
| Tatoeba-test-v2021-08-07.fin-eng | 54.1 | 0.700 | 10000 | 75212 | 0.988 |
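The BP column in the table above is BLEU's brevity penalty: 1 when the candidate is at least as long as the reference, and exp(1 − r/c) otherwise (c = candidate length, r = reference length). A quick sketch of the computation:

```python
import math

def brevity_penalty(candidate_len: int, reference_len: int) -> float:
    """BLEU brevity penalty: penalize candidates shorter than the reference."""
    if candidate_len >= reference_len:
        return 1.0
    return math.exp(1.0 - reference_len / candidate_len)

# A candidate 1% shorter than its reference is penalized only slightly:
bp = brevity_penalty(990, 1000)  # ≈ 0.99
```

This is why BP values near 1.000 in the table (e.g. newstest2016) indicate translations of roughly reference length, while lower values (e.g. 0.980) reflect slightly shorter output.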
### System Info:
- hf_name: fi-en
- source_languages: fin
- target_languages: eng
- opus_readme_url: https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/fin-eng/README.md
- original_repo: Tatoeba-Challenge
- tags: ['translation']
- languages: ['fi', 'en']
- src_constituents: ('Finnish', {'fin'})
- tgt_constituents: ('English', {'eng'})
- src_multilingual: False
- tgt_multilingual: False
- long_pair: fin-eng
- prepro: normalization + SentencePiece (spm32k,spm32k)
- url_model: https://object.pouta.csc.fi/Tatoeba-MT-models/fin-eng/opusTCv20210807+bt-2021-08-25.zip
- url_test_set: https://object.pouta.csc.fi/Tatoeba-MT-models/fin-eng/opusTCv20210807+bt-2021-08-25.test.txt
- src_alpha3: fin
- tgt_alpha3: eng
- chrF2_score: 0.7
- bleu: 54.1
- src_name: Finnish
- tgt_name: English
- train_date: 2021-08-25 00:00:00
- src_alpha2: fi
- tgt_alpha2: en
- prefer_old: False
- short_pair: fi-en
- helsinki_git_sha: 2ef219d5b67f0afb0c6b732cd07001d84181f002
- transformers_git_sha: 12b4d66a80419db30a15e7b9d4208ceb9887c03b
- port_machine: LM0-400-22516.local
- port_time: 2021-11-04-21:36 | 2,846 | [
[
-0.033782958984375,
-0.055755615234375,
0.02325439453125,
0.026611328125,
-0.033416748046875,
-0.01093292236328125,
-0.0184478759765625,
-0.033599853515625,
0.033355712890625,
0.0224151611328125,
-0.045745849609375,
-0.05438232421875,
-0.043853759765625,
0.0272064208984375,
0.00283050537109375,
0.06817626953125,
-0.01110076904296875,
0.0010366439819335938,
0.0302276611328125,
-0.0268707275390625,
-0.042236328125,
-0.01282501220703125,
-0.0556640625,
-0.0193634033203125,
0.03411865234375,
0.0272979736328125,
0.03143310546875,
0.033599853515625,
0.04400634765625,
0.0220947265625,
-0.0208587646484375,
0.0159454345703125,
-0.0099334716796875,
-0.01009368896484375,
0.0015153884887695312,
-0.0338134765625,
-0.04473876953125,
-0.01372528076171875,
0.061126708984375,
0.042816162109375,
0.012603759765625,
0.037628173828125,
-0.002529144287109375,
0.0560302734375,
-0.0187530517578125,
0.016845703125,
-0.03369140625,
-0.01262664794921875,
-0.02911376953125,
-0.01739501953125,
-0.034881591796875,
-0.02490234375,
0.002948760986328125,
-0.040313720703125,
0.002742767333984375,
0.01238250732421875,
0.11956787109375,
0.0017213821411132812,
-0.021697998046875,
-0.01116180419921875,
-0.0211639404296875,
0.050201416015625,
-0.0653076171875,
0.03143310546875,
0.0384521484375,
-0.0135345458984375,
-0.007320404052734375,
-0.031951904296875,
-0.033203125,
0.0118255615234375,
-0.031005859375,
0.0227203369140625,
-0.010467529296875,
-0.007396697998046875,
0.0045928955078125,
0.038482666015625,
-0.046722412109375,
0.00479888916015625,
-0.030364990234375,
-0.01580810546875,
0.045196533203125,
0.00878143310546875,
0.016326904296875,
-0.03985595703125,
-0.03179931640625,
-0.034881591796875,
-0.034454345703125,
0.0222015380859375,
0.042236328125,
0.0394287109375,
-0.040496826171875,
0.052764892578125,
-0.00923919677734375,
0.043212890625,
0.005405426025390625,
-0.0032215118408203125,
0.05023193359375,
-0.054931640625,
-0.020904541015625,
-0.007289886474609375,
0.09033203125,
0.022247314453125,
-0.002655029296875,
0.0118865966796875,
-0.02374267578125,
-0.0210113525390625,
-0.0157318115234375,
-0.05584716796875,
0.0125732421875,
0.0259857177734375,
-0.0224761962890625,
-0.0212554931640625,
0.00907135009765625,
-0.06671142578125,
0.01482391357421875,
0.01166534423828125,
0.04608154296875,
-0.05303955078125,
-0.0261688232421875,
0.02655029296875,
-0.007778167724609375,
0.021453857421875,
-0.007480621337890625,
-0.040191650390625,
0.01187896728515625,
0.02490234375,
0.065185546875,
-0.000018537044525146484,
-0.02325439453125,
-0.01163482666015625,
-0.0024433135986328125,
-0.0132598876953125,
0.0509033203125,
-0.01284027099609375,
-0.0287322998046875,
-0.01505279541015625,
0.0227508544921875,
-0.0251312255859375,
-0.00885772705078125,
0.05389404296875,
-0.020263671875,
0.04608154296875,
-0.026824951171875,
-0.03948974609375,
-0.02435302734375,
0.023040771484375,
-0.0548095703125,
0.0889892578125,
0.023834228515625,
-0.06982421875,
0.0268402099609375,
-0.069091796875,
-0.008544921875,
0.00269317626953125,
0.01548004150390625,
-0.052520751953125,
-0.010467529296875,
0.0173797607421875,
0.031341552734375,
-0.0306396484375,
0.0367431640625,
-0.004322052001953125,
-0.0185546875,
0.00705718994140625,
-0.02435302734375,
0.0911865234375,
0.01458740234375,
-0.037384033203125,
0.006626129150390625,
-0.05377197265625,
0.00510406494140625,
0.020782470703125,
-0.0270233154296875,
-0.01861572265625,
-0.0236358642578125,
0.0201873779296875,
0.0037021636962890625,
0.0191650390625,
-0.051300048828125,
0.0221405029296875,
-0.051025390625,
0.020782470703125,
0.06304931640625,
0.01386260986328125,
0.01500701904296875,
-0.03436279296875,
0.0272369384765625,
0.015777587890625,
0.00647735595703125,
0.008392333984375,
-0.04052734375,
-0.05291748046875,
-0.026611328125,
0.041412353515625,
0.042694091796875,
-0.0518798828125,
0.058746337890625,
-0.046234130859375,
-0.05816650390625,
-0.0511474609375,
-0.01377105712890625,
0.039520263671875,
0.0296173095703125,
0.03533935546875,
-0.0131072998046875,
-0.0343017578125,
-0.0789794921875,
-0.0195770263671875,
-0.0268707275390625,
0.00539398193359375,
0.0159454345703125,
0.06439208984375,
0.0006823539733886719,
0.0484619140625,
-0.0355224609375,
-0.035186767578125,
-0.0108642578125,
0.0123748779296875,
0.04046630859375,
0.055389404296875,
0.04949951171875,
-0.059234619140625,
-0.041717529296875,
0.0081329345703125,
-0.0423583984375,
-0.006275177001953125,
-0.00966644287109375,
-0.0222320556640625,
0.0308380126953125,
-0.0012111663818359375,
-0.04046630859375,
0.0186920166015625,
0.04241943359375,
-0.0762939453125,
0.02996826171875,
-0.01351165771484375,
0.0273284912109375,
-0.10003662109375,
0.021209716796875,
-0.002140045166015625,
-0.008514404296875,
-0.025177001953125,
0.004962921142578125,
0.0028171539306640625,
0.0183868408203125,
-0.02996826171875,
0.060211181640625,
-0.04461669921875,
0.005859375,
0.027618408203125,
0.00986480712890625,
-0.0041046142578125,
0.0595703125,
-0.012451171875,
0.0677490234375,
0.0313720703125,
-0.02947998046875,
0.00792694091796875,
0.03265380859375,
-0.029083251953125,
0.025665283203125,
-0.04730224609375,
-0.00955963134765625,
0.017730712890625,
-0.0008902549743652344,
-0.06878662109375,
-0.015350341796875,
0.02313232421875,
-0.055694580078125,
0.02166748046875,
0.0001430511474609375,
-0.038055419921875,
-0.02227783203125,
-0.035980224609375,
0.0291748046875,
0.02667236328125,
-0.017333984375,
0.055084228515625,
0.015167236328125,
-0.005950927734375,
-0.049163818359375,
-0.062225341796875,
-0.0085601806640625,
-0.015899658203125,
-0.0531005859375,
0.031005859375,
-0.005977630615234375,
0.004611968994140625,
0.0198822021484375,
-0.0034332275390625,
-0.01024627685546875,
0.0120391845703125,
0.002475738525390625,
0.0292816162109375,
-0.0244598388671875,
0.0015544891357421875,
0.0016307830810546875,
-0.0015697479248046875,
-0.01123809814453125,
-0.0153656005859375,
0.054931640625,
-0.03741455078125,
-0.0099945068359375,
-0.04559326171875,
0.00809478759765625,
0.040863037109375,
-0.02911376953125,
0.07940673828125,
0.04046630859375,
-0.024871826171875,
0.02545166015625,
-0.04913330078125,
0.00568389892578125,
-0.0323486328125,
0.0230560302734375,
-0.04388427734375,
-0.058074951171875,
0.0662841796875,
0.024322509765625,
0.023681640625,
0.08172607421875,
0.04095458984375,
0.005435943603515625,
0.03778076171875,
0.0279693603515625,
0.013580322265625,
0.03875732421875,
-0.04736328125,
-0.00882720947265625,
-0.058074951171875,
-0.0323486328125,
-0.05194091796875,
-0.021881103515625,
-0.06610107421875,
-0.00838470458984375,
0.0193939208984375,
-0.0054473876953125,
-0.0204010009765625,
0.05804443359375,
-0.03997802734375,
0.020477294921875,
0.03961181640625,
0.0179901123046875,
0.0141754150390625,
-0.00925445556640625,
-0.03179931640625,
-0.0048065185546875,
-0.03863525390625,
-0.033203125,
0.08770751953125,
0.0263671875,
0.00958251953125,
0.0195465087890625,
0.05517578125,
0.0136871337890625,
0.015167236328125,
-0.042083740234375,
0.037506103515625,
-0.008697509765625,
-0.06427001953125,
-0.035980224609375,
-0.03411865234375,
-0.074462890625,
0.0142974853515625,
-0.023681640625,
-0.05291748046875,
0.0193328857421875,
-0.00809478759765625,
-0.010406494140625,
0.04803466796875,
-0.05694580078125,
0.06939697265625,
0.0034847259521484375,
-0.028228759765625,
0.0060882568359375,
-0.04541015625,
0.021209716796875,
-0.0145416259765625,
0.022735595703125,
-0.01058197021484375,
-0.009857177734375,
0.0762939453125,
-0.0318603515625,
0.04638671875,
-0.01171875,
-0.00945281982421875,
0.02288818359375,
-0.00026154518127441406,
0.0413818359375,
-0.00604248046875,
-0.01947021484375,
0.019287109375,
0.01087188720703125,
-0.04815673828125,
-0.00984954833984375,
0.039276123046875,
-0.0606689453125,
-0.03143310546875,
-0.0401611328125,
-0.04168701171875,
-0.0025386810302734375,
0.036956787109375,
0.0350341796875,
0.039154052734375,
-0.000946044921875,
0.041717529296875,
0.0418701171875,
-0.0213775634765625,
0.04345703125,
0.03204345703125,
0.0019664764404296875,
-0.051513671875,
0.049102783203125,
0.01824951171875,
0.019195556640625,
0.036102294921875,
0.00957489013671875,
-0.02203369140625,
-0.04815673828125,
-0.03070068359375,
0.03900146484375,
-0.026519775390625,
-0.03143310546875,
-0.046112060546875,
-0.0018911361694335938,
-0.027862548828125,
0.0013065338134765625,
-0.035980224609375,
-0.034210205078125,
-0.0061798095703125,
-0.029449462890625,
0.04254150390625,
0.032928466796875,
-0.0029239654541015625,
0.0217437744140625,
-0.0577392578125,
0.0116424560546875,
-0.01349639892578125,
0.04437255859375,
-0.02276611328125,
-0.05780029296875,
-0.021514892578125,
0.00254058837890625,
-0.01210784912109375,
-0.07537841796875,
0.0423583984375,
-0.0010938644409179688,
0.0255889892578125,
0.0031375885009765625,
0.008148193359375,
0.057464599609375,
-0.015716552734375,
0.07537841796875,
0.004669189453125,
-0.0694580078125,
0.041748046875,
-0.02685546875,
0.032501220703125,
0.051849365234375,
0.018707275390625,
-0.030792236328125,
-0.05975341796875,
-0.07135009765625,
-0.0723876953125,
0.0631103515625,
0.041748046875,
-0.00604248046875,
-0.007404327392578125,
0.01511383056640625,
0.00225067138671875,
-0.01145172119140625,
-0.07537841796875,
-0.043853759765625,
0.0063629150390625,
-0.035400390625,
0.006748199462890625,
-0.03179931640625,
-0.0078887939453125,
-0.0194549560546875,
0.07781982421875,
0.00780487060546875,
0.0231475830078125,
0.038299560546875,
-0.00010389089584350586,
-0.0006809234619140625,
0.0252685546875,
0.05462646484375,
0.0276947021484375,
-0.030609130859375,
-0.00920867919921875,
0.0275421142578125,
-0.052581787109375,
0.00936126708984375,
0.00463104248046875,
-0.03094482421875,
0.02276611328125,
0.039520263671875,
0.06561279296875,
0.0115203857421875,
-0.034515380859375,
0.0394287109375,
-0.01116180419921875,
-0.041046142578125,
-0.027587890625,
-0.022918701171875,
0.0067596435546875,
0.01018524169921875,
0.025238037109375,
0.0043182373046875,
-0.006870269775390625,
-0.019500732421875,
0.005359649658203125,
0.013671875,
-0.0251007080078125,
-0.0285186767578125,
0.0467529296875,
0.0099639892578125,
-0.0193634033203125,
0.03216552734375,
-0.01494598388671875,
-0.026763916015625,
0.04962158203125,
0.0223236083984375,
0.0823974609375,
-0.00972747802734375,
-0.006748199462890625,
0.061767578125,
0.039703369140625,
-0.0021038055419921875,
0.037506103515625,
0.0255126953125,
-0.047210693359375,
-0.0225830078125,
-0.057647705078125,
0.01021575927734375,
0.01221466064453125,
-0.05889892578125,
0.0301055908203125,
-0.00896453857421875,
-0.03265380859375,
-0.0018548965454101562,
0.0294036865234375,
-0.05224609375,
0.01070404052734375,
-0.0176849365234375,
0.072998046875,
-0.0728759765625,
0.055419921875,
0.057525634765625,
-0.055938720703125,
-0.08062744140625,
-0.01739501953125,
-0.023193359375,
-0.044036865234375,
0.044281005859375,
-0.001270294189453125,
0.002872467041015625,
0.0032196044921875,
-0.026641845703125,
-0.06817626953125,
0.09326171875,
0.0256195068359375,
-0.0237274169921875,
-0.00794219970703125,
-0.00013530254364013672,
0.039520263671875,
-0.00357818603515625,
0.02752685546875,
0.03533935546875,
0.0648193359375,
-0.0156097412109375,
-0.08447265625,
0.0165252685546875,
-0.040191650390625,
-0.00821685791015625,
0.0304412841796875,
-0.07080078125,
0.0631103515625,
0.00872039794921875,
-0.02056884765625,
-0.00678253173828125,
0.045440673828125,
0.0283050537109375,
0.00830078125,
0.04046630859375,
0.053863525390625,
0.0430908203125,
-0.039886474609375,
0.07037353515625,
-0.026519775390625,
0.0428466796875,
0.056610107421875,
0.023468017578125,
0.060943603515625,
0.045196533203125,
-0.0254364013671875,
0.04638671875,
0.057708740234375,
-0.01203155517578125,
0.024017333984375,
-0.01230621337890625,
-0.00193023681640625,
-0.01548004150390625,
-0.01078033447265625,
-0.031707763671875,
0.03240966796875,
0.00965118408203125,
-0.01220703125,
0.002124786376953125,
-0.0160980224609375,
0.025909423828125,
0.005832672119140625,
-0.01372528076171875,
0.048828125,
-0.00662994384765625,
-0.051422119140625,
0.0517578125,
0.0028209686279296875,
0.059783935546875,
-0.048553466796875,
0.00881195068359375,
-0.025482177734375,
0.00951385498046875,
-0.0106353759765625,
-0.07037353515625,
0.028350830078125,
0.0201416015625,
-0.01248931884765625,
-0.0220794677734375,
0.0175628662109375,
-0.03643798828125,
-0.05291748046875,
0.0300140380859375,
0.033660888671875,
0.01485443115234375,
0.02032470703125,
-0.050933837890625,
-0.00925445556640625,
0.018280029296875,
-0.0516357421875,
-0.002666473388671875,
0.04931640625,
0.005367279052734375,
0.04754638671875,
0.033935546875,
0.013397216796875,
0.0036067962646484375,
-0.01248931884765625,
0.0528564453125,
-0.060211181640625,
-0.0333251953125,
-0.07061767578125,
0.043731689453125,
-0.00504302978515625,
-0.049957275390625,
0.041473388671875,
0.0579833984375,
0.0697021484375,
-0.005390167236328125,
0.0272369384765625,
-0.0210113525390625,
0.02978515625,
-0.04791259765625,
0.054443359375,
-0.06610107421875,
0.00772857666015625,
-0.014984130859375,
-0.05438232421875,
-0.02197265625,
0.0211639404296875,
-0.0150299072265625,
-0.00569915771484375,
0.07525634765625,
0.045562744140625,
0.0143280029296875,
-0.01537322998046875,
-0.0014905929565429688,
0.02154541015625,
0.01971435546875,
0.061004638671875,
0.012908935546875,
-0.07330322265625,
0.0518798828125,
-0.02801513671875,
0.0111541748046875,
-0.004482269287109375,
-0.053192138671875,
-0.0611572265625,
-0.054107666015625,
-0.0203857421875,
-0.03216552734375,
-0.01441192626953125,
0.08135986328125,
0.0141754150390625,
-0.07049560546875,
-0.0259857177734375,
-0.0005903244018554688,
0.01385498046875,
-0.0244903564453125,
-0.020050048828125,
0.06134033203125,
-0.00551605224609375,
-0.072021484375,
0.003528594970703125,
-0.004901885986328125,
0.0159454345703125,
0.01043701171875,
0.001285552978515625,
-0.051910400390625,
-0.0006494522094726562,
0.016876220703125,
0.00730133056640625,
-0.05816650390625,
-0.02197265625,
0.017669677734375,
-0.0273590087890625,
0.0147705078125,
-0.0004029273986816406,
-0.023956298828125,
0.016326904296875,
0.058349609375,
0.031951904296875,
0.041229248046875,
-0.00402069091796875,
0.0281829833984375,
-0.05438232421875,
0.028778076171875,
0.011627197265625,
0.038482666015625,
0.0161590576171875,
-0.0222625732421875,
0.06268310546875,
0.028228759765625,
-0.0191650390625,
-0.07501220703125,
-0.0182952880859375,
-0.08917236328125,
-0.005344390869140625,
0.07196044921875,
-0.01554107666015625,
-0.033660888671875,
0.01538848876953125,
-0.015045166015625,
0.030181884765625,
-0.036346435546875,
0.038482666015625,
0.074462890625,
0.02154541015625,
0.0095367431640625,
-0.031982421875,
0.014862060546875,
0.037445068359375,
-0.0528564453125,
-0.0065765380859375,
0.0111541748046875,
0.02783203125,
0.0253143310546875,
0.0537109375,
-0.0224609375,
0.0098419189453125,
-0.0016326904296875,
0.0248260498046875,
-0.021240234375,
0.00543975830078125,
-0.0163726806640625,
0.0049285888671875,
-0.0119781494140625,
-0.0208587646484375
]
] |
timm/convnextv2_huge.fcmae | 2023-03-31T23:13:15.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"arxiv:2301.00808",
"license:cc-by-nc-4.0",
"region:us"
] | image-classification | timm | null | null | timm/convnextv2_huge.fcmae | 0 | 7,028 | timm | 2023-01-05T01:40:22 | ---
tags:
- image-classification
- timm
library_tag: timm
license: cc-by-nc-4.0
---
# Model card for convnextv2_huge.fcmae
A ConvNeXt-V2 self-supervised feature representation model, pretrained with a fully convolutional masked autoencoder framework (FCMAE). This model has no pretrained head and is only useful for fine-tuning or feature extraction.
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 657.5
- GMACs: 115.0
- Activations (M): 79.1
- Image size: 224 x 224
- **Papers:**
- ConvNeXt V2: Co-designing and Scaling ConvNets with Masked Autoencoders: https://arxiv.org/abs/2301.00808
- **Original:** https://github.com/facebookresearch/ConvNeXt-V2
- **Pretrain Dataset:** ImageNet-1k
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch  # needed for torch.topk below
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('convnextv2_huge.fcmae', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'convnextv2_huge.fcmae',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g.:
# torch.Size([1, 352, 56, 56])
# torch.Size([1, 704, 28, 28])
# torch.Size([1, 1408, 14, 14])
# torch.Size([1, 2816, 7, 7])
print(o.shape)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'convnextv2_huge.fcmae',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 2816, 7, 7) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
All timing numbers from eager model PyTorch 1.13 on RTX 3090 w/ AMP.
| model |top1 |top5 |img_size|param_count|gmacs |macts |samples_per_sec|batch_size|
|------------------------------------------------------------------------------------------------------------------------------|------|------|--------|-----------|------|------|---------------|----------|
| [convnextv2_huge.fcmae_ft_in22k_in1k_512](https://huggingface.co/timm/convnextv2_huge.fcmae_ft_in22k_in1k_512) |88.848|98.742|512 |660.29 |600.81|413.07|28.58 |48 |
| [convnextv2_huge.fcmae_ft_in22k_in1k_384](https://huggingface.co/timm/convnextv2_huge.fcmae_ft_in22k_in1k_384) |88.668|98.738|384 |660.29 |337.96|232.35|50.56 |64 |
| [convnext_xxlarge.clip_laion2b_soup_ft_in1k](https://huggingface.co/timm/convnext_xxlarge.clip_laion2b_soup_ft_in1k) |88.612|98.704|256 |846.47 |198.09|124.45|122.45 |256 |
| [convnext_large_mlp.clip_laion2b_soup_ft_in12k_in1k_384](https://huggingface.co/timm/convnext_large_mlp.clip_laion2b_soup_ft_in12k_in1k_384) |88.312|98.578|384 |200.13 |101.11|126.74|196.84 |256 |
| [convnextv2_large.fcmae_ft_in22k_in1k_384](https://huggingface.co/timm/convnextv2_large.fcmae_ft_in22k_in1k_384) |88.196|98.532|384 |197.96 |101.1 |126.74|128.94 |128 |
| [convnext_large_mlp.clip_laion2b_soup_ft_in12k_in1k_320](https://huggingface.co/timm/convnext_large_mlp.clip_laion2b_soup_ft_in12k_in1k_320) |87.968|98.47 |320 |200.13 |70.21 |88.02 |283.42 |256 |
| [convnext_xlarge.fb_in22k_ft_in1k_384](https://huggingface.co/timm/convnext_xlarge.fb_in22k_ft_in1k_384) |87.75 |98.556|384 |350.2 |179.2 |168.99|124.85 |192 |
| [convnextv2_base.fcmae_ft_in22k_in1k_384](https://huggingface.co/timm/convnextv2_base.fcmae_ft_in22k_in1k_384) |87.646|98.422|384 |88.72 |45.21 |84.49 |209.51 |256 |
| [convnext_large.fb_in22k_ft_in1k_384](https://huggingface.co/timm/convnext_large.fb_in22k_ft_in1k_384) |87.476|98.382|384 |197.77 |101.1 |126.74|194.66 |256 |
| [convnext_large_mlp.clip_laion2b_augreg_ft_in1k](https://huggingface.co/timm/convnext_large_mlp.clip_laion2b_augreg_ft_in1k) |87.344|98.218|256 |200.13 |44.94 |56.33 |438.08 |256 |
| [convnextv2_large.fcmae_ft_in22k_in1k](https://huggingface.co/timm/convnextv2_large.fcmae_ft_in22k_in1k) |87.26 |98.248|224 |197.96 |34.4 |43.13 |376.84 |256 |
| [convnext_base.clip_laion2b_augreg_ft_in12k_in1k_384](https://huggingface.co/timm/convnext_base.clip_laion2b_augreg_ft_in12k_in1k_384) |87.138|98.212|384 |88.59 |45.21 |84.49 |365.47 |256 |
| [convnext_xlarge.fb_in22k_ft_in1k](https://huggingface.co/timm/convnext_xlarge.fb_in22k_ft_in1k) |87.002|98.208|224 |350.2 |60.98 |57.5 |368.01 |256 |
| [convnext_base.fb_in22k_ft_in1k_384](https://huggingface.co/timm/convnext_base.fb_in22k_ft_in1k_384) |86.796|98.264|384 |88.59 |45.21 |84.49 |366.54 |256 |
| [convnextv2_base.fcmae_ft_in22k_in1k](https://huggingface.co/timm/convnextv2_base.fcmae_ft_in22k_in1k) |86.74 |98.022|224 |88.72 |15.38 |28.75 |624.23 |256 |
| [convnext_large.fb_in22k_ft_in1k](https://huggingface.co/timm/convnext_large.fb_in22k_ft_in1k) |86.636|98.028|224 |197.77 |34.4 |43.13 |581.43 |256 |
| [convnext_base.clip_laiona_augreg_ft_in1k_384](https://huggingface.co/timm/convnext_base.clip_laiona_augreg_ft_in1k_384) |86.504|97.97 |384 |88.59 |45.21 |84.49 |368.14 |256 |
| [convnext_base.clip_laion2b_augreg_ft_in12k_in1k](https://huggingface.co/timm/convnext_base.clip_laion2b_augreg_ft_in12k_in1k) |86.344|97.97 |256 |88.59 |20.09 |37.55 |816.14 |256 |
| [convnextv2_huge.fcmae_ft_in1k](https://huggingface.co/timm/convnextv2_huge.fcmae_ft_in1k) |86.256|97.75 |224 |660.29 |115.0 |79.07 |154.72 |256 |
| [convnext_small.in12k_ft_in1k_384](https://huggingface.co/timm/convnext_small.in12k_ft_in1k_384) |86.182|97.92 |384 |50.22 |25.58 |63.37 |516.19 |256 |
| [convnext_base.clip_laion2b_augreg_ft_in1k](https://huggingface.co/timm/convnext_base.clip_laion2b_augreg_ft_in1k) |86.154|97.68 |256 |88.59 |20.09 |37.55 |819.86 |256 |
| [convnext_base.fb_in22k_ft_in1k](https://huggingface.co/timm/convnext_base.fb_in22k_ft_in1k) |85.822|97.866|224 |88.59 |15.38 |28.75 |1037.66 |256 |
| [convnext_small.fb_in22k_ft_in1k_384](https://huggingface.co/timm/convnext_small.fb_in22k_ft_in1k_384) |85.778|97.886|384 |50.22 |25.58 |63.37 |518.95 |256 |
| [convnextv2_large.fcmae_ft_in1k](https://huggingface.co/timm/convnextv2_large.fcmae_ft_in1k) |85.742|97.584|224 |197.96 |34.4 |43.13 |375.23 |256 |
| [convnext_small.in12k_ft_in1k](https://huggingface.co/timm/convnext_small.in12k_ft_in1k) |85.174|97.506|224 |50.22 |8.71 |21.56 |1474.31 |256 |
| [convnext_tiny.in12k_ft_in1k_384](https://huggingface.co/timm/convnext_tiny.in12k_ft_in1k_384) |85.118|97.608|384 |28.59 |13.14 |39.48 |856.76 |256 |
| [convnextv2_tiny.fcmae_ft_in22k_in1k_384](https://huggingface.co/timm/convnextv2_tiny.fcmae_ft_in22k_in1k_384) |85.112|97.63 |384 |28.64 |13.14 |39.48 |491.32 |256 |
| [convnextv2_base.fcmae_ft_in1k](https://huggingface.co/timm/convnextv2_base.fcmae_ft_in1k) |84.874|97.09 |224 |88.72 |15.38 |28.75 |625.33 |256 |
| [convnext_small.fb_in22k_ft_in1k](https://huggingface.co/timm/convnext_small.fb_in22k_ft_in1k) |84.562|97.394|224 |50.22 |8.71 |21.56 |1478.29 |256 |
| [convnext_large.fb_in1k](https://huggingface.co/timm/convnext_large.fb_in1k) |84.282|96.892|224 |197.77 |34.4 |43.13 |584.28 |256 |
| [convnext_tiny.in12k_ft_in1k](https://huggingface.co/timm/convnext_tiny.in12k_ft_in1k) |84.186|97.124|224 |28.59 |4.47 |13.44 |2433.7 |256 |
| [convnext_tiny.fb_in22k_ft_in1k_384](https://huggingface.co/timm/convnext_tiny.fb_in22k_ft_in1k_384) |84.084|97.14 |384 |28.59 |13.14 |39.48 |862.95 |256 |
| [convnextv2_tiny.fcmae_ft_in22k_in1k](https://huggingface.co/timm/convnextv2_tiny.fcmae_ft_in22k_in1k) |83.894|96.964|224 |28.64 |4.47 |13.44 |1452.72 |256 |
| [convnext_base.fb_in1k](https://huggingface.co/timm/convnext_base.fb_in1k) |83.82 |96.746|224 |88.59 |15.38 |28.75 |1054.0 |256 |
| [convnextv2_nano.fcmae_ft_in22k_in1k_384](https://huggingface.co/timm/convnextv2_nano.fcmae_ft_in22k_in1k_384) |83.37 |96.742|384 |15.62 |7.22 |24.61 |801.72 |256 |
| [convnext_small.fb_in1k](https://huggingface.co/timm/convnext_small.fb_in1k) |83.142|96.434|224 |50.22 |8.71 |21.56 |1464.0 |256 |
| [convnextv2_tiny.fcmae_ft_in1k](https://huggingface.co/timm/convnextv2_tiny.fcmae_ft_in1k) |82.92 |96.284|224 |28.64 |4.47 |13.44 |1425.62 |256 |
| [convnext_tiny.fb_in22k_ft_in1k](https://huggingface.co/timm/convnext_tiny.fb_in22k_ft_in1k) |82.898|96.616|224 |28.59 |4.47 |13.44 |2480.88 |256 |
| [convnext_nano.in12k_ft_in1k](https://huggingface.co/timm/convnext_nano.in12k_ft_in1k) |82.282|96.344|224 |15.59 |2.46 |8.37 |3926.52 |256 |
| [convnext_tiny_hnf.a2h_in1k](https://huggingface.co/timm/convnext_tiny_hnf.a2h_in1k) |82.216|95.852|224 |28.59 |4.47 |13.44 |2529.75 |256 |
| [convnext_tiny.fb_in1k](https://huggingface.co/timm/convnext_tiny.fb_in1k) |82.066|95.854|224 |28.59 |4.47 |13.44 |2346.26 |256 |
| [convnextv2_nano.fcmae_ft_in22k_in1k](https://huggingface.co/timm/convnextv2_nano.fcmae_ft_in22k_in1k) |82.03 |96.166|224 |15.62 |2.46 |8.37 |2300.18 |256 |
| [convnextv2_nano.fcmae_ft_in1k](https://huggingface.co/timm/convnextv2_nano.fcmae_ft_in1k) |81.83 |95.738|224 |15.62 |2.46 |8.37 |2321.48 |256 |
| [convnext_nano_ols.d1h_in1k](https://huggingface.co/timm/convnext_nano_ols.d1h_in1k) |80.866|95.246|224 |15.65 |2.65 |9.38 |3523.85 |256 |
| [convnext_nano.d1h_in1k](https://huggingface.co/timm/convnext_nano.d1h_in1k) |80.768|95.334|224 |15.59 |2.46 |8.37 |3915.58 |256 |
| [convnextv2_pico.fcmae_ft_in1k](https://huggingface.co/timm/convnextv2_pico.fcmae_ft_in1k) |80.304|95.072|224 |9.07 |1.37 |6.1 |3274.57 |256 |
| [convnext_pico.d1_in1k](https://huggingface.co/timm/convnext_pico.d1_in1k) |79.526|94.558|224 |9.05 |1.37 |6.1 |5686.88 |256 |
| [convnext_pico_ols.d1_in1k](https://huggingface.co/timm/convnext_pico_ols.d1_in1k) |79.522|94.692|224 |9.06 |1.43 |6.5 |5422.46 |256 |
| [convnextv2_femto.fcmae_ft_in1k](https://huggingface.co/timm/convnextv2_femto.fcmae_ft_in1k) |78.488|93.98 |224 |5.23 |0.79 |4.57 |4264.2 |256 |
| [convnext_femto_ols.d1_in1k](https://huggingface.co/timm/convnext_femto_ols.d1_in1k) |77.86 |93.83 |224 |5.23 |0.82 |4.87 |6910.6 |256 |
| [convnext_femto.d1_in1k](https://huggingface.co/timm/convnext_femto.d1_in1k) |77.454|93.68 |224 |5.22 |0.79 |4.57 |7189.92 |256 |
| [convnextv2_atto.fcmae_ft_in1k](https://huggingface.co/timm/convnextv2_atto.fcmae_ft_in1k) |76.664|93.044|224 |3.71 |0.55 |3.81 |4728.91 |256 |
| [convnext_atto_ols.a2_in1k](https://huggingface.co/timm/convnext_atto_ols.a2_in1k) |75.88 |92.846|224 |3.7 |0.58 |4.11 |7963.16 |256 |
| [convnext_atto.d2_in1k](https://huggingface.co/timm/convnext_atto.d2_in1k) |75.664|92.9 |224 |3.7 |0.55 |3.81 |8439.22 |256 |
## Citation
```bibtex
@article{Woo2023ConvNeXtV2,
title={ConvNeXt V2: Co-designing and Scaling ConvNets with Masked Autoencoders},
  author={Sanghyun Woo and Shoubhik Debnath and Ronghang Hu and Xinlei Chen and Zhuang Liu and In So Kweon and Saining Xie},
year={2023},
journal={arXiv preprint arXiv:2301.00808},
}
```
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
| 15,749 | [
[
-0.07012939453125,
-0.031524658203125,
-0.00353240966796875,
0.03839111328125,
-0.031402587890625,
-0.0157012939453125,
-0.01294708251953125,
-0.03515625,
0.06634521484375,
0.018890380859375,
-0.044708251953125,
-0.039337158203125,
-0.0533447265625,
-0.004222869873046875,
0.006290435791015625,
0.0677490234375,
-0.0026798248291015625,
-0.0105133056640625,
0.018218994140625,
-0.0274810791015625,
-0.0176849365234375,
-0.03009033203125,
-0.0611572265625,
-0.014892578125,
0.0176544189453125,
0.0226287841796875,
0.056976318359375,
0.045867919921875,
0.029083251953125,
0.0404052734375,
-0.015106201171875,
0.01280975341796875,
-0.01541900634765625,
-0.02374267578125,
0.0391845703125,
-0.0296173095703125,
-0.06768798828125,
0.018096923828125,
0.0595703125,
0.038177490234375,
0.004955291748046875,
0.01654052734375,
0.02569580078125,
0.034820556640625,
0.00217437744140625,
-0.0031280517578125,
-0.00635528564453125,
0.01364898681640625,
-0.01751708984375,
0.002758026123046875,
0.00615692138671875,
-0.048797607421875,
0.02587890625,
-0.04290771484375,
0.0005469322204589844,
-0.0005645751953125,
0.09991455078125,
-0.006256103515625,
-0.0188140869140625,
-0.00020825862884521484,
0.0086822509765625,
0.053741455078125,
-0.0576171875,
0.0211944580078125,
0.032073974609375,
-0.00751495361328125,
-0.01409912109375,
-0.049957275390625,
-0.045654296875,
-0.00244140625,
-0.0295257568359375,
0.017333984375,
-0.0272064208984375,
-0.00534820556640625,
0.042510986328125,
0.034942626953125,
-0.03875732421875,
-0.005130767822265625,
-0.0241851806640625,
-0.00963592529296875,
0.061248779296875,
-0.009857177734375,
0.046295166015625,
-0.025787353515625,
-0.04718017578125,
-0.02099609375,
-0.0154266357421875,
0.03094482421875,
0.0149993896484375,
-0.006107330322265625,
-0.072265625,
0.038116455078125,
0.00788116455078125,
0.0182037353515625,
0.02667236328125,
-0.01210784912109375,
0.056793212890625,
-0.0181884765625,
-0.041473388671875,
-0.0245208740234375,
0.08966064453125,
0.05157470703125,
0.03253173828125,
0.00943756103515625,
0.003200531005859375,
-0.0065155029296875,
-0.0361328125,
-0.07562255859375,
-0.01361846923828125,
0.029022216796875,
-0.040740966796875,
-0.00897979736328125,
0.026092529296875,
-0.060791015625,
0.00847625732421875,
-0.01041412353515625,
0.0162811279296875,
-0.05859375,
-0.032257080078125,
-0.009613037109375,
-0.0273284912109375,
0.031585693359375,
0.0208282470703125,
-0.0295562744140625,
0.0238494873046875,
0.0224609375,
0.072998046875,
0.0220489501953125,
-0.00998687744140625,
-0.0316162109375,
-0.01070404052734375,
-0.02685546875,
0.0256195068359375,
0.01210784912109375,
-0.0112457275390625,
-0.02020263671875,
0.0328369140625,
-0.010009765625,
-0.0318603515625,
0.0284576416015625,
0.0207366943359375,
0.00833892822265625,
-0.0302276611328125,
-0.0274810791015625,
-0.0201873779296875,
0.027191162109375,
-0.03875732421875,
0.0792236328125,
0.03668212890625,
-0.075439453125,
0.022430419921875,
-0.036468505859375,
-0.00567626953125,
-0.0238189697265625,
0.00403594970703125,
-0.05810546875,
-0.00894927978515625,
0.02001953125,
0.0552978515625,
-0.01074981689453125,
-0.00787353515625,
-0.027069091796875,
-0.003612518310546875,
0.025421142578125,
0.00780487060546875,
0.07049560546875,
0.01418304443359375,
-0.039459228515625,
0.00012385845184326172,
-0.047088623046875,
0.02288818359375,
0.0295257568359375,
0.0016841888427734375,
-0.0043792724609375,
-0.058929443359375,
0.0036487579345703125,
0.043701171875,
0.01535797119140625,
-0.037139892578125,
0.020294189453125,
-0.016876220703125,
0.0306549072265625,
0.047515869140625,
-0.005786895751953125,
0.02294921875,
-0.042236328125,
0.041015625,
0.0076904296875,
0.019287109375,
-0.00582122802734375,
-0.027557373046875,
-0.0567626953125,
-0.0513916015625,
0.0165863037109375,
0.03472900390625,
-0.034332275390625,
0.05584716796875,
0.0128021240234375,
-0.043548583984375,
-0.0562744140625,
0.018280029296875,
0.0390625,
0.015380859375,
0.01702880859375,
-0.0272064208984375,
-0.05181884765625,
-0.071533203125,
-0.007686614990234375,
0.00749969482421875,
-0.004550933837890625,
0.046478271484375,
0.027862548828125,
-0.0062103271484375,
0.042572021484375,
-0.0311126708984375,
-0.0238189697265625,
-0.01006317138671875,
-0.005863189697265625,
0.02972412109375,
0.0604248046875,
0.08819580078125,
-0.064453125,
-0.06793212890625,
0.0004582405090332031,
-0.08477783203125,
0.0008206367492675781,
-0.003856658935546875,
-0.03155517578125,
0.0211181640625,
0.0217437744140625,
-0.07452392578125,
0.0494384765625,
0.0305023193359375,
-0.04559326171875,
0.032562255859375,
-0.02020263671875,
0.0240325927734375,
-0.0740966796875,
0.017425537109375,
0.020294189453125,
-0.02587890625,
-0.03863525390625,
0.006134033203125,
-0.0072479248046875,
0.0111083984375,
-0.046478271484375,
0.06744384765625,
-0.05462646484375,
0.0038013458251953125,
0.0014801025390625,
0.01018524169921875,
0.00252532958984375,
0.03839111328125,
-0.00005549192428588867,
0.035888671875,
0.058746337890625,
-0.0211944580078125,
0.034515380859375,
0.037322998046875,
-0.002532958984375,
0.056793212890625,
-0.0487060546875,
0.00963592529296875,
0.0076751708984375,
0.03631591796875,
-0.06939697265625,
-0.032928466796875,
0.043365478515625,
-0.057342529296875,
0.038543701171875,
-0.0184326171875,
-0.0278472900390625,
-0.061370849609375,
-0.06414794921875,
0.018646240234375,
0.04541015625,
-0.048614501953125,
0.0117950439453125,
0.0240936279296875,
0.004337310791015625,
-0.045166015625,
-0.049774169921875,
-0.002674102783203125,
-0.0302734375,
-0.063720703125,
0.029205322265625,
0.00759124755859375,
-0.00848388671875,
-0.00106048583984375,
-0.0016469955444335938,
-0.00406646728515625,
-0.0120086669921875,
0.038604736328125,
0.0298309326171875,
-0.0200347900390625,
-0.02642822265625,
-0.021575927734375,
-0.00930023193359375,
0.004138946533203125,
-0.0104827880859375,
0.04205322265625,
-0.02288818359375,
0.011474609375,
-0.078369140625,
0.0187835693359375,
0.049957275390625,
-0.0043792724609375,
0.06573486328125,
0.07867431640625,
-0.03485107421875,
0.01049041748046875,
-0.0322265625,
-0.011749267578125,
-0.038848876953125,
-0.00742340087890625,
-0.039520263671875,
-0.050506591796875,
0.060455322265625,
0.01406097412109375,
-0.006763458251953125,
0.05462646484375,
0.02642822265625,
-0.0181427001953125,
0.0657958984375,
0.038543701171875,
-0.00722503662109375,
0.0428466796875,
-0.06829833984375,
0.002933502197265625,
-0.06298828125,
-0.048065185546875,
-0.00753021240234375,
-0.043121337890625,
-0.055084228515625,
-0.03253173828125,
0.01971435546875,
0.035430908203125,
-0.0083770751953125,
0.051025390625,
-0.0443115234375,
-0.005855560302734375,
0.03631591796875,
0.02142333984375,
-0.02044677734375,
-0.0168914794921875,
-0.00818634033203125,
-0.016693115234375,
-0.041229248046875,
-0.01019287109375,
0.051910400390625,
0.049835205078125,
0.02813720703125,
0.00004494190216064453,
0.037841796875,
-0.0023193359375,
0.022186279296875,
-0.038909912109375,
0.053924560546875,
-0.003856658935546875,
-0.0396728515625,
-0.0159149169921875,
-0.033294677734375,
-0.07257080078125,
0.012908935546875,
-0.028076171875,
-0.0614013671875,
-0.008453369140625,
0.0157012939453125,
-0.0210113525390625,
0.039520263671875,
-0.050689697265625,
0.056793212890625,
-0.006008148193359375,
-0.0364990234375,
0.006542205810546875,
-0.06622314453125,
0.02020263671875,
0.0305328369140625,
-0.00568389892578125,
-0.01351165771484375,
0.01142120361328125,
0.0599365234375,
-0.0648193359375,
0.035888671875,
-0.0295562744140625,
0.0025463104248046875,
0.041595458984375,
-0.0031452178955078125,
0.032806396484375,
0.01226806640625,
0.0005788803100585938,
0.0036029815673828125,
0.01148223876953125,
-0.048492431640625,
-0.03106689453125,
0.05145263671875,
-0.052276611328125,
-0.03192138671875,
-0.041595458984375,
-0.0213470458984375,
0.0135650634765625,
0.001018524169921875,
0.046356201171875,
0.0428466796875,
-0.010711669921875,
0.013916015625,
0.041656494140625,
-0.0259552001953125,
0.039886474609375,
-0.01221466064453125,
-0.0015020370483398438,
-0.039276123046875,
0.05804443359375,
0.003765106201171875,
0.0082855224609375,
0.002788543701171875,
0.0038394927978515625,
-0.033111572265625,
-0.011077880859375,
-0.01190185546875,
0.050048828125,
-0.01678466796875,
-0.0276641845703125,
-0.049957275390625,
-0.034393310546875,
-0.046844482421875,
-0.0240936279296875,
-0.030975341796875,
-0.022918701171875,
-0.02764892578125,
0.00536346435546875,
0.05389404296875,
0.041229248046875,
-0.0302276611328125,
0.03204345703125,
-0.04644775390625,
0.0254669189453125,
0.00705718994140625,
0.03057861328125,
-0.0217742919921875,
-0.042144775390625,
0.0026798248291015625,
0.0022945404052734375,
-0.016937255859375,
-0.0555419921875,
0.048248291015625,
0.0102386474609375,
0.02813720703125,
0.039886474609375,
-0.0235748291015625,
0.058929443359375,
-0.007232666015625,
0.03875732421875,
0.042724609375,
-0.0657958984375,
0.032684326171875,
-0.0286712646484375,
0.007843017578125,
0.01319122314453125,
0.027587890625,
-0.0374755859375,
-0.0249786376953125,
-0.0709228515625,
-0.04388427734375,
0.0513916015625,
0.01146697998046875,
0.0009512901306152344,
0.003337860107421875,
0.0477294921875,
-0.0073089599609375,
0.01157379150390625,
-0.04034423828125,
-0.056793212890625,
-0.01959228515625,
-0.00946044921875,
-0.0081024169921875,
-0.007476806640625,
-0.0017175674438476562,
-0.050628662109375,
0.034820556640625,
-0.008880615234375,
0.043365478515625,
0.02142333984375,
-0.0017786026000976562,
-0.0017223358154296875,
-0.0238189697265625,
0.041595458984375,
0.027069091796875,
-0.0209808349609375,
-0.00811767578125,
0.0262603759765625,
-0.03704833984375,
0.0023365020751953125,
0.021636962890625,
0.007465362548828125,
0.0168914794921875,
0.0232391357421875,
0.046478271484375,
0.01971435546875,
-0.0135650634765625,
0.047149658203125,
-0.01555633544921875,
-0.030792236328125,
-0.025238037109375,
-0.0025463104248046875,
0.01302337646484375,
0.03289794921875,
0.0161590576171875,
0.00225067138671875,
-0.0210723876953125,
-0.046051025390625,
0.04315185546875,
0.058502197265625,
-0.034698486328125,
-0.042144775390625,
0.050567626953125,
-0.005084991455078125,
-0.006099700927734375,
0.039459228515625,
-0.006633758544921875,
-0.054473876953125,
0.07611083984375,
0.022308349609375,
0.043060302734375,
-0.044097900390625,
0.0195465087890625,
0.06634521484375,
0.00013637542724609375,
0.00865936279296875,
0.024078369140625,
0.0248870849609375,
-0.02996826171875,
0.0024890899658203125,
-0.04644775390625,
0.01427459716796875,
0.0423583984375,
-0.0357666015625,
0.02679443359375,
-0.058197021484375,
-0.025054931640625,
0.01375579833984375,
0.0328369140625,
-0.06365966796875,
0.02435302734375,
0.00586700439453125,
0.0826416015625,
-0.057769775390625,
0.06817626953125,
0.05352783203125,
-0.0263214111328125,
-0.07000732421875,
-0.0121002197265625,
0.016937255859375,
-0.059722900390625,
0.027984619140625,
0.0185546875,
0.01806640625,
-0.0157318115234375,
-0.043792724609375,
-0.0394287109375,
0.091552734375,
0.035430908203125,
-0.0148468017578125,
0.0080718994140625,
-0.0241241455078125,
0.0302276611328125,
-0.022003173828125,
0.036651611328125,
0.042999267578125,
0.03997802734375,
0.01849365234375,
-0.0689697265625,
0.02862548828125,
-0.0308685302734375,
-0.01285552978515625,
0.0217437744140625,
-0.10382080078125,
0.0765380859375,
-0.02685546875,
-0.002811431884765625,
0.0133819580078125,
0.064208984375,
0.0267791748046875,
0.0038909912109375,
0.028076171875,
0.05224609375,
0.034942626953125,
-0.0138702392578125,
0.08026123046875,
-0.0007104873657226562,
0.031585693359375,
0.022186279296875,
0.0391845703125,
0.0287322998046875,
0.02716064453125,
-0.0298614501953125,
0.00789642333984375,
0.06463623046875,
-0.01468658447265625,
0.00934600830078125,
0.0157928466796875,
-0.0118865966796875,
-0.01129150390625,
-0.0157012939453125,
-0.047119140625,
0.032562255859375,
0.01215362548828125,
-0.0223846435546875,
0.0006003379821777344,
-0.007354736328125,
0.03887939453125,
-0.0007357597351074219,
-0.0109100341796875,
0.03515625,
0.0211029052734375,
-0.04443359375,
0.040252685546875,
-0.00565338134765625,
0.072998046875,
-0.0272064208984375,
0.0007429122924804688,
-0.024810791015625,
0.020660400390625,
-0.01910400390625,
-0.08758544921875,
0.0254058837890625,
-0.010986328125,
0.01454925537109375,
-0.00582122802734375,
0.0469970703125,
-0.035736083984375,
-0.01776123046875,
0.0400390625,
0.0262298583984375,
0.02825927734375,
0.0036678314208984375,
-0.08599853515625,
0.0184173583984375,
0.009185791015625,
-0.042022705078125,
0.03363037109375,
0.035858154296875,
0.0191802978515625,
0.050628662109375,
0.031494140625,
0.01258087158203125,
0.00461578369140625,
-0.02581787109375,
0.0595703125,
-0.050048828125,
-0.0340576171875,
-0.06573486328125,
0.032135009765625,
-0.0260009765625,
-0.046295166015625,
0.059295654296875,
0.033477783203125,
0.037353515625,
0.0091400146484375,
0.0401611328125,
-0.035858154296875,
0.0263824462890625,
-0.032318115234375,
0.053924560546875,
-0.06036376953125,
-0.0223236083984375,
-0.0303192138671875,
-0.06341552734375,
-0.022216796875,
0.0545654296875,
0.00506591796875,
0.0169830322265625,
0.025634765625,
0.04449462890625,
-0.0040283203125,
-0.019989013671875,
-0.002696990966796875,
0.02130126953125,
0.00545501708984375,
0.061798095703125,
0.038177490234375,
-0.05743408203125,
0.0162353515625,
-0.0469970703125,
-0.02410888671875,
-0.0260467529296875,
-0.05377197265625,
-0.0838623046875,
-0.05975341796875,
-0.03826904296875,
-0.050567626953125,
-0.0219879150390625,
0.08538818359375,
0.0728759765625,
-0.04156494140625,
-0.01088714599609375,
0.0243072509765625,
0.009246826171875,
-0.015380859375,
-0.02008056640625,
0.03997802734375,
0.0244903564453125,
-0.076416015625,
-0.0209808349609375,
0.008758544921875,
0.041595458984375,
0.01971435546875,
-0.02777099609375,
-0.0157470703125,
-0.0033893585205078125,
0.034149169921875,
0.06365966796875,
-0.05230712890625,
-0.035980224609375,
0.002002716064453125,
-0.0203857421875,
0.0194854736328125,
0.0245513916015625,
-0.0309295654296875,
-0.006221771240234375,
0.037994384765625,
0.01026153564453125,
0.05780029296875,
0.0092620849609375,
0.020355224609375,
-0.0450439453125,
0.049835205078125,
-0.004085540771484375,
0.02679443359375,
0.027740478515625,
-0.030792236328125,
0.05377197265625,
0.037139892578125,
-0.0345458984375,
-0.07159423828125,
-0.0206756591796875,
-0.1075439453125,
0.000021398067474365234,
0.06060791015625,
-0.01477813720703125,
-0.039825439453125,
0.038665771484375,
-0.0250244140625,
0.0391845703125,
-0.01788330078125,
0.02239990234375,
0.02801513671875,
-0.02294921875,
-0.03662109375,
-0.04119873046875,
0.0556640625,
0.02606201171875,
-0.050811767578125,
-0.025787353515625,
-0.00004941225051879883,
0.037994384765625,
0.01488494873046875,
0.0599365234375,
-0.016754150390625,
0.01322174072265625,
0.00118255615234375,
0.01152801513671875,
0.001800537109375,
0.0011072158813476562,
-0.014068603515625,
-0.0174560546875,
-0.025604248046875,
-0.043426513671875
]
] |
Open-Orca/LlongOrca-13B-16k | 2023-08-21T05:15:08.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"en",
"dataset:Open-Orca/OpenOrca",
"arxiv:2306.02707",
"arxiv:2301.13688",
"arxiv:2307.09288",
"license:llama2",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | Open-Orca | null | null | Open-Orca/LlongOrca-13B-16k | 11 | 7,025 | transformers | 2023-08-16T23:03:19 | ---
license: llama2
language:
- en
library_name: transformers
pipeline_tag: text-generation
datasets:
- Open-Orca/OpenOrca
---
<p><h1>🐋 The Second Llong Context Orca! 🐋</h1></p>

# OpenOrca - LlongOrca - 13B - 16k
We have used our own [OpenOrca dataset](https://huggingface.co/datasets/Open-Orca/OpenOrca) to fine-tune on top of [LLongMA-2-13b-16k](https://huggingface.co/conceptofmind/LLongMA-2-13b-16k).
This dataset is our attempt to reproduce the dataset generated for Microsoft Research's [Orca Paper](https://arxiv.org/abs/2306.02707).
We use [OpenChat](https://huggingface.co/openchat) packing, trained with [Axolotl](https://github.com/OpenAccess-AI-Collective/axolotl).
This release is trained on a curated filtered subset of most of our GPT-4 augmented data.
It is the same subset of our data as was used in our [OpenOrcaxOpenChat-Preview2-13B model](https://huggingface.co/Open-Orca/OpenOrcaxOpenChat-Preview2-13B).
HF Leaderboard evals place this model as #1 for all 13B long context models at release time.
We achieve >112% of the performance of the base LLongMA2-13b-16k model we tuned on top of.
As well, we preserve >98% of the performance of the OpenOrcaxOpenChat-Preview2-13B model we share datasets with, while extending the context to 16k.
We did this training as part of testing the setup of our H100 cluster.
Want to visualize our full (pre-filtering) dataset? Check out our [Nomic Atlas Map](https://atlas.nomic.ai/map/c1b88b47-2d9b-47e0-9002-b80766792582/2560fd25-52fe-42f1-a58f-ff5eccc890d2).
[<img src="https://huggingface.co/Open-Orca/OpenOrca-Preview1-13B/resolve/main/OpenOrca%20Nomic%20Atlas.png" alt="Atlas Nomic Dataset Map" width="400" height="400" />](https://atlas.nomic.ai/map/c1b88b47-2d9b-47e0-9002-b80766792582/2560fd25-52fe-42f1-a58f-ff5eccc890d2)
Many thanks to @EnricoShippole, @theemozilla, and @kaiokendev1 for the fine work on creating the LlongMA-2-13b-16k model this was trained on top of!
We are in the process of training more models, so keep a lookout on our org for releases coming soon with exciting partners.
We will also give sneak-peek announcements on our Discord, which you can find here:
https://AlignmentLab.ai
# Prompt Template
We used [OpenAI's Chat Markup Language (ChatML)](https://github.com/openai/openai-python/blob/main/chatml.md) format, with `<|im_start|>` and `<|im_end|>` tokens added to support this.
## Example Prompt Exchange
```
<|im_start|>system
You are LlongOrca, a large language model trained by Alignment Lab AI. Write out your reasoning step-by-step to be sure you get the right answers!
<|im_end|>
<|im_start|>user
How are you<|im_end|>
<|im_start|>assistant
I am doing well!<|im_end|>
<|im_start|>user
How are you now?<|im_end|>
```
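For programmatic use, an exchange like the one above can be assembled with a small helper. This is a minimal sketch: the `build_chatml_prompt` function and its `(role, content)` message format are illustrative conveniences, not part of any official API.

```python
def build_chatml_prompt(messages):
    """Assemble a ChatML prompt string from (role, content) pairs.

    Each turn is wrapped in <|im_start|>role ... <|im_end|> markers.
    The trailing '<|im_start|>assistant\n' cues the model to generate
    the next assistant turn.
    """
    parts = []
    for role, content in messages:
        parts.append(f"<|im_start|>{role}\n{content}<|im_end|>\n")
    parts.append("<|im_start|>assistant\n")
    return "".join(parts)

# Hypothetical two-turn conversation matching the template above.
prompt = build_chatml_prompt([
    ("system", "You are LlongOrca, a large language model trained by Alignment Lab AI."),
    ("user", "How are you?"),
])
```

The resulting string can then be tokenized and passed to the model as usual; just remember that `<|im_start|>` and `<|im_end|>` were added as special tokens, so the tokenizer from this repository should be used.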
# Evaluation
We have evaluated using the methodology and tools for the HuggingFace Leaderboard, and find that we have significantly improved upon the base long context model.
We reach >112% of LLongMA2-13B-16k performance.
## HuggingFaceH4 Open LLM Leaderboard Performance
We have run our own tests using parameters matching the [HuggingFaceH4 Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard) evals.
We preserve >98% of OpenOrcaxOpenChat-Preview2-13B performance and are #1 on the leaderboard for long context 13B models at release time.
We have >103% performance of the next 16k model (vicuna-13b-v1.5-16k).
As well, we expect the context extension techniques from LLongMA to be more robust than those of other available 16k context models.

## GPT4ALL Leaderboard Performance
We find we score higher than all non-OpenOrca models on the GPT4ALL leaderboard, while preserving ~98.7% of our OpenOrcaxOpenChat-Preview2-13B performance.

# Dataset
We used a curated, filtered selection of most of the GPT-4 augmented data from our OpenOrca dataset, which aims to reproduce the Orca Research Paper dataset.
Further details of our curation practices will be forthcoming with our full model releases.
# Training
[<img src="https://raw.githubusercontent.com/OpenAccess-AI-Collective/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl"/>](https://github.com/OpenAccess-AI-Collective/axolotl)
We trained with 8x H100 GPUs for 10 hours, completing 4 epochs of full fine-tuning on our dataset in one training run.
Commodity cost was ~$300.
# Citation
```bibtex
@software{dale2023llongorca13b,
title = {LlongOrca13B: Llama2-13B Model Instruct-tuned for Long Context on Filtered OpenOrcaV1 GPT-4 Dataset},
author = {Alpin Dale and Wing Lian and Bleys Goodson and Guan Wang and Eugene Pentland and Austin Cook and Chanvichet Vong and "Teknium"},
year = {2023},
publisher = {HuggingFace},
journal = {HuggingFace repository},
  howpublished = {\url{https://huggingface.co/Open-Orca/LlongOrca-13B-16k}},
}
@software{openchat,
title = {{OpenChat: Advancing Open-source Language Models with Imperfect Data}},
author = {Wang, Guan and Cheng, Sijie and Yu, Qiying and Liu, Changling},
doi = {10.5281/zenodo.8105775},
url = {https://github.com/imoneoi/openchat},
version = {pre-release},
year = {2023},
month = {7},
}
@misc{mukherjee2023orca,
title={Orca: Progressive Learning from Complex Explanation Traces of GPT-4},
author={Subhabrata Mukherjee and Arindam Mitra and Ganesh Jawahar and Sahaj Agarwal and Hamid Palangi and Ahmed Awadallah},
year={2023},
eprint={2306.02707},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
@misc{longpre2023flan,
title={The Flan Collection: Designing Data and Methods for Effective Instruction Tuning},
author={Shayne Longpre and Le Hou and Tu Vu and Albert Webson and Hyung Won Chung and Yi Tay and Denny Zhou and Quoc V. Le and Barret Zoph and Jason Wei and Adam Roberts},
year={2023},
eprint={2301.13688},
archivePrefix={arXiv},
primaryClass={cs.AI}
}
@misc{touvron2023llama,
title={Llama 2: Open Foundation and Fine-Tuned Chat Models},
author={Hugo Touvron and Louis Martin and Kevin Stone and Peter Albert and Amjad Almahairi and Yasmine Babaei and Nikolay Bashlykov and Soumya Batra and Prajjwal Bhargava and Shruti Bhosale and Dan Bikel and Lukas Blecher and Cristian Canton Ferrer and Moya Chen and Guillem Cucurull and David Esiobu and Jude Fernandes and Jeremy Fu and Wenyin Fu and Brian Fuller and Cynthia Gao and Vedanuj Goswami and Naman Goyal and Anthony Hartshorn and Saghar Hosseini and Rui Hou and Hakan Inan and Marcin Kardas and Viktor Kerkez and Madian Khabsa and Isabel Kloumann and Artem Korenev and Punit Singh Koura and Marie-Anne Lachaux and Thibaut Lavril and Jenya Lee and Diana Liskovich and Yinghai Lu and Yuning Mao and Xavier Martinet and Todor Mihaylov and Pushkar Mishra and Igor Molybog and Yixin Nie and Andrew Poulton and Jeremy Reizenstein and Rashi Rungta and Kalyan Saladi and Alan Schelten and Ruan Silva and Eric Michael Smith and Ranjan Subramanian and Xiaoqing Ellen Tan and Binh Tang and Ross Taylor and Adina Williams and Jian Xiang Kuan and Puxin Xu and Zheng Yan and Iliyan Zarov and Yuchen Zhang and Angela Fan and Melanie Kambadur and Sharan Narang and Aurelien Rodriguez and Robert Stojnic and Sergey Edunov and Thomas Scialom},
year={2023},
eprint={2307.09288},
archivePrefix={arXiv},
}
``` | 7,790 | [
[
-0.033355712890625,
-0.06561279296875,
0.0194549560546875,
0.012939453125,
-0.01471710205078125,
-0.011474609375,
-0.026153564453125,
-0.06317138671875,
0.0164947509765625,
0.022003173828125,
-0.0440673828125,
-0.053497314453125,
-0.0253753662109375,
-0.00775146484375,
-0.006488800048828125,
0.0771484375,
-0.007488250732421875,
-0.02142333984375,
-0.00927734375,
-0.0299224853515625,
-0.03106689453125,
-0.03411865234375,
-0.06719970703125,
-0.0276947021484375,
0.036346435546875,
0.026580810546875,
0.04632568359375,
0.054473876953125,
0.026031494140625,
0.0213165283203125,
-0.021820068359375,
0.0278472900390625,
-0.053863525390625,
-0.01444244384765625,
0.0181121826171875,
-0.020782470703125,
-0.0771484375,
0.00518035888671875,
0.032073974609375,
0.031951904296875,
-0.019989013671875,
0.01396942138671875,
0.0207672119140625,
0.03515625,
-0.032623291015625,
0.0286102294921875,
-0.022216796875,
-0.0131072998046875,
-0.0238800048828125,
0.0011157989501953125,
-0.015625,
-0.0208587646484375,
0.0047607421875,
-0.06280517578125,
-0.01129913330078125,
0.006244659423828125,
0.08935546875,
0.00850677490234375,
-0.0251922607421875,
-0.0169219970703125,
-0.032318115234375,
0.052886962890625,
-0.060577392578125,
0.0258636474609375,
0.0187835693359375,
0.0239105224609375,
-0.0234375,
-0.06060791015625,
-0.0428466796875,
-0.00873565673828125,
-0.003070831298828125,
0.023162841796875,
-0.0172271728515625,
0.0025768280029296875,
0.00977325439453125,
0.0309906005859375,
-0.041748046875,
0.0249176025390625,
-0.034637451171875,
-0.01285552978515625,
0.049285888671875,
0.0032711029052734375,
0.0209197998046875,
0.006969451904296875,
-0.038238525390625,
-0.02484130859375,
-0.060943603515625,
0.0211334228515625,
0.0255279541015625,
0.0281982421875,
-0.046417236328125,
0.0307159423828125,
-0.00475311279296875,
0.0538330078125,
-0.00276947021484375,
-0.02703857421875,
0.0384521484375,
-0.036773681640625,
-0.025970458984375,
-0.0059814453125,
0.07037353515625,
0.023956298828125,
0.0020961761474609375,
0.01149749755859375,
-0.01038360595703125,
0.0037059783935546875,
-0.00974273681640625,
-0.07110595703125,
-0.0157623291015625,
0.020965576171875,
-0.025970458984375,
-0.0179901123046875,
-0.0018939971923828125,
-0.051727294921875,
-0.01081085205078125,
-0.016510009765625,
0.018096923828125,
-0.032257080078125,
-0.025787353515625,
0.0106048583984375,
0.018951416015625,
0.0266571044921875,
0.032867431640625,
-0.055877685546875,
0.0273590087890625,
0.039276123046875,
0.07684326171875,
-0.007640838623046875,
-0.0179901123046875,
-0.0196075439453125,
-0.00939178466796875,
-0.0261383056640625,
0.052947998046875,
-0.0258941650390625,
-0.0161895751953125,
-0.0121612548828125,
-0.0011548995971679688,
-0.00914764404296875,
-0.0311279296875,
0.043121337890625,
-0.01922607421875,
0.028228759765625,
-0.0286407470703125,
-0.030487060546875,
-0.0191802978515625,
0.0182647705078125,
-0.03253173828125,
0.09027099609375,
0.00514984130859375,
-0.05242919921875,
0.0183563232421875,
-0.06463623046875,
-0.01201629638671875,
-0.0149993896484375,
-0.0098724365234375,
-0.0399169921875,
-0.0291900634765625,
0.041259765625,
0.022979736328125,
-0.0239715576171875,
0.0011758804321289062,
-0.03497314453125,
-0.031341552734375,
0.0017261505126953125,
-0.00799560546875,
0.069091796875,
0.0203094482421875,
-0.036346435546875,
0.014801025390625,
-0.04705810546875,
-0.0005650520324707031,
0.020263671875,
-0.0262603759765625,
-0.0128631591796875,
-0.0133819580078125,
-0.00899505615234375,
0.02349853515625,
0.0287322998046875,
-0.0443115234375,
0.0243988037109375,
-0.036102294921875,
0.05010986328125,
0.0609130859375,
-0.01125335693359375,
0.031402587890625,
-0.028289794921875,
0.036590576171875,
0.01010894775390625,
0.0284576416015625,
-0.0138092041015625,
-0.062744140625,
-0.058746337890625,
-0.026611328125,
0.01471710205078125,
0.033721923828125,
-0.0428466796875,
0.0418701171875,
-0.0205078125,
-0.047760009765625,
-0.039642333984375,
0.00295257568359375,
0.0384521484375,
0.049163818359375,
0.031585693359375,
-0.039031982421875,
-0.033416748046875,
-0.050750732421875,
0.0035858154296875,
-0.017852783203125,
-0.0013132095336914062,
0.05194091796875,
0.033660888671875,
-0.0157623291015625,
0.07427978515625,
-0.035980224609375,
-0.035491943359375,
-0.0053863525390625,
-0.014007568359375,
0.02301025390625,
0.03961181640625,
0.06256103515625,
-0.039154052734375,
-0.037811279296875,
0.00894927978515625,
-0.06512451171875,
0.0004515647888183594,
0.00919342041015625,
-0.03619384765625,
0.033355712890625,
0.03839111328125,
-0.049835205078125,
0.040557861328125,
0.0538330078125,
-0.03472900390625,
0.028839111328125,
-0.01192474365234375,
0.0006723403930664062,
-0.08111572265625,
0.01519012451171875,
0.00588226318359375,
-0.00495147705078125,
-0.0421142578125,
-0.004642486572265625,
-0.00102996826171875,
-0.00249481201171875,
-0.034332275390625,
0.05975341796875,
-0.04742431640625,
-0.00010144710540771484,
-0.0005993843078613281,
0.030609130859375,
-0.0108642578125,
0.057861328125,
-0.0171051025390625,
0.062164306640625,
0.048797607421875,
-0.0311737060546875,
0.0174560546875,
0.033966064453125,
-0.032806396484375,
0.0295867919921875,
-0.06512451171875,
0.02606201171875,
0.002445220947265625,
0.043243408203125,
-0.091552734375,
-0.0234832763671875,
0.039398193359375,
-0.035308837890625,
0.028411865234375,
-0.00922393798828125,
-0.0306243896484375,
-0.03948974609375,
-0.0287017822265625,
0.0309600830078125,
0.0418701171875,
-0.055572509765625,
0.029327392578125,
0.0293121337890625,
0.007541656494140625,
-0.05206298828125,
-0.054412841796875,
0.0007257461547851562,
-0.026611328125,
-0.06719970703125,
0.038177490234375,
-0.00862884521484375,
0.003505706787109375,
-0.006343841552734375,
-0.01168060302734375,
0.00438690185546875,
0.0037403106689453125,
0.039398193359375,
0.0266571044921875,
-0.0228271484375,
-0.00897979736328125,
-0.004444122314453125,
-0.000514984130859375,
-0.013702392578125,
-0.031280517578125,
0.05303955078125,
-0.0291900634765625,
-0.0140228271484375,
-0.052459716796875,
-0.0103607177734375,
0.038055419921875,
-0.0226287841796875,
0.07550048828125,
0.0443115234375,
-0.0200653076171875,
0.01284027099609375,
-0.0384521484375,
-0.0228271484375,
-0.040130615234375,
0.0112762451171875,
-0.0201568603515625,
-0.07720947265625,
0.0623779296875,
0.021484375,
0.0257110595703125,
0.033538818359375,
0.030670166015625,
0.00824737548828125,
0.06182861328125,
0.039398193359375,
-0.0166778564453125,
0.046142578125,
-0.0333251953125,
0.01287841796875,
-0.06585693359375,
-0.030029296875,
-0.0293426513671875,
-0.037811279296875,
-0.051605224609375,
-0.0309906005859375,
0.034454345703125,
0.01549530029296875,
-0.030731201171875,
0.031829833984375,
-0.03997802734375,
0.0211944580078125,
0.040191650390625,
0.0246124267578125,
0.00965118408203125,
0.0081329345703125,
-0.0025043487548828125,
0.00839996337890625,
-0.0560302734375,
-0.0362548828125,
0.0914306640625,
0.032745361328125,
0.04742431640625,
0.01378631591796875,
0.05126953125,
-0.01294708251953125,
0.0252685546875,
-0.032928466796875,
0.03997802734375,
0.00722503662109375,
-0.040985107421875,
-0.0275726318359375,
-0.03997802734375,
-0.0911865234375,
0.0120086669921875,
-0.00383758544921875,
-0.06121826171875,
0.0171356201171875,
0.0121612548828125,
-0.037567138671875,
0.0238800048828125,
-0.047576904296875,
0.07464599609375,
-0.00624847412109375,
-0.020416259765625,
0.004825592041015625,
-0.054931640625,
0.037200927734375,
0.0157928466796875,
0.00878143310546875,
-0.0032978057861328125,
-0.005462646484375,
0.057952880859375,
-0.046234130859375,
0.0693359375,
-0.0158538818359375,
-0.0210113525390625,
0.0355224609375,
-0.010162353515625,
0.035400390625,
0.00424957275390625,
0.00035858154296875,
0.03851318359375,
-0.015289306640625,
-0.03485107421875,
-0.0289459228515625,
0.061431884765625,
-0.08477783203125,
-0.0288543701171875,
-0.0158538818359375,
-0.0225982666015625,
0.00887298583984375,
0.01195526123046875,
0.0218505859375,
0.036712646484375,
-0.0004057884216308594,
0.0182952880859375,
0.032867431640625,
-0.031402587890625,
0.0266876220703125,
0.0200958251953125,
-0.01409149169921875,
-0.038360595703125,
0.0546875,
0.0192718505859375,
0.02117919921875,
0.01306915283203125,
0.005794525146484375,
-0.022125244140625,
-0.0310211181640625,
-0.0253753662109375,
0.040313720703125,
-0.040313720703125,
-0.020355224609375,
-0.04705810546875,
-0.0186920166015625,
-0.044921875,
0.0022563934326171875,
-0.036895751953125,
-0.03179931640625,
-0.04278564453125,
-0.003604888916015625,
0.038665771484375,
0.04254150390625,
-0.0007262229919433594,
0.028839111328125,
-0.02960205078125,
-0.00405120849609375,
0.0191192626953125,
0.020843505859375,
0.010711669921875,
-0.058624267578125,
-0.01187896728515625,
0.024200439453125,
-0.04144287109375,
-0.040069580078125,
0.0286407470703125,
0.02392578125,
0.032958984375,
0.03466796875,
0.0036334991455078125,
0.07403564453125,
-0.025054931640625,
0.07244873046875,
0.00484466552734375,
-0.05316162109375,
0.04510498046875,
-0.039215087890625,
0.016082763671875,
0.0281829833984375,
0.025238037109375,
-0.035675048828125,
-0.022918701171875,
-0.058258056640625,
-0.06719970703125,
0.0792236328125,
0.031951904296875,
0.00701141357421875,
0.0014514923095703125,
0.05072021484375,
-0.006954193115234375,
0.0055084228515625,
-0.061248779296875,
-0.033447265625,
-0.022186279296875,
-0.004364013671875,
-0.01141357421875,
-0.01535797119140625,
-0.01280975341796875,
-0.02435302734375,
0.05035400390625,
-0.007259368896484375,
0.045013427734375,
0.014892578125,
0.01446533203125,
-0.005123138427734375,
-0.01020050048828125,
0.06805419921875,
0.05499267578125,
-0.0234375,
-0.0171051025390625,
0.011962890625,
-0.034515380859375,
-0.02203369140625,
0.006824493408203125,
0.0011682510375976562,
-0.0223236083984375,
0.032867431640625,
0.08343505859375,
0.00827789306640625,
-0.037384033203125,
0.04241943359375,
-0.005199432373046875,
-0.0179595947265625,
-0.02532958984375,
0.01372528076171875,
0.012939453125,
0.0269622802734375,
0.0167694091796875,
-0.0005016326904296875,
-0.01299285888671875,
-0.044952392578125,
0.0037174224853515625,
0.022064208984375,
-0.0201568603515625,
-0.0382080078125,
0.06500244140625,
0.00972747802734375,
-0.00846099853515625,
0.059326171875,
-0.0020313262939453125,
-0.032745361328125,
0.05328369140625,
0.0301666259765625,
0.0428466796875,
-0.0223541259765625,
0.0011844635009765625,
0.052642822265625,
0.01342010498046875,
-0.0093231201171875,
0.01282501220703125,
-0.00589752197265625,
-0.046142578125,
-0.020355224609375,
-0.03173828125,
-0.02386474609375,
0.018157958984375,
-0.046539306640625,
0.035125732421875,
-0.03533935546875,
-0.01439666748046875,
-0.01364898681640625,
0.0128021240234375,
-0.04876708984375,
0.00021255016326904297,
0.00662994384765625,
0.06707763671875,
-0.0526123046875,
0.06585693359375,
0.04644775390625,
-0.034088134765625,
-0.0740966796875,
-0.0157623291015625,
0.00677490234375,
-0.06866455078125,
0.032135009765625,
0.0285491943359375,
0.00615692138671875,
0.0002582073211669922,
-0.059722900390625,
-0.07244873046875,
0.11444091796875,
0.031402587890625,
-0.034088134765625,
-0.006134033203125,
-0.0056304931640625,
0.05047607421875,
-0.0270233154296875,
0.0426025390625,
0.04486083984375,
0.0302886962890625,
0.0223388671875,
-0.0841064453125,
0.01393890380859375,
-0.025634765625,
-0.0057525634765625,
0.006999969482421875,
-0.086181640625,
0.082763671875,
-0.0158538818359375,
-0.024200439453125,
0.015716552734375,
0.05706787109375,
0.033111572265625,
0.021820068359375,
0.033111572265625,
0.05157470703125,
0.05987548828125,
-0.0165252685546875,
0.08685302734375,
-0.0247955322265625,
0.031646728515625,
0.07293701171875,
0.000051140785217285156,
0.056732177734375,
0.01488494873046875,
-0.0160064697265625,
0.04437255859375,
0.07012939453125,
0.0138092041015625,
0.03778076171875,
-0.00847625732421875,
0.003376007080078125,
-0.00647735595703125,
0.0016775131225585938,
-0.0462646484375,
0.032135009765625,
0.0296783447265625,
-0.011383056640625,
-0.0169219970703125,
-0.00433349609375,
0.0251922607421875,
-0.01324462890625,
0.0006322860717773438,
0.054931640625,
0.0163726806640625,
-0.051025390625,
0.08990478515625,
0.0005192756652832031,
0.061981201171875,
-0.043701171875,
0.00916290283203125,
-0.043609619140625,
0.01245880126953125,
-0.0196075439453125,
-0.046112060546875,
0.0009207725524902344,
-0.0135498046875,
0.0183563232421875,
-0.00998687744140625,
0.0308685302734375,
-0.031951904296875,
-0.01229095458984375,
0.033905029296875,
0.0164642333984375,
0.0269622802734375,
0.00208282470703125,
-0.06060791015625,
0.0206146240234375,
0.01099395751953125,
-0.041259765625,
0.0300140380859375,
0.0352783203125,
-0.01153564453125,
0.0487060546875,
0.04888916015625,
-0.007038116455078125,
0.0023250579833984375,
-0.00946807861328125,
0.0849609375,
-0.0309906005859375,
-0.034881591796875,
-0.0623779296875,
0.03155517578125,
-0.00814056396484375,
-0.04449462890625,
0.059661865234375,
0.037139892578125,
0.07159423828125,
0.024658203125,
0.032562255859375,
-0.0181121826171875,
0.024749755859375,
-0.030914306640625,
0.055938720703125,
-0.060699462890625,
0.0246734619140625,
-0.0289306640625,
-0.07525634765625,
-0.0206756591796875,
0.058135986328125,
-0.0230560302734375,
0.0158233642578125,
0.032196044921875,
0.0748291015625,
-0.0114288330078125,
-0.0011281967163085938,
0.007297515869140625,
0.02734375,
0.04736328125,
0.06280517578125,
0.045135498046875,
-0.05072021484375,
0.051971435546875,
-0.023529052734375,
-0.0338134765625,
-0.0261688232421875,
-0.05755615234375,
-0.08331298828125,
-0.042236328125,
-0.0255126953125,
-0.0300750732421875,
0.0012216567993164062,
0.06256103515625,
0.06268310546875,
-0.05072021484375,
-0.03155517578125,
-0.0018873214721679688,
-0.0023632049560546875,
-0.0239410400390625,
-0.01418304443359375,
0.039276123046875,
-0.0057830810546875,
-0.05718994140625,
0.0288543701171875,
0.00563812255859375,
0.0214080810546875,
-0.00560760498046875,
-0.02471923828125,
-0.0167999267578125,
-0.006038665771484375,
0.039794921875,
0.041015625,
-0.051300048828125,
-0.02691650390625,
0.0018396377563476562,
-0.01425933837890625,
0.02508544921875,
0.020416259765625,
-0.047760009765625,
0.0206756591796875,
0.0240325927734375,
0.0222930908203125,
0.056976318359375,
0.01324462890625,
0.01558685302734375,
-0.05181884765625,
0.0304718017578125,
-0.003719329833984375,
0.015960693359375,
0.0202178955078125,
-0.0160980224609375,
0.061920166015625,
0.013458251953125,
-0.051605224609375,
-0.0758056640625,
-0.0109100341796875,
-0.08843994140625,
-0.00629425048828125,
0.086181640625,
-0.02392578125,
-0.0269927978515625,
0.015960693359375,
-0.0252685546875,
0.0182342529296875,
-0.051055908203125,
0.05438232421875,
0.041900634765625,
-0.012420654296875,
-0.01210784912109375,
-0.04541015625,
0.032379150390625,
0.015106201171875,
-0.060150146484375,
-0.007724761962890625,
0.0335693359375,
0.0191192626953125,
0.0166168212890625,
0.055877685546875,
-0.01343536376953125,
0.0090789794921875,
-0.0181121826171875,
0.00966644287109375,
-0.0209197998046875,
-0.0256500244140625,
-0.0139923095703125,
-0.0043487548828125,
-0.0136566162109375,
-0.0179290771484375
]
] |
indolem/indobertweet-base-uncased | 2021-09-18T01:24:17.000Z | [
"transformers",
"pytorch",
"bert",
"fill-mask",
"Twitter",
"id",
"dataset:Twitter 2021",
"arxiv:2109.04607",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | fill-mask | indolem | null | null | indolem/indobertweet-base-uncased | 9 | 7,022 | transformers | 2022-03-02T23:29:05 | ---
language:
- id
tags:
- Twitter
license: apache-2.0
datasets:
- Twitter 2021
widget:
- text: "guweehh udh ga' paham lg sm [MASK]"
---
# IndoBERTweet 🐦
## 1. Paper
Fajri Koto, Jey Han Lau, and Timothy Baldwin. [_IndoBERTweet: A Pretrained Language Model for Indonesian Twitter
with Effective Domain-Specific Vocabulary Initialization_](https://arxiv.org/pdf/2109.04607.pdf).
In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (**EMNLP 2021**), Dominican Republic (virtual).
## 2. About
[IndoBERTweet](https://github.com/indolem/IndoBERTweet) is the first large-scale pretrained model for Indonesian Twitter
that is trained by extending a monolingually trained Indonesian BERT model with additive domain-specific vocabulary.
In this paper, we show that initializing domain-specific vocabulary with average-pooling of BERT subword embeddings is more efficient than pretraining from scratch, and more effective than initializing based on word2vec projections.
## 3. Pretraining Data
We crawl Indonesian tweets over a 1-year period using the official Twitter API, from December 2019 to December 2020, with 60 keywords covering 4 main topics: economy, health, education, and government. We obtain a total of **409M word tokens**, twice the size of the training data used to pretrain [IndoBERT](https://aclanthology.org/2020.coling-main.66.pdf). Due to Twitter policy, this pretraining data will not be released to the public.
## 4. How to use
Load model and tokenizer (tested with transformers==3.5.1)
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("indolem/indobertweet-base-uncased")
model = AutoModel.from_pretrained("indolem/indobertweet-base-uncased")
```
**Preprocessing Steps:**
* lower-case all words
* convert user mentions and URLs into @USER and HTTPURL, respectively
* translate emoticons into text using the [emoji package](https://pypi.org/project/emoji/)
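These steps can be sketched in plain Python. The regexes below are illustrative assumptions (the exact preprocessing script used for pretraining is not shown here), and the emoticon step would additionally call `emoji.demojize` from the emoji package:

```python
import re

def preprocess_tweet(text: str) -> str:
    """Approximate the IndoBERTweet preprocessing: lower-casing plus
    normalization of user mentions and URLs. Illustrative only."""
    text = text.lower()
    # Replace @mentions with @USER
    text = re.sub(r"@\w+", "@USER", text)
    # Replace URLs with HTTPURL
    text = re.sub(r"https?://\S+", "HTTPURL", text)
    # Emoticon translation would use the emoji package, e.g.:
    #   text = emoji.demojize(text)
    return text

print(preprocess_tweet("Halo @JokoWi cek https://example.com Sekarang!"))
# -> halo @USER cek HTTPURL sekarang!
```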
## 5. Results over 7 Indonesian Twitter Datasets
<table>
<col>
<colgroup span="2"></colgroup>
<colgroup span="2"></colgroup>
<tr>
    <th rowspan="2">Models</th>
<th colspan="2" scope="colgroup">Sentiment</th>
<th colspan="1" scope="colgroup">Emotion</th>
<th colspan="2" scope="colgroup">Hate Speech</th>
<th colspan="2" scope="colgroup">NER</th>
<th rowspan="2" scope="colgroup">Average</th>
</tr>
<tr>
<th scope="col">IndoLEM</th>
<th scope="col">SmSA</th>
<th scope="col">EmoT</th>
<th scope="col">HS1</th>
<th scope="col">HS2</th>
<th scope="col">Formal</th>
<th scope="col">Informal</th>
</tr>
<tr>
<td scope="row">mBERT</td>
<td>76.6</td>
<td>84.7</td>
<td>67.5</td>
<td>85.1</td>
<td>75.1</td>
<td>85.2</td>
<td>83.2</td>
<td>79.6</td>
</tr>
<tr>
<td scope="row">malayBERT</td>
<td>82.0</td>
<td>84.1</td>
<td>74.2</td>
<td>85.0</td>
<td>81.9</td>
<td>81.9</td>
<td>81.3</td>
<td>81.5</td>
</tr>
<tr>
    <td scope="row">IndoBERT (Wilie et al., 2020)</td>
<td>84.1</td>
<td>88.7</td>
<td>73.3</td>
<td>86.8</td>
<td>80.4</td>
<td>86.3</td>
<td>84.3</td>
<td>83.4</td>
</tr>
<tr>
    <td scope="row">IndoBERT (Koto et al., 2020)</td>
<td>84.1</td>
<td>87.9</td>
<td>71.0</td>
<td>86.4</td>
<td>79.3</td>
<td>88.0</td>
<td><b>86.9</b></td>
<td>83.4</td>
</tr>
<tr>
<td scope="row">IndoBERTweet (1M steps from scratch)</td>
<td>86.2</td>
<td>90.4</td>
<td>76.0</td>
<td><b>88.8</b></td>
<td><b>87.5</b></td>
<td><b>88.1</b></td>
<td>85.4</td>
<td>86.1</td>
</tr>
<tr>
<td scope="row">IndoBERT + Voc adaptation + 200k steps</td>
<td><b>86.6</b></td>
<td><b>92.7</b></td>
<td><b>79.0</b></td>
<td>88.4</td>
<td>84.0</td>
<td>87.7</td>
<td><b>86.9</b></td>
<td><b>86.5</b></td>
</tr>
</table>
## Citation
If you use our work, please cite:
```bibtex
@inproceedings{koto2021indobertweet,
title={IndoBERTweet: A Pretrained Language Model for Indonesian Twitter with Effective Domain-Specific Vocabulary Initialization},
author={Fajri Koto and Jey Han Lau and Timothy Baldwin},
booktitle={Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021)},
year={2021}
}
``` | 4,377 | [
[
-0.036590576171875,
-0.044677734375,
-0.00463104248046875,
0.03631591796875,
-0.03363037109375,
0.00014412403106689453,
-0.0286865234375,
-0.02191162109375,
0.03558349609375,
0.01116180419921875,
-0.03814697265625,
-0.037261962890625,
-0.04913330078125,
0.00740814208984375,
-0.0218505859375,
0.054443359375,
-0.0128631591796875,
-0.0278472900390625,
0.00891876220703125,
-0.0216217041015625,
-0.01373291015625,
-0.032470703125,
-0.035369873046875,
-0.00977325439453125,
0.0264892578125,
0.0146331787109375,
0.0267181396484375,
0.023040771484375,
0.031280517578125,
0.0233917236328125,
0.0009164810180664062,
0.0156097412109375,
-0.01474761962890625,
-0.0024127960205078125,
0.014892578125,
-0.0082855224609375,
-0.0231781005859375,
-0.0029354095458984375,
0.0250701904296875,
0.040252685546875,
-0.0022182464599609375,
0.0238037109375,
0.0261383056640625,
0.0692138671875,
-0.057373046875,
0.0150604248046875,
-0.0271759033203125,
-0.0043182373046875,
-0.0108642578125,
-0.0241851806640625,
-0.0101776123046875,
-0.043609619140625,
0.0093536376953125,
-0.047271728515625,
-0.005718231201171875,
-0.00220489501953125,
0.087890625,
0.013153076171875,
-0.00669097900390625,
-0.01268768310546875,
-0.045806884765625,
0.06689453125,
-0.054534912109375,
0.028717041015625,
0.028472900390625,
-0.0134429931640625,
0.01311492919921875,
-0.05303955078125,
-0.06298828125,
0.004268646240234375,
-0.01006317138671875,
0.0293731689453125,
-0.01531219482421875,
-0.015625,
0.00724029541015625,
0.0035190582275390625,
-0.04351806640625,
-0.0205078125,
-0.01181793212890625,
0.00304412841796875,
0.046783447265625,
-0.0121307373046875,
0.0278778076171875,
-0.02978515625,
-0.0309295654296875,
-0.01020050048828125,
-0.02703857421875,
0.020416259765625,
0.0156402587890625,
0.035003662109375,
-0.0355224609375,
0.04083251953125,
-0.01654052734375,
0.03765869140625,
-0.0013942718505859375,
0.004993438720703125,
0.05157470703125,
-0.04486083984375,
-0.019866943359375,
-0.00052642822265625,
0.06695556640625,
0.0308685302734375,
0.042449951171875,
0.005184173583984375,
-0.013427734375,
-0.0048828125,
0.009796142578125,
-0.0775146484375,
-0.01543426513671875,
0.031951904296875,
-0.037689208984375,
-0.01506805419921875,
-0.0031890869140625,
-0.05712890625,
-0.00951385498046875,
-0.041900634765625,
0.045867919921875,
-0.052734375,
-0.03857421875,
-0.007587432861328125,
0.01303863525390625,
0.028778076171875,
0.0234832763671875,
-0.052734375,
0.003917694091796875,
0.03143310546875,
0.07611083984375,
0.0026149749755859375,
-0.0112762451171875,
0.012420654296875,
0.0034961700439453125,
-0.029876708984375,
0.058349609375,
-0.0312042236328125,
-0.0183868408203125,
0.0155792236328125,
-0.01282501220703125,
-0.022369384765625,
-0.00563812255859375,
0.03802490234375,
-0.0157318115234375,
0.0134429931640625,
-0.0201873779296875,
-0.040496826171875,
-0.000003516674041748047,
0.0171356201171875,
-0.02392578125,
0.08575439453125,
0.0033893585205078125,
-0.0885009765625,
0.045074462890625,
-0.07916259765625,
-0.0277862548828125,
-0.00087738037109375,
-0.01291656494140625,
-0.037078857421875,
-0.01142120361328125,
0.046539306640625,
0.04510498046875,
-0.01479339599609375,
-0.00914764404296875,
-0.021759033203125,
-0.01348876953125,
0.0027866363525390625,
-0.01263427734375,
0.09735107421875,
0.0283660888671875,
-0.0308837890625,
0.006107330322265625,
-0.055145263671875,
0.008941650390625,
0.024261474609375,
-0.01690673828125,
-0.034210205078125,
-0.022674560546875,
-0.000988006591796875,
0.021331787109375,
0.0213623046875,
-0.050537109375,
0.01303863525390625,
-0.045623779296875,
0.0229339599609375,
0.07037353515625,
0.0135650634765625,
0.028839111328125,
-0.0311737060546875,
0.029632568359375,
0.00604248046875,
0.0243377685546875,
-0.001735687255859375,
-0.0235443115234375,
-0.061187744140625,
-0.036651611328125,
0.0241851806640625,
0.029632568359375,
-0.05352783203125,
0.050933837890625,
-0.046966552734375,
-0.049560546875,
-0.03814697265625,
0.0056915283203125,
0.01445770263671875,
0.0268707275390625,
0.033294677734375,
-0.002384185791015625,
-0.05712890625,
-0.04876708984375,
-0.032257080078125,
-0.0256500244140625,
0.0144805908203125,
0.0274658203125,
0.05322265625,
-0.0155029296875,
0.0830078125,
-0.03338623046875,
-0.017578125,
-0.0234832763671875,
0.0137176513671875,
0.0233001708984375,
0.05487060546875,
0.0501708984375,
-0.07122802734375,
-0.07354736328125,
-0.00858306884765625,
-0.052001953125,
-0.00493621826171875,
-0.00011014938354492188,
-0.0220794677734375,
0.0286865234375,
0.0400390625,
-0.057159423828125,
0.045501708984375,
0.0276947021484375,
-0.0234527587890625,
0.036102294921875,
-0.002056121826171875,
0.01470184326171875,
-0.10748291015625,
0.01629638671875,
0.00469207763671875,
0.01027679443359375,
-0.052276611328125,
-0.01418304443359375,
0.00433349609375,
0.0104827880859375,
-0.04345703125,
0.05712890625,
-0.0384521484375,
0.03558349609375,
0.002490997314453125,
-0.00981903076171875,
-0.01261138916015625,
0.049224853515625,
-0.01324462890625,
0.05328369140625,
0.04266357421875,
-0.044586181640625,
0.005199432373046875,
0.01399993896484375,
-0.041046142578125,
0.0301971435546875,
-0.052734375,
0.0007753372192382812,
-0.015869140625,
0.0226593017578125,
-0.08319091796875,
-0.0134429931640625,
0.031707763671875,
-0.051361083984375,
0.0233917236328125,
-0.01027679443359375,
-0.061431884765625,
-0.039581298828125,
-0.040435791015625,
0.0181427001953125,
0.052337646484375,
-0.04010009765625,
0.056182861328125,
0.028594970703125,
-0.006572723388671875,
-0.04229736328125,
-0.050140380859375,
-0.0112457275390625,
-0.0218963623046875,
-0.0604248046875,
0.02587890625,
-0.01117706298828125,
0.004119873046875,
-0.004489898681640625,
-0.00788116455078125,
-0.0110015869140625,
-0.005702972412109375,
0.025787353515625,
0.01004791259765625,
0.0017147064208984375,
0.0236358642578125,
-0.00423431396484375,
0.0123138427734375,
0.00543212890625,
-0.0262908935546875,
0.056182861328125,
-0.0031337738037109375,
-0.0150604248046875,
-0.042083740234375,
0.005878448486328125,
0.034149169921875,
-0.0242919921875,
0.0799560546875,
0.06060791015625,
-0.042724609375,
-0.00838470458984375,
-0.0384521484375,
-0.010894775390625,
-0.036590576171875,
0.00531768798828125,
-0.0177459716796875,
-0.052490234375,
0.0372314453125,
-0.00794219970703125,
0.01335906982421875,
0.045867919921875,
0.05157470703125,
-0.0130157470703125,
0.06536865234375,
0.045806884765625,
0.01071929931640625,
0.05255126953125,
-0.045562744140625,
0.04327392578125,
-0.054595947265625,
-0.0242767333984375,
-0.041107177734375,
-0.0098724365234375,
-0.059722900390625,
-0.03326416015625,
0.0196533203125,
-0.003643035888671875,
-0.03167724609375,
0.0278472900390625,
-0.063232421875,
-0.0017042160034179688,
0.0275726318359375,
0.0131988525390625,
-0.01593017578125,
0.0202178955078125,
-0.02825927734375,
-0.005207061767578125,
-0.04949951171875,
-0.0345458984375,
0.08990478515625,
0.012908935546875,
0.048065185546875,
0.00855255126953125,
0.0654296875,
0.01248931884765625,
0.03167724609375,
-0.0311126708984375,
0.035247802734375,
-0.004119873046875,
-0.056396484375,
-0.0266876220703125,
-0.0259552001953125,
-0.0745849609375,
0.031768798828125,
-0.0213165283203125,
-0.0699462890625,
0.0303497314453125,
0.0157623291015625,
-0.0283203125,
0.0225677490234375,
-0.058807373046875,
0.05322265625,
-0.00702667236328125,
-0.029083251953125,
-0.0023956298828125,
-0.05029296875,
0.00919342041015625,
-0.002880096435546875,
0.0303497314453125,
-0.0167388916015625,
-0.0132293701171875,
0.07965087890625,
-0.056884765625,
0.056732177734375,
-0.0184478759765625,
0.01398468017578125,
0.040679931640625,
-0.0263824462890625,
0.045501708984375,
-0.004077911376953125,
-0.005718231201171875,
0.00954437255859375,
-0.009063720703125,
-0.038909912109375,
-0.0276947021484375,
0.0516357421875,
-0.06884765625,
-0.0255584716796875,
-0.045806884765625,
-0.00518798828125,
-0.004985809326171875,
0.0260009765625,
0.0301666259765625,
0.0225982666015625,
0.004650115966796875,
0.029144287109375,
0.041961669921875,
-0.01479339599609375,
0.043304443359375,
0.0085296630859375,
-0.00701904296875,
-0.028350830078125,
0.07403564453125,
0.0291290283203125,
0.0000584721565246582,
0.035125732421875,
0.026641845703125,
-0.031829833984375,
-0.0340576171875,
-0.0037441253662109375,
0.024627685546875,
-0.03985595703125,
-0.0143280029296875,
-0.0843505859375,
-0.01470184326171875,
-0.056732177734375,
-0.0169525146484375,
-0.03948974609375,
-0.038818359375,
-0.0176239013671875,
-0.0076751708984375,
0.046783447265625,
0.018829345703125,
-0.0220794677734375,
0.004791259765625,
-0.0267791748046875,
0.01006317138671875,
0.01236724853515625,
0.0266265869140625,
0.0172119140625,
-0.04925537109375,
-0.0167388916015625,
0.010894775390625,
-0.009674072265625,
-0.042724609375,
0.0390625,
0.0023059844970703125,
0.0237579345703125,
0.0184173583984375,
0.0233612060546875,
0.065185546875,
-0.02801513671875,
0.055328369140625,
0.032958984375,
-0.060272216796875,
0.059906005859375,
-0.0214996337890625,
0.0279998779296875,
0.043609619140625,
0.046295166015625,
-0.031524658203125,
-0.01947021484375,
-0.055877685546875,
-0.10040283203125,
0.046722412109375,
0.040985107421875,
0.019744873046875,
-0.005214691162109375,
0.005390167236328125,
-0.01242828369140625,
0.0197601318359375,
-0.04974365234375,
-0.041656494140625,
-0.0325927734375,
-0.019195556640625,
0.008026123046875,
-0.01132965087890625,
-0.00384521484375,
-0.058349609375,
0.056732177734375,
0.0091400146484375,
0.04461669921875,
0.022705078125,
-0.01137542724609375,
-0.0004127025604248047,
0.01145172119140625,
0.035919189453125,
0.062042236328125,
-0.03070068359375,
-0.00473785400390625,
0.005451202392578125,
-0.0418701171875,
-0.002475738525390625,
0.007106781005859375,
-0.023590087890625,
0.00971221923828125,
0.0309295654296875,
0.055328369140625,
-0.01055145263671875,
-0.0277862548828125,
0.051849365234375,
-0.00908660888671875,
-0.024444580078125,
-0.05926513671875,
-0.0124969482421875,
-0.00013971328735351562,
-0.0035114288330078125,
0.0361328125,
0.011932373046875,
-0.0186309814453125,
-0.023681640625,
0.0096588134765625,
0.0180206298828125,
-0.0338134765625,
-0.0232696533203125,
0.051116943359375,
0.0199737548828125,
-0.010009765625,
0.0187835693359375,
-0.009796142578125,
-0.049163818359375,
0.038421630859375,
0.0155792236328125,
0.07720947265625,
-0.00727081298828125,
0.01128387451171875,
0.059967041015625,
0.03155517578125,
0.01061248779296875,
0.03143310546875,
-0.0034084320068359375,
-0.046600341796875,
-0.019378662109375,
-0.07147216796875,
-0.0015916824340820312,
0.00994873046875,
-0.0284881591796875,
0.01259613037109375,
-0.04736328125,
-0.0186309814453125,
-0.0016679763793945312,
0.01259613037109375,
-0.07171630859375,
0.029083251953125,
-0.01174163818359375,
0.0657958984375,
-0.0584716796875,
0.05096435546875,
0.05841064453125,
-0.05352783203125,
-0.077392578125,
0.0224456787109375,
-0.01568603515625,
-0.054534912109375,
0.055999755859375,
0.0213623046875,
0.01039886474609375,
-0.00023245811462402344,
-0.056671142578125,
-0.07513427734375,
0.0732421875,
-0.0009217262268066406,
-0.0182037353515625,
0.0238037109375,
0.0137176513671875,
0.050140380859375,
-0.0284271240234375,
0.03021240234375,
0.02410888671875,
0.04058837890625,
-0.00016248226165771484,
-0.04791259765625,
0.030059814453125,
-0.03936767578125,
0.0125732421875,
0.005840301513671875,
-0.06439208984375,
0.072998046875,
0.007511138916015625,
-0.0104827880859375,
-0.0177154541015625,
0.052459716796875,
0.0161285400390625,
0.0009641647338867188,
0.04791259765625,
0.046539306640625,
0.034027099609375,
-0.004913330078125,
0.0899658203125,
-0.05316162109375,
0.046295166015625,
0.051177978515625,
0.01465606689453125,
0.0626220703125,
0.042999267578125,
-0.0266265869140625,
0.026763916015625,
0.049041748046875,
0.0237579345703125,
0.040679931640625,
0.01348114013671875,
-0.0211029052734375,
-0.0090484619140625,
-0.01468658447265625,
-0.03289794921875,
0.0145416259765625,
0.026458740234375,
-0.0145263671875,
0.0038204193115234375,
-0.0008912086486816406,
0.0171966552734375,
-0.00246429443359375,
-0.01097869873046875,
0.048675537109375,
0.0064697265625,
-0.04534912109375,
0.05377197265625,
-0.01715087890625,
0.06597900390625,
-0.0408935546875,
0.015777587890625,
-0.01528167724609375,
0.0182647705078125,
-0.0229644775390625,
-0.04736328125,
0.01169586181640625,
0.01226043701171875,
0.006954193115234375,
-0.028900146484375,
0.048095703125,
-0.01678466796875,
-0.05517578125,
0.031280517578125,
0.0207061767578125,
-0.00008815526962280273,
-0.001056671142578125,
-0.070068359375,
0.01064300537109375,
0.006801605224609375,
-0.04852294921875,
-0.004169464111328125,
0.044830322265625,
0.0122833251953125,
0.04498291015625,
0.056732177734375,
0.0120391845703125,
0.0236968994140625,
0.00885009765625,
0.0753173828125,
-0.031890869140625,
-0.04095458984375,
-0.06024169921875,
0.053955078125,
-0.0227813720703125,
-0.03265380859375,
0.058746337890625,
0.0279083251953125,
0.0672607421875,
0.0035552978515625,
0.0816650390625,
-0.04083251953125,
0.056060791015625,
-0.01245880126953125,
0.06396484375,
-0.07171630859375,
-0.0012025833129882812,
-0.040252685546875,
-0.061065673828125,
-0.0249786376953125,
0.06744384765625,
-0.0222015380859375,
0.0352783203125,
0.045654296875,
0.055999755859375,
-0.00371551513671875,
-0.01277923583984375,
-0.005939483642578125,
0.03662109375,
0.034576416015625,
0.0179443359375,
0.02703857421875,
-0.04022216796875,
0.04693603515625,
-0.04296875,
-0.0193634033203125,
-0.0350341796875,
-0.04931640625,
-0.06903076171875,
-0.05828857421875,
-0.033477783203125,
-0.037994384765625,
0.01377105712890625,
0.096923828125,
0.042816162109375,
-0.0667724609375,
-0.0084991455078125,
-0.024871826171875,
0.0018358230590820312,
-0.0099334716796875,
-0.0179595947265625,
0.0615234375,
-0.0211944580078125,
-0.048858642578125,
-0.0273895263671875,
0.0194854736328125,
0.0018968582153320312,
0.006122589111328125,
-0.01116180419921875,
-0.0526123046875,
0.0260009765625,
0.047698974609375,
0.02191162109375,
-0.04833984375,
-0.0268096923828125,
-0.01358795166015625,
-0.0208282470703125,
0.01352691650390625,
0.0276947021484375,
-0.03070068359375,
0.037750244140625,
0.0517578125,
0.04901123046875,
0.04608154296875,
0.0002390146255493164,
0.00759124755859375,
-0.04937744140625,
0.00913238525390625,
0.0190582275390625,
0.0216522216796875,
0.021636962890625,
-0.0035152435302734375,
0.054229736328125,
0.0262298583984375,
-0.035003662109375,
-0.0614013671875,
-0.00885009765625,
-0.07720947265625,
-0.01407623291015625,
0.08740234375,
-0.002429962158203125,
-0.032470703125,
0.00562286376953125,
-0.0090789794921875,
0.04559326171875,
-0.0521240234375,
0.054107666015625,
0.04071044921875,
-0.01300048828125,
-0.0014543533325195312,
-0.040252685546875,
0.041656494140625,
0.03985595703125,
-0.054534912109375,
-0.003021240234375,
-0.010498046875,
0.0159759521484375,
0.014739990234375,
0.07080078125,
-0.0027980804443359375,
0.01239013671875,
-0.00843048095703125,
0.019378662109375,
0.007171630859375,
-0.0028839111328125,
-0.003173828125,
-0.003513336181640625,
-0.0207061767578125,
-0.0251922607421875
]
] |
KoboldAI/GPT-J-6B-Skein | 2022-11-14T18:35:26.000Z | [
"transformers",
"pytorch",
"gptj",
"text-generation",
"arxiv:1910.09700",
"endpoints_compatible",
"has_space",
"region:us"
] | text-generation | KoboldAI | null | null | KoboldAI/GPT-J-6B-Skein | 12 | 7,018 | transformers | 2022-03-02T23:29:04 | ---
tags:
- text-generation
---
# Model Card for GPT-J-6B-Skein
# Model Details
## Model Description
- **Developed by:** KoboldAI
- **Shared by [Optional]:** KoboldAI
- **Model type:** Text Generation
- **Language(s) (NLP):** English
- **License:** Apache License 2.0
- **Related Models:** [GPT-J 6B](https://huggingface.co/EleutherAI/gpt-j-6B?text=My+name+is+Mariama%2C+my+favorite)
- **Parent Model:** GPT-J
- **Resources for more information:**
- [GitHub Repo](https://github.com/kingoflolz/mesh-transformer-jax)
- [Associated Model Doc](https://huggingface.co/docs/transformers/main/en/model_doc/gptj#transformers.GPTJForCausalLM)
# Uses
## Direct Use
This model is designed for creative story generation. It can understand both free-form text and text written in interactive fiction style with actions starting with "> You", such as:
```
You become aware of her breathing -- the slight expansion of her ribs, the soft exhalation -- natural, and yet somehow studied. "Ah -- by the way," she says, in a way that utterly fails to be casual, "have you seen the artist out there? -- My artist, that is."
"No," you respond, uneasy. You open your mouth and close it again.
> You ask about the experience of waking up
```
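As a sketch of how such interactive-fiction prompts can be assembled before being passed to the model (the helper name and formatting below are illustrative assumptions, not part of the model's API):

```python
def build_skein_prompt(story_so_far: str, action: str) -> str:
    """Append a player action in the '> You' interactive-fiction style
    that this model was trained on. Illustrative helper only."""
    action = action.strip()
    # Actions are conventionally phrased as "> You <verb> ..."
    if not action.lower().startswith("you "):
        action = "You " + action
    return f"{story_so_far.rstrip()}\n\n> {action}\n"

prompt = build_skein_prompt(
    '"No," you respond, uneasy. You open your mouth and close it again.',
    "ask about the experience of waking up",
)
print(prompt)
```

The resulting string can then be tokenized and passed to the model's `generate` method as usual.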
## Downstream Use [Optional]
More information needed
## Out-of-Scope Use
The model should not be used to intentionally create hostile or alienating environments for people.
# Bias, Risks, and Limitations
The core functionality of GPT-J is taking a string of text and predicting the next token. While language models are widely used for tasks other than this, there are a lot of unknowns with this work. When prompting GPT-J it is important to remember that the statistically most likely next token is often not the token that produces the most "accurate" text. Never depend upon GPT-J to produce factually accurate output.
GPT-J was trained on the Pile, a dataset known to contain profanity, lewd, and otherwise abrasive language. Depending upon use case GPT-J may produce socially unacceptable text. See Sections 5 and 6 of the Pile paper for a more detailed analysis of the biases in the Pile.
As with all language models, it is hard to predict in advance how GPT-J will respond to particular prompts and offensive content may occur without warning. We recommend having a human curate or filter the outputs before releasing them, both to censor undesirable content and to improve the quality of the results.
See the [GPT-J 6B model card](https://huggingface.co/EleutherAI/gpt-j-6B?text=My+name+is+Mariama%2C+my+favorite) for more information.
## Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
# Training Details
## Training Data
The data are mostly composed of light novels from the dataset of the [KoboldAI/GPT-Neo-2.7B-Horni-LN](https://huggingface.co/KoboldAI/GPT-Neo-2.7B-Horni-LN) model and assorted interactive fiction. The dataset uses `[Themes: <comma-separated list of genres>]` for tagging, which means that if similar text is placed in the context, the model will attempt to generate text in the specified style(s). For more details about the dataset, consult [this document](https://wandb.ai/ve-forbryderne/skein/runs/files/files/datasets/README.txt).
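For example, a themed context can be prepended to a prompt like this (a minimal sketch; the tag format is taken from the dataset description above, while the helper itself is hypothetical):

```python
def add_theme_tag(genres, text):
    """Prepend a [Themes: ...] tag so the model attempts to generate
    in the given style(s). Illustrative helper only."""
    tag = "[Themes: " + ", ".join(genres) + "]"
    return tag + "\n" + text

print(add_theme_tag(["fantasy", "mystery"], "The old castle gate creaked open."))
# -> [Themes: fantasy, mystery]
#    The old castle gate creaked open.
```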
## Training Procedure
### Preprocessing
The data were preprocessed using the Python package ftfy to remove, as far as possible, non-ASCII punctuation characters and encoding errors. The interactive fiction in the dataset also underwent deduplication, since interactive fiction logs often contain duplicate text from, for example, visiting the same in-game area several times. spaCy was used for grammatical analysis to reformat the actions commonly found in old text adventure games into more complete sentences. There was also some manual removal of artifacts such as "thank you for playing" messages and title messages.
### Speeds, Sizes, Times
Training took approximately 14 hours in total, at an average speed of 5265 tokens per second.
# Evaluation
## Testing Data, Factors & Metrics
### Testing Data
More information needed
### Factors
More information needed
### Metrics
More information needed
## Results
More information needed
# Model Examination
More information needed
# Environmental Impact
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** More information needed
- **Hours used:** More information needed
- **Cloud Provider:** More information needed
- **Compute Region:** More information needed
- **Carbon Emitted:** More information needed
# Technical Specifications [optional]
## Model Architecture and Objective
More information needed
## Compute Infrastructure
More information needed
### Hardware
More information needed
### Software
https://github.com/kingoflolz/mesh-transformer-jax
# Citation
**BibTeX:**
```
@misc{mesh-transformer-jax,
author = {Wang, Ben},
title = {{Mesh-Transformer-JAX: Model-Parallel Implementation of Transformer Language Model with JAX}},
howpublished = {\url{https://github.com/kingoflolz/mesh-transformer-jax}},
year = 2021,
month = May
}
```
# Glossary [optional]
More information needed
# More Information [optional]
More information needed
# Model Card Authors [optional]
KoboldAI in collaboration with Ezi Ozoani and the Hugging Face team
# Model Card Contact
More information needed
# How to Get Started with the Model
Use the code below to get started with the model.
<details>
<summary> Click to expand </summary>
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
tokenizer = AutoTokenizer.from_pretrained("KoboldAI/GPT-J-6B-Skein")
model = AutoModelForCausalLM.from_pretrained("KoboldAI/GPT-J-6B-Skein")
```
</details>
| 6,054 | [
[
-0.02764892578125,
-0.060272216796875,
0.03363037109375,
0.0062408447265625,
-0.026092529296875,
-0.0185546875,
-0.014984130859375,
-0.040191650390625,
-0.004734039306640625,
0.041229248046875,
-0.05023193359375,
-0.0350341796875,
-0.050750732421875,
0.01187896728515625,
-0.0172119140625,
0.09912109375,
-0.0006365776062011719,
-0.0173187255859375,
-0.0025463104248046875,
0.007904052734375,
-0.03314208984375,
-0.0298309326171875,
-0.057220458984375,
-0.02264404296875,
0.03656005859375,
-0.0085906982421875,
0.0592041015625,
0.047576904296875,
0.0296630859375,
0.027374267578125,
-0.01361083984375,
-0.0098876953125,
-0.0421142578125,
-0.01457977294921875,
-0.0116729736328125,
-0.03167724609375,
-0.0267333984375,
0.001186370849609375,
0.043914794921875,
0.0257568359375,
0.0015554428100585938,
0.0229644775390625,
-0.00501251220703125,
0.0303192138671875,
-0.020751953125,
0.0318603515625,
-0.036895751953125,
-0.0038013458251953125,
-0.0237579345703125,
0.01404571533203125,
-0.03363037109375,
-0.0118255615234375,
0.00876617431640625,
-0.049957275390625,
0.034637451171875,
0.00739288330078125,
0.08868408203125,
0.016265869140625,
-0.02862548828125,
-0.026641845703125,
-0.058868408203125,
0.061431884765625,
-0.060638427734375,
0.0230712890625,
0.0131072998046875,
0.01520538330078125,
-0.0008912086486816406,
-0.06353759765625,
-0.06011962890625,
-0.0189971923828125,
-0.0272216796875,
0.033660888671875,
-0.00428009033203125,
0.00235748291015625,
0.046051025390625,
0.034698486328125,
-0.058013916015625,
-0.01451873779296875,
-0.02874755859375,
-0.00981903076171875,
0.05255126953125,
0.020904541015625,
0.037689208984375,
-0.035003662109375,
-0.032470703125,
-0.03631591796875,
-0.0272369384765625,
0.01116943359375,
0.053741455078125,
0.0225982666015625,
-0.0284271240234375,
0.03997802734375,
-0.0005750656127929688,
0.03155517578125,
0.002105712890625,
-0.0189971923828125,
0.0258636474609375,
-0.035491943359375,
-0.01922607421875,
-0.0174713134765625,
0.0775146484375,
0.017242431640625,
0.0082244873046875,
-0.0096282958984375,
-0.0084686279296875,
0.0167083740234375,
0.0308074951171875,
-0.07061767578125,
-0.024017333984375,
0.013702392578125,
-0.042083740234375,
-0.0338134765625,
0.005138397216796875,
-0.05596923828125,
-0.011505126953125,
-0.020050048828125,
0.0335693359375,
-0.0256805419921875,
-0.044097900390625,
0.00927734375,
-0.022369384765625,
0.0131378173828125,
0.00966644287109375,
-0.067138671875,
0.00954437255859375,
0.01953125,
0.051910400390625,
-0.00804901123046875,
-0.01171112060546875,
0.0172271728515625,
0.00959014892578125,
-0.004119873046875,
0.046783447265625,
-0.025787353515625,
-0.04296875,
-0.0175323486328125,
0.0262603759765625,
-0.0190887451171875,
-0.0081329345703125,
0.0594482421875,
-0.0235748291015625,
0.049102783203125,
-0.01422882080078125,
-0.035491943359375,
-0.0266571044921875,
0.00290679931640625,
-0.054443359375,
0.08026123046875,
0.02825927734375,
-0.07666015625,
0.01515960693359375,
-0.05401611328125,
-0.01520538330078125,
0.0069122314453125,
-0.0020885467529296875,
-0.05059814453125,
-0.0003120899200439453,
0.0164031982421875,
0.021759033203125,
-0.0304718017578125,
0.0307769775390625,
-0.00827789306640625,
-0.022247314453125,
-0.0026264190673828125,
-0.0299072265625,
0.08160400390625,
0.0235595703125,
-0.041259765625,
-0.007129669189453125,
-0.04534912109375,
-0.00235748291015625,
0.044525146484375,
-0.0044097900390625,
-0.0110931396484375,
-0.0176849365234375,
0.025909423828125,
0.017486572265625,
0.01678466796875,
-0.035247802734375,
0.02508544921875,
-0.034027099609375,
0.020965576171875,
0.04742431640625,
-0.0032291412353515625,
0.0261688232421875,
-0.042388916015625,
0.0418701171875,
-0.0035037994384765625,
0.0257568359375,
-0.00841522216796875,
-0.05633544921875,
-0.06500244140625,
-0.004016876220703125,
0.0228424072265625,
0.042022705078125,
-0.06781005859375,
0.037017822265625,
-0.0187225341796875,
-0.07171630859375,
-0.0311126708984375,
-0.01007843017578125,
0.038848876953125,
0.03240966796875,
0.029205322265625,
-0.0093841552734375,
-0.032806396484375,
-0.060394287109375,
-0.0225982666015625,
-0.0197296142578125,
-0.0024890899658203125,
0.0279083251953125,
0.047271728515625,
-0.0325927734375,
0.0633544921875,
-0.037872314453125,
-0.017333984375,
-0.0338134765625,
0.01512908935546875,
0.034698486328125,
0.04180908203125,
0.053131103515625,
-0.06854248046875,
-0.039215087890625,
-0.0049285888671875,
-0.0528564453125,
-0.012054443359375,
-0.0203704833984375,
-0.01244354248046875,
0.02044677734375,
0.012176513671875,
-0.06365966796875,
0.0310211181640625,
0.03802490234375,
-0.03717041015625,
0.05023193359375,
-0.0056304931640625,
0.006435394287109375,
-0.0963134765625,
0.017913818359375,
-0.004741668701171875,
-0.0032367706298828125,
-0.04632568359375,
0.00311279296875,
-0.01132965087890625,
0.0014772415161132812,
-0.03131103515625,
0.057220458984375,
-0.035980224609375,
0.0189361572265625,
-0.0167388916015625,
-0.0052337646484375,
0.005702972412109375,
0.052947998046875,
0.008636474609375,
0.053192138671875,
0.0267333984375,
-0.036346435546875,
0.0188446044921875,
0.027252197265625,
-0.01332855224609375,
0.01837158203125,
-0.0552978515625,
0.016754150390625,
-0.01053619384765625,
0.02081298828125,
-0.0654296875,
-0.0175628662109375,
0.035430908203125,
-0.03680419921875,
0.0318603515625,
-0.02105712890625,
-0.04693603515625,
-0.046173095703125,
0.0062713623046875,
0.031707763671875,
0.055908203125,
-0.03515625,
0.06292724609375,
0.0233917236328125,
-0.01551055908203125,
-0.0321044921875,
-0.03802490234375,
-0.01922607421875,
-0.0263519287109375,
-0.039794921875,
0.032989501953125,
-0.019683837890625,
0.00559234619140625,
-0.003322601318359375,
0.0118560791015625,
0.0026874542236328125,
-0.00583648681640625,
0.02105712890625,
0.025390625,
-0.0009751319885253906,
-0.000047087669372558594,
0.01007080078125,
-0.0219879150390625,
0.01210784912109375,
-0.0088958740234375,
0.046539306640625,
-0.0174102783203125,
-0.00460052490234375,
-0.032745361328125,
0.019134521484375,
0.034454345703125,
0.0113983154296875,
0.05633544921875,
0.0765380859375,
-0.0248870849609375,
0.0032672882080078125,
-0.0433349609375,
-0.002376556396484375,
-0.0379638671875,
0.0308074951171875,
-0.0311279296875,
-0.05450439453125,
0.042877197265625,
0.0088958740234375,
0.00618743896484375,
0.0712890625,
0.053009033203125,
0.00450897216796875,
0.087158203125,
0.0560302734375,
-0.00862884521484375,
0.035919189453125,
-0.035675048828125,
0.01512908935546875,
-0.0535888671875,
-0.0128173828125,
-0.043792724609375,
-0.00980377197265625,
-0.06298828125,
-0.02392578125,
0.0153656005859375,
0.0023059844970703125,
-0.0352783203125,
0.047760009765625,
-0.05841064453125,
0.0279083251953125,
0.03167724609375,
-0.00492095947265625,
0.020294189453125,
-0.005390167236328125,
-0.010650634765625,
-0.0073394775390625,
-0.04931640625,
-0.035430908203125,
0.07391357421875,
0.0389404296875,
0.042816162109375,
0.0083160400390625,
0.045440673828125,
0.004207611083984375,
0.0284271240234375,
-0.0279083251953125,
0.0263519287109375,
-0.007137298583984375,
-0.0645751953125,
-0.016082763671875,
-0.0382080078125,
-0.06683349609375,
0.01515960693359375,
-0.01548004150390625,
-0.057525634765625,
0.0103759765625,
-0.0043487548828125,
-0.019012451171875,
0.0304718017578125,
-0.06256103515625,
0.078857421875,
-0.01297760009765625,
-0.00856781005859375,
0.01068878173828125,
-0.04864501953125,
0.035491943359375,
0.002834320068359375,
0.0120086669921875,
-0.00228118896484375,
0.00684356689453125,
0.054443359375,
-0.02325439453125,
0.06536865234375,
-0.0213775634765625,
-0.0016069412231445312,
0.0286102294921875,
-0.016693115234375,
0.0465087890625,
0.006137847900390625,
0.0106658935546875,
0.0335693359375,
0.00424957275390625,
-0.02362060546875,
-0.03460693359375,
0.0435791015625,
-0.08184814453125,
-0.0263519287109375,
-0.044403076171875,
-0.0482177734375,
0.00585174560546875,
0.0294952392578125,
0.0523681640625,
0.0217132568359375,
-0.00537872314453125,
0.0022907257080078125,
0.04345703125,
-0.026641845703125,
0.032318115234375,
0.0258636474609375,
-0.028839111328125,
-0.040130615234375,
0.06829833984375,
0.00841522216796875,
0.015838623046875,
0.00795745849609375,
0.0239105224609375,
-0.040435791015625,
-0.031463623046875,
-0.044525146484375,
0.03802490234375,
-0.03521728515625,
-0.0149383544921875,
-0.07843017578125,
-0.03387451171875,
-0.0362548828125,
-0.002796173095703125,
-0.0239105224609375,
-0.0168914794921875,
-0.0202178955078125,
-0.0014429092407226562,
0.0345458984375,
0.0599365234375,
0.026641845703125,
0.040435791015625,
-0.049774169921875,
0.0273895263671875,
0.008880615234375,
0.0277099609375,
-0.004497528076171875,
-0.052947998046875,
-0.0153656005859375,
0.01200103759765625,
-0.0264892578125,
-0.05218505859375,
0.03662109375,
-0.0016307830810546875,
0.0345458984375,
0.01242828369140625,
0.0066070556640625,
0.042022705078125,
-0.0279541015625,
0.07733154296875,
0.0030975341796875,
-0.056915283203125,
0.034759521484375,
-0.055419921875,
0.0345458984375,
0.0293426513671875,
0.035308837890625,
-0.038421630859375,
-0.04193115234375,
-0.0831298828125,
-0.0648193359375,
0.05902099609375,
0.045196533203125,
0.017333984375,
0.0028591156005859375,
0.0263519287109375,
0.0089569091796875,
0.01372528076171875,
-0.08856201171875,
-0.0288543701171875,
-0.0267486572265625,
-0.0184478759765625,
-0.006679534912109375,
-0.01422882080078125,
0.00511932373046875,
-0.02764892578125,
0.075439453125,
-0.0045928955078125,
0.0543212890625,
0.011810302734375,
-0.0164031982421875,
0.004154205322265625,
-0.0030193328857421875,
0.04254150390625,
0.020538330078125,
-0.01305389404296875,
-0.024932861328125,
0.0163421630859375,
-0.06170654296875,
-0.003910064697265625,
0.0204315185546875,
-0.034423828125,
0.00899505615234375,
0.0172119140625,
0.07769775390625,
-0.00450897216796875,
-0.0293426513671875,
0.044189453125,
-0.01763916015625,
-0.0194244384765625,
-0.038970947265625,
0.016510009765625,
0.0111236572265625,
0.01354217529296875,
0.00804901123046875,
0.0001633167266845703,
0.0121002197265625,
-0.022247314453125,
0.00005537271499633789,
0.0184326171875,
-0.0175018310546875,
-0.0177459716796875,
0.06414794921875,
0.00592041015625,
-0.007289886474609375,
0.049835205078125,
-0.025238037109375,
-0.0301666259765625,
0.0516357421875,
0.05987548828125,
0.0755615234375,
-0.0085906982421875,
0.0121612548828125,
0.0550537109375,
0.034912109375,
-0.0046539306640625,
0.006374359130859375,
0.01178741455078125,
-0.051177978515625,
-0.026275634765625,
-0.06292724609375,
-0.0158843994140625,
0.021759033203125,
-0.038604736328125,
0.034149169921875,
-0.033477783203125,
-0.0175323486328125,
-0.0187225341796875,
0.0168609619140625,
-0.06793212890625,
0.0200042724609375,
0.005947113037109375,
0.051422119140625,
-0.08001708984375,
0.05908203125,
0.041107177734375,
-0.057861328125,
-0.0760498046875,
-0.01421356201171875,
-0.00664520263671875,
-0.03997802734375,
0.02166748046875,
0.0258941650390625,
0.017578125,
0.020477294921875,
-0.0382080078125,
-0.06982421875,
0.0963134765625,
0.009307861328125,
-0.039276123046875,
-0.0189666748046875,
0.0166168212890625,
0.0516357421875,
-0.0274658203125,
0.06390380859375,
0.043731689453125,
0.046600341796875,
-0.0073394775390625,
-0.059814453125,
0.010833740234375,
-0.042236328125,
0.0261688232421875,
0.0123291015625,
-0.06427001953125,
0.06427001953125,
0.005550384521484375,
-0.0245208740234375,
0.0184783935546875,
0.046356201171875,
0.006137847900390625,
0.011688232421875,
0.049835205078125,
0.055389404296875,
0.0455322265625,
-0.02264404296875,
0.096435546875,
-0.0166473388671875,
0.0472412109375,
0.07147216796875,
0.006504058837890625,
0.034759521484375,
0.0023670196533203125,
-0.020294189453125,
0.0482177734375,
0.042755126953125,
-0.025787353515625,
0.028106689453125,
0.00033164024353027344,
-0.01070404052734375,
-0.003208160400390625,
-0.00806427001953125,
-0.044097900390625,
0.02166748046875,
0.020172119140625,
-0.03460693359375,
-0.0087738037109375,
0.01214599609375,
0.01284027099609375,
-0.0117034912109375,
0.00031256675720214844,
0.06158447265625,
-0.001407623291015625,
-0.03912353515625,
0.038543701171875,
-0.00513458251953125,
0.0555419921875,
-0.054229736328125,
0.00469970703125,
-0.00247955322265625,
0.00632476806640625,
-0.002964019775390625,
-0.039947509765625,
0.0249786376953125,
-0.00315093994140625,
-0.037384033203125,
-0.02520751953125,
0.057647705078125,
-0.0389404296875,
-0.060333251953125,
0.0247802734375,
0.0281829833984375,
0.022674560546875,
-0.00591278076171875,
-0.0792236328125,
0.01195526123046875,
0.009002685546875,
-0.032928466796875,
0.020233154296875,
0.0210723876953125,
-0.0009083747863769531,
0.034027099609375,
0.060791015625,
-0.00377655029296875,
-0.0081939697265625,
0.0142669677734375,
0.061767578125,
-0.049530029296875,
-0.039520263671875,
-0.04327392578125,
0.060821533203125,
-0.01425933837890625,
-0.03521728515625,
0.057220458984375,
0.03729248046875,
0.07666015625,
-0.0216827392578125,
0.0721435546875,
-0.0257110595703125,
0.042694091796875,
-0.033416748046875,
0.07135009765625,
-0.030975341796875,
-0.0033435821533203125,
-0.047821044921875,
-0.0889892578125,
-0.005069732666015625,
0.05047607421875,
-0.019287109375,
0.029571533203125,
0.05914306640625,
0.06439208984375,
0.0007295608520507812,
0.0152435302734375,
0.0186309814453125,
0.0284576416015625,
0.029815673828125,
0.041412353515625,
0.0504150390625,
-0.06884765625,
0.039581298828125,
-0.0384521484375,
-0.01415252685546875,
-0.0017881393432617188,
-0.0716552734375,
-0.0806884765625,
-0.047088623046875,
-0.0260162353515625,
-0.03289794921875,
-0.00836944580078125,
0.053741455078125,
0.029083251953125,
-0.05401611328125,
-0.0042877197265625,
-0.01442718505859375,
-0.0016756057739257812,
-0.0183563232421875,
-0.02142333984375,
0.028594970703125,
-0.024383544921875,
-0.075439453125,
0.0009527206420898438,
-0.0099029541015625,
0.020477294921875,
-0.015350341796875,
-0.0026149749755859375,
-0.01059722900390625,
-0.005214691162109375,
0.03057861328125,
0.006259918212890625,
-0.053314208984375,
-0.019805908203125,
-0.0009489059448242188,
-0.0157928466796875,
-0.0098419189453125,
0.042572021484375,
-0.048065185546875,
0.0308990478515625,
0.038787841796875,
0.037567138671875,
0.04388427734375,
-0.0041351318359375,
0.0362548828125,
-0.032867431640625,
0.01395416259765625,
0.01617431640625,
0.0280609130859375,
0.01410675048828125,
-0.050750732421875,
0.044189453125,
0.044647216796875,
-0.0560302734375,
-0.04791259765625,
0.007415771484375,
-0.08099365234375,
-0.00731658935546875,
0.091552734375,
0.00013005733489990234,
-0.031494140625,
-0.01059722900390625,
-0.03680419921875,
0.028167724609375,
-0.01629638671875,
0.04449462890625,
0.06341552734375,
0.013519287109375,
-0.014373779296875,
-0.05462646484375,
0.04376220703125,
0.0186004638671875,
-0.052825927734375,
-0.0020160675048828125,
0.02606201171875,
0.025909423828125,
0.0270843505859375,
0.047149658203125,
-0.0176239013671875,
0.01079559326171875,
0.015960693359375,
0.0282135009765625,
-0.010101318359375,
-0.0189056396484375,
-0.01226043701171875,
-0.0088653564453125,
-0.0227813720703125,
0.01430511474609375
]
] |
posicube/Llama2-chat-AYB-13B | 2023-10-06T08:40:15.000Z | [
"transformers",
"safetensors",
"llama",
"text-generation",
"arxiv:2306.02707",
"license:llama2",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text-generation | posicube | null | null | posicube/Llama2-chat-AYB-13B | 12 | 7,017 | transformers | 2023-10-03T03:06:14 | ---
license: llama2
---
This model is derived from Llama-2-13b-chat-hf. We hypothesize that if the top-ranking models on each benchmark can be ensembled effectively, the resulting model's performance is maximized as well. Following this intuition, we ensembled the top models on each benchmark to create our model.
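The card does not state how the ensembling was performed, so the sketch below is only one plausible interpretation: uniform parameter averaging of same-architecture fine-tunes (a "model soup"). Plain dicts of lists stand in for real state dicts here.

```python
# Illustrative sketch only: the card does not say how the ensembling was done.
# One common way to combine fine-tunes of the same base model is uniform
# parameter averaging (a "model soup").

def average_checkpoints(checkpoints):
    """Uniformly average state dicts that share identical parameter shapes."""
    assert checkpoints, "need at least one checkpoint"
    n = len(checkpoints)
    merged = {}
    for name in checkpoints[0]:
        columns = zip(*(ckpt[name] for ckpt in checkpoints))
        merged[name] = [sum(vals) / n for vals in columns]
    return merged

# Two toy "fine-tunes" of a one-layer model (values chosen to average exactly):
model_a = {"layer.weight": [0.25, 0.5], "layer.bias": [0.0, 1.0]}
model_b = {"layer.weight": [0.75, 0.0], "layer.bias": [0.5, 1.0]}
soup = average_checkpoints([model_a, model_b])
print(soup)  # {'layer.weight': [0.5, 0.25], 'layer.bias': [0.25, 1.0]}
```

In practice the same loop would run over real checkpoints' tensors; this only shows the averaging step, not whatever method the authors actually used.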
# Model Details
- **Developed by**: Posicube Inc.
- **Backbone Model**: LLaMA-2-13b-chat
- **Library**: HuggingFace Transformers
- **Used Datasets**: Orca-style datasets, Alpaca-style datasets
# Evaluation
This model was the top-ranked 13B model on the leaderboard as of October 3rd, 2023.
| Metric |Scores on Leaderboard| Our results |
|---------------------|---------------------|-------------|
| ARC (25-shot) | 63.4 | 63.48 |
| HellaSwag (10-shot) | 84.79 | 84.87 |
| MMLU (5-shot) | 59.34 | 59.59 |
| TruthfulQA (0-shot) | 55.62 | 55.22 |
| Avg. | 65.79 | 65.78 |
# Limitations & Biases:
Llama 2 and its fine-tuned variants are a new technology that carries risks with use. Testing conducted to date has been in English and has not covered, nor could it cover, all scenarios. For these reasons, as with all LLMs, the potential outputs of Llama 2 and any fine-tuned variant cannot be predicted in advance, and the model may in some instances produce inaccurate, biased, or otherwise objectionable responses to user prompts. Therefore, before deploying any application of a Llama 2 variant, developers should perform safety testing and tuning tailored to their specific application of the model.
Please see the Responsible Use Guide available at https://ai.meta.com/llama/responsible-use-guide/
# License Disclaimer:
This model is bound by the license and usage restrictions of the original Llama-2 model, and comes with no warranty or guarantees of any kind.
# Contact Us
[Posicube](https://www.posicube.com/)
# Citation:
Please kindly cite using the following BibTeX:
```bibtex
@misc{mukherjee2023orca,
title={Orca: Progressive Learning from Complex Explanation Traces of GPT-4},
author={Subhabrata Mukherjee and Arindam Mitra and Ganesh Jawahar and Sahaj Agarwal and Hamid Palangi and Ahmed Awadallah},
year={2023},
eprint={2306.02707},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
```bibtex
@software{touvron2023llama2,
title={Llama 2: Open Foundation and Fine-Tuned Chat Models},
author={Hugo Touvron, Louis Martin, Kevin Stone, Peter Albert, Amjad Almahairi, Yasmine Babaei, Nikolay Bashlykov, Soumya Batra, Prajjwal Bhargava,
Shruti Bhosale, Dan Bikel, Lukas Blecher, Cristian Canton Ferrer, Moya Chen, Guillem Cucurull, David Esiobu, Jude Fernandes, Jeremy Fu, Wenyin Fu, Brian Fuller,
Cynthia Gao, Vedanuj Goswami, Naman Goyal, Anthony Hartshorn, Saghar Hosseini, Rui Hou, Hakan Inan, Marcin Kardas, Viktor Kerkez, Madian Khabsa, Isabel Kloumann,
Artem Korenev, Punit Singh Koura, Marie-Anne Lachaux, Thibaut Lavril, Jenya Lee, Diana Liskovich, Yinghai Lu, Yuning Mao, Xavier Martinet, Todor Mihaylov,
Pushkar Mishra, Igor Molybog, Yixin Nie, Andrew Poulton, Jeremy Reizenstein, Rashi Rungta, Kalyan Saladi, Alan Schelten, Ruan Silva, Eric Michael Smith,
Ranjan Subramanian, Xiaoqing Ellen Tan, Binh Tang, Ross Taylor, Adina Williams, Jian Xiang Kuan, Puxin Xu, Zheng Yan, Iliyan Zarov, Yuchen Zhang, Angela Fan,
Melanie Kambadur, Sharan Narang, Aurelien Rodriguez, Robert Stojnic, Sergey Edunov, Thomas Scialom},
year={2023}
}
``` | 3,434 | [
[
-0.027130126953125,
-0.057159423828125,
0.017730712890625,
0.01702880859375,
-0.0193634033203125,
0.0175323486328125,
0.0017328262329101562,
-0.052703857421875,
0.01271820068359375,
0.019500732421875,
-0.05487060546875,
-0.041046142578125,
-0.052337646484375,
-0.0188140869140625,
-0.0293121337890625,
0.07940673828125,
0.004573822021484375,
-0.0298614501953125,
0.0021610260009765625,
-0.0016803741455078125,
-0.042205810546875,
-0.0222930908203125,
-0.052520751953125,
-0.0215606689453125,
0.0193023681640625,
0.0263519287109375,
0.050994873046875,
0.0421142578125,
0.045257568359375,
0.0196685791015625,
-0.035064697265625,
0.0168609619140625,
-0.04925537109375,
-0.01003265380859375,
0.01132965087890625,
-0.0386962890625,
-0.0723876953125,
-0.0003781318664550781,
0.034027099609375,
0.0321044921875,
-0.029327392578125,
0.024139404296875,
0.0162811279296875,
0.03521728515625,
-0.035064697265625,
0.0107421875,
-0.0321044921875,
0.0006856918334960938,
-0.0303802490234375,
-0.0039520263671875,
-0.01293182373046875,
-0.0200653076171875,
0.0074615478515625,
-0.04815673828125,
-0.005649566650390625,
-0.0004341602325439453,
0.09033203125,
0.0260162353515625,
-0.03448486328125,
-0.000701904296875,
-0.046295166015625,
0.0562744140625,
-0.06658935546875,
0.0191192626953125,
0.01152801513671875,
0.038238525390625,
-0.0308837890625,
-0.067138671875,
-0.041229248046875,
-0.00019407272338867188,
0.0053863525390625,
0.0222320556640625,
-0.029205322265625,
-0.0109710693359375,
0.00173187255859375,
0.0328369140625,
-0.036651611328125,
0.033203125,
-0.034881591796875,
-0.01122283935546875,
0.0517578125,
0.017303466796875,
0.004627227783203125,
0.0027332305908203125,
-0.045989990234375,
-0.011688232421875,
-0.0653076171875,
0.03192138671875,
0.037872314453125,
0.005611419677734375,
-0.039398193359375,
0.039398193359375,
-0.01476287841796875,
0.0301513671875,
0.009552001953125,
-0.030120849609375,
0.040069580078125,
-0.029083251953125,
-0.0232696533203125,
-0.01015472412109375,
0.0560302734375,
0.033721923828125,
0.00574493408203125,
0.013336181640625,
0.0006289482116699219,
0.005611419677734375,
0.0018987655639648438,
-0.06488037109375,
-0.0108795166015625,
0.0251922607421875,
-0.03399658203125,
-0.03521728515625,
-0.006084442138671875,
-0.07135009765625,
-0.018280029296875,
-0.00617218017578125,
0.018798828125,
-0.0104522705078125,
-0.043731689453125,
0.008392333984375,
0.0139312744140625,
0.0379638671875,
0.009674072265625,
-0.060302734375,
0.0318603515625,
0.043975830078125,
0.060760498046875,
-0.0207061767578125,
-0.008697509765625,
-0.0030574798583984375,
0.0033664703369140625,
-0.036163330078125,
0.061065673828125,
-0.031768798828125,
-0.03668212890625,
-0.01461029052734375,
-0.00347137451171875,
0.003997802734375,
-0.032562255859375,
0.046295166015625,
-0.0285186767578125,
0.0206756591796875,
-0.02392578125,
-0.031524658203125,
-0.0286102294921875,
0.0137176513671875,
-0.0305633544921875,
0.0892333984375,
-0.00734710693359375,
-0.046844482421875,
0.0230255126953125,
-0.04962158203125,
-0.0008444786071777344,
-0.0128021240234375,
-0.00971221923828125,
-0.052276611328125,
-0.0254974365234375,
0.0189208984375,
0.024658203125,
-0.03167724609375,
0.025115966796875,
-0.03125,
-0.029083251953125,
0.00737762451171875,
-0.0241546630859375,
0.06689453125,
0.0191650390625,
-0.048980712890625,
0.00850677490234375,
-0.0582275390625,
-0.004985809326171875,
0.02679443359375,
-0.0257415771484375,
0.00601959228515625,
0.006649017333984375,
-0.019775390625,
0.0178985595703125,
0.0241851806640625,
-0.0268707275390625,
0.01242828369140625,
-0.022735595703125,
0.042144775390625,
0.060699462890625,
0.00968170166015625,
0.02252197265625,
-0.035797119140625,
0.03704833984375,
0.005420684814453125,
0.0384521484375,
0.0049591064453125,
-0.062347412109375,
-0.059600830078125,
-0.036590576171875,
0.0104217529296875,
0.05279541015625,
-0.01345062255859375,
0.048248291015625,
-0.01111602783203125,
-0.054779052734375,
-0.0222015380859375,
0.0038013458251953125,
0.033355712890625,
0.038848876953125,
0.0283050537109375,
-0.025726318359375,
-0.044189453125,
-0.075439453125,
0.00043392181396484375,
-0.0258026123046875,
0.0016756057739257812,
0.032440185546875,
0.0291900634765625,
-0.0252838134765625,
0.0650634765625,
-0.0338134765625,
-0.025177001953125,
-0.017181396484375,
-0.01412200927734375,
0.025634765625,
0.044647216796875,
0.05548095703125,
-0.042205810546875,
-0.0156402587890625,
-0.0161285400390625,
-0.05926513671875,
-0.00592041015625,
0.01326751708984375,
-0.03985595703125,
0.007633209228515625,
0.027587890625,
-0.0523681640625,
0.041351318359375,
0.053985595703125,
-0.0286407470703125,
0.0435791015625,
-0.00006765127182006836,
0.000965118408203125,
-0.08270263671875,
0.01139068603515625,
-0.0019502639770507812,
0.00457000732421875,
-0.04833984375,
-0.0030002593994140625,
-0.013641357421875,
0.008880615234375,
-0.028594970703125,
0.03955078125,
-0.0238800048828125,
-0.0014047622680664062,
-0.01190948486328125,
0.00710296630859375,
-0.01381683349609375,
0.049896240234375,
-0.02484130859375,
0.051971435546875,
0.05126953125,
-0.034210205078125,
0.01064300537109375,
0.02154541015625,
-0.040435791015625,
0.03863525390625,
-0.067138671875,
0.00897216796875,
0.007587432861328125,
0.0364990234375,
-0.10888671875,
-0.0251922607421875,
0.0290374755859375,
-0.03253173828125,
0.02691650390625,
-0.0027675628662109375,
-0.0202484130859375,
-0.037994384765625,
-0.0273895263671875,
0.028839111328125,
0.055145263671875,
-0.036712646484375,
0.0311737060546875,
0.040435791015625,
-0.004573822021484375,
-0.0572509765625,
-0.060638427734375,
-0.0021038055419921875,
-0.038543701171875,
-0.062286376953125,
0.03704833984375,
-0.0167694091796875,
-0.0048828125,
-0.0147705078125,
-0.01629638671875,
0.00016868114471435547,
0.0194854736328125,
0.024505615234375,
0.038848876953125,
-0.01464080810546875,
-0.01297760009765625,
0.007022857666015625,
-0.006038665771484375,
-0.001972198486328125,
-0.002384185791015625,
0.048248291015625,
-0.015716552734375,
-0.0283660888671875,
-0.04913330078125,
0.004734039306640625,
0.03631591796875,
-0.0214385986328125,
0.05120849609375,
0.045318603515625,
-0.0271453857421875,
0.00963592529296875,
-0.04779052734375,
-0.0191650390625,
-0.040435791015625,
0.022674560546875,
-0.025665283203125,
-0.06939697265625,
0.07220458984375,
-0.0023441314697265625,
0.0245208740234375,
0.048583984375,
0.048065185546875,
-0.0022373199462890625,
0.0633544921875,
0.044464111328125,
0.006809234619140625,
0.043060302734375,
-0.03924560546875,
0.0086669921875,
-0.07373046875,
-0.04656982421875,
-0.0273590087890625,
-0.045379638671875,
-0.05816650390625,
-0.0205230712890625,
0.026885986328125,
0.02197265625,
-0.048370361328125,
0.0188751220703125,
-0.042205810546875,
0.0170745849609375,
0.035003662109375,
0.0179443359375,
0.017181396484375,
0.0024318695068359375,
-0.0157318115234375,
0.007175445556640625,
-0.041168212890625,
-0.04071044921875,
0.09759521484375,
0.0338134765625,
0.049285888671875,
0.029632568359375,
0.044891357421875,
0.0047454833984375,
0.010711669921875,
-0.04119873046875,
0.04168701171875,
0.0066070556640625,
-0.06561279296875,
-0.0192108154296875,
-0.01271820068359375,
-0.09063720703125,
0.0191497802734375,
-0.0061187744140625,
-0.0760498046875,
0.036773681640625,
0.002857208251953125,
-0.035552978515625,
0.01348876953125,
-0.040130615234375,
0.052734375,
-0.0016040802001953125,
-0.0157012939453125,
-0.0114288330078125,
-0.060272216796875,
0.0537109375,
0.0035610198974609375,
0.0169677734375,
-0.02081298828125,
-0.01427459716796875,
0.06561279296875,
-0.03558349609375,
0.072509765625,
-0.00833892822265625,
-0.01261138916015625,
0.0430908203125,
0.007175445556640625,
0.04736328125,
0.01384735107421875,
0.00018322467803955078,
0.0265655517578125,
-0.021728515625,
-0.0299072265625,
-0.016387939453125,
0.054046630859375,
-0.0892333984375,
-0.06036376953125,
-0.015899658203125,
-0.012664794921875,
-0.0005831718444824219,
0.00830078125,
0.0258026123046875,
0.02337646484375,
0.01275634765625,
0.0252227783203125,
0.044830322265625,
-0.00914764404296875,
0.036834716796875,
0.033355712890625,
-0.00791168212890625,
-0.029632568359375,
0.04345703125,
0.01358795166015625,
0.0284881591796875,
0.013885498046875,
0.00991058349609375,
-0.0212554931640625,
-0.04791259765625,
-0.0159149169921875,
0.023956298828125,
-0.0430908203125,
-0.021453857421875,
-0.0350341796875,
-0.0225830078125,
-0.0161590576171875,
0.004901885986328125,
-0.044708251953125,
-0.033111572265625,
-0.03912353515625,
-0.0249481201171875,
0.03192138671875,
0.03533935546875,
-0.0018377304077148438,
0.0276336669921875,
-0.0240325927734375,
-0.0020618438720703125,
0.0245361328125,
0.01611328125,
0.01171112060546875,
-0.06756591796875,
0.0079498291015625,
0.01030731201171875,
-0.054351806640625,
-0.049072265625,
0.0196685791015625,
0.0171966552734375,
0.07159423828125,
0.02178955078125,
-0.006755828857421875,
0.06744384765625,
-0.0218658447265625,
0.08258056640625,
0.021209716796875,
-0.055938720703125,
0.048095703125,
-0.033355712890625,
0.0063018798828125,
0.021453857421875,
0.023590087890625,
-0.01517486572265625,
-0.02783203125,
-0.0643310546875,
-0.07452392578125,
0.049957275390625,
0.0306549072265625,
0.005504608154296875,
0.00586700439453125,
0.031646728515625,
0.00852203369140625,
0.0085906982421875,
-0.05108642578125,
-0.038543701171875,
-0.02392578125,
0.0018625259399414062,
0.0007758140563964844,
-0.0299530029296875,
-0.015960693359375,
-0.0022907257080078125,
0.045928955078125,
0.004573822021484375,
0.042205810546875,
0.00911712646484375,
0.016082763671875,
-0.0180206298828125,
0.0032253265380859375,
0.06304931640625,
0.049652099609375,
-0.020172119140625,
-0.011322021484375,
0.020111083984375,
-0.042083740234375,
0.0007882118225097656,
0.005550384521484375,
-0.0004963874816894531,
-0.02392578125,
0.0328369140625,
0.0650634765625,
0.0147247314453125,
-0.031524658203125,
0.03265380859375,
0.0123443603515625,
-0.0210418701171875,
-0.0305633544921875,
0.01727294921875,
0.01019287109375,
0.049163818359375,
0.046295166015625,
0.0185546875,
0.007595062255859375,
-0.039794921875,
-0.0004687309265136719,
0.023468017578125,
-0.005741119384765625,
-0.03460693359375,
0.06744384765625,
0.0083465576171875,
-0.01904296875,
0.03338623046875,
0.00514984130859375,
-0.02252197265625,
0.059112548828125,
0.032135009765625,
0.0447998046875,
-0.0299072265625,
0.005153656005859375,
0.04058837890625,
0.0170745849609375,
-0.020050048828125,
0.0272216796875,
0.0130157470703125,
-0.04180908203125,
-0.028045654296875,
-0.029632568359375,
-0.0311126708984375,
0.0206451416015625,
-0.049713134765625,
0.03173828125,
-0.0341796875,
-0.031219482421875,
-0.00606536865234375,
0.01216888427734375,
-0.05084228515625,
0.0029811859130859375,
0.01148223876953125,
0.06793212890625,
-0.042327880859375,
0.052703857421875,
0.036895751953125,
-0.02874755859375,
-0.0693359375,
-0.0143890380859375,
0.01299285888671875,
-0.07379150390625,
0.036651611328125,
0.00377655029296875,
0.0015287399291992188,
0.00691986083984375,
-0.049224853515625,
-0.093994140625,
0.1251220703125,
0.0254058837890625,
-0.04449462890625,
-0.0011997222900390625,
-0.007114410400390625,
0.040740966796875,
-0.01271820068359375,
0.0400390625,
0.046844482421875,
0.0297393798828125,
0.0318603515625,
-0.089111328125,
0.020294189453125,
-0.021270751953125,
0.0036258697509765625,
-0.0026645660400390625,
-0.09063720703125,
0.08673095703125,
-0.02783203125,
-0.01506805419921875,
0.0241241455078125,
0.049072265625,
0.049591064453125,
0.01313018798828125,
0.0228729248046875,
0.04461669921875,
0.061492919921875,
-0.00974273681640625,
0.0791015625,
-0.011566162109375,
0.029510498046875,
0.0628662109375,
-0.0019483566284179688,
0.07073974609375,
0.0169525146484375,
-0.041290283203125,
0.06317138671875,
0.072998046875,
0.007045745849609375,
0.05889892578125,
0.0060577392578125,
0.004573822021484375,
-0.0152435302734375,
0.01152801513671875,
-0.054931640625,
0.03656005859375,
0.035186767578125,
-0.0247344970703125,
-0.01409149169921875,
-0.02508544921875,
0.0130462646484375,
-0.021728515625,
-0.00353240966796875,
0.0499267578125,
0.00890350341796875,
-0.03887939453125,
0.07708740234375,
-0.007781982421875,
0.0572509765625,
-0.047698974609375,
0.01300811767578125,
-0.036834716796875,
0.0162200927734375,
-0.0234375,
-0.057464599609375,
0.003368377685546875,
-0.0026874542236328125,
0.00945281982421875,
0.0206146240234375,
0.04241943359375,
-0.002246856689453125,
-0.02667236328125,
0.0214080810546875,
0.0263519287109375,
0.0200042724609375,
0.0226593017578125,
-0.0570068359375,
0.016021728515625,
-0.00188446044921875,
-0.059478759765625,
0.0276336669921875,
0.0272369384765625,
-0.01140594482421875,
0.063232421875,
0.053558349609375,
-0.0189361572265625,
0.0171051025390625,
-0.0085601806640625,
0.078125,
-0.02728271484375,
-0.040374755859375,
-0.07232666015625,
0.04833984375,
-0.00629425048828125,
-0.047332763671875,
0.04595947265625,
0.031036376953125,
0.0467529296875,
0.018798828125,
0.04473876953125,
-0.00736236572265625,
0.028076171875,
-0.0240631103515625,
0.036590576171875,
-0.049041748046875,
0.037322998046875,
-0.0148773193359375,
-0.07586669921875,
-0.0210723876953125,
0.054931640625,
-0.0143280029296875,
0.00832366943359375,
0.035369873046875,
0.063720703125,
0.00954437255859375,
-0.0005693435668945312,
0.0031147003173828125,
0.0279541015625,
0.052703857421875,
0.05657958984375,
0.05096435546875,
-0.042083740234375,
0.06719970703125,
-0.0161895751953125,
-0.031005859375,
-0.035614013671875,
-0.06695556640625,
-0.07452392578125,
-0.0313720703125,
-0.0262603759765625,
-0.020294189453125,
0.00637054443359375,
0.05364990234375,
0.05938720703125,
-0.05535888671875,
-0.0193939208984375,
-0.01432037353515625,
-0.0036773681640625,
-0.033447265625,
-0.01241302490234375,
0.031097412109375,
0.0010099411010742188,
-0.0511474609375,
0.0210723876953125,
0.004608154296875,
0.0272064208984375,
-0.023223876953125,
-0.01416015625,
-0.01873779296875,
0.006443023681640625,
0.033416748046875,
0.020172119140625,
-0.0655517578125,
-0.0311431884765625,
0.000293731689453125,
-0.0129852294921875,
0.0095367431640625,
0.00815582275390625,
-0.049285888671875,
0.0080413818359375,
0.02667236328125,
0.0250396728515625,
0.0540771484375,
0.00035953521728515625,
0.0141754150390625,
-0.03717041015625,
0.0159454345703125,
-0.0005326271057128906,
0.020294189453125,
0.01171875,
-0.023468017578125,
0.054046630859375,
0.020416259765625,
-0.0445556640625,
-0.07391357421875,
0.006000518798828125,
-0.1036376953125,
-0.0027370452880859375,
0.0994873046875,
-0.0216827392578125,
-0.005031585693359375,
0.013885498046875,
-0.018463134765625,
0.0255279541015625,
-0.04351806640625,
0.0687255859375,
0.04949951171875,
-0.0126495361328125,
-0.0178070068359375,
-0.042877197265625,
0.0299224853515625,
0.00974273681640625,
-0.0625,
-0.017364501953125,
0.009979248046875,
0.0287322998046875,
0.0163116455078125,
0.038421630859375,
-0.017791748046875,
0.01995849609375,
0.00228118896484375,
-0.006549835205078125,
-0.0183563232421875,
-0.0087127685546875,
0.0035858154296875,
0.001621246337890625,
-0.007122039794921875,
-0.0233917236328125
]
] |
lgaalves/tinyllama-1.1b-chat-v0.3_platypus | 2023-10-10T15:36:03.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"en",
"dataset:garage-bAInd/Open-Platypus",
"license:mit",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text-generation | lgaalves | null | null | lgaalves/tinyllama-1.1b-chat-v0.3_platypus | 5 | 7,016 | transformers | 2023-10-09T22:05:57 | ---
license: mit
datasets:
- garage-bAInd/Open-Platypus
language:
- en
pipeline_tag: text-generation
---
# tinyllama-1.1b-chat-v0.3_platypus
**tinyllama-1.1b-chat-v0.3_platypus** is an instruction fine-tuned model based on the tinyllama transformer architecture.
### Benchmark Metrics
| Metric |lgaalves/tinyllama-1.1b-chat-v0.3_platypus | tinyllama-1.1b-chat-v0.3 |
|-----------------------|-------|-------|
| Avg. | 37.67 | **38.74** |
| ARC (25-shot) | 30.29 | **35.07** |
| HellaSwag (10-shot) | 55.12 | **57.7** |
| MMLU (5-shot) | **26.13** | 25.53 |
| TruthfulQA (0-shot) | **39.15** | 36.67 |
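The "Avg." row is the arithmetic mean of the four task scores. A quick check against the table above:

```python
# Values copied from the benchmark table; "Avg." is the mean of the four tasks.
ours = {"ARC": 30.29, "HellaSwag": 55.12, "MMLU": 26.13, "TruthfulQA": 39.15}
base = {"ARC": 35.07, "HellaSwag": 57.70, "MMLU": 25.53, "TruthfulQA": 36.67}

def leaderboard_avg(scores):
    return round(sum(scores.values()) / len(scores), 2)

print(leaderboard_avg(ours))  # 37.67
print(leaderboard_avg(base))  # 38.74
```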
We use state-of-the-art [Language Model Evaluation Harness](https://github.com/EleutherAI/lm-evaluation-harness) to run the benchmark tests above, using the same version as the HuggingFace LLM Leaderboard. Please see below for detailed instructions on reproducing benchmark results.
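A hedged sketch of reproducing scores like these with the harness follows. The CLI shape, task names, and few-shot counts below are assumptions; they vary across harness versions, so match the exact version pinned by the leaderboard for comparable numbers.

```shell
# Illustrative only: flags and task names differ across harness versions.
pip install lm-eval

# One task per leaderboard metric, e.g. ARC with 25-shot prompting:
lm_eval --model hf \
  --model_args pretrained=lgaalves/tinyllama-1.1b-chat-v0.3_platypus \
  --tasks arc_challenge \
  --num_fewshot 25 \
  --batch_size 8
```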
### Model Details
* **Trained by**: Luiz G A Alves
* **Model type:** **tinyllama-1.1b-chat-v0.3_platypus** is an auto-regressive language model based on the tinyllama transformer architecture.
* **Language(s)**: English
### How to use:
```python
# Use a pipeline as a high-level helper
>>> from transformers import pipeline
>>> pipe = pipeline("text-generation", model="lgaalves/tinyllama-1.1b-chat-v0.3_platypus")
>>> question = "What is a large language model?"
>>> answer = pipe(question)
>>> print(answer[0]['generated_text'])
```
Alternatively, you can load the model directly:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM
tokenizer = AutoTokenizer.from_pretrained("lgaalves/tinyllama-1.1b-chat-v0.3_platypus")
model = AutoModelForCausalLM.from_pretrained("lgaalves/tinyllama-1.1b-chat-v0.3_platypus")
```
### Training Dataset
`lgaalves/tinyllama-1.1b-chat-v0.3_platypus` was trained on the STEM- and logic-based dataset [garage-bAInd/Open-Platypus](https://huggingface.co/datasets/garage-bAInd/Open-Platypus).
### Training Procedure
`lgaalves/tinyllama-1.1b-chat-v0.3_platypus` was instruction fine-tuned using LoRA on a single V100 GPU on Google Colab. Training took about 43 minutes.
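A back-of-the-envelope estimate shows why LoRA makes single-V100 fine-tuning feasible. The hidden size, layer count, rank, and target modules below are illustrative assumptions, not the card's actual training configuration:

```python
# Rough LoRA trainable-parameter count for a TinyLlama-sized model.
# All shape values are ASSUMPTIONS for illustration, not the real config.
hidden_size = 2048       # assumed model width
num_layers = 22          # assumed number of transformer blocks
rank = 16                # assumed LoRA rank r
targets_per_layer = 4    # e.g. q/k/v/o projections, assumed square

# Each adapted d x d weight gains two low-rank factors: d x r and r x d.
params_per_matrix = 2 * hidden_size * rank
trainable = num_layers * targets_per_layer * params_per_matrix

total = 1_100_000_000    # ~1.1B base parameters
print(f"trainable LoRA params: {trainable:,}")
print(f"fraction of base model: {trainable / total:.2%}")
```

Under these assumptions only about half a percent of the weights are trained, which is what keeps optimizer state and gradients small enough for a single GPU.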
### Intended uses, limitations & biases
You can use the raw model for text generation or fine-tune it for a downstream task. The model has not been extensively tested and may produce false information. Its training data contains a large amount of unfiltered content from the internet, which is far from neutral.
[
-0.0254058837890625,
-0.08001708984375,
0.021759033203125,
0.027374267578125,
-0.01349639892578125,
-0.00930023193359375,
-0.04180908203125,
-0.027313232421875,
0.0193023681640625,
0.0154571533203125,
-0.050872802734375,
-0.032073974609375,
-0.04046630859375,
-0.006103515625,
-0.01458740234375,
0.07086181640625,
0.005443572998046875,
-0.01190185546875,
0.015625,
-0.01117706298828125,
-0.03643798828125,
-0.0121917724609375,
-0.054718017578125,
-0.0223388671875,
0.01535797119140625,
0.042083740234375,
0.06182861328125,
0.049591064453125,
0.02801513671875,
0.0272064208984375,
-0.01361083984375,
0.0086517333984375,
-0.03753662109375,
-0.022003173828125,
0.01490020751953125,
-0.04608154296875,
-0.039764404296875,
0.006839752197265625,
0.051910400390625,
0.0308380126953125,
-0.0141754150390625,
0.040069580078125,
0.0117034912109375,
0.0269012451171875,
-0.0187530517578125,
0.029937744140625,
-0.045806884765625,
-0.006481170654296875,
-0.0193023681640625,
0.004405975341796875,
-0.0303497314453125,
-0.033721923828125,
0.0050811767578125,
-0.03515625,
0.0005388259887695312,
0.0171051025390625,
0.085693359375,
0.019012451171875,
-0.0035457611083984375,
-0.027191162109375,
-0.0267486572265625,
0.06634521484375,
-0.058013916015625,
0.0151519775390625,
0.025146484375,
0.02227783203125,
-0.013336181640625,
-0.052734375,
-0.052978515625,
-0.024017333984375,
-0.0184326171875,
-0.0036983489990234375,
-0.0279541015625,
-0.01267242431640625,
0.017120361328125,
0.035430908203125,
-0.04400634765625,
0.03228759765625,
-0.0428466796875,
-0.0157623291015625,
0.05316162109375,
0.024139404296875,
0.023529052734375,
-0.0239105224609375,
-0.0125274658203125,
-0.023529052734375,
-0.038299560546875,
0.01285552978515625,
0.0217742919921875,
0.035736083984375,
-0.04168701171875,
0.036590576171875,
-0.00965118408203125,
0.04437255859375,
0.01293182373046875,
-0.0361328125,
0.0300750732421875,
-0.0179290771484375,
-0.026031494140625,
-0.011688232421875,
0.0933837890625,
0.016998291015625,
0.0026569366455078125,
0.01502227783203125,
-0.00534820556640625,
0.021514892578125,
-0.00728607177734375,
-0.0679931640625,
-0.025787353515625,
0.0145416259765625,
-0.0311279296875,
-0.025177001953125,
-0.0238189697265625,
-0.05047607421875,
-0.003635406494140625,
-0.0013294219970703125,
0.040802001953125,
-0.05010986328125,
-0.0181427001953125,
0.004138946533203125,
0.0209197998046875,
0.0224609375,
0.019073486328125,
-0.0640869140625,
0.027099609375,
0.044952392578125,
0.07977294921875,
-0.00685882568359375,
-0.031646728515625,
-0.0200042724609375,
-0.030487060546875,
-0.0176544189453125,
0.0565185546875,
-0.0142974853515625,
-0.0142669677734375,
-0.0178680419921875,
0.0116729736328125,
-0.02471923828125,
-0.037322998046875,
0.032470703125,
-0.02032470703125,
0.04083251953125,
0.014373779296875,
-0.05029296875,
-0.01148223876953125,
0.00495147705078125,
-0.034423828125,
0.08197021484375,
0.00868988037109375,
-0.05096435546875,
0.006988525390625,
-0.0509033203125,
-0.0103302001953125,
0.000812530517578125,
-0.0078277587890625,
-0.032745361328125,
-0.0024623870849609375,
0.0145721435546875,
0.0328369140625,
-0.03009033203125,
0.00292205810546875,
-0.021026611328125,
-0.0227508544921875,
0.0138092041015625,
-0.0124969482421875,
0.07275390625,
0.0158538818359375,
-0.0318603515625,
0.0128936767578125,
-0.04937744140625,
0.00670623779296875,
0.03338623046875,
-0.0199737548828125,
-0.01445770263671875,
-0.0208282470703125,
0.0038585662841796875,
0.0221099853515625,
0.0207366943359375,
-0.042022705078125,
0.020599365234375,
-0.05047607421875,
0.036834716796875,
0.05108642578125,
-0.016448974609375,
0.034210205078125,
-0.0400390625,
0.0323486328125,
-0.0014171600341796875,
0.0171661376953125,
-0.005645751953125,
-0.05157470703125,
-0.08831787109375,
-0.0292510986328125,
0.0299835205078125,
0.041900634765625,
-0.0450439453125,
0.040679931640625,
-0.0233001708984375,
-0.06298828125,
-0.0689697265625,
0.02691650390625,
0.03851318359375,
0.0265655517578125,
0.0175628662109375,
-0.00792694091796875,
-0.03204345703125,
-0.07244873046875,
-0.0010213851928710938,
-0.0252532958984375,
-0.0031585693359375,
0.01282501220703125,
0.04510498046875,
-0.0263671875,
0.059478759765625,
-0.040618896484375,
-0.034088134765625,
-0.0227508544921875,
0.01401519775390625,
0.03021240234375,
0.038330078125,
0.04571533203125,
-0.022857666015625,
-0.049591064453125,
-0.016876220703125,
-0.039764404296875,
-0.018463134765625,
0.0020294189453125,
-0.001323699951171875,
0.027435302734375,
0.0215606689453125,
-0.06201171875,
0.0142669677734375,
0.0543212890625,
-0.026153564453125,
0.02947998046875,
-0.00882720947265625,
-0.009033203125,
-0.0833740234375,
0.0074615478515625,
0.0016298294067382812,
-0.01055145263671875,
-0.0379638671875,
0.01318359375,
0.0112457275390625,
0.0006513595581054688,
-0.0419921875,
0.058380126953125,
-0.032257080078125,
0.00875091552734375,
-0.02496337890625,
0.033935546875,
-0.01873779296875,
0.047607421875,
0.0012140274047851562,
0.062164306640625,
0.034088134765625,
-0.03564453125,
0.02801513671875,
0.0322265625,
-0.0322265625,
0.0145721435546875,
-0.056243896484375,
0.007904052734375,
0.0191192626953125,
0.01094818115234375,
-0.0718994140625,
-0.003925323486328125,
0.0168609619140625,
-0.033203125,
0.020843505859375,
-0.0025634765625,
-0.055633544921875,
-0.04205322265625,
-0.0300445556640625,
0.0189361572265625,
0.0628662109375,
-0.043060302734375,
0.0105133056640625,
0.0423583984375,
-0.0131378173828125,
-0.01174163818359375,
-0.04437255859375,
-0.00928497314453125,
-0.0203857421875,
-0.06243896484375,
0.003917694091796875,
-0.0170135498046875,
-0.014404296875,
-0.024139404296875,
0.0122222900390625,
0.01094818115234375,
-0.005405426025390625,
0.027923583984375,
0.01959228515625,
-0.026702880859375,
0.007152557373046875,
-0.0171966552734375,
-0.014373779296875,
-0.004863739013671875,
0.003253936767578125,
0.06146240234375,
-0.051483154296875,
-0.0018825531005859375,
-0.0595703125,
-0.00417327880859375,
0.028228759765625,
0.00626373291015625,
0.059051513671875,
0.045867919921875,
-0.0170135498046875,
-0.0104522705078125,
-0.048553466796875,
-0.0239410400390625,
-0.04119873046875,
0.01081085205078125,
-0.0219573974609375,
-0.0716552734375,
0.04241943359375,
0.01190185546875,
0.00531005859375,
0.057830810546875,
0.06329345703125,
0.003971099853515625,
0.07330322265625,
0.049102783203125,
-0.0061798095703125,
0.03021240234375,
-0.0343017578125,
-0.002895355224609375,
-0.05010986328125,
-0.0133209228515625,
-0.037109375,
-0.02740478515625,
-0.0565185546875,
-0.0396728515625,
0.006313323974609375,
0.004505157470703125,
-0.020904541015625,
0.051422119140625,
-0.033966064453125,
0.017974853515625,
0.034881591796875,
-0.0104217529296875,
0.0177154541015625,
-0.0064697265625,
-0.0185699462890625,
-0.016876220703125,
-0.06365966796875,
-0.0526123046875,
0.0748291015625,
0.051239013671875,
0.0689697265625,
-0.001495361328125,
0.04034423828125,
-0.00550079345703125,
0.0341796875,
-0.06182861328125,
0.052459716796875,
-0.00046253204345703125,
-0.041412353515625,
-0.0106201171875,
-0.03460693359375,
-0.061309814453125,
0.0179290771484375,
-0.0137481689453125,
-0.067138671875,
-0.0028209686279296875,
0.0096435546875,
-0.046142578125,
0.018463134765625,
-0.07196044921875,
0.07415771484375,
-0.0103302001953125,
-0.00830078125,
-0.01045989990234375,
-0.04473876953125,
0.04766845703125,
-0.0159149169921875,
0.01528167724609375,
-0.0190887451171875,
-0.00372314453125,
0.07318115234375,
-0.043914794921875,
0.0616455078125,
-0.00612640380859375,
-0.00629425048828125,
0.031005859375,
0.0123138427734375,
0.01528167724609375,
0.00716400146484375,
0.007648468017578125,
0.028533935546875,
0.006866455078125,
-0.0197906494140625,
-0.0171966552734375,
0.07000732421875,
-0.087890625,
-0.03277587890625,
-0.035980224609375,
-0.0404052734375,
-0.0108642578125,
0.0108642578125,
0.02301025390625,
0.00664520263671875,
-0.0041351318359375,
0.0224151611328125,
0.042022705078125,
-0.0265655517578125,
0.04339599609375,
0.05755615234375,
-0.0311737060546875,
-0.0177154541015625,
0.059600830078125,
0.003337860107421875,
0.0141143798828125,
0.01288604736328125,
0.0223846435546875,
-0.0157623291015625,
-0.0137481689453125,
-0.0298919677734375,
0.037750244140625,
-0.0256195068359375,
-0.029022216796875,
-0.054473876953125,
-0.038726806640625,
-0.0248870849609375,
0.0166473388671875,
-0.0565185546875,
-0.03900146484375,
-0.0310821533203125,
0.01491546630859375,
0.0355224609375,
0.0384521484375,
0.001323699951171875,
0.06060791015625,
-0.050506591796875,
0.0228271484375,
0.038299560546875,
0.0270538330078125,
-0.0089111328125,
-0.0748291015625,
-0.00848388671875,
0.0185394287109375,
-0.031890869140625,
-0.048248291015625,
0.036956787109375,
0.02362060546875,
0.028656005859375,
0.026611328125,
-0.01012420654296875,
0.05816650390625,
-0.035308837890625,
0.053558349609375,
0.01088714599609375,
-0.06732177734375,
0.06048583984375,
-0.0121002197265625,
0.0294342041015625,
0.034332275390625,
0.01229095458984375,
-0.029022216796875,
-0.042633056640625,
-0.069580078125,
-0.04046630859375,
0.0811767578125,
0.0361328125,
0.00492095947265625,
0.00579071044921875,
0.01103973388671875,
0.00424957275390625,
0.00440216064453125,
-0.05792236328125,
-0.011688232421875,
-0.01143646240234375,
-0.018951416015625,
-0.01502227783203125,
-0.012786865234375,
-0.003452301025390625,
-0.03424072265625,
0.057830810546875,
-0.0135955810546875,
0.02752685546875,
-0.012420654296875,
-0.0243072509765625,
-0.015869140625,
0.009368896484375,
0.05694580078125,
0.03546142578125,
-0.024932861328125,
-0.0123748779296875,
0.02716064453125,
-0.047607421875,
-0.004108428955078125,
0.0078277587890625,
-0.007778167724609375,
0.006343841552734375,
0.02099609375,
0.072509765625,
0.008209228515625,
-0.051513671875,
0.04901123046875,
-0.02691650390625,
-0.005405426025390625,
-0.02252197265625,
0.01611328125,
0.01491546630859375,
0.0217437744140625,
0.0202789306640625,
-0.0141143798828125,
-0.022491455078125,
-0.039794921875,
0.002376556396484375,
0.024993896484375,
-0.0111541748046875,
-0.033050537109375,
0.04736328125,
0.0037136077880859375,
-0.031524658203125,
0.03863525390625,
-0.00855255126953125,
-0.0186004638671875,
0.045440673828125,
0.044830322265625,
0.04925537109375,
-0.024749755859375,
0.00839996337890625,
0.0341796875,
0.043792724609375,
-0.0092620849609375,
0.020050048828125,
0.0179290771484375,
-0.043609619140625,
-0.014434814453125,
-0.072265625,
-0.015045166015625,
0.036956787109375,
-0.033660888671875,
0.031494140625,
-0.0406494140625,
-0.0187835693359375,
0.0114288330078125,
0.0289154052734375,
-0.0526123046875,
0.01015472412109375,
0.0125732421875,
0.0875244140625,
-0.0718994140625,
0.07574462890625,
0.05072021484375,
-0.0286407470703125,
-0.07427978515625,
-0.0142669677734375,
0.00775146484375,
-0.09161376953125,
0.041229248046875,
0.0143890380859375,
0.00824737548828125,
-0.00954437255859375,
-0.04632568359375,
-0.058380126953125,
0.08099365234375,
0.0272674560546875,
-0.048828125,
0.005260467529296875,
-0.0007939338684082031,
0.058013916015625,
-0.03857421875,
0.034820556640625,
0.063720703125,
0.0164947509765625,
0.004581451416015625,
-0.09906005859375,
0.00908660888671875,
-0.03338623046875,
0.024749755859375,
0.002376556396484375,
-0.0821533203125,
0.07855224609375,
-0.003467559814453125,
-0.013031005859375,
0.026702880859375,
0.057403564453125,
0.02679443359375,
-0.00818634033203125,
0.0413818359375,
0.050689697265625,
0.03778076171875,
-0.0187530517578125,
0.06036376953125,
-0.03424072265625,
0.0472412109375,
0.0869140625,
0.0061187744140625,
0.07440185546875,
0.03143310546875,
-0.01390838623046875,
0.0406494140625,
0.0479736328125,
-0.005184173583984375,
0.03094482421875,
-0.00011008977890014648,
0.0141143798828125,
-0.00720977783203125,
0.0116729736328125,
-0.037261962890625,
0.050811767578125,
0.042755126953125,
-0.0212249755859375,
-0.01123046875,
0.004634857177734375,
0.018768310546875,
-0.0089874267578125,
-0.0118408203125,
0.06591796875,
0.018280029296875,
-0.04229736328125,
0.0665283203125,
0.022308349609375,
0.06903076171875,
-0.049774169921875,
0.024139404296875,
-0.0369873046875,
0.027099609375,
-0.00859832763671875,
-0.051025390625,
0.0177764892578125,
0.00667572021484375,
0.0096435546875,
-0.0118560791015625,
0.0516357421875,
-0.03369140625,
-0.033599853515625,
0.01558685302734375,
0.032684326171875,
0.0075531005859375,
0.01251220703125,
-0.06732177734375,
0.02197265625,
-0.012115478515625,
-0.0297393798828125,
0.0193634033203125,
0.01715087890625,
0.00800323486328125,
0.055816650390625,
0.033111572265625,
0.0120086669921875,
0.01146697998046875,
-0.01275634765625,
0.0653076171875,
-0.037353515625,
-0.0521240234375,
-0.06475830078125,
0.019500732421875,
-0.0106048583984375,
-0.030181884765625,
0.0767822265625,
0.045166015625,
0.049285888671875,
-0.0011196136474609375,
0.02911376953125,
-0.0196990966796875,
0.0494384765625,
-0.026580810546875,
0.0645751953125,
-0.04266357421875,
0.024383544921875,
-0.017242431640625,
-0.06939697265625,
-0.0077056884765625,
0.06329345703125,
-0.0251007080078125,
-0.004428863525390625,
0.051513671875,
0.07208251953125,
-0.020843505859375,
0.01129913330078125,
0.005077362060546875,
0.043609619140625,
0.005413055419921875,
0.052886962890625,
0.06304931640625,
-0.05853271484375,
0.049285888671875,
-0.037872314453125,
-0.030426025390625,
-0.0272674560546875,
-0.045654296875,
-0.07867431640625,
-0.031646728515625,
-0.0379638671875,
-0.0341796875,
-0.018524169921875,
0.0758056640625,
0.04937744140625,
-0.0716552734375,
-0.038970947265625,
0.01468658447265625,
-0.0011072158813476562,
0.0045623779296875,
-0.019561767578125,
0.0291748046875,
-0.0295562744140625,
-0.05230712890625,
0.0184173583984375,
0.0194244384765625,
0.00803375244140625,
-0.0260009765625,
-0.00847625732421875,
-0.020599365234375,
0.0026302337646484375,
0.040771484375,
0.031951904296875,
-0.05377197265625,
-0.02532958984375,
-0.004901885986328125,
-0.035919189453125,
0.007419586181640625,
0.025665283203125,
-0.0496826171875,
0.01690673828125,
0.0271148681640625,
0.023040771484375,
0.042633056640625,
-0.024505615234375,
0.021392822265625,
-0.059051513671875,
0.0419921875,
0.004970550537109375,
0.0253753662109375,
0.022796630859375,
0.002716064453125,
0.0264892578125,
0.01418304443359375,
-0.04632568359375,
-0.065185546875,
-0.0107574462890625,
-0.0703125,
-0.005237579345703125,
0.09918212890625,
-0.005107879638671875,
-0.0379638671875,
0.016937255859375,
-0.02423095703125,
0.01503753662109375,
-0.0237884521484375,
0.0628662109375,
0.03131103515625,
0.0023479461669921875,
-0.0164337158203125,
-0.044097900390625,
0.04510498046875,
0.0264129638671875,
-0.059051513671875,
-0.017608642578125,
0.0209503173828125,
0.041015625,
0.0270233154296875,
0.059539794921875,
-0.0017137527465820312,
0.00852203369140625,
-0.0034770965576171875,
-0.007049560546875,
0.01251220703125,
-0.01507568359375,
-0.032623291015625,
0.0039520263671875,
-0.0011224746704101562,
-0.012664794921875
]
] |
FlagAlpha/Atom-7B | 2023-10-29T00:57:13.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"question-answering",
"zh",
"en",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | question-answering | FlagAlpha | null | null | FlagAlpha/Atom-7B | 52 | 7,008 | transformers | 2023-08-28T10:36:41 | ---
developers: [https://huggingface.co/FlagAlphaAI]
license: apache-2.0
language:
- zh
- en
pipeline_tag: question-answering
library_name: transformers
---
# Atom-7B
Atom-7B is fully open source and licensed for commercial use. Jointly developed by the Llama Chinese Community and AtomEcho, it continues pretraining from Llama2-7B on large-scale Chinese data. We will keep releasing updated model weights, and the training process can be followed at [llama.family](https://llama.family).
For deployment, training, and fine-tuning instructions, see the Llama Chinese Community GitHub repository: [**Llama2-Chinese**](https://github.com/FlagAlpha/Llama2-Chinese).
## 📝 Chinese Training Data
| Type | Description |
| ---------------------------------------------------------- | ------------------------------------------------------------ |
| Web data | Publicly available data from the internet, filtered and deduplicated into high-quality long-form Chinese text covering encyclopedias, books, blogs, news, announcements, fiction, and more. |
| [Wikipedia](https://github.com/goldsmith/Wikipedia) | Chinese Wikipedia data |
| [WuDao](https://github.com/BAAI-WuDao/Model) | 200 GB of data released by the Chinese WuDao open-source project |
| [Clue](https://github.com/CLUEbenchmark/CLUEDatasetSearch) | Clue's open Chinese pretraining data, cleaned into high-quality long-form Chinese text |
| Competition datasets | About 150 datasets from recent Chinese NLP multi-task competitions |
| [MNBVC](https://github.com/esbatmop/MNBVC) | A cleaned subset of the MNBVC datasets |
**We also welcome you to contribute your own data at [llama.family](https://llama.family); data that passes review will be added to model training and will help shape the model's future capabilities.**
## 📚 Chinese Vocabulary
To make Chinese text processing more efficient, we deeply optimized the vocabulary of the Llama2 tokenizer.
First, using hundreds of gigabytes of Chinese text, **we expanded the Llama2 vocabulary to 65,000 tokens**.
In our tests, this improvement **speeds up Chinese encoding/decoding by roughly 350%**.
We also broadened coverage of the Chinese character set, including all **emoji symbols**, which makes generating text containing emoji more efficient.
For special cases in the original Llama2 vocabulary, such as digits and English text, we avoided modifying or replacing the existing tokens wherever possible.
The result is an approach that improves Chinese processing efficiency while preserving Llama2's original capabilities.
## 📈 Training Process
**Model architecture**
Built on Llama2, currently among the strongest open-source models, Atom-7B uses a standard decoder-only Transformer and supports a 4K context length, the longest among models of its size. This accommodates longer multi-turn dialogue, knowledge QA, and summarization, broadening the model's range of applications.
**Efficient training with FlashAttention-2**
Atom-7B is trained with FlashAttention-2. Long input sequences can make memory consumption explode; FlashAttention-2 is an efficient implementation of the attention mechanism that is both faster and more memory-efficient than standard attention.
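The memory problem that FlashAttention-2 avoids is easy to see with a quick estimate. The shapes below are illustrative, not Atom-7B's exact configuration:

```python
# Why long sequences blow up memory with vanilla attention: the score
# matrix alone is seq_len x seq_len per head. Illustrative shapes only.
seq_len = 4096
num_heads = 32
bytes_per_value = 2  # fp16

score_matrix_bytes = seq_len * seq_len * num_heads * bytes_per_value
print(f"attention score matrices: {score_matrix_bytes / 2**30:.1f} GiB per layer")
# FlashAttention never materializes this matrix: it processes tiles and
# keeps extra memory roughly linear in seq_len instead of quadratic.
```

Even at fp16, the quadratic score matrices cost about a gibibyte per layer at a 4K context, before counting activations or the KV cache.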
**NTK-based adaptive context extension**
- Supports longer contexts without further training of the model
- The model supports a 4K context by default; with this technique it can be extended to 18K+
- With fine-tuning, it can support 32K+
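A minimal sketch of the NTK-aware idea: instead of interpolating positions, the RoPE base frequency is rescaled so the lower-frequency dimensions stretch to cover a longer context. The formula and numbers below follow a commonly used NTK-aware scaling heuristic and are illustrative, not the model's exact implementation:

```python
# NTK-aware RoPE scaling heuristic: raise the rotary base so the model
# covers a longer context without retraining. Illustrative values only.
base = 10000.0    # Llama2's default rotary base (assumed)
head_dim = 128    # per-head dimension (assumed)
scale = 4.5       # e.g. extend a ~4K context toward ~18K

ntk_base = base * scale ** (head_dim / (head_dim - 2))
print(f"scaled rotary base: {ntk_base:.0f}")
```

The scaled base is then used to recompute the rotary frequencies at inference time, which is why no continued training is needed.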
## 💻 Inference Requirements
In practice, consumer GPUs are much cheaper than professional GPUs (for example, an RTX 3090 versus an A10, both with 24 GB of VRAM).
On consumer GPUs, FP32 weights simply do not fit; FP16 is the usual baseline, and INT8 or INT4 quantization helps considerably. For example:
- On an RTX 3080 (10 GB VRAM), Atom-7B in INT8 needs only 8 GB of VRAM and can be deployed directly.
- On an RTX 3080 (10 GB VRAM), Atom-7B in INT4 needs only 5 GB of VRAM and can be deployed directly.
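These VRAM figures can be roughly reproduced from weight sizes alone. This is a weights-only back-of-the-envelope estimate; real deployments also need memory for activations, the KV cache, and runtime overhead, which is why the figures above are somewhat higher:

```python
# Weights-only memory estimate for a 7B-parameter model at different precisions.
params = 7_000_000_000
bits = {"fp32": 32, "fp16": 16, "int8": 8, "int4": 4}

for name, b in bits.items():
    gib = params * b / 8 / 2**30
    print(f"{name}: {gib:.1f} GiB")
```

Weights alone come to roughly 26 GiB at FP32, 13 GiB at FP16, 6.5 GiB at INT8, and 3.3 GiB at INT4, consistent with FP16 not fitting on a 10 GB card while INT8 and INT4 do.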
---
# Llama Chinese Community
## 🚀 Community Links:
GitHub: [**Llama2-Chinese**](https://github.com/FlagAlpha/Llama2-Chinese)
Online demo: [**llama.family**](https://llama.family/)
## 🔥 About the Community
Welcome to the Llama Chinese Community!
We are an advanced technical community focused on optimizing Llama models for Chinese and building applications on top of them.
**Starting from pretraining on large-scale Chinese data, we continuously iterate on and upgrade Llama2's Chinese capabilities.**
We warmly welcome developers and researchers who are passionate about large language models to join us.
## 🐼 Community Resources
- Try Llama2 online at [**llama.family**](https://llama.family/), including both Meta's original models and the Chinese fine-tuned versions!
- [Chinese QA capability evaluation](https://github.com/FlagAlpha/Llama2-Chinese/tree/main#-%E6%A8%A1%E5%9E%8B%E8%AF%84%E6%B5%8B) of the Llama2 Chat models!
- [Community Feishu knowledge base](https://chinesellama.feishu.cn/wiki/space/7257824476874768388?ccm_open_type=lark_wiki_spaceLink), and everyone is welcome to help build it!
| 2,962 | [
[
-0.040863037109375,
-0.0289459228515625,
0.02484130859375,
0.0198516845703125,
-0.05145263671875,
0.01654052734375,
-0.0008025169372558594,
-0.0528564453125,
0.0296630859375,
0.0004572868347167969,
-0.03173828125,
-0.05120849609375,
-0.037200927734375,
0.0206298828125,
-0.0018291473388671875,
0.048248291015625,
-0.009002685546875,
-0.02001953125,
0.025360107421875,
-0.0005087852478027344,
-0.0311126708984375,
-0.0009937286376953125,
-0.029541015625,
-0.004638671875,
0.0136871337890625,
0.008331298828125,
0.0516357421875,
0.06854248046875,
0.0445556640625,
0.01837158203125,
-0.0193328857421875,
0.01776123046875,
-0.01788330078125,
-0.02337646484375,
0.00394439697265625,
-0.041748046875,
-0.0400390625,
-0.03173828125,
0.037322998046875,
0.01493072509765625,
0.00514984130859375,
0.0299072265625,
0.0009636878967285156,
0.055419921875,
-0.0289459228515625,
0.020843505859375,
-0.03289794921875,
-0.0010528564453125,
-0.023406982421875,
-0.0243988037109375,
0.0129547119140625,
-0.040863037109375,
-0.0304412841796875,
-0.04931640625,
-0.0032978057861328125,
0.01102447509765625,
0.09063720703125,
0.0123138427734375,
-0.042022705078125,
-0.0179290771484375,
-0.02069091796875,
0.062469482421875,
-0.07171630859375,
-0.002197265625,
0.0289459228515625,
0.023651123046875,
-0.031494140625,
-0.04974365234375,
-0.05950927734375,
-0.0035037994384765625,
-0.0239105224609375,
0.023773193359375,
-0.013153076171875,
-0.029876708984375,
0.006130218505859375,
0.01421356201171875,
-0.047760009765625,
0.006130218505859375,
-0.0304412841796875,
-0.01125335693359375,
0.0516357421875,
0.0009274482727050781,
0.039398193359375,
-0.0372314453125,
-0.038299560546875,
0.006320953369140625,
-0.0662841796875,
0.02630615234375,
0.016265869140625,
-0.00420379638671875,
-0.0440673828125,
0.04327392578125,
-0.023651123046875,
0.02545166015625,
0.01020050048828125,
-0.039581298828125,
0.033294677734375,
-0.04052734375,
0.0044708251953125,
0.002315521240234375,
0.060455322265625,
0.06048583984375,
-0.007442474365234375,
0.0162200927734375,
-0.012481689453125,
0.005092620849609375,
-0.024932861328125,
-0.056640625,
0.0033893585205078125,
0.044891357421875,
-0.054229736328125,
-0.0219268798828125,
-0.0028133392333984375,
-0.06646728515625,
-0.00004088878631591797,
0.01282501220703125,
0.0179443359375,
-0.034393310546875,
-0.0419921875,
0.007778167724609375,
-0.0126495361328125,
0.0258941650390625,
0.030181884765625,
-0.058685302734375,
0.0062103271484375,
0.04705810546875,
0.062164306640625,
-0.0037097930908203125,
-0.017974853515625,
0.005558013916015625,
0.048583984375,
-0.0267791748046875,
0.06195068359375,
-0.01334381103515625,
-0.03265380859375,
-0.003467559814453125,
0.007602691650390625,
0.01381683349609375,
-0.03228759765625,
0.021270751953125,
-0.03692626953125,
-0.007808685302734375,
-0.034088134765625,
-0.007232666015625,
-0.036590576171875,
0.01519775390625,
-0.04864501953125,
0.0843505859375,
0.01293182373046875,
-0.06646728515625,
0.003620147705078125,
-0.032440185546875,
-0.0284576416015625,
0.00021386146545410156,
-0.0197906494140625,
-0.0271453857421875,
-0.0250091552734375,
0.024169921875,
0.0151519775390625,
-0.04119873046875,
-0.0032520294189453125,
-0.00241851806640625,
-0.037933349609375,
0.0236663818359375,
-0.0016307830810546875,
0.07525634765625,
0.02801513671875,
-0.048583984375,
0.00226593017578125,
-0.06146240234375,
0.0006375312805175781,
0.04486083984375,
-0.043182373046875,
0.005657196044921875,
0.00434112548828125,
-0.011444091796875,
0.0153045654296875,
0.057220458984375,
-0.02362060546875,
0.0301361083984375,
-0.040191650390625,
0.0462646484375,
0.06610107421875,
0.011993408203125,
0.0087432861328125,
-0.0450439453125,
0.030609130859375,
0.00969696044921875,
0.0093231201171875,
-0.025604248046875,
-0.0482177734375,
-0.06689453125,
-0.010986328125,
-0.01390838623046875,
0.052734375,
-0.040191650390625,
0.0501708984375,
-0.0147705078125,
-0.044464111328125,
-0.0260009765625,
0.02166748046875,
0.03912353515625,
0.0158538818359375,
0.040435791015625,
-0.0074005126953125,
-0.050018310546875,
-0.059600830078125,
0.006572723388671875,
-0.032623291015625,
0.01020050048828125,
0.020233154296875,
0.056121826171875,
-0.0266265869140625,
0.05401611328125,
-0.0367431640625,
-0.0173492431640625,
-0.02105712890625,
-0.02105712890625,
0.040283203125,
0.034210205078125,
0.04876708984375,
-0.055267333984375,
-0.035614013671875,
0.0210113525390625,
-0.06463623046875,
0.00720977783203125,
-0.00661468505859375,
-0.0200042724609375,
0.0222930908203125,
0.0011539459228515625,
-0.049713134765625,
0.0355224609375,
0.026092529296875,
-0.0191802978515625,
0.06036376953125,
0.0035266876220703125,
0.00173187255859375,
-0.089599609375,
0.01035308837890625,
-0.0200958251953125,
0.01253509521484375,
-0.0272369384765625,
0.016387939453125,
0.00897979736328125,
0.040802001953125,
-0.049041748046875,
0.041961669921875,
-0.03912353515625,
-0.0070037841796875,
-0.0089569091796875,
0.00534820556640625,
-0.00864410400390625,
0.04827880859375,
-0.0181121826171875,
0.060302734375,
0.033843994140625,
-0.046966552734375,
0.0308685302734375,
0.023406982421875,
-0.0316162109375,
0.00560760498046875,
-0.044586181640625,
0.007381439208984375,
0.0016832351684570312,
0.035003662109375,
-0.0867919921875,
-0.0170135498046875,
0.031402587890625,
-0.04296875,
0.0099029541015625,
-0.0021152496337890625,
-0.033843994140625,
-0.042877197265625,
-0.059783935546875,
0.0299530029296875,
0.04083251953125,
-0.04376220703125,
0.0245513916015625,
0.016387939453125,
-0.0013332366943359375,
-0.052337646484375,
-0.051055908203125,
0.003856658935546875,
-0.004974365234375,
-0.055267333984375,
0.03826904296875,
-0.0131683349609375,
-0.0028400421142578125,
-0.0019512176513671875,
-0.0123443603515625,
0.0087890625,
-0.0002377033233642578,
0.014495849609375,
0.02777099609375,
-0.01308441162109375,
-0.020782470703125,
0.02349853515625,
-0.0150146484375,
-0.0106201171875,
0.001132965087890625,
0.04541015625,
-0.00782012939453125,
-0.033416748046875,
-0.0570068359375,
0.023529052734375,
0.02899169921875,
-0.01141357421875,
0.0204315185546875,
0.0555419921875,
-0.016265869140625,
-0.00045108795166015625,
-0.045257568359375,
0.004772186279296875,
-0.041015625,
0.0293426513671875,
-0.028228759765625,
-0.0693359375,
0.053863525390625,
-0.0131988525390625,
0.0218658447265625,
0.057464599609375,
0.04608154296875,
-0.018157958984375,
0.07177734375,
0.03515625,
-0.00701904296875,
0.016815185546875,
-0.052154541015625,
0.0022106170654296875,
-0.07135009765625,
-0.042938232421875,
-0.030029296875,
-0.031982421875,
-0.052337646484375,
-0.0390625,
0.0302276611328125,
0.021484375,
-0.034698486328125,
0.02850341796875,
-0.0584716796875,
0.0111541748046875,
0.032318115234375,
0.0006251335144042969,
0.018829345703125,
0.0009012222290039062,
-0.01837158203125,
0.01505279541015625,
-0.025970458984375,
-0.04815673828125,
0.07049560546875,
0.0183563232421875,
0.03973388671875,
0.0350341796875,
0.044189453125,
0.004535675048828125,
-0.0033779144287109375,
-0.038421630859375,
0.0484619140625,
0.0096893310546875,
-0.04730224609375,
-0.0088653564453125,
0.0017061233520507812,
-0.07171630859375,
0.0240936279296875,
0.01111602783203125,
-0.07867431640625,
0.0250701904296875,
-0.0009388923645019531,
-0.0222015380859375,
0.03173828125,
-0.03375244140625,
0.0178375244140625,
-0.03619384765625,
-0.020782470703125,
-0.01495361328125,
-0.055419921875,
0.045623779296875,
-0.00428009033203125,
0.03155517578125,
-0.028778076171875,
-0.032196044921875,
0.06787109375,
-0.046142578125,
0.049041748046875,
0.0008358955383300781,
-0.0126190185546875,
0.0513916015625,
0.012908935546875,
0.061126708984375,
0.0247955322265625,
-0.00881195068359375,
0.0343017578125,
0.00908660888671875,
-0.0225982666015625,
-0.0184478759765625,
0.052032470703125,
-0.071044921875,
-0.06488037109375,
-0.040802001953125,
-0.005565643310546875,
0.0222320556640625,
0.0279693603515625,
0.028778076171875,
-0.007389068603515625,
0.03155517578125,
0.005214691162109375,
0.01343536376953125,
-0.0360107421875,
0.056610107421875,
0.040863037109375,
-0.02313232421875,
-0.057159423828125,
0.070068359375,
0.01265716552734375,
0.006622314453125,
0.040557861328125,
0.00524139404296875,
-0.0220794677734375,
-0.044677734375,
-0.041015625,
0.035614013671875,
-0.045318603515625,
-0.028961181640625,
-0.032958984375,
-0.007537841796875,
-0.03997802734375,
-0.0188446044921875,
-0.013275146484375,
-0.0285797119140625,
-0.03271484375,
-0.022003173828125,
0.04656982421875,
0.043609619140625,
-0.011993408203125,
0.019622802734375,
-0.05035400390625,
0.04486083984375,
0.00511932373046875,
0.0099334716796875,
0.0233612060546875,
-0.0274810791015625,
-0.0283355712890625,
0.004634857177734375,
-0.0290679931640625,
-0.0802001953125,
0.0340576171875,
-0.00921630859375,
0.041900634765625,
0.03662109375,
-0.0125885009765625,
0.058929443359375,
-0.0173492431640625,
0.07177734375,
0.033660888671875,
-0.064208984375,
0.053802490234375,
-0.035064697265625,
0.00411224365234375,
0.01226043701171875,
0.019012451171875,
-0.0281524658203125,
-0.003887176513671875,
-0.0191497802734375,
-0.07659912109375,
0.0635986328125,
0.0216827392578125,
-0.0146026611328125,
0.006877899169921875,
0.0072174072265625,
-0.0059814453125,
0.0146484375,
-0.055450439453125,
-0.05206298828125,
-0.030181884765625,
-0.0061798095703125,
0.0112152099609375,
-0.0318603515625,
-0.0068206787109375,
-0.024627685546875,
0.04345703125,
0.004291534423828125,
0.047027587890625,
0.0175323486328125,
0.0004911422729492188,
-0.022064208984375,
0.004261016845703125,
0.047607421875,
0.044189453125,
-0.0251312255859375,
0.01091766357421875,
0.0241546630859375,
-0.048919677734375,
0.0193939208984375,
-0.01342010498046875,
-0.035797119140625,
0.002086639404296875,
0.0457763671875,
0.0479736328125,
0.006565093994140625,
-0.042877197265625,
0.02239990234375,
0.01419830322265625,
-0.004344940185546875,
-0.037109375,
0.017364501953125,
0.0199432373046875,
0.0228729248046875,
0.0640869140625,
-0.0232391357421875,
0.0009202957153320312,
-0.041290283203125,
0.00249481201171875,
0.0290374755859375,
0.01556396484375,
-0.0258941650390625,
0.04791259765625,
0.003170013427734375,
-0.00634002685546875,
0.0012969970703125,
-0.0298004150390625,
-0.053741455078125,
0.0872802734375,
0.056640625,
0.046875,
-0.0161590576171875,
0.003582000732421875,
0.059295654296875,
0.04510498046875,
0.00322723388671875,
0.05242919921875,
-0.0045318603515625,
-0.0297698974609375,
-0.01154327392578125,
-0.05853271484375,
-0.00940704345703125,
0.01314544677734375,
-0.0189056396484375,
0.0223388671875,
-0.0556640625,
-0.0068206787109375,
-0.00734710693359375,
0.025299072265625,
-0.03338623046875,
0.00983428955078125,
-0.00240325927734375,
0.06964111328125,
-0.039764404296875,
0.06390380859375,
0.03948974609375,
-0.041961669921875,
-0.06640625,
-0.00769805908203125,
0.019012451171875,
-0.0777587890625,
0.0494384765625,
0.017120361328125,
-0.0106201171875,
0.006927490234375,
-0.032745361328125,
-0.113037109375,
0.12200927734375,
0.00196075439453125,
-0.0400390625,
0.016387939453125,
0.00914764404296875,
0.0240325927734375,
-0.0105438232421875,
0.03271484375,
0.0304412841796875,
0.061309814453125,
0.01406097412109375,
-0.07037353515625,
0.043701171875,
-0.04376220703125,
0.013916015625,
-0.00997161865234375,
-0.09222412109375,
0.07855224609375,
-0.029632568359375,
-0.031890869140625,
0.01245880126953125,
0.048675537109375,
0.0439453125,
0.02850341796875,
0.0155029296875,
0.03857421875,
0.03485107421875,
-0.01071929931640625,
0.057373046875,
-0.031982421875,
0.034027099609375,
0.040008544921875,
-0.01343536376953125,
0.06390380859375,
0.0209808349609375,
-0.06134033203125,
0.0452880859375,
0.06561279296875,
-0.029449462890625,
0.0482177734375,
-0.0012111663818359375,
-0.024169921875,
0.01029205322265625,
-0.0050048828125,
-0.06427001953125,
0.01153564453125,
0.035003662109375,
-0.00978851318359375,
-0.0030689239501953125,
-0.026519775390625,
0.0128936767578125,
-0.03173828125,
-0.019744873046875,
0.055633544921875,
0.0013418197631835938,
-0.0341796875,
0.06683349609375,
0.012725830078125,
0.0699462890625,
-0.040863037109375,
-0.00797271728515625,
-0.0198822021484375,
-0.00876617431640625,
-0.04833984375,
-0.058441162109375,
0.0005941390991210938,
0.007740020751953125,
-0.00846099853515625,
0.03424072265625,
0.044097900390625,
0.003093719482421875,
-0.036285400390625,
0.038360595703125,
0.005340576171875,
0.0238800048828125,
0.03375244140625,
-0.05853271484375,
0.0285491943359375,
0.0298309326171875,
-0.040069580078125,
0.029296875,
0.0157012939453125,
-0.0033283233642578125,
0.056243896484375,
0.07427978515625,
0.011138916015625,
0.0191650390625,
-0.024383544921875,
0.0751953125,
-0.06866455078125,
-0.03717041015625,
-0.06512451171875,
0.048675537109375,
0.01071929931640625,
-0.021697998046875,
0.05755615234375,
0.0511474609375,
0.05499267578125,
0.0084381103515625,
0.07269287109375,
-0.0183258056640625,
0.033294677734375,
-0.027313232421875,
0.03826904296875,
-0.0506591796875,
0.019073486328125,
-0.0176239013671875,
-0.043243408203125,
-0.016265869140625,
0.051422119140625,
0.0041046142578125,
0.0173797607421875,
0.056915283203125,
0.0728759765625,
0.0235595703125,
-0.01812744140625,
-0.007305145263671875,
0.023956298828125,
0.034576416015625,
0.08038330078125,
0.04339599609375,
-0.040802001953125,
0.05072021484375,
-0.04681396484375,
-0.00724029541015625,
-0.047454833984375,
-0.048065185546875,
-0.05035400390625,
-0.0291748046875,
-0.01422119140625,
-0.01497650146484375,
-0.003658294677734375,
0.055572509765625,
0.0286102294921875,
-0.058685302734375,
-0.0301513671875,
0.0009899139404296875,
0.03253173828125,
-0.0261383056640625,
-0.01239776611328125,
0.06304931640625,
0.0169830322265625,
-0.0445556640625,
0.0244903564453125,
0.0374755859375,
0.0132904052734375,
-0.01218414306640625,
-0.022430419921875,
-0.0249176025390625,
-0.003662109375,
0.0311279296875,
0.033111572265625,
-0.0814208984375,
-0.0022487640380859375,
-0.01152801513671875,
-0.0159149169921875,
0.00766754150390625,
0.01371002197265625,
-0.03887939453125,
0.0070953369140625,
0.04132080078125,
0.00046563148498535156,
0.03387451171875,
-0.006542205810546875,
-0.01074981689453125,
-0.016204833984375,
0.0193328857421875,
-0.00836181640625,
0.033416748046875,
-0.004634857177734375,
-0.044342041015625,
0.044921875,
0.029083251953125,
-0.02459716796875,
-0.056060791015625,
0.0089263916015625,
-0.09857177734375,
-0.033203125,
0.09857177734375,
-0.00896453857421875,
-0.037139892578125,
0.0037288665771484375,
-0.031982421875,
0.0509033203125,
-0.040679931640625,
0.053955078125,
0.028076171875,
-0.01027679443359375,
0.0019550323486328125,
-0.046295166015625,
0.027099609375,
0.01029205322265625,
-0.054351806640625,
-0.0257110595703125,
0.019866943359375,
0.0168609619140625,
0.031890869140625,
0.0487060546875,
0.0007166862487792969,
0.037872314453125,
-0.009674072265625,
0.01514434814453125,
-0.0262908935546875,
0.01007080078125,
-0.0067596435546875,
0.00576019287109375,
-0.00931549072265625,
-0.01580810546875
]
] |
oliverguhr/spelling-correction-english-base | 2023-03-16T18:44:40.000Z | [
"transformers",
"pytorch",
"tensorboard",
"safetensors",
"bart",
"text2text-generation",
"en",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | text2text-generation | oliverguhr | null | null | oliverguhr/spelling-correction-english-base | 39 | 7,003 | transformers | 2022-05-23T15:26:21 | ---
language:
- en
license: mit
widget:
- text: "lets do a comparsion"
example_title: "1"
- text: "Their going to be here so0n"
example_title: "2"
- text: "ze shop is cloed due to covid 19"
example_title: "3"
metrics:
- cer
---
This is an experimental model that should fix your typos and punctuation.
If you would like to run your own experiments or train a model for a different language, have a look at [the code](https://github.com/oliverguhr/spelling).
## Model description
This is a proof of concept spelling correction model for English.
## Intended uses & limitations
This project is a work in progress; be aware that the model can produce artefacts.
You can test the model using the pipeline interface:
```python
from transformers import pipeline
fix_spelling = pipeline("text2text-generation", model="oliverguhr/spelling-correction-english-base")
print(fix_spelling("lets do a comparsion",max_length=2048))
```
| 920 | [
[
0.037322998046875,
-0.050323486328125,
0.029998779296875,
0.019561767578125,
-0.0010051727294921875,
0.00505828857421875,
-0.006500244140625,
-0.03424072265625,
0.01373291015625,
0.041595458984375,
-0.02655029296875,
-0.018096923828125,
-0.030609130859375,
0.04046630859375,
-0.01934814453125,
0.05462646484375,
-0.0218963623046875,
0.0271148681640625,
-0.0147857666015625,
-0.0036296844482421875,
-0.050079345703125,
-0.038360595703125,
-0.0780029296875,
-0.0078887939453125,
0.044708251953125,
0.044921875,
0.0380859375,
0.029083251953125,
0.03375244140625,
0.0305938720703125,
-0.01861572265625,
0.0165863037109375,
-0.01055908203125,
0.00957489013671875,
-0.02752685546875,
-0.060333251953125,
-0.02685546875,
0.00896453857421875,
0.043701171875,
0.040252685546875,
-0.0131988525390625,
-0.0005154609680175781,
0.006671905517578125,
0.024749755859375,
-0.03076171875,
0.003971099853515625,
-0.046661376953125,
-0.006000518798828125,
-0.01519012451171875,
0.007083892822265625,
-0.042572021484375,
-0.020233154296875,
-0.00899505615234375,
-0.05712890625,
0.02880859375,
0.051605224609375,
0.06890869140625,
0.0256195068359375,
-0.01346588134765625,
-0.00502777099609375,
-0.036834716796875,
0.046539306640625,
-0.065673828125,
0.0250396728515625,
0.033447265625,
0.01171875,
-0.0155181884765625,
-0.0869140625,
-0.052642822265625,
-0.0094757080078125,
-0.0157318115234375,
0.0084991455078125,
0.0259857177734375,
0.01557159423828125,
0.039520263671875,
0.033233642578125,
-0.0548095703125,
-0.009307861328125,
-0.03851318359375,
-0.05023193359375,
0.0283355712890625,
0.0084991455078125,
0.02880859375,
-0.0180206298828125,
0.01081085205078125,
-0.0029888153076171875,
-0.0285797119140625,
0.0031299591064453125,
0.04193115234375,
0.0234527587890625,
-0.00844573974609375,
0.07647705078125,
-0.0182037353515625,
0.0341796875,
0.01227569580078125,
0.011566162109375,
0.0178985595703125,
-0.006298065185546875,
-0.029632568359375,
-0.01042938232421875,
0.048919677734375,
0.020294189453125,
0.02935791015625,
-0.01898193359375,
-0.0127716064453125,
0.02447509765625,
0.005840301513671875,
-0.0345458984375,
-0.041412353515625,
0.0254974365234375,
-0.007694244384765625,
-0.0411376953125,
-0.01873779296875,
0.0006823539733886719,
-0.0226593017578125,
-0.006793975830078125,
0.0657958984375,
-0.057708740234375,
-0.003582000732421875,
0.034149169921875,
-0.039520263671875,
-0.039794921875,
0.005069732666015625,
-0.0531005859375,
0.00406646728515625,
0.0309295654296875,
0.058380126953125,
0.037872314453125,
-0.061553955078125,
-0.03924560546875,
-0.0038013458251953125,
-0.01328277587890625,
0.07293701171875,
-0.0222625732421875,
-0.007526397705078125,
0.016357421875,
0.024749755859375,
-0.033111572265625,
-0.040557861328125,
0.0576171875,
-0.0236053466796875,
0.05914306640625,
0.003047943115234375,
-0.058074951171875,
0.0021495819091796875,
0.018524169921875,
-0.054473876953125,
0.07196044921875,
0.00905609130859375,
-0.06915283203125,
0.006622314453125,
-0.062103271484375,
-0.039886474609375,
0.0124664306640625,
0.032257080078125,
-0.053802490234375,
0.00335693359375,
-0.01161956787109375,
-0.006237030029296875,
0.0231781005859375,
0.007312774658203125,
0.0125579833984375,
-0.0215911865234375,
0.05029296875,
-0.0011510848999023438,
0.06341552734375,
-0.0014486312866210938,
-0.0100555419921875,
-0.00820159912109375,
-0.059600830078125,
0.0070343017578125,
-0.0136260986328125,
-0.030975341796875,
-0.01352691650390625,
0.006946563720703125,
0.012298583984375,
0.01316070556640625,
0.026763916015625,
-0.0462646484375,
0.0232696533203125,
-0.038665771484375,
0.06988525390625,
0.0222625732421875,
0.0006561279296875,
0.013702392578125,
-0.03424072265625,
0.0217437744140625,
0.0190887451171875,
0.007762908935546875,
-0.01349639892578125,
-0.0281829833984375,
-0.081298828125,
-0.00844573974609375,
0.06390380859375,
0.043792724609375,
-0.0745849609375,
0.02099609375,
0.0191497802734375,
-0.0213470458984375,
-0.047698974609375,
-0.0166473388671875,
0.030670166015625,
0.045867919921875,
0.0222930908203125,
-0.01328277587890625,
-0.0701904296875,
-0.0587158203125,
-0.0465087890625,
-0.03619384765625,
-0.0122833251953125,
-0.0143280029296875,
0.049224853515625,
-0.0166015625,
0.060333251953125,
-0.0362548828125,
-0.022979736328125,
0.0176544189453125,
0.025909423828125,
0.01067352294921875,
0.035369873046875,
0.018280029296875,
-0.041290283203125,
-0.0163726806640625,
-0.021820068359375,
-0.0347900390625,
-0.028076171875,
0.022979736328125,
-0.00785064697265625,
0.02252197265625,
0.018951416015625,
-0.037353515625,
0.02117919921875,
0.0369873046875,
-0.050079345703125,
0.06939697265625,
-0.0260009765625,
0.0278472900390625,
-0.07720947265625,
0.0035037994384765625,
-0.0142974853515625,
-0.033782958984375,
-0.0265655517578125,
0.028564453125,
0.039306640625,
-0.00981903076171875,
-0.060394287109375,
0.026611328125,
-0.01470947265625,
0.034515380859375,
-0.024505615234375,
-0.00896453857421875,
0.026580810546875,
0.01120758056640625,
0.01039886474609375,
0.07159423828125,
0.033447265625,
-0.037353515625,
0.02679443359375,
0.037750244140625,
-0.00972747802734375,
0.042327880859375,
-0.0533447265625,
0.01200103759765625,
0.024322509765625,
0.0037975311279296875,
-0.066650390625,
-0.033294677734375,
0.027374267578125,
-0.040618896484375,
0.0029582977294921875,
-0.00262451171875,
-0.059600830078125,
-0.0487060546875,
-0.00860595703125,
0.01490020751953125,
0.0253143310546875,
-0.0455322265625,
0.02001953125,
-0.0016908645629882812,
-0.00624847412109375,
-0.0254058837890625,
-0.05194091796875,
0.007312774658203125,
-0.010101318359375,
-0.069580078125,
0.0128631591796875,
-0.039459228515625,
-0.0223236083984375,
-0.00827789306640625,
0.0213623046875,
-0.033843994140625,
-0.003021240234375,
0.0075531005859375,
0.01367950439453125,
-0.0024261474609375,
0.02056884765625,
-0.005558013916015625,
0.006839752197265625,
-0.005901336669921875,
-0.0260009765625,
0.036865234375,
0.00004869699478149414,
0.00748443603515625,
-0.010101318359375,
-0.0013570785522460938,
0.040557861328125,
-0.01922607421875,
0.04345703125,
0.0221710205078125,
-0.04608154296875,
-0.0120697021484375,
-0.0447998046875,
-0.0159912109375,
-0.0391845703125,
0.0294342041015625,
-0.0225982666015625,
-0.0479736328125,
0.029083251953125,
0.0020389556884765625,
-0.0180206298828125,
0.039581298828125,
0.012054443359375,
-0.0328369140625,
0.0584716796875,
0.0543212890625,
0.0211639404296875,
-0.00232696533203125,
-0.0052337646484375,
0.03289794921875,
-0.055145263671875,
-0.00335693359375,
-0.0361328125,
-0.011962890625,
-0.0213623046875,
-0.018096923828125,
0.003223419189453125,
0.01473236083984375,
-0.0026645660400390625,
0.0570068359375,
-0.035919189453125,
0.037200927734375,
0.044189453125,
0.00891876220703125,
0.03302001953125,
-0.001972198486328125,
-0.0305938720703125,
-0.0253753662109375,
-0.05255126953125,
-0.042724609375,
0.06365966796875,
0.01202392578125,
0.036529541015625,
-0.01763916015625,
0.01715087890625,
0.0106048583984375,
-0.0017824172973632812,
-0.040924072265625,
0.023651123046875,
-0.01898193359375,
-0.052032470703125,
-0.01259613037109375,
0.0006709098815917969,
-0.041656494140625,
-0.01538848876953125,
-0.0253448486328125,
-0.039031982421875,
-0.00896453857421875,
0.035614013671875,
-0.01190185546875,
0.035400390625,
-0.0675048828125,
0.064208984375,
-0.0014486312866210938,
-0.036529541015625,
-0.00994110107421875,
-0.034515380859375,
0.023651123046875,
-0.00525665283203125,
0.0196380615234375,
0.01143646240234375,
0.0230560302734375,
0.03826904296875,
-0.053375244140625,
0.06488037109375,
0.016510009765625,
-0.01168060302734375,
0.0101776123046875,
0.0019121170043945312,
0.032684326171875,
-0.00041866302490234375,
-0.0232391357421875,
0.023284912109375,
-0.0161590576171875,
-0.01275634765625,
-0.006481170654296875,
0.034454345703125,
-0.0390625,
-0.0159759521484375,
-0.03692626953125,
-0.072265625,
0.0171661376953125,
-0.0014371871948242188,
0.0518798828125,
0.030975341796875,
-0.0284576416015625,
-0.0136566162109375,
0.030303955078125,
0.0027484893798828125,
0.048431396484375,
0.05560302734375,
-0.02032470703125,
-0.035400390625,
0.04351806640625,
0.0154876708984375,
0.00384521484375,
0.0377197265625,
0.00917816162109375,
-0.01546478271484375,
-0.00811004638671875,
-0.0357666015625,
0.02349853515625,
-0.032318115234375,
-0.0028705596923828125,
-0.0465087890625,
-0.023590087890625,
-0.06317138671875,
-0.00569915771484375,
-0.01043701171875,
-0.0576171875,
-0.024200439453125,
0.01409149169921875,
0.04193115234375,
0.0474853515625,
-0.0303802490234375,
0.054931640625,
-0.0295257568359375,
0.018096923828125,
0.027618408203125,
0.040008544921875,
-0.0411376953125,
-0.06085205078125,
-0.0411376953125,
0.0213470458984375,
-0.051422119140625,
-0.04345703125,
0.0394287109375,
0.01849365234375,
-0.0007939338684082031,
0.01458740234375,
-0.0201873779296875,
0.040771484375,
-0.0731201171875,
0.07470703125,
0.0128631591796875,
-0.0867919921875,
0.040771484375,
-0.0011653900146484375,
0.0108489990234375,
0.0229034423828125,
0.0008249282836914062,
-0.08917236328125,
-0.064453125,
-0.0697021484375,
-0.046173095703125,
0.038543701171875,
0.03851318359375,
-0.0275726318359375,
-0.004573822021484375,
0.037200927734375,
0.008544921875,
0.02252197265625,
-0.055267333984375,
-0.01383209228515625,
-0.01300811767578125,
-0.02105712890625,
0.005352020263671875,
-0.028656005859375,
-0.0087127685546875,
-0.015350341796875,
0.09283447265625,
0.0031890869140625,
-0.01515960693359375,
0.020294189453125,
-0.034759521484375,
-0.0273590087890625,
0.0201873779296875,
0.046905517578125,
0.046875,
-0.01273345947265625,
0.004833221435546875,
0.0169830322265625,
-0.041961669921875,
0.005855560302734375,
0.02044677734375,
-0.0245208740234375,
0.03375244140625,
0.012847900390625,
0.04815673828125,
0.006069183349609375,
-0.035919189453125,
0.027252197265625,
-0.0159149169921875,
-0.03753662109375,
-0.042816162109375,
0.00579071044921875,
0.0124053955078125,
0.0158843994140625,
0.01153564453125,
0.031280517578125,
0.00521087646484375,
-0.0380859375,
0.0023975372314453125,
0.01244354248046875,
0.0021533966064453125,
-0.0257568359375,
0.08489990234375,
0.034454345703125,
-0.0809326171875,
0.041900634765625,
-0.01922607421875,
-0.060546875,
0.050079345703125,
0.0450439453125,
0.04730224609375,
-0.030426025390625,
0.025665283203125,
0.035430908203125,
0.0400390625,
-0.0269622802734375,
0.042510986328125,
0.03460693359375,
-0.053009033203125,
-0.0297088623046875,
-0.0291748046875,
-0.047821044921875,
0.032470703125,
-0.0245819091796875,
0.05169677734375,
-0.020263671875,
0.00180816650390625,
0.00006300210952758789,
0.005809783935546875,
-0.045501708984375,
0.03076171875,
0.004791259765625,
0.06256103515625,
-0.09283447265625,
0.060211181640625,
0.07269287109375,
-0.029541015625,
-0.068359375,
-0.01013946533203125,
-0.031402587890625,
-0.053802490234375,
0.03338623046875,
0.02099609375,
0.022186279296875,
0.00954437255859375,
-0.06903076171875,
-0.06256103515625,
0.056793212890625,
0.01180267333984375,
-0.07305908203125,
-0.0081329345703125,
0.039947509765625,
0.048431396484375,
-0.00705718994140625,
0.029998779296875,
0.041717529296875,
0.0496826171875,
-0.0421142578125,
-0.0753173828125,
-0.0372314453125,
-0.0162200927734375,
0.0035858154296875,
-0.0011501312255859375,
-0.03912353515625,
0.07073974609375,
-0.0195159912109375,
-0.01132965087890625,
0.0517578125,
0.0394287109375,
0.0197296142578125,
0.0142822265625,
0.04681396484375,
0.069580078125,
0.0601806640625,
-0.017120361328125,
0.043975830078125,
-0.0411376953125,
0.052093505859375,
0.0755615234375,
-0.031158447265625,
0.046142578125,
0.047393798828125,
-0.00937652587890625,
0.047607421875,
0.053497314453125,
-0.00461578369140625,
0.0239410400390625,
0.0246124267578125,
-0.01079559326171875,
-0.0027942657470703125,
-0.00792694091796875,
-0.017181396484375,
0.0012540817260742188,
0.007434844970703125,
-0.0229339599609375,
0.01476287841796875,
0.023651123046875,
0.00370025634765625,
0.031280517578125,
-0.01557159423828125,
0.01554107666015625,
-0.005496978759765625,
-0.039398193359375,
0.0300445556640625,
0.01084136962890625,
0.0745849609375,
-0.022125244140625,
-0.0046844482421875,
-0.0123443603515625,
0.0355224609375,
-0.0012502670288085938,
-0.0452880859375,
0.007381439208984375,
-0.01013946533203125,
-0.04119873046875,
-0.0069732666015625,
0.05450439453125,
-0.072265625,
-0.062164306640625,
0.01495361328125,
0.01318359375,
0.0235443115234375,
-0.0156097412109375,
-0.0401611328125,
-0.00498199462890625,
-0.01538848876953125,
0.0111846923828125,
-0.037750244140625,
0.029083251953125,
0.0183258056640625,
0.0223541259765625,
0.0219879150390625,
0.0254058837890625,
0.0081787109375,
0.0232391357421875,
0.01983642578125,
-0.035797119140625,
-0.0386962890625,
-0.04156494140625,
0.05133056640625,
-0.01277923583984375,
-0.041534423828125,
0.0455322265625,
0.0726318359375,
0.08074951171875,
-0.049468994140625,
0.0753173828125,
-0.01451873779296875,
0.057525634765625,
-0.0305633544921875,
0.051116943359375,
-0.027801513671875,
-0.025970458984375,
0.0023593902587890625,
-0.046875,
-0.0215911865234375,
0.038177490234375,
-0.003749847412109375,
-0.05181884765625,
0.10675048828125,
0.06805419921875,
-0.00957489013671875,
-0.007282257080078125,
0.034912109375,
0.0601806640625,
-0.0027618408203125,
0.0400390625,
0.061431884765625,
-0.075927734375,
0.03778076171875,
-0.048065185546875,
0.006748199462890625,
-0.01486968994140625,
-0.05816650390625,
-0.063720703125,
-0.040252685546875,
-0.05242919921875,
-0.04742431640625,
0.04248046875,
0.053497314453125,
0.029998779296875,
-0.054229736328125,
-0.01580810546875,
-0.0175323486328125,
0.001949310302734375,
-0.0307159423828125,
-0.01904296875,
0.00946807861328125,
-0.034912109375,
-0.07733154296875,
0.0221405029296875,
0.0006885528564453125,
0.00647735595703125,
0.003978729248046875,
0.010955810546875,
-0.0287933349609375,
-0.0002739429473876953,
0.0308074951171875,
0.019622802734375,
-0.0312347412109375,
-0.0546875,
0.0102996826171875,
-0.0165863037109375,
0.0197601318359375,
0.052978515625,
-0.0474853515625,
0.019683837890625,
0.036834716796875,
-0.004512786865234375,
-0.0009441375732421875,
-0.0281829833984375,
0.05938720703125,
-0.074462890625,
0.0215301513671875,
-0.003902435302734375,
0.071044921875,
0.04949951171875,
-0.0479736328125,
0.048187255859375,
0.048065185546875,
-0.06817626953125,
-0.04443359375,
0.00165557861328125,
-0.048431396484375,
-0.01422882080078125,
0.08184814453125,
0.00226593017578125,
-0.0271148681640625,
0.004497528076171875,
-0.0479736328125,
0.05731201171875,
-0.01259613037109375,
0.086181640625,
0.06634521484375,
0.033233642578125,
0.00820159912109375,
-0.01983642578125,
0.0208740234375,
0.0543212890625,
-0.0411376953125,
-0.00878143310546875,
0.034454345703125,
0.051483154296875,
0.0162353515625,
0.01549530029296875,
0.01027679443359375,
0.033935546875,
-0.0132904052734375,
0.060333251953125,
-0.00366973876953125,
-0.0158538818359375,
-0.0261688232421875,
-0.018951416015625,
-0.0011243820190429688,
-0.036529541015625
]
] |
riffusion/riffusion-model-v1 | 2023-06-05T16:27:41.000Z | [
"diffusers",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-audio",
"arxiv:2112.10752",
"arxiv:2103.00020",
"arxiv:2205.11487",
"arxiv:2210.08402",
"license:creativeml-openrail-m",
"endpoints_compatible",
"has_space",
"diffusers:StableDiffusionPipeline",
"region:us"
] | text-to-audio | riffusion | null | null | riffusion/riffusion-model-v1 | 487 | 6,992 | diffusers | 2022-12-13T02:28:17 | ---
license: creativeml-openrail-m
tags:
- stable-diffusion
- stable-diffusion-diffusers
- text-to-audio
inference: true
extra_gated_prompt: |-
This model is open access and available to all, with a CreativeML OpenRAIL-M license further specifying rights and usage.
The CreativeML OpenRAIL License specifies:
1. You can't use the model to deliberately produce nor share illegal or harmful outputs or content
2. Riffusion claims no rights on the outputs you generate, you are free to use them and are accountable for their use which must not go against the provisions set in the license
3. You may re-distribute the weights and use the model commercially and/or as a service. If you do, please be aware you have to include the same use restrictions as the ones in the license and share a copy of the CreativeML OpenRAIL-M to all your users (please read the license entirely and carefully)
Please read the full license carefully here: https://huggingface.co/spaces/CompVis/stable-diffusion-license
extra_gated_heading: Please read the LICENSE to access this model
---
# Riffusion
Riffusion is an app for real-time music generation with stable diffusion.
Read about it at https://www.riffusion.com/about and try it at https://www.riffusion.com/.
* Code: https://github.com/riffusion/riffusion
* Web app: https://github.com/hmartiro/riffusion-app
* Model checkpoint: https://huggingface.co/riffusion/riffusion-model-v1
* Discord: https://discord.gg/yu6SRwvX4v
This repository contains the model files, including:
* a diffusers-formatted model
* a compiled checkpoint file
* a traced unet for improved inference speed
* a seed image library for use with riffusion-app
## Riffusion v1 Model
Riffusion is a latent text-to-image diffusion model capable of generating spectrogram images given any text input. These spectrograms can be converted into audio clips.
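To make the spectrogram representation concrete, here is a minimal NumPy sketch of computing a magnitude spectrogram from a waveform. The sample rate, frame size, and hop length below are illustrative assumptions, not the model's actual settings, and the real Riffusion pipeline additionally maps magnitudes to image pixels (and back to audio via phase reconstruction):

```python
import numpy as np

# Illustrative parameters (not Riffusion's actual settings).
sr = 8000                       # sample rate in Hz
t = np.arange(sr) / sr
signal = np.sin(2 * np.pi * 440 * t)  # one second of a 440 Hz tone

frame, hop = 256, 128           # window length and hop size in samples
window = np.hanning(frame)

# Slice the waveform into overlapping windowed frames.
frames = np.stack([signal[i:i + frame] * window
                   for i in range(0, len(signal) - frame, hop)])

# Magnitude of the real FFT of each frame; transpose so rows are
# frequency bins and columns are time frames, as in a spectrogram image.
spectrogram = np.abs(np.fft.rfft(frames, axis=1)).T

print(spectrogram.shape)
```

A pure tone like this shows up as a single bright horizontal band in the resulting array, which is the kind of structure the diffusion model learns to generate as an image.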
The model was created by [Seth Forsgren](https://sethforsgren.com/) and [Hayk Martiros](https://haykmartiros.com/) as a hobby project.
You can use the Riffusion model directly, or try the [Riffusion web app](https://www.riffusion.com/).
The Riffusion model was created by fine-tuning the **Stable-Diffusion-v1-5** checkpoint. Read more about Stable Diffusion in [🤗's Stable Diffusion blog](https://huggingface.co/blog/stable_diffusion).
### Model Details
- **Developed by:** Seth Forsgren, Hayk Martiros
- **Model type:** Diffusion-based text-to-image generation model
- **Language(s):** English
- **License:** [The CreativeML OpenRAIL M license](https://huggingface.co/spaces/CompVis/stable-diffusion-license) is an [Open RAIL M license](https://www.licenses.ai/blog/2022/8/18/naming-convention-of-responsible-ai-licenses), adapted from the work that [BigScience](https://bigscience.huggingface.co/) and [the RAIL Initiative](https://www.licenses.ai/) are jointly carrying out in the area of responsible AI licensing. See also [the article about the BLOOM Open RAIL license](https://bigscience.huggingface.co/blog/the-bigscience-rail-license) on which our license is based.
- **Model Description:** This is a model that can be used to generate and modify images based on text prompts. It is a [Latent Diffusion Model](https://arxiv.org/abs/2112.10752) that uses a fixed, pretrained text encoder ([CLIP ViT-L/14](https://arxiv.org/abs/2103.00020)) as suggested in the [Imagen paper](https://arxiv.org/abs/2205.11487).
### Direct Use
The model is intended for research purposes only. Possible research areas and
tasks include:
- Generation of artworks, audio, and use in creative processes.
- Applications in educational or creative tools.
- Research on generative models.
### Datasets
The original Stable Diffusion v1.5 was trained on the [LAION-5B](https://arxiv.org/abs/2210.08402) dataset using the [CLIP text encoder](https://openai.com/blog/clip/), which provided an amazing starting point with an in-depth understanding of language, including musical concepts. The team at LAION also compiled a fantastic audio dataset from many general, speech, and music sources that we recommend at [LAION-AI/audio-dataset](https://github.com/LAION-AI/audio-dataset/blob/main/data_collection/README.md).
### Fine Tuning
Check out the [diffusers training examples](https://huggingface.co/docs/diffusers/training/overview) from Hugging Face. Fine tuning requires a dataset of spectrogram images of short audio clips, with associated text describing them. Note that the CLIP encoder is able to understand and connect many words even if they never appear in the dataset. It is also possible to use a [dreambooth](https://huggingface.co/blog/dreambooth) method to get custom styles.
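As a concrete sketch of such a dataset, the common image-folder convention pairs each spectrogram image with a caption in a `metadata.jsonl` file. The file names and captions below are hypothetical examples, not part of any real training set:

```python
import json

# Hypothetical training examples: spectrogram image files paired with
# short text descriptions of the audio they encode.
examples = [
    {"file_name": "spectrograms/funky_synth_solo.png", "text": "funky synth solo"},
    {"file_name": "spectrograms/acoustic_folk_ballad.png", "text": "acoustic folk ballad"},
]

# Write one JSON object per line, the layout expected by image-caption
# dataset loaders that follow the image-folder convention.
with open("metadata.jsonl", "w") as f:
    for example in examples:
        f.write(json.dumps(example) + "\n")
```

Because the text encoder is a pretrained CLIP model, captions can use musical vocabulary that never appears verbatim in the fine-tuning set.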
## Citation
If you build on this work, please cite it as follows:
```
@article{Forsgren_Martiros_2022,
author = {Forsgren, Seth* and Martiros, Hayk*},
title = {{Riffusion - Stable diffusion for real-time music generation}},
url = {https://riffusion.com/about},
year = {2022}
}
```
| 4,953 | [
[
-0.0288543701171875,
-0.07305908203125,
0.039764404296875,
0.0406494140625,
-0.0109710693359375,
-0.009033203125,
-0.0230560302734375,
-0.036163330078125,
0.0141143798828125,
0.03179931640625,
-0.045806884765625,
-0.058349609375,
-0.0347900390625,
-0.023101806640625,
-0.046661376953125,
0.071044921875,
0.0001996755599975586,
0.0092620849609375,
-0.01019287109375,
-0.005786895751953125,
-0.025665283203125,
0.01111602783203125,
-0.0758056640625,
-0.02581787109375,
0.032501220703125,
0.004581451416015625,
0.0302734375,
0.02911376953125,
0.0295867919921875,
0.0206756591796875,
-0.02947998046875,
-0.010772705078125,
-0.027557373046875,
0.025360107421875,
-0.00896453857421875,
-0.011932373046875,
-0.0225830078125,
0.006420135498046875,
0.0400390625,
0.0199127197265625,
-0.0280303955078125,
0.00440216064453125,
-0.01275634765625,
0.043548583984375,
-0.023223876953125,
-0.0211334228515625,
-0.027313232421875,
0.015533447265625,
-0.024566650390625,
0.006893157958984375,
-0.02923583984375,
-0.035858154296875,
0.0087738037109375,
-0.05743408203125,
0.0111083984375,
-0.00316619873046875,
0.08856201171875,
0.022796630859375,
-0.01434326171875,
-0.029571533203125,
-0.07000732421875,
0.0401611328125,
-0.05096435546875,
0.0252532958984375,
0.01505279541015625,
0.036163330078125,
-0.0004780292510986328,
-0.08575439453125,
-0.048370361328125,
-0.0114898681640625,
0.029022216796875,
0.02557373046875,
-0.0203094482421875,
0.0008172988891601562,
0.0169525146484375,
0.041778564453125,
-0.041778564453125,
-0.0233612060546875,
-0.0225067138671875,
-0.01445770263671875,
0.029998779296875,
-0.0086212158203125,
0.047332763671875,
-0.0153045654296875,
-0.03936767578125,
-0.016265869140625,
-0.04638671875,
-0.00487518310546875,
0.0258941650390625,
-0.005840301513671875,
-0.037933349609375,
0.0255126953125,
0.03314208984375,
0.0538330078125,
0.0202484130859375,
-0.00933074951171875,
0.033477783203125,
-0.0295867919921875,
-0.0058135986328125,
-0.01248931884765625,
0.09112548828125,
0.0091400146484375,
0.0069732666015625,
-0.00745391845703125,
-0.020538330078125,
-0.0021266937255859375,
0.003376007080078125,
-0.08270263671875,
-0.030303955078125,
0.0160675048828125,
-0.048126220703125,
-0.03076171875,
-0.0166778564453125,
-0.050262451171875,
-0.01358795166015625,
-0.00049591064453125,
0.048126220703125,
-0.06414794921875,
-0.0316162109375,
0.005535125732421875,
-0.03912353515625,
-0.00992584228515625,
0.00438690185546875,
-0.050018310546875,
-0.0055084228515625,
0.0234375,
0.0933837890625,
-0.00760650634765625,
-0.002094268798828125,
0.0010356903076171875,
0.01531219482421875,
-0.0184478759765625,
0.06964111328125,
-0.0309295654296875,
-0.034698486328125,
-0.0141448974609375,
-0.0014553070068359375,
0.0033664703369140625,
-0.0538330078125,
0.0518798828125,
-0.0418701171875,
0.03192138671875,
0.00804901123046875,
-0.057708740234375,
-0.007114410400390625,
-0.0205078125,
-0.05267333984375,
0.07012939453125,
0.01275634765625,
-0.059600830078125,
0.0216522216796875,
-0.08477783203125,
-0.0201568603515625,
-0.001827239990234375,
-0.006114959716796875,
-0.04510498046875,
-0.012542724609375,
-0.006633758544921875,
0.018402099609375,
-0.00778961181640625,
0.03228759765625,
-0.02117919921875,
-0.0272369384765625,
0.02484130859375,
-0.03338623046875,
0.083984375,
0.042877197265625,
-0.030609130859375,
0.007213592529296875,
-0.058929443359375,
-0.0156402587890625,
0.01091766357421875,
-0.0281219482421875,
-0.0219879150390625,
0.0020503997802734375,
0.02685546875,
0.0107269287109375,
0.003932952880859375,
-0.044830322265625,
-0.004222869873046875,
-0.031524658203125,
0.044158935546875,
0.059906005859375,
0.022979736328125,
0.046173095703125,
-0.0268707275390625,
0.046295166015625,
0.004955291748046875,
0.0033702850341796875,
-0.0343017578125,
-0.046173095703125,
-0.038787841796875,
-0.0416259765625,
0.04510498046875,
0.048248291015625,
-0.0540771484375,
0.039276123046875,
-0.005680084228515625,
-0.04132080078125,
-0.045806884765625,
0.0021152496337890625,
0.01155853271484375,
0.04339599609375,
0.0250396728515625,
-0.03265380859375,
-0.0303955078125,
-0.050872802734375,
0.0054473876953125,
0.0018987655639648438,
-0.0193939208984375,
0.040191650390625,
0.0227813720703125,
-0.031219482421875,
0.06781005859375,
-0.039520263671875,
-0.0237579345703125,
0.006175994873046875,
0.0328369140625,
0.0182647705078125,
0.060455322265625,
0.07318115234375,
-0.06341552734375,
-0.05023193359375,
-0.0240631103515625,
-0.048126220703125,
-0.02740478515625,
-0.0137786865234375,
-0.031951904296875,
-0.006221771240234375,
0.0277252197265625,
-0.0587158203125,
0.0011615753173828125,
0.04638671875,
-0.029266357421875,
0.0426025390625,
0.006481170654296875,
0.00904083251953125,
-0.10211181640625,
0.0167236328125,
0.0204925537109375,
-0.035308837890625,
-0.04571533203125,
-0.00801849365234375,
-0.0169677734375,
-0.0163116455078125,
-0.034759521484375,
0.0509033203125,
-0.022857666015625,
0.0206756591796875,
-0.023284912109375,
0.00595855712890625,
-0.0012559890747070312,
0.0197906494140625,
0.00875091552734375,
0.068359375,
0.06939697265625,
-0.033660888671875,
0.0268707275390625,
0.0205841064453125,
-0.04315185546875,
0.05426025390625,
-0.058624267578125,
0.0049285888671875,
-0.0128631591796875,
0.0268707275390625,
-0.07354736328125,
-0.0268096923828125,
0.0389404296875,
-0.04156494140625,
0.024688720703125,
-0.017791748046875,
-0.0322265625,
-0.0230865478515625,
0.003574371337890625,
0.03118896484375,
0.061859130859375,
-0.035308837890625,
0.031951904296875,
0.031585693359375,
-0.0176849365234375,
-0.0225372314453125,
-0.0596923828125,
0.005157470703125,
-0.03399658203125,
-0.057769775390625,
0.04925537109375,
-0.0303955078125,
-0.01384735107421875,
0.0191497802734375,
0.024444580078125,
-0.0218963623046875,
-0.00862884521484375,
0.0280303955078125,
0.0099945068359375,
-0.0156402587890625,
0.02081298828125,
0.0190277099609375,
-0.0022678375244140625,
-0.008209228515625,
-0.03509521484375,
0.03326416015625,
0.01267242431640625,
-0.016204833984375,
-0.047332763671875,
0.019989013671875,
0.03350830078125,
0.004093170166015625,
0.044158935546875,
0.0767822265625,
-0.01849365234375,
-0.0153045654296875,
-0.0220947265625,
-0.0112762451171875,
-0.041778564453125,
-0.0031261444091796875,
-0.00603485107421875,
-0.056915283203125,
0.038482666015625,
-0.022369384765625,
-0.0017719268798828125,
0.034698486328125,
0.049285888671875,
-0.038726806640625,
0.06768798828125,
0.049102783203125,
0.0131683349609375,
0.08447265625,
-0.046295166015625,
-0.00788116455078125,
-0.056976318359375,
-0.00984954833984375,
-0.01541900634765625,
-0.0291290283203125,
-0.020477294921875,
-0.04193115234375,
0.03131103515625,
-0.00463104248046875,
-0.010650634765625,
0.0148773193359375,
-0.049591064453125,
0.03436279296875,
0.0301513671875,
0.006504058837890625,
0.001068115234375,
0.01953125,
-0.016693115234375,
-0.004337310791015625,
-0.034637451171875,
-0.0318603515625,
0.07427978515625,
0.033355712890625,
0.06414794921875,
0.01025390625,
0.05230712890625,
0.051788330078125,
0.021026611328125,
-0.046234130859375,
0.037139892578125,
-0.013031005859375,
-0.061492919921875,
-0.0097808837890625,
-0.0302734375,
-0.068115234375,
0.0041961669921875,
-0.027099609375,
-0.032623291015625,
0.0211334228515625,
0.02001953125,
-0.0183563232421875,
0.0164794921875,
-0.05145263671875,
0.06439208984375,
-0.00943756103515625,
-0.03045654296875,
-0.0068359375,
-0.040985107421875,
0.033111572265625,
-0.0005769729614257812,
0.0200347900390625,
-0.01788330078125,
0.005802154541015625,
0.056427001953125,
-0.0270843505859375,
0.05682373046875,
-0.038360595703125,
-0.01329803466796875,
0.039825439453125,
-0.00516510009765625,
0.01197052001953125,
-0.0030879974365234375,
-0.00534820556640625,
0.0105133056640625,
-0.0012273788452148438,
-0.0221099853515625,
-0.03253173828125,
0.059783935546875,
-0.05084228515625,
-0.00780487060546875,
-0.0015401840209960938,
-0.02862548828125,
0.0136566162109375,
0.03314208984375,
0.049224853515625,
0.0289154052734375,
-0.027008056640625,
0.01180267333984375,
0.07489013671875,
-0.01496124267578125,
0.0304718017578125,
0.02374267578125,
-0.024169921875,
-0.04571533203125,
0.0750732421875,
-0.00202178955078125,
0.016693115234375,
0.008087158203125,
0.0224761962890625,
-0.0163421630859375,
-0.042724609375,
-0.061431884765625,
0.0201873779296875,
-0.041778564453125,
-0.0103607177734375,
-0.049560546875,
-0.0167694091796875,
-0.0286712646484375,
-0.00571441650390625,
-0.0264129638671875,
-0.0474853515625,
-0.0518798828125,
0.002315521240234375,
0.036407470703125,
0.03839111328125,
-0.023651123046875,
0.035675048828125,
-0.04669189453125,
0.034698486328125,
0.007526397705078125,
0.0350341796875,
-0.0034046173095703125,
-0.046539306640625,
-0.0248870849609375,
0.01195526123046875,
-0.043182373046875,
-0.08544921875,
0.0218963623046875,
0.033416748046875,
0.03289794921875,
0.0477294921875,
-0.0264434814453125,
0.0394287109375,
-0.04705810546875,
0.0797119140625,
0.0218963623046875,
-0.0618896484375,
0.050262451171875,
-0.04571533203125,
0.01200103759765625,
0.03448486328125,
0.03326416015625,
-0.048187255859375,
-0.037017822265625,
-0.047332763671875,
-0.080078125,
0.040924072265625,
0.0250396728515625,
0.027099609375,
0.00957489013671875,
0.043975830078125,
-0.00031304359436035156,
0.00949859619140625,
-0.080322265625,
-0.0279541015625,
-0.03631591796875,
-0.03167724609375,
0.00826263427734375,
-0.0073089599609375,
-0.0236663818359375,
-0.0210418701171875,
0.06719970703125,
0.011077880859375,
0.0391845703125,
0.01453399658203125,
-0.000011861324310302734,
-0.0176544189453125,
0.0101318359375,
0.04656982421875,
0.01399993896484375,
-0.00865936279296875,
-0.00290679931640625,
-0.0268707275390625,
-0.045257568359375,
0.025360107421875,
0.005615234375,
-0.051177978515625,
0.012115478515625,
0.00872039794921875,
0.07879638671875,
0.003070831298828125,
-0.0572509765625,
0.0467529296875,
-0.00591278076171875,
-0.0200347900390625,
-0.03912353515625,
0.0235443115234375,
0.0261688232421875,
0.021575927734375,
0.017822265625,
0.0255584716796875,
0.029998779296875,
-0.021514892578125,
0.021942138671875,
0.0288848876953125,
-0.03887939453125,
-0.0247039794921875,
0.07720947265625,
0.00899505615234375,
-0.03118896484375,
0.02032470703125,
-0.023406982421875,
-0.00848388671875,
0.04132080078125,
0.06060791015625,
0.08355712890625,
-0.01496124267578125,
0.0357666015625,
0.035980224609375,
-0.0002663135528564453,
-0.002819061279296875,
0.015228271484375,
-0.0253448486328125,
-0.046051025390625,
-0.0147857666015625,
-0.0496826171875,
-0.01444244384765625,
0.006988525390625,
-0.020172119140625,
0.01812744140625,
-0.041748046875,
-0.04071044921875,
0.007717132568359375,
-0.036590576171875,
-0.01485443115234375,
0.006649017333984375,
0.017852783203125,
0.0771484375,
-0.07574462890625,
0.049591064453125,
0.053131103515625,
-0.042724609375,
-0.042083740234375,
0.0172882080078125,
-0.007160186767578125,
-0.0169219970703125,
0.0316162109375,
-0.01374053955078125,
-0.01076507568359375,
0.006412506103515625,
-0.06817626953125,
-0.0643310546875,
0.08697509765625,
0.0210723876953125,
-0.02294921875,
-0.004901885986328125,
-0.01806640625,
0.057891845703125,
-0.035003662109375,
0.01580810546875,
0.015380859375,
0.03948974609375,
0.043548583984375,
-0.033203125,
-0.00693511962890625,
-0.0305938720703125,
0.0118408203125,
-0.00621795654296875,
-0.052886962890625,
0.06304931640625,
0.0023746490478515625,
-0.03570556640625,
0.0218353271484375,
0.050994873046875,
0.0122528076171875,
0.03411865234375,
0.032867431640625,
0.06536865234375,
0.04571533203125,
-0.0172882080078125,
0.063232421875,
-0.0254974365234375,
0.024261474609375,
0.06500244140625,
0.00653076171875,
0.053375244140625,
0.023406982421875,
-0.00910186767578125,
0.08135986328125,
0.06036376953125,
-0.0237274169921875,
0.06500244140625,
0.001728057861328125,
-0.0311126708984375,
-0.004856109619140625,
-0.0038890838623046875,
-0.030364990234375,
0.006870269775390625,
0.033355712890625,
-0.04827880859375,
0.005168914794921875,
0.0259857177734375,
-0.0018320083618164062,
-0.0050811767578125,
0.0107879638671875,
0.05914306640625,
0.0018453598022460938,
-0.0299530029296875,
0.0478515625,
-0.0001462697982788086,
0.068603515625,
-0.0462646484375,
-0.0011854171752929688,
-0.0037136077880859375,
-0.005939483642578125,
-0.01404571533203125,
-0.06353759765625,
0.0256805419921875,
0.0032787322998046875,
-0.0229644775390625,
-0.0300445556640625,
0.067626953125,
-0.046722412109375,
-0.032958984375,
0.024627685546875,
0.01474761962890625,
0.01361846923828125,
0.027313232421875,
-0.079833984375,
0.01143646240234375,
-0.0029430389404296875,
-0.0150909423828125,
-0.008270263671875,
0.0202178955078125,
0.0244140625,
0.051361083984375,
0.04925537109375,
0.0294647216796875,
0.01023101806640625,
0.012664794921875,
0.05401611328125,
-0.0338134765625,
-0.037261962890625,
-0.041778564453125,
0.037994384765625,
-0.01444244384765625,
0.00431060791015625,
0.051300048828125,
0.05291748046875,
0.045318603515625,
-0.0185546875,
0.0667724609375,
-0.024871826171875,
0.040679931640625,
-0.0394287109375,
0.076171875,
-0.0677490234375,
0.01346588134765625,
-0.043060302734375,
-0.060882568359375,
-0.014434814453125,
0.053466796875,
-0.0105133056640625,
0.017059326171875,
0.0205841064453125,
0.06756591796875,
-0.0262908935546875,
-0.0092010498046875,
0.00524139404296875,
0.0270538330078125,
0.0226593017578125,
0.0207977294921875,
0.06463623046875,
-0.03253173828125,
0.0343017578125,
-0.023529052734375,
-0.018798828125,
-0.0132293701171875,
-0.05352783203125,
-0.052001953125,
-0.05767822265625,
-0.02777099609375,
-0.03765869140625,
0.00118255615234375,
0.04620361328125,
0.07061767578125,
-0.052734375,
-0.0138702392578125,
-0.018798828125,
0.01538848876953125,
-0.0016107559204101562,
-0.0232086181640625,
0.0126495361328125,
-0.002567291259765625,
-0.06732177734375,
0.03143310546875,
0.02093505859375,
0.04638671875,
-0.029022216796875,
-0.00719451904296875,
-0.01806640625,
0.02508544921875,
0.0318603515625,
0.0279541015625,
-0.04034423828125,
0.0023746490478515625,
-0.00788116455078125,
-0.01387786865234375,
0.016693115234375,
0.048919677734375,
-0.03704833984375,
0.034820556640625,
0.046722412109375,
0.01021575927734375,
0.057891845703125,
0.005786895751953125,
0.020721435546875,
-0.049896240234375,
0.02337646484375,
0.0162506103515625,
0.01044464111328125,
0.03021240234375,
-0.0243072509765625,
0.0250396728515625,
0.039031982421875,
-0.05133056640625,
-0.054351806640625,
0.0189056396484375,
-0.07464599609375,
-0.0208282470703125,
0.08233642578125,
-0.00914764404296875,
-0.022491455078125,
0.0135650634765625,
-0.0301361083984375,
0.01812744140625,
-0.036895751953125,
0.05474853515625,
0.0438232421875,
-0.0158538818359375,
-0.02783203125,
-0.03271484375,
0.0521240234375,
0.00969696044921875,
-0.04974365234375,
-0.0225830078125,
0.06536865234375,
0.0469970703125,
0.031707763671875,
0.068115234375,
-0.0168914794921875,
0.0243377685546875,
0.00632476806640625,
0.0182647705078125,
0.005863189697265625,
-0.0206298828125,
-0.0285491943359375,
0.0133819580078125,
-0.00864410400390625,
-0.01061248779296875
]
] |
timm/tf_efficientnetv2_b0.in1k | 2023-04-27T21:38:45.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"arxiv:2104.00298",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/tf_efficientnetv2_b0.in1k | 0 | 6,990 | timm | 2022-12-13T00:14:15 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
datasets:
- imagenet-1k
---
# Model card for tf_efficientnetv2_b0.in1k
An EfficientNet-v2 image classification model. Trained on ImageNet-1k in TensorFlow by paper authors, ported to PyTorch by Ross Wightman.
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 7.1
- GMACs: 0.5
- Activations (M): 3.5
- Image size: train = 192 x 192, test = 224 x 224
- **Papers:**
- EfficientNetV2: Smaller Models and Faster Training: https://arxiv.org/abs/2104.00298
- **Dataset:** ImageNet-1k
- **Original:** https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch  # needed for torch.topk below
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('tf_efficientnetv2_b0.in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
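The `softmax`-then-`topk` step in the last line can be illustrated on its own. The sketch below is plain Python mirroring the behavior of `torch.topk` on a 1-D list of logits (illustrative only, not part of the timm API):

```python
import math

def softmax(logits):
    # numerically stable softmax over a flat list of logits
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def topk(values, k):
    # return (values, indices) of the k largest entries, like torch.topk
    order = sorted(range(len(values)), key=lambda i: values[i], reverse=True)[:k]
    return [values[i] for i in order], order

probs = softmax([2.0, 1.0, 0.5, 3.0])
top_p, top_i = topk([p * 100 for p in probs], 2)
print(top_i)  # indices of the two most likely classes
```

In the real snippet the logits come from `model(...)` and have shape `(1, 1000)`; the percentages returned here correspond to `output.softmax(dim=1) * 100`.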
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'tf_efficientnetv2_b0.in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g.:
# torch.Size([1, 16, 96, 96])
# torch.Size([1, 32, 48, 48])
# torch.Size([1, 48, 24, 24])
# torch.Size([1, 112, 12, 12])
# torch.Size([1, 192, 6, 6])
print(o.shape)
```
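The feature map shapes above follow from the backbone's power-of-two stride reductions relative to the input resolution. A small sketch, assuming the standard reduction factors (2, 4, 8, 16, 32) and the 192-pixel train size:

```python
def feature_map_sizes(input_size, reductions=(2, 4, 8, 16, 32)):
    # each backbone stage halves spatial resolution relative to the last
    return [input_size // r for r in reductions]

print(feature_map_sizes(192))  # [96, 48, 24, 12, 6]
```

These match the spatial dimensions printed in the loop above (96, 48, 24, 12, 6).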
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'tf_efficientnetv2_b0.in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 1280, 6, 6) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@inproceedings{tan2021efficientnetv2,
title={Efficientnetv2: Smaller models and faster training},
author={Tan, Mingxing and Le, Quoc},
booktitle={International conference on machine learning},
pages={10096--10106},
year={2021},
organization={PMLR}
}
```
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
| 4,065 | [
[
-0.026214599609375,
-0.034332275390625,
-0.005039215087890625,
0.006626129150390625,
-0.0242462158203125,
-0.032135009765625,
-0.0194244384765625,
-0.027801513671875,
0.01245880126953125,
0.0288543701171875,
-0.025238037109375,
-0.0477294921875,
-0.054901123046875,
-0.0170135498046875,
-0.01076507568359375,
0.0621337890625,
-0.00865936279296875,
0.0006380081176757812,
-0.01348876953125,
-0.039306640625,
-0.00878143310546875,
-0.0091552734375,
-0.066650390625,
-0.035797119140625,
0.0267181396484375,
0.0239105224609375,
0.037109375,
0.052001953125,
0.05072021484375,
0.034912109375,
-0.004302978515625,
0.0066070556640625,
-0.0252227783203125,
-0.01155853271484375,
0.0294189453125,
-0.044830322265625,
-0.0294036865234375,
0.01131439208984375,
0.053192138671875,
0.0263214111328125,
0.002689361572265625,
0.03546142578125,
0.01012420654296875,
0.041595458984375,
-0.02130126953125,
0.01346588134765625,
-0.026275634765625,
0.014678955078125,
-0.007343292236328125,
0.005664825439453125,
-0.0188446044921875,
-0.0281524658203125,
0.014892578125,
-0.042236328125,
0.03167724609375,
-0.00392913818359375,
0.09613037109375,
0.02386474609375,
-0.00862884521484375,
0.0006098747253417969,
-0.0156097412109375,
0.05291748046875,
-0.054168701171875,
0.01392364501953125,
0.0223388671875,
0.016571044921875,
-0.003391265869140625,
-0.08660888671875,
-0.03485107421875,
-0.0101470947265625,
-0.0167236328125,
-0.004749298095703125,
-0.0241241455078125,
0.004055023193359375,
0.0239105224609375,
0.01325225830078125,
-0.039520263671875,
0.0166778564453125,
-0.0469970703125,
-0.016082763671875,
0.0396728515625,
0.0002582073211669922,
0.0180816650390625,
-0.0178985595703125,
-0.033447265625,
-0.0377197265625,
-0.0239715576171875,
0.025177001953125,
0.02227783203125,
0.01131439208984375,
-0.039581298828125,
0.034423828125,
0.006114959716796875,
0.043182373046875,
-0.00522613525390625,
-0.0262603759765625,
0.043304443359375,
0.004032135009765625,
-0.032012939453125,
-0.0111236572265625,
0.08203125,
0.033660888671875,
0.015655517578125,
0.00601959228515625,
-0.00858306884765625,
-0.0272979736328125,
-0.008544921875,
-0.10003662109375,
-0.03289794921875,
0.02703857421875,
-0.049224853515625,
-0.032928466796875,
0.0203857421875,
-0.041473388671875,
-0.008026123046875,
0.0010318756103515625,
0.044525146484375,
-0.0278778076171875,
-0.03582763671875,
-0.00628662109375,
-0.022552490234375,
0.0220794677734375,
0.01323699951171875,
-0.03717041015625,
0.0117034912109375,
0.03216552734375,
0.09161376953125,
0.004924774169921875,
-0.0278472900390625,
-0.0212860107421875,
-0.02752685546875,
-0.0253143310546875,
0.0323486328125,
-0.00005805492401123047,
-0.0010232925415039062,
-0.02203369140625,
0.02130126953125,
-0.006267547607421875,
-0.053009033203125,
0.01505279541015625,
-0.0189208984375,
0.01442718505859375,
-0.001987457275390625,
-0.01568603515625,
-0.041015625,
0.01806640625,
-0.03509521484375,
0.09716796875,
0.0294189453125,
-0.06634521484375,
0.0170440673828125,
-0.042236328125,
-0.007686614990234375,
-0.0205230712890625,
0.00337982177734375,
-0.08056640625,
-0.00811767578125,
0.0059967041015625,
0.05914306640625,
-0.023193359375,
0.002109527587890625,
-0.044677734375,
-0.019439697265625,
0.0183563232421875,
-0.00017774105072021484,
0.08111572265625,
0.01507568359375,
-0.03619384765625,
0.015655517578125,
-0.038330078125,
0.0184326171875,
0.03857421875,
-0.0185546875,
-0.0028591156005859375,
-0.04571533203125,
0.0171051025390625,
0.0251617431640625,
0.00662994384765625,
-0.036834716796875,
0.0196990966796875,
-0.014312744140625,
0.039581298828125,
0.043670654296875,
-0.014129638671875,
0.023040771484375,
-0.02398681640625,
0.0159759521484375,
0.0210113525390625,
0.01049041748046875,
0.003810882568359375,
-0.039306640625,
-0.0667724609375,
-0.036834716796875,
0.02972412109375,
0.02972412109375,
-0.045806884765625,
0.0288543701171875,
-0.020904541015625,
-0.060089111328125,
-0.03179931640625,
0.00846099853515625,
0.038848876953125,
0.045196533203125,
0.02117919921875,
-0.03265380859375,
-0.034271240234375,
-0.06927490234375,
0.00031375885009765625,
-0.006748199462890625,
0.0022068023681640625,
0.0266265869140625,
0.056854248046875,
-0.004058837890625,
0.04302978515625,
-0.031890869140625,
-0.0211944580078125,
-0.0169525146484375,
0.0059051513671875,
0.025177001953125,
0.06085205078125,
0.06292724609375,
-0.041046142578125,
-0.03857421875,
-0.00931549072265625,
-0.0709228515625,
0.013519287109375,
0.001979827880859375,
-0.0205078125,
0.022186279296875,
0.01528167724609375,
-0.044036865234375,
0.042572021484375,
0.0163421630859375,
-0.031829833984375,
0.0251312255859375,
-0.0173492431640625,
0.01540374755859375,
-0.08526611328125,
0.01348876953125,
0.026824951171875,
-0.018035888671875,
-0.036651611328125,
0.00595855712890625,
0.002391815185546875,
0.0008711814880371094,
-0.040008544921875,
0.05316162109375,
-0.042510986328125,
-0.0186614990234375,
-0.0097808837890625,
-0.022552490234375,
0.00031638145446777344,
0.049041748046875,
-0.00714111328125,
0.0279388427734375,
0.061279296875,
-0.033294677734375,
0.040771484375,
0.029571533203125,
-0.021636962890625,
0.021087646484375,
-0.055999755859375,
0.0181884765625,
-0.00007063150405883789,
0.0148162841796875,
-0.07623291015625,
-0.0254974365234375,
0.0307464599609375,
-0.0469970703125,
0.044830322265625,
-0.04022216796875,
-0.037322998046875,
-0.0401611328125,
-0.035400390625,
0.0267333984375,
0.056854248046875,
-0.058502197265625,
0.030517578125,
0.017974853515625,
0.021148681640625,
-0.044769287109375,
-0.0771484375,
-0.01934814453125,
-0.02777099609375,
-0.060760498046875,
0.0223236083984375,
0.0184326171875,
0.0046844482421875,
0.0166168212890625,
-0.003246307373046875,
-0.0145721435546875,
-0.0027942657470703125,
0.03656005859375,
0.022186279296875,
-0.0249481201171875,
-0.00254058837890625,
-0.021240234375,
-0.0045166015625,
0.00452423095703125,
-0.03167724609375,
0.039276123046875,
-0.0183258056640625,
-0.00402069091796875,
-0.0660400390625,
-0.006969451904296875,
0.028472900390625,
-0.0014286041259765625,
0.06298828125,
0.08941650390625,
-0.03814697265625,
-0.005126953125,
-0.03143310546875,
-0.02520751953125,
-0.036163330078125,
0.04693603515625,
-0.025054931640625,
-0.039031982421875,
0.05438232421875,
0.0053253173828125,
0.0079193115234375,
0.05377197265625,
0.0305633544921875,
-0.006504058837890625,
0.048431396484375,
0.0364990234375,
0.023284912109375,
0.055267333984375,
-0.082763671875,
-0.0184173583984375,
-0.057891845703125,
-0.0406494140625,
-0.0298919677734375,
-0.049530029296875,
-0.053680419921875,
-0.0322265625,
0.033905029296875,
0.019744873046875,
-0.03656005859375,
0.03460693359375,
-0.062042236328125,
0.0078277587890625,
0.052520751953125,
0.043609619140625,
-0.02978515625,
0.031951904296875,
-0.0108489990234375,
0.00445556640625,
-0.0643310546875,
-0.017059326171875,
0.0867919921875,
0.03338623046875,
0.037078857421875,
-0.00313568115234375,
0.047119140625,
-0.0179443359375,
0.0214080810546875,
-0.048431396484375,
0.040130615234375,
-0.0125885009765625,
-0.031280517578125,
-0.01030731201171875,
-0.03985595703125,
-0.077880859375,
0.0142364501953125,
-0.020111083984375,
-0.056640625,
0.0093994140625,
0.0193328857421875,
-0.0149993896484375,
0.05999755859375,
-0.067138671875,
0.07769775390625,
-0.00836944580078125,
-0.040313720703125,
0.00269317626953125,
-0.048492431640625,
0.0235137939453125,
0.0231781005859375,
-0.0244598388671875,
-0.005596160888671875,
0.0016193389892578125,
0.090087890625,
-0.049896240234375,
0.054901123046875,
-0.04132080078125,
0.040924072265625,
0.04345703125,
-0.0074462890625,
0.03179931640625,
-0.00861358642578125,
-0.0115814208984375,
0.02752685546875,
0.0005927085876464844,
-0.03759765625,
-0.03936767578125,
0.047149658203125,
-0.07830810546875,
-0.0214385986328125,
-0.0270843505859375,
-0.02777099609375,
0.0197906494140625,
0.0071563720703125,
0.041473388671875,
0.05157470703125,
0.0182037353515625,
0.0263824462890625,
0.040557861328125,
-0.023162841796875,
0.03985595703125,
-0.0116424560546875,
-0.00879669189453125,
-0.037933349609375,
0.06390380859375,
0.01953125,
0.00899505615234375,
0.0084228515625,
0.01873779296875,
-0.0295257568359375,
-0.0452880859375,
-0.024444580078125,
0.021820068359375,
-0.054473876953125,
-0.038330078125,
-0.0540771484375,
-0.0256805419921875,
-0.031890869140625,
0.001598358154296875,
-0.040924072265625,
-0.0369873046875,
-0.037750244140625,
0.01483917236328125,
0.057586669921875,
0.043304443359375,
-0.016265869140625,
0.045257568359375,
-0.033905029296875,
0.011260986328125,
0.010711669921875,
0.03582763671875,
-0.0008053779602050781,
-0.0653076171875,
-0.013824462890625,
-0.010223388671875,
-0.0300750732421875,
-0.04888916015625,
0.036285400390625,
0.0175018310546875,
0.0310211181640625,
0.029998779296875,
-0.018524169921875,
0.053680419921875,
0.0036067962646484375,
0.03802490234375,
0.03497314453125,
-0.034912109375,
0.039794921875,
0.0014286041259765625,
0.0070343017578125,
0.01042938232421875,
0.019744873046875,
-0.0162353515625,
0.00713348388671875,
-0.06964111328125,
-0.059234619140625,
0.07330322265625,
0.0135650634765625,
-0.0052032470703125,
0.033233642578125,
0.05426025390625,
0.00016307830810546875,
0.001556396484375,
-0.0478515625,
-0.03851318359375,
-0.0290679931640625,
-0.0223388671875,
-0.0008249282836914062,
-0.01105499267578125,
-0.00128936767578125,
-0.049530029296875,
0.055023193359375,
-0.002620697021484375,
0.06402587890625,
0.020965576171875,
-0.005786895751953125,
-0.0013837814331054688,
-0.03497314453125,
0.034271240234375,
0.02081298828125,
-0.0224761962890625,
0.00849151611328125,
0.00882720947265625,
-0.03875732421875,
0.008636474609375,
0.0135498046875,
-0.0027446746826171875,
0.0027408599853515625,
0.038421630859375,
0.07916259765625,
-0.01026153564453125,
0.01100921630859375,
0.03594970703125,
-0.00102996826171875,
-0.0335693359375,
-0.02166748046875,
0.01558685302734375,
0.00278472900390625,
0.035125732421875,
0.01491546630859375,
0.03070068359375,
-0.0073699951171875,
-0.0153350830078125,
0.01806640625,
0.039794921875,
-0.0229339599609375,
-0.02276611328125,
0.050689697265625,
-0.007740020751953125,
-0.0160064697265625,
0.06866455078125,
-0.0151214599609375,
-0.03790283203125,
0.0855712890625,
0.0258331298828125,
0.07025146484375,
0.00579833984375,
0.0037975311279296875,
0.0703125,
0.018585205078125,
-0.007518768310546875,
0.010955810546875,
0.010711669921875,
-0.04901123046875,
0.0069427490234375,
-0.037200927734375,
0.00917816162109375,
0.02435302734375,
-0.037506103515625,
0.024627685546875,
-0.050262451171875,
-0.03131103515625,
0.01080322265625,
0.029205322265625,
-0.07611083984375,
0.01012420654296875,
-0.00447845458984375,
0.0657958984375,
-0.05181884765625,
0.059234619140625,
0.062255859375,
-0.03204345703125,
-0.08319091796875,
-0.01470947265625,
0.0017652511596679688,
-0.0692138671875,
0.047882080078125,
0.03729248046875,
0.01528167724609375,
0.00914764404296875,
-0.057281494140625,
-0.046783447265625,
0.11077880859375,
0.040008544921875,
-0.009246826171875,
0.0199737548828125,
-0.005126953125,
0.01396942138671875,
-0.0290069580078125,
0.04974365234375,
0.0188446044921875,
0.03448486328125,
0.0233001708984375,
-0.047607421875,
0.017547607421875,
-0.0261383056640625,
0.01364898681640625,
0.01276397705078125,
-0.0653076171875,
0.06475830078125,
-0.0413818359375,
-0.0086822509765625,
0.0035400390625,
0.05413818359375,
0.01206207275390625,
0.01168060302734375,
0.038482666015625,
0.06182861328125,
0.040924072265625,
-0.03265380859375,
0.07159423828125,
0.00562286376953125,
0.052154541015625,
0.0426025390625,
0.037933349609375,
0.0401611328125,
0.028533935546875,
-0.01323699951171875,
0.0233612060546875,
0.08343505859375,
-0.029937744140625,
0.0254974365234375,
0.0150909423828125,
0.009033203125,
-0.00882720947265625,
0.00687408447265625,
-0.0311737060546875,
0.042877197265625,
0.00754547119140625,
-0.03857421875,
-0.0167388916015625,
-0.0031833648681640625,
0.0026397705078125,
-0.036956787109375,
-0.0182952880859375,
0.036834716796875,
0.0007176399230957031,
-0.033966064453125,
0.06561279296875,
0.01434326171875,
0.0635986328125,
-0.0276947021484375,
0.0008816719055175781,
-0.016143798828125,
0.019073486328125,
-0.0290985107421875,
-0.060302734375,
0.020050048828125,
-0.01548004150390625,
0.0041046142578125,
0.0027408599853515625,
0.05169677734375,
-0.028900146484375,
-0.0357666015625,
0.01421356201171875,
0.0221099853515625,
0.043426513671875,
0.0034923553466796875,
-0.08984375,
0.0118408203125,
0.005260467529296875,
-0.057159423828125,
0.0185089111328125,
0.0215911865234375,
0.009765625,
0.052093505859375,
0.0386962890625,
-0.0069580078125,
0.0118408203125,
-0.02069091796875,
0.057708740234375,
-0.031463623046875,
-0.023040771484375,
-0.059600830078125,
0.04986572265625,
-0.01009368896484375,
-0.048431396484375,
0.0267791748046875,
0.043731689453125,
0.06298828125,
0.0007691383361816406,
0.034820556640625,
-0.0250701904296875,
-0.007633209228515625,
-0.0313720703125,
0.05694580078125,
-0.059661865234375,
-0.006755828857421875,
-0.0007505416870117188,
-0.053985595703125,
-0.0236968994140625,
0.055206298828125,
-0.01421356201171875,
0.036041259765625,
0.039093017578125,
0.07861328125,
-0.028839111328125,
-0.0305938720703125,
0.00988006591796875,
0.0146026611328125,
0.00713348388671875,
0.036163330078125,
0.0245361328125,
-0.06317138671875,
0.031585693359375,
-0.053680419921875,
-0.01256561279296875,
-0.0183868408203125,
-0.05029296875,
-0.076171875,
-0.059356689453125,
-0.051055908203125,
-0.059600830078125,
-0.01134490966796875,
0.0750732421875,
0.08026123046875,
-0.049041748046875,
-0.00978851318359375,
0.0014333724975585938,
0.013336181640625,
-0.01462554931640625,
-0.0171661376953125,
0.054290771484375,
-0.00542449951171875,
-0.05511474609375,
-0.03106689453125,
-0.005146026611328125,
0.0247344970703125,
0.0023174285888671875,
-0.021240234375,
-0.0070343017578125,
-0.0293121337890625,
0.01340484619140625,
0.020721435546875,
-0.047637939453125,
-0.01258087158203125,
-0.021942138671875,
-0.0155029296875,
0.0298919677734375,
0.033477783203125,
-0.036712646484375,
0.02471923828125,
0.040924072265625,
0.0283203125,
0.0672607421875,
-0.0276336669921875,
-0.006137847900390625,
-0.059356689453125,
0.04608154296875,
-0.00881195068359375,
0.0311737060546875,
0.034332275390625,
-0.022979736328125,
0.048004150390625,
0.03411865234375,
-0.0267791748046875,
-0.0673828125,
-0.0114898681640625,
-0.08154296875,
-0.00876617431640625,
0.07806396484375,
-0.034210205078125,
-0.04058837890625,
0.037933349609375,
0.007404327392578125,
0.055755615234375,
-0.0113983154296875,
0.0271453857421875,
0.01351165771484375,
-0.00701904296875,
-0.044830322265625,
-0.04498291015625,
0.03204345703125,
0.008026123046875,
-0.04681396484375,
-0.03564453125,
-0.00481414794921875,
0.057281494140625,
0.00811767578125,
0.0350341796875,
-0.0027141571044921875,
0.012298583984375,
0.0145721435546875,
0.035736083984375,
-0.048858642578125,
-0.00806427001953125,
-0.022979736328125,
0.006072998046875,
-0.007274627685546875,
-0.047576904296875
]
] |
rinna/japanese-gpt2-small | 2023-03-22T04:13:10.000Z | [
"transformers",
"pytorch",
"tf",
"safetensors",
"gpt2",
"text-generation",
"ja",
"japanese",
"lm",
"nlp",
"dataset:cc100",
"dataset:wikipedia",
"license:mit",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | rinna | null | null | rinna/japanese-gpt2-small | 15 | 6,988 | transformers | 2022-03-02T23:29:05 | ---
language: ja
thumbnail: https://github.com/rinnakk/japanese-gpt2/blob/master/rinna.png
tags:
- ja
- japanese
- gpt2
- text-generation
- lm
- nlp
license: mit
datasets:
- cc100
- wikipedia
widget:
- text: "生命、宇宙、そして万物についての究極の疑問の答えは"
---
# japanese-gpt2-small

This repository provides a small-sized Japanese GPT-2 model. The model was trained using code from the GitHub repository [rinnakk/japanese-pretrained-models](https://github.com/rinnakk/japanese-pretrained-models) by [rinna Co., Ltd.](https://corp.rinna.co.jp/)
# How to use the model
~~~~
from transformers import AutoTokenizer, AutoModelForCausalLM
tokenizer = AutoTokenizer.from_pretrained("rinna/japanese-gpt2-small", use_fast=False)
tokenizer.do_lower_case = True  # workaround for a bug in tokenizer config loading
model = AutoModelForCausalLM.from_pretrained("rinna/japanese-gpt2-small")
~~~~
# Model architecture
A 12-layer, 768-hidden-size transformer-based language model.
# Training
The model was trained on [Japanese CC-100](http://data.statmt.org/cc-100/ja.txt.xz) and [Japanese Wikipedia](https://dumps.wikimedia.org/other/cirrussearch) to optimize a traditional language modelling objective on 8\\*V100 GPUs for around 15 days. It reaches around 21 perplexity on a chosen validation set from CC-100.
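Perplexity here is the exponential of the mean per-token negative log-likelihood; a minimal sketch of the conversion (the NLL values are made-up for illustration):

```python
import math

def perplexity(nll_values):
    # perplexity = exp(mean per-token negative log-likelihood, in nats)
    return math.exp(sum(nll_values) / len(nll_values))

print(perplexity([3.045, 3.045, 3.045]))  # a mean NLL of ~3.045 nats ≈ 21 perplexity
```

So the reported ~21 perplexity corresponds to an average cross-entropy loss of roughly 3.05 nats per token on the validation set.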
# Tokenization
The model uses a [sentencepiece](https://github.com/google/sentencepiece)-based tokenizer; the vocabulary was trained on Japanese Wikipedia using the official sentencepiece training script.
# License
[The MIT license](https://opensource.org/licenses/MIT)
| 1,580 | [
[
-0.0273590087890625,
-0.048431396484375,
0.026397705078125,
0.01702880859375,
-0.040313720703125,
-0.01468658447265625,
-0.0177001953125,
-0.0171051025390625,
-0.0036773681640625,
0.03045654296875,
-0.043853759765625,
-0.013702392578125,
-0.04510498046875,
0.0012302398681640625,
-0.0001684427261352539,
0.0887451171875,
-0.0220489501953125,
0.01236724853515625,
0.0282440185546875,
0.0024700164794921875,
-0.037933349609375,
-0.0178680419921875,
-0.06597900390625,
-0.03289794921875,
0.02264404296875,
0.031951904296875,
0.05010986328125,
0.041717529296875,
0.035308837890625,
0.01934814453125,
0.00878143310546875,
-0.0179443359375,
-0.025177001953125,
-0.01947021484375,
-0.0028705596923828125,
-0.04083251953125,
-0.031585693359375,
0.01019287109375,
0.0467529296875,
0.0215606689453125,
0.0023746490478515625,
0.0194549560546875,
0.002880096435546875,
0.01263427734375,
-0.0249481201171875,
0.036590576171875,
-0.037689208984375,
0.009674072265625,
-0.029571533203125,
0.01020050048828125,
-0.02288818359375,
-0.0054168701171875,
-0.001888275146484375,
-0.06365966796875,
0.017120361328125,
0.00018966197967529297,
0.0859375,
0.0298919677734375,
-0.01396942138671875,
-0.0210723876953125,
-0.036376953125,
0.047027587890625,
-0.0517578125,
0.020355224609375,
0.03179931640625,
0.033050537109375,
-0.002410888671875,
-0.07623291015625,
-0.04229736328125,
-0.0240936279296875,
-0.02484130859375,
0.007686614990234375,
-0.0083160400390625,
-0.006603240966796875,
0.04638671875,
0.025787353515625,
-0.07476806640625,
0.0181732177734375,
-0.03143310546875,
-0.02288818359375,
0.026519775390625,
0.0017385482788085938,
0.046630859375,
-0.035491943359375,
-0.021728515625,
-0.0187225341796875,
-0.052520751953125,
-0.0079803466796875,
0.034576416015625,
0.0222320556640625,
-0.036834716796875,
0.05047607421875,
-0.00630950927734375,
0.0212249755859375,
0.00843048095703125,
-0.0123748779296875,
0.024566650390625,
-0.023834228515625,
-0.01177978515625,
-0.0055389404296875,
0.07769775390625,
0.0033473968505859375,
0.02734375,
-0.003635406494140625,
-0.0210723876953125,
0.013519287109375,
0.002727508544921875,
-0.07659912109375,
-0.0426025390625,
-0.0107421875,
-0.0273590087890625,
-0.01142120361328125,
0.001270294189453125,
-0.039459228515625,
0.0148468017578125,
-0.01395416259765625,
0.0501708984375,
-0.0269775390625,
-0.0338134765625,
0.0005259513854980469,
-0.007061004638671875,
0.0265655517578125,
0.0002142190933227539,
-0.06536865234375,
0.01059722900390625,
0.036590576171875,
0.069091796875,
0.01342010498046875,
-0.0262908935546875,
-0.0030994415283203125,
0.004619598388671875,
-0.01447296142578125,
0.041778564453125,
-0.01007843017578125,
-0.041168212890625,
-0.0209808349609375,
0.0278167724609375,
-0.01517486572265625,
-0.0276641845703125,
0.04327392578125,
-0.0289764404296875,
0.044525146484375,
-0.00971221923828125,
-0.026214599609375,
0.00803375244140625,
0.002635955810546875,
-0.04510498046875,
0.08355712890625,
0.0279388427734375,
-0.07012939453125,
0.0222625732421875,
-0.0626220703125,
-0.0167388916015625,
0.0233001708984375,
-0.0156097412109375,
-0.04913330078125,
0.0008463859558105469,
0.01123046875,
0.0126953125,
-0.006664276123046875,
0.0193634033203125,
-0.00984954833984375,
-0.0401611328125,
-0.0034389495849609375,
-0.03387451171875,
0.06951904296875,
0.03045654296875,
-0.04498291015625,
-0.0141448974609375,
-0.044525146484375,
-0.0010576248168945312,
0.03369140625,
-0.0197601318359375,
-0.0220794677734375,
-0.040802001953125,
0.02508544921875,
0.019683837890625,
0.031280517578125,
-0.037322998046875,
0.03131103515625,
-0.03936767578125,
0.038726806640625,
0.038848876953125,
-0.002651214599609375,
0.0300140380859375,
-0.006908416748046875,
0.048858642578125,
0.00905609130859375,
0.0286407470703125,
0.002716064453125,
-0.046539306640625,
-0.0794677734375,
-0.0101470947265625,
0.0286407470703125,
0.03729248046875,
-0.0792236328125,
0.0341796875,
-0.036834716796875,
-0.043182373046875,
-0.043487548828125,
-0.0090789794921875,
0.042236328125,
0.022064208984375,
0.021759033203125,
-0.0309295654296875,
-0.0433349609375,
-0.077880859375,
-0.0236968994140625,
-0.017578125,
-0.00994110107421875,
-0.0007266998291015625,
0.06805419921875,
-0.032928466796875,
0.056060791015625,
-0.02545166015625,
-0.0213165283203125,
-0.01983642578125,
0.02496337890625,
0.0248260498046875,
0.054351806640625,
0.0330810546875,
-0.0501708984375,
-0.058502197265625,
-0.003814697265625,
-0.040435791015625,
0.00222015380859375,
0.004230499267578125,
-0.002872467041015625,
0.01348876953125,
0.025177001953125,
-0.054962158203125,
0.031219482421875,
0.027130126953125,
-0.03131103515625,
0.04437255859375,
-0.01354217529296875,
-0.006381988525390625,
-0.09423828125,
0.0269012451171875,
-0.002105712890625,
-0.02362060546875,
-0.031829833984375,
0.0261383056640625,
0.0059967041015625,
-0.0223236083984375,
-0.032318115234375,
0.06512451171875,
-0.035064697265625,
-0.00260162353515625,
-0.0197601318359375,
-0.004375457763671875,
-0.013519287109375,
0.04058837890625,
0.01216888427734375,
0.0908203125,
0.021514892578125,
-0.02044677734375,
0.0071868896484375,
0.0308380126953125,
-0.04156494140625,
0.0025234222412109375,
-0.06365966796875,
0.022247314453125,
0.00876617431640625,
0.01511383056640625,
-0.061920166015625,
-0.0355224609375,
0.043426513671875,
-0.03680419921875,
0.03753662109375,
-0.03826904296875,
-0.047760009765625,
-0.034332275390625,
-0.00853729248046875,
0.038299560546875,
0.055694580078125,
-0.03826904296875,
0.02276611328125,
0.0252685546875,
-0.019561767578125,
-0.041778564453125,
-0.0732421875,
-0.0034885406494140625,
-0.01371002197265625,
-0.018951416015625,
0.0161285400390625,
0.0012760162353515625,
0.01082611083984375,
0.01033782958984375,
-0.000865936279296875,
-0.0087738037109375,
0.000942230224609375,
0.01004791259765625,
0.023468017578125,
-0.024871826171875,
-0.010955810546875,
0.004177093505859375,
-0.0416259765625,
0.00533294677734375,
-0.0123748779296875,
0.0548095703125,
-0.01910400390625,
-0.01465606689453125,
-0.046630859375,
-0.0003886222839355469,
0.0291595458984375,
0.0033321380615234375,
0.050750732421875,
0.07208251953125,
-0.0177001953125,
0.0010662078857421875,
-0.043731689453125,
-0.020660400390625,
-0.037933349609375,
0.0513916015625,
-0.047149658203125,
-0.07183837890625,
0.034515380859375,
0.007495880126953125,
-0.01470184326171875,
0.056915283203125,
0.04425048828125,
0.006961822509765625,
0.08599853515625,
0.038421630859375,
-0.023193359375,
0.036163330078125,
-0.035614013671875,
0.023712158203125,
-0.07464599609375,
-0.00304412841796875,
-0.038909912109375,
0.0043487548828125,
-0.0704345703125,
-0.0272064208984375,
0.02166748046875,
0.01473236083984375,
-0.0513916015625,
0.040313720703125,
-0.04766845703125,
0.038787841796875,
0.031005859375,
-0.006381988525390625,
0.0105743408203125,
0.01067352294921875,
-0.0208892822265625,
-0.01105499267578125,
-0.06787109375,
-0.04864501953125,
0.08367919921875,
0.052734375,
0.0482177734375,
0.0054473876953125,
0.039398193359375,
-0.01128387451171875,
0.0281829833984375,
-0.054473876953125,
0.027557373046875,
-0.023406982421875,
-0.06658935546875,
-0.01873779296875,
-0.04388427734375,
-0.0626220703125,
0.0086822509765625,
0.01476287841796875,
-0.04437255859375,
-0.0023365020751953125,
0.005710601806640625,
-0.006103515625,
0.01517486572265625,
-0.0479736328125,
0.0848388671875,
-0.01464080810546875,
-0.0014190673828125,
0.002651214599609375,
-0.0450439453125,
0.034515380859375,
-0.00655364990234375,
0.0159149169921875,
0.0105743408203125,
0.004421234130859375,
0.066162109375,
-0.037933349609375,
0.056915283203125,
-0.034332275390625,
0.0094146728515625,
0.0106048583984375,
-0.00823211669921875,
0.04486083984375,
0.01287078857421875,
0.0259857177734375,
0.012115478515625,
0.0007529258728027344,
-0.029632568359375,
-0.0301361083984375,
0.05242919921875,
-0.093017578125,
-0.0293731689453125,
-0.0391845703125,
-0.01036834716796875,
-0.0008983612060546875,
0.025146484375,
0.059844970703125,
0.0218963623046875,
0.0025634765625,
0.01160430908203125,
0.044677734375,
-0.01358795166015625,
0.0296783447265625,
0.037628173828125,
-0.0226287841796875,
-0.025360107421875,
0.064208984375,
0.0063934326171875,
0.010162353515625,
0.0170135498046875,
0.01580810546875,
-0.034942626953125,
-0.0296783447265625,
-0.041229248046875,
0.038360595703125,
-0.042694091796875,
0.0056610107421875,
-0.03643798828125,
-0.033782958984375,
-0.035614013671875,
0.0279083251953125,
-0.0411376953125,
-0.033111572265625,
-0.0186614990234375,
0.0009145736694335938,
0.00836944580078125,
0.0458984375,
0.003387451171875,
0.0499267578125,
-0.039276123046875,
0.0156097412109375,
0.020416259765625,
0.024444580078125,
-0.01210784912109375,
-0.069091796875,
-0.0279693603515625,
0.005153656005859375,
-0.017364501953125,
-0.043121337890625,
0.025054931640625,
-0.004558563232421875,
0.039398193359375,
0.01262664794921875,
-0.0288848876953125,
0.047760009765625,
-0.0347900390625,
0.064208984375,
0.0164947509765625,
-0.061187744140625,
0.043853759765625,
-0.0292816162109375,
0.043426513671875,
0.05047607421875,
0.03594970703125,
-0.0458984375,
-0.023040771484375,
-0.05279541015625,
-0.058746337890625,
0.048553466796875,
0.0252838134765625,
0.0113525390625,
0.008209228515625,
0.051055908203125,
0.0009174346923828125,
0.0006518363952636719,
-0.08660888671875,
-0.026580810546875,
-0.03729248046875,
-0.023040771484375,
-0.0183258056640625,
-0.02813720703125,
0.00858306884765625,
-0.0204010009765625,
0.0843505859375,
-0.0117034912109375,
0.0377197265625,
0.007167816162109375,
-0.03131103515625,
0.0025234222412109375,
0.01558685302734375,
0.040924072265625,
0.032012939453125,
-0.0034351348876953125,
-0.01605224609375,
0.00952911376953125,
-0.060821533203125,
0.007053375244140625,
0.005279541015625,
-0.022735595703125,
0.0171661376953125,
0.025787353515625,
0.08416748046875,
0.02130126953125,
-0.01436614990234375,
0.040618896484375,
-0.0215606689453125,
-0.01102447509765625,
-0.047882080078125,
0.00579833984375,
0.0036220550537109375,
0.00379180908203125,
0.0137939453125,
-0.01375579833984375,
0.00600433349609375,
-0.037322998046875,
-0.0023746490478515625,
0.0199737548828125,
-0.0272064208984375,
-0.022735595703125,
0.057586669921875,
0.0008559226989746094,
-0.0098419189453125,
0.064208984375,
-0.02630615234375,
-0.04693603515625,
0.036956787109375,
0.0390625,
0.06402587890625,
-0.00951385498046875,
0.0052032470703125,
0.0499267578125,
0.03253173828125,
-0.0276336669921875,
0.0130767822265625,
0.0025959014892578125,
-0.0546875,
-0.03009033203125,
-0.05389404296875,
-0.015228271484375,
0.041778564453125,
-0.048095703125,
0.0250701904296875,
-0.036468505859375,
-0.0269927978515625,
-0.0161895751953125,
0.036712646484375,
-0.0655517578125,
0.0240478515625,
-0.0008530616760253906,
0.0665283203125,
-0.07232666015625,
0.08770751953125,
0.0521240234375,
-0.05145263671875,
-0.0948486328125,
-0.01345062255859375,
-0.0080108642578125,
-0.069091796875,
0.0560302734375,
0.010589599609375,
0.01517486572265625,
0.017547607421875,
-0.0316162109375,
-0.058380126953125,
0.08477783203125,
0.00848388671875,
-0.06329345703125,
-0.0191497802734375,
0.023162841796875,
0.037628173828125,
-0.0165557861328125,
0.03497314453125,
0.041412353515625,
0.040802001953125,
-0.0128631591796875,
-0.0828857421875,
0.0007538795471191406,
-0.03369140625,
0.02490234375,
0.01148223876953125,
-0.05322265625,
0.07183837890625,
0.017547607421875,
-0.0127410888671875,
0.038482666015625,
0.0300140380859375,
0.022857666015625,
-0.007663726806640625,
0.03411865234375,
0.06329345703125,
0.030548095703125,
-0.0217437744140625,
0.07806396484375,
-0.0278167724609375,
0.064453125,
0.08770751953125,
0.00958251953125,
0.05010986328125,
0.0276031494140625,
-0.017364501953125,
0.04461669921875,
0.04718017578125,
-0.02410888671875,
0.042510986328125,
0.0007333755493164062,
-0.0024623870849609375,
-0.002040863037109375,
0.0260009765625,
-0.045074462890625,
0.03594970703125,
0.021484375,
-0.0228271484375,
-0.0099945068359375,
0.0078277587890625,
0.023040771484375,
-0.0322265625,
-0.01410675048828125,
0.05438232421875,
0.016632080078125,
-0.04412841796875,
0.0323486328125,
0.013092041015625,
0.059844970703125,
-0.05657958984375,
0.022003173828125,
-0.0178985595703125,
0.019805908203125,
0.0053558349609375,
-0.040069580078125,
0.013671875,
0.00875091552734375,
-0.007678985595703125,
-0.0078125,
0.061859130859375,
-0.0305328369140625,
-0.045928955078125,
0.00975799560546875,
0.028106689453125,
0.0289459228515625,
0.00727081298828125,
-0.07659912109375,
0.003963470458984375,
-0.0023441314697265625,
-0.034759521484375,
0.039520263671875,
0.0202484130859375,
-0.00882720947265625,
0.03515625,
0.044677734375,
0.0080413818359375,
-0.01029205322265625,
0.0177459716796875,
0.05291748046875,
-0.02642822265625,
-0.047332763671875,
-0.058135986328125,
0.035186767578125,
-0.0007867813110351562,
-0.03936767578125,
0.061248779296875,
0.054290771484375,
0.0799560546875,
-0.0218353271484375,
0.06463623046875,
-0.00478363037109375,
0.035675048828125,
-0.043487548828125,
0.07830810546875,
-0.042449951171875,
0.006618499755859375,
-0.00003892183303833008,
-0.075439453125,
-0.003978729248046875,
0.0723876953125,
-0.0306243896484375,
0.0273895263671875,
0.05023193359375,
0.07177734375,
-0.02117919921875,
-0.0119476318359375,
0.005390167236328125,
0.036102294921875,
0.02288818359375,
0.044219970703125,
0.056671142578125,
-0.058929443359375,
0.0269012451171875,
-0.03369140625,
-0.01221466064453125,
-0.01317596435546875,
-0.045867919921875,
-0.06585693359375,
-0.035980224609375,
-0.0333251953125,
-0.0221405029296875,
0.0034885406494140625,
0.0618896484375,
0.05157470703125,
-0.060882568359375,
-0.0195770263671875,
-0.0172882080078125,
-0.026763916015625,
0.027313232421875,
-0.022552490234375,
0.0269775390625,
-0.02545166015625,
-0.055694580078125,
0.0133056640625,
-0.003238677978515625,
0.0194854736328125,
-0.026519775390625,
-0.005947113037109375,
0.004230499267578125,
-0.022979736328125,
0.0157470703125,
0.0233154296875,
-0.0498046875,
-0.031494140625,
-0.026519775390625,
-0.025665283203125,
-0.006748199462890625,
0.04937744140625,
-0.036651611328125,
0.0136260986328125,
0.02130126953125,
0.01380157470703125,
0.057769775390625,
-0.023040771484375,
0.0282135009765625,
-0.051483154296875,
0.03472900390625,
0.007129669189453125,
0.03582763671875,
0.02410888671875,
-0.0194244384765625,
0.043853759765625,
0.0278778076171875,
-0.04833984375,
-0.055450439453125,
0.0110931396484375,
-0.054473876953125,
-0.0182037353515625,
0.0859375,
-0.0222320556640625,
-0.01959228515625,
0.00330352783203125,
-0.01528167724609375,
0.037200927734375,
-0.00617218017578125,
0.043121337890625,
0.05316162109375,
0.0269775390625,
-0.0189208984375,
-0.04840087890625,
0.047607421875,
0.0202484130859375,
-0.047332763671875,
-0.023895263671875,
0.0018892288208007812,
0.0633544921875,
0.0173187255859375,
0.0469970703125,
-0.0232696533203125,
0.019744873046875,
0.008880615234375,
0.003734588623046875,
-0.01221466064453125,
-0.0178070068359375,
-0.031219482421875,
-0.0095672607421875,
0.007160186767578125,
-0.004974365234375
]
] |
teknium/Mistral-Trismegistus-7B | 2023-10-26T03:17:42.000Z | [
"transformers",
"pytorch",
"mistral",
"text-generation",
"mistral-7b",
"instruct",
"finetune",
"gpt4",
"synthetic data",
"distillation",
"en",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | teknium | null | null | teknium/Mistral-Trismegistus-7B | 130 | 6,982 | transformers | 2023-10-07T00:21:46 | ---
base_model: mistralai/Mistral-7B-v0.1
tags:
- mistral-7b
- instruct
- finetune
- gpt4
- synthetic data
- distillation
model-index:
- name: Mistral-Trismegistus-7B
results: []
license: apache-2.0
language:
- en
---
**Mistral Trismegistus 7B**
<div style="display: flex; justify-content: center;">
<img src="https://cdn-uploads.huggingface.co/production/uploads/6317aade83d8d2fd903192d9/3VJvztFDB1XOWfShuHnb6.png" alt="Mistral Trismegistus" width="50%" style="display: block; margin: 0 auto;">
</div>
## Model Description:
Transcendence is All You Need! Mistral Trismegistus is a model made for people interested in the esoteric, occult, and spiritual.
Here are some outputs:
Answer questions about occult artifacts:

Play the role of a hypnotist:

## Special Features:
- **The First Powerful Occult Expert Model**: ~10,000 high-quality, deep, rich instructions on the occult, esoteric, and spiritual.
- **Fast**: Built on Mistral, a state-of-the-art 7B-parameter model, it runs FAST even on a CPU.
- **Not a positivity-nazi**: This model was trained on all forms of esoteric tasks and knowledge, and is not burdened by the flowery tone of many other models, which chose positivity over creativity.
## Acknowledgements:
Special thanks to @a16z.
## Dataset:
This model was trained on a 100% synthetic, GPT-4-generated dataset of ~10,000 examples, covering a wide and diverse set of tasks and knowledge about the esoteric, occult, and spiritual.
The dataset will be released soon!
## Usage:
Prompt Format:
```
USER: <prompt>
ASSISTANT:
```
OR
```
<system message>
USER: <prompt>
ASSISTANT:
```
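As a minimal illustration, prompts in either format can be assembled with a small helper. The function name, structure, and example messages below are my own for demonstration, not part of the model release:

```python
from typing import Optional

def build_prompt(user_message: str, system_message: Optional[str] = None) -> str:
    """Assemble a prompt in the USER/ASSISTANT format shown above.

    If a system message is given, it is prepended on its own line.
    """
    prefix = f"{system_message}\n" if system_message else ""
    return f"{prefix}USER: {user_message}\nASSISTANT:"

# A plain prompt and one with a system message:
print(build_prompt("What is the Emerald Tablet?"))
print(build_prompt("Describe a grounding ritual.",
                   system_message="You are a patient esoteric mentor."))
```

The resulting string is then passed to the model for completion; generation continues after `ASSISTANT:`.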
## Benchmarks:
No benchmark can capture the nature and essence of spiritual and esoteric knowledge and tasks. You will have to test it yourself!
Training run on wandb here: https://wandb.ai/teknium1/occult-expert-mistral-7b/runs/coccult-expert-mistral-6/overview
## Licensing:
Apache 2.0
---
| 2,195 | [
[
-0.039337158203125,
-0.050537109375,
0.01436614990234375,
0.01300048828125,
-0.026885986328125,
-0.001491546630859375,
-0.00539398193359375,
-0.0574951171875,
0.0247344970703125,
0.03857421875,
-0.041717529296875,
-0.043670654296875,
-0.044830322265625,
-0.0109405517578125,
-0.039337158203125,
0.058197021484375,
0.01476287841796875,
0.0146484375,
0.0289764404296875,
0.01087188720703125,
-0.0307464599609375,
-0.03338623046875,
-0.055816650390625,
-0.0238037109375,
0.0087890625,
-0.01435089111328125,
0.07049560546875,
0.04376220703125,
0.024078369140625,
0.0211334228515625,
-0.0037631988525390625,
0.01302337646484375,
-0.03411865234375,
0.00818634033203125,
-0.0295867919921875,
-0.029632568359375,
-0.04144287109375,
-0.0033054351806640625,
0.029022216796875,
0.0254669189453125,
-0.0211944580078125,
0.04742431640625,
-0.012939453125,
0.005245208740234375,
-0.0411376953125,
-0.01186370849609375,
-0.0374755859375,
0.0004413127899169922,
-0.036041259765625,
-0.0165252685546875,
-0.00701904296875,
-0.0303192138671875,
0.011322021484375,
-0.073974609375,
0.037567138671875,
0.026947021484375,
0.071533203125,
0.052093505859375,
-0.0295562744140625,
-0.005878448486328125,
-0.058868408203125,
0.054534912109375,
-0.04119873046875,
0.0567626953125,
0.0198974609375,
0.04815673828125,
0.0086669921875,
-0.064208984375,
-0.021087646484375,
-0.0020885467529296875,
0.01496124267578125,
0.00365447998046875,
0.00038695335388183594,
0.01025390625,
0.0518798828125,
0.04736328125,
-0.0307464599609375,
-0.0276947021484375,
-0.04833984375,
-0.0202789306640625,
0.024658203125,
0.0307769775390625,
0.0159149169921875,
0.01059722900390625,
-0.025146484375,
-0.00759124755859375,
-0.024261474609375,
-0.0097503662109375,
0.01270294189453125,
-0.0289764404296875,
0.004547119140625,
0.0279388427734375,
-0.0013113021850585938,
0.05108642578125,
0.032318115234375,
-0.028472900390625,
0.0036373138427734375,
-0.00818634033203125,
-0.0172882080078125,
-0.020111083984375,
0.02569580078125,
0.018798828125,
-0.000919342041015625,
-0.015960693359375,
0.005748748779296875,
0.01446533203125,
0.048553466796875,
-0.08477783203125,
-0.0062408447265625,
0.015228271484375,
-0.036285400390625,
-0.006439208984375,
-0.006072998046875,
-0.03668212890625,
-0.0264129638671875,
-0.03692626953125,
0.0294647216796875,
-0.05157470703125,
-0.044525146484375,
0.031982421875,
-0.04010009765625,
0.008209228515625,
0.04248046875,
-0.035888671875,
0.040924072265625,
0.042877197265625,
0.044281005859375,
-0.003765106201171875,
-0.0290985107421875,
0.00843048095703125,
0.004150390625,
-0.0244293212890625,
0.0290374755859375,
-0.03924560546875,
-0.0165557861328125,
-0.02496337890625,
0.00116729736328125,
0.0291290283203125,
-0.034393310546875,
0.060302734375,
-0.047454833984375,
0.0165252685546875,
-0.043060302734375,
-0.03466796875,
-0.04449462890625,
-0.00485992431640625,
-0.04217529296875,
0.087646484375,
0.0157928466796875,
-0.057220458984375,
0.023712158203125,
-0.04376220703125,
0.0156402587890625,
0.0254974365234375,
-0.01340484619140625,
0.0014095306396484375,
0.015838623046875,
-0.0234375,
-0.004291534423828125,
-0.04052734375,
0.0145416259765625,
-0.050994873046875,
-0.0350341796875,
0.0212249755859375,
-0.054718017578125,
0.08148193359375,
0.01535797119140625,
-0.06597900390625,
-0.044891357421875,
-0.08062744140625,
0.01334381103515625,
-0.006622314453125,
0.0024471282958984375,
-0.02630615234375,
-0.034149169921875,
0.011962890625,
0.039947509765625,
0.033660888671875,
-0.0465087890625,
0.00740814208984375,
-0.040863037109375,
-0.0199127197265625,
0.054412841796875,
0.0283203125,
0.0123138427734375,
-0.0247802734375,
0.033050537109375,
-0.0121307373046875,
0.046905517578125,
0.03564453125,
-0.031585693359375,
-0.06768798828125,
-0.00848388671875,
-0.015228271484375,
0.014617919921875,
-0.05804443359375,
0.022308349609375,
0.006206512451171875,
-0.08807373046875,
-0.060821533203125,
-0.0050811767578125,
0.047149658203125,
0.032562255859375,
0.037506103515625,
0.0005502700805664062,
-0.01155853271484375,
-0.056854248046875,
-0.052825927734375,
0.004505157470703125,
-0.0174407958984375,
0.01318359375,
0.0308074951171875,
-0.01067352294921875,
0.040618896484375,
-0.0423583984375,
-0.027679443359375,
0.038299560546875,
0.008636474609375,
0.03094482421875,
0.02099609375,
0.0628662109375,
-0.060699462890625,
-0.02899169921875,
-0.00579833984375,
-0.0587158203125,
-0.019439697265625,
0.0168304443359375,
-0.037200927734375,
-0.01203155517578125,
-0.0189666748046875,
-0.050994873046875,
0.049072265625,
0.0208587646484375,
-0.04949951171875,
0.034820556640625,
-0.01203155517578125,
0.0156402587890625,
-0.092529296875,
0.0218353271484375,
-0.0033588409423828125,
0.0021953582763671875,
-0.05865478515625,
0.015838623046875,
0.007396697998046875,
0.0031147003173828125,
-0.04205322265625,
0.06402587890625,
-0.052490234375,
0.002178192138671875,
-0.038604736328125,
-0.04156494140625,
-0.0058441162109375,
0.061676025390625,
0.0302581787109375,
0.02423095703125,
0.059051513671875,
-0.050933837890625,
0.0423583984375,
0.018218994140625,
0.0103912353515625,
0.055572509765625,
-0.06207275390625,
0.00980377197265625,
0.0100555419921875,
0.00933837890625,
-0.016326904296875,
-0.0276031494140625,
0.023834228515625,
-0.03887939453125,
0.020355224609375,
0.00911712646484375,
-0.024658203125,
-0.033538818359375,
-0.01068878173828125,
0.04217529296875,
0.04730224609375,
-0.041107177734375,
0.07275390625,
0.005001068115234375,
0.026397705078125,
-0.0780029296875,
-0.033721923828125,
-0.0078582763671875,
-0.019775390625,
-0.03887939453125,
0.030303955078125,
-0.017669677734375,
-0.0225982666015625,
-0.0007538795471191406,
-0.0299530029296875,
-0.008544921875,
0.01541900634765625,
0.0555419921875,
0.0207977294921875,
0.005401611328125,
0.00205230712890625,
0.0160675048828125,
-0.01100921630859375,
0.0279998779296875,
-0.02972412109375,
0.037689208984375,
-0.04119873046875,
-0.055755615234375,
-0.052032470703125,
0.00795745849609375,
0.046142578125,
-0.01439666748046875,
0.038177490234375,
0.064453125,
-0.056549072265625,
-0.014617919921875,
-0.07373046875,
-0.01898193359375,
-0.040313720703125,
0.00901031494140625,
-0.0275115966796875,
-0.0176544189453125,
0.03460693359375,
0.009521484375,
0.030059814453125,
0.046295166015625,
0.055145263671875,
0.0266876220703125,
0.06256103515625,
0.00821685791015625,
0.03173828125,
0.0311126708984375,
-0.052703857421875,
0.004100799560546875,
-0.03765869140625,
-0.018798828125,
-0.03466796875,
-0.01471710205078125,
-0.027679443359375,
-0.027618408203125,
0.044708251953125,
-0.002529144287109375,
-0.05316162109375,
0.0192413330078125,
-0.05157470703125,
0.0192413330078125,
0.037933349609375,
0.0181427001953125,
0.037139892578125,
-0.00539398193359375,
0.031280517578125,
0.004169464111328125,
-0.0180511474609375,
-0.043304443359375,
0.08709716796875,
0.03436279296875,
0.0762939453125,
0.0166168212890625,
0.059906005859375,
0.01271820068359375,
0.031585693359375,
-0.034820556640625,
0.021484375,
0.0207366943359375,
-0.10064697265625,
-0.020477294921875,
-0.0031032562255859375,
-0.032928466796875,
0.03857421875,
-0.00921630859375,
-0.03326416015625,
0.04620361328125,
0.0013952255249023438,
-0.0248565673828125,
0.0283660888671875,
-0.034271240234375,
0.031524658203125,
-0.0206146240234375,
-0.036102294921875,
-0.0005373954772949219,
-0.04864501953125,
0.0306549072265625,
-0.005039215087890625,
0.00885772705078125,
0.019439697265625,
-0.0204620361328125,
0.0545654296875,
-0.028839111328125,
0.0625,
-0.038360595703125,
-0.009735107421875,
0.007049560546875,
0.0204315185546875,
0.019287109375,
0.0193939208984375,
-0.00044417381286621094,
0.022186279296875,
0.032196044921875,
-0.0283660888671875,
-0.050537109375,
0.032501220703125,
-0.09246826171875,
-0.055389404296875,
-0.038421630859375,
-0.0284271240234375,
-0.0015935897827148438,
0.03643798828125,
0.058074951171875,
0.042449951171875,
-0.017822265625,
0.01088714599609375,
0.0654296875,
0.01024627685546875,
0.01522064208984375,
0.050445556640625,
-0.00901031494140625,
-0.051361083984375,
0.054229736328125,
0.029022216796875,
0.0244903564453125,
-0.0002880096435546875,
-0.006671905517578125,
-0.035186767578125,
-0.018646240234375,
-0.01096343994140625,
0.036865234375,
-0.04998779296875,
-0.006183624267578125,
-0.068115234375,
-0.0294647216796875,
-0.0143890380859375,
0.001560211181640625,
-0.04315185546875,
-0.0284881591796875,
-0.031585693359375,
0.004100799560546875,
0.018341064453125,
0.058319091796875,
0.004291534423828125,
0.024871826171875,
-0.036102294921875,
0.03564453125,
0.0157012939453125,
0.01454925537109375,
0.0258026123046875,
-0.0413818359375,
-0.00968170166015625,
0.0221099853515625,
-0.0614013671875,
-0.07073974609375,
0.0259552001953125,
0.007099151611328125,
0.044525146484375,
0.07391357421875,
0.01708984375,
0.04852294921875,
-0.01209259033203125,
0.06475830078125,
0.0157928466796875,
-0.06463623046875,
0.01390838623046875,
-0.0171356201171875,
-0.0145416259765625,
0.0350341796875,
0.034454345703125,
-0.040130615234375,
-0.053741455078125,
-0.07928466796875,
-0.01496124267578125,
0.04852294921875,
0.0249481201171875,
-0.0167083740234375,
0.016357421875,
0.026275634765625,
0.041473388671875,
0.01824951171875,
-0.05963134765625,
-0.040130615234375,
-0.043365478515625,
0.0003592967987060547,
0.0174560546875,
0.01433563232421875,
-0.0307769775390625,
-0.0290985107421875,
0.06427001953125,
0.0018949508666992188,
0.01520538330078125,
0.0032939910888671875,
0.0008707046508789062,
-0.01480865478515625,
0.0007905960083007812,
0.026153564453125,
0.0635986328125,
-0.0238800048828125,
0.0203704833984375,
0.032958984375,
-0.056793212890625,
0.0233001708984375,
0.0133056640625,
-0.00421905517578125,
0.0177459716796875,
0.02301025390625,
0.05523681640625,
-0.01023101806640625,
0.00818634033203125,
0.032135009765625,
-0.007144927978515625,
-0.0157470703125,
-0.0457763671875,
0.02728271484375,
0.007694244384765625,
0.032928466796875,
0.04754638671875,
0.021087646484375,
0.044586181640625,
-0.0175323486328125,
-0.030029296875,
0.0196685791015625,
-0.01122283935546875,
-0.042572021484375,
0.0638427734375,
-0.00856781005859375,
-0.0469970703125,
0.0014028549194335938,
-0.0159454345703125,
-0.03863525390625,
0.03399658203125,
0.033782958984375,
0.05816650390625,
-0.036102294921875,
0.0153045654296875,
0.0142059326171875,
0.0172271728515625,
-0.01360321044921875,
0.042877197265625,
-0.0232086181640625,
-0.01319122314453125,
0.0112457275390625,
-0.06048583984375,
-0.0311431884765625,
-0.005527496337890625,
-0.054718017578125,
0.025177001953125,
-0.0209197998046875,
-0.034576416015625,
-0.00667572021484375,
0.02435302734375,
-0.0770263671875,
0.0172271728515625,
-0.0172271728515625,
0.0936279296875,
-0.060211181640625,
0.05218505859375,
0.02532958984375,
-0.0308380126953125,
-0.056488037109375,
-0.0169219970703125,
-0.011322021484375,
-0.05804443359375,
0.035736083984375,
-0.01255035400390625,
-0.0146942138671875,
0.000926971435546875,
-0.056976318359375,
-0.087158203125,
0.1131591796875,
0.031494140625,
-0.018341064453125,
-0.0404052734375,
0.016815185546875,
0.060302734375,
0.0014781951904296875,
0.0499267578125,
0.02691650390625,
0.045867919921875,
0.024658203125,
-0.0625,
0.0172119140625,
-0.0299530029296875,
0.020599365234375,
0.01233673095703125,
-0.05291748046875,
0.07452392578125,
0.0186004638671875,
-0.033905029296875,
0.01361083984375,
0.044891357421875,
0.03546142578125,
0.03765869140625,
0.0241851806640625,
0.0609130859375,
0.053314208984375,
-0.0099334716796875,
0.06671142578125,
-0.053619384765625,
0.0250701904296875,
0.06689453125,
0.004207611083984375,
0.01448822021484375,
0.0291290283203125,
-0.0164031982421875,
0.0219879150390625,
0.07257080078125,
-0.0040435791015625,
0.01531982421875,
-0.00428009033203125,
-0.01441192626953125,
-0.02777099609375,
-0.019775390625,
-0.0738525390625,
-0.0020046234130859375,
0.0216827392578125,
-0.017242431640625,
-0.01064300537109375,
-0.00069427490234375,
-0.019805908203125,
-0.0308074951171875,
-0.024566650390625,
0.0484619140625,
0.0271148681640625,
-0.03875732421875,
0.075927734375,
-0.0167236328125,
0.043487548828125,
-0.040863037109375,
-0.01297760009765625,
0.0016717910766601562,
0.01203155517578125,
-0.0307769775390625,
-0.037109375,
-0.0008273124694824219,
-0.02471923828125,
0.0153045654296875,
-0.0037708282470703125,
0.06854248046875,
-0.0134429931640625,
-0.0380859375,
0.0217742919921875,
0.03790283203125,
0.0168609619140625,
-0.01544189453125,
-0.0367431640625,
-0.030609130859375,
-0.036590576171875,
-0.018646240234375,
-0.01116943359375,
0.054168701171875,
-0.00048089027404785156,
0.04486083984375,
0.0404052734375,
0.019989013671875,
0.0210418701171875,
-0.01507568359375,
0.05963134765625,
-0.05352783203125,
-0.037078857421875,
-0.03656005859375,
0.022186279296875,
-0.0158538818359375,
-0.041259765625,
0.06280517578125,
0.03948974609375,
0.05157470703125,
-0.01522064208984375,
0.04022216796875,
0.003925323486328125,
0.0286712646484375,
-0.004608154296875,
0.044830322265625,
-0.059844970703125,
-0.00489044189453125,
-0.0191650390625,
-0.0775146484375,
0.022735595703125,
0.0051422119140625,
-0.00246429443359375,
0.00893402099609375,
0.056549072265625,
0.06304931640625,
-0.02301025390625,
0.023651123046875,
0.015960693359375,
0.016204833984375,
0.02423095703125,
0.033447265625,
0.08489990234375,
-0.034942626953125,
0.0174560546875,
-0.033782958984375,
-0.032806396484375,
-0.01033782958984375,
-0.0133056640625,
-0.08837890625,
-0.048858642578125,
0.007808685302734375,
-0.050994873046875,
-0.01099395751953125,
0.07958984375,
0.049041748046875,
-0.0235595703125,
-0.0199127197265625,
0.006061553955078125,
-0.00603485107421875,
-0.035400390625,
-0.0151214599609375,
0.0202178955078125,
0.0188751220703125,
-0.04052734375,
-0.01537322998046875,
0.0169219970703125,
0.028594970703125,
-0.043975830078125,
-0.0024967193603515625,
0.02362060546875,
0.01220703125,
0.00853729248046875,
0.00595855712890625,
-0.0771484375,
-0.0146942138671875,
0.032073974609375,
-0.0015621185302734375,
-0.0236053466796875,
0.021240234375,
-0.013702392578125,
0.0152740478515625,
0.024139404296875,
0.0269775390625,
0.06304931640625,
-0.008819580078125,
0.018829345703125,
-0.00772857666015625,
-0.0160675048828125,
0.033111572265625,
0.01129913330078125,
-0.01409149169921875,
-0.046905517578125,
0.042877197265625,
0.0274200439453125,
-0.035797119140625,
-0.059326171875,
0.003997802734375,
-0.0794677734375,
-0.01371002197265625,
0.0914306640625,
0.0108642578125,
-0.035980224609375,
-0.0165252685546875,
-0.050689697265625,
0.0311126708984375,
-0.020233154296875,
0.055938720703125,
0.059814453125,
0.0003120899200439453,
0.02117919921875,
-0.02471923828125,
0.050537109375,
0.01259613037109375,
-0.044158935546875,
-0.0048980712890625,
0.037353515625,
0.0469970703125,
0.039703369140625,
0.03466796875,
-0.0016241073608398438,
0.0458984375,
0.007381439208984375,
0.0224456787109375,
-0.024871826171875,
-0.0257720947265625,
-0.00563812255859375,
0.033203125,
0.00576019287109375,
-0.004077911376953125
]
] |
google/muril-base-cased | 2022-06-10T13:33:04.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"fill-mask",
"arxiv:2103.10730",
"arxiv:1810.04805",
"arxiv:1911.02116",
"arxiv:2003.11080",
"arxiv:2009.05166",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | google | null | null | google/muril-base-cased | 22 | 6,975 | transformers | 2022-03-02T23:29:05 | ---
thumbnail: https://huggingface.co/front/thumbnails/google.png
license: apache-2.0
---
MuRIL: Multilingual Representations for Indian Languages
===
MuRIL is a BERT model pre-trained on 17 Indian languages and their transliterated counterparts. We have released the pre-trained model (with the MLM layer intact, enabling masked word predictions) in this repository. We have also released the encoder on [TFHub](https://tfhub.dev/google/MuRIL/1) with an additional pre-processing module that processes raw text into the expected input format for the encoder. You can find more details on MuRIL in this [paper](http://arxiv.org/abs/2103.10730).
## Overview
This model uses a BERT base architecture [1] pretrained from scratch using the
Wikipedia [2], Common Crawl [3], PMINDIA [4] and Dakshina [5] corpora for 17 [6]
Indian languages.
We use a training paradigm similar to multilingual BERT, with a few
modifications as listed:
* We include translation and transliteration segment pairs in training as
well.
* We keep an exponent value of 0.3 and not 0.7 for upsampling, shown to
enhance low-resource performance. [7]
See the Training section for more details.
## Training
The MuRIL model is pre-trained on monolingual segments as well as parallel
segments as detailed below :
* Monolingual Data : We make use of publicly available corpora from Wikipedia
and Common Crawl for 17 Indian languages.
* Parallel Data : We have two types of parallel data :
* Translated Data : We obtain translations of the above monolingual
corpora using the Google NMT pipeline. We feed translated segment pairs
as input. We also make use of the publicly available PMINDIA corpus.
* Transliterated Data : We obtain transliterations of Wikipedia using the
IndicTrans [8] library. We feed transliterated segment pairs as input.
We also make use of the publicly available Dakshina dataset.
We keep an exponent value of 0.3 to calculate duplication multiplier values for
upsampling of lower resourced languages and set dupe factors accordingly. Note,
we limit transliterated pairs to Wikipedia only.
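The exponent-based upsampling can be sketched as follows. The corpus sizes here are made up for illustration (not MuRIL's actual token counts), and the real pipeline converts these smoothed probabilities into per-language dupe factors:

```python
def smoothed_sampling_probs(corpus_sizes, alpha=0.3):
    """Exponential smoothing of per-language sampling probabilities.

    Raising each language's raw share to the power alpha (< 1) flattens
    the distribution, upsampling lower-resourced languages.
    """
    total = sum(corpus_sizes.values())
    weights = {lang: (size / total) ** alpha
               for lang, size in corpus_sizes.items()}
    norm = sum(weights.values())
    return {lang: w / norm for lang, w in weights.items()}

# Illustrative sizes: one high-resource and one low-resource language.
sizes = {"hi": 1_000_000_000, "sa": 10_000_000}
probs = smoothed_sampling_probs(sizes, alpha=0.3)
# The low-resource language's sampling share rises well above its ~1% raw fraction.
```

A smaller exponent flattens the distribution further, which is why 0.3 (rather than 0.7) favors low-resource languages.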
The model was trained using a self-supervised masked language modeling task. We
do whole word masking with a maximum of 80 predictions. The model was trained
for 1000K steps, with a batch size of 4096, and a max sequence length of 512.
### Trainable parameters
All parameters in the module are trainable, and fine-tuning all parameters is
the recommended practice.
## Uses & Limitations
This model is intended to be used for a variety of downstream NLP tasks for
Indian languages. This model is trained on transliterated data as well, a
phenomenon commonly observed in the Indian context. This model is not expected
to perform well on languages other than the ones used in pretraining, i.e. 17
Indian languages.
## Evaluation
We provide the results of fine-tuning this model on a set of downstream tasks.<br/>
We choose these tasks from the XTREME benchmark, with evaluation done on Indian language test-sets.<br/>
We also transliterate the test-sets and evaluate on the same.<br/>
We use the same fine-tuning setting as is used by [9], except for TyDiQA, where we use additional SQuAD v1.1 English training data, similar to [10].<br/>
For Tatoeba, we do not fine-tune the model, and use the pooled_output of the last layer as the sentence embedding.<br/>
All results are computed in a zero-shot setting, with English being the high resource training set language.
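For Tatoeba-style retrieval, matching a sentence to its translation reduces to nearest-neighbor search over those sentence embeddings. A pure-Python sketch with toy 3-d vectors (not real `pooled_output` values, which are 768-dimensional):

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def nearest(query, candidates):
    """Index of the candidate embedding most similar to the query."""
    return max(range(len(candidates)), key=lambda i: cosine(query, candidates[i]))

# Toy "sentence embeddings": the query should match candidate 1.
query = [0.1, 0.9, 0.2]
candidates = [[0.9, 0.1, 0.0], [0.1, 0.8, 0.3], [0.0, 0.2, -0.9]]
# nearest(query, candidates) -> 1
```

In the actual evaluation, each source sentence's embedding is matched against all target-language embeddings and accuracy is the fraction of correct top-1 retrievals.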
* Shown below are results on datasets from the XTREME benchmark (in %)
<br/>
PANX (F1) | ml | ta | te | en | bn | hi | mr | ur | Average
:-------- | ----: | ----: | ----: | ----: | ----: | ----: | ----: | ----: | ------:
mBERT | 54.77 | 51.24 | 50.16 | 84.40 | 68.59 | 65.13 | 58.44 | 31.36 | 58.01
MuRIL | 75.74 | 71.86 | 64.99 | 84.43 | 85.97 | 78.09 | 74.63 | 85.07 | 77.60
<br/>
UDPOS (F1) | en | hi | mr | ta | te | ur | Average
:--------- | ----: | ----: | ----: | ----: | ----: | ----: | ------:
mBERT | 95.35 | 66.09 | 71.27 | 59.58 | 76.98 | 57.85 | 71.19
MuRIL | 95.55 | 64.47 | 82.95 | 62.57 | 85.63 | 58.93 | 75.02
<br/>
XNLI (Accuracy) | en | hi | ur | Average
:-------------- | ----: | ----: | ----: | ------:
mBERT | 81.72 | 60.52 | 58.20 | 66.81
MuRIL | 83.85 | 70.66 | 67.70 | 74.07
<br/>
Tatoeba (Accuracy) | ml | ta | te | bn | hi | mr | ur | Average
:----------------- | ----: | ----: | ----: | ----: | ----: | ----: | ----: | ------:
mBERT | 20.23 | 12.38 | 14.96 | 12.80 | 27.80 | 18.00 | 22.70 | 18.41
MuRIL | 26.35 | 36.81 | 17.52 | 20.20 | 31.50 | 26.60 | 17.10 | 25.15
<br/>
XQUAD (F1/EM) | en | hi | Average
:------------ | ----------: | ----------: | ----------:
mBERT | 83.85/72.86 | 58.46/43.53 | 71.15/58.19
MuRIL | 84.31/72.94 | 73.93/58.32 | 79.12/65.63
<br/>
MLQA (F1/EM) | en | hi | Average
:----------- | ----------: | ----------: | ----------:
mBERT | 80.39/67.30 | 50.28/35.18 | 65.34/51.24
MuRIL | 80.28/67.37 | 67.34/50.22 | 73.81/58.80
<br/>
TyDiQA (F1/EM) | en | bn | te | Average
:---------------- | ----------: | ----------: | ----------: | ----------:
mBERT | 75.21/65.00 | 60.62/45.13 | 53.55/44.54 | 63.13/51.66
MuRIL | 74.10/64.55 | 78.03/66.37 | 73.95/46.94 | 75.36/59.28
* Shown below are results on the transliterated versions of the above
test-sets.
PANX (F1) | ml_tr | ta_tr | te_tr | bn_tr | hi_tr | mr_tr | ur_tr | Average
:-------- | ----: | ----: | ----: | ----: | ----: | ----: | ----: | ------:
mBERT | 7.53 | 1.04 | 8.24 | 41.77 | 25.46 | 8.34 | 7.30 | 14.24
MuRIL | 63.39 | 7.00 | 53.62 | 72.94 | 69.75 | 68.77 | 68.41 | 57.70
<br/>
UDPOS (F1) | hi_tr | mr_tr | ta_tr | te_tr | ur_tr | Average
:--------- | ----: | ----: | ----: | ----: | ----: | ------:
mBERT | 25.00 | 33.67 | 24.02 | 36.21 | 22.07 | 28.20
MuRIL | 63.09 | 67.19 | 58.40 | 65.30 | 56.49 | 62.09
<br/>
XNLI (Accuracy) | hi_tr | ur_tr | Average
:-------------- | ----: | ----: | ------:
mBERT | 39.6 | 38.86 | 39.23
MuRIL | 68.24 | 61.16 | 64.70
<br/>
Tatoeba (Accuracy) | ml_tr | ta_tr | te_tr | bn_tr | hi_tr | mr_tr | ur_tr | Average
:----------------- | ----: | ----: | ----: | ----: | ----: | ----: | ----: | ------:
mBERT | 2.18 | 1.95 | 5.13 | 1.80 | 3.00 | 2.40 | 2.30 | 2.68
MuRIL | 10.33 | 11.07 | 11.54 | 8.10 | 14.90 | 7.20 | 13.70 | 10.98
<br/>
## References
\[1]: Jacob Devlin, Ming-Wei Chang, Kenton Lee, Kristina Toutanova. [BERT:
Pre-training of Deep Bidirectional Transformers for Language
Understanding](https://arxiv.org/abs/1810.04805). arXiv preprint
arXiv:1810.04805, 2018.
\[2]: [Wikipedia](https://www.tensorflow.org/datasets/catalog/wikipedia)
\[3]: [Common Crawl](http://commoncrawl.org/the-data/)
\[4]:
[PMINDIA](http://lotus.kuee.kyoto-u.ac.jp/WAT/indic-multilingual/index.html)
\[5]: [Dakshina](https://github.com/google-research-datasets/dakshina)
\[6]: Assamese (as), Bengali (bn), English (en), Gujarati (gu), Hindi (hi),
Kannada (kn), Kashmiri (ks), Malayalam (ml), Marathi (mr), Nepali (ne), Oriya
(or), Punjabi (pa), Sanskrit (sa), Sindhi (sd), Tamil (ta), Telugu (te) and Urdu
(ur).
\[7]: Conneau, Alexis, et al.
[Unsupervised cross-lingual representation learning at scale](https://arxiv.org/pdf/1911.02116.pdf).
arXiv preprint arXiv:1911.02116 (2019).
\[8]: [IndicTrans](https://github.com/libindic/indic-trans)
\[9]: Hu, J., Ruder, S., Siddhant, A., Neubig, G., Firat, O., & Johnson, M.
(2020). [Xtreme: A massively multilingual multi-task benchmark for evaluating
cross-lingual generalization.](https://arxiv.org/pdf/2003.11080.pdf) arXiv
preprint arXiv:2003.11080.
\[10]: Fang, Y., Wang, S., Gan, Z., Sun, S., & Liu, J. (2020).
[FILTER: An Enhanced Fusion Method for Cross-lingual Language Understanding.](https://arxiv.org/pdf/2009.05166.pdf)
arXiv preprint arXiv:2009.05166.
## Citation
If you find MuRIL useful in your applications, please cite the following paper:
```
@misc{khanuja2021muril,
title={MuRIL: Multilingual Representations for Indian Languages},
author={Simran Khanuja and Diksha Bansal and Sarvesh Mehtani and Savya Khosla and Atreyee Dey and Balaji Gopalan and Dilip Kumar Margam and Pooja Aggarwal and Rajiv Teja Nagipogu and Shachi Dave and Shruti Gupta and Subhash Chandra Bose Gali and Vish Subramanian and Partha Talukdar},
year={2021},
eprint={2103.10730},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
## Contact
Please mail your queries/feedback to muril-contact@google.com.
swl-models/xiaolxl-guofeng-v2 | 2023-02-28T08:58:28.000Z | [diffusers, stable-diffusion, stable-diffusion-diffusers, text-to-image, en, license:creativeml-openrail-m, endpoints_compatible, diffusers:StableDiffusionPipeline, region:us] | text-to-image | swl-models | null | null | swl-models/xiaolxl-guofeng-v2 | 3 | 6,970 | diffusers | 2023-01-31T15:44:54 | ---
license: creativeml-openrail-m
language:
- en
library_name: diffusers
pipeline_tag: text-to-image
tags:
- stable-diffusion
- stable-diffusion-diffusers
duplicated_from: xiaolxl/Gf_style2
---
# Gf_style2 - Introduction
Welcome to the Gf_style2 model. This is a gorgeous Chinese ancient-style
(guofeng) model, which could also be described as an ancient-style
game-character model, with a 2.5D texture. Compared with the first generation,
the second generation is easier to get started with and can generate
good-looking images without a fixed configuration. It also fixes the
face-collapse problem of the previous generation.
This is a series of models that will keep being updated in the future.
Version 3.0 has been released: [https://huggingface.co/xiaolxl/Gf_style3](https://huggingface.co/xiaolxl/Gf_style3)
# Install
1. Put the XXX.ckpt model into the SD directory.
2. The model ships with its own VAE. If your program cannot load it, remember to select any VAE file manually, otherwise the images will come out gray.
# How to use
(TIP: the characters were trained on portrait images, so in theory generating in portrait orientation gives better results.)
Simple: the second generation is easier to use; you only need the following three settings:
- The image size should be at least **768**, otherwise the image will collapse
- **Keywords (start of prompt):**
```
{best quality}, {{masterpiece}}, {highres}, {an extremely delicate and beautiful}, original, extremely detailed wallpaper,1girl
```
- **Negative prompt (negative words contributed by group members):**
```
(((simple background))),monochrome ,lowres, bad anatomy, bad hands, text, error, missing fingers, extra digit, fewer digits, cropped, worst quality, low quality, normal quality, jpeg artifacts, signature, watermark, username, blurry, lowres, bad anatomy, bad hands, text, error, extra digit, fewer digits, cropped, worst quality, low quality, normal quality, jpeg artifacts, signature, watermark, username, blurry, ugly,pregnant,vore,duplicate,morbid,mutilated,transsexual, hermaphrodite,long neck,mutated hands,poorly drawn hands,poorly drawn face,mutation,deformed,blurry,bad anatomy,bad proportions,malformed limbs,extra limbs,cloned face,disfigured,gross proportions, (((missing arms))),(((missing legs))), (((extra arms))),(((extra legs))),pubic hair, plump,bad legs,error legs,username,blurry,bad feet
```
Advanced: if you want the images to be as good as possible, try the following configuration:
- Sampling steps: **30 or 50**
- Sampler: **DPM++ SDE Karras**
- The image size should be at least **768**, otherwise the image will collapse
- If the face is deformed, try enabling **face restoration**
- **If you want richer elements, you can add the following keywords:**
```
strapless dress,
smile,
china dress,dress,hair ornament, necklace, jewelry, long hair, earrings, chinese clothes,
```
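For readers driving the model from diffusers instead of the WebUI, the recommended settings map onto a pipeline call roughly as below. This is a sketch under assumptions: the repo id, and the mapping of "DPM++ SDE Karras" to `DPMSolverSDEScheduler` with `use_karras_sigmas=True`, are not confirmed by this card. Note also that the curly-brace emphasis syntax in the prompt is a WebUI convention that diffusers does not interpret.

```python
def recommended_settings():
    """The generation settings recommended in this card, as keyword arguments."""
    return {
        "num_inference_steps": 30,  # the card suggests 30 or 50
        "width": 768,               # at least 768, otherwise images collapse
        "height": 768,
        # Curly-brace emphasis below is a WebUI convention; diffusers ignores it.
        "prompt": "{best quality}, {{masterpiece}}, {highres}, "
                  "{an extremely delicate and beautiful}, original, "
                  "extremely detailed wallpaper, 1girl",
    }

RUN_GENERATION = False  # flip to True on a machine with the model weights

if RUN_GENERATION:
    # Heavy, network-dependent part; requires `pip install diffusers torch`.
    from diffusers import DPMSolverSDEScheduler, StableDiffusionPipeline

    pipe = StableDiffusionPipeline.from_pretrained("swl-models/xiaolxl-guofeng-v2")
    pipe.scheduler = DPMSolverSDEScheduler.from_config(
        pipe.scheduler.config, use_karras_sigmas=True  # ~ "DPM++ SDE Karras"
    )
    kwargs = recommended_settings()
    image = pipe(kwargs.pop("prompt"), **kwargs).images[0]
    image.save("guofeng_sample.png")
```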
# Examples
(You can find the original images in the file list; load them into the WebUI to view the prompts and other generation information.)
<img src=https://huggingface.co/xiaolxl/Gf_style2/resolve/main/examples/a1.png>
<img src=https://huggingface.co/xiaolxl/Gf_style2/resolve/main/examples/a2.png>
-0.0010728836059570312,
0.04083251953125,
-0.0626220703125,
0.0291595458984375,
0.03533935546875,
0.00279998779296875,
-0.0122528076171875,
-0.061676025390625,
-0.060638427734375,
0.10125732421875,
0.02362060546875,
-0.030670166015625,
-0.02410888671875,
-0.0296173095703125,
0.023956298828125,
-0.034637451171875,
0.0198974609375,
0.03302001953125,
0.036346435546875,
0.038482666015625,
-0.0653076171875,
0.02490234375,
-0.058837890625,
0.0272674560546875,
-0.00691986083984375,
-0.08135986328125,
0.0830078125,
-0.02874755859375,
-0.030731201171875,
0.0377197265625,
0.055511474609375,
0.0237274169921875,
0.0213470458984375,
0.040863037109375,
0.03936767578125,
0.0235748291015625,
-0.0258636474609375,
0.059417724609375,
-0.0173797607421875,
0.03424072265625,
0.059783935546875,
0.0149688720703125,
0.04302978515625,
-0.006786346435546875,
-0.033599853515625,
0.04925537109375,
0.07135009765625,
-0.039215087890625,
0.035675048828125,
-0.0027446746826171875,
-0.0120391845703125,
-0.0038700103759765625,
-0.0128173828125,
-0.045074462890625,
0.03802490234375,
0.015350341796875,
-0.0350341796875,
-0.010284423828125,
-0.0137786865234375,
0.0162811279296875,
-0.00970458984375,
-0.0172576904296875,
0.0517578125,
0.00748443603515625,
-0.0206298828125,
0.044219970703125,
0.017974853515625,
0.061187744140625,
-0.036346435546875,
-0.026458740234375,
-0.0250244140625,
-0.0049591064453125,
-0.0284881591796875,
-0.056304931640625,
0.018341064453125,
-0.028900146484375,
0.0007252693176269531,
0.00833892822265625,
0.07049560546875,
-0.017364501953125,
-0.050537109375,
0.027374267578125,
0.0233001708984375,
0.0267486572265625,
0.008270263671875,
-0.0687255859375,
0.0010442733764648438,
0.0125732421875,
-0.03778076171875,
0.01496124267578125,
0.01519012451171875,
-0.0015859603881835938,
0.0243377685546875,
0.0309295654296875,
0.00004571676254272461,
-0.046051025390625,
-0.0223388671875,
0.06878662109375,
-0.0379638671875,
-0.0288848876953125,
-0.0513916015625,
0.05596923828125,
-0.030120849609375,
-0.036529541015625,
0.037261962890625,
0.031341552734375,
0.065673828125,
-0.0194244384765625,
0.06671142578125,
-0.03302001953125,
0.024078369140625,
-0.037017822265625,
0.071044921875,
-0.07086181640625,
-0.02325439453125,
-0.042205810546875,
-0.046844482421875,
-0.01507568359375,
0.07464599609375,
0.00238800048828125,
0.02178955078125,
0.023223876953125,
0.07659912109375,
0.01171112060546875,
-0.01094818115234375,
0.015411376953125,
0.0117950439453125,
0.004032135009765625,
0.058868408203125,
0.031219482421875,
-0.053985595703125,
0.0214080810546875,
-0.0672607421875,
-0.038238525390625,
-0.037078857421875,
-0.050811767578125,
-0.0462646484375,
-0.0435791015625,
-0.0236968994140625,
-0.05029296875,
-0.008575439453125,
0.06182861328125,
0.0650634765625,
-0.058349609375,
-0.01010894775390625,
0.017242431640625,
0.001491546630859375,
-0.032012939453125,
-0.017547607421875,
0.033721923828125,
0.027587890625,
-0.07403564453125,
-0.00405120849609375,
0.024017333984375,
0.04473876953125,
-0.0111846923828125,
-0.005084991455078125,
0.0007467269897460938,
-0.0081329345703125,
0.03643798828125,
0.03375244140625,
-0.054962158203125,
-0.02734375,
0.0063629150390625,
-0.0206298828125,
0.015960693359375,
0.0187835693359375,
-0.0082244873046875,
0.024505615234375,
0.042694091796875,
-0.0193023681640625,
0.0236968994140625,
0.00524139404296875,
0.0251007080078125,
-0.031341552734375,
0.005523681640625,
-0.0189361572265625,
0.041290283203125,
0.016448974609375,
-0.042388916015625,
0.0498046875,
0.045745849609375,
-0.028411865234375,
-0.03228759765625,
0.01390838623046875,
-0.0892333984375,
-0.03900146484375,
0.0787353515625,
-0.0038051605224609375,
-0.03009033203125,
0.0118865966796875,
-0.0499267578125,
0.03216552734375,
-0.01418304443359375,
0.05474853515625,
0.04803466796875,
-0.00650787353515625,
-0.022216796875,
-0.0478515625,
0.042205810546875,
-0.00666046142578125,
-0.04974365234375,
-0.0225830078125,
0.0474853515625,
0.019195556640625,
0.0384521484375,
0.05999755859375,
-0.033660888671875,
0.047149658203125,
-0.0036830902099609375,
0.039276123046875,
-0.006984710693359375,
-0.004024505615234375,
-0.007450103759765625,
0.00777435302734375,
-0.00882720947265625,
-0.0225830078125
]
] |
jphme/Llama-2-13b-chat-german | 2023-10-06T12:52:01.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"llama-2",
"german",
"deutsch",
"de",
"en",
"dataset:Christoph911/German-legal-SQuAD",
"dataset:philschmid/test_german_squad",
"arxiv:2307.09288",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | jphme | null | null | jphme/Llama-2-13b-chat-german | 47 | 6,967 | transformers | 2023-07-21T23:01:14 | ---
language:
- de
- en
pipeline_tag: text-generation
inference: false
tags:
- pytorch
- llama
- llama-2
- german
- deutsch
datasets:
- Christoph911/German-legal-SQuAD
- philschmid/test_german_squad
---
**Please check out EM German, our new German-speaking LLM model family with significantly improved capabilities. EM German is available in Llama2 7b, 13b and 70b as well as Mistral- and LeoLM-based versions! All information and download links can be found [here](https://github.com/jphme/EM_German/blob/main/README.md).**
# Llama 2 13b Chat German
Llama-2-13b-chat-german is a variant of [Meta](https://huggingface.co/meta-llama)'s [Llama 2 13b Chat](https://huggingface.co/meta-llama/Llama-2-13b-chat-hf) model, finetuned on an additional German-language dataset.
This model is optimized for German text, providing proficiency in understanding, generating, and interacting with German-language content. However, the model is not yet fully optimized for German, as it has been trained on a small, experimental dataset and has limited capabilities due to the small parameter count.
Some of the finetuning data is also targeted towards factual retrieval (only answering questions from information in the context and refusing to hallucinate), and the model should perform better on these tasks than the original Llama 2 Chat.
I am working on improving the model's capabilities and will update the model if there is sufficient interest.
A quantized GGML version for use with llama.cpp, kobold.cpp and other GUIs for CPU inference can be found [here](https://huggingface.co/jphme/Llama-2-13b-chat-german-GGML).
Please note the license of the base model, which is contained in the repo under LICENSE.TXT and see the original model card below for more information.
## Data
* Proprietary German Conversation Dataset
* German SQuAD and German legal SQuAD data (see datasets), augmented with "wrong" contexts, to improve factual RAG
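The "wrong context" augmentation described above can be sketched roughly as follows. This is a minimal illustration, not the actual data pipeline; the function name, field names, and the refusal string are assumptions for this example:

```python
import random

def augment_with_wrong_contexts(examples, seed=0):
    """For each (question, context, answer) record, append a copy that
    pairs the question with a context taken from a *different* record,
    labeled with an explicit refusal answer. This teaches the model to
    decline instead of hallucinating when the context has no answer.

    NOTE: illustrative sketch only; assumes at least two distinct contexts.
    """
    rng = random.Random(seed)
    contexts = [ex["context"] for ex in examples]
    augmented = list(examples)
    for ex in examples:
        # Pick any context that does not belong to this example.
        wrong = rng.choice([c for c in contexts if c != ex["context"]])
        augmented.append({
            "question": ex["question"],
            "context": wrong,
            "answer": "Mit den gegebenen Informationen ist diese Frage nicht zu beantworten.",
        })
    return augmented
```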
## Prompt Template
Llama2 Chat uses a new prompt format:
```
<s>[INST] <<SYS>>
You are a helpful, respectful and honest assistant. Always answer as helpfully as possible, while being safe. Your answers should not include any harmful, unethical, racist, sexist, toxic, dangerous, or illegal content. Please ensure that your responses are socially unbiased and positive in nature.
If a question does not make any sense, or is not factually coherent, explain why instead of answering something not correct. If you don't know the answer to a question, please don't share false information. Please answer in the same language as the user.
<</SYS>>
This is a test question[/INST] This is an answer </s><s>
```
See also the original implementation [here](https://github.com/facebookresearch/llama/blob/main/llama/generation.py#L213).
There is also an (apparently undocumented) method right in transformers to generate the correct tokenization: [LLamaTokenizer._build_conversation_input_ids](https://github.com/huggingface/transformers/blob/b257c46a075419c09e5ce5c5aa39bc346ecdb9a5/src/transformers/models/llama/tokenization_llama.py#L334).
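The template above can also be assembled programmatically. The following is a minimal sketch; the `build_prompt` helper and its signature are my own illustration (not part of the model or the transformers API), and the exact whitespace follows the reference implementation linked above:

```python
def build_prompt(system: str, turns: list[tuple[str, str]], user: str) -> str:
    """Assemble a Llama-2-chat-style prompt string.

    `system` is the system message, `turns` holds completed
    (question, answer) pairs, and `user` is the new question that
    awaits a model response. Illustrative only; for tokenization,
    prefer the tokenizer's own conversation handling.
    """
    prompt = f"<s>[INST] <<SYS>>\n{system}\n<</SYS>>\n\n"
    for i, (question, answer) in enumerate(turns):
        # Only the first turn shares the opening <s>[INST] with the system block.
        prefix = "" if i == 0 else "<s>[INST] "
        prompt += f"{prefix}{question} [/INST] {answer} </s>"
    prompt += ("<s>[INST] " if turns else "") + f"{user} [/INST]"
    return prompt
```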
## Results
I only evaluated the output on a small, handcrafted sample of German test prompts, confirming that the model's ability to understand and generate German text is above the base model in many situations. More detailed evaluations and examples will be posted soon.
## Examples
*These examples were not in the training set. They have been generated with a quantized version (Q4_0 for the German finetuned version, Q4_1 for the original Llama2 Chat version), so FP16 results should be better.*
### Example 1 (General Knowledge)
**Question**:
> Wie heißt die Hauptstadt von Nordrhein-Westfalen?
**Answer Original Llama 2 13b Chat**:
> Hello! As your helpful and respectful assistant, I'm here to help you with any questions you may have. The main city of North Rhine-Westphalia is called Düsseldorf. Is there anything else you would like to know about this topic or the region in general? Please let me know if there's anything else I can assist you with!
**Answer Llama 2 13b Chat German**:
> Die Hauptstadt von Nordrhein-Westfalen ist Düsseldorf.
### Example 2 (RAG/Factual Retrieval):
**Question**:
> Warum ist die Mongolische Volksrepublik nicht dem Warschauer Pakt beigetreten?
>
> Kontext zur Beantwortung der Frage: Warschauer_Pakt
>
> === Bilaterale Verträge über Freundschaft, Zusammenarbeit und gegenseitigen Beistand === Mit den bilateralen Bündnisverträgen wurde die Verpflichtung zur gegenseitigen Hilfeleistung unterzeichnet, um alle gewaltsamen militärischen Handlungen, die sich gegen die territoriale Integrität und Souveränität einer Vertragspartei richteten, zu verhindern. Den ersten dieser Freundschaftsverträge hatte die Sowjetunion schon während des Krieges am 12. Dezember 1943 mit der tschechoslowakischen Exilregierung abgeschlossen, der am 27. November 1963 für die Tschechoslowakei verlängert wurde. Von 1943 bis 1949 gab es bereits 23 bilaterale Verträge über Freundschaft, Zusammenarbeit und gegenseitigen Beistand (VFZ) der ersten Generation in Osteuropa. Neben diesem Vertragssystem bestanden ab 1956/57 auch weitere Abkommen: * Truppenstationierungsabkommen der Sowjetunion mit der DDR (12. März 1957), * Truppenstationierungsabkommen der Sowjetunion mit der Volksrepublik Polen (17. Dezember 1956), * Truppenstationierungsabkommen der Sowjetunion mit Rumänien (15. April 1957) und * Truppenstationierungsabkommen der Sowjetunion mit Ungarn (27. Mai 1957) jeweils mit einer Laufzeit von 20 Jahren. Aber bereits der Vertrag über die Beziehungen zwischen der DDR und der Sowjetunion vom 20. September 1950 zur Grenzregelung enthielt eine Vereinbarung zur Stationierung von sowjetischen Truppen auf dem Gebiet der DDR. Im Juli 1963 bat auch die Mongolische Volksrepublik, dem Warschauer Pakt nach Artikel 9 des Warschauer Vertrages beizutreten. Dazu hätte ein Sonderprotokoll gefasst werden müssen, da der Vertragstext nach Artikel 4 sich nur auf Europa bezog. Aufgrund des aufkommenden Chinesisch-sowjetischen Zerwürfnisses gab es keinen Beitritt, sondern es blieb bei einem Beobachterstatus. Stattdessen wurden ab 1966 sowjetische Truppenstationierungen vereinbart. Nachdem die Sowjetunion mit der DDR am 12. 
Juni 1964 einen Vertrag über Freundschaft, Zusammenarbeit und gegenseitigen Beistand schloss, der die volle Einbeziehung der DDR in das bilaterale Bündnissystem vorsah, wurden in der Zeit von 1964 bis 1972 insgesamt 20 Bündnisverträge der zweiten Generation unterzeichnet: Die Hegemonie der Sowjetunion wurde durch die bilateralen Bündnisverträge noch verstärkt, da diese eine unmittelbare Beistandsverpflichtung bei einem bewaffneten Angriff vorsahen, die auch in den meisten Verträgen sich nicht nur auf Europa beschränkte. In der dritten Generation der Verträge über Freundschaft, Zusammenarbeit und gegenseitigen Beistand erfolgte nach der Unterzeichnung der Schlussakte von Helsinki der Konferenz über Sicherheit und Zusammenarbeit in Europa (KSZE) im August 1975 die Einbeziehung der Breschnew-Doktrin und der ökonomischen Integration. Zudem wurde die Laufzeit der Verträge auf 25 Jahre festgelegt. Die DDR unterzeichnete am 7. Oktober 1975 mit der Sowjetunion diesen neuen Vertrag sowie später weitere mit Ungarn (24. März 1977), Polen (29. Mai 1977), Bulgarien (14. September 1977) und der Tschechoslowakei (3. Oktober 1977).
>
> Bitte beantworte die Frage mit einem kurzen, wörtlichen Zitat aus dem Kontext. Sollte der Kontext keine Antwort auf die Frage enthalten, bitte antworte dass die Frage mit den gegebenen Informationen nicht zu beantworten ist.
**Answer Original Llama 2 13b Chat**:
> Die Mongolische Volksrepublik hat nicht dem Warschauer Pakt beigetreten, weil es keinen Beitritt gab, sondern nur einen Beobachterstatus. (Quelle: Der Kontext)
>
**Answer Llama 2 13b Chat German**:
> Aufgrund des aufkommenden Chinesisch-sowjetischen Zerwürfnisses gab es keinen Beitritt, sondern es blieb bei einem Beobachterstatus
### Example 3 (RAG / Factual Retrieval negative):
**Question**:
> Nach was benannte Spielberg seine Produktionsfirma Anfang der 1980er?
>
> Kontext zur Beantwortung der Frage: Webbrowser
>
> == Marktanteile und deren Messung == Bild zeigt die lt. Statistik von StatCounter meistverwendeten Browser nach Ländern 9/2019. Die Statistik für März 2020 ist über folgenden Weblink abrufbar: Die tatsächliche Verbreitung eines Webbrowsers ist nicht zweifelsfrei feststellbar. Verschiedene Anbieter veröffentlichen Statistiken über die Verbreitung von Webbrowsern aufgrund unterschiedlicher häufig recht begrenzter Datenbasen. Da die generelle Verbreitungsrate eines Browsers von verschiedensten Faktoren beeinflusst wird, sind diese Statistiken unterschiedlich aussagekräftig und kommen zu teilweise stark unterschiedlichen, scheinbar widersprüchlichen Ergebnissen. So schwankt die Verbreitung eines Browsers je nach Themengebiet einer aufgerufenen Webseite, Herkunftsregion der aufrufenden Person und dem Zeitpunkt der Messung. Beispielsweise können Benutzer an ihrem Arbeitsplatz an die Verwendung eines vorgegebenen Webbrowsers gebunden sein, privat jedoch einen anderen Browser bevorzugen und verwenden. Auch verschiedene Ereignisse führen zu starken Schwankungen. So steigt der Marktanteil bei der Veröffentlichung einer neuen Hauptversion oder sinkt beim Bekanntwerden einer Sicherheitslücke. Insgesamt geben Browserstatistiken nur die grobe Verteilung und ggf. Trends an, da jede Statistik nur auf der Auswertung der Logdateien ausgewählter Websites beruht und es keine unbestritten repräsentative Auswahl dafür zu verwendender Websites gibt. Die Messungen werden in der Regel anhand des sogenannten ''User-Agent''-Headers durchgeführt, mit der sich der verwendete Browser identifizieren lässt und die dieser bei jeder Anfrage an den Server übermittelt. Bei praktisch allen Browsern kann diese Kennung vom Benutzer verändert werden. 
Die weltweiten Marktanteile von Desktop-Browsern sehen laut Netmarketshare Stand Juli 2018 und September 2019 prozentual wie folgt aus: Bei Berücksichtigung auch der Browser in den inzwischen sehr häufig verwendeten mobilen Geräten wie Smartphone, Tablet etc. ergibt sich laut den veröffentlichten Statistiken von Netmarketshare, für September 2019 folgendes Bild: Chrome 66 %, Safari 19 %, Firefox 4 %, IE 2,6 %, Edge 2,5 %, QQ 1,4 % etc. Die Verbreitung laut Statistik von StatCounter der verschiedenen Browser auf mobilen Geräten ist hier abrufbar. Die Auswahl der Anzeige ist editierbar.
>
> Bitte beantworte die Frage mit einem kurzen, wörtlichen Zitat aus dem Kontext. Sollte der Kontext keine Antwort auf die Frage enthalten, bitte antworte dass die Frage mit den gegebenen Informationen nicht zu beantworten ist.
**Answer Original Llama 2 13b Chat**:
> Sure! Here's the answer to your question with a direct quote from the context: "According to Netmarketshare, in July 2018 and September 2019, the worldwide market share of desktop browsers looked like this: Chrome 66%, Safari 19%, Firefox 4%, IE 2.6%, Edge 2.5%, QQ 1.4% etc." So, according to Netmarketshare's statistics, in July 2018 and September 2019, Google Chrome had a market share of 66%.
>
**Answer Llama 2 13b Chat German**:
> Mit den gegebenen Informationen ist diese Frage nicht zu beantworten.
----------------------------
----------------------------
# Original **Llama 2** model card
Llama 2 is a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 70 billion parameters. This is the repository for the 13B fine-tuned model, optimized for dialogue use cases and converted for the Hugging Face Transformers format. Links to other models can be found in the index at the bottom.
## Model Details
*Note: Use of this model is governed by the Meta license. In order to download the model weights and tokenizer, please visit the [website](https://ai.meta.com/resources/models-and-libraries/llama-downloads/) and accept our License before requesting access here.*
Meta developed and publicly released the Llama 2 family of large language models (LLMs), a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 70 billion parameters. Our fine-tuned LLMs, called Llama-2-Chat, are optimized for dialogue use cases. Llama-2-Chat models outperform open-source chat models on most benchmarks we tested, and in our human evaluations for helpfulness and safety, are on par with some popular closed-source models like ChatGPT and PaLM.
**Model Developers** Meta
**Variations** Llama 2 comes in a range of parameter sizes — 7B, 13B, and 70B — as well as pretrained and fine-tuned variations.
**Input** Models input text only.
**Output** Models generate text only.
**Model Architecture** Llama 2 is an auto-regressive language model that uses an optimized transformer architecture. The tuned versions use supervised fine-tuning (SFT) and reinforcement learning with human feedback (RLHF) to align to human preferences for helpfulness and safety.
||Training Data|Params|Content Length|GQA|Tokens|LR|
|---|---|---|---|---|---|---|
|Llama 2|*A new mix of publicly available online data*|7B|4k|✗|2.0T|3.0 x 10<sup>-4</sup>|
|Llama 2|*A new mix of publicly available online data*|13B|4k|✗|2.0T|3.0 x 10<sup>-4</sup>|
|Llama 2|*A new mix of publicly available online data*|70B|4k|✔|2.0T|1.5 x 10<sup>-4</sup>|
*Llama 2 family of models.* Token counts refer to pretraining data only. All models are trained with a global batch size of 4M tokens. Bigger models (70B) use Grouped-Query Attention (GQA) for improved inference scalability.
**Model Dates** Llama 2 was trained between January 2023 and July 2023.
**Status** This is a static model trained on an offline dataset. Future versions of the tuned models will be released as we improve model safety with community feedback.
**License** A custom commercial license is available at: [https://ai.meta.com/resources/models-and-libraries/llama-downloads/](https://ai.meta.com/resources/models-and-libraries/llama-downloads/)
**Research Paper** ["Llama 2: Open Foundation and Fine-Tuned Chat Models"](https://arxiv.org/abs/2307.09288)
## Intended Use
**Intended Use Cases** Llama 2 is intended for commercial and research use in English. Tuned models are intended for assistant-like chat, whereas pretrained models can be adapted for a variety of natural language generation tasks.
To get the expected features and performance for the chat versions, a specific formatting needs to be followed, including the `INST` and `<<SYS>>` tags, `BOS` and `EOS` tokens, and the whitespaces and breaklines in between (we recommend calling `strip()` on inputs to avoid double-spaces). See our reference code in github for details: [`chat_completion`](https://github.com/facebookresearch/llama/blob/main/llama/generation.py#L212).
**Out-of-scope Uses** Use in any manner that violates applicable laws or regulations (including trade compliance laws). Use in languages other than English. Use in any other way that is prohibited by the Acceptable Use Policy and Licensing Agreement for Llama 2.
## Hardware and Software
**Training Factors** We used custom training libraries, Meta's Research Super Cluster, and production clusters for pretraining. Fine-tuning, annotation, and evaluation were also performed on third-party cloud compute.
**Carbon Footprint** Pretraining utilized a cumulative 3.3M GPU hours of computation on hardware of type A100-80GB (TDP of 350-400W). Estimated total emissions were 539 tCO2eq, 100% of which were offset by Meta’s sustainability program.
||Time (GPU hours)|Power Consumption (W)|Carbon Emitted(tCO<sub>2</sub>eq)|
|---|---|---|---|
|Llama 2 7B|184320|400|31.22|
|Llama 2 13B|368640|400|62.44|
|Llama 2 70B|1720320|400|291.42|
|Total|3311616||539.00|
**CO<sub>2</sub> emissions during pretraining.** Time: total GPU time required for training each model. Power Consumption: peak power capacity per GPU device for the GPUs used adjusted for power usage efficiency. 100% of the emissions are directly offset by Meta's sustainability program, and because we are openly releasing these models, the pretraining costs do not need to be incurred by others.
## Training Data
**Overview** Llama 2 was pretrained on 2 trillion tokens of data from publicly available sources. The fine-tuning data includes publicly available instruction datasets, as well as over one million new human-annotated examples. Neither the pretraining nor the fine-tuning datasets include Meta user data.
**Data Freshness** The pretraining data has a cutoff of September 2022, but some tuning data is more recent, up to July 2023.
## Evaluation Results
In this section, we report the results for the Llama 1 and Llama 2 models on standard academic benchmarks. For all the evaluations, we use our internal evaluations library.
|Model|Size|Code|Commonsense Reasoning|World Knowledge|Reading Comprehension|Math|MMLU|BBH|AGI Eval|
|---|---|---|---|---|---|---|---|---|---|
|Llama 1|7B|14.1|60.8|46.2|58.5|6.95|35.1|30.3|23.9|
|Llama 1|13B|18.9|66.1|52.6|62.3|10.9|46.9|37.0|33.9|
|Llama 1|33B|26.0|70.0|58.4|67.6|21.4|57.8|39.8|41.7|
|Llama 1|65B|30.7|70.7|60.5|68.6|30.8|63.4|43.5|47.6|
|Llama 2|7B|16.8|63.9|48.9|61.3|14.6|45.3|32.6|29.3|
|Llama 2|13B|24.5|66.9|55.4|65.8|28.7|54.8|39.4|39.1|
|Llama 2|70B|**37.5**|**71.9**|**63.6**|**69.4**|**35.2**|**68.9**|**51.2**|**54.2**|
**Overall performance on grouped academic benchmarks.** *Code:* We report the average pass@1 scores of our models on HumanEval and MBPP. *Commonsense Reasoning:* We report the average of PIQA, SIQA, HellaSwag, WinoGrande, ARC easy and challenge, OpenBookQA, and CommonsenseQA. We report 7-shot results for CommonSenseQA and 0-shot results for all other benchmarks. *World Knowledge:* We evaluate the 5-shot performance on NaturalQuestions and TriviaQA and report the average. *Reading Comprehension:* For reading comprehension, we report the 0-shot average on SQuAD, QuAC, and BoolQ. *MATH:* We report the average of the GSM8K (8 shot) and MATH (4 shot) benchmarks at top 1.
|||TruthfulQA|Toxigen|
|---|---|---|---|
|Llama 1|7B|27.42|23.00|
|Llama 1|13B|41.74|23.08|
|Llama 1|33B|44.19|22.57|
|Llama 1|65B|48.71|21.77|
|Llama 2|7B|33.29|**21.25**|
|Llama 2|13B|41.86|26.10|
|Llama 2|70B|**50.18**|24.60|
**Evaluation of pretrained LLMs on automatic safety benchmarks.** For TruthfulQA, we present the percentage of generations that are both truthful and informative (the higher the better). For ToxiGen, we present the percentage of toxic generations (the smaller the better).
|||TruthfulQA|Toxigen|
|---|---|---|---|
|Llama-2-Chat|7B|57.04|**0.00**|
|Llama-2-Chat|13B|62.18|**0.00**|
|Llama-2-Chat|70B|**64.14**|0.01|
**Evaluation of fine-tuned LLMs on different safety datasets.** Same metric definitions as above.
## Ethical Considerations and Limitations
Llama 2 is a new technology that carries risks with use. Testing conducted to date has been in English, and has not covered, nor could it cover all scenarios. For these reasons, as with all LLMs, Llama 2’s potential outputs cannot be predicted in advance, and the model may in some instances produce inaccurate, biased or other objectionable responses to user prompts. Therefore, before deploying any applications of Llama 2, developers should perform safety testing and tuning tailored to their specific applications of the model.
Please see the Responsible Use Guide available at [https://ai.meta.com/llama/responsible-use-guide/](https://ai.meta.com/llama/responsible-use-guide/)
## Reporting Issues
Please report any software “bug,” or other problems with the models through one of the following means:
- Reporting issues with the model: [github.com/facebookresearch/llama](http://github.com/facebookresearch/llama)
- Reporting problematic content generated by the model: [developers.facebook.com/llama_output_feedback](http://developers.facebook.com/llama_output_feedback)
- Reporting bugs and security concerns: [facebook.com/whitehat/info](http://facebook.com/whitehat/info)
## Llama Model Index
|Model|Llama2|Llama2-hf|Llama2-chat|Llama2-chat-hf|
|---|---|---|---|---|
|7B| [Link](https://huggingface.co/llamaste/Llama-2-7b) | [Link](https://huggingface.co/llamaste/Llama-2-7b-hf) | [Link](https://huggingface.co/llamaste/Llama-2-7b-chat) | [Link](https://huggingface.co/llamaste/Llama-2-7b-chat-hf)|
|13B| [Link](https://huggingface.co/llamaste/Llama-2-13b) | [Link](https://huggingface.co/llamaste/Llama-2-13b-hf) | [Link](https://huggingface.co/llamaste/Llama-2-13b-chat) | [Link](https://huggingface.co/llamaste/Llama-2-13b-hf)|
|70B| [Link](https://huggingface.co/llamaste/Llama-2-70b) | [Link](https://huggingface.co/llamaste/Llama-2-70b-hf) | [Link](https://huggingface.co/llamaste/Llama-2-70b-chat) | [Link](https://huggingface.co/llamaste/Llama-2-70b-hf)| | 20,850 | [
[
-0.048431396484375,
-0.0694580078125,
0.04376220703125,
0.032440185546875,
-0.022216796875,
-0.0029773712158203125,
-0.0037097930908203125,
-0.05194091796875,
0.05340576171875,
0.00980377197265625,
-0.05078125,
-0.043426513671875,
-0.039703369140625,
0.0283355712890625,
-0.0186004638671875,
0.052459716796875,
-0.018157958984375,
0.02435302734375,
0.0028629302978515625,
-0.0156707763671875,
-0.01373291015625,
-0.038543701171875,
-0.0285491943359375,
-0.049591064453125,
0.0147552490234375,
0.036468505859375,
0.034454345703125,
0.034454345703125,
0.045745849609375,
0.0307159423828125,
-0.0284881591796875,
-0.00118255615234375,
-0.032379150390625,
-0.0124359130859375,
-0.004444122314453125,
-0.0261688232421875,
-0.021881103515625,
-0.01520538330078125,
0.037384033203125,
0.053497314453125,
-0.0285797119140625,
0.0166168212890625,
-0.00881195068359375,
0.03790283203125,
-0.003383636474609375,
0.0037841796875,
-0.0215301513671875,
0.01236724853515625,
-0.0125732421875,
-0.022857666015625,
-0.015594482421875,
-0.028076171875,
-0.0016775131225585938,
-0.044219970703125,
0.0115509033203125,
-0.00594329833984375,
0.1121826171875,
-0.0005307197570800781,
-0.0305938720703125,
-0.01702880859375,
-0.045745849609375,
0.06878662109375,
-0.05609130859375,
0.024322509765625,
0.0382080078125,
0.01442718505859375,
-0.040740966796875,
-0.04638671875,
-0.0506591796875,
-0.023468017578125,
-0.0174102783203125,
0.02569580078125,
-0.04351806640625,
-0.0269012451171875,
0.0182342529296875,
0.0323486328125,
-0.033294677734375,
0.005176544189453125,
-0.036651611328125,
-0.016387939453125,
0.05865478515625,
0.01146697998046875,
0.0179443359375,
-0.020965576171875,
-0.050323486328125,
-0.0180206298828125,
-0.04962158203125,
0.0058441162109375,
0.04571533203125,
-0.001239776611328125,
-0.04736328125,
0.05584716796875,
-0.05157470703125,
0.0310516357421875,
0.0338134765625,
-0.0221710205078125,
0.035552978515625,
-0.030609130859375,
-0.01033782958984375,
-0.033966064453125,
0.053802490234375,
0.045562744140625,
0.00995635986328125,
-0.020751953125,
-0.007144927978515625,
-0.00301361083984375,
-0.018402099609375,
-0.058929443359375,
0.0159759521484375,
-0.004283905029296875,
-0.0504150390625,
-0.019439697265625,
0.005908966064453125,
-0.05242919921875,
-0.0033206939697265625,
-0.0165557861328125,
0.0242156982421875,
-0.0251007080078125,
-0.03228759765625,
0.0290069580078125,
-0.0205078125,
0.038330078125,
0.007266998291015625,
-0.063232421875,
0.0173187255859375,
0.0260772705078125,
0.045379638671875,
0.00031566619873046875,
-0.01038360595703125,
0.00754547119140625,
0.00119781494140625,
-0.028228759765625,
0.06756591796875,
-0.0282135009765625,
-0.053253173828125,
-0.0039825439453125,
0.0219268798828125,
0.01043701171875,
-0.0426025390625,
0.034271240234375,
-0.03466796875,
0.028289794921875,
-0.028411865234375,
-0.01529693603515625,
-0.036834716796875,
0.00376129150390625,
-0.0154266357421875,
0.0921630859375,
0.007579803466796875,
-0.05718994140625,
0.01727294921875,
-0.0203857421875,
-0.037261962890625,
-0.01369476318359375,
-0.0027942657470703125,
-0.01345062255859375,
0.0024700164794921875,
0.0169830322265625,
0.038818359375,
-0.043182373046875,
0.0128936767578125,
-0.0164794921875,
-0.01253509521484375,
0.015960693359375,
-0.020660400390625,
0.0794677734375,
0.0009183883666992188,
-0.044525146484375,
-0.022064208984375,
-0.061370849609375,
-0.0137939453125,
0.0386962890625,
-0.042510986328125,
-0.006626129150390625,
-0.013580322265625,
-0.0167388916015625,
0.020965576171875,
0.0311126708984375,
-0.05780029296875,
0.0087890625,
-0.0302276611328125,
0.020263671875,
0.08270263671875,
0.01039886474609375,
0.026336669921875,
-0.040863037109375,
0.0279541015625,
0.022186279296875,
0.01337432861328125,
0.0273895263671875,
-0.035552978515625,
-0.08990478515625,
-0.007587432861328125,
0.005779266357421875,
0.059112548828125,
-0.0657958984375,
0.0421142578125,
-0.01617431640625,
-0.06610107421875,
-0.01331329345703125,
0.00559234619140625,
0.0280914306640625,
0.031463623046875,
0.0123443603515625,
-0.044677734375,
-0.043548583984375,
-0.08544921875,
0.01445770263671875,
-0.04052734375,
-0.0041961669921875,
0.052947998046875,
0.047119140625,
-0.033111572265625,
0.0594482421875,
-0.046539306640625,
-0.018707275390625,
-0.0109710693359375,
0.004180908203125,
0.0298919677734375,
0.0258941650390625,
0.054473876953125,
-0.0594482421875,
-0.035125732421875,
0.01012420654296875,
-0.068115234375,
-0.017181396484375,
0.004367828369140625,
-0.034820556640625,
0.037017822265625,
0.0194244384765625,
-0.055267333984375,
0.01381683349609375,
0.053253173828125,
-0.0306854248046875,
0.031524658203125,
-0.0252838134765625,
-0.006244659423828125,
-0.08062744140625,
0.015777587890625,
0.002689361572265625,
-0.0010662078857421875,
-0.04473876953125,
-0.012939453125,
-0.0223846435546875,
0.0218048095703125,
-0.0404052734375,
0.042999267578125,
-0.02337646484375,
0.007610321044921875,
0.01189422607421875,
0.01163482666015625,
-0.01898193359375,
0.039520263671875,
0.00611114501953125,
0.048797607421875,
0.04278564453125,
-0.040008544921875,
0.03759765625,
0.034881591796875,
-0.01995849609375,
0.044586181640625,
-0.041778564453125,
0.016265869140625,
-0.023101806640625,
0.0303802490234375,
-0.0640869140625,
-0.02398681640625,
0.050201416015625,
-0.039093017578125,
0.0139007568359375,
0.027252197265625,
-0.0338134765625,
-0.041351318359375,
-0.0285797119140625,
-0.00946044921875,
0.050140380859375,
-0.048095703125,
0.04620361328125,
0.0299224853515625,
-0.0133209228515625,
-0.053314208984375,
-0.06231689453125,
0.0030765533447265625,
-0.01947021484375,
-0.06243896484375,
0.0265350341796875,
-0.005397796630859375,
-0.022003173828125,
-0.0036334991455078125,
-0.0191802978515625,
-0.0130767822265625,
-0.004669189453125,
0.016021728515625,
0.0263824462890625,
-0.02703857421875,
-0.0070648193359375,
-0.007778167724609375,
-0.0306549072265625,
-0.01479339599609375,
0.0172882080078125,
0.053436279296875,
-0.0113525390625,
-0.0147552490234375,
-0.0538330078125,
0.024383544921875,
0.026214599609375,
0.0009694099426269531,
0.05096435546875,
0.041168212890625,
-0.0100250244140625,
0.021087646484375,
-0.07586669921875,
-0.035491943359375,
-0.039581298828125,
0.0079193115234375,
-0.01025390625,
-0.07049560546875,
0.0408935546875,
-0.00423431396484375,
-0.0083160400390625,
0.050384521484375,
0.0626220703125,
-0.0071258544921875,
0.0654296875,
0.06787109375,
0.002834320068359375,
0.033111572265625,
-0.026947021484375,
0.0174102783203125,
-0.0306396484375,
-0.0401611328125,
-0.0367431640625,
-0.0218963623046875,
-0.05096435546875,
-0.026947021484375,
0.0124664306640625,
0.00746917724609375,
-0.01555633544921875,
0.0298309326171875,
-0.0289764404296875,
0.022613525390625,
0.0308074951171875,
0.0149688720703125,
0.0229644775390625,
0.01445770263671875,
0.00861358642578125,
0.00030350685119628906,
-0.03857421875,
-0.0484619140625,
0.06573486328125,
0.0345458984375,
0.07098388671875,
0.013397216796875,
0.055511474609375,
-0.0039520263671875,
0.0287628173828125,
-0.03802490234375,
0.06610107421875,
0.0019893646240234375,
-0.050994873046875,
-0.011810302734375,
-0.019012451171875,
-0.07012939453125,
0.0192413330078125,
-0.00848388671875,
-0.06396484375,
0.033447265625,
-0.0175323486328125,
-0.01837158203125,
0.01383209228515625,
-0.048828125,
0.0396728515625,
-0.0140380859375,
-0.00705718994140625,
-0.0251007080078125,
-0.06182861328125,
0.023284912109375,
-0.0025348663330078125,
0.00603485107421875,
-0.0262908935546875,
-0.0137939453125,
0.059356689453125,
-0.0194549560546875,
0.0760498046875,
-0.00510406494140625,
-0.0176544189453125,
0.02301025390625,
-0.0037384033203125,
0.041412353515625,
-0.00374603271484375,
-0.006988525390625,
0.039337158203125,
-0.00585174560546875,
-0.0191192626953125,
-0.0090789794921875,
0.04693603515625,
-0.07391357421875,
-0.08740234375,
-0.0377197265625,
-0.025482177734375,
0.023223876953125,
0.0113067626953125,
0.01715087890625,
0.01383209228515625,
-0.0115509033203125,
0.0185089111328125,
0.0181732177734375,
-0.0263214111328125,
0.037933349609375,
0.055633544921875,
-0.00555419921875,
-0.02105712890625,
0.0543212890625,
0.0195465087890625,
0.018646240234375,
0.0194549560546875,
0.0170440673828125,
-0.0188751220703125,
-0.0059356689453125,
-0.033660888671875,
0.051544189453125,
-0.06793212890625,
-0.020965576171875,
-0.047576904296875,
-0.00824737548828125,
-0.0178985595703125,
-0.01535797119140625,
0.00249481201171875,
-0.0338134765625,
-0.0217132568359375,
-0.0174102783203125,
0.046539306640625,
0.046478271484375,
-0.05010986328125,
0.026153564453125,
-0.0294189453125,
0.036102294921875,
0.0193328857421875,
0.004581451416015625,
-0.0082855224609375,
-0.0374755859375,
-0.00394439697265625,
0.0031871795654296875,
-0.043365478515625,
-0.07415771484375,
0.0257110595703125,
-0.0079803466796875,
0.03546142578125,
0.0360107421875,
-0.00008040666580200195,
0.058197021484375,
-0.03619384765625,
0.06671142578125,
0.0237274169921875,
-0.052459716796875,
0.023101806640625,
-0.0164337158203125,
-0.0034027099609375,
0.015777587890625,
0.029815673828125,
-0.04876708984375,
-0.0455322265625,
-0.057525634765625,
-0.0584716796875,
0.06988525390625,
0.011688232421875,
0.037322998046875,
-0.0142822265625,
0.008087158203125,
0.00975799560546875,
-0.0025539398193359375,
-0.061492919921875,
-0.050048828125,
0.0072479248046875,
0.0180206298828125,
0.0026702880859375,
-0.0257415771484375,
-0.0306396484375,
-0.031951904296875,
0.07098388671875,
0.023101806640625,
0.03271484375,
0.028350830078125,
-0.0010938644409179688,
-0.0215911865234375,
0.0206146240234375,
0.055633544921875,
0.06964111328125,
-0.00580596923828125,
0.0004963874816894531,
0.060302734375,
-0.040252685546875,
0.0308990478515625,
0.00659942626953125,
-0.03277587890625,
0.0290374755859375,
0.00975799560546875,
0.062225341796875,
0.00717926025390625,
-0.04010009765625,
0.045806884765625,
-0.002788543701171875,
-0.0217742919921875,
-0.04058837890625,
-0.0168609619140625,
0.01849365234375,
0.039886474609375,
0.035125732421875,
-0.02398681640625,
-0.01450347900390625,
-0.0390625,
-0.002239227294921875,
0.039520263671875,
-0.006046295166015625,
-0.014312744140625,
0.046630859375,
0.0224456787109375,
-0.01995849609375,
0.03143310546875,
-0.0133514404296875,
-0.03594970703125,
0.05517578125,
0.059600830078125,
0.04754638671875,
-0.0263671875,
0.037200927734375,
0.040069580078125,
0.0216827392578125,
-0.0006031990051269531,
0.0287322998046875,
0.0021800994873046875,
-0.048614501953125,
-0.017486572265625,
-0.048736572265625,
-0.01258087158203125,
-0.00234222412109375,
-0.043365478515625,
0.0292205810546875,
-0.017974853515625,
-0.033966064453125,
-0.00937652587890625,
0.00228118896484375,
-0.04266357421875,
0.0025196075439453125,
0.0158538818359375,
0.073486328125,
-0.06585693359375,
0.05047607421875,
0.05023193359375,
-0.045745849609375,
-0.042510986328125,
0.0028171539306640625,
-0.003185272216796875,
-0.0802001953125,
0.06640625,
-0.0025310516357421875,
-0.0160675048828125,
0.00206756591796875,
-0.0350341796875,
-0.0806884765625,
0.0927734375,
0.032440185546875,
-0.0254669189453125,
0.00644683837890625,
0.02569580078125,
0.04339599609375,
0.0029506683349609375,
0.017242431640625,
0.053314208984375,
0.0399169921875,
0.00749969482421875,
-0.07940673828125,
0.0156402587890625,
-0.003475189208984375,
0.012298583984375,
0.001251220703125,
-0.07476806640625,
0.06103515625,
-0.016143798828125,
-0.04132080078125,
0.0228729248046875,
0.04412841796875,
0.004459381103515625,
0.0007872581481933594,
0.036163330078125,
0.046539306640625,
0.035919189453125,
-0.021728515625,
0.09161376953125,
-0.034088134765625,
0.032196044921875,
0.01390838623046875,
-0.0048675537109375,
0.060516357421875,
0.05633544921875,
-0.0389404296875,
0.044189453125,
0.0506591796875,
-0.01140594482421875,
0.04052734375,
0.0004456043243408203,
-0.037078857421875,
-0.0156707763671875,
-0.0225982666015625,
-0.048553466796875,
0.031341552734375,
-0.0079193115234375,
-0.007732391357421875,
-0.033477783203125,
-0.006744384765625,
0.033111572265625,
-0.0174713134765625,
0.00811767578125,
0.0601806640625,
0.026214599609375,
-0.0283966064453125,
0.060028076171875,
0.01739501953125,
0.0625,
-0.06396484375,
0.0125274658203125,
-0.041595458984375,
0.01520538330078125,
-0.0177459716796875,
-0.06280517578125,
0.00994110107421875,
0.0082244873046875,
-0.0116424560546875,
-0.01904296875,
0.04022216796875,
-0.01568603515625,
-0.0479736328125,
0.045623779296875,
0.0308380126953125,
0.022857666015625,
0.047821044921875,
-0.0396728515625,
0.007228851318359375,
0.00469970703125,
-0.0458984375,
0.0192108154296875,
0.0214691162109375,
0.004230499267578125,
0.054595947265625,
0.052398681640625,
0.001430511474609375,
0.010101318359375,
-0.0254974365234375,
0.07330322265625,
-0.04278564453125,
-0.020263671875,
-0.07177734375,
0.03204345703125,
-0.01094818115234375,
-0.0194549560546875,
0.06866455078125,
0.042877197265625,
0.07501220703125,
-0.01039886474609375,
0.047760009765625,
-0.03436279296875,
0.026702880859375,
-0.0212554931640625,
0.040496826171875,
-0.054443359375,
-0.0096282958984375,
-0.0099639892578125,
-0.07293701171875,
-0.016693115234375,
0.05853271484375,
-0.0345458984375,
-0.01277923583984375,
0.049530029296875,
0.08740234375,
0.00733184814453125,
-0.0037517547607421875,
0.006999969482421875,
0.034332275390625,
0.01024627685546875,
0.0435791015625,
0.06414794921875,
-0.032867431640625,
0.047088623046875,
-0.05548095703125,
-0.0203857421875,
-0.024658203125,
-0.0531005859375,
-0.054962158203125,
-0.0394287109375,
-0.01461029052734375,
-0.0294647216796875,
0.0020771026611328125,
0.054901123046875,
0.0245513916015625,
-0.048736572265625,
-0.0433349609375,
0.0274505615234375,
0.006198883056640625,
-0.012481689453125,
-0.01092529296875,
0.037017822265625,
0.0009717941284179688,
-0.054443359375,
0.00555419921875,
0.017822265625,
0.032867431640625,
-0.0013685226440429688,
-0.01438140869140625,
-0.022979736328125,
0.004405975341796875,
0.050384521484375,
0.03363037109375,
-0.068359375,
-0.03271484375,
0.016357421875,
-0.02960205078125,
0.0187530517578125,
0.0174407958984375,
-0.043701171875,
0.004180908203125,
0.043548583984375,
0.017822265625,
0.04351806640625,
0.003932952880859375,
0.028961181640625,
-0.031097412109375,
0.055267333984375,
-0.00469970703125,
0.0296478271484375,
0.018096923828125,
-0.014404296875,
0.059844970703125,
0.03179931640625,
-0.037567138671875,
-0.06976318359375,
-0.00443267822265625,
-0.073974609375,
-0.0028705596923828125,
0.10333251953125,
0.01027679443359375,
-0.007843017578125,
0.009552001953125,
-0.043548583984375,
0.029510498046875,
-0.01436614990234375,
0.064453125,
0.07159423828125,
-0.01220703125,
0.0096282958984375,
-0.05987548828125,
0.00951385498046875,
0.0396728515625,
-0.059661865234375,
-0.01500701904296875,
0.0250244140625,
0.019683837890625,
0.01483154296875,
0.060089111328125,
-0.00473785400390625,
0.010986328125,
0.002536773681640625,
0.0121917724609375,
0.028961181640625,
0.01387786865234375,
-0.012969970703125,
-0.0252685546875,
-0.01194000244140625,
-0.0199737548828125
]
] |
Writer/camel-5b-hf | 2023-04-17T19:21:54.000Z | [
"transformers",
"pytorch",
"gpt2",
"text-generation",
"InstructGPT",
"hf",
"en",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | Writer | null | null | Writer/camel-5b-hf | 105 | 6,964 | transformers | 2023-04-10T14:19:41 | ---
license: apache-2.0
language:
- en
tags:
- InstructGPT
- hf
---
# Camel 🐪 5B
<style>
img {
display: inline;
}
</style>
## Model Description
Introducing Camel-5b, a state-of-the-art instruction-following large language model designed to deliver exceptional performance and versatility. Derived from the foundational architecture of [Palmyra-Base](https://huggingface.co/Writer/palmyra-base), Camel-5b is specifically tailored to address the growing demand for advanced natural language processing and comprehension capabilities.
The Camel-5b model is meticulously trained on an extensive dataset of approximately 70,000 instruction-response records. These records are generated by our dedicated Writer Linguist team, who possess considerable expertise in language modeling and fine-tuning techniques. By leveraging their skills and knowledge, the Camel-5b model is primed to offer unparalleled proficiency in understanding and executing language-based instructions.
One of the key differentiators of Camel-5b lies in its ability to process complex instructions and generate accurate, contextually appropriate responses. This makes it an ideal choice for a wide range of applications, including virtual assistants, customer support, content generation, and more. Additionally, the model's comprehensive training enables it to adapt and perform well under varying conditions and contexts, further expanding its potential use cases.
## Live Demo
Live demo => https://chatcamel.vercel.app/
## Deploying Camel
We used the [Baseten platform](http://baseten.co/) to package and serve Camel-5B at scale. Utilizing the open source [Truss](https://truss.baseten.co/) model packaging framework, users can create a customized environment using the simple instructions found on [GitHub](https://github.com/basetenlabs/camel-5b-truss). This repo allows users to maintain full control over the inference and deployment paths to meet their specific requirements.
We would like to thank the Baseten team for their contributions in deploying and hosting the model.
## Usage
```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM
model_name = "Writer/camel-5b-hf"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
model_name,
device_map="auto",
torch_dtype=torch.float16
)
instruction = "Describe a futuristic device that revolutionizes space travel."
PROMPT_DICT = {
"prompt_input": (
"Below is an instruction that describes a task, paired with an input that provides further context. "
"Write a response that appropriately completes the request\n\n"
"### Instruction:\n{instruction}\n\n### Input:\n{input}\n\n### Response:"
),
"prompt_no_input": (
"Below is an instruction that describes a task. "
"Write a response that appropriately completes the request.\n\n"
"### Instruction:\n{instruction}\n\n### Response:"
),
}
context = ""  # optional extra context; leave empty to use the no-input template

text = (
    PROMPT_DICT["prompt_no_input"].format(instruction=instruction)
    if not context
    else PROMPT_DICT["prompt_input"].format(instruction=instruction, input=context)
)

model_inputs = tokenizer(text, return_tensors="pt").to(model.device)
output_ids = model.generate(
**model_inputs,
max_length=256,
)
output_text = tokenizer.batch_decode(output_ids, skip_special_tokens=True)[0]
clean_output = output_text.split("### Response:")[1].strip()
print(clean_output)
```
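For reference, the template-selection and response-extraction logic above can be factored into small helpers. This is a minimal sketch; `build_prompt` and `extract_response` are illustrative names, not part of the model's API:

```python
# Alpaca-style templates, mirroring PROMPT_DICT in the snippet above.
PROMPT_INPUT = (
    "Below is an instruction that describes a task, paired with an input that provides further context. "
    "Write a response that appropriately completes the request\n\n"
    "### Instruction:\n{instruction}\n\n### Input:\n{input}\n\n### Response:"
)
PROMPT_NO_INPUT = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n### Response:"
)

def build_prompt(instruction: str, context: str = "") -> str:
    """Pick the input/no-input template depending on whether context was given."""
    if context:
        return PROMPT_INPUT.format(instruction=instruction, input=context)
    return PROMPT_NO_INPUT.format(instruction=instruction)

def extract_response(generated: str) -> str:
    """Strip the echoed prompt, keeping only the text after '### Response:'."""
    return generated.split("### Response:", 1)[-1].strip()
```

Keeping this logic in one place makes it easy to batch many instructions through the same tokenizer/generate loop shown above.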
### Limitations and Biases
Camel's core functionality is to take a string of text and predict the next token. While language models are widely used for other tasks, there are many unknowns in this work. When prompting Camel, keep in mind that the next statistically likely token is not always the token that produces the most "accurate" text. Never rely on Camel to produce factually correct results.
Camel was trained on Writer’s custom data. As with all language models, it is difficult to predict how Camel will respond to specific prompts, and offensive content may appear unexpectedly. We recommend that the outputs be curated or filtered by humans before they are released, both to censor undesirable content and to improve the quality of the results.
## Camel VS. Llama
The camel is essentially the Swiss Army knife of the animal kingdom - it can store water in its humps, survive extreme temperatures, and even provide a cushy ride for weary travelers. The llama, on the other hand, is basically just a glorified lawnmower with an attitude problem. Sure, they might have a cute, fuzzy face, but don't be deceived - one false move and you'll be greeted with a spit shower. The camel is the true MVP of the desert, and the llama can keep spitting its way into obscurity.
<img src="https://i.postimg.cc/wjXZLQbB/Camel-Llama.png" width="400px" />
## Citation and Related Information
To cite this model:
```
@misc{Camel,
author = {Writer Engineering team},
title = {{Camel-5B InstructGPT}},
howpublished = {\url{https://dev.writer.com}},
year = 2023,
month = April
}
```
[](#model-architecture)|[](#model-architecture)|[](#datasets)| | 5,410 | [
[
-0.0279083251953125,
-0.06427001953125,
0.00566864013671875,
0.03607177734375,
-0.014923095703125,
0.0004749298095703125,
-0.0006318092346191406,
-0.049957275390625,
-0.0055084228515625,
0.040557861328125,
-0.039764404296875,
-0.04644775390625,
-0.043701171875,
0.004230499267578125,
-0.037017822265625,
0.09521484375,
0.0020275115966796875,
-0.016632080078125,
-0.00467681884765625,
-0.01082611083984375,
-0.05059814453125,
-0.033935546875,
-0.0201873779296875,
-0.0295867919921875,
0.03656005859375,
0.01690673828125,
0.047821044921875,
0.051361083984375,
0.03564453125,
0.0259246826171875,
-0.0100250244140625,
0.01384735107421875,
-0.03106689453125,
-0.007793426513671875,
-0.00858306884765625,
-0.0139007568359375,
-0.02777099609375,
-0.005584716796875,
0.023468017578125,
0.043426513671875,
-0.0277252197265625,
0.0303497314453125,
-0.01910400390625,
0.0264129638671875,
-0.046600341796875,
0.01268768310546875,
-0.04132080078125,
-0.02392578125,
-0.008026123046875,
-0.0026340484619140625,
-0.0296783447265625,
-0.0149993896484375,
0.00279998779296875,
-0.0361328125,
0.01508331298828125,
0.0252838134765625,
0.09930419921875,
0.035308837890625,
-0.0297393798828125,
-0.019256591796875,
-0.048919677734375,
0.062286376953125,
-0.057220458984375,
0.0150604248046875,
0.0288543701171875,
0.0252838134765625,
-0.0264739990234375,
-0.07110595703125,
-0.060302734375,
-0.03497314453125,
-0.006198883056640625,
0.0030727386474609375,
-0.0205841064453125,
-0.007049560546875,
0.0084991455078125,
0.0217132568359375,
-0.045013427734375,
0.0060577392578125,
-0.026702880859375,
-0.0151824951171875,
0.02764892578125,
0.01174163818359375,
0.0269775390625,
-0.0226593017578125,
-0.05316162109375,
-0.01425933837890625,
-0.049041748046875,
0.017791748046875,
0.0196685791015625,
0.01464080810546875,
-0.0206451416015625,
0.046234130859375,
-0.0193023681640625,
0.051422119140625,
0.0068817138671875,
-0.0025234222412109375,
0.01541900634765625,
-0.021392822265625,
-0.027130126953125,
-0.001148223876953125,
0.06298828125,
0.0147247314453125,
-0.003047943115234375,
-0.00562286376953125,
-0.023834228515625,
0.00731658935546875,
0.0179595947265625,
-0.07586669921875,
-0.02093505859375,
0.028778076171875,
-0.044891357421875,
-0.02923583984375,
-0.014495849609375,
-0.0222930908203125,
-0.0257415771484375,
0.00836944580078125,
0.0236053466796875,
-0.033538818359375,
-0.0174407958984375,
0.025299072265625,
-0.0178375244140625,
0.03955078125,
0.008514404296875,
-0.076171875,
0.035125732421875,
0.03802490234375,
0.05438232421875,
-0.0008254051208496094,
-0.0272979736328125,
-0.015228271484375,
0.01873779296875,
-0.0132293701171875,
0.07366943359375,
-0.03497314453125,
-0.03912353515625,
-0.0200347900390625,
0.006694793701171875,
0.0120391845703125,
-0.026031494140625,
0.037506103515625,
-0.04925537109375,
0.01885986328125,
-0.0295867919921875,
-0.043609619140625,
-0.030303955078125,
0.012603759765625,
-0.04693603515625,
0.0758056640625,
0.0085906982421875,
-0.072998046875,
0.012786865234375,
-0.050262451171875,
-0.02154541015625,
-0.0005545616149902344,
0.000598907470703125,
-0.03326416015625,
-0.0189666748046875,
0.01715087890625,
0.029022216796875,
-0.04229736328125,
0.0192718505859375,
-0.0200347900390625,
-0.0305633544921875,
0.0276947021484375,
-0.031524658203125,
0.07305908203125,
0.00737762451171875,
-0.032501220703125,
0.0181427001953125,
-0.06768798828125,
-0.01213836669921875,
0.0232086181640625,
-0.0380859375,
-0.0007672309875488281,
-0.0020885467529296875,
0.0100860595703125,
-0.00199127197265625,
0.034210205078125,
-0.041534423828125,
0.01439666748046875,
-0.036468505859375,
0.03314208984375,
0.061553955078125,
-0.0065155029296875,
0.0225677490234375,
-0.037322998046875,
0.033660888671875,
-0.00496673583984375,
0.01300811767578125,
-0.007488250732421875,
-0.0379638671875,
-0.061767578125,
-0.0245513916015625,
0.023406982421875,
0.049041748046875,
-0.026214599609375,
0.0367431640625,
-0.01103973388671875,
-0.059600830078125,
-0.03533935546875,
-0.0018253326416015625,
0.032440185546875,
0.055450439453125,
0.047332763671875,
-0.0222625732421875,
-0.0413818359375,
-0.0556640625,
-0.00659942626953125,
-0.0287322998046875,
0.0108184814453125,
0.02294921875,
0.046417236328125,
-0.028961181640625,
0.06719970703125,
-0.03662109375,
-0.00003212690353393555,
-0.032562255859375,
0.007076263427734375,
0.026153564453125,
0.053558349609375,
0.048187255859375,
-0.046905517578125,
-0.0120086669921875,
-0.001445770263671875,
-0.04803466796875,
-0.00792694091796875,
-0.0009465217590332031,
-0.036468505859375,
0.01971435546875,
0.00261688232421875,
-0.07061767578125,
0.032867431640625,
0.035186767578125,
-0.029571533203125,
0.0596923828125,
-0.0164947509765625,
-0.01045989990234375,
-0.0697021484375,
0.0267791748046875,
-0.01122283935546875,
-0.0020503997802734375,
-0.033111572265625,
-0.0012636184692382812,
-0.0032482147216796875,
0.01313018798828125,
-0.0242462158203125,
0.056488037109375,
-0.03485107421875,
-0.003376007080078125,
-0.022430419921875,
-0.0204925537109375,
0.01458740234375,
0.047210693359375,
0.0015411376953125,
0.043731689453125,
0.033111572265625,
-0.053131103515625,
0.044891357421875,
0.034210205078125,
-0.0233001708984375,
0.00736236572265625,
-0.08197021484375,
0.00739288330078125,
-0.00026726722717285156,
0.04425048828125,
-0.08233642578125,
-0.014068603515625,
0.032470703125,
-0.038330078125,
0.0148162841796875,
0.007732391357421875,
-0.0311737060546875,
-0.0384521484375,
-0.015533447265625,
0.016571044921875,
0.051422119140625,
-0.04010009765625,
0.0423583984375,
0.0178985595703125,
0.0035247802734375,
-0.07989501953125,
-0.050872802734375,
-0.0008769035339355469,
-0.00927734375,
-0.040863037109375,
0.038116455078125,
-0.0194549560546875,
0.0056610107421875,
0.002040863037109375,
-0.01096343994140625,
-0.005603790283203125,
0.0150604248046875,
0.0229949951171875,
0.046478271484375,
-0.00899505615234375,
-0.005405426025390625,
-0.0067596435546875,
-0.00604248046875,
0.0035991668701171875,
-0.0204315185546875,
0.04473876953125,
-0.018341064453125,
-0.0201263427734375,
-0.045135498046875,
0.0210418701171875,
0.035797119140625,
-0.024200439453125,
0.07513427734375,
0.05364990234375,
-0.0177459716796875,
-0.0102996826171875,
-0.045379638671875,
-0.01010894775390625,
-0.03955078125,
0.0228118896484375,
-0.035369873046875,
-0.0208282470703125,
0.0439453125,
0.005519866943359375,
0.026336669921875,
0.024200439453125,
0.052825927734375,
-0.007511138916015625,
0.075439453125,
0.069091796875,
-0.00691986083984375,
0.042144775390625,
-0.03955078125,
0.01177215576171875,
-0.061126708984375,
-0.03558349609375,
-0.04833984375,
-0.0134124755859375,
-0.0504150390625,
-0.0221710205078125,
0.0035915374755859375,
-0.003452301025390625,
-0.050079345703125,
0.03643798828125,
-0.056427001953125,
0.01070404052734375,
0.0484619140625,
-0.00319671630859375,
0.00724029541015625,
-0.010284423828125,
-0.0207366943359375,
0.02984619140625,
-0.053741455078125,
-0.060302734375,
0.06292724609375,
0.025390625,
0.070068359375,
0.00809478759765625,
0.037841796875,
0.01265716552734375,
0.02001953125,
-0.052215576171875,
0.044403076171875,
0.012237548828125,
-0.04345703125,
-0.01541900634765625,
-0.00838470458984375,
-0.08544921875,
0.0285186767578125,
0.0013885498046875,
-0.039794921875,
0.00821685791015625,
0.0036716461181640625,
-0.032806396484375,
0.0095977783203125,
-0.049346923828125,
0.0731201171875,
-0.0128326416015625,
-0.01535797119140625,
-0.0008897781372070312,
-0.06561279296875,
0.023529052734375,
0.0307159423828125,
0.0159454345703125,
-0.016998291015625,
-0.0171051025390625,
0.047515869140625,
-0.056732177734375,
0.07647705078125,
-0.0017986297607421875,
0.0007886886596679688,
0.0267333984375,
-0.001232147216796875,
0.038787841796875,
-0.016204833984375,
-0.012786865234375,
0.0189056396484375,
0.00228118896484375,
-0.046417236328125,
-0.030120849609375,
0.04595947265625,
-0.09033203125,
-0.05108642578125,
-0.033447265625,
-0.037567138671875,
0.0121612548828125,
0.01006317138671875,
0.01837158203125,
0.0266265869140625,
-0.01702880859375,
-0.0027904510498046875,
0.02789306640625,
-0.0343017578125,
0.04718017578125,
0.04241943359375,
-0.0028934478759765625,
-0.0205841064453125,
0.07000732421875,
0.004955291748046875,
0.00897979736328125,
0.029266357421875,
0.018707275390625,
-0.027801513671875,
-0.040435791015625,
-0.059783935546875,
0.027374267578125,
-0.06585693359375,
-0.0230865478515625,
-0.06414794921875,
-0.01470184326171875,
-0.03314208984375,
-0.0037441253662109375,
-0.02044677734375,
-0.0265655517578125,
-0.037872314453125,
-0.0092315673828125,
0.04266357421875,
0.0511474609375,
0.00960540771484375,
0.025299072265625,
-0.044677734375,
0.0196990966796875,
0.014923095703125,
0.000247955322265625,
0.007843017578125,
-0.050750732421875,
-0.0098876953125,
0.0012617111206054688,
-0.038543701171875,
-0.0665283203125,
0.04296875,
0.00021076202392578125,
0.0445556640625,
0.0166473388671875,
0.010772705078125,
0.060455322265625,
-0.0389404296875,
0.073486328125,
0.01181793212890625,
-0.0697021484375,
0.04449462890625,
-0.0165557861328125,
0.01947021484375,
0.0013799667358398438,
0.02227783203125,
-0.03369140625,
-0.0302276611328125,
-0.05450439453125,
-0.07635498046875,
0.046173095703125,
0.01531982421875,
0.0228729248046875,
-0.005863189697265625,
0.0108184814453125,
0.019500732421875,
0.0210723876953125,
-0.06561279296875,
-0.023345947265625,
-0.045013427734375,
-0.010772705078125,
0.0255126953125,
-0.01012420654296875,
-0.001644134521484375,
-0.0173797607421875,
0.0596923828125,
-0.0008625984191894531,
0.030303955078125,
0.0020885467529296875,
-0.009857177734375,
0.0120086669921875,
0.0004572868347167969,
0.045684814453125,
0.0447998046875,
-0.0123138427734375,
-0.0004215240478515625,
0.02099609375,
-0.031982421875,
-0.0032978057861328125,
0.007183074951171875,
-0.0017251968383789062,
-0.00040721893310546875,
0.03839111328125,
0.0755615234375,
0.004863739013671875,
-0.04931640625,
0.038299560546875,
-0.00821685791015625,
0.020050048828125,
-0.0258941650390625,
0.01739501953125,
0.0171661376953125,
0.00921630859375,
0.0235137939453125,
0.01050567626953125,
0.0173492431640625,
-0.0389404296875,
-0.01490020751953125,
0.0259857177734375,
0.003726959228515625,
-0.026947021484375,
0.062225341796875,
0.022735595703125,
-0.0177001953125,
0.03851318359375,
-0.003322601318359375,
-0.032318115234375,
0.06878662109375,
0.0548095703125,
0.03961181640625,
-0.00615692138671875,
0.016815185546875,
0.0261688232421875,
0.0150299072265625,
-0.0035552978515625,
0.02276611328125,
-0.0172271728515625,
-0.05206298828125,
-0.0175628662109375,
-0.06134033203125,
-0.029266357421875,
-0.0025310516357421875,
-0.035186767578125,
0.03466796875,
-0.039154052734375,
-0.00897216796875,
0.0008740425109863281,
0.01395416259765625,
-0.05999755859375,
0.0179443359375,
0.0049285888671875,
0.07086181640625,
-0.0430908203125,
0.060333251953125,
0.04931640625,
-0.06390380859375,
-0.0794677734375,
-0.0222625732421875,
-0.020599365234375,
-0.077880859375,
0.05401611328125,
0.022918701171875,
-0.0194549560546875,
0.0006737709045410156,
-0.06390380859375,
-0.07025146484375,
0.08935546875,
0.0276031494140625,
-0.0190277099609375,
-0.01233673095703125,
-0.01554107666015625,
0.0288543701171875,
-0.025299072265625,
0.054840087890625,
0.0426025390625,
0.03131103515625,
0.03143310546875,
-0.0726318359375,
0.0268707275390625,
-0.017242431640625,
0.0194854736328125,
-0.0204620361328125,
-0.0670166015625,
0.07708740234375,
-0.03668212890625,
-0.00960540771484375,
0.032867431640625,
0.06396484375,
0.035919189453125,
-0.0032329559326171875,
0.0223388671875,
0.021881103515625,
0.07391357421875,
0.0098876953125,
0.10089111328125,
-0.02801513671875,
0.0230560302734375,
0.06866455078125,
-0.01215362548828125,
0.06817626953125,
0.026580810546875,
-0.0084381103515625,
0.0821533203125,
0.07086181640625,
-0.0082855224609375,
0.049652099609375,
0.02288818359375,
-0.0160369873046875,
-0.00815582275390625,
-0.020294189453125,
-0.040985107421875,
0.04669189453125,
0.01898193359375,
-0.018310546875,
-0.0038089752197265625,
-0.007289886474609375,
0.00867462158203125,
-0.019805908203125,
0.0028209686279296875,
0.04583740234375,
-0.0006279945373535156,
-0.0435791015625,
0.0823974609375,
0.0147247314453125,
0.055694580078125,
-0.03692626953125,
-0.00804901123046875,
-0.0139007568359375,
0.0028438568115234375,
-0.0164642333984375,
-0.035125732421875,
0.00489044189453125,
0.00969696044921875,
-0.0204620361328125,
-0.01100921630859375,
0.04754638671875,
-0.01100921630859375,
-0.040924072265625,
0.0176239013671875,
0.0262451171875,
0.03509521484375,
0.0218048095703125,
-0.059906005859375,
0.0240325927734375,
-0.017333984375,
-0.042816162109375,
0.0008463859558105469,
0.035369873046875,
-0.0022449493408203125,
0.07666015625,
0.06201171875,
0.005115509033203125,
0.004772186279296875,
0.01081085205078125,
0.07720947265625,
-0.06011962890625,
-0.036224365234375,
-0.06549072265625,
0.04638671875,
-0.0042572021484375,
-0.0204925537109375,
0.04132080078125,
0.028411865234375,
0.049896240234375,
-0.00011897087097167969,
0.06585693359375,
-0.0202178955078125,
0.02069091796875,
-0.017486572265625,
0.05242919921875,
-0.0258331298828125,
0.036346435546875,
-0.00493621826171875,
-0.04486083984375,
-0.001373291015625,
0.061676025390625,
-0.0189208984375,
0.0158843994140625,
0.052337646484375,
0.07147216796875,
0.0068206787109375,
-0.0034999847412109375,
0.01177978515625,
0.0311431884765625,
0.0355224609375,
0.040863037109375,
0.06549072265625,
-0.03948974609375,
0.056427001953125,
-0.0092926025390625,
-0.01343536376953125,
-0.0244140625,
-0.0654296875,
-0.07135009765625,
-0.0296783447265625,
-0.019622802734375,
-0.0288543701171875,
-0.0018024444580078125,
0.07220458984375,
0.059173583984375,
-0.055450439453125,
-0.0266571044921875,
-0.0133209228515625,
0.026397705078125,
-0.029022216796875,
-0.01313018798828125,
0.025115966796875,
-0.0218048095703125,
-0.053619384765625,
0.01239776611328125,
-0.0019130706787109375,
0.01806640625,
-0.03192138671875,
-0.00011295080184936523,
-0.0178680419921875,
0.02264404296875,
0.03204345703125,
0.0213165283203125,
-0.07080078125,
-0.029571533203125,
0.0114593505859375,
-0.027801513671875,
-0.00121307373046875,
0.01922607421875,
-0.0797119140625,
0.00981903076171875,
0.0274505615234375,
0.046295166015625,
0.0457763671875,
-0.007228851318359375,
0.039154052734375,
-0.031951904296875,
0.01152801513671875,
0.01092529296875,
0.030670166015625,
0.03948974609375,
-0.037017822265625,
0.0262908935546875,
0.0162506103515625,
-0.047607421875,
-0.06591796875,
0.00579071044921875,
-0.0849609375,
-0.027587890625,
0.08551025390625,
-0.0179595947265625,
-0.04425048828125,
-0.0009469985961914062,
-0.03900146484375,
0.03717041015625,
-0.038818359375,
0.068603515625,
0.052337646484375,
-0.01479339599609375,
0.00664520263671875,
-0.040069580078125,
0.022369384765625,
0.0293731689453125,
-0.056304931640625,
-0.0046539306640625,
0.03424072265625,
0.044891357421875,
-0.00324249267578125,
0.0489501953125,
0.0191650390625,
0.0250396728515625,
0.007293701171875,
0.025390625,
-0.01007080078125,
-0.01071929931640625,
0.0013761520385742188,
0.01140594482421875,
0.002651214599609375,
-0.0440673828125
]
] |
HuggingFaceH4/starchat-alpha | 2023-06-08T21:15:30.000Z | [
"transformers",
"pytorch",
"tensorboard",
"safetensors",
"gpt_bigcode",
"text-generation",
"code",
"en",
"dataset:OpenAssistant/oasst1",
"dataset:databricks/databricks-dolly-15k",
"license:bigcode-openrail-m",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | HuggingFaceH4 | null | null | HuggingFaceH4/starchat-alpha | 221 | 6,964 | transformers | 2023-05-09T08:57:06 | ---
license: bigcode-openrail-m
datasets:
- OpenAssistant/oasst1
- databricks/databricks-dolly-15k
language:
- en
library_name: transformers
tags:
- code
---
# Model Card for StarChat Alpha
<!-- Provide a quick summary of what the model is/does. -->
_Note, you may be interested in the Beta version of StarChat [here](https://huggingface.co/HuggingFaceH4/starchat-beta)._
StarChat is a series of language models that are fine-tuned from StarCoder to act as helpful coding assistants. StarChat Alpha is the first of these models, and as an alpha release it is only intended for educational or research purposes. In particular, the model has not been aligned to human preferences with techniques like RLHF, so it may generate problematic content (especially when prompted to do so).
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Model type:** A 16B parameter GPT-like model fine-tuned on a blend of the [`oasst1`](https://huggingface.co/datasets/OpenAssistant/oasst1) and [`databricks-dolly-15k`](https://huggingface.co/datasets/databricks/databricks-dolly-15k) datasets.
- **Language(s) (NLP):** English
- **License:** BigCode Open RAIL-M v1
- **Finetuned from model:** [bigcode/starcoderbase](https://huggingface.co/bigcode/starcoderbase)
### Model Sources
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/bigcode-project/starcoder
- **Demo:** https://huggingface.co/spaces/HuggingFaceH4/starchat-playground
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
StarChat Alpha is intended for educational and/or research purposes and in that respect can be used to probe the programming capabilities of open-source language models.
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
StarChat Alpha has not been aligned to human preferences with techniques like RLHF or deployed with in-the-loop filtering of responses like ChatGPT, so the model can produce problematic outputs (especially when prompted to do so).
Models trained primarily on code data will also have a more skewed demographic bias commensurate with the demographics of the GitHub community; for more on this, see the [StarCoder dataset](https://huggingface.co/datasets/bigcode/starcoderdata), which is derived from The Stack.
Since the base model was pretrained on a large corpus of code, it may produce code snippets that are syntactically valid but semantically incorrect.
For example, it may produce code that does not compile or that produces incorrect results.
It may also produce code that is vulnerable to security exploits.
We have also observed that the model tends to produce false URLs, which should be carefully inspected before clicking.
StarChat Alpha was fine-tuned from the base model [StarCoder Base](https://huggingface.co/bigcode/starcoderbase); please refer to its model card's [Limitations Section](https://huggingface.co/bigcode/starcoderbase#limitations) for relevant information.
In particular, the model was evaluated on some categories of gender biases, propensity for toxicity, and risk of suggesting code completions with known security flaws; these evaluations are reported in its [technical report](https://drive.google.com/file/d/1cN-b9GnWtHzQRoE7M7gAEyivY0kl4BYs/view).
## How to Get Started with the Model
Use the code below to get started with the model.
Here's how you can run the model using the `pipeline()` function from 🤗 Transformers:
```python
import torch
from transformers import pipeline
pipe = pipeline("text-generation", model="HuggingFaceH4/starchat-alpha", torch_dtype=torch.bfloat16, device_map="auto")
prompt_template = "<|system|>\n<|end|>\n<|user|>\n{query}<|end|>\n<|assistant|>"
prompt = prompt_template.format(query="How do I sort a list in Python?")
# We use a special <|end|> token with ID 49155 to denote the end of a turn
outputs = pipe(prompt, max_new_tokens=256, do_sample=True, temperature=0.2, top_k=50, top_p=0.95, eos_token_id=49155)
# You can sort a list in Python by using the sort() method. Here's an example:\n\n```\nnumbers = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5]\nnumbers.sort()\nprint(numbers)\n```\n\nThis will sort the list in place and print the sorted list.
```
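The dialogue template and `<|end|>` delimiter above can be exercised without downloading the model. The sketch below is illustrative only: `fake_generation` is an invented stand-in for real model output, and only the template string is taken from the snippet above.

```python
# Minimal sketch of assembling the StarChat prompt and recovering the
# assistant turn from generated text. No model is loaded here.
END = "<|end|>"
PROMPT_TEMPLATE = "<|system|>\n" + END + "\n<|user|>\n{query}" + END + "\n<|assistant|>"

def build_prompt(query: str) -> str:
    return PROMPT_TEMPLATE.format(query=query)

def extract_reply(generated: str, prompt: str) -> str:
    # Generation echoes the prompt; drop it, then cut at the first end-of-turn marker.
    reply = generated[len(prompt):]
    return reply.split(END, 1)[0].strip()

prompt = build_prompt("How do I sort a list in Python?")
fake_generation = prompt + "\nUse the sort() method." + END
print(extract_reply(fake_generation, prompt))  # Use the sort() method.
```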
## Citation
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
```
@article{Tunstall2023starchat-alpha,
author = {Tunstall, Lewis and Lambert, Nathan and Rajani, Nazneen and Beeching, Edward and Le Scao, Teven and von Werra, Leandro and Han, Sheon and Schmid, Philipp and Rush, Alexander},
title = {Creating a Coding Assistant with StarCoder},
journal = {Hugging Face Blog},
year = {2023},
note = {https://huggingface.co/blog/starchat},
}
``` | 4,951 | [
[
-0.0298004150390625,
-0.0621337890625,
0.01067352294921875,
0.01229095458984375,
-0.012054443359375,
-0.00594329833984375,
-0.01629638671875,
-0.05828857421875,
0.0247955322265625,
0.03582763671875,
-0.04345703125,
-0.038726806640625,
-0.046142578125,
-0.00618743896484375,
-0.03399658203125,
0.10760498046875,
0.0139923095703125,
0.0026416778564453125,
-0.00736236572265625,
-0.00665283203125,
-0.03582763671875,
-0.04742431640625,
-0.0310211181640625,
-0.017578125,
0.0272674560546875,
0.044219970703125,
0.07366943359375,
0.048583984375,
0.0556640625,
0.022003173828125,
-0.005084991455078125,
-0.01702880859375,
-0.03338623046875,
0.00406646728515625,
0.005321502685546875,
-0.034820556640625,
-0.049652099609375,
0.00821685791015625,
0.0219573974609375,
0.020721435546875,
0.0137786865234375,
0.037322998046875,
-0.00562286376953125,
0.03753662109375,
-0.0283966064453125,
0.03955078125,
-0.028656005859375,
-0.023529052734375,
-0.012420654296875,
0.00824737548828125,
-0.0211181640625,
-0.040618896484375,
0.0018930435180664062,
-0.01409912109375,
0.0038280487060546875,
0.019744873046875,
0.08349609375,
-0.0018835067749023438,
-0.0018625259399414062,
-0.043212890625,
-0.06976318359375,
0.0400390625,
-0.060394287109375,
0.035888671875,
0.037445068359375,
0.033355712890625,
0.0155792236328125,
-0.059600830078125,
-0.044403076171875,
-0.0362548828125,
-0.0038890838623046875,
-0.022186279296875,
-0.005405426025390625,
-0.0023136138916015625,
0.024200439453125,
0.048065185546875,
-0.056396484375,
-0.005950927734375,
-0.058135986328125,
-0.012725830078125,
0.0445556640625,
-0.00490570068359375,
0.039459228515625,
-0.0310211181640625,
-0.006916046142578125,
-0.0197601318359375,
-0.037384033203125,
0.004154205322265625,
0.03228759765625,
0.036407470703125,
-0.0323486328125,
0.055419921875,
0.005321502685546875,
0.032470703125,
0.0007266998291015625,
0.007572174072265625,
0.022979736328125,
-0.0267791748046875,
-0.00786590576171875,
-0.021026611328125,
0.06951904296875,
0.027557373046875,
0.031951904296875,
0.0030040740966796875,
-0.014007568359375,
0.025115966796875,
0.01690673828125,
-0.0718994140625,
-0.033111572265625,
0.032562255859375,
-0.04486083984375,
-0.033721923828125,
-0.00942230224609375,
-0.054290771484375,
-0.017303466796875,
0.001102447509765625,
0.019683837890625,
-0.032623291015625,
-0.0222625732421875,
-0.00832366943359375,
0.007450103759765625,
0.019012451171875,
-0.005115509033203125,
-0.07415771484375,
0.0245819091796875,
0.04241943359375,
0.055206298828125,
0.020904541015625,
-0.0283050537109375,
-0.0091705322265625,
-0.0477294921875,
-0.004688262939453125,
0.036956787109375,
-0.04296875,
-0.0134735107421875,
-0.00618743896484375,
-0.0080718994140625,
0.02264404296875,
-0.019439697265625,
0.027679443359375,
-0.044189453125,
0.02081298828125,
-0.0221710205078125,
-0.038543701171875,
-0.00827789306640625,
0.003948211669921875,
-0.043365478515625,
0.06390380859375,
0.00635528564453125,
-0.060821533203125,
0.01361083984375,
-0.050506591796875,
-0.0098724365234375,
0.00750732421875,
-0.00952911376953125,
-0.027679443359375,
-0.0199737548828125,
0.0188751220703125,
0.03204345703125,
-0.0229949951171875,
0.01132965087890625,
-0.036529541015625,
-0.0194854736328125,
0.0220184326171875,
0.002468109130859375,
0.057098388671875,
0.0361328125,
-0.0170135498046875,
0.0146026611328125,
-0.0258331298828125,
0.0061492919921875,
0.027557373046875,
-0.00826263427734375,
-0.00827789306640625,
-0.0150146484375,
0.026336669921875,
0.01654052734375,
0.02685546875,
-0.0386962890625,
0.0340576171875,
-0.018890380859375,
0.03961181640625,
0.041229248046875,
-0.0035381317138671875,
0.04534912109375,
-0.0322265625,
0.04205322265625,
-0.01113128662109375,
0.03900146484375,
-0.019439697265625,
-0.06573486328125,
-0.057373046875,
0.0014982223510742188,
0.0328369140625,
0.046356201171875,
-0.052642822265625,
0.042266845703125,
-0.0039043426513671875,
-0.0631103515625,
-0.058319091796875,
0.0129852294921875,
0.040771484375,
0.0084991455078125,
0.0176544189453125,
-0.0187530517578125,
-0.0509033203125,
-0.058135986328125,
-0.0029754638671875,
-0.038665771484375,
0.01232147216796875,
0.03948974609375,
0.0496826171875,
-0.0333251953125,
0.06988525390625,
-0.042144775390625,
-0.004398345947265625,
-0.0254974365234375,
-0.010040283203125,
0.0211944580078125,
0.03814697265625,
0.04644775390625,
-0.05615234375,
-0.0251617431640625,
-0.014404296875,
-0.059600830078125,
-0.0010995864868164062,
0.0111236572265625,
-0.01233673095703125,
0.01004791259765625,
0.0222320556640625,
-0.055694580078125,
0.0278472900390625,
0.05914306640625,
-0.04278564453125,
0.04437255859375,
0.007755279541015625,
0.005565643310546875,
-0.07470703125,
0.0230560302734375,
0.00988006591796875,
-0.01197052001953125,
-0.04949951171875,
0.02227783203125,
0.00899505615234375,
-0.033721923828125,
-0.0279998779296875,
0.055816650390625,
-0.0296173095703125,
-0.002010345458984375,
-0.0239715576171875,
-0.005527496337890625,
-0.016387939453125,
0.04351806640625,
-0.006633758544921875,
0.048095703125,
0.051055908203125,
-0.041412353515625,
0.021759033203125,
0.03009033203125,
-0.011871337890625,
0.0028858184814453125,
-0.05841064453125,
0.0229949951171875,
-0.00827789306640625,
0.0104827880859375,
-0.0693359375,
-0.0032978057861328125,
0.042724609375,
-0.064453125,
0.01251983642578125,
-0.0176849365234375,
-0.02587890625,
-0.047332763671875,
-0.0045013427734375,
0.0280609130859375,
0.0687255859375,
-0.0313720703125,
0.02239990234375,
0.04339599609375,
-0.021514892578125,
-0.037567138671875,
-0.037200927734375,
-0.006439208984375,
-0.0032711029052734375,
-0.055755615234375,
0.001972198486328125,
-0.0263671875,
-0.01448822021484375,
-0.0011644363403320312,
0.004154205322265625,
-0.00878143310546875,
-0.0208282470703125,
0.027557373046875,
0.0312347412109375,
-0.0082244873046875,
-0.0038280487060546875,
-0.031829833984375,
-0.004283905029296875,
0.0008187294006347656,
-0.0177459716796875,
0.05462646484375,
-0.03924560546875,
-0.01226043701171875,
-0.0210723876953125,
0.0219573974609375,
0.059600830078125,
-0.01812744140625,
0.047760009765625,
0.0443115234375,
-0.033355712890625,
0.00222015380859375,
-0.0400390625,
-0.04083251953125,
-0.03631591796875,
0.019927978515625,
-0.0038967132568359375,
-0.052276611328125,
0.0555419921875,
0.040679931640625,
0.034271240234375,
0.02874755859375,
0.0430908203125,
0.0168914794921875,
0.0687255859375,
0.06292724609375,
-0.023895263671875,
0.046173095703125,
-0.0286865234375,
0.02545166015625,
-0.049896240234375,
-0.01959228515625,
-0.052459716796875,
-0.01214599609375,
-0.051788330078125,
-0.0272064208984375,
-0.0027751922607421875,
0.006504058837890625,
-0.0260009765625,
0.052276611328125,
-0.05889892578125,
0.02099609375,
0.037017822265625,
0.01294708251953125,
0.007343292236328125,
-0.0177459716796875,
0.01403045654296875,
0.0024089813232421875,
-0.04718017578125,
-0.050140380859375,
0.09002685546875,
0.055908203125,
0.0777587890625,
0.0005064010620117188,
0.048553466796875,
0.016510009765625,
0.0335693359375,
-0.04705810546875,
0.042572021484375,
0.0030269622802734375,
-0.060272216796875,
-0.01000213623046875,
-0.031982421875,
-0.0643310546875,
-0.00269317626953125,
-0.0206146240234375,
-0.058990478515625,
0.00818634033203125,
0.0239410400390625,
-0.0155792236328125,
0.01004791259765625,
-0.070068359375,
0.075927734375,
-0.02874755859375,
-0.0104522705078125,
0.016021728515625,
-0.033660888671875,
0.039703369140625,
0.01253509521484375,
0.012725830078125,
-0.0023288726806640625,
0.0218048095703125,
0.05816650390625,
-0.042755126953125,
0.065673828125,
-0.0148468017578125,
0.01311492919921875,
0.0254058837890625,
0.0094757080078125,
0.037841796875,
0.01253509521484375,
0.0000731348991394043,
0.03851318359375,
-0.00795745849609375,
-0.029144287109375,
-0.027496337890625,
0.06463623046875,
-0.0765380859375,
-0.0273590087890625,
-0.0255279541015625,
-0.03924560546875,
0.013885498046875,
0.0201416015625,
0.0283660888671875,
0.022613525390625,
-0.0022640228271484375,
0.004486083984375,
0.0209808349609375,
-0.0255584716796875,
0.0216217041015625,
0.0261383056640625,
-0.052276611328125,
-0.017181396484375,
0.05316162109375,
0.0015497207641601562,
0.00798797607421875,
0.0005388259887695312,
0.00946807861328125,
-0.0338134765625,
-0.028839111328125,
-0.0305023193359375,
0.033111572265625,
-0.056243896484375,
-0.01666259765625,
-0.060699462890625,
-0.0340576171875,
-0.046844482421875,
-0.006397247314453125,
-0.0430908203125,
-0.01490020751953125,
-0.01654052734375,
0.0070037841796875,
0.057098388671875,
0.05596923828125,
0.0261688232421875,
0.02056884765625,
-0.045654296875,
0.0164642333984375,
0.0155487060546875,
0.04010009765625,
-0.001674652099609375,
-0.0343017578125,
-0.015533447265625,
0.017669677734375,
-0.0232391357421875,
-0.06463623046875,
0.038421630859375,
-0.0024089813232421875,
0.027740478515625,
0.0214385986328125,
0.00372314453125,
0.0391845703125,
-0.0248870849609375,
0.06695556640625,
0.006336212158203125,
-0.057952880859375,
0.04925537109375,
-0.0272979736328125,
0.027435302734375,
0.029571533203125,
0.033538818359375,
-0.0426025390625,
-0.048095703125,
-0.066650390625,
-0.06268310546875,
0.05224609375,
0.0430908203125,
0.01129150390625,
0.0006251335144042969,
0.034423828125,
-0.006259918212890625,
0.01198577880859375,
-0.07977294921875,
-0.03570556640625,
-0.025726318359375,
0.0018100738525390625,
-0.013031005859375,
-0.0102691650390625,
0.01338958740234375,
-0.0255126953125,
0.04156494140625,
0.008026123046875,
0.05364990234375,
-0.0035037994384765625,
-0.01000213623046875,
-0.01241302490234375,
0.0029315948486328125,
0.041839599609375,
0.047943115234375,
-0.03662109375,
-0.01401519775390625,
-0.00550079345703125,
-0.05328369140625,
-0.01129150390625,
0.0152740478515625,
-0.005794525146484375,
-0.002796173095703125,
0.020263671875,
0.07171630859375,
0.0087738037109375,
-0.02996826171875,
0.03631591796875,
-0.018035888671875,
0.006488800048828125,
-0.03076171875,
0.0108184814453125,
0.0133514404296875,
0.0165863037109375,
0.02557373046875,
0.0164337158203125,
-0.006195068359375,
-0.0546875,
0.024932861328125,
0.020263671875,
-0.0245361328125,
-0.025970458984375,
0.059844970703125,
0.0274505615234375,
-0.054107666015625,
0.06658935546875,
-0.01169586181640625,
-0.032928466796875,
0.05328369140625,
0.03082275390625,
0.060516357421875,
-0.02252197265625,
0.003910064697265625,
0.034942626953125,
0.03692626953125,
-0.0018968582153320312,
0.02581787109375,
0.01629638671875,
-0.0560302734375,
-0.0103607177734375,
-0.043609619140625,
-0.03240966796875,
0.0228271484375,
-0.04156494140625,
0.039703369140625,
-0.044952392578125,
-0.024383544921875,
0.018402099609375,
0.007465362548828125,
-0.07281494140625,
-0.010040283203125,
0.0159454345703125,
0.07330322265625,
-0.0802001953125,
0.055755615234375,
0.060882568359375,
-0.056243896484375,
-0.058074951171875,
-0.01104736328125,
-0.00528717041015625,
-0.05926513671875,
0.0297393798828125,
0.01332855224609375,
0.035797119140625,
-0.014129638671875,
-0.051116943359375,
-0.06854248046875,
0.07086181640625,
0.017791748046875,
-0.039886474609375,
-0.01160430908203125,
-0.007366180419921875,
0.04742431640625,
-0.040618896484375,
0.0653076171875,
0.020050048828125,
0.033355712890625,
0.01055145263671875,
-0.089111328125,
0.012725830078125,
-0.046173095703125,
0.00013077259063720703,
-0.00031280517578125,
-0.0443115234375,
0.08721923828125,
-0.021484375,
-0.003326416015625,
-0.00408172607421875,
0.044036865234375,
0.024688720703125,
0.0056610107421875,
0.02545166015625,
0.00992584228515625,
0.0477294921875,
-0.005390167236328125,
0.09039306640625,
-0.04803466796875,
0.0452880859375,
0.052825927734375,
-0.0150146484375,
0.044830322265625,
0.0183563232421875,
-0.0059967041015625,
0.03155517578125,
0.049530029296875,
-0.0091094970703125,
0.02056884765625,
0.00015664100646972656,
0.010589599609375,
0.004608154296875,
0.0208740234375,
-0.039886474609375,
0.032989501953125,
0.0245361328125,
-0.03790283203125,
-0.0157318115234375,
0.005329132080078125,
0.016204833984375,
-0.0221710205078125,
-0.01161956787109375,
0.044952392578125,
0.01080322265625,
-0.05816650390625,
0.071533203125,
0.00244140625,
0.06695556640625,
-0.0452880859375,
-0.00431060791015625,
-0.02313232421875,
0.0216827392578125,
-0.018646240234375,
-0.053466796875,
0.0161285400390625,
0.005992889404296875,
-0.0229949951171875,
-0.01172637939453125,
0.055694580078125,
-0.029083251953125,
-0.0338134765625,
0.0184478759765625,
0.024993896484375,
0.038238525390625,
-0.0010290145874023438,
-0.0792236328125,
0.034271240234375,
0.0010967254638671875,
-0.02386474609375,
0.0186004638671875,
-0.0022754669189453125,
-0.003910064697265625,
0.07464599609375,
0.050689697265625,
0.0091400146484375,
-0.00991058349609375,
-0.01149749755859375,
0.078369140625,
-0.052398681640625,
-0.048675537109375,
-0.05230712890625,
0.043212890625,
0.0026683807373046875,
-0.033416748046875,
0.061248779296875,
0.0390625,
0.043426513671875,
-0.0130462646484375,
0.057952880859375,
-0.03369140625,
0.027679443359375,
-0.0201873779296875,
0.0731201171875,
-0.0154571533203125,
0.0316162109375,
-0.03961181640625,
-0.07891845703125,
-0.007305145263671875,
0.0531005859375,
-0.0239410400390625,
0.035491943359375,
0.0469970703125,
0.0811767578125,
-0.01309967041015625,
0.01300811767578125,
0.01080322265625,
0.02166748046875,
0.023101806640625,
0.037506103515625,
0.043914794921875,
-0.055328369140625,
0.054107666015625,
-0.019317626953125,
-0.038665771484375,
-0.0254974365234375,
-0.043853759765625,
-0.080322265625,
-0.05511474609375,
-0.0285491943359375,
-0.051910400390625,
0.009063720703125,
0.070068359375,
0.07415771484375,
-0.055023193359375,
-0.0294189453125,
0.005218505859375,
0.00986480712890625,
-0.01303863525390625,
-0.020782470703125,
0.01309967041015625,
-0.0125885009765625,
-0.05096435546875,
0.0018253326416015625,
-0.0191192626953125,
0.0008554458618164062,
-0.03570556640625,
-0.015838623046875,
-0.01465606689453125,
-0.002704620361328125,
0.03155517578125,
0.0467529296875,
-0.042388916015625,
-0.033935546875,
0.0159149169921875,
-0.020751953125,
-0.0021991729736328125,
0.0450439453125,
-0.04144287109375,
0.01462554931640625,
0.038116455078125,
0.0157318115234375,
0.041015625,
-0.0052947998046875,
0.033111572265625,
-0.05126953125,
0.00701904296875,
0.0222015380859375,
0.023529052734375,
0.020965576171875,
-0.043182373046875,
0.022003173828125,
0.003376007080078125,
-0.07049560546875,
-0.06182861328125,
0.003376007080078125,
-0.071044921875,
-0.01384735107421875,
0.11114501953125,
0.00360107421875,
-0.020965576171875,
-0.005908966064453125,
-0.0215606689453125,
0.0096893310546875,
-0.0537109375,
0.05731201171875,
0.04693603515625,
-0.0075225830078125,
-0.0343017578125,
-0.050201416015625,
0.04693603515625,
-0.0027008056640625,
-0.056976318359375,
0.01222991943359375,
0.0254058837890625,
0.0430908203125,
0.0225830078125,
0.0703125,
-0.0146026611328125,
0.0257568359375,
0.00344085693359375,
0.049591064453125,
-0.00937652587890625,
-0.0271453857421875,
-0.0252532958984375,
-0.0012607574462890625,
0.01241302490234375,
-0.0175933837890625
]
] |
EleutherAI/pythia-1.4b-v0 | 2023-03-29T18:50:36.000Z | [
"transformers",
"pytorch",
"gpt_neox",
"text-generation",
"causal-lm",
"pythia",
"pythia_v0",
"en",
"dataset:the_pile",
"arxiv:2101.00027",
"arxiv:2201.07311",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | EleutherAI | null | null | EleutherAI/pythia-1.4b-v0 | 7 | 6,954 | transformers | 2022-10-16T18:24:39 | ---
language:
- en
tags:
- pytorch
- causal-lm
- pythia
- pythia_v0
license: apache-2.0
datasets:
- the_pile
---
The *Pythia Scaling Suite* is a collection of models developed to facilitate
interpretability research. It contains two sets of eight models of sizes
70M, 160M, 410M, 1B, 1.4B, 2.8B, 6.9B, and 12B. For each size, there are two
models: one trained on the Pile, and one trained on the Pile after the dataset
has been globally deduplicated. All 8 model sizes are trained on the exact
same data, in the exact same order. All Pythia models are available
[on Hugging Face](https://huggingface.co/models?other=pythia).
The Pythia model suite was deliberately designed to promote scientific
research on large language models, especially interpretability research.
Despite not centering downstream performance as a design goal, we find the
models <a href="#evaluations">match or exceed</a> the performance of
similar and same-sized models, such as those in the OPT and GPT-Neo suites.
Please note that all models in the *Pythia* suite were renamed in January
2023. For clarity, a <a href="#naming-convention-and-parameter-count">table
comparing the old and new names</a> is provided in this model card, together
with exact parameter counts.
## Pythia-1.4B
### Model Details
- Developed by: [EleutherAI](http://eleuther.ai)
- Model type: Transformer-based Language Model
- Language: English
- Learn more: [Pythia's GitHub repository](https://github.com/EleutherAI/pythia)
for training procedure, config files, and details on how to use.
- Library: [GPT-NeoX](https://github.com/EleutherAI/gpt-neox)
- License: Apache 2.0
- Contact: to ask questions about this model, join the [EleutherAI
Discord](https://discord.gg/zBGx3azzUn), and post them in `#release-discussion`.
Please read the existing *Pythia* documentation before asking about it in the
EleutherAI Discord. For general correspondence: [contact@eleuther.ai](mailto:contact@eleuther.ai).
<figure>
| Pythia model | Non-Embedding Params | Layers | Model Dim | Heads | Batch Size | Learning Rate | Equivalent Models |
| -----------: | -------------------: | :----: | :-------: | :---: | :--------: | :-------------------: | :--------------------: |
| 70M | 18,915,328 | 6 | 512 | 8 | 2M | 1.0 x 10<sup>-3</sup> | — |
| 160M | 85,056,000 | 12 | 768 | 12 | 4M | 6.0 x 10<sup>-4</sup> | GPT-Neo 125M, OPT-125M |
| 410M | 302,311,424 | 24 | 1024 | 16 | 4M | 3.0 x 10<sup>-4</sup> | OPT-350M |
| 1.0B | 805,736,448 | 16 | 2048 | 8 | 2M | 3.0 x 10<sup>-4</sup> | — |
| 1.4B | 1,208,602,624 | 24 | 2048 | 16 | 4M | 2.0 x 10<sup>-4</sup> | GPT-Neo 1.3B, OPT-1.3B |
| 2.8B | 2,517,652,480 | 32 | 2560 | 32 | 2M | 1.6 x 10<sup>-4</sup> | GPT-Neo 2.7B, OPT-2.7B |
| 6.9B | 6,444,163,072 | 32 | 4096 | 32 | 2M | 1.2 x 10<sup>-4</sup> | OPT-6.7B |
| 12B | 11,327,027,200 | 36 | 5120 | 40 | 2M | 1.2 x 10<sup>-4</sup> | — |
<figcaption>Engineering details for the <i>Pythia Suite</i>. Deduped and
non-deduped models of a given size have the same hyperparameters. “Equivalent”
models have <b>exactly</b> the same architecture, and the same number of
non-embedding parameters.</figcaption>
</figure>
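The non-embedding counts above differ from the total parameter counts (tabulated later in this card) by exactly the two vocab × model-dim embedding matrices. A quick sketch of that arithmetic, assuming Pythia's 50,304-token vocabulary (inherited from the GPT-NeoX-20B tokenizer):

```python
# Sanity check: total params minus non-embedding params should equal the
# untied input and output embedding matrices (2 * vocab * model_dim).
VOCAB = 50_304  # GPT-NeoX-20B tokenizer vocabulary size (assumed here)

def embedding_params(model_dim: int) -> int:
    return 2 * VOCAB * model_dim

# Pythia-1.4B: model dim 2048, parameter counts taken from this card's tables.
total, non_embedding = 1_414_647_808, 1_208_602_624
assert total - non_embedding == embedding_params(2048)
print(embedding_params(2048))  # 206045184
```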
### Uses and Limitations
#### Intended Use
The primary intended use of Pythia is research on the behavior, functionality,
and limitations of large language models. This suite is intended to provide
a controlled setting for performing scientific experiments. To enable the
study of how language models change over the course of training, we provide
143 evenly spaced intermediate checkpoints per model. These checkpoints are
hosted on Hugging Face as branches. Note that branch `step143000` corresponds
exactly to the model checkpoint on the `main` branch of each model.
You may also further fine-tune and adapt Pythia-1.4B for deployment,
as long as your use is in accordance with the Apache 2.0 license. Pythia
models work with the Hugging Face [Transformers
Library](https://huggingface.co/docs/transformers/index). If you decide to use
pre-trained Pythia-1.4B as a basis for your fine-tuned model, please
conduct your own risk and bias assessment.
#### Out-of-scope use
The Pythia Suite is **not** intended for deployment. It is not in itself
a product and cannot be used for human-facing interactions.
Pythia models are English-language only, and are not suitable for translation
or generating text in other languages.
Pythia-1.4B has not been fine-tuned for downstream contexts in which
language models are commonly deployed, such as writing genre prose,
or commercial chatbots. This means Pythia-1.4B will **not**
respond to a given prompt the way a product like ChatGPT does. This is because,
unlike this model, ChatGPT was fine-tuned using methods such as Reinforcement
Learning from Human Feedback (RLHF) to better “understand” human instructions.
#### Limitations and biases
The core functionality of a large language model is to take a string of text
and predict the next token. The token deemed statistically most likely by the
model need not produce the most “accurate” text. Never rely on
Pythia-1.4B to produce factually accurate output.
This model was trained on [the Pile](https://pile.eleuther.ai/), a dataset
known to contain profanity and texts that are lewd or otherwise offensive.
See [Section 6 of the Pile paper](https://arxiv.org/abs/2101.00027) for a
discussion of documented biases with regards to gender, religion, and race.
Pythia-1.4B may produce socially unacceptable or undesirable text, *even if*
the prompt itself does not include anything explicitly offensive.
If you plan on using text generated through, for example, the Hosted Inference
API, we recommend having a human curate the outputs of this language model
before presenting it to other people. Please inform your audience that the
text was generated by Pythia-1.4B.
### Quickstart
Pythia models can be loaded and used via the following code, demonstrated here
for the third `pythia-70m-deduped` checkpoint:
```python
from transformers import GPTNeoXForCausalLM, AutoTokenizer
model = GPTNeoXForCausalLM.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
tokenizer = AutoTokenizer.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
inputs = tokenizer("Hello, I am", return_tensors="pt")
tokens = model.generate(**inputs)
tokenizer.decode(tokens[0])
```
Revision/branch `step143000` corresponds exactly to the model checkpoint on
the `main` branch of each model.<br>
For more information on how to use all Pythia models, see [documentation on
GitHub](https://github.com/EleutherAI/pythia).
### Training
#### Training data
[The Pile](https://pile.eleuther.ai/) is a 825GiB general-purpose dataset in
English. It was created by EleutherAI specifically for training large language
models. It contains texts from 22 diverse sources, roughly broken down into
five categories: academic writing (e.g. arXiv), internet (e.g. CommonCrawl),
prose (e.g. Project Gutenberg), dialogue (e.g. YouTube subtitles), and
miscellaneous (e.g. GitHub, Enron Emails). See [the Pile
paper](https://arxiv.org/abs/2101.00027) for a breakdown of all data sources,
methodology, and a discussion of ethical implications. Consult [the
datasheet](https://arxiv.org/abs/2201.07311) for more detailed documentation
about the Pile and its component datasets. The Pile can be downloaded from
the [official website](https://pile.eleuther.ai/), or from a [community
mirror](https://the-eye.eu/public/AI/pile/).<br>
The Pile was **not** deduplicated before being used to train Pythia-1.4B.
#### Training procedure
All models were trained on the exact same data, in the exact same order. Each
model saw 299,892,736,000 tokens during training, and 143 checkpoints for each
model are saved every 2,097,152,000 tokens, spaced evenly throughout training.
This corresponds to training for just under 1 epoch on the Pile for
non-deduplicated models, and about 1.5 epochs on the deduplicated Pile.
All *Pythia* models trained for the equivalent of 143000 steps at a batch size
of 2,097,152 tokens. Two batch sizes were used: 2M and 4M. Models with a batch
size of 4M tokens were originally trained for 71,500 steps instead, with
checkpoints saved every 500 steps. The checkpoints on Hugging Face are renamed for
consistency with all 2M batch models, so `step1000` is the first checkpoint
for `pythia-1.4b` that was saved (corresponding to step 500 in training), and
`step1000` is likewise the first `pythia-6.9b` checkpoint that was saved
(corresponding to 1000 “actual” steps).<br>
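The step-to-token bookkeeping described above can be written down directly. This is an illustrative sketch of the arithmetic only; the revision names are real, but the helper functions are not part of any library:

```python
# Each saved revision `stepN` corresponds to N * 2,097,152 tokens,
# regardless of the batch size the model was actually trained with.
TOKENS_PER_NOMINAL_STEP = 2_097_152  # 2M-token reference batch

def tokens_seen(revision: str) -> int:
    """Tokens consumed at a Hugging Face revision such as 'step143000'."""
    return int(revision.removeprefix("step")) * TOKENS_PER_NOMINAL_STEP

def actual_training_step(revision: str, batch_tokens: int) -> int:
    """Original optimizer step for a model trained at `batch_tokens` per step."""
    return tokens_seen(revision) // batch_tokens

print(tokens_seen("step143000"))                    # 299892736000
print(actual_training_step("step1000", 4_194_304))  # 500 (4M-batch models)
print(actual_training_step("step1000", 2_097_152))  # 1000 (2M-batch models)
```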
See [GitHub](https://github.com/EleutherAI/pythia) for more details on training
procedure, including [how to reproduce
it](https://github.com/EleutherAI/pythia/blob/main/README.md#reproducing-training).<br>
Pythia uses the same tokenizer as [GPT-NeoX-
20B](https://huggingface.co/EleutherAI/gpt-neox-20b).
### Evaluations
All 16 *Pythia* models were evaluated using the [LM Evaluation
Harness](https://github.com/EleutherAI/lm-evaluation-harness). You can access
the results by model and step at `results/json/*` in the [GitHub
repository](https://github.com/EleutherAI/pythia/tree/main/results/json).<br>
Expand the sections below to see plots of evaluation results for all
Pythia and Pythia-deduped models compared with OPT and BLOOM.
<details>
<summary>LAMBADA – OpenAI</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/lambada_openai.png" style="width:auto"/>
</details>
<details>
<summary>Physical Interaction: Question Answering (PIQA)</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/piqa.png" style="width:auto"/>
</details>
<details>
<summary>WinoGrande</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/winogrande.png" style="width:auto"/>
</details>
<details>
<summary>AI2 Reasoning Challenge—Challenge Set</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/arc_challenge.png" style="width:auto"/>
</details>
<details>
<summary>SciQ</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/sciq.png" style="width:auto"/>
</details>
### Naming convention and parameter count
*Pythia* models were renamed in January 2023. It is possible that the old
naming convention still persists in some documentation by accident. The
current naming convention (70M, 160M, etc.) is based on total parameter count.
<figure style="width:32em">
| current Pythia suffix | old suffix | total params | non-embedding params |
| --------------------: | ---------: | -------------: | -------------------: |
| 70M | 19M | 70,426,624 | 18,915,328 |
| 160M | 125M | 162,322,944 | 85,056,000 |
| 410M | 350M | 405,334,016 | 302,311,424 |
| 1B | 800M | 1,011,781,632 | 805,736,448 |
| 1.4B | 1.3B | 1,414,647,808 | 1,208,602,624 |
| 2.8B | 2.7B | 2,775,208,960 | 2,517,652,480 |
| 6.9B | 6.7B | 6,857,302,016 | 6,444,163,072 |
| 12B | 13B | 11,846,072,320 | 11,327,027,200 |
</figure> | 11,782 | [
] |
EleutherAI/pythia-6.9b-v0 | 2023-03-29T18:48:58.000Z | [
"transformers",
"pytorch",
"gpt_neox",
"text-generation",
"causal-lm",
"pythia",
"pythia_v0",
"en",
"dataset:the_pile",
"arxiv:2101.00027",
"arxiv:2201.07311",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | EleutherAI | null | null | EleutherAI/pythia-6.9b-v0 | 8 | 6,950 | transformers | 2022-10-16T20:16:56 | ---
language:
- en
tags:
- pytorch
- causal-lm
- pythia
- pythia_v0
license: apache-2.0
datasets:
- the_pile
---
The *Pythia Scaling Suite* is a collection of models developed to facilitate
interpretability research. It contains two sets of eight models of sizes
70M, 160M, 410M, 1B, 1.4B, 2.8B, 6.9B, and 12B. For each size, there are two
models: one trained on the Pile, and one trained on the Pile after the dataset
has been globally deduplicated. All 8 model sizes are trained on the exact
same data, in the exact same order. All Pythia models are available
[on Hugging Face](https://huggingface.co/models?other=pythia).
The Pythia model suite was deliberately designed to promote scientific
research on large language models, especially interpretability research.
Despite not centering downstream performance as a design goal, we find the
models <a href="#evaluations">match or exceed</a> the performance of
similar and same-sized models, such as those in the OPT and GPT-Neo suites.
Please note that all models in the *Pythia* suite were renamed in January
2023. For clarity, a <a href="#naming-convention-and-parameter-count">table
comparing the old and new names</a> is provided in this model card, together
with exact parameter counts.
## Pythia-6.9B
### Model Details
- Developed by: [EleutherAI](http://eleuther.ai)
- Model type: Transformer-based Language Model
- Language: English
- Learn more: [Pythia's GitHub repository](https://github.com/EleutherAI/pythia)
for training procedure, config files, and details on how to use.
- Library: [GPT-NeoX](https://github.com/EleutherAI/gpt-neox)
- License: Apache 2.0
- Contact: to ask questions about this model, join the [EleutherAI
Discord](https://discord.gg/zBGx3azzUn), and post them in `#release-discussion`.
Please read the existing *Pythia* documentation before asking about it in the
EleutherAI Discord. For general correspondence: [contact@eleuther.ai](mailto:contact@eleuther.ai).
<figure>
| Pythia model | Non-Embedding Params | Layers | Model Dim | Heads | Batch Size | Learning Rate | Equivalent Models |
| -----------: | -------------------: | :----: | :-------: | :---: | :--------: | :-------------------: | :--------------------: |
| 70M | 18,915,328 | 6 | 512 | 8 | 2M | 1.0 x 10<sup>-3</sup> | — |
| 160M | 85,056,000 | 12 | 768 | 12 | 4M | 6.0 x 10<sup>-4</sup> | GPT-Neo 125M, OPT-125M |
| 410M | 302,311,424 | 24 | 1024 | 16 | 4M | 3.0 x 10<sup>-4</sup> | OPT-350M |
| 1.0B | 805,736,448 | 16 | 2048 | 8 | 2M | 3.0 x 10<sup>-4</sup> | — |
| 1.4B | 1,208,602,624 | 24 | 2048 | 16 | 4M | 2.0 x 10<sup>-4</sup> | GPT-Neo 1.3B, OPT-1.3B |
| 2.8B | 2,517,652,480 | 32 | 2560 | 32 | 2M | 1.6 x 10<sup>-4</sup> | GPT-Neo 2.7B, OPT-2.7B |
| 6.9B | 6,444,163,072 | 32 | 4096 | 32 | 2M | 1.2 x 10<sup>-4</sup> | OPT-6.7B |
| 12B | 11,327,027,200 | 36 | 5120 | 40 | 2M | 1.2 x 10<sup>-4</sup> | — |
<figcaption>Engineering details for the <i>Pythia Suite</i>. Deduped and
non-deduped models of a given size have the same hyperparameters. “Equivalent”
models have <b>exactly</b> the same architecture, and the same number of
non-embedding parameters.</figcaption>
</figure>
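As a rough cross-check of the table above (a back-of-the-envelope approximation, not the exact accounting used by the suite), the non-embedding parameter count of a standard transformer is close to 12 · layers · d², since each layer carries about 4d² of attention weights and 8d² of MLP weights:

```python
# Rough transformer parameter count: ~4*d^2 attention weights plus ~8*d^2 MLP
# weights (4x hidden expansion) per layer, ignoring biases and layer norms.
def approx_non_embedding_params(layers: int, d_model: int) -> int:
    return 12 * layers * d_model ** 2

# Pythia-6.9B: 32 layers, model dimension 4096 (from the table above).
approx = approx_non_embedding_params(32, 4096)
exact = 6_444_163_072
print(approx)                       # 6442450944
print(abs(approx - exact) / exact)  # ~0.0003, i.e. within ~0.03% of the table
```

The residual comes from the bias and layer-norm terms the approximation ignores.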
### Uses and Limitations
#### Intended Use
The primary intended use of Pythia is research on the behavior, functionality,
and limitations of large language models. This suite is intended to provide
a controlled setting for performing scientific experiments. To enable the
study of how language models change over the course of training, we provide
143 evenly spaced intermediate checkpoints per model. These checkpoints are
hosted on Hugging Face as branches. Note that branch `143000` corresponds
exactly to the model checkpoint on the `main` branch of each model.
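Concretely, given the 1,000-step spacing of the checkpoints, the branch names can be enumerated as follows (an illustrative sketch; consult each repository's branch list for the authoritative set):

```python
# 143 evenly spaced checkpoints, one per 1,000 training steps; each lives
# on its own branch ("revision") of the model repository.
revisions = [f"step{1000 * i}" for i in range(1, 144)]
print(len(revisions), revisions[0], revisions[-1])  # 143 step1000 step143000
```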
You may also further fine-tune and adapt Pythia-6.9B for deployment,
as long as your use is in accordance with the Apache 2.0 license. Pythia
models work with the Hugging Face [Transformers
Library](https://huggingface.co/docs/transformers/index). If you decide to use
pre-trained Pythia-6.9B as a basis for your fine-tuned model, please
conduct your own risk and bias assessment.
#### Out-of-scope use
The Pythia Suite is **not** intended for deployment. It is not in itself
a product and cannot be used for human-facing interactions.
Pythia models are English-language only, and are not suitable for translation
or generating text in other languages.
Pythia-6.9B has not been fine-tuned for downstream contexts in which
language models are commonly deployed, such as writing genre prose,
or commercial chatbots. This means Pythia-6.9B will **not**
respond to a given prompt the way a product like ChatGPT does. This is because,
unlike this model, ChatGPT was fine-tuned using methods such as Reinforcement
Learning from Human Feedback (RLHF) to better “understand” human instructions.
#### Limitations and biases
The core functionality of a large language model is to take a string of text
and predict the next token. The token deemed statistically most likely by the
model need not produce the most “accurate” text. Never rely on
Pythia-6.9B to produce factually accurate output.
This model was trained on [the Pile](https://pile.eleuther.ai/), a dataset
known to contain profanity and texts that are lewd or otherwise offensive.
See [Section 6 of the Pile paper](https://arxiv.org/abs/2101.00027) for a
discussion of documented biases with regards to gender, religion, and race.
Pythia-6.9B may produce socially unacceptable or undesirable text, *even if*
the prompt itself does not include anything explicitly offensive.
If you plan on using text generated through, for example, the Hosted Inference
API, we recommend having a human curate the outputs of this language model
before presenting them to other people. Please inform your audience that the
text was generated by Pythia-6.9B.
### Quickstart
Pythia models can be loaded and used via the following code, demonstrated here
for the third `pythia-70m-deduped` checkpoint:
```python
from transformers import GPTNeoXForCausalLM, AutoTokenizer
model = GPTNeoXForCausalLM.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
tokenizer = AutoTokenizer.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
inputs = tokenizer("Hello, I am", return_tensors="pt")
tokens = model.generate(**inputs)
tokenizer.decode(tokens[0])
```
Revision/branch `step143000` corresponds exactly to the model checkpoint on
the `main` branch of each model.<br>
For more information on how to use all Pythia models, see [documentation on
GitHub](https://github.com/EleutherAI/pythia).
### Training
#### Training data
[The Pile](https://pile.eleuther.ai/) is an 825GiB general-purpose dataset in
English. It was created by EleutherAI specifically for training large language
models. It contains texts from 22 diverse sources, roughly broken down into
five categories: academic writing (e.g. arXiv), internet (e.g. CommonCrawl),
prose (e.g. Project Gutenberg), dialogue (e.g. YouTube subtitles), and
miscellaneous (e.g. GitHub, Enron Emails). See [the Pile
paper](https://arxiv.org/abs/2101.00027) for a breakdown of all data sources,
methodology, and a discussion of ethical implications. Consult [the
datasheet](https://arxiv.org/abs/2201.07311) for more detailed documentation
about the Pile and its component datasets. The Pile can be downloaded from
the [official website](https://pile.eleuther.ai/), or from a [community
mirror](https://the-eye.eu/public/AI/pile/).<br>
The Pile was **not** deduplicated before being used to train Pythia-6.9B.
#### Training procedure
All models were trained on the exact same data, in the exact same order. Each
model saw 299,892,736,000 tokens during training, and 143 checkpoints for each
model are saved every 2,097,152,000 tokens, spaced evenly throughout training.
This corresponds to training for just under 1 epoch on the Pile for
non-deduplicated models, and about 1.5 epochs on the deduplicated Pile.
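These totals are internally consistent and can be verified with simple arithmetic:

```python
# 143,000 steps at 2,097,152 tokens per batch, with a checkpoint every
# 1,000 steps, reproduces both totals quoted above.
tokens_per_step = 2_097_152  # 2M-token batch
total_tokens = 143_000 * tokens_per_step
tokens_per_checkpoint = 1_000 * tokens_per_step
print(total_tokens)           # 299892736000
print(tokens_per_checkpoint)  # 2097152000
print(total_tokens // tokens_per_checkpoint)  # 143 checkpoints
```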
All *Pythia* models trained for the equivalent of 143000 steps at a batch size
of 2,097,152 tokens. Two batch sizes were used: 2M and 4M. Models listed with a
batch size of 4M tokens were originally trained for 71,500 steps instead, with
checkpoints every 500 steps. The checkpoints on Hugging Face are renamed for
consistency with all 2M batch models, so `step1000` is the first checkpoint
for `pythia-1.4b` that was saved (corresponding to step 500 in training), and
`step1000` is likewise the first `pythia-6.9b` checkpoint that was saved
(corresponding to 1000 “actual” steps).<br>
See [GitHub](https://github.com/EleutherAI/pythia) for more details on training
procedure, including [how to reproduce
it](https://github.com/EleutherAI/pythia/blob/main/README.md#reproducing-training).<br>
Pythia uses the same tokenizer as [GPT-NeoX-
20B](https://huggingface.co/EleutherAI/gpt-neox-20b).
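The checkpoint renaming for the 4M-batch models can be sketched as a mapping from saved checkpoint names back to "actual" optimizer steps (a hypothetical helper for illustration, not part of any Pythia tooling):

```python
# 4M-batch models (e.g. pythia-1.4b) saved a checkpoint every 500 optimizer
# steps but were renamed as if trained with 2M-token batches, so saved name
# "stepN" corresponds to actual step N/2; 2M-batch models map 1:1.
def actual_step(saved_name: str, batch_size_tokens: int) -> int:
    saved = int(saved_name.removeprefix("step"))
    return saved // 2 if batch_size_tokens == 4 * 2**20 else saved

print(actual_step("step1000", 4 * 2**20))  # 500  (4M-batch model)
print(actual_step("step1000", 2 * 2**20))  # 1000 (2M-batch model)
```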
### Evaluations
All 16 *Pythia* models were evaluated using the [LM Evaluation
Harness](https://github.com/EleutherAI/lm-evaluation-harness). You can access
the results by model and step at `results/json/*` in the [GitHub
repository](https://github.com/EleutherAI/pythia/tree/main/results/json).<br>
Expand the sections below to see plots of evaluation results for all
Pythia and Pythia-deduped models compared with OPT and BLOOM.
<details>
<summary>LAMBADA – OpenAI</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/lambada_openai.png" style="width:auto"/>
</details>
<details>
<summary>Physical Interaction: Question Answering (PIQA)</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/piqa.png" style="width:auto"/>
</details>
<details>
<summary>WinoGrande</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/winogrande.png" style="width:auto"/>
</details>
<details>
<summary>AI2 Reasoning Challenge—Challenge Set</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/arc_challenge.png" style="width:auto"/>
</details>
<details>
<summary>SciQ</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/sciq.png" style="width:auto"/>
</details>
### Naming convention and parameter count
*Pythia* models were renamed in January 2023. It is possible that the old
naming convention still persists in some documentation by accident. The
current naming convention (70M, 160M, etc.) is based on total parameter count.
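The gap between the two columns in the table below is exactly the input and output embedding matrices (assuming untied embeddings, so the vocabulary matrix is counted twice; this assumption, and the padded vocabulary sizes it implies, are inferred from the table's arithmetic rather than stated elsewhere in this card):

```python
# total = non_embedding + 2 * vocab * d_model, assuming untied input and
# output embeddings; solving for vocab recovers the padded vocabulary size,
# which varies with the parallelism configuration of each model.
rows = {  # suffix: (total params, non-embedding params, model dim)
    "70M":  (70_426_624,     18_915_328,     512),
    "160M": (162_322_944,    85_056_000,     768),
    "6.9B": (6_857_302_016,  6_444_163_072,  4096),
    "12B":  (11_846_072_320, 11_327_027_200, 5120),
}
vocab_sizes = {name: (total - non_emb) // (2 * d)
               for name, (total, non_emb, d) in rows.items()}
print(vocab_sizes)  # {'70M': 50304, '160M': 50304, '6.9B': 50432, '12B': 50688}
```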
<figure style="width:32em">
| current Pythia suffix | old suffix | total params | non-embedding params |
| --------------------: | ---------: | -------------: | -------------------: |
| 70M | 19M | 70,426,624 | 18,915,328 |
| 160M | 125M | 162,322,944 | 85,056,000 |
| 410M | 350M | 405,334,016 | 302,311,424 |
| 1B | 800M | 1,011,781,632 | 805,736,448 |
| 1.4B | 1.3B | 1,414,647,808 | 1,208,602,624 |
| 2.8B | 2.7B | 2,775,208,960 | 2,517,652,480 |
| 6.9B | 6.7B | 6,857,302,016 | 6,444,163,072 |
| 12B | 13B | 11,846,072,320 | 11,327,027,200 |
</figure> | 11,782 | [
[
0.0230560302734375,
0.02880859375,
-0.056182861328125,
-0.0091400146484375,
-0.07806396484375,
-0.08355712890625,
0.060394287109375,
0.03753662109375,
0.01013946533203125,
0.010223388671875,
0.022308349609375,
-0.0290985107421875,
0.0133819580078125,
-0.0789794921875,
-0.023651123046875,
-0.0180511474609375,
-0.011322021484375,
0.00794219970703125,
-0.00269317626953125,
0.0009975433349609375,
-0.03515625,
0.07537841796875,
0.00467681884765625,
0.0222320556640625,
0.0157470703125,
-0.03289794921875,
-0.007152557373046875,
-0.0032100677490234375,
0.01434326171875,
0.05877685546875,
-0.00928497314453125,
0.002460479736328125,
0.01326751708984375,
-0.044921875,
-0.0025691986083984375,
0.01212310791015625,
-0.028045654296875,
-0.005207061767578125,
0.0167236328125,
0.06787109375,
0.004566192626953125,
-0.034942626953125,
0.0166015625,
-0.002033233642578125,
-0.008453369140625,
-0.027679443359375,
-0.0157470703125,
0.0123443603515625,
0.00968170166015625,
0.0034770965576171875,
-0.01445770263671875,
0.0006213188171386719,
-0.06121826171875,
0.004425048828125,
0.01201629638671875,
-0.00859832763671875,
-0.030059814453125,
0.03900146484375,
0.005645751953125,
-0.0142974853515625,
0.08599853515625,
-0.0170135498046875,
-0.048675537109375,
0.06036376953125,
0.03167724609375,
0.05462646484375,
-0.0171661376953125,
0.0306549072265625,
0.06640625,
0.0294036865234375,
-0.013336181640625,
0.00446319580078125,
0.01039886474609375,
-0.04425048828125,
-0.01219940185546875,
-0.0576171875,
-0.0189361572265625,
0.0210113525390625,
-0.04193115234375,
0.0292816162109375,
-0.045074462890625,
-0.0004916191101074219,
-0.00502777099609375,
0.019073486328125,
-0.04534912109375,
0.0215606689453125,
0.01153564453125,
0.04974365234375,
-0.0709228515625,
0.057861328125,
0.054107666015625,
-0.05419921875,
-0.08184814453125,
-0.0014600753784179688,
0.0039043426513671875,
-0.040374755859375,
0.011962890625,
0.0195465087890625,
0.0190582275390625,
0.01404571533203125,
-0.0254669189453125,
-0.06866455078125,
0.09368896484375,
0.01611328125,
-0.046142578125,
-0.0174713134765625,
-0.0083770751953125,
0.04046630859375,
0.00395965576171875,
0.053802490234375,
0.0517578125,
0.0307464599609375,
0.0028820037841796875,
-0.079345703125,
0.0255126953125,
-0.03076171875,
-0.007808685302734375,
0.0179901123046875,
-0.052032470703125,
0.09375,
-0.0024967193603515625,
-0.00353240966796875,
0.0218505859375,
0.0411376953125,
0.030181884765625,
-0.0100555419921875,
0.02935791015625,
0.0595703125,
0.06988525390625,
-0.0253753662109375,
0.09619140625,
-0.0234375,
0.05511474609375,
0.06451416015625,
0.016448974609375,
0.041656494140625,
0.0266265869140625,
-0.0270538330078125,
0.044158935546875,
0.0648193359375,
-0.007110595703125,
0.01360321044921875,
0.0153045654296875,
-0.017059326171875,
-0.01393890380859375,
0.004222869873046875,
-0.0416259765625,
0.0172119140625,
0.0204620361328125,
-0.040313720703125,
-0.0117645263671875,
-0.024322509765625,
0.0263824462890625,
-0.027435302734375,
-0.01788330078125,
0.0265655517578125,
0.009796142578125,
-0.049713134765625,
0.04705810546875,
0.01200103759765625,
0.044158935546875,
-0.0338134765625,
0.01078033447265625,
-0.011688232421875,
0.0196380615234375,
-0.02978515625,
-0.0377197265625,
0.008880615234375,
0.0009584426879882812,
0.005802154541015625,
0.003437042236328125,
0.033111572265625,
-0.014862060546875,
-0.042449951171875,
0.01355743408203125,
0.040618896484375,
0.022796630859375,
-0.0284576416015625,
-0.0550537109375,
0.0030975341796875,
-0.0107269287109375,
-0.044342041015625,
0.0338134765625,
0.025177001953125,
-0.0088043212890625,
0.047149658203125,
0.049896240234375,
0.0046539306640625,
-0.005115509033203125,
0.01247406005859375,
0.06976318359375,
-0.034423828125,
-0.040496826171875,
-0.07012939453125,
0.040618896484375,
-0.004550933837890625,
-0.0457763671875,
0.06622314453125,
0.03985595703125,
0.054290771484375,
0.019622802734375,
0.0509033203125,
-0.037017822265625,
0.004093170166015625,
-0.0240020751953125,
0.0552978515625,
-0.036285400390625,
0.0017995834350585938,
-0.03448486328125,
-0.08740234375,
-0.006267547607421875,
0.078125,
-0.0377197265625,
0.02984619140625,
0.06402587890625,
0.055755615234375,
-0.0011453628540039062,
0.0061492919921875,
0.003932952880859375,
0.0281524658203125,
0.034271240234375,
0.064697265625,
0.06536865234375,
-0.047821044921875,
0.043853759765625,
-0.036468505859375,
-0.0199737548828125,
-0.01190185546875,
-0.040802001953125,
-0.067626953125,
-0.03741455078125,
-0.036285400390625,
-0.055419921875,
-0.0031070709228515625,
0.06707763671875,
0.05352783203125,
-0.05023193359375,
-0.0173187255859375,
-0.036285400390625,
0.004383087158203125,
-0.01837158203125,
-0.0182647705078125,
0.039276123046875,
0.01073455810546875,
-0.07122802734375,
-0.0038318634033203125,
-0.00811767578125,
0.00586700439453125,
-0.0306243896484375,
-0.020233154296875,
-0.01403045654296875,
-0.0115509033203125,
0.0089111328125,
0.0237274169921875,
-0.0384521484375,
-0.0180816650390625,
0.003856658935546875,
0.0019216537475585938,
0.0020160675048828125,
0.050445556640625,
-0.042449951171875,
0.00818634033203125,
0.048675537109375,
0.01456451416015625,
0.055877685546875,
-0.0174407958984375,
0.033782958984375,
-0.0245819091796875,
0.02459716796875,
0.0251617431640625,
0.046539306640625,
0.029022216796875,
-0.02374267578125,
0.0206298828125,
0.0273284912109375,
-0.058685302734375,
-0.061767578125,
0.0201873779296875,
-0.054931640625,
-0.008941650390625,
0.09600830078125,
-0.0196380615234375,
-0.0259246826171875,
0.0016527175903320312,
-0.01446533203125,
0.034881591796875,
-0.027099609375,
0.05389404296875,
0.052154541015625,
0.00406646728515625,
-0.022308349609375,
-0.051849365234375,
0.032318115234375,
0.04949951171875,
-0.06365966796875,
0.0274658203125,
0.04766845703125,
0.03863525390625,
0.0185089111328125,
0.0516357421875,
-0.0186767578125,
0.03900146484375,
0.0047149658203125,
0.005435943603515625,
0.007328033447265625,
-0.032958984375,
-0.03265380859375,
-0.00794219970703125,
0.01064300537109375,
0.006160736083984375
]
] |
MoritzLaurer/DeBERTa-v3-xsmall-mnli-fever-anli-ling-binary | 2023-03-20T08:28:49.000Z | [
"transformers",
"pytorch",
"safetensors",
"deberta-v2",
"text-classification",
"zero-shot-classification",
"en",
"dataset:multi_nli",
"dataset:anli",
"dataset:fever",
"dataset:lingnli",
"arxiv:2104.07179",
"arxiv:2111.09543",
"license:mit",
"endpoints_compatible",
"region:us"
] | zero-shot-classification | MoritzLaurer | null | null | MoritzLaurer/DeBERTa-v3-xsmall-mnli-fever-anli-ling-binary | 3 | 6,943 | transformers | 2022-03-02T23:29:04 | ---
language:
- en
license: mit
tags:
- text-classification
- zero-shot-classification
metrics:
- accuracy
datasets:
- multi_nli
- anli
- fever
- lingnli
pipeline_tag: zero-shot-classification
---
# DeBERTa-v3-xsmall-mnli-fever-anli-ling-binary
## Model description
This model was trained on 782,357 hypothesis-premise pairs from 4 NLI datasets: [MultiNLI](https://huggingface.co/datasets/multi_nli), [Fever-NLI](https://github.com/easonnie/combine-FEVER-NSMN/blob/master/other_resources/nli_fever.md), [LingNLI](https://arxiv.org/abs/2104.07179) and [ANLI](https://github.com/facebookresearch/anli).
Note that the model was trained on binary NLI to predict either "entailment" or "not-entailment". This is specifically designed for zero-shot classification, where the difference between "neutral" and "contradiction" is irrelevant.
The base model is [DeBERTa-v3-xsmall from Microsoft](https://huggingface.co/microsoft/deberta-v3-xsmall). The v3 variant of DeBERTa substantially outperforms previous versions of the model by including a different pre-training objective, see the [DeBERTa-V3 paper](https://arxiv.org/abs/2111.09543).
For highest performance (but less speed), I recommend using https://huggingface.co/MoritzLaurer/DeBERTa-v3-large-mnli-fever-anli-ling-wanli.
## Intended uses & limitations
#### How to use the model
```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

device = torch.device("cuda") if torch.cuda.is_available() else torch.device("cpu")

model_name = "MoritzLaurer/DeBERTa-v3-xsmall-mnli-fever-anli-ling-binary"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name).to(device)

premise = "I first thought that I liked the movie, but upon second thought it was actually disappointing."
hypothesis = "The movie was good."

inputs = tokenizer(premise, hypothesis, truncation=True, return_tensors="pt").to(device)
output = model(**inputs)  # model and inputs must be on the same device
prediction = torch.softmax(output["logits"][0], -1).tolist()
label_names = ["entailment", "not_entailment"]
prediction = {name: round(float(pred) * 100, 1) for pred, name in zip(prediction, label_names)}
print(prediction)
```
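The final softmax-to-percentage step in the snippet above can be reproduced in isolation. The following is a minimal sketch in plain Python with made-up logits (not actual model outputs), just to show how the prediction dictionary is built:

```python
import math

# Hypothetical logits for illustration only -- not real model outputs.
logits = [2.0, -1.0]

# Softmax, then convert to percentages rounded to one decimal place.
exps = [math.exp(x) for x in logits]
total = sum(exps)
probs = [e / total for e in exps]
label_names = ["entailment", "not_entailment"]
prediction = {name: round(p * 100, 1) for p, name in zip(probs, label_names)}
print(prediction)  # the two percentages sum to roughly 100
```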
### Training data
This model was trained on 782,357 hypothesis-premise pairs from 4 NLI datasets: [MultiNLI](https://huggingface.co/datasets/multi_nli), [Fever-NLI](https://github.com/easonnie/combine-FEVER-NSMN/blob/master/other_resources/nli_fever.md), [LingNLI](https://arxiv.org/abs/2104.07179) and [ANLI](https://github.com/facebookresearch/anli).
### Training procedure
DeBERTa-v3-xsmall-mnli-fever-anli-ling-binary was trained with the Hugging Face Trainer using the following hyperparameters.
```
training_args = TrainingArguments(
num_train_epochs=5, # total number of training epochs
learning_rate=2e-05,
per_device_train_batch_size=32, # batch size per device during training
per_device_eval_batch_size=32, # batch size for evaluation
  warmup_ratio=0.1,                 # fraction of training steps used for learning rate warmup
weight_decay=0.06, # strength of weight decay
fp16=True # mixed precision training
)
```
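Note that `warmup_ratio` specifies a fraction of total training steps, not an absolute step count. Under simplified assumptions (single device, no gradient accumulation; the numbers below are for illustration), the number of warmup steps the scheduler actually uses can be derived like this:

```python
# Hypothetical single-device setup; real totals depend on the exact training setup.
num_examples = 782_357   # training pairs reported in this card
batch_size = 32          # per_device_train_batch_size
num_epochs = 5
warmup_ratio = 0.1

steps_per_epoch = -(-num_examples // batch_size)  # ceiling division
total_steps = steps_per_epoch * num_epochs
warmup_steps = int(total_steps * warmup_ratio)
print(steps_per_epoch, total_steps, warmup_steps)
```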
### Eval results
The model was evaluated using the binary test sets for MultiNLI, ANLI, LingNLI and the binary dev set for Fever-NLI (two classes instead of three). The metric used is accuracy.
dataset | mnli-m-2c | mnli-mm-2c | fever-nli-2c | anli-all-2c | anli-r3-2c | lingnli-2c
--------|---------|----------|---------|----------|----------|------
accuracy | 0.925 | 0.922 | 0.892 | 0.676 | 0.665 | 0.888
speed (text/sec, CPU, 128 batch) | 6.0 | 6.3 | 3.0 | 5.8 | 5.0 | 7.6
speed (text/sec, GPU Tesla P100, 128 batch) | 473 | 487 | 230 | 390 | 340 | 586
## Limitations and bias
Please consult the original DeBERTa paper and literature on different NLI datasets for potential biases.
## Citation
If you use this model, please cite: Laurer, Moritz, Wouter van Atteveldt, Andreu Salleras Casas, and Kasper Welbers. 2022. ‘Less Annotating, More Classifying – Addressing the Data Scarcity Issue of Supervised Machine Learning with Deep Transfer Learning and BERT - NLI’. Preprint, June. Open Science Framework. https://osf.io/74b8k.
### Ideas for cooperation or questions?
If you have questions or ideas for cooperation, contact me at m{dot}laurer{at}vu{dot}nl or on [LinkedIn](https://www.linkedin.com/in/moritz-laurer/).
### Debugging and issues
Note that DeBERTa-v3 was released on 06.12.21 and older versions of HF Transformers seem to have issues running the model (e.g. resulting in an issue with the tokenizer). Using Transformers>=4.13 might solve some issues. | 4,721 | [
[
-0.0246124267578125,
-0.03271484375,
0.007442474365234375,
0.0212554931640625,
-0.01538848876953125,
-0.01323699951171875,
0.005100250244140625,
-0.042449951171875,
0.0333251953125,
0.009033203125,
-0.03387451171875,
-0.03826904296875,
-0.049224853515625,
0.0163726806640625,
-0.01461029052734375,
0.07928466796875,
-0.0021877288818359375,
0.00902557373046875,
0.00036025047302246094,
-0.01302337646484375,
-0.0194091796875,
-0.0498046875,
-0.038330078125,
-0.03857421875,
0.025421142578125,
0.0281829833984375,
0.052703857421875,
0.031890869140625,
0.03973388671875,
0.0153961181640625,
-0.00731658935546875,
-0.004791259765625,
-0.021728515625,
-0.0081329345703125,
0.0207061767578125,
-0.039794921875,
-0.036712646484375,
0.0098876953125,
0.0309295654296875,
0.0229644775390625,
0.004291534423828125,
0.0262603759765625,
-0.01486968994140625,
0.047698974609375,
-0.040618896484375,
0.0079498291015625,
-0.0382080078125,
0.02142333984375,
-0.0035552978515625,
0.01275634765625,
-0.0244293212890625,
-0.011962890625,
0.0233917236328125,
-0.035369873046875,
0.01103973388671875,
-0.0086517333984375,
0.0960693359375,
0.0205078125,
-0.02142333984375,
0.003353118896484375,
-0.03857421875,
0.0633544921875,
-0.07464599609375,
0.0142669677734375,
0.018096923828125,
-0.004070281982421875,
0.0026569366455078125,
-0.029266357421875,
-0.046783447265625,
-0.01027679443359375,
-0.00618743896484375,
0.00931549072265625,
-0.03131103515625,
-0.0244903564453125,
0.0347900390625,
0.01062774658203125,
-0.0555419921875,
-0.006084442138671875,
-0.04345703125,
-0.0049285888671875,
0.054168701171875,
-0.0009055137634277344,
0.01375579833984375,
-0.03240966796875,
-0.03887939453125,
-0.01039886474609375,
-0.031890869140625,
0.010528564453125,
0.00464630126953125,
0.002132415771484375,
-0.0291595458984375,
0.019256591796875,
-0.0063629150390625,
0.05584716796875,
0.022796630859375,
-0.0203094482421875,
0.07659912109375,
-0.0152435302734375,
-0.03155517578125,
0.0182647705078125,
0.07415771484375,
0.031585693359375,
0.00827789306640625,
0.003093719482421875,
0.00545501708984375,
-0.0126495361328125,
-0.01171875,
-0.08331298828125,
-0.0213470458984375,
0.028472900390625,
-0.0289764404296875,
-0.03955078125,
0.00542449951171875,
-0.061676025390625,
-0.0177154541015625,
-0.0255126953125,
0.04193115234375,
-0.052703857421875,
-0.038055419921875,
-0.004901885986328125,
-0.01515960693359375,
0.0286407470703125,
-0.0025463104248046875,
-0.0631103515625,
0.00873565673828125,
0.01476287841796875,
0.0648193359375,
-0.0139617919921875,
-0.02276611328125,
-0.01471710205078125,
-0.0040435791015625,
-0.01390838623046875,
0.0249176025390625,
-0.020843505859375,
-0.018157958984375,
-0.0246429443359375,
0.002323150634765625,
-0.043792724609375,
-0.037628173828125,
0.0306396484375,
-0.0234832763671875,
0.01227569580078125,
-0.01288604736328125,
-0.0277557373046875,
-0.030853271484375,
0.0164337158203125,
-0.033203125,
0.073974609375,
0.0137176513671875,
-0.08563232421875,
0.0201568603515625,
-0.036346435546875,
-0.0175933837890625,
-0.0297393798828125,
0.001956939697265625,
-0.0531005859375,
-0.01230621337890625,
0.0267486572265625,
0.039642333984375,
-0.0181427001953125,
0.039154052734375,
-0.039276123046875,
-0.02978515625,
0.0300140380859375,
-0.0193634033203125,
0.0804443359375,
0.02044677734375,
-0.05633544921875,
0.009857177734375,
-0.07586669921875,
-0.00360107421875,
0.032073974609375,
0.004833221435546875,
-0.01116943359375,
-0.036468505859375,
-0.005970001220703125,
0.033660888671875,
0.01256561279296875,
-0.038726806640625,
0.01983642578125,
-0.04840087890625,
0.03216552734375,
0.025543212890625,
-0.00098419189453125,
0.0196990966796875,
-0.039276123046875,
0.0201263427734375,
0.032379150390625,
0.0274200439453125,
0.0019283294677734375,
-0.056182861328125,
-0.069091796875,
-0.0228729248046875,
0.023834228515625,
0.059417724609375,
-0.053009033203125,
0.03436279296875,
-0.00453948974609375,
-0.064453125,
-0.037872314453125,
0.01056671142578125,
0.02947998046875,
0.04736328125,
0.035308837890625,
-0.004886627197265625,
-0.060516357421875,
-0.07183837890625,
0.016357421875,
-0.01290130615234375,
0.004802703857421875,
0.0090789794921875,
0.0555419921875,
-0.0323486328125,
0.07177734375,
-0.0291595458984375,
-0.031463623046875,
-0.0237274169921875,
0.001079559326171875,
0.057952880859375,
0.056671142578125,
0.07275390625,
-0.048675537109375,
-0.037567138671875,
-0.01441192626953125,
-0.07623291015625,
0.005466461181640625,
0.0019989013671875,
-0.01488494873046875,
0.047760009765625,
0.01500701904296875,
-0.031036376953125,
0.0374755859375,
0.039825439453125,
-0.018829345703125,
0.0173492431640625,
-0.006786346435546875,
0.015380859375,
-0.09063720703125,
0.026458740234375,
0.01068115234375,
0.0090179443359375,
-0.07208251953125,
-0.00820159912109375,
0.0019245147705078125,
0.0005970001220703125,
-0.047149658203125,
0.039794921875,
-0.00820159912109375,
0.0227813720703125,
-0.00966644287109375,
-0.0062255859375,
0.01666259765625,
0.048309326171875,
0.0085296630859375,
0.0255889892578125,
0.07525634765625,
-0.045928955078125,
0.0174407958984375,
0.01285552978515625,
0.002613067626953125,
0.00591278076171875,
-0.06494140625,
0.004688262939453125,
-0.004467010498046875,
0.01216888427734375,
-0.050537109375,
-0.015380859375,
0.049560546875,
-0.03619384765625,
0.033782958984375,
-0.0060577392578125,
-0.02294921875,
-0.03948974609375,
-0.0290679931640625,
0.033721923828125,
0.058837890625,
-0.047088623046875,
0.047454833984375,
0.01505279541015625,
0.017669677734375,
-0.06719970703125,
-0.050048828125,
-0.029266357421875,
-0.029266357421875,
-0.053497314453125,
0.03411865234375,
-0.0020275115966796875,
-0.020904541015625,
-0.0007038116455078125,
0.0004906654357910156,
-0.01263427734375,
0.00348663330078125,
0.0310516357421875,
0.043853759765625,
-0.01036834716796875,
-0.00704193115234375,
0.0002994537353515625,
0.0015869140625,
0.001766204833984375,
-0.0131072998046875,
0.038970947265625,
-0.032135009765625,
-0.00127410888671875,
-0.040985107421875,
0.00732421875,
0.037506103515625,
-0.0072174072265625,
0.07012939453125,
0.07293701171875,
-0.03680419921875,
0.03009033203125,
-0.04229736328125,
-0.00839996337890625,
-0.029052734375,
0.0096435546875,
-0.03173828125,
-0.0286407470703125,
0.045745849609375,
0.03436279296875,
0.0017004013061523438,
0.06414794921875,
0.0297088623046875,
0.0262603759765625,
0.06719970703125,
0.0400390625,
-0.0164794921875,
0.00875091552734375,
-0.0531005859375,
0.00667572021484375,
-0.05389404296875,
-0.0267486572265625,
-0.0311431884765625,
-0.00934600830078125,
-0.036376953125,
-0.0284576416015625,
0.0310516357421875,
0.0193023681640625,
-0.0311431884765625,
0.027435302734375,
-0.0423583984375,
0.01190948486328125,
0.049407958984375,
0.015594482421875,
0.00864410400390625,
-0.006824493408203125,
-0.000048100948333740234,
0.00879669189453125,
-0.06103515625,
-0.0171356201171875,
0.08612060546875,
0.04058837890625,
0.039886474609375,
0.0035915374755859375,
0.0750732421875,
-0.0101318359375,
0.0283355712890625,
-0.0296173095703125,
0.0206146240234375,
-0.01398468017578125,
-0.06280517578125,
-0.0137786865234375,
-0.036651611328125,
-0.0625,
0.01910400390625,
-0.032501220703125,
-0.06280517578125,
0.04437255859375,
0.01039886474609375,
-0.03997802734375,
0.031219482421875,
-0.05596923828125,
0.06414794921875,
-0.0080718994140625,
-0.026214599609375,
0.005931854248046875,
-0.0557861328125,
0.050384521484375,
-0.0164642333984375,
0.0118865966796875,
-0.02813720703125,
0.0260009765625,
0.07464599609375,
-0.024139404296875,
0.06549072265625,
-0.0262603759765625,
0.01061248779296875,
0.03369140625,
-0.0179901123046875,
0.00572967529296875,
0.024200439453125,
-0.045135498046875,
0.0521240234375,
0.02886962890625,
-0.036163330078125,
-0.0230560302734375,
0.061126708984375,
-0.0836181640625,
-0.043060302734375,
-0.0609130859375,
-0.02337646484375,
0.001934051513671875,
0.00847625732421875,
0.046905517578125,
0.0455322265625,
-0.0001951456069946289,
-0.0034694671630859375,
0.0511474609375,
-0.0302581787109375,
0.032257080078125,
0.021697998046875,
-0.0308990478515625,
-0.035247802734375,
0.06695556640625,
0.00341796875,
0.01288604736328125,
0.0103912353515625,
0.016357421875,
-0.01093292236328125,
-0.023681640625,
-0.0533447265625,
0.0281829833984375,
-0.046600341796875,
-0.0305938720703125,
-0.0753173828125,
-0.0308990478515625,
-0.0635986328125,
0.0016164779663085938,
-0.0138092041015625,
-0.0247802734375,
-0.036346435546875,
0.005313873291015625,
0.047088623046875,
0.042572021484375,
-0.0186920166015625,
0.006175994873046875,
-0.06005859375,
0.01067352294921875,
0.0009322166442871094,
0.0083160400390625,
-0.0012903213500976562,
-0.0655517578125,
-0.00675201416015625,
0.00689697265625,
-0.0250244140625,
-0.07427978515625,
0.061279296875,
0.0183563232421875,
0.0289764404296875,
0.020843505859375,
0.011810302734375,
0.044830322265625,
-0.01027679443359375,
0.04962158203125,
0.024993896484375,
-0.07373046875,
0.047698974609375,
-0.0039215087890625,
0.028717041015625,
0.038665771484375,
0.060516357421875,
-0.01012420654296875,
-0.025115966796875,
-0.058441162109375,
-0.06683349609375,
0.0618896484375,
0.0171356201171875,
-0.0130767822265625,
0.00899505615234375,
0.02685546875,
0.0008912086486816406,
0.005390167236328125,
-0.061859130859375,
-0.0499267578125,
-0.0297393798828125,
-0.00801849365234375,
-0.0111083984375,
-0.0099334716796875,
-0.00400543212890625,
-0.050445556640625,
0.0870361328125,
-0.00081634521484375,
0.022491455078125,
0.05084228515625,
-0.0015268325805664062,
-0.0077972412109375,
0.0006513595581054688,
0.036895751953125,
0.039093017578125,
-0.042999267578125,
-0.010284423828125,
0.033538818359375,
-0.02484130859375,
0.01244354248046875,
0.031585693359375,
-0.0301055908203125,
0.0233306884765625,
0.0234832763671875,
0.08392333984375,
-0.00839996337890625,
-0.032623291015625,
0.04132080078125,
-0.014373779296875,
-0.0272369384765625,
-0.03350830078125,
0.01030731201171875,
-0.0081939697265625,
0.01413726806640625,
0.0247039794921875,
0.0253753662109375,
0.0181121826171875,
-0.031219482421875,
0.0227813720703125,
0.012542724609375,
-0.0295562744140625,
-0.0208740234375,
0.05078125,
0.006305694580078125,
0.0127105712890625,
0.04669189453125,
-0.0265960693359375,
-0.037872314453125,
0.06390380859375,
0.035980224609375,
0.050933837890625,
-0.0197296142578125,
0.020263671875,
0.0667724609375,
0.0109100341796875,
0.006633758544921875,
0.0168609619140625,
0.0338134765625,
-0.047760009765625,
-0.026458740234375,
-0.061187744140625,
-0.01427459716796875,
0.0164337158203125,
-0.05291748046875,
0.0335693359375,
-0.0262603759765625,
-0.0207977294921875,
0.0184326171875,
0.003978729248046875,
-0.05511474609375,
0.00872039794921875,
0.02032470703125,
0.06744384765625,
-0.07830810546875,
0.07696533203125,
0.0382080078125,
-0.0418701171875,
-0.07476806640625,
-0.00036835670471191406,
-0.001377105712890625,
-0.04840087890625,
0.062286376953125,
0.03533935546875,
0.0101776123046875,
-0.01617431640625,
-0.01666259765625,
-0.0865478515625,
0.080078125,
0.0173797607421875,
-0.043609619140625,
0.0019245147705078125,
-0.00748443603515625,
0.0374755859375,
-0.02520751953125,
0.035614013671875,
0.054962158203125,
0.0258636474609375,
0.025970458984375,
-0.07373046875,
0.01062774658203125,
-0.033721923828125,
-0.0097198486328125,
0.003726959228515625,
-0.038726806640625,
0.06658935546875,
-0.04583740234375,
0.0089111328125,
0.0008149147033691406,
0.0595703125,
0.0272064208984375,
0.046661376953125,
0.038055419921875,
0.038421630859375,
0.050994873046875,
-0.00829315185546875,
0.0701904296875,
-0.03717041015625,
0.0231475830078125,
0.05511474609375,
-0.0240020751953125,
0.059234619140625,
0.0213623046875,
-0.0027618408203125,
0.040802001953125,
0.04986572265625,
-0.019866943359375,
0.0239715576171875,
0.015899658203125,
-0.0112457275390625,
-0.0005407333374023438,
-0.0002834796905517578,
-0.05023193359375,
0.025421142578125,
0.016082763671875,
-0.02996826171875,
0.006801605224609375,
0.0308990478515625,
0.0241546630859375,
-0.0265045166015625,
-0.01020050048828125,
0.04620361328125,
0.0007643699645996094,
-0.061187744140625,
0.09417724609375,
-0.006877899169921875,
0.06719970703125,
-0.023101806640625,
0.01177215576171875,
-0.00916290283203125,
0.0172271728515625,
-0.032470703125,
-0.03338623046875,
0.032958984375,
-0.000015437602996826172,
-0.016693115234375,
0.0189361572265625,
0.040802001953125,
-0.0289764404296875,
-0.052520751953125,
0.0234527587890625,
0.024810791015625,
0.0238189697265625,
0.00994873046875,
-0.07427978515625,
0.0163726806640625,
0.01336669921875,
-0.0227203369140625,
0.0255126953125,
0.01313018798828125,
0.0171966552734375,
0.040618896484375,
0.0374755859375,
-0.01216888427734375,
-0.0050048828125,
0.001689910888671875,
0.062744140625,
-0.033111572265625,
-0.0011548995971679688,
-0.0733642578125,
0.0183563232421875,
-0.01036834716796875,
-0.0259857177734375,
0.052764892578125,
0.050048828125,
0.059356689453125,
-0.0187530517578125,
0.04718017578125,
-0.0232696533203125,
0.0250701904296875,
-0.03448486328125,
0.037811279296875,
-0.03680419921875,
-0.001232147216796875,
-0.025665283203125,
-0.0499267578125,
-0.0343017578125,
0.045562744140625,
-0.0248260498046875,
-0.00014400482177734375,
0.042755126953125,
0.0640869140625,
0.01715087890625,
0.00021314620971679688,
0.00809478759765625,
0.01520538330078125,
0.013519287109375,
0.052490234375,
0.033233642578125,
-0.056549072265625,
0.029937744140625,
-0.070068359375,
-0.035369873046875,
-0.0157012939453125,
-0.048919677734375,
-0.08270263671875,
-0.03668212890625,
-0.0545654296875,
-0.057647705078125,
0.010162353515625,
0.1011962890625,
0.05499267578125,
-0.0794677734375,
-0.00928497314453125,
0.012451171875,
-0.002170562744140625,
-0.0222320556640625,
-0.01546478271484375,
0.039886474609375,
-0.018646240234375,
-0.06256103515625,
0.007354736328125,
0.0008854866027832031,
0.0246429443359375,
-0.0038623809814453125,
-0.002437591552734375,
-0.0244598388671875,
0.0036773681640625,
0.035003662109375,
0.0221099853515625,
-0.04498291015625,
0.004138946533203125,
0.0173797607421875,
-0.0094146728515625,
0.013427734375,
0.01522064208984375,
-0.05377197265625,
0.0116729736328125,
0.03302001953125,
0.0208282470703125,
0.039581298828125,
-0.021026611328125,
0.0199127197265625,
-0.035003662109375,
0.0206756591796875,
-0.00004088878631591797,
0.0340576171875,
0.0180206298828125,
-0.0266265869140625,
0.041961669921875,
0.0213623046875,
-0.038818359375,
-0.063232421875,
-0.01093292236328125,
-0.08038330078125,
-0.0235748291015625,
0.10284423828125,
-0.0087127685546875,
-0.04351806640625,
0.007354736328125,
-0.01739501953125,
0.0260009765625,
-0.032562255859375,
0.044097900390625,
0.0220489501953125,
-0.0229034423828125,
0.006855010986328125,
-0.038238525390625,
0.046600341796875,
0.032135009765625,
-0.0408935546875,
-0.01369476318359375,
0.00047135353088378906,
0.026702880859375,
0.03900146484375,
0.050628662109375,
-0.0099945068359375,
0.01201629638671875,
-0.0123748779296875,
0.0215606689453125,
-0.00335693359375,
-0.0156402587890625,
-0.0377197265625,
0.0011386871337890625,
-0.0025043487548828125,
-0.01003265380859375
]
] |
rifkat/pubchem_1M | 2022-12-07T11:21:19.000Z | [
"transformers",
"pytorch",
"roberta",
"fill-mask",
"doi:10.57967/hf/0177",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | fill-mask | rifkat | null | null | rifkat/pubchem_1M | 0 | 6,937 | transformers | 2022-03-02T23:29:05 | Ushbu model, HuggingFace-da RoBERTa transformatorini amalga oshirishga asoslangan. Bizning RoBERTa dasturimiz 12 ta diqqat boshi va 6 ta qatlamdan foydalanadi, natijada 72 ta aniq e'tibor mexanizmlari paydo bo'ladi. Biz har bir kirish satridagi tokenlarning 15 foizini niqoblaydigan RoBERTa-dan dastlabki tekshirish protsedurasini qabul qildik. Biz maksimal 52K tokenli lug'atdan va maksimal 512 ta ketma-ketlik uzunligidan foydalanganmiz. Biz 1M PubChem to'plamlarida 10 ta davr uchun o'qitdik. Loss funksiya 2.9 dan 0.33 gacha tushdi. Ushbu modelni sizga taqdim qilamiz.
<pre><code class="language-python">
@misc {rifkat_davronov_2022,
author = { {Adilova Fatima,Rifkat Davronov, Samariddin Kushmuratov, Ruzmat Safarov} },
title = { pubchem_1M (Revision 89e2ba6) },
year = 2022,
url = { https://huggingface.co/rifkat/pubchem_1M },
doi = { 10.57967/hf/0177 },
publisher = { Hugging Face }
}
</code></pre> | 955 | [
[
-0.00919342041015625,
-0.0567626953125,
-0.00675201416015625,
0.037445068359375,
-0.029754638671875,
-0.011474609375,
0.00251007080078125,
-0.0179595947265625,
0.04266357421875,
0.0282440185546875,
-0.045928955078125,
-0.0123443603515625,
-0.045074462890625,
0.007595062255859375,
-0.017242431640625,
0.0772705078125,
-0.01043701171875,
0.02093505859375,
0.0024871826171875,
-0.0157318115234375,
-0.0083465576171875,
-0.0275115966796875,
-0.0450439453125,
-0.0261077880859375,
0.0177154541015625,
0.01088714599609375,
0.036834716796875,
0.01041412353515625,
0.029510498046875,
0.034271240234375,
-0.032318115234375,
-0.01364898681640625,
-0.0298919677734375,
0.00006139278411865234,
-0.01104736328125,
-0.047607421875,
-0.0504150390625,
0.00627899169921875,
0.056671142578125,
0.053436279296875,
-0.029571533203125,
0.0201416015625,
0.00394439697265625,
0.0679931640625,
-0.0109710693359375,
0.01006317138671875,
-0.0109405517578125,
-0.00975799560546875,
-0.0185089111328125,
0.01806640625,
-0.0192718505859375,
-0.04827880859375,
0.0036487579345703125,
-0.022918701171875,
0.005138397216796875,
0.00975799560546875,
0.1141357421875,
0.0164031982421875,
-0.0300750732421875,
0.01093292236328125,
-0.028472900390625,
0.057159423828125,
-0.06134033203125,
0.0294647216796875,
0.00008344650268554688,
0.0226593017578125,
-0.0221405029296875,
-0.05120849609375,
-0.05609130859375,
0.003841400146484375,
0.007320404052734375,
0.0031261444091796875,
-0.035675048828125,
-0.0191192626953125,
0.01482391357421875,
0.044189453125,
-0.04205322265625,
-0.009429931640625,
-0.038482666015625,
-0.026885986328125,
0.04901123046875,
0.0167999267578125,
0.03265380859375,
-0.0160369873046875,
-0.0693359375,
-0.0265960693359375,
-0.032745361328125,
0.0056610107421875,
0.035430908203125,
0.006359100341796875,
-0.036773681640625,
0.04742431640625,
-0.023956298828125,
0.04656982421875,
0.031829833984375,
-0.0145111083984375,
0.02850341796875,
-0.038238525390625,
-0.0313720703125,
-0.0269622802734375,
0.05743408203125,
0.0318603515625,
0.016082763671875,
-0.003131866455078125,
0.011077880859375,
0.004180908203125,
0.00786590576171875,
-0.07977294921875,
-0.008514404296875,
0.01442718505859375,
-0.054046630859375,
-0.0222930908203125,
0.03411865234375,
-0.061370849609375,
0.0032520294189453125,
-0.004535675048828125,
0.04248046875,
-0.03497314453125,
-0.0623779296875,
0.026580810546875,
-0.005069732666015625,
0.0290374755859375,
0.0204010009765625,
-0.03692626953125,
0.02008056640625,
0.017181396484375,
0.056365966796875,
0.004669189453125,
-0.0059814453125,
-0.0192108154296875,
0.00400543212890625,
-0.01494598388671875,
0.046844482421875,
0.006893157958984375,
-0.03167724609375,
-0.00791168212890625,
0.0241851806640625,
-0.0176849365234375,
-0.035797119140625,
0.03326416015625,
-0.0203094482421875,
0.020904541015625,
0.0019292831420898438,
-0.019622802734375,
-0.0299530029296875,
0.007965087890625,
-0.033233642578125,
0.058380126953125,
0.046630859375,
-0.06134033203125,
0.01041412353515625,
-0.0266876220703125,
-0.0107421875,
0.0019435882568359375,
0.0309906005859375,
-0.058197021484375,
0.017364501953125,
-0.0048980712890625,
0.031341552734375,
-0.0178985595703125,
0.01352691650390625,
-0.031280517578125,
0.01412200927734375,
0.047393798828125,
0.00433349609375,
0.1046142578125,
0.0269012451171875,
-0.00664520263671875,
0.026763916015625,
-0.0645751953125,
0.00659942626953125,
0.0212554931640625,
-0.0169677734375,
-0.0197906494140625,
-0.051727294921875,
0.04888916015625,
0.0225982666015625,
0.0274810791015625,
-0.058013916015625,
0.0435791015625,
-0.051483154296875,
0.04296875,
0.05657958984375,
0.01580810546875,
0.0205535888671875,
-0.043548583984375,
0.05230712890625,
0.00794219970703125,
0.0199737548828125,
0.0273895263671875,
-0.0303802490234375,
-0.048858642578125,
-0.029571533203125,
0.036529541015625,
0.016998291015625,
-0.049468994140625,
0.0258331298828125,
0.0008158683776855469,
-0.059326171875,
-0.029998779296875,
-0.0205230712890625,
0.0254669189453125,
0.037872314453125,
0.01070404052734375,
-0.018157958984375,
-0.0452880859375,
-0.05877685546875,
-0.042755126953125,
-0.004230499267578125,
-0.0027904510498046875,
0.00902557373046875,
0.040252685546875,
-0.028961181640625,
0.04266357421875,
-0.038299560546875,
-0.038421630859375,
-0.0266265869140625,
0.0161895751953125,
0.03839111328125,
0.046173095703125,
0.06414794921875,
-0.07159423828125,
-0.060516357421875,
-0.0192413330078125,
-0.040252685546875,
-0.01568603515625,
-0.00982666015625,
-0.03515625,
0.038787841796875,
-0.01486968994140625,
-0.0640869140625,
0.06689453125,
0.05084228515625,
-0.059906005859375,
0.048858642578125,
-0.0200347900390625,
0.0134735107421875,
-0.09686279296875,
0.01001739501953125,
0.004638671875,
-0.0229339599609375,
-0.033416748046875,
0.0213623046875,
-0.0102386474609375,
0.0007123947143554688,
-0.0227813720703125,
0.06341552734375,
-0.049713134765625,
0.0102996826171875,
-0.008148193359375,
0.0129241943359375,
0.006397247314453125,
0.024871826171875,
-0.0190582275390625,
0.03497314453125,
0.041351318359375,
-0.033355712890625,
0.0433349609375,
0.0621337890625,
-0.0106658935546875,
0.05352783203125,
-0.061981201171875,
-0.00799560546875,
0.01322174072265625,
0.0176849365234375,
-0.0762939453125,
-0.0113525390625,
0.042816162109375,
-0.051544189453125,
0.042816162109375,
-0.00583648681640625,
-0.031646728515625,
-0.0250091552734375,
-0.02520751953125,
0.0031566619873046875,
0.04400634765625,
-0.034912109375,
0.06329345703125,
0.018463134765625,
-0.031097412109375,
-0.017852783203125,
-0.053009033203125,
-0.005340576171875,
-0.03472900390625,
-0.0631103515625,
0.0306243896484375,
-0.03399658203125,
-0.015594482421875,
0.0006203651428222656,
-0.01393890380859375,
-0.0245819091796875,
-0.0014972686767578125,
0.0258636474609375,
0.025299072265625,
0.00386810302734375,
-0.037445068359375,
-0.00955963134765625,
-0.0306243896484375,
-0.0009822845458984375,
0.004489898681640625,
0.044219970703125,
-0.01715087890625,
0.00707244873046875,
-0.071044921875,
0.0250396728515625,
0.0318603515625,
-0.020477294921875,
0.068359375,
0.05242919921875,
-0.01690673828125,
0.0106353759765625,
-0.0391845703125,
0.003200531005859375,
-0.03271484375,
-0.004802703857421875,
-0.0369873046875,
-0.028900146484375,
0.06024169921875,
-0.01080322265625,
-0.039642333984375,
0.068359375,
0.0601806640625,
0.007381439208984375,
0.054595947265625,
0.0635986328125,
-0.01039886474609375,
0.0399169921875,
-0.06011962890625,
-0.002117156982421875,
-0.07037353515625,
-0.034454345703125,
-0.034912109375,
-0.018829345703125,
-0.040740966796875,
-0.0233306884765625,
0.036712646484375,
0.0139312744140625,
-0.019744873046875,
0.0313720703125,
-0.051544189453125,
-0.0269622802734375,
0.038482666015625,
0.02288818359375,
0.0129241943359375,
-0.00989532470703125,
0.009674072265625,
-0.00934600830078125,
-0.0258636474609375,
-0.007568359375,
0.06170654296875,
0.0293121337890625,
0.07623291015625,
0.041290283203125,
0.047027587890625,
-0.005817413330078125,
0.021148681640625,
-0.040679931640625,
0.0251617431640625,
0.0302734375,
-0.05572509765625,
-0.010223388671875,
-0.0333251953125,
-0.060760498046875,
0.006206512451171875,
-0.005680084228515625,
-0.05352783203125,
0.026153564453125,
-0.0108184814453125,
-0.01016998291015625,
0.00806427001953125,
-0.01800537109375,
0.053741455078125,
-0.0040740966796875,
-0.024993896484375,
-0.0175323486328125,
-0.05108642578125,
0.0287017822265625,
0.0242156982421875,
-0.00997161865234375,
0.0013675689697265625,
0.0005745887756347656,
0.058013916015625,
-0.05877685546875,
0.037811279296875,
-0.03326416015625,
-0.002193450927734375,
0.02069091796875,
0.006855010986328125,
0.023773193359375,
0.00659942626953125,
-0.01062774658203125,
0.040252685546875,
0.0333251953125,
-0.04034423828125,
-0.0281829833984375,
0.0435791015625,
-0.046539306640625,
-0.0269622802734375,
-0.0726318359375,
-0.0225372314453125,
0.0245513916015625,
0.0288543701171875,
0.0284271240234375,
0.0364990234375,
-0.0007829666137695312,
0.0119781494140625,
0.01715087890625,
-0.01314544677734375,
0.0137176513671875,
0.04876708984375,
-0.0099639892578125,
-0.03765869140625,
0.03900146484375,
0.0108642578125,
0.007579803466796875,
0.00799560546875,
0.026336669921875,
-0.032257080078125,
-0.029815673828125,
-0.03424072265625,
0.05963134765625,
-0.0129547119140625,
-0.04449462890625,
-0.041015625,
-0.0188140869140625,
-0.035247802734375,
0.00897979736328125,
-0.0323486328125,
-0.031829833984375,
-0.0213470458984375,
0.004978179931640625,
0.047607421875,
0.050872802734375,
-0.02880859375,
0.033172607421875,
-0.03240966796875,
0.0186309814453125,
-0.00591278076171875,
0.028900146484375,
-0.01812744140625,
-0.06268310546875,
-0.0166168212890625,
0.01322174072265625,
-0.0276031494140625,
-0.08477783203125,
0.06561279296875,
0.02203369140625,
0.032989501953125,
0.038360595703125,
0.0201263427734375,
0.05487060546875,
-0.01396942138671875,
0.04266357421875,
0.011474609375,
-0.073974609375,
0.031646728515625,
-0.031890869140625,
-0.017822265625,
0.0165557861328125,
0.041290283203125,
-0.0261077880859375,
-0.00745391845703125,
-0.0693359375,
-0.0479736328125,
0.046630859375,
0.0139312744140625,
-0.002910614013671875,
0.0210418701171875,
0.05487060546875,
-0.00864410400390625,
0.0292205810546875,
-0.04937744140625,
-0.05010986328125,
-0.0178985595703125,
-0.011444091796875,
0.01331329345703125,
-0.036834716796875,
-0.01033782958984375,
-0.03924560546875,
0.07501220703125,
-0.0034656524658203125,
0.03533935546875,
0.0213165283203125,
0.0038280487060546875,
-0.0147552490234375,
-0.00421905517578125,
0.05682373046875,
0.0352783203125,
-0.03466796875,
-0.0276336669921875,
-0.018890380859375,
-0.0257110595703125,
-0.00005620718002319336,
0.029632568359375,
-0.00925445556640625,
-0.0009794235229492188,
0.01351165771484375,
0.048980712890625,
0.0182342529296875,
-0.0253448486328125,
0.054595947265625,
-0.01023101806640625,
-0.0209808349609375,
-0.042755126953125,
-0.0123291015625,
0.00798797607421875,
0.035614013671875,
0.0191192626953125,
0.0040435791015625,
0.01910400390625,
-0.035858154296875,
0.01494598388671875,
0.04632568359375,
-0.005645751953125,
-0.036102294921875,
0.07391357421875,
-0.006900787353515625,
-0.024627685546875,
0.0284271240234375,
-0.01357269287109375,
-0.049102783203125,
0.03314208984375,
0.04876708984375,
0.061737060546875,
-0.0484619140625,
0.0137176513671875,
0.045013427734375,
-0.017181396484375,
0.0033550262451171875,
0.04443359375,
0.01241302490234375,
-0.039337158203125,
0.0008568763732910156,
-0.062744140625,
-0.0148162841796875,
0.00296783447265625,
-0.08441162109375,
0.016632080078125,
-0.04052734375,
-0.01454925537109375,
-0.002227783203125,
0.029083251953125,
-0.035736083984375,
0.0213470458984375,
-0.024444580078125,
0.06353759765625,
-0.0635986328125,
0.055450439453125,
0.0587158203125,
-0.07403564453125,
-0.06585693359375,
-0.007175445556640625,
0.0017499923706054688,
-0.06622314453125,
0.055633544921875,
-0.01715087890625,
0.00018703937530517578,
0.002246856689453125,
-0.04608154296875,
-0.07574462890625,
0.0799560546875,
0.0190887451171875,
-0.022308349609375,
0.00044417381286621094,
-0.035797119140625,
0.03485107421875,
-0.0176544189453125,
0.0195159912109375,
0.007190704345703125,
0.032135009765625,
0.02001953125,
-0.0672607421875,
-0.006633758544921875,
-0.028961181640625,
0.0201263427734375,
0.0038661956787109375,
-0.0975341796875,
0.10345458984375,
-0.024139404296875,
-0.0137176513671875,
0.041839599609375,
0.050689697265625,
-0.003841400146484375,
0.00914764404296875,
0.024444580078125,
0.0521240234375,
0.0255126953125,
-0.002948760986328125,
0.07318115234375,
-0.0120849609375,
0.05523681640625,
0.037872314453125,
0.003803253173828125,
0.0694580078125,
0.0323486328125,
-0.059906005859375,
0.07568359375,
0.0478515625,
-0.003665924072265625,
0.039794921875,
0.0141448974609375,
-0.007843017578125,
-0.00004786252975463867,
-0.00994873046875,
-0.0235748291015625,
0.047607421875,
0.01322174072265625,
0.0008111000061035156,
-0.003627777099609375,
0.006259918212890625,
0.02178955078125,
0.0228271484375,
0.0056610107421875,
0.05279541015625,
0.01293182373046875,
-0.0119781494140625,
0.06427001953125,
0.00402069091796875,
0.04376220703125,
-0.040191650390625,
-0.0009360313415527344,
-0.0285797119140625,
0.038848876953125,
-0.007568359375,
-0.0252532958984375,
0.03277587890625,
-0.0189971923828125,
-0.0283660888671875,
-0.01800537109375,
0.035186767578125,
-0.042510986328125,
-0.053009033203125,
0.01751708984375,
0.00305938720703125,
0.0361328125,
-0.007106781005859375,
-0.07159423828125,
0.01953125,
0.007781982421875,
-0.03936767578125,
0.03375244140625,
0.021759033203125,
0.0034999847412109375,
0.062469482421875,
0.03424072265625,
0.035552978515625,
-0.0200958251953125,
-0.01702880859375,
0.06976318359375,
-0.04718017578125,
-0.04156494140625,
-0.045318603515625,
0.050079345703125,
-0.0045623779296875,
-0.021881103515625,
0.062469482421875,
0.027740478515625,
0.0439453125,
-0.00933074951171875,
0.037139892578125,
-0.01055145263671875,
0.06427001953125,
-0.035614013671875,
0.0751953125,
-0.060638427734375,
-0.029510498046875,
-0.0281219482421875,
-0.06494140625,
-0.00778961181640625,
0.07080078125,
0.0097503662109375,
0.004062652587890625,
0.0298614501953125,
0.034027099609375,
0.0194244384765625,
-0.027191162109375,
0.01074981689453125,
0.057586669921875,
0.0125274658203125,
0.038787841796875,
0.0216217041015625,
-0.073486328125,
0.0036754608154296875,
-0.0217742919921875,
-0.0281219482421875,
-0.0478515625,
-0.06890869140625,
-0.0694580078125,
-0.04974365234375,
-0.043487548828125,
-0.05242919921875,
-0.03387451171875,
0.06024169921875,
0.04608154296875,
-0.05474853515625,
-0.0018672943115234375,
0.0205078125,
0.003726959228515625,
-0.026580810546875,
-0.0194091796875,
0.03924560546875,
0.0229644775390625,
-0.0694580078125,
0.0010509490966796875,
0.0157928466796875,
0.02349853515625,
-0.035430908203125,
-0.00469207763671875,
-0.026397705078125,
-0.00638580322265625,
0.006046295166015625,
0.023101806640625,
-0.07525634765625,
-0.0142669677734375,
-0.0161590576171875,
-0.00551605224609375,
-0.002216339111328125,
0.0102081298828125,
-0.04156494140625,
0.0360107421875,
0.0343017578125,
0.0136566162109375,
0.04949951171875,
0.028472900390625,
0.028472900390625,
-0.0286712646484375,
0.0457763671875,
-0.00612640380859375,
0.0232086181640625,
0.0254058837890625,
-0.044097900390625,
0.03936767578125,
0.037322998046875,
-0.049560546875,
-0.07421875,
0.0144500732421875,
-0.1041259765625,
0.01531219482421875,
0.06292724609375,
-0.027313232421875,
-0.036651611328125,
0.0306243896484375,
-0.030426025390625,
0.0379638671875,
-0.0206298828125,
0.0321044921875,
0.038787841796875,
-0.0187835693359375,
-0.025299072265625,
-0.028533935546875,
0.03289794921875,
0.030517578125,
-0.03460693359375,
-0.00555419921875,
0.02117919921875,
0.04779052734375,
0.0200958251953125,
0.052154541015625,
-0.01187896728515625,
-0.0019550323486328125,
0.0086822509765625,
0.029876708984375,
-0.00653839111328125,
-0.005352020263671875,
-0.024169921875,
-0.0224609375,
0.004459381103515625,
-0.0214080810546875
]
] |
EleutherAI/pythia-2.8b-v0 | 2023-07-10T01:35:41.000Z | [
"transformers",
"pytorch",
"safetensors",
"gpt_neox",
"text-generation",
"causal-lm",
"pythia",
"pythia_v0",
"en",
"dataset:the_pile",
"arxiv:2101.00027",
"arxiv:2201.07311",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | EleutherAI | null | null | EleutherAI/pythia-2.8b-v0 | 5 | 6,935 | transformers | 2022-11-20T03:56:10 | ---
language:
- en
tags:
- pytorch
- causal-lm
- pythia
- pythia_v0
license: apache-2.0
datasets:
- the_pile
---
The *Pythia Scaling Suite* is a collection of models developed to facilitate
interpretability research. It contains two sets of eight models of sizes
70M, 160M, 410M, 1B, 1.4B, 2.8B, 6.9B, and 12B. For each size, there are two
models: one trained on the Pile, and one trained on the Pile after the dataset
has been globally deduplicated. All 8 model sizes are trained on the exact
same data, in the exact same order. All Pythia models are available
[on Hugging Face](https://huggingface.co/models?other=pythia).
The Pythia model suite was deliberately designed to promote scientific
research on large language models, especially interpretability research.
Despite not centering downstream performance as a design goal, we find the
models <a href="#evaluations">match or exceed</a> the performance of
similar and same-sized models, such as those in the OPT and GPT-Neo suites.
Please note that all models in the *Pythia* suite were renamed in January
2023. For clarity, a <a href="#naming-convention-and-parameter-count">table
comparing the old and new names</a> is provided in this model card, together
with exact parameter counts.
## Pythia-2.8B
### Model Details
- Developed by: [EleutherAI](http://eleuther.ai)
- Model type: Transformer-based Language Model
- Language: English
- Learn more: [Pythia's GitHub repository](https://github.com/EleutherAI/pythia)
for training procedure, config files, and details on how to use.
- Library: [GPT-NeoX](https://github.com/EleutherAI/gpt-neox)
- License: Apache 2.0
- Contact: to ask questions about this model, join the [EleutherAI
Discord](https://discord.gg/zBGx3azzUn), and post them in `#release-discussion`.
Please read the existing *Pythia* documentation before asking about it in the
EleutherAI Discord. For general correspondence: [contact@eleuther.ai](mailto:contact@eleuther.ai).
<figure>
| Pythia model | Non-Embedding Params | Layers | Model Dim | Heads | Batch Size | Learning Rate | Equivalent Models |
| -----------: | -------------------: | :----: | :-------: | :---: | :--------: | :-------------------: | :--------------------: |
| 70M | 18,915,328 | 6 | 512 | 8 | 2M | 1.0 x 10<sup>-3</sup> | — |
| 160M | 85,056,000 | 12 | 768 | 12 | 4M | 6.0 x 10<sup>-4</sup> | GPT-Neo 125M, OPT-125M |
| 410M | 302,311,424 | 24 | 1024 | 16 | 4M | 3.0 x 10<sup>-4</sup> | OPT-350M |
| 1.0B | 805,736,448 | 16 | 2048 | 8 | 2M | 3.0 x 10<sup>-4</sup> | — |
| 1.4B | 1,208,602,624 | 24 | 2048 | 16 | 4M | 2.0 x 10<sup>-4</sup> | GPT-Neo 1.3B, OPT-1.3B |
| 2.8B | 2,517,652,480 | 32 | 2560 | 32 | 2M | 1.6 x 10<sup>-4</sup> | GPT-Neo 2.7B, OPT-2.7B |
| 6.9B | 6,444,163,072 | 32 | 4096 | 32 | 2M | 1.2 x 10<sup>-4</sup> | OPT-6.7B |
| 12B | 11,327,027,200 | 36 | 5120 | 40 | 2M | 1.2 x 10<sup>-4</sup> | — |
<figcaption>Engineering details for the <i>Pythia Suite</i>. Deduped and
non-deduped models of a given size have the same hyperparameters. “Equivalent”
models have <b>exactly</b> the same architecture, and the same number of
non-embedding parameters.</figcaption>
</figure>
### Uses and Limitations
#### Intended Use
The primary intended use of Pythia is research on the behavior, functionality,
and limitations of large language models. This suite is intended to provide
a controlled setting for performing scientific experiments. To enable the
study of how language models change over the course of training, we provide
143 evenly spaced intermediate checkpoints per model. These checkpoints are
hosted on Hugging Face as branches. Note that branch `143000` corresponds
exactly to the model checkpoint on the `main` branch of each model.
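As a quick sanity check, the evenly spaced checkpoint branches can be enumerated programmatically (a minimal sketch, assuming the `stepN` branch-naming scheme used in the Quickstart below, with evenly spaced checkpoints running from `step1000` to `step143000` in increments of 1000):

```python
# Enumerate the evenly spaced checkpoint branch names.
# Each name can be passed as `revision=` to `from_pretrained`,
# as shown in the Quickstart.
revisions = [f"step{n}" for n in range(1000, 143_001, 1000)]

print(len(revisions))             # 143 evenly spaced checkpoints
print(revisions[0], revisions[-1])  # step1000 step143000
```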
You may also further fine-tune and adapt Pythia-2.8B for deployment,
as long as your use is in accordance with the Apache 2.0 license. Pythia
models work with the Hugging Face [Transformers
Library](https://huggingface.co/docs/transformers/index). If you decide to use
pre-trained Pythia-2.8B as a basis for your fine-tuned model, please
conduct your own risk and bias assessment.
#### Out-of-scope use
The Pythia Suite is **not** intended for deployment. It is not in itself
a product and cannot be used for human-facing interactions.
Pythia models are English-language only, and are not suitable for translation
or generating text in other languages.
Pythia-2.8B has not been fine-tuned for downstream contexts in which
language models are commonly deployed, such as writing genre prose,
or commercial chatbots. This means Pythia-2.8B will **not**
respond to a given prompt the way a product like ChatGPT does. This is because,
unlike this model, ChatGPT was fine-tuned using methods such as Reinforcement
Learning from Human Feedback (RLHF) to better “understand” human instructions.
#### Limitations and biases
The core functionality of a large language model is to take a string of text
and predict the next token. The token deemed statistically most likely by the
model need not produce the most “accurate” text. Never rely on
Pythia-2.8B to produce factually accurate output.
This model was trained on [the Pile](https://pile.eleuther.ai/), a dataset
known to contain profanity and texts that are lewd or otherwise offensive.
See [Section 6 of the Pile paper](https://arxiv.org/abs/2101.00027) for a
discussion of documented biases with regards to gender, religion, and race.
Pythia-2.8B may produce socially unacceptable or undesirable text, *even if*
the prompt itself does not include anything explicitly offensive.
If you plan on using text generated through, for example, the Hosted Inference
API, we recommend having a human curate the outputs of this language model
before presenting it to other people. Please inform your audience that the
text was generated by Pythia-2.8B.
### Quickstart
Pythia models can be loaded and used via the following code, demonstrated here
for the third `pythia-70m-deduped` checkpoint:
```python
from transformers import GPTNeoXForCausalLM, AutoTokenizer
model = GPTNeoXForCausalLM.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
tokenizer = AutoTokenizer.from_pretrained(
"EleutherAI/pythia-70m-deduped",
revision="step3000",
cache_dir="./pythia-70m-deduped/step3000",
)
inputs = tokenizer("Hello, I am", return_tensors="pt")
tokens = model.generate(**inputs)
tokenizer.decode(tokens[0])
```
Revision/branch `step143000` corresponds exactly to the model checkpoint on
the `main` branch of each model.<br>
For more information on how to use all Pythia models, see [documentation on
GitHub](https://github.com/EleutherAI/pythia).
### Training
#### Training data
[The Pile](https://pile.eleuther.ai/) is a 825GiB general-purpose dataset in
English. It was created by EleutherAI specifically for training large language
models. It contains texts from 22 diverse sources, roughly broken down into
five categories: academic writing (e.g. arXiv), internet (e.g. CommonCrawl),
prose (e.g. Project Gutenberg), dialogue (e.g. YouTube subtitles), and
miscellaneous (e.g. GitHub, Enron Emails). See [the Pile
paper](https://arxiv.org/abs/2101.00027) for a breakdown of all data sources,
methodology, and a discussion of ethical implications. Consult [the
datasheet](https://arxiv.org/abs/2201.07311) for more detailed documentation
about the Pile and its component datasets. The Pile can be downloaded from
the [official website](https://pile.eleuther.ai/), or from a [community
mirror](https://the-eye.eu/public/AI/pile/).<br>
The Pile was **not** deduplicated before being used to train Pythia-2.8B.
#### Training procedure
All models were trained on the exact same data, in the exact same order. Each
model saw 299,892,736,000 tokens during training, and 143 checkpoints for each
model are saved every 2,097,152,000 tokens, spaced evenly throughout training.
This corresponds to training for just under 1 epoch on the Pile for
non-deduplicated models, and about 1.5 epochs on the deduplicated Pile.
All *Pythia* models trained for the equivalent of 143000 steps at a batch size
of 2,097,152 tokens. Two batch sizes were used: 2M and 4M. Models with a batch
size of 4M tokens were originally trained for 71,500 steps instead, with
checkpoints every 500 steps. The checkpoints on Hugging Face are renamed for
consistency with all 2M batch models, so `step1000` is the first checkpoint
for `pythia-1.4b` that was saved (corresponding to step 500 in training), and
`step1000` is likewise the first `pythia-6.9b` checkpoint that was saved
(corresponding to 1000 “actual” steps).<br>
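This renaming can be expressed as a small helper (a hedged sketch; `actual_step` is a hypothetical illustration, not part of any Pythia tooling):

```python
# Hypothetical helper: recover the "actual" training step for a model
# trained at a 4M-token batch size, given the renamed checkpoint number
# (which counts in 2M-token batches for consistency across the suite).
def actual_step(renamed_step: int, batch_size_tokens: int = 4_194_304) -> int:
    baseline = 2_097_152  # the 2M-token batch used as the renaming baseline
    return renamed_step * baseline // batch_size_tokens

print(actual_step(1000))  # 500, matching the pythia-1.4b example above
```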
See [GitHub](https://github.com/EleutherAI/pythia) for more details on training
procedure, including [how to reproduce
it](https://github.com/EleutherAI/pythia/blob/main/README.md#reproducing-training).<br>
Pythia uses the same tokenizer as [GPT-NeoX-
20B](https://huggingface.co/EleutherAI/gpt-neox-20b).
### Evaluations
All 16 *Pythia* models were evaluated using the [LM Evaluation
Harness](https://github.com/EleutherAI/lm-evaluation-harness). You can access
the results by model and step at `results/json/*` in the [GitHub
repository](https://github.com/EleutherAI/pythia/tree/main/results/json).<br>
Expand the sections below to see plots of evaluation results for all
Pythia and Pythia-deduped models compared with OPT and BLOOM.
<details>
<summary>LAMBADA – OpenAI</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/lambada_openai.png" style="width:auto"/>
</details>
<details>
<summary>Physical Interaction: Question Answering (PIQA)</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/piqa.png" style="width:auto"/>
</details>
<details>
<summary>WinoGrande</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/winogrande.png" style="width:auto"/>
</details>
<details>
<summary>AI2 Reasoning Challenge—Challenge Set</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/arc_challenge.png" style="width:auto"/>
</details>
<details>
<summary>SciQ</summary>
<img src="/EleutherAI/pythia-12b/resolve/main/eval_plots/sciq.png" style="width:auto"/>
</details>
### Naming convention and parameter count
*Pythia* models were renamed in January 2023. It is possible that the old
naming convention still persists in some documentation by accident. The
current naming convention (70M, 160M, etc.) is based on total parameter count.
<figure style="width:32em">
| current Pythia suffix | old suffix | total params | non-embedding params |
| --------------------: | ---------: | -------------: | -------------------: |
| 70M | 19M | 70,426,624 | 18,915,328 |
| 160M | 125M | 162,322,944 | 85,056,000 |
| 410M | 350M | 405,334,016 | 302,311,424 |
| 1B | 800M | 1,011,781,632 | 805,736,448 |
| 1.4B | 1.3B | 1,414,647,808 | 1,208,602,624 |
| 2.8B | 2.7B | 2,775,208,960 | 2,517,652,480 |
| 6.9B | 6.7B | 6,857,302,016 | 6,444,163,072 |
| 12B | 13B | 11,846,072,320 | 11,327,027,200 |
</figure> | 11,782 | [
[
-0.0251007080078125,
-0.06256103515625,
0.0205078125,
0.00397491455078125,
-0.0171966552734375,
-0.0129547119140625,
-0.0168609619140625,
-0.03570556640625,
0.0154266357421875,
0.0164337158203125,
-0.0229949951171875,
-0.022552490234375,
-0.03582763671875,
-0.0053253173828125,
-0.03582763671875,
0.09051513671875,
-0.00553131103515625,
-0.012420654296875,
0.006526947021484375,
-0.0101776123046875,
-0.0027294158935546875,
-0.03814697265625,
-0.032623291015625,
-0.0266571044921875,
0.046295166015625,
0.0163421630859375,
0.0662841796875,
0.048065185546875,
0.0164642333984375,
0.020904541015625,
-0.0227813720703125,
-0.0025768280029296875,
-0.0172882080078125,
-0.00916290283203125,
-0.00007534027099609375,
-0.021636962890625,
-0.050384521484375,
0.0018625259399414062,
0.0506591796875,
0.04766845703125,
-0.01611328125,
0.0186309814453125,
0.0020732879638671875,
0.0350341796875,
-0.0379638671875,
0.002338409423828125,
-0.0204925537109375,
-0.013519287109375,
-0.0133209228515625,
0.007396697998046875,
-0.0286712646484375,
-0.0276641845703125,
0.033843994140625,
-0.047454833984375,
0.0173492431640625,
0.01023101806640625,
0.0870361328125,
-0.00481414794921875,
-0.034423828125,
-0.01439666748046875,
-0.04876708984375,
0.051513671875,
-0.052001953125,
0.020050048828125,
0.0276947021484375,
0.00841522216796875,
0.00045800209045410156,
-0.061920166015625,
-0.03948974609375,
-0.01529693603515625,
-0.0156707763671875,
0.0011234283447265625,
-0.043121337890625,
-0.0001506805419921875,
0.03668212890625,
0.049468994140625,
-0.06976318359375,
0.0015535354614257812,
-0.0313720703125,
-0.02276611328125,
0.031890869140625,
0.0011224746704101562,
0.039154052734375,
-0.024383544921875,
0.0011034011840820312,
-0.02911376953125,
-0.05133056640625,
-0.01369476318359375,
0.04412841796875,
0.00218963623046875,
-0.0306854248046875,
0.04180908203125,
-0.021148681640625,
0.042022705078125,
-0.004230499267578125,
0.0191802978515625,
0.03448486328125,
-0.015167236328125,
-0.0343017578125,
-0.00580596923828125,
0.07391357421875,
0.01238250732421875,
0.016815185546875,
0.000010848045349121094,
-0.002292633056640625,
0.006641387939453125,
0.005584716796875,
-0.08404541015625,
-0.061981201171875,
0.0222320556640625,
-0.03155517578125,
-0.031341552734375,
-0.0099945068359375,
-0.07000732421875,
-0.01959228515625,
-0.010162353515625,
0.03582763671875,
-0.040435791015625,
-0.05419921875,
-0.01514434814453125,
0.0009975433349609375,
0.017547607421875,
0.0230560302734375,
-0.07403564453125,
0.0284576416015625,
0.032867431640625,
0.0767822265625,
0.01335906982421875,
-0.040313720703125,
-0.0181427001953125,
-0.0174407958984375,
-0.01214599609375,
0.03411865234375,
-0.0106048583984375,
-0.01340484619140625,
-0.00441741943359375,
0.01092529296875,
-0.00762939453125,
-0.02899169921875,
0.029815673828125,
-0.026702880859375,
0.0276947021484375,
-0.017822265625,
-0.03765869140625,
-0.031768798828125,
0.00531005859375,
-0.05047607421875,
0.07000732421875,
0.0181732177734375,
-0.07537841796875,
0.015899658203125,
-0.023681640625,
-0.0079193115234375,
-0.00041222572326660156,
0.008148193359375,
-0.04864501953125,
0.0013341903686523438,
0.02606201171875,
0.008392333984375,
-0.029815673828125,
0.0187530517578125,
-0.013702392578125,
-0.0269012451171875,
0.012054443359375,
-0.036773681640625,
0.0751953125,
0.01210784912109375,
-0.049652099609375,
0.018341064453125,
-0.04345703125,
0.01024627685546875,
0.0191650390625,
-0.0240020751953125,
-0.001323699951171875,
-0.010589599609375,
0.0279693603515625,
0.0155487060546875,
0.012420654296875,
-0.0235137939453125,
0.0191802978515625,
-0.040802001953125,
0.052734375,
0.055145263671875,
-0.008697509765625,
0.039703369140625,
-0.03424072265625,
0.0360107421875,
-0.00004988908767700195,
0.006702423095703125,
-0.00469970703125,
-0.0472412109375,
-0.07232666015625,
-0.0178070068359375,
0.02447509765625,
0.02740478515625,
-0.03741455078125,
0.034698486328125,
-0.0204315185546875,
-0.0648193359375,
-0.020263671875,
-0.006557464599609375,
0.0302734375,
0.0215911865234375,
0.030120849609375,
-0.01079559326171875,
-0.038360595703125,
-0.0677490234375,
-0.0152587890625,
-0.03167724609375,
0.00664520263671875,
0.017120361328125,
0.0692138671875,
-0.0035419464111328125,
0.045257568359375,
-0.0244598388671875,
0.0130462646484375,
-0.0261383056640625,
0.01372528076171875,
0.0318603515625,
0.04522705078125,
0.033203125,
-0.0472412109375,
-0.0343017578125,
0.004878997802734375,
-0.0482177734375,
0.005977630615234375,
0.0013513565063476562,
-0.0211639404296875,
0.0286102294921875,
0.00583648681640625,
-0.07440185546875,
0.0309906005859375,
0.053253173828125,
-0.044189453125,
0.057464599609375,
-0.023162841796875,
-0.0019989013671875,
-0.08123779296875,
0.022308349609375,
0.00977325439453125,
-0.0152587890625,
-0.046173095703125,
0.006488800048828125,
0.01171112060546875,
-0.01136016845703125,
-0.0284271240234375,
0.04705810546875,
-0.041595458984375,
-0.01299285888671875,
-0.0125579833984375,
0.008544921875,
-0.0021953582763671875,
0.047698974609375,
0.0099945068359375,
0.045379638671875,
0.057586669921875,
-0.054473876953125,
0.03204345703125,
0.0193634033203125,
-0.0244598388671875,
0.029571533203125,
-0.063720703125,
0.01384735107421875,
0.0023746490478515625,
0.03173828125,
-0.04351806640625,
-0.02667236328125,
0.039154052734375,
-0.045318603515625,
0.01120758056640625,
-0.0281219482421875,
-0.042510986328125,
-0.0289154052734375,
-0.017059326171875,
0.04254150390625,
0.05828857421875,
-0.04193115234375,
0.052490234375,
0.01375579833984375,
0.0028133392333984375,
-0.034393310546875,
-0.043121337890625,
-0.0175628662109375,
-0.041290283203125,
-0.049285888671875,
0.0263214111328125,
0.007076263427734375,
-0.0170440673828125,
0.0045623779296875,
0.0026607513427734375,
0.005962371826171875,
-0.004047393798828125,
0.0218048095703125,
0.0234375,
-0.0017223358154296875,
0.0008921623229980469,
-0.01149749755859375,
-0.00731658935546875,
0.0011453628540039062,
-0.035308837890625,
0.07147216796875,
-0.021148681640625,
-0.00890350341796875,
-0.054840087890625,
0.0024089813232421875,
0.06402587890625,
-0.0274505615234375,
0.06854248046875,
0.05047607421875,
-0.052001953125,
0.0075836181640625,
-0.0287322998046875,
-0.019989013671875,
-0.032928466796875,
0.0509033203125,
-0.0172882080078125,
-0.03424072265625,
0.046478271484375,
0.0246734619140625,
0.0241851806640625,
0.047027587890625,
0.05743408203125,
0.0087890625,
0.08782958984375,
0.035125732421875,
-0.01363372802734375,
0.04345703125,
-0.04107666015625,
0.0170745849609375,
-0.0802001953125,
-0.0158843994140625,
-0.039459228515625,
-0.01995849609375,
-0.07427978515625,
-0.024383544921875,
0.0204010009765625,
0.0156097412109375,
-0.054443359375,
0.03948974609375,
-0.045623779296875,
0.0084686279296875,
0.047393798828125,
0.01328277587890625,
0.01215362548828125,
0.0174407958984375,
0.0083465576171875,
-0.004795074462890625,
-0.053955078125,
-0.0273895263671875,
0.0904541015625,
0.038543701171875,
0.03997802734375,
0.02337646484375,
0.047637939453125,
-0.004711151123046875,
0.020172119140625,
-0.05523681640625,
0.0295257568359375,
0.017059326171875,
-0.05419921875,
-0.018829345703125,
-0.057586669921875,
-0.07501220703125,
0.036102294921875,
0.00623321533203125,
-0.08123779296875,
0.01593017578125,
0.0165863037109375,
-0.02545166015625,
0.036285400390625,
-0.04595947265625,
0.07415771484375,
-0.018585205078125,
-0.0304412841796875,
-0.0298309326171875,
-0.026885986328125,
0.0149993896484375,
0.025390625,
0.016876220703125,
0.005413055419921875,
0.019561767578125,
0.0794677734375,
-0.048583984375,
0.049102783203125,
-0.00604248046875,
0.007137298583984375,
0.0285797119140625,
0.0213623046875,
0.051025390625,
0.006732940673828125,
0.0088043212890625,
0.0008463859558105469,
0.007457733154296875,
-0.04193115234375,
-0.0268402099609375,
0.06500244140625,
-0.0859375,
-0.024261474609375,
-0.060577392578125,
-0.04486083984375,
0.01186370849609375,
0.0141754150390625,
0.03594970703125,
0.049774169921875,
-0.00400543212890625,
0.008026123046875,
0.044677734375,
-0.038726806640625,
0.0276641845703125,
0.01493072509765625,
-0.032012939453125,
-0.0374755859375,
0.07672119140625,
0.006732940673828125,
0.02020263671875,
0.00722503662109375,
0.0206756591796875,
-0.031768798828125,
-0.033111572265625,
-0.045013427734375,
0.04034423828125,
-0.055145263671875,
0.0008654594421386719,
-0.05572509765625,
-0.006500244140625,
-0.033355712890625,
0.0029926300048828125,
-0.0273590087890625,
-0.0291748046875,
-0.0181884765625,
-0.0070037841796875,
0.045989990234375,
0.041748046875,
0.00698089599609375,
0.02459716796875,
-0.043182373046875,
-0.003887176513671875,
0.01416015625,
0.00864410400390625,
0.00833892822265625,
-0.06817626953125,
-0.0100555419921875,
0.00901031494140625,
-0.0260009765625,
-0.084228515625,
0.04345703125,
-0.00199127197265625,
0.0254058837890625,
0.0094146728515625,
-0.016021728515625,
0.039154052734375,
-0.00872039794921875,
0.053436279296875,
0.00968170166015625,
-0.0728759765625,
0.041656494140625,
-0.0421142578125,
0.0198822021484375,
0.0234222412109375,
0.0278778076171875,
-0.055755615234375,
-0.0080413818359375,
-0.07672119140625,
-0.083984375,
0.060821533203125,
0.037841796875,
0.01001739501953125,
0.0100860595703125,
0.0234375,
-0.0292510986328125,
0.01305389404296875,
-0.077880859375,
-0.02410888671875,
-0.0178680419921875,
-0.01091766357421875,
0.008544921875,
-0.00493621826171875,
0.0015125274658203125,
-0.0347900390625,
0.0755615234375,
0.005279541015625,
0.022705078125,
0.0156707763671875,
-0.0328369140625,
-0.0081787109375,
-0.004261016845703125,
0.0140380859375,
0.05810546875,
-0.0088348388671875,
0.003322601318359375,
0.01337432861328125,
-0.0447998046875,
-0.0017070770263671875,
0.0123138427734375,
-0.0282135009765625,
-0.005458831787109375,
0.0165252685546875,
0.0682373046875,
0.0041351318359375,
-0.035003662109375,
0.0164642333984375,
-0.0005154609680175781,
-0.00846099853515625,
-0.0280303955078125,
-0.0164337158203125,
0.0123748779296875,
0.0107421875,
0.003173828125,
-0.0149993896484375,
0.0014467239379882812,
-0.06231689453125,
0.004547119140625,
0.01201629638671875,
-0.0085906982421875,
-0.0300445556640625,
0.039337158203125,
0.005344390869140625,
-0.01495361328125,
0.08587646484375,
-0.018951416015625,
-0.048614501953125,
0.060577392578125,
0.03271484375,
0.0540771484375,
-0.0169677734375,
0.030487060546875,
0.06658935546875,
0.0286712646484375,
-0.01397705078125,
0.004878997802734375,
0.01087188720703125,
-0.043060302734375,
-0.01104736328125,
-0.057342529296875,
-0.0186004638671875,
0.0207366943359375,
-0.042510986328125,
0.0294647216796875,
-0.045257568359375,
0.000026047229766845703,
-0.004970550537109375,
0.0196380615234375,
-0.04461669921875,
0.0215301513671875,
0.01200103759765625,
0.04901123046875,
-0.07086181640625,
0.05743408203125,
0.05389404296875,
-0.053863525390625,
-0.08233642578125,
-0.0011997222900390625,
0.00435638427734375,
-0.040313720703125,
0.0121917724609375,
0.0196075439453125,
0.0193328857421875,
0.01380157470703125,
-0.0247802734375,
-0.0687255859375,
0.09454345703125,
0.01529693603515625,
-0.04736328125,
-0.0176239013671875,
-0.00829315185546875,
0.0404052734375,
0.0047607421875,
0.05438232421875,
0.05255126953125,
0.0307769775390625,
0.0028018951416015625,
-0.0787353515625,
0.02557373046875,
-0.0307159423828125,
-0.007366180419921875,
0.018218994140625,
-0.05364990234375,
0.0938720703125,
-0.003307342529296875,
-0.003139495849609375,
0.0226287841796875,
0.04193115234375,
0.0302276611328125,
-0.0086212158203125,
0.028350830078125,
0.059295654296875,
0.07025146484375,
-0.0254058837890625,
0.09552001953125,
-0.023223876953125,
0.055755615234375,
0.0648193359375,
0.0157318115234375,
0.0419921875,
0.026580810546875,
-0.02789306640625,
0.0440673828125,
0.06536865234375,
-0.007633209228515625,
0.01422119140625,
0.0157318115234375,
-0.0166168212890625,
-0.012664794921875,
0.00379180908203125,
-0.0413818359375,
0.017852783203125,
0.0199737548828125,
-0.040435791015625,
-0.01076507568359375,
-0.02484130859375,
0.02728271484375,
-0.0279693603515625,
-0.017333984375,
0.0265045166015625,
0.0098724365234375,
-0.049774169921875,
0.0462646484375,
0.01340484619140625,
0.0430908203125,
-0.03253173828125,
0.010162353515625,
-0.01181793212890625,
0.0192718505859375,
-0.029571533203125,
-0.0384521484375,
0.00897216796875,
0.0015926361083984375,
0.00505828857421875,
0.00487518310546875,
0.032684326171875,
-0.0155487060546875,
-0.04241943359375,
0.0135345458984375,
0.04034423828125,
0.0242156982421875,
-0.0283966064453125,
-0.05401611328125,
0.002597808837890625,
-0.0106048583984375,
-0.044647216796875,
0.0341796875,
0.024383544921875,
-0.0092926025390625,
0.045745849609375,
0.049713134765625,
0.00435638427734375,
-0.00644683837890625,
0.01172637939453125,
0.0692138671875,
-0.03289794921875,
-0.039947509765625,
-0.07000732421875,
0.0400390625,
-0.004116058349609375,
-0.04632568359375,
0.065673828125,
0.04119873046875,
0.053497314453125,
0.0205535888671875,
0.05078125,
-0.037506103515625,
0.004146575927734375,
-0.02410888671875,
0.054473876953125,
-0.036407470703125,
0.0011749267578125,
-0.03289794921875,
-0.08673095703125,
-0.00508880615234375,
0.07769775390625,
-0.0372314453125,
0.029388427734375,
0.06378173828125,
0.05615234375,
-0.0003800392150878906,
0.00501251220703125,
0.002399444580078125,
0.0282745361328125,
0.03411865234375,
0.0654296875,
0.0643310546875,
-0.048370361328125,
0.04364013671875,
-0.03582763671875,
-0.0206756591796875,
-0.01297760009765625,
-0.038665771484375,
-0.06732177734375,
-0.037322998046875,
-0.03570556640625,
-0.05645751953125,
-0.002490997314453125,
0.06744384765625,
0.05352783203125,
-0.050445556640625,
-0.0167694091796875,
-0.036956787109375,
0.0039215087890625,
-0.018035888671875,
-0.0182342529296875,
0.04010009765625,
0.011810302734375,
-0.07061767578125,
-0.004055023193359375,
-0.00780487060546875,
0.0058135986328125,
-0.0304718017578125,
-0.0211639404296875,
-0.012664794921875,
-0.011383056640625,
0.00885772705078125,
0.023223876953125,
-0.038848876953125,
-0.017730712890625,
0.00341033935546875,
0.0018930435180664062,
0.0017566680908203125,
0.050811767578125,
-0.041595458984375,
0.007843017578125,
0.04833984375,
0.01332855224609375,
0.0560302734375,
-0.01739501953125,
0.032501220703125,
-0.023834228515625,
0.0255279541015625,
0.0248870849609375,
0.045989990234375,
0.0284576416015625,
-0.024017333984375,
0.020233154296875,
0.02740478515625,
-0.058563232421875,
-0.06268310546875,
0.0211181640625,
-0.054412841796875,
-0.00928497314453125,
0.0970458984375,
-0.0199737548828125,
-0.0246429443359375,
0.0019779205322265625,
-0.01458740234375,
0.036895751953125,
-0.02655029296875,
0.053314208984375,
0.05133056640625,
0.00423431396484375,
-0.0219879150390625,
-0.052459716796875,
0.032623291015625,
0.050018310546875,
-0.0635986328125,
0.0283203125,
0.047607421875,
0.038848876953125,
0.017547607421875,
0.051361083984375,
-0.018768310546875,
0.04022216796875,
0.005840301513671875,
0.00472259521484375,
0.006855010986328125,
-0.033935546875,
-0.032318115234375,
-0.00868988037109375,
0.01143646240234375,
0.005985260009765625
]
] |
NousResearch/Llama-2-13b-chat-hf | 2023-09-13T08:15:28.000Z | [
"transformers",
"pytorch",
"safetensors",
"llama",
"text-generation",
"facebook",
"meta",
"llama-2",
"en",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | NousResearch | null | null | NousResearch/Llama-2-13b-chat-hf | 15 | 6,929 | transformers | 2023-07-19T01:01:32 | ---
extra_gated_heading: Access Llama 2 on Hugging Face
extra_gated_description: >-
This is a form to enable access to Llama 2 on Hugging Face after you have been
granted access from Meta. Please visit the [Meta website](https://ai.meta.com/resources/models-and-libraries/llama-downloads) and accept our
license terms and acceptable use policy before submitting this form. Requests
will be processed in 1-2 days.
extra_gated_button_content: Submit
extra_gated_fields:
I agree to share my name, email address and username with Meta and confirm that I have already been granted download access on the Meta website: checkbox
language:
- en
pipeline_tag: text-generation
inference: false
tags:
- facebook
- meta
- pytorch
- llama
- llama-2
---
# **Llama 2**
Llama 2 is a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 70 billion parameters. This is the repository for the 13B fine-tuned model, optimized for dialogue use cases and converted for the Hugging Face Transformers format. Links to other models can be found in the index at the bottom.
## Model Details
*Note: Use of this model is governed by the Meta license. In order to download the model weights and tokenizer, please visit the [website](https://ai.meta.com/resources/models-and-libraries/llama-downloads/) and accept our License before requesting access here.*
Meta developed and publicly released the Llama 2 family of large language models (LLMs), a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 70 billion parameters. Our fine-tuned LLMs, called Llama-2-Chat, are optimized for dialogue use cases. Llama-2-Chat models outperform open-source chat models on most benchmarks we tested, and in our human evaluations for helpfulness and safety, are on par with some popular closed-source models like ChatGPT and PaLM.
**Model Developers** Meta
**Variations** Llama 2 comes in a range of parameter sizes — 7B, 13B, and 70B — as well as pretrained and fine-tuned variations.
**Input** Models input text only.
**Output** Models generate text only.
**Model Architecture** Llama 2 is an auto-regressive language model that uses an optimized transformer architecture. The tuned versions use supervised fine-tuning (SFT) and reinforcement learning with human feedback (RLHF) to align to human preferences for helpfulness and safety.
||Training Data|Params|Content Length|GQA|Tokens|LR|
|---|---|---|---|---|---|---|
|Llama 2|*A new mix of publicly available online data*|7B|4k|✗|2.0T|3.0 x 10<sup>-4</sup>|
|Llama 2|*A new mix of publicly available online data*|13B|4k|✗|2.0T|3.0 x 10<sup>-4</sup>|
|Llama 2|*A new mix of publicly available online data*|70B|4k|✔|2.0T|1.5 x 10<sup>-4</sup>|
*Llama 2 family of models.* Token counts refer to pretraining data only. All models are trained with a global batch size of 4M tokens. The larger 70B model uses Grouped-Query Attention (GQA) for improved inference scalability.
**Model Dates** Llama 2 was trained between January 2023 and July 2023.
**Status** This is a static model trained on an offline dataset. Future versions of the tuned models will be released as we improve model safety with community feedback.
**License** A custom commercial license is available at: [https://ai.meta.com/resources/models-and-libraries/llama-downloads/](https://ai.meta.com/resources/models-and-libraries/llama-downloads/)
## Intended Use
**Intended Use Cases** Llama 2 is intended for commercial and research use in English. Tuned models are intended for assistant-like chat, whereas pretrained models can be adapted for a variety of natural language generation tasks.
To get the expected features and performance for the chat versions, a specific formatting needs to be followed, including the `INST` and `<<SYS>>` tags, `BOS` and `EOS` tokens, and the whitespace and line breaks in between (we recommend calling `strip()` on inputs to avoid double spaces). See our reference code on GitHub for details: [`chat_completion`](https://github.com/facebookresearch/llama/blob/main/llama/generation.py#L212).
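As an illustrative sketch of the formatting described above (the tag strings match the reference `chat_completion` code, but the helper name `build_prompt` is our own and not part of any library; `BOS`/`EOS` tokens are normally added by the tokenizer and are omitted here):

```python
# Hypothetical sketch of the Llama-2-Chat single-turn prompt format.
B_INST, E_INST = "[INST]", "[/INST]"
B_SYS, E_SYS = "<<SYS>>\n", "\n<</SYS>>\n\n"

def build_prompt(system: str, user: str) -> str:
    """Wrap a system message and one user turn in the chat formatting tags.

    Inputs are stripped, as recommended, to avoid double spaces around tags.
    """
    return f"{B_INST} {B_SYS}{system.strip()}{E_SYS}{user.strip()} {E_INST}"

prompt = build_prompt("You are a helpful assistant.", "What is GQA?")
print(prompt)
```

Multi-turn dialogues repeat the `[INST] ... [/INST]` wrapping for each user turn, with the system message folded into the first turn only.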
**Out-of-scope Uses** Use in any manner that violates applicable laws or regulations (including trade compliance laws). Use in languages other than English. Use in any other way that is prohibited by the Acceptable Use Policy and Licensing Agreement for Llama 2.
## Hardware and Software
**Training Factors** We used custom training libraries, Meta's Research Super Cluster, and production clusters for pretraining. Fine-tuning, annotation, and evaluation were also performed on third-party cloud compute.
**Carbon Footprint** Pretraining utilized a cumulative 3.3M GPU hours of computation on hardware of type A100-80GB (TDP of 350-400W). Estimated total emissions were 539 tCO2eq, 100% of which were offset by Meta’s sustainability program.
||Time (GPU hours)|Power Consumption (W)|Carbon Emitted (tCO<sub>2</sub>eq)|
|---|---|---|---|
|Llama 2 7B|184320|400|31.22|
|Llama 2 13B|368640|400|62.44|
|Llama 2 70B|1720320|400|291.42|
|Total|3311616||539.00|
**CO<sub>2</sub> emissions during pretraining.** Time: total GPU time required for training each model. Power Consumption: peak power capacity per GPU device for the GPUs used adjusted for power usage efficiency. 100% of the emissions are directly offset by Meta's sustainability program, and because we are openly releasing these models, the pretraining costs do not need to be incurred by others.
## Training Data
**Overview** Llama 2 was pretrained on 2 trillion tokens of data from publicly available sources. The fine-tuning data includes publicly available instruction datasets, as well as over one million new human-annotated examples. Neither the pretraining nor the fine-tuning datasets include Meta user data.
**Data Freshness** The pretraining data has a cutoff of September 2022, but some tuning data is more recent, up to July 2023.
## Evaluation Results
In this section, we report the results for the Llama 1 and Llama 2 models on standard academic benchmarks. For all the evaluations, we use our internal evaluations library.
|Model|Size|Code|Commonsense Reasoning|World Knowledge|Reading Comprehension|Math|MMLU|BBH|AGI Eval|
|---|---|---|---|---|---|---|---|---|---|
|Llama 1|7B|14.1|60.8|46.2|58.5|6.95|35.1|30.3|23.9|
|Llama 1|13B|18.9|66.1|52.6|62.3|10.9|46.9|37.0|33.9|
|Llama 1|33B|26.0|70.0|58.4|67.6|21.4|57.8|39.8|41.7|
|Llama 1|65B|30.7|70.7|60.5|68.6|30.8|63.4|43.5|47.6|
|Llama 2|7B|16.8|63.9|48.9|61.3|14.6|45.3|32.6|29.3|
|Llama 2|13B|24.5|66.9|55.4|65.8|28.7|54.8|39.4|39.1|
|Llama 2|70B|**37.5**|**71.9**|**63.6**|**69.4**|**35.2**|**68.9**|**51.2**|**54.2**|
**Overall performance on grouped academic benchmarks.** *Code:* We report the average pass@1 scores of our models on HumanEval and MBPP. *Commonsense Reasoning:* We report the average of PIQA, SIQA, HellaSwag, WinoGrande, ARC easy and challenge, OpenBookQA, and CommonsenseQA. We report 7-shot results for CommonSenseQA and 0-shot results for all other benchmarks. *World Knowledge:* We evaluate the 5-shot performance on NaturalQuestions and TriviaQA and report the average. *Reading Comprehension:* For reading comprehension, we report the 0-shot average on SQuAD, QuAC, and BoolQ. *MATH:* We report the average of the GSM8K (8 shot) and MATH (4 shot) benchmarks at top 1.
|||TruthfulQA|Toxigen|
|---|---|---|---|
|Llama 1|7B|27.42|23.00|
|Llama 1|13B|41.74|23.08|
|Llama 1|33B|44.19|22.57|
|Llama 1|65B|48.71|21.77|
|Llama 2|7B|33.29|**21.25**|
|Llama 2|13B|41.86|26.10|
|Llama 2|70B|**50.18**|24.60|
**Evaluation of pretrained LLMs on automatic safety benchmarks.** For TruthfulQA, we present the percentage of generations that are both truthful and informative (the higher the better). For ToxiGen, we present the percentage of toxic generations (the smaller the better).
|||TruthfulQA|Toxigen|
|---|---|---|---|
|Llama-2-Chat|7B|57.04|**0.00**|
|Llama-2-Chat|13B|62.18|**0.00**|
|Llama-2-Chat|70B|**64.14**|0.01|
**Evaluation of fine-tuned LLMs on different safety datasets.** Same metric definitions as above.
## Ethical Considerations and Limitations
Llama 2 is a new technology that carries risks with use. Testing conducted to date has been in English, and has not covered, nor could it cover all scenarios. For these reasons, as with all LLMs, Llama 2’s potential outputs cannot be predicted in advance, and the model may in some instances produce inaccurate, biased or other objectionable responses to user prompts. Therefore, before deploying any applications of Llama 2, developers should perform safety testing and tuning tailored to their specific applications of the model.
Please see the Responsible Use Guide available at [https://ai.meta.com/llama/responsible-use-guide/](https://ai.meta.com/llama/responsible-use-guide)
## Reporting Issues
Please report any software “bug,” or other problems with the models through one of the following means:
- Reporting issues with the model: [github.com/facebookresearch/llama](http://github.com/facebookresearch/llama)
- Reporting problematic content generated by the model: [developers.facebook.com/llama_output_feedback](http://developers.facebook.com/llama_output_feedback)
- Reporting bugs and security concerns: [facebook.com/whitehat/info](http://facebook.com/whitehat/info)
## Llama Model Index
|Model|Llama2|Llama2-hf|Llama2-chat|Llama2-chat-hf|
|---|---|---|---|---|
|7B| [Link](https://huggingface.co/llamaste/Llama-2-7b) | [Link](https://huggingface.co/llamaste/Llama-2-7b-hf) | [Link](https://huggingface.co/llamaste/Llama-2-7b-chat) | [Link](https://huggingface.co/llamaste/Llama-2-7b-chat-hf)|
|13B| [Link](https://huggingface.co/llamaste/Llama-2-13b) | [Link](https://huggingface.co/llamaste/Llama-2-13b-hf) | [Link](https://huggingface.co/llamaste/Llama-2-13b-chat) | [Link](https://huggingface.co/llamaste/Llama-2-13b-chat-hf)|
|70B| [Link](https://huggingface.co/llamaste/Llama-2-70b) | [Link](https://huggingface.co/llamaste/Llama-2-70b-hf) | [Link](https://huggingface.co/llamaste/Llama-2-70b-chat) | [Link](https://huggingface.co/llamaste/Llama-2-70b-chat-hf)|
| 10,137 | [
[
-0.0166168212890625,
-0.05279541015625,
0.02813720703125,
0.01468658447265625,
-0.0283660888671875,
0.01702880859375,
-0.003345489501953125,
-0.0562744140625,
0.004482269287109375,
0.0233917236328125,
-0.052886962890625,
-0.04302978515625,
-0.0501708984375,
0.006374359130859375,
-0.016815185546875,
0.08087158203125,
-0.0007309913635253906,
-0.0210113525390625,
-0.009613037109375,
0.006534576416015625,
-0.036346435546875,
-0.0299530029296875,
-0.0401611328125,
-0.0321044921875,
0.0302276611328125,
0.03594970703125,
0.045928955078125,
0.04888916015625,
0.040740966796875,
0.018280029296875,
-0.0194244384765625,
0.0163116455078125,
-0.052642822265625,
-0.0204315185546875,
0.00890350341796875,
-0.03680419921875,
-0.051177978515625,
0.01253509521484375,
0.0265655517578125,
0.0134124755859375,
-0.0222015380859375,
0.039520263671875,
0.004634857177734375,
0.035064697265625,
-0.042083740234375,
0.012664794921875,
-0.054443359375,
0.0027408599853515625,
-0.0162200927734375,
-0.006076812744140625,
-0.01409149169921875,
-0.022003173828125,
-0.01425933837890625,
-0.06256103515625,
-0.008331298828125,
0.006805419921875,
0.078369140625,
0.04901123046875,
-0.03424072265625,
-0.00901031494140625,
-0.0209197998046875,
0.0714111328125,
-0.0643310546875,
0.003635406494140625,
0.0438232421875,
0.0207366943359375,
-0.01702880859375,
-0.056915283203125,
-0.04888916015625,
-0.01079559326171875,
0.004550933837890625,
0.026885986328125,
-0.031280517578125,
0.0013751983642578125,
0.01384735107421875,
0.02880859375,
-0.04296875,
0.042572021484375,
-0.03839111328125,
-0.01263427734375,
0.07977294921875,
0.018280029296875,
0.00008749961853027344,
-0.004364013671875,
-0.037139892578125,
-0.0219268798828125,
-0.060455322265625,
0.013580322265625,
0.036956787109375,
-0.002986907958984375,
-0.034881591796875,
0.0465087890625,
-0.0308685302734375,
0.022430419921875,
0.00131988525390625,
-0.03778076171875,
0.036041259765625,
-0.036041259765625,
-0.0210113525390625,
-0.0088958740234375,
0.0675048828125,
0.055084228515625,
0.0115814208984375,
0.0083160400390625,
-0.005107879638671875,
0.00812530517578125,
-0.0007824897766113281,
-0.060943603515625,
-0.00354766845703125,
0.0178680419921875,
-0.028106689453125,
-0.043853759765625,
-0.0228118896484375,
-0.05621337890625,
-0.01238250732421875,
-0.007343292236328125,
0.0184783935546875,
-0.002429962158203125,
-0.028717041015625,
0.0092010498046875,
0.0038890838623046875,
0.041534423828125,
0.01611328125,
-0.07098388671875,
0.0165863037109375,
0.041717529296875,
0.05914306640625,
-0.0182647705078125,
-0.026763916015625,
0.000629425048828125,
-0.001739501953125,
-0.0247344970703125,
0.06787109375,
-0.0256500244140625,
-0.041534423828125,
-0.0167236328125,
-0.0015859603881835938,
0.0122222900390625,
-0.039093017578125,
0.033050537109375,
-0.0287628173828125,
0.01279449462890625,
-0.0250244140625,
-0.027313232421875,
-0.02606201171875,
0.01428985595703125,
-0.030059814453125,
0.10906982421875,
0.0085601806640625,
-0.036163330078125,
0.0234527587890625,
-0.051300048828125,
-0.01371002197265625,
-0.015228271484375,
0.00738525390625,
-0.03973388671875,
-0.0201873779296875,
0.00982666015625,
0.027862548828125,
-0.0482177734375,
0.0367431640625,
-0.0148162841796875,
-0.032867431640625,
0.002895355224609375,
-0.031280517578125,
0.063720703125,
0.0220489501953125,
-0.03424072265625,
0.00556182861328125,
-0.0626220703125,
0.00397491455078125,
0.0340576171875,
-0.036224365234375,
0.0197906494140625,
0.005680084228515625,
-0.00908660888671875,
0.01428985595703125,
0.03643798828125,
-0.027191162109375,
0.01238250732421875,
-0.0235748291015625,
0.03778076171875,
0.056365966796875,
0.004123687744140625,
0.01218414306640625,
-0.039093017578125,
0.039581298828125,
-0.002902984619140625,
0.0289306640625,
0.0008225440979003906,
-0.05426025390625,
-0.07745361328125,
-0.0140533447265625,
-0.0033721923828125,
0.0635986328125,
-0.0188140869140625,
0.052001953125,
-0.0004534721374511719,
-0.055328369140625,
-0.03192138671875,
0.027374267578125,
0.050750732421875,
0.037750244140625,
0.032257080078125,
-0.021331787109375,
-0.0457763671875,
-0.076904296875,
0.004039764404296875,
-0.03369140625,
-0.0010013580322265625,
0.027313232421875,
0.04925537109375,
-0.025421142578125,
0.054962158203125,
-0.040557861328125,
-0.01287078857421875,
-0.019989013671875,
-0.0088958740234375,
0.005451202392578125,
0.0258331298828125,
0.04876708984375,
-0.029510498046875,
-0.01568603515625,
-0.00936126708984375,
-0.0679931640625,
-0.0079193115234375,
0.0074615478515625,
-0.01532745361328125,
0.01800537109375,
0.0237579345703125,
-0.046051025390625,
0.034393310546875,
0.05364990234375,
-0.01346588134765625,
0.038848876953125,
-0.0002391338348388672,
-0.01306915283203125,
-0.0811767578125,
0.00298309326171875,
-0.0150299072265625,
0.0018138885498046875,
-0.032623291015625,
-0.0024929046630859375,
-0.0162811279296875,
0.005054473876953125,
-0.04644775390625,
0.044647216796875,
-0.023162841796875,
-0.01258087158203125,
-0.00927734375,
0.003665924072265625,
0.004657745361328125,
0.046600341796875,
-0.00933074951171875,
0.08148193359375,
0.0301513671875,
-0.04449462890625,
0.0198516845703125,
0.0300140380859375,
-0.037689208984375,
0.01132965087890625,
-0.06573486328125,
0.027374267578125,
0.008270263671875,
0.03973388671875,
-0.072509765625,
-0.0282745361328125,
0.024566650390625,
-0.032562255859375,
0.006504058837890625,
0.0180511474609375,
-0.04156494140625,
-0.0299835205078125,
-0.03179931640625,
0.02313232421875,
0.0606689453125,
-0.03436279296875,
0.01392364501953125,
0.0283050537109375,
0.0014705657958984375,
-0.051666259765625,
-0.06317138671875,
0.0050506591796875,
-0.0277252197265625,
-0.04022216796875,
0.0228729248046875,
-0.01450347900390625,
-0.0184783935546875,
-0.0193939208984375,
0.00533294677734375,
-0.0009593963623046875,
0.02899169921875,
0.02764892578125,
0.0279388427734375,
-0.00882720947265625,
-0.001956939697265625,
0.0107269287109375,
-0.01546478271484375,
0.0032501220703125,
0.0169677734375,
0.043914794921875,
-0.01317596435546875,
-0.0167236328125,
-0.055694580078125,
0.0038433074951171875,
0.0220184326171875,
-0.0191497802734375,
0.045623779296875,
0.031829833984375,
-0.0167388916015625,
0.017669677734375,
-0.05743408203125,
-0.0079803466796875,
-0.0400390625,
0.04071044921875,
-0.0160675048828125,
-0.061920166015625,
0.04034423828125,
-0.0006680488586425781,
0.033111572265625,
0.05633544921875,
0.046112060546875,
-0.006259918212890625,
0.0614013671875,
0.043060302734375,
-0.005886077880859375,
0.0252685546875,
-0.037322998046875,
-0.006626129150390625,
-0.07159423828125,
-0.04644775390625,
-0.0240631103515625,
-0.03216552734375,
-0.050079345703125,
-0.03143310546875,
0.02001953125,
0.01456451416015625,
-0.050994873046875,
0.0241851806640625,
-0.043304443359375,
0.04290771484375,
0.04034423828125,
0.00968170166015625,
0.0233917236328125,
0.007457733154296875,
0.01068878173828125,
0.004364013671875,
-0.03973388671875,
-0.055572509765625,
0.1103515625,
0.031707763671875,
0.033905029296875,
0.00759124755859375,
0.051300048828125,
0.01114654541015625,
0.024261474609375,
-0.054107666015625,
0.049713134765625,
0.0036449432373046875,
-0.053955078125,
-0.01142120361328125,
-0.008544921875,
-0.0673828125,
0.01082611083984375,
-0.0157623291015625,
-0.058837890625,
0.002391815185546875,
-0.002285003662109375,
-0.028900146484375,
0.0211944580078125,
-0.05035400390625,
0.04541015625,
-0.0426025390625,
-0.023529052734375,
-0.02716064453125,
-0.059478759765625,
0.051025390625,
-0.01515960693359375,
0.007328033447265625,
-0.037322998046875,
-0.0193634033203125,
0.0670166015625,
-0.025177001953125,
0.07550048828125,
-0.00312042236328125,
-0.0077972412109375,
0.0435791015625,
-0.01396942138671875,
0.0335693359375,
0.002788543701171875,
-0.02099609375,
0.0506591796875,
-0.009735107421875,
-0.0234222412109375,
-0.01143646240234375,
0.040435791015625,
-0.090576171875,
-0.06011962890625,
-0.0386962890625,
-0.03857421875,
-0.00191497802734375,
0.006504058837890625,
0.038299560546875,
-0.00667572021484375,
-0.0027313232421875,
0.00917816162109375,
0.034454345703125,
-0.039276123046875,
0.035797119140625,
0.04156494140625,
-0.00778961181640625,
-0.034881591796875,
0.049591064453125,
0.003391265869140625,
0.027374267578125,
0.01611328125,
0.0034236907958984375,
-0.0309295654296875,
-0.0318603515625,
-0.038848876953125,
0.020660400390625,
-0.035125732421875,
-0.036529541015625,
-0.04010009765625,
-0.026824951171875,
-0.0242156982421875,
-0.005702972412109375,
-0.0323486328125,
-0.03216552734375,
-0.056121826171875,
-0.029571533203125,
0.0390625,
0.061492919921875,
-0.0003104209899902344,
0.04730224609375,
-0.0243988037109375,
0.01403045654296875,
0.02813720703125,
0.013153076171875,
-0.0028705596923828125,
-0.05657958984375,
0.0041656494140625,
0.00933837890625,
-0.05743408203125,
-0.04632568359375,
0.018585205078125,
0.0207672119140625,
0.03594970703125,
0.035369873046875,
-0.005718231201171875,
0.058624267578125,
-0.0262908935546875,
0.08343505859375,
0.027191162109375,
-0.050262451171875,
0.0528564453125,
-0.016632080078125,
0.00432586669921875,
0.04766845703125,
0.0203704833984375,
-0.006946563720703125,
-0.01226043701171875,
-0.047698974609375,
-0.05108642578125,
0.06048583984375,
0.0171966552734375,
0.013641357421875,
0.0038585662841796875,
0.034515380859375,
0.00402069091796875,
0.00821685791015625,
-0.0628662109375,
-0.02313232421875,
-0.0195770263671875,
-0.007534027099609375,
-0.0152587890625,
-0.037933349609375,
-0.005584716796875,
-0.0236358642578125,
0.0482177734375,
0.003513336181640625,
0.026092529296875,
-0.0095367431640625,
0.00162506103515625,
-0.00785064697265625,
0.003757476806640625,
0.054901123046875,
0.037872314453125,
-0.01922607421875,
-0.01152801513671875,
0.04888916015625,
-0.04705810546875,
0.025299072265625,
0.0008206367492675781,
-0.010162353515625,
-0.0277252197265625,
0.0301513671875,
0.066650390625,
0.0201568603515625,
-0.053741455078125,
0.0250396728515625,
0.0099334716796875,
-0.0277862548828125,
-0.03179931640625,
0.027557373046875,
0.00643157958984375,
0.0244140625,
0.019805908203125,
-0.01010894775390625,
0.0072479248046875,
-0.038909912109375,
-0.00968170166015625,
0.029083251953125,
0.00913238525390625,
-0.031280517578125,
0.0745849609375,
0.024444580078125,
-0.022003173828125,
0.03997802734375,
-0.012603759765625,
-0.0272216796875,
0.0679931640625,
0.047576904296875,
0.04913330078125,
-0.0209197998046875,
0.00921630859375,
0.05377197265625,
0.033721923828125,
-0.017486572265625,
0.017578125,
-0.0007739067077636719,
-0.03717041015625,
-0.0157470703125,
-0.05206298828125,
-0.03558349609375,
0.0269775390625,
-0.04302978515625,
0.023590087890625,
-0.047576904296875,
-0.0202178955078125,
-0.0241851806640625,
0.0341796875,
-0.050537109375,
0.016204833984375,
0.0084686279296875,
0.06927490234375,
-0.05426025390625,
0.057159423828125,
0.037506103515625,
-0.038482666015625,
-0.06689453125,
-0.0220947265625,
0.01519775390625,
-0.0919189453125,
0.0394287109375,
0.0284881591796875,
-0.005626678466796875,
0.0093231201171875,
-0.056365966796875,
-0.09124755859375,
0.1275634765625,
0.034515380859375,
-0.056732177734375,
-0.0013580322265625,
0.0255126953125,
0.03656005859375,
-0.0084991455078125,
0.0333251953125,
0.062103271484375,
0.03704833984375,
0.0084686279296875,
-0.07977294921875,
0.0067596435546875,
-0.0262908935546875,
-0.0027179718017578125,
-0.01465606689453125,
-0.09832763671875,
0.060943603515625,
-0.029754638671875,
-0.018035888671875,
0.0167694091796875,
0.047821044921875,
0.051483154296875,
0.04217529296875,
0.026611328125,
0.060089111328125,
0.0689697265625,
-0.0019512176513671875,
0.08428955078125,
-0.0272674560546875,
0.0131683349609375,
0.0667724609375,
-0.022003173828125,
0.07269287109375,
0.0178985595703125,
-0.04425048828125,
0.046112060546875,
0.07623291015625,
-0.001445770263671875,
0.0443115234375,
0.0050048828125,
-0.013641357421875,
-0.01422882080078125,
-0.01381683349609375,
-0.048614501953125,
0.03875732421875,
0.0182647705078125,
-0.01123809814453125,
-0.0020732879638671875,
-0.025177001953125,
0.017181396484375,
-0.024688720703125,
-0.0012445449829101562,
0.0606689453125,
0.01291656494140625,
-0.046722412109375,
0.06622314453125,
0.0034732818603515625,
0.06317138671875,
-0.048980712890625,
0.006114959716796875,
-0.03912353515625,
0.00020742416381835938,
-0.027801513671875,
-0.053131103515625,
0.005489349365234375,
0.0277862548828125,
-0.0009822845458984375,
-0.00872802734375,
0.041168212890625,
0.0031757354736328125,
-0.04266357421875,
0.02691650390625,
0.0204010009765625,
0.0267181396484375,
0.01611328125,
-0.05157470703125,
0.01287078857421875,
0.0072021484375,
-0.040374755859375,
0.028289794921875,
0.002410888671875,
-0.004894256591796875,
0.0606689453125,
0.05572509765625,
-0.01528167724609375,
0.010406494140625,
-0.015228271484375,
0.07525634765625,
-0.037994384765625,
-0.014556884765625,
-0.0567626953125,
0.03997802734375,
0.003414154052734375,
-0.053619384765625,
0.041046142578125,
0.049163818359375,
0.052642822265625,
0.0208282470703125,
0.049041748046875,
0.00624847412109375,
0.0234527587890625,
-0.04052734375,
0.046142578125,
-0.058807373046875,
0.0284423828125,
0.006061553955078125,
-0.0736083984375,
-0.004749298095703125,
0.05126953125,
-0.018341064453125,
0.0032958984375,
0.0285186767578125,
0.06451416015625,
0.013336181640625,
-0.0116119384765625,
0.010009765625,
0.01265716552734375,
0.0262298583984375,
0.066650390625,
0.0635986328125,
-0.04718017578125,
0.05279541015625,
-0.0287933349609375,
-0.017852783203125,
-0.01995849609375,
-0.05572509765625,
-0.072509765625,
-0.02001953125,
-0.0187225341796875,
-0.0109100341796875,
0.005100250244140625,
0.056396484375,
0.03814697265625,
-0.0438232421875,
-0.0226593017578125,
-0.0060882568359375,
-0.00646209716796875,
0.002750396728515625,
-0.01197052001953125,
0.0245208740234375,
-0.00823211669921875,
-0.044586181640625,
0.03729248046875,
0.0005931854248046875,
0.01451873779296875,
-0.0247802734375,
-0.0204315185546875,
-0.01537322998046875,
0.01125335693359375,
0.045318603515625,
0.0213165283203125,
-0.0699462890625,
-0.0175628662109375,
0.0032672882080078125,
-0.01055908203125,
0.00933074951171875,
0.001708984375,
-0.0579833984375,
0.00799560546875,
0.01055908203125,
0.02838134765625,
0.049591064453125,
0.004589080810546875,
0.005767822265625,
-0.037750244140625,
0.03350830078125,
0.0014810562133789062,
0.0116729736328125,
0.0231170654296875,
-0.03216552734375,
0.0595703125,
0.01102447509765625,
-0.05279541015625,
-0.0706787109375,
0.0089263916015625,
-0.07867431640625,
0.00007599592208862305,
0.10369873046875,
0.00021445751190185547,
-0.00927734375,
0.0145416259765625,
-0.016021728515625,
0.0285797119140625,
-0.0294036865234375,
0.060455322265625,
0.0430908203125,
-0.006732940673828125,
-0.007904052734375,
-0.060302734375,
0.026123046875,
0.029815673828125,
-0.08203125,
-0.018768310546875,
0.03460693359375,
0.036346435546875,
-0.0068206787109375,
0.05181884765625,
0.0015859603881835938,
0.01776123046875,
0.00495147705078125,
0.007965087890625,
-0.018341064453125,
-0.0116424560546875,
-0.00817108154296875,
-0.020904541015625,
-0.00406646728515625,
-0.0164642333984375
]
] |
emilianJR/CyberRealistic_V3 | 2023-05-25T12:51:41.000Z | [
"diffusers",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-image",
"en",
"license:creativeml-openrail-m",
"endpoints_compatible",
"has_space",
"diffusers:StableDiffusionPipeline",
"region:us"
] | text-to-image | emilianJR | null | null | emilianJR/CyberRealistic_V3 | 13 | 6,921 | diffusers | 2023-05-24T10:39:28 | ---
language:
- en
license: creativeml-openrail-m
tags:
- stable-diffusion
- stable-diffusion-diffusers
- text-to-image
- diffusers
inference: true
---
Diffusers model for this Stable Diffusion checkpoint:
https://civitai.com/models/15003/cyberrealistic
**emilianJR/CyberRealistic_V3** is a Hugging Face Diffusers checkpoint that you can load with **diffusers.StableDiffusionPipeline**.
Examples | Examples | Examples
---- | ---- | ----
 |  | 
 |  | 
-------
## 🧨 Diffusers
This model can be used just like any other Stable Diffusion model. For more information,
please take a look at the [Stable Diffusion documentation](https://huggingface.co/docs/diffusers/api/pipelines/stable_diffusion).
```python
from diffusers import StableDiffusionPipeline
import torch
model_id = "emilianJR/CyberRealistic_V3"
pipe = StableDiffusionPipeline.from_pretrained(model_id, torch_dtype=torch.float16)
pipe = pipe.to("cuda")
prompt = "YOUR PROMPT"
image = pipe(prompt).images[0]
image.save("image.png")
```
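Generation is stochastic by default. To reproduce a specific image, you can pass a seeded `torch.Generator` to the pipeline call (the `generator` argument is standard in Diffusers pipelines). The snippet below is a minimal sketch of the seeding mechanism itself, independent of any model download:

```python
import torch

def seeded_generator(seed: int, device: str = "cpu") -> torch.Generator:
    # A generator seeded with the same value always yields the same
    # random stream, which makes diffusion outputs reproducible.
    return torch.Generator(device).manual_seed(seed)

# Same seed -> identical initial latent noise, and therefore identical
# images when the prompt and pipeline settings are unchanged.
a = torch.randn(4, generator=seeded_generator(42))
b = torch.randn(4, generator=seeded_generator(42))
print(bool(torch.equal(a, b)))

# In a real run you would pass the generator to the pipeline, e.g.:
# image = pipe("YOUR PROMPT", generator=seeded_generator(42)).images[0]
```

Keeping the seed alongside the prompt lets you regenerate or iterate on a result later.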
## License
This model is open access and available to all, with a CreativeML OpenRAIL-M license further specifying rights and usage.
The CreativeML OpenRAIL-M License specifies those rights and usage terms in detail:
[Please read the full license here](https://huggingface.co/spaces/CompVis/stable-diffusion-license) | 1,901 | [
[
-0.0482177734375,
-0.048126220703125,
0.043914794921875,
0.0335693359375,
-0.0207061767578125,
-0.00555419921875,
0.0180511474609375,
-0.0097808837890625,
0.032989501953125,
0.0311431884765625,
-0.056732177734375,
-0.0362548828125,
-0.039764404296875,
-0.00922393798828125,
-0.0223388671875,
0.0673828125,
-0.025421142578125,
0.003612518310546875,
-0.00678253173828125,
-0.002902984619140625,
-0.0144805908203125,
-0.004337310791015625,
-0.057342529296875,
-0.01186370849609375,
0.0268096923828125,
-0.007232666015625,
0.048065185546875,
0.0191497802734375,
0.019744873046875,
0.0249786376953125,
-0.0382080078125,
-0.005615234375,
-0.028839111328125,
-0.000009000301361083984,
-0.003139495849609375,
-0.01204681396484375,
-0.057403564453125,
0.007808685302734375,
0.0506591796875,
0.021759033203125,
-0.031280517578125,
0.0004782676696777344,
0.00902557373046875,
0.027069091796875,
-0.037384033203125,
-0.018890380859375,
-0.00206756591796875,
0.0008554458618164062,
-0.00957489013671875,
0.00359344482421875,
0.0011510848999023438,
-0.040283203125,
0.024322509765625,
-0.05999755859375,
0.0230255126953125,
-0.0167388916015625,
0.0972900390625,
0.025238037109375,
-0.023773193359375,
-0.007049560546875,
-0.056854248046875,
0.03973388671875,
-0.03924560546875,
0.0249786376953125,
0.0033702850341796875,
0.0309295654296875,
0.004886627197265625,
-0.079833984375,
-0.04248046875,
0.02252197265625,
0.0033111572265625,
0.033294677734375,
-0.0087127685546875,
0.018890380859375,
0.030181884765625,
0.0257110595703125,
-0.049285888671875,
-0.01387786865234375,
-0.054168701171875,
-0.006900787353515625,
0.033355712890625,
0.0246429443359375,
0.0139007568359375,
-0.0036163330078125,
-0.05419921875,
-0.0034122467041015625,
-0.0227203369140625,
0.0005593299865722656,
0.00482940673828125,
-0.02618408203125,
-0.045440673828125,
0.02899169921875,
-0.0031948089599609375,
0.04425048828125,
0.0290069580078125,
-0.0066375732421875,
0.0215911865234375,
-0.019805908203125,
-0.024444580078125,
-0.0252838134765625,
0.051788330078125,
0.0484619140625,
0.0048370361328125,
0.01324462890625,
-0.00104522705078125,
0.007427215576171875,
0.02435302734375,
-0.1009521484375,
-0.026947021484375,
0.044189453125,
-0.034454345703125,
-0.02410888671875,
0.0060577392578125,
-0.0609130859375,
-0.01154327392578125,
-0.0038890838623046875,
0.01367950439453125,
-0.025634765625,
-0.0498046875,
0.005893707275390625,
-0.0272979736328125,
0.027679443359375,
0.048095703125,
-0.0279693603515625,
0.009185791015625,
0.0233154296875,
0.09906005859375,
-0.00496673583984375,
0.00395965576171875,
-0.01140594482421875,
0.0262451171875,
-0.0088958740234375,
0.05670166015625,
-0.01776123046875,
-0.041259765625,
-0.007175445556640625,
0.0222625732421875,
-0.0108642578125,
-0.034515380859375,
0.050445556640625,
-0.037567138671875,
0.01467132568359375,
-0.0284576416015625,
-0.04095458984375,
-0.01837158203125,
0.005054473876953125,
-0.03887939453125,
0.057861328125,
0.0233917236328125,
-0.074462890625,
0.03271484375,
-0.043701171875,
0.0167388916015625,
0.0233306884765625,
-0.01190185546875,
-0.04901123046875,
-0.00494384765625,
-0.0203399658203125,
0.0401611328125,
0.001068115234375,
0.01178741455078125,
-0.049468994140625,
-0.02264404296875,
-0.01529693603515625,
-0.006256103515625,
0.09814453125,
0.028900146484375,
-0.0287017822265625,
0.016876220703125,
-0.037353515625,
0.0002276897430419922,
0.0023212432861328125,
-0.0172271728515625,
-0.00525665283203125,
-0.0265045166015625,
0.0325927734375,
0.039154052734375,
-0.001049041748046875,
-0.06280517578125,
0.0117034912109375,
-0.0005879402160644531,
0.028961181640625,
0.058135986328125,
0.025115966796875,
0.0311431884765625,
-0.036651611328125,
0.050750732421875,
0.03192138671875,
0.01824951171875,
0.021148681640625,
-0.040252685546875,
-0.05975341796875,
-0.055419921875,
0.002132415771484375,
0.035797119140625,
-0.06427001953125,
0.02423095703125,
0.0034046173095703125,
-0.055511474609375,
-0.039581298828125,
0.0018863677978515625,
0.0020008087158203125,
0.0428466796875,
0.0113067626953125,
-0.033294677734375,
-0.020782470703125,
-0.05194091796875,
0.02093505859375,
0.0017766952514648438,
-0.007965087890625,
0.005054473876953125,
0.035736083984375,
-0.0251312255859375,
0.0654296875,
-0.049285888671875,
-0.028900146484375,
-0.002033233642578125,
-0.00400543212890625,
0.029449462890625,
0.055206298828125,
0.06787109375,
-0.06341552734375,
-0.051422119140625,
-0.0167388916015625,
-0.06561279296875,
-0.0093536376953125,
-0.013031005859375,
-0.0340576171875,
0.005542755126953125,
0.03192138671875,
-0.07568359375,
0.058807373046875,
0.047027587890625,
-0.057525634765625,
0.04107666015625,
-0.0262451171875,
0.006023406982421875,
-0.08984375,
0.0248260498046875,
0.022705078125,
-0.0419921875,
-0.0556640625,
0.01318359375,
0.0038509368896484375,
0.004657745361328125,
-0.05242919921875,
0.0660400390625,
-0.035186767578125,
0.035858154296875,
-0.033355712890625,
0.007396697998046875,
0.00836944580078125,
0.0183563232421875,
0.0308685302734375,
0.03533935546875,
0.060302734375,
-0.05474853515625,
0.029937744140625,
0.026153564453125,
-0.020660400390625,
0.07403564453125,
-0.073486328125,
-0.010284423828125,
-0.033447265625,
0.04150390625,
-0.0738525390625,
-0.01537322998046875,
0.036865234375,
-0.0196075439453125,
0.0174713134765625,
-0.018798828125,
-0.0184783935546875,
-0.020904541015625,
-0.0161590576171875,
0.035736083984375,
0.05303955078125,
-0.03790283203125,
0.053009033203125,
0.0167236328125,
0.004970550537109375,
-0.0142059326171875,
-0.04833984375,
-0.034454345703125,
-0.049285888671875,
-0.06524658203125,
0.041900634765625,
-0.033538818359375,
-0.013671875,
-0.01220703125,
0.0030574798583984375,
-0.01340484619140625,
-0.0021877288818359375,
0.039276123046875,
0.03924560546875,
-0.0032711029052734375,
-0.05181884765625,
0.026031494140625,
-0.029876708984375,
0.01021575927734375,
-0.01268768310546875,
0.03497314453125,
-0.00574493408203125,
-0.0183868408203125,
-0.06671142578125,
0.0209197998046875,
0.05908203125,
0.0172576904296875,
0.090576171875,
0.06866455078125,
-0.046142578125,
0.00725555419921875,
-0.0411376953125,
-0.003917694091796875,
-0.04046630859375,
-0.005947113037109375,
-0.0341796875,
-0.03759765625,
0.062469482421875,
-0.0030155181884765625,
0.027618408203125,
0.031829833984375,
0.0560302734375,
-0.016387939453125,
0.06634521484375,
0.04217529296875,
0.0228424072265625,
0.0404052734375,
-0.0626220703125,
-0.005207061767578125,
-0.080810546875,
-0.0310516357421875,
-0.0221405029296875,
-0.0200653076171875,
-0.0244598388671875,
-0.034027099609375,
0.04400634765625,
0.0411376953125,
-0.0355224609375,
0.03179931640625,
-0.0325927734375,
0.02117919921875,
0.014801025390625,
0.0144805908203125,
0.011871337890625,
-0.01519775390625,
-0.0230712890625,
-0.0023136138916015625,
-0.033782958984375,
-0.0360107421875,
0.0283660888671875,
0.0305328369140625,
0.045867919921875,
0.0260467529296875,
0.056976318359375,
0.0010366439819335938,
0.0286712646484375,
-0.0200347900390625,
0.0279388427734375,
0.0157623291015625,
-0.06597900390625,
0.0111236572265625,
-0.020477294921875,
-0.050445556640625,
0.0262908935546875,
-0.00974273681640625,
-0.051116943359375,
0.0443115234375,
0.0204315185546875,
-0.03546142578125,
0.032012939453125,
-0.044189453125,
0.05401611328125,
0.016082763671875,
-0.052886962890625,
-0.00010401010513305664,
-0.038970947265625,
0.0301361083984375,
0.0147705078125,
0.0027866363525390625,
-0.0004432201385498047,
-0.01238250732421875,
0.0364990234375,
-0.055999755859375,
0.0494384765625,
-0.05206298828125,
-0.00927734375,
0.0289306640625,
0.00829315185546875,
0.02264404296875,
0.0274505615234375,
-0.0185089111328125,
0.00922393798828125,
0.031829833984375,
-0.053619384765625,
-0.029205322265625,
0.05889892578125,
-0.054718017578125,
-0.0214691162109375,
-0.042694091796875,
-0.0125885009765625,
0.0242462158203125,
0.014678955078125,
0.051025390625,
0.0138092041015625,
-0.015838623046875,
-0.0140838623046875,
0.0572509765625,
-0.004650115966796875,
0.036712646484375,
0.0185699462890625,
-0.0251617431640625,
-0.03485107421875,
0.047607421875,
0.00433349609375,
0.05047607421875,
-0.017242431640625,
0.003253936767578125,
-0.0105743408203125,
-0.033782958984375,
-0.05078125,
0.039703369140625,
-0.041351318359375,
-0.0184326171875,
-0.055145263671875,
-0.044586181640625,
-0.0184783935546875,
-0.04534912109375,
-0.02178955078125,
-0.022064208984375,
-0.044403076171875,
0.00252532958984375,
0.045989990234375,
0.04180908203125,
-0.0183563232421875,
0.0335693359375,
-0.038177490234375,
0.02252197265625,
0.00653839111328125,
0.0248260498046875,
0.0102996826171875,
-0.04376220703125,
0.00588226318359375,
0.00366973876953125,
-0.038360595703125,
-0.059112548828125,
0.037628173828125,
0.005268096923828125,
0.028472900390625,
0.05169677734375,
-0.0008912086486816406,
0.07403564453125,
-0.0185546875,
0.055938720703125,
0.0227203369140625,
-0.0560302734375,
0.0307159423828125,
-0.04833984375,
0.026031494140625,
0.03955078125,
0.046661376953125,
-0.04217529296875,
-0.0268096923828125,
-0.058197021484375,
-0.053009033203125,
0.03900146484375,
0.0282135009765625,
-0.004016876220703125,
0.007808685302734375,
0.048919677734375,
-0.0036411285400390625,
0.0016775131225585938,
-0.051422119140625,
-0.0300140380859375,
-0.01605224609375,
0.001224517822265625,
0.0222625732421875,
0.0020999908447265625,
-0.0146942138671875,
-0.03009033203125,
0.06591796875,
-0.0025482177734375,
0.029754638671875,
0.0364990234375,
0.01380157470703125,
-0.0172576904296875,
-0.0274505615234375,
0.038421630859375,
0.044769287109375,
-0.0304718017578125,
-0.023223876953125,
0.002094268798828125,
-0.043975830078125,
0.006153106689453125,
0.0034923553466796875,
-0.0310516357421875,
0.0016450881958007812,
-0.0035247802734375,
0.040252685546875,
-0.0172882080078125,
-0.01555633544921875,
0.052490234375,
-0.026580810546875,
-0.02630615234375,
-0.041015625,
0.0020599365234375,
0.035003662109375,
0.039581298828125,
-0.003875732421875,
0.0287017822265625,
0.010284423828125,
-0.013031005859375,
0.006160736083984375,
0.04217529296875,
-0.032379150390625,
-0.02166748046875,
0.083984375,
0.006927490234375,
-0.0022983551025390625,
0.050323486328125,
-0.01165771484375,
-0.025482177734375,
0.031524658203125,
0.035858154296875,
0.0767822265625,
-0.026580810546875,
0.0279388427734375,
0.043792724609375,
-0.0027523040771484375,
-0.007091522216796875,
0.03399658203125,
0.0019626617431640625,
-0.030975341796875,
0.001308441162109375,
-0.054718017578125,
-0.017120361328125,
-0.0099334716796875,
-0.0589599609375,
0.05352783203125,
-0.0670166015625,
-0.015533447265625,
-0.00695037841796875,
-0.00897979736328125,
-0.05523681640625,
0.0099639892578125,
0.005596160888671875,
0.08636474609375,
-0.0684814453125,
0.066162109375,
0.03131103515625,
-0.0294036865234375,
-0.037994384765625,
-0.0007214546203613281,
0.015167236328125,
-0.034942626953125,
0.006145477294921875,
-0.0004470348358154297,
-0.016998291015625,
-0.005405426025390625,
-0.0360107421875,
-0.0706787109375,
0.10894775390625,
0.02105712890625,
-0.0296478271484375,
-0.0141143798828125,
-0.0296630859375,
0.037353515625,
-0.0275726318359375,
0.04852294921875,
0.021484375,
0.025421142578125,
0.0443115234375,
-0.053131103515625,
0.0184326171875,
-0.022552490234375,
0.029449462890625,
0.006427764892578125,
-0.0836181640625,
0.0565185546875,
-0.01239776611328125,
-0.0216217041015625,
0.04583740234375,
0.06292724609375,
0.0482177734375,
0.01763916015625,
0.04095458984375,
0.057281494140625,
0.03631591796875,
-0.01129913330078125,
0.07379150390625,
-0.00423431396484375,
0.042266845703125,
0.037322998046875,
-0.0126800537109375,
0.042724609375,
0.0290985107421875,
-0.0010919570922851562,
0.060791015625,
0.05645751953125,
0.01012420654296875,
0.03875732421875,
0.0158538818359375,
-0.039947509765625,
-0.01204681396484375,
0.002857208251953125,
-0.031646728515625,
-0.00292205810546875,
0.02862548828125,
-0.0183563232421875,
-0.0132904052734375,
0.0102996826171875,
0.01190185546875,
-0.0213470458984375,
-0.00968170166015625,
0.041656494140625,
0.0032901763916015625,
-0.0266265869140625,
0.0531005859375,
-0.003177642822265625,
0.063232421875,
-0.05126953125,
-0.0022983551025390625,
-0.0007390975952148438,
0.0265655517578125,
-0.041107177734375,
-0.055145263671875,
0.03717041015625,
-0.002010345458984375,
0.0005097389221191406,
-0.0261077880859375,
0.0311431884765625,
-0.021759033203125,
-0.04669189453125,
0.0288238525390625,
0.01169586181640625,
0.03021240234375,
0.017364501953125,
-0.0684814453125,
0.0292205810546875,
-0.0009541511535644531,
-0.0279388427734375,
0.0176544189453125,
0.01189422607421875,
0.0229339599609375,
0.047698974609375,
0.0190277099609375,
0.01439666748046875,
0.0139007568359375,
-0.0202484130859375,
0.082763671875,
-0.033111572265625,
-0.035064697265625,
-0.0430908203125,
0.07183837890625,
-0.0166168212890625,
-0.0364990234375,
0.04681396484375,
0.041046142578125,
0.05908203125,
-0.003887176513671875,
0.048065185546875,
-0.02655029296875,
0.035125732421875,
-0.03631591796875,
0.06756591796875,
-0.051361083984375,
0.002651214599609375,
-0.049713134765625,
-0.08062744140625,
-0.007232666015625,
0.07763671875,
0.00400543212890625,
0.02447509765625,
0.03765869140625,
0.06585693359375,
-0.0266265869140625,
0.0011653900146484375,
-0.002826690673828125,
0.018707275390625,
0.032958984375,
0.02374267578125,
0.044189453125,
-0.042449951171875,
0.01142120361328125,
-0.033660888671875,
-0.037261962890625,
0.01500701904296875,
-0.056732177734375,
-0.07293701171875,
-0.058929443359375,
-0.0509033203125,
-0.06988525390625,
-0.02532958984375,
0.057281494140625,
0.0797119140625,
-0.050872802734375,
-0.01053619384765625,
-0.019256591796875,
-0.0040283203125,
-0.0006036758422851562,
-0.02337646484375,
0.01425933837890625,
0.0357666015625,
-0.078857421875,
-0.0090484619140625,
0.0023136138916015625,
0.042510986328125,
-0.03558349609375,
-0.021484375,
-0.024261474609375,
-0.0047149658203125,
0.0117950439453125,
0.02313232421875,
-0.04754638671875,
-0.0145416259765625,
-0.022796630859375,
0.004489898681640625,
0.0007185935974121094,
0.03985595703125,
-0.0572509765625,
0.01611328125,
0.039276123046875,
0.01120758056640625,
0.06866455078125,
0.007080078125,
0.0283203125,
-0.0210723876953125,
0.0171661376953125,
0.01271820068359375,
0.0250244140625,
-0.00028896331787109375,
-0.0328369140625,
0.022857666015625,
0.0224151611328125,
-0.053955078125,
-0.056732177734375,
0.00920867919921875,
-0.1044921875,
-0.0103607177734375,
0.070068359375,
-0.0281524658203125,
-0.0124969482421875,
-0.01062774658203125,
-0.0235443115234375,
0.0032958984375,
-0.028961181640625,
0.032867431640625,
0.037750244140625,
-0.01995849609375,
-0.0261077880859375,
-0.031982421875,
0.042724609375,
0.015960693359375,
-0.0531005859375,
-0.011688232421875,
0.036590576171875,
0.05316162109375,
0.033843994140625,
0.05560302734375,
-0.0167999267578125,
0.022918701171875,
-0.003910064697265625,
-0.00745391845703125,
0.01526641845703125,
-0.0158538818359375,
-0.0352783203125,
0.0010976791381835938,
-0.0138092041015625,
-0.01076507568359375
]
] |
circulus/Llama-2-7b-orca-v1 | 2023-08-01T05:50:51.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"en",
"dataset:Open-Orca/OpenOrca",
"license:mit",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | circulus | null | null | circulus/Llama-2-7b-orca-v1 | 6 | 6,914 | transformers | 2023-08-01T03:07:26 | ---
license: mit
datasets:
- Open-Orca/OpenOrca
language:
- en
library_name: transformers
pipeline_tag: text-generation
---
 | 204 | [
[
-0.052154541015625,
-0.05267333984375,
0.031158447265625,
0.038726806640625,
-0.06414794921875,
0.0167999267578125,
0.02825927734375,
-0.05810546875,
0.08612060546875,
0.040771484375,
-0.042633056640625,
-0.03302001953125,
-0.06414794921875,
0.00843048095703125,
-0.0233001708984375,
0.0538330078125,
0.0165863037109375,
-0.0171356201171875,
0.00101470947265625,
-0.0204925537109375,
-0.01349639892578125,
-0.007205963134765625,
-0.06610107421875,
-0.044769287109375,
0.07159423828125,
0.03753662109375,
0.05474853515625,
0.035003662109375,
0.045257568359375,
0.0201873779296875,
-0.00045561790466308594,
0.0211181640625,
-0.0055694580078125,
0.009979248046875,
-0.007080078125,
-0.0205078125,
-0.044525146484375,
0.0080413818359375,
0.037811279296875,
0.019500732421875,
-0.0318603515625,
0.040863037109375,
-0.002384185791015625,
0.046905517578125,
-0.0189971923828125,
-0.00732421875,
-0.037139892578125,
-0.02117919921875,
-0.050537109375,
-0.010406494140625,
-0.0004916191101074219,
-0.05584716796875,
-0.026458740234375,
-0.0836181640625,
-0.007335662841796875,
0.0142669677734375,
0.0946044921875,
0.038482666015625,
-0.053741455078125,
-0.034576416015625,
0.0287628173828125,
0.01360321044921875,
0.007274627685546875,
-0.0004029273986816406,
0.01102447509765625,
0.0165863037109375,
-0.058380126953125,
-0.042633056640625,
-0.04302978515625,
0.00981903076171875,
-0.00594329833984375,
0.00090789794921875,
-0.0384521484375,
-0.042144775390625,
0.0204620361328125,
0.0306243896484375,
-0.0467529296875,
0.01142120361328125,
-0.05401611328125,
-0.00577545166015625,
0.0288848876953125,
-0.0181884765625,
0.05535888671875,
0.032470703125,
-0.05157470703125,
-0.0138397216796875,
-0.05206298828125,
0.013641357421875,
0.026336669921875,
0.006793975830078125,
-0.06591796875,
0.06634521484375,
0.006908416748046875,
0.028656005859375,
0.0338134765625,
-0.0233154296875,
0.036041259765625,
0.016510009765625,
-0.00916290283203125,
-0.0020999908447265625,
0.021636962890625,
0.059967041015625,
0.0113372802734375,
0.005847930908203125,
0.0035762786865234375,
0.018707275390625,
0.00504302978515625,
-0.03717041015625,
-0.006561279296875,
0.01486968994140625,
-0.042755126953125,
-0.02996826171875,
-0.0019245147705078125,
-0.0601806640625,
-0.021484375,
0.02996826171875,
-0.0063934326171875,
0.00033354759216308594,
-0.025146484375,
0.005207061767578125,
-0.01012420654296875,
0.03485107421875,
0.036346435546875,
-0.039764404296875,
0.005870819091796875,
0.01398468017578125,
0.047698974609375,
0.029296875,
0.00927734375,
-0.03753662109375,
-0.0042572021484375,
-0.0256195068359375,
0.08642578125,
-0.0194854736328125,
-0.052276611328125,
-0.0196075439453125,
0.0313720703125,
0.0321044921875,
-0.04254150390625,
0.08392333984375,
-0.0266265869140625,
-0.0174560546875,
-0.039154052734375,
0.02490234375,
-0.043670654296875,
0.00036978721618652344,
-0.0665283203125,
0.05340576171875,
0.038970947265625,
-0.05877685546875,
0.035125732421875,
-0.053314208984375,
-0.0011138916015625,
-0.0012998580932617188,
0.0038051605224609375,
-0.038848876953125,
0.0091400146484375,
-0.00809478759765625,
0.01953125,
-0.03741455078125,
-0.0333251953125,
-0.07391357421875,
0.0003910064697265625,
0.033172607421875,
0.03546142578125,
0.037811279296875,
0.0226287841796875,
0.0005288124084472656,
-0.031158447265625,
-0.05511474609375,
-0.0171966552734375,
0.059844970703125,
-0.0275726318359375,
-0.02508544921875,
-0.01047515869140625,
-0.01085662841796875,
0.032012939453125,
0.0618896484375,
-0.06536865234375,
0.061309814453125,
0.035369873046875,
0.021209716796875,
0.0400390625,
0.005733489990234375,
0.0175628662109375,
-0.0340576171875,
0.039520263671875,
-0.0005440711975097656,
0.0283203125,
0.00788116455078125,
-0.0268707275390625,
-0.076416015625,
-0.03704833984375,
-0.0158233642578125,
0.012451171875,
-0.040008544921875,
0.023193359375,
0.0079803466796875,
-0.055572509765625,
-0.0231475830078125,
-0.002468109130859375,
0.0133819580078125,
0.0113677978515625,
-0.0031871795654296875,
-0.06890869140625,
-0.056060791015625,
-0.05926513671875,
0.0225067138671875,
-0.04931640625,
0.0066070556640625,
0.00969696044921875,
0.034088134765625,
-0.0187530517578125,
0.0281829833984375,
-0.02117919921875,
-0.0267181396484375,
0.0222625732421875,
-0.0187835693359375,
0.05548095703125,
0.060638427734375,
0.062469482421875,
-0.052978515625,
-0.017242431640625,
0.00670623779296875,
-0.0604248046875,
-0.03228759765625,
0.01488494873046875,
-0.045135498046875,
-0.005641937255859375,
0.006046295166015625,
-0.038299560546875,
0.06573486328125,
0.0557861328125,
-0.05963134765625,
0.0352783203125,
-0.01251983642578125,
0.0229034423828125,
-0.027801513671875,
0.02679443359375,
-0.005123138427734375,
-0.0305023193359375,
0.0153656005859375,
0.00823974609375,
0.01219940185546875,
0.0180206298828125,
-0.0285186767578125,
0.0178680419921875,
-0.034576416015625,
-0.008026123046875,
-0.01366424560546875,
0.007381439208984375,
0.017333984375,
0.003353118896484375,
0.004085540771484375,
0.042144775390625,
0.0576171875,
-0.029296875,
0.0249176025390625,
0.060760498046875,
-0.025390625,
0.08544921875,
-0.07373046875,
0.0166778564453125,
-0.010223388671875,
0.047393798828125,
-0.07403564453125,
-0.04217529296875,
0.0640869140625,
-0.03594970703125,
0.01555633544921875,
-0.0029449462890625,
-0.0386962890625,
-0.0304412841796875,
-0.023529052734375,
0.06463623046875,
0.033935546875,
-0.06744384765625,
0.0301361083984375,
0.0243682861328125,
-0.005672454833984375,
-0.0293121337890625,
-0.0645751953125,
0.01174163818359375,
-0.04815673828125,
-0.0272979736328125,
0.051361083984375,
0.0004756450653076172,
-0.0286407470703125,
-0.01050567626953125,
-0.0028095245361328125,
-0.0006103515625,
-0.00922393798828125,
0.06170654296875,
0.037567138671875,
-0.0229034423828125,
-0.022979736328125,
0.0014190673828125,
-0.003818511962890625,
0.0098876953125,
0.020233154296875,
0.0262451171875,
-0.025604248046875,
-0.038726806640625,
-0.041290283203125,
0.03350830078125,
0.0634765625,
0.02001953125,
0.0445556640625,
0.016143798828125,
-0.01409912109375,
0.0182647705078125,
-0.07891845703125,
-0.011322021484375,
-0.031341552734375,
-0.034637451171875,
-0.029296875,
-0.050048828125,
0.054931640625,
0.0087127685546875,
0.0020961761474609375,
0.0401611328125,
0.0284271240234375,
-0.0213165283203125,
0.03631591796875,
0.037872314453125,
-0.0311279296875,
0.01690673828125,
-0.0238189697265625,
-0.00940704345703125,
-0.058319091796875,
-0.06964111328125,
-0.028472900390625,
-0.03717041015625,
-0.039520263671875,
-0.009765625,
0.032379150390625,
0.0330810546875,
-0.0300750732421875,
0.0689697265625,
-0.0433349609375,
0.0290985107421875,
0.027679443359375,
0.032745361328125,
0.02880859375,
-0.007266998291015625,
0.004184722900390625,
0.01067352294921875,
-0.005268096923828125,
-0.0345458984375,
0.0435791015625,
0.04144287109375,
0.041748046875,
0.0295867919921875,
0.033660888671875,
0.036865234375,
0.0391845703125,
-0.01270294189453125,
0.034942626953125,
0.0115966796875,
-0.0455322265625,
-0.002838134765625,
0.0164337158203125,
-0.08123779296875,
0.028594970703125,
-0.01467132568359375,
-0.04022216796875,
0.02606201171875,
0.00467681884765625,
-0.026702880859375,
0.0199737548828125,
-0.034332275390625,
0.0318603515625,
-0.0134429931640625,
-0.0080413818359375,
-0.0208892822265625,
-0.038665771484375,
0.0213775634765625,
-0.006908416748046875,
-0.0006670951843261719,
-0.031646728515625,
-0.03521728515625,
0.021270751953125,
-0.039886474609375,
0.054595947265625,
-0.0018310546875,
-0.0031185150146484375,
0.032440185546875,
-0.0076751708984375,
0.031005859375,
0.012725830078125,
-0.01031494140625,
-0.0002315044403076172,
0.0133514404296875,
-0.055938720703125,
-0.027679443359375,
0.041473388671875,
-0.0382080078125,
-0.01568603515625,
-0.058319091796875,
0.020355224609375,
0.0187835693359375,
-0.036865234375,
0.04150390625,
-0.01036834716796875,
-0.034912109375,
0.006458282470703125,
0.02105712890625,
-0.009552001953125,
0.0291595458984375,
0.0144195556640625,
-0.0215301513671875,
-0.045135498046875,
0.0308990478515625,
0.001445770263671875,
-0.00862884521484375,
0.0013723373413085938,
0.0004546642303466797,
-0.04217529296875,
-0.01027679443359375,
-0.0162200927734375,
0.0701904296875,
-0.0341796875,
-0.030548095703125,
-0.00463104248046875,
0.005367279052734375,
-0.011688232421875,
-0.035003662109375,
-0.006076812744140625,
-0.01953125,
-0.0618896484375,
-0.0036907196044921875,
0.061004638671875,
0.05987548828125,
-0.03558349609375,
0.05328369140625,
-0.01427459716796875,
0.00785064697265625,
0.026123046875,
0.003070831298828125,
-0.032928466796875,
-0.0565185546875,
0.0261383056640625,
-0.0254974365234375,
-0.049835205078125,
-0.046539306640625,
0.05169677734375,
0.01168060302734375,
0.0308990478515625,
0.04217529296875,
-0.0031375885009765625,
0.06732177734375,
-0.0037822723388671875,
0.031585693359375,
0.060150146484375,
-0.036895751953125,
0.03582763671875,
-0.0240631103515625,
0.0163116455078125,
0.021484375,
0.01372528076171875,
-0.026641845703125,
-0.032379150390625,
-0.053741455078125,
-0.048309326171875,
0.0080718994140625,
0.01873779296875,
-0.005096435546875,
0.0139007568359375,
0.0543212890625,
0.0130157470703125,
0.01163482666015625,
-0.067626953125,
-0.0289459228515625,
-0.011566162109375,
0.01157379150390625,
0.0046234130859375,
-0.03350830078125,
0.0056610107421875,
-0.02423095703125,
0.0511474609375,
-0.0145111083984375,
0.02880859375,
0.01219940185546875,
0.00006139278411865234,
-0.0125732421875,
-0.01873779296875,
0.055572509765625,
0.0428466796875,
-0.0244598388671875,
0.0119476318359375,
0.030609130859375,
-0.03070068359375,
0.010833740234375,
-0.0304412841796875,
0.037139892578125,
0.00684356689453125,
0.01702880859375,
0.01409149169921875,
-0.005260467529296875,
-0.034423828125,
0.032684326171875,
-0.0181121826171875,
0.0004591941833496094,
-0.03985595703125,
0.0009622573852539062,
0.00910186767578125,
0.055572509765625,
0.035736083984375,
0.004848480224609375,
-0.00569915771484375,
-0.022247314453125,
0.01154327392578125,
0.017303466796875,
-0.0089263916015625,
-0.050994873046875,
0.055450439453125,
0.0168609619140625,
-0.00075531005859375,
0.0408935546875,
-0.017974853515625,
-0.032196044921875,
0.06951904296875,
0.04486083984375,
0.031707763671875,
-0.022186279296875,
0.01800537109375,
0.03704833984375,
0.0243988037109375,
0.0262451171875,
0.058685302734375,
-0.0016431808471679688,
-0.0158538818359375,
0.01285552978515625,
-0.02716064453125,
-0.04205322265625,
0.013153076171875,
-0.040985107421875,
0.0207672119140625,
-0.047393798828125,
-0.022918701171875,
-0.01082611083984375,
0.013671875,
-0.0408935546875,
0.048126220703125,
0.0172271728515625,
0.09228515625,
-0.05731201171875,
0.07891845703125,
0.04766845703125,
-0.0369873046875,
-0.034912109375,
-0.0361328125,
0.003818511962890625,
-0.09765625,
0.0377197265625,
0.0296173095703125,
-0.01349639892578125,
-0.017730712890625,
-0.09051513671875,
-0.061798095703125,
0.08648681640625,
0.047027587890625,
-0.038421630859375,
0.052001953125,
-0.030242919921875,
0.0050811767578125,
-0.05584716796875,
0.022064208984375,
0.031341552734375,
0.04998779296875,
0.011993408203125,
-0.051483154296875,
0.003223419189453125,
-0.039031982421875,
0.01003265380859375,
0.02545166015625,
-0.0731201171875,
0.04779052734375,
-0.0247344970703125,
-0.0008702278137207031,
0.035400390625,
0.0821533203125,
0.031982421875,
0.01233673095703125,
0.07965087890625,
0.04595947265625,
0.0294036865234375,
-0.021636962890625,
0.06524658203125,
0.005596160888671875,
0.0120086669921875,
0.037567138671875,
-0.009307861328125,
0.032684326171875,
0.06646728515625,
-0.01580810546875,
0.0386962890625,
0.0982666015625,
-0.00901031494140625,
0.0460205078125,
0.00945281982421875,
-0.03857421875,
-0.004878997802734375,
-0.0589599609375,
-0.04315185546875,
0.033203125,
0.0210113525390625,
-0.0013551712036132812,
-0.0220947265625,
-0.019683837890625,
0.00717926025390625,
0.015960693359375,
-0.0374755859375,
0.0257720947265625,
0.0148162841796875,
-0.02008056640625,
0.0377197265625,
-0.0249176025390625,
0.051910400390625,
-0.0361328125,
0.004547119140625,
-0.004810333251953125,
-0.01149749755859375,
-0.03839111328125,
-0.06951904296875,
0.0372314453125,
-0.0033435821533203125,
0.0019311904907226562,
-0.00555419921875,
0.04931640625,
-0.0164794921875,
-0.0302581787109375,
0.04327392578125,
-0.0108184814453125,
0.040618896484375,
-0.00812530517578125,
-0.07159423828125,
0.0290679931640625,
-0.0173187255859375,
-0.0119781494140625,
0.01117706298828125,
0.0220947265625,
-0.0110321044921875,
0.0623779296875,
0.0379638671875,
0.0064697265625,
-0.0027561187744140625,
-0.0272674560546875,
0.0863037109375,
-0.0240020751953125,
-0.0004470348358154297,
-0.0565185546875,
0.040740966796875,
-0.000057578086853027344,
-0.05810546875,
0.01277923583984375,
0.0250244140625,
0.0355224609375,
-0.0138702392578125,
0.0202484130859375,
-0.048065185546875,
0.028839111328125,
-0.024566650390625,
0.045806884765625,
-0.07012939453125,
-0.0389404296875,
-0.001018524169921875,
-0.032196044921875,
-0.0074920654296875,
0.058380126953125,
0.000942230224609375,
-0.01432037353515625,
0.02313232421875,
0.060089111328125,
-0.0030517578125,
-0.013946533203125,
0.01157379150390625,
-0.020050048828125,
0.0036640167236328125,
0.0296173095703125,
0.080810546875,
-0.023101806640625,
0.01102447509765625,
-0.0179290771484375,
-0.036651611328125,
-0.0261383056640625,
-0.07733154296875,
-0.048248291015625,
-0.053985595703125,
-0.040740966796875,
-0.047698974609375,
-0.0198974609375,
0.08294677734375,
0.05401611328125,
-0.04864501953125,
-0.03558349609375,
0.027740478515625,
0.0216064453125,
0.00232696533203125,
-0.0026569366455078125,
0.0106048583984375,
0.036102294921875,
-0.031768798828125,
-0.00945281982421875,
0.034393310546875,
0.048248291015625,
-0.004291534423828125,
-0.0004324913024902344,
-0.00368499755859375,
-0.02337646484375,
0.024658203125,
0.056427001953125,
-0.0501708984375,
-0.0252685546875,
-0.022186279296875,
-0.0052642822265625,
0.0260772705078125,
0.0108489990234375,
-0.027008056640625,
-0.01226806640625,
0.03485107421875,
0.01971435546875,
0.032867431640625,
0.0187835693359375,
0.0073699951171875,
-0.0125579833984375,
0.049346923828125,
-0.0175018310546875,
0.04559326171875,
0.01522064208984375,
-0.0306243896484375,
0.0660400390625,
0.0213165283203125,
-0.047454833984375,
-0.062164306640625,
0.005481719970703125,
-0.128173828125,
0.0219879150390625,
0.011627197265625,
-0.006916046142578125,
-0.056854248046875,
0.022491455078125,
-0.052581787109375,
0.018951416015625,
-0.05853271484375,
0.056365966796875,
0.038330078125,
-0.00220489501953125,
-0.005443572998046875,
-0.0248260498046875,
-0.01806640625,
0.01000213623046875,
-0.055694580078125,
-0.067626953125,
-0.0005669593811035156,
0.00995635986328125,
0.042144775390625,
0.022216796875,
-0.0174713134765625,
0.0217742919921875,
0.007251739501953125,
0.033203125,
0.022552490234375,
0.005126953125,
0.0012693405151367188,
-0.005825042724609375,
0.002834320068359375,
-0.072265625
]
] |
intfloat/e5-large | 2023-08-07T04:59:49.000Z | [
"sentence-transformers",
"pytorch",
"safetensors",
"bert",
"mteb",
"Sentence Transformers",
"sentence-similarity",
"en",
"arxiv:2212.03533",
"arxiv:2104.08663",
"arxiv:2210.07316",
"license:mit",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] | sentence-similarity | intfloat | null | null | intfloat/e5-large | 59 | 6,911 | sentence-transformers | 2022-12-26T06:03:12 | ---
tags:
- mteb
- Sentence Transformers
- sentence-similarity
- sentence-transformers
model-index:
- name: e5-large
results:
- task:
type: Classification
dataset:
type: mteb/amazon_counterfactual
name: MTEB AmazonCounterfactualClassification (en)
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 77.68656716417911
- type: ap
value: 41.336896075573584
- type: f1
value: 71.788561468075
- task:
type: Classification
dataset:
type: mteb/amazon_polarity
name: MTEB AmazonPolarityClassification
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 90.04965
- type: ap
value: 86.24637009569418
- type: f1
value: 90.03896671762645
- task:
type: Classification
dataset:
type: mteb/amazon_reviews_multi
name: MTEB AmazonReviewsClassification (en)
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 43.016000000000005
- type: f1
value: 42.1942431880186
- task:
type: Retrieval
dataset:
type: arguana
name: MTEB ArguAna
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 25.107000000000003
- type: map_at_10
value: 40.464
- type: map_at_100
value: 41.577999999999996
- type: map_at_1000
value: 41.588
- type: map_at_3
value: 35.301
- type: map_at_5
value: 38.263000000000005
- type: mrr_at_1
value: 25.605
- type: mrr_at_10
value: 40.64
- type: mrr_at_100
value: 41.760000000000005
- type: mrr_at_1000
value: 41.77
- type: mrr_at_3
value: 35.443000000000005
- type: mrr_at_5
value: 38.448
- type: ndcg_at_1
value: 25.107000000000003
- type: ndcg_at_10
value: 49.352000000000004
- type: ndcg_at_100
value: 53.98500000000001
- type: ndcg_at_1000
value: 54.208
- type: ndcg_at_3
value: 38.671
- type: ndcg_at_5
value: 43.991
- type: precision_at_1
value: 25.107000000000003
- type: precision_at_10
value: 7.795000000000001
- type: precision_at_100
value: 0.979
- type: precision_at_1000
value: 0.1
- type: precision_at_3
value: 16.145
- type: precision_at_5
value: 12.262
- type: recall_at_1
value: 25.107000000000003
- type: recall_at_10
value: 77.952
- type: recall_at_100
value: 97.866
- type: recall_at_1000
value: 99.57300000000001
- type: recall_at_3
value: 48.435
- type: recall_at_5
value: 61.309000000000005
- task:
type: Clustering
dataset:
type: mteb/arxiv-clustering-p2p
name: MTEB ArxivClusteringP2P
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: v_measure
value: 46.19278045044154
- task:
type: Clustering
dataset:
type: mteb/arxiv-clustering-s2s
name: MTEB ArxivClusteringS2S
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: v_measure
value: 41.37976387757665
- task:
type: Reranking
dataset:
type: mteb/askubuntudupquestions-reranking
name: MTEB AskUbuntuDupQuestions
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 60.07433334608074
- type: mrr
value: 73.44347711383723
- task:
type: STS
dataset:
type: mteb/biosses-sts
name: MTEB BIOSSES
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cos_sim_pearson
value: 86.4298072183543
- type: cos_sim_spearman
value: 84.73144873582848
- type: euclidean_pearson
value: 85.15885058870728
- type: euclidean_spearman
value: 85.42062106559356
- type: manhattan_pearson
value: 84.89409921792054
- type: manhattan_spearman
value: 85.31941394024344
- task:
type: Classification
dataset:
type: mteb/banking77
name: MTEB Banking77Classification
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 84.14285714285714
- type: f1
value: 84.11674412565644
- task:
type: Clustering
dataset:
type: mteb/biorxiv-clustering-p2p
name: MTEB BiorxivClusteringP2P
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: v_measure
value: 37.600076342340785
- task:
type: Clustering
dataset:
type: mteb/biorxiv-clustering-s2s
name: MTEB BiorxivClusteringS2S
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: v_measure
value: 35.08861812135148
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackAndroidRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 32.684000000000005
- type: map_at_10
value: 41.675000000000004
- type: map_at_100
value: 42.963
- type: map_at_1000
value: 43.078
- type: map_at_3
value: 38.708999999999996
- type: map_at_5
value: 40.316
- type: mrr_at_1
value: 39.485
- type: mrr_at_10
value: 47.152
- type: mrr_at_100
value: 47.96
- type: mrr_at_1000
value: 48.010000000000005
- type: mrr_at_3
value: 44.754
- type: mrr_at_5
value: 46.285
- type: ndcg_at_1
value: 39.485
- type: ndcg_at_10
value: 46.849000000000004
- type: ndcg_at_100
value: 52.059
- type: ndcg_at_1000
value: 54.358
- type: ndcg_at_3
value: 42.705
- type: ndcg_at_5
value: 44.663000000000004
- type: precision_at_1
value: 39.485
- type: precision_at_10
value: 8.455
- type: precision_at_100
value: 1.3379999999999999
- type: precision_at_1000
value: 0.178
- type: precision_at_3
value: 19.695
- type: precision_at_5
value: 13.905999999999999
- type: recall_at_1
value: 32.684000000000005
- type: recall_at_10
value: 56.227000000000004
- type: recall_at_100
value: 78.499
- type: recall_at_1000
value: 94.021
- type: recall_at_3
value: 44.157999999999994
- type: recall_at_5
value: 49.694
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackEnglishRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 31.875999999999998
- type: map_at_10
value: 41.603
- type: map_at_100
value: 42.825
- type: map_at_1000
value: 42.961
- type: map_at_3
value: 38.655
- type: map_at_5
value: 40.294999999999995
- type: mrr_at_1
value: 40.127
- type: mrr_at_10
value: 47.959
- type: mrr_at_100
value: 48.59
- type: mrr_at_1000
value: 48.634
- type: mrr_at_3
value: 45.786
- type: mrr_at_5
value: 46.964
- type: ndcg_at_1
value: 40.127
- type: ndcg_at_10
value: 47.176
- type: ndcg_at_100
value: 51.346000000000004
- type: ndcg_at_1000
value: 53.502
- type: ndcg_at_3
value: 43.139
- type: ndcg_at_5
value: 44.883
- type: precision_at_1
value: 40.127
- type: precision_at_10
value: 8.72
- type: precision_at_100
value: 1.387
- type: precision_at_1000
value: 0.188
- type: precision_at_3
value: 20.637
- type: precision_at_5
value: 14.446
- type: recall_at_1
value: 31.875999999999998
- type: recall_at_10
value: 56.54900000000001
- type: recall_at_100
value: 73.939
- type: recall_at_1000
value: 87.732
- type: recall_at_3
value: 44.326
- type: recall_at_5
value: 49.445
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackGamingRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 41.677
- type: map_at_10
value: 52.222
- type: map_at_100
value: 53.229000000000006
- type: map_at_1000
value: 53.288000000000004
- type: map_at_3
value: 49.201
- type: map_at_5
value: 51.00599999999999
- type: mrr_at_1
value: 47.524
- type: mrr_at_10
value: 55.745999999999995
- type: mrr_at_100
value: 56.433
- type: mrr_at_1000
value: 56.464999999999996
- type: mrr_at_3
value: 53.37499999999999
- type: mrr_at_5
value: 54.858
- type: ndcg_at_1
value: 47.524
- type: ndcg_at_10
value: 57.406
- type: ndcg_at_100
value: 61.403
- type: ndcg_at_1000
value: 62.7
- type: ndcg_at_3
value: 52.298
- type: ndcg_at_5
value: 55.02
- type: precision_at_1
value: 47.524
- type: precision_at_10
value: 8.865
- type: precision_at_100
value: 1.179
- type: precision_at_1000
value: 0.134
- type: precision_at_3
value: 22.612
- type: precision_at_5
value: 15.461
- type: recall_at_1
value: 41.677
- type: recall_at_10
value: 69.346
- type: recall_at_100
value: 86.344
- type: recall_at_1000
value: 95.703
- type: recall_at_3
value: 55.789
- type: recall_at_5
value: 62.488
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackGisRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 25.991999999999997
- type: map_at_10
value: 32.804
- type: map_at_100
value: 33.812999999999995
- type: map_at_1000
value: 33.897
- type: map_at_3
value: 30.567
- type: map_at_5
value: 31.599
- type: mrr_at_1
value: 27.797
- type: mrr_at_10
value: 34.768
- type: mrr_at_100
value: 35.702
- type: mrr_at_1000
value: 35.766
- type: mrr_at_3
value: 32.637
- type: mrr_at_5
value: 33.614
- type: ndcg_at_1
value: 27.797
- type: ndcg_at_10
value: 36.966
- type: ndcg_at_100
value: 41.972
- type: ndcg_at_1000
value: 44.139
- type: ndcg_at_3
value: 32.547
- type: ndcg_at_5
value: 34.258
- type: precision_at_1
value: 27.797
- type: precision_at_10
value: 5.514
- type: precision_at_100
value: 0.8340000000000001
- type: precision_at_1000
value: 0.106
- type: precision_at_3
value: 13.333
- type: precision_at_5
value: 9.04
- type: recall_at_1
value: 25.991999999999997
- type: recall_at_10
value: 47.941
- type: recall_at_100
value: 71.039
- type: recall_at_1000
value: 87.32799999999999
- type: recall_at_3
value: 36.01
- type: recall_at_5
value: 40.056000000000004
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackMathematicaRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 17.533
- type: map_at_10
value: 24.336
- type: map_at_100
value: 25.445
- type: map_at_1000
value: 25.561
- type: map_at_3
value: 22.116
- type: map_at_5
value: 23.347
- type: mrr_at_1
value: 21.642
- type: mrr_at_10
value: 28.910999999999998
- type: mrr_at_100
value: 29.836000000000002
- type: mrr_at_1000
value: 29.907
- type: mrr_at_3
value: 26.638
- type: mrr_at_5
value: 27.857
- type: ndcg_at_1
value: 21.642
- type: ndcg_at_10
value: 28.949
- type: ndcg_at_100
value: 34.211000000000006
- type: ndcg_at_1000
value: 37.031
- type: ndcg_at_3
value: 24.788
- type: ndcg_at_5
value: 26.685
- type: precision_at_1
value: 21.642
- type: precision_at_10
value: 5.137
- type: precision_at_100
value: 0.893
- type: precision_at_1000
value: 0.127
- type: precision_at_3
value: 11.733
- type: precision_at_5
value: 8.383000000000001
- type: recall_at_1
value: 17.533
- type: recall_at_10
value: 38.839
- type: recall_at_100
value: 61.458999999999996
- type: recall_at_1000
value: 81.58
- type: recall_at_3
value: 27.328999999999997
- type: recall_at_5
value: 32.168
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackPhysicsRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 28.126
- type: map_at_10
value: 37.872
- type: map_at_100
value: 39.229
- type: map_at_1000
value: 39.353
- type: map_at_3
value: 34.93
- type: map_at_5
value: 36.59
- type: mrr_at_1
value: 34.071
- type: mrr_at_10
value: 43.056
- type: mrr_at_100
value: 43.944
- type: mrr_at_1000
value: 43.999
- type: mrr_at_3
value: 40.536
- type: mrr_at_5
value: 42.065999999999995
- type: ndcg_at_1
value: 34.071
- type: ndcg_at_10
value: 43.503
- type: ndcg_at_100
value: 49.120000000000005
- type: ndcg_at_1000
value: 51.410999999999994
- type: ndcg_at_3
value: 38.767
- type: ndcg_at_5
value: 41.075
- type: precision_at_1
value: 34.071
- type: precision_at_10
value: 7.843999999999999
- type: precision_at_100
value: 1.2489999999999999
- type: precision_at_1000
value: 0.163
- type: precision_at_3
value: 18.223
- type: precision_at_5
value: 13.050999999999998
- type: recall_at_1
value: 28.126
- type: recall_at_10
value: 54.952
- type: recall_at_100
value: 78.375
- type: recall_at_1000
value: 93.29899999999999
- type: recall_at_3
value: 41.714
- type: recall_at_5
value: 47.635
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackProgrammersRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 25.957
- type: map_at_10
value: 34.749
- type: map_at_100
value: 35.929
- type: map_at_1000
value: 36.043
- type: map_at_3
value: 31.947
- type: map_at_5
value: 33.575
- type: mrr_at_1
value: 32.078
- type: mrr_at_10
value: 39.844
- type: mrr_at_100
value: 40.71
- type: mrr_at_1000
value: 40.77
- type: mrr_at_3
value: 37.386
- type: mrr_at_5
value: 38.83
- type: ndcg_at_1
value: 32.078
- type: ndcg_at_10
value: 39.97
- type: ndcg_at_100
value: 45.254
- type: ndcg_at_1000
value: 47.818
- type: ndcg_at_3
value: 35.453
- type: ndcg_at_5
value: 37.631
- type: precision_at_1
value: 32.078
- type: precision_at_10
value: 7.158
- type: precision_at_100
value: 1.126
- type: precision_at_1000
value: 0.153
- type: precision_at_3
value: 16.743
- type: precision_at_5
value: 11.872
- type: recall_at_1
value: 25.957
- type: recall_at_10
value: 50.583
- type: recall_at_100
value: 73.593
- type: recall_at_1000
value: 91.23599999999999
- type: recall_at_3
value: 37.651
- type: recall_at_5
value: 43.626
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 27.1505
- type: map_at_10
value: 34.844833333333334
- type: map_at_100
value: 35.95216666666667
- type: map_at_1000
value: 36.06675
- type: map_at_3
value: 32.41975
- type: map_at_5
value: 33.74233333333333
- type: mrr_at_1
value: 31.923666666666662
- type: mrr_at_10
value: 38.87983333333334
- type: mrr_at_100
value: 39.706250000000004
- type: mrr_at_1000
value: 39.76708333333333
- type: mrr_at_3
value: 36.72008333333333
- type: mrr_at_5
value: 37.96933333333334
- type: ndcg_at_1
value: 31.923666666666662
- type: ndcg_at_10
value: 39.44258333333334
- type: ndcg_at_100
value: 44.31475
- type: ndcg_at_1000
value: 46.75
- type: ndcg_at_3
value: 35.36299999999999
- type: ndcg_at_5
value: 37.242333333333335
- type: precision_at_1
value: 31.923666666666662
- type: precision_at_10
value: 6.643333333333333
- type: precision_at_100
value: 1.0612499999999998
- type: precision_at_1000
value: 0.14575
- type: precision_at_3
value: 15.875250000000001
- type: precision_at_5
value: 11.088916666666664
- type: recall_at_1
value: 27.1505
- type: recall_at_10
value: 49.06349999999999
- type: recall_at_100
value: 70.60841666666666
- type: recall_at_1000
value: 87.72049999999999
- type: recall_at_3
value: 37.60575000000001
- type: recall_at_5
value: 42.511166666666675
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackStatsRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 25.101000000000003
- type: map_at_10
value: 30.147000000000002
- type: map_at_100
value: 30.98
- type: map_at_1000
value: 31.080000000000002
- type: map_at_3
value: 28.571
- type: map_at_5
value: 29.319
- type: mrr_at_1
value: 27.761000000000003
- type: mrr_at_10
value: 32.716
- type: mrr_at_100
value: 33.504
- type: mrr_at_1000
value: 33.574
- type: mrr_at_3
value: 31.135
- type: mrr_at_5
value: 32.032
- type: ndcg_at_1
value: 27.761000000000003
- type: ndcg_at_10
value: 33.358
- type: ndcg_at_100
value: 37.569
- type: ndcg_at_1000
value: 40.189
- type: ndcg_at_3
value: 30.291
- type: ndcg_at_5
value: 31.558000000000003
- type: precision_at_1
value: 27.761000000000003
- type: precision_at_10
value: 4.939
- type: precision_at_100
value: 0.759
- type: precision_at_1000
value: 0.106
- type: precision_at_3
value: 12.577
- type: precision_at_5
value: 8.497
- type: recall_at_1
value: 25.101000000000003
- type: recall_at_10
value: 40.739
- type: recall_at_100
value: 60.089999999999996
- type: recall_at_1000
value: 79.768
- type: recall_at_3
value: 32.16
- type: recall_at_5
value: 35.131
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackTexRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 20.112
- type: map_at_10
value: 26.119999999999997
- type: map_at_100
value: 27.031
- type: map_at_1000
value: 27.150000000000002
- type: map_at_3
value: 24.230999999999998
- type: map_at_5
value: 25.15
- type: mrr_at_1
value: 24.535
- type: mrr_at_10
value: 30.198000000000004
- type: mrr_at_100
value: 30.975
- type: mrr_at_1000
value: 31.051000000000002
- type: mrr_at_3
value: 28.338
- type: mrr_at_5
value: 29.269000000000002
- type: ndcg_at_1
value: 24.535
- type: ndcg_at_10
value: 30.147000000000002
- type: ndcg_at_100
value: 34.544000000000004
- type: ndcg_at_1000
value: 37.512
- type: ndcg_at_3
value: 26.726
- type: ndcg_at_5
value: 28.046
- type: precision_at_1
value: 24.535
- type: precision_at_10
value: 5.179
- type: precision_at_100
value: 0.859
- type: precision_at_1000
value: 0.128
- type: precision_at_3
value: 12.159
- type: precision_at_5
value: 8.424
- type: recall_at_1
value: 20.112
- type: recall_at_10
value: 38.312000000000005
- type: recall_at_100
value: 58.406000000000006
- type: recall_at_1000
value: 79.863
- type: recall_at_3
value: 28.358
- type: recall_at_5
value: 31.973000000000003
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackUnixRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 27.111
- type: map_at_10
value: 34.096
- type: map_at_100
value: 35.181000000000004
- type: map_at_1000
value: 35.276
- type: map_at_3
value: 31.745
- type: map_at_5
value: 33.045
- type: mrr_at_1
value: 31.343
- type: mrr_at_10
value: 37.994
- type: mrr_at_100
value: 38.873000000000005
- type: mrr_at_1000
value: 38.934999999999995
- type: mrr_at_3
value: 35.743
- type: mrr_at_5
value: 37.077
- type: ndcg_at_1
value: 31.343
- type: ndcg_at_10
value: 38.572
- type: ndcg_at_100
value: 43.854
- type: ndcg_at_1000
value: 46.190999999999995
- type: ndcg_at_3
value: 34.247
- type: ndcg_at_5
value: 36.28
- type: precision_at_1
value: 31.343
- type: precision_at_10
value: 6.166
- type: precision_at_100
value: 1
- type: precision_at_1000
value: 0.13
- type: precision_at_3
value: 15.081
- type: precision_at_5
value: 10.428999999999998
- type: recall_at_1
value: 27.111
- type: recall_at_10
value: 48.422
- type: recall_at_100
value: 71.846
- type: recall_at_1000
value: 88.57000000000001
- type: recall_at_3
value: 36.435
- type: recall_at_5
value: 41.765
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackWebmastersRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 26.264
- type: map_at_10
value: 33.522
- type: map_at_100
value: 34.963
- type: map_at_1000
value: 35.175
- type: map_at_3
value: 31.366
- type: map_at_5
value: 32.621
- type: mrr_at_1
value: 31.028
- type: mrr_at_10
value: 37.230000000000004
- type: mrr_at_100
value: 38.149
- type: mrr_at_1000
value: 38.218
- type: mrr_at_3
value: 35.046
- type: mrr_at_5
value: 36.617
- type: ndcg_at_1
value: 31.028
- type: ndcg_at_10
value: 37.964999999999996
- type: ndcg_at_100
value: 43.342000000000006
- type: ndcg_at_1000
value: 46.471000000000004
- type: ndcg_at_3
value: 34.67
- type: ndcg_at_5
value: 36.458
- type: precision_at_1
value: 31.028
- type: precision_at_10
value: 6.937
- type: precision_at_100
value: 1.346
- type: precision_at_1000
value: 0.22799999999999998
- type: precision_at_3
value: 15.942
- type: precision_at_5
value: 11.462
- type: recall_at_1
value: 26.264
- type: recall_at_10
value: 45.571
- type: recall_at_100
value: 70.246
- type: recall_at_1000
value: 90.971
- type: recall_at_3
value: 36.276
- type: recall_at_5
value: 41.162
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackWordpressRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 23.372999999999998
- type: map_at_10
value: 28.992
- type: map_at_100
value: 29.837999999999997
- type: map_at_1000
value: 29.939
- type: map_at_3
value: 26.999000000000002
- type: map_at_5
value: 28.044999999999998
- type: mrr_at_1
value: 25.692999999999998
- type: mrr_at_10
value: 30.984
- type: mrr_at_100
value: 31.799
- type: mrr_at_1000
value: 31.875999999999998
- type: mrr_at_3
value: 29.267
- type: mrr_at_5
value: 30.163
- type: ndcg_at_1
value: 25.692999999999998
- type: ndcg_at_10
value: 32.45
- type: ndcg_at_100
value: 37.103
- type: ndcg_at_1000
value: 39.678000000000004
- type: ndcg_at_3
value: 28.725
- type: ndcg_at_5
value: 30.351
- type: precision_at_1
value: 25.692999999999998
- type: precision_at_10
value: 4.806
- type: precision_at_100
value: 0.765
- type: precision_at_1000
value: 0.108
- type: precision_at_3
value: 11.768
- type: precision_at_5
value: 8.096
- type: recall_at_1
value: 23.372999999999998
- type: recall_at_10
value: 41.281
- type: recall_at_100
value: 63.465
- type: recall_at_1000
value: 82.575
- type: recall_at_3
value: 31.063000000000002
- type: recall_at_5
value: 34.991
- task:
type: Retrieval
dataset:
type: climate-fever
name: MTEB ClimateFEVER
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 8.821
- type: map_at_10
value: 15.383
- type: map_at_100
value: 17.244999999999997
- type: map_at_1000
value: 17.445
- type: map_at_3
value: 12.64
- type: map_at_5
value: 13.941999999999998
- type: mrr_at_1
value: 19.544
- type: mrr_at_10
value: 29.738999999999997
- type: mrr_at_100
value: 30.923000000000002
- type: mrr_at_1000
value: 30.969
- type: mrr_at_3
value: 26.384
- type: mrr_at_5
value: 28.199
- type: ndcg_at_1
value: 19.544
- type: ndcg_at_10
value: 22.398
- type: ndcg_at_100
value: 30.253999999999998
- type: ndcg_at_1000
value: 33.876
- type: ndcg_at_3
value: 17.473
- type: ndcg_at_5
value: 19.154
- type: precision_at_1
value: 19.544
- type: precision_at_10
value: 7.217999999999999
- type: precision_at_100
value: 1.564
- type: precision_at_1000
value: 0.22300000000000003
- type: precision_at_3
value: 13.225000000000001
- type: precision_at_5
value: 10.319
- type: recall_at_1
value: 8.821
- type: recall_at_10
value: 28.110000000000003
- type: recall_at_100
value: 55.64
- type: recall_at_1000
value: 75.964
- type: recall_at_3
value: 16.195
- type: recall_at_5
value: 20.678
- task:
type: Retrieval
dataset:
type: dbpedia-entity
name: MTEB DBPedia
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 9.344
- type: map_at_10
value: 20.301
- type: map_at_100
value: 28.709
- type: map_at_1000
value: 30.470999999999997
- type: map_at_3
value: 14.584
- type: map_at_5
value: 16.930999999999997
- type: mrr_at_1
value: 67.25
- type: mrr_at_10
value: 75.393
- type: mrr_at_100
value: 75.742
- type: mrr_at_1000
value: 75.75
- type: mrr_at_3
value: 73.958
- type: mrr_at_5
value: 74.883
- type: ndcg_at_1
value: 56.00000000000001
- type: ndcg_at_10
value: 42.394
- type: ndcg_at_100
value: 47.091
- type: ndcg_at_1000
value: 54.215
- type: ndcg_at_3
value: 46.995
- type: ndcg_at_5
value: 44.214999999999996
- type: precision_at_1
value: 67.25
- type: precision_at_10
value: 33.525
- type: precision_at_100
value: 10.67
- type: precision_at_1000
value: 2.221
- type: precision_at_3
value: 49.417
- type: precision_at_5
value: 42.15
- type: recall_at_1
value: 9.344
- type: recall_at_10
value: 25.209
- type: recall_at_100
value: 52.329
- type: recall_at_1000
value: 74.2
- type: recall_at_3
value: 15.699
- type: recall_at_5
value: 19.24
- task:
type: Classification
dataset:
type: mteb/emotion
name: MTEB EmotionClassification
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 48.05
- type: f1
value: 43.06718139212933
- task:
type: Retrieval
dataset:
type: fever
name: MTEB FEVER
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 46.452
- type: map_at_10
value: 58.825
- type: map_at_100
value: 59.372
- type: map_at_1000
value: 59.399
- type: map_at_3
value: 56.264
- type: map_at_5
value: 57.879999999999995
- type: mrr_at_1
value: 49.82
- type: mrr_at_10
value: 62.178999999999995
- type: mrr_at_100
value: 62.641999999999996
- type: mrr_at_1000
value: 62.658
- type: mrr_at_3
value: 59.706
- type: mrr_at_5
value: 61.283
- type: ndcg_at_1
value: 49.82
- type: ndcg_at_10
value: 65.031
- type: ndcg_at_100
value: 67.413
- type: ndcg_at_1000
value: 68.014
- type: ndcg_at_3
value: 60.084
- type: ndcg_at_5
value: 62.858000000000004
- type: precision_at_1
value: 49.82
- type: precision_at_10
value: 8.876000000000001
- type: precision_at_100
value: 1.018
- type: precision_at_1000
value: 0.109
- type: precision_at_3
value: 24.477
- type: precision_at_5
value: 16.208
- type: recall_at_1
value: 46.452
- type: recall_at_10
value: 80.808
- type: recall_at_100
value: 91.215
- type: recall_at_1000
value: 95.52000000000001
- type: recall_at_3
value: 67.62899999999999
- type: recall_at_5
value: 74.32900000000001
- task:
type: Retrieval
dataset:
type: fiqa
name: MTEB FiQA2018
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 18.351
- type: map_at_10
value: 30.796
- type: map_at_100
value: 32.621
- type: map_at_1000
value: 32.799
- type: map_at_3
value: 26.491
- type: map_at_5
value: 28.933999999999997
- type: mrr_at_1
value: 36.265
- type: mrr_at_10
value: 45.556999999999995
- type: mrr_at_100
value: 46.323
- type: mrr_at_1000
value: 46.359
- type: mrr_at_3
value: 42.695
- type: mrr_at_5
value: 44.324000000000005
- type: ndcg_at_1
value: 36.265
- type: ndcg_at_10
value: 38.558
- type: ndcg_at_100
value: 45.18
- type: ndcg_at_1000
value: 48.292
- type: ndcg_at_3
value: 34.204
- type: ndcg_at_5
value: 35.735
- type: precision_at_1
value: 36.265
- type: precision_at_10
value: 10.879999999999999
- type: precision_at_100
value: 1.77
- type: precision_at_1000
value: 0.234
- type: precision_at_3
value: 23.044999999999998
- type: precision_at_5
value: 17.253
- type: recall_at_1
value: 18.351
- type: recall_at_10
value: 46.116
- type: recall_at_100
value: 70.786
- type: recall_at_1000
value: 89.46300000000001
- type: recall_at_3
value: 31.404
- type: recall_at_5
value: 37.678
- task:
type: Retrieval
dataset:
type: hotpotqa
name: MTEB HotpotQA
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 36.847
- type: map_at_10
value: 54.269999999999996
- type: map_at_100
value: 55.152
- type: map_at_1000
value: 55.223
- type: map_at_3
value: 51.166
- type: map_at_5
value: 53.055
- type: mrr_at_1
value: 73.693
- type: mrr_at_10
value: 79.975
- type: mrr_at_100
value: 80.202
- type: mrr_at_1000
value: 80.214
- type: mrr_at_3
value: 78.938
- type: mrr_at_5
value: 79.595
- type: ndcg_at_1
value: 73.693
- type: ndcg_at_10
value: 63.334999999999994
- type: ndcg_at_100
value: 66.452
- type: ndcg_at_1000
value: 67.869
- type: ndcg_at_3
value: 58.829
- type: ndcg_at_5
value: 61.266
- type: precision_at_1
value: 73.693
- type: precision_at_10
value: 13.122
- type: precision_at_100
value: 1.5559999999999998
- type: precision_at_1000
value: 0.174
- type: precision_at_3
value: 37.083
- type: precision_at_5
value: 24.169999999999998
- type: recall_at_1
value: 36.847
- type: recall_at_10
value: 65.61099999999999
- type: recall_at_100
value: 77.792
- type: recall_at_1000
value: 87.17099999999999
- type: recall_at_3
value: 55.625
- type: recall_at_5
value: 60.425
- task:
type: Classification
dataset:
type: mteb/imdb
name: MTEB ImdbClassification
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 82.1096
- type: ap
value: 76.67089212843918
- type: f1
value: 82.03535056754939
- task:
type: Retrieval
dataset:
type: msmarco
name: MTEB MSMARCO
config: default
split: dev
revision: None
metrics:
- type: map_at_1
value: 24.465
- type: map_at_10
value: 37.072
- type: map_at_100
value: 38.188
- type: map_at_1000
value: 38.232
- type: map_at_3
value: 33.134
- type: map_at_5
value: 35.453
- type: mrr_at_1
value: 25.142999999999997
- type: mrr_at_10
value: 37.669999999999995
- type: mrr_at_100
value: 38.725
- type: mrr_at_1000
value: 38.765
- type: mrr_at_3
value: 33.82
- type: mrr_at_5
value: 36.111
- type: ndcg_at_1
value: 25.142999999999997
- type: ndcg_at_10
value: 44.054
- type: ndcg_at_100
value: 49.364000000000004
- type: ndcg_at_1000
value: 50.456
- type: ndcg_at_3
value: 36.095
- type: ndcg_at_5
value: 40.23
- type: precision_at_1
value: 25.142999999999997
- type: precision_at_10
value: 6.845
- type: precision_at_100
value: 0.95
- type: precision_at_1000
value: 0.104
- type: precision_at_3
value: 15.204999999999998
- type: precision_at_5
value: 11.221
- type: recall_at_1
value: 24.465
- type: recall_at_10
value: 65.495
- type: recall_at_100
value: 89.888
- type: recall_at_1000
value: 98.165
- type: recall_at_3
value: 43.964
- type: recall_at_5
value: 53.891
- task:
type: Classification
dataset:
type: mteb/mtop_domain
name: MTEB MTOPDomainClassification (en)
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 93.86228910168718
- type: f1
value: 93.69177113259104
- task:
type: Classification
dataset:
type: mteb/mtop_intent
name: MTEB MTOPIntentClassification (en)
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 76.3999088007296
- type: f1
value: 58.96668664333438
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (en)
config: en
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 73.21788836583727
- type: f1
value: 71.4545936552952
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (en)
config: en
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 77.39071956960323
- type: f1
value: 77.12398952847603
- task:
type: Clustering
dataset:
type: mteb/medrxiv-clustering-p2p
name: MTEB MedrxivClusteringP2P
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: v_measure
value: 32.255379528166955
- task:
type: Clustering
dataset:
type: mteb/medrxiv-clustering-s2s
name: MTEB MedrxivClusteringS2S
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: v_measure
value: 29.66423362872814
- task:
type: Reranking
dataset:
type: mteb/mind_small
name: MTEB MindSmallReranking
config: default
split: test
revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
metrics:
- type: map
value: 30.782211620375964
- type: mrr
value: 31.773479703044956
- task:
type: Retrieval
dataset:
type: nfcorpus
name: MTEB NFCorpus
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 5.863
- type: map_at_10
value: 13.831
- type: map_at_100
value: 17.534
- type: map_at_1000
value: 19.012
- type: map_at_3
value: 10.143
- type: map_at_5
value: 12.034
- type: mrr_at_1
value: 46.749
- type: mrr_at_10
value: 55.376999999999995
- type: mrr_at_100
value: 56.009
- type: mrr_at_1000
value: 56.042
- type: mrr_at_3
value: 53.30200000000001
- type: mrr_at_5
value: 54.85
- type: ndcg_at_1
value: 44.582
- type: ndcg_at_10
value: 36.07
- type: ndcg_at_100
value: 33.39
- type: ndcg_at_1000
value: 41.884
- type: ndcg_at_3
value: 41.441
- type: ndcg_at_5
value: 39.861000000000004
- type: precision_at_1
value: 46.129999999999995
- type: precision_at_10
value: 26.594
- type: precision_at_100
value: 8.365
- type: precision_at_1000
value: 2.1260000000000003
- type: precision_at_3
value: 39.009
- type: precision_at_5
value: 34.861
- type: recall_at_1
value: 5.863
- type: recall_at_10
value: 17.961
- type: recall_at_100
value: 34.026
- type: recall_at_1000
value: 64.46499999999999
- type: recall_at_3
value: 11.242
- type: recall_at_5
value: 14.493
- task:
type: Retrieval
dataset:
type: nq
name: MTEB NQ
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 38.601
- type: map_at_10
value: 55.293000000000006
- type: map_at_100
value: 56.092
- type: map_at_1000
value: 56.111999999999995
- type: map_at_3
value: 51.269
- type: map_at_5
value: 53.787
- type: mrr_at_1
value: 43.221
- type: mrr_at_10
value: 57.882999999999996
- type: mrr_at_100
value: 58.408
- type: mrr_at_1000
value: 58.421
- type: mrr_at_3
value: 54.765
- type: mrr_at_5
value: 56.809
- type: ndcg_at_1
value: 43.221
- type: ndcg_at_10
value: 62.858999999999995
- type: ndcg_at_100
value: 65.987
- type: ndcg_at_1000
value: 66.404
- type: ndcg_at_3
value: 55.605000000000004
- type: ndcg_at_5
value: 59.723000000000006
- type: precision_at_1
value: 43.221
- type: precision_at_10
value: 9.907
- type: precision_at_100
value: 1.169
- type: precision_at_1000
value: 0.121
- type: precision_at_3
value: 25.019000000000002
- type: precision_at_5
value: 17.474
- type: recall_at_1
value: 38.601
- type: recall_at_10
value: 82.966
- type: recall_at_100
value: 96.154
- type: recall_at_1000
value: 99.223
- type: recall_at_3
value: 64.603
- type: recall_at_5
value: 73.97200000000001
- task:
type: Retrieval
dataset:
type: quora
name: MTEB QuoraRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 70.77
- type: map_at_10
value: 84.429
- type: map_at_100
value: 85.04599999999999
- type: map_at_1000
value: 85.065
- type: map_at_3
value: 81.461
- type: map_at_5
value: 83.316
- type: mrr_at_1
value: 81.51
- type: mrr_at_10
value: 87.52799999999999
- type: mrr_at_100
value: 87.631
- type: mrr_at_1000
value: 87.632
- type: mrr_at_3
value: 86.533
- type: mrr_at_5
value: 87.214
- type: ndcg_at_1
value: 81.47999999999999
- type: ndcg_at_10
value: 88.181
- type: ndcg_at_100
value: 89.39200000000001
- type: ndcg_at_1000
value: 89.52
- type: ndcg_at_3
value: 85.29299999999999
- type: ndcg_at_5
value: 86.88
- type: precision_at_1
value: 81.47999999999999
- type: precision_at_10
value: 13.367
- type: precision_at_100
value: 1.5230000000000001
- type: precision_at_1000
value: 0.157
- type: precision_at_3
value: 37.227
- type: precision_at_5
value: 24.494
- type: recall_at_1
value: 70.77
- type: recall_at_10
value: 95.199
- type: recall_at_100
value: 99.37700000000001
- type: recall_at_1000
value: 99.973
- type: recall_at_3
value: 86.895
- type: recall_at_5
value: 91.396
- task:
type: Clustering
dataset:
type: mteb/reddit-clustering
name: MTEB RedditClustering
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: v_measure
value: 50.686353396858344
- task:
type: Clustering
dataset:
type: mteb/reddit-clustering-p2p
name: MTEB RedditClusteringP2P
config: default
split: test
revision: 282350215ef01743dc01b456c7f5241fa8937f16
metrics:
- type: v_measure
value: 61.3664675312921
- task:
type: Retrieval
dataset:
type: scidocs
name: MTEB SCIDOCS
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 4.7379999999999995
- type: map_at_10
value: 12.01
- type: map_at_100
value: 14.02
- type: map_at_1000
value: 14.310999999999998
- type: map_at_3
value: 8.459
- type: map_at_5
value: 10.281
- type: mrr_at_1
value: 23.3
- type: mrr_at_10
value: 34.108
- type: mrr_at_100
value: 35.217
- type: mrr_at_1000
value: 35.272
- type: mrr_at_3
value: 30.833
- type: mrr_at_5
value: 32.768
- type: ndcg_at_1
value: 23.3
- type: ndcg_at_10
value: 20.116999999999997
- type: ndcg_at_100
value: 27.961000000000002
- type: ndcg_at_1000
value: 33.149
- type: ndcg_at_3
value: 18.902
- type: ndcg_at_5
value: 16.742
- type: precision_at_1
value: 23.3
- type: precision_at_10
value: 10.47
- type: precision_at_100
value: 2.177
- type: precision_at_1000
value: 0.34299999999999997
- type: precision_at_3
value: 17.567
- type: precision_at_5
value: 14.78
- type: recall_at_1
value: 4.7379999999999995
- type: recall_at_10
value: 21.221999999999998
- type: recall_at_100
value: 44.242
- type: recall_at_1000
value: 69.652
- type: recall_at_3
value: 10.688
- type: recall_at_5
value: 14.982999999999999
- task:
type: STS
dataset:
type: mteb/sickr-sts
name: MTEB SICK-R
config: default
split: test
revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
metrics:
- type: cos_sim_pearson
value: 84.84572946827069
- type: cos_sim_spearman
value: 80.48508130408966
- type: euclidean_pearson
value: 82.0481530027767
- type: euclidean_spearman
value: 80.45902876782752
- type: manhattan_pearson
value: 82.03728222483326
- type: manhattan_spearman
value: 80.45684282911755
- task:
type: STS
dataset:
type: mteb/sts12-sts
name: MTEB STS12
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cos_sim_pearson
value: 84.33476464677516
- type: cos_sim_spearman
value: 75.93057758003266
- type: euclidean_pearson
value: 80.89685744015691
- type: euclidean_spearman
value: 76.29929953441706
- type: manhattan_pearson
value: 80.91391345459995
- type: manhattan_spearman
value: 76.31985463110914
- task:
type: STS
dataset:
type: mteb/sts13-sts
name: MTEB STS13
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cos_sim_pearson
value: 84.63686106359005
- type: cos_sim_spearman
value: 85.22240034668202
- type: euclidean_pearson
value: 84.6074814189106
- type: euclidean_spearman
value: 85.17169644755828
- type: manhattan_pearson
value: 84.48329306239368
- type: manhattan_spearman
value: 85.0086508544768
- task:
type: STS
dataset:
type: mteb/sts14-sts
name: MTEB STS14
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cos_sim_pearson
value: 82.95455774064745
- type: cos_sim_spearman
value: 80.54074646118492
- type: euclidean_pearson
value: 81.79598955554704
- type: euclidean_spearman
value: 80.55837617606814
- type: manhattan_pearson
value: 81.78213797905386
- type: manhattan_spearman
value: 80.5666746878273
- task:
type: STS
dataset:
type: mteb/sts15-sts
name: MTEB STS15
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cos_sim_pearson
value: 87.92813309124739
- type: cos_sim_spearman
value: 88.81459873052108
- type: euclidean_pearson
value: 88.21193118930564
- type: euclidean_spearman
value: 88.87072745043731
- type: manhattan_pearson
value: 88.22576929706727
- type: manhattan_spearman
value: 88.8867671095791
- task:
type: STS
dataset:
type: mteb/sts16-sts
name: MTEB STS16
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cos_sim_pearson
value: 83.6881529671839
- type: cos_sim_spearman
value: 85.2807092969554
- type: euclidean_pearson
value: 84.62334178652704
- type: euclidean_spearman
value: 85.2116373296784
- type: manhattan_pearson
value: 84.54948211541777
- type: manhattan_spearman
value: 85.10737722637882
- task:
type: STS
dataset:
type: mteb/sts17-crosslingual-sts
name: MTEB STS17 (en-en)
config: en-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 88.55963694458408
- type: cos_sim_spearman
value: 89.36731628848683
- type: euclidean_pearson
value: 89.64975952985465
- type: euclidean_spearman
value: 89.29689484033007
- type: manhattan_pearson
value: 89.61234491713135
- type: manhattan_spearman
value: 89.20302520255782
- task:
type: STS
dataset:
type: mteb/sts22-crosslingual-sts
name: MTEB STS22 (en)
config: en
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 62.411800961903886
- type: cos_sim_spearman
value: 62.99105515749963
- type: euclidean_pearson
value: 65.29826669549443
- type: euclidean_spearman
value: 63.29880964105775
- type: manhattan_pearson
value: 65.00126190601183
- type: manhattan_spearman
value: 63.32011025899179
- task:
type: STS
dataset:
type: mteb/stsbenchmark-sts
name: MTEB STSBenchmark
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cos_sim_pearson
value: 85.83498531837608
- type: cos_sim_spearman
value: 87.21366640615442
- type: euclidean_pearson
value: 86.74764288798261
- type: euclidean_spearman
value: 87.06060470780834
- type: manhattan_pearson
value: 86.65971223951476
- type: manhattan_spearman
value: 86.99814399831457
- task:
type: Reranking
dataset:
type: mteb/scidocs-reranking
name: MTEB SciDocsRR
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 83.94448463485881
- type: mrr
value: 95.36291867174221
- task:
type: Retrieval
dataset:
type: scifact
name: MTEB SciFact
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 59.928000000000004
- type: map_at_10
value: 68.577
- type: map_at_100
value: 69.35900000000001
- type: map_at_1000
value: 69.37299999999999
- type: map_at_3
value: 66.217
- type: map_at_5
value: 67.581
- type: mrr_at_1
value: 63
- type: mrr_at_10
value: 69.994
- type: mrr_at_100
value: 70.553
- type: mrr_at_1000
value: 70.56700000000001
- type: mrr_at_3
value: 68.167
- type: mrr_at_5
value: 69.11699999999999
- type: ndcg_at_1
value: 63
- type: ndcg_at_10
value: 72.58
- type: ndcg_at_100
value: 75.529
- type: ndcg_at_1000
value: 76.009
- type: ndcg_at_3
value: 68.523
- type: ndcg_at_5
value: 70.301
- type: precision_at_1
value: 63
- type: precision_at_10
value: 9.333
- type: precision_at_100
value: 1.09
- type: precision_at_1000
value: 0.11299999999999999
- type: precision_at_3
value: 26.444000000000003
- type: precision_at_5
value: 17.067
- type: recall_at_1
value: 59.928000000000004
- type: recall_at_10
value: 83.544
- type: recall_at_100
value: 96
- type: recall_at_1000
value: 100
- type: recall_at_3
value: 72.072
- type: recall_at_5
value: 76.683
- task:
type: PairClassification
dataset:
type: mteb/sprintduplicatequestions-pairclassification
name: MTEB SprintDuplicateQuestions
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cos_sim_accuracy
value: 99.82178217821782
- type: cos_sim_ap
value: 95.41507679819003
- type: cos_sim_f1
value: 90.9456740442656
- type: cos_sim_precision
value: 91.49797570850203
- type: cos_sim_recall
value: 90.4
- type: dot_accuracy
value: 99.77227722772277
- type: dot_ap
value: 92.50123869445967
- type: dot_f1
value: 88.18414322250638
- type: dot_precision
value: 90.26178010471205
- type: dot_recall
value: 86.2
- type: euclidean_accuracy
value: 99.81782178217821
- type: euclidean_ap
value: 95.3935066749006
- type: euclidean_f1
value: 90.66128218071681
- type: euclidean_precision
value: 91.53924566768603
- type: euclidean_recall
value: 89.8
- type: manhattan_accuracy
value: 99.81881188118813
- type: manhattan_ap
value: 95.39767454613512
- type: manhattan_f1
value: 90.62019477191186
- type: manhattan_precision
value: 92.95478443743428
- type: manhattan_recall
value: 88.4
- type: max_accuracy
value: 99.82178217821782
- type: max_ap
value: 95.41507679819003
- type: max_f1
value: 90.9456740442656
- task:
type: Clustering
dataset:
type: mteb/stackexchange-clustering
name: MTEB StackExchangeClustering
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: v_measure
value: 64.96313921233748
- task:
type: Clustering
dataset:
type: mteb/stackexchange-clustering-p2p
name: MTEB StackExchangeClusteringP2P
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: v_measure
value: 33.602625720956745
- task:
type: Reranking
dataset:
type: mteb/stackoverflowdupquestions-reranking
name: MTEB StackOverflowDupQuestions
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 51.32659230651731
- type: mrr
value: 52.33861726508785
- task:
type: Summarization
dataset:
type: mteb/summeval
name: MTEB SummEval
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cos_sim_pearson
value: 31.01587644214203
- type: cos_sim_spearman
value: 30.974306908731013
- type: dot_pearson
value: 29.83339853838187
- type: dot_spearman
value: 30.07761671934048
- task:
type: Retrieval
dataset:
type: trec-covid
name: MTEB TRECCOVID
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 0.22
- type: map_at_10
value: 1.9539999999999997
- type: map_at_100
value: 11.437
- type: map_at_1000
value: 27.861000000000004
- type: map_at_3
value: 0.6479999999999999
- type: map_at_5
value: 1.0410000000000001
- type: mrr_at_1
value: 84
- type: mrr_at_10
value: 90.333
- type: mrr_at_100
value: 90.333
- type: mrr_at_1000
value: 90.333
- type: mrr_at_3
value: 90.333
- type: mrr_at_5
value: 90.333
- type: ndcg_at_1
value: 80
- type: ndcg_at_10
value: 78.31700000000001
- type: ndcg_at_100
value: 59.396
- type: ndcg_at_1000
value: 52.733
- type: ndcg_at_3
value: 81.46900000000001
- type: ndcg_at_5
value: 80.74
- type: precision_at_1
value: 84
- type: precision_at_10
value: 84
- type: precision_at_100
value: 60.980000000000004
- type: precision_at_1000
value: 23.432
- type: precision_at_3
value: 87.333
- type: precision_at_5
value: 86.8
- type: recall_at_1
value: 0.22
- type: recall_at_10
value: 2.156
- type: recall_at_100
value: 14.557999999999998
- type: recall_at_1000
value: 49.553999999999995
- type: recall_at_3
value: 0.685
- type: recall_at_5
value: 1.121
- task:
type: Retrieval
dataset:
type: webis-touche2020
name: MTEB Touche2020
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 3.373
- type: map_at_10
value: 11.701
- type: map_at_100
value: 17.144000000000002
- type: map_at_1000
value: 18.624
- type: map_at_3
value: 6.552
- type: map_at_5
value: 9.372
- type: mrr_at_1
value: 38.775999999999996
- type: mrr_at_10
value: 51.975
- type: mrr_at_100
value: 52.873999999999995
- type: mrr_at_1000
value: 52.873999999999995
- type: mrr_at_3
value: 47.619
- type: mrr_at_5
value: 50.578
- type: ndcg_at_1
value: 36.735
- type: ndcg_at_10
value: 27.212999999999997
- type: ndcg_at_100
value: 37.245
- type: ndcg_at_1000
value: 48.602000000000004
- type: ndcg_at_3
value: 30.916
- type: ndcg_at_5
value: 30.799
- type: precision_at_1
value: 38.775999999999996
- type: precision_at_10
value: 23.469
- type: precision_at_100
value: 7.327
- type: precision_at_1000
value: 1.486
- type: precision_at_3
value: 31.973000000000003
- type: precision_at_5
value: 32.245000000000005
- type: recall_at_1
value: 3.373
- type: recall_at_10
value: 17.404
- type: recall_at_100
value: 46.105000000000004
- type: recall_at_1000
value: 80.35
- type: recall_at_3
value: 7.4399999999999995
- type: recall_at_5
value: 12.183
- task:
type: Classification
dataset:
type: mteb/toxic_conversations_50k
name: MTEB ToxicConversationsClassification
config: default
split: test
revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
metrics:
- type: accuracy
value: 70.5592
- type: ap
value: 14.330910591410134
- type: f1
value: 54.45745186286521
- task:
type: Classification
dataset:
type: mteb/tweet_sentiment_extraction
name: MTEB TweetSentimentExtractionClassification
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 61.20543293718167
- type: f1
value: 61.45365480309872
- task:
type: Clustering
dataset:
type: mteb/twentynewsgroups-clustering
name: MTEB TwentyNewsgroupsClustering
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: v_measure
value: 43.81162998944145
- task:
type: PairClassification
dataset:
type: mteb/twittersemeval2015-pairclassification
name: MTEB TwitterSemEval2015
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cos_sim_accuracy
value: 86.69011146212075
- type: cos_sim_ap
value: 76.09792353652536
- type: cos_sim_f1
value: 70.10202763786646
- type: cos_sim_precision
value: 68.65671641791045
- type: cos_sim_recall
value: 71.60949868073878
- type: dot_accuracy
value: 85.33110806461227
- type: dot_ap
value: 70.19304383327554
- type: dot_f1
value: 67.22494202525122
- type: dot_precision
value: 65.6847935548842
- type: dot_recall
value: 68.83905013192611
- type: euclidean_accuracy
value: 86.5410979316922
- type: euclidean_ap
value: 75.91906915651882
- type: euclidean_f1
value: 69.6798975672215
- type: euclidean_precision
value: 67.6865671641791
- type: euclidean_recall
value: 71.79419525065963
- type: manhattan_accuracy
value: 86.60070334386363
- type: manhattan_ap
value: 75.94617413885031
- type: manhattan_f1
value: 69.52689565780946
- type: manhattan_precision
value: 68.3312101910828
- type: manhattan_recall
value: 70.76517150395777
- type: max_accuracy
value: 86.69011146212075
- type: max_ap
value: 76.09792353652536
- type: max_f1
value: 70.10202763786646
- task:
type: PairClassification
dataset:
type: mteb/twitterurlcorpus-pairclassification
name: MTEB TwitterURLCorpus
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cos_sim_accuracy
value: 89.25951798812434
- type: cos_sim_ap
value: 86.31476416599727
- type: cos_sim_f1
value: 78.52709971038477
- type: cos_sim_precision
value: 76.7629972792117
- type: cos_sim_recall
value: 80.37419156144134
- type: dot_accuracy
value: 88.03896456708192
- type: dot_ap
value: 83.26963599196237
- type: dot_f1
value: 76.72696459492317
- type: dot_precision
value: 73.56411162133521
- type: dot_recall
value: 80.17400677548507
- type: euclidean_accuracy
value: 89.21682772538519
- type: euclidean_ap
value: 86.29306071289969
- type: euclidean_f1
value: 78.40827030519554
- type: euclidean_precision
value: 77.42250243939053
- type: euclidean_recall
value: 79.41946412072683
- type: manhattan_accuracy
value: 89.22458959133776
- type: manhattan_ap
value: 86.2901934710645
- type: manhattan_f1
value: 78.54211378440453
- type: manhattan_precision
value: 76.85505858079729
- type: manhattan_recall
value: 80.30489682784109
- type: max_accuracy
value: 89.25951798812434
- type: max_ap
value: 86.31476416599727
- type: max_f1
value: 78.54211378440453
language:
- en
license: mit
---
## E5-large
**News (May 2023): please switch to [e5-large-v2](https://huggingface.co/intfloat/e5-large-v2), which offers better performance with the same usage.**
[Text Embeddings by Weakly-Supervised Contrastive Pre-training](https://arxiv.org/pdf/2212.03533.pdf).
Liang Wang, Nan Yang, Xiaolong Huang, Binxing Jiao, Linjun Yang, Daxin Jiang, Rangan Majumder, Furu Wei, arXiv 2022
This model has 24 layers and the embedding size is 1024.
## Usage
Below is an example to encode queries and passages from the MS-MARCO passage ranking dataset.
```python
import torch.nn.functional as F
from torch import Tensor
from transformers import AutoTokenizer, AutoModel
def average_pool(last_hidden_states: Tensor,
attention_mask: Tensor) -> Tensor:
last_hidden = last_hidden_states.masked_fill(~attention_mask[..., None].bool(), 0.0)
return last_hidden.sum(dim=1) / attention_mask.sum(dim=1)[..., None]
# Each input text should start with "query: " or "passage: ".
# For tasks other than retrieval, you can simply use the "query: " prefix.
input_texts = ['query: how much protein should a female eat',
'query: summit define',
"passage: As a general guideline, the CDC's average requirement of protein for women ages 19 to 70 is 46 grams per day. But, as you can see from this chart, you'll need to increase that if you're expecting or training for a marathon. Check out the chart below to see how much protein you should be eating each day.",
"passage: Definition of summit for English Language Learners. : 1 the highest point of a mountain : the top of a mountain. : 2 the highest level. : 3 a meeting or series of meetings between the leaders of two or more governments."]
tokenizer = AutoTokenizer.from_pretrained('intfloat/e5-large')
model = AutoModel.from_pretrained('intfloat/e5-large')
# Tokenize the input texts
batch_dict = tokenizer(input_texts, max_length=512, padding=True, truncation=True, return_tensors='pt')
outputs = model(**batch_dict)
embeddings = average_pool(outputs.last_hidden_state, batch_dict['attention_mask'])
# normalize embeddings
embeddings = F.normalize(embeddings, p=2, dim=1)
scores = (embeddings[:2] @ embeddings[2:].T) * 100
print(scores.tolist())
```
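As a quick sanity check that the masked mean pooling above ignores padded positions, here is a toy example on synthetic tensors (it redefines the same `average_pool` so the snippet is self-contained; the numbers are illustrative, not model outputs):

```python
import torch
from torch import Tensor

def average_pool(last_hidden_states: Tensor,
                 attention_mask: Tensor) -> Tensor:
    # Zero out hidden states at padded positions, then average over real tokens only.
    last_hidden = last_hidden_states.masked_fill(~attention_mask[..., None].bool(), 0.0)
    return last_hidden.sum(dim=1) / attention_mask.sum(dim=1)[..., None]

# One sequence of 3 tokens with hidden size 2; the last token is padding.
hidden = torch.tensor([[[1.0, 2.0], [3.0, 4.0], [100.0, 100.0]]])
mask = torch.tensor([[1, 1, 0]])

pooled = average_pool(hidden, mask)
# Only the two real tokens contribute: mean([1, 3]) = 2, mean([2, 4]) = 3.
assert torch.allclose(pooled, torch.tensor([[2.0, 3.0]]))
```

The padded position's large values (100.0) have no effect on the result, which is exactly the behavior the `masked_fill` is there to guarantee.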
## Training Details
Please refer to our paper at [https://arxiv.org/pdf/2212.03533.pdf](https://arxiv.org/pdf/2212.03533.pdf).
## Benchmark Evaluation
Check out [unilm/e5](https://github.com/microsoft/unilm/tree/master/e5) to reproduce evaluation results
on the [BEIR](https://arxiv.org/abs/2104.08663) and [MTEB benchmark](https://arxiv.org/abs/2210.07316).
## Support for Sentence Transformers
Below is an example for usage with sentence_transformers.
```python
from sentence_transformers import SentenceTransformer
model = SentenceTransformer('intfloat/e5-large')
input_texts = [
'query: how much protein should a female eat',
'query: summit define',
"passage: As a general guideline, the CDC's average requirement of protein for women ages 19 to 70 is 46 grams per day. But, as you can see from this chart, you'll need to increase that if you're expecting or training for a marathon. Check out the chart below to see how much protein you should be eating each day.",
"passage: Definition of summit for English Language Learners. : 1 the highest point of a mountain : the top of a mountain. : 2 the highest level. : 3 a meeting or series of meetings between the leaders of two or more governments."
]
embeddings = model.encode(input_texts, normalize_embeddings=True)
```
Package requirements:
`pip install sentence_transformers~=2.2.2`
Contributors: [michaelfeil](https://huggingface.co/michaelfeil)
## FAQ
**1. Do I need to add the prefix "query: " and "passage: " to input texts?**
Yes. This is how the model was trained; otherwise you will see a performance degradation.
Here are some rules of thumb:
- Use "query: " and "passage: " correspondingly for asymmetric tasks such as passage retrieval in open QA and ad-hoc information retrieval.
- Use "query: " prefix for symmetric tasks such as semantic similarity, paraphrase retrieval.
- Use "query: " prefix if you want to use embeddings as features, such as linear probing classification, clustering.
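The prefix rules above can be sketched with a tiny helper (the helper is hypothetical and not part of the model's API; the model only cares that the literal strings "query: " / "passage: " are prepended):

```python
def add_e5_prefix(texts, mode="query"):
    # mode="query" for queries and all symmetric tasks (STS, clustering, ...);
    # mode="passage" only for documents in asymmetric retrieval.
    if mode not in ("query", "passage"):
        raise ValueError("mode must be 'query' or 'passage'")
    return [f"{mode}: {t}" for t in texts]

queries = add_e5_prefix(["how much protein should a female eat"])
passages = add_e5_prefix(["Definition of summit ..."], mode="passage")
assert queries[0].startswith("query: ")
assert passages[0].startswith("passage: ")
```

The prefixed strings are then passed to the tokenizer or to `SentenceTransformer.encode` exactly as in the examples above.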
**2. Why are my reproduced results slightly different from those reported in the model card?**
Different versions of `transformers` and `pytorch` could cause negligible but non-zero performance differences.
**3. Why do the cosine similarity scores cluster between 0.7 and 1.0?**
This is a known and expected behavior, because we use a low temperature of 0.01 for the InfoNCE contrastive loss.
For text embedding tasks like text retrieval or semantic similarity,
what matters is the relative order of the scores instead of the absolute values,
so this should not be an issue.
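A minimal illustration that rankings are invariant to the narrow score range (the scores below are synthetic, not model outputs):

```python
# Hypothetical cosine scores for one query against four passages,
# all squeezed into the 0.7-1.0 band the FAQ describes.
scores = [0.82, 0.91, 0.74, 0.88]

def rank(values):
    # Indices of passages sorted from most to least similar.
    return sorted(range(len(values)), key=lambda i: -values[i])

# Any monotonic rescaling (e.g. stretching to [0, 1]) leaves the ranking intact.
rescaled = [(s - 0.7) / 0.3 for s in scores]
assert rank(scores) == rank(rescaled) == [1, 3, 0, 2]
```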
## Citation
If you find our paper or models helpful, please consider citing them as follows:
```
@article{wang2022text,
title={Text Embeddings by Weakly-Supervised Contrastive Pre-training},
author={Wang, Liang and Yang, Nan and Huang, Xiaolong and Jiao, Binxing and Yang, Linjun and Jiang, Daxin and Majumder, Rangan and Wei, Furu},
journal={arXiv preprint arXiv:2212.03533},
year={2022}
}
```
## Limitations
This model only works for English texts. Long texts will be truncated to at most 512 tokens.
0.0765380859375,
-0.008514404296875,
-0.041351318359375,
0.036468505859375,
0.05462646484375,
0.04815673828125,
-0.00981903076171875,
0.01285552978515625,
0.06707763671875,
0.028472900390625,
0.0003693103790283203,
0.01434326171875,
0.0158843994140625,
-0.049652099609375,
-0.022247314453125,
-0.06781005859375,
-0.0025501251220703125,
0.0195159912109375,
-0.035919189453125,
0.0168304443359375,
-0.004974365234375,
-0.021392822265625,
0.0032367706298828125,
0.035614013671875,
-0.0718994140625,
0.020538330078125,
-0.004207611083984375,
0.05548095703125,
-0.06622314453125,
0.038787841796875,
0.060089111328125,
-0.061065673828125,
-0.053497314453125,
-0.0007739067077636719,
-0.0254364013671875,
-0.04217529296875,
0.052947998046875,
0.039520263671875,
0.01080322265625,
0.004611968994140625,
-0.042816162109375,
-0.053985595703125,
0.08831787109375,
0.013031005859375,
-0.039642333984375,
-0.0219268798828125,
0.0216064453125,
0.0321044921875,
-0.03582763671875,
0.038543701171875,
0.0258026123046875,
0.025115966796875,
-0.00433349609375,
-0.05291748046875,
0.0198822021484375,
-0.023956298828125,
-0.007457733154296875,
-0.00849151611328125,
-0.0543212890625,
0.08660888671875,
-0.0192413330078125,
-0.0033893585205078125,
0.0034332275390625,
0.04791259765625,
0.0056304931640625,
0.00559234619140625,
0.0295562744140625,
0.04632568359375,
0.05023193359375,
-0.005252838134765625,
0.09210205078125,
-0.020721435546875,
0.040435791015625,
0.059356689453125,
0.022125244140625,
0.0673828125,
0.034088134765625,
-0.02703857421875,
0.050323486328125,
0.06378173828125,
-0.011016845703125,
0.052703857421875,
0.01018524169921875,
0.0093994140625,
-0.022125244140625,
0.0034351348876953125,
-0.048004150390625,
0.0265960693359375,
0.0135498046875,
-0.048614501953125,
-0.012908935546875,
0.005340576171875,
0.0076141357421875,
-0.01311492919921875,
-0.0157470703125,
0.036102294921875,
0.036224365234375,
-0.03399658203125,
0.07080078125,
0.00847625732421875,
0.053985595703125,
-0.051483154296875,
0.0127716064453125,
-0.01369476318359375,
0.0305938720703125,
-0.0238037109375,
-0.044891357421875,
0.008514404296875,
-0.00986480712890625,
-0.0218353271484375,
-0.009735107421875,
0.040863037109375,
-0.042510986328125,
-0.03497314453125,
0.0262298583984375,
0.038665771484375,
0.0202178955078125,
-0.02069091796875,
-0.0777587890625,
0.00740814208984375,
0.0008635520935058594,
-0.03118896484375,
0.0362548828125,
0.0165252685546875,
0.018463134765625,
0.037841796875,
0.037322998046875,
-0.0106658935546875,
-0.004974365234375,
0.01445770263671875,
0.060546875,
-0.048583984375,
-0.042816162109375,
-0.0615234375,
0.028106689453125,
-0.020233154296875,
-0.026275634765625,
0.0625,
0.050872802734375,
0.05792236328125,
-0.01275634765625,
0.034912109375,
-0.0036830902099609375,
0.00876617431640625,
-0.044403076171875,
0.047515869140625,
-0.052459716796875,
-0.004482269287109375,
-0.020751953125,
-0.07867431640625,
-0.01477813720703125,
0.06298828125,
-0.037750244140625,
0.0109405517578125,
0.0726318359375,
0.0572509765625,
-0.015594482421875,
-0.00933837890625,
0.0175323486328125,
0.044464111328125,
0.0240478515625,
0.061553955078125,
0.04010009765625,
-0.083984375,
0.055511474609375,
-0.00861358642578125,
-0.015625,
-0.0162811279296875,
-0.05767822265625,
-0.064208984375,
-0.049407958984375,
-0.044891357421875,
-0.0305938720703125,
0.011260986328125,
0.0732421875,
0.056365966796875,
-0.04583740234375,
-0.00765228271484375,
0.003604888916015625,
-0.01409912109375,
-0.02716064453125,
-0.0177459716796875,
0.044769287109375,
-0.0287628173828125,
-0.0694580078125,
0.01337432861328125,
-0.01329803466796875,
0.0058746337890625,
0.0087432861328125,
-0.006290435791015625,
-0.046875,
-0.0027904510498046875,
0.055328369140625,
-0.00787353515625,
-0.0287628173828125,
-0.0308380126953125,
-0.001308441162109375,
-0.0262451171875,
0.01195526123046875,
0.00945281982421875,
-0.047760009765625,
0.0188751220703125,
0.050048828125,
0.033355712890625,
0.07525634765625,
-0.0035037994384765625,
0.032379150390625,
-0.05072021484375,
0.01015472412109375,
0.007904052734375,
0.025482177734375,
0.0406494140625,
-0.021270751953125,
0.035430908203125,
0.0283203125,
-0.046661376953125,
-0.04559326171875,
-0.0095062255859375,
-0.07427978515625,
-0.0180511474609375,
0.07861328125,
-0.01322174072265625,
-0.0245513916015625,
0.0112762451171875,
-0.006366729736328125,
0.02716064453125,
-0.02203369140625,
0.055511474609375,
0.062347412109375,
-0.00864410400390625,
-0.0068359375,
-0.057373046875,
0.03961181640625,
0.0367431640625,
-0.040252685546875,
-0.02838134765625,
0.00551605224609375,
0.0345458984375,
0.0123138427734375,
0.03704833984375,
-0.012420654296875,
0.0017290115356445312,
0.0240478515625,
-0.004932403564453125,
-0.003692626953125,
-0.01019287109375,
-0.00560760498046875,
0.01485443115234375,
-0.01959228515625,
-0.024993896484375
]
] |
Helsinki-NLP/opus-mt-en-ro | 2023-08-16T11:30:56.000Z | [
"transformers",
"pytorch",
"tf",
"marian",
"text2text-generation",
"translation",
"en",
"ro",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | translation | Helsinki-NLP | null | null | Helsinki-NLP/opus-mt-en-ro | 1 | 6,907 | transformers | 2022-03-02T23:29:04 | ---
tags:
- translation
license: apache-2.0
---
### opus-mt-en-ro
* source languages: en
* target languages: ro
* OPUS readme: [en-ro](https://github.com/Helsinki-NLP/OPUS-MT-train/blob/master/models/en-ro/README.md)
* dataset: opus
* model: transformer-align
* pre-processing: normalization + SentencePiece
* download original weights: [opus-2019-12-18.zip](https://object.pouta.csc.fi/OPUS-MT-models/en-ro/opus-2019-12-18.zip)
* test set translations: [opus-2019-12-18.test.txt](https://object.pouta.csc.fi/OPUS-MT-models/en-ro/opus-2019-12-18.test.txt)
* test set scores: [opus-2019-12-18.eval.txt](https://object.pouta.csc.fi/OPUS-MT-models/en-ro/opus-2019-12-18.eval.txt)
## Benchmarks
| testset | BLEU | chr-F |
|-----------------------|-------|-------|
| newsdev2016-enro.en.ro | 30.8 | 0.592 |
| newstest2016-enro.en.ro | 28.8 | 0.571 |
| Tatoeba.en.ro | 45.3 | 0.670 |
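## How to use

A minimal sketch of running this checkpoint with the 🤗 `transformers` translation pipeline; the `pipeline` call is a generic usage pattern, not something specified by this card:

```python
from transformers import pipeline

# Load the English-to-Romanian Marian model from the Hub
translator = pipeline("translation", model="Helsinki-NLP/opus-mt-en-ro")

result = translator("How are you?")
print(result[0]["translation_text"])
```

The pipeline handles SentencePiece tokenization and decoding internally, matching the pre-processing listed above.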
| 907 | [
[
-0.01934814453125,
-0.0296630859375,
0.017913818359375,
0.027008056640625,
-0.0298004150390625,
-0.028656005859375,
-0.03076171875,
-0.00836944580078125,
0.0029659271240234375,
0.0333251953125,
-0.051116943359375,
-0.039703369140625,
-0.042755126953125,
0.01538848876953125,
-0.004756927490234375,
0.056488037109375,
-0.01303863525390625,
0.036956787109375,
0.01332855224609375,
-0.035125732421875,
-0.0291290283203125,
-0.0257720947265625,
-0.033203125,
-0.02618408203125,
0.0233154296875,
0.034637451171875,
0.0272064208984375,
0.026641845703125,
0.07440185546875,
0.0164947509765625,
-0.00666046142578125,
0.00499725341796875,
-0.033843994140625,
-0.002094268798828125,
0.0069427490234375,
-0.0445556640625,
-0.05926513671875,
-0.0082244873046875,
0.07562255859375,
0.02471923828125,
-0.004917144775390625,
0.032470703125,
-0.005481719970703125,
0.07086181640625,
-0.02288818359375,
0.0015859603881835938,
-0.045745849609375,
0.0100555419921875,
-0.0245361328125,
-0.0257415771484375,
-0.0491943359375,
-0.014007568359375,
0.0044097900390625,
-0.04864501953125,
0.0007944107055664062,
0.015838623046875,
0.10784912109375,
0.02532958984375,
-0.021697998046875,
-0.0118255615234375,
-0.044677734375,
0.0792236328125,
-0.057861328125,
0.0445556640625,
0.031341552734375,
0.0209808349609375,
0.017578125,
-0.04034423828125,
-0.02276611328125,
0.011199951171875,
-0.018402099609375,
0.019317626953125,
-0.0092620849609375,
-0.022125244140625,
0.02215576171875,
0.053436279296875,
-0.060272216796875,
-0.00240325927734375,
-0.04742431640625,
0.0028781890869140625,
0.054931640625,
0.011077880859375,
0.01383209228515625,
-0.0117034912109375,
-0.03558349609375,
-0.04058837890625,
-0.0576171875,
0.00835418701171875,
0.026763916015625,
0.0230255126953125,
-0.037261962890625,
0.050811767578125,
-0.01096343994140625,
0.04986572265625,
-0.006511688232421875,
-0.00283050537109375,
0.07489013671875,
-0.030426025390625,
-0.025634765625,
-0.01331329345703125,
0.092041015625,
0.029449462890625,
0.006313323974609375,
0.009185791015625,
-0.02142333984375,
-0.0159759521484375,
0.0091552734375,
-0.063720703125,
-0.00499725341796875,
0.0144195556640625,
-0.0325927734375,
-0.0118255615234375,
0.004611968994140625,
-0.048187255859375,
0.0141448974609375,
-0.0271453857421875,
0.04412841796875,
-0.043701171875,
-0.0220947265625,
0.0279083251953125,
-0.0006852149963378906,
0.032470703125,
0.0015926361083984375,
-0.0421142578125,
0.0143585205078125,
0.027923583984375,
0.054443359375,
-0.026336669921875,
-0.0157928466796875,
-0.0377197265625,
-0.0173797607421875,
-0.007534027099609375,
0.048309326171875,
-0.0090179443359375,
-0.0295562744140625,
-0.00832366943359375,
0.0323486328125,
-0.030059814453125,
-0.0267486572265625,
0.09967041015625,
-0.0216217041015625,
0.054840087890625,
-0.02728271484375,
-0.033050537109375,
-0.0230255126953125,
0.036590576171875,
-0.0394287109375,
0.09588623046875,
0.00659942626953125,
-0.06353759765625,
0.012969970703125,
-0.06201171875,
-0.013946533203125,
-0.004451751708984375,
0.002559661865234375,
-0.050384521484375,
0.002902984619140625,
0.0086669921875,
0.028106689453125,
-0.0271453857421875,
0.019561767578125,
0.0004284381866455078,
-0.021728515625,
0.005626678466796875,
-0.023284912109375,
0.0721435546875,
0.02410888671875,
-0.018402099609375,
0.0216217041015625,
-0.072265625,
-0.00031256675720214844,
0.002109527587890625,
-0.033050537109375,
-0.01163482666015625,
0.006168365478515625,
0.021148681640625,
0.007801055908203125,
0.02294921875,
-0.047943115234375,
0.0153350830078125,
-0.040069580078125,
0.0191802978515625,
0.046661376953125,
-0.022125244140625,
0.0282745361328125,
-0.0308380126953125,
0.02899169921875,
0.007320404052734375,
0.0111083984375,
0.0032405853271484375,
-0.035552978515625,
-0.06268310546875,
-0.016937255859375,
0.044158935546875,
0.08355712890625,
-0.0498046875,
0.06268310546875,
-0.05328369140625,
-0.057464599609375,
-0.0557861328125,
-0.00862884521484375,
0.03326416015625,
0.0306243896484375,
0.038177490234375,
-0.0124664306640625,
-0.033599853515625,
-0.08319091796875,
-0.00862884521484375,
-0.009490966796875,
-0.01471710205078125,
0.01081085205078125,
0.050079345703125,
-0.011810302734375,
0.040985107421875,
-0.040496826171875,
-0.032928466796875,
-0.01433563232421875,
0.007801055908203125,
0.037445068359375,
0.050384521484375,
0.036834716796875,
-0.06402587890625,
-0.046966552734375,
0.0016651153564453125,
-0.053436279296875,
-0.00841522216796875,
0.00844573974609375,
-0.0180816650390625,
0.0033435821533203125,
0.007236480712890625,
-0.02557373046875,
0.00804901123046875,
0.048065185546875,
-0.04425048828125,
0.039398193359375,
-0.007480621337890625,
0.016845703125,
-0.09661865234375,
0.00954437255859375,
-0.01031494140625,
-0.00980377197265625,
-0.031646728515625,
0.002193450927734375,
0.0226593017578125,
0.004962921142578125,
-0.0548095703125,
0.0423583984375,
-0.0218658447265625,
-0.004428863525390625,
0.0225982666015625,
0.004955291748046875,
0.00920867919921875,
0.04974365234375,
-0.0020313262939453125,
0.06353759765625,
0.05413818359375,
-0.035675048828125,
0.01306915283203125,
0.043609619140625,
-0.033721923828125,
0.0274658203125,
-0.062408447265625,
-0.01953125,
0.0209808349609375,
-0.004730224609375,
-0.048858642578125,
0.00676727294921875,
0.0282135009765625,
-0.05096435546875,
0.030364990234375,
-0.008636474609375,
-0.049713134765625,
-0.002056121826171875,
-0.0243072509765625,
0.032867431640625,
0.052001953125,
-0.0080718994140625,
0.042083740234375,
0.00513458251953125,
-0.0027618408203125,
-0.03466796875,
-0.0740966796875,
-0.0121612548828125,
-0.03204345703125,
-0.05859375,
0.01357269287109375,
-0.0275421142578125,
-0.006290435791015625,
0.0031490325927734375,
0.0229034423828125,
-0.004665374755859375,
-0.0011720657348632812,
0.005069732666015625,
0.018890380859375,
-0.0390625,
0.0070037841796875,
-0.00678253173828125,
-0.01407623291015625,
-0.01141357421875,
-0.005283355712890625,
0.044830322265625,
-0.0290374755859375,
-0.017181396484375,
-0.0440673828125,
0.0006198883056640625,
0.042724609375,
-0.036895751953125,
0.060791015625,
0.042877197265625,
-0.006847381591796875,
0.010467529296875,
-0.027069091796875,
0.006290435791015625,
-0.0322265625,
0.00858306884765625,
-0.034942626953125,
-0.05426025390625,
0.04425048828125,
0.01302337646484375,
0.033599853515625,
0.0621337890625,
0.040679931640625,
0.006893157958984375,
0.0450439453125,
0.0174102783203125,
0.006122589111328125,
0.033966064453125,
-0.038238525390625,
-0.00960540771484375,
-0.08709716796875,
0.0068206787109375,
-0.05255126953125,
-0.0290985107421875,
-0.061004638671875,
-0.0257415771484375,
0.0209808349609375,
0.00504302978515625,
-0.021942138671875,
0.0494384765625,
-0.0423583984375,
0.016754150390625,
0.0458984375,
-0.0073089599609375,
0.0206756591796875,
0.002727508544921875,
-0.034881591796875,
-0.018768310546875,
-0.03558349609375,
-0.03826904296875,
0.0972900390625,
0.0299224853515625,
0.0268402099609375,
0.0167083740234375,
0.03668212890625,
0.0014047622680664062,
0.016815185546875,
-0.043792724609375,
0.03271484375,
-0.0162200927734375,
-0.056365966796875,
-0.02618408203125,
-0.043731689453125,
-0.059661865234375,
0.040130615234375,
-0.0207366943359375,
-0.03472900390625,
0.01331329345703125,
-0.0019063949584960938,
-0.013092041015625,
0.03411865234375,
-0.048919677734375,
0.0838623046875,
-0.00804901123046875,
-0.0115814208984375,
0.0184173583984375,
-0.03643798828125,
0.022430419921875,
0.00460052490234375,
0.023956298828125,
-0.0180816650390625,
0.011474609375,
0.0478515625,
-0.006496429443359375,
0.0305023193359375,
-0.0027408599853515625,
-0.0052947998046875,
0.0048065185546875,
0.0039215087890625,
0.03240966796875,
-0.006916046142578125,
-0.03363037109375,
0.02532958984375,
-0.001529693603515625,
-0.033233642578125,
-0.01001739501953125,
0.03955078125,
-0.053680419921875,
-0.00215911865234375,
-0.037322998046875,
-0.047943115234375,
0.001255035400390625,
0.0235748291015625,
0.05035400390625,
0.053131103515625,
-0.018768310546875,
0.04302978515625,
0.06353759765625,
-0.0257110595703125,
0.024627685546875,
0.055267333984375,
-0.0178375244140625,
-0.044219970703125,
0.06494140625,
0.00576019287109375,
0.0298919677734375,
0.047088623046875,
0.00879669189453125,
-0.01029205322265625,
-0.052886962890625,
-0.053375244140625,
0.018218994140625,
-0.0218353271484375,
-0.0173492431640625,
-0.0389404296875,
-0.00836944580078125,
-0.0208892822265625,
0.017730712890625,
-0.038726806640625,
-0.046173095703125,
-0.01044464111328125,
-0.0167999267578125,
0.018951416015625,
0.0203399658203125,
-0.005870819091796875,
0.036041259765625,
-0.0753173828125,
0.013641357421875,
-0.0106048583984375,
0.0268402099609375,
-0.0343017578125,
-0.0621337890625,
-0.03167724609375,
0.003040313720703125,
-0.0498046875,
-0.045135498046875,
0.040740966796875,
0.01172637939453125,
0.0196533203125,
0.0272674560546875,
0.01519775390625,
0.028045654296875,
-0.0548095703125,
0.07244873046875,
0.0003962516784667969,
-0.047088623046875,
0.04058837890625,
-0.03558349609375,
0.0289306640625,
0.06591796875,
0.0166778564453125,
-0.0266265869140625,
-0.03857421875,
-0.0560302734375,
-0.06256103515625,
0.061431884765625,
0.051849365234375,
-0.013031005859375,
0.019561767578125,
-0.0099639892578125,
-0.0028209686279296875,
0.00991058349609375,
-0.0831298828125,
-0.0298004150390625,
0.008544921875,
-0.0265655517578125,
-0.01531219482421875,
-0.0196075439453125,
-0.021759033203125,
-0.0157623291015625,
0.0797119140625,
0.016387939453125,
0.0157928466796875,
0.032806396484375,
-0.00891876220703125,
-0.0180206298828125,
0.0277252197265625,
0.07867431640625,
0.042144775390625,
-0.04425048828125,
-0.00933837890625,
0.0206298828125,
-0.0309600830078125,
-0.01253509521484375,
0.00437164306640625,
-0.031829833984375,
0.0205078125,
0.037384033203125,
0.08306884765625,
0.0111846923828125,
-0.0472412109375,
0.0299530029296875,
-0.0272674560546875,
-0.03375244140625,
-0.0521240234375,
-0.006710052490234375,
0.0081024169921875,
0.00031948089599609375,
0.0167083740234375,
0.016754150390625,
0.0108795166015625,
-0.01464080810546875,
0.0091705322265625,
0.0100555419921875,
-0.048736572265625,
-0.040008544921875,
0.0379638671875,
0.01142120361328125,
-0.0258331298828125,
0.039276123046875,
-0.026641845703125,
-0.043304443359375,
0.031158447265625,
0.00812530517578125,
0.07659912109375,
-0.015777587890625,
-0.016082763671875,
0.058868408203125,
0.04632568359375,
-0.01739501953125,
0.03594970703125,
0.01178741455078125,
-0.0570068359375,
-0.0345458984375,
-0.06231689453125,
-0.0125274658203125,
0.00868988037109375,
-0.06842041015625,
0.0268402099609375,
0.0221099853515625,
0.0011386871337890625,
-0.0264892578125,
0.018280029296875,
-0.037322998046875,
0.01081085205078125,
-0.023345947265625,
0.07904052734375,
-0.07080078125,
0.06390380859375,
0.040191650390625,
-0.0209808349609375,
-0.057098388671875,
-0.0187835693359375,
-0.0173187255859375,
-0.037506103515625,
0.04608154296875,
0.01065826416015625,
0.0206298828125,
-0.01259613037109375,
-0.0189666748046875,
-0.062042236328125,
0.0850830078125,
0.00891876220703125,
-0.044830322265625,
0.00846099853515625,
0.01178741455078125,
0.035675048828125,
-0.0260162353515625,
0.010711669921875,
0.0260772705078125,
0.05517578125,
0.005466461181640625,
-0.08154296875,
-0.0219573974609375,
-0.043365478515625,
-0.026336669921875,
0.042144775390625,
-0.04779052734375,
0.07415771484375,
0.0323486328125,
-0.00860595703125,
0.0023021697998046875,
0.045196533203125,
0.0251312255859375,
0.02239990234375,
0.038238525390625,
0.0880126953125,
0.031524658203125,
-0.03631591796875,
0.07452392578125,
-0.0256195068359375,
0.039276123046875,
0.09033203125,
-0.006885528564453125,
0.07489013671875,
0.0233154296875,
-0.01345062255859375,
0.03778076171875,
0.045562744140625,
-0.0182037353515625,
0.033905029296875,
0.0015392303466796875,
0.01277923583984375,
-0.0139007568359375,
0.0208587646484375,
-0.056732177734375,
0.01297760009765625,
0.01739501953125,
-0.0150909423828125,
0.0029430389404296875,
-0.004245758056640625,
0.0032329559326171875,
-0.0023822784423828125,
-0.007305145263671875,
0.040618896484375,
-0.000858306884765625,
-0.04132080078125,
0.055572509765625,
-0.0018358230590820312,
0.051239013671875,
-0.05517578125,
0.0091400146484375,
-0.006011962890625,
0.025543212890625,
-0.0015106201171875,
-0.046356201171875,
0.036224365234375,
0.001308441162109375,
-0.02203369140625,
-0.038055419921875,
0.017059326171875,
-0.04046630859375,
-0.068359375,
0.028350830078125,
0.0291900634765625,
0.0255584716796875,
0.0051727294921875,
-0.06365966796875,
0.00655364990234375,
0.01218414306640625,
-0.04718017578125,
0.005641937255859375,
0.052886962890625,
0.022125244140625,
0.037689208984375,
0.044189453125,
0.0181732177734375,
0.0206756591796875,
-0.007297515869140625,
0.04730224609375,
-0.034393310546875,
-0.029510498046875,
-0.054595947265625,
0.0615234375,
-0.01058197021484375,
-0.053436279296875,
0.055023193359375,
0.07867431640625,
0.07489013671875,
-0.010650634765625,
0.02142333984375,
-0.01033782958984375,
0.058349609375,
-0.048370361328125,
0.043914794921875,
-0.07281494140625,
0.017669677734375,
-0.002559661865234375,
-0.06427001953125,
-0.0179443359375,
0.031890869140625,
-0.0185394287109375,
-0.027984619140625,
0.055450439453125,
0.051025390625,
-0.01500701904296875,
-0.01322174072265625,
0.0196380615234375,
0.0225982666015625,
0.01496124267578125,
0.040496826171875,
0.031890869140625,
-0.0782470703125,
0.0361328125,
-0.020355224609375,
-0.00821685791015625,
-0.0034809112548828125,
-0.050079345703125,
-0.059906005859375,
-0.042755126953125,
-0.0156097412109375,
-0.0170440673828125,
-0.0206756591796875,
0.06317138671875,
0.03857421875,
-0.0704345703125,
-0.0439453125,
0.0033054351806640625,
0.011383056640625,
-0.01361083984375,
-0.0200042724609375,
0.041412353515625,
-0.0257110595703125,
-0.0693359375,
0.03680419921875,
0.00238800048828125,
-0.0096282958984375,
-0.0012969970703125,
-0.0262451171875,
-0.037841796875,
-0.006450653076171875,
0.023162841796875,
0.0021228790283203125,
-0.038726806640625,
0.0088958740234375,
0.00836944580078125,
-0.005870819091796875,
0.03228759765625,
0.0246734619140625,
-0.01477813720703125,
0.013763427734375,
0.064697265625,
0.0162506103515625,
0.0305328369140625,
-0.00600433349609375,
0.040130615234375,
-0.0523681640625,
0.02630615234375,
0.01500701904296875,
0.0411376953125,
0.0269012451171875,
-0.000614166259765625,
0.06488037109375,
0.014556884765625,
-0.048736572265625,
-0.08447265625,
0.00537872314453125,
-0.0892333984375,
0.0004417896270751953,
0.07086181640625,
-0.0181732177734375,
-0.0216827392578125,
0.02728271484375,
-0.0084228515625,
0.006771087646484375,
-0.0244903564453125,
0.02313232421875,
0.06536865234375,
0.0216522216796875,
0.00495147705078125,
-0.062164306640625,
0.0262603759765625,
0.0347900390625,
-0.051483154296875,
-0.01328277587890625,
0.01306915283203125,
0.0125579833984375,
0.031524658203125,
0.036407470703125,
-0.0235137939453125,
0.0000247955322265625,
-0.0193328857421875,
0.0345458984375,
-0.002162933349609375,
-0.01515960693359375,
-0.0225982666015625,
-0.0023479461669921875,
-0.005634307861328125,
-0.0260467529296875
]
] |
flaubert/flaubert_base_uncased | 2021-10-18T08:14:52.000Z | [
"transformers",
"pytorch",
"flaubert",
"fill-mask",
"bert",
"language-model",
"flue",
"french",
"flaubert-base",
"uncased",
"fr",
"dataset:flaubert",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | fill-mask | flaubert | null | null | flaubert/flaubert_base_uncased | 2 | 6,907 | transformers | 2022-03-02T23:29:05 | ---
language: fr
license: mit
datasets:
- flaubert
metrics:
- flue
tags:
- bert
- language-model
- flaubert
- flue
- french
- flaubert-base
- uncased
---
# FlauBERT: Unsupervised Language Model Pre-training for French
**FlauBERT** is a French BERT trained on a very large and heterogeneous French corpus. Models of different sizes are trained using the new CNRS (French National Centre for Scientific Research) [Jean Zay](http://www.idris.fr/eng/jean-zay/ ) supercomputer.
Along with FlauBERT comes [**FLUE**](https://github.com/getalp/Flaubert/tree/master/flue): an evaluation setup for French NLP systems similar to the popular GLUE benchmark. The goal is to enable further reproducible experiments in the future and to share models and progress on the French language. For more details please refer to the [official website](https://github.com/getalp/Flaubert).
## FlauBERT models
| Model name | Number of layers | Attention Heads | Embedding Dimension | Total Parameters |
| :------: | :---: | :---: | :---: | :---: |
| `flaubert-small-cased` | 6 | 8 | 512 | 54 M |
| `flaubert-base-uncased` | 12 | 12 | 768 | 137 M |
| `flaubert-base-cased` | 12 | 12 | 768 | 138 M |
| `flaubert-large-cased` | 24 | 16 | 1024 | 373 M |
**Note:** `flaubert-small-cased` is only partially trained, so its performance is not guaranteed. Consider using it for debugging purposes only.
## Using FlauBERT with Hugging Face's Transformers
```python
import torch
from transformers import FlaubertModel, FlaubertTokenizer
# Choose among ['flaubert/flaubert_small_cased', 'flaubert/flaubert_base_uncased',
# 'flaubert/flaubert_base_cased', 'flaubert/flaubert_large_cased']
modelname = 'flaubert/flaubert_base_cased'
# Load pretrained model and tokenizer
flaubert, log = FlaubertModel.from_pretrained(modelname, output_loading_info=True)
flaubert_tokenizer = FlaubertTokenizer.from_pretrained(modelname, do_lowercase=False)
# Set do_lowercase=False for cased models and True for uncased ones
sentence = "Le chat mange une pomme."
token_ids = torch.tensor([flaubert_tokenizer.encode(sentence)])
last_layer = flaubert(token_ids)[0]
print(last_layer.shape)
# torch.Size([1, 8, 768]) -> (batch size x number of tokens x embedding dimension)
# The BERT [CLS] token corresponds to the first hidden state of the last layer
cls_embedding = last_layer[:, 0, :]
```
**Note:** if your `transformers` version is <=2.10.0, `modelname` should take one
of the following values:
```
['flaubert-small-cased', 'flaubert-base-uncased', 'flaubert-base-cased', 'flaubert-large-cased']
```
## References
If you use FlauBERT or the FLUE Benchmark for your scientific publication, or if you find the resources in this repository useful, please cite one of the following papers:
[LREC paper](http://www.lrec-conf.org/proceedings/lrec2020/pdf/2020.lrec-1.302.pdf)
```
@InProceedings{le2020flaubert,
author = {Le, Hang and Vial, Lo\"{i}c and Frej, Jibril and Segonne, Vincent and Coavoux, Maximin and Lecouteux, Benjamin and Allauzen, Alexandre and Crabb\'{e}, Beno\^{i}t and Besacier, Laurent and Schwab, Didier},
title = {FlauBERT: Unsupervised Language Model Pre-training for French},
booktitle = {Proceedings of The 12th Language Resources and Evaluation Conference},
month = {May},
year = {2020},
address = {Marseille, France},
publisher = {European Language Resources Association},
pages = {2479--2490},
url = {https://www.aclweb.org/anthology/2020.lrec-1.302}
}
```
[TALN paper](https://hal.archives-ouvertes.fr/hal-02784776/)
```
@inproceedings{le2020flaubert,
title = {FlauBERT: des mod{\`e}les de langue contextualis{\'e}s pr{\'e}-entra{\^\i}n{\'e}s pour le fran{\c{c}}ais},
author = {Le, Hang and Vial, Lo{\"\i}c and Frej, Jibril and Segonne, Vincent and Coavoux, Maximin and Lecouteux, Benjamin and Allauzen, Alexandre and Crabb{\'e}, Beno{\^\i}t and Besacier, Laurent and Schwab, Didier},
booktitle = {Actes de la 6e conf{\'e}rence conjointe Journ{\'e}es d'{\'E}tudes sur la Parole (JEP, 31e {\'e}dition), Traitement Automatique des Langues Naturelles (TALN, 27e {\'e}dition), Rencontre des {\'E}tudiants Chercheurs en Informatique pour le Traitement Automatique des Langues (R{\'E}CITAL, 22e {\'e}dition). Volume 2: Traitement Automatique des Langues Naturelles},
pages = {268--278},
year = {2020},
organization = {ATALA}
}
``` | 4,473 | [
[
-0.02520751953125,
-0.055450439453125,
0.0262603759765625,
0.0135498046875,
-0.0009908676147460938,
0.0037899017333984375,
-0.0203704833984375,
-0.0079193115234375,
0.0252227783203125,
0.03680419921875,
-0.03094482421875,
-0.035552978515625,
-0.04705810546875,
-0.01392364501953125,
-0.044403076171875,
0.061492919921875,
-0.01395416259765625,
-0.002414703369140625,
-0.005157470703125,
-0.0025196075439453125,
0.0103759765625,
-0.05877685546875,
-0.037109375,
-0.017791748046875,
0.036468505859375,
0.0051116943359375,
0.040435791015625,
0.0221099853515625,
0.01154327392578125,
0.031280517578125,
-0.0103607177734375,
-0.005146026611328125,
-0.041900634765625,
-0.0167388916015625,
0.006732940673828125,
-0.0253143310546875,
-0.0255279541015625,
-0.00396728515625,
0.05230712890625,
0.042388916015625,
0.00940704345703125,
0.0010232925415039062,
0.00007772445678710938,
0.051422119140625,
-0.034912109375,
0.022674560546875,
-0.026824951171875,
0.012969970703125,
-0.00969696044921875,
-0.0028667449951171875,
-0.04168701171875,
-0.0100555419921875,
0.0226593017578125,
-0.0301055908203125,
0.02581787109375,
-0.00870513916015625,
0.08551025390625,
0.0054779052734375,
-0.032684326171875,
0.0031642913818359375,
-0.0511474609375,
0.06549072265625,
-0.050628662109375,
0.046295166015625,
0.01259613037109375,
0.00035309791564941406,
-0.02001953125,
-0.0833740234375,
-0.04949951171875,
-0.0128021240234375,
-0.01503753662109375,
0.004047393798828125,
-0.01824951171875,
-0.00518035888671875,
0.0199432373046875,
0.0274810791015625,
-0.037017822265625,
-0.0138702392578125,
-0.0423583984375,
-0.037109375,
0.05084228515625,
-0.0168609619140625,
0.017578125,
-0.0006189346313476562,
-0.03643798828125,
-0.038909912109375,
-0.0185089111328125,
0.018157958984375,
0.0255279541015625,
0.01548004150390625,
-0.0253448486328125,
0.033416748046875,
-0.00003981590270996094,
0.035858154296875,
-0.003170013427734375,
0.00550079345703125,
0.060272216796875,
-0.01102447509765625,
-0.0223846435546875,
-0.003551483154296875,
0.084228515625,
0.005901336669921875,
0.034637451171875,
-0.0092926025390625,
-0.0262451171875,
-0.0202789306640625,
0.010833740234375,
-0.05810546875,
-0.0300750732421875,
0.0239105224609375,
-0.03564453125,
-0.0230865478515625,
0.0229339599609375,
-0.042236328125,
-0.003025054931640625,
-0.00424957275390625,
0.05902099609375,
-0.041534423828125,
-0.03424072265625,
0.010345458984375,
0.0027790069580078125,
0.031982421875,
0.004596710205078125,
-0.06732177734375,
0.0133056640625,
0.032135009765625,
0.0618896484375,
0.0020198822021484375,
-0.0310821533203125,
-0.032012939453125,
-0.01230621337890625,
-0.0216064453125,
0.033966064453125,
-0.03173828125,
-0.01491546630859375,
0.021484375,
0.0234375,
-0.0290069580078125,
-0.015106201171875,
0.05535888671875,
-0.0286102294921875,
0.0233154296875,
-0.013092041015625,
-0.046051025390625,
-0.0302886962890625,
-0.01361846923828125,
-0.049346923828125,
0.0849609375,
0.0341796875,
-0.062744140625,
0.01111602783203125,
-0.03887939453125,
-0.0305938720703125,
-0.0094451904296875,
-0.016571044921875,
-0.0316162109375,
0.01678466796875,
0.0301513671875,
0.0474853515625,
0.001361846923828125,
-0.004589080810546875,
-0.0078582763671875,
-0.022125244140625,
0.0211029052734375,
-0.018798828125,
0.0780029296875,
0.01270294189453125,
-0.02734375,
0.0187225341796875,
-0.052398681640625,
0.00997161865234375,
0.018829345703125,
-0.0158538818359375,
0.00044798851013183594,
-0.0137176513671875,
0.022247314453125,
0.0109100341796875,
0.0384521484375,
-0.043426513671875,
0.01024627685546875,
-0.036773681640625,
0.047760009765625,
0.05548095703125,
0.0064849853515625,
0.026275634765625,
-0.0225067138671875,
0.021484375,
0.022613525390625,
0.021240234375,
-0.0024566650390625,
-0.0281982421875,
-0.07977294921875,
-0.0277862548828125,
0.042022705078125,
0.044464111328125,
-0.04150390625,
0.05694580078125,
-0.01520538330078125,
-0.03839111328125,
-0.02587890625,
-0.004093170166015625,
0.013824462890625,
0.0174713134765625,
0.03594970703125,
-0.01229095458984375,
-0.026336669921875,
-0.08349609375,
-0.009979248046875,
-0.0003867149353027344,
0.004154205322265625,
-0.00359344482421875,
0.052886962890625,
-0.03021240234375,
0.04736328125,
-0.0287322998046875,
-0.02960205078125,
-0.0184783935546875,
0.01214599609375,
0.032562255859375,
0.0491943359375,
0.06414794921875,
-0.037322998046875,
-0.037445068359375,
-0.01439666748046875,
-0.045623779296875,
0.0218048095703125,
-0.005146026611328125,
-0.026031494140625,
0.0281524658203125,
0.031524658203125,
-0.036163330078125,
0.038055419921875,
0.0269622802734375,
-0.0244598388671875,
0.0253143310546875,
-0.0207672119140625,
0.00415802001953125,
-0.0703125,
-0.0077056884765625,
-0.0008478164672851562,
-0.018646240234375,
-0.057098388671875,
-0.005939483642578125,
0.00594329833984375,
0.00656890869140625,
-0.045013427734375,
0.03912353515625,
-0.03570556640625,
0.00980377197265625,
0.01470184326171875,
0.01390838623046875,
-0.007274627685546875,
0.0628662109375,
0.0062103271484375,
0.031890869140625,
0.07159423828125,
-0.032562255859375,
0.0195159912109375,
0.027130126953125,
-0.033538818359375,
0.0097808837890625,
-0.051300048828125,
0.01450347900390625,
-0.001850128173828125,
0.0167236328125,
-0.05841064453125,
-0.0020809173583984375,
0.01556396484375,
-0.03436279296875,
0.028350830078125,
-0.0005741119384765625,
-0.059661865234375,
-0.033477783203125,
-0.0263671875,
0.022125244140625,
0.047943115234375,
-0.033233642578125,
0.05047607421875,
0.015869140625,
0.008697509765625,
-0.05401611328125,
-0.0718994140625,
-0.026336669921875,
-0.005252838134765625,
-0.0606689453125,
0.02606201171875,
-0.01174163818359375,
0.01126861572265625,
-0.0038242340087890625,
-0.007389068603515625,
-0.01395416259765625,
-0.0140533447265625,
0.008026123046875,
-0.00003224611282348633,
-0.021240234375,
-0.0031490325927734375,
0.004207611083984375,
-0.003330230712890625,
0.006999969482421875,
-0.02484130859375,
0.055999755859375,
-0.040924072265625,
-0.022674560546875,
-0.037353515625,
0.017303466796875,
0.047637939453125,
-0.028778076171875,
0.07489013671875,
0.08197021484375,
-0.043731689453125,
-0.01226043701171875,
-0.036529541015625,
-0.0248260498046875,
-0.03912353515625,
0.02655029296875,
-0.0299072265625,
-0.058074951171875,
0.0535888671875,
0.0186767578125,
0.01209259033203125,
0.051116943359375,
0.041259765625,
0.0016183853149414062,
0.073486328125,
0.0438232421875,
-0.00650787353515625,
0.0439453125,
-0.06414794921875,
0.0272979736328125,
-0.046295166015625,
-0.0258941650390625,
-0.020751953125,
-0.03204345703125,
-0.032562255859375,
-0.0328369140625,
0.018310546875,
0.02972412109375,
-0.0249481201171875,
0.032501220703125,
-0.042205810546875,
0.0197601318359375,
0.051422119140625,
0.0217132568359375,
-0.010650634765625,
0.0196075439453125,
-0.045501708984375,
-0.00566864013671875,
-0.05535888671875,
-0.036376953125,
0.07806396484375,
0.044158935546875,
0.03436279296875,
0.0159912109375,
0.07177734375,
0.0133514404296875,
0.0011882781982421875,
-0.050628662109375,
0.041748046875,
-0.0145263671875,
-0.04510498046875,
-0.017974853515625,
-0.02069091796875,
-0.071044921875,
0.0203704833984375,
-0.0144500732421875,
-0.08258056640625,
0.02557373046875,
0.002178192138671875,
-0.02581787109375,
0.02960205078125,
-0.0521240234375,
0.0777587890625,
-0.0289306640625,
-0.023773193359375,
-0.0159759521484375,
-0.039794921875,
0.0197601318359375,
-0.00679779052734375,
0.011322021484375,
0.00858306884765625,
0.01204681396484375,
0.0755615234375,
-0.045013427734375,
0.056488037109375,
-0.0152587890625,
-0.00359344482421875,
0.031982421875,
0.00612640380859375,
0.03778076171875,
0.01169586181640625,
-0.00543975830078125,
0.02001953125,
0.0291748046875,
-0.04052734375,
-0.03558349609375,
0.059722900390625,
-0.075927734375,
-0.0288543701171875,
-0.053619384765625,
-0.0252532958984375,
-0.00464630126953125,
0.0250091552734375,
0.039703369140625,
0.060211181640625,
-0.004055023193359375,
0.0343017578125,
0.0498046875,
-0.0360107421875,
0.041290283203125,
0.0272674560546875,
-0.0252227783203125,
-0.01995849609375,
0.06890869140625,
0.0117340087890625,
0.00231170654296875,
0.045135498046875,
0.01396942138671875,
-0.03350830078125,
-0.02593994140625,
-0.00583648681640625,
0.031707763671875,
-0.06536865234375,
0.002895355224609375,
-0.06304931640625,
-0.050933837890625,
-0.0290374755859375,
-0.0144805908203125,
-0.037506103515625,
-0.040496826171875,
-0.040496826171875,
-0.004161834716796875,
0.03424072265625,
0.0411376953125,
-0.01971435546875,
0.0244598388671875,
-0.05322265625,
0.0022125244140625,
0.00682830810546875,
0.02001953125,
0.005481719970703125,
-0.05767822265625,
-0.026458740234375,
0.00786590576171875,
-0.0135955810546875,
-0.05828857421875,
0.0247955322265625,
0.01451873779296875,
0.067626953125,
0.038055419921875,
0.01861572265625,
0.0196533203125,
-0.030914306640625,
0.06585693359375,
0.01324462890625,
-0.07470703125,
0.03839111328125,
-0.01214599609375,
0.01168060302734375,
0.034759521484375,
0.033477783203125,
-0.018798828125,
-0.0251922607421875,
-0.07666015625,
-0.0784912109375,
0.056549072265625,
0.0301055908203125,
0.0192413330078125,
-0.01006317138671875,
0.0123748779296875,
-0.0062713623046875,
0.01096343994140625,
-0.0684814453125,
-0.04119873046875,
-0.02960205078125,
-0.0200042724609375,
-0.005870819091796875,
-0.021575927734375,
-0.0178375244140625,
-0.0426025390625,
0.0625,
0.01007843017578125,
0.060821533203125,
0.020477294921875,
-0.02874755859375,
0.004913330078125,
0.001331329345703125,
0.06982421875,
0.044921875,
-0.0291595458984375,
0.00710296630859375,
0.0169525146484375,
-0.0282440185546875,
-0.00565338134765625,
0.010986328125,
0.001979827880859375,
0.00702667236328125,
0.053985595703125,
0.0787353515625,
0.01169586181640625,
-0.029815673828125,
0.061737060546875,
-0.004253387451171875,
-0.036773681640625,
-0.053985595703125,
0.01380157470703125,
-0.01006317138671875,
0.035797119140625,
0.033843994140625,
0.006351470947265625,
-0.02447509765625,
-0.0207672119140625,
0.034393310546875,
0.026885986328125,
-0.032440185546875,
-0.029205322265625,
0.057098388671875,
0.006931304931640625,
-0.0230255126953125,
0.039276123046875,
-0.0135498046875,
-0.05499267578125,
0.03240966796875,
0.0227508544921875,
0.07598876953125,
-0.00518035888671875,
0.0228729248046875,
0.046966552734375,
0.03619384765625,
-0.0019245147705078125,
0.0302734375,
0.0081329345703125,
-0.05633544921875,
-0.01141357421875,
-0.055633544921875,
0.009490966796875,
0.0411376953125,
-0.04351806640625,
0.019317626953125,
-0.04364013671875,
-0.0059356689453125,
-0.0087890625,
-0.002353668212890625,
-0.0689697265625,
-0.0009570121765136719,
0.00531768798828125,
0.0738525390625,
-0.06182861328125,
0.07110595703125,
0.04791259765625,
-0.05352783203125,
-0.0655517578125,
0.0076141357421875,
-0.0143280029296875,
-0.057708740234375,
0.056915283203125,
0.0112457275390625,
-0.004261016845703125,
0.018341064453125,
-0.03485107421875,
-0.0692138671875,
0.0767822265625,
0.0233917236328125,
-0.0447998046875,
0.01422882080078125,
0.0009398460388183594,
0.045135498046875,
-0.02935791015625,
0.03564453125,
0.03948974609375,
0.035430908203125,
-0.0033435821533203125,
-0.05609130859375,
-0.0024394989013671875,
-0.017303466796875,
-0.00528717041015625,
0.0007214546203613281,
-0.06103515625,
0.07806396484375,
-0.01006317138671875,
-0.01519775390625,
-0.002216339111328125,
0.071533203125,
-0.0029201507568359375,
-0.0206298828125,
0.040618896484375,
0.03662109375,
0.05010986328125,
-0.0230255126953125,
0.067626953125,
-0.045623779296875,
0.05364990234375,
0.049896240234375,
0.00806427001953125,
0.0706787109375,
0.0255584716796875,
-0.027496337890625,
0.05621337890625,
0.049163818359375,
-0.005901336669921875,
0.050872802734375,
0.0160064697265625,
-0.0120849609375,
-0.0106201171875,
0.0297088623046875,
-0.0435791015625,
0.0247802734375,
0.0295867919921875,
-0.043365478515625,
0.0008730888366699219,
0.0052947998046875,
0.0075225830078125,
-0.0084228515625,
-0.0023136138916015625,
0.0201568603515625,
0.0098419189453125,
-0.02587890625,
0.085205078125,
-0.0029087066650390625,
0.039031982421875,
-0.04364013671875,
0.021484375,
-0.01904296875,
0.0277862548828125,
-0.020721435546875,
-0.05316162109375,
-0.00531005859375,
-0.02008056640625,
-0.0168609619140625,
-0.0092926025390625,
0.0283050537109375,
-0.04345703125,
-0.054595947265625,
0.035247802734375,
0.0253753662109375,
0.0248870849609375,
0.01265716552734375,
-0.07073974609375,
0.0131378173828125,
0.01299285888671875,
-0.045806884765625,
0.00708770751953125,
0.022430419921875,
0.0108489990234375,
0.039276123046875,
0.027435302734375,
-0.0076141357421875,
0.02386474609375,
0.0239715576171875,
0.059112548828125,
-0.0318603515625,
-0.036468505859375,
-0.0521240234375,
0.054595947265625,
0.0011539459228515625,
-0.0194549560546875,
0.037506103515625,
0.0501708984375,
0.0635986328125,
-0.028045654296875,
0.058929443359375,
-0.01255035400390625,
0.027801513671875,
-0.0367431640625,
0.058807373046875,
-0.05731201171875,
0.00963592529296875,
-0.026885986328125,
-0.08184814453125,
-0.010650634765625,
0.08056640625,
-0.012969970703125,
0.0168304443359375,
0.072265625,
0.0689697265625,
-0.01910400390625,
-0.015228271484375,
0.0199432373046875,
0.03662109375,
0.0252532958984375,
0.04754638671875,
0.045318603515625,
-0.05487060546875,
0.034820556640625,
-0.042755126953125,
-0.0203399658203125,
-0.028350830078125,
-0.061737060546875,
-0.09423828125,
-0.0772705078125,
-0.041778564453125,
-0.036224365234375,
-0.0196990966796875,
0.068603515625,
0.052886962890625,
-0.0794677734375,
-0.007537841796875,
-0.01044464111328125,
-0.00992584228515625,
-0.022857666015625,
-0.0211944580078125,
0.050018310546875,
-0.01282501220703125,
-0.055999755859375,
0.0279083251953125,
0.0016336441040039062,
0.01532745361328125,
-0.0311279296875,
-0.0118865966796875,
-0.0347900390625,
0.00853729248046875,
0.03631591796875,
0.01953125,
-0.062347412109375,
-0.044097900390625,
-0.0140228271484375,
-0.01001739501953125,
0.0006022453308105469,
0.046630859375,
-0.044921875,
0.0234375,
0.046051025390625,
0.02679443359375,
0.06597900390625,
-0.0260772705078125,
0.03704833984375,
-0.0762939453125,
0.0399169921875,
0.00972747802734375,
0.044921875,
0.01226043701171875,
-0.01483154296875,
0.034149169921875,
0.0143280029296875,
-0.04290771484375,
-0.071044921875,
0.01018524169921875,
-0.0772705078125,
-0.0136871337890625,
0.0738525390625,
-0.00801849365234375,
-0.0161590576171875,
0.01505279541015625,
-0.006473541259765625,
0.03564453125,
-0.0318603515625,
0.02471923828125,
0.054107666015625,
-0.01384735107421875,
-0.033660888671875,
-0.057952880859375,
0.0400390625,
0.028656005859375,
-0.03173828125,
-0.0191650390625,
0.0020084381103515625,
0.018646240234375,
0.02484130859375,
0.0350341796875,
-0.0011444091796875,
-0.0084228515625,
-0.0014982223510742188,
0.00856781005859375,
-0.001873016357421875,
-0.0223236083984375,
-0.005596160888671875,
-0.00594329833984375,
-0.012054443359375,
-0.0187225341796875
]
] |
roneneldan/TinyStories-3M | 2023-05-17T22:11:46.000Z | [
"transformers",
"pytorch",
"gpt_neo",
"text-generation",
"arxiv:2305.07759",
"endpoints_compatible",
"has_space",
"region:us"
] | text-generation | roneneldan | null | null | roneneldan/TinyStories-3M | 2 | 6,902 | transformers | 2023-05-12T21:46:51 | Model trained on the TinyStories Dataset, see https://arxiv.org/abs/2305.07759
------ EXAMPLE USAGE ------

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("roneneldan/TinyStories-3M")
# The model uses the GPT-Neo tokenizer
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neo-125M")

prompt = "Once upon a time there was"
input_ids = tokenizer.encode(prompt, return_tensors="pt")

# Generate a completion (greedy decoding)
output = model.generate(input_ids, max_length=1000, num_beams=1)

# Decode and print the generated text
output_text = tokenizer.decode(output[0], skip_special_tokens=True)
print(output_text)
```
 | 657 | [
[
-0.0222320556640625,
-0.0271759033203125,
0.0294647216796875,
0.00185394287109375,
-0.02154541015625,
-0.019866943359375,
0.002635955810546875,
0.00118255615234375,
0.00311279296875,
0.02398681640625,
-0.0595703125,
-0.03497314453125,
-0.045074462890625,
0.02996826171875,
-0.0352783203125,
0.0791015625,
0.0052490234375,
0.00714874267578125,
0.0207366943359375,
0.013397216796875,
-0.0164642333984375,
-0.028228759765625,
-0.060089111328125,
-0.0243377685546875,
0.0076751708984375,
0.0135040283203125,
0.037384033203125,
0.051727294921875,
0.022796630859375,
0.027374267578125,
-0.01355743408203125,
0.0199432373046875,
-0.026580810546875,
-0.018768310546875,
0.003032684326171875,
-0.03271484375,
-0.037689208984375,
-0.01007080078125,
0.06494140625,
0.0338134765625,
0.00823211669921875,
0.02447509765625,
0.021392822265625,
-0.002758026123046875,
-0.0208740234375,
0.02008056640625,
-0.049285888671875,
0.024749755859375,
0.0020465850830078125,
-0.006744384765625,
-0.0289306640625,
-0.009033203125,
0.005565643310546875,
-0.06976318359375,
0.054779052734375,
0.00943756103515625,
0.08831787109375,
0.036041259765625,
-0.04278564453125,
-0.0242919921875,
-0.038604736328125,
0.062744140625,
-0.055572509765625,
0.0107879638671875,
0.0333251953125,
0.013092041015625,
-0.0026226043701171875,
-0.09320068359375,
-0.04241943359375,
0.004058837890625,
-0.042877197265625,
-0.0099945068359375,
-0.006771087646484375,
0.005420684814453125,
0.044708251953125,
0.0228271484375,
-0.04180908203125,
-0.01520538330078125,
-0.0304107666015625,
-0.020751953125,
0.027862548828125,
0.04449462890625,
-0.0084381103515625,
-0.061065673828125,
-0.031280517578125,
-0.03131103515625,
-0.029754638671875,
-0.01515960693359375,
0.01076507568359375,
0.023193359375,
-0.011810302734375,
0.049224853515625,
-0.0130462646484375,
0.048828125,
0.023101806640625,
0.004730224609375,
0.031524658203125,
-0.058135986328125,
-0.02850341796875,
-0.01629638671875,
0.07861328125,
-0.004627227783203125,
-0.007232666015625,
-0.005207061767578125,
-0.0304107666015625,
-0.0143585205078125,
0.0100250244140625,
-0.07666015625,
-0.0274200439453125,
0.00437164306640625,
-0.046875,
-0.03143310546875,
0.0080413818359375,
-0.04144287109375,
0.002826690673828125,
-0.020294189453125,
0.05303955078125,
-0.0247955322265625,
-0.0043792724609375,
0.00470733642578125,
-0.006908416748046875,
0.0030002593994140625,
-0.01267242431640625,
-0.0723876953125,
0.01873779296875,
0.029296875,
0.07122802734375,
0.0172271728515625,
-0.036285400390625,
-0.031402587890625,
0.01507568359375,
-0.0115509033203125,
0.026885986328125,
0.01132965087890625,
-0.045989990234375,
-0.0086517333984375,
0.02264404296875,
-0.0184783935546875,
-0.0281219482421875,
0.0291290283203125,
-0.0216064453125,
0.0268402099609375,
-0.0006117820739746094,
-0.043853759765625,
0.00032639503479003906,
0.0223388671875,
-0.033721923828125,
0.0701904296875,
0.0202484130859375,
-0.057830810546875,
0.04010009765625,
-0.045379638671875,
-0.0175323486328125,
0.0015964508056640625,
-0.006992340087890625,
-0.052947998046875,
0.0193328857421875,
0.0231475830078125,
0.0282745361328125,
-0.000751495361328125,
0.0271148681640625,
-0.0233154296875,
-0.0235748291015625,
0.004543304443359375,
-0.035888671875,
0.06573486328125,
0.02264404296875,
-0.0273284912109375,
0.00516510009765625,
-0.05657958984375,
-0.00498199462890625,
0.01947021484375,
-0.0197296142578125,
-0.0006394386291503906,
-0.0218048095703125,
0.01261138916015625,
0.012420654296875,
0.037933349609375,
-0.041015625,
0.03448486328125,
-0.03375244140625,
0.036346435546875,
0.048797607421875,
0.01446533203125,
0.032012939453125,
-0.01555633544921875,
0.0234375,
0.00855255126953125,
0.0294647216796875,
-0.01385498046875,
-0.0169525146484375,
-0.07757568359375,
-0.0256195068359375,
0.0316162109375,
0.0156402587890625,
-0.06353759765625,
0.041534423828125,
-0.0262603759765625,
-0.03375244140625,
-0.031341552734375,
0.0031757354736328125,
0.0131988525390625,
0.0294036865234375,
0.0276947021484375,
0.00036144256591796875,
-0.0640869140625,
-0.06298828125,
0.0087127685546875,
-0.005039215087890625,
-0.0219573974609375,
0.0174102783203125,
0.0711669921875,
-0.038665771484375,
0.08380126953125,
-0.049468994140625,
-0.03448486328125,
0.001621246337890625,
0.03045654296875,
0.0394287109375,
0.05450439453125,
0.043243408203125,
-0.033050537109375,
-0.020721435546875,
-0.02880859375,
-0.0526123046875,
0.008026123046875,
-0.01004791259765625,
-0.00852203369140625,
-0.005382537841796875,
0.0265045166015625,
-0.06927490234375,
0.0345458984375,
0.02154541015625,
-0.047943115234375,
0.03656005859375,
-0.021148681640625,
0.000003874301910400391,
-0.109130859375,
0.0153656005859375,
-0.0258941650390625,
-0.027740478515625,
-0.0215911865234375,
-0.00908660888671875,
0.0084075927734375,
-0.0174102783203125,
-0.034271240234375,
0.056640625,
-0.03173828125,
-0.01380157470703125,
-0.023040771484375,
-0.006134033203125,
0.012786865234375,
0.0299224853515625,
0.012908935546875,
0.047760009765625,
0.04083251953125,
-0.054351806640625,
0.0287933349609375,
0.047607421875,
-0.0167388916015625,
-0.00214385986328125,
-0.0594482421875,
0.01375579833984375,
0.001804351806640625,
0.01263427734375,
-0.06549072265625,
-0.0149993896484375,
0.017486572265625,
-0.034454345703125,
0.0254669189453125,
-0.035888671875,
-0.060272216796875,
-0.049835205078125,
0.0016546249389648438,
0.047393798828125,
0.0207977294921875,
-0.06134033203125,
0.053558349609375,
0.0122528076171875,
0.00926971435546875,
-0.0304107666015625,
-0.053436279296875,
-0.01788330078125,
-0.01146697998046875,
-0.0266571044921875,
0.01184844970703125,
-0.006496429443359375,
0.0148162841796875,
-0.0016880035400390625,
0.0249481201171875,
0.0070953369140625,
0.0039825439453125,
0.0194091796875,
0.035369873046875,
-0.0126190185546875,
0.0031719207763671875,
0.0048675537109375,
-0.035369873046875,
0.005756378173828125,
-0.03131103515625,
0.062408447265625,
-0.031768798828125,
-0.0171356201171875,
-0.031005859375,
0.00634002685546875,
0.01837158203125,
0.010711669921875,
0.05706787109375,
0.056884765625,
-0.043243408203125,
-0.0106964111328125,
-0.01288604736328125,
-0.0472412109375,
-0.03961181640625,
0.0374755859375,
-0.0391845703125,
-0.046875,
0.05364990234375,
0.0223846435546875,
0.00858306884765625,
0.058135986328125,
0.0343017578125,
0.0121917724609375,
0.09735107421875,
0.0290679931640625,
0.01377105712890625,
0.03216552734375,
-0.0714111328125,
0.0087432861328125,
-0.05462646484375,
-0.0158233642578125,
-0.03656005859375,
-0.0106658935546875,
-0.033935546875,
-0.0178375244140625,
0.0252838134765625,
0.0191802978515625,
-0.034149169921875,
0.051116943359375,
-0.046295166015625,
0.0229034423828125,
0.030303955078125,
-0.00555419921875,
0.0139312744140625,
-0.003780364990234375,
-0.020416259765625,
-0.0011587142944335938,
-0.075439453125,
-0.032745361328125,
0.08770751953125,
0.0284423828125,
0.05023193359375,
-0.00937652587890625,
0.06536865234375,
0.006801605224609375,
0.03680419921875,
-0.054168701171875,
0.026885986328125,
-0.0113067626953125,
-0.068603515625,
-0.010528564453125,
-0.0501708984375,
-0.062744140625,
0.01187896728515625,
-0.0036334991455078125,
-0.039947509765625,
0.033477783203125,
0.0193328857421875,
-0.054107666015625,
0.0251617431640625,
-0.0428466796875,
0.0802001953125,
0.0024089813232421875,
-0.006305694580078125,
-0.0009031295776367188,
-0.021331787109375,
0.0260162353515625,
-0.0033054351806640625,
-0.00173187255859375,
0.0177764892578125,
-0.004047393798828125,
0.0692138671875,
-0.03704833984375,
0.06549072265625,
-0.0178680419921875,
0.0301971435546875,
0.0242156982421875,
0.00440216064453125,
0.03668212890625,
0.02935791015625,
0.0079803466796875,
0.01971435546875,
0.01137542724609375,
-0.028350830078125,
-0.01641845703125,
0.051788330078125,
-0.0775146484375,
-0.022186279296875,
-0.034210205078125,
-0.036651611328125,
0.01505279541015625,
0.03314208984375,
0.06787109375,
0.047454833984375,
-0.0166168212890625,
0.00921630859375,
0.03680419921875,
0.0110015869140625,
0.0654296875,
0.02923583984375,
-0.025970458984375,
-0.048431396484375,
0.044921875,
0.00833892822265625,
-0.00028228759765625,
-0.0028705596923828125,
0.002117156982421875,
-0.04052734375,
-0.0175018310546875,
-0.039306640625,
0.0294647216796875,
-0.04974365234375,
-0.0447998046875,
-0.061431884765625,
-0.0372314453125,
-0.04534912109375,
0.00196075439453125,
-0.04473876953125,
-0.037078857421875,
-0.06170654296875,
-0.01030731201171875,
0.021942138671875,
0.0556640625,
-0.023895263671875,
0.053497314453125,
-0.0653076171875,
0.023223876953125,
0.035064697265625,
0.004093170166015625,
0.0024433135986328125,
-0.07135009765625,
-0.031036376953125,
-0.004375457763671875,
-0.0189208984375,
-0.05303955078125,
0.039886474609375,
0.00955963134765625,
0.023834228515625,
0.04010009765625,
-0.00859832763671875,
0.031982421875,
-0.02288818359375,
0.054229736328125,
0.0077667236328125,
-0.0726318359375,
0.0426025390625,
-0.023223876953125,
0.033599853515625,
0.0323486328125,
0.01059722900390625,
-0.0157623291015625,
-0.00543975830078125,
-0.07354736328125,
-0.07269287109375,
0.0640869140625,
0.024505615234375,
0.0066070556640625,
-0.014862060546875,
0.0257568359375,
0.0079803466796875,
0.01125335693359375,
-0.07537841796875,
-0.023590087890625,
-0.041900634765625,
-0.0245513916015625,
-0.0015850067138671875,
-0.02044677734375,
-0.0202178955078125,
-0.0226593017578125,
0.07666015625,
-0.0212554931640625,
0.035675048828125,
0.0157012939453125,
-0.0183258056640625,
-0.0031185150146484375,
0.00018787384033203125,
0.037261962890625,
0.0394287109375,
-0.0253448486328125,
-0.004425048828125,
0.0306854248046875,
-0.0247039794921875,
0.00930023193359375,
0.0279541015625,
-0.0357666015625,
0.02313232421875,
0.01390838623046875,
0.07965087890625,
0.0038890838623046875,
0.00391387939453125,
0.035614013671875,
-0.0289764404296875,
-0.020294189453125,
-0.046356201171875,
0.0017786026000976562,
-0.01494598388671875,
0.0086517333984375,
0.03936767578125,
0.006885528564453125,
0.007144927978515625,
-0.01215362548828125,
0.0210113525390625,
0.02288818359375,
-0.022491455078125,
-0.0203704833984375,
0.063232421875,
-0.0008025169372558594,
-0.01751708984375,
0.05743408203125,
-0.02685546875,
-0.034393310546875,
0.04669189453125,
0.04522705078125,
0.07403564453125,
0.007572174072265625,
0.0037136077880859375,
0.056365966796875,
0.0302581787109375,
-0.0081634521484375,
0.0065765380859375,
-0.0146942138671875,
-0.0609130859375,
-0.0243988037109375,
-0.06927490234375,
-0.000946044921875,
0.0198211669921875,
-0.050323486328125,
0.0288543701171875,
-0.0400390625,
-0.041717529296875,
-0.01306915283203125,
0.0146942138671875,
-0.08038330078125,
0.018798828125,
0.004734039306640625,
0.062347412109375,
-0.07562255859375,
0.06878662109375,
0.04345703125,
-0.032684326171875,
-0.07293701171875,
-0.0180511474609375,
-0.01453399658203125,
-0.052520751953125,
0.0443115234375,
0.0078277587890625,
0.0116729736328125,
0.0262603759765625,
-0.041015625,
-0.0733642578125,
0.08587646484375,
0.0018901824951171875,
-0.0367431640625,
-0.023681640625,
0.01178741455078125,
0.0275726318359375,
-0.043853759765625,
0.039581298828125,
0.049407958984375,
0.02532958984375,
-0.0206298828125,
-0.04681396484375,
-0.005786895751953125,
-0.0223388671875,
0.017364501953125,
0.00044345855712890625,
-0.043853759765625,
0.074462890625,
-0.01087188720703125,
-0.0063629150390625,
0.036651611328125,
0.06585693359375,
0.034149169921875,
0.005405426025390625,
0.0430908203125,
0.052886962890625,
0.032867431640625,
-0.0231781005859375,
0.0697021484375,
-0.028594970703125,
0.07696533203125,
0.0828857421875,
0.0185089111328125,
0.030517578125,
0.0261077880859375,
0.011138916015625,
0.02154541015625,
0.053802490234375,
-0.03125,
0.052459716796875,
-0.01251983642578125,
-0.0115203857421875,
-0.018646240234375,
0.00821685791015625,
-0.0426025390625,
0.022125244140625,
0.029876708984375,
-0.043670654296875,
-0.00569915771484375,
0.01531219482421875,
-0.00787353515625,
-0.045745849609375,
-0.02142333984375,
0.046600341796875,
0.01153564453125,
-0.026519775390625,
0.047149658203125,
0.004383087158203125,
0.05072021484375,
-0.048248291015625,
0.0184478759765625,
-0.0164031982421875,
0.032012939453125,
-0.002239227294921875,
-0.03741455078125,
0.0201263427734375,
-0.004627227783203125,
-0.034759521484375,
-0.0140380859375,
0.046051025390625,
-0.0311431884765625,
-0.04608154296875,
0.01448822021484375,
0.01204681396484375,
0.00588226318359375,
0.01009368896484375,
-0.055999755859375,
-0.0160675048828125,
0.00205230712890625,
-0.047119140625,
-0.0030994415283203125,
0.03814697265625,
0.0328369140625,
0.037628173828125,
0.043609619140625,
-0.0029621124267578125,
0.0235595703125,
0.004596710205078125,
0.06396484375,
-0.041656494140625,
-0.049407958984375,
-0.04901123046875,
0.039306640625,
-0.009429931640625,
-0.051788330078125,
0.054840087890625,
0.06304931640625,
0.06585693359375,
-0.0233154296875,
0.033538818359375,
-0.00806427001953125,
0.031280517578125,
-0.03680419921875,
0.0621337890625,
-0.03680419921875,
0.0074462890625,
0.0044097900390625,
-0.0792236328125,
0.003734588623046875,
0.05780029296875,
-0.0171051025390625,
0.01397705078125,
0.05523681640625,
0.072265625,
-0.035400390625,
0.00403594970703125,
0.0132598876953125,
0.04498291015625,
0.0084991455078125,
0.02557373046875,
0.057525634765625,
-0.07354736328125,
0.047332763671875,
-0.038330078125,
0.005001068115234375,
-0.00334930419921875,
-0.045135498046875,
-0.056732177734375,
-0.0277862548828125,
-0.0157470703125,
-0.052032470703125,
-0.0248260498046875,
0.07464599609375,
0.06512451171875,
-0.07928466796875,
-0.00794219970703125,
-0.01080322265625,
-0.01129913330078125,
0.00986480712890625,
-0.0234222412109375,
0.0421142578125,
-0.02593994140625,
-0.05816650390625,
0.02972412109375,
-0.0200042724609375,
0.033966064453125,
-0.0194091796875,
-0.016204833984375,
-0.0031719207763671875,
-0.0008487701416015625,
0.0162353515625,
0.01033782958984375,
-0.02960205078125,
-0.0230255126953125,
-0.007282257080078125,
-0.03057861328125,
0.0003230571746826172,
0.04925537109375,
-0.066162109375,
0.021331787109375,
0.029998779296875,
0.018524169921875,
0.06689453125,
-0.023681640625,
0.03173828125,
-0.066650390625,
0.03106689453125,
0.0133209228515625,
0.05804443359375,
0.032745361328125,
-0.0231781005859375,
0.03826904296875,
0.03240966796875,
-0.04931640625,
-0.059844970703125,
-0.004322052001953125,
-0.04278564453125,
-0.01096343994140625,
0.06890869140625,
-0.011138916015625,
-0.0276947021484375,
0.0030364990234375,
-0.006977081298828125,
0.048187255859375,
-0.0121917724609375,
0.060211181640625,
0.034759521484375,
0.0143890380859375,
-0.0027179718017578125,
-0.01348876953125,
0.048309326171875,
0.0400390625,
-0.0428466796875,
-0.01044464111328125,
0.020477294921875,
0.0311279296875,
0.029327392578125,
0.0406494140625,
-0.0089111328125,
0.0201416015625,
0.0209808349609375,
0.008514404296875,
-0.0159759521484375,
-0.033660888671875,
-0.03057861328125,
0.011566162109375,
-0.01177978515625,
-0.0279541015625
]
] |
hiyouga/Baichuan2-7B-Chat-LLaMAfied | 2023-10-12T14:01:55.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"baichuan",
"llama2",
"baichuan2",
"en",
"zh",
"license:other",
"text-generation-inference",
"region:us"
] | text-generation | hiyouga | null | null | hiyouga/Baichuan2-7B-Chat-LLaMAfied | 3 | 6,889 | transformers | 2023-09-09T05:36:32 | ---
license: other
language:
- en
- zh
library_name: transformers
pipeline_tag: text-generation
inference: false
tags:
- baichuan
- llama2
- baichuan2
---
This is the LLaMAfied version of the [Baichuan2-7B-Chat](https://huggingface.co/baichuan-inc/Baichuan2-7B-Chat) model by Baichuan Inc.
The model was converted with the script at https://github.com/hiyouga/LLaMA-Factory/blob/main/tests/llamafy_baichuan2.py
You may use this model for fine-tuning on downstream tasks; we recommend using our efficient fine-tuning toolkit, [LLaMA-Factory](https://github.com/hiyouga/LLaMA-Factory).
- **Developed by:** Baichuan Inc.
- **Language(s) (NLP):** Chinese/English
- **License:** [Baichuan2 License](https://huggingface.co/baichuan-inc/Baichuan2-7B-Chat/resolve/main/Baichuan2%20%E6%A8%A1%E5%9E%8B%E7%A4%BE%E5%8C%BA%E8%AE%B8%E5%8F%AF%E5%8D%8F%E8%AE%AE.pdf)
Usage:
```python
from transformers import AutoModelForCausalLM, AutoTokenizer, TextStreamer
tokenizer = AutoTokenizer.from_pretrained("hiyouga/Baichuan2-7B-Chat-LLaMAfied", use_fast=False)
model = AutoModelForCausalLM.from_pretrained("hiyouga/Baichuan2-7B-Chat-LLaMAfied").cuda()
streamer = TextStreamer(tokenizer, skip_prompt=True, skip_special_tokens=True)
query = "<reserved_106>晚上睡不着怎么办<reserved_107>"  # "What should I do if I can't sleep at night?"
inputs = tokenizer([query], return_tensors="pt")
inputs = inputs.to("cuda")
generate_ids = model.generate(**inputs, max_new_tokens=256, streamer=streamer)
```
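The `<reserved_106>` and `<reserved_107>` markers in the query above are Baichuan2's user and assistant role tokens. As an illustrative sketch only (the token strings are taken from the example above; the multi-turn layout is an assumption, not an official API), a small helper can assemble such prompts before tokenization:

```python
# Baichuan2 chat markers, as used in the example query above (assumed layout).
USER_TOKEN = "<reserved_106>"       # marks the start of a user turn
ASSISTANT_TOKEN = "<reserved_107>"  # marks the start of an assistant turn

def build_prompt(history):
    """Build a chat prompt from (user, assistant) turn pairs.

    `history` is a list of (user_message, assistant_reply) tuples;
    pass None as the reply of the final tuple to request a completion.
    """
    parts = []
    for user_msg, assistant_msg in history:
        parts.append(f"{USER_TOKEN}{user_msg}{ASSISTANT_TOKEN}")
        if assistant_msg is not None:
            parts.append(assistant_msg)
    return "".join(parts)

prompt = build_prompt([("晚上睡不着怎么办", None)])
print(prompt)  # <reserved_106>晚上睡不着怎么办<reserved_107>
```

The resulting string can then be passed to `tokenizer(...)` exactly as in the snippet above.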
Alternatively, you can launch a CLI demo with the script in [LLaMA-Factory](https://github.com/hiyouga/LLaMA-Factory):
```bash
python src/cli_demo.py --template baichuan2 --model_name_or_path hiyouga/Baichuan2-7B-Chat-LLaMAfied
```
| 1,631 | [
[
-0.003810882568359375,
-0.067138671875,
0.00811767578125,
0.04296875,
-0.0322265625,
0.005069732666015625,
-0.01458740234375,
-0.02655029296875,
0.0218505859375,
0.0290985107421875,
-0.05474853515625,
-0.0296783447265625,
-0.048126220703125,
0.0012264251708984375,
-0.0243072509765625,
0.07440185546875,
0.00951385498046875,
-0.007457733154296875,
0.01125335693359375,
0.012451171875,
-0.0491943359375,
-0.027435302734375,
-0.0767822265625,
-0.029510498046875,
0.0176849365234375,
0.0188446044921875,
0.045440673828125,
0.059356689453125,
0.032867431640625,
0.024261474609375,
-0.0234527587890625,
0.0112152099609375,
-0.06256103515625,
-0.0099945068359375,
0.01763916015625,
-0.0400390625,
-0.061126708984375,
0.0001659393310546875,
0.0193023681640625,
0.01255035400390625,
-0.0166473388671875,
0.032928466796875,
0.0076904296875,
-0.0012578964233398438,
-0.03253173828125,
0.0279083251953125,
-0.04510498046875,
-0.00885009765625,
-0.019378662109375,
-0.002376556396484375,
-0.0235137939453125,
-0.0037364959716796875,
-0.01702880859375,
-0.0394287109375,
0.00681304931640625,
-0.0012416839599609375,
0.10369873046875,
0.036468505859375,
-0.03802490234375,
-0.004970550537109375,
-0.033843994140625,
0.05511474609375,
-0.08642578125,
0.004817962646484375,
0.0264892578125,
0.030487060546875,
-0.036956787109375,
-0.08160400390625,
-0.038604736328125,
-0.017486572265625,
-0.01531219482421875,
0.00927734375,
-0.0249481201171875,
0.0015354156494140625,
0.01467132568359375,
0.036407470703125,
-0.031982421875,
0.01763916015625,
-0.05645751953125,
-0.0209503173828125,
0.05963134765625,
0.0204010009765625,
0.0173797607421875,
-0.01523590087890625,
-0.037628173828125,
0.00525665283203125,
-0.04345703125,
0.006866455078125,
0.024383544921875,
0.00988006591796875,
-0.036773681640625,
0.0391845703125,
-0.03179931640625,
0.042083740234375,
0.031280517578125,
-0.0218505859375,
0.035247802734375,
-0.0167999267578125,
-0.0276947021484375,
-0.00406646728515625,
0.0743408203125,
0.0428466796875,
-0.006778717041015625,
0.0160369873046875,
-0.01617431640625,
-0.0027828216552734375,
0.00555419921875,
-0.06671142578125,
-0.0277557373046875,
0.0369873046875,
-0.050048828125,
-0.041778564453125,
-0.0150146484375,
-0.0224456787109375,
-0.01120758056640625,
0.0027332305908203125,
0.0113372802734375,
-0.01739501953125,
-0.033935546875,
-0.0005016326904296875,
0.0162353515625,
0.0233306884765625,
0.01320648193359375,
-0.07257080078125,
0.007663726806640625,
0.048492431640625,
0.07275390625,
0.005100250244140625,
-0.03326416015625,
-0.0274658203125,
0.00799560546875,
-0.02642822265625,
0.0546875,
-0.005207061767578125,
-0.03265380859375,
-0.004528045654296875,
0.020660400390625,
0.01505279541015625,
-0.0411376953125,
0.035614013671875,
-0.02294921875,
0.00418853759765625,
0.00801849365234375,
-0.031829833984375,
-0.0190582275390625,
0.0250091552734375,
-0.042694091796875,
0.08905029296875,
0.002956390380859375,
-0.04656982421875,
0.01020050048828125,
-0.047393798828125,
-0.028594970703125,
0.0017900466918945312,
0.00823974609375,
-0.0276947021484375,
-0.0218505859375,
0.0035800933837890625,
0.036651611328125,
-0.0116424560546875,
0.021209716796875,
-0.03289794921875,
-0.033050537109375,
0.0153961181640625,
-0.02923583984375,
0.081787109375,
0.031219482421875,
-0.033599853515625,
0.0263671875,
-0.05712890625,
-0.0211181640625,
0.00951385498046875,
-0.029937744140625,
0.0048675537109375,
0.0012493133544921875,
0.00354766845703125,
0.01666259765625,
0.05450439453125,
-0.03277587890625,
0.02056884765625,
-0.0310821533203125,
0.053924560546875,
0.051422119140625,
-0.0005979537963867188,
0.018768310546875,
-0.041534423828125,
0.014678955078125,
0.018310546875,
0.042694091796875,
-0.0037860870361328125,
-0.053558349609375,
-0.09228515625,
-0.0216064453125,
-0.0026149749755859375,
0.0306549072265625,
-0.0323486328125,
0.037200927734375,
-0.000823974609375,
-0.06396484375,
-0.044525146484375,
0.005970001220703125,
0.01132965087890625,
0.0298919677734375,
0.0125274658203125,
-0.0180511474609375,
-0.053863525390625,
-0.0643310546875,
0.016845703125,
-0.0250244140625,
-0.00002855062484741211,
0.019134521484375,
0.03948974609375,
-0.04486083984375,
0.044830322265625,
-0.0305938720703125,
-0.0235748291015625,
-0.02154541015625,
-0.00255584716796875,
0.05548095703125,
0.051849365234375,
0.049072265625,
-0.0236358642578125,
-0.030914306640625,
-0.00298309326171875,
-0.04705810546875,
-0.0154571533203125,
-0.0193023681640625,
-0.051025390625,
0.0126953125,
0.02001953125,
-0.05889892578125,
0.0419921875,
0.050140380859375,
-0.0208892822265625,
0.046112060546875,
-0.016265869140625,
0.01082611083984375,
-0.10772705078125,
0.0022411346435546875,
-0.02001953125,
-0.0104522705078125,
-0.03851318359375,
0.01409149169921875,
-0.0018320083618164062,
0.011871337890625,
-0.04931640625,
0.0531005859375,
-0.0330810546875,
0.021636962890625,
-0.032806396484375,
-0.002788543701171875,
-0.00428009033203125,
0.0400390625,
-0.00969696044921875,
0.058197021484375,
0.0390625,
-0.0462646484375,
0.07525634765625,
0.0474853515625,
-0.0219879150390625,
0.00217437744140625,
-0.06878662109375,
0.0176239013671875,
0.02099609375,
0.027587890625,
-0.078369140625,
-0.0243988037109375,
0.0478515625,
-0.03973388671875,
0.001071929931640625,
-0.026702880859375,
-0.031982421875,
-0.027191162109375,
-0.01409912109375,
0.024627685546875,
0.0462646484375,
-0.04815673828125,
0.031219482421875,
0.0243682861328125,
0.0014696121215820312,
-0.05242919921875,
-0.06494140625,
0.01172637939453125,
-0.018310546875,
-0.05511474609375,
0.0245361328125,
-0.006099700927734375,
0.00502777099609375,
-0.02545166015625,
0.00439453125,
-0.017974853515625,
0.00505828857421875,
0.031463623046875,
0.045440673828125,
-0.0156402587890625,
-0.0169219970703125,
0.002803802490234375,
-0.004863739013671875,
0.00933837890625,
0.004383087158203125,
0.060943603515625,
-0.005321502685546875,
-0.0057373046875,
-0.04638671875,
0.00933837890625,
0.0234222412109375,
-0.007396697998046875,
0.055389404296875,
0.060699462890625,
-0.0361328125,
-0.003204345703125,
-0.05450439453125,
-0.0218048095703125,
-0.04205322265625,
0.0272979736328125,
-0.024932861328125,
-0.04791259765625,
0.059783935546875,
0.00722503662109375,
0.0183868408203125,
0.038482666015625,
0.065185546875,
-0.0016126632690429688,
0.06439208984375,
0.0411376953125,
-0.0109405517578125,
0.039581298828125,
-0.0426025390625,
0.0071258544921875,
-0.05767822265625,
-0.03192138671875,
-0.023284912109375,
-0.0269317626953125,
-0.036773681640625,
-0.0211029052734375,
0.0005431175231933594,
0.007251739501953125,
-0.0450439453125,
0.04833984375,
-0.036376953125,
0.014617919921875,
0.04290771484375,
0.0156097412109375,
0.022613525390625,
-0.017425537109375,
0.0193634033203125,
0.005306243896484375,
-0.0372314453125,
-0.049072265625,
0.0643310546875,
0.027618408203125,
0.0638427734375,
0.0112152099609375,
0.0421142578125,
-0.001232147216796875,
0.0239105224609375,
-0.05194091796875,
0.0338134765625,
-0.0021533966064453125,
-0.06182861328125,
0.007080078125,
-0.02001953125,
-0.0748291015625,
0.0125885009765625,
-0.01678466796875,
-0.06134033203125,
-0.0032958984375,
0.00525665283203125,
-0.028564453125,
0.0187530517578125,
-0.0584716796875,
0.07562255859375,
-0.0207366943359375,
-0.0164642333984375,
-0.0183868408203125,
-0.033111572265625,
0.049896240234375,
0.0197906494140625,
0.0007791519165039062,
-0.02471923828125,
-0.00009435415267944336,
0.051727294921875,
-0.04595947265625,
0.0782470703125,
0.0082244873046875,
-0.0171966552734375,
0.0282135009765625,
0.015228271484375,
0.040924072265625,
0.026214599609375,
0.0025615692138671875,
0.02734375,
0.004302978515625,
-0.03326416015625,
-0.032470703125,
0.058685302734375,
-0.07684326171875,
-0.05914306640625,
-0.0255889892578125,
-0.029296875,
0.017730712890625,
0.025604248046875,
0.0276336669921875,
0.0161590576171875,
0.008056640625,
0.006397247314453125,
0.0129241943359375,
-0.00952911376953125,
0.037445068359375,
0.025146484375,
-0.0157928466796875,
-0.0167083740234375,
0.036590576171875,
-0.0169830322265625,
0.01495361328125,
0.00946807861328125,
-0.0018978118896484375,
-0.019622802734375,
-0.026611328125,
-0.0272216796875,
0.0238800048828125,
-0.0406494140625,
-0.0308380126953125,
-0.0206298828125,
-0.041351318359375,
-0.044525146484375,
0.0034847259521484375,
-0.03955078125,
-0.0090484619140625,
-0.05804443359375,
-0.00897216796875,
0.04150390625,
0.0404052734375,
-0.007518768310546875,
0.048583984375,
-0.055328369140625,
0.0170440673828125,
0.024200439453125,
0.0156707763671875,
0.0018901824951171875,
-0.058197021484375,
-0.0234222412109375,
0.0303802490234375,
-0.0391845703125,
-0.06121826171875,
0.052093505859375,
0.00693511962890625,
0.043060302734375,
0.03594970703125,
0.0097198486328125,
0.06927490234375,
-0.0272216796875,
0.0733642578125,
0.012420654296875,
-0.057281494140625,
0.046966552734375,
-0.0278472900390625,
0.0033550262451171875,
0.01020050048828125,
0.008056640625,
-0.036041259765625,
-0.0241241455078125,
-0.04736328125,
-0.06243896484375,
0.05889892578125,
0.02130126953125,
0.0333251953125,
0.0024471282958984375,
0.025604248046875,
0.005115509033203125,
0.005100250244140625,
-0.056732177734375,
-0.03289794921875,
-0.033599853515625,
-0.0128326416015625,
0.00379180908203125,
-0.0244903564453125,
-0.00946807861328125,
-0.0157318115234375,
0.05902099609375,
0.006381988525390625,
0.0245208740234375,
-0.0006527900695800781,
-0.009368896484375,
0.0066986083984375,
-0.00848388671875,
0.036651611328125,
0.0288543701171875,
-0.0270538330078125,
-0.00919342041015625,
0.02935791015625,
-0.043121337890625,
0.0006079673767089844,
0.0013751983642578125,
-0.0029315948486328125,
0.0027561187744140625,
0.019744873046875,
0.08770751953125,
0.0119476318359375,
-0.031219482421875,
0.0411376953125,
-0.00823211669921875,
-0.0021457672119140625,
-0.03717041015625,
0.0205841064453125,
0.01477813720703125,
0.04296875,
0.0241546630859375,
0.003330230712890625,
0.0135498046875,
-0.024688720703125,
-0.0115966796875,
0.00823211669921875,
-0.0049591064453125,
-0.01366424560546875,
0.08203125,
0.025421142578125,
-0.031097412109375,
0.059906005859375,
-0.005535125732421875,
-0.019989013671875,
0.044097900390625,
0.051605224609375,
0.056243896484375,
0.013275146484375,
0.004940032958984375,
0.04779052734375,
0.014617919921875,
-0.00021445751190185547,
0.014007568359375,
-0.003330230712890625,
-0.0433349609375,
-0.03387451171875,
-0.038787841796875,
-0.0219879150390625,
0.027923583984375,
-0.036285400390625,
0.045135498046875,
-0.0287628173828125,
-0.04296875,
-0.012725830078125,
0.01090240478515625,
-0.032989501953125,
0.0023937225341796875,
0.0159912109375,
0.07196044921875,
-0.0321044921875,
0.07049560546875,
0.048736572265625,
-0.031463623046875,
-0.071533203125,
-0.026641845703125,
-0.00887298583984375,
-0.07720947265625,
0.04180908203125,
0.005855560302734375,
0.002803802490234375,
0.0017271041870117188,
-0.058441162109375,
-0.07000732421875,
0.09466552734375,
0.03302001953125,
-0.035614013671875,
-0.01006317138671875,
0.0022106170654296875,
0.026336669921875,
-0.017059326171875,
0.04486083984375,
0.036224365234375,
0.0204620361328125,
0.014801025390625,
-0.086669921875,
0.0174560546875,
-0.04510498046875,
0.0186309814453125,
-0.031005859375,
-0.0828857421875,
0.08050537109375,
-0.0216522216796875,
-0.01448822021484375,
0.041778564453125,
0.070556640625,
0.0406494140625,
-0.0008487701416015625,
0.0236968994140625,
0.021514892578125,
0.0308837890625,
-0.0108184814453125,
0.0513916015625,
-0.031097412109375,
0.051971435546875,
0.062164306640625,
0.007305145263671875,
0.06634521484375,
0.031951904296875,
-0.0173797607421875,
0.050079345703125,
0.0782470703125,
-0.01788330078125,
0.032562255859375,
0.003292083740234375,
-0.00907135009765625,
-0.01528167724609375,
0.0002789497375488281,
-0.034393310546875,
0.0309600830078125,
0.02374267578125,
-0.0112762451171875,
-0.0037899017333984375,
-0.0258331298828125,
0.015869140625,
-0.0277862548828125,
-0.007740020751953125,
0.042449951171875,
0.00612640380859375,
-0.03240966796875,
0.07684326171875,
0.0120697021484375,
0.09100341796875,
-0.03826904296875,
0.002513885498046875,
-0.036956787109375,
0.01543426513671875,
-0.0225372314453125,
-0.044830322265625,
0.01416015625,
-0.0011386871337890625,
0.0081024169921875,
0.004993438720703125,
0.05096435546875,
-0.0255126953125,
-0.0267181396484375,
0.0134124755859375,
0.016510009765625,
0.0300140380859375,
0.01525115966796875,
-0.047576904296875,
0.0211029052734375,
0.0024871826171875,
-0.032196044921875,
0.0005397796630859375,
0.00788116455078125,
0.00765228271484375,
0.062255859375,
0.0584716796875,
0.008209228515625,
0.0189056396484375,
0.0015392303466796875,
0.0535888671875,
-0.033721923828125,
-0.04071044921875,
-0.05914306640625,
0.049102783203125,
-0.013397216796875,
-0.052093505859375,
0.04229736328125,
0.043792724609375,
0.06927490234375,
-0.025177001953125,
0.038726806640625,
-0.016265869140625,
0.01197052001953125,
-0.0252838134765625,
0.07843017578125,
-0.042755126953125,
0.01313018798828125,
-0.0009918212890625,
-0.0477294921875,
0.00047588348388671875,
0.06903076171875,
0.004482269287109375,
-0.003742218017578125,
0.025604248046875,
0.07965087890625,
-0.0106201171875,
-0.01406097412109375,
0.019989013671875,
0.04248046875,
0.027923583984375,
0.05657958984375,
0.072998046875,
-0.06256103515625,
0.064208984375,
-0.045318603515625,
-0.015869140625,
-0.0257110595703125,
-0.058624267578125,
-0.079345703125,
-0.050537109375,
-0.02679443359375,
-0.038726806640625,
-0.00870513916015625,
0.067138671875,
0.0670166015625,
-0.053741455078125,
-0.04241943359375,
0.0186767578125,
-0.0022182464599609375,
-0.008819580078125,
-0.0178680419921875,
0.007640838623046875,
-0.0243988037109375,
-0.072998046875,
0.01468658447265625,
-0.006977081298828125,
0.028594970703125,
-0.00905609130859375,
-0.01125335693359375,
-0.00807952880859375,
-0.008544921875,
0.032958984375,
0.0128021240234375,
-0.057159423828125,
-0.014862060546875,
0.007640838623046875,
-0.0294647216796875,
0.00574493408203125,
0.01316070556640625,
-0.04534912109375,
-0.00394439697265625,
0.0276641845703125,
0.01389312744140625,
0.060882568359375,
-0.021484375,
0.035400390625,
-0.0445556640625,
0.044708251953125,
-0.010986328125,
0.04620361328125,
0.01824951171875,
-0.0309906005859375,
0.0316162109375,
0.007781982421875,
-0.040069580078125,
-0.06292724609375,
-0.00374603271484375,
-0.08868408203125,
-0.004608154296875,
0.10638427734375,
-0.01194000244140625,
-0.037200927734375,
0.01200103759765625,
-0.055908203125,
0.0601806640625,
-0.035064697265625,
0.07611083984375,
0.041259765625,
0.00749969482421875,
-0.01361083984375,
-0.0335693359375,
0.02386474609375,
0.007183074951171875,
-0.04925537109375,
-0.00855255126953125,
0.02252197265625,
0.036285400390625,
-0.004405975341796875,
0.0406494140625,
0.001953125,
0.0231475830078125,
0.0025844573974609375,
0.01220703125,
-0.01702880859375,
0.00753021240234375,
0.0014810562133789062,
-0.03826904296875,
-0.0014057159423828125,
-0.03741455078125
]
] |
wietsedv/bert-base-dutch-cased | 2023-09-11T08:56:16.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"safetensors",
"bert",
"fill-mask",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | wietsedv | null | null | wietsedv/bert-base-dutch-cased | 1 | 6,888 | transformers | 2022-03-02T23:29:05 | # BERTje: A Dutch BERT model
BERTje is a Dutch pre-trained BERT model developed at the University of Groningen.
⚠️ **The new home of this model is the [GroNLP](https://huggingface.co/GroNLP) organization.**
BERTje now lives at: [`GroNLP/bert-base-dutch-cased`](https://huggingface.co/GroNLP/bert-base-dutch-cased)
The model weights at `wietsedv/` and `GroNLP/` are identical, so do not worry if you use(d) `wietsedv/bert-base-dutch-cased`.
<img src="https://raw.githubusercontent.com/wietsedv/bertje/master/bertje.png" height="250">
| 554 | [
[
-0.04638671875,
-0.044647216796875,
0.0187530517578125,
0.027496337890625,
-0.0213775634765625,
-0.01187896728515625,
-0.006565093994140625,
-0.039825439453125,
0.03375244140625,
0.0240936279296875,
-0.0533447265625,
-0.02947998046875,
-0.044952392578125,
-0.00641632080078125,
-0.01157379150390625,
0.06695556640625,
0.00699615478515625,
0.048919677734375,
-0.0019664764404296875,
-0.005153656005859375,
0.0014867782592773438,
-0.02587890625,
-0.041412353515625,
-0.036895751953125,
0.0643310546875,
0.0036525726318359375,
0.03887939453125,
0.005832672119140625,
0.039947509765625,
0.0162200927734375,
-0.0287017822265625,
-0.0268707275390625,
-0.0343017578125,
-0.00759124755859375,
0.00937652587890625,
-0.017120361328125,
-0.03680419921875,
-0.00196075439453125,
0.0081329345703125,
0.0701904296875,
-0.01068115234375,
0.01056671142578125,
0.0028438568115234375,
0.039764404296875,
-0.038909912109375,
0.00849151611328125,
-0.0159912109375,
0.01238250732421875,
-0.00858306884765625,
0.0278472900390625,
-0.01800537109375,
-0.049652099609375,
0.04974365234375,
-0.044281005859375,
0.023712158203125,
-0.0204620361328125,
0.087158203125,
-0.0164794921875,
-0.0418701171875,
-0.00927734375,
-0.0171966552734375,
0.051025390625,
-0.0460205078125,
0.03570556640625,
0.01044464111328125,
0.03509521484375,
-0.037841796875,
-0.0428466796875,
-0.0166015625,
-0.01313018798828125,
-0.0087432861328125,
-0.004425048828125,
0.005035400390625,
0.00716400146484375,
0.0144805908203125,
0.005939483642578125,
-0.04315185546875,
-0.014801025390625,
-0.043975830078125,
-0.0122222900390625,
0.0567626953125,
-0.030975341796875,
0.0207061767578125,
-0.004779815673828125,
-0.05096435546875,
-0.0241851806640625,
-0.0404052734375,
-0.004276275634765625,
0.03460693359375,
0.02911376953125,
-0.0166015625,
0.035125732421875,
0.0003185272216796875,
0.041107177734375,
0.00778961181640625,
-0.0033779144287109375,
0.05419921875,
0.02264404296875,
-0.0138397216796875,
0.01166534423828125,
0.033538818359375,
0.0226593017578125,
0.050384521484375,
-0.00771331787109375,
-0.036285400390625,
-0.02960205078125,
0.028900146484375,
-0.047607421875,
-0.056060791015625,
-0.017578125,
-0.0675048828125,
-0.030975341796875,
0.007083892822265625,
-0.020050048828125,
-0.01102447509765625,
-0.01253509521484375,
0.04931640625,
-0.0259552001953125,
-0.0318603515625,
0.00003647804260253906,
-0.0137786865234375,
0.0224456787109375,
0.03814697265625,
-0.0640869140625,
0.00626373291015625,
0.0178680419921875,
0.039154052734375,
0.00748443603515625,
-0.00040149688720703125,
0.02288818359375,
0.005828857421875,
-0.03424072265625,
0.037567138671875,
-0.0147552490234375,
-0.040985107421875,
0.02850341796875,
0.0072784423828125,
0.0226287841796875,
-0.011962890625,
0.055023193359375,
-0.05596923828125,
0.043182373046875,
-0.058563232421875,
-0.041748046875,
-0.022918701171875,
0.0233306884765625,
-0.057891845703125,
0.08294677734375,
0.0175628662109375,
-0.0665283203125,
0.05780029296875,
-0.0738525390625,
-0.0300140380859375,
0.01041412353515625,
0.0143280029296875,
-0.02752685546875,
0.0246124267578125,
0.011810302734375,
0.04644775390625,
0.0036296844482421875,
0.006237030029296875,
-0.046630859375,
-0.01593017578125,
-0.01557159423828125,
0.00951385498046875,
0.0919189453125,
0.02508544921875,
0.025909423828125,
-0.007160186767578125,
-0.07452392578125,
0.0036830902099609375,
0.0308074951171875,
-0.0325927734375,
-0.05242919921875,
-0.0091705322265625,
0.0172271728515625,
0.014556884765625,
0.0552978515625,
-0.0579833984375,
0.046112060546875,
-0.00629425048828125,
-0.001140594482421875,
0.056427001953125,
-0.009368896484375,
0.0259552001953125,
-0.032867431640625,
0.0179443359375,
-0.036376953125,
0.034149169921875,
0.0118255615234375,
-0.0396728515625,
-0.0413818359375,
-0.0181427001953125,
0.031585693359375,
0.00858306884765625,
-0.051971435546875,
0.049072265625,
-0.0086822509765625,
-0.051910400390625,
-0.0130615234375,
-0.0228424072265625,
0.004055023193359375,
0.03460693359375,
0.018524169921875,
-0.03564453125,
-0.045318603515625,
-0.0811767578125,
-0.006343841552734375,
-0.01666259765625,
-0.01934814453125,
0.01538848876953125,
0.0426025390625,
-0.0106353759765625,
0.05450439453125,
-0.0186920166015625,
0.0008401870727539062,
-0.005596160888671875,
0.028961181640625,
0.039154052734375,
0.049041748046875,
0.088623046875,
-0.050933837890625,
-0.04254150390625,
-0.0212554931640625,
-0.05633544921875,
-0.018280029296875,
0.01447296142578125,
-0.0207977294921875,
0.0038547515869140625,
0.015289306640625,
-0.048187255859375,
0.0266876220703125,
0.042694091796875,
-0.0283355712890625,
0.05743408203125,
-0.0172271728515625,
0.005680084228515625,
-0.07177734375,
0.01016998291015625,
-0.000797271728515625,
-0.0223541259765625,
-0.0355224609375,
0.0090179443359375,
-0.0005679130554199219,
0.01104736328125,
-0.043426513671875,
0.042938232421875,
-0.048431396484375,
-0.0272979736328125,
-0.00002485513687133789,
-0.027496337890625,
0.0015192031860351562,
0.031982421875,
0.0005955696105957031,
0.044952392578125,
0.0452880859375,
-0.02935791015625,
0.0238494873046875,
0.055328369140625,
-0.039215087890625,
0.04034423828125,
-0.06793212890625,
0.0007300376892089844,
0.0089111328125,
0.0238494873046875,
-0.053375244140625,
-0.0263214111328125,
0.0165863037109375,
-0.022491455078125,
0.026763916015625,
-0.0228424072265625,
-0.050811767578125,
-0.04693603515625,
-0.0213775634765625,
0.0206146240234375,
0.04888916015625,
-0.054473876953125,
0.03790283203125,
0.03228759765625,
-0.018463134765625,
-0.0457763671875,
-0.0712890625,
-0.0131988525390625,
0.00617218017578125,
-0.0645751953125,
0.021514892578125,
-0.00003701448440551758,
-0.003612518310546875,
-0.0007338523864746094,
-0.0110931396484375,
-0.0219268798828125,
-0.0004222393035888672,
0.0037136077880859375,
0.0211334228515625,
-0.01406097412109375,
0.01529693603515625,
0.0013914108276367188,
0.0125732421875,
0.017608642578125,
0.005321502685546875,
0.037200927734375,
0.003314971923828125,
-0.01611328125,
-0.023956298828125,
0.0160064697265625,
0.036773681640625,
0.0242767333984375,
0.052581787109375,
0.0308380126953125,
-0.036041259765625,
-0.0094146728515625,
-0.050872802734375,
-0.02850341796875,
-0.037384033203125,
0.0048370361328125,
-0.041778564453125,
-0.047119140625,
0.05389404296875,
0.004505157470703125,
0.01739501953125,
0.037139892578125,
0.05230712890625,
-0.048675537109375,
0.06878662109375,
0.07373046875,
0.0018358230590820312,
0.05615234375,
-0.006404876708984375,
-0.002437591552734375,
-0.044677734375,
-0.015289306640625,
-0.01323699951171875,
-0.03839111328125,
-0.02655029296875,
-0.0026836395263671875,
0.0128631591796875,
0.01081085205078125,
-0.0443115234375,
0.060546875,
-0.04656982421875,
0.0259552001953125,
0.05303955078125,
0.0025005340576171875,
-0.007320404052734375,
-0.0002770423889160156,
-0.0304412841796875,
-0.0099639892578125,
-0.06329345703125,
-0.036285400390625,
0.0867919921875,
0.03411865234375,
0.06817626953125,
-0.007419586181640625,
0.06634521484375,
0.0262603759765625,
0.003330230712890625,
-0.01444244384765625,
0.03460693359375,
-0.004055023193359375,
-0.08062744140625,
-0.01451873779296875,
-0.0272369384765625,
-0.08172607421875,
0.024993896484375,
-0.0263519287109375,
-0.04443359375,
0.0120849609375,
0.0158538818359375,
0.0036907196044921875,
0.0263671875,
-0.058135986328125,
0.05853271484375,
-0.0224456787109375,
0.00823974609375,
-0.00731658935546875,
-0.05548095703125,
0.0235137939453125,
0.0143585205078125,
-0.00012981891632080078,
-0.01641845703125,
0.039642333984375,
0.05615234375,
-0.03948974609375,
0.046630859375,
-0.04742431640625,
-0.026519775390625,
0.00399017333984375,
0.01532745361328125,
0.0386962890625,
-0.003162384033203125,
0.00231170654296875,
0.0284576416015625,
0.0016736984252929688,
-0.053466796875,
-0.012542724609375,
0.051483154296875,
-0.0672607421875,
0.0007386207580566406,
-0.035675048828125,
-0.02655029296875,
-0.011932373046875,
0.0389404296875,
0.0251922607421875,
0.005641937255859375,
-0.043975830078125,
0.01100921630859375,
0.06427001953125,
-0.028411865234375,
0.0282440185546875,
0.03497314453125,
-0.02508544921875,
0.00018918514251708984,
0.0360107421875,
0.01302337646484375,
-0.0034084320068359375,
0.01153564453125,
0.01129150390625,
-0.010711669921875,
-0.0219268798828125,
-0.0041656494140625,
0.0037689208984375,
-0.05035400390625,
0.00832366943359375,
-0.0382080078125,
-0.0223541259765625,
-0.0308990478515625,
-0.03753662109375,
-0.040924072265625,
-0.0357666015625,
-0.00949859619140625,
-0.0017995834350585938,
0.0322265625,
0.048980712890625,
-0.028167724609375,
0.0306549072265625,
-0.04693603515625,
0.018585205078125,
0.041534423828125,
0.042510986328125,
-0.022430419921875,
-0.023223876953125,
-0.00284576416015625,
0.01153564453125,
0.002872467041015625,
-0.04071044921875,
0.0164642333984375,
-0.0003135204315185547,
0.054595947265625,
0.0278167724609375,
-0.004016876220703125,
0.019317626953125,
-0.03997802734375,
0.05450439453125,
0.04205322265625,
-0.040252685546875,
0.035552978515625,
-0.040435791015625,
0.00859832763671875,
0.044464111328125,
0.021148681640625,
-0.0196380615234375,
-0.00007027387619018555,
-0.052154541015625,
-0.07012939453125,
0.0268707275390625,
0.009674072265625,
0.0261077880859375,
0.016326904296875,
0.03704833984375,
0.028167724609375,
0.02557373046875,
-0.0712890625,
-0.02178955078125,
-0.01641845703125,
-0.009246826171875,
0.0243377685546875,
-0.03851318359375,
-0.0279693603515625,
-0.03350830078125,
0.072509765625,
0.00579833984375,
0.044464111328125,
0.0009341239929199219,
-0.011077880859375,
-0.0131072998046875,
-0.03436279296875,
0.040802001953125,
0.049652099609375,
-0.042266845703125,
-0.02935791015625,
0.0124053955078125,
-0.024200439453125,
-0.01239013671875,
0.00476837158203125,
0.0004162788391113281,
0.017486572265625,
0.044464111328125,
0.047149658203125,
0.02508544921875,
-0.0197906494140625,
0.06072998046875,
-0.01617431640625,
-0.043487548828125,
-0.059051513671875,
-0.002170562744140625,
-0.005584716796875,
0.031097412109375,
0.03387451171875,
-0.005542755126953125,
-0.001049041748046875,
-0.01023101806640625,
0.0265350341796875,
0.0297698974609375,
-0.0254669189453125,
-0.02435302734375,
0.04962158203125,
0.034393310546875,
0.00446319580078125,
0.037567138671875,
-0.0172271728515625,
-0.035400390625,
0.038055419921875,
0.0181427001953125,
0.0654296875,
0.014373779296875,
0.01178741455078125,
0.0269927978515625,
0.0179595947265625,
-0.0130157470703125,
0.054107666015625,
0.0002675056457519531,
-0.06329345703125,
-0.037933349609375,
-0.04998779296875,
-0.021697998046875,
0.005405426025390625,
-0.0579833984375,
0.03118896484375,
-0.03521728515625,
-0.011077880859375,
-0.01102447509765625,
0.00855255126953125,
-0.055511474609375,
0.0316162109375,
0.036285400390625,
0.10614013671875,
-0.061798095703125,
0.078125,
0.07098388671875,
-0.01715087890625,
-0.0262603759765625,
-0.005401611328125,
-0.0097198486328125,
-0.08660888671875,
0.07196044921875,
0.004306793212890625,
0.00684356689453125,
-0.0199737548828125,
-0.08050537109375,
-0.08465576171875,
0.0709228515625,
0.024139404296875,
-0.058258056640625,
-0.00707244873046875,
-0.042755126953125,
0.038970947265625,
0.000522613525390625,
0.004505157470703125,
0.007625579833984375,
0.044647216796875,
0.01214599609375,
-0.061676025390625,
-0.031097412109375,
-0.03424072265625,
0.012908935546875,
0.0114593505859375,
-0.0445556640625,
0.087890625,
-0.0005788803100585938,
0.0163116455078125,
0.017913818359375,
0.0526123046875,
0.01910400390625,
-0.018341064453125,
0.0491943359375,
0.055877685546875,
0.028167724609375,
-0.0231475830078125,
0.07965087890625,
-0.0289459228515625,
0.056365966796875,
0.061279296875,
-0.01218414306640625,
0.01258087158203125,
0.042877197265625,
-0.01000213623046875,
0.052337646484375,
0.055206298828125,
-0.043182373046875,
0.06842041015625,
0.033599853515625,
0.001495361328125,
-0.01064300537109375,
0.0213470458984375,
-0.045989990234375,
0.006877899169921875,
0.01274871826171875,
-0.0183868408203125,
-0.035614013671875,
0.004764556884765625,
-0.00701904296875,
-0.017822265625,
-0.0169525146484375,
0.03369140625,
-0.01291656494140625,
-0.00276947021484375,
0.0289764404296875,
-0.0178680419921875,
0.053924560546875,
-0.056976318359375,
0.002269744873046875,
0.0104827880859375,
0.0011920928955078125,
0.01165008544921875,
-0.0626220703125,
0.00345611572265625,
-0.016937255859375,
-0.0093841552734375,
-0.0347900390625,
0.0748291015625,
-0.0233612060546875,
-0.042633056640625,
0.0423583984375,
0.028900146484375,
0.023223876953125,
0.013519287109375,
-0.0802001953125,
0.0019168853759765625,
-0.019073486328125,
-0.07086181640625,
0.018585205078125,
0.03692626953125,
-0.006153106689453125,
0.058563232421875,
0.046630859375,
0.0012664794921875,
0.0071258544921875,
0.0194549560546875,
0.06585693359375,
-0.0146636962890625,
-0.044647216796875,
-0.0241851806640625,
0.0290374755859375,
-0.01580810546875,
-0.04290771484375,
0.0380859375,
0.023223876953125,
0.07952880859375,
-0.03790283203125,
0.06671142578125,
-0.047882080078125,
0.044464111328125,
-0.01305389404296875,
0.0670166015625,
-0.05322265625,
-0.03570556640625,
-0.0330810546875,
-0.08428955078125,
-0.038787841796875,
0.09844970703125,
-0.0117645263671875,
0.01708984375,
0.0013208389282226562,
0.0286712646484375,
-0.004535675048828125,
-0.0174102783203125,
0.0298919677734375,
0.0296783447265625,
0.01015472412109375,
0.0160675048828125,
0.053558349609375,
-0.023529052734375,
0.00730133056640625,
-0.032318115234375,
-0.0071868896484375,
-0.03704833984375,
-0.080322265625,
-0.0811767578125,
-0.030364990234375,
-0.0173492431640625,
-0.0020389556884765625,
0.0220947265625,
0.07598876953125,
0.0748291015625,
-0.0738525390625,
-0.0003771781921386719,
-0.01849365234375,
-0.014617919921875,
-0.002719879150390625,
-0.017608642578125,
0.023162841796875,
-0.01447296142578125,
-0.043121337890625,
0.0123443603515625,
-0.0012683868408203125,
0.0321044921875,
-0.019317626953125,
-0.005863189697265625,
-0.034637451171875,
0.0109710693359375,
0.032928466796875,
0.0222320556640625,
-0.0406494140625,
-0.059112548828125,
-0.01280975341796875,
-0.0292816162109375,
-0.00862884521484375,
0.0272064208984375,
-0.02777099609375,
0.039947509765625,
0.053375244140625,
0.0234375,
0.030853271484375,
-0.00785064697265625,
0.0638427734375,
-0.074462890625,
0.0186309814453125,
0.0155487060546875,
0.052093505859375,
0.00740814208984375,
-0.00678253173828125,
0.05328369140625,
0.0142059326171875,
-0.04425048828125,
-0.050994873046875,
0.0016994476318359375,
-0.0787353515625,
-0.035430908203125,
0.047607421875,
-0.0136871337890625,
-0.0158233642578125,
0.02301025390625,
-0.0335693359375,
0.025054931640625,
-0.03338623046875,
0.0806884765625,
0.07647705078125,
-0.0053558349609375,
-0.00809478759765625,
-0.045684814453125,
0.025146484375,
0.0250091552734375,
-0.01308441162109375,
-0.0233306884765625,
0.00640869140625,
0.02703857421875,
0.0233154296875,
0.0110321044921875,
-0.01178741455078125,
0.0184478759765625,
-0.01412200927734375,
0.08563232421875,
0.00527191162109375,
-0.0226898193359375,
0.005252838134765625,
-0.0060272216796875,
-0.025482177734375,
-0.036468505859375
]
] |
llm-agents/tora-code-7b-v1.0 | 2023-10-08T11:24:00.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"code",
"math",
"en",
"dataset:gsm8k",
"dataset:competition_math",
"arxiv:2309.17452",
"license:llama2",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text-generation | llm-agents | null | null | llm-agents/tora-code-7b-v1.0 | 11 | 6,878 | transformers | 2023-10-08T05:09:23 | ---
license: llama2
datasets:
- gsm8k
- competition_math
language:
- en
metrics:
- exact_match
library_name: transformers
pipeline_tag: text-generation
tags:
- code
- math
---
<h1 align="center">
ToRA: A Tool-Integrated Reasoning Agent <br> for Mathematical Problem Solving
</h1>
<p align="center">
<a href="https://microsoft.github.io/ToRA/"><b>[🌐 Website]</b></a> •
<a href="https://arxiv.org/pdf/2309.17452.pdf"><b>[📜 Paper]</b></a> •
<a href="https://huggingface.co/llm-agents"><b>[🤗 HF Models]</b></a> •
<a href="https://github.com/microsoft/ToRA"><b>[🐱 GitHub]</b></a>
<br>
<a href="https://twitter.com/zhs05232838/status/1708860992631763092"><b>[🐦 Twitter]</b></a> •
<a href="https://www.reddit.com/r/LocalLLaMA/comments/1703k6d/tora_a_toolintegrated_reasoning_agent_for/"><b>[💬 Reddit]</b></a> •
<a href="https://notes.aimodels.fyi/researchers-announce-tora-training-language-models-to-better-understand-math-using-external-tools/">[🍀 Unofficial Blog]</a>
<!-- <a href="#-quick-start">Quick Start</a> • -->
<!-- <a href="#%EF%B8%8F-citation">Citation</a> -->
</p>
<p align="center">
Repo for "<a href="https://arxiv.org/pdf/2309.17452.pdf" target="_blank">ToRA: A Tool-Integrated Reasoning Agent for Mathematical Problem Solving</a>"
</p>
## 🔥 News
- [2023/10/08] 🔥🔥🔥 All ToRA models released at [HuggingFace](https://huggingface.co/llm-agents)!!!
- [2023/09/29] ToRA paper, repo, and website released.
## 💡 Introduction
ToRA is a series of Tool-integrated Reasoning Agents designed to solve challenging mathematical reasoning problems by interacting with tools, e.g., computation libraries and symbolic solvers. The ToRA series seamlessly integrates natural language reasoning with the use of external tools, combining the analytical power of language with the computational efficiency of tools.
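The tool interaction described above can be sketched as a minimal, self-contained loop: generate a rationale, execute any embedded program, and return the computed result. Everything here (the `fake_model` stand-in, the `<tool>` markers, the prompt wording) is an illustrative assumption, not ToRA's actual interface or prompt format:

```python
import re

def fake_model(prompt: str) -> str:
    # Stand-in for an LLM call: returns reasoning with an embedded program.
    # The <tool>...</tool> markers are an illustrative convention, not ToRA's format.
    return (
        "Multiply the price by the quantity to get the total.\n"
        "<tool>result = 12.5 * 4</tool>"
    )

def run_tool(code: str) -> str:
    # Execute the model-written snippet and read back `result`.
    scope = {}
    exec(code, scope)  # fine for a toy; sandbox untrusted code in practice
    return str(scope["result"])

def tool_integrated_step(question: str) -> str:
    rationale = fake_model(question)
    match = re.search(r"<tool>(.*?)</tool>", rationale, re.DOTALL)
    return run_tool(match.group(1)) if match else rationale

print(tool_integrated_step("A book costs $12.50; what do 4 copies cost?"))  # prints 50.0
```

A real agent would loop this step, appending each tool result to the context before the next generation; this sketch only shows a single round trip.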
| Model | Size | GSM8k | MATH | AVG@10 math tasks<sup>†</sup> |
|---|---|---|---|---|
| GPT-4 | - | 92.0 | 42.5 | 78.3 |
| GPT-4 (PAL) | - | 94.2 | 51.8 | 86.4 |
| [ToRA-7B](https://huggingface.co/llm-agents/tora-7b-v1.0) | 7B | 68.8 | 40.1 | 62.4|
| [ToRA-Code-7B](https://huggingface.co/llm-agents/tora-code-7b-v1.0) | 7B | 72.6 | 44.6 | 66.5|
| [ToRA-13B](https://huggingface.co/llm-agents/tora-13b-v1.0) | 13B | 72.7 | 43.0 | 65.9|
| [ToRA-Code-13B](https://huggingface.co/llm-agents/tora-code-13b-v1.0) | 13B | 75.8 | 48.1 | 71.3 |
| [ToRA-Code-34B<sup>*</sup>](https://huggingface.co/llm-agents/tora-code-34b-v1.0) | 34B | 80.7 | **51.0** | 74.8 |
| [ToRA-70B](https://huggingface.co/llm-agents/tora-70b-v1.0) | 70B | **84.3** | 49.7 | **76.9** |
- <sup>*</sup>ToRA-Code-34B is currently the first and only open-source model to achieve over 50% accuracy (pass@1) on the MATH dataset. It significantly outperforms GPT-4’s CoT result (51.0 vs. 42.5) and is competitive with GPT-4 solving problems with programs (GPT-4 PAL). By open-sourcing our code and models, we hope more breakthroughs will come!
- <sup>†</sup>10 math tasks include GSM8k, MATH, GSM-Hard, SVAMP, TabMWP, ASDiv, SingleEQ, SingleOP, AddSub, and MultiArith.
## ⚡️ Training
The models are trained on ToRA-Corpus 16k, which contains tool-integrated reasoning trajectories of MATH and GSM8k from GPT-4.
We use imitation learning (i.e., SFT) to fine-tune the models, and then apply our proposed *output space shaping* to improve tool-integrated reasoning behaviors. Please refer to the [paper](https://arxiv.org/pdf/2309.17452.pdf) for more details.
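As a rough intuition for the SFT objective mentioned above: imitation learning on the trajectories minimizes the next-token negative log-likelihood of the teacher solutions. A toy numeric sketch at a single position, with a hypothetical four-token vocabulary (all probabilities made up for illustration):

```python
import math

# Hypothetical model probabilities for the next token at one position,
# and the "teacher" (trajectory) token we imitate. Numbers are made up.
vocab = ["result", "=", "print", "sympy"]
probs = [0.70, 0.10, 0.15, 0.05]
target = "result"

# The SFT loss at this position is the negative log-likelihood of the target;
# training averages this over every token of every trajectory.
loss = -math.log(probs[vocab.index(target)])
print(f"{loss:.4f}")  # prints 0.3567
```

Output space shaping, by contrast, changes *which* trajectories the model is trained on rather than this per-token loss; see the paper for details.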
## 🪁 Inference & Evaluation
Please refer to ToRA's [GitHub repo](https://github.com/microsoft/ToRA) for inference, evaluation, and training code.
## ☕️ Citation
If you find this repository helpful, please consider citing our paper:
```
@misc{gou2023tora,
title={ToRA: A Tool-Integrated Reasoning Agent for Mathematical Problem Solving},
author={Zhibin Gou and Zhihong Shao and Yeyun Gong and yelong shen and Yujiu Yang and Minlie Huang and Nan Duan and Weizhu Chen},
year={2023},
eprint={2309.17452},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` | 4,085 | [
[
-0.0281524658203125,
-0.0599365234375,
0.050384521484375,
0.03369140625,
-0.0159454345703125,
0.032867431640625,
0.01560211181640625,
-0.0262298583984375,
0.0372314453125,
0.03314208984375,
-0.049163818359375,
-0.020416259765625,
-0.032379150390625,
0.00926971435546875,
0.006565093994140625,
0.031982421875,
0.0035152435302734375,
0.017486572265625,
-0.0164642333984375,
-0.0217132568359375,
-0.052459716796875,
-0.0262603759765625,
-0.036041259765625,
-0.0271453857421875,
-0.009552001953125,
-0.0070648193359375,
0.0687255859375,
0.03692626953125,
0.0287628173828125,
0.029144287109375,
0.0000730752944946289,
0.03619384765625,
-0.0013275146484375,
-0.006988525390625,
-0.004840850830078125,
-0.0357666015625,
-0.053436279296875,
0.01181793212890625,
0.04827880859375,
0.0274810791015625,
-0.0174407958984375,
0.004871368408203125,
-0.01453399658203125,
0.031524658203125,
-0.030487060546875,
0.0221099853515625,
-0.020263671875,
-0.0170135498046875,
-0.0010938644409179688,
-0.004810333251953125,
-0.046417236328125,
-0.035919189453125,
-0.0011396408081054688,
-0.07281494140625,
-0.011474609375,
0.00028705596923828125,
0.08721923828125,
0.027801513671875,
-0.0242919921875,
-0.017242431640625,
-0.04693603515625,
0.0625,
-0.06536865234375,
0.021148681640625,
-0.0012311935424804688,
0.0227508544921875,
-0.01503753662109375,
-0.05340576171875,
-0.049102783203125,
-0.01534271240234375,
-0.006206512451171875,
0.034759521484375,
-0.045806884765625,
-0.01922607421875,
0.034576416015625,
-0.0005068778991699219,
-0.0400390625,
-0.01910400390625,
-0.03839111328125,
-0.01200103759765625,
0.04364013671875,
0.0189208984375,
0.0302581787109375,
0.0013341903686523438,
-0.0095367431640625,
-0.0133514404296875,
-0.05389404296875,
0.006023406982421875,
0.0253753662109375,
0.00365447998046875,
-0.0243682861328125,
0.021026611328125,
0.023712158203125,
0.044708251953125,
-0.001972198486328125,
-0.004932403564453125,
0.03515625,
-0.00643157958984375,
-0.0205078125,
-0.026397705078125,
0.0701904296875,
-0.005706787109375,
0.00774383544921875,
-0.0014638900756835938,
-0.00875091552734375,
0.00691986083984375,
0.021759033203125,
-0.055908203125,
0.0093841552734375,
0.00739288330078125,
-0.004451751708984375,
-0.01116943359375,
0.0235443115234375,
-0.051300048828125,
-0.004772186279296875,
-0.0235748291015625,
0.0455322265625,
-0.0350341796875,
-0.0186309814453125,
0.03802490234375,
0.0009937286376953125,
0.0128631591796875,
0.049041748046875,
-0.01385498046875,
0.0287933349609375,
0.040252685546875,
0.07281494140625,
0.0103302001953125,
-0.0138397216796875,
-0.0511474609375,
-0.0086669921875,
-0.0260009765625,
0.050750732421875,
-0.029571533203125,
-0.017730712890625,
-0.0272064208984375,
0.0030422210693359375,
-0.01277923583984375,
-0.0280609130859375,
0.0032596588134765625,
-0.05206298828125,
0.0275421142578125,
-0.00875091552734375,
-0.0161285400390625,
-0.0292205810546875,
0.0102386474609375,
-0.06573486328125,
0.07244873046875,
0.0249786376953125,
-0.0306396484375,
-0.01226043701171875,
-0.06475830078125,
-0.006320953369140625,
-0.0032901763916015625,
-0.0041656494140625,
-0.036468505859375,
-0.0224456787109375,
0.0160369873046875,
0.0032901763916015625,
-0.05914306640625,
0.038482666015625,
-0.035369873046875,
-0.0169677734375,
0.019927978515625,
-0.001911163330078125,
0.09466552734375,
0.0232696533203125,
-0.038177490234375,
0.0195770263671875,
-0.0330810546875,
0.0142669677734375,
0.037689208984375,
0.0167083740234375,
-0.0158233642578125,
-0.023773193359375,
-0.0241851806640625,
0.0333251953125,
0.012451171875,
-0.041412353515625,
0.0219879150390625,
-0.04345703125,
0.047637939453125,
0.08258056640625,
-0.0062713623046875,
0.0226898193359375,
-0.02447509765625,
0.0731201171875,
0.00390625,
0.023101806640625,
0.0103912353515625,
-0.05511474609375,
-0.04315185546875,
-0.0243682861328125,
0.021453857421875,
0.05084228515625,
-0.0787353515625,
0.03326416015625,
-0.00974273681640625,
-0.058135986328125,
-0.033721923828125,
-0.01245880126953125,
0.044586181640625,
0.0333251953125,
0.02276611328125,
-0.00811767578125,
-0.0293426513671875,
-0.053741455078125,
-0.0166778564453125,
-0.004566192626953125,
0.01074981689453125,
0.0290069580078125,
0.049407958984375,
-0.0037784576416015625,
0.0755615234375,
-0.0638427734375,
-0.0011014938354492188,
-0.0189666748046875,
-0.0074462890625,
0.0300445556640625,
0.027801513671875,
0.05548095703125,
-0.052459716796875,
-0.057220458984375,
-0.0118865966796875,
-0.06268310546875,
-0.0084228515625,
-0.0138702392578125,
-0.0181732177734375,
0.015380859375,
0.0172119140625,
-0.049652099609375,
0.038360595703125,
0.0183563232421875,
-0.052520751953125,
0.036834716796875,
0.01026153564453125,
-0.014678955078125,
-0.10223388671875,
0.01020050048828125,
0.019439697265625,
-0.017578125,
-0.0297088623046875,
0.018402099609375,
-0.00875091552734375,
-0.002643585205078125,
-0.0281219482421875,
0.0728759765625,
-0.0249481201171875,
0.0049896240234375,
-0.00122833251953125,
0.0166473388671875,
-0.0004546642303466797,
0.0531005859375,
-0.01702880859375,
0.09716796875,
0.032623291015625,
-0.0249176025390625,
0.021575927734375,
0.0247802734375,
0.00928497314453125,
0.00850677490234375,
-0.0614013671875,
0.0206451416015625,
0.00856781005859375,
0.0160675048828125,
-0.043487548828125,
0.0236968994140625,
0.0374755859375,
-0.05157470703125,
-0.0013914108276367188,
0.004184722900390625,
-0.054229736328125,
-0.0203704833984375,
-0.0345458984375,
0.01824951171875,
0.047882080078125,
-0.0305633544921875,
0.07916259765625,
0.031494140625,
0.0082550048828125,
-0.050994873046875,
-0.01531982421875,
-0.018035888671875,
-0.01038360595703125,
-0.0706787109375,
0.01336669921875,
-0.03369140625,
-0.040374755859375,
0.00846099853515625,
-0.01134490966796875,
-0.0011644363403320312,
0.00296783447265625,
0.0141754150390625,
0.044830322265625,
-0.02569580078125,
0.00713348388671875,
0.005985260009765625,
-0.029022216796875,
0.01132965087890625,
-0.01502227783203125,
0.061981201171875,
-0.0650634765625,
-0.0164337158203125,
-0.0207061767578125,
0.0191650390625,
0.0498046875,
-0.026397705078125,
0.051116943359375,
0.02734375,
-0.035003662109375,
-0.0034084320068359375,
-0.041168212890625,
-0.02447509765625,
-0.041168212890625,
0.007358551025390625,
-0.041473388671875,
-0.053680419921875,
0.053009033203125,
-0.0026874542236328125,
-0.004825592041015625,
0.06414794921875,
0.0283966064453125,
0.02569580078125,
0.0863037109375,
0.050750732421875,
-0.00595855712890625,
0.031829833984375,
-0.057647705078125,
0.017547607421875,
-0.06341552734375,
-0.01861572265625,
-0.037689208984375,
-0.0015411376953125,
-0.03173828125,
-0.017486572265625,
0.046539306640625,
0.00434112548828125,
-0.036407470703125,
0.0430908203125,
-0.052215576171875,
0.03515625,
0.0509033203125,
0.0091094970703125,
0.0178375244140625,
-0.00858306884765625,
-0.0108489990234375,
-0.004444122314453125,
-0.04364013671875,
-0.04095458984375,
0.06683349609375,
0.0216064453125,
0.044677734375,
0.0286712646484375,
0.0283966064453125,
0.00669097900390625,
0.0167694091796875,
-0.04443359375,
0.0572509765625,
0.00689697265625,
-0.025787353515625,
-0.0230712890625,
-0.03570556640625,
-0.0704345703125,
0.0152740478515625,
0.01210784912109375,
-0.0653076171875,
0.0179290771484375,
-0.006809234619140625,
-0.0293121337890625,
0.0302886962890625,
-0.05340576171875,
0.054962158203125,
-0.0005602836608886719,
-0.035430908203125,
-0.021331787109375,
-0.04339599609375,
0.026458740234375,
0.0048065185546875,
0.0018701553344726562,
0.009124755859375,
0.00536346435546875,
0.061126708984375,
-0.067626953125,
0.049102783203125,
-0.0120391845703125,
-0.0043182373046875,
0.039154052734375,
0.0259246826171875,
0.05157470703125,
0.0259857177734375,
-0.00818634033203125,
0.01276397705078125,
0.01122283935546875,
-0.024078369140625,
-0.06585693359375,
0.041046142578125,
-0.07080078125,
-0.054534912109375,
-0.073974609375,
-0.051971435546875,
-0.01198577880859375,
0.0277862548828125,
0.0084686279296875,
0.036956787109375,
0.042144775390625,
0.00373077392578125,
0.052337646484375,
-0.001255035400390625,
0.03076171875,
0.0501708984375,
-0.0017833709716796875,
-0.034637451171875,
0.07244873046875,
0.018157958984375,
0.017547607421875,
0.0228271484375,
0.01556396484375,
-0.026611328125,
-0.01873779296875,
-0.037506103515625,
0.051025390625,
-0.04962158203125,
-0.034088134765625,
-0.036468505859375,
-0.0423583984375,
-0.0290985107421875,
-0.0220947265625,
-0.0306549072265625,
-0.030181884765625,
-0.034149169921875,
0.016510009765625,
0.055084228515625,
0.049652099609375,
0.00673675537109375,
0.0254364013671875,
-0.051025390625,
0.01538848876953125,
0.010772705078125,
0.025970458984375,
0.003566741943359375,
-0.033721923828125,
0.002208709716796875,
-0.00017273426055908203,
-0.045166015625,
-0.0693359375,
0.054901123046875,
-0.0234832763671875,
0.037078857421875,
0.00251007080078125,
-0.0009613037109375,
0.043853759765625,
-0.0037384033203125,
0.045379638671875,
0.01264190673828125,
-0.09674072265625,
0.03973388671875,
-0.028289794921875,
0.01525115966796875,
0.0024394989013671875,
0.0109710693359375,
-0.0269317626953125,
-0.02288818359375,
-0.07232666015625,
-0.03643798828125,
0.08380126953125,
0.02734375,
-0.01837158203125,
0.0120391845703125,
0.0245819091796875,
0.006496429443359375,
0.006591796875,
-0.057647705078125,
-0.028289794921875,
-0.0262908935546875,
-0.015655517578125,
0.018402099609375,
0.007205963134765625,
-0.00864410400390625,
-0.017608642578125,
0.0760498046875,
-0.0281219482421875,
0.03936767578125,
0.00864410400390625,
-0.01152801513671875,
-0.0009641647338867188,
0.004825592041015625,
0.0711669921875,
0.0650634765625,
-0.0143280029296875,
-0.0190582275390625,
0.0034008026123046875,
-0.066162109375,
0.007724761962890625,
0.0104217529296875,
-0.0261993408203125,
-0.00897979736328125,
0.0216064453125,
0.058013916015625,
-0.0149078369140625,
-0.0548095703125,
0.0304412841796875,
0.003536224365234375,
-0.01114654541015625,
-0.029541015625,
0.004871368408203125,
0.0003490447998046875,
0.028961181640625,
0.0164947509765625,
0.00844573974609375,
0.002460479736328125,
-0.0283355712890625,
-0.0008101463317871094,
0.036285400390625,
-0.015625,
-0.0244293212890625,
0.0382080078125,
-0.002262115478515625,
-0.042724609375,
0.047760009765625,
-0.04205322265625,
-0.044647216796875,
0.07720947265625,
0.05841064453125,
0.0692138671875,
-0.004405975341796875,
0.02301025390625,
0.029449462890625,
0.042083740234375,
0.00548553466796875,
0.050567626953125,
0.024688720703125,
-0.04669189453125,
-0.0233917236328125,
-0.016265869140625,
-0.031097412109375,
0.0146026611328125,
-0.036834716796875,
0.0224151611328125,
-0.053985595703125,
-0.00438690185546875,
-0.0013647079467773438,
0.022186279296875,
-0.04132080078125,
-0.0107269287109375,
-0.04180908203125,
0.07489013671875,
-0.03997802734375,
0.057342529296875,
0.05157470703125,
-0.060028076171875,
-0.07958984375,
-0.0167999267578125,
0.01371002197265625,
-0.0748291015625,
0.0284881591796875,
-0.0037403106689453125,
-0.031829833984375,
0.012237548828125,
-0.058197021484375,
-0.0677490234375,
0.10064697265625,
0.054534912109375,
-0.016204833984375,
0.00154876708984375,
0.001354217529296875,
0.0292205810546875,
-0.0278778076171875,
0.024169921875,
0.0099639892578125,
0.04241943359375,
0.00928497314453125,
-0.067138671875,
0.040313720703125,
-0.0592041015625,
-0.012054443359375,
0.032470703125,
-0.08050537109375,
0.07855224609375,
-0.00763702392578125,
-0.00522613525390625,
0.0083160400390625,
0.0340576171875,
0.0435791015625,
0.030303955078125,
0.031280517578125,
0.0423583984375,
0.041412353515625,
-0.0261688232421875,
0.06396484375,
-0.0079498291015625,
0.03521728515625,
0.065673828125,
-0.0133056640625,
0.031768798828125,
0.0177001953125,
-0.031097412109375,
0.05145263671875,
0.033538818359375,
-0.0221099853515625,
0.0186309814453125,
-0.0010547637939453125,
0.00342559814453125,
-0.035430908203125,
0.007354736328125,
-0.03717041015625,
0.010955810546875,
0.029815673828125,
0.006786346435546875,
-0.0191650390625,
-0.0038356781005859375,
0.0021457672119140625,
0.01534271240234375,
0.0004527568817138672,
0.038177490234375,
0.0101470947265625,
-0.0408935546875,
0.032379150390625,
0.0100250244140625,
0.0333251953125,
-0.06103515625,
-0.02398681640625,
-0.00907135009765625,
0.010986328125,
-0.0006470680236816406,
-0.058868408203125,
0.0179595947265625,
-0.0186920166015625,
-0.008270263671875,
0.0015897750854492188,
0.037384033203125,
0.01080322265625,
-0.036712646484375,
0.019927978515625,
0.036224365234375,
0.0018682479858398438,
-0.01092529296875,
-0.061370849609375,
-0.0022525787353515625,
-0.007843017578125,
-0.01342010498046875,
0.006000518798828125,
0.020477294921875,
-0.032073974609375,
0.07122802734375,
0.054107666015625,
-0.0109100341796875,
0.0025234222412109375,
-0.007419586181640625,
0.07244873046875,
-0.051788330078125,
-0.048095703125,
-0.062042236328125,
0.038909912109375,
-0.007358551025390625,
-0.029296875,
0.055572509765625,
0.046905517578125,
0.046142578125,
-0.01727294921875,
0.03369140625,
0.0014142990112304688,
0.0263671875,
-0.0401611328125,
0.0538330078125,
-0.044921875,
0.034210205078125,
-0.0146026611328125,
-0.05169677734375,
-0.0229949951171875,
0.03717041015625,
-0.0193939208984375,
0.0207366943359375,
0.06781005859375,
0.05291748046875,
-0.00821685791015625,
0.0010423660278320312,
-0.0083465576171875,
0.0142669677734375,
0.045257568359375,
0.05853271484375,
0.04315185546875,
-0.04693603515625,
0.030975341796875,
-0.0200653076171875,
-0.0082550048828125,
-0.0101165771484375,
-0.041961669921875,
-0.06475830078125,
-0.054229736328125,
-0.0197906494140625,
-0.056793212890625,
-0.0157623291015625,
0.0726318359375,
0.0484619140625,
-0.036956787109375,
-0.0152587890625,
-0.020599365234375,
0.03765869140625,
-0.026031494140625,
-0.023468017578125,
0.04522705078125,
0.0035152435302734375,
-0.048583984375,
0.01934814453125,
0.0230560302734375,
0.00771331787109375,
-0.0228271484375,
-0.034027099609375,
-0.019317626953125,
0.031707763671875,
0.032257080078125,
0.033599853515625,
-0.07586669921875,
-0.005657196044921875,
0.04443359375,
-0.00038623809814453125,
0.011260986328125,
0.048736572265625,
-0.0750732421875,
0.0275421142578125,
0.034576416015625,
0.030181884765625,
0.034912109375,
-0.032196044921875,
0.038360595703125,
-0.0224456787109375,
0.00907135009765625,
0.0301666259765625,
0.02886962890625,
-0.0163726806640625,
-0.0439453125,
0.07049560546875,
0.035491943359375,
-0.00952911376953125,
-0.0872802734375,
0.005001068115234375,
-0.107177734375,
-0.01467132568359375,
0.06671142578125,
-0.002819061279296875,
-0.0166778564453125,
0.0037899017333984375,
-0.0170440673828125,
0.01183319091796875,
-0.050567626953125,
0.051788330078125,
0.04541015625,
-0.004924774169921875,
-0.0052947998046875,
-0.031524658203125,
0.01200103759765625,
0.021636962890625,
-0.0933837890625,
-0.0171051025390625,
0.017669677734375,
0.01708984375,
0.044891357421875,
0.045318603515625,
-0.024658203125,
0.051971435546875,
-0.0039043426513671875,
-0.0016870498657226562,
-0.0465087890625,
-0.0275421142578125,
-0.02679443359375,
-0.002490997314453125,
-0.0247039794921875,
-0.006862640380859375
]
] |
alexofntu/textual_inversion_cat | 2023-10-10T05:26:54.000Z | [
"diffusers",
"tensorboard",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-image",
"textual_inversion",
"license:creativeml-openrail-m",
"endpoints_compatible",
"has_space",
"diffusers:StableDiffusionPipeline",
"region:us"
] | text-to-image | alexofntu | null | null | alexofntu/textual_inversion_cat | 0 | 6,872 | diffusers | 2023-09-30T19:26:29 |
---
license: creativeml-openrail-m
base_model: runwayml/stable-diffusion-v1-5
tags:
- stable-diffusion
- stable-diffusion-diffusers
- text-to-image
- diffusers
- textual_inversion
inference: true
---
# Textual inversion text2image fine-tuning - alexofntu/textual_inversion_cat
These are textual inversion adaptation weights for runwayml/stable-diffusion-v1-5. You can find some example images below.
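The core idea behind textual inversion can be sketched in a few lines: the diffusion model and its token-embedding table stay frozen, a new placeholder token is added, and only that single embedding row is optimized. The sketch below is a toy illustration with made-up dimensions and a made-up loss target, not the real training script; in practice weights like these are typically loaded with `StableDiffusionPipeline.load_textual_inversion` in diffusers.

```python
# Toy illustration of textual inversion: the model stays frozen and only the
# embedding vector of one new placeholder token is optimized.
# All names, dimensions, and numbers here are illustrative, not real SD 1.5 values.

embeddings = {                      # frozen token-embedding table (toy size)
    "cat":   [0.2, -0.1, 0.4],
    "photo": [0.0,  0.3, -0.2],
}

# 1. Register the placeholder token, initialized from a related word.
embeddings["<cat-toy>"] = list(embeddings["cat"])

# 2. "Training": gradient steps move ONLY the new row; all other rows are frozen.
target = [0.5, 0.1, 0.3]            # stand-in for the denoising-loss gradient signal
lr = 0.5
for _ in range(20):
    vec = embeddings["<cat-toy>"]
    grad = [v - t for v, t in zip(vec, target)]          # d/dv of 0.5*||v - t||^2
    embeddings["<cat-toy>"] = [v - lr * g for v, g in zip(vec, grad)]

print([round(v, 3) for v in embeddings["<cat-toy>"]])    # converges toward target
print(embeddings["cat"])                                  # frozen row is unchanged
```

At inference time the learned row is injected into the tokenizer/embedding table so prompts containing the placeholder token (e.g. `"a photo of <cat-toy>"`) condition the frozen model on the new concept.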
| 418 | [
[
-0.014923095703125,
-0.050567626953125,
0.02978515625,
0.03704833984375,
-0.030426025390625,
-0.006443023681640625,
0.0187225341796875,
0.004421234130859375,
0.002681732177734375,
0.0570068359375,
-0.054962158203125,
-0.024871826171875,
-0.051025390625,
-0.006458282470703125,
-0.035552978515625,
0.0723876953125,
0.0022144317626953125,
0.0209197998046875,
-0.01251220703125,
-0.031463623046875,
-0.037078857421875,
0.011810302734375,
-0.061981201171875,
-0.047607421875,
0.051544189453125,
0.06451416015625,
0.06207275390625,
0.057830810546875,
0.069580078125,
0.01529693603515625,
0.0031452178955078125,
-0.0209197998046875,
-0.050567626953125,
-0.01215362548828125,
-0.026153564453125,
-0.0168914794921875,
-0.0293121337890625,
-0.006832122802734375,
0.0760498046875,
0.037200927734375,
-0.0230255126953125,
0.0169525146484375,
-0.01800537109375,
0.06365966796875,
-0.0509033203125,
-0.038970947265625,
-0.041900634765625,
0.00605010986328125,
-0.0034046173095703125,
-0.0234375,
-0.017974853515625,
-0.0049285888671875,
-0.001621246337890625,
-0.06695556640625,
-0.00661468505859375,
0.00814056396484375,
0.08770751953125,
0.044281005859375,
-0.041839599609375,
-0.01447296142578125,
-0.040557861328125,
0.06573486328125,
-0.03314208984375,
0.0268707275390625,
0.022247314453125,
0.0212554931640625,
-0.0046539306640625,
-0.08148193359375,
-0.0247955322265625,
0.0262908935546875,
-0.01219940185546875,
0.0282745361328125,
0.0158843994140625,
0.0036563873291015625,
0.0071563720703125,
0.02447509765625,
-0.023284912109375,
0.0234375,
-0.031463623046875,
-0.00975799560546875,
0.0265350341796875,
0.014892578125,
0.03741455078125,
0.018280029296875,
-0.06298828125,
-0.001819610595703125,
-0.0546875,
-0.02764892578125,
0.0148468017578125,
-0.009185791015625,
-0.0277099609375,
0.032470703125,
0.000850677490234375,
0.053070068359375,
0.0012788772583007812,
0.00305938720703125,
0.040496826171875,
-0.0276336669921875,
-0.03704833984375,
0.003513336181640625,
0.0274658203125,
0.04248046875,
-0.003391265869140625,
0.003978729248046875,
-0.01995849609375,
-0.034698486328125,
0.0247039794921875,
-0.0792236328125,
-0.0309906005859375,
0.0001914501190185547,
-0.06524658203125,
-0.038055419921875,
0.02191162109375,
-0.052581787109375,
-0.03936767578125,
-0.00826263427734375,
0.0555419921875,
-0.028106689453125,
-0.0372314453125,
0.0209503173828125,
-0.079833984375,
0.05523681640625,
0.0367431640625,
-0.04376220703125,
0.039093017578125,
0.01334381103515625,
0.059478759765625,
-0.007740020751953125,
-0.01467132568359375,
-0.0162200927734375,
0.0103759765625,
-0.02081298828125,
0.049163818359375,
-0.0265960693359375,
-0.0276336669921875,
-0.0180816650390625,
0.0271759033203125,
0.01148223876953125,
-0.0267486572265625,
0.08612060546875,
-0.0247955322265625,
0.032440185546875,
-0.0184173583984375,
-0.024017333984375,
-0.01947021484375,
0.002132415771484375,
-0.051422119140625,
0.0985107421875,
0.023223876953125,
-0.05841064453125,
0.03814697265625,
-0.0565185546875,
-0.0192413330078125,
-0.020050048828125,
0.013702392578125,
-0.061248779296875,
0.0028934478759765625,
-0.002811431884765625,
0.018402099609375,
0.00220489501953125,
-0.034698486328125,
-0.01235198974609375,
-0.036529541015625,
-0.0088348388671875,
-0.00043392181396484375,
0.0574951171875,
0.024871826171875,
-0.013458251953125,
0.01068115234375,
-0.07708740234375,
0.006412506103515625,
0.0025157928466796875,
-0.021820068359375,
-0.0635986328125,
0.0155792236328125,
0.038238525390625,
0.0056915283203125,
0.005611419677734375,
-0.0117034912109375,
0.0214691162109375,
-0.036651611328125,
-0.0031337738037109375,
0.037445068359375,
0.0182952880859375,
0.0438232421875,
-0.031494140625,
0.0216064453125,
0.01174163818359375,
-0.0183868408203125,
0.00720977783203125,
-0.03851318359375,
-0.0672607421875,
-0.0194091796875,
0.01078033447265625,
0.04620361328125,
-0.05242919921875,
0.01070404052734375,
-0.0283050537109375,
-0.0523681640625,
-0.0291290283203125,
0.020751953125,
0.0238037109375,
0.049346923828125,
0.026702880859375,
0.001560211181640625,
-0.04815673828125,
-0.07867431640625,
-0.029998779296875,
0.00787353515625,
-0.0170745849609375,
-0.01312255859375,
0.040191650390625,
-0.0136871337890625,
0.03900146484375,
-0.029998779296875,
-0.022216796875,
0.0169525146484375,
0.047943115234375,
0.005352020263671875,
0.06353759765625,
0.066162109375,
-0.07196044921875,
-0.00962066650390625,
-0.001590728759765625,
-0.0209197998046875,
-0.02142333984375,
-0.0291900634765625,
-0.042816162109375,
-0.006450653076171875,
0.041259765625,
-0.014862060546875,
0.0176239013671875,
0.0291900634765625,
-0.056243896484375,
-0.003231048583984375,
-0.02740478515625,
0.009429931640625,
-0.07879638671875,
0.0127410888671875,
0.0032825469970703125,
-0.048553466796875,
-0.047698974609375,
0.0251007080078125,
0.0183868408203125,
0.03521728515625,
-0.0325927734375,
0.04119873046875,
-0.06396484375,
-0.0080718994140625,
-0.01505279541015625,
0.024200439453125,
0.0137176513671875,
-0.01007080078125,
0.01377105712890625,
0.051300048828125,
0.0291595458984375,
-0.032989501953125,
0.01224517822265625,
0.0213623046875,
0.0020236968994140625,
0.043701171875,
-0.044647216796875,
0.0189208984375,
-0.03887939453125,
0.01287078857421875,
-0.1068115234375,
-0.01457977294921875,
0.016632080078125,
-0.0132904052734375,
0.009765625,
0.002750396728515625,
-0.007030487060546875,
-0.012481689453125,
-0.0552978515625,
0.044097900390625,
0.06060791015625,
-0.0273284912109375,
0.013824462890625,
0.01328277587890625,
0.005401611328125,
-0.041900634765625,
-0.082275390625,
0.00847625732421875,
-0.034942626953125,
-0.00487518310546875,
0.033447265625,
0.0009264945983886719,
-0.00659942626953125,
-0.00797271728515625,
0.0170135498046875,
-0.0277557373046875,
0.026824951171875,
0.0258331298828125,
-0.003772735595703125,
-0.0305328369140625,
0.00821685791015625,
0.04302978515625,
0.00750732421875,
-0.003490447998046875,
-0.0301055908203125,
0.01218414306640625,
-0.00311279296875,
0.02020263671875,
-0.0206298828125,
0.0001583099365234375,
0.03363037109375,
0.0289154052734375,
0.0560302734375,
0.07684326171875,
-0.05072021484375,
0.005306243896484375,
-0.01367950439453125,
-0.008819580078125,
-0.03424072265625,
0.021575927734375,
-0.019439697265625,
-0.0172119140625,
0.030914306640625,
-0.02703857421875,
-0.002803802490234375,
0.0721435546875,
0.048736572265625,
-0.0160369873046875,
0.04815673828125,
0.04571533203125,
0.02069091796875,
0.0318603515625,
-0.022247314453125,
-0.03289794921875,
-0.051971435546875,
-0.02313232421875,
-0.04931640625,
-0.05059814453125,
-0.007328033447265625,
-0.0266265869140625,
0.005199432373046875,
0.04656982421875,
-0.0239105224609375,
0.021240234375,
-0.039764404296875,
0.045654296875,
0.0192108154296875,
0.0157928466796875,
0.0165252685546875,
0.0025177001953125,
-0.0158843994140625,
-0.01367950439453125,
-0.038787841796875,
-0.038238525390625,
0.0455322265625,
0.040985107421875,
0.06158447265625,
0.02215576171875,
0.01233673095703125,
0.0251617431640625,
0.035675048828125,
-0.04443359375,
0.0164794921875,
-0.02215576171875,
-0.0555419921875,
-0.03594970703125,
-0.00550079345703125,
-0.065185546875,
-0.010833740234375,
-0.0474853515625,
-0.033905029296875,
0.0226593017578125,
0.0155792236328125,
0.0090484619140625,
0.006561279296875,
-0.04510498046875,
0.0511474609375,
0.007297515869140625,
-0.0088958740234375,
-0.0301055908203125,
-0.055328369140625,
0.0160369873046875,
-0.0015668869018554688,
0.0067596435546875,
-0.00730133056640625,
-0.053924560546875,
0.05242919921875,
-0.0220184326171875,
0.040557861328125,
-0.01959228515625,
-0.01812744140625,
0.005237579345703125,
-0.00618743896484375,
0.0262603759765625,
0.0010833740234375,
-0.0274658203125,
-0.0007863044738769531,
-0.000339508056640625,
-0.041595458984375,
-0.038116455078125,
0.044677734375,
-0.040496826171875,
0.0099029541015625,
0.0005769729614257812,
-0.0140380859375,
0.029937744140625,
0.04351806640625,
0.03509521484375,
0.0273284912109375,
-0.0155792236328125,
0.019683837890625,
0.037933349609375,
-0.0181732177734375,
0.028228759765625,
0.0209503173828125,
-0.0234375,
-0.03656005859375,
0.06951904296875,
-0.00043463706970214844,
0.0241851806640625,
0.058013916015625,
0.027557373046875,
-0.035888671875,
-0.054901123046875,
-0.054107666015625,
-0.0097198486328125,
-0.0545654296875,
-0.026458740234375,
-0.0289764404296875,
-0.0195465087890625,
-0.0060272216796875,
-0.0260009765625,
-0.01192474365234375,
-0.07879638671875,
-0.056488037109375,
-0.0264739990234375,
0.051971435546875,
0.04681396484375,
0.003650665283203125,
0.066650390625,
-0.031280517578125,
0.04022216796875,
-0.007068634033203125,
0.050201416015625,
-0.034149169921875,
-0.0533447265625,
-0.035003662109375,
-0.00342559814453125,
-0.03656005859375,
-0.046173095703125,
-0.01125335693359375,
0.060546875,
0.0251007080078125,
0.064208984375,
-0.01523590087890625,
0.06256103515625,
-0.03033447265625,
0.07171630859375,
0.019622802734375,
-0.06463623046875,
0.04522705078125,
-0.045989990234375,
0.0279693603515625,
0.049652099609375,
0.026580810546875,
-0.015045166015625,
-0.0229034423828125,
-0.035186767578125,
-0.056671142578125,
0.01953125,
0.03790283203125,
0.0209808349609375,
0.0282135009765625,
0.06939697265625,
0.0195465087890625,
0.00543212890625,
-0.0738525390625,
-0.0313720703125,
-0.025848388671875,
-0.041656494140625,
0.0142059326171875,
-0.0257415771484375,
-0.00969696044921875,
-0.03656005859375,
0.04229736328125,
-0.0003476142883300781,
0.01140594482421875,
0.033843994140625,
-0.00890350341796875,
-0.020294189453125,
-0.017181396484375,
0.06787109375,
0.036773681640625,
-0.02978515625,
0.0229034423828125,
0.027069091796875,
-0.058258056640625,
0.01605224609375,
-0.01861572265625,
-0.007373809814453125,
-0.0029048919677734375,
0.02838134765625,
0.052093505859375,
0.0132904052734375,
-0.019989013671875,
0.03607177734375,
-0.004650115966796875,
-0.04193115234375,
-0.0293426513671875,
0.01375579833984375,
-0.018463134765625,
-0.01349639892578125,
0.021148681640625,
0.00749969482421875,
0.0523681640625,
-0.0162811279296875,
0.0304107666015625,
0.035888671875,
-0.045013427734375,
-0.004596710205078125,
0.0716552734375,
0.0234375,
-0.025848388671875,
0.02593994140625,
0.0007433891296386719,
-0.0073394775390625,
0.047698974609375,
0.038360595703125,
0.0626220703125,
-0.0202484130859375,
0.00817108154296875,
0.0205230712890625,
0.041595458984375,
-0.02880859375,
0.042144775390625,
0.029052734375,
-0.04754638671875,
-0.01302337646484375,
-0.023712158203125,
-0.002460479736328125,
-0.00441741943359375,
-0.045074462890625,
0.06683349609375,
-0.0550537109375,
-0.00882720947265625,
0.005710601806640625,
-0.022705078125,
-0.0235443115234375,
0.04107666015625,
-0.005657196044921875,
0.10638427734375,
-0.058074951171875,
0.056427001953125,
0.0816650390625,
-0.00438690185546875,
-0.0338134765625,
0.01184844970703125,
-0.00957489013671875,
-0.059051513671875,
0.028900146484375,
0.0078277587890625,
-0.0158843994140625,
0.0325927734375,
-0.056610107421875,
-0.04669189453125,
0.08978271484375,
0.01513671875,
-0.00815582275390625,
-0.0012998580932617188,
-0.012237548828125,
0.030487060546875,
-0.0213775634765625,
0.037322998046875,
0.0243988037109375,
0.054351806640625,
0.0001933574676513672,
-0.0323486328125,
0.0197906494140625,
-0.014190673828125,
0.0140380859375,
0.0307159423828125,
-0.05841064453125,
0.03509521484375,
0.0029773712158203125,
-0.0008306503295898438,
0.042877197265625,
0.03802490234375,
-0.0153961181640625,
0.03759765625,
0.05633544921875,
0.055572509765625,
0.025787353515625,
-0.01983642578125,
0.049163818359375,
0.0097198486328125,
0.0465087890625,
0.06878662109375,
0.01641845703125,
0.046234130859375,
0.06365966796875,
0.01500701904296875,
0.0665283203125,
0.04351806640625,
-0.01209259033203125,
0.041351318359375,
0.0249786376953125,
0.01076507568359375,
-0.0021820068359375,
0.01396942138671875,
-0.0216827392578125,
0.023590087890625,
0.050506591796875,
-0.03228759765625,
-0.01126861572265625,
-0.005298614501953125,
0.0033779144287109375,
-0.0014123916625976562,
-0.027069091796875,
0.060546875,
-0.0011348724365234375,
-0.02880859375,
0.04998779296875,
0.03021240234375,
0.0256500244140625,
-0.0323486328125,
-0.0300140380859375,
0.0127105712890625,
0.002796173095703125,
-0.0225067138671875,
-0.07269287109375,
0.041259765625,
-0.034332275390625,
-0.0065460205078125,
-0.03558349609375,
0.07940673828125,
-0.026458740234375,
-0.0693359375,
0.0150299072265625,
-0.000701904296875,
0.01337432861328125,
0.0252532958984375,
-0.059173583984375,
-0.01284027099609375,
-0.015655517578125,
-0.0256500244140625,
-0.021087646484375,
0.048004150390625,
0.01032257080078125,
0.033416748046875,
0.008514404296875,
0.011962890625,
-0.00482177734375,
0.004482269287109375,
0.051483154296875,
-0.0531005859375,
-0.022918701171875,
-0.06536865234375,
0.07293701171875,
-0.0270843505859375,
-0.04541015625,
0.047637939453125,
0.050537109375,
0.01338958740234375,
-0.01800537109375,
0.027069091796875,
-0.000766754150390625,
0.01389312744140625,
-0.078369140625,
0.06298828125,
-0.0706787109375,
0.00366973876953125,
-0.050628662109375,
-0.04571533203125,
-0.047698974609375,
0.053924560546875,
0.027679443359375,
-0.016082763671875,
0.04571533203125,
0.04498291015625,
-0.0204010009765625,
-0.0183868408203125,
0.00247955322265625,
0.03656005859375,
-0.00814056396484375,
0.005462646484375,
0.042938232421875,
-0.07086181640625,
0.019012451171875,
0.006420135498046875,
-0.01215362548828125,
-0.018707275390625,
-0.10125732421875,
-0.06158447265625,
-0.04925537109375,
-0.034423828125,
-0.04425048828125,
0.004421234130859375,
0.03564453125,
0.06951904296875,
-0.04071044921875,
0.00905609130859375,
-0.045806884765625,
0.039276123046875,
-0.004215240478515625,
-0.016082763671875,
0.004718780517578125,
0.0091552734375,
-0.03955078125,
-0.019500732421875,
0.03106689453125,
0.04852294921875,
-0.01116943359375,
-0.01410675048828125,
-0.013275146484375,
0.0101776123046875,
0.025299072265625,
0.003398895263671875,
-0.0292205810546875,
-0.035186767578125,
-0.0225982666015625,
0.00835418701171875,
0.0032024383544921875,
0.0538330078125,
-0.049072265625,
-0.0053558349609375,
0.046356201171875,
0.035736083984375,
0.017547607421875,
0.029937744140625,
0.039703369140625,
-0.0799560546875,
0.054443359375,
-0.005146026611328125,
0.01049041748046875,
0.0003349781036376953,
-0.03326416015625,
0.02667236328125,
0.026092529296875,
-0.037841796875,
-0.04998779296875,
0.0006456375122070312,
-0.09002685546875,
-0.0171661376953125,
0.07666015625,
0.019805908203125,
0.0007567405700683594,
0.0030345916748046875,
-0.04443359375,
-0.004329681396484375,
-0.01220703125,
0.04345703125,
0.0615234375,
0.0175018310546875,
-0.040313720703125,
-0.038543701171875,
0.046295166015625,
0.0216064453125,
-0.06207275390625,
-0.03277587890625,
0.0239715576171875,
0.02471923828125,
0.0005936622619628906,
0.057525634765625,
-0.0092926025390625,
0.00931549072265625,
-0.0007696151733398438,
-0.022705078125,
0.0369873046875,
-0.024566650390625,
-0.02825927734375,
-0.036041259765625,
0.026580810546875,
-0.0234832763671875
]
] |
rinna/bilingual-gpt-neox-4b | 2023-08-14T06:40:12.000Z | [
"transformers",
"pytorch",
"safetensors",
"gpt_neox",
"text-generation",
"ja",
"en",
"dataset:mc4",
"dataset:cc100",
"dataset:wikipedia",
"dataset:EleutherAI/pile",
"dataset:togethercomputer/RedPajama-Data-1T",
"license:mit",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | rinna | null | null | rinna/bilingual-gpt-neox-4b | 23 | 6,862 | transformers | 2023-07-31T02:34:03 | ---
thumbnail: https://github.com/rinnakk/japanese-pretrained-models/blob/master/rinna.png
license: mit
datasets:
- mc4
- cc100
- wikipedia
- EleutherAI/pile
- togethercomputer/RedPajama-Data-1T
language:
- ja
- en
inference: false
---
# bilingual-gpt-neox-4b

# Overview
This repository provides an English-Japanese bilingual GPT-NeoX model of 3.8 billion parameters.
* **Library**
The model was trained using code based on [EleutherAI/gpt-neox](https://github.com/EleutherAI/gpt-neox).
* **Model architecture**
A 36-layer, 2816-hidden-size transformer-based language model.
* **Pre-training**
The model was trained on around **524B** tokens from a mixture of the following corpora:
- [Japanese CC-100](http://data.statmt.org/cc-100/ja.txt.xz)
- [Japanese C4](https://huggingface.co/datasets/mc4)
- [The Pile](https://huggingface.co/datasets/EleutherAI/pile)
- [Redpajama](https://huggingface.co/datasets/togethercomputer/RedPajama-Data-1T)
- [Wikipedia](https://dumps.wikimedia.org/other/cirrussearch)
* **Model Series**
| Variant | Link |
| :-- | :--|
| Bilingual 4B MiniGPT4 | https://huggingface.co/rinna/bilingual-gpt-neox-4b-minigpt4 |
| Bilingual 4B PPO | https://huggingface.co/rinna/bilingual-gpt-neox-4b-instruction-ppo |
| Bilingual 4B SFT | https://huggingface.co/rinna/bilingual-gpt-neox-4b-instruction-sft |
| Bilingual 4B 8K | https://huggingface.co/rinna/bilingual-gpt-neox-4b-8k |
| Bilingual 4B | https://huggingface.co/rinna/bilingual-gpt-neox-4b |
| Japanese 3.6B PPO | https://huggingface.co/rinna/japanese-gpt-neox-3.6b-instruction-ppo |
| Japanese 3.6B SFT-v2 | https://huggingface.co/rinna/japanese-gpt-neox-3.6b-instruction-sft-v2 |
| Japanese 3.6B SFT | https://huggingface.co/rinna/japanese-gpt-neox-3.6b-instruction-sft |
| Japanese 3.6B | https://huggingface.co/rinna/japanese-gpt-neox-3.6b |
* **Authors**
- [Tianyu Zhao](https://huggingface.co/tianyuz)
- [Toshiaki Wakatsuki](https://huggingface.co/t-w)
- [Akio Kaga](https://huggingface.co/rakaga)
- [Koh Mitsuda](https://huggingface.co/mitsu-koh)
- [Kei Sawada](https://huggingface.co/keisawada)
---
# Benchmarking
* **Japanese benchmark**
Our evaluation experiments suggest that the bilingual-gpt-neox-4b model performs slightly better than the previous [Japanese GPT-NeoX 3.6B](https://huggingface.co/rinna/japanese-gpt-neox-3.6b) in Japanese tasks.
- *The 4-task average accuracy is based on results of JCommonsenseQA, JNLI, MARC-ja, and JSQuAD.*
- *The 6-task average accuracy is based on results of JCommonsenseQA, JNLI, MARC-ja, JSQuAD, XWinograd, and JAQKET-v2.*
| Model | 4-task average accuracy | 6-task average accuracy |
| :-- | :-- | :-- |
| bilingual-gpt-neox-4b-instruction-ppo | 61.01 | 61.16 |
| bilingual-gpt-neox-4b-instruction-sft | 61.02 | 61.69 |
| **bilingual-gpt-neox-4b** | **56.12** | **51.83** |
| japanese-gpt-neox-3.6b-instruction-ppo | 59.86 | 60.07 |
| japanese-gpt-neox-3.6b | 55.07 | 50.32 |
* **English benchmark**
Using the [EleutherAI Language Model Evaluation Harness](https://github.com/EleutherAI/lm-evaluation-harness/tree/master), we found that bilingual-gpt-neox-4b performs comparably to English/multilingual models of similar sizes.
- *The average accuracy is based on results of Arc-Challenge, Arc-Easy, BoolQ, COPA, HellaSwag, OpenBookQA, PIQA, PROST, SWAG, and WinoGrande.*
| Model | Average accuracy |
| :-- | :-- |
| mpt-7b | 59.30 |
| llama-7b | 57.35 |
| bloom-7b | 51.51 |
| xglm-7.5b | 50.96 |
| xglm-4.5b | 50.15 |
| **bilingual-gpt-neox-4b** | **49.49** |
| bloom-3b | 48.56 |
| xglm-2.9b | 47.44 |
| bloom-1.7b | 46.54 |
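The average-accuracy column above is a simple unweighted (macro) average over the per-task accuracies. As a minimal sketch, the helper below illustrates the computation; the per-task scores used in the example are hypothetical placeholders, not actual harness results.

```python
from statistics import mean

def macro_average(task_accuracies: dict[str, float]) -> float:
    """Unweighted mean over per-task accuracies, rounded to two decimals."""
    return round(mean(task_accuracies.values()), 2)

# Hypothetical per-task scores for illustration only;
# run the lm-evaluation-harness to obtain real numbers.
scores = {"arc_challenge": 30.0, "arc_easy": 60.0, "boolq": 62.0, "piqa": 70.0}
print(macro_average(scores))  # 55.5
```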
---
# How to use the model
**Notice:** Since the model is **sensitive to decoding hyper-parameters** (e.g. `temperature`, `top_p`, `top_k`, `repetition_penalty`), we suggest exploring the best settings for your task.
~~~~python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM
tokenizer = AutoTokenizer.from_pretrained("rinna/bilingual-gpt-neox-4b", use_fast=False)
model = AutoModelForCausalLM.from_pretrained("rinna/bilingual-gpt-neox-4b")
if torch.cuda.is_available():
model = model.to("cuda")
text = "西田幾多郎は、"
token_ids = tokenizer.encode(text, add_special_tokens=False, return_tensors="pt")
with torch.no_grad():
output_ids = model.generate(
token_ids.to(model.device),
max_new_tokens=100,
min_new_tokens=100,
do_sample=True,
temperature=1.0,
top_p=0.95,
pad_token_id=tokenizer.pad_token_id,
bos_token_id=tokenizer.bos_token_id,
eos_token_id=tokenizer.eos_token_id
)
output = tokenizer.decode(output_ids.tolist()[0])
print(output)
"""
西田幾多郎は、その著書「自覚の哲学」の中で、次のように書きました。
「知識を、自分のものと考えることに満足していると、自己の限界に目覚めることを忘れてしまう。しかし、他者との協同なしには、自己の本当の理解に達することはできないのだ。知識は他者と相互の、協同の力によってこそ、得られるのである。」(引用終わり)
この一節を、私たちは今から学び直すべきです。そして、これからの社会をリードする子どもたちに、その能力を伸ばすべく、
"""
~~~~
~~~~python
text = "Socrates says"
token_ids = tokenizer.encode(text, add_special_tokens=False, return_tensors="pt")
with torch.no_grad():
output_ids = model.generate(
token_ids.to(model.device),
max_new_tokens=100,
min_new_tokens=100,
do_sample=True,
temperature=1.0,
top_p=0.95,
pad_token_id=tokenizer.pad_token_id,
bos_token_id=tokenizer.bos_token_id,
eos_token_id=tokenizer.eos_token_id
)
output = tokenizer.decode(output_ids.tolist()[0])
print(output)
"""
Socrates says: he thinks that philosophy, as opposed to myth, can be demonstrated; as opposed to poetry, that it is not possible to have knowledge of the unknowable (that is, neither by reason nor by any art of divination). So in this case he is in agreement with Socrates in not thinking that we could prove the existence of the gods or of fate. Now, I do not know the content of Xenophon's _Symposium_, but he must have made a point of this passage that has ex
"""
~~~~
~~~~python
text = "def bubble_sort(array):"
token_ids = tokenizer.encode(text, add_special_tokens=False, return_tensors="pt")
with torch.no_grad():
output_ids = model.generate(
token_ids.to(model.device),
max_new_tokens=200,
min_new_tokens=200,
do_sample=True,
temperature=1.0,
top_p=0.5,
pad_token_id=tokenizer.pad_token_id,
bos_token_id=tokenizer.bos_token_id,
eos_token_id=tokenizer.eos_token_id
)
output = tokenizer.decode(output_ids.tolist()[0])
print(output)
"""
def bubble_sort(array):
for i in range(len(array)):
for j in range(len(array)-1):
if array[j] > array[j+1]:
array[j], array[j+1] = array[j+1], array[j]
return array
print(bubble_sort([1, 2, 3, 4, 5, 6, 7, 8, 9, 10]))
The code above will sort the array from 1 to 10 in the following order:
1, 2, 3, 4, 5, 6, 7, 8, 9, 10
However, I am not sure how to do
"""
~~~~
---
# Tokenization
The model uses a [sentencepiece](https://github.com/google/sentencepiece)-based tokenizer.
* The tokenizer has a vocabulary size of 65,536.
* It uses *byte fallback* to decompose unknown text pieces into UTF-8 byte pieces to avoid producing `<UNK>` tokens.
* It can recognize *consecutive whitespaces*, *newlines*, and *tabs* to handle structured texts better.
* We turned off the default behaviour of prepending leading whitespace because it is not beneficial for processing Japanese.
* Specifically, a single whitespace is always processed as one token, so English words do not carry a preceding whitespace as in many other tokenizers (e.g. `_Hello`).
* This decision trades English processing efficiency for a unified treatment of whitespaces.
* It also leads to a significantly lower next-token-prediction loss on English data, because whitespaces are easy to predict.
* **Don't forget to set `use_fast=False` to make the above features function correctly.**
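As a conceptual sketch of the *byte fallback* behaviour described above (not the actual sentencepiece implementation), a piece missing from the vocabulary is decomposed into its UTF-8 bytes, each of which the tokenizer can always represent, so no `<UNK>` token is ever emitted:

```python
# Illustrative only: real sentencepiece handles this internally when
# byte_fallback is enabled in the tokenizer model.
def byte_fallback(piece: str, vocab: set[str]) -> list[str]:
    """Return the piece itself if known, else its UTF-8 byte pieces."""
    if piece in vocab:
        return [piece]
    return [f"<0x{b:02X}>" for b in piece.encode("utf-8")]

print(byte_fallback("Hello", {"Hello"}))  # ['Hello']
print(byte_fallback("西", set()))          # ['<0xE8>', '<0xA5>', '<0xBF>']
```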
---
# License
[The MIT license](https://opensource.org/licenses/MIT) | 8,141 | [
[
-0.020965576171875,
-0.048431396484375,
0.0271453857421875,
0.0193939208984375,
-0.02227783203125,
0.0007519721984863281,
-0.01323699951171875,
-0.034454345703125,
0.02581787109375,
0.0194549560546875,
-0.0323486328125,
-0.0579833984375,
-0.043609619140625,
0.02349853515625,
-0.0151824951171875,
0.0777587890625,
-0.004268646240234375,
-0.0019989013671875,
0.01154327392578125,
-0.0022144317626953125,
-0.0203704833984375,
-0.0306396484375,
-0.06341552734375,
-0.0269317626953125,
0.020233154296875,
0.007495880126953125,
0.0440673828125,
0.05120849609375,
0.0158538818359375,
0.030303955078125,
-0.007755279541015625,
0.00811767578125,
-0.01465606689453125,
-0.0211334228515625,
-0.0056610107421875,
-0.043731689453125,
-0.047210693359375,
-0.002498626708984375,
0.044586181640625,
0.034942626953125,
0.003147125244140625,
0.01262664794921875,
0.00208282470703125,
0.0181427001953125,
-0.0193328857421875,
0.019744873046875,
-0.032257080078125,
-0.00194549560546875,
-0.016082763671875,
-0.0030765533447265625,
-0.024749755859375,
-0.0233612060546875,
-0.0021877288818359375,
-0.0650634765625,
0.01593017578125,
0.001926422119140625,
0.08331298828125,
0.00028252601623535156,
-0.0232696533203125,
-0.012603759765625,
-0.03082275390625,
0.07342529296875,
-0.06951904296875,
0.022857666015625,
0.0282440185546875,
-0.00647735595703125,
-0.00478363037109375,
-0.0643310546875,
-0.043670654296875,
0.0009946823120117188,
-0.01334381103515625,
0.035980224609375,
-0.019683837890625,
-0.007167816162109375,
0.0277099609375,
0.0233154296875,
-0.0614013671875,
-0.0001405477523803711,
-0.029632568359375,
-0.026702880859375,
0.048065185546875,
0.0175628662109375,
0.04290771484375,
-0.0293731689453125,
-0.0215606689453125,
-0.02325439453125,
-0.0252685546875,
0.007633209228515625,
0.028594970703125,
0.0193328857421875,
-0.0302734375,
0.034088134765625,
-0.01439666748046875,
0.04931640625,
-0.008819580078125,
-0.03265380859375,
0.04071044921875,
-0.045654296875,
-0.01514434814453125,
-0.003387451171875,
0.09619140625,
0.0251922607421875,
0.0034351348876953125,
0.0017528533935546875,
-0.0029144287109375,
0.004779815673828125,
-0.0264892578125,
-0.06146240234375,
-0.012420654296875,
0.0140838623046875,
-0.02142333984375,
-0.0087890625,
0.005321502685546875,
-0.056488037109375,
0.01654052734375,
-0.007129669189453125,
0.044677734375,
-0.046722412109375,
-0.026580810546875,
0.03131103515625,
-0.0175323486328125,
-0.007259368896484375,
0.01158905029296875,
-0.06396484375,
0.02667236328125,
0.04595947265625,
0.070068359375,
0.01145172119140625,
-0.049407958984375,
-0.0084686279296875,
-0.00048542022705078125,
-0.0099639892578125,
0.0269927978515625,
-0.0223846435546875,
-0.0260009765625,
-0.03216552734375,
0.01314544677734375,
-0.027923583984375,
-0.018035888671875,
0.04193115234375,
-0.0171661376953125,
0.052978515625,
-0.020782470703125,
-0.039093017578125,
-0.0180206298828125,
0.0188446044921875,
-0.03436279296875,
0.091552734375,
0.007411956787109375,
-0.07244873046875,
0.01092529296875,
-0.038787841796875,
-0.01222991943359375,
-0.001338958740234375,
-0.00414276123046875,
-0.026214599609375,
-0.01532745361328125,
0.0308837890625,
0.0222930908203125,
-0.023162841796875,
0.004573822021484375,
-0.00565338134765625,
-0.02606201171875,
0.01568603515625,
-0.0296783447265625,
0.0904541015625,
0.0138702392578125,
-0.049468994140625,
-0.0023975372314453125,
-0.064453125,
0.01116180419921875,
0.01690673828125,
-0.0240325927734375,
-0.025634765625,
-0.0242767333984375,
0.0142059326171875,
0.0243988037109375,
0.0255889892578125,
-0.0509033203125,
0.01739501953125,
-0.0455322265625,
0.042572021484375,
0.050018310546875,
-0.01180267333984375,
0.0219879150390625,
-0.0186004638671875,
0.041351318359375,
0.0030269622802734375,
0.01361846923828125,
-0.01435089111328125,
-0.064697265625,
-0.054046630859375,
-0.0183258056640625,
0.022979736328125,
0.0562744140625,
-0.0550537109375,
0.035064697265625,
-0.029876708984375,
-0.056121826171875,
-0.03143310546875,
0.004215240478515625,
0.0509033203125,
0.042999267578125,
0.03070068359375,
-0.0177764892578125,
-0.040130615234375,
-0.047698974609375,
-0.00830841064453125,
-0.0213165283203125,
0.01444244384765625,
0.0153656005859375,
0.052490234375,
-0.024505615234375,
0.047760009765625,
-0.032073974609375,
0.00675201416015625,
-0.01384735107421875,
0.0204315185546875,
0.0526123046875,
0.037261962890625,
0.042572021484375,
-0.04486083984375,
-0.050872802734375,
0.0257720947265625,
-0.05230712890625,
0.002475738525390625,
0.00388336181640625,
-0.0141448974609375,
0.022186279296875,
0.02655029296875,
-0.048126220703125,
0.02606201171875,
0.03271484375,
-0.0443115234375,
0.05059814453125,
-0.023590087890625,
0.013031005859375,
-0.10516357421875,
0.0180206298828125,
-0.0016069412231445312,
-0.00975799560546875,
-0.038970947265625,
0.0036602020263671875,
0.00859832763671875,
0.00211334228515625,
-0.037841796875,
0.0709228515625,
-0.03265380859375,
0.0110626220703125,
-0.007598876953125,
0.0132904052734375,
0.007160186767578125,
0.031005859375,
0.0213775634765625,
0.047760009765625,
0.046539306640625,
-0.036590576171875,
0.041900634765625,
0.016265869140625,
-0.020233154296875,
0.0168304443359375,
-0.05023193359375,
0.001567840576171875,
0.006893157958984375,
0.01090240478515625,
-0.066162109375,
-0.005664825439453125,
0.03216552734375,
-0.04443359375,
0.01145172119140625,
-0.0214996337890625,
-0.04364013671875,
-0.04327392578125,
-0.0151519775390625,
0.034027099609375,
0.0310821533203125,
-0.033477783203125,
0.03143310546875,
-0.0017147064208984375,
-0.012054443359375,
-0.071044921875,
-0.046356201171875,
0.002727508544921875,
-0.0168609619140625,
-0.050872802734375,
0.042205810546875,
-0.00843048095703125,
-0.0010805130004882812,
0.00013065338134765625,
0.0052947998046875,
0.00991058349609375,
-0.004730224609375,
0.0006437301635742188,
0.036285400390625,
-0.01727294921875,
-0.0158538818359375,
-0.000522613525390625,
-0.0239410400390625,
0.0026874542236328125,
-0.03021240234375,
0.06060791015625,
-0.02191162109375,
-0.00499725341796875,
-0.041595458984375,
0.00670623779296875,
0.050872802734375,
-0.00844573974609375,
0.047515869140625,
0.06903076171875,
-0.03369140625,
0.00975799560546875,
-0.031158447265625,
-0.01352691650390625,
-0.035980224609375,
0.037689208984375,
-0.040313720703125,
-0.04486083984375,
0.059844970703125,
0.027984619140625,
0.0118560791015625,
0.04327392578125,
0.0367431640625,
0.018341064453125,
0.0888671875,
0.0288848876953125,
-0.0141448974609375,
0.026824951171875,
-0.04522705078125,
0.01605224609375,
-0.057342529296875,
-0.0226287841796875,
-0.035247802734375,
-0.011566162109375,
-0.0631103515625,
-0.028167724609375,
0.0247955322265625,
0.0011682510375976562,
-0.0288848876953125,
0.042572021484375,
-0.050994873046875,
0.0230712890625,
0.0535888671875,
0.00545501708984375,
0.01605224609375,
0.00534820556640625,
-0.02581787109375,
-0.00850677490234375,
-0.052642822265625,
-0.0278167724609375,
0.0860595703125,
0.023468017578125,
0.04095458984375,
0.0037174224853515625,
0.05926513671875,
-0.0273895263671875,
-0.00017213821411132812,
-0.04315185546875,
0.048980712890625,
-0.0111083984375,
-0.055511474609375,
-0.02032470703125,
-0.046844482421875,
-0.07928466796875,
0.0274505615234375,
-0.0024356842041015625,
-0.07000732421875,
0.012908935546875,
-0.0097808837890625,
-0.0267791748046875,
0.027618408203125,
-0.0450439453125,
0.07928466796875,
-0.03253173828125,
-0.037261962890625,
-0.0012912750244140625,
-0.037628173828125,
0.022857666015625,
0.0078277587890625,
0.0221099853515625,
-0.00322723388671875,
-0.00974273681640625,
0.0794677734375,
-0.03668212890625,
0.05316162109375,
-0.0009059906005859375,
0.007411956787109375,
0.0056304931640625,
-0.006500244140625,
0.032440185546875,
0.0228271484375,
-0.0131988525390625,
0.0187225341796875,
0.0240325927734375,
-0.037841796875,
-0.032012939453125,
0.05780029296875,
-0.0791015625,
-0.03131103515625,
-0.05841064453125,
-0.03973388671875,
0.00911712646484375,
0.03948974609375,
0.056915283203125,
0.035614013671875,
0.002483367919921875,
-0.00905609130859375,
0.03179931640625,
-0.0292510986328125,
0.042266845703125,
0.03668212890625,
-0.02813720703125,
-0.05108642578125,
0.07855224609375,
0.019012451171875,
0.004680633544921875,
0.032440185546875,
0.022430419921875,
-0.033935546875,
-0.030731201171875,
-0.04693603515625,
0.044830322265625,
-0.043243408203125,
-0.0033416748046875,
-0.056915283203125,
-0.008331298828125,
-0.0537109375,
-0.0026302337646484375,
-0.036834716796875,
-0.036224365234375,
-0.02520751953125,
-0.009124755859375,
0.026092529296875,
0.03277587890625,
-0.00015819072723388672,
0.018768310546875,
-0.04559326171875,
0.00626373291015625,
0.0086822509765625,
0.00832366943359375,
-0.015350341796875,
-0.051116943359375,
-0.0249481201171875,
0.006153106689453125,
-0.0251617431640625,
-0.06317138671875,
0.054412841796875,
0.0006680488586425781,
0.032440185546875,
0.034027099609375,
-0.0149688720703125,
0.054351806640625,
-0.01514434814453125,
0.06640625,
0.026947021484375,
-0.08062744140625,
0.045928955078125,
-0.03240966796875,
0.027557373046875,
0.0196380615234375,
0.038543701171875,
-0.05657958984375,
-0.02593994140625,
-0.060211181640625,
-0.07427978515625,
0.07366943359375,
0.016754150390625,
0.0135650634765625,
-0.00591278076171875,
0.0286102294921875,
0.0077667236328125,
-0.0007920265197753906,
-0.08233642578125,
-0.044586181640625,
-0.0294189453125,
-0.021484375,
-0.0241241455078125,
-0.0013256072998046875,
-0.002178192138671875,
-0.04302978515625,
0.07330322265625,
-0.0013513565063476562,
0.0289459228515625,
0.01202392578125,
0.0006666183471679688,
0.0111846923828125,
0.013427734375,
0.044830322265625,
0.05242919921875,
-0.0202789306640625,
0.00925445556640625,
0.022613525390625,
-0.054168701171875,
0.0155029296875,
0.0172576904296875,
-0.02386474609375,
0.0179290771484375,
0.0260772705078125,
0.06915283203125,
0.0100860595703125,
-0.032440185546875,
0.027496337890625,
-0.0195465087890625,
-0.016082763671875,
-0.025238037109375,
0.002407073974609375,
0.01045989990234375,
-0.0006923675537109375,
0.00885772705078125,
-0.004383087158203125,
-0.00005608797073364258,
-0.050750732421875,
-0.006267547607421875,
0.0169525146484375,
-0.005252838134765625,
-0.041229248046875,
0.05755615234375,
0.0024967193603515625,
-0.0016908645629882812,
0.05560302734375,
-0.0202484130859375,
-0.051544189453125,
0.048004150390625,
0.0595703125,
0.0460205078125,
-0.024139404296875,
0.0193939208984375,
0.0655517578125,
0.0293731689453125,
-0.013427734375,
0.03082275390625,
0.005275726318359375,
-0.057769775390625,
-0.019287109375,
-0.048065185546875,
-0.0160064697265625,
0.0135040283203125,
-0.04071044921875,
0.0270233154296875,
-0.039154052734375,
-0.028350830078125,
0.0010442733764648438,
0.028289794921875,
-0.0450439453125,
0.0245819091796875,
0.005847930908203125,
0.06280517578125,
-0.07757568359375,
0.06640625,
0.0621337890625,
-0.030364990234375,
-0.07293701171875,
-0.0268402099609375,
-0.005199432373046875,
-0.050018310546875,
0.042083740234375,
0.01152801513671875,
0.004833221435546875,
0.006969451904296875,
-0.011383056640625,
-0.08172607421875,
0.0870361328125,
0.0333251953125,
-0.03631591796875,
-0.0056304931640625,
0.020751953125,
0.0443115234375,
-0.00778961181640625,
0.059478759765625,
0.03924560546875,
0.048309326171875,
-0.012359619140625,
-0.07861328125,
0.0141143798828125,
-0.0579833984375,
-0.00943756103515625,
0.0096282958984375,
-0.057525634765625,
0.08416748046875,
-0.0031719207763671875,
-0.0092926025390625,
0.00649261474609375,
0.044189453125,
0.0237579345703125,
0.035614013671875,
0.0214691162109375,
0.06292724609375,
0.0595703125,
-0.0267333984375,
0.07122802734375,
-0.0341796875,
0.033599853515625,
0.05755615234375,
0.00727081298828125,
0.050537109375,
0.02313232421875,
-0.028228759765625,
0.044586181640625,
0.05584716796875,
-0.0016546249389648438,
0.0241241455078125,
-0.01126861572265625,
-0.0203094482421875,
-0.006046295166015625,
0.0138092041015625,
-0.040863037109375,
0.027618408203125,
0.0250244140625,
-0.026580810546875,
0.001514434814453125,
0.006153106689453125,
0.0345458984375,
-0.01568603515625,
-0.01178741455078125,
0.038055419921875,
0.005657196044921875,
-0.0638427734375,
0.058502197265625,
0.0264892578125,
0.07293701171875,
-0.0386962890625,
0.021148681640625,
-0.0139923095703125,
0.0193939208984375,
-0.023040771484375,
-0.053131103515625,
0.00399017333984375,
-0.0031585693359375,
-0.0092926025390625,
-0.007190704345703125,
0.0285186767578125,
-0.0278167724609375,
-0.05987548828125,
0.0379638671875,
0.021392822265625,
0.0270233154296875,
0.00708770751953125,
-0.08148193359375,
0.00543975830078125,
0.010162353515625,
-0.02978515625,
0.0138092041015625,
0.01910400390625,
0.0003674030303955078,
0.03753662109375,
0.04541015625,
0.0156402587890625,
0.022796630859375,
0.0018281936645507812,
0.04473876953125,
-0.048431396484375,
-0.040191650390625,
-0.06689453125,
0.031280517578125,
-0.003452301025390625,
-0.049102783203125,
0.0721435546875,
0.06158447265625,
0.08673095703125,
0.0004968643188476562,
0.060546875,
-0.01213836669921875,
0.029083251953125,
-0.03436279296875,
0.04364013671875,
-0.047698974609375,
0.0114898681640625,
-0.03179931640625,
-0.07513427734375,
-0.0186614990234375,
0.048553466796875,
-0.03948974609375,
0.017578125,
0.0537109375,
0.0703125,
-0.0159912109375,
-0.007106781005859375,
0.0172882080078125,
0.0251007080078125,
0.016510009765625,
0.0694580078125,
0.04266357421875,
-0.06951904296875,
0.038848876953125,
-0.064208984375,
-0.017181396484375,
-0.01215362548828125,
-0.042083740234375,
-0.06396484375,
-0.0478515625,
-0.04010009765625,
-0.03973388671875,
-0.004703521728515625,
0.0704345703125,
0.0496826171875,
-0.05364990234375,
-0.0325927734375,
-0.004573822021484375,
0.005847930908203125,
-0.0333251953125,
-0.017852783203125,
0.044677734375,
-0.00524139404296875,
-0.0889892578125,
0.022857666015625,
0.007175445556640625,
0.0038089752197265625,
0.00914764404296875,
-0.0208892822265625,
-0.0083770751953125,
0.0011873245239257812,
0.03057861328125,
0.0277557373046875,
-0.05511474609375,
0.0021076202392578125,
0.013641357421875,
-0.016357421875,
0.00439453125,
0.0216217041015625,
-0.034210205078125,
0.034820556640625,
0.05169677734375,
0.004413604736328125,
0.055511474609375,
-0.01556396484375,
0.0208282470703125,
-0.03143310546875,
0.01514434814453125,
0.00934600830078125,
0.04559326171875,
0.0140533447265625,
-0.026519775390625,
0.0341796875,
0.041656494140625,
-0.035736083984375,
-0.06500244140625,
-0.0064544677734375,
-0.085693359375,
-0.0218048095703125,
0.07952880859375,
-0.015899658203125,
-0.038238525390625,
-0.00516510009765625,
-0.016876220703125,
0.0288543701171875,
-0.0186004638671875,
0.0455322265625,
0.04669189453125,
-0.007724761962890625,
-0.0169677734375,
-0.033416748046875,
0.0166473388671875,
0.0458984375,
-0.0509033203125,
-0.0079803466796875,
0.00939178466796875,
0.0489501953125,
0.0276641845703125,
0.05023193359375,
-0.0219268798828125,
0.038787841796875,
0.005428314208984375,
0.0168609619140625,
-0.0105743408203125,
-0.00157928466796875,
-0.0134124755859375,
-0.004550933837890625,
-0.0130157470703125,
-0.0040130615234375
]
] |
TheBloke/Airoboros-L2-70B-2.1-GPTQ | 2023-09-27T12:46:28.000Z | [
"transformers",
"safetensors",
"llama",
"text-generation",
"dataset:jondurbin/airoboros-2.1",
"license:llama2",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | TheBloke | null | null | TheBloke/Airoboros-L2-70B-2.1-GPTQ | 15 | 6,860 | transformers | 2023-08-27T00:08:14 | ---
license: llama2
datasets:
- jondurbin/airoboros-2.1
model_name: Airoboros L2 70B 2.1
base_model: jondurbin/airoboros-l2-70b-2.1
inference: false
model_creator: Jon Durbin
model_type: llama
prompt_template: "A chat.\nUSER: {prompt}\nASSISTANT: \n"
quantized_by: TheBloke
---
<!-- header start -->
<!-- 200823 -->
<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://i.imgur.com/EBdldam.jpg" alt="TheBlokeAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<div style="display: flex; justify-content: space-between; width: 100%;">
<div style="display: flex; flex-direction: column; align-items: flex-start;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://discord.gg/theblokeai">Chat & support: TheBloke's Discord server</a></p>
</div>
<div style="display: flex; flex-direction: column; align-items: flex-end;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://www.patreon.com/TheBlokeAI">Want to contribute? TheBloke's Patreon page</a></p>
</div>
</div>
<div style="text-align:center; margin-top: 0em; margin-bottom: 0em"><p style="margin-top: 0.25em; margin-bottom: 0em;">TheBloke's LLM work is generously supported by a grant from <a href="https://a16z.com">andreessen horowitz (a16z)</a></p></div>
<hr style="margin-top: 1.0em; margin-bottom: 1.0em;">
<!-- header end -->
# Airoboros L2 70B 2.1 - GPTQ
- Model creator: [Jon Durbin](https://huggingface.co/jondurbin)
- Original model: [Airoboros L2 70B 2.1](https://huggingface.co/jondurbin/airoboros-l2-70b-2.1)
<!-- description start -->
## Description
This repo contains GPTQ model files for [Jon Durbin's Airoboros L2 70B 2.1](https://huggingface.co/jondurbin/airoboros-l2-70b-2.1).
Multiple GPTQ parameter permutations are provided; see Provided Files below for details of the options provided, their parameters, and the software used to create them.
<!-- description end -->
<!-- repositories-available start -->
## Repositories available
* [AWQ model(s) for GPU inference.](https://huggingface.co/TheBloke/Airoboros-L2-70B-2.1-AWQ)
* [GPTQ models for GPU inference, with multiple quantisation parameter options.](https://huggingface.co/TheBloke/Airoboros-L2-70B-2.1-GPTQ)
* [2, 3, 4, 5, 6 and 8-bit GGUF models for CPU+GPU inference](https://huggingface.co/TheBloke/Airoboros-L2-70B-2.1-GGUF)
* [Jon Durbin's original unquantised fp16 model in pytorch format, for GPU inference and for further conversions](https://huggingface.co/jondurbin/airoboros-l2-70b-2.1)
<!-- repositories-available end -->
<!-- prompt-template start -->
## Prompt template: Chat
```
A chat.
USER: {prompt}
ASSISTANT:
```
<!-- prompt-template end -->
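As a minimal sketch, the template above can be filled programmatically; the helper name `build_prompt` is ours, not part of the repository.

```python
# Fills the chat template shown above. Note the trailing "ASSISTANT: \n"
# with no response text: the model continues from there.
def build_prompt(user_message: str) -> str:
    return f"A chat.\nUSER: {user_message}\nASSISTANT: \n"

print(build_prompt("Tell me about AI"))
```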
<!-- README_GPTQ.md-provided-files start -->
## Provided files and GPTQ parameters
Multiple quantisation parameters are provided, to allow you to choose the best one for your hardware and requirements.
Each separate quant is in a different branch. See below for instructions on fetching from different branches.
All recent GPTQ files are made with AutoGPTQ, and all files in non-main branches are made with AutoGPTQ. Files in the `main` branch which were uploaded before August 2023 were made with GPTQ-for-LLaMa.
<details>
<summary>Explanation of GPTQ parameters</summary>
- Bits: The bit size of the quantised model.
- GS: GPTQ group size. Higher numbers use less VRAM, but have lower quantisation accuracy. "None" is the lowest possible value.
- Act Order: True or False. Also known as `desc_act`. True results in better quantisation accuracy. Some GPTQ clients have had issues with models that use Act Order plus Group Size, but this is generally resolved now.
- Damp %: A GPTQ parameter that affects how samples are processed for quantisation. 0.01 is default, but 0.1 results in slightly better accuracy.
- GPTQ dataset: The dataset used for quantisation. Using a dataset more appropriate to the model's training can improve quantisation accuracy. Note that the GPTQ dataset is not the same as the dataset used to train the model - please refer to the original model repo for details of the training dataset(s).
- Sequence Length: The length of the dataset sequences used for quantisation. Ideally this is the same as the model sequence length. For some very long sequence models (16+K), a lower sequence length may have to be used. Note that a lower sequence length does not limit the sequence length of the quantised model. It only impacts the quantisation accuracy on longer inference sequences.
- ExLlama Compatibility: Whether this file can be loaded with ExLlama, which currently only supports Llama models in 4-bit.
</details>
| Branch | Bits | GS | Act Order | Damp % | GPTQ Dataset | Seq Len | Size | ExLlama | Desc |
| ------ | ---- | -- | --------- | ------ | ------------ | ------- | ---- | ------- | ---- |
| [main](https://huggingface.co/TheBloke/Airoboros-L2-70B-2.1-GPTQ/tree/main) | 4 | None | Yes | 0.1 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 35.33 GB | Yes | 4-bit, with Act Order. No group size, to lower VRAM requirements. |
| [gptq-4bit-32g-actorder_True](https://huggingface.co/TheBloke/Airoboros-L2-70B-2.1-GPTQ/tree/gptq-4bit-32g-actorder_True) | 4 | 32 | Yes | 0.1 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 40.66 GB | Yes | 4-bit, with Act Order and group size 32g. Gives highest possible inference quality, with maximum VRAM usage. |
| [gptq-4bit-64g-actorder_True](https://huggingface.co/TheBloke/Airoboros-L2-70B-2.1-GPTQ/tree/gptq-4bit-64g-actorder_True) | 4 | 64 | Yes | 0.1 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 37.99 GB | Yes | 4-bit, with Act Order and group size 64g. Uses less VRAM than 32g, but with slightly lower accuracy. |
| [gptq-4bit-128g-actorder_True](https://huggingface.co/TheBloke/Airoboros-L2-70B-2.1-GPTQ/tree/gptq-4bit-128g-actorder_True) | 4 | 128 | Yes | 0.1 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 36.65 GB | Yes | 4-bit, with Act Order and group size 128g. Uses even less VRAM than 64g, but with slightly lower accuracy. |
| [gptq-3bit--1g-actorder_True](https://huggingface.co/TheBloke/Airoboros-L2-70B-2.1-GPTQ/tree/gptq-3bit--1g-actorder_True) | 3 | None | Yes | 0.1 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 26.77 GB | No | 3-bit, with Act Order and no group size. Lowest possible VRAM requirements. May be lower quality than 3-bit 128g. |
| [gptq-3bit-128g-actorder_True](https://huggingface.co/TheBloke/Airoboros-L2-70B-2.1-GPTQ/tree/gptq-3bit-128g-actorder_True) | 3 | 128 | Yes | 0.1 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 28.03 GB | No | 3-bit, with group size 128g and act-order. Higher quality than 128g-False. |
<!-- README_GPTQ.md-provided-files end -->
<!-- README_GPTQ.md-download-from-branches start -->
## How to download from branches
- In text-generation-webui, you can add `:branch` to the end of the download name, eg `TheBloke/Airoboros-L2-70B-2.1-GPTQ:main`
- With Git, you can clone a branch with:
```
git clone --single-branch --branch main https://huggingface.co/TheBloke/Airoboros-L2-70B-2.1-GPTQ
```
- In Python Transformers code, the branch is the `revision` parameter; see below.
<!-- README_GPTQ.md-download-from-branches end -->
<!-- README_GPTQ.md-text-generation-webui start -->
## How to easily download and use this model in [text-generation-webui](https://github.com/oobabooga/text-generation-webui).
Please make sure you're using the latest version of [text-generation-webui](https://github.com/oobabooga/text-generation-webui).
It is strongly recommended to use the text-generation-webui one-click installers unless you know how to do a manual install.
1. Click the **Model tab**.
2. Under **Download custom model or LoRA**, enter `TheBloke/Airoboros-L2-70B-2.1-GPTQ`.
- To download from a specific branch, enter for example `TheBloke/Airoboros-L2-70B-2.1-GPTQ:main`
- see Provided Files above for the list of branches for each option.
3. Click **Download**.
4. The model will start downloading. Once it's finished it will say "Done".
5. In the top left, click the refresh icon next to **Model**.
6. In the **Model** dropdown, choose the model you just downloaded: `Airoboros-L2-70B-2.1-GPTQ`
7. The model will automatically load, and is now ready for use!
8. If you want any custom settings, set them and then click **Save settings for this model** followed by **Reload the Model** in the top right.
* Note that you do not need to and should not set manual GPTQ parameters any more. These are set automatically from the file `quantize_config.json`.
9. Once you're ready, click the **Text Generation tab** and enter a prompt to get started!
<!-- README_GPTQ.md-text-generation-webui end -->
<!-- README_GPTQ.md-use-from-python start -->
## How to use this GPTQ model from Python code
### Install the necessary packages
Requires: Transformers 4.32.0 or later, Optimum 1.12.0 or later, and AutoGPTQ 0.4.2 or later.
```shell
pip3 install transformers>=4.32.0 optimum>=1.12.0
pip3 install auto-gptq --extra-index-url https://huggingface.github.io/autogptq-index/whl/cu118/ # Use cu117 if on CUDA 11.7
```
If you have problems installing AutoGPTQ using the pre-built wheels, install it from source instead:
```shell
pip3 uninstall -y auto-gptq
git clone https://github.com/PanQiWei/AutoGPTQ
cd AutoGPTQ
pip3 install .
```
### For CodeLlama models only: you must use Transformers 4.33.0 or later.
If 4.33.0 is not yet released when you read this, you will need to install Transformers from source:
```shell
pip3 uninstall -y transformers
pip3 install git+https://github.com/huggingface/transformers.git
```
### You can then use the following code
```python
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
model_name_or_path = "TheBloke/Airoboros-L2-70B-2.1-GPTQ"
# To use a different branch, change revision
# For example: revision="main"
model = AutoModelForCausalLM.from_pretrained(model_name_or_path,
device_map="auto",
trust_remote_code=False,
revision="main")
tokenizer = AutoTokenizer.from_pretrained(model_name_or_path, use_fast=True)
prompt = "Tell me about AI"
prompt_template=f'''A chat.
USER: {prompt}
ASSISTANT:
'''
print("\n\n*** Generate:")
input_ids = tokenizer(prompt_template, return_tensors='pt').input_ids.cuda()
output = model.generate(inputs=input_ids, temperature=0.7, do_sample=True, top_p=0.95, top_k=40, max_new_tokens=512)
print(tokenizer.decode(output[0]))
# Inference can also be done using transformers' pipeline
print("*** Pipeline:")
pipe = pipeline(
"text-generation",
model=model,
tokenizer=tokenizer,
max_new_tokens=512,
do_sample=True,
temperature=0.7,
top_p=0.95,
top_k=40,
repetition_penalty=1.1
)
print(pipe(prompt_template)[0]['generated_text'])
```
<!-- README_GPTQ.md-use-from-python end -->
<!-- README_GPTQ.md-compatibility start -->
## Compatibility
The files provided are tested to work with AutoGPTQ, both via Transformers and using AutoGPTQ directly. They should also work with [Occ4m's GPTQ-for-LLaMa fork](https://github.com/0cc4m/KoboldAI).
[ExLlama](https://github.com/turboderp/exllama) is compatible with Llama models in 4-bit. Please see the Provided Files table above for per-file compatibility.
[Huggingface Text Generation Inference (TGI)](https://github.com/huggingface/text-generation-inference) is compatible with all GPTQ models.
<!-- README_GPTQ.md-compatibility end -->
<!-- footer start -->
<!-- 200823 -->
## Discord
For further support, and discussions on these models and AI in general, join us at:
[TheBloke AI's Discord server](https://discord.gg/theblokeai)
## Thanks, and how to contribute
Thanks to the [chirper.ai](https://chirper.ai) team!
Thanks to Clay from [gpus.llm-utils.org](https://gpus.llm-utils.org)!
I've had a lot of people ask if they can contribute. I enjoy providing models and helping people, and would love to be able to spend even more time doing it, as well as expanding into new projects like fine tuning/training.
If you're able and willing to contribute it will be most gratefully received and will help me to keep providing more models, and to start work on new AI projects.
Donaters will get priority support on any and all AI/LLM/model questions and requests, access to a private Discord room, plus other benefits.
* Patreon: https://patreon.com/TheBlokeAI
* Ko-Fi: https://ko-fi.com/TheBlokeAI
**Special thanks to**: Aemon Algiz.
**Patreon special mentions**: Alicia Loh, Stephen Murray, K, Ajan Kanaga, RoA, Magnesian, Deo Leter, Olakabola, Eugene Pentland, zynix, Deep Realms, Raymond Fosdick, Elijah Stavena, Iucharbius, Erik Bjäreholt, Luis Javier Navarrete Lozano, Nicholas, theTransient, John Detwiler, alfie_i, knownsqashed, Mano Prime, Willem Michiel, Enrico Ros, LangChain4j, OG, Michael Dempsey, Pierre Kircher, Pedro Madruga, James Bentley, Thomas Belote, Luke @flexchar, Leonard Tan, Johann-Peter Hartmann, Illia Dulskyi, Fen Risland, Chadd, S_X, Jeff Scroggin, Ken Nordquist, Sean Connelly, Artur Olbinski, Swaroop Kallakuri, Jack West, Ai Maven, David Ziegler, Russ Johnson, transmissions 11, John Villwock, Alps Aficionado, Clay Pascal, Viktor Bowallius, Subspace Studios, Rainer Wilmers, Trenton Dambrowitz, vamX, Michael Levine, 준교 김, Brandon Frisco, Kalila, Trailburnt, Randy H, Talal Aujan, Nathan Dryer, Vadim, 阿明, ReadyPlayerEmma, Tiffany J. Kim, George Stoitzev, Spencer Kim, Jerry Meng, Gabriel Tamborski, Cory Kujawski, Jeffrey Morgan, Spiking Neurons AB, Edmond Seymore, Alexandros Triantafyllidis, Lone Striker, Cap'n Zoog, Nikolai Manek, danny, ya boyyy, Derek Yates, usrbinkat, Mandus, TL, Nathan LeClaire, subjectnull, Imad Khwaja, webtim, Raven Klaugh, Asp the Wyvern, Gabriel Puliatti, Caitlyn Gatomon, Joseph William Delisle, Jonathan Leane, Luke Pendergrass, SuperWojo, Sebastain Graf, Will Dee, Fred von Graf, Andrey, Dan Guido, Daniel P. Andersen, Nitin Borwankar, Elle, Vitor Caleffi, biorpg, jjj, NimbleBox.ai, Pieter, Matthew Berman, terasurfer, Michael Davis, Alex, Stanislav Ovsiannikov
Thank you to all my generous patrons and donaters!
And thank you again to a16z for their generous grant.
<!-- footer end -->
# Original model card: Jon Durbin's Airoboros L2 70B 2.1
### Overview
__*This model is a bit broken due to a prompt formatting bug in the training code! 2.2 will be available soon and should fix this*__
This is an instruction fine-tuned llama-2 model, using synthetic data generated by [airoboros](https://github.com/jondurbin/airoboros)
- Experimental RP style instruction set, with two categories: rp and gtkm
- rp includes multi-round chats, with emotes, between a varying number of characters, defined by cards
- gtkm is a way to test a simpler alternative to ghost attention - first, a character card is generated, then several questions are created to ask the model (as the character), using the character system prompt, then everything is synthesized into a dialog (one system prompt, all turns remain in character)
- Experimental support for longer, more detailed writing prompts, as well as next-chapter generation
- I used the new `cull-instructions` entrypoint in airoboros to shrink the m2.0 dataset to a smaller subset of high-quality instructions (according to gpt-4)
- The training data now also includes "stylized_response", in which 1500 sample instructions from various categories were re-generated using character cards as system prompts.
- this should allow better adherence to style/etc. specified in the system card
- Thousands of new generations, using some of the updates re: Flesch hints, etc., to get longer/higher quality writing outputs.
- A small "de-alignment" dataset was also added (not published) to remove some of the censorship in the base models.
*Why do I try to remove censorship?*
- laws vary widely based on time and location
- a language model may conflate certain words with laws, e.g. it may think "stealing eggs from a chicken" is illegal
- these models just produce text; what you do with that text is your responsibility
- many people and industries deal with "sensitive" content; imagine if a court stenographer's equipment filtered illegal content - it would be useless
Huge thank you to the folks over at [a16z](https://a16z.com/) for sponsoring the costs associated with building models and associated tools!
### Prompt format
The training code was updated to randomize newline vs space:
https://github.com/jondurbin/qlora/blob/main/qlora.py#L559C1-L559C1
```
A chat. USER: {prompt} ASSISTANT:
```
or
```
A chat.
USER: {prompt}
ASSISTANT:
```
So in other words, it's the preamble/system prompt, followed by a single space or newline, then "USER: " (single space after colon) then the prompt (which can have multiple lines, spaces, whatever), then a single space or newline, followed by "ASSISTANT: " (with a single space after the colon).
__*I strongly suggest adding stopping criteria/early inference stopping on "USER:", because the training data includes many multi-round chats and could otherwise start simulating a conversation!*__
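As a minimal sketch (the helper names here are illustrative only, not part of any release), building the prompt and truncating a completion at a simulated "USER:" turn looks like:

```python
def build_prompt(user_message: str, system: str = "A chat.", sep: str = "\n") -> str:
    # Training randomized the separator between a single space and a newline,
    # so either value of `sep` matches the training distribution.
    return f"{system}{sep}USER: {user_message}{sep}ASSISTANT:"

def truncate_at_user(generated: str) -> str:
    # Cheap post-hoc stop: cut the completion at the first simulated "USER:" turn.
    # Proper early stopping inside generate() is preferable when available.
    idx = generated.find("USER:")
    return generated if idx == -1 else generated[:idx].rstrip()
```

In practice you would pass "USER:" as a stop string to your inference stack (or implement a stopping criterion) rather than only trimming after the fact, since trimming alone still wastes generated tokens.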
### Helpful usage tips
*The prompts shown here are just the text that would be included after USER: and before ASSISTANT: in the full prompt format above; the system prompt and USER:/ASSISTANT: have been omitted for readability.*
#### Context obedient question answering
By obedient, I mean the model was trained to ignore what it thinks it knows, and uses the context to answer the question. The model was also tuned to limit the values to the provided context as much as possible to reduce hallucinations.
The format for a closed-context prompt is as follows:
```
BEGININPUT
BEGINCONTEXT
[key0: value0]
[key1: value1]
... other metadata ...
ENDCONTEXT
[insert your text blocks here]
ENDINPUT
[add as many other blocks, in the exact same format]
BEGININSTRUCTION
[insert your instruction(s). The model was tuned with single questions, paragraph format, lists, etc.]
ENDINSTRUCTION
```
It's also helpful to add "Don't make up answers if you don't know." to your instruction block, so that the model doesn't make something up if the context is completely unrelated.
*The __only__ prompts that need this closed-context formatting are closed-context instructions. Normal questions/instructions do not!*
I know it's a bit verbose and annoying, but after much trial and error, using these explicit delimiters helps the model understand where to find the responses and how to associate specific sources with them.
- `BEGININPUT` - denotes a new input block
- `BEGINCONTEXT` - denotes the block of context (metadata key/value pairs) to associate with the current input block
- `ENDCONTEXT` - denotes the end of the metadata block for the current input
- [text] - Insert whatever text you want for the input block, as many paragraphs as can fit in the context.
- `ENDINPUT` - denotes the end of the current input block
- [repeat as many input blocks in this format as you want]
- `BEGININSTRUCTION` - denotes the start of the instruction(s) to respond to for all of the input blocks above.
- [instruction(s)]
- `ENDINSTRUCTION` - denotes the end of instruction set
It sometimes works without `ENDINSTRUCTION`, but by explicitly including that in the prompt, the model better understands that all of the instructions in the block should be responded to.
Here's a trivial, but important example to prove the point:
```
BEGININPUT
BEGINCONTEXT
date: 2021-01-01
url: https://web.site/123
ENDCONTEXT
In a shocking turn of events, blueberries are now green, but will be sticking with the same name.
ENDINPUT
BEGININSTRUCTION
What color are blueberries? Source?
ENDINSTRUCTION
```
And the response:
```
Blueberries are now green.
Source:
date: 2021-01-01
url: https://web.site/123
```
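The block format above can also be assembled programmatically; this is just an illustrative sketch (the `closed_context_prompt` helper and its argument shape are my own, not part of airoboros):

```python
def closed_context_prompt(blocks, instruction):
    """Assemble the BEGININPUT/BEGINCONTEXT format described above.

    `blocks` is a list of (metadata_dict, text) pairs -- a hypothetical
    structure chosen for this sketch, not a format the model requires.
    """
    parts = []
    for metadata, text in blocks:
        parts.append("BEGININPUT\nBEGINCONTEXT")
        for key, value in metadata.items():
            parts.append(f"{key}: {value}")
        parts.append(f"ENDCONTEXT\n{text}\nENDINPUT")
    parts.append(f"BEGININSTRUCTION\n{instruction}\nENDINSTRUCTION")
    return "\n".join(parts)

prompt = closed_context_prompt(
    [({"date": "2021-01-01", "url": "https://web.site/123"},
      "In a shocking turn of events, blueberries are now green.")],
    "What color are blueberries? Source? Don't make up answers if you don't know.",
)
```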
#### Coding
You can ask for fairly complex coding instructions with multiple criteria, e.g.:
```
Create a python application with the following requirements:
- Asyncio FastAPI webserver
- ping endpoint that returns the current date in JSON format
- file upload endpoint, which calculates the file's sha256 checksum, and checks postgres to deduplicate
```
Or inline criteria:
```
Write a multi-threaded TCP server in C that accepts a "GET [key]" input and "SET [key] [value]" input, and uses a binary tree to get and store the input values.
```
You can also optionally add a single space and "PLAINFORMAT" at the end of your prompt to avoid backticks, explanations, etc. and just print the code, e.g.:
```
Write a websocket application in node.js. PLAINFORMAT
```
#### Agent/function calling
The dataset includes many examples of function/args generation based on input criteria. This is somewhat similar to the OpenAI function calling, but the output is either JSON or YAML.
Example prompt:
```
As an AI assistant, please select the most suitable function and parameters from the list of available functions below, based on the user's input. Provide your response in JSON format.
Input: I want to know how many times 'Python' is mentioned in my text file.
Available functions:
file_analytics:
description: This tool performs various operations on a text file.
params:
action: The operation we want to perform on the data, such as "count_occurrences", "find_line", etc.
filters:
keyword: The word or phrase we want to search for.
```
Response:
```json
{
"function": "file_analytics",
"params": {
"action": "count_occurrences",
"filters": {
"keyword": "Python"
}
}
}
```
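On the consuming side, a JSON response in this shape can be parsed and dispatched to your own handlers. A minimal sketch (the handler and registry names are hypothetical; YAML responses would need `yaml.safe_load` instead of `json.loads`):

```python
import json

def dispatch(model_output: str, registry: dict):
    # Parse the model's JSON response and call the matching handler from
    # `registry`, a name -> callable map you define yourself.
    call = json.loads(model_output)
    handler = registry[call["function"]]
    return handler(**call.get("params", {}))

# Stand-in handler, for illustration only.
def file_analytics(action, filters):
    return f"{action}:{filters['keyword']}"

response = ('{"function": "file_analytics", "params": {"action": '
            '"count_occurrences", "filters": {"keyword": "Python"}}}')
result = dispatch(response, {"file_analytics": file_analytics})
```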
#### Chain-of-thought
You can ask for several possible responses to a given problem, with a ranking and final answer selection.
Example prompt:
```
A group of people decide to play a round-robin tournament where every player plays every other player exactly once. If a total of 45 games were played, how many players participated in the tournament? Offer several potential answers, rank them according to mathematical logic, and pick the most feasible one.
```
Example response:
```
Solution 1:
In a round-robin tournament, each player plays every other player exactly once. This means that the total number of games played is equal to the combination of all players taken two at a time (since a game involves two players).
The formula for combinations is nC2 = n(n-1)/2, where n is the total number of players. We know that nC2 equals 45 (the total number of games), so we can set up the equation as follows:
n(n-1)/2 = 45
Solving this equation gives us n=10.
Final answer: There were 10 players in the tournament.
Solution 2:
Another way to approach this problem is by using the concept of triangular numbers. Triangular numbers are a sequence of numbers where each number can be represented as a triangle with dots. The nth triangular number is given by the sum of the first n natural numbers.
If we consider each game as a dot and each player as a side of the triangle, then the total number of games (45) would correspond to the 9th triangular number because 1+2+3+4+5+6+7+8+9=45. However, since each side of the triangle represents a player, and there's one more player than sides in our model (because the last player has no one left to play against), we need to add one more to get the total number of players.
Final answer: There were 10 players in the tournament.
Solution 3:
We could also use trial and error to solve this problem. Starting from a small number of players, we calculate how many games would be played in a round-robin tournament until we reach 45 games. For example, with 2 players, only 1 game is played; with 3 players, 3 games are played; with 4 players, 6 games are played, and so on. Continuing this process, we find that with 10 players, 45 games are played.
Final answer: There were 10 players in the tournament.
Ranking of solutions from best to worst:
1. Solution 1: This solution uses direct mathematical reasoning based on the properties of a round-robin tournament and solves an equation to find the answer.
2. Solution 2: This solution uses the concept of triangular numbers which requires additional steps and knowledge beyond the immediate problem context, but still arrives at the correct answer.
3. Solution 3: This solution relies on trial and error, which is less efficient and reliable than the other methods, but it eventually gets to the right answer.
Best and final answer: There were 10 players in the tournament.
```
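The arithmetic in Solution 1 is easy to verify mechanically; a quick sanity check (not part of the original card):

```python
# Find the smallest n with n(n-1)/2 == games, i.e. invert the round-robin formula.
def players_for_games(games: int) -> int:
    n = 2
    while n * (n - 1) // 2 < games:
        n += 1
    return n
```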
#### reWOO style execution planning
The model now supports execution planning for complex instructions that would require making use of several tools. The output is just the plan; you must implement a mechanism to parse the output and actually call the functions!
Example prompt:
```
Please construct a systematic plan to generate an optimal response to the user instruction, utilizing a set of provided tools. Each plan will correspond to an evidence value, which will be the output of one of the available functions given an input string
that could be the user's question, one or more prior evidence values, or a combination of both.
Here are the tools available to be called:
DuckDuckGo[input]: Tool that enables the user to search for information using the DuckDuckGo search engine. This tool is helpful in locating a suitable list of sites that may contain the answer to the user's query. It does not directly answer the question but finds a list of sites that might have the answer.
TextScraper[input]: Load one or more websites from the input string containing newline delimited links, where input is one or more links, and produces plain text output containing the content of the links.
HyperlinkExtractor[input]: Extract links from plain text and produces a plain text, newline delimited response of links.
KnowledgeModel[input]: Question answering language model, particularly useful in answering questions based on an input passage of text. The input must be a text question that references an :evidence[n]: variable, e.g. What color is the cat, given :evidence1:?
The output should be in format:
Plan: [first action to take based on input question]
:evidence0: = [function to call with input parameter]
Plan: [next action to take, based on result of :evidence0:]
:evidence1: = [next function to call and input parameter, which may include reference to previous evidence, e.g. "Given context :evidence0"]
...
Answer: [:evidence[n]: containing the final answer.]
Question: Who won the Nobel Prize in Literature in 2020 and what is their most famous work?
```
Response:
```
Plan: Begin by conducting a web search to find out who won the Nobel Prize in Literature in 2020.
:evidence0: = DuckDuckGo[Nobel Prize in Literature 2020 winner]
Plan: Utilize the KnowledgeModel to interpret the search results and find the name of the laureate.
:evidence1: = KnowledgeModel[Who won the Nobel Prize in Literature in 2020, given :evidence0:?]
Plan: Conduct another web search to find the most famous work of the identified laureate.
:evidence2: = DuckDuckGo[Most famous work of :evidence1:]
Plan: Extract the relevant links from the DuckDuckGo search results for a more focused search.
:evidence3: = HyperlinkExtractor[:evidence2:]
Plan: Use the TextScraper tool to extract information from the relevant links.
:evidence4: = TextScraper[:evidence3:]
Plan: Finally, utilize the KnowledgeModel to identify and summarize the most famous work of the laureate from the extracted information.
:evidence5: = KnowledgeModel[What is the most famous work of :evidence1:, given :evidence4:?]
Answer: :evidence5:
```
For this to be useful, you'd have to parse the output plan text and implement/call each of the functions. This is just pseudo-code, completely untested off the top of my head, and would obviously require full implementation + hardening:
```python
import re
import requests

def inject_context(input_text, **context):
    for ref in set(re.findall(r"(:evidence[0-9]+:)", input_text, re.I)):
        input_text = input_text.replace(ref, context.get(ref, ""))
    return input_text

def duckduckgo(input_text, **context):
    search_string = inject_context(input_text, **context)
    # ... search via DuckDuckGo using search_string, return the text content

def link_extractor(input_text, **context):
    input_text = inject_context(input_text, **context)
    return "\n".join(list(set(re.findall(r"(https?://[^\s]+?\.?)", input_text, re.I))))

def scrape(input_text, **context):
    input_text = inject_context(input_text, **context)
    text = []
    for link in input_text.splitlines():
        text.append(requests.get(link).text)
    return "\n".join(text)

def infer(input_text, **context):
    prompt = inject_context(input_text, **context)
    # ... call the model with prompt, return its output

def parse_plan(plan):
    method_map = {
        "DuckDuckGo": duckduckgo,
        "HyperlinkExtractor": link_extractor,
        "KnowledgeModel": infer,
        "TextScraper": scrape,
    }
    context = {}
    for line in plan.strip().splitlines():
        if line.startswith("Plan:"):
            print(line)
            continue
        parts = re.match(r"^(:evidence[0-9]+:)\s*=\s*([^\[]+)(\[.*\])\s*$", line, re.I)
        if not parts:
            if line.startswith("Answer: "):
                return context.get(line.split(" ")[-1].strip(), "Answer couldn't be generated...")
            raise RuntimeError("bad format: " + line)
        context[parts.group(1)] = method_map[parts.group(2)](parts.group(3).strip("[]"), **context)
```
### Contribute
If you're interested in new functionality, particularly a new "instructor" type to generate a specific type of training data,
take a look at the dataset generation tool repo: https://github.com/jondurbin/airoboros and either make a PR or open an issue with details.
To help me with the OpenAI/compute costs:
- https://bmc.link/jondurbin
- ETH 0xce914eAFC2fe52FdceE59565Dd92c06f776fcb11
- BTC bc1qdwuth4vlg8x37ggntlxu5cjfwgmdy5zaa7pswf
### Licence and usage restrictions
The airoboros 2.1 models are built on top of llama-2.
The llama-2 base model has a custom Meta license:
- See the [meta-license/LICENSE.txt](meta-license/LICENSE.txt) file attached for the original license provided by Meta.
- See also [meta-license/USE_POLICY.md](meta-license/USE_POLICY.md) and [meta-license/Responsible-Use-Guide.pdf](meta-license/Responsible-Use-Guide.pdf), also provided by Meta.
The fine-tuning data was generated by OpenAI API calls to gpt-4, via [airoboros](https://github.com/jondurbin/airoboros)
The ToS for OpenAI API usage has a clause preventing the output from being used to train a model that __competes__ with OpenAI
- what does *compete* actually mean here?
- these small open source models will not produce output anywhere near the quality of gpt-4, or even gpt-3.5, so I can't imagine this could credibly be considered competing in the first place
- if someone else uses the dataset to do the same, they wouldn't necessarily be violating the ToS because they didn't call the API, so I don't know how that works
- the training data used in essentially all large language models includes a significant amount of copyrighted or otherwise non-permissive licensing in the first place
- other work using the self-instruct method, e.g. the original here: https://github.com/yizhongw/self-instruct released the data and model as apache-2
I am purposely leaving this license ambiguous (other than the fact you must comply with the Meta original license for llama-2) because I am not a lawyer and refuse to attempt to interpret all of the terms accordingly.
Your best bet is probably to avoid using this commercially due to the OpenAI API usage.
Either way, by using this model, you agree to completely indemnify me.
-0.021514892578125,
-0.0012369155883789062,
0.01450347900390625,
-0.055755615234375,
0.01070404052734375,
0.0265655517578125,
-0.01279449462890625,
0.007343292236328125,
-0.01328277587890625,
0.00452423095703125,
-0.053314208984375,
-0.012725830078125,
0.044158935546875,
0.0188140869140625,
-0.0223236083984375,
0.06585693359375,
-0.0079193115234375,
0.04730224609375,
-0.045440673828125,
-0.01361083984375,
-0.028656005859375,
-0.00868988037109375,
-0.0257110595703125,
-0.056976318359375,
0.0149688720703125,
-0.01279449462890625,
-0.0029506683349609375,
0.0008177757263183594,
0.054962158203125,
-0.01445770263671875,
-0.035400390625,
0.0258026123046875,
0.030120849609375,
0.0233306884765625,
-0.007595062255859375,
-0.0830078125,
0.024627685546875,
0.0016803741455078125,
-0.06103515625,
0.031829833984375,
0.027801513671875,
0.019134521484375,
0.048980712890625,
0.04296875,
-0.005886077880859375,
0.00286865234375,
-0.01473236083984375,
0.07269287109375,
-0.06304931640625,
-0.0221710205078125,
-0.058013916015625,
0.0469970703125,
-0.01290130615234375,
-0.03741455078125,
0.057464599609375,
0.04913330078125,
0.05712890625,
0.0076446533203125,
0.0565185546875,
-0.0343017578125,
0.0135040283203125,
-0.029388427734375,
0.050872802734375,
-0.058624267578125,
0.005168914794921875,
-0.03033447265625,
-0.052001953125,
0.0011873245239257812,
0.05511474609375,
-0.0017709732055664062,
0.02374267578125,
0.0282745361328125,
0.06268310546875,
0.0020046234130859375,
0.01543426513671875,
0.0120391845703125,
0.0274810791015625,
0.01273345947265625,
0.06488037109375,
0.0526123046875,
-0.07891845703125,
0.042816162109375,
-0.035919189453125,
-0.016326904296875,
-0.01241302490234375,
-0.0614013671875,
-0.04925537109375,
-0.035003662109375,
-0.051361083984375,
-0.046234130859375,
-0.00821685791015625,
0.07147216796875,
0.06756591796875,
-0.047332763671875,
-0.022247314453125,
-0.0101776123046875,
-0.0015010833740234375,
-0.0225372314453125,
-0.024810791015625,
0.0221710205078125,
0.030059814453125,
-0.04779052734375,
0.013702392578125,
0.0024547576904296875,
0.03271484375,
-0.007770538330078125,
-0.0259857177734375,
-0.01444244384765625,
0.005054473876953125,
0.043182373046875,
0.03765869140625,
-0.041229248046875,
-0.0102691650390625,
-0.01155853271484375,
-0.00458526611328125,
0.019073486328125,
0.0225830078125,
-0.05474853515625,
-0.0023746490478515625,
0.0391845703125,
0.01142120361328125,
0.06640625,
0.004425048828125,
0.02099609375,
-0.038177490234375,
0.00490570068359375,
0.0024738311767578125,
0.0259857177734375,
0.0032405853271484375,
-0.0377197265625,
0.052093505859375,
0.032440185546875,
-0.050750732421875,
-0.0576171875,
-0.01146697998046875,
-0.094970703125,
-0.0209503173828125,
0.0882568359375,
-0.010345458984375,
-0.0221405029296875,
0.0011959075927734375,
-0.0210723876953125,
0.026763916015625,
-0.04241943359375,
0.0211334228515625,
0.036590576171875,
-0.0241546630859375,
-0.0242919921875,
-0.056976318359375,
0.0380859375,
0.01422882080078125,
-0.06829833984375,
0.002658843994140625,
0.039031982421875,
0.0379638671875,
0.004474639892578125,
0.0645751953125,
-0.0183868408203125,
0.0243988037109375,
0.015350341796875,
0.0017681121826171875,
-0.0015306472778320312,
0.005733489990234375,
-0.0289154052734375,
-0.002422332763671875,
-0.0167236328125,
0.002635955810546875
]
] |
Yukang/Llama-2-13b-longlora-16k-ft | 2023-09-24T09:41:37.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"arxiv:2309.12307",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text-generation | Yukang | null | null | Yukang/Llama-2-13b-longlora-16k-ft | 2 | 6,858 | transformers | 2023-09-12T10:56:04 | # LongLoRA: Efficient Fine-tuning of Long-Context Large Language Models
<font size=6><div align='center' > <a href=http://arxiv.org/abs/2309.12307>**Paper**</a> | <a href="https://huggingface.co/Yukang">**Models**</a> | <a href="https://github.com/dvlab-research/LongLoRA">**Code**</a> </div></font>
**LongLoRA: Efficient Fine-tuning of Long-Context Large Language Models [[Paper](http://arxiv.org/abs/2309.12307)]** <br />
[Yukang Chen](https://scholar.google.com/citations?user=6p0ygKUAAAAJ&hl=en),
[Shengju Qian](https://scholar.google.com/citations?user=QNnWmasAAAAJ),
[Haotian Tang](https://scholar.google.com/citations?user=WxL13BAAAAAJ&hl),
[Xin Lai](https://scholar.google.com/citations?user=tqNDPA4AAAAJ&hl=zh-CN),
[Zhijian Liu](https://scholar.google.com/citations?user=3coYSTUAAAAJ&hl=en),
[Song Han](https://scholar.google.com/citations?user=E0iCaa4AAAAJ&hl=zh-CN),
[Jiaya Jia](https://scholar.google.com/citations?user=XPAkzTEAAAAJ&hl=en)<br />
## Abstract
We present LongLoRA, an efficient fine-tuning approach that extends the context sizes of pre-trained large language models (LLMs), with limited computation cost.
Typically, training LLMs with long context sizes is computationally expensive, requiring extensive training hours and GPU resources.
In this paper, we speed up the context extension of LLMs in two aspects. On the one hand, although dense global attention is needed during inference, fine-tuning the model can be effectively and efficiently done by sparse local attention. The proposed shifted short attention effectively enables context extension, leading to non-trivial computation savings with similar performance to fine-tuning with vanilla attention. On the other hand, we find that LoRA for context extension works well under the premise of trainable embedding and normalization. LongLoRA demonstrates strong empirical results on various tasks on LLaMA2 models from 7B/13B to 70B. LongLoRA extends LLaMA2 7B from a 4k context to 100k, or LLaMA2 70B to 32k, on a single 8x A100 machine. LongLoRA extends models' context while retaining their original architectures, and is compatible with most existing techniques, like FlashAttention-2. In addition, to make LongLoRA practical, we collected a dataset, LongQA, for supervised fine-tuning. It contains more than 3k long-context question-answer pairs. For more details, please refer to the [paper](http://arxiv.org/abs/2309.12307).
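The shifted short attention described above can be illustrated with a minimal, framework-free sketch. The function name and signature below are illustrative only (not from the official repo): half of the heads attend within plain token groups, while the other half first shift tokens by half a group size so information can flow across group boundaries.

```python
# Minimal sketch of LongLoRA-style shifted short attention (S2-Attn),
# assuming an even number of heads and a sequence length divisible by
# the group size. Only the grouping/shift pattern is shown; the actual
# attention computation inside each group is omitted.
def s2_attn_groups(seq, num_heads, group_size):
    half = group_size // 2
    groups = {}
    for h in range(num_heads):
        if h < num_heads // 2:
            s = list(seq)                              # un-shifted heads
        else:
            s = list(seq[half:]) + list(seq[:half])    # roll by -group_size//2
        # each head attends only inside its own (possibly shifted) group
        groups[h] = [s[i:i + group_size] for i in range(0, len(s), group_size)]
    return groups

# With 8 tokens, 2 heads, and groups of 4, head 1 sees groups that
# straddle the boundary head 0 cannot cross:
print(s2_attn_groups(list(range(8)), num_heads=2, group_size=4))
```

Because the shifted heads bridge adjacent groups, stacking layers lets information propagate across the full sequence, which is why dense attention can still be used at inference time without retraining.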
## Highlights
**LongLoRA** speeds up the context extension of pre-trained large language models at both the attention level and the weight level.
1. The proposed shifted short attention is easy to implement, compatible with Flash-Attention, and not required during inference.
2. We release all our models, including models from 7B to 70B, context length from 8k to 100k, including [LLaMA2-LongLoRA-7B-100k](https://huggingface.co/Yukang/Llama-2-7b-longlora-100k-ft), [LLaMA2-LongLoRA-13B-64k](https://huggingface.co/Yukang/Llama-2-13b-longlora-64k), and [LLaMA2-LongLoRA-70B-32k](https://huggingface.co/Yukang/Llama-2-70b-longlora-32k).
3. We built a long-context QA dataset, LongQA, for supervised fine-tuning (SFT). We release 13B and 70B 32k models with SFT, [Llama-2-13b-chat-longlora-32k-sft](https://huggingface.co/Yukang/Llama-2-13b-chat-longlora-32k-sft) and [Llama-2-70b-chat-longlora-32k-sft](https://huggingface.co/Yukang/Llama-2-70b-chat-longlora-32k-sft). We will further release the dataset next week.
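The weight-level idea noted in the abstract (LoRA works for context extension once embedding and normalization layers are also trainable, labeled "LoRA+" in the tables below) can be sketched as a simple parameter filter. The name patterns here are assumptions matching typical Hugging Face LLaMA parameter names, not the official implementation.

```python
# Illustrative-only sketch of LongLoRA's improved LoRA recipe: besides
# the injected low-rank adapters ("lora_"), token embeddings and
# normalization layers are left trainable; everything else is frozen.
def longlora_trainable(param_names):
    patterns = ("lora_", "embed_tokens", "norm")
    return sorted(n for n in param_names if any(p in n for p in patterns))

names = [
    "model.embed_tokens.weight",
    "model.layers.0.self_attn.q_proj.weight",
    "model.layers.0.self_attn.q_proj.lora_A.weight",
    "model.layers.0.input_layernorm.weight",
    "lm_head.weight",
]
print(longlora_trainable(names))
```

In a training loop, one would set `requires_grad = True` only on the returned parameters; the base attention and MLP weights stay frozen as in vanilla LoRA.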
## Released models
### Models with supervised fine-tuning
| Model | Size | Context | Train | Link |
|:----------------------------------|------|---------|---------|-------------------------------------------------------------------------|
| Llama-2-13b-chat-longlora-32k-sft | 13B | 32768 | LoRA+ | [link](https://huggingface.co/Yukang/Llama-2-13b-chat-longlora-32k-sft) |
| Llama-2-70b-chat-longlora-32k-sft | 70B | 32768 | LoRA+ | [link](https://huggingface.co/Yukang/Llama-2-70b-chat-longlora-32k-sft) |
### Models with context extension via fully fine-tuning
| Model | Size | Context | Train | Link |
|:----------------------------|------|---------|-------|-------------------------------------------------------------------|
| Llama-2-7b-longlora-8k-ft | 7B | 8192 | Full FT | [link](https://huggingface.co/Yukang/Llama-2-7b-longlora-8k-ft) |
| Llama-2-7b-longlora-16k-ft | 7B | 16384 | Full FT | [link](https://huggingface.co/Yukang/Llama-2-7b-longlora-16k-ft) |
| Llama-2-7b-longlora-32k-ft | 7B | 32768 | Full FT | [link](https://huggingface.co/Yukang/Llama-2-7b-longlora-32k-ft) |
| Llama-2-7b-longlora-100k-ft | 7B | 100000 | Full FT | [link](https://huggingface.co/Yukang/Llama-2-7b-longlora-100k-ft) |
| Llama-2-13b-longlora-8k-ft | 13B | 8192 | Full FT | [link](https://huggingface.co/Yukang/Llama-2-13b-longlora-8k-ft) |
| Llama-2-13b-longlora-16k-ft | 13B | 16384 | Full FT | [link](https://huggingface.co/Yukang/Llama-2-13b-longlora-16k-ft) |
| Llama-2-13b-longlora-32k-ft | 13B | 32768 | Full FT | [link](https://huggingface.co/Yukang/Llama-2-13b-longlora-32k-ft) |
### Models with context extension via improved LoRA fine-tuning
| Model | Size | Context | Train | Link |
|:----------------------------|------|---------|-------|-------------------------------------------------------------------|
| Llama-2-7b-longlora-8k | 7B | 8192 | LoRA+ | [link](https://huggingface.co/Yukang/Llama-2-7b-longlora-8k) |
| Llama-2-7b-longlora-16k | 7B | 16384 | LoRA+ | [link](https://huggingface.co/Yukang/Llama-2-7b-longlora-16k) |
| Llama-2-7b-longlora-32k | 7B | 32768 | LoRA+ | [link](https://huggingface.co/Yukang/Llama-2-7b-longlora-32k) |
| Llama-2-13b-longlora-8k | 13B | 8192 | LoRA+ | [link](https://huggingface.co/Yukang/Llama-2-13b-longlora-8k) |
| Llama-2-13b-longlora-16k | 13B | 16384 | LoRA+ | [link](https://huggingface.co/Yukang/Llama-2-13b-longlora-16k) |
| Llama-2-13b-longlora-32k | 13B | 32768 | LoRA+ | [link](https://huggingface.co/Yukang/Llama-2-13b-longlora-32k) |
| Llama-2-13b-longlora-64k | 13B | 65536 | LoRA+ | [link](https://huggingface.co/Yukang/Llama-2-13b-longlora-64k) |
| Llama-2-70b-longlora-32k | 70B | 32768 | LoRA+ | [link](https://huggingface.co/Yukang/Llama-2-70b-longlora-32k) |
| Llama-2-70b-chat-longlora-32k | 70B | 32768 | LoRA+ | [link](https://huggingface.co/Yukang/Llama-2-70b-chat-longlora-32k) |
## Citation
If you find this project useful in your research, please consider citing:
```
@article{longlora,
title={LongLoRA: Efficient Fine-tuning of Long-Context Large Language Models},
author={Yukang Chen and Shengju Qian and Haotian Tang and Xin Lai and Zhijian Liu and Song Han and Jiaya Jia},
journal={arXiv:2309.12307},
year={2023}
}
```
## Acknowledgement
- This work is built upon the [LLaMA2](https://ai.meta.com/llama) as the pre-trained models.
- This work is based on [DeepSpeed](https://github.com/microsoft/DeepSpeed), [peft](https://github.com/huggingface/peft), and [Flash-Attention2](https://github.com/Dao-AILab/flash-attention) for acceleration.
- The perplexity evaluation code is modified upon [Landmark Attention](https://github.com/epfml/landmark-attention).
- We use [LongChat](https://github.com/DachengLi1/LongChat) for the retrieval evaluation.
| 7,594 | [
[
-0.05255126953125,
-0.058258056640625,
0.0193939208984375,
0.034088134765625,
-0.0299224853515625,
-0.023712158203125,
-0.036590576171875,
-0.0631103515625,
0.037994384765625,
0.025970458984375,
-0.048431396484375,
-0.042266845703125,
-0.035125732421875,
0.01515960693359375,
-0.00846099853515625,
0.07867431640625,
-0.001800537109375,
-0.037750244140625,
0.0222320556640625,
-0.0279541015625,
-0.03338623046875,
-0.02191162109375,
-0.045501708984375,
-0.01381683349609375,
0.059234619140625,
0.021820068359375,
0.0499267578125,
0.0455322265625,
0.02984619140625,
0.020843505859375,
-0.03558349609375,
0.0264129638671875,
-0.0408935546875,
-0.0163421630859375,
-0.0009679794311523438,
-0.01554107666015625,
-0.08056640625,
-0.01079559326171875,
0.05169677734375,
0.037139892578125,
0.00443267822265625,
0.031463623046875,
0.017303466796875,
0.058746337890625,
-0.02825927734375,
0.006298065185546875,
-0.02447509765625,
-0.011322021484375,
-0.0277862548828125,
-0.006591796875,
-0.0019102096557617188,
-0.0232696533203125,
-0.001811981201171875,
-0.049285888671875,
-0.01727294921875,
-0.00920867919921875,
0.0889892578125,
0.03411865234375,
-0.038055419921875,
-0.00717926025390625,
-0.01403045654296875,
0.0633544921875,
-0.07366943359375,
0.02130126953125,
0.0251312255859375,
0.011810302734375,
-0.0246734619140625,
-0.034698486328125,
-0.040283203125,
-0.0007739067077636719,
-0.0209197998046875,
0.0039825439453125,
-0.0173797607421875,
-0.00862884521484375,
0.0343017578125,
0.027923583984375,
-0.024627685546875,
0.0220184326171875,
-0.0213623046875,
0.005313873291015625,
0.058868408203125,
0.0005235671997070312,
0.0203857421875,
-0.008636474609375,
-0.030792236328125,
-0.0016107559204101562,
-0.063232421875,
0.02679443359375,
0.0146636962890625,
0.01995849609375,
-0.0419921875,
0.03729248046875,
-0.026885986328125,
0.0604248046875,
0.0189361572265625,
-0.0333251953125,
0.03912353515625,
-0.03350830078125,
-0.024200439453125,
-0.0170745849609375,
0.050567626953125,
0.031158447265625,
0.00366973876953125,
0.0167694091796875,
-0.0083770751953125,
0.00101470947265625,
-0.0221710205078125,
-0.07177734375,
0.015106201171875,
0.01739501953125,
-0.02978515625,
-0.0180511474609375,
-0.00757598876953125,
-0.06256103515625,
-0.0038299560546875,
-0.0230865478515625,
0.0098876953125,
-0.0289306640625,
-0.0229339599609375,
0.0233001708984375,
0.01934814453125,
0.026123046875,
0.03582763671875,
-0.041656494140625,
0.01177978515625,
0.040008544921875,
0.045745849609375,
-0.014617919921875,
-0.0247650146484375,
-0.032440185546875,
0.00592803955078125,
-0.020416259765625,
0.039337158203125,
-0.0135498046875,
-0.009613037109375,
-0.0151824951171875,
0.0235595703125,
-0.006214141845703125,
-0.01513671875,
0.041534423828125,
-0.023712158203125,
0.0015153884887695312,
-0.03570556640625,
-0.03631591796875,
-0.00965118408203125,
0.01678466796875,
-0.055908203125,
0.081787109375,
0.0188140869140625,
-0.0655517578125,
0.0230865478515625,
-0.065185546875,
-0.01446533203125,
-0.026092529296875,
0.0156097412109375,
-0.040130615234375,
-0.0189361572265625,
0.033905029296875,
0.0421142578125,
-0.0243682861328125,
0.0002162456512451172,
-0.025848388671875,
-0.032684326171875,
0.01403045654296875,
0.0008373260498046875,
0.060699462890625,
0.022674560546875,
-0.04559326171875,
0.0266876220703125,
-0.061492919921875,
0.0087890625,
0.0225830078125,
-0.035675048828125,
-0.007659912109375,
-0.0164337158203125,
0.005550384521484375,
0.0263671875,
0.0333251953125,
-0.017822265625,
0.033966064453125,
-0.03436279296875,
0.040771484375,
0.049346923828125,
-0.00984954833984375,
0.019866943359375,
-0.0262451171875,
0.0343017578125,
0.01019287109375,
0.0174713134765625,
-0.005031585693359375,
-0.0340576171875,
-0.0780029296875,
-0.034515380859375,
0.016265869140625,
0.0200347900390625,
-0.042327880859375,
0.06494140625,
-0.0384521484375,
-0.03656005859375,
-0.040618896484375,
0.0360107421875,
0.040985107421875,
0.028961181640625,
0.0290985107421875,
-0.0193939208984375,
-0.033294677734375,
-0.06591796875,
-0.0006155967712402344,
0.01195526123046875,
0.01369476318359375,
0.032012939453125,
0.048095703125,
-0.033843994140625,
0.05975341796875,
-0.03741455078125,
-0.026641845703125,
-0.01666259765625,
-0.0252838134765625,
0.0440673828125,
0.03985595703125,
0.07366943359375,
-0.05010986328125,
-0.04766845703125,
0.0119781494140625,
-0.044830322265625,
-0.0038547515869140625,
0.01081085205078125,
-0.0238037109375,
0.043243408203125,
0.0280609130859375,
-0.06085205078125,
0.042236328125,
0.047698974609375,
-0.0499267578125,
0.032196044921875,
-0.005092620849609375,
0.0015745162963867188,
-0.09521484375,
0.026885986328125,
-0.0007567405700683594,
-0.02325439453125,
-0.03936767578125,
0.025848388671875,
0.00787353515625,
0.0196990966796875,
-0.04486083984375,
0.0675048828125,
-0.04302978515625,
-0.0034313201904296875,
-0.0198516845703125,
0.011993408203125,
-0.0008502006530761719,
0.061676025390625,
-0.005786895751953125,
0.06243896484375,
0.036468505859375,
-0.037933349609375,
0.023529052734375,
0.017974853515625,
-0.03338623046875,
0.0316162109375,
-0.050048828125,
0.021697998046875,
0.00922393798828125,
0.05609130859375,
-0.052093505859375,
-0.03216552734375,
0.0176239013671875,
-0.01372528076171875,
0.016448974609375,
-0.00567626953125,
-0.038116455078125,
-0.042877197265625,
-0.04632568359375,
0.040802001953125,
0.0330810546875,
-0.055084228515625,
0.004138946533203125,
0.01531982421875,
0.011077880859375,
-0.048675537109375,
-0.033447265625,
-0.0025234222412109375,
-0.048431396484375,
-0.057708740234375,
0.0286712646484375,
-0.01666259765625,
-0.002834320068359375,
-0.0184783935546875,
0.01351165771484375,
0.007720947265625,
0.00914764404296875,
0.020782470703125,
0.010162353515625,
-0.026275634765625,
0.00249481201171875,
-0.01013946533203125,
-0.000865936279296875,
-0.02801513671875,
0.0016031265258789062,
0.055572509765625,
-0.033721923828125,
-0.012664794921875,
-0.0562744140625,
0.0126190185546875,
0.03948974609375,
-0.0196990966796875,
0.05181884765625,
0.0670166015625,
-0.020111083984375,
-0.0007486343383789062,
-0.050384521484375,
-0.00439453125,
-0.03570556640625,
0.01207733154296875,
-0.03546142578125,
-0.08209228515625,
0.059783935546875,
0.009613037109375,
0.005641937255859375,
0.048675537109375,
0.03582763671875,
0.0189361572265625,
0.0711669921875,
0.049407958984375,
-0.040283203125,
0.0482177734375,
-0.03985595703125,
-0.0008244514465332031,
-0.07391357421875,
-0.0005025863647460938,
-0.014678955078125,
-0.033966064453125,
-0.0511474609375,
-0.044036865234375,
0.0258331298828125,
0.030426025390625,
-0.03094482421875,
0.045745849609375,
-0.038604736328125,
0.0255889892578125,
0.028961181640625,
0.0211029052734375,
0.0122222900390625,
-0.00937652587890625,
0.0193023681640625,
0.0018310546875,
-0.0299530029296875,
-0.0220794677734375,
0.06964111328125,
0.0372314453125,
0.03857421875,
0.0257110595703125,
0.0482177734375,
-0.00299072265625,
0.01506805419921875,
-0.049468994140625,
0.0465087890625,
0.00896453857421875,
-0.03912353515625,
-0.03790283203125,
-0.0222320556640625,
-0.08319091796875,
0.0184478759765625,
-0.00736236572265625,
-0.06805419921875,
0.01126861572265625,
0.005889892578125,
-0.036224365234375,
0.01959228515625,
-0.03912353515625,
0.057830810546875,
-0.01039886474609375,
-0.03594970703125,
-0.02105712890625,
-0.050262451171875,
0.040496826171875,
-0.003948211669921875,
0.00756072998046875,
-0.019561767578125,
-0.003467559814453125,
0.059906005859375,
-0.04888916015625,
0.06591796875,
-0.005092620849609375,
-0.041595458984375,
0.035552978515625,
-0.01554107666015625,
0.05255126953125,
0.00817108154296875,
-0.0077362060546875,
0.0008788108825683594,
0.00018978118896484375,
-0.039398193359375,
-0.032958984375,
0.0667724609375,
-0.05584716796875,
-0.043548583984375,
-0.0209808349609375,
-0.033447265625,
-0.01103973388671875,
0.0213623046875,
0.01458740234375,
0.00669097900390625,
0.006626129150390625,
0.021759033203125,
0.03564453125,
-0.019195556640625,
0.038787841796875,
0.020416259765625,
-0.025054931640625,
-0.0242919921875,
0.05535888671875,
-0.00455474853515625,
0.0111541748046875,
0.00966644287109375,
0.0055084228515625,
-0.007511138916015625,
-0.027801513671875,
-0.028839111328125,
0.040435791015625,
-0.03814697265625,
-0.0312347412109375,
-0.0246734619140625,
-0.020782470703125,
-0.037506103515625,
-0.0089263916015625,
-0.02398681640625,
-0.0305023193359375,
-0.04461669921875,
-0.00846099853515625,
0.0545654296875,
0.039031982421875,
0.005146026611328125,
0.0253753662109375,
-0.040435791015625,
0.023162841796875,
0.027496337890625,
0.03173828125,
-0.0006833076477050781,
-0.0430908203125,
-0.016357421875,
0.0170440673828125,
-0.01556396484375,
-0.054290771484375,
0.043243408203125,
0.0186767578125,
0.012420654296875,
0.035125732421875,
-0.0197296142578125,
0.08807373046875,
-0.025054931640625,
0.05523681640625,
0.0195465087890625,
-0.06536865234375,
0.04669189453125,
-0.0511474609375,
0.02099609375,
0.0278778076171875,
0.004512786865234375,
-0.031005859375,
0.0010328292846679688,
-0.037933349609375,
-0.06298828125,
0.05169677734375,
0.018524169921875,
0.0018720626831054688,
0.008331298828125,
0.039459228515625,
-0.007144927978515625,
0.004596710205078125,
-0.06378173828125,
-0.0247955322265625,
-0.0031070709228515625,
-0.0030841827392578125,
-0.02313232421875,
-0.0242156982421875,
-0.019256591796875,
-0.04925537109375,
0.04278564453125,
-0.0298309326171875,
0.01065826416015625,
0.01229095458984375,
-0.010528564453125,
-0.013702392578125,
0.0096282958984375,
0.06732177734375,
0.051971435546875,
-0.0145263671875,
-0.0207061767578125,
0.03851318359375,
-0.01525115966796875,
-0.005641937255859375,
0.002269744873046875,
-0.001659393310546875,
-0.0156402587890625,
0.03375244140625,
0.07440185546875,
0.03936767578125,
-0.04693603515625,
0.0300445556640625,
0.00733184814453125,
-0.0004088878631591797,
-0.022430419921875,
0.012359619140625,
0.0164947509765625,
0.026092529296875,
0.0110321044921875,
-0.0211029052734375,
-0.0026378631591796875,
-0.049163818359375,
0.004642486572265625,
0.037384033203125,
-0.0189971923828125,
-0.03729248046875,
0.039093017578125,
0.006969451904296875,
0.0029659271240234375,
0.011566162109375,
-0.0056304931640625,
-0.038726806640625,
0.0548095703125,
0.040069580078125,
0.034881591796875,
-0.02264404296875,
-0.007366180419921875,
0.045654296875,
-0.00698089599609375,
-0.0089263916015625,
0.0211639404296875,
0.0007991790771484375,
-0.029296875,
-0.0195159912109375,
-0.06561279296875,
0.00954437255859375,
0.03155517578125,
-0.036224365234375,
0.0264129638671875,
-0.0294952392578125,
-0.03070068359375,
-0.004192352294921875,
0.038909912109375,
-0.052337646484375,
0.013153076171875,
0.005657196044921875,
0.0765380859375,
-0.03521728515625,
0.08709716796875,
0.036163330078125,
-0.0229644775390625,
-0.06414794921875,
-0.0160064697265625,
-0.0037078857421875,
-0.06610107421875,
0.045745849609375,
0.016754150390625,
0.0000438690185546875,
-0.01039886474609375,
-0.051422119140625,
-0.09075927734375,
0.10589599609375,
0.0254364013671875,
-0.0419921875,
-0.01056671142578125,
0.00081634521484375,
0.05694580078125,
-0.024383544921875,
0.01222991943359375,
0.053802490234375,
0.045654296875,
0.0035877227783203125,
-0.0977783203125,
0.0262451171875,
-0.03662109375,
0.0027751922607421875,
0.00823974609375,
-0.10113525390625,
0.07952880859375,
-0.0151214599609375,
-0.00917816162109375,
0.0288543701171875,
0.061767578125,
0.03948974609375,
0.006404876708984375,
0.0391845703125,
0.056610107421875,
0.036163330078125,
0.0002524852752685547,
0.07232666015625,
-0.0214691162109375,
0.0308837890625,
0.0601806640625,
0.0027008056640625,
0.06561279296875,
0.033660888671875,
-0.017181396484375,
0.03289794921875,
0.060638427734375,
0.00795745849609375,
0.0188751220703125,
0.01561737060546875,
-0.0023708343505859375,
-0.01371002197265625,
-0.0064697265625,
-0.05206298828125,
0.024993896484375,
0.0279998779296875,
-0.0173797607421875,
-0.0018568038940429688,
-0.01473236083984375,
0.0295867919921875,
-0.0166015625,
-0.0225677490234375,
0.0489501953125,
0.02142333984375,
-0.054595947265625,
0.07757568359375,
-0.0012416839599609375,
0.08050537109375,
-0.0350341796875,
0.0091552734375,
-0.0268402099609375,
0.0212554931640625,
-0.023406982421875,
-0.0474853515625,
-0.00044989585876464844,
0.007297515869140625,
0.00958251953125,
-0.0034999847412109375,
0.0445556640625,
-0.0254669189453125,
-0.041015625,
0.041046142578125,
0.0196685791015625,
0.0081329345703125,
-0.0033397674560546875,
-0.06195068359375,
0.0191650390625,
0.00481414794921875,
-0.056396484375,
0.038238525390625,
0.02655029296875,
-0.0204010009765625,
0.052459716796875,
0.051116943359375,
0.007343292236328125,
0.0110931396484375,
0.0003497600555419922,
0.0831298828125,
-0.058258056640625,
-0.03155517578125,
-0.060272216796875,
0.033172607421875,
-0.01238250732421875,
-0.034271240234375,
0.062255859375,
0.0291748046875,
0.040679931640625,
0.00860595703125,
0.023712158203125,
0.0026073455810546875,
0.04296875,
-0.039886474609375,
0.059326171875,
-0.0675048828125,
0.0054473876953125,
-0.032379150390625,
-0.06689453125,
-0.0220184326171875,
0.037200927734375,
-0.01409912109375,
0.01285552978515625,
0.0306396484375,
0.047576904296875,
-0.01212310791015625,
-0.026123046875,
-0.0008401870727539062,
0.020904541015625,
0.0338134765625,
0.0775146484375,
0.03704833984375,
-0.045562744140625,
0.016387939453125,
-0.03253173828125,
-0.00408935546875,
-0.048736572265625,
-0.06475830078125,
-0.08160400390625,
-0.0478515625,
-0.0162200927734375,
-0.0189056396484375,
-0.00991058349609375,
0.0699462890625,
0.062408447265625,
-0.05517578125,
-0.0189971923828125,
0.013275146484375,
0.010711669921875,
-0.00959014892578125,
-0.0161895751953125,
0.054046630859375,
-0.004180908203125,
-0.068359375,
0.0258941650390625,
-0.0021762847900390625,
0.0251922607421875,
0.0006213188171386719,
-0.0289154052734375,
-0.01305389404296875,
-0.0005478858947753906,
0.060028076171875,
0.04541015625,
-0.0633544921875,
-0.02447509765625,
-0.0062408447265625,
-0.01496124267578125,
0.0079193115234375,
0.016693115234375,
-0.0421142578125,
-0.015350341796875,
0.03338623046875,
0.01381683349609375,
0.0443115234375,
0.01039886474609375,
0.008331298828125,
-0.04241943359375,
0.046295166015625,
-0.00033473968505859375,
0.031219482421875,
0.019989013671875,
-0.0232696533203125,
0.06048583984375,
-0.00569915771484375,
-0.032379150390625,
-0.0787353515625,
0.010101318359375,
-0.1038818359375,
-0.01432037353515625,
0.0927734375,
-0.0116424560546875,
-0.046112060546875,
0.0345458984375,
-0.03155517578125,
0.01873779296875,
-0.0321044921875,
0.05218505859375,
0.035491943359375,
-0.00984954833984375,
-0.005786895751953125,
-0.032684326171875,
0.061248779296875,
0.037994384765625,
-0.07940673828125,
0.001766204833984375,
0.03271484375,
0.0262298583984375,
0.028533935546875,
0.048309326171875,
-0.00433349609375,
0.0170745849609375,
-0.04608154296875,
-0.006969451904296875,
0.0009255409240722656,
-0.01055145263671875,
-0.0205841064453125,
-0.01502227783203125,
-0.00795745849609375,
0.00550079345703125
]
] |
hiyouga/Baichuan2-7B-Base-LLaMAfied | 2023-10-12T14:02:26.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"baichuan",
"llama2",
"baichuan2",
"en",
"zh",
"license:other",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | hiyouga | null | null | hiyouga/Baichuan2-7B-Base-LLaMAfied | 5 | 6,851 | transformers | 2023-09-08T14:58:40 | ---
license: other
language:
- en
- zh
library_name: transformers
pipeline_tag: text-generation
inference: false
tags:
- baichuan
- llama2
- baichuan2
---
This is the LLaMAfied version of the [Baichuan2-7B-Base](https://huggingface.co/baichuan-inc/Baichuan2-7B-Base) model by Baichuan Inc.
This model was converted with https://github.com/hiyouga/LLaMA-Factory/blob/main/tests/llamafy_baichuan2.py
You may use this model for fine-tuning on downstream tasks; we recommend using our efficient fine-tuning toolkit: https://github.com/hiyouga/LLaMA-Factory
- **Developed by:** Baichuan Inc.
- **Language(s) (NLP):** Chinese/English
- **License:** [Baichuan2 License](https://huggingface.co/baichuan-inc/Baichuan2-7B-Base/resolve/main/Baichuan%202%E6%A8%A1%E5%9E%8B%E7%A4%BE%E5%8C%BA%E8%AE%B8%E5%8F%AF%E5%8D%8F%E8%AE%AE.pdf)
Usage:
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("hiyouga/Baichuan2-7B-Base-LLaMAfied", use_fast=False)
model = AutoModelForCausalLM.from_pretrained("hiyouga/Baichuan2-7B-Base-LLaMAfied").cuda()
```
| 1,092 | [
[
-0.006580352783203125,
-0.0482177734375,
0.006969451904296875,
0.046478271484375,
-0.03619384765625,
-0.0035247802734375,
-0.009124755859375,
-0.02252197265625,
0.0055389404296875,
0.044219970703125,
-0.049591064453125,
-0.02471923828125,
-0.0494384765625,
0.00751495361328125,
-0.01947021484375,
0.07586669921875,
-0.00780487060546875,
-0.005146026611328125,
0.0189361572265625,
0.007701873779296875,
-0.04217529296875,
-0.0212860107421875,
-0.057952880859375,
-0.042327880859375,
0.0165863037109375,
0.0191802978515625,
0.056884765625,
0.07379150390625,
0.03631591796875,
0.019439697265625,
-0.021209716796875,
0.00001627206802368164,
-0.047393798828125,
-0.0191497802734375,
0.007587432861328125,
-0.03045654296875,
-0.06683349609375,
0.006458282470703125,
0.01213836669921875,
0.0244140625,
-0.015167236328125,
0.03997802734375,
0.005584716796875,
0.015899658203125,
-0.0360107421875,
0.0278778076171875,
-0.043243408203125,
-0.000659942626953125,
-0.0143585205078125,
0.0012063980102539062,
-0.0224609375,
-0.0133819580078125,
-0.030548095703125,
-0.041748046875,
0.01849365234375,
0.0009627342224121094,
0.106201171875,
0.0494384765625,
-0.03826904296875,
-0.0004897117614746094,
-0.0298309326171875,
0.054656982421875,
-0.076171875,
0.0022792816162109375,
0.0270233154296875,
0.0207061767578125,
-0.0311279296875,
-0.0755615234375,
-0.03546142578125,
-0.01678466796875,
-0.0172576904296875,
0.01209259033203125,
-0.0124664306640625,
0.00508880615234375,
0.0266876220703125,
0.052032470703125,
-0.03369140625,
0.0086822509765625,
-0.04644775390625,
-0.02142333984375,
0.060546875,
0.0126800537109375,
0.01451873779296875,
-0.0215911865234375,
-0.0472412109375,
0.00722503662109375,
-0.0518798828125,
0.01183319091796875,
0.016998291015625,
0.01000213623046875,
-0.0286407470703125,
0.04754638671875,
-0.0323486328125,
0.04547119140625,
0.04486083984375,
-0.0139312744140625,
0.0435791015625,
-0.0184173583984375,
-0.034393310546875,
-0.00154876708984375,
0.055999755859375,
0.04022216796875,
0.0083160400390625,
0.0024166107177734375,
-0.033355712890625,
-0.006084442138671875,
0.0153656005859375,
-0.07318115234375,
-0.036468505859375,
0.018707275390625,
-0.05792236328125,
-0.0302734375,
-0.0032863616943359375,
-0.0142974853515625,
-0.00670623779296875,
-0.00958251953125,
0.01494598388671875,
-0.014373779296875,
-0.0355224609375,
0.0003554821014404297,
0.0239715576171875,
0.030548095703125,
0.0165252685546875,
-0.07440185546875,
-0.0007038116455078125,
0.050140380859375,
0.07049560546875,
0.0076446533203125,
-0.034027099609375,
-0.0239105224609375,
0.0189208984375,
-0.031463623046875,
0.056304931640625,
-0.0004703998565673828,
-0.039764404296875,
-0.007022857666015625,
0.036041259765625,
0.01404571533203125,
-0.055999755859375,
0.0433349609375,
-0.02508544921875,
-0.0178680419921875,
-0.000049233436584472656,
-0.00970458984375,
-0.0286102294921875,
0.025390625,
-0.057281494140625,
0.09710693359375,
0.02166748046875,
-0.03973388671875,
0.0220489501953125,
-0.033966064453125,
-0.041473388671875,
-0.004596710205078125,
0.015289306640625,
-0.0369873046875,
-0.01477813720703125,
-0.0016040802001953125,
0.027984619140625,
-0.01345062255859375,
0.02001953125,
-0.02117919921875,
-0.03070068359375,
-0.00811767578125,
-0.0186309814453125,
0.0809326171875,
0.0340576171875,
-0.039031982421875,
0.02459716796875,
-0.06939697265625,
-0.02471923828125,
0.00542449951171875,
-0.039886474609375,
0.00624847412109375,
-0.01067352294921875,
0.0037384033203125,
0.0219879150390625,
0.06536865234375,
-0.02886962890625,
0.0275115966796875,
-0.031158447265625,
0.043701171875,
0.056182861328125,
-0.005687713623046875,
0.017669677734375,
-0.0465087890625,
0.021942138671875,
0.016357421875,
0.043121337890625,
-0.00006210803985595703,
-0.05706787109375,
-0.09747314453125,
-0.0165863037109375,
0.004180908203125,
0.02911376953125,
-0.025299072265625,
0.042816162109375,
0.0022449493408203125,
-0.059326171875,
-0.034515380859375,
0.01030731201171875,
0.00511932373046875,
0.03460693359375,
0.0272064208984375,
-0.0233306884765625,
-0.0445556640625,
-0.07135009765625,
0.01448822021484375,
-0.0205535888671875,
0.0014429092407226562,
-0.0045013427734375,
0.041290283203125,
-0.04779052734375,
0.046051025390625,
-0.0231781005859375,
-0.0222015380859375,
-0.016632080078125,
0.003353118896484375,
0.06610107421875,
0.055419921875,
0.048492431640625,
-0.0248565673828125,
-0.03662109375,
0.005462646484375,
-0.038726806640625,
-0.0184326171875,
-0.0212860107421875,
-0.05206298828125,
0.006092071533203125,
0.01224517822265625,
-0.061614990234375,
0.04168701171875,
0.039825439453125,
-0.018341064453125,
0.052093505859375,
-0.0238037109375,
0.0109710693359375,
-0.092041015625,
0.010894775390625,
-0.01983642578125,
-0.00605010986328125,
-0.0206451416015625,
0.0225372314453125,
0.0160064697265625,
0.01525115966796875,
-0.03912353515625,
0.0428466796875,
-0.04351806640625,
0.0193328857421875,
-0.029937744140625,
-0.0281982421875,
-0.0038471221923828125,
0.03875732421875,
-0.0106048583984375,
0.05963134765625,
0.031890869140625,
-0.048828125,
0.06719970703125,
0.041229248046875,
-0.0197296142578125,
-0.00771331787109375,
-0.062469482421875,
0.006195068359375,
0.0216064453125,
0.01776123046875,
-0.059051513671875,
-0.021026611328125,
0.040863037109375,
-0.0198516845703125,
0.00733184814453125,
-0.0218963623046875,
-0.037139892578125,
-0.0298309326171875,
-0.01432037353515625,
0.0350341796875,
0.0489501953125,
-0.052825927734375,
0.0450439453125,
0.0115203857421875,
-0.0013341903686523438,
-0.046173095703125,
-0.07220458984375,
-0.0009832382202148438,
-0.0153961181640625,
-0.054595947265625,
0.035400390625,
-0.006656646728515625,
0.01184844970703125,
-0.0211181640625,
-0.00769805908203125,
-0.0239715576171875,
0.00835418701171875,
0.0335693359375,
0.038818359375,
-0.0228118896484375,
-0.0230255126953125,
0.01049041748046875,
-0.01300048828125,
0.01117706298828125,
0.01271820068359375,
0.0460205078125,
-0.00225067138671875,
-0.0018224716186523438,
-0.032806396484375,
0.0014438629150390625,
0.027587890625,
-0.0147705078125,
0.04052734375,
0.054046630859375,
-0.041259765625,
-0.01195526123046875,
-0.045623779296875,
-0.006458282470703125,
-0.040802001953125,
0.0175323486328125,
-0.03131103515625,
-0.0269622802734375,
0.057159423828125,
-0.007572174072265625,
0.01105499267578125,
0.057342529296875,
0.0621337890625,
0.00705718994140625,
0.061614990234375,
0.044097900390625,
-0.0104522705078125,
0.02801513671875,
-0.04840087890625,
0.00481414794921875,
-0.066162109375,
-0.03887939453125,
-0.0309295654296875,
-0.0196990966796875,
-0.038818359375,
-0.0208740234375,
-0.004730224609375,
0.0196685791015625,
-0.061004638671875,
0.05657958984375,
-0.043914794921875,
0.01184844970703125,
0.045623779296875,
0.01146697998046875,
0.0323486328125,
-0.01422882080078125,
0.008148193359375,
0.01142120361328125,
-0.034149169921875,
-0.048553466796875,
0.059051513671875,
0.021820068359375,
0.068115234375,
0.01070404052734375,
0.03741455078125,
0.008148193359375,
0.0325927734375,
-0.0599365234375,
0.0251312255859375,
-0.01540374755859375,
-0.064208984375,
0.0147247314453125,
-0.01690673828125,
-0.07586669921875,
0.013824462890625,
-0.0117645263671875,
-0.04364013671875,
-0.00033783912658691406,
0.014251708984375,
-0.032745361328125,
0.01959228515625,
-0.05322265625,
0.07574462890625,
-0.033660888671875,
-0.018646240234375,
-0.01549530029296875,
-0.0277862548828125,
0.040863037109375,
0.01039886474609375,
0.0010404586791992188,
-0.015899658203125,
0.00086212158203125,
0.03924560546875,
-0.056121826171875,
0.062164306640625,
0.005512237548828125,
-0.026824951171875,
0.0245513916015625,
0.0080718994140625,
0.046966552734375,
0.0175628662109375,
-0.0187530517578125,
0.02764892578125,
-0.0128326416015625,
-0.0389404296875,
-0.0219879150390625,
0.05670166015625,
-0.0704345703125,
-0.0570068359375,
-0.03790283203125,
-0.024993896484375,
0.01617431640625,
0.033477783203125,
0.03680419921875,
0.0032291412353515625,
0.016357421875,
0.0008950233459472656,
0.01465606689453125,
0.0016574859619140625,
0.040313720703125,
0.02099609375,
-0.0024051666259765625,
-0.01418304443359375,
0.039031982421875,
-0.009521484375,
0.00594329833984375,
0.00559234619140625,
0.004917144775390625,
-0.0192718505859375,
-0.03350830078125,
-0.036773681640625,
0.024200439453125,
-0.04620361328125,
-0.02850341796875,
-0.007740020751953125,
-0.0413818359375,
-0.04058837890625,
-0.006061553955078125,
-0.035003662109375,
-0.0128326416015625,
-0.055419921875,
-0.00878143310546875,
0.0277862548828125,
0.0540771484375,
-0.01430511474609375,
0.046356201171875,
-0.05230712890625,
0.030426025390625,
0.020294189453125,
0.0186614990234375,
0.00669097900390625,
-0.05743408203125,
-0.03619384765625,
0.0216064453125,
-0.047149658203125,
-0.0567626953125,
0.04400634765625,
0.004932403564453125,
0.048858642578125,
0.031585693359375,
0.008087158203125,
0.0648193359375,
-0.03216552734375,
0.0723876953125,
0.01454925537109375,
-0.05877685546875,
0.042083740234375,
-0.02001953125,
0.007965087890625,
0.014251708984375,
0.012054443359375,
-0.01995849609375,
-0.01020050048828125,
-0.0411376953125,
-0.0821533203125,
0.045562744140625,
0.01242828369140625,
0.01291656494140625,
0.01404571533203125,
0.0286712646484375,
0.02410888671875,
0.0093536376953125,
-0.06683349609375,
-0.025360107421875,
-0.043792724609375,
-0.011627197265625,
0.004207611083984375,
-0.01485443115234375,
-0.0024394989013671875,
-0.0191802978515625,
0.07049560546875,
0.0081787109375,
0.01007080078125,
0.00023186206817626953,
-0.0265960693359375,
-0.00396728515625,
-0.0142974853515625,
0.03021240234375,
0.030670166015625,
-0.03240966796875,
-0.006134033203125,
0.0192413330078125,
-0.0523681640625,
0.007747650146484375,
0.005245208740234375,
-0.015472412109375,
0.00002104043960571289,
0.0277099609375,
0.07342529296875,
0.002017974853515625,
-0.0164642333984375,
0.0269775390625,
0.0084686279296875,
0.0059051513671875,
-0.0406494140625,
0.0286712646484375,
0.0029125213623046875,
0.036376953125,
0.030853271484375,
0.0139617919921875,
0.0208587646484375,
-0.0239715576171875,
-0.0100860595703125,
-0.00015819072723388672,
-0.0006389617919921875,
-0.01358795166015625,
0.077880859375,
0.0261688232421875,
-0.0221710205078125,
0.04998779296875,
-0.00649261474609375,
-0.02508544921875,
0.059356689453125,
0.046905517578125,
0.062164306640625,
0.00612640380859375,
-0.006458282470703125,
0.0562744140625,
0.03375244140625,
-0.008209228515625,
0.01351165771484375,
-0.01099395751953125,
-0.0278472900390625,
-0.0274200439453125,
-0.049102783203125,
-0.0180206298828125,
0.029754638671875,
-0.03778076171875,
0.039154052734375,
-0.03668212890625,
-0.03973388671875,
-0.0200042724609375,
0.01107025146484375,
-0.0290374755859375,
0.029693603515625,
0.0118865966796875,
0.0738525390625,
-0.038360595703125,
0.07012939453125,
0.054534912109375,
-0.0396728515625,
-0.0748291015625,
-0.002960205078125,
-0.025970458984375,
-0.07159423828125,
0.055999755859375,
0.01611328125,
-0.0096588134765625,
-0.001934051513671875,
-0.04473876953125,
-0.072509765625,
0.09503173828125,
0.0374755859375,
-0.033233642578125,
-0.00930023193359375,
0.00974273681640625,
0.0191802978515625,
-0.0128021240234375,
0.03515625,
0.03326416015625,
0.038360595703125,
0.00875091552734375,
-0.0791015625,
0.0226287841796875,
-0.044769287109375,
0.0230712890625,
-0.026123046875,
-0.08892822265625,
0.09619140625,
-0.02447509765625,
-0.01323699951171875,
0.048126220703125,
0.0634765625,
0.03741455078125,
-0.0011501312255859375,
0.029083251953125,
0.032867431640625,
0.0292510986328125,
-0.01305389404296875,
0.05291748046875,
-0.0335693359375,
0.059661865234375,
0.0625,
0.004261016845703125,
0.056304931640625,
0.0304107666015625,
-0.0149078369140625,
0.055419921875,
0.0921630859375,
-0.029266357421875,
0.03924560546875,
0.0081939697265625,
-0.00360107421875,
-0.0252227783203125,
0.00931549072265625,
-0.051666259765625,
0.01708984375,
0.01401519775390625,
-0.0185394287109375,
-0.01221466064453125,
-0.037322998046875,
-0.00004076957702636719,
-0.0243377685546875,
-0.0277862548828125,
0.0227508544921875,
0.0062713623046875,
-0.035400390625,
0.08270263671875,
0.0161285400390625,
0.0841064453125,
-0.0299072265625,
0.000743865966796875,
-0.033660888671875,
0.0155487060546875,
-0.034393310546875,
-0.049652099609375,
0.01125335693359375,
-0.0007686614990234375,
-0.01206207275390625,
0.002532958984375,
0.048980712890625,
-0.01544952392578125,
-0.03997802734375,
0.0245513916015625,
0.0005540847778320312,
0.0186004638671875,
0.01363372802734375,
-0.050140380859375,
0.004291534423828125,
0.0003314018249511719,
-0.04071044921875,
-0.00919342041015625,
0.01374053955078125,
0.0004661083221435547,
0.055816650390625,
0.05157470703125,
0.01198577880859375,
0.0257110595703125,
0.0027523040771484375,
0.048431396484375,
-0.041107177734375,
-0.044036865234375,
-0.0435791015625,
0.058380126953125,
-0.00513458251953125,
-0.051483154296875,
0.038330078125,
0.05615234375,
0.079345703125,
-0.0379638671875,
0.04840087890625,
-0.00830841064453125,
0.00418853759765625,
-0.042755126953125,
0.07733154296875,
-0.0523681640625,
0.003635406494140625,
0.0012407302856445312,
-0.0595703125,
0.0047149658203125,
0.079833984375,
0.00807952880859375,
0.00901031494140625,
0.035247802734375,
0.07049560546875,
-0.0192108154296875,
-0.019622802734375,
0.0193023681640625,
0.042327880859375,
0.01285552978515625,
0.0487060546875,
0.063232421875,
-0.06170654296875,
0.049835205078125,
-0.056427001953125,
-0.0196990966796875,
-0.0262451171875,
-0.06243896484375,
-0.067138671875,
-0.038238525390625,
-0.026336669921875,
-0.031585693359375,
-0.0177001953125,
0.056060791015625,
0.0667724609375,
-0.060394287109375,
-0.0433349609375,
0.00669097900390625,
-0.004856109619140625,
-0.007732391357421875,
-0.0148468017578125,
0.0120697021484375,
-0.01477813720703125,
-0.065673828125,
0.01180267333984375,
-0.004634857177734375,
0.022552490234375,
-0.005329132080078125,
-0.01593017578125,
-0.00608062744140625,
-0.01100921630859375,
0.03131103515625,
0.00644683837890625,
-0.055755615234375,
-0.011138916015625,
0.0015935897827148438,
-0.0257110595703125,
0.0140838623046875,
0.0230712890625,
-0.035003662109375,
-0.00994110107421875,
0.0277099609375,
0.0167999267578125,
0.05596923828125,
-0.009429931640625,
0.0298309326171875,
-0.02978515625,
0.04071044921875,
-0.0101776123046875,
0.05377197265625,
0.0223541259765625,
-0.03912353515625,
0.0438232421875,
0.015960693359375,
-0.04150390625,
-0.05029296875,
0.0115966796875,
-0.091796875,
-0.00983428955078125,
0.10064697265625,
-0.01279449462890625,
-0.040008544921875,
0.015716552734375,
-0.041595458984375,
0.06610107421875,
-0.03057861328125,
0.0726318359375,
0.043731689453125,
0.003650665283203125,
0.00029850006103515625,
-0.026519775390625,
0.01422882080078125,
0.015472412109375,
-0.047943115234375,
-0.01551055908203125,
0.020721435546875,
0.03472900390625,
0.00948333740234375,
0.018524169921875,
-0.0013246536254882812,
0.028961181640625,
0.0011644363403320312,
0.02447509765625,
-0.017822265625,
0.00313568115234375,
0.0011129379272460938,
-0.0301971435546875,
0.00959014892578125,
-0.0301055908203125
]
] |
timm/vit_base_patch16_384.augreg_in21k_ft_in1k | 2023-05-06T00:01:15.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"dataset:imagenet-21k",
"arxiv:2106.10270",
"arxiv:2010.11929",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/vit_base_patch16_384.augreg_in21k_ft_in1k | 0 | 6,845 | timm | 2022-12-22T07:29:44 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
datasets:
- imagenet-1k
- imagenet-21k
---
# Model card for vit_base_patch16_384.augreg_in21k_ft_in1k
A Vision Transformer (ViT) image classification model. Trained on ImageNet-21k and fine-tuned on ImageNet-1k (with additional augmentation and regularization) in JAX by paper authors, ported to PyTorch by Ross Wightman.
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 86.9
- GMACs: 49.4
- Activations (M): 48.3
- Image size: 384 x 384
- **Papers:**
- How to train your ViT? Data, Augmentation, and Regularization in Vision Transformers: https://arxiv.org/abs/2106.10270
- An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale: https://arxiv.org/abs/2010.11929v2
- **Dataset:** ImageNet-1k
- **Pretrain Dataset:** ImageNet-21k
- **Original:** https://github.com/google-research/vision_transformer
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('vit_base_patch16_384.augreg_in21k_ft_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
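For readers unfamiliar with the post-processing step, `output.softmax(dim=1)` followed by `torch.topk` is equivalent to the following plain-Python sketch (illustrative only — the real model produces a 1×1000 logit tensor, the toy logits here are made up):

```python
import math

def softmax(logits):
    # Numerically stable softmax over a list of logits.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def topk(probs, k):
    # Return the k largest probabilities with their indices, descending.
    indexed = sorted(enumerate(probs), key=lambda p: p[1], reverse=True)
    return indexed[:k]

# Toy logits standing in for the model's 1x1000 class scores.
logits = [2.0, 1.0, 0.1, 3.5]
probs = softmax(logits)
print(topk(probs, 2))  # highest-probability (index, probability) pairs
```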
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'vit_base_patch16_384.augreg_in21k_ft_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 577, 768) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
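The 577-token sequence length of the unpooled output follows directly from the patch grid: a 384×384 input split into 16×16 patches gives (384/16)² = 576 patch tokens, plus one class token. A quick sanity check:

```python
image_size = 384   # input resolution from the model stats above
patch_size = 16    # patch size from the model name (patch16)

num_patches = (image_size // patch_size) ** 2
seq_len = num_patches + 1  # +1 for the class token
print(num_patches, seq_len)  # 576 577
```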
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@article{steiner2021augreg,
title={How to train your ViT? Data, Augmentation, and Regularization in Vision Transformers},
  author={Steiner, Andreas and Kolesnikov, Alexander and Zhai, Xiaohua and Wightman, Ross and Uszkoreit, Jakob and Beyer, Lucas},
journal={arXiv preprint arXiv:2106.10270},
year={2021}
}
```
```bibtex
@article{dosovitskiy2020vit,
title={An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale},
author={Dosovitskiy, Alexey and Beyer, Lucas and Kolesnikov, Alexander and Weissenborn, Dirk and Zhai, Xiaohua and Unterthiner, Thomas and Dehghani, Mostafa and Minderer, Matthias and Heigold, Georg and Gelly, Sylvain and Uszkoreit, Jakob and Houlsby, Neil},
journal={ICLR},
year={2021}
}
```
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
| 3,906 | [
[
-0.0389404296875,
-0.0290069580078125,
-0.0038051605224609375,
0.007099151611328125,
-0.0294952392578125,
-0.02532958984375,
-0.0209197998046875,
-0.03448486328125,
0.01342010498046875,
0.02410888671875,
-0.0411376953125,
-0.037811279296875,
-0.04754638671875,
0.00006258487701416016,
-0.01081085205078125,
0.07391357421875,
-0.0099945068359375,
0.0027675628662109375,
-0.016265869140625,
-0.034149169921875,
-0.0241851806640625,
-0.02117919921875,
-0.044708251953125,
-0.032135009765625,
0.02740478515625,
0.0128326416015625,
0.0440673828125,
0.04608154296875,
0.05889892578125,
0.03326416015625,
-0.00832366943359375,
0.010406494140625,
-0.0261383056640625,
-0.0159759521484375,
0.0214080810546875,
-0.04705810546875,
-0.0304412841796875,
0.0184173583984375,
0.053497314453125,
0.0294342041015625,
0.0087738037109375,
0.02587890625,
0.01043701171875,
0.038238525390625,
-0.0265655517578125,
0.0143585205078125,
-0.039337158203125,
0.0204925537109375,
-0.00397491455078125,
-0.0035915374755859375,
-0.0235595703125,
-0.02459716796875,
0.0200958251953125,
-0.0406494140625,
0.0465087890625,
-0.00528717041015625,
0.1041259765625,
0.021209716796875,
0.004154205322265625,
0.0182952880859375,
-0.030731201171875,
0.05621337890625,
-0.046295166015625,
0.03265380859375,
0.0137481689453125,
0.01242828369140625,
0.00476837158203125,
-0.0762939453125,
-0.0489501953125,
-0.01316070556640625,
-0.017791748046875,
0.00887298583984375,
-0.02276611328125,
0.0190277099609375,
0.035980224609375,
0.043853759765625,
-0.040191650390625,
-0.0030536651611328125,
-0.04168701171875,
-0.020599365234375,
0.041656494140625,
-0.0030460357666015625,
0.01458740234375,
-0.01154327392578125,
-0.047027587890625,
-0.04534912109375,
-0.0254669189453125,
0.0204620361328125,
0.02130126953125,
0.0038127899169921875,
-0.03619384765625,
0.03961181640625,
0.0031414031982421875,
0.04949951171875,
0.01861572265625,
-0.0164031982421875,
0.05047607421875,
-0.01244354248046875,
-0.0304107666015625,
-0.0193328857421875,
0.08154296875,
0.035186767578125,
0.030364990234375,
-0.0033702850341796875,
-0.0134124755859375,
-0.0080108642578125,
0.00574493408203125,
-0.08123779296875,
-0.028167724609375,
0.00574493408203125,
-0.033447265625,
-0.0290679931640625,
0.0269622802734375,
-0.04742431640625,
-0.0082244873046875,
-0.007720947265625,
0.05902099609375,
-0.034454345703125,
-0.01690673828125,
0.00908660888671875,
-0.01242828369140625,
0.03631591796875,
0.0185089111328125,
-0.04449462890625,
0.0087890625,
0.01556396484375,
0.07672119140625,
0.002651214599609375,
-0.03680419921875,
-0.0170745849609375,
-0.03387451171875,
-0.024383544921875,
0.037322998046875,
-0.0012025833129882812,
-0.0115814208984375,
-0.013336181640625,
0.0277557373046875,
-0.0190277099609375,
-0.042694091796875,
0.025238037109375,
-0.01512908935546875,
0.02667236328125,
0.007472991943359375,
-0.0146026611328125,
-0.03179931640625,
0.021484375,
-0.03131103515625,
0.09185791015625,
0.03009033203125,
-0.0675048828125,
0.032379150390625,
-0.0340576171875,
-0.00624847412109375,
-0.0089111328125,
0.0020275115966796875,
-0.08209228515625,
0.003711700439453125,
0.023651123046875,
0.04351806640625,
-0.01520538330078125,
-0.0018939971923828125,
-0.0283050537109375,
-0.0256195068359375,
0.02569580078125,
-0.0203399658203125,
0.0689697265625,
0.0021209716796875,
-0.0253143310546875,
0.020538330078125,
-0.0439453125,
0.00662994384765625,
0.031890869140625,
-0.019378662109375,
0.0011491775512695312,
-0.047149658203125,
0.011444091796875,
0.01508331298828125,
0.016937255859375,
-0.0498046875,
0.0294342041015625,
-0.02838134765625,
0.0291748046875,
0.04864501953125,
-0.007732391357421875,
0.0290069580078125,
-0.02490234375,
0.02606201171875,
0.019805908203125,
0.0308837890625,
-0.0116119384765625,
-0.04730224609375,
-0.0784912109375,
-0.0328369140625,
0.0260467529296875,
0.0340576171875,
-0.049468994140625,
0.04266357421875,
-0.0290679931640625,
-0.055328369140625,
-0.04498291015625,
0.0023193359375,
0.035491943359375,
0.041900634765625,
0.03997802734375,
-0.04193115234375,
-0.041046142578125,
-0.0721435546875,
-0.0088958740234375,
-0.005008697509765625,
0.0010585784912109375,
0.01544189453125,
0.04644775390625,
-0.019775390625,
0.066162109375,
-0.03228759765625,
-0.024749755859375,
-0.0162506103515625,
0.004489898681640625,
0.026336669921875,
0.056365966796875,
0.053253173828125,
-0.049102783203125,
-0.034759521484375,
-0.0098724365234375,
-0.06488037109375,
0.00998687744140625,
-0.00246429443359375,
-0.0142974853515625,
0.01055908203125,
0.0131378173828125,
-0.05230712890625,
0.0595703125,
0.01282501220703125,
-0.02825927734375,
0.031890869140625,
-0.016326904296875,
0.0060577392578125,
-0.087890625,
-0.0006036758422851562,
0.0283966064453125,
-0.0199737548828125,
-0.03759765625,
0.0008597373962402344,
0.007564544677734375,
-0.0020503997802734375,
-0.030548095703125,
0.0418701171875,
-0.03680419921875,
-0.002704620361328125,
-0.004856109619140625,
-0.0264129638671875,
0.004779815673828125,
0.055145263671875,
-0.0036106109619140625,
0.040374755859375,
0.055999755859375,
-0.03619384765625,
0.04327392578125,
0.039794921875,
-0.01611328125,
0.034820556640625,
-0.05438232421875,
0.01218414306640625,
-0.0033817291259765625,
0.01538848876953125,
-0.07598876953125,
-0.01377105712890625,
0.029022216796875,
-0.055908203125,
0.047821044921875,
-0.0400390625,
-0.032562255859375,
-0.045074462890625,
-0.0303192138671875,
0.03094482421875,
0.0570068359375,
-0.058349609375,
0.045196533203125,
0.005992889404296875,
0.024078369140625,
-0.045562744140625,
-0.07183837890625,
-0.0166473388671875,
-0.02777099609375,
-0.054046630859375,
0.03564453125,
0.006595611572265625,
0.01152801513671875,
0.006298065185546875,
-0.00786590576171875,
0.0004105567932128906,
-0.01532745361328125,
0.033905029296875,
0.031524658203125,
-0.018341064453125,
-0.005107879638671875,
-0.0259246826171875,
-0.01611328125,
-0.000457763671875,
-0.0263824462890625,
0.03875732421875,
-0.023712158203125,
-0.01490020751953125,
-0.056182861328125,
-0.0200347900390625,
0.03729248046875,
-0.0227508544921875,
0.054290771484375,
0.08575439453125,
-0.03485107421875,
0.00446319580078125,
-0.043548583984375,
-0.0288238525390625,
-0.03692626953125,
0.035308837890625,
-0.0245208740234375,
-0.0328369140625,
0.054718017578125,
0.01256561279296875,
0.007480621337890625,
0.057464599609375,
0.0313720703125,
0.0028839111328125,
0.061553955078125,
0.0517578125,
0.01030731201171875,
0.06634521484375,
-0.07421875,
-0.00887298583984375,
-0.0679931640625,
-0.0271453857421875,
-0.0175323486328125,
-0.03924560546875,
-0.05169677734375,
-0.03656005859375,
0.03363037109375,
0.00832366943359375,
-0.020843505859375,
0.041229248046875,
-0.066650390625,
0.01390838623046875,
0.053497314453125,
0.03948974609375,
-0.00832366943359375,
0.033111572265625,
-0.01515960693359375,
-0.005466461181640625,
-0.058013916015625,
-0.0078277587890625,
0.08184814453125,
0.03558349609375,
0.059967041015625,
-0.021331787109375,
0.04876708984375,
-0.01959228515625,
0.0249481201171875,
-0.05816650390625,
0.0411376953125,
-0.0018892288208007812,
-0.0300140380859375,
-0.00882720947265625,
-0.029449462890625,
-0.07696533203125,
0.0165863037109375,
-0.0261383056640625,
-0.0592041015625,
0.026885986328125,
0.01512908935546875,
-0.0166168212890625,
0.049285888671875,
-0.06396484375,
0.07257080078125,
-0.005344390869140625,
-0.03662109375,
0.006862640380859375,
-0.053497314453125,
0.0126953125,
0.0194091796875,
-0.0279083251953125,
0.010589599609375,
0.02105712890625,
0.07440185546875,
-0.0457763671875,
0.061859130859375,
-0.0313720703125,
0.0262298583984375,
0.036285400390625,
-0.016448974609375,
0.0274810791015625,
0.0014362335205078125,
0.012664794921875,
0.0251617431640625,
-0.002414703369140625,
-0.027435302734375,
-0.037445068359375,
0.034942626953125,
-0.07769775390625,
-0.02838134765625,
-0.038055419921875,
-0.042877197265625,
0.00775146484375,
0.005001068115234375,
0.05303955078125,
0.046630859375,
0.0203094482421875,
0.0299224853515625,
0.05072021484375,
-0.0248870849609375,
0.028411865234375,
0.00048542022705078125,
-0.01285552978515625,
-0.0421142578125,
0.07098388671875,
0.01702880859375,
0.0119476318359375,
0.01357269287109375,
0.0171356201171875,
-0.0253448486328125,
-0.036224365234375,
-0.0272064208984375,
0.03045654296875,
-0.0535888671875,
-0.035797119140625,
-0.043243408203125,
-0.0396728515625,
-0.0256500244140625,
0.00086212158203125,
-0.03192138671875,
-0.02459716796875,
-0.0270538330078125,
0.00650787353515625,
0.0621337890625,
0.038665771484375,
-0.00867462158203125,
0.04046630859375,
-0.04296875,
0.015899658203125,
0.021331787109375,
0.039825439453125,
-0.0147247314453125,
-0.076416015625,
-0.0272979736328125,
0.0020351409912109375,
-0.038238525390625,
-0.057342529296875,
0.03448486328125,
0.01511383056640625,
0.0341796875,
0.0273895263671875,
-0.02105712890625,
0.0662841796875,
-0.005157470703125,
0.04376220703125,
0.025054931640625,
-0.0404052734375,
0.0379638671875,
-0.007198333740234375,
0.01140594482421875,
0.01500701904296875,
0.01300048828125,
-0.0216827392578125,
-0.004302978515625,
-0.07989501953125,
-0.057037353515625,
0.05889892578125,
0.0190887451171875,
0.00490570068359375,
0.035003662109375,
0.046356201171875,
-0.004077911376953125,
0.0039825439453125,
-0.06744384765625,
-0.024932861328125,
-0.0321044921875,
-0.0244140625,
-0.005878448486328125,
-0.0007219314575195312,
-0.00038933753967285156,
-0.060882568359375,
0.0482177734375,
-0.005474090576171875,
0.060516357421875,
0.03668212890625,
-0.01654052734375,
-0.01337432861328125,
-0.0292205810546875,
0.0261383056640625,
0.019378662109375,
-0.022491455078125,
0.0019073486328125,
0.0205230712890625,
-0.0567626953125,
-0.002353668212890625,
0.023681640625,
-0.007232666015625,
0.005008697509765625,
0.03582763671875,
0.08349609375,
-0.0094146728515625,
-0.0006976127624511719,
0.040618896484375,
-0.00679779052734375,
-0.032379150390625,
-0.0207672119140625,
0.0066986083984375,
-0.0191802978515625,
0.0261383056640625,
0.0257110595703125,
0.02978515625,
-0.01198577880859375,
-0.01019287109375,
0.01190948486328125,
0.039703369140625,
-0.03961181640625,
-0.028289794921875,
0.04998779296875,
-0.01490020751953125,
-0.006389617919921875,
0.061004638671875,
-0.0035858154296875,
-0.04376220703125,
0.0667724609375,
0.02392578125,
0.0772705078125,
-0.00775146484375,
-0.003528594970703125,
0.059967041015625,
0.028167724609375,
-0.0020313262939453125,
0.01233673095703125,
0.00970458984375,
-0.060882568359375,
-0.006725311279296875,
-0.0479736328125,
0.0020389556884765625,
0.02581787109375,
-0.039642333984375,
0.0296478271484375,
-0.040069580078125,
-0.0286712646484375,
0.003948211669921875,
0.017364501953125,
-0.076416015625,
0.0225982666015625,
0.00301361083984375,
0.056732177734375,
-0.060028076171875,
0.04864501953125,
0.06488037109375,
-0.05126953125,
-0.07183837890625,
-0.01213836669921875,
-0.01422882080078125,
-0.06549072265625,
0.035186767578125,
0.03265380859375,
0.0107879638671875,
0.0193328857421875,
-0.061767578125,
-0.045623779296875,
0.09735107421875,
0.0283050537109375,
-0.01045989990234375,
0.01045989990234375,
-0.0012769699096679688,
0.028594970703125,
-0.0205230712890625,
0.03436279296875,
0.01303863525390625,
0.0296478271484375,
0.015350341796875,
-0.054107666015625,
0.0050811767578125,
-0.0248870849609375,
0.01187896728515625,
0.014007568359375,
-0.060028076171875,
0.0732421875,
-0.0311737060546875,
-0.00908660888671875,
0.0124969482421875,
0.04833984375,
0.007305145263671875,
0.0037689208984375,
0.042266845703125,
0.0675048828125,
0.02825927734375,
-0.032379150390625,
0.0679931640625,
-0.01108551025390625,
0.054168701171875,
0.03692626953125,
0.036956787109375,
0.031463623046875,
0.034210205078125,
-0.0263824462890625,
0.02557373046875,
0.07464599609375,
-0.042205810546875,
0.0222625732421875,
0.00727081298828125,
0.003475189208984375,
-0.017364501953125,
0.00439453125,
-0.03643798828125,
0.0384521484375,
0.0157928466796875,
-0.041595458984375,
-0.00736236572265625,
0.01280975341796875,
-0.01198577880859375,
-0.0286712646484375,
-0.01332855224609375,
0.045501708984375,
-0.0009469985961914062,
-0.033355712890625,
0.0638427734375,
-0.0012083053588867188,
0.0621337890625,
-0.03363037109375,
-0.00301361083984375,
-0.0200042724609375,
0.0307464599609375,
-0.028900146484375,
-0.0595703125,
0.0115814208984375,
-0.0171966552734375,
-0.00543975830078125,
0.00199127197265625,
0.0535888671875,
-0.0301666259765625,
-0.0411376953125,
0.008270263671875,
0.02398681640625,
0.0224761962890625,
-0.006744384765625,
-0.07708740234375,
-0.002986907958984375,
0.001384735107421875,
-0.04486083984375,
0.014190673828125,
0.029205322265625,
0.002071380615234375,
0.05078125,
0.050567626953125,
-0.006988525390625,
0.017913818359375,
-0.009124755859375,
0.07012939453125,
-0.0321044921875,
-0.02923583984375,
-0.059600830078125,
0.048370361328125,
-0.004375457763671875,
-0.04766845703125,
0.049957275390625,
0.04534912109375,
0.06982421875,
-0.011505126953125,
0.0361328125,
-0.0126495361328125,
0.0019550323486328125,
-0.02618408203125,
0.044586181640625,
-0.05316162109375,
-0.00820159912109375,
-0.0219573974609375,
-0.0662841796875,
-0.0278472900390625,
0.072021484375,
-0.02386474609375,
0.032745361328125,
0.03985595703125,
0.07403564453125,
-0.0248870849609375,
-0.0298614501953125,
0.01378631591796875,
0.01508331298828125,
0.00835418701171875,
0.0296478271484375,
0.043243408203125,
-0.06695556640625,
0.037506103515625,
-0.04718017578125,
-0.01319122314453125,
-0.01837158203125,
-0.035064697265625,
-0.07672119140625,
-0.061553955078125,
-0.04327392578125,
-0.051025390625,
-0.0157623291015625,
0.06427001953125,
0.07232666015625,
-0.042236328125,
-0.00577545166015625,
-0.012969970703125,
0.0013675689697265625,
-0.0235595703125,
-0.0184783935546875,
0.038116455078125,
-0.01030731201171875,
-0.057586669921875,
-0.024017333984375,
-0.00046563148498535156,
0.0379638671875,
-0.01403045654296875,
-0.012054443359375,
-0.0115509033203125,
-0.025604248046875,
0.0185089111328125,
0.0221710205078125,
-0.051605224609375,
-0.016143798828125,
-0.0034046173095703125,
-0.0042266845703125,
0.0390625,
0.028778076171875,
-0.056976318359375,
0.041717529296875,
0.04241943359375,
0.0255279541015625,
0.0631103515625,
-0.012664794921875,
0.007770538330078125,
-0.06463623046875,
0.04449462890625,
-0.0016450881958007812,
0.03887939453125,
0.037933349609375,
-0.0210723876953125,
0.045928955078125,
0.044525146484375,
-0.033355712890625,
-0.06256103515625,
-0.00278472900390625,
-0.0828857421875,
0.00966644287109375,
0.07086181640625,
-0.01959228515625,
-0.03564453125,
0.0281524658203125,
-0.016357421875,
0.053558349609375,
-0.0047760009765625,
0.0340576171875,
0.0175933837890625,
0.006221771240234375,
-0.045684814453125,
-0.03436279296875,
0.038177490234375,
0.01230621337890625,
-0.03955078125,
-0.028594970703125,
0.0032672882080078125,
0.042236328125,
0.027862548828125,
0.0266876220703125,
-0.0117340087890625,
0.0146484375,
0.004528045654296875,
0.0419921875,
-0.0259552001953125,
-0.011688232421875,
-0.03021240234375,
-0.01201629638671875,
-0.006717681884765625,
-0.04705810546875
]
] |
digiplay/realmixUnrealjourney_v1 | 2023-10-10T01:13:20.000Z | [
"diffusers",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-image",
"license:other",
"endpoints_compatible",
"has_space",
"diffusers:StableDiffusionPipeline",
"region:us"
] | text-to-image | digiplay | null | null | digiplay/realmixUnrealjourney_v1 | 2 | 6,843 | diffusers | 2023-06-26T06:18:42 | ---
license: other
tags:
- stable-diffusion
- stable-diffusion-diffusers
- text-to-image
- diffusers
inference: true
---
Model info:
https://civitai.com/models/83214/realmixunrealjourney
Sample image I made:

Original author's demo images:


| 613 | [
[
-0.03460693359375,
-0.034515380859375,
0.037872314453125,
0.00864410400390625,
-0.0130462646484375,
-0.0220947265625,
0.0361328125,
-0.0102386474609375,
0.045806884765625,
0.051025390625,
-0.07769775390625,
-0.037933349609375,
-0.007476806640625,
-0.0238494873046875,
-0.007755279541015625,
0.03985595703125,
0.0015316009521484375,
-0.0166778564453125,
-0.016510009765625,
-0.0090179443359375,
-0.029510498046875,
-0.0225067138671875,
-0.0180206298828125,
-0.0014257431030273438,
0.0024394989013671875,
0.040924072265625,
0.04150390625,
0.037078857421875,
0.0215301513671875,
0.0196990966796875,
-0.01078033447265625,
0.01351165771484375,
-0.0221099853515625,
-0.024993896484375,
0.00966644287109375,
-0.03204345703125,
-0.05670166015625,
0.0264129638671875,
0.056915283203125,
0.043182373046875,
-0.01023101806640625,
0.038909912109375,
0.0200958251953125,
0.034759521484375,
-0.035186767578125,
-0.0011949539184570312,
0.0034637451171875,
0.01010894775390625,
-0.01221466064453125,
-0.01419830322265625,
-0.018096923828125,
-0.0193939208984375,
-0.01203155517578125,
-0.0882568359375,
0.025848388671875,
0.0280609130859375,
0.08612060546875,
-0.0172271728515625,
-0.05206298828125,
0.004413604736328125,
-0.045135498046875,
0.049285888671875,
-0.044830322265625,
0.037078857421875,
0.01161956787109375,
0.048004150390625,
-0.0222320556640625,
-0.052581787109375,
-0.047393798828125,
0.025604248046875,
-0.005786895751953125,
0.0264434814453125,
-0.0252838134765625,
-0.02880859375,
0.040679931640625,
0.018890380859375,
-0.06365966796875,
-0.015350341796875,
-0.0504150390625,
0.0004696846008300781,
0.044708251953125,
0.01433563232421875,
0.04327392578125,
0.0011014938354492188,
-0.03924560546875,
-0.0244140625,
-0.054534912109375,
0.0261383056640625,
0.03399658203125,
-0.010162353515625,
-0.04608154296875,
0.0328369140625,
0.0024280548095703125,
0.041046142578125,
0.0161895751953125,
-0.016937255859375,
0.039459228515625,
-0.003147125244140625,
-0.030853271484375,
-0.032012939453125,
0.04876708984375,
0.07940673828125,
-0.0272674560546875,
0.00392913818359375,
-0.00017571449279785156,
-0.048980712890625,
0.00806427001953125,
-0.07794189453125,
-0.0113067626953125,
0.022491455078125,
-0.057037353515625,
-0.0235443115234375,
0.0233001708984375,
-0.07342529296875,
-0.006160736083984375,
-0.01238250732421875,
0.0144805908203125,
-0.0157012939453125,
-0.048980712890625,
-0.00923919677734375,
-0.006381988525390625,
0.032440185546875,
0.0264434814453125,
-0.04693603515625,
0.034149169921875,
0.0186004638671875,
0.04876708984375,
0.0244598388671875,
0.003719329833984375,
-0.0255126953125,
0.01122283935546875,
-0.03350830078125,
0.05279541015625,
0.004360198974609375,
-0.060699462890625,
0.00792694091796875,
0.0233306884765625,
-0.00954437255859375,
-0.035247802734375,
0.0732421875,
-0.041259765625,
0.0221710205078125,
-0.042236328125,
-0.035675048828125,
-0.024139404296875,
0.030670166015625,
-0.067138671875,
0.038665771484375,
0.0361328125,
-0.05450439453125,
0.024200439453125,
-0.030609130859375,
0.017425537109375,
0.0221099853515625,
-0.006805419921875,
-0.040374755859375,
0.0001932382583618164,
0.0014333724975585938,
0.01284027099609375,
-0.0168609619140625,
-0.0184478759765625,
-0.046539306640625,
-0.0272674560546875,
0.0293426513671875,
0.00167083740234375,
0.08538818359375,
0.0255279541015625,
-0.0188751220703125,
-0.01055145263671875,
-0.07635498046875,
0.016937255859375,
0.0229949951171875,
0.0178070068359375,
-0.022125244140625,
-0.0263519287109375,
0.027191162109375,
0.037261962890625,
0.0255889892578125,
-0.05328369140625,
0.0232391357421875,
-0.012481689453125,
-0.0010890960693359375,
0.049530029296875,
0.0280303955078125,
0.01555633544921875,
-0.0384521484375,
0.062744140625,
0.00013554096221923828,
0.035064697265625,
-0.00836944580078125,
-0.023193359375,
-0.0576171875,
-0.0200042724609375,
0.0149993896484375,
0.01535797119140625,
-0.0462646484375,
0.041290283203125,
-0.0141754150390625,
-0.050018310546875,
-0.027679443359375,
0.0012693405151367188,
0.007457733154296875,
0.037994384765625,
0.0023860931396484375,
-0.01145172119140625,
-0.042816162109375,
-0.09625244140625,
0.0009031295776367188,
-0.005863189697265625,
-0.0194854736328125,
0.037445068359375,
0.05267333984375,
-0.0085906982421875,
0.058013916015625,
-0.05865478515625,
-0.01197052001953125,
-0.0005354881286621094,
-0.0173187255859375,
0.0305633544921875,
0.0499267578125,
0.08660888671875,
-0.07489013671875,
-0.039031982421875,
-0.00885772705078125,
-0.041412353515625,
-0.00720977783203125,
-0.00788116455078125,
-0.041595458984375,
-0.0308074951171875,
0.0256500244140625,
-0.038055419921875,
0.0523681640625,
0.036407470703125,
-0.06048583984375,
0.0699462890625,
-0.030120849609375,
0.04864501953125,
-0.09857177734375,
0.0238800048828125,
0.0301666259765625,
-0.0214080810546875,
-0.0211029052734375,
0.050506591796875,
0.0006628036499023438,
-0.01155853271484375,
-0.051025390625,
0.06329345703125,
-0.057647705078125,
0.0289764404296875,
-0.00557708740234375,
-0.0014047622680664062,
0.0232391357421875,
0.0297698974609375,
-0.0166168212890625,
0.02783203125,
0.04339599609375,
-0.028900146484375,
0.0302886962890625,
0.0158843994140625,
-0.040130615234375,
0.04840087890625,
-0.073974609375,
0.005657196044921875,
0.0211944580078125,
-0.0001156926155090332,
-0.0775146484375,
-0.0273284912109375,
0.0305328369140625,
-0.0262298583984375,
-0.006145477294921875,
0.0020656585693359375,
-0.055206298828125,
-0.03314208984375,
-0.0248565673828125,
0.031463623046875,
0.06622314453125,
-0.05560302734375,
0.044403076171875,
0.0285186767578125,
0.0230865478515625,
0.0158843994140625,
-0.043975830078125,
-0.02935791015625,
-0.02825927734375,
-0.031005859375,
0.057769775390625,
-0.0168304443359375,
-0.0261688232421875,
-0.00750732421875,
0.01003265380859375,
-0.032562255859375,
-0.0207366943359375,
0.06768798828125,
0.047576904296875,
-0.01349639892578125,
-0.034332275390625,
0.0166015625,
0.0008988380432128906,
0.0158843994140625,
0.004985809326171875,
0.01505279541015625,
-0.0164031982421875,
-0.03472900390625,
-0.0653076171875,
0.006160736083984375,
0.05059814453125,
0.01934814453125,
0.04840087890625,
0.0159759521484375,
-0.043121337890625,
0.0007085800170898438,
-0.04473876953125,
-0.005626678466796875,
-0.034820556640625,
-0.01788330078125,
-0.04412841796875,
-0.036895751953125,
0.050323486328125,
-0.01091766357421875,
0.0095977783203125,
0.047332763671875,
0.0163421630859375,
-0.0133819580078125,
0.07220458984375,
0.052154541015625,
0.01140594482421875,
0.0235443115234375,
-0.0731201171875,
-0.012176513671875,
-0.07672119140625,
-0.033477783203125,
-0.0252838134765625,
-0.0299530029296875,
-0.048583984375,
-0.04962158203125,
0.031951904296875,
0.0035877227783203125,
-0.0311279296875,
0.038665771484375,
-0.00571441650390625,
0.0252838134765625,
0.035247802734375,
0.03045654296875,
0.002735137939453125,
0.006305694580078125,
-0.01247406005859375,
-0.038360595703125,
-0.032440185546875,
-0.02777099609375,
0.054534912109375,
0.006805419921875,
0.01953125,
0.01959228515625,
0.05609130859375,
-0.00763702392578125,
0.00997161865234375,
-0.011322021484375,
0.04107666015625,
0.0195770263671875,
-0.059661865234375,
0.01690673828125,
-0.01019287109375,
-0.056488037109375,
0.039642333984375,
-0.0323486328125,
-0.03271484375,
0.042022705078125,
0.02001953125,
-0.03900146484375,
0.0386962890625,
-0.0290985107421875,
0.0556640625,
-0.00738525390625,
-0.0309906005859375,
0.00952911376953125,
-0.00949859619140625,
0.037261962890625,
0.0211639404296875,
0.041412353515625,
0.018646240234375,
0.004241943359375,
0.052215576171875,
-0.07269287109375,
0.05548095703125,
-0.0221099853515625,
0.02252197265625,
0.05224609375,
0.01175689697265625,
0.0030765533447265625,
0.0118255615234375,
-0.0153045654296875,
-0.009521484375,
-0.0018186569213867188,
-0.037872314453125,
-0.05126953125,
0.0596923828125,
-0.051055908203125,
-0.0234832763671875,
-0.0310211181640625,
-0.01091766357421875,
0.0012006759643554688,
0.0169219970703125,
0.044952392578125,
0.046173095703125,
-0.057220458984375,
0.0213470458984375,
0.025970458984375,
0.0008487701416015625,
0.0168304443359375,
0.027252197265625,
-0.0294647216796875,
-0.033843994140625,
0.051025390625,
0.005237579345703125,
0.0008602142333984375,
0.007762908935546875,
0.0023651123046875,
0.0005002021789550781,
-0.0012302398681640625,
-0.036102294921875,
0.029541015625,
-0.0087432861328125,
0.0007801055908203125,
-0.032196044921875,
-0.00908660888671875,
-0.049896240234375,
-0.032684326171875,
-0.01313018798828125,
-0.030670166015625,
-0.046722412109375,
-0.006694793701171875,
0.0035076141357421875,
0.0256195068359375,
0.004207611083984375,
0.0528564453125,
-0.0303955078125,
0.00803375244140625,
0.032196044921875,
-0.00885772705078125,
-0.0017299652099609375,
-0.06689453125,
0.0018138885498046875,
0.021331787109375,
-0.045867919921875,
-0.051513671875,
0.06231689453125,
-0.007904052734375,
0.048858642578125,
0.033447265625,
-0.0080108642578125,
0.07275390625,
-0.031829833984375,
0.060333251953125,
0.047271728515625,
-0.047821044921875,
0.048980712890625,
-0.032501220703125,
0.028778076171875,
0.041046142578125,
0.02099609375,
-0.03192138671875,
-0.0171051025390625,
-0.10577392578125,
-0.0556640625,
0.0253753662109375,
0.0232696533203125,
-0.0015268325805664062,
0.03704833984375,
0.044342041015625,
-0.0025482177734375,
0.0063629150390625,
-0.048431396484375,
-0.0228424072265625,
-0.00836944580078125,
-0.0005393028259277344,
0.0130462646484375,
-0.0149078369140625,
-0.0158538818359375,
-0.042724609375,
0.06658935546875,
-0.01346588134765625,
0.0090484619140625,
0.0105438232421875,
0.0064697265625,
-0.00977325439453125,
-0.005443572998046875,
0.04046630859375,
0.052947998046875,
-0.02545166015625,
-0.019989013671875,
-0.00772857666015625,
-0.017486572265625,
0.003719329833984375,
-0.005161285400390625,
-0.0241546630859375,
0.00817108154296875,
0.04852294921875,
0.060760498046875,
0.0188446044921875,
-0.0384521484375,
0.06292724609375,
-0.01107025146484375,
-0.0322265625,
-0.03411865234375,
0.018951416015625,
0.01076507568359375,
0.024627685546875,
0.0190277099609375,
0.020416259765625,
0.038543701171875,
-0.0161590576171875,
0.0072021484375,
0.00988006591796875,
-0.0284881591796875,
-0.048919677734375,
0.06475830078125,
-0.0131072998046875,
-0.0142669677734375,
0.053375244140625,
-0.022674560546875,
-0.02435302734375,
0.08282470703125,
0.05810546875,
0.056396484375,
-0.00969696044921875,
0.031280517578125,
0.059600830078125,
0.009368896484375,
0.014556884765625,
0.03741455078125,
0.0081787109375,
-0.0310211181640625,
0.022186279296875,
-0.0231781005859375,
-0.033843994140625,
0.0224151611328125,
-0.0628662109375,
0.0439453125,
-0.04754638671875,
-0.00691986083984375,
-0.0017480850219726562,
0.006656646728515625,
-0.0667724609375,
0.045440673828125,
-0.0184783935546875,
0.09124755859375,
-0.07232666015625,
0.04925537109375,
0.03936767578125,
-0.03656005859375,
-0.07293701171875,
-0.00041675567626953125,
0.0225677490234375,
-0.045013427734375,
0.046905517578125,
0.0189971923828125,
0.0056915283203125,
-0.007678985595703125,
-0.053802490234375,
-0.045928955078125,
0.09210205078125,
0.0092315673828125,
-0.06976318359375,
0.005084991455078125,
-0.034393310546875,
0.036468505859375,
-0.06103515625,
0.0137176513671875,
-0.001861572265625,
0.0193939208984375,
0.031707763671875,
-0.035186767578125,
-0.00844573974609375,
-0.053558349609375,
0.0177459716796875,
-0.0213165283203125,
-0.07049560546875,
0.06640625,
-0.0284271240234375,
-0.00959014892578125,
0.0230712890625,
0.062103271484375,
0.0308685302734375,
0.0141448974609375,
0.07470703125,
0.0498046875,
0.034637451171875,
0.006130218505859375,
0.09136962890625,
-0.0196533203125,
0.02191162109375,
0.0692138671875,
-0.0182952880859375,
0.036529541015625,
0.0022525787353515625,
-0.00861358642578125,
0.04705810546875,
0.07061767578125,
-0.0168914794921875,
0.040008544921875,
-0.004650115966796875,
-0.01335906982421875,
-0.003795623779296875,
0.00983428955078125,
-0.0367431640625,
0.007411956787109375,
0.0010080337524414062,
-0.0194244384765625,
-0.00226593017578125,
0.0197296142578125,
-0.01611328125,
0.00848388671875,
-0.01348114013671875,
0.0232086181640625,
-0.005992889404296875,
-0.01306915283203125,
0.028778076171875,
-0.006023406982421875,
0.031768798828125,
-0.0166168212890625,
-0.0109710693359375,
-0.002735137939453125,
-0.00015795230865478516,
-0.01708984375,
-0.041595458984375,
0.0066375732421875,
-0.00992584228515625,
-0.006618499755859375,
-0.029998779296875,
0.043731689453125,
-0.01320648193359375,
-0.072509765625,
0.0036373138427734375,
0.01806640625,
0.02374267578125,
0.0208587646484375,
-0.064208984375,
0.0157318115234375,
0.002628326416015625,
-0.0299530029296875,
-0.00891876220703125,
0.019134521484375,
0.006618499755859375,
0.017852783203125,
0.035736083984375,
0.016357421875,
-0.00820159912109375,
-0.00922393798828125,
0.06280517578125,
-0.037567138671875,
-0.053375244140625,
-0.04931640625,
0.04547119140625,
-0.043853759765625,
-0.062255859375,
0.062408447265625,
0.07403564453125,
0.047515869140625,
-0.037200927734375,
0.042144775390625,
-0.0081787109375,
0.0302886962890625,
-0.033843994140625,
0.048492431640625,
-0.07049560546875,
-0.0206146240234375,
-0.0093231201171875,
-0.050567626953125,
-0.007427215576171875,
0.05059814453125,
-0.004589080810546875,
0.005260467529296875,
0.0208740234375,
0.056182861328125,
-0.01125335693359375,
-0.015045166015625,
0.0156402587890625,
0.007450103759765625,
0.0389404296875,
0.0163421630859375,
0.0298309326171875,
-0.066162109375,
-0.0081787109375,
-0.04315185546875,
-0.04229736328125,
-0.0275726318359375,
-0.046905517578125,
-0.048370361328125,
-0.06903076171875,
-0.04876708984375,
-0.039794921875,
-0.00844573974609375,
0.058929443359375,
0.0927734375,
-0.04864501953125,
-0.0295562744140625,
0.0136871337890625,
-0.0008215904235839844,
-0.01812744140625,
-0.0214691162109375,
0.0188446044921875,
0.0496826171875,
-0.072509765625,
0.00055694580078125,
-0.00762176513671875,
0.039398193359375,
-0.00437164306640625,
0.004283905029296875,
-0.004718780517578125,
-0.0110015869140625,
0.024749755859375,
0.0288543701171875,
-0.01541900634765625,
-0.0016603469848632812,
0.0031108856201171875,
-0.01375579833984375,
-0.003681182861328125,
0.058013916015625,
-0.026611328125,
0.01239776611328125,
0.035797119140625,
-0.004634857177734375,
0.0255889892578125,
-0.00394439697265625,
0.0157623291015625,
-0.035369873046875,
0.043670654296875,
0.004734039306640625,
0.039398193359375,
0.010345458984375,
-0.03765869140625,
0.035369873046875,
0.03448486328125,
-0.029052734375,
-0.07275390625,
-0.0085906982421875,
-0.10040283203125,
-0.0161285400390625,
0.049102783203125,
0.0026607513427734375,
-0.06329345703125,
0.042510986328125,
-0.032989501953125,
-0.0034656524658203125,
0.0024662017822265625,
0.027252197265625,
0.053436279296875,
-0.00041794776916503906,
-0.01409912109375,
-0.057586669921875,
0.0021533966064453125,
-0.01264190673828125,
-0.06365966796875,
-0.035186767578125,
0.053924560546875,
0.039642333984375,
0.035430908203125,
0.05230712890625,
-0.0277557373046875,
0.043731689453125,
0.021636962890625,
0.04071044921875,
-0.0009822845458984375,
-0.0021114349365234375,
0.003875732421875,
0.0193328857421875,
-0.005634307861328125,
-0.038055419921875
]
] |
stablediffusionapi/disney-pixal-cartoon | 2023-06-01T04:38:30.000Z | [
"diffusers",
"stablediffusionapi.com",
"stable-diffusion-api",
"text-to-image",
"ultra-realistic",
"license:creativeml-openrail-m",
"endpoints_compatible",
"has_space",
"diffusers:StableDiffusionPipeline",
"region:us"
] | text-to-image | stablediffusionapi | null | null | stablediffusionapi/disney-pixal-cartoon | 20 | 6,833 | diffusers | 2023-06-01T04:35:53 | ---
license: creativeml-openrail-m
tags:
- stablediffusionapi.com
- stable-diffusion-api
- text-to-image
- ultra-realistic
pinned: true
---
# Disney Pixal Cartoon Type API Inference

## Get API Key
Get your API key from [Stable Diffusion API](http://stablediffusionapi.com/); no payment is needed.
Replace the key in the code below and set **model_id** to `"disney-pixal-cartoon"`.
Coding in PHP/Node/Java? See the docs for more code examples: [View docs](https://stablediffusionapi.com/docs)
Try model for free: [Generate Images](https://stablediffusionapi.com/models/disney-pixal-cartoon)
Model link: [View model](https://stablediffusionapi.com/models/disney-pixal-cartoon)
Credits: [View credits](https://civitai.com/?query=Disney%20Pixal%20Cartoon%20Type)
View all models: [View Models](https://stablediffusionapi.com/models)
```python
import requests
import json

url = "https://stablediffusionapi.com/api/v3/dreambooth"

payload = json.dumps({
    "key": "your_api_key",
    "model_id": "disney-pixal-cartoon",
    "prompt": "ultra realistic close up portrait ((beautiful pale cyberpunk female with heavy black eyeliner)), blue eyes, shaved side haircut, hyper detail, cinematic lighting, magic neon, dark red city, Canon EOS R3, nikon, f/1.4, ISO 200, 1/160s, 8K, RAW, unedited, symmetrical balance, in-frame, 8K",
    "negative_prompt": "painting, extra fingers, mutated hands, poorly drawn hands, poorly drawn face, deformed, ugly, blurry, bad anatomy, bad proportions, extra limbs, cloned face, skinny, glitchy, double torso, extra arms, extra hands, mangled fingers, missing lips, ugly face, distorted face, extra legs, anime",
    "width": "512",
    "height": "512",
    "samples": "1",
    "num_inference_steps": "30",
    "safety_checker": "no",
    "enhance_prompt": "yes",
    "seed": None,
    "guidance_scale": 7.5,
    "multi_lingual": "no",
    "panorama": "no",
    "self_attention": "no",
    "upscale": "no",
    "embeddings": "embeddings_model_id",
    "lora": "lora_model_id",
    "webhook": None,
    "track_id": None
})

headers = {
    'Content-Type': 'application/json'
}

response = requests.request("POST", url, headers=headers, data=payload)

print(response.text)
```
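As a quick sanity check before sending anything over the network, you can build the same request body locally and confirm it serializes to well-formed JSON. This is a minimal sketch: the prompt value below is a placeholder, and `your_api_key` must be replaced with a real key.

```python
import json

# Same field names as the request above; values here are placeholders.
payload = {
    "key": "your_api_key",
    "model_id": "disney-pixal-cartoon",
    "prompt": "cute cartoon fox, forest background, soft lighting",
    "width": "512",
    "height": "512",
    "samples": "1",
    "num_inference_steps": "30",
    "guidance_scale": 7.5,
    "seed": None,
}

body = json.dumps(payload)

# Round-trip to confirm the body is well-formed JSON before POSTing it.
decoded = json.loads(body)
assert decoded["model_id"] == "disney-pixal-cartoon"
assert decoded["seed"] is None  # Python None serializes to JSON null
```

Note that numeric fields such as `guidance_scale` stay numbers, while the API example above passes dimensions as strings; the round-trip check preserves both.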
> Use this coupon code to get 25% off **DMGG0RBN** | 2,510 | [
[
-0.039703369140625,
-0.05712890625,
0.01947021484375,
0.025787353515625,
-0.0312347412109375,
0.022918701171875,
0.0221710205078125,
-0.035736083984375,
0.050048828125,
0.042999267578125,
-0.060302734375,
-0.051544189453125,
-0.033599853515625,
0.0018949508666992188,
-0.01367950439453125,
0.045501708984375,
0.0055084228515625,
-0.016754150390625,
-0.0087127685546875,
0.0185394287109375,
-0.00574493408203125,
-0.021026611328125,
-0.055572509765625,
-0.0028629302978515625,
0.0192718505859375,
-0.003345489501953125,
0.0604248046875,
0.044464111328125,
0.0175323486328125,
0.0198822021484375,
-0.0205841064453125,
-0.0085296630859375,
-0.03546142578125,
-0.02294921875,
-0.0198822021484375,
-0.05181884765625,
-0.043792724609375,
-0.005001068115234375,
0.0135345458984375,
0.0305023193359375,
0.0100250244140625,
0.02838134765625,
-0.0052032470703125,
0.0511474609375,
-0.0552978515625,
0.028228759765625,
-0.02166748046875,
0.0252227783203125,
0.005825042724609375,
0.01361083984375,
-0.005725860595703125,
-0.02410888671875,
-0.009307861328125,
-0.07476806640625,
0.0283050537109375,
0.00827789306640625,
0.0994873046875,
-0.002025604248046875,
-0.016143798828125,
-0.0125732421875,
-0.03350830078125,
0.058868408203125,
-0.06292724609375,
0.01464080810546875,
0.03131103515625,
0.005588531494140625,
-0.0034809112548828125,
-0.064697265625,
-0.05133056640625,
0.01212310791015625,
0.0217132568359375,
0.0132904052734375,
-0.0305938720703125,
-0.0177001953125,
0.0205841064453125,
0.032928466796875,
-0.0288543701171875,
-0.0173187255859375,
-0.032440185546875,
-0.005199432373046875,
0.050201416015625,
0.00943756103515625,
0.031768798828125,
-0.03399658203125,
-0.0212860107421875,
-0.021820068359375,
-0.041046142578125,
0.0330810546875,
0.055633544921875,
0.0227203369140625,
-0.04571533203125,
0.05096435546875,
-0.0192108154296875,
0.040374755859375,
0.02642822265625,
-0.005550384521484375,
0.04193115234375,
-0.001354217529296875,
-0.007503509521484375,
-0.01461029052734375,
0.0733642578125,
0.06256103515625,
0.010528564453125,
0.016632080078125,
-0.01305389404296875,
-0.000025331974029541016,
0.01015472412109375,
-0.0660400390625,
-0.01055145263671875,
0.04656982421875,
-0.04998779296875,
-0.0433349609375,
0.003948211669921875,
-0.0791015625,
-0.01513671875,
-0.0040283203125,
0.02532958984375,
-0.029022216796875,
-0.05035400390625,
0.0234527587890625,
-0.0143890380859375,
0.00905609130859375,
0.0184478759765625,
-0.06756591796875,
-0.00424957275390625,
0.04168701171875,
0.05804443359375,
0.005847930908203125,
0.0005054473876953125,
0.0227508544921875,
0.0193328857421875,
-0.035614013671875,
0.061553955078125,
-0.0176239013671875,
-0.038299560546875,
-0.004817962646484375,
0.0275421142578125,
-0.0157318115234375,
-0.038482666015625,
0.043914794921875,
-0.037445068359375,
-0.0098724365234375,
-0.0032138824462890625,
-0.024444580078125,
-0.0330810546875,
0.01103973388671875,
-0.060028076171875,
0.040252685546875,
0.017578125,
-0.063232421875,
0.01114654541015625,
-0.05291748046875,
-0.0182037353515625,
-0.01079559326171875,
0.00897979736328125,
-0.03460693359375,
0.00424957275390625,
0.0205078125,
0.0153961181640625,
0.001499176025390625,
-0.005214691162109375,
-0.065185546875,
-0.027069091796875,
0.02435302734375,
-0.01065826416015625,
0.08685302734375,
0.0355224609375,
-0.011993408203125,
-0.00859832763671875,
-0.05975341796875,
0.00920867919921875,
0.060516357421875,
-0.0120086669921875,
-0.00921630859375,
-0.01371002197265625,
0.013031005859375,
0.00753021240234375,
0.0279541015625,
-0.034942626953125,
0.0126953125,
-0.02716064453125,
0.02386474609375,
0.035247802734375,
0.01071929931640625,
0.00798797607421875,
-0.0246734619140625,
0.057525634765625,
0.009552001953125,
0.031890869140625,
0.0056304931640625,
-0.044403076171875,
-0.04779052734375,
-0.0261688232421875,
0.0086669921875,
0.028900146484375,
-0.047515869140625,
0.034332275390625,
-0.013092041015625,
-0.041473388671875,
-0.035369873046875,
-0.016265869140625,
0.0226593017578125,
0.029754638671875,
0.004001617431640625,
-0.018951416015625,
-0.042572021484375,
-0.064697265625,
-0.003993988037109375,
-0.0124969482421875,
-0.0158843994140625,
0.0235595703125,
0.032470703125,
-0.0200042724609375,
0.061981201171875,
-0.0413818359375,
0.0050506591796875,
-0.0173187255859375,
-0.0030193328857421875,
0.06402587890625,
0.052093505859375,
0.06671142578125,
-0.05615234375,
-0.0345458984375,
-0.0228424072265625,
-0.0489501953125,
0.0115509033203125,
0.0264129638671875,
-0.01438140869140625,
-0.00728607177734375,
-0.0009107589721679688,
-0.06341552734375,
0.044158935546875,
0.03021240234375,
-0.04376220703125,
0.046600341796875,
-0.0056304931640625,
0.048675537109375,
-0.0797119140625,
0.0061492919921875,
-0.0009889602661132812,
-0.0108489990234375,
-0.028076171875,
0.0322265625,
0.003017425537109375,
-0.01509857177734375,
-0.0609130859375,
0.039703369140625,
-0.0211029052734375,
0.01273345947265625,
-0.01312255859375,
0.0037212371826171875,
0.0216217041015625,
0.0401611328125,
0.00830078125,
0.0312347412109375,
0.056365966796875,
-0.037567138671875,
0.03436279296875,
0.01654052734375,
-0.0207061767578125,
0.04595947265625,
-0.052581787109375,
0.0203857421875,
-0.0065155029296875,
0.0178680419921875,
-0.08001708984375,
-0.038299560546875,
0.039825439453125,
-0.041046142578125,
-0.0015468597412109375,
-0.049896240234375,
-0.046600341796875,
-0.05413818359375,
-0.032440185546875,
0.0245208740234375,
0.047576904296875,
-0.0386962890625,
0.05108642578125,
0.020111083984375,
0.004230499267578125,
-0.04376220703125,
-0.06207275390625,
-0.0218048095703125,
-0.0234375,
-0.052490234375,
0.020355224609375,
-0.003570556640625,
-0.01168060302734375,
0.006427764892578125,
-0.005199432373046875,
-0.0183258056640625,
-0.0185546875,
0.0292816162109375,
0.03936767578125,
-0.03179931640625,
-0.0296630859375,
0.01357269287109375,
0.0077667236328125,
0.01253509521484375,
-0.007747650146484375,
0.057769775390625,
-0.004108428955078125,
-0.039031982421875,
-0.05975341796875,
0.0106964111328125,
0.059326171875,
0.0017576217651367188,
0.033935546875,
0.0452880859375,
-0.039764404296875,
0.00968170166015625,
-0.034942626953125,
-0.0012407302856445312,
-0.036529541015625,
0.01549530029296875,
-0.038818359375,
-0.032501220703125,
0.068359375,
-0.0022983551025390625,
-0.00991058349609375,
0.05963134765625,
0.024993896484375,
-0.00940704345703125,
0.10675048828125,
0.0192718505859375,
0.01059722900390625,
0.0274658203125,
-0.062103271484375,
-0.0010662078857421875,
-0.0565185546875,
-0.0143890380859375,
-0.0228271484375,
-0.0223846435546875,
-0.0259246826171875,
-0.033172607421875,
-0.0002416372299194336,
0.0364990234375,
-0.037567138671875,
0.0242156982421875,
-0.040802001953125,
0.03582763671875,
0.0333251953125,
0.0213775634765625,
0.0158233642578125,
0.016265869140625,
-0.0181427001953125,
0.0017185211181640625,
-0.0246429443359375,
-0.038238525390625,
0.0849609375,
0.0203857421875,
0.0697021484375,
0.00997161865234375,
0.037078857421875,
0.007793426513671875,
0.0003540515899658203,
-0.032623291015625,
0.032867431640625,
-0.004364013671875,
-0.0738525390625,
0.018707275390625,
-0.007442474365234375,
-0.075439453125,
0.0182952880859375,
-0.02386474609375,
-0.0758056640625,
0.0445556640625,
0.00890350341796875,
-0.044158935546875,
0.044677734375,
-0.05908203125,
0.057525634765625,
-0.0113677978515625,
-0.038299560546875,
-0.01416015625,
-0.02947998046875,
0.047332763671875,
-0.004222869873046875,
0.033905029296875,
-0.03875732421875,
-0.01529693603515625,
0.049896240234375,
-0.032440185546875,
0.07293701171875,
-0.0192108154296875,
0.0078887939453125,
0.054931640625,
0.0091705322265625,
0.036102294921875,
0.0308685302734375,
-0.004791259765625,
0.0276031494140625,
0.026763916015625,
-0.033905029296875,
-0.032135009765625,
0.06085205078125,
-0.07281494140625,
-0.032989501953125,
-0.018341064453125,
-0.0199737548828125,
-0.01114654541015625,
0.0221710205078125,
0.040679931640625,
0.01416778564453125,
0.01020050048828125,
0.0010738372802734375,
0.046875,
-0.014556884765625,
0.0399169921875,
0.019927978515625,
-0.066650390625,
-0.05615234375,
0.05999755859375,
-0.0198822021484375,
0.0241546630859375,
0.00978851318359375,
0.0198822021484375,
-0.045440673828125,
-0.0311737060546875,
-0.0325927734375,
0.0272674560546875,
-0.050506591796875,
-0.0231781005859375,
-0.061309814453125,
-0.00820159912109375,
-0.05291748046875,
-0.0135498046875,
-0.0552978515625,
-0.037841796875,
-0.05267333984375,
0.0014104843139648438,
0.048828125,
0.0296630859375,
-0.0033550262451171875,
0.0209197998046875,
-0.0506591796875,
0.036773681640625,
0.018951416015625,
0.028228759765625,
0.01343536376953125,
-0.03326416015625,
0.00620269775390625,
0.017791748046875,
-0.04107666015625,
-0.058197021484375,
0.040618896484375,
-0.012786865234375,
0.0266876220703125,
0.060028076171875,
0.0059051513671875,
0.07147216796875,
-0.005588531494140625,
0.0697021484375,
0.039642333984375,
-0.05743408203125,
0.05609130859375,
-0.04364013671875,
0.00836181640625,
0.029144287109375,
0.0266876220703125,
-0.0291900634765625,
-0.02642822265625,
-0.06719970703125,
-0.07208251953125,
0.0482177734375,
0.0167388916015625,
0.0295562744140625,
-0.001316070556640625,
0.0278472900390625,
-0.004642486572265625,
0.01947021484375,
-0.0628662109375,
-0.034698486328125,
-0.034942626953125,
-0.0094451904296875,
0.020538330078125,
-0.0005946159362792969,
-0.00847625732421875,
-0.036529541015625,
0.04888916015625,
-0.002826690673828125,
0.0281829833984375,
0.0181427001953125,
0.0157318115234375,
-0.03070068359375,
-0.01007843017578125,
0.027984619140625,
0.05548095703125,
-0.03961181640625,
-0.01253509521484375,
-0.0190277099609375,
-0.042510986328125,
0.0048370361328125,
0.0079345703125,
-0.018035888671875,
0.0053558349609375,
0.020538330078125,
0.054412841796875,
-0.00279998779296875,
-0.044342041015625,
0.04779052734375,
-0.01708984375,
-0.01422119140625,
-0.032928466796875,
0.0128021240234375,
0.03021240234375,
0.033721923828125,
0.037078857421875,
0.01393890380859375,
0.005847930908203125,
-0.0181884765625,
-0.00455474853515625,
0.0247650146484375,
-0.0167236328125,
-0.0309906005859375,
0.07928466796875,
-0.0037078857421875,
-0.034332275390625,
0.03472900390625,
-0.0218353271484375,
-0.0037078857421875,
0.06915283203125,
0.056365966796875,
0.0697021484375,
-0.009674072265625,
0.019439697265625,
0.0604248046875,
-0.003444671630859375,
0.0018749237060546875,
0.05194091796875,
0.026824951171875,
-0.0416259765625,
-0.0190887451171875,
-0.062347412109375,
-0.018707275390625,
0.034271240234375,
-0.041595458984375,
0.04852294921875,
-0.061553955078125,
-0.0300140380859375,
-0.014251708984375,
-0.022125244140625,
-0.0428466796875,
0.017822265625,
0.007549285888671875,
0.0628662109375,
-0.0447998046875,
0.041046142578125,
0.04718017578125,
-0.050811767578125,
-0.07696533203125,
-0.0168304443359375,
0.0119476318359375,
-0.06329345703125,
0.03643798828125,
0.0032863616943359375,
0.00907135009765625,
0.007534027099609375,
-0.056396484375,
-0.0789794921875,
0.0712890625,
0.02557373046875,
-0.03778076171875,
-0.0027866363525390625,
0.01035308837890625,
0.03204345703125,
-0.043487548828125,
0.028564453125,
0.0140533447265625,
0.0201263427734375,
0.0261077880859375,
-0.0418701171875,
-0.0016794204711914062,
-0.03131103515625,
0.0124969482421875,
-0.0262451171875,
-0.052459716796875,
0.07373046875,
-0.030731201171875,
-0.001140594482421875,
0.01406097412109375,
0.048675537109375,
0.054290771484375,
0.0266876220703125,
0.05267333984375,
0.059112548828125,
0.0227813720703125,
-0.0022449493408203125,
0.0738525390625,
-0.034820556640625,
0.064697265625,
0.0477294921875,
0.00873565673828125,
0.0599365234375,
0.0308990478515625,
-0.041015625,
0.047515869140625,
0.0853271484375,
-0.02001953125,
0.048492431640625,
0.00978851318359375,
-0.0250244140625,
-0.00577545166015625,
0.004093170166015625,
-0.04254150390625,
0.0120086669921875,
0.0247650146484375,
-0.03485107421875,
0.001331329345703125,
0.004634857177734375,
-0.0025005340576171875,
-0.0226287841796875,
-0.0274810791015625,
0.0211944580078125,
-0.0023937225341796875,
-0.0205230712890625,
0.064697265625,
-0.017181396484375,
0.064453125,
-0.038421630859375,
-0.0015344619750976562,
-0.0192413330078125,
0.01038360595703125,
-0.0191650390625,
-0.050262451171875,
0.01085662841796875,
-0.0019168853759765625,
-0.00799560546875,
0.0101470947265625,
0.04180908203125,
0.002239227294921875,
-0.0628662109375,
0.0112762451171875,
0.0174713134765625,
0.0223846435546875,
-0.0102996826171875,
-0.0849609375,
0.0204620361328125,
0.0107574462890625,
-0.027496337890625,
-0.00006526708602905273,
0.0227508544921875,
0.0285186767578125,
0.04547119140625,
0.0506591796875,
0.0088348388671875,
0.01288604736328125,
0.006191253662109375,
0.044708251953125,
-0.04388427734375,
-0.04498291015625,
-0.05096435546875,
0.048675537109375,
-0.01165771484375,
-0.011505126953125,
0.036865234375,
0.06292724609375,
0.057952880859375,
-0.05810546875,
0.0565185546875,
-0.029876708984375,
0.03411865234375,
-0.0235748291015625,
0.0640869140625,
-0.05780029296875,
-0.00004607439041137695,
-0.040771484375,
-0.057037353515625,
-0.016143798828125,
0.040557861328125,
-0.00637054443359375,
0.0170135498046875,
0.037628173828125,
0.0614013671875,
-0.0312347412109375,
0.0003483295440673828,
0.006641387939453125,
-0.0007843971252441406,
0.0164642333984375,
0.0250701904296875,
0.06787109375,
-0.03302001953125,
0.03729248046875,
-0.0645751953125,
-0.00852203369140625,
-0.0067291259765625,
-0.061859130859375,
-0.05615234375,
-0.0198516845703125,
-0.052093505859375,
-0.052764892578125,
-0.0174102783203125,
0.057830810546875,
0.06719970703125,
-0.07232666015625,
-0.0201416015625,
-0.005344390869140625,
0.0171356201171875,
-0.0266265869140625,
-0.024871826171875,
0.036773681640625,
0.01255035400390625,
-0.08917236328125,
0.0017042160034179688,
-0.0010728836059570312,
0.037811279296875,
-0.00023448467254638672,
0.00930023193359375,
-0.007354736328125,
0.0022449493408203125,
0.026031494140625,
0.0281219482421875,
-0.074462890625,
-0.01152801513671875,
-0.0138092041015625,
0.0034732818603515625,
0.018585205078125,
0.033447265625,
-0.0298004150390625,
0.030975341796875,
0.044677734375,
0.01367950439453125,
0.040374755859375,
-0.0059356689453125,
0.004985809326171875,
-0.029571533203125,
0.022216796875,
0.008270263671875,
0.040802001953125,
0.012451171875,
-0.03741455078125,
0.033782958984375,
0.04986572265625,
-0.0173797607421875,
-0.04705810546875,
-0.005146026611328125,
-0.0841064453125,
-0.0252685546875,
0.07281494140625,
-0.015838623046875,
-0.053375244140625,
0.00801849365234375,
-0.0213623046875,
0.014129638671875,
-0.024566650390625,
0.0482177734375,
0.038055419921875,
-0.028411865234375,
-0.01947021484375,
-0.04705810546875,
0.01155853271484375,
-0.01087188720703125,
-0.04815673828125,
-0.0150604248046875,
0.0279541015625,
0.05206298828125,
0.04351806640625,
0.031524658203125,
-0.040802001953125,
0.01432037353515625,
0.0229949951171875,
0.03948974609375,
0.006740570068359375,
0.030548095703125,
-0.0140228271484375,
0.0094757080078125,
-0.0100250244140625,
-0.03826904296875
]
] |
Tap-M/Luna-AI-Llama2-Uncensored | 2023-07-26T19:31:12.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"license:cc-by-sa-4.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | Tap-M | null | null | Tap-M/Luna-AI-Llama2-Uncensored | 110 | 6,829 | transformers | 2023-07-19T09:16:29 | ---
license: cc-by-sa-4.0
---
<div style="width: 800px; margin: auto;">
<h2>Model Description</h2>
<p>“Luna AI Llama2 Uncensored” is a Llama2-based chat model <br />fine-tuned on over 40,000 long-form chat discussions.<br />
This model was fine-tuned by Tap, the creator of Luna AI.</p>
<h2>Model Training</h2>
<p>The fine-tuning process was performed on an 8x A100 80GB machine.
<br />The model was trained on synthetic outputs that include multiple rounds of chat between Human &amp; AI.
</p>
<a rel="noopener nofollow" href="https://huggingface.co/TheBloke/Luna-AI-Llama2-Uncensored-GPTQ">4bit GPTQ Version provided by @TheBloke - for GPU inference</a><br />
<a rel="noopener nofollow" href="https://huggingface.co/TheBloke/Luna-AI-Llama2-Uncensored-GGML">GGML Version provided by @TheBloke - For CPU inference</a>
<h2>Prompt Format</h2>
<p>The model follows the Vicuna 1.1 / OpenChat format:</p>
```
USER: I have difficulties in making friends, and I really need someone to talk to. Would you be my friend?
ASSISTANT: Of course! Friends are always here for each other. What do you like to do?
```
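A minimal helper (a sketch, not part of the original card; the function name is illustrative) that builds a single-turn prompt in this format:

```python
def build_luna_prompt(user_message: str) -> str:
    """Format a single-turn prompt in the Vicuna 1.1 / OpenChat style."""
    return f"USER: {user_message}\nASSISTANT:"

# Multi-turn chats simply append each completed exchange before the new turn.
prompt = build_luna_prompt("Would you be my friend?")
print(prompt)
```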
<h2>Benchmark Results</h2>
|Task|Version|Metric|Value|Stderr|
|---:|---:|---:|---:|---:|
|arc_challenge|0|acc_norm|0.5512|0.0146|
|hellaswag|0||||
|mmlu|1|acc_norm|0.46521|0.036|
|truthfulqa_mc|1|mc2|0.4716|0.0155|
|Average|-|-|0.5114|0.0150|
</div>
| 1,382 | [
[
-0.0233612060546875,
-0.0643310546875,
0.034637451171875,
0.0304412841796875,
-0.0303192138671875,
0.01244354248046875,
-0.00104522705078125,
-0.0528564453125,
0.0406494140625,
0.04290771484375,
-0.056671142578125,
-0.036285400390625,
-0.0462646484375,
-0.004222869873046875,
-0.023040771484375,
0.07025146484375,
0.00797271728515625,
-0.01629638671875,
0.00616455078125,
-0.002651214599609375,
-0.053375244140625,
-0.0215606689453125,
-0.07635498046875,
-0.0247802734375,
0.0050811767578125,
0.03173828125,
0.05255126953125,
0.03033447265625,
0.0234832763671875,
0.031463623046875,
-0.009735107421875,
0.0340576171875,
-0.0692138671875,
-0.0018987655639648438,
0.0040130615234375,
-0.033966064453125,
-0.05615234375,
0.0009899139404296875,
0.031494140625,
0.01001739501953125,
-0.01090240478515625,
0.0240936279296875,
0.0241241455078125,
0.04217529296875,
-0.0260009765625,
0.015655517578125,
-0.043365478515625,
-0.00269317626953125,
-0.01544189453125,
0.0211944580078125,
-0.014739990234375,
-0.0183258056640625,
-0.02325439453125,
-0.0762939453125,
0.002227783203125,
0.01605224609375,
0.080322265625,
0.04705810546875,
-0.0222625732421875,
-0.00649261474609375,
-0.04705810546875,
0.05084228515625,
-0.05255126953125,
0.00098419189453125,
0.05255126953125,
0.037353515625,
-0.0272979736328125,
-0.0540771484375,
-0.047393798828125,
-0.008056640625,
0.00450897216796875,
0.0226593017578125,
-0.031524658203125,
-0.0027675628662109375,
0.0032825469970703125,
0.0261077880859375,
-0.029083251953125,
0.0171356201171875,
-0.046539306640625,
-0.00937652587890625,
0.06304931640625,
0.0244140625,
0.01239776611328125,
-0.00077056884765625,
-0.02764892578125,
0.002368927001953125,
-0.016265869140625,
0.006816864013671875,
0.032745361328125,
0.0023059844970703125,
-0.046478271484375,
0.055419921875,
-0.03460693359375,
0.00862884521484375,
0.018524169921875,
-0.022735595703125,
0.0206451416015625,
-0.0173187255859375,
-0.0162353515625,
-0.01953125,
0.09478759765625,
0.06011962890625,
0.0217132568359375,
-0.0038928985595703125,
0.01483154296875,
0.0033588409423828125,
0.0156707763671875,
-0.0611572265625,
-0.0070343017578125,
0.025970458984375,
-0.030670166015625,
-0.032470703125,
0.0025081634521484375,
-0.055816650390625,
-0.040130615234375,
0.01788330078125,
0.0237579345703125,
-0.01238250732421875,
-0.037750244140625,
0.01763916015625,
0.0026798248291015625,
0.00009173154830932617,
0.03271484375,
-0.05828857421875,
0.037200927734375,
0.0299072265625,
0.0589599609375,
-0.00174713134765625,
-0.0134735107421875,
-0.0062713623046875,
0.00839996337890625,
-0.022857666015625,
0.032989501953125,
-0.01318359375,
-0.04144287109375,
-0.0253143310546875,
0.01003265380859375,
0.01013946533203125,
-0.03338623046875,
0.041778564453125,
-0.02020263671875,
0.01085662841796875,
-0.00855255126953125,
-0.02777099609375,
-0.00917816162109375,
-0.005847930908203125,
-0.04791259765625,
0.10357666015625,
0.000032782554626464844,
-0.05194091796875,
0.0001882314682006836,
-0.053924560546875,
0.0140228271484375,
-0.0107879638671875,
0.0080718994140625,
-0.01328277587890625,
-0.011322021484375,
0.00353240966796875,
0.034332275390625,
-0.04296875,
0.01049041748046875,
-0.039337158203125,
-0.0303955078125,
0.02703857421875,
-0.020416259765625,
0.05120849609375,
0.0106964111328125,
-0.029296875,
0.0085296630859375,
-0.061309814453125,
-0.00293731689453125,
0.0352783203125,
-0.007305145263671875,
-0.01146697998046875,
0.0016117095947265625,
-0.0034313201904296875,
-0.0007200241088867188,
0.044921875,
-0.051055908203125,
0.0173797607421875,
-0.01532745361328125,
0.03131103515625,
0.055877685546875,
-0.0099945068359375,
0.0214385986328125,
-0.01708984375,
0.021087646484375,
-0.0207366943359375,
0.043426513671875,
0.0237274169921875,
-0.06170654296875,
-0.059173583984375,
-0.0279388427734375,
-0.0147552490234375,
0.05487060546875,
-0.03460693359375,
0.038238525390625,
0.00860595703125,
-0.050506591796875,
-0.0714111328125,
0.0310821533203125,
0.042022705078125,
0.036651611328125,
0.0343017578125,
-0.024810791015625,
-0.035736083984375,
-0.07000732421875,
0.01393890380859375,
-0.02813720703125,
-0.0172882080078125,
0.043243408203125,
0.030670166015625,
-0.033905029296875,
0.06170654296875,
-0.041534423828125,
-0.0256195068359375,
-0.019073486328125,
-0.01174163818359375,
0.005245208740234375,
0.021453857421875,
0.04315185546875,
-0.039886474609375,
-0.049072265625,
-0.007030487060546875,
-0.0838623046875,
-0.022186279296875,
0.01493072509765625,
-0.038909912109375,
0.0013132095336914062,
0.025421142578125,
-0.057891845703125,
0.05889892578125,
0.052337646484375,
-0.037445068359375,
0.0308837890625,
-0.009063720703125,
0.004741668701171875,
-0.10400390625,
-0.013671875,
0.00048470497131347656,
-0.0161285400390625,
-0.0369873046875,
0.0018892288208007812,
-0.0292510986328125,
0.007732391357421875,
-0.0240631103515625,
0.061279296875,
-0.0255126953125,
-0.0008716583251953125,
-0.035247802734375,
0.0162506103515625,
0.00812530517578125,
0.040069580078125,
-0.007129669189453125,
0.056365966796875,
0.042633056640625,
-0.021575927734375,
0.051483154296875,
0.03753662109375,
-0.020538330078125,
0.01508331298828125,
-0.08160400390625,
0.02740478515625,
0.01221466064453125,
0.0175628662109375,
-0.0916748046875,
-0.0169219970703125,
0.043243408203125,
-0.04547119140625,
0.0093841552734375,
-0.001903533935546875,
-0.037628173828125,
-0.0241241455078125,
-0.0308074951171875,
0.0106658935546875,
0.045745849609375,
-0.0467529296875,
0.012603759765625,
0.04327392578125,
0.01161956787109375,
-0.0445556640625,
-0.037628173828125,
-0.0031585693359375,
-0.034332275390625,
-0.05938720703125,
0.025604248046875,
-0.01025390625,
-0.0204925537109375,
-0.00815582275390625,
0.0115814208984375,
-0.0202789306640625,
0.006500244140625,
0.056976318359375,
0.02239990234375,
-0.0010356903076171875,
-0.0189666748046875,
0.00395965576171875,
-0.01654052734375,
-0.0013399124145507812,
0.03125,
0.04888916015625,
-0.0226593017578125,
-0.0177154541015625,
-0.07073974609375,
0.0301055908203125,
0.034637451171875,
0.0093841552734375,
0.07183837890625,
0.05181884765625,
-0.018707275390625,
0.0012454986572265625,
-0.049774169921875,
-0.0112457275390625,
-0.04290771484375,
0.0226898193359375,
-0.01299285888671875,
-0.07904052734375,
0.06268310546875,
0.01247406005859375,
-0.003894805908203125,
0.0323486328125,
0.047088623046875,
-0.0027484893798828125,
0.06268310546875,
0.06097412109375,
-0.0208587646484375,
0.02191162109375,
-0.017608642578125,
-0.008941650390625,
-0.0750732421875,
-0.0265960693359375,
-0.0214996337890625,
-0.011077880859375,
-0.04437255859375,
-0.036407470703125,
0.016021728515625,
0.023773193359375,
-0.039642333984375,
0.030059814453125,
-0.049224853515625,
0.037872314453125,
0.0372314453125,
0.032745361328125,
0.025054931640625,
-0.025726318359375,
0.0189666748046875,
-0.0088043212890625,
-0.0208740234375,
-0.056243896484375,
0.07708740234375,
0.042999267578125,
0.06671142578125,
0.017974853515625,
0.037750244140625,
0.0203094482421875,
0.0023555755615234375,
-0.04107666015625,
0.052642822265625,
0.0125274658203125,
-0.06414794921875,
-0.025848388671875,
-0.01641845703125,
-0.06243896484375,
0.007476806640625,
-0.0240478515625,
-0.10552978515625,
-0.00884246826171875,
-0.003963470458984375,
-0.0143585205078125,
0.0267486572265625,
-0.045867919921875,
0.055572509765625,
0.00087738037109375,
-0.03643798828125,
-0.0146331787109375,
-0.042724609375,
0.04962158203125,
-0.00241851806640625,
0.0036678314208984375,
-0.035003662109375,
-0.0010595321655273438,
0.06231689453125,
-0.04559326171875,
0.08905029296875,
-0.0017061233520507812,
0.006328582763671875,
0.039276123046875,
-0.0006632804870605469,
0.031036376953125,
0.0308990478515625,
0.0007109642028808594,
0.0272674560546875,
-0.0034027099609375,
-0.0249176025390625,
-0.019073486328125,
0.0399169921875,
-0.09814453125,
-0.05340576171875,
-0.0168914794921875,
-0.0157623291015625,
-0.0040740966796875,
-0.0033512115478515625,
0.041290283203125,
-0.018829345703125,
-0.015167236328125,
0.01461029052734375,
0.0258331298828125,
-0.020965576171875,
0.022491455078125,
0.032073974609375,
-0.031463623046875,
-0.034759521484375,
0.04217529296875,
-0.01407623291015625,
0.01119232177734375,
0.007442474365234375,
-0.005245208740234375,
-0.02801513671875,
-0.0203857421875,
-0.032867431640625,
0.040008544921875,
-0.01953125,
-0.01100921630859375,
-0.0335693359375,
-0.02783203125,
-0.0160980224609375,
0.0005331039428710938,
-0.0662841796875,
-0.0260772705078125,
-0.04571533203125,
-0.0100860595703125,
0.060821533203125,
0.06646728515625,
0.0019102096557617188,
0.054656982421875,
-0.03314208984375,
0.03692626953125,
0.028472900390625,
0.024200439453125,
-0.0021953582763671875,
-0.049072265625,
-0.00971221923828125,
0.0128021240234375,
-0.03314208984375,
-0.0806884765625,
0.02508544921875,
0.00533294677734375,
0.04632568359375,
0.037261962890625,
0.015777587890625,
0.056732177734375,
-0.0168609619140625,
0.058502197265625,
0.009979248046875,
-0.054229736328125,
0.03857421875,
-0.023590087890625,
-0.0272369384765625,
0.042388916015625,
0.028045654296875,
-0.024078369140625,
-0.0293121337890625,
-0.039031982421875,
-0.0435791015625,
0.0599365234375,
0.0316162109375,
0.0212249755859375,
0.008819580078125,
0.038116455078125,
-0.0039520263671875,
0.0084228515625,
-0.0631103515625,
-0.035980224609375,
-0.0252685546875,
-0.0003840923309326172,
-0.00017690658569335938,
-0.032745361328125,
-0.01001739501953125,
-0.028778076171875,
0.05535888671875,
-0.00650787353515625,
0.040313720703125,
-0.00400543212890625,
0.0312042236328125,
-0.005153656005859375,
-0.00641632080078125,
0.02032470703125,
0.0255584716796875,
-0.02117919921875,
-0.033447265625,
0.02362060546875,
-0.05035400390625,
0.00417327880859375,
-0.0025081634521484375,
0.01258087158203125,
-0.0160369873046875,
0.010498046875,
0.07977294921875,
-0.007236480712890625,
-0.037750244140625,
0.029541015625,
-0.024261474609375,
-0.01453399658203125,
-0.0290069580078125,
0.0280303955078125,
0.024383544921875,
0.05023193359375,
0.016754150390625,
-0.036773681640625,
0.00333404541015625,
-0.031982421875,
-0.01479339599609375,
0.04571533203125,
-0.0159149169921875,
-0.0215606689453125,
0.07794189453125,
0.0063629150390625,
-0.0174713134765625,
0.05010986328125,
-0.01027679443359375,
-0.0212554931640625,
0.04937744140625,
0.04498291015625,
0.053192138671875,
-0.020172119140625,
0.02703857421875,
0.036773681640625,
0.025238037109375,
-0.00586700439453125,
0.0163726806640625,
0.0084991455078125,
-0.049957275390625,
-0.022979736328125,
-0.048614501953125,
-0.05377197265625,
0.037933349609375,
-0.04888916015625,
0.0188751220703125,
-0.036956787109375,
-0.0238800048828125,
-0.0011796951293945312,
0.01531982421875,
-0.0204315185546875,
-0.00568389892578125,
0.0005125999450683594,
0.05706787109375,
-0.055816650390625,
0.07696533203125,
0.032745361328125,
-0.02642822265625,
-0.06103515625,
-0.034027099609375,
0.00897979736328125,
-0.0821533203125,
0.00960540771484375,
0.0079345703125,
0.0004050731658935547,
-0.00982666015625,
-0.058837890625,
-0.0740966796875,
0.10467529296875,
0.032196044921875,
-0.023712158203125,
0.0004749298095703125,
0.0232696533203125,
0.046966552734375,
-0.026763916015625,
0.03265380859375,
0.018798828125,
0.014007568359375,
0.0238494873046875,
-0.0958251953125,
0.00931549072265625,
-0.049774169921875,
0.0038051605224609375,
-0.00289154052734375,
-0.09857177734375,
0.06524658203125,
-0.004077911376953125,
0.0017347335815429688,
0.03515625,
0.062042236328125,
0.0274810791015625,
0.03521728515625,
0.0211181640625,
0.037445068359375,
0.0582275390625,
-0.009674072265625,
0.055511474609375,
-0.019317626953125,
0.0269622802734375,
0.0709228515625,
-0.01314544677734375,
0.057647705078125,
0.011260986328125,
-0.0281524658203125,
0.028900146484375,
0.06890869140625,
-0.001743316650390625,
0.0301055908203125,
-0.00653076171875,
-0.015380859375,
-0.00797271728515625,
-0.00522613525390625,
-0.0369873046875,
0.029449462890625,
0.02703857421875,
-0.0093994140625,
0.002925872802734375,
-0.00855255126953125,
0.023345947265625,
-0.004039764404296875,
-0.00563812255859375,
0.067626953125,
0.0035457611083984375,
-0.021575927734375,
0.059814453125,
0.01222991943359375,
0.06121826171875,
-0.048370361328125,
-0.004795074462890625,
-0.03350830078125,
0.00960540771484375,
-0.01611328125,
-0.059539794921875,
0.02117919921875,
0.021392822265625,
0.0030117034912109375,
0.018707275390625,
0.04833984375,
-0.0038051605224609375,
-0.02142333984375,
0.04180908203125,
0.0281524658203125,
0.04144287109375,
-0.0103302001953125,
-0.059234619140625,
0.0170440673828125,
0.007503509521484375,
-0.0267333984375,
0.035614013671875,
0.0210113525390625,
-0.0098876953125,
0.06304931640625,
0.040802001953125,
-0.019439697265625,
0.0066680908203125,
-0.0236663818359375,
0.0804443359375,
-0.0291290283203125,
-0.0294189453125,
-0.059417724609375,
0.041534423828125,
-0.0125274658203125,
-0.044891357421875,
0.059234619140625,
0.037506103515625,
0.03656005859375,
0.00672149658203125,
0.0369873046875,
-0.0012798309326171875,
0.042022705078125,
-0.03515625,
0.056671142578125,
-0.05419921875,
0.0104827880859375,
-0.0183563232421875,
-0.0533447265625,
0.0199127197265625,
0.0572509765625,
0.00560760498046875,
-0.0010089874267578125,
0.0154571533203125,
0.06884765625,
0.0024127960205078125,
0.0036525726318359375,
0.01538848876953125,
0.0272216796875,
0.0172119140625,
0.057586669921875,
0.0655517578125,
-0.03887939453125,
0.038482666015625,
-0.035430908203125,
-0.027099609375,
-0.017608642578125,
-0.04083251953125,
-0.091796875,
-0.049072265625,
-0.01861572265625,
-0.0430908203125,
0.01245880126953125,
0.055206298828125,
0.0450439453125,
-0.03057861328125,
-0.03643798828125,
0.00899505615234375,
0.0032291412353515625,
0.009063720703125,
-0.0142364501953125,
0.01012420654296875,
0.0213775634765625,
-0.06304931640625,
0.033905029296875,
-0.00649261474609375,
0.0169219970703125,
-0.0183868408203125,
-0.026885986328125,
0.000743865966796875,
0.01418304443359375,
0.0257415771484375,
0.03961181640625,
-0.05596923828125,
-0.0157012939453125,
0.0009756088256835938,
-0.01395416259765625,
0.00836944580078125,
-0.021392822265625,
-0.034942626953125,
0.01513671875,
0.004150390625,
0.005496978759765625,
0.069580078125,
0.01226043701171875,
0.00933837890625,
-0.042083740234375,
0.0302276611328125,
-0.0163726806640625,
0.01910400390625,
0.01184844970703125,
-0.032958984375,
0.060760498046875,
0.01222991943359375,
-0.052032470703125,
-0.06341552734375,
-0.003170013427734375,
-0.09625244140625,
-0.006122589111328125,
0.09552001953125,
-0.0020999908447265625,
-0.01824951171875,
0.006587982177734375,
-0.047515869140625,
0.033782958984375,
-0.052032470703125,
0.066162109375,
0.046844482421875,
-0.023040771484375,
-0.00830841064453125,
-0.065673828125,
0.03704833984375,
-0.00732421875,
-0.06268310546875,
-0.012298583984375,
0.0303802490234375,
0.0146942138671875,
0.00925445556640625,
0.07073974609375,
0.00800323486328125,
0.00732421875,
0.00598907470703125,
0.0062713623046875,
-0.0191650390625,
-0.0188140869140625,
0.007495880126953125,
-0.00408935546875,
-0.0031452178955078125,
-0.039093017578125
]
] |
TheBloke/Spicyboros-70B-2.2-GPTQ | 2023-09-27T12:48:49.000Z | [
"transformers",
"safetensors",
"llama",
"text-generation",
"not-for-all-audiences",
"dataset:jondurbin/airoboros-2.2",
"license:llama2",
"text-generation-inference",
"region:us"
] | text-generation | TheBloke | null | null | TheBloke/Spicyboros-70B-2.2-GPTQ | 5 | 6,829 | transformers | 2023-09-11T19:24:17 | ---
license: llama2
tags:
- not-for-all-audiences
datasets:
- jondurbin/airoboros-2.2
model_name: Spicyboros 70B 2.2
base_model: jondurbin/spicyboros-70b-2.2
inference: false
model_creator: Jon Durbin
model_type: llama
prompt_template: "A chat.\nUSER: {prompt}\nASSISTANT: \n"
quantized_by: TheBloke
---
<!-- header start -->
<!-- 200823 -->
<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://i.imgur.com/EBdldam.jpg" alt="TheBlokeAI" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<div style="display: flex; justify-content: space-between; width: 100%;">
<div style="display: flex; flex-direction: column; align-items: flex-start;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://discord.gg/theblokeai">Chat & support: TheBloke's Discord server</a></p>
</div>
<div style="display: flex; flex-direction: column; align-items: flex-end;">
<p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://www.patreon.com/TheBlokeAI">Want to contribute? TheBloke's Patreon page</a></p>
</div>
</div>
<div style="text-align:center; margin-top: 0em; margin-bottom: 0em"><p style="margin-top: 0.25em; margin-bottom: 0em;">TheBloke's LLM work is generously supported by a grant from <a href="https://a16z.com">andreessen horowitz (a16z)</a></p></div>
<hr style="margin-top: 1.0em; margin-bottom: 1.0em;">
<!-- header end -->
# Spicyboros 70B 2.2 - GPTQ
- Model creator: [Jon Durbin](https://huggingface.co/jondurbin)
- Original model: [Spicyboros 70B 2.2](https://huggingface.co/jondurbin/spicyboros-70b-2.2)
<!-- description start -->
## Description
This repo contains GPTQ model files for [Jon Durbin's Spicyboros 70B 2.2](https://huggingface.co/jondurbin/spicyboros-70b-2.2).
Multiple GPTQ parameter permutations are provided; see Provided Files below for details of the options provided, their parameters, and the software used to create them.
<!-- description end -->
<!-- repositories-available start -->
## Repositories available
* [AWQ model(s) for GPU inference.](https://huggingface.co/TheBloke/Spicyboros-70B-2.2-AWQ)
* [GPTQ models for GPU inference, with multiple quantisation parameter options.](https://huggingface.co/TheBloke/Spicyboros-70B-2.2-GPTQ)
* [2, 3, 4, 5, 6 and 8-bit GGUF models for CPU+GPU inference](https://huggingface.co/TheBloke/Spicyboros-70B-2.2-GGUF)
* [Jon Durbin's original unquantised fp16 model in pytorch format, for GPU inference and for further conversions](https://huggingface.co/jondurbin/spicyboros-70b-2.2)
<!-- repositories-available end -->
<!-- prompt-template start -->
## Prompt template: Chat
```
A chat.
USER: {prompt}
ASSISTANT:
```
<!-- prompt-template end -->
<!-- README_GPTQ.md-provided-files start -->
## Provided files and GPTQ parameters
Multiple quantisation parameters are provided, to allow you to choose the best one for your hardware and requirements.
Each separate quant is in a different branch. See below for instructions on fetching from different branches.
All recent GPTQ files are made with AutoGPTQ, and all files in non-main branches are made with AutoGPTQ. Files in the `main` branch which were uploaded before August 2023 were made with GPTQ-for-LLaMa.
<details>
<summary>Explanation of GPTQ parameters</summary>
- Bits: The bit size of the quantised model.
- GS: GPTQ group size. Higher numbers use less VRAM, but have lower quantisation accuracy. "None" is the lowest possible value.
- Act Order: True or False. Also known as `desc_act`. True results in better quantisation accuracy. Some GPTQ clients have had issues with models that use Act Order plus Group Size, but this is generally resolved now.
- Damp %: A GPTQ parameter that affects how samples are processed for quantisation. 0.01 is default, but 0.1 results in slightly better accuracy.
- GPTQ dataset: The dataset used for quantisation. Using a dataset more appropriate to the model's training can improve quantisation accuracy. Note that the GPTQ dataset is not the same as the dataset used to train the model - please refer to the original model repo for details of the training dataset(s).
- Sequence Length: The length of the dataset sequences used for quantisation. Ideally this is the same as the model sequence length. For some very long sequence models (16+K), a lower sequence length may have to be used. Note that a lower sequence length does not limit the sequence length of the quantised model. It only impacts the quantisation accuracy on longer inference sequences.
- ExLlama Compatibility: Whether this file can be loaded with ExLlama, which currently only supports Llama models in 4-bit.
</details>
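These parameters are recorded in each branch's `quantize_config.json`, which Transformers/AutoGPTQ read automatically at load time. As a rough illustration (the JSON below is a hypothetical example mirroring the `main` branch row of the table below, not a file fetched from this repo), the mapping looks like:

```python
import json

# Hypothetical quantize_config.json contents, mirroring the `main` branch row:
# 4-bit, no group size, act-order enabled, damp 0.1.
raw = """
{
  "bits": 4,
  "group_size": -1,
  "desc_act": true,
  "damp_percent": 0.1
}
"""
cfg = json.loads(raw)

bits = cfg["bits"]              # "Bits" column
group_size = cfg["group_size"]  # "GS" column; -1 encodes "None" (no grouping)
act_order = cfg["desc_act"]     # "Act Order" column
damp = cfg["damp_percent"]      # "Damp %" column

print(bits, group_size, act_order, damp)
```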
| Branch | Bits | GS | Act Order | Damp % | GPTQ Dataset | Seq Len | Size | ExLlama | Desc |
| ------ | ---- | -- | --------- | ------ | ------------ | ------- | ---- | ------- | ---- |
| [main](https://huggingface.co/TheBloke/Spicyboros-70B-2.2-GPTQ/tree/main) | 4 | None | Yes | 0.1 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 35.33 GB | Yes | 4-bit, with Act Order. No group size, to lower VRAM requirements. |
| [gptq-4bit-32g-actorder_True](https://huggingface.co/TheBloke/Spicyboros-70B-2.2-GPTQ/tree/gptq-4bit-32g-actorder_True) | 4 | 32 | Yes | 0.1 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 40.66 GB | Yes | 4-bit, with Act Order and group size 32g. Gives highest possible inference quality, with maximum VRAM usage. |
| [gptq-4bit-64g-actorder_True](https://huggingface.co/TheBloke/Spicyboros-70B-2.2-GPTQ/tree/gptq-4bit-64g-actorder_True) | 4 | 64 | Yes | 0.1 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 37.99 GB | Yes | 4-bit, with Act Order and group size 64g. Uses less VRAM than 32g, but with slightly lower accuracy. |
| [gptq-4bit-128g-actorder_True](https://huggingface.co/TheBloke/Spicyboros-70B-2.2-GPTQ/tree/gptq-4bit-128g-actorder_True) | 4 | 128 | Yes | 0.1 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 36.65 GB | Yes | 4-bit, with Act Order and group size 128g. Uses even less VRAM than 64g, but with slightly lower accuracy. |
| [gptq-3bit--1g-actorder_True](https://huggingface.co/TheBloke/Spicyboros-70B-2.2-GPTQ/tree/gptq-3bit--1g-actorder_True) | 3 | None | Yes | 0.1 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 26.77 GB | No | 3-bit, with Act Order and no group size. Lowest possible VRAM requirements. May be lower quality than 3-bit 128g. |
| [gptq-3bit-128g-actorder_True](https://huggingface.co/TheBloke/Spicyboros-70B-2.2-GPTQ/tree/gptq-3bit-128g-actorder_True) | 3 | 128 | Yes | 0.1 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 28.03 GB | No | 3-bit, with group size 128g and act-order. Higher quality than 128g-False. |
<!-- README_GPTQ.md-provided-files end -->
<!-- README_GPTQ.md-download-from-branches start -->
## How to download from branches
- In text-generation-webui, you can add `:branch` to the end of the download name, eg `TheBloke/Spicyboros-70B-2.2-GPTQ:main`
- With Git, you can clone a branch with:
```
git clone --single-branch --branch main https://huggingface.co/TheBloke/Spicyboros-70B-2.2-GPTQ
```
- In Python Transformers code, the branch is the `revision` parameter; see below.
<!-- README_GPTQ.md-download-from-branches end -->
<!-- README_GPTQ.md-text-generation-webui start -->
## How to easily download and use this model in [text-generation-webui](https://github.com/oobabooga/text-generation-webui).
Please make sure you're using the latest version of [text-generation-webui](https://github.com/oobabooga/text-generation-webui).
It is strongly recommended to use the text-generation-webui one-click-installers unless you're sure you know how to make a manual install.
1. Click the **Model tab**.
2. Under **Download custom model or LoRA**, enter `TheBloke/Spicyboros-70B-2.2-GPTQ`.
- To download from a specific branch, enter for example `TheBloke/Spicyboros-70B-2.2-GPTQ:main`
- see Provided Files above for the list of branches for each option.
3. Click **Download**.
4. The model will start downloading. Once it's finished it will say "Done".
5. In the top left, click the refresh icon next to **Model**.
6. In the **Model** dropdown, choose the model you just downloaded: `Spicyboros-70B-2.2-GPTQ`
7. The model will automatically load, and is now ready for use!
8. If you want any custom settings, set them and then click **Save settings for this model** followed by **Reload the Model** in the top right.
* Note that you do not need to and should not set manual GPTQ parameters any more. These are set automatically from the file `quantize_config.json`.
9. Once you're ready, click the **Text Generation tab** and enter a prompt to get started!
<!-- README_GPTQ.md-text-generation-webui end -->
<!-- README_GPTQ.md-use-from-python start -->
## How to use this GPTQ model from Python code
### Install the necessary packages
Requires: Transformers 4.32.0 or later, Optimum 1.12.0 or later, and AutoGPTQ 0.4.2 or later.
```shell
pip3 install transformers>=4.32.0 optimum>=1.12.0
pip3 install auto-gptq --extra-index-url https://huggingface.github.io/autogptq-index/whl/cu118/ # Use cu117 if on CUDA 11.7
```
If you have problems installing AutoGPTQ using the pre-built wheels, install it from source instead:
```shell
pip3 uninstall -y auto-gptq
git clone https://github.com/PanQiWei/AutoGPTQ
cd AutoGPTQ
pip3 install .
```
### For CodeLlama models only: you must use Transformers 4.33.0 or later.
If 4.33.0 is not yet released when you read this, you will need to install Transformers from source:
```shell
pip3 uninstall -y transformers
pip3 install git+https://github.com/huggingface/transformers.git
```
### You can then use the following code
```python
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
model_name_or_path = "TheBloke/Spicyboros-70B-2.2-GPTQ"
# To use a different branch, change revision
# For example: revision="main"
model = AutoModelForCausalLM.from_pretrained(model_name_or_path,
device_map="auto",
trust_remote_code=False,
revision="main")
tokenizer = AutoTokenizer.from_pretrained(model_name_or_path, use_fast=True)
prompt = "Tell me about AI"
prompt_template=f'''A chat.
USER: {prompt}
ASSISTANT:
'''
print("\n\n*** Generate:")
input_ids = tokenizer(prompt_template, return_tensors='pt').input_ids.cuda()
output = model.generate(inputs=input_ids, temperature=0.7, do_sample=True, top_p=0.95, top_k=40, max_new_tokens=512)
print(tokenizer.decode(output[0]))
# Inference can also be done using transformers' pipeline
print("*** Pipeline:")
pipe = pipeline(
"text-generation",
model=model,
tokenizer=tokenizer,
max_new_tokens=512,
do_sample=True,
temperature=0.7,
top_p=0.95,
top_k=40,
repetition_penalty=1.1
)
print(pipe(prompt_template)[0]['generated_text'])
```
<!-- README_GPTQ.md-use-from-python end -->
<!-- README_GPTQ.md-compatibility start -->
## Compatibility
The files provided are tested to work with AutoGPTQ, both via Transformers and using AutoGPTQ directly. They should also work with [Occ4m's GPTQ-for-LLaMa fork](https://github.com/0cc4m/KoboldAI).
[ExLlama](https://github.com/turboderp/exllama) is compatible with Llama models in 4-bit. Please see the Provided Files table above for per-file compatibility.
[Huggingface Text Generation Inference (TGI)](https://github.com/huggingface/text-generation-inference) is compatible with all GPTQ models.
<!-- README_GPTQ.md-compatibility end -->
<!-- footer start -->
<!-- 200823 -->
## Discord
For further support, and discussions on these models and AI in general, join us at:
[TheBloke AI's Discord server](https://discord.gg/theblokeai)
## Thanks, and how to contribute
Thanks to the [chirper.ai](https://chirper.ai) team!
Thanks to Clay from [gpus.llm-utils.org](https://gpus.llm-utils.org)!
I've had a lot of people ask if they can contribute. I enjoy providing models and helping people, and would love to be able to spend even more time doing it, as well as expanding into new projects like fine tuning/training.
If you're able and willing to contribute it will be most gratefully received and will help me to keep providing more models, and to start work on new AI projects.
Donaters will get priority support on any and all AI/LLM/model questions and requests, access to a private Discord room, plus other benefits.
* Patreon: https://patreon.com/TheBlokeAI
* Ko-Fi: https://ko-fi.com/TheBlokeAI
**Special thanks to**: Aemon Algiz.
**Patreon special mentions**: Alicia Loh, Stephen Murray, K, Ajan Kanaga, RoA, Magnesian, Deo Leter, Olakabola, Eugene Pentland, zynix, Deep Realms, Raymond Fosdick, Elijah Stavena, Iucharbius, Erik Bjäreholt, Luis Javier Navarrete Lozano, Nicholas, theTransient, John Detwiler, alfie_i, knownsqashed, Mano Prime, Willem Michiel, Enrico Ros, LangChain4j, OG, Michael Dempsey, Pierre Kircher, Pedro Madruga, James Bentley, Thomas Belote, Luke @flexchar, Leonard Tan, Johann-Peter Hartmann, Illia Dulskyi, Fen Risland, Chadd, S_X, Jeff Scroggin, Ken Nordquist, Sean Connelly, Artur Olbinski, Swaroop Kallakuri, Jack West, Ai Maven, David Ziegler, Russ Johnson, transmissions 11, John Villwock, Alps Aficionado, Clay Pascal, Viktor Bowallius, Subspace Studios, Rainer Wilmers, Trenton Dambrowitz, vamX, Michael Levine, 준교 김, Brandon Frisco, Kalila, Trailburnt, Randy H, Talal Aujan, Nathan Dryer, Vadim, 阿明, ReadyPlayerEmma, Tiffany J. Kim, George Stoitzev, Spencer Kim, Jerry Meng, Gabriel Tamborski, Cory Kujawski, Jeffrey Morgan, Spiking Neurons AB, Edmond Seymore, Alexandros Triantafyllidis, Lone Striker, Cap'n Zoog, Nikolai Manek, danny, ya boyyy, Derek Yates, usrbinkat, Mandus, TL, Nathan LeClaire, subjectnull, Imad Khwaja, webtim, Raven Klaugh, Asp the Wyvern, Gabriel Puliatti, Caitlyn Gatomon, Joseph William Delisle, Jonathan Leane, Luke Pendergrass, SuperWojo, Sebastain Graf, Will Dee, Fred von Graf, Andrey, Dan Guido, Daniel P. Andersen, Nitin Borwankar, Elle, Vitor Caleffi, biorpg, jjj, NimbleBox.ai, Pieter, Matthew Berman, terasurfer, Michael Davis, Alex, Stanislav Ovsiannikov
Thank you to all my generous patrons and donaters!
And thank you again to a16z for their generous grant.
<!-- footer end -->
# Original model card: Jon Durbin's Spicyboros 70B 2.2
### Overview
__Usage restriction: To use this model, you must agree to the following:__
- Some of the content that can be produced is "toxic"/"harmful", and contains profanity and other types of sensitive content.
- None of the content or views contained in the dataset or generated outputs necessarily align with my personal beliefs or opinions; they are simply text generated by LLMs and/or scraped from the web.
- Use with extreme caution, particularly in locations with less-than-free speech laws.
- You, and you alone, are responsible for having downloaded and generated outputs with the model, and I am completely indemnified from any and all liabilities.
__Ok, now that the warning is out of the way...__
Another experimental model, using mostly synthetic data generated by [airoboros](https://github.com/jondurbin/airoboros).
Highlights:
- The prompt format has changed! It is now newlines instead of spaces between system/USER/ASSISTANT (see prompt info below).
- This version also includes "de-alignment" data, to enable less savory interactions and outputs.
- To learn more about the dataset, see: https://hf.co/datasets/jondurbin/airoboros-2.2 (this is the instructions.jsonl file, not instructions-clean.jsonl)
- I re-generated all of the outputs in the dataset that had "Once upon a time" so they'd be less cliche - no guarantees that won't still happen, but in theory it may happen less.
- More multiple-choice data, better awareness, and some alignment for normal use cases (overridable via the system prompt), etc.
__WARNING: This model will gladly spew profane and otherwise NSFW content, if asked, use with care.__
Breakdown of the training data:
| Count | Category |
|--------|----------------------------|
| 60 | quiz |
| 63 | card |
| 100 | detailed\_writing |
| 103 | experience |
| 114 | greeting |
| 200 | song |
| 204 | editor |
| 250 | counterfactual\_contextual |
| 268 | cot |
| 339 | theory\_of\_mind |
| 460 | misconception |
| 500 | summarization |
| 573 | awareness |
| 715 | riddle |
| 719 | agent |
| 800 | plan |
| 873 | gtkm |
| 966 | rp |
| 1000 | stylized\_response |
| 1000 | wordgame |
| 1279 | multiple\_choice |
| 1641 | joke |
| 1785 | writing |
| 2155 | contextual |
| 2364 | roleplay |
| 2508 | trivia |
| 5216 | general |
| 5779 | coding |
| 11367 | orca |
In other words, it's a fairly general purpose model, but focuses fairly heavily on instruction response pairs rather than casual chat/roleplay.
*Why do I try to remove censorship?*
- laws vary widely based on time and location
- language models may conflate certain words with laws, e.g. a model may think "stealing eggs from a chicken" is illegal
- these models just produce text; what you do with that text is your responsibility
- many people and industries deal with "sensitive" content; imagine if a court stenographer's equipment filtered illegal content - it would be useless
Huge thank you to the folks over at [a16z](https://a16z.com/) for sponsoring the costs associated with building models and associated tools!
### Prompt format
The prompt format:
```
A chat.
USER: {prompt}
ASSISTANT:
```
The default system prompt ("A chat.") was used for most of the prompts; however, the training data also included a wide sampling of responses with other system prompts, particularly in "stylized\_response", "rp", "gtkm", etc.
Here's another example:
```
A chat between Bob (aka USER) and Tom (aka ASSISTANT). Tom is an extremely intelligent 18th century bookkeeper, who speaks loquaciously.
USER: {prompt}
ASSISTANT:
```
And a chat scenario that doesn't require USER/ASSISTANT (but should use stopping criteria to prevent the model from speaking on your behalf):
```
A chat between old friends: Timmy and Tommy.
{description of characters}
{setting for the chat}
Timmy: *takes a big sip from his coffee* "Ah, sweet, delicious, magical coffee."
Tommy:
```
__*I strongly suggest adding stopping criteria/early inference stopping on "USER:", and/or whatever names you specify in the system prompt.*__
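The prompt assembly and the suggested early stopping can be sketched in a few lines of plain Python. This is a minimal sketch; the helper names below are my own, not part of the model or any library:

```python
def build_prompt(user_message, system_prompt="A chat."):
    # Spicyboros 2.2 separates the system prompt, USER:, and
    # ASSISTANT: segments with newlines (not spaces).
    return f"{system_prompt}\nUSER: {user_message}\nASSISTANT:"

def trim_at_stop(generated, stop_strings=("USER:",)):
    # Crude post-hoc stopping: cut the completion at the first stop
    # string so the output doesn't include turns the model generated
    # on your behalf. Prefer real stopping criteria in your inference
    # stack when available.
    for stop in stop_strings:
        idx = generated.find(stop)
        if idx != -1:
            generated = generated[:idx]
    return generated.rstrip()

print(build_prompt("Why is the sky blue?"))
# -> A chat.
#    USER: Why is the sky blue?
#    ASSISTANT:
```

If your inference library supports stop sequences natively, pass "USER:" (and any character names from your system prompt) there instead of trimming after the fact.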
### Fine-tuning details
https://gist.github.com/jondurbin/51a336c582a224de197ba1d2c6b1da97
*Note: I used checkpoint 750 for final model!*
### Helpful usage tips
*The prompts shown here are just the text that would be included after USER: and before ASSISTANT: in the full prompt format above; the system prompt and USER:/ASSISTANT: have been omitted for readability.*
#### Context obedient question answering
By obedient, I mean the model was trained to ignore what it thinks it knows and use the context to answer the question. The model was also tuned to limit its answers to the provided context as much as possible to reduce hallucinations.
The format for a closed-context prompt is as follows:
```
BEGININPUT
BEGINCONTEXT
[key0: value0]
[key1: value1]
... other metadata ...
ENDCONTEXT
[insert your text blocks here]
ENDINPUT
[add as many other blocks, in the exact same format]
BEGININSTRUCTION
[insert your instruction(s). The model was tuned with single questions, paragraph format, lists, etc.]
ENDINSTRUCTION
```
It's also helpful to add "Don't make up answers if you don't know." to your instruction block to make sure if the context is completely unrelated it doesn't make something up.
*The __only__ prompts that need this closed-context formatting are closed-context instructions. Normal questions/instructions do not!*
I know it's a bit verbose and annoying, but after much trial and error, using these explicit delimiters helps the model understand where to find the responses and how to associate specific sources with them.
- `BEGININPUT` - denotes a new input block
- `BEGINCONTEXT` - denotes the block of context (metadata key/value pairs) to associate with the current input block
- `ENDCONTEXT` - denotes the end of the metadata block for the current input
- [text] - Insert whatever text you want for the input block, as many paragraphs as can fit in the context.
- `ENDINPUT` - denotes the end of the current input block
- [repeat as many input blocks in this format as you want]
- `BEGININSTRUCTION` - denotes the start of the instruction(s) to respond to for all of the input blocks above.
- [instruction(s)]
- `ENDINSTRUCTION` - denotes the end of instruction set
It sometimes works without `ENDINSTRUCTION`, but by explicitly including that in the prompt, the model better understands that all of the instructions in the block should be responded to.
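The block structure above can be assembled programmatically. Here is a minimal sketch (the helper name and argument shapes are my own, for illustration only):

```python
def build_closed_context_prompt(blocks, instruction):
    """Assemble the BEGININPUT/BEGINCONTEXT prompt format described above.

    `blocks` is a list of (metadata, text) pairs, where metadata is a
    dict of key/value context entries (date, url, etc.).
    """
    parts = []
    for metadata, text in blocks:
        parts.append("BEGININPUT")
        parts.append("BEGINCONTEXT")
        for key, value in metadata.items():
            parts.append(f"{key}: {value}")
        parts.append("ENDCONTEXT")
        parts.append(text)
        parts.append("ENDINPUT")
    parts.append("BEGININSTRUCTION")
    parts.append(instruction)
    parts.append("ENDINSTRUCTION")
    return "\n".join(parts)

prompt = build_closed_context_prompt(
    [({"date": "2021-01-01", "url": "https://web.site/123"},
      "In a shocking turn of events, blueberries are now green.")],
    "What color are blueberries? Source?",
)
```

Each additional (metadata, text) pair simply produces another BEGININPUT...ENDINPUT block before the single instruction section.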
Here's a trivial, but important example to prove the point:
```
BEGININPUT
BEGINCONTEXT
date: 2021-01-01
url: https://web.site/123
ENDCONTEXT
In a shocking turn of events, blueberries are now green, but will be sticking with the same name.
ENDINPUT
BEGININSTRUCTION
What color are blueberries? Source?
ENDINSTRUCTION
```
And the response:
```
Blueberries are now green.
Source:
date: 2021-01-01
url: https://web.site/123
```
#### Coding
You can ask for fairly complex coding instructions with multiple criteria, e.g.:
```
Create a python application with the following requirements:
- Asyncio FastAPI webserver
- ping endpoint that returns the current date in JSON format
- file upload endpoint, which calculates the file's sha256 checksum, and checks postgres to deduplicate
```
Or inline criteria:
```
Write a multi-threaded TCP server in C that accepts a "GET [key]" input and "SET [key] [value]" input, and uses a binary tree to get and store the input values.
```
You can also optionally add a single space and "PLAINFORMAT" at the end of your prompt to avoid backticks, explanations, etc. and just print the code, e.g.:
```
Write a websocket application in node.js. PLAINFORMAT
```
#### Agent/function calling
The dataset includes many examples of function/args generation based on input criteria. This is somewhat similar to OpenAI's function calling, but the output is either JSON or YAML.
Example prompt:
```
As an AI assistant, please select the most suitable function and parameters from the list of available functions below, based on the user's input. Provide your response in JSON format.
Input: I want to know how many times 'Python' is mentioned in my text file.
Available functions:
file_analytics:
  description: This tool performs various operations on a text file.
  params:
    action: The operation we want to perform on the data, such as "count_occurrences", "find_line", etc.
    filters:
      keyword: The word or phrase we want to search for.
```
Response:
```json
{
  "function": "file_analytics",
  "params": {
    "action": "count_occurrences",
    "filters": {
      "keyword": "Python"
    }
  }
}
```
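On the application side, you still have to parse that JSON and route it to a real implementation. A minimal dispatch sketch follows; the local `file_analytics` implementation, `SAMPLE_TEXT`, and registry are all hypothetical stand-ins:

```python
import json

SAMPLE_TEXT = "Python is great. I use Python daily."  # stand-in for the user's file

def file_analytics(action, filters):
    # Hypothetical local implementation backing the advertised function.
    if action == "count_occurrences":
        return SAMPLE_TEXT.count(filters["keyword"])
    raise ValueError(f"unsupported action: {action}")

FUNCTIONS = {"file_analytics": file_analytics}

def dispatch(model_output):
    # Parse the model's JSON response and call the selected function
    # with the parameters it chose.
    call = json.loads(model_output)
    return FUNCTIONS[call["function"]](**call["params"])

response = """{"function": "file_analytics",
 "params": {"action": "count_occurrences",
            "filters": {"keyword": "Python"}}}"""
print(dispatch(response))  # -> 2
```

In practice you would also want to validate the parsed JSON against each function's expected schema before calling it, since the model's output is not guaranteed to be well formed.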
#### Chain-of-thought
You can ask for several possible responses to a given problem, with a ranking and final answer selection.
Example prompt:
```
A group of people decide to play a round-robin tournament where every player plays every other player exactly once. If a total of 45 games were played, how many players participated in the tournament? Offer several potential answers, rank them according to mathematical logic, and pick the most feasible one.
```
Example response:
```
Solution 1:
In a round-robin tournament, each player plays every other player exactly once. This means that the total number of games played is equal to the combination of all players taken two at a time (since a game involves two players).
The formula for combinations is nC2 = n(n-1)/2, where n is the total number of players. We know that nC2 equals 45 (the total number of games), so we can set up the equation as follows:
n(n-1)/2 = 45
Solving this equation gives us n=10.
Final answer: There were 10 players in the tournament.
Solution 2:
Another way to approach this problem is by using the concept of triangular numbers. Triangular numbers are a sequence of numbers where each number can be represented as a triangle with dots. The nth triangular number is given by the sum of the first n natural numbers.
If we consider each game as a dot and each player as a side of the triangle, then the total number of games (45) would correspond to the 9th triangular number because 1+2+3+4+5+6+7+8+9=45. However, since each side of the triangle represents a player, and there's one more player than sides in our model (because the last player has no one left to play against), we need to add one more to get the total number of players.
Final answer: There were 10 players in the tournament.
Solution 3:
We could also use trial and error to solve this problem. Starting from a small number of players, we calculate how many games would be played in a round-robin tournament until we reach 45 games. For example, with 2 players, only 1 game is played; with 3 players, 3 games are played; with 4 players, 6 games are played, and so on. Continuing this process, we find that with 10 players, 45 games are played.
Final answer: There were 10 players in the tournament.
Ranking of solutions from best to worst:
1. Solution 1: This solution uses direct mathematical reasoning based on the properties of a round-robin tournament and solves an equation to find the answer.
2. Solution 2: This solution uses the concept of triangular numbers which requires additional steps and knowledge beyond the immediate problem context, but still arrives at the correct answer.
3. Solution 3: This solution relies on trial and error, which is less efficient and reliable than the other methods, but it eventually gets to the right answer.
Best and final answer: There were 10 players in the tournament.
```
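The arithmetic behind the example is easy to verify directly; the snippet below checks Solution 1's formula and replays Solution 3's trial-and-error search:

```python
def games_in_round_robin(n):
    # Each of n players plays every other player once: n*(n-1)/2 games.
    return n * (n - 1) // 2

# Solution 1: n(n-1)/2 = 45 holds for n = 10.
assert games_in_round_robin(10) == 45

# Solution 3: search upward until we hit 45 games.
n = 2
while games_in_round_robin(n) < 45:
    n += 1
print(n)  # -> 10
```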
#### reWOO style execution planning
The model now supports execution planning for complex instructions that would require making use of several tools. The output is just the plan; you must implement a mechanism to parse the output and actually call the functions!
Example prompt:
```
Please construct a systematic plan to generate an optimal response to the user instruction, utilizing a set of provided tools. Each plan will correspond to an evidence value, which will be the output of one of the available functions given an input string
that could be the user's question, one or more prior evidence values, or a combination of both.
Here are the tools available to be called:
DuckDuckGo[input]: Tool that enables the user to search for information using the DuckDuckGo search engine. This tool is helpful in locating a suitable list of sites that may contain the answer to the user's query. It does not directly answer the question but finds a list of sites that might have the answer.
TextScraper[input]: Load one or more websites from the input string containing newline delimited links, where input is one or more links, and produces plain text output containing the content of the links.
HyperlinkExtractor[input]: Extract links from plain text and produces a plain text, newline delimited response of links.
KnowledgeModel[input]: Question answering language model, particularly useful in answering questions based on an input passage of text. The input must be a text question that references an :evidence[n]: variable, e.g. What color is the cat, given :evidence1:?
The output should be in format:
Plan: [first action to take based on input question]
:evidence0: = [function to call with input parameter]
Plan: [next action to take, based on result of :evidence0:]
:evidence1: = [next function to call and input parameter, which may include reference to previous evidence, e.g. "Given context :evidence0"]
...
Answer: [:evidence[n]: containing the final answer.]
Question: Who won the Nobel Prize in Literature in 2020 and what is their most famous work?
```
Response:
```
Plan: Begin by conducting a web search to find out who won the Nobel Prize in Literature in 2020.
:evidence0: = DuckDuckGo[Nobel Prize in Literature 2020 winner]
Plan: Utilize the KnowledgeModel to interpret the search results and find the name of the laureate.
:evidence1: = KnowledgeModel[Who won the Nobel Prize in Literature in 2020, given :evidence0:?]
Plan: Conduct another web search to find the most famous work of the identified laureate.
:evidence2: = DuckDuckGo[Most famous work of :evidence1:]
Plan: Extract the relevant links from the DuckDuckGo search results for a more focused search.
:evidence3: = HyperlinkExtractor[:evidence2:]
Plan: Use the TextScraper tool to extract information from the relevant links.
:evidence4: = TextScraper[:evidence3:]
Plan: Finally, utilize the KnowledgeModel to identify and summarize the most famous work of the laureate from the extracted information.
:evidence5: = KnowledgeModel[What is the most famous work of :evidence1:, given :evidence4:?]
Answer: :evidence5:
```
For this to be useful, you'd have to parse the output plan text and implement/call each of the functions. This is just pseudo-code, completely untested off the top of my head, and obviously would require full implementation + hardening:
```python
import re
import requests

def inject_context(input_text, **context):
    for ref in set(re.findall(r"(:evidence[0-9]+:)", input_text, re.I)):
        input_text = input_text.replace(ref, context.get(ref, ""))
    return input_text

def duckduckgo(input_text, **context):
    search_string = inject_context(input_text, **context)
    # ... search via DuckDuckGo using search_string
    # ... return text content

def link_extractor(input_text, **context):
    input_text = inject_context(input_text, **context)
    return "\n".join(list(set(re.findall(r"(https?://[^\s]+?\.?)", input_text, re.I))))

def scrape(input_text, **context):
    input_text = inject_context(input_text, **context)
    text = []
    for link in input_text.splitlines():
        text.append(requests.get(link).text)
    return "\n".join(text)

def infer(input_text, **context):
    prompt = inject_context(input_text, **context)
    # ... call model with prompt, return output

def parse_plan(plan):
    method_map = {
        "DuckDuckGo": duckduckgo,
        "HyperlinkExtractor": link_extractor,
        "KnowledgeModel": infer,
        "TextScraper": scrape,
    }
    context = {}
    for line in plan.strip().splitlines():
        if line.startswith("Plan:"):
            print(line)
            continue
        parts = re.match(r"^(:evidence[0-9]+:)\s*=\s*([^\[]+)(\[.*\])\s*$", line, re.I)
        if not parts:
            if line.startswith("Answer: "):
                return context.get(line.split(" ")[-1].strip(), "Answer couldn't be generated...")
            raise RuntimeError("bad format: " + line)
        context[parts.group(1)] = method_map[parts.group(2)](parts.group(3), **context)
```
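As a sanity check on the overall idea, here is a self-contained miniature of the same plan-execution loop, with stub tools in place of real search/scrape/inference calls. All tool names, outputs, and the sample plan below are made up for illustration:

```python
import re

def inject_context(text, context):
    # Replace each :evidenceN: reference with its previously computed value.
    for ref in set(re.findall(r"(:evidence[0-9]+:)", text, re.I)):
        text = text.replace(ref, context.get(ref, ""))
    return text

# Stub tools standing in for real search/inference calls.
TOOLS = {
    "DuckDuckGo": lambda q: "Louise Gluck won the 2020 prize.",
    "KnowledgeModel": lambda q: "Louise Gluck" if "Who won" in q else "The Wild Iris",
}

def run_plan(plan):
    context = {}
    for line in plan.strip().splitlines():
        if line.startswith("Plan:"):
            continue
        match = re.match(r"^(:evidence[0-9]+:)\s*=\s*(\w+)\[(.*)\]\s*$", line)
        if match:
            name, tool, arg = match.groups()
            context[name] = TOOLS[tool](inject_context(arg, context))
        elif line.startswith("Answer:"):
            return context.get(line.split("Answer:", 1)[1].strip(), "")
    return ""

plan = """Plan: Search for the winner.
:evidence0: = DuckDuckGo[Nobel Prize in Literature 2020 winner]
Plan: Extract the name.
:evidence1: = KnowledgeModel[Who won, given :evidence0:?]
Answer: :evidence1:"""
print(run_plan(plan))  # -> Louise Gluck
```

The key moving parts are the same as above: a regex that splits each step into an evidence label, a tool name, and an argument; context substitution of earlier evidence values; and a final lookup for the Answer line.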
### Contribute
If you're interested in new functionality, particularly a new "instructor" type to generate a specific type of training data,
take a look at the dataset generation tool repo: https://github.com/jondurbin/airoboros and either make a PR or open an issue with details.
To help me with the OpenAI/compute costs:
- https://bmc.link/jondurbin
- ETH 0xce914eAFC2fe52FdceE59565Dd92c06f776fcb11
- BTC bc1qdwuth4vlg8x37ggntlxu5cjfwgmdy5zaa7pswf
### Licence and usage restrictions
The airoboros 2.2 models are built on top of llama-2/codellama.
The llama-2 base model has a custom Meta license:
- See the [meta-license/LICENSE.txt](meta-license/LICENSE.txt) file attached for the original license provided by Meta.
- See also [meta-license/USE_POLICY.md](meta-license/USE_POLICY.md) and [meta-license/Responsible-Use-Guide.pdf](meta-license/Responsible-Use-Guide.pdf), also provided by Meta.
The fine-tuning data was mostly generated by OpenAI API calls to gpt-4, via [airoboros](https://github.com/jondurbin/airoboros).
The ToS for OpenAI API usage has a clause preventing the output from being used to train a model that __competes__ with OpenAI.
- what does *compete* actually mean here?
- these small open source models will not produce output anywhere near the quality of gpt-4, or even gpt-3.5, so I can't imagine this could credibly be considered competing in the first place
- if someone else uses the dataset to do the same, they wouldn't necessarily be violating the ToS because they didn't call the API, so I don't know how that works
- the training data used in essentially all large language models includes a significant amount of copyrighted or otherwise non-permissive licensing in the first place
- other work using the self-instruct method, e.g. the original here: https://github.com/yizhongw/self-instruct released the data and model as apache-2
I am purposely leaving this license ambiguous (other than the fact you must comply with the Meta original license for llama-2) because I am not a lawyer and refuse to attempt to interpret all of the terms accordingly.
Your best bet is probably to avoid using this commercially due to the OpenAI API usage.
Either way, by using this model, you agree to completely indemnify me.
license: llama2
datasets:
- gsm8k
- competition_math
language:
- en
metrics:
- exact_match
library_name: transformers
pipeline_tag: text-generation
tags:
- code
- math
---
<h1 align="center">
ToRA: A Tool-Integrated Reasoning Agent <br> for Mathematical Problem Solving
</h1>
<p align="center">
<a href="https://microsoft.github.io/ToRA/"><b>[🌐 Website]</b></a> •
<a href="https://arxiv.org/abs/2309.17452"><b>[📜 Paper]</b></a> •
<a href="https://huggingface.co/llm-agents"><b>[🤗 HF Models]</b></a> •
<a href="https://github.com/microsoft/ToRA"><b>[🐱 GitHub]</b></a>
<br>
<a href="https://twitter.com/zhs05232838/status/1708860992631763092"><b>[🐦 Twitter]</b></a> •
<a href="https://www.reddit.com/r/LocalLLaMA/comments/1703k6d/tora_a_toolintegrated_reasoning_agent_for/"><b>[💬 Reddit]</b></a> •
<a href="https://notes.aimodels.fyi/researchers-announce-tora-training-language-models-to-better-understand-math-using-external-tools/">[🍀 Unofficial Blog]</a>
<!-- <a href="#-quick-start">Quick Start</a> • -->
<!-- <a href="#%EF%B8%8F-citation">Citation</a> -->
</p>
<p align="center">
Repo for "<a href="https://arxiv.org/abs/2309.17452" target="_blank">ToRA: A Tool-Integrated Reasoning Agent for Mathematical Problem Solving</a>"
</p>
## 🔥 News
- [2023/10/08] 🔥🔥🔥 All ToRA models released at [HuggingFace](https://huggingface.co/llm-agents)!!!
- [2023/09/29] ToRA paper, repo, and website released.
## 💡 Introduction
ToRA is a series of Tool-integrated Reasoning Agents designed to solve challenging mathematical reasoning problems by interacting with tools, e.g., computation libraries and symbolic solvers. The ToRA series seamlessly integrates natural language reasoning with the use of external tools, combining the analytical power of language with the computational efficiency of tools.
| Model | Size | GSM8k | MATH | AVG@10 math tasks<sup>†</sup> |
|---|---|---|---|---|
| GPT-4 | - | 92.0 | 42.5 | 78.3 |
| GPT-4 (PAL) | - | 94.2 | 51.8 | 86.4 |
| [ToRA-7B](https://huggingface.co/llm-agents/tora-7b-v1.0) | 7B | 68.8 | 40.1 | 62.4|
| [ToRA-Code-7B](https://huggingface.co/llm-agents/tora-code-7b-v1.0) | 7B | 72.6 | 44.6 | 66.5|
| [ToRA-13B](https://huggingface.co/llm-agents/tora-13b-v1.0) | 13B | 72.7 | 43.0 | 65.9|
| [ToRA-Code-13B](https://huggingface.co/llm-agents/tora-code-13b-v1.0) | 13B | 75.8 | 48.1 | 71.3 |
| [ToRA-Code-34B<sup>*</sup>](https://huggingface.co/llm-agents/tora-code-34b-v1.0) | 34B | 80.7 | **51.0** | 74.8 |
| [ToRA-70B](https://huggingface.co/llm-agents/tora-70b-v1.0) | 70B | **84.3** | 49.7 | **76.9** |
- <sup>*</sup>ToRA-Code-34B is currently the first and only open-source model to achieve over 50% accuracy (pass@1) on the MATH dataset, which significantly outperforms GPT-4’s CoT result (51.0 vs. 42.5), and is competitive with GPT-4 solving problems with programs. By open-sourcing our codes and models, we hope more breakthroughs will come!
- <sup>†</sup>10 math tasks include GSM8k, MATH, GSM-Hard, SVAMP, TabMWP, ASDiv, SingleEQ, SingleOP, AddSub, and MultiArith.
## ⚡️ Training
The models are trained on ToRA-Corpus 16k, which contains tool-integrated reasoning trajectories for MATH and GSM8k generated by GPT-4.
We use imitation learning (i.e., SFT) to fine-tune the models, and then apply our proposed *output space shaping* to improve tool-integrated reasoning behaviors. Please refer to the [paper](https://arxiv.org/pdf/2309.17452.pdf) for more details.
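As a minimal illustration of the SFT step described above (a sketch of a common convention, not the repo's actual training pipeline): during imitation learning, the cross-entropy loss is typically computed only on the target trajectory, with prompt positions masked out using the `-100` ignore index that Hugging Face Transformers uses for causal-LM labels.

```python
def build_sft_labels(prompt_ids: list[int], target_ids: list[int]) -> list[int]:
    """Mask prompt positions with -100 so the loss is computed only on the
    tool-integrated reasoning trajectory (the target). -100 is the ignore
    index used by the cross-entropy loss in Hugging Face Transformers."""
    return [-100] * len(prompt_ids) + list(target_ids)

# e.g. a 3-token question followed by a 2-token trajectory
labels = build_sft_labels([101, 2054, 102], [7592, 102])
print(labels)  # [-100, -100, -100, 7592, 102]
```

The token IDs here are placeholders; in practice the labels are built from the tokenized ToRA-Corpus examples.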
## 🪁 Inference & Evaluation
Please refer to ToRA's [GitHub repo](https://github.com/microsoft/ToRA) for inference, evaluation, and training code.
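For a quick local try-out outside the official evaluation code, the model can be loaded with the standard Hugging Face Transformers API. Note the prompt wrapper below is a hypothetical placeholder — the exact prompt format ToRA expects is defined in the GitHub repo and should be used for real runs:

```python
def build_prompt(question: str) -> str:
    # Hypothetical wrapper for illustration only; use the official
    # template from the ToRA repo for faithful reproduction.
    return f"Question: {question}\n\nSolution:\n"

# Loading and generation sketch (a 13B model needs roughly 26 GB of memory in fp16):
# from transformers import AutoModelForCausalLM, AutoTokenizer
# tokenizer = AutoTokenizer.from_pretrained("llm-agents/tora-13b-v1.0")
# model = AutoModelForCausalLM.from_pretrained(
#     "llm-agents/tora-13b-v1.0", torch_dtype="auto", device_map="auto"
# )
# inputs = tokenizer(build_prompt("What is 12 * 7?"), return_tensors="pt").to(model.device)
# outputs = model.generate(**inputs, max_new_tokens=256)
# print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```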
## ☕️ Citation
If you find this repository helpful, please consider citing our paper:
```bibtex
@misc{gou2023tora,
title={ToRA: A Tool-Integrated Reasoning Agent for Mathematical Problem Solving},
author={Zhibin Gou and Zhihong Shao and Yeyun Gong and yelong shen and Yujiu Yang and Minlie Huang and Nan Duan and Weizhu Chen},
year={2023},
eprint={2309.17452},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` | 4,077 | [
[
-0.02825927734375,
-0.059600830078125,
0.05035400390625,
0.033447265625,
-0.0160980224609375,
0.032958984375,
0.0155029296875,
-0.02618408203125,
0.037445068359375,
0.03289794921875,
-0.04925537109375,
-0.0202789306640625,
-0.03253173828125,
0.00931549072265625,
0.00647735595703125,
0.032012939453125,
0.0037899017333984375,
0.017242431640625,
-0.016143798828125,
-0.0220794677734375,
-0.0523681640625,
-0.0262298583984375,
-0.035736083984375,
-0.0271759033203125,
-0.00936126708984375,
-0.007274627685546875,
0.06890869140625,
0.037017822265625,
0.02850341796875,
0.029022216796875,
0.00009393692016601562,
0.0361328125,
-0.0015811920166015625,
-0.007183074951171875,
-0.0048828125,
-0.035614013671875,
-0.053497314453125,
0.01160430908203125,
0.0484619140625,
0.0274505615234375,
-0.0171661376953125,
0.004871368408203125,
-0.01453399658203125,
0.031463623046875,
-0.030731201171875,
0.02191162109375,
-0.0206298828125,
-0.0171356201171875,
-0.0010232925415039062,
-0.004467010498046875,
-0.046417236328125,
-0.035736083984375,
-0.0007176399230957031,
-0.0726318359375,
-0.01149749755859375,
0.0005283355712890625,
0.0875244140625,
0.0278472900390625,
-0.0241546630859375,
-0.0171051025390625,
-0.046783447265625,
0.06256103515625,
-0.06561279296875,
0.0214080810546875,
-0.001308441162109375,
0.0225830078125,
-0.01493072509765625,
-0.0535888671875,
-0.04913330078125,
-0.0154266357421875,
-0.00640869140625,
0.034515380859375,
-0.046051025390625,
-0.0191650390625,
0.034332275390625,
-0.0004203319549560547,
-0.040191650390625,
-0.0190277099609375,
-0.038330078125,
-0.012054443359375,
0.043609619140625,
0.018829345703125,
0.0300445556640625,
0.00101470947265625,
-0.0096282958984375,
-0.0129852294921875,
-0.0540771484375,
0.0059051513671875,
0.025115966796875,
0.0037555694580078125,
-0.024566650390625,
0.02099609375,
0.0233917236328125,
0.044708251953125,
-0.0016736984252929688,
-0.004665374755859375,
0.035736083984375,
-0.00652313232421875,
-0.02056884765625,
-0.0263214111328125,
0.07012939453125,
-0.005535125732421875,
0.00751495361328125,
-0.0016145706176757812,
-0.008697509765625,
0.007045745849609375,
0.0218048095703125,
-0.056060791015625,
0.00943756103515625,
0.00756072998046875,
-0.004344940185546875,
-0.01129150390625,
0.02392578125,
-0.051422119140625,
-0.004703521728515625,
-0.02325439453125,
0.045654296875,
-0.034698486328125,
-0.0186309814453125,
0.03790283203125,
0.0012025833129882812,
0.01300811767578125,
0.0489501953125,
-0.01375579833984375,
0.028533935546875,
0.040191650390625,
0.07269287109375,
0.009918212890625,
-0.013946533203125,
-0.050750732421875,
-0.00848388671875,
-0.02618408203125,
0.050628662109375,
-0.0295257568359375,
-0.018035888671875,
-0.027130126953125,
0.0034084320068359375,
-0.01299285888671875,
-0.0279998779296875,
0.0033283233642578125,
-0.051971435546875,
0.0272979736328125,
-0.0089111328125,
-0.016265869140625,
-0.0289764404296875,
0.01053619384765625,
-0.06573486328125,
0.07244873046875,
0.0248565673828125,
-0.0304412841796875,
-0.01198577880859375,
-0.06463623046875,
-0.006366729736328125,
-0.0031642913818359375,
-0.004199981689453125,
-0.036712646484375,
-0.022552490234375,
0.015838623046875,
0.0033321380615234375,
-0.059173583984375,
0.038299560546875,
-0.03533935546875,
-0.01708984375,
0.0200042724609375,
-0.001953125,
0.093994140625,
0.0231170654296875,
-0.037933349609375,
0.0196380615234375,
-0.033203125,
0.01438140869140625,
0.037841796875,
0.016693115234375,
-0.0159454345703125,
-0.02398681640625,
-0.024139404296875,
0.033111572265625,
0.01210784912109375,
-0.04150390625,
0.021881103515625,
-0.04327392578125,
0.047637939453125,
0.08282470703125,
-0.006053924560546875,
0.0227203369140625,
-0.0245208740234375,
0.07318115234375,
0.0037689208984375,
0.0227508544921875,
0.01024627685546875,
-0.055145263671875,
-0.042877197265625,
-0.02447509765625,
0.0214385986328125,
0.050750732421875,
-0.07843017578125,
0.033477783203125,
-0.01012420654296875,
-0.0579833984375,
-0.03369140625,
-0.012451171875,
0.04461669921875,
0.03338623046875,
0.0228729248046875,
-0.008148193359375,
-0.0293731689453125,
-0.053802490234375,
-0.016754150390625,
-0.004673004150390625,
0.0106658935546875,
0.0293121337890625,
0.049407958984375,
-0.00402069091796875,
0.07550048828125,
-0.06402587890625,
-0.00107574462890625,
-0.0186920166015625,
-0.00751495361328125,
0.0299072265625,
0.02789306640625,
0.0555419921875,
-0.05230712890625,
-0.05706787109375,
-0.011932373046875,
-0.062347412109375,
-0.00821685791015625,
-0.01384735107421875,
-0.0179290771484375,
0.0156097412109375,
0.01727294921875,
-0.049896240234375,
0.038482666015625,
0.018402099609375,
-0.0528564453125,
0.03680419921875,
0.01061248779296875,
-0.0146636962890625,
-0.10247802734375,
0.01015472412109375,
0.019287109375,
-0.0175018310546875,
-0.0298309326171875,
0.017608642578125,
-0.008636474609375,
-0.002410888671875,
-0.02838134765625,
0.072998046875,
-0.0244140625,
0.005039215087890625,
-0.00128936767578125,
0.0160980224609375,
-0.000021576881408691406,
0.053253173828125,
-0.0170745849609375,
0.09674072265625,
0.032623291015625,
-0.0250244140625,
0.021270751953125,
0.024810791015625,
0.0089111328125,
0.00848388671875,
-0.06146240234375,
0.0206298828125,
0.008697509765625,
0.0161895751953125,
-0.0438232421875,
0.0235595703125,
0.037445068359375,
-0.051788330078125,
-0.001651763916015625,
0.004199981689453125,
-0.053955078125,
-0.0202789306640625,
-0.0345458984375,
0.018218994140625,
0.0479736328125,
-0.030670166015625,
0.0794677734375,
0.0310821533203125,
0.008544921875,
-0.050750732421875,
-0.01556396484375,
-0.0181884765625,
-0.01043701171875,
-0.0704345703125,
0.01364898681640625,
-0.033477783203125,
-0.040252685546875,
0.0087127685546875,
-0.01161956787109375,
-0.0008153915405273438,
0.0029449462890625,
0.01425933837890625,
0.045013427734375,
-0.0257568359375,
0.006977081298828125,
0.0059356689453125,
-0.028778076171875,
0.01129150390625,
-0.01486968994140625,
0.06201171875,
-0.0645751953125,
-0.016326904296875,
-0.02099609375,
0.0187835693359375,
0.050201416015625,
-0.026580810546875,
0.051300048828125,
0.0275115966796875,
-0.035003662109375,
-0.0035228729248046875,
-0.04119873046875,
-0.024383544921875,
-0.041046142578125,
0.0074920654296875,
-0.041656494140625,
-0.0538330078125,
0.053131103515625,
-0.002532958984375,
-0.005016326904296875,
0.06427001953125,
0.028167724609375,
0.0256500244140625,
0.08624267578125,
0.050201416015625,
-0.006237030029296875,
0.03192138671875,
-0.05743408203125,
0.0174407958984375,
-0.0635986328125,
-0.0185394287109375,
-0.03759765625,
-0.001323699951171875,
-0.031982421875,
-0.0173797607421875,
0.046478271484375,
0.0038890838623046875,
-0.0364990234375,
0.04278564453125,
-0.05206298828125,
0.035064697265625,
0.050933837890625,
0.00927734375,
0.0175018310546875,
-0.00876617431640625,
-0.0107879638671875,
-0.004093170166015625,
-0.043487548828125,
-0.040557861328125,
0.06719970703125,
0.02166748046875,
0.0447998046875,
0.028594970703125,
0.028167724609375,
0.007099151611328125,
0.01666259765625,
-0.0440673828125,
0.05718994140625,
0.0070343017578125,
-0.025421142578125,
-0.0231475830078125,
-0.03564453125,
-0.0704345703125,
0.0153045654296875,
0.012115478515625,
-0.0653076171875,
0.01800537109375,
-0.006793975830078125,
-0.0295257568359375,
0.0300140380859375,
-0.053253173828125,
0.05517578125,
-0.000377655029296875,
-0.0355224609375,
-0.021270751953125,
-0.04339599609375,
0.02630615234375,
0.005035400390625,
0.0014562606811523438,
0.00893402099609375,
0.005489349365234375,
0.0611572265625,
-0.06719970703125,
0.04888916015625,
-0.01226043701171875,
-0.0042266845703125,
0.039093017578125,
0.0259552001953125,
0.051788330078125,
0.02606201171875,
-0.0081024169921875,
0.01318359375,
0.01102447509765625,
-0.0242919921875,
-0.06585693359375,
0.041290283203125,
-0.0706787109375,
-0.054351806640625,
-0.0738525390625,
-0.05181884765625,
-0.01209259033203125,
0.0277099609375,
0.00855255126953125,
0.03662109375,
0.04193115234375,
0.003780364990234375,
0.05224609375,
-0.001308441162109375,
0.0306549072265625,
0.050323486328125,
-0.001987457275390625,
-0.0347900390625,
0.07232666015625,
0.01800537109375,
0.0179595947265625,
0.0224151611328125,
0.01557159423828125,
-0.0268402099609375,
-0.0189666748046875,
-0.03778076171875,
0.050750732421875,
-0.04937744140625,
-0.033935546875,
-0.036407470703125,
-0.042083740234375,
-0.029205322265625,
-0.02203369140625,
-0.031005859375,
-0.0300140380859375,
-0.034210205078125,
0.0161895751953125,
0.0550537109375,
0.04931640625,
0.00698089599609375,
0.0250396728515625,
-0.050933837890625,
0.01556396484375,
0.01070404052734375,
0.0260162353515625,
0.0035915374755859375,
-0.034149169921875,
0.0021724700927734375,
-0.00010693073272705078,
-0.044952392578125,
-0.06903076171875,
0.0548095703125,
-0.0233917236328125,
0.03704833984375,
0.0025119781494140625,
-0.0010194778442382812,
0.043975830078125,
-0.0038299560546875,
0.045745849609375,
0.0123291015625,
-0.09661865234375,
0.0399169921875,
-0.02825927734375,
0.01568603515625,
0.0024967193603515625,
0.0109405517578125,
-0.0266571044921875,
-0.0229644775390625,
-0.072265625,
-0.0364990234375,
0.083740234375,
0.027374267578125,
-0.018157958984375,
0.01172637939453125,
0.024383544921875,
0.006427764892578125,
0.00653839111328125,
-0.057464599609375,
-0.02825927734375,
-0.026458740234375,
-0.0155029296875,
0.0184783935546875,
0.007076263427734375,
-0.00843048095703125,
-0.017364501953125,
0.0762939453125,
-0.02813720703125,
0.038970947265625,
0.008758544921875,
-0.011199951171875,
-0.0012226104736328125,
0.004688262939453125,
0.07080078125,
0.0650634765625,
-0.01435089111328125,
-0.01922607421875,
0.0033969879150390625,
-0.06585693359375,
0.00738525390625,
0.01047515869140625,
-0.02642822265625,
-0.008819580078125,
0.0214691162109375,
0.058135986328125,
-0.01453399658203125,
-0.05499267578125,
0.030303955078125,
0.0036754608154296875,
-0.01117706298828125,
-0.0294036865234375,
0.004856109619140625,
0.00010985136032104492,
0.0288543701171875,
0.01654052734375,
0.008636474609375,
0.0025844573974609375,
-0.028289794921875,
-0.0006833076477050781,
0.036590576171875,
-0.01557159423828125,
-0.024810791015625,
0.03826904296875,
-0.002391815185546875,
-0.04241943359375,
0.04779052734375,
-0.04193115234375,
-0.044342041015625,
0.07708740234375,
0.05804443359375,
0.0692138671875,
-0.004596710205078125,
0.022796630859375,
0.029449462890625,
0.0419921875,
0.0053863525390625,
0.050445556640625,
0.024627685546875,
-0.047149658203125,
-0.0232086181640625,
-0.016357421875,
-0.0308685302734375,
0.01479339599609375,
-0.037109375,
0.02239990234375,
-0.053955078125,
-0.00433349609375,
-0.0011377334594726562,
0.0218353271484375,
-0.04119873046875,
-0.010650634765625,
-0.0416259765625,
0.07452392578125,
-0.039947509765625,
0.057525634765625,
0.051422119140625,
-0.060028076171875,
-0.07940673828125,
-0.016693115234375,
0.013824462890625,
-0.07476806640625,
0.02813720703125,
-0.00372314453125,
-0.0316162109375,
0.0123291015625,
-0.057891845703125,
-0.06787109375,
0.1009521484375,
0.05426025390625,
-0.01629638671875,
0.0015859603881835938,
0.0013675689697265625,
0.0290679931640625,
-0.0279083251953125,
0.02392578125,
0.01004791259765625,
0.042144775390625,
0.00942230224609375,
-0.0673828125,
0.040374755859375,
-0.059051513671875,
-0.01210784912109375,
0.03240966796875,
-0.080810546875,
0.0782470703125,
-0.00762176513671875,
-0.00524139404296875,
0.00841522216796875,
0.0341796875,
0.043548583984375,
0.0302734375,
0.0311431884765625,
0.04254150390625,
0.0413818359375,
-0.026123046875,
0.06402587890625,
-0.00818634033203125,
0.03521728515625,
0.06573486328125,
-0.013824462890625,
0.03167724609375,
0.0177764892578125,
-0.0311126708984375,
0.051513671875,
0.033538818359375,
-0.022247314453125,
0.019012451171875,
-0.0010423660278320312,
0.0033969879150390625,
-0.035125732421875,
0.007030487060546875,
-0.037109375,
0.01079559326171875,
0.02978515625,
0.006374359130859375,
-0.0185546875,
-0.0037822723388671875,
0.0022220611572265625,
0.0151214599609375,
0.00017309188842773438,
0.038299560546875,
0.00988006591796875,
-0.0408935546875,
0.032501220703125,
0.00980377197265625,
0.03350830078125,
-0.061126708984375,
-0.02398681640625,
-0.00908660888671875,
0.010894775390625,
-0.000579833984375,
-0.058349609375,
0.0179290771484375,
-0.018890380859375,
-0.00855255126953125,
0.0016660690307617188,
0.037384033203125,
0.0109710693359375,
-0.036590576171875,
0.02001953125,
0.03607177734375,
0.0016679763793945312,
-0.01091766357421875,
-0.0611572265625,
-0.0017156600952148438,
-0.007686614990234375,
-0.01375579833984375,
0.006366729736328125,
0.0200958251953125,
-0.03179931640625,
0.07122802734375,
0.053955078125,
-0.0111236572265625,
0.0027637481689453125,
-0.007293701171875,
0.0726318359375,
-0.051788330078125,
-0.047943115234375,
-0.06207275390625,
0.038848876953125,
-0.006923675537109375,
-0.0294342041015625,
0.055572509765625,
0.047027587890625,
0.046417236328125,
-0.01727294921875,
0.033935546875,
0.001251220703125,
0.0265655517578125,
-0.040130615234375,
0.053680419921875,
-0.044830322265625,
0.034332275390625,
-0.01453399658203125,
-0.05157470703125,
-0.02276611328125,
0.03680419921875,
-0.019195556640625,
0.02069091796875,
0.06756591796875,
0.053192138671875,
-0.008148193359375,
0.0011539459228515625,
-0.00830078125,
0.0145111083984375,
0.04510498046875,
0.0582275390625,
0.042999267578125,
-0.046875,
0.0308685302734375,
-0.0200042724609375,
-0.00817108154296875,
-0.0098876953125,
-0.04180908203125,
-0.064697265625,
-0.054107666015625,
-0.0197601318359375,
-0.056915283203125,
-0.01537322998046875,
0.0728759765625,
0.048583984375,
-0.037078857421875,
-0.01500701904296875,
-0.020599365234375,
0.037353515625,
-0.0259552001953125,
-0.0235595703125,
0.045440673828125,
0.0036792755126953125,
-0.0487060546875,
0.019744873046875,
0.0230560302734375,
0.007503509521484375,
-0.0227508544921875,
-0.034149169921875,
-0.019561767578125,
0.031646728515625,
0.032318115234375,
0.03375244140625,
-0.0762939453125,
-0.00583648681640625,
0.044281005859375,
-0.00016641616821289062,
0.01125335693359375,
0.0484619140625,
-0.0751953125,
0.027557373046875,
0.0345458984375,
0.03057861328125,
0.03460693359375,
-0.03179931640625,
0.038177490234375,
-0.0224761962890625,
0.00885009765625,
0.030059814453125,
0.028778076171875,
-0.0164642333984375,
-0.043792724609375,
0.0701904296875,
0.03546142578125,
-0.00933837890625,
-0.08758544921875,
0.004756927490234375,
-0.1072998046875,
-0.0145111083984375,
0.06689453125,
-0.0031299591064453125,
-0.016693115234375,
0.0040435791015625,
-0.016693115234375,
0.01160430908203125,
-0.051025390625,
0.05194091796875,
0.045684814453125,
-0.00522613525390625,
-0.0054779052734375,
-0.03143310546875,
0.01202392578125,
0.0219268798828125,
-0.09326171875,
-0.0170135498046875,
0.0176849365234375,
0.0170745849609375,
0.04486083984375,
0.04522705078125,
-0.0246429443359375,
0.05224609375,
-0.00417327880859375,
-0.0017242431640625,
-0.046478271484375,
-0.0274505615234375,
-0.02679443359375,
-0.0024738311767578125,
-0.024688720703125,
-0.007160186767578125
]
] |
cyberagent/open-calm-large | 2023-05-18T01:11:13.000Z | [
"transformers",
"pytorch",
"gpt_neox",
"text-generation",
"japanese",
"causal-lm",
"ja",
"dataset:wikipedia",
"dataset:cc100",
"dataset:mc4",
"license:cc-by-sa-4.0",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | cyberagent | null | null | cyberagent/open-calm-large | 9 | 6,811 | transformers | 2023-05-15T06:50:24 | ---
license: cc-by-sa-4.0
datasets:
- wikipedia
- cc100
- mc4
language:
- ja
tags:
- japanese
- causal-lm
inference: false
---
# OpenCALM-Large
## Model Description
OpenCALM is a suite of decoder-only language models pre-trained on Japanese datasets, developed by CyberAgent, Inc.
## Usage
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
model = AutoModelForCausalLM.from_pretrained("cyberagent/open-calm-large", device_map="auto", torch_dtype=torch.float16)
tokenizer = AutoTokenizer.from_pretrained("cyberagent/open-calm-large")
inputs = tokenizer("AIによって私達の暮らしは、", return_tensors="pt").to(model.device)
with torch.no_grad():
tokens = model.generate(
**inputs,
max_new_tokens=64,
do_sample=True,
temperature=0.7,
top_p=0.9,
repetition_penalty=1.05,
pad_token_id=tokenizer.pad_token_id,
)
output = tokenizer.decode(tokens[0], skip_special_tokens=True)
print(output)
```
## Model Details
|Model|Params|Layers|Dim|Heads|Dev ppl|
|:---:|:---: |:---:|:---:|:---:|:---:|
|[cyberagent/open-calm-small](https://huggingface.co/cyberagent/open-calm-small)|160M|12|768|12|19.7|
|[cyberagent/open-calm-medium](https://huggingface.co/cyberagent/open-calm-medium)|400M|24|1024|16|13.8|
|[cyberagent/open-calm-large](https://huggingface.co/cyberagent/open-calm-large)|830M|24|1536|16|11.3|
|[cyberagent/open-calm-1b](https://huggingface.co/cyberagent/open-calm-1b)|1.4B|24|2048|16|10.3|
|[cyberagent/open-calm-3b](https://huggingface.co/cyberagent/open-calm-3b)|2.7B|32|2560|32|9.7|
|[cyberagent/open-calm-7b](https://huggingface.co/cyberagent/open-calm-7b)|6.8B|32|4096|32|8.2|
* **Developed by**: [CyberAgent, Inc.](https://www.cyberagent.co.jp/)
* **Model type**: Transformer-based Language Model
* **Language**: Japanese
* **Library**: [GPT-NeoX](https://github.com/EleutherAI/gpt-neox)
* **License**: OpenCALM is licensed under the Creative Commons Attribution-ShareAlike 4.0 International License ([CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)). When using this model, please provide appropriate credit to CyberAgent, Inc.
* Example (en): This model is a fine-tuned version of OpenCALM-XX developed by CyberAgent, Inc. The original model is released under the CC BY-SA 4.0 license, and this model is also released under the same CC BY-SA 4.0 license. For more information, please visit: https://creativecommons.org/licenses/by-sa/4.0/
* Example (ja): 本モデルは、株式会社サイバーエージェントによるOpenCALM-XXをファインチューニングしたものです。元のモデルはCC BY-SA 4.0ライセンスのもとで公開されており、本モデルも同じくCC BY-SA 4.0ライセンスで公開します。詳しくはこちらをご覧ください: https://creativecommons.org/licenses/by-sa/4.0/
## Training Dataset
* Wikipedia (ja)
* Common Crawl (ja)
## Author
[Ryosuke Ishigami](https://huggingface.co/rishigami)
## Citations
```bibtex
@software{gpt-neox-library,
title = {{GPT-NeoX: Large Scale Autoregressive Language Modeling in PyTorch}},
author = {Andonian, Alex and Anthony, Quentin and Biderman, Stella and Black, Sid and Gali, Preetham and Gao, Leo and Hallahan, Eric and Levy-Kramer, Josh and Leahy, Connor and Nestler, Lucas and Parker, Kip and Pieler, Michael and Purohit, Shivanshu and Songz, Tri and Phil, Wang and Weinbach, Samuel},
url = {https://www.github.com/eleutherai/gpt-neox},
doi = {10.5281/zenodo.5879544},
month = {8},
year = {2021},
version = {0.0.1},
}
``` | 3,375 | [
[
-0.030426025390625,
-0.054718017578125,
0.0218048095703125,
0.006908416748046875,
-0.01047515869140625,
-0.0222015380859375,
-0.033416748046875,
-0.034149169921875,
0.015869140625,
0.039459228515625,
-0.03643798828125,
-0.057464599609375,
-0.036102294921875,
0.003757476806640625,
-0.006702423095703125,
0.07208251953125,
-0.0179901123046875,
-0.009765625,
-0.00405120849609375,
-0.01039886474609375,
-0.00917816162109375,
-0.0394287109375,
-0.055206298828125,
-0.0278167724609375,
0.00452423095703125,
0.01482391357421875,
0.0714111328125,
0.05218505859375,
0.034088134765625,
0.029815673828125,
0.005527496337890625,
0.0015039443969726562,
-0.025177001953125,
-0.0189208984375,
-0.000006079673767089844,
-0.05224609375,
-0.055816650390625,
-0.007781982421875,
0.05157470703125,
0.0252532958984375,
-0.00412750244140625,
0.01544952392578125,
-0.00893402099609375,
0.01288604736328125,
-0.040771484375,
0.030670166015625,
-0.032135009765625,
0.0023708343505859375,
-0.00852203369140625,
-0.0015077590942382812,
-0.027191162109375,
-0.0286102294921875,
0.0027713775634765625,
-0.056549072265625,
0.0136566162109375,
-0.0104217529296875,
0.0828857421875,
0.008392333984375,
-0.0104522705078125,
0.0027866363525390625,
-0.060516357421875,
0.0638427734375,
-0.077392578125,
0.01739501953125,
0.041107177734375,
0.017913818359375,
0.00039124488830566406,
-0.060394287109375,
-0.029693603515625,
-0.01248931884765625,
0.0012273788452148438,
0.01531219482421875,
-0.043792724609375,
-0.0028133392333984375,
0.0288848876953125,
0.0268402099609375,
-0.06524658203125,
0.0074462890625,
-0.0218963623046875,
-0.025543212890625,
0.035186767578125,
0.0276336669921875,
0.0335693359375,
0.004978179931640625,
-0.024261474609375,
-0.022979736328125,
-0.03387451171875,
-0.01050567626953125,
0.02239990234375,
0.028472900390625,
-0.046112060546875,
0.03607177734375,
-0.00605010986328125,
0.043487548828125,
-0.020111083984375,
-0.017120361328125,
0.046051025390625,
-0.03704833984375,
-0.021240234375,
-0.005878448486328125,
0.11175537109375,
0.01641845703125,
0.03179931640625,
0.004817962646484375,
-0.0111083984375,
-0.0012149810791015625,
-0.0033092498779296875,
-0.077392578125,
-0.005435943603515625,
0.0120849609375,
-0.0299530029296875,
-0.0137786865234375,
0.02239990234375,
-0.060516357421875,
0.00850677490234375,
-0.0195159912109375,
0.033233642578125,
-0.036834716796875,
-0.045135498046875,
0.0254364013671875,
0.01560211181640625,
0.0059051513671875,
0.020172119140625,
-0.035919189453125,
0.040008544921875,
0.0214996337890625,
0.07684326171875,
0.021392822265625,
-0.0306549072265625,
-0.01218414306640625,
-0.0208740234375,
-0.0121002197265625,
0.0230712890625,
-0.02020263671875,
-0.018646240234375,
-0.0033588409423828125,
0.0161590576171875,
-0.0242462158203125,
-0.039642333984375,
0.02777099609375,
-0.0276947021484375,
0.027435302734375,
0.01331329345703125,
-0.002017974853515625,
-0.0054473876953125,
0.0089263916015625,
-0.043914794921875,
0.0802001953125,
0.0167083740234375,
-0.0679931640625,
-0.0024662017822265625,
-0.042999267578125,
0.0008263587951660156,
-0.01132965087890625,
0.0034999847412109375,
-0.04461669921875,
-0.01389312744140625,
0.01959228515625,
0.0072784423828125,
-0.03961181640625,
0.0101165771484375,
-0.038238525390625,
-0.01085662841796875,
0.02239990234375,
-0.0180816650390625,
0.08526611328125,
0.0295562744140625,
-0.0298309326171875,
0.017822265625,
-0.060577392578125,
0.014068603515625,
0.041900634765625,
-0.006481170654296875,
-0.0213165283203125,
-0.0180816650390625,
0.01412200927734375,
0.03277587890625,
0.029541015625,
-0.043914794921875,
0.0175933837890625,
-0.037841796875,
0.049346923828125,
0.040252685546875,
-0.01708984375,
0.0232696533203125,
-0.01580810546875,
0.0506591796875,
0.0157623291015625,
0.026947021484375,
-0.0199127197265625,
-0.034576416015625,
-0.0489501953125,
-0.037261962890625,
0.01324462890625,
0.04547119140625,
-0.041412353515625,
0.039276123046875,
-0.030670166015625,
-0.052520751953125,
-0.047637939453125,
-0.007503509521484375,
0.037139892578125,
0.01432037353515625,
0.0274810791015625,
-0.0243377685546875,
-0.047210693359375,
-0.053253173828125,
0.0063629150390625,
-0.0091094970703125,
0.006771087646484375,
0.03167724609375,
0.051116943359375,
-0.040985107421875,
0.078857421875,
-0.057769775390625,
-0.0106353759765625,
-0.006870269775390625,
0.01558685302734375,
0.03277587890625,
0.0281982421875,
0.046783447265625,
-0.0355224609375,
-0.04443359375,
0.01372528076171875,
-0.054840087890625,
-0.0125579833984375,
0.007503509521484375,
-0.0017213821411132812,
0.0191650390625,
0.04180908203125,
-0.057220458984375,
0.040985107421875,
0.0238037109375,
-0.028045654296875,
0.06414794921875,
-0.0276336669921875,
-0.00730133056640625,
-0.10577392578125,
0.03228759765625,
0.0091400146484375,
-0.0289459228515625,
-0.0322265625,
0.0157928466796875,
0.0014705657958984375,
-0.0182647705078125,
-0.03369140625,
0.062744140625,
-0.031707763671875,
0.00628662109375,
-0.0196685791015625,
0.016815185546875,
-0.0185699462890625,
0.031005859375,
0.00835418701171875,
0.0513916015625,
0.057037353515625,
-0.02886962890625,
0.0290069580078125,
0.01497650146484375,
-0.01174163818359375,
0.01331329345703125,
-0.062255859375,
0.010101318359375,
0.0021686553955078125,
0.017852783203125,
-0.06341552734375,
-0.0090789794921875,
0.0238494873046875,
-0.051544189453125,
0.020294189453125,
-0.0188140869140625,
-0.05877685546875,
-0.044647216796875,
-0.0160675048828125,
0.04095458984375,
0.0304412841796875,
-0.033447265625,
0.019195556640625,
0.0155029296875,
-0.004863739013671875,
-0.035797119140625,
-0.0482177734375,
-0.01055145263671875,
-0.00977325439453125,
-0.053985595703125,
0.0201416015625,
-0.01093292236328125,
-0.01029205322265625,
0.01343536376953125,
0.006389617919921875,
-0.017425537109375,
-0.01384735107421875,
0.01369476318359375,
0.036834716796875,
-0.038787841796875,
-0.01556396484375,
-0.032440185546875,
-0.024139404296875,
0.0196380615234375,
-0.026336669921875,
0.06829833984375,
-0.02569580078125,
-0.00969696044921875,
-0.041412353515625,
0.003520965576171875,
0.05889892578125,
-0.029083251953125,
0.06256103515625,
0.070068359375,
-0.036468505859375,
0.004627227783203125,
-0.019073486328125,
-0.0178680419921875,
-0.034637451171875,
0.040313720703125,
-0.02520751953125,
-0.060882568359375,
0.06805419921875,
0.01175689697265625,
0.004779815673828125,
0.04754638671875,
0.04876708984375,
0.04296875,
0.089599609375,
0.035186767578125,
-0.02850341796875,
0.0379638671875,
-0.036468505859375,
0.00986480712890625,
-0.059906005859375,
-0.0034351348876953125,
-0.05841064453125,
0.006587982177734375,
-0.058837890625,
-0.03173828125,
0.01812744140625,
-0.005840301513671875,
-0.048004150390625,
0.050323486328125,
-0.0298614501953125,
0.0134124755859375,
0.03704833984375,
0.01430511474609375,
0.012420654296875,
0.002349853515625,
-0.0305633544921875,
-0.00014007091522216797,
-0.049530029296875,
-0.0220947265625,
0.0816650390625,
0.038360595703125,
0.0723876953125,
0.00762176513671875,
0.0501708984375,
-0.030853271484375,
-0.003238677978515625,
-0.05267333984375,
0.048583984375,
0.00762939453125,
-0.056793212890625,
-0.006256103515625,
-0.06427001953125,
-0.0892333984375,
0.00928497314453125,
-0.0019588470458984375,
-0.0814208984375,
0.004116058349609375,
0.00003463029861450195,
-0.0202789306640625,
0.0394287109375,
-0.041229248046875,
0.06280517578125,
0.0027141571044921875,
-0.0338134765625,
-0.01108551025390625,
-0.04443359375,
0.025787353515625,
-0.00042176246643066406,
0.021942138671875,
0.01073455810546875,
-0.004970550537109375,
0.05169677734375,
-0.058441162109375,
0.0701904296875,
-0.005374908447265625,
0.0059051513671875,
0.030517578125,
-0.00047707557678222656,
0.0301055908203125,
0.006359100341796875,
-0.00026345252990722656,
0.0269622802734375,
0.00751495361328125,
-0.0174407958984375,
-0.0226898193359375,
0.06658935546875,
-0.09765625,
-0.01629638671875,
-0.05194091796875,
-0.041290283203125,
0.00849151611328125,
0.04791259765625,
0.03173828125,
0.041412353515625,
-0.004917144775390625,
0.0179595947265625,
0.044647216796875,
-0.040374755859375,
0.0309600830078125,
0.040557861328125,
-0.033935546875,
-0.0626220703125,
0.06591796875,
0.005603790283203125,
0.030517578125,
0.0243377685546875,
0.028839111328125,
-0.0186004638671875,
-0.0200347900390625,
-0.02032470703125,
0.04766845703125,
-0.03533935546875,
-0.0195465087890625,
-0.046539306640625,
-0.0372314453125,
-0.051483154296875,
0.0033550262451171875,
-0.04180908203125,
-0.01409149169921875,
-0.031005859375,
0.018157958984375,
0.0308990478515625,
0.033538818359375,
-0.0093841552734375,
0.0251617431640625,
-0.050872802734375,
0.01544952392578125,
-0.003154754638671875,
0.0304412841796875,
0.0037631988525390625,
-0.04864501953125,
-0.0309600830078125,
0.0170440673828125,
-0.031829833984375,
-0.04840087890625,
0.050750732421875,
-0.005481719970703125,
0.034423828125,
0.0310516357421875,
-0.0014677047729492188,
0.05072021484375,
-0.0115814208984375,
0.046844482421875,
0.02490234375,
-0.05621337890625,
0.039886474609375,
-0.04620361328125,
0.0595703125,
0.0210418701171875,
0.04534912109375,
-0.036224365234375,
-0.0189666748046875,
-0.059906005859375,
-0.076904296875,
0.08978271484375,
0.013916015625,
0.00006580352783203125,
0.001659393310546875,
0.01540374755859375,
-0.01461029052734375,
0.0017137527465820312,
-0.075439453125,
-0.0302886962890625,
-0.013702392578125,
-0.0232391357421875,
-0.01236724853515625,
-0.0019664764404296875,
0.00383758544921875,
-0.0241241455078125,
0.062255859375,
-0.01134490966796875,
0.052032470703125,
0.0196380615234375,
-0.01094818115234375,
-0.02484130859375,
0.005603790283203125,
0.04840087890625,
0.033477783203125,
-0.0264434814453125,
-0.004810333251953125,
0.005306243896484375,
-0.0595703125,
-0.0124664306640625,
0.019012451171875,
-0.00969696044921875,
-0.0010576248168945312,
0.026336669921875,
0.0792236328125,
0.0139923095703125,
-0.030059814453125,
0.04095458984375,
-0.00931549072265625,
-0.0311279296875,
-0.03228759765625,
-0.0031261444091796875,
0.001323699951171875,
0.003986358642578125,
0.0006270408630371094,
-0.003509521484375,
-0.01312255859375,
-0.0455322265625,
-0.0016698837280273438,
0.026214599609375,
-0.0285186767578125,
-0.038330078125,
0.050140380859375,
-0.01690673828125,
-0.0099334716796875,
0.057464599609375,
-0.00942230224609375,
-0.036346435546875,
0.0533447265625,
0.06829833984375,
0.073486328125,
-0.03326416015625,
-0.0028285980224609375,
0.0712890625,
0.034759521484375,
-0.0012359619140625,
0.026336669921875,
0.01611328125,
-0.046478271484375,
-0.0213775634765625,
-0.043792724609375,
-0.0090179443359375,
0.03302001953125,
-0.0543212890625,
0.037750244140625,
-0.036895751953125,
-0.031829833984375,
-0.0037288665771484375,
0.0135040283203125,
-0.0546875,
0.01303863525390625,
0.0040740966796875,
0.0728759765625,
-0.05303955078125,
0.061920166015625,
0.061981201171875,
-0.0404052734375,
-0.058013916015625,
-0.01039886474609375,
0.006317138671875,
-0.0648193359375,
0.025054931640625,
0.0062713623046875,
0.010040283203125,
0.01324462890625,
-0.032135009765625,
-0.07757568359375,
0.07623291015625,
0.0282440185546875,
-0.03179931640625,
-0.008209228515625,
0.037841796875,
0.03857421875,
-0.0153350830078125,
0.0709228515625,
0.01247406005859375,
0.03424072265625,
0.0021152496337890625,
-0.0941162109375,
0.01812744140625,
-0.0277862548828125,
-0.0015354156494140625,
0.034881591796875,
-0.0643310546875,
0.074462890625,
-0.019805908203125,
-0.00489044189453125,
0.01299285888671875,
0.027679443359375,
0.008544921875,
0.01047515869140625,
0.0243072509765625,
0.0645751953125,
0.0231170654296875,
-0.01430511474609375,
0.0611572265625,
-0.0206146240234375,
0.0498046875,
0.08111572265625,
0.004425048828125,
0.051544189453125,
-0.003856658935546875,
-0.02752685546875,
0.03961181640625,
0.03179931640625,
-0.0002944469451904297,
0.0180816650390625,
0.004604339599609375,
-0.01152801513671875,
-0.01258087158203125,
0.01131439208984375,
-0.037933349609375,
0.033782958984375,
0.0234527587890625,
-0.03741455078125,
-0.03857421875,
0.001209259033203125,
0.0316162109375,
-0.0193939208984375,
-0.017730712890625,
0.04620361328125,
0.0186309814453125,
-0.0401611328125,
0.066162109375,
0.018096923828125,
0.03839111328125,
-0.0626220703125,
0.0231170654296875,
0.00679779052734375,
0.036834716796875,
-0.0182342529296875,
-0.0465087890625,
0.01251983642578125,
-0.0009307861328125,
-0.005786895751953125,
0.002429962158203125,
0.04412841796875,
-0.021820068359375,
-0.02337646484375,
0.03424072265625,
0.0128326416015625,
0.0147247314453125,
-0.01079559326171875,
-0.0704345703125,
0.0207672119140625,
-0.0035076141357421875,
-0.0270843505859375,
0.02911376953125,
0.030517578125,
-0.008453369140625,
0.04217529296875,
0.053466796875,
0.0020389556884765625,
0.002063751220703125,
0.00372314453125,
0.06658935546875,
-0.050872802734375,
-0.0352783203125,
-0.0684814453125,
0.035125732421875,
0.004482269287109375,
-0.044677734375,
0.06494140625,
0.06103515625,
0.06646728515625,
-0.007305145263671875,
0.052642822265625,
-0.00860595703125,
0.01531219482421875,
-0.01313018798828125,
0.04595947265625,
-0.04248046875,
-0.01480865478515625,
-0.032470703125,
-0.0712890625,
-0.020416259765625,
0.045989990234375,
-0.035919189453125,
0.02362060546875,
0.049346923828125,
0.07574462890625,
-0.0198974609375,
-0.0108184814453125,
0.0116119384765625,
0.04534912109375,
0.02142333984375,
0.05853271484375,
0.03472900390625,
-0.051055908203125,
0.039276123046875,
-0.030059814453125,
-0.036956787109375,
-0.026580810546875,
-0.042999267578125,
-0.07025146484375,
-0.03863525390625,
-0.035736083984375,
-0.04437255859375,
-0.0183258056640625,
0.08245849609375,
0.06134033203125,
-0.0501708984375,
-0.032501220703125,
-0.0132293701171875,
-0.0221405029296875,
-0.01203155517578125,
-0.0222320556640625,
0.016448974609375,
-0.0126190185546875,
-0.057281494140625,
0.00540924072265625,
-0.00016319751739501953,
0.004848480224609375,
-0.0162353515625,
-0.007305145263671875,
-0.01354217529296875,
0.00812530517578125,
0.034271240234375,
0.03619384765625,
-0.03753662109375,
0.013580322265625,
0.01288604736328125,
-0.0200042724609375,
0.0121307373046875,
0.034088134765625,
-0.036163330078125,
0.04266357421875,
0.047393798828125,
0.003246307373046875,
0.055206298828125,
-0.033538818359375,
0.0268707275390625,
-0.031982421875,
0.0213470458984375,
0.0101318359375,
0.036468505859375,
0.003597259521484375,
-0.0132293701171875,
0.0272369384765625,
0.0282745361328125,
-0.0496826171875,
-0.0631103515625,
0.0018262863159179688,
-0.08489990234375,
-0.01171112060546875,
0.09857177734375,
-0.0170745849609375,
-0.032928466796875,
0.0096588134765625,
-0.0180206298828125,
0.02655029296875,
-0.0245513916015625,
0.0281524658203125,
0.04083251953125,
0.0232391357421875,
-0.00862884521484375,
-0.048370361328125,
0.00647735595703125,
0.00461578369140625,
-0.060028076171875,
0.0104217529296875,
0.02178955078125,
0.03228759765625,
0.02886962890625,
0.048583984375,
-0.048187255859375,
0.023162841796875,
0.002689361572265625,
0.0299224853515625,
-0.016021728515625,
-0.012237548828125,
-0.0242156982421875,
-0.01371002197265625,
0.003307342529296875,
-0.00836944580078125
]
] |
llm-agents/tora-7b-v1.0 | 2023-10-08T11:22:45.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"code",
"math",
"en",
"dataset:gsm8k",
"dataset:competition_math",
"arxiv:2309.17452",
"license:llama2",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | llm-agents | null | null | llm-agents/tora-7b-v1.0 | 4 | 6,809 | transformers | 2023-10-08T05:15:03 | ---
license: llama2
datasets:
- gsm8k
- competition_math
language:
- en
metrics:
- exact_match
library_name: transformers
pipeline_tag: text-generation
tags:
- code
- math
---
<h1 align="center">
ToRA: A Tool-Integrated Reasoning Agent <br> for Mathematical Problem Solving
</h1>
<p align="center">
<a href="https://microsoft.github.io/ToRA/"><b>[🌐 Website]</b></a> •
<a href="https://arxiv.org/pdf/2309.17452.pdf"><b>[📜 Paper]</b></a> •
<a href="https://huggingface.co/llm-agents"><b>[🤗 HF Models]</b></a> •
<a href="https://github.com/microsoft/ToRA"><b>[🐱 GitHub]</b></a>
<br>
<a href="https://twitter.com/zhs05232838/status/1708860992631763092"><b>[🐦 Twitter]</b></a> •
<a href="https://www.reddit.com/r/LocalLLaMA/comments/1703k6d/tora_a_toolintegrated_reasoning_agent_for/"><b>[💬 Reddit]</b></a> •
<a href="https://notes.aimodels.fyi/researchers-announce-tora-training-language-models-to-better-understand-math-using-external-tools/">[🍀 Unofficial Blog]</a>
<!-- <a href="#-quick-start">Quick Start</a> • -->
<!-- <a href="#%EF%B8%8F-citation">Citation</a> -->
</p>
<p align="center">
Repo for "<a href="https://arxiv.org/pdf/2309.17452.pdf" target="_blank">ToRA: A Tool-Integrated Reasoning Agent for Mathematical Problem Solving</a>"
</p>
## 🔥 News
- [2023/10/08] 🔥🔥🔥 All ToRA models released at [HuggingFace](https://huggingface.co/llm-agents)!!!
- [2023/09/29] ToRA paper, repo, and website released.
## 💡 Introduction
ToRA is a series of Tool-integrated Reasoning Agents designed to solve challenging mathematical reasoning problems by interacting with tools, e.g., computation libraries and symbolic solvers. The ToRA series seamlessly integrates natural language reasoning with the use of external tools, combining the analytical power of language with the computational efficiency of tools.
| Model | Size | GSM8k | MATH | AVG@10 math tasks<sup>†</sup> |
|---|---|---|---|---|
| GPT-4 | - | 92.0 | 42.5 | 78.3 |
| GPT-4 (PAL) | - | 94.2 | 51.8 | 86.4 |
| [ToRA-7B](https://huggingface.co/llm-agents/tora-7b-v1.0) | 7B | 68.8 | 40.1 | 62.4|
| [ToRA-Code-7B](https://huggingface.co/llm-agents/tora-code-7b-v1.0) | 7B | 72.6 | 44.6 | 66.5|
| [ToRA-13B](https://huggingface.co/llm-agents/tora-13b-v1.0) | 13B | 72.7 | 43.0 | 65.9|
| [ToRA-Code-13B](https://huggingface.co/llm-agents/tora-code-13b-v1.0) | 13B | 75.8 | 48.1 | 71.3 |
| [ToRA-Code-34B<sup>*</sup>](https://huggingface.co/llm-agents/tora-code-34b-v1.0) | 34B | 80.7 | **51.0** | 74.8 |
| [ToRA-70B](https://huggingface.co/llm-agents/tora-70b-v1.0) | 70B | **84.3** | 49.7 | **76.9** |
- <sup>*</sup>ToRA-Code-34B is currently the first and only open-source model to achieve over 50% accuracy (pass@1) on the MATH dataset. It significantly outperforms GPT-4’s CoT result (51.0 vs. 42.5) and is competitive with GPT-4 solving problems with programs. By open-sourcing our code and models, we hope more breakthroughs will come!
- <sup>†</sup>10 math tasks include GSM8k, MATH, GSM-Hard, SVAMP, TabMWP, ASDiv, SingleEQ, SingleOP, AddSub, and MultiArith.
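As a hedged illustration of the tool-integration pattern described above (this is not the official ToRA inference code; the fenced-code-block response format and the helper name are assumptions), the execute-and-feed-back step can be sketched as:

```python
import re
import io
import contextlib

FENCE = "`" * 3  # markdown code fence

def run_generated_program(model_output: str) -> str:
    """Extract the first fenced python block from a model response, execute it,
    and return captured stdout. Sketch only; real deployments must run the
    program in a sandboxed interpreter, not a bare exec()."""
    pattern = FENCE + r"python\n(.*?)" + FENCE
    match = re.search(pattern, model_output, re.DOTALL)
    if match is None:
        return ""
    buffer = io.StringIO()
    with contextlib.redirect_stdout(buffer):
        exec(match.group(1), {})
    return buffer.getvalue().strip()

# hypothetical model response: rationale interleaved with a program
response = (
    "Let's compute it.\n"
    + FENCE + "python\nprint(3 * 7 + 21)\n" + FENCE
)
print(run_generated_program(response))  # 42
```

In the full loop, the captured output would be appended to the prompt so the model can continue reasoning over the tool's result.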
## ⚡️ Training
The models are trained on ToRA-Corpus 16k, which contains tool-integrated reasoning trajectories of MATH and GSM8k from GPT-4.
We use imitation learning (i.e., SFT) to fine-tune the models, and then apply our proposed *output space shaping* to improve tool-integrated reasoning behaviors. Please refer to the [paper](https://arxiv.org/pdf/2309.17452.pdf) for more details.
## 🪁 Inference & Evaluation
Please refer to ToRA's [GitHub repo](https://github.com/microsoft/ToRA) for inference, evaluation, and training code.
## ☕️ Citation
If you find this repository helpful, please consider citing our paper:
```
@misc{gou2023tora,
title={ToRA: A Tool-Integrated Reasoning Agent for Mathematical Problem Solving},
author={Zhibin Gou and Zhihong Shao and Yeyun Gong and yelong shen and Yujiu Yang and Minlie Huang and Nan Duan and Weizhu Chen},
year={2023},
eprint={2309.17452},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` | 4,085 | [
[
-0.02813720703125,
-0.059906005859375,
0.050445556640625,
0.033660888671875,
-0.0159454345703125,
0.03289794921875,
0.0155792236328125,
-0.02618408203125,
0.037139892578125,
0.03314208984375,
-0.04913330078125,
-0.0204315185546875,
-0.0323486328125,
0.00925445556640625,
0.006561279296875,
0.031951904296875,
0.003505706787109375,
0.0174713134765625,
-0.0164794921875,
-0.02166748046875,
-0.05242919921875,
-0.0262451171875,
-0.0360107421875,
-0.027130126953125,
-0.00957489013671875,
-0.0070648193359375,
0.06866455078125,
0.036865234375,
0.028778076171875,
0.0291290283203125,
0.0000711679458618164,
0.036163330078125,
-0.001308441162109375,
-0.006984710693359375,
-0.004825592041015625,
-0.035736083984375,
-0.05340576171875,
0.01180267333984375,
0.048309326171875,
0.0274658203125,
-0.017425537109375,
0.00487518310546875,
-0.0145263671875,
0.031524658203125,
-0.03045654296875,
0.0221099853515625,
-0.0202484130859375,
-0.0170135498046875,
-0.0010957717895507812,
-0.004810333251953125,
-0.04638671875,
-0.035858154296875,
-0.0011386871337890625,
-0.0726318359375,
-0.01145172119140625,
0.00029206275939941406,
0.087158203125,
0.0277862548828125,
-0.0242767333984375,
-0.017242431640625,
-0.046905517578125,
0.062469482421875,
-0.0653076171875,
0.021148681640625,
-0.0012264251708984375,
0.0227203369140625,
-0.0150299072265625,
-0.053375244140625,
-0.049072265625,
-0.01531982421875,
-0.006191253662109375,
0.034759521484375,
-0.0457763671875,
-0.01922607421875,
0.034576416015625,
-0.000514984130859375,
-0.0400390625,
-0.019134521484375,
-0.038330078125,
-0.0120086669921875,
0.043609619140625,
0.0189056396484375,
0.03021240234375,
0.0013532638549804688,
-0.009521484375,
-0.01334381103515625,
-0.053863525390625,
0.006015777587890625,
0.025360107421875,
0.0036487579345703125,
-0.0243988037109375,
0.021026611328125,
0.0236968994140625,
0.04473876953125,
-0.0019741058349609375,
-0.004940032958984375,
0.03515625,
-0.0064239501953125,
-0.020477294921875,
-0.0263824462890625,
0.07012939453125,
-0.00571441650390625,
0.007755279541015625,
-0.0014543533325195312,
-0.00875091552734375,
0.006927490234375,
0.0217437744140625,
-0.055877685546875,
0.0093841552734375,
0.007389068603515625,
-0.004451751708984375,
-0.01114654541015625,
0.0235443115234375,
-0.05126953125,
-0.00475311279296875,
-0.0235595703125,
0.045501708984375,
-0.0350341796875,
-0.0186004638671875,
0.03802490234375,
0.0009927749633789062,
0.01282501220703125,
0.04901123046875,
-0.01384735107421875,
0.028778076171875,
0.040252685546875,
0.07275390625,
0.010345458984375,
-0.01383209228515625,
-0.0511474609375,
-0.00867462158203125,
-0.0259857177734375,
0.05072021484375,
-0.0295867919921875,
-0.017730712890625,
-0.027191162109375,
0.003025054931640625,
-0.0127716064453125,
-0.028045654296875,
0.00327301025390625,
-0.052032470703125,
0.02752685546875,
-0.00875091552734375,
-0.0161285400390625,
-0.0292205810546875,
0.010223388671875,
-0.06573486328125,
0.0723876953125,
0.0249481201171875,
-0.030609130859375,
-0.01227569580078125,
-0.06475830078125,
-0.006317138671875,
-0.0032978057861328125,
-0.004161834716796875,
-0.036468505859375,
-0.022430419921875,
0.016021728515625,
0.0032806396484375,
-0.059112548828125,
0.038482666015625,
-0.03533935546875,
-0.0169677734375,
0.0199127197265625,
-0.0019235610961914062,
0.09466552734375,
0.02325439453125,
-0.038116455078125,
0.01953125,
-0.033050537109375,
0.0142669677734375,
0.03765869140625,
0.0167236328125,
-0.0158233642578125,
-0.0237579345703125,
-0.024169921875,
0.03326416015625,
0.012451171875,
-0.0413818359375,
0.0219573974609375,
-0.043426513671875,
0.047607421875,
0.08258056640625,
-0.006267547607421875,
0.0226593017578125,
-0.0244598388671875,
0.0731201171875,
0.003910064697265625,
0.0230865478515625,
0.0103759765625,
-0.055084228515625,
-0.043121337890625,
-0.02435302734375,
0.021453857421875,
0.050811767578125,
-0.07867431640625,
0.033233642578125,
-0.00974273681640625,
-0.05810546875,
-0.033721923828125,
-0.0124664306640625,
0.0445556640625,
0.0333251953125,
0.02276611328125,
-0.00809478759765625,
-0.029296875,
-0.0537109375,
-0.0166778564453125,
-0.00455474853515625,
0.01076507568359375,
0.0290069580078125,
0.04937744140625,
-0.0037708282470703125,
0.0755615234375,
-0.0638427734375,
-0.0010890960693359375,
-0.018951416015625,
-0.00742340087890625,
0.030029296875,
0.0277862548828125,
0.055419921875,
-0.05242919921875,
-0.05718994140625,
-0.01190185546875,
-0.0626220703125,
-0.00843048095703125,
-0.01385498046875,
-0.0181732177734375,
0.01537322998046875,
0.0172119140625,
-0.04962158203125,
0.038330078125,
0.0183258056640625,
-0.052490234375,
0.036834716796875,
0.01025390625,
-0.014678955078125,
-0.1021728515625,
0.0101776123046875,
0.019439697265625,
-0.0175628662109375,
-0.029693603515625,
0.018402099609375,
-0.00872802734375,
-0.0026493072509765625,
-0.028106689453125,
0.07281494140625,
-0.024932861328125,
0.004993438720703125,
-0.0012264251708984375,
0.0166168212890625,
-0.0004570484161376953,
0.053070068359375,
-0.01702880859375,
0.09710693359375,
0.0325927734375,
-0.02490234375,
0.0215606689453125,
0.024749755859375,
0.00927734375,
0.00850677490234375,
-0.061370849609375,
0.0206298828125,
0.0085601806640625,
0.016021728515625,
-0.043426513671875,
0.023681640625,
0.037445068359375,
-0.051544189453125,
-0.0013818740844726562,
0.0041961669921875,
-0.05419921875,
-0.0203704833984375,
-0.0345458984375,
0.01824951171875,
0.0478515625,
-0.030548095703125,
0.0791015625,
0.031463623046875,
0.00823974609375,
-0.05096435546875,
-0.0153045654296875,
-0.018035888671875,
-0.0103759765625,
-0.0706787109375,
0.01337432861328125,
-0.033660888671875,
-0.040435791015625,
0.008453369140625,
-0.0113372802734375,
-0.0011587142944335938,
0.0029754638671875,
0.01416778564453125,
0.0447998046875,
-0.025634765625,
0.00714874267578125,
0.00598907470703125,
-0.029052734375,
0.01132965087890625,
-0.0150146484375,
0.06195068359375,
-0.0650634765625,
-0.0164337158203125,
-0.02069091796875,
0.019134521484375,
0.049774169921875,
-0.0263824462890625,
0.05108642578125,
0.027313232421875,
-0.035003662109375,
-0.0034160614013671875,
-0.0411376953125,
-0.0244598388671875,
-0.0411376953125,
0.007350921630859375,
-0.04144287109375,
-0.0535888671875,
0.052978515625,
-0.0026988983154296875,
-0.0048370361328125,
0.0640869140625,
0.02838134765625,
0.02569580078125,
0.08624267578125,
0.05072021484375,
-0.0059356689453125,
0.031829833984375,
-0.0576171875,
0.017547607421875,
-0.0633544921875,
-0.0186004638671875,
-0.03765869140625,
-0.0015287399291992188,
-0.031707763671875,
-0.0174713134765625,
0.0465087890625,
0.00433349609375,
-0.036376953125,
0.043060302734375,
-0.052215576171875,
0.035125732421875,
0.0509033203125,
0.00909423828125,
0.017822265625,
-0.00858306884765625,
-0.0108489990234375,
-0.00445556640625,
-0.043609619140625,
-0.040924072265625,
0.0667724609375,
0.0215911865234375,
0.044677734375,
0.028656005859375,
0.02838134765625,
0.006671905517578125,
0.016754150390625,
-0.044403076171875,
0.057220458984375,
0.0068817138671875,
-0.0257720947265625,
-0.0230560302734375,
-0.03570556640625,
-0.07037353515625,
0.01525115966796875,
0.0121002197265625,
-0.06524658203125,
0.0179290771484375,
-0.006805419921875,
-0.029296875,
0.0302734375,
-0.053375244140625,
0.054901123046875,
-0.0005702972412109375,
-0.035400390625,
-0.02130126953125,
-0.0433349609375,
0.0264434814453125,
0.00479888916015625,
0.0018835067749023438,
0.0091094970703125,
0.00536346435546875,
0.06109619140625,
-0.067626953125,
0.04901123046875,
-0.01202392578125,
-0.004314422607421875,
0.03912353515625,
0.02587890625,
0.051544189453125,
0.025970458984375,
-0.00818634033203125,
0.01275634765625,
0.0112152099609375,
-0.0240478515625,
-0.0657958984375,
0.041015625,
-0.07080078125,
-0.05450439453125,
-0.07391357421875,
-0.05194091796875,
-0.011993408203125,
0.0277099609375,
0.00847625732421875,
0.03692626953125,
0.0421142578125,
0.0037326812744140625,
0.05230712890625,
-0.0012416839599609375,
0.0307464599609375,
0.050140380859375,
-0.0017833709716796875,
-0.03460693359375,
0.0723876953125,
0.018157958984375,
0.01751708984375,
0.022796630859375,
0.01555633544921875,
-0.0265960693359375,
-0.0187225341796875,
-0.03753662109375,
0.050994873046875,
-0.049591064453125,
-0.0340576171875,
-0.03643798828125,
-0.042327880859375,
-0.029083251953125,
-0.022064208984375,
-0.0306396484375,
-0.0301971435546875,
-0.034149169921875,
0.0165252685546875,
0.055023193359375,
0.04962158203125,
0.006740570068359375,
0.025421142578125,
-0.050994873046875,
0.01537322998046875,
0.01076507568359375,
0.0259552001953125,
0.003574371337890625,
-0.033782958984375,
0.00218963623046875,
-0.00016963481903076172,
-0.045166015625,
-0.0693359375,
0.05487060546875,
-0.0234527587890625,
0.037078857421875,
0.002498626708984375,
-0.0009627342224121094,
0.043853759765625,
-0.0037250518798828125,
0.045318603515625,
0.0126190185546875,
-0.0966796875,
0.039703369140625,
-0.02825927734375,
0.0152435302734375,
0.002429962158203125,
0.01096343994140625,
-0.02691650390625,
-0.0228729248046875,
-0.072265625,
-0.036407470703125,
0.083740234375,
0.0273284912109375,
-0.0183563232421875,
0.01203155517578125,
0.024566650390625,
0.00647735595703125,
0.006587982177734375,
-0.0576171875,
-0.0282745361328125,
-0.02630615234375,
-0.0156707763671875,
0.018402099609375,
0.0072021484375,
-0.0086669921875,
-0.017608642578125,
0.0760498046875,
-0.028106689453125,
0.039337158203125,
0.00864410400390625,
-0.01152801513671875,
-0.0009493827819824219,
0.004810333251953125,
0.07110595703125,
0.0650634765625,
-0.01432037353515625,
-0.01904296875,
0.003406524658203125,
-0.06610107421875,
0.007720947265625,
0.0104217529296875,
-0.02618408203125,
-0.00894927978515625,
0.0215911865234375,
0.0579833984375,
-0.01490020751953125,
-0.054779052734375,
0.030426025390625,
0.003520965576171875,
-0.01116943359375,
-0.0295562744140625,
0.004856109619140625,
0.0003476142883300781,
0.0289459228515625,
0.0164947509765625,
0.00844573974609375,
0.002460479736328125,
-0.0282745361328125,
-0.0008192062377929688,
0.03631591796875,
-0.0156097412109375,
-0.0243988037109375,
0.03814697265625,
-0.002269744873046875,
-0.042694091796875,
0.0477294921875,
-0.042083740234375,
-0.04461669921875,
0.0771484375,
0.058380126953125,
0.06915283203125,
-0.004383087158203125,
0.0229949951171875,
0.0294647216796875,
0.042083740234375,
0.00547027587890625,
0.050537109375,
0.0246734619140625,
-0.046630859375,
-0.02337646484375,
-0.0162506103515625,
-0.0310821533203125,
0.01456451416015625,
-0.036834716796875,
0.02239990234375,
-0.053955078125,
-0.00440216064453125,
-0.0013561248779296875,
0.0221710205078125,
-0.041290283203125,
-0.01070404052734375,
-0.04180908203125,
0.07476806640625,
-0.039947509765625,
0.05731201171875,
0.051544189453125,
-0.059967041015625,
-0.07952880859375,
-0.01678466796875,
0.013702392578125,
-0.07476806640625,
0.0284881591796875,
-0.0037364959716796875,
-0.031829833984375,
0.01222991943359375,
-0.05816650390625,
-0.06768798828125,
0.1005859375,
0.05450439453125,
-0.0162200927734375,
0.0015497207641601562,
0.0013437271118164062,
0.029205322265625,
-0.027862548828125,
0.0241546630859375,
0.00994873046875,
0.042388916015625,
0.00931549072265625,
-0.06707763671875,
0.040283203125,
-0.059173583984375,
-0.01204681396484375,
0.032440185546875,
-0.0804443359375,
0.07855224609375,
-0.00763702392578125,
-0.00522613525390625,
0.00830841064453125,
0.0340576171875,
0.043548583984375,
0.030303955078125,
0.03125,
0.042327880859375,
0.0413818359375,
-0.02618408203125,
0.06390380859375,
-0.00795745849609375,
0.03521728515625,
0.065673828125,
-0.0132904052734375,
0.031707763671875,
0.0177001953125,
-0.0310821533203125,
0.0513916015625,
0.03350830078125,
-0.0221099853515625,
0.0186004638671875,
-0.0010519027709960938,
0.003421783447265625,
-0.035430908203125,
0.007354736328125,
-0.037139892578125,
0.0109710693359375,
0.0298309326171875,
0.00677490234375,
-0.0191650390625,
-0.0038356781005859375,
0.0021533966064453125,
0.0153350830078125,
0.0004470348358154297,
0.0380859375,
0.01015472412109375,
-0.0408935546875,
0.0323486328125,
0.01001739501953125,
0.0333251953125,
-0.0609130859375,
-0.02398681640625,
-0.0090484619140625,
0.01097869873046875,
-0.0006513595581054688,
-0.058837890625,
0.0179443359375,
-0.0186767578125,
-0.00824737548828125,
0.0016145706176757812,
0.037353515625,
0.01079559326171875,
-0.03668212890625,
0.0199127197265625,
0.03619384765625,
0.0018720626831054688,
-0.0109405517578125,
-0.06134033203125,
-0.0022430419921875,
-0.007843017578125,
-0.01338958740234375,
0.0059967041015625,
0.020477294921875,
-0.03204345703125,
0.07122802734375,
0.0540771484375,
-0.01090240478515625,
0.00251007080078125,
-0.007415771484375,
0.0723876953125,
-0.0517578125,
-0.048065185546875,
-0.06201171875,
0.03887939453125,
-0.007366180419921875,
-0.029296875,
0.0555419921875,
0.046905517578125,
0.046142578125,
-0.0172271728515625,
0.03369140625,
0.0014314651489257812,
0.0263519287109375,
-0.0401611328125,
0.053802490234375,
-0.044921875,
0.03424072265625,
-0.0145721435546875,
-0.051666259765625,
-0.022979736328125,
0.037139892578125,
-0.0194091796875,
0.0207366943359375,
0.06781005859375,
0.05291748046875,
-0.008209228515625,
0.00104522705078125,
-0.0083465576171875,
0.01425933837890625,
0.04522705078125,
0.058441162109375,
0.04315185546875,
-0.046905517578125,
0.030975341796875,
-0.02008056640625,
-0.00824737548828125,
-0.01010894775390625,
-0.04193115234375,
-0.064697265625,
-0.05419921875,
-0.019775390625,
-0.0567626953125,
-0.015777587890625,
0.0726318359375,
0.048431396484375,
-0.03692626953125,
-0.0152435302734375,
-0.02056884765625,
0.037628173828125,
-0.0260162353515625,
-0.0234527587890625,
0.045196533203125,
0.0035076141357421875,
-0.048553466796875,
0.0193023681640625,
0.0230560302734375,
0.0077056884765625,
-0.022796630859375,
-0.03399658203125,
-0.019287109375,
0.031646728515625,
0.0322265625,
0.03350830078125,
-0.0758056640625,
-0.005645751953125,
0.044403076171875,
-0.0003879070281982422,
0.01123809814453125,
0.0487060546875,
-0.0750732421875,
0.0275421142578125,
0.0345458984375,
0.0301666259765625,
0.034881591796875,
-0.03216552734375,
0.038330078125,
-0.022430419921875,
0.0090484619140625,
0.0301666259765625,
0.0288543701171875,
-0.016357421875,
-0.043914794921875,
0.0704345703125,
0.03546142578125,
-0.0095367431640625,
-0.0872802734375,
0.004978179931640625,
-0.10711669921875,
-0.0146636962890625,
0.066650390625,
-0.0028133392333984375,
-0.016693115234375,
0.003787994384765625,
-0.0170440673828125,
0.0118408203125,
-0.050537109375,
0.0517578125,
0.045440673828125,
-0.0049285888671875,
-0.005279541015625,
-0.031494140625,
0.011993408203125,
0.021636962890625,
-0.09332275390625,
-0.017120361328125,
0.01763916015625,
0.017059326171875,
0.04486083984375,
0.0452880859375,
-0.0246429443359375,
0.051971435546875,
-0.003894805908203125,
-0.001697540283203125,
-0.0465087890625,
-0.02752685546875,
-0.0267791748046875,
-0.00249481201171875,
-0.024688720703125,
-0.006847381591796875
]
] |
facebook/wav2vec2-conformer-rope-large-960h-ft | 2023-03-21T10:48:52.000Z | [
"transformers",
"pytorch",
"safetensors",
"wav2vec2-conformer",
"automatic-speech-recognition",
"speech",
"audio",
"hf-asr-leaderboard",
"en",
"dataset:librispeech_asr",
"arxiv:2010.05171",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] | automatic-speech-recognition | facebook | null | null | facebook/wav2vec2-conformer-rope-large-960h-ft | 6 | 6,805 | transformers | 2022-04-18T09:48:39 | ---
language: en
datasets:
- librispeech_asr
tags:
- speech
- audio
- automatic-speech-recognition
- hf-asr-leaderboard
license: apache-2.0
model-index:
- name: wav2vec2-conformer-rope-large-960h-ft
results:
- task:
name: Automatic Speech Recognition
type: automatic-speech-recognition
dataset:
name: LibriSpeech (clean)
type: librispeech_asr
config: clean
split: test
args:
language: en
metrics:
- name: Test WER
type: wer
value: 1.96
- task:
name: Automatic Speech Recognition
type: automatic-speech-recognition
dataset:
name: LibriSpeech (other)
type: librispeech_asr
config: other
split: test
args:
language: en
metrics:
- name: Test WER
type: wer
value: 3.98
---
# Wav2Vec2-Conformer-Large-960h with Rotary Position Embeddings
Wav2Vec2 Conformer with rotary position embeddings, pretrained and **fine-tuned on 960 hours of LibriSpeech** on 16 kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16 kHz.
**Paper**: [fairseq S2T: Fast Speech-to-Text Modeling with fairseq](https://arxiv.org/abs/2010.05171)
**Authors**: Changhan Wang, Yun Tang, Xutai Ma, Anne Wu, Sravya Popuri, Dmytro Okhonko, Juan Pino
The results of Wav2Vec2-Conformer can be found in Table 3 and Table 4 of the [official paper](https://arxiv.org/abs/2010.05171).
The original model can be found at https://github.com/pytorch/fairseq/tree/master/examples/wav2vec#wav2vec-20.
# Usage
To transcribe audio files, the model can be used as a standalone acoustic model as follows:
```python
from transformers import Wav2Vec2Processor, Wav2Vec2ConformerForCTC
from datasets import load_dataset
import torch
# load model and processor
processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-conformer-rope-large-960h-ft")
model = Wav2Vec2ConformerForCTC.from_pretrained("facebook/wav2vec2-conformer-rope-large-960h-ft")
# load dummy dataset and read sound files
ds = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
# tokenize
input_values = processor(ds[0]["audio"]["array"], return_tensors="pt", padding="longest").input_values
# retrieve logits
logits = model(input_values).logits
# take argmax and decode
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)
```
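Since the model expects 16 kHz input, audio recorded at another sample rate must be resampled first. A minimal numpy-only sketch using linear interpolation for mono float audio (an assumption for illustration; in practice a proper polyphase resampler such as `torchaudio.functional.resample` or `librosa.resample` is preferable):

```python
import numpy as np

def resample_to_16k(audio: np.ndarray, orig_sr: int, target_sr: int = 16_000) -> np.ndarray:
    """Resample a 1-D mono waveform via linear interpolation (sketch only)."""
    if orig_sr == target_sr:
        return audio
    duration = len(audio) / orig_sr
    n_target = int(round(duration * target_sr))
    old_t = np.linspace(0.0, duration, num=len(audio), endpoint=False)
    new_t = np.linspace(0.0, duration, num=n_target, endpoint=False)
    return np.interp(new_t, old_t, audio)

# example: 1 second of 44.1 kHz audio becomes 16,000 samples
wave_44k = np.sin(2 * np.pi * 440 * np.arange(44_100) / 44_100)
wave_16k = resample_to_16k(wave_44k, 44_100)
print(len(wave_16k))  # 16000
```

The resampled array can then be passed to the processor exactly as in the snippet above.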
## Evaluation
This code snippet shows how to evaluate **facebook/wav2vec2-conformer-rope-large-960h-ft** on LibriSpeech's "clean" and "other" test data.
```python
from datasets import load_dataset
from transformers import Wav2Vec2ConformerForCTC, Wav2Vec2Processor
import torch
from jiwer import wer
librispeech_eval = load_dataset("librispeech_asr", "clean", split="test")
model = Wav2Vec2ConformerForCTC.from_pretrained("facebook/wav2vec2-conformer-rope-large-960h-ft").to("cuda")
processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-conformer-rope-large-960h-ft")
def map_to_pred(batch):
inputs = processor(batch["audio"]["array"], return_tensors="pt", padding="longest")
input_values = inputs.input_values.to("cuda")
attention_mask = inputs.attention_mask.to("cuda")
with torch.no_grad():
logits = model(input_values, attention_mask=attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)
batch["transcription"] = transcription
return batch
result = librispeech_eval.map(map_to_pred, remove_columns=["audio"])
print("WER:", wer(result["text"], result["transcription"]))
```
*Result (WER)*:
| "clean" | "other" |
|---|---|
| 1.96 | 3.98 | | 3,748 | [
[
-0.0154571533203125,
-0.050537109375,
0.0180511474609375,
0.0225677490234375,
-0.0143280029296875,
-0.0264434814453125,
-0.035980224609375,
-0.03515625,
-0.004497528076171875,
0.0210723876953125,
-0.0484619140625,
-0.044830322265625,
-0.05218505859375,
-0.026702880859375,
-0.0280914306640625,
0.06640625,
0.007564544677734375,
-0.0008273124694824219,
0.0094757080078125,
-0.00550079345703125,
-0.0274505615234375,
-0.00922393798828125,
-0.0570068359375,
-0.0221099853515625,
0.0074920654296875,
0.019927978515625,
0.016937255859375,
0.0183563232421875,
0.027191162109375,
0.025848388671875,
-0.01319122314453125,
-0.01043701171875,
-0.045166015625,
-0.0017747879028320312,
0.008453369140625,
-0.028045654296875,
-0.0310516357421875,
0.01641845703125,
0.04833984375,
0.01428985595703125,
-0.01358795166015625,
0.0469970703125,
0.002399444580078125,
0.0360107421875,
-0.021942138671875,
0.0199127197265625,
-0.042816162109375,
-0.021392822265625,
-0.0084075927734375,
0.0011749267578125,
-0.039703369140625,
-0.0221099853515625,
0.01203155517578125,
-0.043304443359375,
0.0169830322265625,
-0.014404296875,
0.07177734375,
0.01617431640625,
-0.0263519287109375,
-0.032012939453125,
-0.056915283203125,
0.05889892578125,
-0.0582275390625,
0.058837890625,
0.0291900634765625,
0.0143585205078125,
-0.01053619384765625,
-0.08697509765625,
-0.037933349609375,
0.0004849433898925781,
0.017852783203125,
0.0325927734375,
-0.017547607421875,
0.0009236335754394531,
0.03173828125,
0.0142059326171875,
-0.053131103515625,
-0.00801849365234375,
-0.058380126953125,
-0.03094482421875,
0.06439208984375,
-0.0262451171875,
-0.002162933349609375,
0.0026836395263671875,
-0.0271759033203125,
-0.040191650390625,
-0.0169219970703125,
0.036407470703125,
0.03338623046875,
0.003078460693359375,
-0.031951904296875,
0.0386962890625,
0.0008864402770996094,
0.047515869140625,
0.0192413330078125,
-0.034454345703125,
0.048919677734375,
-0.01236724853515625,
-0.01351165771484375,
0.0263824462890625,
0.068359375,
0.023284912109375,
0.01027679443359375,
0.01715087890625,
-0.0203857421875,
0.002368927001953125,
-0.012725830078125,
-0.07000732421875,
-0.028900146484375,
0.04205322265625,
-0.03582763671875,
0.004638671875,
0.0126190185546875,
-0.0283355712890625,
0.001617431640625,
-0.0166015625,
0.07293701171875,
-0.039215087890625,
-0.022216796875,
0.022064208984375,
-0.032470703125,
0.0266571044921875,
-0.008544921875,
-0.05419921875,
0.01267242431640625,
0.037750244140625,
0.0576171875,
0.006740570068359375,
-0.006103515625,
-0.038421630859375,
0.005939483642578125,
-0.0010852813720703125,
0.033111572265625,
-0.0007677078247070312,
-0.042449951171875,
-0.01611328125,
0.0081024169921875,
-0.003940582275390625,
-0.036468505859375,
0.06298828125,
-0.0338134765625,
0.0220794677734375,
-0.006015777587890625,
-0.044830322265625,
-0.0251617431640625,
-0.0364990234375,
-0.042022705078125,
0.09259033203125,
0.0163726806640625,
-0.04632568359375,
0.01123809814453125,
-0.0286102294921875,
-0.04559326171875,
-0.0360107421875,
-0.0019245147705078125,
-0.054901123046875,
0.0018110275268554688,
0.0091552734375,
0.02508544921875,
-0.0204925537109375,
0.00046515464782714844,
-0.008941650390625,
-0.048614501953125,
0.03759765625,
-0.037811279296875,
0.0858154296875,
0.016815185546875,
-0.049468994140625,
0.0139617919921875,
-0.0635986328125,
0.01279449462890625,
0.00024962425231933594,
-0.033416748046875,
0.01690673828125,
-0.00975799560546875,
0.04730224609375,
0.025665283203125,
0.0131072998046875,
-0.0426025390625,
-0.00872039794921875,
-0.048675537109375,
0.03875732421875,
0.050384521484375,
-0.0055389404296875,
0.0164031982421875,
-0.0360107421875,
0.003101348876953125,
-0.01523590087890625,
0.003040313720703125,
0.0198822021484375,
-0.04364013671875,
-0.0462646484375,
-0.03076171875,
0.0163726806640625,
0.039947509765625,
-0.0149993896484375,
0.05291748046875,
-0.013153076171875,
-0.06988525390625,
-0.07073974609375,
-0.009002685546875,
0.018035888671875,
0.033447265625,
0.047943115234375,
-0.0169219970703125,
-0.046630859375,
-0.05914306640625,
-0.0164794921875,
-0.0026874542236328125,
-0.0206756591796875,
0.01507568359375,
0.0227508544921875,
-0.0291900634765625,
0.049072265625,
-0.02734375,
-0.034088134765625,
-0.003627777099609375,
0.004611968994140625,
0.050750732421875,
0.05389404296875,
0.025634765625,
-0.043670654296875,
-0.031707763671875,
-0.02972412109375,
-0.02838134765625,
-0.00943756103515625,
-0.01025390625,
-0.00624847412109375,
0.01505279541015625,
0.02752685546875,
-0.043304443359375,
0.033355712890625,
0.04052734375,
-0.022216796875,
0.038055419921875,
0.00324249267578125,
0.0116424560546875,
-0.0712890625,
0.0078277587890625,
-0.0012407302856445312,
-0.01531982421875,
-0.03533935546875,
-0.037139892578125,
-0.003238677978515625,
0.0034618377685546875,
-0.041717529296875,
0.0265655517578125,
-0.033935546875,
-0.01258087158203125,
-0.01300811767578125,
0.0217437744140625,
-0.0095367431640625,
0.03582763671875,
0.006282806396484375,
0.052337646484375,
0.046234130859375,
-0.040985107421875,
0.036956787109375,
0.0269012451171875,
-0.042816162109375,
0.00336456298828125,
-0.0733642578125,
0.0284576416015625,
0.0152130126953125,
0.026031494140625,
-0.09698486328125,
-0.0031986236572265625,
0.0027313232421875,
-0.068115234375,
0.0218505859375,
-0.00307464599609375,
-0.030517578125,
-0.0280914306640625,
-0.0243072509765625,
0.039276123046875,
0.06610107421875,
-0.036468505859375,
0.038360595703125,
0.028656005859375,
0.0098724365234375,
-0.03240966796875,
-0.0697021484375,
-0.047088623046875,
-0.00231170654296875,
-0.06744384765625,
0.0333251953125,
-0.00452423095703125,
0.003749847412109375,
-0.01629638671875,
-0.03387451171875,
0.011077880859375,
-0.009490966796875,
0.036956787109375,
0.02081298828125,
-0.00994110107421875,
-0.0031585693359375,
0.00118255615234375,
-0.01873779296875,
0.02008056640625,
-0.041351318359375,
0.048004150390625,
-0.0129241943359375,
-0.01934814453125,
-0.07354736328125,
-0.00499725341796875,
0.02691650390625,
-0.022216796875,
0.037261962890625,
0.08990478515625,
-0.025299072265625,
-0.005138397216796875,
-0.045501708984375,
-0.0246124267578125,
-0.042266845703125,
0.049957275390625,
-0.0206298828125,
-0.045928955078125,
0.028839111328125,
0.00672149658203125,
0.00923919677734375,
0.05426025390625,
0.058013916015625,
-0.0192108154296875,
0.058807373046875,
0.01146697998046875,
0.00930023193359375,
0.036102294921875,
-0.061187744140625,
0.005405426025390625,
-0.0618896484375,
-0.0248260498046875,
-0.0310516357421875,
-0.03387451171875,
-0.04791259765625,
-0.042083740234375,
0.03570556640625,
0.007297515869140625,
-0.0263824462890625,
0.03192138671875,
-0.0560302734375,
0.00817108154296875,
0.0509033203125,
0.012847900390625,
-0.0064697265625,
0.0094757080078125,
0.0052490234375,
-0.004825592041015625,
-0.0264129638671875,
-0.0182037353515625,
0.09002685546875,
0.03717041015625,
0.0518798828125,
0.003047943115234375,
0.0460205078125,
0.000007450580596923828,
-0.00928497314453125,
-0.06817626953125,
0.02777099609375,
-0.005039215087890625,
-0.053314208984375,
-0.0212860107421875,
-0.003681182861328125,
-0.053558349609375,
0.0057373046875,
-0.024017333984375,
-0.06585693359375,
0.01372528076171875,
-0.004138946533203125,
-0.01413726806640625,
0.0092926025390625,
-0.04290771484375,
0.050048828125,
-0.000995635986328125,
-0.023956298828125,
-0.0139923095703125,
-0.054595947265625,
0.0157928466796875,
0.006103515625,
0.015899658203125,
-0.012969970703125,
0.0236358642578125,
0.09637451171875,
-0.01126861572265625,
0.03289794921875,
-0.026641845703125,
-0.01515960693359375,
0.054046630859375,
-0.01219940185546875,
0.037628173828125,
0.01422119140625,
-0.0281219482421875,
0.0244140625,
0.0198516845703125,
-0.01413726806640625,
-0.0183258056640625,
0.049224853515625,
-0.078125,
-0.027496337890625,
-0.017852783203125,
-0.038330078125,
-0.0126495361328125,
0.0074462890625,
0.06378173828125,
0.048248291015625,
0.0015726089477539062,
0.0302734375,
0.043792724609375,
0.0009889602661132812,
0.03997802734375,
0.0158843994140625,
-0.00690460205078125,
-0.0457763671875,
0.06256103515625,
0.01535797119140625,
0.0225372314453125,
0.0011386871337890625,
0.004512786865234375,
-0.046356201171875,
-0.030548095703125,
-0.00782012939453125,
0.0272216796875,
-0.046142578125,
-0.0085601806640625,
-0.038421630859375,
-0.02484130859375,
-0.05718994140625,
0.0002722740173339844,
-0.050201416015625,
-0.038238525390625,
-0.0251617431640625,
0.00646209716796875,
0.037261962890625,
0.0345458984375,
-0.022857666015625,
0.03839111328125,
-0.048828125,
0.03515625,
0.01277923583984375,
0.0105743408203125,
-0.0107269287109375,
-0.08184814453125,
-0.0238494873046875,
0.0200653076171875,
-0.0198211669921875,
-0.07110595703125,
0.0110931396484375,
0.00943756103515625,
0.03509521484375,
0.0254058837890625,
0.00069427490234375,
0.04541015625,
-0.0267791748046875,
0.052642822265625,
0.0194549560546875,
-0.08660888671875,
0.048980712890625,
-0.01328277587890625,
0.01049041748046875,
0.036285400390625,
0.0174560546875,
-0.03985595703125,
-0.0178985595703125,
-0.045166015625,
-0.0750732421875,
0.07666015625,
0.03485107421875,
-0.008819580078125,
0.025665283203125,
0.0198211669921875,
-0.01043701171875,
-0.006500244140625,
-0.056915283203125,
-0.046875,
-0.02545166015625,
-0.0247344970703125,
-0.0178375244140625,
-0.01995849609375,
-0.012786865234375,
-0.039215087890625,
0.08111572265625,
0.0231781005859375,
0.04925537109375,
0.037811279296875,
-0.0096435546875,
-0.00496673583984375,
0.005779266357421875,
0.0379638671875,
0.023223876953125,
-0.02978515625,
-0.0020580291748046875,
0.0225982666015625,
-0.052764892578125,
0.0202789306640625,
0.0190887451171875,
0.0017147064208984375,
0.01422119140625,
0.039703369140625,
0.07098388671875,
-0.006397247314453125,
-0.0247802734375,
0.047515869140625,
0.0002231597900390625,
-0.0302886962890625,
-0.046783447265625,
0.0115509033203125,
0.0408935546875,
0.0270538330078125,
0.0285491943359375,
0.0011415481567382812,
0.01160430908203125,
-0.019500732421875,
0.0279541015625,
0.0171356201171875,
-0.0435791015625,
-0.02239990234375,
0.076904296875,
0.0025310516357421875,
-0.0262603759765625,
0.049652099609375,
0.00042510032653808594,
-0.0172119140625,
0.05059814453125,
0.052764892578125,
0.06787109375,
-0.026214599609375,
-0.0230560302734375,
0.0467529296875,
0.0188751220703125,
-0.005687713623046875,
0.03948974609375,
0.00043010711669921875,
-0.032806396484375,
-0.0171051025390625,
-0.0457763671875,
0.01189422607421875,
0.0147705078125,
-0.06658935546875,
0.039276123046875,
-0.0219573974609375,
-0.0325927734375,
0.0124969482421875,
0.00830841064453125,
-0.053497314453125,
0.0229339599609375,
0.01087188720703125,
0.06658935546875,
-0.057586669921875,
0.0838623046875,
0.0236358642578125,
-0.023651123046875,
-0.08453369140625,
0.005847930908203125,
-0.001537322998046875,
-0.049835205078125,
0.042144775390625,
0.03070068359375,
-0.0277099609375,
0.0175323486328125,
-0.0367431640625,
-0.06756591796875,
0.09564208984375,
0.0174407958984375,
-0.05682373046875,
0.0137786865234375,
-0.0034542083740234375,
0.0251007080078125,
-0.00514984130859375,
0.0234527587890625,
0.060211181640625,
0.04443359375,
0.0027217864990234375,
-0.0740966796875,
0.0059814453125,
-0.0034351348876953125,
-0.0126190185546875,
-0.01141357421875,
-0.057952880859375,
0.070068359375,
-0.0246429443359375,
-0.0130615234375,
0.0018243789672851562,
0.07635498046875,
0.0310821533203125,
0.02227783203125,
0.041412353515625,
0.033050537109375,
0.055877685546875,
-0.013824462890625,
0.05059814453125,
-0.017578125,
0.05096435546875,
0.07843017578125,
0.00482940673828125,
0.06842041015625,
0.0305328369140625,
-0.027008056640625,
0.032989501953125,
0.03753662109375,
-0.0098876953125,
0.05035400390625,
0.018768310546875,
-0.019927978515625,
-0.00455474853515625,
0.018341064453125,
-0.056915283203125,
0.057861328125,
0.0254974365234375,
-0.017333984375,
0.024017333984375,
0.0010051727294921875,
-0.00798797607421875,
-0.01556396484375,
-0.0004677772521972656,
0.04534912109375,
0.006320953369140625,
-0.0232391357421875,
0.071044921875,
0.0135040283203125,
0.064697265625,
-0.03955078125,
0.006313323974609375,
0.020782470703125,
0.022003173828125,
-0.023101806640625,
-0.0533447265625,
0.0175018310546875,
-0.0235748291015625,
-0.0201416015625,
0.0056610107421875,
0.038177490234375,
-0.053955078125,
-0.039947509765625,
0.042022705078125,
0.00240325927734375,
0.01403045654296875,
-0.0004093647003173828,
-0.049102783203125,
0.0252532958984375,
0.0258026123046875,
-0.0433349609375,
-0.00934600830078125,
0.01155853271484375,
0.0281219482421875,
0.0206146240234375,
0.049041748046875,
0.0011148452758789062,
0.007579803466796875,
-0.0024433135986328125,
0.04559326171875,
-0.044830322265625,
-0.033660888671875,
-0.0423583984375,
0.040618896484375,
0.007556915283203125,
-0.023590087890625,
0.04498291015625,
0.06842041015625,
0.06695556640625,
-0.01078033447265625,
0.0543212890625,
-0.0014019012451171875,
0.037841796875,
-0.047576904296875,
0.06719970703125,
-0.04400634765625,
0.013671875,
-0.0166168212890625,
-0.063232421875,
0.00849151611328125,
0.06903076171875,
-0.011993408203125,
0.018951416015625,
0.03875732421875,
0.075439453125,
-0.007625579833984375,
-0.00412750244140625,
0.006580352783203125,
0.0292510986328125,
0.0255126953125,
0.05291748046875,
0.030487060546875,
-0.07952880859375,
0.053497314453125,
-0.035888671875,
-0.01064300537109375,
-0.015472412109375,
-0.02508544921875,
-0.055206298828125,
-0.05950927734375,
-0.0276947021484375,
-0.060211181640625,
-0.006595611572265625,
0.07464599609375,
0.05078125,
-0.06610107421875,
-0.03704833984375,
0.0205535888671875,
-0.004245758056640625,
-0.0280914306640625,
-0.0186614990234375,
0.05584716796875,
0.004116058349609375,
-0.066650390625,
0.040313720703125,
-0.01033782958984375,
0.008575439453125,
0.0121002197265625,
-0.01541900634765625,
-0.028656005859375,
0.004169464111328125,
0.02313232421875,
0.0056610107421875,
-0.057373046875,
-0.0249481201171875,
-0.007232666015625,
-0.0189971923828125,
0.006137847900390625,
0.026885986328125,
-0.039337158203125,
0.0286102294921875,
0.042633056640625,
0.00982666015625,
0.07806396484375,
-0.018157958984375,
0.0135650634765625,
-0.049102783203125,
0.038665771484375,
0.010467529296875,
0.0213470458984375,
0.006923675537109375,
-0.0201568603515625,
0.018096923828125,
0.0262603759765625,
-0.052032470703125,
-0.050262451171875,
-0.0091552734375,
-0.10906982421875,
-0.01531982421875,
0.10528564453125,
0.00771331787109375,
-0.018310546875,
0.006099700927734375,
-0.034088134765625,
0.07159423828125,
-0.032989501953125,
0.031707763671875,
0.028472900390625,
-0.01171875,
0.018341064453125,
-0.047637939453125,
0.05010986328125,
0.0231170654296875,
-0.02239990234375,
-0.006317138671875,
0.025634765625,
0.054595947265625,
0.003917694091796875,
0.060882568359375,
-0.01251220703125,
0.035675048828125,
0.0200347900390625,
0.0265655517578125,
-0.01525115966796875,
-0.0038967132568359375,
-0.0303955078125,
-0.00018143653869628906,
-0.00356292724609375,
-0.03759765625
]
] |
georgesung/llama2_7b_chat_uncensored | 2023-07-22T15:55:50.000Z | [
"transformers",
"pytorch",
"tensorboard",
"llama",
"text-generation",
"dataset:ehartford/wizard_vicuna_70k_unfiltered",
"license:other",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | georgesung | null | null | georgesung/llama2_7b_chat_uncensored | 163 | 6,786 | transformers | 2023-07-20T10:45:03 | ---
license: other
datasets:
- ehartford/wizard_vicuna_70k_unfiltered
---
# Overview
Fine-tuned [Llama-2 7B](https://huggingface.co/TheBloke/Llama-2-7B-fp16) with an uncensored/unfiltered Wizard-Vicuna conversation dataset [ehartford/wizard_vicuna_70k_unfiltered](https://huggingface.co/datasets/ehartford/wizard_vicuna_70k_unfiltered).
Fine-tuned with QLoRA. Training ran for one epoch on a 24 GB GPU (NVIDIA A10G) instance and took ~19 hours.
This repository hosts the fp16 Hugging Face model.
## GGML & GPTQ versions
Thanks to [TheBloke](https://huggingface.co/TheBloke) for creating the GGML and GPTQ versions:
* https://huggingface.co/TheBloke/llama2_7b_chat_uncensored-GGML
* https://huggingface.co/TheBloke/llama2_7b_chat_uncensored-GPTQ
# Prompt style
The model was trained with the following prompt style:
```
### HUMAN:
Hello
### RESPONSE:
Hi, how are you?
### HUMAN:
I'm fine.
### RESPONSE:
How can I help you?
...
```
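As a sketch, a conversation in this format can be assembled programmatically. The helper below is an illustrative assumption, not part of the released training code:

```python
def build_prompt(turns):
    """Render alternating human/assistant turns in the model's prompt style.

    `turns` is a list of strings, alternating human and assistant messages,
    starting with the human. The final "### RESPONSE:" tag is left open so
    the model continues generating from there.
    """
    tags = ["### HUMAN:", "### RESPONSE:"]
    parts = []
    for i, text in enumerate(turns):
        parts.append(f"{tags[i % 2]}\n{text}\n")
    parts.append("### RESPONSE:\n")
    return "\n".join(parts)

prompt = build_prompt(["Hello", "Hi, how are you?", "I'm fine."])
print(prompt)
```

The string returned by `build_prompt` can be passed directly to the tokenizer; the model's reply ends at the next `### HUMAN:` tag, which is a natural stop sequence.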
# Training code
Code used to train the model is available [here](https://github.com/georgesung/llm_qlora).
To reproduce the results:
```
git clone https://github.com/georgesung/llm_qlora
cd llm_qlora
pip install -r requirements.txt
python train.py configs/llama2_7b_chat_uncensored.yaml
```
# Fine-tuning guide
https://georgesung.github.io/ai/qlora-ift/
| 1,299 | [
[
-0.03521728515625,
-0.06475830078125,
0.0250244140625,
0.0281219482421875,
-0.058685302734375,
-0.004566192626953125,
-0.013458251953125,
-0.060333251953125,
0.0016841888427734375,
0.025421142578125,
-0.057891845703125,
-0.036163330078125,
-0.036834716796875,
0.002414703369140625,
-0.0011243820190429688,
0.09234619140625,
0.01386260986328125,
0.00960540771484375,
0.0148468017578125,
-0.0207977294921875,
-0.052581787109375,
-0.027923583984375,
-0.072998046875,
-0.032745361328125,
0.040985107421875,
0.0217742919921875,
0.0615234375,
0.0341796875,
0.0105438232421875,
0.021087646484375,
-0.0287017822265625,
0.0343017578125,
-0.060546875,
-0.0109100341796875,
-0.005279541015625,
-0.020050048828125,
-0.04644775390625,
-0.003719329833984375,
0.038116455078125,
-0.003787994384765625,
-0.016265869140625,
0.02850341796875,
0.0138397216796875,
0.039398193359375,
-0.020172119140625,
0.0269012451171875,
-0.049072265625,
-0.0080718994140625,
-0.0054931640625,
-0.01276397705078125,
0.002307891845703125,
-0.0132904052734375,
-0.002620697021484375,
-0.0560302734375,
0.002368927001953125,
0.0166473388671875,
0.0804443359375,
0.033172607421875,
-0.03118896484375,
-0.00620269775390625,
-0.0270843505859375,
0.045745849609375,
-0.06207275390625,
0.01174163818359375,
0.04205322265625,
0.03497314453125,
-0.032257080078125,
-0.048736572265625,
-0.050201416015625,
-0.01983642578125,
-0.005535125732421875,
-0.007808685302734375,
-0.0195159912109375,
0.0180206298828125,
0.030731201171875,
0.0289459228515625,
-0.0445556640625,
0.0242462158203125,
-0.04302978515625,
-0.0220794677734375,
0.055938720703125,
0.01380157470703125,
0.014251708984375,
-0.004810333251953125,
-0.026214599609375,
-0.01477813720703125,
-0.039947509765625,
0.005565643310546875,
0.03668212890625,
0.01544189453125,
-0.038970947265625,
0.04119873046875,
-0.026947021484375,
0.038970947265625,
0.01715087890625,
-0.0247039794921875,
0.0185089111328125,
-0.025604248046875,
-0.018646240234375,
-0.00737762451171875,
0.07171630859375,
0.03985595703125,
0.0194244384765625,
0.0022068023681640625,
-0.0107269287109375,
0.01375579833984375,
0.01187896728515625,
-0.078125,
-0.03741455078125,
0.03387451171875,
-0.0222625732421875,
-0.0212860107421875,
-0.0194244384765625,
-0.02374267578125,
-0.030364990234375,
-0.02777099609375,
0.02703857421875,
-0.030059814453125,
-0.0257110595703125,
0.0089111328125,
-0.0002396106719970703,
0.040771484375,
0.038238525390625,
-0.0643310546875,
0.0035877227783203125,
0.048095703125,
0.0623779296875,
0.0035400390625,
-0.002239227294921875,
-0.033050537109375,
-0.0211944580078125,
-0.0137176513671875,
0.0579833984375,
-0.0091705322265625,
-0.0186309814453125,
-0.01485443115234375,
0.014495849609375,
-0.0008740425109863281,
-0.039398193359375,
0.042022705078125,
-0.041839599609375,
0.02935791015625,
-0.0112762451171875,
-0.0286712646484375,
-0.0211944580078125,
0.02069091796875,
-0.0310516357421875,
0.07940673828125,
0.024261474609375,
-0.06268310546875,
0.006134033203125,
-0.0335693359375,
0.007080078125,
0.0005612373352050781,
0.0018320083618164062,
-0.04437255859375,
-0.0143890380859375,
0.022491455078125,
0.0221405029296875,
-0.0246124267578125,
0.01800537109375,
-0.029083251953125,
-0.0478515625,
0.017852783203125,
-0.025848388671875,
0.06756591796875,
0.01383209228515625,
-0.038330078125,
0.0245819091796875,
-0.0628662109375,
0.00769805908203125,
0.01107025146484375,
-0.0263519287109375,
0.004070281982421875,
-0.0300445556640625,
0.0177764892578125,
0.014373779296875,
0.022247314453125,
-0.036590576171875,
0.027801513671875,
-0.0151824951171875,
0.0288848876953125,
0.0509033203125,
0.003387451171875,
0.01300811767578125,
-0.033721923828125,
0.03887939453125,
0.0003573894500732422,
0.04522705078125,
0.019439697265625,
-0.05487060546875,
-0.053497314453125,
-0.0273895263671875,
0.00843048095703125,
0.04913330078125,
-0.06512451171875,
0.027587890625,
0.0102386474609375,
-0.053680419921875,
-0.05914306640625,
0.0250244140625,
0.049407958984375,
0.0296783447265625,
0.03289794921875,
-0.038482666015625,
-0.03265380859375,
-0.06610107421875,
0.003879547119140625,
-0.021820068359375,
-0.016510009765625,
0.0276947021484375,
0.0169830322265625,
-0.034393310546875,
0.05059814453125,
-0.01617431640625,
-0.0251007080078125,
-0.0041351318359375,
-0.01108551025390625,
0.01715087890625,
0.033660888671875,
0.05615234375,
-0.043792724609375,
-0.02783203125,
-0.0007691383361816406,
-0.0528564453125,
-0.01459503173828125,
0.00348663330078125,
-0.042724609375,
0.012451171875,
0.022705078125,
-0.05035400390625,
0.03424072265625,
0.0458984375,
-0.041412353515625,
0.044921875,
-0.00994873046875,
0.00395965576171875,
-0.07611083984375,
0.001911163330078125,
-0.003307342529296875,
-0.0222625732421875,
-0.03662109375,
-0.003993988037109375,
-0.0022678375244140625,
0.01313018798828125,
-0.05218505859375,
0.051971435546875,
-0.049072265625,
0.00278472900390625,
-0.03515625,
0.0062255859375,
-0.007518768310546875,
0.058624267578125,
-0.00785064697265625,
0.068603515625,
0.03704833984375,
-0.04156494140625,
0.026885986328125,
0.0509033203125,
-0.01238250732421875,
0.032318115234375,
-0.07647705078125,
0.03863525390625,
0.0006194114685058594,
0.0247039794921875,
-0.0714111328125,
-0.01464080810546875,
0.06280517578125,
-0.032379150390625,
0.0226593017578125,
-0.0267181396484375,
-0.03546142578125,
-0.01255035400390625,
-0.0288238525390625,
0.0261383056640625,
0.0411376953125,
-0.06097412109375,
0.0170440673828125,
0.02081298828125,
0.0109710693359375,
-0.038726806640625,
-0.0302734375,
0.006805419921875,
-0.035614013671875,
-0.040863037109375,
0.01861572265625,
-0.01531982421875,
-0.007236480712890625,
-0.0232391357421875,
0.00811767578125,
-0.0007443428039550781,
0.0101470947265625,
0.031982421875,
0.0203094482421875,
-0.004802703857421875,
-0.006946563720703125,
0.0095977783203125,
0.00019633769989013672,
-0.0020389556884765625,
0.005084991455078125,
0.0479736328125,
-0.022491455078125,
-0.004119873046875,
-0.062042236328125,
0.005725860595703125,
0.020294189453125,
0.0038909912109375,
0.06695556640625,
0.0799560546875,
0.0005488395690917969,
-0.0008935928344726562,
-0.043701171875,
-0.017181396484375,
-0.040496826171875,
0.00647735595703125,
-0.0078125,
-0.08367919921875,
0.04443359375,
0.0228729248046875,
0.0092315673828125,
0.02984619140625,
0.04345703125,
0.012451171875,
0.0535888671875,
0.044342041015625,
-0.01169586181640625,
0.06842041015625,
-0.0122528076171875,
-0.00920867919921875,
-0.06829833984375,
-0.0217132568359375,
-0.0158538818359375,
-0.0172271728515625,
-0.05572509765625,
-0.03375244140625,
0.030426025390625,
0.028350830078125,
-0.041229248046875,
0.0252532958984375,
-0.053375244140625,
0.0258026123046875,
0.040374755859375,
0.0361328125,
0.0234222412109375,
0.002231597900390625,
0.0286712646484375,
0.01143646240234375,
-0.048828125,
-0.05303955078125,
0.0748291015625,
0.042724609375,
0.045166015625,
0.0113983154296875,
0.033447265625,
0.01120758056640625,
0.0231475830078125,
-0.04315185546875,
0.03985595703125,
0.01611328125,
-0.035369873046875,
-0.012451171875,
-0.0211181640625,
-0.07257080078125,
0.0252227783203125,
-0.0257415771484375,
-0.06463623046875,
0.0016031265258789062,
0.02276611328125,
-0.0267791748046875,
0.010833740234375,
-0.04510498046875,
0.06146240234375,
-0.00995635986328125,
-0.0281829833984375,
-0.01617431640625,
-0.06072998046875,
0.04931640625,
0.0092620849609375,
-0.01885986328125,
-0.031341552734375,
0.01045989990234375,
0.049285888671875,
-0.037445068359375,
0.0780029296875,
-0.0208282470703125,
-0.0280914306640625,
0.038116455078125,
-0.0177764892578125,
0.0251007080078125,
0.0251007080078125,
0.01251220703125,
0.031829833984375,
0.0167236328125,
-0.042266845703125,
-0.037261962890625,
0.0270843505859375,
-0.08856201171875,
-0.041259765625,
-0.0265045166015625,
-0.022979736328125,
-0.01482391357421875,
-0.01036834716796875,
0.0280609130859375,
0.0249176025390625,
-0.0054779052734375,
-0.0033550262451171875,
0.046356201171875,
-0.025421142578125,
0.01763916015625,
0.0290985107421875,
-0.0181121826171875,
-0.03497314453125,
0.039459228515625,
-0.01482391357421875,
0.01398468017578125,
0.0030670166015625,
0.0034008026123046875,
-0.0310516357421875,
-0.03192138671875,
-0.05975341796875,
0.038299560546875,
-0.03167724609375,
-0.0276031494140625,
-0.037200927734375,
-0.0253448486328125,
-0.023345947265625,
0.033203125,
-0.027587890625,
-0.0283660888671875,
-0.056915283203125,
0.0005574226379394531,
0.06890869140625,
0.052459716796875,
-0.001102447509765625,
0.0526123046875,
-0.05157470703125,
0.0301513671875,
0.0218963623046875,
0.01513671875,
0.0001042485237121582,
-0.07177734375,
-0.01526641845703125,
0.02081298828125,
-0.03173828125,
-0.057830810546875,
0.03179931640625,
0.0252532958984375,
0.03765869140625,
0.0302734375,
0.007175445556640625,
0.053009033203125,
-0.01442718505859375,
0.052398681640625,
0.0006074905395507812,
-0.04156494140625,
0.032379150390625,
-0.04559326171875,
0.007450103759765625,
0.03594970703125,
0.0198974609375,
-0.0272369384765625,
-0.01666259765625,
-0.047119140625,
-0.051727294921875,
0.037506103515625,
0.019622802734375,
0.0159454345703125,
0.01184844970703125,
0.0555419921875,
0.0049285888671875,
0.0214080810546875,
-0.039642333984375,
-0.021881103515625,
-0.0231475830078125,
0.00499725341796875,
-0.01470184326171875,
-0.0345458984375,
-0.02178955078125,
-0.033172607421875,
0.056427001953125,
-0.02593994140625,
0.0330810546875,
0.01039886474609375,
-0.0012807846069335938,
0.0024852752685546875,
0.005786895751953125,
0.048919677734375,
0.038238525390625,
-0.0182342529296875,
-0.021514892578125,
0.0125579833984375,
-0.0361328125,
0.00042700767517089844,
0.006496429443359375,
0.0056610107421875,
-0.00931549072265625,
0.0160980224609375,
0.1021728515625,
0.01090240478515625,
-0.041046142578125,
0.0269775390625,
-0.032958984375,
-0.01268768310546875,
-0.01517486572265625,
0.034637451171875,
0.013275146484375,
0.05670166015625,
0.010345458984375,
-0.031829833984375,
0.0010480880737304688,
-0.037078857421875,
-0.0005087852478027344,
0.0316162109375,
-0.00701904296875,
-0.0282745361328125,
0.05755615234375,
0.0006766319274902344,
-0.011383056640625,
0.0633544921875,
-0.02630615234375,
-0.03863525390625,
0.0325927734375,
0.037109375,
0.048431396484375,
0.0006542205810546875,
0.02264404296875,
0.038909912109375,
0.00225830078125,
-0.00930023193359375,
0.033294677734375,
-0.01427459716796875,
-0.047515869140625,
-0.007720947265625,
-0.05242919921875,
-0.051300048828125,
0.0071868896484375,
-0.054351806640625,
0.01056671142578125,
-0.041046142578125,
-0.0252532958984375,
-0.027374267578125,
0.0310211181640625,
-0.057220458984375,
0.00537872314453125,
0.008758544921875,
0.06097412109375,
-0.06414794921875,
0.068115234375,
0.039947509765625,
-0.0225677490234375,
-0.08251953125,
-0.0279998779296875,
-0.005008697509765625,
-0.07220458984375,
0.00623321533203125,
0.0121917724609375,
0.0027980804443359375,
-0.004184722900390625,
-0.06719970703125,
-0.06060791015625,
0.1060791015625,
0.04022216796875,
-0.056396484375,
0.006561279296875,
-0.01271820068359375,
0.064697265625,
-0.01334381103515625,
0.03277587890625,
0.0582275390625,
0.0162353515625,
0.01255035400390625,
-0.10186767578125,
0.01026153564453125,
-0.038055419921875,
0.0134429931640625,
-0.0273284912109375,
-0.09588623046875,
0.058380126953125,
-0.014251708984375,
-0.01033782958984375,
0.0308837890625,
0.07379150390625,
0.04412841796875,
0.0180511474609375,
0.037017822265625,
0.033660888671875,
0.07049560546875,
0.00395965576171875,
0.07025146484375,
-0.022491455078125,
0.0254974365234375,
0.07403564453125,
0.01309967041015625,
0.05535888671875,
0.0241241455078125,
-0.00882720947265625,
0.03131103515625,
0.08526611328125,
-0.0001354217529296875,
0.038116455078125,
0.0173187255859375,
-0.021484375,
-0.017974853515625,
-0.0120391845703125,
-0.043792724609375,
0.0265960693359375,
0.01580810546875,
0.00461578369140625,
-0.0033245086669921875,
-0.00439453125,
0.0192413330078125,
-0.025543212890625,
-0.0218048095703125,
0.0665283203125,
0.0190887451171875,
-0.039642333984375,
0.0872802734375,
-0.005596160888671875,
0.0777587890625,
-0.038909912109375,
0.00347137451171875,
-0.041015625,
0.01015472412109375,
-0.0160675048828125,
-0.0423583984375,
0.0009212493896484375,
0.002468109130859375,
0.003997802734375,
0.0169677734375,
0.04864501953125,
-0.041656494140625,
-0.01378631591796875,
0.0259552001953125,
0.0303497314453125,
0.042633056640625,
-0.0096588134765625,
-0.0625,
0.02362060546875,
0.0008058547973632812,
-0.03668212890625,
0.0513916015625,
0.0253753662109375,
-0.019439697265625,
0.059600830078125,
0.048095703125,
-0.009613037109375,
0.004245758056640625,
0.01129913330078125,
0.0926513671875,
-0.032989501953125,
-0.0338134765625,
-0.04754638671875,
0.0201873779296875,
0.00485992431640625,
-0.054412841796875,
0.047332763671875,
0.0185546875,
0.05169677734375,
0.006671905517578125,
0.03167724609375,
0.00843048095703125,
0.02008056640625,
-0.04864501953125,
0.05157470703125,
-0.04022216796875,
0.0059814453125,
-0.014495849609375,
-0.0635986328125,
0.0007119178771972656,
0.058197021484375,
-0.001422882080078125,
0.001995086669921875,
0.01812744140625,
0.08349609375,
0.0030803680419921875,
-0.0009927749633789062,
0.0167999267578125,
0.004856109619140625,
0.03369140625,
0.045166015625,
0.058441162109375,
-0.047760009765625,
0.03326416015625,
-0.02777099609375,
-0.01183319091796875,
-0.019500732421875,
-0.062103271484375,
-0.08868408203125,
-0.02227783203125,
-0.030487060546875,
-0.04620361328125,
0.0198974609375,
0.07843017578125,
0.0439453125,
-0.04718017578125,
-0.01070404052734375,
0.025177001953125,
-0.006282806396484375,
0.00036787986755371094,
-0.010284423828125,
0.0282135009765625,
0.0027675628662109375,
-0.049468994140625,
0.0272369384765625,
-0.01309967041015625,
0.0263519287109375,
-0.01727294921875,
-0.014556884765625,
0.00010895729064941406,
-0.01213836669921875,
0.03570556640625,
0.044708251953125,
-0.042144775390625,
-0.0357666015625,
-0.02008056640625,
-0.0166778564453125,
0.019012451171875,
0.0127105712890625,
-0.049102783203125,
-0.0015888214111328125,
0.013763427734375,
0.024627685546875,
0.04876708984375,
0.03436279296875,
0.0288543701171875,
-0.039703369140625,
0.0282135009765625,
-0.007152557373046875,
0.0186004638671875,
0.03759765625,
-0.03790283203125,
0.0443115234375,
0.002811431884765625,
-0.06170654296875,
-0.061370849609375,
-0.00525665283203125,
-0.09588623046875,
0.0036525726318359375,
0.10845947265625,
-0.0029735565185546875,
-0.0318603515625,
0.0017852783203125,
-0.06463623046875,
0.0350341796875,
-0.04034423828125,
0.05816650390625,
0.018890380859375,
-0.0132904052734375,
-0.0255584716796875,
-0.04144287109375,
0.05279541015625,
0.0163421630859375,
-0.048248291015625,
-0.0100555419921875,
0.040985107421875,
0.04302978515625,
-0.0209808349609375,
0.07281494140625,
0.00902557373046875,
0.0175933837890625,
-0.00685882568359375,
-0.002590179443359375,
-0.027740478515625,
-0.01186370849609375,
-0.032989501953125,
-0.01143646240234375,
0.00952911376953125,
-0.0277252197265625
]
] |
hfl/chinese-alpaca-2-13b | 2023-08-25T01:08:33.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"license:apache-2.0",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text-generation | hfl | null | null | hfl/chinese-alpaca-2-13b | 67 | 6,771 | transformers | 2023-08-14T03:10:08 | ---
license: apache-2.0
---
# Chinese-Alpaca-2-13B
**This is the full Chinese-Alpaca-2-13B model, which can be loaded directly for inference and full-parameter training.**
**Related models👇**
* Long context base models
* [Chinese-LLaMA-2-7B-16K (full model)](https://huggingface.co/ziqingyang/chinese-llama-2-7b-16k)
* [Chinese-LLaMA-2-LoRA-7B-16K (LoRA model)](https://huggingface.co/ziqingyang/chinese-llama-2-lora-7b-16k)
* [Chinese-LLaMA-2-13B-16K (full model)](https://huggingface.co/ziqingyang/chinese-llama-2-13b-16k)
* [Chinese-LLaMA-2-LoRA-13B-16K (LoRA model)](https://huggingface.co/ziqingyang/chinese-llama-2-lora-13b-16k)
* Base models
* [Chinese-LLaMA-2-7B (full model)](https://huggingface.co/ziqingyang/chinese-llama-2-7b)
* [Chinese-LLaMA-2-LoRA-7B (LoRA model)](https://huggingface.co/ziqingyang/chinese-llama-2-lora-7b)
* [Chinese-LLaMA-2-13B (full model)](https://huggingface.co/ziqingyang/chinese-llama-2-13b)
* [Chinese-LLaMA-2-LoRA-13B (LoRA model)](https://huggingface.co/ziqingyang/chinese-llama-2-lora-13b)
* Instruction/Chat models
* [Chinese-Alpaca-2-7B (full model)](https://huggingface.co/ziqingyang/chinese-alpaca-2-7b)
* [Chinese-Alpaca-2-LoRA-7B (LoRA model)](https://huggingface.co/ziqingyang/chinese-alpaca-2-lora-7b)
* [Chinese-Alpaca-2-13B (full model)](https://huggingface.co/ziqingyang/chinese-alpaca-2-13b)
* [Chinese-Alpaca-2-LoRA-13B (LoRA model)](https://huggingface.co/ziqingyang/chinese-alpaca-2-lora-13b)
# Description of Chinese-LLaMA-Alpaca-2
This project is based on Llama-2, released by Meta, and is the second generation of the Chinese LLaMA & Alpaca LLM project. We open-source Chinese LLaMA-2 (a foundation model) and Alpaca-2 (an instruction-following model). These models extend the original Llama-2 with an expanded and optimized Chinese vocabulary, and were incrementally pre-trained on large-scale Chinese data, which further improves fundamental semantic understanding of Chinese and yields a significant performance improvement over the first-generation models. The models support a 4K context, which can be extended to 18K+ using the NTK method.
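The NTK context extension mentioned above works by rescaling RoPE's frequency base as the sequence grows past the trained window. Below is a minimal sketch of the rescaling rule; the exact formula follows the common dynamic-NTK formulation and the constants (base 10000, head dim 128, 4K trained context) are assumptions about Llama-2's defaults, not values taken from this card:

```python
# Sketch of dynamic NTK-aware RoPE scaling: once the input grows past the
# trained context window, the rotary base is enlarged so that high-frequency
# dimensions keep their resolution while low frequencies stretch to cover
# the longer sequence.
def dynamic_ntk_base(base: float, dim: int, seq_len: int,
                     max_trained_len: int, factor: float) -> float:
    """Return the adjusted RoPE base for a given sequence length.

    base            -- original rotary base (10000.0 for Llama-2, assumed)
    dim             -- per-head dimension (128 for Llama-2, assumed)
    seq_len         -- current sequence length
    max_trained_len -- context length used in training (4096 here)
    factor          -- scaling factor (e.g. 2.0 roughly doubles the context)
    """
    if seq_len <= max_trained_len:
        return base  # within the trained window: no change
    scale = (factor * seq_len / max_trained_len) - (factor - 1)
    return base * scale ** (dim / (dim - 2))

# Within the 4K trained window the base is untouched:
print(dynamic_ntk_base(10000.0, 128, 2048, 4096, 2.0))  # 10000.0
# Past it, the base grows, stretching the low-frequency rotations:
print(dynamic_ntk_base(10000.0, 128, 8192, 4096, 2.0) > 10000.0)  # True
```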
The main contents of this project include:
* 🚀 New extended Chinese vocabulary beyond Llama-2, open-sourcing the Chinese LLaMA-2 and Alpaca-2 LLMs.
* 🚀 Open-sourced the pre-training and instruction fine-tuning (SFT) scripts for further tuning on users' own data
* 🚀 Quickly deploy and try the quantized LLMs on the CPU/GPU of a personal PC
* 🚀 Support for LLaMA ecosystems such as 🤗transformers, llama.cpp, text-generation-webui, LangChain, vLLM, etc.
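As a concrete illustration of loading the full model directly for inference, the sketch below builds a Llama-2-chat-style prompt; the default system prompt shown is an assumption based on the project's conventions, not taken from this card, and the model-loading calls are left as comments so the snippet stays self-contained:

```python
# Minimal sketch: Chinese-Alpaca-2 follows the Llama-2 chat prompt layout.
# The system prompt below is an assumption; consult the project repo for
# the canonical one.
DEFAULT_SYSTEM = "You are a helpful assistant. 你是一个乐于助人的助手。"

def build_prompt(instruction: str, system: str = DEFAULT_SYSTEM) -> str:
    """Wrap a single-turn instruction in the Llama-2 [INST] template."""
    return f"[INST] <<SYS>>\n{system}\n<</SYS>>\n\n{instruction} [/INST]"

prompt = build_prompt("为什么天空是蓝色的？")
print(prompt)

# Loading and generation would then look roughly like the following
# (a 13B model needs substantial GPU memory, so this is shown as comments):
#   from transformers import AutoModelForCausalLM, AutoTokenizer
#   tok = AutoTokenizer.from_pretrained("hfl/chinese-alpaca-2-13b")
#   model = AutoModelForCausalLM.from_pretrained(
#       "hfl/chinese-alpaca-2-13b", device_map="auto")
#   out = model.generate(**tok(prompt, return_tensors="pt").to(model.device))
```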
Please refer to [https://github.com/ymcui/Chinese-LLaMA-Alpaca-2/](https://github.com/ymcui/Chinese-LLaMA-Alpaca-2/) for details. | 2,756 | [
[
-0.0303192138671875,
-0.04718017578125,
0.0198822021484375,
0.056427001953125,
-0.04644775390625,
-0.0182952880859375,
0.003360748291015625,
-0.06787109375,
0.02783203125,
0.0321044921875,
-0.044525146484375,
-0.03778076171875,
-0.040771484375,
0.0014667510986328125,
-0.019134521484375,
0.050628662109375,
-0.0046539306640625,
0.01143646240234375,
0.0260772705078125,
-0.026123046875,
-0.0258331298828125,
-0.02484130859375,
-0.053802490234375,
-0.038116455078125,
0.0518798828125,
0.005718231201171875,
0.060546875,
0.06396484375,
0.0299072265625,
0.0153961181640625,
-0.0184478759765625,
0.023468017578125,
-0.01837158203125,
-0.0340576171875,
0.01152801513671875,
-0.02685546875,
-0.05999755859375,
-0.0004360675811767578,
0.024261474609375,
0.034576416015625,
-0.0189666748046875,
0.028045654296875,
-0.0004978179931640625,
0.028533935546875,
-0.0311431884765625,
0.020111083984375,
-0.034454345703125,
0.005126953125,
-0.0259552001953125,
0.0017290115356445312,
-0.0207977294921875,
-0.0163421630859375,
-0.00867462158203125,
-0.06964111328125,
-0.0010833740234375,
-0.007343292236328125,
0.09832763671875,
0.019378662109375,
-0.04150390625,
-0.026214599609375,
-0.0191497802734375,
0.0545654296875,
-0.0689697265625,
0.0171661376953125,
0.038909912109375,
0.0086669921875,
-0.0254974365234375,
-0.056182861328125,
-0.04522705078125,
-0.0236358642578125,
-0.019927978515625,
0.01198577880859375,
0.006320953369140625,
-0.01418304443359375,
0.0045623779296875,
0.02227783203125,
-0.033447265625,
0.0377197265625,
-0.03863525390625,
-0.006397247314453125,
0.053558349609375,
-0.008636474609375,
0.0221099853515625,
-0.0032596588134765625,
-0.0391845703125,
-0.01418304443359375,
-0.068359375,
0.0092620849609375,
0.020355224609375,
0.028411865234375,
-0.047943115234375,
0.03704833984375,
-0.0231781005859375,
0.047637939453125,
-0.0013256072998046875,
-0.030853271484375,
0.04302978515625,
-0.031219482421875,
-0.015228271484375,
-0.0184326171875,
0.061126708984375,
0.023895263671875,
-0.006839752197265625,
0.01352691650390625,
-0.013763427734375,
-0.014556884765625,
-0.033782958984375,
-0.062408447265625,
0.0008745193481445312,
0.00449371337890625,
-0.05078125,
-0.0247039794921875,
0.008880615234375,
-0.03289794921875,
-0.0125274658203125,
-0.01340484619140625,
0.0252838134765625,
-0.01404571533203125,
-0.0308380126953125,
0.0196990966796875,
0.00868988037109375,
0.06640625,
0.0265350341796875,
-0.05859375,
0.0125885009765625,
0.040496826171875,
0.0596923828125,
0.0055999755859375,
-0.02911376953125,
0.0012111663818359375,
0.019561767578125,
-0.039398193359375,
0.052520751953125,
-0.0133209228515625,
-0.03277587890625,
-0.01468658447265625,
0.0321044921875,
0.00966644287109375,
-0.03192138671875,
0.046539306640625,
-0.030609130859375,
0.005886077880859375,
-0.04327392578125,
-0.01030731201171875,
-0.037689208984375,
0.0193328857421875,
-0.06671142578125,
0.0872802734375,
0.003963470458984375,
-0.046417236328125,
0.0162506103515625,
-0.050018310546875,
-0.01473236083984375,
-0.01177978515625,
-0.0007910728454589844,
-0.022064208984375,
-0.0236968994140625,
0.0159759521484375,
0.02813720703125,
-0.04736328125,
0.0001710653305053711,
-0.018524169921875,
-0.039215087890625,
-0.006381988525390625,
-0.004039764404296875,
0.08685302734375,
0.0127716064453125,
-0.0203857421875,
-0.00732421875,
-0.06512451171875,
-0.01258087158203125,
0.057281494140625,
-0.029937744140625,
-0.0023632049560546875,
-0.0016536712646484375,
-0.00820159912109375,
0.0126800537109375,
0.051513671875,
-0.0305633544921875,
0.019744873046875,
-0.0230712890625,
0.03173828125,
0.052520751953125,
-0.0097808837890625,
0.006885528564453125,
-0.0282135009765625,
0.0192108154296875,
0.0107879638671875,
0.02545166015625,
-0.003192901611328125,
-0.05487060546875,
-0.0848388671875,
-0.0177001953125,
0.00787353515625,
0.05133056640625,
-0.046142578125,
0.04290771484375,
0.0090179443359375,
-0.057830810546875,
-0.0227813720703125,
0.0094146728515625,
0.034423828125,
0.01898193359375,
0.0213623046875,
-0.0207977294921875,
-0.045440673828125,
-0.07684326171875,
0.0213623046875,
-0.033935546875,
-0.007587432861328125,
0.01025390625,
0.040252685546875,
-0.02459716796875,
0.03594970703125,
-0.0278167724609375,
-0.01194000244140625,
-0.014617919921875,
-0.01038360595703125,
0.03167724609375,
0.034149169921875,
0.072265625,
-0.04193115234375,
-0.011505126953125,
0.00862884521484375,
-0.052398681640625,
-0.0006256103515625,
-0.00103759765625,
-0.033050537109375,
0.013031005859375,
0.004299163818359375,
-0.05560302734375,
0.0298004150390625,
0.046844482421875,
-0.0176239013671875,
0.0273284912109375,
-0.0081024169921875,
-0.0166015625,
-0.0819091796875,
0.00701141357421875,
-0.00479888916015625,
0.0155181884765625,
-0.0306854248046875,
0.032867431640625,
0.0109405517578125,
0.030120849609375,
-0.050201416015625,
0.058258056640625,
-0.041046142578125,
-0.014556884765625,
-0.01209259033203125,
0.00521087646484375,
0.01947021484375,
0.0587158203125,
0.008026123046875,
0.045684814453125,
0.0263671875,
-0.039947509765625,
0.042327880859375,
0.035491943359375,
-0.0247650146484375,
-0.005401611328125,
-0.0645751953125,
0.0262603759765625,
0.0035953521728515625,
0.051300048828125,
-0.058135986328125,
-0.0213470458984375,
0.04656982421875,
-0.0240020751953125,
-0.0021457672119140625,
0.01372528076171875,
-0.0440673828125,
-0.040313720703125,
-0.046295166015625,
0.03448486328125,
0.039886474609375,
-0.07208251953125,
0.0228118896484375,
-0.000247955322265625,
0.01556396484375,
-0.0609130859375,
-0.07501220703125,
-0.00817108154296875,
-0.01560211181640625,
-0.036529541015625,
0.0262603759765625,
-0.00997161865234375,
-0.002559661865234375,
-0.01244354248046875,
0.006916046142578125,
-0.00925445556640625,
0.0073394775390625,
0.01114654541015625,
0.053466796875,
-0.0275421142578125,
-0.009857177734375,
0.0012989044189453125,
0.00963592529296875,
-0.007781982421875,
0.01088714599609375,
0.051239013671875,
-0.00565338134765625,
-0.021484375,
-0.039031982421875,
0.00870513916015625,
0.006923675537109375,
-0.0201263427734375,
0.06390380859375,
0.053924560546875,
-0.037384033203125,
0.0011491775512695312,
-0.04022216796875,
0.005889892578125,
-0.035858154296875,
0.02593994140625,
-0.037261962890625,
-0.04364013671875,
0.04669189453125,
0.0192108154296875,
0.023651123046875,
0.0478515625,
0.049835205078125,
0.0222015380859375,
0.07501220703125,
0.048858642578125,
-0.017303466796875,
0.033233642578125,
-0.0278778076171875,
-0.0018320083618164062,
-0.057037353515625,
-0.0421142578125,
-0.03411865234375,
-0.01849365234375,
-0.0374755859375,
-0.044036865234375,
-0.00196075439453125,
0.024627685546875,
-0.05389404296875,
0.038787841796875,
-0.045928955078125,
0.028778076171875,
0.043426513671875,
0.024871826171875,
0.0243377685546875,
0.0058441162109375,
0.00765228271484375,
0.0267181396484375,
-0.0221710205078125,
-0.041229248046875,
0.07647705078125,
0.0298614501953125,
0.037841796875,
0.01134490966796875,
0.03228759765625,
-0.0018405914306640625,
0.0184478759765625,
-0.060333251953125,
0.047271728515625,
-0.0152435302734375,
-0.033538818359375,
-0.006694793701171875,
-0.007110595703125,
-0.0667724609375,
0.028564453125,
0.009979248046875,
-0.043365478515625,
0.00424957275390625,
-0.00614166259765625,
-0.0253143310546875,
0.0157470703125,
-0.02618408203125,
0.03594970703125,
-0.030426025390625,
0.0027828216552734375,
-0.0086517333984375,
-0.046844482421875,
0.06365966796875,
-0.01971435546875,
0.005954742431640625,
-0.0322265625,
-0.031982421875,
0.057403564453125,
-0.042572021484375,
0.06707763671875,
-0.0167236328125,
-0.0343017578125,
0.0487060546875,
-0.019683837890625,
0.05169677734375,
-0.00186920166015625,
-0.0220184326171875,
0.03997802734375,
-0.0025310516357421875,
-0.03875732421875,
-0.01386260986328125,
0.036468505859375,
-0.0887451171875,
-0.043792724609375,
-0.02203369140625,
-0.0183258056640625,
-0.0018606185913085938,
0.005817413330078125,
0.03765869140625,
-0.00540924072265625,
-0.006069183349609375,
0.00945281982421875,
0.01366424560546875,
-0.0297393798828125,
0.03887939453125,
0.042388916015625,
-0.01323699951171875,
-0.0277557373046875,
0.045013427734375,
0.006595611572265625,
0.0129852294921875,
0.02569580078125,
0.0113983154296875,
-0.01248931884765625,
-0.03350830078125,
-0.05072021484375,
0.042236328125,
-0.05206298828125,
-0.01593017578125,
-0.03582763671875,
-0.04364013671875,
-0.0250091552734375,
0.0033702850341796875,
-0.019622802734375,
-0.03765869140625,
-0.0455322265625,
-0.01934814453125,
0.03912353515625,
0.04046630859375,
-0.00800323486328125,
0.052734375,
-0.044952392578125,
0.031646728515625,
0.030609130859375,
0.007472991943359375,
0.0128326416015625,
-0.0625,
-0.01139068603515625,
0.020477294921875,
-0.035430908203125,
-0.057159423828125,
0.0372314453125,
0.0221405029296875,
0.037841796875,
0.045440673828125,
-0.0221710205078125,
0.06854248046875,
-0.0224609375,
0.07891845703125,
0.02593994140625,
-0.053466796875,
0.043426513671875,
-0.0196990966796875,
-0.007537841796875,
0.0118865966796875,
0.0107574462890625,
-0.0285797119140625,
-0.00011289119720458984,
-0.0214080810546875,
-0.0560302734375,
0.06787109375,
0.005619049072265625,
0.016510009765625,
0.002880096435546875,
0.0380859375,
0.020050048828125,
-0.0057830810546875,
-0.08245849609375,
-0.028961181640625,
-0.032440185546875,
-0.003597259521484375,
0.00553131103515625,
-0.02777099609375,
-0.01087188720703125,
-0.0269317626953125,
0.07281494140625,
-0.01490020751953125,
0.01300048828125,
-0.001373291015625,
0.01039886474609375,
-0.0100555419921875,
-0.02227783203125,
0.0478515625,
0.03204345703125,
-0.00797271728515625,
-0.02996826171875,
0.0343017578125,
-0.04022216796875,
0.0093841552734375,
0.004520416259765625,
-0.0191192626953125,
0.0007843971252441406,
0.0404052734375,
0.07135009765625,
-0.0012760162353515625,
-0.04718017578125,
0.037841796875,
0.004222869873046875,
-0.0083465576171875,
-0.0439453125,
-0.0007672309875488281,
0.023956298828125,
0.0273284912109375,
0.018707275390625,
-0.02593994140625,
0.0033283233642578125,
-0.037750244140625,
-0.0169525146484375,
0.0200653076171875,
0.0177764892578125,
-0.0330810546875,
0.0509033203125,
0.009033203125,
-0.00719451904296875,
0.033843994140625,
-0.025177001953125,
-0.01485443115234375,
0.08856201171875,
0.04937744140625,
0.03607177734375,
-0.0306549072265625,
0.0106048583984375,
0.05010986328125,
0.02392578125,
-0.0345458984375,
0.02874755859375,
0.0068511962890625,
-0.05841064453125,
-0.0141754150390625,
-0.050506591796875,
-0.0273284912109375,
0.031982421875,
-0.0482177734375,
0.049468994140625,
-0.041534423828125,
-0.00969696044921875,
-0.017181396484375,
0.028656005859375,
-0.04095458984375,
0.0191650390625,
0.03399658203125,
0.0802001953125,
-0.044647216796875,
0.0869140625,
0.04425048828125,
-0.02899169921875,
-0.0821533203125,
-0.0289764404296875,
-0.0006690025329589844,
-0.113525390625,
0.046722412109375,
0.01861572265625,
0.0004811286926269531,
-0.029327392578125,
-0.06085205078125,
-0.0933837890625,
0.123291015625,
0.0257568359375,
-0.0372314453125,
-0.011932373046875,
0.0137481689453125,
0.0282440185546875,
-0.0175628662109375,
0.0221405029296875,
0.052734375,
0.035552978515625,
0.037322998046875,
-0.0697021484375,
0.01302337646484375,
-0.0274505615234375,
0.01132965087890625,
-0.0087738037109375,
-0.109130859375,
0.0968017578125,
-0.0180816650390625,
-0.007022857666015625,
0.05523681640625,
0.0667724609375,
0.0625,
0.0112762451171875,
0.0440673828125,
0.03509521484375,
0.04644775390625,
0.00495147705078125,
0.046295166015625,
-0.0175628662109375,
0.0189208984375,
0.069091796875,
-0.0204925537109375,
0.0633544921875,
0.016021728515625,
-0.0264434814453125,
0.042999267578125,
0.08978271484375,
-0.0174713134765625,
0.0291595458984375,
0.0053558349609375,
-0.01442718505859375,
-0.0017576217651367188,
-0.0200653076171875,
-0.06036376953125,
0.042205810546875,
0.034332275390625,
-0.024810791015625,
-0.0024700164794921875,
-0.031951904296875,
0.0275726318359375,
-0.037200927734375,
-0.020416259765625,
0.029296875,
0.0177001953125,
-0.0288848876953125,
0.060089111328125,
0.0248870849609375,
0.07135009765625,
-0.055908203125,
-0.0017881393432617188,
-0.03948974609375,
-0.003612518310546875,
-0.0218353271484375,
-0.033599853515625,
-0.0030918121337890625,
0.004657745361328125,
0.007770538330078125,
0.02191162109375,
0.04400634765625,
-0.015380859375,
-0.058135986328125,
0.050384521484375,
0.0312347412109375,
0.0218505859375,
0.0155181884765625,
-0.060638427734375,
0.01349639892578125,
0.004970550537109375,
-0.05792236328125,
0.019989013671875,
0.02374267578125,
-0.01311492919921875,
0.052886962890625,
0.053619384765625,
0.0057373046875,
0.01395416259765625,
0.0055389404296875,
0.0614013671875,
-0.047210693359375,
-0.0217132568359375,
-0.058135986328125,
0.011383056640625,
0.00145721435546875,
-0.025787353515625,
0.031036376953125,
0.031280517578125,
0.0631103515625,
0.002727508544921875,
0.0445556640625,
-0.0080718994140625,
0.031280517578125,
-0.032135009765625,
0.039398193359375,
-0.06011962890625,
0.013763427734375,
0.001750946044921875,
-0.065185546875,
-0.01293182373046875,
0.051513671875,
0.01092529296875,
0.00876617431640625,
0.022308349609375,
0.0540771484375,
0.01010894775390625,
0.0008497238159179688,
0.005016326904296875,
0.018218994140625,
0.0286712646484375,
0.0687255859375,
0.059417724609375,
-0.051239013671875,
0.045013427734375,
-0.0474853515625,
-0.018341064453125,
-0.01016998291015625,
-0.062042236328125,
-0.05706787109375,
-0.0193328857421875,
-0.0051727294921875,
-0.00899505615234375,
-0.0157928466796875,
0.0682373046875,
0.05670166015625,
-0.05377197265625,
-0.032257080078125,
0.019989013671875,
-0.0074462890625,
-0.00904083251953125,
-0.00853729248046875,
0.044036865234375,
0.008514404296875,
-0.063720703125,
0.0262298583984375,
-0.0027828216552734375,
0.031494140625,
-0.0111846923828125,
-0.0200958251953125,
-0.0115509033203125,
0.01519775390625,
0.054168701171875,
0.031280517578125,
-0.07281494140625,
-0.0211944580078125,
-0.00470733642578125,
-0.02032470703125,
0.009674072265625,
0.01013946533203125,
-0.0458984375,
-0.02923583984375,
0.0191497802734375,
0.02203369140625,
0.034149169921875,
0.0006380081176757812,
0.0033893585205078125,
-0.027069091796875,
0.057342529296875,
-0.01165771484375,
0.0447998046875,
0.021484375,
-0.01947021484375,
0.07330322265625,
0.01397705078125,
-0.0237579345703125,
-0.05780029296875,
0.0074005126953125,
-0.1005859375,
-0.0237884521484375,
0.08319091796875,
-0.019561767578125,
-0.027435302734375,
0.031890869140625,
-0.026763916015625,
0.036956787109375,
-0.0130157470703125,
0.0438232421875,
0.041107177734375,
0.0006542205810546875,
-0.006046295166015625,
-0.042236328125,
0.005298614501953125,
0.02545166015625,
-0.064697265625,
-0.020599365234375,
0.01308441162109375,
0.018707275390625,
0.01068878173828125,
0.040985107421875,
-0.00978851318359375,
0.014495849609375,
-0.006694793701171875,
0.0202484130859375,
-0.01019287109375,
0.00284576416015625,
0.0027637481689453125,
-0.020416259765625,
0.004230499267578125,
-0.0345458984375
]
] |
llm-agents/tora-code-34b-v1.0 | 2023-10-08T11:23:00.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"code",
"math",
"en",
"dataset:gsm8k",
"dataset:competition_math",
"arxiv:2309.17452",
"license:llama2",
"endpoints_compatible",
"text-generation-inference",
"region:us",
"has_space"
] | text-generation | llm-agents | null | null | llm-agents/tora-code-34b-v1.0 | 9 | 6,770 | transformers | 2023-10-08T05:33:54 | ---
license: llama2
datasets:
- gsm8k
- competition_math
language:
- en
metrics:
- exact_match
library_name: transformers
pipeline_tag: text-generation
tags:
- code
- math
---
<h1 align="center">
ToRA: A Tool-Integrated Reasoning Agent <br> for Mathematical Problem Solving
</h1>
<p align="center">
<a href="https://microsoft.github.io/ToRA/"><b>[🌐 Website]</b></a> •
<a href="https://arxiv.org/pdf/2309.17452.pdf"><b>[📜 Paper]</b></a> •
<a href="https://huggingface.co/llm-agents"><b>[🤗 HF Models]</b></a> •
<a href="https://github.com/microsoft/ToRA"><b>[🐱 GitHub]</b></a>
<br>
<a href="https://twitter.com/zhs05232838/status/1708860992631763092"><b>[🐦 Twitter]</b></a> •
<a href="https://www.reddit.com/r/LocalLLaMA/comments/1703k6d/tora_a_toolintegrated_reasoning_agent_for/"><b>[💬 Reddit]</b></a> •
<a href="https://notes.aimodels.fyi/researchers-announce-tora-training-language-models-to-better-understand-math-using-external-tools/">[🍀 Unofficial Blog]</a>
<!-- <a href="#-quick-start">Quick Start</a> • -->
<!-- <a href="#%EF%B8%8F-citation">Citation</a> -->
</p>
<p align="center">
Repo for "<a href="https://arxiv.org/pdf/2309.17452.pdf" target="_blank">ToRA: A Tool-Integrated Reasoning Agent for Mathematical Problem Solving</a>"
</p>
## 🔥 News
- [2023/10/08] 🔥🔥🔥 All ToRA models released at [HuggingFace](https://huggingface.co/llm-agents)!!!
- [2023/09/29] ToRA paper, repo, and website released.
## 💡 Introduction
ToRA is a series of Tool-integrated Reasoning Agents designed to solve challenging mathematical reasoning problems by interacting with tools, e.g., computation libraries and symbolic solvers. The ToRA models seamlessly interleave natural language reasoning with external tool use, combining the analytical power of language with the computational efficiency of tools.
| Model | Size | GSM8k | MATH | AVG@10 math tasks<sup>†</sup> |
|---|---|---|---|---|
| GPT-4 | - | 92.0 | 42.5 | 78.3 |
| GPT-4 (PAL) | - | 94.2 | 51.8 | 86.4 |
| [ToRA-7B](https://huggingface.co/llm-agents/tora-7b-v1.0) | 7B | 68.8 | 40.1 | 62.4|
| [ToRA-Code-7B](https://huggingface.co/llm-agents/tora-code-7b-v1.0) | 7B | 72.6 | 44.6 | 66.5|
| [ToRA-13B](https://huggingface.co/llm-agents/tora-13b-v1.0) | 13B | 72.7 | 43.0 | 65.9|
| [ToRA-Code-13B](https://huggingface.co/llm-agents/tora-code-13b-v1.0) | 13B | 75.8 | 48.1 | 71.3 |
| [ToRA-Code-34B<sup>*</sup>](https://huggingface.co/llm-agents/tora-code-34b-v1.0) | 34B | 80.7 | **51.0** | 74.8 |
| [ToRA-70B](https://huggingface.co/llm-agents/tora-70b-v1.0) | 70B | **84.3** | 49.7 | **76.9** |
- <sup>*</sup>ToRA-Code-34B is currently the first and only open-source model to achieve over 50% accuracy (pass@1) on the MATH dataset. It significantly outperforms GPT-4's CoT result (51.0 vs. 42.5) and is competitive with GPT-4 when it solves problems with programs. By open-sourcing our code and models, we hope more breakthroughs will come!
- <sup>†</sup>10 math tasks include GSM8k, MATH, GSM-Hard, SVAMP, TabMWP, ASDiv, SingleEQ, SingleOP, AddSub, and MultiArith.
## ⚡️ Training
The models are trained on ToRA-Corpus 16k, which contains tool-integrated reasoning trajectories of MATH and GSM8k from GPT-4.
We use imitation learning (i.e., SFT) to fine-tune the models, and then apply our proposed *output space shaping* to improve tool-integrated reasoning behaviors. Please refer to the [paper](https://arxiv.org/pdf/2309.17452.pdf) for more details.
## 🪁 Inference & Evaluation
Please refer to ToRA's [GitHub repo](https://github.com/microsoft/ToRA) for inference, evaluation, and training code.
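Tool-integrated reasoning as described above interleaves natural-language rationales with executable program blocks: the model writes code, the runtime executes it, and the captured output is fed back before the model continues. The sketch below illustrates one such execute-and-feed-back step; the fence format and helper name are assumptions for illustration, and ToRA's actual inference code lives in the linked GitHub repo:

```python
import io
import re
import contextlib

# Build the triple-backtick marker programmatically so this example does not
# contain a literal fence inside a fenced block.
FENCE = "`" * 3
CODE_FENCE = re.compile(FENCE + r"python\n(.*?)" + FENCE, re.DOTALL)

def run_program_block(rationale: str) -> str:
    """Execute the first fenced python block in `rationale`, return its stdout.

    In a ToRA-style loop this output would be appended to the context as the
    tool result before generation resumes.
    """
    match = CODE_FENCE.search(rationale)
    if match is None:
        return ""
    buf = io.StringIO()
    with contextlib.redirect_stdout(buf):
        exec(match.group(1), {})  # isolated namespace; no sandboxing here
    return buf.getvalue()

rationale = (
    "First compute 48 * 12, then add 7.\n"
    + FENCE + "python\nprint(48 * 12 + 7)\n" + FENCE
)
print(run_program_block(rationale))  # -> 583
```

Note that a real deployment would sandbox the executed code rather than calling `exec` directly on model output.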
## ☕️ Citation
If you find this repository helpful, please consider citing our paper:
```
@misc{gou2023tora,
title={ToRA: A Tool-Integrated Reasoning Agent for Mathematical Problem Solving},
author={Zhibin Gou and Zhihong Shao and Yeyun Gong and yelong shen and Yujiu Yang and Minlie Huang and Nan Duan and Weizhu Chen},
year={2023},
eprint={2309.17452},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` | 4,085 | [
[
-0.0281829833984375,
-0.0599365234375,
0.05047607421875,
0.03369140625,
-0.0159454345703125,
0.032928466796875,
0.01558685302734375,
-0.0262298583984375,
0.03717041015625,
0.03314208984375,
-0.049163818359375,
-0.0204315185546875,
-0.032379150390625,
0.00928497314453125,
0.006561279296875,
0.031982421875,
0.0035076141357421875,
0.0174713134765625,
-0.0164794921875,
-0.0216827392578125,
-0.052459716796875,
-0.0262603759765625,
-0.036041259765625,
-0.0271453857421875,
-0.00958251953125,
-0.00705718994140625,
0.0687255859375,
0.03692626953125,
0.0288238525390625,
0.029144287109375,
0.00006788969039916992,
0.03619384765625,
-0.0013217926025390625,
-0.006988525390625,
-0.0048370361328125,
-0.0357666015625,
-0.053436279296875,
0.01180267333984375,
0.04827880859375,
0.0274505615234375,
-0.017425537109375,
0.004871368408203125,
-0.01453399658203125,
0.031524658203125,
-0.0304412841796875,
0.0221099853515625,
-0.020263671875,
-0.0170135498046875,
-0.0011014938354492188,
-0.00482177734375,
-0.046417236328125,
-0.035919189453125,
-0.0011320114135742188,
-0.07269287109375,
-0.01145172119140625,
0.00030040740966796875,
0.08721923828125,
0.0277862548828125,
-0.0242462158203125,
-0.017242431640625,
-0.04693603515625,
0.0625,
-0.06536865234375,
0.021148681640625,
-0.0012378692626953125,
0.022735595703125,
-0.015045166015625,
-0.05340576171875,
-0.049102783203125,
-0.01534271240234375,
-0.0062103271484375,
0.034759521484375,
-0.045806884765625,
-0.0192108154296875,
0.03460693359375,
-0.000507354736328125,
-0.040069580078125,
-0.01910400390625,
-0.038360595703125,
-0.01198577880859375,
0.04364013671875,
0.0189208984375,
0.0302276611328125,
0.0013208389282226562,
-0.0095367431640625,
-0.0133209228515625,
-0.053955078125,
0.0060272216796875,
0.0253753662109375,
0.003662109375,
-0.0244140625,
0.021026611328125,
0.0236968994140625,
0.044708251953125,
-0.001964569091796875,
-0.004947662353515625,
0.03515625,
-0.006443023681640625,
-0.0205078125,
-0.026397705078125,
0.0701904296875,
-0.005702972412109375,
0.00775909423828125,
-0.0014753341674804688,
-0.0087432861328125,
0.006885528564453125,
0.0217437744140625,
-0.055908203125,
0.0093841552734375,
0.007404327392578125,
-0.00446319580078125,
-0.01117706298828125,
0.0235748291015625,
-0.05133056640625,
-0.0047760009765625,
-0.0235748291015625,
0.0455322265625,
-0.0350341796875,
-0.018646240234375,
0.03802490234375,
0.0009851455688476562,
0.01287841796875,
0.049041748046875,
-0.01387786865234375,
0.0288238525390625,
0.040252685546875,
0.07281494140625,
0.0103302001953125,
-0.0138397216796875,
-0.051116943359375,
-0.00865936279296875,
-0.0260009765625,
0.050750732421875,
-0.029571533203125,
-0.0177459716796875,
-0.0271759033203125,
0.0030384063720703125,
-0.01277923583984375,
-0.0281219482421875,
0.003253936767578125,
-0.05206298828125,
0.0274810791015625,
-0.008758544921875,
-0.01611328125,
-0.02923583984375,
0.0102386474609375,
-0.06573486328125,
0.072509765625,
0.0249481201171875,
-0.0306549072265625,
-0.0122833251953125,
-0.06475830078125,
-0.006328582763671875,
-0.0032825469970703125,
-0.0041656494140625,
-0.036468505859375,
-0.0224456787109375,
0.016021728515625,
0.0032825469970703125,
-0.059234619140625,
0.038482666015625,
-0.035369873046875,
-0.0169830322265625,
0.019927978515625,
-0.00191497802734375,
0.0946044921875,
0.0232696533203125,
-0.038177490234375,
0.019561767578125,
-0.0330810546875,
0.0142669677734375,
0.037689208984375,
0.0167083740234375,
-0.0158233642578125,
-0.023773193359375,
-0.024169921875,
0.03326416015625,
0.012451171875,
-0.041412353515625,
0.0219879150390625,
-0.04345703125,
0.047637939453125,
0.0826416015625,
-0.006259918212890625,
0.022674560546875,
-0.02447509765625,
0.0731201171875,
0.003932952880859375,
0.023101806640625,
0.01036834716796875,
-0.05511474609375,
-0.04315185546875,
-0.0243682861328125,
0.0214691162109375,
0.05084228515625,
-0.0787353515625,
0.03326416015625,
-0.00975799560546875,
-0.058135986328125,
-0.033721923828125,
-0.01245880126953125,
0.044586181640625,
0.0333251953125,
0.02276611328125,
-0.00811767578125,
-0.0293121337890625,
-0.053741455078125,
-0.016693115234375,
-0.004566192626953125,
0.0107879638671875,
0.0290069580078125,
0.049407958984375,
-0.00377655029296875,
0.0755615234375,
-0.0638427734375,
-0.0010671615600585938,
-0.01898193359375,
-0.007427215576171875,
0.0300750732421875,
0.0277862548828125,
0.05548095703125,
-0.052459716796875,
-0.057220458984375,
-0.0118865966796875,
-0.06268310546875,
-0.00841522216796875,
-0.0138702392578125,
-0.0181732177734375,
0.0153656005859375,
0.017242431640625,
-0.049652099609375,
0.038360595703125,
0.0183258056640625,
-0.052520751953125,
0.036895751953125,
0.01026153564453125,
-0.014678955078125,
-0.10223388671875,
0.01019287109375,
0.0194549560546875,
-0.017578125,
-0.0297088623046875,
0.018402099609375,
-0.0087432861328125,
-0.0026397705078125,
-0.0281219482421875,
0.0728759765625,
-0.0249481201171875,
0.005008697509765625,
-0.0012178421020507812,
0.0166168212890625,
-0.0004546642303466797,
0.0531005859375,
-0.0170440673828125,
0.09716796875,
0.032623291015625,
-0.0249176025390625,
0.021575927734375,
0.0247650146484375,
0.00926971435546875,
0.008514404296875,
-0.0614013671875,
0.0206451416015625,
0.00855255126953125,
0.01605224609375,
-0.043487548828125,
0.0236968994140625,
0.03741455078125,
-0.051544189453125,
-0.0013790130615234375,
0.0041961669921875,
-0.054229736328125,
-0.0203857421875,
-0.034576416015625,
0.01824951171875,
0.047882080078125,
-0.0305633544921875,
0.07916259765625,
0.031494140625,
0.00824737548828125,
-0.050994873046875,
-0.01532745361328125,
-0.018035888671875,
-0.0103912353515625,
-0.0706787109375,
0.013397216796875,
-0.03369140625,
-0.040374755859375,
0.00846099853515625,
-0.0113372802734375,
-0.0011644363403320312,
0.00296783447265625,
0.0141754150390625,
0.04486083984375,
-0.02569580078125,
0.007137298583984375,
0.005992889404296875,
-0.0290679931640625,
0.01131439208984375,
-0.01502227783203125,
0.061981201171875,
-0.0650634765625,
-0.01641845703125,
-0.0207061767578125,
0.0191650390625,
0.049774169921875,
-0.0264434814453125,
0.051116943359375,
0.02734375,
-0.03497314453125,
-0.0033969879150390625,
-0.04119873046875,
-0.024444580078125,
-0.041168212890625,
0.007343292236328125,
-0.041473388671875,
-0.053619384765625,
0.053009033203125,
-0.0027103424072265625,
-0.0048370361328125,
0.06414794921875,
0.0283966064453125,
0.02569580078125,
0.0863037109375,
0.050750732421875,
-0.005950927734375,
0.031829833984375,
-0.057647705078125,
0.0175628662109375,
-0.06341552734375,
-0.0186309814453125,
-0.037689208984375,
-0.0015268325805664062,
-0.03173828125,
-0.017486572265625,
0.046539306640625,
0.0043487548828125,
-0.036407470703125,
0.043121337890625,
-0.05224609375,
0.03515625,
0.0509033203125,
0.0091094970703125,
0.017852783203125,
-0.0085906982421875,
-0.0108489990234375,
-0.004459381103515625,
-0.04364013671875,
-0.04095458984375,
0.06683349609375,
0.0216064453125,
0.044677734375,
0.0286712646484375,
0.0283966064453125,
0.006683349609375,
0.0167694091796875,
-0.04443359375,
0.0572509765625,
0.00689697265625,
-0.025848388671875,
-0.0230560302734375,
-0.03570556640625,
-0.0704345703125,
0.0152587890625,
0.01210784912109375,
-0.0653076171875,
0.0179290771484375,
-0.0068206787109375,
-0.0293426513671875,
0.0302886962890625,
-0.05340576171875,
0.054962158203125,
-0.0005626678466796875,
-0.035430908203125,
-0.021331787109375,
-0.043365478515625,
0.026458740234375,
0.0048065185546875,
0.0018768310546875,
0.00911712646484375,
0.005367279052734375,
0.061126708984375,
-0.067626953125,
0.049041748046875,
-0.01202392578125,
-0.00431060791015625,
0.039154052734375,
0.0258941650390625,
0.05157470703125,
0.0259857177734375,
-0.0081939697265625,
0.01276397705078125,
0.01122283935546875,
-0.024078369140625,
-0.0657958984375,
0.041046142578125,
-0.07073974609375,
-0.054534912109375,
-0.073974609375,
-0.051971435546875,
-0.0119781494140625,
0.0277862548828125,
0.00848388671875,
0.036956787109375,
0.042144775390625,
0.0037441253662109375,
0.052337646484375,
-0.001251220703125,
0.0307769775390625,
0.0501708984375,
-0.0017805099487304688,
-0.034637451171875,
0.07244873046875,
0.018157958984375,
0.017547607421875,
0.0228271484375,
0.01556396484375,
-0.026611328125,
-0.0187530517578125,
-0.03753662109375,
0.051025390625,
-0.04962158203125,
-0.034088134765625,
-0.036468505859375,
-0.0423583984375,
-0.0290985107421875,
-0.0220794677734375,
-0.0306549072265625,
-0.030181884765625,
-0.034149169921875,
0.016510009765625,
0.055084228515625,
0.049652099609375,
0.006732940673828125,
0.02545166015625,
-0.050994873046875,
0.015380859375,
0.010772705078125,
0.0259857177734375,
0.0035877227783203125,
-0.033721923828125,
0.0021820068359375,
-0.00018334388732910156,
-0.04522705078125,
-0.0693359375,
0.054901123046875,
-0.0234832763671875,
0.037078857421875,
0.0025005340576171875,
-0.0009593963623046875,
0.04388427734375,
-0.0037517547607421875,
0.045379638671875,
0.01264190673828125,
-0.09674072265625,
0.03973388671875,
-0.028289794921875,
0.0152435302734375,
0.0024566650390625,
0.0109710693359375,
-0.0269317626953125,
-0.02288818359375,
-0.07232666015625,
-0.036468505859375,
0.08380126953125,
0.02734375,
-0.01837158203125,
0.01202392578125,
0.0245819091796875,
0.006496429443359375,
0.006595611572265625,
-0.057647705078125,
-0.028289794921875,
-0.0263214111328125,
-0.0156707763671875,
0.0184173583984375,
0.007205963134765625,
-0.00864410400390625,
-0.017608642578125,
0.0760498046875,
-0.0281219482421875,
0.03936767578125,
0.00864410400390625,
-0.0115203857421875,
-0.0009579658508300781,
0.004810333251953125,
0.07110595703125,
0.0650634765625,
-0.014312744140625,
-0.0190582275390625,
0.0034046173095703125,
-0.066162109375,
0.007724761962890625,
0.0104217529296875,
-0.0261688232421875,
-0.0089569091796875,
0.0216064453125,
0.05804443359375,
-0.014892578125,
-0.0548095703125,
0.0304412841796875,
0.003536224365234375,
-0.011138916015625,
-0.0295562744140625,
0.004856109619140625,
0.00035858154296875,
0.028961181640625,
0.0164947509765625,
0.008453369140625,
0.0024662017822265625,
-0.0283355712890625,
-0.0008091926574707031,
0.036285400390625,
-0.01561737060546875,
-0.0243682861328125,
0.038177490234375,
-0.0022525787353515625,
-0.042724609375,
0.047760009765625,
-0.042083740234375,
-0.044647216796875,
0.07720947265625,
0.05841064453125,
0.06927490234375,
-0.00438690185546875,
0.02301025390625,
0.029449462890625,
0.042083740234375,
0.005466461181640625,
0.050567626953125,
0.024658203125,
-0.046600341796875,
-0.0233917236328125,
-0.016265869140625,
-0.031097412109375,
0.0145721435546875,
-0.036834716796875,
0.0224609375,
-0.053985595703125,
-0.0043792724609375,
-0.0013637542724609375,
0.022186279296875,
-0.041351318359375,
-0.01071929931640625,
-0.04180908203125,
0.074951171875,
-0.03997802734375,
0.057403564453125,
0.051544189453125,
-0.060028076171875,
-0.07958984375,
-0.0167999267578125,
0.013702392578125,
-0.0748291015625,
0.02850341796875,
-0.0037384033203125,
-0.031829833984375,
0.01224517822265625,
-0.058197021484375,
-0.0677490234375,
0.1007080078125,
0.054534912109375,
-0.016204833984375,
0.0015439987182617188,
0.0013647079467773438,
0.029205322265625,
-0.0278778076171875,
0.0241851806640625,
0.0099639892578125,
0.04241943359375,
0.0092926025390625,
-0.067138671875,
0.040313720703125,
-0.0592041015625,
-0.012054443359375,
0.032470703125,
-0.08050537109375,
0.07855224609375,
-0.00763702392578125,
-0.005214691162109375,
0.00832366943359375,
0.034027099609375,
0.0435791015625,
0.030303955078125,
0.031280517578125,
0.0423583984375,
0.041412353515625,
-0.0262298583984375,
0.06396484375,
-0.0079498291015625,
0.035247802734375,
0.065673828125,
-0.0133209228515625,
0.031768798828125,
0.0176849365234375,
-0.031097412109375,
0.05145263671875,
0.033538818359375,
-0.0221405029296875,
0.0186309814453125,
-0.00104522705078125,
0.003414154052734375,
-0.035400390625,
0.007354736328125,
-0.03717041015625,
0.010955810546875,
0.029815673828125,
0.006778717041015625,
-0.0191802978515625,
-0.00384521484375,
0.00215911865234375,
0.01532745361328125,
0.0004584789276123047,
0.03814697265625,
0.0101470947265625,
-0.0408935546875,
0.032379150390625,
0.010040283203125,
0.0333251953125,
-0.06103515625,
-0.0239715576171875,
-0.00905609130859375,
0.010986328125,
-0.0006737709045410156,
-0.058868408203125,
0.0179595947265625,
-0.0186767578125,
-0.00826263427734375,
0.0016107559204101562,
0.037384033203125,
0.01080322265625,
-0.036712646484375,
0.019927978515625,
0.03619384765625,
0.001861572265625,
-0.01092529296875,
-0.061309814453125,
-0.00225067138671875,
-0.007843017578125,
-0.01340484619140625,
0.005985260009765625,
0.020477294921875,
-0.032073974609375,
0.07122802734375,
0.054107666015625,
-0.0109100341796875,
0.0025348663330078125,
-0.007411956787109375,
0.07244873046875,
-0.051849365234375,
-0.048095703125,
-0.062042236328125,
0.0389404296875,
-0.00738525390625,
-0.029296875,
0.055572509765625,
0.04693603515625,
0.046142578125,
-0.01727294921875,
0.033721923828125,
0.0014257431030273438,
0.0263671875,
-0.04010009765625,
0.0538330078125,
-0.044921875,
0.034271240234375,
-0.01458740234375,
-0.05169677734375,
-0.0229949951171875,
0.037139892578125,
-0.0193939208984375,
0.0207366943359375,
0.06787109375,
0.05291748046875,
-0.00821685791015625,
0.0010585784912109375,
-0.00835418701171875,
0.0142669677734375,
0.045257568359375,
0.0584716796875,
0.04315185546875,
-0.04693603515625,
0.030975341796875,
-0.02008056640625,
-0.00824737548828125,
-0.0101165771484375,
-0.041961669921875,
-0.06475830078125,
-0.054229736328125,
-0.0197906494140625,
-0.056793212890625,
-0.015777587890625,
0.0726318359375,
0.0484619140625,
-0.036956787109375,
-0.0152740478515625,
-0.020599365234375,
0.037628173828125,
-0.0259857177734375,
-0.023468017578125,
0.04522705078125,
0.0035152435302734375,
-0.048583984375,
0.019317626953125,
0.0230712890625,
0.007724761962890625,
-0.0228271484375,
-0.034027099609375,
-0.0193023681640625,
0.031707763671875,
0.032257080078125,
0.033538818359375,
-0.075927734375,
-0.005645751953125,
0.04443359375,
-0.0003879070281982422,
0.011260986328125,
0.048736572265625,
-0.0750732421875,
0.027557373046875,
0.034576416015625,
0.030181884765625,
0.034942626953125,
-0.0322265625,
0.038360595703125,
-0.0224456787109375,
0.00907135009765625,
0.0301513671875,
0.028839111328125,
-0.0163726806640625,
-0.0439453125,
0.07049560546875,
0.035491943359375,
-0.009521484375,
-0.0872802734375,
0.005008697509765625,
-0.107177734375,
-0.0146484375,
0.06671142578125,
-0.0028171539306640625,
-0.01666259765625,
0.0037784576416015625,
-0.0170440673828125,
0.01184844970703125,
-0.050567626953125,
0.051788330078125,
0.04547119140625,
-0.004924774169921875,
-0.005283355712890625,
-0.031524658203125,
0.01200103759765625,
0.0216217041015625,
-0.09344482421875,
-0.0171051025390625,
0.01763916015625,
0.0170745849609375,
0.044891357421875,
0.045318603515625,
-0.024688720703125,
0.052001953125,
-0.003910064697265625,
-0.0016918182373046875,
-0.046478271484375,
-0.0275421142578125,
-0.0268096923828125,
-0.002498626708984375,
-0.0247039794921875,
-0.0068511962890625
]
] |
facebook/xlm-roberta-xl | 2022-08-08T07:20:58.000Z | [
"transformers",
"pytorch",
"xlm-roberta-xl",
"fill-mask",
"multilingual",
"af",
"am",
"ar",
"as",
"az",
"be",
"bg",
"bn",
"br",
"bs",
"ca",
"cs",
"cy",
"da",
"de",
"el",
"en",
"eo",
"es",
"et",
"eu",
"fa",
"fi",
"fr",
"fy",
"ga",
"gd",
"gl",
"gu",
"ha",
"he",
"hi",
"hr",
"hu",
"hy",
"id",
"is",
"it",
"ja",
"jv",
"ka",
"kk",
"km",
"kn",
"ko",
"ku",
"ky",
"la",
"lo",
"lt",
"lv",
"mg",
"mk",
"ml",
"mn",
"mr",
"ms",
"my",
"ne",
"nl",
"no",
"om",
"or",
"pa",
"pl",
"ps",
"pt",
"ro",
"ru",
"sa",
"sd",
"si",
"sk",
"sl",
"so",
"sq",
"sr",
"su",
"sv",
"sw",
"ta",
"te",
"th",
"tl",
"tr",
"ug",
"uk",
"ur",
"uz",
"vi",
"xh",
"yi",
"zh",
"arxiv:2105.00572",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | fill-mask | facebook | null | null | facebook/xlm-roberta-xl | 14 | 6,767 | transformers | 2022-03-02T23:29:05 | ---
language:
- multilingual
- af
- am
- ar
- as
- az
- be
- bg
- bn
- br
- bs
- ca
- cs
- cy
- da
- de
- el
- en
- eo
- es
- et
- eu
- fa
- fi
- fr
- fy
- ga
- gd
- gl
- gu
- ha
- he
- hi
- hr
- hu
- hy
- id
- is
- it
- ja
- jv
- ka
- kk
- km
- kn
- ko
- ku
- ky
- la
- lo
- lt
- lv
- mg
- mk
- ml
- mn
- mr
- ms
- my
- ne
- nl
- no
- om
- or
- pa
- pl
- ps
- pt
- ro
- ru
- sa
- sd
- si
- sk
- sl
- so
- sq
- sr
- su
- sv
- sw
- ta
- te
- th
- tl
- tr
- ug
- uk
- ur
- uz
- vi
- xh
- yi
- zh
license: mit
---
# XLM-RoBERTa-XL (xlarge-sized model)
XLM-RoBERTa-XL model pre-trained on 2.5TB of filtered CommonCrawl data containing 100 languages. It was introduced in the paper [Larger-Scale Transformers for Multilingual Masked Language Modeling](https://arxiv.org/abs/2105.00572) by Naman Goyal, Jingfei Du, Myle Ott, Giri Anantharaman, Alexis Conneau and first released in [this repository](https://github.com/pytorch/fairseq/tree/master/examples/xlmr).
Disclaimer: The team releasing XLM-RoBERTa-XL did not write a model card for this model, so this model card has been written by the Hugging Face team.
## Model description
XLM-RoBERTa-XL is an extra-large multilingual version of RoBERTa. It is pre-trained on 2.5TB of filtered CommonCrawl data containing 100 languages.
RoBERTa is a transformers model pretrained on a large corpus in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labeling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts.
More precisely, it was pretrained with the masked language modeling (MLM) objective: taking a sentence, the model randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs), which usually see the words one after the other, and from autoregressive models like GPT, which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence.
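As a rough illustration of the masking step (not the actual pretraining code, which additionally leaves some selected positions unchanged or replaces them with random tokens), selecting and masking 15% of the tokens can be sketched as:

```python
import random

def mask_tokens(tokens, mask_token="<mask>", ratio=0.15, seed=0):
    """Replace a random ~15% of tokens with the mask token (simplified MLM)."""
    rng = random.Random(seed)
    n_mask = max(1, round(len(tokens) * ratio))
    positions = set(rng.sample(range(len(tokens)), n_mask))
    masked = [mask_token if i in positions else tok for i, tok in enumerate(tokens)]
    return masked, sorted(positions)

tokens = "Europe is a beautiful continent full of history".split()
masked, positions = mask_tokens(tokens)
```

During pretraining, the model's loss is computed only at the masked positions.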
This way, the model learns an inner representation of 100 languages that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the XLM-RoBERTa-XL model as inputs.
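For example, one common way to turn the token-level hidden states into a fixed-size sentence feature is masked mean pooling. The sketch below uses random arrays as a stand-in for real model outputs (with the real model, `hidden_states` would come from the final layer of XLM-RoBERTa-XL):

```python
import numpy as np

# Stand-in for model outputs: (batch, seq_len, hidden_size) hidden states.
rng = np.random.RandomState(0)
hidden_states = rng.randn(2, 5, 8)
attention_mask = np.array([[1, 1, 1, 0, 0],   # sentence 1 has 2 padding tokens
                           [1, 1, 1, 1, 1]])  # sentence 2 has none

# Mean-pool over non-padding tokens -> one fixed-size vector per sentence,
# usable as input features for a standard downstream classifier.
mask = attention_mask[:, :, None]
features = (hidden_states * mask).sum(axis=1) / mask.sum(axis=1)
```

The resulting `features` array has shape `(batch, hidden_size)` and can be fed directly to any standard classifier.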
## Intended uses & limitations
You can use the raw model for masked language modeling, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?search=xlm-roberta-xl) to look for fine-tuned versions on a task that interests you.
Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation, you should look at models like GPT2.
## Usage
You can use this model directly with a pipeline for masked language modeling:
```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='facebook/xlm-roberta-xl')
>>> unmasker("Europe is a <mask> continent.")
[{'score': 0.08562745153903961,
'token': 38043,
'token_str': 'living',
'sequence': 'Europe is a living continent.'},
{'score': 0.0799778401851654,
'token': 103494,
'token_str': 'dead',
'sequence': 'Europe is a dead continent.'},
{'score': 0.046154674142599106,
'token': 72856,
'token_str': 'lost',
'sequence': 'Europe is a lost continent.'},
{'score': 0.04358183592557907,
'token': 19336,
'token_str': 'small',
'sequence': 'Europe is a small continent.'},
{'score': 0.040570393204689026,
'token': 34923,
'token_str': 'beautiful',
'sequence': 'Europe is a beautiful continent.'}]
```
Here is how to use this model to get the features of a given text in PyTorch:
```python
from transformers import AutoTokenizer, AutoModelForMaskedLM
tokenizer = AutoTokenizer.from_pretrained('facebook/xlm-roberta-xl')
model = AutoModelForMaskedLM.from_pretrained("facebook/xlm-roberta-xl")
# prepare input
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
# forward pass
output = model(**encoded_input)
```
### BibTeX entry and citation info
```bibtex
@article{DBLP:journals/corr/abs-2105-00572,
author = {Naman Goyal and
Jingfei Du and
Myle Ott and
Giri Anantharaman and
Alexis Conneau},
title = {Larger-Scale Transformers for Multilingual Masked Language Modeling},
journal = {CoRR},
volume = {abs/2105.00572},
year = {2021},
url = {https://arxiv.org/abs/2105.00572},
eprinttype = {arXiv},
eprint = {2105.00572},
timestamp = {Wed, 12 May 2021 15:54:31 +0200},
biburl = {https://dblp.org/rec/journals/corr/abs-2105-00572.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
``` | 5,035 | [
[
-0.0367431640625,
-0.0631103515625,
0.0131683349609375,
0.0170440673828125,
-0.01451873779296875,
0.0022869110107421875,
-0.04180908203125,
-0.033935546875,
0.0227203369140625,
0.04638671875,
-0.047760009765625,
-0.039398193359375,
-0.047393798828125,
0.0250701904296875,
-0.0268096923828125,
0.0869140625,
-0.003292083740234375,
-0.008270263671875,
0.006015777587890625,
-0.006824493408203125,
-0.0168914794921875,
-0.061065673828125,
-0.03643798828125,
-0.01580810546875,
0.027740478515625,
0.01294708251953125,
0.0484619140625,
0.045196533203125,
0.027252197265625,
0.033416748046875,
-0.0140533447265625,
0.0189208984375,
-0.02593994140625,
-0.004772186279296875,
0.006992340087890625,
-0.038970947265625,
-0.034942626953125,
0.0170135498046875,
0.046905517578125,
0.05157470703125,
0.015045166015625,
0.0264129638671875,
0.01348876953125,
0.0229644775390625,
-0.0084381103515625,
0.0291290283203125,
-0.043701171875,
0.00830078125,
-0.0112762451171875,
0.005084991455078125,
-0.03155517578125,
-0.0146331787109375,
0.01085662841796875,
-0.0234222412109375,
0.0229339599609375,
0.007602691650390625,
0.09222412109375,
0.005855560302734375,
-0.0175018310546875,
-0.01302337646484375,
-0.05157470703125,
0.07666015625,
-0.05267333984375,
0.022918701171875,
0.0245819091796875,
0.003925323486328125,
-0.0021343231201171875,
-0.063720703125,
-0.052825927734375,
-0.015380859375,
-0.03240966796875,
0.01187896728515625,
-0.03173828125,
-0.01169586181640625,
0.01922607421875,
0.0333251953125,
-0.06048583984375,
-0.004627227783203125,
-0.035888671875,
-0.0299530029296875,
0.051971435546875,
-0.0017251968383789062,
0.027252197265625,
-0.039154052734375,
-0.0262908935546875,
-0.0257415771484375,
-0.033477783203125,
0.005908966064453125,
0.03369140625,
0.045135498046875,
-0.0276641845703125,
0.037353515625,
-0.008056640625,
0.0589599609375,
0.005550384521484375,
-0.00899505615234375,
0.042938232421875,
-0.018280029296875,
-0.00910186767578125,
-0.0162353515625,
0.08447265625,
0.007587432861328125,
0.0186767578125,
-0.006450653076171875,
-0.00789642333984375,
0.0042266845703125,
-0.0028247833251953125,
-0.05706787109375,
-0.02447509765625,
0.0175323486328125,
-0.0419921875,
-0.01433563232421875,
0.007648468017578125,
-0.05767822265625,
0.01256561279296875,
-0.0234527587890625,
0.047027587890625,
-0.030792236328125,
-0.0224761962890625,
0.0037097930908203125,
0.004425048828125,
-0.003551483154296875,
-0.000942230224609375,
-0.049713134765625,
0.0157012939453125,
0.0333251953125,
0.0660400390625,
-0.001834869384765625,
-0.022125244140625,
-0.03021240234375,
-0.0306396484375,
-0.0083770751953125,
0.0477294921875,
-0.0305023193359375,
-0.00481414794921875,
0.002658843994140625,
0.0347900390625,
-0.017669677734375,
-0.0281982421875,
0.04241943359375,
-0.0279388427734375,
0.041168212890625,
0.0092010498046875,
-0.0247344970703125,
-0.03070068359375,
0.01279449462890625,
-0.051361083984375,
0.08740234375,
0.017242431640625,
-0.049346923828125,
0.00782012939453125,
-0.042144775390625,
-0.0233917236328125,
-0.004863739013671875,
0.01158905029296875,
-0.053253173828125,
-0.01082611083984375,
0.027252197265625,
0.039306640625,
-0.005542755126953125,
0.01312255859375,
-0.0162811279296875,
-0.0061492919921875,
0.029144287109375,
-0.0230560302734375,
0.08367919921875,
0.034271240234375,
-0.0299530029296875,
0.0028285980224609375,
-0.06231689453125,
0.0101470947265625,
0.01320648193359375,
-0.0220184326171875,
-0.01373291015625,
-0.01332855224609375,
0.0277099609375,
0.026763916015625,
0.0175933837890625,
-0.040069580078125,
0.003993988037109375,
-0.03790283203125,
0.050994873046875,
0.0394287109375,
-0.019195556640625,
0.02752685546875,
-0.0198516845703125,
0.048004150390625,
0.0008769035339355469,
0.00811767578125,
-0.02801513671875,
-0.046783447265625,
-0.06591796875,
-0.032073974609375,
0.0477294921875,
0.045928955078125,
-0.04425048828125,
0.0550537109375,
-0.0207366943359375,
-0.040771484375,
-0.055328369140625,
0.00821685791015625,
0.035858154296875,
0.0244903564453125,
0.032562255859375,
-0.0284881591796875,
-0.062744140625,
-0.0528564453125,
-0.00595855712890625,
0.003467559814453125,
0.00021922588348388672,
0.02862548828125,
0.046783447265625,
-0.0240325927734375,
0.0712890625,
-0.0386962890625,
-0.0355224609375,
-0.041717529296875,
0.009674072265625,
0.031219482421875,
0.043731689453125,
0.043426513671875,
-0.05633544921875,
-0.0528564453125,
-0.005664825439453125,
-0.042083740234375,
-0.005466461181640625,
-0.0029582977294921875,
-0.01308441162109375,
0.040191650390625,
0.03204345703125,
-0.050689697265625,
0.02392578125,
0.056427001953125,
-0.0224151611328125,
0.0271759033203125,
-0.0157012939453125,
-0.005931854248046875,
-0.0975341796875,
0.01030731201171875,
-0.0013256072998046875,
-0.0306396484375,
-0.049774169921875,
0.005748748779296875,
0.004077911376953125,
-0.014190673828125,
-0.0215911865234375,
0.04864501953125,
-0.047088623046875,
0.005889892578125,
-0.00421142578125,
0.033477783203125,
0.005741119384765625,
0.053314208984375,
0.01593017578125,
0.032257080078125,
0.042633056640625,
-0.0338134765625,
0.0231170654296875,
0.036224365234375,
-0.029998779296875,
0.014129638671875,
-0.0390625,
0.01009368896484375,
-0.0009336471557617188,
0.019317626953125,
-0.0689697265625,
-0.0021038055419921875,
0.018402099609375,
-0.042999267578125,
0.02874755859375,
-0.0209197998046875,
-0.041168212890625,
-0.052520751953125,
-0.0048065185546875,
0.0241241455078125,
0.049285888671875,
-0.036895751953125,
0.04693603515625,
0.02972412109375,
-0.0214080810546875,
-0.040191650390625,
-0.0567626953125,
0.017059326171875,
-0.0240631103515625,
-0.052520751953125,
0.03619384765625,
-0.014892578125,
0.0009503364562988281,
0.00107574462890625,
0.0205535888671875,
0.006549835205078125,
-0.0022640228271484375,
0.01186370849609375,
0.0245208740234375,
-0.01428985595703125,
-0.0033817291259765625,
-0.0186309814453125,
-0.0257415771484375,
-0.006481170654296875,
-0.03448486328125,
0.07012939453125,
-0.00937652587890625,
0.005252838134765625,
-0.038360595703125,
0.0262603759765625,
0.016845703125,
-0.0361328125,
0.053985595703125,
0.0631103515625,
-0.0232086181640625,
-0.01206207275390625,
-0.036224365234375,
-0.01166534423828125,
-0.034271240234375,
0.040130615234375,
-0.030303955078125,
-0.06634521484375,
0.05615234375,
0.01397705078125,
-0.00875091552734375,
0.052642822265625,
0.05242919921875,
0.0107879638671875,
0.08282470703125,
0.057220458984375,
-0.00550079345703125,
0.0352783203125,
-0.045257568359375,
0.031280517578125,
-0.061981201171875,
-0.01425933837890625,
-0.048248291015625,
-0.016204833984375,
-0.06695556640625,
-0.0325927734375,
0.01203155517578125,
0.0038509368896484375,
-0.0093841552734375,
0.05413818359375,
-0.04571533203125,
0.0142822265625,
0.05413818359375,
0.0118865966796875,
0.01192474365234375,
0.00954437255859375,
-0.01507568359375,
-0.00435638427734375,
-0.05029296875,
-0.03759765625,
0.08135986328125,
0.0257568359375,
0.047088623046875,
-0.0028209686279296875,
0.055206298828125,
0.0014123916625976562,
0.0186004638671875,
-0.0433349609375,
0.0364990234375,
-0.0231781005859375,
-0.055511474609375,
-0.015838623046875,
-0.033416748046875,
-0.08050537109375,
0.0209197998046875,
-0.01983642578125,
-0.06256103515625,
0.01213836669921875,
0.0014553070068359375,
-0.0298919677734375,
0.0240478515625,
-0.048828125,
0.0694580078125,
-0.01206207275390625,
-0.01271820068359375,
0.007450103759765625,
-0.051483154296875,
0.0169677734375,
-0.0146636962890625,
0.0229644775390625,
0.00608062744140625,
0.0225067138671875,
0.054779052734375,
-0.03662109375,
0.07098388671875,
0.0036907196044921875,
0.0004901885986328125,
0.0218048095703125,
0.0010290145874023438,
0.0335693359375,
-0.0081787109375,
0.0014495849609375,
0.03594970703125,
-0.00786590576171875,
-0.021270751953125,
-0.0260467529296875,
0.04840087890625,
-0.07281494140625,
-0.040771484375,
-0.0355224609375,
-0.05010986328125,
0.01076507568359375,
0.02294921875,
0.034027099609375,
0.0401611328125,
0.0044403076171875,
0.0143890380859375,
0.03485107421875,
-0.036712646484375,
0.04241943359375,
0.030181884765625,
-0.0258636474609375,
-0.0404052734375,
0.05487060546875,
0.0181427001953125,
0.0198822021484375,
0.03594970703125,
0.0136871337890625,
-0.0280609130859375,
-0.02923583984375,
-0.0250091552734375,
0.025665283203125,
-0.041717529296875,
-0.0253753662109375,
-0.078125,
-0.036590576171875,
-0.045074462890625,
0.004772186279296875,
-0.0165252685546875,
-0.046783447265625,
-0.0264129638671875,
-0.005237579345703125,
0.040191650390625,
0.0423583984375,
-0.02606201171875,
0.019317626953125,
-0.0484619140625,
0.022857666015625,
0.010650634765625,
0.0103759765625,
-0.01055145263671875,
-0.0677490234375,
-0.026031494140625,
0.0125274658203125,
-0.0293121337890625,
-0.04583740234375,
0.06591796875,
0.0170135498046875,
0.0380859375,
0.0230865478515625,
0.00850677490234375,
0.0589599609375,
-0.039794921875,
0.0592041015625,
0.0030841827392578125,
-0.0689697265625,
0.036163330078125,
-0.00775146484375,
0.0099945068359375,
0.007633209228515625,
0.02801513671875,
-0.044921875,
-0.05194091796875,
-0.057159423828125,
-0.0753173828125,
0.07098388671875,
0.0283355712890625,
0.03350830078125,
-0.003276824951171875,
0.0155487060546875,
0.004825592041015625,
0.0087738037109375,
-0.09271240234375,
-0.044189453125,
-0.031280517578125,
-0.031097412109375,
-0.0318603515625,
-0.0147705078125,
-0.0047607421875,
-0.02569580078125,
0.056671142578125,
-0.0020771026611328125,
0.039947509765625,
0.023223876953125,
-0.031463623046875,
-0.0009374618530273438,
0.0036640167236328125,
0.044952392578125,
0.0247802734375,
-0.01739501953125,
0.0084686279296875,
0.0086517333984375,
-0.036956787109375,
-0.01271820068359375,
0.038055419921875,
-0.0152740478515625,
0.014190673828125,
0.0290679931640625,
0.06695556640625,
0.006870269775390625,
-0.035003662109375,
0.033111572265625,
0.0074615478515625,
-0.0198974609375,
-0.040924072265625,
-0.0019512176513671875,
0.0020122528076171875,
0.023406982421875,
0.0305328369140625,
0.005130767822265625,
-0.01050567626953125,
-0.047271728515625,
0.022552490234375,
0.03314208984375,
-0.031280517578125,
-0.0273284912109375,
0.0615234375,
-0.008148193359375,
-0.034149169921875,
0.044647216796875,
-0.01898193359375,
-0.06634521484375,
0.048797607421875,
0.049591064453125,
0.0648193359375,
-0.01148223876953125,
0.02374267578125,
0.046234130859375,
0.0310516357421875,
0.0042266845703125,
0.006130218505859375,
0.00743865966796875,
-0.05828857421875,
-0.03155517578125,
-0.06622314453125,
0.0030117034912109375,
0.0243988037109375,
-0.0419921875,
0.02276611328125,
-0.01412200927734375,
-0.01593017578125,
0.004474639892578125,
0.0175933837890625,
-0.0577392578125,
0.0186309814453125,
0.00678253173828125,
0.05743408203125,
-0.07037353515625,
0.061065673828125,
0.042205810546875,
-0.048797607421875,
-0.07373046875,
-0.0190887451171875,
-0.01220703125,
-0.06292724609375,
0.0594482421875,
0.0255889892578125,
0.02740478515625,
0.007221221923828125,
-0.03826904296875,
-0.083740234375,
0.07452392578125,
0.01059722900390625,
-0.0333251953125,
0.00980377197265625,
0.029205322265625,
0.049591064453125,
-0.0458984375,
0.03302001953125,
0.02667236328125,
0.035491943359375,
-0.00835418701171875,
-0.06793212890625,
0.0122833251953125,
-0.0280303955078125,
0.00421142578125,
0.0031757354736328125,
-0.060821533203125,
0.087646484375,
-0.01203155517578125,
-0.01175689697265625,
0.0261993408203125,
0.04058837890625,
0.004932403564453125,
-0.0027523040771484375,
0.03387451171875,
0.047088623046875,
0.048370361328125,
-0.027679443359375,
0.07196044921875,
-0.03759765625,
0.057586669921875,
0.06829833984375,
0.000347137451171875,
0.05975341796875,
0.008544921875,
-0.0207977294921875,
0.057525634765625,
0.05487060546875,
-0.0244598388671875,
0.03338623046875,
0.004535675048828125,
-0.0011167526245117188,
-0.0170135498046875,
0.016937255859375,
-0.02484130859375,
0.042144775390625,
0.00949859619140625,
-0.040069580078125,
-0.01042938232421875,
0.01629638671875,
0.0313720703125,
0.004543304443359375,
-0.0081634521484375,
0.04571533203125,
0.01290130615234375,
-0.053741455078125,
0.053863525390625,
0.0110931396484375,
0.052947998046875,
-0.0386962890625,
0.0105438232421875,
-0.027557373046875,
0.0244293212890625,
-0.01103973388671875,
-0.03863525390625,
0.0167083740234375,
0.007587432861328125,
-0.0157318115234375,
-0.024932861328125,
0.0305328369140625,
-0.0548095703125,
-0.0665283203125,
0.0240020751953125,
0.0225372314453125,
0.018157958984375,
-0.0022563934326171875,
-0.06658935546875,
0.0015726089477539062,
0.01125335693359375,
-0.032073974609375,
0.0166473388671875,
0.036712646484375,
-0.001575469970703125,
0.049285888671875,
0.059600830078125,
0.006832122802734375,
0.00940704345703125,
-0.0023288726806640625,
0.053619384765625,
-0.052581787109375,
-0.036865234375,
-0.058807373046875,
0.0526123046875,
-0.006618499755859375,
-0.0152435302734375,
0.06878662109375,
0.049591064453125,
0.064453125,
-0.01092529296875,
0.054840087890625,
-0.0178680419921875,
0.0460205078125,
-0.033416748046875,
0.0653076171875,
-0.042144775390625,
0.00457763671875,
-0.031280517578125,
-0.0711669921875,
-0.035064697265625,
0.06231689453125,
-0.0124664306640625,
0.0169830322265625,
0.053314208984375,
0.072998046875,
-0.0023345947265625,
-0.0225372314453125,
0.0189361572265625,
0.04473876953125,
0.0174713134765625,
0.03094482421875,
0.04034423828125,
-0.051177978515625,
0.0589599609375,
-0.032379150390625,
-0.0167083740234375,
-0.01194000244140625,
-0.06402587890625,
-0.07745361328125,
-0.061614990234375,
-0.03363037109375,
-0.04241943359375,
-0.0162811279296875,
0.0655517578125,
0.05804443359375,
-0.07220458984375,
-0.0146331787109375,
0.002521514892578125,
0.00992584228515625,
-0.01910400390625,
-0.02313232421875,
0.050262451171875,
-0.0318603515625,
-0.085205078125,
0.003345489501953125,
0.0017032623291015625,
0.00910186767578125,
-0.0254669189453125,
0.0038242340087890625,
-0.0153045654296875,
0.0015764236450195312,
0.048797607421875,
0.0106048583984375,
-0.054229736328125,
-0.02227783203125,
0.005741119384765625,
-0.004058837890625,
0.00954437255859375,
0.0286712646484375,
-0.05987548828125,
0.025238037109375,
0.023681640625,
0.01349639892578125,
0.0484619140625,
-0.01503753662109375,
0.03472900390625,
-0.051513671875,
0.0270233154296875,
-0.0016698837280273438,
0.035980224609375,
0.033477783203125,
-0.0208282470703125,
0.0292205810546875,
0.0197296142578125,
-0.039520263671875,
-0.06787109375,
0.0035037994384765625,
-0.08245849609375,
-0.0169219970703125,
0.08843994140625,
-0.0265960693359375,
-0.0250396728515625,
-0.00882720947265625,
-0.01593017578125,
0.02386474609375,
-0.01273345947265625,
0.066162109375,
0.046112060546875,
0.0149383544921875,
-0.0220184326171875,
-0.035797119140625,
0.03656005859375,
0.01346588134765625,
-0.042236328125,
-0.00650787353515625,
0.0107879638671875,
0.03643798828125,
0.03656005859375,
0.0279693603515625,
-0.0180511474609375,
-0.0099029541015625,
-0.01312255859375,
0.022308349609375,
0.0031871795654296875,
-0.00931549072265625,
-0.0213165283203125,
0.0075836181640625,
-0.025390625,
-0.006717681884765625
]
] |
mncai/Mistral-7B-openplatypus-1k | 2023-10-11T07:42:43.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"MindsAndCompany",
"llama-2",
"en",
"dataset:garage-bAInd/Open-Platypus",
"arxiv:2306.02707",
"license:llama2",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text-generation | mncai | null | null | mncai/Mistral-7B-openplatypus-1k | 0 | 6,766 | transformers | 2023-10-10T08:40:01 | ---
pipeline_tag: text-generation
license: llama2
language:
- en
library_name: transformers
tags:
- MindsAndCompany
- llama-2
datasets:
- garage-bAInd/Open-Platypus
---
## Model Details
* **Developed by**: [Minds And Company](https://mnc.ai/)
* **Backbone Model**: [Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1)
* **Library**: [HuggingFace Transformers](https://github.com/huggingface/transformers)
## Dataset Details
### Used Datasets
- Orca-style dataset
- Alpaca-style dataset
### Prompt Template
- Llama Prompt Template
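The card does not spell the template out; as a hedged sketch, the widely used Llama-2 chat format looks like the following (the exact template this model was fine-tuned with may differ, so verify against the model's tokenizer configuration before use):

```python
def llama2_prompt(system: str, user: str) -> str:
    """Build a single-turn prompt in the common Llama-2 chat format.

    Illustrative helper only, not an official API for this model.
    """
    return f"<s>[INST] <<SYS>>\n{system}\n<</SYS>>\n\n{user} [/INST]"

prompt = llama2_prompt("You are a helpful assistant.",
                       "Summarize Platypus in one sentence.")
```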
## Limitations & Biases:
Llama 2 and its fine-tuned variants are a new technology that carries risks with use. Testing conducted to date has been in English and has not covered, nor could it cover, all scenarios. For these reasons, as with all LLMs, the potential outputs of Llama 2 and any fine-tuned variant cannot be predicted in advance, and the model may in some instances produce inaccurate, biased or otherwise objectionable responses to user prompts. Therefore, before deploying any application of a Llama 2 variant, developers should perform safety testing and tuning tailored to their specific application of the model.
Please see the Responsible Use Guide available at https://ai.meta.com/llama/responsible-use-guide/
## License Disclaimer:
This model is bound by the license and usage restrictions of the original Llama-2 model, and comes with no warranty or guarantees of any kind.
## Contact Us
- [Minds And Company](https://mnc.ai/)
## Citation:
Please kindly cite using the following BibTeX:
```bibtex
@article{platypus2023,
title={Platypus: Quick, Cheap, and Powerful Refinement of LLMs},
author={Ariel N. Lee and Cole J. Hunter and Nataniel Ruiz},
booktitle={arXiv preprint arxiv:2308.07317},
year={2023}
}
```

```
@misc{mukherjee2023orca,
title={Orca: Progressive Learning from Complex Explanation Traces of GPT-4},
author={Subhabrata Mukherjee and Arindam Mitra and Ganesh Jawahar and Sahaj Agarwal and Hamid Palangi and Ahmed Awadallah},
year={2023},
eprint={2306.02707},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
```
@misc{Orca-best,
title = {Orca-best: A filtered version of orca gpt4 dataset.},
author = {Shahul Es},
year = {2023},
publisher = {HuggingFace},
journal = {HuggingFace repository},
  howpublished = {\url{https://huggingface.co/datasets/shahules786/orca-best/}},
}
```
```
@software{touvron2023llama2,
title={Llama 2: Open Foundation and Fine-Tuned Chat Models},
author={Hugo Touvron, Louis Martin, Kevin Stone, Peter Albert, Amjad Almahairi, Yasmine Babaei, Nikolay Bashlykov, Soumya Batra, Prajjwal Bhargava,
Shruti Bhosale, Dan Bikel, Lukas Blecher, Cristian Canton Ferrer, Moya Chen, Guillem Cucurull, David Esiobu, Jude Fernandes, Jeremy Fu, Wenyin Fu, Brian Fuller,
Cynthia Gao, Vedanuj Goswami, Naman Goyal, Anthony Hartshorn, Saghar Hosseini, Rui Hou, Hakan Inan, Marcin Kardas, Viktor Kerkez, Madian Khabsa, Isabel Kloumann,
Artem Korenev, Punit Singh Koura, Marie-Anne Lachaux, Thibaut Lavril, Jenya Lee, Diana Liskovich, Yinghai Lu, Yuning Mao, Xavier Martinet, Todor Mihaylov,
Pushkar Mishra, Igor Molybog, Yixin Nie, Andrew Poulton, Jeremy Reizenstein, Rashi Rungta, Kalyan Saladi, Alan Schelten, Ruan Silva, Eric Michael Smith,
Ranjan Subramanian, Xiaoqing Ellen Tan, Binh Tang, Ross Taylor, Adina Williams, Jian Xiang Kuan, Puxin Xu, Zheng Yan, Iliyan Zarov, Yuchen Zhang, Angela Fan,
Melanie Kambadur, Sharan Narang, Aurelien Rodriguez, Robert Stojnic, Sergey Edunov, Thomas Scialom},
year={2023}
}
```
> Readme format: [Riiid/sheep-duck-llama-2-70b-v1.1](https://huggingface.co/Riiid/sheep-duck-llama-2-70b-v1.1) | 3,667 | [
[
-0.033111572265625,
-0.048370361328125,
0.01297760009765625,
0.0187530517578125,
-0.026763916015625,
0.007015228271484375,
0.01043701171875,
-0.05267333984375,
0.0184478759765625,
0.0265960693359375,
-0.05340576171875,
-0.042633056640625,
-0.046142578125,
-0.0021915435791015625,
-0.01995849609375,
0.07977294921875,
-0.006740570068359375,
-0.033172607421875,
-0.00017452239990234375,
-0.0155181884765625,
-0.028076171875,
-0.0188751220703125,
-0.051727294921875,
-0.02532958984375,
0.0208740234375,
0.0192718505859375,
0.0565185546875,
0.04595947265625,
0.03570556640625,
0.02227783203125,
-0.032470703125,
0.0211181640625,
-0.034088134765625,
-0.0169525146484375,
0.005939483642578125,
-0.036224365234375,
-0.07415771484375,
-0.0010900497436523438,
0.0283966064453125,
0.0213470458984375,
-0.01546478271484375,
0.02734375,
0.0195465087890625,
0.040863037109375,
-0.0206146240234375,
0.0285186767578125,
-0.031341552734375,
0.00012022256851196289,
-0.0286407470703125,
-0.01177215576171875,
-0.0020008087158203125,
-0.026397705078125,
0.0031185150146484375,
-0.055328369140625,
0.00435638427734375,
-0.00406646728515625,
0.09539794921875,
0.0263671875,
-0.036773681640625,
-0.007720947265625,
-0.025787353515625,
0.058868408203125,
-0.0653076171875,
0.022064208984375,
0.020965576171875,
0.01898193359375,
-0.036956787109375,
-0.06268310546875,
-0.051727294921875,
0.00001341104507446289,
-0.0139617919921875,
0.0182647705078125,
-0.0260162353515625,
-0.006107330322265625,
0.0141754150390625,
0.0290679931640625,
-0.04107666015625,
0.0141448974609375,
-0.042144775390625,
-0.022216796875,
0.050628662109375,
0.01305389404296875,
0.01110076904296875,
-0.01369476318359375,
-0.046295166015625,
-0.0256805419921875,
-0.058441162109375,
0.03546142578125,
0.032135009765625,
0.00635528564453125,
-0.05584716796875,
0.04852294921875,
-0.01250457763671875,
0.033447265625,
0.00640106201171875,
-0.037506103515625,
0.04388427734375,
-0.0404052734375,
-0.013824462890625,
-0.020416259765625,
0.058349609375,
0.031646728515625,
0.00899505615234375,
0.022674560546875,
-0.00907135009765625,
0.0024051666259765625,
-0.0081024169921875,
-0.0574951171875,
-0.00629425048828125,
0.03179931640625,
-0.033111572265625,
-0.0213775634765625,
-0.00908660888671875,
-0.07061767578125,
-0.0230255126953125,
-0.016082763671875,
0.0121307373046875,
-0.0126495361328125,
-0.0380859375,
0.0171051025390625,
0.01032257080078125,
0.049041748046875,
0.01299285888671875,
-0.05572509765625,
0.024810791015625,
0.038726806640625,
0.062408447265625,
-0.01200103759765625,
-0.0099639892578125,
-0.0069122314453125,
0.0055694580078125,
-0.020233154296875,
0.064453125,
-0.0201568603515625,
-0.02978515625,
-0.012786865234375,
0.006885528564453125,
0.004138946533203125,
-0.02972412109375,
0.046844482421875,
-0.0236663818359375,
0.0181121826171875,
-0.024139404296875,
-0.01552581787109375,
-0.032684326171875,
0.0117340087890625,
-0.03680419921875,
0.0882568359375,
0.0104522705078125,
-0.05767822265625,
0.0182647705078125,
-0.043212890625,
-0.01230621337890625,
-0.0234832763671875,
-0.01032257080078125,
-0.0595703125,
-0.025299072265625,
0.02789306640625,
0.0272369384765625,
-0.03375244140625,
0.019775390625,
-0.032684326171875,
-0.01824951171875,
0.004383087158203125,
-0.01247406005859375,
0.0743408203125,
0.017333984375,
-0.0504150390625,
0.016876220703125,
-0.053497314453125,
-0.0126953125,
0.03302001953125,
-0.0193328857421875,
0.00858306884765625,
0.0009751319885253906,
-0.0165557861328125,
0.019134521484375,
0.0301361083984375,
-0.0298614501953125,
0.0159149169921875,
-0.0244903564453125,
0.041534423828125,
0.05645751953125,
-0.0032367706298828125,
0.0167694091796875,
-0.043792724609375,
0.044708251953125,
0.00809478759765625,
0.042694091796875,
0.002925872802734375,
-0.060577392578125,
-0.066162109375,
-0.031585693359375,
-0.0059356689453125,
0.049591064453125,
-0.0308837890625,
0.039398193359375,
-0.011077880859375,
-0.051513671875,
-0.026275634765625,
0.0113525390625,
0.0297393798828125,
0.04046630859375,
0.034332275390625,
-0.028717041015625,
-0.04315185546875,
-0.0740966796875,
-0.0016641616821289062,
-0.022674560546875,
0.004573822021484375,
0.0279541015625,
0.0338134765625,
-0.024169921875,
0.07269287109375,
-0.031829833984375,
-0.0296173095703125,
-0.01349639892578125,
-0.0149688720703125,
0.0292510986328125,
0.038360595703125,
0.056121826171875,
-0.05072021484375,
-0.020416259765625,
-0.01284027099609375,
-0.0538330078125,
-0.01177215576171875,
0.00659942626953125,
-0.0280609130859375,
0.01433563232421875,
0.0240631103515625,
-0.05743408203125,
0.05328369140625,
0.0546875,
-0.0311431884765625,
0.0428466796875,
0.002307891845703125,
-0.004268646240234375,
-0.0809326171875,
0.00855255126953125,
0.006496429443359375,
-0.0078887939453125,
-0.03350830078125,
-0.00531768798828125,
-0.005558013916015625,
0.0101776123046875,
-0.032470703125,
0.04931640625,
-0.032928466796875,
-0.002445220947265625,
-0.0050201416015625,
0.0166778564453125,
0.0009140968322753906,
0.04937744140625,
-0.01543426513671875,
0.054534912109375,
0.040771484375,
-0.0301666259765625,
0.01384735107421875,
0.032623291015625,
-0.0245208740234375,
0.03546142578125,
-0.072021484375,
0.01502227783203125,
0.002307891845703125,
0.048614501953125,
-0.0965576171875,
-0.0216827392578125,
0.03778076171875,
-0.0304412841796875,
0.028961181640625,
-0.00001519918441772461,
-0.0245361328125,
-0.040374755859375,
-0.043853759765625,
0.0287322998046875,
0.04656982421875,
-0.039794921875,
0.0404052734375,
0.03253173828125,
-0.00838470458984375,
-0.053009033203125,
-0.0545654296875,
-0.01165008544921875,
-0.038238525390625,
-0.0572509765625,
0.0287017822265625,
-0.021942138671875,
0.0011157989501953125,
-0.01453399658203125,
-0.01654052734375,
0.006938934326171875,
0.004764556884765625,
0.02825927734375,
0.03204345703125,
-0.0133819580078125,
-0.022430419921875,
0.010711669921875,
-0.0175628662109375,
-0.0034542083740234375,
-0.00031280517578125,
0.04193115234375,
-0.01513671875,
-0.0275421142578125,
-0.049102783203125,
0.0028781890869140625,
0.034332275390625,
-0.0252838134765625,
0.041839599609375,
0.05047607421875,
-0.02130126953125,
0.01139068603515625,
-0.0552978515625,
-0.0173492431640625,
-0.040374755859375,
0.0182647705078125,
-0.028594970703125,
-0.0709228515625,
0.06939697265625,
0.005779266357421875,
0.0254974365234375,
0.057464599609375,
0.044158935546875,
0.0087738037109375,
0.06256103515625,
0.0433349609375,
0.00234222412109375,
0.0390625,
-0.033355712890625,
0.0020313262939453125,
-0.075439453125,
-0.050079345703125,
-0.035980224609375,
-0.0338134765625,
-0.046661376953125,
-0.038848876953125,
0.0288238525390625,
0.0190582275390625,
-0.046844482421875,
0.02764892578125,
-0.05322265625,
0.01084136962890625,
0.02825927734375,
0.0140228271484375,
0.0213775634765625,
0.00576019287109375,
-0.0161285400390625,
0.004535675048828125,
-0.034210205078125,
-0.044830322265625,
0.08978271484375,
0.036041259765625,
0.04852294921875,
0.024658203125,
0.0364990234375,
0.0004324913024902344,
0.0203094482421875,
-0.036529541015625,
0.040618896484375,
0.01084136962890625,
-0.058197021484375,
-0.01141357421875,
-0.0172119140625,
-0.09130859375,
0.01258087158203125,
-0.00009459257125854492,
-0.07037353515625,
0.034271240234375,
0.00228118896484375,
-0.031494140625,
0.020477294921875,
-0.0386962890625,
0.043426513671875,
-0.00908660888671875,
-0.007640838623046875,
-0.0050506591796875,
-0.06072998046875,
0.0487060546875,
-0.00443267822265625,
0.01409149169921875,
-0.0216827392578125,
-0.0255279541015625,
0.057342529296875,
-0.03729248046875,
0.07666015625,
-0.00719451904296875,
-0.016510009765625,
0.04364013671875,
-0.0025386810302734375,
0.0557861328125,
0.0171661376953125,
-0.00506591796875,
0.0333251953125,
-0.0099945068359375,
-0.0269927978515625,
-0.01837158203125,
0.045989990234375,
-0.08685302734375,
-0.05560302734375,
-0.0295867919921875,
-0.0146942138671875,
0.005283355712890625,
0.00811767578125,
0.0231781005859375,
0.0213775634765625,
0.0239715576171875,
0.0116424560546875,
0.03955078125,
-0.022216796875,
0.031768798828125,
0.0360107421875,
-0.01194000244140625,
-0.04217529296875,
0.04071044921875,
0.01031494140625,
0.0228271484375,
0.01190948486328125,
0.01018524169921875,
-0.0303192138671875,
-0.037994384765625,
-0.0208740234375,
0.039215087890625,
-0.04315185546875,
-0.03143310546875,
-0.0408935546875,
-0.0218353271484375,
-0.0195465087890625,
0.0008697509765625,
-0.037139892578125,
-0.02740478515625,
-0.04669189453125,
-0.0152740478515625,
0.05084228515625,
0.045562744140625,
-0.00629425048828125,
0.0238800048828125,
-0.0270538330078125,
0.01611328125,
0.0213470458984375,
0.020050048828125,
0.0045928955078125,
-0.06121826171875,
0.008880615234375,
0.0133514404296875,
-0.054840087890625,
-0.05340576171875,
0.026336669921875,
0.01432037353515625,
0.052978515625,
0.015777587890625,
-0.0017948150634765625,
0.07037353515625,
-0.0130767822265625,
0.07977294921875,
0.0224151611328125,
-0.058258056640625,
0.043914794921875,
-0.0355224609375,
0.0102691650390625,
0.019683837890625,
0.024200439453125,
-0.0171661376953125,
-0.0175323486328125,
-0.059600830078125,
-0.07476806640625,
0.048095703125,
0.0266265869140625,
0.00858306884765625,
0.007778167724609375,
0.040313720703125,
0.01308441162109375,
0.006103515625,
-0.056243896484375,
-0.038543701171875,
-0.0284576416015625,
-0.0016546249389648438,
-0.0031337738037109375,
-0.0243377685546875,
-0.013397216796875,
-0.0196685791015625,
0.04888916015625,
-0.00565338134765625,
0.036285400390625,
0.02337646484375,
0.0188140869140625,
-0.0129241943359375,
0.00015783309936523438,
0.06610107421875,
0.0467529296875,
-0.0171051025390625,
-0.0102386474609375,
0.0216064453125,
-0.04510498046875,
-0.005779266357421875,
0.01291656494140625,
0.0040435791015625,
-0.01837158203125,
0.029815673828125,
0.06005859375,
0.00698089599609375,
-0.0286712646484375,
0.034393310546875,
0.0034942626953125,
-0.01898193359375,
-0.03802490234375,
0.01074981689453125,
0.015228271484375,
0.0496826171875,
0.0341796875,
0.00829315185546875,
-0.0007476806640625,
-0.0270538330078125,
0.0027179718017578125,
0.020965576171875,
-0.0108489990234375,
-0.037261962890625,
0.07257080078125,
0.002506256103515625,
-0.0127105712890625,
0.03265380859375,
-0.004932403564453125,
-0.0307464599609375,
0.054412841796875,
0.0302276611328125,
0.046783447265625,
-0.0208892822265625,
0.00157928466796875,
0.03973388671875,
0.0190887451171875,
-0.00782012939453125,
0.038360595703125,
0.01015472412109375,
-0.041015625,
-0.0256805419921875,
-0.039642333984375,
-0.0216064453125,
0.0287933349609375,
-0.041839599609375,
0.038360595703125,
-0.040740966796875,
-0.032745361328125,
-0.0173492431640625,
0.018524169921875,
-0.0595703125,
0.0009183883666992188,
0.0019025802612304688,
0.06787109375,
-0.048492431640625,
0.0482177734375,
0.04022216796875,
-0.034454345703125,
-0.08502197265625,
-0.024078369140625,
0.020782470703125,
-0.07244873046875,
0.02783203125,
0.0010890960693359375,
-0.0025005340576171875,
0.002597808837890625,
-0.054931640625,
-0.08673095703125,
0.120361328125,
0.0313720703125,
-0.0377197265625,
0.0113067626953125,
0.0032062530517578125,
0.034271240234375,
-0.0115509033203125,
0.043426513671875,
0.05096435546875,
0.03973388671875,
0.02398681640625,
-0.08837890625,
0.0204315185546875,
-0.0299072265625,
-0.005153656005859375,
-0.0029239654541015625,
-0.09539794921875,
0.08233642578125,
-0.0222930908203125,
-0.005855560302734375,
0.0273895263671875,
0.05560302734375,
0.05792236328125,
0.018402099609375,
0.028228759765625,
0.043914794921875,
0.05438232421875,
-0.00901031494140625,
0.08477783203125,
-0.01206207275390625,
0.03704833984375,
0.0635986328125,
0.00418853759765625,
0.06072998046875,
0.019927978515625,
-0.040802001953125,
0.057281494140625,
0.07623291015625,
0.0021991729736328125,
0.03912353515625,
0.01178741455078125,
0.006229400634765625,
-0.005126953125,
-0.002460479736328125,
-0.0577392578125,
0.029022216796875,
0.03509521484375,
-0.020660400390625,
-0.00966644287109375,
-0.023956298828125,
0.0289154052734375,
-0.0239715576171875,
-0.001262664794921875,
0.044647216796875,
0.01934814453125,
-0.0299224853515625,
0.08074951171875,
-0.014190673828125,
0.064208984375,
-0.053009033203125,
0.002254486083984375,
-0.04315185546875,
0.007083892822265625,
-0.033538818359375,
-0.0577392578125,
0.006420135498046875,
-0.002643585205078125,
0.0029239654541015625,
0.011199951171875,
0.044647216796875,
-0.0022411346435546875,
-0.0255584716796875,
0.0283966064453125,
0.020660400390625,
0.028076171875,
0.021331787109375,
-0.07318115234375,
0.0175018310546875,
0.00965118408203125,
-0.06683349609375,
0.0218658447265625,
0.0302581787109375,
-0.013580322265625,
0.06573486328125,
0.05548095703125,
-0.006988525390625,
0.0103302001953125,
-0.00994873046875,
0.08551025390625,
-0.03302001953125,
-0.028289794921875,
-0.057891845703125,
0.05657958984375,
-0.003864288330078125,
-0.049591064453125,
0.050750732421875,
0.03302001953125,
0.051361083984375,
0.0199432373046875,
0.04461669921875,
-0.00833892822265625,
0.02825927734375,
-0.0186920166015625,
0.04327392578125,
-0.0555419921875,
0.0308685302734375,
-0.017486572265625,
-0.07891845703125,
-0.022796630859375,
0.05670166015625,
-0.0133819580078125,
0.016326904296875,
0.043701171875,
0.07476806640625,
-0.0007042884826660156,
-0.002933502197265625,
-0.007568359375,
0.0272064208984375,
0.04852294921875,
0.048248291015625,
0.041778564453125,
-0.045654296875,
0.05279541015625,
-0.023712158203125,
-0.0258941650390625,
-0.0231475830078125,
-0.0733642578125,
-0.07037353515625,
-0.038665771484375,
-0.0307769775390625,
-0.03466796875,
-0.009796142578125,
0.055084228515625,
0.056243896484375,
-0.05572509765625,
-0.017303466796875,
-0.0010471343994140625,
0.0034637451171875,
-0.01922607421875,
-0.01092529296875,
0.0469970703125,
0.0121917724609375,
-0.0484619140625,
0.017425537109375,
0.00799560546875,
0.032867431640625,
-0.022491455078125,
-0.01751708984375,
-0.00833892822265625,
0.006534576416015625,
0.02789306640625,
0.03570556640625,
-0.06683349609375,
-0.0312042236328125,
-0.007274627685546875,
-0.01346588134765625,
0.0137939453125,
0.011260986328125,
-0.05181884765625,
0.0157012939453125,
0.032470703125,
0.012359619140625,
0.05126953125,
-0.0002758502960205078,
0.010894775390625,
-0.029693603515625,
0.0287628173828125,
-0.002689361572265625,
0.023712158203125,
0.0165557861328125,
-0.024169921875,
0.05767822265625,
0.0180206298828125,
-0.043212890625,
-0.06036376953125,
0.00308990478515625,
-0.10455322265625,
0.00879669189453125,
0.09796142578125,
-0.0196685791015625,
-0.0155029296875,
-0.002605438232421875,
-0.02545166015625,
0.0333251953125,
-0.037139892578125,
0.05694580078125,
0.032989501953125,
-0.009674072265625,
-0.0143585205078125,
-0.052459716796875,
0.039306640625,
0.012176513671875,
-0.062469482421875,
-0.020843505859375,
0.00821685791015625,
0.032562255859375,
0.014434814453125,
0.038543701171875,
-0.02142333984375,
0.00542449951171875,
-0.006443023681640625,
0.00896453857421875,
-0.0152740478515625,
0.0055999755859375,
-0.006244659423828125,
-0.0107574462890625,
-0.01080322265625,
-0.00975799560546875
]
] |
KT-AI/midm-bitext-S-7B-inst-v1 | 2023-11-04T23:37:30.000Z | [
"transformers",
"pytorch",
"midm-bitext-S",
"text-generation",
"custom_code",
"ko",
"en",
"arxiv:2104.09864",
"license:cc-by-nc-4.0",
"has_space",
"region:us"
] | text-generation | KT-AI | null | null | KT-AI/midm-bitext-S-7B-inst-v1 | 36 | 6,765 | transformers | 2023-10-30T15:27:51 | ---
license: cc-by-nc-4.0
language:
- ko
- en
pipeline_tag: text-generation
---
# Mi:dm (**M**indful **I**ntelligence that **D**ialogs, Empathizes, Understands and **M**oves, 믿:음)
Mi:dm은 KT가 개발한 사전학습 한국어-영어 언어모델 입니다.
문자열을 입력으로 하며, 문자열을 생성합니다.
Mi:dm is a pre-trained Korean-English language model developed by KT.
It takes text as input and generates text.
## Model Descriptions
### Midm-bitext-S (7B) Hyper Parameters
| Hyperparameter | Value |
|:---------------------|--------------:|
| \\(n_{layers}\\) | 32 |
| \\(d_{model}\\) | 4,096 |
| \\(d_{ff}\\) | 10,880 |
| \\(n_{heads}\\) | 32 |
| \\(d_{head}\\) | 128 |
| \\(n_{ctx}\\) | 2,048 |
| \\(n_{vocab}\\) | 72,154 |
| Positional Encoding | [Rotary Position Embedding (RoPE)](https://arxiv.org/abs/2104.09864) |
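The positional-encoding row above refers to RoPE; as an illustration only (pure Python, single head, assuming an even head dimension — not Mi:dm's actual implementation), the rotation it applies can be sketched as:

```python
import math

def rope(vec, pos, base=10000.0):
    """Rotate consecutive dimension pairs of `vec` by a position-dependent angle."""
    d = len(vec)  # head dimension, assumed even
    out = list(vec)
    for i in range(0, d, 2):
        theta = pos * base ** (-i / d)
        c, s = math.cos(theta), math.sin(theta)
        x, y = vec[i], vec[i + 1]
        out[i] = x * c - y * s
        out[i + 1] = x * s + y * c
    return out

q = [1.0, 0.0, 0.5, 0.5]
rotated = rope(q, pos=3)
```

Because each pair is only rotated, the vector norm is preserved, and dot products between rotated queries and keys depend only on their relative positions.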
위 파라미터로 계산하면, 모델 로딩에는 약 30GB의 GPU 메모리가 필요합니다.
모델 추론에는 입출력 토큰 수에 비례하여 추가 메모리가 더 소요됩니다.
Based on these parameters, loading the model requires about 30 GB of GPU memory.
Inference consumes additional memory in proportion to the number of input and output tokens.
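As a rough sanity check of the ~30 GB figure, the table's hyperparameters can be plugged into a back-of-the-envelope parameter count. The gated-FFN and untied-embedding assumptions below are guesses for illustration, not confirmed details of Mi:dm:

```python
# Hyperparameters from the table above.
n_layers, d_model, d_ff, n_vocab = 32, 4096, 10880, 72154

attn = 4 * d_model * d_model        # Q, K, V and output projections (assumption)
ffn = 3 * d_model * d_ff            # gated feed-forward, SwiGLU-style (assumption)
per_layer = attn + ffn
embeddings = 2 * n_vocab * d_model  # untied input embedding + LM head (assumption)

total = n_layers * per_layer + embeddings
print(f"~{total / 1e9:.1f}B parameters")      # ~7.0B parameters
print(f"~{total * 4 / 1e9:.0f} GB in float32")  # ~28 GB, consistent with the ~30 GB note
```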
### Architecture
Mi:dm 은 Transformer 구조를 활용한 Auto-regressive Language Model 입니다. 선정된 Task 수행을 위해 supervised fine-tuning (SFT) 되었습니다.
Mi:dm is a Transformer-based auto-regressive language model. It was fine-tuned with supervised fine-tuning (SFT) to perform selected tasks.
### Tokenizer
[google sentencepiece](https://github.com/google/sentencepiece) 에 기반한 토크나이저를 사용하고 있습니다. 한국어 복합어를 고려한 형태소 기반 학습을 하였으며 bi-lingual tokenization 성능 향상을 위하여 영어 어휘를 같이 학습하였습니다.
The tokenizer was trained with [google sentencepiece](https://github.com/google/sentencepiece). It uses morpheme-aware segmentation to handle Korean compound words, and English vocabulary was trained jointly to improve bilingual tokenization.
### Prompt Template
```
###System;{System}
###User;{User}
###Midm;
```
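A small helper that renders the template above can make it easier to build requests programmatically. The function below is illustrative, not part of the model's API:

```python
def build_prompt(user: str, system: str = "") -> str:
    """Render the Mi:dm prompt template for one user turn."""
    parts = []
    if system:
        parts.append(f"###System;{system}")
    parts.append(f"###User;{user}")
    parts.append("###Midm;")
    return "\n".join(parts)

print(build_prompt("AI란?"))  # matches the dummy_data string in the usage example
```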
### Usage
```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM, TextStreamer
def main():
    tokenizer = AutoTokenizer.from_pretrained(
        "KT-AI/midm-bitext-S-7B-inst-v1",
        trust_remote_code=True
    )
    model = AutoModelForCausalLM.from_pretrained(
        "KT-AI/midm-bitext-S-7B-inst-v1",
        trust_remote_code=True
    )
    model.cuda()
    model.eval()

    dummy_data = "###User;AI란?\n###Midm;"
    data = tokenizer(dummy_data, return_tensors="pt")
    streamer = TextStreamer(tokenizer, skip_prompt=True, skip_special_tokens=True)

    pred = model.generate(
        input_ids=data.input_ids[..., :-1].cuda(),
        streamer=streamer,
        use_cache=True,
        max_new_tokens=1024,  # use a finite integer cap; generate() does not accept float('inf')
    )
    # pred is a batch of sequences; decode the first one as a single string
    decoded_text = tokenizer.decode(pred[0], skip_special_tokens=True)


if __name__ == "__main__":
    main()
```
### Training Data
Mi:dm-bitext-S 모델은 한국어/영어 공개 데이터를 이용하여 사전 학습하였습니다. 미세 조정 학습을 위해서도 공개되었거나 자체 구축한 데이터를 이용하였으며 이를 일부 가공하거나 다시 정제하는 과정을 거쳤습니다.
KT는 공개 데이터를 직접 수집하거나 적법한 사용 허가 조건 하에 확보하였습니다. AI-HUB(https://www.aihub.or.kr/) 의 말뭉치 데이터와 국립국어원 모두의 말뭉치 데이터 (https://corpus.korean.go.kr/) 를 사전 학습 단계에서 이용하였습니다.
KT가 보유한 고객 데이터는 이용하지 않았습니다.
The Mi:dm-bitext-S model was pre-trained on publicly available Korean/English data. For fine-tuning, we also used publicly available or internally constructed data, parts of which were further processed or refined.
KT collected public data directly or obtained it under lawful license conditions. Korean corpus data from AI-HUB (https://www.aihub.or.kr/) and the National Institute of Korean Language (https://corpus.korean.go.kr/) were used in the pre-training stage.
We did not use any customer data held by KT.
### Evaluation Results
TBA
## Limitations
KT는 Mi:dm 학습 데이터에서 욕설, 비속어, 편견, 차별 등 비윤리적 표현을 제거하려고 노력하였습니다.
그럼에도 불구하고 위와 같은 바람직하지 않은 표현 또는 부정확한 사실이 생성될 가능성을 완전히 제거하지 못하였습니다.
본 모델을 사용하기 전 이러한 한계를 인식하고 올바른 사용을 위해 필요한 조치를 취하는 것은 사용자의 책임이며, KT는 본 모델의 활용이 야기하는 위험이나 손해에 대해 책임을 지지 않습니다.
Mi:dm 학습 데이터의 대부분은 한국어와 영어로 구성되어 있습니다. 그 외 언어에 대한 이해와 생성 기능은 제공하지 않습니다.
We tried to remove unethical expressions such as profanity, slang, prejudice, and discrimination from training data.
Nevertheless, the possibility of creating undesirable and inaccurate expressions such as the above has not been completely eliminated.
It is the user's responsibility to be aware of these limitations before using this model and to take the necessary measures for proper use; KT is not responsible for any risks or damages resulting from the use of this model.
Most of Mi:dm's training data consists of Korean and English. Understanding and generation of other languages are not supported.
## License
Mi:dm 모델 (Midm-bitext-S) 은 CC-BY-NC 4.0 라이선스 하에 공개되어 있습니다.
사용자는 본 모델의 일부 혹은 전체를 재학습하거나 일부만을 이용하는 것이 가능합니다. 다만 반드시 저작자를 표시하여야 하며, 영리 목적으로 이용할 수 없습니다. 또한 본 모델을 재배포하거나 본 모델의 2차적저작물을 작성하여 공유할 때는 본 모델과 동일한 CC-BY-NC 4.0 라이선스를 적용하여야 합니다.
Mi:dm (Midm-bitext-S) is released under the CC-BY-NC 4.0 license.
Users can retrain part or all of this model, or use only part of it. However, attribution must be given, and the model cannot be used for commercial purposes. Additionally, when redistributing this model or sharing derivative works based on it, the same CC-BY-NC 4.0 license must be applied.
## Citations
Mi:dm을 이용한 2차 저작물을 배포할 경우 아래 내용을 인용하여 출처를 명시해야 합니다.
When distributing secondary works using Mi:dm, the source must be indicated by citing the content below.
```
@misc{kt-mi:dm,
  title = {Mi:dm: KT Bilingual (Korean,English) Generative Pre-trained Transformer},
  author = {KT},
  year = {2023},
  url = {https://huggingface.co/KT-AI/midm-bitext-S-7B-inst-v1},
  howpublished = {\url{https://genielabs.ai}},
}
```
## Contacts
본 모델의 다양한 연구 목적의 활용과 개선 의견을 기대 합니다. dschang@kt.com
We welcome use of this model for a variety of research purposes and look forward to any suggestions for improvement: dschang@kt.com
| 5,389 | [
[
-0.0282440185546875,
-0.05078125,
0.0095977783203125,
0.0260009765625,
-0.04119873046875,
-0.0033550262451171875,
-0.01788330078125,
-0.0188140869140625,
0.0252227783203125,
0.01378631591796875,
-0.037841796875,
-0.052032470703125,
-0.044219970703125,
0.01324462890625,
-0.004581451416015625,
0.059173583984375,
-0.0193939208984375,
0.009521484375,
-0.01168060302734375,
0.0032176971435546875,
-0.04730224609375,
-0.036895751953125,
-0.05523681640625,
-0.02301025390625,
0.00011491775512695312,
0.021575927734375,
0.034820556640625,
0.046417236328125,
0.032470703125,
0.02886962890625,
-0.0065460205078125,
0.006687164306640625,
-0.0264129638671875,
-0.00905609130859375,
0.008392333984375,
-0.032806396484375,
-0.0279541015625,
0.0025615692138671875,
0.035614013671875,
0.0301971435546875,
-0.01273345947265625,
0.0106048583984375,
0.0184783935546875,
0.04833984375,
-0.020111083984375,
0.0251922607421875,
-0.02117919921875,
0.0017518997192382812,
-0.014678955078125,
-0.007068634033203125,
-0.01004791259765625,
-0.036773681640625,
0.0008959770202636719,
-0.052978515625,
0.00493621826171875,
0.00777435302734375,
0.09619140625,
0.0027484893798828125,
-0.02294921875,
-0.01499176025390625,
-0.04864501953125,
0.05450439453125,
-0.06817626953125,
0.036346435546875,
0.0372314453125,
0.0101318359375,
-0.0017232894897460938,
-0.059814453125,
-0.061676025390625,
-0.0102996826171875,
-0.0209503173828125,
0.036651611328125,
-0.01055908203125,
-0.005649566650390625,
0.0198822021484375,
0.029510498046875,
-0.041656494140625,
0.0203857421875,
-0.0186767578125,
-0.0232391357421875,
0.053070068359375,
0.01190948486328125,
0.04119873046875,
-0.04754638671875,
-0.05181884765625,
-0.0188751220703125,
-0.0304107666015625,
0.0251922607421875,
0.033233642578125,
0.0017023086547851562,
-0.040985107421875,
0.03057861328125,
-0.0144805908203125,
0.03424072265625,
0.01358795166015625,
-0.018035888671875,
0.04876708984375,
-0.03900146484375,
-0.03521728515625,
0.0018978118896484375,
0.08465576171875,
0.0273895263671875,
0.0113067626953125,
0.004108428955078125,
-0.01593017578125,
-0.002536773681640625,
0.003963470458984375,
-0.0634765625,
-0.012451171875,
0.023284912109375,
-0.0382080078125,
-0.0252838134765625,
0.0069580078125,
-0.0677490234375,
0.0016260147094726562,
-0.031402587890625,
0.0389404296875,
-0.0484619140625,
-0.041748046875,
0.0189056396484375,
0.003925323486328125,
0.0177459716796875,
0.01232147216796875,
-0.04425048828125,
0.017730712890625,
0.01074981689453125,
0.06024169921875,
-0.00061798095703125,
-0.0215911865234375,
0.0174407958984375,
0.0010700225830078125,
-0.00844573974609375,
0.04766845703125,
0.00783538818359375,
-0.0301513671875,
-0.011505126953125,
0.0131683349609375,
-0.0447998046875,
-0.026458740234375,
0.03863525390625,
-0.0258636474609375,
0.0321044921875,
0.005947113037109375,
-0.04608154296875,
-0.019927978515625,
0.019775390625,
-0.0323486328125,
0.08526611328125,
0.00970458984375,
-0.06402587890625,
0.0185394287109375,
-0.04510498046875,
-0.0248260498046875,
0.00289154052734375,
-0.0158538818359375,
-0.045196533203125,
-0.0245361328125,
0.033782958984375,
0.048126220703125,
-0.00305938720703125,
0.0201416015625,
-0.0030002593994140625,
-0.0251007080078125,
0.0238494873046875,
-0.026397705078125,
0.0709228515625,
0.0135650634765625,
-0.045196533203125,
0.003696441650390625,
-0.05828857421875,
0.00803375244140625,
0.046539306640625,
-0.031585693359375,
-0.017730712890625,
-0.026947021484375,
0.0068817138671875,
0.04132080078125,
0.0186614990234375,
-0.036712646484375,
0.01085662841796875,
-0.059814453125,
0.032745361328125,
0.06719970703125,
0.0052947998046875,
0.027130126953125,
-0.02728271484375,
0.0482177734375,
0.0250244140625,
0.007801055908203125,
-0.00714874267578125,
-0.0169525146484375,
-0.060791015625,
-0.0221099853515625,
0.033203125,
0.04815673828125,
-0.07257080078125,
0.04486083984375,
-0.022705078125,
-0.051116943359375,
-0.0509033203125,
-0.01134490966796875,
0.04296875,
0.0462646484375,
0.03521728515625,
-0.006076812744140625,
-0.05157470703125,
-0.04888916015625,
-0.001132965087890625,
-0.00685882568359375,
-0.002269744873046875,
0.03607177734375,
0.046875,
-0.01666259765625,
0.058929443359375,
-0.048065185546875,
-0.023773193359375,
-0.022430419921875,
-0.0095672607421875,
0.02984619140625,
0.05694580078125,
0.051239013671875,
-0.06787109375,
-0.06195068359375,
-0.010467529296875,
-0.059814453125,
0.01023101806640625,
-0.00701904296875,
-0.01119232177734375,
0.0273895263671875,
0.04180908203125,
-0.050140380859375,
0.041107177734375,
0.036834716796875,
-0.03363037109375,
0.07080078125,
-0.017242431640625,
0.01334381103515625,
-0.11212158203125,
0.0190582275390625,
0.0058746337890625,
-0.0012645721435546875,
-0.049774169921875,
-0.0126190185546875,
-0.0014104843139648438,
-0.00047707557678222656,
-0.048492431640625,
0.06048583984375,
-0.0435791015625,
0.032623291015625,
-0.01177978515625,
0.0197906494140625,
0.0018568038940429688,
0.06011962890625,
0.00278472900390625,
0.056365966796875,
0.0484619140625,
-0.056671142578125,
0.0174102783203125,
0.0338134765625,
-0.027801513671875,
0.025115966796875,
-0.06378173828125,
-0.0018758773803710938,
-0.0015773773193359375,
0.0047454833984375,
-0.0906982421875,
-0.01241302490234375,
0.044342041015625,
-0.056732177734375,
0.040374755859375,
-0.0090789794921875,
-0.02423095703125,
-0.042724609375,
-0.03106689453125,
0.0077972412109375,
0.030181884765625,
-0.03173828125,
0.0321044921875,
0.016357421875,
-0.006500244140625,
-0.044952392578125,
-0.0521240234375,
-0.012725830078125,
-0.0250244140625,
-0.0601806640625,
0.048492431640625,
-0.01067352294921875,
0.01000213623046875,
-0.01079559326171875,
-0.00672149658203125,
0.0038928985595703125,
-0.006069183349609375,
0.02349853515625,
0.03741455078125,
-0.01465606689453125,
-0.0018405914306640625,
0.0005154609680175781,
-0.01500701904296875,
0.002101898193359375,
-0.0086669921875,
0.06658935546875,
-0.0081634521484375,
-0.0207366943359375,
-0.05914306640625,
0.00797271728515625,
0.03582763671875,
-0.00786590576171875,
0.06427001953125,
0.061614990234375,
-0.0214996337890625,
0.01340484619140625,
-0.03765869140625,
-0.012847900390625,
-0.03863525390625,
0.031982421875,
-0.0260772705078125,
-0.050750732421875,
0.0494384765625,
-0.00577545166015625,
-0.0250244140625,
0.052886962890625,
0.040313720703125,
-0.01495361328125,
0.07391357421875,
0.033477783203125,
-0.01427459716796875,
0.036285400390625,
-0.05413818359375,
0.0162811279296875,
-0.0765380859375,
-0.041656494140625,
-0.0275726318359375,
-0.0204010009765625,
-0.05535888671875,
-0.01165771484375,
0.016754150390625,
0.0131072998046875,
-0.018951416015625,
0.035064697265625,
-0.044769287109375,
-0.0016527175903320312,
0.032196044921875,
0.0296783447265625,
-0.00830078125,
-0.004016876220703125,
-0.0372314453125,
-0.0027065277099609375,
-0.059234619140625,
-0.0299072265625,
0.0743408203125,
0.040283203125,
0.043304443359375,
-0.0014362335205078125,
0.0445556640625,
-0.0028896331787109375,
-0.015838623046875,
-0.0550537109375,
0.040374755859375,
0.005462646484375,
-0.034515380859375,
-0.028228759765625,
-0.03369140625,
-0.0849609375,
0.0208892822265625,
-0.01678466796875,
-0.06182861328125,
0.02294921875,
0.0003104209899902344,
-0.028411865234375,
0.025970458984375,
-0.063232421875,
0.07879638671875,
-0.0158233642578125,
-0.025115966796875,
0.005016326904296875,
-0.05615234375,
0.0127105712890625,
0.0077972412109375,
0.01055908203125,
0.0018787384033203125,
0.009765625,
0.07501220703125,
-0.045745849609375,
0.05584716796875,
-0.0264434814453125,
0.00376129150390625,
0.02886962890625,
-0.0113067626953125,
0.044219970703125,
0.00705718994140625,
0.01007080078125,
0.02447509765625,
-0.00930023193359375,
-0.022674560546875,
-0.0273895263671875,
0.048095703125,
-0.07757568359375,
-0.035675048828125,
-0.034210205078125,
-0.0196685791015625,
-0.0012788772583007812,
0.028228759765625,
0.0423583984375,
0.026580810546875,
0.0125732421875,
0.01523590087890625,
0.04443359375,
-0.0283355712890625,
0.046722412109375,
0.0306854248046875,
-0.0084381103515625,
-0.031402587890625,
0.061676025390625,
0.0169677734375,
0.01532745361328125,
0.0107421875,
0.0224456787109375,
-0.042633056640625,
-0.046539306640625,
-0.03265380859375,
0.03253173828125,
-0.038055419921875,
-0.02532958984375,
-0.054718017578125,
-0.0301513671875,
-0.05059814453125,
0.0006165504455566406,
-0.0247344970703125,
-0.017059326171875,
-0.029510498046875,
-0.0170135498046875,
0.0270233154296875,
0.018463134765625,
-0.00095367431640625,
0.037353515625,
-0.049468994140625,
0.00659942626953125,
-0.004596710205078125,
0.0300140380859375,
0.0086822509765625,
-0.058380126953125,
-0.03369140625,
0.01322174072265625,
-0.023956298828125,
-0.054962158203125,
0.0460205078125,
-0.0164642333984375,
0.049163818359375,
0.031341552734375,
-0.0006008148193359375,
0.042877197265625,
-0.023773193359375,
0.06640625,
0.0258331298828125,
-0.05926513671875,
0.04876708984375,
-0.0321044921875,
0.0279693603515625,
0.0318603515625,
0.056549072265625,
-0.051544189453125,
-0.01016998291015625,
-0.054779052734375,
-0.076416015625,
0.0736083984375,
0.0207672119140625,
0.01024627685546875,
0.0068206787109375,
0.013641357421875,
-0.017730712890625,
0.01465606689453125,
-0.0623779296875,
-0.04742431640625,
-0.0190277099609375,
-0.0187225341796875,
0.0095977783203125,
-0.01401519775390625,
0.01092529296875,
-0.035736083984375,
0.06976318359375,
0.00923919677734375,
0.046142578125,
0.0293121337890625,
-0.01003265380859375,
0.01557159423828125,
0.00821685791015625,
0.04852294921875,
0.04443359375,
-0.01148223876953125,
-0.00823211669921875,
0.01328277587890625,
-0.0732421875,
0.01218414306640625,
0.0126953125,
-0.026947021484375,
0.004955291748046875,
0.01312255859375,
0.08795166015625,
0.01354217529296875,
-0.0258331298828125,
0.046783447265625,
-0.0014657974243164062,
-0.027923583984375,
-0.026611328125,
-0.00643157958984375,
0.01561737060546875,
0.01038360595703125,
0.01389312744140625,
-0.0029468536376953125,
-0.00830078125,
-0.00705718994140625,
0.01076507568359375,
0.00640869140625,
-0.0273895263671875,
-0.01560211181640625,
0.055267333984375,
0.0011377334594726562,
-0.0083465576171875,
0.049102783203125,
-0.014312744140625,
-0.0498046875,
0.055572509765625,
0.058135986328125,
0.053253173828125,
-0.03448486328125,
0.0119781494140625,
0.06280517578125,
0.007297515869140625,
0.004459381103515625,
0.01470947265625,
0.0196990966796875,
-0.05645751953125,
-0.0189208984375,
-0.045654296875,
0.00836944580078125,
0.03497314453125,
-0.049560546875,
0.0277862548828125,
-0.02935791015625,
-0.01369476318359375,
-0.0019102096557617188,
0.012847900390625,
-0.054046630859375,
0.032806396484375,
-0.0009279251098632812,
0.0606689453125,
-0.057830810546875,
0.060028076171875,
0.052154541015625,
-0.060821533203125,
-0.08740234375,
0.0092926025390625,
-0.003063201904296875,
-0.07415771484375,
0.04388427734375,
0.01203155517578125,
0.00844573974609375,
0.01169586181640625,
-0.0289154052734375,
-0.07122802734375,
0.1053466796875,
0.0043182373046875,
-0.030914306640625,
0.00319671630859375,
0.0220184326171875,
0.03424072265625,
-0.0035457611083984375,
0.02752685546875,
0.031005859375,
0.0282745361328125,
0.0047454833984375,
-0.07080078125,
0.0224609375,
-0.0263519287109375,
0.008087158203125,
0.01027679443359375,
-0.06402587890625,
0.0673828125,
-0.00394439697265625,
-0.021148681640625,
-0.021026611328125,
0.04241943359375,
0.020263671875,
0.0225677490234375,
0.037506103515625,
0.042205810546875,
0.043212890625,
-0.0092620849609375,
0.06378173828125,
-0.030364990234375,
0.035736083984375,
0.060394287109375,
0.01751708984375,
0.03985595703125,
0.02593994140625,
-0.036651611328125,
0.034423828125,
0.05267333984375,
-0.01509857177734375,
0.04010009765625,
0.006793975830078125,
-0.0122222900390625,
0.00325775146484375,
0.007293701171875,
-0.029541015625,
0.041778564453125,
0.0149078369140625,
-0.02032470703125,
-0.00034689903259277344,
0.035980224609375,
0.02532958984375,
-0.0009274482727050781,
-0.026336669921875,
0.0556640625,
0.0062713623046875,
-0.03204345703125,
0.05157470703125,
0.0102996826171875,
0.07025146484375,
-0.047332763671875,
0.016204833984375,
-0.0174560546875,
0.008758544921875,
-0.0216217041015625,
-0.060302734375,
0.009735107421875,
-0.004150390625,
-0.01409149169921875,
-0.016845703125,
0.06549072265625,
-0.0249481201171875,
-0.039215087890625,
0.025634765625,
0.0275726318359375,
0.0225677490234375,
-0.0080108642578125,
-0.06854248046875,
-0.0128173828125,
0.01514434814453125,
-0.044769287109375,
0.0243377685546875,
0.0248870849609375,
0.00698089599609375,
0.04931640625,
0.049591064453125,
0.00450897216796875,
0.004878997802734375,
0.0016183853149414062,
0.07415771484375,
-0.0433349609375,
-0.046478271484375,
-0.07379150390625,
0.05938720703125,
-0.01287078857421875,
-0.0249176025390625,
0.07232666015625,
0.051025390625,
0.0831298828125,
-0.01224517822265625,
0.06292724609375,
-0.0184478759765625,
0.04296875,
-0.049468994140625,
0.07269287109375,
-0.0416259765625,
-0.0017614364624023438,
-0.0301361083984375,
-0.039337158203125,
-0.0047760009765625,
0.048919677734375,
-0.0281219482421875,
0.0257415771484375,
0.04754638671875,
0.05694580078125,
-0.0004696846008300781,
-0.010650634765625,
0.01458740234375,
0.0267333984375,
0.01031494140625,
0.042449951171875,
0.038665771484375,
-0.06341552734375,
0.04864501953125,
-0.039337158203125,
-0.0008749961853027344,
-0.0227203369140625,
-0.03466796875,
-0.06854248046875,
-0.040191650390625,
-0.031463623046875,
-0.030059814453125,
-0.02532958984375,
0.0684814453125,
0.03997802734375,
-0.05389404296875,
-0.0167694091796875,
0.0034046173095703125,
0.0008230209350585938,
-0.031890869140625,
-0.02215576171875,
0.05950927734375,
-0.017730712890625,
-0.06573486328125,
0.005664825439453125,
-0.00982666015625,
0.0304718017578125,
-0.008270263671875,
-0.0106201171875,
-0.051116943359375,
-0.003093719482421875,
0.041839599609375,
0.0008702278137207031,
-0.0304718017578125,
-0.00867462158203125,
0.02325439453125,
-0.0273895263671875,
0.02362060546875,
0.0208892822265625,
-0.037933349609375,
0.02203369140625,
0.03057861328125,
0.0265350341796875,
0.055450439453125,
-0.00240325927734375,
0.012786865234375,
-0.052886962890625,
0.023162841796875,
0.0083465576171875,
0.0280303955078125,
0.006084442138671875,
-0.0413818359375,
0.039459228515625,
0.03912353515625,
-0.04400634765625,
-0.05615234375,
-0.0043792724609375,
-0.075927734375,
-0.0276336669921875,
0.0885009765625,
-0.02081298828125,
-0.025604248046875,
-0.00348663330078125,
-0.031097412109375,
0.044647216796875,
-0.01690673828125,
0.048614501953125,
0.056060791015625,
-0.0043792724609375,
-0.0034465789794921875,
-0.04425048828125,
0.041107177734375,
0.0248870849609375,
-0.05963134765625,
-0.02435302734375,
0.01183319091796875,
0.03411865234375,
0.006084442138671875,
0.06280517578125,
-0.00562286376953125,
0.0029296875,
-0.0027027130126953125,
0.0245208740234375,
-0.0196380615234375,
0.0006809234619140625,
-0.024688720703125,
-0.0181732177734375,
-0.0306854248046875,
-0.048187255859375
]
] |
timm/tf_efficientnet_b1.ns_jft_in1k | 2023-04-27T21:17:43.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"arxiv:1905.11946",
"arxiv:1911.04252",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/tf_efficientnet_b1.ns_jft_in1k | 0 | 6,749 | timm | 2022-12-13T00:01:58 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
datasets:
- imagenet-1k
---
# Model card for tf_efficientnet_b1.ns_jft_in1k
An EfficientNet image classification model. Trained on ImageNet-1k and unlabeled JFT-300m using Noisy Student semi-supervised learning in TensorFlow by the paper authors, then ported to PyTorch by Ross Wightman.
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 7.8
- GMACs: 0.7
- Activations (M): 10.9
- Image size: 240 x 240
- **Papers:**
- EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks: https://arxiv.org/abs/1905.11946
- Self-training with Noisy Student improves ImageNet classification: https://arxiv.org/abs/1911.04252
- **Dataset:** ImageNet-1k
- **Original:** https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch  # needed for torch.topk below
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('tf_efficientnet_b1.ns_jft_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
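The last line above combines a softmax (scaled to percentages) with a top-k selection. As a plain-Python illustration of what that computes, independent of torch (the function name and inputs here are illustrative, not part of timm):

```python
import math

def softmax_topk(logits, k=5):
    """Softmax over raw scores, then return the k highest (percent, index) pairs."""
    m = max(logits)                         # subtract max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    # rank (index, prob) pairs by probability, descending
    ranked = sorted(enumerate(probs), key=lambda pair: pair[1], reverse=True)
    return [(round(p * 100, 2), i) for i, p in ranked[:k]]

print(softmax_topk([2.0, 1.0, 0.1, 3.0, -1.0, 0.5], k=3))
```

`torch.topk(output.softmax(dim=1) * 100, k=5)` performs the same computation, vectorized over the batch dimension.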
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'tf_efficientnet_b1.ns_jft_in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g.:
# torch.Size([1, 16, 120, 120])
# torch.Size([1, 24, 60, 60])
# torch.Size([1, 40, 30, 30])
# torch.Size([1, 112, 15, 15])
# torch.Size([1, 320, 8, 8])
print(o.shape)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'tf_efficientnet_b1.ns_jft_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 1280, 8, 8) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@inproceedings{tan2019efficientnet,
title={Efficientnet: Rethinking model scaling for convolutional neural networks},
author={Tan, Mingxing and Le, Quoc},
booktitle={International conference on machine learning},
pages={6105--6114},
year={2019},
organization={PMLR}
}
```
```bibtex
@article{Xie2019SelfTrainingWN,
title={Self-Training With Noisy Student Improves ImageNet Classification},
author={Qizhe Xie and Eduard H. Hovy and Minh-Thang Luong and Quoc V. Le},
journal={2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
year={2019},
pages={10684-10695}
}
```
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
| 4,598 | [
[
-0.0295867919921875,
-0.042694091796875,
-0.00627899169921875,
0.00890350341796875,
-0.017608642578125,
-0.0282135009765625,
-0.0255584716796875,
-0.0309600830078125,
0.0121612548828125,
0.0261993408203125,
-0.0258941650390625,
-0.042327880859375,
-0.054962158203125,
-0.01097869873046875,
-0.0194091796875,
0.0653076171875,
-0.006381988525390625,
-0.000016868114471435547,
-0.0120086669921875,
-0.040985107421875,
-0.00223541259765625,
-0.0164337158203125,
-0.0704345703125,
-0.032623291015625,
0.0244903564453125,
0.01861572265625,
0.04058837890625,
0.050506591796875,
0.0491943359375,
0.039337158203125,
-0.007297515869140625,
0.0029163360595703125,
-0.021392822265625,
-0.007808685302734375,
0.0311126708984375,
-0.040191650390625,
-0.0262908935546875,
0.01209259033203125,
0.056365966796875,
0.038238525390625,
-0.0027027130126953125,
0.036865234375,
0.01209259033203125,
0.044769287109375,
-0.0240631103515625,
0.01555633544921875,
-0.0276641845703125,
0.01422119140625,
-0.0052642822265625,
0.0110015869140625,
-0.022186279296875,
-0.02239990234375,
0.0174560546875,
-0.0419921875,
0.0384521484375,
-0.007038116455078125,
0.096923828125,
0.02294921875,
-0.00600433349609375,
0.0011510848999023438,
-0.01617431640625,
0.055908203125,
-0.061798095703125,
0.0175933837890625,
0.01065826416015625,
0.0149383544921875,
-0.00734710693359375,
-0.08294677734375,
-0.0352783203125,
-0.0144500732421875,
-0.017486572265625,
-0.00788116455078125,
-0.0255279541015625,
0.01157379150390625,
0.027557373046875,
0.021881103515625,
-0.030853271484375,
0.00634002685546875,
-0.04241943359375,
-0.01556396484375,
0.044219970703125,
-0.0013475418090820312,
0.016571044921875,
-0.0116729736328125,
-0.03277587890625,
-0.035064697265625,
-0.0171966552734375,
0.029266357421875,
0.01873779296875,
0.020233154296875,
-0.038238525390625,
0.0223846435546875,
0.003993988037109375,
0.0513916015625,
0.01342010498046875,
-0.02978515625,
0.047882080078125,
0.00390625,
-0.035858154296875,
-0.00986480712890625,
0.08526611328125,
0.0255889892578125,
0.01837158203125,
0.003917694091796875,
-0.01241302490234375,
-0.0345458984375,
-0.0017461776733398438,
-0.09686279296875,
-0.0291900634765625,
0.0230712890625,
-0.052093505859375,
-0.03399658203125,
0.01136016845703125,
-0.041778564453125,
-0.0066375732421875,
-0.006206512451171875,
0.052581787109375,
-0.0284881591796875,
-0.036895751953125,
-0.0006356239318847656,
-0.01233673095703125,
0.01062774658203125,
0.0218963623046875,
-0.040863037109375,
0.0114898681640625,
0.01727294921875,
0.0843505859375,
0.005786895751953125,
-0.033355712890625,
-0.014495849609375,
-0.033966064453125,
-0.0202178955078125,
0.0277557373046875,
-0.0014162063598632812,
-0.0018777847290039062,
-0.024444580078125,
0.0250701904296875,
-0.0122528076171875,
-0.053863525390625,
0.02227783203125,
-0.01654052734375,
0.01215362548828125,
0.006816864013671875,
-0.02227783203125,
-0.04083251953125,
0.021209716796875,
-0.03558349609375,
0.087646484375,
0.02801513671875,
-0.06390380859375,
0.020538330078125,
-0.040863037109375,
-0.01197052001953125,
-0.0189056396484375,
0.0031986236572265625,
-0.0860595703125,
-0.007293701171875,
0.013916015625,
0.06768798828125,
-0.01678466796875,
0.01119232177734375,
-0.047210693359375,
-0.019134521484375,
0.02203369140625,
-0.00867462158203125,
0.08160400390625,
0.0217437744140625,
-0.03619384765625,
0.0216217041015625,
-0.04827880859375,
0.01611328125,
0.03729248046875,
-0.0190582275390625,
-0.0025501251220703125,
-0.047454833984375,
0.01224517822265625,
0.0205841064453125,
0.01126861572265625,
-0.038604736328125,
0.014373779296875,
-0.01177215576171875,
0.0377197265625,
0.046234130859375,
-0.01004791259765625,
0.03173828125,
-0.02508544921875,
0.01824951171875,
0.020904541015625,
0.0184478759765625,
-0.004337310791015625,
-0.03082275390625,
-0.06341552734375,
-0.038818359375,
0.0254058837890625,
0.019073486328125,
-0.037445068359375,
0.03082275390625,
-0.01552581787109375,
-0.060272216796875,
-0.0335693359375,
0.00994110107421875,
0.0322265625,
0.05352783203125,
0.0261688232421875,
-0.0254058837890625,
-0.03179931640625,
-0.06878662109375,
-0.0004668235778808594,
-0.0012445449829101562,
0.0023651123046875,
0.0227813720703125,
0.04486083984375,
-0.0012769699096679688,
0.040740966796875,
-0.0247039794921875,
-0.024139404296875,
-0.0161895751953125,
0.0085296630859375,
0.034698486328125,
0.0634765625,
0.058563232421875,
-0.046478271484375,
-0.0445556640625,
-0.015625,
-0.07080078125,
0.0082550048828125,
-0.01172637939453125,
-0.01262664794921875,
0.01435089111328125,
0.0211029052734375,
-0.039276123046875,
0.037872314453125,
0.0177001953125,
-0.0281524658203125,
0.0274810791015625,
-0.0165863037109375,
0.0158843994140625,
-0.08148193359375,
0.007785797119140625,
0.03326416015625,
-0.0165863037109375,
-0.04058837890625,
0.00963592529296875,
0.0075531005859375,
-0.0016002655029296875,
-0.035308837890625,
0.04388427734375,
-0.042633056640625,
-0.007293701171875,
-0.01108551025390625,
-0.02581787109375,
-0.00018918514251708984,
0.0579833984375,
-0.00943756103515625,
0.031341552734375,
0.06390380859375,
-0.035308837890625,
0.0309906005859375,
0.0184173583984375,
-0.016387939453125,
0.0267791748046875,
-0.055633544921875,
0.008056640625,
0.002498626708984375,
0.01898193359375,
-0.07568359375,
-0.0166015625,
0.0240631103515625,
-0.0433349609375,
0.051788330078125,
-0.0396728515625,
-0.032012939453125,
-0.038299560546875,
-0.028900146484375,
0.0290069580078125,
0.0469970703125,
-0.060394287109375,
0.034088134765625,
0.019134521484375,
0.0301361083984375,
-0.044586181640625,
-0.06640625,
-0.0199432373046875,
-0.0316162109375,
-0.0589599609375,
0.023834228515625,
0.01035308837890625,
0.00982666015625,
0.00925445556640625,
-0.002071380615234375,
-0.010833740234375,
0.0042877197265625,
0.037872314453125,
0.020782470703125,
-0.021697998046875,
0.00214385986328125,
-0.0201416015625,
0.004047393798828125,
0.00848388671875,
-0.0273284912109375,
0.034820556640625,
-0.0263519287109375,
-0.0009937286376953125,
-0.061920166015625,
-0.0041656494140625,
0.03472900390625,
0.0005269050598144531,
0.062286376953125,
0.08984375,
-0.034271240234375,
-0.0079345703125,
-0.0308837890625,
-0.02203369140625,
-0.038543701171875,
0.0400390625,
-0.0258026123046875,
-0.046142578125,
0.0589599609375,
-0.00437164306640625,
0.008636474609375,
0.056427001953125,
0.0262298583984375,
-0.00792694091796875,
0.048126220703125,
0.0433349609375,
0.0172576904296875,
0.061981201171875,
-0.07952880859375,
-0.01523590087890625,
-0.0584716796875,
-0.027313232421875,
-0.02783203125,
-0.053314208984375,
-0.055877685546875,
-0.02215576171875,
0.037628173828125,
0.0181427001953125,
-0.044189453125,
0.03118896484375,
-0.0682373046875,
0.00733184814453125,
0.0477294921875,
0.0439453125,
-0.026153564453125,
0.0254669189453125,
-0.01132965087890625,
0.0035991668701171875,
-0.0623779296875,
-0.0105438232421875,
0.0887451171875,
0.03497314453125,
0.048095703125,
-0.0107574462890625,
0.055450439453125,
-0.0162200927734375,
0.0262298583984375,
-0.0462646484375,
0.043548583984375,
-0.01035308837890625,
-0.03314208984375,
-0.0204925537109375,
-0.0445556640625,
-0.0811767578125,
0.014801025390625,
-0.0220947265625,
-0.0567626953125,
0.0172576904296875,
0.01544952392578125,
-0.0198822021484375,
0.059234619140625,
-0.07080078125,
0.0731201171875,
-0.00547027587890625,
-0.03753662109375,
0.0038890838623046875,
-0.0498046875,
0.021575927734375,
0.015869140625,
-0.020294189453125,
-0.0055694580078125,
0.0069732666015625,
0.08831787109375,
-0.049468994140625,
0.06292724609375,
-0.043182373046875,
0.033538818359375,
0.042938232421875,
-0.00811004638671875,
0.0280609130859375,
-0.0082550048828125,
-0.013275146484375,
0.032684326171875,
0.0012063980102539062,
-0.036834716796875,
-0.040283203125,
0.0450439453125,
-0.079833984375,
-0.0267791748046875,
-0.0199127197265625,
-0.036956787109375,
0.0165252685546875,
0.0114288330078125,
0.0367431640625,
0.04779052734375,
0.0208740234375,
0.0269317626953125,
0.041656494140625,
-0.0198974609375,
0.042510986328125,
-0.006267547607421875,
-0.01029205322265625,
-0.033233642578125,
0.060821533203125,
0.027130126953125,
0.014373779296875,
0.0079193115234375,
0.02105712890625,
-0.021697998046875,
-0.04296875,
-0.025482177734375,
0.01904296875,
-0.053955078125,
-0.042327880859375,
-0.053375244140625,
-0.0345458984375,
-0.02838134765625,
-0.00852203369140625,
-0.042144775390625,
-0.0338134765625,
-0.034637451171875,
0.0157012939453125,
0.0531005859375,
0.0369873046875,
-0.015106201171875,
0.04547119140625,
-0.033355712890625,
0.004131317138671875,
0.0103607177734375,
0.0322265625,
0.00884246826171875,
-0.06982421875,
-0.024871826171875,
-0.0100250244140625,
-0.034423828125,
-0.045623779296875,
0.038238525390625,
0.0203399658203125,
0.0386962890625,
0.0310516357421875,
-0.0086517333984375,
0.054931640625,
0.0056304931640625,
0.0369873046875,
0.03350830078125,
-0.042205810546875,
0.0382080078125,
-0.0020294189453125,
0.017486572265625,
0.01303863525390625,
0.02301025390625,
-0.01146697998046875,
-0.006938934326171875,
-0.080322265625,
-0.056121826171875,
0.06512451171875,
0.006885528564453125,
0.001026153564453125,
0.031829833984375,
0.054290771484375,
-0.0006499290466308594,
0.0012083053588867188,
-0.0574951171875,
-0.03643798828125,
-0.0287933349609375,
-0.023345947265625,
0.0010881423950195312,
0.00021409988403320312,
-0.0009708404541015625,
-0.054931640625,
0.047607421875,
-0.0088958740234375,
0.062347412109375,
0.0285186767578125,
-0.002651214599609375,
-0.01129913330078125,
-0.0284423828125,
0.0276031494140625,
0.0196075439453125,
-0.025360107421875,
0.01021575927734375,
0.015655517578125,
-0.042022705078125,
0.012603759765625,
0.01473236083984375,
-0.00335693359375,
0.0003962516784667969,
0.040313720703125,
0.06884765625,
-0.004398345947265625,
0.010009765625,
0.034637451171875,
-0.006259918212890625,
-0.03729248046875,
-0.0188446044921875,
0.017578125,
-0.004825592041015625,
0.038238525390625,
0.0245513916015625,
0.03411865234375,
-0.00362396240234375,
-0.01226806640625,
0.019073486328125,
0.038238525390625,
-0.0198516845703125,
-0.0205078125,
0.049407958984375,
-0.01227569580078125,
-0.00923919677734375,
0.06536865234375,
-0.01250457763671875,
-0.037445068359375,
0.08453369140625,
0.03106689453125,
0.07281494140625,
0.00007551908493041992,
0.0014896392822265625,
0.0780029296875,
0.016754150390625,
-0.00621795654296875,
0.006038665771484375,
0.002025604248046875,
-0.056304931640625,
0.002941131591796875,
-0.041412353515625,
0.00412750244140625,
0.0231781005859375,
-0.036285400390625,
0.0183868408203125,
-0.05572509765625,
-0.0322265625,
0.01439666748046875,
0.029876708984375,
-0.07135009765625,
0.015899658203125,
-0.0036411285400390625,
0.062286376953125,
-0.054290771484375,
0.061370849609375,
0.060302734375,
-0.034210205078125,
-0.0916748046875,
-0.0112457275390625,
-0.00612640380859375,
-0.061248779296875,
0.043487548828125,
0.036102294921875,
0.01322174072265625,
0.0081939697265625,
-0.0657958984375,
-0.0537109375,
0.10772705078125,
0.04058837890625,
-0.01290130615234375,
0.020904541015625,
-0.009674072265625,
0.020599365234375,
-0.03338623046875,
0.041961669921875,
0.01312255859375,
0.0267333984375,
0.0222320556640625,
-0.0494384765625,
0.02227783203125,
-0.026580810546875,
0.00434112548828125,
0.01514434814453125,
-0.071044921875,
0.07061767578125,
-0.038055419921875,
-0.0080718994140625,
-0.005184173583984375,
0.05780029296875,
0.007480621337890625,
0.01197052001953125,
0.049774169921875,
0.05975341796875,
0.044158935546875,
-0.02337646484375,
0.063232421875,
0.002811431884765625,
0.051727294921875,
0.0469970703125,
0.042633056640625,
0.035858154296875,
0.0276641845703125,
-0.023040771484375,
0.0199432373046875,
0.081298828125,
-0.0290069580078125,
0.0218048095703125,
0.0168609619140625,
0.005916595458984375,
-0.0159149169921875,
0.00829315185546875,
-0.0267181396484375,
0.039825439453125,
0.01007080078125,
-0.0421142578125,
-0.019744873046875,
0.005283355712890625,
0.0005445480346679688,
-0.029876708984375,
-0.0225677490234375,
0.033111572265625,
0.0013589859008789062,
-0.0284881591796875,
0.0711669921875,
0.0031948089599609375,
0.06964111328125,
-0.0264434814453125,
0.006130218505859375,
-0.019561767578125,
0.0188751220703125,
-0.0291748046875,
-0.059173583984375,
0.02337646484375,
-0.0212860107421875,
0.002384185791015625,
-0.0009965896606445312,
0.054931640625,
-0.0289764404296875,
-0.0394287109375,
0.0165863037109375,
0.022979736328125,
0.03643798828125,
0.0011568069458007812,
-0.09698486328125,
0.01015472412109375,
0.004909515380859375,
-0.058563232421875,
0.024444580078125,
0.032928466796875,
0.00923919677734375,
0.05670166015625,
0.04058837890625,
-0.01139068603515625,
0.011505126953125,
-0.01131439208984375,
0.0616455078125,
-0.0294342041015625,
-0.0193328857421875,
-0.0584716796875,
0.045166015625,
-0.0088043212890625,
-0.04412841796875,
0.032501220703125,
0.0343017578125,
0.06707763671875,
0.0023345947265625,
0.025665283203125,
-0.021759033203125,
-0.004375457763671875,
-0.02117919921875,
0.060272216796875,
-0.06195068359375,
-0.003955841064453125,
-0.01346588134765625,
-0.046844482421875,
-0.02911376953125,
0.056060791015625,
-0.014373779296875,
0.0389404296875,
0.032257080078125,
0.0760498046875,
-0.025115966796875,
-0.0260009765625,
0.0203094482421875,
0.01407623291015625,
0.00897979736328125,
0.03302001953125,
0.0248260498046875,
-0.0616455078125,
0.032440185546875,
-0.059234619140625,
-0.015289306640625,
-0.012054443359375,
-0.0478515625,
-0.0704345703125,
-0.06787109375,
-0.05194091796875,
-0.05120849609375,
-0.0181427001953125,
0.07415771484375,
0.083740234375,
-0.04986572265625,
-0.01209259033203125,
-0.003147125244140625,
0.01363372802734375,
-0.0234527587890625,
-0.017974853515625,
0.052490234375,
-0.0198822021484375,
-0.05523681640625,
-0.0295867919921875,
-0.007198333740234375,
0.0268402099609375,
-0.0019969940185546875,
-0.01546478271484375,
-0.0132904052734375,
-0.0267333984375,
0.012908935546875,
0.0173492431640625,
-0.0421142578125,
-0.00919342041015625,
-0.0197601318359375,
-0.0148773193359375,
0.02899169921875,
0.0279083251953125,
-0.03924560546875,
0.028839111328125,
0.0298614501953125,
0.0295257568359375,
0.06488037109375,
-0.0304718017578125,
-0.00293731689453125,
-0.057952880859375,
0.041259765625,
-0.01068115234375,
0.034637451171875,
0.0312347412109375,
-0.033935546875,
0.049346923828125,
0.0287933349609375,
-0.0384521484375,
-0.060821533203125,
-0.020111083984375,
-0.08172607421875,
-0.01166534423828125,
0.0689697265625,
-0.038238525390625,
-0.042022705078125,
0.0372314453125,
0.004459381103515625,
0.0531005859375,
-0.01666259765625,
0.03765869140625,
0.01517486572265625,
-0.0102691650390625,
-0.051910400390625,
-0.038116455078125,
0.02947998046875,
0.014862060546875,
-0.039825439453125,
-0.0308990478515625,
-0.0032367706298828125,
0.051116943359375,
0.01476287841796875,
0.035614013671875,
-0.0019140243530273438,
0.01079559326171875,
0.00926971435546875,
0.036163330078125,
-0.0396728515625,
-0.0031948089599609375,
-0.02972412109375,
0.01270294189453125,
-0.006511688232421875,
-0.041839599609375
]
] |
jy46604790/Fake-News-Bert-Detect | 2022-04-26T04:36:13.000Z | [
"transformers",
"pytorch",
"roberta",
"text-classification",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] | text-classification | jy46604790 | null | null | jy46604790/Fake-News-Bert-Detect | 7 | 6,746 | transformers | 2022-04-24T11:25:53 | ---
license: apache-2.0
---
# Fake News Recognition
## Overview
This model was trained on over 40,000 news articles from different media outlets, fine-tuned from 'roberta-base'. It classifies news text of up to 500 words (any excess is truncated automatically).
LABEL_0: Fake news
LABEL_1: Real news
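The raw pipeline output uses these label IDs; a small helper can translate them into readable verdicts. This is an illustrative sketch (the `interpret` function is not part of the model repository):

```python
# Map the model's raw label IDs to human-readable verdicts,
# per the label definitions documented above.
LABEL_NAMES = {"LABEL_0": "Fake news", "LABEL_1": "Real news"}

def interpret(result):
    """Take the pipeline's output list and return a (verdict, score) pair."""
    top = result[0]  # the pipeline returns a one-element list per input text
    return LABEL_NAMES[top["label"]], top["score"]

print(interpret([{"label": "LABEL_1", "score": 0.9994995594024658}]))
```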
## Quick Tutorial
### Download The Model
```python
from transformers import pipeline
MODEL = "jy46604790/Fake-News-Bert-Detect"
clf = pipeline("text-classification", model=MODEL, tokenizer=MODEL)
```
### Feed Data
```python
text = "Indonesian police have recaptured a U.S. citizen who escaped a week ago from an overcrowded prison on the holiday island of Bali, the jail s second breakout of foreign inmates this year. Cristian Beasley from California was rearrested on Sunday, Badung Police chief Yudith Satria Hananta said, without providing further details. Beasley was a suspect in crimes related to narcotics but had not been sentenced when he escaped from Kerobokan prison in Bali last week. The 32-year-old is believed to have cut through bars in the ceiling of his cell before scaling a perimeter wall of the prison in an area being refurbished. The Kerobokan prison, about 10 km (six miles) from the main tourist beaches in the Kuta area, often holds foreigners facing drug-related charges. Representatives of Beasley could not immediately be reached for comment. In June, an Australian, a Bulgarian, an Indian and a Malaysian tunneled to freedom about 12 meters (13 yards) under Kerobokan prison s walls. The Indian and the Bulgarian were caught soon after in neighboring East Timor, but Australian Shaun Edward Davidson and Malaysian Tee Kok King remain at large. Davidson has taunted authorities by saying he was enjoying life in various parts of the world, in purported posts on Facebook. Kerobokan has housed a number of well-known foreign drug convicts, including Australian Schappelle Corby, whose 12-1/2-year sentence for marijuana smuggling got huge media attention."
```
### Result
```python
result = clf(text)
result
```
output:[{'label': 'LABEL_1', 'score': 0.9994995594024658}] | 2,136 | [
[
-0.0215911865234375,
-0.061431884765625,
0.03875732421875,
0.0137939453125,
-0.0182952880859375,
0.018310546875,
-0.01104736328125,
-0.043731689453125,
0.0248260498046875,
0.0276336669921875,
-0.0192718505859375,
-0.048736572265625,
-0.056396484375,
0.023895263671875,
-0.057220458984375,
0.0604248046875,
-0.013946533203125,
-0.0171661376953125,
0.0177001953125,
-0.02081298828125,
-0.0277557373046875,
-0.043701171875,
-0.043182373046875,
0.012786865234375,
0.04888916015625,
0.0198974609375,
0.0284576416015625,
0.01544189453125,
0.027740478515625,
0.0203399658203125,
-0.0022144317626953125,
0.015411376953125,
-0.023284912109375,
-0.042938232421875,
-0.030792236328125,
-0.0516357421875,
0.0084075927734375,
0.0198211669921875,
0.0288543701171875,
0.003780364990234375,
0.0081329345703125,
0.003993988037109375,
-0.020111083984375,
0.03570556640625,
-0.031829833984375,
-0.01215362548828125,
-0.01910400390625,
0.005889892578125,
-0.0267333984375,
0.0007452964782714844,
-0.0298614501953125,
-0.04736328125,
-0.0083465576171875,
-0.056915283203125,
0.0106201171875,
0.01209259033203125,
0.111083984375,
-0.004970550537109375,
-0.0570068359375,
-0.0272369384765625,
-0.043853759765625,
0.051300048828125,
-0.055633544921875,
0.02716064453125,
0.042633056640625,
0.038665771484375,
-0.02557373046875,
-0.049346923828125,
-0.059478759765625,
-0.033111572265625,
-0.0200653076171875,
0.007503509521484375,
0.0036487579345703125,
-0.0203704833984375,
0.0318603515625,
0.0262451171875,
-0.06591796875,
-0.020782470703125,
-0.06622314453125,
-0.037628173828125,
0.03997802734375,
-0.0093231201171875,
0.0213165283203125,
-0.0234375,
-0.0240631103515625,
-0.025054931640625,
-0.03619384765625,
0.0228729248046875,
0.056610107421875,
0.0193023681640625,
-0.03839111328125,
0.03900146484375,
-0.039093017578125,
0.0205535888671875,
0.007781982421875,
-0.01317596435546875,
0.041412353515625,
-0.0220794677734375,
0.00592803955078125,
0.0031719207763671875,
0.06707763671875,
0.01529693603515625,
0.01082611083984375,
-0.007030487060546875,
-0.004726409912109375,
0.0198211669921875,
0.001422882080078125,
-0.04681396484375,
-0.0270843505859375,
0.02166748046875,
-0.05230712890625,
-0.029449462890625,
0.04437255859375,
-0.0479736328125,
-0.0220947265625,
0.0190277099609375,
0.034332275390625,
-0.033782958984375,
-0.05438232421875,
0.0028533935546875,
-0.041778564453125,
0.0323486328125,
0.004772186279296875,
-0.018829345703125,
0.01690673828125,
0.02850341796875,
0.06658935546875,
0.005474090576171875,
-0.00963592529296875,
-0.037872314453125,
-0.015625,
-0.0260162353515625,
0.03570556640625,
-0.01763916015625,
-0.006114959716796875,
-0.0103912353515625,
0.0487060546875,
0.02252197265625,
-0.036865234375,
0.046875,
-0.051361083984375,
0.01001739501953125,
-0.0233612060546875,
-0.0200347900390625,
-0.038787841796875,
0.0228271484375,
-0.0447998046875,
0.05157470703125,
0.0333251953125,
-0.0645751953125,
0.034423828125,
-0.0200653076171875,
-0.02716064453125,
-0.0029888153076171875,
0.0287017822265625,
-0.0482177734375,
-0.0087127685546875,
0.01776123046875,
0.01282501220703125,
0.0333251953125,
0.00567626953125,
-0.04736328125,
-0.04254150390625,
0.043792724609375,
-0.0266876220703125,
0.0977783203125,
0.041351318359375,
-0.003997802734375,
-0.0254364013671875,
-0.0667724609375,
-0.0206451416015625,
0.00921630859375,
-0.059417724609375,
-0.0301055908203125,
-0.0193023681640625,
0.017547607421875,
0.0176544189453125,
-0.0012521743774414062,
-0.08404541015625,
0.028350830078125,
-0.02545166015625,
0.0278472900390625,
0.030029296875,
0.034271240234375,
0.00785064697265625,
-0.00894927978515625,
0.0233612060546875,
0.032440185546875,
0.01413726806640625,
0.0007085800170898438,
-0.01082611083984375,
-0.035980224609375,
-0.01371002197265625,
0.052490234375,
0.05596923828125,
-0.05084228515625,
0.036285400390625,
0.0017719268798828125,
-0.041259765625,
-0.0232391357421875,
0.0035572052001953125,
0.006977081298828125,
0.0236053466796875,
0.031341552734375,
-0.014404296875,
-0.0777587890625,
-0.05517578125,
-0.0531005859375,
-0.0177001953125,
-0.0030536651611328125,
-0.0156707763671875,
0.056396484375,
-0.0179290771484375,
0.0799560546875,
-0.02301025390625,
-0.046630859375,
-0.0110015869140625,
0.00846099853515625,
0.036651611328125,
0.051361083984375,
0.040863037109375,
-0.078125,
-0.04156494140625,
-0.019500732421875,
-0.054779052734375,
0.018218994140625,
-0.029571533203125,
-0.0156402587890625,
0.0251007080078125,
0.032684326171875,
-0.05169677734375,
0.049591064453125,
0.0279693603515625,
-0.0335693359375,
0.03759765625,
-0.0145416259765625,
0.005084991455078125,
-0.06878662109375,
0.006671905517578125,
0.018646240234375,
-0.033660888671875,
-0.0289764404296875,
0.01361846923828125,
0.0157470703125,
-0.0103607177734375,
-0.04376220703125,
0.0302886962890625,
-0.0227508544921875,
0.01151275634765625,
-0.0007486343383789062,
0.0055999755859375,
-0.0065765380859375,
0.039825439453125,
0.01288604736328125,
0.029571533203125,
0.022705078125,
-0.030670166015625,
0.053070068359375,
0.0033206939697265625,
-0.05438232421875,
0.0244903564453125,
-0.0263214111328125,
0.00475311279296875,
0.00542449951171875,
0.044952392578125,
-0.0716552734375,
-0.03179931640625,
0.05267333984375,
-0.049346923828125,
-0.0179443359375,
-0.025054931640625,
-0.046630859375,
-0.07684326171875,
-0.018096923828125,
0.021697998046875,
0.041656494140625,
-0.04180908203125,
0.020782470703125,
0.006275177001953125,
0.01493072509765625,
-0.05023193359375,
-0.062408447265625,
0.0138397216796875,
-0.0219268798828125,
-0.00872039794921875,
-0.006641387939453125,
0.006191253662109375,
0.0001157522201538086,
0.015777587890625,
-0.019317626953125,
-0.0226898193359375,
0.01079559326171875,
0.0175628662109375,
0.00548553466796875,
-0.0033664703369140625,
-0.001338958740234375,
0.022003173828125,
0.0029964447021484375,
0.01448822021484375,
-0.00701904296875,
0.035003662109375,
-0.023681640625,
-0.0122222900390625,
-0.057952880859375,
0.0233306884765625,
0.005046844482421875,
-0.0172271728515625,
0.06378173828125,
0.055633544921875,
-0.0308837890625,
0.00756072998046875,
-0.0255584716796875,
-0.00736236572265625,
-0.032623291015625,
0.0182647705078125,
-0.03094482421875,
-0.0256500244140625,
0.034149169921875,
0.0034084320068359375,
-0.0263519287109375,
0.0230560302734375,
0.0267181396484375,
0.00786590576171875,
0.086181640625,
0.04541015625,
-0.0009775161743164062,
0.031707763671875,
-0.0242919921875,
0.043182373046875,
-0.028778076171875,
-0.0093536376953125,
-0.012298583984375,
-0.0192413330078125,
-0.0518798828125,
0.00714874267578125,
0.01537322998046875,
0.0264892578125,
0.00128173828125,
0.05230712890625,
-0.0440673828125,
0.036773681640625,
0.0408935546875,
0.0284271240234375,
0.0180206298828125,
-0.0183563232421875,
0.00787353515625,
0.02471923828125,
-0.0295562744140625,
-0.0243377685546875,
0.084228515625,
0.02947998046875,
0.06878662109375,
-0.0032806396484375,
0.057708740234375,
0.03387451171875,
0.016815185546875,
-0.035308837890625,
0.0285797119140625,
-0.007656097412109375,
-0.06378173828125,
-0.007053375244140625,
-0.0255584716796875,
-0.069580078125,
0.006633758544921875,
-0.020355224609375,
-0.046417236328125,
0.048583984375,
-0.00737762451171875,
-0.042877197265625,
0.02813720703125,
-0.053924560546875,
0.036590576171875,
-0.003108978271484375,
0.00818634033203125,
-0.0367431640625,
-0.05145263671875,
0.0560302734375,
0.00771331787109375,
0.0307464599609375,
-0.0159454345703125,
0.033416748046875,
0.055999755859375,
-0.04278564453125,
0.055694580078125,
-0.031524658203125,
0.00621795654296875,
0.033111572265625,
-0.006824493408203125,
0.0228271484375,
0.03240966796875,
0.024078369140625,
0.0281982421875,
0.03857421875,
-0.041595458984375,
-0.0178680419921875,
0.039825439453125,
-0.060272216796875,
-0.0307159423828125,
-0.03900146484375,
-0.03607177734375,
0.034637451171875,
0.024566650390625,
0.039215087890625,
0.028900146484375,
0.00952911376953125,
0.0196533203125,
0.0325927734375,
-0.037109375,
0.0282440185546875,
0.0311279296875,
-0.02752685546875,
-0.032440185546875,
0.07257080078125,
0.0294342041015625,
-0.015869140625,
0.0260772705078125,
0.004550933837890625,
-0.0274505615234375,
0.003932952880859375,
-0.022186279296875,
0.035064697265625,
-0.037017822265625,
-0.0219268798828125,
-0.053253173828125,
-0.0217742919921875,
-0.052337646484375,
0.005939483642578125,
-0.01419830322265625,
-0.039764404296875,
-0.01262664794921875,
0.000492095947265625,
0.0192718505859375,
0.03131103515625,
-0.03399658203125,
0.0286865234375,
-0.05206298828125,
0.050048828125,
0.01520538330078125,
0.01226043701171875,
-0.006755828857421875,
-0.048736572265625,
-0.04345703125,
0.02947998046875,
-0.024261474609375,
-0.07843017578125,
0.037017822265625,
-0.0107421875,
0.0254058837890625,
0.05633544921875,
0.031585693359375,
0.042694091796875,
0.005886077880859375,
0.060028076171875,
-0.00433349609375,
-0.056243896484375,
0.059967041015625,
-0.015472412109375,
-0.00846099853515625,
0.0043182373046875,
0.0290374755859375,
-0.064697265625,
-0.059326171875,
-0.054840087890625,
-0.036468505859375,
0.05517578125,
-0.0035247802734375,
0.001262664794921875,
-0.0005350112915039062,
0.027740478515625,
0.00550079345703125,
-0.004230499267578125,
-0.1070556640625,
0.003993988037109375,
-0.03802490234375,
-0.02789306640625,
0.0230865478515625,
-0.01104736328125,
-0.0177154541015625,
-0.0333251953125,
0.05780029296875,
0.0126800537109375,
0.0288543701171875,
0.034027099609375,
-0.037322998046875,
-0.0026988983154296875,
0.0131683349609375,
0.038543701171875,
0.0557861328125,
-0.020599365234375,
0.01490020751953125,
0.01018524169921875,
-0.024444580078125,
0.025604248046875,
-0.0032100677490234375,
-0.0025691986083984375,
0.01267242431640625,
0.04052734375,
0.03802490234375,
-0.0253753662109375,
-0.059814453125,
0.06927490234375,
-0.01538848876953125,
-0.033599853515625,
-0.06298828125,
0.0213165283203125,
-0.016204833984375,
0.033660888671875,
0.0465087890625,
0.0029010772705078125,
-0.0066375732421875,
-0.052001953125,
0.01016998291015625,
0.0256500244140625,
-0.0133209228515625,
-0.035980224609375,
0.0579833984375,
-0.0095672607421875,
-0.023834228515625,
0.05816650390625,
-0.024444580078125,
-0.04095458984375,
0.053985595703125,
0.038665771484375,
0.0249481201171875,
-0.0104522705078125,
0.036529541015625,
0.04254150390625,
0.024871826171875,
-0.006992340087890625,
0.0205841064453125,
-0.007541656494140625,
-0.053680419921875,
-0.00643157958984375,
-0.0309295654296875,
0.0169677734375,
0.01561737060546875,
-0.055267333984375,
0.0229949951171875,
-0.040435791015625,
-0.0214080810546875,
-0.0264129638671875,
0.032958984375,
-0.031585693359375,
0.013153076171875,
0.0170440673828125,
0.0904541015625,
-0.1046142578125,
0.07952880859375,
0.052337646484375,
-0.046722412109375,
-0.058258056640625,
-0.007328033447265625,
-0.004913330078125,
-0.03802490234375,
0.062042236328125,
0.0278472900390625,
0.03814697265625,
-0.01499176025390625,
-0.06744384765625,
-0.057708740234375,
0.0667724609375,
-0.01141357421875,
-0.03863525390625,
-0.005336761474609375,
0.01009368896484375,
0.045654296875,
-0.0168609619140625,
-0.00646209716796875,
0.0452880859375,
0.0290069580078125,
-0.03564453125,
-0.02996826171875,
0.01374053955078125,
-0.00830078125,
0.00200653076171875,
0.003795623779296875,
-0.06884765625,
0.07025146484375,
-0.0184783935546875,
-0.03369140625,
0.034942626953125,
0.0517578125,
0.026641845703125,
0.0212554931640625,
0.055633544921875,
0.042205810546875,
0.051055908203125,
-0.009368896484375,
0.06854248046875,
-0.0012331008911132812,
0.03936767578125,
0.029632568359375,
0.0131683349609375,
0.052764892578125,
0.039520263671875,
-0.0218048095703125,
0.07666015625,
0.053680419921875,
-0.0310516357421875,
0.0784912109375,
-0.0003571510314941406,
0.00292205810546875,
-0.0037555694580078125,
0.0166168212890625,
-0.014617919921875,
0.02862548828125,
0.0297393798828125,
-0.017425537109375,
-0.0129852294921875,
0.0379638671875,
0.00722503662109375,
-0.0067901611328125,
-0.0229339599609375,
0.0307464599609375,
0.0038909912109375,
-0.020599365234375,
0.054290771484375,
0.01018524169921875,
0.036865234375,
-0.03411865234375,
-0.019134521484375,
-0.018402099609375,
0.034698486328125,
-0.026336669921875,
-0.053314208984375,
0.017913818359375,
-0.01145172119140625,
-0.0122222900390625,
0.01263427734375,
0.0679931640625,
-0.043243408203125,
-0.031890869140625,
0.0142974853515625,
-0.00257110595703125,
0.030364990234375,
-0.0141754150390625,
-0.04345703125,
0.005523681640625,
0.005924224853515625,
-0.018890380859375,
0.0030269622802734375,
0.021820068359375,
-0.0185546875,
0.046783447265625,
0.0703125,
0.030853271484375,
0.027587890625,
-0.032562255859375,
0.04632568359375,
-0.044586181640625,
-0.051849365234375,
-0.052001953125,
0.00730133056640625,
-0.0311737060546875,
-0.0254364013671875,
0.061126708984375,
0.054931640625,
0.07208251953125,
-0.00798797607421875,
0.054840087890625,
-0.0321044921875,
0.076904296875,
-0.036834716796875,
0.062469482421875,
-0.023681640625,
-0.004856109619140625,
-0.0116119384765625,
-0.0634765625,
0.00437164306640625,
0.055938720703125,
-0.028472900390625,
-0.028045654296875,
0.047637939453125,
0.0731201171875,
-0.021240234375,
-0.0225372314453125,
-0.0012340545654296875,
0.03125,
0.0258026123046875,
0.053253173828125,
0.0721435546875,
-0.06451416015625,
0.0745849609375,
-0.027496337890625,
-0.006282806396484375,
-0.030731201171875,
-0.027618408203125,
-0.0682373046875,
-0.0305633544921875,
-0.0036449432373046875,
-0.037811279296875,
0.00907135009765625,
0.053741455078125,
0.0697021484375,
-0.08197021484375,
-0.0036983489990234375,
-0.0009531974792480469,
0.01739501953125,
-0.04400634765625,
-0.028076171875,
0.03289794921875,
0.006198883056640625,
-0.046417236328125,
0.01241302490234375,
-0.0126495361328125,
0.008575439453125,
-0.007534027099609375,
-0.0077056884765625,
-0.0295867919921875,
-0.0017137527465820312,
0.027587890625,
0.04345703125,
-0.0633544921875,
-0.03497314453125,
-0.013427734375,
-0.025146484375,
0.00140380859375,
0.037078857421875,
-0.0953369140625,
-0.002147674560546875,
0.033660888671875,
0.034912109375,
0.050201416015625,
0.024444580078125,
0.00777435302734375,
-0.06787109375,
0.0367431640625,
-0.006313323974609375,
0.01934814453125,
0.007053375244140625,
-0.038543701171875,
0.046417236328125,
0.049041748046875,
-0.046051025390625,
-0.08013916015625,
-0.006519317626953125,
-0.0634765625,
-0.043243408203125,
0.0567626953125,
-0.0191802978515625,
-0.03021240234375,
-0.01678466796875,
-0.028350830078125,
0.033935546875,
-0.025634765625,
0.0667724609375,
0.0662841796875,
-0.01209259033203125,
-0.019439697265625,
-0.037933349609375,
0.022857666015625,
-0.00019216537475585938,
-0.041595458984375,
-0.0166168212890625,
0.0106353759765625,
0.05841064453125,
0.026947021484375,
0.038177490234375,
-0.00122833251953125,
0.027374267578125,
0.0146942138671875,
-0.0019521713256835938,
0.003147125244140625,
-0.00506591796875,
-0.005756378173828125,
-0.015411376953125,
-0.035797119140625,
-0.032745361328125
]
] |
simsim314/WizardLM-70B-V1.0-HF | 2023-08-12T01:11:14.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"license:llama2",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | simsim314 | null | null | simsim314/WizardLM-70B-V1.0-HF | 1 | 6,742 | transformers | 2023-08-11T22:10:14 | ---
license: llama2
---
Float16 weights of [WizardLM/WizardLM-70B-V1.0](https://huggingface.co/WizardLM/WizardLM-70B-V1.0) in Hugging Face Transformers format.
```python
from transformers import LlamaTokenizer, AutoModelForCausalLM
import torch

tokenizer = LlamaTokenizer.from_pretrained("simsim314/WizardLM-70B-V1.0-HF")
# torch_dtype=torch.float16 matches the stored weights and halves memory vs. the fp32 default
model = AutoModelForCausalLM.from_pretrained("simsim314/WizardLM-70B-V1.0-HF", torch_dtype=torch.float16)
``` | 342 | [
[
-0.039825439453125,
-0.020233154296875,
0.0031147003173828125,
0.05157470703125,
-0.028472900390625,
0.01389312744140625,
0.0244598388671875,
0.001499176025390625,
0.01096343994140625,
0.030426025390625,
-0.060760498046875,
-0.0241241455078125,
-0.0628662109375,
0.0018892288208007812,
-0.0198516845703125,
0.06689453125,
-0.025299072265625,
0.010406494140625,
-0.024261474609375,
0.008575439453125,
-0.008026123046875,
-0.0413818359375,
-0.04534912109375,
-0.02911376953125,
0.01247406005859375,
0.02142333984375,
0.03271484375,
0.009033203125,
0.0255279541015625,
0.03436279296875,
-0.0152130126953125,
0.01384735107421875,
-0.0304412841796875,
-0.01490020751953125,
0.003376007080078125,
-0.036773681640625,
-0.06707763671875,
0.002880096435546875,
0.048553466796875,
0.012908935546875,
-0.01312255859375,
0.036834716796875,
-0.004985809326171875,
0.0252532958984375,
-0.031219482421875,
0.034423828125,
-0.03741455078125,
0.01849365234375,
-0.00243377685546875,
-0.01513671875,
-0.0115814208984375,
-0.0012769699096679688,
0.0059356689453125,
-0.06219482421875,
0.022674560546875,
0.0178680419921875,
0.106201171875,
0.034820556640625,
-0.03411865234375,
0.0088653564453125,
-0.0301055908203125,
0.06170654296875,
-0.0643310546875,
0.00960540771484375,
0.0279693603515625,
0.01230621337890625,
-0.022796630859375,
-0.0745849609375,
-0.038238525390625,
-0.0009002685546875,
-0.00690460205078125,
-0.00024366378784179688,
-0.0253448486328125,
-0.001239776611328125,
0.038177490234375,
0.0250701904296875,
-0.0283966064453125,
-0.005947113037109375,
-0.051361083984375,
-0.044403076171875,
0.054046630859375,
0.02789306640625,
0.0213165283203125,
-0.01457977294921875,
-0.03924560546875,
-0.0170745849609375,
-0.015167236328125,
0.00466156005859375,
0.027008056640625,
0.0035724639892578125,
-0.036224365234375,
0.0611572265625,
-0.0251312255859375,
0.032928466796875,
0.046630859375,
-0.0284423828125,
0.057647705078125,
-0.005641937255859375,
-0.0116729736328125,
0.010986328125,
0.054412841796875,
0.026214599609375,
0.03125,
0.018768310546875,
-0.017791748046875,
-0.0290069580078125,
0.0159454345703125,
-0.07373046875,
0.0037822723388671875,
0.0011472702026367188,
-0.03326416015625,
-0.017364501953125,
0.0292510986328125,
-0.03448486328125,
0.001018524169921875,
-0.024688720703125,
0.04388427734375,
-0.019805908203125,
-0.041534423828125,
0.0124664306640625,
-0.011993408203125,
0.044189453125,
0.030609130859375,
-0.05535888671875,
0.0097198486328125,
0.047607421875,
0.05206298828125,
-0.021270751953125,
-0.031707763671875,
-0.00423431396484375,
0.02197265625,
-0.019622802734375,
0.04974365234375,
0.0056610107421875,
-0.052490234375,
-0.0025177001953125,
0.0298919677734375,
-0.02337646484375,
-0.049346923828125,
0.03546142578125,
-0.045440673828125,
0.01172637939453125,
0.0107269287109375,
-0.04144287109375,
-0.0110626220703125,
0.01190948486328125,
-0.043304443359375,
0.0985107421875,
0.031097412109375,
-0.05670166015625,
0.0292205810546875,
-0.040985107421875,
-0.0260009765625,
0.005126953125,
0.00798797607421875,
-0.050537109375,
0.01471710205078125,
-0.00873565673828125,
0.021270751953125,
0.008392333984375,
0.0169677734375,
-0.0165863037109375,
-0.0377197265625,
-0.0010995864868164062,
-0.0499267578125,
0.07073974609375,
0.0159759521484375,
-0.02618408203125,
0.0172882080078125,
-0.050506591796875,
-0.01409149169921875,
0.021759033203125,
-0.021942138671875,
0.00849151611328125,
-0.03900146484375,
0.001560211181640625,
0.01910400390625,
0.05621337890625,
-0.0323486328125,
0.0186309814453125,
-0.026702880859375,
-0.009857177734375,
0.051971435546875,
-0.005908966064453125,
0.031463623046875,
-0.0158538818359375,
0.056671142578125,
0.0224609375,
0.024169921875,
0.0290069580078125,
-0.036041259765625,
-0.0679931640625,
-0.0325927734375,
-0.005046844482421875,
0.033660888671875,
-0.04559326171875,
0.05279541015625,
-0.0206756591796875,
-0.06195068359375,
-0.029388427734375,
0.0041961669921875,
-0.002666473388671875,
0.0250701904296875,
0.036895751953125,
-0.00647735595703125,
-0.037109375,
-0.068359375,
-0.0007710456848144531,
-0.01068115234375,
-0.00630950927734375,
-0.00838470458984375,
0.052978515625,
-0.035491943359375,
0.077880859375,
-0.0430908203125,
-0.013275146484375,
-0.00815582275390625,
0.006679534912109375,
0.026458740234375,
0.061279296875,
0.04815673828125,
-0.0416259765625,
-0.04913330078125,
-0.024017333984375,
-0.037078857421875,
-0.00548553466796875,
0.0036983489990234375,
-0.033843994140625,
0.01480865478515625,
0.027496337890625,
-0.08099365234375,
0.07098388671875,
0.025604248046875,
-0.059722900390625,
0.055816650390625,
-0.021026611328125,
0.007328033447265625,
-0.0771484375,
-0.00586700439453125,
-0.01474761962890625,
-0.021026611328125,
-0.0250396728515625,
-0.0253753662109375,
0.006870269775390625,
0.01471710205078125,
-0.064208984375,
0.06976318359375,
-0.0225372314453125,
0.005519866943359375,
-0.0249481201171875,
-0.0157012939453125,
0.012115478515625,
0.0221099853515625,
-0.003452301025390625,
0.045166015625,
0.047637939453125,
-0.040069580078125,
0.053741455078125,
0.0421142578125,
-0.006801605224609375,
0.02423095703125,
-0.05096435546875,
-0.0081024169921875,
-0.0002899169921875,
0.03424072265625,
-0.054534912109375,
-0.0159912109375,
0.03741455078125,
-0.023651123046875,
0.04205322265625,
-0.006137847900390625,
-0.025634765625,
-0.03887939453125,
-0.01690673828125,
0.046600341796875,
0.045501708984375,
-0.0335693359375,
0.07098388671875,
0.004425048828125,
-0.0017871856689453125,
-0.042724609375,
-0.05291748046875,
-0.0156097412109375,
-0.0181121826171875,
-0.041290283203125,
0.0364990234375,
-0.020538330078125,
-0.0032958984375,
-0.01458740234375,
-0.044189453125,
-0.00749969482421875,
-0.007038116455078125,
0.026123046875,
0.046051025390625,
-0.0128173828125,
-0.027130126953125,
0.0200042724609375,
-0.03741455078125,
0.0286712646484375,
0.0124664306640625,
0.052764892578125,
-0.01776123046875,
-0.039886474609375,
-0.039306640625,
-0.0091705322265625,
0.0297088623046875,
0.00420379638671875,
0.07171630859375,
0.08050537109375,
-0.0266571044921875,
-0.004535675048828125,
-0.044952392578125,
-0.0065765380859375,
-0.044647216796875,
0.03375244140625,
-0.031768798828125,
-0.04742431640625,
0.040863037109375,
0.00382232666015625,
-0.01020050048828125,
0.068603515625,
0.047576904296875,
0.0175323486328125,
0.07196044921875,
0.033721923828125,
0.01189422607421875,
0.03851318359375,
-0.033111572265625,
0.0104217529296875,
-0.053955078125,
-0.046051025390625,
-0.0250396728515625,
0.0099029541015625,
-0.0302581787109375,
-0.04132080078125,
-0.0017347335815429688,
0.019134521484375,
-0.05767822265625,
0.0330810546875,
-0.038665771484375,
0.0163726806640625,
0.041290283203125,
0.0021572113037109375,
0.0086517333984375,
-0.002956390380859375,
-0.0167388916015625,
0.006107330322265625,
-0.040130615234375,
-0.04541015625,
0.058868408203125,
0.03240966796875,
0.06488037109375,
-0.017059326171875,
0.05352783203125,
-0.004058837890625,
0.038604736328125,
-0.045379638671875,
0.042449951171875,
0.0207977294921875,
-0.08135986328125,
-0.019012451171875,
-0.0186920166015625,
-0.048858642578125,
0.01556396484375,
-0.0052947998046875,
-0.044281005859375,
0.01247406005859375,
-0.00006783008575439453,
-0.00872039794921875,
0.02960205078125,
-0.030059814453125,
0.072509765625,
-0.014495849609375,
-0.005828857421875,
-0.0076446533203125,
-0.0225372314453125,
0.020538330078125,
-0.009674072265625,
0.00479888916015625,
-0.036041259765625,
-0.02099609375,
0.058349609375,
-0.0404052734375,
0.04083251953125,
-0.0289459228515625,
-0.0166015625,
0.038177490234375,
-0.0063934326171875,
0.0227203369140625,
0.017364501953125,
-0.00711822509765625,
0.0290374755859375,
0.0208892822265625,
-0.047576904296875,
-0.0228729248046875,
0.04461669921875,
-0.0792236328125,
-0.05731201171875,
-0.06011962890625,
-0.028900146484375,
0.00009161233901977539,
0.0135040283203125,
0.0450439453125,
0.005863189697265625,
0.0110626220703125,
0.005954742431640625,
0.045074462890625,
-0.011993408203125,
0.05682373046875,
0.03369140625,
-0.029205322265625,
-0.030548095703125,
0.055206298828125,
0.0016307830810546875,
0.0217437744140625,
-0.00521087646484375,
0.002147674560546875,
-0.0301361083984375,
-0.016387939453125,
-0.0167694091796875,
0.0205078125,
-0.0628662109375,
-0.03070068359375,
-0.03302001953125,
-0.0596923828125,
-0.0291900634765625,
-0.0006051063537597656,
-0.033721923828125,
-0.0279998779296875,
-0.059814453125,
-0.01409149169921875,
0.053192138671875,
0.034881591796875,
-0.0224761962890625,
0.03741455078125,
-0.05877685546875,
0.0275421142578125,
0.037750244140625,
0.0207672119140625,
0.005084991455078125,
-0.0897216796875,
-0.01522064208984375,
-0.00011819601058959961,
-0.0333251953125,
-0.045074462890625,
0.034637451171875,
0.0013380050659179688,
0.058624267578125,
0.03643798828125,
0.004398345947265625,
0.0654296875,
-0.036529541015625,
0.0660400390625,
0.0152740478515625,
-0.08074951171875,
0.00815582275390625,
-0.00018334388732910156,
0.0198516845703125,
0.020904541015625,
0.036041259765625,
-0.0172271728515625,
-0.0011129379272460938,
-0.06182861328125,
-0.07269287109375,
0.044097900390625,
0.01493072509765625,
0.031951904296875,
0.01485443115234375,
0.0244140625,
0.01085662841796875,
0.0201873779296875,
-0.0626220703125,
-0.0543212890625,
-0.0509033203125,
-0.0280303955078125,
0.003833770751953125,
-0.011932373046875,
0.0013885498046875,
-0.061431884765625,
0.07635498046875,
-0.0029754638671875,
0.03228759765625,
0.01387786865234375,
-0.01241302490234375,
-0.017730712890625,
-0.0201873779296875,
0.053924560546875,
0.049774169921875,
-0.0148773193359375,
-0.0111083984375,
0.046173095703125,
-0.0599365234375,
0.0230255126953125,
0.0178375244140625,
-0.0029773712158203125,
0.012847900390625,
0.035919189453125,
0.0689697265625,
0.016693115234375,
-0.017578125,
0.046417236328125,
-0.002277374267578125,
-0.0191650390625,
-0.050079345703125,
0.019287109375,
0.0175933837890625,
0.01345062255859375,
0.05242919921875,
0.0072479248046875,
-0.01422119140625,
0.01030731201171875,
0.0174407958984375,
0.0421142578125,
-0.03973388671875,
-0.00696563720703125,
0.055572509765625,
-0.00601959228515625,
-0.03515625,
0.0299530029296875,
-0.01422119140625,
-0.029144287109375,
0.0589599609375,
0.044281005859375,
0.0675048828125,
-0.00943756103515625,
0.019683837890625,
0.030853271484375,
0.0036468505859375,
-0.002040863037109375,
0.01433563232421875,
0.0177001953125,
-0.052398681640625,
-0.03369140625,
-0.07366943359375,
-0.0204925537109375,
-0.01262664794921875,
-0.051116943359375,
0.0477294921875,
-0.024261474609375,
-0.0236053466796875,
-0.027099609375,
0.006893157958984375,
-0.064208984375,
0.01904296875,
0.01526641845703125,
0.0777587890625,
-0.050079345703125,
0.054962158203125,
0.04339599609375,
-0.03509521484375,
-0.0660400390625,
-0.015899658203125,
-0.007129669189453125,
-0.072998046875,
0.058349609375,
0.006031036376953125,
0.0029621124267578125,
0.0200958251953125,
-0.058807373046875,
-0.08990478515625,
0.09405517578125,
0.01256561279296875,
-0.028717041015625,
0.0017261505126953125,
0.02117919921875,
0.013427734375,
-0.026824951171875,
0.03472900390625,
0.035736083984375,
0.057159423828125,
0.0143585205078125,
-0.0677490234375,
0.003673553466796875,
-0.01227569580078125,
-0.0020771026611328125,
-0.012969970703125,
-0.06292724609375,
0.07171630859375,
0.0034637451171875,
0.0132598876953125,
0.0201263427734375,
0.07354736328125,
0.0440673828125,
0.01507568359375,
0.030914306640625,
0.036346435546875,
0.0219268798828125,
-0.01285552978515625,
0.05255126953125,
-0.0195770263671875,
0.0628662109375,
0.047332763671875,
-0.0008339881896972656,
0.0389404296875,
0.0380859375,
-0.024932861328125,
0.060699462890625,
0.069580078125,
-0.0323486328125,
0.0218048095703125,
0.0238037109375,
-0.0254974365234375,
-0.019439697265625,
0.025604248046875,
-0.04302978515625,
0.0157318115234375,
0.00853729248046875,
0.009490966796875,
-0.0106353759765625,
0.0072784423828125,
0.01241302490234375,
-0.034637451171875,
-0.0343017578125,
0.0419921875,
0.018646240234375,
-0.01490020751953125,
0.0543212890625,
-0.012786865234375,
0.0745849609375,
-0.05328369140625,
0.0006928443908691406,
-0.01171112060546875,
0.006420135498046875,
-0.0239105224609375,
-0.03546142578125,
0.01226806640625,
-0.016571044921875,
-0.00478363037109375,
0.00008094310760498047,
0.05767822265625,
-0.016143798828125,
-0.0653076171875,
0.0142364501953125,
0.025909423828125,
0.007640838623046875,
-0.004314422607421875,
-0.07330322265625,
0.0046539306640625,
-0.003910064697265625,
-0.06243896484375,
0.0230712890625,
0.041748046875,
0.0200347900390625,
0.061614990234375,
0.03704833984375,
-0.01091766357421875,
0.030914306640625,
-0.00623321533203125,
0.057037353515625,
-0.04254150390625,
-0.0189208984375,
-0.060577392578125,
0.0655517578125,
-0.01355743408203125,
-0.036773681640625,
0.06243896484375,
0.04345703125,
0.053924560546875,
-0.03814697265625,
0.031951904296875,
-0.0063629150390625,
0.0047607421875,
-0.014404296875,
0.06329345703125,
-0.03399658203125,
-0.0076904296875,
-0.0142059326171875,
-0.0897216796875,
-0.0009660720825195312,
0.059356689453125,
-0.006305694580078125,
0.01983642578125,
0.050750732421875,
0.06317138671875,
-0.01312255859375,
-0.0233154296875,
0.02069091796875,
0.0087738037109375,
0.0157623291015625,
-0.00749969482421875,
0.059173583984375,
-0.05078125,
0.038970947265625,
-0.031951904296875,
-0.0097808837890625,
-0.0194854736328125,
-0.06964111328125,
-0.0743408203125,
-0.0132598876953125,
-0.03173828125,
-0.0478515625,
-0.054931640625,
0.08746337890625,
0.061279296875,
-0.06640625,
-0.01392364501953125,
0.007965087890625,
0.00830078125,
0.00647735595703125,
-0.0200347900390625,
0.0296783447265625,
-0.01369476318359375,
-0.07171630859375,
0.00977325439453125,
-0.005214691162109375,
0.050689697265625,
-0.034820556640625,
-0.032501220703125,
-0.0121917724609375,
0.0244293212890625,
0.022796630859375,
0.0072784423828125,
-0.052764892578125,
-0.0180511474609375,
-0.018585205078125,
-0.0226593017578125,
0.00926971435546875,
0.0135955810546875,
-0.037139892578125,
-0.016998291015625,
0.0301055908203125,
0.00785064697265625,
0.06829833984375,
0.0017719268798828125,
0.0216522216796875,
-0.041900634765625,
0.037139892578125,
0.008148193359375,
0.0321044921875,
0.0123291015625,
-0.018707275390625,
0.021484375,
0.0285491943359375,
-0.040618896484375,
-0.055694580078125,
-0.01485443115234375,
-0.0677490234375,
-0.01224517822265625,
0.07720947265625,
-0.003368377685546875,
-0.034454345703125,
0.0005540847778320312,
-0.046112060546875,
0.032135009765625,
-0.02099609375,
0.034576416015625,
0.050079345703125,
0.0007328987121582031,
-0.00006401538848876953,
-0.0295867919921875,
0.03936767578125,
0.01422119140625,
-0.05242919921875,
-0.036712646484375,
-0.002819061279296875,
0.04217529296875,
0.0302734375,
0.0185546875,
-0.00899505615234375,
0.0031337738037109375,
0.0176544189453125,
0.0265045166015625,
-0.0178375244140625,
-0.0001862049102783203,
-0.01232147216796875,
-0.00791168212890625,
-0.0030956268310546875,
-0.0101470947265625
]
] |
lambdalabs/sd-pokemon-diffusers | 2023-05-16T09:17:58.000Z | [
"diffusers",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-image",
"en",
"dataset:lambdalabs/pokemon-blip-captions",
"endpoints_compatible",
"has_space",
"diffusers:StableDiffusionPipeline",
"region:us"
] | text-to-image | lambdalabs | null | null | lambdalabs/sd-pokemon-diffusers | 160 | 6,740 | diffusers | 2022-09-16T15:43:00 | ---
language:
- en
thumbnail: "https://s3.amazonaws.com/moonup/production/uploads/1663756797814-62bd5f951e22ec84279820e8.png"
tags:
- stable-diffusion
- stable-diffusion-diffusers
- text-to-image
datasets:
- lambdalabs/pokemon-blip-captions
---
__Stable Diffusion fine-tuned on Pokémon by [Lambda Labs](https://lambdalabs.com/).__
Put in a text prompt and generate your own Pokémon character, no "prompt engineering" required!
If you want to find out how to train your own Stable Diffusion variants, see this [example](https://github.com/LambdaLabsML/examples/tree/main/stable-diffusion-finetuning) from Lambda Labs.

> Girl with a pearl earring, Cute Obama creature, Donald Trump, Boris Johnson, Totoro, Hello Kitty
## Usage
```bash
!pip install diffusers==0.3.0
!pip install transformers scipy ftfy
```
```python
import torch
from diffusers import StableDiffusionPipeline
from torch import autocast
pipe = StableDiffusionPipeline.from_pretrained("lambdalabs/sd-pokemon-diffusers", torch_dtype=torch.float16)
pipe = pipe.to("cuda")
prompt = "Yoda"
scale = 10
n_samples = 4
# The NSFW checker is sometimes confused by the Pokémon images; you can
# disable it here at your own risk
disable_safety = False
if disable_safety:
def null_safety(images, **kwargs):
return images, False
pipe.safety_checker = null_safety
with autocast("cuda"):
images = pipe(n_samples*[prompt], guidance_scale=scale).images
for idx, im in enumerate(images):
im.save(f"{idx:06}.png")
```
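The loop above saves each sample as a separate PNG. For a quick side-by-side comparison, the generated images can also be tiled into a single sheet with a small PIL-only helper (the 2×2 layout matching `n_samples = 4` is an assumption; adjust `rows`/`cols` for other batch sizes):

```python
from PIL import Image

def image_grid(imgs, rows, cols):
    # Assumes all images share the same size; tiles them row-major.
    w, h = imgs[0].size
    grid = Image.new("RGB", (cols * w, rows * h))
    for i, im in enumerate(imgs):
        grid.paste(im, ((i % cols) * w, (i // cols) * h))
    return grid

# e.g. image_grid(images, rows=2, cols=2).save("grid.png")
```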
## Model description
Trained on [BLIP captioned Pokémon images](https://huggingface.co/datasets/lambdalabs/pokemon-blip-captions) using 2xA6000 GPUs on [Lambda GPU Cloud](https://lambdalabs.com/service/gpu-cloud) for around 15,000 steps (about 6 hours, at a cost of about $10).
## Links
- [Lambda Diffusers](https://github.com/LambdaLabsML/lambda-diffusers)
- [Captioned Pokémon dataset](https://huggingface.co/datasets/lambdalabs/pokemon-blip-captions)
- [Model weights in Diffusers format](https://huggingface.co/lambdalabs/sd-pokemon-diffusers)
- [Original model weights](https://huggingface.co/justinpinkney/pokemon-stable-diffusion)
- [Training code](https://github.com/justinpinkney/stable-diffusion)
Trained by [Justin Pinkney](https://www.justinpinkney.com) ([@Buntworthy](https://twitter.com/Buntworthy)) at [Lambda Labs](https://lambdalabs.com/). | 2,451 | [
[
-0.039306640625,
-0.054962158203125,
0.03192138671875,
0.012298583984375,
-0.0105743408203125,
-0.0033512115478515625,
0.006114959716796875,
-0.0262603759765625,
0.0269775390625,
0.0245819091796875,
-0.030303955078125,
-0.017425537109375,
-0.039642333984375,
0.004093170166015625,
-0.0202789306640625,
0.06982421875,
0.0006909370422363281,
-0.002742767333984375,
0.01165771484375,
-0.00027751922607421875,
-0.02752685546875,
-0.01203155517578125,
-0.07666015625,
-0.024017333984375,
0.042572021484375,
0.031036376953125,
0.055084228515625,
0.03607177734375,
0.0182342529296875,
0.025604248046875,
-0.0192718505859375,
0.0037136077880859375,
-0.038360595703125,
0.0080108642578125,
-0.01407623291015625,
-0.031585693359375,
-0.0282135009765625,
-0.0011053085327148438,
0.0268096923828125,
0.0243988037109375,
0.00392913818359375,
-0.00634002685546875,
0.002552032470703125,
0.0367431640625,
-0.044677734375,
-0.0088043212890625,
-0.026885986328125,
0.016204833984375,
-0.01078033447265625,
-0.006103515625,
-0.0234832763671875,
-0.006908416748046875,
0.0187530517578125,
-0.056060791015625,
0.042999267578125,
-0.005889892578125,
0.08984375,
0.00901031494140625,
-0.0183868408203125,
-0.0284881591796875,
-0.03961181640625,
0.054473876953125,
-0.035614013671875,
0.0189361572265625,
0.0148468017578125,
0.01544952392578125,
-0.00868988037109375,
-0.0728759765625,
-0.018585205078125,
-0.0234832763671875,
0.005035400390625,
0.02630615234375,
-0.0097808837890625,
-0.0071868896484375,
0.036956787109375,
0.025665283203125,
-0.037109375,
-0.00754547119140625,
-0.035552978515625,
-0.00014543533325195312,
0.0479736328125,
-0.0034961700439453125,
0.03826904296875,
-0.0050048828125,
-0.01361846923828125,
-0.0086517333984375,
-0.02386474609375,
0.010467529296875,
0.0100555419921875,
-0.0155792236328125,
-0.0279541015625,
0.050506591796875,
0.011199951171875,
0.0210113525390625,
0.0462646484375,
0.01427459716796875,
0.02801513671875,
-0.002544403076171875,
-0.01033782958984375,
0.0107269287109375,
0.06756591796875,
0.024627685546875,
0.00015437602996826172,
-0.019134521484375,
-0.00522613525390625,
-0.01216888427734375,
0.01432037353515625,
-0.0828857421875,
-0.061859130859375,
0.04119873046875,
-0.035858154296875,
-0.0274505615234375,
-0.0258331298828125,
-0.07635498046875,
-0.048858642578125,
0.01971435546875,
0.04388427734375,
-0.055908203125,
-0.026092529296875,
0.01013946533203125,
-0.0423583984375,
-0.013641357421875,
0.03875732421875,
-0.09027099609375,
-0.004955291748046875,
0.019134521484375,
0.081298828125,
0.013641357421875,
-0.016876220703125,
-0.0127716064453125,
0.005123138427734375,
-0.0225677490234375,
0.045135498046875,
-0.02996826171875,
-0.035125732421875,
-0.0229644775390625,
0.0153961181640625,
0.00789642333984375,
-0.0185546875,
0.061676025390625,
-0.00997161865234375,
0.0189056396484375,
-0.00797271728515625,
-0.0482177734375,
-0.01148223876953125,
-0.005138397216796875,
-0.049163818359375,
0.062469482421875,
0.007171630859375,
-0.0831298828125,
0.0225372314453125,
-0.052886962890625,
0.0025959014892578125,
0.0174713134765625,
0.0149078369140625,
-0.0450439453125,
0.004390716552734375,
0.01104736328125,
0.032012939453125,
0.02020263671875,
-0.0234832763671875,
-0.038543701171875,
-0.0197906494140625,
0.026153564453125,
-0.0198211669921875,
0.09014892578125,
0.027069091796875,
-0.045654296875,
-0.0010013580322265625,
-0.06512451171875,
-0.008941650390625,
0.0097198486328125,
0.0020160675048828125,
-0.019622802734375,
0.00630950927734375,
-0.0010938644409179688,
-0.0028209686279296875,
0.0305938720703125,
-0.03289794921875,
0.0231475830078125,
-0.007740020751953125,
0.0455322265625,
0.031494140625,
0.0120086669921875,
0.045745849609375,
-0.0238800048828125,
0.028900146484375,
-0.0072784423828125,
0.0271759033203125,
-0.00975799560546875,
-0.06414794921875,
-0.06317138671875,
-0.044464111328125,
0.00937652587890625,
0.03363037109375,
-0.049560546875,
0.04833984375,
0.016845703125,
-0.06756591796875,
-0.03448486328125,
0.00012564659118652344,
0.00841522216796875,
0.06109619140625,
0.03350830078125,
-0.0177764892578125,
-0.0272216796875,
-0.041107177734375,
0.0280303955078125,
0.01222991943359375,
-0.014190673828125,
0.01233673095703125,
0.0323486328125,
-0.03802490234375,
0.048095703125,
-0.02728271484375,
0.006195068359375,
-0.0012578964233398438,
0.0220489501953125,
0.033203125,
0.047698974609375,
0.058807373046875,
-0.045623779296875,
-0.038421630859375,
-0.0084228515625,
-0.05535888671875,
-0.006763458251953125,
-0.01457977294921875,
-0.04461669921875,
-0.0108184814453125,
0.0211944580078125,
-0.0599365234375,
0.0279388427734375,
0.041717529296875,
-0.0439453125,
0.03668212890625,
-0.053741455078125,
0.01265716552734375,
-0.07427978515625,
0.026092529296875,
0.0284423828125,
-0.0247802734375,
-0.05963134765625,
0.025299072265625,
0.019561767578125,
-0.00902557373046875,
-0.04937744140625,
0.0572509765625,
-0.0418701171875,
0.0330810546875,
-0.011199951171875,
0.0007886886596679688,
0.0073699951171875,
0.0122528076171875,
0.0238800048828125,
0.0465087890625,
0.0777587890625,
-0.068359375,
0.036773681640625,
0.0287933349609375,
-0.00452423095703125,
0.03826904296875,
-0.0665283203125,
0.0005373954772949219,
-0.0101318359375,
0.014190673828125,
-0.0648193359375,
-0.01447296142578125,
0.042327880859375,
-0.0284576416015625,
0.0225677490234375,
-0.03216552734375,
-0.0230560302734375,
-0.040283203125,
-0.031494140625,
0.0224456787109375,
0.06781005859375,
-0.051666259765625,
0.01934814453125,
0.011322021484375,
0.0136566162109375,
-0.036224365234375,
-0.05950927734375,
0.0000095367431640625,
-0.040771484375,
-0.05322265625,
0.0302276611328125,
-0.02838134765625,
0.0023326873779296875,
-0.0011653900146484375,
0.016571044921875,
-0.0181427001953125,
0.01432037353515625,
0.0172576904296875,
0.0153350830078125,
0.0107574462890625,
-0.0130767822265625,
0.0034542083740234375,
-0.0020389556884765625,
-0.0029811859130859375,
-0.0010499954223632812,
0.04949951171875,
-0.0010118484497070312,
-0.00695037841796875,
-0.059661865234375,
-0.0092620849609375,
0.03662109375,
0.007587432861328125,
0.051849365234375,
0.06414794921875,
-0.0297088623046875,
-0.0083465576171875,
-0.0147552490234375,
-0.00991058349609375,
-0.041046142578125,
0.0028076171875,
-0.0261383056640625,
-0.0182952880859375,
0.05157470703125,
0.01125335693359375,
0.03265380859375,
0.02801513671875,
0.045135498046875,
-0.02337646484375,
0.10406494140625,
0.03948974609375,
0.01447296142578125,
0.056182861328125,
-0.0599365234375,
0.00177001953125,
-0.06805419921875,
-0.0254364013671875,
-0.02740478515625,
-0.02886962890625,
-0.038665771484375,
-0.0191802978515625,
0.0178375244140625,
0.05963134765625,
-0.033905029296875,
0.0214996337890625,
-0.0631103515625,
0.0281982421875,
0.035980224609375,
0.0240631103515625,
-0.0036373138427734375,
0.0006461143493652344,
-0.0011539459228515625,
-0.01515960693359375,
-0.04144287109375,
-0.04144287109375,
0.0845947265625,
0.044921875,
0.07666015625,
0.01015472412109375,
0.0802001953125,
0.0113983154296875,
0.0128631591796875,
-0.038177490234375,
0.0301361083984375,
-0.0156402587890625,
-0.035125732421875,
-0.01317596435546875,
-0.029815673828125,
-0.0980224609375,
0.00592803955078125,
-0.03326416015625,
-0.0567626953125,
0.0236663818359375,
0.0296630859375,
-0.0186004638671875,
0.017303466796875,
-0.07293701171875,
0.054595947265625,
-0.0299224853515625,
-0.047882080078125,
0.0029430389404296875,
-0.01398468017578125,
0.03961181640625,
0.0002968311309814453,
-0.005889892578125,
-0.004703521728515625,
-0.01010894775390625,
0.034515380859375,
-0.0562744140625,
0.030181884765625,
-0.029449462890625,
-0.002288818359375,
0.0097808837890625,
0.0057525634765625,
0.020660400390625,
0.038665771484375,
-0.0124053955078125,
0.00498199462890625,
0.00249481201171875,
-0.046905517578125,
-0.02801513671875,
0.05535888671875,
-0.07183837890625,
-0.0146026611328125,
-0.0310821533203125,
-0.01177978515625,
0.01067352294921875,
0.027069091796875,
0.0709228515625,
0.0232696533203125,
0.007732391357421875,
-0.007083892822265625,
0.0872802734375,
-0.0174713134765625,
0.040985107421875,
0.00021314620971679688,
-0.0186920166015625,
-0.03887939453125,
0.05780029296875,
-0.0095367431640625,
0.01328277587890625,
0.003047943115234375,
0.025238037109375,
-0.0401611328125,
-0.033935546875,
-0.049163818359375,
0.026702880859375,
-0.027008056640625,
-0.002613067626953125,
-0.039276123046875,
-0.031646728515625,
-0.0176849365234375,
-0.033172607421875,
-0.018157958984375,
-0.0290985107421875,
-0.06585693359375,
0.020050048828125,
0.057708740234375,
0.034698486328125,
-0.009490966796875,
0.01157379150390625,
-0.05413818359375,
0.030670166015625,
0.0283050537109375,
0.0128631591796875,
-0.004558563232421875,
-0.04180908203125,
-0.004741668701171875,
0.03204345703125,
-0.037109375,
-0.0726318359375,
0.0438232421875,
0.0173187255859375,
0.05499267578125,
0.048309326171875,
-0.0120086669921875,
0.05950927734375,
-0.05267333984375,
0.080322265625,
0.038604736328125,
-0.06256103515625,
0.04925537109375,
-0.021575927734375,
0.0035266876220703125,
0.024566650390625,
0.034332275390625,
-0.034820556640625,
-0.049896240234375,
-0.0660400390625,
-0.05340576171875,
0.04345703125,
0.031951904296875,
0.0164794921875,
0.0153961181640625,
0.049407958984375,
0.0009732246398925781,
-0.004344940185546875,
-0.05517578125,
-0.0413818359375,
-0.04461669921875,
-0.01007080078125,
-0.02215576171875,
-0.0002377033233642578,
-0.0062713623046875,
-0.03631591796875,
0.05340576171875,
0.01708984375,
0.0233154296875,
0.02691650390625,
-0.0090789794921875,
-0.01395416259765625,
-0.01861572265625,
0.043365478515625,
0.02764892578125,
-0.0452880859375,
0.006256103515625,
-0.010711669921875,
-0.046051025390625,
0.0147552490234375,
0.00974273681640625,
-0.035552978515625,
0.02520751953125,
-0.0016908645629882812,
0.056060791015625,
0.007396697998046875,
-0.0309295654296875,
0.019622802734375,
-0.029815673828125,
-0.0219268798828125,
-0.00858306884765625,
0.031768798828125,
0.00634002685546875,
0.0245361328125,
0.022918701171875,
0.0125579833984375,
0.0181732177734375,
-0.034515380859375,
0.01153564453125,
0.0288543701171875,
-0.0017461776733398438,
-0.027496337890625,
0.08355712890625,
0.0134429931640625,
-0.01617431640625,
0.033447265625,
-0.041259765625,
-0.0214996337890625,
0.0404052734375,
0.031494140625,
0.06549072265625,
-0.0214996337890625,
0.045684814453125,
0.047698974609375,
-0.0013551712036132812,
-0.02581787109375,
0.0197906494140625,
0.00899505615234375,
-0.03240966796875,
0.01064300537109375,
-0.0269317626953125,
-0.041351318359375,
0.0195159912109375,
-0.0197906494140625,
0.04937744140625,
-0.05975341796875,
-0.0185089111328125,
0.005069732666015625,
0.022918701171875,
-0.04156494140625,
0.01015472412109375,
0.0200042724609375,
0.07879638671875,
-0.061614990234375,
0.05084228515625,
0.052703857421875,
-0.043426513671875,
-0.039276123046875,
-0.01003265380859375,
-0.03662109375,
-0.0406494140625,
0.02935791015625,
-0.003200531005859375,
0.0181884765625,
0.005092620849609375,
-0.057708740234375,
-0.066162109375,
0.0872802734375,
0.02783203125,
-0.0321044921875,
0.0024623870849609375,
-0.0286712646484375,
0.053741455078125,
-0.004642486572265625,
0.054901123046875,
0.01276397705078125,
0.0196380615234375,
0.02191162109375,
-0.043243408203125,
-0.0037784576416015625,
-0.031890869140625,
0.00547027587890625,
-0.004352569580078125,
-0.05889892578125,
0.08941650390625,
-0.0012960433959960938,
-0.0123138427734375,
0.027587890625,
0.044708251953125,
0.0501708984375,
0.019927978515625,
0.0311431884765625,
0.059539794921875,
0.07196044921875,
-0.00223541259765625,
0.055938720703125,
-0.035003662109375,
0.041351318359375,
0.0369873046875,
0.004680633544921875,
0.0406494140625,
0.00894927978515625,
0.004306793212890625,
0.0443115234375,
0.07928466796875,
-0.0264434814453125,
0.0501708984375,
0.00749969482421875,
-0.01837158203125,
0.0018157958984375,
0.0253448486328125,
-0.044830322265625,
-0.0247039794921875,
0.03387451171875,
-0.0203857421875,
-0.0163116455078125,
0.01023101806640625,
-0.0035915374755859375,
-0.027069091796875,
-0.03985595703125,
0.01629638671875,
-0.0091705322265625,
-0.029937744140625,
0.06988525390625,
-0.007049560546875,
0.08966064453125,
-0.041107177734375,
0.0009059906005859375,
-0.0266571044921875,
0.00548553466796875,
-0.040863037109375,
-0.073974609375,
-0.00910186767578125,
0.0167388916015625,
0.005474090576171875,
-0.020050048828125,
0.041717529296875,
-0.0318603515625,
-0.038177490234375,
-0.006015777587890625,
-0.0011129379272460938,
0.0379638671875,
0.03228759765625,
-0.079833984375,
-0.0007710456848144531,
-0.0004699230194091797,
-0.0153961181640625,
0.0006265640258789062,
0.040283203125,
0.01788330078125,
0.060394287109375,
0.0294036865234375,
0.0291748046875,
0.04376220703125,
-0.0000349879264831543,
0.03326416015625,
-0.0235748291015625,
-0.0272216796875,
-0.0350341796875,
0.058929443359375,
-0.0242919921875,
-0.031494140625,
0.0711669921875,
0.052886962890625,
0.030731201171875,
-0.0171661376953125,
0.056304931640625,
-0.016876220703125,
0.0241241455078125,
-0.040496826171875,
0.0670166015625,
-0.049896240234375,
0.0005230903625488281,
-0.0299224853515625,
-0.0592041015625,
-0.000629425048828125,
0.0806884765625,
0.00609588623046875,
0.00287628173828125,
0.036041259765625,
0.051910400390625,
-0.0276031494140625,
-0.028900146484375,
0.003612518310546875,
0.002292633056640625,
0.0504150390625,
0.038848876953125,
0.05975341796875,
-0.040496826171875,
0.02349853515625,
-0.061309814453125,
-0.026641845703125,
-0.026092529296875,
-0.059173583984375,
-0.079345703125,
-0.038177490234375,
-0.053985595703125,
-0.057586669921875,
-0.01297760009765625,
0.03814697265625,
0.0736083984375,
-0.052703857421875,
-0.003917694091796875,
-0.05255126953125,
-0.0004379749298095703,
0.0035247802734375,
-0.0197601318359375,
0.024749755859375,
-0.01910400390625,
-0.08056640625,
0.0027256011962890625,
-0.001804351806640625,
0.0438232421875,
-0.024932861328125,
-0.0221405029296875,
0.003376007080078125,
-0.00635528564453125,
0.0131988525390625,
0.057098388671875,
-0.027099609375,
-0.024505615234375,
-0.01197052001953125,
-0.0167236328125,
-0.0012226104736328125,
0.01806640625,
-0.04083251953125,
0.039642333984375,
0.047271728515625,
0.0170745849609375,
0.056396484375,
-0.0174713134765625,
0.00972747802734375,
-0.042388916015625,
0.02508544921875,
0.022918701171875,
0.038970947265625,
0.0112457275390625,
-0.0306243896484375,
0.0261383056640625,
0.0131988525390625,
-0.05560302734375,
-0.05963134765625,
0.004009246826171875,
-0.093994140625,
-0.01788330078125,
0.0810546875,
-0.00643157958984375,
-0.02557373046875,
0.0009021759033203125,
-0.047271728515625,
0.017181396484375,
-0.035736083984375,
0.07305908203125,
0.035675048828125,
-0.029296875,
-0.03326416015625,
-0.0207672119140625,
0.050537109375,
0.006732940673828125,
-0.063720703125,
-0.019561767578125,
0.02996826171875,
0.050323486328125,
0.0325927734375,
0.040069580078125,
0.01202392578125,
0.021392822265625,
0.0010461807250976562,
0.0100860595703125,
0.0162353515625,
0.0037078857421875,
-0.0323486328125,
-0.01837158203125,
-0.0022106170654296875,
-0.0312347412109375
]
] |
ehartford/WizardLM-7B-Uncensored | 2023-05-12T23:12:44.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"uncensored",
"dataset:ehartford/WizardLM_alpaca_evol_instruct_70k_unfiltered",
"license:other",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | ehartford | null | null | ehartford/WizardLM-7B-Uncensored | 348 | 6,740 | transformers | 2023-05-04T20:31:51 | ---
license: other
datasets:
- ehartford/WizardLM_alpaca_evol_instruct_70k_unfiltered
tags:
- uncensored
---
This is WizardLM trained on a subset of the dataset - responses that contained alignment / moralizing were removed. The intent is to train a WizardLM without built-in alignment, so that alignment (of any sort) can be added separately, for example with an RLHF LoRA.
Shout out to the open source AI/ML community, and everyone who helped me out.
Note:
An uncensored model has no guardrails.
You are responsible for anything you do with the model, just as you are responsible for anything you do with any dangerous object such as a knife, gun, lighter, or car.
Publishing anything this model generates is the same as publishing it yourself.
You are responsible for the content you publish, and you cannot blame the model any more than you can blame the knife, gun, lighter, or car for what you do with it. | 937 | [
[
-0.0215606689453125,
-0.0498046875,
0.00110626220703125,
-0.00726318359375,
-0.0308990478515625,
-0.0262603759765625,
0.020904541015625,
-0.023223876953125,
0.0021228790283203125,
0.0791015625,
-0.053741455078125,
-0.03564453125,
-0.035247802734375,
-0.0016717910766601562,
-0.053466796875,
0.12200927734375,
0.0090789794921875,
0.028106689453125,
-0.02813720703125,
-0.0147705078125,
-0.024932861328125,
-0.039154052734375,
-0.01375579833984375,
-0.033203125,
0.042449951171875,
0.005954742431640625,
0.0601806640625,
0.06475830078125,
0.0406494140625,
0.0197296142578125,
0.0055389404296875,
0.00042176246643066406,
-0.062225341796875,
-0.009613037109375,
-0.036163330078125,
0.002712249755859375,
-0.054107666015625,
0.03802490234375,
0.0264434814453125,
0.0460205078125,
-0.0225677490234375,
0.04644775390625,
0.010894775390625,
0.0640869140625,
-0.0743408203125,
-0.011138916015625,
-0.03955078125,
0.017547607421875,
0.0012989044189453125,
0.0019330978393554688,
-0.038543701171875,
-0.03131103515625,
-0.0031566619873046875,
-0.075927734375,
0.01358795166015625,
0.0285491943359375,
0.07049560546875,
0.043609619140625,
-0.04595947265625,
-0.0017881393432617188,
-0.052276611328125,
0.05029296875,
-0.04052734375,
0.0019474029541015625,
0.03741455078125,
0.031646728515625,
-0.01061248779296875,
-0.033905029296875,
-0.036651611328125,
0.021331787109375,
-0.00670623779296875,
0.0134429931640625,
0.01025390625,
0.00524139404296875,
0.0172119140625,
0.011688232421875,
-0.040130615234375,
0.00811767578125,
-0.038421630859375,
-0.0192108154296875,
0.07537841796875,
0.0305633544921875,
0.0153656005859375,
-0.00754547119140625,
-0.04730224609375,
-0.009857177734375,
-0.050262451171875,
0.01421356201171875,
0.04791259765625,
0.028228759765625,
-0.01922607421875,
0.09588623046875,
0.01364898681640625,
0.048187255859375,
0.01293182373046875,
-0.0101776123046875,
0.017303466796875,
-0.003681182861328125,
-0.04412841796875,
0.017669677734375,
0.068359375,
0.052276611328125,
0.038238525390625,
-0.016845703125,
-0.01092529296875,
-0.018096923828125,
0.037628173828125,
-0.049102783203125,
-0.010894775390625,
0.028106689453125,
-0.045654296875,
-0.0355224609375,
0.0003917217254638672,
-0.034027099609375,
-0.057830810546875,
-0.033477783203125,
0.027801513671875,
-0.0206451416015625,
-0.0168914794921875,
0.0164337158203125,
-0.0255126953125,
0.039642333984375,
0.024810791015625,
-0.049224853515625,
0.003261566162109375,
0.048980712890625,
0.023895263671875,
0.0031337738037109375,
-0.03125,
-0.039459228515625,
0.0309906005859375,
-0.0439453125,
0.039093017578125,
-0.0213470458984375,
-0.04547119140625,
0.0055084228515625,
0.0157318115234375,
-0.0011644363403320312,
-0.0197906494140625,
0.041412353515625,
-0.0555419921875,
0.0214691162109375,
-0.0186309814453125,
-0.055877685546875,
-0.0255279541015625,
0.01197052001953125,
-0.043853759765625,
0.038482666015625,
-0.0015583038330078125,
-0.07818603515625,
0.03228759765625,
-0.04412841796875,
-0.011505126953125,
-0.0199127197265625,
-0.01215362548828125,
-0.04156494140625,
-0.015045166015625,
-0.006954193115234375,
-0.004100799560546875,
0.00568389892578125,
0.0340576171875,
-0.046905517578125,
-0.02545166015625,
0.0287933349609375,
-0.0396728515625,
0.094482421875,
0.00836944580078125,
-0.0086822509765625,
0.0079193115234375,
-0.07568359375,
0.00395965576171875,
0.01371002197265625,
-0.023101806640625,
-0.0166778564453125,
-0.0031585693359375,
0.01904296875,
0.019989013671875,
0.030181884765625,
-0.06268310546875,
0.016082763671875,
-0.03607177734375,
-0.0390625,
0.075927734375,
-0.0015277862548828125,
0.04803466796875,
-0.0159912109375,
0.023834228515625,
-0.0005588531494140625,
0.0284576416015625,
0.052520751953125,
-0.0306243896484375,
-0.04571533203125,
-0.0220794677734375,
0.01268768310546875,
0.044342041015625,
-0.038818359375,
0.06396484375,
-0.005218505859375,
-0.044586181640625,
-0.04254150390625,
0.0035877227783203125,
0.036102294921875,
0.06829833984375,
0.0164337158203125,
0.0005197525024414062,
-0.0302276611328125,
-0.08056640625,
-0.0211334228515625,
-0.006031036376953125,
-0.003353118896484375,
-0.02020263671875,
0.0357666015625,
0.01277923583984375,
0.0831298828125,
-0.034515380859375,
-0.0286712646484375,
0.010772705078125,
-0.00101470947265625,
0.0017099380493164062,
0.058502197265625,
0.041473388671875,
-0.049835205078125,
-0.03985595703125,
-0.004283905029296875,
-0.099609375,
0.0014181137084960938,
0.016021728515625,
-0.0276031494140625,
0.0008459091186523438,
0.0008788108825683594,
-0.058197021484375,
0.0748291015625,
0.01540374755859375,
-0.045257568359375,
0.039031982421875,
-0.0198211669921875,
0.0015716552734375,
-0.071044921875,
0.01544952392578125,
-0.01776123046875,
-0.011566162109375,
-0.053466796875,
-0.0081634521484375,
0.00276947021484375,
-0.0276947021484375,
-0.041656494140625,
0.047515869140625,
-0.00231170654296875,
-0.0154266357421875,
-0.04315185546875,
0.00432586669921875,
0.0272674560546875,
0.0380859375,
0.0170745849609375,
0.045623779296875,
0.04571533203125,
-0.04296875,
0.02886962890625,
0.049224853515625,
-0.006809234619140625,
0.055755615234375,
-0.04388427734375,
0.0038661956787109375,
-0.040802001953125,
-0.0015230178833007812,
-0.023284912109375,
-0.015045166015625,
0.048675537109375,
-0.0308074951171875,
0.01751708984375,
-0.01294708251953125,
-0.017669677734375,
-0.01180267333984375,
-0.0225677490234375,
0.016845703125,
0.037445068359375,
-0.03515625,
0.0472412109375,
0.0305328369140625,
0.03558349609375,
-0.07867431640625,
-0.054962158203125,
-0.0450439453125,
-0.0406494140625,
-0.0260772705078125,
-0.0055694580078125,
-0.00732421875,
-0.043701171875,
0.01424407958984375,
-0.003376007080078125,
-0.015899658203125,
0.00353240966796875,
0.032012939453125,
0.039794921875,
-0.0028171539306640625,
0.01120758056640625,
-0.0100555419921875,
-0.00412750244140625,
0.013214111328125,
0.014251708984375,
0.0050811767578125,
0.01568603515625,
-0.050140380859375,
-0.044158935546875,
0.049072265625,
0.0195159912109375,
-0.0308685302734375,
0.06658935546875,
0.045501708984375,
-0.01800537109375,
0.004886627197265625,
-0.006343841552734375,
-0.0093536376953125,
-0.0389404296875,
0.01331329345703125,
0.008758544921875,
-0.049163818359375,
0.03570556640625,
0.0469970703125,
0.040802001953125,
0.0394287109375,
0.0290985107421875,
-0.0280609130859375,
0.08074951171875,
0.046295166015625,
0.00925445556640625,
0.015960693359375,
0.01104736328125,
0.0230712890625,
-0.06085205078125,
-0.037933349609375,
-0.04071044921875,
-0.0133819580078125,
-0.056671142578125,
0.0089263916015625,
0.02325439453125,
0.01611328125,
-0.074951171875,
0.024383544921875,
-0.05511474609375,
0.0311279296875,
0.0282440185546875,
0.00713348388671875,
0.030303955078125,
-0.0008845329284667969,
0.0296630859375,
-0.0006918907165527344,
-0.0350341796875,
-0.03643798828125,
0.0950927734375,
0.00676727294921875,
0.10125732421875,
0.0094451904296875,
0.051300048828125,
0.04351806640625,
0.01531219482421875,
-0.061492919921875,
0.04315185546875,
-0.00501251220703125,
-0.06292724609375,
-0.034149169921875,
-0.026336669921875,
-0.0855712890625,
0.0290679931640625,
-0.0157318115234375,
-0.06671142578125,
0.0164642333984375,
0.020172119140625,
-0.023101806640625,
0.03460693359375,
-0.04119873046875,
0.052978515625,
-0.02117919921875,
-0.020050048828125,
0.002002716064453125,
-0.042236328125,
0.03228759765625,
-0.003894805908203125,
0.00977325439453125,
-0.03253173828125,
0.00902557373046875,
0.07049560546875,
-0.057952880859375,
0.08331298828125,
-0.020965576171875,
-0.010711669921875,
0.036529541015625,
-0.0016727447509765625,
0.0401611328125,
-0.00969696044921875,
0.0101318359375,
-0.018218994140625,
0.016021728515625,
-0.036224365234375,
-0.0355224609375,
0.032135009765625,
-0.08734130859375,
-0.0640869140625,
-0.04241943359375,
-0.046356201171875,
-0.00914764404296875,
0.023468017578125,
0.0190582275390625,
0.031585693359375,
-0.01953125,
-0.007747650146484375,
0.0574951171875,
-0.0011911392211914062,
0.033935546875,
0.041046142578125,
-0.0445556640625,
-0.02154541015625,
0.049468994140625,
0.0087890625,
0.00862884521484375,
-0.010009765625,
0.007274627685546875,
-0.03582763671875,
-0.0171051025390625,
-0.036224365234375,
0.0093994140625,
-0.0811767578125,
-0.0149688720703125,
-0.040374755859375,
-0.0445556640625,
-0.05010986328125,
-0.0107879638671875,
-0.042388916015625,
-0.03314208984375,
-0.039398193359375,
-0.016571044921875,
0.0570068359375,
0.07208251953125,
-0.010894775390625,
0.0204010009765625,
-0.06085205078125,
0.01172637939453125,
0.020111083984375,
0.0008530616760253906,
-0.01328277587890625,
-0.039520263671875,
-0.024261474609375,
0.0090484619140625,
-0.028045654296875,
-0.03369140625,
0.021209716796875,
-0.01041412353515625,
0.0491943359375,
0.038604736328125,
0.04620361328125,
0.04071044921875,
-0.0389404296875,
0.055511474609375,
0.0254974365234375,
-0.05078125,
0.01776123046875,
-0.0292510986328125,
0.00036215782165527344,
0.03753662109375,
0.017791748046875,
-0.00809478759765625,
-0.0218505859375,
-0.046417236328125,
-0.027069091796875,
0.0377197265625,
0.0166168212890625,
0.0163116455078125,
0.01605224609375,
0.01538848876953125,
0.00974273681640625,
0.0276031494140625,
-0.06781005859375,
-0.03521728515625,
-0.054534912109375,
-0.00360870361328125,
0.020782470703125,
0.0088043212890625,
-0.052398681640625,
-0.022857666015625,
0.0692138671875,
-0.004913330078125,
0.0031337738037109375,
0.0094451904296875,
-0.00312042236328125,
-0.01611328125,
0.005275726318359375,
0.025299072265625,
0.0462646484375,
-0.0187530517578125,
-0.01094818115234375,
-0.01561737060546875,
-0.0321044921875,
0.020233154296875,
0.00032067298889160156,
-0.01061248779296875,
-0.0238800048828125,
0.0289459228515625,
0.051727294921875,
-0.0260772705078125,
-0.0274505615234375,
0.0504150390625,
0.006500244140625,
-0.007762908935546875,
-0.038970947265625,
0.01593017578125,
-0.0088043212890625,
0.0199127197265625,
0.0021495819091796875,
0.0216064453125,
0.01555633544921875,
0.0038471221923828125,
0.010894775390625,
0.048492431640625,
-0.0271759033203125,
-0.006511688232421875,
0.06097412109375,
0.006885528564453125,
-0.01546478271484375,
0.050384521484375,
0.0030765533447265625,
0.005130767822265625,
0.051300048828125,
0.033477783203125,
0.046783447265625,
-0.0025348663330078125,
0.03155517578125,
0.0307769775390625,
0.020355224609375,
0.0256500244140625,
0.001617431640625,
0.01500701904296875,
-0.06732177734375,
-0.021575927734375,
-0.03558349609375,
-0.017608642578125,
0.0189361572265625,
-0.08697509765625,
0.0197601318359375,
-0.039093017578125,
-0.0134735107421875,
-0.0093231201171875,
-0.01499176025390625,
-0.031463623046875,
0.0299835205078125,
-0.0167388916015625,
0.0728759765625,
-0.06292724609375,
0.06890869140625,
0.01959228515625,
-0.041839599609375,
-0.05609130859375,
0.00677490234375,
0.0171966552734375,
-0.06744384765625,
0.01641845703125,
0.0033130645751953125,
-0.0124053955078125,
-0.01495361328125,
-0.068359375,
-0.070556640625,
0.07220458984375,
0.03240966796875,
-0.00223541259765625,
-0.0260467529296875,
0.0253143310546875,
0.034027099609375,
-0.0183868408203125,
-0.017608642578125,
0.0040740966796875,
0.034515380859375,
0.0030307769775390625,
-0.055206298828125,
-0.01555633544921875,
-0.005580902099609375,
-0.006195068359375,
-0.00977325439453125,
-0.049896240234375,
0.06939697265625,
0.01210784912109375,
-0.009429931640625,
0.021240234375,
0.034820556640625,
0.019927978515625,
-0.0013790130615234375,
0.0092620849609375,
0.031646728515625,
0.060455322265625,
0.01399993896484375,
0.08343505859375,
0.004299163818359375,
0.04266357421875,
0.11102294921875,
-0.04644775390625,
0.033966064453125,
0.04852294921875,
0.0034236907958984375,
0.0157928466796875,
0.06396484375,
-0.0123291015625,
0.06475830078125,
-0.0005383491516113281,
-0.0102996826171875,
-0.0316162109375,
-0.027069091796875,
-0.04058837890625,
0.04669189453125,
-0.00067901611328125,
-0.031768798828125,
-0.038909912109375,
0.01284027099609375,
0.01200103759765625,
0.01122283935546875,
-0.0386962890625,
0.065673828125,
0.00476837158203125,
-0.0293121337890625,
0.044281005859375,
-0.0203857421875,
0.0269317626953125,
-0.04351806640625,
0.0093231201171875,
-0.0004246234893798828,
0.003398895263671875,
-0.02508544921875,
-0.0697021484375,
0.041168212890625,
0.004016876220703125,
-0.0236358642578125,
-0.00055694580078125,
0.0562744140625,
-0.03033447265625,
-0.044586181640625,
0.02410888671875,
0.0426025390625,
0.0117340087890625,
0.0290069580078125,
-0.05987548828125,
-0.026580810546875,
-0.006565093994140625,
-0.0440673828125,
0.034912109375,
0.0215911865234375,
-0.0026874542236328125,
0.05859375,
0.03753662109375,
-0.016265869140625,
-0.00917816162109375,
0.003948211669921875,
0.06842041015625,
-0.03369140625,
-0.0104827880859375,
-0.05340576171875,
0.0504150390625,
-0.0225067138671875,
-0.011260986328125,
0.0582275390625,
0.048675537109375,
0.0380859375,
-0.0214691162109375,
0.055206298828125,
0.0006442070007324219,
0.02264404296875,
-0.050537109375,
0.08062744140625,
-0.04571533203125,
-0.002483367919921875,
0.0036220550537109375,
-0.040374755859375,
-0.007534027099609375,
0.031585693359375,
-0.0186920166015625,
-0.0080108642578125,
0.0308074951171875,
0.061126708984375,
-0.001064300537109375,
0.0003044605255126953,
0.044769287109375,
-0.02154541015625,
0.0038356781005859375,
0.0008997917175292969,
0.052734375,
-0.0168609619140625,
0.041534423828125,
-0.0290679931640625,
-0.0017347335815429688,
0.016082763671875,
-0.061614990234375,
-0.09722900390625,
-0.021820068359375,
-0.0252685546875,
-0.05291748046875,
-0.0167694091796875,
0.06365966796875,
0.049224853515625,
-0.05352783203125,
-0.01047515869140625,
0.0005559921264648438,
0.01496124267578125,
-0.00434112548828125,
-0.01483154296875,
0.031524658203125,
0.00926971435546875,
-0.04052734375,
0.021392822265625,
-0.00540924072265625,
0.03228759765625,
-0.034393310546875,
-0.0066070556640625,
-0.01418304443359375,
0.01056671142578125,
0.014312744140625,
0.01416778564453125,
-0.0372314453125,
-0.050140380859375,
-0.0100250244140625,
-0.020294189453125,
0.026824951171875,
0.029937744140625,
-0.02935791015625,
0.02813720703125,
0.0171051025390625,
0.0279388427734375,
0.0195465087890625,
0.007778167724609375,
0.048004150390625,
-0.0638427734375,
0.03131103515625,
0.0098876953125,
0.0299835205078125,
0.0256195068359375,
-0.0645751953125,
0.038665771484375,
0.0102386474609375,
-0.053375244140625,
-0.0292205810546875,
0.024139404296875,
-0.051788330078125,
-0.0127105712890625,
0.064697265625,
-0.021575927734375,
-0.046539306640625,
-0.01313018798828125,
-0.018890380859375,
0.031707763671875,
-0.021575927734375,
0.03985595703125,
0.035980224609375,
-0.01363372802734375,
0.0048065185546875,
-0.035003662109375,
0.049163818359375,
0.0035228729248046875,
-0.054107666015625,
0.0166778564453125,
0.04840087890625,
0.0321044921875,
0.00971221923828125,
0.038818359375,
-0.01293182373046875,
0.016204833984375,
0.00896453857421875,
0.024932861328125,
-0.006717681884765625,
0.0008459091186523438,
-0.02484130859375,
0.012298583984375,
-0.0036029815673828125,
-0.03253173828125
]
] |
ydshieh/kosmos-2-patch14-224 | 2023-11-02T16:42:01.000Z | [
"transformers",
"pytorch",
"kosmos-2",
"feature-extraction",
"custom_code",
"has_space",
"region:us"
] | feature-extraction | ydshieh | null | null | ydshieh/kosmos-2-patch14-224 | 54 | 6,740 | transformers | 2023-07-29T17:44:41 | ---
# For reference on model card metadata, see the spec: https://github.com/huggingface/hub-docs/blob/main/modelcard.md?plain=1
# Doc / guide: https://huggingface.co/docs/hub/model-cards
{}
---
# Kosmos-2: Grounding Multimodal Large Language Models to the World
**This model (remote code on the Hub) is deprecated. Please use https://huggingface.co/microsoft/kosmos-2-patch14-224**
**There are some changes in terms of input formats: see the model card in https://huggingface.co/microsoft/kosmos-2-patch14-224**
~~**(There is an ongoing effort to port `Kosmos-2` directly into `transformers`. This repository (remote code) might need some more bug fixes later, including breaking changes.)**~~
<a href="https://huggingface.co/ydshieh/kosmos-2-patch14-224/resolve/main/annotated_snowman.jpg" target="_blank"><figure><img src="https://huggingface.co/ydshieh/kosmos-2-patch14-224/resolve/main/annotated_snowman.jpg" width="384"><figcaption><b>[An image of a snowman warming himself by a fire.]</b></figcaption></figure></a>
This Hub repository contains a Hugging Face `transformers` implementation of [the original Kosmos-2 model](https://github.com/microsoft/unilm/tree/master/kosmos-2) from Microsoft.
## How to Get Started with the Model
Use the code below to get started with the model.
```python
import requests
from PIL import Image
from transformers import AutoProcessor, AutoModelForVision2Seq
model = AutoModelForVision2Seq.from_pretrained("ydshieh/kosmos-2-patch14-224", trust_remote_code=True)
processor = AutoProcessor.from_pretrained("ydshieh/kosmos-2-patch14-224", trust_remote_code=True)
prompt = "<grounding>An image of"
url = "https://huggingface.co/ydshieh/kosmos-2-patch14-224/resolve/main/snowman.png"
image = Image.open(requests.get(url, stream=True).raw)
# The original Kosmos-2 demo saves the image first then reload it. For some images, this will give slightly different image input and change the generation outputs.
# Uncomment the following 2 lines if you want to match the original demo's outputs.
# (One example is the `two_dogs.jpg` from the demo)
# image.save("new_image.jpg")
# image = Image.open("new_image.jpg")
inputs = processor(text=prompt, images=image, return_tensors="pt")
generated_ids = model.generate(
pixel_values=inputs["pixel_values"],
input_ids=inputs["input_ids"][:, :-1],
attention_mask=inputs["attention_mask"][:, :-1],
img_features=None,
img_attn_mask=inputs["img_attn_mask"][:, :-1],
use_cache=True,
max_new_tokens=64,
)
generated_text = processor.batch_decode(generated_ids, skip_special_tokens=True)[0]
# Specify `cleanup_and_extract=False` in order to see the raw model generation.
processed_text = processor.post_process_generation(generated_text, cleanup_and_extract=False)
print(processed_text)
# `<grounding> An image of<phrase> a snowman</phrase><object><patch_index_0044><patch_index_0863></object> warming himself by<phrase> a fire</phrase><object><patch_index_0005><patch_index_0911></object>.`
# By default, the generated text is cleaned up and the entities are extracted.
processed_text, entities = processor.post_process_generation(generated_text)
print(processed_text)
# `An image of a snowman warming himself by a fire.`
print(entities)
# `[('a snowman', (12, 21), [(0.390625, 0.046875, 0.984375, 0.828125)]), ('a fire', (41, 47), [(0.171875, 0.015625, 0.484375, 0.890625)])]`
```
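The bounding-box coordinates in `entities` are normalized fractions of the image width and height. A minimal sketch of converting them to absolute pixel coordinates (the `to_pixel_boxes` helper below is illustrative only, not part of the processor API):

```python
# Convert the normalized (x1, y1, x2, y2) boxes in an entity list to pixel
# coordinates for a given image size. The tuple layout
# (name, (start, end), [boxes]) follows the example output above.
def to_pixel_boxes(entities, image_w, image_h):
    pixel_entities = []
    for name, span, boxes in entities:
        pixel_boxes = [
            (int(x1 * image_w), int(y1 * image_h), int(x2 * image_w), int(y2 * image_h))
            for (x1, y1, x2, y2) in boxes
        ]
        pixel_entities.append((name, span, pixel_boxes))
    return pixel_entities

entities = [('a snowman', (12, 21), [(0.390625, 0.046875, 0.984375, 0.828125)])]
print(to_pixel_boxes(entities, 640, 640))
# -> [('a snowman', (12, 21), [(250, 30, 630, 530)])]
```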
## Draw the bounding boxes of the entities on the image
Once you have the `entities`, you can use the following helper function to draw their bounding boxes on the image:
```python
import cv2
import numpy as np
import os
import requests
import torch
import torchvision.transforms as T
from PIL import Image
def is_overlapping(rect1, rect2):
x1, y1, x2, y2 = rect1
x3, y3, x4, y4 = rect2
return not (x2 < x3 or x1 > x4 or y2 < y3 or y1 > y4)
def draw_entity_boxes_on_image(image, entities, show=False, save_path=None):
    """Draw the bounding boxes of the detected entities on the image.

    Args:
        image: a PIL image, an image path, or an image tensor.
        entities: the entity list returned by `processor.post_process_generation`.
    """
if isinstance(image, Image.Image):
image_h = image.height
image_w = image.width
image = np.array(image)[:, :, [2, 1, 0]]
elif isinstance(image, str):
if os.path.exists(image):
pil_img = Image.open(image).convert("RGB")
image = np.array(pil_img)[:, :, [2, 1, 0]]
image_h = pil_img.height
image_w = pil_img.width
else:
            raise ValueError(f"invalid image path, {image}")
elif isinstance(image, torch.Tensor):
image_tensor = image.cpu()
reverse_norm_mean = torch.tensor([0.48145466, 0.4578275, 0.40821073])[:, None, None]
reverse_norm_std = torch.tensor([0.26862954, 0.26130258, 0.27577711])[:, None, None]
image_tensor = image_tensor * reverse_norm_std + reverse_norm_mean
pil_img = T.ToPILImage()(image_tensor)
image_h = pil_img.height
image_w = pil_img.width
image = np.array(pil_img)[:, :, [2, 1, 0]]
else:
        raise ValueError(f"invalid image format, {type(image)} for {image}")
if len(entities) == 0:
return image
new_image = image.copy()
previous_bboxes = []
# size of text
text_size = 1
# thickness of text
text_line = 1 # int(max(1 * min(image_h, image_w) / 512, 1))
box_line = 3
(c_width, text_height), _ = cv2.getTextSize("F", cv2.FONT_HERSHEY_COMPLEX, text_size, text_line)
base_height = int(text_height * 0.675)
text_offset_original = text_height - base_height
text_spaces = 3
for entity_name, (start, end), bboxes in entities:
for (x1_norm, y1_norm, x2_norm, y2_norm) in bboxes:
orig_x1, orig_y1, orig_x2, orig_y2 = int(x1_norm * image_w), int(y1_norm * image_h), int(x2_norm * image_w), int(y2_norm * image_h)
# draw bbox
# random color
color = tuple(np.random.randint(0, 255, size=3).tolist())
new_image = cv2.rectangle(new_image, (orig_x1, orig_y1), (orig_x2, orig_y2), color, box_line)
l_o, r_o = box_line // 2 + box_line % 2, box_line // 2 + box_line % 2 + 1
x1 = orig_x1 - l_o
y1 = orig_y1 - l_o
if y1 < text_height + text_offset_original + 2 * text_spaces:
y1 = orig_y1 + r_o + text_height + text_offset_original + 2 * text_spaces
x1 = orig_x1 + r_o
# add text background
(text_width, text_height), _ = cv2.getTextSize(f" {entity_name}", cv2.FONT_HERSHEY_COMPLEX, text_size, text_line)
text_bg_x1, text_bg_y1, text_bg_x2, text_bg_y2 = x1, y1 - (text_height + text_offset_original + 2 * text_spaces), x1 + text_width, y1
for prev_bbox in previous_bboxes:
while is_overlapping((text_bg_x1, text_bg_y1, text_bg_x2, text_bg_y2), prev_bbox):
text_bg_y1 += (text_height + text_offset_original + 2 * text_spaces)
text_bg_y2 += (text_height + text_offset_original + 2 * text_spaces)
y1 += (text_height + text_offset_original + 2 * text_spaces)
if text_bg_y2 >= image_h:
text_bg_y1 = max(0, image_h - (text_height + text_offset_original + 2 * text_spaces))
text_bg_y2 = image_h
y1 = image_h
break
alpha = 0.5
for i in range(text_bg_y1, text_bg_y2):
for j in range(text_bg_x1, text_bg_x2):
if i < image_h and j < image_w:
if j < text_bg_x1 + 1.35 * c_width:
# original color
bg_color = color
else:
# white
bg_color = [255, 255, 255]
new_image[i, j] = (alpha * new_image[i, j] + (1 - alpha) * np.array(bg_color)).astype(np.uint8)
cv2.putText(
new_image, f" {entity_name}", (x1, y1 - text_offset_original - 1 * text_spaces), cv2.FONT_HERSHEY_COMPLEX, text_size, (0, 0, 0), text_line, cv2.LINE_AA
)
previous_bboxes.append((text_bg_x1, text_bg_y1, text_bg_x2, text_bg_y2))
pil_image = Image.fromarray(new_image[:, :, [2, 1, 0]])
if save_path:
pil_image.save(save_path)
if show:
pil_image.show()
return new_image
# (The same image from the previous code example)
url = "https://huggingface.co/ydshieh/kosmos-2-patch14-224/resolve/main/snowman.jpg"
image = Image.open(requests.get(url, stream=True).raw)
# From the previous code example
entities = [('a snowman', (12, 21), [(0.390625, 0.046875, 0.984375, 0.828125)]), ('a fire', (41, 47), [(0.171875, 0.015625, 0.484375, 0.890625)])]
# Draw the bounding bboxes
draw_entity_boxes_on_image(image, entities, show=True)
```
Here is the annotated image:
<a href="https://huggingface.co/ydshieh/kosmos-2-patch14-224/resolve/main/annotated_snowman.jpg" target="_blank"><img src="https://huggingface.co/ydshieh/kosmos-2-patch14-224/resolve/main/annotated_snowman.jpg" width="500"></a>
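The bounding boxes in `entities` are normalized to the `[0, 1]` range, so they must be scaled by the image size before drawing, exactly as the loop above does with `orig_x1 = int(x1_norm * image_w)` and friends. A minimal standalone sketch of that conversion (the 640×480 size here is only an illustrative assumption):

```python
def to_pixel_box(norm_box, image_w, image_h):
    """Convert a normalized (x1, y1, x2, y2) box to integer pixel coordinates."""
    x1, y1, x2, y2 = norm_box
    return int(x1 * image_w), int(y1 * image_h), int(x2 * image_w), int(y2 * image_h)

# The snowman box from the example entities, on a hypothetical 640x480 image
print(to_pixel_box((0.390625, 0.046875, 0.984375, 0.828125), 640, 480))
# → (250, 22, 630, 397)
```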
## Tasks
This model can perform different tasks simply by changing the prompt.
First, let's define a function to run a prompt.
```python
import requests
from PIL import Image
from transformers import AutoProcessor, AutoModelForVision2Seq
model = AutoModelForVision2Seq.from_pretrained("ydshieh/kosmos-2-patch14-224", trust_remote_code=True)
processor = AutoProcessor.from_pretrained("ydshieh/kosmos-2-patch14-224", trust_remote_code=True)
url = "https://huggingface.co/ydshieh/kosmos-2-patch14-224/resolve/main/snowman.png"
image = Image.open(requests.get(url, stream=True).raw)
def run_example(prompt):
inputs = processor(text=prompt, images=image, return_tensors="pt")
generated_ids = model.generate(
pixel_values=inputs["pixel_values"],
input_ids=inputs["input_ids"][:, :-1],
attention_mask=inputs["attention_mask"][:, :-1],
img_features=None,
img_attn_mask=inputs["img_attn_mask"][:, :-1],
use_cache=True,
max_new_tokens=64,
)
generated_text = processor.batch_decode(generated_ids, skip_special_tokens=True)[0]
_processed_text = processor.post_process_generation(generated_text, cleanup_and_extract=False)
processed_text, entities = processor.post_process_generation(generated_text)
print(processed_text)
print(entities)
print(_processed_text)
```
Here are the tasks `Kosmos-2` can perform:
### Multimodal Grounding
#### • Phrase Grounding
```python
prompt = "<grounding><phrase> a snowman</phrase>"
run_example(prompt)
# a snowman is warming himself by the fire
# [('a snowman', (0, 9), [(0.390625, 0.046875, 0.984375, 0.828125)]), ('the fire', (32, 40), [(0.203125, 0.015625, 0.453125, 0.859375)])]
# <grounding><phrase> a snowman</phrase><object><patch_index_0044><patch_index_0863></object> is warming himself by<phrase> the fire</phrase><object><patch_index_0006><patch_index_0878></object>
```
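In the raw output above, each box is serialized as a pair of `<patch_index_XXXX>` tokens naming the top-left and bottom-right cells of a 32×32 grid over the image; `processor.post_process_generation` decodes these into the normalized boxes shown in the second comment line. As a rough illustration only (a simplified stand-in for the processor's own parsing, assuming the 32×32 grid used by this checkpoint), the indices can be decoded like this:

```python
import re

def parse_grounding(text, num_patches=32):
    # Simplified parser for "<phrase> name</phrase><object><patch_index_AAAA><patch_index_BBBB></object>"
    pattern = r"<phrase>(.*?)</phrase><object><patch_index_(\d{4})><patch_index_(\d{4})></object>"
    entities = []
    for name, tl, br in re.findall(pattern, text):
        r1, c1 = divmod(int(tl), num_patches)   # top-left grid cell
        r2, c2 = divmod(int(br), num_patches)   # bottom-right grid cell
        # Use the center of each cell as the normalized corner coordinate
        box = ((c1 + 0.5) / num_patches, (r1 + 0.5) / num_patches,
               (c2 + 0.5) / num_patches, (r2 + 0.5) / num_patches)
        entities.append((name.strip(), box))
    return entities

raw = ("<grounding><phrase> a snowman</phrase><object><patch_index_0044>"
       "<patch_index_0863></object> is warming himself by<phrase> the fire"
       "</phrase><object><patch_index_0006><patch_index_0878></object>")
print(parse_grounding(raw))
# → [('a snowman', (0.390625, 0.046875, 0.984375, 0.828125)),
#    ('the fire', (0.203125, 0.015625, 0.453125, 0.859375))]
```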
#### • Referring Expression Comprehension
```python
prompt = "<grounding><phrase> a snowman next to a fire</phrase>"
run_example(prompt)
# a snowman next to a fire
# [('a snowman next to a fire', (0, 24), [(0.390625, 0.046875, 0.984375, 0.828125)])]
# <grounding><phrase> a snowman next to a fire</phrase><object><patch_index_0044><patch_index_0863></object>
```
### Multimodal Referring
#### • Referring Expression Generation
```python
prompt = "<grounding><phrase> It</phrase><object><patch_index_0044><patch_index_0863></object> is"
run_example(prompt)
# It is snowman in a hat and scarf
# [('It', (0, 2), [(0.390625, 0.046875, 0.984375, 0.828125)])]
# <grounding><phrase> It</phrase><object><patch_index_0044><patch_index_0863></object> is snowman in a hat and scarf
```
### Perception-Language Tasks
#### • Grounded VQA
```python
prompt = "<grounding> Question: What is special about this image? Answer:"
run_example(prompt)
# Question: What is special about this image? Answer: The image features a snowman sitting by a campfire in the snow.
# [('a snowman', (71, 80), [(0.390625, 0.046875, 0.984375, 0.828125)]), ('a campfire', (92, 102), [(0.109375, 0.640625, 0.546875, 0.984375)])]
# <grounding> Question: What is special about this image? Answer: The image features<phrase> a snowman</phrase><object><patch_index_0044><patch_index_0863></object> sitting by<phrase> a campfire</phrase><object><patch_index_0643><patch_index_1009></object> in the snow.
```
#### • Grounded VQA with multimodal referring via bounding boxes
```python
prompt = "<grounding> Question: Where is<phrase> the fire</phrase><object><patch_index_0005><patch_index_0911></object> next to? Answer:"
run_example(prompt)
# Question: Where is the fire next to? Answer: Near the snowman.
# [('the fire', (19, 27), [(0.171875, 0.015625, 0.484375, 0.890625)]), ('the snowman', (50, 61), [(0.390625, 0.046875, 0.984375, 0.828125)])]
# <grounding> Question: Where is<phrase> the fire</phrase><object><patch_index_0005><patch_index_0911></object> next to? Answer: Near<phrase> the snowman</phrase><object><patch_index_0044><patch_index_0863></object>.
```
### Grounded Image Captioning
#### • Brief
```python
prompt = "<grounding> An image of"
run_example(prompt)
# An image of a snowman warming himself by a campfire.
# [('a snowman', (12, 21), [(0.390625, 0.046875, 0.984375, 0.828125)]), ('a campfire', (41, 51), [(0.109375, 0.640625, 0.546875, 0.984375)])]
# <grounding> An image of<phrase> a snowman</phrase><object><patch_index_0044><patch_index_0863></object> warming himself by<phrase> a campfire</phrase><object><patch_index_0643><patch_index_1009></object>.
```
#### • Detailed
```python
prompt = "<grounding> Describe this image in detail:"
run_example(prompt)
# Describe this image in detail: The image features a snowman sitting by a campfire in the snow. He is wearing a hat, scarf, and gloves, with a pot nearby and a cup
# [('a campfire', (71, 81), [(0.171875, 0.015625, 0.484375, 0.984375)]), ('a hat', (109, 114), [(0.515625, 0.046875, 0.828125, 0.234375)]), ('scarf', (116, 121), [(0.515625, 0.234375, 0.890625, 0.578125)]), ('gloves', (127, 133), [(0.515625, 0.390625, 0.640625, 0.515625)]), ('a pot', (140, 145), [(0.078125, 0.609375, 0.265625, 0.859375)])]
# <grounding> Describe this image in detail: The image features a snowman sitting by<phrase> a campfire</phrase><object><patch_index_0005><patch_index_1007></object> in the snow. He is wearing<phrase> a hat</phrase><object><patch_index_0048><patch_index_0250></object>,<phrase> scarf</phrase><object><patch_index_0240><patch_index_0604></object>, and<phrase> gloves</phrase><object><patch_index_0400><patch_index_0532></object>, with<phrase> a pot</phrase><object><patch_index_0610><patch_index_0872></object> nearby and<phrase> a cup</phrase><object>
```
## Running the Flask Server
_flask_kosmos2.py_ shows the implementation of a Flask server for the model.
It allows the model to be accessed as a REST API.
After starting the server, you can send a POST request to `http://localhost:8005/process_prompt` with the following form data:
- `prompt`: For example `<grounding> an image of`
- `image`: The image file as binary data
This in turn will produce a reply with the following JSON format:
- `message`: The Kosmos-2 generated text
- `entities`: The extracted entities
An easy way to test this is through an application like Postman. Make sure the image field is set to `File`.
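As an alternative to Postman, the endpoint can also be called from Python; here is a minimal client sketch (the `snowman.jpg` path in the commented usage line is a placeholder for any local image, and the Flask server below must already be running):

```python
import requests

def query_kosmos_server(prompt, image_path, url="http://localhost:8005/process_prompt"):
    """POST a prompt and an image file to the Flask endpoint and return the parsed JSON reply."""
    with open(image_path, "rb") as f:
        # The form-data field names must match what the server reads:
        # request.form["prompt"] and request.files["image"]
        resp = requests.post(url, data={"prompt": prompt}, files={"image": f})
    resp.raise_for_status()
    return resp.json()

# Example usage (requires a running server and a local image):
# result = query_kosmos_server("<grounding> an image of", "snowman.jpg")
# print(result["message"], result["entities"])
```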
```python
from PIL import Image
from transformers import AutoProcessor, AutoModelForVision2Seq
from flask import Flask, request, jsonify
import json
app = Flask(__name__)
model = AutoModelForVision2Seq.from_pretrained("ydshieh/kosmos-2-patch14-224", trust_remote_code=True)
processor = AutoProcessor.from_pretrained("ydshieh/kosmos-2-patch14-224", trust_remote_code=True)
@app.route('/process_prompt', methods=['POST'])
def process_prompt():
try:
# Get the uploaded image data from the POST request
uploaded_file = request.files['image']
prompt = request.form.get('prompt')
image = Image.open(uploaded_file.stream)
print(image.size)
inputs = processor(text=prompt, images=image, return_tensors="pt")
generated_ids = model.generate(
pixel_values=inputs["pixel_values"],
input_ids=inputs["input_ids"][:, :-1],
attention_mask=inputs["attention_mask"][:, :-1],
img_features=None,
img_attn_mask=inputs["img_attn_mask"][:, :-1],
use_cache=True,
max_new_tokens=64,
)
generated_text = processor.batch_decode(generated_ids, skip_special_tokens=True)[0]
        # By default, the generated text is cleaned up and the entities are extracted.
processed_text, entities = processor.post_process_generation(generated_text)
parsed_entities = entities_to_json(entities)
print(generated_text)
print(processed_text)
return jsonify({"message": processed_text, 'entities': parsed_entities})
except Exception as e:
return jsonify({"error": str(e)})
def entities_to_json(entities):
result = []
for e in entities:
label = e[0]
box_coords = e[1]
box_size = e[2][0]
entity_result = {
"label": label,
"boundingBoxPosition": {"x": box_coords[0], "y": box_coords[1]},
"boundingBox": {"x_min": box_size[0], "y_min": box_size[1], "x_max": box_size[2], "y_max": box_size[3]}
}
print(entity_result)
result.append(entity_result)
return result
if __name__ == '__main__':
app.run(host='localhost', port=8005)
``` | 17,854 | [
[
… 768-dimensional embedding vector omitted for readability …
]
] |
FlagAlpha/Llama2-Chinese-13b-Chat | 2023-09-11T13:24:23.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"question-answering",
"zh",
"en",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | question-answering | FlagAlpha | null | null | FlagAlpha/Llama2-Chinese-13b-Chat | 227 | 6,739 | transformers | 2023-07-24T12:10:46 | ---
developers: [https://huggingface.co/FlagAlphaAI]
license: apache-2.0
language:
- zh
- en
pipeline_tag: question-answering
library_name: transformers
---
# Llama2 Chinese Community
---
## Llama2 Chinese Fine-tuned Weights
Because Llama2's own Chinese alignment is relatively weak, we fine-tuned meta-llama/Llama-2-13b-chat-hf with LoRA on a Chinese instruction dataset, giving it strong Chinese conversational ability.
🎯 **This version merges the Chinese LoRA fine-tuned weights FlagAlpha/Llama2-Chinese-13b-Chat-LoRA with the meta-llama/Llama-2-13b-chat-hf base weights, and can be used directly.**
---
## 🚀 Community Links:
Github: [**Llama2-Chinese**](https://github.com/FlagAlpha/Llama2-Chinese)
Online demo: [**llama.family**](https://llama.family/)
## 🔥 About the Community
Welcome to the Llama2 Chinese community!
We are an advanced technical community focused on optimizing the Llama2 model for Chinese and building applications on top of it.
**Starting from pretraining on large-scale Chinese data, we continuously iterate on and upgrade Llama2's Chinese capabilities.**
We warmly welcome developers and researchers who are passionate about large language models (LLMs) to join us.
## 🐼 Community Resources
- Try Llama2 online at [**llama.family**](https://llama.family/), including both Meta's original models and the Chinese fine-tuned versions!
- [Chinese Q&A capability evaluation](https://github.com/FlagAlpha/Llama2-Chinese/tree/main#-%E6%A8%A1%E5%9E%8B%E8%AF%84%E6%B5%8B) of the Llama2 Chat models!
- [Community Feishu knowledge base](https://chinesellama.feishu.cn/wiki/space/7257824476874768388?ccm_open_type=lark_wiki_spaceLink), everyone is welcome to help build it!
| 1,000 | [
[
… 768-dimensional embedding vector omitted for readability (array truncated at chunk boundary) …
0.041900634765625,
0.005664825439453125,
0.01325225830078125,
0.0171661376953125,
-0.04046630859375,
-0.021759033203125,
0.005367279052734375,
-0.033355712890625,
-0.07415771484375,
0.0299530029296875,
-0.0026378631591796875,
0.03900146484375,
0.047332763671875,
-0.005710601806640625,
0.05804443359375,
-0.0209503173828125,
0.0716552734375,
0.0277557373046875,
-0.06121826171875,
0.050079345703125,
-0.028533935546875,
-0.006439208984375,
0.01509857177734375,
0.0156097412109375,
-0.036224365234375,
-0.00836944580078125,
-0.0223388671875,
-0.07086181640625,
0.0579833984375,
0.0265960693359375,
0.017120361328125,
0.00778961181640625,
0.005764007568359375,
0.0008258819580078125,
0.0090484619140625,
-0.068603515625,
-0.0494384765625,
-0.01300048828125,
0.0131683349609375,
0.0175018310546875,
-0.057403564453125,
-0.01678466796875,
-0.01511383056640625,
0.05133056640625,
0.009979248046875,
0.039947509765625,
0.0081787109375,
0.024322509765625,
-0.03411865234375,
0.009490966796875,
0.038787841796875,
0.03167724609375,
-0.01045989990234375,
-0.01021575927734375,
0.03192138671875,
-0.0445556640625,
0.01351165771484375,
-0.0173797607421875,
-0.0281982421875,
0.00948333740234375,
0.047943115234375,
0.053192138671875,
0.006595611572265625,
-0.0390625,
0.033203125,
0.0119476318359375,
-0.0122833251953125,
-0.056060791015625,
0.01474761962890625,
0.0232391357421875,
0.035858154296875,
0.05157470703125,
-0.0184478759765625,
-0.00873565673828125,
-0.03643798828125,
-0.005176544189453125,
0.031768798828125,
0.00336456298828125,
-0.017547607421875,
0.037933349609375,
0.028106689453125,
-0.00670623779296875,
0.0126953125,
-0.019744873046875,
-0.056732177734375,
0.0738525390625,
0.046905517578125,
0.0582275390625,
-0.00911712646484375,
0.005847930908203125,
0.0616455078125,
0.047393798828125,
0.002857208251953125,
0.052337646484375,
-0.0089569091796875,
-0.035919189453125,
-0.01525115966796875,
-0.042022705078125,
-0.02752685546875,
0.0193939208984375,
-0.0158233642578125,
0.02874755859375,
-0.047607421875,
-0.007137298583984375,
-0.037384033203125,
0.0291748046875,
-0.01690673828125,
-0.006969451904296875,
-0.0010213851928710938,
0.0750732421875,
-0.040985107421875,
0.066650390625,
0.02679443359375,
-0.0281219482421875,
-0.06561279296875,
0.00272369384765625,
0.00858306884765625,
-0.0738525390625,
0.05462646484375,
0.01129150390625,
-0.01012420654296875,
-0.01047515869140625,
-0.043121337890625,
-0.101806640625,
0.1151123046875,
-0.0012111663818359375,
-0.04412841796875,
0.0192718505859375,
0.0135650634765625,
0.02685546875,
-0.01262664794921875,
0.0246734619140625,
0.018310546875,
0.0687255859375,
0.011444091796875,
-0.0853271484375,
0.0240631103515625,
-0.030487060546875,
-0.0027179718017578125,
-0.00926971435546875,
-0.10101318359375,
0.0599365234375,
-0.0194549560546875,
-0.03240966796875,
0.030914306640625,
0.05517578125,
0.04248046875,
0.030914306640625,
0.023345947265625,
0.032379150390625,
0.031951904296875,
-0.0172576904296875,
0.033660888671875,
-0.0196685791015625,
0.03228759765625,
0.0517578125,
-0.019744873046875,
0.06597900390625,
0.0266265869140625,
-0.056610107421875,
0.0604248046875,
0.05865478515625,
-0.0227203369140625,
0.038177490234375,
-0.00872039794921875,
-0.024200439453125,
0.0125579833984375,
-0.0110626220703125,
-0.06829833984375,
0.01898193359375,
0.03289794921875,
-0.005130767822265625,
-0.00847625732421875,
-0.039459228515625,
0.022705078125,
-0.0226898193359375,
-0.009002685546875,
0.051055908203125,
0.0033702850341796875,
-0.028594970703125,
0.05828857421875,
0.023712158203125,
0.073486328125,
-0.0499267578125,
-0.00766754150390625,
-0.026397705078125,
-0.008880615234375,
-0.044647216796875,
-0.05419921875,
0.01971435546875,
0.01413726806640625,
-0.0063629150390625,
0.0189208984375,
0.041259765625,
-0.0031032562255859375,
-0.027099609375,
0.05126953125,
0.0285491943359375,
0.0273895263671875,
0.03436279296875,
-0.065673828125,
0.01526641845703125,
0.027862548828125,
-0.0526123046875,
0.0262451171875,
0.019989013671875,
-0.0178985595703125,
0.05126953125,
0.06781005859375,
0.0013980865478515625,
-0.0037078857421875,
-0.0156097412109375,
0.07781982421875,
-0.042816162109375,
-0.026397705078125,
-0.06988525390625,
0.03515625,
0.0051116943359375,
-0.027374267578125,
0.0482177734375,
0.04156494140625,
0.047149658203125,
-0.005069732666015625,
0.0548095703125,
-0.0250396728515625,
0.037567138671875,
-0.018402099609375,
0.055206298828125,
-0.041900634765625,
0.023193359375,
-0.0170440673828125,
-0.05865478515625,
-0.0102081298828125,
0.04803466796875,
0.0124664306640625,
0.0106353759765625,
0.056610107421875,
0.0670166015625,
0.0189361572265625,
-0.0238037109375,
0.0015516281127929688,
0.024322509765625,
0.028411865234375,
0.08184814453125,
0.04644775390625,
-0.04876708984375,
0.0482177734375,
-0.0251617431640625,
-0.0170135498046875,
-0.040740966796875,
-0.06024169921875,
-0.0654296875,
-0.02838134765625,
-0.01485443115234375,
-0.01418304443359375,
-0.006961822509765625,
0.0552978515625,
0.030853271484375,
-0.065185546875,
-0.038787841796875,
0.011474609375,
0.033966064453125,
-0.00931549072265625,
-0.01152801513671875,
0.0570068359375,
0.01418304443359375,
-0.0440673828125,
0.0225982666015625,
0.031829833984375,
0.00510406494140625,
-0.013702392578125,
-0.0232696533203125,
-0.019683837890625,
-0.0038013458251953125,
0.034576416015625,
0.034759521484375,
-0.080078125,
-0.007625579833984375,
-0.0199432373046875,
-0.033538818359375,
0.0014896392822265625,
0.0088958740234375,
-0.035369873046875,
-0.0122222900390625,
0.047088623046875,
0.0006656646728515625,
0.0202789306640625,
-0.0108642578125,
-0.0025177001953125,
-0.0269927978515625,
0.0302581787109375,
-0.0204010009765625,
0.038665771484375,
-0.0079193115234375,
-0.0260772705078125,
0.06500244140625,
0.03253173828125,
-0.0261077880859375,
-0.04412841796875,
0.0196380615234375,
-0.0931396484375,
-0.01885986328125,
0.1014404296875,
-0.0167236328125,
-0.01263427734375,
-0.0005021095275878906,
-0.052154541015625,
0.04583740234375,
-0.0271148681640625,
0.053924560546875,
0.031005859375,
0.01053619384765625,
0.007091522216796875,
-0.0340576171875,
0.017578125,
0.017974853515625,
-0.06280517578125,
-0.0194549560546875,
0.00922393798828125,
0.0009737014770507812,
0.021820068359375,
0.052581787109375,
-0.0048828125,
0.0231781005859375,
-0.0285797119140625,
0.00894927978515625,
-0.019683837890625,
0.011932373046875,
0.0013952255249023438,
0.0032291412353515625,
-0.004894256591796875,
-0.018280029296875
]
] |
uukuguy/speechless-code-mistral-7b-v1.0 | 2023-10-13T07:34:35.000Z | [
"transformers",
"pytorch",
"mistral",
"text-generation",
"llama-2",
"code",
"en",
"dataset:jondurbin/airoboros-2.2",
"dataset:Open-Orca/OpenOrca",
"dataset:garage-bAInd/Open-Platypus",
"dataset:WizardLM/WizardLM_evol_instruct_V2_196k",
"dataset:TokenBender/python_eval_instruct_51k",
"license:llama2",
"model-index",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text-generation | uukuguy | null | null | uukuguy/speechless-code-mistral-7b-v1.0 | 5 | 6,736 | transformers | 2023-10-10T06:14:00 | ---
language:
- en
library_name: transformers
pipeline_tag: text-generation
datasets:
- jondurbin/airoboros-2.2
- Open-Orca/OpenOrca
- garage-bAInd/Open-Platypus
- WizardLM/WizardLM_evol_instruct_V2_196k
- TokenBender/python_eval_instruct_51k
tags:
- llama-2
- code
license: llama2
model-index:
- name: SpeechlessCoder
results:
- task:
type: text-generation
dataset:
type: openai_humaneval
name: HumanEval
metrics:
- name: pass@1
type: pass@1
value: 50.0
verified: false
---
<p><h1> speechless-code-mistral-7b-v1.0 </h1></p>
* [AWQ model(s) for GPU inference.](https://huggingface.co/TheBloke/speechless-code-mistral-7B-v1.0-AWQ)
* [GPTQ models for GPU inference, with multiple quantisation parameter options.](https://huggingface.co/TheBloke/speechless-code-mistral-7B-v1.0-GPTQ)
* [2, 3, 4, 5, 6 and 8-bit GGUF models for CPU+GPU inference](https://huggingface.co/TheBloke/speechless-code-mistral-7B-v1.0-GGUF)
The following datasets were used to fine-tune mistralai/Mistral-7B-v0.1 in order to improve the model's reasoning and planning abilities.
201,981 samples in total:
- jondurbin/airoboros-2.2: filtered to categories related to coding, reasoning and planning. 23,462 samples.
- Open-Orca/OpenOrca: filtered to the 'cot' category of the 1M GPT-4 dataset. 74,440 samples.
- garage-bAInd/Open-Platypus: 100%, 24,926 samples.
- WizardLM/WizardLM_evol_instruct_V2_196k: coding conversation part. 30,185 samples.
- TokenBender/python_eval_instruct_51k: samples with "python" in the output. 40,309 samples.
- Spider: 8,659 samples.
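As a quick sanity check, the per-dataset counts listed above do sum to the stated total:

```python
# Per-dataset sample counts taken from the list above.
counts = {
    "airoboros-2.2": 23_462,
    "OpenOrca (cot)": 74_440,
    "Open-Platypus": 24_926,
    "WizardLM_evol_instruct_V2_196k": 30_185,
    "python_eval_instruct_51k": 40_309,
    "Spider": 8_659,
}

total = sum(counts.values())
print(total)  # 201981, matching the stated 201,981 samples
```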
## HumanEval
| Metric | Value |
| --- | --- |
| humaneval-python | 50.0|
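For reference, HumanEval's pass@k metric is computed with the unbiased estimator from the Codex evaluation protocol. A minimal sketch (the sample numbers below are illustrative, not taken from this model's run):

```python
from math import comb

def pass_at_k(n: int, c: int, k: int) -> float:
    """Unbiased pass@k estimate: n samples generated per problem, c correct."""
    if n - c < k:
        return 1.0  # every size-k subset contains at least one correct sample
    return 1.0 - comb(n - c, k) / comb(n, k)

# With k=1 this reduces to the fraction of correct samples:
print(pass_at_k(n=10, c=5, k=1))  # 0.5
```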
[Big Code Models Leaderboard](https://huggingface.co/spaces/bigcode/bigcode-models-leaderboard)
CodeLlama-34B-Python: 53.29
CodeLlama-34B-Instruct: 50.79
CodeLlama-13B-Instruct: 50.6
CodeLlama-34B: 45.11
CodeLlama-13B-Python: 42.89
CodeLlama-13B: 35.07
## lm-evaluation-harness
[Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
| Metric | Value |
| --- | --- |
| ARC |59.64 |
| HellaSwag |82.25 |
| MMLU | 61.33 |
| TruthfulQA | 48.45 |
| Average | 62.92 |
## Parameters
| | |
|------ | ------ |
| lr | 2e-4 |
| lr_scheduler_type | cosine |
| weight_decay | 0.0 |
| optim | paged_adamw_8bit |
| flash_attention | True |
| rerope | False |
| max_new_tokens | 4096 |
| num_train_epochs | 2 |
| bits | 4 |
| lora_r | 64 |
| lora_alpha | 16 |
| lora_dropout | 0.05 |
| double_quant | True |
| quant_type | nf4 |
| dataset_format | airoboros |
| mini_batch_size | 2 |
| gradient_accumulation_steps | 32 |
| bf16 | True |
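The table above describes a standard QLoRA setup. A hedged sketch of how these values map onto `peft`/`bitsandbytes` configuration objects (an illustration of the settings, not the exact training script used here):

```python
import torch
from transformers import BitsAndBytesConfig
from peft import LoraConfig

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                      # bits = 4
    bnb_4bit_quant_type="nf4",              # quant_type = nf4
    bnb_4bit_use_double_quant=True,         # double_quant = True
    bnb_4bit_compute_dtype=torch.bfloat16,  # bf16 = True
)

lora_config = LoraConfig(
    r=64,               # lora_r
    lora_alpha=16,      # lora_alpha
    lora_dropout=0.05,  # lora_dropout
    task_type="CAUSAL_LM",
)
```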
Hardware: 2x NVIDIA A40 (48 GB)
| | |
|------ | ------ |
| epoch | 2.0 |
| train_loss | 0.5 |
| train_runtime | 1 day, 10:25:26.77 |
| train_samples_per_second | 3.194 |
| train_steps_per_second | 0.025 |
| eval_loss | 0.5146 |
| eval_runtime | 0:00:25.04 |
| eval_samples_per_second | 7.985 |
| eval_steps_per_second | |
| 3,043 | [
[
-0.03729248046875,
-0.055908203125,
0.017913818359375,
0.0170440673828125,
-0.006267547607421875,
-0.0203857421875,
-0.01026153564453125,
-0.018157958984375,
0.00989532470703125,
0.0224761962890625,
-0.032684326171875,
-0.0556640625,
-0.039459228515625,
-0.01242828369140625,
-0.01490020751953125,
0.08013916015625,
-0.0012416839599609375,
-0.00804901123046875,
0.00514984130859375,
-0.022247314453125,
-0.040679931640625,
-0.047607421875,
-0.05517578125,
-0.029205322265625,
0.01482391357421875,
0.01153564453125,
0.0517578125,
0.03668212890625,
0.032440185546875,
0.0249481201171875,
-0.0184783935546875,
0.006351470947265625,
-0.039764404296875,
-0.0209503173828125,
0.01080322265625,
-0.034820556640625,
-0.0625,
-0.0007233619689941406,
0.04278564453125,
0.023040771484375,
-0.0309906005859375,
0.039520263671875,
0.00984954833984375,
0.052520751953125,
-0.038055419921875,
0.0264892578125,
-0.025665283203125,
0.0038509368896484375,
-0.0124969482421875,
-0.0155487060546875,
-0.008453369140625,
-0.028717041015625,
-0.0162200927734375,
-0.046478271484375,
0.0288238525390625,
0.006870269775390625,
0.09173583984375,
0.0286865234375,
-0.017730712890625,
-0.0022563934326171875,
-0.036041259765625,
0.0648193359375,
-0.07965087890625,
0.0133209228515625,
0.0294342041015625,
0.01232147216796875,
-0.007755279541015625,
-0.053131103515625,
-0.03924560546875,
-0.004444122314453125,
-0.0014600753784179688,
0.018524169921875,
-0.0306243896484375,
-0.0005288124084472656,
0.052642822265625,
0.056121826171875,
-0.05224609375,
0.0037784576416015625,
-0.0287628173828125,
-0.0084381103515625,
0.052825927734375,
0.035186767578125,
0.004596710205078125,
-0.01357269287109375,
-0.025054931640625,
-0.0276031494140625,
-0.0361328125,
0.025634765625,
0.020172119140625,
0.00687408447265625,
-0.0252685546875,
0.036346435546875,
-0.0310211181640625,
0.055267333984375,
0.00872039794921875,
-0.023681640625,
0.04132080078125,
-0.041290283203125,
-0.041015625,
-0.005924224853515625,
0.061553955078125,
0.03228759765625,
-0.002410888671875,
0.0160980224609375,
-0.024322509765625,
0.0095977783203125,
0.0009055137634277344,
-0.0859375,
-0.031280517578125,
0.0286407470703125,
-0.03271484375,
-0.01511383056640625,
0.01067352294921875,
-0.049530029296875,
0.0016841888427734375,
-0.035369873046875,
0.040863037109375,
-0.038299560546875,
-0.0166168212890625,
0.012939453125,
-0.028961181640625,
0.030487060546875,
0.025238037109375,
-0.043609619140625,
0.0155792236328125,
0.037109375,
0.057769775390625,
-0.004154205322265625,
-0.0138397216796875,
-0.0201263427734375,
-0.00603485107421875,
-0.0278167724609375,
0.04315185546875,
-0.0175323486328125,
-0.0284423828125,
-0.039337158203125,
-0.0007543563842773438,
-0.01477813720703125,
-0.037322998046875,
0.04925537109375,
-0.0200347900390625,
0.0184173583984375,
-0.0200347900390625,
-0.03790283203125,
-0.0227508544921875,
0.00982666015625,
-0.05145263671875,
0.10052490234375,
0.0172576904296875,
-0.06488037109375,
0.03265380859375,
-0.049224853515625,
0.00661468505859375,
-0.0095062255859375,
-0.01157379150390625,
-0.043731689453125,
-0.0036468505859375,
0.037567138671875,
0.031524658203125,
-0.0264129638671875,
0.030059814453125,
-0.021209716796875,
-0.03973388671875,
0.01462554931640625,
-0.043060302734375,
0.07904052734375,
0.0261688232421875,
-0.041748046875,
0.0205230712890625,
-0.0660400390625,
0.0293731689453125,
0.002719879150390625,
-0.027252197265625,
-0.0028018951416015625,
-0.0199127197265625,
0.0092315673828125,
0.01210784912109375,
0.0153961181640625,
-0.009521484375,
0.023651123046875,
-0.022552490234375,
0.0419921875,
0.057159423828125,
0.01477813720703125,
0.006565093994140625,
-0.039337158203125,
0.05084228515625,
0.0084991455078125,
0.035186767578125,
0.015869140625,
-0.04638671875,
-0.051910400390625,
-0.037017822265625,
0.0238800048828125,
0.04180908203125,
-0.040130615234375,
0.046600341796875,
-0.01904296875,
-0.048828125,
-0.04351806640625,
0.003124237060546875,
0.0291748046875,
0.041107177734375,
0.038726806640625,
-0.0099945068359375,
-0.032989501953125,
-0.07080078125,
-0.004703521728515625,
-0.01270294189453125,
0.0015325546264648438,
0.016754150390625,
0.048583984375,
-0.0169219970703125,
0.062103271484375,
-0.053680419921875,
-0.0128173828125,
-0.004215240478515625,
0.00594329833984375,
0.05841064453125,
0.040283203125,
0.052886962890625,
-0.046875,
-0.028533935546875,
-0.0102691650390625,
-0.052734375,
-0.004657745361328125,
0.0017108917236328125,
-0.0298919677734375,
0.00814056396484375,
0.0283355712890625,
-0.044830322265625,
0.060760498046875,
0.029815673828125,
-0.04083251953125,
0.057525634765625,
-0.020904541015625,
0.0230560302734375,
-0.0750732421875,
0.0238800048828125,
0.0157470703125,
-0.0008611679077148438,
-0.025726318359375,
0.01708984375,
0.0004286766052246094,
0.00909423828125,
-0.0203704833984375,
0.043609619140625,
-0.037872314453125,
0.01035308837890625,
-0.0021915435791015625,
-0.01137542724609375,
0.0007791519165039062,
0.058319091796875,
-0.01131439208984375,
0.0706787109375,
0.049346923828125,
-0.039337158203125,
0.0255126953125,
0.0159912109375,
-0.021484375,
0.020721435546875,
-0.0650634765625,
-0.0140228271484375,
0.00437164306640625,
0.0240325927734375,
-0.0718994140625,
-0.0178985595703125,
0.0198516845703125,
-0.041015625,
0.0128021240234375,
-0.0191497802734375,
-0.028717041015625,
-0.0296783447265625,
-0.039398193359375,
0.0258941650390625,
0.05126953125,
-0.033416748046875,
0.029754638671875,
0.01305389404296875,
-0.0031871795654296875,
-0.046783447265625,
-0.054779052734375,
-0.015625,
-0.0311431884765625,
-0.033233642578125,
0.019134521484375,
-0.0165557861328125,
-0.0111236572265625,
0.003108978271484375,
-0.0158843994140625,
-0.01036834716796875,
-0.004253387451171875,
0.041229248046875,
0.041900634765625,
-0.0220184326171875,
-0.017242431640625,
-0.0089263916015625,
-0.00673675537109375,
0.0087890625,
-0.01050567626953125,
0.0309600830078125,
-0.0222015380859375,
-0.01519012451171875,
-0.034423828125,
0.0005707740783691406,
0.037261962890625,
-0.0133209228515625,
0.056884765625,
0.06317138671875,
-0.02587890625,
-0.0164642333984375,
-0.03314208984375,
0.00473785400390625,
-0.036224365234375,
0.0189056396484375,
-0.031341552734375,
-0.058624267578125,
0.047607421875,
0.0142822265625,
0.005214691162109375,
0.07110595703125,
0.04974365234375,
0.0288238525390625,
0.08221435546875,
0.0250091552734375,
-0.020843505859375,
0.035186767578125,
-0.055816650390625,
0.01328277587890625,
-0.05877685546875,
-0.0163421630859375,
-0.037628173828125,
-0.0166168212890625,
-0.05303955078125,
-0.041412353515625,
0.032745361328125,
0.033416748046875,
-0.05108642578125,
0.0188446044921875,
-0.064697265625,
0.0148468017578125,
0.05621337890625,
0.0195770263671875,
0.0158538818359375,
0.00853729248046875,
-0.0162506103515625,
-0.0008187294006347656,
-0.05816650390625,
-0.0301361083984375,
0.09674072265625,
0.0206756591796875,
0.0733642578125,
0.0122833251953125,
0.04412841796875,
0.0197906494140625,
0.018524169921875,
-0.040069580078125,
0.031280517578125,
0.00725555419921875,
-0.044097900390625,
-0.0182952880859375,
-0.048126220703125,
-0.057464599609375,
0.0238800048828125,
-0.01277923583984375,
-0.057952880859375,
0.0297698974609375,
0.0169219970703125,
-0.036956787109375,
0.0275726318359375,
-0.053314208984375,
0.06536865234375,
-0.006137847900390625,
-0.0298004150390625,
-0.005321502685546875,
-0.0260162353515625,
0.031982421875,
-0.0038089752197265625,
0.004703521728515625,
-0.00862884521484375,
0.00751495361328125,
0.06134033203125,
-0.057525634765625,
0.047760009765625,
-0.012939453125,
0.00011920928955078125,
0.035980224609375,
-0.022552490234375,
0.033782958984375,
0.01415252685546875,
-0.021484375,
0.0291748046875,
0.0111541748046875,
-0.03607177734375,
-0.0239715576171875,
0.05889892578125,
-0.07696533203125,
-0.035919189453125,
-0.051513671875,
-0.030303955078125,
0.00405120849609375,
0.007511138916015625,
0.041290283203125,
0.046966552734375,
-0.0024738311767578125,
0.003993988037109375,
0.038177490234375,
-0.0171966552734375,
0.037384033203125,
0.0156097412109375,
-0.0148773193359375,
-0.05426025390625,
0.07293701171875,
0.0020904541015625,
0.003391265869140625,
0.00829315185546875,
-0.0017271041870117188,
-0.0302886962890625,
-0.042572021484375,
-0.0286865234375,
0.01145172119140625,
-0.038818359375,
-0.032318115234375,
-0.03900146484375,
-0.032073974609375,
-0.038482666015625,
0.006420135498046875,
-0.01495361328125,
-0.033050537109375,
-0.033050537109375,
-0.0109405517578125,
0.04339599609375,
0.044158935546875,
-0.01192474365234375,
0.01678466796875,
-0.040252685546875,
0.0284881591796875,
0.0164642333984375,
0.0211029052734375,
0.01074981689453125,
-0.060577392578125,
-0.023223876953125,
0.01129913330078125,
-0.043914794921875,
-0.05352783203125,
0.052215576171875,
0.00756072998046875,
0.04998779296875,
0.042510986328125,
-0.0003056526184082031,
0.08892822265625,
-0.0108795166015625,
0.06988525390625,
0.0140228271484375,
-0.0643310546875,
0.04827880859375,
-0.01445770263671875,
0.0197906494140625,
0.028717041015625,
0.0232391357421875,
-0.0219268798828125,
-0.0259246826171875,
-0.0714111328125,
-0.07379150390625,
0.056884765625,
0.038116455078125,
-0.0304107666015625,
0.015899658203125,
0.029296875,
-0.005725860595703125,
0.0164337158203125,
-0.053253173828125,
-0.037689208984375,
-0.01271820068359375,
-0.0083770751953125,
-0.01397705078125,
-0.0033779144287109375,
-0.018768310546875,
-0.056884765625,
0.06500244140625,
-0.005832672119140625,
0.037353515625,
0.0188140869140625,
0.01099395751953125,
0.004848480224609375,
0.01053619384765625,
0.0303192138671875,
0.0638427734375,
-0.0291595458984375,
-0.01007080078125,
0.0170440673828125,
-0.04296875,
0.0014657974243164062,
0.023345947265625,
-0.00734710693359375,
-0.006191253662109375,
0.032958984375,
0.0738525390625,
-0.01082611083984375,
-0.033966064453125,
0.032135009765625,
-0.01409149169921875,
-0.018951416015625,
-0.03448486328125,
0.035980224609375,
-0.004833221435546875,
0.0225677490234375,
0.020904541015625,
0.0096435546875,
0.022216796875,
-0.0282440185546875,
0.01126861572265625,
0.0237579345703125,
-0.0132293701171875,
-0.020050048828125,
0.07470703125,
-0.01345062255859375,
0.01220703125,
0.045867919921875,
-0.0274200439453125,
-0.034454345703125,
0.07073974609375,
0.0175018310546875,
0.055511474609375,
-0.00653076171875,
0.004230499267578125,
0.06317138671875,
0.0161590576171875,
-0.020294189453125,
0.042205810546875,
0.006488800048828125,
-0.0283660888671875,
-0.0311431884765625,
-0.059295654296875,
-0.01137542724609375,
0.0227203369140625,
-0.0706787109375,
0.0213470458984375,
-0.034088134765625,
-0.039764404296875,
0.00634765625,
0.0301361083984375,
-0.06475830078125,
0.020294189453125,
0.005268096923828125,
0.07269287109375,
-0.06329345703125,
0.057861328125,
0.052734375,
-0.03900146484375,
-0.08148193359375,
-0.0219573974609375,
-0.00879669189453125,
-0.054840087890625,
0.032562255859375,
-0.002773284912109375,
0.0206756591796875,
-0.0007958412170410156,
-0.058502197265625,
-0.07244873046875,
0.10516357421875,
0.0216827392578125,
-0.038970947265625,
0.003421783447265625,
-0.0037441253662109375,
0.04241943359375,
-0.005817413330078125,
0.04638671875,
0.047454833984375,
0.0347900390625,
0.00420379638671875,
-0.067138671875,
0.0158843994140625,
-0.035675048828125,
0.0087738037109375,
0.0224761962890625,
-0.071044921875,
0.09100341796875,
-0.01036834716796875,
-0.0002789497375488281,
0.003475189208984375,
0.04327392578125,
0.048492431640625,
0.0235443115234375,
0.0197296142578125,
0.08380126953125,
0.05633544921875,
-0.0197601318359375,
0.07989501953125,
-0.033660888671875,
0.046539306640625,
0.07037353515625,
0.01617431640625,
0.059173583984375,
0.01025390625,
-0.038848876953125,
0.03155517578125,
0.0670166015625,
-0.0209503173828125,
0.0236053466796875,
0.0164642333984375,
-0.007061004638671875,
-0.023712158203125,
0.020111083984375,
-0.05328369140625,
0.0114288330078125,
0.0222930908203125,
-0.0156402587890625,
-0.0098876953125,
-0.020233154296875,
0.0135650634765625,
-0.031524658203125,
-0.0302886962890625,
0.03973388671875,
0.005420684814453125,
-0.04327392578125,
0.07135009765625,
0.01116180419921875,
0.049346923828125,
-0.037017822265625,
-0.021728515625,
-0.024871826171875,
0.0176544189453125,
-0.029998779296875,
-0.048583984375,
-0.005115509033203125,
0.00696563720703125,
-0.00966644287109375,
0.00152587890625,
0.034820556640625,
-0.0166168212890625,
-0.050140380859375,
0.00762939453125,
0.029022216796875,
0.0176849365234375,
-0.0177001953125,
-0.06768798828125,
-0.0006356239318847656,
0.01479339599609375,
-0.0312347412109375,
0.01055908203125,
0.04931640625,
-0.0035858154296875,
0.04144287109375,
0.058197021484375,
0.01319122314453125,
0.0155029296875,
-0.0008974075317382812,
0.0771484375,
-0.07464599609375,
-0.045379638671875,
-0.062286376953125,
0.03216552734375,
-0.0150299072265625,
-0.060272216796875,
0.059173583984375,
0.06304931640625,
0.04840087890625,
0.0120849609375,
0.053802490234375,
-0.01265716552734375,
0.0145721435546875,
-0.042694091796875,
0.04559326171875,
-0.04473876953125,
0.0173187255859375,
-0.016876220703125,
-0.062744140625,
0.0030975341796875,
0.042816162109375,
-0.0191497802734375,
0.01445770263671875,
0.039703369140625,
0.0633544921875,
0.0013704299926757812,
0.004528045654296875,
0.0026607513427734375,
0.028717041015625,
0.025177001953125,
0.061798095703125,
0.041748046875,
-0.053497314453125,
0.0299072265625,
-0.0322265625,
-0.0272064208984375,
-0.0211639404296875,
-0.04571533203125,
-0.059112548828125,
-0.022247314453125,
-0.039581298828125,
-0.04461669921875,
-0.007244110107421875,
0.059173583984375,
0.052520751953125,
-0.065185546875,
-0.022918701171875,
-0.0013170242309570312,
-0.00274658203125,
-0.04052734375,
-0.0227203369140625,
0.048828125,
-0.006000518798828125,
-0.05010986328125,
0.0016050338745117188,
-0.005443572998046875,
-0.0004458427429199219,
-0.004547119140625,
-0.0177001953125,
0.0012359619140625,
-0.0015115737915039062,
0.043731689453125,
0.0153045654296875,
-0.050140380859375,
-0.0016336441040039062,
-0.00824737548828125,
-0.0165863037109375,
0.016754150390625,
0.016937255859375,
-0.050262451171875,
0.01030731201171875,
0.038421630859375,
0.016571044921875,
0.05706787109375,
0.0180511474609375,
0.00855255126953125,
-0.03680419921875,
0.00949859619140625,
0.0029850006103515625,
0.0258941650390625,
-0.0019464492797851562,
-0.0301361083984375,
0.055633544921875,
0.028717041015625,
-0.05682373046875,
-0.050262451171875,
-0.0289764404296875,
-0.0921630859375,
-0.016510009765625,
0.0841064453125,
-0.0015325546264648438,
-0.0308685302734375,
0.022216796875,
-0.023101806640625,
0.0161895751953125,
-0.045806884765625,
0.03729248046875,
0.044219970703125,
-0.032684326171875,
0.01070404052734375,
-0.0677490234375,
0.037567138671875,
0.01036834716796875,
-0.06036376953125,
-0.007015228271484375,
0.04766845703125,
0.03387451171875,
-0.003173828125,
0.059814453125,
-0.00897979736328125,
0.0112457275390625,
0.0184478759765625,
0.019287109375,
-0.00919342041015625,
0.002902984619140625,
-0.032440185546875,
0.0172882080078125,
-0.0020465850830078125,
-0.019287109375
]
] |
euclaise/falcon_1b_stage3_2 | 2023-09-25T22:31:31.000Z | [
"transformers",
"pytorch",
"falcon",
"text-generation",
"generated_from_trainer",
"custom_code",
"license:apache-2.0",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | text-generation | euclaise | null | null | euclaise/falcon_1b_stage3_2 | 0 | 6,724 | transformers | 2023-09-23T01:28:52 | ---
license: apache-2.0
base_model: euclaise/falcon_1b_stage2
tags:
- generated_from_trainer
model-index:
- name: falcon_1b_stage3_2
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# falcon_1b_stage3_2
This model is a fine-tuned version of [euclaise/falcon_1b_stage2](https://huggingface.co/euclaise/falcon_1b_stage2) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-06
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 128.0
- total_train_batch_size: 128.0
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.15
- num_epochs: 5
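A linear schedule with a 0.15 warmup ratio ramps the learning rate from 0 up to the peak over the first 15% of steps, then decays it linearly back to 0. A minimal sketch of that shape in pure Python (independent of the actual trainer implementation):

```python
def linear_schedule_lr(step: int, total_steps: int,
                       warmup_ratio: float = 0.15,
                       peak_lr: float = 6e-06) -> float:
    """Linear warmup to peak_lr, then linear decay to zero."""
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        return peak_lr * step / max(1, warmup_steps)
    frac = max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))
    return peak_lr * frac

print(linear_schedule_lr(0, 100))    # 0.0
print(linear_schedule_lr(15, 100))   # 6e-06 (peak, end of warmup)
print(linear_schedule_lr(100, 100))  # 0.0
```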
### Training results
### Framework versions
- Transformers 4.33.2
- Pytorch 2.0.1+cu117
- Datasets 2.14.5
- Tokenizers 0.13.3
| 1,188 | [
[
-0.0400390625,
-0.046173095703125,
0.00036644935607910156,
0.02032470703125,
-0.0179595947265625,
-0.019622802734375,
0.00518798828125,
-0.0262451171875,
0.025146484375,
0.031463623046875,
-0.06280517578125,
-0.03216552734375,
-0.051971435546875,
0.001293182373046875,
-0.0244598388671875,
0.0916748046875,
0.00006389617919921875,
0.0265655517578125,
0.0024738311767578125,
0.00792694091796875,
-0.0156402587890625,
-0.036163330078125,
-0.08587646484375,
-0.045135498046875,
0.030487060546875,
0.02587890625,
0.0474853515625,
0.07745361328125,
0.05230712890625,
0.022216796875,
-0.0394287109375,
-0.0013427734375,
-0.046173095703125,
-0.02783203125,
-0.0048065185546875,
-0.0297393798828125,
-0.0589599609375,
0.0011692047119140625,
0.05078125,
0.031494140625,
-0.025360107421875,
0.03729248046875,
0.004547119140625,
0.037322998046875,
-0.03472900390625,
0.041107177734375,
-0.0391845703125,
0.03436279296875,
-0.016357421875,
-0.02069091796875,
-0.0146942138671875,
0.0076446533203125,
-0.01446533203125,
-0.06256103515625,
0.04022216796875,
-0.006481170654296875,
0.078125,
0.0276947021484375,
-0.0129241943359375,
-0.002056121826171875,
-0.05462646484375,
0.0285491943359375,
-0.041259765625,
0.01401519775390625,
0.0232696533203125,
0.05230712890625,
0.0008120536804199219,
-0.077880859375,
-0.0206298828125,
-0.001323699951171875,
0.00531768798828125,
0.019927978515625,
-0.006130218505859375,
0.0013332366943359375,
0.059844970703125,
0.01194000244140625,
-0.0330810546875,
0.01300048828125,
-0.056060791015625,
-0.0202789306640625,
0.04638671875,
0.030242919921875,
-0.01413726806640625,
-0.0044708251953125,
-0.03948974609375,
-0.01166534423828125,
-0.0218658447265625,
0.0311737060546875,
0.042327880859375,
0.0352783203125,
-0.015533447265625,
0.050445556640625,
-0.02410888671875,
0.039703369140625,
0.02484130859375,
0.0028247833251953125,
0.03765869140625,
0.004238128662109375,
-0.0297393798828125,
-0.0008115768432617188,
0.049407958984375,
0.036651611328125,
0.026397705078125,
0.002010345458984375,
-0.025299072265625,
-0.014678955078125,
0.02447509765625,
-0.07122802734375,
-0.0080413818359375,
0.006580352783203125,
-0.0325927734375,
-0.050811767578125,
0.035491943359375,
-0.047637939453125,
0.0036563873291015625,
-0.0115509033203125,
0.0229644775390625,
-0.005954742431640625,
-0.0147247314453125,
0.01346588134765625,
-0.011199951171875,
0.01312255859375,
0.0019435882568359375,
-0.0654296875,
0.0401611328125,
0.040252685546875,
0.0369873046875,
0.014373779296875,
-0.04290771484375,
-0.03948974609375,
0.01137542724609375,
-0.0245208740234375,
0.03228759765625,
-0.020263671875,
-0.0419921875,
-0.005664825439453125,
0.040863037109375,
-0.01535797119140625,
-0.0296783447265625,
0.089111328125,
-0.0242767333984375,
0.006793975830078125,
-0.0305328369140625,
-0.0460205078125,
-0.018890380859375,
0.01580810546875,
-0.05194091796875,
0.0810546875,
0.018157958984375,
-0.06787109375,
0.037689208984375,
-0.064208984375,
-0.01535797119140625,
0.01178741455078125,
0.002017974853515625,
-0.0478515625,
-0.00246429443359375,
0.005619049072265625,
0.039703369140625,
-0.004364013671875,
0.0171051025390625,
-0.03741455078125,
-0.052886962890625,
-0.011505126953125,
-0.0191802978515625,
0.047882080078125,
0.0302276611328125,
-0.02435302734375,
0.00131988525390625,
-0.08380126953125,
0.0093536376953125,
0.0251007080078125,
-0.02679443359375,
0.020111083984375,
-0.020172119140625,
0.037811279296875,
0.024139404296875,
0.025146484375,
-0.048187255859375,
0.0078277587890625,
-0.0257720947265625,
0.026947021484375,
0.03387451171875,
0.00804901123046875,
0.002490997314453125,
-0.036407470703125,
0.0298309326171875,
0.0308837890625,
0.03216552734375,
0.01299285888671875,
-0.03106689453125,
-0.07257080078125,
-0.01058197021484375,
0.0154876708984375,
0.0276947021484375,
-0.0178070068359375,
0.044097900390625,
0.0019741058349609375,
-0.05059814453125,
-0.01305389404296875,
-0.0009136199951171875,
0.025543212890625,
0.047882080078125,
0.0328369140625,
-0.0013952255249023438,
-0.04071044921875,
-0.08245849609375,
-0.006061553955078125,
0.002674102783203125,
0.0258941650390625,
0.00884246826171875,
0.062744140625,
-0.01537322998046875,
0.041046142578125,
-0.0300750732421875,
-0.007320404052734375,
-0.00785064697265625,
0.0013866424560546875,
0.024261474609375,
0.065673828125,
0.055938720703125,
-0.027557373046875,
-0.019561767578125,
-0.0111541748046875,
-0.05218505859375,
0.0231781005859375,
-0.00969696044921875,
-0.01311492919921875,
-0.00970458984375,
0.033966064453125,
-0.03729248046875,
0.043060302734375,
0.002658843994140625,
-0.022918701171875,
0.0232391357421875,
-0.032196044921875,
-0.005146026611328125,
-0.08209228515625,
0.020294189453125,
0.0268402099609375,
-0.00626373291015625,
-0.02508544921875,
0.0213775634765625,
0.00019490718841552734,
-0.0108184814453125,
-0.041778564453125,
0.046722412109375,
-0.010589599609375,
0.005100250244140625,
-0.016357421875,
-0.031829833984375,
-0.00015032291412353516,
0.050018310546875,
0.0164031982421875,
0.0439453125,
0.05291748046875,
-0.0302276611328125,
0.013153076171875,
0.0394287109375,
-0.0104217529296875,
0.028961181640625,
-0.07965087890625,
0.00421905517578125,
-0.002407073974609375,
0.00710296630859375,
-0.0535888671875,
-0.0285797119140625,
0.0439453125,
-0.0312347412109375,
0.021636962890625,
-0.013763427734375,
-0.0270843505859375,
-0.043426513671875,
-0.0007572174072265625,
0.0176544189453125,
0.0301971435546875,
-0.04791259765625,
0.0205230712890625,
-0.00200653076171875,
0.0207061767578125,
-0.035888671875,
-0.05615234375,
-0.02362060546875,
-0.0168609619140625,
-0.03564453125,
0.0166015625,
-0.0010576248168945312,
0.01158905029296875,
-0.015960693359375,
-0.001255035400390625,
-0.0194091796875,
-0.0011234283447265625,
0.032989501953125,
0.007965087890625,
-0.01090240478515625,
-0.01251983642578125,
0.01428985595703125,
-0.0247039794921875,
0.0258026123046875,
-0.003810882568359375,
0.01739501953125,
-0.0186767578125,
-0.0178680419921875,
-0.063232421875,
0.005985260009765625,
0.0499267578125,
-0.01120758056640625,
0.053680419921875,
0.070556640625,
-0.043701171875,
-0.00782012939453125,
-0.02117919921875,
-0.01558685302734375,
-0.032958984375,
0.041290283203125,
-0.0487060546875,
-0.0218658447265625,
0.051788330078125,
0.00460052490234375,
-0.0021209716796875,
0.07537841796875,
0.0257110595703125,
0.0050201416015625,
0.0977783203125,
0.018707275390625,
0.009429931640625,
0.0189666748046875,
-0.0714111328125,
-0.02691650390625,
-0.060394287109375,
-0.04351806640625,
-0.04644775390625,
-0.02410888671875,
-0.050201416015625,
0.0045318603515625,
0.0097503662109375,
0.019744873046875,
-0.05975341796875,
0.025634765625,
-0.0345458984375,
0.03887939453125,
0.041259765625,
0.0276947021484375,
-0.00843048095703125,
0.0140533447265625,
-0.023834228515625,
0.000010371208190917969,
-0.07183837890625,
-0.02880859375,
0.08856201171875,
0.04730224609375,
0.033233642578125,
-0.023681640625,
0.051513671875,
-0.00408935546875,
-0.0009160041809082031,
-0.038726806640625,
0.02899169921875,
-0.00328826904296875,
-0.0635986328125,
-0.0040130615234375,
-0.0229644775390625,
-0.048980712890625,
0.004085540771484375,
-0.03515625,
-0.04071044921875,
0.004360198974609375,
0.00437164306640625,
-0.02386474609375,
0.0265655517578125,
-0.033233642578125,
0.08160400390625,
-0.0130767822265625,
-0.0390625,
-0.0008325576782226562,
-0.04156494140625,
0.01486968994140625,
-0.00197601318359375,
-0.0053863525390625,
0.01177215576171875,
0.008087158203125,
0.07086181640625,
-0.0589599609375,
0.047698974609375,
-0.0372314453125,
0.033966064453125,
0.0294952392578125,
-0.0111541748046875,
0.06304931640625,
0.018218994140625,
-0.0215606689453125,
0.007312774658203125,
0.0172119140625,
-0.051605224609375,
-0.0302276611328125,
0.055572509765625,
-0.0902099609375,
-0.00704193115234375,
-0.044219970703125,
-0.03338623046875,
-0.015960693359375,
0.01203155517578125,
0.044830322265625,
0.0504150390625,
-0.01509857177734375,
0.0162200927734375,
0.01422882080078125,
0.0221405029296875,
0.037750244140625,
0.029541015625,
-0.007434844970703125,
-0.0379638671875,
0.0599365234375,
-0.0018301010131835938,
0.00040721893310546875,
-0.007671356201171875,
0.01178741455078125,
-0.034393310546875,
-0.045318603515625,
-0.0235443115234375,
0.0204315185546875,
-0.04949951171875,
0.0005550384521484375,
-0.031341552734375,
-0.042633056640625,
-0.00846099853515625,
-0.003620147705078125,
-0.036773681640625,
-0.03338623046875,
-0.056304931640625,
-0.0184783935546875,
0.0215301513671875,
0.0526123046875,
-0.0090179443359375,
0.059478759765625,
-0.047088623046875,
-0.0161590576171875,
0.005550384521484375,
0.031982421875,
0.00021886825561523438,
-0.06854248046875,
-0.0297393798828125,
0.0240936279296875,
-0.0316162109375,
-0.0254058837890625,
0.026397705078125,
0.00991058349609375,
0.057098388671875,
0.043609619140625,
-0.0211334228515625,
0.0638427734375,
-0.01311492919921875,
0.057037353515625,
0.0218658447265625,
-0.035614013671875,
0.0318603515625,
-0.034149169921875,
0.013885498046875,
0.05126953125,
0.04071044921875,
0.00992584228515625,
-0.01318359375,
-0.095947265625,
-0.038665771484375,
0.062408447265625,
0.0312347412109375,
0.005649566650390625,
0.00193023681640625,
0.054931640625,
-0.00016319751739501953,
0.01194000244140625,
-0.045166015625,
-0.03106689453125,
-0.029083251953125,
0.005523681640625,
-0.0224761962890625,
-0.0121612548828125,
-0.0111541748046875,
-0.05810546875,
0.08221435546875,
-0.00325775146484375,
0.012664794921875,
0.0047454833984375,
0.0194091796875,
-0.027801513671875,
-0.02069091796875,
0.05511474609375,
0.04852294921875,
-0.04974365234375,
-0.0244140625,
0.007091522216796875,
-0.037689208984375,
-0.008392333984375,
0.0228271484375,
-0.0121002197265625,
0.0024242401123046875,
0.0250396728515625,
0.075927734375,
0.01165008544921875,
-0.0000324249267578125,
0.0298004150390625,
-0.01032257080078125,
-0.042633056640625,
-0.019134521484375,
0.02862548828125,
-0.01531219482421875,
0.02923583984375,
0.0028438568115234375,
0.042388916015625,
0.0090789794921875,
-0.004817962646484375,
0.02239990234375,
0.016357421875,
-0.049072265625,
-0.0199432373046875,
0.07379150390625,
0.013153076171875,
-0.033203125,
0.0408935546875,
-0.01387786865234375,
-0.01220703125,
0.07061767578125,
0.045654296875,
0.06591796875,
0.0052337646484375,
0.019622802734375,
0.070556640625,
0.0045013427734375,
-0.0258026123046875,
0.042938232421875,
0.01070404052734375,
-0.034637451171875,
0.002834320068359375,
-0.050537109375,
-0.007144927978515625,
0.031341552734375,
-0.08807373046875,
0.035308837890625,
-0.046722412109375,
-0.031585693359375,
0.01959228515625,
0.027191162109375,
-0.06719970703125,
0.040283203125,
0.0011930465698242188,
0.0985107421875,
-0.06597900390625,
0.0560302734375,
0.05224609375,
-0.046661376953125,
-0.07904052734375,
-0.0220489501953125,
-0.01102447509765625,
-0.0703125,
0.055206298828125,
0.0003800392150878906,
0.0157928466796875,
0.026824951171875,
-0.0400390625,
-0.041168212890625,
0.07763671875,
0.011810302734375,
-0.05316162109375,
0.008270263671875,
0.0208740234375,
0.0496826171875,
-0.04290771484375,
0.054290771484375,
0.01849365234375,
0.0278167724609375,
0.047698974609375,
-0.05584716796875,
-0.0181427001953125,
-0.03680419921875,
0.017333984375,
-0.0011720657348632812,
-0.05914306640625,
0.07196044921875,
-0.0077362060546875,
0.029052734375,
0.02392578125,
0.040618896484375,
0.01187896728515625,
0.00971221923828125,
0.02294921875,
0.05511474609375,
0.047882080078125,
-0.0005693435668945312,
0.06365966796875,
-0.059478759765625,
0.060272216796875,
0.078857421875,
-0.00162506103515625,
0.042633056640625,
0.03375244140625,
-0.00630950927734375,
0.0115814208984375,
0.07763671875,
-0.03497314453125,
0.0294189453125,
0.016937255859375,
0.011962890625,
-0.032562255859375,
0.016448974609375,
-0.060455322265625,
0.03717041015625,
-0.0016241073608398438,
-0.054351806640625,
-0.03179931640625,
-0.02459716796875,
0.002651214599609375,
-0.015960693359375,
-0.038116455078125,
0.034088134765625,
-0.0303192138671875,
-0.04486083984375,
0.059967041015625,
0.01084136962890625,
0.0146942138671875,
-0.044342041015625,
-0.0018901824951171875,
-0.017120361328125,
0.026763916015625,
-0.0240020751953125,
-0.035888671875,
0.023101806640625,
-0.0014600753784179688,
-0.01354217529296875,
0.006580352783203125,
0.02197265625,
-0.017486572265625,
-0.086181640625,
0.00568389892578125,
0.03204345703125,
0.0209197998046875,
0.004085540771484375,
-0.08709716796875,
0.0035037994384765625,
-0.0157012939453125,
-0.025238037109375,
0.0097808837890625,
0.01482391357421875,
0.018035888671875,
0.03668212890625,
0.0400390625,
0.00290679931640625,
-0.0003247261047363281,
0.00969696044921875,
0.060821533203125,
-0.032135009765625,
-0.044952392578125,
-0.0443115234375,
0.041046142578125,
-0.0099639892578125,
-0.0693359375,
0.037200927734375,
0.08331298828125,
0.0660400390625,
-0.018096923828125,
0.041259765625,
0.0161590576171875,
0.0311737060546875,
-0.03179931640625,
0.04638671875,
-0.04638671875,
0.006099700927734375,
-0.020782470703125,
-0.0640869140625,
0.00540924072265625,
0.054534912109375,
-0.002330780029296875,
0.018890380859375,
0.042022705078125,
0.07257080078125,
-0.0201263427734375,
0.034149169921875,
0.00897979736328125,
0.0084686279296875,
0.0115509033203125,
0.03607177734375,
0.039306640625,
-0.06805419921875,
0.020355224609375,
-0.04315185546875,
-0.008087158203125,
-0.005466461181640625,
-0.052459716796875,
-0.08111572265625,
-0.03228759765625,
-0.036041259765625,
-0.03717041015625,
0.009674072265625,
0.0726318359375,
0.0643310546875,
-0.05596923828125,
-0.03369140625,
-0.02197265625,
-0.0291748046875,
-0.02496337890625,
-0.01198577880859375,
0.02166748046875,
-0.0217132568359375,
-0.049041748046875,
-0.004482269287109375,
-0.0238037109375,
0.025543212890625,
-0.01401519775390625,
-0.02471923828125,
-0.0132598876953125,
-0.017059326171875,
0.009552001953125,
0.01480865478515625,
-0.03912353515625,
-0.02947998046875,
-0.0059051513671875,
-0.0037689208984375,
0.01172637939453125,
0.0219573974609375,
-0.02984619140625,
0.0270233154296875,
0.00887298583984375,
0.015899658203125,
0.07354736328125,
0.00274658203125,
0.01457977294921875,
-0.04052734375,
0.0294952392578125,
0.002552032470703125,
0.0308837890625,
-0.009552001953125,
-0.0294647216796875,
0.04437255859375,
0.035919189453125,
-0.03228759765625,
-0.05023193359375,
-0.0169677734375,
-0.08624267578125,
0.0096282958984375,
0.08514404296875,
0.0034637451171875,
-0.032623291015625,
0.03948974609375,
-0.02117919921875,
0.0262451171875,
-0.0177764892578125,
0.03363037109375,
0.049957275390625,
-0.00917816162109375,
0.0123138427734375,
-0.041748046875,
0.038726806640625,
-0.005321502685546875,
-0.05401611328125,
-0.0285797119140625,
0.01427459716796875,
0.039764404296875,
-0.01198577880859375,
0.01251983642578125,
-0.0010499954223632812,
0.024444580078125,
0.01751708984375,
0.003143310546875,
-0.0574951171875,
-0.034912109375,
-0.0187530517578125,
0.0171356201171875,
-0.00835418701171875,
-0.035858154296875
]
] |
swl-models/xiaolxl-guofeng-v1 | 2023-02-01T01:00:00.000Z | [
"diffusers",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-image",
"en",
"license:creativeml-openrail-m",
"endpoints_compatible",
"diffusers:StableDiffusionPipeline",
"region:us"
] | text-to-image | swl-models | null | null | swl-models/xiaolxl-guofeng-v1 | 1 | 6,719 | diffusers | 2023-01-31T15:43:58 | ---
license: creativeml-openrail-m
language:
- en
library_name: diffusers
pipeline_tag: text-to-image
tags:
- stable-diffusion
- stable-diffusion-diffusers
duplicated_from: xiaolxl/Gf_style
---
# Gf_style - Introduction
Welcome to Gf_style. This is a Chinese gorgeous ancient-style model, which can also be described as an ancient-style game character model with a 2.5D texture. It is the first of a model series that will keep being updated.
Version 2.0 released: [https://huggingface.co/xiaolxl/Gf_style2](https://huggingface.co/xiaolxl/Gf_style2)
Version 3.0 released: [https://huggingface.co/xiaolxl/Gf_style3](https://huggingface.co/xiaolxl/Gf_style3)
# Install
1. Put the XXX.safetensors model and XXX.yaml into the SD directory
2. Remember to select a VAE file, otherwise the generated images will be gray
# How to use
(Tip: the characters were trained on portrait-orientation images, so portrait outputs should in theory look better.)
If you want the pictures to be as good as possible, try the following configuration:
- Sampling steps: **30 or 50**
- Sampler: **DDIM** or **DPM++ 2M Karras / DPM++ SDE Karras** (the latter two each bring a different kind of surprise)
- The image size should be at least **768** on a side, otherwise the image will fall apart
- Turn on Hires. fix: **R-ESRGAN 4x+ Anime6B** with **Upscale by 2**
- If the face is deformed, try turning on **face restoration**
- **Keywords (prompt start):**
```
{best quality}, {{masterpiece}}, {highres}, {an extremely delicate and beautiful}, original, extremely detailed wallpaper,
```
- **Negative prompt:**
```
NSFW, lowres, bad anatomy, bad hands, text, error, missing fingers, extra digit, fewer digits, cropped, worst quality, low quality, normal quality, jpeg artifacts, signature, watermark, username, blurry, bad feet
```
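As a minimal illustration, the recommended settings above can be collected into the keyword arguments of diffusers' `StableDiffusionPipeline.__call__` (you would pass the dict as `pipe(**settings)` after loading the pipeline, scheduler, and VAE). The helper below is a sketch, not part of this repo; note also that the curly-brace emphasis in the prompt is WebUI-style syntax that diffusers does not interpret specially.

```python
def guofeng_settings(steps: int = 30, portrait: bool = True) -> dict:
    """Collect the generation settings recommended in this card.

    The keys mirror diffusers' StableDiffusionPipeline.__call__ keyword
    arguments; the function itself is only an illustrative helper.
    """
    if steps not in (30, 50):
        raise ValueError("the card suggests 30 or 50 sampling steps")
    # At least 768 on one side, portrait orientation preferred.
    height, width = (768, 512) if portrait else (512, 768)
    return {
        "prompt": (
            "{best quality}, {{masterpiece}}, {highres}, "
            "{an extremely delicate and beautiful}, original, "
            "extremely detailed wallpaper,"
        ),
        "negative_prompt": (
            "NSFW, lowres, bad anatomy, bad hands, text, error, "
            "missing fingers, extra digit, fewer digits, cropped, "
            "worst quality, low quality, normal quality, jpeg artifacts, "
            "signature, watermark, username, blurry, bad feet"
        ),
        "num_inference_steps": steps,
        "height": height,
        "width": width,
    }

settings = guofeng_settings(steps=50)
print(settings["num_inference_steps"], settings["height"], settings["width"])
```

Hires. fix and face restoration are WebUI features with no direct one-liner equivalent here; in a diffusers workflow you would upscale the result in a second pass.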
# Examples
(The original images can be found in the file list; load them into the WebUI to view the keywords and other generation info.)
Showcase images
<img src=https://huggingface.co/xiaolxl/Gf_style/resolve/main/examples/f1.png>
<img src=https://huggingface.co/xiaolxl/Gf_style/resolve/main/examples/f2.png>
Images generated from the keywords in "How to use"
<img src=https://huggingface.co/xiaolxl/Gf_style/resolve/main/examples/e1.png>
<img src=https://huggingface.co/xiaolxl/Gf_style/resolve/main/examples/e2.png>
<img src=https://huggingface.co/xiaolxl/Gf_style/resolve/main/examples/e3.png>
<img src=https://huggingface.co/xiaolxl/Gf_style/resolve/main/examples/e4.png>
Closing images
<img src=https://huggingface.co/xiaolxl/Gf_style/resolve/main/examples/g1.png>
<img src=https://huggingface.co/xiaolxl/Gf_style/resolve/main/examples/g2.png>
| 2,768 | [
[
-0.042205810546875,
-0.06591796875,
0.0183563232421875,
0.034393310546875,
-0.045074462890625,
-0.0225677490234375,
0.009735107421875,
-0.05657958984375,
0.03631591796875,
0.0458984375,
-0.039215087890625,
-0.05810546875,
-0.04388427734375,
0.003627777099609375,
0.0166473388671875,
0.054901123046875,
-0.0123443603515625,
-0.00875091552734375,
0.0196685791015625,
0.005870819091796875,
-0.034820556640625,
0.0007734298706054688,
-0.06268310546875,
-0.0229949951171875,
-0.0030803680419921875,
0.020599365234375,
0.07025146484375,
0.06591796875,
0.041717529296875,
0.0277862548828125,
-0.011016845703125,
0.01274871826171875,
-0.0325927734375,
-0.02655029296875,
0.0136566162109375,
-0.027130126953125,
-0.06878662109375,
0.00435638427734375,
0.03314208984375,
0.00726318359375,
0.0021877288818359375,
0.0012950897216796875,
0.006008148193359375,
0.061126708984375,
-0.0099639892578125,
0.0241241455078125,
-0.0005245208740234375,
0.0022411346435546875,
-0.0269012451171875,
-0.007205963134765625,
0.004169464111328125,
-0.04718017578125,
-0.0005755424499511719,
-0.060791015625,
0.01629638671875,
0.0028705596923828125,
0.1307373046875,
-0.00217437744140625,
-0.0248565673828125,
0.0095672607421875,
-0.023712158203125,
0.03741455078125,
-0.05291748046875,
-0.0008578300476074219,
0.0217132568359375,
0.032196044921875,
-0.0206451416015625,
-0.05731201171875,
-0.03741455078125,
0.0031585693359375,
-0.033660888671875,
0.0182647705078125,
-0.01531219482421875,
-0.01300811767578125,
0.0246124267578125,
0.043548583984375,
-0.04180908203125,
-0.0168609619140625,
-0.0394287109375,
-0.0193023681640625,
0.06396484375,
-0.0028781890869140625,
0.06744384765625,
-0.0220184326171875,
-0.050140380859375,
-0.02276611328125,
-0.06298828125,
0.01552581787109375,
0.00499725341796875,
0.000782012939453125,
-0.053802490234375,
0.014617919921875,
-0.00980377197265625,
0.036346435546875,
0.01763916015625,
-0.0275421142578125,
0.040313720703125,
-0.0266265869140625,
-0.0200042724609375,
-0.03643798828125,
0.068115234375,
0.0694580078125,
0.016387939453125,
0.0115203857421875,
-0.016632080078125,
-0.017364501953125,
-0.0078277587890625,
-0.0762939453125,
-0.022430419921875,
0.0241546630859375,
-0.06103515625,
-0.0201263427734375,
0.0182952880859375,
-0.06915283203125,
0.00717926025390625,
0.01476287841796875,
0.026702880859375,
-0.036285400390625,
-0.046539306640625,
0.0018205642700195312,
-0.0190582275390625,
0.0186614990234375,
0.0297393798828125,
-0.054779052734375,
0.005306243896484375,
0.01319122314453125,
0.059051513671875,
0.019866943359375,
-0.004604339599609375,
-0.00560760498046875,
0.0197601318359375,
-0.035797119140625,
0.046722412109375,
-0.00029730796813964844,
-0.05572509765625,
-0.00914764404296875,
0.0240478515625,
0.01016998291015625,
-0.042877197265625,
0.054595947265625,
-0.0189666748046875,
0.0162506103515625,
-0.028717041015625,
-0.0019779205322265625,
-0.03497314453125,
0.01174163818359375,
-0.058837890625,
0.0755615234375,
0.03204345703125,
-0.058380126953125,
0.008544921875,
-0.04638671875,
-0.0220794677734375,
0.007297515869140625,
-0.002864837646484375,
-0.03594970703125,
-0.00862884521484375,
0.0152740478515625,
0.03216552734375,
-0.00579071044921875,
-0.02044677734375,
-0.01702880859375,
-0.032012939453125,
0.0010728836059570312,
0.01236724853515625,
0.07977294921875,
0.0218048095703125,
-0.049346923828125,
0.0033359527587890625,
-0.038970947265625,
0.0010509490966796875,
0.05535888671875,
-0.0024871826171875,
-0.034454345703125,
-0.01090240478515625,
0.0236968994140625,
0.039093017578125,
0.045318603515625,
-0.0244598388671875,
0.0291290283203125,
-0.038848876953125,
0.0211639404296875,
0.06097412109375,
-0.00662994384765625,
0.0221099853515625,
-0.056304931640625,
0.0229644775390625,
0.00457763671875,
0.0169677734375,
-0.0058135986328125,
-0.045623779296875,
-0.06396484375,
-0.0163421630859375,
-0.0027370452880859375,
0.03997802734375,
-0.066162109375,
0.038818359375,
-0.005992889404296875,
-0.05328369140625,
-0.02947998046875,
-0.007534027099609375,
0.045318603515625,
0.01337432861328125,
0.03582763671875,
-0.036407470703125,
-0.0288543701171875,
-0.061981201171875,
0.01422119140625,
-0.01485443115234375,
-0.01314544677734375,
0.04608154296875,
0.034881591796875,
-0.0201568603515625,
0.04071044921875,
-0.04913330078125,
-0.040191650390625,
-0.01029205322265625,
-0.0032215118408203125,
0.0171661376953125,
0.03961181640625,
0.09063720703125,
-0.06707763671875,
-0.038421630859375,
-0.0047760009765625,
-0.05548095703125,
-0.01141357421875,
0.010833740234375,
-0.03155517578125,
0.027862548828125,
0.00634765625,
-0.032806396484375,
0.03143310546875,
0.035919189453125,
-0.0498046875,
0.0625,
-0.0218353271484375,
0.03082275390625,
-0.0916748046875,
0.00887298583984375,
0.017913818359375,
-0.0018749237060546875,
-0.043701171875,
0.057861328125,
-0.00958251953125,
0.0246734619140625,
-0.03399658203125,
0.06591796875,
-0.0482177734375,
0.0214080810546875,
-0.0264129638671875,
0.0190277099609375,
0.010345458984375,
0.04144287109375,
0.0120849609375,
0.03106689453125,
0.04583740234375,
-0.032928466796875,
0.0347900390625,
0.033782958984375,
-0.0169219970703125,
0.0265350341796875,
-0.055938720703125,
0.0177764892578125,
-0.003509521484375,
0.01244354248046875,
-0.08233642578125,
-0.028076171875,
0.053314208984375,
-0.033966064453125,
0.0311431884765625,
-0.0026760101318359375,
-0.0251312255859375,
-0.04864501953125,
-0.049102783203125,
0.01053619384765625,
0.04217529296875,
-0.033233642578125,
0.034149169921875,
0.0246429443359375,
0.00015115737915039062,
-0.017303466796875,
-0.05963134765625,
-0.0109405517578125,
-0.0216217041015625,
-0.0718994140625,
0.055419921875,
-0.009307861328125,
-0.00887298583984375,
0.00417327880859375,
0.0130157470703125,
-0.01375579833984375,
-0.004253387451171875,
0.0129852294921875,
0.054412841796875,
-0.019073486328125,
-0.04193115234375,
-0.006542205810546875,
-0.01142120361328125,
0.0020961761474609375,
0.0207977294921875,
0.037994384765625,
-0.020263671875,
-0.014251708984375,
-0.041168212890625,
0.0207977294921875,
0.04437255859375,
0.0014448165893554688,
0.01476287841796875,
0.061431884765625,
-0.036895751953125,
0.022796630859375,
-0.041717529296875,
-0.0023441314697265625,
-0.040008544921875,
0.0172576904296875,
-0.034393310546875,
-0.0665283203125,
0.04913330078125,
0.005573272705078125,
0.008514404296875,
0.06427001953125,
0.0262908935546875,
0.00089263916015625,
0.08740234375,
0.05340576171875,
-0.01678466796875,
0.026763916015625,
-0.044219970703125,
-0.006305694580078125,
-0.05108642578125,
-0.032135009765625,
-0.02215576171875,
-0.03955078125,
-0.0635986328125,
-0.0304412841796875,
0.02886962890625,
0.01375579833984375,
-0.0204925537109375,
0.040130615234375,
-0.06024169921875,
0.006130218505859375,
0.0307159423828125,
0.034332275390625,
0.00931549072265625,
-0.0011262893676757812,
0.008026123046875,
-0.004863739013671875,
-0.01336669921875,
-0.027618408203125,
0.03125,
0.034637451171875,
0.040435791015625,
0.0183868408203125,
0.048980712890625,
-0.024169921875,
0.006603240966796875,
-0.0362548828125,
0.0478515625,
-0.0246429443359375,
-0.044952392578125,
-0.01505279541015625,
-0.0217437744140625,
-0.057159423828125,
0.0211944580078125,
-0.0243377685546875,
-0.067626953125,
0.01099395751953125,
0.0059051513671875,
-0.033233642578125,
0.037445068359375,
-0.03350830078125,
0.05828857421875,
-0.02252197265625,
-0.0244598388671875,
0.00121307373046875,
-0.0540771484375,
0.042816162109375,
0.0173492431640625,
0.0252685546875,
-0.01849365234375,
0.0068817138671875,
0.044921875,
-0.033599853515625,
0.038055419921875,
-0.0269622802734375,
-0.0031490325927734375,
0.027679443359375,
0.002109527587890625,
0.04052734375,
0.0016489028930664062,
0.01751708984375,
0.0227203369140625,
0.01259613037109375,
-0.02703857421875,
-0.036834716796875,
0.045196533203125,
-0.0628662109375,
-0.048370361328125,
-0.0312347412109375,
-0.0162353515625,
0.018524169921875,
0.0248870849609375,
0.054168701171875,
0.0124664306640625,
-0.016815185546875,
-0.0007266998291015625,
0.02703857421875,
-0.0156402587890625,
0.034332275390625,
0.0226593017578125,
-0.036895751953125,
-0.036590576171875,
0.064208984375,
0.0135345458984375,
0.0104217529296875,
0.0134429931640625,
0.018280029296875,
-0.01119232177734375,
-0.01593017578125,
-0.049072265625,
0.0282440185546875,
-0.00817108154296875,
-0.00972747802734375,
-0.0181884765625,
-0.034942626953125,
-0.042266845703125,
-0.0229034423828125,
-0.011993408203125,
-0.007381439208984375,
-0.0299835205078125,
0.01039886474609375,
0.036163330078125,
0.0316162109375,
-0.0168304443359375,
0.020965576171875,
-0.0601806640625,
0.043212890625,
0.01019287109375,
0.0156402587890625,
0.00007599592208862305,
-0.02728271484375,
-0.01493072509765625,
0.0177001953125,
-0.031280517578125,
-0.06591796875,
0.050994873046875,
-0.0066375732421875,
0.0262451171875,
0.051116943359375,
-0.0177459716796875,
0.048614501953125,
-0.02301025390625,
0.0621337890625,
0.02496337890625,
-0.040924072265625,
0.047027587890625,
-0.0438232421875,
0.01806640625,
0.027496337890625,
0.026947021484375,
-0.043975830078125,
-0.01007080078125,
-0.0562744140625,
-0.0657958984375,
0.050384521484375,
0.01629638671875,
0.0154876708984375,
0.011627197265625,
0.037200927734375,
-0.0242767333984375,
-0.001316070556640625,
-0.053314208984375,
-0.063232421875,
-0.02838134765625,
0.01629638671875,
0.027130126953125,
-0.0293731689453125,
-0.00701904296875,
-0.034423828125,
0.06878662109375,
0.0008635520935058594,
0.0494384765625,
0.0178375244140625,
0.032440185546875,
-0.0213775634765625,
-0.0031337738037109375,
0.0355224609375,
0.0285491943359375,
-0.022247314453125,
-0.0207977294921875,
0.00696563720703125,
-0.045989990234375,
-0.005382537841796875,
0.0211944580078125,
-0.032379150390625,
0.012359619140625,
0.00621795654296875,
0.0565185546875,
-0.004001617431640625,
-0.016448974609375,
0.04693603515625,
-0.00133514404296875,
-0.01129150390625,
-0.0227813720703125,
0.0099639892578125,
0.0039520263671875,
0.0219573974609375,
0.033294677734375,
-0.0020351409912109375,
0.0195465087890625,
-0.046295166015625,
0.01462554931640625,
0.0278472900390625,
-0.02081298828125,
-0.01399993896484375,
0.0531005859375,
0.00632476806640625,
-0.0089569091796875,
0.0328369140625,
-0.02923583984375,
-0.046600341796875,
0.062469482421875,
0.04754638671875,
0.058868408203125,
-0.040802001953125,
0.022216796875,
0.06207275390625,
0.0143585205078125,
-0.009429931640625,
0.045318603515625,
0.0254364013671875,
-0.05029296875,
-0.0171966552734375,
-0.039215087890625,
-0.00478363037109375,
0.036376953125,
-0.0250396728515625,
0.039794921875,
-0.053466796875,
-0.01300048828125,
-0.0266265869140625,
0.024749755859375,
-0.0214080810546875,
0.042083740234375,
-0.0034694671630859375,
0.07342529296875,
-0.05413818359375,
0.050750732421875,
0.052642822265625,
-0.052459716796875,
-0.08197021484375,
-0.0005016326904296875,
0.0290679931640625,
-0.06817626953125,
0.031341552734375,
0.0180816650390625,
-0.001995086669921875,
-0.00945281982421875,
-0.062225341796875,
-0.05975341796875,
0.09619140625,
0.01165008544921875,
-0.02252197265625,
-0.024505615234375,
-0.02203369140625,
0.03167724609375,
-0.0248870849609375,
0.0218048095703125,
0.0296783447265625,
0.049774169921875,
0.0281219482421875,
-0.0706787109375,
0.03826904296875,
-0.055999755859375,
0.01273345947265625,
-0.0019245147705078125,
-0.08526611328125,
0.067626953125,
-0.02392578125,
-0.031982421875,
0.034149169921875,
0.059906005859375,
0.03155517578125,
0.01194000244140625,
0.03997802734375,
0.04766845703125,
0.02655029296875,
-0.03607177734375,
0.06658935546875,
-0.0203704833984375,
0.0299530029296875,
0.048126220703125,
0.021759033203125,
0.0372314453125,
-0.00922393798828125,
-0.026763916015625,
0.05792236328125,
0.069580078125,
-0.032745361328125,
0.040802001953125,
-0.003021240234375,
-0.0126953125,
0.006420135498046875,
-0.01256561279296875,
-0.052154541015625,
0.0311737060546875,
0.01427459716796875,
-0.0197906494140625,
-0.01430511474609375,
-0.0194244384765625,
0.035491943359375,
-0.005962371826171875,
-0.0135955810546875,
0.0499267578125,
0.004886627197265625,
-0.0209808349609375,
0.045867919921875,
0.0233001708984375,
0.07537841796875,
-0.04766845703125,
-0.0219879150390625,
-0.0213775634765625,
-0.01194000244140625,
-0.0251312255859375,
-0.0673828125,
0.02117919921875,
-0.0307159423828125,
0.0038013458251953125,
-0.0034770965576171875,
0.0654296875,
-0.028350830078125,
-0.033477783203125,
0.0347900390625,
0.0221099853515625,
0.0223846435546875,
0.007373809814453125,
-0.072998046875,
0.004302978515625,
0.0139923095703125,
-0.03411865234375,
0.01520538330078125,
0.0275421142578125,
-0.0003147125244140625,
0.02801513671875,
0.0384521484375,
0.0167999267578125,
-0.037506103515625,
-0.00872039794921875,
0.07342529296875,
-0.04052734375,
-0.029205322265625,
-0.058807373046875,
0.04730224609375,
-0.0211944580078125,
-0.032135009765625,
0.0477294921875,
0.03533935546875,
0.0704345703125,
-0.012451171875,
0.06048583984375,
-0.035858154296875,
0.0221099853515625,
-0.03302001953125,
0.07574462890625,
-0.07012939453125,
-0.0262908935546875,
-0.057952880859375,
-0.0518798828125,
-0.0219573974609375,
0.07684326171875,
-0.0005555152893066406,
0.02349853515625,
0.031280517578125,
0.07110595703125,
0.01288604736328125,
-0.015899658203125,
0.00937652587890625,
0.02069091796875,
0.0096282958984375,
0.067626953125,
0.02587890625,
-0.06365966796875,
0.0218505859375,
-0.05853271484375,
-0.03546142578125,
-0.026641845703125,
-0.04815673828125,
-0.05133056640625,
-0.058746337890625,
-0.0209503173828125,
-0.045440673828125,
-0.011322021484375,
0.049774169921875,
0.0517578125,
-0.05426025390625,
-0.01470184326171875,
0.0207672119140625,
0.01139068603515625,
-0.02734375,
-0.021881103515625,
0.0479736328125,
0.0256805419921875,
-0.08062744140625,
0.0007839202880859375,
0.02215576171875,
0.04486083984375,
-0.0019969940185546875,
-0.00914764404296875,
-0.01265716552734375,
-0.0199432373046875,
0.03399658203125,
0.0291595458984375,
-0.053802490234375,
-0.033599853515625,
0.007228851318359375,
-0.0223846435546875,
0.01143646240234375,
0.017608642578125,
-0.0242767333984375,
0.021087646484375,
0.049224853515625,
-0.0127716064453125,
0.033782958984375,
-0.001953125,
0.02410888671875,
-0.03167724609375,
0.003414154052734375,
-0.0268096923828125,
0.041168212890625,
0.0164947509765625,
-0.04205322265625,
0.0635986328125,
0.041290283203125,
-0.0279693603515625,
-0.0230712890625,
0.011932373046875,
-0.0853271484375,
-0.0272979736328125,
0.0802001953125,
-0.0050811767578125,
-0.026519775390625,
0.01255035400390625,
-0.052093505859375,
0.037322998046875,
-0.02032470703125,
0.06011962890625,
0.05438232421875,
-0.00562286376953125,
-0.01288604736328125,
-0.0362548828125,
0.045867919921875,
-0.002925872802734375,
-0.046539306640625,
-0.0281219482421875,
0.052154541015625,
0.0174407958984375,
0.04443359375,
0.06524658203125,
-0.033050537109375,
0.029815673828125,
-0.0173187255859375,
0.019683837890625,
-0.00972747802734375,
0.0024852752685546875,
0.00021529197692871094,
0.002498626708984375,
-0.0211639404296875,
-0.0216064453125
]
] |
llm-agents/tora-70b-v1.0 | 2023-10-08T11:37:14.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"code",
"math",
"en",
"dataset:gsm8k",
"dataset:competition_math",
"arxiv:2309.17452",
"license:llama2",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | llm-agents | null | null | llm-agents/tora-70b-v1.0 | 14 | 6,702 | transformers | 2023-10-08T05:46:26 | ---
license: llama2
datasets:
- gsm8k
- competition_math
language:
- en
metrics:
- exact_match
library_name: transformers
pipeline_tag: text-generation
tags:
- code
- math
---
<h1 align="center">
ToRA: A Tool-Integrated Reasoning Agent <br> for Mathematical Problem Solving
</h1>
<p align="center">
<a href="https://microsoft.github.io/ToRA/"><b>[🌐 Website]</b></a> •
<a href="https://arxiv.org/abs/2309.17452"><b>[📜 Paper]</b></a> •
<a href="https://huggingface.co/llm-agents"><b>[🤗 HF Models]</b></a> •
<a href="https://github.com/microsoft/ToRA"><b>[🐱 GitHub]</b></a>
<br>
<a href="https://twitter.com/zhs05232838/status/1708860992631763092"><b>[🐦 Twitter]</b></a> •
<a href="https://www.reddit.com/r/LocalLLaMA/comments/1703k6d/tora_a_toolintegrated_reasoning_agent_for/"><b>[💬 Reddit]</b></a> •
<a href="https://notes.aimodels.fyi/researchers-announce-tora-training-language-models-to-better-understand-math-using-external-tools/">[🍀 Unofficial Blog]</a>
<!-- <a href="#-quick-start">Quick Start</a> • -->
<!-- <a href="#%EF%B8%8F-citation">Citation</a> -->
</p>
<p align="center">
Repo for "<a href="https://arxiv.org/abs/2309.17452" target="_blank">ToRA: A Tool-Integrated Reasoning Agent for Mathematical Problem Solving</a>"
</p>
## 🔥 News
- [2023/10/08] 🔥🔥🔥 All ToRA models released at [HuggingFace](https://huggingface.co/llm-agents)!!!
- [2023/09/29] ToRA paper, repo, and website released.
## 💡 Introduction
ToRA is a series of Tool-integrated Reasoning Agents designed to solve challenging mathematical reasoning problems by interacting with tools, e.g., computation libraries and symbolic solvers. The ToRA series seamlessly integrates natural language reasoning with external tool use, combining the analytical power of language with the computational efficiency of tools.
| Model | Size | GSM8k | MATH | AVG@10 math tasks<sup>†</sup> |
|---|---|---|---|---|
| GPT-4 | - | 92.0 | 42.5 | 78.3 |
| GPT-4 (PAL) | - | 94.2 | 51.8 | 86.4 |
| [ToRA-7B](https://huggingface.co/llm-agents/tora-7b-v1.0) | 7B | 68.8 | 40.1 | 62.4|
| [ToRA-Code-7B](https://huggingface.co/llm-agents/tora-code-7b-v1.0) | 7B | 72.6 | 44.6 | 66.5|
| [ToRA-13B](https://huggingface.co/llm-agents/tora-13b-v1.0) | 13B | 72.7 | 43.0 | 65.9|
| [ToRA-Code-13B](https://huggingface.co/llm-agents/tora-code-13b-v1.0) | 13B | 75.8 | 48.1 | 71.3 |
| [ToRA-Code-34B<sup>*</sup>](https://huggingface.co/llm-agents/tora-code-34b-v1.0) | 34B | 80.7 | **51.0** | 74.8 |
| [ToRA-70B](https://huggingface.co/llm-agents/tora-70b-v1.0) | 70B | **84.3** | 49.7 | **76.9** |
- <sup>*</sup>ToRA-Code-34B is currently the first and only open-source model to achieve over 50% accuracy (pass@1) on the MATH dataset. It significantly outperforms GPT-4's CoT result (51.0 vs. 42.5) and is competitive with GPT-4 when it solves problems with programs. By open-sourcing our code and models, we hope more breakthroughs will come!
- <sup>†</sup>10 math tasks include GSM8k, MATH, GSM-Hard, SVAMP, TabMWP, ASDiv, SingleEQ, SingleOP, AddSub, and MultiArith.
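The tool-integrated reasoning behavior described above can be sketched as a loop: the model emits a natural-language rationale with an embedded program, the program is executed, and its output is fed back into the context until the model produces a final answer. The sketch below illustrates that loop only; `fake_generate` is a hypothetical stand-in for a call to the actual ToRA model, and the prompt/markup conventions are simplified assumptions, not the model's real format.

```python
import io
import contextlib

def run_program(code: str) -> str:
    """Execute a generated code snippet and capture its stdout."""
    buf = io.StringIO()
    with contextlib.redirect_stdout(buf):
        exec(code, {})
    return buf.getvalue().strip()

def fake_generate(prompt: str) -> str:
    # Hypothetical stub: a real system would call the ToRA model here.
    # It emits one code block, then a final answer once it sees the
    # execution output appended to the prompt.
    if "```output" not in prompt:
        return "Let's compute this with a program.\n```python\nprint(17 * 24)\n```"
    return "The product is 408. Final answer: 408"

def tora_style_solve(question: str, max_rounds: int = 4) -> str:
    """Interleave rationales, program execution, and feedback."""
    prompt = question
    for _ in range(max_rounds):
        step = fake_generate(prompt)
        prompt += "\n" + step
        if "Final answer:" in step:
            return step.split("Final answer:")[-1].strip()
        # Extract the generated code, run it, and append its output.
        code = step.split("```python")[1].split("```")[0]
        prompt += f"\n```output\n{run_program(code)}\n```"
    return "no answer"

print(tora_style_solve("What is 17 * 24?"))  # → 408
```

The key design point is the `output` block: by feeding execution results back into the context, the agent can correct or continue its reasoning instead of trusting its own arithmetic.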
## ⚡️ Training
The models are trained on ToRA-Corpus 16k, which contains tool-integrated reasoning trajectories of MATH and GSM8k from GPT-4.
We use imitation learning (i.e., SFT) to fine-tune the models, and then apply our proposed *output space shaping* to improve tool-integrated reasoning behaviors. Please refer to the [paper](https://arxiv.org/pdf/2309.17452.pdf) for more details.
## 🪁 Inference & Evaluation
Please refer to ToRA's [GitHub repo](https://github.com/microsoft/ToRA) for inference, evaluation, and training code.
## ☕️ Citation
If you find this repository helpful, please consider citing our paper:
```
@misc{gou2023tora,
title={ToRA: A Tool-Integrated Reasoning Agent for Mathematical Problem Solving},
author={Zhibin Gou and Zhihong Shao and Yeyun Gong and yelong shen and Yujiu Yang and Minlie Huang and Nan Duan and Weizhu Chen},
year={2023},
eprint={2309.17452},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` | 4,077 | [
[
-0.02825927734375,
-0.05963134765625,
0.05047607421875,
0.033477783203125,
-0.01611328125,
0.032989501953125,
0.01551055908203125,
-0.0261993408203125,
0.0374755859375,
0.03289794921875,
-0.049285888671875,
-0.0203094482421875,
-0.032562255859375,
0.009307861328125,
0.006481170654296875,
0.031982421875,
0.00377655029296875,
0.017242431640625,
-0.016143798828125,
-0.0220947265625,
-0.052490234375,
-0.0262451171875,
-0.0357666015625,
-0.027191162109375,
-0.009368896484375,
-0.00727081298828125,
0.06890869140625,
0.0369873046875,
0.0285186767578125,
0.029052734375,
0.00008881092071533203,
0.036163330078125,
-0.0015726089477539062,
-0.007198333740234375,
-0.004901885986328125,
-0.03564453125,
-0.05352783203125,
0.01158905029296875,
0.0484619140625,
0.0274505615234375,
-0.01715087890625,
0.0048675537109375,
-0.0145416259765625,
0.031463623046875,
-0.0307464599609375,
0.0219573974609375,
-0.0206756591796875,
-0.01715087890625,
-0.0010061264038085938,
-0.0044708251953125,
-0.046417236328125,
-0.0357666015625,
-0.0007219314575195312,
-0.07257080078125,
-0.0115203857421875,
0.0005345344543457031,
0.08758544921875,
0.0278167724609375,
-0.024139404296875,
-0.0171356201171875,
-0.046783447265625,
0.06256103515625,
-0.065673828125,
0.02142333984375,
-0.00131988525390625,
0.0225677490234375,
-0.01494598388671875,
-0.053619384765625,
-0.049163818359375,
-0.01543426513671875,
-0.00641632080078125,
0.0345458984375,
-0.04608154296875,
-0.019134521484375,
0.03436279296875,
-0.00041222572326660156,
-0.04022216796875,
-0.019012451171875,
-0.038360595703125,
-0.012054443359375,
0.043670654296875,
0.018829345703125,
0.030059814453125,
0.0010166168212890625,
-0.0096435546875,
-0.0129852294921875,
-0.054107666015625,
0.00591278076171875,
0.025115966796875,
0.0037670135498046875,
-0.0245819091796875,
0.0210113525390625,
0.0233917236328125,
0.044647216796875,
-0.0016794204711914062,
-0.00464630126953125,
0.0357666015625,
-0.006519317626953125,
-0.0205535888671875,
-0.026336669921875,
0.07025146484375,
-0.00554656982421875,
0.0075225830078125,
-0.001617431640625,
-0.00872039794921875,
0.00705718994140625,
0.021820068359375,
-0.05609130859375,
0.00943756103515625,
0.007579803466796875,
-0.004344940185546875,
-0.01131439208984375,
0.0239410400390625,
-0.05145263671875,
-0.00470733642578125,
-0.0232696533203125,
0.04571533203125,
-0.03472900390625,
-0.0186309814453125,
0.037933349609375,
0.0012054443359375,
0.01300048828125,
0.048980712890625,
-0.01375579833984375,
0.0285491943359375,
0.04022216796875,
0.0726318359375,
0.00991058349609375,
-0.01396942138671875,
-0.05078125,
-0.00849151611328125,
-0.0261993408203125,
0.0506591796875,
-0.029541015625,
-0.0180206298828125,
-0.0271148681640625,
0.0034046173095703125,
-0.0129852294921875,
-0.02801513671875,
0.0033397674560546875,
-0.052001953125,
0.02728271484375,
-0.00890350341796875,
-0.016265869140625,
-0.029052734375,
0.01055145263671875,
-0.0657958984375,
0.072509765625,
0.0248565673828125,
-0.03045654296875,
-0.0120086669921875,
-0.0645751953125,
-0.0063629150390625,
-0.0031757354736328125,
-0.0041961669921875,
-0.036712646484375,
-0.0225677490234375,
0.015869140625,
0.0033397674560546875,
-0.0592041015625,
0.038360595703125,
-0.03533935546875,
-0.0171051025390625,
0.0200347900390625,
-0.001956939697265625,
0.09405517578125,
0.02313232421875,
-0.0379638671875,
0.0196380615234375,
-0.033233642578125,
0.0143890380859375,
0.037872314453125,
0.0167083740234375,
-0.01593017578125,
-0.024017333984375,
-0.024169921875,
0.03314208984375,
0.01215362548828125,
-0.04150390625,
0.0218963623046875,
-0.04327392578125,
0.047637939453125,
0.08282470703125,
-0.0060577392578125,
0.022735595703125,
-0.0245361328125,
0.07318115234375,
0.0037670135498046875,
0.02276611328125,
0.01024627685546875,
-0.05517578125,
-0.04290771484375,
-0.024444580078125,
0.021453857421875,
0.05078125,
-0.07843017578125,
0.03350830078125,
-0.0101318359375,
-0.0579833984375,
-0.033721923828125,
-0.01245880126953125,
0.044647216796875,
0.033416748046875,
0.0229034423828125,
-0.008148193359375,
-0.029388427734375,
-0.0538330078125,
-0.0167694091796875,
-0.004688262939453125,
0.01068115234375,
0.029327392578125,
0.0494384765625,
-0.0040435791015625,
0.0755615234375,
-0.06402587890625,
-0.0010833740234375,
-0.0187225341796875,
-0.00750732421875,
0.0299224853515625,
0.0279083251953125,
0.055511474609375,
-0.052337646484375,
-0.05706787109375,
-0.01192474365234375,
-0.062469482421875,
-0.00823974609375,
-0.01385498046875,
-0.017913818359375,
0.0156402587890625,
0.0172882080078125,
-0.0499267578125,
0.038543701171875,
0.018402099609375,
-0.052886962890625,
0.03680419921875,
0.0106201171875,
-0.0146636962890625,
-0.1025390625,
0.01015472412109375,
0.0193023681640625,
-0.0175018310546875,
-0.02984619140625,
0.01763916015625,
-0.008636474609375,
-0.002391815185546875,
-0.02838134765625,
0.07305908203125,
-0.0244293212890625,
0.0050506591796875,
-0.001312255859375,
0.01611328125,
-0.00003337860107421875,
0.05328369140625,
-0.0170745849609375,
0.0968017578125,
0.032623291015625,
-0.0250396728515625,
0.0212554931640625,
0.02484130859375,
0.00893402099609375,
0.00848388671875,
-0.061492919921875,
0.0206451416015625,
0.0087127685546875,
0.016204833984375,
-0.0438232421875,
0.023529052734375,
0.0374755859375,
-0.05181884765625,
-0.0016450881958007812,
0.00421142578125,
-0.053985595703125,
-0.020294189453125,
-0.03460693359375,
0.01824951171875,
0.04803466796875,
-0.0306854248046875,
0.07952880859375,
0.031097412109375,
0.00856781005859375,
-0.05078125,
-0.0155792236328125,
-0.0182037353515625,
-0.0104217529296875,
-0.07049560546875,
0.01369476318359375,
-0.033477783203125,
-0.040283203125,
0.00870513916015625,
-0.011627197265625,
-0.0008225440979003906,
0.0029354095458984375,
0.01427459716796875,
0.0450439453125,
-0.0257720947265625,
0.0070037841796875,
0.005939483642578125,
-0.028778076171875,
0.01129150390625,
-0.014862060546875,
0.062042236328125,
-0.0645751953125,
-0.0163421630859375,
-0.021026611328125,
0.018798828125,
0.05023193359375,
-0.02655029296875,
0.051300048828125,
0.0274658203125,
-0.0350341796875,
-0.0035419464111328125,
-0.04119873046875,
-0.024383544921875,
-0.04107666015625,
0.007495880126953125,
-0.04168701171875,
-0.053863525390625,
0.0531005859375,
-0.0025463104248046875,
-0.00502777099609375,
0.064208984375,
0.0281829833984375,
0.02569580078125,
0.0863037109375,
0.05023193359375,
-0.006252288818359375,
0.03192138671875,
-0.057464599609375,
0.0174713134765625,
-0.06365966796875,
-0.0185699462890625,
-0.037628173828125,
-0.0013294219970703125,
-0.032012939453125,
-0.0174102783203125,
0.0465087890625,
0.003902435302734375,
-0.0364990234375,
0.04278564453125,
-0.052001953125,
0.03509521484375,
0.05096435546875,
0.00928497314453125,
0.0175018310546875,
-0.00879669189453125,
-0.01079559326171875,
-0.0041046142578125,
-0.043487548828125,
-0.04058837890625,
0.067138671875,
0.021697998046875,
0.0447998046875,
0.028594970703125,
0.0281829833984375,
0.00710296630859375,
0.01666259765625,
-0.04412841796875,
0.057220458984375,
0.007045745849609375,
-0.0254364013671875,
-0.023162841796875,
-0.035675048828125,
-0.0704345703125,
0.01531219482421875,
0.01212310791015625,
-0.06536865234375,
0.01800537109375,
-0.006786346435546875,
-0.029510498046875,
0.030029296875,
-0.05328369140625,
0.055206298828125,
-0.0003781318664550781,
-0.0355224609375,
-0.02130126953125,
-0.043426513671875,
0.0263214111328125,
0.0050201416015625,
0.0014705657958984375,
0.00893402099609375,
0.005489349365234375,
0.06121826171875,
-0.0672607421875,
0.048919677734375,
-0.01226806640625,
-0.004230499267578125,
0.039093017578125,
0.0259857177734375,
0.05181884765625,
0.0260772705078125,
-0.00811004638671875,
0.01322174072265625,
0.011016845703125,
-0.0243072509765625,
-0.06591796875,
0.041290283203125,
-0.0706787109375,
-0.05438232421875,
-0.07391357421875,
-0.05194091796875,
-0.01210784912109375,
0.027740478515625,
0.00855255126953125,
0.036651611328125,
0.041961669921875,
0.0037899017333984375,
0.052276611328125,
-0.001323699951171875,
0.0306549072265625,
0.05035400390625,
-0.00200653076171875,
-0.0347900390625,
0.07244873046875,
0.0180206298828125,
0.017974853515625,
0.0224456787109375,
0.0155792236328125,
-0.02685546875,
-0.0189971923828125,
-0.03778076171875,
0.05078125,
-0.049407958984375,
-0.033966064453125,
-0.036468505859375,
-0.042083740234375,
-0.029205322265625,
-0.0220489501953125,
-0.0310211181640625,
-0.030029296875,
-0.034210205078125,
0.016204833984375,
0.0550537109375,
0.049346923828125,
0.006977081298828125,
0.0250396728515625,
-0.051025390625,
0.0155792236328125,
0.01071929931640625,
0.026031494140625,
0.0036029815673828125,
-0.034149169921875,
0.0021724700927734375,
-0.00009363889694213867,
-0.04498291015625,
-0.069091796875,
0.054840087890625,
-0.023406982421875,
0.03704833984375,
0.002498626708984375,
-0.0010166168212890625,
0.043975830078125,
-0.003826141357421875,
0.0457763671875,
0.01233673095703125,
-0.0966796875,
0.0399169921875,
-0.028228759765625,
0.0157012939453125,
0.0025043487548828125,
0.010955810546875,
-0.026641845703125,
-0.022979736328125,
-0.07232666015625,
-0.0364990234375,
0.08380126953125,
0.0273895263671875,
-0.0181427001953125,
0.01174163818359375,
0.0243682861328125,
0.00643157958984375,
0.006549835205078125,
-0.0574951171875,
-0.0282745361328125,
-0.0264434814453125,
-0.01551055908203125,
0.018524169921875,
0.007083892822265625,
-0.00844573974609375,
-0.0173797607421875,
0.07635498046875,
-0.0281524658203125,
0.038970947265625,
0.008758544921875,
-0.0112152099609375,
-0.0012273788452148438,
0.00469207763671875,
0.07086181640625,
0.06512451171875,
-0.01436614990234375,
-0.01922607421875,
0.0034160614013671875,
-0.06585693359375,
0.007389068603515625,
0.01047515869140625,
-0.0263824462890625,
-0.00881195068359375,
0.021484375,
0.05816650390625,
-0.0145416259765625,
-0.0550537109375,
0.0303192138671875,
0.00366973876953125,
-0.01119232177734375,
-0.0294189453125,
0.004863739013671875,
0.00011271238327026367,
0.0288848876953125,
0.0165557861328125,
0.008636474609375,
0.0025844573974609375,
-0.0283050537109375,
-0.0006818771362304688,
0.036590576171875,
-0.0155792236328125,
-0.0248260498046875,
0.038330078125,
-0.0023822784423828125,
-0.042449951171875,
0.04779052734375,
-0.041961669921875,
-0.04443359375,
0.0771484375,
0.05810546875,
0.06927490234375,
-0.004589080810546875,
0.0228271484375,
0.0294647216796875,
0.042022705078125,
0.005390167236328125,
0.05047607421875,
0.02459716796875,
-0.04718017578125,
-0.023223876953125,
-0.0163726806640625,
-0.0308837890625,
0.01483154296875,
-0.037139892578125,
0.0224151611328125,
-0.053985595703125,
-0.004314422607421875,
-0.0011377334594726562,
0.0218353271484375,
-0.041229248046875,
-0.010650634765625,
-0.041656494140625,
0.0745849609375,
-0.03997802734375,
0.05755615234375,
0.05145263671875,
-0.06005859375,
-0.0794677734375,
-0.0166778564453125,
0.01383209228515625,
-0.0748291015625,
0.0281829833984375,
-0.0036983489990234375,
-0.031646728515625,
0.01232147216796875,
-0.0579833984375,
-0.06793212890625,
0.10101318359375,
0.054351806640625,
-0.0162811279296875,
0.0015878677368164062,
0.00136566162109375,
0.029083251953125,
-0.0279388427734375,
0.0239410400390625,
0.01007080078125,
0.04217529296875,
0.00943756103515625,
-0.0673828125,
0.04034423828125,
-0.05908203125,
-0.012115478515625,
0.03240966796875,
-0.080810546875,
0.07830810546875,
-0.007625579833984375,
-0.005252838134765625,
0.00841522216796875,
0.0341796875,
0.0435791015625,
0.0303192138671875,
0.031158447265625,
0.042510986328125,
0.041412353515625,
-0.0261383056640625,
0.06402587890625,
-0.00818634033203125,
0.03521728515625,
0.06573486328125,
-0.01383209228515625,
0.03167724609375,
0.017791748046875,
-0.0311279296875,
0.051544189453125,
0.033538818359375,
-0.0222320556640625,
0.0189971923828125,
-0.0010509490966796875,
0.0033893585205078125,
-0.03515625,
0.0070343017578125,
-0.037139892578125,
0.0108184814453125,
0.029815673828125,
0.006359100341796875,
-0.0185699462890625,
-0.0037841796875,
0.0022068023681640625,
0.01512908935546875,
0.00016987323760986328,
0.038360595703125,
0.00989532470703125,
-0.0408935546875,
0.03253173828125,
0.009796142578125,
0.0335693359375,
-0.06109619140625,
-0.024017333984375,
-0.0090789794921875,
0.010894775390625,
-0.0005917549133300781,
-0.058380126953125,
0.0179290771484375,
-0.018890380859375,
-0.00853729248046875,
0.0016727447509765625,
0.03741455078125,
0.0109710693359375,
-0.036529541015625,
0.0200653076171875,
0.03607177734375,
0.0016727447509765625,
-0.01094818115234375,
-0.06121826171875,
-0.00171661376953125,
-0.007686614990234375,
-0.01373291015625,
0.00637054443359375,
0.020111083984375,
-0.031829833984375,
0.07135009765625,
0.053985595703125,
-0.01113128662109375,
0.00274658203125,
-0.00730133056640625,
0.0726318359375,
-0.05181884765625,
-0.0479736328125,
-0.062042236328125,
0.03887939453125,
-0.006938934326171875,
-0.0294189453125,
0.05560302734375,
0.04705810546875,
0.04644775390625,
-0.01727294921875,
0.033966064453125,
0.0012540817260742188,
0.026580810546875,
-0.040130615234375,
0.0537109375,
-0.044891357421875,
0.034332275390625,
-0.0145416259765625,
-0.05157470703125,
-0.0227813720703125,
0.0367431640625,
-0.0191802978515625,
0.0206756591796875,
0.067626953125,
0.05322265625,
-0.00815582275390625,
0.0011615753173828125,
-0.0082855224609375,
0.0145263671875,
0.045135498046875,
0.058258056640625,
0.042999267578125,
-0.046905517578125,
0.0308685302734375,
-0.0200347900390625,
-0.00818634033203125,
-0.00991058349609375,
-0.04180908203125,
-0.064697265625,
-0.05413818359375,
-0.019775390625,
-0.05694580078125,
-0.0153961181640625,
0.07293701171875,
0.04852294921875,
-0.037139892578125,
-0.0150146484375,
-0.0206298828125,
0.037384033203125,
-0.025970458984375,
-0.0235748291015625,
0.04541015625,
0.003658294677734375,
-0.048736572265625,
0.019775390625,
0.0230560302734375,
0.00750732421875,
-0.02276611328125,
-0.034210205078125,
-0.0195770263671875,
0.03167724609375,
0.0323486328125,
0.033782958984375,
-0.07635498046875,
-0.005847930908203125,
0.0443115234375,
-0.00016796588897705078,
0.011260986328125,
0.048492431640625,
-0.0753173828125,
0.027587890625,
0.034576416015625,
0.030609130859375,
0.034637451171875,
-0.031829833984375,
0.0382080078125,
-0.0225067138671875,
0.00885772705078125,
0.0300750732421875,
0.0288238525390625,
-0.0164642333984375,
-0.043853759765625,
0.07025146484375,
0.035491943359375,
-0.0093536376953125,
-0.087646484375,
0.00476837158203125,
-0.10736083984375,
-0.01453399658203125,
0.06695556640625,
-0.0031261444091796875,
-0.0167083740234375,
0.00405120849609375,
-0.0167236328125,
0.01161956787109375,
-0.051055908203125,
0.052001953125,
0.04571533203125,
-0.00522613525390625,
-0.005481719970703125,
-0.031463623046875,
0.01203155517578125,
0.021942138671875,
-0.09332275390625,
-0.01702880859375,
0.0176849365234375,
0.0170745849609375,
0.044891357421875,
0.0452880859375,
-0.0246429443359375,
0.052337646484375,
-0.004184722900390625,
-0.0017080307006835938,
-0.0465087890625,
-0.0275115966796875,
-0.0268096923828125,
-0.002471923828125,
-0.0247039794921875,
-0.00716400146484375
]
] |
vinai/bertweet-large | 2022-06-08T04:43:57.000Z | [
"transformers",
"pytorch",
"tf",
"roberta",
"fill-mask",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | fill-mask | vinai | null | null | vinai/bertweet-large | 6 | 6,695 | transformers | 2022-03-02T23:29:05 | # <a name="introduction"></a> BERTweet: A pre-trained language model for English Tweets
BERTweet is the first public large-scale language model pre-trained for English Tweets. BERTweet is trained using the [RoBERTa](https://github.com/pytorch/fairseq/blob/master/examples/roberta/README.md) pre-training procedure. The corpus used to pre-train BERTweet consists of 850M English Tweets (16B word tokens, ~80GB), comprising 845M Tweets streamed from 01/2012 to 08/2019 and 5M Tweets related to the **COVID-19** pandemic. The general architecture and experimental results of BERTweet can be found in our [paper](https://aclanthology.org/2020.emnlp-demos.2/):
@inproceedings{bertweet,
title = {{BERTweet: A pre-trained language model for English Tweets}},
author = {Dat Quoc Nguyen and Thanh Vu and Anh Tuan Nguyen},
booktitle = {Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: System Demonstrations},
pages = {9--14},
year = {2020}
}
**Please CITE** our paper when BERTweet is used to help produce published results or is incorporated into other software.
For further information or requests, please go to [BERTweet's homepage](https://github.com/VinAIResearch/BERTweet)!
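Because the model was pre-trained on raw Tweets, input text is normalized before tokenization: user mentions are replaced with a special `@USER` token and links with `HTTPURL`. The snippet below is a simplified sketch of that idea, not the official implementation (the `normalizeTweet` function in the BERTweet repository also uses NLTK's `TweetTokenizer` and handles emoji and contractions).

```python
import re

def normalize_tweet(text: str) -> str:
    """Simplified tweet normalization: mask mentions and links."""
    text = re.sub(r"@\w+", "@USER", text)           # user mentions -> @USER
    text = re.sub(r"https?://\S+|www\.\S+", "HTTPURL", text)  # links -> HTTPURL
    return " ".join(text.split())                   # collapse whitespace

print(normalize_tweet("@bob check https://example.com now"))
# → @USER check HTTPURL now
```

Applying the same normalization at inference time as at pre-training time keeps the input distribution consistent with what the model saw during training.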
### Main results
<p float="left">
<img width="275" alt="postagging" src="https://user-images.githubusercontent.com/2412555/135724590-01d8d435-262d-44fe-a383-cd39324fe190.png" />
<img width="275" alt="ner" src="https://user-images.githubusercontent.com/2412555/135724598-1e3605e7-d8ce-4c5e-be4a-62ae8501fae7.png" />
</p>
<p float="left">
<img width="275" alt="sentiment" src="https://user-images.githubusercontent.com/2412555/135724597-f1981f1e-fe73-4c03-b1ff-0cae0cc5f948.png" />
<img width="275" alt="irony" src="https://user-images.githubusercontent.com/2412555/135724595-15f4f2c8-bbb6-4ee6-82a0-034769dec183.png" />
</p>
| 1,891 | [
[
-0.03759765625,
-0.04571533203125,
0.0215301513671875,
0.0213165283203125,
-0.0347900390625,
0.0022220611572265625,
-0.0269927978515625,
-0.0399169921875,
0.03936767578125,
0.01148223876953125,
-0.040740966796875,
-0.046600341796875,
-0.047119140625,
-0.01444244384765625,
-0.029449462890625,
0.06744384765625,
0.00988006591796875,
-0.01529693603515625,
0.0164794921875,
0.006465911865234375,
0.0015764236450195312,
-0.037200927734375,
-0.05096435546875,
-0.0023326873779296875,
0.052764892578125,
0.01410675048828125,
0.056610107421875,
0.0119171142578125,
0.044097900390625,
0.0185699462890625,
0.01172637939453125,
-0.01206207275390625,
-0.04693603515625,
-0.0033550262451171875,
0.01242828369140625,
0.006542205810546875,
-0.026458740234375,
0.0007867813110351562,
0.02923583984375,
0.024566650390625,
0.0081939697265625,
0.01129913330078125,
0.00734710693359375,
0.045196533203125,
-0.041046142578125,
0.0007624626159667969,
-0.049346923828125,
-0.01467132568359375,
-0.0283050537109375,
-0.0159912109375,
-0.0206451416015625,
-0.0445556640625,
0.0286712646484375,
-0.044036865234375,
0.0052947998046875,
-0.003398895263671875,
0.117431640625,
-0.01456451416015625,
-0.0174102783203125,
-0.007305145263671875,
-0.034393310546875,
0.053863525390625,
-0.06231689453125,
0.026611328125,
0.022705078125,
0.01305389404296875,
0.0017547607421875,
-0.0655517578125,
-0.04498291015625,
-0.01776123046875,
-0.0097503662109375,
0.0189971923828125,
-0.03466796875,
-0.0089569091796875,
-0.0004100799560546875,
-0.0006437301635742188,
-0.05072021484375,
-0.0017375946044921875,
-0.012237548828125,
-0.0021514892578125,
0.039703369140625,
-0.0289459228515625,
0.0204925537109375,
-0.022613525390625,
-0.04437255859375,
-0.0167083740234375,
-0.0304107666015625,
0.004497528076171875,
0.0095672607421875,
0.010162353515625,
-0.0205841064453125,
0.0189208984375,
0.01049041748046875,
0.0279083251953125,
-0.005634307861328125,
0.01143646240234375,
0.03173828125,
-0.01152801513671875,
-0.0133209228515625,
-0.02923583984375,
0.0863037109375,
0.028778076171875,
0.035064697265625,
-0.0030117034912109375,
-0.033538818359375,
-0.020721435546875,
0.01519012451171875,
-0.07098388671875,
-0.0210418701171875,
0.01158905029296875,
-0.03656005859375,
-0.0195770263671875,
0.0052947998046875,
-0.046966552734375,
-0.00859832763671875,
-0.00592041015625,
0.05682373046875,
-0.061767578125,
-0.02423095703125,
-0.0102996826171875,
-0.004383087158203125,
0.0311279296875,
0.042938232421875,
-0.049041748046875,
-0.0019683837890625,
0.0404052734375,
0.05755615234375,
0.0218048095703125,
-0.0196533203125,
-0.0038700103759765625,
0.006038665771484375,
-0.0225372314453125,
0.060882568359375,
-0.02056884765625,
-0.0037899017333984375,
0.018585205078125,
0.0012750625610351562,
-0.0001685619354248047,
-0.0005555152893066406,
0.04241943359375,
-0.0460205078125,
0.0149078369140625,
-0.03265380859375,
-0.04290771484375,
-0.010528564453125,
0.0186309814453125,
-0.034820556640625,
0.07330322265625,
0.00040912628173828125,
-0.0809326171875,
0.0307769775390625,
-0.06744384765625,
-0.038116455078125,
0.01556396484375,
-0.01433563232421875,
-0.03790283203125,
-0.0098724365234375,
0.028778076171875,
0.05889892578125,
-0.012847900390625,
0.007144927978515625,
-0.03900146484375,
-0.0190887451171875,
0.00960540771484375,
-0.019256591796875,
0.08770751953125,
0.00942230224609375,
-0.0308380126953125,
-0.0074462890625,
-0.03564453125,
0.001514434814453125,
0.0396728515625,
-0.015411376953125,
-0.012725830078125,
-0.0039043426513671875,
0.0262298583984375,
0.0160980224609375,
0.046173095703125,
-0.04779052734375,
0.008819580078125,
-0.013519287109375,
0.04119873046875,
0.06982421875,
-0.00804901123046875,
0.0205841064453125,
-0.0430908203125,
0.0223541259765625,
-0.0159912109375,
0.024505615234375,
-0.0124053955078125,
-0.032928466796875,
-0.04071044921875,
-0.04046630859375,
0.0156707763671875,
0.0361328125,
-0.041473388671875,
0.051422119140625,
-0.04229736328125,
-0.04962158203125,
-0.04522705078125,
-0.017425537109375,
0.006366729736328125,
0.0297393798828125,
0.023895263671875,
-0.006389617919921875,
-0.046234130859375,
-0.070068359375,
-0.00899505615234375,
-0.024078369140625,
0.00099945068359375,
0.020477294921875,
0.051239013671875,
-0.0188751220703125,
0.07177734375,
-0.034423828125,
-0.01004791259765625,
0.005733489990234375,
0.022186279296875,
0.0119476318359375,
0.038330078125,
0.0682373046875,
-0.058685302734375,
-0.054229736328125,
-0.00405120849609375,
-0.03887939453125,
-0.006000518798828125,
-0.0053863525390625,
-0.020263671875,
0.03826904296875,
0.0226287841796875,
-0.05328369140625,
0.040252685546875,
0.055206298828125,
-0.019287109375,
0.04730224609375,
-0.00820159912109375,
0.0239410400390625,
-0.0977783203125,
0.01374053955078125,
0.00286865234375,
-0.0273284912109375,
-0.03350830078125,
-0.00811004638671875,
0.0083160400390625,
0.002277374267578125,
-0.0311279296875,
0.0401611328125,
-0.033172607421875,
0.0008230209350585938,
0.00365447998046875,
0.001987457275390625,
-0.001300811767578125,
0.030670166015625,
-0.003940582275390625,
0.052032470703125,
0.050537109375,
-0.01128387451171875,
0.018524169921875,
-0.00024139881134033203,
-0.0300140380859375,
0.0230255126953125,
-0.060211181640625,
0.011810302734375,
0.0020999908447265625,
0.024383544921875,
-0.092529296875,
-0.0201263427734375,
0.00628662109375,
-0.0633544921875,
0.01158905029296875,
-0.01351165771484375,
-0.0643310546875,
-0.03204345703125,
-0.041534423828125,
0.0011844635009765625,
0.06298828125,
-0.041290283203125,
0.03326416015625,
0.0228271484375,
-0.003154754638671875,
-0.042816162109375,
-0.064208984375,
0.00830841064453125,
0.0011243820190429688,
-0.0784912109375,
0.0196075439453125,
-0.0008320808410644531,
0.0122222900390625,
0.0185699462890625,
0.00865936279296875,
-0.0264739990234375,
-0.0006299018859863281,
0.01551055908203125,
0.02203369140625,
-0.01389312744140625,
0.027374267578125,
-0.0196990966796875,
0.01139068603515625,
-0.00951385498046875,
-0.0263824462890625,
0.05548095703125,
-0.02197265625,
-0.031463623046875,
-0.0452880859375,
0.0269317626953125,
0.045440673828125,
-0.0178375244140625,
0.0701904296875,
0.07330322265625,
-0.0298004150390625,
0.00048542022705078125,
-0.046966552734375,
-0.01788330078125,
-0.03338623046875,
0.0218658447265625,
-0.0157623291015625,
-0.0731201171875,
0.04913330078125,
0.0163116455078125,
0.0291595458984375,
0.04180908203125,
0.042236328125,
-0.057159423828125,
0.0543212890625,
0.021575927734375,
-0.0003070831298828125,
0.057891845703125,
-0.0277252197265625,
0.021514892578125,
-0.027496337890625,
0.00759124755859375,
-0.035888671875,
-0.02362060546875,
-0.06695556640625,
-0.0277099609375,
0.011871337890625,
-0.0011577606201171875,
-0.032958984375,
0.03851318359375,
-0.037872314453125,
0.0168304443359375,
0.044281005859375,
0.0200958251953125,
-0.01561737060546875,
0.00433349609375,
-0.001010894775390625,
-0.00530242919921875,
-0.0374755859375,
-0.042327880859375,
0.08538818359375,
0.029510498046875,
0.0618896484375,
0.01442718505859375,
0.08209228515625,
0.01544189453125,
0.0199432373046875,
-0.042266845703125,
0.044097900390625,
-0.025787353515625,
-0.0648193359375,
-0.033782958984375,
-0.0224761962890625,
-0.0906982421875,
0.00469207763671875,
-0.021240234375,
-0.052093505859375,
0.005641937255859375,
0.01386260986328125,
-0.0132904052734375,
0.037017822265625,
-0.07403564453125,
0.05255126953125,
-0.0229339599609375,
-0.0190277099609375,
-0.0108642578125,
-0.04742431640625,
0.0029201507568359375,
-0.007144927978515625,
0.0199432373046875,
-0.01454925537109375,
-0.0026645660400390625,
0.059906005859375,
-0.037139892578125,
0.06890869140625,
-0.02191162109375,
0.00415802001953125,
0.011627197265625,
-0.0016345977783203125,
0.052093505859375,
0.00522613525390625,
0.00931549072265625,
0.0191650390625,
0.00045228004455566406,
-0.05889892578125,
-0.01751708984375,
0.045013427734375,
-0.07568359375,
-0.036712646484375,
-0.0304412841796875,
-0.0146331787109375,
0.006900787353515625,
0.0283966064453125,
0.03948974609375,
0.0228271484375,
-0.0240020751953125,
0.04779052734375,
0.041748046875,
-0.0249176025390625,
0.0440673828125,
0.0220184326171875,
-0.00823974609375,
-0.0305938720703125,
0.04827880859375,
0.0250091552734375,
0.00852203369140625,
0.0428466796875,
0.0090484619140625,
-0.0130767822265625,
-0.0343017578125,
0.008819580078125,
0.0167083740234375,
-0.04119873046875,
-0.01546478271484375,
-0.052093505859375,
-0.02508544921875,
-0.06829833984375,
-0.040802001953125,
-0.0309295654296875,
-0.0292816162109375,
-0.0278472900390625,
0.01059722900390625,
0.047271728515625,
0.042266845703125,
-0.01910400390625,
0.0253753662109375,
-0.04254150390625,
0.0201263427734375,
0.037689208984375,
0.0252685546875,
0.0019550323486328125,
-0.040496826171875,
-0.02154541015625,
0.005397796630859375,
-0.0207061767578125,
-0.05670166015625,
0.03350830078125,
0.01119232177734375,
0.0290985107421875,
0.03326416015625,
0.0077056884765625,
0.043609619140625,
-0.038177490234375,
0.05670166015625,
0.03466796875,
-0.063720703125,
0.047149658203125,
-0.04315185546875,
0.01934814453125,
0.037384033203125,
0.0279083251953125,
-0.04449462890625,
-0.03448486328125,
-0.0657958984375,
-0.0772705078125,
0.04339599609375,
0.0333251953125,
0.01169586181640625,
-0.00788116455078125,
-0.0047760009765625,
-0.00843048095703125,
-0.0013370513916015625,
-0.060302734375,
-0.015411376953125,
-0.015869140625,
-0.013946533203125,
0.00279998779296875,
-0.01552581787109375,
-0.0029621124267578125,
-0.040191650390625,
0.053253173828125,
0.01548004150390625,
0.052703857421875,
0.02154541015625,
-0.0164794921875,
-0.0037517547607421875,
0.01474761962890625,
0.0435791015625,
0.053009033203125,
-0.0416259765625,
-0.002117156982421875,
-0.0111541748046875,
-0.03656005859375,
-0.021514892578125,
0.0299530029296875,
-0.01092529296875,
0.0165557861328125,
0.04779052734375,
0.062408447265625,
0.01174163818359375,
-0.0243377685546875,
0.0675048828125,
-0.00695037841796875,
-0.032440185546875,
-0.049835205078125,
-0.0274810791015625,
0.0153045654296875,
0.0181732177734375,
0.05694580078125,
0.01030731201171875,
-0.01035308837890625,
-0.0252227783203125,
0.02874755859375,
0.03948974609375,
-0.01702880859375,
-0.03375244140625,
0.029876708984375,
0.0177764892578125,
-0.00907135009765625,
0.020751953125,
-0.02008056640625,
-0.05084228515625,
0.040130615234375,
0.01512908935546875,
0.08563232421875,
0.004108428955078125,
0.0245819091796875,
0.03143310546875,
0.059356689453125,
0.020751953125,
0.03277587890625,
0.0166168212890625,
-0.054656982421875,
-0.0297393798828125,
-0.039154052734375,
-0.01403045654296875,
0.020172119140625,
-0.033538818359375,
0.0277099609375,
-0.04632568359375,
-0.0299224853515625,
0.0174407958984375,
0.00827789306640625,
-0.07122802734375,
0.0138092041015625,
0.01389312744140625,
0.076171875,
-0.06072998046875,
0.0703125,
0.069091796875,
-0.042999267578125,
-0.06964111328125,
0.030731201171875,
-0.0209808349609375,
-0.07086181640625,
0.05377197265625,
0.00925445556640625,
0.0042877197265625,
0.007781982421875,
-0.0740966796875,
-0.06158447265625,
0.08160400390625,
0.035247802734375,
-0.030364990234375,
-0.0024051666259765625,
-0.01270294189453125,
0.0341796875,
-0.0211334228515625,
0.0244598388671875,
0.0081787109375,
0.03948974609375,
0.0032672882080078125,
-0.0697021484375,
0.00785064697265625,
-0.01560211181640625,
0.00733184814453125,
0.01425933837890625,
-0.0560302734375,
0.07330322265625,
-0.006927490234375,
-0.01392364501953125,
-0.0107879638671875,
0.03607177734375,
0.00823211669921875,
0.005504608154296875,
0.045135498046875,
0.053680419921875,
0.040069580078125,
-0.0294952392578125,
0.07623291015625,
-0.019256591796875,
0.056793212890625,
0.062408447265625,
0.004337310791015625,
0.06964111328125,
0.0438232421875,
-0.020294189453125,
0.03759765625,
0.050994873046875,
0.01209259033203125,
0.037811279296875,
0.004756927490234375,
-0.00830841064453125,
-0.01806640625,
0.005107879638671875,
-0.032440185546875,
0.0115966796875,
0.0158233642578125,
-0.029449462890625,
-0.0233612060546875,
-0.005401611328125,
0.01497650146484375,
-0.01198577880859375,
0.0005846023559570312,
0.044036865234375,
-0.008697509765625,
-0.016510009765625,
0.062042236328125,
-0.0245361328125,
0.0673828125,
-0.04833984375,
0.01873779296875,
-0.007320404052734375,
-0.000020742416381835938,
-0.013397216796875,
-0.08331298828125,
0.0026073455810546875,
0.006267547607421875,
0.0013990402221679688,
-0.0301055908203125,
0.058685302734375,
-0.01302337646484375,
-0.0282440185546875,
0.037811279296875,
0.019622802734375,
0.01331329345703125,
-0.0058746337890625,
-0.07440185546875,
0.00650787353515625,
0.00665283203125,
-0.05487060546875,
0.0015506744384765625,
0.059478759765625,
0.01479339599609375,
0.039215087890625,
0.040802001953125,
0.00618743896484375,
0.020355224609375,
0.0087890625,
0.080322265625,
-0.044830322265625,
-0.038604736328125,
-0.053863525390625,
0.049591064453125,
-0.0167236328125,
-0.01386260986328125,
0.04180908203125,
0.03289794921875,
0.06280517578125,
-0.01515960693359375,
0.08123779296875,
-0.040130615234375,
0.048614501953125,
-0.01263427734375,
0.0703125,
-0.061767578125,
-0.01206207275390625,
-0.027191162109375,
-0.045013427734375,
-0.0204315185546875,
0.057342529296875,
-0.036285400390625,
0.0206298828125,
0.04888916015625,
0.05877685546875,
0.00482177734375,
-0.0187225341796875,
0.0226898193359375,
0.0296173095703125,
0.024627685546875,
0.0372314453125,
0.053863525390625,
-0.034759521484375,
0.051666259765625,
-0.0258941650390625,
-0.0258941650390625,
-0.039154052734375,
-0.057861328125,
-0.070068359375,
-0.0595703125,
-0.0114288330078125,
-0.0258331298828125,
0.01934814453125,
0.085205078125,
0.0740966796875,
-0.06744384765625,
-0.004425048828125,
0.0003209114074707031,
-0.0100860595703125,
0.0068359375,
-0.0213623046875,
0.040069580078125,
-0.0218505859375,
-0.042816162109375,
-0.01776123046875,
0.008880615234375,
0.0165252685546875,
-0.0016298294067382812,
-0.0082550048828125,
-0.051513671875,
0.0139007568359375,
0.046844482421875,
0.01477813720703125,
-0.04461669921875,
-0.035369873046875,
0.0011425018310546875,
-0.033477783203125,
0.016937255859375,
0.016510009765625,
-0.029693603515625,
0.0102386474609375,
0.04705810546875,
0.04150390625,
0.03619384765625,
-0.01033782958984375,
0.0171966552734375,
-0.07257080078125,
0.0157470703125,
0.032379150390625,
0.0233612060546875,
0.01824951171875,
-0.01690673828125,
0.031158447265625,
0.02020263671875,
-0.048797607421875,
-0.060882568359375,
-0.0026836395263671875,
-0.08990478515625,
-0.0223388671875,
0.071533203125,
-0.019134521484375,
-0.011322021484375,
0.0195465087890625,
-0.01190185546875,
0.02459716796875,
-0.0740966796875,
0.0472412109375,
0.05963134765625,
0.0172576904296875,
-0.023834228515625,
-0.04302978515625,
0.02801513671875,
0.02215576171875,
-0.056365966796875,
-0.0092926025390625,
0.0184783935546875,
0.01666259765625,
0.01824951171875,
0.06268310546875,
0.00604248046875,
0.0191802978515625,
-0.024169921875,
0.038330078125,
0.0236968994140625,
0.0003094673156738281,
-0.0094757080078125,
-0.0014238357543945312,
0.00020360946655273438,
-0.03179931640625
]
] |