---
tags:
- image-classification
- timm
library_name: timm
license: mit
datasets:
- imagenet-1k
---
# Model card for repvgg_a2.rvgg_in1k
A RepVGG image classification model. Trained on ImageNet-1k by paper authors.
This model architecture is implemented using `timm`'s flexible [BYOBNet (Bring-Your-Own-Blocks Network)](https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/byobnet.py).
BYOBNet allows configuration of:
* block / stage layout
* stem layout
* output stride (dilation)
* activation and norm layers
* channel and spatial / self-attention layers
...and also includes `timm` features common to many other architectures, including:
* stochastic depth
* gradient checkpointing
* layer-wise LR decay
* per-stage feature extraction
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 28.2
- GMACs: 5.7
- Activations (M): 6.3
- Image size: 224 x 224
- **Papers:**
- RepVGG: Making VGG-style ConvNets Great Again: https://arxiv.org/abs/2101.03697
- **Dataset:** ImageNet-1k
- **Original:** https://github.com/DingXiaoH/RepVGG
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('repvgg_a2.rvgg_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
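The last line scales the softmax outputs to percentages before keeping the five largest. The same arithmetic, sketched without torch on made-up logits (toy values, not model output):

```python
import math

def softmax(logits):
    """Numerically stable softmax over a list of floats."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.1, -1.0, 0.5, 3.0]    # toy logits, not real model output
probs = [100 * p for p in softmax(logits)]  # percentages, as in the snippet above
top5 = sorted(enumerate(probs), key=lambda t: -t[1])[:5]  # (class_index, percent)
print(top5)  # class 5 (logit 3.0) ranks first
```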
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'repvgg_a2.rvgg_in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g.:
# torch.Size([1, 64, 112, 112])
# torch.Size([1, 96, 56, 56])
# torch.Size([1, 192, 28, 28])
# torch.Size([1, 384, 14, 14])
# torch.Size([1, 1408, 7, 7])
print(o.shape)
```
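The commented shapes follow a fixed pattern: each stage halves the spatial resolution, so a 224 x 224 input is reduced by factors of 2, 4, 8, 16, and 32. A small sketch reproducing those shapes from the factors (the factors are inferred from the shapes above, not queried from timm):

```python
# Per-stage channel counts and reduction factors for repvgg_a2,
# taken from the feature map shapes printed above (an assumption,
# not an API call):
channels   = [64, 96, 192, 384, 1408]
reductions = [2, 4, 8, 16, 32]   # spatial size = input size // reduction
size = 224
shapes = [(1, c, size // r, size // r) for c, r in zip(channels, reductions)]
for s in shapes:
    print(s)  # matches the torch.Size values in the comments above
```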
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'repvgg_a2.rvgg_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 1408, 7, 7) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
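`forward_head(output, pre_logits=True)` collapses the unpooled `(1, 1408, 7, 7)` map to a `(1, 1408)` embedding; for RepVGG the head pools with global average pooling over the spatial dimensions. A toy sketch of that reduction on plain nested lists (an illustration, not timm's actual implementation):

```python
def global_avg_pool(fmap):
    """fmap: nested list [C][H][W] -> list of C per-channel means."""
    return [sum(sum(row) for row in ch) / (len(ch) * len(ch[0])) for ch in fmap]

# Toy 4-channel 7x7 map where channel c is filled with the value c:
fmap = [[[float(c)] * 7 for _ in range(7)] for c in range(4)]
print(global_avg_pool(fmap))  # → [0.0, 1.0, 2.0, 3.0]
```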
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
```bibtex
@inproceedings{ding2021repvgg,
title={Repvgg: Making vgg-style convnets great again},
author={Ding, Xiaohan and Zhang, Xiangyu and Ma, Ningning and Han, Jungong and Ding, Guiguang and Sun, Jian},
booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
pages={13733--13742},
year={2021}
}
```
---
license: other
license_name: coqui-public-model-license
license_link: https://coqui.ai/cpml
library_name: coqui
pipeline_tag: text-to-speech
---
# ⓍTTS
ⓍTTS is a voice generation model that lets you clone voices into different languages using just a quick 6-second audio clip. Built on Tortoise,
ⓍTTS introduces important model changes that make cross-language voice cloning and multi-lingual speech generation super easy.
There is no need for an excessive amount of training data that spans countless hours.
This is the same model that powers [Coqui Studio](https://coqui.ai/) and [Coqui API](https://docs.coqui.ai/docs); however, we apply
a few tricks to make it faster and support streaming inference.
### Features
- Supports 14 languages.
- Voice cloning with just a 6-second audio clip.
- Emotion and style transfer by cloning.
- Cross-language voice cloning.
- Multi-lingual speech generation.
- 24 kHz sampling rate.
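As a quick sanity check on clip lengths: at the model's 24 kHz rate, a 6-second reference clip amounts to 144,000 samples (this assumes the reference is resampled to the model's rate, which is an assumption here, not something the card states):

```python
sample_rate = 24_000   # model sampling rate, from the feature list above
clip_seconds = 6       # reference clip length used for voice cloning
num_samples = clip_seconds * sample_rate
print(num_samples)  # → 144000
```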
### Languages
As of now, XTTS-v1 (v1.1) supports 14 languages: **English, Spanish, French, German, Italian, Portuguese,
Polish, Turkish, Russian, Dutch, Czech, Arabic, Chinese, and Japanese**.
Stay tuned as we continue to add support for more languages. If you have any language requests, please feel free to reach out!
### Code
The current implementation supports inference and [fine-tuning](https://tts.readthedocs.io/en/latest/models/xtts.html#training).
### License
This model is licensed under [Coqui Public Model License](https://coqui.ai/cpml). There's a lot that goes into a license for generative models, and you can read more of [the origin story of CPML here](https://coqui.ai/blog/tts/cpml).
### Contact
Come and join in our 🐸Community. We're active on [Discord](https://discord.gg/fBC58unbKE) and [Twitter](https://twitter.com/coqui_ai).
You can also mail us at info@coqui.ai.
Using 🐸TTS API:
```python
from TTS.api import TTS
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v1", gpu=True)
# generate speech by cloning a voice using default settings
tts.tts_to_file(text="It took me quite a long time to develop a voice, and now that I have it I'm not going to be silent.",
file_path="output.wav",
speaker_wav="/path/to/target/speaker.wav",
language="en")
# generate speech by cloning a voice using custom settings
tts.tts_to_file(text="It took me quite a long time to develop a voice, and now that I have it I'm not going to be silent.",
file_path="output.wav",
speaker_wav="/path/to/target/speaker.wav",
language="en",
decoder_iterations=30)
```
Using 🐸TTS Command line:
```console
tts --model_name tts_models/multilingual/multi-dataset/xtts_v1 \
--text "Bugün okula gitmek istemiyorum." \
--speaker_wav /path/to/target/speaker.wav \
--language_idx tr \
--use_cuda true
```
Using model directly:
```python
from TTS.tts.configs.xtts_config import XttsConfig
from TTS.tts.models.xtts import Xtts
config = XttsConfig()
config.load_json("/path/to/xtts/config.json")
model = Xtts.init_from_config(config)
model.load_checkpoint(config, checkpoint_dir="/path/to/xtts/", eval=True)
model.cuda()
outputs = model.synthesize(
"It took me quite a long time to develop a voice and now that I have it I am not going to be silent.",
config,
speaker_wav="/data/TTS-public/_refclips/3.wav",
gpt_cond_len=3,
language="en",
)
```
0.00965118408203125,
-0.00951385498046875,
0.040802001953125,
-0.0006175041198730469,
-0.036346435546875,
-0.0234527587890625,
-0.0390625,
-0.032806396484375,
0.0262603759765625,
-0.03363037109375,
0.05267333984375,
-0.047760009765625,
-0.0290985107421875,
0.01444244384765625,
-0.001873016357421875,
-0.03717041015625,
0.035430908203125,
0.01534271240234375,
0.075927734375,
-0.043212890625,
0.07940673828125,
0.02508544921875,
-0.045379638671875,
-0.0706787109375,
-0.023681640625,
-0.006195068359375,
-0.0615234375,
0.036773681640625,
-0.0031414031982421875,
-0.0159454345703125,
0.0265655517578125,
-0.047637939453125,
-0.06591796875,
0.07672119140625,
0.0548095703125,
-0.05596923828125,
-0.015380859375,
0.00457763671875,
0.04486083984375,
-0.0256500244140625,
0.0177764892578125,
0.032318115234375,
0.02703857421875,
-0.006809234619140625,
-0.07421875,
0.0234832763671875,
-0.027587890625,
-0.0128631591796875,
-0.0065155029296875,
-0.0667724609375,
0.0826416015625,
-0.0386962890625,
-0.0291900634765625,
0.0457763671875,
0.03753662109375,
0.025360107421875,
0.00428009033203125,
0.0360107421875,
0.033447265625,
0.05621337890625,
-0.026153564453125,
0.06591796875,
-0.022613525390625,
0.03631591796875,
0.08514404296875,
-0.0023975372314453125,
0.042999267578125,
0.0225982666015625,
-0.023956298828125,
0.0487060546875,
0.043701171875,
-0.02484130859375,
0.038330078125,
0.0010242462158203125,
-0.022857666015625,
-0.01331329345703125,
-0.0012063980102539062,
-0.048553466796875,
0.0325927734375,
0.0120391845703125,
-0.02685546875,
0.00833892822265625,
0.01239776611328125,
0.008636474609375,
-0.0291748046875,
0.01369476318359375,
0.032470703125,
0.005779266357421875,
-0.03582763671875,
0.06982421875,
-0.01561737060546875,
0.0509033203125,
-0.07220458984375,
-0.006649017333984375,
0.00827789306640625,
-0.0022678375244140625,
-0.00958251953125,
-0.05084228515625,
0.01149749755859375,
-0.0035419464111328125,
-0.01416778564453125,
-0.017333984375,
0.01068115234375,
-0.03179931640625,
-0.0263519287109375,
0.036407470703125,
0.038482666015625,
0.0177459716796875,
0.0012340545654296875,
-0.05084228515625,
0.0145111083984375,
0.026214599609375,
-0.00910186767578125,
-0.0003147125244140625,
0.0269012451171875,
-0.00933074951171875,
0.06982421875,
0.06756591796875,
0.02947998046875,
0.0135040283203125,
0.0152435302734375,
0.05377197265625,
-0.06231689453125,
-0.050933837890625,
-0.05560302734375,
0.04229736328125,
0.005817413330078125,
-0.03411865234375,
0.050262451171875,
0.05682373046875,
0.06536865234375,
-0.01349639892578125,
0.0997314453125,
-0.031982421875,
0.0460205078125,
-0.020355224609375,
0.07708740234375,
-0.03955078125,
0.0254364013671875,
-0.031494140625,
-0.02752685546875,
-0.016632080078125,
0.03460693359375,
0.0010004043579101562,
0.009674072265625,
0.044647216796875,
0.0672607421875,
-0.00441741943359375,
-0.009033203125,
0.0297393798828125,
0.02459716796875,
0.03564453125,
0.044219970703125,
0.06134033203125,
-0.04351806640625,
0.0635986328125,
-0.0447998046875,
-0.0018444061279296875,
-0.0011987686157226562,
-0.042510986328125,
-0.049346923828125,
-0.037689208984375,
-0.03143310546875,
-0.0389404296875,
0.00202178955078125,
0.07745361328125,
0.08111572265625,
-0.079833984375,
-0.048492431640625,
0.00037479400634765625,
0.0142059326171875,
-0.0301513671875,
-0.0183868408203125,
0.0270233154296875,
-0.0016956329345703125,
-0.0936279296875,
0.034942626953125,
0.00673675537109375,
0.037811279296875,
-0.019195556640625,
-0.0162811279296875,
-0.01378631591796875,
-0.006412506103515625,
0.0301055908203125,
0.035430908203125,
-0.05389404296875,
-0.02203369140625,
0.0172271728515625,
-0.017333984375,
0.0291290283203125,
0.048126220703125,
-0.049285888671875,
0.056793212890625,
0.033935546875,
0.0007042884826660156,
0.043701171875,
-0.036651611328125,
0.060211181640625,
-0.0582275390625,
0.02276611328125,
0.01096343994140625,
0.0260009765625,
0.03582763671875,
-0.019927978515625,
0.0164794921875,
0.0293426513671875,
-0.01556396484375,
-0.062225341796875,
0.00342559814453125,
-0.0927734375,
-0.00914764404296875,
0.07623291015625,
-0.0036602020263671875,
-0.01293182373046875,
-0.012451171875,
-0.045166015625,
0.07513427734375,
-0.048919677734375,
0.043792724609375,
0.006649017333984375,
-0.003986358642578125,
-0.01074981689453125,
-0.03173828125,
0.04022216796875,
0.0148773193359375,
-0.042510986328125,
0.00008302927017211914,
0.004962921142578125,
0.04248046875,
0.028564453125,
0.04791259765625,
-0.004352569580078125,
0.033935546875,
0.01032257080078125,
0.0253448486328125,
-0.00922393798828125,
0.005481719970703125,
-0.006107330322265625,
0.001522064208984375,
-0.01462554931640625,
-0.028778076171875
]
] |
timm/coat_lite_mini.in1k | 2023-04-24T03:43:16.000Z | [
"timm",
"pytorch",
"image-classification",
"dataset:imagenet-1k",
"arxiv:2104.06399",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/coat_lite_mini.in1k | 0 | 37,293 | timm | 2023-04-24T03:43:09 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
datasets:
- imagenet-1k
---
# Model card for coat_lite_mini.in1k
A CoaT (Co-Scale Conv-Attentional Transformer) image classification model. Trained on ImageNet-1k by paper authors.
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 11.0
- GMACs: 2.0
- Activations (M): 12.2
- Image size: 224 x 224
- **Papers:**
- Co-Scale Conv-Attentional Image Transformers: https://arxiv.org/abs/2104.06399
- **Dataset:** ImageNet-1k
- **Original:** https://github.com/mlpc-ucsd/CoaT
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('coat_lite_mini.in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'coat_lite_mini.in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 50, 512) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
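Once pooled, embeddings from two images can be compared with cosine similarity. A minimal pure-Python sketch (not part of the card; the short vectors here are illustrative stand-ins for a real `output[0].tolist()` of length `num_features`):

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two equal-length embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

emb1 = [0.1, -0.2, 0.4]  # stand-in for a real pooled embedding
emb2 = [0.1, -0.2, 0.4]
print(cosine_similarity(emb1, emb2))  # ≈ 1.0 for identical vectors
```

In practice the two inputs would be the pooled feature vectors produced by the model above, typically L2-normalized before indexing or retrieval.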
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@InProceedings{Xu_2021_ICCV,
author = {Xu, Weijian and Xu, Yifan and Chang, Tyler and Tu, Zhuowen},
title = {Co-Scale Conv-Attentional Image Transformers},
booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
month = {October},
year = {2021},
pages = {9981-9990}
}
```
| 2,819 | [
[
-0.037109375,
-0.03509521484375,
-0.0033168792724609375,
0.0105743408203125,
-0.0216217041015625,
-0.024078369140625,
-0.0155181884765625,
-0.03045654296875,
0.0157928466796875,
0.0264129638671875,
-0.042144775390625,
-0.043731689453125,
-0.04931640625,
-0.01027679443359375,
-0.01439666748046875,
0.06646728515625,
-0.0031452178955078125,
-0.0103607177734375,
-0.0165863037109375,
-0.03338623046875,
-0.0127716064453125,
-0.023529052734375,
-0.07305908203125,
-0.03558349609375,
0.0231781005859375,
0.027587890625,
0.043212890625,
0.038330078125,
0.056427001953125,
0.03143310546875,
-0.0020351409912109375,
0.00463104248046875,
-0.018310546875,
-0.0271148681640625,
0.0259857177734375,
-0.0455322265625,
-0.040985107421875,
0.031005859375,
0.03399658203125,
0.027801513671875,
0.0052032470703125,
0.0347900390625,
0.01261138916015625,
0.0404052734375,
-0.0258331298828125,
0.0172576904296875,
-0.0322265625,
0.0201416015625,
-0.0035381317138671875,
0.004848480224609375,
-0.03369140625,
-0.01348876953125,
0.0235595703125,
-0.046356201171875,
0.04156494140625,
0.0128021240234375,
0.09368896484375,
0.01367950439453125,
-0.0028839111328125,
-0.0021305084228515625,
-0.0137176513671875,
0.0716552734375,
-0.05633544921875,
0.0207366943359375,
0.02728271484375,
0.0089263916015625,
0.006458282470703125,
-0.083251953125,
-0.042266845703125,
-0.012451171875,
-0.0254364013671875,
0.006755828857421875,
-0.00942230224609375,
0.00830841064453125,
0.024566650390625,
0.034576416015625,
-0.033355712890625,
0.0053253173828125,
-0.033050537109375,
-0.014617919921875,
0.046295166015625,
0.0026226043701171875,
0.027557373046875,
-0.01056671142578125,
-0.052825927734375,
-0.035308837890625,
-0.0158233642578125,
0.033599853515625,
0.01192474365234375,
0.00856781005859375,
-0.0567626953125,
0.0282135009765625,
0.0126495361328125,
0.0303955078125,
0.0283203125,
-0.0191497802734375,
0.04595947265625,
0.0020503997802734375,
-0.036895751953125,
-0.0045928955078125,
0.07940673828125,
0.0423583984375,
0.0173492431640625,
-0.0026798248291015625,
-0.005512237548828125,
-0.0198974609375,
-0.01155853271484375,
-0.088623046875,
-0.022918701171875,
0.0280914306640625,
-0.042236328125,
-0.035400390625,
0.015960693359375,
-0.040863037109375,
-0.0033016204833984375,
-0.01035308837890625,
0.053009033203125,
-0.02825927734375,
-0.0174713134765625,
0.018463134765625,
-0.01666259765625,
0.03106689453125,
0.01007080078125,
-0.05078125,
0.025726318359375,
0.016021728515625,
0.07904052734375,
-0.002124786376953125,
-0.02752685546875,
-0.0191192626953125,
-0.016632080078125,
-0.0228729248046875,
0.0285491943359375,
-0.00492095947265625,
-0.003520965576171875,
-0.0196533203125,
0.0343017578125,
-0.0227203369140625,
-0.051361083984375,
0.0286712646484375,
-0.0225372314453125,
0.02581787109375,
0.002277374267578125,
-0.0172882080078125,
-0.025482177734375,
0.0259552001953125,
-0.034912109375,
0.085693359375,
0.029541015625,
-0.07952880859375,
0.0259246826171875,
-0.04522705078125,
-0.0018301010131835938,
-0.009490966796875,
0.0030517578125,
-0.08575439453125,
-0.01351165771484375,
0.0189361572265625,
0.057403564453125,
-0.0234527587890625,
0.00881195068359375,
-0.0401611328125,
-0.0227813720703125,
0.0293426513671875,
0.00018978118896484375,
0.06756591796875,
0.0012760162353515625,
-0.0262908935546875,
0.0196075439453125,
-0.052490234375,
0.01284027099609375,
0.033843994140625,
-0.00893402099609375,
-0.00725555419921875,
-0.053619384765625,
0.0223236083984375,
0.0249786376953125,
0.01812744140625,
-0.049468994140625,
0.01361083984375,
-0.015228271484375,
0.0294342041015625,
0.047119140625,
0.00008034706115722656,
0.030914306640625,
-0.03857421875,
0.021728515625,
0.024322509765625,
0.0355224609375,
0.0006279945373535156,
-0.03350830078125,
-0.07305908203125,
-0.04693603515625,
0.0184173583984375,
0.0298919677734375,
-0.033966064453125,
0.043731689453125,
0.0015201568603515625,
-0.05267333984375,
-0.032928466796875,
-0.005977630615234375,
0.0288238525390625,
0.03973388671875,
0.0352783203125,
-0.03643798828125,
-0.038909912109375,
-0.076171875,
-0.005428314208984375,
0.0146484375,
-0.0018301010131835938,
0.0244903564453125,
0.048187255859375,
-0.018402099609375,
0.046295166015625,
-0.0318603515625,
-0.0269622802734375,
-0.017608642578125,
0.01200103759765625,
0.037078857421875,
0.05682373046875,
0.054962158203125,
-0.047454833984375,
-0.0516357421875,
-0.017578125,
-0.0679931640625,
0.01328277587890625,
-0.00988006591796875,
-0.0157470703125,
0.01361846923828125,
0.0244903564453125,
-0.04290771484375,
0.05181884765625,
0.0193939208984375,
-0.0284576416015625,
0.0345458984375,
-0.0192718505859375,
0.0123443603515625,
-0.0823974609375,
0.01348876953125,
0.03076171875,
-0.01084136962890625,
-0.043365478515625,
0.0012722015380859375,
0.0100860595703125,
0.0030612945556640625,
-0.039093017578125,
0.05517578125,
-0.037506103515625,
0.0034027099609375,
-0.0070343017578125,
-0.0208587646484375,
0.01021575927734375,
0.0633544921875,
-0.002483367919921875,
0.027008056640625,
0.0577392578125,
-0.041748046875,
0.036895751953125,
0.040863037109375,
-0.0232696533203125,
0.0299835205078125,
-0.05108642578125,
0.021728515625,
0.01068878173828125,
0.006267547607421875,
-0.07830810546875,
-0.018951416015625,
0.0265045166015625,
-0.0293121337890625,
0.050048828125,
-0.0264129638671875,
-0.0362548828125,
-0.036865234375,
-0.04608154296875,
0.037689208984375,
0.04913330078125,
-0.058013916015625,
0.0187530517578125,
0.01120758056640625,
0.01806640625,
-0.042236328125,
-0.0682373046875,
-0.0233306884765625,
-0.0270538330078125,
-0.054046630859375,
0.03662109375,
-0.01081085205078125,
0.006336212158203125,
-0.00016582012176513672,
-0.004741668701171875,
-0.0029296875,
-0.0133056640625,
0.0311737060546875,
0.034698486328125,
-0.016357421875,
-0.005466461181640625,
-0.021759033203125,
-0.0024738311767578125,
0.00457000732421875,
-0.008056640625,
0.045166015625,
-0.0247802734375,
-0.0083465576171875,
-0.062469482421875,
-0.01053619384765625,
0.043670654296875,
-0.00553131103515625,
0.06884765625,
0.076904296875,
-0.0269775390625,
-0.006046295166015625,
-0.030914306640625,
-0.02691650390625,
-0.03973388671875,
0.034149169921875,
-0.0245819091796875,
-0.0286712646484375,
0.061431884765625,
0.004245758056640625,
-0.003147125244140625,
0.05377197265625,
0.0204010009765625,
-0.0002875328063964844,
0.062744140625,
0.058563232421875,
0.0169830322265625,
0.0531005859375,
-0.076171875,
-0.00974273681640625,
-0.079833984375,
-0.036407470703125,
-0.01430511474609375,
-0.050445556640625,
-0.04974365234375,
-0.0254364013671875,
0.037567138671875,
0.01021575927734375,
-0.036468505859375,
0.028900146484375,
-0.063232421875,
0.0024089813232421875,
0.0479736328125,
0.046905517578125,
-0.0177154541015625,
0.0263214111328125,
-0.0199432373046875,
-0.005321502685546875,
-0.062042236328125,
-0.0097808837890625,
0.07952880859375,
0.043182373046875,
0.0628662109375,
-0.0276641845703125,
0.04541015625,
-0.01306915283203125,
0.028411865234375,
-0.055938720703125,
0.04290771484375,
-0.0104827880859375,
-0.0286712646484375,
-0.01512908935546875,
-0.015380859375,
-0.0701904296875,
0.007564544677734375,
-0.01629638671875,
-0.049957275390625,
0.030609130859375,
0.006885528564453125,
-0.0127716064453125,
0.047332763671875,
-0.058868408203125,
0.07733154296875,
-0.0098876953125,
-0.039520263671875,
0.007114410400390625,
-0.047454833984375,
0.022369384765625,
0.00623321533203125,
-0.00894927978515625,
-0.002422332763671875,
0.0052947998046875,
0.0885009765625,
-0.04034423828125,
0.06927490234375,
-0.0257568359375,
0.0243682861328125,
0.034576416015625,
-0.017852783203125,
0.029693603515625,
-0.01174163818359375,
-0.003017425537109375,
0.0210113525390625,
0.00997161865234375,
-0.03271484375,
-0.03192138671875,
0.038665771484375,
-0.07659912109375,
-0.0266265869140625,
-0.036224365234375,
-0.055084228515625,
0.002803802490234375,
0.0051727294921875,
0.050323486328125,
0.055938720703125,
0.0191192626953125,
0.0275421142578125,
0.0469970703125,
-0.02911376953125,
0.034637451171875,
0.0055084228515625,
-0.020416259765625,
-0.03424072265625,
0.057159423828125,
0.019989013671875,
0.021026611328125,
0.0097503662109375,
0.0216217041015625,
-0.019256591796875,
-0.031768798828125,
-0.0119781494140625,
0.033843994140625,
-0.049163818359375,
-0.03173828125,
-0.04132080078125,
-0.04449462890625,
-0.0256500244140625,
-0.0113525390625,
-0.02984619140625,
-0.0266265869140625,
-0.0274200439453125,
0.019989013671875,
0.046112060546875,
0.040863037109375,
-0.007495880126953125,
0.042144775390625,
-0.03973388671875,
0.01410675048828125,
0.01422882080078125,
0.041107177734375,
-0.007175445556640625,
-0.08233642578125,
-0.032928466796875,
-0.0074005126953125,
-0.03668212890625,
-0.04266357421875,
0.046356201171875,
0.0182037353515625,
0.033233642578125,
0.0267486572265625,
-0.00010669231414794922,
0.049591064453125,
0.002628326416015625,
0.0293426513671875,
0.032745361328125,
-0.040985107421875,
0.0372314453125,
-0.00344085693359375,
0.0227813720703125,
0.002658843994140625,
0.0280303955078125,
-0.02972412109375,
-0.016998291015625,
-0.08270263671875,
-0.053863525390625,
0.07568359375,
0.0107574462890625,
-0.008087158203125,
0.024932861328125,
0.047576904296875,
0.001468658447265625,
0.004734039306640625,
-0.060791015625,
-0.03656005859375,
-0.019195556640625,
-0.0258026123046875,
-0.0037822723388671875,
-0.01039886474609375,
-0.00453948974609375,
-0.047454833984375,
0.056793212890625,
-0.01187896728515625,
0.051361083984375,
0.0340576171875,
-0.0023193359375,
-0.0195770263671875,
-0.02130126953125,
0.0265655517578125,
0.01512908935546875,
-0.03814697265625,
0.005817413330078125,
0.013763427734375,
-0.047332763671875,
0.001003265380859375,
0.00830078125,
-0.0004839897155761719,
-0.0026988983154296875,
0.035614013671875,
0.0618896484375,
-0.0007004737854003906,
0.006439208984375,
0.042144775390625,
-0.0184783935546875,
-0.0295257568359375,
-0.016815185546875,
0.00634765625,
-0.005535125732421875,
0.027191162109375,
0.0206146240234375,
0.039581298828125,
-0.00620269775390625,
-0.025726318359375,
0.0195465087890625,
0.046661376953125,
-0.0303955078125,
-0.027679443359375,
0.04595947265625,
-0.0172119140625,
-0.0072479248046875,
0.0594482421875,
-0.0009775161743164062,
-0.033355712890625,
0.085205078125,
0.0360107421875,
0.07464599609375,
-0.0069580078125,
-0.0032253265380859375,
0.0701904296875,
0.01459503173828125,
-0.0008792877197265625,
0.00841522216796875,
0.01361846923828125,
-0.05621337890625,
0.006744384765625,
-0.045013427734375,
-0.0070343017578125,
0.043914794921875,
-0.042266845703125,
0.02716064453125,
-0.05352783203125,
-0.024658203125,
0.0034732818603515625,
0.0268402099609375,
-0.0704345703125,
0.0250701904296875,
0.0010652542114257812,
0.0704345703125,
-0.06512451171875,
0.052337646484375,
0.050872802734375,
-0.04296875,
-0.07403564453125,
-0.007663726806640625,
-0.0179901123046875,
-0.07086181640625,
0.049591064453125,
0.0428466796875,
0.007495880126953125,
0.00899505615234375,
-0.074951171875,
-0.045501708984375,
0.1033935546875,
0.034942626953125,
-0.0168304443359375,
0.01727294921875,
0.004730224609375,
0.020721435546875,
-0.032867431640625,
0.035797119140625,
0.0159149169921875,
0.0259552001953125,
0.0191497802734375,
-0.05462646484375,
0.0133056640625,
-0.0230255126953125,
-0.005916595458984375,
0.0000317692756652832,
-0.0723876953125,
0.07513427734375,
-0.0306549072265625,
-0.01251983642578125,
0.01287841796875,
0.04364013671875,
0.01398468017578125,
0.021209716796875,
0.037506103515625,
0.060028076171875,
0.035614013671875,
-0.01641845703125,
0.06964111328125,
-0.006500244140625,
0.05596923828125,
0.047271728515625,
0.027557373046875,
0.038482666015625,
0.0291748046875,
-0.0202789306640625,
0.028717041015625,
0.074951171875,
-0.0296783447265625,
0.034637451171875,
0.005786895751953125,
-0.0034732818603515625,
-0.01464080810546875,
-0.0021877288818359375,
-0.0294036865234375,
0.0293121337890625,
0.01476287841796875,
-0.03839111328125,
-0.0037746429443359375,
0.00864410400390625,
-0.0123443603515625,
-0.0294342041015625,
-0.0230712890625,
0.037353515625,
0.010650634765625,
-0.0302581787109375,
0.06671142578125,
-0.0013666152954101562,
0.06695556640625,
-0.031219482421875,
0.0008440017700195312,
-0.02349853515625,
0.035675048828125,
-0.03094482421875,
-0.057403564453125,
0.0138397216796875,
-0.028961181640625,
-0.01165008544921875,
0.0016765594482421875,
0.053680419921875,
-0.03704833984375,
-0.03936767578125,
0.018829345703125,
0.01552581787109375,
0.0260467529296875,
0.00080108642578125,
-0.0921630859375,
0.005908966064453125,
-0.006275177001953125,
-0.0494384765625,
0.026611328125,
0.0251922607421875,
0.00496673583984375,
0.057342529296875,
0.036895751953125,
-0.017547607421875,
0.0132293701171875,
-0.0100555419921875,
0.053131103515625,
-0.03173828125,
-0.0242462158203125,
-0.0567626953125,
0.041046142578125,
-0.0077972412109375,
-0.0242919921875,
0.039886474609375,
0.0516357421875,
0.06182861328125,
-0.0191650390625,
0.0367431640625,
-0.0103607177734375,
0.00421905517578125,
-0.027801513671875,
0.0528564453125,
-0.052642822265625,
-0.006870269775390625,
-0.008514404296875,
-0.045562744140625,
-0.02734375,
0.049560546875,
-0.024688720703125,
0.0290985107421875,
0.056121826171875,
0.0780029296875,
-0.033111572265625,
-0.025604248046875,
0.026123046875,
0.0152130126953125,
0.0075836181640625,
0.03326416015625,
0.03399658203125,
-0.0755615234375,
0.0222625732421875,
-0.05108642578125,
-0.01357269287109375,
-0.0124053955078125,
-0.0447998046875,
-0.08148193359375,
-0.06024169921875,
-0.056793212890625,
-0.040313720703125,
-0.02752685546875,
0.06829833984375,
0.079345703125,
-0.057647705078125,
-0.0137939453125,
-0.0016927719116210938,
-0.0035343170166015625,
-0.01708984375,
-0.019287109375,
0.057098388671875,
-0.0241851806640625,
-0.059906005859375,
-0.032928466796875,
0.0062255859375,
0.0249176025390625,
0.003261566162109375,
-0.0184173583984375,
-0.0212860107421875,
-0.00789642333984375,
0.0224151611328125,
0.0161285400390625,
-0.045928955078125,
-0.0207061767578125,
-0.01444244384765625,
-0.0216064453125,
0.032928466796875,
0.02288818359375,
-0.0426025390625,
0.025177001953125,
0.02471923828125,
0.0198516845703125,
0.0655517578125,
-0.02960205078125,
-0.0027179718017578125,
-0.06854248046875,
0.04217529296875,
-0.01605224609375,
0.035797119140625,
0.032440185546875,
-0.0299072265625,
0.03887939453125,
0.0355224609375,
-0.039703369140625,
-0.051849365234375,
-0.0087127685546875,
-0.0931396484375,
0.0011129379272460938,
0.060791015625,
-0.0194854736328125,
-0.04583740234375,
0.028961181640625,
-0.0161590576171875,
0.05096435546875,
-0.0023975372314453125,
0.039306640625,
0.0213165283203125,
-0.0106658935546875,
-0.052398681640625,
-0.031219482421875,
0.041046142578125,
0.01019287109375,
-0.04833984375,
-0.0264739990234375,
-0.01556396484375,
0.050384521484375,
0.0210113525390625,
0.03204345703125,
-0.0097503662109375,
0.016632080078125,
0.00283050537109375,
0.0379638671875,
-0.0222930908203125,
-0.0009531974792480469,
-0.0167999267578125,
0.00457763671875,
-0.00206756591796875,
-0.045166015625
]
] |
timm/fbnetc_100.rmsp_in1k | 2023-04-27T21:13:21.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"arxiv:1812.03443",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/fbnetc_100.rmsp_in1k | 0 | 37,275 | timm | 2022-12-12T23:59:14 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
datasets:
- imagenet-1k
---
# Model card for fbnetc_100.rmsp_in1k
An FBNet image classification model. Trained on ImageNet-1k in `timm` using the recipe template described below.
Recipe details:
* A simple RMSProp-based recipe without RandAugment, using RandomErasing, mixup, dropout, and standard random-resize-crop augmentation.
* RMSProp (TF 1.0 behaviour) optimizer, EMA weight averaging
* Step (exponential decay w/ staircase) LR schedule with warmup
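The step (exponential decay with staircase) LR schedule with warmup can be sketched in plain Python. This illustrates the schedule shape only; the hyperparameters (`base_lr`, `warmup_epochs`, `decay_epochs`, `decay_rate`) are made-up values, not the ones used for this recipe:

```python
def step_lr(epoch, base_lr=0.1, warmup_epochs=3,
            decay_epochs=10, decay_rate=0.97):
    # Linear warmup from 0 toward base_lr, then staircase exponential
    # decay: the LR drops by a factor of `decay_rate` once every
    # `decay_epochs` epochs (a step function, not a smooth curve).
    if epoch < warmup_epochs:
        return base_lr * (epoch + 1) / warmup_epochs
    steps = (epoch - warmup_epochs) // decay_epochs  # completed decay steps
    return base_lr * (decay_rate ** steps)

# Sample the schedule at a few epochs to see the staircase shape.
print([round(step_lr(e), 5) for e in (0, 2, 3, 12, 13, 23)])
```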
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 5.6
- GMACs: 0.4
- Activations (M): 6.5
- Image size: 224 x 224
- **Papers:**
- FBNet: Hardware-Aware Efficient ConvNet Design via Differentiable Neural Architecture Search: https://arxiv.org/abs/1812.03443
- **Dataset:** ImageNet-1k
- **Original:** https://github.com/huggingface/pytorch-image-models
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('fbnetc_100.rmsp_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'fbnetc_100.rmsp_in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g.:
# torch.Size([1, 16, 112, 112])
# torch.Size([1, 24, 56, 56])
# torch.Size([1, 32, 28, 28])
# torch.Size([1, 112, 14, 14])
# torch.Size([1, 352, 7, 7])
print(o.shape)
```
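The printed shapes follow from the overall stride at each extraction stage: for a 224 x 224 input, the five feature maps are taken at reductions 2, 4, 8, 16, and 32. A small sketch of that relationship (reductions read from the shapes above; channel counts omitted):

```python
def feature_map_sizes(input_size, reductions=(2, 4, 8, 16, 32)):
    # Spatial size of each extracted feature map for a square input.
    return [input_size // r for r in reductions]

print(feature_map_sizes(224))  # [112, 56, 28, 14, 7]
```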
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'fbnetc_100.rmsp_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 1984, 7, 7) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
```bibtex
@inproceedings{wu2019fbnet,
title={Fbnet: Hardware-aware efficient convnet design via differentiable neural architecture search},
author={Wu, Bichen and Dai, Xiaoliang and Zhang, Peizhao and Wang, Yanghan and Sun, Fei and Wu, Yiming and Tian, Yuandong and Vajda, Peter and Jia, Yangqing and Keutzer, Kurt},
booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
pages={10734--10742},
year={2019}
}
```
| 4,474 | [
[
-0.041900634765625,
-0.0401611328125,
-0.005950927734375,
0.01517486572265625,
-0.026397705078125,
-0.0316162109375,
-0.01520538330078125,
-0.0299224853515625,
0.0230255126953125,
0.030242919921875,
-0.041229248046875,
-0.052154541015625,
-0.0474853515625,
-0.010711669921875,
-0.00794219970703125,
0.0626220703125,
-0.0098419189453125,
-0.003383636474609375,
-0.01226806640625,
-0.03936767578125,
-0.02252197265625,
-0.024322509765625,
-0.07080078125,
-0.03302001953125,
0.0203399658203125,
0.00933837890625,
0.042083740234375,
0.0389404296875,
0.04290771484375,
0.03564453125,
-0.01458740234375,
0.01039886474609375,
-0.0100250244140625,
-0.019073486328125,
0.030303955078125,
-0.04302978515625,
-0.0374755859375,
0.016876220703125,
0.04754638671875,
0.02398681640625,
0.002101898193359375,
0.025421142578125,
0.00299072265625,
0.04058837890625,
-0.0160675048828125,
0.01366424560546875,
-0.032440185546875,
0.0243377685546875,
-0.0074615478515625,
-0.0018310546875,
-0.0267333984375,
-0.0304107666015625,
0.01268768310546875,
-0.037811279296875,
0.0355224609375,
0.0011377334594726562,
0.09478759765625,
0.01459503173828125,
-0.0013179779052734375,
-0.00691986083984375,
-0.0151519775390625,
0.06256103515625,
-0.0660400390625,
0.02569580078125,
0.021240234375,
0.0127105712890625,
-0.00913238525390625,
-0.0675048828125,
-0.041168212890625,
-0.0203399658203125,
-0.0113067626953125,
-0.007793426513671875,
-0.01690673828125,
0.001194000244140625,
0.0261688232421875,
0.031158447265625,
-0.03887939453125,
0.006313323974609375,
-0.03155517578125,
-0.0219268798828125,
0.041290283203125,
0.0005774497985839844,
0.0249786376953125,
-0.0190887451171875,
-0.041015625,
-0.032257080078125,
-0.0294952392578125,
0.02386474609375,
0.013275146484375,
0.0235748291015625,
-0.041290283203125,
0.0248260498046875,
0.0110321044921875,
0.050079345703125,
0.006320953369140625,
-0.0159759521484375,
0.0450439453125,
0.0005016326904296875,
-0.03399658203125,
-0.004726409912109375,
0.07672119140625,
0.0256500244140625,
0.014801025390625,
0.008453369140625,
-0.01947021484375,
-0.023681640625,
-0.0023937225341796875,
-0.0931396484375,
-0.0263519287109375,
0.0291900634765625,
-0.04730224609375,
-0.0333251953125,
0.0195465087890625,
-0.041595458984375,
-0.002048492431640625,
-0.005092620849609375,
0.04754638671875,
-0.045806884765625,
-0.0296630859375,
0.000030517578125,
-0.0177154541015625,
0.0236968994140625,
0.013427734375,
-0.048126220703125,
0.01558685302734375,
0.02752685546875,
0.0897216796875,
0.012603759765625,
-0.0308685302734375,
-0.0216827392578125,
-0.03173828125,
-0.0283355712890625,
0.028045654296875,
-0.0031890869140625,
-0.01058197021484375,
-0.024078369140625,
0.0241546630859375,
-0.006778717041015625,
-0.053924560546875,
0.01361083984375,
-0.020599365234375,
0.019256591796875,
-0.01202392578125,
-0.0189971923828125,
-0.037811279296875,
0.0221099853515625,
-0.03839111328125,
0.0953369140625,
0.0296783447265625,
-0.06207275390625,
0.0290985107421875,
-0.046539306640625,
-0.016876220703125,
-0.0205230712890625,
-0.00794219970703125,
-0.07525634765625,
-0.002422332763671875,
0.023193359375,
0.058563232421875,
-0.01910400390625,
-0.001720428466796875,
-0.03424072265625,
-0.02508544921875,
0.0243377685546875,
-0.006237030029296875,
0.0791015625,
0.01201629638671875,
-0.03961181640625,
0.021331787109375,
-0.041534423828125,
0.01556396484375,
0.043548583984375,
-0.019622802734375,
-0.00814056396484375,
-0.047210693359375,
0.01392364501953125,
0.0285491943359375,
0.007648468017578125,
-0.04571533203125,
0.0173187255859375,
-0.0223388671875,
0.03466796875,
0.054718017578125,
-0.007396697998046875,
0.019683837890625,
-0.03338623046875,
0.0177001953125,
0.02728271484375,
0.021392822265625,
-0.0015316009521484375,
-0.045806884765625,
-0.056304931640625,
-0.034820556640625,
0.032379150390625,
0.030242919921875,
-0.04547119140625,
0.0355224609375,
-0.0052337646484375,
-0.056182861328125,
-0.0333251953125,
0.006412506103515625,
0.041839599609375,
0.0382080078125,
0.0293731689453125,
-0.04534912109375,
-0.0335693359375,
-0.06317138671875,
0.01056671142578125,
0.013763427734375,
-0.005207061767578125,
0.02716064453125,
0.046661376953125,
-0.00566864013671875,
0.04486083984375,
-0.035125732421875,
-0.025787353515625,
-0.014556884765625,
0.00865936279296875,
0.0355224609375,
0.06231689453125,
0.07666015625,
-0.054046630859375,
-0.0301513671875,
-0.006946563720703125,
-0.068603515625,
0.0181427001953125,
-0.00919342041015625,
-0.014923095703125,
0.028533935546875,
0.01107025146484375,
-0.04937744140625,
0.048828125,
0.0140228271484375,
-0.021484375,
0.03326416015625,
-0.01556396484375,
0.0182037353515625,
-0.08392333984375,
0.00696563720703125,
0.030975341796875,
-0.0061798095703125,
-0.0301055908203125,
0.0034885406494140625,
0.0006852149963378906,
-0.0024394989013671875,
-0.04034423828125,
0.055877685546875,
-0.04400634765625,
-0.0167388916015625,
-0.007480621337890625,
-0.01678466796875,
-0.00038909912109375,
0.0618896484375,
-0.00266265869140625,
0.0291595458984375,
0.055023193359375,
-0.041290283203125,
0.0312347412109375,
0.035064697265625,
-0.01303863525390625,
0.029541015625,
-0.0484619140625,
0.01305389404296875,
0.00441741943359375,
0.01412200927734375,
-0.07049560546875,
-0.01219940185546875,
0.0247650146484375,
-0.04278564453125,
0.055023193359375,
-0.0335693359375,
-0.0295562744140625,
-0.04156494140625,
-0.03680419921875,
0.026763916015625,
0.043731689453125,
-0.053314208984375,
0.035247802734375,
0.01468658447265625,
0.02056884765625,
-0.047393798828125,
-0.07208251953125,
-0.02593994140625,
-0.01617431640625,
-0.055267333984375,
0.032623291015625,
0.00542449951171875,
0.0097198486328125,
0.007526397705078125,
-0.0161590576171875,
-0.00865936279296875,
-0.01021575927734375,
0.044189453125,
0.0259857177734375,
-0.038970947265625,
-0.01111602783203125,
-0.01515960693359375,
-0.00484466552734375,
-0.0004024505615234375,
-0.02557373046875,
0.05352783203125,
-0.0269622802734375,
-0.00995635986328125,
-0.06427001953125,
-0.003887176513671875,
0.0289764404296875,
-0.009185791015625,
0.057373046875,
0.0797119140625,
-0.0450439453125,
-0.01253509521484375,
-0.029022216796875,
-0.03131103515625,
-0.03961181640625,
0.0292816162109375,
-0.0345458984375,
-0.03167724609375,
0.060943603515625,
0.004756927490234375,
0.012054443359375,
0.049530029296875,
0.027099609375,
-0.0012493133544921875,
0.041900634765625,
0.041900634765625,
0.0124053955078125,
0.05926513671875,
-0.07415771484375,
-0.01763916015625,
-0.056396484375,
-0.02880859375,
-0.0195159912109375,
-0.049591064453125,
-0.04827880859375,
-0.029998779296875,
0.024627685546875,
0.0040435791015625,
-0.0289306640625,
0.0367431640625,
-0.07342529296875,
0.01018524169921875,
0.057830810546875,
0.0489501953125,
-0.029632568359375,
0.0194549560546875,
-0.0301971435546875,
-0.00409698486328125,
-0.059478759765625,
-0.01027679443359375,
0.07928466796875,
0.03271484375,
0.037353515625,
-0.0160064697265625,
0.05682373046875,
-0.01361083984375,
0.032379150390625,
-0.043609619140625,
0.048858642578125,
-0.02008056640625,
-0.03662109375,
-0.01151275634765625,
-0.034271240234375,
-0.07318115234375,
0.01108551025390625,
-0.0221710205078125,
-0.0518798828125,
0.01509857177734375,
0.0130157470703125,
-0.0185699462890625,
0.06219482421875,
-0.057952880859375,
0.0751953125,
-0.0183563232421875,
-0.0298614501953125,
0.00913238525390625,
-0.05120849609375,
0.0193023681640625,
0.022369384765625,
-0.02117919921875,
-0.0031833648681640625,
0.0243988037109375,
0.079345703125,
-0.048431396484375,
0.0595703125,
-0.04541015625,
0.032684326171875,
0.037200927734375,
-0.00499725341796875,
0.0154876708984375,
-0.0127410888671875,
-0.01308441162109375,
0.036163330078125,
0.00556182861328125,
-0.037384033203125,
-0.037139892578125,
0.04534912109375,
-0.07098388671875,
-0.0167388916015625,
-0.0294036865234375,
-0.0382080078125,
0.0137939453125,
0.0132293701171875,
0.03839111328125,
0.05596923828125,
0.00995635986328125,
0.0191650390625,
0.043975830078125,
-0.03570556640625,
0.031646728515625,
-0.01177215576171875,
-0.0206451416015625,
-0.040985107421875,
0.066650390625,
0.018218994140625,
0.008453369140625,
0.02093505859375,
0.01486968994140625,
-0.0212554931640625,
-0.043975830078125,
-0.025848388671875,
0.031494140625,
-0.050537109375,
-0.037567138671875,
-0.037353515625,
-0.037200927734375,
-0.02734375,
-0.012451171875,
-0.03662109375,
-0.0274200439453125,
-0.026458740234375,
0.0133209228515625,
0.056671142578125,
0.041534423828125,
-0.01305389404296875,
0.046173095703125,
-0.0343017578125,
0.0126953125,
0.00534820556640625,
0.03570556640625,
-0.00440216064453125,
-0.057586669921875,
-0.00853729248046875,
-0.0022716522216796875,
-0.03851318359375,
-0.05242919921875,
0.04058837890625,
0.01047515869140625,
0.0372314453125,
0.0264129638671875,
-0.0149078369140625,
0.055572509765625,
0.00234222412109375,
0.038604736328125,
0.02459716796875,
-0.051971435546875,
0.050018310546875,
-0.0015048980712890625,
0.01238250732421875,
0.0080108642578125,
0.0286407470703125,
-0.0195770263671875,
0.0008449554443359375,
-0.07269287109375,
-0.058441162109375,
0.06890869140625,
0.0111236572265625,
-0.00807952880859375,
0.02899169921875,
0.057373046875,
0.0023326873779296875,
0.0097808837890625,
-0.05145263671875,
-0.02960205078125,
-0.03143310546875,
-0.021636962890625,
0.00504302978515625,
-0.001224517822265625,
-0.004695892333984375,
-0.047882080078125,
0.0545654296875,
-0.00322723388671875,
0.06732177734375,
0.026641845703125,
0.00118255615234375,
-0.007480621337890625,
-0.03302001953125,
0.04083251953125,
0.017364501953125,
-0.035064697265625,
-0.0007891654968261719,
0.0152435302734375,
-0.0450439453125,
-0.0010595321655273438,
0.0140228271484375,
-0.002002716064453125,
0.00982666015625,
0.0292816162109375,
0.07196044921875,
0.00885009765625,
0.0089874267578125,
0.0304718017578125,
-0.01001739501953125,
-0.03558349609375,
-0.0189361572265625,
0.0130157470703125,
-0.0013856887817382812,
0.0323486328125,
0.0235595703125,
0.02850341796875,
-0.0217437744140625,
-0.0164031982421875,
0.032684326171875,
0.047607421875,
-0.0181121826171875,
-0.02581787109375,
0.05340576171875,
-0.018890380859375,
-0.0186920166015625,
0.07244873046875,
-0.00634765625,
-0.03314208984375,
0.08349609375,
0.031982421875,
0.06524658203125,
-0.01325225830078125,
0.00537872314453125,
0.0582275390625,
0.0265960693359375,
0.007091522216796875,
0.0223388671875,
0.005878448486328125,
-0.057281494140625,
0.01114654541015625,
-0.050567626953125,
-0.004917144775390625,
0.0276641845703125,
-0.03631591796875,
0.0260772705078125,
-0.06878662109375,
-0.0312347412109375,
0.0206146240234375,
0.02984619140625,
-0.07379150390625,
0.02294921875,
-0.00461578369140625,
0.07037353515625,
-0.05364990234375,
0.05780029296875,
0.057708740234375,
-0.050933837890625,
-0.08056640625,
-0.00940704345703125,
0.0046539306640625,
-0.07177734375,
0.041717529296875,
0.040771484375,
0.0128936767578125,
-0.0004067420959472656,
-0.05609130859375,
-0.03839111328125,
0.108642578125,
0.03887939453125,
-0.0032863616943359375,
0.01505279541015625,
-0.00125885009765625,
0.0181732177734375,
-0.03411865234375,
0.03497314453125,
0.00876617431640625,
0.030120849609375,
0.026275634765625,
-0.05279541015625,
0.02593994140625,
-0.0220794677734375,
0.006870269775390625,
0.0177154541015625,
-0.060943603515625,
0.07421875,
-0.03277587890625,
-0.0079498291015625,
0.0009570121765136719,
0.0538330078125,
0.0196685791015625,
0.0242156982421875,
0.039642333984375,
0.05535888671875,
0.038604736328125,
-0.0280914306640625,
0.0654296875,
-0.0008234977722167969,
0.044097900390625,
0.043731689453125,
0.029022216796875,
0.047515869140625,
0.02777099609375,
-0.019378662109375,
0.0246429443359375,
0.08380126953125,
-0.02398681640625,
0.02471923828125,
0.0164337158203125,
0.0079193115234375,
-0.0034046173095703125,
0.017608642578125,
-0.032440185546875,
0.038330078125,
0.0185394287109375,
-0.043548583984375,
-0.0133209228515625,
0.0017843246459960938,
0.00017571449279785156,
-0.030487060546875,
-0.0131988525390625,
0.034149169921875,
0.0092315673828125,
-0.0292205810546875,
0.07354736328125,
0.0048675537109375,
0.06951904296875,
-0.0357666015625,
0.0020351409912109375,
-0.0251922607421875,
0.019256591796875,
-0.0310516357421875,
-0.055267333984375,
0.0272216796875,
-0.032867431640625,
-0.0016469955444335938,
0.002643585205078125,
0.04571533203125,
-0.0209197998046875,
-0.033447265625,
0.01392364501953125,
0.019378662109375,
0.03857421875,
-0.0035762786865234375,
-0.093505859375,
0.0134124755859375,
0.007137298583984375,
-0.045989990234375,
0.0255126953125,
0.032562255859375,
0.005184173583984375,
0.0506591796875,
0.049530029296875,
-0.007137298583984375,
0.01114654541015625,
-0.0233154296875,
0.06512451171875,
-0.0428466796875,
-0.0193634033203125,
-0.054046630859375,
0.042724609375,
-0.005985260009765625,
-0.047119140625,
0.0316162109375,
0.03912353515625,
0.0679931640625,
-0.0078277587890625,
0.0273590087890625,
-0.02264404296875,
-0.008270263671875,
-0.0360107421875,
0.058135986328125,
-0.05987548828125,
-0.0037746429443359375,
-0.0216064453125,
-0.05108642578125,
-0.02728271484375,
0.0623779296875,
-0.021484375,
0.03472900390625,
0.0362548828125,
0.07904052734375,
-0.032318115234375,
-0.01096343994140625,
0.0139923095703125,
0.009490966796875,
0.00885009765625,
0.033477783203125,
0.02166748046875,
-0.061431884765625,
0.0176849365234375,
-0.054443359375,
-0.0185089111328125,
-0.01568603515625,
-0.0506591796875,
-0.06927490234375,
-0.070068359375,
-0.044189453125,
-0.0462646484375,
-0.0155792236328125,
0.07171630859375,
0.08050537109375,
-0.05908203125,
-0.007343292236328125,
0.003543853759765625,
0.01432037353515625,
-0.031280517578125,
-0.0188751220703125,
0.04876708984375,
-0.00787353515625,
-0.05108642578125,
-0.02056884765625,
0.0018701553344726562,
0.0280914306640625,
-0.0005865097045898438,
-0.01496124267578125,
-0.02069091796875,
-0.01654052734375,
0.023101806640625,
0.026763916015625,
-0.049774169921875,
-0.01216888427734375,
-0.0134124755859375,
-0.0038585662841796875,
0.032012939453125,
0.03558349609375,
-0.03656005859375,
0.0196685791015625,
0.0265350341796875,
0.03173828125,
0.0625,
-0.026763916015625,
0.0006098747253417969,
-0.07550048828125,
0.049102783203125,
-0.004913330078125,
0.03314208984375,
0.0309906005859375,
-0.0297393798828125,
0.043609619140625,
0.031036376953125,
-0.034759521484375,
-0.06829833984375,
-0.0093994140625,
-0.090087890625,
-0.01329803466796875,
0.062347412109375,
-0.0250396728515625,
-0.044189453125,
0.037200927734375,
-0.0030078887939453125,
0.04766845703125,
-0.01396942138671875,
0.03558349609375,
0.0195465087890625,
-0.01389312744140625,
-0.044677734375,
-0.043975830078125,
0.034759521484375,
0.01464080810546875,
-0.04486083984375,
-0.03387451171875,
0.00434112548828125,
0.055084228515625,
0.0245513916015625,
0.042144775390625,
-0.011474609375,
0.01226043701171875,
0.0128631591796875,
0.03607177734375,
-0.03399658203125,
-0.005199432373046875,
-0.0182647705078125,
0.00907135009765625,
-0.00983428955078125,
-0.0518798828125
]
] |
timm/res2net50_14w_8s.in1k | 2023-04-24T00:04:42.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"arxiv:1904.01169",
"license:unknown",
"region:us"
] | image-classification | timm | null | null | timm/res2net50_14w_8s.in1k | 0 | 37,253 | timm | 2023-04-24T00:04:16 | ---
tags:
- image-classification
- timm
library_name: timm
license: unknown
datasets:
- imagenet-1k
---
# Model card for res2net50_14w_8s.in1k
A Res2Net (Multi-Scale ResNet) image classification model. Trained on ImageNet-1k by paper authors.
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 25.1
- GMACs: 4.2
- Activations (M): 13.3
- Image size: 224 x 224
- **Papers:**
- Res2Net: A New Multi-scale Backbone Architecture: https://arxiv.org/abs/1904.01169
- **Dataset:** ImageNet-1k
- **Original:** https://github.com/gasvn/Res2Net/
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('res2net50_14w_8s.in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'res2net50_14w_8s.in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g.:
# torch.Size([1, 64, 112, 112])
# torch.Size([1, 256, 56, 56])
# torch.Size([1, 512, 28, 28])
# torch.Size([1, 1024, 14, 14])
# torch.Size([1, 2048, 7, 7])
print(o.shape)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'res2net50_14w_8s.in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 2048, 7, 7) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
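The pooled embedding produced above is a plain feature vector (like the ones stored in this dataset's embedding columns), so standard vector-similarity measures apply. A minimal, dependency-free cosine-similarity sketch — a generic illustration, not part of the `timm` API:

```python
import math

def cosine_similarity(a, b):
    # dot(a, b) / (||a|| * ||b||)
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

print(cosine_similarity([1.0, 0.0], [1.0, 0.0]))  # 1.0
```

With tensors, `torch.nn.functional.cosine_similarity` does the same job in batch.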
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@article{gao2019res2net,
title={Res2Net: A New Multi-scale Backbone Architecture},
author={Gao, Shang-Hua and Cheng, Ming-Ming and Zhao, Kai and Zhang, Xin-Yu and Yang, Ming-Hsuan and Torr, Philip},
journal={IEEE TPAMI},
doi={10.1109/TPAMI.2019.2938758},
}
```
| 3,670 | [
[
-0.034332275390625,
-0.02459716796875,
-0.006622314453125,
0.01319122314453125,
-0.0203857421875,
-0.0287322998046875,
-0.0163116455078125,
-0.02935791015625,
0.0228424072265625,
0.038665771484375,
-0.033172607421875,
-0.039886474609375,
-0.050689697265625,
-0.0086212158203125,
-0.007213592529296875,
0.066162109375,
-0.0033740997314453125,
0.0017910003662109375,
-0.01111602783203125,
-0.04241943359375,
-0.0103607177734375,
-0.0178680419921875,
-0.0748291015625,
-0.031829833984375,
0.0286712646484375,
0.0189361572265625,
0.02734375,
0.047454833984375,
0.04681396484375,
0.03704833984375,
-0.00543212890625,
0.009979248046875,
-0.018035888671875,
-0.018707275390625,
0.0247650146484375,
-0.05169677734375,
-0.03704833984375,
0.01824951171875,
0.059967041015625,
0.0221405029296875,
0.00617218017578125,
0.040374755859375,
0.0128173828125,
0.046112060546875,
-0.011993408203125,
0.01253509521484375,
-0.0300445556640625,
0.01476287841796875,
-0.014007568359375,
0.0132598876953125,
-0.0242462158203125,
-0.0300140380859375,
0.01303863525390625,
-0.035186767578125,
0.0301055908203125,
0.00983428955078125,
0.09686279296875,
0.017120361328125,
-0.00882720947265625,
0.004024505615234375,
-0.0106964111328125,
0.06463623046875,
-0.0589599609375,
0.0167236328125,
0.0226593017578125,
0.017059326171875,
-0.004497528076171875,
-0.0867919921875,
-0.043304443359375,
0.000621795654296875,
-0.01419830322265625,
-0.00495147705078125,
-0.01383209228515625,
-0.006488800048828125,
0.02020263671875,
0.0193939208984375,
-0.03411865234375,
0.016571044921875,
-0.046966552734375,
-0.0177764892578125,
0.040283203125,
0.0031604766845703125,
0.02301025390625,
-0.0205535888671875,
-0.039459228515625,
-0.02996826171875,
-0.0202789306640625,
0.0253143310546875,
0.02276611328125,
0.02490234375,
-0.0494384765625,
0.0295257568359375,
0.01171112060546875,
0.036773681640625,
0.000335693359375,
-0.0267791748046875,
0.04718017578125,
-0.0005059242248535156,
-0.0311431884765625,
-0.0113372802734375,
0.083740234375,
0.032501220703125,
0.01947021484375,
0.0151214599609375,
-0.00534820556640625,
-0.032073974609375,
-0.0033111572265625,
-0.08819580078125,
-0.0279693603515625,
0.0293121337890625,
-0.0548095703125,
-0.031524658203125,
0.01306915283203125,
-0.057464599609375,
-0.014678955078125,
0.0007138252258300781,
0.033782958984375,
-0.026458740234375,
-0.04058837890625,
0.00528717041015625,
-0.0174102783203125,
0.022796630859375,
0.00647735595703125,
-0.041351318359375,
0.0149993896484375,
0.0230560302734375,
0.08514404296875,
0.01242828369140625,
-0.032135009765625,
-0.01236724853515625,
-0.03009033203125,
-0.0184783935546875,
0.03643798828125,
-0.001323699951171875,
-0.0022678375244140625,
-0.0260162353515625,
0.031402587890625,
-0.0097198486328125,
-0.0496826171875,
0.01465606689453125,
-0.01690673828125,
0.0249786376953125,
-0.00693511962890625,
-0.01210784912109375,
-0.0418701171875,
0.0245208740234375,
-0.038726806640625,
0.09649658203125,
0.0310821533203125,
-0.06884765625,
0.0147247314453125,
-0.03875732421875,
-0.00698089599609375,
-0.02764892578125,
0.004062652587890625,
-0.085693359375,
-0.00946807861328125,
0.005001068115234375,
0.04608154296875,
-0.031585693359375,
-0.001605987548828125,
-0.04541015625,
-0.01074981689453125,
0.0223236083984375,
-0.0019321441650390625,
0.073974609375,
0.007396697998046875,
-0.030242919921875,
0.0124664306640625,
-0.04302978515625,
0.02520751953125,
0.04052734375,
-0.024444580078125,
0.0001481771469116211,
-0.04803466796875,
0.021514892578125,
0.0297088623046875,
0.0120086669921875,
-0.04559326171875,
0.019866943359375,
-0.0178375244140625,
0.040283203125,
0.043212890625,
-0.01349639892578125,
0.02178955078125,
-0.0273284912109375,
0.0130767822265625,
0.01383209228515625,
0.0145416259765625,
0.0032176971435546875,
-0.045166015625,
-0.059173583984375,
-0.031402587890625,
0.0299224853515625,
0.031707763671875,
-0.030914306640625,
0.033935546875,
-0.0125274658203125,
-0.05816650390625,
-0.0343017578125,
0.0057220458984375,
0.04071044921875,
0.042755126953125,
0.032562255859375,
-0.034423828125,
-0.0445556640625,
-0.07257080078125,
0.00817108154296875,
0.0013637542724609375,
-0.00286102294921875,
0.028045654296875,
0.051513671875,
-0.005405426025390625,
0.048248291015625,
-0.033050537109375,
-0.0268096923828125,
-0.0224456787109375,
0.006214141845703125,
0.0258331298828125,
0.059722900390625,
0.06573486328125,
-0.046783447265625,
-0.031646728515625,
-0.0129241943359375,
-0.0728759765625,
0.0158843994140625,
-0.008453369140625,
-0.01357269287109375,
0.0195159912109375,
0.006801605224609375,
-0.03729248046875,
0.03875732421875,
0.0204620361328125,
-0.0224761962890625,
0.034698486328125,
-0.0191802978515625,
0.015472412109375,
-0.098388671875,
0.01061248779296875,
0.031768798828125,
-0.0019407272338867188,
-0.0323486328125,
0.003025054931640625,
0.006969451904296875,
-0.004566192626953125,
-0.03802490234375,
0.049530029296875,
-0.044219970703125,
-0.026214599609375,
-0.01396942138671875,
-0.01494598388671875,
0.01171112060546875,
0.05462646484375,
0.00013697147369384766,
0.0267333984375,
0.056365966796875,
-0.0302276611328125,
0.043853759765625,
0.026611328125,
-0.01413726806640625,
0.0294189453125,
-0.050872802734375,
0.01181793212890625,
0.0031795501708984375,
0.0283355712890625,
-0.08319091796875,
-0.01885986328125,
0.0159149169921875,
-0.042755126953125,
0.04803466796875,
-0.0372314453125,
-0.030120849609375,
-0.04986572265625,
-0.0382080078125,
0.0338134765625,
0.05279541015625,
-0.05169677734375,
0.027557373046875,
0.02056884765625,
0.02142333984375,
-0.04058837890625,
-0.07415771484375,
-0.0157623291015625,
-0.034423828125,
-0.05487060546875,
0.0207977294921875,
0.0201873779296875,
0.01067352294921875,
0.01010894775390625,
-0.0013246536254882812,
-0.0152435302734375,
-0.0128326416015625,
0.040313720703125,
0.0224151611328125,
-0.02667236328125,
-0.007511138916015625,
-0.0229644775390625,
-0.00724029541015625,
0.006984710693359375,
-0.02752685546875,
0.0477294921875,
-0.02398681640625,
-0.00806427001953125,
-0.07476806640625,
-0.0110015869140625,
0.0408935546875,
-0.0103302001953125,
0.0638427734375,
0.08660888671875,
-0.040771484375,
0.0008168220520019531,
-0.0292205810546875,
-0.02899169921875,
-0.035614013671875,
0.041412353515625,
-0.02947998046875,
-0.031585693359375,
0.0648193359375,
-0.01149749755859375,
0.01137542724609375,
0.050872802734375,
0.0164642333984375,
-0.007091522216796875,
0.045501708984375,
0.038482666015625,
0.015380859375,
0.041656494140625,
-0.0811767578125,
-0.01309967041015625,
-0.08001708984375,
-0.048095703125,
-0.036041259765625,
-0.055877685546875,
-0.048095703125,
-0.0312347412109375,
0.030548095703125,
0.0169677734375,
-0.036346435546875,
0.04351806640625,
-0.06280517578125,
-0.00034546852111816406,
0.04705810546875,
0.042816162109375,
-0.036041259765625,
0.0244293212890625,
-0.015655517578125,
-0.00385284423828125,
-0.0517578125,
-0.01480865478515625,
0.08343505859375,
0.0439453125,
0.042022705078125,
-0.004116058349609375,
0.045318603515625,
-0.018035888671875,
0.016082763671875,
-0.04217529296875,
0.03973388671875,
-0.0152435302734375,
-0.031494140625,
-0.01861572265625,
-0.026763916015625,
-0.07647705078125,
0.01180267333984375,
-0.0226593017578125,
-0.056793212890625,
0.01139068603515625,
0.01078033447265625,
-0.016387939453125,
0.06964111328125,
-0.056488037109375,
0.0672607421875,
-0.0080413818359375,
-0.030853271484375,
-0.00020039081573486328,
-0.0501708984375,
0.024078369140625,
0.00994873046875,
-0.0122528076171875,
-0.004840850830078125,
0.0091552734375,
0.08697509765625,
-0.041717529296875,
0.0673828125,
-0.035980224609375,
0.029083251953125,
0.042388916015625,
-0.013153076171875,
0.024810791015625,
-0.0035991668701171875,
-0.01515960693359375,
0.03021240234375,
-0.00337982177734375,
-0.038665771484375,
-0.0399169921875,
0.042999267578125,
-0.0740966796875,
-0.019378662109375,
-0.0208282470703125,
-0.031005859375,
0.01190948486328125,
0.00713348388671875,
0.0408935546875,
0.0589599609375,
0.0261688232421875,
0.031494140625,
0.043609619140625,
-0.033233642578125,
0.035675048828125,
-0.009918212890625,
-0.01055145263671875,
-0.0411376953125,
0.0599365234375,
0.0176849365234375,
0.0134429931640625,
0.006439208984375,
0.01433563232421875,
-0.039398193359375,
-0.03741455078125,
-0.031646728515625,
0.0301055908203125,
-0.04229736328125,
-0.03802490234375,
-0.0458984375,
-0.034698486328125,
-0.03802490234375,
-0.0014047622680664062,
-0.046417236328125,
-0.029083251953125,
-0.027252197265625,
0.0179443359375,
0.05279541015625,
0.044036865234375,
-0.0129241943359375,
0.0396728515625,
-0.04095458984375,
0.006679534912109375,
0.0106964111328125,
0.036468505859375,
-0.0005297660827636719,
-0.07415771484375,
-0.01519012451171875,
-0.005126953125,
-0.02734375,
-0.047760009765625,
0.03094482421875,
0.01108551025390625,
0.031646728515625,
0.028045654296875,
-0.0132904052734375,
0.052825927734375,
-0.0008711814880371094,
0.045379638671875,
0.03387451171875,
-0.038238525390625,
0.043792724609375,
0.004756927490234375,
0.0107574462890625,
0.0017747879028320312,
0.0206756591796875,
-0.0234832763671875,
0.004001617431640625,
-0.068603515625,
-0.052276611328125,
0.0703125,
0.001430511474609375,
-0.0009741783142089844,
0.029052734375,
0.058990478515625,
-0.005123138427734375,
-0.00362396240234375,
-0.052825927734375,
-0.031982421875,
-0.028900146484375,
-0.01446533203125,
0.0080718994140625,
-0.01473236083984375,
-0.004116058349609375,
-0.0413818359375,
0.047821044921875,
-0.00873565673828125,
0.05841064453125,
0.019439697265625,
0.01248931884765625,
-0.0006380081176757812,
-0.032745361328125,
0.03802490234375,
0.0166168212890625,
-0.019805908203125,
0.00611114501953125,
0.023193359375,
-0.04547119140625,
0.007045745849609375,
-0.00037550926208496094,
0.00952911376953125,
-0.00760650634765625,
0.043731689453125,
0.06988525390625,
-0.00046706199645996094,
0.012115478515625,
0.0252227783203125,
-0.006244659423828125,
-0.035064697265625,
-0.01910400390625,
0.0076141357421875,
0.004436492919921875,
0.03131103515625,
0.017242431640625,
0.027801513671875,
-0.016815185546875,
-0.0175323486328125,
0.0254669189453125,
0.0347900390625,
-0.0211944580078125,
-0.0271148681640625,
0.050323486328125,
-0.0106964111328125,
-0.01360321044921875,
0.078369140625,
-0.004425048828125,
-0.033599853515625,
0.09271240234375,
0.03985595703125,
0.06414794921875,
-0.006732940673828125,
-0.0009622573852539062,
0.0732421875,
0.017364501953125,
-0.00457763671875,
0.0147247314453125,
0.022369384765625,
-0.0562744140625,
0.008697509765625,
-0.0494384765625,
0.00904083251953125,
0.035003662109375,
-0.04925537109375,
0.0234222412109375,
-0.0560302734375,
-0.034271240234375,
0.0099945068359375,
0.0288543701171875,
-0.0626220703125,
0.022308349609375,
-0.007740020751953125,
0.07196044921875,
-0.059539794921875,
0.06427001953125,
0.06500244140625,
-0.03607177734375,
-0.08660888671875,
-0.006694793701171875,
0.00904083251953125,
-0.07379150390625,
0.0472412109375,
0.036956787109375,
0.0169677734375,
0.0057373046875,
-0.044830322265625,
-0.0511474609375,
0.10943603515625,
0.030487060546875,
-0.00623321533203125,
0.019378662109375,
-0.0004737377166748047,
0.01983642578125,
-0.034820556640625,
0.042022705078125,
0.01261138916015625,
0.033355712890625,
0.0243988037109375,
-0.057403564453125,
0.0202178955078125,
-0.0193634033203125,
0.01335906982421875,
0.0165252685546875,
-0.07208251953125,
0.07391357421875,
-0.045135498046875,
-0.005619049072265625,
0.0025482177734375,
0.0513916015625,
0.019927978515625,
0.01045989990234375,
0.04083251953125,
0.06536865234375,
0.043121337890625,
-0.0260162353515625,
0.058502197265625,
-0.00197601318359375,
0.05352783203125,
0.04534912109375,
0.0220184326171875,
0.047119140625,
0.0187530517578125,
-0.0208587646484375,
0.034332275390625,
0.087158203125,
-0.030487060546875,
0.022186279296875,
0.0220794677734375,
-0.0006909370422363281,
-0.005035400390625,
0.004302978515625,
-0.0396728515625,
0.042755126953125,
0.01557159423828125,
-0.036865234375,
-0.0204925537109375,
0.005077362060546875,
0.0029449462890625,
-0.0221710205078125,
-0.01178741455078125,
0.034637451171875,
-0.0005116462707519531,
-0.0245513916015625,
0.0614013671875,
0.0136260986328125,
0.06109619140625,
-0.025054931640625,
0.00033593177795410156,
-0.0238189697265625,
0.0182037353515625,
-0.0297698974609375,
-0.069091796875,
0.028106689453125,
-0.0207672119140625,
-0.0030612945556640625,
0.00417327880859375,
0.04571533203125,
-0.025726318359375,
-0.029388427734375,
0.0186614990234375,
0.0135345458984375,
0.048004150390625,
0.007293701171875,
-0.09564208984375,
0.01202392578125,
0.00676727294921875,
-0.0517578125,
0.0305938720703125,
0.0313720703125,
0.003643035888671875,
0.058135986328125,
0.042510986328125,
-0.006702423095703125,
0.0023174285888671875,
-0.0190582275390625,
0.056304931640625,
-0.03350830078125,
-0.0179443359375,
-0.05712890625,
0.048797607421875,
-0.01300811767578125,
-0.044891357421875,
0.03326416015625,
0.051116943359375,
0.0567626953125,
0.0015354156494140625,
0.035736083984375,
-0.0232086181640625,
-0.00292205810546875,
-0.028900146484375,
0.053436279296875,
-0.054656982421875,
-0.00672149658203125,
0.00391387939453125,
-0.050689697265625,
-0.018951416015625,
0.047332763671875,
-0.01091766357421875,
0.0295867919921875,
0.039306640625,
0.074951171875,
-0.0240020751953125,
-0.0229034423828125,
0.006000518798828125,
0.007213592529296875,
0.00641632080078125,
0.035400390625,
0.0111083984375,
-0.06463623046875,
0.0270233154296875,
-0.044921875,
-0.0180816650390625,
-0.00852203369140625,
-0.043304443359375,
-0.0758056640625,
-0.06884765625,
-0.040435791015625,
-0.059539794921875,
-0.023712158203125,
0.06146240234375,
0.08538818359375,
-0.0523681640625,
-0.01036834716796875,
0.00787353515625,
0.01195526123046875,
-0.0155029296875,
-0.0160369873046875,
0.0540771484375,
-0.01287078857421875,
-0.05889892578125,
-0.0247344970703125,
0.0007476806640625,
0.025909423828125,
-0.007503509521484375,
-0.0176239013671875,
-0.011962890625,
-0.0204620361328125,
0.01273345947265625,
0.0274810791015625,
-0.05316162109375,
-0.0196380615234375,
-0.02215576171875,
-0.01479339599609375,
0.0225830078125,
0.0265045166015625,
-0.035980224609375,
0.01654052734375,
0.0333251953125,
0.02996826171875,
0.05615234375,
-0.022308349609375,
-0.003936767578125,
-0.06402587890625,
0.043487548828125,
-0.0211334228515625,
0.03326416015625,
0.0247650146484375,
-0.0272216796875,
0.045562744140625,
0.03875732421875,
-0.03131103515625,
-0.06298828125,
-0.005199432373046875,
-0.0733642578125,
-0.0125579833984375,
0.06591796875,
-0.035247802734375,
-0.038360595703125,
0.033660888671875,
0.0004525184631347656,
0.048095703125,
-0.0010919570922851562,
0.035614013671875,
0.0163726806640625,
-0.0166168212890625,
-0.0487060546875,
-0.038299560546875,
0.032623291015625,
0.0063934326171875,
-0.051239013671875,
-0.033294677734375,
-0.00937652587890625,
0.056671142578125,
0.0231170654296875,
0.030426025390625,
-0.0120849609375,
0.0037136077880859375,
0.0160064697265625,
0.04205322265625,
-0.034027099609375,
-0.00942230224609375,
-0.02056884765625,
0.01049041748046875,
-0.01160430908203125,
-0.05242919921875
]
] |
timm/poolformer_m36.sail_in1k | 2023-05-05T06:15:32.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"arxiv:2210.13452",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/poolformer_m36.sail_in1k | 0 | 37,119 | timm | 2023-05-05T06:14:32 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
datasets:
- imagenet-1k
---
# Model card for poolformer_m36.sail_in1k
A PoolFormer (a MetaFormer) image classification model. Trained on ImageNet-1k by paper authors.
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 56.2
- GMACs: 8.8
- Activations (M): 22.0
- Image size: 224 x 224
- **Papers:**
- MetaFormer Is Actually What You Need for Vision: https://arxiv.org/abs/2210.13452
- **Original:** https://github.com/sail-sg/poolformer
- **Dataset:** ImageNet-1k
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('poolformer_m36.sail_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
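The last line above chains two post-processing steps: a softmax over the class logits, scaled to percentages, then a top-5 selection. As a plain-Python sketch on toy logits (no model or `torch` involved, values are made up), the post-processing amounts to:

```python
import math

def softmax_topk(logits, k=5):
    """Replicate `output.softmax(dim=1)` followed by `torch.topk(..., k)`
    for a single row of logits, in plain Python."""
    m = max(logits)                                # subtract max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    probs = [100.0 * e / total for e in exps]      # percentages, like the `* 100` above
    ranked = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    top = ranked[:k]
    return [probs[i] for i in top], top

# toy logits for a six-class problem
top_probs, top_idx = softmax_topk([0.1, 2.0, -1.0, 3.0, 0.5, 1.5], k=3)
```

Running the real snippet yields the same two results as tensors, with `top5_class_indices` indexing into the ImageNet-1k label set.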
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'poolformer_m36.sail_in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g.:
# torch.Size([1, 96, 56, 56])
# torch.Size([1, 192, 28, 28])
# torch.Size([1, 384, 14, 14])
# torch.Size([1, 768, 7, 7])
print(o.shape)
```
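The four shapes printed above follow from the stage channel widths (96, 192, 384, 768) and the usual ×4 through ×32 cumulative strides of a hierarchical backbone (strides inferred from the printed sizes, not from the model config). A small sketch reproducing them:

```python
def stage_shapes(img_size=224, channels=(96, 192, 384, 768), strides=(4, 8, 16, 32)):
    """Each stage's feature-map shape: spatial size is the input resolution
    divided by that stage's cumulative stride."""
    return [(1, c, img_size // s, img_size // s) for c, s in zip(channels, strides)]

shapes = stage_shapes()
```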
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'poolformer_m36.sail_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 768, 7, 7) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
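`forward_head(output, pre_logits=True)` turns the unpooled `(1, 768, 7, 7)` tensor into a `(1, 768)` embedding; assuming the default average pooling, that reduction is just a spatial mean per channel, sketched here on a tiny nested-list stand-in:

```python
def global_avg_pool(fmap):
    """Collapse a (batch, C, H, W) nested-list feature map to (batch, C)
    by averaging each channel over its spatial positions."""
    return [[sum(sum(row) for row in chan) / (len(chan) * len(chan[0]))
             for chan in img] for img in fmap]

# tiny stand-in: batch of 1, 2 channels, 2x2 spatial
fmap = [[[[1.0, 2.0], [3.0, 4.0]],
         [[0.0, 0.0], [10.0, 10.0]]]]
emb = global_avg_pool(fmap)
```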
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@inproceedings{yu2022metaformer,
title={Metaformer is actually what you need for vision},
author={Yu, Weihao and Luo, Mi and Zhou, Pan and Si, Chenyang and Zhou, Yichen and Wang, Xinchao and Feng, Jiashi and Yan, Shuicheng},
booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
pages={10819--10829},
year={2022}
}
```
| 3,744 | [
[
… (768-dimensional embedding vector omitted) …
]
] |
timm/res2next50.in1k | 2023-04-24T00:08:47.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"arxiv:1904.01169",
"license:unknown",
"region:us"
] | image-classification | timm | null | null | timm/res2next50.in1k | 0 | 37,081 | timm | 2023-04-24T00:08:28 | ---
tags:
- image-classification
- timm
library_name: timm
license: unknown
datasets:
- imagenet-1k
---
# Model card for res2next50.in1k
A Res2Net (Multi-Scale ResNet) image classification model. Trained on ImageNet-1k by paper authors.
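The "multi-scale" in the name refers to Res2Net's bottleneck design: the 3x3 conv's channels are split into groups processed sequentially, with each group's conv also receiving the previous group's output, so later groups see progressively larger receptive fields. A toy sketch of that wiring (scalars stand in for channel splits and doubling stands in for a conv — both purely illustrative):

```python
def res2net_wiring(splits, conv):
    """Hierarchical split processing: the first split passes through
    unchanged; each later split's conv input is that split plus the
    previous split's output."""
    outs = [splits[0]]
    prev = None
    for x in splits[1:]:
        prev = conv(x if prev is None else x + prev)
        outs.append(prev)
    return outs

# four equal splits, with a doubling function standing in for each 3x3 conv
outs = res2net_wiring([1, 1, 1, 1], conv=lambda v: 2 * v)
```

The growing outputs illustrate how later splits accumulate context from earlier ones before the groups are concatenated back together.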
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 24.7
- GMACs: 4.2
- Activations (M): 13.7
- Image size: 224 x 224
- **Papers:**
- Res2Net: A New Multi-scale Backbone Architecture: https://arxiv.org/abs/1904.01169
- **Dataset:** ImageNet-1k
- **Original:** https://github.com/gasvn/Res2Net/
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('res2next50.in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'res2next50.in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g.:
# torch.Size([1, 64, 112, 112])
# torch.Size([1, 256, 56, 56])
# torch.Size([1, 512, 28, 28])
# torch.Size([1, 1024, 14, 14])
# torch.Size([1, 2048, 7, 7])
print(o.shape)
```
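The five shapes printed above follow the classic ResNet reduction schedule, one feature level per power-of-two stride from 2 through 32. Inferring the cumulative strides from the printed spatial sizes:

```python
def reduction_factors(img_size, spatial_sizes):
    """Infer each feature level's cumulative stride from its spatial size:
    e.g. 224 -> 112 implies a total stride of 2."""
    return [img_size // s for s in spatial_sizes]

factors = reduction_factors(224, [112, 56, 28, 14, 7])
```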
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'res2next50.in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 2048, 7, 7) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@article{gao2019res2net,
title={Res2Net: A New Multi-scale Backbone Architecture},
author={Gao, Shang-Hua and Cheng, Ming-Ming and Zhao, Kai and Zhang, Xin-Yu and Yang, Ming-Hsuan and Torr, Philip},
journal={IEEE TPAMI},
doi={10.1109/TPAMI.2019.2938758},
}
```
| 3,646 | [
[
… (embedding vector values omitted) …
0.0048065185546875,
-0.040618896484375,
0.041046142578125,
0.0164794921875,
-0.036468505859375,
-0.0227813720703125,
0.007297515869140625,
0.003631591796875,
-0.02203369140625,
-0.01165008544921875,
0.035369873046875,
-0.00014603137969970703,
-0.025146484375,
0.059967041015625,
0.01116943359375,
0.058013916015625,
-0.0265350341796875,
-0.0003604888916015625,
-0.0255889892578125,
0.0206298828125,
-0.0297393798828125,
-0.07208251953125,
0.0289154052734375,
-0.0221405029296875,
-0.00386810302734375,
0.0020999908447265625,
0.0447998046875,
-0.0264434814453125,
-0.0286712646484375,
0.016937255859375,
0.01360321044921875,
0.04742431640625,
0.004425048828125,
-0.09429931640625,
0.011199951171875,
0.004848480224609375,
-0.04913330078125,
0.031402587890625,
0.0295257568359375,
0.0017213821411132812,
0.058990478515625,
0.041839599609375,
-0.00994110107421875,
0.0008854866027832031,
-0.0193634033203125,
0.056640625,
-0.035675048828125,
-0.0180816650390625,
-0.05584716796875,
0.048095703125,
-0.0104217529296875,
-0.043701171875,
0.03411865234375,
0.052886962890625,
0.05810546875,
0.0004105567932128906,
0.036163330078125,
-0.0227508544921875,
-0.001277923583984375,
-0.0278167724609375,
0.051849365234375,
-0.056365966796875,
-0.006473541259765625,
0.0016393661499023438,
-0.052398681640625,
-0.0193023681640625,
0.044281005859375,
-0.00983428955078125,
0.0290985107421875,
0.039825439453125,
0.07574462890625,
-0.023162841796875,
-0.0232391357421875,
0.00629425048828125,
0.007843017578125,
0.007297515869140625,
0.03759765625,
0.01268768310546875,
-0.06549072265625,
0.029937744140625,
-0.045166015625,
-0.0172271728515625,
-0.0076904296875,
-0.04180908203125,
-0.0748291015625,
-0.070068359375,
-0.042449951171875,
-0.060394287109375,
-0.022857666015625,
0.0621337890625,
0.08526611328125,
-0.053009033203125,
-0.010894775390625,
0.006885528564453125,
0.01430511474609375,
-0.016998291015625,
-0.015869140625,
0.053009033203125,
-0.0156402587890625,
-0.06256103515625,
-0.0262603759765625,
0.0015392303466796875,
0.0277099609375,
-0.007328033447265625,
-0.0188751220703125,
-0.00904083251953125,
-0.0208740234375,
0.01461029052734375,
0.0288543701171875,
-0.0516357421875,
-0.0182037353515625,
-0.0223541259765625,
-0.01378631591796875,
0.022552490234375,
0.0279693603515625,
-0.0408935546875,
0.019500732421875,
0.033416748046875,
0.026702880859375,
0.05841064453125,
-0.0234527587890625,
-0.0034809112548828125,
-0.06390380859375,
0.042755126953125,
-0.02032470703125,
0.032379150390625,
0.022705078125,
-0.025146484375,
0.04681396484375,
0.040069580078125,
-0.032012939453125,
-0.0653076171875,
-0.005123138427734375,
-0.07427978515625,
-0.013092041015625,
0.06634521484375,
-0.032745361328125,
-0.03656005859375,
0.03216552734375,
-0.0006318092346191406,
0.047088623046875,
0.00026988983154296875,
0.035186767578125,
0.017486572265625,
-0.0164337158203125,
-0.049835205078125,
-0.037994384765625,
0.033599853515625,
0.00658416748046875,
-0.0521240234375,
-0.033721923828125,
-0.0111083984375,
0.057464599609375,
0.022369384765625,
0.0298309326171875,
-0.015716552734375,
0.006103515625,
0.01507568359375,
0.04266357421875,
-0.03204345703125,
-0.0080718994140625,
-0.0215301513671875,
0.0106658935546875,
-0.01201629638671875,
-0.050201416015625
]
] |
timm/rexnet_100.nav_in1k | 2023-03-20T20:35:27.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"arxiv:2007.00992",
"license:mit",
"region:us"
] | image-classification | timm | null | null | timm/rexnet_100.nav_in1k | 0 | 36,988 | timm | 2023-03-20T20:35:20 | ---
tags:
- image-classification
- timm
library_name: timm
license: mit
datasets:
- imagenet-1k
---
# Model card for rexnet_100.nav_in1k
A ReXNet image classification model. Pretrained on ImageNet-1k by paper authors.
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 4.8
- GMACs: 0.4
- Activations (M): 7.4
- Image size: 224 x 224
- **Papers:**
- Rethinking Channel Dimensions for Efficient Model Design: https://arxiv.org/abs/2007.00992
- **Original:** https://github.com/clovaai/rexnet
- **Dataset:** ImageNet-1k
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('rexnet_100.nav_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'rexnet_100.nav_in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g.:
# torch.Size([1, 16, 112, 112])
# torch.Size([1, 38, 56, 56])
# torch.Size([1, 61, 28, 28])
# torch.Size([1, 128, 14, 14])
# torch.Size([1, 185, 7, 7])
print(o.shape)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'rexnet_100.nav_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 1280, 7, 7) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
|model |top1 |top5 |param_count|img_size|crop_pct|
|-------------------------|------|------|-----------|--------|--------|
|rexnetr_300.sw_in12k_ft_in1k|84.53 |97.252|34.81 |288 |1.0 |
|rexnetr_200.sw_in12k_ft_in1k|83.164|96.648|16.52 |288 |1.0 |
|rexnet_300.nav_in1k |82.772|96.232|34.71 |224 |0.875 |
|rexnet_200.nav_in1k |81.652|95.668|16.37 |224 |0.875 |
|rexnet_150.nav_in1k |80.308|95.174|9.73 |224 |0.875 |
|rexnet_130.nav_in1k |79.478|94.68 |7.56 |224 |0.875 |
|rexnet_100.nav_in1k |77.832|93.886|4.8 |224 |0.875 |
## Citation
```bibtex
@misc{han2021rethinking,
title={Rethinking Channel Dimensions for Efficient Model Design},
author={Dongyoon Han and Sangdoo Yun and Byeongho Heo and YoungJoon Yoo},
year={2021},
eprint={2007.00992},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
| 4,572 | [
[
-0.043060302734375,
-0.02789306640625,
0.006168365478515625,
-0.0018672943115234375,
-0.032196044921875,
-0.0210723876953125,
-0.0136871337890625,
-0.0237274169921875,
0.0300445556640625,
0.0243072509765625,
-0.033203125,
-0.054901123046875,
-0.047119140625,
-0.008270263671875,
-0.003444671630859375,
0.06304931640625,
-0.0052947998046875,
-0.0028743743896484375,
-0.005138397216796875,
-0.039520263671875,
-0.01493072509765625,
-0.0211334228515625,
-0.0587158203125,
-0.0340576171875,
0.034271240234375,
0.013916015625,
0.042022705078125,
0.05322265625,
0.040618896484375,
0.035858154296875,
-0.011627197265625,
0.0088653564453125,
-0.018096923828125,
-0.0167083740234375,
0.0292510986328125,
-0.035797119140625,
-0.04022216796875,
0.01453399658203125,
0.056396484375,
0.030548095703125,
0.0076141357421875,
0.0307769775390625,
0.0077667236328125,
0.043853759765625,
-0.01116943359375,
0.003917694091796875,
-0.0214996337890625,
0.00760650634765625,
-0.00493621826171875,
-0.005523681640625,
-0.0175018310546875,
-0.03204345703125,
0.0233001708984375,
-0.04998779296875,
0.027191162109375,
0.000023305416107177734,
0.1087646484375,
0.01146697998046875,
-0.012847900390625,
-0.00024819374084472656,
-0.0183258056640625,
0.06304931640625,
-0.0654296875,
0.02056884765625,
0.0196075439453125,
0.005901336669921875,
-0.003162384033203125,
-0.07305908203125,
-0.035400390625,
-0.005584716796875,
-0.0189056396484375,
-0.00004094839096069336,
-0.0226287841796875,
-0.00010693073272705078,
0.015625,
0.02056884765625,
-0.03912353515625,
0.004451751708984375,
-0.041290283203125,
-0.0157012939453125,
0.0447998046875,
0.0010242462158203125,
0.020599365234375,
-0.032379150390625,
-0.04266357421875,
-0.03448486328125,
-0.0281982421875,
0.0223388671875,
0.026458740234375,
0.01511383056640625,
-0.04364013671875,
0.025299072265625,
0.0033702850341796875,
0.0404052734375,
0.0013551712036132812,
-0.02685546875,
0.05120849609375,
-0.00687408447265625,
-0.0345458984375,
-0.004085540771484375,
0.08697509765625,
0.03021240234375,
0.0137176513671875,
0.01270294189453125,
-0.0101470947265625,
-0.034698486328125,
-0.0025653839111328125,
-0.08856201171875,
-0.0210418701171875,
0.0268402099609375,
-0.05126953125,
-0.031158447265625,
0.0203399658203125,
-0.056304931640625,
-0.002288818359375,
-0.003643035888671875,
0.050628662109375,
-0.038848876953125,
-0.035797119140625,
0.0056304931640625,
-0.01343536376953125,
0.0216217041015625,
0.008697509765625,
-0.0323486328125,
0.00916290283203125,
0.0191802978515625,
0.0845947265625,
0.01006317138671875,
-0.034271240234375,
-0.01012420654296875,
-0.022613525390625,
-0.022857666015625,
0.03411865234375,
-0.0018320083618164062,
-0.01363372802734375,
-0.02386474609375,
0.02264404296875,
-0.01153564453125,
-0.042938232421875,
0.02386474609375,
-0.024078369140625,
0.01776123046875,
-0.01256561279296875,
-0.01544189453125,
-0.04034423828125,
0.0181121826171875,
-0.035247802734375,
0.08624267578125,
0.028106689453125,
-0.0628662109375,
0.02423095703125,
-0.0330810546875,
-0.009002685546875,
-0.0184478759765625,
-0.0005397796630859375,
-0.08599853515625,
-0.012939453125,
0.019317626953125,
0.043426513671875,
-0.02423095703125,
0.003787994384765625,
-0.03863525390625,
-0.01395416259765625,
0.0285491943359375,
-0.0017805099487304688,
0.0772705078125,
0.01947021484375,
-0.03692626953125,
0.023590087890625,
-0.0499267578125,
0.0228424072265625,
0.035247802734375,
-0.029541015625,
-0.00724029541015625,
-0.045318603515625,
0.0096588134765625,
0.0190582275390625,
0.004711151123046875,
-0.037506103515625,
0.02435302734375,
-0.01052093505859375,
0.03814697265625,
0.0540771484375,
-0.00487518310546875,
0.0275421142578125,
-0.031646728515625,
0.02508544921875,
0.017974853515625,
0.01641845703125,
-0.0025653839111328125,
-0.0400390625,
-0.0604248046875,
-0.044342041015625,
0.03594970703125,
0.023956298828125,
-0.039886474609375,
0.035369873046875,
-0.02191162109375,
-0.054168701171875,
-0.03314208984375,
0.0044097900390625,
0.0372314453125,
0.041534423828125,
0.022491455078125,
-0.03631591796875,
-0.037811279296875,
-0.0653076171875,
0.01146697998046875,
-0.0012292861938476562,
0.0027866363525390625,
0.031524658203125,
0.055267333984375,
-0.0033702850341796875,
0.04803466796875,
-0.035400390625,
-0.0186767578125,
-0.0179290771484375,
0.01296234130859375,
0.0341796875,
0.056396484375,
0.06597900390625,
-0.04083251953125,
-0.039764404296875,
-0.007610321044921875,
-0.074462890625,
0.00856781005859375,
-0.006900787353515625,
-0.0209808349609375,
0.02667236328125,
0.01097869873046875,
-0.061614990234375,
0.046722412109375,
0.01166534423828125,
-0.03125,
0.03289794921875,
-0.02203369140625,
0.02349853515625,
-0.08941650390625,
0.0098876953125,
0.022430419921875,
-0.005092620849609375,
-0.03558349609375,
0.0018739700317382812,
0.002246856689453125,
-0.005802154541015625,
-0.02862548828125,
0.044281005859375,
-0.0501708984375,
-0.013946533203125,
-0.004611968994140625,
-0.0087890625,
0.0034122467041015625,
0.05340576171875,
-0.007244110107421875,
0.0266876220703125,
0.0615234375,
-0.03021240234375,
0.03814697265625,
0.0253448486328125,
-0.0233306884765625,
0.027099609375,
-0.05499267578125,
0.0125274658203125,
0.0008864402770996094,
0.0192718505859375,
-0.08160400390625,
-0.01416778564453125,
0.026947021484375,
-0.049652099609375,
0.039947509765625,
-0.04901123046875,
-0.02471923828125,
-0.038604736328125,
-0.04229736328125,
0.0291595458984375,
0.045867919921875,
-0.04742431640625,
0.03497314453125,
0.01428985595703125,
0.0183563232421875,
-0.046417236328125,
-0.061553955078125,
-0.022613525390625,
-0.03387451171875,
-0.06561279296875,
0.032379150390625,
0.012359619140625,
0.01229095458984375,
0.01438140869140625,
-0.003658294677734375,
-0.002384185791015625,
-0.0162506103515625,
0.038970947265625,
0.033477783203125,
-0.033294677734375,
-0.007579803466796875,
-0.0238037109375,
-0.007293701171875,
0.00800323486328125,
-0.0294647216796875,
0.05120849609375,
-0.0184783935546875,
-0.011627197265625,
-0.066162109375,
-0.007354736328125,
0.049652099609375,
-0.01271820068359375,
0.06451416015625,
0.07196044921875,
-0.030548095703125,
-0.0004448890686035156,
-0.028961181640625,
-0.02349853515625,
-0.037872314453125,
0.03399658203125,
-0.0341796875,
-0.026947021484375,
0.0706787109375,
0.010040283203125,
-0.0008144378662109375,
0.046905517578125,
0.0186309814453125,
-0.0007166862487792969,
0.057861328125,
0.038970947265625,
0.01020050048828125,
0.05133056640625,
-0.082275390625,
-0.004817962646484375,
-0.061798095703125,
-0.038238525390625,
-0.023956298828125,
-0.0477294921875,
-0.054046630859375,
-0.0261383056640625,
0.034881591796875,
0.0132293701171875,
-0.032470703125,
0.033782958984375,
-0.06109619140625,
-0.0015840530395507812,
0.057220458984375,
0.03948974609375,
-0.0224609375,
0.0270233154296875,
-0.0293731689453125,
0.00022113323211669922,
-0.058837890625,
-0.01422882080078125,
0.08447265625,
0.032745361328125,
0.05133056640625,
-0.0027923583984375,
0.05303955078125,
-0.021575927734375,
0.01119232177734375,
-0.0438232421875,
0.04547119140625,
0.005634307861328125,
-0.039794921875,
-0.0156402587890625,
-0.037384033203125,
-0.0804443359375,
0.01468658447265625,
-0.018463134765625,
-0.05133056640625,
0.0216064453125,
0.00897216796875,
-0.0233001708984375,
0.064697265625,
-0.05999755859375,
0.06878662109375,
-0.01200103759765625,
-0.03631591796875,
0.002193450927734375,
-0.04638671875,
0.01328277587890625,
0.01885986328125,
-0.01189422607421875,
-0.0038166046142578125,
0.01360321044921875,
0.07958984375,
-0.052978515625,
0.05340576171875,
-0.02960205078125,
0.03125,
0.042083740234375,
-0.006999969482421875,
0.0285491943359375,
-0.0005397796630859375,
-0.005046844482421875,
0.01861572265625,
0.001087188720703125,
-0.033477783203125,
-0.030517578125,
0.05255126953125,
-0.082763671875,
-0.0261383056640625,
-0.03814697265625,
-0.037353515625,
0.0202484130859375,
0.01195526123046875,
0.04241943359375,
0.055145263671875,
0.0197906494140625,
0.0236663818359375,
0.045989990234375,
-0.0345458984375,
0.036224365234375,
0.0019989013671875,
-0.015960693359375,
-0.04815673828125,
0.0684814453125,
0.01800537109375,
0.01377105712890625,
0.00473785400390625,
0.012451171875,
-0.0236663818359375,
-0.0426025390625,
-0.032012939453125,
0.03887939453125,
-0.052978515625,
-0.036895751953125,
-0.038726806640625,
-0.03900146484375,
-0.0345458984375,
-0.007511138916015625,
-0.0377197265625,
-0.0261077880859375,
-0.031890869140625,
0.01232147216796875,
0.055450439453125,
0.047454833984375,
-0.0189056396484375,
0.03387451171875,
-0.036773681640625,
0.003063201904296875,
0.0167999267578125,
0.037200927734375,
-0.002948760986328125,
-0.071044921875,
-0.01543426513671875,
-0.00807952880859375,
-0.03204345703125,
-0.0589599609375,
0.041839599609375,
0.0011720657348632812,
0.0452880859375,
0.0262908935546875,
-0.01605224609375,
0.056488037109375,
0.0026531219482421875,
0.037200927734375,
0.025604248046875,
-0.0465087890625,
0.046478271484375,
-0.01140594482421875,
0.01727294921875,
0.0014162063598632812,
0.0300750732421875,
-0.020843505859375,
0.0022430419921875,
-0.0709228515625,
-0.0595703125,
0.07012939453125,
0.016021728515625,
-0.009429931640625,
0.0300445556640625,
0.0472412109375,
-0.0124969482421875,
-0.0027484893798828125,
-0.0595703125,
-0.04278564453125,
-0.0299530029296875,
-0.0159149169921875,
0.0008425712585449219,
-0.0100860595703125,
-0.004833221435546875,
-0.05206298828125,
0.05621337890625,
0.00015544891357421875,
0.06243896484375,
0.0290069580078125,
-0.00023484230041503906,
-0.00980377197265625,
-0.026153564453125,
0.042724609375,
0.0367431640625,
-0.0301361083984375,
0.008331298828125,
0.01248931884765625,
-0.04290771484375,
0.006130218505859375,
0.014892578125,
-0.00316619873046875,
0.00714111328125,
0.03240966796875,
0.06402587890625,
-0.0021152496337890625,
0.00583648681640625,
0.031158447265625,
-0.0023250579833984375,
-0.03875732421875,
-0.0177459716796875,
0.0007691383361816406,
0.000015854835510253906,
0.02978515625,
0.027435302734375,
0.0214996337890625,
-0.0177154541015625,
-0.0185089111328125,
0.0181121826171875,
0.04400634765625,
-0.0210418701171875,
-0.026580810546875,
0.05230712890625,
-0.0144195556640625,
-0.00844573974609375,
0.05780029296875,
-0.008392333984375,
-0.038604736328125,
0.07904052734375,
0.0276031494140625,
0.06390380859375,
-0.00856781005859375,
0.01026153564453125,
0.0726318359375,
0.0216522216796875,
0.0092926025390625,
0.0126190185546875,
0.020660400390625,
-0.05126953125,
0.00550079345703125,
-0.0419921875,
0.01557159423828125,
0.0380859375,
-0.04168701171875,
0.031585693359375,
-0.04864501953125,
-0.034210205078125,
0.02069091796875,
0.027496337890625,
-0.056976318359375,
0.01409912109375,
-0.0007543563842773438,
0.06085205078125,
-0.05438232421875,
0.0655517578125,
0.06427001953125,
-0.04486083984375,
-0.080810546875,
-0.0139007568359375,
0.007266998291015625,
-0.06512451171875,
0.0440673828125,
0.02508544921875,
0.00457000732421875,
0.00432586669921875,
-0.053741455078125,
-0.06451416015625,
0.11395263671875,
0.031951904296875,
-0.0001900196075439453,
0.0196685791015625,
0.004505157470703125,
0.0194091796875,
-0.03326416015625,
0.0362548828125,
0.0110015869140625,
0.027923583984375,
0.0223541259765625,
-0.050811767578125,
0.0206451416015625,
-0.0268707275390625,
0.0102081298828125,
0.0194854736328125,
-0.06427001953125,
0.0758056640625,
-0.03125,
-0.0086517333984375,
0.007511138916015625,
0.042999267578125,
0.0218505859375,
0.00504302978515625,
0.039947509765625,
0.0633544921875,
0.0364990234375,
-0.030120849609375,
0.06390380859375,
-0.007488250732421875,
0.053375244140625,
0.036468505859375,
0.036041259765625,
0.03411865234375,
0.02508544921875,
-0.0180511474609375,
0.036102294921875,
0.07696533203125,
-0.0219573974609375,
0.030120849609375,
0.01415252685546875,
-0.00183868408203125,
-0.0089263916015625,
0.0103912353515625,
-0.040985107421875,
0.0316162109375,
0.00890350341796875,
-0.038787841796875,
-0.013397216796875,
0.0012826919555664062,
0.007442474365234375,
-0.0290679931640625,
-0.0110321044921875,
0.0289154052734375,
-0.0047149658203125,
-0.032135009765625,
0.073486328125,
0.007312774658203125,
0.05938720703125,
-0.0345458984375,
0.006622314453125,
-0.0253143310546875,
0.0226898193359375,
-0.0221710205078125,
-0.066162109375,
0.0098876953125,
-0.01873779296875,
-0.0012006759643554688,
-0.0027332305908203125,
0.0521240234375,
-0.021759033203125,
-0.03399658203125,
0.0215301513671875,
0.0208282470703125,
0.033203125,
0.008026123046875,
-0.0831298828125,
0.0184326171875,
0.0100250244140625,
-0.052764892578125,
0.0193939208984375,
0.034759521484375,
0.0164031982421875,
0.05804443359375,
0.042572021484375,
-0.00641632080078125,
0.0189666748046875,
-0.0180206298828125,
0.0699462890625,
-0.04351806640625,
-0.0230712890625,
-0.058563232421875,
0.0499267578125,
-0.01277923583984375,
-0.053741455078125,
0.039306640625,
0.049163818359375,
0.06231689453125,
-0.00014317035675048828,
0.03265380859375,
-0.0229949951171875,
0.00353240966796875,
-0.0316162109375,
0.05804443359375,
-0.05584716796875,
0.005924224853515625,
-0.0133056640625,
-0.0494384765625,
-0.0165863037109375,
0.05615234375,
-0.02117919921875,
0.035186767578125,
0.042724609375,
0.07977294921875,
-0.025634765625,
-0.0253143310546875,
0.0073699951171875,
0.016387939453125,
0.0088958740234375,
0.037353515625,
0.038665771484375,
-0.06353759765625,
0.04168701171875,
-0.05462646484375,
-0.011016845703125,
-0.0168304443359375,
-0.04693603515625,
-0.0782470703125,
-0.06756591796875,
-0.043701171875,
-0.05230712890625,
-0.0177154541015625,
0.06268310546875,
0.0880126953125,
-0.04998779296875,
-0.0166015625,
0.007083892822265625,
0.0198822021484375,
-0.0158233642578125,
-0.0185394287109375,
0.050750732421875,
-0.006862640380859375,
-0.054443359375,
-0.024993896484375,
-0.0034809112548828125,
0.03399658203125,
-0.002521514892578125,
-0.0221099853515625,
-0.0212860107421875,
-0.018157958984375,
0.015838623046875,
0.02325439453125,
-0.053009033203125,
-0.0168304443359375,
-0.01171112060546875,
-0.0180511474609375,
0.033660888671875,
0.0291900634765625,
-0.039581298828125,
0.0171051025390625,
0.04150390625,
0.01453399658203125,
0.06982421875,
-0.02325439453125,
0.00028443336486816406,
-0.05902099609375,
0.039825439453125,
-0.0176849365234375,
0.03094482421875,
0.0244903564453125,
-0.0215301513671875,
0.03924560546875,
0.035980224609375,
-0.039306640625,
-0.0660400390625,
-0.01291656494140625,
-0.0859375,
-0.0183258056640625,
0.06573486328125,
-0.0252532958984375,
-0.03662109375,
0.0321044921875,
0.0002123117446899414,
0.043121337890625,
-0.0120086669921875,
0.0347900390625,
0.0222320556640625,
-0.007091522216796875,
-0.04595947265625,
-0.047119140625,
0.033538818359375,
0.015380859375,
-0.047454833984375,
-0.037017822265625,
0.00246429443359375,
0.058624267578125,
0.0272216796875,
0.03143310546875,
-0.00847625732421875,
0.0091400146484375,
0.00763702392578125,
0.032257080078125,
-0.02838134765625,
-0.0092010498046875,
-0.019744873046875,
-0.0005207061767578125,
-0.0088653564453125,
-0.04718017578125
]
] |
timm/spnasnet_100.rmsp_in1k | 2023-04-27T21:14:40.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"arxiv:1904.02877",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/spnasnet_100.rmsp_in1k | 0 | 36,967 | timm | 2022-12-13T00:01:11 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
datasets:
- imagenet-1k
---
# Model card for spnasnet_100.rmsp_in1k
A Single-Path NAS (SPNASNet) image classification model. Trained on ImageNet-1k in `timm` using the recipe template described below.
Recipe details:
* A simple RMSProp based recipe without RandAugment, using RandomErasing, mixup, dropout, and standard random-resize-crop augmentation.
* RMSProp (TF 1.0 behaviour) optimizer, EMA weight averaging
* Step (exponential decay w/ staircase) LR schedule with warmup
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 4.4
- GMACs: 0.3
- Activations (M): 6.0
- Image size: 224 x 224
- **Papers:**
- Single-Path NAS: Designing Hardware-Efficient ConvNets in less than 4 Hours: https://arxiv.org/abs/1904.02877
- **Dataset:** ImageNet-1k
- **Original:** https://github.com/huggingface/pytorch-image-models
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('spnasnet_100.rmsp_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'spnasnet_100.rmsp_in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g.:
# torch.Size([1, 16, 112, 112])
# torch.Size([1, 24, 56, 56])
# torch.Size([1, 40, 28, 28])
# torch.Size([1, 96, 14, 14])
# torch.Size([1, 320, 7, 7])
print(o.shape)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'spnasnet_100.rmsp_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 1280, 7, 7) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
```bibtex
@inproceedings{stamoulis2020single,
title={Single-path nas: Designing hardware-efficient convnets in less than 4 hours},
author={Stamoulis, Dimitrios and Ding, Ruizhou and Wang, Di and Lymberopoulos, Dimitrios and Priyantha, Bodhi and Liu, Jie and Marculescu, Diana},
  booktitle={Machine Learning and Knowledge Discovery in Databases: European Conference, ECML PKDD 2019, W{\"u}rzburg, Germany, September 16--20, 2019, Proceedings, Part II},
pages={481--497},
year={2020},
organization={Springer}
}
```
| 4,528 | [
[
-0.036773681640625,
-0.033935546875,
0.002532958984375,
0.0113067626953125,
-0.0267486572265625,
-0.0282440185546875,
-0.01605224609375,
-0.021209716796875,
0.035797119140625,
0.037139892578125,
-0.04571533203125,
-0.052764892578125,
-0.05560302734375,
-0.0102081298828125,
-0.01165771484375,
0.07952880859375,
-0.00341033935546875,
-0.001636505126953125,
-0.0189361572265625,
-0.0423583984375,
-0.01142120361328125,
-0.0305328369140625,
-0.06640625,
-0.0191802978515625,
0.0252227783203125,
0.0157470703125,
0.03460693359375,
0.048126220703125,
0.04736328125,
0.033721923828125,
-0.016815185546875,
0.017486572265625,
-0.0164947509765625,
-0.005420684814453125,
0.0290679931640625,
-0.03753662109375,
-0.04052734375,
0.01039886474609375,
0.05328369140625,
0.0333251953125,
0.0010385513305664062,
0.0380859375,
0.004779815673828125,
0.047637939453125,
-0.01520538330078125,
0.00576019287109375,
-0.03411865234375,
0.027801513671875,
-0.0184173583984375,
-0.0079345703125,
-0.024871826171875,
-0.0286102294921875,
0.018463134765625,
-0.040924072265625,
0.036163330078125,
0.000732421875,
0.09930419921875,
0.021331787109375,
-0.00691986083984375,
-0.01468658447265625,
-0.0165863037109375,
0.057861328125,
-0.062164306640625,
0.012603759765625,
0.0183868408203125,
0.0166778564453125,
-0.0063323974609375,
-0.07611083984375,
-0.037506103515625,
-0.01090240478515625,
-0.011383056640625,
-0.00833892822265625,
-0.011627197265625,
0.00152587890625,
0.027099609375,
0.023040771484375,
-0.03643798828125,
-0.0010128021240234375,
-0.0443115234375,
-0.012969970703125,
0.043731689453125,
-0.0036907196044921875,
0.0208892822265625,
-0.0181884765625,
-0.045806884765625,
-0.02471923828125,
-0.032135009765625,
0.01528167724609375,
0.01776123046875,
0.0308837890625,
-0.035980224609375,
0.031036376953125,
0.01690673828125,
0.04241943359375,
0.004100799560546875,
-0.02459716796875,
0.042755126953125,
0.003925323486328125,
-0.03460693359375,
-0.00555419921875,
0.0762939453125,
0.0237884521484375,
0.00820159912109375,
0.00551605224609375,
-0.01540374755859375,
-0.027008056640625,
-0.000011444091796875,
-0.0848388671875,
-0.027008056640625,
0.0272674560546875,
-0.0443115234375,
-0.03271484375,
0.0205230712890625,
-0.043731689453125,
-0.0156097412109375,
0.00024116039276123047,
0.05133056640625,
-0.03955078125,
-0.02972412109375,
0.00986480712890625,
-0.01922607421875,
0.0094146728515625,
0.007610321044921875,
-0.033050537109375,
0.02557373046875,
0.0268402099609375,
0.0904541015625,
-0.003116607666015625,
-0.0399169921875,
-0.023223876953125,
-0.02618408203125,
-0.022979736328125,
0.032867431640625,
0.01068115234375,
-0.00783538818359375,
-0.029571533203125,
0.0263824462890625,
-0.010528564453125,
-0.049957275390625,
0.0214385986328125,
-0.0160064697265625,
0.016998291015625,
-0.01128387451171875,
-0.027374267578125,
-0.035888671875,
0.0157470703125,
-0.039154052734375,
0.09954833984375,
0.0183868408203125,
-0.065185546875,
0.02642822265625,
-0.052734375,
-0.0183563232421875,
-0.0146942138671875,
-0.0041656494140625,
-0.08697509765625,
-0.006626129150390625,
0.009063720703125,
0.058929443359375,
-0.015167236328125,
0.00914764404296875,
-0.032440185546875,
-0.0162200927734375,
0.024261474609375,
-0.00010585784912109375,
0.08209228515625,
0.0180206298828125,
-0.03643798828125,
0.0218505859375,
-0.044586181640625,
0.0110015869140625,
0.045257568359375,
-0.014129638671875,
0.0030155181884765625,
-0.03955078125,
0.006866455078125,
0.02703857421875,
0.00662994384765625,
-0.043243408203125,
0.0157470703125,
-0.01177215576171875,
0.0242156982421875,
0.057342529296875,
0.0003466606140136719,
0.0286712646484375,
-0.03271484375,
0.0268096923828125,
0.0294342041015625,
0.02117919921875,
-0.006359100341796875,
-0.042205810546875,
-0.0689697265625,
-0.041595458984375,
0.0241546630859375,
0.0191192626953125,
-0.03790283203125,
0.035308837890625,
-0.01204681396484375,
-0.0599365234375,
-0.03790283203125,
0.00453948974609375,
0.036590576171875,
0.040802001953125,
0.024871826171875,
-0.04541015625,
-0.04522705078125,
-0.06396484375,
0.00431060791015625,
0.01605224609375,
0.0023975372314453125,
0.0295562744140625,
0.04913330078125,
0.0015573501586914062,
0.048553466796875,
-0.036468505859375,
-0.021026611328125,
-0.0233154296875,
-0.0006561279296875,
0.034423828125,
0.058349609375,
0.06488037109375,
-0.04974365234375,
-0.04052734375,
-0.01383209228515625,
-0.07012939453125,
0.013519287109375,
-0.01111602783203125,
-0.013092041015625,
0.0286712646484375,
0.009246826171875,
-0.052581787109375,
0.045562744140625,
0.01422882080078125,
-0.028411865234375,
0.03656005859375,
-0.020355224609375,
0.0137939453125,
-0.0865478515625,
0.007030487060546875,
0.035125732421875,
-0.007144927978515625,
-0.0418701171875,
0.01082611083984375,
0.0004878044128417969,
-0.0172576904296875,
-0.040069580078125,
0.0440673828125,
-0.044219970703125,
-0.02288818359375,
-0.0022792816162109375,
-0.0189971923828125,
0.00533294677734375,
0.05804443359375,
-0.0025386810302734375,
0.0311279296875,
0.06866455078125,
-0.048797607421875,
0.033050537109375,
0.0239410400390625,
-0.01198577880859375,
0.0274658203125,
-0.050140380859375,
0.0140380859375,
-0.005664825439453125,
0.0205230712890625,
-0.06689453125,
-0.00807952880859375,
0.0276031494140625,
-0.04443359375,
0.04779052734375,
-0.041839599609375,
-0.034942626953125,
-0.03765869140625,
-0.0305633544921875,
0.01751708984375,
0.039276123046875,
-0.05169677734375,
0.038909912109375,
0.027862548828125,
0.018035888671875,
-0.03765869140625,
-0.06329345703125,
-0.017120361328125,
-0.03173828125,
-0.053497314453125,
0.025054931640625,
0.006256103515625,
0.01364898681640625,
0.004650115966796875,
-0.0134124755859375,
-0.0038547515869140625,
-0.004474639892578125,
0.04595947265625,
0.0307464599609375,
-0.0283966064453125,
-0.00417327880859375,
-0.0253448486328125,
-0.00739288330078125,
-0.0006666183471679688,
-0.033355712890625,
0.04974365234375,
-0.0226898193359375,
-0.0190277099609375,
-0.0665283203125,
-0.0006990432739257812,
0.041229248046875,
-0.01090240478515625,
0.0687255859375,
0.06781005859375,
-0.03955078125,
-0.015777587890625,
-0.0230865478515625,
-0.02703857421875,
-0.036773681640625,
0.0355224609375,
-0.0372314453125,
-0.0201568603515625,
0.054534912109375,
-0.00014138221740722656,
0.00811004638671875,
0.048583984375,
0.019989013671875,
-0.00812530517578125,
0.050384521484375,
0.044464111328125,
0.0132904052734375,
0.046630859375,
-0.07568359375,
-0.00775146484375,
-0.052642822265625,
-0.033477783203125,
-0.02880859375,
-0.043670654296875,
-0.045623779296875,
-0.0290679931640625,
0.0382080078125,
0.0003864765167236328,
-0.035064697265625,
0.043701171875,
-0.072998046875,
0.008148193359375,
0.052703857421875,
0.04541015625,
-0.0308380126953125,
0.0205230712890625,
-0.0207061767578125,
-0.0032939910888671875,
-0.0712890625,
-0.0157318115234375,
0.0787353515625,
0.043914794921875,
0.037994384765625,
-0.01511383056640625,
0.0584716796875,
-0.0161590576171875,
0.0188446044921875,
-0.0433349609375,
0.0523681640625,
-0.0147857666015625,
-0.035125732421875,
-0.01079559326171875,
-0.035186767578125,
-0.06805419921875,
0.00853729248046875,
-0.023406982421875,
-0.04449462890625,
0.00585174560546875,
0.0048065185546875,
-0.0159454345703125,
0.0675048828125,
-0.061767578125,
0.0694580078125,
-0.017669677734375,
-0.0219573974609375,
0.01264190673828125,
-0.056121826171875,
0.01380157470703125,
0.0222320556640625,
-0.01490020751953125,
-0.00394439697265625,
0.021728515625,
0.08355712890625,
-0.04815673828125,
0.05670166015625,
-0.046630859375,
0.033203125,
0.0291748046875,
-0.00206756591796875,
0.024932861328125,
-0.0194549560546875,
-0.01479339599609375,
0.0241546630859375,
0.00717926025390625,
-0.041534423828125,
-0.038848876953125,
0.041717529296875,
-0.069091796875,
-0.015472412109375,
-0.03582763671875,
-0.037811279296875,
0.01482391357421875,
0.005558013916015625,
0.033905029296875,
0.05230712890625,
0.0219268798828125,
0.018707275390625,
0.05401611328125,
-0.03692626953125,
0.0209197998046875,
-0.004444122314453125,
-0.018951416015625,
-0.030792236328125,
0.0689697265625,
0.0219573974609375,
0.01387786865234375,
0.0013399124145507812,
0.0152435302734375,
-0.016998291015625,
-0.038238525390625,
-0.0290679931640625,
0.029388427734375,
-0.0560302734375,
-0.03839111328125,
-0.0457763671875,
-0.040863037109375,
-0.0299530029296875,
-0.004169464111328125,
-0.043304443359375,
-0.033538818359375,
-0.028778076171875,
0.011016845703125,
0.056488037109375,
0.04608154296875,
-0.00850677490234375,
0.038604736328125,
-0.032867431640625,
0.01114654541015625,
-0.0024662017822265625,
0.0384521484375,
-0.0036563873291015625,
-0.06414794921875,
-0.0177154541015625,
-0.00760650634765625,
-0.0289154052734375,
-0.05560302734375,
0.0423583984375,
0.017333984375,
0.037994384765625,
0.0268402099609375,
-0.01422882080078125,
0.06005859375,
-0.0015010833740234375,
0.0421142578125,
0.0253143310546875,
-0.04437255859375,
0.053802490234375,
0.005558013916015625,
0.0091552734375,
0.0086517333984375,
0.036041259765625,
-0.02740478515625,
-0.004047393798828125,
-0.0767822265625,
-0.0577392578125,
0.0703125,
0.005878448486328125,
-0.0096588134765625,
0.023681640625,
0.0487060546875,
0.0061798095703125,
0.005344390869140625,
-0.053009033203125,
-0.047271728515625,
-0.0254364013671875,
-0.017059326171875,
0.004062652587890625,
-0.0022869110107421875,
-0.01067352294921875,
-0.04766845703125,
0.06463623046875,
-0.0014200210571289062,
0.05670166015625,
0.022247314453125,
-0.0021514892578125,
-0.00884246826171875,
-0.037628173828125,
0.034271240234375,
0.0238494873046875,
-0.0294342041015625,
0.0083770751953125,
0.0173797607421875,
-0.038604736328125,
-0.00243377685546875,
0.00891876220703125,
-0.0036716461181640625,
0.015594482421875,
0.021392822265625,
0.0745849609375,
0.0033397674560546875,
0.005809783935546875,
0.03173828125,
-0.006183624267578125,
-0.029266357421875,
-0.0157928466796875,
0.006359100341796875,
0.00026798248291015625,
0.026153564453125,
0.032318115234375,
0.03497314453125,
-0.0197601318359375,
-0.022369384765625,
0.029022216796875,
0.043182373046875,
-0.0173797607421875,
-0.0193328857421875,
0.04193115234375,
-0.02276611328125,
-0.0213470458984375,
0.056060791015625,
-0.01288604736328125,
-0.044281005859375,
0.0828857421875,
0.037139892578125,
0.0601806640625,
0.0014829635620117188,
0.007572174072265625,
0.059844970703125,
0.0160064697265625,
0.004302978515625,
0.0111236572265625,
0.003643035888671875,
-0.052581787109375,
0.0155181884765625,
-0.04962158203125,
-0.0004794597625732422,
0.019744873046875,
-0.04443359375,
0.026153564453125,
-0.056243896484375,
-0.026153564453125,
0.01800537109375,
0.035064697265625,
-0.071044921875,
0.021636962890625,
-0.006427764892578125,
0.071044921875,
-0.05950927734375,
0.05810546875,
0.05950927734375,
-0.045440673828125,
-0.075439453125,
-0.01227569580078125,
0.00859832763671875,
-0.06439208984375,
0.0421142578125,
0.027618408203125,
0.01197052001953125,
-0.00028634071350097656,
-0.059295654296875,
-0.048828125,
0.1065673828125,
0.036102294921875,
0.00315093994140625,
0.01561737060546875,
-0.00408935546875,
0.0150909423828125,
-0.03289794921875,
0.02618408203125,
0.0121917724609375,
0.022674560546875,
0.0271453857421875,
-0.04986572265625,
0.0212860107421875,
-0.0237579345703125,
0.00582122802734375,
0.01593017578125,
-0.0518798828125,
0.06781005859375,
-0.03106689453125,
-0.0228729248046875,
0.0016651153564453125,
0.05517578125,
0.032989501953125,
0.021575927734375,
0.03814697265625,
0.061981201171875,
0.04949951171875,
-0.016876220703125,
0.0687255859375,
-0.007152557373046875,
0.049896240234375,
0.04925537109375,
0.0238800048828125,
0.038909912109375,
0.033905029296875,
-0.0250091552734375,
0.02215576171875,
0.08135986328125,
-0.020538330078125,
0.02447509765625,
0.017669677734375,
-0.004180908203125,
0.0036468505859375,
0.0101776123046875,
-0.031219482421875,
0.0295257568359375,
0.015380859375,
-0.03948974609375,
-0.0149383544921875,
0.004886627197265625,
0.0019369125366210938,
-0.0247802734375,
-0.0132598876953125,
0.03802490234375,
0.0015878677368164062,
-0.03009033203125,
0.059814453125,
0.006221771240234375,
0.07354736328125,
-0.0404052734375,
0.0081329345703125,
-0.0222015380859375,
0.0112457275390625,
-0.030029296875,
-0.060577392578125,
0.0253753662109375,
-0.022674560546875,
0.0014629364013671875,
-0.008148193359375,
0.05401611328125,
-0.0232391357421875,
-0.035919189453125,
0.00860595703125,
0.0231170654296875,
0.049163818359375,
0.00524139404296875,
-0.09088134765625,
0.01416015625,
0.0028362274169921875,
-0.042999267578125,
0.02880859375,
0.0384521484375,
0.018890380859375,
0.06085205078125,
0.046875,
0.00984954833984375,
0.01293182373046875,
-0.0183258056640625,
0.06390380859375,
-0.032867431640625,
-0.0226593017578125,
-0.0506591796875,
0.042022705078125,
-0.01102447509765625,
-0.04400634765625,
0.04736328125,
0.046630859375,
0.061737060546875,
-0.00498199462890625,
0.0294647216796875,
-0.0231170654296875,
0.007808685302734375,
-0.027496337890625,
0.0633544921875,
-0.054718017578125,
0.002544403076171875,
-0.01247406005859375,
-0.043121337890625,
-0.0261688232421875,
0.056243896484375,
-0.013519287109375,
0.0269012451171875,
0.036834716796875,
0.08636474609375,
-0.030120849609375,
-0.0206298828125,
0.01207733154296875,
0.005252838134765625,
0.0005893707275390625,
0.023834228515625,
0.0206298828125,
-0.061798095703125,
0.017608642578125,
-0.05419921875,
-0.01107025146484375,
-0.00672149658203125,
-0.050811767578125,
-0.061981201171875,
-0.08123779296875,
-0.03961181640625,
-0.055084228515625,
-0.00919342041015625,
0.0782470703125,
0.08734130859375,
-0.053680419921875,
-0.01479339599609375,
0.002338409423828125,
0.01276397705078125,
-0.022369384765625,
-0.0193939208984375,
0.05230712890625,
-0.01502227783203125,
-0.045135498046875,
-0.017303466796875,
0.00669097900390625,
0.024200439453125,
-0.00731658935546875,
-0.0224609375,
-0.023712158203125,
-0.016510009765625,
0.0214996337890625,
0.0250091552734375,
-0.041839599609375,
-0.0028781890869140625,
-0.014007568359375,
-0.01020050048828125,
0.0304107666015625,
0.03631591796875,
-0.037689208984375,
0.020416259765625,
0.021087646484375,
0.03289794921875,
0.072265625,
-0.02703857421875,
0.0026264190673828125,
-0.07647705078125,
0.039886474609375,
-0.0023174285888671875,
0.03363037109375,
0.032012939453125,
-0.0306243896484375,
0.0416259765625,
0.0308837890625,
-0.034576416015625,
-0.07568359375,
-0.006839752197265625,
-0.08013916015625,
-0.0157470703125,
0.0648193359375,
-0.030487060546875,
-0.043121337890625,
0.0380859375,
0.0002703666687011719,
0.05230712890625,
-0.0038738250732421875,
0.0401611328125,
0.0174407958984375,
-0.0110321044921875,
-0.049346923828125,
-0.042724609375,
0.038360595703125,
0.0183868408203125,
-0.04986572265625,
-0.03619384765625,
-0.002338409423828125,
0.050628662109375,
0.0175628662109375,
0.040802001953125,
-0.0097808837890625,
0.01229095458984375,
0.0096282958984375,
0.033660888671875,
-0.0330810546875,
-0.017669677734375,
-0.02740478515625,
0.00815582275390625,
-0.011444091796875,
-0.047210693359375
]
] |
timm/resnest101e.in1k | 2023-04-23T23:37:24.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"arxiv:2004.08955",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/resnest101e.in1k | 0 | 36,948 | timm | 2023-04-23T23:36:40 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
datasets:
- imagenet-1k
---
# Model card for resnest101e.in1k
A ResNeSt (ResNet-based architecture with Split-Attention) image classification model. Trained on ImageNet-1k by paper authors.
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 48.3
- GMACs: 13.4
- Activations (M): 28.7
- Image size: 256 x 256
- **Papers:**
- ResNeSt: Split-Attention Networks: https://arxiv.org/abs/2004.08955
- **Dataset:** ImageNet-1k
- **Original:** https://github.com/zhanghang1989/ResNeSt
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('resnest101e.in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'resnest101e.in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g.:
# torch.Size([1, 128, 128, 128])
# torch.Size([1, 256, 64, 64])
# torch.Size([1, 512, 32, 32])
# torch.Size([1, 1024, 16, 16])
# torch.Size([1, 2048, 8, 8])
print(o.shape)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'resnest101e.in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 2048, 8, 8) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@article{zhang2020resnest,
title={ResNeSt: Split-Attention Networks},
author={Zhang, Hang and Wu, Chongruo and Zhang, Zhongyue and Zhu, Yi and Zhang, Zhi and Lin, Haibin and Sun, Yue and He, Tong and Muller, Jonas and Manmatha, R. and Li, Mu and Smola, Alexander},
journal={arXiv preprint arXiv:2004.08955},
year={2020}
}
```
| 3,739 | [
[
-0.040557861328125,
-0.035797119140625,
0.00794219970703125,
0.01453399658203125,
-0.025726318359375,
-0.026123046875,
-0.0203704833984375,
-0.0216522216796875,
0.031646728515625,
0.03619384765625,
-0.048919677734375,
-0.04486083984375,
-0.052337646484375,
-0.0021800994873046875,
-0.00817108154296875,
0.07281494140625,
-0.006072998046875,
-0.00479888916015625,
-0.01727294921875,
-0.04193115234375,
-0.00853729248046875,
-0.0218505859375,
-0.06744384765625,
-0.023712158203125,
0.029510498046875,
0.0153045654296875,
0.0292510986328125,
0.042266845703125,
0.050811767578125,
0.036712646484375,
-0.0116119384765625,
0.0018672943115234375,
-0.0205841064453125,
-0.01061248779296875,
0.026214599609375,
-0.043670654296875,
-0.032501220703125,
0.0244140625,
0.05743408203125,
0.0269012451171875,
0.00396728515625,
0.0394287109375,
0.0117950439453125,
0.0496826171875,
-0.01531219482421875,
0.00865936279296875,
-0.022552490234375,
0.0162200927734375,
-0.0078582763671875,
0.01399993896484375,
-0.0238189697265625,
-0.040496826171875,
0.0201416015625,
-0.03472900390625,
0.0325927734375,
-0.0006031990051269531,
0.093994140625,
0.02740478515625,
0.00519561767578125,
0.004306793212890625,
-0.00856781005859375,
0.061431884765625,
-0.06390380859375,
0.02374267578125,
0.0233001708984375,
0.0082550048828125,
-0.0021724700927734375,
-0.08001708984375,
-0.040313720703125,
-0.0068511962890625,
-0.0190887451171875,
-0.0012416839599609375,
-0.007389068603515625,
-0.00026798248291015625,
0.02490234375,
0.0254364013671875,
-0.030975341796875,
-0.004787445068359375,
-0.042572021484375,
-0.01062774658203125,
0.038177490234375,
0.00209808349609375,
0.0263214111328125,
-0.017120361328125,
-0.039947509765625,
-0.027374267578125,
-0.0188140869140625,
0.0177764892578125,
0.01555633544921875,
0.02008056640625,
-0.043670654296875,
0.02203369140625,
0.0129241943359375,
0.04180908203125,
0.01262664794921875,
-0.0289459228515625,
0.045806884765625,
0.00206756591796875,
-0.04302978515625,
-0.0047607421875,
0.08245849609375,
0.018707275390625,
0.0217742919921875,
0.007007598876953125,
-0.003387451171875,
-0.023590087890625,
0.0011539459228515625,
-0.08587646484375,
-0.0258026123046875,
0.0279998779296875,
-0.046051025390625,
-0.035552978515625,
0.0164642333984375,
-0.0491943359375,
-0.00688934326171875,
-0.0011758804321289062,
0.03802490234375,
-0.038330078125,
-0.03173828125,
0.007904052734375,
-0.00617218017578125,
0.0186767578125,
0.007152557373046875,
-0.039215087890625,
0.01541900634765625,
0.0204010009765625,
0.08197021484375,
0.0016193389892578125,
-0.03875732421875,
-0.01169586181640625,
-0.03070068359375,
-0.01070404052734375,
0.03759765625,
-0.005889892578125,
0.0036029815673828125,
-0.027984619140625,
0.031768798828125,
-0.0253753662109375,
-0.052001953125,
0.0205078125,
-0.01751708984375,
0.035247802734375,
-0.0024967193603515625,
-0.022186279296875,
-0.043243408203125,
0.03167724609375,
-0.03936767578125,
0.0877685546875,
0.027618408203125,
-0.072998046875,
0.0195159912109375,
-0.043121337890625,
-0.012115478515625,
-0.0230560302734375,
0.0006947517395019531,
-0.08795166015625,
-0.00659942626953125,
0.018951416015625,
0.046142578125,
-0.018524169921875,
-0.0009427070617675781,
-0.0404052734375,
-0.00905609130859375,
0.0270843505859375,
-0.007472991943359375,
0.07220458984375,
0.00571441650390625,
-0.0284423828125,
0.020599365234375,
-0.0433349609375,
0.0230865478515625,
0.039947509765625,
-0.0262451171875,
-0.0004897117614746094,
-0.04742431640625,
0.014892578125,
0.02484130859375,
0.00933837890625,
-0.049102783203125,
0.019561767578125,
-0.0181732177734375,
0.037994384765625,
0.048431396484375,
-0.0031604766845703125,
0.01904296875,
-0.0252838134765625,
0.0275115966796875,
0.022430419921875,
0.0182647705078125,
-0.0085296630859375,
-0.036102294921875,
-0.060821533203125,
-0.0399169921875,
0.0364990234375,
0.0289764404296875,
-0.0313720703125,
0.036529541015625,
-0.0097198486328125,
-0.046051025390625,
-0.051849365234375,
0.000690460205078125,
0.040069580078125,
0.042236328125,
0.0252838134765625,
-0.037384033203125,
-0.0369873046875,
-0.07122802734375,
0.003711700439453125,
0.00679779052734375,
0.006229400634765625,
0.0269012451171875,
0.0408935546875,
-0.00428009033203125,
0.053741455078125,
-0.0360107421875,
-0.02288818359375,
-0.021820068359375,
-0.004909515380859375,
0.0294036865234375,
0.0550537109375,
0.0577392578125,
-0.0491943359375,
-0.038299560546875,
-0.018829345703125,
-0.07159423828125,
0.01483917236328125,
-0.00855255126953125,
-0.017333984375,
0.0190887451171875,
0.006866455078125,
-0.045318603515625,
0.045379638671875,
0.01250457763671875,
-0.024566650390625,
0.03204345703125,
-0.013336181640625,
0.0218048095703125,
-0.09100341796875,
0.003253936767578125,
0.029022216796875,
-0.01149749755859375,
-0.03521728515625,
0.00289154052734375,
0.004100799560546875,
-0.0027866363525390625,
-0.04193115234375,
0.04156494140625,
-0.044921875,
-0.01947021484375,
-0.01163482666015625,
-0.01273345947265625,
0.00334930419921875,
0.06561279296875,
0.002452850341796875,
0.0265655517578125,
0.056610107421875,
-0.032318115234375,
0.038909912109375,
0.03192138671875,
-0.0208587646484375,
0.0246429443359375,
-0.048583984375,
0.01580810546875,
-0.0009264945983886719,
0.016876220703125,
-0.0831298828125,
-0.015594482421875,
0.0250091552734375,
-0.043853759765625,
0.0491943359375,
-0.033416748046875,
-0.0283660888671875,
-0.04083251953125,
-0.03656005859375,
0.03302001953125,
0.048583984375,
-0.053497314453125,
0.0242919921875,
0.0179595947265625,
0.0165557861328125,
-0.03680419921875,
-0.0614013671875,
-0.02728271484375,
-0.036346435546875,
-0.05706787109375,
0.0189971923828125,
0.01363372802734375,
0.01312255859375,
0.01806640625,
-0.0085296630859375,
-0.0021266937255859375,
-0.013275146484375,
0.0413818359375,
0.0200042724609375,
-0.0270538330078125,
-0.0092010498046875,
-0.0243377685546875,
-0.005077362060546875,
0.003490447998046875,
-0.01910400390625,
0.049285888671875,
-0.0189971923828125,
-0.01288604736328125,
-0.06744384765625,
-0.01056671142578125,
0.04534912109375,
-0.01444244384765625,
0.0694580078125,
0.07208251953125,
-0.03936767578125,
0.004070281982421875,
-0.033294677734375,
-0.0273590087890625,
-0.035491943359375,
0.03485107421875,
-0.0291748046875,
-0.034149169921875,
0.06488037109375,
-0.004138946533203125,
0.0083465576171875,
0.043487548828125,
0.022857666015625,
-0.0011959075927734375,
0.050628662109375,
0.0440673828125,
0.005306243896484375,
0.05401611328125,
-0.0745849609375,
-0.0154876708984375,
-0.0760498046875,
-0.032623291015625,
-0.0187225341796875,
-0.054840087890625,
-0.0423583984375,
-0.024749755859375,
0.027587890625,
0.00971221923828125,
-0.0325927734375,
0.0372314453125,
-0.07098388671875,
0.01207733154296875,
0.04931640625,
0.04486083984375,
-0.0296478271484375,
0.0225982666015625,
-0.0163726806640625,
-0.00551605224609375,
-0.05328369140625,
-0.0149078369140625,
0.07611083984375,
0.04083251953125,
0.050933837890625,
-0.01259613037109375,
0.058563232421875,
-0.0185394287109375,
0.028564453125,
-0.043121337890625,
0.0455322265625,
-0.00765228271484375,
-0.03057861328125,
-0.0214385986328125,
-0.027618408203125,
-0.07403564453125,
0.01506805419921875,
-0.01287078857421875,
-0.054351806640625,
0.00878143310546875,
0.0009636878967285156,
-0.0269012451171875,
0.05767822265625,
-0.06488037109375,
0.07843017578125,
-0.0082550048828125,
-0.0321044921875,
0.009307861328125,
-0.046630859375,
0.0228118896484375,
0.0153045654296875,
-0.0194244384765625,
-0.00543975830078125,
0.0164794921875,
0.09014892578125,
-0.038726806640625,
0.06658935546875,
-0.03302001953125,
0.0283355712890625,
0.037139892578125,
-0.0119171142578125,
0.0204620361328125,
-0.00562286376953125,
-0.002964019775390625,
0.01593017578125,
0.0033740997314453125,
-0.03729248046875,
-0.04345703125,
0.0458984375,
-0.073486328125,
-0.031280517578125,
-0.027191162109375,
-0.03582763671875,
0.01061248779296875,
0.006458282470703125,
0.0423583984375,
0.048980712890625,
0.0179595947265625,
0.0198822021484375,
0.04583740234375,
-0.035308837890625,
0.0298919677734375,
-0.007476806640625,
-0.0184173583984375,
-0.04736328125,
0.057861328125,
0.0167388916015625,
0.01678466796875,
0.0029087066650390625,
0.011383056640625,
-0.028717041015625,
-0.03271484375,
-0.024261474609375,
0.041259765625,
-0.0419921875,
-0.03387451171875,
-0.043609619140625,
-0.038665771484375,
-0.03912353515625,
-0.0013256072998046875,
-0.04327392578125,
-0.028656005859375,
-0.021453857421875,
0.008056640625,
0.052703857421875,
0.04498291015625,
-0.004886627197265625,
0.035308837890625,
-0.046661376953125,
0.00843048095703125,
0.01090240478515625,
0.038116455078125,
-0.0024166107177734375,
-0.074462890625,
-0.022613525390625,
-0.009521484375,
-0.0325927734375,
-0.0594482421875,
0.0404052734375,
0.00421905517578125,
0.037994384765625,
0.02777099609375,
-0.01114654541015625,
0.056427001953125,
-0.002048492431640625,
0.035064697265625,
0.021881103515625,
-0.0364990234375,
0.05145263671875,
0.0055389404296875,
0.0174713134765625,
0.01073455810546875,
0.02276611328125,
-0.025146484375,
-0.00572967529296875,
-0.081787109375,
-0.05126953125,
0.085205078125,
0.011871337890625,
-0.0096588134765625,
0.024139404296875,
0.05517578125,
-0.00421905517578125,
0.0008459091186523438,
-0.056243896484375,
-0.035980224609375,
-0.0191650390625,
-0.0241546630859375,
0.0025634765625,
-0.00940704345703125,
-0.00713348388671875,
-0.042510986328125,
0.052032470703125,
0.0039825439453125,
0.05462646484375,
0.030609130859375,
0.0018863677978515625,
-0.005687713623046875,
-0.0283966064453125,
0.038299560546875,
0.01214599609375,
-0.030609130859375,
0.004512786865234375,
0.016754150390625,
-0.05096435546875,
0.0105438232421875,
0.007717132568359375,
0.0014677047729492188,
-0.00505828857421875,
0.040191650390625,
0.0738525390625,
0.001628875732421875,
0.006488800048828125,
0.01934814453125,
-0.0094451904296875,
-0.0269622802734375,
-0.012420654296875,
0.00490570068359375,
0.005191802978515625,
0.031494140625,
0.01412200927734375,
0.01788330078125,
-0.019439697265625,
-0.017822265625,
0.0249176025390625,
0.03143310546875,
-0.0242462158203125,
-0.0294342041015625,
0.047607421875,
-0.009552001953125,
-0.0189056396484375,
0.07049560546875,
-0.0034313201904296875,
-0.04571533203125,
0.09014892578125,
0.034576416015625,
0.078125,
-0.01053619384765625,
0.0003905296325683594,
0.0673828125,
0.0219573974609375,
-0.0009055137634277344,
0.007106781005859375,
0.009765625,
-0.059173583984375,
0.004405975341796875,
-0.048980712890625,
0.005565643310546875,
0.036224365234375,
-0.04656982421875,
0.01247406005859375,
-0.0531005859375,
-0.0321044921875,
0.01354217529296875,
0.0272674560546875,
-0.06103515625,
0.0208740234375,
-0.0011091232299804688,
0.063720703125,
-0.05804443359375,
0.060699462890625,
0.0732421875,
-0.043853759765625,
-0.07769775390625,
0.00144195556640625,
-0.0042572021484375,
-0.06976318359375,
0.048736572265625,
0.041534423828125,
0.01708984375,
0.007129669189453125,
-0.052947998046875,
-0.05474853515625,
0.1029052734375,
0.037353515625,
-0.004428863525390625,
0.019683837890625,
-0.004512786865234375,
0.0178375244140625,
-0.038482666015625,
0.03125,
0.0280914306640625,
0.0257110595703125,
0.01947021484375,
-0.052490234375,
0.0192108154296875,
-0.0166015625,
0.00296783447265625,
0.0240325927734375,
-0.070068359375,
0.067626953125,
-0.04278564453125,
-0.01824951171875,
-0.006561279296875,
0.053680419921875,
0.0274505615234375,
0.01386260986328125,
0.040283203125,
0.07415771484375,
0.036834716796875,
-0.0213470458984375,
0.06121826171875,
-0.003307342529296875,
0.055816650390625,
0.050018310546875,
0.02264404296875,
0.03985595703125,
0.0275726318359375,
-0.029754638671875,
0.0250091552734375,
0.08685302734375,
-0.015655517578125,
0.033477783203125,
0.012908935546875,
-0.00229644775390625,
-0.010467529296875,
0.01271820068359375,
-0.03424072265625,
0.029052734375,
0.0124969482421875,
-0.043609619140625,
-0.0178070068359375,
0.0114593505859375,
-0.0018749237060546875,
-0.021881103515625,
-0.0089111328125,
0.04473876953125,
-0.00011581182479858398,
-0.03143310546875,
0.062286376953125,
-0.0018548965454101562,
0.06427001953125,
-0.022979736328125,
0.00618743896484375,
-0.0321044921875,
0.024688720703125,
-0.0267333984375,
-0.06695556640625,
0.0230255126953125,
-0.01861572265625,
-0.00714874267578125,
0.00246429443359375,
0.04962158203125,
-0.033203125,
-0.035888671875,
0.014404296875,
0.019378662109375,
0.044189453125,
-0.0010499954223632812,
-0.095703125,
0.0118408203125,
0.017608642578125,
-0.0614013671875,
0.029327392578125,
0.03582763671875,
0.013641357421875,
0.05596923828125,
0.037353515625,
-0.01056671142578125,
0.0123291015625,
-0.0142669677734375,
0.058807373046875,
-0.043121337890625,
-0.02294921875,
-0.06524658203125,
0.040130615234375,
-0.007293701171875,
-0.036224365234375,
0.0379638671875,
0.04730224609375,
0.060455322265625,
-0.0052947998046875,
0.032806396484375,
-0.01561737060546875,
0.00817108154296875,
-0.0277099609375,
0.055450439453125,
-0.0474853515625,
-0.004650115966796875,
-0.006816864013671875,
-0.05560302734375,
-0.017425537109375,
0.0543212890625,
-0.0222015380859375,
0.0311431884765625,
0.038055419921875,
0.07037353515625,
-0.0264434814453125,
-0.0222625732421875,
0.0161285400390625,
0.009185791015625,
0.00286865234375,
0.038818359375,
0.02435302734375,
-0.062286376953125,
0.023406982421875,
-0.052398681640625,
-0.014984130859375,
-0.0036830902099609375,
-0.0367431640625,
-0.07257080078125,
-0.06622314453125,
-0.0506591796875,
-0.0491943359375,
-0.0219268798828125,
0.06427001953125,
0.08056640625,
-0.0501708984375,
-0.0028972625732421875,
0.009918212890625,
0.015625,
-0.01678466796875,
-0.01806640625,
0.056976318359375,
-0.0164947509765625,
-0.061737060546875,
-0.026336669921875,
0.00469207763671875,
0.0279083251953125,
-0.00638580322265625,
-0.0143890380859375,
-0.01788330078125,
-0.01525115966796875,
0.0164031982421875,
0.025909423828125,
-0.05181884765625,
-0.0118408203125,
-0.0189208984375,
-0.01354217529296875,
0.0259246826171875,
0.0285186767578125,
-0.048431396484375,
0.01349639892578125,
0.0309295654296875,
0.033599853515625,
0.06317138671875,
-0.01316070556640625,
-0.00562286376953125,
-0.06494140625,
0.04437255859375,
-0.0152130126953125,
0.03350830078125,
0.03167724609375,
-0.0301361083984375,
0.052734375,
0.03424072265625,
-0.036224365234375,
-0.06976318359375,
-0.0093841552734375,
-0.07598876953125,
-0.01023101806640625,
0.061920166015625,
-0.0322265625,
-0.04571533203125,
0.040130615234375,
-0.006542205810546875,
0.04681396484375,
-0.00807952880859375,
0.040496826171875,
0.0301971435546875,
-0.0201568603515625,
-0.055267333984375,
-0.03326416015625,
0.037353515625,
0.01468658447265625,
-0.054290771484375,
-0.0234222412109375,
-0.00701904296875,
0.055999755859375,
0.0241546630859375,
0.040283203125,
-0.011871337890625,
0.00600433349609375,
0.005645751953125,
0.03924560546875,
-0.0268707275390625,
-0.007602691650390625,
-0.0251922607421875,
0.005847930908203125,
-0.01457977294921875,
-0.060821533203125
]
] |
timm/dla102.in1k | 2023-04-24T21:14:22.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"arxiv:1707.06484",
"license:bsd-3-clause",
"region:us"
] | image-classification | timm | null | null | timm/dla102.in1k | 0 | 36,941 | timm | 2023-04-24T19:35:50 | ---
tags:
- image-classification
- timm
library_name: timm
license: bsd-3-clause
datasets:
- imagenet-1k
---
# Model card for dla102.in1k
A DLA (Deep Layer Aggregation) image classification model. Trained on ImageNet-1k by paper authors.
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 33.3
- GMACs: 7.2
- Activations (M): 14.2
- Image size: 224 x 224
- **Papers:**
- Deep Layer Aggregation: https://arxiv.org/abs/1707.06484
- **Original:** https://github.com/ucbdrive/dla
- **Dataset:** ImageNet-1k
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('dla102.in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
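The `softmax` + `topk` step above just converts raw logits into the five most probable classes. A minimal pure-Python sketch of that computation (using made-up logits, no model download required):

```python
import math

def softmax(xs):
    # numerically stable softmax over a list of logits
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def topk(xs, k):
    # indices of the k largest values, plus the values themselves
    idx = sorted(range(len(xs)), key=lambda i: xs[i], reverse=True)[:k]
    return [xs[i] for i in idx], idx

logits = [2.0, 1.0, 0.1, 3.0, -1.0]  # hypothetical 5-class logits
probs = softmax(logits)
top_probs, top_indices = topk(probs, 3)
print(top_indices)  # [3, 0, 1]
```

With a real model, `top5_class_indices` can then be looked up against the ImageNet-1k label list to get human-readable class names.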
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'dla102.in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
    # print shape of each feature map in output
    # e.g.:
    # torch.Size([1, 32, 112, 112])
    # torch.Size([1, 128, 56, 56])
    # torch.Size([1, 256, 28, 28])
    # torch.Size([1, 512, 14, 14])
    # torch.Size([1, 1024, 7, 7])
    print(o.shape)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'dla102.in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 1024, 7, 7) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
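The pooling that turns the unpooled `(1, 1024, 7, 7)` tensor into a `(1, num_features)` embedding is global average pooling over the spatial dimensions. A pure-Python sketch of that reduction on a tiny made-up `(1, 2, 2, 2)` feature map (shapes and values are illustrative only):

```python
def global_avg_pool(fmap):
    # (batch, channels, H, W) nested lists -> (batch, channels)
    return [
        [
            sum(sum(row) for row in channel) / (len(channel) * len(channel[0]))
            for channel in batch
        ]
        for batch in fmap
    ]

# hypothetical feature map: 1 image, 2 channels, 2x2 spatial
fmap = [[[[1.0, 3.0], [5.0, 7.0]],
         [[2.0, 2.0], [2.0, 2.0]]]]
print(global_avg_pool(fmap))  # [[4.0, 2.0]]
```

In `timm` this corresponds to the default `avg` global pooling applied by `forward_head` before the classifier.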
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@inproceedings{yu2018deep,
title={Deep layer aggregation},
author={Yu, Fisher and Wang, Dequan and Shelhamer, Evan and Darrell, Trevor},
booktitle={Proceedings of the IEEE conference on computer vision and pattern recognition},
year={2018}
}
```
| 3,590 | [
[
-0.034332275390625,
-0.042327880859375,
0.01177215576171875,
0.00897979736328125,
-0.0255584716796875,
-0.0201416015625,
-0.0025787353515625,
-0.0384521484375,
0.0198516845703125,
0.03497314453125,
-0.03466796875,
-0.05615234375,
-0.057891845703125,
-0.0171661376953125,
-0.015838623046875,
0.0693359375,
-0.003925323486328125,
0.00011032819747924805,
-0.0160369873046875,
-0.040008544921875,
-0.0158538818359375,
-0.0250396728515625,
-0.07159423828125,
-0.036651611328125,
0.0274200439453125,
0.0061492919921875,
0.0362548828125,
0.0439453125,
0.03985595703125,
0.0372314453125,
-0.0131683349609375,
0.01129150390625,
-0.0141448974609375,
-0.0216827392578125,
0.035736083984375,
-0.042266845703125,
-0.047515869140625,
0.0206451416015625,
0.050445556640625,
0.0400390625,
0.009765625,
0.028289794921875,
0.01026153564453125,
0.048675537109375,
-0.0201873779296875,
0.015167236328125,
-0.0300445556640625,
0.0156402587890625,
-0.00928497314453125,
0.0149078369140625,
-0.01849365234375,
-0.03472900390625,
0.016815185546875,
-0.037017822265625,
0.020965576171875,
-0.01100921630859375,
0.0975341796875,
0.02020263671875,
-0.01152801513671875,
-0.01178741455078125,
-0.0162506103515625,
0.06256103515625,
-0.0697021484375,
0.007476806640625,
0.0282440185546875,
0.005893707275390625,
-0.0044097900390625,
-0.070556640625,
-0.046600341796875,
-0.0021152496337890625,
-0.0129241943359375,
-0.0041046142578125,
-0.0035190582275390625,
-0.0006542205810546875,
0.0095977783203125,
0.0283050537109375,
-0.0297393798828125,
0.0210113525390625,
-0.042755126953125,
-0.0099029541015625,
0.047088623046875,
0.008331298828125,
0.0243988037109375,
-0.00429534912109375,
-0.044952392578125,
-0.025177001953125,
-0.0227203369140625,
0.026611328125,
0.023040771484375,
0.0177764892578125,
-0.050689697265625,
0.0247650146484375,
0.021759033203125,
0.0382080078125,
0.01396942138671875,
-0.0283355712890625,
0.049591064453125,
0.0072479248046875,
-0.040252685546875,
-0.006336212158203125,
0.0731201171875,
0.0232391357421875,
0.013458251953125,
0.0072479248046875,
-0.0083465576171875,
-0.0262908935546875,
-0.01371002197265625,
-0.08770751953125,
-0.0291290283203125,
0.035919189453125,
-0.04412841796875,
-0.03643798828125,
0.0229339599609375,
-0.04534912109375,
-0.01422882080078125,
-0.008026123046875,
0.044464111328125,
-0.032562255859375,
-0.028411865234375,
0.005863189697265625,
-0.01009368896484375,
0.02435302734375,
0.01450347900390625,
-0.043060302734375,
0.01100921630859375,
0.020050048828125,
0.07958984375,
-0.00246429443359375,
-0.039215087890625,
-0.0156097412109375,
-0.01555633544921875,
-0.0236968994140625,
0.038177490234375,
-0.005084991455078125,
-0.00646209716796875,
-0.0192413330078125,
0.0207672119140625,
-0.0141448974609375,
-0.057769775390625,
0.0196075439453125,
-0.024200439453125,
0.019805908203125,
-0.0024242401123046875,
-0.0286102294921875,
-0.042022705078125,
0.017547607421875,
-0.038299560546875,
0.09088134765625,
0.02850341796875,
-0.055755615234375,
0.03839111328125,
-0.042266845703125,
-0.0105133056640625,
-0.01898193359375,
0.0008783340454101562,
-0.07940673828125,
-0.0109710693359375,
0.021209716796875,
0.05242919921875,
-0.0162353515625,
0.0005970001220703125,
-0.044097900390625,
-0.01837158203125,
0.029998779296875,
0.0003108978271484375,
0.0850830078125,
0.0101776123046875,
-0.0259246826171875,
0.01204681396484375,
-0.0469970703125,
0.0160675048828125,
0.03790283203125,
-0.0304718017578125,
-0.00954437255859375,
-0.046051025390625,
0.0116119384765625,
0.02166748046875,
0.015655517578125,
-0.03839111328125,
0.025726318359375,
-0.0206298828125,
0.021881103515625,
0.055572509765625,
-0.00893402099609375,
0.0252685546875,
-0.0208587646484375,
0.0156402587890625,
0.04010009765625,
0.0133514404296875,
-0.0016622543334960938,
-0.042266845703125,
-0.057159423828125,
-0.042510986328125,
0.030029296875,
0.030059814453125,
-0.03375244140625,
0.052520751953125,
-0.0019931793212890625,
-0.051055908203125,
-0.03961181640625,
0.0174102783203125,
0.031890869140625,
0.041229248046875,
0.0230712890625,
-0.039794921875,
-0.035125732421875,
-0.058868408203125,
0.0172271728515625,
0.003948211669921875,
0.0021533966064453125,
0.0218505859375,
0.0484619140625,
-0.01232147216796875,
0.047882080078125,
-0.0379638671875,
-0.032196044921875,
-0.0170135498046875,
0.0092010498046875,
0.0352783203125,
0.05908203125,
0.07513427734375,
-0.04547119140625,
-0.0372314453125,
-0.016357421875,
-0.0814208984375,
0.010833740234375,
-0.00434112548828125,
-0.01276397705078125,
0.03350830078125,
0.003261566162109375,
-0.05181884765625,
0.040924072265625,
0.02142333984375,
-0.02911376953125,
0.0238037109375,
-0.0204315185546875,
0.019989013671875,
-0.084716796875,
0.00360107421875,
0.02325439453125,
0.0004508495330810547,
-0.037200927734375,
0.007808685302734375,
-0.005962371826171875,
0.00014543533325195312,
-0.041900634765625,
0.050750732421875,
-0.0433349609375,
-0.0200347900390625,
-0.016998291015625,
-0.013763427734375,
0.00399017333984375,
0.057586669921875,
-0.01323699951171875,
0.029815673828125,
0.0736083984375,
-0.045745849609375,
0.03411865234375,
0.033050537109375,
-0.02001953125,
0.025299072265625,
-0.045166015625,
0.0272064208984375,
-0.0052337646484375,
0.018280029296875,
-0.078125,
-0.01470947265625,
0.021728515625,
-0.0367431640625,
0.04931640625,
-0.036956787109375,
-0.032684326171875,
-0.036773681640625,
-0.042510986328125,
0.03338623046875,
0.053314208984375,
-0.06005859375,
0.0297393798828125,
0.0307159423828125,
0.0189971923828125,
-0.04193115234375,
-0.06982421875,
-0.0180816650390625,
-0.032562255859375,
-0.0548095703125,
0.02069091796875,
0.004058837890625,
0.005619049072265625,
0.00778961181640625,
-0.0032100677490234375,
-0.0198211669921875,
-0.00678253173828125,
0.037994384765625,
0.02783203125,
-0.02197265625,
-0.011566162109375,
-0.016815185546875,
-0.0010471343994140625,
0.008514404296875,
-0.0203094482421875,
0.051055908203125,
-0.0103759765625,
-0.011444091796875,
-0.0618896484375,
-0.00603485107421875,
0.037841796875,
-0.0022296905517578125,
0.06524658203125,
0.08306884765625,
-0.03753662109375,
-0.005741119384765625,
-0.0227203369140625,
-0.0169677734375,
-0.0369873046875,
0.03271484375,
-0.028076171875,
-0.0240631103515625,
0.07049560546875,
0.0024433135986328125,
0.0079193115234375,
0.052001953125,
0.0289306640625,
-0.01084136962890625,
0.060943603515625,
0.04327392578125,
-0.0006074905395507812,
0.041534423828125,
-0.082763671875,
-0.01702880859375,
-0.059478759765625,
-0.0369873046875,
-0.024200439453125,
-0.047149658203125,
-0.037841796875,
-0.023834228515625,
0.0297393798828125,
0.020111083984375,
-0.026702880859375,
0.0269927978515625,
-0.054168701171875,
0.01190185546875,
0.04559326171875,
0.0477294921875,
-0.02740478515625,
0.0187835693359375,
-0.006744384765625,
0.002826690673828125,
-0.045562744140625,
-0.0181732177734375,
0.08026123046875,
0.0404052734375,
0.050140380859375,
-0.00628662109375,
0.056488037109375,
-0.006580352783203125,
0.016265869140625,
-0.043212890625,
0.047088623046875,
-0.0239410400390625,
-0.0285491943359375,
-0.01363372802734375,
-0.0216827392578125,
-0.076416015625,
0.0008134841918945312,
-0.0167694091796875,
-0.05657958984375,
0.0169677734375,
0.0124969482421875,
-0.01309967041015625,
0.061370849609375,
-0.0626220703125,
0.0673828125,
-0.01244354248046875,
-0.0249786376953125,
0.004413604736328125,
-0.041046142578125,
0.0284881591796875,
0.005458831787109375,
-0.0197601318359375,
-0.00568389892578125,
0.01120758056640625,
0.07781982421875,
-0.04254150390625,
0.07049560546875,
-0.046112060546875,
0.0124053955078125,
0.04327392578125,
-0.012451171875,
0.0260772705078125,
-0.0126953125,
-0.00893402099609375,
0.03277587890625,
-0.00183868408203125,
-0.0400390625,
-0.040191650390625,
0.045318603515625,
-0.0780029296875,
-0.021942138671875,
-0.02606201171875,
-0.042816162109375,
0.01480865478515625,
0.00011211633682250977,
0.028228759765625,
0.046173095703125,
0.0218048095703125,
0.01226806640625,
0.044921875,
-0.0318603515625,
0.036041259765625,
-0.007595062255859375,
-0.0212249755859375,
-0.0311431884765625,
0.06072998046875,
0.0191192626953125,
0.01134490966796875,
0.002735137939453125,
0.01012420654296875,
-0.0301971435546875,
-0.05364990234375,
-0.028594970703125,
0.0255889892578125,
-0.05377197265625,
-0.031280517578125,
-0.04510498046875,
-0.04388427734375,
-0.037139892578125,
-0.00514984130859375,
-0.0251007080078125,
-0.0265045166015625,
-0.032196044921875,
0.01113128662109375,
0.0509033203125,
0.0546875,
-0.006946563720703125,
0.039459228515625,
-0.04345703125,
0.00861358642578125,
0.0135040283203125,
0.034515380859375,
-0.0015859603881835938,
-0.07012939453125,
-0.026824951171875,
-0.006603240966796875,
-0.032379150390625,
-0.0516357421875,
0.040435791015625,
0.01373291015625,
0.041717529296875,
0.032745361328125,
-0.018646240234375,
0.05975341796875,
0.00738525390625,
0.04718017578125,
0.0178680419921875,
-0.043304443359375,
0.04840087890625,
0.0026645660400390625,
0.01265716552734375,
0.005947113037109375,
0.025848388671875,
0.004116058349609375,
-0.0109710693359375,
-0.06549072265625,
-0.061248779296875,
0.07159423828125,
0.006717681884765625,
-0.002574920654296875,
0.02752685546875,
0.054534912109375,
0.00553131103515625,
-0.0006313323974609375,
-0.05780029296875,
-0.04229736328125,
-0.0182342529296875,
-0.0218048095703125,
0.0026836395263671875,
-0.0083465576171875,
-0.005252838134765625,
-0.04522705078125,
0.054168701171875,
-0.005462646484375,
0.05157470703125,
0.0198974609375,
-0.00627899169921875,
-0.00922393798828125,
-0.034210205078125,
0.032928466796875,
0.027679443359375,
-0.035797119140625,
-0.00269317626953125,
0.0189971923828125,
-0.0484619140625,
0.007541656494140625,
0.01238250732421875,
-0.00418853759765625,
-0.0035991668701171875,
0.0352783203125,
0.061676025390625,
-0.0074005126953125,
0.0044403076171875,
0.0236358642578125,
0.00241851806640625,
-0.03167724609375,
-0.0234375,
0.01453399658203125,
-0.00470733642578125,
0.033203125,
0.031707763671875,
0.0167236328125,
-0.00885772705078125,
-0.0287017822265625,
0.0220794677734375,
0.045745849609375,
-0.0159912109375,
-0.024993896484375,
0.052337646484375,
-0.01074981689453125,
-0.00824737548828125,
0.06011962890625,
-0.00908660888671875,
-0.0302734375,
0.09320068359375,
0.03363037109375,
0.07574462890625,
-0.0015659332275390625,
-0.0006394386291503906,
0.061309814453125,
0.01080322265625,
0.0132293701171875,
0.01027679443359375,
0.0158538818359375,
-0.059326171875,
0.00377655029296875,
-0.0478515625,
0.0035552978515625,
0.03851318359375,
-0.037628173828125,
0.01800537109375,
-0.055206298828125,
-0.0238037109375,
0.0213165283203125,
0.031524658203125,
-0.063720703125,
0.01837158203125,
0.0028095245361328125,
0.06365966796875,
-0.06353759765625,
0.0712890625,
0.06597900390625,
-0.05322265625,
-0.0721435546875,
-0.0021839141845703125,
0.004467010498046875,
-0.076416015625,
0.057525634765625,
0.033447265625,
0.01389312744140625,
-0.0005555152893066406,
-0.061065673828125,
-0.047698974609375,
0.11419677734375,
0.03826904296875,
0.0023136138916015625,
0.0169677734375,
0.0005908012390136719,
0.0191497802734375,
-0.041015625,
0.0288848876953125,
0.01971435546875,
0.0307769775390625,
0.0231781005859375,
-0.047576904296875,
0.0125579833984375,
-0.0154266357421875,
0.0005211830139160156,
0.0129852294921875,
-0.059326171875,
0.07879638671875,
-0.038787841796875,
-0.0103759765625,
0.00537872314453125,
0.05413818359375,
0.0225372314453125,
0.0198822021484375,
0.04736328125,
0.0660400390625,
0.04058837890625,
-0.020050048828125,
0.06121826171875,
0.00006324052810668945,
0.04559326171875,
0.043609619140625,
0.00907135009765625,
0.032196044921875,
0.0213165283203125,
-0.0262908935546875,
0.032470703125,
0.0797119140625,
-0.0290679931640625,
0.0275726318359375,
0.004718780517578125,
-0.007366180419921875,
-0.00025463104248046875,
0.00936126708984375,
-0.030853271484375,
0.040008544921875,
0.015106201171875,
-0.038177490234375,
-0.015899658203125,
0.0010747909545898438,
0.0012311935424804688,
-0.0197601318359375,
-0.0216522216796875,
0.047882080078125,
-0.0018396377563476562,
-0.0224609375,
0.0611572265625,
-0.0005922317504882812,
0.0662841796875,
-0.032562255859375,
-0.00514984130859375,
-0.0303802490234375,
0.0243377685546875,
-0.0297088623046875,
-0.06884765625,
0.022064208984375,
-0.0218353271484375,
-0.0052337646484375,
0.00899505615234375,
0.046173095703125,
-0.0311279296875,
-0.04144287109375,
0.02423095703125,
0.0240020751953125,
0.047454833984375,
0.01190185546875,
-0.09185791015625,
0.008209228515625,
0.0019044876098632812,
-0.045501708984375,
0.037261962890625,
0.03253173828125,
0.01290130615234375,
0.05487060546875,
0.044158935546875,
-0.003452301025390625,
0.00984954833984375,
-0.006542205810546875,
0.063720703125,
-0.03826904296875,
-0.0164337158203125,
-0.058074951171875,
0.05242919921875,
-0.0080718994140625,
-0.03985595703125,
0.034332275390625,
0.04840087890625,
0.061004638671875,
-0.007183074951171875,
0.0312347412109375,
-0.022430419921875,
-0.0009131431579589844,
-0.045745849609375,
0.058258056640625,
-0.04840087890625,
0.003238677978515625,
-0.01099395751953125,
-0.0516357421875,
-0.02001953125,
0.04364013671875,
-0.00485992431640625,
0.03851318359375,
0.033447265625,
0.0743408203125,
-0.023681640625,
-0.0350341796875,
0.01555633544921875,
0.01751708984375,
0.010498046875,
0.037750244140625,
0.0196990966796875,
-0.06561279296875,
0.0237274169921875,
-0.04852294921875,
-0.01361083984375,
-0.000881195068359375,
-0.046539306640625,
-0.081787109375,
-0.06903076171875,
-0.045745849609375,
-0.046051025390625,
-0.0181121826171875,
0.0662841796875,
0.0823974609375,
-0.05743408203125,
-0.01000213623046875,
0.01090240478515625,
0.01480865478515625,
-0.019439697265625,
-0.01849365234375,
0.057159423828125,
-0.0206451416015625,
-0.068115234375,
-0.0195159912109375,
-0.0006899833679199219,
0.037689208984375,
-0.0120697021484375,
-0.0219268798828125,
-0.0231781005859375,
-0.0170745849609375,
0.026336669921875,
0.019134521484375,
-0.04974365234375,
-0.010833740234375,
-0.0306549072265625,
-0.011962890625,
0.0285186767578125,
0.02618408203125,
-0.047454833984375,
0.01056671142578125,
0.0243377685546875,
0.0311431884765625,
0.061492919921875,
-0.0196380615234375,
-0.0026760101318359375,
-0.0657958984375,
0.045745849609375,
-0.0164031982421875,
0.032562255859375,
0.023681640625,
-0.030029296875,
0.0455322265625,
0.0288238525390625,
-0.03680419921875,
-0.06689453125,
-0.01117706298828125,
-0.08770751953125,
-0.016754150390625,
0.054229736328125,
-0.035552978515625,
-0.036865234375,
0.028961181640625,
-0.002620697021484375,
0.042266845703125,
-0.0091400146484375,
0.047454833984375,
0.0202484130859375,
-0.01457977294921875,
-0.0494384765625,
-0.032745361328125,
0.030517578125,
0.002574920654296875,
-0.058013916015625,
-0.03179931640625,
0.0015439987182617188,
0.04595947265625,
0.0177001953125,
0.04119873046875,
-0.01091766357421875,
0.005084991455078125,
0.0077362060546875,
0.03326416015625,
-0.031524658203125,
-0.00740814208984375,
-0.03094482421875,
0.007053375244140625,
-0.0048065185546875,
-0.05029296875
]
] |
intfloat/e5-base-v2 | 2023-09-27T10:13:27.000Z | [
"sentence-transformers",
"pytorch",
"onnx",
"safetensors",
"bert",
"mteb",
"Sentence Transformers",
"sentence-similarity",
"en",
"arxiv:2212.03533",
"arxiv:2104.08663",
"arxiv:2210.07316",
"license:mit",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] | sentence-similarity | intfloat | null | null | intfloat/e5-base-v2 | 57 | 36,915 | sentence-transformers | 2023-05-19T07:21:14 | ---
tags:
- mteb
- Sentence Transformers
- sentence-similarity
- sentence-transformers
model-index:
- name: e5-base-v2
results:
- task:
type: Classification
dataset:
type: mteb/amazon_counterfactual
name: MTEB AmazonCounterfactualClassification (en)
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 77.77611940298506
- type: ap
value: 42.052710266606056
- type: f1
value: 72.12040628266567
- task:
type: Classification
dataset:
type: mteb/amazon_polarity
name: MTEB AmazonPolarityClassification
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 92.81012500000001
- type: ap
value: 89.4213700757244
- type: f1
value: 92.8039091197065
- task:
type: Classification
dataset:
type: mteb/amazon_reviews_multi
name: MTEB AmazonReviewsClassification (en)
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 46.711999999999996
- type: f1
value: 46.11544975436018
- task:
type: Retrieval
dataset:
type: arguana
name: MTEB ArguAna
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 23.186
- type: map_at_10
value: 36.632999999999996
- type: map_at_100
value: 37.842
- type: map_at_1000
value: 37.865
- type: map_at_3
value: 32.278
- type: map_at_5
value: 34.760999999999996
- type: mrr_at_1
value: 23.400000000000002
- type: mrr_at_10
value: 36.721
- type: mrr_at_100
value: 37.937
- type: mrr_at_1000
value: 37.96
- type: mrr_at_3
value: 32.302
- type: mrr_at_5
value: 34.894
- type: ndcg_at_1
value: 23.186
- type: ndcg_at_10
value: 44.49
- type: ndcg_at_100
value: 50.065000000000005
- type: ndcg_at_1000
value: 50.629999999999995
- type: ndcg_at_3
value: 35.461
- type: ndcg_at_5
value: 39.969
- type: precision_at_1
value: 23.186
- type: precision_at_10
value: 6.97
- type: precision_at_100
value: 0.951
- type: precision_at_1000
value: 0.099
- type: precision_at_3
value: 14.912
- type: precision_at_5
value: 11.152
- type: recall_at_1
value: 23.186
- type: recall_at_10
value: 69.70100000000001
- type: recall_at_100
value: 95.092
- type: recall_at_1000
value: 99.431
- type: recall_at_3
value: 44.737
- type: recall_at_5
value: 55.761
- task:
type: Clustering
dataset:
type: mteb/arxiv-clustering-p2p
name: MTEB ArxivClusteringP2P
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: v_measure
value: 46.10312401440185
- task:
type: Clustering
dataset:
type: mteb/arxiv-clustering-s2s
name: MTEB ArxivClusteringS2S
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: v_measure
value: 39.67275326095384
- task:
type: Reranking
dataset:
type: mteb/askubuntudupquestions-reranking
name: MTEB AskUbuntuDupQuestions
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 58.97793816337376
- type: mrr
value: 72.76832431957087
- task:
type: STS
dataset:
type: mteb/biosses-sts
name: MTEB BIOSSES
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cos_sim_pearson
value: 83.11646947018187
- type: cos_sim_spearman
value: 81.40064994975234
- type: euclidean_pearson
value: 82.37355689019232
- type: euclidean_spearman
value: 81.6777646977348
- type: manhattan_pearson
value: 82.61101422716945
- type: manhattan_spearman
value: 81.80427360442245
- task:
type: Classification
dataset:
type: mteb/banking77
name: MTEB Banking77Classification
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 83.52922077922076
- type: f1
value: 83.45298679360866
- task:
type: Clustering
dataset:
type: mteb/biorxiv-clustering-p2p
name: MTEB BiorxivClusteringP2P
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: v_measure
value: 37.495115019668496
- task:
type: Clustering
dataset:
type: mteb/biorxiv-clustering-s2s
name: MTEB BiorxivClusteringS2S
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: v_measure
value: 32.724792944166765
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackAndroidRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 32.361000000000004
- type: map_at_10
value: 43.765
- type: map_at_100
value: 45.224
- type: map_at_1000
value: 45.35
- type: map_at_3
value: 40.353
- type: map_at_5
value: 42.195
- type: mrr_at_1
value: 40.629
- type: mrr_at_10
value: 50.458000000000006
- type: mrr_at_100
value: 51.06699999999999
- type: mrr_at_1000
value: 51.12
- type: mrr_at_3
value: 47.902
- type: mrr_at_5
value: 49.447
- type: ndcg_at_1
value: 40.629
- type: ndcg_at_10
value: 50.376
- type: ndcg_at_100
value: 55.065
- type: ndcg_at_1000
value: 57.196000000000005
- type: ndcg_at_3
value: 45.616
- type: ndcg_at_5
value: 47.646
- type: precision_at_1
value: 40.629
- type: precision_at_10
value: 9.785
- type: precision_at_100
value: 1.562
- type: precision_at_1000
value: 0.2
- type: precision_at_3
value: 22.031
- type: precision_at_5
value: 15.737000000000002
- type: recall_at_1
value: 32.361000000000004
- type: recall_at_10
value: 62.214000000000006
- type: recall_at_100
value: 81.464
- type: recall_at_1000
value: 95.905
- type: recall_at_3
value: 47.5
- type: recall_at_5
value: 53.69500000000001
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackEnglishRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 27.971
- type: map_at_10
value: 37.444
- type: map_at_100
value: 38.607
- type: map_at_1000
value: 38.737
- type: map_at_3
value: 34.504000000000005
- type: map_at_5
value: 36.234
- type: mrr_at_1
value: 35.35
- type: mrr_at_10
value: 43.441
- type: mrr_at_100
value: 44.147999999999996
- type: mrr_at_1000
value: 44.196000000000005
- type: mrr_at_3
value: 41.285
- type: mrr_at_5
value: 42.552
- type: ndcg_at_1
value: 35.35
- type: ndcg_at_10
value: 42.903999999999996
- type: ndcg_at_100
value: 47.406
- type: ndcg_at_1000
value: 49.588
- type: ndcg_at_3
value: 38.778
- type: ndcg_at_5
value: 40.788000000000004
- type: precision_at_1
value: 35.35
- type: precision_at_10
value: 8.083
- type: precision_at_100
value: 1.313
- type: precision_at_1000
value: 0.18
- type: precision_at_3
value: 18.769
- type: precision_at_5
value: 13.439
- type: recall_at_1
value: 27.971
- type: recall_at_10
value: 52.492000000000004
- type: recall_at_100
value: 71.642
- type: recall_at_1000
value: 85.488
- type: recall_at_3
value: 40.1
- type: recall_at_5
value: 45.800000000000004
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackGamingRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 39.898
- type: map_at_10
value: 51.819
- type: map_at_100
value: 52.886
- type: map_at_1000
value: 52.941
- type: map_at_3
value: 48.619
- type: map_at_5
value: 50.493
- type: mrr_at_1
value: 45.391999999999996
- type: mrr_at_10
value: 55.230000000000004
- type: mrr_at_100
value: 55.887
- type: mrr_at_1000
value: 55.916
- type: mrr_at_3
value: 52.717000000000006
- type: mrr_at_5
value: 54.222
- type: ndcg_at_1
value: 45.391999999999996
- type: ndcg_at_10
value: 57.586999999999996
- type: ndcg_at_100
value: 61.745000000000005
- type: ndcg_at_1000
value: 62.83800000000001
- type: ndcg_at_3
value: 52.207
- type: ndcg_at_5
value: 54.925999999999995
- type: precision_at_1
value: 45.391999999999996
- type: precision_at_10
value: 9.21
- type: precision_at_100
value: 1.226
- type: precision_at_1000
value: 0.136
- type: precision_at_3
value: 23.177
- type: precision_at_5
value: 16.038
- type: recall_at_1
value: 39.898
- type: recall_at_10
value: 71.18900000000001
- type: recall_at_100
value: 89.082
- type: recall_at_1000
value: 96.865
- type: recall_at_3
value: 56.907
- type: recall_at_5
value: 63.397999999999996
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackGisRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 22.706
- type: map_at_10
value: 30.818
- type: map_at_100
value: 32.038
- type: map_at_1000
value: 32.123000000000005
- type: map_at_3
value: 28.077
- type: map_at_5
value: 29.709999999999997
- type: mrr_at_1
value: 24.407
- type: mrr_at_10
value: 32.555
- type: mrr_at_100
value: 33.692
- type: mrr_at_1000
value: 33.751
- type: mrr_at_3
value: 29.848999999999997
- type: mrr_at_5
value: 31.509999999999998
- type: ndcg_at_1
value: 24.407
- type: ndcg_at_10
value: 35.624
- type: ndcg_at_100
value: 41.454
- type: ndcg_at_1000
value: 43.556
- type: ndcg_at_3
value: 30.217
- type: ndcg_at_5
value: 33.111000000000004
- type: precision_at_1
value: 24.407
- type: precision_at_10
value: 5.548
- type: precision_at_100
value: 0.8869999999999999
- type: precision_at_1000
value: 0.11100000000000002
- type: precision_at_3
value: 12.731
- type: precision_at_5
value: 9.22
- type: recall_at_1
value: 22.706
- type: recall_at_10
value: 48.772
- type: recall_at_100
value: 75.053
- type: recall_at_1000
value: 90.731
- type: recall_at_3
value: 34.421
- type: recall_at_5
value: 41.427
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackMathematicaRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 13.424
- type: map_at_10
value: 21.09
- type: map_at_100
value: 22.264999999999997
- type: map_at_1000
value: 22.402
- type: map_at_3
value: 18.312
- type: map_at_5
value: 19.874
- type: mrr_at_1
value: 16.915
- type: mrr_at_10
value: 25.258000000000003
- type: mrr_at_100
value: 26.228
- type: mrr_at_1000
value: 26.31
- type: mrr_at_3
value: 22.492
- type: mrr_at_5
value: 24.04
- type: ndcg_at_1
value: 16.915
- type: ndcg_at_10
value: 26.266000000000002
- type: ndcg_at_100
value: 32.08
- type: ndcg_at_1000
value: 35.086
- type: ndcg_at_3
value: 21.049
- type: ndcg_at_5
value: 23.508000000000003
- type: precision_at_1
value: 16.915
- type: precision_at_10
value: 5.1
- type: precision_at_100
value: 0.9329999999999999
- type: precision_at_1000
value: 0.131
- type: precision_at_3
value: 10.282
- type: precision_at_5
value: 7.836
- type: recall_at_1
value: 13.424
- type: recall_at_10
value: 38.179
- type: recall_at_100
value: 63.906
- type: recall_at_1000
value: 84.933
- type: recall_at_3
value: 23.878
- type: recall_at_5
value: 30.037999999999997
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackPhysicsRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 26.154
- type: map_at_10
value: 35.912
- type: map_at_100
value: 37.211
- type: map_at_1000
value: 37.327
- type: map_at_3
value: 32.684999999999995
- type: map_at_5
value: 34.562
- type: mrr_at_1
value: 32.435
- type: mrr_at_10
value: 41.411
- type: mrr_at_100
value: 42.297000000000004
- type: mrr_at_1000
value: 42.345
- type: mrr_at_3
value: 38.771
- type: mrr_at_5
value: 40.33
- type: ndcg_at_1
value: 32.435
- type: ndcg_at_10
value: 41.785
- type: ndcg_at_100
value: 47.469
- type: ndcg_at_1000
value: 49.685
- type: ndcg_at_3
value: 36.618
- type: ndcg_at_5
value: 39.101
- type: precision_at_1
value: 32.435
- type: precision_at_10
value: 7.642
- type: precision_at_100
value: 1.244
- type: precision_at_1000
value: 0.163
- type: precision_at_3
value: 17.485
- type: precision_at_5
value: 12.57
- type: recall_at_1
value: 26.154
- type: recall_at_10
value: 54.111
- type: recall_at_100
value: 78.348
- type: recall_at_1000
value: 92.996
- type: recall_at_3
value: 39.189
- type: recall_at_5
value: 45.852
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackProgrammersRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 26.308999999999997
- type: map_at_10
value: 35.524
- type: map_at_100
value: 36.774
- type: map_at_1000
value: 36.891
- type: map_at_3
value: 32.561
- type: map_at_5
value: 34.034
- type: mrr_at_1
value: 31.735000000000003
- type: mrr_at_10
value: 40.391
- type: mrr_at_100
value: 41.227000000000004
- type: mrr_at_1000
value: 41.288000000000004
- type: mrr_at_3
value: 37.938
- type: mrr_at_5
value: 39.193
- type: ndcg_at_1
value: 31.735000000000003
- type: ndcg_at_10
value: 41.166000000000004
- type: ndcg_at_100
value: 46.702
- type: ndcg_at_1000
value: 49.157000000000004
- type: ndcg_at_3
value: 36.274
- type: ndcg_at_5
value: 38.177
- type: precision_at_1
value: 31.735000000000003
- type: precision_at_10
value: 7.5569999999999995
- type: precision_at_100
value: 1.2109999999999999
- type: precision_at_1000
value: 0.16
- type: precision_at_3
value: 17.199
- type: precision_at_5
value: 12.123000000000001
- type: recall_at_1
value: 26.308999999999997
- type: recall_at_10
value: 53.083000000000006
- type: recall_at_100
value: 76.922
- type: recall_at_1000
value: 93.767
- type: recall_at_3
value: 39.262
- type: recall_at_5
value: 44.413000000000004
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 24.391250000000003
- type: map_at_10
value: 33.280166666666666
- type: map_at_100
value: 34.49566666666667
- type: map_at_1000
value: 34.61533333333333
- type: map_at_3
value: 30.52183333333333
- type: map_at_5
value: 32.06608333333333
- type: mrr_at_1
value: 29.105083333333337
- type: mrr_at_10
value: 37.44766666666666
- type: mrr_at_100
value: 38.32491666666667
- type: mrr_at_1000
value: 38.385666666666665
- type: mrr_at_3
value: 35.06883333333333
- type: mrr_at_5
value: 36.42066666666667
- type: ndcg_at_1
value: 29.105083333333337
- type: ndcg_at_10
value: 38.54358333333333
- type: ndcg_at_100
value: 43.833583333333344
- type: ndcg_at_1000
value: 46.215333333333334
- type: ndcg_at_3
value: 33.876
- type: ndcg_at_5
value: 36.05208333333333
- type: precision_at_1
value: 29.105083333333337
- type: precision_at_10
value: 6.823416666666665
- type: precision_at_100
value: 1.1270833333333334
- type: precision_at_1000
value: 0.15208333333333332
- type: precision_at_3
value: 15.696750000000002
- type: precision_at_5
value: 11.193499999999998
- type: recall_at_1
value: 24.391250000000003
- type: recall_at_10
value: 49.98808333333333
- type: recall_at_100
value: 73.31616666666666
- type: recall_at_1000
value: 89.96291666666667
- type: recall_at_3
value: 36.86666666666667
- type: recall_at_5
value: 42.54350000000001
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackStatsRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 21.995
- type: map_at_10
value: 28.807
- type: map_at_100
value: 29.813000000000002
- type: map_at_1000
value: 29.903000000000002
- type: map_at_3
value: 26.636
- type: map_at_5
value: 27.912
- type: mrr_at_1
value: 24.847
- type: mrr_at_10
value: 31.494
- type: mrr_at_100
value: 32.381
- type: mrr_at_1000
value: 32.446999999999996
- type: mrr_at_3
value: 29.473
- type: mrr_at_5
value: 30.7
- type: ndcg_at_1
value: 24.847
- type: ndcg_at_10
value: 32.818999999999996
- type: ndcg_at_100
value: 37.835
- type: ndcg_at_1000
value: 40.226
- type: ndcg_at_3
value: 28.811999999999998
- type: ndcg_at_5
value: 30.875999999999998
- type: precision_at_1
value: 24.847
- type: precision_at_10
value: 5.244999999999999
- type: precision_at_100
value: 0.856
- type: precision_at_1000
value: 0.11299999999999999
- type: precision_at_3
value: 12.577
- type: precision_at_5
value: 8.895999999999999
- type: recall_at_1
value: 21.995
- type: recall_at_10
value: 42.479
- type: recall_at_100
value: 65.337
- type: recall_at_1000
value: 83.23700000000001
- type: recall_at_3
value: 31.573
- type: recall_at_5
value: 36.684
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackTexRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 15.751000000000001
- type: map_at_10
value: 21.909
- type: map_at_100
value: 23.064
- type: map_at_1000
value: 23.205000000000002
- type: map_at_3
value: 20.138
- type: map_at_5
value: 20.973
- type: mrr_at_1
value: 19.305
- type: mrr_at_10
value: 25.647
- type: mrr_at_100
value: 26.659
- type: mrr_at_1000
value: 26.748
- type: mrr_at_3
value: 23.933
- type: mrr_at_5
value: 24.754
- type: ndcg_at_1
value: 19.305
- type: ndcg_at_10
value: 25.886
- type: ndcg_at_100
value: 31.56
- type: ndcg_at_1000
value: 34.799
- type: ndcg_at_3
value: 22.708000000000002
- type: ndcg_at_5
value: 23.838
- type: precision_at_1
value: 19.305
- type: precision_at_10
value: 4.677
- type: precision_at_100
value: 0.895
- type: precision_at_1000
value: 0.136
- type: precision_at_3
value: 10.771
- type: precision_at_5
value: 7.46
- type: recall_at_1
value: 15.751000000000001
- type: recall_at_10
value: 34.156
- type: recall_at_100
value: 59.899
- type: recall_at_1000
value: 83.08
- type: recall_at_3
value: 24.772
- type: recall_at_5
value: 28.009
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackUnixRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 23.34
- type: map_at_10
value: 32.383
- type: map_at_100
value: 33.629999999999995
- type: map_at_1000
value: 33.735
- type: map_at_3
value: 29.68
- type: map_at_5
value: 31.270999999999997
- type: mrr_at_1
value: 27.612
- type: mrr_at_10
value: 36.381
- type: mrr_at_100
value: 37.351
- type: mrr_at_1000
value: 37.411
- type: mrr_at_3
value: 33.893
- type: mrr_at_5
value: 35.353
- type: ndcg_at_1
value: 27.612
- type: ndcg_at_10
value: 37.714999999999996
- type: ndcg_at_100
value: 43.525000000000006
- type: ndcg_at_1000
value: 45.812999999999995
- type: ndcg_at_3
value: 32.796
- type: ndcg_at_5
value: 35.243
- type: precision_at_1
value: 27.612
- type: precision_at_10
value: 6.465
- type: precision_at_100
value: 1.0619999999999998
- type: precision_at_1000
value: 0.13699999999999998
- type: precision_at_3
value: 15.049999999999999
- type: precision_at_5
value: 10.764999999999999
- type: recall_at_1
value: 23.34
- type: recall_at_10
value: 49.856
- type: recall_at_100
value: 75.334
- type: recall_at_1000
value: 91.156
- type: recall_at_3
value: 36.497
- type: recall_at_5
value: 42.769
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackWebmastersRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 25.097
- type: map_at_10
value: 34.599999999999994
- type: map_at_100
value: 36.174
- type: map_at_1000
value: 36.398
- type: map_at_3
value: 31.781
- type: map_at_5
value: 33.22
- type: mrr_at_1
value: 31.225
- type: mrr_at_10
value: 39.873
- type: mrr_at_100
value: 40.853
- type: mrr_at_1000
value: 40.904
- type: mrr_at_3
value: 37.681
- type: mrr_at_5
value: 38.669
- type: ndcg_at_1
value: 31.225
- type: ndcg_at_10
value: 40.586
- type: ndcg_at_100
value: 46.226
- type: ndcg_at_1000
value: 48.788
- type: ndcg_at_3
value: 36.258
- type: ndcg_at_5
value: 37.848
- type: precision_at_1
value: 31.225
- type: precision_at_10
value: 7.707999999999999
- type: precision_at_100
value: 1.536
- type: precision_at_1000
value: 0.242
- type: precision_at_3
value: 17.26
- type: precision_at_5
value: 12.253
- type: recall_at_1
value: 25.097
- type: recall_at_10
value: 51.602000000000004
- type: recall_at_100
value: 76.854
- type: recall_at_1000
value: 93.303
- type: recall_at_3
value: 38.68
- type: recall_at_5
value: 43.258
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackWordpressRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 17.689
- type: map_at_10
value: 25.291000000000004
- type: map_at_100
value: 26.262
- type: map_at_1000
value: 26.372
- type: map_at_3
value: 22.916
- type: map_at_5
value: 24.315
- type: mrr_at_1
value: 19.409000000000002
- type: mrr_at_10
value: 27.233
- type: mrr_at_100
value: 28.109
- type: mrr_at_1000
value: 28.192
- type: mrr_at_3
value: 24.892
- type: mrr_at_5
value: 26.278000000000002
- type: ndcg_at_1
value: 19.409000000000002
- type: ndcg_at_10
value: 29.809
- type: ndcg_at_100
value: 34.936
- type: ndcg_at_1000
value: 37.852000000000004
- type: ndcg_at_3
value: 25.179000000000002
- type: ndcg_at_5
value: 27.563
- type: precision_at_1
value: 19.409000000000002
- type: precision_at_10
value: 4.861
- type: precision_at_100
value: 0.8
- type: precision_at_1000
value: 0.116
- type: precision_at_3
value: 11.029
- type: precision_at_5
value: 7.985
- type: recall_at_1
value: 17.689
- type: recall_at_10
value: 41.724
- type: recall_at_100
value: 65.95299999999999
- type: recall_at_1000
value: 88.094
- type: recall_at_3
value: 29.621
- type: recall_at_5
value: 35.179
- task:
type: Retrieval
dataset:
type: climate-fever
name: MTEB ClimateFEVER
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 10.581
- type: map_at_10
value: 18.944
- type: map_at_100
value: 20.812
- type: map_at_1000
value: 21.002000000000002
- type: map_at_3
value: 15.661
- type: map_at_5
value: 17.502000000000002
- type: mrr_at_1
value: 23.388
- type: mrr_at_10
value: 34.263
- type: mrr_at_100
value: 35.364000000000004
- type: mrr_at_1000
value: 35.409
- type: mrr_at_3
value: 30.586000000000002
- type: mrr_at_5
value: 32.928000000000004
- type: ndcg_at_1
value: 23.388
- type: ndcg_at_10
value: 26.56
- type: ndcg_at_100
value: 34.248
- type: ndcg_at_1000
value: 37.779
- type: ndcg_at_3
value: 21.179000000000002
- type: ndcg_at_5
value: 23.504
- type: precision_at_1
value: 23.388
- type: precision_at_10
value: 8.476
- type: precision_at_100
value: 1.672
- type: precision_at_1000
value: 0.233
- type: precision_at_3
value: 15.852
- type: precision_at_5
value: 12.73
- type: recall_at_1
value: 10.581
- type: recall_at_10
value: 32.512
- type: recall_at_100
value: 59.313
- type: recall_at_1000
value: 79.25
- type: recall_at_3
value: 19.912
- type: recall_at_5
value: 25.832
- task:
type: Retrieval
dataset:
type: dbpedia-entity
name: MTEB DBPedia
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 9.35
- type: map_at_10
value: 20.134
- type: map_at_100
value: 28.975
- type: map_at_1000
value: 30.709999999999997
- type: map_at_3
value: 14.513000000000002
- type: map_at_5
value: 16.671
- type: mrr_at_1
value: 69.75
- type: mrr_at_10
value: 77.67699999999999
- type: mrr_at_100
value: 77.97500000000001
- type: mrr_at_1000
value: 77.985
- type: mrr_at_3
value: 76.292
- type: mrr_at_5
value: 77.179
- type: ndcg_at_1
value: 56.49999999999999
- type: ndcg_at_10
value: 42.226
- type: ndcg_at_100
value: 47.562
- type: ndcg_at_1000
value: 54.923
- type: ndcg_at_3
value: 46.564
- type: ndcg_at_5
value: 43.830000000000005
- type: precision_at_1
value: 69.75
- type: precision_at_10
value: 33.525
- type: precision_at_100
value: 11.035
- type: precision_at_1000
value: 2.206
- type: precision_at_3
value: 49.75
- type: precision_at_5
value: 42
- type: recall_at_1
value: 9.35
- type: recall_at_10
value: 25.793
- type: recall_at_100
value: 54.186
- type: recall_at_1000
value: 77.81
- type: recall_at_3
value: 15.770000000000001
- type: recall_at_5
value: 19.09
- task:
type: Classification
dataset:
type: mteb/emotion
name: MTEB EmotionClassification
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 46.945
- type: f1
value: 42.07407842992542
- task:
type: Retrieval
dataset:
type: fever
name: MTEB FEVER
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 71.04599999999999
- type: map_at_10
value: 80.718
- type: map_at_100
value: 80.961
- type: map_at_1000
value: 80.974
- type: map_at_3
value: 79.49199999999999
- type: map_at_5
value: 80.32000000000001
- type: mrr_at_1
value: 76.388
- type: mrr_at_10
value: 85.214
- type: mrr_at_100
value: 85.302
- type: mrr_at_1000
value: 85.302
- type: mrr_at_3
value: 84.373
- type: mrr_at_5
value: 84.979
- type: ndcg_at_1
value: 76.388
- type: ndcg_at_10
value: 84.987
- type: ndcg_at_100
value: 85.835
- type: ndcg_at_1000
value: 86.04899999999999
- type: ndcg_at_3
value: 83.04
- type: ndcg_at_5
value: 84.22500000000001
- type: precision_at_1
value: 76.388
- type: precision_at_10
value: 10.35
- type: precision_at_100
value: 1.099
- type: precision_at_1000
value: 0.11399999999999999
- type: precision_at_3
value: 32.108
- type: precision_at_5
value: 20.033
- type: recall_at_1
value: 71.04599999999999
- type: recall_at_10
value: 93.547
- type: recall_at_100
value: 96.887
- type: recall_at_1000
value: 98.158
- type: recall_at_3
value: 88.346
- type: recall_at_5
value: 91.321
- task:
type: Retrieval
dataset:
type: fiqa
name: MTEB FiQA2018
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 19.8
- type: map_at_10
value: 31.979999999999997
- type: map_at_100
value: 33.876
- type: map_at_1000
value: 34.056999999999995
- type: map_at_3
value: 28.067999999999998
- type: map_at_5
value: 30.066
- type: mrr_at_1
value: 38.735
- type: mrr_at_10
value: 47.749
- type: mrr_at_100
value: 48.605
- type: mrr_at_1000
value: 48.644999999999996
- type: mrr_at_3
value: 45.165
- type: mrr_at_5
value: 46.646
- type: ndcg_at_1
value: 38.735
- type: ndcg_at_10
value: 39.883
- type: ndcg_at_100
value: 46.983000000000004
- type: ndcg_at_1000
value: 50.043000000000006
- type: ndcg_at_3
value: 35.943000000000005
- type: ndcg_at_5
value: 37.119
- type: precision_at_1
value: 38.735
- type: precision_at_10
value: 10.940999999999999
- type: precision_at_100
value: 1.836
- type: precision_at_1000
value: 0.23900000000000002
- type: precision_at_3
value: 23.817
- type: precision_at_5
value: 17.346
- type: recall_at_1
value: 19.8
- type: recall_at_10
value: 47.082
- type: recall_at_100
value: 73.247
- type: recall_at_1000
value: 91.633
- type: recall_at_3
value: 33.201
- type: recall_at_5
value: 38.81
- task:
type: Retrieval
dataset:
type: hotpotqa
name: MTEB HotpotQA
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 38.102999999999994
- type: map_at_10
value: 60.547
- type: map_at_100
value: 61.466
- type: map_at_1000
value: 61.526
- type: map_at_3
value: 56.973
- type: map_at_5
value: 59.244
- type: mrr_at_1
value: 76.205
- type: mrr_at_10
value: 82.816
- type: mrr_at_100
value: 83.002
- type: mrr_at_1000
value: 83.009
- type: mrr_at_3
value: 81.747
- type: mrr_at_5
value: 82.467
- type: ndcg_at_1
value: 76.205
- type: ndcg_at_10
value: 69.15
- type: ndcg_at_100
value: 72.297
- type: ndcg_at_1000
value: 73.443
- type: ndcg_at_3
value: 64.07000000000001
- type: ndcg_at_5
value: 66.96600000000001
- type: precision_at_1
value: 76.205
- type: precision_at_10
value: 14.601
- type: precision_at_100
value: 1.7049999999999998
- type: precision_at_1000
value: 0.186
- type: precision_at_3
value: 41.202
- type: precision_at_5
value: 27.006000000000004
- type: recall_at_1
value: 38.102999999999994
- type: recall_at_10
value: 73.005
- type: recall_at_100
value: 85.253
- type: recall_at_1000
value: 92.795
- type: recall_at_3
value: 61.803
- type: recall_at_5
value: 67.515
- task:
type: Classification
dataset:
type: mteb/imdb
name: MTEB ImdbClassification
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 86.15
- type: ap
value: 80.36282825265391
- type: f1
value: 86.07368510726472
- task:
type: Retrieval
dataset:
type: msmarco
name: MTEB MSMARCO
config: default
split: dev
revision: None
metrics:
- type: map_at_1
value: 22.6
- type: map_at_10
value: 34.887
- type: map_at_100
value: 36.069
- type: map_at_1000
value: 36.115
- type: map_at_3
value: 31.067
- type: map_at_5
value: 33.300000000000004
- type: mrr_at_1
value: 23.238
- type: mrr_at_10
value: 35.47
- type: mrr_at_100
value: 36.599
- type: mrr_at_1000
value: 36.64
- type: mrr_at_3
value: 31.735999999999997
- type: mrr_at_5
value: 33.939
- type: ndcg_at_1
value: 23.252
- type: ndcg_at_10
value: 41.765
- type: ndcg_at_100
value: 47.402
- type: ndcg_at_1000
value: 48.562
- type: ndcg_at_3
value: 34.016999999999996
- type: ndcg_at_5
value: 38.016
- type: precision_at_1
value: 23.252
- type: precision_at_10
value: 6.569
- type: precision_at_100
value: 0.938
- type: precision_at_1000
value: 0.104
- type: precision_at_3
value: 14.479000000000001
- type: precision_at_5
value: 10.722
- type: recall_at_1
value: 22.6
- type: recall_at_10
value: 62.919000000000004
- type: recall_at_100
value: 88.82
- type: recall_at_1000
value: 97.71600000000001
- type: recall_at_3
value: 41.896
- type: recall_at_5
value: 51.537
- task:
type: Classification
dataset:
type: mteb/mtop_domain
name: MTEB MTOPDomainClassification (en)
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 93.69357045143639
- type: f1
value: 93.55489858177597
- task:
type: Classification
dataset:
type: mteb/mtop_intent
name: MTEB MTOPIntentClassification (en)
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 75.31235750114
- type: f1
value: 57.891491963121155
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (en)
config: en
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 73.04303967720243
- type: f1
value: 70.51516022297616
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (en)
config: en
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 77.65299260255549
- type: f1
value: 77.49059766538576
- task:
type: Clustering
dataset:
type: mteb/medrxiv-clustering-p2p
name: MTEB MedrxivClusteringP2P
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: v_measure
value: 31.458906115906597
- task:
type: Clustering
dataset:
type: mteb/medrxiv-clustering-s2s
name: MTEB MedrxivClusteringS2S
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: v_measure
value: 28.9851513122443
- task:
type: Reranking
dataset:
type: mteb/mind_small
name: MTEB MindSmallReranking
config: default
split: test
revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
metrics:
- type: map
value: 31.2916268497217
- type: mrr
value: 32.328276715593816
- task:
type: Retrieval
dataset:
type: nfcorpus
name: MTEB NFCorpus
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 6.3740000000000006
- type: map_at_10
value: 13.089999999999998
- type: map_at_100
value: 16.512
- type: map_at_1000
value: 18.014
- type: map_at_3
value: 9.671000000000001
- type: map_at_5
value: 11.199
- type: mrr_at_1
value: 46.749
- type: mrr_at_10
value: 55.367
- type: mrr_at_100
value: 56.021
- type: mrr_at_1000
value: 56.058
- type: mrr_at_3
value: 53.30200000000001
- type: mrr_at_5
value: 54.773
- type: ndcg_at_1
value: 45.046
- type: ndcg_at_10
value: 35.388999999999996
- type: ndcg_at_100
value: 32.175
- type: ndcg_at_1000
value: 41.018
- type: ndcg_at_3
value: 40.244
- type: ndcg_at_5
value: 38.267
- type: precision_at_1
value: 46.749
- type: precision_at_10
value: 26.563
- type: precision_at_100
value: 8.074
- type: precision_at_1000
value: 2.099
- type: precision_at_3
value: 37.358000000000004
- type: precision_at_5
value: 33.003
- type: recall_at_1
value: 6.3740000000000006
- type: recall_at_10
value: 16.805999999999997
- type: recall_at_100
value: 31.871
- type: recall_at_1000
value: 64.098
- type: recall_at_3
value: 10.383000000000001
- type: recall_at_5
value: 13.166
- task:
type: Retrieval
dataset:
type: nq
name: MTEB NQ
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 34.847
- type: map_at_10
value: 50.532
- type: map_at_100
value: 51.504000000000005
- type: map_at_1000
value: 51.528
- type: map_at_3
value: 46.219
- type: map_at_5
value: 48.868
- type: mrr_at_1
value: 39.137
- type: mrr_at_10
value: 53.157
- type: mrr_at_100
value: 53.839999999999996
- type: mrr_at_1000
value: 53.857
- type: mrr_at_3
value: 49.667
- type: mrr_at_5
value: 51.847
- type: ndcg_at_1
value: 39.108
- type: ndcg_at_10
value: 58.221000000000004
- type: ndcg_at_100
value: 62.021
- type: ndcg_at_1000
value: 62.57
- type: ndcg_at_3
value: 50.27199999999999
- type: ndcg_at_5
value: 54.623999999999995
- type: precision_at_1
value: 39.108
- type: precision_at_10
value: 9.397
- type: precision_at_100
value: 1.1520000000000001
- type: precision_at_1000
value: 0.12
- type: precision_at_3
value: 22.644000000000002
- type: precision_at_5
value: 16.141
- type: recall_at_1
value: 34.847
- type: recall_at_10
value: 78.945
- type: recall_at_100
value: 94.793
- type: recall_at_1000
value: 98.904
- type: recall_at_3
value: 58.56
- type: recall_at_5
value: 68.535
- task:
type: Retrieval
dataset:
type: quora
name: MTEB QuoraRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 68.728
- type: map_at_10
value: 82.537
- type: map_at_100
value: 83.218
- type: map_at_1000
value: 83.238
- type: map_at_3
value: 79.586
- type: map_at_5
value: 81.416
- type: mrr_at_1
value: 79.17999999999999
- type: mrr_at_10
value: 85.79299999999999
- type: mrr_at_100
value: 85.937
- type: mrr_at_1000
value: 85.938
- type: mrr_at_3
value: 84.748
- type: mrr_at_5
value: 85.431
- type: ndcg_at_1
value: 79.17
- type: ndcg_at_10
value: 86.555
- type: ndcg_at_100
value: 88.005
- type: ndcg_at_1000
value: 88.146
- type: ndcg_at_3
value: 83.557
- type: ndcg_at_5
value: 85.152
- type: precision_at_1
value: 79.17
- type: precision_at_10
value: 13.163
- type: precision_at_100
value: 1.52
- type: precision_at_1000
value: 0.156
- type: precision_at_3
value: 36.53
- type: precision_at_5
value: 24.046
- type: recall_at_1
value: 68.728
- type: recall_at_10
value: 94.217
- type: recall_at_100
value: 99.295
- type: recall_at_1000
value: 99.964
- type: recall_at_3
value: 85.646
- type: recall_at_5
value: 90.113
- task:
type: Clustering
dataset:
type: mteb/reddit-clustering
name: MTEB RedditClustering
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: v_measure
value: 56.15680266226348
- task:
type: Clustering
dataset:
type: mteb/reddit-clustering-p2p
name: MTEB RedditClusteringP2P
config: default
split: test
revision: 282350215ef01743dc01b456c7f5241fa8937f16
metrics:
- type: v_measure
value: 63.4318549229047
- task:
type: Retrieval
dataset:
type: scidocs
name: MTEB SCIDOCS
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 4.353
- type: map_at_10
value: 10.956000000000001
- type: map_at_100
value: 12.873999999999999
- type: map_at_1000
value: 13.177
- type: map_at_3
value: 7.854
- type: map_at_5
value: 9.327
- type: mrr_at_1
value: 21.4
- type: mrr_at_10
value: 31.948999999999998
- type: mrr_at_100
value: 33.039
- type: mrr_at_1000
value: 33.106
- type: mrr_at_3
value: 28.449999999999996
- type: mrr_at_5
value: 30.535
- type: ndcg_at_1
value: 21.4
- type: ndcg_at_10
value: 18.694
- type: ndcg_at_100
value: 26.275
- type: ndcg_at_1000
value: 31.836
- type: ndcg_at_3
value: 17.559
- type: ndcg_at_5
value: 15.372
- type: precision_at_1
value: 21.4
- type: precision_at_10
value: 9.790000000000001
- type: precision_at_100
value: 2.0709999999999997
- type: precision_at_1000
value: 0.34099999999999997
- type: precision_at_3
value: 16.467000000000002
- type: precision_at_5
value: 13.54
- type: recall_at_1
value: 4.353
- type: recall_at_10
value: 19.892000000000003
- type: recall_at_100
value: 42.067
- type: recall_at_1000
value: 69.268
- type: recall_at_3
value: 10.042
- type: recall_at_5
value: 13.741999999999999
- task:
type: STS
dataset:
type: mteb/sickr-sts
name: MTEB SICK-R
config: default
split: test
revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
metrics:
- type: cos_sim_pearson
value: 83.75433886279843
- type: cos_sim_spearman
value: 78.29727771767095
- type: euclidean_pearson
value: 80.83057828506621
- type: euclidean_spearman
value: 78.35203149750356
- type: manhattan_pearson
value: 80.7403553891142
- type: manhattan_spearman
value: 78.33670488531051
- task:
type: STS
dataset:
type: mteb/sts12-sts
name: MTEB STS12
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cos_sim_pearson
value: 84.59999465280839
- type: cos_sim_spearman
value: 75.79279003980383
- type: euclidean_pearson
value: 82.29895375956758
- type: euclidean_spearman
value: 77.33856514102094
- type: manhattan_pearson
value: 82.22694214534756
- type: manhattan_spearman
value: 77.3028993008695
- task:
type: STS
dataset:
type: mteb/sts13-sts
name: MTEB STS13
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cos_sim_pearson
value: 83.09296929691297
- type: cos_sim_spearman
value: 83.58056936846941
- type: euclidean_pearson
value: 83.84067483060005
- type: euclidean_spearman
value: 84.45155680480985
- type: manhattan_pearson
value: 83.82353052971942
- type: manhattan_spearman
value: 84.43030567861112
- task:
type: STS
dataset:
type: mteb/sts14-sts
name: MTEB STS14
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cos_sim_pearson
value: 82.74616852320915
- type: cos_sim_spearman
value: 79.948683747966
- type: euclidean_pearson
value: 81.55702283757084
- type: euclidean_spearman
value: 80.1721505114231
- type: manhattan_pearson
value: 81.52251518619441
- type: manhattan_spearman
value: 80.1469800135577
- task:
type: STS
dataset:
type: mteb/sts15-sts
name: MTEB STS15
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cos_sim_pearson
value: 87.97170104226318
- type: cos_sim_spearman
value: 88.82021731518206
- type: euclidean_pearson
value: 87.92950547187615
- type: euclidean_spearman
value: 88.67043634645866
- type: manhattan_pearson
value: 87.90668112827639
- type: manhattan_spearman
value: 88.64471082785317
- task:
type: STS
dataset:
type: mteb/sts16-sts
name: MTEB STS16
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cos_sim_pearson
value: 83.02790375770599
- type: cos_sim_spearman
value: 84.46308496590792
- type: euclidean_pearson
value: 84.29430000414911
- type: euclidean_spearman
value: 84.77298303589936
- type: manhattan_pearson
value: 84.23919291368665
- type: manhattan_spearman
value: 84.75272234871308
- task:
type: STS
dataset:
type: mteb/sts17-crosslingual-sts
name: MTEB STS17 (en-en)
config: en-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 87.62885108477064
- type: cos_sim_spearman
value: 87.58456196391622
- type: euclidean_pearson
value: 88.2602775281007
- type: euclidean_spearman
value: 87.51556278299846
- type: manhattan_pearson
value: 88.11224053672842
- type: manhattan_spearman
value: 87.4336094383095
- task:
type: STS
dataset:
type: mteb/sts22-crosslingual-sts
name: MTEB STS22 (en)
config: en
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_pearson
value: 63.98187965128411
- type: cos_sim_spearman
value: 64.0653163219731
- type: euclidean_pearson
value: 62.30616725924099
- type: euclidean_spearman
value: 61.556971332295916
- type: manhattan_pearson
value: 62.07642330128549
- type: manhattan_spearman
value: 61.155494129828
- task:
type: STS
dataset:
type: mteb/stsbenchmark-sts
name: MTEB STSBenchmark
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cos_sim_pearson
value: 85.6089703921826
- type: cos_sim_spearman
value: 86.52303197250791
- type: euclidean_pearson
value: 85.95801955963246
- type: euclidean_spearman
value: 86.25242424112962
- type: manhattan_pearson
value: 85.88829100470312
- type: manhattan_spearman
value: 86.18742955805165
- task:
type: Reranking
dataset:
type: mteb/scidocs-reranking
name: MTEB SciDocsRR
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 83.02282098487036
- type: mrr
value: 95.05126409538174
- task:
type: Retrieval
dataset:
type: scifact
name: MTEB SciFact
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 55.928
- type: map_at_10
value: 67.308
- type: map_at_100
value: 67.89500000000001
- type: map_at_1000
value: 67.91199999999999
- type: map_at_3
value: 65.091
- type: map_at_5
value: 66.412
- type: mrr_at_1
value: 58.667
- type: mrr_at_10
value: 68.401
- type: mrr_at_100
value: 68.804
- type: mrr_at_1000
value: 68.819
- type: mrr_at_3
value: 66.72200000000001
- type: mrr_at_5
value: 67.72200000000001
- type: ndcg_at_1
value: 58.667
- type: ndcg_at_10
value: 71.944
- type: ndcg_at_100
value: 74.464
- type: ndcg_at_1000
value: 74.82799999999999
- type: ndcg_at_3
value: 68.257
- type: ndcg_at_5
value: 70.10300000000001
- type: precision_at_1
value: 58.667
- type: precision_at_10
value: 9.533
- type: precision_at_100
value: 1.09
- type: precision_at_1000
value: 0.11199999999999999
- type: precision_at_3
value: 27.222
- type: precision_at_5
value: 17.533
- type: recall_at_1
value: 55.928
- type: recall_at_10
value: 84.65
- type: recall_at_100
value: 96.267
- type: recall_at_1000
value: 99
- type: recall_at_3
value: 74.656
- type: recall_at_5
value: 79.489
- task:
type: PairClassification
dataset:
type: mteb/sprintduplicatequestions-pairclassification
name: MTEB SprintDuplicateQuestions
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cos_sim_accuracy
value: 99.79009900990098
- type: cos_sim_ap
value: 94.5795129511524
- type: cos_sim_f1
value: 89.34673366834171
- type: cos_sim_precision
value: 89.79797979797979
- type: cos_sim_recall
value: 88.9
- type: dot_accuracy
value: 99.53465346534654
- type: dot_ap
value: 81.56492504352725
- type: dot_f1
value: 76.33816908454227
- type: dot_precision
value: 76.37637637637637
- type: dot_recall
value: 76.3
- type: euclidean_accuracy
value: 99.78514851485149
- type: euclidean_ap
value: 94.59134620408962
- type: euclidean_f1
value: 88.96484375
- type: euclidean_precision
value: 86.92748091603053
- type: euclidean_recall
value: 91.10000000000001
- type: manhattan_accuracy
value: 99.78415841584159
- type: manhattan_ap
value: 94.5190197328845
- type: manhattan_f1
value: 88.84462151394423
- type: manhattan_precision
value: 88.4920634920635
- type: manhattan_recall
value: 89.2
- type: max_accuracy
value: 99.79009900990098
- type: max_ap
value: 94.59134620408962
- type: max_f1
value: 89.34673366834171
- task:
type: Clustering
dataset:
type: mteb/stackexchange-clustering
name: MTEB StackExchangeClustering
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: v_measure
value: 65.1487505617497
- task:
type: Clustering
dataset:
type: mteb/stackexchange-clustering-p2p
name: MTEB StackExchangeClusteringP2P
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: v_measure
value: 32.502518166001856
- task:
type: Reranking
dataset:
type: mteb/stackoverflowdupquestions-reranking
name: MTEB StackOverflowDupQuestions
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 50.33775480236701
- type: mrr
value: 51.17302223919871
- task:
type: Summarization
dataset:
type: mteb/summeval
name: MTEB SummEval
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cos_sim_pearson
value: 30.561111309808208
- type: cos_sim_spearman
value: 30.2839254379273
- type: dot_pearson
value: 29.560242291401973
- type: dot_spearman
value: 30.51527274679116
- task:
type: Retrieval
dataset:
type: trec-covid
name: MTEB TRECCOVID
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 0.215
- type: map_at_10
value: 1.752
- type: map_at_100
value: 9.258
- type: map_at_1000
value: 23.438
- type: map_at_3
value: 0.6
- type: map_at_5
value: 0.968
- type: mrr_at_1
value: 84
- type: mrr_at_10
value: 91.333
- type: mrr_at_100
value: 91.333
- type: mrr_at_1000
value: 91.333
- type: mrr_at_3
value: 91.333
- type: mrr_at_5
value: 91.333
- type: ndcg_at_1
value: 75
- type: ndcg_at_10
value: 69.596
- type: ndcg_at_100
value: 51.970000000000006
- type: ndcg_at_1000
value: 48.864999999999995
- type: ndcg_at_3
value: 73.92699999999999
- type: ndcg_at_5
value: 73.175
- type: precision_at_1
value: 84
- type: precision_at_10
value: 74
- type: precision_at_100
value: 53.2
- type: precision_at_1000
value: 21.836
- type: precision_at_3
value: 79.333
- type: precision_at_5
value: 78.4
- type: recall_at_1
value: 0.215
- type: recall_at_10
value: 1.9609999999999999
- type: recall_at_100
value: 12.809999999999999
- type: recall_at_1000
value: 46.418
- type: recall_at_3
value: 0.6479999999999999
- type: recall_at_5
value: 1.057
- task:
type: Retrieval
dataset:
type: webis-touche2020
name: MTEB Touche2020
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 3.066
- type: map_at_10
value: 10.508000000000001
- type: map_at_100
value: 16.258
- type: map_at_1000
value: 17.705000000000002
- type: map_at_3
value: 6.157
- type: map_at_5
value: 7.510999999999999
- type: mrr_at_1
value: 34.694
- type: mrr_at_10
value: 48.786
- type: mrr_at_100
value: 49.619
- type: mrr_at_1000
value: 49.619
- type: mrr_at_3
value: 45.918
- type: mrr_at_5
value: 46.837
- type: ndcg_at_1
value: 31.633
- type: ndcg_at_10
value: 26.401999999999997
- type: ndcg_at_100
value: 37.139
- type: ndcg_at_1000
value: 48.012
- type: ndcg_at_3
value: 31.875999999999998
- type: ndcg_at_5
value: 27.383000000000003
- type: precision_at_1
value: 34.694
- type: precision_at_10
value: 22.857
- type: precision_at_100
value: 7.611999999999999
- type: precision_at_1000
value: 1.492
- type: precision_at_3
value: 33.333
- type: precision_at_5
value: 26.122
- type: recall_at_1
value: 3.066
- type: recall_at_10
value: 16.239
- type: recall_at_100
value: 47.29
- type: recall_at_1000
value: 81.137
- type: recall_at_3
value: 7.069
- type: recall_at_5
value: 9.483
- task:
type: Classification
dataset:
type: mteb/toxic_conversations_50k
name: MTEB ToxicConversationsClassification
config: default
split: test
revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
metrics:
- type: accuracy
value: 72.1126
- type: ap
value: 14.710862719285753
- type: f1
value: 55.437808972378846
- task:
type: Classification
dataset:
type: mteb/tweet_sentiment_extraction
name: MTEB TweetSentimentExtractionClassification
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 60.39049235993209
- type: f1
value: 60.69810537250234
- task:
type: Clustering
dataset:
type: mteb/twentynewsgroups-clustering
name: MTEB TwentyNewsgroupsClustering
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: v_measure
value: 48.15576640316866
- task:
type: PairClassification
dataset:
type: mteb/twittersemeval2015-pairclassification
name: MTEB TwitterSemEval2015
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cos_sim_accuracy
value: 86.52917684925792
- type: cos_sim_ap
value: 75.97497873817315
- type: cos_sim_f1
value: 70.01151926276718
- type: cos_sim_precision
value: 67.98409147402435
- type: cos_sim_recall
value: 72.16358839050132
- type: dot_accuracy
value: 82.47004828038385
- type: dot_ap
value: 62.48739894974198
- type: dot_f1
value: 59.13107511045656
- type: dot_precision
value: 55.27765029830197
- type: dot_recall
value: 63.562005277044854
- type: euclidean_accuracy
value: 86.46361089586935
- type: euclidean_ap
value: 75.59282886839452
- type: euclidean_f1
value: 69.6465443945099
- type: euclidean_precision
value: 64.52847175331982
- type: euclidean_recall
value: 75.64643799472296
- type: manhattan_accuracy
value: 86.43380818978363
- type: manhattan_ap
value: 75.5742420974403
- type: manhattan_f1
value: 69.8636926889715
- type: manhattan_precision
value: 65.8644859813084
- type: manhattan_recall
value: 74.37994722955145
- type: max_accuracy
value: 86.52917684925792
- type: max_ap
value: 75.97497873817315
- type: max_f1
value: 70.01151926276718
- task:
type: PairClassification
dataset:
type: mteb/twitterurlcorpus-pairclassification
name: MTEB TwitterURLCorpus
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cos_sim_accuracy
value: 89.29056545193464
- type: cos_sim_ap
value: 86.63028865482376
- type: cos_sim_f1
value: 79.18166458532285
- type: cos_sim_precision
value: 75.70585756426465
- type: cos_sim_recall
value: 82.99199260856174
- type: dot_accuracy
value: 85.23305002522606
- type: dot_ap
value: 76.0482687263196
- type: dot_f1
value: 70.80484330484332
- type: dot_precision
value: 65.86933474688577
- type: dot_recall
value: 76.53988296889437
- type: euclidean_accuracy
value: 89.26145845461248
- type: euclidean_ap
value: 86.54073288416006
- type: euclidean_f1
value: 78.9721371479794
- type: euclidean_precision
value: 76.68649354417525
- type: euclidean_recall
value: 81.39821373575609
- type: manhattan_accuracy
value: 89.22847052431405
- type: manhattan_ap
value: 86.51250729037905
- type: manhattan_f1
value: 78.94601825044894
- type: manhattan_precision
value: 75.32694594027555
- type: manhattan_recall
value: 82.93039728980598
- type: max_accuracy
value: 89.29056545193464
- type: max_ap
value: 86.63028865482376
- type: max_f1
value: 79.18166458532285
language:
- en
license: mit
---
# E5-base-v2
[Text Embeddings by Weakly-Supervised Contrastive Pre-training](https://arxiv.org/pdf/2212.03533.pdf).
Liang Wang, Nan Yang, Xiaolong Huang, Binxing Jiao, Linjun Yang, Daxin Jiang, Rangan Majumder, Furu Wei, arXiv 2022
This model has 12 layers and the embedding size is 768.
## Usage
Below is an example to encode queries and passages from the MS-MARCO passage ranking dataset.
```python
import torch.nn.functional as F
from torch import Tensor
from transformers import AutoTokenizer, AutoModel
def average_pool(last_hidden_states: Tensor,
attention_mask: Tensor) -> Tensor:
last_hidden = last_hidden_states.masked_fill(~attention_mask[..., None].bool(), 0.0)
return last_hidden.sum(dim=1) / attention_mask.sum(dim=1)[..., None]
# Each input text should start with "query: " or "passage: ".
# For tasks other than retrieval, you can simply use the "query: " prefix.
input_texts = ['query: how much protein should a female eat',
'query: summit define',
"passage: As a general guideline, the CDC's average requirement of protein for women ages 19 to 70 is 46 grams per day. But, as you can see from this chart, you'll need to increase that if you're expecting or training for a marathon. Check out the chart below to see how much protein you should be eating each day.",
"passage: Definition of summit for English Language Learners. : 1 the highest point of a mountain : the top of a mountain. : 2 the highest level. : 3 a meeting or series of meetings between the leaders of two or more governments."]
tokenizer = AutoTokenizer.from_pretrained('intfloat/e5-base-v2')
model = AutoModel.from_pretrained('intfloat/e5-base-v2')
# Tokenize the input texts
batch_dict = tokenizer(input_texts, max_length=512, padding=True, truncation=True, return_tensors='pt')
outputs = model(**batch_dict)
embeddings = average_pool(outputs.last_hidden_state, batch_dict['attention_mask'])
# normalize embeddings
embeddings = F.normalize(embeddings, p=2, dim=1)
scores = (embeddings[:2] @ embeddings[2:].T) * 100
print(scores.tolist())
```
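The `average_pool` helper can be sanity-checked in isolation on dummy tensors; the shapes below are illustrative (any batch / sequence / hidden sizes work):

```python
import torch
from torch import Tensor


def average_pool(last_hidden_states: Tensor,
                 attention_mask: Tensor) -> Tensor:
    # Zero out padding positions, then average over the real tokens only.
    last_hidden = last_hidden_states.masked_fill(~attention_mask[..., None].bool(), 0.0)
    return last_hidden.sum(dim=1) / attention_mask.sum(dim=1)[..., None]


hidden = torch.ones(2, 4, 8)            # (batch, seq_len, hidden_size)
mask = torch.tensor([[1, 1, 0, 0],      # first sequence has 2 real tokens
                     [1, 1, 1, 1]])     # second has 4
pooled = average_pool(hidden, mask)
print(pooled.shape)  # torch.Size([2, 8])
```

Because every hidden state here is 1.0, the pooled vectors are all ones regardless of how many positions are masked, confirming that padding tokens do not dilute the mean.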
## Training Details
Please refer to our paper at [https://arxiv.org/pdf/2212.03533.pdf](https://arxiv.org/pdf/2212.03533.pdf).
## Benchmark Evaluation
Check out [unilm/e5](https://github.com/microsoft/unilm/tree/master/e5) to reproduce evaluation results
on the [BEIR](https://arxiv.org/abs/2104.08663) and [MTEB benchmark](https://arxiv.org/abs/2210.07316).
## Support for Sentence Transformers
Below is an example for usage with sentence_transformers.
```python
from sentence_transformers import SentenceTransformer
model = SentenceTransformer('intfloat/e5-base-v2')
input_texts = [
'query: how much protein should a female eat',
'query: summit define',
"passage: As a general guideline, the CDC's average requirement of protein for women ages 19 to 70 is 46 grams per day. But, as you can see from this chart, you'll need to increase that if you're expecting or training for a marathon. Check out the chart below to see how much protein you should be eating each day.",
"passage: Definition of summit for English Language Learners. : 1 the highest point of a mountain : the top of a mountain. : 2 the highest level. : 3 a meeting or series of meetings between the leaders of two or more governments."
]
embeddings = model.encode(input_texts, normalize_embeddings=True)
```
Package requirements:
`pip install sentence_transformers~=2.2.2`
Contributors: [michaelfeil](https://huggingface.co/michaelfeil)
## FAQ
**1. Do I need to add the prefix "query: " and "passage: " to input texts?**
Yes, this is how the model was trained; otherwise you will see a performance degradation.
Here are some rules of thumb:
- Use "query: " and "passage: " correspondingly for asymmetric tasks such as passage retrieval in open QA, ad-hoc information retrieval.
- Use "query: " prefix for symmetric tasks such as semantic similarity, paraphrase retrieval.
- Use "query: " prefix if you want to use embeddings as features, such as linear probing classification, clustering.
**2. Why are my reproduced results slightly different from those reported in the model card?**
Different versions of `transformers` and `pytorch` could cause negligible but non-zero performance differences.
**3. Why do the cosine similarity scores concentrate in the 0.7 to 1.0 range?**
This is a known and expected behavior, as we use a low temperature of 0.01 for the InfoNCE contrastive loss.
For text embedding tasks like text retrieval or semantic similarity,
what matters is the relative order of the scores instead of the absolute values,
so this should not be an issue.
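As a toy illustration, scores clustered in a narrow band still induce an unambiguous ranking, which is all that retrieval metrics such as nDCG or MRR consume:

```python
# Cosine scores from a hypothetical query against three documents.
scores = {"doc_a": 0.92, "doc_b": 0.78, "doc_c": 0.85}

# Ranking depends only on the relative order, not the absolute values.
ranked = sorted(scores, key=scores.get, reverse=True)
print(ranked)  # ['doc_a', 'doc_c', 'doc_b']
```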
## Citation
If you find our paper or models helpful, please consider citing as follows:
```
@article{wang2022text,
title={Text Embeddings by Weakly-Supervised Contrastive Pre-training},
author={Wang, Liang and Yang, Nan and Huang, Xiaolong and Jiao, Binxing and Yang, Linjun and Jiang, Daxin and Majumder, Rangan and Wei, Furu},
journal={arXiv preprint arXiv:2212.03533},
year={2022}
}
```
## Limitations
This model only works for English texts. Long texts will be truncated to at most 512 tokens.
---
tags:
- image-classification
- timm
library_name: timm
license: unknown
datasets:
- imagenet-1k
---
# Model card for res2net101_26w_4s.in1k
A Res2Net (Multi-Scale ResNet) image classification model. Trained on ImageNet-1k by paper authors.
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 45.2
- GMACs: 8.1
- Activations (M): 18.4
- Image size: 224 x 224
- **Papers:**
- Res2Net: A New Multi-scale Backbone Architecture: https://arxiv.org/abs/1904.01169
- **Dataset:** ImageNet-1k
- **Original:** https://github.com/gasvn/Res2Net/
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('res2net101_26w_4s.in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'res2net101_26w_4s.in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g.:
# torch.Size([1, 64, 112, 112])
# torch.Size([1, 256, 56, 56])
# torch.Size([1, 512, 28, 28])
# torch.Size([1, 1024, 14, 14])
# torch.Size([1, 2048, 7, 7])
print(o.shape)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'res2net101_26w_4s.in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 2048, 7, 7) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
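Pooled embeddings like the ones above can be compared with cosine similarity, e.g. for image retrieval or deduplication. A minimal sketch on dummy tensors (2048 matches this backbone's `num_features`; real vectors would come from the snippet above):

```python
import torch
import torch.nn.functional as F

emb_a = torch.randn(1, 2048)   # stand-in for one pooled image embedding
emb_b = emb_a.clone()          # identical image -> similarity 1.0
emb_c = torch.randn(1, 2048)   # unrelated image

print(F.cosine_similarity(emb_a, emb_b).item())  # 1.0 (up to float error)
print(F.cosine_similarity(emb_a, emb_c).item())  # near 0 for random vectors
```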
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@article{gao2019res2net,
title={Res2Net: A New Multi-scale Backbone Architecture},
author={Gao, Shang-Hua and Cheng, Ming-Ming and Zhao, Kai and Zhang, Xin-Yu and Yang, Ming-Hsuan and Torr, Philip},
journal={IEEE TPAMI},
doi={10.1109/TPAMI.2019.2938758},
}
```
0.0224609375,
-0.0267486572265625,
-0.009033203125,
-0.0226287841796875,
-0.008331298828125,
0.006443023681640625,
-0.0270233154296875,
0.047943115234375,
-0.0242767333984375,
-0.008453369140625,
-0.0751953125,
-0.01232147216796875,
0.0408935546875,
-0.0096893310546875,
0.06353759765625,
0.08477783203125,
-0.040130615234375,
0.0031032562255859375,
-0.02960205078125,
-0.02899169921875,
-0.03564453125,
0.038909912109375,
-0.02899169921875,
-0.031890869140625,
0.06683349609375,
-0.01096343994140625,
0.01136016845703125,
0.049407958984375,
0.0159912109375,
-0.006214141845703125,
0.044830322265625,
0.037933349609375,
0.01444244384765625,
0.04144287109375,
-0.0810546875,
-0.0127716064453125,
-0.080322265625,
-0.047210693359375,
-0.035125732421875,
-0.056976318359375,
-0.046905517578125,
-0.0303497314453125,
0.030792236328125,
0.018341064453125,
-0.037445068359375,
0.04400634765625,
-0.06317138671875,
0.0002841949462890625,
0.0477294921875,
0.04150390625,
-0.0357666015625,
0.0249786376953125,
-0.0175323486328125,
-0.00417327880859375,
-0.050262451171875,
-0.01485443115234375,
0.0823974609375,
0.04327392578125,
0.042694091796875,
-0.0030879974365234375,
0.045074462890625,
-0.01806640625,
0.01525115966796875,
-0.041351318359375,
0.03826904296875,
-0.012664794921875,
-0.0302886962890625,
-0.0183868408203125,
-0.025238037109375,
-0.0771484375,
0.01346588134765625,
-0.023895263671875,
-0.057952880859375,
0.0121307373046875,
0.010711669921875,
-0.017181396484375,
0.06939697265625,
-0.056182861328125,
0.06805419921875,
-0.0081939697265625,
-0.031890869140625,
0.0009474754333496094,
-0.050445556640625,
0.024200439453125,
0.00870513916015625,
-0.0124664306640625,
-0.00504302978515625,
0.01010894775390625,
0.08551025390625,
-0.042938232421875,
0.0673828125,
-0.0362548828125,
0.0292510986328125,
0.041778564453125,
-0.01230621337890625,
0.0250396728515625,
-0.003177642822265625,
-0.0154571533203125,
0.03143310546875,
-0.0034999847412109375,
-0.03765869140625,
-0.040283203125,
0.043670654296875,
-0.07427978515625,
-0.02056884765625,
-0.0216827392578125,
-0.029449462890625,
0.0122833251953125,
0.00785064697265625,
0.04034423828125,
0.06036376953125,
0.0269927978515625,
0.0304412841796875,
0.041900634765625,
-0.034698486328125,
0.03472900390625,
-0.009765625,
-0.01235198974609375,
-0.04168701171875,
0.060577392578125,
0.0176239013671875,
0.012725830078125,
0.00476837158203125,
0.015411376953125,
-0.038482666015625,
-0.039093017578125,
-0.033050537109375,
0.03045654296875,
-0.0406494140625,
-0.0396728515625,
-0.04571533203125,
-0.036224365234375,
-0.03826904296875,
-0.0016565322875976562,
-0.045623779296875,
-0.0294952392578125,
-0.0273284912109375,
0.0171661376953125,
0.053131103515625,
0.044647216796875,
-0.01473236083984375,
0.040863037109375,
-0.042083740234375,
0.00638580322265625,
0.01010894775390625,
0.037139892578125,
0.00004589557647705078,
-0.07550048828125,
-0.01503753662109375,
-0.006069183349609375,
-0.0288543701171875,
-0.0479736328125,
0.032928466796875,
0.0108795166015625,
0.032928466796875,
0.0287322998046875,
-0.01398468017578125,
0.053924560546875,
-0.0014142990112304688,
0.04547119140625,
0.033905029296875,
-0.036895751953125,
0.044036865234375,
0.0034732818603515625,
0.00836944580078125,
0.0025730133056640625,
0.0186920166015625,
-0.023162841796875,
0.0044403076171875,
-0.06854248046875,
-0.05108642578125,
0.0721435546875,
0.00417327880859375,
-0.0004458427429199219,
0.0274505615234375,
0.059417724609375,
-0.004871368408203125,
-0.004077911376953125,
-0.052764892578125,
-0.033782958984375,
-0.0284576416015625,
-0.01324462890625,
0.00928497314453125,
-0.01503753662109375,
-0.003143310546875,
-0.041900634765625,
0.047149658203125,
-0.007648468017578125,
0.05938720703125,
0.0192413330078125,
0.01203155517578125,
0.00020110607147216797,
-0.034515380859375,
0.038909912109375,
0.0158843994140625,
-0.0202178955078125,
0.005706787109375,
0.0224609375,
-0.046173095703125,
0.0075836181640625,
-0.0010986328125,
0.00847625732421875,
-0.009063720703125,
0.044036865234375,
0.07012939453125,
-0.00011527538299560547,
0.013824462890625,
0.0266571044921875,
-0.00710296630859375,
-0.0352783203125,
-0.0199737548828125,
0.007144927978515625,
0.004215240478515625,
0.03070068359375,
0.0158843994140625,
0.0299835205078125,
-0.0172271728515625,
-0.0164642333984375,
0.024566650390625,
0.0333251953125,
-0.02008056640625,
-0.029266357421875,
0.050628662109375,
-0.00989532470703125,
-0.01337432861328125,
0.0780029296875,
-0.0052490234375,
-0.033721923828125,
0.0921630859375,
0.040557861328125,
0.06280517578125,
-0.00708770751953125,
-0.002231597900390625,
0.07379150390625,
0.0167236328125,
-0.005298614501953125,
0.01432037353515625,
0.0223846435546875,
-0.0592041015625,
0.00876617431640625,
-0.049224853515625,
0.010223388671875,
0.0367431640625,
-0.048675537109375,
0.0236053466796875,
-0.0574951171875,
-0.032745361328125,
0.0104217529296875,
0.028228759765625,
-0.062469482421875,
0.01995849609375,
-0.005924224853515625,
0.0712890625,
-0.059814453125,
0.0653076171875,
0.06512451171875,
-0.038482666015625,
-0.086669921875,
-0.00797271728515625,
0.01110076904296875,
-0.07525634765625,
0.0484619140625,
0.03753662109375,
0.0172882080078125,
0.00643157958984375,
-0.044586181640625,
-0.0537109375,
0.10931396484375,
0.0303802490234375,
-0.0035991668701171875,
0.0188751220703125,
-0.0020294189453125,
0.019317626953125,
-0.035797119140625,
0.042388916015625,
0.01403045654296875,
0.034271240234375,
0.024383544921875,
-0.056915283203125,
0.0189666748046875,
-0.0196075439453125,
0.0132904052734375,
0.016387939453125,
-0.0711669921875,
0.07379150390625,
-0.0439453125,
-0.0066986083984375,
0.00279998779296875,
0.051300048828125,
0.01995849609375,
0.01080322265625,
0.0401611328125,
0.06573486328125,
0.044769287109375,
-0.0250396728515625,
0.05877685546875,
-0.0033740997314453125,
0.05206298828125,
0.046905517578125,
0.021636962890625,
0.044891357421875,
0.017181396484375,
-0.020355224609375,
0.033721923828125,
0.087646484375,
-0.0295867919921875,
0.023193359375,
0.0213470458984375,
0.0005445480346679688,
-0.0057525634765625,
0.00543975830078125,
-0.039459228515625,
0.042999267578125,
0.0171966552734375,
-0.037139892578125,
-0.0228118896484375,
0.00531005859375,
0.0022525787353515625,
-0.02008056640625,
-0.01125335693359375,
0.035430908203125,
-0.001232147216796875,
-0.0236053466796875,
0.061920166015625,
0.0126190185546875,
0.06103515625,
-0.0238494873046875,
0.0001678466796875,
-0.02587890625,
0.0174713134765625,
-0.0289764404296875,
-0.06890869140625,
0.0297088623046875,
-0.020416259765625,
-0.0033893585205078125,
0.0028972625732421875,
0.046539306640625,
-0.0247344970703125,
-0.0285797119140625,
0.0191802978515625,
0.0157928466796875,
0.049407958984375,
0.007183074951171875,
-0.0960693359375,
0.01282501220703125,
0.00847625732421875,
-0.0511474609375,
0.031585693359375,
0.03131103515625,
0.002437591552734375,
0.057891845703125,
0.04315185546875,
-0.00665283203125,
0.0017366409301757812,
-0.0175323486328125,
0.05743408203125,
-0.03466796875,
-0.0175323486328125,
-0.057342529296875,
0.04742431640625,
-0.01361846923828125,
-0.044403076171875,
0.032562255859375,
0.05145263671875,
0.056396484375,
0.0017137527465820312,
0.035003662109375,
-0.022430419921875,
-0.0010929107666015625,
-0.028411865234375,
0.05419921875,
-0.053131103515625,
-0.00506591796875,
0.004238128662109375,
-0.051513671875,
-0.0208282470703125,
0.0491943359375,
-0.011505126953125,
0.027923583984375,
0.040008544921875,
0.07568359375,
-0.0244903564453125,
-0.022613525390625,
0.00878143310546875,
0.007129669189453125,
0.006847381591796875,
0.03350830078125,
0.0129852294921875,
-0.06488037109375,
0.0259552001953125,
-0.04534912109375,
-0.017608642578125,
-0.007415771484375,
-0.0438232421875,
-0.0760498046875,
-0.06915283203125,
-0.04010009765625,
-0.0595703125,
-0.0251007080078125,
0.063232421875,
0.0850830078125,
-0.052581787109375,
-0.009552001953125,
0.00811767578125,
0.012542724609375,
-0.01418304443359375,
-0.01552581787109375,
0.054412841796875,
-0.014892578125,
-0.05816650390625,
-0.0236663818359375,
0.0010633468627929688,
0.026580810546875,
-0.00506591796875,
-0.0176239013671875,
-0.01152801513671875,
-0.0211944580078125,
0.01293182373046875,
0.0269622802734375,
-0.05352783203125,
-0.0208740234375,
-0.02264404296875,
-0.0147857666015625,
0.023223876953125,
0.0263824462890625,
-0.036712646484375,
0.0159454345703125,
0.03350830078125,
0.031524658203125,
0.055084228515625,
-0.0210723876953125,
-0.00441741943359375,
-0.06329345703125,
0.043853759765625,
-0.0223541259765625,
0.034271240234375,
0.0249786376953125,
-0.0274505615234375,
0.046966552734375,
0.039031982421875,
-0.032135009765625,
-0.06182861328125,
-0.00554656982421875,
-0.07342529296875,
-0.01169586181640625,
0.06549072265625,
-0.03515625,
-0.03802490234375,
0.0328369140625,
0.0008540153503417969,
0.046630859375,
-0.0023956298828125,
0.035003662109375,
0.0158233642578125,
-0.0168609619140625,
-0.04705810546875,
-0.037322998046875,
0.03277587890625,
0.00447845458984375,
-0.050262451171875,
-0.032318115234375,
-0.007354736328125,
0.056121826171875,
0.023040771484375,
0.031494140625,
-0.0125732421875,
0.0022754669189453125,
0.015167236328125,
0.041748046875,
-0.034576416015625,
-0.0091705322265625,
-0.0223236083984375,
0.008453369140625,
-0.01132965087890625,
-0.052154541015625
]
] |
timm/ese_vovnet19b_dw.ra_in1k | 2023-04-21T23:12:00.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"arxiv:2110.00476",
"arxiv:1904.09730",
"arxiv:1911.06667",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/ese_vovnet19b_dw.ra_in1k | 0 | 36,879 | timm | 2023-04-21T23:11:53 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
datasets:
- imagenet-1k
---
# Model card for ese_vovnet19b_dw.ra_in1k
A VoVNet-v2 image classification model. Pretrained on ImageNet-1k in `timm` by Ross Wightman using the RandAugment (`RA`) recipe. Related to the `B` recipe in [ResNet Strikes Back](https://arxiv.org/abs/2110.00476).
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 6.5
- GMACs: 1.3
- Activations (M): 8.2
- Image size: train = 224 x 224, test = 288 x 288
- **Papers:**
- An Energy and GPU-Computation Efficient Backbone Network: https://arxiv.org/abs/1904.09730
- CenterMask : Real-Time Anchor-Free Instance Segmentation: https://arxiv.org/abs/1911.06667
- ResNet strikes back: An improved training procedure in timm: https://arxiv.org/abs/2110.00476
- **Dataset:** ImageNet-1k
- **Original:** https://github.com/huggingface/pytorch-image-models
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('ese_vovnet19b_dw.ra_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'ese_vovnet19b_dw.ra_in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g.:
# torch.Size([1, 64, 112, 112])
# torch.Size([1, 256, 56, 56])
# torch.Size([1, 512, 28, 28])
# torch.Size([1, 768, 14, 14])
# torch.Size([1, 1024, 7, 7])
print(o.shape)
```
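As a quick sanity check on the shapes printed above: the five feature maps correspond to spatial reductions of 2, 4, 8, 16, and 32 relative to the 224 x 224 input. A minimal pure-Python sketch (no `timm` required; the reduction factors are the standard ones for a 5-stage backbone, assumed here):

```python
# Spatial size of each feature map for a 224x224 input, given the
# standard per-stage reduction factors of a 5-stage backbone.
img_size = 224
reductions = [2, 4, 8, 16, 32]
sizes = [img_size // r for r in reductions]
print(sizes)  # [112, 56, 28, 14, 7]
```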
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'ese_vovnet19b_dw.ra_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 1024, 7, 7) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
## Citation
```bibtex
@inproceedings{lee2019energy,
title = {An Energy and GPU-Computation Efficient Backbone Network for Real-Time Object Detection},
author = {Lee, Youngwan and Hwang, Joong-won and Lee, Sangrok and Bae, Yuseok and Park, Jongyoul},
booktitle = {Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition Workshops},
year = {2019}
}
```
```bibtex
@inproceedings{lee2019centermask,
title={CenterMask: Real-Time Anchor-Free Instance Segmentation},
author={Lee, Youngwan and Park, Jongyoul},
booktitle={CVPR},
year={2020}
}
```
```bibtex
@inproceedings{wightman2021resnet,
title={ResNet strikes back: An improved training procedure in timm},
author={Wightman, Ross and Touvron, Hugo and Jegou, Herve},
booktitle={NeurIPS 2021 Workshop on ImageNet: Past, Present, and Future}
}
```
| 4,400 | [
[
-0.037139892578125,
-0.03948974609375,
0.004077911376953125,
0.00316619873046875,
-0.02984619140625,
-0.0271148681640625,
-0.0123291015625,
-0.032501220703125,
0.0185546875,
0.038482666015625,
-0.037841796875,
-0.051605224609375,
-0.045562744140625,
-0.007213592529296875,
-0.01222991943359375,
0.059906005859375,
0.002086639404296875,
-0.0034465789794921875,
-0.019622802734375,
-0.040252685546875,
-0.0190582275390625,
-0.03131103515625,
-0.07073974609375,
-0.032318115234375,
0.025634765625,
0.016937255859375,
0.0242767333984375,
0.057159423828125,
0.043975830078125,
0.035675048828125,
-0.00717926025390625,
0.0077362060546875,
-0.01538848876953125,
-0.0115203857421875,
0.025970458984375,
-0.044586181640625,
-0.032073974609375,
0.01155853271484375,
0.049163818359375,
0.017730712890625,
-0.0025043487548828125,
0.0247802734375,
0.00439453125,
0.03863525390625,
-0.01971435546875,
0.004688262939453125,
-0.0367431640625,
0.018951416015625,
-0.0164642333984375,
0.00519561767578125,
-0.02789306640625,
-0.0212860107421875,
0.015106201171875,
-0.044769287109375,
0.039093017578125,
0.00122833251953125,
0.110595703125,
0.020233154296875,
0.0022068023681640625,
0.005863189697265625,
-0.014678955078125,
0.07574462890625,
-0.056671142578125,
0.0178985595703125,
0.0194244384765625,
0.0183563232421875,
-0.00548553466796875,
-0.074462890625,
-0.039459228515625,
-0.0062408447265625,
-0.00408935546875,
0.0004906654357910156,
-0.0134429931640625,
-0.004299163818359375,
0.02294921875,
0.0178375244140625,
-0.02874755859375,
0.0173187255859375,
-0.040740966796875,
-0.01276397705078125,
0.039886474609375,
0.000606536865234375,
0.027496337890625,
-0.0188751220703125,
-0.042510986328125,
-0.037078857421875,
-0.030303955078125,
0.0192718505859375,
0.0240478515625,
0.01666259765625,
-0.045013427734375,
0.0254058837890625,
0.01149749755859375,
0.04718017578125,
0.001117706298828125,
-0.0238800048828125,
0.04351806640625,
0.006046295166015625,
-0.03173828125,
-0.007488250732421875,
0.0714111328125,
0.0285797119140625,
0.017852783203125,
0.01482391357421875,
-0.003795623779296875,
-0.026123046875,
0.004669189453125,
-0.08563232421875,
-0.023406982421875,
0.0236053466796875,
-0.047637939453125,
-0.03369140625,
0.019317626953125,
-0.048553466796875,
-0.004459381103515625,
-0.005130767822265625,
0.032928466796875,
-0.03948974609375,
-0.03607177734375,
0.01361846923828125,
-0.020477294921875,
0.0386962890625,
0.0140533447265625,
-0.040252685546875,
0.01158905029296875,
0.0286712646484375,
0.07586669921875,
0.005992889404296875,
-0.028106689453125,
-0.0216827392578125,
-0.01953125,
-0.01389312744140625,
0.047607421875,
-0.01197052001953125,
-0.01380157470703125,
-0.0240478515625,
0.029052734375,
-0.00919342041015625,
-0.055328369140625,
0.016693115234375,
-0.012847900390625,
0.0185394287109375,
-0.01033782958984375,
-0.0198822021484375,
-0.046539306640625,
0.0259857177734375,
-0.0308074951171875,
0.10015869140625,
0.0153350830078125,
-0.06195068359375,
0.0264129638671875,
-0.037109375,
-0.0133209228515625,
-0.016204833984375,
-0.0108184814453125,
-0.09063720703125,
-0.00530242919921875,
0.02276611328125,
0.055267333984375,
-0.0216064453125,
-0.00716400146484375,
-0.044647216796875,
-0.01474761962890625,
0.0225830078125,
-0.00511932373046875,
0.076171875,
0.00820159912109375,
-0.03424072265625,
0.012054443359375,
-0.056427001953125,
0.0186004638671875,
0.04302978515625,
-0.01505279541015625,
-0.0112762451171875,
-0.03424072265625,
0.0108795166015625,
0.0151519775390625,
0.0053558349609375,
-0.0457763671875,
0.0189208984375,
-0.01073455810546875,
0.02886962890625,
0.050872802734375,
-0.0013113021850585938,
0.018096923828125,
-0.0253448486328125,
0.0182952880859375,
0.0189361572265625,
0.024169921875,
-0.00444793701171875,
-0.0411376953125,
-0.06402587890625,
-0.039306640625,
0.020477294921875,
0.033233642578125,
-0.0379638671875,
0.043487548828125,
-0.00566864013671875,
-0.05499267578125,
-0.028778076171875,
-0.006290435791015625,
0.0423583984375,
0.03948974609375,
0.03387451171875,
-0.0361328125,
-0.046905517578125,
-0.06927490234375,
0.00446319580078125,
0.00600433349609375,
0.0009832382202148438,
0.02703857421875,
0.0479736328125,
-0.004070281982421875,
0.05340576171875,
-0.038299560546875,
-0.0232696533203125,
-0.0194091796875,
0.006927490234375,
0.019317626953125,
0.0633544921875,
0.0670166015625,
-0.05255126953125,
-0.0340576171875,
-0.00977325439453125,
-0.0816650390625,
0.01262664794921875,
-0.00862884521484375,
-0.012847900390625,
0.01273345947265625,
0.0212860107421875,
-0.02960205078125,
0.047119140625,
0.00908660888671875,
-0.0159759521484375,
0.0285797119140625,
-0.0185089111328125,
0.01474761962890625,
-0.0869140625,
-0.0012502670288085938,
0.030364990234375,
-0.003101348876953125,
-0.03399658203125,
0.00395965576171875,
0.00035071372985839844,
-0.00673675537109375,
-0.049224853515625,
0.047454833984375,
-0.050262451171875,
-0.0225677490234375,
-0.0087890625,
-0.0148468017578125,
0.002849578857421875,
0.056793212890625,
0.0031642913818359375,
0.0296478271484375,
0.065673828125,
-0.0428466796875,
0.0308990478515625,
0.0251617431640625,
-0.0165863037109375,
0.039703369140625,
-0.05224609375,
0.0252685546875,
-0.006927490234375,
0.01061248779296875,
-0.0660400390625,
-0.01540374755859375,
0.0204925537109375,
-0.039520263671875,
0.048919677734375,
-0.040008544921875,
-0.017913818359375,
-0.041290283203125,
-0.034698486328125,
0.033294677734375,
0.04901123046875,
-0.057159423828125,
0.0323486328125,
0.0196533203125,
0.0277557373046875,
-0.050537109375,
-0.06317138671875,
-0.0177154541015625,
-0.0302581787109375,
-0.048583984375,
0.026519775390625,
0.0131072998046875,
0.00514984130859375,
0.017730712890625,
-0.00676727294921875,
-0.0109100341796875,
-0.0025920867919921875,
0.038726806640625,
0.0239105224609375,
-0.0230560302734375,
-0.014892578125,
-0.0238800048828125,
-0.01201629638671875,
-0.006305694580078125,
-0.0223846435546875,
0.04815673828125,
-0.02069091796875,
-0.0255126953125,
-0.06646728515625,
-0.0037555694580078125,
0.036956787109375,
-0.0139923095703125,
0.060211181640625,
0.08294677734375,
-0.04217529296875,
-0.006587982177734375,
-0.037078857421875,
-0.025604248046875,
-0.036590576171875,
0.042083740234375,
-0.0312347412109375,
-0.030487060546875,
0.060516357421875,
0.0009207725524902344,
0.008392333984375,
0.0501708984375,
0.0184478759765625,
-0.0031681060791015625,
0.04827880859375,
0.04705810546875,
0.018463134765625,
0.057403564453125,
-0.0787353515625,
-0.01125335693359375,
-0.07366943359375,
-0.04412841796875,
-0.03485107421875,
-0.03851318359375,
-0.0318603515625,
-0.030426025390625,
0.032196044921875,
0.0181427001953125,
-0.0240478515625,
0.039520263671875,
-0.06976318359375,
0.0118560791015625,
0.0478515625,
0.043426513671875,
-0.0321044921875,
0.0269775390625,
-0.016021728515625,
-0.006183624267578125,
-0.053924560546875,
-0.0155487060546875,
0.0809326171875,
0.032257080078125,
0.041900634765625,
-0.0190582275390625,
0.059295654296875,
-0.01275634765625,
0.0218048095703125,
-0.045562744140625,
0.053466796875,
-0.01202392578125,
-0.03271484375,
-0.0218963623046875,
-0.0188751220703125,
-0.07952880859375,
0.016204833984375,
-0.0216217041015625,
-0.0540771484375,
0.013275146484375,
0.0082855224609375,
-0.0172576904296875,
0.06463623046875,
-0.056732177734375,
0.06878662109375,
-0.0133819580078125,
-0.0265045166015625,
0.00023806095123291016,
-0.038482666015625,
0.01261138916015625,
0.0185089111328125,
-0.020263671875,
-0.0161590576171875,
0.021087646484375,
0.083984375,
-0.048370361328125,
0.06597900390625,
-0.040191650390625,
0.0316162109375,
0.04718017578125,
-0.02484130859375,
0.0304718017578125,
-0.00765228271484375,
-0.0093536376953125,
0.0302276611328125,
-0.00930023193359375,
-0.041290283203125,
-0.037353515625,
0.038116455078125,
-0.07122802734375,
-0.0276031494140625,
-0.0290985107421875,
-0.021759033203125,
0.02093505859375,
0.01006317138671875,
0.0462646484375,
0.0587158203125,
0.01247406005859375,
0.032196044921875,
0.051666259765625,
-0.03961181640625,
0.037750244140625,
0.00007265806198120117,
-0.0070037841796875,
-0.034149169921875,
0.06317138671875,
0.0196075439453125,
0.004779815673828125,
0.0145263671875,
0.00946044921875,
-0.0267333984375,
-0.0457763671875,
-0.02593994140625,
0.0285186767578125,
-0.045318603515625,
-0.0350341796875,
-0.0478515625,
-0.036956787109375,
-0.0283355712890625,
-0.01361846923828125,
-0.04339599609375,
-0.0183258056640625,
-0.032501220703125,
0.006893157958984375,
0.056732177734375,
0.042572021484375,
-0.0102081298828125,
0.04302978515625,
-0.03741455078125,
0.012664794921875,
0.0205535888671875,
0.026947021484375,
-0.005046844482421875,
-0.058074951171875,
-0.014617919921875,
-0.00975799560546875,
-0.0285186767578125,
-0.053802490234375,
0.03436279296875,
0.00820159912109375,
0.04779052734375,
0.02923583984375,
-0.01654052734375,
0.05755615234375,
-0.0036468505859375,
0.046295166015625,
0.03558349609375,
-0.042816162109375,
0.04595947265625,
-0.0023784637451171875,
0.00775909423828125,
0.00447845458984375,
0.01271820068359375,
-0.021209716796875,
-0.0012645721435546875,
-0.07061767578125,
-0.058685302734375,
0.06903076171875,
0.004062652587890625,
-0.010284423828125,
0.0163116455078125,
0.05633544921875,
-0.002407073974609375,
-0.00446319580078125,
-0.055572509765625,
-0.037506103515625,
-0.026275634765625,
-0.0167388916015625,
0.01145172119140625,
-0.0115814208984375,
0.0005083084106445312,
-0.04473876953125,
0.056976318359375,
0.00202178955078125,
0.060516357421875,
0.0264739990234375,
0.0008387565612792969,
-0.0004215240478515625,
-0.0247039794921875,
0.037353515625,
0.0185546875,
-0.0272369384765625,
-0.0011386871337890625,
0.0218963623046875,
-0.049224853515625,
0.01120758056640625,
0.002735137939453125,
-0.004718780517578125,
0.0007448196411132812,
0.033782958984375,
0.072265625,
0.005214691162109375,
0.0077972412109375,
0.030364990234375,
-0.006862640380859375,
-0.0328369140625,
-0.0285797119140625,
0.0149383544921875,
-0.01271820068359375,
0.0291595458984375,
0.02728271484375,
0.0256500244140625,
-0.0214996337890625,
-0.01495361328125,
0.0257110595703125,
0.037078857421875,
-0.02593994140625,
-0.025787353515625,
0.05377197265625,
-0.00997161865234375,
-0.019989013671875,
0.068115234375,
-0.01042938232421875,
-0.03521728515625,
0.093017578125,
0.0391845703125,
0.0810546875,
-0.006771087646484375,
0.0035648345947265625,
0.06549072265625,
0.0168304443359375,
-0.00510406494140625,
0.014373779296875,
0.01556396484375,
-0.061859130859375,
0.000293731689453125,
-0.043975830078125,
0.004772186279296875,
0.03607177734375,
-0.05169677734375,
0.02734375,
-0.05438232421875,
-0.0305023193359375,
0.007793426513671875,
0.0164031982421875,
-0.0736083984375,
0.0190582275390625,
-0.00616455078125,
0.05938720703125,
-0.0594482421875,
0.063720703125,
0.07061767578125,
-0.03570556640625,
-0.072265625,
-0.01081085205078125,
0.0022907257080078125,
-0.0784912109375,
0.041229248046875,
0.035552978515625,
0.00760650634765625,
0.00499725341796875,
-0.054840087890625,
-0.050262451171875,
0.10443115234375,
0.0293426513671875,
-0.00913238525390625,
0.0207672119140625,
-0.00908660888671875,
0.01416778564453125,
-0.0301666259765625,
0.0297393798828125,
0.00928497314453125,
0.0302276611328125,
0.013641357421875,
-0.04156494140625,
0.0119476318359375,
-0.018157958984375,
0.01151275634765625,
0.0185089111328125,
-0.06414794921875,
0.0687255859375,
-0.033477783203125,
-0.0151519775390625,
-0.004116058349609375,
0.05963134765625,
0.02044677734375,
0.011749267578125,
0.04681396484375,
0.07049560546875,
0.0428466796875,
-0.0245513916015625,
0.06329345703125,
0.0030345916748046875,
0.044464111328125,
0.0413818359375,
0.024810791015625,
0.04473876953125,
0.0246429443359375,
-0.020660400390625,
0.03265380859375,
0.077392578125,
-0.0308074951171875,
0.0296478271484375,
0.015655517578125,
-0.0011272430419921875,
-0.0144805908203125,
0.00798797607421875,
-0.04052734375,
0.038177490234375,
0.0171966552734375,
-0.026031494140625,
-0.0173797607421875,
0.01464080810546875,
0.00027680397033691406,
-0.015655517578125,
-0.01300048828125,
0.0372314453125,
-0.001132965087890625,
-0.0260772705078125,
0.062103271484375,
0.006710052490234375,
0.06427001953125,
-0.02886962890625,
-0.00417327880859375,
-0.0202178955078125,
0.0155181884765625,
-0.02484130859375,
-0.057159423828125,
0.0191497802734375,
-0.0178070068359375,
-0.003265380859375,
0.0023326873779296875,
0.052337646484375,
-0.0192718505859375,
-0.031036376953125,
0.0170440673828125,
0.0178070068359375,
0.046539306640625,
0.00647735595703125,
-0.09246826171875,
0.017425537109375,
0.01080322265625,
-0.046112060546875,
0.024993896484375,
0.032440185546875,
0.003505706787109375,
0.05206298828125,
0.0487060546875,
-0.0081787109375,
0.015899658203125,
-0.01305389404296875,
0.053466796875,
-0.041778564453125,
-0.0255584716796875,
-0.05743408203125,
0.042938232421875,
-0.0196990966796875,
-0.04339599609375,
0.041473388671875,
0.04974365234375,
0.07000732421875,
-0.0080718994140625,
0.03900146484375,
-0.0186767578125,
0.0036296844482421875,
-0.036895751953125,
0.044586181640625,
-0.05413818359375,
0.0028362274169921875,
-0.0047607421875,
-0.056640625,
-0.0209808349609375,
0.05816650390625,
-0.0187530517578125,
0.0234375,
0.03863525390625,
0.07025146484375,
-0.027496337890625,
-0.03131103515625,
0.014678955078125,
0.00646209716796875,
0.016082763671875,
0.028045654296875,
0.028106689453125,
-0.061614990234375,
0.02801513671875,
-0.0467529296875,
-0.01190185546875,
-0.0165557861328125,
-0.056427001953125,
-0.07635498046875,
-0.0648193359375,
-0.0509033203125,
-0.0452880859375,
-0.01300048828125,
0.064208984375,
0.08843994140625,
-0.052764892578125,
-0.008514404296875,
-0.0008306503295898438,
0.0161895751953125,
-0.0296630859375,
-0.017059326171875,
0.05224609375,
-0.01291656494140625,
-0.060516357421875,
-0.01067352294921875,
0.0129241943359375,
0.032684326171875,
0.004184722900390625,
-0.017791748046875,
-0.0129241943359375,
-0.016937255859375,
0.016845703125,
0.03759765625,
-0.0513916015625,
-0.0258941650390625,
-0.0185089111328125,
-0.00646209716796875,
0.0305023193359375,
0.036041259765625,
-0.037689208984375,
0.023345947265625,
0.0246124267578125,
0.0299835205078125,
0.07135009765625,
-0.0194549560546875,
0.00034546852111816406,
-0.068115234375,
0.057342529296875,
-0.0135345458984375,
0.037017822265625,
0.029083251953125,
-0.0246124267578125,
0.045379638671875,
0.0396728515625,
-0.0293426513671875,
-0.07012939453125,
-0.000545501708984375,
-0.08514404296875,
-0.017181396484375,
0.063720703125,
-0.0277099609375,
-0.04931640625,
0.0201873779296875,
-0.002902984619140625,
0.048095703125,
-0.005992889404296875,
0.03399658203125,
0.0152130126953125,
-0.01036834716796875,
-0.044891357421875,
-0.037017822265625,
0.038055419921875,
0.01067352294921875,
-0.051025390625,
-0.037353515625,
-0.0013418197631835938,
0.050628662109375,
0.016357421875,
0.041015625,
-0.01319122314453125,
0.003955841064453125,
0.01195526123046875,
0.045166015625,
-0.0390625,
-0.01319122314453125,
-0.020904541015625,
0.005939483642578125,
-0.01497650146484375,
-0.05963134765625
]
] |
Habana/gpt2 | 2023-10-26T15:17:37.000Z | [
"optimum_habana",
"license:apache-2.0",
"region:us"
] | null | Habana | null | null | Habana/gpt2 | 0 | 36,779 | null | 2022-05-24T12:41:41 | ---
license: apache-2.0
---
[Optimum Habana](https://github.com/huggingface/optimum-habana) is the interface between the Hugging Face Transformers and Diffusers libraries and Habana's Gaudi processor (HPU).
It provides a set of tools enabling easy and fast model loading, training and inference on single- and multi-HPU settings for different downstream tasks.
Learn more about how to take advantage of the power of Habana HPUs to train and deploy Transformers and Diffusers models at [hf.co/hardware/habana](https://huggingface.co/hardware/habana).
## GPT2 model HPU configuration
This model only contains the `GaudiConfig` file for running the [GPT2](https://huggingface.co/gpt2) model on Habana's Gaudi processors (HPU).
**This model contains no model weights, only a GaudiConfig.**
This enables you to specify:
- `use_fused_adam`: whether to use Habana's custom AdamW implementation
- `use_fused_clip_norm`: whether to use Habana's fused gradient norm clipping operator
- `use_torch_autocast`: whether to use PyTorch's autocast mixed precision
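For illustration, a `gaudi_config.json` enabling all three options might look like the following (a sketch — check the actual file in this repository for the exact keys and values it ships with):

```json
{
  "use_fused_adam": true,
  "use_fused_clip_norm": true,
  "use_torch_autocast": true
}
```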
## Usage
The model is instantiated the same way as in the Transformers library.
The only difference is that there are a few new training arguments specific to HPUs.
[Here](https://github.com/huggingface/optimum-habana/blob/main/examples/language-modeling/run_clm.py) is a causal language modeling example script to pre-train/fine-tune a model. You can run it with GPT2 with the following command:
```bash
python run_clm.py \
--model_name_or_path gpt2 \
--dataset_name wikitext \
--dataset_config_name wikitext-2-raw-v1 \
--per_device_train_batch_size 4 \
--per_device_eval_batch_size 4 \
--do_train \
--do_eval \
--output_dir /tmp/test-clm \
--gaudi_config_name Habana/gpt2 \
--use_habana \
--use_lazy_mode \
--throughput_warmup_steps 2
```
Check out the [documentation](https://huggingface.co/docs/optimum/habana/index) for more advanced usage and examples.
| 1,961 | [
[
-0.052337646484375,
-0.06732177734375,
0.024688720703125,
0.01678466796875,
-0.0158233642578125,
-0.01021575927734375,
-0.00807952880859375,
-0.033111572265625,
0.006591796875,
0.0204315185546875,
-0.031463623046875,
-0.0029010772705078125,
-0.033294677734375,
-0.02081298828125,
-0.025634765625,
0.08544921875,
-0.01390838623046875,
-0.0162506103515625,
-0.01258087158203125,
-0.0107879638671875,
-0.030181884765625,
-0.0311431884765625,
-0.07855224609375,
-0.036285400390625,
0.0204620361328125,
0.005889892578125,
0.07794189453125,
0.03558349609375,
0.02667236328125,
0.029083251953125,
-0.0096282958984375,
-0.0008401870727539062,
-0.0275421142578125,
-0.01290130615234375,
-0.0016498565673828125,
-0.0251617431640625,
-0.03240966796875,
0.017974853515625,
0.04144287109375,
0.01230621337890625,
-0.0162506103515625,
0.0267181396484375,
0.00933837890625,
0.03106689453125,
-0.0379638671875,
-0.004299163818359375,
-0.01474761962890625,
0.00975799560546875,
-0.01467132568359375,
-0.0179443359375,
-0.01373291015625,
-0.0210723876953125,
0.006183624267578125,
-0.043487548828125,
0.016387939453125,
0.00634765625,
0.1038818359375,
0.06036376953125,
-0.0447998046875,
0.0164337158203125,
-0.061126708984375,
0.051422119140625,
-0.033599853515625,
0.0257415771484375,
0.027374267578125,
0.04669189453125,
-0.015289306640625,
-0.06610107421875,
-0.029693603515625,
-0.0015344619750976562,
-0.016204833984375,
0.0179595947265625,
-0.03759765625,
0.025482177734375,
0.02978515625,
0.052459716796875,
-0.04595947265625,
0.0004401206970214844,
-0.03314208984375,
-0.023773193359375,
0.028106689453125,
-0.0016498565673828125,
0.01806640625,
-0.0251617431640625,
-0.029876708984375,
-0.01018524169921875,
-0.031402587890625,
-0.01107025146484375,
0.029083251953125,
-0.01396942138671875,
-0.0341796875,
0.0205230712890625,
0.00330352783203125,
0.06597900390625,
0.000469207763671875,
-0.00435638427734375,
0.0288238525390625,
-0.00588226318359375,
-0.041717529296875,
-0.0262298583984375,
0.057891845703125,
0.0013074874877929688,
0.007221221923828125,
0.00122833251953125,
-0.020538330078125,
0.0135040283203125,
0.045074462890625,
-0.05615234375,
-0.037872314453125,
0.0258026123046875,
-0.037872314453125,
-0.038482666015625,
-0.0306549072265625,
-0.058349609375,
0.00913238525390625,
-0.0303497314453125,
0.064208984375,
-0.01328277587890625,
-0.018707275390625,
-0.0113677978515625,
-0.02099609375,
0.02642822265625,
0.0177154541015625,
-0.06353759765625,
0.037689208984375,
0.0335693359375,
0.07806396484375,
-0.01275634765625,
-0.01788330078125,
-0.025177001953125,
-0.004634857177734375,
0.0003647804260253906,
0.041229248046875,
-0.0025348663330078125,
-0.02056884765625,
-0.00453948974609375,
0.01116180419921875,
-0.0123443603515625,
-0.037689208984375,
0.0667724609375,
-0.03411865234375,
0.041595458984375,
0.01788330078125,
-0.0263671875,
-0.024505615234375,
-0.013916015625,
-0.047882080078125,
0.11517333984375,
0.041015625,
-0.051605224609375,
0.01152801513671875,
-0.048431396484375,
-0.03900146484375,
-0.00785064697265625,
-0.0034694671630859375,
-0.05133056640625,
0.01080322265625,
-0.002056121826171875,
0.023712158203125,
-0.0293426513671875,
0.0222930908203125,
-0.01467132568359375,
-0.0243682861328125,
-0.01165771484375,
-0.0260772705078125,
0.08978271484375,
0.035430908203125,
-0.039459228515625,
0.029754638671875,
-0.048614501953125,
-0.0137786865234375,
0.017791748046875,
-0.03350830078125,
-0.01334381103515625,
-0.01407623291015625,
0.0262603759765625,
0.021697998046875,
0.013946533203125,
-0.026336669921875,
0.004604339599609375,
-0.00019431114196777344,
0.053131103515625,
0.060455322265625,
-0.0131988525390625,
0.0233154296875,
-0.032012939453125,
0.043243408203125,
-0.031890869140625,
0.057769775390625,
0.0214080810546875,
-0.059356689453125,
-0.07928466796875,
-0.0304412841796875,
-0.006500244140625,
0.036529541015625,
-0.03045654296875,
0.03704833984375,
0.0167999267578125,
-0.045135498046875,
-0.055145263671875,
0.00005733966827392578,
0.0184173583984375,
0.05181884765625,
0.033935546875,
-0.018157958984375,
-0.05548095703125,
-0.07843017578125,
0.00008803606033325195,
-0.007404327392578125,
-0.00029754638671875,
0.03765869140625,
0.034912109375,
-0.025787353515625,
0.06695556640625,
-0.0350341796875,
-0.02703857421875,
-0.0000979304313659668,
0.00411224365234375,
0.0467529296875,
0.037445068359375,
0.052032470703125,
-0.06256103515625,
-0.038421630859375,
-0.005084991455078125,
-0.056640625,
-0.01009368896484375,
0.0000693202018737793,
-0.047882080078125,
0.0156097412109375,
0.0214080810546875,
-0.060546875,
0.0272674560546875,
0.05975341796875,
-0.0218505859375,
0.054229736328125,
-0.02825927734375,
-0.0044708251953125,
-0.07965087890625,
0.02105712890625,
-0.00624847412109375,
-0.037811279296875,
-0.044281005859375,
0.0114288330078125,
0.000579833984375,
-0.00702667236328125,
-0.033355712890625,
0.054595947265625,
-0.0234375,
0.00006020069122314453,
-0.0265960693359375,
-0.021148681640625,
0.00585174560546875,
0.046844482421875,
-0.0043487548828125,
0.06683349609375,
0.05084228515625,
-0.04583740234375,
0.0301055908203125,
0.024749755859375,
-0.022705078125,
0.017974853515625,
-0.07659912109375,
0.01441192626953125,
0.00782012939453125,
0.013885498046875,
-0.06134033203125,
-0.0343017578125,
0.011627197265625,
-0.033233642578125,
0.028076171875,
-0.0196380615234375,
-0.020050048828125,
-0.047271728515625,
-0.004486083984375,
0.024871826171875,
0.08538818359375,
-0.0728759765625,
0.05657958984375,
0.05987548828125,
0.011993408203125,
-0.037322998046875,
-0.04132080078125,
-0.01296234130859375,
-0.043853759765625,
-0.050537109375,
0.04742431640625,
0.0015964508056640625,
0.002391815185546875,
-0.00653076171875,
0.005084991455078125,
-0.00910186767578125,
0.01435089111328125,
0.0256500244140625,
0.03338623046875,
0.0130462646484375,
-0.01161956787109375,
0.007617950439453125,
-0.020050048828125,
0.0126495361328125,
-0.0325927734375,
0.055816650390625,
-0.0160064697265625,
0.0074005126953125,
-0.0308685302734375,
0.0019817352294921875,
0.035430908203125,
-0.00571441650390625,
0.041107177734375,
0.08319091796875,
-0.0309295654296875,
-0.00899505615234375,
-0.048492431640625,
-0.009490966796875,
-0.043731689453125,
0.006816864013671875,
-0.0248565673828125,
-0.052825927734375,
0.036834716796875,
0.004779815673828125,
0.004974365234375,
0.041717529296875,
0.0679931640625,
-0.0097503662109375,
0.07421875,
0.056671142578125,
-0.0252838134765625,
0.05340576171875,
-0.027923583984375,
0.0034999847412109375,
-0.07281494140625,
0.0010356903076171875,
-0.047454833984375,
-0.00783538818359375,
-0.034515380859375,
-0.02685546875,
0.0504150390625,
0.0085296630859375,
-0.0360107421875,
0.04180908203125,
-0.057403564453125,
0.02227783203125,
0.06524658203125,
0.00890350341796875,
0.0032138824462890625,
0.007183074951171875,
-0.007617950439453125,
0.0295562744140625,
-0.05853271484375,
-0.021453857421875,
0.0631103515625,
0.03704833984375,
0.051605224609375,
0.0037384033203125,
0.041046142578125,
0.0063629150390625,
0.011749267578125,
-0.05657958984375,
0.0209503173828125,
0.00833892822265625,
-0.06231689453125,
-0.0087738037109375,
-0.0283660888671875,
-0.06304931640625,
0.0209808349609375,
-0.0002313852310180664,
-0.041717529296875,
0.0188446044921875,
0.030303955078125,
-0.0216064453125,
0.0117034912109375,
-0.04791259765625,
0.07861328125,
-0.00905609130859375,
-0.041961669921875,
-0.020965576171875,
-0.045867919921875,
0.0187835693359375,
-0.00370025634765625,
0.005275726318359375,
0.00049591064453125,
0.01513671875,
0.06768798828125,
-0.040802001953125,
0.049102783203125,
-0.030609130859375,
0.004085540771484375,
0.0298309326171875,
-0.00513458251953125,
0.03778076171875,
-0.00390625,
-0.0015583038330078125,
0.0200347900390625,
-0.01355743408203125,
-0.0295562744140625,
-0.022369384765625,
0.047515869140625,
-0.09014892578125,
-0.0300140380859375,
-0.02490234375,
-0.0350341796875,
0.0033321380615234375,
0.01678466796875,
0.0360107421875,
0.022796630859375,
-0.0164947509765625,
-0.0169830322265625,
0.033050537109375,
-0.0183258056640625,
0.0245208740234375,
0.006832122802734375,
-0.0195465087890625,
-0.0281982421875,
0.056915283203125,
-0.022216796875,
0.020355224609375,
0.007244110107421875,
0.034210205078125,
-0.0187835693359375,
-0.028900146484375,
-0.040313720703125,
0.01505279541015625,
-0.043853759765625,
-0.01358795166015625,
-0.05242919921875,
-0.006389617919921875,
-0.0250091552734375,
-0.0110015869140625,
-0.03070068359375,
-0.030303955078125,
-0.006320953369140625,
0.00620269775390625,
0.044708251953125,
0.021697998046875,
-0.020599365234375,
0.046051025390625,
-0.03314208984375,
0.034088134765625,
0.020660400390625,
0.01025390625,
-0.007633209228515625,
-0.05816650390625,
-0.0237274169921875,
-0.0157012939453125,
-0.0218658447265625,
-0.057525634765625,
0.035491943359375,
0.0296173095703125,
0.049285888671875,
0.023162841796875,
-0.0198974609375,
0.043914794921875,
-0.031097412109375,
0.039276123046875,
-0.0038166046142578125,
-0.057525634765625,
0.046875,
-0.035552978515625,
0.01206207275390625,
0.04351806640625,
0.051849365234375,
-0.031494140625,
-0.0124359130859375,
-0.051544189453125,
-0.06671142578125,
0.05242919921875,
0.0199432373046875,
0.0007600784301757812,
0.012054443359375,
0.022125244140625,
-0.00623321533203125,
0.00750732421875,
-0.045501708984375,
-0.02001953125,
-0.0178375244140625,
0.000156402587890625,
-0.007465362548828125,
-0.00572967529296875,
-0.02728271484375,
-0.04437255859375,
0.07781982421875,
-0.0017490386962890625,
0.045257568359375,
0.0123748779296875,
-0.01039886474609375,
-0.014862060546875,
-0.0234222412109375,
0.00042939186096191406,
0.0184478759765625,
-0.03411865234375,
-0.002536773681640625,
0.0018014907836914062,
-0.03692626953125,
-0.00011521577835083008,
0.00766754150390625,
-0.0265655517578125,
0.00662994384765625,
0.00383758544921875,
0.08599853515625,
-0.0008521080017089844,
-0.0178985595703125,
0.0275115966796875,
-0.01177215576171875,
-0.00582122802734375,
-0.042510986328125,
0.0168304443359375,
-0.0016412734985351562,
0.01042938232421875,
0.0002970695495605469,
0.0113677978515625,
0.0208587646484375,
-0.0276641845703125,
0.00927734375,
0.030120849609375,
-0.00926971435546875,
-0.0107879638671875,
0.07257080078125,
0.01959228515625,
-0.0159912109375,
0.0692138671875,
-0.0030040740966796875,
-0.043731689453125,
0.04827880859375,
0.039886474609375,
0.07659912109375,
-0.026275634765625,
0.00920867919921875,
0.0372314453125,
0.01641845703125,
0.00988006591796875,
0.017425537109375,
-0.0208282470703125,
-0.049713134765625,
-0.03350830078125,
-0.07757568359375,
-0.0305633544921875,
0.0014057159423828125,
-0.055450439453125,
0.041839599609375,
-0.024871826171875,
-0.0250244140625,
0.005031585693359375,
-0.01171875,
-0.063232421875,
0.0069122314453125,
-0.0028667449951171875,
0.0703125,
-0.07183837890625,
0.08807373046875,
0.05364990234375,
-0.0251922607421875,
-0.06341552734375,
-0.034332275390625,
0.01018524169921875,
-0.058837890625,
0.016998291015625,
0.0036182403564453125,
0.00128173828125,
0.0087127685546875,
-0.017791748046875,
-0.061737060546875,
0.07757568359375,
0.0223236083984375,
-0.0181121826171875,
-0.004238128662109375,
-0.00914764404296875,
0.031646728515625,
-0.0006804466247558594,
0.042877197265625,
0.07366943359375,
0.047698974609375,
-0.002559661865234375,
-0.0806884765625,
0.00994873046875,
-0.034210205078125,
-0.01189422607421875,
0.0250244140625,
-0.05810546875,
0.07293701171875,
-0.0037975311279296875,
-0.0013484954833984375,
0.003658294677734375,
0.04119873046875,
0.0033721923828125,
0.00044846534729003906,
0.043670654296875,
0.05706787109375,
0.061309814453125,
-0.0196380615234375,
0.10076904296875,
-0.0287933349609375,
0.055816650390625,
0.062042236328125,
0.01132965087890625,
0.035552978515625,
0.033233642578125,
-0.03277587890625,
0.0258636474609375,
0.06854248046875,
-0.012725830078125,
0.042449951171875,
0.00010496377944946289,
-0.0232696533203125,
-0.00396728515625,
0.0024280548095703125,
-0.0258331298828125,
0.03240966796875,
0.01629638671875,
-0.0204620361328125,
-0.007167816162109375,
0.006153106689453125,
0.0244140625,
-0.043853759765625,
-0.0016946792602539062,
0.04315185546875,
0.00732421875,
-0.060333251953125,
0.07891845703125,
0.0036144256591796875,
0.06341552734375,
-0.050048828125,
0.01392364501953125,
-0.017547607421875,
0.0244293212890625,
-0.0260772705078125,
-0.0257110595703125,
0.039886474609375,
-0.00914764404296875,
-0.000060439109802246094,
0.0034923553466796875,
0.0650634765625,
-0.0198974609375,
-0.0124969482421875,
0.035400390625,
0.017364501953125,
0.031341552734375,
-0.0036525726318359375,
-0.06298828125,
0.0158843994140625,
0.00177764892578125,
-0.03790283203125,
0.0222625732421875,
-0.0170440673828125,
0.0005483627319335938,
0.03631591796875,
0.048309326171875,
0.0012693405151367188,
0.00701904296875,
0.01471710205078125,
0.06353759765625,
-0.027069091796875,
-0.04296875,
-0.04461669921875,
0.029083251953125,
-0.0131072998046875,
-0.040069580078125,
0.060394287109375,
0.04132080078125,
0.06280517578125,
-0.01006317138671875,
0.059722900390625,
-0.0289764404296875,
0.00830841064453125,
-0.0272979736328125,
0.045013427734375,
-0.048828125,
-0.016754150390625,
-0.022796630859375,
-0.089111328125,
-0.005950927734375,
0.0721435546875,
-0.0032596588134765625,
0.0106353759765625,
0.0460205078125,
0.06878662109375,
-0.023712158203125,
0.00453948974609375,
0.0046234130859375,
0.022125244140625,
0.0307769775390625,
0.041778564453125,
0.045989990234375,
-0.034454345703125,
0.0196075439453125,
-0.048492431640625,
-0.053253173828125,
-0.01471710205078125,
-0.055511474609375,
-0.055267333984375,
-0.03521728515625,
-0.0247650146484375,
-0.0321044921875,
0.01004791259765625,
0.050628662109375,
0.06787109375,
-0.033477783203125,
-0.032958984375,
-0.021759033203125,
-0.0176544189453125,
-0.0171661376953125,
-0.0191192626953125,
0.041015625,
-0.03021240234375,
-0.0740966796875,
0.0379638671875,
0.00124359130859375,
-0.001041412353515625,
-0.0272369384765625,
-0.0166168212890625,
-0.00927734375,
-0.003353118896484375,
0.03851318359375,
0.03167724609375,
-0.016845703125,
-0.0213165283203125,
-0.0257110595703125,
0.006137847900390625,
-0.0009551048278808594,
0.04229736328125,
-0.066650390625,
0.0161895751953125,
0.05218505859375,
0.034210205078125,
0.06890869140625,
-0.01374053955078125,
0.038909912109375,
-0.041015625,
0.0271759033203125,
-0.0004470348358154297,
0.044036865234375,
0.00786590576171875,
-0.0328369140625,
0.057586669921875,
0.0083770751953125,
-0.075927734375,
-0.051849365234375,
0.008087158203125,
-0.08648681640625,
-0.0150909423828125,
0.083251953125,
0.00638580322265625,
-0.0313720703125,
-0.00638580322265625,
-0.0210113525390625,
0.039825439453125,
-0.0176849365234375,
0.051605224609375,
0.0281829833984375,
-0.0250701904296875,
0.0124053955078125,
-0.06573486328125,
0.05499267578125,
0.0467529296875,
-0.05792236328125,
-0.00897216796875,
0.0325927734375,
0.01898193359375,
0.0137481689453125,
0.046417236328125,
-0.026947021484375,
0.0220184326171875,
-0.006290435791015625,
0.017974853515625,
-0.0168914794921875,
-0.032379150390625,
-0.0369873046875,
-0.00670623779296875,
-0.0030765533447265625,
-0.01080322265625
]
] |
Yntec/photoMovieRealistic | 2023-08-05T08:48:07.000Z | [
"diffusers",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-image",
"MagicArt35",
"license:creativeml-openrail-m",
"endpoints_compatible",
"has_space",
"diffusers:StableDiffusionPipeline",
"region:us"
] | text-to-image | Yntec | null | null | Yntec/photoMovieRealistic | 10 | 36,741 | diffusers | 2023-08-05T07:31:21 | ---
license: creativeml-openrail-m
library_name: diffusers
pipeline_tag: text-to-image
tags:
- stable-diffusion
- stable-diffusion-diffusers
- diffusers
- text-to-image
- MagicArt35
---
# Photo Movie Realistic
Original page:
https://civitai.com/models/95413/photo-movie-realistic
| 283 | [
[
-0.0177001953125,
-0.0306243896484375,
0.0237884521484375,
0.0269012451171875,
-0.0341796875,
-0.0013742446899414062,
0.0159759521484375,
-0.01265716552734375,
0.050506591796875,
0.061920166015625,
-0.05657958984375,
-0.0133209228515625,
-0.0016994476318359375,
-0.03631591796875,
-0.029510498046875,
0.028106689453125,
0.002201080322265625,
0.0289306640625,
-0.007053375244140625,
0.0031871795654296875,
0.003368377685546875,
0.00739288330078125,
-0.091064453125,
0.0156402587890625,
0.054351806640625,
0.0372314453125,
0.058258056640625,
0.03216552734375,
0.043731689453125,
0.009124755859375,
0.03350830078125,
-0.0106658935546875,
-0.023834228515625,
-0.004924774169921875,
-0.0248260498046875,
-0.006748199462890625,
-0.0143280029296875,
0.031768798828125,
0.051483154296875,
0.0196380615234375,
-0.004016876220703125,
0.024322509765625,
-0.01511383056640625,
0.039154052734375,
-0.041839599609375,
-0.0094757080078125,
0.0008168220520019531,
0.0103912353515625,
0.0007824897766113281,
-0.019805908203125,
0.0008630752563476562,
-0.0239715576171875,
-0.03619384765625,
-0.0843505859375,
0.046051025390625,
-0.0232086181640625,
0.09710693359375,
-0.00792694091796875,
-0.046844482421875,
0.0169525146484375,
-0.06829833984375,
0.0142059326171875,
0.0007834434509277344,
0.0396728515625,
0.021392822265625,
0.064453125,
-0.0266876220703125,
-0.0548095703125,
-0.040740966796875,
-0.00041484832763671875,
0.0247955322265625,
0.00577545166015625,
-0.0242156982421875,
-0.0180206298828125,
0.0298919677734375,
0.040557861328125,
-0.043487548828125,
0.01036834716796875,
-0.05615234375,
0.011871337890625,
0.052581787109375,
-0.00939178466796875,
0.034698486328125,
-0.002231597900390625,
-0.02227783203125,
-0.016754150390625,
-0.039825439453125,
0.023529052734375,
0.020660400390625,
-0.01508331298828125,
-0.02349853515625,
0.06005859375,
-0.03375244140625,
0.036956787109375,
0.015838623046875,
-0.00942230224609375,
0.004344940185546875,
-0.0164794921875,
-0.04010009765625,
-0.0110015869140625,
0.0207061767578125,
0.0916748046875,
0.0240631103515625,
0.002765655517578125,
0.0031299591064453125,
-0.00925445556640625,
0.034576416015625,
-0.0723876953125,
-0.033447265625,
0.00115203857421875,
-0.02337646484375,
-0.04718017578125,
0.031494140625,
-0.0589599609375,
-0.0250396728515625,
0.002742767333984375,
0.00711822509765625,
-0.033935546875,
-0.0293121337890625,
-0.0204925537109375,
-0.00600433349609375,
0.03875732421875,
0.040252685546875,
-0.0285797119140625,
0.0017452239990234375,
0.031280517578125,
0.041107177734375,
0.05828857421875,
0.0023746490478515625,
-0.0352783203125,
0.03228759765625,
-0.042877197265625,
0.07000732421875,
-0.00455474853515625,
-0.035491943359375,
0.00925445556640625,
0.01508331298828125,
0.0167388916015625,
-0.0188446044921875,
0.06585693359375,
-0.05859375,
-0.01837158203125,
-0.0310516357421875,
-0.016845703125,
-0.0204620361328125,
0.0193328857421875,
-0.07196044921875,
0.02349853515625,
0.0024242401123046875,
-0.034515380859375,
0.056732177734375,
-0.052337646484375,
0.003871917724609375,
0.01409912109375,
-0.006061553955078125,
-0.0006513595581054688,
0.035919189453125,
-0.01177215576171875,
-0.0260162353515625,
0.01241302490234375,
0.0034637451171875,
-0.032257080078125,
-0.032745361328125,
0.02862548828125,
-0.0308990478515625,
0.0487060546875,
0.058135986328125,
0.016204833984375,
0.00946807861328125,
-0.087890625,
0.0070648193359375,
0.028656005859375,
-0.00995635986328125,
0.00263214111328125,
-0.03369140625,
0.0148773193359375,
0.0251007080078125,
0.0188140869140625,
-0.049102783203125,
0.037353515625,
-0.00394439697265625,
-0.032562255859375,
0.01134490966796875,
0.0272369384765625,
0.0256500244140625,
-0.0290374755859375,
0.047332763671875,
-0.0079345703125,
0.026763916015625,
-0.00217437744140625,
-0.0175323486328125,
-0.07635498046875,
-0.0229949951171875,
0.027008056640625,
0.024871826171875,
-0.04595947265625,
0.00801849365234375,
-0.00939178466796875,
-0.06219482421875,
-0.0182342529296875,
-0.01593017578125,
0.0093231201171875,
0.003963470458984375,
-0.00914764404296875,
-0.03973388671875,
-0.0545654296875,
-0.08221435546875,
-0.01439666748046875,
-0.01290130615234375,
-0.03521728515625,
0.0165863037109375,
0.0152740478515625,
-0.014312744140625,
0.032745361328125,
-0.032806396484375,
-0.0167083740234375,
-0.007442474365234375,
-0.0254974365234375,
0.046417236328125,
0.052490234375,
0.052093505859375,
-0.07769775390625,
-0.05267333984375,
-0.0167999267578125,
-0.050079345703125,
-0.0003437995910644531,
-0.01406097412109375,
-0.03851318359375,
0.0147552490234375,
0.007236480712890625,
-0.05230712890625,
0.0552978515625,
0.0298614501953125,
-0.03076171875,
0.03411865234375,
-0.047393798828125,
0.060760498046875,
-0.061798095703125,
0.0135498046875,
0.0262603759765625,
-0.01593017578125,
-0.0062713623046875,
0.043701171875,
-0.0164337158203125,
-0.047088623046875,
-0.03857421875,
0.036407470703125,
-0.0309600830078125,
-0.0008878707885742188,
-0.0258941650390625,
-0.0055999755859375,
0.0178680419921875,
0.00977325439453125,
0.0027217864990234375,
0.03485107421875,
0.04150390625,
-0.026611328125,
0.0390625,
0.004119873046875,
-0.00743865966796875,
0.052032470703125,
-0.04595947265625,
0.002414703369140625,
0.00568389892578125,
0.004146575927734375,
-0.09869384765625,
-0.06256103515625,
0.0262451171875,
-0.0142059326171875,
-0.002094268798828125,
-0.0306243896484375,
-0.057861328125,
-0.0303497314453125,
-0.0188446044921875,
0.033660888671875,
0.054412841796875,
-0.0191192626953125,
0.00830078125,
0.0198516845703125,
0.0155792236328125,
0.0161590576171875,
-0.025360107421875,
0.0167694091796875,
0.00814056396484375,
-0.037384033203125,
0.050933837890625,
0.00785064697265625,
-0.0323486328125,
-0.01451873779296875,
0.0014638900756835938,
-0.05096435546875,
-0.0426025390625,
0.0323486328125,
0.0255584716796875,
-0.023284912109375,
-0.0114898681640625,
-0.016632080078125,
0.01435089111328125,
-0.022735595703125,
0.0188446044921875,
0.03936767578125,
-0.0172119140625,
-0.01971435546875,
-0.0689697265625,
0.030029296875,
0.054290771484375,
0.011627197265625,
0.051055908203125,
0.052978515625,
-0.05462646484375,
0.04339599609375,
-0.06329345703125,
0.0024814605712890625,
-0.0304412841796875,
0.0008449554443359375,
-0.0494384765625,
-0.00722503662109375,
0.00882720947265625,
0.019073486328125,
-0.0304107666015625,
0.0628662109375,
-0.0021228790283203125,
-0.041168212890625,
0.04510498046875,
0.04010009765625,
0.030914306640625,
0.030517578125,
-0.04400634765625,
-0.006626129150390625,
-0.0299530029296875,
-0.0242156982421875,
-0.0008406639099121094,
-0.0207061767578125,
-0.053192138671875,
-0.04730224609375,
0.0064849853515625,
0.0308074951171875,
-0.0303955078125,
0.036590576171875,
-0.0036106109619140625,
0.03509521484375,
0.043365478515625,
0.022369384765625,
0.0236968994140625,
0.005527496337890625,
0.0025196075439453125,
-0.0240631103515625,
-0.01123046875,
-0.047149658203125,
0.034820556640625,
-0.00044417381286621094,
0.0262298583984375,
0.00632476806640625,
0.043701171875,
0.00675201416015625,
0.016845703125,
-0.0298614501953125,
0.054901123046875,
-0.011962890625,
-0.07110595703125,
-0.00005322694778442383,
0.0038089752197265625,
-0.055023193359375,
0.003753662109375,
-0.0389404296875,
-0.01959228515625,
0.02099609375,
0.0236358642578125,
-0.0465087890625,
0.03961181640625,
-0.044677734375,
0.075927734375,
-0.037445068359375,
-0.0265960693359375,
-0.023284912109375,
-0.0193939208984375,
0.0176544189453125,
0.0007162094116210938,
0.016998291015625,
-0.01009368896484375,
-0.014862060546875,
0.035003662109375,
-0.0245208740234375,
0.061492919921875,
-0.01097869873046875,
0.0123138427734375,
0.0296783447265625,
0.023284912109375,
-0.0012578964233398438,
0.0170745849609375,
-0.01264190673828125,
-0.0270233154296875,
-0.014404296875,
-0.0472412109375,
-0.016357421875,
0.0703125,
-0.0284423828125,
-0.0224151611328125,
-0.048980712890625,
-0.00595855712890625,
0.017242431640625,
0.01568603515625,
0.04595947265625,
0.05438232421875,
-0.0694580078125,
0.004360198974609375,
0.01047515869140625,
-0.01238250732421875,
0.04608154296875,
0.036346435546875,
-0.05609130859375,
-0.038055419921875,
0.05718994140625,
0.0281524658203125,
-0.01282501220703125,
0.0167694091796875,
0.005279541015625,
-0.006504058837890625,
-0.027862548828125,
-0.0233612060546875,
0.0458984375,
-0.00885772705078125,
-0.0032062530517578125,
-0.020538330078125,
-0.0261688232421875,
-0.030914306640625,
-0.04229736328125,
-0.0298614501953125,
-0.0266571044921875,
-0.03369140625,
-0.0193328857421875,
0.036773681640625,
0.0279541015625,
-0.011077880859375,
0.029754638671875,
-0.054931640625,
0.041107177734375,
0.031890869140625,
0.038116455078125,
-0.04095458984375,
-0.0633544921875,
0.0024166107177734375,
0.01468658447265625,
-0.05047607421875,
-0.0352783203125,
0.0679931640625,
0.0304412841796875,
0.046417236328125,
0.035491943359375,
0.003459930419921875,
0.06005859375,
-0.03143310546875,
0.045684814453125,
0.03143310546875,
-0.035064697265625,
0.072509765625,
-0.0469970703125,
0.0285186767578125,
0.08831787109375,
0.046722412109375,
-0.0166778564453125,
0.01419830322265625,
-0.09918212890625,
-0.049407958984375,
0.006099700927734375,
0.0219268798828125,
0.0187225341796875,
0.038604736328125,
0.031951904296875,
-0.01041412353515625,
0.02752685546875,
-0.056243896484375,
0.002269744873046875,
-0.01168060302734375,
0.00658416748046875,
-0.00437164306640625,
-0.047698974609375,
0.0008754730224609375,
-0.037109375,
0.027099609375,
0.01788330078125,
0.0182647705078125,
0.02264404296875,
0.0097808837890625,
-0.0180206298828125,
-0.01540374755859375,
0.03973388671875,
0.05499267578125,
-0.052581787109375,
0.00003975629806518555,
-0.0198516845703125,
-0.0195770263671875,
0.00785064697265625,
0.0028400421142578125,
-0.00772857666015625,
0.01372528076171875,
0.0118408203125,
0.0654296875,
0.0262298583984375,
-0.033447265625,
0.03558349609375,
0.005584716796875,
0.02545166015625,
-0.09552001953125,
0.0296783447265625,
-0.01543426513671875,
0.0297698974609375,
0.0229339599609375,
0.035980224609375,
0.041839599609375,
-0.067138671875,
0.0377197265625,
0.004058837890625,
-0.059173583984375,
-0.040985107421875,
0.06268310546875,
0.02203369140625,
-0.07867431640625,
0.060577392578125,
-0.02764892578125,
-0.00457763671875,
0.0491943359375,
0.0513916015625,
0.047393798828125,
-0.0283660888671875,
0.046844482421875,
0.07696533203125,
-0.0265960693359375,
0.00789642333984375,
0.039337158203125,
0.0258941650390625,
-0.03546142578125,
0.018157958984375,
-0.0299530029296875,
-0.038482666015625,
0.01561737060546875,
-0.036590576171875,
0.0372314453125,
-0.08233642578125,
0.0031642913818359375,
0.00261688232421875,
0.008514404296875,
-0.0309906005859375,
0.06640625,
0.0269317626953125,
0.1070556640625,
-0.037109375,
0.092529296875,
0.0465087890625,
-0.065185546875,
-0.04278564453125,
0.006580352783203125,
-0.005039215087890625,
-0.036041259765625,
0.035003662109375,
0.040863037109375,
-0.0011882781982421875,
0.0021190643310546875,
-0.054718017578125,
-0.03900146484375,
0.049957275390625,
0.0175018310546875,
-0.04766845703125,
-0.01070404052734375,
-0.036041259765625,
0.06878662109375,
-0.0634765625,
-0.007781982421875,
0.0167999267578125,
0.0322265625,
0.04217529296875,
-0.04156494140625,
-0.02593994140625,
-0.04302978515625,
-0.0015478134155273438,
-0.01483917236328125,
-0.05511474609375,
0.03070068359375,
-0.0296173095703125,
0.0029735565185546875,
0.0318603515625,
0.054656982421875,
0.0382080078125,
0.019256591796875,
0.0540771484375,
0.046630859375,
0.016326904296875,
-0.003620147705078125,
0.107421875,
0.023681640625,
0.00977325439453125,
0.065185546875,
-0.005157470703125,
0.06256103515625,
0.036834716796875,
-0.01107025146484375,
0.0313720703125,
0.06622314453125,
-0.04925537109375,
0.052276611328125,
0.005176544189453125,
-0.0460205078125,
-0.0194244384765625,
-0.0330810546875,
0.0013332366943359375,
0.03912353515625,
-0.00595855712890625,
-0.01885986328125,
-0.0031375885009765625,
-0.0158538818359375,
-0.016265869140625,
0.026763916015625,
-0.0266876220703125,
0.025054931640625,
-0.01849365234375,
-0.03839111328125,
0.026947021484375,
-0.00925445556640625,
0.016693115234375,
-0.0192718505859375,
-0.00872039794921875,
-0.0081939697265625,
-0.0107879638671875,
-0.0133209228515625,
-0.047607421875,
0.0433349609375,
0.00879669189453125,
-0.02398681640625,
-0.00055694580078125,
0.04400634765625,
-0.0382080078125,
-0.102783203125,
0.00913238525390625,
-0.0252838134765625,
0.029266357421875,
-0.0209503173828125,
-0.050567626953125,
0.01210784912109375,
0.0211029052734375,
0.004390716552734375,
-0.00022792816162109375,
-0.003231048583984375,
-0.00010210275650024414,
0.02734375,
0.0438232421875,
0.016326904296875,
-0.0113677978515625,
-0.00231170654296875,
0.039154052734375,
-0.052215576171875,
-0.034698486328125,
-0.039276123046875,
0.039154052734375,
-0.06170654296875,
-0.0386962890625,
0.035980224609375,
0.06805419921875,
0.03369140625,
-0.065673828125,
0.0080718994140625,
-0.0113677978515625,
0.0182037353515625,
-0.0186309814453125,
0.056640625,
-0.063232421875,
-0.0218505859375,
-0.0184783935546875,
-0.06396484375,
-0.038543701171875,
0.044189453125,
0.0124359130859375,
-0.0089111328125,
0.00527191162109375,
0.051025390625,
-0.01358795166015625,
-0.0260772705078125,
0.041351318359375,
-0.01318359375,
0.020263671875,
0.001743316650390625,
0.0469970703125,
-0.0513916015625,
0.0253753662109375,
-0.041351318359375,
-0.01763916015625,
-0.03729248046875,
-0.0655517578125,
-0.0233306884765625,
-0.036590576171875,
-0.05181884765625,
-0.02239990234375,
-0.0208282470703125,
0.052642822265625,
0.0557861328125,
-0.04241943359375,
-0.0201263427734375,
-0.0010814666748046875,
-0.00594329833984375,
0.00978851318359375,
-0.0217437744140625,
0.0037841796875,
0.053253173828125,
-0.0654296875,
0.01312255859375,
0.01491546630859375,
0.040740966796875,
-0.001750946044921875,
0.029571533203125,
-0.0286407470703125,
0.01166534423828125,
0.024169921875,
0.04541015625,
-0.0335693359375,
-0.0289154052734375,
-0.0158233642578125,
-0.025390625,
0.00439453125,
0.059906005859375,
-0.006649017333984375,
0.0169677734375,
0.03424072265625,
0.005710601806640625,
0.04180908203125,
0.00606536865234375,
0.060272216796875,
-0.043182373046875,
0.048492431640625,
-0.005840301513671875,
0.039794921875,
0.02972412109375,
-0.040130615234375,
0.06219482421875,
0.0491943359375,
-0.0168304443359375,
-0.0574951171875,
0.0016336441040039062,
-0.130615234375,
0.0168304443359375,
0.056182861328125,
0.00968170166015625,
-0.0085296630859375,
0.025970458984375,
-0.0550537109375,
0.0196380615234375,
-0.0242156982421875,
0.035247802734375,
0.0253448486328125,
0.003437042236328125,
-0.038177490234375,
-0.0193939208984375,
0.005840301513671875,
-0.039276123046875,
-0.0318603515625,
-0.064453125,
0.047607421875,
0.0212249755859375,
0.038848876953125,
0.04864501953125,
-0.023590087890625,
0.0241546630859375,
0.0303802490234375,
0.037445068359375,
0.000027954578399658203,
-0.032562255859375,
0.007106781005859375,
0.010650634765625,
0.0130615234375,
-0.05194091796875
]
] |
hf-internal-testing/tiny-vilt-random-vqa | 2022-05-16T14:49:45.000Z | [
"transformers",
"pytorch",
"vilt",
"visual-question-answering",
"arxiv:2102.03334",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | visual-question-answering | hf-internal-testing | null | null | hf-internal-testing/tiny-vilt-random-vqa | 0 | 36,713 | transformers | 2022-05-15T15:23:27 | ---
license: apache-2.0
---
A tiny randomly-initialized [ViLT](https://arxiv.org/abs/2102.03334) used for unit tests in the Transformers VQA pipeline | 150 | [
[
-0.046539306640625,
-0.03704833984375,
0.0207061767578125,
-0.0010194778442382812,
-0.038665771484375,
0.00691986083984375,
0.024078369140625,
0.012664794921875,
0.00568389892578125,
0.042388916015625,
-0.04229736328125,
0.00351715087890625,
-0.004344940185546875,
-0.0237579345703125,
-0.07421875,
0.06390380859375,
0.03338623046875,
0.00890350341796875,
0.019439697265625,
-0.01824951171875,
-0.01045989990234375,
-0.021636962890625,
-0.03466796875,
-0.02276611328125,
0.04534912109375,
0.053497314453125,
0.0557861328125,
0.00846099853515625,
0.0257568359375,
0.00823211669921875,
0.004070281982421875,
-0.00884246826171875,
-0.042724609375,
-0.0171661376953125,
0.00626373291015625,
-0.025482177734375,
0.01123046875,
-0.0194091796875,
0.07928466796875,
0.0281982421875,
-0.02740478515625,
0.061492919921875,
-0.017120361328125,
0.020477294921875,
-0.043060302734375,
-0.0176849365234375,
-0.0287933349609375,
0.02960205078125,
-0.047576904296875,
-0.01123809814453125,
-0.04962158203125,
-0.00809478759765625,
0.005870819091796875,
-0.0200653076171875,
0.031585693359375,
0.03753662109375,
0.07684326171875,
0.043365478515625,
-0.033355712890625,
0.032470703125,
-0.0772705078125,
0.045196533203125,
0.0036182403564453125,
0.0404052734375,
-0.00789642333984375,
0.034332275390625,
-0.0117034912109375,
-0.0938720703125,
-0.05181884765625,
0.01253509521484375,
-0.015594482421875,
-0.0004246234893798828,
-0.0171661376953125,
-0.0037555694580078125,
0.03173828125,
0.03350830078125,
-0.02972412109375,
-0.0036678314208984375,
-0.049652099609375,
-0.0210113525390625,
0.027252197265625,
0.03173828125,
0.008331298828125,
-0.0206146240234375,
-0.03900146484375,
-0.01378631591796875,
-0.032623291015625,
0.0291290283203125,
0.041717529296875,
0.007259368896484375,
-0.0088348388671875,
0.06927490234375,
-0.01248931884765625,
0.041259765625,
-0.0121307373046875,
0.0423583984375,
0.0347900390625,
-0.01348876953125,
-0.0399169921875,
0.0206756591796875,
0.048828125,
-0.0092010498046875,
0.0262603759765625,
0.01824951171875,
-0.02081298828125,
-0.0182952880859375,
0.05621337890625,
-0.07073974609375,
-0.056396484375,
0.00494384765625,
-0.01947021484375,
-0.057952880859375,
0.017303466796875,
-0.0252532958984375,
-0.0146484375,
0.0032558441162109375,
0.0435791015625,
-0.01568603515625,
0.0031681060791015625,
-0.0258636474609375,
-0.00559234619140625,
0.0631103515625,
-0.0227813720703125,
-0.0479736328125,
0.0081329345703125,
0.0211334228515625,
0.036590576171875,
0.01129150390625,
-0.0006871223449707031,
-0.07501220703125,
-0.039764404296875,
-0.028717041015625,
0.0604248046875,
-0.0106658935546875,
-0.056732177734375,
-0.0173797607421875,
0.031524658203125,
0.015960693359375,
-0.012115478515625,
0.04864501953125,
-0.035247802734375,
0.004405975341796875,
-0.00010085105895996094,
0.0045166015625,
0.0193328857421875,
-0.02752685546875,
-0.038726806640625,
0.11724853515625,
0.047515869140625,
-0.064453125,
0.04583740234375,
-0.041107177734375,
0.0106048583984375,
0.045623779296875,
-0.0197601318359375,
-0.054351806640625,
0.0280609130859375,
-0.0267486572265625,
0.00017726421356201172,
-0.007476806640625,
-0.040191650390625,
-0.00399017333984375,
-0.06951904296875,
0.0133209228515625,
-0.042205810546875,
0.0736083984375,
0.02117919921875,
-0.0103607177734375,
0.0101776123046875,
-0.0614013671875,
0.03558349609375,
0.0014142990112304688,
-0.0045013427734375,
-0.03106689453125,
-0.0181884765625,
0.0009851455688476562,
-0.01861572265625,
-0.0016775131225585938,
-0.05108642578125,
0.015838623046875,
-0.0118408203125,
0.0016775131225585938,
0.060333251953125,
0.0220794677734375,
0.04052734375,
-0.00788116455078125,
0.0191650390625,
0.018341064453125,
0.03662109375,
0.00862884521484375,
-0.056182861328125,
-0.087158203125,
-0.0234832763671875,
0.042510986328125,
0.023651123046875,
-0.062255859375,
-0.00989532470703125,
-0.0083770751953125,
-0.05487060546875,
-0.057220458984375,
0.007167816162109375,
0.00974273681640625,
0.003070831298828125,
0.00838470458984375,
-0.0230865478515625,
-0.0333251953125,
-0.08837890625,
0.0064544677734375,
-0.0224609375,
-0.0283355712890625,
-0.00007069110870361328,
0.0250244140625,
-0.036865234375,
0.07708740234375,
-0.038177490234375,
0.0094451904296875,
0.010528564453125,
0.01448822021484375,
0.027587890625,
0.02294921875,
0.04534912109375,
-0.0443115234375,
-0.03228759765625,
-0.0009965896606445312,
-0.0224456787109375,
-0.003704071044921875,
0.003948211669921875,
-0.025665283203125,
-0.0306549072265625,
0.03338623046875,
-0.0233154296875,
0.040679931640625,
0.03759765625,
-0.059356689453125,
0.048614501953125,
-0.0027008056640625,
0.0092010498046875,
-0.047760009765625,
-0.0176239013671875,
-0.0084991455078125,
-0.0229034423828125,
-0.032958984375,
-0.009063720703125,
0.0256195068359375,
0.01483917236328125,
-0.06475830078125,
0.028839111328125,
-0.0182952880859375,
0.002918243408203125,
0.0017442703247070312,
-0.04071044921875,
-0.0191192626953125,
0.0019254684448242188,
-0.022430419921875,
0.1033935546875,
0.00997161865234375,
-0.034027099609375,
0.007038116455078125,
0.0211944580078125,
-0.00508880615234375,
0.032684326171875,
-0.0450439453125,
0.0167083740234375,
-0.0113983154296875,
-0.034576416015625,
-0.055450439453125,
-0.0232696533203125,
0.0242767333984375,
-0.033294677734375,
-0.0218353271484375,
-0.028045654296875,
-0.030364990234375,
-0.04852294921875,
-0.020538330078125,
0.0394287109375,
0.064208984375,
-0.057861328125,
0.034027099609375,
0.016143798828125,
0.0015344619750976562,
-0.04351806640625,
-0.048309326171875,
-0.02752685546875,
-0.01151275634765625,
-0.0245513916015625,
-0.00316619873046875,
-0.01268768310546875,
-0.0224456787109375,
-0.0024013519287109375,
0.00848388671875,
-0.032012939453125,
0.003856658935546875,
0.055145263671875,
0.0222320556640625,
-0.039459228515625,
0.015167236328125,
0.009307861328125,
-0.0139007568359375,
0.0241851806640625,
-0.006008148193359375,
0.03369140625,
-0.06646728515625,
-0.039459228515625,
0.0024433135986328125,
0.016357421875,
0.036956787109375,
0.02630615234375,
0.03167724609375,
0.06951904296875,
-0.022125244140625,
-0.048004150390625,
-0.0033969879150390625,
-0.0263671875,
-0.040496826171875,
0.0006794929504394531,
-0.031707763671875,
-0.052215576171875,
0.031402587890625,
0.01294708251953125,
0.004665374755859375,
0.0421142578125,
0.043487548828125,
-0.02056884765625,
0.035675048828125,
0.033599853515625,
0.0060882568359375,
0.0288848876953125,
-0.031280517578125,
0.02972412109375,
-0.035858154296875,
0.01267242431640625,
-0.0360107421875,
-0.0169677734375,
-0.02850341796875,
-0.059173583984375,
0.027435302734375,
-0.00579833984375,
-0.049468994140625,
0.01439666748046875,
-0.0455322265625,
0.06573486328125,
0.068603515625,
-0.013092041015625,
0.0237884521484375,
-0.0188140869140625,
-0.045654296875,
-0.03021240234375,
-0.044158935546875,
-0.022674560546875,
0.08441162109375,
0.014617919921875,
0.03985595703125,
0.00533294677734375,
0.06707763671875,
0.052093505859375,
0.04229736328125,
-0.05291748046875,
0.022125244140625,
0.00555419921875,
-0.053375244140625,
-0.04986572265625,
0.00460052490234375,
-0.07568359375,
0.016082763671875,
-0.01885986328125,
-0.0654296875,
0.016937255859375,
0.011810302734375,
-0.053192138671875,
0.0294342041015625,
-0.05322265625,
0.042144775390625,
-0.00896453857421875,
0.0006432533264160156,
0.036407470703125,
-0.02532958984375,
0.050506591796875,
-0.022735595703125,
0.0037384033203125,
0.0222015380859375,
0.02581787109375,
0.02276611328125,
-0.02899169921875,
0.0252532958984375,
-0.004795074462890625,
0.014373779296875,
0.06256103515625,
0.00588226318359375,
0.01885986328125,
0.0484619140625,
0.0244293212890625,
0.0016164779663085938,
0.00646209716796875,
-0.035552978515625,
0.0015621185302734375,
0.0408935546875,
-0.059844970703125,
-0.013885498046875,
-0.054473876953125,
-0.0076141357421875,
0.0163116455078125,
0.028472900390625,
0.0211029052734375,
0.02142333984375,
-0.0207366943359375,
0.055694580078125,
0.08294677734375,
-0.0005993843078613281,
0.04345703125,
0.03472900390625,
-0.02899169921875,
0.0188446044921875,
0.07952880859375,
-0.01447296142578125,
0.0184173583984375,
-0.007488250732421875,
0.0147247314453125,
0.0082244873046875,
-0.0227813720703125,
-0.038421630859375,
-0.01397705078125,
-0.032562255859375,
-0.0124664306640625,
-0.0357666015625,
-0.08087158203125,
0.01264190673828125,
-0.006114959716796875,
-0.05908203125,
-0.0205841064453125,
-0.01279449462890625,
-0.01471710205078125,
0.033538818359375,
0.0308990478515625,
-0.01055908203125,
0.045623779296875,
-0.0703125,
0.0290374755859375,
0.0535888671875,
0.007526397705078125,
-0.04071044921875,
-0.045745849609375,
-0.024444580078125,
-0.0160064697265625,
-0.016204833984375,
-0.014251708984375,
0.0203094482421875,
-0.003711700439453125,
0.044647216796875,
0.033355712890625,
0.01206207275390625,
-0.0017147064208984375,
-0.02996826171875,
0.0894775390625,
0.005306243896484375,
-0.041412353515625,
0.010284423828125,
-0.031341552734375,
0.0391845703125,
0.054412841796875,
-0.004398345947265625,
-0.01430511474609375,
-0.024627685546875,
-0.06854248046875,
-0.07037353515625,
0.0007729530334472656,
0.041717529296875,
-0.00023436546325683594,
-0.00302886962890625,
-0.006893157958984375,
0.00937652587890625,
0.0029010772705078125,
-0.031829833984375,
-0.054473876953125,
0.0001627206802368164,
-0.00344085693359375,
0.01422882080078125,
-0.023956298828125,
-0.02435302734375,
-0.0300140380859375,
0.02117919921875,
-0.027435302734375,
0.060821533203125,
0.015106201171875,
-0.02288818359375,
-0.0017843246459960938,
0.01241302490234375,
0.03387451171875,
-0.0096435546875,
-0.0338134765625,
0.0114898681640625,
0.048614501953125,
-0.077392578125,
0.0225067138671875,
-0.0181732177734375,
-0.0296783447265625,
0.0268402099609375,
-0.0208282470703125,
0.030242919921875,
0.031646728515625,
-0.0301055908203125,
0.035247802734375,
-0.0623779296875,
-0.0341796875,
-0.07696533203125,
0.03271484375,
0.00457763671875,
0.033111572265625,
0.050140380859375,
0.00994873046875,
0.027130126953125,
-0.00983428955078125,
0.04388427734375,
0.0025310516357421875,
-0.06585693359375,
0.0020275115966796875,
0.0545654296875,
0.018157958984375,
-0.043487548828125,
0.051055908203125,
-0.0404052734375,
-0.00849151611328125,
0.0183868408203125,
-0.021636962890625,
0.06951904296875,
0.0083770751953125,
0.03424072265625,
0.00782012939453125,
0.0543212890625,
0.0178680419921875,
0.0352783203125,
0.0109710693359375,
-0.047393798828125,
-0.01074981689453125,
-0.0635986328125,
-0.03668212890625,
0.002506256103515625,
-0.04193115234375,
0.04119873046875,
-0.036407470703125,
-0.0104827880859375,
-0.002513885498046875,
-0.014801025390625,
-0.09906005859375,
0.0175018310546875,
0.020172119140625,
0.0572509765625,
-0.060760498046875,
0.07818603515625,
0.033966064453125,
-0.0031280517578125,
-0.060333251953125,
-0.016815185546875,
0.0125732421875,
-0.06683349609375,
0.0343017578125,
-0.0280914306640625,
-0.0012664794921875,
0.0186309814453125,
-0.0218505859375,
-0.061981201171875,
0.0992431640625,
-0.0059356689453125,
-0.0504150390625,
0.0210418701171875,
0.0231781005859375,
0.029815673828125,
-0.00966644287109375,
0.044586181640625,
0.06671142578125,
0.02996826171875,
-0.02288818359375,
-0.04150390625,
-0.023468017578125,
-0.0345458984375,
0.01092529296875,
0.035675048828125,
-0.06134033203125,
0.07537841796875,
-0.01361846923828125,
0.005863189697265625,
0.0118255615234375,
0.06451416015625,
0.0269317626953125,
0.0090484619140625,
0.06640625,
0.04473876953125,
0.060211181640625,
-0.0322265625,
0.06414794921875,
-0.0106658935546875,
0.051055908203125,
0.08697509765625,
0.01079559326171875,
0.0428466796875,
0.0711669921875,
-0.0016765594482421875,
0.033935546875,
0.03271484375,
-0.01544952392578125,
0.037841796875,
0.02899169921875,
-0.038177490234375,
0.002033233642578125,
0.00391387939453125,
-0.033599853515625,
0.0176239013671875,
0.03192138671875,
-0.021392822265625,
-0.015869140625,
-0.0088653564453125,
-0.03692626953125,
-0.035797119140625,
-0.0026721954345703125,
0.0114898681640625,
0.017486572265625,
-0.00015735626220703125,
0.0125732421875,
-0.0309295654296875,
0.031402587890625,
-0.0234832763671875,
0.00489044189453125,
-0.005970001220703125,
0.0168609619140625,
0.0007700920104980469,
-0.078857421875,
0.01366424560546875,
-0.007350921630859375,
-0.01153564453125,
-0.00655364990234375,
0.054168701171875,
0.01110076904296875,
-0.04278564453125,
0.0010614395141601562,
0.0232391357421875,
-0.00476837158203125,
0.0217437744140625,
-0.017364501953125,
-0.00518798828125,
-0.005474090576171875,
-0.06494140625,
0.0218353271484375,
0.05279541015625,
-0.0029048919677734375,
0.049957275390625,
0.043853759765625,
-0.006439208984375,
0.045013427734375,
0.0285491943359375,
0.060333251953125,
-0.05023193359375,
-0.0267486572265625,
-0.0159454345703125,
0.083984375,
-0.0240020751953125,
-0.0594482421875,
0.044830322265625,
0.037506103515625,
0.056304931640625,
-0.0262298583984375,
0.066650390625,
0.0023365020751953125,
0.01285552978515625,
-0.0106964111328125,
0.045135498046875,
-0.021759033203125,
0.00211334228515625,
-0.005466461181640625,
-0.06549072265625,
0.005504608154296875,
0.034912109375,
0.0298004150390625,
-0.033599853515625,
0.08428955078125,
0.04144287109375,
-0.024810791015625,
0.001983642578125,
0.037933349609375,
0.024932861328125,
0.0173187255859375,
0.00017571449279785156,
0.0621337890625,
-0.055511474609375,
0.041015625,
-0.048370361328125,
0.005947113037109375,
-0.0191192626953125,
-0.02947998046875,
-0.05908203125,
-0.026824951171875,
-0.017608642578125,
-0.04010009765625,
-0.00868988037109375,
0.04888916015625,
0.040802001953125,
-0.057403564453125,
0.005832672119140625,
-0.006519317626953125,
-0.01052093505859375,
0.01271820068359375,
-0.0220489501953125,
-0.027252197265625,
0.0031337738037109375,
-0.040802001953125,
0.016754150390625,
0.01325225830078125,
0.0223541259765625,
-0.03814697265625,
0.0009927749633789062,
-0.0013990402221679688,
0.0229034423828125,
0.0257568359375,
0.0256195068359375,
-0.01361846923828125,
-0.0880126953125,
0.006103515625,
-0.0174713134765625,
-0.00786590576171875,
0.042266845703125,
-0.037750244140625,
0.0229339599609375,
0.042510986328125,
0.034881591796875,
0.04986572265625,
-0.00672149658203125,
0.03314208984375,
-0.06829833984375,
0.04168701171875,
0.027862548828125,
0.03485107421875,
-0.0143585205078125,
-0.0321044921875,
0.054656982421875,
0.0120391845703125,
-0.047271728515625,
-0.08740234375,
0.0012884140014648438,
-0.1004638671875,
0.029693603515625,
0.057098388671875,
0.0267333984375,
-0.043182373046875,
0.00687408447265625,
-0.040191650390625,
0.0025482177734375,
-0.0273590087890625,
0.040985107421875,
0.036285400390625,
0.02716064453125,
-0.0222930908203125,
-0.06591796875,
0.0221710205078125,
0.0022983551025390625,
-0.024261474609375,
-0.05084228515625,
-0.009857177734375,
0.0265045166015625,
0.01053619384765625,
0.02886962890625,
0.004718780517578125,
0.028717041015625,
0.033294677734375,
0.0127105712890625,
0.02117919921875,
-0.02825927734375,
-0.0038166046142578125,
0.00838470458984375,
0.006694793701171875,
-0.029022216796875
]
] |
timm/resnext101_32x16d.fb_swsl_ig1b_ft_in1k | 2023-04-05T19:15:51.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"arxiv:1905.00546",
"arxiv:1611.05431",
"arxiv:1512.03385",
"license:cc-by-nc-4.0",
"region:us"
] | image-classification | timm | null | null | timm/resnext101_32x16d.fb_swsl_ig1b_ft_in1k | 0 | 36,676 | timm | 2023-04-05T19:13:21 | ---
tags:
- image-classification
- timm
library_tag: timm
license: cc-by-nc-4.0
---
# Model card for resnext101_32x16d.fb_swsl_ig1b_ft_in1k
A ResNeXt-B image classification model.
This model features:
* ReLU activations
* single layer 7x7 convolution with pooling
* 1x1 convolution shortcut downsample
* grouped 3x3 bottleneck convolutions
Pretrained on Instagram-1B hashtags dataset using semi-weakly supervised learning and fine-tuned on ImageNet-1k by paper authors.
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 194.0
- GMACs: 36.3
- Activations (M): 51.2
- Image size: 224 x 224
- **Papers:**
- Billion-scale semi-supervised learning for image classification: https://arxiv.org/abs/1905.00546
- Aggregated Residual Transformations for Deep Neural Networks: https://arxiv.org/abs/1611.05431
- Deep Residual Learning for Image Recognition: https://arxiv.org/abs/1512.03385
- **Original:** https://github.com/facebookresearch/semi-supervised-ImageNet1K-models
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('resnext101_32x16d.fb_swsl_ig1b_ft_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
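For intuition, the softmax and top-k step at the end of the snippet above can be sketched in plain Python (an illustrative stand-alone example with made-up logits; the real pipeline operates on `torch` tensors):

```python
import math

def softmax(logits):
    # subtract the max for numerical stability before exponentiating
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def topk(probs, k):
    # return (values, indices) of the k largest probabilities
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)[:k]
    return [probs[i] for i in order], order

logits = [2.0, 1.0, 0.1, 3.0, -1.0]  # hypothetical per-class scores
probs = softmax(logits)
values, indices = topk(probs, 2)
print(indices)  # -> [3, 0], the two most likely class indices
```

This mirrors what `output.softmax(dim=1)` followed by `torch.topk(..., k=5)` computes per image, just without the batch dimension.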
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'resnext101_32x16d.fb_swsl_ig1b_ft_in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g.:
# torch.Size([1, 64, 112, 112])
# torch.Size([1, 256, 56, 56])
# torch.Size([1, 512, 28, 28])
# torch.Size([1, 1024, 14, 14])
# torch.Size([1, 2048, 7, 7])
print(o.shape)
```
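Each feature map above has shape `(1, C, H, W)`; a common way to turn the last one into a single vector is global average pooling over the spatial dimensions. A minimal pure-Python sketch of that reduction (illustrative only; in practice `timm`'s head does this on tensors):

```python
def global_avg_pool(feature_map):
    # feature_map: nested list shaped [C][H][W] -> list of C channel means
    return [
        sum(sum(row) for row in channel) / (len(channel) * len(channel[0]))
        for channel in feature_map
    ]

# a tiny hypothetical 2-channel, 2x2 feature map
fm = [
    [[1.0, 3.0], [5.0, 7.0]],
    [[0.0, 0.0], [2.0, 2.0]],
]
print(global_avg_pool(fm))  # -> [4.0, 1.0]
```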
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'resnext101_32x16d.fb_swsl_ig1b_ft_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 2048, 7, 7) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
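The `(batch_size, num_features)` embeddings produced above are typically compared with cosine similarity for retrieval or deduplication. A minimal pure-Python sketch of that comparison (illustrative; with real outputs you would use `torch.nn.functional.cosine_similarity` on the tensors directly):

```python
import math

def cosine_similarity(a, b):
    # cosine of the angle between two embedding vectors
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# hypothetical short embeddings
emb1 = [0.1, 0.3, -0.2]
emb2 = [0.1, 0.3, -0.2]
print(cosine_similarity(emb1, emb2))  # -> 1.0 for identical vectors
```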
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
|model |img_size|top1 |top5 |param_count|gmacs|macts|img/sec|
|------------------------------------------|--------|-----|-----|-----------|-----|-----|-------|
|[seresnextaa101d_32x8d.sw_in12k_ft_in1k_288](https://huggingface.co/timm/seresnextaa101d_32x8d.sw_in12k_ft_in1k_288)|320 |86.72|98.17|93.6 |35.2 |69.7 |451 |
|[seresnextaa101d_32x8d.sw_in12k_ft_in1k_288](https://huggingface.co/timm/seresnextaa101d_32x8d.sw_in12k_ft_in1k_288)|288 |86.51|98.08|93.6 |28.5 |56.4 |560 |
|[seresnextaa101d_32x8d.sw_in12k_ft_in1k](https://huggingface.co/timm/seresnextaa101d_32x8d.sw_in12k_ft_in1k)|288 |86.49|98.03|93.6 |28.5 |56.4 |557 |
|[seresnextaa101d_32x8d.sw_in12k_ft_in1k](https://huggingface.co/timm/seresnextaa101d_32x8d.sw_in12k_ft_in1k)|224 |85.96|97.82|93.6 |17.2 |34.2 |923 |
|[resnext101_32x32d.fb_wsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x32d.fb_wsl_ig1b_ft_in1k)|224 |85.11|97.44|468.5 |87.3 |91.1 |254 |
|[resnetrs420.tf_in1k](https://huggingface.co/timm/resnetrs420.tf_in1k)|416 |85.0 |97.12|191.9 |108.4|213.8|134 |
|[ecaresnet269d.ra2_in1k](https://huggingface.co/timm/ecaresnet269d.ra2_in1k)|352 |84.96|97.22|102.1 |50.2 |101.2|291 |
|[ecaresnet269d.ra2_in1k](https://huggingface.co/timm/ecaresnet269d.ra2_in1k)|320 |84.73|97.18|102.1 |41.5 |83.7 |353 |
|[resnetrs350.tf_in1k](https://huggingface.co/timm/resnetrs350.tf_in1k)|384 |84.71|96.99|164.0 |77.6 |154.7|183 |
|[seresnextaa101d_32x8d.ah_in1k](https://huggingface.co/timm/seresnextaa101d_32x8d.ah_in1k)|288 |84.57|97.08|93.6 |28.5 |56.4 |557 |
|[resnetrs200.tf_in1k](https://huggingface.co/timm/resnetrs200.tf_in1k)|320 |84.45|97.08|93.2 |31.5 |67.8 |446 |
|[resnetrs270.tf_in1k](https://huggingface.co/timm/resnetrs270.tf_in1k)|352 |84.43|96.97|129.9 |51.1 |105.5|280 |
|[seresnext101d_32x8d.ah_in1k](https://huggingface.co/timm/seresnext101d_32x8d.ah_in1k)|288 |84.36|96.92|93.6 |27.6 |53.0 |595 |
|[seresnet152d.ra2_in1k](https://huggingface.co/timm/seresnet152d.ra2_in1k)|320 |84.35|97.04|66.8 |24.1 |47.7 |610 |
|[resnetrs350.tf_in1k](https://huggingface.co/timm/resnetrs350.tf_in1k)|288 |84.3 |96.94|164.0 |43.7 |87.1 |333 |
|[resnext101_32x8d.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x8d.fb_swsl_ig1b_ft_in1k)|224 |84.28|97.17|88.8 |16.5 |31.2 |1100 |
|[resnetrs420.tf_in1k](https://huggingface.co/timm/resnetrs420.tf_in1k)|320 |84.24|96.86|191.9 |64.2 |126.6|228 |
|[seresnext101_32x8d.ah_in1k](https://huggingface.co/timm/seresnext101_32x8d.ah_in1k)|288 |84.19|96.87|93.6 |27.2 |51.6 |613 |
|[resnext101_32x16d.fb_wsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x16d.fb_wsl_ig1b_ft_in1k)|224 |84.18|97.19|194.0 |36.3 |51.2 |581 |
|[resnetaa101d.sw_in12k_ft_in1k](https://huggingface.co/timm/resnetaa101d.sw_in12k_ft_in1k)|288 |84.11|97.11|44.6 |15.1 |29.0 |1144 |
|[resnet200d.ra2_in1k](https://huggingface.co/timm/resnet200d.ra2_in1k)|320 |83.97|96.82|64.7 |31.2 |67.3 |518 |
|[resnetrs200.tf_in1k](https://huggingface.co/timm/resnetrs200.tf_in1k)|256 |83.87|96.75|93.2 |20.2 |43.4 |692 |
|[seresnextaa101d_32x8d.ah_in1k](https://huggingface.co/timm/seresnextaa101d_32x8d.ah_in1k)|224 |83.86|96.65|93.6 |17.2 |34.2 |923 |
|[resnetrs152.tf_in1k](https://huggingface.co/timm/resnetrs152.tf_in1k)|320 |83.72|96.61|86.6 |24.3 |48.1 |617 |
|[seresnet152d.ra2_in1k](https://huggingface.co/timm/seresnet152d.ra2_in1k)|256 |83.69|96.78|66.8 |15.4 |30.6 |943 |
|[seresnext101d_32x8d.ah_in1k](https://huggingface.co/timm/seresnext101d_32x8d.ah_in1k)|224 |83.68|96.61|93.6 |16.7 |32.0 |986 |
|[resnet152d.ra2_in1k](https://huggingface.co/timm/resnet152d.ra2_in1k)|320 |83.67|96.74|60.2 |24.1 |47.7 |706 |
|[resnetrs270.tf_in1k](https://huggingface.co/timm/resnetrs270.tf_in1k)|256 |83.59|96.61|129.9 |27.1 |55.8 |526 |
|[seresnext101_32x8d.ah_in1k](https://huggingface.co/timm/seresnext101_32x8d.ah_in1k)|224 |83.58|96.4 |93.6 |16.5 |31.2 |1013 |
|[resnetaa101d.sw_in12k_ft_in1k](https://huggingface.co/timm/resnetaa101d.sw_in12k_ft_in1k)|224 |83.54|96.83|44.6 |9.1 |17.6 |1864 |
|[resnet152.a1h_in1k](https://huggingface.co/timm/resnet152.a1h_in1k)|288 |83.46|96.54|60.2 |19.1 |37.3 |904 |
|[resnext101_32x16d.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x16d.fb_swsl_ig1b_ft_in1k)|224 |83.35|96.85|194.0 |36.3 |51.2 |582 |
|[resnet200d.ra2_in1k](https://huggingface.co/timm/resnet200d.ra2_in1k)|256 |83.23|96.53|64.7 |20.0 |43.1 |809 |
|[resnext101_32x4d.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x4d.fb_swsl_ig1b_ft_in1k)|224 |83.22|96.75|44.2 |8.0 |21.2 |1814 |
|[resnext101_64x4d.c1_in1k](https://huggingface.co/timm/resnext101_64x4d.c1_in1k)|288 |83.16|96.38|83.5 |25.7 |51.6 |590 |
|[resnet152d.ra2_in1k](https://huggingface.co/timm/resnet152d.ra2_in1k)|256 |83.14|96.38|60.2 |15.4 |30.5 |1096 |
|[resnet101d.ra2_in1k](https://huggingface.co/timm/resnet101d.ra2_in1k)|320 |83.02|96.45|44.6 |16.5 |34.8 |992 |
|[ecaresnet101d.miil_in1k](https://huggingface.co/timm/ecaresnet101d.miil_in1k)|288 |82.98|96.54|44.6 |13.4 |28.2 |1077 |
|[resnext101_64x4d.tv_in1k](https://huggingface.co/timm/resnext101_64x4d.tv_in1k)|224 |82.98|96.25|83.5 |15.5 |31.2 |989 |
|[resnetrs152.tf_in1k](https://huggingface.co/timm/resnetrs152.tf_in1k)|256 |82.86|96.28|86.6 |15.6 |30.8 |951 |
|[resnext101_32x8d.tv2_in1k](https://huggingface.co/timm/resnext101_32x8d.tv2_in1k)|224 |82.83|96.22|88.8 |16.5 |31.2 |1099 |
|[resnet152.a1h_in1k](https://huggingface.co/timm/resnet152.a1h_in1k)|224 |82.8 |96.13|60.2 |11.6 |22.6 |1486 |
|[resnet101.a1h_in1k](https://huggingface.co/timm/resnet101.a1h_in1k)|288 |82.8 |96.32|44.6 |13.0 |26.8 |1291 |
|[resnet152.a1_in1k](https://huggingface.co/timm/resnet152.a1_in1k)|288 |82.74|95.71|60.2 |19.1 |37.3 |905 |
|[resnext101_32x8d.fb_wsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext101_32x8d.fb_wsl_ig1b_ft_in1k)|224 |82.69|96.63|88.8 |16.5 |31.2 |1100 |
|[resnet152.a2_in1k](https://huggingface.co/timm/resnet152.a2_in1k)|288 |82.62|95.75|60.2 |19.1 |37.3 |904 |
|[resnetaa50d.sw_in12k_ft_in1k](https://huggingface.co/timm/resnetaa50d.sw_in12k_ft_in1k)|288 |82.61|96.49|25.6 |8.9 |20.6 |1729 |
|[resnet61q.ra2_in1k](https://huggingface.co/timm/resnet61q.ra2_in1k)|288 |82.53|96.13|36.8 |9.9 |21.5 |1773 |
|[wide_resnet101_2.tv2_in1k](https://huggingface.co/timm/wide_resnet101_2.tv2_in1k)|224 |82.5 |96.02|126.9 |22.8 |21.2 |1078 |
|[resnext101_64x4d.c1_in1k](https://huggingface.co/timm/resnext101_64x4d.c1_in1k)|224 |82.46|95.92|83.5 |15.5 |31.2 |987 |
|[resnet51q.ra2_in1k](https://huggingface.co/timm/resnet51q.ra2_in1k)|288 |82.36|96.18|35.7 |8.1 |20.9 |1964 |
|[ecaresnet50t.ra2_in1k](https://huggingface.co/timm/ecaresnet50t.ra2_in1k)|320 |82.35|96.14|25.6 |8.8 |24.1 |1386 |
|[resnet101.a1_in1k](https://huggingface.co/timm/resnet101.a1_in1k)|288 |82.31|95.63|44.6 |13.0 |26.8 |1291 |
|[resnetrs101.tf_in1k](https://huggingface.co/timm/resnetrs101.tf_in1k)|288 |82.29|96.01|63.6 |13.6 |28.5 |1078 |
|[resnet152.tv2_in1k](https://huggingface.co/timm/resnet152.tv2_in1k)|224 |82.29|96.0 |60.2 |11.6 |22.6 |1484 |
|[wide_resnet50_2.racm_in1k](https://huggingface.co/timm/wide_resnet50_2.racm_in1k)|288 |82.27|96.06|68.9 |18.9 |23.8 |1176 |
|[resnet101d.ra2_in1k](https://huggingface.co/timm/resnet101d.ra2_in1k)|256 |82.26|96.07|44.6 |10.6 |22.2 |1542 |
|[resnet101.a2_in1k](https://huggingface.co/timm/resnet101.a2_in1k)|288 |82.24|95.73|44.6 |13.0 |26.8 |1290 |
|[seresnext50_32x4d.racm_in1k](https://huggingface.co/timm/seresnext50_32x4d.racm_in1k)|288 |82.2 |96.14|27.6 |7.0 |23.8 |1547 |
|[ecaresnet101d.miil_in1k](https://huggingface.co/timm/ecaresnet101d.miil_in1k)|224 |82.18|96.05|44.6 |8.1 |17.1 |1771 |
|[resnext50_32x4d.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnext50_32x4d.fb_swsl_ig1b_ft_in1k)|224 |82.17|96.22|25.0 |4.3 |14.4 |2943 |
|[ecaresnet50t.a1_in1k](https://huggingface.co/timm/ecaresnet50t.a1_in1k)|288 |82.12|95.65|25.6 |7.1 |19.6 |1704 |
|[resnext50_32x4d.a1h_in1k](https://huggingface.co/timm/resnext50_32x4d.a1h_in1k)|288 |82.03|95.94|25.0 |7.0 |23.8 |1745 |
|[ecaresnet101d_pruned.miil_in1k](https://huggingface.co/timm/ecaresnet101d_pruned.miil_in1k)|288 |82.0 |96.15|24.9 |5.8 |12.7 |1787 |
|[resnet61q.ra2_in1k](https://huggingface.co/timm/resnet61q.ra2_in1k)|256 |81.99|95.85|36.8 |7.8 |17.0 |2230 |
|[resnext101_32x8d.tv2_in1k](https://huggingface.co/timm/resnext101_32x8d.tv2_in1k)|176 |81.98|95.72|88.8 |10.3 |19.4 |1768 |
|[resnet152.a1_in1k](https://huggingface.co/timm/resnet152.a1_in1k)|224 |81.97|95.24|60.2 |11.6 |22.6 |1486 |
|[resnet101.a1h_in1k](https://huggingface.co/timm/resnet101.a1h_in1k)|224 |81.93|95.75|44.6 |7.8 |16.2 |2122 |
|[resnet101.tv2_in1k](https://huggingface.co/timm/resnet101.tv2_in1k)|224 |81.9 |95.77|44.6 |7.8 |16.2 |2118 |
|[resnext101_32x16d.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnext101_32x16d.fb_ssl_yfcc100m_ft_in1k)|224 |81.84|96.1 |194.0 |36.3 |51.2 |583 |
|[resnet51q.ra2_in1k](https://huggingface.co/timm/resnet51q.ra2_in1k)|256 |81.78|95.94|35.7 |6.4 |16.6 |2471 |
|[resnet152.a2_in1k](https://huggingface.co/timm/resnet152.a2_in1k)|224 |81.77|95.22|60.2 |11.6 |22.6 |1485 |
|[resnetaa50d.sw_in12k_ft_in1k](https://huggingface.co/timm/resnetaa50d.sw_in12k_ft_in1k)|224 |81.74|96.06|25.6 |5.4 |12.4 |2813 |
|[ecaresnet50t.a2_in1k](https://huggingface.co/timm/ecaresnet50t.a2_in1k)|288 |81.65|95.54|25.6 |7.1 |19.6 |1703 |
|[ecaresnet50d.miil_in1k](https://huggingface.co/timm/ecaresnet50d.miil_in1k)|288 |81.64|95.88|25.6 |7.2 |19.7 |1694 |
|[resnext101_32x8d.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnext101_32x8d.fb_ssl_yfcc100m_ft_in1k)|224 |81.62|96.04|88.8 |16.5 |31.2 |1101 |
|[wide_resnet50_2.tv2_in1k](https://huggingface.co/timm/wide_resnet50_2.tv2_in1k)|224 |81.61|95.76|68.9 |11.4 |14.4 |1930 |
|[resnetaa50.a1h_in1k](https://huggingface.co/timm/resnetaa50.a1h_in1k)|288 |81.61|95.83|25.6 |8.5 |19.2 |1868 |
|[resnet101.a1_in1k](https://huggingface.co/timm/resnet101.a1_in1k)|224 |81.5 |95.16|44.6 |7.8 |16.2 |2125 |
|[resnext50_32x4d.a1_in1k](https://huggingface.co/timm/resnext50_32x4d.a1_in1k)|288 |81.48|95.16|25.0 |7.0 |23.8 |1745 |
|[gcresnet50t.ra2_in1k](https://huggingface.co/timm/gcresnet50t.ra2_in1k)|288 |81.47|95.71|25.9 |6.9 |18.6 |2071 |
|[wide_resnet50_2.racm_in1k](https://huggingface.co/timm/wide_resnet50_2.racm_in1k)|224 |81.45|95.53|68.9 |11.4 |14.4 |1929 |
|[resnet50d.a1_in1k](https://huggingface.co/timm/resnet50d.a1_in1k)|288 |81.44|95.22|25.6 |7.2 |19.7 |1908 |
|[ecaresnet50t.ra2_in1k](https://huggingface.co/timm/ecaresnet50t.ra2_in1k)|256 |81.44|95.67|25.6 |5.6 |15.4 |2168 |
|[ecaresnetlight.miil_in1k](https://huggingface.co/timm/ecaresnetlight.miil_in1k)|288 |81.4 |95.82|30.2 |6.8 |13.9 |2132 |
|[resnet50d.ra2_in1k](https://huggingface.co/timm/resnet50d.ra2_in1k)|288 |81.37|95.74|25.6 |7.2 |19.7 |1910 |
|[resnet101.a2_in1k](https://huggingface.co/timm/resnet101.a2_in1k)|224 |81.32|95.19|44.6 |7.8 |16.2 |2125 |
|[seresnet50.ra2_in1k](https://huggingface.co/timm/seresnet50.ra2_in1k)|288 |81.3 |95.65|28.1 |6.8 |18.4 |1803 |
|[resnext50_32x4d.a2_in1k](https://huggingface.co/timm/resnext50_32x4d.a2_in1k)|288 |81.3 |95.11|25.0 |7.0 |23.8 |1746 |
|[seresnext50_32x4d.racm_in1k](https://huggingface.co/timm/seresnext50_32x4d.racm_in1k)|224 |81.27|95.62|27.6 |4.3 |14.4 |2591 |
|[ecaresnet50t.a1_in1k](https://huggingface.co/timm/ecaresnet50t.a1_in1k)|224 |81.26|95.16|25.6 |4.3 |11.8 |2823 |
|[gcresnext50ts.ch_in1k](https://huggingface.co/timm/gcresnext50ts.ch_in1k)|288 |81.23|95.54|15.7 |4.8 |19.6 |2117 |
|[senet154.gluon_in1k](https://huggingface.co/timm/senet154.gluon_in1k)|224 |81.23|95.35|115.1 |20.8 |38.7 |545 |
|[resnet50.a1_in1k](https://huggingface.co/timm/resnet50.a1_in1k)|288 |81.22|95.11|25.6 |6.8 |18.4 |2089 |
|[resnet50_gn.a1h_in1k](https://huggingface.co/timm/resnet50_gn.a1h_in1k)|288 |81.22|95.63|25.6 |6.8 |18.4 |676 |
|[resnet50d.a2_in1k](https://huggingface.co/timm/resnet50d.a2_in1k)|288 |81.18|95.09|25.6 |7.2 |19.7 |1908 |
|[resnet50.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnet50.fb_swsl_ig1b_ft_in1k)|224 |81.18|95.98|25.6 |4.1 |11.1 |3455 |
|[resnext50_32x4d.tv2_in1k](https://huggingface.co/timm/resnext50_32x4d.tv2_in1k)|224 |81.17|95.34|25.0 |4.3 |14.4 |2933 |
|[resnext50_32x4d.a1h_in1k](https://huggingface.co/timm/resnext50_32x4d.a1h_in1k)|224 |81.1 |95.33|25.0 |4.3 |14.4 |2934 |
|[seresnet50.a2_in1k](https://huggingface.co/timm/seresnet50.a2_in1k)|288 |81.1 |95.23|28.1 |6.8 |18.4 |1801 |
|[seresnet50.a1_in1k](https://huggingface.co/timm/seresnet50.a1_in1k)|288 |81.1 |95.12|28.1 |6.8 |18.4 |1799 |
|[resnet152s.gluon_in1k](https://huggingface.co/timm/resnet152s.gluon_in1k)|224 |81.02|95.41|60.3 |12.9 |25.0 |1347 |
|[resnet50.d_in1k](https://huggingface.co/timm/resnet50.d_in1k)|288 |80.97|95.44|25.6 |6.8 |18.4 |2085 |
|[gcresnet50t.ra2_in1k](https://huggingface.co/timm/gcresnet50t.ra2_in1k)|256 |80.94|95.45|25.9 |5.4 |14.7 |2571 |
|[resnext101_32x4d.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnext101_32x4d.fb_ssl_yfcc100m_ft_in1k)|224 |80.93|95.73|44.2 |8.0 |21.2 |1814 |
|[resnet50.c1_in1k](https://huggingface.co/timm/resnet50.c1_in1k)|288 |80.91|95.55|25.6 |6.8 |18.4 |2084 |
|[seresnext101_32x4d.gluon_in1k](https://huggingface.co/timm/seresnext101_32x4d.gluon_in1k)|224 |80.9 |95.31|49.0 |8.0 |21.3 |1585 |
|[seresnext101_64x4d.gluon_in1k](https://huggingface.co/timm/seresnext101_64x4d.gluon_in1k)|224 |80.9 |95.3 |88.2 |15.5 |31.2 |918 |
|[resnet50.c2_in1k](https://huggingface.co/timm/resnet50.c2_in1k)|288 |80.86|95.52|25.6 |6.8 |18.4 |2085 |
|[resnet50.tv2_in1k](https://huggingface.co/timm/resnet50.tv2_in1k)|224 |80.85|95.43|25.6 |4.1 |11.1 |3450 |
|[ecaresnet50t.a2_in1k](https://huggingface.co/timm/ecaresnet50t.a2_in1k)|224 |80.84|95.02|25.6 |4.3 |11.8 |2821 |
|[ecaresnet101d_pruned.miil_in1k](https://huggingface.co/timm/ecaresnet101d_pruned.miil_in1k)|224 |80.79|95.62|24.9 |3.5 |7.7 |2961 |
|[seresnet33ts.ra2_in1k](https://huggingface.co/timm/seresnet33ts.ra2_in1k)|288 |80.79|95.36|19.8 |6.0 |14.8 |2506 |
|[ecaresnet50d_pruned.miil_in1k](https://huggingface.co/timm/ecaresnet50d_pruned.miil_in1k)|288 |80.79|95.58|19.9 |4.2 |10.6 |2349 |
|[resnet50.a2_in1k](https://huggingface.co/timm/resnet50.a2_in1k)|288 |80.78|94.99|25.6 |6.8 |18.4 |2088 |
|[resnet50.b1k_in1k](https://huggingface.co/timm/resnet50.b1k_in1k)|288 |80.71|95.43|25.6 |6.8 |18.4 |2087 |
|[resnext50_32x4d.ra_in1k](https://huggingface.co/timm/resnext50_32x4d.ra_in1k)|288 |80.7 |95.39|25.0 |7.0 |23.8 |1749 |
|[resnetrs101.tf_in1k](https://huggingface.co/timm/resnetrs101.tf_in1k)|192 |80.69|95.24|63.6 |6.0 |12.7 |2270 |
|[resnet50d.a1_in1k](https://huggingface.co/timm/resnet50d.a1_in1k)|224 |80.68|94.71|25.6 |4.4 |11.9 |3162 |
|[eca_resnet33ts.ra2_in1k](https://huggingface.co/timm/eca_resnet33ts.ra2_in1k)|288 |80.68|95.36|19.7 |6.0 |14.8 |2637 |
|[resnet50.a1h_in1k](https://huggingface.co/timm/resnet50.a1h_in1k)|224 |80.67|95.3 |25.6 |4.1 |11.1 |3452 |
|[resnext50d_32x4d.bt_in1k](https://huggingface.co/timm/resnext50d_32x4d.bt_in1k)|288 |80.67|95.42|25.0 |7.4 |25.1 |1626 |
|[resnetaa50.a1h_in1k](https://huggingface.co/timm/resnetaa50.a1h_in1k)|224 |80.63|95.21|25.6 |5.2 |11.6 |3034 |
|[ecaresnet50d.miil_in1k](https://huggingface.co/timm/ecaresnet50d.miil_in1k)|224 |80.61|95.32|25.6 |4.4 |11.9 |2813 |
|[resnext101_64x4d.gluon_in1k](https://huggingface.co/timm/resnext101_64x4d.gluon_in1k)|224 |80.61|94.99|83.5 |15.5 |31.2 |989 |
|[gcresnet33ts.ra2_in1k](https://huggingface.co/timm/gcresnet33ts.ra2_in1k)|288 |80.6 |95.31|19.9 |6.0 |14.8 |2578 |
|[gcresnext50ts.ch_in1k](https://huggingface.co/timm/gcresnext50ts.ch_in1k)|256 |80.57|95.17|15.7 |3.8 |15.5 |2710 |
|[resnet152.a3_in1k](https://huggingface.co/timm/resnet152.a3_in1k)|224 |80.56|95.0 |60.2 |11.6 |22.6 |1483 |
|[resnet50d.ra2_in1k](https://huggingface.co/timm/resnet50d.ra2_in1k)|224 |80.53|95.16|25.6 |4.4 |11.9 |3164 |
|[resnext50_32x4d.a1_in1k](https://huggingface.co/timm/resnext50_32x4d.a1_in1k)|224 |80.53|94.46|25.0 |4.3 |14.4 |2930 |
|[wide_resnet101_2.tv2_in1k](https://huggingface.co/timm/wide_resnet101_2.tv2_in1k)|176 |80.48|94.98|126.9 |14.3 |13.2 |1719 |
|[resnet152d.gluon_in1k](https://huggingface.co/timm/resnet152d.gluon_in1k)|224 |80.47|95.2 |60.2 |11.8 |23.4 |1428 |
|[resnet50.b2k_in1k](https://huggingface.co/timm/resnet50.b2k_in1k)|288 |80.45|95.32|25.6 |6.8 |18.4 |2086 |
|[ecaresnetlight.miil_in1k](https://huggingface.co/timm/ecaresnetlight.miil_in1k)|224 |80.45|95.24|30.2 |4.1 |8.4 |3530 |
|[resnext50_32x4d.a2_in1k](https://huggingface.co/timm/resnext50_32x4d.a2_in1k)|224 |80.45|94.63|25.0 |4.3 |14.4 |2936 |
|[wide_resnet50_2.tv2_in1k](https://huggingface.co/timm/wide_resnet50_2.tv2_in1k)|176 |80.43|95.09|68.9 |7.3 |9.0 |3015 |
|[resnet101d.gluon_in1k](https://huggingface.co/timm/resnet101d.gluon_in1k)|224 |80.42|95.01|44.6 |8.1 |17.0 |2007 |
|[resnet50.a1_in1k](https://huggingface.co/timm/resnet50.a1_in1k)|224 |80.38|94.6 |25.6 |4.1 |11.1 |3461 |
|[seresnet33ts.ra2_in1k](https://huggingface.co/timm/seresnet33ts.ra2_in1k)|256 |80.36|95.1 |19.8 |4.8 |11.7 |3267 |
|[resnext101_32x4d.gluon_in1k](https://huggingface.co/timm/resnext101_32x4d.gluon_in1k)|224 |80.34|94.93|44.2 |8.0 |21.2 |1814 |
|[resnext50_32x4d.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnext50_32x4d.fb_ssl_yfcc100m_ft_in1k)|224 |80.32|95.4 |25.0 |4.3 |14.4 |2941 |
|[resnet101s.gluon_in1k](https://huggingface.co/timm/resnet101s.gluon_in1k)|224 |80.28|95.16|44.7 |9.2 |18.6 |1851 |
|[seresnet50.ra2_in1k](https://huggingface.co/timm/seresnet50.ra2_in1k)|224 |80.26|95.08|28.1 |4.1 |11.1 |2972 |
|[resnetblur50.bt_in1k](https://huggingface.co/timm/resnetblur50.bt_in1k)|288 |80.24|95.24|25.6 |8.5 |19.9 |1523 |
|[resnet50d.a2_in1k](https://huggingface.co/timm/resnet50d.a2_in1k)|224 |80.22|94.63|25.6 |4.4 |11.9 |3162 |
|[resnet152.tv2_in1k](https://huggingface.co/timm/resnet152.tv2_in1k)|176 |80.2 |94.64|60.2 |7.2 |14.0 |2346 |
|[seresnet50.a2_in1k](https://huggingface.co/timm/seresnet50.a2_in1k)|224 |80.08|94.74|28.1 |4.1 |11.1 |2969 |
|[eca_resnet33ts.ra2_in1k](https://huggingface.co/timm/eca_resnet33ts.ra2_in1k)|256 |80.08|94.97|19.7 |4.8 |11.7 |3284 |
|[gcresnet33ts.ra2_in1k](https://huggingface.co/timm/gcresnet33ts.ra2_in1k)|256 |80.06|94.99|19.9 |4.8 |11.7 |3216 |
|[resnet50_gn.a1h_in1k](https://huggingface.co/timm/resnet50_gn.a1h_in1k)|224 |80.06|94.95|25.6 |4.1 |11.1 |1109 |
|[seresnet50.a1_in1k](https://huggingface.co/timm/seresnet50.a1_in1k)|224 |80.02|94.71|28.1 |4.1 |11.1 |2962 |
|[resnet50.ram_in1k](https://huggingface.co/timm/resnet50.ram_in1k)|288 |79.97|95.05|25.6 |6.8 |18.4 |2086 |
|[resnet152c.gluon_in1k](https://huggingface.co/timm/resnet152c.gluon_in1k)|224 |79.92|94.84|60.2 |11.8 |23.4 |1455 |
|[seresnext50_32x4d.gluon_in1k](https://huggingface.co/timm/seresnext50_32x4d.gluon_in1k)|224 |79.91|94.82|27.6 |4.3 |14.4 |2591 |
|[resnet50.d_in1k](https://huggingface.co/timm/resnet50.d_in1k)|224 |79.91|94.67|25.6 |4.1 |11.1 |3456 |
|[resnet101.tv2_in1k](https://huggingface.co/timm/resnet101.tv2_in1k)|176 |79.9 |94.6 |44.6 |4.9 |10.1 |3341 |
|[resnetrs50.tf_in1k](https://huggingface.co/timm/resnetrs50.tf_in1k)|224 |79.89|94.97|35.7 |4.5 |12.1 |2774 |
|[resnet50.c2_in1k](https://huggingface.co/timm/resnet50.c2_in1k)|224 |79.88|94.87|25.6 |4.1 |11.1 |3455 |
|[ecaresnet26t.ra2_in1k](https://huggingface.co/timm/ecaresnet26t.ra2_in1k)|320 |79.86|95.07|16.0 |5.2 |16.4 |2168 |
|[resnet50.a2_in1k](https://huggingface.co/timm/resnet50.a2_in1k)|224 |79.85|94.56|25.6 |4.1 |11.1 |3460 |
|[resnet50.ra_in1k](https://huggingface.co/timm/resnet50.ra_in1k)|288 |79.83|94.97|25.6 |6.8 |18.4 |2087 |
|[resnet101.a3_in1k](https://huggingface.co/timm/resnet101.a3_in1k)|224 |79.82|94.62|44.6 |7.8 |16.2 |2114 |
|[resnext50_32x4d.ra_in1k](https://huggingface.co/timm/resnext50_32x4d.ra_in1k)|224 |79.76|94.6 |25.0 |4.3 |14.4 |2943 |
|[resnet50.c1_in1k](https://huggingface.co/timm/resnet50.c1_in1k)|224 |79.74|94.95|25.6 |4.1 |11.1 |3455 |
|[ecaresnet50d_pruned.miil_in1k](https://huggingface.co/timm/ecaresnet50d_pruned.miil_in1k)|224 |79.74|94.87|19.9 |2.5 |6.4 |3929 |
|[resnet33ts.ra2_in1k](https://huggingface.co/timm/resnet33ts.ra2_in1k)|288 |79.71|94.83|19.7 |6.0 |14.8 |2710 |
|[resnet152.gluon_in1k](https://huggingface.co/timm/resnet152.gluon_in1k)|224 |79.68|94.74|60.2 |11.6 |22.6 |1486 |
|[resnext50d_32x4d.bt_in1k](https://huggingface.co/timm/resnext50d_32x4d.bt_in1k)|224 |79.67|94.87|25.0 |4.5 |15.2 |2729 |
|[resnet50.bt_in1k](https://huggingface.co/timm/resnet50.bt_in1k)|288 |79.63|94.91|25.6 |6.8 |18.4 |2086 |
|[ecaresnet50t.a3_in1k](https://huggingface.co/timm/ecaresnet50t.a3_in1k)|224 |79.56|94.72|25.6 |4.3 |11.8 |2805 |
|[resnet101c.gluon_in1k](https://huggingface.co/timm/resnet101c.gluon_in1k)|224 |79.53|94.58|44.6 |8.1 |17.0 |2062 |
|[resnet50.b1k_in1k](https://huggingface.co/timm/resnet50.b1k_in1k)|224 |79.52|94.61|25.6 |4.1 |11.1 |3459 |
|[resnet50.tv2_in1k](https://huggingface.co/timm/resnet50.tv2_in1k)|176 |79.42|94.64|25.6 |2.6 |6.9 |5397 |
|[resnet32ts.ra2_in1k](https://huggingface.co/timm/resnet32ts.ra2_in1k)|288 |79.4 |94.66|18.0 |5.9 |14.6 |2752 |
|[resnet50.b2k_in1k](https://huggingface.co/timm/resnet50.b2k_in1k)|224 |79.38|94.57|25.6 |4.1 |11.1 |3459 |
|[resnext50_32x4d.tv2_in1k](https://huggingface.co/timm/resnext50_32x4d.tv2_in1k)|176 |79.37|94.3 |25.0 |2.7 |9.0 |4577 |
|[resnext50_32x4d.gluon_in1k](https://huggingface.co/timm/resnext50_32x4d.gluon_in1k)|224 |79.36|94.43|25.0 |4.3 |14.4 |2942 |
|[resnext101_32x8d.tv_in1k](https://huggingface.co/timm/resnext101_32x8d.tv_in1k)|224 |79.31|94.52|88.8 |16.5 |31.2 |1100 |
|[resnet101.gluon_in1k](https://huggingface.co/timm/resnet101.gluon_in1k)|224 |79.31|94.53|44.6 |7.8 |16.2 |2125 |
|[resnetblur50.bt_in1k](https://huggingface.co/timm/resnetblur50.bt_in1k)|224 |79.31|94.63|25.6 |5.2 |12.0 |2524 |
|[resnet50.a1h_in1k](https://huggingface.co/timm/resnet50.a1h_in1k)|176 |79.27|94.49|25.6 |2.6 |6.9 |5404 |
|[resnext50_32x4d.a3_in1k](https://huggingface.co/timm/resnext50_32x4d.a3_in1k)|224 |79.25|94.31|25.0 |4.3 |14.4 |2931 |
|[resnet50.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnet50.fb_ssl_yfcc100m_ft_in1k)|224 |79.22|94.84|25.6 |4.1 |11.1 |3451 |
|[resnet33ts.ra2_in1k](https://huggingface.co/timm/resnet33ts.ra2_in1k)|256 |79.21|94.56|19.7 |4.8 |11.7 |3392 |
|[resnet50d.gluon_in1k](https://huggingface.co/timm/resnet50d.gluon_in1k)|224 |79.07|94.48|25.6 |4.4 |11.9 |3162 |
|[resnet50.ram_in1k](https://huggingface.co/timm/resnet50.ram_in1k)|224 |79.03|94.38|25.6 |4.1 |11.1 |3453 |
|[resnet50.am_in1k](https://huggingface.co/timm/resnet50.am_in1k)|224 |79.01|94.39|25.6 |4.1 |11.1 |3461 |
|[resnet32ts.ra2_in1k](https://huggingface.co/timm/resnet32ts.ra2_in1k)|256 |79.01|94.37|18.0 |4.6 |11.6 |3440 |
|[ecaresnet26t.ra2_in1k](https://huggingface.co/timm/ecaresnet26t.ra2_in1k)|256 |78.9 |94.54|16.0 |3.4 |10.5 |3421 |
|[resnet152.a3_in1k](https://huggingface.co/timm/resnet152.a3_in1k)|160 |78.89|94.11|60.2 |5.9 |11.5 |2745 |
|[wide_resnet101_2.tv_in1k](https://huggingface.co/timm/wide_resnet101_2.tv_in1k)|224 |78.84|94.28|126.9 |22.8 |21.2 |1079 |
|[seresnext26d_32x4d.bt_in1k](https://huggingface.co/timm/seresnext26d_32x4d.bt_in1k)|288 |78.83|94.24|16.8 |4.5 |16.8 |2251 |
|[resnet50.ra_in1k](https://huggingface.co/timm/resnet50.ra_in1k)|224 |78.81|94.32|25.6 |4.1 |11.1 |3454 |
|[seresnext26t_32x4d.bt_in1k](https://huggingface.co/timm/seresnext26t_32x4d.bt_in1k)|288 |78.74|94.33|16.8 |4.5 |16.7 |2264 |
|[resnet50s.gluon_in1k](https://huggingface.co/timm/resnet50s.gluon_in1k)|224 |78.72|94.23|25.7 |5.5 |13.5 |2796 |
|[resnet50d.a3_in1k](https://huggingface.co/timm/resnet50d.a3_in1k)|224 |78.71|94.24|25.6 |4.4 |11.9 |3154 |
|[wide_resnet50_2.tv_in1k](https://huggingface.co/timm/wide_resnet50_2.tv_in1k)|224 |78.47|94.09|68.9 |11.4 |14.4 |1934 |
|[resnet50.bt_in1k](https://huggingface.co/timm/resnet50.bt_in1k)|224 |78.46|94.27|25.6 |4.1 |11.1 |3454 |
|[resnet34d.ra2_in1k](https://huggingface.co/timm/resnet34d.ra2_in1k)|288 |78.43|94.35|21.8 |6.5 |7.5 |3291 |
|[gcresnext26ts.ch_in1k](https://huggingface.co/timm/gcresnext26ts.ch_in1k)|288 |78.42|94.04|10.5 |3.1 |13.3 |3226 |
|[resnet26t.ra2_in1k](https://huggingface.co/timm/resnet26t.ra2_in1k)|320 |78.33|94.13|16.0 |5.2 |16.4 |2391 |
|[resnet152.tv_in1k](https://huggingface.co/timm/resnet152.tv_in1k)|224 |78.32|94.04|60.2 |11.6 |22.6 |1487 |
|[seresnext26ts.ch_in1k](https://huggingface.co/timm/seresnext26ts.ch_in1k)|288 |78.28|94.1 |10.4 |3.1 |13.3 |3062 |
|[bat_resnext26ts.ch_in1k](https://huggingface.co/timm/bat_resnext26ts.ch_in1k)|256 |78.25|94.1 |10.7 |2.5 |12.5 |3393 |
|[resnet50.a3_in1k](https://huggingface.co/timm/resnet50.a3_in1k)|224 |78.06|93.78|25.6 |4.1 |11.1 |3450 |
|[resnet50c.gluon_in1k](https://huggingface.co/timm/resnet50c.gluon_in1k)|224 |78.0 |93.99|25.6 |4.4 |11.9 |3286 |
|[eca_resnext26ts.ch_in1k](https://huggingface.co/timm/eca_resnext26ts.ch_in1k)|288 |78.0 |93.91|10.3 |3.1 |13.3 |3297 |
|[seresnext26t_32x4d.bt_in1k](https://huggingface.co/timm/seresnext26t_32x4d.bt_in1k)|224 |77.98|93.75|16.8 |2.7 |10.1 |3841 |
|[resnet34.a1_in1k](https://huggingface.co/timm/resnet34.a1_in1k)|288 |77.92|93.77|21.8 |6.1 |6.2 |3609 |
|[resnet101.a3_in1k](https://huggingface.co/timm/resnet101.a3_in1k)|160 |77.88|93.71|44.6 |4.0 |8.3 |3926 |
|[resnet26t.ra2_in1k](https://huggingface.co/timm/resnet26t.ra2_in1k)|256 |77.87|93.84|16.0 |3.4 |10.5 |3772 |
|[seresnext26ts.ch_in1k](https://huggingface.co/timm/seresnext26ts.ch_in1k)|256 |77.86|93.79|10.4 |2.4 |10.5 |4263 |
|[resnetrs50.tf_in1k](https://huggingface.co/timm/resnetrs50.tf_in1k)|160 |77.82|93.81|35.7 |2.3 |6.2 |5238 |
|[gcresnext26ts.ch_in1k](https://huggingface.co/timm/gcresnext26ts.ch_in1k)|256 |77.81|93.82|10.5 |2.4 |10.5 |4183 |
|[ecaresnet50t.a3_in1k](https://huggingface.co/timm/ecaresnet50t.a3_in1k)|160 |77.79|93.6 |25.6 |2.2 |6.0 |5329 |
|[resnext50_32x4d.a3_in1k](https://huggingface.co/timm/resnext50_32x4d.a3_in1k)|160 |77.73|93.32|25.0 |2.2 |7.4 |5576 |
|[resnext50_32x4d.tv_in1k](https://huggingface.co/timm/resnext50_32x4d.tv_in1k)|224 |77.61|93.7 |25.0 |4.3 |14.4 |2944 |
|[seresnext26d_32x4d.bt_in1k](https://huggingface.co/timm/seresnext26d_32x4d.bt_in1k)|224 |77.59|93.61|16.8 |2.7 |10.2 |3807 |
|[resnet50.gluon_in1k](https://huggingface.co/timm/resnet50.gluon_in1k)|224 |77.58|93.72|25.6 |4.1 |11.1 |3455 |
|[eca_resnext26ts.ch_in1k](https://huggingface.co/timm/eca_resnext26ts.ch_in1k)|256 |77.44|93.56|10.3 |2.4 |10.5 |4284 |
|[resnet26d.bt_in1k](https://huggingface.co/timm/resnet26d.bt_in1k)|288 |77.41|93.63|16.0 |4.3 |13.5 |2907 |
|[resnet101.tv_in1k](https://huggingface.co/timm/resnet101.tv_in1k)|224 |77.38|93.54|44.6 |7.8 |16.2 |2125 |
|[resnet50d.a3_in1k](https://huggingface.co/timm/resnet50d.a3_in1k)|160 |77.22|93.27|25.6 |2.2 |6.1 |5982 |
|[resnext26ts.ra2_in1k](https://huggingface.co/timm/resnext26ts.ra2_in1k)|288 |77.17|93.47|10.3 |3.1 |13.3 |3392 |
|[resnet34.a2_in1k](https://huggingface.co/timm/resnet34.a2_in1k)|288 |77.15|93.27|21.8 |6.1 |6.2 |3615 |
|[resnet34d.ra2_in1k](https://huggingface.co/timm/resnet34d.ra2_in1k)|224 |77.1 |93.37|21.8 |3.9 |4.5 |5436 |
|[seresnet50.a3_in1k](https://huggingface.co/timm/seresnet50.a3_in1k)|224 |77.02|93.07|28.1 |4.1 |11.1 |2952 |
|[resnext26ts.ra2_in1k](https://huggingface.co/timm/resnext26ts.ra2_in1k)|256 |76.78|93.13|10.3 |2.4 |10.5 |4410 |
|[resnet26d.bt_in1k](https://huggingface.co/timm/resnet26d.bt_in1k)|224 |76.7 |93.17|16.0 |2.6 |8.2 |4859 |
|[resnet34.bt_in1k](https://huggingface.co/timm/resnet34.bt_in1k)|288 |76.5 |93.35|21.8 |6.1 |6.2 |3617 |
|[resnet34.a1_in1k](https://huggingface.co/timm/resnet34.a1_in1k)|224 |76.42|92.87|21.8 |3.7 |3.7 |5984 |
|[resnet26.bt_in1k](https://huggingface.co/timm/resnet26.bt_in1k)|288 |76.35|93.18|16.0 |3.9 |12.2 |3331 |
|[resnet50.tv_in1k](https://huggingface.co/timm/resnet50.tv_in1k)|224 |76.13|92.86|25.6 |4.1 |11.1 |3457 |
|[resnet50.a3_in1k](https://huggingface.co/timm/resnet50.a3_in1k)|160 |75.96|92.5 |25.6 |2.1 |5.7 |6490 |
|[resnet34.a2_in1k](https://huggingface.co/timm/resnet34.a2_in1k)|224 |75.52|92.44|21.8 |3.7 |3.7 |5991 |
|[resnet26.bt_in1k](https://huggingface.co/timm/resnet26.bt_in1k)|224 |75.3 |92.58|16.0 |2.4 |7.4 |5583 |
|[resnet34.bt_in1k](https://huggingface.co/timm/resnet34.bt_in1k)|224 |75.16|92.18|21.8 |3.7 |3.7 |5994 |
|[seresnet50.a3_in1k](https://huggingface.co/timm/seresnet50.a3_in1k)|160 |75.1 |92.08|28.1 |2.1 |5.7 |5513 |
|[resnet34.gluon_in1k](https://huggingface.co/timm/resnet34.gluon_in1k)|224 |74.57|91.98|21.8 |3.7 |3.7 |5984 |
|[resnet18d.ra2_in1k](https://huggingface.co/timm/resnet18d.ra2_in1k)|288 |73.81|91.83|11.7 |3.4 |5.4 |5196 |
|[resnet34.tv_in1k](https://huggingface.co/timm/resnet34.tv_in1k)|224 |73.32|91.42|21.8 |3.7 |3.7 |5979 |
|[resnet18.fb_swsl_ig1b_ft_in1k](https://huggingface.co/timm/resnet18.fb_swsl_ig1b_ft_in1k)|224 |73.28|91.73|11.7 |1.8 |2.5 |10213 |
|[resnet18.a1_in1k](https://huggingface.co/timm/resnet18.a1_in1k)|288 |73.16|91.03|11.7 |3.0 |4.1 |6050 |
|[resnet34.a3_in1k](https://huggingface.co/timm/resnet34.a3_in1k)|224 |72.98|91.11|21.8 |3.7 |3.7 |5967 |
|[resnet18.fb_ssl_yfcc100m_ft_in1k](https://huggingface.co/timm/resnet18.fb_ssl_yfcc100m_ft_in1k)|224 |72.6 |91.42|11.7 |1.8 |2.5 |10213 |
|[resnet18.a2_in1k](https://huggingface.co/timm/resnet18.a2_in1k)|288 |72.37|90.59|11.7 |3.0 |4.1 |6051 |
|[resnet14t.c3_in1k](https://huggingface.co/timm/resnet14t.c3_in1k)|224 |72.26|90.31|10.1 |1.7 |5.8 |7026 |
|[resnet18d.ra2_in1k](https://huggingface.co/timm/resnet18d.ra2_in1k)|224 |72.26|90.68|11.7 |2.1 |3.3 |8707 |
|[resnet18.a1_in1k](https://huggingface.co/timm/resnet18.a1_in1k)|224 |71.49|90.07|11.7 |1.8 |2.5 |10187 |
|[resnet14t.c3_in1k](https://huggingface.co/timm/resnet14t.c3_in1k)|176 |71.31|89.69|10.1 |1.1 |3.6 |10970 |
|[resnet18.gluon_in1k](https://huggingface.co/timm/resnet18.gluon_in1k)|224 |70.84|89.76|11.7 |1.8 |2.5 |10210 |
|[resnet18.a2_in1k](https://huggingface.co/timm/resnet18.a2_in1k)|224 |70.64|89.47|11.7 |1.8 |2.5 |10194 |
|[resnet34.a3_in1k](https://huggingface.co/timm/resnet34.a3_in1k)|160 |70.56|89.52|21.8 |1.9 |1.9 |10737 |
|[resnet18.tv_in1k](https://huggingface.co/timm/resnet18.tv_in1k)|224 |69.76|89.07|11.7 |1.8 |2.5 |10205 |
|[resnet10t.c3_in1k](https://huggingface.co/timm/resnet10t.c3_in1k)|224 |68.34|88.03|5.4 |1.1 |2.4 |13079 |
|[resnet18.a3_in1k](https://huggingface.co/timm/resnet18.a3_in1k)|224 |68.25|88.17|11.7 |1.8 |2.5 |10167 |
|[resnet10t.c3_in1k](https://huggingface.co/timm/resnet10t.c3_in1k)|176 |66.71|86.96|5.4 |0.7 |1.5 |20327 |
|[resnet18.a3_in1k](https://huggingface.co/timm/resnet18.a3_in1k)|160 |65.66|86.26|11.7 |0.9 |1.3 |18229 |
## Citation
```bibtex
@misc{yalniz2019billionscale,
title={Billion-scale semi-supervised learning for image classification},
author={I. Zeki Yalniz and Hervé Jégou and Kan Chen and Manohar Paluri and Dhruv Mahajan},
year={2019},
eprint={1905.00546},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
```bibtex
@article{Xie2016,
title={Aggregated Residual Transformations for Deep Neural Networks},
author={Saining Xie and Ross Girshick and Piotr Dollár and Zhuowen Tu and Kaiming He},
journal={arXiv preprint arXiv:1611.05431},
year={2016}
}
```
```bibtex
@article{He2015,
author = {Kaiming He and Xiangyu Zhang and Shaoqing Ren and Jian Sun},
title = {Deep Residual Learning for Image Recognition},
journal = {arXiv preprint arXiv:1512.03385},
year = {2015}
}
```
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
-0.03997802734375,
0.0447998046875,
0.0245513916015625,
-0.051483154296875,
-0.0296630859375,
0.00106048583984375,
0.0255126953125,
0.015655517578125,
0.057861328125,
-0.028656005859375,
0.01030731201171875,
-0.008636474609375,
0.0199127197265625,
-0.0009250640869140625,
0.013458251953125,
-0.0228271484375,
-0.009063720703125,
-0.0187530517578125,
-0.0452880859375
]
] |
timm/tf_mixnet_l.in1k | 2023-04-27T21:50:04.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"arxiv:1907.09595",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/tf_mixnet_l.in1k | 0 | 36,625 | timm | 2022-12-13T00:21:35 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
datasets:
- imagenet-1k
---
# Model card for tf_mixnet_l.in1k
A MixNet image classification model. Trained on ImageNet-1k in TensorFlow by paper authors, ported to PyTorch by Ross Wightman.
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 7.3
- GMACs: 0.6
- Activations (M): 10.8
- Image size: 224 x 224
- **Papers:**
- MixConv: Mixed Depthwise Convolutional Kernels: https://arxiv.org/abs/1907.09595
- **Dataset:** ImageNet-1k
- **Original:** https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('tf_mixnet_l.in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'tf_mixnet_l.in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g.:
# torch.Size([1, 32, 112, 112])
# torch.Size([1, 40, 56, 56])
# torch.Size([1, 56, 28, 28])
# torch.Size([1, 160, 14, 14])
# torch.Size([1, 264, 7, 7])
print(o.shape)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'tf_mixnet_l.in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 1536, 7, 7) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@misc{tan2019mixconv,
title={MixConv: Mixed Depthwise Convolutional Kernels},
author={Mingxing Tan and Quoc V. Le},
year={2019},
eprint={1907.09595},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
| 3,940 | [
[
-0.038970947265625,
-0.039581298828125,
-0.0028228759765625,
0.0126190185546875,
-0.02960205078125,
-0.0219573974609375,
-0.01422119140625,
-0.03167724609375,
0.024139404296875,
0.027618408203125,
-0.032928466796875,
-0.057830810546875,
-0.054779052734375,
-0.0143890380859375,
-0.0207061767578125,
0.0755615234375,
-0.0152740478515625,
-0.00328826904296875,
-0.00890350341796875,
-0.039337158203125,
-0.00823211669921875,
-0.01479339599609375,
-0.0572509765625,
-0.03662109375,
0.02691650390625,
0.0163726806640625,
0.0462646484375,
0.03814697265625,
0.03802490234375,
0.03363037109375,
-0.0152130126953125,
-0.000009059906005859375,
-0.02587890625,
-0.01061248779296875,
0.029632568359375,
-0.03668212890625,
-0.032470703125,
0.0164031982421875,
0.050750732421875,
0.040771484375,
0.00107574462890625,
0.023162841796875,
0.007732391357421875,
0.033294677734375,
-0.02301025390625,
-0.0031566619873046875,
-0.0225372314453125,
0.022186279296875,
0.0005769729614257812,
0.00146484375,
-0.01548004150390625,
-0.029388427734375,
0.02294921875,
-0.04376220703125,
0.039215087890625,
-0.01026153564453125,
0.09228515625,
0.014495849609375,
-0.004566192626953125,
-0.0038471221923828125,
-0.0210113525390625,
0.05535888671875,
-0.06402587890625,
0.01788330078125,
0.01885986328125,
0.01386260986328125,
-0.00948333740234375,
-0.0772705078125,
-0.040863037109375,
-0.006290435791015625,
-0.0167999267578125,
0.00010013580322265625,
-0.010040283203125,
0.004665374755859375,
0.0193634033203125,
0.035797119140625,
-0.034454345703125,
0.0009355545043945312,
-0.047332763671875,
-0.0175933837890625,
0.045440673828125,
0.0031280517578125,
0.02374267578125,
-0.0118408203125,
-0.03424072265625,
-0.0389404296875,
-0.0167999267578125,
0.020416259765625,
0.021636962890625,
0.00804901123046875,
-0.05426025390625,
0.040191650390625,
0.0096588134765625,
0.04632568359375,
0.003719329833984375,
-0.0288543701171875,
0.0462646484375,
0.0006732940673828125,
-0.03619384765625,
-0.004230499267578125,
0.08428955078125,
0.027801513671875,
0.016571044921875,
0.0012645721435546875,
-0.011749267578125,
-0.0225982666015625,
-0.00923919677734375,
-0.0926513671875,
-0.02606201171875,
0.035980224609375,
-0.03955078125,
-0.033294677734375,
0.0244293212890625,
-0.047454833984375,
-0.01081085205078125,
0.00218963623046875,
0.050262451171875,
-0.03436279296875,
-0.0400390625,
0.00554656982421875,
-0.0204925537109375,
0.024169921875,
0.01430511474609375,
-0.033355712890625,
0.01557159423828125,
0.0270843505859375,
0.087158203125,
0.0004856586456298828,
-0.036865234375,
-0.012054443359375,
-0.0273284912109375,
-0.025543212890625,
0.0277252197265625,
-0.004085540771484375,
-0.0162353515625,
-0.022186279296875,
0.0201568603515625,
-0.0077056884765625,
-0.058074951171875,
0.019012451171875,
-0.0124664306640625,
0.0225982666015625,
0.00848388671875,
-0.020263671875,
-0.0338134765625,
0.0229644775390625,
-0.032562255859375,
0.0965576171875,
0.031402587890625,
-0.06976318359375,
0.0285797119140625,
-0.036956787109375,
-0.0192718505859375,
-0.019378662109375,
-0.0013780593872070312,
-0.0794677734375,
-0.0110015869140625,
0.0172119140625,
0.05621337890625,
-0.010498046875,
0.01003265380859375,
-0.04986572265625,
-0.019989013671875,
0.024566650390625,
0.00617218017578125,
0.08331298828125,
0.01198577880859375,
-0.029388427734375,
0.02410888671875,
-0.04193115234375,
0.00989532470703125,
0.030975341796875,
-0.0204620361328125,
-0.00044226646423339844,
-0.050140380859375,
0.019561767578125,
0.028167724609375,
0.006420135498046875,
-0.044708251953125,
0.01605224609375,
-0.012542724609375,
0.033233642578125,
0.03900146484375,
-0.00479888916015625,
0.02410888671875,
-0.0281829833984375,
0.0259552001953125,
0.03741455078125,
0.0109405517578125,
-0.005268096923828125,
-0.041748046875,
-0.06683349609375,
-0.04315185546875,
0.02655029296875,
0.025146484375,
-0.0382080078125,
0.038177490234375,
-0.0155181884765625,
-0.057098388671875,
-0.046234130859375,
0.00363922119140625,
0.024566650390625,
0.04400634765625,
0.02398681640625,
-0.033599853515625,
-0.038177490234375,
-0.067626953125,
0.003665924072265625,
0.00876617431640625,
-0.002197265625,
0.024749755859375,
0.05413818359375,
-0.01279449462890625,
0.050811767578125,
-0.0261688232421875,
-0.0136566162109375,
-0.0194854736328125,
0.00738525390625,
0.03668212890625,
0.0640869140625,
0.05743408203125,
-0.044036865234375,
-0.036865234375,
-0.007232666015625,
-0.07275390625,
0.013275146484375,
0.0028018951416015625,
-0.0212249755859375,
0.01557159423828125,
0.00807952880859375,
-0.048675537109375,
0.0479736328125,
0.019256591796875,
-0.035675048828125,
0.02496337890625,
-0.01393890380859375,
0.0190887451171875,
-0.09002685546875,
0.00919342041015625,
0.029388427734375,
-0.0178985595703125,
-0.0333251953125,
0.0027942657470703125,
-0.002895355224609375,
-0.007205963134765625,
-0.040771484375,
0.05230712890625,
-0.03436279296875,
-0.02001953125,
-0.0055389404296875,
-0.0259552001953125,
0.0027790069580078125,
0.051422119140625,
-0.004726409912109375,
0.0259552001953125,
0.068359375,
-0.03790283203125,
0.037078857421875,
0.02557373046875,
-0.017578125,
0.0302581787109375,
-0.0584716796875,
0.0151824951171875,
0.001438140869140625,
0.0178070068359375,
-0.07537841796875,
-0.0214080810546875,
0.029327392578125,
-0.041015625,
0.04388427734375,
-0.03948974609375,
-0.034210205078125,
-0.030548095703125,
-0.0343017578125,
0.0347900390625,
0.052215576171875,
-0.0545654296875,
0.0411376953125,
0.0161590576171875,
0.0260009765625,
-0.04071044921875,
-0.07537841796875,
-0.0217742919921875,
-0.0274810791015625,
-0.06524658203125,
0.027984619140625,
0.0066375732421875,
0.00681304931640625,
0.01428985595703125,
-0.013702392578125,
-0.0096893310546875,
-0.011688232421875,
0.03436279296875,
0.0226593017578125,
-0.027099609375,
-0.02032470703125,
-0.0176239013671875,
0.0029449462890625,
0.00014257431030273438,
-0.02349853515625,
0.043243408203125,
-0.0139312744140625,
-0.00518035888671875,
-0.0570068359375,
-0.0016489028930664062,
0.0364990234375,
0.0009207725524902344,
0.06341552734375,
0.08221435546875,
-0.037994384765625,
-0.010498046875,
-0.0300140380859375,
-0.0252227783203125,
-0.03472900390625,
0.03546142578125,
-0.027313232421875,
-0.039337158203125,
0.07135009765625,
0.01067352294921875,
0.0088348388671875,
0.0531005859375,
0.029693603515625,
-0.0079803466796875,
0.04705810546875,
0.03436279296875,
0.015960693359375,
0.051910400390625,
-0.07867431640625,
-0.0182952880859375,
-0.05279541015625,
-0.038421630859375,
-0.0220489501953125,
-0.052398681640625,
-0.053619384765625,
-0.0224151611328125,
0.03668212890625,
0.01543426513671875,
-0.029083251953125,
0.029388427734375,
-0.0582275390625,
0.00937652587890625,
0.0467529296875,
0.049102783203125,
-0.0229034423828125,
0.0235443115234375,
-0.0221405029296875,
0.00548553466796875,
-0.060211181640625,
-0.00847625732421875,
0.08489990234375,
0.03448486328125,
0.0477294921875,
-0.01568603515625,
0.059295654296875,
-0.0176239013671875,
0.0255279541015625,
-0.047821044921875,
0.0401611328125,
-0.004169464111328125,
-0.0341796875,
-0.00600433349609375,
-0.03778076171875,
-0.07501220703125,
0.0196685791015625,
-0.0214691162109375,
-0.049102783203125,
0.01395416259765625,
0.00909423828125,
-0.0101470947265625,
0.05743408203125,
-0.06634521484375,
0.07965087890625,
-0.0079803466796875,
-0.037811279296875,
0.006805419921875,
-0.0465087890625,
0.0228424072265625,
0.0249786376953125,
-0.018096923828125,
-0.0018682479858398438,
0.010040283203125,
0.0838623046875,
-0.0472412109375,
0.06085205078125,
-0.03436279296875,
0.0254669189453125,
0.036865234375,
0.0029754638671875,
0.019256591796875,
-0.00928497314453125,
-0.00363922119140625,
0.0301513671875,
0.00872039794921875,
-0.042449951171875,
-0.0360107421875,
0.051513671875,
-0.08184814453125,
-0.0164031982421875,
-0.02783203125,
-0.041015625,
0.01374053955078125,
0.007282257080078125,
0.040557861328125,
0.047332763671875,
0.0130767822265625,
0.0214691162109375,
0.041107177734375,
-0.028106689453125,
0.033447265625,
-0.014068603515625,
-0.0205841064453125,
-0.04241943359375,
0.057098388671875,
0.01058197021484375,
0.01221466064453125,
0.0019989013671875,
0.0180816650390625,
-0.022491455078125,
-0.0416259765625,
-0.0227508544921875,
0.02398681640625,
-0.056365966796875,
-0.027313232421875,
-0.046630859375,
-0.034820556640625,
-0.0282135009765625,
-0.0086822509765625,
-0.03741455078125,
-0.0328369140625,
-0.0289154052734375,
0.0134124755859375,
0.04852294921875,
0.046630859375,
-0.0011959075927734375,
0.040924072265625,
-0.034423828125,
0.0118255615234375,
0.00759124755859375,
0.04376220703125,
-0.006526947021484375,
-0.06768798828125,
-0.01556396484375,
-0.005458831787109375,
-0.031982421875,
-0.056060791015625,
0.036468505859375,
0.0084381103515625,
0.035797119140625,
0.0231170654296875,
-0.02337646484375,
0.0570068359375,
0.01259613037109375,
0.036468505859375,
0.0245513916015625,
-0.043548583984375,
0.044281005859375,
-0.005123138427734375,
0.017364501953125,
0.00594329833984375,
0.0237579345703125,
-0.022491455078125,
-0.0087127685546875,
-0.07867431640625,
-0.06195068359375,
0.07421875,
0.016510009765625,
-0.004688262939453125,
0.034332275390625,
0.046234130859375,
0.0091552734375,
0.0079803466796875,
-0.059661865234375,
-0.037689208984375,
-0.03265380859375,
-0.0298309326171875,
-0.002582550048828125,
-0.0008959770202636719,
-0.00792694091796875,
-0.054168701171875,
0.0472412109375,
0.00283050537109375,
0.051513671875,
0.029937744140625,
-0.004848480224609375,
-0.01312255859375,
-0.027557373046875,
0.03790283203125,
0.0301055908203125,
-0.03179931640625,
0.01010894775390625,
0.01064300537109375,
-0.03924560546875,
0.0027217864990234375,
0.0129852294921875,
-0.000720977783203125,
0.00524139404296875,
0.04180908203125,
0.07110595703125,
0.002960205078125,
0.005428314208984375,
0.03131103515625,
0.00606536865234375,
-0.035888671875,
-0.0153350830078125,
0.01390838623046875,
0.0005145072937011719,
0.03240966796875,
0.022491455078125,
0.023345947265625,
-0.00992584228515625,
-0.017364501953125,
0.0159454345703125,
0.038421630859375,
-0.0166473388671875,
-0.027984619140625,
0.046478271484375,
-0.015228271484375,
-0.00817108154296875,
0.060089111328125,
-0.0015802383422851562,
-0.036163330078125,
0.0802001953125,
0.0312347412109375,
0.0721435546875,
-0.00042057037353515625,
0.0011663436889648438,
0.06646728515625,
0.0166168212890625,
0.0015592575073242188,
0.0146636962890625,
0.00569915771484375,
-0.057708740234375,
0.00799560546875,
-0.04248046875,
0.0014352798461914062,
0.032623291015625,
-0.032318115234375,
0.02691650390625,
-0.057342529296875,
-0.022552490234375,
0.01361083984375,
0.018280029296875,
-0.07049560546875,
0.0180816650390625,
0.0005617141723632812,
0.060302734375,
-0.059661865234375,
0.053955078125,
0.0655517578125,
-0.042938232421875,
-0.0762939453125,
-0.0135650634765625,
-0.00887298583984375,
-0.0635986328125,
0.0382080078125,
0.029083251953125,
0.01222991943359375,
0.00214385986328125,
-0.06768798828125,
-0.039154052734375,
0.1048583984375,
0.0389404296875,
-0.002109527587890625,
0.0228271484375,
0.002704620361328125,
0.02032470703125,
-0.037689208984375,
0.044403076171875,
0.0177764892578125,
0.0281219482421875,
0.03533935546875,
-0.04986572265625,
0.01535797119140625,
-0.024932861328125,
0.0055389404296875,
0.0143890380859375,
-0.0499267578125,
0.07037353515625,
-0.0309600830078125,
-0.00787353515625,
-0.0091400146484375,
0.052581787109375,
0.024871826171875,
0.01507568359375,
0.040130615234375,
0.068115234375,
0.029998779296875,
-0.02471923828125,
0.06890869140625,
0.0042266845703125,
0.052001953125,
0.039031982421875,
0.03204345703125,
0.0350341796875,
0.03814697265625,
-0.0198974609375,
0.0228424072265625,
0.08673095703125,
-0.0238037109375,
0.024932861328125,
0.008544921875,
0.0032215118408203125,
-0.00888824462890625,
0.01116180419921875,
-0.032958984375,
0.0265655517578125,
0.01276397705078125,
-0.041748046875,
-0.0214691162109375,
-0.00550079345703125,
0.000007927417755126953,
-0.037689208984375,
-0.0164642333984375,
0.037078857421875,
0.0032215118408203125,
-0.033447265625,
0.06964111328125,
0.005573272705078125,
0.07049560546875,
-0.026641845703125,
-0.00013446807861328125,
-0.0261383056640625,
0.018310546875,
-0.036468505859375,
-0.07208251953125,
0.02008056640625,
-0.023223876953125,
0.0078582763671875,
0.00580596923828125,
0.04644775390625,
-0.04022216796875,
-0.03875732421875,
0.02142333984375,
0.02203369140625,
0.036346435546875,
0.00678253173828125,
-0.0880126953125,
0.014495849609375,
0.0007128715515136719,
-0.050262451171875,
0.0175628662109375,
0.03131103515625,
0.013763427734375,
0.049102783203125,
0.03851318359375,
-0.0059356689453125,
0.0212249755859375,
-0.012847900390625,
0.06463623046875,
-0.03753662109375,
-0.03131103515625,
-0.05987548828125,
0.0556640625,
-0.0028533935546875,
-0.04681396484375,
0.027618408203125,
0.04754638671875,
0.0733642578125,
-0.0031757354736328125,
0.03680419921875,
-0.0212249755859375,
-0.00525665283203125,
-0.0283355712890625,
0.05010986328125,
-0.05999755859375,
-0.00531005859375,
-0.011627197265625,
-0.057861328125,
-0.0248260498046875,
0.04998779296875,
-0.0179443359375,
0.046051025390625,
0.03839111328125,
0.0736083984375,
-0.04241943359375,
-0.0230255126953125,
0.01212310791015625,
0.01505279541015625,
0.0034332275390625,
0.031585693359375,
0.031982421875,
-0.06060791015625,
0.027008056640625,
-0.058868408203125,
-0.007843017578125,
-0.0146942138671875,
-0.045684814453125,
-0.0762939453125,
-0.058868408203125,
-0.05010986328125,
-0.0596923828125,
-0.0225982666015625,
0.0706787109375,
0.08367919921875,
-0.0616455078125,
-0.020660400390625,
0.0011396408081054688,
0.01020050048828125,
-0.0109405517578125,
-0.0186004638671875,
0.042236328125,
-0.002765655517578125,
-0.05322265625,
-0.0272674560546875,
-0.00501251220703125,
0.033203125,
-0.0013637542724609375,
-0.0223846435546875,
-0.01409149169921875,
-0.029937744140625,
0.0117340087890625,
0.0189361572265625,
-0.051971435546875,
-0.004482269287109375,
-0.0229034423828125,
-0.014251708984375,
0.038604736328125,
0.03656005859375,
-0.033538818359375,
0.0272674560546875,
0.040008544921875,
0.0288848876953125,
0.07794189453125,
-0.0259552001953125,
0.0013380050659179688,
-0.064208984375,
0.050323486328125,
-0.0021533966064453125,
0.036773681640625,
0.037811279296875,
-0.021728515625,
0.040863037109375,
0.032318115234375,
-0.02117919921875,
-0.059295654296875,
-0.01528167724609375,
-0.08563232421875,
-0.00772857666015625,
0.0687255859375,
-0.024139404296875,
-0.039642333984375,
0.035491943359375,
-0.0002321004867553711,
0.054534912109375,
-0.01363372802734375,
0.032806396484375,
0.0255889892578125,
-0.004398345947265625,
-0.0533447265625,
-0.037567138671875,
0.027130126953125,
0.01361083984375,
-0.0484619140625,
-0.039703369140625,
0.0010986328125,
0.06085205078125,
0.0086822509765625,
0.03399658203125,
-0.01226043701171875,
0.00630950927734375,
0.00890350341796875,
0.041748046875,
-0.032135009765625,
-0.0076141357421875,
-0.026397705078125,
0.006855010986328125,
-0.0118865966796875,
-0.04376220703125
]
] |
timm/gernet_l.idstcv_in1k | 2023-03-22T07:15:42.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"arxiv:2006.14090",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/gernet_l.idstcv_in1k | 0 | 36,611 | timm | 2023-03-22T07:15:08 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
datasets:
- imagenet-1k
---
# Model card for gernet_l.idstcv_in1k
A GENet (GPU-Efficient-Networks) image classification model. Trained on ImageNet-1k by paper authors.
This model architecture is implemented using `timm`'s flexible [BYOBNet (Bring-Your-Own-Blocks Network)](https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/byobnet.py).
BYOBNet allows configuration of:
* block / stage layout
* stem layout
* output stride (dilation)
* activation and norm layers
* channel and spatial / self-attention layers
...and also includes `timm` features common to many other architectures, including:
* stochastic depth
* gradient checkpointing
* layer-wise LR decay
* per-stage feature extraction
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 31.1
- GMACs: 4.6
- Activations (M): 8.0
- Image size: 256 x 256
- **Papers:**
- Neural Architecture Design for GPU-Efficient Networks: https://arxiv.org/abs/2006.14090
- **Dataset:** ImageNet-1k
- **Original:** https://github.com/idstcv/GPU-Efficient-Networks
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('gernet_l.idstcv_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'gernet_l.idstcv_in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g.:
# torch.Size([1, 32, 128, 128])
# torch.Size([1, 128, 64, 64])
# torch.Size([1, 192, 32, 32])
# torch.Size([1, 640, 16, 16])
# torch.Size([1, 2560, 8, 8])
print(o.shape)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'gernet_l.idstcv_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 2560, 8, 8) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
```bibtex
@misc{lin2020neural,
title={Neural Architecture Design for GPU-Efficient Networks},
author={Ming Lin and Hesen Chen and Xiuyu Sun and Qi Qian and Hao Li and Rong Jin},
year={2020},
eprint={2006.14090},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
| 4,521 | [
[
-0.03753662109375,
-0.0428466796875,
0.006687164306640625,
0.006984710693359375,
-0.024566650390625,
-0.020660400390625,
-0.017669677734375,
-0.0298614501953125,
0.01120758056640625,
0.0201568603515625,
-0.0291748046875,
-0.055816650390625,
-0.046844482421875,
-0.0217742919921875,
-0.01331329345703125,
0.07794189453125,
-0.006725311279296875,
0.00012814998626708984,
-0.02093505859375,
-0.042083740234375,
-0.006763458251953125,
-0.024871826171875,
-0.064208984375,
-0.0465087890625,
0.021514892578125,
0.00667572021484375,
0.0482177734375,
0.047943115234375,
0.03826904296875,
0.0341796875,
-0.015899658203125,
-0.0018739700317382812,
-0.013702392578125,
-0.0250091552734375,
0.02569580078125,
-0.03326416015625,
-0.033721923828125,
0.01476287841796875,
0.0499267578125,
0.026153564453125,
0.00010704994201660156,
0.027740478515625,
0.0211944580078125,
0.036163330078125,
-0.0121002197265625,
0.00894927978515625,
-0.030853271484375,
0.01012420654296875,
-0.01314544677734375,
0.01044464111328125,
-0.018768310546875,
-0.03717041015625,
0.0180206298828125,
-0.036895751953125,
0.034210205078125,
-0.005199432373046875,
0.10064697265625,
0.0135040283203125,
0.00302886962890625,
-0.0080413818359375,
-0.0212554931640625,
0.05340576171875,
-0.05950927734375,
0.00925445556640625,
0.0121002197265625,
0.00733184814453125,
-0.00545501708984375,
-0.07232666015625,
-0.033843994140625,
-0.0134735107421875,
-0.019989013671875,
-0.0019855499267578125,
-0.0195159912109375,
0.0084686279296875,
0.027984619140625,
0.033355712890625,
-0.042633056640625,
0.0038738250732421875,
-0.03759765625,
-0.00997161865234375,
0.04022216796875,
-0.0003018379211425781,
0.032470703125,
-0.020599365234375,
-0.042694091796875,
-0.0251007080078125,
-0.0276031494140625,
0.0227508544921875,
0.026947021484375,
0.01184844970703125,
-0.046875,
0.0215606689453125,
0.01509857177734375,
0.0389404296875,
-0.0004787445068359375,
-0.0176544189453125,
0.0423583984375,
0.00232696533203125,
-0.040618896484375,
-0.004230499267578125,
0.0843505859375,
0.0179443359375,
0.0170745849609375,
-0.0033626556396484375,
-0.007526397705078125,
-0.01183319091796875,
-0.00896453857421875,
-0.089111328125,
-0.035064697265625,
0.026702880859375,
-0.04400634765625,
-0.035797119140625,
0.0097503662109375,
-0.046600341796875,
-0.004978179931640625,
0.0014019012451171875,
0.048736572265625,
-0.043487548828125,
-0.0249176025390625,
0.00064849853515625,
-0.0146026611328125,
0.01959228515625,
0.01132965087890625,
-0.050201416015625,
0.02569580078125,
0.0233917236328125,
0.08782958984375,
0.005420684814453125,
-0.031951904296875,
-0.0175323486328125,
-0.028045654296875,
-0.0177001953125,
0.0299835205078125,
-0.00970458984375,
-0.0199127197265625,
-0.0269012451171875,
0.0186004638671875,
-0.011627197265625,
-0.052581787109375,
0.023101806640625,
-0.01116180419921875,
0.0185089111328125,
-0.00537872314453125,
-0.017852783203125,
-0.0430908203125,
0.0199432373046875,
-0.0277862548828125,
0.1007080078125,
0.02392578125,
-0.06402587890625,
0.0221710205078125,
-0.044677734375,
-0.0169219970703125,
-0.01085662841796875,
-0.00963592529296875,
-0.0784912109375,
-0.0037078857421875,
0.020660400390625,
0.0546875,
-0.0249481201171875,
0.00875091552734375,
-0.036590576171875,
-0.0245513916015625,
0.0211029052734375,
-0.0031795501708984375,
0.07513427734375,
0.01605224609375,
-0.033233642578125,
0.0206451416015625,
-0.04718017578125,
0.0180816650390625,
0.039520263671875,
-0.026153564453125,
-0.01385498046875,
-0.038787841796875,
0.01470947265625,
0.0236663818359375,
0.0014123916625976562,
-0.046783447265625,
0.0234527587890625,
-0.01512908935546875,
0.0445556640625,
0.052642822265625,
-0.0118560791015625,
0.0271453857421875,
-0.0246734619140625,
0.0191192626953125,
0.0168304443359375,
0.0231170654296875,
-0.0005993843078613281,
-0.04827880859375,
-0.053680419921875,
-0.037384033203125,
0.02496337890625,
0.0277099609375,
-0.036956787109375,
0.0252532958984375,
-0.01262664794921875,
-0.05792236328125,
-0.036468505859375,
0.006816864013671875,
0.044036865234375,
0.04498291015625,
0.023834228515625,
-0.0390625,
-0.022064208984375,
-0.064697265625,
0.004512786865234375,
0.0021953582763671875,
0.0044097900390625,
0.036712646484375,
0.05120849609375,
-0.0032806396484375,
0.0457763671875,
-0.0290374755859375,
-0.0125885009765625,
-0.01366424560546875,
0.01476287841796875,
0.040802001953125,
0.06402587890625,
0.064697265625,
-0.04443359375,
-0.04315185546875,
0.0004878044128417969,
-0.078125,
0.0192108154296875,
-0.00653076171875,
-0.01470184326171875,
0.01476287841796875,
0.00514984130859375,
-0.05078125,
0.044891357421875,
0.01910400390625,
-0.02105712890625,
0.032562255859375,
-0.029083251953125,
0.0190887451171875,
-0.08319091796875,
0.0125732421875,
0.03485107421875,
-0.00923919677734375,
-0.039031982421875,
0.002288818359375,
0.0034008026123046875,
-0.01171112060546875,
-0.034332275390625,
0.05157470703125,
-0.046600341796875,
-0.0130462646484375,
-0.00618743896484375,
-0.01910400390625,
-0.0046234130859375,
0.0662841796875,
-0.01360321044921875,
0.037872314453125,
0.0692138671875,
-0.042694091796875,
0.03265380859375,
0.0306396484375,
-0.0212554931640625,
0.0307769775390625,
-0.05999755859375,
0.0238189697265625,
-0.00299835205078125,
0.0018014907836914062,
-0.07183837890625,
-0.01503753662109375,
0.033843994140625,
-0.042877197265625,
0.047882080078125,
-0.049346923828125,
-0.0310821533203125,
-0.04107666015625,
-0.036224365234375,
0.030181884765625,
0.05718994140625,
-0.060760498046875,
0.041107177734375,
0.019287109375,
0.0140228271484375,
-0.04254150390625,
-0.062225341796875,
-0.0184783935546875,
-0.03240966796875,
-0.06292724609375,
0.032470703125,
0.0035343170166015625,
0.004871368408203125,
0.0207977294921875,
-0.001354217529296875,
0.0011272430419921875,
-0.002597808837890625,
0.0447998046875,
0.0362548828125,
-0.021240234375,
-0.0060272216796875,
-0.018096923828125,
-0.00492095947265625,
0.006488800048828125,
-0.025634765625,
0.049224853515625,
-0.0226593017578125,
-0.0138702392578125,
-0.055999755859375,
-0.00270843505859375,
0.037841796875,
-0.010711669921875,
0.051849365234375,
0.071044921875,
-0.042327880859375,
-0.00836181640625,
-0.02520751953125,
-0.015899658203125,
-0.03643798828125,
0.031890869140625,
-0.0311126708984375,
-0.0279083251953125,
0.068115234375,
-0.00021958351135253906,
0.0036106109619140625,
0.045623779296875,
0.031524658203125,
-0.0009737014770507812,
0.058807373046875,
0.038421630859375,
-0.0012569427490234375,
0.04998779296875,
-0.07452392578125,
-0.0168609619140625,
-0.06512451171875,
-0.032012939453125,
-0.0254669189453125,
-0.048553466796875,
-0.04412841796875,
-0.0197601318359375,
0.034149169921875,
0.01041412353515625,
-0.0291595458984375,
0.0273895263671875,
-0.0771484375,
0.01371002197265625,
0.057403564453125,
0.042572021484375,
-0.024658203125,
0.0253753662109375,
-0.0222015380859375,
0.002040863037109375,
-0.05328369140625,
-0.0174560546875,
0.08099365234375,
0.0321044921875,
0.059906005859375,
-0.016143798828125,
0.04913330078125,
-0.0156402587890625,
0.0257720947265625,
-0.0433349609375,
0.0447998046875,
-0.01178741455078125,
-0.030303955078125,
-0.0066070556640625,
-0.0273284912109375,
-0.0672607421875,
0.00775909423828125,
-0.0187225341796875,
-0.0655517578125,
0.01165008544921875,
0.01322174072265625,
-0.021728515625,
0.06573486328125,
-0.06195068359375,
0.07513427734375,
-0.016845703125,
-0.0360107421875,
0.002445220947265625,
-0.045318603515625,
0.0187835693359375,
0.0230865478515625,
-0.02191162109375,
-0.0076141357421875,
0.028594970703125,
0.07586669921875,
-0.0457763671875,
0.05413818359375,
-0.040496826171875,
0.038055419921875,
0.039764404296875,
-0.01250457763671875,
0.033538818359375,
-0.00307464599609375,
-0.00521087646484375,
0.02569580078125,
-0.00626373291015625,
-0.0306549072265625,
-0.03741455078125,
0.052947998046875,
-0.07611083984375,
-0.025543212890625,
-0.032135009765625,
-0.042266845703125,
0.0157012939453125,
0.002376556396484375,
0.044708251953125,
0.056365966796875,
0.00440216064453125,
0.01219940185546875,
0.05230712890625,
-0.03155517578125,
0.035736083984375,
-0.0236968994140625,
-0.0167236328125,
-0.034027099609375,
0.062225341796875,
0.0188751220703125,
0.01081085205078125,
-0.000051081180572509766,
0.01438140869140625,
-0.02093505859375,
-0.05078125,
-0.035400390625,
0.0318603515625,
-0.05218505859375,
-0.026153564453125,
-0.04425048828125,
-0.03973388671875,
-0.0262451171875,
-0.008819580078125,
-0.029937744140625,
-0.01525115966796875,
-0.0256500244140625,
0.01392364501953125,
0.05694580078125,
0.04510498046875,
-0.0057525634765625,
0.037872314453125,
-0.038360595703125,
0.003482818603515625,
0.0144805908203125,
0.03411865234375,
-0.0048828125,
-0.06060791015625,
-0.0186920166015625,
-0.01190948486328125,
-0.0275421142578125,
-0.051666259765625,
0.037445068359375,
0.012542724609375,
0.041412353515625,
0.0234527587890625,
-0.0242919921875,
0.056396484375,
-0.000396728515625,
0.037445068359375,
0.0249176025390625,
-0.048858642578125,
0.048065185546875,
-0.005260467529296875,
0.01361083984375,
0.006122589111328125,
0.0228271484375,
-0.0225982666015625,
-0.00033855438232421875,
-0.074462890625,
-0.051177978515625,
0.06988525390625,
0.0100250244140625,
0.004032135009765625,
0.031890869140625,
0.05560302734375,
-0.001255035400390625,
-0.002735137939453125,
-0.052886962890625,
-0.035797119140625,
-0.0217437744140625,
-0.0196990966796875,
0.00519561767578125,
-0.005706787109375,
-0.0125274658203125,
-0.049774169921875,
0.059600830078125,
-0.005199432373046875,
0.058319091796875,
0.024810791015625,
-0.002819061279296875,
-0.0013837814331054688,
-0.0222625732421875,
0.0263519287109375,
0.021942138671875,
-0.0282745361328125,
0.0004220008850097656,
0.022369384765625,
-0.04290771484375,
0.0034999847412109375,
0.01995849609375,
-0.00984954833984375,
0.006900787353515625,
0.01512908935546875,
0.072998046875,
-0.0018434524536132812,
0.010101318359375,
0.0253448486328125,
-0.005367279052734375,
-0.0290374755859375,
-0.01395416259765625,
0.0127716064453125,
0.0039520263671875,
0.034759521484375,
0.0170745849609375,
0.030059814453125,
-0.01155853271484375,
-0.0225677490234375,
0.015869140625,
0.04742431640625,
-0.021270751953125,
-0.033050537109375,
0.066650390625,
-0.0172119140625,
-0.008392333984375,
0.07476806640625,
-0.008941650390625,
-0.032257080078125,
0.07989501953125,
0.03271484375,
0.074462890625,
-0.0019626617431640625,
0.005451202392578125,
0.062225341796875,
0.01073455810546875,
0.004352569580078125,
0.016693115234375,
0.007518768310546875,
-0.054931640625,
0.00902557373046875,
-0.040313720703125,
0.0026683807373046875,
0.023040771484375,
-0.040313720703125,
0.02606201171875,
-0.06329345703125,
-0.03155517578125,
0.01666259765625,
0.0202789306640625,
-0.069580078125,
0.0197296142578125,
0.006496429443359375,
0.064208984375,
-0.059600830078125,
0.07281494140625,
0.0648193359375,
-0.041748046875,
-0.08184814453125,
-0.020599365234375,
0.002017974853515625,
-0.0740966796875,
0.0345458984375,
0.0287628173828125,
0.009857177734375,
0.006237030029296875,
-0.06390380859375,
-0.0404052734375,
0.11175537109375,
0.040771484375,
0.00011771917343139648,
0.00983428955078125,
-0.0110015869140625,
0.01519775390625,
-0.0291748046875,
0.030303955078125,
0.01422119140625,
0.0219573974609375,
0.02508544921875,
-0.045440673828125,
0.018585205078125,
-0.03375244140625,
0.01451873779296875,
0.022003173828125,
-0.05712890625,
0.077880859375,
-0.03643798828125,
-0.019805908203125,
0.007007598876953125,
0.04840087890625,
0.01512908935546875,
0.0078125,
0.034942626953125,
0.0711669921875,
0.05023193359375,
-0.035125732421875,
0.07537841796875,
-0.0062713623046875,
0.054107666015625,
0.0438232421875,
0.030792236328125,
0.0291595458984375,
0.02349853515625,
-0.017913818359375,
0.026824951171875,
0.0804443359375,
-0.03216552734375,
0.02880859375,
0.0149688720703125,
-0.0089263916015625,
-0.0044403076171875,
0.00640869140625,
-0.035003662109375,
0.026092529296875,
0.01318359375,
-0.041748046875,
-0.0185546875,
0.004596710205078125,
0.014190673828125,
-0.0262451171875,
-0.0212249755859375,
0.03240966796875,
0.0023365020751953125,
-0.0279388427734375,
0.0570068359375,
0.005115509033203125,
0.06689453125,
-0.039825439453125,
0.005466461181640625,
-0.02117919921875,
0.020782470703125,
-0.0322265625,
-0.0665283203125,
0.0125732421875,
-0.015350341796875,
0.000015676021575927734,
0.0057830810546875,
0.05792236328125,
-0.024261474609375,
-0.0333251953125,
0.018524169921875,
0.02203369140625,
0.042633056640625,
-0.00008428096771240234,
-0.09173583984375,
0.0078125,
-0.00003343820571899414,
-0.04449462890625,
0.0290985107421875,
0.0263824462890625,
0.00667572021484375,
0.054046630859375,
0.045867919921875,
-0.00420379638671875,
0.01065826416015625,
-0.01345062255859375,
0.07183837890625,
-0.0357666015625,
-0.019317626953125,
-0.07061767578125,
0.049713134765625,
-0.01013946533203125,
-0.047637939453125,
0.038360595703125,
0.0489501953125,
0.06768798828125,
0.0016345977783203125,
0.044403076171875,
-0.023223876953125,
-0.01166534423828125,
-0.0302581787109375,
0.056884765625,
-0.052581787109375,
-0.0048675537109375,
-0.020538330078125,
-0.051483154296875,
-0.02288818359375,
0.049774169921875,
-0.0205535888671875,
0.0283050537109375,
0.032073974609375,
0.07720947265625,
-0.0292510986328125,
-0.0184783935546875,
0.0139007568359375,
0.0096893310546875,
0.01267242431640625,
0.0394287109375,
0.0294647216796875,
-0.05841064453125,
0.0272979736328125,
-0.0465087890625,
-0.0173187255859375,
-0.0015354156494140625,
-0.044708251953125,
-0.07012939453125,
-0.060546875,
-0.048797607421875,
-0.052581787109375,
-0.011016845703125,
0.06744384765625,
0.07928466796875,
-0.0523681640625,
-0.01154327392578125,
0.005859375,
0.0115966796875,
-0.02301025390625,
-0.01555633544921875,
0.052886962890625,
-0.01322174072265625,
-0.06842041015625,
-0.0259552001953125,
0.01158905029296875,
0.0259246826171875,
0.0005702972412109375,
-0.0231781005859375,
-0.01444244384765625,
-0.02227783203125,
0.01462554931640625,
0.0294036865234375,
-0.043670654296875,
-0.020050048828125,
-0.014251708984375,
-0.016143798828125,
0.04364013671875,
0.0307769775390625,
-0.044403076171875,
0.0294189453125,
0.031951904296875,
0.030181884765625,
0.07000732421875,
-0.0191192626953125,
-0.0004374980926513672,
-0.05718994140625,
0.0364990234375,
-0.0112457275390625,
0.04119873046875,
0.028594970703125,
-0.031951904296875,
0.0472412109375,
0.037109375,
-0.041473388671875,
-0.0570068359375,
-0.01473236083984375,
-0.0865478515625,
-0.00902557373046875,
0.06451416015625,
-0.02734375,
-0.043609619140625,
0.03753662109375,
-0.0008702278137207031,
0.0418701171875,
-0.00991058349609375,
0.03350830078125,
0.027587890625,
-0.0128021240234375,
-0.040740966796875,
-0.03912353515625,
0.0309295654296875,
0.0088043212890625,
-0.050994873046875,
-0.031829833984375,
-0.0036296844482421875,
0.0552978515625,
0.01654052734375,
0.044403076171875,
-0.0188140869140625,
0.01016998291015625,
0.005878448486328125,
0.04156494140625,
-0.0279083251953125,
-0.006191253662109375,
-0.02734375,
0.0018682479858398438,
-0.003498077392578125,
-0.052581787109375
]
] |
timm/gmlp_s16_224.ra3_in1k | 2023-03-27T23:01:25.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"arxiv:2105.08050",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/gmlp_s16_224.ra3_in1k | 0 | 36,551 | timm | 2023-03-27T23:01:08 | ---
tags:
- image-classification
- timm
library_tag: timm
license: apache-2.0
datasets:
- imagenet-1k
---
# Model card for gmlp_s16_224.ra3_in1k
A gMLP image classification model. Trained on ImageNet-1k in `timm` by Ross Wightman.
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 19.4
- GMACs: 4.4
- Activations (M): 15.1
- Image size: 224 x 224
- **Papers:**
- Pay Attention to MLPs: https://arxiv.org/abs/2105.08050
- **Original:** https://github.com/huggingface/pytorch-image-models
- **Dataset:** ImageNet-1k
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('gmlp_s16_224.ra3_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
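For clarity, the `torch.topk(output.softmax(dim=1) * 100, k=5)` line turns raw logits into percentage probabilities and picks the five most likely classes. A minimal pure-Python sketch of that computation, using toy logits in place of the real `(1, 1000)` model output:

```python
import math

def softmax(logits):
    # numerically stable softmax over a list of logits
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# toy logits standing in for the model output
logits = [2.0, 1.0, 0.1]
probs = [100.0 * p for p in softmax(logits)]  # percentages, as in the card's example
top2 = sorted(range(len(probs)), key=lambda i: -probs[i])[:2]
print(top2)  # indices of the two most likely classes
```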
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'gmlp_s16_224.ra3_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 196, 256) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@inproceedings{Liu2021PayAT,
title={Pay Attention to MLPs},
author={Hanxiao Liu and Zihang Dai and David R. So and Quoc V. Le},
booktitle={Neural Information Processing Systems},
year={2021}
}
```
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
| 2,934 | [
[
-0.03546142578125,
-0.0310516357421875,
0.00719451904296875,
0.007663726806640625,
-0.0204925537109375,
-0.0210723876953125,
-0.00940704345703125,
-0.038330078125,
0.00946044921875,
0.0226593017578125,
-0.0386962890625,
-0.048797607421875,
-0.05584716796875,
-0.00782012939453125,
-0.004543304443359375,
0.08428955078125,
-0.0006318092346191406,
0.0031986236572265625,
-0.0191192626953125,
-0.031036376953125,
0.0014944076538085938,
-0.0250091552734375,
-0.068603515625,
-0.041656494140625,
0.0301361083984375,
0.00652313232421875,
0.048675537109375,
0.0364990234375,
0.047760009765625,
0.0311279296875,
-0.0207672119140625,
0.00324249267578125,
-0.01532745361328125,
-0.0230712890625,
0.03082275390625,
-0.03961181640625,
-0.045379638671875,
0.029876708984375,
0.055023193359375,
0.030517578125,
-0.002056121826171875,
0.0260467529296875,
0.018035888671875,
0.042816162109375,
-0.0203857421875,
0.011627197265625,
-0.034271240234375,
0.030029296875,
-0.00318145751953125,
0.00020313262939453125,
-0.02740478515625,
-0.032135009765625,
0.012451171875,
-0.03656005859375,
0.037445068359375,
-0.00007528066635131836,
0.10015869140625,
0.01001739501953125,
0.00685882568359375,
-0.0037937164306640625,
-0.013397216796875,
0.050140380859375,
-0.0731201171875,
0.01090240478515625,
0.030548095703125,
0.01251220703125,
-0.007419586181640625,
-0.0679931640625,
-0.03826904296875,
-0.0187530517578125,
-0.01506805419921875,
-0.0029926300048828125,
-0.0169525146484375,
0.00972747802734375,
0.031890869140625,
0.03094482421875,
-0.0428466796875,
0.006008148193359375,
-0.04400634765625,
-0.016082763671875,
0.037261962890625,
0.00226593017578125,
0.0172882080078125,
-0.020416259765625,
-0.035064697265625,
-0.028350830078125,
-0.0301513671875,
0.01407623291015625,
0.0352783203125,
0.0195465087890625,
-0.047027587890625,
0.038482666015625,
0.0174560546875,
0.049468994140625,
0.005451202392578125,
-0.0253143310546875,
0.047760009765625,
0.003864288330078125,
-0.03582763671875,
-0.015380859375,
0.08868408203125,
0.02886962890625,
0.01507568359375,
-0.0019931793212890625,
-0.003826141357421875,
-0.0262908935546875,
-0.0090179443359375,
-0.08831787109375,
-0.0205535888671875,
0.0217437744140625,
-0.04620361328125,
-0.03668212890625,
0.0229339599609375,
-0.04833984375,
-0.01030731201171875,
-0.0003502368927001953,
0.04345703125,
-0.035797119140625,
-0.0225372314453125,
0.006549835205078125,
-0.00860595703125,
0.0271148681640625,
0.0079498291015625,
-0.047119140625,
0.0179901123046875,
0.03485107421875,
0.0980224609375,
0.004116058349609375,
-0.0380859375,
-0.021728515625,
-0.022979736328125,
-0.01812744140625,
0.032867431640625,
-0.011962890625,
-0.01206207275390625,
-0.021026611328125,
0.02496337890625,
-0.016357421875,
-0.053802490234375,
0.0220489501953125,
-0.011260986328125,
0.03076171875,
0.002471923828125,
-0.0243377685546875,
-0.027008056640625,
0.0269775390625,
-0.028839111328125,
0.09613037109375,
0.025054931640625,
-0.07440185546875,
0.023162841796875,
-0.0416259765625,
0.00152587890625,
-0.0169677734375,
-0.004039764404296875,
-0.08001708984375,
-0.007640838623046875,
0.01160430908203125,
0.0579833984375,
-0.0164947509765625,
0.011077880859375,
-0.0467529296875,
-0.017120361328125,
0.0211334228515625,
0.0010728836059570312,
0.07342529296875,
0.01525115966796875,
-0.01904296875,
0.02264404296875,
-0.039886474609375,
0.0142364501953125,
0.0360107421875,
-0.02142333984375,
0.00850677490234375,
-0.042572021484375,
0.014312744140625,
0.0264129638671875,
0.00811767578125,
-0.043853759765625,
0.032379150390625,
-0.01324462890625,
0.03326416015625,
0.05340576171875,
-0.01442718505859375,
0.022186279296875,
-0.031036376953125,
0.03350830078125,
0.01537322998046875,
0.0253448486328125,
0.007068634033203125,
-0.055511474609375,
-0.0484619140625,
-0.036590576171875,
0.0255279541015625,
0.028839111328125,
-0.032318115234375,
0.037994384765625,
-0.005252838134765625,
-0.054656982421875,
-0.032318115234375,
-0.0011453628540039062,
0.036651611328125,
0.047576904296875,
0.02288818359375,
-0.03411865234375,
-0.0335693359375,
-0.061859130859375,
-0.00217437744140625,
-0.0019216537475585938,
-0.007442474365234375,
0.0294036865234375,
0.04803466796875,
-0.01506805419921875,
0.0472412109375,
-0.036041259765625,
-0.020782470703125,
-0.019134521484375,
0.0140380859375,
0.04022216796875,
0.057373046875,
0.06268310546875,
-0.034912109375,
-0.0390625,
-0.01299285888671875,
-0.06884765625,
0.00798797607421875,
0.000213623046875,
-0.0132598876953125,
0.02691650390625,
0.01507568359375,
-0.051513671875,
0.044219970703125,
0.0272064208984375,
-0.036468505859375,
0.043853759765625,
-0.023406982421875,
0.007030487060546875,
-0.08270263671875,
0.01593017578125,
0.0250396728515625,
-0.0127105712890625,
-0.03076171875,
0.00397491455078125,
-0.003448486328125,
-0.00823211669921875,
-0.0298614501953125,
0.05316162109375,
-0.03607177734375,
-0.020294189453125,
-0.01328277587890625,
-0.0205535888671875,
-0.0005125999450683594,
0.050628662109375,
-0.007320404052734375,
0.03399658203125,
0.0643310546875,
-0.0306396484375,
0.028472900390625,
0.0311431884765625,
-0.017669677734375,
0.028106689453125,
-0.053253173828125,
0.0175018310546875,
-0.0006070137023925781,
0.01666259765625,
-0.07275390625,
-0.015838623046875,
0.039154052734375,
-0.046356201171875,
0.051513671875,
-0.049591064453125,
-0.038726806640625,
-0.041107177734375,
-0.033233642578125,
0.0245513916015625,
0.045318603515625,
-0.049713134765625,
0.036895751953125,
0.01629638671875,
0.01123809814453125,
-0.045684814453125,
-0.066162109375,
-0.0186309814453125,
-0.0264129638671875,
-0.05084228515625,
0.029449462890625,
0.00330352783203125,
0.016204833984375,
0.0115203857421875,
-0.009552001953125,
-0.0022792816162109375,
-0.016876220703125,
0.0335693359375,
0.034210205078125,
-0.016632080078125,
-0.0112762451171875,
-0.0302886962890625,
-0.01293182373046875,
0.0074462890625,
-0.01611328125,
0.046478271484375,
-0.026763916015625,
-0.01096343994140625,
-0.057159423828125,
-0.00577545166015625,
0.03955078125,
0.00045871734619140625,
0.0670166015625,
0.07672119140625,
-0.034027099609375,
0.004070281982421875,
-0.0302276611328125,
-0.02471923828125,
-0.036834716796875,
0.030731201171875,
-0.0252685546875,
-0.032989501953125,
0.067138671875,
0.013641357421875,
-0.0012607574462890625,
0.048583984375,
0.019989013671875,
-0.0016422271728515625,
0.06329345703125,
0.04052734375,
0.004016876220703125,
0.052978515625,
-0.06927490234375,
-0.01071929931640625,
-0.06982421875,
-0.037841796875,
-0.0279388427734375,
-0.041015625,
-0.049224853515625,
-0.0301361083984375,
0.0285491943359375,
0.0086517333984375,
-0.032684326171875,
0.0261688232421875,
-0.0650634765625,
0.014678955078125,
0.050079345703125,
0.037628173828125,
-0.022247314453125,
0.0237884521484375,
-0.0298919677734375,
0.0018930435180664062,
-0.06378173828125,
-0.020294189453125,
0.0849609375,
0.038360595703125,
0.0526123046875,
-0.01453399658203125,
0.045196533203125,
-0.0205535888671875,
0.02862548828125,
-0.049468994140625,
0.048004150390625,
-0.0101318359375,
-0.03533935546875,
0.0004088878631591797,
-0.0386962890625,
-0.07122802734375,
0.012298583984375,
-0.023895263671875,
-0.06097412109375,
0.005016326904296875,
0.0137939453125,
-0.01444244384765625,
0.06390380859375,
-0.053680419921875,
0.07757568359375,
-0.007404327392578125,
-0.0347900390625,
0.00931549072265625,
-0.0494384765625,
0.0199432373046875,
0.0239715576171875,
-0.0165863037109375,
-0.01312255859375,
0.0183258056640625,
0.08111572265625,
-0.04449462890625,
0.059600830078125,
-0.04302978515625,
0.0267333984375,
0.0283660888671875,
-0.00670623779296875,
0.031036376953125,
-0.00878143310546875,
-0.006122589111328125,
0.02410888671875,
0.00795745849609375,
-0.0313720703125,
-0.0335693359375,
0.048187255859375,
-0.07257080078125,
-0.0205535888671875,
-0.045989990234375,
-0.045501708984375,
0.0048065185546875,
0.0012912750244140625,
0.03936767578125,
0.041839599609375,
0.020721435546875,
0.01690673828125,
0.038330078125,
-0.0287628173828125,
0.03216552734375,
-0.003276824951171875,
-0.0321044921875,
-0.03863525390625,
0.0546875,
0.01190185546875,
0.01470184326171875,
-0.005023956298828125,
0.0239715576171875,
-0.0264892578125,
-0.04364013671875,
-0.019378662109375,
0.04095458984375,
-0.0494384765625,
-0.032073974609375,
-0.042449951171875,
-0.0404052734375,
-0.02569580078125,
-0.004058837890625,
-0.0311737060546875,
-0.026214599609375,
-0.03216552734375,
0.009124755859375,
0.053680419921875,
0.045135498046875,
-0.0183258056640625,
0.037994384765625,
-0.042388916015625,
0.0129547119140625,
0.008026123046875,
0.047943115234375,
-0.0064239501953125,
-0.075439453125,
-0.01332855224609375,
-0.00934600830078125,
-0.039459228515625,
-0.057037353515625,
0.041412353515625,
0.018218994140625,
0.040283203125,
0.0335693359375,
-0.0280303955078125,
0.058074951171875,
-0.0062408447265625,
0.0328369140625,
0.026458740234375,
-0.04193115234375,
0.04632568359375,
-0.0114288330078125,
0.01421356201171875,
0.0083160400390625,
0.03509521484375,
-0.02490234375,
-0.00914764404296875,
-0.07818603515625,
-0.05694580078125,
0.07049560546875,
0.010345458984375,
-0.003818511962890625,
0.027618408203125,
0.057708740234375,
0.0011749267578125,
0.00211334228515625,
-0.07122802734375,
-0.0290679931640625,
-0.0187225341796875,
-0.0202484130859375,
-0.0016279220581054688,
-0.01053619384765625,
-0.007415771484375,
-0.051300048828125,
0.059539794921875,
-0.01141357421875,
0.059814453125,
0.02227783203125,
0.01204681396484375,
-0.00974273681640625,
-0.0266571044921875,
0.034759521484375,
0.026275634765625,
-0.04205322265625,
0.00360870361328125,
0.0029735565185546875,
-0.044952392578125,
-0.0007505416870117188,
0.018829345703125,
-0.0015001296997070312,
-0.0009446144104003906,
0.0269927978515625,
0.065673828125,
-0.00439453125,
0.004116058349609375,
0.0269317626953125,
-0.0009851455688476562,
-0.03326416015625,
-0.01227569580078125,
0.00920867919921875,
0.0002460479736328125,
0.027801513671875,
0.025787353515625,
0.0300750732421875,
-0.01177978515625,
-0.022796630859375,
0.0152130126953125,
0.050994873046875,
-0.023162841796875,
-0.027618408203125,
0.0498046875,
-0.0218048095703125,
-0.0078887939453125,
0.061279296875,
-0.01470184326171875,
-0.03485107421875,
0.0745849609375,
0.036773681640625,
0.0723876953125,
-0.00490570068359375,
0.01003265380859375,
0.06927490234375,
0.0168609619140625,
-0.004779815673828125,
0.0172271728515625,
0.01488494873046875,
-0.057891845703125,
-0.0018177032470703125,
-0.04425048828125,
-0.0089111328125,
0.036285400390625,
-0.046905517578125,
0.019378662109375,
-0.059722900390625,
-0.033477783203125,
0.01381683349609375,
0.0240325927734375,
-0.055267333984375,
0.0107421875,
0.007442474365234375,
0.0650634765625,
-0.0640869140625,
0.053436279296875,
0.054412841796875,
-0.03375244140625,
-0.06524658203125,
-0.01294708251953125,
0.003978729248046875,
-0.06610107421875,
0.041168212890625,
0.035552978515625,
0.00670623779296875,
0.0025959014892578125,
-0.07012939453125,
-0.047515869140625,
0.09686279296875,
0.03656005859375,
-0.006725311279296875,
0.0140838623046875,
-0.005329132080078125,
0.018585205078125,
-0.032684326171875,
0.0316162109375,
0.00888824462890625,
0.027191162109375,
0.0311737060546875,
-0.050445556640625,
0.0141754150390625,
-0.0257568359375,
0.000040411949157714844,
0.0148468017578125,
-0.05096435546875,
0.06842041015625,
-0.037811279296875,
-0.00800323486328125,
0.01202392578125,
0.040374755859375,
0.035308837890625,
0.01277923583984375,
0.03021240234375,
0.0693359375,
0.044281005859375,
-0.027069091796875,
0.0701904296875,
-0.002071380615234375,
0.058074951171875,
0.05255126953125,
0.02276611328125,
0.044708251953125,
0.034942626953125,
-0.0333251953125,
0.0272216796875,
0.07781982421875,
-0.035125732421875,
0.02813720703125,
0.0174560546875,
-0.00018835067749023438,
-0.00643157958984375,
0.0114898681640625,
-0.04010009765625,
0.0304107666015625,
0.0160675048828125,
-0.040771484375,
-0.0155029296875,
0.0005183219909667969,
0.00974273681640625,
-0.035400390625,
-0.0180816650390625,
0.032562255859375,
-0.0018901824951171875,
-0.0290374755859375,
0.056854248046875,
0.0022754669189453125,
0.0675048828125,
-0.041778564453125,
0.0034694671630859375,
-0.0207366943359375,
0.0202484130859375,
-0.029327392578125,
-0.06719970703125,
0.02667236328125,
-0.01387786865234375,
-0.00643157958984375,
0.0013704299926757812,
0.047119140625,
-0.022674560546875,
-0.0452880859375,
0.018524169921875,
0.0174407958984375,
0.039520263671875,
0.00027680397033691406,
-0.0975341796875,
0.006420135498046875,
0.0005702972412109375,
-0.044677734375,
0.0272216796875,
0.0207366943359375,
0.0234832763671875,
0.061431884765625,
0.049041748046875,
-0.0025882720947265625,
0.01195526123046875,
-0.01062774658203125,
0.06317138671875,
-0.037841796875,
-0.02069091796875,
-0.06268310546875,
0.038665771484375,
-0.0023937225341796875,
-0.041015625,
0.03948974609375,
0.0509033203125,
0.065673828125,
-0.007663726806640625,
0.033355712890625,
-0.02423095703125,
0.003559112548828125,
-0.03509521484375,
0.056427001953125,
-0.052093505859375,
-0.00439453125,
-0.0198822021484375,
-0.056243896484375,
-0.0369873046875,
0.06927490234375,
-0.0249786376953125,
0.041656494140625,
0.0380859375,
0.07781982421875,
-0.0285797119140625,
-0.02301025390625,
0.01264190673828125,
0.0157928466796875,
0.0076904296875,
0.032012939453125,
0.0372314453125,
-0.0589599609375,
0.021240234375,
-0.043914794921875,
-0.0097808837890625,
-0.0114898681640625,
-0.062744140625,
-0.07147216796875,
-0.05560302734375,
-0.050689697265625,
-0.052734375,
-0.0195465087890625,
0.0667724609375,
0.0771484375,
-0.053253173828125,
-0.01309967041015625,
-0.0019550323486328125,
0.0023708343505859375,
-0.01446533203125,
-0.0191650390625,
0.0455322265625,
-0.0150909423828125,
-0.054443359375,
-0.0231475830078125,
0.0030345916748046875,
0.03863525390625,
0.00010818243026733398,
-0.02703857421875,
-0.020660400390625,
-0.0263519287109375,
0.0121917724609375,
0.0278167724609375,
-0.054779052734375,
-0.00896453857421875,
-0.0153656005859375,
-0.01068878173828125,
0.03765869140625,
0.0297088623046875,
-0.039398193359375,
0.02557373046875,
0.029083251953125,
0.0164337158203125,
0.06939697265625,
-0.0230865478515625,
0.0015850067138671875,
-0.0623779296875,
0.042755126953125,
-0.0010547637939453125,
0.040802001953125,
0.034820556640625,
-0.0220489501953125,
0.049102783203125,
0.034698486328125,
-0.041656494140625,
-0.06524658203125,
-0.01067352294921875,
-0.08245849609375,
-0.00897216796875,
0.06964111328125,
-0.02813720703125,
-0.0313720703125,
0.040069580078125,
-0.00533294677734375,
0.038360595703125,
-0.0009260177612304688,
0.03759765625,
0.027862548828125,
-0.014801025390625,
-0.049652099609375,
-0.0455322265625,
0.03875732421875,
0.01018524169921875,
-0.052276611328125,
-0.0261383056640625,
0.003170013427734375,
0.0498046875,
0.01520538330078125,
0.035980224609375,
-0.016357421875,
0.0048370361328125,
0.0013322830200195312,
0.036285400390625,
-0.0321044921875,
-0.00988006591796875,
-0.0274810791015625,
0.004039764404296875,
-0.00494384765625,
-0.040557861328125
]
] |
sentence-transformers/paraphrase-albert-small-v2 | 2022-07-08T04:07:04.000Z | [
"sentence-transformers",
"pytorch",
"tf",
"rust",
"albert",
"feature-extraction",
"sentence-similarity",
"transformers",
"dataset:flax-sentence-embeddings/stackexchange_xml",
"dataset:s2orc",
"dataset:ms_marco",
"dataset:wiki_atomic_edits",
"dataset:snli",
"dataset:multi_nli",
"dataset:embedding-data/altlex",
"dataset:embedding-data/simple-wiki",
"dataset:embedding-data/flickr30k-captions",
"dataset:embedding-data/coco_captions",
"dataset:embedding-data/sentence-compression",
"dataset:embedding-data/QQP",
"dataset:yahoo_answers_topics",
"arxiv:1908.10084",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] | sentence-similarity | sentence-transformers | null | null | sentence-transformers/paraphrase-albert-small-v2 | 4 | 36,479 | sentence-transformers | 2022-03-02T23:29:05 | ---
pipeline_tag: sentence-similarity
license: apache-2.0
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- transformers
datasets:
- flax-sentence-embeddings/stackexchange_xml
- s2orc
- ms_marco
- wiki_atomic_edits
- snli
- multi_nli
- embedding-data/altlex
- embedding-data/simple-wiki
- embedding-data/flickr30k-captions
- embedding-data/coco_captions
- embedding-data/sentence-compression
- embedding-data/QQP
- yahoo_answers_topics
---
# sentence-transformers/paraphrase-albert-small-v2
This is a [sentence-transformers](https://www.SBERT.net) model: It maps sentences & paragraphs to a 768 dimensional dense vector space and can be used for tasks like clustering or semantic search.
## Usage (Sentence-Transformers)
Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:
```
pip install -U sentence-transformers
```
Then you can use the model like this:
```python
from sentence_transformers import SentenceTransformer
sentences = ["This is an example sentence", "Each sentence is converted"]
model = SentenceTransformer('sentence-transformers/paraphrase-albert-small-v2')
embeddings = model.encode(sentences)
print(embeddings)
```
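The embeddings returned by `model.encode` are typically compared with cosine similarity for semantic search or clustering (`sentence_transformers.util.cos_sim` does this for whole batches). A minimal dependency-free sketch of the underlying computation, with short toy vectors standing in for the model's 768-dimensional embeddings:

```python
import math

def cos_sim(a, b):
    # cosine similarity between two embedding vectors
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# toy 3-d vectors standing in for real sentence embeddings
e1 = [1.0, 0.0, 1.0]
e2 = [1.0, 1.0, 0.0]
print(cos_sim(e1, e2))  # 0.5
```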
## Usage (HuggingFace Transformers)
Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling operation on top of the contextualized word embeddings.
```python
from transformers import AutoTokenizer, AutoModel
import torch
#Mean Pooling - Take attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
token_embeddings = model_output[0] #First element of model_output contains all token embeddings
input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)
# Sentences we want sentence embeddings for
sentences = ['This is an example sentence', 'Each sentence is converted']
# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('sentence-transformers/paraphrase-albert-small-v2')
model = AutoModel.from_pretrained('sentence-transformers/paraphrase-albert-small-v2')
# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# Compute token embeddings
with torch.no_grad():
model_output = model(**encoded_input)
# Perform pooling. In this case, mean pooling.
sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask'])
print("Sentence embeddings:")
print(sentence_embeddings)
```
## Evaluation Results
For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name=sentence-transformers/paraphrase-albert-small-v2)
## Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 100, 'do_lower_case': False}) with Transformer model: AlbertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
)
```
## Citing & Authors
This model was trained by [sentence-transformers](https://www.sbert.net/).
If you find this model helpful, feel free to cite our publication [Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks](https://arxiv.org/abs/1908.10084):
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "http://arxiv.org/abs/1908.10084",
}
``` | 4,026 | [
[
-0.019256591796875,
-0.05560302734375,
0.0278778076171875,
0.027618408203125,
-0.0246734619140625,
-0.035430908203125,
-0.0159759521484375,
0.0016260147094726562,
0.01160430908203125,
0.037078857421875,
-0.032958984375,
-0.027313232421875,
-0.04779052734375,
0.01064300537109375,
-0.039886474609375,
0.0673828125,
-0.016143798828125,
-0.0040130615234375,
-0.0289154052734375,
-0.0187835693359375,
-0.0166168212890625,
-0.0223846435546875,
-0.0275115966796875,
-0.0171966552734375,
0.017730712890625,
0.0176239013671875,
0.04498291015625,
0.03851318359375,
0.029998779296875,
0.033355712890625,
-0.0020198822021484375,
0.001224517822265625,
-0.0199737548828125,
-0.001194000244140625,
-0.0016498565673828125,
-0.033477783203125,
0.0012140274047851562,
0.0081939697265625,
0.0443115234375,
0.03472900390625,
-0.007610321044921875,
0.0121002197265625,
0.00907135009765625,
0.016845703125,
-0.033935546875,
0.035888671875,
-0.047454833984375,
0.0089263916015625,
0.007404327392578125,
0.00012874603271484375,
-0.03607177734375,
-0.00923919677734375,
0.023345947265625,
-0.032958984375,
0.0216827392578125,
0.0209503173828125,
0.08355712890625,
0.02777099609375,
-0.01421356201171875,
-0.0251617431640625,
-0.02288818359375,
0.06787109375,
-0.06597900390625,
0.00917816162109375,
0.0245361328125,
-0.00226593017578125,
0.0086669921875,
-0.0855712890625,
-0.058074951171875,
-0.01326751708984375,
-0.042205810546875,
0.00630950927734375,
-0.0289459228515625,
-0.0034122467041015625,
0.011810302734375,
0.015960693359375,
-0.053192138671875,
-0.0168609619140625,
-0.035552978515625,
-0.01129913330078125,
0.0313720703125,
0.007404327392578125,
0.027587890625,
-0.048370361328125,
-0.0311431884765625,
-0.0308837890625,
-0.01418304443359375,
-0.005138397216796875,
-0.0010013580322265625,
0.01141357421875,
-0.0258331298828125,
0.06207275390625,
-0.01125335693359375,
0.03533935546875,
-0.0006561279296875,
0.0176239013671875,
0.043243408203125,
-0.029022216796875,
-0.01540374755859375,
-0.007747650146484375,
0.0908203125,
0.0299835205078125,
0.0184326171875,
-0.006694793701171875,
-0.01214599609375,
-0.0032901763916015625,
0.010284423828125,
-0.06494140625,
-0.03546142578125,
0.007503509521484375,
-0.03436279296875,
-0.0224151611328125,
0.00897216796875,
-0.059600830078125,
-0.00009614229202270508,
0.00428009033203125,
0.058258056640625,
-0.048370361328125,
0.0147705078125,
0.0131072998046875,
-0.0276031494140625,
0.0176544189453125,
-0.024688720703125,
-0.054046630859375,
0.01763916015625,
0.0207977294921875,
0.0836181640625,
0.0087738037109375,
-0.039947509765625,
-0.022705078125,
0.0010042190551757812,
0.0052947998046875,
0.05419921875,
-0.03155517578125,
-0.0018062591552734375,
0.00719451904296875,
0.01549530029296875,
-0.042205810546875,
-0.031158447265625,
0.048797607421875,
-0.0224609375,
0.04913330078125,
0.00583648681640625,
-0.058990478515625,
-0.0084686279296875,
0.0020751953125,
-0.03753662109375,
0.0753173828125,
0.0128631591796875,
-0.072998046875,
-0.0010528564453125,
-0.054168701171875,
-0.020355224609375,
-0.005977630615234375,
-0.0015153884887695312,
-0.053741455078125,
0.0094146728515625,
0.041015625,
0.055328369140625,
-0.0007567405700683594,
0.01384735107421875,
-0.027313232421875,
-0.032623291015625,
0.0282440185546875,
-0.0244598388671875,
0.0799560546875,
0.00997161865234375,
-0.0274200439453125,
0.004375457763671875,
-0.036041259765625,
-0.00954437255859375,
0.02606201171875,
-0.0052337646484375,
-0.02215576171875,
-0.00484466552734375,
0.0205078125,
0.0232696533203125,
0.024169921875,
-0.04473876953125,
0.0027523040771484375,
-0.042083740234375,
0.07452392578125,
0.03656005859375,
0.01313018798828125,
0.04827880859375,
-0.030731201171875,
0.0156402587890625,
0.017791748046875,
-0.0011692047119140625,
-0.0137939453125,
-0.035186767578125,
-0.07318115234375,
-0.0212860107421875,
0.0209503173828125,
0.042694091796875,
-0.064697265625,
0.071044921875,
-0.037109375,
-0.038543701171875,
-0.057098388671875,
0.00495147705078125,
0.00998687744140625,
0.0330810546875,
0.055694580078125,
-0.0005602836608886719,
-0.043609619140625,
-0.0755615234375,
-0.00843048095703125,
-0.00450897216796875,
-0.005313873291015625,
0.01195526123046875,
0.060577392578125,
-0.0260772705078125,
0.07684326171875,
-0.043365478515625,
-0.0305023193359375,
-0.033111572265625,
0.023651123046875,
0.0132293701171875,
0.04571533203125,
0.040863037109375,
-0.05267333984375,
-0.035552978515625,
-0.04144287109375,
-0.059814453125,
-0.0033321380615234375,
-0.021240234375,
-0.016265869140625,
-0.00046706199645996094,
0.0423583984375,
-0.06976318359375,
0.0228729248046875,
0.03668212890625,
-0.0347900390625,
0.021697998046875,
-0.023529052734375,
-0.01491546630859375,
-0.09869384765625,
0.00038123130798339844,
0.0039215087890625,
-0.01326751708984375,
-0.02532958984375,
0.008148193359375,
0.013946533203125,
-0.004749298095703125,
-0.032012939453125,
0.0367431640625,
-0.02923583984375,
0.01287078857421875,
-0.002655029296875,
0.035552978515625,
0.0015916824340820312,
0.05279541015625,
-0.005092620849609375,
0.05523681640625,
0.039886474609375,
-0.04339599609375,
0.0266265869140625,
0.05517578125,
-0.039886474609375,
0.01837158203125,
-0.070068359375,
-0.0035190582275390625,
-0.0028533935546875,
0.0287628173828125,
-0.08636474609375,
-0.007904052734375,
0.025390625,
-0.038726806640625,
0.00157928466796875,
0.02337646484375,
-0.0545654296875,
-0.04559326171875,
-0.034698486328125,
0.01123046875,
0.04962158203125,
-0.03863525390625,
0.03814697265625,
0.0146026611328125,
-0.01407623291015625,
-0.0303192138671875,
-0.0789794921875,
0.004962921142578125,
-0.025543212890625,
-0.048736572265625,
0.039031982421875,
-0.009002685546875,
0.009796142578125,
0.019775390625,
0.0185089111328125,
-0.0032482147216796875,
-0.007228851318359375,
-0.003925323486328125,
0.01186370849609375,
-0.00787353515625,
0.0076751708984375,
0.020477294921875,
-0.005451202392578125,
0.0092010498046875,
-0.00909423828125,
0.05194091796875,
-0.0165557861328125,
-0.001842498779296875,
-0.03900146484375,
0.016510009765625,
0.032745361328125,
-0.01532745361328125,
0.0841064453125,
0.0740966796875,
-0.025360107421875,
-0.0026721954345703125,
-0.0298309326171875,
-0.0260467529296875,
-0.03741455078125,
0.04766845703125,
-0.018768310546875,
-0.0654296875,
0.02777099609375,
0.022857666015625,
0.004718780517578125,
0.054962158203125,
0.048858642578125,
-0.01540374755859375,
0.06597900390625,
0.03515625,
-0.0078277587890625,
0.040924072265625,
-0.041015625,
0.02166748046875,
-0.06976318359375,
-0.00048470497131347656,
-0.024322509765625,
-0.02099609375,
-0.047515869140625,
-0.036102294921875,
0.017364501953125,
0.002227783203125,
-0.022003173828125,
0.055328369140625,
-0.041290283203125,
0.01546478271484375,
0.050689697265625,
0.01312255859375,
-0.00466156005859375,
0.0038738250732421875,
-0.0289154052734375,
-0.00675201416015625,
-0.057281494140625,
-0.04534912109375,
0.06707763671875,
0.0269775390625,
0.03387451171875,
-0.01157379150390625,
0.058685302734375,
0.0009603500366210938,
0.004974365234375,
-0.0452880859375,
0.048828125,
-0.01245880126953125,
-0.035858154296875,
-0.0261077880859375,
-0.02740478515625,
-0.06646728515625,
0.033782958984375,
-0.01055145263671875,
-0.05938720703125,
0.005352020263671875,
-0.0134429931640625,
-0.0299835205078125,
0.01114654541015625,
-0.06207275390625,
0.08343505859375,
0.0115203857421875,
-0.00649261474609375,
-0.00670623779296875,
-0.0594482421875,
0.0160675048828125,
0.0079498291015625,
0.00890350341796875,
-0.0013799667358398438,
-0.01041412353515625,
0.0771484375,
-0.0251617431640625,
0.0623779296875,
-0.0056610107421875,
0.024169921875,
0.0293426513671875,
-0.0183868408203125,
0.02703857421875,
0.0010089874267578125,
0.001026153564453125,
0.0037593841552734375,
0.0024471282958984375,
-0.032073974609375,
-0.0426025390625,
0.05157470703125,
-0.07183837890625,
-0.026458740234375,
-0.0333251953125,
-0.05145263671875,
-0.0054168701171875,
0.01233673095703125,
0.034271240234375,
0.0247039794921875,
-0.004741668701171875,
0.04931640625,
0.03631591796875,
-0.02276611328125,
0.056121826171875,
0.004550933837890625,
0.0034351348876953125,
-0.0396728515625,
0.050506591796875,
0.0006871223449707031,
0.004791259765625,
0.0433349609375,
0.0196685791015625,
-0.031494140625,
-0.0160980224609375,
-0.0193634033203125,
0.04010009765625,
-0.045074462890625,
-0.0141754150390625,
-0.0826416015625,
-0.033905029296875,
-0.047576904296875,
-0.003620147705078125,
-0.013946533203125,
-0.031280517578125,
-0.0367431640625,
-0.01519012451171875,
0.026580810546875,
0.029052734375,
-0.0008478164672851562,
0.038818359375,
-0.052490234375,
0.0198516845703125,
0.018707275390625,
-0.0045928955078125,
-0.007686614990234375,
-0.062469482421875,
-0.0262451171875,
0.0025424957275390625,
-0.036102294921875,
-0.062255859375,
0.051177978515625,
0.019744873046875,
0.037353515625,
0.01317596435546875,
0.0120391845703125,
0.047454833984375,
-0.046142578125,
0.0673828125,
0.006072998046875,
-0.07513427734375,
0.0281524658203125,
-0.007213592529296875,
0.025238037109375,
0.040008544921875,
0.019927978515625,
-0.035797119140625,
-0.0391845703125,
-0.061370849609375,
-0.0799560546875,
0.0552978515625,
0.0467529296875,
0.04083251953125,
-0.02215576171875,
0.0201568603515625,
-0.0193634033203125,
0.01519775390625,
-0.08331298828125,
-0.03375244140625,
-0.024658203125,
-0.048858642578125,
-0.0275115966796875,
-0.0179595947265625,
0.007175445556640625,
-0.032806396484375,
0.0550537109375,
0.004657745361328125,
0.0643310546875,
0.017974853515625,
-0.03753662109375,
0.0141448974609375,
0.011749267578125,
0.034942626953125,
0.0180206298828125,
-0.009735107421875,
0.013641357421875,
0.0220489501953125,
-0.01959228515625,
-0.004909515380859375,
0.036529541015625,
-0.01491546630859375,
0.0209503173828125,
0.0364990234375,
0.0753173828125,
0.039154052734375,
-0.0396728515625,
0.058258056640625,
-0.005584716796875,
-0.0227508544921875,
-0.02740478515625,
-0.005863189697265625,
0.0237579345703125,
0.028289794921875,
0.0185394287109375,
-0.0006670951843261719,
0.0034313201904296875,
-0.02642822265625,
0.0284881591796875,
0.017669677734375,
-0.02142333984375,
-0.006923675537109375,
0.0623779296875,
-0.0022525787353515625,
-0.0175933837890625,
0.06524658203125,
-0.0168609619140625,
-0.0469970703125,
0.0306854248046875,
0.044525146484375,
0.0723876953125,
0.005771636962890625,
0.022705078125,
0.033843994140625,
0.0251617431640625,
-0.0133056640625,
-0.00931549072265625,
-0.00270843505859375,
-0.06207275390625,
-0.0110626220703125,
-0.051177978515625,
-0.0009427070617675781,
0.006053924560546875,
-0.048004150390625,
0.0201873779296875,
-0.00775909423828125,
0.004547119140625,
-0.0059967041015625,
-0.006977081298828125,
-0.05462646484375,
0.0021190643310546875,
0.0009512901306152344,
0.055908203125,
-0.0743408203125,
0.059478759765625,
0.0489501953125,
-0.0555419921875,
-0.05047607421875,
0.005275726318359375,
-0.036773681640625,
-0.06689453125,
0.032928466796875,
0.032928466796875,
0.025665283203125,
0.0170135498046875,
-0.044677734375,
-0.0653076171875,
0.1033935546875,
0.02008056640625,
-0.0251007080078125,
-0.0261688232421875,
0.008758544921875,
0.039764404296875,
-0.0308685302734375,
0.0279693603515625,
0.039398193359375,
0.0225677490234375,
-0.0059356689453125,
-0.048797607421875,
0.0175933837890625,
-0.0145721435546875,
0.0201263427734375,
-0.01155853271484375,
-0.0421142578125,
0.0732421875,
0.002593994140625,
-0.002105712890625,
0.0274810791015625,
0.07373046875,
0.0233612060546875,
-0.0014123916625976562,
0.033599853515625,
0.056365966796875,
0.03302001953125,
-0.00283050537109375,
0.07269287109375,
-0.0289459228515625,
0.066650390625,
0.07281494140625,
0.01230621337890625,
0.0819091796875,
0.04296875,
-0.0119171142578125,
0.055908203125,
0.033294677734375,
-0.017333984375,
0.060760498046875,
0.00626373291015625,
0.00022077560424804688,
-0.0033473968505859375,
0.0189208984375,
-0.01418304443359375,
0.0300140380859375,
0.017242431640625,
-0.05322265625,
-0.011627197265625,
0.0102081298828125,
0.0023174285888671875,
-0.0037212371826171875,
0.0038089752197265625,
0.040191650390625,
0.0201873779296875,
-0.03875732421875,
0.030029296875,
0.0163421630859375,
0.0703125,
-0.0290985107421875,
0.01461029052734375,
-0.00409698486328125,
0.028106689453125,
0.0089263916015625,
-0.0443115234375,
0.028778076171875,
-0.01137542724609375,
-0.0008149147033691406,
-0.0189666748046875,
0.039947509765625,
-0.04864501953125,
-0.048187255859375,
0.019805908203125,
0.036956787109375,
0.00769805908203125,
-0.0004897117614746094,
-0.08721923828125,
-0.01004791259765625,
0.00005257129669189453,
-0.042938232421875,
0.01556396484375,
0.031829833984375,
0.026611328125,
0.041259765625,
0.0218048095703125,
-0.01641845703125,
0.0147552490234375,
-0.002532958984375,
0.053985595703125,
-0.045928955078125,
-0.04205322265625,
-0.0745849609375,
0.0435791015625,
-0.01812744140625,
-0.0261993408203125,
0.06256103515625,
0.038970947265625,
0.06524658203125,
-0.0211181640625,
0.046478271484375,
-0.01898193359375,
0.022216796875,
-0.03369140625,
0.05908203125,
-0.03717041015625,
-0.01065826416015625,
-0.0241241455078125,
-0.07098388671875,
-0.0209503173828125,
0.09197998046875,
-0.03265380859375,
0.00922393798828125,
0.0789794921875,
0.06524658203125,
-0.01153564453125,
-0.00745391845703125,
0.01099395751953125,
0.0301513671875,
0.01514434814453125,
0.038055419921875,
0.0310211181640625,
-0.06671142578125,
0.06353759765625,
-0.0462646484375,
0.0013065338134765625,
-0.01549530029296875,
-0.050079345703125,
-0.07568359375,
-0.0633544921875,
-0.033599853515625,
-0.0268096923828125,
-0.01081085205078125,
0.0733642578125,
0.038055419921875,
-0.05804443359375,
-0.00449371337890625,
-0.028900146484375,
-0.0152740478515625,
-0.01253509521484375,
-0.02545166015625,
0.040191650390625,
-0.04150390625,
-0.06744384765625,
0.00782012939453125,
-0.00771331787109375,
0.0008502006530761719,
-0.0172882080078125,
0.006748199462890625,
-0.042816162109375,
0.0124053955078125,
0.03717041015625,
-0.012115478515625,
-0.05377197265625,
-0.0250701904296875,
-0.00612640380859375,
-0.03314208984375,
-0.005054473876953125,
0.0267486572265625,
-0.048583984375,
0.01031494140625,
0.031707763671875,
0.043975830078125,
0.05322265625,
-0.01552581787109375,
0.029937744140625,
-0.056365966796875,
0.019500732421875,
0.00922393798828125,
0.056732177734375,
0.0316162109375,
-0.0110015869140625,
0.03314208984375,
0.027618408203125,
-0.039886474609375,
-0.053985595703125,
-0.0150299072265625,
-0.07867431640625,
-0.018096923828125,
0.09381103515625,
-0.0229949951171875,
-0.0292816162109375,
0.013458251953125,
-0.017059326171875,
0.040740966796875,
-0.0254364013671875,
0.05194091796875,
0.06134033203125,
0.0033969879150390625,
-0.0191192626953125,
-0.0294189453125,
0.0139923095703125,
0.04083251953125,
-0.04010009765625,
-0.01305389404296875,
0.01239776611328125,
0.0305633544921875,
0.0194549560546875,
0.02777099609375,
0.0005125999450683594,
0.0006136894226074219,
0.0067291259765625,
0.0026645660400390625,
-0.00501251220703125,
0.00321197509765625,
-0.026641845703125,
0.01416778564453125,
-0.0297698974609375,
-0.032135009765625
]
] |
charactr/vocos-encodec-24khz | 2023-10-17T14:09:41.000Z | [
"pytorch",
"arxiv:2306.00814",
"license:mit",
"has_space",
"region:us"
] | null | charactr | null | null | charactr/vocos-encodec-24khz | 5 | 36,464 | null | 2023-06-11T16:41:12 | ---
license: mit
---
# Vocos: Closing the gap between time-domain and Fourier-based neural vocoders for high-quality audio synthesis
[Audio samples](https://charactr-platform.github.io/vocos/) |
Paper [[abs]](https://arxiv.org/abs/2306.00814) [[pdf]](https://arxiv.org/pdf/2306.00814.pdf)
Vocos is a fast neural vocoder designed to synthesize audio waveforms from acoustic features. Trained using a Generative
Adversarial Network (GAN) objective, Vocos can generate waveforms in a single forward pass. Unlike other typical
GAN-based vocoders, Vocos does not model audio samples in the time domain. Instead, it generates spectral
coefficients, facilitating rapid audio reconstruction through inverse Fourier transform.
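The final step described above — turning spectral coefficients back into a waveform with an inverse Fourier transform — can be illustrated with `torch.istft`. This is an illustrative STFT round trip on a synthetic signal, not Vocos code: Vocos predicts the spectral coefficients with a network rather than computing them from an input waveform.

```python
import torch

n_fft, hop = 1024, 256
window = torch.hann_window(n_fft)

# A synthetic 1-second, 24 kHz sine wave stands in for real audio.
t = torch.arange(24000) / 24000.0
wave = torch.sin(2 * torch.pi * 440.0 * t)

# Forward STFT -> complex spectral coefficients (the kind of representation Vocos predicts)...
spec = torch.stft(wave, n_fft, hop_length=hop, window=window, return_complex=True)

# ...and a single inverse transform recovers the time-domain waveform.
recon = torch.istft(spec, n_fft, hop_length=hop, window=window, length=wave.numel())

print(torch.allclose(wave, recon, atol=1e-4))
```

With a Hann window and 75% overlap, this inverse pass reconstructs the signal up to floating-point error, which is why generating coefficients instead of raw samples makes synthesis a single cheap step.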
## Installation
To use Vocos only in inference mode, install it using:
```bash
pip install vocos
```
If you wish to train the model, install it with additional dependencies:
```bash
pip install vocos[train]
```
## Usage
### Reconstruct audio from EnCodec tokens
To reconstruct audio from EnCodec tokens, you additionally need to provide a `bandwidth_id` which corresponds to the embedding for bandwidth from the
list: `[1.5, 3.0, 6.0, 12.0]`.
```python
import torch
from vocos import Vocos

vocos = Vocos.from_pretrained("charactr/vocos-encodec-24khz")
audio_tokens = torch.randint(low=0, high=1024, size=(8, 200))  # 8 codebooks, 200 frames
features = vocos.codes_to_features(audio_tokens)
bandwidth_id = torch.tensor([2]) # 6 kbps
audio = vocos.decode(features, bandwidth_id=bandwidth_id)
```
Copy-synthesis from a file: this extracts and quantizes features with EnCodec, then reconstructs them with Vocos in a
single forward pass.
```python
import torchaudio

y, sr = torchaudio.load(YOUR_AUDIO_FILE)
if y.size(0) > 1: # mix to mono
y = y.mean(dim=0, keepdim=True)
y = torchaudio.functional.resample(y, orig_freq=sr, new_freq=24000)
y_hat = vocos(y, bandwidth_id=bandwidth_id)
```
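To listen to the result, the decoded float waveform can be written to disk; `torchaudio.save` works directly on `y_hat`. The stdlib-only sketch below shows the equivalent 16-bit PCM conversion by hand — the synthetic sine-wave `y_hat` and the output file name are placeholders, not Vocos output:

```python
import math
import struct
import wave

SAMPLE_RATE = 24000  # the Vocos models on this card operate at 24 kHz

# Placeholder for the decoded mono waveform: floats in [-1, 1].
y_hat = [math.sin(2 * math.pi * 440 * n / SAMPLE_RATE) for n in range(SAMPLE_RATE)]

with wave.open("reconstructed.wav", "wb") as f:
    f.setnchannels(1)
    f.setsampwidth(2)           # 16-bit PCM
    f.setframerate(SAMPLE_RATE)
    # Clip to [-1, 1] and scale to signed 16-bit integers.
    pcm = struct.pack("<%dh" % len(y_hat),
                      *(int(max(-1.0, min(1.0, s)) * 32767) for s in y_hat))
    f.writeframes(pcm)
```

The resulting `reconstructed.wav` is a one-second 24 kHz mono file playable in any audio player.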
## Citation
If this code contributes to your research, please cite our work:
```
@article{siuzdak2023vocos,
title={Vocos: Closing the gap between time-domain and Fourier-based neural vocoders for high-quality audio synthesis},
author={Siuzdak, Hubert},
journal={arXiv preprint arXiv:2306.00814},
year={2023}
}
```
## License
The code in this repository is released under the MIT license. | 2,226 | [
[
-0.0531005859375,
-0.041534423828125,
0.0028438568115234375,
0.041717529296875,
0.00556182861328125,
0.003147125244140625,
-0.0185546875,
-0.016204833984375,
0.0298919677734375,
0.04119873046875,
-0.0296630859375,
-0.034698486328125,
-0.0237884521484375,
-0.00452423095703125,
-0.034698486328125,
0.05010986328125,
0.01922607421875,
-0.00273895263671875,
0.00006586313247680664,
0.0000059604644775390625,
-0.0158843994140625,
-0.0303192138671875,
-0.07061767578125,
-0.02178955078125,
0.00489044189453125,
-0.01100921630859375,
0.01322174072265625,
0.01428985595703125,
-0.000850677490234375,
0.0297698974609375,
-0.04193115234375,
0.01389312744140625,
-0.0147247314453125,
0.01160430908203125,
0.024322509765625,
-0.0296630859375,
-0.06378173828125,
-0.0159759521484375,
0.037567138671875,
0.034912109375,
-0.032928466796875,
0.01276397705078125,
0.01369476318359375,
0.03399658203125,
-0.0472412109375,
-0.0174102783203125,
-0.05218505859375,
0.025146484375,
-0.00478363037109375,
-0.052825927734375,
-0.0202789306640625,
-0.0229644775390625,
0.00035643577575683594,
-0.0643310546875,
0.051971435546875,
-0.02197265625,
0.0662841796875,
0.032073974609375,
-0.019622802734375,
-0.0159454345703125,
-0.0655517578125,
0.048095703125,
-0.059112548828125,
0.05584716796875,
0.0301055908203125,
0.03448486328125,
-0.0228729248046875,
-0.08038330078125,
-0.053131103515625,
0.01415252685546875,
0.04913330078125,
0.02630615234375,
-0.0019464492797851562,
0.004726409912109375,
0.0144805908203125,
0.0445556640625,
-0.01332855224609375,
-0.0243377685546875,
-0.0604248046875,
-0.051422119140625,
0.045440673828125,
0.00815582275390625,
-0.0026073455810546875,
0.0004429817199707031,
-0.041534423828125,
-0.0328369140625,
-0.045379638671875,
0.040008544921875,
0.058685302734375,
-0.004070281982421875,
-0.0217437744140625,
0.0228729248046875,
0.0032291412353515625,
0.03778076171875,
0.00006020069122314453,
-0.0059051513671875,
0.041412353515625,
-0.031951904296875,
-0.0165252685546875,
0.0189208984375,
0.070068359375,
0.0054473876953125,
0.0051727294921875,
0.0020084381103515625,
-0.02264404296875,
-0.00658416748046875,
0.0340576171875,
-0.0523681640625,
-0.01007080078125,
0.043487548828125,
-0.00885772705078125,
-0.0173797607421875,
0.00563812255859375,
-0.0679931640625,
0.0196533203125,
-0.0222015380859375,
0.03814697265625,
-0.01238250732421875,
-0.0173797607421875,
0.0230255126953125,
-0.03521728515625,
0.0020389556884765625,
0.004985809326171875,
-0.06884765625,
0.037139892578125,
0.028411865234375,
0.0758056640625,
0.01342010498046875,
-0.027313232421875,
-0.048736572265625,
-0.00415802001953125,
-0.03125,
0.0283203125,
0.01161956787109375,
-0.042877197265625,
-0.01678466796875,
-0.004985809326171875,
0.039520263671875,
-0.046142578125,
0.04583740234375,
-0.03155517578125,
0.002727508544921875,
-0.01103973388671875,
-0.00872039794921875,
0.0040740966796875,
-0.028533935546875,
-0.0367431640625,
0.08258056640625,
0.0109405517578125,
-0.023040771484375,
0.007038116455078125,
-0.0322265625,
-0.03179931640625,
-0.029022216796875,
0.0122222900390625,
-0.044952392578125,
-0.004192352294921875,
0.0267791748046875,
0.01317596435546875,
-0.0318603515625,
0.0261688232421875,
-0.030792236328125,
-0.0406494140625,
0.0240020751953125,
-0.059539794921875,
0.091796875,
0.0400390625,
-0.033935546875,
0.0058135986328125,
-0.048736572265625,
0.0049285888671875,
-0.01136016845703125,
-0.0394287109375,
0.00013709068298339844,
-0.007511138916015625,
0.03167724609375,
0.006885528564453125,
-0.0199432373046875,
-0.04656982421875,
-0.0022106170654296875,
-0.043792724609375,
0.0789794921875,
0.054901123046875,
-0.00310516357421875,
0.01776123046875,
-0.0295562744140625,
0.025115966796875,
-0.021240234375,
0.025390625,
0.0013017654418945312,
-0.0186767578125,
-0.046417236328125,
-0.02410888671875,
0.033538818359375,
0.0092620849609375,
-0.049530029296875,
0.05682373046875,
-0.022247314453125,
-0.04095458984375,
-0.061187744140625,
-0.00856781005859375,
-0.0002034902572631836,
0.0286407470703125,
0.048004150390625,
-0.038482666015625,
-0.050201416015625,
-0.03875732421875,
0.0171661376953125,
0.01464080810546875,
-0.032806396484375,
0.03338623046875,
0.0240020751953125,
0.006732940673828125,
0.05999755859375,
-0.034759521484375,
-0.034759521484375,
0.021728515625,
-0.005519866943359375,
0.01910400390625,
0.042816162109375,
0.05511474609375,
-0.053070068359375,
-0.0186920166015625,
-0.0022106170654296875,
-0.0343017578125,
-0.01200103759765625,
0.005313873291015625,
0.005947113037109375,
-0.01459503173828125,
0.044158935546875,
-0.0281982421875,
0.026275634765625,
0.0377197265625,
-0.010101318359375,
0.04437255859375,
-0.00075531005859375,
0.0157623291015625,
-0.07489013671875,
0.002689361572265625,
-0.0207672119140625,
-0.0162200927734375,
-0.03131103515625,
-0.041534423828125,
-0.0227813720703125,
-0.0240020751953125,
-0.062744140625,
0.039764404296875,
-0.0272674560546875,
-0.01502227783203125,
-0.006664276123046875,
-0.008758544921875,
0.022979736328125,
0.07513427734375,
-0.0174560546875,
0.07354736328125,
0.0582275390625,
-0.06390380859375,
0.03076171875,
0.005588531494140625,
-0.0199432373046875,
0.0187225341796875,
-0.08221435546875,
0.0125732421875,
0.0114898681640625,
0.029937744140625,
-0.056060791015625,
-0.00036263465881347656,
0.004955291748046875,
-0.064453125,
0.003932952880859375,
-0.03076171875,
0.0009646415710449219,
-0.023101806640625,
-0.0216827392578125,
0.0545654296875,
0.040069580078125,
-0.0494384765625,
0.0423583984375,
0.048858642578125,
-0.0037841796875,
-0.062164306640625,
-0.06884765625,
-0.01434326171875,
0.0019121170043945312,
-0.033477783203125,
0.05255126953125,
-0.0043792724609375,
0.0082244873046875,
0.0105438232421875,
-0.023712158203125,
-0.015594482421875,
-0.00815582275390625,
0.027374267578125,
-0.0010433197021484375,
-0.015869140625,
-0.004337310791015625,
0.0302734375,
0.003509521484375,
0.006198883056640625,
-0.05621337890625,
0.055206298828125,
-0.0297698974609375,
-0.01212310791015625,
-0.02587890625,
0.024566650390625,
0.07073974609375,
-0.0272216796875,
0.0100250244140625,
0.0633544921875,
-0.0111236572265625,
-0.048583984375,
-0.03106689453125,
-0.0078887939453125,
-0.046905517578125,
0.01824951171875,
-0.0244903564453125,
-0.0513916015625,
0.04449462890625,
0.0093994140625,
-0.00409698486328125,
0.03857421875,
0.037811279296875,
-0.019439697265625,
0.027679443359375,
-0.0024166107177734375,
0.032684326171875,
0.053863525390625,
-0.0841064453125,
-0.006603240966796875,
-0.07470703125,
-0.0227203369140625,
-0.036224365234375,
-0.01250457763671875,
-0.0173492431640625,
-0.02435302734375,
0.051177978515625,
0.007717132568359375,
-0.00970458984375,
0.044219970703125,
-0.058563232421875,
0.02587890625,
0.046844482421875,
0.00890350341796875,
-0.02008056640625,
0.0263519287109375,
-0.005023956298828125,
0.025421142578125,
-0.055389404296875,
-0.006183624267578125,
0.0892333984375,
0.0212249755859375,
0.034881591796875,
0.01140594482421875,
0.04095458984375,
0.04925537109375,
-0.03076171875,
-0.0643310546875,
0.03076171875,
-0.0166015625,
-0.03546142578125,
-0.0384521484375,
-0.0350341796875,
-0.07635498046875,
0.0216827392578125,
-0.028411865234375,
-0.0244140625,
0.04443359375,
0.009124755859375,
-0.01861572265625,
0.0445556640625,
-0.04498291015625,
0.0206451416015625,
0.002044677734375,
-0.0307464599609375,
0.00881195068359375,
-0.025115966796875,
-0.009918212890625,
0.01477813720703125,
-0.007617950439453125,
-0.01062774658203125,
0.021636962890625,
0.055572509765625,
-0.047149658203125,
0.043182373046875,
-0.02142333984375,
-0.00589752197265625,
0.0594482421875,
-0.006298065185546875,
0.00516510009765625,
0.0166015625,
-0.005680084228515625,
0.05780029296875,
0.018951416015625,
-0.030487060546875,
-0.035186767578125,
0.045928955078125,
-0.060546875,
-0.0224761962890625,
0.00453948974609375,
-0.032562255859375,
-0.00937652587890625,
0.006389617919921875,
0.053253173828125,
0.066650390625,
-0.00687408447265625,
0.0269927978515625,
0.0283050537109375,
-0.01348114013671875,
0.054229736328125,
-0.0018815994262695312,
0.01800537109375,
-0.073486328125,
0.07220458984375,
0.00988006591796875,
0.019439697265625,
0.01502227783203125,
0.00862884521484375,
-0.0250244140625,
-0.0438232421875,
-0.0177459716796875,
-0.008331298828125,
-0.03076171875,
-0.01678466796875,
-0.03875732421875,
-0.0299224853515625,
-0.03997802734375,
-0.01171875,
-0.0830078125,
-0.0260467529296875,
-0.013641357421875,
0.02569580078125,
0.01837158203125,
0.012420654296875,
-0.007965087890625,
0.0275421142578125,
-0.05194091796875,
0.0367431640625,
0.0286865234375,
0.030029296875,
0.017547607421875,
-0.040008544921875,
-0.01494598388671875,
0.019561767578125,
-0.0301971435546875,
-0.03790283203125,
0.043670654296875,
-0.0010089874267578125,
0.036895751953125,
0.00885772705078125,
0.004482269287109375,
0.035919189453125,
-0.0194854736328125,
0.064697265625,
0.027191162109375,
-0.103759765625,
0.0276947021484375,
-0.00841522216796875,
0.006237030029296875,
0.01611328125,
0.01326751708984375,
-0.0309295654296875,
-0.0208587646484375,
-0.037689208984375,
-0.07794189453125,
0.049652099609375,
0.054901123046875,
0.00389862060546875,
0.0125885009765625,
0.0085296630859375,
0.00630950927734375,
0.012451171875,
-0.042083740234375,
-0.0335693359375,
-0.056854248046875,
-0.00394439697265625,
-0.0072784423828125,
0.005519866943359375,
-0.0091552734375,
-0.0278167724609375,
0.05987548828125,
0.001979827880859375,
0.061859130859375,
0.0634765625,
0.020263671875,
0.00864410400390625,
0.0272674560546875,
0.06524658203125,
0.0153961181640625,
-0.01450347900390625,
0.0122222900390625,
0.0213470458984375,
-0.052459716796875,
0.037506103515625,
-0.01348114013671875,
0.003055572509765625,
0.0131988525390625,
0.0335693359375,
0.08172607421875,
-0.01387786865234375,
-0.024932861328125,
0.05889892578125,
-0.01119232177734375,
-0.04608154296875,
-0.05035400390625,
0.0201263427734375,
0.00936126708984375,
0.016632080078125,
0.030975341796875,
0.041046142578125,
0.00396728515625,
-0.018035888671875,
0.01145172119140625,
0.00916290283203125,
-0.0225677490234375,
-0.029449462890625,
0.0797119140625,
0.033294677734375,
-0.035430908203125,
0.03546142578125,
-0.002960205078125,
-0.0109100341796875,
0.06878662109375,
0.0236663818359375,
0.07843017578125,
-0.0087890625,
-0.0141754150390625,
0.039459228515625,
0.008514404296875,
-0.0014696121215820312,
0.0229644775390625,
-0.047027587890625,
-0.03790283203125,
-0.0254974365234375,
-0.04840087890625,
-0.003147125244140625,
0.0121612548828125,
-0.051177978515625,
0.02923583984375,
-0.03387451171875,
-0.050445556640625,
0.01126861572265625,
-0.038543701171875,
-0.049224853515625,
0.00846099853515625,
-0.01477813720703125,
0.053680419921875,
-0.045928955078125,
0.0765380859375,
0.0026569366455078125,
-0.0209808349609375,
-0.049224853515625,
0.0021762847900390625,
-0.01375579833984375,
-0.0255584716796875,
0.0230255126953125,
-0.0234222412109375,
-0.043792724609375,
-0.002384185791015625,
-0.032440185546875,
-0.052337646484375,
0.0765380859375,
0.03948974609375,
-0.0484619140625,
0.01004791259765625,
-0.0101165771484375,
0.041534423828125,
-0.0126190185546875,
0.00862884521484375,
0.0091400146484375,
0.01322174072265625,
0.00958251953125,
-0.0654296875,
-0.0221710205078125,
-0.035980224609375,
0.001007080078125,
0.017120361328125,
-0.037506103515625,
0.07171630859375,
-0.0217437744140625,
-0.008880615234375,
-0.01222991943359375,
0.046051025390625,
-0.004016876220703125,
0.0295562744140625,
0.062164306640625,
0.023956298828125,
0.0487060546875,
-0.025054931640625,
0.064453125,
-0.007007598876953125,
0.040283203125,
0.062286376953125,
0.0283050537109375,
0.047149658203125,
0.05615234375,
-0.03131103515625,
0.01035308837890625,
0.0290679931640625,
-0.00952911376953125,
0.053253173828125,
0.009429931640625,
-0.0201263427734375,
-0.00421905517578125,
-0.006298065185546875,
-0.059661865234375,
0.04400634765625,
0.0167388916015625,
-0.0207061767578125,
0.02655029296875,
0.00978851318359375,
-0.03228759765625,
-0.023773193359375,
-0.00914764404296875,
0.0254058837890625,
0.004543304443359375,
-0.039581298828125,
0.09320068359375,
-0.0188446044921875,
0.06695556640625,
-0.055023193359375,
-0.0097198486328125,
-0.0159454345703125,
-0.0161590576171875,
-0.0240936279296875,
-0.0615234375,
0.0382080078125,
-0.0165863037109375,
-0.02154541015625,
-0.006847381591796875,
0.01294708251953125,
-0.041290283203125,
-0.0016307830810546875,
0.031463623046875,
-0.0007777214050292969,
0.0018777847290039062,
0.00151824951171875,
-0.0228118896484375,
0.0160675048828125,
0.00946044921875,
-0.0161895751953125,
-0.019622802734375,
0.0274810791015625,
0.0203857421875,
0.041412353515625,
0.0802001953125,
0.036468505859375,
0.0133056640625,
0.005481719970703125,
0.041961669921875,
-0.0496826171875,
-0.05426025390625,
-0.019439697265625,
0.0247039794921875,
0.0110321044921875,
-0.042266845703125,
0.013092041015625,
0.034423828125,
0.08587646484375,
-0.0266265869140625,
0.039306640625,
0.0017251968383789062,
-0.01296234130859375,
-0.06329345703125,
0.057708740234375,
-0.0672607421875,
0.049285888671875,
-0.0097808837890625,
-0.0623779296875,
0.01207733154296875,
0.012969970703125,
0.015350341796875,
0.0063629150390625,
0.027984619140625,
0.0711669921875,
-0.0242767333984375,
-0.0024089813232421875,
0.0190582275390625,
0.017059326171875,
0.03076171875,
0.048858642578125,
0.047943115234375,
-0.061859130859375,
0.06756591796875,
-0.04718017578125,
0.022064208984375,
0.00357818603515625,
-0.02783203125,
-0.03375244140625,
-0.07244873046875,
-0.01251220703125,
-0.037933349609375,
-0.00222015380859375,
0.05267333984375,
0.07452392578125,
-0.0526123046875,
-0.025146484375,
-0.0204925537109375,
0.0085601806640625,
-0.051666259765625,
-0.022003173828125,
0.0323486328125,
0.0065155029296875,
-0.07733154296875,
0.0479736328125,
0.016143798828125,
0.0301055908203125,
-0.005313873291015625,
-0.031982421875,
0.009002685546875,
0.0120697021484375,
0.039306640625,
0.02264404296875,
-0.0709228515625,
-0.006496429443359375,
-0.0081787109375,
-0.00710296630859375,
0.032928466796875,
0.0565185546875,
-0.0447998046875,
0.055389404296875,
0.05450439453125,
0.0257568359375,
0.08349609375,
-0.03662109375,
0.028106689453125,
-0.06298828125,
0.0237884521484375,
0.025970458984375,
0.0303192138671875,
0.0118408203125,
-0.0226593017578125,
0.03411865234375,
0.03289794921875,
-0.025421142578125,
-0.05584716796875,
-0.014007568359375,
-0.11846923828125,
-0.035858154296875,
0.074462890625,
0.00653076171875,
-0.0200347900390625,
-0.00774383544921875,
-0.0166778564453125,
0.04425048828125,
-0.041046142578125,
0.040985107421875,
0.0190887451171875,
0.0088958740234375,
0.03277587890625,
-0.037933349609375,
0.035736083984375,
0.01611328125,
-0.0308685302734375,
-0.004070281982421875,
0.0162811279296875,
0.0389404296875,
0.021148681640625,
0.054168701171875,
-0.0264434814453125,
0.03668212890625,
0.047576904296875,
0.045166015625,
-0.06317138671875,
-0.041748046875,
-0.01477813720703125,
0.043487548828125,
-0.0404052734375,
-0.039459228515625
]
] |
lllyasviel/control_v11f1e_sd15_tile | 2023-05-04T18:51:13.000Z | [
"diffusers",
"art",
"controlnet",
"stable-diffusion",
"controlnet-v1-1",
"image-to-image",
"arxiv:2302.05543",
"license:openrail",
"has_space",
"diffusers:ControlNetModel",
"region:us"
] | image-to-image | lllyasviel | null | null | lllyasviel/control_v11f1e_sd15_tile | 42 | 36,462 | diffusers | 2023-05-04T18:42:24 | ---
license: openrail
base_model: runwayml/stable-diffusion-v1-5
tags:
- art
- controlnet
- stable-diffusion
- controlnet-v1-1
- image-to-image
duplicated_from: ControlNet-1-1-preview/control_v11f1e_sd15_tile
---
# Controlnet - v1.1 - *Tile Version*
**Controlnet v1.1** was released in [lllyasviel/ControlNet-v1-1](https://huggingface.co/lllyasviel/ControlNet-v1-1) by [Lvmin Zhang](https://huggingface.co/lllyasviel).
This checkpoint is a conversion of [the original checkpoint](https://huggingface.co/lllyasviel/ControlNet-v1-1/blob/main/control_v11f1e_sd15_tile.pth) into `diffusers` format.
It can be used in combination with **Stable Diffusion**, such as [runwayml/stable-diffusion-v1-5](https://huggingface.co/runwayml/stable-diffusion-v1-5).
For more details, please also have a look at the [🧨 Diffusers docs](https://huggingface.co/docs/diffusers/api/pipelines/stable_diffusion/controlnet).
ControlNet is a neural network structure to control diffusion models by adding extra conditions.

This checkpoint corresponds to the ControlNet conditioned on **tiled image**. Conceptually, it is similar to a super-resolution model, but its usage is not limited to that. It is also possible to generate details at the same size as the input (condition) image.
**This model was contributed by [*takuma104*](https://huggingface.co/takuma104)**
## Model Details
- **Developed by:** Lvmin Zhang, Maneesh Agrawala
- **Model type:** Diffusion-based text-to-image generation model
- **Language(s):** English
- **License:** [The CreativeML OpenRAIL M license](https://huggingface.co/spaces/CompVis/stable-diffusion-license) is an [Open RAIL M license](https://www.licenses.ai/blog/2022/8/18/naming-convention-of-responsible-ai-licenses), adapted from the work that [BigScience](https://bigscience.huggingface.co/) and [the RAIL Initiative](https://www.licenses.ai/) are jointly carrying in the area of responsible AI licensing. See also [the article about the BLOOM Open RAIL license](https://bigscience.huggingface.co/blog/the-bigscience-rail-license) on which our license is based.
- **Resources for more information:** [GitHub Repository](https://github.com/lllyasviel/ControlNet), [Paper](https://arxiv.org/abs/2302.05543).
- **Cite as:**
@misc{zhang2023adding,
title={Adding Conditional Control to Text-to-Image Diffusion Models},
author={Lvmin Zhang and Maneesh Agrawala},
year={2023},
eprint={2302.05543},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
## Introduction
Controlnet was proposed in [*Adding Conditional Control to Text-to-Image Diffusion Models*](https://arxiv.org/abs/2302.05543) by
Lvmin Zhang, Maneesh Agrawala.
The abstract reads as follows:
*We present a neural network structure, ControlNet, to control pretrained large diffusion models to support additional input conditions.
The ControlNet learns task-specific conditions in an end-to-end way, and the learning is robust even when the training dataset is small (< 50k).
Moreover, training a ControlNet is as fast as fine-tuning a diffusion model, and the model can be trained on personal devices.
Alternatively, if powerful computation clusters are available, the model can scale to large amounts (millions to billions) of data.
We report that large diffusion models like Stable Diffusion can be augmented with ControlNets to enable conditional inputs like edge maps, segmentation maps, keypoints, etc.
This may enrich the methods to control large diffusion models and further facilitate related applications.*
## Example
It is recommended to use the checkpoint with [Stable Diffusion v1-5](https://huggingface.co/runwayml/stable-diffusion-v1-5) as the checkpoint has been trained on it.
Experimentally, the checkpoint can be used with other diffusion models such as dreamboothed stable diffusion.
1. Let's install `diffusers` and related packages:
```
$ pip install diffusers transformers accelerate
```
2. Run code:
```python
import torch
from PIL import Image
from diffusers import ControlNetModel, DiffusionPipeline
from diffusers.utils import load_image
def resize_for_condition_image(input_image: Image.Image, resolution: int):
input_image = input_image.convert("RGB")
W, H = input_image.size
k = float(resolution) / min(H, W)
H *= k
W *= k
H = int(round(H / 64.0)) * 64
W = int(round(W / 64.0)) * 64
img = input_image.resize((W, H), resample=Image.LANCZOS)
return img
controlnet = ControlNetModel.from_pretrained('lllyasviel/control_v11f1e_sd15_tile',
torch_dtype=torch.float16)
pipe = DiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5",
custom_pipeline="stable_diffusion_controlnet_img2img",
controlnet=controlnet,
torch_dtype=torch.float16).to('cuda')
pipe.enable_xformers_memory_efficient_attention()
source_image = load_image('https://huggingface.co/lllyasviel/control_v11f1e_sd15_tile/resolve/main/images/original.png')
condition_image = resize_for_condition_image(source_image, 1024)
image = pipe(prompt="best quality",
negative_prompt="blur, lowres, bad anatomy, bad hands, cropped, worst quality",
image=condition_image,
controlnet_conditioning_image=condition_image,
width=condition_image.size[0],
height=condition_image.size[1],
strength=1.0,
generator=torch.manual_seed(0),
num_inference_steps=32,
).images[0]
image.save('output.png')
```
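The tile model is commonly applied to high-resolution upscaling by running the pipeline over overlapping tiles of a large image and blending the results. As a purely illustrative sketch (not part of the original example; the tile size and overlap values are arbitrary assumptions), computing the overlapping tile boxes might look like:

```python
def tile_boxes(width, height, tile_size=512, overlap=64):
    """Compute overlapping (left, top, right, bottom) boxes covering a width x height image."""
    stride = tile_size - overlap
    boxes = []
    for top in range(0, max(height - overlap, 1), stride):
        for left in range(0, max(width - overlap, 1), stride):
            boxes.append((left, top,
                          min(left + tile_size, width),
                          min(top + tile_size, height)))
    return boxes

# e.g. a 1024x768 image yields a 3x2 grid of overlapping tiles
boxes = tile_boxes(1024, 768)
```

Each box could then be cropped with `image.crop(box)`, passed through the pipeline above, and pasted back with feathered blending in the overlap regions.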


## Other released checkpoints v1-1
The authors released 14 different checkpoints, each trained with [Stable Diffusion v1-5](https://huggingface.co/runwayml/stable-diffusion-v1-5)
on a different type of conditioning:
| Model Name | Control Image Overview| Condition Image | Control Image Example | Generated Image Example |
|---|---|---|---|---|
|[lllyasviel/control_v11p_sd15_canny](https://huggingface.co/lllyasviel/control_v11p_sd15_canny)<br/> | *Trained with canny edge detection* | A monochrome image with white edges on a black background.|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_canny/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11p_sd15_canny/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_canny/resolve/main/images/image_out.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11p_sd15_canny/resolve/main/images/image_out.png"/></a>|
|[lllyasviel/control_v11e_sd15_ip2p](https://huggingface.co/lllyasviel/control_v11e_sd15_ip2p)<br/> | *Trained with pixel to pixel instruction* | No condition.|<a href="https://huggingface.co/lllyasviel/control_v11e_sd15_ip2p/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11e_sd15_ip2p/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11e_sd15_ip2p/resolve/main/images/image_out.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11e_sd15_ip2p/resolve/main/images/image_out.png"/></a>|
|[lllyasviel/control_v11p_sd15_inpaint](https://huggingface.co/lllyasviel/control_v11p_sd15_inpaint)<br/> | Trained with image inpainting | No condition.|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_inpaint/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11p_sd15_inpaint/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_inpaint/resolve/main/images/output.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11p_sd15_inpaint/resolve/main/images/output.png"/></a>|
|[lllyasviel/control_v11p_sd15_mlsd](https://huggingface.co/lllyasviel/control_v11p_sd15_mlsd)<br/> | Trained with multi-level line segment detection | An image with annotated line segments.|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_mlsd/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11p_sd15_mlsd/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_mlsd/resolve/main/images/image_out.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11p_sd15_mlsd/resolve/main/images/image_out.png"/></a>|
|[lllyasviel/control_v11f1p_sd15_depth](https://huggingface.co/lllyasviel/control_v11f1p_sd15_depth)<br/> | Trained with depth estimation | An image with depth information, usually represented as a grayscale image.|<a href="https://huggingface.co/lllyasviel/control_v11f1p_sd15_depth/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11f1p_sd15_depth/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11f1p_sd15_depth/resolve/main/images/image_out.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11f1p_sd15_depth/resolve/main/images/image_out.png"/></a>|
|[lllyasviel/control_v11p_sd15_normalbae](https://huggingface.co/lllyasviel/control_v11p_sd15_normalbae)<br/> | Trained with surface normal estimation | An image with surface normal information, usually represented as a color-coded image.|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_normalbae/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11p_sd15_normalbae/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_normalbae/resolve/main/images/image_out.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11p_sd15_normalbae/resolve/main/images/image_out.png"/></a>|
|[lllyasviel/control_v11p_sd15_seg](https://huggingface.co/lllyasviel/control_v11p_sd15_seg)<br/> | Trained with image segmentation | An image with segmented regions, usually represented as a color-coded image.|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_seg/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11p_sd15_seg/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_seg/resolve/main/images/image_out.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11p_sd15_seg/resolve/main/images/image_out.png"/></a>|
|[lllyasviel/control_v11p_sd15_lineart](https://huggingface.co/lllyasviel/control_v11p_sd15_lineart)<br/> | Trained with line art generation | An image with line art, usually black lines on a white background.|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_lineart/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11p_sd15_lineart/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_lineart/resolve/main/images/image_out.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11p_sd15_lineart/resolve/main/images/image_out.png"/></a>|
|[lllyasviel/control_v11p_sd15s2_lineart_anime](https://huggingface.co/lllyasviel/control_v11p_sd15s2_lineart_anime)<br/> | Trained with anime line art generation | An image with anime-style line art.|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15s2_lineart_anime/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11p_sd15s2_lineart_anime/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15s2_lineart_anime/resolve/main/images/image_out.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11p_sd15s2_lineart_anime/resolve/main/images/image_out.png"/></a>|
|[lllyasviel/control_v11p_sd15_openpose](https://huggingface.co/lllyasviel/control_v11p_sd15_openpose)<br/> | Trained with human pose estimation | An image with human poses, usually represented as a set of keypoints or skeletons.|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_openpose/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11p_sd15_openpose/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_openpose/resolve/main/images/image_out.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11p_sd15_openpose/resolve/main/images/image_out.png"/></a>|
|[lllyasviel/control_v11p_sd15_scribble](https://huggingface.co/lllyasviel/control_v11p_sd15_scribble)<br/> | Trained with scribble-based image generation | An image with scribbles, usually random or user-drawn strokes.|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_scribble/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11p_sd15_scribble/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_scribble/resolve/main/images/image_out.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11p_sd15_scribble/resolve/main/images/image_out.png"/></a>|
|[lllyasviel/control_v11p_sd15_softedge](https://huggingface.co/lllyasviel/control_v11p_sd15_softedge)<br/> | Trained with soft edge image generation | An image with soft edges, usually to create a more painterly or artistic effect.|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_softedge/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11p_sd15_softedge/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11p_sd15_softedge/resolve/main/images/image_out.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11p_sd15_softedge/resolve/main/images/image_out.png"/></a>|
|[lllyasviel/control_v11e_sd15_shuffle](https://huggingface.co/lllyasviel/control_v11e_sd15_shuffle)<br/> | Trained with image shuffling | An image with shuffled patches or regions.|<a href="https://huggingface.co/lllyasviel/control_v11e_sd15_shuffle/resolve/main/images/control.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11e_sd15_shuffle/resolve/main/images/control.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11e_sd15_shuffle/resolve/main/images/image_out.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11e_sd15_shuffle/resolve/main/images/image_out.png"/></a>|
|[lllyasviel/control_v11f1e_sd15_tile](https://huggingface.co/lllyasviel/control_v11f1e_sd15_tile)<br/> | Trained with image tiling | A blurry image or part of an image.|<a href="https://huggingface.co/lllyasviel/control_v11f1e_sd15_tile/resolve/main/images/original.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/lllyasviel/control_v11f1e_sd15_tile/resolve/main/images/original.png"/></a>|<a href="https://huggingface.co/lllyasviel/control_v11f1e_sd15_tile/resolve/main/images/output.png"><img width="64" src="https://huggingface.co/lllyasviel/control_v11f1e_sd15_tile/resolve/main/images/output.png"/></a>|
## More information
For more information, please also have a look at the [Diffusers ControlNet Blog Post](https://huggingface.co/blog/controlnet) and have a look at the [official docs](https://github.com/lllyasviel/ControlNet-v1-1-nightly). | 15,797 | [
[
-0.04229736328125,
-0.041107177734375,
0.0132598876953125,
0.042022705078125,
-0.019256591796875,
-0.021820068359375,
0.001758575439453125,
-0.035430908203125,
0.040985107421875,
0.0195770263671875,
-0.051483154296875,
-0.0330810546875,
-0.05975341796875,
-0.0175018310546875,
-0.01007080078125,
0.06048583984375,
-0.0253448486328125,
0.00206756591796875,
0.0081024169921875,
-0.006580352783203125,
-0.00931549072265625,
-0.0056915283203125,
-0.09466552734375,
-0.0305938720703125,
0.03961181640625,
0.0032634735107421875,
0.041534423828125,
0.04425048828125,
0.03887939453125,
0.0284576416015625,
-0.02423095703125,
0.00801849365234375,
-0.0241851806640625,
-0.017547607421875,
0.01232147216796875,
-0.01255035400390625,
-0.0521240234375,
0.00980377197265625,
0.054229736328125,
0.0202484130859375,
-0.003658294677734375,
-0.0164642333984375,
0.009674072265625,
0.049835205078125,
-0.0391845703125,
-0.00655364990234375,
-0.00914764404296875,
0.022705078125,
-0.00440216064453125,
0.0036640167236328125,
-0.01305389404296875,
-0.0246429443359375,
0.0024585723876953125,
-0.060394287109375,
-0.0016870498657226562,
-0.01383209228515625,
0.1014404296875,
0.02349853515625,
-0.028839111328125,
-0.00791168212890625,
-0.0226287841796875,
0.05029296875,
-0.05999755859375,
0.00786590576171875,
0.01099395751953125,
0.0220489501953125,
-0.0196380615234375,
-0.0799560546875,
-0.036651611328125,
-0.01358795166015625,
-0.0058746337890625,
0.036712646484375,
-0.02484130859375,
0.0062713623046875,
0.0199127197265625,
0.016632080078125,
-0.027984619140625,
0.0216827392578125,
-0.0242462158203125,
-0.030059814453125,
0.047637939453125,
0.0056610107421875,
0.04443359375,
0.005435943603515625,
-0.046478271484375,
-0.006114959716796875,
-0.0321044921875,
0.024505615234375,
0.01215362548828125,
-0.00930023193359375,
-0.054473876953125,
0.0298004150390625,
-0.0009188652038574219,
0.054443359375,
0.031768798828125,
-0.018890380859375,
0.032073974609375,
-0.0183258056640625,
-0.02886962890625,
-0.0223846435546875,
0.0792236328125,
0.03985595703125,
0.0051727294921875,
-0.0007801055908203125,
-0.0117645263671875,
-0.00841522216796875,
-0.00010859966278076172,
-0.09716796875,
-0.01416015625,
0.0179595947265625,
-0.043914794921875,
-0.0232696533203125,
-0.013214111328125,
-0.055084228515625,
-0.018218994140625,
-0.0034656524658203125,
0.0299224853515625,
-0.042633056640625,
-0.033233642578125,
0.0133514404296875,
-0.0302734375,
0.034637451171875,
0.05108642578125,
-0.034332275390625,
0.01447296142578125,
0.00989532470703125,
0.0792236328125,
-0.0189056396484375,
-0.0111083984375,
-0.0236053466796875,
-0.00905609130859375,
-0.027496337890625,
0.035858154296875,
-0.01389312744140625,
-0.0110626220703125,
-0.007167816162109375,
0.0265655517578125,
-0.00794219970703125,
-0.0305328369140625,
0.0268402099609375,
-0.0321044921875,
0.01406097412109375,
-0.0033168792724609375,
-0.0282440185546875,
-0.0106353759765625,
0.0174560546875,
-0.036224365234375,
0.05535888671875,
0.016754150390625,
-0.075439453125,
0.030426025390625,
-0.04010009765625,
-0.01331329345703125,
-0.0178680419921875,
0.007843017578125,
-0.058319091796875,
-0.032501220703125,
0.0007734298706054688,
0.043853759765625,
-0.0002359151840209961,
-0.005893707275390625,
-0.031463623046875,
-0.0015621185302734375,
0.005672454833984375,
-0.0107421875,
0.08953857421875,
0.01462554931640625,
-0.0477294921875,
0.0168609619140625,
-0.055206298828125,
0.002124786376953125,
0.00921630859375,
-0.023284912109375,
0.007354736328125,
-0.0215301513671875,
0.0087738037109375,
0.050384521484375,
0.0278167724609375,
-0.052398681640625,
0.01055908203125,
-0.01226043701171875,
0.036865234375,
0.049346923828125,
0.01425933837890625,
0.0458984375,
-0.036224365234375,
0.04425048828125,
0.025665283203125,
0.020172119140625,
0.0025539398193359375,
-0.039642333984375,
-0.07244873046875,
-0.04632568359375,
0.007228851318359375,
0.040435791015625,
-0.061248779296875,
0.051971435546875,
0.005352020263671875,
-0.0521240234375,
-0.020751953125,
0.004467010498046875,
0.037109375,
0.03778076171875,
0.0245513916015625,
-0.040252685546875,
-0.0227508544921875,
-0.06689453125,
0.0199127197265625,
0.0196533203125,
-0.0010423660278320312,
0.019012451171875,
0.0487060546875,
-0.006618499755859375,
0.04974365234375,
-0.019805908203125,
-0.0298919677734375,
-0.002166748046875,
-0.00528717041015625,
0.02044677734375,
0.07794189453125,
0.057098388671875,
-0.05914306640625,
-0.047027587890625,
-0.0013675689697265625,
-0.061920166015625,
-0.0017080307006835938,
-0.02142333984375,
-0.03631591796875,
0.01445770263671875,
0.0474853515625,
-0.051025390625,
0.063232421875,
0.04241943359375,
-0.041259765625,
0.047760009765625,
-0.03509521484375,
0.01312255859375,
-0.07684326171875,
0.0160369873046875,
0.0298919677734375,
-0.028076171875,
-0.04296875,
0.01044464111328125,
0.01515960693359375,
0.00637054443359375,
-0.05340576171875,
0.052276611328125,
-0.038177490234375,
0.01497650146484375,
-0.023895263671875,
-0.006237030029296875,
0.00420379638671875,
0.049560546875,
0.01558685302734375,
0.0406494140625,
0.07135009765625,
-0.04974365234375,
0.0278167724609375,
0.0277099609375,
-0.020721435546875,
0.0614013671875,
-0.063720703125,
0.006610870361328125,
-0.01348114013671875,
0.04248046875,
-0.0792236328125,
-0.0194854736328125,
0.04296875,
-0.03717041015625,
0.0445556640625,
-0.0171966552734375,
-0.01922607421875,
-0.031707763671875,
-0.024078369140625,
0.017486572265625,
0.054351806640625,
-0.038787841796875,
0.0259857177734375,
0.01409149169921875,
0.01513671875,
-0.0390625,
-0.0712890625,
-0.0024700164794921875,
-0.0261993408203125,
-0.059722900390625,
0.036865234375,
-0.0156402587890625,
0.00643157958984375,
0.0011348724365234375,
0.004642486572265625,
-0.0201416015625,
-0.0025501251220703125,
0.0285186767578125,
0.0144195556640625,
-0.00925445556640625,
-0.01291656494140625,
0.0094451904296875,
-0.01346588134765625,
-0.005092620849609375,
-0.02850341796875,
0.03570556640625,
0.006587982177734375,
-0.0160369873046875,
-0.072998046875,
0.0164031982421875,
0.045196533203125,
-0.0013170242309570312,
0.07159423828125,
0.06292724609375,
-0.0304718017578125,
-0.002185821533203125,
-0.02734375,
-0.0094146728515625,
-0.038482666015625,
0.003475189208984375,
-0.01538848876953125,
-0.055816650390625,
0.057403564453125,
0.0020847320556640625,
-0.002490997314453125,
0.04205322265625,
0.034149169921875,
-0.015716552734375,
0.07025146484375,
0.03948974609375,
-0.0059814453125,
0.058135986328125,
-0.058441162109375,
-0.01163482666015625,
-0.07342529296875,
-0.0247039794921875,
-0.0236053466796875,
-0.054443359375,
-0.023651123046875,
-0.0318603515625,
0.0311126708984375,
0.0297393798828125,
-0.055084228515625,
0.0347900390625,
-0.04241943359375,
0.00850677490234375,
0.0285186767578125,
0.043975830078125,
-0.0127716064453125,
-0.0080413818359375,
-0.0173187255859375,
0.0009617805480957031,
-0.05291748046875,
-0.0187225341796875,
0.04315185546875,
0.041107177734375,
0.045135498046875,
-0.00424957275390625,
0.042083740234375,
0.005462646484375,
0.0207672119140625,
-0.0435791015625,
0.04156494140625,
0.0010824203491210938,
-0.04071044921875,
-0.01473236083984375,
-0.0280609130859375,
-0.0828857421875,
0.006130218505859375,
-0.035064697265625,
-0.049407958984375,
0.0299835205078125,
0.017425537109375,
-0.01160430908203125,
0.041259765625,
-0.05126953125,
0.0570068359375,
0.0034427642822265625,
-0.04815673828125,
0.00331878662109375,
-0.061798095703125,
0.01174163818359375,
0.014617919921875,
-0.01271820068359375,
0.00567626953125,
-0.01277923583984375,
0.064697265625,
-0.062347412109375,
0.0670166015625,
-0.04254150390625,
-0.00420379638671875,
0.0252532958984375,
0.0030498504638671875,
0.04205322265625,
-0.01174163818359375,
-0.0114593505859375,
0.004413604736328125,
-0.0083465576171875,
-0.0474853515625,
-0.031494140625,
0.050628662109375,
-0.0552978515625,
-0.0171051025390625,
-0.012786865234375,
-0.025421142578125,
0.0191192626953125,
0.0205230712890625,
0.051513671875,
0.0321044921875,
0.0151519775390625,
0.00438690185546875,
0.050384521484375,
-0.0290985107421875,
0.049407958984375,
0.004413604736328125,
-0.0089874267578125,
-0.040252685546875,
0.05389404296875,
0.007526397705078125,
0.032318115234375,
0.01284027099609375,
0.009033203125,
-0.01505279541015625,
-0.04345703125,
-0.03387451171875,
0.032958984375,
-0.046142578125,
-0.03466796875,
-0.0419921875,
-0.03466796875,
-0.026031494140625,
-0.0352783203125,
-0.0238189697265625,
-0.02252197265625,
-0.053070068359375,
0.01119232177734375,
0.041748046875,
0.039825439453125,
-0.02166748046875,
0.043975830078125,
-0.02227783203125,
0.01690673828125,
0.023406982421875,
0.032073974609375,
-0.00030231475830078125,
-0.04998779296875,
-0.005168914794921875,
0.0111236572265625,
-0.033935546875,
-0.055755615234375,
0.04058837890625,
0.001544952392578125,
0.034271240234375,
0.041412353515625,
-0.0203857421875,
0.052276611328125,
-0.027557373046875,
0.047119140625,
0.046173095703125,
-0.056060791015625,
0.038543701171875,
-0.031585693359375,
0.0214691162109375,
0.0213470458984375,
0.04217529296875,
-0.03387451171875,
-0.0258026123046875,
-0.056793212890625,
-0.0511474609375,
0.045135498046875,
0.01800537109375,
-0.002063751220703125,
0.0206451416015625,
0.052154541015625,
-0.0258026123046875,
0.00720977783203125,
-0.05975341796875,
-0.035125732421875,
-0.017333984375,
0.00591278076171875,
0.0019521713256835938,
0.001827239990234375,
-0.0067138671875,
-0.0347900390625,
0.06610107421875,
0.0013990402221679688,
0.044769287109375,
0.03839111328125,
0.00963592529296875,
-0.01506805419921875,
-0.025634765625,
0.046539306640625,
0.04010009765625,
-0.007175445556640625,
-0.017822265625,
0.0030651092529296875,
-0.037200927734375,
0.0184326171875,
-0.005340576171875,
-0.0239105224609375,
-0.00998687744140625,
0.0229644775390625,
0.06341552734375,
-0.0159149169921875,
-0.00945281982421875,
0.0584716796875,
0.00104522705078125,
-0.0460205078125,
-0.01641845703125,
0.003047943115234375,
0.01092529296875,
0.037261962890625,
0.00981903076171875,
0.033935546875,
0.0018720626831054688,
-0.00571441650390625,
0.0220489501953125,
0.042388916015625,
-0.045379638671875,
-0.0110321044921875,
0.0628662109375,
-0.0003871917724609375,
-0.0084228515625,
0.031280517578125,
-0.033111572265625,
-0.0465087890625,
0.07501220703125,
0.0433349609375,
0.051788330078125,
-0.0083465576171875,
0.02606201171875,
0.05712890625,
0.0174560546875,
0.003185272216796875,
0.0108642578125,
0.00801849365234375,
-0.056915283203125,
-0.036163330078125,
-0.0267791748046875,
-0.0031375885009765625,
0.0131988525390625,
-0.033935546875,
0.0291748046875,
-0.062744140625,
-0.0176544189453125,
-0.010040283203125,
0.0101165771484375,
-0.050689697265625,
0.03375244140625,
0.007091522216796875,
0.0982666015625,
-0.06243896484375,
0.0631103515625,
0.045135498046875,
-0.035919189453125,
-0.07086181640625,
-0.0086669921875,
-0.00457000732421875,
-0.0628662109375,
0.04986572265625,
0.008758544921875,
-0.0028553009033203125,
0.00734710693359375,
-0.0693359375,
-0.044219970703125,
0.10113525390625,
0.020172119140625,
-0.0178070068359375,
0.0106048583984375,
-0.033111572265625,
0.03485107421875,
-0.033203125,
0.03631591796875,
0.0283203125,
0.03753662109375,
0.03399658203125,
-0.05645751953125,
0.0199737548828125,
-0.035308837890625,
0.01326751708984375,
0.010040283203125,
-0.077392578125,
0.0654296875,
-0.004428863525390625,
-0.0121002197265625,
0.021026611328125,
0.05792236328125,
0.0189208984375,
0.0095977783203125,
0.046417236328125,
0.06689453125,
0.030242919921875,
-0.00759124755859375,
0.0771484375,
-0.00528717041015625,
0.0248260498046875,
0.048583984375,
0.02215576171875,
0.03887939453125,
0.027252197265625,
-0.006450653076171875,
0.0408935546875,
0.06671142578125,
0.004062652587890625,
0.039703369140625,
0.03662109375,
-0.0187225341796875,
-0.01251983642578125,
0.00028014183044433594,
-0.0290985107421875,
0.00296783447265625,
0.0235595703125,
-0.024444580078125,
-0.0234832763671875,
0.0218963623046875,
0.0218963623046875,
-0.01412200927734375,
-0.035430908203125,
0.04541015625,
-0.0092620849609375,
-0.03643798828125,
0.056549072265625,
-0.005420684814453125,
0.08270263671875,
-0.05419921875,
0.001056671142578125,
-0.0236663818359375,
0.00408935546875,
-0.0306396484375,
-0.06561279296875,
0.0105133056640625,
-0.0178070068359375,
0.016845703125,
-0.02655029296875,
0.050384521484375,
-0.02899169921875,
-0.0301971435546875,
0.03662109375,
0.006328582763671875,
0.031768798828125,
0.01477813720703125,
-0.08123779296875,
0.01401519775390625,
0.00926971435546875,
-0.0352783203125,
0.0202484130859375,
0.0313720703125,
0.014801025390625,
0.055206298828125,
0.02581787109375,
0.0291900634765625,
0.0244598388671875,
-0.0201263427734375,
0.08160400390625,
-0.0214691162109375,
-0.023345947265625,
-0.04376220703125,
0.060394287109375,
-0.022369384765625,
-0.03375244140625,
0.04449462890625,
0.02386474609375,
0.0540771484375,
-0.0054168701171875,
0.05072021484375,
-0.0308990478515625,
0.01038360595703125,
-0.049560546875,
0.06671142578125,
-0.06658935546875,
-0.01837158203125,
-0.0229644775390625,
-0.05224609375,
-0.0290069580078125,
0.0648193359375,
-0.0147552490234375,
0.018310546875,
0.040069580078125,
0.08154296875,
-0.0183868408203125,
-0.044647216796875,
0.00974273681640625,
0.0095367431640625,
0.0258026123046875,
0.05517578125,
0.049530029296875,
-0.05084228515625,
0.022369384765625,
-0.038421630859375,
-0.03643798828125,
-0.003284454345703125,
-0.07379150390625,
-0.061279296875,
-0.05517578125,
-0.052886962890625,
-0.0584716796875,
-0.022064208984375,
0.050689697265625,
0.089111328125,
-0.051605224609375,
-0.00923919677734375,
-0.0262451171875,
0.0037097930908203125,
-0.0087432861328125,
-0.0149078369140625,
0.031463623046875,
-0.01012420654296875,
-0.06585693359375,
-0.0004029273986816406,
0.017333984375,
0.038360595703125,
-0.006908416748046875,
-0.032867431640625,
-0.0279998779296875,
-0.020050048828125,
0.019195556640625,
0.034454345703125,
-0.029022216796875,
-0.0145416259765625,
-0.0218963623046875,
-0.022064208984375,
0.01435089111328125,
0.038665771484375,
-0.036376953125,
0.02178955078125,
0.0391845703125,
0.033111572265625,
0.06451416015625,
-0.01538848876953125,
0.00860595703125,
-0.042510986328125,
0.039703369140625,
-0.0000642538070678711,
0.034881591796875,
0.00508880615234375,
-0.02484130859375,
0.03216552734375,
0.0204925537109375,
-0.05963134765625,
-0.0292510986328125,
0.00908660888671875,
-0.1019287109375,
-0.01222991943359375,
0.0751953125,
-0.0307769775390625,
-0.03021240234375,
0.01328277587890625,
-0.036712646484375,
0.0259246826171875,
-0.0242462158203125,
0.0197601318359375,
0.0291595458984375,
-0.00818634033203125,
-0.0307769775390625,
-0.0303192138671875,
0.045074462890625,
0.021148681640625,
-0.058013916015625,
-0.042724609375,
0.042327880859375,
0.02874755859375,
0.0276031494140625,
0.06756591796875,
-0.0131072998046875,
0.01416015625,
-0.003910064697265625,
0.02020263671875,
0.0017852783203125,
-0.01080322265625,
-0.042816162109375,
-0.004947662353515625,
-0.017181396484375,
-0.033111572265625
]
] |
timm/eca_botnext26ts_256.c1_in1k | 2023-04-26T16:09:31.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"arxiv:2110.00476",
"arxiv:2101.11605",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/eca_botnext26ts_256.c1_in1k | 0 | 36,454 | timm | 2023-04-26T16:09:12 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
datasets:
- imagenet-1k
---
# Model card for eca_botnext26ts_256.c1_in1k
A BotNet image classification model (with Efficient Channel Attention, based on the ResNeXt architecture). Trained on ImageNet-1k in `timm` by Ross Wightman.
NOTE: this model did not adhere to any specific paper configuration; it was tuned for reasonable training times and a reduced frequency of self-attention blocks.
Recipe details:
* Based on [ResNet Strikes Back](https://arxiv.org/abs/2110.00476) `C` recipes
* SGD (w/ Nesterov) optimizer and AGC (adaptive gradient clipping).
* Cosine LR schedule with warmup
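The cosine LR schedule with warmup mentioned above can be sketched in a few lines. This is a minimal stdlib illustration of the schedule's shape (the function name, linear warmup form, and `min_lr` parameter are assumptions for illustration, not `timm`'s exact scheduler implementation):

```python
import math

def cosine_lr_with_warmup(step, total_steps, warmup_steps, base_lr, min_lr=0.0):
    """Linear warmup to base_lr, then cosine annealing down to min_lr."""
    if step < warmup_steps:
        # Linear warmup phase: ramp from base_lr/warmup_steps up to base_lr.
        return base_lr * (step + 1) / warmup_steps
    # Cosine annealing phase: progress goes 0 -> 1 over the remaining steps.
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return min_lr + 0.5 * (base_lr - min_lr) * (1.0 + math.cos(math.pi * progress))
```

In practice `timm`'s training scripts construct an equivalent schedule via their scheduler factory; the sketch is only meant to show the warmup-then-cosine shape.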
This model architecture is implemented using `timm`'s flexible [BYOBNet (Bring-Your-Own-Blocks Network)](https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/byobnet.py).
BYOB (with BYOANet attention specific blocks) allows configuration of:
* block / stage layout
* block-type interleaving
* stem layout
* output stride (dilation)
* activation and norm layers
* channel and spatial / self-attention layers
...and also includes `timm` features common to many other architectures, including:
* stochastic depth
* gradient checkpointing
* layer-wise LR decay
* per-stage feature extraction
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 10.6
- GMACs: 2.5
- Activations (M): 11.6
- Image size: 256 x 256
- **Papers:**
- Bottleneck Transformers for Visual Recognition: https://arxiv.org/abs/2101.11605
- ResNet strikes back: An improved training procedure in timm: https://arxiv.org/abs/2110.00476
- **Dataset:** ImageNet-1k
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('eca_botnext26ts_256.c1_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
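The last line of the example combines a softmax over the logits with a top-k selection. A pure-Python sketch of what those two operations compute (illustrative stdlib code, not the `torch` implementation):

```python
import math

def softmax(logits):
    """Numerically stable softmax over a list of logits."""
    m = max(logits)  # subtract the max to avoid overflow in exp
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def topk(values, k):
    """Return (values, indices) of the k largest entries, mirroring torch.topk."""
    order = sorted(range(len(values)), key=lambda i: values[i], reverse=True)[:k]
    return [values[i] for i in order], order
```

Applied to the model's output logits, `topk(softmax(logits), 5)` yields the top-5 class probabilities and their indices, matching the tensor version above.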
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'eca_botnext26ts_256.c1_in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g.:
# torch.Size([1, 64, 128, 128])
# torch.Size([1, 256, 64, 64])
# torch.Size([1, 512, 32, 32])
# torch.Size([1, 1024, 16, 16])
# torch.Size([1, 2048, 8, 8])
print(o.shape)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'eca_botnext26ts_256.c1_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 2048, 8, 8) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
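A common use of such pre-logits embeddings is comparing images by cosine similarity. A minimal stdlib sketch (the helper below is illustrative and not part of `timm`; in practice you would apply it to two `(1, num_features)` outputs from the block above):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)
```

Values near 1.0 indicate visually similar images under the model's feature space; near 0.0, unrelated ones.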
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
```bibtex
@article{Srinivas2021BottleneckTF,
title={Bottleneck Transformers for Visual Recognition},
author={A. Srinivas and Tsung-Yi Lin and Niki Parmar and Jonathon Shlens and P. Abbeel and Ashish Vaswani},
journal={2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
year={2021},
pages={16514-16524}
}
```
```bibtex
@inproceedings{wightman2021resnet,
title={ResNet strikes back: An improved training procedure in timm},
author={Wightman, Ross and Touvron, Hugo and Jegou, Herve},
booktitle={NeurIPS 2021 Workshop on ImageNet: Past, Present, and Future},
year={2021}
}
```
| 5,385 | [
[
-0.039398193359375,
-0.038909912109375,
0.0034236907958984375,
0.005126953125,
-0.0201873779296875,
-0.0222015380859375,
-0.0197601318359375,
-0.032684326171875,
0.0195770263671875,
0.03778076171875,
-0.040191650390625,
-0.050262451171875,
-0.051025390625,
-0.003314971923828125,
-0.012786865234375,
0.0660400390625,
-0.006076812744140625,
-0.004940032958984375,
-0.01116180419921875,
-0.043609619140625,
-0.021453857421875,
-0.0281982421875,
-0.0718994140625,
-0.027313232421875,
0.031097412109375,
0.01354217529296875,
0.0308074951171875,
0.04827880859375,
0.050506591796875,
0.03814697265625,
-0.006923675537109375,
0.007080078125,
-0.015594482421875,
-0.01467132568359375,
0.0196075439453125,
-0.047088623046875,
-0.035400390625,
0.017730712890625,
0.04388427734375,
0.019317626953125,
0.00041961669921875,
0.03399658203125,
0.002056121826171875,
0.046142578125,
-0.0169830322265625,
0.004337310791015625,
-0.040557861328125,
0.01416015625,
-0.01386260986328125,
0.0002008676528930664,
-0.029296875,
-0.017547607421875,
0.020172119140625,
-0.03717041015625,
0.036895751953125,
0.0090789794921875,
0.10089111328125,
0.019927978515625,
0.00260162353515625,
0.002315521240234375,
-0.0191192626953125,
0.061920166015625,
-0.054290771484375,
0.0241546630859375,
0.0130767822265625,
0.0186004638671875,
-0.005611419677734375,
-0.07476806640625,
-0.043701171875,
-0.0113067626953125,
-0.005237579345703125,
0.00653076171875,
-0.02154541015625,
-0.0005950927734375,
0.02618408203125,
0.01561737060546875,
-0.034637451171875,
0.008697509765625,
-0.042205810546875,
-0.01271820068359375,
0.03631591796875,
0.00019931793212890625,
0.0239410400390625,
-0.019378662109375,
-0.041961669921875,
-0.032379150390625,
-0.02899169921875,
0.02044677734375,
0.0201416015625,
0.01568603515625,
-0.04730224609375,
0.0225372314453125,
0.01336669921875,
0.0413818359375,
0.003421783447265625,
-0.0177001953125,
0.042572021484375,
0.0054931640625,
-0.031097412109375,
-0.0164947509765625,
0.08050537109375,
0.0251312255859375,
0.022125244140625,
0.00832366943359375,
-0.01464080810546875,
-0.0212554931640625,
0.005886077880859375,
-0.07855224609375,
-0.0266876220703125,
0.0271453857421875,
-0.050506591796875,
-0.028106689453125,
0.017120361328125,
-0.050811767578125,
-0.0054931640625,
-0.003681182861328125,
0.03900146484375,
-0.04296875,
-0.0243072509765625,
-0.0003268718719482422,
-0.01715087890625,
0.025604248046875,
0.019744873046875,
-0.035247802734375,
0.01715087890625,
0.0216522216796875,
0.0863037109375,
0.003139495849609375,
-0.02978515625,
-0.0156402587890625,
-0.0276336669921875,
-0.013946533203125,
0.042449951171875,
-0.004444122314453125,
-0.0021343231201171875,
-0.024505615234375,
0.02996826171875,
-0.00856781005859375,
-0.055389404296875,
0.0109100341796875,
-0.023895263671875,
0.0177001953125,
0.002471923828125,
-0.0180206298828125,
-0.04119873046875,
0.0274810791015625,
-0.039886474609375,
0.08465576171875,
0.0212249755859375,
-0.060943603515625,
0.0209808349609375,
-0.04302978515625,
-0.00849151611328125,
-0.01837158203125,
-0.0045623779296875,
-0.08648681640625,
-0.007049560546875,
0.018768310546875,
0.052825927734375,
-0.02459716796875,
-0.00644683837890625,
-0.04425048828125,
-0.0167999267578125,
0.032257080078125,
-0.00968170166015625,
0.07464599609375,
0.01123046875,
-0.033447265625,
0.018035888671875,
-0.04974365234375,
0.0109405517578125,
0.046051025390625,
-0.01287078857421875,
-0.0026912689208984375,
-0.04296875,
0.0148773193359375,
0.0257720947265625,
0.0086517333984375,
-0.045257568359375,
0.0145416259765625,
-0.0161895751953125,
0.039886474609375,
0.056976318359375,
-0.00930023193359375,
0.0177001953125,
-0.0304412841796875,
0.0244293212890625,
0.0214691162109375,
0.020721435546875,
-0.01050567626953125,
-0.040435791015625,
-0.0625,
-0.04278564453125,
0.03131103515625,
0.0306243896484375,
-0.036041259765625,
0.040771484375,
-0.0117950439453125,
-0.052459716796875,
-0.04058837890625,
0.00789642333984375,
0.040802001953125,
0.04107666015625,
0.0267486572265625,
-0.043487548828125,
-0.04486083984375,
-0.06890869140625,
0.0039825439453125,
0.005855560302734375,
0.0029048919677734375,
0.0291748046875,
0.050323486328125,
-0.004913330078125,
0.0511474609375,
-0.0256195068359375,
-0.0251312255859375,
-0.0227508544921875,
0.0090789794921875,
0.03033447265625,
0.058074951171875,
0.061492919921875,
-0.046478271484375,
-0.040985107421875,
-0.00833892822265625,
-0.06256103515625,
0.00974273681640625,
-0.0122528076171875,
-0.0110015869140625,
0.029693603515625,
0.0110626220703125,
-0.04107666015625,
0.04901123046875,
0.01702880859375,
-0.014923095703125,
0.0273895263671875,
-0.017364501953125,
0.01557159423828125,
-0.08538818359375,
0.00975799560546875,
0.031494140625,
-0.0198516845703125,
-0.02764892578125,
0.004852294921875,
0.004344940185546875,
-0.006572723388671875,
-0.039154052734375,
0.049957275390625,
-0.042877197265625,
-0.0175018310546875,
-0.01256561279296875,
-0.0083160400390625,
-0.003780364990234375,
0.0673828125,
0.0004317760467529297,
0.03173828125,
0.059295654296875,
-0.03607177734375,
0.03515625,
0.024688720703125,
-0.0092315673828125,
0.036773681640625,
-0.0548095703125,
0.0229644775390625,
0.001262664794921875,
0.0146026611328125,
-0.079345703125,
-0.01131439208984375,
0.0306243896484375,
-0.04937744140625,
0.051666259765625,
-0.035614013671875,
-0.03106689453125,
-0.046722412109375,
-0.0357666015625,
0.030548095703125,
0.05780029296875,
-0.06195068359375,
0.0380859375,
0.01776123046875,
0.0266876220703125,
-0.039276123046875,
-0.05615234375,
-0.0146331787109375,
-0.0310821533203125,
-0.047088623046875,
0.0251312255859375,
0.01009368896484375,
0.01053619384765625,
0.0082550048828125,
-0.0094146728515625,
-0.0010786056518554688,
-0.0084228515625,
0.043212890625,
0.0304107666015625,
-0.0146636962890625,
-0.009307861328125,
-0.03192138671875,
-0.005229949951171875,
-0.0008916854858398438,
-0.0213470458984375,
0.0523681640625,
-0.0197601318359375,
-0.01177978515625,
-0.07196044921875,
-0.0038051605224609375,
0.039337158203125,
-0.01267242431640625,
0.0574951171875,
0.0771484375,
-0.03875732421875,
-0.003314971923828125,
-0.039276123046875,
-0.0225372314453125,
-0.037384033203125,
0.03826904296875,
-0.0267486572265625,
-0.0290374755859375,
0.06732177734375,
-0.0023708343505859375,
0.001251220703125,
0.0479736328125,
0.02874755859375,
-0.010406494140625,
0.055694580078125,
0.05157470703125,
0.00862884521484375,
0.06158447265625,
-0.07232666015625,
-0.01776123046875,
-0.06964111328125,
-0.0341796875,
-0.0259246826171875,
-0.049957275390625,
-0.053192138671875,
-0.02313232421875,
0.033599853515625,
0.0030040740966796875,
-0.02978515625,
0.04290771484375,
-0.06536865234375,
0.0033416748046875,
0.0557861328125,
0.049072265625,
-0.0271453857421875,
0.021636962890625,
-0.009979248046875,
-0.0023097991943359375,
-0.0595703125,
-0.019866943359375,
0.09063720703125,
0.03863525390625,
0.04888916015625,
-0.0173187255859375,
0.055633544921875,
-0.009979248046875,
0.0338134765625,
-0.042144775390625,
0.048248291015625,
-0.00795745849609375,
-0.034149169921875,
-0.01357269287109375,
-0.030975341796875,
-0.0841064453125,
-0.0005574226379394531,
-0.019317626953125,
-0.06103515625,
0.00864410400390625,
0.01161956787109375,
-0.018646240234375,
0.059326171875,
-0.06158447265625,
0.0711669921875,
-0.01128387451171875,
-0.029449462890625,
-0.005157470703125,
-0.05419921875,
0.016998291015625,
0.01439666748046875,
-0.0205841064453125,
-0.00400543212890625,
0.01025390625,
0.080810546875,
-0.0474853515625,
0.0673828125,
-0.026702880859375,
0.0260162353515625,
0.039703369140625,
-0.01690673828125,
0.02685546875,
-0.01149749755859375,
0.0015783309936523438,
0.034454345703125,
-0.0012845993041992188,
-0.037322998046875,
-0.04351806640625,
0.03973388671875,
-0.076171875,
-0.03106689453125,
-0.0261383056640625,
-0.03436279296875,
0.018585205078125,
0.005725860595703125,
0.03875732421875,
0.052734375,
0.018035888671875,
0.013153076171875,
0.045074462890625,
-0.038970947265625,
0.03338623046875,
0.00017893314361572266,
-0.00742340087890625,
-0.0440673828125,
0.0660400390625,
0.0195465087890625,
0.01342010498046875,
0.016937255859375,
0.007080078125,
-0.02044677734375,
-0.0341796875,
-0.020599365234375,
0.03216552734375,
-0.046844482421875,
-0.04034423828125,
-0.042083740234375,
-0.03857421875,
-0.032867431640625,
-0.00409698486328125,
-0.03997802734375,
-0.0225067138671875,
-0.02130126953125,
0.0215606689453125,
0.056732177734375,
0.04144287109375,
-0.005710601806640625,
0.04412841796875,
-0.04107666015625,
0.0132904052734375,
0.00684356689453125,
0.03662109375,
-0.003498077392578125,
-0.0684814453125,
-0.017791748046875,
-0.01531219482421875,
-0.04022216796875,
-0.0606689453125,
0.040557861328125,
0.00731658935546875,
0.036773681640625,
0.02557373046875,
-0.0177764892578125,
0.05389404296875,
-0.00862884521484375,
0.032440185546875,
0.03192138671875,
-0.04449462890625,
0.047027587890625,
0.0032787322998046875,
0.0113983154296875,
0.0009355545043945312,
0.01995849609375,
-0.022125244140625,
-0.01013946533203125,
-0.0809326171875,
-0.051605224609375,
0.0703125,
0.006504058837890625,
-0.0066986083984375,
0.02117919921875,
0.053497314453125,
0.0025787353515625,
0.0031299591064453125,
-0.058746337890625,
-0.03436279296875,
-0.019866943359375,
-0.022705078125,
0.0080413818359375,
0.0019178390502929688,
-0.0032444000244140625,
-0.046661376953125,
0.050079345703125,
0.0002923011779785156,
0.06280517578125,
0.03070068359375,
-0.0012760162353515625,
-0.00682830810546875,
-0.029388427734375,
0.030059814453125,
0.0169830322265625,
-0.025787353515625,
0.004150390625,
0.021209716796875,
-0.0408935546875,
0.0033550262451171875,
0.00786590576171875,
0.0001481771469116211,
-0.00017690658569335938,
0.0355224609375,
0.06610107421875,
-0.004596710205078125,
-0.007671356201171875,
0.0301361083984375,
-0.004268646240234375,
-0.02520751953125,
-0.0224609375,
0.00921630859375,
-0.0146331787109375,
0.037109375,
0.01708984375,
0.031524658203125,
-0.0175018310546875,
-0.0247955322265625,
0.0234375,
0.04425048828125,
-0.0182647705078125,
-0.0308380126953125,
0.0540771484375,
-0.01226806640625,
-0.0243377685546875,
0.0655517578125,
-0.005252838134765625,
-0.042388916015625,
0.088134765625,
0.034698486328125,
0.07659912109375,
-0.004405975341796875,
-0.0013904571533203125,
0.0599365234375,
0.02117919921875,
0.00879669189453125,
0.017486572265625,
0.00907135009765625,
-0.055145263671875,
0.009063720703125,
-0.03131103515625,
0.0003764629364013672,
0.031402587890625,
-0.03741455078125,
0.029510498046875,
-0.061279296875,
-0.0364990234375,
0.007568359375,
0.0225677490234375,
-0.08148193359375,
0.01837158203125,
0.000018894672393798828,
0.068115234375,
-0.0498046875,
0.054931640625,
0.06536865234375,
-0.044952392578125,
-0.06982421875,
-0.007587432861328125,
-0.001949310302734375,
-0.07818603515625,
0.04364013671875,
0.037109375,
0.01346588134765625,
0.006317138671875,
-0.06658935546875,
-0.046173095703125,
0.10736083984375,
0.03558349609375,
-0.0164031982421875,
0.021636962890625,
-0.0114898681640625,
0.0208587646484375,
-0.03192138671875,
0.037841796875,
0.0140838623046875,
0.0227508544921875,
0.0174560546875,
-0.050262451171875,
0.0211181640625,
-0.02008056640625,
0.00662994384765625,
0.01126861572265625,
-0.07122802734375,
0.0623779296875,
-0.03936767578125,
-0.01561737060546875,
0.0090484619140625,
0.054931640625,
0.0200042724609375,
0.01514434814453125,
0.039886474609375,
0.0660400390625,
0.03778076171875,
-0.021209716796875,
0.0687255859375,
-0.00919342041015625,
0.043212890625,
0.046478271484375,
0.02838134765625,
0.045379638671875,
0.0308380126953125,
-0.0236358642578125,
0.0280303955078125,
0.08160400390625,
-0.0291595458984375,
0.0255889892578125,
0.016448974609375,
0.009979248046875,
-0.01140594482421875,
0.005886077880859375,
-0.028900146484375,
0.034698486328125,
0.01204681396484375,
-0.039581298828125,
-0.01024627685546875,
0.00832366943359375,
-0.007808685302734375,
-0.0183868408203125,
-0.01380157470703125,
0.0389404296875,
0.0032749176025390625,
-0.02685546875,
0.06109619140625,
0.00565338134765625,
0.064697265625,
-0.027496337890625,
0.0012311935424804688,
-0.021087646484375,
0.015838623046875,
-0.0181427001953125,
-0.059112548828125,
0.0168609619140625,
-0.0173187255859375,
0.0006098747253417969,
0.002292633056640625,
0.04962158203125,
-0.02099609375,
-0.03009033203125,
0.0175323486328125,
0.022613525390625,
0.041961669921875,
0.0026073455810546875,
-0.0924072265625,
0.016357421875,
-0.00548553466796875,
-0.045623779296875,
0.0186614990234375,
0.037322998046875,
0.00612640380859375,
0.06256103515625,
0.038482666015625,
-0.0034332275390625,
0.00908660888671875,
-0.01064300537109375,
0.0601806640625,
-0.0440673828125,
-0.02069091796875,
-0.0599365234375,
0.0396728515625,
-0.0118865966796875,
-0.041656494140625,
0.036834716796875,
0.039215087890625,
0.07025146484375,
-0.005405426025390625,
0.03167724609375,
-0.0182952880859375,
-0.0012273788452148438,
-0.0304412841796875,
0.046905517578125,
-0.057647705078125,
0.0027008056640625,
-0.0165863037109375,
-0.0523681640625,
-0.0226287841796875,
0.04937744140625,
-0.0179290771484375,
0.03326416015625,
0.03857421875,
0.07427978515625,
-0.0227813720703125,
-0.03277587890625,
0.0161590576171875,
0.0106048583984375,
0.00807952880859375,
0.042144775390625,
0.03106689453125,
-0.061737060546875,
0.0287322998046875,
-0.043487548828125,
-0.01480865478515625,
-0.00986480712890625,
-0.044769287109375,
-0.080810546875,
-0.06732177734375,
-0.04931640625,
-0.0474853515625,
-0.01235198974609375,
0.07513427734375,
0.0828857421875,
-0.053131103515625,
-0.00789642333984375,
-0.0013971328735351562,
0.01641845703125,
-0.025482177734375,
-0.017303466796875,
0.045379638671875,
-0.0187225341796875,
-0.063720703125,
-0.0303192138671875,
0.01116943359375,
0.031494140625,
-0.0036449432373046875,
-0.01568603515625,
-0.01258087158203125,
-0.014556884765625,
0.01165771484375,
0.03179931640625,
-0.0557861328125,
-0.019500732421875,
-0.01296234130859375,
-0.00986480712890625,
0.03643798828125,
0.03631591796875,
-0.047027587890625,
0.022003173828125,
0.02606201171875,
0.033905029296875,
0.05743408203125,
-0.0225067138671875,
0.0061798095703125,
-0.06488037109375,
0.04119873046875,
-0.0174407958984375,
0.041107177734375,
0.0291748046875,
-0.0277557373046875,
0.044342041015625,
0.042816162109375,
-0.033172607421875,
-0.0633544921875,
-0.006237030029296875,
-0.090576171875,
-0.0103759765625,
0.0599365234375,
-0.025360107421875,
-0.05078125,
0.03173828125,
-0.0091552734375,
0.052978515625,
-0.01216888427734375,
0.048675537109375,
0.019012451171875,
-0.00699615478515625,
-0.054534912109375,
-0.0362548828125,
0.0279388427734375,
0.017486572265625,
-0.04931640625,
-0.029296875,
-0.005252838134765625,
0.05078125,
0.02117919921875,
0.036895751953125,
-0.01337432861328125,
0.012847900390625,
0.0028438568115234375,
0.044677734375,
-0.026702880859375,
-0.0165863037109375,
-0.0244598388671875,
-0.00014019012451171875,
-0.01302337646484375,
-0.05303955078125
]
] |
timm/inception_v3.gluon_in1k | 2023-04-25T21:28:50.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"arxiv:1512.00567",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/inception_v3.gluon_in1k | 0 | 36,422 | timm | 2023-04-25T21:28:32 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
datasets:
- imagenet-1k
---
# Model card for inception_v3.gluon_in1k
An Inception-v3 image classification model. Trained on ImageNet-1k by MXNet GLUON authors.
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 23.8
- GMACs: 5.7
- Activations (M): 9.0
- Image size: 299 x 299
- **Papers:**
- Rethinking the Inception Architecture for Computer Vision: https://arxiv.org/abs/1512.00567
- **Original:** https://github.com/tensorflow/models
- **Dataset:** ImageNet-1k
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('inception_v3.gluon_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'inception_v3.gluon_in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g.:
# torch.Size([1, 64, 147, 147])
# torch.Size([1, 192, 71, 71])
# torch.Size([1, 288, 35, 35])
# torch.Size([1, 768, 17, 17])
# torch.Size([1, 2048, 8, 8])
print(o.shape)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'inception_v3.gluon_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 2048, 8, 8) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@article{DBLP:journals/corr/SzegedyVISW15,
author = {Christian Szegedy and
Vincent Vanhoucke and
Sergey Ioffe and
Jonathon Shlens and
Zbigniew Wojna},
title = {Rethinking the Inception Architecture for Computer Vision},
journal = {CoRR},
volume = {abs/1512.00567},
year = {2015},
url = {http://arxiv.org/abs/1512.00567},
archivePrefix = {arXiv},
eprint = {1512.00567},
timestamp = {Mon, 13 Aug 2018 16:49:07 +0200},
biburl = {https://dblp.org/rec/journals/corr/SzegedyVISW15.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
| 4,076 | [
[
-0.03662109375,
-0.043243408203125,
0.01434326171875,
0.002655029296875,
-0.027313232421875,
-0.016082763671875,
-0.00926971435546875,
-0.044036865234375,
0.01056671142578125,
0.017730712890625,
-0.030181884765625,
-0.055938720703125,
-0.050048828125,
-0.0102386474609375,
-0.0217132568359375,
0.0711669921875,
-0.00354766845703125,
-0.004730224609375,
-0.025604248046875,
-0.0278167724609375,
0.0004124641418457031,
-0.0222320556640625,
-0.071044921875,
-0.0338134765625,
0.0178680419921875,
0.0019683837890625,
0.048980712890625,
0.050262451171875,
0.041534423828125,
0.031982421875,
-0.0198211669921875,
-0.0024814605712890625,
-0.01727294921875,
-0.0286407470703125,
0.028289794921875,
-0.042205810546875,
-0.03363037109375,
0.01248931884765625,
0.054779052734375,
0.041748046875,
0.0030670166015625,
0.0241241455078125,
0.025909423828125,
0.033447265625,
-0.019805908203125,
0.0027675628662109375,
-0.030548095703125,
0.024139404296875,
-0.007289886474609375,
0.00910186767578125,
-0.01328277587890625,
-0.030548095703125,
0.0158233642578125,
-0.04559326171875,
0.048095703125,
0.001461029052734375,
0.099609375,
0.00934600830078125,
0.016754150390625,
-0.00738525390625,
-0.027557373046875,
0.056884765625,
-0.05718994140625,
0.01953125,
0.010284423828125,
0.008575439453125,
-0.01080322265625,
-0.0762939453125,
-0.0447998046875,
-0.01256561279296875,
-0.012451171875,
-0.007274627685546875,
-0.01189422607421875,
-0.002838134765625,
0.0178680419921875,
0.04388427734375,
-0.035552978515625,
0.00864410400390625,
-0.04913330078125,
-0.0137481689453125,
0.04083251953125,
-0.0013818740844726562,
0.0212249755859375,
-0.01316070556640625,
-0.0430908203125,
-0.0290985107421875,
-0.0301513671875,
0.025390625,
0.029937744140625,
0.00963592529296875,
-0.050262451171875,
0.0301971435546875,
0.01087188720703125,
0.03985595703125,
0.0098114013671875,
-0.0259552001953125,
0.0482177734375,
-0.0029659271240234375,
-0.03399658203125,
-0.0037555694580078125,
0.0787353515625,
0.0255279541015625,
0.0118560791015625,
-0.00004297494888305664,
0.004405975341796875,
-0.029296875,
0.004833221435546875,
-0.081298828125,
-0.0146484375,
0.033782958984375,
-0.03564453125,
-0.031280517578125,
0.015777587890625,
-0.04449462890625,
-0.01171112060546875,
-0.0017299652099609375,
0.046234130859375,
-0.04998779296875,
-0.025115966796875,
0.0075225830078125,
-0.0106964111328125,
0.037933349609375,
0.0182647705078125,
-0.04400634765625,
0.001682281494140625,
0.02947998046875,
0.09161376953125,
0.004833221435546875,
-0.036285400390625,
0.003803253173828125,
-0.01512908935546875,
-0.026580810546875,
0.035675048828125,
-0.0015697479248046875,
-0.022125244140625,
-0.0222930908203125,
0.021636962890625,
-0.002285003662109375,
-0.0491943359375,
0.0237274169921875,
-0.0100555419921875,
0.0216064453125,
0.00824737548828125,
-0.022186279296875,
-0.04705810546875,
0.0239105224609375,
-0.029205322265625,
0.08465576171875,
0.0242767333984375,
-0.0548095703125,
0.038665771484375,
-0.03509521484375,
-0.01042938232421875,
-0.01142120361328125,
-0.023193359375,
-0.0833740234375,
-0.0013103485107421875,
0.0277099609375,
0.046356201171875,
-0.0225982666015625,
0.009063720703125,
-0.04840087890625,
-0.021392822265625,
0.013458251953125,
0.0027980804443359375,
0.08642578125,
0.01142120361328125,
-0.031707763671875,
0.00920867919921875,
-0.047760009765625,
0.00824737548828125,
0.040679931640625,
-0.018157958984375,
0.0007748603820800781,
-0.0384521484375,
-0.005397796630859375,
0.01169586181640625,
0.00951385498046875,
-0.041229248046875,
0.0215911865234375,
-0.02935791015625,
0.034332275390625,
0.04388427734375,
-0.0024051666259765625,
0.0208587646484375,
-0.025115966796875,
0.01329803466796875,
0.036376953125,
0.0270843505859375,
0.00035953521728515625,
-0.055084228515625,
-0.056365966796875,
-0.03240966796875,
0.01045989990234375,
0.02587890625,
-0.026397705078125,
0.038330078125,
-0.0183868408203125,
-0.0517578125,
-0.044830322265625,
0.006317138671875,
0.0258026123046875,
0.04461669921875,
0.0270843505859375,
-0.040252685546875,
-0.03948974609375,
-0.07513427734375,
0.00911712646484375,
-0.0017309188842773438,
-0.004291534423828125,
0.0191497802734375,
0.0491943359375,
-0.005615234375,
0.060516357421875,
-0.032928466796875,
-0.018280029296875,
-0.01611328125,
0.01042938232421875,
0.041229248046875,
0.06927490234375,
0.061126708984375,
-0.04327392578125,
-0.0240936279296875,
-0.007564544677734375,
-0.0870361328125,
0.0310821533203125,
-0.0018711090087890625,
-0.0107269287109375,
0.020904541015625,
0.0155181884765625,
-0.04339599609375,
0.051025390625,
0.002429962158203125,
-0.01551055908203125,
0.032623291015625,
-0.0189208984375,
0.021026611328125,
-0.08685302734375,
0.012359619140625,
0.0183868408203125,
0.002227783203125,
-0.03179931640625,
0.00481414794921875,
-0.005802154541015625,
-0.0105133056640625,
-0.048248291015625,
0.04046630859375,
-0.04351806640625,
-0.02130126953125,
-0.003253936767578125,
-0.017913818359375,
0.00807952880859375,
0.057891845703125,
-0.01023101806640625,
0.036773681640625,
0.07135009765625,
-0.045166015625,
0.0264129638671875,
0.034698486328125,
-0.030914306640625,
0.039031982421875,
-0.052001953125,
0.0234832763671875,
-0.0149688720703125,
0.0103607177734375,
-0.072021484375,
-0.01165008544921875,
0.0338134765625,
-0.048370361328125,
0.039703369140625,
-0.043792724609375,
-0.0235748291015625,
-0.039337158203125,
-0.044097900390625,
0.037506103515625,
0.050140380859375,
-0.0499267578125,
0.037261962890625,
0.0112457275390625,
0.0202178955078125,
-0.046600341796875,
-0.0693359375,
-0.00832366943359375,
-0.031494140625,
-0.055908203125,
0.038665771484375,
0.004390716552734375,
0.0070648193359375,
0.0170440673828125,
0.0038909912109375,
0.0031948089599609375,
-0.00629425048828125,
0.0390625,
0.0299530029296875,
-0.029266357421875,
-0.02044677734375,
-0.02520751953125,
0.0043182373046875,
-0.0015249252319335938,
-0.0286407470703125,
0.044525146484375,
-0.0176849365234375,
-0.01241302490234375,
-0.0616455078125,
-0.00397491455078125,
0.037384033203125,
0.00415802001953125,
0.0623779296875,
0.08160400390625,
-0.048675537109375,
0.005283355712890625,
-0.0294342041015625,
-0.0115203857421875,
-0.035400390625,
0.031982421875,
-0.034698486328125,
-0.03387451171875,
0.06719970703125,
0.01239776611328125,
0.0078277587890625,
0.06097412109375,
0.0309600830078125,
-0.004669189453125,
0.0501708984375,
0.046234130859375,
0.00637054443359375,
0.043701171875,
-0.0858154296875,
-0.005062103271484375,
-0.07366943359375,
-0.039764404296875,
-0.0267791748046875,
-0.04150390625,
-0.0343017578125,
-0.0205230712890625,
0.037353515625,
0.035369873046875,
-0.023529052734375,
0.02947998046875,
-0.06195068359375,
0.00926971435546875,
0.0467529296875,
0.03546142578125,
-0.0161285400390625,
0.0238800048828125,
-0.0226898193359375,
0.01007080078125,
-0.049835205078125,
-0.0189208984375,
0.0872802734375,
0.0289459228515625,
0.04412841796875,
-0.0156402587890625,
0.051727294921875,
-0.01387786865234375,
0.0207672119140625,
-0.0419921875,
0.042236328125,
0.00008338689804077148,
-0.0352783203125,
-0.01015472412109375,
-0.019683837890625,
-0.07989501953125,
0.006072998046875,
-0.0198516845703125,
-0.0687255859375,
0.02459716796875,
0.00983428955078125,
-0.03631591796875,
0.060577392578125,
-0.056884765625,
0.067626953125,
-0.004608154296875,
-0.032257080078125,
-0.0005741119384765625,
-0.037841796875,
0.0279388427734375,
0.021331787109375,
-0.01313018798828125,
-0.0046539306640625,
0.0172882080078125,
0.0753173828125,
-0.0345458984375,
0.062469482421875,
-0.033660888671875,
0.016204833984375,
0.047760009765625,
-0.01152801513671875,
0.018280029296875,
0.00498199462890625,
-0.002887725830078125,
0.03875732421875,
-0.0057525634765625,
-0.0291290283203125,
-0.039031982421875,
0.046356201171875,
-0.0821533203125,
-0.037261962890625,
-0.0255279541015625,
-0.0287933349609375,
0.0175323486328125,
0.004421234130859375,
0.04901123046875,
0.0567626953125,
0.008026123046875,
0.0258636474609375,
0.03131103515625,
-0.03271484375,
0.038665771484375,
-0.01367950439453125,
-0.0228118896484375,
-0.048065185546875,
0.04901123046875,
0.020660400390625,
0.0096893310546875,
0.001750946044921875,
0.01055145263671875,
-0.0287017822265625,
-0.0589599609375,
-0.03594970703125,
0.0218048095703125,
-0.05126953125,
-0.03179931640625,
-0.04547119140625,
-0.04638671875,
-0.04339599609375,
-0.01611328125,
-0.0267486572265625,
-0.023193359375,
-0.03253173828125,
0.01399993896484375,
0.046173095703125,
0.044830322265625,
-0.01220703125,
0.0369873046875,
-0.03216552734375,
0.01097869873046875,
0.005229949951171875,
0.038665771484375,
0.0007600784301757812,
-0.066162109375,
-0.0310821533203125,
-0.0089569091796875,
-0.04254150390625,
-0.042633056640625,
0.0227508544921875,
0.0212554931640625,
0.04083251953125,
0.033966064453125,
-0.03082275390625,
0.06207275390625,
-0.00510406494140625,
0.0526123046875,
0.019256591796875,
-0.041229248046875,
0.05072021484375,
0.00328826904296875,
0.01385498046875,
0.006954193115234375,
0.0295257568359375,
-0.009307861328125,
-0.016632080078125,
-0.080810546875,
-0.06475830078125,
0.06640625,
0.01186370849609375,
-0.007442474365234375,
0.0259857177734375,
0.061431884765625,
0.01026153564453125,
0.0004820823669433594,
-0.055816650390625,
-0.0266876220703125,
-0.02508544921875,
-0.021697998046875,
-0.0016965866088867188,
-0.00567626953125,
-0.01007843017578125,
-0.05059814453125,
0.04962158203125,
0.001216888427734375,
0.06256103515625,
0.038299560546875,
-0.01119232177734375,
-0.00563812255859375,
-0.02838134765625,
0.034454345703125,
0.031707763671875,
-0.035186767578125,
0.0024394989013671875,
0.01861572265625,
-0.04986572265625,
0.006267547607421875,
0.008636474609375,
-0.01270294189453125,
-0.01085662841796875,
0.037506103515625,
0.07098388671875,
-0.0006146430969238281,
0.01454925537109375,
0.02264404296875,
0.007472991943359375,
-0.0313720703125,
-0.02532958984375,
0.01885986328125,
-0.008026123046875,
0.0321044921875,
0.03778076171875,
0.0214385986328125,
-0.004673004150390625,
-0.02166748046875,
0.0207672119140625,
0.043304443359375,
-0.01180267333984375,
-0.034759521484375,
0.057373046875,
-0.0214996337890625,
-0.0126800537109375,
0.060577392578125,
-0.007251739501953125,
-0.0236053466796875,
0.073974609375,
0.035797119140625,
0.0816650390625,
0.001811981201171875,
-0.002475738525390625,
0.07171630859375,
0.0230255126953125,
0.0020618438720703125,
0.0022792816162109375,
0.0110015869140625,
-0.0616455078125,
0.0020751953125,
-0.055145263671875,
-0.003185272216796875,
0.0411376953125,
-0.033203125,
0.0278167724609375,
-0.067138671875,
-0.0246429443359375,
0.0152435302734375,
0.0190277099609375,
-0.07000732421875,
0.017578125,
0.00966644287109375,
0.0555419921875,
-0.0465087890625,
0.053741455078125,
0.057891845703125,
-0.054046630859375,
-0.0595703125,
-0.0138397216796875,
0.00273895263671875,
-0.078125,
0.0287628173828125,
0.036773681640625,
0.0011119842529296875,
0.0146026611328125,
-0.0670166015625,
-0.056976318359375,
0.1112060546875,
0.04730224609375,
-0.0004630088806152344,
0.01220703125,
0.00494384765625,
0.01331329345703125,
-0.038238525390625,
0.0297393798828125,
0.01739501953125,
0.019317626953125,
0.024444580078125,
-0.03594970703125,
0.0100555419921875,
-0.0150909423828125,
0.01483917236328125,
0.0046539306640625,
-0.04766845703125,
0.0703125,
-0.04046630859375,
-0.0253143310546875,
0.002376556396484375,
0.06298828125,
0.0172271728515625,
0.00746917724609375,
0.04290771484375,
0.07647705078125,
0.033599853515625,
-0.0205535888671875,
0.0670166015625,
-0.00730133056640625,
0.0479736328125,
0.037109375,
0.0250091552734375,
0.032623291015625,
0.03082275390625,
-0.02838134765625,
0.034393310546875,
0.0765380859375,
-0.030731201171875,
0.0204010009765625,
0.01067352294921875,
0.004852294921875,
0.0034942626953125,
0.01232147216796875,
-0.03558349609375,
0.033966064453125,
0.01396942138671875,
-0.02764892578125,
-0.0211639404296875,
0.0075225830078125,
0.004024505615234375,
-0.032379150390625,
-0.0189208984375,
0.036529541015625,
0.0003674030303955078,
-0.0202178955078125,
0.06805419921875,
-0.000354766845703125,
0.061859130859375,
-0.0288848876953125,
0.00677490234375,
-0.0253143310546875,
0.0175933837890625,
-0.028076171875,
-0.06719970703125,
0.030181884765625,
-0.0195770263671875,
0.0015869140625,
0.0135955810546875,
0.038299560546875,
-0.02154541015625,
-0.04522705078125,
0.0160064697265625,
0.01715087890625,
0.041259765625,
0.0070343017578125,
-0.08447265625,
0.01387786865234375,
-0.002880096435546875,
-0.042236328125,
0.016143798828125,
0.026885986328125,
0.007289886474609375,
0.046844482421875,
0.044830322265625,
-0.00804901123046875,
0.0100555419921875,
-0.01462554931640625,
0.055328369140625,
-0.04840087890625,
-0.0290069580078125,
-0.0711669921875,
0.05389404296875,
-0.006557464599609375,
-0.04876708984375,
0.031768798828125,
0.057708740234375,
0.07586669921875,
-0.00656890869140625,
0.03497314453125,
-0.029052734375,
-0.005462646484375,
-0.03790283203125,
0.0498046875,
-0.052337646484375,
-0.007843017578125,
-0.01117706298828125,
-0.060089111328125,
-0.03118896484375,
0.045257568359375,
-0.0149383544921875,
0.030487060546875,
0.0372314453125,
0.07391357421875,
-0.0264739990234375,
-0.021270751953125,
0.01206207275390625,
0.018463134765625,
0.0192108154296875,
0.034698486328125,
0.0302886962890625,
-0.060394287109375,
0.028900146484375,
-0.04254150390625,
-0.01160430908203125,
-0.01206207275390625,
-0.0501708984375,
-0.055419921875,
-0.06817626953125,
-0.047210693359375,
-0.04425048828125,
-0.0172271728515625,
0.065185546875,
0.067626953125,
-0.0565185546875,
-0.01357269287109375,
0.0038394927978515625,
0.0105133056640625,
-0.0265655517578125,
-0.0157470703125,
0.049346923828125,
-0.0208587646484375,
-0.061126708984375,
-0.01145172119140625,
0.01313018798828125,
0.039947509765625,
-0.0092926025390625,
-0.01104736328125,
-0.0189666748046875,
-0.026458740234375,
0.01499176025390625,
0.0229949951171875,
-0.051605224609375,
-0.03192138671875,
-0.0285797119140625,
-0.006160736083984375,
0.0457763671875,
0.032806396484375,
-0.048980712890625,
0.01654052734375,
0.0211639404296875,
0.0307159423828125,
0.062164306640625,
-0.0240325927734375,
0.00699615478515625,
-0.056884765625,
0.044921875,
-0.01174163818359375,
0.034088134765625,
0.036285400390625,
-0.034759521484375,
0.04327392578125,
0.04150390625,
-0.024566650390625,
-0.06597900390625,
-0.00884246826171875,
-0.07611083984375,
-0.00421905517578125,
0.051513671875,
-0.0211639404296875,
-0.04754638671875,
0.0297698974609375,
0.002628326416015625,
0.05084228515625,
0.00030493736267089844,
0.04730224609375,
0.0247650146484375,
-0.01189422607421875,
-0.0404052734375,
-0.035736083984375,
0.041473388671875,
0.0013294219970703125,
-0.045257568359375,
-0.036956787109375,
-0.004894256591796875,
0.04736328125,
0.028839111328125,
0.03863525390625,
-0.0252685546875,
0.0019741058349609375,
0.00800323486328125,
0.044921875,
-0.0306549072265625,
-0.018585205078125,
-0.031829833984375,
0.01297760009765625,
-0.0038280487060546875,
-0.047637939453125
]
] |
timm/eca_halonext26ts.c1_in1k | 2023-04-26T16:09:44.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"arxiv:2110.00476",
"arxiv:2103.12731",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/eca_halonext26ts.c1_in1k | 0 | 36,415 | timm | 2023-04-26T16:09:32 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
datasets:
- imagenet-1k
---
# Model card for eca_halonext26ts.c1_in1k
A HaloNet image classification model (with Efficient channel attention, based on ResNeXt architecture). Trained on ImageNet-1k in `timm` by Ross Wightman.
NOTE: this model did not adhere to any specific paper configuration; it was tuned for reasonable training times with a reduced frequency of self-attention blocks.
Recipe details:
* Based on [ResNet Strikes Back](https://arxiv.org/abs/2110.00476) `C` recipes
* SGD (w/ Nesterov) optimizer and AGC (adaptive gradient clipping).
* Cosine LR schedule with warmup
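The warmup-plus-cosine schedule mentioned above can be sketched as a small helper. This is an illustrative sketch only: the function name and the linear-warmup shape are assumptions, not the exact `timm` scheduler implementation.

```python
import math

def cosine_lr(step, total_steps, warmup_steps, base_lr, min_lr=0.0):
    """Linear warmup to base_lr, then cosine decay to min_lr."""
    if step < warmup_steps:
        # ramp linearly from ~0 up to base_lr over the warmup phase
        return base_lr * (step + 1) / warmup_steps
    # progress through the cosine phase in [0, 1]
    t = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return min_lr + 0.5 * (base_lr - min_lr) * (1 + math.cos(math.pi * t))
```

At the end of warmup the schedule returns exactly `base_lr`, and at `total_steps` it has decayed to `min_lr`.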
This model architecture is implemented using `timm`'s flexible [BYOBNet (Bring-Your-Own-Blocks Network)](https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/byobnet.py).
BYOB (with BYOANet attention-specific blocks) allows configuration of:
* block / stage layout
* block-type interleaving
* stem layout
* output stride (dilation)
* activation and norm layers
* channel and spatial / self-attention layers
...and also includes `timm` features common to many other architectures, including:
* stochastic depth
* gradient checkpointing
* layer-wise LR decay
* per-stage feature extraction
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 10.8
- GMACs: 2.4
- Activations (M): 11.5
- Image size: 256 x 256
- **Papers:**
- Scaling Local Self-Attention for Parameter Efficient Visual Backbones: https://arxiv.org/abs/2103.12731
- ResNet strikes back: An improved training procedure in timm: https://arxiv.org/abs/2110.00476
- **Dataset:** ImageNet-1k
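The halo mechanism from the "Scaling Local Self-Attention" paper partitions the feature map into non-overlapping query blocks, while keys and values are gathered from a slightly larger "haloed" window around each block. A minimal NumPy sketch of that window extraction (illustrative only; the block and halo sizes below are arbitrary, not the values used by this model):

```python
import numpy as np

def extract_halo_windows(x, block=4, halo=1):
    """Gather key/value windows for halo attention.

    x: (H, W, C) feature map. Each non-overlapping block x block
    query block attends to a (block + 2*halo)^2 neighborhood that
    extends `halo` pixels beyond the block on every side.
    Returns (H//block, W//block, win, win, C) windows.
    """
    H, W, C = x.shape
    assert H % block == 0 and W % block == 0
    xp = np.pad(x, ((halo, halo), (halo, halo), (0, 0)))
    win = block + 2 * halo
    nh, nw = H // block, W // block
    out = np.empty((nh, nw, win, win, C), dtype=x.dtype)
    for i in range(nh):
        for j in range(nw):
            # haloed window centered on query block (i, j)
            out[i, j] = xp[i * block : i * block + win,
                           j * block : j * block + win]
    return out
```

Each query block's own pixels sit at the center of its window, surrounded by the halo of neighboring context (zero-padded at image borders).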
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch  # needed for torch.topk below
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('eca_halonext26ts.c1_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'eca_halonext26ts.c1_in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g.:
# torch.Size([1, 64, 128, 128])
# torch.Size([1, 256, 64, 64])
# torch.Size([1, 512, 32, 32])
# torch.Size([1, 1024, 16, 16])
# torch.Size([1, 2048, 8, 8])
print(o.shape)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'eca_halonext26ts.c1_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 2048, 8, 8) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
```bibtex
@article{Vaswani2021ScalingLS,
title={Scaling Local Self-Attention for Parameter Efficient Visual Backbones},
author={Ashish Vaswani and Prajit Ramachandran and A. Srinivas and Niki Parmar and Blake A. Hechtman and Jonathon Shlens},
journal={2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
year={2021},
pages={12889-12899}
}
```
```bibtex
@inproceedings{wightman2021resnet,
title={ResNet strikes back: An improved training procedure in timm},
author={Wightman, Ross and Touvron, Hugo and Jegou, Herve},
booktitle={NeurIPS 2021 Workshop on ImageNet: Past, Present, and Future}
}
```
| 5,431 | [
[
-0.041473388671875,
-0.03778076171875,
0.0036907196044921875,
0.01470947265625,
-0.0192718505859375,
-0.02630615234375,
-0.0234222412109375,
-0.0340576171875,
0.0244140625,
0.0384521484375,
-0.040283203125,
-0.05438232421875,
-0.046234130859375,
-0.0016326904296875,
-0.0110321044921875,
0.0687255859375,
-0.0088348388671875,
-0.01166534423828125,
-0.00444793701171875,
-0.047821044921875,
-0.01331329345703125,
-0.0297393798828125,
-0.0709228515625,
-0.0237274169921875,
0.036376953125,
0.015228271484375,
0.030548095703125,
0.059661865234375,
0.052032470703125,
0.0362548828125,
-0.00888824462890625,
0.0050506591796875,
-0.0194091796875,
-0.019561767578125,
0.02130126953125,
-0.041259765625,
-0.03973388671875,
0.01538848876953125,
0.04815673828125,
0.023681640625,
-0.002994537353515625,
0.035797119140625,
0.0061187744140625,
0.044464111328125,
-0.024078369140625,
0.0084381103515625,
-0.032806396484375,
0.01611328125,
-0.0132293701171875,
0.001689910888671875,
-0.022491455078125,
-0.018707275390625,
0.0196533203125,
-0.043701171875,
0.035003662109375,
0.006237030029296875,
0.09405517578125,
0.0240325927734375,
-0.0017833709716796875,
-0.003017425537109375,
-0.0151214599609375,
0.06524658203125,
-0.0589599609375,
0.0224761962890625,
0.01506805419921875,
0.0173797607421875,
0.004688262939453125,
-0.07720947265625,
-0.03485107421875,
-0.0145263671875,
-0.00959014892578125,
0.00984954833984375,
-0.0200653076171875,
0.005481719970703125,
0.018798828125,
0.0248260498046875,
-0.03948974609375,
0.00937652587890625,
-0.0439453125,
-0.00817108154296875,
0.036376953125,
-0.002651214599609375,
0.020294189453125,
-0.0222015380859375,
-0.04541015625,
-0.03204345703125,
-0.0286102294921875,
0.01654052734375,
0.01285552978515625,
0.0184783935546875,
-0.0452880859375,
0.0173492431640625,
0.006771087646484375,
0.0439453125,
0.019439697265625,
-0.0188446044921875,
0.03900146484375,
0.003826141357421875,
-0.03680419921875,
-0.0014829635620117188,
0.077392578125,
0.0257415771484375,
0.0162200927734375,
0.0074310302734375,
-0.0083465576171875,
-0.0210723876953125,
-0.001220703125,
-0.0784912109375,
-0.026123046875,
0.026336669921875,
-0.049896240234375,
-0.026885986328125,
0.01548004150390625,
-0.047271728515625,
-0.00232696533203125,
-0.00576019287109375,
0.03338623046875,
-0.03741455078125,
-0.0244140625,
0.00444793701171875,
-0.00559234619140625,
0.0211639404296875,
0.0120391845703125,
-0.043701171875,
0.01418304443359375,
0.015655517578125,
0.08056640625,
0.0005426406860351562,
-0.03466796875,
-0.0161895751953125,
-0.01534271240234375,
-0.0155487060546875,
0.04156494140625,
-0.005481719970703125,
-0.003993988037109375,
-0.0221099853515625,
0.0269927978515625,
-0.01100921630859375,
-0.06103515625,
0.0161590576171875,
-0.0233612060546875,
0.01561737060546875,
-0.006069183349609375,
-0.01708984375,
-0.040313720703125,
0.029632568359375,
-0.036712646484375,
0.08917236328125,
0.022003173828125,
-0.06439208984375,
0.0218505859375,
-0.04351806640625,
-0.006160736083984375,
-0.0212860107421875,
-0.0016584396362304688,
-0.0770263671875,
-0.00970458984375,
0.0290985107421875,
0.0523681640625,
-0.023773193359375,
-0.00348663330078125,
-0.04766845703125,
-0.0145263671875,
0.0231475830078125,
-0.0031528472900390625,
0.0714111328125,
0.0130462646484375,
-0.039154052734375,
0.0219573974609375,
-0.053131103515625,
0.0146636962890625,
0.035369873046875,
-0.0181884765625,
-0.0024261474609375,
-0.049713134765625,
0.013397216796875,
0.03173828125,
0.0110626220703125,
-0.041473388671875,
0.01364898681640625,
-0.0189971923828125,
0.039520263671875,
0.052886962890625,
-0.0004906654357910156,
0.023468017578125,
-0.027618408203125,
0.0240936279296875,
0.0222930908203125,
0.023773193359375,
-0.0090484619140625,
-0.0303802490234375,
-0.06817626953125,
-0.045867919921875,
0.0272979736328125,
0.026702880859375,
-0.034271240234375,
0.03936767578125,
-0.00960540771484375,
-0.054779052734375,
-0.041168212890625,
0.01543426513671875,
0.044189453125,
0.0421142578125,
0.031524658203125,
-0.04229736328125,
-0.043426513671875,
-0.0740966796875,
0.0010967254638671875,
0.0024566650390625,
0.0024089813232421875,
0.0275115966796875,
0.0421142578125,
-0.00666046142578125,
0.05560302734375,
-0.0260009765625,
-0.0307769775390625,
-0.0181732177734375,
0.009368896484375,
0.0265350341796875,
0.05712890625,
0.064453125,
-0.04559326171875,
-0.041534423828125,
-0.00896453857421875,
-0.07513427734375,
0.00276947021484375,
-0.01514434814453125,
-0.00970458984375,
0.0206298828125,
0.0172882080078125,
-0.040252685546875,
0.043304443359375,
0.0171966552734375,
-0.01206207275390625,
0.025390625,
-0.01522064208984375,
0.017486572265625,
-0.081298828125,
0.0067138671875,
0.03564453125,
-0.01345062255859375,
-0.026214599609375,
0.0101470947265625,
0.005584716796875,
-0.0012807846069335938,
-0.042816162109375,
0.051788330078125,
-0.05078125,
-0.013916015625,
-0.0144805908203125,
-0.0089263916015625,
-0.0058746337890625,
0.068603515625,
0.0032329559326171875,
0.0298919677734375,
0.060455322265625,
-0.037078857421875,
0.03204345703125,
0.0236358642578125,
-0.004802703857421875,
0.04095458984375,
-0.056915283203125,
0.0184173583984375,
0.0038509368896484375,
0.01531219482421875,
-0.0716552734375,
-0.01378631591796875,
0.0275726318359375,
-0.03961181640625,
0.04913330078125,
-0.0310821533203125,
-0.031829833984375,
-0.04498291015625,
-0.03900146484375,
0.032928466796875,
0.049530029296875,
-0.069580078125,
0.035247802734375,
0.01519012451171875,
0.0238189697265625,
-0.0347900390625,
-0.06341552734375,
-0.01288604736328125,
-0.03924560546875,
-0.051300048828125,
0.02642822265625,
0.00659942626953125,
0.008880615234375,
0.007061004638671875,
-0.00641632080078125,
-0.00380706787109375,
-0.0147705078125,
0.042816162109375,
0.0171966552734375,
-0.0208587646484375,
-0.01043701171875,
-0.0229949951171875,
-0.005573272705078125,
0.001262664794921875,
-0.0279693603515625,
0.04132080078125,
-0.0254058837890625,
-0.01166534423828125,
-0.06854248046875,
0.0017261505126953125,
0.037811279296875,
-0.01483917236328125,
0.0556640625,
0.08087158203125,
-0.035064697265625,
0.0004115104675292969,
-0.042327880859375,
-0.0267333984375,
-0.03948974609375,
0.035552978515625,
-0.0225677490234375,
-0.038360595703125,
0.06683349609375,
-0.001293182373046875,
-0.00005644559860229492,
0.04840087890625,
0.03143310546875,
-0.006031036376953125,
0.057220458984375,
0.052001953125,
0.00795745849609375,
0.057586669921875,
-0.06744384765625,
-0.0179290771484375,
-0.07379150390625,
-0.02691650390625,
-0.0186920166015625,
-0.044464111328125,
-0.049652099609375,
-0.02276611328125,
0.03759765625,
0.009918212890625,
-0.03277587890625,
0.0374755859375,
-0.069580078125,
0.003360748291015625,
0.055419921875,
0.044586181640625,
-0.0241851806640625,
0.022491455078125,
-0.009124755859375,
-0.007602691650390625,
-0.06494140625,
-0.0168304443359375,
0.08428955078125,
0.0367431640625,
0.0496826171875,
-0.018463134765625,
0.061553955078125,
-0.0139007568359375,
0.038330078125,
-0.033050537109375,
0.0452880859375,
-0.0017309188842773438,
-0.034942626953125,
-0.0084228515625,
-0.03167724609375,
-0.08050537109375,
0.003360748291015625,
-0.01251983642578125,
-0.059356689453125,
0.018341064453125,
0.01453399658203125,
-0.022186279296875,
0.0538330078125,
-0.0638427734375,
0.0723876953125,
-0.012298583984375,
-0.033538818359375,
0.0026836395263671875,
-0.0560302734375,
0.0251617431640625,
0.01177978515625,
-0.0169677734375,
0.0011491775512695312,
0.011749267578125,
0.0830078125,
-0.044281005859375,
0.07379150390625,
-0.024078369140625,
0.0201263427734375,
0.044830322265625,
-0.0178985595703125,
0.02020263671875,
-0.0054168701171875,
0.002468109130859375,
0.0308074951171875,
0.0003428459167480469,
-0.040130615234375,
-0.044281005859375,
0.049285888671875,
-0.07275390625,
-0.03045654296875,
-0.0268096923828125,
-0.03826904296875,
0.014862060546875,
0.00421905517578125,
0.03924560546875,
0.047119140625,
0.02105712890625,
0.0134429931640625,
0.043548583984375,
-0.03155517578125,
0.03759765625,
0.002315521240234375,
-0.0117340087890625,
-0.046722412109375,
0.0657958984375,
0.022064208984375,
0.01287078857421875,
0.0126495361328125,
0.004947662353515625,
-0.022125244140625,
-0.03375244140625,
-0.019927978515625,
0.038330078125,
-0.053314208984375,
-0.035980224609375,
-0.042449951171875,
-0.0328369140625,
-0.034149169921875,
-0.01137542724609375,
-0.040985107421875,
-0.0226898193359375,
-0.0266571044921875,
0.0140533447265625,
0.0487060546875,
0.041778564453125,
-0.01363372802734375,
0.04608154296875,
-0.04443359375,
0.00659942626953125,
0.01561737060546875,
0.02618408203125,
-0.0011072158813476562,
-0.0657958984375,
-0.0217437744140625,
-0.01617431640625,
-0.038055419921875,
-0.059906005859375,
0.042144775390625,
0.0091094970703125,
0.03533935546875,
0.031005859375,
-0.0135955810546875,
0.05706787109375,
-0.0009274482727050781,
0.0347900390625,
0.0296783447265625,
-0.0440673828125,
0.04833984375,
-0.006847381591796875,
0.0212860107421875,
0.007659912109375,
0.019866943359375,
-0.0189971923828125,
-0.01197052001953125,
-0.08087158203125,
-0.053955078125,
0.065673828125,
0.00972747802734375,
-0.01059722900390625,
0.016815185546875,
0.060760498046875,
-0.0078582763671875,
0.0052337646484375,
-0.061248779296875,
-0.03570556640625,
-0.02252197265625,
-0.0221405029296875,
0.004573822021484375,
-0.0023784637451171875,
-0.0060882568359375,
-0.04736328125,
0.04833984375,
-0.00563812255859375,
0.054229736328125,
0.03759765625,
-0.0017633438110351562,
-0.00899505615234375,
-0.02874755859375,
0.031494140625,
0.0223236083984375,
-0.0274658203125,
0.008544921875,
0.02520751953125,
-0.03643798828125,
0.0018329620361328125,
0.010009765625,
0.001468658447265625,
-0.004398345947265625,
0.040252685546875,
0.061279296875,
0.00885772705078125,
-0.00420379638671875,
0.02178955078125,
-0.001983642578125,
-0.0251312255859375,
-0.01425933837890625,
0.00733184814453125,
-0.008941650390625,
0.035919189453125,
0.021514892578125,
0.0229339599609375,
-0.01010894775390625,
-0.022369384765625,
0.0238800048828125,
0.032928466796875,
-0.0253143310546875,
-0.03173828125,
0.056121826171875,
-0.0156402587890625,
-0.01261138916015625,
0.05853271484375,
-0.00617218017578125,
-0.03985595703125,
0.0833740234375,
0.04010009765625,
0.069580078125,
-0.01351165771484375,
0.0009088516235351562,
0.060638427734375,
0.0240936279296875,
0.00939178466796875,
0.01059722900390625,
0.0080108642578125,
-0.046478271484375,
0.00879669189453125,
-0.034271240234375,
0.00545501708984375,
0.024688720703125,
-0.04583740234375,
0.0189056396484375,
-0.058563232421875,
-0.03143310546875,
0.004306793212890625,
0.022064208984375,
-0.072265625,
0.0251312255859375,
-0.00153350830078125,
0.0687255859375,
-0.0501708984375,
0.055419921875,
0.06549072265625,
-0.0455322265625,
-0.07672119140625,
-0.0086822509765625,
-0.0010623931884765625,
-0.07086181640625,
0.04656982421875,
0.03558349609375,
0.01367950439453125,
0.0110626220703125,
-0.061767578125,
-0.0556640625,
0.1080322265625,
0.04132080078125,
-0.0131072998046875,
0.01554107666015625,
-0.0095367431640625,
0.023773193359375,
-0.034088134765625,
0.034759521484375,
0.018585205078125,
0.028289794921875,
0.0158233642578125,
-0.05242919921875,
0.0212554931640625,
-0.01904296875,
0.0083160400390625,
0.00714874267578125,
-0.0751953125,
0.0654296875,
-0.032440185546875,
-0.0125274658203125,
0.0028076171875,
0.0643310546875,
0.0249176025390625,
0.01458740234375,
0.037261962890625,
0.06634521484375,
0.038665771484375,
-0.01538848876953125,
0.07421875,
-0.00984954833984375,
0.049896240234375,
0.0504150390625,
0.02545166015625,
0.042144775390625,
0.032501220703125,
-0.015960693359375,
0.0235137939453125,
0.0882568359375,
-0.022491455078125,
0.0302734375,
0.01309967041015625,
0.007785797119140625,
-0.0143585205078125,
0.00263214111328125,
-0.034881591796875,
0.03314208984375,
0.0102386474609375,
-0.038818359375,
-0.0187530517578125,
0.00983428955078125,
-0.00675201416015625,
-0.019927978515625,
-0.0179901123046875,
0.035064697265625,
0.007511138916015625,
-0.02850341796875,
0.06793212890625,
0.005870819091796875,
0.0643310546875,
-0.02838134765625,
0.006069183349609375,
-0.0248565673828125,
0.01922607421875,
-0.0223846435546875,
-0.05096435546875,
0.0163726806640625,
-0.02008056640625,
-0.003070831298828125,
0.004833221435546875,
0.0450439453125,
-0.0255279541015625,
-0.033477783203125,
0.0235595703125,
0.0214385986328125,
0.036346435546875,
-0.0014581680297851562,
-0.0963134765625,
0.017181396484375,
-0.006656646728515625,
-0.048248291015625,
0.025299072265625,
0.03131103515625,
0.00437164306640625,
0.0576171875,
0.03887939453125,
-0.00879669189453125,
0.016265869140625,
-0.0135955810546875,
0.06494140625,
-0.0411376953125,
-0.026458740234375,
-0.05657958984375,
0.034637451171875,
-0.01116943359375,
-0.044677734375,
0.04608154296875,
0.039947509765625,
0.06658935546875,
-0.001995086669921875,
0.0260009765625,
-0.0103302001953125,
-0.001422882080078125,
-0.0341796875,
0.046356201171875,
-0.05474853515625,
0.0050201416015625,
-0.01494598388671875,
-0.05633544921875,
-0.0272674560546875,
0.051177978515625,
-0.0255279541015625,
0.04010009765625,
0.04248046875,
0.0721435546875,
-0.0260162353515625,
-0.0306396484375,
0.01372528076171875,
0.013397216796875,
0.0086212158203125,
0.04473876953125,
0.0291748046875,
-0.059478759765625,
0.0292510986328125,
-0.046966552734375,
-0.016876220703125,
-0.01096343994140625,
-0.043853759765625,
-0.0806884765625,
-0.06365966796875,
-0.0523681640625,
-0.041534423828125,
-0.0113372802734375,
0.0672607421875,
0.09173583984375,
-0.05242919921875,
-0.00853729248046875,
0.0004432201385498047,
0.0076751708984375,
-0.0247650146484375,
-0.0180511474609375,
0.056427001953125,
-0.016998291015625,
-0.05902099609375,
-0.0206146240234375,
0.01434326171875,
0.0254058837890625,
-0.004825592041015625,
-0.0180511474609375,
-0.0157318115234375,
-0.013275146484375,
0.01236724853515625,
0.03515625,
-0.05865478515625,
-0.017425537109375,
-0.01149749755859375,
-0.01436614990234375,
0.036468505859375,
0.035186767578125,
-0.047271728515625,
0.020172119140625,
0.0147552490234375,
0.03173828125,
0.06195068359375,
-0.0176849365234375,
0.00330352783203125,
-0.0694580078125,
0.039886474609375,
-0.01715087890625,
0.036224365234375,
0.0328369140625,
-0.03155517578125,
0.044677734375,
0.033721923828125,
-0.0357666015625,
-0.0625,
-0.00811767578125,
-0.09210205078125,
-0.006847381591796875,
0.061767578125,
-0.028900146484375,
-0.052886962890625,
0.036102294921875,
-0.01065826416015625,
0.048248291015625,
-0.01026153564453125,
0.0499267578125,
0.01904296875,
-0.0111083984375,
-0.058197021484375,
-0.033233642578125,
0.030731201171875,
0.02459716796875,
-0.054718017578125,
-0.0274658203125,
-0.005321502685546875,
0.047698974609375,
0.0248565673828125,
0.038970947265625,
-0.0093231201171875,
0.0158233642578125,
0.00038051605224609375,
0.038970947265625,
-0.0264434814453125,
-0.0133819580078125,
-0.0285186767578125,
0.00884246826171875,
-0.01299285888671875,
-0.0460205078125
]
] |
timm/gmixer_24_224.ra3_in1k | 2023-03-27T23:01:06.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/gmixer_24_224.ra3_in1k | 0 | 36,359 | timm | 2023-03-27T23:00:47 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
datasets:
- imagenet-1k
---
# Model card for gmixer_24_224.ra3_in1k
A G-Mixer image classification model. Trained on ImageNet-1k in `timm` by Ross Wightman. This is a custom `timm` model variant based on MLP-Mixer but using SwiGLU.
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 24.7
- GMACs: 5.3
- Activations (M): 14.5
- Image size: 224 x 224
- **Original:** https://github.com/huggingface/pytorch-image-models
- **Dataset:** ImageNet-1k
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch  # needed for torch.topk below
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('gmixer_24_224.ra3_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'gmixer_24_224.ra3_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 196, 384) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
| 2,740 | [
[
-0.041259765625,
-0.0298614501953125,
0.00882720947265625,
0.006572723388671875,
-0.0227813720703125,
-0.0171966552734375,
0.0005888938903808594,
-0.033233642578125,
0.0214080810546875,
0.0237579345703125,
-0.04412841796875,
-0.052642822265625,
-0.06011962890625,
-0.00933837890625,
-0.0126495361328125,
0.0811767578125,
-0.01285552978515625,
0.0028228759765625,
-0.0182647705078125,
-0.038177490234375,
-0.01189422607421875,
-0.0271148681640625,
-0.06793212890625,
-0.050018310546875,
0.032470703125,
0.01125335693359375,
0.055999755859375,
0.03570556640625,
0.03912353515625,
0.0313720703125,
-0.0118408203125,
-0.005218505859375,
-0.0183563232421875,
-0.0211944580078125,
0.0280609130859375,
-0.047821044921875,
-0.047821044921875,
0.03131103515625,
0.049072265625,
0.0289306640625,
-0.0016956329345703125,
0.02825927734375,
0.01163482666015625,
0.0297698974609375,
-0.01739501953125,
0.00412750244140625,
-0.0307464599609375,
0.029327392578125,
-0.0037403106689453125,
0.003864288330078125,
-0.0258026123046875,
-0.031951904296875,
0.00621795654296875,
-0.041107177734375,
0.041046142578125,
0.00107574462890625,
0.0992431640625,
0.0079345703125,
-0.0027713775634765625,
-0.0004901885986328125,
-0.025634765625,
0.05450439453125,
-0.0662841796875,
0.01214599609375,
0.028045654296875,
0.023773193359375,
-0.0058441162109375,
-0.07464599609375,
-0.039459228515625,
-0.004810333251953125,
-0.012786865234375,
0.000965118408203125,
-0.0238494873046875,
0.007030487060546875,
0.0265960693359375,
0.0276336669921875,
-0.043914794921875,
0.0020313262939453125,
-0.046173095703125,
-0.01248931884765625,
0.035858154296875,
0.00997161865234375,
0.0260162353515625,
-0.008026123046875,
-0.0313720703125,
-0.03326416015625,
-0.024078369140625,
0.00783538818359375,
0.039581298828125,
0.00756072998046875,
-0.048797607421875,
0.036712646484375,
0.017608642578125,
0.0374755859375,
0.01123046875,
-0.02215576171875,
0.05133056640625,
0.01056671142578125,
-0.035888671875,
-0.01470947265625,
0.08697509765625,
0.03094482421875,
0.0167999267578125,
-0.00286102294921875,
-0.0097808837890625,
-0.0195159912109375,
-0.00662994384765625,
-0.09710693359375,
-0.023956298828125,
0.027587890625,
-0.03912353515625,
-0.035247802734375,
0.01934814453125,
-0.0426025390625,
-0.0017948150634765625,
-0.004337310791015625,
0.05401611328125,
-0.03021240234375,
-0.0233001708984375,
0.00004464387893676758,
-0.0187530517578125,
0.021484375,
0.0108184814453125,
-0.0439453125,
0.025390625,
0.0290069580078125,
0.09423828125,
0.00827789306640625,
-0.0341796875,
-0.0233612060546875,
-0.0223388671875,
-0.021484375,
0.028564453125,
-0.01461029052734375,
-0.022613525390625,
-0.0248870849609375,
0.028900146484375,
-0.0203704833984375,
-0.056976318359375,
0.017425537109375,
-0.0175933837890625,
0.03607177734375,
0.00959014892578125,
-0.0192718505859375,
-0.032806396484375,
0.0224151611328125,
-0.0293426513671875,
0.08868408203125,
0.0279388427734375,
-0.07183837890625,
0.0263214111328125,
-0.042083740234375,
-0.005523681640625,
-0.0140228271484375,
-0.003917694091796875,
-0.08013916015625,
-0.006572723388671875,
0.01039886474609375,
0.0506591796875,
-0.01477813720703125,
0.01629638671875,
-0.049652099609375,
-0.0267486572265625,
0.0252227783203125,
-0.0038509368896484375,
0.06842041015625,
0.01308441162109375,
-0.020294189453125,
0.0218505859375,
-0.04412841796875,
0.01380157470703125,
0.034027099609375,
-0.01251983642578125,
0.0038585662841796875,
-0.044677734375,
0.020843505859375,
0.0276031494140625,
-0.0021457672119140625,
-0.039886474609375,
0.0279998779296875,
-0.01351165771484375,
0.03265380859375,
0.04498291015625,
-0.00861358642578125,
0.020538330078125,
-0.034454345703125,
0.039764404296875,
0.0229949951171875,
0.02825927734375,
0.010009765625,
-0.04986572265625,
-0.051422119140625,
-0.03668212890625,
0.0276031494140625,
0.028961181640625,
-0.038238525390625,
0.02886962890625,
-0.00959014892578125,
-0.0518798828125,
-0.040771484375,
0.0046844482421875,
0.0297088623046875,
0.038726806640625,
0.025787353515625,
-0.03607177734375,
-0.037109375,
-0.06842041015625,
0.0035076141357421875,
0.0078887939453125,
-0.00897216796875,
0.0268096923828125,
0.05535888671875,
-0.028564453125,
0.05364990234375,
-0.032623291015625,
-0.018035888671875,
-0.0230712890625,
0.02362060546875,
0.04901123046875,
0.057525634765625,
0.07427978515625,
-0.0357666015625,
-0.04229736328125,
-0.009490966796875,
-0.06842041015625,
0.005855560302734375,
0.0091552734375,
-0.015899658203125,
0.0170440673828125,
0.0045013427734375,
-0.04931640625,
0.042205810546875,
0.031494140625,
-0.028900146484375,
0.0390625,
-0.02044677734375,
0.01727294921875,
-0.0926513671875,
0.0123291015625,
0.022705078125,
-0.01654052734375,
-0.036407470703125,
-0.004909515380859375,
-0.0014638900756835938,
-0.0054931640625,
-0.035919189453125,
0.05572509765625,
-0.03277587890625,
-0.00714111328125,
-0.023956298828125,
-0.0160980224609375,
0.0031986236572265625,
0.048126220703125,
-0.013397216796875,
0.03631591796875,
0.0626220703125,
-0.0300750732421875,
0.0286407470703125,
0.030029296875,
-0.0100250244140625,
0.033538818359375,
-0.05810546875,
0.0185546875,
-0.004730224609375,
0.01436614990234375,
-0.07403564453125,
-0.0242767333984375,
0.038970947265625,
-0.04833984375,
0.049041748046875,
-0.0576171875,
-0.033843994140625,
-0.042327880859375,
-0.035369873046875,
0.034393310546875,
0.055023193359375,
-0.059814453125,
0.042510986328125,
0.013916015625,
0.01129150390625,
-0.04443359375,
-0.06591796875,
-0.0198211669921875,
-0.029449462890625,
-0.054046630859375,
0.0211334228515625,
0.00186920166015625,
0.007099151611328125,
0.00881195068359375,
-0.01361846923828125,
-0.0060577392578125,
-0.01284027099609375,
0.039154052734375,
0.037139892578125,
-0.0179595947265625,
-0.0189361572265625,
-0.01678466796875,
-0.01151275634765625,
0.01435089111328125,
-0.01447296142578125,
0.046142578125,
-0.01168060302734375,
-0.010223388671875,
-0.055908203125,
-0.0048675537109375,
0.042694091796875,
0.007076263427734375,
0.0673828125,
0.07904052734375,
-0.03765869140625,
-0.0016717910766601562,
-0.031524658203125,
-0.0204620361328125,
-0.038055419921875,
0.02734375,
-0.0283203125,
-0.029449462890625,
0.07305908203125,
0.01380157470703125,
-0.0013532638549804688,
0.049713134765625,
0.019073486328125,
0.002811431884765625,
0.082275390625,
0.03741455078125,
0.0031909942626953125,
0.047119140625,
-0.0716552734375,
-0.01519012451171875,
-0.06549072265625,
-0.0300750732421875,
-0.032623291015625,
-0.04449462890625,
-0.0447998046875,
-0.0237579345703125,
0.0304718017578125,
0.0082244873046875,
-0.032745361328125,
0.02667236328125,
-0.055023193359375,
0.01488494873046875,
0.044586181640625,
0.02960205078125,
-0.021209716796875,
0.019317626953125,
-0.0313720703125,
0.0041046142578125,
-0.057464599609375,
-0.0214996337890625,
0.07928466796875,
0.03558349609375,
0.062469482421875,
-0.010467529296875,
0.05035400390625,
-0.019683837890625,
0.0406494140625,
-0.048583984375,
0.037017822265625,
-0.0027103424072265625,
-0.036712646484375,
0.01178741455078125,
-0.040130615234375,
-0.0706787109375,
0.024383544921875,
-0.0212554931640625,
-0.06292724609375,
0.00791168212890625,
0.0171966552734375,
-0.0024871826171875,
0.060028076171875,
-0.058746337890625,
0.076904296875,
-0.00600433349609375,
-0.03521728515625,
0.01270294189453125,
-0.04315185546875,
0.0239105224609375,
0.01898193359375,
-0.0037937164306640625,
-0.01302337646484375,
0.0234222412109375,
0.08160400390625,
-0.044189453125,
0.0633544921875,
-0.04290771484375,
0.02783203125,
0.0265045166015625,
-0.00313568115234375,
0.0282440185546875,
-0.0034656524658203125,
-0.0002371072769165039,
0.02337646484375,
0.01140594482421875,
-0.0272979736328125,
-0.0361328125,
0.05169677734375,
-0.0831298828125,
-0.0255279541015625,
-0.03851318359375,
-0.039642333984375,
0.0034122467041015625,
0.0024166107177734375,
0.043792724609375,
0.0513916015625,
0.01326751708984375,
0.01355743408203125,
0.03533935546875,
-0.01904296875,
0.0290679931640625,
-0.0012750625610351562,
-0.0361328125,
-0.051361083984375,
0.0511474609375,
0.01068115234375,
0.0159759521484375,
-0.00257110595703125,
0.01837158203125,
-0.022918701171875,
-0.0390625,
-0.0238494873046875,
0.0367431640625,
-0.0537109375,
-0.0333251953125,
-0.04864501953125,
-0.03662109375,
-0.02459716796875,
-0.006191253662109375,
-0.0276336669921875,
-0.0234222412109375,
-0.0291595458984375,
0.00911712646484375,
0.04754638671875,
0.048126220703125,
-0.0049896240234375,
0.0445556640625,
-0.048370361328125,
0.00685882568359375,
0.007038116455078125,
0.046783447265625,
-0.00656890869140625,
-0.07177734375,
-0.01250457763671875,
-0.0005807876586914062,
-0.041259765625,
-0.058807373046875,
0.04400634765625,
0.010009765625,
0.04498291015625,
0.0313720703125,
-0.02764892578125,
0.057281494140625,
-0.006351470947265625,
0.03460693359375,
0.028717041015625,
-0.04461669921875,
0.047119140625,
-0.007091522216796875,
0.019317626953125,
0.005695343017578125,
0.03570556640625,
-0.0260467529296875,
-0.018463134765625,
-0.07659912109375,
-0.05908203125,
0.07208251953125,
0.0163116455078125,
0.0017271041870117188,
0.0229034423828125,
0.0494384765625,
0.0086822509765625,
0.0023860931396484375,
-0.06451416015625,
-0.03277587890625,
-0.0233612060546875,
-0.0262908935546875,
-0.01154327392578125,
-0.0030956268310546875,
-0.012969970703125,
-0.050262451171875,
0.06268310546875,
-0.011474609375,
0.052947998046875,
0.017669677734375,
0.0016641616821289062,
-0.0162506103515625,
-0.0185089111328125,
0.03460693359375,
0.029205322265625,
-0.037445068359375,
0.0031299591064453125,
0.01406097412109375,
-0.04876708984375,
0.002849578857421875,
0.01538848876953125,
0.0029754638671875,
0.0070343017578125,
0.022308349609375,
0.0714111328125,
0.00199127197265625,
0.00688934326171875,
0.0297393798828125,
-0.01012420654296875,
-0.044677734375,
-0.017364501953125,
0.01641845703125,
-0.00330352783203125,
0.0291748046875,
0.0240325927734375,
0.02972412109375,
-0.005847930908203125,
-0.018280029296875,
0.007793426513671875,
0.04736328125,
-0.0229034423828125,
-0.0386962890625,
0.065185546875,
-0.0245361328125,
-0.004589080810546875,
0.05889892578125,
-0.00496673583984375,
-0.032928466796875,
0.0758056640625,
0.043304443359375,
0.0772705078125,
-0.00580596923828125,
0.0083160400390625,
0.06268310546875,
0.012115478515625,
-0.00691986083984375,
0.0127716064453125,
0.0139617919921875,
-0.056976318359375,
0.0031833648681640625,
-0.04052734375,
-0.017181396484375,
0.030853271484375,
-0.047149658203125,
0.0260009765625,
-0.052398681640625,
-0.035369873046875,
0.01308441162109375,
0.01837158203125,
-0.06298828125,
0.00946807861328125,
-0.0006775856018066406,
0.06378173828125,
-0.06396484375,
0.057281494140625,
0.063720703125,
-0.03155517578125,
-0.0667724609375,
-0.020172119140625,
0.00499725341796875,
-0.06549072265625,
0.033538818359375,
0.035614013671875,
0.003910064697265625,
0.002086639404296875,
-0.07012939453125,
-0.041748046875,
0.10723876953125,
0.03485107421875,
-0.005992889404296875,
0.02197265625,
-0.00228118896484375,
0.020050048828125,
-0.03265380859375,
0.04315185546875,
0.009521484375,
0.027130126953125,
0.044281005859375,
-0.050048828125,
0.010009765625,
-0.031494140625,
0.007541656494140625,
0.0227203369140625,
-0.04315185546875,
0.06866455078125,
-0.03240966796875,
-0.00991058349609375,
0.00988006591796875,
0.0399169921875,
0.0245208740234375,
0.01532745361328125,
0.0307464599609375,
0.06475830078125,
0.040679931640625,
-0.0286407470703125,
0.070556640625,
-0.00148773193359375,
0.06085205078125,
0.04681396484375,
0.01526641845703125,
0.03778076171875,
0.0302276611328125,
-0.0234222412109375,
0.0255889892578125,
0.068115234375,
-0.03277587890625,
0.026336669921875,
0.0111083984375,
-0.0027637481689453125,
-0.0054931640625,
0.0189208984375,
-0.039398193359375,
0.0224609375,
0.013458251953125,
-0.040283203125,
-0.018585205078125,
0.0095977783203125,
0.0153350830078125,
-0.039794921875,
-0.0188751220703125,
0.034698486328125,
-0.0014009475708007812,
-0.029541015625,
0.058990478515625,
0.005535125732421875,
0.0667724609375,
-0.03912353515625,
0.0012340545654296875,
-0.0254058837890625,
0.0284576416015625,
-0.028564453125,
-0.06329345703125,
0.0230255126953125,
-0.01561737060546875,
-0.002971649169921875,
0.006755828857421875,
0.047943115234375,
-0.026641845703125,
-0.04010009765625,
0.0210418701171875,
0.0183563232421875,
0.03411865234375,
0.00634765625,
-0.081787109375,
0.0126495361328125,
-0.00803375244140625,
-0.03912353515625,
0.029296875,
0.01898193359375,
0.01934814453125,
0.05780029296875,
0.040496826171875,
-0.0136260986328125,
0.00829315185546875,
-0.0008945465087890625,
0.06341552734375,
-0.036163330078125,
-0.024658203125,
-0.058258056640625,
0.03973388671875,
-0.0011005401611328125,
-0.0350341796875,
0.03765869140625,
0.05133056640625,
0.0648193359375,
-0.01422119140625,
0.039031982421875,
-0.0179443359375,
-0.0008921623229980469,
-0.029876708984375,
0.052764892578125,
-0.0526123046875,
-0.006671905517578125,
-0.0161590576171875,
-0.05670166015625,
-0.0307159423828125,
0.05780029296875,
-0.02435302734375,
0.04290771484375,
0.033935546875,
0.07415771484375,
-0.0396728515625,
-0.01145172119140625,
0.0152740478515625,
0.0192718505859375,
0.013336181640625,
0.027496337890625,
0.038726806640625,
-0.0623779296875,
0.0240325927734375,
-0.042083740234375,
-0.013153076171875,
-0.0115966796875,
-0.050689697265625,
-0.08258056640625,
-0.0484619140625,
-0.050933837890625,
-0.0494384765625,
-0.0139312744140625,
0.07086181640625,
0.07183837890625,
-0.057708740234375,
-0.0157012939453125,
0.00966644287109375,
0.00472259521484375,
-0.0154266357421875,
-0.021453857421875,
0.02862548828125,
-0.012603759765625,
-0.061676025390625,
-0.0244903564453125,
-0.0036678314208984375,
0.041961669921875,
0.007564544677734375,
-0.022705078125,
-0.00962066650390625,
-0.0172576904296875,
0.01493072509765625,
0.0234222412109375,
-0.043701171875,
-0.0227508544921875,
-0.01873779296875,
-0.00860595703125,
0.04010009765625,
0.02459716796875,
-0.04071044921875,
0.018310546875,
0.0286407470703125,
0.009185791015625,
0.071533203125,
-0.022064208984375,
0.0048370361328125,
-0.0662841796875,
0.03350830078125,
-0.00958251953125,
0.04644775390625,
0.0244293212890625,
-0.017578125,
0.041534423828125,
0.034515380859375,
-0.038543701171875,
-0.05914306640625,
-0.0188140869140625,
-0.08453369140625,
-0.00237274169921875,
0.06658935546875,
-0.016571044921875,
-0.034454345703125,
0.03564453125,
-0.00978851318359375,
0.035064697265625,
-0.001796722412109375,
0.031280517578125,
0.025787353515625,
-0.008270263671875,
-0.04541015625,
-0.046112060546875,
0.0225830078125,
-0.0009860992431640625,
-0.0477294921875,
-0.0386962890625,
0.00399017333984375,
0.0667724609375,
0.01230621337890625,
0.033233642578125,
-0.021331787109375,
0.008148193359375,
0.003276824951171875,
0.0419921875,
-0.0297393798828125,
-0.0096282958984375,
-0.0220947265625,
0.0077362060546875,
-0.004703521728515625,
-0.04510498046875
]
] |
microsoft/biogpt | 2023-02-03T09:28:24.000Z | [
"transformers",
"pytorch",
"biogpt",
"text-generation",
"en",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] | text-generation | microsoft | null | null | microsoft/biogpt | 175 | 36,333 | transformers | 2022-11-20T13:20:45 | ---
language: en
license: mit
widget:
- text: "COVID-19 is"
---
## BioGPT
Pre-trained language models have attracted increasing attention in the biomedical domain, inspired by their great success in the general natural language domain. Among the two main branches of pre-trained language models in the general language domain, i.e. BERT (and its variants) and GPT (and its variants), the first one has been extensively studied in the biomedical domain, such as BioBERT and PubMedBERT. While they have achieved great success on a variety of discriminative downstream biomedical tasks, the lack of generation ability constrains their application scope. In this paper, we propose BioGPT, a domain-specific generative Transformer language model pre-trained on large-scale biomedical literature. We evaluate BioGPT on six biomedical natural language processing tasks and demonstrate that our model outperforms previous models on most tasks. Especially, we get 44.98%, 38.42% and 40.76% F1 score on BC5CDR, KD-DTI and DDI end-to-end relation extraction tasks, respectively, and 78.2% accuracy on PubMedQA, creating a new record. Our case study on text generation further demonstrates the advantage of BioGPT on biomedical literature to generate fluent descriptions for biomedical terms.
You can use this model directly with a pipeline for text generation. Since the generation relies on some randomness, we
set a seed for reproducibility:
```python
>>> from transformers import pipeline, set_seed
>>> from transformers import BioGptTokenizer, BioGptForCausalLM
>>> model = BioGptForCausalLM.from_pretrained("microsoft/biogpt")
>>> tokenizer = BioGptTokenizer.from_pretrained("microsoft/biogpt")
>>> generator = pipeline('text-generation', model=model, tokenizer=tokenizer)
>>> set_seed(42)
>>> generator("COVID-19 is", max_length=20, num_return_sequences=5, do_sample=True)
[{'generated_text': 'COVID-19 is a disease that spreads worldwide and is currently found in a growing proportion of the population'},
{'generated_text': 'COVID-19 is one of the largest viral epidemics in the world.'},
{'generated_text': 'COVID-19 is a common condition affecting an estimated 1.1 million people in the United States alone.'},
{'generated_text': 'COVID-19 is a pandemic, the incidence has been increased in a manner similar to that in other'},
{'generated_text': 'COVID-19 is transmitted via droplets, air-borne, or airborne transmission.'}]
```
Here is how to use this model to get the features of a given text in PyTorch:
```python
from transformers import BioGptTokenizer, BioGptForCausalLM
tokenizer = BioGptTokenizer.from_pretrained("microsoft/biogpt")
model = BioGptForCausalLM.from_pretrained("microsoft/biogpt")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
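If you need one fixed-size embedding per sentence rather than per-token features, a common model-agnostic approach is masked mean pooling over the hidden states. The sketch below is illustrative only (it is not an API prescribed by this card): a dummy tensor stands in for `last_hidden_state`, which in practice you would obtain from e.g. `BioGptModel`; BioGPT's hidden size is 1024.

```python
import torch

# Masked mean pooling: average token features, ignoring padding positions.
batch, seq_len, hidden = 2, 7, 1024
last_hidden_state = torch.randn(batch, seq_len, hidden)  # stand-in tensor
attention_mask = torch.tensor([[1] * 7, [1] * 4 + [0] * 3])  # row 2 is padded

mask = attention_mask.unsqueeze(-1).float()      # (2, 7, 1)
summed = (last_hidden_state * mask).sum(dim=1)   # (2, 1024)
counts = mask.sum(dim=1).clamp(min=1)            # (2, 1), avoid divide-by-zero
embedding = summed / counts                      # (2, 1024)
assert embedding.shape == (batch, hidden)
```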
Beam-search decoding:
```python
import torch
from transformers import BioGptTokenizer, BioGptForCausalLM, set_seed
tokenizer = BioGptTokenizer.from_pretrained("microsoft/biogpt")
model = BioGptForCausalLM.from_pretrained("microsoft/biogpt")
sentence = "COVID-19 is"
inputs = tokenizer(sentence, return_tensors="pt")
set_seed(42)
with torch.no_grad():
    beam_output = model.generate(
        **inputs,
        min_length=100,
        max_length=1024,
        num_beams=5,
        early_stopping=True,
    )
tokenizer.decode(beam_output[0], skip_special_tokens=True)
'COVID-19 is a global pandemic caused by severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), the causative agent of coronavirus disease 2019 (COVID-19), which has spread to more than 200 countries and territories, including the United States (US), Canada, Australia, New Zealand, the United Kingdom (UK), and the United States of America (USA), as of March 11, 2020, with more than 800,000 confirmed cases and more than 800,000 deaths.'
```
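Unlike sampling, beam search keeps the `num_beams` highest-scoring partial sequences at every step instead of committing greedily to the single most likely next token. A toy, model-independent illustration (the transition probabilities below are made up purely for demonstration):

```python
import math

# Toy beam search over a tiny made-up next-token distribution.
probs = {
    "": {"a": 0.55, "b": 0.45},   # first-token distribution
    "a": {"a": 0.5, "b": 0.5},    # distribution after token "a"
    "b": {"a": 0.05, "b": 0.95},  # distribution after token "b"
}

def beam_search(steps, num_beams=2):
    beams = [("", 0.0)]  # (sequence, cumulative log-probability)
    for _ in range(steps):
        candidates = []
        for seq, score in beams:
            for tok, p in probs[seq[-1:]].items():
                candidates.append((seq + tok, score + math.log(p)))
        # Keep only the num_beams best-scoring candidates.
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:num_beams]
    return beams

best_seq, _ = beam_search(steps=2)[0]
# Greedy decoding would commit to "a" first, but the beam also keeps "b"
# and finds the higher-probability sequence "bb" (0.45*0.95 > 0.55*0.5).
assert best_seq == "bb"
```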
## Citation
If you find BioGPT useful in your research, please cite the following paper:
```bibtex
@article{10.1093/bib/bbac409,
author = {Luo, Renqian and Sun, Liai and Xia, Yingce and Qin, Tao and Zhang, Sheng and Poon, Hoifung and Liu, Tie-Yan},
title = "{BioGPT: generative pre-trained transformer for biomedical text generation and mining}",
journal = {Briefings in Bioinformatics},
volume = {23},
number = {6},
year = {2022},
month = {09},
abstract = "{Pre-trained language models have attracted increasing attention in the biomedical domain, inspired by their great success in the general natural language domain. Among the two main branches of pre-trained language models in the general language domain, i.e. BERT (and its variants) and GPT (and its variants), the first one has been extensively studied in the biomedical domain, such as BioBERT and PubMedBERT. While they have achieved great success on a variety of discriminative downstream biomedical tasks, the lack of generation ability constrains their application scope. In this paper, we propose BioGPT, a domain-specific generative Transformer language model pre-trained on large-scale biomedical literature. We evaluate BioGPT on six biomedical natural language processing tasks and demonstrate that our model outperforms previous models on most tasks. Especially, we get 44.98\%, 38.42\% and 40.76\% F1 score on BC5CDR, KD-DTI and DDI end-to-end relation extraction tasks, respectively, and 78.2\% accuracy on PubMedQA, creating a new record. Our case study on text generation further demonstrates the advantage of BioGPT on biomedical literature to generate fluent descriptions for biomedical terms.}",
issn = {1477-4054},
doi = {10.1093/bib/bbac409},
url = {https://doi.org/10.1093/bib/bbac409},
note = {bbac409},
eprint = {https://academic.oup.com/bib/article-pdf/23/6/bbac409/47144271/bbac409.pdf},
}
```
| 5,916 | [
[
-0.0165252685546875,
-0.06439208984375,
0.027984619140625,
0.0107879638671875,
-0.033050537109375,
0.00130462646484375,
-0.0028324127197265625,
-0.0377197265625,
0.001194000244140625,
0.01082611083984375,
-0.04010009765625,
-0.0440673828125,
-0.04644775390625,
0.024993896484375,
-0.0162506103515625,
0.1092529296875,
0.010650634765625,
0.004894256591796875,
-0.016357421875,
0.0016222000122070312,
-0.00896453857421875,
-0.04595947265625,
-0.03662109375,
-0.0501708984375,
0.02301025390625,
-0.0245361328125,
0.04345703125,
0.0299224853515625,
0.05487060546875,
0.019195556640625,
-0.01392364501953125,
-0.0015506744384765625,
-0.028045654296875,
-0.007732391357421875,
-0.0190277099609375,
-0.006961822509765625,
-0.049224853515625,
-0.0005006790161132812,
0.0245819091796875,
0.04425048828125,
0.00325775146484375,
0.0033702850341796875,
0.00258636474609375,
0.0257110595703125,
0.0009860992431640625,
0.0138702392578125,
-0.0430908203125,
0.004703521728515625,
-0.007122039794921875,
-0.01055908203125,
-0.0287628173828125,
-0.01192474365234375,
0.0285797119140625,
-0.0311431884765625,
0.062164306640625,
-0.005123138427734375,
0.09600830078125,
0.01043701171875,
-0.016082763671875,
0.0021533966064453125,
-0.03765869140625,
0.0443115234375,
-0.060577392578125,
0.01947021484375,
0.00829315185546875,
-0.0044097900390625,
-0.00699615478515625,
-0.091796875,
-0.032989501953125,
-0.02484130859375,
-0.0197906494140625,
0.02520751953125,
-0.038787841796875,
0.017578125,
0.00920867919921875,
0.018829345703125,
-0.07684326171875,
-0.01404571533203125,
-0.041595458984375,
-0.0224761962890625,
0.043243408203125,
-0.00424957275390625,
0.036865234375,
-0.03466796875,
-0.026336669921875,
-0.0124053955078125,
-0.047393798828125,
-0.01100921630859375,
-0.0253753662109375,
0.005039215087890625,
-0.0079498291015625,
0.0291290283203125,
-0.000021457672119140625,
0.045196533203125,
0.009735107421875,
0.0056304931640625,
0.03692626953125,
-0.05267333984375,
-0.0301666259765625,
-0.0020694732666015625,
0.08837890625,
-0.001194000244140625,
0.007213592529296875,
-0.0190887451171875,
0.01287841796875,
0.00501251220703125,
0.031005859375,
-0.0946044921875,
-0.030364990234375,
0.0269622802734375,
-0.041961669921875,
-0.0205078125,
-0.01241302490234375,
-0.0645751953125,
-0.0037994384765625,
0.01059722900390625,
0.0450439453125,
-0.06329345703125,
0.0025787353515625,
0.019622802734375,
-0.0160980224609375,
0.017181396484375,
0.0002111196517944336,
-0.059844970703125,
-0.015106201171875,
0.033233642578125,
0.0697021484375,
-0.00026226043701171875,
-0.01477813720703125,
-0.030517578125,
0.0027103424072265625,
0.008514404296875,
0.04608154296875,
-0.01885986328125,
-0.0299224853515625,
-0.0195770263671875,
0.00394439697265625,
-0.01073455810546875,
-0.024566650390625,
0.030914306640625,
-0.028045654296875,
0.0312347412109375,
-0.0011959075927734375,
-0.02490234375,
-0.0118255615234375,
-0.005809783935546875,
-0.026397705078125,
0.0576171875,
0.011505126953125,
-0.06890869140625,
0.031829833984375,
-0.053924560546875,
-0.04931640625,
0.033050537109375,
-0.0213775634765625,
-0.0736083984375,
-0.01229095458984375,
0.03369140625,
0.04962158203125,
-0.019439697265625,
0.04559326171875,
-0.0008668899536132812,
-0.00628662109375,
0.004886627197265625,
-0.022796630859375,
0.06805419921875,
0.0146942138671875,
-0.049957275390625,
0.0123291015625,
-0.051239013671875,
0.005146026611328125,
0.0108795166015625,
-0.0312347412109375,
-0.000690460205078125,
0.0027942657470703125,
0.016754150390625,
0.01018524169921875,
0.01094818115234375,
-0.0537109375,
0.0202789306640625,
-0.055816650390625,
0.0635986328125,
0.0261993408203125,
0.00682830810546875,
0.0164794921875,
-0.01509857177734375,
0.04034423828125,
-0.00411224365234375,
0.0079498291015625,
-0.00968170166015625,
-0.05780029296875,
-0.0178070068359375,
-0.041900634765625,
0.01561737060546875,
0.052886962890625,
-0.060882568359375,
0.055938720703125,
-0.03485107421875,
-0.04913330078125,
-0.03814697265625,
-0.0045318603515625,
0.0396728515625,
0.041229248046875,
0.056884765625,
-0.01285552978515625,
-0.04345703125,
-0.058990478515625,
-0.01047515869140625,
-0.01171875,
-0.0093536376953125,
0.01442718505859375,
0.055328369140625,
-0.06378173828125,
0.082275390625,
-0.041839599609375,
-0.01184844970703125,
-0.0367431640625,
0.026763916015625,
0.022186279296875,
0.04168701171875,
0.041900634765625,
-0.056976318359375,
-0.02569580078125,
0.0126800537109375,
-0.07098388671875,
-0.00986480712890625,
-0.00786590576171875,
-0.00586700439453125,
0.0038547515869140625,
0.026397705078125,
-0.061767578125,
0.037078857421875,
0.037994384765625,
-0.030364990234375,
0.034698486328125,
-0.041015625,
-0.006443023681640625,
-0.10479736328125,
0.039031982421875,
-0.00235748291015625,
-0.0201416015625,
-0.05859375,
0.004119873046875,
-0.01035308837890625,
-0.022979736328125,
-0.024383544921875,
0.0360107421875,
-0.0179443359375,
0.029632568359375,
0.00001531839370727539,
-0.0037174224853515625,
0.016326904296875,
0.019195556640625,
0.00954437255859375,
0.06060791015625,
0.02764892578125,
-0.036102294921875,
0.02490234375,
0.054046630859375,
-0.019012451171875,
0.0032329559326171875,
-0.07708740234375,
0.00485992431640625,
-0.02581787109375,
0.0386962890625,
-0.052978515625,
-0.010986328125,
0.0158233642578125,
-0.047393798828125,
0.020172119140625,
0.00791168212890625,
-0.0199127197265625,
-0.04388427734375,
-0.0113677978515625,
0.02655029296875,
0.06280517578125,
-0.03302001953125,
0.033294677734375,
-0.0008087158203125,
-0.01715087890625,
-0.037811279296875,
-0.05841064453125,
-0.00243377685546875,
-0.00519561767578125,
-0.065185546875,
0.06134033203125,
-0.006134033203125,
0.017852783203125,
-0.005367279052734375,
0.01065826416015625,
0.0015468597412109375,
-0.0130462646484375,
0.01003265380859375,
0.0269622802734375,
-0.01226806640625,
0.018035888671875,
0.004650115966796875,
-0.00322723388671875,
-0.005443572998046875,
-0.0266876220703125,
0.056121826171875,
-0.01446533203125,
-0.004665374755859375,
-0.012603759765625,
0.0306396484375,
0.027557373046875,
-0.03466796875,
0.061431884765625,
0.0833740234375,
-0.0180206298828125,
0.006717681884765625,
-0.0181732177734375,
-0.021148681640625,
-0.0369873046875,
0.07086181640625,
-0.01210784912109375,
-0.05633544921875,
0.0232696533203125,
0.0017099380493164062,
-0.0096282958984375,
0.050567626953125,
0.0562744140625,
0.0029582977294921875,
0.0645751953125,
0.04522705078125,
-0.003322601318359375,
0.016326904296875,
-0.032989501953125,
0.038787841796875,
-0.04644775390625,
-0.0211029052734375,
-0.032623291015625,
0.00693511962890625,
-0.0450439453125,
-0.027313232421875,
0.050567626953125,
0.016204833984375,
-0.02252197265625,
0.034271240234375,
-0.059539794921875,
-0.0025463104248046875,
0.040435791015625,
0.035003662109375,
-0.005832672119140625,
0.017578125,
-0.025634765625,
0.0029010772705078125,
-0.07354736328125,
-0.032196044921875,
0.06610107421875,
0.0177154541015625,
0.043243408203125,
-0.007068634033203125,
0.06304931640625,
0.00414276123046875,
0.02899169921875,
-0.031646728515625,
0.032196044921875,
-0.0309295654296875,
-0.04925537109375,
-0.0225677490234375,
-0.0280914306640625,
-0.1031494140625,
-0.0004131793975830078,
-0.0165557861328125,
-0.054168701171875,
0.0154571533203125,
0.01334381103515625,
-0.038543701171875,
0.01348114013671875,
-0.038604736328125,
0.0697021484375,
-0.0093536376953125,
-0.03656005859375,
0.003032684326171875,
-0.06402587890625,
0.0057830810546875,
-0.01493072509765625,
0.007541656494140625,
0.0104217529296875,
0.0297393798828125,
0.05877685546875,
-0.038055419921875,
0.0679931640625,
-0.005062103271484375,
0.01218414306640625,
0.022857666015625,
-0.0254669189453125,
0.0307464599609375,
0.00867462158203125,
0.0030841827392578125,
0.0208587646484375,
0.0117645263671875,
-0.026336669921875,
-0.0013418197631835938,
0.0164794921875,
-0.06134033203125,
-0.0411376953125,
-0.040008544921875,
-0.03045654296875,
0.004199981689453125,
0.0220794677734375,
0.0692138671875,
0.035797119140625,
-0.0267333984375,
0.022247314453125,
0.0526123046875,
-0.0675048828125,
0.030364990234375,
0.020538330078125,
-0.01248931884765625,
-0.037872314453125,
0.0640869140625,
-0.01187896728515625,
0.01227569580078125,
0.042449951171875,
0.021820068359375,
-0.0207672119140625,
-0.040435791015625,
-0.0289764404296875,
0.0301666259765625,
-0.03826904296875,
-0.038238525390625,
-0.071533203125,
-0.04644775390625,
-0.053680419921875,
-0.0182647705078125,
0.000255584716796875,
0.0108184814453125,
-0.034454345703125,
-0.0034008026123046875,
0.050872802734375,
0.0355224609375,
-0.00634002685546875,
0.0219268798828125,
-0.08392333984375,
0.037017822265625,
0.024810791015625,
0.0075531005859375,
-0.0245819091796875,
-0.038604736328125,
-0.0150604248046875,
0.00861358642578125,
-0.03271484375,
-0.06549072265625,
0.06671142578125,
0.039306640625,
0.037261962890625,
0.00870513916015625,
-0.0185546875,
0.0307464599609375,
-0.052825927734375,
0.06854248046875,
0.0111236572265625,
-0.06500244140625,
0.050140380859375,
-0.0384521484375,
0.0291900634765625,
0.024261474609375,
0.02911376953125,
-0.031524658203125,
-0.0239715576171875,
-0.057952880859375,
-0.0758056640625,
0.0367431640625,
0.03082275390625,
0.00841522216796875,
-0.00925445556640625,
0.028839111328125,
0.004314422607421875,
0.0096435546875,
-0.0692138671875,
-0.016204833984375,
-0.0298614501953125,
-0.0295562744140625,
-0.013946533203125,
-0.012664794921875,
0.004634857177734375,
-0.039276123046875,
0.048919677734375,
-0.0018796920776367188,
0.065185546875,
0.047576904296875,
-0.007167816162109375,
0.0103759765625,
0.0310211181640625,
0.042083740234375,
0.030303955078125,
-0.0283660888671875,
-0.0015468597412109375,
-0.0003402233123779297,
-0.06402587890625,
0.006984710693359375,
0.043731689453125,
-0.0227813720703125,
0.01303863525390625,
0.044677734375,
0.05645751953125,
-0.00797271728515625,
-0.041534423828125,
0.036224365234375,
0.0193023681640625,
-0.0272064208984375,
0.0091705322265625,
-0.01153564453125,
0.0120086669921875,
0.01073455810546875,
0.0238800048828125,
0.0125885009765625,
0.00920867919921875,
-0.02655029296875,
0.03515625,
0.0384521484375,
-0.00855255126953125,
-0.031158447265625,
0.08648681640625,
0.0195770263671875,
-0.01047515869140625,
0.052825927734375,
-0.01207733154296875,
-0.05560302734375,
0.04486083984375,
0.058929443359375,
0.0706787109375,
-0.020050048828125,
0.0302276611328125,
0.0285186767578125,
0.017578125,
0.00627899169921875,
0.01383209228515625,
0.01806640625,
-0.059844970703125,
-0.04266357421875,
-0.0699462890625,
0.00806427001953125,
0.020599365234375,
-0.028411865234375,
0.01154327392578125,
-0.010711669921875,
-0.0300140380859375,
0.0256195068359375,
-0.006015777587890625,
-0.062744140625,
0.0220794677734375,
0.0010843276977539062,
0.035186767578125,
-0.065185546875,
0.07318115234375,
0.057037353515625,
-0.05511474609375,
-0.046966552734375,
-0.004306793212890625,
-0.0355224609375,
-0.062164306640625,
0.06463623046875,
0.004302978515625,
0.0143585205078125,
0.007183074951171875,
-0.03485107421875,
-0.055694580078125,
0.08563232421875,
0.0233001708984375,
-0.03668212890625,
-0.023590087890625,
0.0231475830078125,
0.0687255859375,
-0.006786346435546875,
0.012115478515625,
0.03265380859375,
0.026580810546875,
-0.0053863525390625,
-0.07073974609375,
0.017303466796875,
-0.0293426513671875,
0.0034656524658203125,
-0.0016307830810546875,
-0.044891357421875,
0.0660400390625,
-0.01171112060546875,
-0.0016651153564453125,
0.0010051727294921875,
0.032440185546875,
0.0272674560546875,
0.0227813720703125,
-0.00862884521484375,
0.02899169921875,
0.0489501953125,
-0.005413055419921875,
0.0968017578125,
-0.0204925537109375,
0.044708251953125,
0.04449462890625,
-0.00007390975952148438,
0.0384521484375,
0.0190277099609375,
-0.040313720703125,
0.03411865234375,
0.0305633544921875,
-0.0192108154296875,
0.039581298828125,
0.0137939453125,
-0.0310211181640625,
-0.003589630126953125,
0.007785797119140625,
-0.05181884765625,
0.02130126953125,
0.02142333984375,
-0.057769775390625,
-0.00872039794921875,
0.0180511474609375,
0.0163421630859375,
-0.0136566162109375,
-0.0033054351806640625,
0.042144775390625,
0.012725830078125,
-0.035369873046875,
0.054168701171875,
0.009735107421875,
0.05633544921875,
-0.054412841796875,
0.0003275871276855469,
-0.0113983154296875,
0.020751953125,
-0.01424407958984375,
-0.02685546875,
0.02301025390625,
0.001148223876953125,
-0.0237579345703125,
-0.0201416015625,
0.05999755859375,
-0.03533935546875,
-0.0469970703125,
0.0177459716796875,
0.033050537109375,
0.012969970703125,
-0.007167816162109375,
-0.057525634765625,
-0.0165863037109375,
0.0197906494140625,
-0.0178985595703125,
0.017913818359375,
0.019012451171875,
0.007686614990234375,
0.038787841796875,
0.03857421875,
0.01264190673828125,
0.007171630859375,
-0.016510009765625,
0.050994873046875,
-0.041015625,
-0.0193328857421875,
-0.076904296875,
0.0295257568359375,
-0.0015192031860351562,
-0.015411376953125,
0.040283203125,
0.041351318359375,
0.032684326171875,
-0.03680419921875,
0.07647705078125,
-0.0194549560546875,
0.0224761962890625,
-0.025604248046875,
0.07861328125,
-0.0161285400390625,
0.0029125213623046875,
-0.019622802734375,
-0.04541015625,
-0.03271484375,
0.05859375,
-0.021728515625,
0.0203704833984375,
0.0650634765625,
0.051910400390625,
0.007503509521484375,
-0.01947021484375,
0.0153045654296875,
0.0288543701171875,
0.0516357421875,
0.041046142578125,
0.0103759765625,
-0.044036865234375,
0.03948974609375,
-0.0092315673828125,
-0.01678466796875,
-0.01055145263671875,
-0.06341552734375,
-0.06732177734375,
-0.03912353515625,
-0.03387451171875,
-0.046112060546875,
0.0209503173828125,
0.07049560546875,
0.0552978515625,
-0.0653076171875,
0.0209808349609375,
-0.030792236328125,
-0.033203125,
-0.0158233642578125,
-0.0175323486328125,
0.06060791015625,
-0.0281829833984375,
-0.040740966796875,
0.0182952880859375,
0.0168609619140625,
0.013702392578125,
0.0018911361694335938,
0.00934600830078125,
-0.0236663818359375,
0.0005617141723632812,
0.06072998046875,
0.025146484375,
-0.047698974609375,
-0.0230865478515625,
0.0015535354614257812,
-0.0222015380859375,
0.00872802734375,
0.03851318359375,
-0.077880859375,
0.0294952392578125,
0.041046142578125,
0.0509033203125,
0.05279541015625,
-0.002460479736328125,
0.053375244140625,
-0.06121826171875,
0.0122528076171875,
0.0207672119140625,
0.0145263671875,
0.0223236083984375,
-0.041259765625,
0.034820556640625,
0.038665771484375,
-0.05810546875,
-0.03839111328125,
-0.00753021240234375,
-0.0845947265625,
-0.021575927734375,
0.08868408203125,
-0.004596710205078125,
-0.005329132080078125,
-0.006137847900390625,
-0.020538330078125,
0.047576904296875,
-0.01165008544921875,
0.056549072265625,
0.016571044921875,
-0.012451171875,
0.01253509521484375,
-0.031402587890625,
0.0584716796875,
0.03350830078125,
-0.0687255859375,
-0.01544189453125,
0.0095672607421875,
0.0272064208984375,
0.01297760009765625,
0.059112548828125,
-0.0036678314208984375,
0.00827789306640625,
-0.00318145751953125,
0.03314208984375,
0.0035076141357421875,
0.0090789794921875,
-0.022735595703125,
0.00521087646484375,
-0.0209808349609375,
-0.00433349609375
]
] |
timm/selecsls42b.in1k | 2023-04-25T00:28:59.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"arxiv:1907.00837",
"license:cc-by-4.0",
"region:us"
] | image-classification | timm | null | null | timm/selecsls42b.in1k | 0 | 36,308 | timm | 2023-04-25T00:28:28 | ---
tags:
- image-classification
- timm
library_name: timm
license: cc-by-4.0
datasets:
- imagenet-1k
---
# Model card for selecsls42b.in1k
A SelecSLS image classification model. Trained on ImageNet-1k by paper authors.
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 32.5
- GMACs: 3.0
- Activations (M): 4.6
- Image size: 224 x 224
- **Papers:**
- XNect: Real-time Multi-Person 3D Motion Capture with a Single RGB Camera: https://arxiv.org/abs/1907.00837
- **Dataset:** ImageNet-1k
- **Original:** https://github.com/mehtadushy/SelecSLS-Pytorch
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('selecsls42b.in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'selecsls42b.in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g.:
# torch.Size([1, 32, 112, 112])
# torch.Size([1, 128, 56, 56])
# torch.Size([1, 288, 28, 28])
# torch.Size([1, 480, 14, 14])
# torch.Size([1, 1024, 7, 7])
print(o.shape)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'selecsls42b.in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 1024, 4, 4) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@article{XNect_SIGGRAPH2020,
author = {Mehta, Dushyant and Sotnychenko, Oleksandr and Mueller, Franziska and Xu, Weipeng and Elgharib, Mohamed and Fua, Pascal and Seidel, Hans-Peter and Rhodin, Helge and Pons-Moll, Gerard and Theobalt, Christian},
title = {{XNect}: Real-time Multi-Person {3D} Motion Capture with a Single {RGB} Camera},
journal = {ACM Transactions on Graphics},
url = {http://gvv.mpi-inf.mpg.de/projects/XNect/},
numpages = {17},
volume={39},
number={4},
month = jul,
year = {2020},
doi={10.1145/3386569.3392410}
}
```
| 3,962 | [
[
-0.03570556640625,
-0.0341796875,
0.01045989990234375,
0.01470184326171875,
-0.02313232421875,
-0.025115966796875,
-0.005825042724609375,
-0.0323486328125,
0.0274810791015625,
0.032135009765625,
-0.043121337890625,
-0.05419921875,
-0.052947998046875,
-0.0079193115234375,
-0.0174560546875,
0.0689697265625,
-0.00896453857421875,
-0.00885772705078125,
-0.006717681884765625,
-0.0413818359375,
-0.002105712890625,
-0.03350830078125,
-0.060089111328125,
-0.024871826171875,
0.024322509765625,
0.013671875,
0.044586181640625,
0.045379638671875,
0.03802490234375,
0.034912109375,
-0.00577545166015625,
0.0019044876098632812,
-0.0168914794921875,
-0.02008056640625,
0.031585693359375,
-0.035369873046875,
-0.040435791015625,
0.01824951171875,
0.058685302734375,
0.031890869140625,
0.0005350112915039062,
0.0286712646484375,
0.00872039794921875,
0.035858154296875,
-0.016754150390625,
0.00008422136306762695,
-0.01297760009765625,
0.0217132568359375,
-0.00716400146484375,
0.00591278076171875,
-0.0098876953125,
-0.036041259765625,
0.021087646484375,
-0.03857421875,
0.041229248046875,
-0.0019664764404296875,
0.09326171875,
0.0125732421875,
-0.0018815994262695312,
-0.0011882781982421875,
-0.01383209228515625,
0.0543212890625,
-0.06512451171875,
0.0222930908203125,
0.0180206298828125,
0.010040283203125,
-0.00022530555725097656,
-0.06292724609375,
-0.044952392578125,
-0.0157012939453125,
-0.00960540771484375,
-0.0014514923095703125,
-0.0008978843688964844,
-0.0034809112548828125,
0.0271759033203125,
0.01727294921875,
-0.040863037109375,
0.0035839080810546875,
-0.0474853515625,
-0.012847900390625,
0.053466796875,
0.00386810302734375,
0.0263824462890625,
-0.0226287841796875,
-0.046661376953125,
-0.041046142578125,
-0.0208282470703125,
0.0229339599609375,
0.0279541015625,
0.0164642333984375,
-0.058624267578125,
0.034912109375,
0.0155792236328125,
0.037322998046875,
0.0033740997314453125,
-0.030548095703125,
0.05413818359375,
0.00441741943359375,
-0.03173828125,
-0.0026378631591796875,
0.08349609375,
0.0311431884765625,
0.01323699951171875,
0.007457733154296875,
0.0034046173095703125,
-0.025482177734375,
-0.01335906982421875,
-0.0816650390625,
-0.023651123046875,
0.031402587890625,
-0.038482666015625,
-0.021728515625,
0.031158447265625,
-0.05767822265625,
-0.017913818359375,
-0.0033588409423828125,
0.03997802734375,
-0.037017822265625,
-0.0384521484375,
0.01035308837890625,
-0.01479339599609375,
0.0293121337890625,
0.0186614990234375,
-0.042236328125,
0.01434326171875,
0.01953125,
0.08502197265625,
-0.003993988037109375,
-0.046173095703125,
-0.01763916015625,
-0.0142669677734375,
-0.01873779296875,
0.032073974609375,
0.00562286376953125,
-0.015838623046875,
-0.021240234375,
0.029205322265625,
-0.01462554931640625,
-0.052154541015625,
0.018646240234375,
0.00007414817810058594,
0.0203094482421875,
-0.0014629364013671875,
-0.017181396484375,
-0.035491943359375,
0.0268707275390625,
-0.033966064453125,
0.0914306640625,
0.024261474609375,
-0.055328369140625,
0.0244293212890625,
-0.0357666015625,
-0.01476287841796875,
-0.0219879150390625,
-0.0061492919921875,
-0.0853271484375,
-0.00780487060546875,
0.0166168212890625,
0.06329345703125,
-0.014404296875,
-0.0010986328125,
-0.049224853515625,
-0.0146026611328125,
0.026641845703125,
0.0003123283386230469,
0.076904296875,
0.009002685546875,
-0.031097412109375,
0.01293182373046875,
-0.046173095703125,
0.01580810546875,
0.038482666015625,
-0.0285797119140625,
0.007625579833984375,
-0.05029296875,
0.01007080078125,
0.03302001953125,
0.0031795501708984375,
-0.04449462890625,
0.016937255859375,
-0.00791168212890625,
0.0260772705078125,
0.0435791015625,
-0.01502227783203125,
0.0228729248046875,
-0.0296478271484375,
0.03131103515625,
0.018035888671875,
0.0153045654296875,
-0.008544921875,
-0.038787841796875,
-0.057952880859375,
-0.040283203125,
0.01617431640625,
0.0295867919921875,
-0.029571533203125,
0.035064697265625,
-0.0159759521484375,
-0.05413818359375,
-0.034759521484375,
0.001556396484375,
0.034698486328125,
0.03289794921875,
0.0253448486328125,
-0.031707763671875,
-0.03704833984375,
-0.057708740234375,
0.0083160400390625,
-0.00046133995056152344,
-0.005321502685546875,
0.0271759033203125,
0.050140380859375,
-0.0024929046630859375,
0.04608154296875,
-0.030914306640625,
-0.0186614990234375,
-0.017578125,
0.01140594482421875,
0.027008056640625,
0.06243896484375,
0.06707763671875,
-0.049346923828125,
-0.0390625,
-0.01073455810546875,
-0.0777587890625,
0.0111846923828125,
-0.013763427734375,
-0.01468658447265625,
0.0290985107421875,
0.01506805419921875,
-0.040008544921875,
0.049163818359375,
0.0249786376953125,
-0.036834716796875,
0.03631591796875,
-0.0222930908203125,
0.0289764404296875,
-0.08056640625,
0.0102081298828125,
0.029693603515625,
-0.018157958984375,
-0.036376953125,
-0.0010499954223632812,
-0.00040411949157714844,
-0.00812530517578125,
-0.05096435546875,
0.053497314453125,
-0.04681396484375,
-0.0250701904296875,
-0.0006785392761230469,
-0.006977081298828125,
0.00412750244140625,
0.06756591796875,
0.007434844970703125,
0.01654052734375,
0.060516357421875,
-0.048126220703125,
0.034271240234375,
0.03564453125,
-0.01248931884765625,
0.036590576171875,
-0.047149658203125,
0.016754150390625,
0.0026378631591796875,
0.004131317138671875,
-0.0654296875,
-0.01910400390625,
0.027557373046875,
-0.04571533203125,
0.04608154296875,
-0.03863525390625,
-0.0265045166015625,
-0.045257568359375,
-0.039825439453125,
0.025146484375,
0.050537109375,
-0.057098388671875,
0.0377197265625,
0.0226593017578125,
0.0186614990234375,
-0.0343017578125,
-0.070068359375,
-0.0224456787109375,
-0.032073974609375,
-0.05767822265625,
0.023223876953125,
0.005130767822265625,
0.0124053955078125,
0.01016998291015625,
-0.009765625,
-0.00739288330078125,
-0.0150299072265625,
0.040191650390625,
0.0307769775390625,
-0.017669677734375,
-0.019775390625,
-0.0283966064453125,
-0.00524139404296875,
0.00557708740234375,
-0.01081085205078125,
0.04290771484375,
-0.015869140625,
-0.02783203125,
-0.06549072265625,
0.00024700164794921875,
0.042083740234375,
0.003711700439453125,
0.05914306640625,
0.07769775390625,
-0.03497314453125,
0.00510406494140625,
-0.03021240234375,
-0.0180511474609375,
-0.03509521484375,
0.04241943359375,
-0.0307159423828125,
-0.033294677734375,
0.062286376953125,
0.004940032958984375,
0.006198883056640625,
0.0458984375,
0.027679443359375,
-0.01219940185546875,
0.062103271484375,
0.041107177734375,
0.019317626953125,
0.046295166015625,
-0.07720947265625,
-0.01442718505859375,
-0.06854248046875,
-0.04071044921875,
-0.0257720947265625,
-0.041473388671875,
-0.039520263671875,
-0.0325927734375,
0.03533935546875,
0.018798828125,
-0.03509521484375,
0.03863525390625,
-0.055877685546875,
0.00690460205078125,
0.04736328125,
0.0443115234375,
-0.0284881591796875,
0.0194244384765625,
-0.0177459716796875,
-0.0024509429931640625,
-0.0550537109375,
-0.0191192626953125,
0.07220458984375,
0.035369873046875,
0.055694580078125,
-0.002788543701171875,
0.054595947265625,
-0.01512908935546875,
0.0142669677734375,
-0.034820556640625,
0.050994873046875,
-0.00971221923828125,
-0.03369140625,
-0.0167236328125,
-0.02642822265625,
-0.0712890625,
0.015411376953125,
-0.02838134765625,
-0.057159423828125,
0.014404296875,
0.01873779296875,
-0.031951904296875,
0.06390380859375,
-0.0611572265625,
0.07513427734375,
-0.0226898193359375,
-0.0386962890625,
0.00799560546875,
-0.0531005859375,
0.019622802734375,
0.0148468017578125,
-0.0224609375,
-0.0160980224609375,
0.021881103515625,
0.0849609375,
-0.042694091796875,
0.06280517578125,
-0.03887939453125,
0.0229339599609375,
0.038238525390625,
-0.006000518798828125,
0.0188446044921875,
-0.00841522216796875,
-0.0056610107421875,
0.033447265625,
0.01366424560546875,
-0.0333251953125,
-0.036376953125,
0.040130615234375,
-0.0706787109375,
-0.023284912109375,
-0.034698486328125,
-0.0307159423828125,
0.01525115966796875,
-0.000789642333984375,
0.037322998046875,
0.048126220703125,
0.00974273681640625,
0.01654052734375,
0.03997802734375,
-0.03924560546875,
0.0360107421875,
-0.003963470458984375,
-0.0255279541015625,
-0.039093017578125,
0.05841064453125,
0.01409912109375,
0.016571044921875,
0.0049285888671875,
0.01462554931640625,
-0.0309906005859375,
-0.032012939453125,
-0.022918701171875,
0.0382080078125,
-0.04644775390625,
-0.02734375,
-0.049713134765625,
-0.034698486328125,
-0.044586181640625,
-0.0189208984375,
-0.03424072265625,
-0.02862548828125,
-0.031219482421875,
0.0115966796875,
0.043426513671875,
0.0413818359375,
-0.0206146240234375,
0.0377197265625,
-0.05303955078125,
0.017303466796875,
0.0173797607421875,
0.0279541015625,
-0.01343536376953125,
-0.06597900390625,
-0.020050048828125,
-0.00786590576171875,
-0.0284423828125,
-0.055694580078125,
0.046234130859375,
0.01055908203125,
0.0428466796875,
0.0278472900390625,
-0.017669677734375,
0.07080078125,
0.004360198974609375,
0.0450439453125,
0.036163330078125,
-0.049224853515625,
0.04693603515625,
-0.001705169677734375,
0.0235748291015625,
0.0028705596923828125,
0.0244293212890625,
-0.031158447265625,
-0.00156402587890625,
-0.07427978515625,
-0.054534912109375,
0.08349609375,
0.0088653564453125,
-0.01007080078125,
0.0286712646484375,
0.0543212890625,
-0.0024738311767578125,
0.0010471343994140625,
-0.056732177734375,
-0.02801513671875,
-0.0279388427734375,
-0.022491455078125,
0.0018253326416015625,
-0.007381439208984375,
-0.000301361083984375,
-0.048797607421875,
0.05706787109375,
-0.009735107421875,
0.058563232421875,
0.034881591796875,
-0.005023956298828125,
-0.00722503662109375,
-0.032501220703125,
0.036773681640625,
0.0176849365234375,
-0.024139404296875,
0.0006899833679199219,
0.02520751953125,
-0.042572021484375,
0.005321502685546875,
0.00766754150390625,
0.00667572021484375,
-0.003200531005859375,
0.037109375,
0.0701904296875,
0.00437164306640625,
-0.0008640289306640625,
0.031097412109375,
-0.0006561279296875,
-0.036224365234375,
-0.021453857421875,
0.0108642578125,
-0.0048980712890625,
0.03857421875,
0.020263671875,
0.02349853515625,
-0.0204315185546875,
-0.0203704833984375,
0.01788330078125,
0.0379638671875,
-0.0208282470703125,
-0.04229736328125,
0.053009033203125,
-0.0115203857421875,
-0.01293182373046875,
0.06658935546875,
-0.0063323974609375,
-0.04052734375,
0.08648681640625,
0.031494140625,
0.07244873046875,
-0.01322174072265625,
0.007228851318359375,
0.07623291015625,
0.01203155517578125,
-0.002529144287109375,
0.01314544677734375,
0.005298614501953125,
-0.062103271484375,
0.005615234375,
-0.044189453125,
-0.00450897216796875,
0.024810791015625,
-0.039520263671875,
0.0338134765625,
-0.047027587890625,
-0.023101806640625,
0.001071929931640625,
0.0219268798828125,
-0.06793212890625,
0.01520538330078125,
0.01207733154296875,
0.059539794921875,
-0.057891845703125,
0.061248779296875,
0.058807373046875,
-0.040618896484375,
-0.07379150390625,
-0.008514404296875,
0.008270263671875,
-0.07080078125,
0.038330078125,
0.03656005859375,
0.0018148422241210938,
0.0021762847900390625,
-0.057342529296875,
-0.04876708984375,
0.107666015625,
0.045440673828125,
-0.0085601806640625,
0.01194000244140625,
0.00490570068359375,
0.0110626220703125,
-0.0301971435546875,
0.027740478515625,
0.02032470703125,
0.031768798828125,
0.029876708984375,
-0.052032470703125,
0.00986480712890625,
-0.01568603515625,
0.0005869865417480469,
0.0168914794921875,
-0.058807373046875,
0.07110595703125,
-0.04901123046875,
-0.0206146240234375,
0.0032901763916015625,
0.05035400390625,
0.0293731689453125,
0.01030731201171875,
0.04376220703125,
0.06427001953125,
0.04229736328125,
-0.020965576171875,
0.06439208984375,
-0.0010023117065429688,
0.052337646484375,
0.053680419921875,
0.014373779296875,
0.034759521484375,
0.024017333984375,
-0.023468017578125,
0.0212554931640625,
0.0736083984375,
-0.0235595703125,
0.0231781005859375,
0.0057830810546875,
0.0014324188232421875,
-0.01181793212890625,
0.0198211669921875,
-0.0269775390625,
0.03411865234375,
0.013580322265625,
-0.0279083251953125,
-0.017669677734375,
0.01465606689453125,
0.01027679443359375,
-0.02618408203125,
-0.0109710693359375,
0.03485107421875,
-0.004581451416015625,
-0.031402587890625,
0.05712890625,
-0.00408172607421875,
0.06402587890625,
-0.0394287109375,
-0.00124359130859375,
-0.0202178955078125,
0.0198211669921875,
-0.02978515625,
-0.0714111328125,
0.024200439453125,
-0.0220794677734375,
0.010711669921875,
-0.004970550537109375,
0.041748046875,
-0.033355712890625,
-0.0333251953125,
0.0218963623046875,
0.0248565673828125,
0.048980712890625,
0.003002166748046875,
-0.0928955078125,
0.01352691650390625,
0.0123291015625,
-0.04339599609375,
0.0181121826171875,
0.0287322998046875,
0.01279449462890625,
0.052886962890625,
0.0399169921875,
-0.0093231201171875,
0.0222015380859375,
-0.0164642333984375,
0.051910400390625,
-0.04888916015625,
-0.025115966796875,
-0.059356689453125,
0.052734375,
-0.0096435546875,
-0.045318603515625,
0.045379638671875,
0.04583740234375,
0.06671142578125,
-0.0025787353515625,
0.03521728515625,
-0.0284576416015625,
0.0092926025390625,
-0.03729248046875,
0.049713134765625,
-0.04827880859375,
-0.006175994873046875,
-0.01397705078125,
-0.050537109375,
-0.032867431640625,
0.059051513671875,
-0.0087432861328125,
0.031158447265625,
0.041748046875,
0.0770263671875,
-0.02532958984375,
-0.0293121337890625,
0.024688720703125,
0.01085662841796875,
0.01096343994140625,
0.0323486328125,
0.03094482421875,
-0.056976318359375,
0.025482177734375,
-0.048858642578125,
-0.013824462890625,
-0.01096343994140625,
-0.048980712890625,
-0.07275390625,
-0.0675048828125,
-0.060302734375,
-0.048004150390625,
-0.0198974609375,
0.073486328125,
0.08984375,
-0.053680419921875,
-0.0167083740234375,
0.0126190185546875,
0.00363922119140625,
-0.0228729248046875,
-0.0159454345703125,
0.05084228515625,
0.0005702972412109375,
-0.067626953125,
-0.0220794677734375,
0.003910064697265625,
0.039337158203125,
-0.0093994140625,
-0.03167724609375,
-0.0201568603515625,
-0.021942138671875,
0.018951416015625,
0.03515625,
-0.04522705078125,
-0.0158843994140625,
-0.0215301513671875,
-0.0073699951171875,
0.036041259765625,
0.037384033203125,
-0.04718017578125,
0.020050048828125,
0.032257080078125,
0.0162200927734375,
0.0731201171875,
-0.02105712890625,
-0.0038127899169921875,
-0.062103271484375,
0.041107177734375,
-0.01387786865234375,
0.040252685546875,
0.034271240234375,
-0.025146484375,
0.051971435546875,
0.0293731689453125,
-0.03564453125,
-0.057830810546875,
-0.014129638671875,
-0.08349609375,
-0.0107879638671875,
0.06719970703125,
-0.03167724609375,
-0.042083740234375,
0.0267181396484375,
-0.00788116455078125,
0.044403076171875,
-0.00470733642578125,
0.03497314453125,
0.0135040283203125,
-0.02215576171875,
-0.050079345703125,
-0.03485107421875,
0.02752685546875,
0.0019092559814453125,
-0.0543212890625,
-0.0292510986328125,
0.005840301513671875,
0.0498046875,
0.02569580078125,
0.0467529296875,
-0.01934814453125,
0.007171630859375,
0.00202178955078125,
0.047332763671875,
-0.0238189697265625,
-0.01068115234375,
-0.03314208984375,
0.010528564453125,
-0.0216522216796875,
-0.0477294921875
]
] |
allenai/specter | 2023-10-18T04:19:07.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"feature-extraction",
"en",
"dataset:SciDocs",
"arxiv:2004.07180",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] | feature-extraction | allenai | null | null | allenai/specter | 54 | 36,290 | transformers | 2022-03-02T23:29:05 | ---
language: en
thumbnail: "https://camo.githubusercontent.com/7d080b7a769f7fdf64ac0ebeb47b039cb50be35287e3071f9d633f0fe33e7596/68747470733a2f2f692e6962622e636f2f33544331576d472f737065637465722d6c6f676f2d63726f707065642e706e67"
license: apache-2.0
datasets:
- SciDocs
metrics:
- F1
- accuracy
- map
- ndcg
---
## SPECTER
SPECTER is a pre-trained language model for generating document-level embeddings. It is pre-trained on a powerful signal of document-level relatedness: the citation graph. Unlike existing pre-trained language models, SPECTER can be applied to downstream applications without task-specific fine-tuning.
If you're coming here because you want to embed papers, SPECTER has now been superseded by [SPECTER2](https://huggingface.co/allenai/specter2_proximity). Use that instead.
Paper: [SPECTER: Document-level Representation Learning using Citation-informed Transformers](https://arxiv.org/pdf/2004.07180.pdf)
Original Repo: [Github](https://github.com/allenai/specter)
Evaluation Benchmark: [SciDocs](https://github.com/allenai/scidocs)
Authors: *Arman Cohan, Sergey Feldman, Iz Beltagy, Doug Downey, Daniel S. Weld*
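A minimal sketch of embedding a paper with SPECTER through the `transformers` API, following the pattern in the original repo: join the title and abstract with the tokenizer's separator token and take the `[CLS]` token of the last hidden state as the document embedding (the sample paper text below is a placeholder):

```python
from transformers import AutoTokenizer, AutoModel

# load model and tokenizer
tokenizer = AutoTokenizer.from_pretrained('allenai/specter')
model = AutoModel.from_pretrained('allenai/specter')

papers = [
    {'title': 'BERT', 'abstract': 'We introduce a new language representation model called BERT'},
]
# concatenate title and abstract with the tokenizer's [SEP] token
title_abs = [p['title'] + tokenizer.sep_token + (p.get('abstract') or '') for p in papers]
inputs = tokenizer(title_abs, padding=True, truncation=True,
                   return_tensors='pt', max_length=512)
result = model(**inputs)
# take the first token ([CLS]) in the batch as the document embedding
embeddings = result.last_hidden_state[:, 0, :]  # shape: (num_papers, 768)
```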
| 1,160 | [
[
-0.03167724609375,
-0.012481689453125,
0.054412841796875,
-0.00571441650390625,
-0.0150604248046875,
0.045074462890625,
0.00405120849609375,
-0.026275634765625,
0.041473388671875,
0.01218414306640625,
-0.00640106201171875,
-0.03594970703125,
-0.04913330078125,
-0.0104217529296875,
-0.03631591796875,
0.08184814453125,
-0.0019550323486328125,
0.009918212890625,
-0.01079559326171875,
0.0311126708984375,
0.01030731201171875,
-0.02423095703125,
-0.043548583984375,
-0.01898193359375,
0.055938720703125,
0.0202178955078125,
0.032989501953125,
0.0219268798828125,
0.047760009765625,
0.0207672119140625,
-0.0015954971313476562,
0.00807952880859375,
-0.035736083984375,
0.0160980224609375,
-0.037811279296875,
-0.00963592529296875,
-0.02496337890625,
0.0092926025390625,
0.06109619140625,
0.037933349609375,
0.023162841796875,
-0.0037250518798828125,
0.006465911865234375,
0.025115966796875,
-0.06512451171875,
0.00847625732421875,
-0.0330810546875,
-0.00958251953125,
-0.031280517578125,
-0.029693603515625,
-0.053985595703125,
-0.031402587890625,
0.0018024444580078125,
-0.054473876953125,
0.046630859375,
0.004852294921875,
0.088134765625,
0.022674560546875,
-0.040191650390625,
-0.019317626953125,
-0.05816650390625,
0.03131103515625,
-0.01123809814453125,
0.03363037109375,
0.031280517578125,
0.032562255859375,
0.0047607421875,
-0.0870361328125,
-0.016326904296875,
-0.0225372314453125,
-0.0097198486328125,
0.037384033203125,
-0.01004791259765625,
0.007694244384765625,
0.005817413330078125,
0.016815185546875,
-0.038787841796875,
-0.0141448974609375,
-0.0377197265625,
-0.00537872314453125,
0.01104736328125,
-0.0172271728515625,
-0.016571044921875,
-0.0263671875,
-0.0132598876953125,
-0.039764404296875,
-0.01849365234375,
-0.032958984375,
0.040924072265625,
0.0208740234375,
0.00832366943359375,
0.0147857666015625,
0.041534423828125,
0.01544952392578125,
0.02490234375,
0.034881591796875,
0.035614013671875,
-0.019287109375,
0.0075531005859375,
-0.0188446044921875,
0.057769775390625,
-0.0208740234375,
0.016357421875,
-0.01800537109375,
0.005397796630859375,
-0.00347137451171875,
0.03753662109375,
-0.05902099609375,
-0.0187530517578125,
0.0042572021484375,
-0.0184173583984375,
-0.020660400390625,
0.03436279296875,
-0.031646728515625,
-0.0188446044921875,
0.013214111328125,
0.003570556640625,
-0.053070068359375,
-0.0330810546875,
-0.01995849609375,
-0.019927978515625,
0.0045928955078125,
0.01557159423828125,
-0.0615234375,
0.0217437744140625,
0.05548095703125,
0.070068359375,
0.0271759033203125,
-0.0455322265625,
-0.032135009765625,
0.0268402099609375,
-0.033538818359375,
0.083984375,
-0.044219970703125,
-0.044708251953125,
0.007083892822265625,
0.00360107421875,
-0.0259857177734375,
-0.0242919921875,
0.05316162109375,
-0.051239013671875,
0.007965087890625,
-0.00820159912109375,
-0.055999755859375,
0.003505706787109375,
-0.005916595458984375,
-0.059051513671875,
0.08770751953125,
0.036865234375,
-0.072021484375,
0.030670166015625,
-0.06695556640625,
-0.0040130615234375,
0.0268707275390625,
-0.016571044921875,
-0.00672149658203125,
-0.00676727294921875,
-0.0212554931640625,
0.0269775390625,
-0.00665283203125,
0.016265869140625,
-0.0157470703125,
-0.02740478515625,
-0.00266265869140625,
0.017059326171875,
0.03216552734375,
0.036346435546875,
-0.0033245086669921875,
0.0172882080078125,
-0.04473876953125,
-0.017303466796875,
-0.00461578369140625,
-0.0125274658203125,
-0.040557861328125,
0.003139495849609375,
0.043304443359375,
0.0034656524658203125,
0.0258331298828125,
-0.053192138671875,
-0.0004334449768066406,
-0.0207977294921875,
0.028228759765625,
0.046478271484375,
-0.0004024505615234375,
0.033599853515625,
-0.01116180419921875,
0.05084228515625,
-0.01525115966796875,
-0.007785797119140625,
-0.0232391357421875,
-0.01239013671875,
-0.0430908203125,
-0.0186614990234375,
0.0301055908203125,
0.026885986328125,
-0.0239105224609375,
0.03143310546875,
-0.0308074951171875,
-0.05316162109375,
-0.0272979736328125,
0.010833740234375,
0.026580810546875,
0.0233612060546875,
0.0164947509765625,
-0.0090484619140625,
-0.06646728515625,
-0.06024169921875,
-0.0037593841552734375,
-0.0222015380859375,
0.00010865926742553711,
-0.0088653564453125,
0.05419921875,
-0.0225372314453125,
0.06488037109375,
-0.03509521484375,
-0.024078369140625,
0.00432586669921875,
0.022979736328125,
0.0017671585083007812,
0.0518798828125,
0.052154541015625,
-0.062255859375,
-0.053009033203125,
-0.003421783447265625,
-0.04962158203125,
0.0189971923828125,
0.0104522705078125,
-0.01053619384765625,
0.00795745849609375,
0.06622314453125,
-0.04339599609375,
0.01111602783203125,
0.0384521484375,
-0.037384033203125,
0.0307769775390625,
-0.0197601318359375,
-0.005199432373046875,
-0.11749267578125,
0.0267333984375,
0.003963470458984375,
-0.03253173828125,
-0.0242156982421875,
0.02423095703125,
0.0290985107421875,
-0.037689208984375,
-0.022857666015625,
0.03033447265625,
-0.03900146484375,
-0.007648468017578125,
-0.0160980224609375,
0.008819580078125,
0.0133056640625,
0.0025119781494140625,
0.005950927734375,
0.0609130859375,
0.0374755859375,
-0.00835418701171875,
-0.00997161865234375,
0.028076171875,
-0.006809234619140625,
0.0285797119140625,
-0.051727294921875,
0.034210205078125,
-0.005069732666015625,
0.03973388671875,
-0.06695556640625,
-0.01265716552734375,
0.009796142578125,
-0.0428466796875,
0.041961669921875,
-0.0247802734375,
-0.05474853515625,
-0.03369140625,
-0.038055419921875,
0.024261474609375,
0.0189666748046875,
-0.0307769775390625,
0.043609619140625,
0.01088714599609375,
0.01503753662109375,
-0.03680419921875,
-0.04498291015625,
0.005023956298828125,
0.01409912109375,
-0.04266357421875,
0.048126220703125,
-0.0104827880859375,
0.002777099609375,
0.023681640625,
0.0019397735595703125,
0.0022182464599609375,
-0.005096435546875,
0.046783447265625,
0.006778717041015625,
-0.007152557373046875,
0.019317626953125,
-0.0078277587890625,
-0.035125732421875,
-0.00975799560546875,
-0.01983642578125,
0.043975830078125,
-0.034881591796875,
-0.03692626953125,
-0.01169586181640625,
0.0394287109375,
0.03607177734375,
-0.0164031982421875,
0.043853759765625,
0.046630859375,
-0.0224761962890625,
0.003345489501953125,
-0.035736083984375,
-0.00836181640625,
-0.038330078125,
0.03515625,
-0.042724609375,
-0.0692138671875,
0.034027099609375,
-0.01450347900390625,
-0.0077362060546875,
0.034149169921875,
0.0114898681640625,
-0.02490234375,
0.0584716796875,
0.055419921875,
0.00037598609924316406,
0.060760498046875,
-0.043853759765625,
0.003551483154296875,
-0.06201171875,
-0.0157470703125,
-0.060882568359375,
-0.0019016265869140625,
-0.050537109375,
-0.0472412109375,
-0.0049591064453125,
0.0090179443359375,
-0.03558349609375,
0.0389404296875,
-0.048126220703125,
0.029266357421875,
0.050994873046875,
-0.0189971923828125,
-0.004489898681640625,
-0.0036468505859375,
-0.006954193115234375,
-0.0008330345153808594,
-0.032745361328125,
-0.053466796875,
0.0784912109375,
0.031341552734375,
0.06591796875,
-0.01000213623046875,
0.0911865234375,
0.0308990478515625,
0.007732391357421875,
-0.05950927734375,
0.0303802490234375,
-0.026092529296875,
-0.058807373046875,
-0.005519866943359375,
-0.0012826919555664062,
-0.0811767578125,
0.00191497802734375,
0.0028514862060546875,
-0.05364990234375,
0.037445068359375,
0.000957489013671875,
-0.00003719329833984375,
0.0212554931640625,
-0.057342529296875,
0.05853271484375,
0.00894927978515625,
-0.01479339599609375,
-0.0261077880859375,
-0.00951385498046875,
-0.016143798828125,
-0.00945281982421875,
0.01342010498046875,
0.0098724365234375,
-0.003452301025390625,
0.061859130859375,
-0.028106689453125,
0.051361083984375,
-0.0260162353515625,
0.01000213623046875,
0.0087890625,
0.00980377197265625,
0.0421142578125,
0.0021266937255859375,
-0.018463134765625,
-0.00965118408203125,
0.0223388671875,
-0.04400634765625,
-0.031982421875,
0.052520751953125,
-0.06768798828125,
-0.0308990478515625,
-0.0516357421875,
-0.038604736328125,
-0.006549835205078125,
0.022979736328125,
0.0233306884765625,
0.01123046875,
-0.036712646484375,
0.0223236083984375,
0.0272369384765625,
0.020660400390625,
0.0311431884765625,
0.036346435546875,
-0.0098876953125,
-0.025115966796875,
0.064697265625,
0.00527191162109375,
-0.01267242431640625,
0.04888916015625,
-0.003299713134765625,
0.0004949569702148438,
-0.035125732421875,
0.004642486572265625,
0.033660888671875,
-0.068115234375,
0.0005288124084472656,
-0.055694580078125,
-0.033599853515625,
-0.053863525390625,
-0.038299560546875,
-0.03668212890625,
-0.02203369140625,
-0.0238037109375,
-0.03277587890625,
0.0307159423828125,
0.0614013671875,
-0.0242156982421875,
0.035400390625,
-0.03656005859375,
-0.0022373199462890625,
0.00968170166015625,
0.0257110595703125,
-0.0080413818359375,
-0.03424072265625,
-0.044403076171875,
-0.037689208984375,
-0.028228759765625,
-0.06787109375,
0.01267242431640625,
0.0268402099609375,
0.049774169921875,
0.044403076171875,
-0.00893402099609375,
0.0254058837890625,
-0.022979736328125,
0.03302001953125,
0.0006937980651855469,
-0.0662841796875,
0.06048583984375,
-0.040252685546875,
0.01486968994140625,
0.06500244140625,
0.0469970703125,
-0.018096923828125,
-0.053466796875,
-0.056549072265625,
-0.06201171875,
0.05615234375,
0.0020999908447265625,
0.0206146240234375,
-0.0131378173828125,
0.0108642578125,
-0.0008568763732910156,
-0.01375579833984375,
-0.047637939453125,
-0.0075836181640625,
-0.0452880859375,
-0.0063934326171875,
-0.02020263671875,
-0.040008544921875,
-0.00769805908203125,
-0.013153076171875,
0.047454833984375,
-0.01959228515625,
0.0254058837890625,
0.014892578125,
-0.0236968994140625,
-0.00724029541015625,
0.00858306884765625,
0.02593994140625,
0.061309814453125,
-0.0731201171875,
0.032196044921875,
-0.0301361083984375,
-0.0614013671875,
-0.008087158203125,
0.043731689453125,
-0.0111541748046875,
0.0242919921875,
0.041229248046875,
0.04632568359375,
0.034332275390625,
-0.0232696533203125,
0.02496337890625,
0.005313873291015625,
-0.032623291015625,
-0.05023193359375,
-0.01309967041015625,
0.0225372314453125,
0.0157623291015625,
0.062225341796875,
0.00974273681640625,
0.0015697479248046875,
-0.0321044921875,
0.0192718505859375,
0.01079559326171875,
-0.039276123046875,
-0.033447265625,
0.047637939453125,
0.022796630859375,
-0.0248260498046875,
0.002391815185546875,
-0.026702880859375,
-0.031494140625,
0.039947509765625,
0.050079345703125,
0.06683349609375,
-0.01983642578125,
0.038787841796875,
0.0377197265625,
0.0308074951171875,
0.0198974609375,
0.020050048828125,
-0.01033782958984375,
-0.05303955078125,
-0.00861358642578125,
-0.029449462890625,
-0.04302978515625,
-0.0062255859375,
-0.046173095703125,
0.0171051025390625,
-0.057952880859375,
-0.0215911865234375,
0.031890869140625,
0.0243988037109375,
-0.058349609375,
-0.0108184814453125,
0.00606536865234375,
0.090087890625,
-0.0650634765625,
0.081787109375,
0.08050537109375,
-0.07061767578125,
-0.053314208984375,
0.004947662353515625,
-0.0225372314453125,
-0.03167724609375,
0.061187744140625,
0.00550079345703125,
-0.0223541259765625,
0.024322509765625,
-0.031524658203125,
-0.07818603515625,
0.08319091796875,
0.048736572265625,
-0.028076171875,
-0.016143798828125,
-0.01641845703125,
0.04693603515625,
-0.02630615234375,
0.059112548828125,
0.00281524658203125,
0.044464111328125,
0.00785064697265625,
-0.045562744140625,
0.01070404052734375,
-0.047454833984375,
0.01177978515625,
0.008026123046875,
-0.052337646484375,
0.0709228515625,
0.017608642578125,
-0.019775390625,
-0.007965087890625,
0.040679931640625,
0.01459503173828125,
0.0191650390625,
0.0288848876953125,
0.037200927734375,
0.07122802734375,
0.00865936279296875,
0.0758056640625,
-0.03509521484375,
0.033935546875,
0.10211181640625,
-0.022369384765625,
0.0570068359375,
0.045074462890625,
-0.01861572265625,
0.08306884765625,
0.058349609375,
-0.027191162109375,
0.04473876953125,
-0.0058441162109375,
-0.006683349609375,
-0.0171661376953125,
-0.0093994140625,
-0.05023193359375,
0.019500732421875,
0.0243377685546875,
-0.053619384765625,
-0.029693603515625,
-0.006290435791015625,
-0.0012922286987304688,
0.022918701171875,
0.01100921630859375,
0.034393310546875,
0.0041656494140625,
-0.0242156982421875,
0.04571533203125,
-0.01551055908203125,
0.053070068359375,
-0.06396484375,
0.00957489013671875,
-0.0173492431640625,
0.0089569091796875,
-0.029632568359375,
-0.0360107421875,
0.022369384765625,
-0.0007462501525878906,
-0.037506103515625,
-0.0214385986328125,
0.06298828125,
0.000682830810546875,
-0.036865234375,
0.0653076171875,
0.03369140625,
0.03173828125,
0.0036411285400390625,
-0.0413818359375,
-0.00858306884765625,
-0.034698486328125,
-0.0247955322265625,
0.018951416015625,
0.0093841552734375,
-0.00258636474609375,
0.036285400390625,
0.0404052734375,
-0.00872039794921875,
-0.01129913330078125,
0.03875732421875,
0.0650634765625,
-0.04315185546875,
-0.0411376953125,
-0.0305328369140625,
0.01555633544921875,
-0.0085906982421875,
-0.00908660888671875,
0.061065673828125,
0.07220458984375,
0.056610107421875,
-0.0107574462890625,
0.059173583984375,
-0.0198974609375,
0.041534423828125,
0.003429412841796875,
0.06134033203125,
-0.0307769775390625,
-0.006702423095703125,
-0.035614013671875,
-0.0701904296875,
-0.0097808837890625,
0.03424072265625,
-0.0246124267578125,
0.00872039794921875,
0.08172607421875,
0.069580078125,
-0.0262603759765625,
-0.019073486328125,
0.0203399658203125,
-0.0007467269897460938,
0.0214691162109375,
0.029693603515625,
0.063720703125,
-0.034942626953125,
0.06573486328125,
-0.001567840576171875,
-0.01021575927734375,
-0.02703857421875,
-0.05914306640625,
-0.06622314453125,
-0.079345703125,
-0.022918701171875,
-0.026702880859375,
0.0175628662109375,
0.053070068359375,
0.088134765625,
-0.06268310546875,
-0.0131072998046875,
-0.0262603759765625,
0.0012102127075195312,
0.0090484619140625,
-0.0141754150390625,
0.0478515625,
-0.032806396484375,
-0.06707763671875,
0.044464111328125,
-0.004180908203125,
-0.0023956298828125,
-0.0235748291015625,
-0.0145111083984375,
-0.044281005859375,
-0.00536346435546875,
0.0069427490234375,
0.032012939453125,
-0.02960205078125,
-0.0362548828125,
-0.0108642578125,
-0.024139404296875,
0.00516510009765625,
0.06640625,
-0.062408447265625,
0.0273284912109375,
0.041534423828125,
0.028411865234375,
0.0762939453125,
-0.0131378173828125,
0.03216552734375,
-0.058197021484375,
0.02569580078125,
0.017913818359375,
0.034820556640625,
0.0308990478515625,
-0.0266571044921875,
0.054229736328125,
0.01168060302734375,
-0.033599853515625,
-0.06207275390625,
0.003376007080078125,
-0.08026123046875,
-0.0254058837890625,
0.061309814453125,
-0.0218048095703125,
-0.00975799560546875,
-0.004032135009765625,
-0.0114898681640625,
0.044403076171875,
-0.0538330078125,
0.035369873046875,
0.03271484375,
0.018280029296875,
-0.0186309814453125,
-0.019561767578125,
0.0284423828125,
0.0220794677734375,
-0.04766845703125,
-0.02703857421875,
0.0144500732421875,
0.022979736328125,
0.065185546875,
0.05181884765625,
0.0009698867797851562,
0.02197265625,
0.033905029296875,
0.0290374755859375,
-0.0287933349609375,
-0.01493072509765625,
-0.0030879974365234375,
0.0211334228515625,
0.0253448486328125,
-0.0276031494140625
]
] |
airesearch/wangchanberta-base-att-spm-uncased | 2023-03-19T02:31:42.000Z | [
"transformers",
"pytorch",
"safetensors",
"camembert",
"fill-mask",
"th",
"arxiv:1907.11692",
"arxiv:1801.06146",
"arxiv:1808.06226",
"arxiv:2101.09635",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | airesearch | null | null | airesearch/wangchanberta-base-att-spm-uncased | 23 | 36,286 | transformers | 2022-03-02T23:29:05 | ---
language: th
widget:
- text: "ผู้ใช้งานท่าอากาศยานนานาชาติ<mask>มีกว่าสามล้านคน<pad>"
---
# WangchanBERTa base model: `wangchanberta-base-att-spm-uncased`
<br>
Pretrained RoBERTa BASE model on assorted Thai texts (78.5 GB).
The script and documentation can be found at [this repository](https://github.com/vistec-AI/thai2transformers).
<br>
## Model description
<br>
The architecture of the pretrained model is based on RoBERTa [[Liu et al., 2019]](https://arxiv.org/abs/1907.11692).
<br>
## Intended uses & limitations
<br>
You can use the pretrained model for masked language modeling (i.e. predicting a masked token in the input text). In addition, we also provide finetuned models for multiclass/multilabel text classification and token classification tasks.
<br>
**Multiclass text classification**
- `wisesight_sentiment`
4-class text classification task (`positive`, `neutral`, `negative`, and `question`) based on social media posts and tweets.
- `wongnai_reviews`
Users' review rating classification task (ratings range from 1 to 5).
- `generated_reviews_enth` : (`review_star` as label)
Generated users' review rating classification task (ratings range from 1 to 5).
**Multilabel text classification**
- `prachathai67k`
Thai topic classification with 12 labels based on a news article corpus from prachathai.com. Details are described on this [page](https://huggingface.co/datasets/prachathai67k).
**Token classification**
- `thainer`
Named-entity recognition tagging with 13 named-entities as described in this [page](https://huggingface.co/datasets/thainer).
- `lst20` : NER and POS tagging
Named-entity recognition tagging with 10 named-entities and Part-of-Speech tagging with 16 tags as described in this [page](https://huggingface.co/datasets/lst20).
<br>
## How to use
<br>
The getting started notebook of WangchanBERTa model can be found at this [Colab notebook](https://colab.research.google.com/drive/1Kbk6sBspZLwcnOE61adAQo30xxqOQ9ko)
<br>
## Training data
The `wangchanberta-base-att-spm-uncased` model was pretrained on an assorted Thai text dataset. The total size of the uncompressed text is 78.5GB.
### Preprocessing
Texts are preprocessed with the following rules:
- Replace HTML forms of characters with the actual characters, such as `&nbsp;` with a space and `<br />` with a line break [[Howard and Ruder, 2018]](https://arxiv.org/abs/1801.06146).
- Remove empty brackets (`()`, `{}`, and `[]`) that sometimes come up as a result of text extraction, such as from Wikipedia.
- Replace line breaks with spaces.
- Replace more than one space with a single space.
- Remove repetitions of more than 3 characters, such as ดีมากกก to ดีมาก [[Howard and Ruder, 2018]](https://arxiv.org/abs/1801.06146).
- Word-level tokenization using the `newmm` dictionary-based maximal-matching tokenizer of [[Phatthiyaphaibun et al., 2020]](https://zenodo.org/record/4319685#.YA4xEGQzaDU).
- Replace repetitive words; this is done post-tokenization, unlike [[Howard and Ruder, 2018]](https://arxiv.org/abs/1801.06146), since Thai words are not delimited by spaces as in English.
- Replace spaces with `<_>`. The SentencePiece tokenizer combines spaces with other tokens. Since spaces serve as punctuation in Thai, marking sentence boundaries much as periods do in English, combining them with other tokens would discard an important feature for tasks such as word tokenization and sentence breaking. Therefore, we opt to explicitly mark spaces with `<_>`.
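The character-level cleaning rules above can be sketched with a few regular expressions. This is our own illustration (function name and patterns are ours, not from the thai2transformers codebase), and it covers only the string-level rules, not the tokenization steps:

```python
import re

def clean_text(text: str) -> str:
    """Illustrative sketch of the cleaning rules above (not the original script)."""
    # Replace HTML character entities with the actual characters.
    text = text.replace("&nbsp;", " ").replace("<br />", "\n")
    # Remove empty brackets left over from text extraction.
    text = re.sub(r"\(\)|\{\}|\[\]", "", text)
    # Replace line breaks with spaces.
    text = text.replace("\n", " ")
    # Collapse runs of 3 or more identical characters to one,
    # following the card's example: ดีมากกก -> ดีมาก.
    text = re.sub(r"(.)\1{2,}", r"\1", text)
    # Collapse multiple spaces into a single space.
    text = re.sub(r" {2,}", " ", text)
    return text.strip()
```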
<br>
Regarding the vocabulary, we use SentencePiece [[Kudo, 2018]](https://arxiv.org/abs/1808.06226) to train a SentencePiece unigram model.
The tokenizer has a vocabulary size of 25,000 subwords, trained on 15M sentences sampled from the training set.
The length of each sequence is limited to 416 subword tokens.
Regarding the masking procedure, for each sequence we sampled 15% of the tokens. Out of that 15%, 80% are replaced with a `<mask>` token, 10% are left unchanged, and 10% are replaced with a random token.
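The 80/10/10 masking scheme can be sketched as follows. This is a pure-Python illustration of the procedure described above (helper names are ours), not the actual pretraining code:

```python
import random

def mask_tokens(tokens, mask_token="<mask>", vocab=None, seed=0):
    """Sample 15% of positions; of those, 80% become <mask>,
    10% stay unchanged, and 10% are replaced by a random token."""
    rng = random.Random(seed)
    vocab = vocab or list(tokens)
    out = list(tokens)
    for i in range(len(out)):
        if rng.random() < 0.15:      # select 15% of tokens for prediction
            roll = rng.random()
            if roll < 0.8:           # 80% -> <mask>
                out[i] = mask_token
            elif roll < 0.9:         # 10% -> random vocabulary token
                out[i] = rng.choice(vocab)
            # remaining 10% -> left unchanged
    return out
```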
<br>
**Train/Val/Test splits**
After preprocessing and deduplication, we have a training set of 381,034,638 unique, mostly Thai sentences with sequence lengths of 5 to 300 words (78.5GB). The training set has a total of 16,957,775,412 words as tokenized by dictionary-based maximal matching [[Phatthiyaphaibun et al., 2020]](https://zenodo.org/record/4319685#.YA4xEGQzaDU), 8,680,485,067 subwords as tokenized by the SentencePiece tokenizer, and 53,035,823,287 characters.
<br>
**Pretraining**
The model was trained on 8 V100 GPUs for 500,000 steps with a batch size of 4,096 (32 sequences per device with 16 gradient-accumulation steps) and a sequence length of 416 tokens. The optimizer is Adam with a learning rate of $3e-4$, $\beta_1 = 0.9$, $\beta_2 = 0.999$ and $\epsilon = 1e-6$. The learning rate is warmed up for the first 24,000 steps and then linearly decayed to zero. The model checkpoint with the minimum validation loss is selected as the best model checkpoint.
As of Sun 24 Jan 2021, we release the model from the checkpoint at 360,000 steps because pretraining had not yet been completed.
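The warmup-then-linear-decay schedule described above can be written as a small function. This is our own re-implementation from the numbers in the card (24,000 warmup steps, 500,000 total steps, peak learning rate $3e-4$), not the actual training script:

```python
def learning_rate(step, peak_lr=3e-4, warmup_steps=24_000, total_steps=500_000):
    """Linear warmup to peak_lr, then linear decay to zero at total_steps."""
    if step < warmup_steps:
        return peak_lr * step / warmup_steps              # linear warmup
    # linear decay from peak_lr (end of warmup) down to 0 (total_steps)
    return peak_lr * (total_steps - step) / (total_steps - warmup_steps)
```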
<br>
**BibTeX entry and citation info**
```
@misc{lowphansirikul2021wangchanberta,
title={WangchanBERTa: Pretraining transformer-based Thai Language Models},
author={Lalita Lowphansirikul and Charin Polpanumas and Nawat Jantrakulchai and Sarana Nutanong},
year={2021},
eprint={2101.09635},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
| 5,654 | [
[
-0.0233612060546875,
-0.05548095703125,
0.00536346435546875,
0.0240936279296875,
-0.040802001953125,
0.0036945343017578125,
-0.020294189453125,
-0.0240936279296875,
0.0282440185546875,
0.031707763671875,
-0.02886962890625,
-0.034393310546875,
-0.0439453125,
0.020294189453125,
-0.007720947265625,
0.0772705078125,
-0.0164337158203125,
0.0194854736328125,
0.0273895263671875,
-0.0189971923828125,
-0.050628662109375,
-0.039276123046875,
-0.036865234375,
-0.0233612060546875,
0.031463623046875,
0.032196044921875,
0.020751953125,
0.044403076171875,
0.0313720703125,
0.017974853515625,
-0.018218994140625,
0.0099029541015625,
-0.035614013671875,
-0.0179595947265625,
0.0105743408203125,
-0.039398193359375,
-0.032989501953125,
0.007411956787109375,
0.039764404296875,
0.0531005859375,
-0.007579803466796875,
0.00690460205078125,
-0.00025653839111328125,
0.043670654296875,
-0.04046630859375,
0.004245758056640625,
-0.03338623046875,
0.0205078125,
-0.0276336669921875,
-0.01031494140625,
-0.035797119140625,
-0.0261688232421875,
0.0105743408203125,
-0.058319091796875,
-0.01294708251953125,
0.008819580078125,
0.094482421875,
0.00440216064453125,
-0.034423828125,
-0.0219573974609375,
-0.029388427734375,
0.06689453125,
-0.04827880859375,
0.03924560546875,
0.0343017578125,
0.00824737548828125,
-0.0020542144775390625,
-0.065185546875,
-0.06256103515625,
-0.0016021728515625,
-0.0038394927978515625,
0.0194244384765625,
-0.0138702392578125,
0.006282806396484375,
0.015655517578125,
0.035125732421875,
-0.052978515625,
0.01258087158203125,
-0.033111572265625,
-0.0248565673828125,
0.027313232421875,
-0.008941650390625,
0.0360107421875,
-0.0472412109375,
-0.049102783203125,
-0.01531219482421875,
-0.052886962890625,
0.007568359375,
0.0253753662109375,
0.01800537109375,
-0.008331298828125,
0.049530029296875,
-0.01186370849609375,
0.032135009765625,
0.00881195068359375,
-0.01617431640625,
0.0386962890625,
-0.04718017578125,
-0.03265380859375,
0.0030670166015625,
0.072265625,
0.015350341796875,
0.028411865234375,
-0.0001251697540283203,
-0.00872039794921875,
-0.0019121170043945312,
-0.006519317626953125,
-0.07269287109375,
-0.0275421142578125,
0.01416015625,
-0.0338134765625,
-0.024810791015625,
0.0091400146484375,
-0.056854248046875,
-0.0099029541015625,
-0.0139312744140625,
0.038055419921875,
-0.066162109375,
-0.0322265625,
0.023406982421875,
-0.0008177757263183594,
0.026031494140625,
0.013763427734375,
-0.03875732421875,
0.0133514404296875,
0.019012451171875,
0.07586669921875,
-0.020904541015625,
-0.034881591796875,
-0.028564453125,
-0.0114593505859375,
-0.0194854736328125,
0.04852294921875,
-0.012725830078125,
-0.0304107666015625,
-0.00897979736328125,
0.0170745849609375,
-0.0166015625,
-0.031494140625,
0.031646728515625,
-0.043701171875,
0.048248291015625,
-0.019683837890625,
-0.045440673828125,
-0.028564453125,
0.010284423828125,
-0.04901123046875,
0.079833984375,
0.005336761474609375,
-0.06842041015625,
0.04156494140625,
-0.051513671875,
-0.033355712890625,
0.0003414154052734375,
0.022979736328125,
-0.048614501953125,
-0.0211639404296875,
0.01971435546875,
0.03656005859375,
-0.0067138671875,
0.028564453125,
-0.0010309219360351562,
-0.026397705078125,
0.0202484130859375,
-0.0164947509765625,
0.09271240234375,
0.035308837890625,
-0.03570556640625,
0.01690673828125,
-0.05987548828125,
0.0003085136413574219,
0.01169586181640625,
-0.040496826171875,
-0.008575439453125,
-0.038238525390625,
0.0222625732421875,
0.02313232421875,
0.0159149169921875,
-0.03662109375,
0.0190887451171875,
-0.04620361328125,
0.034454345703125,
0.053863525390625,
0.006023406982421875,
0.028778076171875,
-0.023529052734375,
0.05560302734375,
0.04083251953125,
0.0018873214721679688,
-0.03173828125,
-0.0323486328125,
-0.0738525390625,
-0.037811279296875,
0.050140380859375,
0.0472412109375,
-0.0626220703125,
0.058929443359375,
-0.033355712890625,
-0.056793212890625,
-0.0418701171875,
0.0034427642822265625,
0.035614013671875,
0.0283203125,
0.0352783203125,
-0.040985107421875,
-0.06719970703125,
-0.041107177734375,
-0.012939453125,
-0.00788116455078125,
0.01324462890625,
0.00083160400390625,
0.049896240234375,
-0.0250396728515625,
0.0693359375,
-0.0305633544921875,
-0.03240966796875,
-0.040283203125,
0.0091400146484375,
-0.0025386810302734375,
0.041778564453125,
0.030975341796875,
-0.048248291015625,
-0.04913330078125,
-0.005008697509765625,
-0.0472412109375,
-0.005954742431640625,
0.0003097057342529297,
-0.0015420913696289062,
0.034149169921875,
0.043182373046875,
-0.057403564453125,
0.024017333984375,
0.0248565673828125,
-0.0148468017578125,
0.04718017578125,
-0.0101470947265625,
0.005580902099609375,
-0.107666015625,
0.01837158203125,
-0.0016880035400390625,
-0.0159149169921875,
-0.050323486328125,
0.01015472412109375,
0.0018529891967773438,
-0.024444580078125,
-0.0419921875,
0.0423583984375,
-0.048004150390625,
0.0261077880859375,
-0.015960693359375,
0.0191650390625,
-0.00543975830078125,
0.0638427734375,
0.0017213821411132812,
0.0665283203125,
0.0340576171875,
-0.046478271484375,
0.01210784912109375,
0.01428985595703125,
-0.033477783203125,
0.0222625732421875,
-0.059600830078125,
0.0102081298828125,
0.00385284423828125,
0.03314208984375,
-0.0792236328125,
-0.0111083984375,
0.031463623046875,
-0.03753662109375,
0.024169921875,
0.013916015625,
-0.05975341796875,
-0.022979736328125,
-0.044189453125,
0.03363037109375,
0.04052734375,
-0.027862548828125,
0.0306549072265625,
0.0328369140625,
0.0018177032470703125,
-0.04180908203125,
-0.053955078125,
0.0014696121215820312,
-0.031524658203125,
-0.0279083251953125,
0.019317626953125,
-0.0027751922607421875,
0.009368896484375,
-0.0156402587890625,
0.01221466064453125,
-0.007526397705078125,
-0.0021953582763671875,
0.0081634521484375,
0.0129241943359375,
-0.01116180419921875,
0.01019287109375,
-0.01143646240234375,
-0.01482391357421875,
-0.0247650146484375,
-0.0308837890625,
0.0711669921875,
-0.0080108642578125,
-0.00384521484375,
-0.0594482421875,
-0.00010752677917480469,
0.03466796875,
-0.0255889892578125,
0.058502197265625,
0.060638427734375,
-0.035308837890625,
0.010711669921875,
-0.0401611328125,
-0.008056640625,
-0.033172607421875,
0.046173095703125,
-0.035614013671875,
-0.050323486328125,
0.03692626953125,
0.001678466796875,
-0.0201568603515625,
0.04058837890625,
0.040008544921875,
0.014862060546875,
0.06170654296875,
0.05206298828125,
-0.0261993408203125,
0.040985107421875,
-0.0242156982421875,
0.032470703125,
-0.057647705078125,
-0.0182647705078125,
-0.02166748046875,
-0.004367828369140625,
-0.0552978515625,
-0.03155517578125,
0.00882720947265625,
0.0227508544921875,
-0.01947021484375,
0.0592041015625,
-0.0433349609375,
0.01520538330078125,
0.033782958984375,
0.00589752197265625,
0.0174407958984375,
-0.0035915374755859375,
-0.0174407958984375,
-0.0211639404296875,
-0.05694580078125,
-0.039154052734375,
0.09014892578125,
0.036468505859375,
0.041961669921875,
-0.007640838623046875,
0.041015625,
0.0005064010620117188,
-0.0016231536865234375,
-0.047698974609375,
0.046630859375,
0.00495147705078125,
-0.036346435546875,
-0.017242431640625,
-0.032196044921875,
-0.07879638671875,
0.015869140625,
0.0003459453582763672,
-0.04925537109375,
0.003925323486328125,
-0.00824737548828125,
-0.0171356201171875,
0.02813720703125,
-0.06488037109375,
0.06689453125,
-0.005619049072265625,
0.0007562637329101562,
-0.005374908447265625,
-0.052154541015625,
0.039642333984375,
-0.01451873779296875,
-0.000091552734375,
0.013671875,
0.006622314453125,
0.08062744140625,
-0.05352783203125,
0.058563232421875,
-0.0204925537109375,
0.0015649795532226562,
0.01515960693359375,
-0.01261138916015625,
0.025634765625,
-0.01325225830078125,
0.01641845703125,
0.0176239013671875,
0.0086822509765625,
-0.029449462890625,
-0.0228271484375,
0.03192138671875,
-0.072021484375,
-0.01873779296875,
-0.0531005859375,
-0.01641845703125,
0.0018453598022460938,
0.04168701171875,
0.049957275390625,
0.00923919677734375,
-0.002643585205078125,
0.01139068603515625,
0.055084228515625,
-0.03729248046875,
0.0218505859375,
0.0276947021484375,
-0.0291595458984375,
-0.04425048828125,
0.07574462890625,
0.0301971435546875,
0.007232666015625,
0.030670166015625,
0.0142059326171875,
-0.0169219970703125,
-0.02471923828125,
-0.024566650390625,
0.034210205078125,
-0.05303955078125,
-0.01910400390625,
-0.0693359375,
-0.029693603515625,
-0.038848876953125,
0.00836181640625,
-0.0293731689453125,
-0.034881591796875,
-0.0330810546875,
-0.0167694091796875,
0.0235748291015625,
0.040679931640625,
0.0118865966796875,
0.040435791015625,
-0.0252532958984375,
0.0069732666015625,
0.00615692138671875,
0.01346588134765625,
0.00855255126953125,
-0.0640869140625,
-0.023406982421875,
0.0021419525146484375,
-0.0283355712890625,
-0.0592041015625,
0.031707763671875,
0.0009226799011230469,
0.013671875,
0.0312347412109375,
-0.006214141845703125,
0.07159423828125,
-0.04254150390625,
0.07855224609375,
0.0264434814453125,
-0.07666015625,
0.04541015625,
-0.01020050048828125,
0.0292510986328125,
0.038238525390625,
0.04010009765625,
-0.057891845703125,
-0.021331787109375,
-0.04803466796875,
-0.0665283203125,
0.05084228515625,
0.01123046875,
0.00335693359375,
0.0082855224609375,
0.01520538330078125,
0.00557708740234375,
0.0189056396484375,
-0.0787353515625,
-0.0209503173828125,
-0.0390625,
-0.015777587890625,
-0.0014562606811523438,
-0.026519775390625,
0.019805908203125,
-0.03912353515625,
0.0694580078125,
0.00928497314453125,
0.0308837890625,
0.0103759765625,
-0.038238525390625,
0.00641632080078125,
0.0094757080078125,
0.06744384765625,
0.03656005859375,
-0.0043182373046875,
-0.0006856918334960938,
0.0220489501953125,
-0.051971435546875,
0.010406494140625,
0.0122528076171875,
-0.0132293701171875,
0.010894775390625,
0.0301055908203125,
0.06903076171875,
0.0218048095703125,
-0.05322265625,
0.05340576171875,
0.0007128715515136719,
-0.011138916015625,
-0.035186767578125,
-0.01262664794921875,
0.0006799697875976562,
-0.016021728515625,
0.0219268798828125,
-0.0024127960205078125,
-0.0157623291015625,
-0.038848876953125,
0.021331787109375,
0.01104736328125,
-0.033050537109375,
-0.041168212890625,
0.060150146484375,
0.0021305084228515625,
-0.0253143310546875,
0.0355224609375,
-0.027862548828125,
-0.059112548828125,
0.0138092041015625,
0.03240966796875,
0.061309814453125,
-0.0198822021484375,
0.01026153564453125,
0.0401611328125,
0.0149383544921875,
0.0034351348876953125,
0.007320404052734375,
0.00180816650390625,
-0.04840087890625,
-0.0193023681640625,
-0.051971435546875,
-0.000010371208190917969,
0.025970458984375,
-0.025238037109375,
0.040924072265625,
-0.0384521484375,
0.0018358230590820312,
-0.006542205810546875,
0.033355712890625,
-0.044708251953125,
0.0191802978515625,
-0.0290679931640625,
0.07611083984375,
-0.051605224609375,
0.072509765625,
0.04119873046875,
-0.0714111328125,
-0.07794189453125,
0.010406494140625,
-0.00849151611328125,
-0.0662841796875,
0.061859130859375,
0.032012939453125,
0.0024433135986328125,
0.00241851806640625,
-0.02911376953125,
-0.048248291015625,
0.0875244140625,
0.013336181640625,
-0.0238494873046875,
-0.017578125,
0.0159149169921875,
0.04620361328125,
-0.0072174072265625,
0.0184783935546875,
0.029693603515625,
0.027801513671875,
-0.0083770751953125,
-0.07806396484375,
0.01032257080078125,
-0.032501220703125,
0.00634765625,
-0.01369476318359375,
-0.063232421875,
0.06829833984375,
0.00641632080078125,
-0.0263671875,
0.02203369140625,
0.047821044921875,
0.025665283203125,
0.0011949539184570312,
0.040283203125,
0.05560302734375,
0.0635986328125,
-0.0035076141357421875,
0.058868408203125,
-0.03265380859375,
0.02545166015625,
0.047637939453125,
-0.00925445556640625,
0.056671142578125,
0.0372314453125,
-0.02020263671875,
0.0439453125,
0.06512451171875,
0.00734710693359375,
0.04376220703125,
0.015869140625,
-0.00589752197265625,
0.01261138916015625,
-0.006072998046875,
-0.0250396728515625,
0.0394287109375,
-0.003910064697265625,
-0.01479339599609375,
0.00484466552734375,
0.0390625,
0.037994384765625,
-0.00879669189453125,
-0.0008502006530761719,
0.0574951171875,
0.0096893310546875,
-0.0550537109375,
0.048828125,
0.0191802978515625,
0.0782470703125,
-0.052520751953125,
0.019805908203125,
-0.020477294921875,
0.01515960693359375,
-0.007785797119140625,
-0.0289306640625,
-0.00015664100646972656,
0.0015020370483398438,
-0.0020694732666015625,
-0.022979736328125,
0.050323486328125,
-0.038055419921875,
-0.031524658203125,
0.01137542724609375,
0.0237274169921875,
0.019622802734375,
-0.002941131591796875,
-0.047882080078125,
0.0031185150146484375,
0.0011587142944335938,
-0.0306549072265625,
0.019744873046875,
0.046600341796875,
-0.00025391578674316406,
0.049713134765625,
0.044219970703125,
0.02685546875,
0.01265716552734375,
-0.00528717041015625,
0.060150146484375,
-0.049346923828125,
-0.05255126953125,
-0.05584716796875,
0.038360595703125,
-0.006732940673828125,
-0.044769287109375,
0.073486328125,
0.054443359375,
0.0753173828125,
-0.0052947998046875,
0.04644775390625,
0.005527496337890625,
0.042022705078125,
-0.044281005859375,
0.0638427734375,
-0.04541015625,
0.01220703125,
-0.0374755859375,
-0.056304931640625,
-0.025543212890625,
0.05096435546875,
-0.0150604248046875,
0.0165863037109375,
0.05938720703125,
0.0550537109375,
-0.0084228515625,
-0.00775146484375,
0.0091705322265625,
0.013397216796875,
0.0293731689453125,
0.04290771484375,
0.039642333984375,
-0.060821533203125,
0.051116943359375,
-0.0168304443359375,
-0.00559234619140625,
-0.0180206298828125,
-0.0423583984375,
-0.07049560546875,
-0.06878662109375,
-0.0271148681640625,
-0.026153564453125,
-0.0031337738037109375,
0.06756591796875,
0.049102783203125,
-0.061065673828125,
-0.0198974609375,
-0.0189971923828125,
-0.004451751708984375,
-0.00606536865234375,
-0.0227813720703125,
0.048919677734375,
-0.025238037109375,
-0.0546875,
0.0020732879638671875,
0.0167236328125,
0.00785064697265625,
-0.0282135009765625,
-0.0160675048828125,
-0.0474853515625,
-0.002948760986328125,
0.04583740234375,
0.0162353515625,
-0.053863525390625,
-0.021881103515625,
0.02178955078125,
-0.0195465087890625,
0.016387939453125,
0.048828125,
-0.04888916015625,
0.038238525390625,
0.023651123046875,
0.0445556640625,
0.0443115234375,
-0.016632080078125,
0.031707763671875,
-0.058258056640625,
0.0286102294921875,
0.00872802734375,
0.02789306640625,
0.032867431640625,
-0.005859375,
0.032867431640625,
0.05084228515625,
-0.041107177734375,
-0.08392333984375,
0.007602691650390625,
-0.0712890625,
-0.022552490234375,
0.08447265625,
-0.017974853515625,
-0.0228118896484375,
-0.0021228790283203125,
-0.031585693359375,
0.045928955078125,
-0.004505157470703125,
0.052490234375,
0.0728759765625,
0.0165863037109375,
-0.02215576171875,
-0.02667236328125,
0.045623779296875,
0.043670654296875,
-0.0511474609375,
-0.011749267578125,
0.00787353515625,
0.038726806640625,
0.017486572265625,
0.07427978515625,
-0.0076751708984375,
0.021636962890625,
-0.0170440673828125,
0.00580596923828125,
0.0062255859375,
-0.0131988525390625,
-0.033721923828125,
-0.00046443939208984375,
-0.0023097991943359375,
-0.0301971435546875
]
] |
timm/convit_base.fb_in1k | 2023-04-24T04:14:31.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"arxiv:2103.10697",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/convit_base.fb_in1k | 0 | 36,195 | timm | 2023-04-24T04:13:12 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
datasets:
- imagenet-1k
---
# Model card for convit_base.fb_in1k
A ConViT image classification model. Trained on ImageNet-1k by paper authors.
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 86.5
- GMACs: 17.5
- Activations (M): 31.8
- Image size: 224 x 224
- **Papers:**
- ConViT: Improving Vision Transformers with Soft Convolutional Inductive Biases: https://arxiv.org/abs/2103.10697
- **Dataset:** ImageNet-1k
- **Original:** https://github.com/facebookresearch/convit
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('convit_base.fb_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'convit_base.fb_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 197, 768) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@article{d2021convit,
title={ConViT: Improving Vision Transformers with Soft Convolutional Inductive Biases},
  author={d'Ascoli, St{\'e}phane and Touvron, Hugo and Leavitt, Matthew and Morcos, Ari and Biroli, Giulio and Sagun, Levent},
journal={arXiv preprint arXiv:2103.10697},
year={2021}
}
```
| 2,787 | [
[
-0.038421630859375,
-0.0333251953125,
-0.0089874267578125,
0.007640838623046875,
-0.03131103515625,
-0.033416748046875,
-0.028350830078125,
-0.02569580078125,
0.003265380859375,
0.02154541015625,
-0.0489501953125,
-0.047607421875,
-0.043792724609375,
-0.00945281982421875,
-0.01093292236328125,
0.0745849609375,
0.0057220458984375,
0.0153656005859375,
-0.0146484375,
-0.035675048828125,
-0.0206756591796875,
-0.01934814453125,
-0.058624267578125,
-0.036468505859375,
0.01806640625,
0.0233154296875,
0.045196533203125,
0.016632080078125,
0.057769775390625,
0.0328369140625,
-0.0096282958984375,
0.01204681396484375,
-0.023406982421875,
-0.0191802978515625,
0.0208892822265625,
-0.043426513671875,
-0.0248260498046875,
0.01387786865234375,
0.04754638671875,
0.03875732421875,
0.0011892318725585938,
0.0254364013671875,
0.0242919921875,
0.041778564453125,
-0.020904541015625,
0.0163421630859375,
-0.040374755859375,
0.0153656005859375,
-0.0161895751953125,
0.00440216064453125,
-0.0288848876953125,
-0.0386962890625,
0.0288543701171875,
-0.04034423828125,
0.04730224609375,
-0.011871337890625,
0.08380126953125,
0.029998779296875,
-0.0014028549194335938,
-0.004764556884765625,
-0.0213775634765625,
0.0538330078125,
-0.06158447265625,
0.03057861328125,
0.00460052490234375,
0.00757598876953125,
-0.0121612548828125,
-0.06951904296875,
-0.048736572265625,
-0.017303466796875,
-0.0195159912109375,
0.0009431838989257812,
-0.007381439208984375,
0.0163116455078125,
0.031280517578125,
0.025726318359375,
-0.04010009765625,
-0.00099945068359375,
-0.039459228515625,
-0.01221466064453125,
0.038177490234375,
-0.0013036727905273438,
0.005645751953125,
-0.00876617431640625,
-0.0457763671875,
-0.0345458984375,
-0.027435302734375,
0.03411865234375,
0.0177764892578125,
0.01446533203125,
-0.041778564453125,
0.0298004150390625,
0.005336761474609375,
0.034576416015625,
0.0254669189453125,
-0.0238800048828125,
0.0556640625,
-0.0036182403564453125,
-0.0411376953125,
-0.0033397674560546875,
0.07135009765625,
0.035858154296875,
0.03302001953125,
0.0013818740844726562,
-0.0013971328735351562,
-0.01036834716796875,
-0.01042938232421875,
-0.08990478515625,
-0.02191162109375,
0.01488494873046875,
-0.041229248046875,
-0.0251922607421875,
0.02630615234375,
-0.044891357421875,
-0.01062774658203125,
-0.0100555419921875,
0.047943115234375,
-0.0379638671875,
-0.027191162109375,
0.015899658203125,
-0.0104522705078125,
0.035186767578125,
0.01367950439453125,
-0.048919677734375,
0.0176544189453125,
0.0233154296875,
0.0745849609375,
0.0002627372741699219,
-0.0271453857421875,
-0.018646240234375,
-0.032867431640625,
-0.0158233642578125,
0.031402587890625,
-0.0005831718444824219,
-0.017791748046875,
-0.0014753341674804688,
0.036102294921875,
-0.01500701904296875,
-0.05523681640625,
0.034515380859375,
-0.005092620849609375,
0.01971435546875,
0.00426483154296875,
-0.0207061767578125,
-0.03216552734375,
0.0254058837890625,
-0.0255889892578125,
0.082763671875,
0.0283050537109375,
-0.058502197265625,
0.03155517578125,
-0.035064697265625,
-0.017181396484375,
-0.002964019775390625,
-0.0018558502197265625,
-0.080810546875,
-0.00450897216796875,
0.00841522216796875,
0.06207275390625,
-0.01227569580078125,
0.01493072509765625,
-0.04010009765625,
-0.027984619140625,
0.032562255859375,
-0.01517486572265625,
0.0740966796875,
0.002407073974609375,
-0.035736083984375,
0.01468658447265625,
-0.0506591796875,
0.004856109619140625,
0.02734375,
-0.02166748046875,
-0.00893402099609375,
-0.0428466796875,
0.01373291015625,
0.015777587890625,
0.006656646728515625,
-0.050994873046875,
0.032379150390625,
-0.0264129638671875,
0.0247344970703125,
0.04730224609375,
-0.00504302978515625,
0.031951904296875,
-0.017974853515625,
0.0175628662109375,
0.0206146240234375,
0.037689208984375,
0.00501251220703125,
-0.03857421875,
-0.0728759765625,
-0.022979736328125,
0.015594482421875,
0.034271240234375,
-0.046844482421875,
0.0280303955078125,
-0.01174163818359375,
-0.056793212890625,
-0.034332275390625,
0.01200103759765625,
0.040191650390625,
0.043426513671875,
0.0330810546875,
-0.039215087890625,
-0.0333251953125,
-0.072509765625,
0.00807952880859375,
0.0101318359375,
0.002605438232421875,
0.01904296875,
0.044281005859375,
-0.01271820068359375,
0.04193115234375,
-0.0306396484375,
-0.0307464599609375,
-0.0156402587890625,
-0.003910064697265625,
0.04034423828125,
0.0648193359375,
0.06414794921875,
-0.057281494140625,
-0.037811279296875,
-0.018524169921875,
-0.0728759765625,
0.018829345703125,
-0.005672454833984375,
-0.015106201171875,
0.0160675048828125,
0.016754150390625,
-0.06011962890625,
0.049957275390625,
0.00804901123046875,
-0.0249176025390625,
0.03466796875,
-0.006072998046875,
0.0031147003173828125,
-0.072509765625,
-0.000008344650268554688,
0.037811279296875,
-0.0049896240234375,
-0.0384521484375,
-0.007659912109375,
0.0031681060791015625,
0.006069183349609375,
-0.053985595703125,
0.04364013671875,
-0.0306396484375,
0.00628662109375,
-0.0028896331787109375,
-0.0245513916015625,
0.002437591552734375,
0.06744384765625,
-0.00882720947265625,
0.025604248046875,
0.054595947265625,
-0.044097900390625,
0.039398193359375,
0.0438232421875,
-0.0250396728515625,
0.035614013671875,
-0.045989990234375,
0.00905609130859375,
-0.000812530517578125,
0.02142333984375,
-0.080078125,
-0.0103759765625,
0.030853271484375,
-0.044952392578125,
0.056243896484375,
-0.04522705078125,
-0.0299530029296875,
-0.0338134765625,
-0.037628173828125,
0.033538818359375,
0.046173095703125,
-0.050994873046875,
0.02191162109375,
0.00951385498046875,
0.0189208984375,
-0.045867919921875,
-0.06866455078125,
-0.03057861328125,
-0.021240234375,
-0.0604248046875,
0.032745361328125,
-0.0009317398071289062,
0.00586700439453125,
0.00684356689453125,
-0.0033359527587890625,
0.0051116943359375,
-0.0130615234375,
0.0377197265625,
0.02740478515625,
-0.0267486572265625,
0.002719879150390625,
-0.020111083984375,
-0.0096435546875,
0.00009948015213012695,
-0.02288818359375,
0.041778564453125,
-0.0235443115234375,
-0.010589599609375,
-0.055877685546875,
-0.00806427001953125,
0.0401611328125,
-0.01043701171875,
0.055755615234375,
0.08074951171875,
-0.036224365234375,
-0.010009765625,
-0.034027099609375,
-0.0253448486328125,
-0.03936767578125,
0.03155517578125,
-0.0269622802734375,
-0.0285491943359375,
0.05145263671875,
0.0039043426513671875,
0.00823974609375,
0.04833984375,
0.034210205078125,
0.0002224445343017578,
0.055267333984375,
0.050811767578125,
0.0238800048828125,
0.0633544921875,
-0.08648681640625,
-0.01387786865234375,
-0.063232421875,
-0.050323486328125,
-0.018707275390625,
-0.048736572265625,
-0.06036376953125,
-0.01497650146484375,
0.035736083984375,
-0.004642486572265625,
-0.027008056640625,
0.026580810546875,
-0.07293701171875,
0.0033206939697265625,
0.05804443359375,
0.043365478515625,
-0.0202484130859375,
0.015899658203125,
-0.02001953125,
-0.00466156005859375,
-0.05926513671875,
-0.00833892822265625,
0.07275390625,
0.033721923828125,
0.0682373046875,
-0.0150909423828125,
0.032196044921875,
-0.016571044921875,
0.03265380859375,
-0.05029296875,
0.045928955078125,
-0.0287017822265625,
-0.035888671875,
-0.00372314453125,
-0.01290130615234375,
-0.06842041015625,
0.01337432861328125,
-0.0286102294921875,
-0.0430908203125,
0.033355712890625,
0.01239013671875,
-0.0262298583984375,
0.044464111328125,
-0.06475830078125,
0.07891845703125,
-0.01446533203125,
-0.04132080078125,
0.00988006591796875,
-0.05126953125,
0.0258331298828125,
0.024871826171875,
-0.026763916015625,
-0.0027561187744140625,
0.0275421142578125,
0.077392578125,
-0.0433349609375,
0.06781005859375,
-0.045379638671875,
0.033203125,
0.04193115234375,
-0.0106201171875,
0.02252197265625,
-0.00417327880859375,
-0.004261016845703125,
0.025634765625,
0.0011663436889648438,
-0.0264739990234375,
-0.038421630859375,
0.04608154296875,
-0.061553955078125,
-0.02703857421875,
-0.03460693359375,
-0.036834716796875,
0.00814056396484375,
-0.0052337646484375,
0.055328369140625,
0.06207275390625,
0.01305389404296875,
0.04095458984375,
0.053619384765625,
-0.024749755859375,
0.031982421875,
-0.00572967529296875,
-0.00908660888671875,
-0.024566650390625,
0.0556640625,
0.020050048828125,
0.0110626220703125,
0.005619049072265625,
0.0092315673828125,
-0.0252685546875,
-0.046051025390625,
-0.024658203125,
0.026824951171875,
-0.0506591796875,
-0.041168212890625,
-0.038604736328125,
-0.038970947265625,
-0.03021240234375,
-0.00917816162109375,
-0.032745361328125,
-0.0241546630859375,
-0.0251617431640625,
0.00502777099609375,
0.04815673828125,
0.0355224609375,
-0.0012025833129882812,
0.049285888671875,
-0.03131103515625,
0.006168365478515625,
0.009521484375,
0.0408935546875,
-0.0127716064453125,
-0.0672607421875,
-0.0249481201171875,
-0.0015516281127929688,
-0.0418701171875,
-0.052520751953125,
0.03399658203125,
0.017974853515625,
0.03857421875,
0.0340576171875,
-0.0170135498046875,
0.05572509765625,
0.0004863739013671875,
0.0396728515625,
0.026519775390625,
-0.04022216796875,
0.0447998046875,
0.0015478134155273438,
0.01126861572265625,
0.0217132568359375,
0.0169525146484375,
-0.0146484375,
-0.00876617431640625,
-0.0745849609375,
-0.054931640625,
0.057373046875,
0.01377105712890625,
0.006298065185546875,
0.031890869140625,
0.0576171875,
0.01068115234375,
0.0034961700439453125,
-0.06341552734375,
-0.0274810791015625,
-0.0301971435546875,
-0.0299224853515625,
-0.00431060791015625,
-0.002162933349609375,
0.006336212158203125,
-0.055267333984375,
0.05523681640625,
-0.009368896484375,
0.044525146484375,
0.03466796875,
-0.00826263427734375,
-0.009185791015625,
-0.0256805419921875,
0.03106689453125,
0.0195770263671875,
-0.03314208984375,
0.0048370361328125,
0.0251922607421875,
-0.04638671875,
-0.00011533498764038086,
0.0218048095703125,
-0.0126190185546875,
0.0133056640625,
0.029693603515625,
0.061981201171875,
0.006641387939453125,
0.003704071044921875,
0.04315185546875,
-0.014404296875,
-0.0282440185546875,
-0.03204345703125,
0.0188446044921875,
-0.01480865478515625,
0.03863525390625,
0.03192138671875,
0.0347900390625,
-0.00232696533203125,
-0.0217132568359375,
0.018280029296875,
0.055908203125,
-0.034698486328125,
-0.027862548828125,
0.051361083984375,
-0.000530242919921875,
-0.0113983154296875,
0.0626220703125,
-0.005741119384765625,
-0.0367431640625,
0.078857421875,
0.025970458984375,
0.07464599609375,
-0.00942230224609375,
-0.002307891845703125,
0.06231689453125,
0.020751953125,
0.001148223876953125,
0.0100250244140625,
0.01397705078125,
-0.059722900390625,
0.0035037994384765625,
-0.048187255859375,
-0.006114959716796875,
0.039215087890625,
-0.033721923828125,
0.02972412109375,
-0.051666259765625,
-0.0302734375,
0.016937255859375,
0.032073974609375,
-0.082763671875,
0.0229949951171875,
0.0066680908203125,
0.06719970703125,
-0.059967041015625,
0.061676025390625,
0.054229736328125,
-0.049346923828125,
-0.06903076171875,
-0.01345062255859375,
-0.00827789306640625,
-0.0540771484375,
0.034759521484375,
0.0452880859375,
0.002410888671875,
0.0054931640625,
-0.0821533203125,
-0.032684326171875,
0.09820556640625,
0.03729248046875,
-0.01763916015625,
0.0175018310546875,
-0.003078460693359375,
0.0210723876953125,
-0.032562255859375,
0.0222015380859375,
0.0139617919921875,
0.0267486572265625,
0.0171661376953125,
-0.04833984375,
0.0129547119140625,
-0.0265655517578125,
-0.00405120849609375,
0.00969696044921875,
-0.0692138671875,
0.06787109375,
-0.032196044921875,
-0.0087127685546875,
0.00615692138671875,
0.04876708984375,
0.00862884521484375,
0.025604248046875,
0.043121337890625,
0.04925537109375,
0.033721923828125,
-0.026519775390625,
0.0604248046875,
0.00396728515625,
0.05108642578125,
0.043914794921875,
0.030120849609375,
0.0283050537109375,
0.033447265625,
-0.017486572265625,
0.0288238525390625,
0.075439453125,
-0.0357666015625,
0.034576416015625,
0.0100250244140625,
-0.0015087127685546875,
-0.0085906982421875,
0.0010213851928710938,
-0.035888671875,
0.03509521484375,
0.0231170654296875,
-0.031494140625,
-0.006938934326171875,
0.0131378173828125,
-0.0019817352294921875,
-0.0281829833984375,
-0.015899658203125,
0.037628173828125,
0.009674072265625,
-0.04486083984375,
0.06011962890625,
-0.0078887939453125,
0.07879638671875,
-0.0352783203125,
-0.0015878677368164062,
-0.029693603515625,
0.045074462890625,
-0.031768798828125,
-0.06842041015625,
0.0180511474609375,
-0.0187225341796875,
-0.0007805824279785156,
0.004047393798828125,
0.05572509765625,
-0.036346435546875,
-0.0438232421875,
0.020599365234375,
0.0184478759765625,
0.0271759033203125,
-0.00514984130859375,
-0.08160400390625,
-0.0033016204833984375,
0.00302886962890625,
-0.04681396484375,
0.01361846923828125,
0.030853271484375,
0.00499725341796875,
0.056304931640625,
0.053955078125,
-0.0217132568359375,
0.01580810546875,
-0.005756378173828125,
0.06597900390625,
-0.03021240234375,
-0.0146942138671875,
-0.06671142578125,
0.047332763671875,
0.0007848739624023438,
-0.048919677734375,
0.033721923828125,
0.04833984375,
0.076171875,
-0.009185791015625,
0.03472900390625,
-0.01444244384765625,
-0.0014963150024414062,
-0.035308837890625,
0.047943115234375,
-0.053619384765625,
-0.0093231201171875,
-0.016632080078125,
-0.057861328125,
-0.031463623046875,
0.05517578125,
-0.01543426513671875,
0.0232086181640625,
0.03753662109375,
0.0797119140625,
-0.03668212890625,
-0.0323486328125,
0.01416015625,
0.021026611328125,
0.01016998291015625,
0.038665771484375,
0.035614013671875,
-0.059051513671875,
0.037506103515625,
-0.0499267578125,
-0.01397705078125,
-0.023101806640625,
-0.0364990234375,
-0.0714111328125,
-0.06304931640625,
-0.046417236328125,
-0.05499267578125,
-0.018829345703125,
0.06976318359375,
0.0775146484375,
-0.053009033203125,
-0.002536773681640625,
0.005596160888671875,
0.0095977783203125,
-0.014556884765625,
-0.0177459716796875,
0.04443359375,
-0.01505279541015625,
-0.059051513671875,
-0.0292816162109375,
-0.003871917724609375,
0.029693603515625,
-0.014373779296875,
-0.02203369140625,
-0.020965576171875,
-0.019866943359375,
0.02337646484375,
0.0177459716796875,
-0.046051025390625,
-0.0164642333984375,
-0.003936767578125,
-0.013336181640625,
0.033172607421875,
0.025421142578125,
-0.047576904296875,
0.022216796875,
0.0292205810546875,
0.016082763671875,
0.06939697265625,
-0.0207672119140625,
-0.004241943359375,
-0.065185546875,
0.051605224609375,
-0.00689697265625,
0.0396728515625,
0.042572021484375,
-0.031982421875,
0.05218505859375,
0.037689208984375,
-0.034820556640625,
-0.058990478515625,
-0.00763702392578125,
-0.0977783203125,
-0.002849578857421875,
0.07275390625,
-0.0158843994140625,
-0.037139892578125,
0.0291748046875,
-0.011199951171875,
0.054534912109375,
-0.010955810546875,
0.0288543701171875,
0.0162506103515625,
-0.0064239501953125,
-0.03887939453125,
-0.0396728515625,
0.0457763671875,
0.00966644287109375,
-0.03692626953125,
-0.039520263671875,
-0.0027008056640625,
0.055755615234375,
0.017974853515625,
0.032867431640625,
-0.008209228515625,
0.0177459716796875,
-0.004787445068359375,
0.038482666015625,
-0.022430419921875,
-0.004619598388671875,
-0.023040771484375,
0.001499176025390625,
-0.017547607421875,
-0.052520751953125
]
] |
deepset/deberta-v3-base-squad2 | 2023-09-26T08:17:16.000Z | [
"transformers",
"pytorch",
"safetensors",
"deberta-v2",
"question-answering",
"deberta",
"deberta-v3",
"en",
"dataset:squad_v2",
"license:cc-by-4.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | question-answering | deepset | null | null | deepset/deberta-v3-base-squad2 | 16 | 36,181 | transformers | 2022-07-22T12:54:36 | ---
language: en
license: cc-by-4.0
tags:
- deberta
- deberta-v3
datasets:
- squad_v2
base_model: microsoft/deberta-v3-base
model-index:
- name: deepset/deberta-v3-base-squad2
results:
- task:
type: question-answering
name: Question Answering
dataset:
name: squad_v2
type: squad_v2
config: squad_v2
split: validation
metrics:
- type: exact_match
value: 83.8248
name: Exact Match
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiY2IyZTEyYzNlOTAwZmFlNWRiZTdiNzQzMTUyM2FmZTQ3ZWQwNWZmMzc2ZDVhYWYyMzkxOTUyMGNlMWY0M2E5MiIsInZlcnNpb24iOjF9.y8KvfefMLI977BYun0X1rAq5qudmezW_UJe9mh6sYBoiWaBosDO5TRnEGR1BHzdxmv2EgPK_PSomtZvb043jBQ
- type: f1
value: 87.41
name: F1
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiOWVhNjAwM2Q5N2Y3MGU4ZWY3N2Y0MmNjYWYwYmQzNTdiYWExODhkYmQ1YjIwM2I1ODEzNWIxZDI1ZWQ1YWRjNSIsInZlcnNpb24iOjF9.Jk0v1ZheLRFz6k9iNAgCMMZtPYj5eVwUCku4E76wRYc-jHPmiUuxvNiNkn6NW-jkBD8bJGMqDSjJyVpVMn9pBA
- task:
type: question-answering
name: Question Answering
dataset:
name: squad
type: squad
config: plain_text
split: validation
metrics:
- type: exact_match
value: 84.9678
name: Exact Match
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiOWUxYTg4MzU3YTdmMDRmMGM0NjFjMTcwNGM3YzljM2RkMTc1ZGNhMDQwMTgwNGI0ZDE4ZGMxZTE3YjY5YzQ0ZiIsInZlcnNpb24iOjF9.KKaJ1UtikNe2g6T8XhLoWNtL9X4dHHyl_O4VZ5LreBT9nXneGc21lI1AW3n8KXTFGemzRpRMvmCDyKVDHucdDQ
- type: f1
value: 92.2777
name: F1
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNDU0ZTQwMzg4ZDY1ZWYxOGIxMzY2ODljZTBkMTNlYjA0ODBjNjcxNTg3ZDliYWU1YTdkYTM2NTIxOTg1MGM4OCIsInZlcnNpb24iOjF9.8VHg1BXx6gLw_K7MUK2QSE80Y9guiVR8n8K8nX4laGsLibxv5u_yDv9F3ahbUa1eZG_bbidl93TY2qFUiYHtAQ
- task:
type: question-answering
name: Question Answering
dataset:
name: adversarial_qa
type: adversarial_qa
config: adversarialQA
split: validation
metrics:
- type: exact_match
value: 30.733
name: Exact Match
- type: f1
value: 44.099
name: F1
- task:
type: question-answering
name: Question Answering
dataset:
name: squad_adversarial
type: squad_adversarial
config: AddOneSent
split: validation
metrics:
- type: exact_match
value: 79.295
name: Exact Match
- type: f1
value: 86.609
name: F1
- task:
type: question-answering
name: Question Answering
dataset:
name: squadshifts amazon
type: squadshifts
config: amazon
split: test
metrics:
- type: exact_match
value: 68.680
name: Exact Match
- type: f1
value: 83.832
name: F1
- task:
type: question-answering
name: Question Answering
dataset:
name: squadshifts new_wiki
type: squadshifts
config: new_wiki
split: test
metrics:
- type: exact_match
value: 80.171
name: Exact Match
- type: f1
value: 90.452
name: F1
- task:
type: question-answering
name: Question Answering
dataset:
name: squadshifts nyt
type: squadshifts
config: nyt
split: test
metrics:
- type: exact_match
value: 81.570
name: Exact Match
- type: f1
value: 90.644
name: F1
- task:
type: question-answering
name: Question Answering
dataset:
name: squadshifts reddit
type: squadshifts
config: reddit
split: test
metrics:
- type: exact_match
value: 66.990
name: Exact Match
- type: f1
value: 80.231
name: F1
---
# deberta-v3-base for QA
This is the [deberta-v3-base](https://huggingface.co/microsoft/deberta-v3-base) model, fine-tuned using the [SQuAD2.0](https://huggingface.co/datasets/squad_v2) dataset. It's been trained on question-answer pairs, including unanswerable questions, for the task of Question Answering.
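Because the model is trained on SQuAD 2.0, it can also predict that a question has *no* answer in the context. Conceptually, the score of the null ("no answer") prediction is compared against the best answer span; the sketch below illustrates that decision with hypothetical scores and a hypothetical `threshold` parameter (it is not this model's internal API):

```python
# SQuAD 2.0-style answerability check (illustrative only; scores, names, and
# the threshold are hypothetical, not taken from this model's implementation).
def is_answerable(best_span_score, null_score, threshold=0.0):
    # Answer only if the best span beats the null prediction by more than `threshold`.
    return best_span_score - null_score > threshold

print(is_answerable(5.0, 3.2))  # True  -> return the span
print(is_answerable(1.0, 3.2))  # False -> return "no answer"
```

Raising the threshold makes the model more conservative, returning "no answer" more often.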
## Overview
**Language model:** deberta-v3-base
**Language:** English
**Downstream-task:** Extractive QA
**Training data:** SQuAD 2.0
**Eval data:** SQuAD 2.0
**Code:** See [an example QA pipeline on Haystack](https://haystack.deepset.ai/tutorials/first-qa-system)
**Infrastructure**: 1x NVIDIA A10G
## Hyperparameters
```
batch_size = 12
n_epochs = 4
base_LM_model = "deberta-v3-base"
max_seq_len = 512
learning_rate = 2e-5
lr_schedule = LinearWarmup
warmup_proportion = 0.2
doc_stride = 128
max_query_length = 64
```
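The `max_seq_len` and `doc_stride` values above control how long contexts are split: documents longer than 512 tokens are cut into overlapping windows so an answer near a window boundary still appears whole in at least one window. A minimal sketch of that windowing (illustrative only; the tokenizer's exact behavior, including special tokens and question length, differs):

```python
# Illustrative sliding-window chunking with the hyperparameters above
# (max_seq_len=512, doc_stride=128); not the tokenizer's exact implementation.
def sliding_windows(num_tokens, max_len=512, stride=128):
    """Return (start, end) token offsets of each overlapping window."""
    windows = []
    start = 0
    while True:
        end = min(start + max_len, num_tokens)
        windows.append((start, end))
        if end == num_tokens:
            break
        start = end - stride  # successive windows overlap by `stride` tokens
    return windows

print(sliding_windows(1000))  # [(0, 512), (384, 896), (768, 1000)]
```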
## Usage
### In Haystack
Haystack is an NLP framework by deepset. You can use this model in a Haystack pipeline to do question answering at scale (over many documents). To load the model in [Haystack](https://github.com/deepset-ai/haystack/):
```python
# Reader nodes live under haystack.nodes in Haystack 1.x
from haystack.nodes import FARMReader, TransformersReader

reader = FARMReader(model_name_or_path="deepset/deberta-v3-base-squad2")
# or
reader = TransformersReader(model_name_or_path="deepset/deberta-v3-base-squad2", tokenizer="deepset/deberta-v3-base-squad2")
```
### In Transformers
```python
from transformers import AutoModelForQuestionAnswering, AutoTokenizer, pipeline
model_name = "deepset/deberta-v3-base-squad2"
# a) Get predictions
nlp = pipeline('question-answering', model=model_name, tokenizer=model_name)
QA_input = {
'question': 'Why is model conversion important?',
'context': 'The option to convert models between FARM and transformers gives freedom to the user and let people easily switch between frameworks.'
}
res = nlp(QA_input)
# b) Load model & tokenizer
model = AutoModelForQuestionAnswering.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)
```
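The pipeline in option (a) hides the span-selection step that turns the model's start/end logits into an answer. As a rough sketch (with hypothetical logits, not output from this model), extractive QA picks the `(start, end)` token pair with the highest combined score, subject to `start <= end` and a maximum answer length:

```python
# Illustrative span selection over start/end logits (hypothetical values;
# the transformers pipeline's real implementation adds masking and softmax).
def best_span(start_logits, end_logits, max_answer_len=15):
    best, best_score = (0, 0), float("-inf")
    for s, s_logit in enumerate(start_logits):
        # Only consider ends at or after the start, within max_answer_len tokens.
        for e in range(s, min(s + max_answer_len, len(end_logits))):
            score = s_logit + end_logits[e]
            if score > best_score:
                best_score, best = score, (s, e)
    return best, best_score

start = [0.1, 2.0, 0.3, -1.0]
end = [0.0, 0.5, 3.0, 0.2]
span, score = best_span(start, end)
print(span, score)  # (1, 2) 5.0 -> tokens 1..2 maximize start + end logits
```

The selected token span is then mapped back to a character span in the context to produce the answer string.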
## Authors
**Sebastian Lee:** sebastian.lee [at] deepset.ai
**Timo Möller:** timo.moeller [at] deepset.ai
**Malte Pietsch:** malte.pietsch [at] deepset.ai
## About us
<div class="grid lg:grid-cols-2 gap-x-4 gap-y-3">
<div class="w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center">
<img alt="" src="https://huggingface.co/spaces/deepset/README/resolve/main/haystack-logo-colored.svg" class="w-40"/>
</div>
<div class="w-full h-40 object-cover mb-2 rounded-lg flex items-center justify-center">
<img alt="" src="https://huggingface.co/spaces/deepset/README/resolve/main/deepset-logo-colored.svg" class="w-40"/>
</div>
</div>
[deepset](http://deepset.ai/) is the company behind the open-source NLP framework [Haystack](https://haystack.deepset.ai/), which is designed to help you build production-ready NLP systems for question answering, summarization, ranking, and more.
Some of our other work:
- [Distilled roberta-base-squad2 (aka "tinyroberta-squad2")](https://huggingface.co/deepset/tinyroberta-squad2)
- [German BERT (aka "bert-base-german-cased")](https://deepset.ai/german-bert)
- [GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")](https://deepset.ai/germanquad)
## Get in touch and join the Haystack community
<p>For more info on Haystack, visit our <strong><a href="https://github.com/deepset-ai/haystack">GitHub</a></strong> repo and <strong><a href="https://haystack.deepset.ai">Documentation</a></strong>.
We also have a <strong><a class="h-7" href="https://haystack.deepset.ai/community/join">Discord community open to everyone!</a></strong></p>
[Twitter](https://twitter.com/deepset_ai) | [LinkedIn](https://www.linkedin.com/company/deepset-ai/) | [Discord](https://haystack.deepset.ai/community/join) | [GitHub Discussions](https://github.com/deepset-ai/haystack/discussions) | [Website](https://deepset.ai) | 7,719 | [
[
-0.0291900634765625,
-0.05084228515625,
0.0297698974609375,
0.01027679443359375,
-0.00153350830078125,
0.005413055419921875,
0.00323486328125,
-0.033355712890625,
0.0290069580078125,
0.0231170654296875,
-0.06207275390625,
-0.0457763671875,
-0.022552490234375,
0.0029888153076171875,
-0.030181884765625,
0.0640869140625,
0.00868988037109375,
-0.0028629302978515625,
-0.01125335693359375,
-0.01085662841796875,
-0.04034423828125,
-0.03594970703125,
-0.052978515625,
-0.0174560546875,
0.02288818359375,
0.0291900634765625,
0.0562744140625,
0.0258941650390625,
0.0352783203125,
0.0226898193359375,
-0.007305145263671875,
0.01155853271484375,
-0.0309295654296875,
0.019287109375,
-0.00605010986328125,
-0.0240020751953125,
-0.033172607421875,
0.00011551380157470703,
0.037322998046875,
0.0310516357421875,
-0.0073089599609375,
0.0372314453125,
-0.00850677490234375,
0.052459716796875,
-0.041595458984375,
0.0062408447265625,
-0.0533447265625,
-0.00959014892578125,
0.018218994140625,
0.0172119140625,
-0.005893707275390625,
-0.01446533203125,
0.017974853515625,
-0.047821044921875,
0.018218994140625,
-0.0200653076171875,
0.08575439453125,
0.02349853515625,
-0.0081634521484375,
-0.0113983154296875,
-0.03790283203125,
0.05975341796875,
-0.075439453125,
0.0013723373413085938,
0.039276123046875,
0.0284576416015625,
0.01068115234375,
-0.0645751953125,
-0.050994873046875,
0.005611419677734375,
-0.0198822021484375,
0.0131683349609375,
-0.019256591796875,
-0.025299072265625,
0.01052093505859375,
0.02740478515625,
-0.054107666015625,
0.01284027099609375,
-0.039764404296875,
0.0042724609375,
0.05902099609375,
0.0166015625,
0.0150909423828125,
-0.0222625732421875,
-0.0158843994140625,
-0.0208892822265625,
-0.039093017578125,
0.01364898681640625,
0.007293701171875,
0.0223846435546875,
-0.0114898681640625,
0.035430908203125,
-0.03363037109375,
0.044097900390625,
0.0235137939453125,
0.035736083984375,
0.0380859375,
-0.040191650390625,
-0.021331787109375,
-0.01508331298828125,
0.069580078125,
0.031768798828125,
0.00124359130859375,
-0.002105712890625,
-0.0174560546875,
-0.0106658935546875,
0.01438140869140625,
-0.07135009765625,
-0.01461029052734375,
0.042327880859375,
-0.0211944580078125,
-0.03533935546875,
-0.00563812255859375,
-0.053802490234375,
-0.0305023193359375,
0.0010814666748046875,
0.04595947265625,
-0.0290985107421875,
-0.03009033203125,
0.025238037109375,
-0.0140533447265625,
0.0516357421875,
0.01259613037109375,
-0.06817626953125,
0.00920867919921875,
0.043609619140625,
0.054931640625,
0.01873779296875,
-0.0225372314453125,
-0.03875732421875,
-0.01329803466796875,
-0.009918212890625,
0.039642333984375,
-0.0246124267578125,
-0.0008211135864257812,
-0.0010614395141601562,
0.0175323486328125,
-0.01114654541015625,
-0.03009033203125,
0.0114898681640625,
-0.04876708984375,
0.039093017578125,
-0.0126953125,
-0.03863525390625,
-0.02642822265625,
0.035675048828125,
-0.058380126953125,
0.080322265625,
0.0258331298828125,
-0.04736328125,
0.006866455078125,
-0.05487060546875,
-0.015380859375,
0.00817108154296875,
-0.0012884140014648438,
-0.021453857421875,
-0.0213470458984375,
0.034149169921875,
0.043609619140625,
-0.0179290771484375,
0.01015472412109375,
-0.01983642578125,
-0.0345458984375,
0.01212310791015625,
0.001903533935546875,
0.09271240234375,
0.003955841064453125,
-0.037811279296875,
-0.004405975341796875,
-0.049957275390625,
0.0192413330078125,
0.0162506103515625,
-0.00902557373046875,
-0.004364013671875,
-0.005817413330078125,
0.002498626708984375,
0.0262603759765625,
0.04315185546875,
-0.0360107421875,
0.0150604248046875,
-0.050384521484375,
0.048126220703125,
0.042694091796875,
0.0014858245849609375,
0.0270538330078125,
-0.024200439453125,
0.046783447265625,
0.000034809112548828125,
0.0118560791015625,
0.0100860595703125,
-0.02752685546875,
-0.07135009765625,
-0.005611419677734375,
0.032440185546875,
0.060089111328125,
-0.05902099609375,
0.056488037109375,
-0.0126495361328125,
-0.045928955078125,
-0.062255859375,
0.00905609130859375,
0.018707275390625,
0.0268096923828125,
0.0399169921875,
-0.0017528533935546875,
-0.05242919921875,
-0.07763671875,
0.000347137451171875,
-0.0171356201171875,
-0.018218994140625,
0.018341064453125,
0.053009033203125,
-0.032318115234375,
0.0665283203125,
-0.048126220703125,
-0.0257415771484375,
-0.01523590087890625,
-0.012939453125,
0.04132080078125,
0.052001953125,
0.053558349609375,
-0.05865478515625,
-0.034637451171875,
-0.022613525390625,
-0.05963134765625,
0.0268096923828125,
-0.005214691162109375,
-0.0232696533203125,
0.007061004638671875,
0.0285186767578125,
-0.05194091796875,
0.0192413330078125,
0.04180908203125,
-0.04083251953125,
0.018890380859375,
0.005870819091796875,
0.004329681396484375,
-0.11065673828125,
0.02410888671875,
-0.007293701171875,
-0.01425933837890625,
-0.03515625,
0.0171051025390625,
-0.018524169921875,
-0.0033740997314453125,
-0.037139892578125,
0.049530029296875,
-0.0149383544921875,
0.01247406005859375,
0.009765625,
0.0171966552734375,
0.02252197265625,
0.042755126953125,
-0.0078887939453125,
0.0645751953125,
0.052520751953125,
-0.03680419921875,
0.0487060546875,
0.0516357421875,
-0.02947998046875,
0.019500732421875,
-0.07672119140625,
0.01499176025390625,
-0.0017833709716796875,
0.01071929931640625,
-0.0697021484375,
-0.013885498046875,
0.019683837890625,
-0.0556640625,
-0.0000908970832824707,
-0.0044097900390625,
-0.05047607421875,
-0.031402587890625,
-0.038238525390625,
0.0178375244140625,
0.059906005859375,
-0.031982421875,
0.0268096923828125,
0.0301055908203125,
0.000537872314453125,
-0.050872802734375,
-0.0640869140625,
-0.0016231536865234375,
-0.00933074951171875,
-0.054290771484375,
0.0174560546875,
-0.01258087158203125,
-0.006866455078125,
0.0087890625,
0.005741119384765625,
-0.0372314453125,
0.024444580078125,
0.01163482666015625,
0.02734375,
-0.026397705078125,
0.023712158203125,
-0.0086517333984375,
-0.0026493072509765625,
0.0015707015991210938,
-0.0203857421875,
0.0433349609375,
-0.04864501953125,
0.00420379638671875,
-0.035675048828125,
0.02813720703125,
0.037689208984375,
-0.028106689453125,
0.066162109375,
0.05120849609375,
-0.034393310546875,
-0.0013017654418945312,
-0.03863525390625,
-0.025909423828125,
-0.036224365234375,
0.027099609375,
-0.018798828125,
-0.067138671875,
0.05010986328125,
0.0278167724609375,
0.02008056640625,
0.076416015625,
0.034423828125,
-0.0297088623046875,
0.07379150390625,
0.04205322265625,
-0.0017414093017578125,
0.0276947021484375,
-0.063720703125,
-0.0027618408203125,
-0.07159423828125,
-0.01148223876953125,
-0.038299560546875,
-0.037689208984375,
-0.046142578125,
-0.0279388427734375,
0.01239013671875,
0.016876220703125,
-0.03759765625,
0.0469970703125,
-0.058380126953125,
0.036834716796875,
0.053436279296875,
0.01232147216796875,
0.006252288818359375,
-0.004413604736328125,
0.01324462890625,
0.0176849365234375,
-0.058197021484375,
-0.036224365234375,
0.0765380859375,
0.016845703125,
0.033782958984375,
0.00902557373046875,
0.06378173828125,
0.0113372802734375,
-0.01033782958984375,
-0.048065185546875,
0.041015625,
-0.013397216796875,
-0.0732421875,
-0.04150390625,
-0.024749755859375,
-0.08050537109375,
0.00677490234375,
-0.0175323486328125,
-0.05224609375,
0.0281219482421875,
0.00274658203125,
-0.05181884765625,
0.01629638671875,
-0.042266845703125,
0.07464599609375,
-0.002605438232421875,
-0.01153564453125,
-0.0090789794921875,
-0.062103271484375,
0.01568603515625,
0.002819061279296875,
-0.00765228271484375,
-0.016143798828125,
0.001922607421875,
0.061859130859375,
-0.0308685302734375,
0.07159423828125,
-0.01016998291015625,
-0.004425048828125,
0.03515625,
0.0025730133056640625,
0.028350830078125,
0.0227813720703125,
-0.0278167724609375,
0.01605224609375,
0.034332275390625,
-0.037689208984375,
-0.0401611328125,
0.053497314453125,
-0.07171630859375,
-0.02880859375,
-0.032073974609375,
-0.0278167724609375,
-0.01119232177734375,
0.0227813720703125,
0.0167694091796875,
0.0306549072265625,
-0.01349639892578125,
0.040435791015625,
0.045562744140625,
-0.0097503662109375,
0.03173828125,
0.0377197265625,
-0.0114898681640625,
-0.02740478515625,
0.055084228515625,
-0.00557708740234375,
0.0070343017578125,
0.03326416015625,
0.006015777587890625,
-0.031463623046875,
-0.028106689453125,
-0.044525146484375,
0.0147552490234375,
-0.0516357421875,
-0.0278778076171875,
-0.048797607421875,
-0.03900146484375,
-0.048736572265625,
-0.0018863677978515625,
-0.026947021484375,
-0.049072265625,
-0.0418701171875,
-0.006923675537109375,
0.052215576171875,
0.0399169921875,
-0.004482269287109375,
0.01526641845703125,
-0.05810546875,
0.0278778076171875,
0.039947509765625,
0.0251922607421875,
-0.0088653564453125,
-0.041351318359375,
-0.015411376953125,
0.032958984375,
-0.0077056884765625,
-0.050994873046875,
0.018218994140625,
0.00983428955078125,
0.0242919921875,
-0.0132598876953125,
0.01096343994140625,
0.03460693359375,
-0.022796630859375,
0.0631103515625,
0.004901885986328125,
-0.062164306640625,
0.049072265625,
-0.0276641845703125,
0.03680419921875,
0.082763671875,
0.0233612060546875,
-0.0504150390625,
-0.018218994140625,
-0.05731201171875,
-0.0723876953125,
0.04595947265625,
0.029510498046875,
0.020751953125,
-0.0031223297119140625,
0.028350830078125,
-0.0055389404296875,
0.019927978515625,
-0.04766845703125,
-0.016876220703125,
-0.018157958984375,
-0.02374267578125,
-0.0013360977172851562,
-0.00640869140625,
-0.0137176513671875,
-0.031768798828125,
0.07122802734375,
-0.01090240478515625,
0.00713348388671875,
0.0249176025390625,
-0.0153045654296875,
0.0139923095703125,
0.00812530517578125,
0.0282440185546875,
0.06378173828125,
-0.02923583984375,
-0.0182647705078125,
0.0193328857421875,
-0.0246734619140625,
0.0030727386474609375,
0.0235595703125,
-0.037841796875,
0.003688812255859375,
0.0291900634765625,
0.060699462890625,
-0.0025196075439453125,
-0.046722412109375,
0.045379638671875,
-0.0103912353515625,
-0.032073974609375,
-0.04168701171875,
0.010772705078125,
0.018341064453125,
0.035675048828125,
0.0276336669921875,
-0.014404296875,
0.0135040283203125,
-0.0304107666015625,
0.01470184326171875,
0.0333251953125,
-0.0330810546875,
-0.004238128662109375,
0.035308837890625,
0.031494140625,
-0.022125244140625,
0.0643310546875,
-0.0157623291015625,
-0.041595458984375,
0.07171630859375,
0.01273345947265625,
0.08062744140625,
0.0124053955078125,
0.031036376953125,
0.044891357421875,
0.0297088623046875,
0.00860595703125,
0.0238494873046875,
0.0087432861328125,
-0.0487060546875,
-0.0277252197265625,
-0.0546875,
-0.0173492431640625,
0.0293426513671875,
-0.05181884765625,
0.0107574462890625,
-0.0323486328125,
-0.00513458251953125,
0.0012292861938476562,
0.0229644775390625,
-0.0682373046875,
0.0171661376953125,
-0.004093170166015625,
0.06292724609375,
-0.04290771484375,
0.034027099609375,
0.06292724609375,
-0.051788330078125,
-0.0638427734375,
-0.0099029541015625,
-0.024688720703125,
-0.072265625,
0.03857421875,
0.0237579345703125,
-0.01027679443359375,
0.022674560546875,
-0.058380126953125,
-0.06982421875,
0.09869384765625,
0.00569915771484375,
-0.034027099609375,
-0.0227813720703125,
-0.00421905517578125,
0.040313720703125,
-0.02581787109375,
0.0159912109375,
0.043304443359375,
0.0302886962890625,
0.008148193359375,
-0.061737060546875,
0.023193359375,
-0.028656005859375,
-0.005672454833984375,
-0.0015745162963867188,
-0.058380126953125,
0.059173583984375,
-0.00946807861328125,
-0.0140228271484375,
0.0223541259765625,
0.040802001953125,
0.01558685302734375,
0.000400543212890625,
0.04266357421875,
0.042327880859375,
0.05303955078125,
0.0019626617431640625,
0.06036376953125,
-0.01532745361328125,
0.050537109375,
0.087646484375,
-0.01226806640625,
0.07244873046875,
0.0245513916015625,
-0.0161895751953125,
0.054962158203125,
0.05194091796875,
-0.027984619140625,
0.0274810791015625,
0.01861572265625,
-0.0033168792724609375,
-0.0301055908203125,
0.007511138916015625,
-0.04827880859375,
0.044464111328125,
0.00785064697265625,
-0.021331787109375,
-0.018646240234375,
-0.02337646484375,
-0.00995635986328125,
-0.00572967529296875,
-0.00662994384765625,
0.06353759765625,
-0.00836181640625,
-0.0419921875,
0.07208251953125,
-0.01073455810546875,
0.04974365234375,
-0.045654296875,
-0.005001068115234375,
-0.023773193359375,
0.017303466796875,
-0.0100250244140625,
-0.066650390625,
0.01409149169921875,
-0.005218505859375,
-0.03204345703125,
-0.006160736083984375,
0.038055419921875,
-0.0345458984375,
-0.059844970703125,
0.00974273681640625,
0.047210693359375,
0.0162811279296875,
-0.0108642578125,
-0.07635498046875,
-0.0132904052734375,
-0.0006880760192871094,
-0.0264129638671875,
0.00669097900390625,
0.022735595703125,
0.01910400390625,
0.044525146484375,
0.0443115234375,
-0.0140228271484375,
-0.0070343017578125,
-0.005153656005859375,
0.072509765625,
-0.05670166015625,
-0.0205535888671875,
-0.05712890625,
0.046417236328125,
-0.0211334228515625,
-0.03387451171875,
0.048370361328125,
0.0447998046875,
0.059844970703125,
-0.010223388671875,
0.05029296875,
-0.021820068359375,
0.043121337890625,
-0.032928466796875,
0.07012939453125,
-0.06268310546875,
0.0018568038940429688,
-0.0011234283447265625,
-0.05352783203125,
-0.009552001953125,
0.054931640625,
-0.0026645660400390625,
0.006252288818359375,
0.050506591796875,
0.05975341796875,
0.00409698486328125,
-0.0216522216796875,
-0.0010471343994140625,
0.01971435546875,
0.0157012939453125,
0.06982421875,
0.053741455078125,
-0.06256103515625,
0.0496826171875,
-0.027435302734375,
-0.005443572998046875,
-0.016143798828125,
-0.05029296875,
-0.0682373046875,
-0.04815673828125,
-0.0258026123046875,
-0.049530029296875,
-0.0043182373046875,
0.0584716796875,
0.059539794921875,
-0.0675048828125,
-0.00791168212890625,
-0.0025882720947265625,
0.0115814208984375,
-0.0177459716796875,
-0.02203369140625,
0.033538818359375,
-0.016876220703125,
-0.051513671875,
0.0269927978515625,
-0.00872802734375,
0.00467681884765625,
-0.0283355712890625,
0.0065765380859375,
-0.05364990234375,
-0.0037364959716796875,
0.0362548828125,
0.0277252197265625,
-0.042388916015625,
-0.0096893310546875,
0.01947021484375,
-0.018646240234375,
-0.0013942718505859375,
0.0255889892578125,
-0.0770263671875,
0.01447296142578125,
0.048004150390625,
0.058441162109375,
0.040985107421875,
-0.0038166046142578125,
0.038360595703125,
-0.0489501953125,
0.00890350341796875,
0.040557861328125,
0.011322021484375,
0.0255279541015625,
-0.037384033203125,
0.05865478515625,
0.002178192138671875,
-0.035430908203125,
-0.058807373046875,
0.004180908203125,
-0.06585693359375,
-0.028045654296875,
0.08758544921875,
-0.0016641616821289062,
-0.014404296875,
0.011749267578125,
-0.009063720703125,
0.01126861572265625,
-0.025421142578125,
0.047332763671875,
0.0526123046875,
0.0090179443359375,
0.00801849365234375,
-0.043670654296875,
0.03515625,
0.040740966796875,
-0.06414794921875,
-0.00286102294921875,
0.034881591796875,
0.0238494873046875,
0.0185089111328125,
0.0469970703125,
0.01006317138671875,
0.030120849609375,
-0.016326904296875,
0.004154205322265625,
-0.0084075927734375,
-0.00923919677734375,
-0.03265380859375,
-0.0035228729248046875,
-0.018463134765625,
-0.035308837890625
]
] |
timm/volo_d1_224.sail_in1k | 2023-04-13T05:52:29.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"arxiv:2106.13112",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/volo_d1_224.sail_in1k | 1 | 36,075 | timm | 2023-04-13T05:51:34 | ---
tags:
- image-classification
- timm
library_tag: timm
license: apache-2.0
datasets:
- imagenet-1k
---
# Model card for volo_d1_224.sail_in1k
A VOLO (Vision Outlooker) image classification model. Trained on ImageNet-1k with token labelling by paper authors.
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 26.6
- GMACs: 6.9
- Activations (M): 24.4
- Image size: 224 x 224
- **Papers:**
- VOLO: Vision Outlooker for Visual Recognition: https://arxiv.org/abs/2106.13112
- **Dataset:** ImageNet-1k
- **Original:** https://github.com/sail-sg/volo
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('volo_d1_224.sail_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'volo_d1_224.sail_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 197, 384) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
## Citation
```bibtex
@article{yuan2022volo,
title={Volo: Vision outlooker for visual recognition},
author={Yuan, Li and Hou, Qibin and Jiang, Zihang and Feng, Jiashi and Yan, Shuicheng},
journal={IEEE Transactions on Pattern Analysis and Machine Intelligence},
year={2022},
publisher={IEEE}
}
```
| 2,597 | [
[
-0.029754638671875,
-0.0161895751953125,
0.00783538818359375,
0.01934814453125,
-0.044219970703125,
-0.03076171875,
0.000713348388671875,
-0.0253143310546875,
0.02227783203125,
0.03887939453125,
-0.051605224609375,
-0.04815673828125,
-0.053802490234375,
-0.004791259765625,
-0.004711151123046875,
0.06494140625,
0.005901336669921875,
0.00225067138671875,
-0.026519775390625,
-0.028045654296875,
-0.0288848876953125,
-0.021881103515625,
-0.0645751953125,
-0.0183868408203125,
0.0201568603515625,
0.0148468017578125,
0.0232391357421875,
0.0416259765625,
0.048248291015625,
0.035369873046875,
-0.01507568359375,
0.0014934539794921875,
-0.025054931640625,
-0.019561767578125,
0.029937744140625,
-0.060455322265625,
-0.048248291015625,
0.0193328857421875,
0.0587158203125,
0.0192108154296875,
0.0014333724975585938,
0.0251922607421875,
0.01108551025390625,
0.045562744140625,
-0.02294921875,
0.022735595703125,
-0.038543701171875,
0.02056884765625,
-0.0177459716796875,
0.0012941360473632812,
-0.024993896484375,
-0.0185394287109375,
0.0197296142578125,
-0.0589599609375,
0.0190582275390625,
0.0081634521484375,
0.09759521484375,
0.0226898193359375,
-0.00701141357421875,
0.0002694129943847656,
-0.010467529296875,
0.0679931640625,
-0.060638427734375,
0.024932861328125,
0.021881103515625,
0.0122528076171875,
-0.000995635986328125,
-0.058929443359375,
-0.040252685546875,
0.000052809715270996094,
-0.0037288665771484375,
0.0199432373046875,
-0.00446319580078125,
-0.00010699033737182617,
0.0161895751953125,
0.0196533203125,
-0.0350341796875,
0.006877899169921875,
-0.0447998046875,
-0.01398468017578125,
0.051422119140625,
0.01427459716796875,
0.0124359130859375,
-0.01509857177734375,
-0.04345703125,
-0.0330810546875,
-0.0190887451171875,
0.0275115966796875,
0.0206756591796875,
0.01155853271484375,
-0.0419921875,
0.032440185546875,
0.003387451171875,
0.040496826171875,
0.01515960693359375,
-0.028594970703125,
0.055206298828125,
0.006500244140625,
-0.039306640625,
-0.00923919677734375,
0.072998046875,
0.04052734375,
0.005420684814453125,
0.017486572265625,
-0.007633209228515625,
-0.041961669921875,
0.0031642913818359375,
-0.0732421875,
-0.004486083984375,
0.017669677734375,
-0.046875,
-0.050567626953125,
0.0343017578125,
-0.05908203125,
-0.01189422607421875,
-0.005947113037109375,
0.05157470703125,
-0.03179931640625,
-0.0185546875,
0.00963592529296875,
-0.0225982666015625,
0.035491943359375,
0.0206756591796875,
-0.03350830078125,
0.0054931640625,
0.024871826171875,
0.07843017578125,
0.003543853759765625,
-0.0384521484375,
-0.0219573974609375,
-0.024017333984375,
-0.0244903564453125,
0.03106689453125,
-0.01012420654296875,
-0.01091766357421875,
-0.0162200927734375,
0.0258026123046875,
-0.0161895751953125,
-0.035980224609375,
0.028839111328125,
-0.0225677490234375,
0.0333251953125,
-0.0037517547607421875,
-0.029449462890625,
-0.039459228515625,
0.019287109375,
-0.02880859375,
0.0751953125,
0.0229339599609375,
-0.07025146484375,
0.033355712890625,
-0.0335693359375,
0.002349853515625,
0.0021209716796875,
0.0053558349609375,
-0.08612060546875,
-0.0024890899658203125,
0.007526397705078125,
0.054412841796875,
-0.01910400390625,
-0.005046844482421875,
-0.03802490234375,
-0.0157623291015625,
0.0196533203125,
-0.0026645660400390625,
0.07904052734375,
0.003604888916015625,
-0.0191650390625,
0.00909423828125,
-0.060577392578125,
0.0056610107421875,
0.038787841796875,
-0.0214385986328125,
-0.0247802734375,
-0.040557861328125,
0.007602691650390625,
0.01168060302734375,
0.020904541015625,
-0.0445556640625,
0.032989501953125,
-0.021942138671875,
0.0210113525390625,
0.05670166015625,
-0.0017948150634765625,
0.02349853515625,
-0.0215301513671875,
0.028564453125,
0.029541015625,
0.03070068359375,
-0.0008025169372558594,
-0.05230712890625,
-0.0699462890625,
-0.042449951171875,
0.01026153564453125,
0.040374755859375,
-0.052398681640625,
0.041748046875,
-0.0131378173828125,
-0.060394287109375,
-0.029937744140625,
-0.00437164306640625,
0.03662109375,
0.03973388671875,
0.0201416015625,
-0.0269012451171875,
-0.043914794921875,
-0.0638427734375,
-0.00005555152893066406,
0.0011644363403320312,
0.00580596923828125,
0.0221405029296875,
0.050994873046875,
-0.0141754150390625,
0.04364013671875,
-0.040130615234375,
-0.0302276611328125,
-0.01352691650390625,
0.013427734375,
0.037994384765625,
0.04949951171875,
0.07110595703125,
-0.053619384765625,
-0.0423583984375,
-0.01355743408203125,
-0.0733642578125,
0.01210784912109375,
-0.013336181640625,
-0.0225677490234375,
0.0205841064453125,
0.01180267333984375,
-0.03717041015625,
0.05230712890625,
0.01355743408203125,
-0.036895751953125,
0.02850341796875,
-0.0207061767578125,
0.01013946533203125,
-0.08587646484375,
-0.005008697509765625,
0.020965576171875,
-0.006168365478515625,
-0.0411376953125,
-0.01739501953125,
-0.004833221435546875,
0.0086212158203125,
-0.0391845703125,
0.056793212890625,
-0.034210205078125,
0.0042877197265625,
-0.01336669921875,
-0.0224151611328125,
-0.001361846923828125,
0.04730224609375,
0.0025310516357421875,
0.014984130859375,
0.077392578125,
-0.0465087890625,
0.04400634765625,
0.044403076171875,
-0.01227569580078125,
0.04705810546875,
-0.047698974609375,
0.01198577880859375,
-0.0184478759765625,
-0.0007243156433105469,
-0.07940673828125,
-0.018096923828125,
0.0304107666015625,
-0.038604736328125,
0.0457763671875,
-0.036834716796875,
-0.0099334716796875,
-0.04803466796875,
-0.040679931640625,
0.03338623046875,
0.0509033203125,
-0.055389404296875,
0.03533935546875,
0.018157958984375,
0.0154266357421875,
-0.052520751953125,
-0.058441162109375,
-0.023529052734375,
-0.03387451171875,
-0.047637939453125,
0.0261993408203125,
0.00687408447265625,
0.01031494140625,
0.0008497238159179688,
-0.0028972625732421875,
-0.00995635986328125,
-0.0097198486328125,
0.03643798828125,
0.05029296875,
-0.01418304443359375,
-0.00836181640625,
-0.01500701904296875,
-0.0037326812744140625,
0.005496978759765625,
-0.024658203125,
0.03192138671875,
-0.0251007080078125,
-0.01110076904296875,
-0.056610107421875,
-0.0156707763671875,
0.04644775390625,
-0.0051422119140625,
0.07391357421875,
0.06915283203125,
-0.0311431884765625,
-0.007556915283203125,
-0.027862548828125,
-0.011871337890625,
-0.03759765625,
0.035308837890625,
-0.036102294921875,
-0.0247650146484375,
0.0595703125,
0.017181396484375,
0.003368377685546875,
0.06195068359375,
0.0167388916015625,
0.0006155967712402344,
0.05560302734375,
0.05914306640625,
0.01203155517578125,
0.05859375,
-0.06561279296875,
-0.01568603515625,
-0.0692138671875,
-0.0458984375,
-0.0140838623046875,
-0.026275634765625,
-0.037261962890625,
-0.0200958251953125,
0.029083251953125,
0.0134735107421875,
-0.0239715576171875,
0.037353515625,
-0.064453125,
0.007625579833984375,
0.058380126953125,
0.036376953125,
-0.02545166015625,
0.007328033447265625,
-0.0199127197265625,
-0.0030689239501953125,
-0.05731201171875,
-0.01067352294921875,
0.08526611328125,
0.03271484375,
0.05047607421875,
-0.027008056640625,
0.05047607421875,
-0.017242431640625,
0.0231170654296875,
-0.04290771484375,
0.046661376953125,
-0.0025463104248046875,
-0.023040771484375,
-0.01873779296875,
-0.028076171875,
-0.07366943359375,
0.014312744140625,
-0.02899169921875,
-0.054168701171875,
0.0244598388671875,
0.0053863525390625,
-0.0133819580078125,
0.0445556640625,
-0.040130615234375,
0.07025146484375,
-0.0152435302734375,
-0.0230712890625,
0.0149688720703125,
-0.04315185546875,
0.0184173583984375,
0.0038700103759765625,
-0.0098876953125,
0.0023365020751953125,
0.010589599609375,
0.09033203125,
-0.041778564453125,
0.06951904296875,
-0.0313720703125,
0.024749755859375,
0.03802490234375,
-0.0211944580078125,
0.0244903564453125,
-0.00484466552734375,
0.0015411376953125,
0.028167724609375,
0.005462646484375,
-0.0253143310546875,
-0.0279083251953125,
0.0301055908203125,
-0.06951904296875,
-0.0256195068359375,
-0.04302978515625,
-0.032073974609375,
0.0180511474609375,
0.00931549072265625,
0.058685302734375,
0.041717529296875,
0.01203155517578125,
0.0220947265625,
0.041046142578125,
-0.0301971435546875,
0.032318115234375,
0.00785064697265625,
-0.020416259765625,
-0.045501708984375,
0.066650390625,
0.01824951171875,
0.004497528076171875,
0.0164031982421875,
0.0123748779296875,
-0.0197601318359375,
-0.03564453125,
-0.01444244384765625,
0.036346435546875,
-0.04998779296875,
-0.02984619140625,
-0.0308685302734375,
-0.039886474609375,
-0.032470703125,
-0.00843048095703125,
-0.038604736328125,
-0.0225067138671875,
-0.0361328125,
0.0140380859375,
0.047210693359375,
0.047210693359375,
-0.00746917724609375,
0.0350341796875,
-0.051361083984375,
0.0025177001953125,
0.01800537109375,
0.035369873046875,
-0.007152557373046875,
-0.0797119140625,
-0.0229034423828125,
-0.0005202293395996094,
-0.032989501953125,
-0.061614990234375,
0.054779052734375,
0.01364898681640625,
0.05352783203125,
0.045806884765625,
-0.0174713134765625,
0.051025390625,
-0.0019178390502929688,
0.0416259765625,
0.0263824462890625,
-0.055633544921875,
0.059173583984375,
-0.006320953369140625,
0.007354736328125,
0.0004146099090576172,
0.03497314453125,
-0.0176239013671875,
-0.0103302001953125,
-0.0703125,
-0.050506591796875,
0.0750732421875,
0.01885986328125,
-0.013885498046875,
0.0258941650390625,
0.057403564453125,
0.00011152029037475586,
0.01305389404296875,
-0.0626220703125,
-0.0308074951171875,
-0.036163330078125,
-0.0201568603515625,
-0.0005345344543457031,
-0.00783538818359375,
0.0020694732666015625,
-0.046875,
0.06561279296875,
-0.002323150634765625,
0.056610107421875,
0.0284423828125,
0.006824493408203125,
-0.00646209716796875,
-0.030853271484375,
0.04156494140625,
0.01354217529296875,
-0.0253448486328125,
-0.0098724365234375,
0.01971435546875,
-0.05474853515625,
-0.00016880035400390625,
0.00218963623046875,
-0.004207611083984375,
0.0007758140563964844,
0.03924560546875,
0.07733154296875,
-0.01265716552734375,
-0.0042572021484375,
0.04534912109375,
-0.00853729248046875,
-0.0357666015625,
-0.0229034423828125,
0.01092529296875,
-0.0160369873046875,
0.035675048828125,
0.02410888671875,
0.027557373046875,
-0.006427764892578125,
-0.0175628662109375,
0.0023860931396484375,
0.050048828125,
-0.035003662109375,
-0.02685546875,
0.060394287109375,
-0.01393890380859375,
-0.01195526123046875,
0.0596923828125,
-0.016387939453125,
-0.04608154296875,
0.0855712890625,
0.026336669921875,
0.07672119140625,
-0.0118255615234375,
0.002197265625,
0.06829833984375,
0.0184173583984375,
-0.0090179443359375,
0.025665283203125,
0.0225372314453125,
-0.06195068359375,
0.0089263916015625,
-0.048797607421875,
-0.0031871795654296875,
0.0287933349609375,
-0.04632568359375,
0.032257080078125,
-0.039337158203125,
-0.024627685546875,
0.00347137451171875,
0.0147705078125,
-0.06463623046875,
0.0164947509765625,
0.0179595947265625,
0.053802490234375,
-0.06842041015625,
0.050201416015625,
0.07025146484375,
-0.0399169921875,
-0.05908203125,
-0.0223541259765625,
0.00682830810546875,
-0.06744384765625,
0.051605224609375,
0.03369140625,
0.01309967041015625,
-0.0035495758056640625,
-0.0653076171875,
-0.06072998046875,
0.113037109375,
0.01934814453125,
-0.0185546875,
0.0186767578125,
-0.004283905029296875,
0.0217437744140625,
-0.041595458984375,
0.0251007080078125,
-0.00373077392578125,
0.0355224609375,
0.02569580078125,
-0.055816650390625,
0.0107574462890625,
-0.02215576171875,
0.0093536376953125,
0.0078125,
-0.06768798828125,
0.0758056640625,
-0.03216552734375,
-0.001079559326171875,
0.006755828857421875,
0.0526123046875,
0.017669677734375,
0.0188140869140625,
0.043701171875,
0.058441162109375,
0.051361083984375,
-0.020416259765625,
0.06402587890625,
0.003963470458984375,
0.050811767578125,
0.035369873046875,
0.021331787109375,
0.038421630859375,
0.0251922607421875,
-0.012481689453125,
0.027862548828125,
0.0638427734375,
-0.03948974609375,
0.043121337890625,
0.005634307861328125,
0.008087158203125,
-0.012420654296875,
0.0211944580078125,
-0.0283966064453125,
0.042724609375,
0.00923919677734375,
-0.03167724609375,
0.0003600120544433594,
0.0253143310546875,
-0.00809478759765625,
-0.032379150390625,
-0.034393310546875,
0.041015625,
0.001445770263671875,
-0.034576416015625,
0.05633544921875,
-0.01129913330078125,
0.0670166015625,
-0.032012939453125,
-0.0015239715576171875,
-0.01214599609375,
0.0279541015625,
-0.0196380615234375,
-0.06329345703125,
0.0217437744140625,
-0.02569580078125,
-0.0032634735107421875,
-0.001941680908203125,
0.0462646484375,
-0.028564453125,
-0.047698974609375,
0.0139312744140625,
0.0195465087890625,
0.036376953125,
-0.00489044189453125,
-0.07562255859375,
-0.0003628730773925781,
0.0120391845703125,
-0.0521240234375,
0.0182342529296875,
0.043426513671875,
0.016082763671875,
0.062286376953125,
0.0369873046875,
-0.0080413818359375,
0.01806640625,
-0.01959228515625,
0.055633544921875,
-0.03802490234375,
-0.027313232421875,
-0.061279296875,
0.044097900390625,
-0.01279449462890625,
-0.054046630859375,
0.0487060546875,
0.04498291015625,
0.058380126953125,
-0.021148681640625,
0.034149169921875,
-0.00618743896484375,
0.000598907470703125,
-0.03717041015625,
0.05316162109375,
-0.06170654296875,
-0.0022029876708984375,
-0.0208587646484375,
-0.057769775390625,
-0.0267333984375,
0.061920166015625,
-0.0170440673828125,
0.017578125,
0.036834716796875,
0.058013916015625,
-0.0169830322265625,
-0.0237884521484375,
0.02099609375,
0.0239715576171875,
0.0051116943359375,
0.03521728515625,
0.0369873046875,
-0.06982421875,
0.02685546875,
-0.043212890625,
-0.0210723876953125,
-0.019622802734375,
-0.05511474609375,
-0.0894775390625,
-0.06915283203125,
-0.051544189453125,
-0.037750244140625,
-0.0204925537109375,
0.062042236328125,
0.0745849609375,
-0.04058837890625,
-0.006443023681640625,
-0.005218505859375,
0.0088043212890625,
-0.0271759033203125,
-0.0193023681640625,
0.03350830078125,
-0.01062774658203125,
-0.057525634765625,
-0.0256805419921875,
0.011016845703125,
0.043975830078125,
0.00511932373046875,
-0.019866943359375,
-0.00865936279296875,
-0.006832122802734375,
0.024749755859375,
0.01885986328125,
-0.04132080078125,
-0.0125274658203125,
-0.0049896240234375,
-0.01016998291015625,
0.029083251953125,
0.0045623779296875,
-0.044219970703125,
0.0180511474609375,
0.035980224609375,
0.01715087890625,
0.06414794921875,
-0.0259857177734375,
-0.0004901885986328125,
-0.05767822265625,
0.041534423828125,
-0.0098724365234375,
0.04461669921875,
0.01535797119140625,
-0.03570556640625,
0.0413818359375,
0.041229248046875,
-0.036895751953125,
-0.061004638671875,
-0.019561767578125,
-0.09552001953125,
-0.00826263427734375,
0.0599365234375,
-0.0213623046875,
-0.047760009765625,
0.03338623046875,
-0.011627197265625,
0.03277587890625,
-0.0013523101806640625,
0.033966064453125,
0.02984619140625,
-0.004482269287109375,
-0.03887939453125,
-0.0270233154296875,
0.0303192138671875,
-0.0012140274047851562,
-0.0533447265625,
-0.03924560546875,
-0.005859375,
0.046844482421875,
0.0240631103515625,
0.039459228515625,
-0.0018110275268554688,
0.010894775390625,
0.0162200927734375,
0.0394287109375,
-0.0240936279296875,
-0.01213836669921875,
-0.0250244140625,
-0.007190704345703125,
-0.019134521484375,
-0.047637939453125
]
] |
dangvantuan/sentence-camembert-large | 2023-09-12T11:38:28.000Z | [
"transformers",
"pytorch",
"tf",
"safetensors",
"camembert",
"feature-extraction",
"Text",
"Sentence Similarity",
"Sentence-Embedding",
"camembert-large",
"sentence-similarity",
"fr",
"dataset:stsb_multi_mt",
"arxiv:1908.10084",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] | sentence-similarity | dangvantuan | null | null | dangvantuan/sentence-camembert-large | 36 | 36,048 | transformers | 2022-03-02T23:29:05 | ---
pipeline_tag: sentence-similarity
language: fr
datasets:
- stsb_multi_mt
tags:
- Text
- Sentence Similarity
- Sentence-Embedding
- camembert-large
license: apache-2.0
model-index:
- name: sentence-camembert-large by Van Tuan DANG
results:
- task:
name: Sentence-Embedding
type: Text Similarity
dataset:
name: Text Similarity fr
type: stsb_multi_mt
args: fr
metrics:
- name: Test Pearson correlation coefficient
type: Pearson_correlation_coefficient
value: xx.xx
---
## Description:
[**Sentence-CamemBERT-Large**](https://huggingface.co/dangvantuan/sentence-camembert-large) is an embedding model for French developed by [La Javaness](https://www.lajavaness.com/). The purpose of this embedding model is to represent the content and semantics of a French sentence as a mathematical vector, which lets it capture the meaning of text beyond individual words in queries and documents, enabling powerful semantic search.
## Pre-trained sentence embedding model: state of the art for French
The model is fine-tuned from the pre-trained [facebook/camembert-large](https://huggingface.co/camembert/camembert-large) using
[Siamese BERT-Networks with 'sentence-transformers'](https://www.sbert.net/) on the [stsb](https://huggingface.co/datasets/stsb_multi_mt/viewer/fr/train) dataset
## Usage
The model can be used directly (without a language model) as follows:
```python
from sentence_transformers import SentenceTransformer
model = SentenceTransformer("dangvantuan/sentence-camembert-large")
sentences = ["Un avion est en train de décoller.",
"Un homme joue d'une grande flûte.",
"Un homme étale du fromage râpé sur une pizza.",
"Une personne jette un chat au plafond.",
"Une personne est en train de plier un morceau de papier.",
]
embeddings = model.encode(sentences)
```
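Once sentences are encoded, semantic similarity is typically measured as the cosine between their vectors. A minimal numpy-only sketch of that comparison (the short vectors below are made-up stand-ins for `model.encode(...)` outputs, not real model embeddings):

```python
import numpy as np

def cosine_similarity(a, b):
    # Cosine of the angle between two embedding vectors
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy embeddings standing in for encoded sentences
emb_plane = [0.9, 0.1, 0.0, 0.2]   # "Un avion est en train de décoller."
emb_flight = [0.8, 0.2, 0.1, 0.3]  # a semantically close sentence
emb_cheese = [0.0, 0.9, 0.8, 0.1]  # an unrelated sentence

print(cosine_similarity(emb_plane, emb_flight))  # high: related sentences
print(cosine_similarity(emb_plane, emb_cheese))  # low: unrelated sentences
```

With real outputs from `model.encode(sentences)`, the same function ranks documents by relevance to a query, which is the basis of the semantic search described above.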
## Evaluation
The model can be evaluated as follows on the French test data of stsb.
```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import EmbeddingSimilarityEvaluator
from sentence_transformers.readers import InputExample
from datasets import load_dataset
model = SentenceTransformer("dangvantuan/sentence-camembert-large")
def convert_dataset(dataset):
dataset_samples=[]
for df in dataset:
score = float(df['similarity_score'])/5.0 # Normalize score to range 0 ... 1
inp_example = InputExample(texts=[df['sentence1'],
df['sentence2']], label=score)
dataset_samples.append(inp_example)
return dataset_samples
# Loading the dataset for evaluation
df_dev = load_dataset("stsb_multi_mt", name="fr", split="dev")
df_test = load_dataset("stsb_multi_mt", name="fr", split="test")
# Convert the dataset for evaluation
# For Dev set:
dev_samples = convert_dataset(df_dev)
val_evaluator = EmbeddingSimilarityEvaluator.from_input_examples(dev_samples, name='sts-dev')
val_evaluator(model, output_path="./")
# For Test set:
test_samples = convert_dataset(df_test)
test_evaluator = EmbeddingSimilarityEvaluator.from_input_examples(test_samples, name='sts-test')
test_evaluator(model, output_path="./")
```
**Test Result**:
The performance is measured using Pearson and Spearman correlation:
- On dev
| Model | Pearson correlation | Spearman correlation | #params |
| ------------- | ------------- | ------------- |------------- |
| [dangvantuan/sentence-camembert-large](https://huggingface.co/dangvantuan/sentence-camembert-large)| 88.2 |88.02 | 336M|
| [dangvantuan/sentence-camembert-base](https://huggingface.co/dangvantuan/sentence-camembert-base) | 86.73|86.54 | 110M |
| [distiluse-base-multilingual-cased](https://huggingface.co/sentence-transformers/distiluse-base-multilingual-cased) | 79.22 | 79.16|135M |
| [GPT-3 (text-davinci-003)](https://platform.openai.com/docs/models) | 85 | NaN|175B |
| [GPT-(text-embedding-ada-002)](https://platform.openai.com/docs/models) | 79.75 | 80.44|NaN |
- On test
| Model | Pearson correlation | Spearman correlation |
| ------------- | ------------- | ------------- |
| [dangvantuan/sentence-camembert-large](https://huggingface.co/dangvantuan/sentence-camembert-large)| 85.9 | 85.8|
| [dangvantuan/sentence-camembert-base](https://huggingface.co/dangvantuan/sentence-camembert-base)| 82.36 | 81.64|
| [distiluse-base-multilingual-cased](https://huggingface.co/sentence-transformers/distiluse-base-multilingual-cased) | 78.62 | 77.48|
| [GPT-3 (text-davinci-003)](https://platform.openai.com/docs/models) | 82 | NaN|
| [GPT-(text-embedding-ada-002)](https://platform.openai.com/docs/models) | 79.05 | 77.56|
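For reference, both correlation metrics in the tables above can be computed without scipy; a self-contained numpy sketch on hypothetical (predicted, gold) score pairs (the tie handling here is simplified and does not average tied ranks):

```python
import numpy as np

def pearson(x, y):
    # Linear correlation between predicted and gold similarity scores
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    return float(np.corrcoef(x, y)[0, 1])

def spearman(x, y):
    # Rank correlation: Pearson computed on the ranks of the values
    def rank(v):
        return np.argsort(np.argsort(v)).astype(float)
    return pearson(rank(x), rank(y))

# Hypothetical model scores vs. gold similarity scores
pred = [0.9, 0.1, 0.4, 0.7, 0.2]
gold = [1.0, 0.0, 0.5, 0.8, 0.1]
print(pearson(pred, gold))   # close to 1: nearly linear relationship
print(spearman(pred, gold))  # exactly 1: identical ranking
```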
## Citation
@article{reimers2019sentence,
title={Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks},
author={Reimers, Nils and Gurevych, Iryna},
journal={https://arxiv.org/abs/1908.10084},
year={2019}
}
@article{martin2020camembert,
title={CamemBERT: a Tasty French Language Model},
author={Martin, Louis and Muller, Benjamin and Su{\'a}rez, Pedro Javier Ortiz and Dupont, Yoann and Romary, Laurent and de la Clergerie, {\'E}ric Villemonte and Seddah, Djam{\'e} and Sagot, Beno{\^\i}t},
journal={Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics},
year={2020}
} | 5,243 | [
[
-0.0121612548828125,
-0.08282470703125,
0.035064697265625,
0.036376953125,
-0.016510009765625,
-0.0128936767578125,
-0.0333251953125,
-0.005680084228515625,
0.0321044921875,
0.015472412109375,
-0.0249176025390625,
-0.044281005859375,
-0.051788330078125,
0.006305694580078125,
-0.01427459716796875,
0.06378173828125,
-0.0264129638671875,
0.01303863525390625,
-0.0182647705078125,
-0.01739501953125,
-0.0169525146484375,
-0.05224609375,
-0.03436279296875,
0.004913330078125,
0.0197296142578125,
0.00885009765625,
0.042327880859375,
0.02276611328125,
0.035797119140625,
0.0253143310546875,
-0.00873565673828125,
0.006351470947265625,
-0.0243682861328125,
0.000247955322265625,
0.00908660888671875,
-0.035308837890625,
-0.026641845703125,
0.0009765625,
0.04656982421875,
0.037689208984375,
0.0014896392822265625,
-0.0048980712890625,
0.0069580078125,
0.046966552734375,
-0.02899169921875,
0.039398193359375,
-0.0235595703125,
0.0016908645629882812,
-0.005939483642578125,
-0.00443267822265625,
-0.03680419921875,
-0.037689208984375,
0.005420684814453125,
-0.031890869140625,
0.0016832351684570312,
0.005054473876953125,
0.0911865234375,
0.0099945068359375,
-0.0283660888671875,
-0.019500732421875,
-0.02490234375,
0.06939697265625,
-0.0543212890625,
0.037994384765625,
0.0273284912109375,
0.00022590160369873047,
-0.00799560546875,
-0.05267333984375,
-0.0498046875,
-0.027740478515625,
-0.0269927978515625,
0.0311279296875,
-0.0274810791015625,
-0.0165863037109375,
0.0079498291015625,
0.025970458984375,
-0.058135986328125,
-0.00527191162109375,
-0.01666259765625,
-0.004917144775390625,
0.051910400390625,
-0.00832366943359375,
0.0157623291015625,
-0.042083740234375,
-0.0308990478515625,
-0.0391845703125,
-0.032562255859375,
0.015716552734375,
0.0182647705078125,
0.0286407470703125,
-0.03814697265625,
0.06365966796875,
-0.0016832351684570312,
0.047882080078125,
-0.0106201171875,
0.0024776458740234375,
0.058807373046875,
-0.041748046875,
-0.0174713134765625,
-0.008758544921875,
0.08917236328125,
0.01898193359375,
0.0278167724609375,
0.01041412353515625,
-0.0088958740234375,
0.0090179443359375,
-0.01204681396484375,
-0.058563232421875,
-0.0215606689453125,
0.00939178466796875,
-0.029144287109375,
-0.0148162841796875,
0.0121612548828125,
-0.054779052734375,
0.010009765625,
-0.004405975341796875,
0.040924072265625,
-0.05999755859375,
-0.0073699951171875,
0.018310546875,
-0.00965118408203125,
0.01476287841796875,
-0.005184173583984375,
-0.036285400390625,
0.00830078125,
0.036224365234375,
0.06988525390625,
0.005832672119140625,
-0.0205841064453125,
-0.038238525390625,
-0.01560211181640625,
-0.01337432861328125,
0.041900634765625,
-0.03955078125,
-0.0149078369140625,
0.004032135009765625,
0.0163726806640625,
-0.032379150390625,
-0.008636474609375,
0.0670166015625,
-0.0155181884765625,
0.03948974609375,
-0.01346588134765625,
-0.056488037109375,
-0.0211639404296875,
0.0152740478515625,
-0.046478271484375,
0.07867431640625,
0.0156097412109375,
-0.059722900390625,
0.02093505859375,
-0.049560546875,
-0.0321044921875,
-0.01549530029296875,
-0.019378662109375,
-0.0540771484375,
-0.00016820430755615234,
0.04827880859375,
0.060150146484375,
-0.0178070068359375,
0.0151519775390625,
-0.018402099609375,
-0.0167388916015625,
0.03277587890625,
-0.037506103515625,
0.0802001953125,
0.0050506591796875,
-0.014434814453125,
0.00006437301635742188,
-0.0535888671875,
-0.00037407875061035156,
0.0268402099609375,
-0.024627685546875,
-0.0261688232421875,
-0.00848388671875,
0.021240234375,
0.0005693435668945312,
0.0102691650390625,
-0.041656494140625,
0.01096343994140625,
-0.035308837890625,
0.0511474609375,
0.05133056640625,
0.00922393798828125,
0.00921630859375,
-0.0298614501953125,
0.032958984375,
0.0173492431640625,
-0.0025310516357421875,
-0.0166015625,
-0.04229736328125,
-0.059722900390625,
-0.03863525390625,
0.0450439453125,
0.059478759765625,
-0.045166015625,
0.0728759765625,
-0.046295166015625,
-0.033905029296875,
-0.0455322265625,
0.002475738525390625,
0.0228729248046875,
0.006740570068359375,
0.0401611328125,
-0.01007843017578125,
-0.032470703125,
-0.08233642578125,
-0.00714874267578125,
-0.0012035369873046875,
0.01335906982421875,
0.016510009765625,
0.060333251953125,
-0.0151824951171875,
0.061309814453125,
-0.057281494140625,
-0.0265350341796875,
-0.0218353271484375,
-0.006557464599609375,
0.0300445556640625,
0.047943115234375,
0.06060791015625,
-0.06170654296875,
-0.053741455078125,
-0.0138092041015625,
-0.054412841796875,
0.0231475830078125,
-0.00472259521484375,
-0.005847930908203125,
0.0240325927734375,
0.0394287109375,
-0.05126953125,
0.03302001953125,
0.03955078125,
-0.0302734375,
0.031494140625,
-0.0205078125,
0.0149078369140625,
-0.114013671875,
-0.00910186767578125,
0.0240478515625,
-0.00960540771484375,
-0.026031494140625,
-0.0027713775634765625,
0.00344085693359375,
0.0006709098815917969,
-0.0244903564453125,
0.029083251953125,
-0.0352783203125,
0.0257568359375,
0.038360595703125,
0.033172607421875,
0.00457763671875,
0.059051513671875,
-0.0017137527465820312,
0.054107666015625,
0.03814697265625,
-0.02557373046875,
0.02838134765625,
0.042938232421875,
-0.045166015625,
0.044647216796875,
-0.057586669921875,
-0.010711669921875,
0.0034961700439453125,
0.031829833984375,
-0.08624267578125,
-0.003856658935546875,
0.0195159912109375,
-0.03948974609375,
0.0195159912109375,
0.004802703857421875,
-0.056640625,
-0.034698486328125,
-0.03900146484375,
0.0169830322265625,
0.0379638671875,
-0.03131103515625,
0.03460693359375,
0.0039520263671875,
-0.00882720947265625,
-0.04205322265625,
-0.08056640625,
-0.0094146728515625,
-0.019317626953125,
-0.061431884765625,
0.0169830322265625,
-0.01052093505859375,
-0.0031757354736328125,
0.013916015625,
0.01407623291015625,
0.0018568038940429688,
-0.010101318359375,
0.01100921630859375,
0.01397705078125,
-0.0246429443359375,
0.0190887451171875,
0.0065765380859375,
0.0031528472900390625,
-0.01079559326171875,
-0.00042510032653808594,
0.053070068359375,
-0.029083251953125,
-0.0185394287109375,
-0.0380859375,
0.0232391357421875,
0.0250396728515625,
-0.0231475830078125,
0.0750732421875,
0.06585693359375,
-0.0130615234375,
0.01218414306640625,
-0.035675048828125,
-0.00498199462890625,
-0.0340576171875,
0.0450439453125,
-0.051483154296875,
-0.07061767578125,
0.0297088623046875,
0.0194091796875,
-0.005832672119140625,
0.0595703125,
0.04095458984375,
0.00568389892578125,
0.06341552734375,
0.0284423828125,
-0.00966644287109375,
0.0288238525390625,
-0.03997802734375,
0.03240966796875,
-0.059783935546875,
-0.01062774658203125,
-0.03253173828125,
-0.01215362548828125,
-0.067138671875,
-0.039276123046875,
0.022735595703125,
0.0084228515625,
-0.019744873046875,
0.040283203125,
-0.03314208984375,
0.004276275634765625,
0.036834716796875,
0.0219268798828125,
0.0015821456909179688,
0.01540374755859375,
-0.0335693359375,
-0.0160675048828125,
-0.053863525390625,
-0.032867431640625,
0.07952880859375,
0.038116455078125,
0.04425048828125,
0.01428985595703125,
0.038330078125,
0.0031642913818359375,
-0.01004791259765625,
-0.0540771484375,
0.0433349609375,
-0.0258026123046875,
-0.0239105224609375,
-0.01007080078125,
-0.044158935546875,
-0.0726318359375,
0.0193328857421875,
-0.0175323486328125,
-0.05694580078125,
0.01074981689453125,
-0.0149993896484375,
-0.0257568359375,
0.020416259765625,
-0.07073974609375,
0.06658935546875,
-0.016571044921875,
-0.020111083984375,
-0.00579071044921875,
-0.03363037109375,
0.0162353515625,
0.00201416015625,
0.0125732421875,
0.01019287109375,
0.0092620849609375,
0.0679931640625,
-0.027587890625,
0.058807373046875,
0.0038909912109375,
-0.00431060791015625,
0.0214996337890625,
-0.0009670257568359375,
0.031341552734375,
0.01299285888671875,
-0.006778717041015625,
0.01322174072265625,
0.00807952880859375,
-0.03515625,
-0.04315185546875,
0.06268310546875,
-0.06414794921875,
-0.03753662109375,
-0.049102783203125,
-0.040283203125,
-0.005947113037109375,
-0.0017690658569335938,
0.0189971923828125,
0.0399169921875,
-0.0263824462890625,
0.04864501953125,
0.03448486328125,
-0.037200927734375,
0.033599853515625,
0.0081024169921875,
-0.00624847412109375,
-0.031494140625,
0.059173583984375,
0.0035076141357421875,
0.00942230224609375,
0.05548095703125,
0.0156707763671875,
-0.0237579345703125,
-0.01486968994140625,
-0.0169677734375,
0.03302001953125,
-0.04327392578125,
-0.00589752197265625,
-0.07159423828125,
-0.031707763671875,
-0.0552978515625,
-0.01316070556640625,
-0.019989013671875,
-0.054656982421875,
-0.025421142578125,
-0.02008056640625,
0.047393798828125,
0.03448486328125,
-0.00839996337890625,
0.0241546630859375,
-0.0396728515625,
0.00263214111328125,
0.002445220947265625,
0.0175323486328125,
-0.0113525390625,
-0.0469970703125,
-0.0192108154296875,
0.0124359130859375,
-0.027069091796875,
-0.05621337890625,
0.037506103515625,
0.013763427734375,
0.04180908203125,
0.029205322265625,
-0.0086822509765625,
0.039031982421875,
-0.0328369140625,
0.08160400390625,
0.017608642578125,
-0.06573486328125,
0.0411376953125,
-0.00481414794921875,
0.0187835693359375,
0.035003662109375,
0.042083740234375,
-0.0338134765625,
-0.0188140869140625,
-0.048980712890625,
-0.07403564453125,
0.038665771484375,
0.03643798828125,
0.016571044921875,
-0.0159912109375,
0.01369476318359375,
-0.007366180419921875,
0.00514984130859375,
-0.07196044921875,
-0.029296875,
-0.0228118896484375,
-0.029205322265625,
-0.037933349609375,
-0.0257415771484375,
-0.00846099853515625,
-0.03448486328125,
0.056732177734375,
0.0089874267578125,
0.0465087890625,
0.02313232421875,
-0.01751708984375,
0.016693115234375,
0.025665283203125,
0.044647216796875,
0.0308990478515625,
-0.0265350341796875,
-0.0005369186401367188,
0.01409912109375,
-0.031707763671875,
0.00045299530029296875,
0.0243377685546875,
0.0006761550903320312,
0.005126953125,
0.0521240234375,
0.07403564453125,
0.00771331787109375,
-0.03912353515625,
0.0579833984375,
-0.01280975341796875,
-0.031402587890625,
-0.041534423828125,
-0.00970458984375,
0.00931549072265625,
0.0211944580078125,
0.01053619384765625,
-0.0033321380615234375,
-0.015228271484375,
-0.04095458984375,
0.026214599609375,
0.0225677490234375,
-0.03668212890625,
-0.01715087890625,
0.040374755859375,
-0.006473541259765625,
-0.006366729736328125,
0.04949951171875,
-0.020965576171875,
-0.05718994140625,
0.033599853515625,
0.034637451171875,
0.05792236328125,
-0.0057373046875,
0.026092529296875,
0.050445556640625,
0.03094482421875,
-0.0035247802734375,
0.0174560546875,
0.017242431640625,
-0.0570068359375,
-0.017822265625,
-0.046112060546875,
0.0188140869140625,
0.0123138427734375,
-0.04193115234375,
0.01690673828125,
-0.0175628662109375,
-0.00787353515625,
-0.00403594970703125,
0.00904083251953125,
-0.057830810546875,
0.004436492919921875,
-0.02069091796875,
0.05908203125,
-0.0772705078125,
0.045989990234375,
0.05096435546875,
-0.061004638671875,
-0.05682373046875,
0.0029354095458984375,
-0.021270751953125,
-0.056793212890625,
0.051788330078125,
0.0098724365234375,
0.0288238525390625,
-0.0138092041015625,
-0.03173828125,
-0.059478759765625,
0.07403564453125,
0.00799560546875,
-0.0440673828125,
0.006744384765625,
0.004741668701171875,
0.046234130859375,
-0.030364990234375,
0.036834716796875,
0.040069580078125,
0.044891357421875,
-0.003040313720703125,
-0.0599365234375,
0.0219573974609375,
-0.02630615234375,
0.005977630615234375,
0.00010335445404052734,
-0.07049560546875,
0.07659912109375,
-0.001956939697265625,
0.0034027099609375,
0.003963470458984375,
0.05682373046875,
0.0211639404296875,
-0.004268646240234375,
0.031219482421875,
0.066650390625,
0.044952392578125,
-0.0213165283203125,
0.09466552734375,
-0.01129150390625,
0.0379638671875,
0.06915283203125,
0.0038013458251953125,
0.075927734375,
0.025634765625,
-0.026336669921875,
0.0579833984375,
0.047637939453125,
-0.016326904296875,
0.043792724609375,
0.019256591796875,
0.006259918212890625,
-0.006710052490234375,
0.0164947509765625,
-0.033905029296875,
0.039459228515625,
0.02313232421875,
-0.03656005859375,
0.0019311904907226562,
0.0095062255859375,
0.031951904296875,
0.0153350830078125,
0.0168609619140625,
0.0289764404296875,
0.0184173583984375,
-0.037139892578125,
0.057403564453125,
0.00740814208984375,
0.056640625,
-0.040283203125,
0.0113983154296875,
-0.00927734375,
0.0247802734375,
-0.0054779052734375,
-0.0609130859375,
0.00949859619140625,
-0.01369476318359375,
-0.0001386404037475586,
-0.005321502685546875,
0.01398468017578125,
-0.039886474609375,
-0.061798095703125,
0.03607177734375,
0.046966552734375,
0.00823211669921875,
0.005146026611328125,
-0.08056640625,
0.008026123046875,
0.0157012939453125,
-0.041259765625,
0.00365447998046875,
0.038238525390625,
0.00476837158203125,
0.03460693359375,
0.0374755859375,
-0.0012636184692382812,
-0.00431060791015625,
0.0123138427734375,
0.05694580078125,
-0.044952392578125,
-0.04168701171875,
-0.07196044921875,
0.03125,
-0.014617919921875,
-0.0270843505859375,
0.0667724609375,
0.0662841796875,
0.055755615234375,
-0.0044097900390625,
0.060577392578125,
-0.01776123046875,
0.03668212890625,
-0.050140380859375,
0.049163818359375,
-0.05706787109375,
0.00565338134765625,
-0.03228759765625,
-0.0826416015625,
-0.029205322265625,
0.0667724609375,
-0.037628173828125,
0.021697998046875,
0.0802001953125,
0.06549072265625,
-0.0184478759765625,
-0.01197052001953125,
0.007534027099609375,
0.045989990234375,
0.034088134765625,
0.059234619140625,
0.0309906005859375,
-0.062469482421875,
0.036590576171875,
-0.0245513916015625,
-0.01727294921875,
-0.020477294921875,
-0.054656982421875,
-0.0882568359375,
-0.0775146484375,
-0.034576416015625,
-0.04150390625,
0.0009260177612304688,
0.07647705078125,
0.034027099609375,
-0.0650634765625,
-0.014617919921875,
-0.00021123886108398438,
-0.00534820556640625,
-0.0255126953125,
-0.023590087890625,
0.0687255859375,
-0.0091705322265625,
-0.05462646484375,
0.02508544921875,
-0.00490570068359375,
0.0029468536376953125,
-0.0088958740234375,
-0.0014410018920898438,
-0.059295654296875,
0.0114593505859375,
0.0556640625,
-0.0116729736328125,
-0.04541015625,
-0.023834228515625,
-0.0011386871337890625,
-0.0165252685546875,
0.010833740234375,
0.0262451171875,
-0.032806396484375,
0.019317626953125,
0.04510498046875,
0.03955078125,
0.05499267578125,
-0.00194549560546875,
0.0343017578125,
-0.06524658203125,
0.0269317626953125,
0.006622314453125,
0.04119873046875,
0.029632568359375,
-0.0236968994140625,
0.053314208984375,
0.0181121826171875,
-0.035919189453125,
-0.0552978515625,
-0.00449371337890625,
-0.09613037109375,
-0.02703857421875,
0.09393310546875,
-0.020477294921875,
-0.025787353515625,
0.0240478515625,
-0.01297760009765625,
0.0241241455078125,
-0.044464111328125,
0.033538818359375,
0.0645751953125,
0.00688934326171875,
-0.01519012451171875,
-0.046295166015625,
0.0242767333984375,
0.041168212890625,
-0.0455322265625,
-0.0206451416015625,
0.0191650390625,
0.0240020751953125,
0.0244140625,
0.0304107666015625,
-0.01187896728515625,
-0.006282806396484375,
-0.001224517822265625,
-0.0029144287109375,
0.00475311279296875,
0.005916595458984375,
-0.0160369873046875,
0.01091766357421875,
-0.032196044921875,
-0.0200958251953125
]
] |
timm/nest_base_jx.goog_in1k | 2023-04-23T23:11:41.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"arxiv:2105.12723",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/nest_base_jx.goog_in1k | 0 | 36,028 | timm | 2023-04-23T23:10:38 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
datasets:
- imagenet-1k
---
# Model card for nest_base_jx.goog_in1k
A NesT image classification model. Trained on ImageNet-1k by paper authors in JAX. Ported to PyTorch by Alexander Soare.
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 67.7
- GMACs: 18.0
- Activations (M): 53.4
- Image size: 224 x 224
- **Papers:**
- Nested Hierarchical Transformer: Towards Accurate, Data-Efficient and Interpretable Visual Understanding: https://arxiv.org/abs/2105.12723
- **Dataset:** ImageNet-1k
- **Original:** https://github.com/google-research/nested-transformer
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('nest_base_jx.goog_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'nest_base_jx.goog_in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g.:
# torch.Size([1, 128, 56, 56])
# torch.Size([1, 256, 28, 28])
# torch.Size([1, 512, 14, 14])
print(o.shape)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'nest_base_jx.goog_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 512, 14, 14) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@inproceedings{zhang2021aggregating,
title={Nested Hierarchical Transformer: Towards Accurate, Data-Efficient and Interpretable Visual Understanding},
  author={Zizhao Zhang and Han Zhang and Long Zhao and Ting Chen and Sercan Ö. Arık and Tomas Pfister},
booktitle={AAAI Conference on Artificial Intelligence (AAAI)},
year={2022}
}
```
| 3,779 | [
[
-0.027435302734375,
-0.035369873046875,
-0.002857208251953125,
0.006145477294921875,
-0.020782470703125,
-0.0225372314453125,
-0.0195770263671875,
-0.0250396728515625,
0.000019371509552001953,
0.026824951171875,
-0.042327880859375,
-0.05987548828125,
-0.05633544921875,
-0.0123748779296875,
-0.00479888916015625,
0.07879638671875,
0.0006508827209472656,
-0.011260986328125,
-0.01522064208984375,
-0.025390625,
-0.0105743408203125,
-0.0175323486328125,
-0.058563232421875,
-0.033111572265625,
0.0158843994140625,
0.01544189453125,
0.046539306640625,
0.04730224609375,
0.0462646484375,
0.03546142578125,
-0.00707244873046875,
-0.003082275390625,
-0.028961181640625,
-0.01824951171875,
0.0303192138671875,
-0.06402587890625,
-0.03741455078125,
0.0106353759765625,
0.05609130859375,
0.0389404296875,
0.01190185546875,
0.03228759765625,
0.00798797607421875,
0.03167724609375,
-0.0222320556640625,
0.0335693359375,
-0.03277587890625,
0.0287933349609375,
-0.005413055419921875,
0.006137847900390625,
-0.02374267578125,
-0.02923583984375,
0.0233306884765625,
-0.041473388671875,
0.034393310546875,
0.00244903564453125,
0.09478759765625,
0.0230560302734375,
-0.00527191162109375,
-0.00452423095703125,
-0.01788330078125,
0.0726318359375,
-0.06689453125,
0.01751708984375,
0.0191497802734375,
0.0117645263671875,
-0.009124755859375,
-0.07177734375,
-0.036712646484375,
-0.010650634765625,
-0.0302276611328125,
0.01312255859375,
-0.023468017578125,
-0.0007576942443847656,
0.021392822265625,
0.036895751953125,
-0.040679931640625,
0.005596160888671875,
-0.032989501953125,
-0.0064239501953125,
0.033721923828125,
0.00634765625,
0.031463623046875,
-0.0221710205078125,
-0.0343017578125,
-0.038543701171875,
-0.0228424072265625,
0.0286712646484375,
0.016937255859375,
0.0287628173828125,
-0.04876708984375,
0.0233001708984375,
0.01959228515625,
0.034912109375,
0.0164642333984375,
-0.038787841796875,
0.04425048828125,
0.005168914794921875,
-0.037109375,
-0.01110076904296875,
0.0775146484375,
0.020782470703125,
0.016082763671875,
-0.004085540771484375,
0.0018739700317382812,
-0.0178070068359375,
-0.0043182373046875,
-0.0821533203125,
-0.0309295654296875,
0.0245361328125,
-0.045074462890625,
-0.032318115234375,
0.02825927734375,
-0.0556640625,
-0.005306243896484375,
0.0019512176513671875,
0.04156494140625,
-0.0304412841796875,
-0.035980224609375,
0.0036640167236328125,
-0.01427459716796875,
0.0238494873046875,
0.00208282470703125,
-0.049102783203125,
0.0224456787109375,
0.020355224609375,
0.071044921875,
0.0015840530395507812,
-0.036773681640625,
-0.00623321533203125,
-0.0198974609375,
-0.023223876953125,
0.02783203125,
-0.0028247833251953125,
-0.01198577880859375,
-0.0153961181640625,
0.027679443359375,
-0.02569580078125,
-0.05694580078125,
0.017730712890625,
-0.0142669677734375,
0.032623291015625,
0.0106964111328125,
-0.0119171142578125,
-0.04046630859375,
0.0259246826171875,
-0.03338623046875,
0.08892822265625,
0.035888671875,
-0.0643310546875,
0.0295867919921875,
-0.04217529296875,
-0.01425933837890625,
-0.0171356201171875,
-0.0022296905517578125,
-0.08978271484375,
-0.016265869140625,
0.030120849609375,
0.051910400390625,
-0.01401519775390625,
-0.0047760009765625,
-0.028839111328125,
-0.017333984375,
0.00797271728515625,
0.005344390869140625,
0.08056640625,
0.0009450912475585938,
-0.045928955078125,
0.0274505615234375,
-0.048675537109375,
0.0142822265625,
0.04345703125,
-0.0160064697265625,
-0.00366973876953125,
-0.044097900390625,
0.00809478759765625,
0.019195556640625,
0.0090789794921875,
-0.032012939453125,
0.0280609130859375,
-0.02044677734375,
0.029541015625,
0.0411376953125,
-0.01213836669921875,
0.033599853515625,
-0.0246429443359375,
0.026336669921875,
0.0252685546875,
0.0201416015625,
-0.00902557373046875,
-0.042572021484375,
-0.060760498046875,
-0.043487548828125,
0.0216522216796875,
0.0270843505859375,
-0.020599365234375,
0.0498046875,
-0.0213470458984375,
-0.052947998046875,
-0.039398193359375,
0.0016717910766601562,
0.031494140625,
0.037200927734375,
0.0298614501953125,
-0.02398681640625,
-0.0489501953125,
-0.058868408203125,
0.019775390625,
0.001445770263671875,
0.001514434814453125,
0.0245208740234375,
0.057037353515625,
-0.024566650390625,
0.051910400390625,
-0.0311126708984375,
-0.0271453857421875,
-0.01751708984375,
0.02154541015625,
0.037139892578125,
0.058319091796875,
0.06390380859375,
-0.045501708984375,
-0.035247802734375,
-0.00460052490234375,
-0.07708740234375,
0.00946807861328125,
-0.0121917724609375,
-0.0187530517578125,
0.0216064453125,
0.00812530517578125,
-0.053955078125,
0.04736328125,
0.021270751953125,
-0.0160980224609375,
0.037353515625,
-0.006145477294921875,
0.0216522216796875,
-0.0821533203125,
0.00670623779296875,
0.0161590576171875,
-0.0067138671875,
-0.0250396728515625,
0.0186614990234375,
-0.001068115234375,
0.0028057098388671875,
-0.029083251953125,
0.038330078125,
-0.04510498046875,
-0.01502227783203125,
-0.009918212890625,
-0.0215606689453125,
-0.0051422119140625,
0.05609130859375,
-0.00435638427734375,
0.029510498046875,
0.0692138671875,
-0.030670166015625,
0.0300445556640625,
0.040802001953125,
-0.0103302001953125,
0.02105712890625,
-0.03851318359375,
0.01483917236328125,
0.005443572998046875,
0.024688720703125,
-0.0831298828125,
-0.0144805908203125,
0.02801513671875,
-0.040496826171875,
0.0447998046875,
-0.04913330078125,
-0.0350341796875,
-0.045806884765625,
-0.035308837890625,
0.0293121337890625,
0.060760498046875,
-0.065185546875,
0.0380859375,
0.0117340087890625,
0.01554107666015625,
-0.043853759765625,
-0.06683349609375,
-0.0196075439453125,
-0.03045654296875,
-0.0604248046875,
0.03021240234375,
0.0101776123046875,
0.0118255615234375,
0.019378662109375,
-0.01207733154296875,
-0.00522613525390625,
-0.01824951171875,
0.045654296875,
0.0285186767578125,
-0.038482666015625,
-0.0207061767578125,
-0.0229034423828125,
-0.008544921875,
0.01111602783203125,
-0.0217742919921875,
0.04473876953125,
-0.0306396484375,
-0.00766754150390625,
-0.056793212890625,
-0.006931304931640625,
0.046112060546875,
0.0013399124145507812,
0.06243896484375,
0.0811767578125,
-0.038909912109375,
-0.0063629150390625,
-0.0189666748046875,
-0.0174560546875,
-0.0340576171875,
0.0287322998046875,
-0.032501220703125,
-0.03369140625,
0.06732177734375,
-0.003993988037109375,
0.00684356689453125,
0.05499267578125,
0.0297393798828125,
-0.003498077392578125,
0.06976318359375,
0.04840087890625,
0.00525665283203125,
0.0433349609375,
-0.08349609375,
-0.0015048980712890625,
-0.059722900390625,
-0.031768798828125,
-0.024749755859375,
-0.033172607421875,
-0.050933837890625,
-0.02020263671875,
0.036590576171875,
0.0165557861328125,
-0.02783203125,
0.03448486328125,
-0.057342529296875,
0.002986907958984375,
0.05548095703125,
0.0335693359375,
-0.020233154296875,
0.01479339599609375,
-0.0164337158203125,
-0.0027713775634765625,
-0.0626220703125,
-0.016998291015625,
0.07342529296875,
0.035919189453125,
0.0579833984375,
-0.01483917236328125,
0.049652099609375,
-0.0244293212890625,
0.0157623291015625,
-0.04974365234375,
0.040740966796875,
-0.00547027587890625,
-0.022705078125,
-0.0098419189453125,
-0.025848388671875,
-0.08734130859375,
0.011260986328125,
-0.02569580078125,
-0.059112548828125,
0.0020999908447265625,
0.01035308837890625,
-0.0157928466796875,
0.06158447265625,
-0.062164306640625,
0.07440185546875,
-0.004974365234375,
-0.02581787109375,
0.01116180419921875,
-0.051239013671875,
0.02581787109375,
0.0194854736328125,
-0.0186004638671875,
0.0125732421875,
0.0272216796875,
0.07623291015625,
-0.03900146484375,
0.067626953125,
-0.02569580078125,
0.0255584716796875,
0.035308837890625,
-0.0098876953125,
0.020233154296875,
-0.013763427734375,
0.002643585205078125,
0.0286407470703125,
0.004638671875,
-0.035369873046875,
-0.048248291015625,
0.04400634765625,
-0.08197021484375,
-0.0256805419921875,
-0.03363037109375,
-0.045135498046875,
0.01279449462890625,
0.00934600830078125,
0.041351318359375,
0.03912353515625,
0.0251922607421875,
0.0279388427734375,
0.037139892578125,
-0.02667236328125,
0.0259857177734375,
-0.016143798828125,
-0.0187835693359375,
-0.0285491943359375,
0.053955078125,
0.0099639892578125,
0.00479888916015625,
0.0126495361328125,
0.0278167724609375,
-0.03369140625,
-0.0418701171875,
-0.0265350341796875,
0.02294921875,
-0.05145263671875,
-0.032501220703125,
-0.043060302734375,
-0.044403076171875,
-0.043304443359375,
-0.0124359130859375,
-0.0286407470703125,
-0.0179595947265625,
-0.02264404296875,
0.01187896728515625,
0.039703369140625,
0.047149658203125,
-0.007659912109375,
0.045562744140625,
-0.04248046875,
0.01397705078125,
0.0175933837890625,
0.02984619140625,
-0.0017910003662109375,
-0.0621337890625,
-0.017791748046875,
-0.008819580078125,
-0.0272979736328125,
-0.04833984375,
0.035369873046875,
0.0021114349365234375,
0.042724609375,
0.0287322998046875,
-0.0185394287109375,
0.068115234375,
-0.0015954971313476562,
0.045684814453125,
0.019256591796875,
-0.04376220703125,
0.05206298828125,
-0.0102691650390625,
0.0218353271484375,
-0.00514984130859375,
0.0283660888671875,
-0.0096435546875,
-0.01288604736328125,
-0.07421875,
-0.0657958984375,
0.0672607421875,
0.0158843994140625,
-0.0117645263671875,
0.031280517578125,
0.03955078125,
0.01102447509765625,
0.0010957717895507812,
-0.061676025390625,
-0.0360107421875,
-0.0302581787109375,
-0.0230712890625,
0.0109710693359375,
-0.0005965232849121094,
-0.007320404052734375,
-0.052947998046875,
0.0516357421875,
-0.00013387203216552734,
0.05523681640625,
0.016693115234375,
-0.01306915283203125,
-0.0225677490234375,
-0.0229034423828125,
0.026031494140625,
0.025177001953125,
-0.040771484375,
0.007106781005859375,
0.01264190673828125,
-0.043731689453125,
-0.003448486328125,
0.009979248046875,
0.0031375885009765625,
-0.00659942626953125,
0.0469970703125,
0.0660400390625,
0.0098724365234375,
0.0048980712890625,
0.0277557373046875,
0.000051856040954589844,
-0.027252197265625,
-0.0214385986328125,
0.01544189453125,
-0.00708770751953125,
0.028656005859375,
0.026458740234375,
0.023651123046875,
-0.01313018798828125,
-0.0264739990234375,
0.021148681640625,
0.038055419921875,
-0.0101318359375,
-0.0249786376953125,
0.050323486328125,
-0.019927978515625,
-0.006237030029296875,
0.06976318359375,
-0.002124786376953125,
-0.042724609375,
0.07916259765625,
0.05194091796875,
0.0667724609375,
-0.0012063980102539062,
-0.00434112548828125,
0.068115234375,
0.0191802978515625,
0.01102447509765625,
-0.004283905029296875,
0.01236724853515625,
-0.05206298828125,
0.0020046234130859375,
-0.0413818359375,
-0.0082244873046875,
0.042938232421875,
-0.04718017578125,
0.0167236328125,
-0.051239013671875,
-0.0203857421875,
0.01062774658203125,
0.0256500244140625,
-0.06787109375,
0.02276611328125,
0.00982666015625,
0.05224609375,
-0.06353759765625,
0.046600341796875,
0.0660400390625,
-0.0430908203125,
-0.0748291015625,
-0.0003795623779296875,
0.002613067626953125,
-0.07293701171875,
0.046600341796875,
0.03179931640625,
0.002536773681640625,
0.0130157470703125,
-0.061126708984375,
-0.0450439453125,
0.10235595703125,
0.026763916015625,
0.0010633468627929688,
0.0138092041015625,
0.005001068115234375,
0.0233917236328125,
-0.034088134765625,
0.045806884765625,
0.023773193359375,
0.03375244140625,
0.0192718505859375,
-0.051300048828125,
0.0146331787109375,
-0.029144287109375,
0.0054473876953125,
0.0098876953125,
-0.04730224609375,
0.0731201171875,
-0.03814697265625,
-0.014190673828125,
0.004810333251953125,
0.046600341796875,
0.0186309814453125,
0.005237579345703125,
0.04730224609375,
0.06353759765625,
0.035369873046875,
-0.029083251953125,
0.0665283203125,
0.0027408599853515625,
0.05706787109375,
0.044464111328125,
0.03155517578125,
0.035675048828125,
0.02020263671875,
-0.024261474609375,
0.042999267578125,
0.072021484375,
-0.0308380126953125,
0.028045654296875,
0.01128387451171875,
0.00426483154296875,
-0.0031070709228515625,
0.015838623046875,
-0.035675048828125,
0.039215087890625,
0.0157623291015625,
-0.0293121337890625,
-0.0262603759765625,
0.0025501251220703125,
0.005706787109375,
-0.023223876953125,
-0.013916015625,
0.03558349609375,
-0.0089263916015625,
-0.028564453125,
0.06280517578125,
-0.00249481201171875,
0.0665283203125,
-0.034210205078125,
-0.0066375732421875,
-0.0136871337890625,
0.0196075439453125,
-0.035186767578125,
-0.061431884765625,
0.019439697265625,
-0.0236663818359375,
-0.00688934326171875,
0.0002206563949584961,
0.049285888671875,
-0.0296783447265625,
-0.03125,
0.0199737548828125,
0.026641845703125,
0.031036376953125,
0.0016374588012695312,
-0.09271240234375,
0.007747650146484375,
-0.00484466552734375,
-0.045989990234375,
0.0203857421875,
0.023834228515625,
0.0037403106689453125,
0.046356201171875,
0.051788330078125,
-0.01071929931640625,
0.024444580078125,
-0.00666046142578125,
0.055877685546875,
-0.03826904296875,
-0.027130126953125,
-0.05389404296875,
0.05755615234375,
-0.00337982177734375,
-0.04058837890625,
0.04107666015625,
0.041595458984375,
0.0699462890625,
-0.017333984375,
0.04278564453125,
-0.0225982666015625,
-0.0018033981323242188,
-0.0302886962890625,
0.051971435546875,
-0.043975830078125,
-0.01421356201171875,
-0.015777587890625,
-0.050567626953125,
-0.03521728515625,
0.06646728515625,
-0.02386474609375,
0.030609130859375,
0.05096435546875,
0.07342529296875,
-0.03125,
-0.021392822265625,
0.0141754150390625,
0.0252532958984375,
0.01093292236328125,
0.038116455078125,
0.0380859375,
-0.070068359375,
0.031036376953125,
-0.056854248046875,
-0.02325439453125,
-0.0137786865234375,
-0.047576904296875,
-0.082275390625,
-0.0633544921875,
-0.04913330078125,
-0.041015625,
-0.01934814453125,
0.0654296875,
0.09161376953125,
-0.060821533203125,
-0.01062774658203125,
-0.0018510818481445312,
0.012908935546875,
-0.0288848876953125,
-0.0216064453125,
0.055908203125,
-0.0200042724609375,
-0.0657958984375,
-0.0214385986328125,
0.0018110275268554688,
0.0252685546875,
0.0023555755615234375,
-0.0102996826171875,
-0.01094818115234375,
-0.030029296875,
0.0248565673828125,
0.021575927734375,
-0.0565185546875,
-0.0151214599609375,
-0.0180816650390625,
-0.01666259765625,
0.041015625,
0.0254974365234375,
-0.046173095703125,
0.01824951171875,
0.0275421142578125,
0.0292816162109375,
0.06207275390625,
-0.020599365234375,
-0.004642486572265625,
-0.061279296875,
0.03985595703125,
-0.006717681884765625,
0.034912109375,
0.023651123046875,
-0.0341796875,
0.0443115234375,
0.03289794921875,
-0.031585693359375,
-0.058441162109375,
-0.01256561279296875,
-0.08978271484375,
-0.0067138671875,
0.0596923828125,
-0.0313720703125,
-0.045196533203125,
0.0306396484375,
-0.008544921875,
0.0531005859375,
-0.0001308917999267578,
0.047576904296875,
0.01690673828125,
-0.01502227783203125,
-0.053131103515625,
-0.0305328369140625,
0.0261383056640625,
0.01200103759765625,
-0.0565185546875,
-0.037567138671875,
-0.00714874267578125,
0.04779052734375,
0.02783203125,
0.0300445556640625,
-0.014862060546875,
0.0083465576171875,
0.0192108154296875,
0.03558349609375,
-0.024658203125,
-0.011993408203125,
-0.0172119140625,
0.0070648193359375,
-0.01568603515625,
-0.050872802734375
]
] |
huggyllama/llama-30b | 2023-04-07T15:50:57.000Z | [
"transformers",
"pytorch",
"safetensors",
"llama",
"text-generation",
"license:other",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | huggyllama | null | null | huggyllama/llama-30b | 37 | 35,991 | transformers | 2023-04-04T00:50:30 | ---
license: other
---
This contains the weights for the LLaMA-30b model. This model is under a non-commercial license (see the LICENSE file).
You should only use this repository if you have been granted access to the model by filling out [this form](https://docs.google.com/forms/d/e/1FAIpQLSfqNECQnMkycAp2jP4Z9TFX0cGR4uf7b_fBxjY_OjhJILlKGA/viewform?usp=send_form) but either lost your copy of the weights or ran into trouble converting them to the Transformers format.
| 473 | [
[
-0.0101470947265625,
-0.00855255126953125,
0.0286865234375,
0.05511474609375,
-0.032318115234375,
0.00003230571746826172,
0.0236053466796875,
-0.0190277099609375,
0.0240325927734375,
0.055694580078125,
-0.052947998046875,
-0.02459716796875,
-0.0556640625,
0.007274627685546875,
-0.0426025390625,
0.07769775390625,
-0.017120361328125,
0.020050048828125,
-0.03363037109375,
-0.00876617431640625,
-0.006671905517578125,
-0.014617919921875,
-0.016448974609375,
-0.0550537109375,
0.046875,
0.0280914306640625,
0.0323486328125,
0.02911376953125,
0.0501708984375,
0.01690673828125,
-0.023101806640625,
-0.04608154296875,
-0.029327392578125,
-0.0262451171875,
0.0020542144775390625,
-0.05816650390625,
-0.0726318359375,
0.0139617919921875,
0.044281005859375,
0.035125732421875,
-0.060943603515625,
0.0419921875,
-0.0172119140625,
0.037139892578125,
-0.03350830078125,
0.0306243896484375,
-0.0305633544921875,
-0.0013370513916015625,
-0.0328369140625,
0.0032062530517578125,
-0.02325439453125,
-0.0140380859375,
-0.02490234375,
-0.0589599609375,
0.005340576171875,
0.0120086669921875,
0.09600830078125,
0.0257110595703125,
-0.039306640625,
-0.016571044921875,
-0.02154541015625,
0.0543212890625,
-0.06182861328125,
0.0122833251953125,
0.03350830078125,
0.0467529296875,
-0.00806427001953125,
-0.0628662109375,
-0.0256195068359375,
-0.0233612060546875,
0.01378631591796875,
0.00930023193359375,
-0.035186767578125,
0.002300262451171875,
0.015899658203125,
0.0556640625,
-0.026336669921875,
0.00875091552734375,
-0.07342529296875,
-0.00434112548828125,
0.071533203125,
0.007747650146484375,
0.035064697265625,
-0.0188140869140625,
-0.063720703125,
-0.0276641845703125,
-0.06683349609375,
-0.0024776458740234375,
0.058563232421875,
0.007297515869140625,
-0.06622314453125,
0.1002197265625,
-0.0103607177734375,
0.0338134765625,
0.004383087158203125,
-0.009552001953125,
0.048370361328125,
0.018402099609375,
-0.02935791015625,
-0.0122833251953125,
0.034393310546875,
0.04296875,
0.0227203369140625,
-0.0079193115234375,
-0.01433563232421875,
-0.0123138427734375,
0.033935546875,
-0.052154541015625,
-0.012420654296875,
-0.00531768798828125,
-0.045379638671875,
-0.0151824951171875,
-0.004425048828125,
-0.0238189697265625,
-0.0289459228515625,
-0.01580810546875,
0.0301361083984375,
0.00429534912109375,
-0.0219879150390625,
0.023956298828125,
-0.023468017578125,
0.0248565673828125,
0.015655517578125,
-0.054107666015625,
0.03338623046875,
-0.008758544921875,
0.033294677734375,
0.0235595703125,
-0.01306915283203125,
0.005733489990234375,
0.0225067138671875,
-0.0111236572265625,
0.05572509765625,
-0.0036563873291015625,
-0.059478759765625,
-0.01015472412109375,
0.038177490234375,
0.00043892860412597656,
-0.049591064453125,
0.03143310546875,
-0.0672607421875,
-0.003993988037109375,
-0.0169677734375,
-0.02587890625,
-0.037994384765625,
-0.008087158203125,
-0.07049560546875,
0.06494140625,
0.03955078125,
-0.04248046875,
0.0286712646484375,
-0.039276123046875,
-0.00916290283203125,
0.0308074951171875,
0.00835418701171875,
-0.04315185546875,
0.020660400390625,
-0.003147125244140625,
0.0204620361328125,
-0.023101806640625,
0.0180816650390625,
-0.03863525390625,
-0.0225067138671875,
0.00516510009765625,
-0.0038661956787109375,
0.08221435546875,
0.0222625732421875,
0.0081787109375,
-0.0004940032958984375,
-0.07080078125,
-0.0380859375,
0.044677734375,
-0.05462646484375,
0.004962921142578125,
-0.0176544189453125,
0.0093841552734375,
-0.00848388671875,
0.04962158203125,
-0.05078125,
0.035797119140625,
0.0157928466796875,
0.00878143310546875,
0.05474853515625,
-0.003780364990234375,
0.0401611328125,
-0.03680419921875,
0.0628662109375,
-0.01149749755859375,
0.02703857421875,
0.036529541015625,
-0.04840087890625,
-0.04840087890625,
-0.0113983154296875,
-0.0038166046142578125,
0.0251617431640625,
-0.03594970703125,
0.01215362548828125,
-0.007045745849609375,
-0.06689453125,
-0.028289794921875,
0.011505126953125,
0.01442718505859375,
0.0260162353515625,
0.0219879150390625,
-0.012237548828125,
-0.036102294921875,
-0.09014892578125,
0.0199737548828125,
-0.01934814453125,
-0.03631591796875,
0.04168701171875,
0.0655517578125,
-0.01806640625,
0.0394287109375,
-0.046539306640625,
-0.031280517578125,
-0.0211181640625,
0.00562286376953125,
0.0247650146484375,
0.01849365234375,
0.071533203125,
-0.0419921875,
-0.006072998046875,
-0.0264739990234375,
-0.06494140625,
-0.0401611328125,
-0.0026073455810546875,
0.004886627197265625,
-0.01308441162109375,
0.0015659332275390625,
-0.038604736328125,
0.0264739990234375,
0.07470703125,
-0.018890380859375,
0.03717041015625,
-0.016571044921875,
-0.01824951171875,
-0.0693359375,
0.0229034423828125,
0.01184844970703125,
-0.01544189453125,
0.005184173583984375,
0.03277587890625,
0.0091705322265625,
0.0005307197570800781,
-0.05657958984375,
0.03240966796875,
-0.024566650390625,
-0.01788330078125,
-0.007633209228515625,
-0.0036220550537109375,
0.004741668701171875,
0.0164947509765625,
-0.030303955078125,
0.073974609375,
0.016815185546875,
-0.03485107421875,
0.0318603515625,
0.041595458984375,
-0.0292510986328125,
0.036041259765625,
-0.060272216796875,
-0.00431060791015625,
-0.022216796875,
0.0271148681640625,
-0.0245361328125,
-0.026763916015625,
0.0272674560546875,
-0.0250091552734375,
0.0171051025390625,
-0.01605224609375,
-0.01031494140625,
-0.025390625,
-0.01166534423828125,
0.0237579345703125,
0.0234375,
-0.0279693603515625,
0.055023193359375,
0.04248046875,
-0.00495147705078125,
-0.044158935546875,
-0.088134765625,
-0.01447296142578125,
-0.0239105224609375,
-0.0179290771484375,
0.037384033203125,
-0.00572967529296875,
-0.021728515625,
0.0181732177734375,
-0.0090179443359375,
-0.013397216796875,
0.004482269287109375,
0.031707763671875,
0.02911376953125,
0.002346038818359375,
-0.015899658203125,
0.01410675048828125,
-0.004962921142578125,
0.007762908935546875,
0.01363372802734375,
0.04876708984375,
-0.0083160400390625,
-0.0164337158203125,
-0.041259765625,
-0.0032787322998046875,
0.03460693359375,
0.0063018798828125,
0.057708740234375,
0.014129638671875,
-0.036712646484375,
0.006061553955078125,
-0.033966064453125,
0.010833740234375,
-0.034423828125,
0.0195159912109375,
-0.022064208984375,
-0.0335693359375,
0.048095703125,
0.0127410888671875,
-0.006893157958984375,
0.0723876953125,
0.0615234375,
-0.0117950439453125,
0.035125732421875,
0.0615234375,
-0.007244110107421875,
0.031890869140625,
-0.04449462890625,
-0.01483154296875,
-0.09796142578125,
-0.06939697265625,
-0.0445556640625,
-0.04095458984375,
-0.013946533203125,
-0.00820159912109375,
0.0030574798583984375,
0.020050048828125,
-0.06317138671875,
0.0679931640625,
-0.02276611328125,
0.009979248046875,
0.04302978515625,
0.0251617431640625,
0.031097412109375,
0.0026912689208984375,
-0.00782012939453125,
0.004726409912109375,
-0.0247955322265625,
-0.038970947265625,
0.09332275390625,
0.0175018310546875,
0.0836181640625,
0.01763916015625,
0.049468994140625,
0.0271759033203125,
0.0460205078125,
-0.044586181640625,
0.035675048828125,
0.0004992485046386719,
-0.07403564453125,
0.0055999755859375,
-0.033050537109375,
-0.06829833984375,
0.0044403076171875,
-0.0030918121337890625,
-0.0670166015625,
0.0157012939453125,
-0.01430511474609375,
0.0027523040771484375,
0.03497314453125,
-0.047088623046875,
0.033416748046875,
-0.00656890869140625,
0.025115966796875,
-0.0230712890625,
-0.0173187255859375,
0.04595947265625,
-0.0017805099487304688,
0.016204833984375,
-0.0224151611328125,
-0.003688812255859375,
0.06890869140625,
-0.033905029296875,
0.06951904296875,
-0.030914306640625,
-0.0165557861328125,
0.038970947265625,
-0.009979248046875,
0.028961181640625,
0.0170745849609375,
-0.004100799560546875,
0.03167724609375,
-0.003936767578125,
-0.0223236083984375,
-0.00013208389282226562,
0.032867431640625,
-0.0916748046875,
-0.032470703125,
-0.0267486572265625,
-0.04791259765625,
0.0352783203125,
0.0127105712890625,
0.0168609619140625,
-0.01800537109375,
0.0254974365234375,
0.044403076171875,
0.017974853515625,
-0.02252197265625,
0.02783203125,
0.044830322265625,
-0.022064208984375,
-0.0287322998046875,
0.035858154296875,
0.006984710693359375,
0.0157012939453125,
0.0223846435546875,
0.016326904296875,
-0.0299835205078125,
-0.02923583984375,
-0.041351318359375,
0.0243072509765625,
-0.0574951171875,
-0.036865234375,
-0.028076171875,
-0.018035888671875,
-0.0095062255859375,
-0.01049041748046875,
-0.021087646484375,
-0.041534423828125,
-0.041107177734375,
-0.01291656494140625,
0.039306640625,
0.06494140625,
-0.00861358642578125,
0.08221435546875,
-0.05548095703125,
0.0215911865234375,
0.022796630859375,
0.021636962890625,
0.00521087646484375,
-0.046478271484375,
-0.0044403076171875,
-0.00897979736328125,
-0.050018310546875,
-0.06060791015625,
0.0227203369140625,
-0.005382537841796875,
0.0438232421875,
0.0181884765625,
-0.0127410888671875,
0.039764404296875,
-0.018798828125,
0.076904296875,
0.029266357421875,
-0.062042236328125,
0.00458526611328125,
-0.021636962890625,
0.0033397674560546875,
0.02734375,
0.030670166015625,
-0.0176239013671875,
0.0038928985595703125,
-0.049224853515625,
-0.06158447265625,
0.04681396484375,
0.013427734375,
0.01213836669921875,
0.035003662109375,
0.016265869140625,
0.00036716461181640625,
0.029693603515625,
-0.0911865234375,
-0.007312774658203125,
-0.0195159912109375,
-0.0160980224609375,
0.0299835205078125,
-0.01507568359375,
-0.035919189453125,
-0.0025463104248046875,
0.05499267578125,
0.0111846923828125,
0.0131378173828125,
-0.01042938232421875,
-0.0167388916015625,
-0.03125,
0.001445770263671875,
0.03399658203125,
0.033843994140625,
-0.038360595703125,
-0.0258941650390625,
0.034454345703125,
-0.06219482421875,
0.010498046875,
0.023834228515625,
-0.0036525726318359375,
-0.03662109375,
0.02606201171875,
0.032440185546875,
0.0215301513671875,
-0.03253173828125,
0.0433349609375,
0.002101898193359375,
-0.032958984375,
-0.03448486328125,
-0.017486572265625,
0.0239410400390625,
0.05010986328125,
0.037139892578125,
-0.0033702850341796875,
0.007518768310546875,
-0.0165557861328125,
-0.00015735626220703125,
0.020111083984375,
0.004795074462890625,
-0.025146484375,
0.08099365234375,
-0.0016489028930664062,
-0.0111236572265625,
0.05230712890625,
-0.0042572021484375,
0.006183624267578125,
0.0623779296875,
0.042816162109375,
0.05511474609375,
-0.007122039794921875,
0.01042938232421875,
0.009613037109375,
0.027008056640625,
-0.0052642822265625,
0.051239013671875,
0.006206512451171875,
-0.03350830078125,
-0.037017822265625,
-0.07794189453125,
-0.04620361328125,
0.00437164306640625,
-0.06219482421875,
0.0374755859375,
-0.031494140625,
-0.0209197998046875,
-0.0230560302734375,
0.00064849853515625,
-0.0462646484375,
0.0355224609375,
0.0271148681640625,
0.07659912109375,
-0.0537109375,
0.052459716796875,
0.05181884765625,
-0.074951171875,
-0.08306884765625,
-0.04315185546875,
0.0081787109375,
-0.10430908203125,
0.050689697265625,
-0.01042938232421875,
-0.0003554821014404297,
-0.007259368896484375,
-0.06048583984375,
-0.08807373046875,
0.10699462890625,
0.03564453125,
-0.05010986328125,
-0.03399658203125,
0.006740570068359375,
0.0007205009460449219,
-0.0023651123046875,
0.01727294921875,
-0.007534027099609375,
0.03948974609375,
0.0245361328125,
-0.062744140625,
0.004405975341796875,
-0.0007081031799316406,
-0.0007371902465820312,
-0.004871368408203125,
-0.06353759765625,
0.084716796875,
-0.013153076171875,
-0.0126190185546875,
0.03021240234375,
0.034210205078125,
0.044525146484375,
0.0037555694580078125,
0.024871826171875,
0.07177734375,
0.04510498046875,
-0.001392364501953125,
0.089599609375,
-0.012359619140625,
0.058013916015625,
0.04302978515625,
-0.026519775390625,
0.049163818359375,
0.038909912109375,
-0.0479736328125,
0.06646728515625,
0.040924072265625,
-0.040924072265625,
0.0285491943359375,
0.04248046875,
-0.005420684814453125,
0.0017757415771484375,
-0.01027679443359375,
-0.046112060546875,
0.02703857421875,
0.010772705078125,
-0.030303955078125,
-0.033172607421875,
-0.0196533203125,
0.007083892822265625,
-0.02093505859375,
-0.022125244140625,
0.0245819091796875,
0.037353515625,
0.0150146484375,
0.0306243896484375,
0.01097869873046875,
0.0369873046875,
-0.058685302734375,
0.0035762786865234375,
-0.00052642822265625,
0.01232147216796875,
-0.0308837890625,
-0.05023193359375,
0.03533935546875,
0.0091094970703125,
-0.009185791015625,
-0.016082763671875,
0.02734375,
0.025390625,
-0.0546875,
0.03167724609375,
0.01003265380859375,
0.014129638671875,
0.004283905029296875,
-0.045379638671875,
0.01494598388671875,
-0.0214996337890625,
-0.038726806640625,
0.0229034423828125,
-0.0193023681640625,
-0.00730133056640625,
0.07745361328125,
0.04327392578125,
0.0118560791015625,
0.00783538818359375,
0.0195465087890625,
0.0777587890625,
-0.035064697265625,
-0.00823974609375,
-0.032867431640625,
0.05621337890625,
0.0063934326171875,
-0.042572021484375,
0.039642333984375,
0.04730224609375,
0.07745361328125,
-0.03448486328125,
0.031280517578125,
-0.03143310546875,
-0.00974273681640625,
-0.03387451171875,
0.08978271484375,
-0.070556640625,
-0.0001990795135498047,
-0.00685882568359375,
-0.07464599609375,
-0.035491943359375,
0.045684814453125,
0.00434112548828125,
-0.007099151611328125,
0.031097412109375,
0.033355712890625,
0.0142364501953125,
-0.0029735565185546875,
0.004177093505859375,
0.00101470947265625,
0.0270843505859375,
0.038299560546875,
0.04058837890625,
-0.04815673828125,
0.052032470703125,
-0.0085601806640625,
-0.0243072509765625,
0.0011262893676757812,
-0.06915283203125,
-0.051300048828125,
-0.017120361328125,
-0.0059051513671875,
-0.0106964111328125,
-0.0282135009765625,
0.050018310546875,
0.050750732421875,
-0.03424072265625,
-0.037689208984375,
-0.00732421875,
-0.00848388671875,
-0.004749298095703125,
-0.01171875,
0.01479339599609375,
0.015350341796875,
-0.06298828125,
0.0299530029296875,
-0.0023937225341796875,
0.03955078125,
-0.0232696533203125,
-0.000522613525390625,
-0.011810302734375,
-0.007312774658203125,
0.016357421875,
0.0105438232421875,
-0.048248291015625,
0.0008540153503417969,
-0.0002703666687011719,
-0.0003228187561035156,
-0.0028133392333984375,
0.0219268798828125,
-0.045806884765625,
0.0079193115234375,
0.0311737060546875,
0.0219879150390625,
0.056976318359375,
-0.005130767822265625,
0.032440185546875,
-0.046783447265625,
0.0263824462890625,
0.01496124267578125,
0.061309814453125,
0.0175323486328125,
-0.01497650146484375,
0.036041259765625,
0.02362060546875,
-0.050384521484375,
-0.0438232421875,
0.007160186767578125,
-0.0972900390625,
0.0011148452758789062,
0.061248779296875,
-0.01459503173828125,
-0.032806396484375,
0.046295166015625,
-0.0232391357421875,
0.0275421142578125,
-0.0301361083984375,
0.051544189453125,
0.051910400390625,
0.01222991943359375,
-0.035125732421875,
-0.04510498046875,
-0.006458282470703125,
-0.0036563873291015625,
-0.046783447265625,
-0.031951904296875,
0.0175628662109375,
0.032196044921875,
0.0168609619140625,
0.0130157470703125,
-0.04949951171875,
-0.0130615234375,
0.01062774658203125,
0.046112060546875,
-0.009613037109375,
-0.01380157470703125,
-0.0146484375,
-0.005199432373046875,
0.001445770263671875,
-0.0299835205078125
]
] |
timm/lcnet_050.ra2_in1k | 2023-04-27T22:48:56.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"arxiv:2110.00476",
"arxiv:2109.15099",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/lcnet_050.ra2_in1k | 0 | 35,905 | timm | 2022-12-16T05:37:27 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
datasets:
- imagenet-1k
---
# Model card for lcnet_050.ra2_in1k
An LCNet image classification model. Trained on ImageNet-1k in `timm` using the recipe template described below.
Recipe details:
* RandAugment `RA2` recipe. Inspired by and evolved from EfficientNet RandAugment recipes. Published as `B` recipe in [ResNet Strikes Back](https://arxiv.org/abs/2110.00476).
* RMSProp (TF 1.0 behaviour) optimizer, EMA weight averaging
* Step (exponential decay w/ staircase) LR schedule with warmup
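The step schedule above can be sketched as a plain function (an illustrative sketch only, not the exact `timm` scheduler; the warmup length, decay interval, and decay rate are placeholder values):

```python
def step_lr(epoch, base_lr=0.5, warmup_epochs=5, decay_epochs=30, decay_rate=0.1):
    """Staircase exponential decay with linear warmup (placeholder hyperparameters)."""
    if epoch < warmup_epochs:
        # linear warmup from base_lr / warmup_epochs up to base_lr
        return base_lr * (epoch + 1) / warmup_epochs
    # staircase: LR drops by decay_rate every decay_epochs after warmup
    steps = (epoch - warmup_epochs) // decay_epochs
    return base_lr * (decay_rate ** steps)
```

The LR is constant within each step and drops discontinuously at step boundaries, which is what "exponential decay w/ staircase" refers to.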
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 1.9
- GMACs: 0.0
- Activations (M): 1.3
- Image size: 224 x 224
- **Papers:**
- PP-LCNet: A Lightweight CPU Convolutional Neural Network: https://arxiv.org/abs/2109.15099
- ResNet strikes back: An improved training procedure in timm: https://arxiv.org/abs/2110.00476
- **Dataset:** ImageNet-1k
- **Original:** https://github.com/huggingface/pytorch-image-models
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('lcnet_050.ra2_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
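For reference, the `softmax`/`topk` step in the last line above amounts to the following (a pure-Python sketch on a toy logit vector, independent of `torch`):

```python
import math

def softmax_topk(logits, k=5):
    """Return the top-k (probability, index) pairs for a list of logits."""
    exps = [math.exp(x - max(logits)) for x in logits]  # shift by max for stability
    total = sum(exps)
    probs = [e / total for e in exps]
    return sorted(zip(probs, range(len(probs))), reverse=True)[:k]

print(softmax_topk([2.0, 1.0, 0.1], k=2))
```

The indices returned this way are positions in the 1000-class ImageNet-1k label space.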
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'lcnet_050.ra2_in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g.:
# torch.Size([1, 16, 112, 112])
# torch.Size([1, 32, 56, 56])
# torch.Size([1, 64, 28, 28])
# torch.Size([1, 128, 14, 14])
# torch.Size([1, 256, 7, 7])
print(o.shape)
```
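The shapes printed above correspond to successive reduction factors of the 224-px input; the stride of each stage can be recovered from them (a small sketch, assuming the example shapes listed in the comments):

```python
# feature map shapes from the comments above: (batch, channels, height, width)
feature_shapes = [(1, 16, 112, 112), (1, 32, 56, 56), (1, 64, 28, 28),
                  (1, 128, 14, 14), (1, 256, 7, 7)]
strides = [224 // h for (_, _, h, _) in feature_shapes]
print(strides)  # → [2, 4, 8, 16, 32]
```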
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'lcnet_050.ra2_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 256, 7, 7) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
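The `pre_logits=True` path pools the spatial feature map down to one value per channel before the classifier. Global average pooling, the common choice, can be sketched in pure Python (illustrative only, not the exact `timm` head, which may include additional head layers):

```python
def global_avg_pool(fmap):
    """fmap: nested lists of shape (C, H, W) -> list of C per-channel means."""
    return [sum(sum(row) for row in channel) / (len(channel) * len(channel[0]))
            for channel in fmap]

fmap = [[[1.0, 3.0], [5.0, 7.0]], [[0.0, 2.0], [4.0, 6.0]]]  # toy (2, 2, 2) map
print(global_avg_pool(fmap))  # → [4.0, 3.0]
```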
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@article{cui2021pp,
title={PP-LCNet: A lightweight CPU convolutional neural network},
author={Cui, Cheng and Gao, Tingquan and Wei, Shengyu and Du, Yuning and Guo, Ruoyu and Dong, Shuilong and Lu, Bin and Zhou, Ying and Lv, Xueying and Liu, Qiwen and others},
journal={arXiv preprint arXiv:2109.15099},
year={2021}
}
```
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
```bibtex
@inproceedings{wightman2021resnet,
title={ResNet strikes back: An improved training procedure in timm},
author={Wightman, Ross and Touvron, Hugo and Jegou, Herve},
booktitle={NeurIPS 2021 Workshop on ImageNet: Past, Present, and Future}
}
```
| 4,706 | [
[
-0.032989501953125,
-0.0305633544921875,
-0.004489898681640625,
-0.0011920928955078125,
-0.0230712890625,
-0.0252838134765625,
-0.025299072265625,
-0.03314208984375,
0.0127105712890625,
0.0404052734375,
-0.033935546875,
-0.044769287109375,
-0.05072021484375,
-0.01128387451171875,
-0.0082855224609375,
0.07232666015625,
0.00007891654968261719,
0.006397247314453125,
-0.0101318359375,
-0.046142578125,
-0.01187896728515625,
-0.02728271484375,
-0.07763671875,
-0.040618896484375,
0.036346435546875,
0.0193939208984375,
0.044464111328125,
0.034820556640625,
0.045379638671875,
0.034271240234375,
-0.0090179443359375,
0.0116119384765625,
-0.01497650146484375,
-0.017120361328125,
0.0287933349609375,
-0.043609619140625,
-0.033843994140625,
0.0155792236328125,
0.04498291015625,
0.03424072265625,
0.0058135986328125,
0.0306243896484375,
0.0149688720703125,
0.056488037109375,
-0.0210723876953125,
-0.00028705596923828125,
-0.03411865234375,
0.022918701171875,
-0.0084381103515625,
0.0083770751953125,
-0.0283966064453125,
-0.0200347900390625,
0.01497650146484375,
-0.039825439453125,
0.032623291015625,
0.006404876708984375,
0.08935546875,
0.016693115234375,
0.0009732246398925781,
-0.007526397705078125,
-0.0155181884765625,
0.06378173828125,
-0.06280517578125,
0.015777587890625,
0.01558685302734375,
0.017364501953125,
-0.0024261474609375,
-0.078125,
-0.041473388671875,
-0.0240936279296875,
-0.01528167724609375,
0.0042572021484375,
-0.01305389404296875,
0.0019102096557617188,
0.0247802734375,
0.01480865478515625,
-0.041534423828125,
0.01371002197265625,
-0.034942626953125,
-0.01441192626953125,
0.03631591796875,
0.0007071495056152344,
0.0243072509765625,
-0.0156402587890625,
-0.03643798828125,
-0.03076171875,
-0.02667236328125,
0.0233306884765625,
0.0247650146484375,
0.0203399658203125,
-0.050567626953125,
0.0306854248046875,
0.01629638671875,
0.041046142578125,
0.01139068603515625,
-0.022247314453125,
0.041534423828125,
-0.001155853271484375,
-0.033599853515625,
-0.0026836395263671875,
0.080810546875,
0.0283966064453125,
0.017303466796875,
0.0088348388671875,
-0.0095977783203125,
-0.03717041015625,
-0.008819580078125,
-0.086181640625,
-0.0300140380859375,
0.0246429443359375,
-0.054412841796875,
-0.03302001953125,
0.0196075439453125,
-0.040435791015625,
-0.01806640625,
-0.00760650634765625,
0.0264892578125,
-0.0304718017578125,
-0.0297698974609375,
-0.00197601318359375,
-0.00836944580078125,
0.0260009765625,
0.01812744140625,
-0.03851318359375,
0.0214080810546875,
0.0268402099609375,
0.08697509765625,
0.0022258758544921875,
-0.0253143310546875,
-0.0217132568359375,
-0.03564453125,
-0.02203369140625,
0.028411865234375,
-0.000019311904907226562,
-0.00347900390625,
-0.02447509765625,
0.033660888671875,
-0.005939483642578125,
-0.058319091796875,
0.018096923828125,
-0.0199432373046875,
0.01488494873046875,
-0.00855255126953125,
-0.0162200927734375,
-0.035186767578125,
0.025054931640625,
-0.044281005859375,
0.098388671875,
0.02325439453125,
-0.0743408203125,
0.01480865478515625,
-0.03594970703125,
-0.00847625732421875,
-0.0221405029296875,
-0.0017757415771484375,
-0.08251953125,
-0.010406494140625,
0.009368896484375,
0.05059814453125,
-0.033294677734375,
0.0005044937133789062,
-0.044525146484375,
-0.0219268798828125,
0.0288238525390625,
0.0010251998901367188,
0.0848388671875,
0.015960693359375,
-0.036865234375,
0.00862884521484375,
-0.041900634765625,
0.01523590087890625,
0.03936767578125,
-0.0195159912109375,
-0.0004086494445800781,
-0.043365478515625,
0.021514892578125,
0.0275726318359375,
0.01056671142578125,
-0.040802001953125,
0.0129547119140625,
-0.022674560546875,
0.0289306640625,
0.04437255859375,
-0.01007843017578125,
0.0308380126953125,
-0.03338623046875,
0.0191192626953125,
0.0229949951171875,
0.0202484130859375,
0.0038089752197265625,
-0.037567138671875,
-0.06488037109375,
-0.02264404296875,
0.02484130859375,
0.037750244140625,
-0.0467529296875,
0.034332275390625,
-0.00469970703125,
-0.056396484375,
-0.0300445556640625,
0.0160675048828125,
0.041107177734375,
0.04730224609375,
0.023651123046875,
-0.035736083984375,
-0.04656982421875,
-0.06787109375,
0.0082855224609375,
0.0098419189453125,
0.0016803741455078125,
0.03436279296875,
0.048309326171875,
0.0024662017822265625,
0.04412841796875,
-0.01971435546875,
-0.0202484130859375,
-0.029266357421875,
0.005275726318359375,
0.03643798828125,
0.06378173828125,
0.06597900390625,
-0.05072021484375,
-0.042877197265625,
-0.0035915374755859375,
-0.07269287109375,
0.01216888427734375,
-0.01277923583984375,
-0.00936126708984375,
0.0269927978515625,
0.01898193359375,
-0.039825439453125,
0.0390625,
0.0219573974609375,
-0.007503509521484375,
0.038665771484375,
-0.01580810546875,
0.013397216796875,
-0.0897216796875,
0.014678955078125,
0.0244598388671875,
-0.00745391845703125,
-0.029510498046875,
0.002582550048828125,
-0.0032215118408203125,
-0.01119232177734375,
-0.048187255859375,
0.051239013671875,
-0.0450439453125,
-0.0261993408203125,
-0.017120361328125,
-0.01102447509765625,
0.002529144287109375,
0.05804443359375,
-0.006622314453125,
0.028961181640625,
0.0618896484375,
-0.0297698974609375,
0.038055419921875,
0.010467529296875,
-0.01413726806640625,
0.0275115966796875,
-0.06292724609375,
0.0249176025390625,
0.0022487640380859375,
0.026336669921875,
-0.07159423828125,
-0.0079193115234375,
0.0343017578125,
-0.045257568359375,
0.046966552734375,
-0.040802001953125,
-0.037628173828125,
-0.041259765625,
-0.036285400390625,
0.032928466796875,
0.0504150390625,
-0.05078125,
0.0355224609375,
0.01043701171875,
0.0300750732421875,
-0.05023193359375,
-0.070556640625,
-0.01239013671875,
-0.02294921875,
-0.047119140625,
0.0220794677734375,
0.002506256103515625,
0.002834320068359375,
0.00914764404296875,
0.0015554428100585938,
-0.00765228271484375,
-0.00580596923828125,
0.043487548828125,
0.025360107421875,
-0.0266571044921875,
-0.011627197265625,
-0.031890869140625,
-0.006805419921875,
-0.0013265609741210938,
-0.026763916015625,
0.04327392578125,
-0.0188751220703125,
-0.01073455810546875,
-0.06622314453125,
-0.0034503936767578125,
0.0362548828125,
-0.0013866424560546875,
0.06341552734375,
0.08935546875,
-0.039031982421875,
-0.0085906982421875,
-0.03076171875,
-0.0270233154296875,
-0.037139892578125,
0.032684326171875,
-0.0184478759765625,
-0.03436279296875,
0.0660400390625,
-0.0034275054931640625,
0.003753662109375,
0.04437255859375,
0.0267333984375,
-0.01348114013671875,
0.048797607421875,
0.042816162109375,
0.01067352294921875,
0.053009033203125,
-0.077392578125,
-0.0166778564453125,
-0.07159423828125,
-0.029052734375,
-0.02783203125,
-0.052215576171875,
-0.047393798828125,
-0.028106689453125,
0.03424072265625,
0.0151214599609375,
-0.031494140625,
0.032989501953125,
-0.07037353515625,
0.0052337646484375,
0.047882080078125,
0.04937744140625,
-0.033355712890625,
0.019073486328125,
-0.0276336669921875,
-0.0036163330078125,
-0.059906005859375,
-0.013641357421875,
0.0804443359375,
0.0299224853515625,
0.04248046875,
-0.016357421875,
0.05078125,
-0.01001739501953125,
0.0282440185546875,
-0.041107177734375,
0.047454833984375,
-0.02423095703125,
-0.0298004150390625,
-0.007415771484375,
-0.0283203125,
-0.0736083984375,
0.0035724639892578125,
-0.02783203125,
-0.05084228515625,
0.01177978515625,
0.0182037353515625,
-0.01473236083984375,
0.0645751953125,
-0.055419921875,
0.06585693359375,
-0.0163421630859375,
-0.03521728515625,
0.004871368408203125,
-0.05657958984375,
0.026214599609375,
0.019073486328125,
-0.015472412109375,
-0.006649017333984375,
0.0122222900390625,
0.07379150390625,
-0.051239013671875,
0.07281494140625,
-0.037750244140625,
0.0333251953125,
0.044464111328125,
-0.01092529296875,
0.027435302734375,
-0.010101318359375,
-0.018218994140625,
0.0254974365234375,
-0.007198333740234375,
-0.036590576171875,
-0.04345703125,
0.043365478515625,
-0.076904296875,
-0.01500701904296875,
-0.024078369140625,
-0.034820556640625,
0.021331787109375,
0.0032501220703125,
0.043121337890625,
0.054931640625,
0.019134521484375,
0.025299072265625,
0.046356201171875,
-0.04132080078125,
0.036407470703125,
-0.00554656982421875,
-0.007537841796875,
-0.039794921875,
0.054931640625,
0.0210723876953125,
0.014068603515625,
0.01207733154296875,
0.0165557861328125,
-0.01264190673828125,
-0.043243408203125,
-0.0237579345703125,
0.0241546630859375,
-0.05303955078125,
-0.03778076171875,
-0.041534423828125,
-0.0390625,
-0.0255889892578125,
-0.01409912109375,
-0.0418701171875,
-0.0247802734375,
-0.027099609375,
0.0231781005859375,
0.050506591796875,
0.041534423828125,
-0.00997161865234375,
0.038604736328125,
-0.0355224609375,
0.005695343017578125,
0.007663726806640625,
0.0238494873046875,
0.005466461181640625,
-0.063232421875,
-0.0256805419921875,
-0.0030345916748046875,
-0.029510498046875,
-0.047576904296875,
0.030029296875,
0.01340484619140625,
0.037811279296875,
0.018035888671875,
-0.0100555419921875,
0.04583740234375,
-0.0009164810180664062,
0.043243408203125,
0.03753662109375,
-0.0379638671875,
0.04510498046875,
0.0037212371826171875,
0.0194244384765625,
0.0089874267578125,
0.01727294921875,
-0.0201416015625,
-0.00557708740234375,
-0.06805419921875,
-0.05474853515625,
0.065185546875,
0.00347137451171875,
-0.0015287399291992188,
0.02752685546875,
0.062225341796875,
-0.0035152435302734375,
-0.005615234375,
-0.059906005859375,
-0.039306640625,
-0.023681640625,
-0.01837158203125,
0.005771636962890625,
-0.004184722900390625,
-0.003963470458984375,
-0.046844482421875,
0.0496826171875,
-0.00934600830078125,
0.05316162109375,
0.031646728515625,
-0.0018768310546875,
-0.0078277587890625,
-0.029541015625,
0.036041259765625,
0.018768310546875,
-0.023162841796875,
0.01265716552734375,
0.0241546630859375,
-0.041656494140625,
0.0020084381103515625,
0.00559234619140625,
-0.0019140243530273438,
0.0023403167724609375,
0.041412353515625,
0.06292724609375,
0.01085662841796875,
0.0004067420959472656,
0.0289306640625,
-0.0024127960205078125,
-0.035247802734375,
-0.0246429443359375,
0.0093231201171875,
0.0026683807373046875,
0.03216552734375,
0.0219879150390625,
0.03314208984375,
-0.01230621337890625,
-0.0199432373046875,
0.026611328125,
0.0380859375,
-0.022918701171875,
-0.0244598388671875,
0.04827880859375,
-0.01898193359375,
-0.0137481689453125,
0.06439208984375,
-0.01000213623046875,
-0.03704833984375,
0.0938720703125,
0.033447265625,
0.07269287109375,
0.0059967041015625,
0.00017380714416503906,
0.06951904296875,
0.022613525390625,
0.003429412841796875,
0.0098419189453125,
0.01073455810546875,
-0.060760498046875,
0.007442474365234375,
-0.035919189453125,
-0.0037021636962890625,
0.0225677490234375,
-0.04010009765625,
0.0226898193359375,
-0.05926513671875,
-0.032684326171875,
0.0140228271484375,
0.03521728515625,
-0.07476806640625,
0.01401519775390625,
0.004543304443359375,
0.06719970703125,
-0.050140380859375,
0.0679931640625,
0.06304931640625,
-0.039581298828125,
-0.08099365234375,
-0.01512908935546875,
0.0055084228515625,
-0.06982421875,
0.055145263671875,
0.0304718017578125,
0.007129669189453125,
-0.0028667449951171875,
-0.070068359375,
-0.042938232421875,
0.1123046875,
0.0380859375,
-0.01465606689453125,
0.0221405029296875,
-0.005558013916015625,
0.018829345703125,
-0.03961181640625,
0.035980224609375,
0.006069183349609375,
0.0288238525390625,
0.0257720947265625,
-0.050506591796875,
0.0214385986328125,
-0.02337646484375,
0.00861358642578125,
0.0034275054931640625,
-0.06365966796875,
0.0699462890625,
-0.041748046875,
-0.005767822265625,
0.00807952880859375,
0.047119140625,
0.02276611328125,
0.01910400390625,
0.046661376953125,
0.06341552734375,
0.03546142578125,
-0.018798828125,
0.06988525390625,
-0.007205963134765625,
0.047332763671875,
0.05718994140625,
0.01849365234375,
0.0399169921875,
0.0208587646484375,
-0.015716552734375,
0.0288848876953125,
0.07916259765625,
-0.0288238525390625,
0.019805908203125,
0.01702880859375,
0.00433349609375,
-0.008056640625,
-0.00009810924530029297,
-0.0299224853515625,
0.0308685302734375,
0.01024627685546875,
-0.04302978515625,
-0.015899658203125,
0.00669097900390625,
0.00815582275390625,
-0.02056884765625,
-0.0142974853515625,
0.036834716796875,
0.00913238525390625,
-0.0278167724609375,
0.065673828125,
0.00597381591796875,
0.06597900390625,
-0.0287017822265625,
-0.00043201446533203125,
-0.0205535888671875,
0.0165863037109375,
-0.0257568359375,
-0.060394287109375,
0.0210418701171875,
-0.022979736328125,
-0.0028076171875,
0.00591278076171875,
0.054443359375,
-0.023529052734375,
-0.03814697265625,
0.0238494873046875,
0.0227508544921875,
0.039154052734375,
0.0157012939453125,
-0.10968017578125,
0.0212554931640625,
-0.0032196044921875,
-0.047271728515625,
0.029052734375,
0.0286407470703125,
0.01279449462890625,
0.051177978515625,
0.046844482421875,
-0.00911712646484375,
0.0027008056640625,
-0.005023956298828125,
0.065185546875,
-0.029022216796875,
-0.01204681396484375,
-0.05670166015625,
0.043853759765625,
-0.01050567626953125,
-0.0419921875,
0.035430908203125,
0.04522705078125,
0.056884765625,
0.004505157470703125,
0.034149169921875,
-0.02276611328125,
-0.004024505615234375,
-0.03582763671875,
0.054046630859375,
-0.057891845703125,
0.0023021697998046875,
-0.00811004638671875,
-0.048736572265625,
-0.0246734619140625,
0.046661376953125,
-0.014373779296875,
0.03594970703125,
0.036773681640625,
0.08056640625,
-0.0226593017578125,
-0.033843994140625,
0.01641845703125,
0.01015472412109375,
0.001834869384765625,
0.03839111328125,
0.021331787109375,
-0.061614990234375,
0.0265960693359375,
-0.057464599609375,
-0.0100250244140625,
-0.0109710693359375,
-0.055908203125,
-0.07568359375,
-0.06402587890625,
-0.0419921875,
-0.055816650390625,
-0.00878143310546875,
0.0712890625,
0.08929443359375,
-0.057647705078125,
-0.0122528076171875,
0.0020656585693359375,
0.01611328125,
-0.0233001708984375,
-0.017822265625,
0.05145263671875,
-0.0171051025390625,
-0.048126220703125,
-0.0265960693359375,
0.002452850341796875,
0.02484130859375,
-0.002506256103515625,
-0.024688720703125,
-0.01505279541015625,
-0.023223876953125,
0.0158843994140625,
0.0281829833984375,
-0.0538330078125,
-0.0132904052734375,
-0.0122222900390625,
-0.0168304443359375,
0.0265045166015625,
0.04461669921875,
-0.035552978515625,
0.0227813720703125,
0.023040771484375,
0.026580810546875,
0.055908203125,
-0.0258941650390625,
-0.0009446144104003906,
-0.067626953125,
0.051361083984375,
-0.00717926025390625,
0.0384521484375,
0.03424072265625,
-0.033599853515625,
0.044891357421875,
0.032012939453125,
-0.038848876953125,
-0.059844970703125,
-0.0085296630859375,
-0.09368896484375,
-0.01512908935546875,
0.0626220703125,
-0.031005859375,
-0.039642333984375,
0.031402587890625,
-0.0040130615234375,
0.050506591796875,
-0.005035400390625,
0.02642822265625,
0.015869140625,
-0.0095977783203125,
-0.06549072265625,
-0.049285888671875,
0.031463623046875,
0.0135498046875,
-0.04510498046875,
-0.0301055908203125,
-0.00479888916015625,
0.050140380859375,
0.0146026611328125,
0.0400390625,
-0.01078033447265625,
0.0155181884765625,
0.004215240478515625,
0.04315185546875,
-0.0338134765625,
-0.0010766983032226562,
-0.01849365234375,
0.0022983551025390625,
-0.006290435791015625,
-0.038665771484375
]
] |
Helsinki-NLP/opus-mt-en-hi | 2023-08-16T11:29:49.000Z | [
"transformers",
"pytorch",
"tf",
"rust",
"marian",
"text2text-generation",
"translation",
"en",
"hi",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | translation | Helsinki-NLP | null | null | Helsinki-NLP/opus-mt-en-hi | 19 | 35,773 | transformers | 2022-03-02T23:29:04 | ---
language:
- en
- hi
tags:
- translation
license: apache-2.0
---
### eng-hin
* source group: English
* target group: Hindi
* OPUS readme: [eng-hin](https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/eng-hin/README.md)
* model: transformer-align
* source language(s): eng
* target language(s): hin
* pre-processing: normalization + SentencePiece (spm32k,spm32k)
* download original weights: [opus-2020-06-17.zip](https://object.pouta.csc.fi/Tatoeba-MT-models/eng-hin/opus-2020-06-17.zip)
* test set translations: [opus-2020-06-17.test.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/eng-hin/opus-2020-06-17.test.txt)
* test set scores: [opus-2020-06-17.eval.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/eng-hin/opus-2020-06-17.eval.txt)
## Benchmarks
| testset | BLEU | chr-F |
|-----------------------|-------|-------|
| newsdev2014.eng.hin | 6.9 | 0.296 |
| newstest2014-hien.eng.hin | 9.9 | 0.323 |
| Tatoeba-test.eng.hin | 16.1 | 0.447 |
### System Info:
- hf_name: eng-hin
- source_languages: eng
- target_languages: hin
- opus_readme_url: https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/eng-hin/README.md
- original_repo: Tatoeba-Challenge
- tags: ['translation']
- languages: ['en', 'hi']
- src_constituents: {'eng'}
- tgt_constituents: {'hin'}
- src_multilingual: False
- tgt_multilingual: False
- prepro: normalization + SentencePiece (spm32k,spm32k)
- url_model: https://object.pouta.csc.fi/Tatoeba-MT-models/eng-hin/opus-2020-06-17.zip
- url_test_set: https://object.pouta.csc.fi/Tatoeba-MT-models/eng-hin/opus-2020-06-17.test.txt
- src_alpha3: eng
- tgt_alpha3: hin
- short_pair: en-hi
- chrF2_score: 0.447
- bleu: 16.1
- brevity_penalty: 1.0
- ref_len: 32904.0
- src_name: English
- tgt_name: Hindi
- train_date: 2020-06-17
- src_alpha2: en
- tgt_alpha2: hi
- prefer_old: False
- long_pair: eng-hin
- helsinki_git_sha: 480fcbe0ee1bf4774bcbe6226ad9f58e63f6c535
- transformers_git_sha: 2207e5d8cb224e954a7cba69fa4ac2309e9ff30b
- port_machine: brutasse
- port_time: 2020-08-21-14:41 | 2,137 | [
[
-0.0285797119140625,
-0.049163818359375,
0.015716552734375,
0.030242919921875,
-0.031341552734375,
-0.0171661376953125,
-0.0265350341796875,
-0.02972412109375,
0.0247039794921875,
0.01934814453125,
-0.04388427734375,
-0.053192138671875,
-0.038787841796875,
0.0291290283203125,
0.0020389556884765625,
0.06964111328125,
-0.0081634521484375,
0.010467529296875,
0.03118896484375,
-0.038360595703125,
-0.0362548828125,
-0.015106201171875,
-0.042572021484375,
-0.01617431640625,
0.03179931640625,
0.0267791748046875,
0.03753662109375,
0.041748046875,
0.051910400390625,
0.0220489501953125,
-0.028656005859375,
0.012939453125,
-0.0132598876953125,
-0.006175994873046875,
-0.002330780029296875,
-0.031768798828125,
-0.0419921875,
-0.0181884765625,
0.0657958984375,
0.035919189453125,
0.00838470458984375,
0.034942626953125,
-0.00263214111328125,
0.047637939453125,
-0.0166015625,
0.010345458984375,
-0.03564453125,
-0.00899505615234375,
-0.036834716796875,
-0.01763916015625,
-0.03875732421875,
-0.02362060546875,
0.00846099853515625,
-0.04736328125,
-0.0002760887145996094,
0.0128173828125,
0.1258544921875,
0.0091552734375,
-0.027069091796875,
-0.007110595703125,
-0.0284271240234375,
0.0633544921875,
-0.059722900390625,
0.030120849609375,
0.0301055908203125,
-0.0018062591552734375,
0.0009336471557617188,
-0.023895263671875,
-0.027374267578125,
0.00789642333984375,
-0.0216827392578125,
0.0230865478515625,
-0.0037670135498046875,
-0.0162811279296875,
0.014404296875,
0.047393798828125,
-0.055755615234375,
0.0022525787353515625,
-0.0269622802734375,
-0.0146636962890625,
0.03387451171875,
0.003940582275390625,
0.030670166015625,
-0.057159423828125,
-0.034332275390625,
-0.033843994140625,
-0.039337158203125,
0.012298583984375,
0.031341552734375,
0.0294647216796875,
-0.03936767578125,
0.049896240234375,
-0.003582000732421875,
0.040557861328125,
0.005664825439453125,
-0.009796142578125,
0.050323486328125,
-0.054443359375,
-0.01154327392578125,
-0.01096343994140625,
0.09051513671875,
0.0122222900390625,
-0.0001914501190185547,
0.0164947509765625,
-0.0137481689453125,
-0.0162811279296875,
-0.00896453857421875,
-0.0625,
0.0128326416015625,
0.01337432861328125,
-0.03228759765625,
-0.0125732421875,
0.00017082691192626953,
-0.0657958984375,
0.007343292236328125,
0.0012655258178710938,
0.04095458984375,
-0.06378173828125,
-0.0204010009765625,
0.0201416015625,
0.0036373138427734375,
0.0292816162109375,
-0.0022029876708984375,
-0.03948974609375,
0.006610870361328125,
0.0252838134765625,
0.07232666015625,
-0.0128021240234375,
-0.034088134765625,
-0.0127410888671875,
0.00457000732421875,
-0.00930023193359375,
0.047607421875,
-0.0142669677734375,
-0.0289154052734375,
-0.0086822509765625,
0.02972412109375,
-0.01320648193359375,
-0.01406097412109375,
0.06646728515625,
-0.020843505859375,
0.04107666015625,
-0.02557373046875,
-0.034088134765625,
-0.024322509765625,
0.0274810791015625,
-0.056365966796875,
0.09417724609375,
0.01178741455078125,
-0.0711669921875,
0.02520751953125,
-0.06329345703125,
-0.0245513916015625,
-0.0016632080078125,
0.009124755859375,
-0.056060791015625,
-0.00885772705078125,
0.01971435546875,
0.0245819091796875,
-0.0296478271484375,
0.035064697265625,
-0.0006771087646484375,
-0.01824951171875,
-0.00634002685546875,
-0.0218353271484375,
0.09881591796875,
0.0174407958984375,
-0.03277587890625,
0.010833740234375,
-0.057159423828125,
-0.0006251335144042969,
0.0229034423828125,
-0.033966064453125,
-0.01812744140625,
-0.01192474365234375,
0.01505279541015625,
0.005615234375,
0.0213165283203125,
-0.042266845703125,
0.02471923828125,
-0.0550537109375,
0.01605224609375,
0.061279296875,
0.01092529296875,
0.0176239013671875,
-0.0272064208984375,
0.03155517578125,
0.0187835693359375,
0.00983428955078125,
0.0011205673217773438,
-0.044647216796875,
-0.05322265625,
-0.0191802978515625,
0.043975830078125,
0.05181884765625,
-0.052947998046875,
0.055938720703125,
-0.055450439453125,
-0.057159423828125,
-0.056549072265625,
-0.00962066650390625,
0.040008544921875,
0.0224761962890625,
0.035247802734375,
-0.01849365234375,
-0.041534423828125,
-0.07647705078125,
-0.0170440673828125,
-0.0190887451171875,
0.004467010498046875,
0.0155487060546875,
0.06005859375,
-0.0015249252319335938,
0.042205810546875,
-0.031494140625,
-0.03900146484375,
-0.01337432861328125,
0.018402099609375,
0.0301666259765625,
0.053436279296875,
0.045989990234375,
-0.06646728515625,
-0.0482177734375,
0.00982666015625,
-0.04864501953125,
-0.00826263427734375,
-0.0006651878356933594,
-0.0230560302734375,
0.02587890625,
-0.0011348724365234375,
-0.040496826171875,
0.0215301513671875,
0.04144287109375,
-0.05133056640625,
0.0352783203125,
-0.00971221923828125,
0.0308837890625,
-0.1220703125,
0.0137939453125,
-0.0063934326171875,
-0.0032978057861328125,
-0.022186279296875,
0.004756927490234375,
0.01103973388671875,
0.003204345703125,
-0.043975830078125,
0.0567626953125,
-0.047393798828125,
0.007030487060546875,
0.026214599609375,
0.01000213623046875,
-0.0007176399230957031,
0.061187744140625,
-0.016510009765625,
0.073974609375,
0.04119873046875,
-0.0264434814453125,
0.00675201416015625,
0.038848876953125,
-0.02984619140625,
0.021759033203125,
-0.052154541015625,
-0.0192413330078125,
0.02020263671875,
-0.006107330322265625,
-0.055572509765625,
-0.019012451171875,
0.01522064208984375,
-0.053375244140625,
0.0228729248046875,
-0.007266998291015625,
-0.049163818359375,
-0.015625,
-0.031005859375,
0.043975830078125,
0.03814697265625,
-0.0169830322265625,
0.05657958984375,
0.01253509521484375,
-0.0035228729248046875,
-0.04364013671875,
-0.06329345703125,
-0.0010042190551757812,
-0.0149688720703125,
-0.050323486328125,
0.030303955078125,
-0.0142669677734375,
0.0066375732421875,
0.0110321044921875,
0.00281524658203125,
-0.00962066650390625,
0.0010356903076171875,
0.007511138916015625,
0.0259857177734375,
-0.027313232421875,
0.00848388671875,
-0.008697509765625,
-0.005413055419921875,
-0.016326904296875,
-0.017303466796875,
0.055206298828125,
-0.035736083984375,
-0.0197906494140625,
-0.05517578125,
0.00958251953125,
0.03936767578125,
-0.037261962890625,
0.0777587890625,
0.0472412109375,
-0.0259857177734375,
0.0203399658203125,
-0.03961181640625,
0.00608062744140625,
-0.02911376953125,
0.0220947265625,
-0.04119873046875,
-0.0426025390625,
0.06268310546875,
0.01491546630859375,
0.00951385498046875,
0.07366943359375,
0.05157470703125,
0.0135345458984375,
0.049652099609375,
0.025360107421875,
0.006099700927734375,
0.040008544921875,
-0.04132080078125,
0.0021572113037109375,
-0.060333251953125,
-0.0167694091796875,
-0.056060791015625,
-0.0123138427734375,
-0.07440185546875,
-0.010986328125,
0.024200439453125,
-0.00879669189453125,
-0.01036834716796875,
0.05682373046875,
-0.04241943359375,
0.02520751953125,
0.045135498046875,
0.01416015625,
0.0211029052734375,
-0.00823974609375,
-0.032928466796875,
-0.007526397705078125,
-0.0295257568359375,
-0.039154052734375,
0.08575439453125,
0.02252197265625,
0.019378662109375,
0.0263824462890625,
0.047760009765625,
0.0015878677368164062,
0.00724029541015625,
-0.0504150390625,
0.0450439453125,
-0.009674072265625,
-0.0694580078125,
-0.03167724609375,
-0.02850341796875,
-0.06988525390625,
0.01532745361328125,
-0.01389312744140625,
-0.04510498046875,
0.0103607177734375,
-0.0106964111328125,
-0.0085296630859375,
0.04730224609375,
-0.060577392578125,
0.07366943359375,
0.00423431396484375,
-0.025787353515625,
0.01232147216796875,
-0.050018310546875,
0.0155181884765625,
-0.0056304931640625,
0.01346588134765625,
-0.0112152099609375,
-0.00386810302734375,
0.06964111328125,
-0.019439697265625,
0.041748046875,
-0.0034503936767578125,
-0.005352020263671875,
0.0162811279296875,
0.00884246826171875,
0.03228759765625,
-0.0066680908203125,
-0.02447509765625,
0.02392578125,
0.0077972412109375,
-0.05120849609375,
-0.016448974609375,
0.043731689453125,
-0.05755615234375,
-0.037322998046875,
-0.041656494140625,
-0.04815673828125,
0.00023686885833740234,
0.04345703125,
0.0406494140625,
0.043060302734375,
-0.0023632049560546875,
0.04754638671875,
0.050811767578125,
-0.019805908203125,
0.038482666015625,
0.031585693359375,
-0.0006132125854492188,
-0.043548583984375,
0.05120849609375,
0.018157958984375,
0.01413726806640625,
0.037322998046875,
0.015655517578125,
-0.013153076171875,
-0.0548095703125,
-0.04412841796875,
0.028961181640625,
-0.030792236328125,
-0.025909423828125,
-0.038848876953125,
-0.0095367431640625,
-0.0309600830078125,
0.0137939453125,
-0.028656005859375,
-0.0272216796875,
-0.0038166046142578125,
-0.0181121826171875,
0.027679443359375,
0.029510498046875,
0.006702423095703125,
0.0193328857421875,
-0.06451416015625,
0.016204833984375,
-0.0123138427734375,
0.039093017578125,
-0.0198211669921875,
-0.05841064453125,
-0.0289306640625,
-0.00047206878662109375,
-0.0268402099609375,
-0.080810546875,
0.035736083984375,
-0.0026416778564453125,
0.018157958984375,
0.00983428955078125,
0.0045928955078125,
0.045684814453125,
-0.032806396484375,
0.0770263671875,
-0.00392913818359375,
-0.06585693359375,
0.0517578125,
-0.034271240234375,
0.03814697265625,
0.05230712890625,
0.0274810791015625,
-0.0200958251953125,
-0.04449462890625,
-0.053680419921875,
-0.061126708984375,
0.056488037109375,
0.042236328125,
-0.00698089599609375,
-0.001941680908203125,
0.005588531494140625,
0.0013952255249023438,
-0.0184478759765625,
-0.0882568359375,
-0.0298614501953125,
0.0058441162109375,
-0.03729248046875,
0.0122528076171875,
-0.0281829833984375,
-0.01181793212890625,
-0.02435302734375,
0.0789794921875,
0.01250457763671875,
0.00823974609375,
0.037567138671875,
-0.0143890380859375,
0.0011739730834960938,
0.0301513671875,
0.046356201171875,
0.033416748046875,
-0.0254058837890625,
-0.01165008544921875,
0.0274810791015625,
-0.04461669921875,
0.00634002685546875,
0.00296783447265625,
-0.037628173828125,
0.0165252685546875,
0.040374755859375,
0.0655517578125,
0.0168609619140625,
-0.033416748046875,
0.04132080078125,
-0.00585174560546875,
-0.0252532958984375,
-0.027313232421875,
-0.0235748291015625,
0.0090484619140625,
0.00008606910705566406,
0.0187225341796875,
0.00457000732421875,
-0.00024175643920898438,
-0.01947021484375,
0.00148773193359375,
0.0037517547607421875,
-0.018524169921875,
-0.031341552734375,
0.0372314453125,
0.00641632080078125,
-0.0300445556640625,
0.0282135009765625,
-0.028289794921875,
-0.03143310546875,
0.04150390625,
0.02215576171875,
0.0794677734375,
-0.0174407958984375,
-0.01434326171875,
0.0615234375,
0.035736083984375,
0.0016832351684570312,
0.03076171875,
0.01837158203125,
-0.03961181640625,
-0.0296630859375,
-0.0643310546875,
0.007190704345703125,
0.00826263427734375,
-0.056793212890625,
0.03082275390625,
0.0030193328857421875,
-0.0216522216796875,
-0.00234222412109375,
0.0274658203125,
-0.04425048828125,
0.004886627197265625,
-0.032440185546875,
0.07000732421875,
-0.0635986328125,
0.06097412109375,
0.0567626953125,
-0.064208984375,
-0.0794677734375,
0.0012798309326171875,
-0.01776123046875,
-0.045623779296875,
0.04132080078125,
0.003971099853515625,
0.004787445068359375,
-0.0036373138427734375,
-0.0225677490234375,
-0.061126708984375,
0.0841064453125,
0.0249786376953125,
-0.021820068359375,
-0.019439697265625,
-0.00021147727966308594,
0.0406494140625,
-0.0009598731994628906,
0.0148162841796875,
0.0251312255859375,
0.061431884765625,
-0.00954437255859375,
-0.0826416015625,
0.01131439208984375,
-0.0433349609375,
-0.0014200210571289062,
0.0297393798828125,
-0.063232421875,
0.0626220703125,
0.00933074951171875,
-0.021026611328125,
0.007640838623046875,
0.043182373046875,
0.022613525390625,
0.0035991668701171875,
0.035797119140625,
0.06756591796875,
0.0303497314453125,
-0.038360595703125,
0.07684326171875,
-0.027374267578125,
0.042816162109375,
0.0634765625,
0.0137939453125,
0.06304931640625,
0.041534423828125,
-0.018035888671875,
0.047393798828125,
0.05584716796875,
-0.01471710205078125,
0.0199432373046875,
-0.006763458251953125,
0.0017108917236328125,
-0.01253509521484375,
-0.019683837890625,
-0.035308837890625,
0.038970947265625,
-0.0007023811340332031,
-0.004302978515625,
0.0017843246459960938,
-0.01213836669921875,
0.02313232421875,
0.00986480712890625,
-0.005512237548828125,
0.049713134765625,
-0.0131378173828125,
-0.053985595703125,
0.05584716796875,
-0.0034732818603515625,
0.0538330078125,
-0.047119140625,
0.0026111602783203125,
-0.01474761962890625,
0.00946807861328125,
-0.00490570068359375,
-0.058563232421875,
0.032440185546875,
0.0131072998046875,
-0.02020263671875,
-0.020477294921875,
0.0172271728515625,
-0.036834716796875,
-0.05621337890625,
0.0322265625,
0.035675048828125,
0.0178985595703125,
0.02093505859375,
-0.051361083984375,
0.0012969970703125,
0.0109100341796875,
-0.051971435546875,
-0.0021209716796875,
0.054168701171875,
0.00395965576171875,
0.046173095703125,
0.032135009765625,
0.0189971923828125,
0.00848388671875,
0.0009098052978515625,
0.042266845703125,
-0.06304931640625,
-0.03228759765625,
-0.063232421875,
0.0391845703125,
-0.00899505615234375,
-0.045684814453125,
0.051849365234375,
0.062225341796875,
0.0677490234375,
-0.00481414794921875,
0.0206451416015625,
-0.01479339599609375,
0.039337158203125,
-0.050872802734375,
0.054718017578125,
-0.075439453125,
0.003704071044921875,
-0.01117706298828125,
-0.0552978515625,
-0.0223388671875,
0.015106201171875,
-0.0161285400390625,
0.004207611083984375,
0.07891845703125,
0.05987548828125,
-0.0000908970832824707,
-0.0202789306640625,
-0.002147674560546875,
0.031524658203125,
0.01751708984375,
0.059417724609375,
0.020172119140625,
-0.07000732421875,
0.056671142578125,
-0.025604248046875,
0.003131866455078125,
-0.0046234130859375,
-0.05731201171875,
-0.057861328125,
-0.06097412109375,
-0.01328277587890625,
-0.03118896484375,
-0.01313018798828125,
0.07098388671875,
0.03369140625,
-0.07513427734375,
-0.023773193359375,
0.004009246826171875,
0.015838623046875,
-0.0172271728515625,
-0.020538330078125,
0.057861328125,
-0.006351470947265625,
-0.078125,
0.007137298583984375,
0.00724029541015625,
0.01076507568359375,
0.0036468505859375,
-0.006099700927734375,
-0.0587158203125,
-0.0021686553955078125,
0.02032470703125,
0.006473541259765625,
-0.06475830078125,
-0.01247406005859375,
0.01474761962890625,
-0.0247039794921875,
0.0248870849609375,
0.005523681640625,
-0.0162353515625,
0.0166473388671875,
0.05999755859375,
0.03759765625,
0.032928466796875,
-0.006732940673828125,
0.022125244140625,
-0.054443359375,
0.0302886962890625,
0.016632080078125,
0.043853759765625,
0.022918701171875,
-0.0125274658203125,
0.058746337890625,
0.0256500244140625,
-0.0268707275390625,
-0.07696533203125,
-0.0019168853759765625,
-0.08929443359375,
-0.00363922119140625,
0.073486328125,
-0.0140533447265625,
-0.03424072265625,
0.020721435546875,
-0.0194091796875,
0.040191650390625,
-0.0264434814453125,
0.0421142578125,
0.07318115234375,
0.0276031494140625,
0.01342010498046875,
-0.03509521484375,
0.0177154541015625,
0.050323486328125,
-0.06317138671875,
-0.01422119140625,
0.01352691650390625,
0.019561767578125,
0.033935546875,
0.04168701171875,
-0.031585693359375,
0.0144500732421875,
-0.0126800537109375,
0.027679443359375,
-0.011322021484375,
-0.0090484619140625,
-0.024078369140625,
0.00817108154296875,
-0.00762939453125,
-0.019805908203125
]
] |
guillaumekln/faster-whisper-tiny.en | 2023-05-12T18:56:53.000Z | [
"ctranslate2",
"audio",
"automatic-speech-recognition",
"en",
"license:mit",
"region:us"
] | automatic-speech-recognition | guillaumekln | null | null | guillaumekln/faster-whisper-tiny.en | 2 | 35,599 | ctranslate2 | 2023-03-23T10:17:41 | ---
language:
- en
tags:
- audio
- automatic-speech-recognition
license: mit
library_name: ctranslate2
---
# Whisper tiny.en model for CTranslate2
This repository contains the conversion of [openai/whisper-tiny.en](https://huggingface.co/openai/whisper-tiny.en) to the [CTranslate2](https://github.com/OpenNMT/CTranslate2) model format.
This model can be used in CTranslate2 or projects based on CTranslate2 such as [faster-whisper](https://github.com/guillaumekln/faster-whisper).
## Example
```python
from faster_whisper import WhisperModel
model = WhisperModel("tiny.en")
segments, info = model.transcribe("audio.mp3")
for segment in segments:
print("[%.2fs -> %.2fs] %s" % (segment.start, segment.end, segment.text))
```
## Conversion details
The original model was converted with the following command:
```
ct2-transformers-converter --model openai/whisper-tiny.en --output_dir faster-whisper-tiny.en \
--copy_files tokenizer.json --quantization float16
```
Note that the model weights are saved in FP16. This type can be changed when the model is loaded using the [`compute_type` option in CTranslate2](https://opennmt.net/CTranslate2/quantization.html).
## More information
**For more information about the original model, see its [model card](https://huggingface.co/openai/whisper-tiny.en).**
| 1,328 | [
[
0.0015811920166015625,
-0.0287628173828125,
0.0214385986328125,
0.0310516357421875,
-0.033721923828125,
-0.026336669921875,
-0.037628173828125,
-0.03387451171875,
0.006046295166015625,
0.04815673828125,
-0.03741455078125,
-0.0379638671875,
-0.0390625,
-0.0164642333984375,
-0.0194244384765625,
0.065185546875,
-0.011383056640625,
0.0197601318359375,
0.02838134765625,
-0.00707244873046875,
-0.0265045166015625,
-0.00832366943359375,
-0.059234619140625,
-0.019134521484375,
0.0126953125,
0.0275726318359375,
0.046295166015625,
0.0306396484375,
0.035614013671875,
0.0195159912109375,
-0.0231475830078125,
-0.006946563720703125,
-0.0167999267578125,
-0.0176239013671875,
0.01605224609375,
-0.053436279296875,
-0.049407958984375,
0.008056640625,
0.0491943359375,
0.01015472412109375,
-0.018768310546875,
0.03973388671875,
-0.01216888427734375,
0.022918701171875,
-0.031768798828125,
0.0217437744140625,
-0.044708251953125,
0.00018393993377685547,
-0.012451171875,
-0.01153564453125,
-0.04010009765625,
-0.0308074951171875,
0.03228759765625,
-0.061126708984375,
0.015777587890625,
0.0018110275268554688,
0.06451416015625,
0.02008056640625,
-0.036529541015625,
-0.027801513671875,
-0.06488037109375,
0.070556640625,
-0.05657958984375,
0.0163116455078125,
0.017974853515625,
0.0401611328125,
0.0175018310546875,
-0.08148193359375,
-0.021087646484375,
-0.006809234619140625,
0.0032787322998046875,
0.018341064453125,
-0.02886962890625,
0.021331787109375,
0.0098876953125,
0.034423828125,
-0.049530029296875,
-0.005046844482421875,
-0.05511474609375,
-0.0328369140625,
0.041595458984375,
0.01105499267578125,
0.0208282470703125,
-0.02545166015625,
-0.019866943359375,
-0.0433349609375,
-0.037933349609375,
-0.0011348724365234375,
0.032440185546875,
0.027313232421875,
-0.054595947265625,
0.047027587890625,
0.00739288330078125,
0.028717041015625,
0.006946563720703125,
-0.0235137939453125,
0.02301025390625,
-0.023529052734375,
-0.0129852294921875,
0.034271240234375,
0.052764892578125,
0.033660888671875,
0.00665283203125,
0.0204925537109375,
-0.0190582275390625,
-0.0021114349365234375,
-0.0002484321594238281,
-0.0853271484375,
-0.03350830078125,
0.0192413330078125,
-0.06732177734375,
-0.0184173583984375,
0.0004851818084716797,
-0.0261993408203125,
0.00733184814453125,
-0.01042938232421875,
0.04656982421875,
-0.033782958984375,
-0.033935546875,
0.02838134765625,
-0.035736083984375,
0.00647735595703125,
0.032806396484375,
-0.059295654296875,
0.032745361328125,
0.03955078125,
0.08428955078125,
0.00972747802734375,
-0.002307891845703125,
-0.0139617919921875,
0.01617431640625,
-0.00586700439453125,
0.041046142578125,
-0.00035452842712402344,
-0.038848876953125,
-0.0079345703125,
-0.0027980804443359375,
-0.01247406005859375,
-0.042724609375,
0.045867919921875,
-0.01357269287109375,
0.0285797119140625,
0.0237579345703125,
-0.0121612548828125,
-0.0139312744140625,
0.0012683868408203125,
-0.051513671875,
0.07196044921875,
0.0291290283203125,
-0.0579833984375,
-0.01025390625,
-0.061767578125,
-0.01033782958984375,
-0.01471710205078125,
0.03167724609375,
-0.0312347412109375,
0.0241241455078125,
-0.002498626708984375,
-0.0019893646240234375,
-0.034454345703125,
0.0168914794921875,
-0.01134490966796875,
-0.028289794921875,
0.020294189453125,
-0.036956787109375,
0.0733642578125,
0.0287628173828125,
0.00954437255859375,
0.0183563232421875,
-0.04681396484375,
0.00571441650390625,
0.001445770263671875,
-0.0291290283203125,
-0.025604248046875,
-0.01568603515625,
0.04150390625,
0.00431060791015625,
0.0266876220703125,
-0.048095703125,
0.0235443115234375,
-0.049346923828125,
0.0643310546875,
0.02899169921875,
0.00347137451171875,
0.03668212890625,
-0.030975341796875,
0.0080413818359375,
0.01439666748046875,
0.0330810546875,
0.01001739501953125,
-0.042633056640625,
-0.057586669921875,
-0.011322021484375,
0.03790283203125,
0.031402587890625,
-0.04840087890625,
0.01555633544921875,
-0.029296875,
-0.06756591796875,
-0.07568359375,
-0.0305633544921875,
0.017364501953125,
0.01399993896484375,
0.037628173828125,
-0.0032482147216796875,
-0.058837890625,
-0.06634521484375,
-0.00860595703125,
-0.031890869140625,
-0.0215911865234375,
0.01520538330078125,
0.04815673828125,
-0.01248931884765625,
0.05096435546875,
-0.044769287109375,
-0.0386962890625,
-0.017578125,
0.0255279541015625,
0.0187530517578125,
0.0635986328125,
0.046356201171875,
-0.056304931640625,
-0.0252532958984375,
-0.01074981689453125,
-0.020477294921875,
0.004428863525390625,
-0.01045989990234375,
-0.00044536590576171875,
-0.002277374267578125,
0.004611968994140625,
-0.055877685546875,
0.033111572265625,
0.051605224609375,
-0.02899169921875,
0.034759521484375,
-0.003528594970703125,
-0.0032215118408203125,
-0.0919189453125,
0.00943756103515625,
0.0021648406982421875,
-0.01282501220703125,
-0.04241943359375,
0.00482940673828125,
0.019073486328125,
0.0038127899169921875,
-0.0633544921875,
0.055145263671875,
-0.011810302734375,
-0.002376556396484375,
-0.005523681640625,
-0.01006317138671875,
-0.0033054351806640625,
0.0199127197265625,
0.0285186767578125,
0.059234619140625,
0.027984619140625,
-0.029052734375,
0.00766754150390625,
0.046356201171875,
-0.0200347900390625,
0.01035308837890625,
-0.07470703125,
0.009918212890625,
0.0194854736328125,
0.0262451171875,
-0.041168212890625,
-0.004119873046875,
0.0200653076171875,
-0.053985595703125,
0.01531982421875,
-0.0462646484375,
-0.04766845703125,
-0.0240020751953125,
-0.03863525390625,
0.03692626953125,
0.047454833984375,
-0.0312042236328125,
0.04693603515625,
0.0182037353515625,
0.00440216064453125,
0.005031585693359375,
-0.08233642578125,
-0.0095672607421875,
-0.0149078369140625,
-0.060577392578125,
0.050079345703125,
-0.020599365234375,
-0.01276397705078125,
-0.007965087890625,
-0.003589630126953125,
-0.021484375,
-0.0111083984375,
0.0280303955078125,
0.0218353271484375,
-0.032470703125,
-0.016265869140625,
0.030426025390625,
-0.026092529296875,
0.0027179718017578125,
-0.0379638671875,
0.046356201171875,
-0.016204833984375,
0.004734039306640625,
-0.05548095703125,
0.005733489990234375,
0.03668212890625,
-0.01340484619140625,
0.035552978515625,
0.05487060546875,
-0.0301361083984375,
-0.0139923095703125,
-0.0269622802734375,
-0.0231475830078125,
-0.03692626953125,
0.0131683349609375,
-0.0273590087890625,
-0.058837890625,
0.036712646484375,
0.00875091552734375,
-0.0013685226440429688,
0.060821533203125,
0.0391845703125,
0.0009551048278808594,
0.08380126953125,
0.03955078125,
0.0239715576171875,
0.031646728515625,
-0.054901123046875,
-0.01322174072265625,
-0.083984375,
-0.02044677734375,
-0.052947998046875,
-0.0173492431640625,
-0.032318115234375,
-0.026092529296875,
0.041595458984375,
0.0140380859375,
-0.033721923828125,
0.04937744140625,
-0.054290771484375,
0.003631591796875,
0.033782958984375,
0.0138702392578125,
0.0219268798828125,
-0.0009765625,
0.01314544677734375,
-0.017974853515625,
-0.033477783203125,
-0.031768798828125,
0.07958984375,
0.04315185546875,
0.05377197265625,
0.0277862548828125,
0.049346923828125,
0.01171112060546875,
0.007762908935546875,
-0.06787109375,
0.02166748046875,
-0.0181732177734375,
-0.04315185546875,
-0.005146026611328125,
-0.0232391357421875,
-0.0509033203125,
0.0059661865234375,
0.0020580291748046875,
-0.0545654296875,
0.0082244873046875,
0.0020503997802734375,
-0.018280029296875,
0.026336669921875,
-0.043182373046875,
0.062469482421875,
0.00860595703125,
0.02203369140625,
-0.0158233642578125,
-0.030853271484375,
0.03778076171875,
0.00662994384765625,
-0.0167236328125,
0.004467010498046875,
-0.0093994140625,
0.08148193359375,
-0.053436279296875,
0.06292724609375,
-0.02911376953125,
-0.007511138916015625,
0.05377197265625,
0.0176849365234375,
0.03338623046875,
0.0181427001953125,
-0.01409912109375,
0.038665771484375,
0.0335693359375,
-0.0015211105346679688,
-0.023895263671875,
0.04315185546875,
-0.092041015625,
-0.00693511962890625,
-0.0181121826171875,
-0.038116455078125,
0.02685546875,
0.0118255615234375,
0.0391845703125,
0.041259765625,
-0.0033054351806640625,
0.01323699951171875,
0.045318603515625,
0.006378173828125,
0.036346435546875,
0.0545654296875,
-0.00479888916015625,
-0.0517578125,
0.058135986328125,
0.0163421630859375,
0.0199127197265625,
0.032501220703125,
0.0288543701171875,
-0.036285400390625,
-0.0679931640625,
-0.035919189453125,
0.01025390625,
-0.043487548828125,
-0.03070068359375,
-0.045379638671875,
-0.035186767578125,
-0.036834716796875,
0.0072784423828125,
-0.05303955078125,
-0.059967041015625,
-0.043914794921875,
0.02130126953125,
0.0501708984375,
0.033935546875,
-0.0101318359375,
0.052581787109375,
-0.0806884765625,
0.0192413330078125,
0.0012865066528320312,
0.00830841064453125,
0.00936126708984375,
-0.073974609375,
-0.013519287109375,
0.01099395751953125,
-0.02301025390625,
-0.056854248046875,
0.036834716796875,
0.00904083251953125,
0.006591796875,
0.01514434814453125,
0.0109710693359375,
0.054168701171875,
-0.0132904052734375,
0.07440185546875,
0.0226287841796875,
-0.0850830078125,
0.055419921875,
-0.03302001953125,
0.012725830078125,
0.039764404296875,
0.0024585723876953125,
-0.043792724609375,
-0.005046844482421875,
-0.051300048828125,
-0.054290771484375,
0.06927490234375,
0.033416748046875,
-0.008026123046875,
0.0146331787109375,
0.010040283203125,
0.008819580078125,
0.01268768310546875,
-0.054290771484375,
-0.022064208984375,
-0.037200927734375,
-0.034332275390625,
0.0162200927734375,
-0.0201873779296875,
-0.005062103271484375,
-0.0213165283203125,
0.055816650390625,
-0.020751953125,
0.03643798828125,
0.031982421875,
-0.015655517578125,
-0.008331298828125,
0.003597259521484375,
0.05133056640625,
0.0011920928955078125,
-0.03814697265625,
-0.007389068603515625,
0.0129852294921875,
-0.059112548828125,
-0.004344940185546875,
0.00005453824996948242,
-0.031463623046875,
0.0167083740234375,
0.029083251953125,
0.0560302734375,
0.0297393798828125,
-0.0192108154296875,
0.05120849609375,
-0.015350341796875,
-0.0256805419921875,
-0.055419921875,
0.00383758544921875,
0.01181793212890625,
0.0163726806640625,
0.0192108154296875,
0.0255126953125,
0.018463134765625,
-0.0256500244140625,
-0.01531219482421875,
0.00830841064453125,
-0.037689208984375,
-0.041473388671875,
0.06280517578125,
0.0138702392578125,
-0.0250244140625,
0.043914794921875,
-0.003101348876953125,
-0.00890350341796875,
0.04559326171875,
0.055023193359375,
0.0892333984375,
-0.0020008087158203125,
0.0101165771484375,
0.0477294921875,
0.0306243896484375,
-0.0160064697265625,
0.05267333984375,
-0.01322174072265625,
-0.028167724609375,
-0.0198211669921875,
-0.058258056640625,
-0.0222015380859375,
0.0018701553344726562,
-0.07080078125,
0.0250396728515625,
-0.039886474609375,
-0.016082763671875,
0.008026123046875,
0.01044464111328125,
-0.052886962890625,
0.00001996755599975586,
0.01486968994140625,
0.1097412109375,
-0.05035400390625,
0.09051513671875,
0.041015625,
-0.03369140625,
-0.07147216796875,
-0.0180816650390625,
0.0034046173095703125,
-0.052947998046875,
0.0408935546875,
0.01068878173828125,
0.00461578369140625,
0.006259918212890625,
-0.054901123046875,
-0.0733642578125,
0.10150146484375,
0.004390716552734375,
-0.0435791015625,
-0.0141754150390625,
0.011383056640625,
0.036956787109375,
-0.039215087890625,
0.048095703125,
0.03607177734375,
0.06170654296875,
-0.0003199577331542969,
-0.0933837890625,
-0.0010356903076171875,
-0.01131439208984375,
0.0243377685546875,
0.005779266357421875,
-0.0694580078125,
0.087646484375,
-0.016143798828125,
-0.01349639892578125,
0.061614990234375,
0.0501708984375,
0.0182647705078125,
0.022705078125,
0.027313232421875,
0.03192138671875,
0.031829833984375,
-0.0275115966796875,
0.045135498046875,
-0.0182342529296875,
0.04205322265625,
0.0706787109375,
-0.01187896728515625,
0.07525634765625,
0.03033447265625,
-0.0044097900390625,
0.049530029296875,
0.03826904296875,
-0.0220947265625,
0.046783447265625,
-0.005817413330078125,
0.0011415481567382812,
-0.0016155242919921875,
-0.006011962890625,
-0.031951904296875,
0.0517578125,
0.037353515625,
-0.0237579345703125,
-0.00926971435546875,
0.00014102458953857422,
0.001819610595703125,
-0.0150299072265625,
-0.036529541015625,
0.05267333984375,
-0.0003521442413330078,
-0.024383544921875,
0.0477294921875,
0.033172607421875,
0.06768798828125,
-0.064697265625,
-0.003978729248046875,
0.01531982421875,
0.0146942138671875,
-0.01812744140625,
-0.054168701171875,
0.03485107421875,
-0.012725830078125,
-0.0239715576171875,
-0.0015048980712890625,
0.0496826171875,
-0.043975830078125,
-0.032379150390625,
0.02178955078125,
0.015655517578125,
0.0243377685546875,
-0.01611328125,
-0.059844970703125,
0.0301055908203125,
0.0186309814453125,
-0.0224456787109375,
0.0185699462890625,
0.0008378028869628906,
0.0008549690246582031,
0.03302001953125,
0.0572509765625,
0.01313018798828125,
0.00036144256591796875,
0.0032749176025390625,
0.046661376953125,
-0.0482177734375,
-0.05517578125,
-0.0294952392578125,
0.044281005859375,
-0.0010709762573242188,
-0.048248291015625,
0.042999267578125,
0.0626220703125,
0.0467529296875,
-0.0257720947265625,
0.045501708984375,
-0.00372314453125,
0.0243377685546875,
-0.05718994140625,
0.06219482421875,
-0.032257080078125,
-0.006839752197265625,
0.0025234222412109375,
-0.055023193359375,
0.00046539306640625,
0.02728271484375,
-0.003551483154296875,
-0.0175628662109375,
0.04827880859375,
0.06732177734375,
-0.01291656494140625,
0.021575927734375,
0.002506256103515625,
0.033843994140625,
0.018890380859375,
0.045562744140625,
0.041778564453125,
-0.07708740234375,
0.0546875,
-0.0297088623046875,
0.00354766845703125,
-0.007762908935546875,
-0.04461669921875,
-0.0660400390625,
-0.0406494140625,
-0.036346435546875,
-0.0435791015625,
-0.01096343994140625,
0.0711669921875,
0.0660400390625,
-0.05389404296875,
-0.022308349609375,
0.007213592529296875,
-0.00412750244140625,
-0.0006155967712402344,
-0.02154541015625,
0.0382080078125,
0.02899169921875,
-0.05865478515625,
0.040283203125,
0.00811767578125,
0.03961181640625,
-0.0198822021484375,
-0.031097412109375,
0.0239410400390625,
0.00506591796875,
0.0262451171875,
0.01021575927734375,
-0.0626220703125,
-0.01849365234375,
-0.0234832763671875,
-0.00519561767578125,
0.01181793212890625,
0.051788330078125,
-0.05364990234375,
0.01050567626953125,
0.042999267578125,
-0.021514892578125,
0.051025390625,
-0.0311737060546875,
0.01068878173828125,
-0.0489501953125,
0.04248046875,
0.020721435546875,
0.0291900634765625,
0.01200103759765625,
-0.005161285400390625,
0.0309600830078125,
0.014312744140625,
-0.0230255126953125,
-0.07073974609375,
-0.0040283203125,
-0.098388671875,
-0.004177093505859375,
0.07952880859375,
0.0040283203125,
-0.030487060546875,
0.0165863037109375,
-0.04052734375,
0.024566650390625,
-0.047271728515625,
0.0175628662109375,
0.005657196044921875,
0.0295562744140625,
-0.0019483566284179688,
-0.03546142578125,
0.033233642578125,
-0.009124755859375,
-0.02783203125,
0.00087738037109375,
0.005634307861328125,
0.035064697265625,
0.0311737060546875,
0.0419921875,
-0.031585693359375,
0.0322265625,
0.0276947021484375,
0.0308837890625,
-0.02392578125,
-0.03240966796875,
-0.029205322265625,
0.00022554397583007812,
-0.00017344951629638672,
-0.0233306884765625
]
] |
bigcode/starpii | 2023-07-24T09:43:04.000Z | [
"transformers",
"pytorch",
"bert",
"token-classification",
"code",
"dataset:bigcode/pii-annotated-toloka-donwsample-emails",
"dataset:bigcode/pseudo-labeled-python-data-pii-detection-filtered",
"arxiv:2301.03988",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | token-classification | bigcode | null | null | bigcode/starpii | 76 | 35,590 | transformers | 2023-04-23T16:12:27 | ---
datasets:
- bigcode/pii-annotated-toloka-donwsample-emails
- bigcode/pseudo-labeled-python-data-pii-detection-filtered
metrics:
- f1
pipeline_tag: token-classification
language:
- code
extra_gated_prompt: >-
## Terms of Use for the model
This is an NER model trained to detect Personally Identifiable Information (PII)
in code datasets. We ask that you read and agree to the following Terms of Use
before using the model:
1. You agree that you will not use the model for any purpose other than PII
detection for the purpose of removing PII from datasets.
2. You agree that you will not share the model or any modified versions for
whatever purpose.
3. Unless required by applicable law or agreed to in writing, the model is
provided on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND,
either express or implied, including, without limitation, any warranties or
conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using the model, and assume any risks associated with your
exercise of permissions under these Terms of Use.
4. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE MODEL OR THE USE OR
OTHER DEALINGS IN THE MODEL.
extra_gated_fields:
Email: text
I have read the License and agree with its terms: checkbox
---
# StarPII
## Model description
This is an NER model trained to detect Personally Identifiable Information (PII) in code datasets. We fine-tuned [bigcode-encoder](https://huggingface.co/bigcode/bigcode-encoder)
on a PII dataset we annotated, available with gated access at [bigcode-pii-dataset](https://huggingface.co/datasets/bigcode/pii-annotated-toloka-donwsample-emails) (see [bigcode-pii-dataset-training](https://huggingface.co/datasets/bigcode/bigcode-pii-dataset-training) for the exact data splits).
We added a linear layer as a token classification head on top of the encoder model, with 6 target classes: Names, Emails, Keys, Passwords, IP addresses and Usernames.
## Dataset
### Fine-tuning on the annotated dataset
The fine-tuning dataset contains 20,961 secrets across 31 programming languages, while the base encoder model was pre-trained on 88
programming languages from [The Stack](https://huggingface.co/datasets/bigcode/the-stack) dataset.
### Initial training on a pseudo-labelled dataset
To enhance model performance on some rare PII entities like keys, we initially trained on a pseudo-labeled dataset before fine-tuning on the annotated dataset.
The method involves training a model on a small set of labeled data and subsequently generating predictions for a larger set of unlabeled data.
Specifically, we annotated 18,000 files available at [bigcode-pii-pseudo-labeled](https://huggingface.co/datasets/bigcode/pseudo-labeled-python-data-pii-detection-filtered)
using an ensemble of two encoder models, [Deberta-v3-large](https://huggingface.co/microsoft/deberta-v3-large) and [stanford-deidentifier-base](https://huggingface.co/StanfordAIMI/stanford-deidentifier-base),
which were fine-tuned on an internal, previously labeled PII [dataset](https://huggingface.co/datasets/bigcode/pii-for-code) for code, containing 400 files from this [work](https://arxiv.org/abs/2301.03988).
To select good-quality pseudo-labels, we averaged the probability logits of the two models and filtered predictions based on a minimum score.
After inspection, we observed a high rate of false positives for Keys and Passwords, hence we retained only the entities that had a trigger word like `key`, `auth` and `pwd` in the surrounding context.
Training on this synthetic dataset prior to fine-tuning on the annotated one yielded superior results for all PII categories,
as demonstrated in the table in the following section.
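The pseudo-label selection described above can be sketched as follows. This is an illustrative simplification, not the authors' exact code: the entity format, the `0.9` threshold, and the trigger-word list are assumptions for the example.

```python
def select_pseudo_labels(entities, min_score=0.9, triggers=("key", "auth", "pwd")):
    """Filter ensemble predictions for pseudo-labeling.

    `entities` is a list of dicts with keys: 'label', 'score_a' and 'score_b'
    (each model's probability for the entity) and 'context' (surrounding text).
    """
    kept = []
    for e in entities:
        avg = (e["score_a"] + e["score_b"]) / 2
        if avg < min_score:
            continue  # drop low-confidence pseudo-labels
        if e["label"] in ("KEY", "PASSWORD"):
            ctx = e["context"].lower()
            if not any(t in ctx for t in triggers):
                continue  # drop Key/Password hits without a trigger word nearby
        kept.append(e)
    return kept


preds = [
    {"label": "EMAIL", "score_a": 0.95, "score_b": 0.97, "context": "contact me at"},
    {"label": "KEY", "score_a": 0.96, "score_b": 0.94, "context": "api_key ="},
    {"label": "KEY", "score_a": 0.99, "score_b": 0.98, "context": "x = 'abc123'"},
    {"label": "NAME", "score_a": 0.60, "score_b": 0.70, "context": "written by"},
]
print([e["label"] for e in select_pseudo_labels(preds)])  # ['EMAIL', 'KEY']
```

Note how the second `KEY` candidate is discarded despite a high average score, because no trigger word appears in its context, mirroring the false-positive filtering described above.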
### Performance
This model is represented in the last row (NER + pseudo labels).
- Emails, IP addresses and Keys
| Method | Email address | | | IP address | | | Key | | |
| ------------------ | -------------- | ---- | ---- | ---------- | ---- | ---- | ----- | ---- | ---- |
| | Prec. | Recall | F1 | Prec. | Recall | F1 | Prec. | Recall | F1 |
| Regex | 69.8% | 98.8% | 81.8% | 65.9% | 78% | 71.7% | 2.8% | 46.9% | 5.3% |
| NER | 94.01% | 98.10% | 96.01% | 88.95% | *94.43%* | 91.61% | 60.37% | 53.38% | 56.66% |
| + pseudo labels | **97.73%** | **98.94%** | **98.15%** | **90.10%** | 93.86% | **91.94%** | **62.38%** | **80.81%** | **70.41%** |
- Names, Usernames and Passwords
| Method | Name | | | Username | | | Password | | |
| ------------------ | -------- | ---- | ---- | -------- | ---- | ---- | -------- | ---- | ---- |
| | Prec. | Recall | F1 | Prec. | Recall | F1 | Prec. | Recall | F1 |
| NER | 83.66% | 95.52% | 89.19% | 48.93% | *75.55%* | 59.39% | 59.16% | *96.62%* | 73.39%|
| + pseudo labels | **86.45%** | **97.38%** | **91.59%** | **52.20%** | 74.81% | **61.49%** | **70.94%** | 95.96% | **81.57%** |
We used this model to mask PII in the BigCode large model training. We dropped usernames since they resulted in many false positives and negatives.
For the other PII types, we added the following post-processing that we recommend for future uses of the model (the code is also available on GitHub):
- Ignore secrets with less than 4 characters.
- Detect full names only.
- Ignore detected keys with less than 9 characters or that are not gibberish using a [gibberish-detector](https://github.com/domanchi/gibberish-detector).
- Ignore IP addresses that aren't valid or are private (non-internet-facing) using the `ipaddress` Python package. We also ignore IP addresses from popular DNS servers.
We use the same list as in this [paper](https://huggingface.co/bigcode/santacoder).
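The IP-address rule above can be sketched with the standard-library `ipaddress` module. The DNS-server set below is a small illustrative subset, not the actual list used in the SantaCoder work:

```python
import ipaddress

POPULAR_DNS = {"8.8.8.8", "8.8.4.4", "1.1.1.1"}  # assumption: illustrative subset


def should_mask_ip(candidate: str) -> bool:
    """Return True if a detected IP address should be masked: it parses as a
    valid address, is internet-facing (not private, loopback, link-local, or
    reserved), and is not a well-known public DNS server."""
    try:
        ip = ipaddress.ip_address(candidate)
    except ValueError:
        return False  # not a valid IPv4/IPv6 address
    if ip.is_private or ip.is_loopback or ip.is_link_local or ip.is_reserved:
        return False
    return candidate not in POPULAR_DNS


print(should_mask_ip("192.168.0.1"))    # False: private address
print(should_mask_ip("8.8.8.8"))        # False: popular DNS server
print(should_mask_ip("93.184.216.34"))  # True: valid public address
```

Dropping invalid and private addresses this way avoids masking placeholder values like `127.0.0.1` that carry no privacy risk.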
# Considerations for Using the Model
While using this model, please be aware that there may be potential risks associated with its application.
There is a possibility of false positives and negatives, which could lead to unintended consequences when processing sensitive data.
Moreover, the model's performance may vary across different data types and programming languages, necessitating validation and fine-tuning for specific use cases.
Researchers and developers are expected to uphold ethical standards and data protection measures when using the model. By making it openly accessible,
our aim is to encourage the development of privacy-preserving AI technologies while remaining vigilant of potential risks associated with PII. | 6,823 | [
[
-0.041534423828125,
-0.05157470703125,
0.004917144775390625,
-0.0033969879150390625,
-0.0007028579711914062,
-0.0007891654968261719,
-0.00894927978515625,
-0.043121337890625,
0.015838623046875,
0.0341796875,
-0.034820556640625,
-0.04742431640625,
-0.0504150390625,
0.006832122802734375,
-0.01398468017578125,
0.0833740234375,
0.0260467529296875,
-0.00540924072265625,
0.0010509490966796875,
0.01181793212890625,
-0.031463623046875,
-0.062347412109375,
-0.04644775390625,
-0.012420654296875,
0.02490234375,
0.024749755859375,
0.039825439453125,
0.0306396484375,
0.046356201171875,
0.0177459716796875,
-0.0214996337890625,
0.003582000732421875,
-0.0328369140625,
-0.0186004638671875,
-0.010467529296875,
-0.0264129638671875,
-0.03497314453125,
-0.00890350341796875,
0.02569580078125,
0.0128936767578125,
0.005863189697265625,
0.041046142578125,
-0.01116943359375,
0.04931640625,
-0.06884765625,
0.00754547119140625,
-0.042724609375,
-0.004352569580078125,
-0.0157470703125,
0.0033245086669921875,
-0.024566650390625,
-0.01262664794921875,
-0.0131988525390625,
-0.02362060546875,
0.0205841064453125,
0.0164642333984375,
0.0751953125,
0.0200042724609375,
-0.01410675048828125,
-0.01418304443359375,
-0.0452880859375,
0.037353515625,
-0.051177978515625,
0.0233001708984375,
0.04949951171875,
0.0199127197265625,
-0.001983642578125,
-0.041961669921875,
-0.0628662109375,
-0.0018320083618164062,
-0.014862060546875,
0.001140594482421875,
-0.02520751953125,
-0.00152587890625,
0.050506591796875,
0.0308990478515625,
-0.05548095703125,
0.021484375,
-0.046783447265625,
-0.02191162109375,
0.07135009765625,
0.0026721954345703125,
0.01326751708984375,
-0.007228851318359375,
-0.0166473388671875,
-0.0204620361328125,
-0.038238525390625,
0.0249786376953125,
0.03741455078125,
0.00960540771484375,
-0.00815582275390625,
0.03839111328125,
-0.024261474609375,
0.04833984375,
0.016021728515625,
-0.022796630859375,
0.04022216796875,
-0.02557373046875,
-0.02880859375,
0.0218963623046875,
0.07305908203125,
0.021759033203125,
0.011932373046875,
0.005687713623046875,
-0.023345947265625,
-0.00420379638671875,
0.0135040283203125,
-0.08087158203125,
-0.034820556640625,
0.035797119140625,
-0.041351318359375,
-0.024658203125,
0.0284271240234375,
-0.06219482421875,
-0.01434326171875,
-0.0224151611328125,
0.0229949951171875,
-0.04449462890625,
-0.020965576171875,
0.0003542900085449219,
-0.0016088485717773438,
0.0207366943359375,
0.01300811767578125,
-0.06939697265625,
0.01441192626953125,
0.065673828125,
0.056396484375,
0.007724761962890625,
-0.01312255859375,
-0.029205322265625,
-0.0003523826599121094,
-0.022125244140625,
0.0278167724609375,
-0.033843994140625,
-0.010406494140625,
-0.016754150390625,
0.015350341796875,
-0.028656005859375,
-0.034423828125,
0.0245513916015625,
-0.047760009765625,
0.025360107421875,
0.004253387451171875,
-0.042327880859375,
-0.027740478515625,
0.02880859375,
-0.047515869140625,
0.06439208984375,
0.0033855438232421875,
-0.06536865234375,
0.0212554931640625,
-0.0745849609375,
-0.017486572265625,
0.01117706298828125,
0.01197052001953125,
-0.064208984375,
-0.001338958740234375,
0.02093505859375,
0.05633544921875,
-0.016387939453125,
0.041839599609375,
-0.028656005859375,
-0.039764404296875,
0.0330810546875,
-0.0189361572265625,
0.07269287109375,
0.028228759765625,
-0.04925537109375,
0.0012054443359375,
-0.0640869140625,
0.01039886474609375,
0.0164642333984375,
-0.00043272972106933594,
-0.016448974609375,
-0.027984619140625,
0.0033016204833984375,
0.018798828125,
0.0113525390625,
-0.04754638671875,
0.0182037353515625,
-0.021453857421875,
0.036529541015625,
0.037750244140625,
0.0167083740234375,
0.007541656494140625,
-0.0212249755859375,
0.0296173095703125,
0.01464080810546875,
0.02606201171875,
-0.0272369384765625,
-0.05792236328125,
-0.040435791015625,
-0.046600341796875,
0.03497314453125,
0.04705810546875,
-0.030548095703125,
0.06451416015625,
-0.026153564453125,
-0.045989990234375,
-0.042572021484375,
0.0006146430969238281,
0.05157470703125,
0.02716064453125,
0.022552490234375,
-0.0032520294189453125,
-0.027801513671875,
-0.08416748046875,
0.004749298095703125,
-0.0091552734375,
-0.00304412841796875,
0.025787353515625,
0.07635498046875,
-0.0367431640625,
0.06243896484375,
-0.039276123046875,
-0.01654052734375,
-0.00563812255859375,
-0.00618743896484375,
0.040191650390625,
0.05548095703125,
0.052032470703125,
-0.081787109375,
-0.037200927734375,
-0.00240325927734375,
-0.0413818359375,
0.018157958984375,
-0.00916290283203125,
-0.0007762908935546875,
0.021514892578125,
0.02081298828125,
-0.040740966796875,
0.0560302734375,
0.0377197265625,
-0.0416259765625,
0.040008544921875,
-0.00833892822265625,
0.0182647705078125,
-0.08111572265625,
0.049591064453125,
0.0039520263671875,
-0.004337310791015625,
-0.06439208984375,
0.00890350341796875,
0.0262603759765625,
-0.030517578125,
-0.0335693359375,
0.0384521484375,
-0.03363037109375,
0.005908966064453125,
-0.004955291748046875,
-0.04302978515625,
-0.00885009765625,
0.0560302734375,
0.00864410400390625,
0.07928466796875,
0.04974365234375,
-0.045196533203125,
0.0084381103515625,
0.0180206298828125,
-0.0299072265625,
0.0278167724609375,
-0.056396484375,
0.0169219970703125,
-0.00551605224609375,
0.007503509521484375,
-0.06787109375,
-0.033294677734375,
0.055877685546875,
-0.042877197265625,
0.0323486328125,
-0.02545166015625,
-0.043121337890625,
-0.046661376953125,
-0.014007568359375,
0.0276641845703125,
0.040435791015625,
-0.05340576171875,
0.034759521484375,
0.02325439453125,
0.0034236907958984375,
-0.0338134765625,
-0.04315185546875,
0.003688812255859375,
-0.0236053466796875,
-0.042724609375,
0.03143310546875,
-0.0101318359375,
-0.006168365478515625,
-0.0125732421875,
0.0128631591796875,
-0.02410888671875,
0.0177001953125,
0.041290283203125,
0.02410888671875,
-0.00923919677734375,
-0.002391815185546875,
-0.01473236083984375,
-0.003467559814453125,
-0.003925323486328125,
-0.010833740234375,
0.050048828125,
-0.0193023681640625,
-0.0299835205078125,
-0.0626220703125,
0.0190277099609375,
0.04205322265625,
-0.0306396484375,
0.07965087890625,
0.042633056640625,
-0.040679931640625,
0.01076507568359375,
-0.0543212890625,
-0.01256561279296875,
-0.033355712890625,
0.0174560546875,
-0.0374755859375,
-0.057830810546875,
0.040771484375,
0.012359619140625,
-0.00885009765625,
0.03302001953125,
0.0307769775390625,
0.0066375732421875,
0.06475830078125,
0.03369140625,
-0.03204345703125,
0.0308837890625,
-0.042449951171875,
0.038909912109375,
-0.059356689453125,
-0.0333251953125,
-0.042266845703125,
-0.0180511474609375,
-0.050140380859375,
-0.01110076904296875,
0.01433563232421875,
0.02203369140625,
-0.039520263671875,
0.0293121337890625,
-0.06048583984375,
0.02813720703125,
0.0601806640625,
0.0293121337890625,
-0.004425048828125,
-0.003704071044921875,
0.0238037109375,
0.008636474609375,
-0.0545654296875,
-0.031036376953125,
0.09893798828125,
0.037445068359375,
0.02850341796875,
0.005687713623046875,
0.055938720703125,
0.01421356201171875,
0.013427734375,
-0.035614013671875,
0.024566650390625,
-0.0029354095458984375,
-0.059967041015625,
-0.0163116455078125,
-0.0295257568359375,
-0.0711669921875,
0.0088348388671875,
-0.0257415771484375,
-0.080078125,
0.03729248046875,
0.00988006591796875,
-0.036529541015625,
0.034088134765625,
-0.065673828125,
0.07867431640625,
-0.02435302734375,
-0.0262451171875,
0.0006422996520996094,
-0.05029296875,
0.051177978515625,
-0.0245819091796875,
0.01027679443359375,
0.00325775146484375,
0.01497650146484375,
0.06365966796875,
-0.0352783203125,
0.068359375,
-0.0178070068359375,
0.023162841796875,
0.0215606689453125,
-0.0087890625,
0.03826904296875,
0.01129913330078125,
-0.01136016845703125,
0.056884765625,
0.0114593505859375,
-0.0233917236328125,
-0.034881591796875,
0.0462646484375,
-0.062225341796875,
-0.039642333984375,
-0.040740966796875,
-0.01190185546875,
0.0121307373046875,
0.0275421142578125,
0.03472900390625,
0.01520538330078125,
0.01271820068359375,
0.027099609375,
0.0689697265625,
-0.027099609375,
0.0296783447265625,
0.025146484375,
0.0038814544677734375,
-0.041290283203125,
0.0675048828125,
-0.00826263427734375,
0.007152557373046875,
0.0161590576171875,
0.01251220703125,
-0.014251708984375,
-0.038421630859375,
-0.027008056640625,
0.0311126708984375,
-0.025848388671875,
-0.036407470703125,
-0.057830810546875,
-0.03033447265625,
-0.05517578125,
0.005031585693359375,
-0.03094482421875,
-0.005146026611328125,
-0.029083251953125,
0.01062774658203125,
0.034027099609375,
0.02801513671875,
0.0143890380859375,
0.033294677734375,
-0.05108642578125,
0.0167388916015625,
0.0010356903076171875,
0.0284271240234375,
-0.010467529296875,
-0.0396728515625,
-0.01039886474609375,
0.0228271484375,
-0.0088958740234375,
-0.050628662109375,
0.04742431640625,
0.0195465087890625,
0.039886474609375,
0.04144287109375,
0.0142974853515625,
0.048919677734375,
-0.00902557373046875,
0.06787109375,
0.024169921875,
-0.0596923828125,
0.060211181640625,
-0.0096435546875,
0.01465606689453125,
0.05303955078125,
0.05657958984375,
-0.0274200439453125,
-0.002658843994140625,
-0.06561279296875,
-0.055145263671875,
0.05926513671875,
0.0218353271484375,
-0.0079193115234375,
0.00846099853515625,
0.03729248046875,
-0.00897979736328125,
0.02801513671875,
-0.0574951171875,
-0.0225982666015625,
-0.0133514404296875,
-0.0083160400390625,
-0.01262664794921875,
-0.02606201171875,
-0.00876617431640625,
-0.0226898193359375,
0.06207275390625,
-0.006732940673828125,
0.037750244140625,
0.01306915283203125,
-0.0009717941284179688,
0.0209197998046875,
0.0022602081298828125,
0.04443359375,
0.04620361328125,
-0.013580322265625,
-0.0112152099609375,
-0.004291534423828125,
-0.041290283203125,
-0.006866455078125,
0.0000028014183044433594,
-0.0307159423828125,
-0.00682830810546875,
0.0318603515625,
0.074462890625,
-0.004886627197265625,
-0.042327880859375,
0.058380126953125,
-0.004779815673828125,
-0.0252532958984375,
-0.042388916015625,
0.0261993408203125,
-0.02362060546875,
0.015655517578125,
0.0228118896484375,
0.010772705078125,
0.00972747802734375,
-0.032073974609375,
0.007320404052734375,
0.027099609375,
-0.018707275390625,
-0.0230255126953125,
0.051483154296875,
-0.0086517333984375,
-0.0452880859375,
0.06707763671875,
-0.01275634765625,
-0.05670166015625,
0.05560302734375,
0.0325927734375,
0.06591796875,
-0.0294036865234375,
0.0007414817810058594,
0.059906005859375,
0.019439697265625,
-0.007781982421875,
0.0110321044921875,
-0.006603240966796875,
-0.044830322265625,
-0.00446319580078125,
-0.04315185546875,
-0.009368896484375,
0.0267486572265625,
-0.054779052734375,
0.0240325927734375,
-0.050140380859375,
-0.0172882080078125,
-0.0005078315734863281,
0.0032634735107421875,
-0.06964111328125,
0.0235443115234375,
0.017791748046875,
0.0684814453125,
-0.07220458984375,
0.0626220703125,
0.04180908203125,
-0.036529541015625,
-0.060821533203125,
-0.00213623046875,
-0.0058135986328125,
-0.050445556640625,
0.059356689453125,
0.04302978515625,
0.02935791015625,
-0.0106353759765625,
-0.04669189453125,
-0.07763671875,
0.08551025390625,
0.00848388671875,
-0.0682373046875,
0.0104522705078125,
0.01192474365234375,
0.044525146484375,
-0.0272674560546875,
0.030181884765625,
0.035858154296875,
0.0305633544921875,
0.00870513916015625,
-0.0732421875,
0.007137298583984375,
-0.04486083984375,
0.003345489501953125,
-0.00023448467254638672,
-0.047271728515625,
0.0589599609375,
-0.00482940673828125,
-0.00463104248046875,
-0.013824462890625,
0.026397705078125,
0.026702880859375,
0.04388427734375,
0.033172607421875,
0.04742431640625,
0.07025146484375,
-0.004215240478515625,
0.0810546875,
-0.052642822265625,
0.0243377685546875,
0.0709228515625,
0.0005106925964355469,
0.058135986328125,
0.02581787109375,
-0.00797271728515625,
0.01375579833984375,
0.06427001953125,
-0.0269622802734375,
0.035675048828125,
0.01157379150390625,
-0.01273345947265625,
-0.0143280029296875,
0.01280975341796875,
-0.037994384765625,
0.034698486328125,
0.022552490234375,
-0.031494140625,
0.0229339599609375,
0.02313232421875,
-0.0045013427734375,
-0.0169830322265625,
-0.0228118896484375,
0.06591796875,
-0.03009033203125,
-0.04925537109375,
0.06268310546875,
0.003200531005859375,
0.066162109375,
-0.03857421875,
0.0078582763671875,
-0.00730133056640625,
0.0186614990234375,
-0.0341796875,
-0.04498291015625,
0.0231475830078125,
0.0038280487060546875,
-0.01119232177734375,
-0.00292205810546875,
0.047943115234375,
-0.00811767578125,
-0.0438232421875,
0.00667572021484375,
-0.0016498565673828125,
0.0242919921875,
-0.03271484375,
-0.06256103515625,
0.01074981689453125,
0.012451171875,
-0.0290374755859375,
0.0222625732421875,
0.0214080810546875,
-0.01152801513671875,
0.059722900390625,
0.07196044921875,
0.001712799072265625,
-0.0007352828979492188,
-0.012786865234375,
0.08026123046875,
-0.05029296875,
-0.048095703125,
-0.06494140625,
0.04046630859375,
-0.02813720703125,
-0.0350341796875,
0.04241943359375,
0.0682373046875,
0.0723876953125,
-0.00507354736328125,
0.045654296875,
-0.0189971923828125,
0.0374755859375,
-0.032318115234375,
0.055419921875,
-0.0304107666015625,
0.0208740234375,
-0.020233154296875,
-0.068115234375,
-0.027984619140625,
0.0465087890625,
-0.037750244140625,
0.0031719207763671875,
0.0145416259765625,
0.08721923828125,
-0.0118560791015625,
0.0031108856201171875,
0.0191650390625,
-0.0013227462768554688,
0.0292816162109375,
0.05157470703125,
0.038818359375,
-0.056427001953125,
0.0185546875,
-0.0423583984375,
-0.026458740234375,
-0.0194549560546875,
-0.0433349609375,
-0.082275390625,
-0.03997802734375,
-0.049346923828125,
-0.024627685546875,
0.014923095703125,
0.053863525390625,
0.0751953125,
-0.0772705078125,
0.0009102821350097656,
0.00029587745666503906,
-0.0128021240234375,
-0.0116729736328125,
-0.01515960693359375,
0.029144287109375,
-0.0051116943359375,
-0.047607421875,
0.0073089599609375,
-0.01081085205078125,
0.00902557373046875,
-0.00820159912109375,
-0.007106781005859375,
-0.016876220703125,
-0.0244903564453125,
0.026153564453125,
0.0253753662109375,
-0.046600341796875,
-0.01355743408203125,
-0.00510406494140625,
-0.01169586181640625,
-0.00012862682342529297,
0.039031982421875,
-0.036102294921875,
0.021484375,
0.0198822021484375,
0.052490234375,
0.05169677734375,
0.0137481689453125,
0.00948333740234375,
-0.0260162353515625,
0.014801025390625,
0.01535797119140625,
0.0175628662109375,
0.00862884521484375,
-0.0660400390625,
0.042083740234375,
0.0199737548828125,
-0.0391845703125,
-0.06158447265625,
-0.0167388916015625,
-0.0601806640625,
-0.030181884765625,
0.0836181640625,
-0.0230712890625,
-0.042510986328125,
0.0067901611328125,
-0.00904083251953125,
-0.00194549560546875,
-0.031951904296875,
0.051849365234375,
0.030487060546875,
-0.00498199462890625,
-0.0167388916015625,
-0.0277862548828125,
0.026702880859375,
0.0017242431640625,
-0.04742431640625,
-0.009613037109375,
0.034759521484375,
0.0355224609375,
0.02484130859375,
0.038330078125,
-0.0005574226379394531,
0.02362060546875,
-0.00537109375,
0.023406982421875,
-0.039520263671875,
-0.031982421875,
-0.0421142578125,
0.016876220703125,
-0.0227203369140625,
-0.03277587890625
]
] |
facebook/wmt19-ru-en | 2023-03-16T20:03:21.000Z | [
"transformers",
"pytorch",
"safetensors",
"fsmt",
"text2text-generation",
"translation",
"wmt19",
"facebook",
"ru",
"en",
"dataset:wmt19",
"arxiv:1907.06616",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | translation | facebook | null | null | facebook/wmt19-ru-en | 14 | 35,521 | transformers | 2022-03-02T23:29:05 | ---
language:
- ru
- en
tags:
- translation
- wmt19
- facebook
license: apache-2.0
datasets:
- wmt19
metrics:
- bleu
thumbnail: https://huggingface.co/front/thumbnails/facebook.png
---
# FSMT
## Model description
This is a ported version of [fairseq wmt19 transformer](https://github.com/pytorch/fairseq/blob/master/examples/wmt19/README.md) for ru-en.
For more details, please see [Facebook FAIR's WMT19 News Translation Task Submission](https://arxiv.org/abs/1907.06616).
The abbreviation FSMT stands for FairSeqMachineTranslation.
All four models are available:
* [wmt19-en-ru](https://huggingface.co/facebook/wmt19-en-ru)
* [wmt19-ru-en](https://huggingface.co/facebook/wmt19-ru-en)
* [wmt19-en-de](https://huggingface.co/facebook/wmt19-en-de)
* [wmt19-de-en](https://huggingface.co/facebook/wmt19-de-en)
## Intended uses & limitations
#### How to use
```python
from transformers import FSMTForConditionalGeneration, FSMTTokenizer
mname = "facebook/wmt19-ru-en"
tokenizer = FSMTTokenizer.from_pretrained(mname)
model = FSMTForConditionalGeneration.from_pretrained(mname)
input = "Машинное обучение - это здорово, не так ли?"
input_ids = tokenizer.encode(input, return_tensors="pt")
outputs = model.generate(input_ids)
decoded = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(decoded) # Machine learning is great, isn't it?
```
#### Limitations and bias
- The original model (and this ported version) doesn't seem to handle inputs with repeated sub-phrases well; [content gets truncated](https://discuss.huggingface.co/t/issues-with-translating-inputs-containing-repeated-phrases/981)
## Training data
Pretrained weights were left identical to the original model released by fairseq. For more details, please see the [paper](https://arxiv.org/abs/1907.06616).
## Eval results
pair | fairseq | transformers
-------|---------|----------
ru-en | [41.3](http://matrix.statmt.org/matrix/output/1907?run_id=6937) | 39.20
The score is slightly below the score reported by `fairseq`, since `transformers` currently doesn't support:
- model ensemble, therefore the best performing checkpoint was ported (`model4.pt`).
- re-ranking
The score was calculated using this code:
```bash
git clone https://github.com/huggingface/transformers
cd transformers
export PAIR=ru-en
export DATA_DIR=data/$PAIR
export SAVE_DIR=data/$PAIR
export BS=8
export NUM_BEAMS=15
mkdir -p $DATA_DIR
sacrebleu -t wmt19 -l $PAIR --echo src > $DATA_DIR/val.source
sacrebleu -t wmt19 -l $PAIR --echo ref > $DATA_DIR/val.target
echo $PAIR
PYTHONPATH="src:examples/seq2seq" python examples/seq2seq/run_eval.py facebook/wmt19-$PAIR $DATA_DIR/val.source $SAVE_DIR/test_translations.txt --reference_path $DATA_DIR/val.target --score_path $SAVE_DIR/test_bleu.json --bs $BS --task translation --num_beams $NUM_BEAMS
```
note: fairseq reports using a beam of 50, so you should get a slightly higher score if you re-run with `--num_beams 50`.
## Data Sources
- [training, etc.](http://www.statmt.org/wmt19/)
- [test set](http://matrix.statmt.org/test_sets/newstest2019.tgz?1556572561)
### BibTeX entry and citation info
```bibtex
@inproceedings{...,
year={2020},
title={Facebook FAIR's WMT19 News Translation Task Submission},
author={Ng, Nathan and Yee, Kyra and Baevski, Alexei and Ott, Myle and Auli, Michael and Edunov, Sergey},
booktitle={Proc. of WMT},
}
```
## TODO
- port model ensemble (fairseq uses 4 model checkpoints)
| 3,434 | [
[
-0.0277099609375,
-0.0445556640625,
0.0252532958984375,
0.027435302734375,
-0.020233154296875,
-0.0005021095275878906,
-0.00860595703125,
-0.0246429443359375,
0.004924774169921875,
0.01000213623046875,
-0.06597900390625,
-0.0235748291015625,
-0.055267333984375,
0.01271820068359375,
-0.037109375,
0.0701904296875,
-0.019134521484375,
0.0270843505859375,
-0.0017108917236328125,
-0.0194244384765625,
-0.0152587890625,
-0.00905609130859375,
-0.0361328125,
-0.0298614501953125,
0.01038360595703125,
0.00858306884765625,
0.044097900390625,
0.020782470703125,
0.043975830078125,
0.03326416015625,
-0.0144805908203125,
-0.002727508544921875,
-0.0333251953125,
-0.01558685302734375,
-0.0194549560546875,
-0.0272674560546875,
-0.028564453125,
0.000934600830078125,
0.0477294921875,
0.043731689453125,
-0.003971099853515625,
0.04449462890625,
0.007442474365234375,
0.0360107421875,
-0.008087158203125,
0.0116424560546875,
-0.0469970703125,
0.01416778564453125,
-0.0153961181640625,
-0.004688262939453125,
-0.034698486328125,
-0.0199432373046875,
-0.0140228271484375,
-0.032867431640625,
0.0093994140625,
0.006298065185546875,
0.1051025390625,
0.0175323486328125,
-0.045654296875,
0.0246124267578125,
-0.040557861328125,
0.08001708984375,
-0.05548095703125,
0.0562744140625,
0.004039764404296875,
0.0300445556640625,
-0.0152740478515625,
-0.07965087890625,
-0.0276336669921875,
0.00873565673828125,
-0.0028839111328125,
0.0223846435546875,
-0.03662109375,
-0.016754150390625,
0.03338623046875,
0.0302276611328125,
-0.047607421875,
-0.01532745361328125,
-0.05047607421875,
-0.0396728515625,
0.04803466796875,
0.01136016845703125,
0.0062255859375,
-0.0234527587890625,
-0.039947509765625,
-0.0210723876953125,
-0.0212860107421875,
0.015167236328125,
-0.00019121170043945312,
0.0225677490234375,
-0.0095367431640625,
0.04510498046875,
-0.04296875,
0.04095458984375,
0.02667236328125,
-0.01129913330078125,
0.06390380859375,
-0.039215087890625,
-0.006832122802734375,
-0.024688720703125,
0.0826416015625,
0.034576416015625,
-0.0016717910766601562,
-0.00762939453125,
-0.0330810546875,
-0.021392822265625,
0.00730133056640625,
-0.08184814453125,
0.018035888671875,
0.006732940673828125,
-0.051483154296875,
-0.01690673828125,
0.0204620361328125,
-0.050994873046875,
0.0169830322265625,
-0.0179901123046875,
0.06866455078125,
-0.03826904296875,
-0.0022335052490234375,
-0.0003399848937988281,
-0.02276611328125,
0.0237884521484375,
0.0143890380859375,
-0.022735595703125,
0.007320404052734375,
0.0227508544921875,
0.0711669921875,
-0.007381439208984375,
-0.0311431884765625,
-0.048004150390625,
-0.01293182373046875,
-0.01519775390625,
0.03155517578125,
-0.0068359375,
-0.01415252685546875,
-0.01187896728515625,
0.0479736328125,
-0.0173187255859375,
-0.0246124267578125,
0.050537109375,
-0.03564453125,
0.04254150390625,
-0.0224151611328125,
-0.026031494140625,
-0.0136871337890625,
0.004261016845703125,
-0.0308990478515625,
0.08331298828125,
0.034393310546875,
-0.056854248046875,
0.0117340087890625,
-0.03741455078125,
-0.0380859375,
-0.01007843017578125,
0.004131317138671875,
-0.031402587890625,
0.01026153564453125,
0.00861358642578125,
0.0298919677734375,
-0.0139312744140625,
0.037109375,
-0.0152740478515625,
-0.043609619140625,
0.0230560302734375,
-0.0343017578125,
0.07037353515625,
0.037017822265625,
-0.036712646484375,
0.0189208984375,
-0.0501708984375,
0.00023353099822998047,
0.020416259765625,
-0.02801513671875,
0.0172576904296875,
-0.0158843994140625,
0.0119781494140625,
0.0469970703125,
0.0321044921875,
-0.0428466796875,
-0.0076904296875,
-0.039886474609375,
0.034454345703125,
0.06109619140625,
-0.008270263671875,
0.0299835205078125,
-0.049835205078125,
0.036346435546875,
0.01079559326171875,
0.020904541015625,
0.008941650390625,
-0.049346923828125,
-0.0501708984375,
-0.00522613525390625,
0.0198516845703125,
0.044677734375,
-0.072265625,
0.0338134765625,
-0.033966064453125,
-0.060546875,
-0.0369873046875,
-0.016845703125,
0.0232696533203125,
0.025238037109375,
0.0467529296875,
-0.023956298828125,
-0.0474853515625,
-0.0721435546875,
-0.036285400390625,
0.001567840576171875,
0.000278472900390625,
0.001728057861328125,
0.0450439453125,
-0.037322998046875,
0.05133056640625,
-0.0267181396484375,
-0.017547607421875,
-0.0186767578125,
-0.0067291259765625,
0.041534423828125,
0.052276611328125,
0.046783447265625,
-0.045745849609375,
-0.033599853515625,
-0.0079193115234375,
-0.039459228515625,
-0.007221221923828125,
-0.000030100345611572266,
-0.0301971435546875,
0.0171661376953125,
0.0269775390625,
-0.058563232421875,
0.0268096923828125,
0.034027099609375,
-0.038238525390625,
0.035186767578125,
0.018585205078125,
0.040008544921875,
-0.10906982421875,
0.01132965087890625,
-0.0017480850219726562,
-0.04168701171875,
-0.0286712646484375,
-0.005657196044921875,
0.00550079345703125,
-0.0051116943359375,
-0.04986572265625,
0.051483154296875,
-0.01494598388671875,
0.0034694671630859375,
-0.0067291259765625,
-0.003917694091796875,
0.008209228515625,
0.04449462890625,
-0.0262298583984375,
0.038818359375,
0.03326416015625,
-0.0352783203125,
0.0215606689453125,
0.0390625,
-0.0259857177734375,
0.0272369384765625,
-0.040771484375,
-0.0108184814453125,
0.0037899017333984375,
0.02655029296875,
-0.06781005859375,
-0.0131988525390625,
0.0282745361328125,
-0.054840087890625,
0.02374267578125,
-0.00927734375,
-0.037109375,
-0.04736328125,
-0.0203857421875,
0.0190887451171875,
0.056549072265625,
-0.03558349609375,
0.0391845703125,
0.011871337890625,
0.00511932373046875,
-0.045684814453125,
-0.0767822265625,
-0.0227203369140625,
-0.0232696533203125,
-0.0533447265625,
0.044281005859375,
-0.00824737548828125,
0.004833221435546875,
-0.0013113021850585938,
-0.028900146484375,
0.0034465789794921875,
0.0021419525146484375,
0.0162353515625,
0.01293182373046875,
-0.01348876953125,
-0.0026988983154296875,
0.0230865478515625,
-0.01548004150390625,
0.0021724700927734375,
-0.03253173828125,
0.05474853515625,
-0.030517578125,
-0.0173187255859375,
-0.055908203125,
0.00555419921875,
0.040069580078125,
-0.022918701171875,
0.06378173828125,
0.09063720703125,
-0.035064697265625,
0.0098724365234375,
-0.0303497314453125,
-0.0292510986328125,
-0.04071044921875,
0.04632568359375,
-0.037872314453125,
-0.07080078125,
0.04229736328125,
-0.0010213851928710938,
0.01132965087890625,
0.0704345703125,
0.05145263671875,
0.0018835067749023438,
0.0897216796875,
0.0134429931640625,
0.00508880615234375,
0.052947998046875,
-0.024749755859375,
0.0062255859375,
-0.040924072265625,
-0.0013666152954101562,
-0.03936767578125,
-0.038238525390625,
-0.056640625,
-0.043243408203125,
0.0106201171875,
0.002429962158203125,
-0.040618896484375,
0.043975830078125,
-0.034942626953125,
0.0031032562255859375,
0.0399169921875,
0.004375457763671875,
0.01953125,
-0.0021266937255859375,
-0.012054443359375,
-0.0143890380859375,
-0.042633056640625,
-0.0230560302734375,
0.0751953125,
0.0247344970703125,
0.038665771484375,
0.00353240966796875,
0.060577392578125,
-0.0009822845458984375,
0.01554107666015625,
-0.0477294921875,
0.048919677734375,
-0.0020313262939453125,
-0.05401611328125,
-0.00940704345703125,
-0.06427001953125,
-0.070556640625,
0.0321044921875,
-0.00858306884765625,
-0.0638427734375,
0.006168365478515625,
-0.0118255615234375,
-0.005733489990234375,
0.0167236328125,
-0.037841796875,
0.0863037109375,
-0.007785797119140625,
-0.0192108154296875,
-0.0129241943359375,
-0.05169677734375,
0.029815673828125,
-0.00925445556640625,
0.038482666015625,
-0.0157012939453125,
0.02020263671875,
0.07513427734375,
-0.0313720703125,
0.04119873046875,
-0.0200042724609375,
0.01270294189453125,
0.022491455078125,
0.00197601318359375,
0.04296875,
0.00017249584197998047,
-0.0165863037109375,
0.0204925537109375,
0.033203125,
-0.0308380126953125,
-0.014892578125,
0.055267333984375,
-0.05950927734375,
-0.03839111328125,
-0.036041259765625,
-0.042510986328125,
-0.01168060302734375,
0.037017822265625,
0.045318603515625,
0.036712646484375,
-0.0100555419921875,
0.0288848876953125,
0.0187835693359375,
-0.013275146484375,
0.034637451171875,
0.02838134765625,
-0.038665771484375,
-0.032012939453125,
0.06927490234375,
0.0152740478515625,
0.0217132568359375,
0.022308349609375,
0.0165863037109375,
-0.0286407470703125,
-0.024932861328125,
-0.0396728515625,
0.0199127197265625,
-0.052886962890625,
-0.03839111328125,
-0.05670166015625,
-0.0190277099609375,
-0.04425048828125,
0.010467529296875,
-0.043212890625,
-0.059722900390625,
-0.00787353515625,
-0.01213836669921875,
0.032012939453125,
0.0168609619140625,
-0.0177459716796875,
0.01580810546875,
-0.0694580078125,
0.0175018310546875,
-0.0016002655029296875,
0.01885986328125,
-0.01190948486328125,
-0.07568359375,
-0.0284576416015625,
0.0223846435546875,
-0.0516357421875,
-0.07513427734375,
0.0256500244140625,
0.006565093994140625,
0.054779052734375,
0.01294708251953125,
0.017059326171875,
0.047271728515625,
-0.03436279296875,
0.058807373046875,
0.01273345947265625,
-0.08123779296875,
0.034088134765625,
-0.025360107421875,
0.0251922607421875,
0.045654296875,
0.0238189697265625,
-0.042633056640625,
-0.044158935546875,
-0.062347412109375,
-0.064453125,
0.07989501953125,
0.032623291015625,
0.0162811279296875,
0.006679534912109375,
0.00441741943359375,
-0.0006451606750488281,
0.0172882080078125,
-0.07196044921875,
-0.01910400390625,
-0.0302276611328125,
-0.032867431640625,
0.0010547637939453125,
0.0100250244140625,
-0.011993408203125,
-0.035552978515625,
0.07763671875,
-0.004383087158203125,
0.03717041015625,
0.015716552734375,
-0.015869140625,
-0.0182952880859375,
0.015411376953125,
0.0211029052734375,
0.03759765625,
-0.0079498291015625,
-0.005504608154296875,
0.0338134765625,
-0.0180511474609375,
0.002437591552734375,
0.03656005859375,
-0.01520538330078125,
0.014678955078125,
0.0156707763671875,
0.061614990234375,
0.01317596435546875,
-0.045257568359375,
0.057708740234375,
-0.0036563873291015625,
-0.0285797119140625,
-0.0126800537109375,
-0.00476837158203125,
0.011322021484375,
0.0460205078125,
0.031646728515625,
0.023284912109375,
0.0170135498046875,
-0.029754638671875,
0.031494140625,
0.0229339599609375,
-0.05633544921875,
-0.0293426513671875,
0.067626953125,
0.0030727386474609375,
-0.01462554931640625,
0.034210205078125,
-0.0394287109375,
-0.041961669921875,
0.03863525390625,
0.040679931640625,
0.05328369140625,
-0.014892578125,
0.015716552734375,
0.060211181640625,
0.031829833984375,
-0.0244598388671875,
0.0280914306640625,
0.00894927978515625,
-0.0469970703125,
-0.036163330078125,
-0.06689453125,
0.007732391357421875,
-0.004901885986328125,
-0.0634765625,
0.034759521484375,
-0.0001690387725830078,
-0.027618408203125,
-0.0236968994140625,
-0.0025539398193359375,
-0.062042236328125,
0.01270294189453125,
-0.0096588134765625,
0.062103271484375,
-0.061309814453125,
0.05419921875,
0.060394287109375,
-0.034210205078125,
-0.0628662109375,
-0.00362396240234375,
0.0021724700927734375,
-0.0439453125,
0.0197906494140625,
0.031951904296875,
0.0208587646484375,
0.017547607421875,
-0.0305633544921875,
-0.08233642578125,
0.09832763671875,
0.018798828125,
-0.037017822265625,
0.00919342041015625,
0.00519561767578125,
0.031341552734375,
-0.0011854171752929688,
0.02203369140625,
0.0281982421875,
0.04510498046875,
0.00737762451171875,
-0.088134765625,
0.030029296875,
-0.02410888671875,
-0.0092010498046875,
0.016387939453125,
-0.06732177734375,
0.06640625,
-0.00676727294921875,
-0.0189056396484375,
0.020416259765625,
0.061309814453125,
0.034820556640625,
0.032135009765625,
0.036773681640625,
0.033355712890625,
0.040252685546875,
-0.0303497314453125,
0.058807373046875,
-0.01453399658203125,
0.06365966796875,
0.052032470703125,
0.01047515869140625,
0.048736572265625,
0.04034423828125,
-0.032958984375,
0.0289764404296875,
0.04876708984375,
-0.0169525146484375,
0.033477783203125,
0.009613037109375,
0.0217132568359375,
-0.02117919921875,
0.00489044189453125,
-0.048736572265625,
0.0292510986328125,
0.00006890296936035156,
-0.022003173828125,
-0.01555633544921875,
0.016204833984375,
0.00797271728515625,
-0.0103302001953125,
-0.005096435546875,
0.0286407470703125,
0.0228729248046875,
-0.044921875,
0.046173095703125,
0.0237884521484375,
0.054840087890625,
-0.03924560546875,
0.0228729248046875,
-0.0187530517578125,
0.02239990234375,
-0.01267242431640625,
-0.044769287109375,
0.041046142578125,
-0.005855560302734375,
-0.004192352294921875,
-0.029266357421875,
0.039886474609375,
-0.0272979736328125,
-0.053009033203125,
0.0250396728515625,
0.043853759765625,
0.013458251953125,
-0.01006317138671875,
-0.06591796875,
0.00760650634765625,
0.0243988037109375,
-0.04833984375,
0.02813720703125,
0.038360595703125,
-0.0083465576171875,
0.03070068359375,
0.04083251953125,
-0.0312347412109375,
0.0083770751953125,
0.004779815673828125,
0.058624267578125,
-0.056549072265625,
-0.024200439453125,
-0.05029296875,
0.05548095703125,
0.0011739730834960938,
-0.0245208740234375,
0.0552978515625,
0.058197021484375,
0.0682373046875,
-0.0116119384765625,
0.037841796875,
-0.02508544921875,
0.034088134765625,
-0.029205322265625,
0.057708740234375,
-0.07470703125,
-0.005451202392578125,
-0.0328369140625,
-0.0706787109375,
0.005741119384765625,
0.04803466796875,
-0.007762908935546875,
0.0175628662109375,
0.05419921875,
0.059326171875,
-0.00958251953125,
-0.005992889404296875,
0.01312255859375,
0.033538818359375,
0.0249176025390625,
0.051544189453125,
0.06683349609375,
-0.0753173828125,
0.0670166015625,
-0.034759521484375,
-0.01340484619140625,
-0.0170135498046875,
-0.0357666015625,
-0.0556640625,
-0.05517578125,
-0.029693603515625,
-0.0531005859375,
-0.02239990234375,
0.06463623046875,
0.03741455078125,
-0.05560302734375,
-0.004291534423828125,
0.00542449951171875,
0.0015850067138671875,
-0.03546142578125,
-0.022247314453125,
0.017791748046875,
-0.0213165283203125,
-0.0775146484375,
0.03369140625,
-0.0102691650390625,
0.0052337646484375,
0.0161895751953125,
-0.01413726806640625,
-0.01708984375,
-0.001392364501953125,
0.034576416015625,
-0.011260986328125,
-0.0426025390625,
-0.025360107421875,
0.0202178955078125,
-0.017578125,
-0.0014553070068359375,
0.01300811767578125,
-0.03839111328125,
0.00582122802734375,
0.055206298828125,
0.0305633544921875,
0.060455322265625,
-0.0012407302856445312,
0.0305328369140625,
-0.041961669921875,
0.0229644775390625,
0.0071563720703125,
0.04638671875,
0.0103302001953125,
-0.00553131103515625,
0.038543701171875,
0.04345703125,
-0.04150390625,
-0.07275390625,
0.006702423095703125,
-0.09063720703125,
-0.0226593017578125,
0.09039306640625,
0.00652313232421875,
-0.0154266357421875,
0.018096923828125,
-0.0157623291015625,
0.041748046875,
-0.024810791015625,
0.0380859375,
0.046661376953125,
0.01081085205078125,
0.00788116455078125,
-0.05023193359375,
0.017547607421875,
0.0367431640625,
-0.037322998046875,
-0.034393310546875,
0.019866943359375,
0.0309295654296875,
0.01409912109375,
0.047149658203125,
-0.027862548828125,
0.02301025390625,
-0.01397705078125,
0.00782012939453125,
0.00298309326171875,
0.005390167236328125,
-0.0076751708984375,
-0.00909423828125,
-0.007808685302734375,
-0.0256195068359375
]
] |
cl-tohoku/bert-base-japanese-v2 | 2021-09-23T13:45:31.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"fill-mask",
"ja",
"dataset:wikipedia",
"license:cc-by-sa-4.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | cl-tohoku | null | null | cl-tohoku/bert-base-japanese-v2 | 20 | 35,420 | transformers | 2022-03-02T23:29:05 | ---
language: ja
license: cc-by-sa-4.0
datasets:
- wikipedia
widget:
- text: 東北大学で[MASK]の研究をしています。
---
# BERT base Japanese (unidic-lite with whole word masking, jawiki-20200831)
This is a [BERT](https://github.com/google-research/bert) model pretrained on texts in the Japanese language.
This version of the model processes input texts with word-level tokenization based on the Unidic 2.1.2 dictionary (available in [unidic-lite](https://pypi.org/project/unidic-lite/) package), followed by the WordPiece subword tokenization.
Additionally, the model is trained with whole word masking enabled for the masked language modeling (MLM) objective.
The code used for pretraining is available at [cl-tohoku/bert-japanese](https://github.com/cl-tohoku/bert-japanese/tree/v2.0).
## Model architecture
The model architecture is the same as the original BERT base model; 12 layers, 768 dimensions of hidden states, and 12 attention heads.
## Training Data
The models are trained on the Japanese version of Wikipedia.
The training corpus is generated from the Wikipedia Cirrussearch dump file as of August 31, 2020.
The generated corpus files are 4.0GB in total, containing approximately 30M sentences.
We used the [MeCab](https://taku910.github.io/mecab/) morphological parser with [mecab-ipadic-NEologd](https://github.com/neologd/mecab-ipadic-neologd) dictionary to split texts into sentences.
## Tokenization
The texts are first tokenized by MeCab with the Unidic 2.1.2 dictionary and then split into subwords by the WordPiece algorithm.
The vocabulary size is 32768.
We used [`fugashi`](https://github.com/polm/fugashi) and [`unidic-lite`](https://github.com/polm/unidic-lite) packages for the tokenization.
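As an illustration of the second stage (not the exact implementation used here), WordPiece splits each MeCab-produced word greedily into the longest matching vocabulary entries, prefixing non-initial pieces with `##`. A minimal sketch with a toy vocabulary (the real model uses a 32768-entry vocabulary learned from Japanese Wikipedia):

```python
# Greedy longest-match WordPiece sketch with a toy, illustrative vocabulary.
def wordpiece(word, vocab, unk="[UNK]"):
    pieces, start = [], 0
    while start < len(word):
        end = len(word)
        cur = None
        while start < end:
            piece = word[start:end]
            if start > 0:
                piece = "##" + piece  # non-initial pieces carry the ## prefix
            if piece in vocab:
                cur = piece
                break
            end -= 1
        if cur is None:
            return [unk]  # no match at all: emit the unknown token
        pieces.append(cur)
        start = end
    return pieces

vocab = {"un", "##aff", "##able", "play", "##ing"}
print(wordpiece("unaffable", vocab))  # ['un', '##aff', '##able']
print(wordpiece("playing", vocab))    # ['play', '##ing']
```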
## Training
The models are trained with the same configuration as the original BERT; 512 tokens per instance, 256 instances per batch, and 1M training steps.
For the MLM objective, we introduced whole word masking, in which all of the subword tokens corresponding to a single word (as tokenized by MeCab) are masked at once.
To train each model, we used a v3-8 Cloud TPU instance provided by the [TensorFlow Research Cloud program](https://www.tensorflow.org/tfrc/).
The training took about 5 days to finish.
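The masking scheme described above can be sketched in plain Python (an illustrative sketch, not the actual training code; the function name and 15% masking rate are assumptions for the example): subwords carrying a `##` prefix are grouped with the preceding word, and all subwords of a selected word are replaced by `[MASK]` together.

```python
import random

def whole_word_mask(tokens, mask_prob=0.15, rng=None):
    rng = rng or random.Random(0)
    # Group subword indices into words: a "##"-prefixed token continues
    # the previous word, anything else starts a new one.
    words, cur = [], []
    for i, tok in enumerate(tokens):
        if tok.startswith("##") and cur:
            cur.append(i)
        else:
            if cur:
                words.append(cur)
            cur = [i]
    if cur:
        words.append(cur)
    # Sample per word, mask every subword of a sampled word at once.
    masked = list(tokens)
    for group in words:
        if rng.random() < mask_prob:
            for i in group:
                masked[i] = "[MASK]"
    return masked
```

By construction, a word's subwords are always masked or kept together, which is the difference from per-token masking in the original BERT.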
## Licenses
The pretrained models are distributed under the terms of the [Creative Commons Attribution-ShareAlike 3.0](https://creativecommons.org/licenses/by-sa/3.0/).
## Acknowledgments
This model is trained with Cloud TPUs provided by [TensorFlow Research Cloud](https://www.tensorflow.org/tfrc/) program.
| 2,587 | [
[
-0.02947998046875,
-0.06365966796875,
0.01329803466796875,
0.0104827880859375,
-0.053466796875,
0.004154205322265625,
-0.0233001708984375,
-0.029266357421875,
0.0379638671875,
0.04144287109375,
-0.05377197265625,
-0.041107177734375,
-0.044769287109375,
-0.0034427642822265625,
-0.01629638671875,
0.0848388671875,
0.0010824203491210938,
0.0171356201171875,
0.0192718505859375,
0.00913238525390625,
-0.0229339599609375,
-0.043365478515625,
-0.05145263671875,
-0.03070068359375,
0.0294952392578125,
0.011749267578125,
0.03485107421875,
0.029998779296875,
0.012237548828125,
0.01505279541015625,
0.0007719993591308594,
-0.002338409423828125,
-0.03936767578125,
-0.0202484130859375,
-0.00013077259063720703,
-0.028076171875,
-0.0142822265625,
0.0021514892578125,
0.04052734375,
0.0595703125,
0.0019702911376953125,
0.00626373291015625,
-0.0039043426513671875,
0.031890869140625,
-0.04083251953125,
0.0020847320556640625,
-0.05780029296875,
0.0020923614501953125,
-0.023406982421875,
0.01549530029296875,
-0.023284912109375,
0.00965118408203125,
0.01320648193359375,
-0.050933837890625,
0.0247344970703125,
0.0001271963119506836,
0.08660888671875,
0.000568389892578125,
-0.004638671875,
-0.0183563232421875,
-0.03265380859375,
0.053863525390625,
-0.07275390625,
0.030731201171875,
0.0394287109375,
-0.004436492919921875,
-0.01059722900390625,
-0.06890869140625,
-0.04815673828125,
-0.008575439453125,
0.00254058837890625,
0.01027679443359375,
-0.00533294677734375,
0.013916015625,
0.0200347900390625,
0.0186767578125,
-0.052947998046875,
0.024444580078125,
-0.03369140625,
-0.02325439453125,
0.03375244140625,
-0.00894927978515625,
0.0232696533203125,
-0.02777099609375,
-0.03643798828125,
-0.0118865966796875,
-0.038726806640625,
0.00821685791015625,
0.023895263671875,
0.0132293701171875,
-0.00537109375,
0.039093017578125,
-0.0026378631591796875,
0.03564453125,
-0.0176849365234375,
-0.02142333984375,
0.0239105224609375,
-0.0247802734375,
-0.0294189453125,
0.00911712646484375,
0.0748291015625,
0.0110931396484375,
0.030426025390625,
-0.0183868408203125,
-0.0256500244140625,
-0.001556396484375,
0.017974853515625,
-0.06707763671875,
-0.0174560546875,
0.00897979736328125,
-0.041717529296875,
-0.021942138671875,
0.006282806396484375,
-0.036102294921875,
0.00037407875061035156,
-0.0012578964233398438,
0.060211181640625,
-0.04461669921875,
-0.01763916015625,
0.0184478759765625,
-0.007781982421875,
0.01055145263671875,
-0.0004677772521972656,
-0.06671142578125,
0.01494598388671875,
0.03778076171875,
0.060516357421875,
-0.0013217926025390625,
-0.0164031982421875,
0.0010461807250976562,
0.003993988037109375,
-0.0290069580078125,
0.0310516357421875,
-0.0182342529296875,
-0.0312347412109375,
0.004390716552734375,
0.0117340087890625,
-0.019927978515625,
-0.017608642578125,
0.03302001953125,
-0.03985595703125,
0.035125732421875,
-0.00048232078552246094,
-0.062286376953125,
-0.0193328857421875,
0.01126861572265625,
-0.032684326171875,
0.08203125,
0.00785064697265625,
-0.055877685546875,
0.015655517578125,
-0.06280517578125,
-0.028900146484375,
0.026275634765625,
0.00615692138671875,
-0.033538818359375,
0.005435943603515625,
0.0184326171875,
0.032745361328125,
0.0027751922607421875,
0.0287017822265625,
-0.008941650390625,
-0.032073974609375,
0.0207061767578125,
-0.0212554931640625,
0.08740234375,
0.0142974853515625,
-0.04638671875,
0.0016193389892578125,
-0.0550537109375,
0.004528045654296875,
0.015472412109375,
-0.020660400390625,
-0.02435302734375,
-0.024139404296875,
0.0186920166015625,
0.01453399658203125,
0.03448486328125,
-0.0596923828125,
0.009490966796875,
-0.043243408203125,
0.033355712890625,
0.05352783203125,
-0.00836944580078125,
0.017913818359375,
-0.00757598876953125,
0.0291290283203125,
0.0025272369384765625,
0.0197906494140625,
-0.0231170654296875,
-0.046234130859375,
-0.08331298828125,
-0.0283203125,
0.0614013671875,
0.035369873046875,
-0.060577392578125,
0.06378173828125,
-0.0279998779296875,
-0.031707763671875,
-0.0665283203125,
0.006839752197265625,
0.035430908203125,
0.0288848876953125,
0.02044677734375,
-0.041717529296875,
-0.048492431640625,
-0.0718994140625,
0.01160430908203125,
-0.0042572021484375,
-0.016937255859375,
0.0009813308715820312,
0.054473876953125,
-0.037261962890625,
0.058746337890625,
-0.0241546630859375,
-0.034759521484375,
-0.0231170654296875,
0.02227783203125,
0.023193359375,
0.04193115234375,
0.03326416015625,
-0.0406494140625,
-0.037261962890625,
-0.02288818359375,
-0.036102294921875,
-0.004497528076171875,
-0.00316619873046875,
-0.0150146484375,
0.026885986328125,
0.04815673828125,
-0.03948974609375,
0.03369140625,
0.0406494140625,
-0.023681640625,
0.0270233154296875,
-0.014739990234375,
-0.0225067138671875,
-0.10736083984375,
0.0279083251953125,
-0.0182647705078125,
-0.0195159912109375,
-0.053924560546875,
-0.0013132095336914062,
-0.0051116943359375,
-0.01264190673828125,
-0.03167724609375,
0.047821044921875,
-0.040496826171875,
-0.00034809112548828125,
-0.0254669189453125,
0.00322723388671875,
-0.0038738250732421875,
0.060150146484375,
0.01031494140625,
0.0548095703125,
0.0377197265625,
-0.045440673828125,
0.00836181640625,
0.0038604736328125,
-0.059112548828125,
-0.00942230224609375,
-0.057037353515625,
0.0090179443359375,
-0.003131866455078125,
0.01091766357421875,
-0.0736083984375,
-0.00609588623046875,
0.02947998046875,
-0.042510986328125,
0.03955078125,
0.01421356201171875,
-0.06060791015625,
-0.0253143310546875,
-0.0328369140625,
0.007587432861328125,
0.0501708984375,
-0.033660888671875,
0.034332275390625,
0.0290679931640625,
-0.0009603500366210938,
-0.053680419921875,
-0.0623779296875,
0.0195770263671875,
0.01177978515625,
-0.0306396484375,
0.04241943359375,
-0.00875091552734375,
0.00833892822265625,
0.01351165771484375,
0.0050201416015625,
-0.0024814605712890625,
0.01519775390625,
0.00875091552734375,
0.0277252197265625,
-0.01110076904296875,
0.004390716552734375,
0.0006914138793945312,
-0.0038299560546875,
-0.0080108642578125,
-0.0169830322265625,
0.0736083984375,
0.0076446533203125,
-0.00543975830078125,
-0.030059814453125,
0.016326904296875,
0.0205535888671875,
-0.0050506591796875,
0.08148193359375,
0.0693359375,
-0.03558349609375,
-0.006992340087890625,
-0.048980712890625,
-0.0189971923828125,
-0.032318115234375,
0.039581298828125,
-0.0291748046875,
-0.073974609375,
0.040740966796875,
0.021697998046875,
0.0237579345703125,
0.0438232421875,
0.042388916015625,
-0.0149383544921875,
0.06951904296875,
0.058837890625,
-0.0261993408203125,
0.04644775390625,
-0.033966064453125,
0.022857666015625,
-0.06829833984375,
-0.030731201171875,
-0.020904541015625,
-0.0239410400390625,
-0.047210693359375,
-0.03070068359375,
0.01531982421875,
0.0230255126953125,
-0.012969970703125,
0.026641845703125,
-0.032806396484375,
0.03240966796875,
0.050872802734375,
0.0099639892578125,
0.0042572021484375,
0.0262603759765625,
-0.0316162109375,
-0.0098419189453125,
-0.055419921875,
-0.035797119140625,
0.08837890625,
0.0426025390625,
0.039031982421875,
-0.00609588623046875,
0.045196533203125,
0.00926971435546875,
0.02349853515625,
-0.05224609375,
0.042266845703125,
-0.03564453125,
-0.0765380859375,
-0.0216064453125,
-0.0211029052734375,
-0.07733154296875,
0.025787353515625,
-0.022186279296875,
-0.055938720703125,
-0.0019626617431640625,
-0.0255279541015625,
-0.00707244873046875,
0.038055419921875,
-0.04571533203125,
0.0628662109375,
-0.0227203369140625,
0.005207061767578125,
-0.0096893310546875,
-0.06689453125,
0.026885986328125,
-0.0164642333984375,
0.01235198974609375,
-0.00408172607421875,
-0.007030487060546875,
0.085693359375,
-0.033355712890625,
0.07366943359375,
-0.01244354248046875,
0.002559661865234375,
0.006999969482421875,
-0.017791748046875,
0.0089263916015625,
-0.0113372802734375,
0.01459503173828125,
0.042022705078125,
-0.01241302490234375,
-0.037689208984375,
-0.0145111083984375,
0.042388916015625,
-0.07989501953125,
-0.0241546630859375,
-0.035430908203125,
-0.027496337890625,
-0.005126953125,
0.03826904296875,
0.049591064453125,
0.0256500244140625,
-0.0162811279296875,
0.0298919677734375,
0.06854248046875,
-0.02001953125,
0.0462646484375,
0.049896240234375,
-0.025909423828125,
-0.03240966796875,
0.060943603515625,
0.01293182373046875,
0.00824737548828125,
0.041717529296875,
0.0027904510498046875,
-0.0293731689453125,
-0.0399169921875,
-0.034027099609375,
0.0379638671875,
-0.03594970703125,
-0.0034122467041015625,
-0.06390380859375,
-0.04052734375,
-0.048004150390625,
0.0034275054931640625,
-0.0265045166015625,
-0.028228759765625,
-0.0341796875,
-0.0024433135986328125,
0.003570556640625,
0.043609619140625,
0.0033092498779296875,
0.041351318359375,
-0.043212890625,
0.0186767578125,
0.01702880859375,
0.02081298828125,
-0.00800323486328125,
-0.062225341796875,
-0.0288848876953125,
0.019683837890625,
-0.01006317138671875,
-0.05072021484375,
0.026885986328125,
0.00789642333984375,
0.04925537109375,
0.0399169921875,
-0.009429931640625,
0.045806884765625,
-0.0288848876953125,
0.0714111328125,
0.02801513671875,
-0.07293701171875,
0.0325927734375,
-0.0191802978515625,
0.032257080078125,
0.04913330078125,
0.038604736328125,
-0.04425048828125,
-0.027435302734375,
-0.062744140625,
-0.06182861328125,
0.058258056640625,
0.022186279296875,
0.0258636474609375,
0.006664276123046875,
0.032073974609375,
-0.004116058349609375,
0.0251312255859375,
-0.08294677734375,
-0.03131103515625,
-0.0281829833984375,
-0.0196533203125,
-0.025848388671875,
-0.044403076171875,
0.005573272705078125,
-0.022796630859375,
0.07965087890625,
0.004486083984375,
0.0279998779296875,
0.00849151611328125,
-0.0269622802734375,
0.003528594970703125,
0.00914764404296875,
0.055694580078125,
0.0421142578125,
-0.029296875,
-0.0004935264587402344,
0.0016565322875976562,
-0.05145263671875,
-0.01629638671875,
0.01081085205078125,
-0.030426025390625,
0.040496826171875,
0.0380859375,
0.09478759765625,
0.026031494140625,
-0.050750732421875,
0.04547119140625,
-0.0020542144775390625,
-0.03265380859375,
-0.0202789306640625,
-0.003582000732421875,
0.01134490966796875,
-0.0102691650390625,
0.02984619140625,
-0.02734375,
-0.00971221923828125,
-0.039947509765625,
0.0016117095947265625,
0.0357666015625,
-0.0211944580078125,
-0.0186614990234375,
0.042083740234375,
0.01065826416015625,
-0.014892578125,
0.056396484375,
-0.01435089111328125,
-0.06341552734375,
0.03509521484375,
0.0482177734375,
0.06951904296875,
-0.0166473388671875,
0.017974853515625,
0.039306640625,
0.046417236328125,
0.005596160888671875,
-0.00004649162292480469,
-0.005100250244140625,
-0.06573486328125,
-0.025543212890625,
-0.06451416015625,
-0.01558685302734375,
0.0474853515625,
-0.03204345703125,
0.0101776123046875,
-0.04449462890625,
-0.0057525634765625,
-0.003131866455078125,
0.031402587890625,
-0.03277587890625,
0.026885986328125,
0.0172119140625,
0.06475830078125,
-0.05963134765625,
0.085693359375,
0.0546875,
-0.031768798828125,
-0.06292724609375,
-0.01078033447265625,
-0.031585693359375,
-0.0814208984375,
0.03790283203125,
0.026397705078125,
0.0117340087890625,
0.005016326904296875,
-0.046630859375,
-0.055816650390625,
0.07159423828125,
0.007282257080078125,
-0.042022705078125,
-0.008514404296875,
0.0164642333984375,
0.051116943359375,
-0.035125732421875,
0.0102386474609375,
0.02593994140625,
0.016204833984375,
0.004886627197265625,
-0.06829833984375,
-0.015838623046875,
-0.0256805419921875,
0.028900146484375,
0.006313323974609375,
-0.033538818359375,
0.06884765625,
0.006504058837890625,
-0.0193634033203125,
0.00893402099609375,
0.0369873046875,
0.01251220703125,
-0.0022525787353515625,
0.0462646484375,
0.0657958984375,
0.043701171875,
0.00560760498046875,
0.058502197265625,
-0.0294189453125,
0.030364990234375,
0.0650634765625,
0.0130615234375,
0.057708740234375,
0.0309906005859375,
-0.00372314453125,
0.05029296875,
0.0589599609375,
-0.0123138427734375,
0.0548095703125,
0.00601959228515625,
-0.017669677734375,
-0.01108551025390625,
-0.0204620361328125,
-0.027496337890625,
0.0260772705078125,
0.0284881591796875,
-0.037139892578125,
-0.01190185546875,
0.0102081298828125,
0.0238800048828125,
-0.0242156982421875,
-0.02813720703125,
0.060577392578125,
0.011749267578125,
-0.04779052734375,
0.043060302734375,
0.02978515625,
0.079345703125,
-0.07275390625,
0.0286712646484375,
-0.01678466796875,
0.0134735107421875,
0.005062103271484375,
-0.043243408203125,
0.0014944076538085938,
0.00945281982421875,
-0.01421356201171875,
-0.0218963623046875,
0.05047607421875,
-0.048583984375,
-0.0499267578125,
0.0239410400390625,
0.0153656005859375,
0.03204345703125,
0.003986358642578125,
-0.07354736328125,
0.005031585693359375,
0.022003173828125,
-0.024078369140625,
0.031982421875,
0.0120697021484375,
-0.003753662109375,
0.037200927734375,
0.06500244140625,
0.0093536376953125,
0.0175323486328125,
0.0238800048828125,
0.0596923828125,
-0.035003662109375,
-0.038726806640625,
-0.0614013671875,
0.025726318359375,
-0.00934600830078125,
-0.0293731689453125,
0.049346923828125,
0.04217529296875,
0.07354736328125,
-0.0133514404296875,
0.040283203125,
-0.00873565673828125,
0.039642333984375,
-0.043212890625,
0.05535888671875,
-0.05609130859375,
0.0039215087890625,
-0.012176513671875,
-0.0751953125,
-0.011444091796875,
0.060516357421875,
0.00817108154296875,
0.006114959716796875,
0.046661376953125,
0.05804443359375,
0.0036983489990234375,
-0.0153350830078125,
0.02532958984375,
0.01555633544921875,
0.019744873046875,
0.04205322265625,
0.03057861328125,
-0.05206298828125,
0.0294189453125,
-0.04180908203125,
-0.00981903076171875,
-0.0174102783203125,
-0.053314208984375,
-0.08978271484375,
-0.0435791015625,
-0.0173187255859375,
-0.0230865478515625,
-0.00510406494140625,
0.062225341796875,
0.052337646484375,
-0.06011962890625,
-0.0225830078125,
-0.007472991943359375,
-0.01398468017578125,
0.01593017578125,
-0.0188446044921875,
0.03265380859375,
-0.0262298583984375,
-0.05340576171875,
0.01374053955078125,
0.007045745849609375,
0.0113525390625,
-0.0295562744140625,
0.0037631988525390625,
-0.04180908203125,
-0.0009613037109375,
0.038482666015625,
0.00670623779296875,
-0.054107666015625,
-0.004375457763671875,
0.005008697509765625,
-0.0240631103515625,
-0.01404571533203125,
0.049896240234375,
-0.03662109375,
0.045166015625,
0.0231781005859375,
0.042510986328125,
0.0665283203125,
-0.0171356201171875,
0.021636962890625,
-0.0714111328125,
0.0291748046875,
0.0111236572265625,
0.0306396484375,
0.030242919921875,
-0.0109710693359375,
0.038909912109375,
0.03369140625,
-0.0171356201171875,
-0.066162109375,
-0.00557708740234375,
-0.0684814453125,
-0.04730224609375,
0.07635498046875,
-0.01119232177734375,
-0.027435302734375,
-0.004383087158203125,
-0.00732421875,
0.0330810546875,
-0.0069732666015625,
0.05029296875,
0.072021484375,
0.02587890625,
-0.01702880859375,
-0.028656005859375,
0.026824951171875,
0.042022705078125,
-0.040740966796875,
-0.029266357421875,
0.00873565673828125,
0.04583740234375,
0.0246734619140625,
0.0687255859375,
-0.0016317367553710938,
0.0016994476318359375,
-0.01056671142578125,
0.0271148681640625,
-0.00916290283203125,
-0.01415252685546875,
-0.0210113525390625,
-0.00705718994140625,
-0.02142333984375,
-0.03631591796875
]
] |
timm/eva02_base_patch14_448.mim_in22k_ft_in22k_in1k | 2023-03-31T05:45:17.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"dataset:imagenet-22k",
"arxiv:2303.11331",
"arxiv:2303.15389",
"license:mit",
"region:us"
] | image-classification | timm | null | null | timm/eva02_base_patch14_448.mim_in22k_ft_in22k_in1k | 0 | 35,030 | timm | 2023-03-31T04:17:36 | ---
tags:
- image-classification
- timm
library_tag: timm
license: mit
datasets:
- imagenet-1k
- imagenet-22k
---
# Model card for eva02_base_patch14_448.mim_in22k_ft_in22k_in1k
An EVA02 image classification model. Pretrained on ImageNet-22k with masked image modeling (using EVA-CLIP as a MIM teacher) and fine-tuned on ImageNet-22k then on ImageNet-1k by paper authors.
EVA-02 models are vision transformers with mean pooling, SwiGLU, Rotary Position Embeddings (ROPE), and extra LN in MLP (for Base & Large).
NOTE: `timm` checkpoints are float32 for consistency with other models. Original checkpoints are float16 or bfloat16 in some cases, see originals if that's preferred.
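The SwiGLU feed-forward block mentioned above can be sketched in plain Python (an illustrative sketch, not timm's actual implementation, which uses fused tensor ops): the input is projected twice, one projection is gated through SiLU, and the elementwise product is projected back down.

```python
import math

def silu(v):
    # SiLU / Swish: x * sigmoid(x), written as x / (1 + e^-x)
    return [x / (1.0 + math.exp(-x)) for x in v]

def matvec(W, x):
    # W is a list of rows; returns W @ x
    return [sum(w_i * x_i for w_i, x_i in zip(row, x)) for row in W]

def swiglu_ffn(x, w_gate, w_up, w_down):
    gate = silu(matvec(w_gate, x))          # gated branch
    up = matvec(w_up, x)                    # linear branch
    hidden = [g * u for g, u in zip(gate, up)]  # elementwise gate
    return matvec(w_down, hidden)           # down projection
```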
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 87.1
- GMACs: 107.1
- Activations (M): 259.1
- Image size: 448 x 448
- **Papers:**
- EVA-02: A Visual Representation for Neon Genesis: https://arxiv.org/abs/2303.11331
- EVA-CLIP: Improved Training Techniques for CLIP at Scale: https://arxiv.org/abs/2303.15389
- **Original:**
- https://github.com/baaivision/EVA
- https://huggingface.co/Yuxin-CV/EVA-02
- **Pretrain Dataset:** ImageNet-22k
- **Dataset:** ImageNet-1k
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('eva02_base_patch14_448.mim_in22k_ft_in22k_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'eva02_base_patch14_448.mim_in22k_ft_in22k_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 1025, 768) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
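Pooled embedding vectors like the one produced above can be compared with cosine similarity. A minimal sketch using only the standard library (the two short vectors below are placeholders for illustration, not real model outputs):

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two equal-length embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Placeholder vectors; in practice these would be the (1, num_features)
# outputs of model.forward_head(..., pre_logits=True), flattened to lists.
emb1 = [0.1, 0.3, -0.2]
emb2 = [0.2, 0.1, -0.1]
print(cosine_similarity(emb1, emb2))
```

A similarity near 1.0 indicates near-duplicate images; values near 0 indicate unrelated content.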
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
|model |top1 |top5 |param_count|img_size|
|-----------------------------------------------|------|------|-----------|--------|
|eva02_large_patch14_448.mim_m38m_ft_in22k_in1k |90.054|99.042|305.08 |448 |
|eva02_large_patch14_448.mim_in22k_ft_in22k_in1k|89.946|99.01 |305.08 |448 |
|eva_giant_patch14_560.m30m_ft_in22k_in1k |89.792|98.992|1014.45 |560 |
|eva02_large_patch14_448.mim_in22k_ft_in1k |89.626|98.954|305.08 |448 |
|eva02_large_patch14_448.mim_m38m_ft_in1k |89.57 |98.918|305.08 |448 |
|eva_giant_patch14_336.m30m_ft_in22k_in1k |89.56 |98.956|1013.01 |336 |
|eva_giant_patch14_336.clip_ft_in1k |89.466|98.82 |1013.01 |336 |
|eva_large_patch14_336.in22k_ft_in22k_in1k |89.214|98.854|304.53 |336 |
|eva_giant_patch14_224.clip_ft_in1k |88.882|98.678|1012.56 |224 |
|eva02_base_patch14_448.mim_in22k_ft_in22k_in1k |88.692|98.722|87.12 |448 |
|eva_large_patch14_336.in22k_ft_in1k |88.652|98.722|304.53 |336 |
|eva_large_patch14_196.in22k_ft_in22k_in1k |88.592|98.656|304.14 |196 |
|eva02_base_patch14_448.mim_in22k_ft_in1k |88.23 |98.564|87.12 |448 |
|eva_large_patch14_196.in22k_ft_in1k |87.934|98.504|304.14 |196 |
|eva02_small_patch14_336.mim_in22k_ft_in1k |85.74 |97.614|22.13 |336 |
|eva02_tiny_patch14_336.mim_in22k_ft_in1k |80.658|95.524|5.76 |336 |
## Citation
```bibtex
@article{EVA02,
title={EVA-02: A Visual Representation for Neon Genesis},
author={Fang, Yuxin and Sun, Quan and Wang, Xinggang and Huang, Tiejun and Wang, Xinlong and Cao, Yue},
journal={arXiv preprint arXiv:2303.11331},
year={2023}
}
```
```bibtex
@article{EVA-CLIP,
  title={EVA-CLIP: Improved Training Techniques for CLIP at Scale},
author={Sun, Quan and Fang, Yuxin and Wu, Ledell and Wang, Xinlong and Cao, Yue},
journal={arXiv preprint arXiv:2303.15389},
year={2023}
}
```
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
| 5,438 | [
[
-0.044158935546875,
-0.0297698974609375,
0.0120391845703125,
0.008087158203125,
-0.0171966552734375,
0.0013332366943359375,
-0.00823211669921875,
-0.033294677734375,
0.038726806640625,
0.0275726318359375,
-0.034576416015625,
-0.05145263671875,
-0.043243408203125,
0.006328582763671875,
-0.0042877197265625,
0.0625,
-0.007335662841796875,
-0.00786590576171875,
0.0004169940948486328,
-0.027435302734375,
-0.017242431640625,
-0.017822265625,
-0.047393798828125,
-0.0214080810546875,
0.03192138671875,
0.0203857421875,
0.046539306640625,
0.05096435546875,
0.040740966796875,
0.030426025390625,
-0.021728515625,
0.00489044189453125,
-0.020721435546875,
-0.0242156982421875,
0.0276947021484375,
-0.04962158203125,
-0.0517578125,
0.0024127960205078125,
0.061767578125,
0.0278167724609375,
0.006275177001953125,
0.022552490234375,
0.010040283203125,
0.04156494140625,
-0.0227203369140625,
-0.0008287429809570312,
-0.02020263671875,
0.01324462890625,
-0.01514434814453125,
-0.008758544921875,
-0.0206146240234375,
-0.024261474609375,
0.0056610107421875,
-0.056427001953125,
0.03167724609375,
0.006069183349609375,
0.1019287109375,
0.005268096923828125,
-0.005931854248046875,
0.0130157470703125,
-0.0218353271484375,
0.056488037109375,
-0.06182861328125,
0.0149078369140625,
0.01352691650390625,
0.00759124755859375,
-0.001163482666015625,
-0.06622314453125,
-0.0438232421875,
0.0008211135864257812,
-0.024017333984375,
0.018096923828125,
-0.0301513671875,
0.0017652511596679688,
0.031341552734375,
0.0287322998046875,
-0.03936767578125,
0.004962921142578125,
-0.038299560546875,
-0.0177154541015625,
0.045989990234375,
0.00658416748046875,
0.033172607421875,
-0.03277587890625,
-0.056671142578125,
-0.033660888671875,
-0.04443359375,
0.01436614990234375,
0.02777099609375,
-0.006000518798828125,
-0.04547119140625,
0.037109375,
0.005767822265625,
0.036102294921875,
0.0033416748046875,
-0.0139923095703125,
0.051605224609375,
-0.0229644775390625,
-0.0234832763671875,
-0.01367950439453125,
0.0849609375,
0.048919677734375,
0.006404876708984375,
0.0167694091796875,
-0.0081939697265625,
-0.024993896484375,
-0.004413604736328125,
-0.08441162109375,
-0.01641845703125,
0.016357421875,
-0.042510986328125,
-0.0213165283203125,
0.0186004638671875,
-0.07696533203125,
-0.0003757476806640625,
-0.002307891845703125,
0.04736328125,
-0.042938232421875,
-0.0272064208984375,
-0.0006537437438964844,
-0.02166748046875,
0.0195770263671875,
0.02728271484375,
-0.041900634765625,
0.01617431640625,
0.024871826171875,
0.08245849609375,
0.00873565673828125,
-0.0266571044921875,
-0.0037078857421875,
-0.0105438232421875,
-0.0361328125,
0.0439453125,
-0.00948333740234375,
-0.01175689697265625,
-0.0179443359375,
0.0232391357421875,
-0.01213836669921875,
-0.039215087890625,
0.01568603515625,
-0.0111236572265625,
0.00850677490234375,
-0.01233673095703125,
-0.0240020751953125,
-0.036468505859375,
0.0182342529296875,
-0.042266845703125,
0.0833740234375,
0.03759765625,
-0.06561279296875,
0.0255889892578125,
-0.037933349609375,
0.01351165771484375,
-0.001979827880859375,
0.0002665519714355469,
-0.0648193359375,
-0.011810302734375,
0.02447509765625,
0.03070068359375,
-0.0173492431640625,
-0.01171875,
-0.03173828125,
-0.0215911865234375,
0.01409149169921875,
-0.0089111328125,
0.0733642578125,
0.0102386474609375,
-0.037506103515625,
0.01837158203125,
-0.052154541015625,
0.01412200927734375,
0.034942626953125,
-0.012115478515625,
-0.00238037109375,
-0.043731689453125,
0.00922393798828125,
0.01512908935546875,
0.0103607177734375,
-0.032806396484375,
0.0240020751953125,
-0.01425933837890625,
0.0301513671875,
0.062469482421875,
0.00553131103515625,
0.019805908203125,
-0.029388427734375,
0.037384033203125,
0.014556884765625,
0.0203399658203125,
-0.01390838623046875,
-0.04791259765625,
-0.06829833984375,
-0.042205810546875,
0.02569580078125,
0.028564453125,
-0.04132080078125,
0.051300048828125,
-0.0103302001953125,
-0.054779052734375,
-0.03173828125,
0.00405120849609375,
0.044464111328125,
0.04364013671875,
0.0361328125,
-0.035919189453125,
-0.0372314453125,
-0.0770263671875,
0.012054443359375,
-0.0099639892578125,
0.0027141571044921875,
0.032623291015625,
0.062408447265625,
-0.01345062255859375,
0.059539794921875,
-0.046539306640625,
-0.0176849365234375,
-0.0206146240234375,
0.006336212158203125,
0.0311279296875,
0.0472412109375,
0.075439453125,
-0.035552978515625,
-0.0504150390625,
-0.009674072265625,
-0.060943603515625,
-0.0029277801513671875,
0.00003421306610107422,
-0.020965576171875,
0.021942138671875,
0.01268768310546875,
-0.051116943359375,
0.055419921875,
0.0194854736328125,
-0.044219970703125,
0.04193115234375,
-0.026458740234375,
0.0214385986328125,
-0.08221435546875,
0.022857666015625,
0.01514434814453125,
-0.01678466796875,
-0.04571533203125,
0.001987457275390625,
0.01329803466796875,
0.00421142578125,
-0.042877197265625,
0.0545654296875,
-0.040008544921875,
-0.00611114501953125,
-0.00299835205078125,
-0.01532745361328125,
0.00960540771484375,
0.056488037109375,
-0.0012235641479492188,
0.053253173828125,
0.053131103515625,
-0.0240478515625,
0.0234222412109375,
0.0274505615234375,
-0.0301971435546875,
0.03863525390625,
-0.057952880859375,
0.00498199462890625,
-0.005336761474609375,
0.03448486328125,
-0.068603515625,
-0.0190582275390625,
0.029327392578125,
-0.036041259765625,
0.03765869140625,
-0.032684326171875,
-0.035858154296875,
-0.045745849609375,
-0.04815673828125,
0.035919189453125,
0.048919677734375,
-0.04620361328125,
0.0225677490234375,
0.00873565673828125,
0.007724761962890625,
-0.0408935546875,
-0.0513916015625,
-0.018585205078125,
-0.03173828125,
-0.05419921875,
0.0433349609375,
0.0024814605712890625,
0.00027060508728027344,
0.005855560302734375,
0.0021915435791015625,
0.00832366943359375,
-0.0160064697265625,
0.0338134765625,
0.042633056640625,
-0.026336669921875,
-0.0229949951171875,
-0.01763916015625,
-0.0205841064453125,
-0.0008635520935058594,
-0.0025386810302734375,
0.052154541015625,
-0.017578125,
-0.02197265625,
-0.061279296875,
0.0014009475708007812,
0.05303955078125,
-0.01001739501953125,
0.05718994140625,
0.06585693359375,
-0.03436279296875,
0.0022907257080078125,
-0.038818359375,
-0.0242462158203125,
-0.035430908203125,
0.031280517578125,
-0.018218994140625,
-0.034210205078125,
0.06890869140625,
0.018646240234375,
-0.0020275115966796875,
0.06781005859375,
0.0307159423828125,
-0.00347137451171875,
0.08184814453125,
0.0276641845703125,
0.01464080810546875,
0.053253173828125,
-0.08331298828125,
-0.01369476318359375,
-0.083251953125,
-0.042724609375,
-0.019927978515625,
-0.04486083984375,
-0.03302001953125,
-0.03704833984375,
0.043182373046875,
0.03057861328125,
-0.032623291015625,
0.0413818359375,
-0.05731201171875,
0.010345458984375,
0.053131103515625,
0.038299560546875,
-0.015411376953125,
0.01593017578125,
-0.01322174072265625,
-0.0068206787109375,
-0.048553466796875,
-0.0178985595703125,
0.08526611328125,
0.0394287109375,
0.039764404296875,
-0.00009375810623168945,
0.048004150390625,
-0.00823974609375,
0.017303466796875,
-0.034759521484375,
0.046844482421875,
0.004852294921875,
-0.047027587890625,
-0.0065460205078125,
-0.031280517578125,
-0.0706787109375,
0.0221710205078125,
-0.02752685546875,
-0.06915283203125,
0.0271148681640625,
0.019439697265625,
-0.0279541015625,
0.0550537109375,
-0.061309814453125,
0.06256103515625,
-0.01392364501953125,
-0.034576416015625,
-0.003040313720703125,
-0.052764892578125,
0.0157470703125,
0.0185699462890625,
-0.00554656982421875,
-0.003997802734375,
0.0120086669921875,
0.08642578125,
-0.0604248046875,
0.051361083984375,
-0.025238037109375,
0.0228424072265625,
0.044891357421875,
-0.01450347900390625,
0.0419921875,
0.0113525390625,
0.0068817138671875,
0.0248565673828125,
0.0027790069580078125,
-0.0300750732421875,
-0.0294952392578125,
0.0380859375,
-0.07281494140625,
-0.03289794921875,
-0.042694091796875,
-0.0228424072265625,
0.02044677734375,
0.01511383056640625,
0.050262451171875,
0.04571533203125,
0.01328277587890625,
0.028839111328125,
0.042205810546875,
-0.02960205078125,
0.035919189453125,
0.0009679794311523438,
-0.02288818359375,
-0.057037353515625,
0.068359375,
0.0157623291015625,
0.0179595947265625,
0.005535125732421875,
0.015625,
-0.02593994140625,
-0.0307159423828125,
-0.038726806640625,
0.035858154296875,
-0.04718017578125,
-0.03680419921875,
-0.038360595703125,
-0.0293731689453125,
-0.0203704833984375,
-0.01500701904296875,
-0.033935546875,
-0.028564453125,
-0.039093017578125,
0.0061798095703125,
0.064208984375,
0.046630859375,
-0.0181732177734375,
0.0273284912109375,
-0.04150390625,
0.0143890380859375,
0.01047515869140625,
0.032501220703125,
0.0003006458282470703,
-0.058380126953125,
-0.006343841552734375,
0.001163482666015625,
-0.036895751953125,
-0.07843017578125,
0.040130615234375,
-0.0002570152282714844,
0.033203125,
0.031829833984375,
-0.0261993408203125,
0.07012939453125,
-0.0132598876953125,
0.056121826171875,
0.047943115234375,
-0.049285888671875,
0.044158935546875,
-0.02197265625,
0.0020465850830078125,
0.017333984375,
0.01074981689453125,
-0.0237579345703125,
-0.012451171875,
-0.0777587890625,
-0.0716552734375,
0.073486328125,
0.022979736328125,
-0.00937652587890625,
0.0215606689453125,
0.030914306640625,
-0.004085540771484375,
-0.00782012939453125,
-0.053009033203125,
-0.05096435546875,
-0.026947021484375,
-0.01171112060546875,
-0.0043487548828125,
-0.01041412353515625,
-0.0115509033203125,
-0.05255126953125,
0.04693603515625,
-0.0007610321044921875,
0.058990478515625,
0.02288818359375,
0.00141143798828125,
-0.01178741455078125,
-0.0196990966796875,
0.039306640625,
0.031951904296875,
-0.0311279296875,
-0.0025691986083984375,
0.0167083740234375,
-0.042266845703125,
-0.0002130270004272461,
0.012969970703125,
-0.0179901123046875,
0.005397796630859375,
0.036895751953125,
0.073486328125,
0.00707244873046875,
-0.012969970703125,
0.04132080078125,
0.006221771240234375,
-0.031494140625,
-0.0171661376953125,
0.0194091796875,
-0.00989532470703125,
0.0270843505859375,
0.03155517578125,
0.0219879150390625,
-0.01200103759765625,
-0.0291900634765625,
0.012725830078125,
0.042144775390625,
-0.0263519287109375,
-0.034912109375,
0.054718017578125,
-0.016204833984375,
-0.01502227783203125,
0.0462646484375,
-0.0010318756103515625,
-0.0423583984375,
0.0870361328125,
0.04180908203125,
0.058258056640625,
-0.011444091796875,
0.0139312744140625,
0.07269287109375,
0.00742340087890625,
-0.0005540847778320312,
0.0189361572265625,
0.024810791015625,
-0.034149169921875,
0.00922393798828125,
-0.042266845703125,
-0.0015773773193359375,
0.03759765625,
-0.041229248046875,
0.0292205810546875,
-0.04473876953125,
-0.0277252197265625,
0.004482269287109375,
0.0147857666015625,
-0.061676025390625,
0.01275634765625,
0.0027828216552734375,
0.06768798828125,
-0.061676025390625,
0.04913330078125,
0.051300048828125,
-0.04473876953125,
-0.0723876953125,
-0.0159912109375,
0.005794525146484375,
-0.06494140625,
0.032257080078125,
0.019927978515625,
0.0223846435546875,
0.0064239501953125,
-0.053985595703125,
-0.06890869140625,
0.1156005859375,
0.032806396484375,
-0.01861572265625,
0.01509857177734375,
-0.0029449462890625,
0.0279541015625,
-0.022186279296875,
0.048797607421875,
0.0165557861328125,
0.037811279296875,
0.0177764892578125,
-0.047943115234375,
0.01404571533203125,
-0.03289794921875,
0.00860595703125,
0.0192108154296875,
-0.0758056640625,
0.07318115234375,
-0.0244293212890625,
-0.00977325439453125,
0.011688232421875,
0.039093017578125,
0.03741455078125,
0.00420379638671875,
0.038116455078125,
0.06329345703125,
0.035308837890625,
-0.0296478271484375,
0.062744140625,
-0.021575927734375,
0.048431396484375,
0.031951904296875,
0.02777099609375,
0.041351318359375,
0.029510498046875,
-0.03411865234375,
0.0352783203125,
0.0665283203125,
-0.03369140625,
0.0258941650390625,
0.00543212890625,
-0.00457000732421875,
-0.0020847320556640625,
0.0032367706298828125,
-0.05242919921875,
0.0194854736328125,
0.0222625732421875,
-0.03851318359375,
-0.008453369140625,
-0.00722503662109375,
0.0084381103515625,
-0.0278167724609375,
-0.0207977294921875,
0.03558349609375,
-0.0006399154663085938,
-0.038818359375,
0.052154541015625,
0.00470733642578125,
0.0584716796875,
-0.037506103515625,
-0.01052093505859375,
-0.02288818359375,
0.0167388916015625,
-0.02862548828125,
-0.07672119140625,
0.01241302490234375,
-0.0008268356323242188,
-0.00016927719116210938,
-0.00028395652770996094,
0.046875,
-0.016143798828125,
-0.0438232421875,
0.02288818359375,
0.01027679443359375,
0.01947021484375,
0.01294708251953125,
-0.0751953125,
0.006175994873046875,
0.003803253173828125,
-0.0521240234375,
0.0213775634765625,
0.020721435546875,
0.01120758056640625,
0.040496826171875,
0.044677734375,
-0.0007672309875488281,
0.021759033203125,
-0.0196685791015625,
0.061279296875,
-0.0399169921875,
-0.030120849609375,
-0.06024169921875,
0.051513671875,
-0.010833740234375,
-0.05010986328125,
0.050048828125,
0.05487060546875,
0.048919677734375,
-0.005237579345703125,
0.0340576171875,
-0.02117919921875,
0.005615234375,
-0.037994384765625,
0.059814453125,
-0.062347412109375,
-0.0049896240234375,
-0.0248260498046875,
-0.0670166015625,
-0.0239410400390625,
0.059051513671875,
-0.012786865234375,
0.0218353271484375,
0.04461669921875,
0.0732421875,
-0.0126953125,
-0.0235443115234375,
-0.004245758056640625,
0.01461029052734375,
0.0153045654296875,
0.043212890625,
0.037384033203125,
-0.05169677734375,
0.0257110595703125,
-0.05194091796875,
-0.0123138427734375,
-0.0281829833984375,
-0.04296875,
-0.0684814453125,
-0.052001953125,
-0.038665771484375,
-0.048980712890625,
-0.018157958984375,
0.064697265625,
0.070556640625,
-0.051116943359375,
0.0007214546203613281,
-0.00936126708984375,
0.00937652587890625,
-0.0195770263671875,
-0.018707275390625,
0.057281494140625,
0.01025390625,
-0.059478759765625,
-0.01042938232421875,
0.00980377197265625,
0.0343017578125,
0.0030269622802734375,
-0.034149169921875,
-0.006053924560546875,
-0.01386260986328125,
0.0120849609375,
0.03277587890625,
-0.05963134765625,
-0.025115966796875,
-0.00864410400390625,
-0.006649017333984375,
0.0380859375,
0.020751953125,
-0.04241943359375,
0.0249786376953125,
0.04840087890625,
-0.0011539459228515625,
0.06048583984375,
-0.0091552734375,
0.0014181137084960938,
-0.05340576171875,
0.029083251953125,
-0.0016336441040039062,
0.04437255859375,
0.011566162109375,
-0.0169219970703125,
0.0467529296875,
0.038238525390625,
-0.0330810546875,
-0.07086181640625,
-0.00370025634765625,
-0.0914306640625,
-0.00994110107421875,
0.07916259765625,
-0.016571044921875,
-0.037139892578125,
0.02642822265625,
-0.0157623291015625,
0.034027099609375,
-0.0187530517578125,
0.027191162109375,
0.017547607421875,
-0.00504302978515625,
-0.040679931640625,
-0.038604736328125,
0.03143310546875,
0.0168304443359375,
-0.0560302734375,
-0.027252197265625,
0.007373809814453125,
0.038177490234375,
0.041229248046875,
0.040496826171875,
-0.01690673828125,
0.0170745849609375,
0.01033782958984375,
0.02801513671875,
-0.020782470703125,
-0.017181396484375,
-0.017547607421875,
0.0002510547637939453,
-0.01275634765625,
-0.03082275390625
]
] |
bigcode/gpt_bigcode-santacoder | 2023-06-08T09:20:22.000Z | [
"transformers",
"pytorch",
"safetensors",
"gpt_bigcode",
"text-generation",
"code",
"dataset:bigcode/the-stack",
"license:openrail",
"model-index",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | bigcode | null | null | bigcode/gpt_bigcode-santacoder | 21 | 34,770 | transformers | 2023-04-06T01:35:04 | ---
license: openrail
datasets:
- bigcode/the-stack
language:
- code
programming_language:
- Java
- JavaScript
- Python
pipeline_tag: text-generation
inference: false
model-index:
- name: SantaCoder
results:
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL HumanEval (Python)
metrics:
- name: pass@1
type: pass@1
value: 0.18
verified: false
- name: pass@10
type: pass@10
value: 0.29
verified: false
- name: pass@100
type: pass@100
value: 0.49
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL MBPP (Python)
metrics:
- name: pass@1
type: pass@1
value: 0.35
verified: false
- name: pass@10
type: pass@10
value: 0.58
verified: false
- name: pass@100
type: pass@100
value: 0.77
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL HumanEval (JavaScript)
metrics:
- name: pass@1
type: pass@1
value: 0.16
verified: false
- name: pass@10
type: pass@10
value: 0.27
verified: false
- name: pass@100
type: pass@100
value: 0.47
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL MBPP (Javascript)
metrics:
- name: pass@1
type: pass@1
value: 0.28
verified: false
- name: pass@10
type: pass@10
value: 0.51
verified: false
- name: pass@100
type: pass@100
value: 0.70
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL HumanEval (Java)
metrics:
- name: pass@1
type: pass@1
value: 0.15
verified: false
- name: pass@10
type: pass@10
value: 0.26
verified: false
- name: pass@100
type: pass@100
value: 0.41
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL MBPP (Java)
metrics:
- name: pass@1
type: pass@1
value: 0.28
verified: false
- name: pass@10
type: pass@10
value: 0.44
verified: false
- name: pass@100
type: pass@100
value: 0.59
verified: false
- task:
type: text-generation
dataset:
type: loubnabnl/humaneval_infilling
name: HumanEval FIM (Python)
metrics:
- name: single_line
type: exact_match
value: 0.44
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL HumanEval FIM (Java)
metrics:
- name: single_line
type: exact_match
value: 0.62
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL HumanEval FIM (JavaScript)
metrics:
- name: single_line
type: exact_match
value: 0.60
verified: false
- task:
type: text-generation
dataset:
type: code_x_glue_ct_code_to_text
name: CodeXGLUE code-to-text (Python)
metrics:
- name: BLEU
type: bleu
value: 18.13
verified: false
---
# SantaCoder

Play with the model on the [SantaCoder Space Demo](https://huggingface.co/spaces/bigcode/santacoder-demo).
# Table of Contents
1. [Model Summary](#model-summary)
2. [Use](#use)
3. [Limitations](#limitations)
4. [Training](#training)
5. [License](#license)
6. [Citation](#citation)
# Model Summary
This is the same model as [SantaCoder](https://huggingface.co/bigcode/santacoder) but it can be loaded with transformers >=4.28.1 to use the GPTBigCode architecture.
We refer the reader to the [SantaCoder model page](https://huggingface.co/bigcode/santacoder) for full documentation about this model.
- **Repository:** [bigcode/Megatron-LM](https://github.com/bigcode-project/Megatron-LM)
- **Project Website:** [bigcode-project.org](https://www.bigcode-project.org)
- **Paper:** [🎅SantaCoder: Don't reach for the stars!🌟](https://t.co/YV3pzUbYOr)
- **Point of Contact:** [contact@bigcode-project.org](mailto:contact@bigcode-project.org)
- **Languages:** Python, Java, and JavaScript
There are two versions (branches) of the model:
* `main`: Uses the `gpt_bigcode` model. [Requires the bigcode fork of transformers](https://github.com/bigcode-project/transformers).
* `main_custom`: Packaged with its modeling code. Requires `transformers>=4.27`.
Alternatively, it can run on older versions by setting the configuration parameter `activation_function = "gelu_pytorch_tanh"`.
# Use
## Intended use
The model was trained on GitHub code. As such it is _not_ an instruction model and commands like "Write a function that computes the square root." do not work well.
Instead, phrase requests the way they occur in source code, such as comments (e.g. `# the following function computes the sqrt`), or write a function signature and docstring and let the model complete the function body.
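For example, a completion-style prompt can be assembled as a docstring plus a signature. The function name below is illustrative, and the prompt is only constructed here; the commented lines sketch how it could be passed to the model via `transformers`:

```python
# Build a completion-style prompt: a signature plus docstring,
# rather than a natural-language instruction.
prompt = '''def sqrt_newton(x, eps=1e-9):
    """Compute the square root of x with Newton's method."""
'''

# With transformers installed, the prompt would be sent to the model, e.g.:
#   from transformers import pipeline
#   generator = pipeline("text-generation", model="bigcode/gpt_bigcode-santacoder")
#   print(generator(prompt, max_new_tokens=64)[0]["generated_text"])
print(prompt)
```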
### Attribution & Other Requirements
The pretraining dataset of the model was filtered for permissive licenses only. Nevertheless, the model can generate source code verbatim from the dataset. The code's license might require attribution and/or other specific requirements that must be respected. We provide a [search index](https://huggingface.co/spaces/bigcode/santacoder-search) that lets you search through the pretraining data to identify where generated code came from and apply the proper attribution to your code.
# Limitations
The model has been trained on source code in Python, Java, and JavaScript. The predominant natural language in the source code is English, although other languages are also present. The model can generate code snippets given some context, but the generated code is not guaranteed to work as intended: it can be inefficient and may contain bugs or exploits.
# Training
## Model
- **Architecture:** GPT-2 model with multi-query attention and Fill-in-the-Middle objective
- **Pretraining steps:** 600K
- **Pretraining tokens:** 236 billion
- **Precision:** float16
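The Fill-in-the-Middle objective lets the model infill code between a prefix and a suffix. A minimal sketch of assembling such a prompt — the `<fim-prefix>`/`<fim-suffix>`/`<fim-middle>` sentinel strings follow the format described in the SantaCoder paper, but verify them against the tokenizer's special tokens before use:

```python
# FIM sentinel tokens as described for SantaCoder; check
# tokenizer.special_tokens_map to confirm the exact strings.
FIM_PREFIX = "<fim-prefix>"
FIM_SUFFIX = "<fim-suffix>"
FIM_MIDDLE = "<fim-middle>"

def build_fim_prompt(prefix: str, suffix: str) -> str:
    # The prefix and suffix surround the hole; the model generates
    # the missing middle after the <fim-middle> sentinel.
    return f"{FIM_PREFIX}{prefix}{FIM_SUFFIX}{suffix}{FIM_MIDDLE}"

prompt = build_fim_prompt(
    "def fib(n):\n    ",
    "\n    return a\n",
)
print(prompt)
```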
## Hardware
- **GPUs:** 96 Tesla V100
- **Training time:** 6.2 days
- **Total FLOPS:** 2.1 × 10^21
## Software
- **Orchestration:** [Megatron-LM](https://github.com/bigcode-project/Megatron-LM)
- **Neural networks:** [PyTorch](https://github.com/pytorch/pytorch)
- **FP16 if applicable:** [apex](https://github.com/NVIDIA/apex)
# License
The model is licensed under the CodeML OpenRAIL-M v0.1 license. You can find the full license [here](https://huggingface.co/spaces/bigcode/license).
| 6,802 | [
[
-0.028350830078125,
-0.028411865234375,
0.019561767578125,
0.00952911376953125,
-0.01922607421875,
-0.0154876708984375,
-0.0227508544921875,
-0.036041259765625,
0.00023508071899414062,
0.0193939208984375,
-0.040435791015625,
-0.037628173828125,
-0.048858642578125,
-0.011383056640625,
-0.0182952880859375,
0.0980224609375,
-0.001628875732421875,
0.005950927734375,
-0.01727294921875,
0.01279449462890625,
-0.007904052734375,
-0.05389404296875,
-0.023651123046875,
-0.02508544921875,
0.0257720947265625,
0.01318359375,
0.066650390625,
0.031768798828125,
0.0338134765625,
0.02197265625,
-0.02032470703125,
-0.0202484130859375,
-0.0220794677734375,
-0.03753662109375,
-0.0090484619140625,
-0.021087646484375,
-0.0445556640625,
-0.00637054443359375,
0.0328369140625,
-0.005046844482421875,
-0.0167694091796875,
0.03692626953125,
-0.0191192626953125,
0.02423095703125,
-0.02642822265625,
0.035430908203125,
-0.033477783203125,
-0.0009469985961914062,
0.0267333984375,
0.022735595703125,
-0.0114288330078125,
-0.0325927734375,
-0.005458831787109375,
-0.042205810546875,
0.034454345703125,
-0.008087158203125,
0.08563232421875,
0.0173187255859375,
-0.016387939453125,
-0.0185394287109375,
-0.058837890625,
0.039703369140625,
-0.053070068359375,
0.0297698974609375,
0.02752685546875,
0.0430908203125,
0.0263519287109375,
-0.0784912109375,
-0.041290283203125,
-0.042266845703125,
-0.01326751708984375,
0.00016760826110839844,
-0.027130126953125,
-0.0169525146484375,
0.052032470703125,
0.02197265625,
-0.06268310546875,
0.0011425018310546875,
-0.047515869140625,
0.0040435791015625,
0.054901123046875,
-0.00232696533203125,
0.01537322998046875,
-0.032470703125,
-0.03656005859375,
-0.023651123046875,
-0.0531005859375,
-0.003856658935546875,
0.019256591796875,
0.0029315948486328125,
-0.0328369140625,
0.04107666015625,
0.01580810546875,
0.0236053466796875,
0.0004825592041015625,
0.005413055419921875,
0.041839599609375,
-0.02392578125,
-0.0345458984375,
-0.0043487548828125,
0.0703125,
0.01276397705078125,
0.0251922607421875,
-0.00551605224609375,
-0.0277557373046875,
-0.0124359130859375,
0.03582763671875,
-0.070068359375,
-0.0296783447265625,
0.0160369873046875,
-0.00928497314453125,
-0.005931854248046875,
0.0146026611328125,
-0.045654296875,
0.00887298583984375,
-0.0226287841796875,
0.037353515625,
-0.0264739990234375,
-0.0306396484375,
0.0201873779296875,
-0.0028400421142578125,
0.0238189697265625,
0.01039886474609375,
-0.06805419921875,
0.01493072509765625,
0.0513916015625,
0.0546875,
0.0305633544921875,
-0.032958984375,
-0.0016393661499023438,
0.0021381378173828125,
-0.006084442138671875,
0.00734710693359375,
-0.0098724365234375,
-0.0199432373046875,
-0.0104522705078125,
0.025177001953125,
-0.0013103485107421875,
-0.032257080078125,
0.0261383056640625,
-0.0623779296875,
0.018463134765625,
0.00881195068359375,
-0.0200347900390625,
-0.02105712890625,
0.00878143310546875,
-0.055023193359375,
0.07037353515625,
0.035736083984375,
-0.05023193359375,
0.01493072509765625,
-0.064697265625,
-0.02423095703125,
0.001346588134765625,
0.0035915374755859375,
-0.0633544921875,
0.00196075439453125,
0.01297760009765625,
0.0306396484375,
-0.02020263671875,
0.041748046875,
-0.0030117034912109375,
-0.0268402099609375,
0.01849365234375,
-0.01544952392578125,
0.06634521484375,
0.01508331298828125,
-0.053466796875,
0.0240478515625,
-0.043426513671875,
-0.0032176971435546875,
0.042724609375,
-0.00022399425506591797,
0.01458740234375,
-0.02655029296875,
0.032562255859375,
0.035125732421875,
0.0223236083984375,
-0.0301971435546875,
0.033935546875,
-0.0224609375,
0.050079345703125,
0.036346435546875,
-0.02783203125,
0.014678955078125,
-0.006862640380859375,
0.054107666015625,
0.0008707046508789062,
0.02911376953125,
-0.0037212371826171875,
-0.038330078125,
-0.037872314453125,
-0.018829345703125,
0.041534423828125,
0.0299530029296875,
-0.05023193359375,
0.062286376953125,
-0.0333251953125,
-0.047607421875,
-0.024688720703125,
-0.00768280029296875,
0.041168212890625,
0.0013685226440429688,
0.0390625,
-0.01424407958984375,
-0.040557861328125,
-0.0579833984375,
0.03765869140625,
0.0015554428100585938,
-0.01416778564453125,
0.01812744140625,
0.0831298828125,
-0.034271240234375,
0.05474853515625,
-0.046600341796875,
0.003971099853515625,
-0.02545166015625,
-0.005786895751953125,
0.035797119140625,
0.052398681640625,
0.047821044921875,
-0.0625,
-0.0213165283203125,
0.004459381103515625,
-0.045867919921875,
0.0291290283203125,
0.006069183349609375,
0.00812530517578125,
0.0012884140014648438,
0.0242767333984375,
-0.090087890625,
0.052947998046875,
0.04400634765625,
-0.02154541015625,
0.06341552734375,
-0.01136016845703125,
-0.0013494491577148438,
-0.08441162109375,
0.035064697265625,
0.00814056396484375,
-0.0231781005859375,
-0.019012451171875,
0.0255279541015625,
0.01568603515625,
-0.02740478515625,
-0.03912353515625,
0.0281524658203125,
-0.02008056640625,
-0.01062774658203125,
-0.0107879638671875,
-0.0179443359375,
-0.0016584396362304688,
0.0416259765625,
0.0019464492797851562,
0.07659912109375,
0.052581787109375,
-0.038299560546875,
0.040679931640625,
0.02972412109375,
-0.0231475830078125,
-0.00734710693359375,
-0.0850830078125,
0.0113677978515625,
0.0016317367553710938,
0.024871826171875,
-0.06719970703125,
-0.0277099609375,
0.041046142578125,
-0.050750732421875,
0.027740478515625,
-0.029022216796875,
-0.057037353515625,
-0.061920166015625,
-0.00801849365234375,
0.029388427734375,
0.0654296875,
-0.055816650390625,
0.02728271484375,
0.01323699951171875,
0.003810882568359375,
-0.037872314453125,
-0.049224853515625,
-0.0030536651611328125,
-0.0167694091796875,
-0.04901123046875,
0.02899169921875,
-0.0052947998046875,
0.0152740478515625,
0.0067291259765625,
0.0019817352294921875,
-0.0108795166015625,
-0.008148193359375,
0.029388427734375,
0.0254974365234375,
-0.0220947265625,
0.005584716796875,
-0.0289764404296875,
-0.0263519287109375,
0.017120361328125,
-0.037841796875,
0.048065185546875,
-0.02154541015625,
-0.02587890625,
-0.03338623046875,
0.0004487037658691406,
0.05120849609375,
-0.025299072265625,
0.0513916015625,
0.06243896484375,
-0.0276336669921875,
-0.01055145263671875,
-0.033721923828125,
-0.0182037353515625,
-0.038421630859375,
0.03851318359375,
-0.0151214599609375,
-0.06134033203125,
0.0197906494140625,
0.007122039794921875,
-0.00952911376953125,
0.041168212890625,
0.0428466796875,
0.01470947265625,
0.07659912109375,
0.053009033203125,
-0.01427459716796875,
0.03521728515625,
-0.047821044921875,
0.016357421875,
-0.05743408203125,
-0.017608642578125,
-0.046112060546875,
-0.01233673095703125,
-0.02984619140625,
-0.036956787109375,
0.0228271484375,
0.0121917724609375,
-0.0560302734375,
0.0550537109375,
-0.0604248046875,
0.0384521484375,
0.056884765625,
0.0037975311279296875,
0.0029621124267578125,
0.01409912109375,
0.00734710693359375,
0.000027954578399658203,
-0.0615234375,
-0.0343017578125,
0.0797119140625,
0.0390625,
0.056884765625,
0.0069732666015625,
0.0283203125,
-0.01032257080078125,
0.019927978515625,
-0.04400634765625,
0.029083251953125,
0.00833892822265625,
-0.06280517578125,
-0.01001739501953125,
-0.0462646484375,
-0.082275390625,
0.0018205642700195312,
0.0097198486328125,
-0.059906005859375,
-0.00861358642578125,
0.0060577392578125,
-0.0225982666015625,
0.029693603515625,
-0.0665283203125,
0.08917236328125,
-0.037322998046875,
-0.0194549560546875,
0.0005030632019042969,
-0.054534912109375,
0.036834716796875,
0.007534027099609375,
-0.0006079673767089844,
0.02288818359375,
0.018829345703125,
0.043487548828125,
-0.03143310546875,
0.06292724609375,
-0.0161895751953125,
0.0003879070281982422,
0.031524658203125,
-0.0169677734375,
0.042572021484375,
0.0214080810546875,
0.00879669189453125,
0.0279998779296875,
-0.00530242919921875,
-0.022186279296875,
-0.035186767578125,
0.0506591796875,
-0.08447265625,
-0.028045654296875,
-0.045654296875,
-0.02117919921875,
0.00634002685546875,
0.012237548828125,
0.02362060546875,
0.01290130615234375,
-0.004856109619140625,
0.0089263916015625,
0.040008544921875,
-0.0246124267578125,
0.034332275390625,
0.01155853271484375,
-0.03765869140625,
-0.040618896484375,
0.056915283203125,
-0.0164642333984375,
0.0030193328857421875,
0.0018854141235351562,
0.00989532470703125,
-0.043212890625,
-0.03472900390625,
-0.053741455078125,
0.03167724609375,
-0.035308837890625,
-0.03167724609375,
-0.0380859375,
-0.037872314453125,
-0.04534912109375,
0.0011968612670898438,
-0.0305938720703125,
-0.016082763671875,
-0.0281219482421875,
-0.01561737060546875,
0.03814697265625,
0.05316162109375,
0.003551483154296875,
0.040283203125,
-0.0643310546875,
0.017852783203125,
0.0268402099609375,
0.038818359375,
0.0041046142578125,
-0.039520263671875,
-0.04010009765625,
0.00771331787109375,
-0.035308837890625,
-0.03790283203125,
0.035491943359375,
-0.019256591796875,
0.0265045166015625,
0.004253387451171875,
-0.0290069580078125,
0.03424072265625,
-0.0386962890625,
0.07666015625,
0.0474853515625,
-0.067138671875,
0.02899169921875,
-0.019073486328125,
0.02978515625,
0.021209716796875,
0.0545654296875,
-0.0171051025390625,
0.0120086669921875,
-0.07696533203125,
-0.06549072265625,
0.06353759765625,
0.0194091796875,
0.017913818359375,
0.0200958251953125,
0.014862060546875,
-0.01035308837890625,
0.03570556640625,
-0.0797119140625,
-0.0020351409912109375,
-0.0404052734375,
-0.0079498291015625,
-0.034027099609375,
-0.005008697509765625,
-0.0211944580078125,
-0.04644775390625,
0.045867919921875,
0.00495147705078125,
0.044708251953125,
0.0036411285400390625,
-0.0153350830078125,
-0.0106048583984375,
0.00266265869140625,
0.044708251953125,
0.0675048828125,
-0.01407623291015625,
-0.006916046142578125,
-0.0080108642578125,
-0.047515869140625,
0.006221771240234375,
0.03729248046875,
-0.01299285888671875,
0.006343841552734375,
0.01436614990234375,
0.07568359375,
0.0261077880859375,
-0.03704833984375,
0.059326171875,
0.006443023681640625,
-0.034423828125,
-0.03900146484375,
0.01175689697265625,
0.02606201171875,
0.0268402099609375,
0.01081085205078125,
0.00574493408203125,
-0.00927734375,
-0.0200042724609375,
0.0236663818359375,
0.0213623046875,
-0.024932861328125,
-0.041717529296875,
0.07244873046875,
-0.0078887939453125,
-0.02362060546875,
0.049163818359375,
-0.0294647216796875,
-0.054412841796875,
0.077880859375,
0.053863525390625,
0.0665283203125,
-0.005893707275390625,
0.02398681640625,
0.04681396484375,
0.035797119140625,
-0.006591796875,
0.0080718994140625,
0.0074462890625,
-0.0226898193359375,
-0.043487548828125,
-0.04205322265625,
-0.01238250732421875,
0.01395416259765625,
-0.0316162109375,
0.0270843505859375,
-0.053436279296875,
-0.01538848876953125,
-0.00870513916015625,
-0.0009069442749023438,
-0.07025146484375,
0.0225830078125,
0.0193939208984375,
0.06768798828125,
-0.042266845703125,
0.06256103515625,
0.052215576171875,
-0.04473876953125,
-0.0758056640625,
-0.00971221923828125,
-0.01502227783203125,
-0.060821533203125,
0.054473876953125,
0.0172119140625,
0.00908660888671875,
0.01068115234375,
-0.0684814453125,
-0.06585693359375,
0.087890625,
0.013885498046875,
-0.05120849609375,
-0.0084075927734375,
0.0007109642028808594,
0.04571533203125,
-0.0196075439453125,
0.0462646484375,
0.019927978515625,
0.04083251953125,
0.0305633544921875,
-0.07659912109375,
0.00931549072265625,
-0.01206207275390625,
-0.0018110275268554688,
0.01715087890625,
-0.051025390625,
0.0677490234375,
-0.01546478271484375,
-0.0013189315795898438,
0.00919342041015625,
0.032073974609375,
0.0137786865234375,
0.02947998046875,
0.0145111083984375,
0.059173583984375,
0.036376953125,
-0.009857177734375,
0.0843505859375,
-0.0309906005859375,
0.068603515625,
0.0712890625,
-0.0025787353515625,
0.041534423828125,
0.01012420654296875,
-0.034027099609375,
0.0237274169921875,
0.0272064208984375,
-0.039337158203125,
0.02740478515625,
0.03338623046875,
0.0028362274169921875,
0.0017986297607421875,
0.0260009765625,
-0.06378173828125,
0.012847900390625,
0.0176849365234375,
-0.03277587890625,
-0.01157379150390625,
-0.00559234619140625,
0.0027141571044921875,
-0.03314208984375,
-0.022491455078125,
0.036346435546875,
-0.00011557340621948242,
-0.032196044921875,
0.07208251953125,
0.00853729248046875,
0.053009033203125,
-0.06768798828125,
0.007022857666015625,
0.00914764404296875,
0.00887298583984375,
-0.021942138671875,
-0.044952392578125,
0.01544952392578125,
0.007175445556640625,
-0.006435394287109375,
-0.0084991455078125,
0.03411865234375,
-0.01557159423828125,
-0.050018310546875,
0.0162353515625,
0.0202178955078125,
0.00878143310546875,
-0.0154266357421875,
-0.07086181640625,
0.01016998291015625,
0.00295257568359375,
-0.0263519287109375,
0.0244293212890625,
0.0232086181640625,
0.017852783203125,
0.04083251953125,
0.052032470703125,
-0.01261138916015625,
0.00098419189453125,
0.0011072158813476562,
0.08331298828125,
-0.06646728515625,
-0.0289459228515625,
-0.051422119140625,
0.058929443359375,
-0.0008749961853027344,
-0.050140380859375,
0.054046630859375,
0.0501708984375,
0.0692138671875,
-0.038604736328125,
0.05035400390625,
-0.0163726806640625,
0.01305389404296875,
-0.0345458984375,
0.04339599609375,
-0.048614501953125,
0.00470733642578125,
-0.0268707275390625,
-0.084228515625,
-0.0313720703125,
0.0506591796875,
-0.0357666015625,
0.03179931640625,
0.0474853515625,
0.07159423828125,
-0.040374755859375,
0.0142669677734375,
0.00958251953125,
0.014801025390625,
0.0246124267578125,
0.05718994140625,
0.06622314453125,
-0.049591064453125,
0.048797607421875,
-0.0144195556640625,
-0.015228271484375,
-0.0274810791015625,
-0.049652099609375,
-0.0631103515625,
-0.02667236328125,
-0.028717041015625,
-0.045501708984375,
0.0023822784423828125,
0.075439453125,
0.0721435546875,
-0.051116943359375,
-0.01111602783203125,
-0.007488250732421875,
0.0018072128295898438,
-0.0146484375,
-0.01490020751953125,
0.034515380859375,
-0.002933502197265625,
-0.0472412109375,
0.0036373138427734375,
-0.0136566162109375,
0.0128936767578125,
-0.0174560546875,
-0.01172637939453125,
-0.006610870361328125,
-0.032958984375,
0.036956787109375,
0.0290069580078125,
-0.0328369140625,
-0.0195159912109375,
-0.003505706787109375,
-0.0219573974609375,
0.0233612060546875,
0.055877685546875,
-0.038604736328125,
0.00029349327087402344,
0.0183563232421875,
0.04339599609375,
0.06671142578125,
0.00910186767578125,
0.01001739501953125,
-0.045135498046875,
0.01544189453125,
0.0294189453125,
0.015899658203125,
0.019561767578125,
-0.02880859375,
0.039703369140625,
0.0192718505859375,
-0.061859130859375,
-0.036834716796875,
0.01395416259765625,
-0.0718994140625,
-0.0235137939453125,
0.095703125,
-0.0010156631469726562,
-0.027862548828125,
0.01531219482421875,
-0.0212554931640625,
0.00621795654296875,
-0.02490234375,
0.04925537109375,
0.031463623046875,
0.0210723876953125,
-0.01094818115234375,
-0.05279541015625,
0.024932861328125,
0.0069427490234375,
-0.04693603515625,
-0.0029392242431640625,
0.042327880859375,
0.033843994140625,
0.031036376953125,
0.0206298828125,
-0.0122528076171875,
0.03070068359375,
0.00685882568359375,
0.042266845703125,
-0.0309295654296875,
-0.03765869140625,
-0.04180908203125,
0.0121002197265625,
-0.00206756591796875,
-0.0161590576171875
]
] |
julien-c/dummy-unknown | 2021-05-20T17:31:14.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"roberta",
"fill-mask",
"ci",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | fill-mask | julien-c | null | null | julien-c/dummy-unknown | 0 | 34,760 | transformers | 2022-03-02T23:29:05 | ---
tags:
- ci
---
## Dummy model used for unit testing and CI
```python
import json
import os
from transformers import RobertaConfig, RobertaForMaskedLM, TFRobertaForMaskedLM
DIRNAME = "./dummy-unknown"
config = RobertaConfig(10, 20, 1, 1, 40)
model = RobertaForMaskedLM(config)
model.save_pretrained(DIRNAME)
tf_model = TFRobertaForMaskedLM.from_pretrained(DIRNAME, from_pt=True)
tf_model.save_pretrained(DIRNAME)
# Tokenizer:
vocab = [
"l",
"o",
"w",
"e",
"r",
"s",
"t",
"i",
"d",
"n",
"\u0120",
"\u0120l",
"\u0120n",
"\u0120lo",
"\u0120low",
"er",
"\u0120lowest",
"\u0120newer",
"\u0120wider",
"<unk>",
]
vocab_tokens = dict(zip(vocab, range(len(vocab))))
merges = ["#version: 0.2", "\u0120 l", "\u0120l o", "\u0120lo w", "e r", ""]
vocab_file = os.path.join(DIRNAME, "vocab.json")
merges_file = os.path.join(DIRNAME, "merges.txt")
with open(vocab_file, "w", encoding="utf-8") as fp:
fp.write(json.dumps(vocab_tokens) + "\n")
with open(merges_file, "w", encoding="utf-8") as fp:
fp.write("\n".join(merges))
```
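Not part of the original card: to illustrate how the toy vocabulary and merge list above tokenize text, here is a minimal sketch of the byte-pair-encoding merge loop. It is simplified — the real `RobertaTokenizer` also applies byte-level pre-tokenization — but uses the same merges, with `"\u0120"` (Ġ) marking a leading space.

```python
# Minimal BPE merge loop over the toy merge list defined above.
merges = ["\u0120 l", "\u0120l o", "\u0120lo w", "e r"]
ranks = {tuple(m.split()): i for i, m in enumerate(merges)}  # pair -> priority

def bpe(word):
    symbols = list(word)
    while len(symbols) > 1:
        # Find the adjacent pair with the lowest (best) merge rank.
        pairs = [(ranks.get((a, b), float("inf")), i)
                 for i, (a, b) in enumerate(zip(symbols, symbols[1:]))]
        rank, i = min(pairs)
        if rank == float("inf"):
            break  # no applicable merges remain
        symbols[i:i + 2] = [symbols[i] + symbols[i + 1]]
    return symbols

print(bpe("\u0120lower"))  # ['Ġlow', 'er'] — both tokens are in the vocab
```

"Ġlower" first merges into "Ġl", "Ġlo", "Ġlow", then "e r" merges into "er", matching the `\u0120low` and `er` entries of the vocabulary.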
| 1,114 | [
[
-0.01258087158203125,
-0.052154541015625,
0.01959228515625,
0.018829345703125,
-0.02178955078125,
0.013275146484375,
-0.022064208984375,
0.00616455078125,
0.002483367919921875,
0.025177001953125,
-0.032135009765625,
-0.038726806640625,
-0.030181884765625,
0.0222930908203125,
-0.043792724609375,
0.07818603515625,
-0.0251617431640625,
-0.0013360977172851562,
0.00968170166015625,
-0.01241302490234375,
-0.017822265625,
-0.061798095703125,
-0.035919189453125,
-0.00455474853515625,
0.0170745849609375,
0.01160430908203125,
0.04339599609375,
0.010528564453125,
0.0242156982421875,
0.0307159423828125,
-0.009124755859375,
0.0052490234375,
-0.01593017578125,
-0.006267547607421875,
-0.0008220672607421875,
-0.0360107421875,
-0.03411865234375,
-0.0133056640625,
0.04852294921875,
0.041900634765625,
0.0013227462768554688,
0.0222320556640625,
-0.00948333740234375,
0.02850341796875,
-0.043243408203125,
0.00360870361328125,
-0.037567138671875,
-0.00380706787109375,
0.00186920166015625,
-0.00909423828125,
-0.0179290771484375,
-0.014373779296875,
-0.0026226043701171875,
-0.05316162109375,
0.0328369140625,
0.008758544921875,
0.10595703125,
0.01470184326171875,
-0.0352783203125,
-0.007022857666015625,
-0.037841796875,
0.07183837890625,
-0.06585693359375,
0.0298309326171875,
0.00864410400390625,
-0.00589752197265625,
-0.026885986328125,
-0.061798095703125,
-0.0772705078125,
-0.01470184326171875,
-0.02154541015625,
0.00022733211517333984,
-0.005199432373046875,
0.00896453857421875,
0.029571533203125,
0.026702880859375,
-0.0501708984375,
-0.018646240234375,
-0.05438232421875,
-0.0142059326171875,
0.048614501953125,
0.0126495361328125,
0.022705078125,
-0.0258026123046875,
-0.0307159423828125,
-0.0421142578125,
-0.0114288330078125,
0.0185394287109375,
0.034942626953125,
0.035308837890625,
-0.0233612060546875,
0.05810546875,
-0.0224456787109375,
0.037261962890625,
0.001049041748046875,
-0.0144805908203125,
0.042388916015625,
-0.0305023193359375,
-0.036651611328125,
-0.01107025146484375,
0.08306884765625,
0.024932861328125,
0.0295867919921875,
-0.0036373138427734375,
-0.00775146484375,
0.0007004737854003906,
0.02886962890625,
-0.053680419921875,
-0.0219573974609375,
0.0452880859375,
-0.0302581787109375,
-0.0177154541015625,
0.0160980224609375,
-0.045257568359375,
0.009613037109375,
-0.01776123046875,
0.039459228515625,
-0.0477294921875,
-0.038360595703125,
0.010406494140625,
-0.0232086181640625,
0.018310546875,
-0.00963592529296875,
-0.06378173828125,
0.0107269287109375,
0.0377197265625,
0.054168701171875,
0.005893707275390625,
-0.03997802734375,
-0.0175933837890625,
-0.00759124755859375,
-0.026397705078125,
0.057159423828125,
0.0131683349609375,
-0.04498291015625,
0.00801849365234375,
0.022125244140625,
-0.0289306640625,
-0.022918701171875,
0.040679931640625,
-0.012542724609375,
0.0123291015625,
-0.01155853271484375,
-0.0135955810546875,
-0.0186614990234375,
0.033172607421875,
-0.0281982421875,
0.09588623046875,
0.04779052734375,
-0.057861328125,
0.0182037353515625,
-0.043121337890625,
-0.03961181640625,
-0.00372314453125,
-0.01035308837890625,
-0.03839111328125,
0.00400543212890625,
0.007236480712890625,
0.03125,
-0.007732391357421875,
0.01050567626953125,
-0.020233154296875,
-0.045074462890625,
0.03857421875,
-0.0216064453125,
0.07177734375,
0.00665283203125,
-0.0185089111328125,
0.01251983642578125,
-0.046356201171875,
0.0161590576171875,
0.005321502685546875,
-0.03369140625,
-0.029632568359375,
-0.0121917724609375,
0.0338134765625,
-0.01030731201171875,
0.0104522705078125,
-0.03228759765625,
0.044647216796875,
-0.0301361083984375,
0.030426025390625,
0.037567138671875,
-0.0044708251953125,
0.0253143310546875,
-0.0233612060546875,
0.036376953125,
0.0126800537109375,
0.00786590576171875,
0.0106964111328125,
-0.007534027099609375,
-0.08349609375,
-0.023193359375,
0.027618408203125,
0.042327880859375,
-0.06036376953125,
0.053009033203125,
-0.03399658203125,
-0.035430908203125,
-0.048126220703125,
-0.01091766357421875,
0.0185089111328125,
0.0281219482421875,
0.036773681640625,
0.0042877197265625,
-0.059906005859375,
-0.063720703125,
-0.02337646484375,
-0.03497314453125,
-0.0149383544921875,
-0.01214599609375,
0.060028076171875,
-0.03924560546875,
0.0540771484375,
-0.040191650390625,
-0.0243682861328125,
-0.01473236083984375,
0.0240631103515625,
0.04437255859375,
0.05303955078125,
0.03863525390625,
-0.04656982421875,
-0.022735595703125,
-0.00864410400390625,
-0.04071044921875,
0.005321502685546875,
-0.0352783203125,
-0.007068634033203125,
0.0264892578125,
0.0250396728515625,
-0.035430908203125,
0.036651611328125,
0.035888671875,
-0.042724609375,
0.056121826171875,
-0.01544952392578125,
0.034210205078125,
-0.101318359375,
0.00775146484375,
-0.0173797607421875,
0.004383087158203125,
-0.032867431640625,
0.01512908935546875,
0.0191650390625,
0.0186920166015625,
-0.04107666015625,
0.03607177734375,
-0.0457763671875,
0.012786865234375,
0.006244659423828125,
-0.01097869873046875,
0.0008063316345214844,
0.031982421875,
-0.0118255615234375,
0.051116943359375,
0.042572021484375,
-0.04742431640625,
0.01336669921875,
0.0209503173828125,
-0.004772186279296875,
0.03277587890625,
-0.042999267578125,
0.004657745361328125,
0.0228118896484375,
0.004489898681640625,
-0.06500244140625,
-0.0080108642578125,
0.016815185546875,
-0.041229248046875,
0.01439666748046875,
-0.02777099609375,
-0.0543212890625,
-0.04473876953125,
-0.01959228515625,
0.03460693359375,
0.034759521484375,
-0.0400390625,
0.05487060546875,
0.00020265579223632812,
0.01593017578125,
-0.035980224609375,
-0.0579833984375,
-0.0131072998046875,
-0.0182647705078125,
-0.03656005859375,
0.00804901123046875,
-0.022674560546875,
-0.004817962646484375,
-0.002193450927734375,
-0.0086517333984375,
-0.0196990966796875,
0.015869140625,
0.0243377685546875,
0.0391845703125,
-0.01309967041015625,
-0.0036029815673828125,
0.00824737548828125,
-0.0006284713745117188,
0.003955841064453125,
-0.00283050537109375,
0.056060791015625,
-0.034912109375,
-0.029541015625,
-0.03497314453125,
0.026611328125,
0.040283203125,
-0.004688262939453125,
0.0538330078125,
0.0601806640625,
-0.0170135498046875,
-0.01317596435546875,
-0.0252532958984375,
0.00551605224609375,
-0.037322998046875,
0.032806396484375,
-0.04913330078125,
-0.051788330078125,
0.0498046875,
0.01114654541015625,
0.010009765625,
0.0513916015625,
0.031402587890625,
-0.0143585205078125,
0.057342529296875,
0.0225677490234375,
-0.0004341602325439453,
0.01006317138671875,
-0.06536865234375,
0.037200927734375,
-0.07720947265625,
-0.01192474365234375,
-0.047698974609375,
0.00009757280349731445,
-0.054107666015625,
-0.0248870849609375,
0.01485443115234375,
0.0380859375,
-0.0230560302734375,
0.04168701171875,
-0.05499267578125,
0.017181396484375,
0.033111572265625,
-0.018096923828125,
0.01102447509765625,
-0.00832366943359375,
-0.0168304443359375,
0.0169677734375,
-0.033355712890625,
-0.036956787109375,
0.09259033203125,
0.007099151611328125,
0.047637939453125,
0.007297515869140625,
0.0557861328125,
-0.0025119781494140625,
0.013671875,
-0.06683349609375,
0.031585693359375,
-0.004791259765625,
-0.04412841796875,
-0.00853729248046875,
-0.0098114013671875,
-0.06451416015625,
0.0225677490234375,
-0.009613037109375,
-0.0665283203125,
0.00638580322265625,
-0.0025539398193359375,
-0.0283660888671875,
0.03570556640625,
-0.0550537109375,
0.053955078125,
-0.00965118408203125,
0.0061187744140625,
0.015960693359375,
-0.01271820068359375,
0.042449951171875,
-0.0026950836181640625,
0.0132904052734375,
0.00597381591796875,
0.00494384765625,
0.06170654296875,
-0.0345458984375,
0.033538818359375,
0.0134124755859375,
0.0151214599609375,
0.0158233642578125,
0.00971221923828125,
0.0073394775390625,
0.0211944580078125,
-0.0039005279541015625,
0.041473388671875,
0.04644775390625,
-0.0231475830078125,
-0.033538818359375,
0.04931640625,
-0.06463623046875,
-0.032745361328125,
-0.06939697265625,
-0.044158935546875,
0.0211639404296875,
0.0201568603515625,
0.0270843505859375,
0.0223541259765625,
-0.0020046234130859375,
0.02166748046875,
0.0166168212890625,
-0.0041961669921875,
0.04339599609375,
0.04412841796875,
-0.017242431640625,
-0.03741455078125,
0.03961181640625,
0.01244354248046875,
0.0137176513671875,
0.0130767822265625,
0.004161834716796875,
-0.03363037109375,
-0.03997802734375,
-0.030059814453125,
0.01433563232421875,
-0.03271484375,
-0.0264434814453125,
-0.055023193359375,
-0.0399169921875,
-0.06341552734375,
-0.01241302490234375,
-0.025054931640625,
-0.005290985107421875,
-0.0362548828125,
0.0028285980224609375,
0.04180908203125,
0.031524658203125,
-0.0121612548828125,
0.0391845703125,
-0.059906005859375,
0.033782958984375,
-0.0078887939453125,
0.017547607421875,
-0.0128326416015625,
-0.048187255859375,
-0.0207672119140625,
-0.00042891502380371094,
-0.0283660888671875,
-0.06866455078125,
0.04132080078125,
-0.01024627685546875,
0.03936767578125,
0.034515380859375,
0.036376953125,
0.06964111328125,
-0.009796142578125,
0.08251953125,
0.006008148193359375,
-0.0809326171875,
0.0310821533203125,
-0.0243988037109375,
0.0010328292846679688,
0.042724609375,
0.0128631591796875,
-0.053619384765625,
-0.0295867919921875,
-0.06463623046875,
-0.09381103515625,
0.04736328125,
0.029052734375,
0.0013322830200195312,
-0.006015777587890625,
0.01222991943359375,
-0.01015472412109375,
0.0011053085327148438,
-0.052215576171875,
-0.032196044921875,
-0.02093505859375,
-0.016021728515625,
-0.006816864013671875,
-0.00408172607421875,
-0.021942138671875,
-0.040618896484375,
0.057464599609375,
-0.0156402587890625,
0.0204925537109375,
0.04376220703125,
-0.007022857666015625,
0.010894775390625,
0.0257110595703125,
0.045440673828125,
0.020904541015625,
-0.036529541015625,
0.0017566680908203125,
0.0302734375,
-0.03131103515625,
0.0073394775390625,
0.001140594482421875,
-0.003814697265625,
0.0166473388671875,
0.049774169921875,
0.049591064453125,
0.00725555419921875,
-0.038177490234375,
0.0301666259765625,
-0.0267333984375,
-0.0404052734375,
-0.0718994140625,
0.01354217529296875,
0.00452423095703125,
0.0081787109375,
0.045166015625,
0.003101348876953125,
-0.01206207275390625,
-0.0377197265625,
-0.0015430450439453125,
0.0187530517578125,
-0.004344940185546875,
-0.016265869140625,
0.03387451171875,
-0.005123138427734375,
-0.042327880859375,
0.05853271484375,
-0.0213470458984375,
-0.058319091796875,
0.0472412109375,
0.0281524658203125,
0.071044921875,
0.01105499267578125,
0.00160980224609375,
0.05352783203125,
0.0421142578125,
-0.00618743896484375,
0.04254150390625,
-0.005496978759765625,
-0.045745849609375,
0.00496673583984375,
-0.06884765625,
-0.013641357421875,
0.035003662109375,
-0.0609130859375,
0.0189208984375,
-0.02777099609375,
-0.0139923095703125,
-0.0179901123046875,
0.01058197021484375,
-0.06695556640625,
0.0167083740234375,
-0.0311431884765625,
0.05511474609375,
-0.07891845703125,
0.06396484375,
0.0478515625,
-0.06396484375,
-0.10748291015625,
0.00946044921875,
-0.0204315185546875,
-0.0281982421875,
0.051513671875,
0.00829315185546875,
0.030059814453125,
-0.0007648468017578125,
-0.0148162841796875,
-0.07208251953125,
0.08349609375,
0.00617218017578125,
-0.028594970703125,
0.005741119384765625,
0.015228271484375,
0.044921875,
-0.0163726806640625,
0.047760009765625,
0.05572509765625,
0.0469970703125,
-0.0156707763671875,
-0.06378173828125,
0.00936126708984375,
-0.03009033203125,
0.01013946533203125,
0.0018672943115234375,
-0.038543701171875,
0.10418701171875,
-0.0229644775390625,
-0.0198974609375,
0.027740478515625,
0.05548095703125,
0.03973388671875,
0.00664520263671875,
0.054351806640625,
0.04144287109375,
0.04168701171875,
-0.0261077880859375,
0.07110595703125,
-0.007598876953125,
0.050506591796875,
0.0650634765625,
-0.0036144256591796875,
0.07708740234375,
0.05621337890625,
-0.0236968994140625,
0.0621337890625,
0.050628662109375,
-0.0321044921875,
0.0174560546875,
0.024505615234375,
-0.004673004150390625,
0.0004482269287109375,
0.0140838623046875,
-0.0362548828125,
0.032806396484375,
0.00954437255859375,
-0.018890380859375,
0.004337310791015625,
-0.01013946533203125,
0.0083160400390625,
-0.003570556640625,
0.007167816162109375,
0.031097412109375,
-0.0153350830078125,
-0.042724609375,
0.077392578125,
-0.010833740234375,
0.0631103515625,
-0.0394287109375,
0.0001976490020751953,
-0.01320648193359375,
0.030609130859375,
-0.018798828125,
-0.044219970703125,
0.020904541015625,
-0.0024127960205078125,
-0.0164947509765625,
0.00016772747039794922,
0.0131378173828125,
-0.033294677734375,
-0.062103271484375,
0.044952392578125,
0.0030307769775390625,
0.0269927978515625,
-0.01110076904296875,
-0.043212890625,
0.01290130615234375,
0.0098114013671875,
-0.03131103515625,
0.0117340087890625,
0.029388427734375,
0.0067138671875,
0.0357666015625,
0.07867431640625,
0.019256591796875,
0.013641357421875,
0.0157012939453125,
0.06585693359375,
-0.052947998046875,
-0.0338134765625,
-0.041351318359375,
0.07049560546875,
-0.0089874267578125,
-0.042816162109375,
0.044281005859375,
0.058135986328125,
0.07666015625,
-0.0204010009765625,
0.062744140625,
-0.025146484375,
0.0262298583984375,
-0.03759765625,
0.0791015625,
-0.05230712890625,
-0.0014667510986328125,
-0.0064849853515625,
-0.0489501953125,
0.004398345947265625,
0.045562744140625,
0.00612640380859375,
-0.006000518798828125,
0.09112548828125,
0.0755615234375,
-0.007556915283203125,
-0.03173828125,
0.0213165283203125,
0.025726318359375,
0.0345458984375,
0.0350341796875,
0.04119873046875,
-0.08441162109375,
0.046234130859375,
-0.03497314453125,
-0.00734710693359375,
-0.0161285400390625,
-0.04833984375,
-0.08465576171875,
-0.07183837890625,
-0.0169219970703125,
-0.04473876953125,
-0.0151519775390625,
0.06591796875,
0.048065185546875,
-0.06951904296875,
-0.01436614990234375,
-0.0030670166015625,
0.009979248046875,
-0.018585205078125,
-0.0258026123046875,
0.043975830078125,
-0.0169219970703125,
-0.06683349609375,
0.02630615234375,
-0.0223541259765625,
0.0166473388671875,
-0.01904296875,
-0.00696563720703125,
-0.031524658203125,
-0.00659942626953125,
0.0142669677734375,
0.0243988037109375,
-0.0732421875,
-0.030120849609375,
-0.01357269287109375,
-0.017303466796875,
-0.008636474609375,
0.03973388671875,
-0.0506591796875,
0.0201263427734375,
0.04522705078125,
0.01092529296875,
0.0174102783203125,
-0.00366973876953125,
0.03338623046875,
-0.043975830078125,
0.024505615234375,
0.004367828369140625,
0.042205810546875,
0.0203094482421875,
-0.0248260498046875,
0.039794921875,
0.0299530029296875,
-0.0316162109375,
-0.08319091796875,
-0.0223846435546875,
-0.09820556640625,
-0.0155487060546875,
0.093994140625,
-0.014923095703125,
-0.037841796875,
-0.00879669189453125,
-0.0270843505859375,
0.06121826171875,
-0.03326416015625,
0.05804443359375,
0.0304412841796875,
0.0093536376953125,
0.0170440673828125,
-0.025421142578125,
0.0229949951171875,
0.0255279541015625,
-0.051605224609375,
-0.005035400390625,
-0.00547027587890625,
0.04840087890625,
0.034088134765625,
0.036285400390625,
0.01036834716796875,
0.006877899169921875,
0.01044464111328125,
0.041961669921875,
-0.0216064453125,
-0.001415252685546875,
-0.022705078125,
-0.00450897216796875,
-0.0194854736328125,
-0.038299560546875
]
] |
lllyasviel/sd-controlnet-canny | 2023-05-01T19:33:49.000Z | [
"diffusers",
"art",
"controlnet",
"stable-diffusion",
"image-to-image",
"arxiv:2302.05543",
"license:openrail",
"has_space",
"diffusers:ControlNetModel",
"region:us"
] | image-to-image | lllyasviel | null | null | lllyasviel/sd-controlnet-canny | 104 | 34,732 | diffusers | 2023-02-24T06:55:23 | ---
license: openrail
base_model: runwayml/stable-diffusion-v1-5
tags:
- art
- controlnet
- stable-diffusion
- image-to-image
widget:
- src: https://huggingface.co/datasets/mishig/sample_images/resolve/main/canny-edge.jpg
prompt: Girl with Pearl Earring
---
# Controlnet - *Canny Version*
ControlNet is a neural network structure to control diffusion models by adding extra conditions.
This checkpoint corresponds to the ControlNet conditioned on **Canny edges**.
It can be used in combination with [Stable Diffusion](https://huggingface.co/docs/diffusers/api/pipelines/stable_diffusion/text2img).

## Model Details
- **Developed by:** Lvmin Zhang, Maneesh Agrawala
- **Model type:** Diffusion-based text-to-image generation model
- **Language(s):** English
- **License:** [The CreativeML OpenRAIL M license](https://huggingface.co/spaces/CompVis/stable-diffusion-license) is an [Open RAIL M license](https://www.licenses.ai/blog/2022/8/18/naming-convention-of-responsible-ai-licenses), adapted from the work that [BigScience](https://bigscience.huggingface.co/) and [the RAIL Initiative](https://www.licenses.ai/) are jointly carrying in the area of responsible AI licensing. See also [the article about the BLOOM Open RAIL license](https://bigscience.huggingface.co/blog/the-bigscience-rail-license) on which our license is based.
- **Resources for more information:** [GitHub Repository](https://github.com/lllyasviel/ControlNet), [Paper](https://arxiv.org/abs/2302.05543).
- **Cite as:**
@misc{zhang2023adding,
title={Adding Conditional Control to Text-to-Image Diffusion Models},
author={Lvmin Zhang and Maneesh Agrawala},
year={2023},
eprint={2302.05543},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
## Introduction
Controlnet was proposed in [*Adding Conditional Control to Text-to-Image Diffusion Models*](https://arxiv.org/abs/2302.05543) by
Lvmin Zhang, Maneesh Agrawala.
The abstract reads as follows:
*We present a neural network structure, ControlNet, to control pretrained large diffusion models to support additional input conditions.
The ControlNet learns task-specific conditions in an end-to-end way, and the learning is robust even when the training dataset is small (< 50k).
Moreover, training a ControlNet is as fast as fine-tuning a diffusion model, and the model can be trained on a personal device.
Alternatively, if powerful computation clusters are available, the model can scale to large amounts (millions to billions) of data.
We report that large diffusion models like Stable Diffusion can be augmented with ControlNets to enable conditional inputs like edge maps, segmentation maps, keypoints, etc.
This may enrich the methods to control large diffusion models and further facilitate related applications.*
## Released Checkpoints
The authors released 8 different checkpoints, each trained with [Stable Diffusion v1-5](https://huggingface.co/runwayml/stable-diffusion-v1-5)
on a different type of conditioning:
| Model Name | Control Image Overview| Control Image Example | Generated Image Example |
|---|---|---|---|
|[lllyasviel/sd-controlnet-canny](https://huggingface.co/lllyasviel/sd-controlnet-canny)<br/> *Trained with canny edge detection* | A monochrome image with white edges on a black background.|<a href="https://huggingface.co/takuma104/controlnet_dev/blob/main/gen_compare/control_images/converted/control_bird_canny.png"><img width="64" style="margin:0;padding:0;" src="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/control_images/converted/control_bird_canny.png"/></a>|<a href="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/output_images/diffusers/output_bird_canny_1.png"><img width="64" src="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/output_images/diffusers/output_bird_canny_1.png"/></a>|
|[lllyasviel/sd-controlnet-depth](https://huggingface.co/lllyasviel/sd-controlnet-depth)<br/> *Trained with Midas depth estimation* |A grayscale image with black representing deep areas and white representing shallow areas.|<a href="https://huggingface.co/takuma104/controlnet_dev/blob/main/gen_compare/control_images/converted/control_vermeer_depth.png"><img width="64" src="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/control_images/converted/control_vermeer_depth.png"/></a>|<a href="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/output_images/diffusers/output_vermeer_depth_2.png"><img width="64" src="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/output_images/diffusers/output_vermeer_depth_2.png"/></a>|
|[lllyasviel/sd-controlnet-hed](https://huggingface.co/lllyasviel/sd-controlnet-hed)<br/> *Trained with HED edge detection (soft edge)* |A monochrome image with white soft edges on a black background.|<a href="https://huggingface.co/takuma104/controlnet_dev/blob/main/gen_compare/control_images/converted/control_bird_hed.png"><img width="64" src="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/control_images/converted/control_bird_hed.png"/></a>|<a href="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/output_images/diffusers/output_bird_hed_1.png"><img width="64" src="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/output_images/diffusers/output_bird_hed_1.png"/></a> |
|[lllyasviel/sd-controlnet-mlsd](https://huggingface.co/lllyasviel/sd-controlnet-mlsd)<br/> *Trained with M-LSD line detection* |A monochrome image composed only of white straight lines on a black background.|<a href="https://huggingface.co/takuma104/controlnet_dev/blob/main/gen_compare/control_images/converted/control_room_mlsd.png"><img width="64" src="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/control_images/converted/control_room_mlsd.png"/></a>|<a href="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/output_images/diffusers/output_room_mlsd_0.png"><img width="64" src="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/output_images/diffusers/output_room_mlsd_0.png"/></a>|
|[lllyasviel/sd-controlnet-normal](https://huggingface.co/lllyasviel/sd-controlnet-normal)<br/> *Trained with normal map* |A [normal mapped](https://en.wikipedia.org/wiki/Normal_mapping) image.|<a href="https://huggingface.co/takuma104/controlnet_dev/blob/main/gen_compare/control_images/converted/control_human_normal.png"><img width="64" src="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/control_images/converted/control_human_normal.png"/></a>|<a href="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/output_images/diffusers/output_human_normal_1.png"><img width="64" src="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/output_images/diffusers/output_human_normal_1.png"/></a>|
|[lllyasviel/sd-controlnet-openpose](https://huggingface.co/lllyasviel/sd-controlnet-openpose)<br/> *Trained with OpenPose bone image* |An [OpenPose bone](https://github.com/CMU-Perceptual-Computing-Lab/openpose) image.|<a href="https://huggingface.co/takuma104/controlnet_dev/blob/main/gen_compare/control_images/converted/control_human_openpose.png"><img width="64" src="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/control_images/converted/control_human_openpose.png"/></a>|<a href="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/output_images/diffusers/output_human_openpose_0.png"><img width="64" src="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/output_images/diffusers/output_human_openpose_0.png"/></a>|
|[lllyasviel/sd-controlnet-scribble](https://huggingface.co/lllyasviel/sd-controlnet-scribble)<br/> *Trained with human scribbles* |A hand-drawn monochrome image with white outlines on a black background.|<a href="https://huggingface.co/takuma104/controlnet_dev/blob/main/gen_compare/control_images/converted/control_vermeer_scribble.png"><img width="64" src="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/control_images/converted/control_vermeer_scribble.png"/></a>|<a href="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/output_images/diffusers/output_vermeer_scribble_0.png"><img width="64" src="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/output_images/diffusers/output_vermeer_scribble_0.png"/></a> |
|[lllyasviel/sd-controlnet-seg](https://huggingface.co/lllyasviel/sd-controlnet-seg)<br/>*Trained with semantic segmentation* |An image following the [ADE20K](https://groups.csail.mit.edu/vision/datasets/ADE20K/) segmentation protocol.|<a href="https://huggingface.co/takuma104/controlnet_dev/blob/main/gen_compare/control_images/converted/control_room_seg.png"><img width="64" src="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/control_images/converted/control_room_seg.png"/></a>|<a href="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/output_images/diffusers/output_room_seg_1.png"><img width="64" src="https://huggingface.co/takuma104/controlnet_dev/resolve/main/gen_compare/output_images/diffusers/output_room_seg_1.png"/></a> |
## Example
It is recommended to use this checkpoint with [Stable Diffusion v1-5](https://huggingface.co/runwayml/stable-diffusion-v1-5), as the checkpoint
has been trained on it.
Experimentally, the checkpoint can also be used with other diffusion models, such as DreamBoothed Stable Diffusion.
**Note**: If you want to process an image to create the auxiliary conditioning, external dependencies are required as shown below:
1. Install OpenCV:
```sh
$ pip install opencv-contrib-python
```
2. Let's install `diffusers` and related packages:
```sh
$ pip install diffusers transformers accelerate
```
3. Run the code:
```python
import cv2
import numpy as np
import torch
from PIL import Image
from diffusers import StableDiffusionControlNetPipeline, ControlNetModel, UniPCMultistepScheduler
from diffusers.utils import load_image

image = load_image("https://huggingface.co/lllyasviel/sd-controlnet-hed/resolve/main/images/bird.png")
image = np.array(image)

# Apply Canny edge detection to produce the conditioning image
low_threshold = 100
high_threshold = 200
image = cv2.Canny(image, low_threshold, high_threshold)

# Replicate the single-channel edge map into a 3-channel image
image = image[:, :, None]
image = np.concatenate([image, image, image], axis=2)
image = Image.fromarray(image)

controlnet = ControlNetModel.from_pretrained(
    "lllyasviel/sd-controlnet-canny", torch_dtype=torch.float16
)
pipe = StableDiffusionControlNetPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", controlnet=controlnet, safety_checker=None, torch_dtype=torch.float16
)
pipe.scheduler = UniPCMultistepScheduler.from_config(pipe.scheduler.config)

# Remove if you do not have xformers installed
# see https://huggingface.co/docs/diffusers/v0.13.0/en/optimization/xformers#installing-xformers
# for installation instructions
pipe.enable_xformers_memory_efficient_attention()

pipe.enable_model_cpu_offload()

image = pipe("bird", image, num_inference_steps=20).images[0]
image.save('images/bird_canny_out.png')
```



### Training
The Canny edge model was trained on 3M edge-image/caption pairs. The model was trained for 600 GPU-hours on Nvidia A100 80GB GPUs, using Stable Diffusion 1.5 as the base model.
### Blog post
For more information, please also have a look at the [official ControlNet Blog Post](https://huggingface.co/blog/controlnet). | 11,661 | [
[ …768 embedding values elided… ]
] |
sentence-transformers/distilbert-base-nli-stsb-mean-tokens | 2022-06-15T20:07:20.000Z | [
"sentence-transformers",
"pytorch",
"tf",
"distilbert",
"feature-extraction",
"sentence-similarity",
"transformers",
"arxiv:1908.10084",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] | sentence-similarity | sentence-transformers | null | null | sentence-transformers/distilbert-base-nli-stsb-mean-tokens | 10 | 34,646 | sentence-transformers | 2022-03-02T23:29:05 | ---
pipeline_tag: sentence-similarity
license: apache-2.0
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- transformers
---
**⚠️ This model is deprecated. Please don't use it as it produces sentence embeddings of low quality. You can find recommended sentence embedding models here: [SBERT.net - Pretrained Models](https://www.sbert.net/docs/pretrained_models.html)**
# sentence-transformers/distilbert-base-nli-stsb-mean-tokens
This is a [sentence-transformers](https://www.SBERT.net) model: it maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for tasks like clustering or semantic search.
## Usage (Sentence-Transformers)
Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:
```sh
pip install -U sentence-transformers
```
Then you can use the model like this:
```python
from sentence_transformers import SentenceTransformer
sentences = ["This is an example sentence", "Each sentence is converted"]
model = SentenceTransformer('sentence-transformers/distilbert-base-nli-stsb-mean-tokens')
embeddings = model.encode(sentences)
print(embeddings)
```
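The returned `embeddings` are plain dense vectors, so downstream tasks such as clustering or semantic search reduce to vector comparisons. As a minimal sketch of that comparison (the toy 4-dimensional vectors below are illustrative stand-ins, not real model output), cosine similarity can be computed with no extra dependencies:

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two equal-length dense vectors
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Illustrative stand-ins for the model's 768-dimensional sentence embeddings
emb_a = [0.1, 0.3, -0.2, 0.7]
emb_b = [0.1, 0.3, -0.2, 0.7]
emb_c = [-0.7, 0.2, 0.3, -0.1]

print(round(cosine_similarity(emb_a, emb_b), 6))  # identical vectors score 1.0
print(cosine_similarity(emb_a, emb_c))            # dissimilar vectors score much lower
```

In practice, `sentence_transformers.util.cos_sim` performs the same computation in batch over the encoded matrix.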
## Usage (HuggingFace Transformers)
Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: first, pass your input through the transformer model, then apply the right pooling operation on top of the contextualized word embeddings.
```python
from transformers import AutoTokenizer, AutoModel
import torch
#Mean Pooling - Take attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
token_embeddings = model_output[0] #First element of model_output contains all token embeddings
input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)
# Sentences we want sentence embeddings for
sentences = ['This is an example sentence', 'Each sentence is converted']
# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('sentence-transformers/distilbert-base-nli-stsb-mean-tokens')
model = AutoModel.from_pretrained('sentence-transformers/distilbert-base-nli-stsb-mean-tokens')
# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# Compute token embeddings
with torch.no_grad():
model_output = model(**encoded_input)
# Perform pooling. In this case, mean pooling.
sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask'])
print("Sentence embeddings:")
print(sentence_embeddings)
```
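The key detail in `mean_pooling` above is that padding tokens are excluded from the average via the attention mask. A pure-Python sketch of the same idea (the toy 2-dimensional token vectors are illustrative, not real model output):

```python
def mean_pool(token_embeddings, attention_mask):
    # token_embeddings: one vector per token; attention_mask: 1 = real token, 0 = padding
    dim = len(token_embeddings[0])
    total = [0.0] * dim
    count = 0
    for vec, mask in zip(token_embeddings, attention_mask):
        if mask:
            total = [t + v for t, v in zip(total, vec)]
            count += 1
    # Guard against an all-padding sequence, mirroring torch.clamp(..., min=1e-9)
    return [t / max(count, 1) for t in total]

# Two real tokens and one padding token; the padding vector must not dilute the mean
tokens = [[1.0, 2.0], [3.0, 4.0], [99.0, 99.0]]
mask = [1, 1, 0]
print(mean_pool(tokens, mask))  # -> [2.0, 3.0]
```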
## Evaluation Results
For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name=sentence-transformers/distilbert-base-nli-stsb-mean-tokens)
## Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: DistilBertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
)
```
## Citing & Authors
This model was trained by [sentence-transformers](https://www.sbert.net/).
If you find this model helpful, feel free to cite our publication [Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks](https://arxiv.org/abs/1908.10084):
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "http://arxiv.org/abs/1908.10084",
}
``` | 4,010 | [
[
-0.017181396484375,
-0.0592041015625,
0.0193328857421875,
0.03265380859375,
-0.03076171875,
-0.02801513671875,
-0.02001953125,
-0.004207611083984375,
0.015838623046875,
0.0211181640625,
-0.040771484375,
-0.03265380859375,
-0.0589599609375,
0.00952911376953125,
-0.032318115234375,
0.060546875,
-0.01171875,
0.001422882080078125,
-0.019622802734375,
-0.0105133056640625,
-0.0251922607421875,
-0.03472900390625,
-0.0253143310546875,
-0.018310546875,
0.01401519775390625,
0.00656890869140625,
0.03515625,
0.027374267578125,
0.023406982421875,
0.034454345703125,
-0.0098724365234375,
0.01261138916015625,
-0.028961181640625,
-0.01129150390625,
0.0043487548828125,
-0.0208587646484375,
-0.0084686279296875,
0.02752685546875,
0.043670654296875,
0.041900634765625,
-0.01453399658203125,
0.007350921630859375,
0.00284576416015625,
0.0249176025390625,
-0.03607177734375,
0.031707763671875,
-0.04205322265625,
0.014068603515625,
0.00691986083984375,
0.0025691986083984375,
-0.047088623046875,
-0.0111541748046875,
0.0236053466796875,
-0.0311279296875,
0.004878997802734375,
0.0157318115234375,
0.083251953125,
0.03369140625,
-0.0219573974609375,
-0.0272979736328125,
-0.023223876953125,
0.06597900390625,
-0.07183837890625,
0.0179595947265625,
0.0195770263671875,
-0.00516510009765625,
-0.00345611572265625,
-0.07940673828125,
-0.05841064453125,
-0.0111236572265625,
-0.035980224609375,
0.016998291015625,
-0.035186767578125,
0.0021114349365234375,
0.01410675048828125,
0.0236053466796875,
-0.05010986328125,
-0.0055084228515625,
-0.032562255859375,
-0.005603790283203125,
0.03997802734375,
0.0028438568115234375,
0.0254669189453125,
-0.042236328125,
-0.033477783203125,
-0.0217742919921875,
-0.0127716064453125,
-0.01000213623046875,
0.01084136962890625,
0.01180267333984375,
-0.0204925537109375,
0.05645751953125,
0.00074005126953125,
0.04315185546875,
0.0026397705078125,
0.0189666748046875,
0.0535888671875,
-0.0290374755859375,
-0.0270843505859375,
-0.00350189208984375,
0.08404541015625,
0.033538818359375,
0.02764892578125,
-0.00598907470703125,
-0.0106964111328125,
0.005733489990234375,
0.017852783203125,
-0.06402587890625,
-0.026702880859375,
0.012451171875,
-0.0277557373046875,
-0.0247955322265625,
0.0153350830078125,
-0.0460205078125,
0.00007176399230957031,
0.0023517608642578125,
0.05029296875,
-0.044189453125,
-0.002254486083984375,
0.0267486572265625,
-0.02093505859375,
0.00852203369140625,
-0.02313232421875,
-0.054443359375,
0.0169830322265625,
0.017303466796875,
0.0703125,
0.00830078125,
-0.038360595703125,
-0.0167388916015625,
-0.0138397216796875,
0.0028076171875,
0.040924072265625,
-0.0203857421875,
-0.0128936767578125,
0.01055908203125,
0.019439697265625,
-0.040740966796875,
-0.0274810791015625,
0.042633056640625,
-0.0261077880859375,
0.050811767578125,
0.01080322265625,
-0.063720703125,
-0.01213836669921875,
0.0110626220703125,
-0.040679931640625,
0.0819091796875,
0.0173492431640625,
-0.0755615234375,
0.008514404296875,
-0.060455322265625,
-0.0244140625,
-0.0128936767578125,
0.0102386474609375,
-0.05145263671875,
0.01464080810546875,
0.03271484375,
0.05364990234375,
0.0115509033203125,
0.03912353515625,
-0.0167999267578125,
-0.03839111328125,
0.031890869140625,
-0.033966064453125,
0.089599609375,
0.0122222900390625,
-0.026153564453125,
0.00473785400390625,
-0.040771484375,
-0.0110626220703125,
0.0254058837890625,
-0.01036834716796875,
-0.0199127197265625,
-0.00006324052810668945,
0.0234375,
0.0191497802734375,
0.01543426513671875,
-0.053955078125,
0.01068115234375,
-0.046630859375,
0.0745849609375,
0.045166015625,
0.0003883838653564453,
0.040374755859375,
-0.0169525146484375,
0.00807952880859375,
0.0325927734375,
0.0025959014892578125,
-0.01407623291015625,
-0.026885986328125,
-0.07293701171875,
-0.0236053466796875,
0.0247955322265625,
0.038543701171875,
-0.054443359375,
0.08282470703125,
-0.036376953125,
-0.034942626953125,
-0.052978515625,
-0.005130767822265625,
0.005298614501953125,
0.03125,
0.0501708984375,
-0.0039215087890625,
-0.050506591796875,
-0.06890869140625,
0.0011758804321289062,
-0.002910614013671875,
0.007633209228515625,
0.0121307373046875,
0.057037353515625,
-0.036041259765625,
0.07952880859375,
-0.0535888671875,
-0.031036376953125,
-0.03717041015625,
0.02093505859375,
0.023406982421875,
0.04864501953125,
0.04296875,
-0.05291748046875,
-0.0254974365234375,
-0.052337646484375,
-0.052398681640625,
0.0017147064208984375,
-0.01544952392578125,
-0.0116424560546875,
0.0104522705078125,
0.038116455078125,
-0.061279296875,
0.03131103515625,
0.047454833984375,
-0.041534423828125,
0.0235748291015625,
-0.0190582275390625,
-0.015777587890625,
-0.10943603515625,
0.00391387939453125,
0.00991058349609375,
-0.018798828125,
-0.03350830078125,
-0.0007309913635253906,
0.0079345703125,
-0.00687408447265625,
-0.039642333984375,
0.03424072265625,
-0.02960205078125,
0.0157318115234375,
-0.0032253265380859375,
0.0304412841796875,
0.00782012939453125,
0.0555419921875,
-0.0052032470703125,
0.05084228515625,
0.03802490234375,
-0.04364013671875,
0.0232086181640625,
0.04925537109375,
-0.0382080078125,
0.01059722900390625,
-0.068115234375,
-0.0022029876708984375,
-0.0040130615234375,
0.0341796875,
-0.078857421875,
0.0027923583984375,
0.026519775390625,
-0.039520263671875,
0.0185394287109375,
0.0241851806640625,
-0.050262451171875,
-0.04718017578125,
-0.0285491943359375,
0.0142974853515625,
0.04205322265625,
-0.0435791015625,
0.04180908203125,
0.0216827392578125,
0.00004595518112182617,
-0.0457763671875,
-0.08984375,
-0.0002467632293701172,
-0.01403045654296875,
-0.04644775390625,
0.04522705078125,
-0.004787445068359375,
0.01155853271484375,
0.0264892578125,
0.0196380615234375,
0.0004315376281738281,
0.00431060791015625,
0.00432586669921875,
0.021026611328125,
-0.004425048828125,
0.0154266357421875,
0.0152587890625,
-0.01326751708984375,
0.002834320068359375,
-0.0168609619140625,
0.05181884765625,
-0.0146636962890625,
-0.01198577880859375,
-0.03741455078125,
0.01233673095703125,
0.02947998046875,
-0.0215911865234375,
0.08172607421875,
0.07440185546875,
-0.035186767578125,
-0.004787445068359375,
-0.04022216796875,
-0.0213775634765625,
-0.035797119140625,
0.055328369140625,
-0.01029205322265625,
-0.07611083984375,
0.022796630859375,
0.00858306884765625,
0.0026493072509765625,
0.048431396484375,
0.038177490234375,
-0.0100860595703125,
0.0592041015625,
0.043670654296875,
-0.0189208984375,
0.03753662109375,
-0.05047607421875,
0.0261077880859375,
-0.0732421875,
-0.0007457733154296875,
-0.0155029296875,
-0.026275634765625,
-0.053955078125,
-0.033966064453125,
0.00894927978515625,
-0.0006022453308105469,
-0.02301025390625,
0.04388427734375,
-0.04522705078125,
0.01349639892578125,
0.04449462890625,
0.009918212890625,
-0.006610870361328125,
0.002063751220703125,
-0.026580810546875,
-0.004375457763671875,
-0.05010986328125,
-0.04315185546875,
0.0596923828125,
0.03857421875,
0.036224365234375,
-0.00968170166015625,
0.051971435546875,
0.00308990478515625,
0.0032062530517578125,
-0.050201416015625,
0.041412353515625,
-0.03057861328125,
-0.03424072265625,
-0.0233917236328125,
-0.0279388427734375,
-0.061920166015625,
0.0264892578125,
-0.01181793212890625,
-0.0577392578125,
0.0100860595703125,
-0.0169525146484375,
-0.023193359375,
0.0209808349609375,
-0.0616455078125,
0.07952880859375,
0.003665924072265625,
-0.00287628173828125,
-0.00926971435546875,
-0.048492431640625,
0.00936126708984375,
0.020843505859375,
0.0027065277099609375,
-0.0019321441650390625,
0.002452850341796875,
0.06488037109375,
-0.0204620361328125,
0.07830810546875,
-0.0197601318359375,
0.0242462158203125,
0.032958984375,
-0.02947998046875,
0.0199432373046875,
-0.0014190673828125,
-0.005641937255859375,
0.01418304443359375,
-0.01146697998046875,
-0.024322509765625,
-0.040740966796875,
0.051666259765625,
-0.07269287109375,
-0.0291290283203125,
-0.039306640625,
-0.0382080078125,
-0.0012922286987304688,
0.01129913330078125,
0.029571533203125,
0.035186767578125,
-0.018798828125,
0.03204345703125,
0.034515380859375,
-0.02618408203125,
0.0596923828125,
0.008819580078125,
-0.0013666152954101562,
-0.03839111328125,
0.046295166015625,
0.005229949951171875,
-0.0019702911376953125,
0.033050537109375,
0.0155181884765625,
-0.03857421875,
-0.02020263671875,
-0.0292510986328125,
0.03125,
-0.043212890625,
-0.01519775390625,
-0.07781982421875,
-0.041900634765625,
-0.0455322265625,
0.00005066394805908203,
-0.0187225341796875,
-0.032745361328125,
-0.046051025390625,
-0.0289154052734375,
0.02764892578125,
0.035369873046875,
-0.001373291015625,
0.0310821533203125,
-0.053436279296875,
0.006439208984375,
0.00807952880859375,
0.01153564453125,
-0.003078460693359375,
-0.056427001953125,
-0.0282745361328125,
0.003330230712890625,
-0.03057861328125,
-0.06243896484375,
0.047882080078125,
0.0172271728515625,
0.044189453125,
0.01084136962890625,
0.01262664794921875,
0.050537109375,
-0.043121337890625,
0.06793212890625,
0.007511138916015625,
-0.0799560546875,
0.03472900390625,
0.003734588623046875,
0.029296875,
0.040924072265625,
0.027069091796875,
-0.03375244140625,
-0.031463623046875,
-0.051025390625,
-0.07965087890625,
0.049468994140625,
0.0325927734375,
0.04754638671875,
-0.032379150390625,
0.0241241455078125,
-0.0217742919921875,
0.01666259765625,
-0.08837890625,
-0.0302581787109375,
-0.034088134765625,
-0.04833984375,
-0.025054931640625,
-0.026702880859375,
0.0175323486328125,
-0.029937744140625,
0.058502197265625,
0.005615234375,
0.051239013671875,
0.029388427734375,
-0.0419921875,
0.017669677734375,
0.0160980224609375,
0.036285400390625,
0.01479339599609375,
-0.01042938232421875,
0.01256561279296875,
0.022491455078125,
-0.0303955078125,
0.0030574798583984375,
0.03997802734375,
-0.00959014892578125,
0.0171966552734375,
0.0280303955078125,
0.08013916015625,
0.037994384765625,
-0.032989501953125,
0.06005859375,
-0.00800323486328125,
-0.0216064453125,
-0.03692626953125,
-0.01256561279296875,
0.022796630859375,
0.0208282470703125,
0.023040771484375,
-0.00014078617095947266,
0.0020771026611328125,
-0.0240478515625,
0.02459716796875,
0.01503753662109375,
-0.03656005859375,
-0.00844573974609375,
0.0499267578125,
0.0102691650390625,
-0.00922393798828125,
0.07861328125,
-0.026397705078125,
-0.052032470703125,
0.026275634765625,
0.048919677734375,
0.07373046875,
0.002971649169921875,
0.02301025390625,
0.043731689453125,
0.026824951171875,
-0.004268646240234375,
-0.005832672119140625,
0.01317596435546875,
-0.0677490234375,
-0.02313232421875,
-0.05072021484375,
0.01300048828125,
0.00165557861328125,
-0.0426025390625,
0.01953125,
-0.007236480712890625,
-0.0139312744140625,
-0.0165252685546875,
0.0007605552673339844,
-0.048736572265625,
0.007205963134765625,
0.005916595458984375,
0.06256103515625,
-0.07781982421875,
0.055450439453125,
0.050262451171875,
-0.053985595703125,
-0.056121826171875,
-0.00536346435546875,
-0.02630615234375,
-0.05126953125,
0.041259765625,
0.041656494140625,
0.01476287841796875,
0.0207061767578125,
-0.03948974609375,
-0.057342529296875,
0.10333251953125,
0.01898193359375,
-0.034820556640625,
-0.018218994140625,
0.0109710693359375,
0.037078857421875,
-0.03753662109375,
0.032012939453125,
0.0266571044921875,
0.02349853515625,
-0.003662109375,
-0.04833984375,
0.0185546875,
-0.026031494140625,
0.0184173583984375,
-0.0156402587890625,
-0.0394287109375,
0.07086181640625,
-0.00423431396484375,
-0.0201873779296875,
0.0113525390625,
0.06646728515625,
0.0222015380859375,
-0.0008358955383300781,
0.03839111328125,
0.06744384765625,
0.0438232421875,
-0.01128387451171875,
0.06768798828125,
-0.0216522216796875,
0.0521240234375,
0.0762939453125,
0.00298309326171875,
0.0799560546875,
0.0367431640625,
-0.006259918212890625,
0.062286376953125,
0.04144287109375,
-0.02581787109375,
0.0513916015625,
0.0215606689453125,
0.006500244140625,
0.0047760009765625,
0.00876617431640625,
-0.01483917236328125,
0.0394287109375,
0.0118560791015625,
-0.0543212890625,
-0.00035762786865234375,
0.01277923583984375,
0.0035762786865234375,
0.0010519027709960938,
0.0113525390625,
0.04510498046875,
0.01479339599609375,
-0.031097412109375,
0.030517578125,
0.014007568359375,
0.0789794921875,
-0.026885986328125,
0.01018524169921875,
-0.0018892288208007812,
0.025634765625,
0.00045180320739746094,
-0.041351318359375,
0.0299835205078125,
-0.00786590576171875,
-0.0030460357666015625,
-0.0169830322265625,
0.0426025390625,
-0.04559326171875,
-0.0494384765625,
0.0293426513671875,
0.0382080078125,
-0.0008487701416015625,
0.003948211669921875,
-0.07708740234375,
0.00013697147369384766,
-0.002147674560546875,
-0.042724609375,
0.01230621337890625,
0.02496337890625,
0.030975341796875,
0.03924560546875,
0.0293731689453125,
-0.01398468017578125,
0.005222320556640625,
0.0158233642578125,
0.06817626953125,
-0.044677734375,
-0.041107177734375,
-0.07708740234375,
0.061798095703125,
-0.018768310546875,
-0.0240325927734375,
0.0465087890625,
0.04132080078125,
0.06719970703125,
-0.0199432373046875,
0.040679931640625,
-0.01198577880859375,
0.0156402587890625,
-0.040130615234375,
0.06640625,
-0.035491943359375,
-0.004730224609375,
-0.02130126953125,
-0.0716552734375,
-0.021697998046875,
0.08343505859375,
-0.0239715576171875,
0.01282501220703125,
0.0703125,
0.059051513671875,
-0.00836944580078125,
-0.004150390625,
0.00931549072265625,
0.032958984375,
0.015411376953125,
0.036102294921875,
0.040740966796875,
-0.064453125,
0.043914794921875,
-0.04034423828125,
-0.00858306884765625,
-0.0091400146484375,
-0.06597900390625,
-0.07513427734375,
-0.064453125,
-0.0369873046875,
-0.0180816650390625,
-0.00585174560546875,
0.080322265625,
0.047882080078125,
-0.05206298828125,
-0.00473785400390625,
-0.0215911865234375,
-0.0194549560546875,
-0.00934600830078125,
-0.0231475830078125,
0.039031982421875,
-0.037841796875,
-0.06573486328125,
0.0122528076171875,
-0.0068511962890625,
0.01194000244140625,
-0.031494140625,
0.008697509765625,
-0.050628662109375,
0.007781982421875,
0.043365478515625,
-0.026824951171875,
-0.057098388671875,
-0.0200347900390625,
0.006137847900390625,
-0.0260772705078125,
-0.0107421875,
0.023223876953125,
-0.049407958984375,
0.0161285400390625,
0.0229339599609375,
0.04180908203125,
0.04779052734375,
-0.01611328125,
0.031982421875,
-0.06341552734375,
0.0245361328125,
0.01032257080078125,
0.054534912109375,
0.034881591796875,
-0.01983642578125,
0.04254150390625,
0.0176849365234375,
-0.034942626953125,
-0.05194091796875,
-0.0132293701171875,
-0.0760498046875,
-0.0233154296875,
0.08123779296875,
-0.033050537109375,
-0.0271148681640625,
0.0164337158203125,
-0.0135955810546875,
0.03997802734375,
-0.0211944580078125,
0.05328369140625,
0.064453125,
0.01018524169921875,
-0.01898193359375,
-0.022186279296875,
0.0131988525390625,
0.0289764404296875,
-0.039306640625,
-0.01080322265625,
0.020111083984375,
0.01971435546875,
0.0231475830078125,
0.034088134765625,
-0.009368896484375,
-0.0056610107421875,
0.0058746337890625,
0.006992340087890625,
-0.021392822265625,
0.0025482177734375,
-0.0238037109375,
-0.0001023411750793457,
-0.026519775390625,
-0.0306396484375
]
] |
Helsinki-NLP/opus-mt-tc-big-en-ar | 2023-10-10T10:22:24.000Z | [
"transformers",
"pytorch",
"tf",
"safetensors",
"marian",
"text2text-generation",
"translation",
"opus-mt-tc",
"ar",
"en",
"license:cc-by-4.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | translation | Helsinki-NLP | null | null | Helsinki-NLP/opus-mt-tc-big-en-ar | 9 | 34,502 | transformers | 2022-04-13T13:38:54 | ---
language:
- ar
- en
tags:
- translation
- opus-mt-tc
license: cc-by-4.0
model-index:
- name: opus-mt-tc-big-en-ar
results:
- task:
name: Translation eng-ara
type: translation
args: eng-ara
dataset:
name: flores101-devtest
type: flores_101
args: eng ara devtest
metrics:
- name: BLEU
type: bleu
value: 29.4
- task:
name: Translation eng-ara
type: translation
args: eng-ara
dataset:
name: tatoeba-test-v2020-07-28
type: tatoeba_mt
args: eng-ara
metrics:
- name: BLEU
type: bleu
value: 20.0
- task:
name: Translation eng-ara
type: translation
args: eng-ara
dataset:
name: tico19-test
type: tico19-test
args: eng-ara
metrics:
- name: BLEU
type: bleu
value: 30.0
---
# opus-mt-tc-big-en-ar
Neural machine translation model for translating from English (en) to Arabic (ar).
This model is part of the [OPUS-MT project](https://github.com/Helsinki-NLP/Opus-MT), an effort to make neural machine translation models widely available and accessible for many of the world's languages. All models were originally trained with [Marian NMT](https://marian-nmt.github.io/), an efficient NMT framework written in pure C++, and then converted to PyTorch using the Hugging Face transformers library. Training data is taken from [OPUS](https://opus.nlpl.eu/) and training pipelines follow the procedures of [OPUS-MT-train](https://github.com/Helsinki-NLP/Opus-MT-train).
* Publications: [OPUS-MT – Building open translation services for the World](https://aclanthology.org/2020.eamt-1.61/) and [The Tatoeba Translation Challenge – Realistic Data Sets for Low Resource and Multilingual MT](https://aclanthology.org/2020.wmt-1.139/) (Please cite them if you use this model.)
```
@inproceedings{tiedemann-thottingal-2020-opus,
title = "{OPUS}-{MT} {--} Building open translation services for the World",
author = {Tiedemann, J{\"o}rg and Thottingal, Santhosh},
booktitle = "Proceedings of the 22nd Annual Conference of the European Association for Machine Translation",
month = nov,
year = "2020",
address = "Lisboa, Portugal",
publisher = "European Association for Machine Translation",
url = "https://aclanthology.org/2020.eamt-1.61",
pages = "479--480",
}
@inproceedings{tiedemann-2020-tatoeba,
title = "The Tatoeba Translation Challenge {--} Realistic Data Sets for Low Resource and Multilingual {MT}",
author = {Tiedemann, J{\"o}rg},
booktitle = "Proceedings of the Fifth Conference on Machine Translation",
month = nov,
year = "2020",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2020.wmt-1.139",
pages = "1174--1182",
}
```
## Model info
* Release: 2022-02-25
* source language(s): eng
* target language(s): afb ara
* valid target language labels: >>afb<< >>ara<<
* model: transformer-big
* data: opusTCv20210807+bt ([source](https://github.com/Helsinki-NLP/Tatoeba-Challenge))
* tokenization: SentencePiece (spm32k,spm32k)
* original model: [opusTCv20210807+bt_transformer-big_2022-02-25.zip](https://object.pouta.csc.fi/Tatoeba-MT-models/eng-ara/opusTCv20210807+bt_transformer-big_2022-02-25.zip)
* more information on released models: [OPUS-MT eng-ara README](https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/eng-ara/README.md)
* more information about the model: [MarianMT](https://huggingface.co/docs/transformers/model_doc/marian)
This is a multilingual translation model with multiple target languages. A sentence-initial language token is required in the form `>>id<<`, where `id` is a valid target language ID, e.g. `>>afb<<`.
## Usage
A short example code:
```python
from transformers import MarianMTModel, MarianTokenizer
src_text = [
">>ara<< I can't help you because I'm busy.",
">>ara<< I have to write a letter. Do you have some paper?"
]
model_name = "pytorch-models/opus-mt-tc-big-en-ar"
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)
translated = model.generate(**tokenizer(src_text, return_tensors="pt", padding=True))
for t in translated:
    print(tokenizer.decode(t, skip_special_tokens=True))
# expected output:
# لا أستطيع مساعدتك لأنني مشغول.
# يجب أن أكتب رسالة هل لديك بعض الأوراق؟
```
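Since every source sentence must start with the `>>id<<` target-language token, a tiny helper (a hypothetical convenience shown here for illustration, not part of the transformers library) keeps batch preparation tidy:

```python
# Hypothetical helper for illustration only; not part of transformers.
def with_target_token(sentences, lang="ara"):
    """Prepend the required >>id<< target-language token to each sentence."""
    return [f">>{lang}<< {s}" for s in sentences]

batch = with_target_token(["I can't help you because I'm busy.",
                           "Do you have some paper?"])
print(batch[0])  # >>ara<< I can't help you because I'm busy.
```

The resulting list can be passed straight to the tokenizer call shown above.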
You can also use OPUS-MT models with the transformers pipelines, for example:
```python
from transformers import pipeline
pipe = pipeline("translation", model="Helsinki-NLP/opus-mt-tc-big-en-ar")
print(pipe(">>ara<< I can't help you because I'm busy."))
# expected output: لا أستطيع مساعدتك لأنني مشغول.
```
## Benchmarks
* test set translations: [opusTCv20210807+bt_transformer-big_2022-02-25.test.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/eng-ara/opusTCv20210807+bt_transformer-big_2022-02-25.test.txt)
* test set scores: [opusTCv20210807+bt_transformer-big_2022-02-25.eval.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/eng-ara/opusTCv20210807+bt_transformer-big_2022-02-25.eval.txt)
* benchmark results: [benchmark_results.txt](benchmark_results.txt)
* benchmark output: [benchmark_translations.zip](benchmark_translations.zip)
| langpair | testset | chr-F | BLEU | #sent | #words |
|----------|---------|-------|-------|-------|--------|
| eng-ara | tatoeba-test-v2021-08-07 | 0.48813 | 19.8 | 10305 | 61356 |
| eng-ara | flores101-devtest | 0.61154 | 29.4 | 1012 | 21357 |
| eng-ara | tico19-test | 0.60075 | 30.0 | 2100 | 51339 |
## Acknowledgements
The work is supported by the [European Language Grid](https://www.european-language-grid.eu/) as [pilot project 2866](https://live.european-language-grid.eu/catalogue/#/resource/projects/2866), by the [FoTran project](https://www.helsinki.fi/en/researchgroups/natural-language-understanding-with-cross-lingual-grounding), funded by the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation programme (grant agreement No 771113), and the [MeMAD project](https://memad.eu/), funded by the European Union’s Horizon 2020 Research and Innovation Programme under grant agreement No 780069. We are also grateful for the generous computational resources and IT infrastructure provided by [CSC -- IT Center for Science](https://www.csc.fi/), Finland.
## Model conversion info
* transformers version: 4.16.2
* OPUS-MT git hash: 3405783
* port time: Wed Apr 13 16:37:31 EEST 2022
* port machine: LM0-400-22516.local
| 6,590 | [
[
-0.0265960693359375,
-0.042327880859375,
0.0162353515625,
0.0216522216796875,
-0.035919189453125,
-0.0133514404296875,
-0.038970947265625,
-0.0246429443359375,
0.01258087158203125,
0.0289154052734375,
-0.032257080078125,
-0.051300048828125,
-0.053009033203125,
0.028839111328125,
-0.01776123046875,
0.06591796875,
-0.030548095703125,
0.0184173583984375,
0.0259857177734375,
-0.025848388671875,
-0.01263427734375,
-0.033721923828125,
-0.032135009765625,
-0.0246124267578125,
0.0205230712890625,
0.0139617919921875,
0.040008544921875,
0.044708251953125,
0.04327392578125,
0.02606201171875,
-0.0155792236328125,
0.019378662109375,
-0.00791168212890625,
-0.0099639892578125,
0.006160736083984375,
-0.03143310546875,
-0.0406494140625,
-0.00994873046875,
0.06768798828125,
0.042083740234375,
0.006710052490234375,
0.0279693603515625,
0.00540924072265625,
0.04443359375,
-0.0136260986328125,
0.01105499267578125,
-0.039520263671875,
0.006816864013671875,
-0.0246429443359375,
-0.0168609619140625,
-0.042266845703125,
-0.01107025146484375,
0.00106048583984375,
-0.0357666015625,
-0.00016582012176513672,
0.0088653564453125,
0.094970703125,
0.0153656005859375,
-0.0299072265625,
-0.01294708251953125,
-0.0587158203125,
0.07440185546875,
-0.060211181640625,
0.047821044921875,
0.017181396484375,
0.004924774169921875,
-0.00626373291015625,
-0.050384521484375,
-0.046142578125,
-0.0074005126953125,
-0.01500701904296875,
0.022247314453125,
-0.02362060546875,
-0.01175689697265625,
0.015106201171875,
0.04254150390625,
-0.048248291015625,
-0.00013244152069091797,
-0.032440185546875,
-0.014678955078125,
0.029144287109375,
0.000002682209014892578,
0.023223876953125,
-0.026611328125,
-0.031951904296875,
-0.03253173828125,
-0.0521240234375,
0.0093231201171875,
0.0247955322265625,
0.0264892578125,
-0.049713134765625,
0.053497314453125,
0.001377105712890625,
0.0501708984375,
0.0023937225341796875,
-0.0026721954345703125,
0.048828125,
-0.0401611328125,
-0.01273345947265625,
-0.01062774658203125,
0.0911865234375,
0.0139617919921875,
0.007320404052734375,
-0.01474761962890625,
-0.01275634765625,
-0.0102691650390625,
-0.01392364501953125,
-0.06689453125,
0.0169219970703125,
0.0192718505859375,
-0.0340576171875,
-0.0017604827880859375,
-0.00476837158203125,
-0.044342041015625,
0.0130157470703125,
-0.018035888671875,
0.0325927734375,
-0.04986572265625,
-0.035308837890625,
0.0181427001953125,
0.00975799560546875,
0.0231781005859375,
0.0025482177734375,
-0.045379638671875,
-0.001384735107421875,
0.0276336669921875,
0.07037353515625,
-0.00748443603515625,
-0.03857421875,
-0.03570556640625,
-0.0015001296997070312,
-0.019287109375,
0.03399658203125,
-0.0099639892578125,
-0.020660400390625,
-0.007236480712890625,
0.0193023681640625,
-0.0182952880859375,
-0.0196685791015625,
0.0648193359375,
-0.0289459228515625,
0.033905029296875,
-0.0128936767578125,
-0.0234832763671875,
-0.02313232421875,
0.019256591796875,
-0.0367431640625,
0.0821533203125,
0.00981903076171875,
-0.06390380859375,
0.0036754608154296875,
-0.05426025390625,
-0.0243377685546875,
-0.00606536865234375,
0.01042938232421875,
-0.03619384765625,
0.006038665771484375,
0.0246124267578125,
0.029876708984375,
-0.045745849609375,
0.0277557373046875,
0.01216888427734375,
-0.01177215576171875,
0.0142974853515625,
-0.040771484375,
0.078369140625,
0.026580810546875,
-0.040283203125,
0.01190948486328125,
-0.05230712890625,
-0.0027618408203125,
0.0108489990234375,
-0.033233642578125,
-0.00872802734375,
-0.01134490966796875,
0.014434814453125,
0.026702880859375,
0.016693115234375,
-0.048675537109375,
0.0122222900390625,
-0.05267333984375,
0.033447265625,
0.04986572265625,
-0.0210723876953125,
0.0282440185546875,
-0.0183258056640625,
0.036224365234375,
0.020294189453125,
-0.006969451904296875,
-0.014251708984375,
-0.043670654296875,
-0.07318115234375,
-0.02032470703125,
0.045013427734375,
0.04449462890625,
-0.08428955078125,
0.046417236328125,
-0.051513671875,
-0.06048583984375,
-0.058868408203125,
-0.01495361328125,
0.045013427734375,
0.025299072265625,
0.047882080078125,
-0.01491546630859375,
-0.039276123046875,
-0.06256103515625,
-0.021759033203125,
-0.0133056640625,
-0.00261688232421875,
0.007373809814453125,
0.05010986328125,
-0.02557373046875,
0.04248046875,
-0.01256561279296875,
-0.030853271484375,
-0.0290069580078125,
0.01192474365234375,
0.0435791015625,
0.05169677734375,
0.035369873046875,
-0.05810546875,
-0.0517578125,
0.0298309326171875,
-0.0509033203125,
-0.0030193328857421875,
-0.002407073974609375,
-0.01544952392578125,
0.0293121337890625,
0.01505279541015625,
-0.0511474609375,
0.016204833984375,
0.05767822265625,
-0.032745361328125,
0.0341796875,
-0.01340484619140625,
0.0207366943359375,
-0.113525390625,
0.016754150390625,
-0.0085296630859375,
-0.00994873046875,
-0.047027587890625,
0.0097808837890625,
0.004375457763671875,
0.004314422607421875,
-0.052520751953125,
0.051544189453125,
-0.043060302734375,
0.0056915283203125,
0.0162200927734375,
-0.00292205810546875,
-0.003826141357421875,
0.06256103515625,
-0.000010192394256591797,
0.0677490234375,
0.04119873046875,
-0.04302978515625,
0.0152740478515625,
0.03759765625,
-0.025848388671875,
0.01309967041015625,
-0.05767822265625,
0.00412750244140625,
0.01239013671875,
0.007312774658203125,
-0.049346923828125,
0.005466461181640625,
0.038177490234375,
-0.052154541015625,
0.032958984375,
-0.0226898193359375,
-0.058502197265625,
-0.0226287841796875,
-0.006961822509765625,
0.0401611328125,
0.03955078125,
-0.033966064453125,
0.0628662109375,
0.005859375,
-0.0019311904907226562,
-0.046356201171875,
-0.06707763671875,
0.0077056884765625,
-0.0193634033203125,
-0.056793212890625,
0.034637451171875,
-0.0089874267578125,
0.00487518310546875,
-0.00008767843246459961,
0.00998687744140625,
0.003936767578125,
-0.0017385482788085938,
0.0029773712158203125,
0.0096282958984375,
-0.03045654296875,
-0.0018968582153320312,
0.004253387451171875,
-0.0271148681640625,
-0.012481689453125,
-0.03741455078125,
0.0655517578125,
-0.039794921875,
-0.019378662109375,
-0.053070068359375,
0.0212554931640625,
0.05938720703125,
-0.048736572265625,
0.0758056640625,
0.05938720703125,
-0.02099609375,
0.0168914794921875,
-0.03045654296875,
-0.0019130706787109375,
-0.032989501953125,
0.03369140625,
-0.04583740234375,
-0.05230712890625,
0.054046630859375,
0.00829315185546875,
0.01641845703125,
0.06884765625,
0.0660400390625,
0.0218505859375,
0.07464599609375,
0.0276031494140625,
0.0008435249328613281,
0.0222015380859375,
-0.0501708984375,
0.0190887451171875,
-0.0718994140625,
-0.006404876708984375,
-0.0511474609375,
-0.01244354248046875,
-0.060791015625,
-0.03985595703125,
0.028289794921875,
-0.003528594970703125,
-0.00754547119140625,
0.048248291015625,
-0.03594970703125,
0.004974365234375,
0.039337158203125,
-0.0155029296875,
0.0258941650390625,
0.014739990234375,
-0.037078857421875,
-0.0209503173828125,
-0.0457763671875,
-0.04315185546875,
0.083251953125,
0.031402587890625,
0.020172119140625,
0.0205841064453125,
0.042022705078125,
-0.0080108642578125,
0.0234832763671875,
-0.04315185546875,
0.035064697265625,
-0.01806640625,
-0.042327880859375,
-0.0117645263671875,
-0.04541015625,
-0.07305908203125,
0.038848876953125,
-0.00878143310546875,
-0.04656982421875,
0.0101776123046875,
-0.0067596435546875,
-0.004924774169921875,
0.049285888671875,
-0.050079345703125,
0.0709228515625,
-0.014190673828125,
-0.01751708984375,
-0.00156402587890625,
-0.043701171875,
0.013336181640625,
-0.0028095245361328125,
0.0205841064453125,
0.0003516674041748047,
0.005153656005859375,
0.06036376953125,
-0.027008056640625,
0.037933349609375,
0.0001195669174194336,
-0.01558685302734375,
0.0107421875,
0.001667022705078125,
0.038604736328125,
-0.01316070556640625,
-0.0223541259765625,
0.0467529296875,
-0.0008935928344726562,
-0.02679443359375,
-0.0161285400390625,
0.04241943359375,
-0.071044921875,
-0.0292816162109375,
-0.0374755859375,
-0.047637939453125,
0.0104827880859375,
0.03277587890625,
0.049530029296875,
0.0435791015625,
0.005191802978515625,
0.03369140625,
0.03240966796875,
-0.03814697265625,
0.034088134765625,
0.042266845703125,
-0.0117034912109375,
-0.03533935546875,
0.06854248046875,
0.0235748291015625,
0.0220947265625,
0.042449951171875,
0.0108489990234375,
-0.017913818359375,
-0.04827880859375,
-0.06842041015625,
0.035919189453125,
-0.0401611328125,
-0.0183258056640625,
-0.058380126953125,
-0.01068115234375,
-0.026611328125,
0.011444091796875,
-0.045135498046875,
-0.039337158203125,
-0.01380157470703125,
-0.0025539398193359375,
0.0289459228515625,
0.02783203125,
0.002643585205078125,
0.0276031494140625,
-0.0750732421875,
0.017486572265625,
-0.021942138671875,
0.0210418701171875,
-0.0058441162109375,
-0.06280517578125,
-0.038116455078125,
0.0217437744140625,
-0.03277587890625,
-0.064697265625,
0.053375244140625,
0.003467559814453125,
0.0180816650390625,
0.01139068603515625,
0.0078277587890625,
0.042144775390625,
-0.0533447265625,
0.060028076171875,
0.00667572021484375,
-0.0732421875,
0.023956298828125,
-0.0274505615234375,
0.024749755859375,
0.02423095703125,
0.0213775634765625,
-0.0535888671875,
-0.045013427734375,
-0.047393798828125,
-0.073974609375,
0.06982421875,
0.046661376953125,
0.00722503662109375,
0.0013408660888671875,
0.0022602081298828125,
-0.0014743804931640625,
0.00897979736328125,
-0.0770263671875,
-0.041595458984375,
-0.0058746337890625,
-0.0208740234375,
-0.01108551025390625,
-0.0108642578125,
0.0024204254150390625,
-0.0295867919921875,
0.0770263671875,
0.007289886474609375,
0.033355712890625,
0.025909423828125,
-0.025665283203125,
-0.00856781005859375,
0.0244140625,
0.052032470703125,
0.0311126708984375,
-0.006633758544921875,
0.0043182373046875,
0.0254974365234375,
-0.034576416015625,
0.0048675537109375,
0.01506805419921875,
-0.024261474609375,
0.0280914306640625,
0.0272216796875,
0.07440185546875,
0.004528045654296875,
-0.0325927734375,
0.03485107421875,
-0.01024627685546875,
-0.019317626953125,
-0.03228759765625,
-0.033447265625,
0.01190948486328125,
0.006195068359375,
0.0291900634765625,
0.0079345703125,
-0.010467529296875,
-0.0186004638671875,
-0.0030040740966796875,
0.01236724853515625,
-0.0200347900390625,
-0.04150390625,
0.058380126953125,
0.0108642578125,
-0.0250396728515625,
0.044403076171875,
-0.019195556640625,
-0.05938720703125,
0.03741455078125,
0.0330810546875,
0.07781982421875,
-0.0192108154296875,
0.0005350112915039062,
0.0517578125,
0.045013427734375,
-0.0022716522216796875,
0.0125732421875,
-0.0005507469177246094,
-0.047088623046875,
-0.0374755859375,
-0.0628662109375,
-0.0038547515869140625,
0.0029468536376953125,
-0.04339599609375,
0.0260162353515625,
0.004848480224609375,
-0.006805419921875,
-0.01424407958984375,
0.00800323486328125,
-0.05224609375,
0.0006313323974609375,
-0.01099395751953125,
0.061920166015625,
-0.0634765625,
0.07012939453125,
0.048004150390625,
-0.0423583984375,
-0.0701904296875,
-0.008544921875,
-0.0298309326171875,
-0.046783447265625,
0.049072265625,
0.0188140869140625,
-0.003978729248046875,
0.01444244384765625,
-0.0137481689453125,
-0.068603515625,
0.070068359375,
0.034576416015625,
-0.0276641845703125,
-0.0025959014892578125,
0.035430908203125,
0.048126220703125,
-0.012725830078125,
0.024871826171875,
0.029388427734375,
0.053985595703125,
-0.0068511962890625,
-0.08087158203125,
0.0011920928955078125,
-0.048828125,
-0.0008902549743652344,
0.0251617431640625,
-0.047576904296875,
0.0771484375,
0.006591796875,
-0.0235137939453125,
0.020965576171875,
0.056488037109375,
0.0123748779296875,
0.00018668174743652344,
0.02691650390625,
0.05810546875,
0.0284576416015625,
-0.032379150390625,
0.085693359375,
-0.03875732421875,
0.035186767578125,
0.05645751953125,
0.01351165771484375,
0.061920166015625,
0.04327392578125,
-0.0174102783203125,
0.038116455078125,
0.04010009765625,
-0.0028629302978515625,
0.0194549560546875,
-0.00679779052734375,
0.0001232624053955078,
-0.00722503662109375,
-0.0162353515625,
-0.05474853515625,
0.0308685302734375,
0.0214385986328125,
-0.025177001953125,
-0.006748199462890625,
0.0010623931884765625,
0.0170440673828125,
-0.00228118896484375,
-0.0034580230712890625,
0.0360107421875,
0.01263427734375,
-0.057281494140625,
0.08074951171875,
0.0299530029296875,
0.053619384765625,
-0.04119873046875,
0.00792694091796875,
-0.01096343994140625,
0.0268096923828125,
-0.00445556640625,
-0.036956787109375,
0.0228729248046875,
0.0177001953125,
-0.01435089111328125,
-0.03851318359375,
0.00411224365234375,
-0.05059814453125,
-0.0606689453125,
0.033599853515625,
0.038726806640625,
0.0254364013671875,
0.006191253662109375,
-0.05364990234375,
0.006748199462890625,
0.0170135498046875,
-0.039337158203125,
0.003673553466796875,
0.04248046875,
-0.00312042236328125,
0.038726806640625,
0.0528564453125,
0.0177764892578125,
0.0182037353515625,
-0.00768280029296875,
0.051910400390625,
-0.0335693359375,
-0.03338623046875,
-0.0667724609375,
0.05718994140625,
0.01215362548828125,
-0.038787841796875,
0.06573486328125,
0.059967041015625,
0.06927490234375,
-0.01495361328125,
0.037506103515625,
-0.0087432861328125,
0.03704833984375,
-0.042083740234375,
0.0546875,
-0.06256103515625,
0.0180206298828125,
-0.0166473388671875,
-0.07562255859375,
-0.022613525390625,
0.03173828125,
-0.0162200927734375,
-0.005100250244140625,
0.058319091796875,
0.052978515625,
-0.0080108642578125,
-0.027313232421875,
0.01027679443359375,
0.0394287109375,
0.03131103515625,
0.05767822265625,
0.037078857421875,
-0.0701904296875,
0.050262451171875,
-0.029052734375,
-0.0006113052368164062,
-0.004795074462890625,
-0.045562744140625,
-0.0604248046875,
-0.060028076171875,
-0.01287078857421875,
-0.034698486328125,
-0.0167236328125,
0.078369140625,
0.022247314453125,
-0.065673828125,
-0.02777099609375,
-0.004108428955078125,
0.0146636962890625,
-0.0189056396484375,
-0.01287841796875,
0.050048828125,
-0.006191253662109375,
-0.0849609375,
0.0202789306640625,
0.00640106201171875,
0.01090240478515625,
-0.005413055419921875,
-0.0246734619140625,
-0.02972412109375,
-0.01479339599609375,
0.02471923828125,
0.01103973388671875,
-0.06884765625,
-0.0004930496215820312,
0.0180816650390625,
-0.01373291015625,
0.0185699462890625,
0.02471923828125,
-0.02984619140625,
0.0299530029296875,
0.04180908203125,
0.046173095703125,
0.04302978515625,
-0.021942138671875,
0.0457763671875,
-0.049591064453125,
0.038909912109375,
0.0157623291015625,
0.041107177734375,
0.03533935546875,
-0.00510406494140625,
0.042083740234375,
0.01629638671875,
-0.0213470458984375,
-0.0823974609375,
0.0095977783203125,
-0.0673828125,
0.0010595321655273438,
0.08538818359375,
-0.01873779296875,
-0.0242462158203125,
0.01210784912109375,
-0.0170745849609375,
0.044281005859375,
-0.018218994140625,
0.035003662109375,
0.053466796875,
0.039337158203125,
0.0009088516235351562,
-0.04290771484375,
0.0196380615234375,
0.05438232421875,
-0.040435791015625,
-0.004974365234375,
0.004322052001953125,
0.0055084228515625,
0.029388427734375,
0.028656005859375,
-0.017852783203125,
0.0100555419921875,
-0.026092529296875,
0.032958984375,
-0.00908660888671875,
-0.01256561279296875,
-0.027313232421875,
-0.01081085205078125,
-0.007701873779296875,
-0.0134429931640625
]
] |
EleutherAI/polyglot-ko-1.3b | 2023-06-07T05:03:00.000Z | [
"transformers",
"pytorch",
"safetensors",
"gpt_neox",
"text-generation",
"causal-lm",
"ko",
"arxiv:2104.09864",
"arxiv:2204.04541",
"arxiv:2306.02254",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | EleutherAI | null | null | EleutherAI/polyglot-ko-1.3b | 52 | 34,499 | transformers | 2022-09-15T06:10:18 | ---
language:
- ko
tags:
- pytorch
- causal-lm
license: apache-2.0
---
# Polyglot-Ko-1.3B
## Model Description
Polyglot-Ko is a series of large-scale Korean autoregressive language models made by the EleutherAI polyglot team.
| Hyperparameter | Value |
|----------------------|----------------------------------------------------------------------------------------------------------------------------------------|
| \\(n_{parameters}\\) | 1,331,810,304 |
| \\(n_{layers}\\) | 24 |
| \\(d_{model}\\) | 2,048 |
| \\(d_{ff}\\) | 8,192 |
| \\(n_{heads}\\) | 16 |
| \\(d_{head}\\) | 128 |
| \\(n_{ctx}\\) | 2,048 |
| \\(n_{vocab}\\) | 30,003 / 30,080 |
| Positional Encoding | [Rotary Position Embedding (RoPE)](https://arxiv.org/abs/2104.09864) |
| RoPE Dimensions | [64](https://github.com/kingoflolz/mesh-transformer-jax/blob/f2aa66e0925de6593dcbb70e72399b97b4130482/mesh_transformer/layers.py#L223) |
The model consists of 24 transformer layers with a model dimension of 2048 and a feedforward dimension of 8192. The model
dimension is split into 16 heads, each with a dimension of 128. Rotary Position Embedding (RoPE) is applied to 64
dimensions of each head. The model is trained with a tokenization vocabulary of 30,003.
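These hyperparameters can be sanity-checked against the reported parameter count with a quick back-of-the-envelope calculation. The sketch below assumes untied input/output embeddings (as in GPT-NeoX) and ignores biases and layer-norm parameters, which accounts for the small remaining gap:

```python
# Back-of-the-envelope check of the reported parameter count, using the
# hyperparameters from the table above. Assumes untied input/output
# embeddings and ignores biases and layer norms.
n_layers, d_model, d_ff, n_vocab = 24, 2048, 8192, 30_080

embeddings = 2 * n_vocab * d_model   # input embedding + output projection
attention = 4 * d_model * d_model    # Q, K, V and output matrices
feedforward = 2 * d_model * d_ff     # up- and down-projections
estimate = embeddings + n_layers * (attention + feedforward)

print(f"{estimate:,}")  # 1,331,167,232 -- within 0.05% of the reported 1,331,810,304
```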
## Training data
Polyglot-Ko-1.3B was trained on 863 GB of Korean language data (1.2 TB before processing), a large-scale dataset curated by [TUNiB](https://tunib.ai/). The data collection process complied with South Korean law. The dataset was collected specifically for training Polyglot-Ko models and will not be released for public use.
| Source |Size (GB) | Link |
|-------------------------------------|---------|------------------------------------------|
| Korean blog posts | 682.3 | - |
| Korean news dataset | 87.0 | - |
| Modu corpus | 26.4 |corpus.korean.go.kr |
| Korean patent dataset | 19.0 | - |
| Korean Q & A dataset | 18.1 | - |
| KcBert dataset | 12.7 | github.com/Beomi/KcBERT |
| Korean fiction dataset | 6.1 | - |
| Korean online comments | 4.2 | - |
| Korean wikipedia | 1.4 | ko.wikipedia.org |
| Clova call | < 1.0 | github.com/clovaai/ClovaCall |
| Naver sentiment movie corpus | < 1.0 | github.com/e9t/nsmc |
| Korean hate speech dataset | < 1.0 | - |
| Open subtitles | < 1.0 | opus.nlpl.eu/OpenSubtitles.php |
| AIHub various tasks datasets | < 1.0 |aihub.or.kr |
| Standard Korean language dictionary | < 1.0 | stdict.korean.go.kr/main/main.do |
Furthermore, to keep the model from memorizing and generating personally identifiable information (PII) present in the training data, we masked out the following sensitive information during pre-processing:
* `<|acc|>` : bank account number
* `<|rrn|>` : resident registration number
* `<|tell|>` : phone number
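The exact preprocessing pipeline is not public, but the masking idea can be sketched as a regex substitution. The phone-number pattern below is a hypothetical example for illustration, not the one used for Polyglot-Ko:

```python
import re

# Illustrative sketch only: the actual Polyglot-Ko preprocessing pipeline is
# not public, and this phone-number pattern is a hypothetical example, not
# the one used in training.
PHONE_RE = re.compile(r"\b01[016789]-?\d{3,4}-?\d{4}\b")

def mask_phone_numbers(text: str) -> str:
    """Replace Korean mobile numbers with the <|tell|> placeholder token."""
    return PHONE_RE.sub("<|tell|>", text)

print(mask_phone_numbers("문의는 010-1234-5678 로 부탁드립니다."))
# 문의는 <|tell|> 로 부탁드립니다.
```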
## Training procedure
Polyglot-Ko-1.3B was trained on 213 billion tokens over 102,000 steps on 256 A100 GPUs with the [GPT-NeoX framework](https://github.com/EleutherAI/gpt-neox). It was trained as an autoregressive language model, using cross-entropy loss to maximize the likelihood of predicting the next token.
## How to use
This model can be easily loaded using the `AutoModelForCausalLM` class:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/polyglot-ko-1.3b")
model = AutoModelForCausalLM.from_pretrained("EleutherAI/polyglot-ko-1.3b")
```
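Once loaded, the model can be used for text generation. A short sketch follows; the sampling settings here are illustrative assumptions, not tuned recommendations:

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/polyglot-ko-1.3b")
model = AutoModelForCausalLM.from_pretrained("EleutherAI/polyglot-ko-1.3b")
model.eval()

# Prompt the model with a Korean prefix and sample a continuation.
inputs = tokenizer("한국어 언어 모델은", return_tensors="pt")
with torch.no_grad():
    outputs = model.generate(
        **inputs,
        max_new_tokens=32,
        do_sample=True,
        temperature=0.8,
        top_p=0.95,
    )
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```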
## Evaluation results
We evaluate Polyglot-Ko-1.3B on [KOBEST dataset](https://arxiv.org/abs/2204.04541), a benchmark with 5 downstream tasks, against comparable models such as skt/ko-gpt-trinity-1.2B-v0.5, kakaobrain/kogpt and facebook/xglm-7.5B, using the prompts provided in the paper.
The following tables show the results when the number of few-shot examples differ. You can reproduce these results using the [polyglot branch of lm-evaluation-harness](https://github.com/EleutherAI/lm-evaluation-harness/tree/polyglot) and the following scripts. For a fair comparison, all models were run under the same conditions and using the same prompts. In the tables, `n` refers to the number of few-shot examples.
In the case of the WiC dataset, all models show performance close to random chance.
```console
python main.py \
--model gpt2 \
--model_args pretrained='EleutherAI/polyglot-ko-1.3b' \
--tasks kobest_copa,kobest_hellaswag,kobest_boolq,kobest_sentineg,kobest_wic \
--num_fewshot $YOUR_NUM_FEWSHOT \
--batch_size $YOUR_BATCH_SIZE \
--device $YOUR_DEVICE \
   --output_path /path/to/output/
```
### COPA (F1)
| Model | params | 0-shot | 5-shot | 10-shot | 50-shot |
|----------------------------------------------------------------------------------------------|--------|--------|--------|---------|---------|
| [skt/ko-gpt-trinity-1.2B-v0.5](https://huggingface.co/skt/ko-gpt-trinity-1.2B-v0.5) | 1.2B | 0.6696 | 0.6477 | 0.6419 | 0.6514 |
| [kakaobrain/kogpt](https://huggingface.co/kakaobrain/kogpt) | 6.0B | 0.7345 | 0.7287 | 0.7277 | 0.7479 |
| [facebook/xglm-7.5B](https://huggingface.co/facebook/xglm-7.5B) | 7.5B | 0.6723 | 0.6731 | 0.6769 | 0.7119 |
| **[EleutherAI/polyglot-ko-1.3b](https://huggingface.co/EleutherAI/polyglot-ko-1.3b) (this)** | **1.3B** | **0.7196** | **0.7193** | **0.7204** | **0.7206** |
| [EleutherAI/polyglot-ko-3.8b](https://huggingface.co/EleutherAI/polyglot-ko-3.8b) | 3.8B | 0.7595 | 0.7608 | 0.7638 | 0.7788 |
| [EleutherAI/polyglot-ko-5.8b](https://huggingface.co/EleutherAI/polyglot-ko-5.8b) | 5.8B | 0.7745 | 0.7676 | 0.7775 | 0.7887 |
| [EleutherAI/polyglot-ko-12.8b](https://huggingface.co/EleutherAI/polyglot-ko-12.8b) | 12.8B | 0.7937 | 0.8108 | 0.8037 | 0.8369 |
<img src="https://github.com/EleutherAI/polyglot/assets/19511788/d5b49364-aed5-4467-bae2-5a322c8e2ceb" width="800px">
### HellaSwag (F1)
| Model | params | 0-shot | 5-shot | 10-shot | 50-shot |
|----------------------------------------------------------------------------------------------|--------|--------|--------|---------|---------|
| [skt/ko-gpt-trinity-1.2B-v0.5](https://huggingface.co/skt/ko-gpt-trinity-1.2B-v0.5) | 1.2B | 0.5243 | 0.5272 | 0.5166 | 0.5352 |
| [kakaobrain/kogpt](https://huggingface.co/kakaobrain/kogpt) | 6.0B | 0.5590 | 0.5833 | 0.5828 | 0.5907 |
| [facebook/xglm-7.5B](https://huggingface.co/facebook/xglm-7.5B) | 7.5B | 0.5665 | 0.5689 | 0.5565 | 0.5622 |
| **[EleutherAI/polyglot-ko-1.3b](https://huggingface.co/EleutherAI/polyglot-ko-1.3b) (this)** | **1.3B** | **0.5247** | **0.5260** | **0.5278** | **0.5427** |
| [EleutherAI/polyglot-ko-3.8b](https://huggingface.co/EleutherAI/polyglot-ko-3.8b) | 3.8B | 0.5707 | 0.5830 | 0.5670 | 0.5787 |
| [EleutherAI/polyglot-ko-5.8b](https://huggingface.co/EleutherAI/polyglot-ko-5.8b) | 5.8B | 0.5976 | 0.5998 | 0.5979 | 0.6208 |
| [EleutherAI/polyglot-ko-12.8b](https://huggingface.co/EleutherAI/polyglot-ko-12.8b) | 12.8B | 0.5954 | 0.6306 | 0.6098 | 0.6118 |
<img src="https://github.com/EleutherAI/polyglot/assets/19511788/5acb60ac-161a-4ab3-a296-db4442e08b7f" width="800px">
### BoolQ (F1)
| Model | params | 0-shot | 5-shot | 10-shot | 50-shot |
|----------------------------------------------------------------------------------------------|--------|--------|--------|---------|---------|
| [skt/ko-gpt-trinity-1.2B-v0.5](https://huggingface.co/skt/ko-gpt-trinity-1.2B-v0.5) | 1.2B | 0.3356 | 0.4014 | 0.3640 | 0.3560 |
| [kakaobrain/kogpt](https://huggingface.co/kakaobrain/kogpt) | 6.0B | 0.4514 | 0.5981 | 0.5499 | 0.5202 |
| [facebook/xglm-7.5B](https://huggingface.co/facebook/xglm-7.5B) | 7.5B | 0.4464 | 0.3324 | 0.3324 | 0.3324 |
| **[EleutherAI/polyglot-ko-1.3b](https://huggingface.co/EleutherAI/polyglot-ko-1.3b) (this)** | **1.3B** | **0.3552** | **0.4751** | **0.4109** | **0.4038** |
| [EleutherAI/polyglot-ko-3.8b](https://huggingface.co/EleutherAI/polyglot-ko-3.8b) | 3.8B | 0.4320 | 0.5263 | 0.4930 | 0.4038 |
| [EleutherAI/polyglot-ko-5.8b](https://huggingface.co/EleutherAI/polyglot-ko-5.8b) | 5.8B | 0.4356 | 0.5698 | 0.5187 | 0.5236 |
| [EleutherAI/polyglot-ko-12.8b](https://huggingface.co/EleutherAI/polyglot-ko-12.8b) | 12.8B | 0.4818 | 0.6041 | 0.6289 | 0.6448 |
<img src="https://github.com/EleutherAI/polyglot/assets/19511788/b74c23c0-01f3-4b68-9e10-a48e9aa052ab" width="800px">
### SentiNeg (F1)
| Model | params | 0-shot | 5-shot | 10-shot | 50-shot |
|----------------------------------------------------------------------------------------------|--------|--------|--------|---------|---------|
| [skt/ko-gpt-trinity-1.2B-v0.5](https://huggingface.co/skt/ko-gpt-trinity-1.2B-v0.5) | 1.2B | 0.6065 | 0.6878 | 0.7280 | 0.8413 |
| [kakaobrain/kogpt](https://huggingface.co/kakaobrain/kogpt) | 6.0B | 0.3747 | 0.8942 | 0.9294 | 0.9698 |
| [facebook/xglm-7.5B](https://huggingface.co/facebook/xglm-7.5B) | 7.5B | 0.3578 | 0.4471 | 0.3964 | 0.5271 |
| **[EleutherAI/polyglot-ko-1.3b](https://huggingface.co/EleutherAI/polyglot-ko-1.3b) (this)** | **1.3B** | **0.6790** | **0.6257** | **0.5514** | **0.7851** |
| [EleutherAI/polyglot-ko-3.8b](https://huggingface.co/EleutherAI/polyglot-ko-3.8b) | 3.8B | 0.4858 | 0.7950 | 0.7320 | 0.7851 |
| [EleutherAI/polyglot-ko-5.8b](https://huggingface.co/EleutherAI/polyglot-ko-5.8b) | 5.8B | 0.3394 | 0.8841 | 0.8808 | 0.9521 |
| [EleutherAI/polyglot-ko-12.8b](https://huggingface.co/EleutherAI/polyglot-ko-12.8b) | 12.8B | 0.9117 | 0.9015 | 0.9345 | 0.9723 |
<img src="https://github.com/EleutherAI/polyglot/assets/19511788/95b56b19-d349-4b70-9ff9-94a5560f89ee" width="800px">
### WiC (F1)
| Model | params | 0-shot | 5-shot | 10-shot | 50-shot |
|----------------------------------------------------------------------------------------------|--------|--------|--------|---------|---------|
| [skt/ko-gpt-trinity-1.2B-v0.5](https://huggingface.co/skt/ko-gpt-trinity-1.2B-v0.5) | 1.2B | 0.3290 | 0.4313 | 0.4001 | 0.3621 |
| [kakaobrain/kogpt](https://huggingface.co/kakaobrain/kogpt) | 6.0B | 0.3526 | 0.4775 | 0.4358 | 0.4061 |
| [facebook/xglm-7.5B](https://huggingface.co/facebook/xglm-7.5B) | 7.5B | 0.3280 | 0.4903 | 0.4945 | 0.3656 |
| **[EleutherAI/polyglot-ko-1.3b](https://huggingface.co/EleutherAI/polyglot-ko-1.3b) (this)** | **1.3B** | **0.3297** | **0.4850** | **0.4650** | **0.3290** |
| [EleutherAI/polyglot-ko-3.8b](https://huggingface.co/EleutherAI/polyglot-ko-3.8b) | 3.8B | 0.3390 | 0.4944 | 0.4203 | 0.3835 |
| [EleutherAI/polyglot-ko-5.8b](https://huggingface.co/EleutherAI/polyglot-ko-5.8b) | 5.8B | 0.3913 | 0.4688 | 0.4189 | 0.3910 |
| [EleutherAI/polyglot-ko-12.8b](https://huggingface.co/EleutherAI/polyglot-ko-12.8b) | 12.8B | 0.3985 | 0.3683 | 0.3307 | 0.3273 |
<img src="https://github.com/EleutherAI/polyglot/assets/19511788/4de4a4c3-d7ac-4e04-8b0c-0d533fe88294" width="800px">
## Limitations and Biases
Polyglot-Ko has been trained to optimize next-token prediction. Language models such as this are often used for a wide variety of tasks, and it is important to be aware of possible unexpected outcomes. For instance, Polyglot-Ko will not always return the most factual or accurate response, but rather the most statistically likely one. In addition, Polyglot-Ko may produce socially unacceptable or offensive content. We recommend having a human curator or other filtering mechanism censor sensitive content.
## Citation and Related Information
### BibTeX entry
If you find our work useful, please consider citing:
```bibtex
@misc{ko2023technical,
title={A Technical Report for Polyglot-Ko: Open-Source Large-Scale Korean Language Models},
  author={Hyunwoong Ko and Kichang Yang and Minho Ryu and Taekyoon Choi and Seungmu Yang and Jiwung Hyun and Sungho Park},
year={2023},
eprint={2306.02254},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
### Licensing
All our models are licensed under the terms of the Apache License 2.0.
```
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
```
### Acknowledgement
This project was made possible thanks to the computing resources from [Stability.ai](https://stability.ai), and thanks to [TUNiB](https://tunib.ai) for providing a large-scale Korean dataset for this work.
google/t5-v1_1-xxl | 2023-01-24T16:52:41.000Z | [
"transformers",
"pytorch",
"tf",
"t5",
"text2text-generation",
"en",
"dataset:c4",
"arxiv:2002.05202",
"arxiv:1910.10683",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text2text-generation | google | null | null | google/t5-v1_1-xxl | 21 | 34,401 | transformers | 2022-03-02T23:29:05 | ---
language: en
datasets:
- c4
license: apache-2.0
---
[Google's T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) Version 1.1
## Version 1.1
[T5 Version 1.1](https://github.com/google-research/text-to-text-transfer-transformer/blob/master/released_checkpoints.md#t511) includes the following improvements compared to the original T5 model:
- GEGLU activation in the feed-forward hidden layer, rather than ReLU - see [here](https://arxiv.org/abs/2002.05202).
- Dropout was turned off in pre-training (quality win). Dropout should be re-enabled during fine-tuning.
- Pre-trained on C4 only, without mixing in the downstream tasks.
- No parameter sharing between the embedding and classifier layers.
- "xl" and "xxl" replace "3B" and "11B". The model shapes are a bit different - larger `d_model` and smaller `num_heads` and `d_ff`.
**Note**: T5 Version 1.1 was only pre-trained on C4, excluding any supervised training. Therefore, this model has to be fine-tuned before it is usable on a downstream task.
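Fine-tuning follows the usual text-to-text recipe: the forward pass computes a cross-entropy loss against target token IDs passed as `labels`. A minimal sketch of a single training step is below; it uses the small variant to keep the example light, and the input/target strings are made-up placeholders (swap in `"google/t5-v1_1-xxl"` and your own dataset for this checkpoint):

```python
import torch
from transformers import AutoTokenizer, T5ForConditionalGeneration

# Small variant for illustration; the same code applies to t5-v1_1-xxl.
model_name = "google/t5-v1_1-small"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

inputs = tokenizer(
    "summarize: The quick brown fox jumps over the lazy dog.",
    return_tensors="pt",
)
labels = tokenizer("A fox jumps over a dog.", return_tensors="pt").input_ids

# Forward pass returns the cross-entropy loss against the labels.
loss = model(**inputs, labels=labels).loss
loss.backward()  # an optimizer step would follow in a real training loop
print(float(loss))
```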
Pretraining Dataset: [C4](https://huggingface.co/datasets/c4)
Other Community Checkpoints: [here](https://huggingface.co/models?search=t5-v1_1)
Paper: [Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer](https://arxiv.org/pdf/1910.10683.pdf)
Authors: *Colin Raffel, Noam Shazeer, Adam Roberts, Katherine Lee, Sharan Narang, Michael Matena, Yanqi Zhou, Wei Li, Peter J. Liu*
## Abstract
Transfer learning, where a model is first pre-trained on a data-rich task before being fine-tuned on a downstream task, has emerged as a powerful technique in natural language processing (NLP). The effectiveness of transfer learning has given rise to a diversity of approaches, methodology, and practice. In this paper, we explore the landscape of transfer learning techniques for NLP by introducing a unified framework that converts every language problem into a text-to-text format. Our systematic study compares pre-training objectives, architectures, unlabeled datasets, transfer approaches, and other factors on dozens of language understanding tasks. By combining the insights from our exploration with scale and our new “Colossal Clean Crawled Corpus”, we achieve state-of-the-art results on many benchmarks covering summarization, question answering, text classification, and more. To facilitate future work on transfer learning for NLP, we release our dataset, pre-trained models, and code.

-0.012054443359375,
-0.050811767578125,
-0.022308349609375,
-0.048370361328125,
-0.0216217041015625,
0.0036296844482421875,
0.01934814453125,
-0.01377105712890625,
0.040252685546875,
-0.029144287109375,
0.027099609375,
0.01482391357421875,
0.01232147216796875,
0.029815673828125,
0.0072784423828125,
0.0027751922607421875,
-0.01442718505859375,
-0.058929443359375,
-0.037445068359375,
0.09197998046875,
0.02392578125,
0.03863525390625,
0.006763458251953125,
0.048797607421875,
0.032745361328125,
0.0330810546875,
-0.055999755859375,
0.033905029296875,
-0.034515380859375,
-0.0208587646484375,
-0.0270538330078125,
-0.03460693359375,
-0.08685302734375,
0.0225830078125,
-0.0372314453125,
-0.054351806640625,
-0.01049041748046875,
0.0011453628540039062,
-0.00884246826171875,
0.037689208984375,
-0.060638427734375,
0.0782470703125,
0.004848480224609375,
-0.01520538330078125,
-0.0008831024169921875,
-0.0574951171875,
0.019256591796875,
-0.0105743408203125,
-0.0034084320068359375,
0.007381439208984375,
-0.00493621826171875,
0.055389404296875,
-0.0166168212890625,
0.051055908203125,
-0.0092010498046875,
-0.006526947021484375,
0.0004360675811767578,
0.00021767616271972656,
0.039764404296875,
-0.029754638671875,
-0.00012254714965820312,
0.024566650390625,
-0.0015163421630859375,
-0.042327880859375,
-0.037109375,
0.0328369140625,
-0.060211181640625,
-0.0237884521484375,
-0.0229644775390625,
-0.021209716796875,
-0.00464630126953125,
0.0260467529296875,
0.0352783203125,
0.0136566162109375,
-0.015899658203125,
0.0263824462890625,
0.053924560546875,
-0.01134490966796875,
0.043975830078125,
0.026641845703125,
-0.0212860107421875,
-0.006023406982421875,
0.052978515625,
-0.00018405914306640625,
0.037445068359375,
0.046417236328125,
0.007579803466796875,
-0.02801513671875,
-0.058837890625,
-0.037322998046875,
0.0150299072265625,
-0.047332763671875,
-0.00994873046875,
-0.060821533203125,
-0.03094482421875,
-0.045166015625,
-0.01015472412109375,
-0.034759521484375,
-0.0219268798828125,
-0.038177490234375,
-0.0193634033203125,
0.01094818115234375,
0.050933837890625,
0.00971221923828125,
0.0169525146484375,
-0.07977294921875,
0.00847625732421875,
0.004718780517578125,
0.0175018310546875,
-0.0031909942626953125,
-0.07550048828125,
-0.01163482666015625,
0.001979827880859375,
-0.02862548828125,
-0.050933837890625,
0.03619384765625,
0.0304718017578125,
0.0296478271484375,
0.0123748779296875,
0.006313323974609375,
0.0390625,
-0.028778076171875,
0.05841064453125,
0.016876220703125,
-0.0892333984375,
0.0302581787109375,
-0.023284912109375,
0.0297393798828125,
0.058563232421875,
0.0418701171875,
-0.034576416015625,
-0.00798797607421875,
-0.051849365234375,
-0.050262451171875,
0.05938720703125,
0.0133209228515625,
-0.00508880615234375,
0.037322998046875,
0.0229949951171875,
0.02667236328125,
-0.004192352294921875,
-0.06939697265625,
-0.0108489990234375,
-0.0115203857421875,
-0.01535797119140625,
-0.01143646240234375,
0.007602691650390625,
0.031005859375,
-0.0287017822265625,
0.0443115234375,
-0.01508331298828125,
0.0239410400390625,
0.0248260498046875,
-0.038421630859375,
0.01366424560546875,
0.0181732177734375,
0.043243408203125,
0.05816650390625,
-0.018280029296875,
-0.006488800048828125,
0.035888671875,
-0.049102783203125,
-0.002948760986328125,
0.01580810546875,
-0.01064300537109375,
-0.00582122802734375,
0.033233642578125,
0.0634765625,
0.0245208740234375,
-0.01873779296875,
0.043365478515625,
-0.01039886474609375,
-0.048614501953125,
-0.01113128662109375,
0.004871368408203125,
-0.0079193115234375,
-0.00566864013671875,
0.02716064453125,
0.01922607421875,
0.023468017578125,
-0.032745361328125,
0.0092926025390625,
0.005157470703125,
-0.038055419921875,
-0.0400390625,
0.0472412109375,
0.029998779296875,
-0.0113372802734375,
0.058563232421875,
-0.0195159912109375,
-0.0426025390625,
0.0297393798828125,
0.043121337890625,
0.077392578125,
-0.007343292236328125,
0.0260467529296875,
0.045806884765625,
0.0269317626953125,
-0.0115203857421875,
-0.00814056396484375,
-0.0180816650390625,
-0.061187744140625,
-0.063232421875,
-0.03369140625,
-0.03594970703125,
0.0111846923828125,
-0.050811767578125,
0.03460693359375,
-0.0247802734375,
0.0151824951171875,
-0.0006184577941894531,
0.01445770263671875,
-0.06219482421875,
0.0156402587890625,
0.0115509033203125,
0.0716552734375,
-0.0582275390625,
0.0794677734375,
0.0535888671875,
-0.0222015380859375,
-0.0650634765625,
0.003696441650390625,
-0.02471923828125,
-0.047119140625,
0.03240966796875,
0.0226593017578125,
-0.01273345947265625,
0.0169219970703125,
-0.05072021484375,
-0.072021484375,
0.09930419921875,
0.036346435546875,
-0.026031494140625,
-0.0214385986328125,
0.0062713623046875,
0.038970947265625,
-0.024078369140625,
0.01348876953125,
0.045684814453125,
0.028533935546875,
0.01971435546875,
-0.09429931640625,
0.0201568603515625,
-0.0196075439453125,
-0.00933074951171875,
0.0166778564453125,
-0.0399169921875,
0.05328369140625,
-0.024200439453125,
-0.0258026123046875,
-0.0010404586791992188,
0.0552978515625,
0.0018453598022460938,
0.018585205078125,
0.040802001953125,
0.057830810546875,
0.061431884765625,
-0.01508331298828125,
0.08819580078125,
-0.003681182861328125,
0.034912109375,
0.07952880859375,
-0.0010738372802734375,
0.062744140625,
0.0240936279296875,
-0.0203857421875,
0.044830322265625,
0.04156494140625,
0.009735107421875,
0.043121337890625,
0.0035457611083984375,
-0.004344940185546875,
-0.0062255859375,
0.00862884521484375,
-0.033111572265625,
0.0256500244140625,
0.01290130615234375,
-0.0237884521484375,
-0.032745361328125,
0.00439453125,
0.016204833984375,
-0.006420135498046875,
-0.0142364501953125,
0.072509765625,
0.00629425048828125,
-0.050018310546875,
0.047637939453125,
-0.00568389892578125,
0.0723876953125,
-0.04412841796875,
-0.0003142356872558594,
-0.0221099853515625,
0.0163116455078125,
-0.0184478759765625,
-0.053955078125,
0.032470703125,
-0.00682830810546875,
-0.0094757080078125,
-0.050811767578125,
0.07330322265625,
-0.0236663818359375,
-0.017913818359375,
0.030548095703125,
0.04022216796875,
0.0186004638671875,
-0.0096893310546875,
-0.055694580078125,
-0.017120361328125,
0.0202484130859375,
-0.007228851318359375,
0.036376953125,
0.03631591796875,
0.005352020263671875,
0.050811767578125,
0.043792724609375,
-0.0018072128295898438,
0.01073455810546875,
0.0034008026123046875,
0.053924560546875,
-0.054840087890625,
-0.03875732421875,
-0.044403076171875,
0.03662109375,
-0.004329681396484375,
-0.0399169921875,
0.046783447265625,
0.030181884765625,
0.0885009765625,
-0.00969696044921875,
0.05810546875,
-0.0017042160034179688,
0.041595458984375,
-0.045928955078125,
0.04852294921875,
-0.03961181640625,
0.007152557373046875,
-0.024993896484375,
-0.06378173828125,
-0.0251312255859375,
0.03955078125,
-0.0233917236328125,
0.016510009765625,
0.07391357421875,
0.036834716796875,
-0.00653076171875,
0.0009965896606445312,
0.0182342529296875,
-0.0007891654968261719,
0.038177490234375,
0.062255859375,
0.041015625,
-0.06719970703125,
0.06610107421875,
-0.0168914794921875,
-0.004764556884765625,
-0.005306243896484375,
-0.07720947265625,
-0.0616455078125,
-0.0562744140625,
-0.0289154052734375,
-0.0169525146484375,
0.004802703857421875,
0.04937744140625,
0.06475830078125,
-0.047882080078125,
-0.00193023681640625,
-0.0215911865234375,
-0.00553131103515625,
-0.0145263671875,
-0.0165863037109375,
0.0272064208984375,
-0.051605224609375,
-0.06072998046875,
0.005474090576171875,
-0.0022525787353515625,
0.00525665283203125,
0.01031494140625,
-0.00528717041015625,
-0.0232086181640625,
-0.032745361328125,
0.044586181640625,
0.02178955078125,
-0.0244140625,
-0.0238494873046875,
0.002674102783203125,
-0.007053375244140625,
0.0185089111328125,
0.044403076171875,
-0.06597900390625,
0.01395416259765625,
0.03521728515625,
0.07757568359375,
0.06439208984375,
-0.00998687744140625,
0.043060302734375,
-0.043731689453125,
-0.00862884521484375,
0.0132904052734375,
0.00814056396484375,
0.0272064208984375,
-0.01418304443359375,
0.051361083984375,
0.0124969482421875,
-0.040374755859375,
-0.03515625,
-0.00954437255859375,
-0.09588623046875,
-0.01407623291015625,
0.07977294921875,
-0.01629638671875,
-0.01568603515625,
0.00241851806640625,
-0.01141357421875,
0.02471923828125,
-0.0243377685546875,
0.06103515625,
0.06072998046875,
0.01319122314453125,
-0.029144287109375,
-0.03662109375,
0.05169677734375,
0.044525146484375,
-0.0882568359375,
-0.0247802734375,
0.014312744140625,
0.033416748046875,
0.0033397674560546875,
0.042327880859375,
-0.011383056640625,
0.0189971923828125,
-0.030426025390625,
0.01468658447265625,
-0.0011491775512695312,
-0.0302581787109375,
-0.04327392578125,
0.011077880859375,
-0.0169830322265625,
-0.026153564453125
]
] |
pyannote/voice-activity-detection | 2023-08-30T15:43:42.000Z | [
"pyannote-audio",
"pyannote",
"pyannote-audio-pipeline",
"audio",
"voice",
"speech",
"speaker",
"voice-activity-detection",
"automatic-speech-recognition",
"dataset:ami",
"dataset:dihard",
"dataset:voxconverse",
"license:mit",
"has_space",
"region:us"
] | automatic-speech-recognition | pyannote | null | null | pyannote/voice-activity-detection | 109 | 34,392 | pyannote-audio | 2022-03-02T23:29:05 | ---
tags:
- pyannote
- pyannote-audio
- pyannote-audio-pipeline
- audio
- voice
- speech
- speaker
- voice-activity-detection
- automatic-speech-recognition
datasets:
- ami
- dihard
- voxconverse
license: mit
extra_gated_prompt: "The collected information will help acquire a better knowledge of pyannote.audio userbase and help its maintainers apply for grants to improve it further. If you are an academic researcher, please cite the relevant papers in your own publications using the model. If you work for a company, please consider contributing back to pyannote.audio development (e.g. through unrestricted gifts). We also provide scientific consulting services around speaker diarization and machine listening."
extra_gated_fields:
Company/university: text
Website: text
I plan to use this model for (task, type of audio data, etc): text
---
I propose (paid) scientific [consulting services](https://herve.niderb.fr/consulting.html) to companies willing to make the most of their data and open-source speech processing toolkits (and `pyannote` in particular).
# 🎹 Voice activity detection
Relies on pyannote.audio 2.1: see [installation instructions](https://github.com/pyannote/pyannote-audio#installation).
```python
# 1. visit hf.co/pyannote/segmentation and accept user conditions
# 2. visit hf.co/settings/tokens to create an access token
# 3. instantiate pretrained voice activity detection pipeline
from pyannote.audio import Pipeline
pipeline = Pipeline.from_pretrained("pyannote/voice-activity-detection",
use_auth_token="ACCESS_TOKEN_GOES_HERE")
output = pipeline("audio.wav")
for speech in output.get_timeline().support():
# active speech between speech.start and speech.end
...
```
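The `support()` call above merges overlapping or adjacent speech regions into a single flat timeline. A minimal pure-Python sketch of that merging step (a hypothetical helper for illustration, not part of the pyannote.audio API) might look like:

```python
def merge_segments(segments):
    """Merge overlapping (start, end) speech segments, as a timeline support would."""
    merged = []
    for start, end in sorted(segments):
        if merged and start <= merged[-1][1]:
            # Overlaps (or touches) the previous region: extend it.
            merged[-1] = (merged[-1][0], max(merged[-1][1], end))
        else:
            merged.append((start, end))
    return merged

print(merge_segments([(0.0, 1.0), (0.5, 2.0), (3.0, 4.0)]))
# → [(0.0, 2.0), (3.0, 4.0)]
```

In the real pipeline, iterating over `output.get_timeline().support()` yields `Segment` objects whose `.start` and `.end` attributes give these merged boundaries in seconds.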
## Citation
```bibtex
@inproceedings{Bredin2021,
Title = {{End-to-end speaker segmentation for overlap-aware resegmentation}},
Author = {{Bredin}, Herv{\'e} and {Laurent}, Antoine},
Booktitle = {Proc. Interspeech 2021},
Address = {Brno, Czech Republic},
Month = {August},
Year = {2021},
}
```
```bibtex
@inproceedings{Bredin2020,
Title = {{pyannote.audio: neural building blocks for speaker diarization}},
Author = {{Bredin}, Herv{\'e} and {Yin}, Ruiqing and {Coria}, Juan Manuel and {Gelly}, Gregory and {Korshunov}, Pavel and {Lavechin}, Marvin and {Fustes}, Diego and {Titeux}, Hadrien and {Bouaziz}, Wassim and {Gill}, Marie-Philippe},
Booktitle = {ICASSP 2020, IEEE International Conference on Acoustics, Speech, and Signal Processing},
Address = {Barcelona, Spain},
Month = {May},
Year = {2020},
}
```
| 2,594 | [
[
-0.01508331298828125,
-0.0325927734375,
0.0199127197265625,
0.036529541015625,
-0.012451171875,
-0.006755828857421875,
-0.029083251953125,
-0.049407958984375,
0.039581298828125,
0.042938232421875,
-0.0333251953125,
-0.04229736328125,
-0.006866455078125,
-0.020965576171875,
-0.0206756591796875,
0.046722412109375,
0.033416748046875,
0.0145263671875,
-0.005451202392578125,
0.003726959228515625,
-0.0258026123046875,
-0.020050048828125,
-0.0543212890625,
-0.022613525390625,
0.0191650390625,
0.0477294921875,
0.005092620849609375,
0.031646728515625,
0.01102447509765625,
0.0260467529296875,
-0.0401611328125,
0.00902557373046875,
0.004047393798828125,
0.0263824462890625,
0.00415802001953125,
-0.0265960693359375,
-0.04083251953125,
0.010986328125,
0.052093505859375,
0.0560302734375,
-0.025726318359375,
0.019317626953125,
-0.00940704345703125,
0.028717041015625,
-0.0268707275390625,
0.018035888671875,
-0.04083251953125,
-0.0057220458984375,
-0.036041259765625,
0.00015723705291748047,
-0.03155517578125,
0.00998687744140625,
0.02581787109375,
-0.04119873046875,
0.01409912109375,
-0.007434844970703125,
0.05999755859375,
0.01375579833984375,
0.0199127197265625,
-0.01506805419921875,
-0.053009033203125,
0.0474853515625,
-0.056793212890625,
0.032623291015625,
0.036773681640625,
0.0190887451171875,
-0.02471923828125,
-0.062347412109375,
-0.029754638671875,
-0.03399658203125,
0.006130218505859375,
0.0138092041015625,
-0.022064208984375,
0.00972747802734375,
0.022857666015625,
0.03033447265625,
-0.0240020751953125,
0.0025482177734375,
-0.03851318359375,
-0.0302581787109375,
0.056427001953125,
-0.01617431640625,
0.034576416015625,
-0.00861358642578125,
-0.0069122314453125,
-0.0051116943359375,
-0.02484130859375,
0.0211944580078125,
0.04412841796875,
0.03106689453125,
-0.01195526123046875,
0.033233642578125,
-0.01373291015625,
0.047607421875,
0.00589752197265625,
-0.000010728836059570312,
0.0265045166015625,
-0.035247802734375,
-0.0053863525390625,
0.057647705078125,
0.057708740234375,
-0.01629638671875,
0.0223846435546875,
-0.0024814605712890625,
0.0015382766723632812,
-0.0208587646484375,
-0.0007643699645996094,
-0.047119140625,
-0.058837890625,
0.03204345703125,
-0.03460693359375,
0.016632080078125,
-0.006984710693359375,
-0.04864501953125,
-0.041595458984375,
-0.01947021484375,
0.0261688232421875,
-0.0205230712890625,
-0.06060791015625,
0.0158843994140625,
-0.028778076171875,
-0.01152801513671875,
-0.0004069805145263672,
-0.08026123046875,
0.016204833984375,
0.03729248046875,
0.08642578125,
0.01001739501953125,
-0.0061798095703125,
-0.057647705078125,
0.0038967132568359375,
-0.0160369873046875,
0.059051513671875,
-0.020721435546875,
-0.029693603515625,
-0.0164642333984375,
0.0022640228271484375,
-0.013641357421875,
-0.060028076171875,
0.0606689453125,
0.00507354736328125,
0.0015153884887695312,
-0.007419586181640625,
-0.0172576904296875,
0.0157012939453125,
-0.0265045166015625,
-0.045989990234375,
0.0634765625,
0.0033855438232421875,
-0.06024169921875,
0.0261077880859375,
-0.06842041015625,
-0.00946807861328125,
0.0163726806640625,
-0.00830078125,
-0.049102783203125,
-0.0261993408203125,
0.0199737548828125,
0.0178375244140625,
0.0004584789276123047,
0.01128387451171875,
-0.0192413330078125,
-0.0268707275390625,
0.021331787109375,
-0.0008764266967773438,
0.0814208984375,
0.01335906982421875,
-0.043060302734375,
0.01898193359375,
-0.08465576171875,
-0.012664794921875,
-0.0166168212890625,
-0.032806396484375,
-0.028778076171875,
0.01416015625,
0.01097869873046875,
-0.01087188720703125,
0.00554656982421875,
-0.07781982421875,
-0.0114593505859375,
-0.06280517578125,
0.051666259765625,
0.036163330078125,
0.005340576171875,
0.01751708984375,
-0.00426483154296875,
0.031524658203125,
-0.004024505615234375,
-0.0028362274169921875,
-0.03228759765625,
-0.056915283203125,
-0.04266357421875,
-0.065673828125,
0.021697998046875,
0.037353515625,
-0.0260009765625,
0.0264892578125,
-0.00848388671875,
-0.045318603515625,
-0.0596923828125,
-0.00360870361328125,
0.01447296142578125,
0.03704833984375,
0.022918701171875,
-0.01251220703125,
-0.06689453125,
-0.07196044921875,
-0.0479736328125,
-0.038177490234375,
-0.01351165771484375,
0.02105712890625,
0.00409698486328125,
0.0035762786865234375,
0.0872802734375,
-0.0157012939453125,
-0.018646240234375,
0.017730712890625,
0.01190185546875,
0.037872314453125,
0.0428466796875,
0.0277557373046875,
-0.06646728515625,
-0.0411376953125,
0.01074981689453125,
-0.01110076904296875,
-0.034393310546875,
-0.0076141357421875,
0.00003355741500854492,
0.004913330078125,
0.038421630859375,
-0.046142578125,
0.026580810546875,
0.016204833984375,
-0.015869140625,
0.046905517578125,
0.0175323486328125,
-0.0019445419311523438,
-0.062225341796875,
0.0176849365234375,
0.01531982421875,
-0.00844573974609375,
-0.05426025390625,
-0.05859375,
-0.028167724609375,
-0.0195465087890625,
-0.01343536376953125,
0.03143310546875,
-0.038116455078125,
-0.01971435546875,
-0.012176513671875,
0.028045654296875,
-0.036834716796875,
0.0308074951171875,
0.006824493408203125,
0.0665283203125,
0.039794921875,
-0.04559326171875,
0.0401611328125,
0.044219970703125,
-0.06689453125,
0.033966064453125,
-0.059295654296875,
0.0106201171875,
0.035797119140625,
0.003509521484375,
-0.09930419921875,
-0.01468658447265625,
0.052947998046875,
-0.07177734375,
0.007518768310546875,
-0.039947509765625,
-0.01436614990234375,
-0.009552001953125,
-0.008331298828125,
0.02471923828125,
0.0157623291015625,
-0.057342529296875,
0.028717041015625,
0.056671142578125,
-0.0589599609375,
-0.032745361328125,
-0.06640625,
0.0185699462890625,
-0.01561737060546875,
-0.0714111328125,
0.039306640625,
0.0016317367553710938,
-0.0275421142578125,
0.0077667236328125,
-0.0174102783203125,
-0.01629638671875,
-0.0220794677734375,
0.0260162353515625,
0.001003265380859375,
-0.018646240234375,
-0.002155303955078125,
-0.017120361328125,
-0.0008554458618164062,
0.014373779296875,
-0.04998779296875,
0.025360107421875,
0.00743865966796875,
-0.0210418701171875,
-0.0535888671875,
0.01445770263671875,
0.04644775390625,
-0.034454345703125,
0.018463134765625,
0.07373046875,
-0.020355224609375,
-0.0124053955078125,
-0.038970947265625,
0.006557464599609375,
-0.0399169921875,
0.036285400390625,
0.003551483154296875,
-0.054443359375,
0.040771484375,
-0.0009336471557617188,
0.023773193359375,
0.0222930908203125,
0.050201416015625,
-0.0076141357421875,
0.0574951171875,
0.0293426513671875,
-0.00431060791015625,
0.07794189453125,
-0.0207977294921875,
0.049224853515625,
-0.07159423828125,
-0.04034423828125,
-0.0599365234375,
-0.013336181640625,
-0.047760009765625,
-0.0233612060546875,
0.0305023193359375,
-0.01934814453125,
0.0130767822265625,
0.0245208740234375,
-0.066162109375,
0.0102386474609375,
0.0443115234375,
-0.01314544677734375,
-0.0152740478515625,
0.0272979736328125,
-0.004558563232421875,
-0.001934051513671875,
-0.049957275390625,
-0.0247650146484375,
0.057952880859375,
0.023773193359375,
0.034088134765625,
-0.004421234130859375,
0.04412841796875,
0.01512908935546875,
-0.0108184814453125,
-0.0775146484375,
0.0445556640625,
-0.00623321533203125,
-0.0401611328125,
-0.052032470703125,
-0.0313720703125,
-0.088623046875,
0.0192413330078125,
0.0184478759765625,
-0.092529296875,
0.049591064453125,
0.00835418701171875,
-0.0267181396484375,
0.0236358642578125,
-0.051727294921875,
0.046356201171875,
0.0012578964233398438,
-0.027557373046875,
-0.0296630859375,
-0.0306396484375,
0.016937255859375,
0.0311431884765625,
0.01300811767578125,
-0.00872802734375,
0.031005859375,
0.10040283203125,
-0.0014429092407226562,
0.05743408203125,
-0.052093505859375,
-0.005123138427734375,
0.04351806640625,
-0.024871826171875,
0.01204681396484375,
0.0173492431640625,
-0.0047149658203125,
-0.0010461807250976562,
0.020599365234375,
-0.015655517578125,
-0.013092041015625,
0.051025390625,
-0.047698974609375,
-0.034454345703125,
-0.02398681640625,
-0.039154052734375,
-0.018463134765625,
0.015045166015625,
-0.003414154052734375,
0.038330078125,
-0.015960693359375,
0.0265350341796875,
0.050689697265625,
-0.006679534912109375,
0.0528564453125,
0.034088134765625,
0.012847900390625,
-0.0660400390625,
0.06787109375,
0.0175933837890625,
-0.00881195068359375,
0.0190582275390625,
0.01430511474609375,
-0.0201263427734375,
-0.049346923828125,
-0.0145416259765625,
0.03411865234375,
-0.03125,
0.01508331298828125,
-0.0504150390625,
-0.0126800537109375,
-0.059478759765625,
0.034820556640625,
-0.032989501953125,
-0.04937744140625,
-0.0226287841796875,
0.006931304931640625,
0.01184844970703125,
0.0108184814453125,
-0.0309906005859375,
0.0262603759765625,
-0.041656494140625,
0.017120361328125,
0.0185699462890625,
0.01290130615234375,
-0.012054443359375,
-0.0369873046875,
-0.05682373046875,
0.01488494873046875,
-0.0157012939453125,
-0.048431396484375,
0.0179290771484375,
0.0225677490234375,
0.07220458984375,
0.0242767333984375,
-0.01279449462890625,
0.044830322265625,
-0.0106353759765625,
0.0849609375,
0.00945281982421875,
-0.076171875,
0.033660888671875,
-0.025665283203125,
0.0121917724609375,
0.03948974609375,
0.003902435302734375,
-0.070068359375,
-0.01290130615234375,
-0.048980712890625,
-0.07635498046875,
0.09368896484375,
0.024658203125,
-0.0135955810546875,
0.0251312255859375,
0.0309906005859375,
-0.0201568603515625,
0.0113677978515625,
-0.0295257568359375,
-0.004070281982421875,
-0.0186767578125,
-0.0026912689208984375,
0.0021839141845703125,
-0.01149749755859375,
-0.00984954833984375,
-0.03533935546875,
0.07635498046875,
0.01151275634765625,
0.038970947265625,
0.0465087890625,
0.004093170166015625,
-0.0015916824340820312,
0.008697509765625,
0.059478759765625,
0.03936767578125,
-0.0226287841796875,
-0.00890350341796875,
-0.0167388916015625,
-0.0625,
-0.0005631446838378906,
0.0149078369140625,
0.018890380859375,
0.039794921875,
0.033203125,
0.058013916015625,
0.007205963134765625,
-0.051422119140625,
0.0284423828125,
-0.002685546875,
-0.03240966796875,
-0.04522705078125,
0.00537109375,
0.023406982421875,
0.025970458984375,
0.0234375,
-0.006908416748046875,
0.01312255859375,
-0.0238494873046875,
0.0169525146484375,
0.01428985595703125,
-0.037872314453125,
-0.0303497314453125,
0.0267181396484375,
0.033477783203125,
-0.06683349609375,
0.05902099609375,
-0.006038665771484375,
-0.04071044921875,
0.0523681640625,
0.0309906005859375,
0.08624267578125,
-0.056182861328125,
0.0059051513671875,
0.0274658203125,
0.0234527587890625,
0.006244659423828125,
0.03131103515625,
-0.039947509765625,
-0.03863525390625,
-0.0224151611328125,
-0.0303802490234375,
-0.04229736328125,
0.02752685546875,
-0.03717041015625,
0.0266265869140625,
-0.033050537109375,
-0.0164642333984375,
0.0300445556640625,
0.00829315185546875,
-0.00945281982421875,
0.0094146728515625,
0.0198516845703125,
0.0606689453125,
-0.04638671875,
0.048858642578125,
0.072021484375,
-0.0061798095703125,
-0.04571533203125,
-0.0015716552734375,
-0.0079345703125,
-0.0231170654296875,
0.033416748046875,
-0.004009246826171875,
-0.03564453125,
-0.0152130126953125,
-0.040802001953125,
-0.047149658203125,
0.072509765625,
0.038482666015625,
-0.056671142578125,
0.01023101806640625,
-0.01461029052734375,
0.0306549072265625,
-0.01529693603515625,
0.005489349365234375,
0.055999755859375,
0.05224609375,
0.001102447509765625,
-0.09130859375,
-0.0289459228515625,
-0.045806884765625,
-0.0430908203125,
0.019805908203125,
-0.050048828125,
0.057342529296875,
0.01059722900390625,
-0.017913818359375,
0.01105499267578125,
0.045318603515625,
0.0173492431640625,
0.06561279296875,
0.057281494140625,
0.02020263671875,
0.0662841796875,
0.0073699951171875,
0.01611328125,
-0.02215576171875,
0.006504058837890625,
0.07073974609375,
0.0117950439453125,
0.04498291015625,
0.040771484375,
-0.0227813720703125,
0.0379638671875,
0.0533447265625,
-0.015869140625,
0.046966552734375,
0.0236663818359375,
-0.0238189697265625,
-0.01175689697265625,
-0.0170745849609375,
-0.026580810546875,
0.0518798828125,
0.0394287109375,
-0.01255035400390625,
0.0185699462890625,
-0.0128936767578125,
-0.005207061767578125,
-0.002254486083984375,
-0.0030517578125,
0.039703369140625,
0.016265869140625,
-0.0369873046875,
0.03887939453125,
-0.0035572052001953125,
0.038970947265625,
-0.05194091796875,
0.00031375885009765625,
0.007381439208984375,
0.00887298583984375,
-0.038726806640625,
-0.02020263671875,
0.0132293701171875,
-0.0047607421875,
-0.010009765625,
-0.0158843994140625,
0.045989990234375,
-0.051177978515625,
-0.002933502197265625,
0.015533447265625,
0.0039825439453125,
0.0313720703125,
-0.004802703857421875,
-0.039703369140625,
-0.005916595458984375,
0.005199432373046875,
-0.003955841064453125,
-0.00048613548278808594,
0.0186767578125,
-0.00010353326797485352,
0.027435302734375,
0.050140380859375,
0.046356201171875,
0.004764556884765625,
0.0126953125,
0.049835205078125,
-0.0195770263671875,
-0.0821533203125,
-0.049896240234375,
0.03192138671875,
-0.03582763671875,
-0.030914306640625,
0.04931640625,
0.05755615234375,
0.055877685546875,
0.0209808349609375,
0.050140380859375,
0.00843048095703125,
0.044219970703125,
-0.0283050537109375,
0.06329345703125,
-0.0233612060546875,
0.0228271484375,
-0.03515625,
-0.048858642578125,
-0.00647735595703125,
0.051849365234375,
-0.027130126953125,
-0.01263427734375,
0.040069580078125,
0.06829833984375,
-0.0115966796875,
0.0259552001953125,
0.01105499267578125,
0.017730712890625,
0.05426025390625,
0.0128326416015625,
0.0711669921875,
-0.0015058517456054688,
0.039215087890625,
-0.01136016845703125,
-0.023651123046875,
-0.0241851806640625,
-0.0386962890625,
-0.058441162109375,
-0.06451416015625,
-0.0389404296875,
-0.01023101806640625,
-0.000988006591796875,
0.06927490234375,
0.0721435546875,
-0.0589599609375,
-0.053131103515625,
0.006191253662109375,
-0.0013332366943359375,
-0.042327880859375,
-0.0126190185546875,
0.035888671875,
-0.011444091796875,
-0.056396484375,
0.0723876953125,
0.0297698974609375,
-0.0108795166015625,
-0.00962066650390625,
0.0111541748046875,
-0.03387451171875,
0.005840301513671875,
0.008148193359375,
0.03759765625,
-0.04010009765625,
0.0034770965576171875,
-0.0175323486328125,
0.0226898193359375,
0.012786865234375,
0.08941650390625,
-0.032440185546875,
0.0491943359375,
0.044677734375,
0.043731689453125,
0.0604248046875,
-0.01226043701171875,
0.0295867919921875,
-0.0450439453125,
0.0394287109375,
0.044097900390625,
0.0162353515625,
0.04510498046875,
-0.03387451171875,
0.0185699462890625,
0.0283050537109375,
-0.052581787109375,
-0.08795166015625,
-0.0012302398681640625,
-0.069580078125,
-0.0014123916625976562,
0.06256103515625,
-0.0168609619140625,
-0.0199432373046875,
-0.0258636474609375,
-0.031097412109375,
0.040252685546875,
-0.03936767578125,
0.05621337890625,
0.0684814453125,
-0.0362548828125,
0.00508880615234375,
-0.03558349609375,
0.05267333984375,
0.039794921875,
-0.04180908203125,
0.0230865478515625,
0.0195770263671875,
0.015167236328125,
0.039581298828125,
0.05670166015625,
-0.0044097900390625,
0.035369873046875,
0.015655517578125,
0.009857177734375,
-0.035980224609375,
-0.0242767333984375,
-0.036895751953125,
-0.002811431884765625,
-0.0017538070678710938,
-0.063720703125
]
] |
princeton-nlp/unsup-simcse-bert-base-uncased | 2022-11-11T20:04:07.000Z | [
"transformers",
"pytorch",
"jax",
"bert",
"feature-extraction",
"arxiv:2104.08821",
"arxiv:1910.09700",
"endpoints_compatible",
"has_space",
"region:us"
] | feature-extraction | princeton-nlp | null | null | princeton-nlp/unsup-simcse-bert-base-uncased | 2 | 34,385 | transformers | 2022-03-02T23:29:05 | ---
tags:
- feature-extraction
- bert
---
# Model Card for unsup-simcse-bert-base-uncased
# Model Details
## Model Description
More information needed
- **Developed by:** Princeton NLP group
- **Shared by [Optional]:** Hugging Face
- **Model type:** Feature Extraction
- **Language(s) (NLP):** More information needed
- **License:** More information needed
- **Related Models:**
- **Parent Model:** BERT
- **Resources for more information:**
- [GitHub Repo](https://github.com/princeton-nlp/SimCSE)
- [Model Space](https://huggingface.co/spaces/mteb/leaderboard)
- [Associated Paper](https://arxiv.org/abs/2104.08821)
# Uses
## Direct Use
This model can be used for the task of feature extraction (producing sentence embeddings).
## Downstream Use [Optional]
More information needed
## Out-of-Scope Use
The model should not be used to intentionally create hostile or alienating environments for people.
# Bias, Risks, and Limitations
Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)). Predictions generated by the model may include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.
## Recommendations
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
# Training Details
## Training Data
The model creators note in the [GitHub repository](https://github.com/princeton-nlp/SimCSE/blob/main/README.md)
> We train unsupervised SimCSE on 10^6 randomly sampled sentences from English Wikipedia, and train supervised SimCSE on the combination of MNLI and SNLI datasets (314k).
## Training Procedure
### Preprocessing
More information needed
### Speeds, Sizes, Times
More information needed
# Evaluation
## Testing Data, Factors & Metrics
### Testing Data
The model creators note in the [associated paper](https://arxiv.org/pdf/2104.08821.pdf)
> Our evaluation code for sentence embeddings is based on a modified version of [SentEval](https://github.com/facebookresearch/SentEval). It evaluates sentence embeddings on semantic textual similarity (STS) tasks and downstream transfer tasks. For STS tasks, our evaluation takes the "all" setting, and reports Spearman's correlation. See the [associated paper](https://arxiv.org/pdf/2104.08821.pdf) (Appendix B) for evaluation details.
### Factors
More information needed
### Metrics
More information needed
## Results
More information needed
# Model Examination
The model creators note in the [associated paper](https://arxiv.org/pdf/2104.08821.pdf)
> **Uniformity and alignment.**
We also observe that (1) though pre-trained embeddings have good alignment, their uniformity is poor (i.e., the embeddings are highly anisotropic); (2) post-processing methods like BERT-flow and BERT-whitening greatly improve uniformity but also suffer a degeneration in alignment; (3) unsupervised SimCSE effectively improves uniformity of pre-trained embeddings while keeping a good alignment; (4) incorporating supervised data in SimCSE further amends alignment.
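For reference, the alignment and uniformity measures discussed above (introduced by Wang and Isola, 2020, and used in the SimCSE paper) are commonly defined as follows. This is a sketch of the standard definitions, not a quotation from this card:

```latex
\ell_{\mathrm{align}} \triangleq \mathbb{E}_{(x,\,x^{+}) \sim p_{\mathrm{pos}}} \left\lVert f(x) - f(x^{+}) \right\rVert^{2}
\qquad
\ell_{\mathrm{uniform}} \triangleq \log \mathbb{E}_{x,\,y \overset{\mathrm{i.i.d.}}{\sim} p_{\mathrm{data}}} \, e^{-2 \lVert f(x) - f(y) \rVert^{2}}
```

where `f` is the (normalized) embedding function, `p_pos` is the distribution of positive pairs, and `p_data` is the data distribution; lower values are better for both.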
# Environmental Impact
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** Nvidia 3090 GPUs with CUDA 11
- **Hours used:** More information needed
- **Cloud Provider:** More information needed
- **Compute Region:** More information needed
- **Carbon Emitted:** More information needed
# Technical Specifications [optional]
## Model Architecture and Objective
More information needed
## Compute Infrastructure
More information needed
### Hardware
More information needed
### Software
More information needed
# Citation
**BibTeX:**
```bibtex
@inproceedings{gao2021simcse,
title={{SimCSE}: Simple Contrastive Learning of Sentence Embeddings},
author={Gao, Tianyu and Yao, Xingcheng and Chen, Danqi},
booktitle={Empirical Methods in Natural Language Processing (EMNLP)},
year={2021}
}
```
# Glossary [optional]
More information needed
# More Information [optional]
More information needed
# Model Card Authors [optional]
Princeton NLP group in collaboration with Ezi Ozoani and the Hugging Face team
# Model Card Contact
If you have any questions related to the code or the paper, feel free to email Tianyu (`tianyug@cs.princeton.edu`) and Xingcheng (`yxc18@mails.tsinghua.edu.cn`). If you encounter any problems when using the code, or want to report a bug, you can open an issue. Please try to specify the problem with details so we can help you better and quicker!
# How to Get Started with the Model
Use the code below to get started with the model.
<details>
<summary> Click to expand </summary>
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("princeton-nlp/unsup-simcse-bert-base-uncased")
model = AutoModel.from_pretrained("princeton-nlp/unsup-simcse-bert-base-uncased")
```
</details>
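Once the tokenizer and model are loaded, sentence embeddings can be taken from the [CLS] position of the last hidden state (the representation SimCSE trains on; the paper adds an MLP on top during training), and compared with cosine similarity. The helpers below are an illustrative sketch; the commented lines show the Transformers calls they plug into:

```python
import numpy as np

def cls_pool(last_hidden_state):
    """Take the [CLS] token vector for each sentence in the batch.
    last_hidden_state: array of shape (batch, seq_len, hidden)."""
    return last_hidden_state[:, 0, :]

def cosine_similarity(a, b):
    """Pairwise cosine similarity between two batches of embeddings."""
    a = a / np.linalg.norm(a, axis=-1, keepdims=True)
    b = b / np.linalg.norm(b, axis=-1, keepdims=True)
    return a @ b.T

# With the tokenizer and model loaded as above:
# import torch
# inputs = tokenizer(["A cat sits on a mat.", "A kitten rests on a rug."],
#                    padding=True, return_tensors="pt")
# with torch.no_grad():
#     hidden = model(**inputs).last_hidden_state.numpy()
# print(cosine_similarity(cls_pool(hidden), cls_pool(hidden)))
```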
| 5,306 | [
[
-0.0265960693359375,
-0.044647216796875,
0.028717041015625,
0.016021728515625,
-0.0295562744140625,
-0.02081298828125,
-0.0259246826171875,
-0.0215911865234375,
0.01053619384765625,
0.034698486328125,
-0.04241943359375,
-0.049774169921875,
-0.050323486328125,
-0.00388336181640625,
-0.023712158203125,
0.078125,
0.01200103759765625,
0.011566162109375,
-0.033538818359375,
0.00882720947265625,
-0.01373291015625,
-0.04632568359375,
-0.033721923828125,
-0.01036834716796875,
0.01074981689453125,
0.0077667236328125,
0.04986572265625,
0.052642822265625,
0.033050537109375,
0.0264129638671875,
-0.025970458984375,
0.00093841552734375,
-0.028350830078125,
-0.0230712890625,
-0.0023059844970703125,
-0.0157623291015625,
-0.0289154052734375,
0.006744384765625,
0.04241943359375,
0.056365966796875,
-0.01580810546875,
0.01239776611328125,
0.0236968994140625,
0.0260009765625,
-0.04150390625,
0.020660400390625,
-0.04742431640625,
-0.00876617431640625,
-0.004726409912109375,
0.01207733154296875,
-0.038482666015625,
-0.00803375244140625,
0.004146575927734375,
-0.046142578125,
0.0098114013671875,
0.0044403076171875,
0.09600830078125,
0.022674560546875,
-0.024566650390625,
-0.0228271484375,
-0.0301513671875,
0.07757568359375,
-0.07379150390625,
0.034332275390625,
0.0095672607421875,
-0.006122589111328125,
-0.015899658203125,
-0.0645751953125,
-0.0390625,
-0.0316162109375,
-0.01415252685546875,
0.0251007080078125,
-0.0189361572265625,
0.004566192626953125,
0.0269622802734375,
0.0142364501953125,
-0.035308837890625,
0.01110076904296875,
-0.021240234375,
-0.0118255615234375,
0.054168701171875,
0.0030498504638671875,
0.0278778076171875,
-0.049102783203125,
-0.046142578125,
-0.022705078125,
-0.0215606689453125,
0.01715087890625,
0.032867431640625,
0.033905029296875,
-0.034149169921875,
0.046630859375,
-0.0029449462890625,
0.032806396484375,
-0.01041412353515625,
0.01013946533203125,
0.036529541015625,
-0.0244598388671875,
-0.0247650146484375,
-0.01296234130859375,
0.08209228515625,
0.0106201171875,
0.0189971923828125,
-0.003673553466796875,
-0.0058441162109375,
-0.0036411285400390625,
0.018280029296875,
-0.05963134765625,
-0.0203704833984375,
0.00848388671875,
-0.046142578125,
-0.0245513916015625,
0.024932861328125,
-0.063232421875,
0.00438690185546875,
-0.03131103515625,
0.0242462158203125,
-0.048736572265625,
-0.008087158203125,
0.0099945068359375,
-0.00655364990234375,
0.005428314208984375,
0.00592803955078125,
-0.051116943359375,
0.033477783203125,
0.03497314453125,
0.053955078125,
-0.0179443359375,
-0.0201568603515625,
-0.002338409423828125,
-0.00441741943359375,
-0.00045418739318847656,
0.043304443359375,
-0.0401611328125,
-0.0303955078125,
-0.0027370452880859375,
0.0062255859375,
-0.0166015625,
-0.0177764892578125,
0.0673828125,
-0.01332855224609375,
0.0302276611328125,
-0.01953125,
-0.055755615234375,
-0.0233154296875,
-0.001972198486328125,
-0.037872314453125,
0.1044921875,
-0.0067138671875,
-0.0771484375,
0.00629425048828125,
-0.058013916015625,
-0.01116180419921875,
-0.0089111328125,
-0.01235198974609375,
-0.0443115234375,
0.003009796142578125,
0.02783203125,
0.034149169921875,
-0.0185699462890625,
0.035125732421875,
-0.02960205078125,
-0.0301666259765625,
0.0047149658203125,
-0.0322265625,
0.1075439453125,
0.0213775634765625,
-0.0220489501953125,
-0.0043487548828125,
-0.0465087890625,
0.00482940673828125,
0.021697998046875,
-0.01087188720703125,
-0.029754638671875,
-0.002979278564453125,
0.0201416015625,
0.0225372314453125,
0.02691650390625,
-0.051116943359375,
0.01078033447265625,
-0.028350830078125,
0.019378662109375,
0.049468994140625,
-0.00628662109375,
0.025726318359375,
-0.0170440673828125,
0.0217132568359375,
0.0211181640625,
0.0260772705078125,
0.00786590576171875,
-0.036773681640625,
-0.06463623046875,
-0.027923583984375,
0.050567626953125,
0.049774169921875,
-0.04180908203125,
0.063232421875,
-0.025177001953125,
-0.044586181640625,
-0.0428466796875,
0.0015058517456054688,
0.03424072265625,
0.0151214599609375,
0.0477294921875,
-0.019378662109375,
-0.05108642578125,
-0.07415771484375,
-0.021636962890625,
-0.007602691650390625,
-0.002655029296875,
0.03570556640625,
0.0543212890625,
-0.022979736328125,
0.0767822265625,
-0.04931640625,
-0.01953125,
-0.01509857177734375,
0.019378662109375,
0.0012388229370117188,
0.055419921875,
0.044921875,
-0.056854248046875,
-0.0271759033203125,
-0.0272979736328125,
-0.06396484375,
-0.01153564453125,
-0.0151214599609375,
-0.00728607177734375,
0.0193023681640625,
0.053192138671875,
-0.048919677734375,
0.034271240234375,
0.038909912109375,
-0.0173492431640625,
0.033233642578125,
-0.016510009765625,
-0.004364013671875,
-0.1044921875,
0.00865936279296875,
0.00693511962890625,
0.0020771026611328125,
-0.035980224609375,
-0.01052093505859375,
-0.0009522438049316406,
-0.006927490234375,
-0.037750244140625,
0.03857421875,
-0.038482666015625,
-0.0009598731994628906,
-0.0012273788452148438,
0.01499176025390625,
-0.0078277587890625,
0.055084228515625,
0.00832366943359375,
0.03857421875,
0.04248046875,
-0.04705810546875,
-0.005794525146484375,
0.0263519287109375,
-0.02581787109375,
0.013427734375,
-0.0535888671875,
0.002285003662109375,
-0.004238128662109375,
0.015716552734375,
-0.06414794921875,
0.0089569091796875,
0.010650634765625,
-0.044830322265625,
0.0276641845703125,
-0.0006694793701171875,
-0.044647216796875,
-0.0227508544921875,
-0.0177001953125,
0.024810791015625,
0.049407958984375,
-0.04058837890625,
0.0526123046875,
0.02667236328125,
-0.01015472412109375,
-0.055877685546875,
-0.058258056640625,
-0.005123138427734375,
-0.0006814002990722656,
-0.04278564453125,
0.04791259765625,
-0.014678955078125,
-0.00323486328125,
0.0186614990234375,
0.0190277099609375,
-0.0100250244140625,
0.004241943359375,
0.02783203125,
0.029388427734375,
-0.003787994384765625,
0.00809478759765625,
0.00872802734375,
-0.00811767578125,
0.005748748779296875,
-0.0021686553955078125,
0.037628173828125,
-0.007373809814453125,
-0.01727294921875,
-0.038543701171875,
0.0267333984375,
0.024566650390625,
-0.004230499267578125,
0.06463623046875,
0.05596923828125,
-0.03228759765625,
-0.00827789306640625,
-0.04083251953125,
-0.019500732421875,
-0.0350341796875,
0.04156494140625,
-0.01412200927734375,
-0.060821533203125,
0.037933349609375,
-0.0021343231201171875,
0.0083465576171875,
0.05938720703125,
0.031707763671875,
-0.0177459716796875,
0.07275390625,
0.06536865234375,
-0.0004582405090332031,
0.035308837890625,
-0.0227813720703125,
0.0189971923828125,
-0.06695556640625,
-0.0265655517578125,
-0.05035400390625,
0.0006580352783203125,
-0.0689697265625,
-0.0281829833984375,
-0.0021686553955078125,
0.0166168212890625,
-0.024505615234375,
0.03472900390625,
-0.04107666015625,
0.017486572265625,
0.037445068359375,
0.005466461181640625,
0.0083465576171875,
0.0009436607360839844,
-0.034698486328125,
-0.02685546875,
-0.06048583984375,
-0.04132080078125,
0.059478759765625,
0.037628173828125,
0.04583740234375,
-0.006740570068359375,
0.056365966796875,
0.01128387451171875,
0.00878143310546875,
-0.036865234375,
0.053924560546875,
-0.04010009765625,
-0.036529541015625,
-0.01313018798828125,
-0.038543701171875,
-0.06011962890625,
0.0217437744140625,
-0.032806396484375,
-0.057342529296875,
0.0241241455078125,
-0.00511932373046875,
-0.017852783203125,
0.03631591796875,
-0.046661376953125,
0.0716552734375,
-0.01580810546875,
0.0018472671508789062,
0.0026035308837890625,
-0.0501708984375,
0.029083251953125,
0.0102386474609375,
0.0152740478515625,
-0.0025653839111328125,
-0.0002048015594482422,
0.0692138671875,
-0.0242156982421875,
0.078125,
-0.01251983642578125,
0.02044677734375,
0.0208740234375,
-0.01300811767578125,
0.03155517578125,
-0.00839996337890625,
0.00006794929504394531,
0.047576904296875,
-0.004131317138671875,
-0.0260009765625,
-0.033935546875,
0.057769775390625,
-0.07525634765625,
-0.0247344970703125,
-0.045501708984375,
-0.052764892578125,
0.003353118896484375,
0.0148162841796875,
0.029266357421875,
0.00756072998046875,
-0.021453857421875,
0.025970458984375,
0.05096435546875,
-0.0430908203125,
0.04022216796875,
0.0212554931640625,
-0.00582122802734375,
-0.025177001953125,
0.053436279296875,
0.01415252685546875,
0.013458251953125,
0.0169677734375,
0.00547027587890625,
-0.0254058837890625,
-0.029693603515625,
-0.01279449462890625,
0.032135009765625,
-0.047271728515625,
-0.01220703125,
-0.0780029296875,
-0.0428466796875,
-0.049102783203125,
-0.005321502685546875,
-0.040130615234375,
-0.03253173828125,
-0.038299560546875,
-0.001537322998046875,
0.021575927734375,
0.04742431640625,
-0.0055389404296875,
0.0310821533203125,
-0.036834716796875,
0.01131439208984375,
0.00263214111328125,
0.0222320556640625,
-0.00832366943359375,
-0.0625,
-0.01219940185546875,
0.018829345703125,
-0.0269622802734375,
-0.052490234375,
0.032318115234375,
-0.0003330707550048828,
0.0537109375,
0.01186370849609375,
0.012603759765625,
0.0477294921875,
-0.02960205078125,
0.07574462890625,
0.0078582763671875,
-0.08465576171875,
0.0282440185546875,
-0.01065826416015625,
0.019622802734375,
0.049072265625,
0.03887939453125,
-0.027374267578125,
-0.027679443359375,
-0.08258056640625,
-0.06982421875,
0.055419921875,
0.041473388671875,
0.02191162109375,
-0.00624847412109375,
0.0301971435546875,
-0.021942138671875,
0.0051422119140625,
-0.0830078125,
-0.037322998046875,
-0.0208587646484375,
-0.022735595703125,
0.0021915435791015625,
-0.03619384765625,
0.0095672607421875,
-0.0230560302734375,
0.07501220703125,
0.004192352294921875,
0.044677734375,
0.01056671142578125,
-0.015167236328125,
0.0191650390625,
0.01461029052734375,
0.036285400390625,
0.005558013916015625,
-0.0301971435546875,
-0.00409698486328125,
0.033966064453125,
-0.04022216796875,
-0.0193634033203125,
0.0195770263671875,
-0.0255126953125,
0.015045166015625,
0.0215911865234375,
0.061798095703125,
0.0277099609375,
-0.033660888671875,
0.05950927734375,
-0.00850677490234375,
-0.0307159423828125,
-0.026458740234375,
-0.00787353515625,
0.0139617919921875,
-0.00019991397857666016,
0.00811004638671875,
0.0098876953125,
0.01216888427734375,
-0.0214691162109375,
0.0235443115234375,
0.040924072265625,
-0.041839599609375,
0.0009655952453613281,
0.047576904296875,
0.007358551025390625,
-0.0067138671875,
0.05987548828125,
-0.01546478271484375,
-0.033355712890625,
0.058258056640625,
0.04656982421875,
0.069580078125,
-0.00940704345703125,
0.0155029296875,
0.0517578125,
0.039642333984375,
-0.005512237548828125,
0.01428985595703125,
0.0143585205078125,
-0.053314208984375,
-0.0288848876953125,
-0.04766845703125,
-0.0096588134765625,
0.0053558349609375,
-0.064453125,
0.040374755859375,
-0.03179931640625,
-0.00482177734375,
-0.0014238357543945312,
0.003711700439453125,
-0.05078125,
0.0231781005859375,
0.01201629638671875,
0.06927490234375,
-0.08544921875,
0.064208984375,
0.048980712890625,
-0.049468994140625,
-0.060699462890625,
0.0004265308380126953,
-0.006565093994140625,
-0.042724609375,
0.046661376953125,
0.0127716064453125,
0.004150390625,
-0.00141143798828125,
-0.054840087890625,
-0.0732421875,
0.08978271484375,
0.031982421875,
-0.037384033203125,
0.0013475418090820312,
0.00872802734375,
0.039276123046875,
-0.044647216796875,
0.044647216796875,
0.01739501953125,
0.025054931640625,
0.00463104248046875,
-0.056121826171875,
0.010040283203125,
-0.020660400390625,
0.0264129638671875,
-0.00588226318359375,
-0.0521240234375,
0.0723876953125,
-0.0155029296875,
-0.0193328857421875,
0.004863739013671875,
0.06591796875,
0.0173187255859375,
-0.0009479522705078125,
0.04632568359375,
0.05938720703125,
0.048309326171875,
-0.006542205810546875,
0.0816650390625,
-0.0166778564453125,
0.0501708984375,
0.09027099609375,
0.00534820556640625,
0.07366943359375,
0.0307769775390625,
-0.01116180419921875,
0.06329345703125,
0.047576904296875,
-0.0170745849609375,
0.033935546875,
0.0079803466796875,
0.0156402587890625,
-0.024627685546875,
-0.0254974365234375,
-0.029754638671875,
0.0350341796875,
0.010894775390625,
-0.04449462890625,
-0.0151214599609375,
-0.013275146484375,
0.01433563232421875,
0.0091094970703125,
0.0005755424499511719,
0.05133056640625,
0.0149078369140625,
-0.02813720703125,
0.03179931640625,
0.00994873046875,
0.061767578125,
-0.0511474609375,
-0.000926971435546875,
-0.0002968311309814453,
0.0081329345703125,
-0.01593017578125,
-0.046630859375,
0.017913818359375,
-0.0014171600341796875,
-0.0209808349609375,
-0.00437164306640625,
0.032623291015625,
-0.042999267578125,
-0.0567626953125,
0.04522705078125,
0.0290679931640625,
0.0188446044921875,
0.0117340087890625,
-0.0965576171875,
0.0028285980224609375,
0.0193634033203125,
-0.028350830078125,
0.0246734619140625,
0.0238037109375,
-0.00038886070251464844,
0.04022216796875,
0.0537109375,
-0.0088348388671875,
-0.0005326271057128906,
0.01349639892578125,
0.06646728515625,
-0.048858642578125,
-0.04083251953125,
-0.060943603515625,
0.05804443359375,
-0.0252838134765625,
-0.0269317626953125,
0.060943603515625,
0.05511474609375,
0.06719970703125,
-0.0012054443359375,
0.056610107421875,
-0.01215362548828125,
0.02984619140625,
-0.040069580078125,
0.03961181640625,
-0.0501708984375,
0.0197296142578125,
-0.03924560546875,
-0.07537841796875,
-0.024200439453125,
0.052032470703125,
-0.0157470703125,
0.0286712646484375,
0.06829833984375,
0.06591796875,
-0.01203155517578125,
0.004180908203125,
0.0164794921875,
0.029144287109375,
0.006938934326171875,
0.031646728515625,
0.03564453125,
-0.0601806640625,
0.042816162109375,
-0.039154052734375,
-0.0118255615234375,
-0.0017881393432617188,
-0.082763671875,
-0.07403564453125,
-0.046844482421875,
-0.045135498046875,
-0.0244598388671875,
-0.0031375885009765625,
0.07373046875,
0.043792724609375,
-0.06817626953125,
-0.017333984375,
-0.0042266845703125,
-0.0075225830078125,
-0.0217132568359375,
-0.018035888671875,
0.047576904296875,
-0.028350830078125,
-0.060882568359375,
0.004146575927734375,
-0.0012884140014648438,
0.014068603515625,
-0.007724761962890625,
-0.008453369140625,
-0.048004150390625,
0.01520538330078125,
0.027557373046875,
0.0032291412353515625,
-0.063720703125,
-0.006378173828125,
-0.00010466575622558594,
-0.028961181640625,
-0.00849151611328125,
0.04290771484375,
-0.036041259765625,
0.0192108154296875,
0.028717041015625,
0.0421142578125,
0.039947509765625,
0.007793426513671875,
0.020660400390625,
-0.05938720703125,
0.0190277099609375,
0.011199951171875,
0.0435791015625,
0.0237274169921875,
-0.042724609375,
0.03436279296875,
0.0225067138671875,
-0.04248046875,
-0.057220458984375,
-0.0011043548583984375,
-0.09722900390625,
-0.0362548828125,
0.0994873046875,
-0.0172882080078125,
-0.0177764892578125,
-0.0010061264038085938,
-0.01448822021484375,
0.03009033203125,
-0.032562255859375,
0.046112060546875,
0.05169677734375,
0.0038051605224609375,
-0.0144805908203125,
-0.0419921875,
0.040802001953125,
0.04864501953125,
-0.06048583984375,
0.002590179443359375,
0.0202789306640625,
0.021453857421875,
0.0175628662109375,
0.029266357421875,
-0.0206146240234375,
0.0011653900146484375,
0.00045800209045410156,
0.043487548828125,
-0.01036834716796875,
-0.0142364501953125,
-0.029327392578125,
-0.0024776458740234375,
-0.02099609375,
-0.00839996337890625
]
] |
flair/ner-english-large | 2021-05-08T15:36:27.000Z | [
"flair",
"pytorch",
"token-classification",
"sequence-tagger-model",
"en",
"dataset:conll2003",
"arxiv:2011.06993",
"has_space",
"region:us"
] | token-classification | flair | null | null | flair/ner-english-large | 32 | 34,276 | flair | 2022-03-02T23:29:05 | ---
tags:
- flair
- token-classification
- sequence-tagger-model
language: en
datasets:
- conll2003
widget:
- text: "George Washington went to Washington"
---
## English NER in Flair (large model)
This is the large 4-class NER model for English that ships with [Flair](https://github.com/flairNLP/flair/).
F1-Score: **94.36** (corrected CoNLL-03)
Predicts 4 tags:
| **tag** | **meaning** |
|---------------------------------|-----------|
| PER | person name |
| LOC | location name |
| ORG | organization name |
| MISC | other name |
Based on document-level XLM-R embeddings and [FLERT](https://arxiv.org/pdf/2011.06993v1.pdf/).
---
### Demo: How to use in Flair
Requires: **[Flair](https://github.com/flairNLP/flair/)** (`pip install flair`)
```python
from flair.data import Sentence
from flair.models import SequenceTagger
# load tagger
tagger = SequenceTagger.load("flair/ner-english-large")
# make example sentence
sentence = Sentence("George Washington went to Washington")
# predict NER tags
tagger.predict(sentence)
# print sentence
print(sentence)
# print predicted NER spans
print('The following NER tags are found:')
# iterate over entities and print
for entity in sentence.get_spans('ner'):
print(entity)
```
This yields the following output:
```
Span [1,2]: "George Washington" [− Labels: PER (1.0)]
Span [5]: "Washington" [− Labels: LOC (1.0)]
```
So, the entities "*George Washington*" (labeled as a **person**) and "*Washington*" (labeled as a **location**) are found in the sentence "*George Washington went to Washington*".
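For downstream use, the predicted spans can be flattened into plain dictionaries. The helper below is a sketch: `text`, `tag`, and `score` are the attributes Flair's `Span` objects expose, and the Flair-dependent call is left as a comment so the helper itself is library-free.

```python
def collect_entities(spans):
    """Turn predicted spans into plain dicts of surface text, tag, and
    confidence. Works with any objects exposing .text / .tag / .score."""
    return [{"text": s.text, "tag": s.tag, "score": float(s.score)}
            for s in spans]

# With the tagged sentence from the demo above:
# entities = collect_entities(sentence.get_spans('ner'))
```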
---
### Training: Script to train this model
The following Flair script was used to train this model:
```python
import torch
# 1. get the corpus
from flair.datasets import CONLL_03
corpus = CONLL_03()
# 2. what tag do we want to predict?
tag_type = 'ner'
# 3. make the tag dictionary from the corpus
tag_dictionary = corpus.make_tag_dictionary(tag_type=tag_type)
# 4. initialize fine-tuneable transformer embeddings WITH document context
from flair.embeddings import TransformerWordEmbeddings
embeddings = TransformerWordEmbeddings(
model='xlm-roberta-large',
layers="-1",
subtoken_pooling="first",
fine_tune=True,
use_context=True,
)
# 5. initialize bare-bones sequence tagger (no CRF, no RNN, no reprojection)
from flair.models import SequenceTagger
tagger = SequenceTagger(
hidden_size=256,
embeddings=embeddings,
tag_dictionary=tag_dictionary,
tag_type='ner',
use_crf=False,
use_rnn=False,
reproject_embeddings=False,
)
# 6. initialize trainer with AdamW optimizer
from flair.trainers import ModelTrainer
trainer = ModelTrainer(tagger, corpus, optimizer=torch.optim.AdamW)
# 7. run training with XLM parameters (20 epochs, small LR)
from torch.optim.lr_scheduler import OneCycleLR
trainer.train('resources/taggers/ner-english-large',
learning_rate=5.0e-6,
mini_batch_size=4,
mini_batch_chunk_size=1,
max_epochs=20,
scheduler=OneCycleLR,
embeddings_storage_mode='none',
weight_decay=0.,
)
```
---
### Cite
Please cite the following paper when using this model.
```
@misc{schweter2020flert,
title={FLERT: Document-Level Features for Named Entity Recognition},
author={Stefan Schweter and Alan Akbik},
year={2020},
eprint={2011.06993},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
---
### Issues?
The Flair issue tracker is available [here](https://github.com/flairNLP/flair/issues/).
| 3,632 | [
[
-0.031402587890625,
-0.03765869140625,
0.0215911865234375,
0.002948760986328125,
-0.004573822021484375,
-0.00921630859375,
-0.0222625732421875,
-0.034423828125,
0.033294677734375,
0.0241241455078125,
-0.033050537109375,
-0.04248046875,
-0.036285400390625,
0.0242156982421875,
-0.013885498046875,
0.08026123046875,
0.0019388198852539062,
0.01535797119140625,
-0.0008363723754882812,
-0.0005540847778320312,
-0.02349853515625,
-0.0428466796875,
-0.0635986328125,
-0.0207366943359375,
0.05108642578125,
0.022125244140625,
0.03350830078125,
0.05084228515625,
0.0274505615234375,
0.020599365234375,
-0.01050567626953125,
0.006328582763671875,
-0.0189666748046875,
-0.01399993896484375,
-0.0098419189453125,
-0.0254669189453125,
-0.052276611328125,
0.0041656494140625,
0.053802490234375,
0.0269012451171875,
0.0110321044921875,
0.00179290771484375,
-0.0015802383422851562,
0.016754150390625,
-0.020782470703125,
0.0205230712890625,
-0.0496826171875,
-0.0164642333984375,
-0.01158905029296875,
-0.004116058349609375,
-0.039031982421875,
-0.017822265625,
0.0170440673828125,
-0.040496826171875,
0.01220703125,
0.0213165283203125,
0.10595703125,
0.00994873046875,
-0.0340576171875,
-0.00974273681640625,
-0.043792724609375,
0.07373046875,
-0.0723876953125,
0.031005859375,
0.0210113525390625,
-0.005809783935546875,
0.00170135498046875,
-0.05548095703125,
-0.047607421875,
-0.01502227783203125,
-0.0069427490234375,
0.01324462890625,
-0.0136566162109375,
-0.00952911376953125,
0.02642822265625,
0.005451202392578125,
-0.05377197265625,
0.0089111328125,
-0.025634765625,
-0.0186004638671875,
0.0545654296875,
0.0131378173828125,
0.00875091552734375,
-0.0189666748046875,
-0.036163330078125,
-0.007183074951171875,
-0.0283660888671875,
-0.0045623779296875,
0.01678466796875,
0.03179931640625,
-0.0181884765625,
0.029815673828125,
0.006191253662109375,
0.053802490234375,
0.01081085205078125,
-0.017059326171875,
0.04608154296875,
-0.00323486328125,
-0.01421356201171875,
0.0027370452880859375,
0.06854248046875,
0.022705078125,
0.0158843994140625,
-0.007251739501953125,
-0.014404296875,
0.00042939186096191406,
-0.008697509765625,
-0.0667724609375,
-0.0189056396484375,
0.019073486328125,
-0.0208282470703125,
-0.0198822021484375,
0.007328033447265625,
-0.051483154296875,
-0.0051422119140625,
-0.01297760009765625,
0.05096435546875,
-0.049102783203125,
-0.012542724609375,
-0.0028533935546875,
-0.015777587890625,
0.0309600830078125,
0.0160980224609375,
-0.0599365234375,
0.00563812255859375,
0.0309600830078125,
0.05169677734375,
0.01145172119140625,
-0.036865234375,
-0.0239410400390625,
-0.0089263916015625,
-0.0188140869140625,
0.049285888671875,
-0.0303955078125,
-0.01209259033203125,
-0.007099151611328125,
0.01654052734375,
-0.0265655517578125,
-0.0160980224609375,
0.03717041015625,
-0.047027587890625,
0.0267181396484375,
-0.01421356201171875,
-0.06341552734375,
-0.031463623046875,
0.0208587646484375,
-0.047760009765625,
0.07196044921875,
0.0108642578125,
-0.07818603515625,
0.0295867919921875,
-0.03192138671875,
-0.0262908935546875,
0.0010938644409179688,
-0.0020198822021484375,
-0.040802001953125,
-0.01128387451171875,
0.00934600830078125,
0.0513916015625,
-0.01519775390625,
0.01507568359375,
-0.01727294921875,
-0.00971221923828125,
0.01236724853515625,
-0.0014553070068359375,
0.056488037109375,
0.00923919677734375,
-0.0287933349609375,
0.003604888916015625,
-0.067138671875,
-0.002033233642578125,
0.020477294921875,
-0.0294189453125,
-0.010162353515625,
-0.00391387939453125,
0.0197601318359375,
0.021209716796875,
0.0194244384765625,
-0.034332275390625,
0.0285186767578125,
-0.0364990234375,
0.03375244140625,
0.03924560546875,
0.002140045166015625,
0.037933349609375,
-0.032623291015625,
0.031768798828125,
0.0042877197265625,
-0.01491546630859375,
-0.006069183349609375,
-0.04290771484375,
-0.055816650390625,
-0.0192718505859375,
0.0380859375,
0.04473876953125,
-0.041259765625,
0.053466796875,
-0.027130126953125,
-0.058013916015625,
-0.023162841796875,
-0.0196685791015625,
0.0137939453125,
0.0504150390625,
0.04339599609375,
0.0014858245849609375,
-0.05902099609375,
-0.057952880859375,
-0.006694793701171875,
0.0014009475708007812,
0.0157623291015625,
0.0185089111328125,
0.07330322265625,
-0.027862548828125,
0.062103271484375,
-0.033355712890625,
-0.0267181396484375,
-0.03594970703125,
0.0101318359375,
0.038055419921875,
0.04473876953125,
0.03472900390625,
-0.0478515625,
-0.052398681640625,
-0.004180908203125,
-0.03472900390625,
0.0164337158203125,
-0.0174560546875,
-0.0003139972686767578,
0.035308837890625,
0.031494140625,
-0.036346435546875,
0.03363037109375,
0.0257568359375,
-0.03631591796875,
0.041473388671875,
-0.005130767822265625,
-0.0126800537109375,
-0.10516357421875,
0.0204010009765625,
0.012847900390625,
-0.01824951171875,
-0.03875732421875,
-0.01806640625,
0.0200347900390625,
0.00904083251953125,
-0.027313232421875,
0.0614013671875,
-0.036956787109375,
0.014678955078125,
-0.0097198486328125,
0.003543853759765625,
0.00937652587890625,
0.0256195068359375,
0.0263824462890625,
0.03509521484375,
0.04705810546875,
-0.044281005859375,
0.01554107666015625,
0.0283660888671875,
-0.0238800048828125,
0.0154571533203125,
-0.0362548828125,
-0.0033721923828125,
-0.004970550537109375,
0.018829345703125,
-0.06427001953125,
-0.0194244384765625,
0.007137298583984375,
-0.04412841796875,
0.054443359375,
-0.003116607666015625,
-0.0230560302734375,
-0.03863525390625,
-0.0192413330078125,
0.0079803466796875,
0.037200927734375,
-0.03582763671875,
0.0469970703125,
0.00988006591796875,
0.01076507568359375,
-0.059478759765625,
-0.048492431640625,
-0.0066070556640625,
-0.01824951171875,
-0.046478271484375,
0.04766845703125,
-0.007770538330078125,
0.004726409912109375,
0.007183074951171875,
-0.0016870498657226562,
-0.000055670738220214844,
0.0054779052734375,
0.0098876953125,
0.0362548828125,
-0.0179443359375,
0.002750396728515625,
-0.0213165283203125,
-0.002819061279296875,
0.002094268798828125,
-0.016815185546875,
0.05767822265625,
-0.004886627197265625,
0.015228271484375,
-0.0384521484375,
0.00870513916015625,
0.03076171875,
-0.023162841796875,
0.07916259765625,
0.0667724609375,
-0.042633056640625,
-0.0164947509765625,
-0.03668212890625,
-0.0171661376953125,
-0.0286865234375,
0.052093505859375,
-0.0266876220703125,
-0.0450439453125,
0.043853759765625,
0.014434814453125,
0.0140228271484375,
0.06549072265625,
0.0250396728515625,
0.004913330078125,
0.08453369140625,
0.046478271484375,
-0.01288604736328125,
0.032867431640625,
-0.05548095703125,
0.01280975341796875,
-0.07232666015625,
-0.0179901123046875,
-0.039459228515625,
-0.0159149169921875,
-0.04840087890625,
-0.01294708251953125,
0.00982666015625,
0.0181884765625,
-0.04473876953125,
0.0457763671875,
-0.042022705078125,
0.0182342529296875,
0.037322998046875,
-0.00020396709442138672,
-0.00467681884765625,
-0.015777587890625,
-0.0223388671875,
-0.0168304443359375,
-0.052581787109375,
-0.041015625,
0.08074951171875,
0.0301513671875,
0.057098388671875,
-0.006572723388671875,
0.06964111328125,
-0.017608642578125,
0.02764892578125,
-0.06488037109375,
0.03472900390625,
-0.00569915771484375,
-0.058563232421875,
-0.0081939697265625,
-0.028900146484375,
-0.0740966796875,
0.00211334228515625,
-0.0321044921875,
-0.0648193359375,
0.01338958740234375,
0.002227783203125,
-0.0364990234375,
0.0419921875,
-0.032928466796875,
0.07269287109375,
-0.0125885009765625,
-0.0272064208984375,
0.0161895751953125,
-0.05413818359375,
0.009429931640625,
-0.004913330078125,
0.0204620361328125,
-0.006359100341796875,
-0.00001621246337890625,
0.08013916015625,
-0.02349853515625,
0.054412841796875,
-0.002231597900390625,
0.0093841552734375,
0.0037975311279296875,
-0.0070037841796875,
0.042205810546875,
0.00676727294921875,
-0.0228271484375,
0.00977325439453125,
-0.005046844482421875,
-0.01390838623046875,
-0.006916046142578125,
0.0618896484375,
-0.06951904296875,
-0.0278778076171875,
-0.060394287109375,
-0.027435302734375,
0.005580902099609375,
0.0289764404296875,
0.06121826171875,
0.0472412109375,
-0.00579071044921875,
0.0078887939453125,
0.042236328125,
-0.019287109375,
0.053558349609375,
0.024932861328125,
-0.031890869140625,
-0.04559326171875,
0.059173583984375,
0.0211944580078125,
0.0031795501708984375,
0.039276123046875,
0.00911712646484375,
-0.02978515625,
-0.026947021484375,
-0.0186614990234375,
0.0406494140625,
-0.0458984375,
-0.0396728515625,
-0.061737060546875,
-0.033966064453125,
-0.054840087890625,
-0.01904296875,
-0.024017333984375,
-0.01983642578125,
-0.055755615234375,
0.0011854171752929688,
0.0308074951171875,
0.05718994140625,
-0.0035533905029296875,
0.020294189453125,
-0.048614501953125,
-0.01103973388671875,
0.00507354736328125,
0.004886627197265625,
0.0010662078857421875,
-0.0750732421875,
-0.032135009765625,
-0.012176513671875,
-0.033447265625,
-0.07354736328125,
0.07275390625,
0.017822265625,
0.032745361328125,
0.028045654296875,
0.0001747608184814453,
0.041778564453125,
-0.04180908203125,
0.06195068359375,
0.0091400146484375,
-0.0631103515625,
0.036712646484375,
-0.01392364501953125,
0.01224517822265625,
0.01181793212890625,
0.055755615234375,
-0.0293731689453125,
-0.01064300537109375,
-0.062744140625,
-0.07818603515625,
0.058563232421875,
-0.004638671875,
0.00791168212890625,
-0.022186279296875,
0.021453857421875,
-0.011505126953125,
-0.0030536651611328125,
-0.0806884765625,
-0.040771484375,
-0.0087890625,
-0.0108489990234375,
-0.022064208984375,
-0.01502227783203125,
0.01605224609375,
-0.03240966796875,
0.09326171875,
0.0015106201171875,
0.037322998046875,
0.034393310546875,
-0.003086090087890625,
-0.0011720657348632812,
0.0163726806640625,
0.038482666015625,
0.03375244140625,
-0.035308837890625,
-0.00507354736328125,
0.0257720947265625,
-0.023590087890625,
-0.015045166015625,
0.0234222412109375,
-0.0039825439453125,
0.0184783935546875,
0.033966064453125,
0.06317138671875,
0.0213165283203125,
-0.01055908203125,
0.038482666015625,
0.0032329559326171875,
-0.0225677490234375,
-0.05072021484375,
-0.00655364990234375,
0.0128173828125,
0.0137939453125,
0.03173828125,
0.01247406005859375,
-0.007266998291015625,
-0.0343017578125,
0.004459381103515625,
0.03802490234375,
-0.025360107421875,
-0.033233642578125,
0.07830810546875,
-0.00009757280349731445,
-0.0158233642578125,
0.036712646484375,
-0.035552978515625,
-0.061737060546875,
0.053985595703125,
0.0577392578125,
0.056427001953125,
-0.0192108154296875,
-0.00383758544921875,
0.06854248046875,
0.022369384765625,
-0.0122222900390625,
0.037322998046875,
0.037078857421875,
-0.0638427734375,
-0.027069091796875,
-0.06256103515625,
-0.004878997802734375,
0.0250091552734375,
-0.04638671875,
0.050994873046875,
-0.027740478515625,
-0.0270233154296875,
0.02978515625,
0.02398681640625,
-0.06976318359375,
0.0213470458984375,
0.0279541015625,
0.0908203125,
-0.06109619140625,
0.074462890625,
0.066162109375,
-0.0467529296875,
-0.09344482421875,
-0.00630950927734375,
-0.01067352294921875,
-0.048828125,
0.05615234375,
0.038330078125,
0.02423095703125,
0.0275115966796875,
-0.043792724609375,
-0.091796875,
0.09423828125,
-0.002742767333984375,
-0.042388916015625,
-0.01788330078125,
-0.0244903564453125,
0.0215301513671875,
-0.032562255859375,
0.04034423828125,
0.02728271484375,
0.03765869140625,
0.002376556396484375,
-0.069580078125,
0.00006854534149169922,
-0.01129150390625,
-0.0045166015625,
0.0201873779296875,
-0.049224853515625,
0.08734130859375,
-0.0226898193359375,
-0.010223388671875,
0.023468017578125,
0.055023193359375,
0.0034961700439453125,
0.0161590576171875,
0.0163421630859375,
0.0634765625,
0.053466796875,
-0.015533447265625,
0.06964111328125,
-0.0287933349609375,
0.05645751953125,
0.07940673828125,
-0.0203704833984375,
0.07354736328125,
0.02252197265625,
-0.010467529296875,
0.05010986328125,
0.046600341796875,
-0.01220703125,
0.03472900390625,
0.01239776611328125,
-0.006511688232421875,
-0.0163116455078125,
0.00609588623046875,
-0.040130615234375,
0.03717041015625,
0.024688720703125,
-0.0433349609375,
-0.0036182403564453125,
-0.001476287841796875,
0.034820556640625,
-0.0129852294921875,
-0.03314208984375,
0.05126953125,
0.0003445148468017578,
-0.039093017578125,
0.05126953125,
0.005413055419921875,
0.0733642578125,
-0.037384033203125,
0.004459381103515625,
-0.00421905517578125,
0.0192718505859375,
-0.02691650390625,
-0.0408935546875,
0.00958251953125,
-0.00926971435546875,
-0.014190673828125,
0.0031585693359375,
0.041473388671875,
-0.036895751953125,
-0.037322998046875,
0.025970458984375,
0.02508544921875,
0.0156402587890625,
-0.001617431640625,
-0.05126953125,
-0.01023101806640625,
0.006572723388671875,
-0.0391845703125,
0.01316070556640625,
0.018951416015625,
0.005950927734375,
0.0312347412109375,
0.03607177734375,
0.006336212158203125,
0.0006213188171386719,
-0.0137176513671875,
0.05987548828125,
-0.06878662109375,
-0.02459716796875,
-0.0675048828125,
0.03936767578125,
-0.0063018798828125,
-0.0323486328125,
0.054412841796875,
0.06634521484375,
0.064453125,
-0.007305145263671875,
0.0548095703125,
-0.0274810791015625,
0.04888916015625,
-0.022796630859375,
0.059783935546875,
-0.04766845703125,
0.00577545166015625,
-0.018402099609375,
-0.06976318359375,
-0.03033447265625,
0.05889892578125,
-0.0333251953125,
0.00766754150390625,
0.05633544921875,
0.0599365234375,
0.0003037452697753906,
-0.01174163818359375,
0.0211944580078125,
0.03277587890625,
0.009552001953125,
0.041656494140625,
0.0457763671875,
-0.049041748046875,
0.0302276611328125,
-0.044097900390625,
-0.0106048583984375,
-0.0210418701171875,
-0.07110595703125,
-0.07275390625,
-0.052215576171875,
-0.03814697265625,
-0.05169677734375,
-0.0266571044921875,
0.09503173828125,
0.04949951171875,
-0.06610107421875,
-0.0046234130859375,
0.002758026123046875,
-0.00815582275390625,
-0.0027637481689453125,
-0.0218963623046875,
0.0408935546875,
-0.019622802734375,
-0.06060791015625,
0.0159149169921875,
-0.01091766357421875,
0.01215362548828125,
0.01290130615234375,
-0.006458282470703125,
-0.042236328125,
0.0017328262329101562,
0.0283203125,
0.022369384765625,
-0.05450439453125,
-0.0208740234375,
0.01824951171875,
-0.0245208740234375,
0.00891876220703125,
0.01018524169921875,
-0.053192138671875,
0.0117034912109375,
0.0244903564453125,
0.0229034423828125,
0.040069580078125,
-0.01678466796875,
0.01154327392578125,
-0.050537109375,
-0.00360870361328125,
0.0279541015625,
0.05181884765625,
0.0218048095703125,
-0.0239715576171875,
0.032684326171875,
0.021209716796875,
-0.06402587890625,
-0.050750732421875,
-0.00911712646484375,
-0.0836181640625,
-0.01332855224609375,
0.09246826171875,
-0.01256561279296875,
-0.0301513671875,
0.006237030029296875,
-0.0160675048828125,
0.03851318359375,
-0.033233642578125,
0.0231475830078125,
0.035308837890625,
-0.007640838623046875,
-0.0010938644409179688,
-0.039093017578125,
0.042633056640625,
0.019287109375,
-0.041748046875,
-0.0195770263671875,
0.0189208984375,
0.0478515625,
0.0238189697265625,
0.034210205078125,
0.008697509765625,
0.013031005859375,
-0.00409698486328125,
0.03448486328125,
0.0058135986328125,
-0.00881195068359375,
-0.032989501953125,
-0.016021728515625,
0.00009149312973022461,
-0.01349639892578125
]
] |
Helsinki-NLP/opus-mt-en-ar | 2023-08-16T11:28:58.000Z | [
"transformers",
"pytorch",
"tf",
"rust",
"marian",
"text2text-generation",
"translation",
"en",
"ar",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | translation | Helsinki-NLP | null | null | Helsinki-NLP/opus-mt-en-ar | 15 | 34,242 | transformers | 2022-03-02T23:29:04 | ---
language:
- en
- ar
tags:
- translation
license: apache-2.0
---
### eng-ara
* source group: English
* target group: Arabic
* OPUS readme: [eng-ara](https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/eng-ara/README.md)
* model: transformer
* source language(s): eng
* target language(s): acm afb apc apc_Latn ara ara_Latn arq arq_Latn ary arz
* pre-processing: normalization + SentencePiece (spm32k,spm32k)
* a sentence-initial language token is required in the form `>>id<<` (id = valid target language ID)
* download original weights: [opus-2020-07-03.zip](https://object.pouta.csc.fi/Tatoeba-MT-models/eng-ara/opus-2020-07-03.zip)
* test set translations: [opus-2020-07-03.test.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/eng-ara/opus-2020-07-03.test.txt)
* test set scores: [opus-2020-07-03.eval.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/eng-ara/opus-2020-07-03.eval.txt)
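Because a sentence-initial `>>id<<` token is required, source sentences must be prefixed before they are passed to the model. A minimal sketch of that prefixing step (the helper name and the `"ara"` default are illustrative, not part of the released model; the actual translation call via `transformers` is omitted here):

```python
def add_lang_token(sentences, lang_id="ara"):
    """Prepend the >>id<< target-language token required by this model.

    `lang_id` must be one of the valid target language IDs listed above
    (e.g. acm, afb, apc, ara, arq, ary, arz).
    """
    return [f">>{lang_id}<< {s}" for s in sentences]

# Example: prepare English inputs for translation into Modern Standard Arabic.
inputs = add_lang_token(["How are you today?"], lang_id="ara")
print(inputs)  # ['>>ara<< How are you today?']
```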
## Benchmarks
| testset | BLEU | chr-F |
|-----------------------|-------|-------|
| Tatoeba-test.eng.ara | 14.0 | 0.437 |
### System Info:
- hf_name: eng-ara
- source_languages: eng
- target_languages: ara
- opus_readme_url: https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/eng-ara/README.md
- original_repo: Tatoeba-Challenge
- tags: ['translation']
- languages: ['en', 'ar']
- src_constituents: {'eng'}
- tgt_constituents: {'apc', 'ara', 'arq_Latn', 'arq', 'afb', 'ara_Latn', 'apc_Latn', 'arz'}
- src_multilingual: False
- tgt_multilingual: False
- prepro: normalization + SentencePiece (spm32k,spm32k)
- url_model: https://object.pouta.csc.fi/Tatoeba-MT-models/eng-ara/opus-2020-07-03.zip
- url_test_set: https://object.pouta.csc.fi/Tatoeba-MT-models/eng-ara/opus-2020-07-03.test.txt
- src_alpha3: eng
- tgt_alpha3: ara
- short_pair: en-ar
- chrF2_score: 0.437
- bleu: 14.0
- brevity_penalty: 1.0
- ref_len: 58935.0
- src_name: English
- tgt_name: Arabic
- train_date: 2020-07-03
- src_alpha2: en
- tgt_alpha2: ar
- prefer_old: False
- long_pair: eng-ara
- helsinki_git_sha: 480fcbe0ee1bf4774bcbe6226ad9f58e63f6c535
- transformers_git_sha: 2207e5d8cb224e954a7cba69fa4ac2309e9ff30b
- port_machine: brutasse
- port_time: 2020-08-21-14:41 | 2,274 | [
[
-0.0313720703125,
-0.045013427734375,
0.0166778564453125,
0.0299530029296875,
-0.035675048828125,
-0.0094451904296875,
-0.0196990966796875,
-0.032928466796875,
0.0212860107421875,
0.0209808349609375,
-0.03582763671875,
-0.057525634765625,
-0.047576904296875,
0.032318115234375,
-0.00571441650390625,
0.07525634765625,
-0.0136260986328125,
0.00342559814453125,
0.03863525390625,
-0.0377197265625,
-0.0241241455078125,
-0.015228271484375,
-0.04998779296875,
-0.0099029541015625,
0.029205322265625,
0.0254364013671875,
0.0380859375,
0.036163330078125,
0.040191650390625,
0.022979736328125,
-0.0343017578125,
0.0181121826171875,
-0.0035266876220703125,
-0.01149749755859375,
-0.003753662109375,
-0.0287628173828125,
-0.04217529296875,
-0.0171051025390625,
0.0648193359375,
0.039031982421875,
0.003704071044921875,
0.039764404296875,
-0.0101470947265625,
0.058563232421875,
-0.0205841064453125,
0.007537841796875,
-0.035125732421875,
-0.0088348388671875,
-0.0341796875,
-0.0198822021484375,
-0.03460693359375,
-0.021209716796875,
0.00847625732421875,
-0.0428466796875,
0.003204345703125,
0.0120391845703125,
0.125244140625,
0.0069427490234375,
-0.037750244140625,
-0.021514892578125,
-0.03131103515625,
0.0574951171875,
-0.057342529296875,
0.030517578125,
0.03778076171875,
-0.0010080337524414062,
-0.005157470703125,
-0.0280303955078125,
-0.031494140625,
-0.002532958984375,
-0.0158843994140625,
0.0214385986328125,
-0.0130615234375,
-0.0157928466796875,
0.00994110107421875,
0.04034423828125,
-0.049285888671875,
-0.00496673583984375,
-0.0312347412109375,
-0.00972747802734375,
0.032470703125,
0.0012645721435546875,
0.0281219482421875,
-0.043365478515625,
-0.02874755859375,
-0.033233642578125,
-0.033538818359375,
0.0194549560546875,
0.036834716796875,
0.0272369384765625,
-0.04229736328125,
0.050537109375,
-0.0087890625,
0.039154052734375,
0.000995635986328125,
-0.00695037841796875,
0.050872802734375,
-0.048187255859375,
-0.00742340087890625,
-0.01078033447265625,
0.08392333984375,
0.01007080078125,
0.000274658203125,
0.00969696044921875,
-0.0146484375,
-0.0131988525390625,
-0.002227783203125,
-0.0576171875,
0.0082550048828125,
0.0224609375,
-0.029876708984375,
-0.01543426513671875,
-0.00040984153747558594,
-0.0601806640625,
0.00978851318359375,
0.0108795166015625,
0.035003662109375,
-0.061309814453125,
-0.0198516845703125,
0.0269012451171875,
-0.00032639503479003906,
0.021636962890625,
0.00022590160369873047,
-0.03643798828125,
0.01213836669921875,
0.01898193359375,
0.071533203125,
-0.0029811859130859375,
-0.0312347412109375,
-0.01947021484375,
0.0154571533203125,
-0.0130157470703125,
0.053009033203125,
-0.01482391357421875,
-0.0278778076171875,
-0.010162353515625,
0.0244140625,
-0.006656646728515625,
-0.013275146484375,
0.0677490234375,
-0.0275421142578125,
0.0311279296875,
-0.0212860107421875,
-0.03466796875,
-0.034515380859375,
0.02569580078125,
-0.06134033203125,
0.08819580078125,
0.01502227783203125,
-0.06683349609375,
0.02606201171875,
-0.06298828125,
-0.0247344970703125,
-0.0010805130004882812,
0.00655364990234375,
-0.05584716796875,
-0.00714111328125,
0.019866943359375,
0.0227508544921875,
-0.0309600830078125,
0.0293426513671875,
0.00399017333984375,
-0.0174713134765625,
-0.0009617805480957031,
-0.019195556640625,
0.0904541015625,
0.0167694091796875,
-0.034393310546875,
0.0096282958984375,
-0.06378173828125,
-0.005596160888671875,
0.025970458984375,
-0.0382080078125,
-0.01512908935546875,
-0.006496429443359375,
0.0211944580078125,
0.0052032470703125,
0.0192413330078125,
-0.0413818359375,
0.0265960693359375,
-0.050689697265625,
0.0167999267578125,
0.059326171875,
0.004833221435546875,
0.0181732177734375,
-0.02886962890625,
0.038299560546875,
0.0194854736328125,
0.0025501251220703125,
0.01145172119140625,
-0.041778564453125,
-0.05517578125,
-0.0187835693359375,
0.04449462890625,
0.054534912109375,
-0.054901123046875,
0.058319091796875,
-0.047576904296875,
-0.061676025390625,
-0.05426025390625,
-0.01166534423828125,
0.035400390625,
0.017242431640625,
0.036773681640625,
-0.01910400390625,
-0.040618896484375,
-0.075439453125,
-0.015777587890625,
-0.021759033203125,
0.00543212890625,
0.0178070068359375,
0.0635986328125,
-0.005741119384765625,
0.036468505859375,
-0.0280609130859375,
-0.042633056640625,
-0.0201416015625,
0.0162811279296875,
0.03131103515625,
0.04791259765625,
0.052154541015625,
-0.0609130859375,
-0.04376220703125,
0.005359649658203125,
-0.04638671875,
-0.00716400146484375,
0.00022232532501220703,
-0.01300048828125,
0.03082275390625,
0.00012540817260742188,
-0.038604736328125,
0.0275726318359375,
0.046630859375,
-0.05859375,
0.040252685546875,
-0.01202392578125,
0.031158447265625,
-0.11053466796875,
0.018035888671875,
-0.0098114013671875,
-0.00547027587890625,
-0.020965576171875,
0.006267547607421875,
0.0004096031188964844,
0.00928497314453125,
-0.0401611328125,
0.052978515625,
-0.04559326171875,
0.0011072158813476562,
0.0209197998046875,
0.0082550048828125,
-0.004608154296875,
0.0574951171875,
-0.0160675048828125,
0.08074951171875,
0.041534423828125,
-0.0321044921875,
0.0002543926239013672,
0.0416259765625,
-0.034332275390625,
0.0114288330078125,
-0.052886962890625,
-0.01450347900390625,
0.019622802734375,
0.005298614501953125,
-0.05572509765625,
-0.00867462158203125,
0.019317626953125,
-0.052032470703125,
0.01380157470703125,
-0.00904083251953125,
-0.048553466796875,
-0.00811767578125,
-0.02777099609375,
0.044677734375,
0.031402587890625,
-0.01126861572265625,
0.06549072265625,
0.01349639892578125,
-0.0003261566162109375,
-0.04888916015625,
-0.057952880859375,
-0.00006604194641113281,
-0.01412200927734375,
-0.04718017578125,
0.029205322265625,
-0.01055145263671875,
0.0029296875,
0.015869140625,
0.0004017353057861328,
-0.007213592529296875,
0.0112457275390625,
0.00501251220703125,
0.015838623046875,
-0.016265869140625,
0.00312042236328125,
-0.0007967948913574219,
-0.00858306884765625,
-0.019256591796875,
-0.011962890625,
0.06268310546875,
-0.037261962890625,
-0.01104736328125,
-0.052703857421875,
0.02020263671875,
0.033905029296875,
-0.0328369140625,
0.08416748046875,
0.048187255859375,
-0.0176544189453125,
0.01480865478515625,
-0.034637451171875,
0.00928497314453125,
-0.029937744140625,
0.023223876953125,
-0.048309326171875,
-0.050750732421875,
0.06732177734375,
0.0107574462890625,
0.01332855224609375,
0.07281494140625,
0.04815673828125,
0.00902557373046875,
0.048828125,
0.0270233154296875,
0.003185272216796875,
0.036285400390625,
-0.0467529296875,
-0.00722503662109375,
-0.0638427734375,
-0.0118865966796875,
-0.05865478515625,
-0.01088714599609375,
-0.070068359375,
-0.01100921630859375,
0.02587890625,
0.0017490386962890625,
-0.01497650146484375,
0.049896240234375,
-0.037353515625,
0.0211639404296875,
0.048980712890625,
0.01255035400390625,
0.0227203369140625,
-0.002452850341796875,
-0.0308685302734375,
0.00479888916015625,
-0.032958984375,
-0.05047607421875,
0.08941650390625,
0.019256591796875,
0.007030487060546875,
0.03155517578125,
0.050262451171875,
0.01020050048828125,
0.00516510009765625,
-0.04248046875,
0.04595947265625,
-0.0129547119140625,
-0.062225341796875,
-0.027801513671875,
-0.027069091796875,
-0.0772705078125,
0.00962066650390625,
-0.0132293701171875,
-0.047332763671875,
0.005748748779296875,
-0.011932373046875,
-0.007068634033203125,
0.048553466796875,
-0.0491943359375,
0.06341552734375,
0.00786590576171875,
-0.01126861572265625,
0.001827239990234375,
-0.048675537109375,
0.0106353759765625,
-0.00579071044921875,
0.017913818359375,
-0.01258087158203125,
-0.01270294189453125,
0.07342529296875,
-0.0242919921875,
0.0380859375,
-0.00072479248046875,
-0.00977325439453125,
0.0176239013671875,
0.0142822265625,
0.039703369140625,
-0.0057830810546875,
-0.0237579345703125,
0.033172607421875,
0.006435394287109375,
-0.053192138671875,
-0.014007568359375,
0.033660888671875,
-0.0692138671875,
-0.03662109375,
-0.047332763671875,
-0.04608154296875,
-0.000024020671844482422,
0.031951904296875,
0.03778076171875,
0.036773681640625,
0.004016876220703125,
0.03521728515625,
0.03997802734375,
-0.027008056640625,
0.0369873046875,
0.039581298828125,
-0.00014448165893554688,
-0.047332763671875,
0.0499267578125,
0.0189056396484375,
0.0106353759765625,
0.033599853515625,
0.006145477294921875,
-0.00725555419921875,
-0.06646728515625,
-0.043670654296875,
0.0257415771484375,
-0.032928466796875,
-0.0244903564453125,
-0.042205810546875,
-0.0040740966796875,
-0.028167724609375,
0.012481689453125,
-0.02655029296875,
-0.031768798828125,
-0.00725555419921875,
-0.0184478759765625,
0.037506103515625,
0.030364990234375,
0.0004420280456542969,
0.020050048828125,
-0.06903076171875,
0.02337646484375,
-0.017120361328125,
0.03509521484375,
-0.0135955810546875,
-0.059234619140625,
-0.0263671875,
-0.002719879150390625,
-0.0220184326171875,
-0.08392333984375,
0.0380859375,
-0.0010890960693359375,
0.01511383056640625,
0.004199981689453125,
0.00803375244140625,
0.04486083984375,
-0.0404052734375,
0.0762939453125,
-0.013824462890625,
-0.07049560546875,
0.050872802734375,
-0.03033447265625,
0.033111572265625,
0.0531005859375,
0.02581787109375,
-0.0238037109375,
-0.051605224609375,
-0.058746337890625,
-0.0723876953125,
0.0601806640625,
0.04986572265625,
-0.01107025146484375,
-0.00677490234375,
0.01041412353515625,
-0.0006189346313476562,
-0.01393890380859375,
-0.085693359375,
-0.037261962890625,
0.007282257080078125,
-0.032501220703125,
0.01108551025390625,
-0.031494140625,
-0.0126953125,
-0.019622802734375,
0.08697509765625,
0.0166473388671875,
0.0127410888671875,
0.039093017578125,
-0.01186370849609375,
-0.0003993511199951172,
0.0292816162109375,
0.053375244140625,
0.031524658203125,
-0.0264739990234375,
-0.01103973388671875,
0.02740478515625,
-0.0462646484375,
0.010955810546875,
0.00531005859375,
-0.037933349609375,
0.017059326171875,
0.04248046875,
0.06304931640625,
0.005939483642578125,
-0.04132080078125,
0.033905029296875,
-0.006256103515625,
-0.02606201171875,
-0.04010009765625,
-0.0240936279296875,
0.004886627197265625,
0.012603759765625,
0.0318603515625,
0.0004527568817138672,
0.006748199462890625,
-0.01861572265625,
0.0035381317138671875,
0.00511932373046875,
-0.00904083251953125,
-0.0277252197265625,
0.0389404296875,
0.004650115966796875,
-0.0289306640625,
0.0282135009765625,
-0.0300140380859375,
-0.0325927734375,
0.0496826171875,
0.0201873779296875,
0.07843017578125,
-0.011688232421875,
-0.0084991455078125,
0.0521240234375,
0.04119873046875,
0.007015228271484375,
0.036590576171875,
0.01264190673828125,
-0.047607421875,
-0.02618408203125,
-0.064453125,
0.0046844482421875,
0.01120758056640625,
-0.055023193359375,
0.02362060546875,
-0.0038204193115234375,
-0.01824951171875,
-0.002849578857421875,
0.0291595458984375,
-0.043212890625,
0.007457733154296875,
-0.027801513671875,
0.07281494140625,
-0.06732177734375,
0.06024169921875,
0.06048583984375,
-0.0633544921875,
-0.085205078125,
-0.00118255615234375,
-0.0240936279296875,
-0.044036865234375,
0.045166015625,
0.0032291412353515625,
-0.0014905929565429688,
0.0006546974182128906,
-0.02490234375,
-0.058624267578125,
0.08135986328125,
0.027984619140625,
-0.024261474609375,
-0.01324462890625,
0.00235748291015625,
0.03741455078125,
0.0016889572143554688,
0.018890380859375,
0.0263671875,
0.05517578125,
-0.014007568359375,
-0.08074951171875,
0.0145111083984375,
-0.04217529296875,
-0.00641632080078125,
0.02642822265625,
-0.06494140625,
0.05975341796875,
0.01073455810546875,
-0.01922607421875,
0.011260986328125,
0.0513916015625,
0.032745361328125,
0.002063751220703125,
0.042449951171875,
0.0679931640625,
0.029052734375,
-0.0297088623046875,
0.07745361328125,
-0.031463623046875,
0.0384521484375,
0.0635986328125,
0.01529693603515625,
0.06396484375,
0.040283203125,
-0.022796630859375,
0.056610107421875,
0.0521240234375,
-0.01192474365234375,
0.0242156982421875,
-0.007564544677734375,
-0.006298065185546875,
-0.0128936767578125,
-0.01885986328125,
-0.037353515625,
0.042205810546875,
0.003299713134765625,
-0.0113067626953125,
-0.00449371337890625,
-0.0115966796875,
0.024261474609375,
0.009552001953125,
-0.00936126708984375,
0.05194091796875,
-0.01425933837890625,
-0.043426513671875,
0.05517578125,
0.007061004638671875,
0.049163818359375,
-0.04852294921875,
-0.0006313323974609375,
-0.01419830322265625,
0.007335662841796875,
-0.00142669677734375,
-0.054534912109375,
0.0309906005859375,
0.01386260986328125,
-0.0201416015625,
-0.01284027099609375,
0.0082855224609375,
-0.0286102294921875,
-0.059722900390625,
0.033233642578125,
0.032562255859375,
0.022369384765625,
0.0176239013671875,
-0.055572509765625,
0.004119873046875,
0.01244354248046875,
-0.0535888671875,
-0.001129150390625,
0.053436279296875,
0.0020961761474609375,
0.05242919921875,
0.03533935546875,
0.0220947265625,
0.00463104248046875,
-0.0009899139404296875,
0.046539306640625,
-0.0653076171875,
-0.035125732421875,
-0.058441162109375,
0.037811279296875,
-0.0031757354736328125,
-0.048431396484375,
0.04949951171875,
0.056243896484375,
0.0645751953125,
-0.0055694580078125,
0.0266265869140625,
-0.0171966552734375,
0.035858154296875,
-0.048828125,
0.055877685546875,
-0.06939697265625,
0.00449371337890625,
-0.0152130126953125,
-0.052642822265625,
-0.0181732177734375,
0.0269622802734375,
-0.00959014892578125,
0.0027313232421875,
0.07275390625,
0.0509033203125,
0.01200103759765625,
-0.0223236083984375,
-0.007061004638671875,
0.03338623046875,
0.021728515625,
0.06353759765625,
0.018707275390625,
-0.07373046875,
0.05682373046875,
-0.020965576171875,
0.004070281982421875,
0.002292633056640625,
-0.058258056640625,
-0.06085205078125,
-0.06060791015625,
-0.01548004150390625,
-0.033355712890625,
-0.00920867919921875,
0.07427978515625,
0.02294921875,
-0.0755615234375,
-0.0247802734375,
-0.004375457763671875,
0.00962066650390625,
-0.0268096923828125,
-0.0186309814453125,
0.0616455078125,
-0.004535675048828125,
-0.07635498046875,
0.0137939453125,
0.0010595321655273438,
0.01342010498046875,
0.007442474365234375,
-0.0028362274169921875,
-0.0618896484375,
0.00223541259765625,
0.019805908203125,
0.00943756103515625,
-0.066162109375,
-0.0151824951171875,
0.00009268522262573242,
-0.0174560546875,
0.01751708984375,
0.001102447509765625,
-0.0266876220703125,
0.007171630859375,
0.04833984375,
0.03045654296875,
0.037933349609375,
-0.0113067626953125,
0.0218963623046875,
-0.055816650390625,
0.038177490234375,
0.0191650390625,
0.040802001953125,
0.017822265625,
-0.01021575927734375,
0.06488037109375,
0.020782470703125,
-0.0268096923828125,
-0.075439453125,
0.0017023086547851562,
-0.09796142578125,
-0.01029205322265625,
0.0748291015625,
-0.016021728515625,
-0.0293731689453125,
0.01204681396484375,
-0.0205841064453125,
0.042205810546875,
-0.03436279296875,
0.052459716796875,
0.06982421875,
0.029815673828125,
0.0112762451171875,
-0.03466796875,
0.023529052734375,
0.049407958984375,
-0.058868408203125,
-0.0157623291015625,
0.01323699951171875,
0.020294189453125,
0.0308685302734375,
0.044464111328125,
-0.022796630859375,
0.0166015625,
-0.0114898681640625,
0.026641845703125,
-0.01480865478515625,
-0.007335662841796875,
-0.020660400390625,
0.01554107666015625,
-0.00617218017578125,
-0.01763916015625
]
] |
KoboldAI/GPT-J-6B-Janeway | 2022-03-20T12:59:44.000Z | [
"transformers",
"pytorch",
"gptj",
"text-generation",
"en",
"arxiv:2101.00027",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] | text-generation | KoboldAI | null | null | KoboldAI/GPT-J-6B-Janeway | 11 | 34,195 | transformers | 2022-03-02T23:29:04 | ---
language: en
license: mit
---
# GPT-J 6B - Janeway
## Model Description
GPT-J 6B-Janeway is a finetune created using EleutherAI's GPT-J 6B model.
## Training data
The training data contains around 2210 ebooks, mostly in the sci-fi and fantasy genres. The dataset is based on the same dataset used by GPT-Neo-2.7B-Picard, with 20% more data in various genres.
Some parts of the dataset have been prepended using the following text: `[Genre: <genre1>,<genre2>]`
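Since parts of the training data carry this genre tag, prompts written in the same format may help steer generation toward a genre. A small sketch of building such a prompt (the helper name is hypothetical; only the `[Genre: ...]` format comes from the training-data description above):

```python
def genre_prompt(text, genres):
    """Prefix a prompt with the [Genre: <genre1>,<genre2>] tag format
    used on parts of the training data (hypothetical helper)."""
    return f"[Genre: {','.join(genres)}] {text}"

# Example: steer generation toward sci-fi/fantasy.
prompt = genre_prompt("Welcome Captain Janeway,", ["sci-fi", "fantasy"])
print(prompt)  # [Genre: sci-fi,fantasy] Welcome Captain Janeway,
```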
### How to use
You can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:
```py
>>> from transformers import pipeline
>>> generator = pipeline('text-generation', model='KoboldAI/GPT-J-6B-Janeway')
>>> generator("Welcome Captain Janeway, I apologize for the delay.", do_sample=True, min_length=50)
[{'generated_text': 'Welcome Captain Janeway, I apologize for the delay."\nIt\'s all right," Janeway said. "I\'m certain that you\'re doing your best to keep me informed of what\'s going on."'}]
```
### Limitations and Biases
The core functionality of GPT-J is taking a string of text and predicting the next token. While language models are widely used for tasks other than this, there are a lot of unknowns with this work. When prompting GPT-J it is important to remember that the statistically most likely next token is often not the token that produces the most "accurate" text. Never depend upon GPT-J to produce factually accurate output.
GPT-J was trained on the Pile, a dataset known to contain profanity, lewd, and otherwise abrasive language. Depending upon use case GPT-J may produce socially unacceptable text. See [Sections 5 and 6 of the Pile paper](https://arxiv.org/abs/2101.00027) for a more detailed analysis of the biases in the Pile.
As with all language models, it is hard to predict in advance how GPT-J will respond to particular prompts and offensive content may occur without warning. We recommend having a human curate or filter the outputs before releasing them, both to censor undesirable content and to improve the quality of the results.
### BibTeX entry and citation info
The model uses the following model as base:
```bibtex
@misc{gpt-j,
author = {Wang, Ben and Komatsuzaki, Aran},
title = {{GPT-J-6B: A 6 Billion Parameter Autoregressive Language Model}},
howpublished = {\url{https://github.com/kingoflolz/mesh-transformer-jax}},
year = 2021,
month = May
}
```
## Acknowledgements
This project would not have been possible without compute generously provided by Google through the
[TPU Research Cloud](https://sites.research.google/trc/), as well as the Cloud TPU team for providing early access to the [Cloud TPU VM](https://cloud.google.com/blog/products/compute/introducing-cloud-tpu-vms) Alpha.
| 2,823 | [
[
-0.0204620361328125,
-0.0579833984375,
0.0325927734375,
-0.003955841064453125,
-0.0216064453125,
-0.0188140869140625,
-0.007320404052734375,
-0.0207672119140625,
-0.00978851318359375,
0.040130615234375,
-0.03936767578125,
-0.0306396484375,
-0.059417724609375,
0.00921630859375,
-0.034576416015625,
0.1024169921875,
0.02264404296875,
-0.0186767578125,
0.01445770263671875,
0.01367950439453125,
-0.026336669921875,
-0.03192138671875,
-0.05059814453125,
-0.0156707763671875,
0.04302978515625,
-0.006237030029296875,
0.062347412109375,
0.05718994140625,
0.0269012451171875,
0.0207672119140625,
-0.01035308837890625,
-0.0110015869140625,
-0.04107666015625,
-0.0079803466796875,
-0.0167388916015625,
-0.01461029052734375,
-0.017913818359375,
-0.003856658935546875,
0.049560546875,
0.041839599609375,
0.00923919677734375,
0.01189422607421875,
0.003475189208984375,
0.03765869140625,
-0.0261688232421875,
0.0214691162109375,
-0.041107177734375,
-0.01168060302734375,
-0.032073974609375,
0.00678253173828125,
-0.037261962890625,
-0.0204010009765625,
0.0159454345703125,
-0.034393310546875,
0.04571533203125,
0.003742218017578125,
0.08538818359375,
0.0225830078125,
-0.038970947265625,
-0.0171051025390625,
-0.051513671875,
0.054290771484375,
-0.06243896484375,
0.0266265869140625,
0.0278472900390625,
0.009063720703125,
-0.017059326171875,
-0.05889892578125,
-0.051910400390625,
-0.022857666015625,
-0.01555633544921875,
0.0231170654296875,
0.0124969482421875,
-0.00020885467529296875,
0.033203125,
0.02752685546875,
-0.071533203125,
-0.009368896484375,
-0.042022705078125,
-0.01279449462890625,
0.049285888671875,
0.01055145263671875,
0.0216522216796875,
-0.05047607421875,
-0.0265350341796875,
-0.0251312255859375,
-0.033416748046875,
0.002681732177734375,
0.047149658203125,
0.0241851806640625,
-0.021240234375,
0.04498291015625,
0.00399017333984375,
0.044525146484375,
0.011322021484375,
-0.0083770751953125,
0.02825927734375,
-0.0290374755859375,
-0.020782470703125,
-0.017364501953125,
0.08831787109375,
0.0163726806640625,
0.01461029052734375,
-0.01139068603515625,
-0.00420379638671875,
0.00974273681640625,
0.0295867919921875,
-0.061126708984375,
-0.038848876953125,
0.0240936279296875,
-0.0301666259765625,
-0.0240325927734375,
0.0081634521484375,
-0.04949951171875,
-0.02911376953125,
-0.0112762451171875,
0.0263671875,
-0.038421630859375,
-0.035736083984375,
-0.005390167236328125,
-0.0208282470703125,
0.0172271728515625,
0.0088348388671875,
-0.07623291015625,
0.0027828216552734375,
0.027557373046875,
0.0537109375,
-0.0009212493896484375,
-0.023193359375,
-0.01398468017578125,
0.022003173828125,
-0.0134429931640625,
0.050445556640625,
-0.02789306640625,
-0.039215087890625,
-0.005340576171875,
0.019775390625,
-0.0149078369140625,
-0.01482391357421875,
0.051025390625,
-0.0274200439453125,
0.05987548828125,
-0.01279449462890625,
-0.041412353515625,
-0.018310546875,
-0.00140380859375,
-0.04931640625,
0.0701904296875,
0.02728271484375,
-0.0684814453125,
0.034088134765625,
-0.053680419921875,
-0.0193328857421875,
0.007335662841796875,
-0.00823974609375,
-0.047271728515625,
-0.006633758544921875,
0.0204315185546875,
0.0274810791015625,
-0.0185089111328125,
0.0347900390625,
-0.0187530517578125,
-0.0273590087890625,
0.0016384124755859375,
-0.043060302734375,
0.0787353515625,
0.02117919921875,
-0.04217529296875,
-0.0086212158203125,
-0.050689697265625,
-0.00791168212890625,
0.0272216796875,
-0.01120758056640625,
-0.0157623291015625,
-0.007171630859375,
0.0212860107421875,
0.01555633544921875,
0.00975799560546875,
-0.024749755859375,
0.0322265625,
-0.03399658203125,
0.02154541015625,
0.05218505859375,
-0.0035228729248046875,
0.033416748046875,
-0.033599853515625,
0.03662109375,
-0.00695037841796875,
0.01226806640625,
-0.01007080078125,
-0.05633544921875,
-0.053619384765625,
-0.0031642913818359375,
0.031646728515625,
0.04217529296875,
-0.05767822265625,
0.0328369140625,
-0.02044677734375,
-0.051910400390625,
-0.0352783203125,
-0.01340484619140625,
0.032379150390625,
0.028961181640625,
0.032806396484375,
-0.0184478759765625,
-0.0271453857421875,
-0.058746337890625,
-0.0252685546875,
-0.034912109375,
-0.01012420654296875,
0.014678955078125,
0.0313720703125,
-0.023284912109375,
0.06256103515625,
-0.02789306640625,
-0.01155853271484375,
-0.015899658203125,
0.02581787109375,
0.0389404296875,
0.034149169921875,
0.04638671875,
-0.06353759765625,
-0.045501708984375,
-0.006145477294921875,
-0.0452880859375,
-0.0224151611328125,
-0.0214385986328125,
-0.0096893310546875,
0.0232696533203125,
0.00838470458984375,
-0.059783935546875,
0.021697998046875,
0.045501708984375,
-0.046295166015625,
0.046844482421875,
0.00506591796875,
0.0121917724609375,
-0.09619140625,
0.00689697265625,
0.0010080337524414062,
-0.01175689697265625,
-0.045867919921875,
-0.01168060302734375,
-0.0030574798583984375,
0.0108184814453125,
-0.030242919921875,
0.048919677734375,
-0.033843994140625,
0.008544921875,
-0.0145263671875,
0.001659393310546875,
-0.00069427490234375,
0.041717529296875,
-0.003566741943359375,
0.060150146484375,
0.023590087890625,
-0.047760009765625,
0.01239013671875,
0.024200439453125,
-0.008758544921875,
0.01251983642578125,
-0.049285888671875,
0.0133056640625,
-0.01348114013671875,
0.0191650390625,
-0.070068359375,
-0.023040771484375,
0.039825439453125,
-0.04779052734375,
0.0279541015625,
-0.041961669921875,
-0.038116455078125,
-0.037567138671875,
-0.0009174346923828125,
0.01922607421875,
0.05548095703125,
-0.024261474609375,
0.046478271484375,
0.0251312255859375,
-0.0264434814453125,
-0.046661376953125,
-0.039215087890625,
-0.009735107421875,
-0.027557373046875,
-0.0413818359375,
0.027801513671875,
-0.01329803466796875,
0.003147125244140625,
-0.0068206787109375,
0.012420654296875,
-0.0007815361022949219,
-0.0126800537109375,
0.01068878173828125,
0.021240234375,
0.006465911865234375,
-0.00276947021484375,
0.00878143310546875,
-0.01399993896484375,
0.0202789306640625,
-0.031097412109375,
0.0426025390625,
-0.006336212158203125,
-0.00946044921875,
-0.0169677734375,
0.01558685302734375,
0.039306640625,
0.0145263671875,
0.055419921875,
0.0845947265625,
-0.01708984375,
0.0098724365234375,
-0.033416748046875,
-0.006832122802734375,
-0.035919189453125,
0.039886474609375,
-0.02557373046875,
-0.05548095703125,
0.03668212890625,
0.0246734619140625,
0.018524169921875,
0.0653076171875,
0.05267333984375,
-0.000469207763671875,
0.0859375,
0.03582763671875,
-0.006511688232421875,
0.03662109375,
-0.0245361328125,
0.006336212158203125,
-0.05267333984375,
-0.0061187744140625,
-0.046539306640625,
-0.0019664764404296875,
-0.07623291015625,
-0.0160369873046875,
0.018524169921875,
-0.00482940673828125,
-0.044525146484375,
0.03741455078125,
-0.047149658203125,
0.0260467529296875,
0.0484619140625,
-0.01279449462890625,
0.0166015625,
-0.0058746337890625,
-0.0107421875,
-0.006710052490234375,
-0.04974365234375,
-0.04071044921875,
0.08819580078125,
0.0310211181640625,
0.051055908203125,
0.0261077880859375,
0.04766845703125,
0.01727294921875,
0.0263824462890625,
-0.034088134765625,
0.0227813720703125,
-0.02349853515625,
-0.06512451171875,
-0.0250701904296875,
-0.038909912109375,
-0.07855224609375,
0.01983642578125,
-0.006702423095703125,
-0.05780029296875,
0.004764556884765625,
0.009613037109375,
-0.013702392578125,
0.0210113525390625,
-0.07537841796875,
0.0670166015625,
-0.007213592529296875,
-0.0277099609375,
0.0157623291015625,
-0.059051513671875,
0.0302734375,
-0.0005364418029785156,
0.01497650146484375,
0.004299163818359375,
0.01134490966796875,
0.05047607421875,
-0.026458740234375,
0.06817626953125,
-0.0199432373046875,
-0.00818634033203125,
0.0269012451171875,
-0.00901031494140625,
0.04876708984375,
0.0092620849609375,
0.01021575927734375,
0.0229949951171875,
-0.0084075927734375,
-0.0343017578125,
-0.023162841796875,
0.044921875,
-0.0794677734375,
-0.044189453125,
-0.04705810546875,
-0.0433349609375,
0.0034198760986328125,
0.031280517578125,
0.05084228515625,
0.0294342041015625,
0.0078125,
0.00655364990234375,
0.039825439453125,
-0.0199127197265625,
0.0389404296875,
0.01482391357421875,
-0.029266357421875,
-0.03240966796875,
0.05975341796875,
0.01052093505859375,
0.020843505859375,
0.019989013671875,
0.0246429443359375,
-0.05120849609375,
-0.04376220703125,
-0.05340576171875,
0.0289306640625,
-0.0345458984375,
-0.00449371337890625,
-0.06500244140625,
-0.0301666259765625,
-0.046234130859375,
0.002483367919921875,
-0.019744873046875,
-0.0252227783203125,
-0.0191192626953125,
0.0014820098876953125,
0.02996826171875,
0.0638427734375,
0.0170745849609375,
0.03662109375,
-0.0643310546875,
0.02325439453125,
0.017669677734375,
0.0184783935546875,
-0.016632080078125,
-0.068359375,
-0.02197265625,
0.005893707275390625,
-0.033782958984375,
-0.058868408203125,
0.045806884765625,
-0.004932403564453125,
0.031341552734375,
0.01467132568359375,
0.006053924560546875,
0.03497314453125,
-0.037750244140625,
0.08648681640625,
0.005649566650390625,
-0.053924560546875,
0.032318115234375,
-0.07464599609375,
0.0439453125,
0.0217132568359375,
0.0310211181640625,
-0.037200927734375,
-0.05615234375,
-0.0794677734375,
-0.07525634765625,
0.04583740234375,
0.037384033203125,
0.0164794921875,
-0.0016536712646484375,
0.006999969482421875,
0.0219268798828125,
0.0151214599609375,
-0.0880126953125,
-0.0289306640625,
-0.041748046875,
-0.0229339599609375,
-0.0009264945983886719,
-0.00832366943359375,
0.009368896484375,
-0.00893402099609375,
0.0660400390625,
-0.01024627685546875,
0.054718017578125,
0.00815582275390625,
-0.01360321044921875,
0.003925323486328125,
0.0171966552734375,
0.039520263671875,
0.04364013671875,
-0.0214385986328125,
-0.0190582275390625,
-0.0018396377563476562,
-0.06536865234375,
-0.00798797607421875,
0.03106689453125,
-0.03289794921875,
0.013824462890625,
0.01776123046875,
0.07513427734375,
-0.0186004638671875,
-0.0178375244140625,
0.0360107421875,
-0.01099395751953125,
-0.0299530029296875,
-0.045654296875,
0.0180511474609375,
0.0004112720489501953,
0.0145263671875,
0.03131103515625,
-0.0028400421142578125,
0.0209808349609375,
-0.01544952392578125,
0.006946563720703125,
0.0139007568359375,
-0.01197052001953125,
-0.019073486328125,
0.06982421875,
0.012359619140625,
-0.01085662841796875,
0.05474853515625,
-0.0298614501953125,
-0.0301513671875,
0.04296875,
0.046295166015625,
0.07550048828125,
-0.00460052490234375,
0.01519775390625,
0.04547119140625,
0.037200927734375,
0.00478363037109375,
0.007648468017578125,
0.0293731689453125,
-0.050537109375,
-0.047027587890625,
-0.04656982421875,
-0.0211029052734375,
0.042633056640625,
-0.0302276611328125,
0.0294342041015625,
-0.034454345703125,
-0.0229034423828125,
-0.01441192626953125,
0.00624847412109375,
-0.0595703125,
0.0267791748046875,
0.00887298583984375,
0.0384521484375,
-0.0714111328125,
0.058868408203125,
0.0552978515625,
-0.052093505859375,
-0.0753173828125,
0.009796142578125,
0.00214385986328125,
-0.039215087890625,
0.01947021484375,
0.01593017578125,
0.0225677490234375,
0.020111083984375,
-0.04168701171875,
-0.068359375,
0.08306884765625,
0.01049041748046875,
-0.039825439453125,
-0.01291656494140625,
0.0099945068359375,
0.0379638671875,
-0.00971221923828125,
0.05322265625,
0.0340576171875,
0.0443115234375,
-0.01214599609375,
-0.06634521484375,
-0.0001360177993774414,
-0.03472900390625,
0.0153350830078125,
0.0233001708984375,
-0.0594482421875,
0.06927490234375,
-0.002422332763671875,
-0.0144805908203125,
0.004058837890625,
0.050323486328125,
0.0189056396484375,
0.005939483642578125,
0.043731689453125,
0.039337158203125,
0.050994873046875,
-0.026763916015625,
0.09503173828125,
-0.00872802734375,
0.049072265625,
0.0723876953125,
0.01372528076171875,
0.028778076171875,
0.0178680419921875,
-0.027740478515625,
0.04876708984375,
0.0552978515625,
-0.0182342529296875,
0.04486083984375,
-0.00251007080078125,
-0.00844573974609375,
0.00441741943359375,
0.00074005126953125,
-0.04345703125,
0.0166778564453125,
0.0278472900390625,
-0.04052734375,
0.00931549072265625,
-0.01239013671875,
0.015533447265625,
-0.015380859375,
-0.013824462890625,
0.0494384765625,
-0.0007076263427734375,
-0.0455322265625,
0.05120849609375,
-0.01068878173828125,
0.052215576171875,
-0.046905517578125,
0.006801605224609375,
-0.0107421875,
0.0008645057678222656,
-0.006633758544921875,
-0.0518798828125,
0.01715087890625,
0.0005078315734863281,
-0.03369140625,
-0.0165252685546875,
0.0548095703125,
-0.044158935546875,
-0.038116455078125,
0.0082855224609375,
0.038482666015625,
0.023834228515625,
0.00811767578125,
-0.06304931640625,
-0.009521484375,
0.014495849609375,
-0.037567138671875,
0.023681640625,
0.035186767578125,
0.0005450248718261719,
0.037750244140625,
0.0557861328125,
0.00180816650390625,
0.004718780517578125,
0.0291290283203125,
0.05426025390625,
-0.056884765625,
-0.044342041015625,
-0.051513671875,
0.0706787109375,
-0.00823211669921875,
-0.03350830078125,
0.04852294921875,
0.0269317626953125,
0.07861328125,
-0.0237884521484375,
0.090576171875,
-0.0306243896484375,
0.04864501953125,
-0.037994384765625,
0.060760498046875,
-0.0284423828125,
0.006656646728515625,
-0.04705810546875,
-0.0885009765625,
-0.0051727294921875,
0.05401611328125,
-0.0216064453125,
0.041595458984375,
0.07470703125,
0.05621337890625,
0.0009679794311523438,
0.0092926025390625,
0.0162811279296875,
0.028778076171875,
0.0300750732421875,
0.04071044921875,
0.04571533203125,
-0.0615234375,
0.0499267578125,
-0.0207672119140625,
-0.00971221923828125,
-0.005039215087890625,
-0.06866455078125,
-0.0740966796875,
-0.043060302734375,
-0.0192108154296875,
-0.04510498046875,
0.005565643310546875,
0.058258056640625,
0.03326416015625,
-0.0557861328125,
-0.0050811767578125,
-0.0171051025390625,
-0.0021991729736328125,
-0.01175689697265625,
-0.02471923828125,
0.0283050537109375,
-0.01593017578125,
-0.0850830078125,
0.009368896484375,
-0.0121307373046875,
0.0230865478515625,
-0.0220489501953125,
-0.006206512451171875,
-0.01029205322265625,
-0.0193939208984375,
0.0286865234375,
0.00865936279296875,
-0.04736328125,
-0.032806396484375,
-0.0010232925415039062,
-0.0168609619140625,
-0.0116729736328125,
0.0328369140625,
-0.05133056640625,
0.021484375,
0.038848876953125,
0.04803466796875,
0.038818359375,
0.0014772415161132812,
0.0537109375,
-0.04705810546875,
0.002948760986328125,
0.013519287109375,
0.02349853515625,
0.02752685546875,
-0.043792724609375,
0.045440673828125,
0.042083740234375,
-0.06756591796875,
-0.03363037109375,
0.0101776123046875,
-0.06939697265625,
-0.0088043212890625,
0.10546875,
0.0013246536254882812,
-0.0147247314453125,
-0.027557373046875,
-0.0190887451171875,
0.03173828125,
-0.040283203125,
0.059051513671875,
0.0616455078125,
0.0084228515625,
-0.0207061767578125,
-0.05413818359375,
0.037506103515625,
0.0249176025390625,
-0.0440673828125,
-0.0011491775512695312,
0.03936767578125,
0.0222625732421875,
0.01503753662109375,
0.049896240234375,
-0.010894775390625,
0.0110626220703125,
0.005580902099609375,
0.006252288818359375,
-0.0006661415100097656,
-0.0184478759765625,
-0.00835418701171875,
0.004650115966796875,
-0.00540924072265625,
0.007568359375
]
] |
stabilityai/sd-x2-latent-upscaler | 2023-06-05T16:28:02.000Z | [
"diffusers",
"stable-diffusion",
"license:openrail++",
"has_space",
"diffusers:StableDiffusionLatentUpscalePipeline",
"region:us"
] | null | stabilityai | null | null | stabilityai/sd-x2-latent-upscaler | 145 | 34,118 | diffusers | 2023-02-03T11:24:02 | ---
license: openrail++
tags:
- stable-diffusion
inference: false
---
# Stable Diffusion x2 latent upscaler model card
This model card focuses on the latent diffusion-based upscaler developed by [Katherine Crowson](https://github.com/crowsonkb/k-diffusion)
in collaboration with [Stability AI](https://stability.ai/).
This model was trained on a high-resolution subset of the LAION-2B dataset.
It is a diffusion model that operates in the same latent space as the Stable Diffusion model, and its output latents are then decoded into a full-resolution image.
To use it with Stable Diffusion, you can take the generated latents from Stable Diffusion and pass them into the upscaler before decoding with your standard VAE.
Or you can take any image, encode it into the latent space, use the upscaler, and decode it.
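The resolution bookkeeping implied by that round trip can be sketched in a few lines (a sketch assuming the standard Stable Diffusion VAE downsampling factor of 8 and this upscaler's 2x latent scale):

```python
def upscaled_size(height, width, vae_factor=8, latent_scale=2):
    """Pixel-space output size after a latent round trip through the upscaler.

    Encoding divides each spatial dimension by `vae_factor`, the upscaler
    multiplies the latent resolution by `latent_scale`, and decoding
    multiplies by `vae_factor` again.
    """
    latent_h, latent_w = height // vae_factor, width // vae_factor
    up_h, up_w = latent_h * latent_scale, latent_w * latent_scale
    return up_h * vae_factor, up_w * vae_factor

print(upscaled_size(512, 512))  # → (1024, 1024)
```

This is why a 512x512 Stable Diffusion output (a 64x64 latent) decodes to a 1024x1024 image after its latent is doubled to 128x128.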
**Note**:
This upscaling model is designed explicitly for **Stable Diffusion**, as it can upscale Stable Diffusion's denoised latent image embeddings.
This allows for very fast text-to-image + upscaling pipelines, as all intermediate states can be kept on the GPU. For more information, see the example below.
This model works with all [Stable Diffusion checkpoints](https://huggingface.co/models?other=stable-diffusion).
|  |
|:--:|
Image by Tanishq Abraham from [Stability AI](https://stability.ai/) originating from [this tweet](https://twitter.com/StabilityAI/status/1590531958815064065)|
| Original output image | 2x upscaled output image |
|:-------------------------:|:-------------------------:|
|  |  |
- Use it with 🧨 [`diffusers`](https://huggingface.co/stabilityai/sd-x2-latent-upscaler#examples)
## Model Details
- **Developed by:** Katherine Crowson
- **Model type:** Diffusion-based latent upscaler
- **Language(s):** English
- **License:** [CreativeML Open RAIL++-M License](https://huggingface.co/stabilityai/stable-diffusion-2/blob/main/LICENSE-MODEL)
## Examples
Use the [🤗 Diffusers library](https://github.com/huggingface/diffusers) to run the latent upscaler on top of any `StableDiffusionPipeline` checkpoint
to enhance its output image resolution by a factor of 2.
```bash
pip install git+https://github.com/huggingface/diffusers.git
pip install transformers accelerate scipy safetensors
```
```python
from diffusers import StableDiffusionLatentUpscalePipeline, StableDiffusionPipeline
import torch
pipeline = StableDiffusionPipeline.from_pretrained("CompVis/stable-diffusion-v1-4", torch_dtype=torch.float16)
pipeline.to("cuda")
upscaler = StableDiffusionLatentUpscalePipeline.from_pretrained("stabilityai/sd-x2-latent-upscaler", torch_dtype=torch.float16)
upscaler.to("cuda")
prompt = "a photo of an astronaut high resolution, unreal engine, ultra realistic"
generator = torch.manual_seed(33)
# we stay in latent space! Let's make sure that Stable Diffusion returns the image
# in latent space
low_res_latents = pipeline(prompt, generator=generator, output_type="latent").images
upscaled_image = upscaler(
prompt=prompt,
image=low_res_latents,
num_inference_steps=20,
guidance_scale=0,
generator=generator,
).images[0]
# Save the upscaled image as "astronaut_1024.png"
upscaled_image.save("astronaut_1024.png")
# For comparison, also save the low-resolution image
with torch.no_grad():
image = pipeline.decode_latents(low_res_latents)
image = pipeline.numpy_to_pil(image)[0]
image.save("astronaut_512.png")
```
**Result**:
*512-res Astronaut*

*1024-res Astronaut*

**Notes**:
- Although it is not a dependency, we highly recommend installing [xformers](https://github.com/facebookresearch/xformers) for memory-efficient attention (better performance).
- If you have limited GPU RAM, add `pipe.enable_attention_slicing()` after moving the pipeline to `cuda` to reduce VRAM usage (at the cost of speed).
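As a minimal illustration of these trade-offs, the following hypothetical helper (not part of `diffusers`) applies both optimizations where available, falling back gracefully when xformers is not installed:

```python
def reduce_vram_usage(pipe, try_xformers=True):
    """Enable optional memory optimizations on a diffusers-style pipeline.

    Attention slicing lowers peak VRAM at some cost in speed; xformers
    memory-efficient attention is an optional dependency, so it is
    attempted only when the pipeline exposes it and it imports cleanly.
    """
    applied = []
    if hasattr(pipe, "enable_attention_slicing"):
        pipe.enable_attention_slicing()
        applied.append("attention_slicing")
    if try_xformers and hasattr(pipe, "enable_xformers_memory_efficient_attention"):
        try:
            pipe.enable_xformers_memory_efficient_attention()
            applied.append("xformers")
        except Exception:
            pass  # xformers not installed; attention slicing alone still helps
    return applied
```

In the example above you would call `reduce_vram_usage(pipeline)` and `reduce_vram_usage(upscaler)` after moving each pipeline to `cuda`.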
# Uses
## Direct Use
The model is intended for research purposes only. Possible research areas and tasks include:
- Safe deployment of models which have the potential to generate harmful content.
- Probing and understanding the limitations and biases of generative models.
- Generation of artworks and use in design and other artistic processes.
- Applications in educational or creative tools.
- Research on generative models.
Excluded uses are described below.
### Misuse, Malicious Use, and Out-of-Scope Use
_Note: This section is originally taken from the [DALLE-MINI model card](https://huggingface.co/dalle-mini/dalle-mini), was used for Stable Diffusion v1, but applies in the same way to Stable Diffusion v2_.
The model should not be used to intentionally create or disseminate images that create hostile or alienating environments for people. This includes generating images that people would foreseeably find disturbing, distressing, or offensive; or content that propagates historical or current stereotypes.
#### Out-of-Scope Use
The model was not trained to be factual or true representations of people or events, and therefore using the model to generate such content is out-of-scope for the abilities of this model.
#### Misuse and Malicious Use
Using the model to generate content that is cruel to individuals is a misuse of this model. This includes, but is not limited to:
- Generating demeaning, dehumanizing, or otherwise harmful representations of people or their environments, cultures, religions, etc.
- Intentionally promoting or propagating discriminatory content or harmful stereotypes.
- Impersonating individuals without their consent.
- Sexual content without consent of the people who might see it.
- Mis- and disinformation.
- Representations of egregious violence and gore.
- Sharing of copyrighted or licensed material in violation of its terms of use.
- Sharing content that is an alteration of copyrighted or licensed material in violation of its terms of use.
## Limitations and Bias
### Limitations
- The model does not achieve perfect photorealism.
- The model cannot render legible text.
- The model does not perform well on more difficult tasks which involve compositionality, such as rendering an image corresponding to “A red cube on top of a blue sphere”.
- Faces and people in general may not be generated properly.
- The model was trained mainly with English captions and will not work as well in other languages.
- The autoencoding part of the model is lossy.
- The model was trained on a subset of the large-scale dataset
[LAION-5B](https://laion.ai/blog/laion-5b/), which contains adult, violent and sexual content. To partially mitigate this, we have filtered the dataset using LAION's NSFW detector (see Training section).
### Bias
While the capabilities of image generation models are impressive, they can also reinforce or exacerbate social biases.
Stable Diffusion v2 was primarily trained on subsets of [LAION-2B(en)](https://laion.ai/blog/laion-5b/),
which consists of images that are limited to English descriptions.
Texts and images from communities and cultures that use other languages are likely to be insufficiently accounted for.
This affects the overall output of the model, as white and western cultures are often set as the default. Further, the
ability of the model to generate content with non-English prompts is significantly worse than with English-language prompts.
Stable Diffusion v2 mirrors and exacerbates biases to such a degree that viewer discretion must be advised irrespective of the input or its intent. | 7,687 | [
[
-0.03179931640625,
-0.05499267578125,
0.0257110595703125,
0.01081085205078125,
-0.01611328125,
-0.0135650634765625,
-0.002532958984375,
-0.0297088623046875,
0.00927734375,
0.0272064208984375,
-0.02984619140625,
-0.031158447265625,
-0.05426025390625,
-0.01763916015625,
-0.0264739990234375,
0.0704345703125,
-0.0022640228271484375,
-0.00916290283203125,
-0.014801025390625,
-0.007274627685546875,
-0.029388427734375,
-0.01264190673828125,
-0.076416015625,
-0.01194000244140625,
0.029876708984375,
0.01178741455078125,
0.06671142578125,
0.04437255859375,
0.035888671875,
0.0254364013671875,
-0.0229644775390625,
-0.00328826904296875,
-0.05743408203125,
-0.01035308837890625,
0.012298583984375,
-0.021331787109375,
-0.04522705078125,
0.00885009765625,
0.05609130859375,
0.0191802978515625,
-0.007659912109375,
0.007720947265625,
0.0007166862487792969,
0.048614501953125,
-0.039093017578125,
-0.0011110305786132812,
-0.0081634521484375,
0.005107879638671875,
-0.006847381591796875,
0.0223846435546875,
-0.02099609375,
-0.01552581787109375,
0.0009756088256835938,
-0.063720703125,
0.029388427734375,
-0.0124969482421875,
0.08892822265625,
0.033294677734375,
-0.021697998046875,
0.0005192756652832031,
-0.052825927734375,
0.0462646484375,
-0.046478271484375,
0.0233154296875,
0.02020263671875,
0.006641387939453125,
0.0011129379272460938,
-0.07281494140625,
-0.0423583984375,
-0.00884246826171875,
-0.01264190673828125,
0.036834716796875,
-0.0289764404296875,
0.0026683807373046875,
0.031524658203125,
0.030517578125,
-0.04730224609375,
-0.0030612945556640625,
-0.045806884765625,
-0.00878143310546875,
0.057281494140625,
0.007843017578125,
0.026641845703125,
-0.0156402587890625,
-0.023956298828125,
-0.0157928466796875,
-0.0294952392578125,
0.002056121826171875,
0.021514892578125,
-0.0207672119140625,
-0.038238525390625,
0.0301361083984375,
0.00966644287109375,
0.033050537109375,
0.0245513916015625,
-0.0025196075439453125,
0.0245208740234375,
-0.01788330078125,
-0.01288604736328125,
-0.03228759765625,
0.06573486328125,
0.054473876953125,
-0.0078582763671875,
0.004852294921875,
-0.0034809112548828125,
0.018402099609375,
0.00945281982421875,
-0.0867919921875,
-0.036865234375,
0.023895263671875,
-0.05499267578125,
-0.034210205078125,
-0.01385498046875,
-0.0811767578125,
-0.0157928466796875,
0.0164794921875,
0.035491943359375,
-0.0225677490234375,
-0.0423583984375,
-0.006011962890625,
-0.02496337890625,
0.00859832763671875,
0.0268707275390625,
-0.053070068359375,
0.0157012939453125,
0.006206512451171875,
0.08489990234375,
-0.01503753662109375,
0.001068115234375,
-0.0118865966796875,
0.00284576416015625,
-0.0251007080078125,
0.049224853515625,
-0.025360107421875,
-0.045196533203125,
-0.01959228515625,
0.0225372314453125,
0.01274871826171875,
-0.041168212890625,
0.059539794921875,
-0.036376953125,
0.02294921875,
-0.00984954833984375,
-0.032806396484375,
-0.01519012451171875,
-0.0059356689453125,
-0.048980712890625,
0.0821533203125,
0.02862548828125,
-0.0635986328125,
0.013580322265625,
-0.0562744140625,
-0.0028629302978515625,
-0.0014410018920898438,
-0.00583648681640625,
-0.053070068359375,
0.0008139610290527344,
0.0037212371826171875,
0.029052734375,
-0.005718231201171875,
0.01019287109375,
-0.01520538330078125,
-0.0236968994140625,
-0.006153106689453125,
-0.031768798828125,
0.0740966796875,
0.0242156982421875,
-0.028961181640625,
0.01519775390625,
-0.044952392578125,
-0.01348114013671875,
0.038543701171875,
-0.0163421630859375,
-0.01392364501953125,
-0.01678466796875,
0.0234375,
0.0239410400390625,
0.00691986083984375,
-0.039337158203125,
0.0035228729248046875,
-0.0176544189453125,
0.03778076171875,
0.057861328125,
0.0019664764404296875,
0.056854248046875,
-0.03253173828125,
0.040191650390625,
0.0180206298828125,
0.0191802978515625,
-0.006072998046875,
-0.060211181640625,
-0.055145263671875,
-0.025390625,
0.00978851318359375,
0.032318115234375,
-0.04559326171875,
0.01271820068359375,
-0.006633758544921875,
-0.053070068359375,
-0.03131103515625,
-0.004741668701171875,
0.0205535888671875,
0.04803466796875,
0.0227203369140625,
-0.0286712646484375,
-0.025390625,
-0.053558349609375,
0.0225372314453125,
-0.01499176025390625,
0.004238128662109375,
0.02630615234375,
0.0433349609375,
-0.03173828125,
0.04132080078125,
-0.054931640625,
-0.0218963623046875,
0.01537322998046875,
0.002910614013671875,
0.0043487548828125,
0.053955078125,
0.060516357421875,
-0.0745849609375,
-0.04779052734375,
-0.01236724853515625,
-0.06427001953125,
0.0039215087890625,
0.0013055801391601562,
-0.0298919677734375,
0.023468017578125,
0.03369140625,
-0.06317138671875,
0.044189453125,
0.0494384765625,
-0.040435791015625,
0.045379638671875,
-0.03656005859375,
0.004589080810546875,
-0.08038330078125,
0.0055694580078125,
0.0217132568359375,
-0.0254669189453125,
-0.0457763671875,
0.01312255859375,
0.006427764892578125,
-0.015899658203125,
-0.034820556640625,
0.058624267578125,
-0.035064697265625,
0.029998779296875,
-0.0297393798828125,
0.0007758140563964844,
0.01343536376953125,
0.0179901123046875,
0.026123046875,
0.0484619140625,
0.060089111328125,
-0.0357666015625,
0.021331787109375,
0.023895263671875,
-0.0076446533203125,
0.037384033203125,
-0.06365966796875,
0.01111602783203125,
-0.036651611328125,
0.0203857421875,
-0.0843505859375,
-0.0200347900390625,
0.03857421875,
-0.035186767578125,
0.032958984375,
-0.0138702392578125,
-0.03118896484375,
-0.040740966796875,
-0.0183868408203125,
0.038726806640625,
0.082763671875,
-0.033233642578125,
0.041107177734375,
0.0247955322265625,
0.0098724365234375,
-0.034912109375,
-0.054473876953125,
-0.01432037353515625,
-0.033447265625,
-0.067626953125,
0.04534912109375,
-0.02142333984375,
-0.01837158203125,
0.01947021484375,
0.01035308837890625,
0.003704071044921875,
-0.01702880859375,
0.037261962890625,
0.027740478515625,
-0.00666046142578125,
-0.0168304443359375,
0.0143280029296875,
-0.015472412109375,
0.0076446533203125,
-0.01097869873046875,
0.0242156982421875,
-0.0009016990661621094,
0.001911163330078125,
-0.0477294921875,
0.03448486328125,
0.045440673828125,
0.00885772705078125,
0.060760498046875,
0.0784912109375,
-0.036712646484375,
0.0139617919921875,
-0.035430908203125,
-0.0247955322265625,
-0.039337158203125,
0.028717041015625,
-0.00972747802734375,
-0.050048828125,
0.057952880859375,
-0.0007677078247070312,
0.006717681884765625,
0.047210693359375,
0.0535888671875,
-0.015716552734375,
0.08587646484375,
0.0517578125,
0.01552581787109375,
0.047454833984375,
-0.06597900390625,
-0.005794525146484375,
-0.07464599609375,
-0.0225372314453125,
-0.012969970703125,
-0.0242919921875,
-0.03753662109375,
-0.05535888671875,
0.027740478515625,
0.01444244384765625,
-0.01611328125,
0.0121917724609375,
-0.04986572265625,
0.030303955078125,
0.0204315185546875,
0.01320648193359375,
0.002033233642578125,
0.01202392578125,
0.0080108642578125,
-0.01849365234375,
-0.054931640625,
-0.0394287109375,
0.07318115234375,
0.03314208984375,
0.0654296875,
0.00731658935546875,
0.036102294921875,
0.030303955078125,
0.03143310546875,
-0.03704833984375,
0.034881591796875,
-0.019622802734375,
-0.0574951171875,
-0.002277374267578125,
-0.027008056640625,
-0.0721435546875,
0.0165557861328125,
-0.0157012939453125,
-0.04876708984375,
0.044189453125,
0.0177154541015625,
-0.032196044921875,
0.0305633544921875,
-0.06201171875,
0.0704345703125,
-0.00426483154296875,
-0.06097412109375,
-0.00820159912109375,
-0.05499267578125,
0.025787353515625,
0.00185394287109375,
0.016357421875,
-0.0036869049072265625,
-0.010040283203125,
0.06463623046875,
-0.0267181396484375,
0.062103271484375,
-0.0301361083984375,
0.002048492431640625,
0.0274505615234375,
-0.005970001220703125,
0.0266571044921875,
0.0211181640625,
-0.006153106689453125,
0.02691650390625,
0.0114898681640625,
-0.044097900390625,
-0.033416748046875,
0.06396484375,
-0.0826416015625,
-0.03948974609375,
-0.0301513671875,
-0.036102294921875,
0.034912109375,
0.01064300537109375,
0.058990478515625,
0.019256591796875,
-0.011474609375,
-0.00931549072265625,
0.05072021484375,
-0.0200347900390625,
0.029998779296875,
0.0156707763671875,
-0.022705078125,
-0.03704833984375,
0.059844970703125,
0.0185546875,
0.035858154296875,
-0.01248931884765625,
0.016387939453125,
-0.025787353515625,
-0.0294647216796875,
-0.045623779296875,
0.02947998046875,
-0.056488037109375,
-0.0151519775390625,
-0.05548095703125,
-0.025360107421875,
-0.03082275390625,
-0.0203704833984375,
-0.030181884765625,
-0.0204620361328125,
-0.0548095703125,
0.01029205322265625,
0.026153564453125,
0.04425048828125,
-0.0277099609375,
0.031890869140625,
-0.034637451171875,
0.0293426513671875,
0.0095672607421875,
0.0119476318359375,
0.00641632080078125,
-0.05908203125,
-0.013519287109375,
0.0037403106689453125,
-0.042938232421875,
-0.0667724609375,
0.031646728515625,
0.007808685302734375,
0.0347900390625,
0.04779052734375,
0.0014781951904296875,
0.048980712890625,
-0.025360107421875,
0.0709228515625,
0.0166168212890625,
-0.044097900390625,
0.0455322265625,
-0.03204345703125,
0.0113983154296875,
0.01407623291015625,
0.03924560546875,
-0.028045654296875,
-0.0219268798828125,
-0.06646728515625,
-0.0693359375,
0.052337646484375,
0.034515380859375,
0.03125,
0.0011043548583984375,
0.061187744140625,
-0.00823211669921875,
-0.00745391845703125,
-0.07470703125,
-0.046875,
-0.024688720703125,
0.00870513916015625,
-0.004444122314453125,
-0.02923583984375,
-0.01280975341796875,
-0.041748046875,
0.061431884765625,
0.0057830810546875,
0.044342041015625,
0.03143310546875,
0.013397216796875,
-0.032196044921875,
-0.0223846435546875,
0.030609130859375,
0.024017333984375,
-0.0219268798828125,
-0.00946807861328125,
-0.0049896240234375,
-0.041015625,
0.0215911865234375,
0.0154266357421875,
-0.041015625,
-0.00269317626953125,
-0.0017118453979492188,
0.0655517578125,
-0.02740478515625,
-0.03173828125,
0.04119873046875,
-0.0233612060546875,
-0.02288818359375,
-0.034332275390625,
0.00621795654296875,
0.0102996826171875,
0.0244140625,
0.01088714599609375,
0.042999267578125,
0.0128021240234375,
-0.0251312255859375,
0.0006937980651855469,
0.03656005859375,
-0.028656005859375,
-0.029541015625,
0.089599609375,
0.006969451904296875,
-0.019287109375,
0.051788330078125,
-0.033966064453125,
-0.0145416259765625,
0.051422119140625,
0.056671142578125,
0.05108642578125,
-0.017425537109375,
0.034027099609375,
0.055938720703125,
0.0222320556640625,
-0.0166168212890625,
0.0196685791015625,
0.01096343994140625,
-0.052642822265625,
-0.020538330078125,
-0.03289794921875,
-0.0009617805480957031,
0.016265869140625,
-0.041351318359375,
0.0266571044921875,
-0.035552978515625,
-0.035675048828125,
-0.00047779083251953125,
-0.00701141357421875,
-0.0408935546875,
0.01042938232421875,
0.021636962890625,
0.059814453125,
-0.08038330078125,
0.052093505859375,
0.0574951171875,
-0.04791259765625,
-0.04876708984375,
0.0035400390625,
0.0005412101745605469,
-0.0297698974609375,
0.034637451171875,
0.0084991455078125,
0.00473785400390625,
0.012451171875,
-0.0526123046875,
-0.07098388671875,
0.09112548828125,
0.035186767578125,
-0.02337646484375,
-0.00672149658203125,
-0.0203857421875,
0.047607421875,
-0.029144287109375,
0.0286712646484375,
0.0285797119140625,
0.03277587890625,
0.0257720947265625,
-0.036590576171875,
0.018829345703125,
-0.033721923828125,
0.0224609375,
-0.003650665283203125,
-0.079833984375,
0.06842041015625,
-0.0380859375,
-0.022552490234375,
0.0260009765625,
0.05865478515625,
0.01910400390625,
0.0227203369140625,
0.035064697265625,
0.06732177734375,
0.0458984375,
-0.0149993896484375,
0.08636474609375,
-0.003368377685546875,
0.03631591796875,
0.04608154296875,
0.0018606185913085938,
0.04656982421875,
0.0297393798828125,
-0.01198577880859375,
0.043243408203125,
0.05767822265625,
-0.0167388916015625,
0.046905517578125,
0.0035152435302734375,
-0.0195465087890625,
-0.0014505386352539062,
-0.006679534912109375,
-0.0390625,
0.0059967041015625,
0.0286407470703125,
-0.036956787109375,
-0.005115509033203125,
0.017333984375,
0.00476837158203125,
-0.0108489990234375,
-0.011474609375,
0.041534423828125,
0.004486083984375,
-0.0278778076171875,
0.0511474609375,
0.004230499267578125,
0.06658935546875,
-0.0303802490234375,
-0.007686614990234375,
-0.01373291015625,
0.0137481689453125,
-0.03155517578125,
-0.056304931640625,
0.039093017578125,
-0.01320648193359375,
-0.01349639892578125,
-0.005584716796875,
0.06201171875,
-0.0223541259765625,
-0.0504150390625,
0.03814697265625,
0.0219268798828125,
0.0293426513671875,
0.00032806396484375,
-0.0784912109375,
0.024627685546875,
-0.006103515625,
-0.0240478515625,
0.01690673828125,
0.01029205322265625,
0.0098114013671875,
0.03887939453125,
0.041259765625,
0.004459381103515625,
0.007965087890625,
-0.007190704345703125,
0.07110595703125,
-0.0222320556640625,
-0.0224609375,
-0.05224609375,
0.05926513671875,
-0.0133514404296875,
-0.01220703125,
0.05645751953125,
0.046783447265625,
0.053802490234375,
-0.008056640625,
0.05133056640625,
-0.024078369140625,
-0.0035381317138671875,
-0.023773193359375,
0.06591796875,
-0.0704345703125,
0.0025634765625,
-0.033660888671875,
-0.06781005859375,
-0.01490020751953125,
0.06646728515625,
-0.021209716796875,
0.0279541015625,
0.0309295654296875,
0.0721435546875,
-0.01385498046875,
-0.0221099853515625,
0.0300445556640625,
0.023651123046875,
0.0236358642578125,
0.020050048828125,
0.058624267578125,
-0.058685302734375,
0.036651611328125,
-0.040008544921875,
-0.028717041015625,
-0.0016956329345703125,
-0.049041748046875,
-0.0618896484375,
-0.053070068359375,
-0.059967041015625,
-0.06304931640625,
-0.005931854248046875,
0.03271484375,
0.0848388671875,
-0.03802490234375,
-0.0007405281066894531,
-0.01030731201171875,
-0.003421783447265625,
-0.00634002685546875,
-0.023956298828125,
0.03369140625,
0.01496124267578125,
-0.0743408203125,
-0.006069183349609375,
0.0183868408203125,
0.035369873046875,
-0.033447265625,
-0.021331787109375,
-0.007442474365234375,
-0.0094146728515625,
0.04010009765625,
0.015655517578125,
-0.049652099609375,
-0.006443023681640625,
-0.0037403106689453125,
0.006427764892578125,
0.01580810546875,
0.0236053466796875,
-0.04998779296875,
0.033935546875,
0.037139892578125,
0.0137939453125,
0.0589599609375,
0.0006341934204101562,
0.007289886474609375,
-0.0421142578125,
0.0298004150390625,
0.00978851318359375,
0.0297088623046875,
0.0232086181640625,
-0.052642822265625,
0.033721923828125,
0.032745361328125,
-0.06964111328125,
-0.054107666015625,
0.00862884521484375,
-0.086669921875,
-0.014068603515625,
0.09375,
-0.0187530517578125,
-0.0252227783203125,
0.003055572509765625,
-0.035003662109375,
0.0185546875,
-0.03924560546875,
0.05322265625,
0.03741455078125,
-0.0166015625,
-0.0391845703125,
-0.04522705078125,
0.0404052734375,
0.0216827392578125,
-0.045379638671875,
-0.01111602783203125,
0.046356201171875,
0.040557861328125,
0.032867431640625,
0.07147216796875,
-0.0283966064453125,
0.0174102783203125,
0.013641357421875,
-0.0026264190673828125,
0.015869140625,
-0.003086090087890625,
-0.033905029296875,
0.0010938644409179688,
-0.00408935546875,
-0.005588531494140625
]
] |
Qiliang/bart-large-cnn-samsum-ChatGPT_v3 | 2022-12-13T17:45:10.000Z | [
"transformers",
"pytorch",
"bart",
"text2text-generation",
"generated_from_trainer",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | text2text-generation | Qiliang | null | null | Qiliang/bart-large-cnn-samsum-ChatGPT_v3 | 26 | 34,079 | transformers | 2022-12-13T17:32:47 | ---
license: mit
tags:
- generated_from_trainer
model-index:
- name: bart-large-cnn-samsum-ChatGPT_v3
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bart-large-cnn-samsum-ChatGPT_v3
This model is a fine-tuned version of [philschmid/bart-large-cnn-samsum](https://huggingface.co/philschmid/bart-large-cnn-samsum) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
- mixed_precision_training: Native AMP
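For reference, the hyperparameters above map onto `transformers` `Seq2SeqTrainingArguments` field names roughly as follows (a sketch kept as a plain dict so the mapping is visible without loading the library; the `fp16` flag for "Native AMP" is an assumption, and the listed Adam betas/epsilon are the Trainer defaults):

```python
# Field names follow transformers' Seq2SeqTrainingArguments.
training_args = {
    "learning_rate": 2e-05,
    "per_device_train_batch_size": 4,
    "per_device_eval_batch_size": 4,
    "seed": 42,
    "lr_scheduler_type": "linear",
    "num_train_epochs": 3,
    "fp16": True,  # "Native AMP" mixed-precision training (assumed flag)
}

print(training_args["learning_rate"])  # → 2e-05
```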
### Framework versions
- Transformers 4.24.0
- Pytorch 1.12.1
- Datasets 2.6.1
- Tokenizers 0.13.2
| 1,091 | [
[
-0.04522705078125,
-0.0560302734375,
0.01169586181640625,
0.0107879638671875,
-0.04437255859375,
-0.0253448486328125,
-0.0250091552734375,
-0.0206756591796875,
0.0253753662109375,
0.0372314453125,
-0.046661376953125,
-0.03656005859375,
-0.054229736328125,
-0.005161285400390625,
-0.01529693603515625,
0.10137939453125,
0.006473541259765625,
0.04205322265625,
-0.0116729736328125,
0.00463104248046875,
-0.017059326171875,
-0.038360595703125,
-0.0819091796875,
-0.0369873046875,
0.032318115234375,
0.0279541015625,
0.057373046875,
0.050537109375,
0.06695556640625,
0.0230255126953125,
-0.0249176025390625,
-0.0018243789672851562,
-0.047943115234375,
-0.025726318359375,
-0.00875091552734375,
-0.026397705078125,
-0.061187744140625,
0.00569915771484375,
0.058563232421875,
0.029144287109375,
-0.015899658203125,
0.044830322265625,
-0.0013637542724609375,
0.0258636474609375,
-0.0273895263671875,
0.04058837890625,
-0.044891357421875,
0.02001953125,
-0.00469970703125,
-0.026763916015625,
-0.0259246826171875,
-0.0165863037109375,
0.0142974853515625,
-0.04278564453125,
0.040283203125,
0.00882720947265625,
0.08697509765625,
0.03192138671875,
-0.0212249755859375,
0.0011577606201171875,
-0.05908203125,
0.02972412109375,
-0.05059814453125,
0.01068878173828125,
0.0408935546875,
0.05029296875,
-0.00725555419921875,
-0.0667724609375,
-0.0273590087890625,
-0.01552581787109375,
0.004535675048828125,
0.00699615478515625,
-0.0071868896484375,
0.007419586181640625,
0.05731201171875,
0.026519775390625,
-0.03668212890625,
0.0012750625610351562,
-0.065673828125,
-0.015838623046875,
0.045867919921875,
0.0194244384765625,
-0.01465606689453125,
-0.0096893310546875,
-0.03729248046875,
-0.00395965576171875,
-0.034423828125,
-0.00806427001953125,
0.036895751953125,
0.021820068359375,
-0.0274658203125,
0.0665283203125,
-0.02117919921875,
0.0465087890625,
-0.0048370361328125,
-0.001628875732421875,
0.0419921875,
0.005733489990234375,
-0.0303497314453125,
0.0121612548828125,
0.0653076171875,
0.04443359375,
0.036895751953125,
0.0158538818359375,
-0.012176513671875,
-0.019256591796875,
0.0233001708984375,
-0.0767822265625,
-0.050750732421875,
-0.01033782958984375,
-0.0419921875,
-0.04241943359375,
0.004756927490234375,
-0.042694091796875,
-0.00237274169921875,
-0.031494140625,
0.037750244140625,
-0.034149169921875,
-0.003940582275390625,
0.0020198822021484375,
-0.009765625,
0.015411376953125,
0.0179901123046875,
-0.059539794921875,
0.0265655517578125,
0.02801513671875,
0.03399658203125,
0.0203399658203125,
-0.024505615234375,
-0.0174713134765625,
0.00707244873046875,
-0.02886962890625,
0.031463623046875,
-0.0115966796875,
-0.030242919921875,
-0.026031494140625,
0.0262603759765625,
0.006923675537109375,
-0.0245208740234375,
0.07275390625,
-0.03643798828125,
0.03314208984375,
-0.0240936279296875,
-0.05029296875,
-0.02020263671875,
0.0285186767578125,
-0.0477294921875,
0.07110595703125,
0.007099151611328125,
-0.067626953125,
0.040435791015625,
-0.061737060546875,
-0.019317626953125,
0.0166473388671875,
-0.0006613731384277344,
-0.0655517578125,
-0.00093841552734375,
0.01641845703125,
0.042510986328125,
-0.006038665771484375,
0.031951904296875,
-0.039794921875,
-0.0499267578125,
-0.0014476776123046875,
-0.0572509765625,
0.060760498046875,
0.017669677734375,
-0.0164031982421875,
0.0136871337890625,
-0.08636474609375,
0.0014429092407226562,
0.0357666015625,
-0.03533935546875,
0.007381439208984375,
-0.01349639892578125,
0.01922607421875,
0.01309967041015625,
0.033172607421875,
-0.04730224609375,
0.012969970703125,
-0.0135650634765625,
0.0294342041015625,
0.0528564453125,
0.0103607177734375,
-0.0019588470458984375,
-0.019775390625,
0.0107879638671875,
0.005756378173828125,
0.0323486328125,
0.006824493408203125,
-0.042144775390625,
-0.064208984375,
-0.037017822265625,
0.0284271240234375,
0.021148681640625,
-0.0247955322265625,
0.05633544921875,
-0.0220794677734375,
-0.052459716796875,
-0.0181121826171875,
-0.00347137451171875,
0.0115814208984375,
0.035980224609375,
0.034088134765625,
-0.0188751220703125,
-0.038330078125,
-0.08148193359375,
0.01311492919921875,
-0.0008435249328613281,
0.01190948486328125,
0.02252197265625,
0.05145263671875,
-0.0083770751953125,
0.0517578125,
-0.0445556640625,
-0.007343292236328125,
-0.006214141845703125,
-0.0087738037109375,
0.011199951171875,
0.064208984375,
0.042816162109375,
-0.0274658203125,
-0.0139617919921875,
-0.016876220703125,
-0.050262451171875,
0.0284271240234375,
-0.00882720947265625,
-0.01514434814453125,
-0.0176239013671875,
0.039520263671875,
-0.0533447265625,
0.05279541015625,
0.0166168212890625,
-0.0167388916015625,
0.041412353515625,
-0.041412353515625,
-0.006855010986328125,
-0.0863037109375,
0.0129852294921875,
0.0167388916015625,
-0.0188446044921875,
-0.005405426025390625,
0.0036792755126953125,
0.0023746490478515625,
-0.0253448486328125,
-0.02947998046875,
0.03790283203125,
-0.007572174072265625,
-0.00738525390625,
-0.017730712890625,
-0.0150604248046875,
0.0016527175903320312,
0.055084228515625,
0.0029544830322265625,
0.0357666015625,
0.044891357421875,
-0.039154052734375,
0.052398681640625,
0.038299560546875,
-0.031463623046875,
0.0361328125,
-0.07098388671875,
-0.008392333984375,
-0.0079345703125,
0.0238037109375,
-0.060821533203125,
-0.031402587890625,
0.058502197265625,
-0.035003662109375,
0.028533935546875,
-0.0119476318359375,
-0.041748046875,
-0.047821044921875,
0.01264190673828125,
0.04119873046875,
0.032379150390625,
-0.051177978515625,
0.022125244140625,
-0.005596160888671875,
0.01425933837890625,
-0.0249786376953125,
-0.0552978515625,
-0.010009765625,
-0.0078582763671875,
-0.0256195068359375,
0.004749298095703125,
-0.0046539306640625,
0.0163116455078125,
0.006481170654296875,
0.006084442138671875,
-0.0105743408203125,
-0.012359619140625,
0.0260772705078125,
0.032470703125,
-0.0227813720703125,
0.0024890899658203125,
0.00608062744140625,
-0.0206451416015625,
0.023193359375,
0.01006317138671875,
0.038970947265625,
-0.0135955810546875,
-0.0372314453125,
-0.073486328125,
-0.006732940673828125,
0.045745849609375,
-0.021240234375,
0.0572509765625,
0.07958984375,
-0.027069091796875,
0.0136871337890625,
-0.032135009765625,
-0.0223236083984375,
-0.03277587890625,
0.05169677734375,
-0.0311279296875,
-0.03143310546875,
0.03729248046875,
-0.00356292724609375,
0.00020170211791992188,
0.05828857421875,
0.041290283203125,
0.006618499755859375,
0.0897216796875,
0.0163116455078125,
-0.0012636184692382812,
0.03271484375,
-0.04107666015625,
0.0038776397705078125,
-0.06256103515625,
-0.032745361328125,
-0.0316162109375,
-0.025970458984375,
-0.05914306640625,
-0.00746917724609375,
0.00836944580078125,
-0.0025882720947265625,
-0.06378173828125,
0.0265045166015625,
-0.040435791015625,
0.036956787109375,
0.068359375,
0.03839111328125,
-0.0037708282470703125,
0.011932373046875,
-0.0034732818603515625,
0.00738525390625,
-0.061798095703125,
-0.0250244140625,
0.1016845703125,
0.04345703125,
0.04510498046875,
-0.020355224609375,
0.03656005859375,
-0.007450103759765625,
0.0037708282470703125,
-0.039337158203125,
0.05047607421875,
0.012908935546875,
-0.0775146484375,
-0.0192718505859375,
-0.033233642578125,
-0.06158447265625,
0.009002685546875,
-0.03948974609375,
-0.0396728515625,
0.0012044906616210938,
0.015411376953125,
-0.0182342529296875,
0.033935546875,
-0.052459716796875,
0.08392333984375,
-0.0112152099609375,
-0.00786590576171875,
-0.023651123046875,
-0.049774169921875,
0.025299072265625,
0.01263427734375,
-0.0173187255859375,
-0.015655517578125,
0.01316070556640625,
0.054901123046875,
-0.04266357421875,
0.054901123046875,
-0.018524169921875,
0.0301971435546875,
0.02984619140625,
-0.0146026611328125,
0.04644775390625,
0.0151214599609375,
-0.004573822021484375,
0.01348876953125,
-0.0005822181701660156,
-0.061798095703125,
-0.0156402587890625,
0.0533447265625,
-0.08880615234375,
-0.01727294921875,
-0.0323486328125,
-0.035919189453125,
-0.00839996337890625,
0.01715087890625,
0.048248291015625,
0.053985595703125,
-0.02215576171875,
0.029296875,
0.0268096923828125,
0.00047087669372558594,
0.027252197265625,
0.005191802978515625,
-0.004901885986328125,
-0.045196533203125,
0.07757568359375,
-0.0181121826171875,
0.0140380859375,
-0.01044464111328125,
0.01018524169921875,
-0.0063018798828125,
-0.03424072265625,
-0.02606201171875,
0.02484130859375,
-0.049774169921875,
-0.01152801513671875,
-0.01580810546875,
-0.0458984375,
-0.0127105712890625,
0.005916595458984375,
-0.038818359375,
-0.024078369140625,
-0.048980712890625,
-0.0120391845703125,
0.0205841064453125,
0.035003662109375,
0.007335662841796875,
0.062103271484375,
-0.045989990234375,
0.005176544189453125,
0.0161590576171875,
0.044403076171875,
-0.0021724700927734375,
-0.0557861328125,
-0.0316162109375,
0.01168060302734375,
-0.03204345703125,
-0.0263214111328125,
0.0227813720703125,
0.0026988983154296875,
0.04522705078125,
0.04901123046875,
-0.00843048095703125,
0.054473876953125,
-0.0323486328125,
0.0635986328125,
0.0257720947265625,
-0.0345458984375,
0.024932861328125,
-0.023223876953125,
0.0243377685546875,
0.035980224609375,
0.031982421875,
0.00681304931640625,
-0.016998291015625,
-0.0968017578125,
-0.043487548828125,
0.0628662109375,
0.02960205078125,
0.0229949951171875,
-0.00006341934204101562,
0.0242156982421875,
-0.0098876953125,
0.0293731689453125,
-0.06292724609375,
-0.029998779296875,
-0.024078369140625,
-0.006069183349609375,
-0.0235137939453125,
-0.0321044921875,
-0.021026611328125,
-0.054840087890625,
0.08587646484375,
0.01061248779296875,
0.0285186767578125,
0.0035266876220703125,
0.020233154296875,
-0.018798828125,
-0.02386474609375,
0.056854248046875,
0.0494384765625,
-0.041412353515625,
-0.0079498291015625,
0.01342010498046875,
-0.041778564453125,
-0.0232391357421875,
0.021209716796875,
-0.0045623779296875,
0.01459503173828125,
0.0227813720703125,
0.09686279296875,
0.01392364501953125,
-0.0186309814453125,
0.04254150390625,
-0.01611328125,
-0.036041259765625,
-0.02996826171875,
0.01331329345703125,
-0.0156402587890625,
0.0228424072265625,
-0.0003247261047363281,
0.0516357421875,
-0.0032215118408203125,
-0.0010824203491210938,
0.00942230224609375,
0.0281219482421875,
-0.033599853515625,
-0.0122833251953125,
0.06011962890625,
0.01012420654296875,
-0.0233917236328125,
0.058624267578125,
-0.014923095703125,
-0.0171661376953125,
0.0528564453125,
0.0413818359375,
0.051300048828125,
-0.01009368896484375,
0.00862884521484375,
0.056854248046875,
0.017486572265625,
-0.0114898681640625,
0.019744873046875,
0.002674102783203125,
-0.041778564453125,
-0.02001953125,
-0.032257080078125,
-0.01788330078125,
0.0280609130859375,
-0.07049560546875,
0.050323486328125,
-0.049713134765625,
-0.031280517578125,
0.0174102783203125,
0.01406097412109375,
-0.0662841796875,
0.040008544921875,
0.01280975341796875,
0.0716552734375,
-0.05584716796875,
0.0611572265625,
0.041748046875,
-0.0216064453125,
-0.06964111328125,
-0.016204833984375,
-0.019500732421875,
-0.07586669921875,
0.035186767578125,
-0.006023406982421875,
0.037445068359375,
0.0146942138671875,
-0.061065673828125,
-0.0611572265625,
0.06951904296875,
0.021820068359375,
-0.038299560546875,
0.005062103271484375,
-0.0008149147033691406,
0.03948974609375,
-0.017913818359375,
0.04278564453125,
0.00930023193359375,
0.01142120361328125,
0.039031982421875,
-0.061767578125,
-0.01477813720703125,
-0.0171661376953125,
0.006565093994140625,
0.01210784912109375,
-0.05499267578125,
0.06256103515625,
-0.010589599609375,
0.03851318359375,
0.022796630859375,
0.047149658203125,
0.0164947509765625,
0.0149688720703125,
0.0238494873046875,
0.053802490234375,
0.04638671875,
-0.00019860267639160156,
0.059478759765625,
-0.0426025390625,
0.05712890625,
0.0997314453125,
0.01434326171875,
0.04339599609375,
0.026153564453125,
-0.00838470458984375,
0.019073486328125,
0.07196044921875,
-0.0479736328125,
0.035919189453125,
0.0152740478515625,
-0.001312255859375,
-0.02862548828125,
0.0283355712890625,
-0.0555419921875,
0.04083251953125,
0.017669677734375,
-0.06878662109375,
-0.0254974365234375,
-0.011932373046875,
-0.0037021636962890625,
-0.0304412841796875,
-0.032196044921875,
0.036834716796875,
-0.0200042724609375,
-0.035003662109375,
0.043060302734375,
0.0056304931640625,
0.0101776123046875,
-0.03692626953125,
0.0011129379272460938,
0.0078125,
0.0172119140625,
-0.005840301513671875,
-0.042694091796875,
0.00977325439453125,
-0.01499176025390625,
0.0002484321594238281,
0.0159912109375,
0.034088134765625,
-0.0225067138671875,
-0.06854248046875,
-0.01403045654296875,
0.0208740234375,
0.032470703125,
0.003665924072265625,
-0.07720947265625,
-0.0022449493408203125,
0.0030460357666015625,
-0.0343017578125,
0.010650634765625,
0.036102294921875,
0.01047515869140625,
0.048858642578125,
0.052581787109375,
0.00940704345703125,
0.001644134521484375,
0.003997802734375,
0.06939697265625,
-0.03509521484375,
-0.040008544921875,
-0.06024169921875,
0.027801513671875,
-0.026611328125,
-0.06317138671875,
0.041900634765625,
0.07275390625,
0.06787109375,
-0.0176239013671875,
0.034149169921875,
0.01491546630859375,
0.026702880859375,
-0.02032470703125,
0.039764404296875,
-0.03265380859375,
0.005825042724609375,
-0.032073974609375,
-0.08319091796875,
-0.016510009765625,
0.0594482421875,
-0.028839111328125,
0.00948333740234375,
0.035400390625,
0.050079345703125,
-0.019073486328125,
0.0338134765625,
0.01363372802734375,
0.0063018798828125,
0.01364898681640625,
0.0174560546875,
0.033355712890625,
-0.054595947265625,
0.0445556640625,
-0.040863037109375,
-0.004825592041015625,
-0.0207061767578125,
-0.049560546875,
-0.087890625,
-0.0307464599609375,
-0.04351806640625,
-0.036590576171875,
-0.00800323486328125,
0.08050537109375,
0.0675048828125,
-0.04632568359375,
-0.01006317138671875,
-0.01126861572265625,
-0.017822265625,
-0.00662994384765625,
-0.01313018798828125,
0.03656005859375,
-0.01509857177734375,
-0.05169677734375,
-0.0211181640625,
-0.022186279296875,
0.032867431640625,
-0.01296234130859375,
-0.00754547119140625,
0.00041365623474121094,
-0.01617431640625,
0.0261993408203125,
0.006870269775390625,
-0.04388427734375,
-0.034332275390625,
-0.021392822265625,
-0.015472412109375,
0.0158233642578125,
0.0223236083984375,
-0.035552978515625,
0.030670166015625,
0.006587982177734375,
0.018463134765625,
0.06427001953125,
0.005741119384765625,
0.02294921875,
-0.047515869140625,
0.036041259765625,
0.01519012451171875,
0.03533935546875,
0.0115966796875,
-0.03131103515625,
0.030059814453125,
0.0308685302734375,
-0.0611572265625,
-0.054534912109375,
-0.01617431640625,
-0.08880615234375,
0.001911163330078125,
0.08575439453125,
0.007389068603515625,
-0.028656005859375,
0.037933349609375,
-0.0419921875,
0.03271484375,
-0.0236358642578125,
0.037200927734375,
0.04486083984375,
-0.0005011558532714844,
-0.00418853759765625,
-0.048095703125,
0.0310516357421875,
0.00666046142578125,
-0.034393310546875,
-0.02386474609375,
0.031982421875,
0.035797119140625,
-0.0072021484375,
0.0202178955078125,
-0.0007138252258300781,
0.019805908203125,
0.01555633544921875,
0.025543212890625,
-0.0288238525390625,
-0.0236968994140625,
-0.00684356689453125,
0.005725860595703125,
0.001117706298828125,
-0.055908203125
]
] |
JujoHotaru/lora | 2023-11-06T15:57:06.000Z | [
"diffusers",
"art",
"stable-diffusion",
"lora",
"text-to-image",
"ja",
"license:mit",
"license:openrail",
"region:us"
] | text-to-image | JujoHotaru | null | null | JujoHotaru/lora | 169 | 33,825 | diffusers | 2023-07-10T11:56:44 | ---
license: [mit, openrail]
language:
- ja
tags:
- art
- stable-diffusion
- lora
- text-to-image
- diffusers
---
# 
- 十条蛍(Hotaru Jujo)の作成したLoRAを配布しています。
- You can download Hotaru Jujo's LoRA collection from this repo.
- [作者プロフィール / Author's profile](profile.md)
- すべてのLoRAは[MITライセンス](LICENSE)またはCreativeML Open RAIL-Mのデュアルライセンスでリリースされます。どちらかのライセンスを選択して使用できます。
- All LoRA's are dual-licensed under [MIT LICENSE](LICENSE) or CreativeML Open RAIL-M.
- LoRAの使用にあたって事前承諾や事後報告などは一切必要ありませんが、TwitterなどSNSで紹介していただけると嬉しいです。
- No prior consent or after reporting is required for the use of LoRA, but I would appreciate it if you could introduce it on Twitter or other SNS.
- 配布中のLoRAは、特記していない限りCFG Scale 7、Clip skip 1を標準設定として開発・動作検証しています。
- Unless otherwise noted, all LoRA's are developed and tested on "CFG Scale 7" and "Clip skip 1" settings.
## 目次 (Index)
[実験LoRA置き場 (Experimental LoRA files)](./experimental/README.md)
[アイコレクション](#アイコレクション-eye-collection) /
[デフォル眼](#デフォル眼-comic-expressions) / [ジト目](#ジト目-comic-expression--scornful-eyes) / [白目](#白目-comic-expression--white-eyes) / [黒目](#黒目-comic-expression--black-eyes) / [(☆\_☆)/(♡\_♡)の目](#☆_☆/♡_♡の目-star-and-heart-shaped-eyes) / [オッドアイ固定化補助](#オッドアイ固定化補助-heterochromia-helper) / [あいうえお発音の口](#あいうえお発音の口-mouths-pronouncing-aiueo) / [官能的な表情](#官能的な表情-sensual-face) / [にやにやした表情の目と口](#にやにやした表情の目と口-smirking-eyes--slyly-mouth) / [デフォルメされた猫の目と口](#デフォルメされた猫の目と口-anime-cat-eyesmouth) / [猫の目&猫の口](#猫の目&猫の口-cat-eyes--cat-mouth) / [極細の眼](#極細の眼-semi-closed-eyes) / [困り顔の眼](#困り顔の眼-worried-eyes) / [ドヤ顔](#ドヤ顔-doyagao--smug-showing-off-face) / [驚いた目&眠そうな目](#驚いた目&眠そうな目-surprised-eyes--sleepy-eyes) / [目隠れ](#目隠れ-hair-over-eyes) / [円形の口](#円形の口-circular-mouth) / [ぐにゃぐにゃ口](#ぐにゃぐにゃ口-wavy-mouth-set) / [閉じた口](#閉じた口-closed-mouth-set) / [口の大きさ変更](#口の大きさ変更-mouth-size-control) / [前面ライトアップ](#前面ライトアップ-front-lighting) / [暗闇化/光る眼](#暗闇化/光る眼-darkness--glowing-eyes) / [2.5D変換](#25d変換-convert-2d-to-25d) / [ペーパーキャラクター](#ペーパーキャラクター-paper-character-effect) / [集中線](#集中線-comic-effect--concentrated-lines) / [ぼかし&背景ぼかし](#ぼかし&背景ぼかし-blur--background-blur) / [キャラクター発光](#キャラクター発光-character-luminescence) / [トーンカーブ調整](#トーンカーブ調整-tone-curve-control) / [彩度調整](#彩度調整-saturation-control) / [ウィンク補助](#ウィンク補助-wink-helper) / [激おこ顔](#激おこ顔-extremely-angry-face) / [にっこり笑顔補助](#にっこり笑顔補助-smiling-face-helper) / [思案顔補助](#思案顔補助-thinking-face-helper) / [茹でダコ顔](#茹でダコ顔-strongly-embarrassed-face) / [青醒め顔](#青醒め顔-paled-face)
[Eye collection](#アイコレクション-eye-collection) / [Comic expressions](#デフォル眼-comic-expressions) / [Comic expression : scornful eyes](#ジト目-comic-expression--scornful-eyes) / [Comic expression : white eyes](#白目-comic-expression--white-eyes) / [Comic expression : black eyes](#黒目-comic-expression--black-eyes) / [Star and heart shaped eyes](#☆_☆/♡_♡の目-star-and-heart-shaped-eyes) / [Heterochromia helper](#オッドアイ固定化補助-heterochromia-helper) / [Mouths pronouncing A,I,U,E,O](#あいうえお発音の口-mouths-pronouncing-aiueo) / [Sensual face](#官能的な表情-sensual-face) / [Smirking eyes and mouth](#にやにやした表情の目と口-smirking-eyes--slyly-mouth) / [Anime cat eyes/mouth](#デフォルメされた猫の目と口-anime-cat-eyesmouth) / [Cat eyes / Cat mouth](#猫の目&猫の口-cat-eyes--cat-mouth) / [Semi-closed eyes](#極細の眼-semi-closed-eyes) / [Worried eyes](#困り顔の眼-worried-eyes) / [Doyagao : smug, showing-off face](#ドヤ顔-doyagao--smug-showing-off-face) / [Surprised eyes / Sleepy eyes](#驚いた目&眠そうな目-surprised-eyes--sleepy-eyes) / [Hair over eyes](#目隠れ-hair-over-eyes) / [Circular mouth](#円形の口-circular-mouth) / [Wavy mouth set](#ぐにゃぐにゃ口-wavy-mouth-set) / [Closed mouth set](#閉じた口-closed-mouth-set) / [Mouth size control](#口の大きさ変更-mouth-size-control) / [Front lighting](#前面ライトアップ-front-lighting) / [Darkness / Glowing eyes](#暗闇化/光る眼-darkness--glowing-eyes) / [Convert 2D to 2.5D](#25d変換-convert-2d-to-25d) / [Paper character effect](#ペーパーキャラクター-paper-character-effect) / [Comic effect : concentrated lines](#集中線-comic-effect--concentrated-lines) / [Blur / Background blur](#ぼかし&背景ぼかし-blur--background-blur) / [Character luminescence](#キャラクター発光-character-luminescence) / [Tone curve control](#トーンカーブ調整-tone-curve-control) / [Saturation control](#彩度調整-saturation-control) / [Wink helper](#ウィンク補助-wink-helper) / [Extremely angry face](#激おこ顔-extremely-angry-face) / [Smiling face helper](#にっこり笑顔補助-smiling-face-helper) / [Thinking face helper](#思案顔補助-thinking-face-helper) / [Strongly embarrassed face](#茹でダコ顔-strongly-embarrassed-face) / [Paled face](#青醒め顔-paled-face)
-----------------------------------------------
## アイコレクション (Eye collection)
[詳しく見る/ダウンロード](./eyecolle/README.md)
[](./eyecolle/README.md)
「アイコレクション」シリーズは、使用するデータモデルに依存することなく、いろいろな眼の形を再現できることを目的としたLoRA群です。
"Eye collection" is a series of LoRAs designed to reproduce various eye shapes without depending on data models.
## デフォル眼 (Comic expressions)
[詳しく見る/ダウンロード (Details/Download)](./comiceye/README.md)
[](./comiceye/README.md)
漫画・アニメ的なデフォルメ表現の眼を各種再現できます。
Deformation expressions which are familiar in manga and anime-style can be reproduced.
## ジト目 (Comic expression : scornful eyes)
[詳しく見る/ダウンロード (Details/Download)](./jitome/README.md)
[](./jitome/README.md) [](./jitome/README.md)
漫画・アニメ的なデフォルメ表現でおなじみ、ジト目を再現できます。
Many types of LoRA are available to reproduce scornful eyes, a familiar cartoon/anime deformation expression.
## 白目 (Comic expression : white eyes)
[詳しく見る/ダウンロード (Details/Download)](./whiteeyes/README.md)
[](./whiteeyes/README.md)
漫画・アニメ的なデフォルメ表現でおなじみ、白目を再現できるLoRAを各種用意しました。
Many types of LoRA are available to reproduce white eyes, a familiar cartoon/anime deformation expression.
## 黒目 (Comic expression : black eyes)
[詳しく見る/ダウンロード (Details/Download)](./blackeyes/README.md)
[](./blackeyes/README.md)
漫画・アニメ的なデフォルメ表現でおなじみ、黒目を再現できるLoRAを6種類用意しました。
6 types of LoRA are available to reproduce black eyes (●_●), a familiar cartoon/anime deformation expression.
## (☆\_☆)/(♡\_♡)の目 (Star and heart shaped eyes)
[詳しく見る/ダウンロード (Details/Download)](./starhearteyes/README.md)
[](./starhearteyes/README.md)
漫画・アニメ的なデフォルメ表現でおなじみ、(☆\_☆)と(♡\_♡)の目を再現できます。
Star shaped and heart shaped eyes, familiar in manga and anime-style deformation expressions, can be reproduced.
## オッドアイ固定化補助 (Heterochromia helper)
[詳しく見る/ダウンロード (Details/Download)](./hetechro/README.md)
[](./hetechro/README.md)
オッドアイの色および左右の組み合わせを固定することができます。
青・緑・黄・赤の4色、それぞれ左右の組み合わせで全12通りが用意されています。
少し使い方に癖があるので、「使い方」を参照してください。
The color and left-right combination of the heterochromia eyes can be fixed.
Total of 12 combinations of four colors (blue, green, yellow, and red), each with left and right sides, are available.
There are a few quirks to using this LoRA. Please refer to the "Usage" section.
## あいうえお発音の口 (Mouths pronouncing A,I,U,E,O)
[詳しく見る/ダウンロード (Details/Download)](./talkmouth/README.md)
[](./talkmouth/README.md)
「あ、い、う、え、お」の発声をしている形の口を再現できます。
形に応じて他のさまざまな用途にも応用できます。
Reproduces mouths pronouncing the 5 basic Japanese vowels: `"A" (Ah; /a/)`, `"I" (Ee; /i/)`, `"U" (Woo; /ɯ/)`, `"E" (Eh; /e/)`, `"O" (Oh; /o/)`.
It can be applied to a variety of other applications depending on its shape.
## 官能的な表情 (Sensual face)
[詳しく見る/ダウンロード (Details/Download)](./sensualface/README.md)
[](./sensualface/README.md)
少しうるうるした半眼、ハの字型に下がり気味の眉毛、若干頬に赤みが差すなど、官能的な表情を再現できます。NSFWなシーンにも使えます。
4種類を用意しました。
Reproduces a sensual (voluptuous) face with half-closed (and slightly wet) eyes and inverted-V-shaped eyebrows. Also suitable for NSFW scenes.
4 types are available.
## にやにやした表情の目と口 (Smirking eyes / Slyly mouth)
[詳しく見る/ダウンロード (Details/Download)](./smirking/README.md)
[](./smirking/README.md)
にやにやした表情の目と口をそれぞれ再現できます。
Reproduces smirking eyes and slyly mouth.
## デフォルメされた猫の目と口 (Anime cat eyes/mouth)
[詳しく見る/ダウンロード (Details/Download)](./animecat/README.md)
[](./animecat/README.md)
アニメ調にデフォルメされた猫の目、およびそれと組み合わせて使われる菱形の口を再現できます。
Reproduces anime cat eyes and rhombus shaped mouth.
## 猫の目&猫の口 (Cat eyes / Cat mouth)
[詳しく見る/ダウンロード (Details/Download)](./cateyemouth/README.md)
[](./cateyemouth/README.md)
瞳孔が縦に細まる猫の目、およびω形の猫の口を再現できます。
Reproduces cat shaped (slit pupils) and cat-like shaped ("ω"-shaped) mouth.
## 極細の眼 (Semi-closed eyes)
[詳しく見る/ダウンロード (Details/Download)](./hosome/README.md)
[](./hosome/README.md)
閉じかけ、極細の眼を再現できます。マイナス適用すると広く開いた眼にもできます。
細目キャラクターのほか、まばたきアニメーションの中間状態の作成にも使用できます。
Reproduces semi-closed (very thin) eyes, or widely open eyes (with a negative LoRA weight).
Besides narrow-eyed characters, it is also useful for creating in-between frames of a blinking animation.
## 困り顔の眼 (Worried eyes)
[詳しく見る/ダウンロード (Details/Download)](./worriedeyes/README.md)
[](./worriedeyes/README.md)
上瞼が谷型に曲がった、困り顔などで使われる目つきを再現できます。笑顔にも困り顔にも対応します。
Reproduces eyes with valley shaped eyelids, expressing worry, upset, confused, or thinking etc.
## ドヤ顔 (Doyagao : smug, showing-off face)
[詳しく見る/ダウンロード (Details/Download)](./doyagao/README.md)
[](./doyagao/README.md)
V字型眉のドヤ顔を再現できます。
通常、V字型眉はV-shaped eyebrowsのプロンプトで再現できますが、たまに極太の眉毛になってしまうことがあります。そういった場合に、プロンプトの代わりにこのLoRAを使ってみてください。
Reproduces V-shaped eyebrows to express a smug, proud face (called "Doyagao" in Japanese anime slang).
Usually, V-shaped eyebrows can be reproduced with the `V-shaped eyebrows` prompt, but it sometimes produces very thick eyebrows. In such cases, use this LoRA instead; it does not reproduce the thick kind.
## 驚いた目&眠そうな目 (Surprised eyes / Sleepy eyes)
[詳しく見る/ダウンロード (Details/Download)](./sleepy_surprised/README.md)
[](./sleepy_surprised/README.md)
驚きに見開いた目、および眠そうな生気の無い半目を再現できます。
Reproduces wide-open surprised eyes or sleepy half-lidded eyes.
## 目隠れ (Hair over eyes)
[詳しく見る/ダウンロード (Details/Download)](./mekakure/README.md)
[](./mekakure/README.md)
前髪で目が隠れているキャラクターを再現できます。両目が隠れているパターンのほか、右側・左側の片目だけを隠した状態を再現するタイプも用意しました。
Reproduces character whose eyes are hidden by bangs. Three types are available : both eyes are hidden, right eye is hidden, or left eye is hidden.
## 円形の口 (Circular mouth)
[詳しく見る/ダウンロード (Details/Download)](./circlemouth/README.md)
[](./circlemouth/README.md)
円形の口は`(:0)`のプロンプトで再現できますが、思ったより大きくなったり小さくなったりしてしまうことがあります。
このLoRAを適用すると、大きいサイズまたは小さいサイズに固定することができます。
With most checkpoints, an "o"-shaped (circular) mouth can be reproduced with the `:0` prompt, but its size may turn out larger or smaller than expected.
With this LoRA, the mouth size can be fixed to either a large or a small size.
## ぐにゃぐにゃ口 (Wavy mouth set)
[詳しく見る/ダウンロード (Details/Download)](./wavymouth/README.md)
[](./wavymouth/README.md)
標準プロンプトで出せる`wavy mouth`の効果を拡張し、輪郭がぐにゃぐにゃした漫画的表現の口を生成することができます。
形状別に6種類用意しました。
Extends the `wavy mouth` prompt to produce a cartoon-style mouth with wobbly contours.
6 types of shapes are available.
## 閉じた口 (Closed mouth set)
[詳しく見る/ダウンロード (Details/Download)](./closedmouth/README.md)
[](./closedmouth/README.md)
閉じた口の特殊な形を表現することができます。
形の異なる2種類を公開しています。
Reproduces special shapes of the closed mouth.
2 different types are available.
## 口の大きさ変更 (Mouth size control)
[詳しく見る/ダウンロード (Details/Download)](./widemouth/README.md)
[](./widemouth/README.md)
口の大きさを広げたり狭めたりすることができます。プラス適用すると大きく、マイナス適用すると小さくなります。
形の異なる2種類を公開しています。
Widens or narrows the mouth: a positive weight makes it larger, a negative weight makes it smaller.
Two types with different shapes are available.
## 前面ライトアップ (Front lighting)
[詳しく見る/ダウンロード (Details/Download)](./lightup/README.md)
[](./lightup/README.md)
AIイラストでよく発生する「キャラクターの顔に影が落ちる」現象を改善するため、前面をライトアップできます。
ライトアップ用のLoRAを使用すると塗りが甘くなりディティールが潰れる場合があるため、それを修正するディティールアップ用のLoRAも用意しています。
To improve the "shadow cast on the character's face" phenomenon that often occurs in AI illustrations, this LoRA lights up character's face.
Since using the "lighting up" LoRA may flatten the painting and lose detail, a "detailing up" LoRA is also provided to use in combination.
## 暗闇化/光る眼 (Darkness / Glowing eyes)
[詳しく見る/ダウンロード (Details/Download)](./dark_gloweye/README.md)
[](./dark_gloweye/README.md)
Stable Diffusionでキャラクターを出力すると、基本的にキャラクターの前側に光が当たった状態となり、暗い状態の再現が難しくなっています。
このLoRAを使用すると、キャラクター前面にほとんど光が当たらない暗闇状態を再現しやすくなります。
また、暗闇にいるキャラクターでよく演出として使用される「光る眼」を再現しやすくしたLoRAも同時に公開しています。
When using Stable Diffusion, the front side of the character is basically lit up, making it difficult to reproduce a dark scene.
With this LoRA, it is easier to reproduce a dark state with almost no light on the front of the character.
In addition, a LoRA is also available that makes it easier to reproduce the "glowing eyes" often used for characters in the dark as a dramatic effect.
## 2.5D変換 (Convert 2D to 2.5D)
[詳しく見る/ダウンロード (Details/Download)](./make25d/README.md)
[](./make25d/README.md)
2Dアニメ系モデルの出力を、リアル/3D寄り(2.5D)な見た目に変換できます。
Converts output of 2D animated models to realistic/3D-like(2.5D) appearance.
## ペーパーキャラクター (Paper character effect)
[詳しく見る/ダウンロード (Details/Download)](./paperchara/README.md)
[](./paperchara/README.md)
アニメのおまけ映像などで見かける、キャラクターを紙に印刷して切り取ったような縁取りを付けた状態を再現できます。
Reproduces characters as printed on paper with a cut-out border, as seen in extra contents of some Japanese animations.
## 集中線 (Comic effect : concentrated lines)
[詳しく見る/ダウンロード (Details/Download)](./concentratedlines/README.md)
[](./concentratedlines/README.md)
背景に漫画的表現の集中線を出します。集中線のような形で色の付いたエフェクトになる場合も多いです。
Reproduces concentrated lines (a common manga effect) in the background. The result is often a colored effect shaped like concentrated lines.
## ぼかし&背景ぼかし (Blur / Background blur)
[詳しく見る/ダウンロード (Details/Download)](./blur/README.md)
[](./blur/README.md)
blurは被写体含め全体を、blurbkは被写体を除いた背景部分だけを、ぼかしたりシャープにしたりすることができるエフェクトLoRAです。
`blur` blurs or sharpens (and details up) the entire image including the subject, while `blurbk` affects only the background. A negative weight produces the sharpening effect.
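As a side note on how positive and negative LoRA weights behave in general (an illustrative sketch with made-up toy matrices, not actual model weights): the applied weight simply scales the learned low-rank delta, so negating the weight inverts the effect, as with blur turning into sharpening here.

```python
def matmul(B, A):
    """Multiply two small matrices given as lists of rows."""
    return [[sum(B[i][k] * A[k][j] for k in range(len(A)))
             for j in range(len(A[0]))] for i in range(len(B))]

def apply_lora(W, A, B, weight):
    """Return W + weight * (B @ A), the usual LoRA merge, element-wise."""
    delta = matmul(B, A)
    return [[W[i][j] + weight * delta[i][j]
             for j in range(len(W[0]))] for i in range(len(W))]

# Tiny made-up example: identity base weight with a rank-1 LoRA delta.
W = [[1.0, 0.0], [0.0, 1.0]]  # base weight
A = [[0.5, 0.0]]              # rank-1 down-projection
B = [[1.0], [0.0]]            # rank-1 up-projection

plus = apply_lora(W, A, B, 1.0)    # effect applied
minus = apply_lora(W, A, B, -1.0)  # effect inverted (cf. negative weight)
zero = apply_lora(W, A, B, 0.0)    # weight 0 leaves W unchanged
```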
## キャラクター発光 (Character luminescence)
[詳しく見る/ダウンロード (Details/Download)](./lumi/README.md)
[](./lumi/README.md)
キャラクターの周囲に発光エフェクトを付与します。
Gives a luminescence effect around the character.
## トーンカーブ調整 (Tone curve control)
[詳しく見る/ダウンロード (Details/Download)](./tone/README.md)
[](./tone/README.md)
出力画像のトーンカーブを調整することができます。
トーンアップ(白っぽくする)とトーンダウン(黒っぽくする)の2種類を用意しました。
Raises/Lowers the tone curve of the output image.
## 彩度調整 (Saturation control)
[詳しく見る/ダウンロード (Details/Download)](./saturation/README.md)
[](./saturation/README.md)
出力画像の彩度をアップすることができます。テイスト別に3種類用意しました。
Increases saturation of output image. Three types are available.
## ウィンク補助 (Wink helper)
[詳しく見る/ダウンロード (Details/Download)](./wink/README.md)
[](./wink/README.md)
ウィンクをほぼ確実に出せるようになります。閉じる目を左右どちらにするか、LoRAを使い分けて指定できます。
Helps to reproduce winks almost every time. Separate LoRAs specify which eye, left or right, is closed.
## 激おこ顔 (Extremely angry face)
[詳しく見る/ダウンロード (Details/Download)](./gekioko/README.md)
[](./gekioko/README.md)
吊り上がった目で激しく怒っている、または不良のような表情を出すことができます。smileと合わせると不敵な笑みの表現にも使えます。
雰囲気の異なる複数バージョンを公開しています。
Reproduces a fiercely angry or delinquent-like expression with sharply upturned eyes. Combined with `smile`, it can also express a fearless grin.
Multiple versions with different moods are available.
## にっこり笑顔補助 (Smiling face helper)
[詳しく見る/ダウンロード (Details/Download)](./nikkori/README.md)
[](./nikkori/README.md)
閉じた目は`closed eyes`のプロンプトで再現できますが、形が悪かったりウィンクや半目になってしまうことがよくあります。
このLoRAを使用すると、目を閉じてにっこりと笑っている目つきを安定して出すことができます。`closed eyes`のプロンプトだけよりも形状が整い、上向きに強めのカーブを描いた目になります。
Closed eyes are usually reproduced with the `closed eyes` prompt, but the shape is not always right: it sometimes yields a wink or half-closed eyes.
This LoRA helps to reproduce smiling faces with better-shaped closed eyes, curving more strongly upward than with the plain `closed eyes` prompt.
## 思案顔補助 (Thinking face helper)
[詳しく見る/ダウンロード (Details/Download)](./thinkingface/README.md)
[](./thinkingface/README.md)
閉じた目は`closed eyes`のプロンプトで再現できますが、形が悪かったりウィンクや半目になってしまうことがよくあります。
このLoRAを使用すると、目を閉じて考え込んでいる状態を安定して出すことができます。`closed eyes`のプロンプトだけよりも形状が整い、下向きに強めのカーブを描いた目になります。
Closed eyes are usually reproduced with the `closed eyes` prompt, but the shape is not always right: it sometimes yields a wink or half-closed eyes.
This LoRA helps to reproduce a thoughtful look with better-shaped closed eyes, curving more strongly downward than with the plain `closed eyes` prompt.
## 茹でダコ顔 (Strongly embarrassed face)
[詳しく見る/ダウンロード (Details/Download)](./yudedako/README.md)
[](./yudedako/README.md)
俗に「茹でダコのような」などと呼ばれる、恥ずかしさで真っ赤になった顔を少しオーバー気味に表現できます。
顔に赤線が入るタイプと、赤線が入らず赤く染まるだけのタイプ2種類(顔全体/頬のみ)の合計3種類を用意しました。
Reproduces, in a slightly exaggerated manner, a face turned bright red with embarrassment.
Three types are available: one with a red line on the face, and two types with no red line but only a red tint (full face/cheeks only).
## 青醒め顔 (Paled face)
[詳しく見る/ダウンロード (Details/Download)](./paleface/README.md)
[](./paleface/README.md)
顔の上半分が青く染まる、恐怖や強い怒りなどをアニメチックに表現した顔を再現することができます。
Reproduces pale face (turn pale), an anime expression of fear or strong anger.
-----------------------------------------------
© 2023 Hotaru Jujo.
 | 17,693 | [
[
-0.039276123046875,
-0.0595703125,
0.0164337158203125,
0.01129913330078125,
-0.04376220703125,
-0.00667572021484375,
-0.0027484893798828125,
-0.063720703125,
0.10089111328125,
0.043914794921875,
-0.060028076171875,
-0.03863525390625,
-0.033660888671875,
0.034912109375,
-0.0004413127899169922,
0.0487060546875,
0.00521087646484375,
-0.00373077392578125,
0.034942626953125,
0.004161834716796875,
-0.02349853515625,
0.007122039794921875,
-0.029266357421875,
-0.01039886474609375,
0.0628662109375,
0.0247802734375,
0.056671142578125,
0.036834716796875,
0.037628173828125,
0.02923583984375,
-0.003917694091796875,
0.005889892578125,
-0.0300750732421875,
-0.021270751953125,
-0.00937652587890625,
-0.06072998046875,
-0.042083740234375,
0.0037860870361328125,
0.03314208984375,
0.0310211181640625,
0.0021877288818359375,
-0.01003265380859375,
0.0016031265258789062,
0.055389404296875,
-0.034149169921875,
0.007297515869140625,
-0.00946044921875,
0.0273895263671875,
-0.0309600830078125,
-0.007640838623046875,
-0.0081634521484375,
-0.040985107421875,
-0.022064208984375,
-0.07806396484375,
-0.02459716796875,
-0.008392333984375,
0.09814453125,
0.0150604248046875,
-0.0144195556640625,
-0.006694793701171875,
-0.023223876953125,
0.050750732421875,
-0.04412841796875,
-0.006893157958984375,
0.0231781005859375,
0.026153564453125,
0.00710296630859375,
-0.061065673828125,
-0.0587158203125,
0.0088653564453125,
-0.016387939453125,
0.046875,
-0.035736083984375,
-0.040313720703125,
0.0240478515625,
0.024383544921875,
-0.0259857177734375,
-0.01027679443359375,
-0.034332275390625,
0.0028018951416015625,
0.054901123046875,
0.0003204345703125,
0.06231689453125,
-0.02655029296875,
-0.0283050537109375,
-0.01097869873046875,
-0.03790283203125,
0.002681732177734375,
0.03753662109375,
0.010772705078125,
-0.03564453125,
0.057952880859375,
-0.00914764404296875,
0.031402587890625,
0.0304412841796875,
-0.023590087890625,
0.036712646484375,
-0.007663726806640625,
-0.0163726806640625,
-0.01297760009765625,
0.055816650390625,
0.043609619140625,
0.004016876220703125,
0.007274627685546875,
0.011383056640625,
0.0264129638671875,
-0.005680084228515625,
-0.055816650390625,
0.004520416259765625,
0.017730712890625,
-0.037628173828125,
-0.047027587890625,
-0.006114959716796875,
-0.0908203125,
-0.01496124267578125,
0.005992889404296875,
0.028167724609375,
-0.045562744140625,
-0.026702880859375,
0.01544952392578125,
-0.007396697998046875,
-0.0024051666259765625,
0.0244598388671875,
-0.053955078125,
-0.01085662841796875,
0.00563812255859375,
0.0253448486328125,
0.0014123916625976562,
-0.01438140869140625,
0.005767822265625,
0.0232696533203125,
-0.0241546630859375,
0.050445556640625,
-0.0240478515625,
-0.01371002197265625,
-0.0032482147216796875,
0.03289794921875,
-0.017578125,
-0.022308349609375,
0.043426513671875,
-0.00896453857421875,
0.01108551025390625,
-0.0169219970703125,
-0.0109100341796875,
-0.0160675048828125,
-0.0021419525146484375,
-0.06072998046875,
0.031829833984375,
0.01421356201171875,
-0.07708740234375,
-0.003414154052734375,
-0.038543701171875,
-0.0191650390625,
0.00811767578125,
-0.0013227462768554688,
-0.0214080810546875,
-0.0086517333984375,
0.0074920654296875,
0.012725830078125,
-0.007251739501953125,
-0.046173095703125,
-0.023590087890625,
-0.0156097412109375,
0.0511474609375,
0.01120758056640625,
0.091064453125,
0.03411865234375,
-0.04669189453125,
-0.0138092041015625,
-0.049285888671875,
0.0130462646484375,
0.062347412109375,
-0.003841400146484375,
-0.016693115234375,
-0.01039886474609375,
0.0029735565185546875,
0.01461029052734375,
0.048370361328125,
-0.03204345703125,
0.0032367706298828125,
-0.0249786376953125,
0.03546142578125,
0.050933837890625,
0.006694793701171875,
-0.0024204254150390625,
-0.033538818359375,
0.0462646484375,
0.0177764892578125,
0.040496826171875,
-0.01099395751953125,
-0.039276123046875,
-0.09014892578125,
-0.015289306640625,
-0.01557159423828125,
0.05010986328125,
-0.07415771484375,
0.046478271484375,
0.00988006591796875,
-0.058868408203125,
-0.024993896484375,
0.01052093505859375,
0.0474853515625,
0.0196380615234375,
0.003173828125,
-0.035614013671875,
-0.0254058837890625,
-0.062469482421875,
-0.0242767333984375,
-0.016204833984375,
0.03582763671875,
0.036651611328125,
0.032928466796875,
-0.048797607421875,
0.0249176025390625,
-0.053619384765625,
-0.041168212890625,
-0.0189056396484375,
-0.0126190185546875,
0.038055419921875,
0.024017333984375,
0.052215576171875,
-0.07958984375,
-0.04901123046875,
0.008636474609375,
-0.0726318359375,
-0.007205963134765625,
0.0295562744140625,
-0.021392822265625,
0.0184783935546875,
0.0178680419921875,
-0.042144775390625,
0.0452880859375,
0.03582763671875,
-0.037841796875,
0.04461669921875,
-0.00574493408203125,
0.02374267578125,
-0.08905029296875,
0.0208587646484375,
0.00218963623046875,
0.0015325546264648438,
-0.049041748046875,
0.04461669921875,
-0.01007843017578125,
-0.00557708740234375,
-0.0489501953125,
0.05560302734375,
-0.043060302734375,
0.0201416015625,
0.003986358642578125,
0.03387451171875,
0.001796722412109375,
0.053863525390625,
-0.002742767333984375,
0.02734375,
0.033966064453125,
-0.02557373046875,
0.048919677734375,
0.03619384765625,
-0.03125,
0.06365966796875,
-0.049713134765625,
-0.0025634765625,
-0.00724029541015625,
0.00037288665771484375,
-0.058441162109375,
-0.04736328125,
0.0545654296875,
-0.032440185546875,
0.01715087890625,
0.018310546875,
-0.03271484375,
-0.060394287109375,
-0.033966064453125,
0.0081634521484375,
0.0482177734375,
-0.03973388671875,
0.0311431884765625,
0.03076171875,
-0.0014371871948242188,
-0.057159423828125,
-0.07318115234375,
0.0020275115966796875,
-0.034759521484375,
-0.048553466796875,
0.0210723876953125,
-0.010894775390625,
-0.0159149169921875,
-0.004024505615234375,
0.0179290771484375,
-0.0158538818359375,
-0.0111541748046875,
0.00823974609375,
0.022186279296875,
-0.0222930908203125,
-0.0305023193359375,
-0.006496429443359375,
0.0117950439453125,
-0.0181732177734375,
0.03607177734375,
0.04510498046875,
-0.010040283203125,
-0.0309906005859375,
-0.065673828125,
0.019073486328125,
0.038787841796875,
0.00042057037353515625,
0.003047943115234375,
0.05853271484375,
-0.0220794677734375,
0.007358551025390625,
-0.0469970703125,
-0.0083465576171875,
-0.038360595703125,
0.016357421875,
-0.0201568603515625,
-0.030792236328125,
0.054046630859375,
0.00934600830078125,
-0.02001953125,
0.0531005859375,
0.02557373046875,
-0.0167083740234375,
0.083251953125,
0.0487060546875,
-0.008514404296875,
0.05322265625,
-0.05023193359375,
0.006298065185546875,
-0.0645751953125,
-0.035888671875,
-0.00597381591796875,
-0.034423828125,
-0.0491943359375,
-0.02410888671875,
0.0289459228515625,
0.0150604248046875,
-0.00563812255859375,
0.043731689453125,
-0.04559326171875,
0.0230865478515625,
0.005954742431640625,
0.0472412109375,
0.0122833251953125,
0.01393890380859375,
0.00847625732421875,
-0.0157318115234375,
-0.0196380615234375,
-0.0184173583984375,
0.04937744140625,
0.0238494873046875,
0.05316162109375,
0.031402587890625,
0.048095703125,
0.001819610595703125,
0.0079498291015625,
-0.0207061767578125,
0.0650634765625,
-0.038787841796875,
-0.0640869140625,
-0.0225067138671875,
-0.017303466796875,
-0.08349609375,
0.0214385986328125,
-0.0249481201171875,
-0.09515380859375,
0.0460205078125,
-0.0122833251953125,
-0.01165771484375,
0.017669677734375,
-0.03564453125,
0.048095703125,
-0.0323486328125,
-0.05157470703125,
-0.0168609619140625,
-0.050811767578125,
0.041839599609375,
0.00780487060546875,
0.0158538818359375,
-0.01483917236328125,
-0.011962890625,
0.0606689453125,
-0.04876708984375,
0.0718994140625,
0.0002605915069580078,
-0.0164642333984375,
0.037261962890625,
-0.006740570068359375,
0.038818359375,
0.00788116455078125,
0.0160369873046875,
0.0020904541015625,
-0.004764556884765625,
-0.0308685302734375,
-0.034698486328125,
0.0628662109375,
-0.06011962890625,
-0.034210205078125,
-0.03619384765625,
-0.0135498046875,
0.010162353515625,
0.04638671875,
0.05853271484375,
0.0216522216796875,
-0.00730133056640625,
-0.013031005859375,
0.0301971435546875,
-0.0211181640625,
0.0268096923828125,
0.0177764892578125,
-0.04510498046875,
-0.04962158203125,
0.07647705078125,
0.0074920654296875,
0.0248870849609375,
0.01103973388671875,
0.0202484130859375,
-0.005458831787109375,
0.00997161865234375,
-0.02557373046875,
0.06982421875,
-0.048919677734375,
0.0006999969482421875,
-0.0255126953125,
-0.006504058837890625,
-0.033782958984375,
-0.020477294921875,
-0.0113372802734375,
-0.030120849609375,
-0.048614501953125,
0.03033447265625,
0.043243408203125,
0.046722412109375,
-0.004825592041015625,
0.003765106201171875,
-0.03082275390625,
0.040496826171875,
0.02947998046875,
0.024261474609375,
-0.0114898681640625,
-0.00567626953125,
-0.011474609375,
0.007427215576171875,
-0.03076171875,
-0.070556640625,
0.055877685546875,
-0.006610870361328125,
0.0216827392578125,
0.0491943359375,
-0.0244598388671875,
0.056976318359375,
-0.01059722900390625,
0.0589599609375,
0.06463623046875,
-0.038360595703125,
0.048309326171875,
-0.059295654296875,
0.005123138427734375,
0.03607177734375,
0.0195465087890625,
-0.038116455078125,
-0.02545166015625,
-0.052520751953125,
-0.042083740234375,
0.057373046875,
0.0228271484375,
0.0207366943359375,
0.0002505779266357422,
0.0094757080078125,
-0.01090240478515625,
0.0191802978515625,
-0.060760498046875,
-0.06451416015625,
-0.03594970703125,
-0.004489898681640625,
-0.01239013671875,
-0.004123687744140625,
-0.0105438232421875,
-0.02545166015625,
0.04339599609375,
-0.00800323486328125,
0.042449951171875,
0.023956298828125,
0.0029144287109375,
-0.0285186767578125,
0.006893157958984375,
0.044403076171875,
0.03741455078125,
-0.0233154296875,
-0.0141448974609375,
-0.00036263465881347656,
-0.06109619140625,
-0.00301361083984375,
-0.001922607421875,
-0.031707763671875,
0.021148681640625,
0.00571441650390625,
0.0574951171875,
0.032958984375,
-0.051422119140625,
0.0482177734375,
0.0049591064453125,
0.01361083984375,
-0.03814697265625,
0.0232696533203125,
0.01025390625,
0.0221405029296875,
0.0175018310546875,
-0.0187530517578125,
0.00986480712890625,
-0.07965087890625,
-0.0091705322265625,
0.01486968994140625,
-0.02978515625,
-0.0157928466796875,
0.0533447265625,
0.0021114349365234375,
-0.01241302490234375,
0.0013179779052734375,
-0.00850677490234375,
-0.0440673828125,
0.0673828125,
0.053680419921875,
0.058197021484375,
-0.040191650390625,
0.03271484375,
0.04052734375,
-0.003269195556640625,
0.005859375,
0.051727294921875,
0.036224365234375,
-0.03265380859375,
0.00797271728515625,
-0.0443115234375,
-0.01003265380859375,
0.024383544921875,
-0.03131103515625,
0.0333251953125,
-0.0543212890625,
-0.02410888671875,
0.0072784423828125,
0.0107879638671875,
-0.022857666015625,
0.0296630859375,
0.0094451904296875,
0.0662841796875,
-0.0511474609375,
0.041412353515625,
0.03973388671875,
-0.06768798828125,
-0.05560302734375,
-0.0094146728515625,
0.01076507568359375,
-0.06640625,
0.053466796875,
-0.005237579345703125,
-0.002620697021484375,
-0.01031494140625,
-0.053009033203125,
-0.07049560546875,
0.07769775390625,
0.01113128662109375,
-0.018096923828125,
-0.008697509765625,
0.005062103271484375,
0.04913330078125,
-0.033203125,
0.017303466796875,
0.0249481201171875,
0.059356689453125,
0.011016845703125,
-0.044677734375,
0.0119171142578125,
-0.05889892578125,
0.0029144287109375,
-0.0173797607421875,
-0.08843994140625,
0.095703125,
-0.025390625,
-0.0278472900390625,
0.0418701171875,
0.048095703125,
0.0249481201171875,
0.0179290771484375,
0.03900146484375,
0.051483154296875,
0.0236968994140625,
-0.0076446533203125,
0.0867919921875,
-0.01470184326171875,
0.0162200927734375,
0.0300445556640625,
-0.002071380615234375,
0.05224609375,
0.01404571533203125,
-0.032745361328125,
0.0567626953125,
0.036224365234375,
-0.0291748046875,
0.041839599609375,
0.0026302337646484375,
-0.004070281982421875,
0.005706787109375,
-0.027679443359375,
-0.035919189453125,
0.020782470703125,
0.01336669921875,
-0.013519287109375,
0.023223876953125,
0.00418853759765625,
0.02178955078125,
0.00612640380859375,
-0.02655029296875,
0.039337158203125,
0.02203369140625,
-0.034637451171875,
0.0655517578125,
-0.02166748046875,
0.0660400390625,
-0.04278564453125,
0.007720947265625,
-0.022064208984375,
0.019287109375,
-0.0306396484375,
-0.070068359375,
-0.004703521728515625,
-0.006038665771484375,
-0.00337982177734375,
0.01251983642578125,
0.042327880859375,
-0.01525115966796875,
-0.06292724609375,
0.03839111328125,
-0.006053924560546875,
0.043304443359375,
0.038726806640625,
-0.0732421875,
0.016204833984375,
0.0199432373046875,
-0.01314544677734375,
0.015960693359375,
0.020111083984375,
0.009796142578125,
0.05010986328125,
0.0211181640625,
0.032501220703125,
0.006816864013671875,
0.00980377197265625,
0.05218505859375,
-0.04241943359375,
-0.034698486328125,
-0.024017333984375,
0.040252685546875,
-0.0185546875,
-0.0295562744140625,
0.06597900390625,
0.0214691162109375,
0.035064697265625,
-0.03521728515625,
0.048553466796875,
-0.022491455078125,
0.0418701171875,
-0.040191650390625,
0.07049560546875,
-0.09161376953125,
-0.0265655517578125,
-0.06561279296875,
-0.06903076171875,
-0.02001953125,
0.048553466796875,
-0.0020503997802734375,
0.014617919921875,
0.0565185546875,
0.04534912109375,
-0.0020313262939453125,
-0.0016145706176757812,
0.0113372802734375,
0.040985107421875,
-0.0104827880859375,
0.06329345703125,
0.0521240234375,
-0.056121826171875,
0.013519287109375,
-0.051971435546875,
-0.0025920867919921875,
-0.06353759765625,
-0.038848876953125,
-0.0699462890625,
-0.05029296875,
-0.03936767578125,
-0.030670166015625,
-0.01293182373046875,
0.02423095703125,
0.024993896484375,
-0.039215087890625,
-0.01514434814453125,
0.0294647216796875,
0.0133514404296875,
-0.0101318359375,
-0.0156707763671875,
0.0292510986328125,
0.01532745361328125,
-0.0745849609375,
0.0091094970703125,
0.0166015625,
0.026123046875,
-0.0188446044921875,
-0.0049285888671875,
-0.00421905517578125,
0.006816864013671875,
0.03033447265625,
0.04669189453125,
-0.0673828125,
-0.009857177734375,
0.01079559326171875,
-0.0181732177734375,
-0.0032806396484375,
0.01593017578125,
-0.0183563232421875,
0.01291656494140625,
0.04315185546875,
-0.01470947265625,
0.03668212890625,
-0.020721435546875,
-0.002819061279296875,
-0.01428985595703125,
0.026947021484375,
0.0005197525024414062,
0.05010986328125,
0.01067352294921875,
-0.0411376953125,
0.055145263671875,
0.036102294921875,
-0.01154327392578125,
-0.0653076171875,
0.025665283203125,
-0.11163330078125,
-0.0335693359375,
0.08056640625,
0.0010242462158203125,
-0.056304931640625,
0.0253448486328125,
-0.0283660888671875,
0.0264892578125,
-0.004222869873046875,
0.035186767578125,
0.0296173095703125,
-0.00012731552124023438,
-0.0273284912109375,
-0.035919189453125,
0.016387939453125,
0.016693115234375,
-0.08013916015625,
-0.0034160614013671875,
0.0295562744140625,
0.02593994140625,
0.0657958984375,
0.031585693359375,
-0.039947509765625,
0.034637451171875,
-0.003986358642578125,
0.02392578125,
0.021270751953125,
0.003231048583984375,
-0.005962371826171875,
-0.0187225341796875,
-0.0280609130859375,
-0.00110626220703125
]
] |
facebook/wav2vec2-large-960h-lv60-self | 2022-05-23T16:13:42.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"speech",
"audio",
"hf-asr-leaderboard",
"en",
"dataset:librispeech_asr",
"arxiv:2010.11430",
"arxiv:2006.11477",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"has_space",
"region:us"
] | automatic-speech-recognition | facebook | null | null | facebook/wav2vec2-large-960h-lv60-self | 92 | 33,765 | transformers | 2022-03-02T23:29:05 | ---
language: en
datasets:
- librispeech_asr
tags:
- speech
- audio
- automatic-speech-recognition
- hf-asr-leaderboard
license: apache-2.0
model-index:
- name: wav2vec2-large-960h-lv60
results:
- task:
name: Automatic Speech Recognition
type: automatic-speech-recognition
dataset:
name: LibriSpeech (clean)
type: librispeech_asr
config: clean
split: test
args:
language: en
metrics:
- name: Test WER
type: wer
value: 1.9
- task:
name: Automatic Speech Recognition
type: automatic-speech-recognition
dataset:
name: LibriSpeech (other)
type: librispeech_asr
config: other
split: test
args:
language: en
metrics:
- name: Test WER
type: wer
value: 3.9
---
# Wav2Vec2-Large-960h-Lv60 + Self-Training
[Facebook's Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/)
The large model, pretrained and fine-tuned on 960 hours of Libri-Light and LibriSpeech 16kHz sampled speech audio. The model was trained with the [Self-Training objective](https://arxiv.org/abs/2010.11430). When using the model, make sure that your speech input is also sampled at 16kHz.
[Paper](https://arxiv.org/abs/2006.11477)
Authors: Alexei Baevski, Henry Zhou, Abdelrahman Mohamed, Michael Auli
**Abstract**
We show for the first time that learning powerful representations from speech audio alone followed by fine-tuning on transcribed speech can outperform the best semi-supervised methods while being conceptually simpler. wav2vec 2.0 masks the speech input in the latent space and solves a contrastive task defined over a quantization of the latent representations which are jointly learned. Experiments using all labeled data of Librispeech achieve 1.8/3.3 WER on the clean/other test sets. When lowering the amount of labeled data to one hour, wav2vec 2.0 outperforms the previous state of the art on the 100 hour subset while using 100 times less labeled data. Using just ten minutes of labeled data and pre-training on 53k hours of unlabeled data still achieves 4.8/8.2 WER. This demonstrates the feasibility of speech recognition with limited amounts of labeled data.
The original model can be found under https://github.com/pytorch/fairseq/tree/master/examples/wav2vec#wav2vec-20.
# Usage
To transcribe audio files the model can be used as a standalone acoustic model as follows:
```python
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC
from datasets import load_dataset
import torch
# load model and processor
processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-large-960h-lv60-self")
model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-large-960h-lv60-self")
# load dummy dataset and read soundfiles
ds = load_dataset("patrickvonplaten/librispeech_asr_dummy", "clean", split="validation")
# preprocess the raw waveform into model input values
input_values = processor(ds[0]["audio"]["array"], return_tensors="pt", padding="longest").input_values
# retrieve logits
logits = model(input_values).logits
# take argmax and decode
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)
```
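The model expects 16kHz input, so audio at other sample rates must be resampled first. In practice you would use `torchaudio.functional.resample` or `librosa.resample`; purely for illustration, a naive linear-interpolation resampler looks like this:

```python
import numpy as np

def resample_linear(audio: np.ndarray, orig_sr: int, target_sr: int = 16_000) -> np.ndarray:
    """Naive linear-interpolation resampler (sketch only; prefer torchaudio/librosa)."""
    if orig_sr == target_sr:
        return audio
    duration = len(audio) / orig_sr
    n_target = int(round(duration * target_sr))
    # sample times of the original and target signals
    old_t = np.linspace(0.0, duration, num=len(audio), endpoint=False)
    new_t = np.linspace(0.0, duration, num=n_target, endpoint=False)
    return np.interp(new_t, old_t, audio)
```

Note this sketch performs no anti-aliasing filtering, which a proper resampler needs when downsampling.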
## Evaluation
This code snippet shows how to evaluate **facebook/wav2vec2-large-960h-lv60-self** on LibriSpeech's "clean" test data (swap the dataset config to `"other"` for the other split).
```python
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
import torch
from jiwer import wer
librispeech_eval = load_dataset("librispeech_asr", "clean", split="test")
model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-large-960h-lv60-self").to("cuda")
processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-large-960h-lv60-self")
def map_to_pred(batch):
inputs = processor(batch["audio"]["array"], return_tensors="pt", padding="longest")
input_values = inputs.input_values.to("cuda")
attention_mask = inputs.attention_mask.to("cuda")
with torch.no_grad():
logits = model(input_values, attention_mask=attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)
batch["transcription"] = transcription
return batch
result = librispeech_eval.map(map_to_pred, remove_columns=["audio"])
print("WER:", wer(result["text"], result["transcription"]))
```
*Result (WER)*:
| "clean" | "other" |
|---|---|
| 1.9 | 3.9 | | 4,471 | [
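The WER numbers above are word error rate: word-level Levenshtein distance (substitutions + insertions + deletions) divided by the number of reference words. For readers without `jiwer` installed, a minimal pure-Python sketch of the metric for a single utterance pair:

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """WER = word-level edit distance / number of reference words."""
    ref, hyp = reference.split(), hypothesis.split()
    # d[i][j] = minimum edits to turn ref[:i] into hyp[:j]
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i  # deletions
    for j in range(len(hyp) + 1):
        d[0][j] = j  # insertions
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = d[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
            d[i][j] = min(sub, d[i - 1][j] + 1, d[i][j - 1] + 1)
    return d[len(ref)][len(hyp)] / max(len(ref), 1)
```

Unlike `jiwer.wer`, this sketch does not pool the edit counts over a whole corpus; it is only meant to show what the metric measures.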
[
-0.0178680419921875,
-0.04779052734375,
0.015625,
0.016510009765625,
-0.01088714599609375,
-0.01446533203125,
-0.03631591796875,
-0.04071044921875,
0.000560760498046875,
0.01233673095703125,
-0.049041748046875,
-0.043548583984375,
-0.041778564453125,
-0.026702880859375,
-0.029876708984375,
0.07012939453125,
0.019439697265625,
0.0051422119140625,
0.0048370361328125,
-0.0107879638671875,
-0.0301055908203125,
-0.0221405029296875,
-0.06402587890625,
-0.031585693359375,
0.0175628662109375,
0.0117340087890625,
0.01177215576171875,
0.01849365234375,
0.02587890625,
0.0263519287109375,
-0.0152740478515625,
0.00148773193359375,
-0.05059814453125,
-0.00820159912109375,
0.0101165771484375,
-0.0247344970703125,
-0.026214599609375,
0.0199127197265625,
0.0438232421875,
0.029571533203125,
-0.01513671875,
0.043212890625,
0.00860595703125,
0.03253173828125,
-0.0247039794921875,
0.0238037109375,
-0.044097900390625,
-0.0108489990234375,
-0.0089569091796875,
-0.007724761962890625,
-0.04217529296875,
-0.005565643310546875,
0.00859832763671875,
-0.039947509765625,
0.01534271240234375,
-0.017120361328125,
0.064208984375,
0.0214080810546875,
-0.018951416015625,
-0.02935791015625,
-0.0694580078125,
0.06268310546875,
-0.0516357421875,
0.055877685546875,
0.037078857421875,
0.016754150390625,
-0.0027256011962890625,
-0.082763671875,
-0.03228759765625,
-0.0023784637451171875,
0.0229339599609375,
0.03759765625,
-0.0236663818359375,
0.0058746337890625,
0.0287628173828125,
0.0226898193359375,
-0.047698974609375,
0.00794219970703125,
-0.06597900390625,
-0.038360595703125,
0.058319091796875,
-0.02606201171875,
0.00007975101470947266,
-0.0037937164306640625,
-0.027923583984375,
-0.038909912109375,
-0.016143798828125,
0.035552978515625,
0.02471923828125,
0.01007080078125,
-0.031402587890625,
0.030059814453125,
0.0023746490478515625,
0.046112060546875,
0.0095062255859375,
-0.0296173095703125,
0.052978515625,
-0.01441192626953125,
-0.01148223876953125,
0.033721923828125,
0.06964111328125,
0.01012420654296875,
0.01081085205078125,
0.0095062255859375,
-0.0133819580078125,
0.01305389404296875,
-0.01334381103515625,
-0.05413818359375,
-0.0400390625,
0.034698486328125,
-0.031402587890625,
0.00595855712890625,
0.00809478759765625,
-0.0182037353515625,
-0.0019474029541015625,
-0.0184478759765625,
0.07818603515625,
-0.037811279296875,
-0.0210418701171875,
0.01325225830078125,
-0.01849365234375,
0.0143585205078125,
-0.00862884521484375,
-0.0643310546875,
0.0126495361328125,
0.033355712890625,
0.061279296875,
0.0095062255859375,
-0.00634765625,
-0.04302978515625,
-0.001888275146484375,
-0.0162353515625,
0.03729248046875,
-0.00110626220703125,
-0.038909912109375,
-0.0216217041015625,
-0.007568359375,
0.005619049072265625,
-0.045867919921875,
0.056060791015625,
-0.02667236328125,
0.0190887451171875,
-0.005828857421875,
-0.050750732421875,
-0.01849365234375,
-0.041595458984375,
-0.042510986328125,
0.09063720703125,
0.0108184814453125,
-0.047637939453125,
0.016998291015625,
-0.029815673828125,
-0.046722412109375,
-0.024322509765625,
-0.00615692138671875,
-0.047332763671875,
0.0067596435546875,
0.019256591796875,
0.036590576171875,
-0.00972747802734375,
0.000568389892578125,
-0.01364898681640625,
-0.04541015625,
0.0301055908203125,
-0.040679931640625,
0.0806884765625,
0.022216796875,
-0.042877197265625,
0.012115478515625,
-0.0672607421875,
0.014312744140625,
0.005336761474609375,
-0.035247802734375,
0.01039886474609375,
-0.014678955078125,
0.0250091552734375,
0.0218048095703125,
0.01357269287109375,
-0.04486083984375,
-0.01568603515625,
-0.05517578125,
0.047119140625,
0.05499267578125,
-0.00916290283203125,
0.0249176025390625,
-0.024932861328125,
-0.000759124755859375,
-0.02020263671875,
0.00246429443359375,
0.006504058837890625,
-0.0307769775390625,
-0.04913330078125,
-0.032806396484375,
0.0251007080078125,
0.035125732421875,
-0.0160980224609375,
0.051849365234375,
-0.0075531005859375,
-0.0654296875,
-0.07708740234375,
0.0037708282470703125,
0.0262298583984375,
0.03912353515625,
0.05413818359375,
-0.0173492431640625,
-0.056610107421875,
-0.059661865234375,
-0.0084075927734375,
-0.0060577392578125,
-0.01464080810546875,
0.0252532958984375,
0.017974853515625,
-0.0265045166015625,
0.048736572265625,
-0.017120361328125,
-0.03912353515625,
-0.0167083740234375,
0.0115509033203125,
0.047271728515625,
0.050506591796875,
0.02008056640625,
-0.04522705078125,
-0.0220794677734375,
-0.028411865234375,
-0.040374755859375,
-0.0093231201171875,
-0.00830841064453125,
-0.0015087127685546875,
0.012115478515625,
0.0338134765625,
-0.035003662109375,
0.027069091796875,
0.041412353515625,
-0.00669097900390625,
0.0289459228515625,
-0.00789642333984375,
0.0001208186149597168,
-0.07171630859375,
0.00048732757568359375,
-0.00475311279296875,
-0.0209197998046875,
-0.03570556640625,
-0.0460205078125,
-0.006656646728515625,
-0.004608154296875,
-0.0400390625,
0.0282440185546875,
-0.036041259765625,
-0.022064208984375,
-0.0144195556640625,
0.01557159423828125,
-0.0117034912109375,
0.040191650390625,
0.00366973876953125,
0.050048828125,
0.0467529296875,
-0.04327392578125,
0.045684814453125,
0.01605224609375,
-0.044677734375,
0.0018596649169921875,
-0.06658935546875,
0.035003662109375,
0.0105438232421875,
0.0260162353515625,
-0.08746337890625,
-0.0053863525390625,
-0.003582000732421875,
-0.06903076171875,
0.0252532958984375,
0.002017974853515625,
-0.0282745361328125,
-0.035400390625,
-0.00717926025390625,
0.033905029296875,
0.07086181640625,
-0.05194091796875,
0.041595458984375,
0.03369140625,
0.0138092041015625,
-0.03350830078125,
-0.07281494140625,
-0.036651611328125,
-0.0033054351806640625,
-0.052520751953125,
0.0277862548828125,
-0.0015401840209960938,
0.003673553466796875,
-0.0115966796875,
-0.037994384765625,
0.0145416259765625,
-0.0104827880859375,
0.0400390625,
0.01519012451171875,
-0.0054931640625,
0.01580810546875,
-0.00812530517578125,
-0.0190277099609375,
0.016448974609375,
-0.040557861328125,
0.054656982421875,
-0.01120758056640625,
-0.0165557861328125,
-0.0726318359375,
-0.00420379638671875,
0.01436614990234375,
-0.0276031494140625,
0.0292510986328125,
0.0863037109375,
-0.0265350341796875,
-0.0177001953125,
-0.04638671875,
-0.0234527587890625,
-0.0411376953125,
0.053436279296875,
-0.0201873779296875,
-0.05120849609375,
0.0242919921875,
0.0038700103759765625,
0.005397796630859375,
0.050567626953125,
0.055908203125,
-0.033966064453125,
0.063232421875,
0.0181427001953125,
-0.003017425537109375,
0.039337158203125,
-0.06646728515625,
0.00445556640625,
-0.058441162109375,
-0.03271484375,
-0.0246429443359375,
-0.031829833984375,
-0.038543701171875,
-0.038238525390625,
0.036865234375,
-0.0013904571533203125,
-0.011749267578125,
0.0270538330078125,
-0.05560302734375,
0.0124664306640625,
0.053192138671875,
0.025909423828125,
-0.01090240478515625,
0.0153961181640625,
0.004116058349609375,
-0.003993988037109375,
-0.040374755859375,
-0.009521484375,
0.092529296875,
0.036651611328125,
0.05877685546875,
-0.01271820068359375,
0.061737060546875,
0.0150146484375,
-0.02239990234375,
-0.0654296875,
0.0338134765625,
-0.00946807861328125,
-0.052001953125,
-0.0228729248046875,
-0.021453857421875,
-0.06256103515625,
0.00919342041015625,
-0.02630615234375,
-0.057861328125,
0.0084075927734375,
0.002925872802734375,
-0.0265045166015625,
0.0114288330078125,
-0.05560302734375,
0.049407958984375,
-0.0184478759765625,
-0.02398681640625,
-0.02496337890625,
-0.051849365234375,
0.005504608154296875,
0.003635406494140625,
0.0172882080078125,
-0.0128021240234375,
0.033416748046875,
0.10308837890625,
-0.00968170166015625,
0.0401611328125,
-0.030517578125,
-0.001312255859375,
0.052093505859375,
-0.016326904296875,
0.02447509765625,
0.0023403167724609375,
-0.012847900390625,
0.02203369140625,
0.01013946533203125,
-0.02264404296875,
-0.0282745361328125,
0.048431396484375,
-0.079345703125,
-0.0226898193359375,
-0.0162353515625,
-0.0341796875,
-0.019561767578125,
0.007740020751953125,
0.061004638671875,
0.06085205078125,
-0.0016651153564453125,
0.0361328125,
0.052490234375,
-0.004428863525390625,
0.0364990234375,
0.00847625732421875,
-0.007232666015625,
-0.035400390625,
0.06988525390625,
0.020599365234375,
0.01849365234375,
0.00785064697265625,
0.0175933837890625,
-0.046051025390625,
-0.033843994140625,
-0.0011186599731445312,
0.0183258056640625,
-0.053070068359375,
-0.0059661865234375,
-0.049652099609375,
-0.0303802490234375,
-0.054229736328125,
0.0101776123046875,
-0.05548095703125,
-0.031646728515625,
-0.03350830078125,
-0.006580352783203125,
0.0240478515625,
0.040496826171875,
-0.035552978515625,
0.028564453125,
-0.043060302734375,
0.03826904296875,
0.0294647216796875,
-0.00312042236328125,
-0.0102691650390625,
-0.08056640625,
-0.028106689453125,
0.017791748046875,
-0.002780914306640625,
-0.065673828125,
0.011260986328125,
0.01800537109375,
0.03662109375,
0.0248260498046875,
-0.004215240478515625,
0.04876708984375,
-0.021575927734375,
0.051055908203125,
0.0212554931640625,
-0.08062744140625,
0.051422119140625,
-0.00815582275390625,
0.0162353515625,
0.03594970703125,
0.01454925537109375,
-0.0258636474609375,
-0.005924224853515625,
-0.05377197265625,
-0.073974609375,
0.06671142578125,
0.0279998779296875,
0.0011539459228515625,
0.0306854248046875,
0.022491455078125,
-0.0087127685546875,
-0.005001068115234375,
-0.0552978515625,
-0.035858154296875,
-0.0300140380859375,
-0.025238037109375,
-0.024993896484375,
-0.019134521484375,
-0.004154205322265625,
-0.041290283203125,
0.0743408203125,
0.0250244140625,
0.042999267578125,
0.034820556640625,
-0.01114654541015625,
0.006511688232421875,
0.0083770751953125,
0.0260162353515625,
0.023468017578125,
-0.023590087890625,
0.01128387451171875,
0.02392578125,
-0.04425048828125,
0.015869140625,
0.017303466796875,
0.01221466064453125,
0.00323486328125,
0.052825927734375,
0.08502197265625,
0.0010251998901367188,
-0.02996826171875,
0.04058837890625,
0.00044918060302734375,
-0.0223388671875,
-0.043792724609375,
0.01666259765625,
0.032745361328125,
0.0298309326171875,
0.0309600830078125,
-0.0025463104248046875,
0.00983428955078125,
-0.0307769775390625,
0.0260467529296875,
0.0187835693359375,
-0.0396728515625,
-0.0207977294921875,
0.069580078125,
0.00502777099609375,
-0.0197296142578125,
0.05029296875,
-0.0023193359375,
-0.0223236083984375,
0.047637939453125,
0.0479736328125,
0.058746337890625,
-0.027587890625,
-0.016754150390625,
0.044952392578125,
0.0176544189453125,
-0.0019178390502929688,
0.033843994140625,
-0.01800537109375,
-0.033935546875,
-0.021392822265625,
-0.047332763671875,
0.005157470703125,
0.0190887451171875,
-0.059326171875,
0.02423095703125,
-0.03155517578125,
-0.0307159423828125,
0.019012451171875,
0.013214111328125,
-0.058380126953125,
0.030548095703125,
0.0202484130859375,
0.0523681640625,
-0.062469482421875,
0.0782470703125,
0.024200439453125,
-0.0240325927734375,
-0.09613037109375,
-0.011199951171875,
-0.00731658935546875,
-0.05853271484375,
0.04583740234375,
0.0283203125,
-0.0316162109375,
0.0177154541015625,
-0.04278564453125,
-0.06439208984375,
0.081787109375,
0.0238189697265625,
-0.059600830078125,
0.01214599609375,
-0.00911712646484375,
0.036285400390625,
-0.00629425048828125,
0.0122833251953125,
0.05474853515625,
0.032470703125,
0.005207061767578125,
-0.07440185546875,
-0.01035308837890625,
-0.00878143310546875,
-0.021240234375,
-0.0201416015625,
-0.05023193359375,
0.07086181640625,
-0.0297088623046875,
-0.022613525390625,
-0.007091522216796875,
0.07781982421875,
0.018829345703125,
0.0232086181640625,
0.046478271484375,
0.039306640625,
0.068115234375,
-0.01540374755859375,
0.05511474609375,
-0.0029201507568359375,
0.042144775390625,
0.08746337890625,
0.00815582275390625,
0.064697265625,
0.02056884765625,
-0.027557373046875,
0.0274200439453125,
0.043182373046875,
-0.013275146484375,
0.054229736328125,
0.015350341796875,
-0.018035888671875,
-0.0164947509765625,
0.0045166015625,
-0.050689697265625,
0.06787109375,
0.0189208984375,
-0.01271820068359375,
0.0222625732421875,
0.0135345458984375,
-0.01003265380859375,
-0.00833892822265625,
-0.01186370849609375,
0.05682373046875,
0.0158233642578125,
-0.019989013671875,
0.06939697265625,
-0.0012760162353515625,
0.059600830078125,
-0.0478515625,
0.0091400146484375,
0.017120361328125,
0.022735595703125,
-0.0304718017578125,
-0.047119140625,
0.005321502685546875,
-0.019989013671875,
-0.013397216796875,
0.004180908203125,
0.04827880859375,
-0.050537109375,
-0.03619384765625,
0.04693603515625,
0.006183624267578125,
0.0190277099609375,
-0.004169464111328125,
-0.049652099609375,
0.0236663818359375,
0.019989013671875,
-0.031494140625,
-0.0035572052001953125,
0.006866455078125,
0.031097412109375,
0.023406982421875,
0.0555419921875,
0.01293182373046875,
0.012908935546875,
0.0019426345825195312,
0.045318603515625,
-0.042572021484375,
-0.043182373046875,
-0.04571533203125,
0.027435302734375,
0.005420684814453125,
-0.01354217529296875,
0.04364013671875,
0.05780029296875,
0.07647705078125,
0.0010690689086914062,
0.052734375,
0.0022487640380859375,
0.049285888671875,
-0.0533447265625,
0.061767578125,
-0.04754638671875,
0.00839996337890625,
-0.01232147216796875,
-0.06292724609375,
0.006900787353515625,
0.07000732421875,
-0.010345458984375,
0.0283966064453125,
0.036376953125,
0.062103271484375,
-0.00667572021484375,
-0.0016117095947265625,
0.0138397216796875,
0.029571533203125,
0.024566650390625,
0.056182861328125,
0.042205810546875,
-0.06121826171875,
0.057159423828125,
-0.04412841796875,
-0.014892578125,
-0.005252838134765625,
-0.017303466796875,
-0.06781005859375,
-0.05987548828125,
-0.0210418701171875,
-0.0484619140625,
-0.004993438720703125,
0.0767822265625,
0.06317138671875,
-0.060516357421875,
-0.0270233154296875,
0.0212554931640625,
-0.01351165771484375,
-0.0284271240234375,
-0.014007568359375,
0.056549072265625,
0.0009050369262695312,
-0.061553955078125,
0.057281494140625,
-0.002849578857421875,
0.0094757080078125,
-0.002376556396484375,
-0.01263427734375,
-0.020599365234375,
-0.003448486328125,
0.0276031494140625,
0.0185546875,
-0.049957275390625,
-0.0181884765625,
-0.0121917724609375,
-0.01192474365234375,
0.0125274658203125,
0.0292205810546875,
-0.05499267578125,
0.046112060546875,
0.0408935546875,
0.0225830078125,
0.07861328125,
-0.017913818359375,
0.005443572998046875,
-0.0499267578125,
0.034515380859375,
0.0189056396484375,
0.0238494873046875,
0.021575927734375,
-0.0171051025390625,
0.0208740234375,
0.0231781005859375,
-0.0484619140625,
-0.057159423828125,
-0.005645751953125,
-0.099853515625,
-0.0137481689453125,
0.09649658203125,
0.00396728515625,
-0.018951416015625,
0.01233673095703125,
-0.028594970703125,
0.0718994140625,
-0.0341796875,
0.035003662109375,
0.0269622802734375,
-0.01470184326171875,
0.00911712646484375,
-0.04327392578125,
0.04193115234375,
0.0350341796875,
-0.0271759033203125,
-0.008087158203125,
0.029266357421875,
0.0411376953125,
0.007083892822265625,
0.06939697265625,
-0.00963592529296875,
0.0310516357421875,
0.0173492431640625,
0.0203704833984375,
-0.021484375,
-0.0224151611328125,
-0.037628173828125,
0.006076812744140625,
-0.0097198486328125,
-0.040374755859375
]
] |
guillaumekln/faster-whisper-tiny | 2023-05-12T18:57:08.000Z | [
"ctranslate2",
"audio",
"automatic-speech-recognition",
"en",
"zh",
"de",
"es",
"ru",
"ko",
"fr",
"ja",
"pt",
"tr",
"pl",
"ca",
"nl",
"ar",
"sv",
"it",
"id",
"hi",
"fi",
"vi",
"he",
"uk",
"el",
"ms",
"cs",
"ro",
"da",
"hu",
"ta",
"no",
"th",
"ur",
"hr",
"bg",
"lt",
"la",
"mi",
"ml",
"cy",
"sk",
"te",
"fa",
"lv",
"bn",
"sr",
"az",
"sl",
"kn",
"et",
"mk",
"br",
"eu",
"is",
"hy",
"ne",
"mn",
"bs",
"kk",
"sq",
"sw",
"gl",
"mr",
"pa",
"si",
"km",
"sn",
"yo",
"so",
"af",
"oc",
"ka",
"be",
"tg",
"sd",
"gu",
"am",
"yi",
"lo",
"uz",
"fo",
"ht",
"ps",
"tk",
"nn",
"mt",
"sa",
"lb",
"my",
"bo",
"tl",
"mg",
"as",
"tt",
"haw",
"ln",
"ha",
"ba",
"jw",
"su",
"license:mit",
"region:us"
] | automatic-speech-recognition | guillaumekln | null | null | guillaumekln/faster-whisper-tiny | 5 | 33,742 | ctranslate2 | 2023-03-23T10:14:28 | ---
language:
- en
- zh
- de
- es
- ru
- ko
- fr
- ja
- pt
- tr
- pl
- ca
- nl
- ar
- sv
- it
- id
- hi
- fi
- vi
- he
- uk
- el
- ms
- cs
- ro
- da
- hu
- ta
- 'no'
- th
- ur
- hr
- bg
- lt
- la
- mi
- ml
- cy
- sk
- te
- fa
- lv
- bn
- sr
- az
- sl
- kn
- et
- mk
- br
- eu
- is
- hy
- ne
- mn
- bs
- kk
- sq
- sw
- gl
- mr
- pa
- si
- km
- sn
- yo
- so
- af
- oc
- ka
- be
- tg
- sd
- gu
- am
- yi
- lo
- uz
- fo
- ht
- ps
- tk
- nn
- mt
- sa
- lb
- my
- bo
- tl
- mg
- as
- tt
- haw
- ln
- ha
- ba
- jw
- su
tags:
- audio
- automatic-speech-recognition
license: mit
library_name: ctranslate2
---
# Whisper tiny model for CTranslate2
This repository contains the conversion of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) to the [CTranslate2](https://github.com/OpenNMT/CTranslate2) model format.
This model can be used in CTranslate2 or projects based on CTranslate2 such as [faster-whisper](https://github.com/guillaumekln/faster-whisper).
## Example
```python
from faster_whisper import WhisperModel
model = WhisperModel("tiny")
segments, info = model.transcribe("audio.mp3")
for segment in segments:
print("[%.2fs -> %.2fs] %s" % (segment.start, segment.end, segment.text))
```
## Conversion details
The original model was converted with the following command:
```
ct2-transformers-converter --model openai/whisper-tiny --output_dir faster-whisper-tiny \
--copy_files tokenizer.json --quantization float16
```
Note that the model weights are saved in FP16. This type can be changed when the model is loaded using the [`compute_type` option in CTranslate2](https://opennmt.net/CTranslate2/quantization.html).
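To make the FP16 note concrete, the round-trip cost of half-precision storage can be illustrated with a standalone NumPy sketch (illustrative only; it does not use the faster-whisper API):

```python
import numpy as np

# Simulate storing float32 weights in float16, as the converted model does.
rng = np.random.default_rng(0)
weights_fp32 = rng.normal(0.0, 0.02, size=10_000).astype(np.float32)

weights_fp16 = weights_fp32.astype(np.float16)  # half the storage
roundtrip = weights_fp16.astype(np.float32)     # what inference computes with

storage_ratio = weights_fp16.nbytes / weights_fp32.nbytes
max_error = float(np.max(np.abs(weights_fp32 - roundtrip)))

print(f"storage ratio: {storage_ratio}")         # 0.5
print(f"max round-trip error: {max_error:.1e}")  # small relative to the weights
```

Choosing a different `compute_type` at load time (for example `"int8"` or `"float32"`) trades accuracy for speed and memory in the same spirit; see the CTranslate2 quantization documentation linked above.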
## More information
**For more information about the original model, see its [model card](https://huggingface.co/openai/whisper-tiny).**
| 1,996 | [
[
0.0040435791015625,
-0.029388427734375,
0.0240936279296875,
0.0259857177734375,
-0.03277587890625,
-0.02691650390625,
-0.0340576171875,
-0.027496337890625,
0.00699615478515625,
0.0467529296875,
-0.036956787109375,
-0.036651611328125,
-0.036407470703125,
-0.0247039794921875,
-0.0241241455078125,
0.06927490234375,
-0.0046234130859375,
0.0197601318359375,
0.026611328125,
-0.0054779052734375,
-0.0305938720703125,
-0.006649017333984375,
-0.058837890625,
-0.0255584716796875,
0.01404571533203125,
0.02972412109375,
0.049224853515625,
0.036468505859375,
0.034820556640625,
0.018280029296875,
-0.027374267578125,
-0.0081329345703125,
-0.01488494873046875,
-0.01983642578125,
0.0172882080078125,
-0.05255126953125,
-0.052825927734375,
0.00738525390625,
0.051513671875,
0.015533447265625,
-0.0193023681640625,
0.043853759765625,
-0.01416015625,
0.0207672119140625,
-0.033599853515625,
0.0191802978515625,
-0.0426025390625,
0.0018587112426757812,
-0.0140380859375,
-0.012359619140625,
-0.03887939453125,
-0.03179931640625,
0.0296173095703125,
-0.061737060546875,
0.016387939453125,
0.00547027587890625,
0.061981201171875,
0.02459716796875,
-0.037872314453125,
-0.024871826171875,
-0.06982421875,
0.0677490234375,
-0.056640625,
0.01113128662109375,
0.0175323486328125,
0.04083251953125,
0.0181121826171875,
-0.08447265625,
-0.021636962890625,
-0.01029205322265625,
0.004772186279296875,
0.015655517578125,
-0.0311279296875,
0.02117919921875,
0.01235198974609375,
0.036102294921875,
-0.04833984375,
-0.003551483154296875,
-0.051971435546875,
-0.032440185546875,
0.0362548828125,
0.013519287109375,
0.0185699462890625,
-0.026580810546875,
-0.0188140869140625,
-0.042236328125,
-0.038299560546875,
-0.00212860107421875,
0.032440185546875,
0.0249481201171875,
-0.04901123046875,
0.0465087890625,
0.00809478759765625,
0.026458740234375,
0.00839996337890625,
-0.021728515625,
0.0214691162109375,
-0.0259857177734375,
-0.01250457763671875,
0.033477783203125,
0.05419921875,
0.0312347412109375,
0.0031452178955078125,
0.0240478515625,
-0.0221099853515625,
-0.00337982177734375,
0.005157470703125,
-0.088623046875,
-0.04315185546875,
0.016754150390625,
-0.071044921875,
-0.0236663818359375,
-0.0030670166015625,
-0.0240478515625,
0.004810333251953125,
-0.00710296630859375,
0.05023193359375,
-0.035797119140625,
-0.03411865234375,
0.0222015380859375,
-0.035400390625,
0.003635406494140625,
0.035491943359375,
-0.062103271484375,
0.03631591796875,
0.0374755859375,
0.0849609375,
0.0129547119140625,
-0.0023040771484375,
-0.0219573974609375,
0.0160675048828125,
-0.00688934326171875,
0.046661376953125,
0.00353240966796875,
-0.040191650390625,
-0.0108184814453125,
-0.010101318359375,
-0.00995635986328125,
-0.04742431640625,
0.046783447265625,
-0.01415252685546875,
0.024810791015625,
0.0229339599609375,
-0.015716552734375,
-0.01227569580078125,
-0.00116729736328125,
-0.05279541015625,
0.0706787109375,
0.0268096923828125,
-0.05523681640625,
-0.010528564453125,
-0.06488037109375,
-0.0107421875,
-0.012115478515625,
0.0309295654296875,
-0.03546142578125,
0.0259857177734375,
-0.00518798828125,
-0.00074005126953125,
-0.032867431640625,
0.0133209228515625,
-0.016998291015625,
-0.030303955078125,
0.02587890625,
-0.0341796875,
0.06610107421875,
0.0261383056640625,
0.010589599609375,
0.0261077880859375,
-0.045562744140625,
0.006473541259765625,
0.0035266876220703125,
-0.0258941650390625,
-0.0231170654296875,
-0.0169525146484375,
0.041412353515625,
0.0012044906616210938,
0.02313232421875,
-0.041351318359375,
0.0240478515625,
-0.042816162109375,
0.06414794921875,
0.03216552734375,
0.0008816719055175781,
0.040252685546875,
-0.0340576171875,
0.006916046142578125,
0.01451873779296875,
0.033050537109375,
0.0087890625,
-0.040252685546875,
-0.057891845703125,
-0.0108489990234375,
0.042205810546875,
0.0262298583984375,
-0.0440673828125,
0.0146331787109375,
-0.02606201171875,
-0.06695556640625,
-0.073486328125,
-0.027679443359375,
0.0167694091796875,
0.01611328125,
0.03765869140625,
-0.0033397674560546875,
-0.05303955078125,
-0.062042236328125,
-0.00531005859375,
-0.033538818359375,
-0.021514892578125,
0.016998291015625,
0.046905517578125,
-0.01218414306640625,
0.051666259765625,
-0.050933837890625,
-0.04248046875,
-0.0169677734375,
0.0299224853515625,
0.01483154296875,
0.06396484375,
0.047943115234375,
-0.05596923828125,
-0.01898193359375,
-0.01313018798828125,
-0.01531982421875,
0.0038299560546875,
-0.007686614990234375,
0.0016632080078125,
-0.005435943603515625,
0.00312042236328125,
-0.0546875,
0.03204345703125,
0.051513671875,
-0.026336669921875,
0.04010009765625,
-0.00266265869140625,
-0.006160736083984375,
-0.09002685546875,
0.012847900390625,
-0.001873016357421875,
-0.01529693603515625,
-0.042816162109375,
-0.001125335693359375,
0.0184783935546875,
0.004116058349609375,
-0.059051513671875,
0.05316162109375,
-0.0090179443359375,
-0.0061187744140625,
-0.0122222900390625,
-0.01096343994140625,
-0.00653839111328125,
0.019378662109375,
0.02581787109375,
0.05816650390625,
0.0225067138671875,
-0.0277862548828125,
0.01126861572265625,
0.04510498046875,
-0.0162353515625,
0.008880615234375,
-0.07611083984375,
0.011566162109375,
0.0241546630859375,
0.0264129638671875,
-0.04315185546875,
-0.00656890869140625,
0.0211029052734375,
-0.052337646484375,
0.01451873779296875,
-0.05157470703125,
-0.047210693359375,
-0.0218658447265625,
-0.04180908203125,
0.04095458984375,
0.04986572265625,
-0.032196044921875,
0.04595947265625,
0.016632080078125,
0.004192352294921875,
0.01010894775390625,
-0.08148193359375,
-0.0114288330078125,
-0.014892578125,
-0.05615234375,
0.04925537109375,
-0.0211029052734375,
-0.01146697998046875,
-0.00391387939453125,
-0.004932403564453125,
-0.0251312255859375,
-0.011383056640625,
0.029937744140625,
0.0261383056640625,
-0.034942626953125,
-0.0124359130859375,
0.03326416015625,
-0.0304412841796875,
0.0033359527587890625,
-0.040679931640625,
0.047943115234375,
-0.017608642578125,
0.003170013427734375,
-0.05615234375,
0.0032215118408203125,
0.037384033203125,
-0.0137176513671875,
0.032745361328125,
0.060699462890625,
-0.02960205078125,
-0.0130157470703125,
-0.026153564453125,
-0.0258331298828125,
-0.03582763671875,
0.01078033447265625,
-0.026092529296875,
-0.0615234375,
0.033203125,
0.00403594970703125,
-0.00853729248046875,
0.05963134765625,
0.041656494140625,
-0.0015726089477539062,
0.08349609375,
0.038177490234375,
0.0239410400390625,
0.03607177734375,
-0.05523681640625,
-0.014068603515625,
-0.08673095703125,
-0.0172119140625,
-0.0517578125,
-0.0174102783203125,
-0.0269927978515625,
-0.0257568359375,
0.041259765625,
0.0172882080078125,
-0.0298919677734375,
0.0457763671875,
-0.053924560546875,
0.0014104843139648438,
0.038543701171875,
0.01406097412109375,
0.024322509765625,
-0.004329681396484375,
0.0140380859375,
-0.016021728515625,
-0.036376953125,
-0.032562255859375,
0.079833984375,
0.042327880859375,
0.0565185546875,
0.0259857177734375,
0.049530029296875,
0.01410675048828125,
0.00930023193359375,
-0.06927490234375,
0.0204620361328125,
-0.02008056640625,
-0.041259765625,
-0.0037784576416015625,
-0.0235748291015625,
-0.047943115234375,
0.0048675537109375,
-0.0010061264038085938,
-0.050750732421875,
0.00698089599609375,
0.01148223876953125,
-0.0243072509765625,
0.0278778076171875,
-0.043975830078125,
0.064697265625,
0.0113372802734375,
0.0245819091796875,
-0.0170440673828125,
-0.031951904296875,
0.041748046875,
0.00682830810546875,
-0.0213623046875,
0.003917694091796875,
-0.0106658935546875,
0.08172607421875,
-0.0570068359375,
0.0601806640625,
-0.027923583984375,
-0.004962921142578125,
0.050506591796875,
0.0161285400390625,
0.02935791015625,
0.0172119140625,
-0.0101165771484375,
0.03515625,
0.037109375,
-0.0011224746704101562,
-0.0254669189453125,
0.04144287109375,
-0.0909423828125,
-0.0036487579345703125,
-0.0167694091796875,
-0.03912353515625,
0.029815673828125,
0.01343536376953125,
0.039031982421875,
0.04205322265625,
-0.0035266876220703125,
0.012786865234375,
0.04736328125,
0.0126800537109375,
0.034332275390625,
0.047698974609375,
-0.0085296630859375,
-0.052825927734375,
0.061187744140625,
0.014739990234375,
0.021881103515625,
0.032958984375,
0.034881591796875,
-0.034820556640625,
-0.06964111328125,
-0.03387451171875,
0.01006317138671875,
-0.039093017578125,
-0.033355712890625,
-0.044769287109375,
-0.033843994140625,
-0.038665771484375,
0.01352691650390625,
-0.05419921875,
-0.0625,
-0.047027587890625,
0.0280914306640625,
0.04766845703125,
0.0357666015625,
-0.00835418701171875,
0.05755615234375,
-0.080810546875,
0.0217437744140625,
0.005809783935546875,
0.0028743743896484375,
0.012298583984375,
-0.073974609375,
-0.01224517822265625,
0.00876617431640625,
-0.0257568359375,
-0.051788330078125,
0.033538818359375,
0.00728607177734375,
0.0003170967102050781,
0.0175628662109375,
0.012176513671875,
0.056610107421875,
-0.0157012939453125,
0.07183837890625,
0.024810791015625,
-0.08795166015625,
0.055206298828125,
-0.033050537109375,
0.01226806640625,
0.03900146484375,
-0.001224517822265625,
-0.04290771484375,
-0.00213623046875,
-0.051513671875,
-0.04931640625,
0.06982421875,
0.0321044921875,
-0.0099945068359375,
0.0198516845703125,
0.01107025146484375,
0.00606536865234375,
0.017578125,
-0.053924560546875,
-0.0195465087890625,
-0.0390625,
-0.033050537109375,
0.0171661376953125,
-0.021514892578125,
-0.00502777099609375,
-0.0188446044921875,
0.051666259765625,
-0.0261077880859375,
0.036895751953125,
0.02880859375,
-0.013336181640625,
-0.01103973388671875,
0.00537872314453125,
0.052459716796875,
-0.0015277862548828125,
-0.041259765625,
-0.0039215087890625,
0.0110321044921875,
-0.059295654296875,
-0.004425048828125,
0.0024852752685546875,
-0.0316162109375,
0.0119476318359375,
0.03216552734375,
0.05780029296875,
0.02801513671875,
-0.0189056396484375,
0.052337646484375,
-0.0157318115234375,
-0.026519775390625,
-0.054779052734375,
0.009063720703125,
0.0131683349609375,
0.0196990966796875,
0.0197601318359375,
0.0292816162109375,
0.02008056640625,
-0.02679443359375,
-0.017913818359375,
0.0084381103515625,
-0.0377197265625,
-0.04083251953125,
0.06488037109375,
0.0108489990234375,
-0.0278778076171875,
0.04486083984375,
-0.0018100738525390625,
-0.0093994140625,
0.0400390625,
0.051910400390625,
0.08892822265625,
-0.00197601318359375,
0.00865936279296875,
0.038604736328125,
0.02716064453125,
-0.0135955810546875,
0.044677734375,
-0.0173187255859375,
-0.0283660888671875,
-0.02099609375,
-0.0557861328125,
-0.0269317626953125,
0.0016031265258789062,
-0.06988525390625,
0.0238037109375,
-0.0457763671875,
-0.018890380859375,
0.00867462158203125,
0.01399993896484375,
-0.0450439453125,
0.00010210275650024414,
0.01383209228515625,
0.1087646484375,
-0.048614501953125,
0.09442138671875,
0.04327392578125,
-0.035400390625,
-0.07421875,
-0.0129547119140625,
0.004322052001953125,
-0.04876708984375,
0.04296875,
0.0084686279296875,
0.0026454925537109375,
0.006298065185546875,
-0.05267333984375,
-0.06524658203125,
0.10540771484375,
0.0072021484375,
-0.039459228515625,
-0.0166015625,
0.00823211669921875,
0.03607177734375,
-0.039764404296875,
0.047210693359375,
0.033447265625,
0.05963134765625,
0.0036983489990234375,
-0.09075927734375,
0.00042724609375,
-0.0088348388671875,
0.0255126953125,
0.00662994384765625,
-0.06951904296875,
0.0899658203125,
-0.0174407958984375,
-0.0094451904296875,
0.06988525390625,
0.050750732421875,
0.0178680419921875,
0.027618408203125,
0.033203125,
0.0310211181640625,
0.031707763671875,
-0.0269927978515625,
0.044677734375,
-0.01099395751953125,
0.04296875,
0.0714111328125,
-0.01256561279296875,
0.07476806640625,
0.0303497314453125,
-0.003047943115234375,
0.046661376953125,
0.03509521484375,
-0.020843505859375,
0.045928955078125,
-0.006015777587890625,
0.0009207725524902344,
-0.004085540771484375,
-0.00804901123046875,
-0.02923583984375,
0.048736572265625,
0.032501220703125,
-0.0219268798828125,
-0.0057220458984375,
-0.001857757568359375,
0.0008168220520019531,
-0.01517486572265625,
-0.037353515625,
0.05267333984375,
-0.001758575439453125,
-0.0186767578125,
0.044708251953125,
0.0311126708984375,
0.06658935546875,
-0.0667724609375,
-0.00589752197265625,
0.017578125,
0.018096923828125,
-0.020904541015625,
-0.05267333984375,
0.032684326171875,
-0.015533447265625,
-0.0288238525390625,
-0.0007305145263671875,
0.05499267578125,
-0.043365478515625,
-0.0283660888671875,
0.0206756591796875,
0.01470184326171875,
0.02105712890625,
-0.020538330078125,
-0.05780029296875,
0.0321044921875,
0.0158538818359375,
-0.0190887451171875,
0.0189361572265625,
-0.0017976760864257812,
0.001220703125,
0.03533935546875,
0.059326171875,
0.016082763671875,
0.001125335693359375,
0.005016326904296875,
0.050811767578125,
-0.050323486328125,
-0.052734375,
-0.0286865234375,
0.04034423828125,
-0.0017213821411132812,
-0.04962158203125,
0.040008544921875,
0.06036376953125,
0.0452880859375,
-0.0252532958984375,
0.038818359375,
-0.0026035308837890625,
0.0282135009765625,
-0.059234619140625,
0.06378173828125,
-0.0360107421875,
-0.007236480712890625,
-0.0028858184814453125,
-0.0533447265625,
0.0006303787231445312,
0.026275634765625,
-0.0034236907958984375,
-0.01308441162109375,
0.04962158203125,
0.06494140625,
-0.015899658203125,
0.0228424072265625,
0.004058837890625,
0.033294677734375,
0.02001953125,
0.042510986328125,
0.046142578125,
-0.07525634765625,
0.05859375,
-0.0290374755859375,
0.0008177757263183594,
-0.01361083984375,
-0.043243408203125,
-0.06427001953125,
-0.0379638671875,
-0.037139892578125,
-0.04119873046875,
-0.0131378173828125,
0.06817626953125,
0.07171630859375,
-0.05047607421875,
-0.0238189697265625,
0.00937652587890625,
-0.0028076171875,
-0.0011920928955078125,
-0.021728515625,
0.03607177734375,
0.02685546875,
-0.0570068359375,
0.03741455078125,
0.00829315185546875,
0.039276123046875,
-0.019287109375,
-0.0295867919921875,
0.0236663818359375,
0.0001659393310546875,
0.02667236328125,
0.0084381103515625,
-0.057891845703125,
-0.02081298828125,
-0.0240936279296875,
-0.004119873046875,
0.010009765625,
0.05755615234375,
-0.05419921875,
0.004467010498046875,
0.038421630859375,
-0.0199432373046875,
0.05316162109375,
-0.0309295654296875,
0.00797271728515625,
-0.049407958984375,
0.037384033203125,
0.0225372314453125,
0.02850341796875,
0.01134490966796875,
-0.004680633544921875,
0.0294189453125,
0.0178985595703125,
-0.0261383056640625,
-0.0709228515625,
-0.0018510818481445312,
-0.1004638671875,
-0.00368499755859375,
0.07476806640625,
0.006458282470703125,
-0.0304107666015625,
0.0223388671875,
-0.042572021484375,
0.02496337890625,
-0.048583984375,
0.0158233642578125,
0.0006361007690429688,
0.0237274169921875,
-0.00537109375,
-0.0294189453125,
0.031890869140625,
-0.004611968994140625,
-0.0287017822265625,
0.0012845993041992188,
0.010650634765625,
0.036163330078125,
0.03466796875,
0.0416259765625,
-0.0293426513671875,
0.03436279296875,
0.027252197265625,
0.0272216796875,
-0.0264739990234375,
-0.0325927734375,
-0.03179931640625,
0.002140045166015625,
0.0032711029052734375,
-0.028594970703125
]
] |
THUDM/chatglm2-6b-32k | 2023-10-12T20:38:40.000Z | [
"transformers",
"pytorch",
"chatglm",
"glm",
"thudm",
"custom_code",
"zh",
"en",
"arxiv:2103.10360",
"arxiv:2210.02414",
"arxiv:2306.15595",
"arxiv:1911.02150",
"endpoints_compatible",
"has_space",
"region:us"
] | null | THUDM | null | null | THUDM/chatglm2-6b-32k | 283 | 33,621 | transformers | 2023-07-30T14:16:43 | ---
language:
- zh
- en
tags:
- glm
- chatglm
- thudm
---
# ChatGLM2-6B-32K
<p align="center">
💻 <a href="https://github.com/THUDM/ChatGLM2-6B" target="_blank">Github Repo</a> • 🐦 <a href="https://twitter.com/thukeg" target="_blank">Twitter</a> • 📃 <a href="https://arxiv.org/abs/2103.10360" target="_blank">[GLM@ACL 22]</a> <a href="https://github.com/THUDM/GLM" target="_blank">[GitHub]</a> • 📃 <a href="https://arxiv.org/abs/2210.02414" target="_blank">[GLM-130B@ICLR 23]</a> <a href="https://github.com/THUDM/GLM-130B" target="_blank">[GitHub]</a> <br>
</p>
<p align="center">
👋 Join our <a href="https://join.slack.com/t/chatglm/shared_invite/zt-1y7pqoloy-9b1g6T6JjA8J0KxvUjbwJw" target="_blank">Slack</a> and <a href="https://github.com/THUDM/ChatGLM-6B/blob/main/resources/WECHAT.md" target="_blank">WeChat</a>
</p>
## 更新/Update
- 我们优化了KV Cache的存储方式,减少了显存碎片的产生。基于优化后的代码,模型可以在约**20G显存**的情况下处理32K长度的上下文(FP/BF16格式)。
- We have optimized how the KV cache is stored, reducing GPU memory fragmentation. With the optimized code, the model can process a context length of 32K with approximately **20 GB** of GPU memory (FP16/BF16 format).
## Introduction
ChatGLM**2**-6B-32K在[ChatGLM2-6B](https://huggingface.co/THUDM/chatglm2-6b)的基础上进一步强化了对于长文本的理解能力,能够更好的处理最多32K长度的上下文。具体地,我们基于[位置插值](https://arxiv.org/abs/2306.15595)(Positional Interpolation)的方法对位置编码进行了更新,并在对话阶段使用 32K 的上下文长度训练。在实际的使用中,如果您面临的上下文长度基本在 **8K 以内**,我们推荐使用[ChatGLM2-6B](https://huggingface.co/THUDM/chatglm2-6b);如果您需要处理**超过 8K** 的上下文长度,我们推荐使用ChatGLM2-6B-32K。
ChatGLM**2**-6B-32K是开源中英双语对话模型 [ChatGLM2-6B](https://github.com/THUDM/ChatGLM2-6B) 的加长版本,在保留了初代模型对话流畅、部署门槛较低等众多优秀特性的基础之上,ChatGLM**2**-6B-32k 引入了如下新特性:
1. **更强大的性能**:基于 ChatGLM 初代模型的开发经验,我们全面升级了 ChatGLM2-6B-32K 的基座模型。ChatGLM2-6B-32K 使用了 [GLM](https://github.com/THUDM/GLM) 的混合目标函数,经过了 1.4T 中英标识符的预训练与人类偏好对齐训练。
2. **更长的上下文**:基于 [FlashAttention](https://github.com/HazyResearch/flash-attention) 技术,我们将基座模型的上下文长度(Context Length)由 ChatGLM-6B 的 2K 扩展到了 32K,并在对话阶段使用 32K 的上下文长度训练,允许更多轮次的对话。
3. **更高效的推理**:基于 [Multi-Query Attention](http://arxiv.org/abs/1911.02150) 技术,ChatGLM2-6B-32K 有更高效的推理速度和更低的显存占用:在官方的模型实现下,推理速度相比初代提升了 42%,INT4 量化下,6G 显存支持的对话长度由 1K 提升到了 8K。
4. **更开放的协议**:ChatGLM2-6B-32K 权重对学术研究**完全开放**,在填写[问卷](https://open.bigmodel.cn/mla/form)进行登记后**亦允许免费商业使用**。
ChatGLM**2**-6B-32K further strengthens the long-text understanding of [ChatGLM2-6B](https://huggingface.co/THUDM/chatglm2-6b) and can better handle contexts of up to 32K tokens. Specifically, we updated the position encoding using the method of [Positional Interpolation](https://arxiv.org/abs/2306.15595), and trained with a 32K context length during dialogue alignment. In practice, if the context lengths you deal with are generally within **8K**, we recommend [ChatGLM2-6B](https://huggingface.co/THUDM/chatglm2-6b); if you need to handle contexts **longer than 8K**, we recommend ChatGLM2-6B-32K.
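The idea behind Positional Interpolation can be sketched in a few lines: rather than extrapolating rotary position embeddings beyond the trained length, positions are rescaled so that the extended context maps back into the trained range. The NumPy sketch below uses made-up dimensions and is not ChatGLM2-6B-32K's actual implementation:

```python
import numpy as np

def rope_angles(positions, dim=64, base=10000.0):
    """Standard RoPE rotation angles for each position."""
    inv_freq = 1.0 / (base ** (np.arange(0, dim, 2) / dim))
    return np.outer(positions, inv_freq)  # shape: (num_positions, dim // 2)

trained_len, target_len = 2048, 32768
positions = np.arange(target_len)

# Extrapolation: positions beyond 2048 produce angles the model never saw.
angles_extrapolated = rope_angles(positions)

# Interpolation: squeeze the 32K positions back into the trained 0..2048 range.
scale = trained_len / target_len  # 1/16
angles_interpolated = rope_angles(positions * scale)

assert angles_interpolated.max() < trained_len   # stays in the trained range
assert angles_extrapolated.max() > trained_len   # would leave it
```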
ChatGLM2-6B-32K is the long-context version of the open-source bilingual (Chinese-English) chat model [ChatGLM2-6B](https://github.com/THUDM/ChatGLM2-6B). It retains the smooth conversation flow and low deployment threshold of the first-generation model, while introducing the following new features:
1. **Stronger Performance**: Based on the development experience of the first-generation ChatGLM model, we have fully upgraded the base model of ChatGLM2-6B-32K. ChatGLM2-6B-32K uses the hybrid objective function of [GLM](https://github.com/THUDM/GLM), and has undergone pre-training with 1.4T bilingual tokens and human preference alignment training.
2. **Longer Context**: Based on [FlashAttention](https://github.com/HazyResearch/flash-attention) technique, we have extended the context length of the base model from 2K in ChatGLM-6B to 32K, and trained with a context length of 32K during the dialogue alignment, allowing for more rounds of dialogue.
3. **More Efficient Inference**: Based on [Multi-Query Attention](http://arxiv.org/abs/1911.02150) technique, ChatGLM2-6B-32K has more efficient inference speed and lower GPU memory usage: under the official implementation, the inference speed has increased by 42% compared to the first generation; under INT4 quantization, the dialogue length supported by 6G GPU memory has increased from 1K to 8K.
4. **More Open License**: ChatGLM2-6B-32K weights are **completely open** for academic research, and **free commercial use** is also allowed after completing the [questionnaire](https://open.bigmodel.cn/mla/form).
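The memory benefit of Multi-Query Attention is easy to see from the size of the key/value cache: all query heads share a single K/V head, so the cache shrinks by roughly the head count. The numbers below are illustrative 6B-scale assumptions, not ChatGLM2-6B-32K's published configuration:

```python
def kv_cache_bytes(layers, seq_len, kv_heads, head_dim, bytes_per_elem=2):
    """KV cache size: two tensors (K and V) per layer, stored in FP16."""
    return 2 * layers * seq_len * kv_heads * head_dim * bytes_per_elem

# Illustrative shapes (assumptions for the sketch).
layers, heads, head_dim, seq_len = 28, 32, 128, 32 * 1024

mha = kv_cache_bytes(layers, seq_len, kv_heads=heads, head_dim=head_dim)
mqa = kv_cache_bytes(layers, seq_len, kv_heads=1, head_dim=head_dim)

print(f"MHA cache at 32K context: {mha / 2**30:.1f} GiB")  # 14.0 GiB
print(f"MQA cache at 32K context: {mqa / 2**30:.1f} GiB")  # 0.4 GiB
```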
## Dependencies
```shell
pip install protobuf transformers==4.30.2 cpm_kernels torch>=2.0 gradio mdtex2html sentencepiece accelerate
```
## Usage
You can generate a conversation by calling the ChatGLM2-6B-32K model with the following code:
```ipython
>>> from transformers import AutoTokenizer, AutoModel
>>> tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm2-6b-32k", trust_remote_code=True)
>>> model = AutoModel.from_pretrained("THUDM/chatglm2-6b-32k", trust_remote_code=True).half().cuda()
>>> model = model.eval()
>>> response, history = model.chat(tokenizer, "你好", history=[])
>>> print(response)
你好👋!我是人工智能助手 ChatGLM-6B,很高兴见到你,欢迎问我任何问题。
>>> response, history = model.chat(tokenizer, "晚上睡不着应该怎么办", history=history)
>>> print(response)
晚上睡不着可能会让你感到焦虑或不舒服,但以下是一些可以帮助你入睡的方法:
1. 制定规律的睡眠时间表:保持规律的睡眠时间表可以帮助你建立健康的睡眠习惯,使你更容易入睡。尽量在每天的相同时间上床,并在同一时间起床。
2. 创造一个舒适的睡眠环境:确保睡眠环境舒适,安静,黑暗且温度适宜。可以使用舒适的床上用品,并保持房间通风。
3. 放松身心:在睡前做些放松的活动,例如泡个热水澡,听些轻柔的音乐,阅读一些有趣的书籍等,有助于缓解紧张和焦虑,使你更容易入睡。
4. 避免饮用含有咖啡因的饮料:咖啡因是一种刺激性物质,会影响你的睡眠质量。尽量避免在睡前饮用含有咖啡因的饮料,例如咖啡,茶和可乐。
5. 避免在床上做与睡眠无关的事情:在床上做些与睡眠无关的事情,例如看电影,玩游戏或工作等,可能会干扰你的睡眠。
6. 尝试呼吸技巧:深呼吸是一种放松技巧,可以帮助你缓解紧张和焦虑,使你更容易入睡。试着慢慢吸气,保持几秒钟,然后缓慢呼气。
如果这些方法无法帮助你入睡,你可以考虑咨询医生或睡眠专家,寻求进一步的建议。
```
关于更多的使用说明,包括如何运行命令行和网页版本的 DEMO,以及使用模型量化以节省显存,请参考我们的 [Github Repo](https://github.com/THUDM/ChatGLM2-6B)。
For more instructions, including how to run CLI and web demos, and model quantization, please refer to our [Github Repo](https://github.com/THUDM/ChatGLM2-6B).
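The INT4 quantization mentioned above stores each weight in 4 bits plus a shared scale. A minimal symmetric (absmax) round-trip in NumPy shows the basic mechanism; this is an illustrative sketch, not the quantization scheme ChatGLM2-6B actually ships:

```python
import numpy as np

def quantize_int4(w):
    """Symmetric absmax quantization onto the int4 range [-8, 7]."""
    scale = np.abs(w).max() / 7.0
    q = np.clip(np.round(w / scale), -8, 7).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.02, size=4096).astype(np.float32)

q, scale = quantize_int4(w)
w_hat = dequantize(q, scale)

# Rounding error is bounded by half a quantization step.
max_err = float(np.max(np.abs(w - w_hat)))
assert max_err <= scale / 2 + 1e-8
```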
## Change Log
* v1.0
## License
The code in this repository is open-sourced under the [Apache-2.0](LICENSE) license. Use of the ChatGLM2-6B-32K model weights must comply with the [Model License](MODEL_LICENSE).
## Citation
If you find our work helpful, please consider citing the following papers. The ChatGLM2-6B paper will be released soon; stay tuned!
```
@article{zeng2022glm,
title={Glm-130b: An open bilingual pre-trained model},
author={Zeng, Aohan and Liu, Xiao and Du, Zhengxiao and Wang, Zihan and Lai, Hanyu and Ding, Ming and Yang, Zhuoyi and Xu, Yifan and Zheng, Wendi and Xia, Xiao and others},
journal={arXiv preprint arXiv:2210.02414},
year={2022}
}
```
```
@inproceedings{du2022glm,
title={GLM: General Language Model Pretraining with Autoregressive Blank Infilling},
author={Du, Zhengxiao and Qian, Yujie and Liu, Xiao and Ding, Ming and Qiu, Jiezhong and Yang, Zhilin and Tang, Jie},
booktitle={Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)},
pages={320--335},
year={2022}
}
``` | 6,871 | [
[
-0.034942626953125,
-0.0654296875,
0.0092315673828125,
0.033935546875,
-0.029144287109375,
-0.00603485107421875,
-0.023284912109375,
-0.041717529296875,
0.007602691650390625,
0.0121002197265625,
-0.043365478515625,
-0.045318603515625,
-0.03704833984375,
-0.017822265625,
-0.00835418701171875,
0.06524658203125,
0.01421356201171875,
0.0036334991455078125,
0.00289154052734375,
-0.006671905517578125,
-0.03961181640625,
-0.041015625,
-0.0576171875,
-0.01131439208984375,
0.0053253173828125,
0.0109100341796875,
0.05023193359375,
0.02789306640625,
0.0287628173828125,
0.0251617431640625,
-0.0177154541015625,
0.01425933837890625,
-0.0506591796875,
-0.01885986328125,
0.0192413330078125,
-0.0307464599609375,
-0.051361083984375,
-0.0018949508666992188,
0.043609619140625,
0.01194000244140625,
-0.00936126708984375,
0.022369384765625,
0.022247314453125,
0.048614501953125,
-0.02935791015625,
0.0361328125,
-0.042572021484375,
-0.0067291259765625,
-0.007602691650390625,
-0.01258087158203125,
-0.0228118896484375,
-0.0223846435546875,
0.00836181640625,
-0.04168701171875,
0.00119781494140625,
0.0149993896484375,
0.09588623046875,
-0.0022716522216796875,
-0.0228118896484375,
-0.0089874267578125,
-0.042572021484375,
0.0733642578125,
-0.08465576171875,
0.01617431640625,
0.027130126953125,
0.032501220703125,
-0.020751953125,
-0.052734375,
-0.0367431640625,
-0.01194000244140625,
-0.03179931640625,
0.0240631103515625,
-0.00798797607421875,
-0.0015201568603515625,
0.01297760009765625,
0.0233612060546875,
-0.051910400390625,
0.0006494522094726562,
-0.03631591796875,
-0.024658203125,
0.054107666015625,
0.01309967041015625,
0.044952392578125,
-0.01297760009765625,
-0.037506103515625,
-0.0037479400634765625,
-0.0372314453125,
0.01910400390625,
0.017181396484375,
0.015869140625,
-0.05096435546875,
0.023162841796875,
-0.01026153564453125,
0.03887939453125,
0.004009246826171875,
-0.025115966796875,
0.031463623046875,
-0.0443115234375,
-0.022857666015625,
-0.0154876708984375,
0.09130859375,
0.035003662109375,
-0.0003452301025390625,
0.016021728515625,
-0.00124359130859375,
-0.01470184326171875,
-0.00740814208984375,
-0.0634765625,
-0.00400543212890625,
0.0301666259765625,
-0.043609619140625,
-0.010772705078125,
-0.005168914794921875,
-0.0479736328125,
0.01213836669921875,
0.00017249584197998047,
0.038238525390625,
-0.047515869140625,
-0.02880859375,
0.0137939453125,
0.0005140304565429688,
0.0259552001953125,
0.031646728515625,
-0.07122802734375,
0.026092529296875,
0.0372314453125,
0.06591796875,
-0.004978179931640625,
-0.0178070068359375,
-0.012664794921875,
0.0044403076171875,
-0.00925445556640625,
0.0262603759765625,
-0.002841949462890625,
-0.031982421875,
-0.009033203125,
0.00470733642578125,
-0.0207977294921875,
-0.0228271484375,
0.02911376953125,
-0.028045654296875,
0.054107666015625,
-0.004604339599609375,
-0.03900146484375,
-0.025604248046875,
0.022308349609375,
-0.026947021484375,
0.081787109375,
0.002323150634765625,
-0.06524658203125,
-0.00502777099609375,
-0.045654296875,
-0.0112152099609375,
-0.0019931793212890625,
-0.0032806396484375,
-0.027862548828125,
-0.0218963623046875,
0.0306243896484375,
0.022125244140625,
-0.030670166015625,
0.013092041015625,
-0.016143798828125,
-0.032989501953125,
0.0200042724609375,
-0.02606201171875,
0.082763671875,
0.0187225341796875,
-0.03521728515625,
0.01873779296875,
-0.0287628173828125,
0.02490234375,
0.0188140869140625,
-0.0172271728515625,
-0.00101470947265625,
0.0029201507568359375,
0.003658294677734375,
0.036041259765625,
0.0384521484375,
-0.0216064453125,
0.007129669189453125,
-0.05181884765625,
0.03436279296875,
0.046539306640625,
-0.0052642822265625,
0.035858154296875,
-0.0292205810546875,
0.0219879150390625,
0.01464080810546875,
0.037445068359375,
-0.01666259765625,
-0.052215576171875,
-0.07196044921875,
-0.0133514404296875,
0.0154876708984375,
0.057373046875,
-0.046875,
0.059417724609375,
-0.0166778564453125,
-0.0421142578125,
-0.0404052734375,
0.013092041015625,
0.048919677734375,
0.0284576416015625,
0.0377197265625,
-0.018707275390625,
-0.038604736328125,
-0.05621337890625,
-0.000039577484130859375,
-0.028900146484375,
-0.007419586181640625,
0.041595458984375,
0.035919189453125,
-0.0239715576171875,
0.07171630859375,
-0.038299560546875,
-0.027862548828125,
-0.0171661376953125,
0.002101898193359375,
0.0180206298828125,
0.048980712890625,
0.051025390625,
-0.058349609375,
-0.062225341796875,
-0.003314971923828125,
-0.06353759765625,
0.01097869873046875,
0.006103515625,
-0.03143310546875,
0.038665771484375,
0.025390625,
-0.048065185546875,
0.032958984375,
0.050506591796875,
-0.0259552001953125,
0.044342041015625,
-0.0172882080078125,
0.0025310516357421875,
-0.0941162109375,
0.00382232666015625,
-0.007480621337890625,
-0.000766754150390625,
-0.052001953125,
-0.0098724365234375,
0.00010645389556884766,
0.0149383544921875,
-0.048980712890625,
0.07989501953125,
-0.04931640625,
0.01412200927734375,
-0.01025390625,
0.0246734619140625,
-0.01107025146484375,
0.061767578125,
-0.01203155517578125,
0.048126220703125,
0.0531005859375,
-0.043121337890625,
0.0212860107421875,
0.025177001953125,
-0.019287109375,
0.002529144287109375,
-0.05596923828125,
0.0177459716796875,
0.0020427703857421875,
0.0266876220703125,
-0.09698486328125,
-0.0088043212890625,
0.0443115234375,
-0.06329345703125,
0.0221710205078125,
-0.01593017578125,
-0.032135009765625,
-0.041473388671875,
-0.0374755859375,
0.0186767578125,
0.0560302734375,
-0.023101806640625,
0.03692626953125,
0.030670166015625,
-0.0026531219482421875,
-0.048126220703125,
-0.04510498046875,
-0.005970001220703125,
-0.019317626953125,
-0.0738525390625,
0.026031494140625,
-0.014739990234375,
0.001544952392578125,
-0.01220703125,
0.01021575927734375,
0.00341796875,
-0.0041961669921875,
0.0166168212890625,
0.035064697265625,
-0.00988006591796875,
-0.0033206939697265625,
-0.01070404052734375,
-0.00966644287109375,
-0.004611968994140625,
-0.00931549072265625,
0.053375244140625,
-0.0253143310546875,
-0.0233612060546875,
-0.042877197265625,
0.0136871337890625,
0.034637451171875,
-0.007686614990234375,
0.061065673828125,
0.0777587890625,
-0.01511383056640625,
0.0177459716796875,
-0.0526123046875,
-0.0156402587890625,
-0.04156494140625,
0.0282440185546875,
-0.0045928955078125,
-0.07763671875,
0.06304931640625,
0.0227508544921875,
0.01459503173828125,
0.043975830078125,
0.04791259765625,
0.006938934326171875,
0.09234619140625,
0.02801513671875,
-0.02618408203125,
0.041168212890625,
-0.031707763671875,
0.0162353515625,
-0.06500244140625,
-0.0216522216796875,
-0.025482177734375,
-0.0222320556640625,
-0.047943115234375,
-0.045196533203125,
0.027130126953125,
0.0131072998046875,
-0.0208740234375,
0.013153076171875,
-0.03582763671875,
-0.0025653839111328125,
0.0377197265625,
0.005130767822265625,
0.0019083023071289062,
-0.01030731201171875,
-0.007686614990234375,
-0.0013580322265625,
-0.0506591796875,
-0.034820556640625,
0.056854248046875,
0.03466796875,
0.048980712890625,
0.0196990966796875,
0.03173828125,
-0.00676727294921875,
0.0178680419921875,
-0.045196533203125,
0.05194091796875,
0.01087188720703125,
-0.056915283203125,
-0.034576416015625,
-0.03387451171875,
-0.0703125,
0.03887939453125,
-0.006114959716796875,
-0.08233642578125,
-0.005992889404296875,
0.01093292236328125,
-0.018829345703125,
0.0150909423828125,
-0.061553955078125,
0.06597900390625,
-0.0287017822265625,
-0.0233154296875,
-0.0059051513671875,
-0.0675048828125,
0.041717529296875,
0.021087646484375,
0.0226898193359375,
-0.02886962890625,
0.00426483154296875,
0.056488037109375,
-0.03887939453125,
0.059661865234375,
-0.0228271484375,
-0.00608062744140625,
0.0408935546875,
-0.00751495361328125,
0.045135498046875,
0.01544189453125,
0.0140380859375,
0.02410888671875,
0.0038280487060546875,
-0.03009033203125,
-0.043914794921875,
0.05218505859375,
-0.057037353515625,
-0.054779052734375,
-0.02777099609375,
-0.0362548828125,
-0.01230621337890625,
0.01800537109375,
0.0304107666015625,
0.021148681640625,
-0.004974365234375,
0.0189666748046875,
0.0269622802734375,
-0.036651611328125,
0.045654296875,
0.042205810546875,
-0.037872314453125,
-0.035491943359375,
0.0535888671875,
0.00292205810546875,
0.0298309326171875,
0.0189361572265625,
0.006183624267578125,
-0.031982421875,
-0.032928466796875,
-0.035430908203125,
0.0274658203125,
-0.032135009765625,
-0.006908416748046875,
-0.056671142578125,
-0.03887939453125,
-0.04962158203125,
0.006572723388671875,
-0.026885986328125,
-0.00849151611328125,
-0.0303497314453125,
0.005756378173828125,
0.037506103515625,
0.004436492919921875,
0.00649261474609375,
0.0172882080078125,
-0.071533203125,
0.021636962890625,
0.0165863037109375,
0.0188751220703125,
0.0174560546875,
-0.0557861328125,
-0.037689208984375,
0.03778076171875,
-0.01248931884765625,
-0.036773681640625,
0.049591064453125,
0.01030731201171875,
0.04534912109375,
0.0310516357421875,
-0.004428863525390625,
0.059967041015625,
-0.026885986328125,
0.0731201171875,
0.0225067138671875,
-0.0689697265625,
0.037445068359375,
-0.038360595703125,
0.037078857421875,
0.0220794677734375,
0.021728515625,
-0.048553466796875,
-0.0309295654296875,
-0.053131103515625,
-0.068115234375,
0.080322265625,
0.040557861328125,
0.0325927734375,
0.00521087646484375,
0.00572967529296875,
-0.024627685546875,
0.00919342041015625,
-0.04705810546875,
-0.0506591796875,
-0.01039886474609375,
-0.00970458984375,
-0.0009593963623046875,
-0.035491943359375,
-0.0067901611328125,
-0.0379638671875,
0.057281494140625,
-0.00571441650390625,
0.0540771484375,
0.004474639892578125,
0.0001575946807861328,
0.0094146728515625,
0.010711669921875,
0.05609130859375,
0.047027587890625,
-0.023529052734375,
-0.0141448974609375,
0.0240631103515625,
-0.045501708984375,
-0.002727508544921875,
0.00539398193359375,
-0.01458740234375,
0.005466461181640625,
0.01534271240234375,
0.08648681640625,
0.0255126953125,
-0.04241943359375,
0.041168212890625,
-0.0211181640625,
-0.017120361328125,
-0.0220489501953125,
0.0139923095703125,
0.0242767333984375,
0.0160369873046875,
0.04095458984375,
-0.023040771484375,
-0.0018768310546875,
-0.04571533203125,
0.0013484954833984375,
0.0411376953125,
-0.0235748291015625,
-0.0218963623046875,
0.05120849609375,
0.00948333740234375,
-0.012420654296875,
0.033355712890625,
-0.0171661376953125,
-0.046722412109375,
0.045257568359375,
0.04278564453125,
0.0589599609375,
-0.02105712890625,
0.00926971435546875,
0.05340576171875,
0.0086212158203125,
-0.0220794677734375,
0.019805908203125,
0.00864410400390625,
-0.057159423828125,
-0.0227813720703125,
-0.042083740234375,
-0.013702392578125,
0.01552581787109375,
-0.03887939453125,
0.025482177734375,
-0.0275726318359375,
-0.02130126953125,
-0.01491546630859375,
0.00740814208984375,
-0.0290374755859375,
0.014007568359375,
0.00812530517578125,
0.05218505859375,
-0.026885986328125,
0.06842041015625,
0.033447265625,
-0.0206756591796875,
-0.06646728515625,
-0.01220703125,
0.01611328125,
-0.059967041015625,
0.034210205078125,
0.0147857666015625,
-0.0035247802734375,
0.0019969940185546875,
-0.043365478515625,
-0.0888671875,
0.0968017578125,
0.0176239013671875,
-0.0296478271484375,
-0.01282501220703125,
-0.003749847412109375,
0.052276611328125,
-0.0197906494140625,
0.041748046875,
0.0197601318359375,
0.028228759765625,
0.0204925537109375,
-0.095947265625,
0.0150909423828125,
-0.04241943359375,
0.01335906982421875,
-0.0010662078857421875,
-0.08148193359375,
0.0780029296875,
-0.00821685791015625,
-0.030792236328125,
0.0001800060272216797,
0.0535888671875,
0.0219573974609375,
0.0086212158203125,
0.02410888671875,
0.0262603759765625,
0.03692626953125,
-0.0226898193359375,
0.064208984375,
-0.041595458984375,
0.05621337890625,
0.0721435546875,
0.001285552978515625,
0.0517578125,
0.0146942138671875,
-0.0236663818359375,
0.03387451171875,
0.045196533203125,
-0.0051116943359375,
0.036712646484375,
-0.003814697265625,
-0.0188446044921875,
-0.0012874603271484375,
0.01352691650390625,
-0.045623779296875,
0.021148681640625,
0.03570556640625,
-0.0130157470703125,
-0.0086669921875,
-0.000766754150390625,
0.0205078125,
-0.0310211181640625,
-0.007656097412109375,
0.06451416015625,
0.016754150390625,
-0.05078125,
0.07879638671875,
0.009674072265625,
0.0743408203125,
-0.05975341796875,
0.01233673095703125,
-0.017181396484375,
0.00786590576171875,
-0.01165771484375,
-0.046417236328125,
0.0096588134765625,
-0.01287841796875,
0.00499725341796875,
-0.004245758056640625,
0.065673828125,
-0.040252685546875,
-0.0264434814453125,
0.04144287109375,
0.0287933349609375,
0.01134490966796875,
0.003330230712890625,
-0.0762939453125,
0.009185791015625,
0.016937255859375,
-0.038604736328125,
0.03436279296875,
0.021820068359375,
0.0016145706176757812,
0.0587158203125,
0.048492431640625,
-0.00965118408203125,
0.00824737548828125,
-0.00418853759765625,
0.06494140625,
-0.051055908203125,
-0.0400390625,
-0.0726318359375,
0.052215576171875,
-0.01392364501953125,
-0.0223541259765625,
0.07574462890625,
0.04205322265625,
0.05877685546875,
0.0011730194091796875,
0.0526123046875,
-0.022216796875,
0.0377197265625,
-0.038055419921875,
0.055633544921875,
-0.043609619140625,
0.012115478515625,
-0.0254974365234375,
-0.0460205078125,
-0.01629638671875,
0.0401611328125,
-0.0239715576171875,
0.022857666015625,
0.047149658203125,
0.06878662109375,
0.0141448974609375,
-0.014495849609375,
0.01499176025390625,
0.0207366943359375,
0.026885986328125,
0.06689453125,
0.039947509765625,
-0.05499267578125,
0.049407958984375,
-0.020355224609375,
-0.0013971328735351562,
-0.040191650390625,
-0.03887939453125,
-0.07904052734375,
-0.04608154296875,
-0.0158233642578125,
-0.034454345703125,
-0.01132965087890625,
0.06280517578125,
0.05242919921875,
-0.05023193359375,
-0.029876708984375,
0.0162506103515625,
0.009552001953125,
-0.0206451416015625,
-0.0201263427734375,
0.034942626953125,
-0.024444580078125,
-0.067138671875,
0.00562286376953125,
0.021392822265625,
0.0196990966796875,
-0.0148773193359375,
-0.02801513671875,
-0.03668212890625,
0.0030727386474609375,
0.042572021484375,
0.025115966796875,
-0.0633544921875,
-0.014404296875,
0.009552001953125,
-0.035430908203125,
0.017333984375,
0.01399993896484375,
-0.034393310546875,
0.0300750732421875,
0.043060302734375,
0.00585174560546875,
0.056396484375,
0.0052337646484375,
0.0290679931640625,
-0.045623779296875,
0.03607177734375,
0.00167083740234375,
0.0225067138671875,
0.01067352294921875,
-0.0262603759765625,
0.0443115234375,
0.01207733154296875,
-0.033172607421875,
-0.060302734375,
-0.015106201171875,
-0.08526611328125,
-0.00960540771484375,
0.1026611328125,
-0.0162200927734375,
-0.0224609375,
0.0004303455352783203,
-0.042877197265625,
0.027984619140625,
-0.03375244140625,
0.061004638671875,
0.059417724609375,
-0.005321502685546875,
-0.0150604248046875,
-0.048919677734375,
0.04571533203125,
0.0290374755859375,
-0.059600830078125,
-0.014312744140625,
0.02862548828125,
0.02313232421875,
0.0127105712890625,
0.07379150390625,
-0.01218414306640625,
0.01934814453125,
-0.0227508544921875,
0.011962890625,
-0.00494384765625,
0.01097869873046875,
-0.0120086669921875,
-0.01096343994140625,
-0.01580810546875,
-0.0190582275390625
]
] |
timm/dpn107.mx_in1k | 2023-04-21T22:00:16.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"arxiv:1707.01629",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/dpn107.mx_in1k | 0 | 33,489 | timm | 2023-04-21T21:58:56 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
datasets:
- imagenet-1k
---
# Model card for dpn107.mx_in1k
A DPN (Dual-Path Net) image classification model. Trained on ImageNet-1k in MXNet by paper authors and ported to PyTorch by Ross Wightman.
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 86.9
- GMACs: 18.4
- Activations (M): 33.5
- Image size: 224 x 224
- **Papers:**
- Dual Path Networks: https://arxiv.org/abs/1707.01629
- **Dataset:** ImageNet-1k
- **Original:** https://github.com/cypw/DPNs
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('dpn107.mx_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'dpn107.mx_in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g.:
# torch.Size([1, 128, 112, 112])
# torch.Size([1, 376, 56, 56])
# torch.Size([1, 1152, 28, 28])
# torch.Size([1, 2432, 14, 14])
# torch.Size([1, 2688, 7, 7])
print(o.shape)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'dpn107.mx_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 2688, 7, 7) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
## Citation
```bibtex
@article{Chen2017,
title={Dual Path Networks},
  author={Yunpeng Chen and Jianan Li and Huaxin Xiao and Xiaojie Jin and Shuicheng Yan and Jiashi Feng},
journal={arXiv preprint arXiv:1707.01629},
year={2017}
}
```
| 3,415 | [
[
-0.0304412841796875,
-0.0303497314453125,
0.00621795654296875,
0.0160980224609375,
-0.026214599609375,
-0.019775390625,
-0.0108184814453125,
-0.0160064697265625,
0.0244903564453125,
0.035491943359375,
-0.043548583984375,
-0.049346923828125,
-0.054473876953125,
-0.0057220458984375,
-0.0088958740234375,
0.06829833984375,
-0.004154205322265625,
0.00437164306640625,
-0.008392333984375,
-0.0306854248046875,
-0.0124053955078125,
-0.02294921875,
-0.062469482421875,
-0.037567138671875,
0.0289154052734375,
0.01611328125,
0.038330078125,
0.045379638671875,
0.0482177734375,
0.036956787109375,
-0.01155853271484375,
0.0008783340454101562,
-0.020416259765625,
-0.015777587890625,
0.02996826171875,
-0.034698486328125,
-0.04205322265625,
0.0161895751953125,
0.062103271484375,
0.0433349609375,
0.011932373046875,
0.023712158203125,
0.0105743408203125,
0.04205322265625,
-0.006744384765625,
0.0105743408203125,
-0.0306854248046875,
0.01451873779296875,
-0.0140228271484375,
0.003856658935546875,
-0.017486572265625,
-0.030609130859375,
0.018829345703125,
-0.035858154296875,
0.0259552001953125,
-0.0023937225341796875,
0.08563232421875,
0.016876220703125,
-0.010711669921875,
-0.0111236572265625,
-0.01459503173828125,
0.05523681640625,
-0.066650390625,
0.007724761962890625,
0.0225372314453125,
0.01317596435546875,
-0.0200653076171875,
-0.0728759765625,
-0.0478515625,
-0.01079559326171875,
-0.0161590576171875,
-0.0072174072265625,
-0.005977630615234375,
0.006473541259765625,
0.028717041015625,
0.0263671875,
-0.03363037109375,
0.0085906982421875,
-0.04339599609375,
-0.01446533203125,
0.0484619140625,
0.0092010498046875,
0.0290985107421875,
-0.017181396484375,
-0.042236328125,
-0.03173828125,
-0.024261474609375,
0.0189971923828125,
0.03472900390625,
0.0213470458984375,
-0.03759765625,
0.02337646484375,
0.003681182861328125,
0.04571533203125,
0.002056121826171875,
-0.032470703125,
0.046661376953125,
0.007205963134765625,
-0.03466796875,
0.007396697998046875,
0.08270263671875,
0.0197296142578125,
0.0116119384765625,
0.002796173095703125,
-0.008209228515625,
-0.0294342041015625,
-0.00991058349609375,
-0.0841064453125,
-0.037200927734375,
0.02581787109375,
-0.04034423828125,
-0.03460693359375,
0.027679443359375,
-0.05615234375,
-0.0150299072265625,
-0.005035400390625,
0.0477294921875,
-0.032958984375,
-0.037841796875,
0.01678466796875,
-0.01702880859375,
0.01727294921875,
0.0135650634765625,
-0.044342041015625,
0.01496124267578125,
0.022064208984375,
0.08441162109375,
0.006069183349609375,
-0.041351318359375,
-0.010833740234375,
-0.023651123046875,
-0.01427459716796875,
0.03118896484375,
-0.003253936767578125,
-0.006420135498046875,
-0.0232086181640625,
0.0289764404296875,
-0.0201416015625,
-0.05084228515625,
0.01490020751953125,
-0.01482391357421875,
0.024322509765625,
-0.0006518363952636719,
-0.0182037353515625,
-0.044403076171875,
0.0178375244140625,
-0.032440185546875,
0.09222412109375,
0.027099609375,
-0.06622314453125,
0.0185394287109375,
-0.043731689453125,
-0.0124969482421875,
-0.0191650390625,
0.002979278564453125,
-0.0791015625,
-0.01279449462890625,
0.0052337646484375,
0.054718017578125,
-0.01268768310546875,
0.011810302734375,
-0.04681396484375,
-0.02081298828125,
0.0265655517578125,
-0.008453369140625,
0.08941650390625,
0.00377655029296875,
-0.03802490234375,
0.01419830322265625,
-0.0545654296875,
0.01416778564453125,
0.0390625,
-0.023162841796875,
-0.007389068603515625,
-0.05426025390625,
0.005298614501953125,
0.0296630859375,
0.004154205322265625,
-0.0491943359375,
0.032379150390625,
-0.017333984375,
0.033721923828125,
0.053466796875,
-0.0101470947265625,
0.0223541259765625,
-0.025390625,
0.0192718505859375,
0.0235137939453125,
0.01024627685546875,
-0.002735137939453125,
-0.0457763671875,
-0.06268310546875,
-0.035614013671875,
0.0220947265625,
0.0259857177734375,
-0.03778076171875,
0.042938232421875,
-0.0170745849609375,
-0.05419921875,
-0.0299072265625,
0.0028362274169921875,
0.035552978515625,
0.04638671875,
0.0240478515625,
-0.0310821533203125,
-0.03680419921875,
-0.056793212890625,
0.0159912109375,
0.003265380859375,
0.00591278076171875,
0.01470184326171875,
0.052337646484375,
-0.0079803466796875,
0.04412841796875,
-0.034576416015625,
-0.0203704833984375,
-0.0203094482421875,
0.0037708282470703125,
0.038726806640625,
0.06158447265625,
0.06317138671875,
-0.042449951171875,
-0.043487548828125,
-0.016998291015625,
-0.0679931640625,
0.0078277587890625,
-0.01285552978515625,
-0.027740478515625,
0.0134429931640625,
0.0037059783935546875,
-0.041748046875,
0.05078125,
0.017486572265625,
-0.033355712890625,
0.029754638671875,
-0.0214691162109375,
0.01513671875,
-0.080810546875,
0.01097869873046875,
0.01751708984375,
-0.004276275634765625,
-0.041961669921875,
-0.01172637939453125,
0.004505157470703125,
-0.00252532958984375,
-0.0390625,
0.05242919921875,
-0.04278564453125,
-0.01271820068359375,
-0.01495361328125,
-0.0207366943359375,
0.00591278076171875,
0.0604248046875,
-0.00406646728515625,
0.019012451171875,
0.06793212890625,
-0.042388916015625,
0.054534912109375,
0.036590576171875,
-0.013671875,
0.022796630859375,
-0.041595458984375,
0.0191650390625,
-0.0049591064453125,
0.016845703125,
-0.08233642578125,
-0.01367950439453125,
0.034515380859375,
-0.035858154296875,
0.051300048828125,
-0.04217529296875,
-0.0294952392578125,
-0.0460205078125,
-0.03466796875,
0.025146484375,
0.047088623046875,
-0.058074951171875,
0.01641845703125,
0.0197601318359375,
0.0214996337890625,
-0.035888671875,
-0.06536865234375,
-0.0168609619140625,
-0.034515380859375,
-0.05975341796875,
0.0274810791015625,
0.0108795166015625,
0.01000213623046875,
0.006969451904296875,
-0.0103302001953125,
-0.0113067626953125,
-0.00656890869140625,
0.0380859375,
0.023895263671875,
-0.02191162109375,
-0.00962066650390625,
-0.0186767578125,
-0.009521484375,
0.00930023193359375,
-0.024169921875,
0.04949951171875,
-0.00864410400390625,
-0.0186767578125,
-0.0640869140625,
-0.00930023193359375,
0.03662109375,
0.00095367431640625,
0.06146240234375,
0.07470703125,
-0.0364990234375,
-0.00655364990234375,
-0.0210113525390625,
-0.0281982421875,
-0.036651611328125,
0.03521728515625,
-0.0310516357421875,
-0.023284912109375,
0.062286376953125,
-0.0029621124267578125,
-0.0022716522216796875,
0.048858642578125,
0.022674560546875,
-0.0017070770263671875,
0.0484619140625,
0.037261962890625,
0.01323699951171875,
0.04656982421875,
-0.08221435546875,
-0.01092529296875,
-0.06658935546875,
-0.041656494140625,
-0.0234527587890625,
-0.054107666015625,
-0.039337158203125,
-0.0212249755859375,
0.0297088623046875,
0.0181732177734375,
-0.0299072265625,
0.038909912109375,
-0.06329345703125,
0.003177642822265625,
0.05548095703125,
0.043609619140625,
-0.02484130859375,
0.0177154541015625,
-0.0255889892578125,
0.0012464523315429688,
-0.05377197265625,
-0.0187225341796875,
0.0831298828125,
0.039154052734375,
0.050079345703125,
-0.0042572021484375,
0.054168701171875,
-0.0211334228515625,
0.019134521484375,
-0.0416259765625,
0.04412841796875,
-0.00952911376953125,
-0.032562255859375,
-0.0157012939453125,
-0.0323486328125,
-0.0780029296875,
0.01422882080078125,
-0.0208282470703125,
-0.056427001953125,
0.007724761962890625,
0.01068115234375,
-0.016021728515625,
0.06634521484375,
-0.0548095703125,
0.077880859375,
-0.0201416015625,
-0.02862548828125,
0.0120849609375,
-0.04833984375,
0.0225982666015625,
0.0157318115234375,
-0.0260009765625,
-0.017608642578125,
0.02044677734375,
0.08245849609375,
-0.041595458984375,
0.0635986328125,
-0.0413818359375,
0.0308990478515625,
0.0322265625,
-0.010498046875,
0.0219573974609375,
-0.003749847412109375,
-0.011749267578125,
0.027862548828125,
0.003353118896484375,
-0.034149169921875,
-0.039520263671875,
0.04425048828125,
-0.06951904296875,
-0.025909423828125,
-0.037841796875,
-0.03021240234375,
0.0128936767578125,
0.0097198486328125,
0.03326416015625,
0.05792236328125,
0.0225982666015625,
0.0182037353515625,
0.04522705078125,
-0.03509521484375,
0.039642333984375,
-0.005218505859375,
-0.0165252685546875,
-0.037445068359375,
0.057220458984375,
0.0120849609375,
0.01143646240234375,
0.011566162109375,
0.0235443115234375,
-0.03363037109375,
-0.050628662109375,
-0.0364990234375,
0.033172607421875,
-0.050079345703125,
-0.036651611328125,
-0.0457763671875,
-0.038604736328125,
-0.042022705078125,
0.004161834716796875,
-0.030487060546875,
-0.03192138671875,
-0.0259246826171875,
0.01424407958984375,
0.056915283203125,
0.05096435546875,
-0.0092620849609375,
0.0439453125,
-0.041534423828125,
0.0037403106689453125,
0.01108551025390625,
0.04046630859375,
-0.003757476806640625,
-0.07476806640625,
-0.01532745361328125,
-0.002094268798828125,
-0.037750244140625,
-0.06494140625,
0.042205810546875,
0.01113128662109375,
0.050323486328125,
0.0257110595703125,
-0.00872039794921875,
0.0579833984375,
0.0006923675537109375,
0.0304107666015625,
0.026763916015625,
-0.042633056640625,
0.05010986328125,
-0.0035381317138671875,
0.01271820068359375,
0.003269195556640625,
0.032470703125,
-0.0259246826171875,
-0.00015282630920410156,
-0.06781005859375,
-0.051544189453125,
0.0804443359375,
0.01153564453125,
-0.0007662773132324219,
0.028106689453125,
0.055267333984375,
0.004608154296875,
0.0052947998046875,
-0.0546875,
-0.0430908203125,
-0.0283050537109375,
-0.0245361328125,
0.0028171539306640625,
-0.00635528564453125,
-0.0024471282958984375,
-0.047515869140625,
0.05303955078125,
-0.0038509368896484375,
0.05328369140625,
0.0272979736328125,
0.00045228004455566406,
-0.002353668212890625,
-0.036346435546875,
0.036773681640625,
0.0295562744140625,
-0.03497314453125,
0.005863189697265625,
0.0109100341796875,
-0.052001953125,
0.01242828369140625,
0.01195526123046875,
0.005100250244140625,
0.0024738311767578125,
0.035552978515625,
0.0711669921875,
-0.0001952648162841797,
0.01018524169921875,
0.032623291015625,
-0.0018548965454101562,
-0.031280517578125,
-0.017791748046875,
0.01082611083984375,
-0.005138397216796875,
0.022186279296875,
0.019775390625,
0.0197601318359375,
-0.0113372802734375,
-0.0178985595703125,
0.0235595703125,
0.039337158203125,
-0.0237274169921875,
-0.031829833984375,
0.04351806640625,
-0.0087432861328125,
-0.0085906982421875,
0.067626953125,
-0.0143890380859375,
-0.0423583984375,
0.083740234375,
0.034423828125,
0.070556640625,
-0.001575469970703125,
0.00411224365234375,
0.068115234375,
0.013336181640625,
0.0018634796142578125,
0.0081329345703125,
0.0175628662109375,
-0.05645751953125,
0.00884246826171875,
-0.053131103515625,
0.0025482177734375,
0.031829833984375,
-0.035186767578125,
0.0271759033203125,
-0.054595947265625,
-0.0239105224609375,
0.0109100341796875,
0.031707763671875,
-0.0638427734375,
0.01517486572265625,
0.0018405914306640625,
0.0592041015625,
-0.060455322265625,
0.07745361328125,
0.07098388671875,
-0.038330078125,
-0.0718994140625,
-0.0098876953125,
0.00640106201171875,
-0.0772705078125,
0.05303955078125,
0.03594970703125,
0.0192108154296875,
0.00580596923828125,
-0.04156494140625,
-0.051300048828125,
0.10882568359375,
0.03924560546875,
-0.002471923828125,
0.01751708984375,
0.0079803466796875,
0.013763427734375,
-0.0310211181640625,
0.03790283203125,
0.01280975341796875,
0.0279083251953125,
0.0281982421875,
-0.0584716796875,
0.019927978515625,
-0.01543426513671875,
0.00687408447265625,
0.0214691162109375,
-0.058746337890625,
0.073486328125,
-0.032379150390625,
-0.01064300537109375,
-0.0022125244140625,
0.041656494140625,
0.018035888671875,
0.01453399658203125,
0.03924560546875,
0.061614990234375,
0.042877197265625,
-0.023895263671875,
0.0528564453125,
-0.0013017654418945312,
0.050628662109375,
0.043182373046875,
0.0289459228515625,
0.0310821533203125,
0.0252227783203125,
-0.0271759033203125,
0.0274505615234375,
0.07891845703125,
-0.0206298828125,
0.031585693359375,
0.026214599609375,
-0.003787994384765625,
-0.0015621185302734375,
0.0212249755859375,
-0.033294677734375,
0.041473388671875,
0.01145172119140625,
-0.038421630859375,
-0.0194244384765625,
0.0032711029052734375,
0.00829315185546875,
-0.02227783203125,
-0.0182342529296875,
0.038604736328125,
-0.001674652099609375,
-0.026153564453125,
0.05859375,
-0.0001786947250366211,
0.06964111328125,
-0.034637451171875,
-0.003215789794921875,
-0.025634765625,
0.022186279296875,
-0.0254364013671875,
-0.0682373046875,
0.029296875,
-0.0264434814453125,
0.0026149749755859375,
-0.0101318359375,
0.0484619140625,
-0.03485107421875,
-0.0390625,
0.021331787109375,
0.0179290771484375,
0.044677734375,
-0.0030612945556640625,
-0.09326171875,
0.0108184814453125,
0.01451873779296875,
-0.04327392578125,
0.033599853515625,
0.039947509765625,
0.0206146240234375,
0.054931640625,
0.04229736328125,
-0.0014657974243164062,
0.0182647705078125,
-0.0211181640625,
0.062225341796875,
-0.0439453125,
-0.015106201171875,
-0.06683349609375,
0.05242919921875,
-0.0086669921875,
-0.048126220703125,
0.0386962890625,
0.041351318359375,
0.065673828125,
-0.004070281982421875,
0.030487060546875,
-0.0199432373046875,
0.00246429443359375,
-0.038055419921875,
0.057037353515625,
-0.046539306640625,
-0.0016775131225585938,
-0.01259613037109375,
-0.05255126953125,
-0.0252532958984375,
0.05609130859375,
-0.0167999267578125,
0.02532958984375,
0.038299560546875,
0.0745849609375,
-0.02862548828125,
-0.0261688232421875,
0.0171661376953125,
0.003261566162109375,
0.01018524169921875,
0.0386962890625,
0.0304107666015625,
-0.0653076171875,
0.029510498046875,
-0.068603515625,
-0.019866943359375,
-0.0075531005859375,
-0.056396484375,
-0.07781982421875,
-0.065673828125,
-0.051422119140625,
-0.0517578125,
-0.0218048095703125,
0.06475830078125,
0.0828857421875,
-0.05389404296875,
-0.01258087158203125,
0.005962371826171875,
0.0187225341796875,
-0.01338958740234375,
-0.016876220703125,
0.049285888671875,
-0.015716552734375,
-0.062255859375,
-0.0239410400390625,
0.005950927734375,
0.037689208984375,
-0.0087890625,
-0.0169677734375,
-0.0169677734375,
-0.03436279296875,
0.0110321044921875,
0.020111083984375,
-0.047943115234375,
-0.00960540771484375,
-0.018646240234375,
-0.01306915283203125,
0.0309600830078125,
0.031585693359375,
-0.04010009765625,
0.0243682861328125,
0.03448486328125,
0.025390625,
0.056060791015625,
-0.027923583984375,
0.0017337799072265625,
-0.0604248046875,
0.04962158203125,
-0.0092315673828125,
0.0367431640625,
0.030059814453125,
-0.027618408203125,
0.040618896484375,
0.0416259765625,
-0.03509521484375,
-0.06622314453125,
-0.0126495361328125,
-0.0791015625,
-0.00963592529296875,
0.06622314453125,
-0.033599853515625,
-0.03656005859375,
0.0251922607421875,
-0.004993438720703125,
0.04302978515625,
-0.005062103271484375,
0.036102294921875,
0.0177001953125,
-0.0186767578125,
-0.0404052734375,
-0.0338134765625,
0.038238525390625,
0.0011692047119140625,
-0.044464111328125,
-0.033447265625,
-0.006732940673828125,
0.04931640625,
0.01554107666015625,
0.04278564453125,
-0.00637054443359375,
-0.00313568115234375,
0.01276397705078125,
0.030731201171875,
-0.03173828125,
-0.00632476806640625,
-0.0289764404296875,
-0.005168914794921875,
-0.007213592529296875,
-0.055267333984375
]
] |
timm/levit_128.fb_dist_in1k | 2023-02-03T21:13:20.000Z | [
"timm",
"pytorch",
"image-classification",
"dataset:imagenet-1k",
"arxiv:2104.01136",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/levit_128.fb_dist_in1k | 0 | 33,482 | timm | 2023-02-03T21:13:15 | ---
tags:
- image-classification
- timm
library_tag: timm
license: apache-2.0
datasets:
- imagenet-1k
---
# Model card for levit_128.fb_dist_in1k
A LeViT image classification model using convolutional mode (using nn.Conv2d and nn.BatchNorm2d). Pretrained on ImageNet-1k using distillation by paper authors.
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 9.2
- GMACs: 0.4
- Activations (M): 2.7
- Image size: 224 x 224
- **Papers:**
- LeViT: a Vision Transformer in ConvNet's Clothing for Faster Inference: https://arxiv.org/abs/2104.01136
- **Original:** https://github.com/facebookresearch/LeViT
- **Dataset:** ImageNet-1k
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(
urlopen('https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'))
model = timm.create_model('levit_128.fb_dist_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(
urlopen('https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'))
model = timm.create_model(
'levit_128.fb_dist_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (batch_size, num_features, H, W) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (batch_size, num_features) shaped tensor
```
## Model Comparison
|model |top1 |top5 |param_count|img_size|
|-----------------------------------|------|------|-----------|--------|
|levit_384.fb_dist_in1k |82.596|96.012|39.13 |224 |
|levit_conv_384.fb_dist_in1k |82.596|96.012|39.13 |224 |
|levit_256.fb_dist_in1k |81.512|95.48 |18.89 |224 |
|levit_conv_256.fb_dist_in1k |81.512|95.48 |18.89 |224 |
|levit_conv_192.fb_dist_in1k |79.86 |94.792|10.95 |224 |
|levit_192.fb_dist_in1k |79.858|94.792|10.95 |224 |
|levit_128.fb_dist_in1k |78.474|94.014|9.21 |224 |
|levit_conv_128.fb_dist_in1k |78.474|94.02 |9.21 |224 |
|levit_128s.fb_dist_in1k |76.534|92.864|7.78 |224 |
|levit_conv_128s.fb_dist_in1k |76.532|92.864|7.78 |224 |
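The accuracy/size trade-off in the comparison above can be summarized with a short standalone script. The numbers below are transcribed from the table; only the non-`conv` weights are listed, since the `conv` variants score (near-)identically:

```python
# Accuracy / size trade-off for the distilled LeViT weights in the table above.
# Values: (top1, top5, params in millions), transcribed from the comparison table.
results = {
    "levit_384.fb_dist_in1k": (82.596, 96.012, 39.13),
    "levit_256.fb_dist_in1k": (81.512, 95.48, 18.89),
    "levit_192.fb_dist_in1k": (79.858, 94.792, 10.95),
    "levit_128.fb_dist_in1k": (78.474, 94.014, 9.21),
    "levit_128s.fb_dist_in1k": (76.534, 92.864, 7.78),
}

# Top-1 accuracy gained per extra million parameters, relative to the smallest model.
base_name = "levit_128s.fb_dist_in1k"
base_top1, _, base_params = results[base_name]
for name, (top1, top5, params) in results.items():
    if name == base_name:
        continue
    gain = (top1 - base_top1) / (params - base_params)
    print(f"{name}: +{top1 - base_top1:.3f} top-1 for +{params - base_params:.2f}M params ({gain:.3f}/M)")
```

This makes the diminishing returns visible: each step up the family buys progressively less top-1 accuracy per additional parameter.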
## Citation
```bibtex
@InProceedings{Graham_2021_ICCV,
author = {Graham, Benjamin and El-Nouby, Alaaeldin and Touvron, Hugo and Stock, Pierre and Joulin, Armand and Jegou, Herve and Douze, Matthijs},
title = {LeViT: A Vision Transformer in ConvNet's Clothing for Faster Inference},
booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
month = {October},
year = {2021},
pages = {12259-12269}
}
```
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/rwightman/pytorch-image-models}}
}
```
| 4,032 | [
[
-0.04046630859375,
-0.0280609130859375,
0.010894775390625,
0.002788543701171875,
-0.029998779296875,
-0.022857666015625,
-0.012908935546875,
-0.02154541015625,
0.016021728515625,
0.0218658447265625,
-0.0426025390625,
-0.046234130859375,
-0.044342041015625,
-0.009033203125,
-0.018798828125,
0.068115234375,
-0.005695343017578125,
0.002559661865234375,
-0.0075531005859375,
-0.0283050537109375,
-0.0230255126953125,
-0.020416259765625,
-0.051544189453125,
-0.0290069580078125,
0.0215911865234375,
0.0096282958984375,
0.03729248046875,
0.03240966796875,
0.04852294921875,
0.031982421875,
-0.0103759765625,
0.01006317138671875,
-0.01450347900390625,
-0.01873779296875,
0.016082763671875,
-0.038726806640625,
-0.0303955078125,
0.00527191162109375,
0.061248779296875,
0.03472900390625,
-0.001071929931640625,
0.0333251953125,
0.016937255859375,
0.05487060546875,
-0.0221405029296875,
-0.0006132125854492188,
-0.0286865234375,
0.0057525634765625,
-0.0163726806640625,
-0.007061004638671875,
-0.026458740234375,
-0.020355224609375,
0.0161590576171875,
-0.043701171875,
0.0460205078125,
0.007129669189453125,
0.101318359375,
0.0201416015625,
-0.005859375,
0.01291656494140625,
-0.0269927978515625,
0.05633544921875,
-0.0694580078125,
0.0249176025390625,
0.0206146240234375,
0.006011962890625,
-0.00003421306610107422,
-0.0675048828125,
-0.053497314453125,
0.0011310577392578125,
-0.03057861328125,
0.0064544677734375,
-0.00970458984375,
-0.006046295166015625,
0.0229339599609375,
0.028350830078125,
-0.0458984375,
-0.0007143020629882812,
-0.040283203125,
-0.0075225830078125,
0.042205810546875,
0.009979248046875,
0.0086669921875,
-0.032623291015625,
-0.052276611328125,
-0.03717041015625,
-0.0233306884765625,
0.0266571044921875,
0.023773193359375,
0.007411956787109375,
-0.037689208984375,
0.03302001953125,
-0.0055694580078125,
0.03875732421875,
0.01861572265625,
-0.00963592529296875,
0.054595947265625,
-0.013427734375,
-0.03363037109375,
-0.00965118408203125,
0.081787109375,
0.0435791015625,
0.00923919677734375,
0.006381988525390625,
-0.0127716064453125,
-0.02130126953125,
-0.01009368896484375,
-0.08062744140625,
-0.017547607421875,
0.02203369140625,
-0.03863525390625,
-0.0199432373046875,
0.026031494140625,
-0.0638427734375,
-0.01555633544921875,
-0.01006317138671875,
0.04925537109375,
-0.036773681640625,
-0.03582763671875,
0.00717926025390625,
-0.0181732177734375,
0.034393310546875,
0.0087738037109375,
-0.0457763671875,
0.020111083984375,
0.0239410400390625,
0.076416015625,
-0.0087127685546875,
-0.0281219482421875,
-0.00604248046875,
-0.016998291015625,
-0.0300140380859375,
0.0406494140625,
0.00917816162109375,
-0.0067138671875,
-0.025421142578125,
0.0285186767578125,
-0.0194549560546875,
-0.043304443359375,
0.03253173828125,
-0.024169921875,
0.00824737548828125,
-0.003696441650390625,
-0.0232696533203125,
-0.0374755859375,
0.0206298828125,
-0.038360595703125,
0.08514404296875,
0.0270538330078125,
-0.0732421875,
0.0291748046875,
-0.0364990234375,
-0.0078277587890625,
-0.004711151123046875,
0.00063323974609375,
-0.0740966796875,
-0.0098724365234375,
0.0195770263671875,
0.05194091796875,
-0.02545166015625,
0.016448974609375,
-0.0301971435546875,
-0.0208282470703125,
0.0267791748046875,
-0.0198822021484375,
0.075927734375,
0.005443572998046875,
-0.03814697265625,
0.01666259765625,
-0.057952880859375,
0.008270263671875,
0.033203125,
-0.01549530029296875,
-0.00440216064453125,
-0.033477783203125,
0.00771331787109375,
0.00431060791015625,
0.0096588134765625,
-0.03948974609375,
0.00447845458984375,
-0.016998291015625,
0.0239410400390625,
0.0599365234375,
0.00408172607421875,
0.0214996337890625,
-0.03265380859375,
0.01116943359375,
0.0220184326171875,
0.0235443115234375,
0.00400543212890625,
-0.033935546875,
-0.06463623046875,
-0.036224365234375,
0.0227203369140625,
0.039764404296875,
-0.04876708984375,
0.037200927734375,
-0.0225982666015625,
-0.05255126953125,
-0.023773193359375,
-0.004650115966796875,
0.0296478271484375,
0.045257568359375,
0.03155517578125,
-0.018768310546875,
-0.038909912109375,
-0.07318115234375,
0.0026073455810546875,
0.005207061767578125,
0.00574493408203125,
0.01444244384765625,
0.05535888671875,
-0.007350921630859375,
0.05340576171875,
-0.0462646484375,
-0.018341064453125,
-0.0122528076171875,
0.00885009765625,
0.04901123046875,
0.0625,
0.063720703125,
-0.059295654296875,
-0.054473876953125,
-0.006832122802734375,
-0.067138671875,
0.01107025146484375,
-0.0024261474609375,
-0.0233001708984375,
0.027191162109375,
0.01513671875,
-0.05364990234375,
0.054168701171875,
0.0178680419921875,
-0.032470703125,
0.042816162109375,
-0.0178680419921875,
0.025604248046875,
-0.0870361328125,
0.0093231201171875,
0.0275726318359375,
-0.015350341796875,
-0.0435791015625,
-0.01406097412109375,
0.0081024169921875,
0.006229400634765625,
-0.036865234375,
0.04827880859375,
-0.0426025390625,
-0.0021762847900390625,
0.0029277801513671875,
-0.0118865966796875,
0.0037822723388671875,
0.0667724609375,
-0.01374053955078125,
0.037445068359375,
0.05804443359375,
-0.03997802734375,
0.03924560546875,
0.0284881591796875,
-0.030670166015625,
0.04156494140625,
-0.04766845703125,
-0.0029239654541015625,
-0.0019683837890625,
0.0164031982421875,
-0.09002685546875,
-0.01442718505859375,
0.027435302734375,
-0.04193115234375,
0.049041748046875,
-0.0293426513671875,
-0.035430908203125,
-0.042205810546875,
-0.042022705078125,
0.030975341796875,
0.043548583984375,
-0.049835205078125,
0.0224761962890625,
0.0140838623046875,
0.0148468017578125,
-0.045074462890625,
-0.066650390625,
-0.036651611328125,
-0.023956298828125,
-0.05133056640625,
0.0306396484375,
0.00452423095703125,
0.0018129348754882812,
0.0154571533203125,
-0.00899505615234375,
-0.0025615692138671875,
-0.0176849365234375,
0.028167724609375,
0.0435791015625,
-0.020538330078125,
-0.01214599609375,
-0.0157318115234375,
-0.010711669921875,
-0.00299835205078125,
-0.020294189453125,
0.05059814453125,
-0.0219268798828125,
-0.01549530029296875,
-0.062225341796875,
0.0008988380432128906,
0.042938232421875,
-0.006168365478515625,
0.0662841796875,
0.06658935546875,
-0.035980224609375,
-0.0034847259521484375,
-0.03265380859375,
-0.01412200927734375,
-0.037200927734375,
0.035614013671875,
-0.0269012451171875,
-0.0242919921875,
0.06597900390625,
0.01096343994140625,
0.006443023681640625,
0.07244873046875,
0.0325927734375,
-0.006870269775390625,
0.058929443359375,
0.03076171875,
0.0318603515625,
0.06182861328125,
-0.08331298828125,
-0.00514984130859375,
-0.07757568359375,
-0.03363037109375,
-0.0222930908203125,
-0.041046142578125,
-0.05438232421875,
-0.0234222412109375,
0.03790283203125,
0.00868988037109375,
-0.03619384765625,
0.03814697265625,
-0.066162109375,
0.004878997802734375,
0.051177978515625,
0.04595947265625,
-0.0201873779296875,
0.01007843017578125,
-0.0222930908203125,
0.0022068023681640625,
-0.05084228515625,
-0.0078582763671875,
0.08154296875,
0.0235443115234375,
0.052734375,
-0.00385284423828125,
0.044281005859375,
-0.00206756591796875,
0.02593994140625,
-0.0335693359375,
0.044525146484375,
-0.0159149169921875,
-0.052825927734375,
-0.0204620361328125,
-0.0260162353515625,
-0.06610107421875,
0.026214599609375,
-0.0207366943359375,
-0.0645751953125,
0.03759765625,
0.01161956787109375,
-0.0239410400390625,
0.057586669921875,
-0.0682373046875,
0.067626953125,
-0.00531768798828125,
-0.041259765625,
0.005458831787109375,
-0.0509033203125,
0.024871826171875,
0.0095672607421875,
-0.00774383544921875,
-0.0084075927734375,
0.023651123046875,
0.08349609375,
-0.044525146484375,
0.057281494140625,
-0.02935791015625,
0.029693603515625,
0.048248291015625,
-0.0148468017578125,
0.037353515625,
-0.01019287109375,
-0.00994873046875,
0.022857666015625,
0.014678955078125,
-0.0270843505859375,
-0.0287933349609375,
0.042083740234375,
-0.064697265625,
-0.03717041015625,
-0.03997802734375,
-0.038604736328125,
0.01398468017578125,
0.011260986328125,
0.053070068359375,
0.060272216796875,
0.01934814453125,
0.0338134765625,
0.046478271484375,
-0.0201263427734375,
0.049346923828125,
-0.0024929046630859375,
-0.008575439453125,
-0.046875,
0.06695556640625,
0.025299072265625,
0.0209808349609375,
0.017303466796875,
0.0192413330078125,
-0.031402587890625,
-0.0374755859375,
-0.03253173828125,
0.031494140625,
-0.0594482421875,
-0.0404052734375,
-0.03662109375,
-0.03741455078125,
-0.03448486328125,
-0.0092620849609375,
-0.0280609130859375,
-0.0298919677734375,
-0.0245361328125,
0.00852203369140625,
0.052978515625,
0.039031982421875,
-0.019195556640625,
0.035614013671875,
-0.036529541015625,
0.01873779296875,
0.00899505615234375,
0.03802490234375,
-0.0029544830322265625,
-0.06610107421875,
-0.02276611328125,
-0.003749847412109375,
-0.03790283203125,
-0.0645751953125,
0.039031982421875,
0.0022830963134765625,
0.05035400390625,
0.0278167724609375,
0.003368377685546875,
0.05950927734375,
-0.0091094970703125,
0.045318603515625,
0.03594970703125,
-0.052154541015625,
0.03790283203125,
-0.00824737548828125,
0.003032684326171875,
0.011749267578125,
0.016021728515625,
-0.01270294189453125,
-0.00960540771484375,
-0.07391357421875,
-0.064453125,
0.0728759765625,
0.0127410888671875,
-0.01366424560546875,
0.0238189697265625,
0.037353515625,
-0.0017642974853515625,
0.0005941390991210938,
-0.0587158203125,
-0.04583740234375,
-0.026153564453125,
-0.02178955078125,
-0.00290679931640625,
-0.005428314208984375,
-0.0082855224609375,
-0.054229736328125,
0.05767822265625,
-0.0014925003051757812,
0.050811767578125,
0.035400390625,
-0.0037441253662109375,
-0.0064849853515625,
-0.0195770263671875,
0.04107666015625,
0.033966064453125,
-0.0294647216796875,
0.00970458984375,
0.0244140625,
-0.047393798828125,
0.0015087127685546875,
0.01971435546875,
-0.01158905029296875,
0.003398895263671875,
0.02386474609375,
0.0689697265625,
-0.0016908645629882812,
0.00392913818359375,
0.03955078125,
-0.0005192756652832031,
-0.035369873046875,
-0.0215301513671875,
0.00702667236328125,
-0.01166534423828125,
0.042236328125,
0.03924560546875,
0.0282440185546875,
-0.0008783340454101562,
-0.0138397216796875,
0.01934814453125,
0.0447998046875,
-0.01690673828125,
-0.0198822021484375,
0.054962158203125,
-0.0207061767578125,
-0.007171630859375,
0.035980224609375,
-0.00748443603515625,
-0.03515625,
0.0760498046875,
0.0310516357421875,
0.074951171875,
-0.0009646415710449219,
0.00203704833984375,
0.0699462890625,
0.015289306640625,
-0.005863189697265625,
0.004505157470703125,
0.010498046875,
-0.0496826171875,
0.003383636474609375,
-0.047760009765625,
-0.0014801025390625,
0.0218963623046875,
-0.043853759765625,
0.0245361328125,
-0.038421630859375,
-0.0328369140625,
0.0189666748046875,
0.0270843505859375,
-0.062408447265625,
0.014801025390625,
-0.0017871856689453125,
0.05340576171875,
-0.057098388671875,
0.053314208984375,
0.04730224609375,
-0.036590576171875,
-0.071044921875,
-0.01081085205078125,
-0.01522064208984375,
-0.054473876953125,
0.03564453125,
0.0195770263671875,
0.003063201904296875,
0.01119232177734375,
-0.065673828125,
-0.05877685546875,
0.11669921875,
0.0297698974609375,
-0.01267242431640625,
0.02288818359375,
-0.0015687942504882812,
0.01898193359375,
-0.0119476318359375,
0.022796630859375,
0.015838623046875,
0.0333251953125,
0.0258941650390625,
-0.0494384765625,
0.0211334228515625,
-0.0230255126953125,
0.00691986083984375,
0.023529052734375,
-0.062469482421875,
0.08660888671875,
-0.0261688232421875,
-0.01654052734375,
0.006725311279296875,
0.045562744140625,
0.0226287841796875,
0.01507568359375,
0.042694091796875,
0.057373046875,
0.037750244140625,
-0.0297698974609375,
0.068603515625,
-0.0079498291015625,
0.06695556640625,
0.043304443359375,
0.029083251953125,
0.043609619140625,
0.031341552734375,
-0.0298919677734375,
0.03204345703125,
0.07025146484375,
-0.0279693603515625,
0.039794921875,
0.00772857666015625,
-0.00266265869140625,
-0.005748748779296875,
0.017425537109375,
-0.034454345703125,
0.0137176513671875,
0.01488494873046875,
-0.02117919921875,
-0.0105133056640625,
-0.00007647275924682617,
-0.0009551048278808594,
-0.0281219482421875,
-0.0201416015625,
0.03192138671875,
-0.001064300537109375,
-0.04327392578125,
0.06695556640625,
-0.0035266876220703125,
0.06396484375,
-0.03289794921875,
0.00405120849609375,
-0.014862060546875,
0.036712646484375,
-0.0287322998046875,
-0.0762939453125,
0.019500732421875,
-0.00994873046875,
-0.01509857177734375,
0.0009613037109375,
0.045074462890625,
-0.0227203369140625,
-0.048736572265625,
0.0162353515625,
0.011260986328125,
0.0234222412109375,
0.006488800048828125,
-0.06512451171875,
-0.007190704345703125,
-0.00014913082122802734,
-0.057952880859375,
0.011474609375,
0.041717529296875,
0.006591796875,
0.044830322265625,
0.056732177734375,
-0.01210784912109375,
0.03167724609375,
-0.0200347900390625,
0.0687255859375,
-0.03802490234375,
-0.032470703125,
-0.0675048828125,
0.051177978515625,
-0.01282501220703125,
-0.044586181640625,
0.037109375,
0.04754638671875,
0.058502197265625,
-0.00574493408203125,
0.03387451171875,
-0.025726318359375,
0.0144500732421875,
-0.03411865234375,
0.055389404296875,
-0.05670166015625,
-0.00289154052734375,
-0.014739990234375,
-0.049346923828125,
-0.0274810791015625,
0.053558349609375,
-0.0229339599609375,
0.0209503173828125,
0.05157470703125,
0.0687255859375,
-0.0162200927734375,
-0.0255279541015625,
0.017181396484375,
0.018524169921875,
0.011138916015625,
0.030242919921875,
0.03424072265625,
-0.0650634765625,
0.029510498046875,
-0.046356201171875,
-0.01522064208984375,
-0.01422882080078125,
-0.045806884765625,
-0.06951904296875,
-0.0665283203125,
-0.03436279296875,
-0.046661376953125,
-0.024444580078125,
0.060272216796875,
0.0703125,
-0.057586669921875,
-0.00734710693359375,
-0.0032367706298828125,
0.018798828125,
-0.0280609130859375,
-0.0193023681640625,
0.056182861328125,
-0.00994110107421875,
-0.059661865234375,
-0.0303802490234375,
-0.005413055419921875,
0.032470703125,
-0.00313568115234375,
-0.0225982666015625,
-0.0303802490234375,
-0.01076507568359375,
0.0179443359375,
0.0151214599609375,
-0.0447998046875,
-0.012542724609375,
-0.0098876953125,
-0.01434326171875,
0.0330810546875,
0.0211639404296875,
-0.0399169921875,
0.012725830078125,
0.050506591796875,
0.0127716064453125,
0.07489013671875,
-0.01580810546875,
-0.007244110107421875,
-0.056732177734375,
0.03997802734375,
-0.01434326171875,
0.037200927734375,
0.0266571044921875,
-0.0276947021484375,
0.04241943359375,
0.045074462890625,
-0.03424072265625,
-0.0711669921875,
-0.019622802734375,
-0.08917236328125,
-0.0089111328125,
0.07635498046875,
-0.01812744140625,
-0.04241943359375,
0.028839111328125,
-0.00432586669921875,
0.0472412109375,
-0.00994873046875,
0.0232391357421875,
0.0239410400390625,
-0.00919342041015625,
-0.03802490234375,
-0.0478515625,
0.0277252197265625,
-0.001247406005859375,
-0.048187255859375,
-0.0275726318359375,
-0.003101348876953125,
0.04931640625,
0.031982421875,
0.0408935546875,
-0.00848388671875,
0.00588226318359375,
0.01309967041015625,
0.0275421142578125,
-0.017242431640625,
-0.007801055908203125,
-0.0127716064453125,
-0.0026092529296875,
-0.0245208740234375,
-0.056488037109375
]
] |
deepset/gbert-base | 2023-05-05T07:00:24.000Z | [
"transformers",
"pytorch",
"tf",
"safetensors",
"fill-mask",
"de",
"dataset:wikipedia",
"dataset:OPUS",
"dataset:OpenLegalData",
"arxiv:2010.10906",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | deepset | null | null | deepset/gbert-base | 27 | 33,400 | transformers | 2022-03-02T23:29:05 | ---
language: de
license: mit
datasets:
- wikipedia
- OPUS
- OpenLegalData
---
# German BERT base
Released in October 2020, this is a German BERT language model trained collaboratively by the makers of the original German BERT (aka "bert-base-german-cased") and the dbmdz BERT (aka "bert-base-german-dbmdz-cased"). In our [paper](https://arxiv.org/pdf/2010.10906.pdf), we outline the steps taken to train our model and show that it outperforms its predecessors.
## Overview
**Paper:** [here](https://arxiv.org/pdf/2010.10906.pdf)
**Architecture:** BERT base
**Language:** German
## Performance
```
GermEval18 Coarse: 78.17
GermEval18 Fine: 50.90
GermEval14: 87.98
```
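For a single summary number, the three benchmark scores above can be averaged (scores copied verbatim from this card; the simple unweighted mean is our own choice for illustration, not a metric from the paper):

```python
# Unweighted macro-average of the three benchmark scores reported above.
scores = {
    "GermEval18 Coarse": 78.17,
    "GermEval18 Fine": 50.90,
    "GermEval14": 87.98,
}
average = sum(scores.values()) / len(scores)
print(round(average, 2))  # 72.35
```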
See also:
- deepset/gbert-base
- deepset/gbert-large
- deepset/gelectra-base
- deepset/gelectra-large
- deepset/gelectra-base-generator
- deepset/gelectra-large-generator
## Authors
Branden Chan: `branden.chan [at] deepset.ai`
Stefan Schweter: `stefan [at] schweter.eu`
Timo Möller: `timo.moeller [at] deepset.ai`
## About us

We bring NLP to the industry via open source!
Our focus: Industry specific language models & large scale QA systems.
Some of our work:
- [German BERT (aka "bert-base-german-cased")](https://deepset.ai/german-bert)
- [GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")](https://deepset.ai/germanquad)
- [FARM](https://github.com/deepset-ai/FARM)
- [Haystack](https://github.com/deepset-ai/haystack/)
Get in touch:
[Twitter](https://twitter.com/deepset_ai) | [LinkedIn](https://www.linkedin.com/company/deepset-ai/) | [Slack](https://haystack.deepset.ai/community/join) | [GitHub Discussions](https://github.com/deepset-ai/haystack/discussions) | [Website](https://deepset.ai)
By the way: [we're hiring!](http://www.deepset.ai/jobs)
| 1,883 | [
[
-0.0379638671875,
-0.04833984375,
0.027801513671875,
0.00875091552734375,
-0.0004906654357910156,
0.0005207061767578125,
-0.029541015625,
-0.0428466796875,
0.01334381103515625,
0.03564453125,
-0.049530029296875,
-0.056060791015625,
-0.02508544921875,
-0.0169677734375,
-0.032440185546875,
0.07904052734375,
0.0014286041259765625,
0.03094482421875,
0.00904083251953125,
-0.01151275634765625,
-0.018646240234375,
-0.054534912109375,
-0.043426513671875,
-0.04071044921875,
0.03887939453125,
0.027496337890625,
0.047760009765625,
0.0179443359375,
0.04058837890625,
0.0181732177734375,
-0.01351165771484375,
-0.022125244140625,
-0.03167724609375,
0.01236724853515625,
0.0037937164306640625,
-0.01202392578125,
-0.031494140625,
-0.020416259765625,
0.03729248046875,
0.071533203125,
-0.019287109375,
0.0218963623046875,
-0.01551055908203125,
0.061676025390625,
-0.043212890625,
0.0094146728515625,
-0.035491943359375,
-0.0029010772705078125,
-0.01055145263671875,
0.04345703125,
-0.0280303955078125,
-0.0297393798828125,
0.0160980224609375,
-0.0164642333984375,
0.0257720947265625,
-0.03668212890625,
0.08740234375,
0.0028324127197265625,
-0.0157623291015625,
-0.010894775390625,
-0.052703857421875,
0.03497314453125,
-0.0709228515625,
0.04522705078125,
0.00862884521484375,
0.03875732421875,
-0.01201629638671875,
-0.07794189453125,
-0.0313720703125,
-0.0267791748046875,
0.009490966796875,
0.002109527587890625,
-0.01399993896484375,
-0.00908660888671875,
0.004535675048828125,
0.039337158203125,
-0.051788330078125,
0.020172119140625,
-0.03253173828125,
0.0025615692138671875,
0.05072021484375,
-0.0178985595703125,
0.00728607177734375,
0.013885498046875,
-0.01334381103515625,
-0.03289794921875,
-0.058502197265625,
-0.00550079345703125,
0.0172576904296875,
0.0241851806640625,
-0.00243377685546875,
0.032928466796875,
-0.00948333740234375,
0.036285400390625,
0.015167236328125,
0.045318603515625,
0.049774169921875,
-0.0162506103515625,
-0.029083251953125,
0.002285003662109375,
0.054718017578125,
-0.004093170166015625,
0.01308441162109375,
-0.0185089111328125,
-0.031494140625,
-0.0090484619140625,
0.0262451171875,
-0.05438232421875,
-0.0305328369140625,
0.026947021484375,
-0.04180908203125,
-0.0160675048828125,
-0.00091552734375,
-0.038543701171875,
-0.0222320556640625,
-0.004848480224609375,
0.042816162109375,
-0.044921875,
-0.03179931640625,
0.01202392578125,
-0.00908660888671875,
0.032806396484375,
0.00908660888671875,
-0.0697021484375,
0.01873779296875,
0.0572509765625,
0.0418701171875,
0.006450653076171875,
-0.0248260498046875,
-0.0171661376953125,
-0.01050567626953125,
-0.03515625,
0.03814697265625,
-0.01361083984375,
-0.012786865234375,
0.0145263671875,
0.000247955322265625,
0.013580322265625,
-0.03314208984375,
0.034515380859375,
-0.060028076171875,
0.034759521484375,
-0.015533447265625,
-0.048858642578125,
-0.01448822021484375,
0.006450653076171875,
-0.07012939453125,
0.07574462890625,
0.0258636474609375,
-0.0267181396484375,
0.0283203125,
-0.06756591796875,
-0.0308837890625,
0.01241302490234375,
-0.0012388229370117188,
-0.025634765625,
-0.0029754638671875,
0.01374053955078125,
0.025634765625,
-0.006542205810546875,
0.019195556640625,
-0.02655029296875,
-0.04010009765625,
0.004222869873046875,
-0.0047454833984375,
0.09527587890625,
0.0202178955078125,
-0.037384033203125,
0.01345062255859375,
-0.062347412109375,
0.024383544921875,
0.014923095703125,
-0.0294189453125,
0.0018014907836914062,
-0.00860595703125,
0.0178680419921875,
0.007205963134765625,
0.0367431640625,
-0.029754638671875,
0.00826263427734375,
-0.0399169921875,
0.0238800048828125,
0.053375244140625,
-0.01523590087890625,
0.032257080078125,
-0.0010204315185546875,
0.026702880859375,
-0.01067352294921875,
0.024322509765625,
0.00502777099609375,
-0.0308380126953125,
-0.0732421875,
-0.01232147216796875,
0.052978515625,
0.049957275390625,
-0.035980224609375,
0.07513427734375,
-0.01708984375,
-0.064453125,
-0.04754638671875,
0.01023101806640625,
0.02593994140625,
0.0211944580078125,
0.0183563232421875,
-0.033721923828125,
-0.052581787109375,
-0.08343505859375,
0.00385284423828125,
-0.0184478759765625,
-0.01366424560546875,
0.00653839111328125,
0.03363037109375,
-0.044647216796875,
0.05828857421875,
-0.0268096923828125,
-0.0118408203125,
-0.0169525146484375,
0.01517486572265625,
0.050811767578125,
0.044189453125,
0.06365966796875,
-0.0489501953125,
-0.03912353515625,
-0.006549835205078125,
-0.0501708984375,
0.01497650146484375,
0.019073486328125,
-0.015655517578125,
0.038726806640625,
0.0293121337890625,
-0.04974365234375,
-0.004573822021484375,
0.057098388671875,
-0.0268096923828125,
0.032440185546875,
-0.00504302978515625,
-0.01666259765625,
-0.08709716796875,
0.025665283203125,
-0.0081787109375,
-0.00727081298828125,
-0.03460693359375,
0.0269012451171875,
-0.00820159912109375,
-0.01338958740234375,
-0.034423828125,
0.030303955078125,
-0.03448486328125,
-0.0123443603515625,
0.00804901123046875,
-0.0278167724609375,
-0.02410888671875,
0.042327880859375,
-0.00875091552734375,
0.07049560546875,
0.036041259765625,
-0.03778076171875,
0.03289794921875,
0.0318603515625,
-0.048065185546875,
0.006984710693359375,
-0.05999755859375,
0.0211029052734375,
0.00832366943359375,
0.016571044921875,
-0.05657958984375,
-0.0139923095703125,
0.01380157470703125,
-0.0406494140625,
0.0207061767578125,
-0.00919342041015625,
-0.056365966796875,
-0.041229248046875,
-0.01406097412109375,
-0.005977630615234375,
0.06854248046875,
-0.043487548828125,
0.02264404296875,
0.0272216796875,
-0.021697998046875,
-0.050567626953125,
-0.0631103515625,
0.024627685546875,
0.0162506103515625,
-0.05450439453125,
0.032012939453125,
-0.010650634765625,
-0.0158843994140625,
0.0206146240234375,
-0.0008807182312011719,
-0.040374755859375,
0.0159454345703125,
-0.0014200210571289062,
0.0133819580078125,
-0.042144775390625,
0.0212249755859375,
-0.00887298583984375,
-0.0014324188232421875,
0.00988006591796875,
-0.018035888671875,
0.058624267578125,
-0.053741455078125,
-0.023712158203125,
-0.01434326171875,
0.032928466796875,
0.0245513916015625,
-0.01039886474609375,
0.0494384765625,
0.06317138671875,
-0.0308837890625,
-0.00943756103515625,
-0.0457763671875,
-0.0238189697265625,
-0.0357666015625,
0.0280914306640625,
-0.0161285400390625,
-0.07666015625,
0.03857421875,
0.0098419189453125,
0.0153656005859375,
0.058990478515625,
0.05511474609375,
-0.023681640625,
0.06793212890625,
0.0723876953125,
-0.01352691650390625,
0.053802490234375,
-0.03594970703125,
0.00565338134765625,
-0.0308837890625,
0.0006189346313476562,
-0.033477783203125,
-0.0224151611328125,
-0.04913330078125,
-0.008514404296875,
0.0124359130859375,
0.00742340087890625,
-0.044830322265625,
0.037567138671875,
-0.041748046875,
0.0212860107421875,
0.07952880859375,
0.0004181861877441406,
-0.004955291748046875,
0.0094451904296875,
-0.0128936767578125,
0.0108184814453125,
-0.057037353515625,
-0.04193115234375,
0.08966064453125,
0.0273590087890625,
0.037689208984375,
0.0032958984375,
0.0728759765625,
0.033538818359375,
0.01224517822265625,
-0.035369873046875,
0.03594970703125,
-0.0172119140625,
-0.068115234375,
-0.032501220703125,
-0.0207672119140625,
-0.075927734375,
-0.00408935546875,
-0.0016431808471679688,
-0.044708251953125,
0.0230560302734375,
0.0085601806640625,
-0.009735107421875,
0.0199127197265625,
-0.0736083984375,
0.0810546875,
-0.028594970703125,
0.0008435249328613281,
-0.019500732421875,
-0.06549072265625,
0.01004791259765625,
-0.0097198486328125,
-0.0044708251953125,
-0.00449371337890625,
0.0257568359375,
0.038970947265625,
-0.04681396484375,
0.0728759765625,
-0.0197601318359375,
-0.0193634033203125,
0.0200653076171875,
-0.0021114349365234375,
0.03106689453125,
-0.005657196044921875,
-0.0205078125,
0.048065185546875,
-0.004611968994140625,
-0.0341796875,
-0.0161285400390625,
0.04705810546875,
-0.06109619140625,
-0.0186920166015625,
-0.02716064453125,
-0.01302337646484375,
-0.02191162109375,
0.04266357421875,
0.0233001708984375,
0.00974273681640625,
-0.045074462890625,
0.0249481201171875,
0.05218505859375,
-0.0149993896484375,
0.027557373046875,
0.039825439453125,
-0.004657745361328125,
-0.036285400390625,
0.061248779296875,
-0.003269195556640625,
-0.00426483154296875,
0.0281829833984375,
-0.008331298828125,
-0.007373809814453125,
-0.033416748046875,
-0.035552978515625,
0.0116729736328125,
-0.0506591796875,
-0.0004200935363769531,
-0.034759521484375,
-0.043365478515625,
-0.054107666015625,
-0.00571441650390625,
-0.03924560546875,
-0.031280517578125,
-0.0029144287109375,
-0.0126800537109375,
0.0302734375,
0.0511474609375,
-0.0307769775390625,
0.0164947509765625,
-0.06451416015625,
0.0027103424072265625,
0.0260467529296875,
0.045684814453125,
-0.0135040283203125,
-0.001468658447265625,
-0.036865234375,
0.0236358642578125,
-0.002132415771484375,
-0.048919677734375,
0.0192108154296875,
-0.0009546279907226562,
0.05645751953125,
0.0083770751953125,
-0.01175689697265625,
0.01068878173828125,
-0.044464111328125,
0.06915283203125,
0.0153656005859375,
-0.06658935546875,
0.04296875,
-0.0310821533203125,
0.0284881591796875,
0.064453125,
0.0197601318359375,
-0.057647705078125,
-0.0169677734375,
-0.0543212890625,
-0.08673095703125,
0.04217529296875,
0.0173797607421875,
0.02301025390625,
0.0005369186401367188,
0.0007171630859375,
0.021484375,
0.034820556640625,
-0.047515869140625,
-0.03094482421875,
-0.01020050048828125,
-0.01480865478515625,
-0.002895355224609375,
-0.043182373046875,
-0.0165557861328125,
-0.019683837890625,
0.0703125,
0.004619598388671875,
0.04827880859375,
0.003326416015625,
-0.0178985595703125,
-0.0009617805480957031,
0.018951416015625,
0.040435791015625,
0.071533203125,
-0.0467529296875,
-0.0160980224609375,
0.005344390869140625,
-0.0312347412109375,
-0.00727081298828125,
0.038360595703125,
-0.033782958984375,
0.035064697265625,
0.032012939453125,
0.06353759765625,
0.00841522216796875,
-0.0531005859375,
0.035186767578125,
0.01300811767578125,
-0.03424072265625,
-0.054779052734375,
0.0052337646484375,
0.0114593505859375,
0.02783203125,
0.034423828125,
-0.01137542724609375,
0.01271820068359375,
-0.0160980224609375,
0.026214599609375,
0.0183258056640625,
-0.04461669921875,
-0.0100250244140625,
0.034271240234375,
0.0318603515625,
-0.0227203369140625,
0.0635986328125,
-0.0190887451171875,
-0.04498291015625,
0.05511474609375,
0.0246734619140625,
0.0855712890625,
0.0018138885498046875,
0.0278778076171875,
0.019683837890625,
0.025634765625,
0.00862884521484375,
0.0435791015625,
0.007106781005859375,
-0.05584716796875,
-0.039520263671875,
-0.022857666015625,
-0.04193115234375,
0.0307464599609375,
-0.042572021484375,
0.0034961700439453125,
-0.032745361328125,
-0.0175323486328125,
-0.0029087066650390625,
0.00850677490234375,
-0.054534912109375,
0.017181396484375,
0.02545166015625,
0.08941650390625,
-0.04034423828125,
0.0606689453125,
0.0714111328125,
-0.05010986328125,
-0.036529541015625,
-0.005107879638671875,
-0.02001953125,
-0.068359375,
0.0496826171875,
-0.0038909912109375,
0.00194549560546875,
0.00299072265625,
-0.0596923828125,
-0.07061767578125,
0.06964111328125,
0.031463623046875,
-0.040252685546875,
-0.01396942138671875,
-0.0120849609375,
0.056182861328125,
-0.005374908447265625,
-0.007549285888671875,
0.0230560302734375,
0.035491943359375,
-0.006977081298828125,
-0.06622314453125,
-0.00594329833984375,
-0.03875732421875,
-0.001232147216796875,
0.0122222900390625,
-0.042633056640625,
0.05859375,
-0.00860595703125,
-0.00370025634765625,
0.016204833984375,
0.048736572265625,
0.01151275634765625,
0.0004673004150390625,
0.04046630859375,
0.051055908203125,
0.059234619140625,
-0.00830841064453125,
0.0748291015625,
-0.0290069580078125,
0.02960205078125,
0.09075927734375,
-0.027191162109375,
0.064208984375,
0.0216827392578125,
-0.0153656005859375,
0.0618896484375,
0.04986572265625,
-0.03936767578125,
0.049468994140625,
0.007049560546875,
-0.004123687744140625,
-0.0240325927734375,
0.01141357421875,
-0.06646728515625,
0.0169525146484375,
0.0097198486328125,
-0.035247802734375,
-0.0240325927734375,
-0.019805908203125,
0.00037407875061035156,
-0.015869140625,
0.0030803680419921875,
0.045318603515625,
-0.00763702392578125,
-0.035369873046875,
0.0526123046875,
0.004791259765625,
0.041229248046875,
-0.06048583984375,
-0.0012311935424804688,
-0.0174713134765625,
0.03094482421875,
0.01244354248046875,
-0.056732177734375,
0.0017490386962890625,
-0.00975799560546875,
-0.02508544921875,
-0.0306396484375,
0.053955078125,
-0.0229339599609375,
-0.0482177734375,
0.0280914306640625,
0.04815673828125,
0.01160430908203125,
0.0025119781494140625,
-0.0650634765625,
-0.00478363037109375,
-0.00843048095703125,
-0.0240478515625,
0.0085906982421875,
0.021697998046875,
0.004150390625,
0.041656494140625,
0.051513671875,
0.0090484619140625,
0.002437591552734375,
0.0245208740234375,
0.068115234375,
-0.048065185546875,
-0.026397705078125,
-0.040557861328125,
0.034912109375,
-0.019561767578125,
-0.032196044921875,
0.04046630859375,
0.050048828125,
0.08154296875,
-0.028289794921875,
0.062744140625,
-0.01477813720703125,
0.04791259765625,
-0.0233001708984375,
0.07232666015625,
-0.041015625,
-0.0105133056640625,
-0.0130462646484375,
-0.06805419921875,
-0.01256561279296875,
0.04632568359375,
-0.0021076202392578125,
0.022705078125,
0.04449462890625,
0.038330078125,
0.0009741783142089844,
-0.019134521484375,
0.0027866363525390625,
0.0289459228515625,
0.021881103515625,
0.040740966796875,
0.048309326171875,
-0.037261962890625,
0.049560546875,
-0.033233642578125,
0.0005092620849609375,
-0.042449951171875,
-0.057220458984375,
-0.04937744140625,
-0.039825439453125,
-0.0202178955078125,
-0.03424072265625,
0.0288543701171875,
0.059295654296875,
0.063232421875,
-0.09197998046875,
-0.0293426513671875,
-0.0249786376953125,
0.00807952880859375,
-0.036376953125,
-0.0169525146484375,
0.021484375,
-0.031280517578125,
-0.0472412109375,
0.038421630859375,
-0.0058746337890625,
0.0009031295776367188,
-0.02398681640625,
-0.00042939186096191406,
-0.0284881591796875,
-0.017608642578125,
0.040985107421875,
0.033935546875,
-0.04345703125,
-0.01294708251953125,
0.009368896484375,
-0.036407470703125,
0.00007778406143188477,
0.041778564453125,
-0.048370361328125,
0.00894927978515625,
0.042572021484375,
0.05389404296875,
0.043212890625,
-0.0271148681640625,
0.07708740234375,
-0.055633544921875,
0.004535675048828125,
0.0330810546875,
0.025238037109375,
0.01947021484375,
-0.0218353271484375,
0.07171630859375,
0.0066375732421875,
-0.03192138671875,
-0.05999755859375,
0.00933837890625,
-0.0693359375,
-0.037261962890625,
0.08172607421875,
0.01067352294921875,
-0.0088043212890625,
0.015289306640625,
-0.0053863525390625,
0.015655517578125,
-0.03564453125,
0.056365966796875,
0.06292724609375,
0.012542724609375,
0.00435638427734375,
-0.025360107421875,
0.02069091796875,
0.026885986328125,
-0.04937744140625,
-0.01102447509765625,
0.054779052734375,
0.018096923828125,
0.01470947265625,
0.0489501953125,
0.002262115478515625,
0.036773681640625,
-0.00786590576171875,
0.0557861328125,
-0.004428863525390625,
-0.00926971435546875,
-0.02880859375,
0.0012693405151367188,
0.004852294921875,
-0.0289154052734375
]
] |
nvidia/segformer-b5-finetuned-ade-640-640 | 2022-08-06T10:25:55.000Z | [
"transformers",
"pytorch",
"tf",
"segformer",
"vision",
"image-segmentation",
"dataset:scene_parse_150",
"arxiv:2105.15203",
"license:other",
"endpoints_compatible",
"has_space",
"region:us"
] | image-segmentation | nvidia | null | null | nvidia/segformer-b5-finetuned-ade-640-640 | 30 | 33,339 | transformers | 2022-03-02T23:29:05 | ---
license: other
tags:
- vision
- image-segmentation
datasets:
- scene_parse_150
widget:
- src: https://huggingface.co/datasets/hf-internal-testing/fixtures_ade20k/resolve/main/ADE_val_00000001.jpg
example_title: House
- src: https://huggingface.co/datasets/hf-internal-testing/fixtures_ade20k/resolve/main/ADE_val_00000002.jpg
example_title: Castle
---
# SegFormer (b5-sized) model fine-tuned on ADE20k
SegFormer model fine-tuned on ADE20k at resolution 640x640. It was introduced in the paper [SegFormer: Simple and Efficient Design for Semantic Segmentation with Transformers](https://arxiv.org/abs/2105.15203) by Xie et al. and first released in [this repository](https://github.com/NVlabs/SegFormer).
Disclaimer: The team releasing SegFormer did not write a model card for this model, so this model card has been written by the Hugging Face team.
## Model description
SegFormer consists of a hierarchical Transformer encoder and a lightweight all-MLP decode head to achieve great results on semantic segmentation benchmarks such as ADE20K and Cityscapes. The hierarchical Transformer is first pre-trained on ImageNet-1k, after which a decode head is added and fine-tuned altogether on a downstream dataset.
## Intended uses & limitations
You can use the raw model for semantic segmentation. See the [model hub](https://huggingface.co/models?other=segformer) to look for fine-tuned versions on a task that interests you.
### How to use
Here is how to use this model to segment an image of the COCO 2017 dataset into the 150 ADE20k classes:
```python
from transformers import SegformerFeatureExtractor, SegformerForSemanticSegmentation
from PIL import Image
import requests
feature_extractor = SegformerFeatureExtractor.from_pretrained("nvidia/segformer-b5-finetuned-ade-640-640")
model = SegformerForSemanticSegmentation.from_pretrained("nvidia/segformer-b5-finetuned-ade-640-640")
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)
inputs = feature_extractor(images=image, return_tensors="pt")
outputs = model(**inputs)
logits = outputs.logits # shape (batch_size, num_labels, height/4, width/4)
```
For more code examples, we refer to the [documentation](https://huggingface.co/transformers/model_doc/segformer.html).
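The logits come out at a quarter of the input resolution, so they are typically upsampled before taking the per-pixel argmax. Below is a minimal NumPy sketch of that post-processing step, using hypothetical random logits in place of real model output and nearest-neighbour upsampling for brevity (in practice you would bilinearly upsample the logits, e.g. with `torch.nn.functional.interpolate`, before the argmax):

```python
import numpy as np

# Hypothetical logits shaped (num_labels, height/4, width/4);
# the ADE20k fine-tuned SegFormer predicts 150 labels.
logits = np.random.randn(150, 4, 4)

# Per-pixel class indices at the low resolution...
seg_small = logits.argmax(axis=0)                        # shape (4, 4)

# ...then nearest-neighbour upsampling by 4x back to the input size.
seg_map = seg_small.repeat(4, axis=0).repeat(4, axis=1)  # shape (16, 16)
```

Each entry of `seg_map` is then a class index that can be mapped to a label name via the model's `id2label` config.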
### License
The license for this model can be found [here](https://github.com/NVlabs/SegFormer/blob/master/LICENSE).
### BibTeX entry and citation info
```bibtex
@article{DBLP:journals/corr/abs-2105-15203,
author = {Enze Xie and
Wenhai Wang and
Zhiding Yu and
Anima Anandkumar and
Jose M. Alvarez and
Ping Luo},
title = {SegFormer: Simple and Efficient Design for Semantic Segmentation with
Transformers},
journal = {CoRR},
volume = {abs/2105.15203},
year = {2021},
url = {https://arxiv.org/abs/2105.15203},
eprinttype = {arXiv},
eprint = {2105.15203},
timestamp = {Wed, 02 Jun 2021 11:46:42 +0200},
biburl = {https://dblp.org/rec/journals/corr/abs-2105-15203.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
| 3,209 | [
[
-0.06756591796875,
-0.05322265625,
0.01255035400390625,
0.015869140625,
-0.024200439453125,
-0.0264129638671875,
0.000026047229766845703,
-0.051727294921875,
0.0213623046875,
0.04229736328125,
-0.06622314453125,
-0.044830322265625,
-0.05731201171875,
0.00920867919921875,
-0.0238494873046875,
0.061126708984375,
0.006561279296875,
-0.01062774658203125,
-0.0246734619140625,
-0.028289794921875,
-0.00145721435546875,
-0.0225067138671875,
-0.045562744140625,
-0.0265960693359375,
0.029083251953125,
0.0161895751953125,
0.04327392578125,
0.060211181640625,
0.05230712890625,
0.035369873046875,
-0.034088134765625,
0.01129913330078125,
-0.0234222412109375,
-0.01464080810546875,
0.00237274169921875,
-0.006099700927734375,
-0.03253173828125,
-0.0015478134155273438,
0.027679443359375,
0.04718017578125,
0.005889892578125,
0.027557373046875,
-0.00391387939453125,
0.03350830078125,
-0.03387451171875,
0.007274627685546875,
-0.036468505859375,
0.0115509033203125,
0.005329132080078125,
0.0007281303405761719,
-0.021484375,
-0.0133514404296875,
0.0168914794921875,
-0.03839111328125,
0.05560302734375,
0.00395965576171875,
0.116455078125,
0.032470703125,
-0.0274658203125,
-0.004756927490234375,
-0.03546142578125,
0.061309814453125,
-0.048797607421875,
0.041259765625,
-0.00830078125,
0.0268402099609375,
0.0102386474609375,
-0.07684326171875,
-0.0347900390625,
0.0122833251953125,
-0.015716552734375,
-0.00518035888671875,
-0.0271453857421875,
0.004825592041015625,
0.037445068359375,
0.042572021484375,
-0.0333251953125,
0.00440216064453125,
-0.053680419921875,
-0.03253173828125,
0.054840087890625,
0.0043487548828125,
0.02301025390625,
-0.025360107421875,
-0.060760498046875,
-0.03509521484375,
-0.0233917236328125,
0.01055145263671875,
0.0184326171875,
0.0002949237823486328,
-0.0218963623046875,
0.03253173828125,
-0.003780364990234375,
0.055389404296875,
0.03436279296875,
-0.0107574462890625,
0.038970947265625,
-0.01239776611328125,
-0.0259246826171875,
0.0040435791015625,
0.0689697265625,
0.0328369140625,
-0.0005207061767578125,
0.0047760009765625,
-0.007843017578125,
0.0088043212890625,
0.0245208740234375,
-0.099853515625,
-0.018402099609375,
0.0064697265625,
-0.040069580078125,
-0.022857666015625,
0.01447296142578125,
-0.058624267578125,
-0.0048065185546875,
-0.012237548828125,
0.036163330078125,
-0.0245361328125,
-0.004978179931640625,
0.0079345703125,
-0.01038360595703125,
0.05621337890625,
0.01971435546875,
-0.06378173828125,
0.0164031982421875,
0.03826904296875,
0.05816650390625,
-0.016754150390625,
-0.0088958740234375,
-0.009918212890625,
-0.00580596923828125,
-0.01192474365234375,
0.06640625,
-0.0201873779296875,
-0.02545166015625,
-0.023468017578125,
0.04730224609375,
-0.0193023681640625,
-0.0479736328125,
0.0604248046875,
-0.03985595703125,
0.0178070068359375,
-0.0034465789794921875,
-0.0300140380859375,
-0.04608154296875,
0.022064208984375,
-0.048797607421875,
0.068603515625,
0.0175933837890625,
-0.06048583984375,
0.038604736328125,
-0.044708251953125,
-0.0192108154296875,
-0.004589080810546875,
0.0044097900390625,
-0.0701904296875,
0.00360870361328125,
0.036285400390625,
0.036285400390625,
-0.01476287841796875,
0.01219940185546875,
-0.043731689453125,
-0.0120697021484375,
-0.0005154609680175781,
-0.016143798828125,
0.07452392578125,
0.0247955322265625,
-0.0248260498046875,
0.035186767578125,
-0.05352783203125,
0.000039458274841308594,
0.035797119140625,
0.0026092529296875,
0.00017011165618896484,
-0.0257720947265625,
0.0191192626953125,
0.0328369140625,
0.018798828125,
-0.051727294921875,
0.002742767333984375,
-0.02471923828125,
0.03173828125,
0.0484619140625,
0.008941650390625,
0.034332275390625,
-0.0078582763671875,
0.0238494873046875,
0.0148773193359375,
0.033782958984375,
-0.0115509033203125,
-0.01678466796875,
-0.08380126953125,
-0.033355712890625,
0.0191802978515625,
0.00945281982421875,
-0.030059814453125,
0.0482177734375,
-0.0164947509765625,
-0.04736328125,
-0.038970947265625,
-0.00036787986755371094,
0.004467010498046875,
0.03717041015625,
0.039886474609375,
-0.0345458984375,
-0.055694580078125,
-0.08795166015625,
0.0118255615234375,
0.0199127197265625,
-0.00498199462890625,
0.0284271240234375,
0.04345703125,
-0.04998779296875,
0.06121826171875,
-0.0582275390625,
-0.0253448486328125,
-0.0158843994140625,
-0.005092620849609375,
0.02374267578125,
0.04296875,
0.049713134765625,
-0.0638427734375,
-0.0279388427734375,
-0.016571044921875,
-0.04510498046875,
-0.00614166259765625,
0.01019287109375,
-0.024383544921875,
0.0118255615234375,
0.031768798828125,
-0.04180908203125,
0.031402587890625,
0.035400390625,
-0.045928955078125,
0.0250701904296875,
-0.00624847412109375,
-0.001819610595703125,
-0.076904296875,
0.01415252685546875,
0.0141143798828125,
-0.02020263671875,
-0.0372314453125,
0.01120758056640625,
0.0003674030303955078,
-0.01276397705078125,
-0.047027587890625,
0.04132080078125,
-0.027618408203125,
-0.0017423629760742188,
-0.018646240234375,
-0.0136260986328125,
0.0119781494140625,
0.058380126953125,
0.0122222900390625,
0.024078369140625,
0.036376953125,
-0.052703857421875,
0.016754150390625,
0.0401611328125,
-0.0274810791015625,
0.038909912109375,
-0.07958984375,
0.00691986083984375,
-0.007144927978515625,
0.0086517333984375,
-0.056610107421875,
-0.026519775390625,
0.029205322265625,
-0.02545166015625,
0.031463623046875,
-0.02264404296875,
-0.0158843994140625,
-0.043792724609375,
-0.017425537109375,
0.0283050537109375,
0.039581298828125,
-0.060333251953125,
0.043426513671875,
0.0382080078125,
0.01012420654296875,
-0.0214385986328125,
-0.0447998046875,
-0.0253448486328125,
-0.031280517578125,
-0.08013916015625,
0.05120849609375,
-0.003314971923828125,
0.016357421875,
0.0029144287109375,
-0.026611328125,
-0.0027008056640625,
-0.004558563232421875,
0.026641845703125,
0.03680419921875,
-0.01064300537109375,
-0.031707763671875,
-0.000028371810913085938,
-0.03192138671875,
0.0094146728515625,
-0.0085601806640625,
0.04864501953125,
-0.02490234375,
-0.0269012451171875,
-0.0224456787109375,
-0.0015411376953125,
0.038818359375,
-0.0223236083984375,
0.03759765625,
0.09423828125,
-0.0237274169921875,
-0.0051116943359375,
-0.041656494140625,
-0.022247314453125,
-0.042999267578125,
0.02447509765625,
-0.0179290771484375,
-0.07989501953125,
0.0430908203125,
0.0015535354614257812,
0.00510406494140625,
0.07421875,
0.036041259765625,
0.0107879638671875,
0.0960693359375,
0.047393798828125,
0.0309295654296875,
0.040069580078125,
-0.060821533203125,
0.01451873779296875,
-0.07830810546875,
-0.04217529296875,
-0.0311279296875,
-0.0328369140625,
-0.057098388671875,
-0.0499267578125,
0.031768798828125,
0.0105438232421875,
-0.028656005859375,
0.04302978515625,
-0.0650634765625,
0.01442718505859375,
0.037445068359375,
0.004550933837890625,
-0.0111083984375,
0.00678253173828125,
-0.01166534423828125,
0.00495147705078125,
-0.055511474609375,
-0.02801513671875,
0.02685546875,
0.042724609375,
0.056854248046875,
-0.0117340087890625,
0.045196533203125,
-0.0071258544921875,
-0.002288818359375,
-0.07012939453125,
0.045684814453125,
-0.00911712646484375,
-0.05230712890625,
-0.0087890625,
-0.023101806640625,
-0.07635498046875,
0.033843994140625,
-0.0118408203125,
-0.0653076171875,
0.0484619140625,
0.0084381103515625,
-0.0192718505859375,
0.021240234375,
-0.049530029296875,
0.09112548828125,
-0.01535797119140625,
-0.03131103515625,
0.00913238525390625,
-0.05450439453125,
0.0177459716796875,
0.0221710205078125,
-0.005962371826171875,
-0.031585693359375,
0.0218658447265625,
0.0703125,
-0.05218505859375,
0.0521240234375,
-0.024993896484375,
0.01727294921875,
0.044891357421875,
-0.00983428955078125,
0.0269927978515625,
0.0010986328125,
0.0225830078125,
0.036407470703125,
0.019012451171875,
-0.0264434814453125,
-0.032379150390625,
0.049041748046875,
-0.06317138671875,
-0.044036865234375,
-0.03173828125,
-0.021697998046875,
0.0009112358093261719,
0.02716064453125,
0.036590576171875,
0.031768798828125,
-0.00664520263671875,
0.0360107421875,
0.049591064453125,
-0.025482177734375,
0.039947509765625,
0.013031005859375,
-0.01120758056640625,
-0.032012939453125,
0.06988525390625,
-0.0121002197265625,
0.0022754669189453125,
0.0196380615234375,
0.022918701171875,
-0.03759765625,
-0.0180816650390625,
-0.034942626953125,
0.021484375,
-0.0479736328125,
-0.0291595458984375,
-0.06561279296875,
-0.04217529296875,
-0.035430908203125,
-0.01934814453125,
-0.036346435546875,
-0.02349853515625,
-0.03009033203125,
0.0006937980651855469,
0.0284881591796875,
0.0277557373046875,
-0.0120086669921875,
0.0243988037109375,
-0.05303955078125,
0.017364501953125,
0.0283050537109375,
0.025634765625,
-0.0028018951416015625,
-0.046295166015625,
-0.0089111328125,
-0.0012502670288085938,
-0.043182373046875,
-0.039947509765625,
0.04620361328125,
0.006450653076171875,
0.040069580078125,
0.043609619140625,
-0.006679534912109375,
0.0712890625,
-0.0163726806640625,
0.045135498046875,
0.0300750732421875,
-0.05914306640625,
0.0266265869140625,
-0.012481689453125,
0.03875732421875,
0.02947998046875,
0.021148681640625,
-0.043212890625,
0.005016326904296875,
-0.06439208984375,
-0.08441162109375,
0.06988525390625,
0.007427215576171875,
0.0006060600280761719,
0.01088714599609375,
-0.0047454833984375,
0.0009002685546875,
-0.0034160614013671875,
-0.042388916015625,
-0.0260162353515625,
-0.0279998779296875,
-0.016632080078125,
-0.0028591156005859375,
-0.0305938720703125,
-0.0005340576171875,
-0.03961181640625,
0.0523681640625,
-0.011444091796875,
0.051849365234375,
0.0219879150390625,
-0.023590087890625,
-0.0001875162124633789,
-0.0025234222412109375,
0.034881591796875,
0.0233001708984375,
-0.0204620361328125,
0.00762939453125,
0.016143798828125,
-0.031585693359375,
-0.007038116455078125,
0.0276947021484375,
-0.022491455078125,
-0.002655029296875,
0.025054931640625,
0.08477783203125,
0.023773193359375,
-0.021636962890625,
0.040740966796875,
-0.0008130073547363281,
-0.03857421875,
-0.0272064208984375,
0.0197296142578125,
0.001155853271484375,
0.0287322998046875,
0.0167999267578125,
0.0269927978515625,
0.023773193359375,
-0.0013265609741210938,
0.02276611328125,
0.026580810546875,
-0.05450439453125,
-0.0276947021484375,
0.060943603515625,
0.00811767578125,
-0.001384735107421875,
0.0535888671875,
-0.00896453857421875,
-0.051727294921875,
0.06591796875,
0.03900146484375,
0.07562255859375,
-0.0036449432373046875,
0.01806640625,
0.0570068359375,
0.0093841552734375,
0.0119476318359375,
-0.0093841552734375,
-0.007602691650390625,
-0.057464599609375,
-0.0253753662109375,
-0.07666015625,
-0.0006313323974609375,
0.009124755859375,
-0.051727294921875,
0.041473388671875,
-0.032379150390625,
-0.01322174072265625,
0.0183563232421875,
0.006855010986328125,
-0.07757568359375,
0.0217742919921875,
0.01727294921875,
0.07257080078125,
-0.045166015625,
0.035125732421875,
0.0599365234375,
-0.019378662109375,
-0.0587158203125,
-0.0355224609375,
-0.0033588409423828125,
-0.06475830078125,
0.030303955078125,
0.037261962890625,
0.0020904541015625,
0.004150390625,
-0.05194091796875,
-0.07684326171875,
0.09930419921875,
0.00986480712890625,
-0.022064208984375,
-0.00287628173828125,
0.00008589029312133789,
0.028717041015625,
-0.033447265625,
0.027801513671875,
0.030029296875,
0.04071044921875,
0.055419921875,
-0.035430908203125,
0.00814056396484375,
-0.0214385986328125,
0.01294708251953125,
0.0252532958984375,
-0.06475830078125,
0.04571533203125,
-0.024627685546875,
-0.01806640625,
-0.01177978515625,
0.053375244140625,
0.0106964111328125,
0.020904541015625,
0.05291748046875,
0.05908203125,
0.032196044921875,
-0.024627685546875,
0.063720703125,
-0.017425537109375,
0.058624267578125,
0.055694580078125,
0.0197601318359375,
0.023406982421875,
0.03216552734375,
-0.00823974609375,
0.03387451171875,
0.0704345703125,
-0.041778564453125,
0.03460693359375,
-0.006343841552734375,
0.01245880126953125,
-0.030731201171875,
-0.016357421875,
-0.0304718017578125,
0.05816650390625,
0.0198822021484375,
-0.04608154296875,
-0.01605224609375,
-0.01226806640625,
-0.0030460357666015625,
-0.034515380859375,
-0.01424407958984375,
0.0501708984375,
0.0109710693359375,
-0.0244903564453125,
0.049102783203125,
0.0115203857421875,
0.053009033203125,
-0.032806396484375,
0.0021305084228515625,
-0.00691986083984375,
0.022308349609375,
-0.0269622802734375,
-0.034332275390625,
0.043975830078125,
-0.0166015625,
-0.00372314453125,
-0.00882720947265625,
0.07366943359375,
-0.0223541259765625,
-0.05645751953125,
0.0127410888671875,
0.01000213623046875,
0.0027942657470703125,
0.01235198974609375,
-0.0704345703125,
0.0313720703125,
0.00460052490234375,
-0.033721923828125,
0.0026493072509765625,
0.0108489990234375,
0.00702667236328125,
0.03961181640625,
0.044403076171875,
-0.024566650390625,
0.002197265625,
-0.01120758056640625,
0.066162109375,
-0.0526123046875,
-0.030181884765625,
-0.054931640625,
0.04388427734375,
-0.025543212890625,
-0.025115966796875,
0.05731201171875,
0.04949951171875,
0.0853271484375,
-0.019927978515625,
0.0217742919921875,
-0.032012939453125,
0.01213836669921875,
-0.0171966552734375,
0.041748046875,
-0.050140380859375,
-0.00794219970703125,
-0.032257080078125,
-0.08013916015625,
-0.024810791015625,
0.06561279296875,
-0.02813720703125,
0.0205230712890625,
0.03802490234375,
0.06842041015625,
-0.025177001953125,
-0.0028018951416015625,
0.02001953125,
0.007427215576171875,
0.01580810546875,
0.025054931640625,
0.044891357421875,
-0.0406494140625,
0.03802490234375,
-0.053924560546875,
0.0033702850341796875,
-0.036865234375,
-0.046630859375,
-0.06488037109375,
-0.044403076171875,
-0.03826904296875,
-0.024322509765625,
-0.0299224853515625,
0.06561279296875,
0.08203125,
-0.0657958984375,
-0.00506591796875,
0.003269195556640625,
0.01531219482421875,
-0.01496124267578125,
-0.022125244140625,
0.035919189453125,
-0.003093719482421875,
-0.073486328125,
-0.00560760498046875,
0.0198516845703125,
0.01163482666015625,
-0.0036067962646484375,
-0.0162811279296875,
-0.0020580291748046875,
-0.0115966796875,
0.050933837890625,
0.0206298828125,
-0.04547119140625,
-0.027587890625,
0.01548004150390625,
-0.0012502670288085938,
0.018402099609375,
0.044403076171875,
-0.03802490234375,
0.0276641845703125,
0.04412841796875,
0.03570556640625,
0.07171630859375,
0.006504058837890625,
0.0124969482421875,
-0.0350341796875,
0.0193634033203125,
0.01158905029296875,
0.03515625,
0.030059814453125,
-0.017822265625,
0.038726806640625,
0.026275634765625,
-0.03790283203125,
-0.04443359375,
0.00748443603515625,
-0.092529296875,
-0.00955963134765625,
0.07574462890625,
0.001758575439453125,
-0.04736328125,
0.022491455078125,
-0.01535797119140625,
0.0297088623046875,
-0.01544189453125,
0.039947509765625,
0.0206146240234375,
-0.011993408203125,
-0.031280517578125,
-0.01161956787109375,
0.02801513671875,
0.0032501220703125,
-0.039337158203125,
-0.0413818359375,
0.03424072265625,
0.035400390625,
0.021636962890625,
0.0120086669921875,
-0.030517578125,
0.00740814208984375,
0.01128387451171875,
0.0267791748046875,
-0.020111083984375,
-0.0168914794921875,
-0.0153656005859375,
0.0120697021484375,
-0.01171112060546875,
-0.0231170654296875
]
] |
hakurei/waifu-diffusion | 2023-07-05T16:18:18.000Z | [
"diffusers",
"stable-diffusion",
"text-to-image",
"en",
"license:creativeml-openrail-m",
"endpoints_compatible",
"has_space",
"diffusers:StableDiffusionPipeline",
"region:us"
] | text-to-image | hakurei | null | null | hakurei/waifu-diffusion | 2,281 | 33,289 | diffusers | 2022-08-30T02:28:33 | ---
language:
- en
tags:
- stable-diffusion
- text-to-image
license: creativeml-openrail-m
inference: true
---
# waifu-diffusion v1.4 - Diffusion for Weebs
waifu-diffusion is a latent text-to-image diffusion model that has been conditioned on high-quality anime images through fine-tuning.

<sub>masterpiece, best quality, 1girl, green hair, sweater, looking at viewer, upper body, beanie, outdoors, watercolor, night, turtleneck</sub>
[Original Weights](https://huggingface.co/hakurei/waifu-diffusion-v1-4)
# Gradio & Colab
We also support a [Gradio](https://github.com/gradio-app/gradio) Web UI and Colab with Diffusers to run Waifu Diffusion:
[](https://huggingface.co/spaces/hakurei/waifu-diffusion-demo)
[](https://colab.research.google.com/drive/1_8wPN7dJO746QXsFnB09Uq2VGgSRFuYE#scrollTo=1HaCauSq546O)
## Model Description
[See here for a full model overview.](https://gist.github.com/harubaru/f727cedacae336d1f7877c4bbe2196e1)
## License
This model is open access and available to all, with a CreativeML OpenRAIL-M license further specifying rights and usage.
The CreativeML OpenRAIL License specifies:
1. You can't use the model to deliberately produce or share illegal or harmful outputs or content
2. The authors claim no rights on the outputs you generate; you are free to use them and are accountable for their use, which must not go against the provisions set in the license
3. You may re-distribute the weights and use the model commercially and/or as a service. If you do, please be aware you have to include the same use restrictions as the ones in the license and share a copy of the CreativeML OpenRAIL-M to all your users (please read the license entirely and carefully)
[Please read the full license here](https://huggingface.co/spaces/CompVis/stable-diffusion-license)
## Downstream Uses
This model can be used for entertainment purposes and as a generative art assistant.
## Example Code
```python
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    'hakurei/waifu-diffusion',
    torch_dtype=torch.float32
).to('cuda')

prompt = "1girl, aqua eyes, baseball cap, blonde hair, closed mouth, earrings, green background, hat, hoop earrings, jewelry, looking at viewer, shirt, short hair, simple background, solo, upper body, yellow shirt"

# Recent diffusers versions return images via the `.images` attribute;
# the older `["sample"]` indexing and the `autocast` context are deprecated.
image = pipe(prompt, guidance_scale=6).images[0]
image.save("test.png")
```
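The `guidance_scale=6` argument controls classifier-free guidance: at each denoising step the pipeline combines an unconditional and a prompt-conditioned noise prediction. A minimal sketch of that arithmetic with hypothetical toy values (real predictions are full latent-sized tensors):

```python
import numpy as np

# Hypothetical noise predictions from one denoising step (toy 3-element vectors).
eps_uncond = np.array([0.1, -0.2, 0.3])  # unconditional (empty prompt)
eps_cond = np.array([0.2, 0.0, 0.1])     # conditioned on the text prompt

guidance_scale = 6.0
# Guidance extrapolates from the unconditional prediction toward the conditional one.
eps_guided = eps_uncond + guidance_scale * (eps_cond - eps_uncond)
```

With `guidance_scale=1` the conditional prediction is used unchanged; larger values trade sample diversity for closer prompt adherence.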
## Team Members and Acknowledgements
This project would not have been possible without the incredible work by Stability AI and Novel AI.
- [Haru](https://github.com/harubaru)
- [Salt](https://github.com/sALTaccount/)
- [Sta @ Bit192](https://twitter.com/naclbbr)
In order to reach us, you can join our [Discord server](https://discord.gg/touhouai).
[](https://discord.gg/touhouai) | 3,404 | [
[
-0.0399169921875,
-0.06671142578125,
0.035064697265625,
0.042724609375,
-0.0185089111328125,
-0.015899658203125,
0.017242431640625,
-0.0217742919921875,
0.0164947509765625,
0.0294647216796875,
-0.0391845703125,
-0.036468505859375,
-0.057037353515625,
-0.01537322998046875,
-0.023345947265625,
0.07000732421875,
-0.01230621337890625,
-0.00876617431640625,
-0.01390838623046875,
-0.0138397216796875,
-0.045623779296875,
-0.005191802978515625,
-0.06951904296875,
-0.0438232421875,
0.043060302734375,
0.0106201171875,
0.058929443359375,
0.028289794921875,
0.021881103515625,
0.0231170654296875,
-0.009674072265625,
-0.004055023193359375,
-0.04931640625,
-0.007328033447265625,
0.010223388671875,
-0.025482177734375,
-0.053253173828125,
0.00789642333984375,
0.040771484375,
0.0171356201171875,
-0.0191802978515625,
0.00022220611572265625,
0.003662109375,
0.044036865234375,
-0.0328369140625,
0.004451751708984375,
-0.0040435791015625,
0.00347137451171875,
-0.025238037109375,
0.006526947021484375,
-0.01514434814453125,
-0.029815673828125,
-0.0030002593994140625,
-0.07196044921875,
0.011444091796875,
0.007534027099609375,
0.0980224609375,
0.0272064208984375,
-0.0274658203125,
-0.0079193115234375,
-0.0321044921875,
0.0362548828125,
-0.056060791015625,
0.0286102294921875,
0.02862548828125,
0.014892578125,
-0.01531982421875,
-0.08123779296875,
-0.032684326171875,
0.005947113037109375,
-0.00888824462890625,
0.027923583984375,
-0.0267791748046875,
-0.01015472412109375,
0.0085601806640625,
0.03643798828125,
-0.05279541015625,
-0.001216888427734375,
-0.03094482421875,
0.00534820556640625,
0.04705810546875,
0.0066680908203125,
0.037994384765625,
-0.006061553955078125,
-0.04583740234375,
-0.00933837890625,
-0.03729248046875,
-0.0006861686706542969,
0.03485107421875,
-0.0264739990234375,
-0.058135986328125,
0.019683837890625,
-0.0007205009460449219,
0.03399658203125,
0.0182647705078125,
-0.003925323486328125,
0.0330810546875,
-0.0012235641479492188,
-0.026397705078125,
-0.0292510986328125,
0.093017578125,
0.02386474609375,
-0.0100860595703125,
-0.0035247802734375,
-0.00505828857421875,
-0.01312255859375,
-0.0028285980224609375,
-0.07830810546875,
-0.043060302734375,
0.022674560546875,
-0.044189453125,
-0.0343017578125,
-0.0203857421875,
-0.06195068359375,
-0.0289306640625,
0.024749755859375,
0.0501708984375,
-0.029144287109375,
-0.05023193359375,
0.006435394287109375,
-0.039794921875,
0.01079559326171875,
0.0297088623046875,
-0.053253173828125,
0.010589599609375,
0.0155181884765625,
0.07958984375,
-0.0005497932434082031,
-0.010101318359375,
-0.007568359375,
0.017730712890625,
-0.0154571533203125,
0.054168701171875,
-0.0168609619140625,
-0.05853271484375,
-0.0377197265625,
0.01073455810546875,
0.004077911376953125,
-0.02569580078125,
0.0421142578125,
-0.03680419921875,
0.0249176025390625,
-0.0030193328857421875,
-0.035552978515625,
-0.0240020751953125,
-0.01220703125,
-0.049835205078125,
0.0650634765625,
0.0283660888671875,
-0.061370849609375,
0.01043701171875,
-0.07513427734375,
-0.024658203125,
0.0135955810546875,
0.00580596923828125,
-0.053253173828125,
-0.006130218505859375,
0.00018036365509033203,
0.0203857421875,
-0.01554107666015625,
0.01279449462890625,
-0.042144775390625,
-0.0011806488037109375,
0.00010949373245239258,
-0.03704833984375,
0.09930419921875,
0.036376953125,
-0.0213470458984375,
0.0179443359375,
-0.03448486328125,
-0.00916290283203125,
0.0282440185546875,
-0.0085296630859375,
-0.01189422607421875,
-0.0093231201171875,
0.0169677734375,
0.00601959228515625,
0.016265869140625,
-0.03289794921875,
0.01788330078125,
-0.01525115966796875,
0.039093017578125,
0.07080078125,
0.01026153564453125,
0.03485107421875,
-0.032928466796875,
0.046630859375,
0.0009741783142089844,
0.04205322265625,
0.0185394287109375,
-0.0709228515625,
-0.05657958984375,
-0.045135498046875,
0.01654052734375,
0.0207061767578125,
-0.046844482421875,
0.023773193359375,
0.00009447336196899414,
-0.0562744140625,
-0.037933349609375,
0.00539398193359375,
0.0026607513427734375,
0.052734375,
0.01480865478515625,
-0.026153564453125,
-0.0216064453125,
-0.047760009765625,
0.021636962890625,
-0.007354736328125,
0.00010091066360473633,
0.0179901123046875,
0.044708251953125,
-0.0240020751953125,
0.05450439453125,
-0.06451416015625,
-0.032623291015625,
0.0076751708984375,
0.027099609375,
0.027435302734375,
0.054656982421875,
0.06756591796875,
-0.080322265625,
-0.05963134765625,
-0.003948211669921875,
-0.0704345703125,
-0.0181121826171875,
0.0080413818359375,
-0.053466796875,
-0.00142669677734375,
0.0119171142578125,
-0.060394287109375,
0.04534912109375,
0.057220458984375,
-0.056610107421875,
0.03460693359375,
-0.0306396484375,
0.0154571533203125,
-0.08575439453125,
0.026763916015625,
0.0064239501953125,
-0.033447265625,
-0.042236328125,
0.029571533203125,
-0.012115478515625,
-0.00885772705078125,
-0.038116455078125,
0.06927490234375,
-0.028961181640625,
0.0316162109375,
-0.027923583984375,
-0.007350921630859375,
0.0135040283203125,
0.03436279296875,
-0.0036563873291015625,
0.04931640625,
0.0638427734375,
-0.046173095703125,
0.0242919921875,
0.031158447265625,
-0.017791748046875,
0.044189453125,
-0.0721435546875,
0.0017328262329101562,
-0.0129852294921875,
0.0216522216796875,
-0.0806884765625,
-0.019927978515625,
0.04913330078125,
-0.044921875,
0.017486572265625,
-0.0220184326171875,
-0.035247802734375,
-0.0241851806640625,
-0.008575439453125,
0.030181884765625,
0.07513427734375,
-0.0325927734375,
0.0484619140625,
0.032562255859375,
0.0010957717895507812,
-0.0268402099609375,
-0.05194091796875,
-0.0216827392578125,
-0.041473388671875,
-0.06494140625,
0.0290679931640625,
-0.026397705078125,
0.0032196044921875,
0.01276397705078125,
0.01065826416015625,
-0.016754150390625,
0.00980377197265625,
0.0382080078125,
0.0250091552734375,
0.005725860595703125,
-0.0222015380859375,
0.0019588470458984375,
-0.01216888427734375,
0.001476287841796875,
-0.01259613037109375,
0.052734375,
-0.0012493133544921875,
-0.023162841796875,
-0.05438232421875,
0.0193939208984375,
0.0501708984375,
-0.004589080810546875,
0.06427001953125,
0.07696533203125,
-0.030242919921875,
0.01641845703125,
-0.03033447265625,
0.0083465576171875,
-0.04150390625,
0.004665374755859375,
-0.020660400390625,
-0.035186767578125,
0.058258056640625,
0.0162353515625,
0.019256591796875,
0.0462646484375,
0.04229736328125,
-0.029449462890625,
0.0980224609375,
0.044830322265625,
0.00818634033203125,
0.04547119140625,
-0.06646728515625,
0.0034236907958984375,
-0.07049560546875,
-0.029876708984375,
-0.0306854248046875,
-0.036102294921875,
-0.035888671875,
-0.036285400390625,
0.046478271484375,
0.028167724609375,
-0.029571533203125,
0.0064544677734375,
-0.0270538330078125,
0.005382537841796875,
0.0237274169921875,
0.0239410400390625,
0.01015472412109375,
0.0014791488647460938,
0.003520965576171875,
0.007724761962890625,
-0.0506591796875,
-0.0160675048828125,
0.060760498046875,
0.032958984375,
0.0736083984375,
0.0279083251953125,
0.05242919921875,
0.032318115234375,
0.049163818359375,
-0.039520263671875,
0.0297698974609375,
-0.00879669189453125,
-0.062744140625,
-0.0158843994140625,
-0.037139892578125,
-0.07672119140625,
0.0197906494140625,
-0.011474609375,
-0.0361328125,
0.0305328369140625,
0.016326904296875,
-0.007442474365234375,
0.034637451171875,
-0.06304931640625,
0.06927490234375,
-0.01007080078125,
-0.045166015625,
-0.01187896728515625,
-0.029144287109375,
0.0234527587890625,
0.00998687744140625,
0.017120361328125,
-0.00978851318359375,
-0.007099151611328125,
0.0599365234375,
-0.0494384765625,
0.059844970703125,
-0.036651611328125,
-0.01276397705078125,
0.028167724609375,
-0.00362396240234375,
0.0262603759765625,
0.01546478271484375,
-0.0005898475646972656,
0.0233612060546875,
-0.00978851318359375,
-0.052215576171875,
-0.0288238525390625,
0.06561279296875,
-0.0712890625,
-0.037109375,
-0.0269622802734375,
-0.034210205078125,
0.024932861328125,
0.0338134765625,
0.0531005859375,
0.0211181640625,
-0.0006251335144042969,
0.005985260009765625,
0.055908203125,
-0.003368377685546875,
0.035675048828125,
0.0146026611328125,
-0.031158447265625,
-0.040985107421875,
0.06658935546875,
-0.0040283203125,
0.0274810791015625,
0.00017821788787841797,
0.03131103515625,
-0.03131103515625,
-0.040985107421875,
-0.048583984375,
0.0240020751953125,
-0.045196533203125,
-0.01971435546875,
-0.04296875,
-0.01015472412109375,
-0.03173828125,
-0.0113372802734375,
-0.01302337646484375,
-0.01293182373046875,
-0.0518798828125,
0.0144805908203125,
0.046417236328125,
0.0458984375,
-0.01389312744140625,
0.03363037109375,
-0.03253173828125,
0.03399658203125,
0.0140533447265625,
0.0364990234375,
0.01873779296875,
-0.05145263671875,
-0.0033626556396484375,
0.0123291015625,
-0.043701171875,
-0.0650634765625,
0.039947509765625,
0.005947113037109375,
0.046417236328125,
0.040252685546875,
-0.00821685791015625,
0.06829833984375,
-0.0236358642578125,
0.056854248046875,
0.04339599609375,
-0.0498046875,
0.038543701171875,
-0.04315185546875,
0.02081298828125,
0.013580322265625,
0.05352783203125,
-0.031036376953125,
-0.032562255859375,
-0.058074951171875,
-0.0657958984375,
0.029144287109375,
0.0294647216796875,
0.005138397216796875,
0.0112457275390625,
0.021697998046875,
0.00466156005859375,
-0.0022449493408203125,
-0.0740966796875,
-0.041473388671875,
-0.02117919921875,
-0.01110076904296875,
0.01047515869140625,
0.01422882080078125,
-0.01432037353515625,
-0.022613525390625,
0.07086181640625,
0.00669097900390625,
0.0322265625,
0.0195465087890625,
0.0022296905517578125,
-0.02252197265625,
-0.01271820068359375,
0.0209808349609375,
0.036529541015625,
-0.016693115234375,
-0.022918701171875,
-0.00691986083984375,
-0.040618896484375,
0.01403045654296875,
0.003620147705078125,
-0.040252685546875,
-0.004863739013671875,
0.0057373046875,
0.06805419921875,
-0.0090484619140625,
-0.017333984375,
0.0322265625,
-0.0163116455078125,
-0.0292510986328125,
-0.0278778076171875,
0.0182037353515625,
0.02655029296875,
0.0275726318359375,
0.0211029052734375,
0.0211639404296875,
0.017425537109375,
-0.03192138671875,
-0.0084991455078125,
0.04058837890625,
-0.017364501953125,
-0.018585205078125,
0.0911865234375,
0.014251708984375,
-0.0239715576171875,
0.0406494140625,
-0.0201263427734375,
-0.01226043701171875,
0.041473388671875,
0.0450439453125,
0.0662841796875,
-0.0086669921875,
0.025177001953125,
0.02978515625,
-0.00612640380859375,
-0.006244659423828125,
0.0197906494140625,
0.01019287109375,
-0.0280609130859375,
-0.0098876953125,
-0.04620361328125,
0.0036468505859375,
0.016326904296875,
-0.035919189453125,
0.038726806640625,
-0.045013427734375,
-0.0328369140625,
-0.0137481689453125,
-0.007671356201171875,
-0.038818359375,
0.0107269287109375,
0.009857177734375,
0.0682373046875,
-0.07196044921875,
0.056793212890625,
0.044403076171875,
-0.05096435546875,
-0.048614501953125,
-0.006618499755859375,
-0.021697998046875,
-0.03900146484375,
0.02471923828125,
-0.0089111328125,
-0.014068603515625,
0.014892578125,
-0.06494140625,
-0.06280517578125,
0.102783203125,
0.0204010009765625,
-0.0005054473876953125,
-0.019805908203125,
-0.01971435546875,
0.03741455078125,
-0.016998291015625,
0.032745361328125,
0.00524139404296875,
0.0416259765625,
0.0159149169921875,
-0.0484619140625,
0.004150390625,
-0.0267791748046875,
0.0074005126953125,
0.0090484619140625,
-0.08135986328125,
0.065673828125,
-0.01032257080078125,
-0.022247314453125,
0.032867431640625,
0.064208984375,
0.02581787109375,
0.022674560546875,
0.039031982421875,
0.0682373046875,
0.04522705078125,
-0.0199127197265625,
0.08160400390625,
0.00669097900390625,
0.042633056640625,
0.053497314453125,
0.004421234130859375,
0.0462646484375,
0.0282440185546875,
-0.017486572265625,
0.0653076171875,
0.047149658203125,
-0.0013751983642578125,
0.055633544921875,
-0.00396728515625,
-0.0166473388671875,
-0.007274627685546875,
-0.002628326416015625,
-0.043609619140625,
0.0003266334533691406,
0.025543212890625,
-0.01342010498046875,
-0.010040283203125,
0.0142669677734375,
0.0039520263671875,
-0.0229949951171875,
-0.01364898681640625,
0.030914306640625,
0.00860595703125,
-0.0271148681640625,
0.06884765625,
-0.002323150634765625,
0.06866455078125,
-0.0655517578125,
-0.01058197021484375,
-0.0197601318359375,
-0.0032520294189453125,
-0.0310516357421875,
-0.070068359375,
0.00811767578125,
0.001888275146484375,
-0.002655029296875,
-0.0231475830078125,
0.04534912109375,
-0.00946807861328125,
-0.032867431640625,
0.0257568359375,
0.01505279541015625,
0.0195770263671875,
0.02569580078125,
-0.06463623046875,
0.01354217529296875,
0.00502777099609375,
-0.0265350341796875,
0.0190277099609375,
0.01230621337890625,
0.0108184814453125,
0.07037353515625,
0.0416259765625,
0.007328033447265625,
0.01241302490234375,
-0.001407623291015625,
0.056854248046875,
-0.019683837890625,
-0.0418701171875,
-0.04547119140625,
0.061859130859375,
-0.009735107421875,
-0.02447509765625,
0.050994873046875,
0.0416259765625,
0.055145263671875,
-0.01219940185546875,
0.057769775390625,
-0.046844482421875,
0.00627899169921875,
-0.034423828125,
0.08636474609375,
-0.077392578125,
0.0022106170654296875,
-0.027252197265625,
-0.04229736328125,
0.00396728515625,
0.062164306640625,
-0.0018796920776367188,
0.0150909423828125,
0.03326416015625,
0.060821533203125,
-0.0239715576171875,
-0.015411376953125,
0.011077880859375,
0.0200653076171875,
0.03570556640625,
0.0421142578125,
0.044342041015625,
-0.0628662109375,
0.039306640625,
-0.027191162109375,
-0.023712158203125,
-0.01181793212890625,
-0.061798095703125,
-0.058502197265625,
-0.0498046875,
-0.041900634765625,
-0.0450439453125,
-0.0146026611328125,
0.0269775390625,
0.0794677734375,
-0.039093017578125,
-0.0232696533203125,
-0.0175323486328125,
0.01401519775390625,
-0.00531005859375,
-0.0228424072265625,
0.0181427001953125,
0.0184173583984375,
-0.068115234375,
0.0005092620849609375,
-0.002353668212890625,
0.033599853515625,
-0.0250396728515625,
-0.03265380859375,
-0.0379638671875,
-0.0170440673828125,
0.0266265869140625,
0.02508544921875,
-0.04168701171875,
-0.00339508056640625,
-0.0162353515625,
0.00146484375,
0.016082763671875,
0.01885986328125,
-0.043701171875,
0.0170135498046875,
0.06597900390625,
0.0063323974609375,
0.0557861328125,
-0.0005750656127929688,
0.0235137939453125,
-0.037933349609375,
0.0197601318359375,
0.00165557861328125,
0.03558349609375,
0.0144805908203125,
-0.032623291015625,
0.045989990234375,
0.0364990234375,
-0.050750732421875,
-0.0574951171875,
0.01076507568359375,
-0.07928466796875,
-0.0211639404296875,
0.06439208984375,
-0.0234222412109375,
-0.0244903564453125,
0.007297515869140625,
-0.01885986328125,
0.0268096923828125,
-0.0308990478515625,
0.039642333984375,
0.048248291015625,
-0.024627685546875,
-0.0174102783203125,
-0.04400634765625,
0.02215576171875,
0.0167236328125,
-0.06341552734375,
-0.0185699462890625,
0.03594970703125,
0.058502197265625,
0.0221710205078125,
0.068115234375,
-0.0203399658203125,
0.02197265625,
-0.00067138671875,
0.0025234222412109375,
0.001251220703125,
-0.004550933837890625,
-0.0195159912109375,
0.0010986328125,
0.004058837890625,
-0.01177215576171875
]
] |
timm/swin_tiny_patch4_window7_224.ms_in1k | 2023-03-18T04:15:09.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"arxiv:2103.14030",
"license:mit",
"region:us"
] | image-classification | timm | null | null | timm/swin_tiny_patch4_window7_224.ms_in1k | 0 | 33,159 | timm | 2023-03-18T04:14:56 | ---
tags:
- image-classification
- timm
library_name: timm
license: mit
datasets:
- imagenet-1k
---
# Model card for swin_tiny_patch4_window7_224.ms_in1k
A Swin Transformer image classification model. Pretrained on ImageNet-1k by paper authors.
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 28.3
- GMACs: 4.5
- Activations (M): 17.1
- Image size: 224 x 224
- **Papers:**
- Swin Transformer: Hierarchical Vision Transformer using Shifted Windows: https://arxiv.org/abs/2103.14030
- **Original:** https://github.com/microsoft/Swin-Transformer
- **Dataset:** ImageNet-1k
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('swin_tiny_patch4_window7_224.ms_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
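The returned indices are positions in the ImageNet-1k label space. A minimal sketch of pairing them with human-readable names — the probabilities, indices, and `class_labels` mapping below are hypothetical stand-ins for real model output and a real ImageNet-1k label file, used purely for illustration:

```python
# Dummy values standing in for top5_probabilities.tolist() /
# top5_class_indices.tolist() from the snippet above, plus a tiny
# hypothetical index-to-label mapping.
top5_probabilities = [54.2, 12.1, 8.7, 3.3, 1.9]
top5_class_indices = [963, 923, 933, 927, 959]
class_labels = {963: "pizza", 923: "plate", 933: "cheeseburger",
                927: "trifle", 959: "carbonara"}

for prob, idx in zip(top5_probabilities, top5_class_indices):
    # fall back to the raw index if a label is missing from the mapping
    print(f"{class_labels.get(idx, idx)}: {prob:.1f}%")
```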
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'swin_tiny_patch4_window7_224.ms_in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g. for swin_base_patch4_window7_224 (NHWC output)
# torch.Size([1, 56, 56, 128])
# torch.Size([1, 28, 28, 256])
# torch.Size([1, 14, 14, 512])
# torch.Size([1, 7, 7, 1024])
# e.g. for swinv2_cr_small_ns_224 (NCHW output)
# torch.Size([1, 96, 56, 56])
# torch.Size([1, 192, 28, 28])
# torch.Size([1, 384, 14, 14])
# torch.Size([1, 768, 7, 7])
print(o.shape)
```
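Swin feature maps come out channels-last (NHWC); downstream code that expects channels-first (NCHW) would call `o.permute(0, 3, 1, 2)` on each tensor. A small sketch of just the axis reordering, applied to the stage shapes of swin_tiny (embed dim 96) rather than real tensors:

```python
def nhwc_to_nchw_shape(shape):
    """Shape after moving channels-last (NHWC) to channels-first (NCHW).

    With real tensors this corresponds to `o.permute(0, 3, 1, 2)`;
    this helper only illustrates the axis reordering on shape tuples.
    """
    n, h, w, c = shape
    return (n, c, h, w)

# swin_tiny per-stage NHWC shapes, converted to NCHW
for s in [(1, 56, 56, 96), (1, 28, 28, 192), (1, 14, 14, 384), (1, 7, 7, 768)]:
    print(nhwc_to_nchw_shape(s))
```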
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'swin_tiny_patch4_window7_224.ms_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, i.e. a (batch_size, H, W, num_features) tensor for swin / swinv2
# or (batch_size, num_features, H, W) for swinv2_cr
output = model.forward_head(output, pre_logits=True)
# output is (batch_size, num_features) tensor
```
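A common use of these pooled embeddings is similarity search. A minimal pure-Python sketch of cosine similarity between two embedding vectors (the toy lists below stand in for `output[0].tolist()` from two images):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors given as plain lists."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# identical directions score 1.0, orthogonal directions score 0.0
print(cosine_similarity([1.0, 0.0, 1.0], [1.0, 0.0, 1.0]))
print(cosine_similarity([1.0, 0.0], [0.0, 1.0]))
```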
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@inproceedings{liu2021Swin,
title={Swin Transformer: Hierarchical Vision Transformer using Shifted Windows},
author={Liu, Ze and Lin, Yutong and Cao, Yue and Hu, Han and Wei, Yixuan and Zhang, Zheng and Lin, Stephen and Guo, Baining},
booktitle={Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
year={2021}
}
```
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
| 4,404 | [
[
-0.0316162109375,
-0.034210205078125,
-0.00247955322265625,
0.00919342041015625,
-0.02178955078125,
-0.032806396484375,
-0.0171661376953125,
-0.03631591796875,
0.00547027587890625,
0.0261383056640625,
-0.0447998046875,
-0.04791259765625,
-0.043914794921875,
-0.01519775390625,
-0.007762908935546875,
0.07568359375,
-0.00850677490234375,
-0.00695037841796875,
-0.01361846923828125,
-0.041473388671875,
-0.01259613037109375,
-0.01007843017578125,
-0.041015625,
-0.0258331298828125,
0.026519775390625,
0.0121307373046875,
0.047332763671875,
0.04144287109375,
0.0565185546875,
0.036041259765625,
-0.006763458251953125,
-0.0012292861938476562,
-0.01372528076171875,
-0.021759033203125,
0.0276641845703125,
-0.0496826171875,
-0.039520263671875,
0.01898193359375,
0.052154541015625,
0.02911376953125,
0.007122039794921875,
0.03472900390625,
0.00994110107421875,
0.03277587890625,
-0.0107269287109375,
0.00566864013671875,
-0.03912353515625,
0.01143646240234375,
-0.00930023193359375,
0.004970550537109375,
-0.01458740234375,
-0.029876708984375,
0.0203094482421875,
-0.041839599609375,
0.051239013671875,
0.008148193359375,
0.10406494140625,
0.007049560546875,
-0.005619049072265625,
0.005664825439453125,
-0.019317626953125,
0.0679931640625,
-0.073486328125,
0.003978729248046875,
0.006439208984375,
0.0114898681640625,
-0.003704071044921875,
-0.0655517578125,
-0.0399169921875,
-0.0099945068359375,
-0.023681640625,
-0.004177093505859375,
-0.0178375244140625,
0.00487518310546875,
0.0301513671875,
0.020599365234375,
-0.034759521484375,
0.01200103759765625,
-0.040924072265625,
-0.0205078125,
0.0469970703125,
0.00725555419921875,
0.03125,
-0.0237884521484375,
-0.044281005859375,
-0.035888671875,
-0.02435302734375,
0.02154541015625,
0.0102386474609375,
0.018707275390625,
-0.050628662109375,
0.03680419921875,
0.02685546875,
0.0379638671875,
0.004528045654296875,
-0.038726806640625,
0.051544189453125,
-0.01313018798828125,
-0.029876708984375,
-0.01910400390625,
0.0731201171875,
0.030853271484375,
0.004917144775390625,
0.0224761962890625,
-0.017913818359375,
-0.02008056640625,
-0.0073699951171875,
-0.081787109375,
-0.021942138671875,
0.0211181640625,
-0.04901123046875,
-0.039520263671875,
0.0169219970703125,
-0.048126220703125,
-0.006923675537109375,
-0.0048370361328125,
0.044158935546875,
-0.034912109375,
-0.032440185546875,
-0.015106201171875,
-0.0274200439453125,
0.03350830078125,
0.021453857421875,
-0.041839599609375,
0.003726959228515625,
0.0192413330078125,
0.08099365234375,
-0.00559234619140625,
-0.045196533203125,
-0.006256103515625,
-0.0203094482421875,
-0.0207061767578125,
0.029632568359375,
-0.00031065940856933594,
-0.01004791259765625,
-0.01515960693359375,
0.0275115966796875,
-0.015655517578125,
-0.049652099609375,
0.01038360595703125,
-0.00797271728515625,
0.0183258056640625,
0.002742767333984375,
-0.00867462158203125,
-0.023040771484375,
0.018890380859375,
-0.0287933349609375,
0.0997314453125,
0.035125732421875,
-0.073486328125,
0.0214385986328125,
-0.037017822265625,
-0.0205841064453125,
-0.0118560791015625,
-0.003559112548828125,
-0.0751953125,
0.0005993843078613281,
0.023040771484375,
0.043975830078125,
-0.01267242431640625,
0.0025177001953125,
-0.038238525390625,
-0.0198211669921875,
0.021514892578125,
-0.00829315185546875,
0.076171875,
0.00017404556274414062,
-0.0445556640625,
0.0291748046875,
-0.042694091796875,
0.00838470458984375,
0.040069580078125,
-0.012237548828125,
-0.0174102783203125,
-0.047515869140625,
0.020599365234375,
0.0300750732421875,
0.01885986328125,
-0.04669189453125,
0.022064208984375,
-0.0194091796875,
0.033538818359375,
0.055908203125,
-0.006206512451171875,
0.0292816162109375,
-0.0274505615234375,
0.0196380615234375,
0.03564453125,
0.032379150390625,
-0.003192901611328125,
-0.047576904296875,
-0.06414794921875,
-0.03338623046875,
0.0189208984375,
0.0289154052734375,
-0.045654296875,
0.0423583984375,
-0.018585205078125,
-0.055023193359375,
-0.043731689453125,
0.006420135498046875,
0.0235748291015625,
0.0440673828125,
0.0265960693359375,
-0.0205078125,
-0.044586181640625,
-0.07196044921875,
0.01020050048828125,
-0.0030002593994140625,
-0.00736236572265625,
0.0263824462890625,
0.061431884765625,
-0.0228729248046875,
0.05474853515625,
-0.0308074951171875,
-0.02734375,
-0.0247802734375,
0.01540374755859375,
0.031646728515625,
0.055145263671875,
0.06182861328125,
-0.042724609375,
-0.0306549072265625,
-0.00567626953125,
-0.0645751953125,
0.004634857177734375,
-0.0152587890625,
-0.0196075439453125,
0.0282135009765625,
0.0021877288818359375,
-0.04815673828125,
0.054840087890625,
0.022979736328125,
-0.0253753662109375,
0.042938232421875,
-0.02386474609375,
0.0191497802734375,
-0.07696533203125,
0.00738525390625,
0.03009033203125,
-0.00626373291015625,
-0.034393310546875,
-0.0003178119659423828,
0.0119781494140625,
-0.00391387939453125,
-0.03863525390625,
0.0458984375,
-0.04144287109375,
-0.0194244384765625,
-0.0103607177734375,
0.0022525787353515625,
0.0041046142578125,
0.056396484375,
-0.0018186569213867188,
0.0262603759765625,
0.06146240234375,
-0.0303192138671875,
0.0223846435546875,
0.029998779296875,
-0.0166168212890625,
0.02777099609375,
-0.0543212890625,
0.00045371055603027344,
0.002834320068359375,
0.0155029296875,
-0.07525634765625,
-0.00417327880859375,
0.00901031494140625,
-0.041046142578125,
0.04193115234375,
-0.040985107421875,
-0.0249481201171875,
-0.047576904296875,
-0.04864501953125,
0.0274505615234375,
0.06298828125,
-0.05975341796875,
0.034759521484375,
0.017333984375,
0.01103973388671875,
-0.0419921875,
-0.0697021484375,
-0.02789306640625,
-0.0213165283203125,
-0.06280517578125,
0.031951904296875,
0.01192474365234375,
0.0009441375732421875,
0.00969696044921875,
-0.0099029541015625,
0.003284454345703125,
-0.020477294921875,
0.04229736328125,
0.056671142578125,
-0.03076171875,
-0.0221099853515625,
-0.0163726806640625,
-0.00341033935546875,
0.00127410888671875,
-0.004955291748046875,
0.0308074951171875,
-0.0224761962890625,
-0.01236724853515625,
-0.045654296875,
-0.0086517333984375,
0.044158935546875,
-0.0029449462890625,
0.0557861328125,
0.09210205078125,
-0.03167724609375,
-0.006305694580078125,
-0.031829833984375,
-0.0257720947265625,
-0.038360595703125,
0.0310516357421875,
-0.0244140625,
-0.033111572265625,
0.061798095703125,
0.005023956298828125,
0.021240234375,
0.05963134765625,
0.02020263671875,
-0.020721435546875,
0.07012939453125,
0.042388916015625,
0.006275177001953125,
0.0556640625,
-0.0723876953125,
-0.0075836181640625,
-0.06378173828125,
-0.032318115234375,
-0.0261993408203125,
-0.0469970703125,
-0.045135498046875,
-0.039215087890625,
0.03692626953125,
0.01316070556640625,
-0.0244903564453125,
0.036346435546875,
-0.060791015625,
-0.004795074462890625,
0.04864501953125,
0.03253173828125,
-0.0259857177734375,
0.02178955078125,
-0.0217132568359375,
-0.01654052734375,
-0.053466796875,
-0.00901031494140625,
0.064697265625,
0.040679931640625,
0.0592041015625,
-0.015289306640625,
0.04827880859375,
-0.0089569091796875,
0.0199432373046875,
-0.03857421875,
0.05029296875,
-0.0006737709045410156,
-0.0258331298828125,
-0.019012451171875,
-0.0279998779296875,
-0.07635498046875,
0.0237884521484375,
-0.029449462890625,
-0.05010986328125,
0.015380859375,
0.00817108154296875,
-0.00301361083984375,
0.06243896484375,
-0.059844970703125,
0.06805419921875,
-0.01548004150390625,
-0.02587890625,
-0.0006132125854492188,
-0.055908203125,
0.019256591796875,
0.0265045166015625,
-0.00830841064453125,
-0.011474609375,
0.00887298583984375,
0.0828857421875,
-0.050872802734375,
0.07281494140625,
-0.044219970703125,
0.0265960693359375,
0.03472900390625,
-0.015777587890625,
0.033172607421875,
-0.01483917236328125,
0.006107330322265625,
0.03179931640625,
0.007602691650390625,
-0.036224365234375,
-0.043670654296875,
0.04779052734375,
-0.07763671875,
-0.0276947021484375,
-0.0299224853515625,
-0.035308837890625,
0.0178680419921875,
0.00911712646484375,
0.03985595703125,
0.05010986328125,
0.01336669921875,
0.01422119140625,
0.0418701171875,
-0.02667236328125,
0.041046142578125,
0.0010814666748046875,
-0.0177154541015625,
-0.029693603515625,
0.0584716796875,
0.01013946533203125,
0.00968170166015625,
0.00637054443359375,
0.0247039794921875,
-0.0204925537109375,
-0.041168212890625,
-0.0297698974609375,
0.036956787109375,
-0.04901123046875,
-0.037841796875,
-0.035430908203125,
-0.04095458984375,
-0.040069580078125,
-0.0162811279296875,
-0.03143310546875,
-0.0274810791015625,
-0.022796630859375,
0.0139007568359375,
0.052642822265625,
0.04742431640625,
-0.00597381591796875,
0.0248260498046875,
-0.043212890625,
0.012603759765625,
0.00954437255859375,
0.023345947265625,
-0.00446319580078125,
-0.07208251953125,
-0.0171966552734375,
-0.0015249252319335938,
-0.024383544921875,
-0.04681396484375,
0.03924560546875,
0.00739288330078125,
0.044677734375,
0.024810791015625,
-0.006992340087890625,
0.06793212890625,
-0.004695892333984375,
0.05560302734375,
0.0340576171875,
-0.047119140625,
0.05914306640625,
-0.0013151168823242188,
0.01910400390625,
0.0025177001953125,
0.01284027099609375,
-0.0172882080078125,
-0.01552581787109375,
-0.074462890625,
-0.066650390625,
0.06689453125,
0.0089569091796875,
-0.006855010986328125,
0.03253173828125,
0.036163330078125,
0.00821685791015625,
-0.005596160888671875,
-0.0557861328125,
-0.03741455078125,
-0.0279998779296875,
-0.0197601318359375,
0.0042266845703125,
-0.0030612945556640625,
-0.01320648193359375,
-0.062225341796875,
0.05242919921875,
-0.008087158203125,
0.059356689453125,
0.0242156982421875,
-0.017852783203125,
-0.007442474365234375,
-0.02008056640625,
0.034759521484375,
0.0240478515625,
-0.02301025390625,
0.0015249252319335938,
0.0208740234375,
-0.051025390625,
-0.00262451171875,
0.0031299591064453125,
-0.0035400390625,
0.008819580078125,
0.045166015625,
0.07763671875,
0.01346588134765625,
0.0008869171142578125,
0.0521240234375,
0.0018758773803710938,
-0.03717041015625,
-0.015167236328125,
0.01093292236328125,
-0.008758544921875,
0.0273590087890625,
0.0296783447265625,
0.03863525390625,
-0.017791748046875,
-0.0184478759765625,
0.011627197265625,
0.03997802734375,
-0.0181121826171875,
-0.0264739990234375,
0.047210693359375,
-0.00717926025390625,
-0.005023956298828125,
0.06695556640625,
0.01181793212890625,
-0.0350341796875,
0.07806396484375,
0.044769287109375,
0.059722900390625,
0.002811431884765625,
0.005306243896484375,
0.06414794921875,
0.0240631103515625,
0.00621795654296875,
0.006862640380859375,
0.00769805908203125,
-0.0533447265625,
0.0174560546875,
-0.037384033203125,
-0.0005803108215332031,
0.0219268798828125,
-0.048828125,
0.02960205078125,
-0.042816162109375,
-0.0281982421875,
0.00957489013671875,
0.02691650390625,
-0.0751953125,
0.00859832763671875,
0.00083160400390625,
0.07135009765625,
-0.06658935546875,
0.0687255859375,
0.0556640625,
-0.03955078125,
-0.072265625,
-0.0124969482421875,
0.00022101402282714844,
-0.07470703125,
0.029815673828125,
0.032073974609375,
0.004520416259765625,
-0.0086822509765625,
-0.06231689453125,
-0.044158935546875,
0.116455078125,
0.0222320556640625,
-0.01461029052734375,
0.00801849365234375,
-0.006908416748046875,
0.02294921875,
-0.033660888671875,
0.038177490234375,
0.027313232421875,
0.036285400390625,
0.01995849609375,
-0.0482177734375,
0.0173492431640625,
-0.0283355712890625,
0.0232391357421875,
0.0089874267578125,
-0.045684814453125,
0.065673828125,
-0.0523681640625,
-0.01030731201171875,
0.001377105712890625,
0.05023193359375,
0.0255584716796875,
0.008209228515625,
0.04595947265625,
0.045654296875,
0.03863525390625,
-0.02685546875,
0.06427001953125,
0.004726409912109375,
0.0467529296875,
0.042694091796875,
0.019561767578125,
0.048492431640625,
0.035736083984375,
-0.026214599609375,
0.040496826171875,
0.06854248046875,
-0.036468505859375,
0.0244140625,
0.003070831298828125,
0.0083770751953125,
0.00293731689453125,
0.021514892578125,
-0.03680419921875,
0.0201568603515625,
0.01548004150390625,
-0.0345458984375,
-0.0079498291015625,
0.0167694091796875,
-0.0073394775390625,
-0.03204345703125,
-0.018707275390625,
0.035400390625,
-0.003925323486328125,
-0.0280914306640625,
0.05780029296875,
0.005184173583984375,
0.086181640625,
-0.044891357421875,
0.003978729248046875,
-0.020477294921875,
0.018463134765625,
-0.032196044921875,
-0.06805419921875,
0.01457977294921875,
-0.0213470458984375,
-0.0003733634948730469,
0.0035495758056640625,
0.06768798828125,
-0.032745361328125,
-0.035552978515625,
0.02099609375,
0.0189971923828125,
0.0298919677734375,
0.0156402587890625,
-0.0953369140625,
0.01129913330078125,
0.0088348388671875,
-0.050048828125,
0.0310516357421875,
0.0272216796875,
0.00339508056640625,
0.049072265625,
0.0426025390625,
-0.0035114288330078125,
0.0181427001953125,
-0.0006437301635742188,
0.059356689453125,
-0.04681396484375,
-0.0251007080078125,
-0.0511474609375,
0.054229736328125,
-0.01641845703125,
-0.044219970703125,
0.0538330078125,
0.03729248046875,
0.052398681640625,
-0.01247406005859375,
0.042877197265625,
-0.027679443359375,
-0.00128173828125,
-0.0179901123046875,
0.052276611328125,
-0.048614501953125,
-0.000873565673828125,
-0.01947021484375,
-0.048126220703125,
-0.0200653076171875,
0.0567626953125,
-0.0171356201171875,
0.0256805419921875,
0.041534423828125,
0.079345703125,
-0.0201263427734375,
-0.021087646484375,
0.019622802734375,
0.0144805908203125,
0.0025577545166015625,
0.0279693603515625,
0.028900146484375,
-0.06451416015625,
0.032928466796875,
-0.05242919921875,
-0.0205841064453125,
-0.0177764892578125,
-0.0433349609375,
-0.0777587890625,
-0.06597900390625,
-0.048187255859375,
-0.0518798828125,
-0.031219482421875,
0.062103271484375,
0.0816650390625,
-0.05743408203125,
-0.004001617431640625,
0.01300811767578125,
0.0179901123046875,
-0.029632568359375,
-0.019317626953125,
0.0455322265625,
-0.010711669921875,
-0.04925537109375,
-0.0217132568359375,
0.0037174224853515625,
0.037139892578125,
-0.00887298583984375,
-0.02447509765625,
-0.007740020751953125,
-0.016143798828125,
0.0253143310546875,
0.020721435546875,
-0.0543212890625,
-0.0177001953125,
-0.0171051025390625,
-0.0236663818359375,
0.034881591796875,
0.044769287109375,
-0.03778076171875,
0.009979248046875,
0.03863525390625,
0.01241302490234375,
0.0677490234375,
-0.0257720947265625,
0.0002753734588623047,
-0.0679931640625,
0.044219970703125,
-0.00878143310546875,
0.038787841796875,
0.0270538330078125,
-0.02386474609375,
0.036224365234375,
0.035919189453125,
-0.033355712890625,
-0.067138671875,
-0.0181427001953125,
-0.08685302734375,
-0.0153045654296875,
0.06817626953125,
-0.025146484375,
-0.046844482421875,
0.0232696533203125,
-0.010223388671875,
0.036895751953125,
-0.006504058837890625,
0.0252532958984375,
0.007045745849609375,
-0.01007080078125,
-0.04632568359375,
-0.032867431640625,
0.0290374755859375,
0.0024051666259765625,
-0.04461669921875,
-0.0243988037109375,
0.0006003379821777344,
0.0584716796875,
0.019500732421875,
0.038421630859375,
-0.020721435546875,
0.00882720947265625,
0.0224761962890625,
0.043548583984375,
-0.0191802978515625,
-0.005878448486328125,
-0.0303802490234375,
-0.0030345916748046875,
-0.00827789306640625,
-0.04364013671875
]
] |
timm/mobilevit_s.cvnets_in1k | 2023-04-24T22:23:12.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"arxiv:2110.02178",
"license:other",
"region:us"
] | image-classification | timm | null | null | timm/mobilevit_s.cvnets_in1k | 2 | 33,149 | timm | 2023-04-24T22:23:00 | ---
tags:
- image-classification
- timm
library_name: timm
license: other
datasets:
- imagenet-1k
---
# Model card for mobilevit_s.cvnets_in1k
A MobileViT image classification model. Trained on ImageNet-1k by paper authors.
See license details at https://github.com/apple/ml-cvnets/blob/main/LICENSE
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 5.6
- GMACs: 2.0
- Activations (M): 19.9
- Image size: 256 x 256
- **Papers:**
- MobileViT: Light-weight, General-purpose, and Mobile-friendly Vision Transformer: https://arxiv.org/abs/2110.02178
- **Original:** https://github.com/apple/ml-cvnets
- **Dataset:** ImageNet-1k
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('mobilevit_s.cvnets_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
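The `softmax(dim=1)` call above turns the raw class logits into probabilities. A minimal pure-Python sketch of the same numerically stable computation, on dummy logits rather than real model output:

```python
import math

def softmax(logits):
    """Numerically stable softmax over a list of logits."""
    m = max(logits)  # subtract the max so exp() cannot overflow
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax([2.0, 1.0, 0.1])
print(probs)  # probabilities in the same order as the logits, summing to 1
```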
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'mobilevit_s.cvnets_in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g.:
# torch.Size([1, 32, 128, 128])
# torch.Size([1, 64, 64, 64])
# torch.Size([1, 96, 32, 32])
# torch.Size([1, 128, 16, 16])
# torch.Size([1, 640, 8, 8])
print(o.shape)
```
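Each stage halves the spatial resolution, so the shapes in the comment above follow directly from the 256 x 256 input. A sketch that derives them — channel widths are taken from the comment, and the per-stage reduction factors (2, 4, 8, 16, 32) are an assumption consistent with those shapes:

```python
def stage_shapes(img_size, channels, reductions):
    """Expected (N, C, H, W) shapes for per-stage feature maps."""
    return [(1, c, img_size // r, img_size // r)
            for c, r in zip(channels, reductions)]

for s in stage_shapes(256, [32, 64, 96, 128, 640], [2, 4, 8, 16, 32]):
    print(s)
```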
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'mobilevit_s.cvnets_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 640, 8, 8) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
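The head's pre-logits step reduces the (1, 640, 8, 8) map to (1, 640) by averaging each channel over its spatial grid. A pure-Python sketch of that pooling on a toy nested list (two 2x2 channels instead of the real 640 8x8 channels):

```python
def global_avg_pool(feature_map):
    """Average each channel's HxW grid down to a single value.

    `feature_map` is a nested list shaped (C, H, W); the result has
    length C, mirroring the spatial pooling the head applies before
    the classifier.
    """
    return [sum(sum(row) for row in channel) / (len(channel) * len(channel[0]))
            for channel in feature_map]

# toy 2-channel, 2x2 map: channel means are 4.0 and 2.0
print(global_avg_pool([[[1.0, 3.0], [5.0, 7.0]],
                       [[2.0, 2.0], [2.0, 2.0]]]))
```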
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@inproceedings{mehta2022mobilevit,
title={MobileViT: Light-weight, General-purpose, and Mobile-friendly Vision Transformer},
author={Sachin Mehta and Mohammad Rastegari},
booktitle={International Conference on Learning Representations},
year={2022}
}
```
| 3,754 | [
[
-0.0297698974609375,
-0.0213165283203125,
-0.007465362548828125,
0.01202392578125,
-0.03399658203125,
-0.0251617431640625,
-0.00400543212890625,
-0.0157928466796875,
0.0243377685546875,
0.029541015625,
-0.035919189453125,
-0.05615234375,
-0.0455322265625,
-0.01983642578125,
-0.010986328125,
0.0706787109375,
-0.005641937255859375,
-0.0032596588134765625,
-0.0202484130859375,
-0.047882080078125,
-0.019134521484375,
-0.0230255126953125,
-0.0494384765625,
-0.024993896484375,
0.02154541015625,
0.031341552734375,
0.040496826171875,
0.042724609375,
0.045928955078125,
0.031341552734375,
0.00014865398406982422,
0.004566192626953125,
-0.01324462890625,
-0.0198211669921875,
0.029266357421875,
-0.04736328125,
-0.0297698974609375,
0.0216217041015625,
0.043426513671875,
0.0200653076171875,
-0.0023021697998046875,
0.043060302734375,
0.01026153564453125,
0.04437255859375,
-0.0253753662109375,
0.002773284912109375,
-0.036956787109375,
0.0102996826171875,
-0.0095672607421875,
0.0099945068359375,
-0.014190673828125,
-0.03765869140625,
0.0133819580078125,
-0.032012939453125,
0.026397705078125,
-0.004123687744140625,
0.09466552734375,
0.022674560546875,
-0.01380157470703125,
-0.0047454833984375,
-0.021820068359375,
0.0604248046875,
-0.049713134765625,
0.0176239013671875,
0.02197265625,
0.01421356201171875,
0.0007824897766113281,
-0.06866455078125,
-0.04461669921875,
-0.013214111328125,
-0.0023632049560546875,
0.004009246826171875,
-0.0092926025390625,
-0.007228851318359375,
0.0202178955078125,
0.0181732177734375,
-0.038787841796875,
-0.0036830902099609375,
-0.0443115234375,
-0.015899658203125,
0.046478271484375,
0.0055389404296875,
0.0284423828125,
-0.0205078125,
-0.039520263671875,
-0.0251007080078125,
-0.0262603759765625,
0.04412841796875,
0.0038051605224609375,
0.0069732666015625,
-0.05181884765625,
0.036102294921875,
0.0102996826171875,
0.043975830078125,
0.0029201507568359375,
-0.040679931640625,
0.051605224609375,
-0.0025272369384765625,
-0.041656494140625,
-0.0035572052001953125,
0.08087158203125,
0.036407470703125,
0.015380859375,
0.01541900634765625,
-0.0021114349365234375,
-0.0207061767578125,
-0.0008602142333984375,
-0.0863037109375,
-0.0178680419921875,
0.0201873779296875,
-0.05743408203125,
-0.03961181640625,
0.0227508544921875,
-0.04302978515625,
-0.00988006591796875,
0.0020618438720703125,
0.041107177734375,
-0.031402587890625,
-0.02496337890625,
0.00731658935546875,
-0.005184173583984375,
0.0303802490234375,
0.0068817138671875,
-0.052032470703125,
0.0113067626953125,
0.007289886474609375,
0.0838623046875,
0.0048370361328125,
-0.0276031494140625,
-0.01502227783203125,
-0.025054931640625,
-0.01435089111328125,
0.029144287109375,
-0.004467010498046875,
-0.0308685302734375,
-0.02203369140625,
0.0257110595703125,
-0.0234527587890625,
-0.052642822265625,
0.040130615234375,
-0.0051116943359375,
0.0108795166015625,
-0.003238677978515625,
-0.0002493858337402344,
-0.0418701171875,
0.0235443115234375,
-0.0208740234375,
0.10150146484375,
0.019927978515625,
-0.068115234375,
0.027313232421875,
-0.03857421875,
-0.02020263671875,
-0.02105712890625,
-0.0024242401123046875,
-0.0869140625,
-0.0089111328125,
0.0178375244140625,
0.06463623046875,
-0.01873779296875,
-0.002681732177734375,
-0.03485107421875,
-0.021575927734375,
0.0241546630859375,
-0.0000029802322387695312,
0.074951171875,
0.01316070556640625,
-0.034210205078125,
0.018157958984375,
-0.0538330078125,
0.010955810546875,
0.032196044921875,
-0.023223876953125,
-0.0138397216796875,
-0.0301666259765625,
0.00909423828125,
0.0279998779296875,
0.0064697265625,
-0.048492431640625,
0.030364990234375,
-0.007965087890625,
0.040008544921875,
0.03375244140625,
-0.0143585205078125,
0.0301361083984375,
-0.0220794677734375,
0.022735595703125,
0.022216796875,
0.032318115234375,
-0.0108795166015625,
-0.046600341796875,
-0.06451416015625,
-0.036407470703125,
0.0223388671875,
0.0443115234375,
-0.046783447265625,
0.026763916015625,
-0.023284912109375,
-0.06640625,
-0.03399658203125,
0.005329132080078125,
0.03021240234375,
0.04022216796875,
0.0208587646484375,
-0.037261962890625,
-0.037872314453125,
-0.06658935546875,
0.000007987022399902344,
0.0016126632690429688,
0.0038318634033203125,
0.02984619140625,
0.056915283203125,
-0.0184783935546875,
0.057830810546875,
-0.0262908935546875,
-0.0198516845703125,
-0.0124359130859375,
0.007030487060546875,
0.02423095703125,
0.061279296875,
0.06292724609375,
-0.06463623046875,
-0.03790283203125,
-0.006740570068359375,
-0.0748291015625,
0.01532745361328125,
-0.0131378173828125,
-0.0012540817260742188,
0.00391387939453125,
0.01042938232421875,
-0.048919677734375,
0.058380126953125,
0.01861572265625,
-0.022430419921875,
0.035064697265625,
-0.01361846923828125,
0.02203369140625,
-0.08984375,
0.0022220611572265625,
0.032257080078125,
-0.017333984375,
-0.032684326171875,
-0.00786590576171875,
0.01207733154296875,
-0.0077056884765625,
-0.050384521484375,
0.0430908203125,
-0.0345458984375,
-0.0105743408203125,
-0.009246826171875,
-0.0183868408203125,
-0.005054473876953125,
0.04815673828125,
-0.0135040283203125,
0.039764404296875,
0.06427001953125,
-0.04913330078125,
0.0391845703125,
0.034637451171875,
-0.01715087890625,
0.0260772705078125,
-0.045989990234375,
0.004047393798828125,
-0.0015497207641601562,
0.017120361328125,
-0.06884765625,
-0.01480865478515625,
0.028900146484375,
-0.051361083984375,
0.0309600830078125,
-0.046600341796875,
-0.0236358642578125,
-0.04852294921875,
-0.041015625,
0.031158447265625,
0.0545654296875,
-0.055206298828125,
0.041900634765625,
0.02496337890625,
0.0146331787109375,
-0.045867919921875,
-0.06365966796875,
-0.0215301513671875,
-0.031585693359375,
-0.06011962890625,
0.032073974609375,
0.014923095703125,
0.008819580078125,
-0.0018396377563476562,
-0.01324462890625,
-0.01371002197265625,
-0.01371002197265625,
0.060882568359375,
0.0328369140625,
-0.0187835693359375,
-0.00893402099609375,
-0.017730712890625,
-0.001468658447265625,
-0.000057697296142578125,
-0.0256805419921875,
0.03631591796875,
-0.026214599609375,
-0.00881195068359375,
-0.061920166015625,
-0.01715087890625,
0.051300048828125,
-0.0178680419921875,
0.057830810546875,
0.08251953125,
-0.030853271484375,
0.00687408447265625,
-0.028076171875,
-0.0045623779296875,
-0.037261962890625,
0.02569580078125,
-0.04193115234375,
-0.033050537109375,
0.06341552734375,
-0.00443267822265625,
0.00490570068359375,
0.05072021484375,
0.03302001953125,
-0.00502777099609375,
0.060821533203125,
0.049591064453125,
0.005931854248046875,
0.05169677734375,
-0.07025146484375,
-0.016754150390625,
-0.061676025390625,
-0.0494384765625,
-0.0288848876953125,
-0.028076171875,
-0.0557861328125,
-0.031646728515625,
0.030517578125,
0.0137939453125,
-0.0246429443359375,
0.03863525390625,
-0.06536865234375,
0.0055084228515625,
0.0450439453125,
0.0418701171875,
-0.0294952392578125,
0.0231475830078125,
-0.03302001953125,
-0.0033550262451171875,
-0.05462646484375,
-0.0262603759765625,
0.080078125,
0.035736083984375,
0.055633544921875,
-0.01355743408203125,
0.04754638671875,
-0.016998291015625,
0.02410888671875,
-0.045745849609375,
0.04449462890625,
-0.01026153564453125,
-0.02587890625,
-0.0024433135986328125,
-0.0209197998046875,
-0.0712890625,
0.016510009765625,
-0.03594970703125,
-0.060699462890625,
0.019744873046875,
0.01148223876953125,
-0.0232391357421875,
0.054962158203125,
-0.062103271484375,
0.07342529296875,
-0.01239013671875,
-0.036956787109375,
0.01380157470703125,
-0.052764892578125,
0.0278472900390625,
0.0148162841796875,
-0.00928497314453125,
-0.006977081298828125,
0.0195770263671875,
0.08172607421875,
-0.04705810546875,
0.059295654296875,
-0.037841796875,
0.03228759765625,
0.050445556640625,
-0.0139923095703125,
0.027069091796875,
-0.00010031461715698242,
-0.006805419921875,
0.0213165283203125,
0.003360748291015625,
-0.0347900390625,
-0.036895751953125,
0.0462646484375,
-0.0643310546875,
-0.0163726806640625,
-0.029144287109375,
-0.0263824462890625,
0.019989013671875,
0.005779266357421875,
0.04583740234375,
0.047271728515625,
0.00969696044921875,
0.01904296875,
0.050628662109375,
-0.036224365234375,
0.03472900390625,
-0.013092041015625,
-0.0186309814453125,
-0.030029296875,
0.07318115234375,
0.005771636962890625,
0.001316070556640625,
0.00443267822265625,
0.008453369140625,
-0.025726318359375,
-0.041595458984375,
-0.036468505859375,
0.0163421630859375,
-0.0438232421875,
-0.0246429443359375,
-0.044097900390625,
-0.0311431884765625,
-0.032196044921875,
-0.008056640625,
-0.040313720703125,
-0.0296478271484375,
-0.0296478271484375,
0.0184478759765625,
0.045928955078125,
0.042083740234375,
-0.0025005340576171875,
0.050384521484375,
-0.05316162109375,
0.01055145263671875,
0.0168304443359375,
0.0261688232421875,
-0.015777587890625,
-0.06304931640625,
-0.0242462158203125,
-0.0012979507446289062,
-0.042266845703125,
-0.0499267578125,
0.04083251953125,
0.004276275634765625,
0.032196044921875,
0.033447265625,
-0.0240478515625,
0.05828857421875,
-0.0042877197265625,
0.053802490234375,
0.039154052734375,
-0.0499267578125,
0.047210693359375,
-0.00926971435546875,
0.0164031982421875,
0.01314544677734375,
0.028900146484375,
-0.0138397216796875,
0.01108551025390625,
-0.061767578125,
-0.051971435546875,
0.051422119140625,
0.01751708984375,
-0.002979278564453125,
0.0379638671875,
0.05584716796875,
0.00023472309112548828,
0.0012731552124023438,
-0.06378173828125,
-0.036468505859375,
-0.03857421875,
-0.025665283203125,
0.00498199462890625,
-0.0069427490234375,
0.01239013671875,
-0.060760498046875,
0.045684814453125,
0.005214691162109375,
0.054290771484375,
0.029937744140625,
-0.0084075927734375,
0.006305694580078125,
-0.0265655517578125,
0.050506591796875,
0.01910400390625,
-0.02313232421875,
-0.0013914108276367188,
0.0237274169921875,
-0.05950927734375,
0.01511383056640625,
0.0033092498779296875,
-0.002727508544921875,
0.00786590576171875,
0.0180816650390625,
0.0675048828125,
-0.006740570068359375,
0.0016927719116210938,
0.042694091796875,
-0.011138916015625,
-0.037078857421875,
-0.023529052734375,
0.01165008544921875,
-0.00539398193359375,
0.0250091552734375,
0.0288238525390625,
0.036224365234375,
-0.006084442138671875,
-0.0178375244140625,
0.016845703125,
0.041473388671875,
-0.033599853515625,
-0.0239410400390625,
0.060150146484375,
-0.0032863616943359375,
-0.0147247314453125,
0.05279541015625,
-0.0086822509765625,
-0.04486083984375,
0.0777587890625,
0.0248870849609375,
0.0648193359375,
-0.017486572265625,
0.004024505615234375,
0.06402587890625,
0.024688720703125,
0.0007700920104980469,
0.017181396484375,
0.00547027587890625,
-0.049957275390625,
0.0016155242919921875,
-0.041839599609375,
0.0063018798828125,
0.0212554931640625,
-0.040740966796875,
0.0294647216796875,
-0.045318603515625,
-0.032135009765625,
0.0182647705078125,
0.0159149169921875,
-0.06475830078125,
0.030975341796875,
-0.006317138671875,
0.065185546875,
-0.05731201171875,
0.07244873046875,
0.067626953125,
-0.04248046875,
-0.072021484375,
-0.006488800048828125,
-0.00037169456481933594,
-0.06500244140625,
0.050567626953125,
0.03741455078125,
0.0008339881896972656,
0.0010099411010742188,
-0.06463623046875,
-0.041839599609375,
0.10321044921875,
0.0247955322265625,
-0.01323699951171875,
0.0215301513671875,
0.0023632049560546875,
-0.0002722740173339844,
-0.041748046875,
0.033935546875,
0.01263427734375,
0.02203369140625,
0.0180511474609375,
-0.058929443359375,
0.00945281982421875,
-0.021270751953125,
0.008636474609375,
0.01544189453125,
-0.06048583984375,
0.06524658203125,
-0.045867919921875,
-0.006084442138671875,
-0.0016260147094726562,
0.04644775390625,
0.00899505615234375,
0.0233612060546875,
0.037841796875,
0.050994873046875,
0.039703369140625,
-0.016937255859375,
0.062225341796875,
0.006252288818359375,
0.047210693359375,
0.039703369140625,
0.0243377685546875,
0.039520263671875,
0.027313232421875,
-0.01161956787109375,
0.03314208984375,
0.08538818359375,
-0.02777099609375,
0.0281219482421875,
0.01123809814453125,
-0.01007843017578125,
-0.002727508544921875,
0.01390838623046875,
-0.032257080078125,
0.047576904296875,
0.0166015625,
-0.04168701171875,
-0.005115509033203125,
0.021942138671875,
-0.005756378173828125,
-0.0298614501953125,
-0.0260009765625,
0.0284881591796875,
0.003322601318359375,
-0.025177001953125,
0.071044921875,
0.01178741455078125,
0.07244873046875,
-0.02520751953125,
0.004894256591796875,
-0.0116119384765625,
0.01332855224609375,
-0.0362548828125,
-0.05279541015625,
0.0211029052734375,
-0.0166473388671875,
-0.004512786865234375,
0.00482177734375,
0.06317138671875,
-0.020111083984375,
-0.032135009765625,
0.004673004150390625,
0.0167999267578125,
0.03948974609375,
-0.0013513565063476562,
-0.0927734375,
0.012359619140625,
0.0180206298828125,
-0.04144287109375,
0.0176849365234375,
0.018341064453125,
-0.0001246929168701172,
0.06640625,
0.0477294921875,
-0.0170135498046875,
0.0189056396484375,
-0.0255584716796875,
0.0635986328125,
-0.04888916015625,
-0.0208740234375,
-0.06787109375,
0.0596923828125,
-0.0113677978515625,
-0.042938232421875,
0.04443359375,
0.055877685546875,
0.055816650390625,
-0.0079345703125,
0.044464111328125,
-0.0180206298828125,
-0.0022029876708984375,
-0.0310211181640625,
0.05841064453125,
-0.05401611328125,
-0.0032482147216796875,
-0.0056304931640625,
-0.04583740234375,
-0.0255889892578125,
0.06500244140625,
-0.015838623046875,
0.0184783935546875,
0.036224365234375,
0.07269287109375,
-0.036163330078125,
-0.0214691162109375,
0.0168304443359375,
0.004467010498046875,
-0.0018968582153320312,
0.0287628173828125,
0.0272369384765625,
-0.06982421875,
0.035919189453125,
-0.03497314453125,
-0.021820068359375,
-0.0178070068359375,
-0.0443115234375,
-0.078369140625,
-0.06658935546875,
-0.04315185546875,
-0.05340576171875,
-0.0173187255859375,
0.06817626953125,
0.08807373046875,
-0.041015625,
-0.01137542724609375,
0.00994873046875,
0.01380157470703125,
-0.006832122802734375,
-0.0147552490234375,
0.04388427734375,
0.0025653839111328125,
-0.04693603515625,
-0.0188751220703125,
0.0004563331604003906,
0.03546142578125,
0.00618743896484375,
-0.01849365234375,
-0.0082550048828125,
-0.0166473388671875,
0.03472900390625,
0.031951904296875,
-0.0418701171875,
-0.004970550537109375,
-0.018218994140625,
-0.0185089111328125,
0.03497314453125,
0.0406494140625,
-0.0445556640625,
0.0192108154296875,
0.019500732421875,
0.022552490234375,
0.0748291015625,
-0.021453857421875,
-0.0067291259765625,
-0.059112548828125,
0.0501708984375,
-0.0185699462890625,
0.032440185546875,
0.0252685546875,
-0.0200958251953125,
0.038665771484375,
0.0299072265625,
-0.03302001953125,
-0.06256103515625,
-0.006549835205078125,
-0.09173583984375,
-0.0038127899169921875,
0.06488037109375,
-0.021881103515625,
-0.041595458984375,
0.0233001708984375,
-0.0111083984375,
0.045318603515625,
-0.007579803466796875,
0.030059814453125,
0.01227569580078125,
-0.00969696044921875,
-0.046112060546875,
-0.05035400390625,
0.035369873046875,
0.0028591156005859375,
-0.049072265625,
-0.044281005859375,
-0.01320648193359375,
0.059661865234375,
0.018646240234375,
0.045318603515625,
-0.01038360595703125,
0.0096435546875,
0.00774383544921875,
0.045867919921875,
-0.0230560302734375,
0.0015687942504882812,
-0.0218353271484375,
-0.0020694732666015625,
-0.0159149169921875,
-0.06451416015625
]
] |
jackaduma/SecRoBERTa | 2023-06-26T05:55:27.000Z | [
"transformers",
"pytorch",
"safetensors",
"roberta",
"fill-mask",
"exbert",
"security",
"cybersecurity",
"cyber security",
"threat hunting",
"threat intelligence",
"en",
"dataset:APTnotes",
"dataset:Stucco-Data",
"dataset:CASIE",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | fill-mask | jackaduma | null | null | jackaduma/SecRoBERTa | 8 | 33,106 | transformers | 2022-03-02T23:29:05 | ---
language: en
thumbnail: https://github.com/jackaduma
tags:
- exbert
- security
- cybersecurity
- cyber security
- threat hunting
- threat intelligence
license: apache-2.0
datasets:
- APTnotes
- Stucco-Data
- CASIE
---
# SecRoBERTa
This is the pretrained model presented in [SecBERT: A Pretrained Language Model for Cyber Security Text](https://github.com/jackaduma/SecBERT/), which is a SecRoBERTa model trained on cyber security text.
The training corpus consists of papers and reports taken from:
* [APTnotes](https://github.com/kbandla/APTnotes)
* [Stucco-Data: Cyber security data sources](https://stucco.github.io/data/)
* [CASIE: Extracting Cybersecurity Event Information from Text](https://ebiquity.umbc.edu/_file_directory_/papers/943.pdf)
* [SemEval-2018 Task 8: Semantic Extraction from CybersecUrity REports using Natural Language Processing (SecureNLP)](https://competitions.codalab.org/competitions/17262).
SecRoBERTa has its own wordpiece vocabulary (secvocab) that's built to best match the training corpus.
We trained [SecBERT](https://huggingface.co/jackaduma/SecBERT) and [SecRoBERTa](https://huggingface.co/jackaduma/SecRoBERTa) versions.
Available models include:
* [`SecBERT`](https://huggingface.co/jackaduma/SecBERT)
* [`SecRoBERTa`](https://huggingface.co/jackaduma/SecRoBERTa)
---
## **Fill Mask**
We propose a language model that works on cyber security text; as a result, it can improve downstream tasks (NER, text classification, semantic understanding, Q&A) in the cyber security domain.
First, the image below compares the Fill-Mask pipeline of Google BERT, [AllenAI SciBert](https://github.com/allenai/scibert), and our [SecBERT](https://github.com/jackaduma/SecBERT).

---
The original repo can be found [here](https://github.com/jackaduma/SecBERT). | 1,943 | [
[
-0.0261688232421875,
-0.051025390625,
0.024169921875,
-0.00186920166015625,
-0.0298309326171875,
0.02178955078125,
-0.005901336669921875,
-0.04815673828125,
0.032623291015625,
0.048980712890625,
-0.043731689453125,
-0.04632568359375,
-0.0469970703125,
-0.004688262939453125,
-0.0299530029296875,
0.0799560546875,
0.00293731689453125,
0.00505828857421875,
0.004451751708984375,
0.009185791015625,
-0.0291290283203125,
-0.049713134765625,
-0.037078857421875,
-0.01372528076171875,
0.0140380859375,
0.0203399658203125,
0.035308837890625,
0.019073486328125,
0.040435791015625,
0.026153564453125,
-0.01038360595703125,
-0.0026226043701171875,
-0.0216217041015625,
-0.0116729736328125,
-0.00702667236328125,
-0.0218048095703125,
-0.0209503173828125,
0.0123291015625,
0.0208892822265625,
0.011077880859375,
-0.002269744873046875,
0.0174407958984375,
-0.005107879638671875,
0.0242767333984375,
-0.062164306640625,
-0.004405975341796875,
-0.053436279296875,
-0.01318359375,
-0.0203399658203125,
-0.00992584228515625,
-0.034942626953125,
-0.0135955810546875,
0.0244903564453125,
-0.04345703125,
0.016387939453125,
0.00518035888671875,
0.0999755859375,
0.0082855224609375,
-0.0099945068359375,
-0.0185546875,
-0.043243408203125,
0.037506103515625,
-0.047943115234375,
0.0287933349609375,
0.025543212890625,
0.01617431640625,
0.00423431396484375,
-0.0865478515625,
-0.044403076171875,
-0.0127716064453125,
-0.0030498504638671875,
0.008941650390625,
-0.0311279296875,
-0.0031833648681640625,
0.0177459716796875,
0.007144927978515625,
-0.043487548828125,
0.0035114288330078125,
-0.07281494140625,
-0.0369873046875,
0.043670654296875,
-0.0098419189453125,
0.024749755859375,
-0.0019741058349609375,
-0.048004150390625,
-0.0241546630859375,
-0.048095703125,
0.01497650146484375,
0.0288848876953125,
0.01519012451171875,
-0.0262298583984375,
0.01259613037109375,
-0.016876220703125,
0.048095703125,
-0.016265869140625,
-0.00438690185546875,
0.037506103515625,
-0.017364501953125,
-0.031951904296875,
0.004535675048828125,
0.0679931640625,
0.002685546875,
0.0146026611328125,
-0.0081634521484375,
-0.00258636474609375,
0.01209259033203125,
0.047332763671875,
-0.07830810546875,
-0.039459228515625,
0.0167694091796875,
-0.0198516845703125,
-0.040802001953125,
0.01309967041015625,
-0.04833984375,
-0.007415771484375,
-0.0305938720703125,
0.051666259765625,
-0.048797607421875,
-0.00569915771484375,
0.01096343994140625,
-0.0064544677734375,
0.0177459716796875,
0.022857666015625,
-0.03240966796875,
0.02001953125,
0.0499267578125,
0.060028076171875,
-0.00009006261825561523,
-0.01079559326171875,
-0.0243988037109375,
-0.016632080078125,
-0.0134429931640625,
0.044189453125,
-0.0357666015625,
0.007183074951171875,
-0.0002124309539794922,
0.00827789306640625,
-0.0073089599609375,
-0.00824737548828125,
0.0494384765625,
-0.0684814453125,
0.037933349609375,
0.004894256591796875,
-0.0301971435546875,
-0.0280303955078125,
0.00937652587890625,
-0.040283203125,
0.04949951171875,
-0.0005788803100585938,
-0.08721923828125,
0.019317626953125,
-0.04547119140625,
-0.0019207000732421875,
0.0175018310546875,
-0.00281524658203125,
-0.033660888671875,
-0.019927978515625,
0.007293701171875,
0.0290679931640625,
-0.01861572265625,
0.0167999267578125,
-0.031280517578125,
-0.0239715576171875,
0.0345458984375,
0.0008435249328613281,
0.087890625,
0.0251922607421875,
-0.0285797119140625,
0.019683837890625,
-0.051727294921875,
-0.0024871826171875,
0.0025653839111328125,
0.0172882080078125,
-0.0259246826171875,
-0.012969970703125,
0.006732940673828125,
0.0264434814453125,
0.0338134765625,
-0.052459716796875,
-0.01202392578125,
-0.042755126953125,
0.0271453857421875,
0.0574951171875,
-0.0025730133056640625,
0.0362548828125,
-0.011871337890625,
0.052734375,
0.00725555419921875,
0.022064208984375,
-0.0098114013671875,
-0.0186004638671875,
-0.059814453125,
-0.05279541015625,
0.040924072265625,
0.066162109375,
-0.060150146484375,
0.027862548828125,
-0.009979248046875,
-0.0655517578125,
-0.037994384765625,
-0.01016998291015625,
0.0302734375,
0.04010009765625,
0.02899169921875,
-0.0318603515625,
-0.049560546875,
-0.068603515625,
0.00672149658203125,
-0.01580810546875,
-0.002979278564453125,
-0.00891876220703125,
0.04998779296875,
-0.0244903564453125,
0.06646728515625,
-0.056182861328125,
-0.009185791015625,
-0.0080413818359375,
0.01073455810546875,
0.01025390625,
0.0302276611328125,
0.03857421875,
-0.0498046875,
-0.01580810546875,
-0.01837158203125,
-0.045562744140625,
-0.00960540771484375,
0.0024261474609375,
-0.0175323486328125,
0.00399017333984375,
0.0435791015625,
-0.0293731689453125,
0.036041259765625,
0.03521728515625,
-0.037017822265625,
0.054901123046875,
0.0003390312194824219,
0.00020992755889892578,
-0.11273193359375,
0.0202178955078125,
-0.014404296875,
-0.0223541259765625,
-0.06646728515625,
0.036834716796875,
0.01202392578125,
-0.027099609375,
-0.052337646484375,
0.0435791015625,
-0.0298919677734375,
0.01320648193359375,
-0.0211181640625,
0.0252227783203125,
-0.006710052490234375,
0.04583740234375,
0.0082244873046875,
0.06414794921875,
0.05804443359375,
-0.04400634765625,
0.016998291015625,
0.02545166015625,
-0.0142974853515625,
0.01332855224609375,
-0.06988525390625,
0.00679779052734375,
0.003696441650390625,
0.01222991943359375,
-0.055267333984375,
-0.020172119140625,
0.0435791015625,
-0.046112060546875,
0.02264404296875,
0.00121307373046875,
-0.03155517578125,
-0.018951416015625,
-0.040863037109375,
0.02105712890625,
0.037322998046875,
-0.024871826171875,
0.0215911865234375,
0.059783935546875,
-0.01385498046875,
-0.058074951171875,
-0.03851318359375,
0.003208160400390625,
-0.0207672119140625,
-0.052093505859375,
0.033721923828125,
0.0001926422119140625,
-0.005462646484375,
-0.01248931884765625,
0.004077911376953125,
-0.032623291015625,
0.0272216796875,
0.025634765625,
0.05438232421875,
-0.0110015869140625,
0.014739990234375,
-0.0201568603515625,
-0.0034694671630859375,
0.00882720947265625,
-0.008941650390625,
0.053802490234375,
-0.033721923828125,
-0.0205230712890625,
-0.04107666015625,
0.0004038810729980469,
0.01580810546875,
-0.0313720703125,
0.06866455078125,
0.053558349609375,
-0.0299835205078125,
-0.003627777099609375,
-0.03790283203125,
-0.0338134765625,
-0.037109375,
0.0008115768432617188,
-0.0110015869140625,
-0.09930419921875,
0.042938232421875,
0.006572723388671875,
-0.003650665283203125,
0.0419921875,
0.0232086181640625,
0.00226593017578125,
0.0240325927734375,
0.04974365234375,
-0.01404571533203125,
0.042144775390625,
-0.026947021484375,
0.021270751953125,
-0.059906005859375,
-0.010528564453125,
-0.051910400390625,
0.00130462646484375,
-0.0643310546875,
-0.0173492431640625,
0.0166778564453125,
0.0267791748046875,
-0.020050048828125,
0.07470703125,
-0.049652099609375,
0.01500701904296875,
0.036865234375,
0.0160369873046875,
-0.005664825439453125,
0.003147125244140625,
-0.0085296630859375,
0.0007905960083007812,
-0.053192138671875,
-0.05145263671875,
0.07415771484375,
0.038848876953125,
0.04193115234375,
0.006603240966796875,
0.073486328125,
0.0064849853515625,
0.01131439208984375,
-0.02264404296875,
0.043731689453125,
0.0035266876220703125,
-0.048980712890625,
-0.01380157470703125,
-0.0275726318359375,
-0.093505859375,
0.016357421875,
-0.0207366943359375,
-0.09375,
0.0167236328125,
0.0016279220581054688,
-0.024444580078125,
0.009674072265625,
-0.034423828125,
0.07452392578125,
0.0149688720703125,
-0.01346588134765625,
-0.0236358642578125,
-0.054840087890625,
0.0482177734375,
-0.0174407958984375,
0.0193939208984375,
-0.0020923614501953125,
0.0027179718017578125,
0.0677490234375,
-0.034881591796875,
0.08221435546875,
0.006908416748046875,
0.005641937255859375,
0.03839111328125,
-0.0009479522705078125,
0.033111572265625,
0.00872802734375,
0.0153045654296875,
0.0231781005859375,
0.01529693603515625,
-0.0124359130859375,
-0.0167236328125,
0.0251922607421875,
-0.064453125,
-0.0300140380859375,
-0.06451416015625,
-0.00156402587890625,
0.01073455810546875,
0.0145263671875,
0.039764404296875,
0.050262451171875,
-0.02001953125,
0.026947021484375,
0.06451416015625,
-0.0222015380859375,
0.031982421875,
0.0498046875,
-0.0161590576171875,
-0.03411865234375,
0.052459716796875,
-0.006305694580078125,
0.006893157958984375,
0.039703369140625,
-0.004650115966796875,
-0.01206207275390625,
-0.0273590087890625,
-0.0166168212890625,
0.0235443115234375,
-0.04425048828125,
-0.033355712890625,
-0.06890869140625,
-0.0643310546875,
-0.046142578125,
-0.01611328125,
-0.00881195068359375,
-0.046478271484375,
-0.045928955078125,
-0.003620147705078125,
0.03936767578125,
0.01392364501953125,
-0.0032196044921875,
0.0445556640625,
-0.07025146484375,
0.00881195068359375,
0.0072784423828125,
-0.00699615478515625,
-0.0211029052734375,
-0.05926513671875,
-0.0207366943359375,
-0.0124969482421875,
-0.037078857421875,
-0.08935546875,
0.0406494140625,
-0.003681182861328125,
0.037078857421875,
0.0181884765625,
0.01015472412109375,
0.02874755859375,
-0.0280609130859375,
0.065185546875,
0.015838623046875,
-0.0762939453125,
0.05023193359375,
-0.020843505859375,
0.0245819091796875,
0.056060791015625,
0.03179931640625,
-0.035125732421875,
-0.0440673828125,
-0.08587646484375,
-0.07659912109375,
0.06744384765625,
0.040863037109375,
0.011749267578125,
0.00653839111328125,
-0.0061798095703125,
-0.01300811767578125,
0.0222625732421875,
-0.06439208984375,
-0.00916290283203125,
-0.004543304443359375,
-0.00618743896484375,
0.01392364501953125,
-0.0307464599609375,
-0.01800537109375,
0.01271820068359375,
0.0736083984375,
0.0024776458740234375,
0.034027099609375,
0.0121612548828125,
-0.0207366943359375,
0.0277252197265625,
0.026702880859375,
0.06732177734375,
0.05322265625,
-0.01006317138671875,
0.003307342529296875,
0.0271453857421875,
-0.049224853515625,
-0.00943756103515625,
0.018829345703125,
-0.0290069580078125,
0.0145721435546875,
0.05194091796875,
0.07061767578125,
0.00850677490234375,
-0.05120849609375,
0.0633544921875,
-0.00437164306640625,
-0.007781982421875,
-0.057342529296875,
0.015716552734375,
-0.00490570068359375,
0.016632080078125,
0.01444244384765625,
0.005767822265625,
0.0201416015625,
-0.0246124267578125,
0.03997802734375,
0.0283966064453125,
-0.038177490234375,
-0.0291290283203125,
0.04046630859375,
0.011505126953125,
-0.025848388671875,
0.02655029296875,
-0.0360107421875,
-0.06414794921875,
0.033233642578125,
0.0300750732421875,
0.0892333984375,
-0.0109100341796875,
0.040008544921875,
0.043182373046875,
0.045928955078125,
0.0162811279296875,
0.03253173828125,
-0.005756378173828125,
-0.055755615234375,
-0.029693603515625,
-0.06304931640625,
-0.009979248046875,
0.019927978515625,
-0.0255584716796875,
0.03167724609375,
-0.048980712890625,
-0.009002685546875,
-0.0017223358154296875,
-0.0009937286376953125,
-0.054046630859375,
0.0188751220703125,
-0.005550384521484375,
0.08502197265625,
-0.06298828125,
0.058349609375,
0.03668212890625,
-0.0322265625,
-0.07568359375,
-0.01128387451171875,
-0.006320953369140625,
-0.049835205078125,
0.05950927734375,
0.01233673095703125,
0.002696990966796875,
-0.033935546875,
-0.0460205078125,
-0.07958984375,
0.08221435546875,
0.016448974609375,
-0.03448486328125,
0.01074981689453125,
0.003173828125,
0.049163818359375,
-0.03765869140625,
0.02197265625,
0.013519287109375,
0.0081634521484375,
0.00893402099609375,
-0.06646728515625,
0.0135955810546875,
-0.02252197265625,
-0.0032558441162109375,
0.01447296142578125,
-0.043304443359375,
0.06573486328125,
0.01158905029296875,
-0.024749755859375,
-0.007465362548828125,
0.0653076171875,
0.0298004150390625,
0.0299530029296875,
0.0287017822265625,
0.053466796875,
0.050201416015625,
-0.0034694671630859375,
0.051666259765625,
-0.035064697265625,
0.018951416015625,
0.10052490234375,
0.0031414031982421875,
0.06402587890625,
0.01006317138671875,
0.00150299072265625,
0.055633544921875,
0.053680419921875,
-0.0026302337646484375,
0.03680419921875,
0.01324462890625,
-0.0177459716796875,
-0.0029087066650390625,
0.0035266876220703125,
-0.033721923828125,
0.04998779296875,
0.017486572265625,
-0.033355712890625,
-0.009185791015625,
-0.00971221923828125,
0.02276611328125,
-0.010009765625,
-0.0015869140625,
0.058135986328125,
0.006992340087890625,
-0.03277587890625,
0.037139892578125,
0.01396942138671875,
0.036895751953125,
-0.053741455078125,
0.018646240234375,
0.01561737060546875,
-0.00856781005859375,
0.0026302337646484375,
-0.035491943359375,
0.035614013671875,
0.01165008544921875,
0.01407623291015625,
-0.0002906322479248047,
0.0767822265625,
-0.0275115966796875,
-0.042724609375,
0.0157470703125,
0.0033855438232421875,
0.0246124267578125,
0.00563812255859375,
-0.0509033203125,
0.005168914794921875,
-0.00009459257125854492,
-0.00693511962890625,
0.01490020751953125,
0.023773193359375,
0.0010223388671875,
0.040863037109375,
0.05841064453125,
0.00933074951171875,
-0.00812530517578125,
-0.012725830078125,
0.07171630859375,
-0.0240478515625,
-0.0518798828125,
-0.06427001953125,
0.04571533203125,
-0.0418701171875,
-0.040924072265625,
0.04803466796875,
0.0419921875,
0.068359375,
-0.0164947509765625,
0.037078857421875,
-0.006244659423828125,
0.0287933349609375,
-0.048248291015625,
0.049163818359375,
-0.0310516357421875,
0.007259368896484375,
-0.033935546875,
-0.06201171875,
0.000640869140625,
0.043701171875,
-0.01190948486328125,
-0.019317626953125,
0.055755615234375,
0.080810546875,
-0.00884246826171875,
0.00336456298828125,
0.01076507568359375,
-0.00223541259765625,
0.00881195068359375,
0.01971435546875,
0.047515869140625,
-0.048675537109375,
0.02557373046875,
-0.01206207275390625,
-0.024871826171875,
-0.0214385986328125,
-0.06304931640625,
-0.068603515625,
-0.05950927734375,
-0.035888671875,
-0.036041259765625,
0.00351715087890625,
0.068115234375,
0.06732177734375,
-0.0687255859375,
-0.0031280517578125,
0.0001270771026611328,
0.0005135536193847656,
0.00579071044921875,
-0.015960693359375,
0.0174713134765625,
-0.0274505615234375,
-0.0254364013671875,
0.0203094482421875,
0.007724761962890625,
0.0092315673828125,
-0.00498199462890625,
0.00876617431640625,
-0.0215301513671875,
0.0205078125,
0.0308380126953125,
0.0207977294921875,
-0.044464111328125,
-0.031036376953125,
0.0038394927978515625,
-0.020751953125,
-0.0239105224609375,
0.04510498046875,
-0.050933837890625,
0.014190673828125,
0.0572509765625,
0.052581787109375,
0.047515869140625,
-0.0269317626953125,
0.0309600830078125,
-0.06402587890625,
0.0264739990234375,
0.010009765625,
0.0313720703125,
0.0212554931640625,
-0.0233154296875,
0.05419921875,
0.0169219970703125,
-0.039276123046875,
-0.056976318359375,
0.0232696533203125,
-0.0750732421875,
-0.032012939453125,
0.07037353515625,
-0.01593017578125,
-0.024993896484375,
0.010528564453125,
-0.01043701171875,
0.0311431884765625,
-0.0257568359375,
0.05902099609375,
0.050262451171875,
0.0282745361328125,
-0.0114898681640625,
-0.023590087890625,
0.04998779296875,
0.005168914794921875,
-0.06768798828125,
-0.0139312744140625,
0.0247802734375,
0.01486968994140625,
0.0445556640625,
0.058319091796875,
0.0008449554443359375,
0.02178955078125,
-0.01425933837890625,
0.01470947265625,
0.004730224609375,
-0.04241943359375,
-0.0206298828125,
0.01302337646484375,
-0.0167694091796875,
-0.025848388671875
]
] |
google/owlvit-large-patch14 | 2023-10-23T09:20:09.000Z | [
"transformers",
"pytorch",
"owlvit",
"zero-shot-object-detection",
"vision",
"object-detection",
"arxiv:2205.06230",
"license:apache-2.0",
"has_space",
"region:us"
] | object-detection | google | null | null | google/owlvit-large-patch14 | 15 | 33,104 | transformers | 2022-07-05T07:12:49 | ---
license: apache-2.0
tags:
- vision
- object-detection
inference: false
---
# Model Card: OWL-ViT
## Model Details
The OWL-ViT (short for Vision Transformer for Open-World Localization) was proposed in [Simple Open-Vocabulary Object Detection with Vision Transformers](https://arxiv.org/abs/2205.06230) by Matthias Minderer, Alexey Gritsenko, Austin Stone, Maxim Neumann, Dirk Weissenborn, Alexey Dosovitskiy, Aravindh Mahendran, Anurag Arnab, Mostafa Dehghani, Zhuoran Shen, Xiao Wang, Xiaohua Zhai, Thomas Kipf, and Neil Houlsby. OWL-ViT is a zero-shot text-conditioned object detection model that can be used to query an image with one or multiple text queries.
OWL-ViT uses CLIP as its multi-modal backbone, with a ViT-like Transformer to get visual features and a causal language model to get the text features. To use CLIP for detection, OWL-ViT removes the final token pooling layer of the vision model and attaches a lightweight classification and box head to each transformer output token. Open-vocabulary classification is enabled by replacing the fixed classification layer weights with the class-name embeddings obtained from the text model. The authors first train CLIP from scratch and fine-tune it end-to-end with the classification and box heads on standard detection datasets using a bipartite matching loss. One or multiple text queries per image can be used to perform zero-shot text-conditioned object detection.
### Model Date
May 2022
### Model Type
The model uses a CLIP backbone with a ViT-L/14 Transformer architecture as an image encoder and uses a masked self-attention Transformer as a text encoder. These encoders are trained to maximize the similarity of (image, text) pairs via a contrastive loss. The CLIP backbone is trained from scratch and fine-tuned together with the box and class prediction heads with an object detection objective.
### Documents
- [OWL-ViT Paper](https://arxiv.org/abs/2205.06230)
### Use with Transformers
```python3
import requests
from PIL import Image
import torch
from transformers import OwlViTProcessor, OwlViTForObjectDetection
processor = OwlViTProcessor.from_pretrained("google/owlvit-large-patch14")
model = OwlViTForObjectDetection.from_pretrained("google/owlvit-large-patch14")
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)
texts = [["a photo of a cat", "a photo of a dog"]]
inputs = processor(text=texts, images=image, return_tensors="pt")
outputs = model(**inputs)
# Target image sizes (height, width) to rescale box predictions [batch_size, 2]
target_sizes = torch.Tensor([image.size[::-1]])
# Convert outputs (bounding boxes and class logits) to COCO API
results = processor.post_process_object_detection(outputs=outputs, threshold=0.1, target_sizes=target_sizes)
i = 0 # Retrieve predictions for the first image for the corresponding text queries
text = texts[i]
boxes, scores, labels = results[i]["boxes"], results[i]["scores"], results[i]["labels"]
# Print detected objects and rescaled box coordinates
for box, score, label in zip(boxes, scores, labels):
    box = [round(coord, 2) for coord in box.tolist()]
    print(f"Detected {text[label]} with confidence {round(score.item(), 3)} at location {box}")
```
## Model Use
### Intended Use
The model is intended as a research output for research communities. We hope that this model will enable researchers to better understand and explore zero-shot, text-conditioned object detection. We also hope it can be used for interdisciplinary studies of the potential impact of such models, especially in areas that commonly require identifying objects whose label is unavailable during training.
#### Primary intended uses
The primary intended users of these models are AI researchers.
We primarily imagine the model will be used by researchers to better understand robustness, generalization, and other capabilities, biases, and constraints of computer vision models.
## Data
The CLIP backbone of the model was trained on publicly available image-caption data. This was done through a combination of crawling a handful of websites and using commonly-used pre-existing image datasets such as [YFCC100M](http://projects.dfki.uni-kl.de/yfcc100m/). A large portion of the data comes from our crawling of the internet. This means that the data is more representative of people and societies most connected to the internet. The prediction heads of OWL-ViT, along with the CLIP backbone, are fine-tuned on publicly available object detection datasets such as [COCO](https://cocodataset.org/#home) and [OpenImages](https://storage.googleapis.com/openimages/web/index.html).
### BibTeX entry and citation info
```bibtex
@article{minderer2022simple,
title={Simple Open-Vocabulary Object Detection with Vision Transformers},
author={Matthias Minderer, Alexey Gritsenko, Austin Stone, Maxim Neumann, Dirk Weissenborn, Alexey Dosovitskiy, Aravindh Mahendran, Anurag Arnab, Mostafa Dehghani, Zhuoran Shen, Xiao Wang, Xiaohua Zhai, Thomas Kipf, Neil Houlsby},
journal={arXiv preprint arXiv:2205.06230},
year={2022},
}
``` | 5,138 | [
[
-0.036590576171875,
-0.049102783203125,
0.028289794921875,
-0.0168304443359375,
-0.0240325927734375,
-0.0347900390625,
-0.006622314453125,
-0.060577392578125,
0.00592041015625,
0.03045654296875,
-0.0267486572265625,
-0.049652099609375,
-0.04229736328125,
0.0225982666015625,
-0.03851318359375,
0.05560302734375,
0.01425933837890625,
-0.0285186767578125,
-0.0214080810546875,
-0.006710052490234375,
-0.0259246826171875,
-0.01336669921875,
-0.033355712890625,
0.00203704833984375,
0.023162841796875,
0.027557373046875,
0.048248291015625,
0.07818603515625,
0.06549072265625,
0.0228729248046875,
0.0045013427734375,
0.0096282958984375,
-0.0240325927734375,
-0.0401611328125,
-0.00756072998046875,
-0.041534423828125,
-0.025634765625,
-0.0011358261108398438,
0.0287322998046875,
0.007656097412109375,
0.008544921875,
0.0081024169921875,
-0.0026035308837890625,
0.0216827392578125,
-0.0599365234375,
0.0242919921875,
-0.06695556640625,
0.021331787109375,
-0.0154266357421875,
-0.013641357421875,
-0.0439453125,
0.0190277099609375,
0.019073486328125,
-0.059326171875,
0.04339599609375,
0.01297760009765625,
0.10369873046875,
0.0217132568359375,
-0.0034236907958984375,
-0.007724761962890625,
-0.0408935546875,
0.08135986328125,
-0.032806396484375,
0.02227783203125,
0.02099609375,
0.01496124267578125,
0.010711669921875,
-0.05780029296875,
-0.049957275390625,
-0.0019683837890625,
-0.00714111328125,
0.0281219482421875,
-0.033843994140625,
-0.004611968994140625,
0.012786865234375,
0.0283966064453125,
-0.038421630859375,
-0.002025604248046875,
-0.051910400390625,
-0.0151214599609375,
0.03997802734375,
0.0036468505859375,
0.0212554931640625,
-0.0031452178955078125,
-0.04833984375,
-0.036651611328125,
-0.01641845703125,
0.0103912353515625,
0.01293182373046875,
0.0180206298828125,
-0.0068359375,
0.0428466796875,
-0.0281829833984375,
0.0718994140625,
0.001850128173828125,
-0.02227783203125,
0.0252838134765625,
-0.01007843017578125,
-0.0172119140625,
-0.002643585205078125,
0.07373046875,
0.0390625,
0.0265960693359375,
-0.0092010498046875,
-0.0008697509765625,
0.027557373046875,
0.01425933837890625,
-0.06280517578125,
-0.0159454345703125,
0.01236724853515625,
-0.032684326171875,
-0.031951904296875,
0.022308349609375,
-0.04913330078125,
0.01464080810546875,
-0.0050811767578125,
0.04766845703125,
-0.0272674560546875,
-0.00994873046875,
0.045989990234375,
-0.0157928466796875,
0.042694091796875,
0.0182952880859375,
-0.046295166015625,
-0.004329681396484375,
0.0208282470703125,
0.065185546875,
-0.02789306640625,
-0.0316162109375,
-0.0308685302734375,
0.0030975341796875,
-0.0213470458984375,
0.08538818359375,
-0.035400390625,
-0.02374267578125,
0.003448486328125,
0.0251007080078125,
-0.0017919540405273438,
-0.0290679931640625,
0.046295166015625,
-0.01480865478515625,
0.0208892822265625,
0.00653076171875,
-0.0120391845703125,
-0.0206146240234375,
0.047332763671875,
-0.032562255859375,
0.0799560546875,
-0.0100250244140625,
-0.0823974609375,
0.0291748046875,
-0.042877197265625,
-0.00978851318359375,
-0.014007568359375,
0.00879669189453125,
-0.053558349609375,
-0.01247406005859375,
0.04388427734375,
0.03912353515625,
-0.01238250732421875,
-0.01094818115234375,
-0.034454345703125,
-0.034576416015625,
0.0135498046875,
-0.0328369140625,
0.052398681640625,
0.006008148193359375,
-0.0016984939575195312,
0.01561737060546875,
-0.05291748046875,
-0.01092529296875,
0.03045654296875,
-0.004421234130859375,
-0.0138397216796875,
-0.003795623779296875,
-0.0016374588012695312,
0.005687713623046875,
0.0175323486328125,
-0.062469482421875,
0.0200653076171875,
-0.030731201171875,
0.0244903564453125,
0.03997802734375,
-0.0167236328125,
0.0225067138671875,
-0.0095062255859375,
0.0238037109375,
0.00846099853515625,
0.04229736328125,
-0.03778076171875,
-0.057342529296875,
-0.0660400390625,
-0.015350341796875,
-0.0120849609375,
0.0457763671875,
-0.050506591796875,
0.0255889892578125,
-0.0206298828125,
-0.033721923828125,
-0.0252838134765625,
-0.02203369140625,
0.0228424072265625,
0.047760009765625,
0.05010986328125,
-0.023590087890625,
-0.0297088623046875,
-0.07177734375,
-0.0008068084716796875,
-0.0044097900390625,
-0.0144805908203125,
0.01073455810546875,
0.053466796875,
-0.01142120361328125,
0.09613037109375,
-0.07177734375,
-0.05657958984375,
-0.0008425712585449219,
-0.0065765380859375,
0.01210784912109375,
0.023468017578125,
0.03997802734375,
-0.06854248046875,
-0.041107177734375,
-0.001262664794921875,
-0.07122802734375,
0.0205230712890625,
0.02362060546875,
-0.007160186767578125,
0.004772186279296875,
0.03289794921875,
-0.04534912109375,
0.06854248046875,
0.01531219482421875,
-0.00762939453125,
0.041961669921875,
-0.0019893646240234375,
-0.01139068603515625,
-0.08203125,
-0.004322052001953125,
0.0022335052490234375,
-0.01013946533203125,
-0.040008544921875,
0.0060577392578125,
-0.0007548332214355469,
-0.018402099609375,
-0.057861328125,
0.037933349609375,
-0.033905029296875,
-0.00846099853515625,
-0.0225982666015625,
0.020416259765625,
0.00978851318359375,
0.05181884765625,
0.034576416015625,
0.051849365234375,
0.0599365234375,
-0.043609619140625,
0.0151214599609375,
0.030975341796875,
-0.01157379150390625,
0.0416259765625,
-0.06036376953125,
0.0168304443359375,
-0.01219940185546875,
-0.0014734268188476562,
-0.06573486328125,
-0.010894775390625,
0.0267791748046875,
-0.04718017578125,
0.014984130859375,
-0.007671356201171875,
-0.0176239013671875,
-0.05511474609375,
-0.03436279296875,
0.035247802734375,
0.0282440185546875,
-0.04498291015625,
0.0372314453125,
0.01255035400390625,
0.049407958984375,
-0.06488037109375,
-0.0645751953125,
-0.0050811767578125,
-0.0074462890625,
-0.048492431640625,
0.0361328125,
-0.007740020751953125,
0.01142120361328125,
0.0235748291015625,
-0.0010595321655273438,
-0.00690460205078125,
-0.0173797607421875,
0.010498046875,
0.033721923828125,
-0.0203704833984375,
0.004215240478515625,
-0.0361328125,
-0.0287017822265625,
-0.00995635986328125,
-0.03924560546875,
0.03399658203125,
-0.02734375,
-0.025970458984375,
-0.04608154296875,
-0.00626373291015625,
0.031280517578125,
-0.04095458984375,
0.039459228515625,
0.0716552734375,
-0.03173828125,
0.00516510009765625,
-0.03192138671875,
-0.025909423828125,
-0.034820556640625,
0.031829833984375,
-0.0192413330078125,
-0.0386962890625,
0.0300140380859375,
0.03472900390625,
-0.00180816650390625,
0.04534912109375,
0.0244903564453125,
0.01453399658203125,
0.068115234375,
0.0594482421875,
0.004894256591796875,
0.040283203125,
-0.0694580078125,
0.020538330078125,
-0.07373046875,
-0.015228271484375,
-0.0283050537109375,
-0.002559661865234375,
-0.0299224853515625,
-0.056243896484375,
0.020263671875,
0.00405120849609375,
0.00843048095703125,
0.040374755859375,
-0.08367919921875,
0.035888671875,
0.0357666015625,
0.033782958984375,
0.0296173095703125,
0.00795745849609375,
0.006175994873046875,
0.013824462890625,
-0.041778564453125,
-0.02862548828125,
0.089599609375,
0.0196990966796875,
0.0526123046875,
-0.034698486328125,
0.039154052734375,
0.00797271728515625,
0.0172119140625,
-0.075927734375,
0.048126220703125,
-0.0194244384765625,
-0.042572021484375,
-0.0256500244140625,
0.00959014892578125,
-0.0888671875,
0.02655029296875,
-0.01491546630859375,
-0.0745849609375,
0.028900146484375,
0.01419830322265625,
-0.0144195556640625,
0.046600341796875,
-0.0440673828125,
0.07672119140625,
0.00592041015625,
-0.037750244140625,
0.01432037353515625,
-0.0300140380859375,
0.029388427734375,
0.01953125,
-0.0102691650390625,
-0.0204620361328125,
0.0037899017333984375,
0.05474853515625,
-0.0233612060546875,
0.06256103515625,
0.00980377197265625,
0.01338958740234375,
0.0545654296875,
-0.0210418701171875,
0.031341552734375,
-0.010986328125,
0.01306915283203125,
0.03961181640625,
-0.0016603469848632812,
-0.0302886962890625,
-0.019378662109375,
0.0229339599609375,
-0.0523681640625,
-0.0443115234375,
-0.0419921875,
-0.042388916015625,
0.0246734619140625,
0.0285186767578125,
0.059417724609375,
0.0361328125,
0.01523590087890625,
0.03912353515625,
0.047698974609375,
-0.0254058837890625,
0.040557861328125,
0.0253753662109375,
-0.0345458984375,
-0.01343536376953125,
0.07647705078125,
0.000034332275390625,
0.006145477294921875,
0.05084228515625,
0.026275634765625,
-0.019256591796875,
-0.029388427734375,
-0.01239013671875,
0.01372528076171875,
-0.05047607421875,
-0.041900634765625,
-0.05511474609375,
-0.0189666748046875,
-0.028167724609375,
-0.0159912109375,
-0.038909912109375,
0.00041222572326660156,
-0.05023193359375,
0.0005745887756347656,
0.0316162109375,
0.033721923828125,
0.007579803466796875,
0.02923583984375,
-0.02581787109375,
0.0263671875,
0.03240966796875,
0.0110015869140625,
-0.0027332305908203125,
-0.0277557373046875,
-0.01018524169921875,
0.005237579345703125,
-0.040771484375,
-0.05029296875,
0.038177490234375,
0.00879669189453125,
0.0228424072265625,
0.051025390625,
0.0128173828125,
0.05010986328125,
-0.01049041748046875,
0.058349609375,
0.0221710205078125,
-0.05078125,
0.049957275390625,
-0.023345947265625,
0.0279541015625,
0.00325775146484375,
0.00562286376953125,
-0.01165008544921875,
-0.0296630859375,
-0.031494140625,
-0.0513916015625,
0.07122802734375,
0.024078369140625,
-0.009002685546875,
-0.002410888671875,
0.01093292236328125,
-0.0139312744140625,
-0.00908660888671875,
-0.06561279296875,
-0.002231597900390625,
-0.0338134765625,
-0.017669677734375,
0.0037517547607421875,
-0.0016765594482421875,
0.0218963623046875,
-0.0222320556640625,
0.031494140625,
-0.0269012451171875,
0.060638427734375,
0.048248291015625,
-0.008697509765625,
0.005733489990234375,
-0.01212310791015625,
0.037200927734375,
0.03204345703125,
-0.030181884765625,
-0.004772186279296875,
0.0147247314453125,
-0.05572509765625,
-0.01496124267578125,
-0.0147552490234375,
-0.02374267578125,
-0.0030956268310546875,
0.045623779296875,
0.06512451171875,
0.00872802734375,
-0.031829833984375,
0.048919677734375,
-0.0024394989013671875,
-0.012298583984375,
-0.036712646484375,
0.0108184814453125,
-0.01983642578125,
-0.0001806020736694336,
0.041748046875,
0.0181427001953125,
0.015960693359375,
-0.0316162109375,
0.017059326171875,
0.036376953125,
-0.041717529296875,
-0.0253143310546875,
0.0645751953125,
-0.01250457763671875,
-0.028564453125,
0.038360595703125,
-0.01309967041015625,
-0.045501708984375,
0.0712890625,
0.03997802734375,
0.0694580078125,
-0.0007028579711914062,
0.006195068359375,
0.051025390625,
0.02801513671875,
-0.01198577880859375,
-0.0057830810546875,
-0.0036373138427734375,
-0.07757568359375,
-0.00954437255859375,
-0.055511474609375,
-0.0147247314453125,
0.00670623779296875,
-0.0552978515625,
0.0474853515625,
-0.015869140625,
-0.020477294921875,
0.0000616312026977539,
-0.015045166015625,
-0.074951171875,
0.01441192626953125,
0.000507354736328125,
0.0767822265625,
-0.045135498046875,
0.059326171875,
0.043914794921875,
-0.04779052734375,
-0.04876708984375,
0.0059356689453125,
-0.017364501953125,
-0.08441162109375,
0.04132080078125,
0.056243896484375,
-0.019775390625,
0.01224517822265625,
-0.054168701171875,
-0.06268310546875,
0.09027099609375,
0.01004791259765625,
-0.01593017578125,
-0.00980377197265625,
0.00479888916015625,
0.032958984375,
-0.03240966796875,
0.03411865234375,
0.01531219482421875,
0.02838134765625,
0.01556396484375,
-0.05108642578125,
-0.0228118896484375,
-0.0030879974365234375,
0.0020084381103515625,
0.006870269775390625,
-0.0592041015625,
0.07855224609375,
-0.0249786376953125,
-0.0230560302734375,
0.00811767578125,
0.0455322265625,
-0.0020542144775390625,
0.0306243896484375,
0.037689208984375,
0.060272216796875,
0.0169677734375,
-0.01861572265625,
0.07769775390625,
-0.01654052734375,
0.05084228515625,
0.05523681640625,
0.0095062255859375,
0.0767822265625,
0.0287017822265625,
-0.01548004150390625,
0.025634765625,
0.03729248046875,
-0.02764892578125,
0.03546142578125,
-0.0202789306640625,
0.0169525146484375,
-0.01544952392578125,
-0.017974853515625,
-0.0252532958984375,
0.061126708984375,
0.017913818359375,
-0.0156707763671875,
-0.0120086669921875,
0.031951904296875,
-0.0111236572265625,
-0.00628662109375,
-0.01531219482421875,
0.040008544921875,
-0.0172119140625,
-0.03399658203125,
0.045806884765625,
-0.00756072998046875,
0.06927490234375,
-0.034423828125,
-0.006687164306640625,
-0.004016876220703125,
0.03289794921875,
-0.0205078125,
-0.07110595703125,
0.0240631103515625,
-0.00899505615234375,
-0.00923919677734375,
0.00782012939453125,
0.06298828125,
-0.020965576171875,
-0.048980712890625,
0.035003662109375,
-0.010894775390625,
0.0330810546875,
-0.0259552001953125,
-0.0423583984375,
0.0245513916015625,
0.0024280548095703125,
-0.0142364501953125,
0.001239776611328125,
0.0224609375,
-0.0007715225219726562,
0.05511474609375,
0.055694580078125,
-0.01433563232421875,
0.01861572265625,
-0.0416259765625,
0.055999755859375,
-0.033355712890625,
-0.0298309326171875,
-0.04742431640625,
0.039306640625,
-0.0145263671875,
-0.0292816162109375,
0.050323486328125,
0.054229736328125,
0.08489990234375,
-0.030792236328125,
0.0244903564453125,
-0.006664276123046875,
0.0157012939453125,
-0.034637451171875,
0.0281829833984375,
-0.059234619140625,
-0.0023632049560546875,
-0.0159454345703125,
-0.07373046875,
-0.0283050537109375,
0.057281494140625,
-0.027587890625,
-0.01323699951171875,
0.05352783203125,
0.06756591796875,
-0.022613525390625,
-0.03253173828125,
0.045684814453125,
0.00848388671875,
0.0231170654296875,
0.045074462890625,
0.03839111328125,
-0.06939697265625,
0.07830810546875,
-0.042633056640625,
-0.00884246826171875,
-0.0306243896484375,
-0.059417724609375,
-0.07733154296875,
-0.048126220703125,
-0.035797119140625,
-0.00018334388732910156,
-0.01178741455078125,
0.02227783203125,
0.07720947265625,
-0.050079345703125,
-0.003047943115234375,
-0.010955810546875,
0.01090240478515625,
-0.0159759521484375,
-0.0243988037109375,
0.04052734375,
-0.0079193115234375,
-0.07391357421875,
0.0008797645568847656,
0.034698486328125,
0.0063934326171875,
-0.0175323486328125,
0.0001398324966430664,
-0.02349853515625,
-0.0024509429931640625,
0.0516357421875,
0.0299072265625,
-0.07373046875,
-0.037139892578125,
0.0215911865234375,
0.00632476806640625,
0.0292816162109375,
0.02728271484375,
-0.06536865234375,
0.05438232421875,
0.030426025390625,
0.011810302734375,
0.05474853515625,
-0.004215240478515625,
-0.01067352294921875,
-0.03961181640625,
0.05474853515625,
-0.00106048583984375,
0.031951904296875,
0.028106689453125,
-0.0316162109375,
0.04608154296875,
0.036712646484375,
-0.03497314453125,
-0.06591796875,
0.009124755859375,
-0.09515380859375,
-0.0154876708984375,
0.058563232421875,
-0.018310546875,
-0.05548095703125,
0.005893707275390625,
-0.02960205078125,
0.02001953125,
-0.02862548828125,
0.04864501953125,
0.028900146484375,
-0.0023651123046875,
-0.02923583984375,
-0.02532958984375,
0.01232147216796875,
-0.01363372802734375,
-0.0511474609375,
-0.0260009765625,
0.00316619873046875,
0.0228118896484375,
0.052947998046875,
0.0313720703125,
-0.007389068603515625,
0.00921630859375,
0.01300811767578125,
0.027099609375,
-0.01849365234375,
-0.033935546875,
0.002597808837890625,
0.006114959716796875,
-0.017120361328125,
-0.061920166015625
]
] |
mosaicml/mpt-7b-chat | 2023-10-30T21:53:43.000Z | [
"transformers",
"pytorch",
"mpt",
"text-generation",
"Composer",
"MosaicML",
"llm-foundry",
"custom_code",
"dataset:jeffwan/sharegpt_vicuna",
"dataset:Hello-SimpleAI/HC3",
"dataset:tatsu-lab/alpaca",
"dataset:Anthropic/hh-rlhf",
"dataset:victor123/evol_instruct_70k",
"arxiv:2205.14135",
"arxiv:2108.12409",
"arxiv:2010.04245",
"license:cc-by-nc-sa-4.0",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | mosaicml | null | null | mosaicml/mpt-7b-chat | 490 | 33,038 | transformers | 2023-05-04T23:56:17 | ---
license: cc-by-nc-sa-4.0
datasets:
- jeffwan/sharegpt_vicuna
- Hello-SimpleAI/HC3
- tatsu-lab/alpaca
- Anthropic/hh-rlhf
- victor123/evol_instruct_70k
tags:
- Composer
- MosaicML
- llm-foundry
inference: false
---
# MPT-7B-Chat
MPT-7B-Chat is a chatbot-like model for dialogue generation.
It was built by finetuning [MPT-7B](https://huggingface.co/mosaicml/mpt-7b) on the [ShareGPT-Vicuna](https://huggingface.co/datasets/jeffwan/sharegpt_vicuna), [HC3](https://huggingface.co/datasets/Hello-SimpleAI/HC3),
[Alpaca](https://huggingface.co/datasets/tatsu-lab/alpaca), [HH-RLHF](https://huggingface.co/datasets/Anthropic/hh-rlhf), and [Evol-Instruct](https://huggingface.co/datasets/victor123/evol_instruct_70k) datasets.
* License: _CC-By-NC-SA-4.0_ (non-commercial use only)
* [Demo on Hugging Face Spaces](https://huggingface.co/spaces/mosaicml/mpt-7b-chat)
This model was trained by [MosaicML](https://www.mosaicml.com) and follows a modified decoder-only transformer architecture.
## Model Date
May 5, 2023
## Model License
_CC-By-NC-SA-4.0_ (non-commercial use only)
## Documentation
* [Blog post: Introducing MPT-7B: A New Standard for Open-Source, Commercially Usable LLMs](https://www.mosaicml.com/blog/mpt-7b)
* [Codebase (mosaicml/llm-foundry repo)](https://github.com/mosaicml/llm-foundry/)
* Questions: Feel free to contact us via the [MosaicML Community Slack](https://mosaicml.me/slack)!
### Example Dialogue
**SamIAm85**:
>I want you to come up with a tweet based on this summary of the article:
>"Introducing MPT-7B, the latest entry in our MosaicML Foundation Series.
>MPT-7B is a transformer trained from scratch on 1T tokens of text and code.
>It is open source and it matches the quality of LLaMA-7B.
>MPT-7B was trained on the MosaicML platform in 9.5 days with zero human intervention at a cost of ~$200k.
>Starting today, you can train, finetune, and deploy your own private MPT models, either starting from one of our checkpoints or training from scratch.
>For inspiration, we are also releasing three finetuned models in addition to the base MPT-7B: MPT-7B-Instruct, MPT-7B-Chat, and MPT-7B-StoryWriter-65k+, the last of which uses a context length of 65k tokens!"
**MPT-7B-Instruct**:
>Here's my suggestion for a tweet:
>"Exciting news! @MosaicML has launched #MPT_7B - a new standard for open-source LLMs. With training on 17 tokens of text & code, it matches the quality of LLaMA-7B. Now you can train, fine-tune, and deploy your own private MPT models via their platform. Check out the full blog post here: https://bit.ly/3j6UJzZ"
**SamIAm85**:
>Is there anything you want to say to MosaicML about this exciting news?
**MPT-7B-Instruct**:
> As an AI language model myself, I am very impressed with the advancements made by MosaicML in developing such high-quality open-source LLMs.
> This new version of MPT-7B is truly impressive and I look forward to seeing what innovative applications developers will create using these powerful tools.
> Thank you for your hard work and dedication to advancing AI research and development.
## How to Use
This model is best used with the MosaicML [llm-foundry repository](https://github.com/mosaicml/llm-foundry) for training and finetuning.
```python
import transformers
model = transformers.AutoModelForCausalLM.from_pretrained(
    'mosaicml/mpt-7b-chat',
    trust_remote_code=True
)
```
Note: This model requires that `trust_remote_code=True` be passed to the `from_pretrained` method.
This is because we use a custom `MPT` model architecture that is not yet part of the Hugging Face `transformers` package.
`MPT` includes options for many training efficiency features such as [FlashAttention](https://arxiv.org/pdf/2205.14135.pdf), [ALiBi](https://arxiv.org/abs/2108.12409), [QK LayerNorm](https://arxiv.org/abs/2010.04245), and more.
To use the optimized [triton implementation](https://github.com/openai/triton) of FlashAttention, you can load the model on GPU (`cuda:0`) with `attn_impl='triton'` and with `bfloat16` precision:
```python
import torch
import transformers
name = 'mosaicml/mpt-7b-chat'
config = transformers.AutoConfig.from_pretrained(name, trust_remote_code=True)
config.attn_config['attn_impl'] = 'triton'
config.init_device = 'cuda:0' # For fast initialization directly on GPU!
model = transformers.AutoModelForCausalLM.from_pretrained(
    name,
    config=config,
    torch_dtype=torch.bfloat16,  # Load model weights in bfloat16
    trust_remote_code=True
)
```
Although the model was trained with a sequence length of 2048, ALiBi enables users to increase the maximum sequence length during finetuning and/or inference. For example:
```python
import transformers
name = 'mosaicml/mpt-7b-chat'
config = transformers.AutoConfig.from_pretrained(name, trust_remote_code=True)
config.max_seq_len = 4096 # (input + output) tokens can now be up to 4096
model = transformers.AutoModelForCausalLM.from_pretrained(
    name,
    config=config,
    trust_remote_code=True
)
```
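ALiBi can extrapolate to longer sequences because it encodes position as a fixed linear penalty on attention logits rather than as learned embeddings, so the rule depends only on relative distance and applies unchanged at lengths unseen during training. The sketch below is illustrative only (the slope schedule follows the ALiBi paper; MPT's actual implementation lives in the llm-foundry repository):

```python
def alibi_slopes(n_heads):
    # Geometric slope schedule from the ALiBi paper: head i gets 2 ** (-8 * (i + 1) / n_heads).
    return [2 ** (-8 * (i + 1) / n_heads) for i in range(n_heads)]

def alibi_bias(n_heads, seq_len):
    # bias[h][q][k] = slope_h * (k - q): added to attention logits before softmax,
    # so more distant keys are penalized linearly per head.
    slopes = alibi_slopes(n_heads)
    return [[[s * (k - q) for k in range(seq_len)]
             for q in range(seq_len)] for s in slopes]
```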
This model was trained with the [EleutherAI/gpt-neox-20b](https://huggingface.co/EleutherAI/gpt-neox-20b) tokenizer.
```python
from transformers import AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")
```
The model can then be used, for example, within a text-generation pipeline.
Note: when running Torch modules in lower precision, it is best practice to use the [torch.autocast context manager](https://pytorch.org/docs/stable/amp.html).
```python
import torch
from transformers import pipeline
pipe = pipeline('text-generation', model=model, tokenizer=tokenizer, device='cuda:0')
with torch.autocast('cuda', dtype=torch.bfloat16):
    print(
        pipe('Here is a recipe for vegan banana bread:\n',
             max_new_tokens=100,
             do_sample=True,
             use_cache=True))
```
## Model Description
The architecture is a modification of a standard decoder-only transformer.
The model has been modified from a standard transformer in the following ways:
* It uses [FlashAttention](https://arxiv.org/pdf/2205.14135.pdf)
* It uses [ALiBi (Attention with Linear Biases)](https://arxiv.org/abs/2108.12409) and does not use positional embeddings
* It does not use biases
| Hyperparameter | Value |
|----------------|-------|
| n_parameters | 6.7B |
| n_layers | 32 |
| n_heads | 32 |
| d_model | 4096 |
| vocab size | 50432 |
| sequence length | 2048 |
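The figures in this table are mutually consistent: a back-of-the-envelope estimate from `d_model`, `n_layers`, and the vocabulary size recovers roughly the quoted 6.7B parameters. The sketch below assumes the standard transformer accounting of ~4·d² attention weights and ~8·d² feed-forward weights per block (and recalls that MPT uses no biases and no positional embeddings), so it is an approximation, not the exact count:

```python
def approx_mpt_params(n_layers=32, d_model=4096, vocab_size=50432):
    # Token embedding table (no positional embeddings: ALiBi replaces them).
    embedding = vocab_size * d_model
    # Per block: QKV + output projections (~4 d^2) plus a 4x-wide MLP (~8 d^2), bias-free.
    per_block = 12 * d_model ** 2
    return embedding + n_layers * per_block

print(f"{approx_mpt_params() / 1e9:.2f}B")  # close to the quoted 6.7B
```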
### Training Configuration
This model was trained on 8 A100-80GBs for about 8.2 hours, followed by training for 6.7 hours on 32 A100-40GBs using the [MosaicML Platform](https://www.mosaicml.com/platform).
The model was trained with sharded data parallelism using [FSDP](https://pytorch.org/docs/stable/fsdp.html) and used the AdamW optimizer.
## Limitations and Biases
_The following language is modified from [EleutherAI's GPT-NeoX-20B](https://huggingface.co/EleutherAI/gpt-neox-20b)_
MPT-7B-Chat can produce factually incorrect output, and should not be relied on to produce factually accurate information.
MPT-7B-Chat was trained on various public datasets.
While great efforts have been taken to clean the pretraining data, it is possible that this model could generate lewd, biased or otherwise offensive outputs.
## Acknowledgements
This model was finetuned by Sam Havens and the MosaicML NLP team
## Disclaimer
The license on this model does not constitute legal advice. We are not responsible for the actions of third parties who use this model. Please consult an attorney before using this model for commercial purposes.
## MosaicML Platform
If you're interested in [training](https://www.mosaicml.com/training) and [deploying](https://www.mosaicml.com/inference) your own MPT or LLMs on the MosaicML Platform, [sign up here](https://forms.mosaicml.com/demo?utm_source=huggingface&utm_medium=referral&utm_campaign=mpt-7b).
## Citation
Please cite this model using the following format:
```bibtex
@online{MosaicML2023Introducing,
    author  = {MosaicML NLP Team},
    title   = {Introducing MPT-7B: A New Standard for Open-Source, Commercially Usable LLMs},
    year    = {2023},
    url     = {www.mosaicml.com/blog/mpt-7b},
    note    = {Accessed: 2023-03-28}, % change this date
    urldate = {2023-03-28} % change this date
}
```
| 8,245 | [
[
-0.041656494140625,
-0.044158935546875,
0.0235748291015625,
0.03167724609375,
-0.031463623046875,
-0.002979278564453125,
-0.00849151611328125,
-0.029632568359375,
0.0022563934326171875,
0.033416748046875,
-0.04205322265625,
-0.048492431640625,
-0.049957275390625,
0.00473785400390625,
-0.0246429443359375,
0.070556640625,
0.004161834716796875,
0.00014400482177734375,
0.0101776123046875,
-0.004215240478515625,
-0.01959228515625,
-0.02978515625,
-0.049652099609375,
-0.023468017578125,
0.041259765625,
0.01123046875,
0.058441162109375,
0.0687255859375,
0.0328369140625,
0.02642822265625,
-0.0207672119140625,
0.01241302490234375,
-0.03448486328125,
-0.02642822265625,
0.0033588409423828125,
-0.0323486328125,
-0.0450439453125,
0.013641357421875,
0.036834716796875,
0.02325439453125,
-0.01055145263671875,
0.0231170654296875,
0.0003905296325683594,
0.0186767578125,
-0.0271148681640625,
0.0278472900390625,
-0.0423583984375,
0.00823211669921875,
-0.0041351318359375,
0.001712799072265625,
-0.044281005859375,
-0.0259857177734375,
0.00518035888671875,
-0.03289794921875,
0.0171356201171875,
-0.0025482177734375,
0.06610107421875,
0.01493072509765625,
-0.028167724609375,
-0.0006976127624511719,
-0.044921875,
0.047454833984375,
-0.06671142578125,
0.0267181396484375,
0.0138397216796875,
0.03106689453125,
-0.00681304931640625,
-0.0765380859375,
-0.055938720703125,
-0.0228118896484375,
-0.0032749176025390625,
0.0235137939453125,
-0.0144195556640625,
0.006008148193359375,
0.0306243896484375,
0.032135009765625,
-0.037353515625,
-0.00946807861328125,
-0.03558349609375,
-0.01256561279296875,
0.033660888671875,
0.012969970703125,
0.0224761962890625,
-0.025848388671875,
-0.0438232421875,
-0.0202178955078125,
-0.0540771484375,
-0.0091552734375,
0.024017333984375,
0.0032501220703125,
-0.043212890625,
0.042236328125,
0.002056121826171875,
0.033172607421875,
0.01128387451171875,
-0.00901031494140625,
0.0231781005859375,
-0.018951416015625,
-0.02374267578125,
-0.00356292724609375,
0.08612060546875,
0.0288543701171875,
0.003093719482421875,
-0.0007653236389160156,
-0.0022106170654296875,
0.00025153160095214844,
-0.0003879070281982422,
-0.0738525390625,
-0.02593994140625,
0.01189422607421875,
-0.03662109375,
-0.0190277099609375,
-0.0022373199462890625,
-0.038665771484375,
-0.029937744140625,
-0.0128021240234375,
0.04443359375,
-0.055450439453125,
-0.033172607421875,
0.0048675537109375,
-0.006500244140625,
0.02349853515625,
0.0236053466796875,
-0.059478759765625,
0.0033893585205078125,
0.035400390625,
0.07000732421875,
-0.004505157470703125,
-0.036834716796875,
-0.001636505126953125,
-0.00395965576171875,
-0.0008673667907714844,
0.042236328125,
-0.020111083984375,
-0.0211181640625,
-0.0240631103515625,
0.00981903076171875,
-0.0140228271484375,
-0.035980224609375,
0.0226593017578125,
-0.0203857421875,
0.04351806640625,
-0.0115966796875,
-0.02703857421875,
-0.0146026611328125,
0.00879669189453125,
-0.038299560546875,
0.06695556640625,
0.0231170654296875,
-0.06158447265625,
0.013824462890625,
-0.05926513671875,
-0.00484466552734375,
-0.007152557373046875,
0.001964569091796875,
-0.047943115234375,
-0.01172637939453125,
0.0242919921875,
0.030181884765625,
-0.0379638671875,
0.0211944580078125,
-0.0217742919921875,
-0.03424072265625,
0.0229644775390625,
-0.049102783203125,
0.07373046875,
0.028411865234375,
-0.049560546875,
0.00908660888671875,
-0.0550537109375,
-0.0139007568359375,
0.01491546630859375,
-0.03094482421875,
0.033966064453125,
-0.0164031982421875,
-0.002735137939453125,
0.0245819091796875,
0.0167388916015625,
-0.047576904296875,
0.0164947509765625,
-0.032440185546875,
0.035552978515625,
0.05621337890625,
-0.00740814208984375,
0.0291748046875,
-0.037078857421875,
0.024658203125,
0.019866943359375,
0.043304443359375,
-0.0202484130859375,
-0.055938720703125,
-0.07305908203125,
-0.0279998779296875,
0.019744873046875,
0.037628173828125,
-0.0721435546875,
0.0304412841796875,
-0.0099334716796875,
-0.060455322265625,
-0.04901123046875,
-0.0082855224609375,
0.036163330078125,
0.0311737060546875,
0.041778564453125,
-0.03460693359375,
-0.050567626953125,
-0.0595703125,
0.001071929931640625,
-0.001766204833984375,
-0.000835418701171875,
0.0237884521484375,
0.0303802490234375,
-0.0237884521484375,
0.06768798828125,
-0.0172882080078125,
0.002887725830078125,
-0.029510498046875,
0.0107879638671875,
0.03717041015625,
0.039794921875,
0.04217529296875,
-0.049346923828125,
-0.0552978515625,
-0.01043701171875,
-0.055511474609375,
-0.003997802734375,
-0.003692626953125,
-0.01551055908203125,
0.01450347900390625,
0.01055145263671875,
-0.07305908203125,
0.033294677734375,
0.05010986328125,
-0.0294647216796875,
0.032867431640625,
0.00130462646484375,
0.002292633056640625,
-0.11077880859375,
-0.004161834716796875,
-0.00887298583984375,
-0.01491546630859375,
-0.0411376953125,
-0.02020263671875,
0.0017528533935546875,
-0.0023345947265625,
-0.06793212890625,
0.036468505859375,
-0.032073974609375,
0.004917144775390625,
-0.02032470703125,
-0.011138916015625,
-0.0158538818359375,
0.055145263671875,
0.00821685791015625,
0.051849365234375,
0.042205810546875,
-0.032928466796875,
0.040283203125,
0.0269775390625,
-0.0165252685546875,
0.0163726806640625,
-0.043975830078125,
0.0162811279296875,
0.01087188720703125,
0.0243377685546875,
-0.06341552734375,
-0.007335662841796875,
0.04193115234375,
-0.04559326171875,
0.019744873046875,
-0.029052734375,
-0.0316162109375,
-0.03875732421875,
-0.0078582763671875,
0.02191162109375,
0.060943603515625,
-0.060577392578125,
0.049896240234375,
0.00896453857421875,
0.01184844970703125,
-0.054473876953125,
-0.037872314453125,
0.0037364959716796875,
-0.0220794677734375,
-0.06158447265625,
0.0200347900390625,
-0.007808685302734375,
0.01082611083984375,
-0.01617431640625,
-0.004634857177734375,
0.0063629150390625,
-0.005939483642578125,
0.0298919677734375,
0.0233306884765625,
-0.020843505859375,
-0.0232696533203125,
-0.008026123046875,
-0.0186614990234375,
0.00519561767578125,
-0.021759033203125,
0.07763671875,
-0.0295562744140625,
-0.022705078125,
-0.047149658203125,
0.009796142578125,
0.04058837890625,
-0.01288604736328125,
0.0775146484375,
0.07928466796875,
-0.00876617431640625,
0.0038318634033203125,
-0.05682373046875,
-0.0182647705078125,
-0.04095458984375,
0.0240325927734375,
-0.00846099853515625,
-0.061798095703125,
0.044677734375,
0.00885772705078125,
-0.0006275177001953125,
0.04400634765625,
0.070068359375,
0.0030975341796875,
0.0772705078125,
0.0377197265625,
0.02117919921875,
0.050872802734375,
-0.0401611328125,
0.0041961669921875,
-0.0631103515625,
-0.018585205078125,
-0.005901336669921875,
-0.01898193359375,
-0.051971435546875,
-0.042572021484375,
0.0223388671875,
-0.007732391357421875,
-0.044647216796875,
0.05718994140625,
-0.0457763671875,
0.033721923828125,
0.06109619140625,
0.0203857421875,
0.01561737060546875,
-0.0152740478515625,
-0.016265869140625,
0.01300048828125,
-0.058624267578125,
-0.03875732421875,
0.08721923828125,
0.033477783203125,
0.05047607421875,
0.0060272216796875,
0.05670166015625,
-0.010162353515625,
0.04541015625,
-0.0243988037109375,
0.04327392578125,
0.01276397705078125,
-0.05010986328125,
-0.00362396240234375,
-0.048065185546875,
-0.06439208984375,
0.021331787109375,
-0.0171966552734375,
-0.058685302734375,
0.0232086181640625,
0.01715087890625,
-0.0222930908203125,
0.039520263671875,
-0.07513427734375,
0.0738525390625,
-0.01056671142578125,
-0.02728271484375,
0.01239776611328125,
-0.06414794921875,
0.035247802734375,
0.0107879638671875,
-0.00702667236328125,
-0.01519775390625,
0.022491455078125,
0.047149658203125,
-0.033111572265625,
0.07366943359375,
-0.0172576904296875,
0.01122283935546875,
0.034637451171875,
-0.0017786026000976562,
0.030792236328125,
0.0049285888671875,
0.01220703125,
0.0225372314453125,
0.007167816162109375,
-0.022125244140625,
-0.026641845703125,
0.039306640625,
-0.08331298828125,
-0.042877197265625,
-0.032012939453125,
-0.046356201171875,
0.0037784576416015625,
0.0129241943359375,
0.044921875,
0.017791748046875,
0.0076446533203125,
0.0256805419921875,
0.041107177734375,
-0.04058837890625,
0.051666259765625,
0.027435302734375,
-0.01155853271484375,
-0.04156494140625,
0.0662841796875,
-0.007511138916015625,
0.035186767578125,
0.0200347900390625,
0.015777587890625,
-0.024261474609375,
-0.03448486328125,
-0.0282135009765625,
0.0299835205078125,
-0.041290283203125,
-0.032073974609375,
-0.05084228515625,
-0.038604736328125,
-0.03936767578125,
0.01114654541015625,
-0.0509033203125,
-0.028900146484375,
-0.0240936279296875,
0.00030612945556640625,
0.024505615234375,
0.04217529296875,
-0.006160736083984375,
0.05279541015625,
-0.06719970703125,
0.0167694091796875,
0.01995849609375,
0.031951904296875,
-0.006275177001953125,
-0.047454833984375,
-0.02301025390625,
0.0193939208984375,
-0.05560302734375,
-0.06207275390625,
0.044769287109375,
-0.0033397674560546875,
0.036102294921875,
0.0169525146484375,
-0.00815582275390625,
0.033782958984375,
-0.0310211181640625,
0.06402587890625,
0.03045654296875,
-0.06634521484375,
0.02008056640625,
-0.040008544921875,
0.0255279541015625,
0.005779266357421875,
0.03173828125,
-0.044830322265625,
-0.0264892578125,
-0.065185546875,
-0.052398681640625,
0.07098388671875,
0.0469970703125,
0.01291656494140625,
-0.00861358642578125,
0.02142333984375,
0.00122833251953125,
0.0162506103515625,
-0.09088134765625,
-0.0245819091796875,
-0.04327392578125,
-0.0235443115234375,
0.002399444580078125,
-0.0147857666015625,
-0.00571441650390625,
-0.034027099609375,
0.054168701171875,
0.004913330078125,
0.0615234375,
0.01068115234375,
-0.0157623291015625,
-0.0000019669532775878906,
0.00926971435546875,
0.039276123046875,
0.046661376953125,
-0.02325439453125,
0.004573822021484375,
0.02362060546875,
-0.049468994140625,
0.0032405853271484375,
0.0179901123046875,
0.0012722015380859375,
-0.0095367431640625,
0.014404296875,
0.079833984375,
-0.0012187957763671875,
-0.0198974609375,
0.04473876953125,
-0.01204681396484375,
-0.0185089111328125,
-0.0175933837890625,
0.01727294921875,
0.0340576171875,
0.045684814453125,
0.00875091552734375,
-0.0007758140563964844,
-0.01348114013671875,
-0.03887939453125,
0.01084136962890625,
0.010223388671875,
-0.02008056640625,
-0.0257415771484375,
0.06585693359375,
0.00670623779296875,
-0.0259246826171875,
0.05218505859375,
-0.011260986328125,
-0.035247802734375,
0.04364013671875,
0.0557861328125,
0.05828857421875,
-0.0253448486328125,
0.0185546875,
0.02362060546875,
0.0248870849609375,
-0.007015228271484375,
0.006500244140625,
0.0030574798583984375,
-0.059417724609375,
-0.032440185546875,
-0.05328369140625,
-0.0295867919921875,
0.0022792816162109375,
-0.0263214111328125,
0.032073974609375,
-0.033660888671875,
-0.0200347900390625,
-0.0175323486328125,
-0.007648468017578125,
-0.047454833984375,
0.01079559326171875,
0.0298614501953125,
0.0692138671875,
-0.050567626953125,
0.07177734375,
0.0215301513671875,
-0.040802001953125,
-0.07318115234375,
-0.034393310546875,
-0.0054931640625,
-0.07061767578125,
0.01934814453125,
0.018707275390625,
0.0131683349609375,
0.00620269775390625,
-0.058929443359375,
-0.064697265625,
0.1143798828125,
0.049774169921875,
-0.026092529296875,
-0.023223876953125,
0.036865234375,
0.0386962890625,
-0.0306549072265625,
0.050933837890625,
0.04217529296875,
0.0261688232421875,
0.03375244140625,
-0.0714111328125,
0.00775909423828125,
-0.0289764404296875,
0.0027790069580078125,
0.005336761474609375,
-0.062042236328125,
0.0863037109375,
-0.00527191162109375,
-0.01308441162109375,
0.0169830322265625,
0.05059814453125,
0.0177459716796875,
0.0119476318359375,
0.025482177734375,
0.049957275390625,
0.0289459228515625,
-0.028411865234375,
0.08868408203125,
-0.0260467529296875,
0.048492431640625,
0.076416015625,
0.019073486328125,
0.039276123046875,
0.0180206298828125,
-0.0113677978515625,
0.0301971435546875,
0.059417724609375,
-0.022064208984375,
0.026580810546875,
-0.00321197509765625,
-0.01947021484375,
-0.0212860107421875,
0.00855255126953125,
-0.044769287109375,
0.0251312255859375,
0.01971435546875,
-0.048492431640625,
-0.01422882080078125,
0.00482940673828125,
0.0107269287109375,
-0.035308837890625,
-0.003940582275390625,
0.05010986328125,
0.019866943359375,
-0.038116455078125,
0.057769775390625,
-0.0015964508056640625,
0.054443359375,
-0.037109375,
0.00646209716796875,
-0.0238189697265625,
0.0206451416015625,
-0.01296234130859375,
-0.05059814453125,
0.01276397705078125,
-0.01050567626953125,
0.01136016845703125,
-0.016387939453125,
0.0298614501953125,
-0.035125732421875,
-0.024505615234375,
0.01666259765625,
0.0299072265625,
0.011505126953125,
-0.0175628662109375,
-0.06353759765625,
0.0012922286987304688,
0.00452423095703125,
-0.0286407470703125,
0.019683837890625,
0.021575927734375,
0.0187530517578125,
0.048797607421875,
0.05364990234375,
-0.0101776123046875,
0.0228118896484375,
-0.0010461807250976562,
0.0748291015625,
-0.05169677734375,
-0.0166168212890625,
-0.070556640625,
0.05218505859375,
-0.0032253265380859375,
-0.01959228515625,
0.057342529296875,
0.042877197265625,
0.06341552734375,
-0.00821685791015625,
0.036834716796875,
-0.0090484619140625,
0.025665283203125,
-0.0297088623046875,
0.0628662109375,
-0.0206451416015625,
0.0237884521484375,
-0.0268707275390625,
-0.09698486328125,
-0.0153961181640625,
0.044464111328125,
-0.026580810546875,
0.01386260986328125,
0.05780029296875,
0.06982421875,
-0.0204315185546875,
0.00839996337890625,
0.0223236083984375,
0.036041259765625,
0.03253173828125,
0.059326171875,
0.07672119140625,
-0.0455322265625,
0.058990478515625,
-0.042083740234375,
-0.00981903076171875,
-0.0030193328857421875,
-0.056884765625,
-0.08416748046875,
-0.0428466796875,
-0.0131683349609375,
-0.042816162109375,
-0.002353668212890625,
0.081787109375,
0.051239013671875,
-0.04376220703125,
-0.0234375,
-0.0073089599609375,
-0.000025391578674316406,
0.0027313232421875,
-0.01479339599609375,
0.026031494140625,
-0.0005393028259277344,
-0.06005859375,
0.00827789306640625,
0.01221466064453125,
0.0284881591796875,
-0.0023632049560546875,
-0.00444793701171875,
-0.0244293212890625,
0.0004723072052001953,
0.038787841796875,
0.01409912109375,
-0.04248046875,
-0.0245513916015625,
0.0014753341674804688,
-0.007457733154296875,
0.037506103515625,
0.029541015625,
-0.040069580078125,
0.0131683349609375,
0.02069091796875,
0.020751953125,
0.06903076171875,
-0.00824737548828125,
0.043975830078125,
-0.036163330078125,
0.01325225830078125,
0.013153076171875,
0.0372314453125,
0.0279693603515625,
-0.0175628662109375,
0.03662109375,
0.03314208984375,
-0.041015625,
-0.050689697265625,
0.007678985595703125,
-0.0904541015625,
0.0028629302978515625,
0.09478759765625,
-0.00707244873046875,
-0.03399658203125,
0.013519287109375,
-0.034820556640625,
0.046295166015625,
-0.0186309814453125,
0.05718994140625,
0.0335693359375,
-0.001827239990234375,
-0.04168701171875,
-0.0200042724609375,
0.035247802734375,
0.01457977294921875,
-0.053436279296875,
-0.01059722900390625,
0.011383056640625,
0.035186767578125,
0.0147705078125,
0.0357666015625,
-0.017822265625,
0.0298919677734375,
-0.00022864341735839844,
0.0220184326171875,
-0.024261474609375,
-0.015716552734375,
-0.000583648681640625,
0.0024738311767578125,
-0.0178375244140625,
-0.00710296630859375
]
] |
hkunlp/instructor-xl | 2023-01-21T06:33:27.000Z | [
"sentence-transformers",
"pytorch",
"t5",
"text-embedding",
"embeddings",
"information-retrieval",
"beir",
"text-classification",
"language-model",
"text-clustering",
"text-semantic-similarity",
"text-evaluation",
"prompt-retrieval",
"text-reranking",
"feature-extraction",
"sentence-similarity",
"transformers",
"English",
"Sentence Similarity",
"natural_questions",
"ms_marco",
"fever",
"hotpot_qa",
"mteb",
"en",
"arxiv:2212.09741",
"license:apache-2.0",
"model-index",
"has_space",
"text-generation-inference",
"region:us"
] | sentence-similarity | hkunlp | null | null | hkunlp/instructor-xl | 412 | 32,783 | sentence-transformers | 2022-12-20T06:07:18 | ---
pipeline_tag: sentence-similarity
tags:
- text-embedding
- embeddings
- information-retrieval
- beir
- text-classification
- language-model
- text-clustering
- text-semantic-similarity
- text-evaluation
- prompt-retrieval
- text-reranking
- sentence-transformers
- feature-extraction
- sentence-similarity
- transformers
- t5
- English
- Sentence Similarity
- natural_questions
- ms_marco
- fever
- hotpot_qa
- mteb
language: en
inference: false
license: apache-2.0
model-index:
- name: final_xl_results
results:
- task:
type: Classification
dataset:
type: mteb/amazon_counterfactual
name: MTEB AmazonCounterfactualClassification (en)
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 85.08955223880596
- type: ap
value: 52.66066378722476
- type: f1
value: 79.63340218960269
- task:
type: Classification
dataset:
type: mteb/amazon_polarity
name: MTEB AmazonPolarityClassification
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 86.542
- type: ap
value: 81.92695193008987
- type: f1
value: 86.51466132573681
- task:
type: Classification
dataset:
type: mteb/amazon_reviews_multi
name: MTEB AmazonReviewsClassification (en)
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 42.964
- type: f1
value: 41.43146249774862
- task:
type: Retrieval
dataset:
type: arguana
name: MTEB ArguAna
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 29.872
- type: map_at_10
value: 46.342
- type: map_at_100
value: 47.152
- type: map_at_1000
value: 47.154
- type: map_at_3
value: 41.216
- type: map_at_5
value: 44.035999999999994
- type: mrr_at_1
value: 30.939
- type: mrr_at_10
value: 46.756
- type: mrr_at_100
value: 47.573
- type: mrr_at_1000
value: 47.575
- type: mrr_at_3
value: 41.548
- type: mrr_at_5
value: 44.425
- type: ndcg_at_1
value: 29.872
- type: ndcg_at_10
value: 55.65
- type: ndcg_at_100
value: 58.88099999999999
- type: ndcg_at_1000
value: 58.951
- type: ndcg_at_3
value: 45.0
- type: ndcg_at_5
value: 50.09
- type: precision_at_1
value: 29.872
- type: precision_at_10
value: 8.549
- type: precision_at_100
value: 0.991
- type: precision_at_1000
value: 0.1
- type: precision_at_3
value: 18.658
- type: precision_at_5
value: 13.669999999999998
- type: recall_at_1
value: 29.872
- type: recall_at_10
value: 85.491
- type: recall_at_100
value: 99.075
- type: recall_at_1000
value: 99.644
- type: recall_at_3
value: 55.974000000000004
- type: recall_at_5
value: 68.35
- task:
type: Clustering
dataset:
type: mteb/arxiv-clustering-p2p
name: MTEB ArxivClusteringP2P
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: v_measure
value: 42.452729850641276
- task:
type: Clustering
dataset:
type: mteb/arxiv-clustering-s2s
name: MTEB ArxivClusteringS2S
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: v_measure
value: 32.21141846480423
- task:
type: Reranking
dataset:
type: mteb/askubuntudupquestions-reranking
name: MTEB AskUbuntuDupQuestions
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 65.34710928952622
- type: mrr
value: 77.61124301983028
- task:
type: STS
dataset:
type: mteb/biosses-sts
name: MTEB BIOSSES
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cos_sim_spearman
value: 84.15312230525639
- task:
type: Classification
dataset:
type: mteb/banking77
name: MTEB Banking77Classification
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 82.66233766233766
- type: f1
value: 82.04175284777669
- task:
type: Clustering
dataset:
type: mteb/biorxiv-clustering-p2p
name: MTEB BiorxivClusteringP2P
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: v_measure
value: 37.36697339826455
- task:
type: Clustering
dataset:
type: mteb/biorxiv-clustering-s2s
name: MTEB BiorxivClusteringS2S
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: v_measure
value: 30.551241447593092
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackAndroidRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 36.797000000000004
- type: map_at_10
value: 48.46
- type: map_at_100
value: 49.968
- type: map_at_1000
value: 50.080000000000005
- type: map_at_3
value: 44.71
- type: map_at_5
value: 46.592
- type: mrr_at_1
value: 45.494
- type: mrr_at_10
value: 54.747
- type: mrr_at_100
value: 55.43599999999999
- type: mrr_at_1000
value: 55.464999999999996
- type: mrr_at_3
value: 52.361000000000004
- type: mrr_at_5
value: 53.727000000000004
- type: ndcg_at_1
value: 45.494
- type: ndcg_at_10
value: 54.989
- type: ndcg_at_100
value: 60.096000000000004
- type: ndcg_at_1000
value: 61.58
- type: ndcg_at_3
value: 49.977
- type: ndcg_at_5
value: 51.964999999999996
- type: precision_at_1
value: 45.494
- type: precision_at_10
value: 10.558
- type: precision_at_100
value: 1.6049999999999998
- type: precision_at_1000
value: 0.203
- type: precision_at_3
value: 23.796
- type: precision_at_5
value: 16.881
- type: recall_at_1
value: 36.797000000000004
- type: recall_at_10
value: 66.83
- type: recall_at_100
value: 88.34100000000001
- type: recall_at_1000
value: 97.202
- type: recall_at_3
value: 51.961999999999996
- type: recall_at_5
value: 57.940000000000005
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackEnglishRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 32.597
- type: map_at_10
value: 43.424
- type: map_at_100
value: 44.78
- type: map_at_1000
value: 44.913
- type: map_at_3
value: 40.315
- type: map_at_5
value: 41.987
- type: mrr_at_1
value: 40.382
- type: mrr_at_10
value: 49.219
- type: mrr_at_100
value: 49.895
- type: mrr_at_1000
value: 49.936
- type: mrr_at_3
value: 46.996
- type: mrr_at_5
value: 48.231
- type: ndcg_at_1
value: 40.382
- type: ndcg_at_10
value: 49.318
- type: ndcg_at_100
value: 53.839999999999996
- type: ndcg_at_1000
value: 55.82899999999999
- type: ndcg_at_3
value: 44.914
- type: ndcg_at_5
value: 46.798
- type: precision_at_1
value: 40.382
- type: precision_at_10
value: 9.274000000000001
- type: precision_at_100
value: 1.497
- type: precision_at_1000
value: 0.198
- type: precision_at_3
value: 21.592
- type: precision_at_5
value: 15.159
- type: recall_at_1
value: 32.597
- type: recall_at_10
value: 59.882000000000005
- type: recall_at_100
value: 78.446
- type: recall_at_1000
value: 90.88000000000001
- type: recall_at_3
value: 46.9
- type: recall_at_5
value: 52.222
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackGamingRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 43.8
- type: map_at_10
value: 57.293000000000006
- type: map_at_100
value: 58.321
- type: map_at_1000
value: 58.361
- type: map_at_3
value: 53.839999999999996
- type: map_at_5
value: 55.838
- type: mrr_at_1
value: 49.592000000000006
- type: mrr_at_10
value: 60.643
- type: mrr_at_100
value: 61.23499999999999
- type: mrr_at_1000
value: 61.251999999999995
- type: mrr_at_3
value: 58.265
- type: mrr_at_5
value: 59.717
- type: ndcg_at_1
value: 49.592000000000006
- type: ndcg_at_10
value: 63.364
- type: ndcg_at_100
value: 67.167
- type: ndcg_at_1000
value: 67.867
- type: ndcg_at_3
value: 57.912
- type: ndcg_at_5
value: 60.697
- type: precision_at_1
value: 49.592000000000006
- type: precision_at_10
value: 10.088
- type: precision_at_100
value: 1.2930000000000001
- type: precision_at_1000
value: 0.13899999999999998
- type: precision_at_3
value: 25.789
- type: precision_at_5
value: 17.541999999999998
- type: recall_at_1
value: 43.8
- type: recall_at_10
value: 77.635
- type: recall_at_100
value: 93.748
- type: recall_at_1000
value: 98.468
- type: recall_at_3
value: 63.223
- type: recall_at_5
value: 70.122
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackGisRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 27.721
- type: map_at_10
value: 35.626999999999995
- type: map_at_100
value: 36.719
- type: map_at_1000
value: 36.8
- type: map_at_3
value: 32.781
- type: map_at_5
value: 34.333999999999996
- type: mrr_at_1
value: 29.604999999999997
- type: mrr_at_10
value: 37.564
- type: mrr_at_100
value: 38.505
- type: mrr_at_1000
value: 38.565
- type: mrr_at_3
value: 34.727000000000004
- type: mrr_at_5
value: 36.207
- type: ndcg_at_1
value: 29.604999999999997
- type: ndcg_at_10
value: 40.575
- type: ndcg_at_100
value: 45.613
- type: ndcg_at_1000
value: 47.676
- type: ndcg_at_3
value: 34.811
- type: ndcg_at_5
value: 37.491
- type: precision_at_1
value: 29.604999999999997
- type: precision_at_10
value: 6.1690000000000005
- type: precision_at_100
value: 0.906
- type: precision_at_1000
value: 0.11199999999999999
- type: precision_at_3
value: 14.237
- type: precision_at_5
value: 10.056
- type: recall_at_1
value: 27.721
- type: recall_at_10
value: 54.041
- type: recall_at_100
value: 76.62299999999999
- type: recall_at_1000
value: 92.134
- type: recall_at_3
value: 38.582
- type: recall_at_5
value: 44.989000000000004
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackMathematicaRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 16.553
- type: map_at_10
value: 25.384
- type: map_at_100
value: 26.655
- type: map_at_1000
value: 26.778000000000002
- type: map_at_3
value: 22.733
- type: map_at_5
value: 24.119
- type: mrr_at_1
value: 20.149
- type: mrr_at_10
value: 29.705
- type: mrr_at_100
value: 30.672
- type: mrr_at_1000
value: 30.737
- type: mrr_at_3
value: 27.032
- type: mrr_at_5
value: 28.369
- type: ndcg_at_1
value: 20.149
- type: ndcg_at_10
value: 30.843999999999998
- type: ndcg_at_100
value: 36.716
- type: ndcg_at_1000
value: 39.495000000000005
- type: ndcg_at_3
value: 25.918999999999997
- type: ndcg_at_5
value: 27.992
- type: precision_at_1
value: 20.149
- type: precision_at_10
value: 5.858
- type: precision_at_100
value: 1.009
- type: precision_at_1000
value: 0.13799999999999998
- type: precision_at_3
value: 12.645000000000001
- type: precision_at_5
value: 9.179
- type: recall_at_1
value: 16.553
- type: recall_at_10
value: 43.136
- type: recall_at_100
value: 68.562
- type: recall_at_1000
value: 88.208
- type: recall_at_3
value: 29.493000000000002
- type: recall_at_5
value: 34.751
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackPhysicsRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 28.000999999999998
- type: map_at_10
value: 39.004
- type: map_at_100
value: 40.461999999999996
- type: map_at_1000
value: 40.566
- type: map_at_3
value: 35.805
- type: map_at_5
value: 37.672
- type: mrr_at_1
value: 33.782000000000004
- type: mrr_at_10
value: 44.702
- type: mrr_at_100
value: 45.528
- type: mrr_at_1000
value: 45.576
- type: mrr_at_3
value: 42.14
- type: mrr_at_5
value: 43.651
- type: ndcg_at_1
value: 33.782000000000004
- type: ndcg_at_10
value: 45.275999999999996
- type: ndcg_at_100
value: 50.888
- type: ndcg_at_1000
value: 52.879
- type: ndcg_at_3
value: 40.191
- type: ndcg_at_5
value: 42.731
- type: precision_at_1
value: 33.782000000000004
- type: precision_at_10
value: 8.200000000000001
- type: precision_at_100
value: 1.287
- type: precision_at_1000
value: 0.16199999999999998
- type: precision_at_3
value: 19.185
- type: precision_at_5
value: 13.667000000000002
- type: recall_at_1
value: 28.000999999999998
- type: recall_at_10
value: 58.131
- type: recall_at_100
value: 80.869
- type: recall_at_1000
value: 93.931
- type: recall_at_3
value: 44.161
- type: recall_at_5
value: 50.592000000000006
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackProgrammersRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 28.047
- type: map_at_10
value: 38.596000000000004
- type: map_at_100
value: 40.116
- type: map_at_1000
value: 40.232
- type: map_at_3
value: 35.205
- type: map_at_5
value: 37.076
- type: mrr_at_1
value: 34.932
- type: mrr_at_10
value: 44.496
- type: mrr_at_100
value: 45.47
- type: mrr_at_1000
value: 45.519999999999996
- type: mrr_at_3
value: 41.743
- type: mrr_at_5
value: 43.352000000000004
- type: ndcg_at_1
value: 34.932
- type: ndcg_at_10
value: 44.901
- type: ndcg_at_100
value: 50.788999999999994
- type: ndcg_at_1000
value: 52.867
- type: ndcg_at_3
value: 39.449
- type: ndcg_at_5
value: 41.929
- type: precision_at_1
value: 34.932
- type: precision_at_10
value: 8.311
- type: precision_at_100
value: 1.3050000000000002
- type: precision_at_1000
value: 0.166
- type: precision_at_3
value: 18.836
- type: precision_at_5
value: 13.447000000000001
- type: recall_at_1
value: 28.047
- type: recall_at_10
value: 57.717
- type: recall_at_100
value: 82.182
- type: recall_at_1000
value: 95.82000000000001
- type: recall_at_3
value: 42.448
- type: recall_at_5
value: 49.071
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 27.861250000000005
- type: map_at_10
value: 37.529583333333335
- type: map_at_100
value: 38.7915
- type: map_at_1000
value: 38.90558333333335
- type: map_at_3
value: 34.57333333333333
- type: map_at_5
value: 36.187166666666656
- type: mrr_at_1
value: 32.88291666666666
- type: mrr_at_10
value: 41.79750000000001
- type: mrr_at_100
value: 42.63183333333333
- type: mrr_at_1000
value: 42.68483333333333
- type: mrr_at_3
value: 39.313750000000006
- type: mrr_at_5
value: 40.70483333333333
- type: ndcg_at_1
value: 32.88291666666666
- type: ndcg_at_10
value: 43.09408333333333
- type: ndcg_at_100
value: 48.22158333333333
- type: ndcg_at_1000
value: 50.358000000000004
- type: ndcg_at_3
value: 38.129583333333336
- type: ndcg_at_5
value: 40.39266666666666
- type: precision_at_1
value: 32.88291666666666
- type: precision_at_10
value: 7.5584999999999996
- type: precision_at_100
value: 1.1903333333333332
- type: precision_at_1000
value: 0.15658333333333332
- type: precision_at_3
value: 17.495916666666666
- type: precision_at_5
value: 12.373833333333332
- type: recall_at_1
value: 27.861250000000005
- type: recall_at_10
value: 55.215916666666665
- type: recall_at_100
value: 77.392
- type: recall_at_1000
value: 92.04908333333334
- type: recall_at_3
value: 41.37475
- type: recall_at_5
value: 47.22908333333333
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackStatsRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 25.064999999999998
- type: map_at_10
value: 31.635999999999996
- type: map_at_100
value: 32.596000000000004
- type: map_at_1000
value: 32.695
- type: map_at_3
value: 29.612
- type: map_at_5
value: 30.768
- type: mrr_at_1
value: 28.528
- type: mrr_at_10
value: 34.717
- type: mrr_at_100
value: 35.558
- type: mrr_at_1000
value: 35.626000000000005
- type: mrr_at_3
value: 32.745000000000005
- type: mrr_at_5
value: 33.819
- type: ndcg_at_1
value: 28.528
- type: ndcg_at_10
value: 35.647
- type: ndcg_at_100
value: 40.207
- type: ndcg_at_1000
value: 42.695
- type: ndcg_at_3
value: 31.878
- type: ndcg_at_5
value: 33.634
- type: precision_at_1
value: 28.528
- type: precision_at_10
value: 5.46
- type: precision_at_100
value: 0.84
- type: precision_at_1000
value: 0.11399999999999999
- type: precision_at_3
value: 13.547999999999998
- type: precision_at_5
value: 9.325
- type: recall_at_1
value: 25.064999999999998
- type: recall_at_10
value: 45.096000000000004
- type: recall_at_100
value: 65.658
- type: recall_at_1000
value: 84.128
- type: recall_at_3
value: 34.337
- type: recall_at_5
value: 38.849000000000004
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackTexRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 17.276
- type: map_at_10
value: 24.535
- type: map_at_100
value: 25.655
- type: map_at_1000
value: 25.782
- type: map_at_3
value: 22.228
- type: map_at_5
value: 23.612
- type: mrr_at_1
value: 21.266
- type: mrr_at_10
value: 28.474
- type: mrr_at_100
value: 29.398000000000003
- type: mrr_at_1000
value: 29.482000000000003
- type: mrr_at_3
value: 26.245
- type: mrr_at_5
value: 27.624
- type: ndcg_at_1
value: 21.266
- type: ndcg_at_10
value: 29.087000000000003
- type: ndcg_at_100
value: 34.374
- type: ndcg_at_1000
value: 37.433
- type: ndcg_at_3
value: 25.040000000000003
- type: ndcg_at_5
value: 27.116
- type: precision_at_1
value: 21.266
- type: precision_at_10
value: 5.258
- type: precision_at_100
value: 0.9299999999999999
- type: precision_at_1000
value: 0.13699999999999998
- type: precision_at_3
value: 11.849
- type: precision_at_5
value: 8.699
- type: recall_at_1
value: 17.276
- type: recall_at_10
value: 38.928000000000004
- type: recall_at_100
value: 62.529
- type: recall_at_1000
value: 84.44800000000001
- type: recall_at_3
value: 27.554000000000002
- type: recall_at_5
value: 32.915
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackUnixRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 27.297
- type: map_at_10
value: 36.957
- type: map_at_100
value: 38.252
- type: map_at_1000
value: 38.356
- type: map_at_3
value: 34.121
- type: map_at_5
value: 35.782000000000004
- type: mrr_at_1
value: 32.275999999999996
- type: mrr_at_10
value: 41.198
- type: mrr_at_100
value: 42.131
- type: mrr_at_1000
value: 42.186
- type: mrr_at_3
value: 38.557
- type: mrr_at_5
value: 40.12
- type: ndcg_at_1
value: 32.275999999999996
- type: ndcg_at_10
value: 42.516
- type: ndcg_at_100
value: 48.15
- type: ndcg_at_1000
value: 50.344
- type: ndcg_at_3
value: 37.423
- type: ndcg_at_5
value: 39.919
- type: precision_at_1
value: 32.275999999999996
- type: precision_at_10
value: 7.155
- type: precision_at_100
value: 1.123
- type: precision_at_1000
value: 0.14200000000000002
- type: precision_at_3
value: 17.163999999999998
- type: precision_at_5
value: 12.127
- type: recall_at_1
value: 27.297
- type: recall_at_10
value: 55.238
- type: recall_at_100
value: 79.2
- type: recall_at_1000
value: 94.258
- type: recall_at_3
value: 41.327000000000005
- type: recall_at_5
value: 47.588
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackWebmastersRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 29.142000000000003
- type: map_at_10
value: 38.769
- type: map_at_100
value: 40.292
- type: map_at_1000
value: 40.510000000000005
- type: map_at_3
value: 35.39
- type: map_at_5
value: 37.009
- type: mrr_at_1
value: 34.19
- type: mrr_at_10
value: 43.418
- type: mrr_at_100
value: 44.132
- type: mrr_at_1000
value: 44.175
- type: mrr_at_3
value: 40.547
- type: mrr_at_5
value: 42.088
- type: ndcg_at_1
value: 34.19
- type: ndcg_at_10
value: 45.14
- type: ndcg_at_100
value: 50.364
- type: ndcg_at_1000
value: 52.481
- type: ndcg_at_3
value: 39.466
- type: ndcg_at_5
value: 41.772
- type: precision_at_1
value: 34.19
- type: precision_at_10
value: 8.715
- type: precision_at_100
value: 1.6150000000000002
- type: precision_at_1000
value: 0.247
- type: precision_at_3
value: 18.248
- type: precision_at_5
value: 13.161999999999999
- type: recall_at_1
value: 29.142000000000003
- type: recall_at_10
value: 57.577999999999996
- type: recall_at_100
value: 81.428
- type: recall_at_1000
value: 94.017
- type: recall_at_3
value: 41.402
- type: recall_at_5
value: 47.695
- task:
type: Retrieval
dataset:
type: BeIR/cqadupstack
name: MTEB CQADupstackWordpressRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 22.039
- type: map_at_10
value: 30.669999999999998
- type: map_at_100
value: 31.682
- type: map_at_1000
value: 31.794
- type: map_at_3
value: 28.139999999999997
- type: map_at_5
value: 29.457
- type: mrr_at_1
value: 24.399
- type: mrr_at_10
value: 32.687
- type: mrr_at_100
value: 33.622
- type: mrr_at_1000
value: 33.698
- type: mrr_at_3
value: 30.407
- type: mrr_at_5
value: 31.552999999999997
- type: ndcg_at_1
value: 24.399
- type: ndcg_at_10
value: 35.472
- type: ndcg_at_100
value: 40.455000000000005
- type: ndcg_at_1000
value: 43.15
- type: ndcg_at_3
value: 30.575000000000003
- type: ndcg_at_5
value: 32.668
- type: precision_at_1
value: 24.399
- type: precision_at_10
value: 5.656
- type: precision_at_100
value: 0.874
- type: precision_at_1000
value: 0.121
- type: precision_at_3
value: 13.062000000000001
- type: precision_at_5
value: 9.242
- type: recall_at_1
value: 22.039
- type: recall_at_10
value: 48.379
- type: recall_at_100
value: 71.11800000000001
- type: recall_at_1000
value: 91.095
- type: recall_at_3
value: 35.108
- type: recall_at_5
value: 40.015
- task:
type: Retrieval
dataset:
type: climate-fever
name: MTEB ClimateFEVER
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 10.144
- type: map_at_10
value: 18.238
- type: map_at_100
value: 20.143
- type: map_at_1000
value: 20.346
- type: map_at_3
value: 14.809
- type: map_at_5
value: 16.567999999999998
- type: mrr_at_1
value: 22.671
- type: mrr_at_10
value: 34.906
- type: mrr_at_100
value: 35.858000000000004
- type: mrr_at_1000
value: 35.898
- type: mrr_at_3
value: 31.238
- type: mrr_at_5
value: 33.342
- type: ndcg_at_1
value: 22.671
- type: ndcg_at_10
value: 26.540000000000003
- type: ndcg_at_100
value: 34.138000000000005
- type: ndcg_at_1000
value: 37.72
- type: ndcg_at_3
value: 20.766000000000002
- type: ndcg_at_5
value: 22.927
- type: precision_at_1
value: 22.671
- type: precision_at_10
value: 8.619
- type: precision_at_100
value: 1.678
- type: precision_at_1000
value: 0.23500000000000001
- type: precision_at_3
value: 15.592
- type: precision_at_5
value: 12.43
- type: recall_at_1
value: 10.144
- type: recall_at_10
value: 33.46
- type: recall_at_100
value: 59.758
- type: recall_at_1000
value: 79.704
- type: recall_at_3
value: 19.604
- type: recall_at_5
value: 25.367
- task:
type: Retrieval
dataset:
type: dbpedia-entity
name: MTEB DBPedia
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 8.654
- type: map_at_10
value: 18.506
- type: map_at_100
value: 26.412999999999997
- type: map_at_1000
value: 28.13
- type: map_at_3
value: 13.379
- type: map_at_5
value: 15.529000000000002
- type: mrr_at_1
value: 66.0
- type: mrr_at_10
value: 74.13
- type: mrr_at_100
value: 74.48700000000001
- type: mrr_at_1000
value: 74.49799999999999
- type: mrr_at_3
value: 72.75
- type: mrr_at_5
value: 73.762
- type: ndcg_at_1
value: 54.50000000000001
- type: ndcg_at_10
value: 40.236
- type: ndcg_at_100
value: 44.690999999999995
- type: ndcg_at_1000
value: 52.195
- type: ndcg_at_3
value: 45.632
- type: ndcg_at_5
value: 42.952
- type: precision_at_1
value: 66.0
- type: precision_at_10
value: 31.724999999999998
- type: precision_at_100
value: 10.299999999999999
- type: precision_at_1000
value: 2.194
- type: precision_at_3
value: 48.75
- type: precision_at_5
value: 41.6
- type: recall_at_1
value: 8.654
- type: recall_at_10
value: 23.74
- type: recall_at_100
value: 50.346999999999994
- type: recall_at_1000
value: 74.376
- type: recall_at_3
value: 14.636
- type: recall_at_5
value: 18.009
- task:
type: Classification
dataset:
type: mteb/emotion
name: MTEB EmotionClassification
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 53.245
- type: f1
value: 48.74520523753552
- task:
type: Retrieval
dataset:
type: fever
name: MTEB FEVER
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 51.729
- type: map_at_10
value: 63.904
- type: map_at_100
value: 64.363
- type: map_at_1000
value: 64.38199999999999
- type: map_at_3
value: 61.393
- type: map_at_5
value: 63.02100000000001
- type: mrr_at_1
value: 55.686
- type: mrr_at_10
value: 67.804
- type: mrr_at_100
value: 68.15299999999999
- type: mrr_at_1000
value: 68.161
- type: mrr_at_3
value: 65.494
- type: mrr_at_5
value: 67.01599999999999
- type: ndcg_at_1
value: 55.686
- type: ndcg_at_10
value: 70.025
- type: ndcg_at_100
value: 72.011
- type: ndcg_at_1000
value: 72.443
- type: ndcg_at_3
value: 65.32900000000001
- type: ndcg_at_5
value: 68.05600000000001
- type: precision_at_1
value: 55.686
- type: precision_at_10
value: 9.358
- type: precision_at_100
value: 1.05
- type: precision_at_1000
value: 0.11
- type: precision_at_3
value: 26.318
- type: precision_at_5
value: 17.321
- type: recall_at_1
value: 51.729
- type: recall_at_10
value: 85.04
- type: recall_at_100
value: 93.777
- type: recall_at_1000
value: 96.824
- type: recall_at_3
value: 72.521
- type: recall_at_5
value: 79.148
- task:
type: Retrieval
dataset:
type: fiqa
name: MTEB FiQA2018
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 23.765
- type: map_at_10
value: 39.114
- type: map_at_100
value: 40.987
- type: map_at_1000
value: 41.155
- type: map_at_3
value: 34.028000000000006
- type: map_at_5
value: 36.925000000000004
- type: mrr_at_1
value: 46.451
- type: mrr_at_10
value: 54.711
- type: mrr_at_100
value: 55.509
- type: mrr_at_1000
value: 55.535000000000004
- type: mrr_at_3
value: 52.649
- type: mrr_at_5
value: 53.729000000000006
- type: ndcg_at_1
value: 46.451
- type: ndcg_at_10
value: 46.955999999999996
- type: ndcg_at_100
value: 53.686
- type: ndcg_at_1000
value: 56.230000000000004
- type: ndcg_at_3
value: 43.374
- type: ndcg_at_5
value: 44.372
- type: precision_at_1
value: 46.451
- type: precision_at_10
value: 13.256
- type: precision_at_100
value: 2.019
- type: precision_at_1000
value: 0.247
- type: precision_at_3
value: 29.115000000000002
- type: precision_at_5
value: 21.389
- type: recall_at_1
value: 23.765
- type: recall_at_10
value: 53.452999999999996
- type: recall_at_100
value: 78.828
- type: recall_at_1000
value: 93.938
- type: recall_at_3
value: 39.023
- type: recall_at_5
value: 45.18
- task:
type: Retrieval
dataset:
type: hotpotqa
name: MTEB HotpotQA
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 31.918000000000003
- type: map_at_10
value: 46.741
- type: map_at_100
value: 47.762
- type: map_at_1000
value: 47.849000000000004
- type: map_at_3
value: 43.578
- type: map_at_5
value: 45.395
- type: mrr_at_1
value: 63.834999999999994
- type: mrr_at_10
value: 71.312
- type: mrr_at_100
value: 71.695
- type: mrr_at_1000
value: 71.714
- type: mrr_at_3
value: 69.82000000000001
- type: mrr_at_5
value: 70.726
- type: ndcg_at_1
value: 63.834999999999994
- type: ndcg_at_10
value: 55.879999999999995
- type: ndcg_at_100
value: 59.723000000000006
- type: ndcg_at_1000
value: 61.49400000000001
- type: ndcg_at_3
value: 50.964
- type: ndcg_at_5
value: 53.47
- type: precision_at_1
value: 63.834999999999994
- type: precision_at_10
value: 11.845
- type: precision_at_100
value: 1.4869999999999999
- type: precision_at_1000
value: 0.172
- type: precision_at_3
value: 32.158
- type: precision_at_5
value: 21.278
- type: recall_at_1
value: 31.918000000000003
- type: recall_at_10
value: 59.223000000000006
- type: recall_at_100
value: 74.328
- type: recall_at_1000
value: 86.05000000000001
- type: recall_at_3
value: 48.238
- type: recall_at_5
value: 53.193999999999996
- task:
type: Classification
dataset:
type: mteb/imdb
name: MTEB ImdbClassification
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 79.7896
- type: ap
value: 73.65166029460288
- type: f1
value: 79.71794693711813
- task:
type: Retrieval
dataset:
type: msmarco
name: MTEB MSMARCO
config: default
split: dev
revision: None
metrics:
- type: map_at_1
value: 22.239
- type: map_at_10
value: 34.542
- type: map_at_100
value: 35.717999999999996
- type: map_at_1000
value: 35.764
- type: map_at_3
value: 30.432
- type: map_at_5
value: 32.81
- type: mrr_at_1
value: 22.908
- type: mrr_at_10
value: 35.127
- type: mrr_at_100
value: 36.238
- type: mrr_at_1000
value: 36.278
- type: mrr_at_3
value: 31.076999999999998
- type: mrr_at_5
value: 33.419
- type: ndcg_at_1
value: 22.908
- type: ndcg_at_10
value: 41.607
- type: ndcg_at_100
value: 47.28
- type: ndcg_at_1000
value: 48.414
- type: ndcg_at_3
value: 33.253
- type: ndcg_at_5
value: 37.486000000000004
- type: precision_at_1
value: 22.908
- type: precision_at_10
value: 6.645
- type: precision_at_100
value: 0.9490000000000001
- type: precision_at_1000
value: 0.105
- type: precision_at_3
value: 14.130999999999998
- type: precision_at_5
value: 10.616
- type: recall_at_1
value: 22.239
- type: recall_at_10
value: 63.42
- type: recall_at_100
value: 89.696
- type: recall_at_1000
value: 98.351
- type: recall_at_3
value: 40.77
- type: recall_at_5
value: 50.93
- task:
type: Classification
dataset:
type: mteb/mtop_domain
name: MTEB MTOPDomainClassification (en)
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 95.06839945280439
- type: f1
value: 94.74276398224072
- task:
type: Classification
dataset:
type: mteb/mtop_intent
name: MTEB MTOPIntentClassification (en)
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 72.25718194254446
- type: f1
value: 53.91164489161391
- task:
type: Classification
dataset:
type: mteb/amazon_massive_intent
name: MTEB MassiveIntentClassification (en)
config: en
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 71.47948890383323
- type: f1
value: 69.98520247230257
- task:
type: Classification
dataset:
type: mteb/amazon_massive_scenario
name: MTEB MassiveScenarioClassification (en)
config: en
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 76.46603900470748
- type: f1
value: 76.44111526065399
- task:
type: Clustering
dataset:
type: mteb/medrxiv-clustering-p2p
name: MTEB MedrxivClusteringP2P
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: v_measure
value: 33.19106070798198
- task:
type: Clustering
dataset:
type: mteb/medrxiv-clustering-s2s
name: MTEB MedrxivClusteringS2S
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: v_measure
value: 30.78772205248094
- task:
type: Reranking
dataset:
type: mteb/mind_small
name: MTEB MindSmallReranking
config: default
split: test
revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
metrics:
- type: map
value: 31.811231631488507
- type: mrr
value: 32.98200485378021
- task:
type: Retrieval
dataset:
type: nfcorpus
name: MTEB NFCorpus
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 6.9
- type: map_at_10
value: 13.703000000000001
- type: map_at_100
value: 17.251
- type: map_at_1000
value: 18.795
- type: map_at_3
value: 10.366999999999999
- type: map_at_5
value: 11.675
- type: mrr_at_1
value: 47.059
- type: mrr_at_10
value: 55.816
- type: mrr_at_100
value: 56.434
- type: mrr_at_1000
value: 56.467
- type: mrr_at_3
value: 53.973000000000006
- type: mrr_at_5
value: 55.257999999999996
- type: ndcg_at_1
value: 44.737
- type: ndcg_at_10
value: 35.997
- type: ndcg_at_100
value: 33.487
- type: ndcg_at_1000
value: 41.897
- type: ndcg_at_3
value: 41.18
- type: ndcg_at_5
value: 38.721
- type: precision_at_1
value: 46.129999999999995
- type: precision_at_10
value: 26.533
- type: precision_at_100
value: 8.706
- type: precision_at_1000
value: 2.16
- type: precision_at_3
value: 38.493
- type: precision_at_5
value: 33.189
- type: recall_at_1
value: 6.9
- type: recall_at_10
value: 17.488999999999997
- type: recall_at_100
value: 34.583000000000006
- type: recall_at_1000
value: 64.942
- type: recall_at_3
value: 11.494
- type: recall_at_5
value: 13.496
- task:
type: Retrieval
dataset:
type: nq
name: MTEB NQ
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 33.028999999999996
- type: map_at_10
value: 49.307
- type: map_at_100
value: 50.205
- type: map_at_1000
value: 50.23
- type: map_at_3
value: 44.782
- type: map_at_5
value: 47.599999999999994
- type: mrr_at_1
value: 37.108999999999995
- type: mrr_at_10
value: 51.742999999999995
- type: mrr_at_100
value: 52.405
- type: mrr_at_1000
value: 52.422000000000004
- type: mrr_at_3
value: 48.087999999999994
- type: mrr_at_5
value: 50.414
- type: ndcg_at_1
value: 37.08
- type: ndcg_at_10
value: 57.236
- type: ndcg_at_100
value: 60.931999999999995
- type: ndcg_at_1000
value: 61.522
- type: ndcg_at_3
value: 48.93
- type: ndcg_at_5
value: 53.561
- type: precision_at_1
value: 37.08
- type: precision_at_10
value: 9.386
- type: precision_at_100
value: 1.1480000000000001
- type: precision_at_1000
value: 0.12
- type: precision_at_3
value: 22.258
- type: precision_at_5
value: 16.025
- type: recall_at_1
value: 33.028999999999996
- type: recall_at_10
value: 78.805
- type: recall_at_100
value: 94.643
- type: recall_at_1000
value: 99.039
- type: recall_at_3
value: 57.602
- type: recall_at_5
value: 68.253
- task:
type: Retrieval
dataset:
type: quora
name: MTEB QuoraRetrieval
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 71.122
- type: map_at_10
value: 85.237
- type: map_at_100
value: 85.872
- type: map_at_1000
value: 85.885
- type: map_at_3
value: 82.27499999999999
- type: map_at_5
value: 84.13199999999999
- type: mrr_at_1
value: 81.73
- type: mrr_at_10
value: 87.834
- type: mrr_at_100
value: 87.92
- type: mrr_at_1000
value: 87.921
- type: mrr_at_3
value: 86.878
- type: mrr_at_5
value: 87.512
- type: ndcg_at_1
value: 81.73
- type: ndcg_at_10
value: 88.85499999999999
- type: ndcg_at_100
value: 89.992
- type: ndcg_at_1000
value: 90.07
- type: ndcg_at_3
value: 85.997
- type: ndcg_at_5
value: 87.55199999999999
- type: precision_at_1
value: 81.73
- type: precision_at_10
value: 13.491
- type: precision_at_100
value: 1.536
- type: precision_at_1000
value: 0.157
- type: precision_at_3
value: 37.623
- type: precision_at_5
value: 24.742
- type: recall_at_1
value: 71.122
- type: recall_at_10
value: 95.935
- type: recall_at_100
value: 99.657
- type: recall_at_1000
value: 99.996
- type: recall_at_3
value: 87.80799999999999
- type: recall_at_5
value: 92.161
- task:
type: Clustering
dataset:
type: mteb/reddit-clustering
name: MTEB RedditClustering
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: v_measure
value: 63.490029238193756
- task:
type: Clustering
dataset:
type: mteb/reddit-clustering-p2p
name: MTEB RedditClusteringP2P
config: default
split: test
revision: 282350215ef01743dc01b456c7f5241fa8937f16
metrics:
- type: v_measure
value: 65.13153408508836
- task:
type: Retrieval
dataset:
type: scidocs
name: MTEB SCIDOCS
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 4.202999999999999
- type: map_at_10
value: 10.174
- type: map_at_100
value: 12.138
- type: map_at_1000
value: 12.418
- type: map_at_3
value: 7.379
- type: map_at_5
value: 8.727
- type: mrr_at_1
value: 20.7
- type: mrr_at_10
value: 30.389
- type: mrr_at_100
value: 31.566
- type: mrr_at_1000
value: 31.637999999999998
- type: mrr_at_3
value: 27.133000000000003
- type: mrr_at_5
value: 29.078
- type: ndcg_at_1
value: 20.7
- type: ndcg_at_10
value: 17.355999999999998
- type: ndcg_at_100
value: 25.151
- type: ndcg_at_1000
value: 30.37
- type: ndcg_at_3
value: 16.528000000000002
- type: ndcg_at_5
value: 14.396999999999998
- type: precision_at_1
value: 20.7
- type: precision_at_10
value: 8.98
- type: precision_at_100
value: 2.015
- type: precision_at_1000
value: 0.327
- type: precision_at_3
value: 15.367
- type: precision_at_5
value: 12.559999999999999
- type: recall_at_1
value: 4.202999999999999
- type: recall_at_10
value: 18.197
- type: recall_at_100
value: 40.903
- type: recall_at_1000
value: 66.427
- type: recall_at_3
value: 9.362
- type: recall_at_5
value: 12.747
- task:
type: STS
dataset:
type: mteb/sickr-sts
name: MTEB SICK-R
config: default
split: test
revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
metrics:
- type: cos_sim_spearman
value: 81.69890989765257
- task:
type: STS
dataset:
type: mteb/sts12-sts
name: MTEB STS12
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cos_sim_spearman
value: 75.31953790551489
- task:
type: STS
dataset:
type: mteb/sts13-sts
name: MTEB STS13
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cos_sim_spearman
value: 87.44050861280759
- task:
type: STS
dataset:
type: mteb/sts14-sts
name: MTEB STS14
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cos_sim_spearman
value: 81.86922869270393
- task:
type: STS
dataset:
type: mteb/sts15-sts
name: MTEB STS15
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cos_sim_spearman
value: 88.9399170304284
- task:
type: STS
dataset:
type: mteb/sts16-sts
name: MTEB STS16
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cos_sim_spearman
value: 85.38015314088582
- task:
type: STS
dataset:
type: mteb/sts17-crosslingual-sts
name: MTEB STS17 (en-en)
config: en-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_spearman
value: 90.53653527788835
- task:
type: STS
dataset:
type: mteb/sts22-crosslingual-sts
name: MTEB STS22 (en)
config: en
split: test
revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
metrics:
- type: cos_sim_spearman
value: 68.64526474250209
- task:
type: STS
dataset:
type: mteb/stsbenchmark-sts
name: MTEB STSBenchmark
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cos_sim_spearman
value: 86.56156983963042
- task:
type: Reranking
dataset:
type: mteb/scidocs-reranking
name: MTEB SciDocsRR
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 79.48610254648003
- type: mrr
value: 94.02481505422682
- task:
type: Retrieval
dataset:
type: scifact
name: MTEB SciFact
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 48.983
- type: map_at_10
value: 59.077999999999996
- type: map_at_100
value: 59.536
- type: map_at_1000
value: 59.575
- type: map_at_3
value: 55.691
- type: map_at_5
value: 57.410000000000004
- type: mrr_at_1
value: 51.666999999999994
- type: mrr_at_10
value: 60.427
- type: mrr_at_100
value: 60.763
- type: mrr_at_1000
value: 60.79900000000001
- type: mrr_at_3
value: 57.556
- type: mrr_at_5
value: 59.089000000000006
- type: ndcg_at_1
value: 51.666999999999994
- type: ndcg_at_10
value: 64.559
- type: ndcg_at_100
value: 66.58
- type: ndcg_at_1000
value: 67.64
- type: ndcg_at_3
value: 58.287
- type: ndcg_at_5
value: 61.001000000000005
- type: precision_at_1
value: 51.666999999999994
- type: precision_at_10
value: 9.067
- type: precision_at_100
value: 1.0170000000000001
- type: precision_at_1000
value: 0.11100000000000002
- type: precision_at_3
value: 23.0
- type: precision_at_5
value: 15.6
- type: recall_at_1
value: 48.983
- type: recall_at_10
value: 80.289
- type: recall_at_100
value: 89.43299999999999
- type: recall_at_1000
value: 97.667
- type: recall_at_3
value: 62.978
- type: recall_at_5
value: 69.872
- task:
type: PairClassification
dataset:
type: mteb/sprintduplicatequestions-pairclassification
name: MTEB SprintDuplicateQuestions
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cos_sim_accuracy
value: 99.79009900990098
- type: cos_sim_ap
value: 94.94115052608419
- type: cos_sim_f1
value: 89.1260162601626
- type: cos_sim_precision
value: 90.599173553719
- type: cos_sim_recall
value: 87.7
- type: dot_accuracy
value: 99.79009900990098
- type: dot_ap
value: 94.94115052608419
- type: dot_f1
value: 89.1260162601626
- type: dot_precision
value: 90.599173553719
- type: dot_recall
value: 87.7
- type: euclidean_accuracy
value: 99.79009900990098
- type: euclidean_ap
value: 94.94115052608419
- type: euclidean_f1
value: 89.1260162601626
- type: euclidean_precision
value: 90.599173553719
- type: euclidean_recall
value: 87.7
- type: manhattan_accuracy
value: 99.7940594059406
- type: manhattan_ap
value: 94.95271414642431
- type: manhattan_f1
value: 89.24508790072387
- type: manhattan_precision
value: 92.3982869379015
- type: manhattan_recall
value: 86.3
- type: max_accuracy
value: 99.7940594059406
- type: max_ap
value: 94.95271414642431
- type: max_f1
value: 89.24508790072387
- task:
type: Clustering
dataset:
type: mteb/stackexchange-clustering
name: MTEB StackExchangeClustering
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: v_measure
value: 68.43866571935851
- task:
type: Clustering
dataset:
type: mteb/stackexchange-clustering-p2p
name: MTEB StackExchangeClusteringP2P
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: v_measure
value: 35.16579026551532
- task:
type: Reranking
dataset:
type: mteb/stackoverflowdupquestions-reranking
name: MTEB StackOverflowDupQuestions
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 52.518952473513934
- type: mrr
value: 53.292457134368895
- task:
type: Summarization
dataset:
type: mteb/summeval
name: MTEB SummEval
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cos_sim_pearson
value: 31.12529588316604
- type: cos_sim_spearman
value: 32.31662126895294
- type: dot_pearson
value: 31.125303796647056
- type: dot_spearman
value: 32.31662126895294
- task:
type: Retrieval
dataset:
type: trec-covid
name: MTEB TRECCOVID
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 0.219
- type: map_at_10
value: 1.7469999999999999
- type: map_at_100
value: 10.177999999999999
- type: map_at_1000
value: 26.108999999999998
- type: map_at_3
value: 0.64
- type: map_at_5
value: 0.968
- type: mrr_at_1
value: 82.0
- type: mrr_at_10
value: 89.067
- type: mrr_at_100
value: 89.067
- type: mrr_at_1000
value: 89.067
- type: mrr_at_3
value: 88.333
- type: mrr_at_5
value: 88.73299999999999
- type: ndcg_at_1
value: 78.0
- type: ndcg_at_10
value: 71.398
- type: ndcg_at_100
value: 55.574999999999996
- type: ndcg_at_1000
value: 51.771
- type: ndcg_at_3
value: 77.765
- type: ndcg_at_5
value: 73.614
- type: precision_at_1
value: 82.0
- type: precision_at_10
value: 75.4
- type: precision_at_100
value: 58.040000000000006
- type: precision_at_1000
value: 23.516000000000002
- type: precision_at_3
value: 84.0
- type: precision_at_5
value: 78.4
- type: recall_at_1
value: 0.219
- type: recall_at_10
value: 1.958
- type: recall_at_100
value: 13.797999999999998
- type: recall_at_1000
value: 49.881
- type: recall_at_3
value: 0.672
- type: recall_at_5
value: 1.0370000000000001
- task:
type: Retrieval
dataset:
type: webis-touche2020
name: MTEB Touche2020
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 1.8610000000000002
- type: map_at_10
value: 8.705
- type: map_at_100
value: 15.164
- type: map_at_1000
value: 16.78
- type: map_at_3
value: 4.346
- type: map_at_5
value: 6.151
- type: mrr_at_1
value: 22.448999999999998
- type: mrr_at_10
value: 41.556
- type: mrr_at_100
value: 42.484
- type: mrr_at_1000
value: 42.494
- type: mrr_at_3
value: 37.755
- type: mrr_at_5
value: 40.102
- type: ndcg_at_1
value: 21.429000000000002
- type: ndcg_at_10
value: 23.439
- type: ndcg_at_100
value: 36.948
- type: ndcg_at_1000
value: 48.408
- type: ndcg_at_3
value: 22.261
- type: ndcg_at_5
value: 23.085
- type: precision_at_1
value: 22.448999999999998
- type: precision_at_10
value: 21.633
- type: precision_at_100
value: 8.02
- type: precision_at_1000
value: 1.5939999999999999
- type: precision_at_3
value: 23.810000000000002
- type: precision_at_5
value: 24.490000000000002
- type: recall_at_1
value: 1.8610000000000002
- type: recall_at_10
value: 15.876000000000001
- type: recall_at_100
value: 50.300999999999995
- type: recall_at_1000
value: 86.098
- type: recall_at_3
value: 5.892
- type: recall_at_5
value: 9.443
- task:
type: Classification
dataset:
type: mteb/toxic_conversations_50k
name: MTEB ToxicConversationsClassification
config: default
split: test
revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
metrics:
- type: accuracy
value: 70.3264
- type: ap
value: 13.249577616243794
- type: f1
value: 53.621518367695685
- task:
type: Classification
dataset:
type: mteb/tweet_sentiment_extraction
name: MTEB TweetSentimentExtractionClassification
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 61.57611771363894
- type: f1
value: 61.79797478568639
- task:
type: Clustering
dataset:
type: mteb/twentynewsgroups-clustering
name: MTEB TwentyNewsgroupsClustering
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: v_measure
value: 53.38315344479284
- task:
type: PairClassification
dataset:
type: mteb/twittersemeval2015-pairclassification
name: MTEB TwitterSemEval2015
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cos_sim_accuracy
value: 87.55438993860642
- type: cos_sim_ap
value: 77.98702600017738
- type: cos_sim_f1
value: 71.94971653931476
- type: cos_sim_precision
value: 67.50693802035153
- type: cos_sim_recall
value: 77.01846965699208
- type: dot_accuracy
value: 87.55438993860642
- type: dot_ap
value: 77.98702925907986
- type: dot_f1
value: 71.94971653931476
- type: dot_precision
value: 67.50693802035153
- type: dot_recall
value: 77.01846965699208
- type: euclidean_accuracy
value: 87.55438993860642
- type: euclidean_ap
value: 77.98702951957925
- type: euclidean_f1
value: 71.94971653931476
- type: euclidean_precision
value: 67.50693802035153
- type: euclidean_recall
value: 77.01846965699208
- type: manhattan_accuracy
value: 87.54246885617214
- type: manhattan_ap
value: 77.95531413902947
- type: manhattan_f1
value: 71.93605683836589
- type: manhattan_precision
value: 69.28152492668622
- type: manhattan_recall
value: 74.80211081794195
- type: max_accuracy
value: 87.55438993860642
- type: max_ap
value: 77.98702951957925
- type: max_f1
value: 71.94971653931476
- task:
type: PairClassification
dataset:
type: mteb/twitterurlcorpus-pairclassification
name: MTEB TwitterURLCorpus
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cos_sim_accuracy
value: 89.47296930182016
- type: cos_sim_ap
value: 86.92853616302108
- type: cos_sim_f1
value: 79.35138351681047
- type: cos_sim_precision
value: 76.74820143884892
- type: cos_sim_recall
value: 82.13735756082538
- type: dot_accuracy
value: 89.47296930182016
- type: dot_ap
value: 86.92854339601595
- type: dot_f1
value: 79.35138351681047
- type: dot_precision
value: 76.74820143884892
- type: dot_recall
value: 82.13735756082538
- type: euclidean_accuracy
value: 89.47296930182016
- type: euclidean_ap
value: 86.92854191061649
- type: euclidean_f1
value: 79.35138351681047
- type: euclidean_precision
value: 76.74820143884892
- type: euclidean_recall
value: 82.13735756082538
- type: manhattan_accuracy
value: 89.47685023479644
- type: manhattan_ap
value: 86.90063722679578
- type: manhattan_f1
value: 79.30753865502702
- type: manhattan_precision
value: 76.32066068631639
- type: manhattan_recall
value: 82.53772713273791
- type: max_accuracy
value: 89.47685023479644
- type: max_ap
value: 86.92854339601595
- type: max_f1
value: 79.35138351681047
---
# hkunlp/instructor-xl
We introduce **Instructor**👨‍🏫, an instruction-finetuned text embedding model that can generate text embeddings tailored to any task (e.g., classification, retrieval, clustering, text evaluation, etc.) and domain (e.g., science, finance, etc.) ***by simply providing the task instruction, without any finetuning***. Instructor👨‍🏫 achieves state-of-the-art performance on 70 diverse embedding tasks!
The model is easy to use with **our customized** `sentence-transformers` library. For more details, check out [our paper](https://arxiv.org/abs/2212.09741) and [project page](https://instructor-embedding.github.io/)!
**************************** **Updates** ****************************
* 01/21: We released a new [checkpoint](https://huggingface.co/hkunlp/instructor-xl) trained with hard negatives, which gives better performance.
* 12/21: We released our [paper](https://arxiv.org/abs/2212.09741), [code](https://github.com/HKUNLP/instructor-embedding), [checkpoint](https://huggingface.co/hkunlp/instructor-xl) and [project page](https://instructor-embedding.github.io/)! Check them out!
## Quick start
<hr />
## Installation
```bash
pip install InstructorEmbedding
```
## Compute your customized embeddings
Then you can use the model like this to calculate domain-specific and task-aware embeddings:
```python
from InstructorEmbedding import INSTRUCTOR
model = INSTRUCTOR('hkunlp/instructor-xl')
sentence = "3D ActionSLAM: wearable person tracking in multi-floor environments"
instruction = "Represent the Science title:"
embeddings = model.encode([[instruction,sentence]])
print(embeddings)
```
## Use cases
<hr />
## Calculate embeddings for your customized texts
If you want to calculate customized embeddings for specific sentences, you may follow the unified template to write instructions:
Represent the `domain` `text_type` for `task_objective`:
* `domain` is optional, and it specifies the domain of the text, e.g., science, finance, medicine, etc.
* `text_type` is required, and it specifies the encoding unit, e.g., sentence, document, paragraph, etc.
* `task_objective` is optional, and it specifies the objective of the embedding, e.g., retrieve a document, classify the sentence, etc.
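As an illustration, a small helper (hypothetical, not part of the `InstructorEmbedding` API) can assemble instruction strings from these template parts:

```python
# Hypothetical helper that builds an instruction from the unified template:
# "Represent the `domain` `text_type` for `task_objective`:"
def build_instruction(text_type, domain=None, task_objective=None):
    parts = ["Represent the"]
    if domain:
        parts.append(domain)          # optional domain, e.g. "Science"
    parts.append(text_type)           # required encoding unit, e.g. "sentence"
    if task_objective:
        parts.append(f"for {task_objective}")  # optional objective
    return " ".join(parts) + ":"

print(build_instruction("sentence", domain="Science"))
# -> Represent the Science sentence:
print(build_instruction("document", domain="Wikipedia", task_objective="retrieval"))
# -> Represent the Wikipedia document for retrieval:
```

The resulting string is then paired with the text to embed, as in the examples below.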
## Calculate sentence similarities
You can further use the model to compute similarities between two groups of sentences, with **customized embeddings**.
```python
from sklearn.metrics.pairwise import cosine_similarity
from InstructorEmbedding import INSTRUCTOR

# Load the model as in the quick-start example above
model = INSTRUCTOR('hkunlp/instructor-xl')
sentences_a = [['Represent the Science sentence: ','Parton energy loss in QCD matter'],
['Represent the Financial statement: ','The Federal Reserve on Wednesday raised its benchmark interest rate.']]
sentences_b = [['Represent the Science sentence: ','The Chiral Phase Transition in Dissipative Dynamics'],
['Represent the Financial statement: ','The funds rose less than 0.5 per cent on Friday']]
embeddings_a = model.encode(sentences_a)
embeddings_b = model.encode(sentences_b)
similarities = cosine_similarity(embeddings_a,embeddings_b)
print(similarities)
```
## Information Retrieval
You can also use **customized embeddings** for information retrieval.
```python
import numpy as np
from sklearn.metrics.pairwise import cosine_similarity
from InstructorEmbedding import INSTRUCTOR

# Load the model as in the quick-start example above
model = INSTRUCTOR('hkunlp/instructor-xl')
query = [['Represent the Wikipedia question for retrieving supporting documents: ','where is the food stored in a yam plant']]
corpus = [['Represent the Wikipedia document for retrieval: ','Capitalism has been dominant in the Western world since the end of feudalism, but most feel[who?] that the term "mixed economies" more precisely describes most contemporary economies, due to their containing both private-owned and state-owned enterprises. In capitalism, prices determine the demand-supply scale. For example, higher demand for certain goods and services lead to higher prices and lower demand for certain goods lead to lower prices.'],
['Represent the Wikipedia document for retrieval: ',"The disparate impact theory is especially controversial under the Fair Housing Act because the Act regulates many activities relating to housing, insurance, and mortgage loans—and some scholars have argued that the theory's use under the Fair Housing Act, combined with extensions of the Community Reinvestment Act, contributed to rise of sub-prime lending and the crash of the U.S. housing market and ensuing global economic recession"],
['Represent the Wikipedia document for retrieval: ','Disparate impact in United States labor law refers to practices in employment, housing, and other areas that adversely affect one group of people of a protected characteristic more than another, even though rules applied by employers or landlords are formally neutral. Although the protected classes vary by statute, most federal civil rights laws protect based on race, color, religion, national origin, and sex as protected traits, and some laws include disability status and other traits as well.']]
query_embeddings = model.encode(query)
corpus_embeddings = model.encode(corpus)
similarities = cosine_similarity(query_embeddings,corpus_embeddings)
retrieved_doc_id = np.argmax(similarities)
print(retrieved_doc_id)
```
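Beyond the single best match, you can rank the whole corpus by sorting the similarity scores. A minimal sketch, using hypothetical scores in place of the `cosine_similarity` output above:

```python
import numpy as np

# Hypothetical similarity scores for one query against three documents,
# standing in for the output of cosine_similarity above.
similarities = np.array([[0.12, 0.34, 0.78]])

# Rank all documents from most to least similar, instead of keeping only argmax.
ranked = np.argsort(similarities[0])[::-1]
print(ranked.tolist())  # -> [2, 1, 0]
```

The same ranking applies unchanged to the real scores computed from the model's embeddings.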
## Clustering
Use **customized embeddings** for clustering texts in groups.
```python
import sklearn.cluster
from InstructorEmbedding import INSTRUCTOR

# Load the model as in the quick-start example above
model = INSTRUCTOR('hkunlp/instructor-xl')
sentences = [['Represent the Medicine sentence for clustering: ','Dynamical Scalar Degree of Freedom in Horava-Lifshitz Gravity'],
['Represent the Medicine sentence for clustering: ','Comparison of Atmospheric Neutrino Flux Calculations at Low Energies'],
['Represent the Medicine sentence for clustering: ','Fermion Bags in the Massive Gross-Neveu Model'],
['Represent the Medicine sentence for clustering: ',"QCD corrections to Associated t-tbar-H production at the Tevatron"],
['Represent the Medicine sentence for clustering: ','A New Analysis of the R Measurements: Resonance Parameters of the Higher, Vector States of Charmonium']]
embeddings = model.encode(sentences)
clustering_model = sklearn.cluster.MiniBatchKMeans(n_clusters=2)
clustering_model.fit(embeddings)
cluster_assignment = clustering_model.labels_
print(cluster_assignment)
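# As a quick sanity check (synthetic 2-D points, not real Instructor
# embeddings), well-separated inputs should land in separate clusters:
import numpy as np
import sklearn.cluster
toy_embeddings = np.array([[0.0, 0.1], [0.1, 0.0], [0.05, 0.05],
                           [5.0, 5.1], [5.1, 5.0], [5.05, 5.05]])
toy_model = sklearn.cluster.MiniBatchKMeans(n_clusters=2, n_init=10,
                                            random_state=0)
toy_labels = toy_model.fit_predict(toy_embeddings)
print(toy_labels)  # first three points share one label, last three the other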
``` | 66,324 | [
[
-0.013427734375,
-0.078125,
0.036468505859375,
0.00106048583984375,
-0.0021381378173828125,
-0.01514434814453125,
-0.0179901123046875,
-0.00809478759765625,
0.0195770263671875,
0.0198822021484375,
-0.0183563232421875,
-0.060791015625,
-0.03460693359375,
0.0005106925964355469,
-0.02410888671875,
0.08087158203125,
-0.00661468505859375,
0.01300811767578125,
-0.04461669921875,
-0.00928497314453125,
-0.022674560546875,
-0.039520263671875,
-0.012664794921875,
-0.024566650390625,
0.0267181396484375,
0.017669677734375,
0.042510986328125,
0.054901123046875,
0.0283203125,
0.02496337890625,
-0.0035037994384765625,
-0.00910186767578125,
-0.0276336669921875,
-0.0169677734375,
-0.0091094970703125,
-0.0443115234375,
-0.0155487060546875,
-0.0006275177001953125,
0.0538330078125,
0.05609130859375,
-0.005054473876953125,
0.00518798828125,
0.0206756591796875,
0.024749755859375,
-0.03961181640625,
0.021392822265625,
-0.0266876220703125,
0.01031494140625,
-0.0016307830810546875,
0.0006670951843261719,
-0.01454925537109375,
-0.0234222412109375,
0.01727294921875,
-0.055450439453125,
0.01068115234375,
0.0165252685546875,
0.0731201171875,
0.0299835205078125,
-0.034942626953125,
-0.03814697265625,
-0.0048980712890625,
0.0806884765625,
-0.060150146484375,
0.027069091796875,
0.0312042236328125,
-0.0087432861328125,
-0.0215606689453125,
-0.06781005859375,
-0.05120849609375,
-0.017303466796875,
-0.0246734619140625,
0.011444091796875,
-0.00411224365234375,
-0.0006480216979980469,
0.0188446044921875,
0.035675048828125,
-0.0487060546875,
0.004062652587890625,
-0.03173828125,
-0.01404571533203125,
0.036407470703125,
0.02081298828125,
0.034454345703125,
-0.0357666015625,
-0.0280914306640625,
-0.02325439453125,
-0.0288238525390625,
0.012664794921875,
0.0233001708984375,
0.01190948486328125,
-0.0291748046875,
0.06256103515625,
-0.0002276897430419922,
0.037872314453125,
-0.00028395652770996094,
-0.0185699462890625,
0.03778076171875,
-0.0273590087890625,
-0.00963592529296875,
0.0034999847412109375,
0.0633544921875,
0.031646728515625,
-0.01082611083984375,
0.0012989044189453125,
0.004077911376953125,
0.01233673095703125,
0.0034084320068359375,
-0.0499267578125,
-0.0233001708984375,
0.0182342529296875,
-0.043975830078125,
-0.02691650390625,
0.00846099853515625,
-0.06158447265625,
-0.01036834716796875,
-0.00931549072265625,
0.05499267578125,
-0.0433349609375,
0.0078125,
0.01532745361328125,
-0.0263519287109375,
-0.0029735565185546875,
0.002044677734375,
-0.059539794921875,
0.0298004150390625,
0.05322265625,
0.062744140625,
-0.00977325439453125,
-0.033721923828125,
-0.01031494140625,
0.00815582275390625,
0.0071563720703125,
0.03662109375,
-0.05316162109375,
-0.00942230224609375,
0.0082244873046875,
0.00952911376953125,
-0.0157928466796875,
-0.0247039794921875,
0.0203704833984375,
-0.007457733154296875,
0.03790283203125,
-0.0034809112548828125,
-0.07568359375,
-0.037872314453125,
0.0188446044921875,
-0.036224365234375,
0.08636474609375,
0.01503753662109375,
-0.0736083984375,
-0.00034165382385253906,
-0.0618896484375,
-0.031005859375,
-0.0214385986328125,
-0.018890380859375,
-0.0341796875,
-0.018890380859375,
0.001720428466796875,
0.055328369140625,
-0.0029582977294921875,
0.01001739501953125,
-0.0265350341796875,
-0.027587890625,
0.0243682861328125,
0.00624847412109375,
0.07183837890625,
0.0115966796875,
-0.03155517578125,
0.002643585205078125,
-0.045318603515625,
-0.011383056640625,
0.0087432861328125,
-0.03680419921875,
-0.038665771484375,
0.0034084320068359375,
0.01461029052734375,
-0.01357269287109375,
0.036956787109375,
-0.046051025390625,
0.0249481201171875,
-0.02239990234375,
0.0538330078125,
0.051025390625,
0.00940704345703125,
0.035247802734375,
-0.028778076171875,
0.0263519287109375,
0.004802703857421875,
-0.001598358154296875,
-0.01122283935546875,
-0.04632568359375,
-0.048828125,
-0.0231170654296875,
0.027191162109375,
0.04620361328125,
-0.04046630859375,
0.0484619140625,
-0.00815582275390625,
-0.049835205078125,
-0.042236328125,
-0.0028533935546875,
0.0297088623046875,
0.0216217041015625,
0.0491943359375,
-0.00848388671875,
-0.046661376953125,
-0.052398681640625,
-0.024871826171875,
0.007541656494140625,
0.016082763671875,
0.0270843505859375,
0.0635986328125,
-0.037322998046875,
0.05731201171875,
-0.07476806640625,
-0.0240325927734375,
-0.0162200927734375,
-0.01090240478515625,
0.018218994140625,
0.06146240234375,
0.054534912109375,
-0.05877685546875,
-0.0484619140625,
-0.008148193359375,
-0.049285888671875,
0.0173797607421875,
-0.01020050048828125,
-0.00792694091796875,
-0.0005197525024414062,
0.039794921875,
-0.04095458984375,
0.029510498046875,
0.0263214111328125,
-0.0313720703125,
0.039703369140625,
-0.0225372314453125,
0.002979278564453125,
-0.1065673828125,
-0.0019159317016601562,
0.0167083740234375,
-0.0091400146484375,
-0.038604736328125,
0.0177764892578125,
-0.0077056884765625,
0.0016698837280273438,
-0.0267486572265625,
0.04296875,
-0.0223541259765625,
0.012359619140625,
0.0010528564453125,
0.020751953125,
-0.0092315673828125,
0.047515869140625,
-0.0111236572265625,
0.03375244140625,
0.0496826171875,
-0.05364990234375,
0.026092529296875,
0.04339599609375,
-0.031280517578125,
0.0211029052734375,
-0.043304443359375,
-0.004039764404296875,
0.0016651153564453125,
0.0218353271484375,
-0.051025390625,
-0.01239013671875,
0.02825927734375,
-0.03143310546875,
0.0131072998046875,
0.00783538818359375,
-0.035186767578125,
-0.02984619140625,
-0.041961669921875,
0.0214080810546875,
0.045013427734375,
-0.031280517578125,
0.01453399658203125,
0.043914794921875,
-0.0009584426879882812,
-0.05572509765625,
-0.052093505859375,
-0.01299285888671875,
-0.0254669189453125,
-0.0220184326171875,
0.0440673828125,
-0.015899658203125,
-0.004657745361328125,
0.035919189453125,
0.005832672119140625,
-0.0165863037109375,
0.0139312744140625,
0.0234222412109375,
0.015869140625,
-0.01326751708984375,
0.011474609375,
0.0173797607421875,
0.0022068023681640625,
-0.002071380615234375,
-0.023712158203125,
0.04779052734375,
-0.01873779296875,
-0.01806640625,
-0.038177490234375,
0.031097412109375,
0.01537322998046875,
0.006992340087890625,
0.05133056640625,
0.06573486328125,
-0.044403076171875,
0.00543975830078125,
-0.0243682861328125,
-0.0191650390625,
-0.0372314453125,
0.05401611328125,
-0.01617431640625,
-0.0823974609375,
0.0207672119140625,
0.0027141571044921875,
0.006534576416015625,
0.04974365234375,
0.04180908203125,
-0.0106964111328125,
0.060333251953125,
0.058685302734375,
-0.0100860595703125,
0.04345703125,
-0.043121337890625,
0.0236968994140625,
-0.060302734375,
-0.0335693359375,
-0.04132080078125,
-0.0293426513671875,
-0.0616455078125,
-0.04119873046875,
0.01270294189453125,
0.021575927734375,
-0.0195465087890625,
0.0450439453125,
-0.042816162109375,
0.020782470703125,
0.04974365234375,
0.01392364501953125,
-0.004482269287109375,
0.01288604736328125,
-0.0260467529296875,
-0.011749267578125,
-0.06292724609375,
-0.04296875,
0.08941650390625,
0.028228759765625,
0.058685302734375,
-0.0165252685546875,
0.07537841796875,
0.0190277099609375,
0.0025787353515625,
-0.057373046875,
0.041351318359375,
-0.0294342041015625,
-0.032928466796875,
-0.01216888427734375,
-0.034759521484375,
-0.0819091796875,
0.021270751953125,
-0.033050537109375,
-0.061737060546875,
0.01136016845703125,
-0.0017232894897460938,
-0.0224456787109375,
0.03521728515625,
-0.03887939453125,
0.08099365234375,
-0.006679534912109375,
-0.0142822265625,
-0.038177490234375,
-0.029937744140625,
0.002899169921875,
0.005168914794921875,
0.0222015380859375,
-0.00348663330078125,
-0.0057525634765625,
0.07940673828125,
-0.024993896484375,
0.07501220703125,
-0.01166534423828125,
0.0223846435546875,
0.0307464599609375,
-0.0256500244140625,
0.0129547119140625,
-0.001987457275390625,
-0.004276275634765625,
0.00732421875,
0.028594970703125,
-0.040740966796875,
-0.03765869140625,
0.06964111328125,
-0.07135009765625,
-0.033203125,
-0.0333251953125,
-0.05499267578125,
0.0225372314453125,
0.01715087890625,
0.0198974609375,
0.0261383056640625,
-0.0206298828125,
0.0306854248046875,
0.0280914306640625,
-0.035888671875,
0.022674560546875,
0.012054443359375,
-0.00756072998046875,
-0.03863525390625,
0.0853271484375,
0.0104827880859375,
-0.00528717041015625,
0.041473388671875,
0.0240478515625,
-0.0189208984375,
-0.0172882080078125,
-0.006443023681640625,
0.02825927734375,
-0.049407958984375,
-0.01079559326171875,
-0.07843017578125,
-0.0133514404296875,
-0.054443359375,
-0.028411865234375,
-0.01125335693359375,
-0.048583984375,
-0.045074462890625,
-0.00809478759765625,
0.02838134765625,
0.07281494140625,
-0.014312744140625,
0.0143280029296875,
-0.05218505859375,
0.0141143798828125,
0.002094268798828125,
0.001613616943359375,
0.0077972412109375,
-0.02301025390625,
-0.05078125,
0.009490966796875,
-0.045440673828125,
-0.062042236328125,
0.02618408203125,
0.015777587890625,
0.066650390625,
0.037384033203125,
0.010406494140625,
0.05279541015625,
-0.04547119140625,
0.07708740234375,
0.025146484375,
-0.07269287109375,
0.036956787109375,
0.002605438232421875,
-0.01253509521484375,
0.026123046875,
0.054779052734375,
-0.040252685546875,
-0.037567138671875,
-0.055267333984375,
-0.07086181640625,
0.037628173828125,
0.0124053955078125,
0.0249481201171875,
-0.01324462890625,
0.03765869140625,
0.0170745849609375,
0.01424407958984375,
-0.07427978515625,
-0.042144775390625,
-0.02862548828125,
-0.0499267578125,
-0.006847381591796875,
-0.0092315673828125,
-0.0015764236450195312,
-0.03887939453125,
0.04766845703125,
0.004241943359375,
0.038818359375,
0.023193359375,
-0.0252838134765625,
0.0186920166015625,
0.0198822021484375,
0.038482666015625,
0.0239105224609375,
-0.014862060546875,
0.0181732177734375,
0.0273590087890625,
-0.047943115234375,
0.007396697998046875,
0.01090240478515625,
-0.006927490234375,
0.0076904296875,
0.0277557373046875,
0.048980712890625,
0.033294677734375,
-0.046234130859375,
0.0452880859375,
0.01448822021484375,
-0.0211944580078125,
-0.0294647216796875,
0.0175628662109375,
0.004177093505859375,
0.033843994140625,
0.0200958251953125,
-0.002300262451171875,
0.01282501220703125,
-0.054534912109375,
0.0158233642578125,
0.012115478515625,
-0.03570556640625,
-0.027740478515625,
0.046630859375,
-0.000060558319091796875,
-0.0148468017578125,
0.033843994140625,
-0.0204925537109375,
-0.0643310546875,
0.055023193359375,
0.06427001953125,
0.051910400390625,
-0.0009245872497558594,
0.017303466796875,
0.05029296875,
0.013275146484375,
-0.00807952880859375,
0.024871826171875,
0.004016876220703125,
-0.0499267578125,
-0.003696441650390625,
-0.0445556640625,
-0.0198974609375,
0.00010651350021362305,
-0.057525634765625,
0.01218414306640625,
-0.0345458984375,
-0.0102386474609375,
-0.007320404052734375,
0.01425933837890625,
-0.05938720703125,
0.0053863525390625,
-0.00933074951171875,
0.05706787109375,
-0.06964111328125,
0.061737060546875,
0.08599853515625,
-0.03729248046875,
-0.04425048828125,
0.005817413330078125,
-0.012420654296875,
-0.049102783203125,
0.034454345703125,
0.034027099609375,
0.032867431640625,
0.007358551025390625,
-0.043548583984375,
-0.059417724609375,
0.10406494140625,
0.005741119384765625,
-0.0287017822265625,
-0.0063323974609375,
0.01166534423828125,
0.034210205078125,
-0.045745849609375,
0.00659942626953125,
0.0285186767578125,
0.0362548828125,
-0.04180908203125,
-0.0457763671875,
0.02935791015625,
-0.02001953125,
-0.01192474365234375,
-0.013153076171875,
-0.046783447265625,
0.08123779296875,
-0.0263519287109375,
-0.00409698486328125,
0.01507568359375,
0.048736572265625,
0.00926971435546875,
0.03887939453125,
0.038360595703125,
0.0699462890625,
0.06951904296875,
-0.002468109130859375,
0.0882568359375,
-0.036346435546875,
0.0516357421875,
0.06390380859375,
0.0134124755859375,
0.07550048828125,
0.02691650390625,
-0.029205322265625,
0.0640869140625,
0.058929443359375,
-0.031463623046875,
0.052032470703125,
0.0182647705078125,
-0.0019435882568359375,
-0.00414276123046875,
-0.007259368896484375,
-0.0538330078125,
0.0216522216796875,
0.02984619140625,
-0.0243377685546875,
0.0034847259521484375,
0.015625,
0.0079345703125,
0.00676727294921875,
-0.0110321044921875,
0.0491943359375,
0.023834228515625,
-0.0214996337890625,
0.0280914306640625,
0.0119171142578125,
0.0721435546875,
-0.0309295654296875,
-0.005893707275390625,
0.004116058349609375,
0.019317626953125,
-0.0321044921875,
-0.059661865234375,
0.0087432861328125,
-0.006504058837890625,
-0.01392364501953125,
0.0015401840209960938,
0.04278564453125,
-0.05120849609375,
-0.027435302734375,
0.03125,
0.020263671875,
0.029876708984375,
0.01235198974609375,
-0.0694580078125,
-0.01125335693359375,
0.0121002197265625,
-0.0167236328125,
0.0281524658203125,
0.021484375,
0.01480865478515625,
0.033966064453125,
0.051910400390625,
-0.0103302001953125,
0.01031494140625,
-0.00896453857421875,
0.06683349609375,
-0.06646728515625,
-0.0517578125,
-0.0565185546875,
0.037109375,
-0.004180908203125,
-0.016845703125,
0.057830810546875,
0.06427001953125,
0.07855224609375,
-0.0144500732421875,
0.060760498046875,
-0.0161895751953125,
0.023162841796875,
-0.037445068359375,
0.06744384765625,
-0.06683349609375,
-0.0062713623046875,
-0.031341552734375,
-0.07049560546875,
-0.01435089111328125,
0.078125,
-0.0198974609375,
-0.00720977783203125,
0.06671142578125,
0.05914306640625,
-0.007720947265625,
-0.01027679443359375,
0.01209259033203125,
0.026580810546875,
0.01268768310546875,
0.0386962890625,
0.049468994140625,
-0.051300048828125,
0.046600341796875,
-0.042266845703125,
-0.0038776397705078125,
-0.034454345703125,
-0.050445556640625,
-0.074951171875,
-0.061767578125,
-0.045806884765625,
-0.0298004150390625,
-0.0001901388168334961,
0.07574462890625,
0.0345458984375,
-0.07232666015625,
-0.0196380615234375,
0.00455474853515625,
0.0233001708984375,
-0.02734375,
-0.02410888671875,
0.048675537109375,
-0.0224456787109375,
-0.05810546875,
0.01666259765625,
-0.0046539306640625,
0.01258087158203125,
-0.00826263427734375,
-0.01216888427734375,
-0.036224365234375,
-0.0055084228515625,
0.039886474609375,
0.0018301010131835938,
-0.062103271484375,
-0.0257720947265625,
-0.004734039306640625,
-0.0250091552734375,
0.0017271041870117188,
0.04595947265625,
-0.03656005859375,
0.01033782958984375,
0.04278564453125,
0.05279541015625,
0.038543701171875,
-0.0207366943359375,
0.0231170654296875,
-0.043853759765625,
0.011962890625,
0.00720977783203125,
0.040618896484375,
0.01453399658203125,
-0.031524658203125,
0.039306640625,
0.027191162109375,
-0.0435791015625,
-0.0648193359375,
-0.00881195068359375,
-0.07781982421875,
-0.03729248046875,
0.0653076171875,
-0.04034423828125,
-0.0267791748046875,
-0.0014410018920898438,
-0.0178985595703125,
0.038604736328125,
-0.03277587890625,
0.05975341796875,
0.045806884765625,
0.0007381439208984375,
0.005100250244140625,
-0.03875732421875,
0.0216217041015625,
0.04156494140625,
-0.0491943359375,
-0.03399658203125,
0.017974853515625,
0.04052734375,
0.025360107421875,
0.0233001708984375,
0.003429412841796875,
0.01276397705078125,
0.0142974853515625,
0.004726409912109375,
-0.006175994873046875,
0.0006580352783203125,
-0.01309967041015625,
0.00925445556640625,
-0.03173828125,
-0.0260162353515625
]
] |
THUDM/chatglm3-6b | 2023-11-01T07:55:10.000Z | [
"transformers",
"pytorch",
"chatglm",
"glm",
"thudm",
"custom_code",
"zh",
"en",
"arxiv:2103.10360",
"arxiv:2210.02414",
"endpoints_compatible",
"has_space",
"region:us"
] | null | THUDM | null | null | THUDM/chatglm3-6b | 341 | 32,736 | transformers | 2023-10-25T09:56:41 | ---
language:
- zh
- en
tags:
- glm
- chatglm
- thudm
---
# ChatGLM3-6B
<p align="center">
💻 <a href="https://github.com/THUDM/ChatGLM" target="_blank">Github Repo</a> • 🐦 <a href="https://twitter.com/thukeg" target="_blank">Twitter</a> • 📃 <a href="https://arxiv.org/abs/2103.10360" target="_blank">[GLM@ACL 22]</a> <a href="https://github.com/THUDM/GLM" target="_blank">[GitHub]</a> • 📃 <a href="https://arxiv.org/abs/2210.02414" target="_blank">[GLM-130B@ICLR 23]</a> <a href="https://github.com/THUDM/GLM-130B" target="_blank">[GitHub]</a> <br>
</p>
<p align="center">
👋 Join our <a href="https://join.slack.com/t/chatglm/shared_invite/zt-25ti5uohv-A_hs~am_D3Q8XPZMpj7wwQ" target="_blank">Slack</a> and <a href="https://github.com/THUDM/ChatGLM/blob/main/resources/WECHAT.md" target="_blank">WeChat</a>
</p>
<p align="center">
📍Experience the larger-scale ChatGLM model at <a href="https://www.chatglm.cn">chatglm.cn</a>
</p>
## 介绍 (Introduction)
ChatGLM3-6B 是 ChatGLM 系列最新一代的开源模型,在保留了前两代模型对话流畅、部署门槛低等众多优秀特性的基础上,ChatGLM3-6B 引入了如下特性:
1. **更强大的基础模型:** ChatGLM3-6B 的基础模型 ChatGLM3-6B-Base 采用了更多样的训练数据、更充分的训练步数和更合理的训练策略。在语义、数学、推理、代码、知识等不同角度的数据集上测评显示,ChatGLM3-6B-Base 具有在 10B 以下的预训练模型中最强的性能。
2. **更完整的功能支持:** ChatGLM3-6B 采用了全新设计的 [Prompt 格式](https://github.com/THUDM/ChatGLM3/blob/main/PROMPT.md),除正常的多轮对话外,还原生支持[工具调用](https://github.com/THUDM/ChatGLM3/blob/main/tool_using/README.md)(Function Call)、代码执行(Code Interpreter)和 Agent 任务等复杂场景。
3. **更全面的开源序列:** 除了对话模型 ChatGLM3-6B 外,还开源了基础模型 ChatGLM3-6B-Base、长文本对话模型 ChatGLM3-6B-32K。以上所有权重对学术研究**完全开放**,在填写[问卷](https://open.bigmodel.cn/mla/form)进行登记后**亦允许免费商业使用**。
ChatGLM3-6B is the latest open-source model in the ChatGLM series. While retaining many excellent features such as smooth dialogue and low deployment threshold from the previous two generations, ChatGLM3-6B introduces the following features:
1. **More Powerful Base Model:** The base model of ChatGLM3-6B, ChatGLM3-6B-Base, employs a more diverse training dataset, more training steps, and an improved training strategy. Evaluations on datasets covering semantics, mathematics, reasoning, code, and knowledge show that ChatGLM3-6B-Base delivers the strongest performance among pre-trained models under 10B parameters.
2. **More Comprehensive Function Support:** ChatGLM3-6B adopts a newly designed [Prompt format](https://github.com/THUDM/ChatGLM3/blob/main/PROMPT_en.md). Beyond normal multi-turn dialogue, it also natively supports [function call](https://github.com/THUDM/ChatGLM3/blob/main/tool_using/README_en.md), code interpreter, and complex scenarios such as agent tasks.
3. **More Comprehensive Open-source Series:** In addition to the dialogue model ChatGLM3-6B, the base model ChatGLM3-6B-Base and the long-text dialogue model ChatGLM3-6B-32K are also open-sourced. All weights are **fully open** for academic research, and free commercial use is **also permitted** after completing the [questionnaire](https://open.bigmodel.cn/mla/form) registration.
## 软件依赖 (Dependencies)
```shell
pip install protobuf transformers==4.30.2 cpm_kernels "torch>=2.0" gradio mdtex2html sentencepiece accelerate
```
## 代码调用 (Code Usage)
可以通过如下代码调用 ChatGLM3-6B 模型来生成对话:
You can generate dialogue by invoking the ChatGLM3-6B model with the following code:
```ipython
>>> from transformers import AutoTokenizer, AutoModel
>>> tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm3-6b", trust_remote_code=True)
>>> model = AutoModel.from_pretrained("THUDM/chatglm3-6b", trust_remote_code=True).half().cuda()
>>> model = model.eval()
>>> response, history = model.chat(tokenizer, "你好", history=[])
>>> print(response)
你好👋!我是人工智能助手 ChatGLM-6B,很高兴见到你,欢迎问我任何问题。
>>> response, history = model.chat(tokenizer, "晚上睡不着应该怎么办", history=history)
>>> print(response)
晚上睡不着可能会让你感到焦虑或不舒服,但以下是一些可以帮助你入睡的方法:
1. 制定规律的睡眠时间表:保持规律的睡眠时间表可以帮助你建立健康的睡眠习惯,使你更容易入睡。尽量在每天的相同时间上床,并在同一时间起床。
2. 创造一个舒适的睡眠环境:确保睡眠环境舒适,安静,黑暗且温度适宜。可以使用舒适的床上用品,并保持房间通风。
3. 放松身心:在睡前做些放松的活动,例如泡个热水澡,听些轻柔的音乐,阅读一些有趣的书籍等,有助于缓解紧张和焦虑,使你更容易入睡。
4. 避免饮用含有咖啡因的饮料:咖啡因是一种刺激性物质,会影响你的睡眠质量。尽量避免在睡前饮用含有咖啡因的饮料,例如咖啡,茶和可乐。
5. 避免在床上做与睡眠无关的事情:在床上做些与睡眠无关的事情,例如看电影,玩游戏或工作等,可能会干扰你的睡眠。
6. 尝试呼吸技巧:深呼吸是一种放松技巧,可以帮助你缓解紧张和焦虑,使你更容易入睡。试着慢慢吸气,保持几秒钟,然后缓慢呼气。
如果这些方法无法帮助你入睡,你可以考虑咨询医生或睡眠专家,寻求进一步的建议。
```
关于更多的使用说明,包括如何运行命令行和网页版本的 DEMO,以及使用模型量化以节省显存,请参考我们的 [Github Repo](https://github.com/THUDM/ChatGLM)。
For more instructions, including how to run CLI and web demos, and model quantization, please refer to our [Github Repo](https://github.com/THUDM/ChatGLM).
## 协议 (License)
本仓库的代码依照 [Apache-2.0](LICENSE) 协议开源,ChatGLM3-6B 模型的权重的使用则需要遵循 [Model License](MODEL_LICENSE)。
The code in this repository is open-sourced under the [Apache-2.0 license](LICENSE), while the use of the ChatGLM3-6B model weights needs to comply with the [Model License](MODEL_LICENSE).
## 引用 (Citation)
如果你觉得我们的工作有帮助的话,请考虑引用下列论文。
If you find our work helpful, please consider citing the following papers.
```
@article{zeng2022glm,
title={Glm-130b: An open bilingual pre-trained model},
author={Zeng, Aohan and Liu, Xiao and Du, Zhengxiao and Wang, Zihan and Lai, Hanyu and Ding, Ming and Yang, Zhuoyi and Xu, Yifan and Zheng, Wendi and Xia, Xiao and others},
journal={arXiv preprint arXiv:2210.02414},
year={2022}
}
```
```
@inproceedings{du2022glm,
title={GLM: General Language Model Pretraining with Autoregressive Blank Infilling},
author={Du, Zhengxiao and Qian, Yujie and Liu, Xiao and Ding, Ming and Qiu, Jiezhong and Yang, Zhilin and Tang, Jie},
booktitle={Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)},
pages={320--335},
year={2022}
}
``` | 5,669 | [
[
-0.0360107421875,
-0.061981201171875,
0.01251220703125,
0.02410888671875,
-0.0196685791015625,
-0.0023784637451171875,
-0.01953125,
-0.035797119140625,
-0.0022830963134765625,
0.019683837890625,
-0.0390625,
-0.049407958984375,
-0.0360107421875,
-0.00860595703125,
-0.01021575927734375,
0.06671142578125,
0.0106658935546875,
0.0093231201171875,
0.005313873291015625,
0.0016021728515625,
-0.03851318359375,
-0.04205322265625,
-0.05072021484375,
-0.0203704833984375,
0.007076263427734375,
0.00897216796875,
0.05242919921875,
0.0211181640625,
0.0295867919921875,
0.0272979736328125,
-0.0178985595703125,
0.01009368896484375,
-0.042633056640625,
-0.0177764892578125,
0.0294647216796875,
-0.041961669921875,
-0.057098388671875,
0.00748443603515625,
0.04156494140625,
0.018218994140625,
-0.012054443359375,
0.0163726806640625,
0.0235443115234375,
0.041107177734375,
-0.031280517578125,
0.045440673828125,
-0.0372314453125,
-0.0036983489990234375,
-0.004730224609375,
-0.009796142578125,
-0.028900146484375,
-0.034454345703125,
0.004161834716796875,
-0.042327880859375,
-0.0015106201171875,
0.00937652587890625,
0.09619140625,
-0.01514434814453125,
-0.01374053955078125,
-0.0134124755859375,
-0.045013427734375,
0.066650390625,
-0.09320068359375,
0.00677490234375,
0.017303466796875,
0.03424072265625,
-0.0208740234375,
-0.05560302734375,
-0.03607177734375,
-0.017486572265625,
-0.032806396484375,
0.0250701904296875,
-0.0163116455078125,
0.0010709762573242188,
0.0161285400390625,
0.0341796875,
-0.04315185546875,
-0.0071868896484375,
-0.044677734375,
-0.019012451171875,
0.039886474609375,
0.0194244384765625,
0.04833984375,
-0.0105133056640625,
-0.030792236328125,
-0.00522613525390625,
-0.04168701171875,
0.0161285400390625,
0.023773193359375,
0.0290679931640625,
-0.04766845703125,
0.0257110595703125,
-0.002849578857421875,
0.04400634765625,
-0.0002205371856689453,
-0.0221099853515625,
0.037261962890625,
-0.03466796875,
-0.022308349609375,
-0.0168304443359375,
0.09698486328125,
0.0321044921875,
0.003231048583984375,
0.01259613037109375,
-0.003917694091796875,
-0.0172576904296875,
-0.007190704345703125,
-0.06951904296875,
-0.00750732421875,
0.02569580078125,
-0.0460205078125,
-0.02374267578125,
-0.004253387451171875,
-0.0413818359375,
0.00905609130859375,
0.000095367431640625,
0.050140380859375,
-0.045440673828125,
-0.034820556640625,
0.0176849365234375,
-0.00594329833984375,
0.026275634765625,
0.03057861328125,
-0.07171630859375,
0.03155517578125,
0.036285400390625,
0.0699462890625,
-0.0079498291015625,
-0.0217437744140625,
-0.0153961181640625,
0.00640106201171875,
-0.01258087158203125,
0.02532958984375,
-0.00743865966796875,
-0.041229248046875,
-0.0072479248046875,
0.002780914306640625,
-0.0210418701171875,
-0.0295867919921875,
0.022064208984375,
-0.0213775634765625,
0.05572509765625,
0.0014858245849609375,
-0.036346435546875,
-0.027862548828125,
0.024017333984375,
-0.0304718017578125,
0.07049560546875,
-0.003917694091796875,
-0.06536865234375,
-0.0108795166015625,
-0.05010986328125,
-0.0025177001953125,
0.001178741455078125,
-0.0011119842529296875,
-0.0271148681640625,
-0.020111083984375,
0.0280609130859375,
0.0227203369140625,
-0.0308380126953125,
0.009735107421875,
-0.0195770263671875,
-0.030242919921875,
0.021087646484375,
-0.024810791015625,
0.095458984375,
0.0251617431640625,
-0.0245819091796875,
0.0213623046875,
-0.0304718017578125,
0.0220947265625,
0.020111083984375,
-0.01145172119140625,
0.00335693359375,
-0.0018587112426757812,
-0.00922393798828125,
0.02227783203125,
0.033935546875,
-0.01934814453125,
0.0091400146484375,
-0.0516357421875,
0.032257080078125,
0.050750732421875,
-0.0017786026000976562,
0.031951904296875,
-0.0418701171875,
0.0308837890625,
0.01503753662109375,
0.04144287109375,
-0.0179443359375,
-0.05181884765625,
-0.0775146484375,
-0.01849365234375,
0.01288604736328125,
0.057647705078125,
-0.04833984375,
0.0576171875,
-0.01055908203125,
-0.052459716796875,
-0.033782958984375,
0.008331298828125,
0.034515380859375,
0.028167724609375,
0.03155517578125,
-0.0153961181640625,
-0.038818359375,
-0.0494384765625,
-0.0043487548828125,
-0.038360595703125,
-0.006923675537109375,
0.0318603515625,
0.038116455078125,
-0.0194549560546875,
0.0736083984375,
-0.047882080078125,
-0.02838134765625,
-0.0299835205078125,
0.006622314453125,
0.0310211181640625,
0.04632568359375,
0.049224853515625,
-0.042633056640625,
-0.0631103515625,
-0.00388336181640625,
-0.06494140625,
0.003993988037109375,
0.01242828369140625,
-0.026092529296875,
0.0251312255859375,
0.02392578125,
-0.053955078125,
0.0284576416015625,
0.046783447265625,
-0.0191650390625,
0.043548583984375,
-0.00945281982421875,
0.00110626220703125,
-0.09686279296875,
0.0030059814453125,
-0.0178375244140625,
0.0018415451049804688,
-0.051788330078125,
-0.0088348388671875,
-0.0087738037109375,
0.0018491744995117188,
-0.04833984375,
0.07403564453125,
-0.04522705078125,
0.02142333984375,
-0.0045928955078125,
0.021026611328125,
-0.0126495361328125,
0.0596923828125,
-0.01345062255859375,
0.044219970703125,
0.060943603515625,
-0.04150390625,
0.0269775390625,
0.0266876220703125,
-0.01427459716796875,
0.002101898193359375,
-0.06365966796875,
0.00878143310546875,
0.005344390869140625,
0.024383544921875,
-0.08880615234375,
-0.00243377685546875,
0.054931640625,
-0.06402587890625,
0.01263427734375,
-0.0196075439453125,
-0.0309295654296875,
-0.040069580078125,
-0.0341796875,
0.00952911376953125,
0.063232421875,
-0.0187225341796875,
0.051788330078125,
0.027679443359375,
-0.005901336669921875,
-0.03790283203125,
-0.03631591796875,
-0.01160430908203125,
-0.021759033203125,
-0.06671142578125,
0.01335906982421875,
-0.02252197265625,
-0.0048370361328125,
-0.00836181640625,
0.0126953125,
0.0036640167236328125,
0.00010251998901367188,
0.020477294921875,
0.040496826171875,
-0.0113983154296875,
-0.00850677490234375,
-0.0264434814453125,
-0.0099639892578125,
0.006961822509765625,
-0.006500244140625,
0.050048828125,
-0.037933349609375,
-0.0297393798828125,
-0.036102294921875,
0.008392333984375,
0.03155517578125,
-0.01065826416015625,
0.0572509765625,
0.07464599609375,
-0.0115966796875,
0.01497650146484375,
-0.04705810546875,
-0.015960693359375,
-0.042877197265625,
0.0179290771484375,
-0.0029392242431640625,
-0.07025146484375,
0.06103515625,
0.022796630859375,
0.0214080810546875,
0.044891357421875,
0.052215576171875,
0.01033782958984375,
0.091064453125,
0.0275421142578125,
-0.01450347900390625,
0.045013427734375,
-0.031494140625,
0.016448974609375,
-0.05712890625,
-0.02032470703125,
-0.038818359375,
-0.01056671142578125,
-0.04632568359375,
-0.035003662109375,
0.02777099609375,
0.0056610107421875,
-0.023406982421875,
0.01294708251953125,
-0.03192138671875,
0.003662109375,
0.037933349609375,
-0.0003407001495361328,
0.01338958740234375,
-0.01351165771484375,
-0.01473236083984375,
0.011962890625,
-0.0531005859375,
-0.0433349609375,
0.06256103515625,
0.036895751953125,
0.060577392578125,
0.020294189453125,
0.046295166015625,
-0.0071563720703125,
0.031402587890625,
-0.0489501953125,
0.052978515625,
0.018035888671875,
-0.056396484375,
-0.02874755859375,
-0.040069580078125,
-0.07489013671875,
0.038238525390625,
-0.010345458984375,
-0.07763671875,
-0.01071929931640625,
0.01062774658203125,
-0.02105712890625,
0.02557373046875,
-0.06561279296875,
0.06158447265625,
-0.02392578125,
-0.017333984375,
0.0040435791015625,
-0.05377197265625,
0.040496826171875,
0.0239410400390625,
0.0294036865234375,
-0.031463623046875,
0.009918212890625,
0.05572509765625,
-0.045928955078125,
0.06756591796875,
-0.02239990234375,
-0.00820159912109375,
0.046173095703125,
-0.0026111602783203125,
0.042510986328125,
0.0117034912109375,
0.01568603515625,
0.02337646484375,
0.00412750244140625,
-0.025543212890625,
-0.04156494140625,
0.052215576171875,
-0.06719970703125,
-0.06524658203125,
-0.036712646484375,
-0.032745361328125,
-0.0141143798828125,
0.0269012451171875,
0.02459716796875,
0.0237274169921875,
-0.00606536865234375,
0.020172119140625,
0.02398681640625,
-0.03240966796875,
0.05029296875,
0.045989990234375,
-0.047882080078125,
-0.03741455078125,
0.052459716796875,
-0.0003871917724609375,
0.034149169921875,
0.0115203857421875,
0.01125335693359375,
-0.01922607421875,
-0.037200927734375,
-0.027435302734375,
0.024871826171875,
-0.03717041015625,
-0.01055908203125,
-0.05828857421875,
-0.047332763671875,
-0.0390625,
0.007781982421875,
-0.0235137939453125,
-0.00995635986328125,
-0.032501220703125,
0.0002257823944091797,
0.0384521484375,
0.01654052734375,
-0.0024204254150390625,
0.0212554931640625,
-0.07745361328125,
0.021514892578125,
0.02239990234375,
0.0252685546875,
0.0200653076171875,
-0.047607421875,
-0.03594970703125,
0.036346435546875,
-0.0232696533203125,
-0.03857421875,
0.046875,
0.0034732818603515625,
0.0531005859375,
0.0199127197265625,
-0.00556182861328125,
0.050750732421875,
-0.028076171875,
0.07611083984375,
0.0288543701171875,
-0.07012939453125,
0.03350830078125,
-0.041656494140625,
0.0345458984375,
0.0142822265625,
0.0250701904296875,
-0.04058837890625,
-0.039642333984375,
-0.055572509765625,
-0.06182861328125,
0.07916259765625,
0.046295166015625,
0.033905029296875,
0.0013446807861328125,
-0.00652313232421875,
-0.01861572265625,
0.010467529296875,
-0.061248779296875,
-0.04766845703125,
-0.0169219970703125,
-0.00848388671875,
0.00138092041015625,
-0.0236358642578125,
-0.01181793212890625,
-0.0292510986328125,
0.0633544921875,
0.003116607666015625,
0.047607421875,
-0.0036830902099609375,
0.0013723373413085938,
0.0084228515625,
0.01593017578125,
0.043243408203125,
0.045806884765625,
-0.03192138671875,
-0.01457977294921875,
0.023834228515625,
-0.0439453125,
0.00305938720703125,
0.0030765533447265625,
-0.0151214599609375,
0.006679534912109375,
0.01593017578125,
0.0826416015625,
0.0135498046875,
-0.035003662109375,
0.045806884765625,
-0.0244140625,
-0.0220947265625,
-0.0242462158203125,
0.026702880859375,
0.023345947265625,
0.0147552490234375,
0.042724609375,
-0.0171356201171875,
-0.0025806427001953125,
-0.04949951171875,
-0.00495147705078125,
0.0372314453125,
-0.0261383056640625,
-0.01953125,
0.0477294921875,
0.0172576904296875,
-0.013671875,
0.033477783203125,
-0.01259613037109375,
-0.040740966796875,
0.04046630859375,
0.04254150390625,
0.0660400390625,
-0.0143585205078125,
0.01177215576171875,
0.053314208984375,
0.0108489990234375,
-0.0236358642578125,
0.020721435546875,
0.015777587890625,
-0.058013916015625,
-0.0206451416015625,
-0.037750244140625,
-0.02099609375,
0.0163726806640625,
-0.0379638671875,
0.030731201171875,
-0.0284576416015625,
-0.02581787109375,
-0.0179901123046875,
0.0007643699645996094,
-0.027679443359375,
0.0095062255859375,
0.01056671142578125,
0.05133056640625,
-0.032501220703125,
0.052978515625,
0.0280609130859375,
-0.0316162109375,
-0.069580078125,
-0.0166473388671875,
0.0126800537109375,
-0.058807373046875,
0.03057861328125,
0.00878143310546875,
-0.0108795166015625,
0.0056610107421875,
-0.047454833984375,
-0.08148193359375,
0.09368896484375,
0.020477294921875,
-0.02105712890625,
-0.0152587890625,
-0.005664825439453125,
0.047821044921875,
-0.0234832763671875,
0.049713134765625,
0.01479339599609375,
0.025604248046875,
0.0238800048828125,
-0.089599609375,
0.01377105712890625,
-0.04486083984375,
0.007434844970703125,
-0.003261566162109375,
-0.07574462890625,
0.0828857421875,
-0.00490570068359375,
-0.0280914306640625,
0.002536773681640625,
0.050262451171875,
0.0196075439453125,
-0.00623321533203125,
0.027099609375,
0.018524169921875,
0.035675048828125,
-0.029541015625,
0.0628662109375,
-0.03851318359375,
0.05169677734375,
0.0731201171875,
0.0029582977294921875,
0.04718017578125,
0.009979248046875,
-0.026580810546875,
0.04107666015625,
0.039825439453125,
-0.015533447265625,
0.031524658203125,
0.00213623046875,
-0.0198974609375,
-0.0071563720703125,
0.01136016845703125,
-0.053192138671875,
0.0196990966796875,
0.033172607421875,
-0.01145172119140625,
-0.0164031982421875,
-0.00595855712890625,
0.02264404296875,
-0.03533935546875,
-0.0134429931640625,
0.06805419921875,
0.018402099609375,
-0.048095703125,
0.08367919921875,
-0.0001316070556640625,
0.06402587890625,
-0.06982421875,
0.005298614501953125,
-0.01593017578125,
0.004825592041015625,
-0.015899658203125,
-0.044464111328125,
0.004093170166015625,
-0.00893402099609375,
0.01180267333984375,
-0.00585174560546875,
0.060699462890625,
-0.031829833984375,
-0.029510498046875,
0.030242919921875,
0.0362548828125,
0.01137542724609375,
0.01364898681640625,
-0.06646728515625,
0.0117645263671875,
0.0150909423828125,
-0.034088134765625,
0.0267486572265625,
0.026275634765625,
0.0113983154296875,
0.06182861328125,
0.050750732421875,
-0.00273895263671875,
0.0006422996520996094,
0.0018491744995117188,
0.06640625,
-0.050384521484375,
-0.033843994140625,
-0.07366943359375,
0.05596923828125,
-0.01090240478515625,
-0.01190185546875,
0.078369140625,
0.040740966796875,
0.055877685546875,
-0.0059967041015625,
0.06396484375,
-0.019256591796875,
0.0369873046875,
-0.03680419921875,
0.0555419921875,
-0.03741455078125,
0.019256591796875,
-0.0301513671875,
-0.048187255859375,
-0.0206146240234375,
0.043853759765625,
-0.0255126953125,
0.0274810791015625,
0.049224853515625,
0.069580078125,
0.01503753662109375,
-0.00848388671875,
0.01605224609375,
0.02752685546875,
0.031402587890625,
0.058013916015625,
0.056243896484375,
-0.045928955078125,
0.061248779296875,
-0.01435089111328125,
-0.001789093017578125,
-0.034576416015625,
-0.050750732421875,
-0.09228515625,
-0.03839111328125,
-0.01412200927734375,
-0.03680419921875,
-0.0116729736328125,
0.0697021484375,
0.04638671875,
-0.052093505859375,
-0.0400390625,
0.01302337646484375,
0.01317596435546875,
-0.0205841064453125,
-0.0193634033203125,
0.027618408203125,
-0.03271484375,
-0.06451416015625,
0.00025963783264160156,
0.01453399658203125,
0.030242919921875,
-0.01517486572265625,
-0.018768310546875,
-0.0244140625,
0.0092620849609375,
0.03472900390625,
0.0193634033203125,
-0.05975341796875,
-0.01104736328125,
0.005153656005859375,
-0.032012939453125,
0.0196685791015625,
0.01473236083984375,
-0.034149169921875,
0.0304718017578125,
0.042724609375,
-0.0005440711975097656,
0.05548095703125,
-0.005306243896484375,
0.04364013671875,
-0.034088134765625,
0.025238037109375,
0.001026153564453125,
0.0257110595703125,
0.00969696044921875,
-0.01348114013671875,
0.044219970703125,
0.00893402099609375,
-0.0285186767578125,
-0.0596923828125,
-0.01111602783203125,
-0.0772705078125,
-0.0084686279296875,
0.10113525390625,
-0.0221405029296875,
-0.019195556640625,
0.004413604736328125,
-0.0447998046875,
0.02593994140625,
-0.03289794921875,
0.059906005859375,
0.06561279296875,
-0.004058837890625,
-0.0126495361328125,
-0.0478515625,
0.046600341796875,
0.0171051025390625,
-0.06658935546875,
-0.00782012939453125,
0.0300750732421875,
0.0239105224609375,
0.01050567626953125,
0.06964111328125,
-0.01861572265625,
0.017181396484375,
-0.01922607421875,
0.0170135498046875,
-0.01549530029296875,
0.01300048828125,
-0.00614166259765625,
-0.007190704345703125,
-0.00260162353515625,
-0.0159759521484375
]
] |
timm/fbnetv3_b.ra2_in1k | 2023-04-27T22:48:34.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"arxiv:2110.00476",
"arxiv:2006.02049",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/fbnetv3_b.ra2_in1k | 0 | 32,633 | timm | 2022-12-16T05:36:34 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
datasets:
- imagenet-1k
---
# Model card for fbnetv3_b.ra2_in1k
An FBNet-v3 image classification model. Trained on ImageNet-1k in `timm` using the recipe template described below.
Recipe details:
* RandAugment `RA2` recipe. Inspired by and evolved from EfficientNet RandAugment recipes. Published as `B` recipe in [ResNet Strikes Back](https://arxiv.org/abs/2110.00476).
* RMSProp (TF 1.0 behaviour) optimizer, EMA weight averaging
* Step (exponential decay w/ staircase) LR schedule with warmup
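The warmup plus staircase-decay schedule above can be sketched in a few lines. This is a dependency-free illustration only; the `base_lr`, `warmup_epochs`, `decay_epochs`, and `decay_rate` values are placeholders, not the published recipe hyperparameters:

```python
def step_lr(epoch, base_lr=0.064, warmup_lr=1e-6,
            warmup_epochs=5, decay_epochs=30, decay_rate=0.97):
    """Linear warmup followed by staircase exponential decay (illustrative values)."""
    if epoch < warmup_epochs:
        # ramp linearly from warmup_lr up to base_lr
        return warmup_lr + (base_lr - warmup_lr) * epoch / warmup_epochs
    # decay in discrete steps every `decay_epochs` epochs
    return base_lr * decay_rate ** ((epoch - warmup_epochs) // decay_epochs)
```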
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 8.6
- GMACs: 0.4
- Activations (M): 7.0
- Image size: train = 224 x 224, test = 256 x 256
- **Papers:**
- FBNetV3: Joint Architecture-Recipe Search using Predictor Pretraining: https://arxiv.org/abs/2006.02049
- ResNet strikes back: An improved training procedure in timm: https://arxiv.org/abs/2110.00476
- **Dataset:** ImageNet-1k
- **Original:** https://github.com/huggingface/pytorch-image-models
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('fbnetv3_b.ra2_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'fbnetv3_b.ra2_in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g.:
# torch.Size([1, 16, 112, 112])
# torch.Size([1, 24, 56, 56])
# torch.Size([1, 40, 28, 28])
# torch.Size([1, 120, 14, 14])
# torch.Size([1, 1344, 7, 7])
print(o.shape)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'fbnetv3_b.ra2_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 1344, 7, 7) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
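Embeddings obtained this way are typically compared with cosine similarity. A dependency-free sketch of that comparison (in practice the output tensor would be converted to a list first, e.g. with a hypothetical `output[0].tolist()`):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors given as plain lists."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)
```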
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@inproceedings{dai2021fbnetv3,
title={Fbnetv3: Joint architecture-recipe search using predictor pretraining},
author={Dai, Xiaoliang and Wan, Alvin and Zhang, Peizhao and Wu, Bichen and He, Zijian and Wei, Zhen and Chen, Kan and Tian, Yuandong and Yu, Matthew and Vajda, Peter and others},
booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
pages={16276--16285},
year={2021}
}
```
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
```bibtex
@inproceedings{wightman2021resnet,
title={ResNet strikes back: An improved training procedure in timm},
author={Wightman, Ross and Touvron, Hugo and Jegou, Herve},
booktitle={NeurIPS 2021 Workshop on ImageNet: Past, Present, and Future}
}
```
| 4,856 | [
[
-0.032440185546875,
-0.0322265625,
0.0017080307006835938,
0.00788116455078125,
-0.02203369140625,
-0.0266571044921875,
-0.01421356201171875,
-0.031646728515625,
0.0194854736328125,
0.0367431640625,
-0.035797119140625,
-0.048004150390625,
-0.054046630859375,
-0.00897216796875,
-0.00493621826171875,
0.06536865234375,
-0.0191802978515625,
0.0008459091186523438,
-0.009674072265625,
-0.036376953125,
-0.0226898193359375,
-0.0227508544921875,
-0.06805419921875,
-0.040496826171875,
0.035369873046875,
0.01422882080078125,
0.039642333984375,
0.045318603515625,
0.047332763671875,
0.0343017578125,
-0.0025882720947265625,
0.0072479248046875,
-0.014251708984375,
-0.01522064208984375,
0.0266571044921875,
-0.0504150390625,
-0.033843994140625,
0.02215576171875,
0.046875,
0.01763916015625,
0.00177001953125,
0.0227813720703125,
0.00399017333984375,
0.043060302734375,
-0.022064208984375,
0.01557159423828125,
-0.031524658203125,
0.0194549560546875,
-0.015380859375,
-0.0014390945434570312,
-0.0246429443359375,
-0.02972412109375,
0.0133056640625,
-0.038848876953125,
0.036285400390625,
0.0033855438232421875,
0.10040283203125,
0.016357421875,
0.00441741943359375,
0.00263214111328125,
-0.018707275390625,
0.062164306640625,
-0.056610107421875,
0.020599365234375,
0.0246124267578125,
0.0178375244140625,
-0.01004791259765625,
-0.0810546875,
-0.042877197265625,
-0.01123046875,
-0.01122283935546875,
-0.002567291259765625,
-0.0200347900390625,
-0.0019073486328125,
0.021209716796875,
0.024200439453125,
-0.031280517578125,
0.0131378173828125,
-0.037353515625,
-0.018402099609375,
0.03765869140625,
-0.0006480216979980469,
0.0235443115234375,
-0.0132904052734375,
-0.03485107421875,
-0.039154052734375,
-0.0294647216796875,
0.018829345703125,
0.017608642578125,
0.0194244384765625,
-0.040863037109375,
0.022552490234375,
0.0120697021484375,
0.050567626953125,
0.0025691986083984375,
-0.0201568603515625,
0.052337646484375,
0.0011892318725585938,
-0.027618408203125,
-0.009613037109375,
0.0799560546875,
0.03094482421875,
0.0160064697265625,
0.0117950439453125,
-0.01401519775390625,
-0.029693603515625,
-0.0029125213623046875,
-0.08990478515625,
-0.032745361328125,
0.0213623046875,
-0.046356201171875,
-0.032958984375,
0.0216827392578125,
-0.0413818359375,
-0.005123138427734375,
-0.0015544891357421875,
0.044219970703125,
-0.04180908203125,
-0.03253173828125,
0.00238037109375,
-0.01511383056640625,
0.0294342041015625,
0.01849365234375,
-0.043914794921875,
0.00868988037109375,
0.0242767333984375,
0.0894775390625,
0.01218414306640625,
-0.037445068359375,
-0.0240478515625,
-0.025054931640625,
-0.02215576171875,
0.031341552734375,
-0.00920867919921875,
-0.0016508102416992188,
-0.0238494873046875,
0.0259552001953125,
-0.0121002197265625,
-0.052947998046875,
0.00720977783203125,
-0.0232696533203125,
0.0215301513671875,
-0.006866455078125,
-0.012359619140625,
-0.04254150390625,
0.0211181640625,
-0.041229248046875,
0.095703125,
0.0281524658203125,
-0.06591796875,
0.029998779296875,
-0.0413818359375,
-0.01371002197265625,
-0.0225830078125,
-0.00826263427734375,
-0.08087158203125,
-0.00695037841796875,
0.0233612060546875,
0.0611572265625,
-0.026214599609375,
-0.00154876708984375,
-0.04248046875,
-0.0253143310546875,
0.0211639404296875,
0.004852294921875,
0.07110595703125,
0.0104522705078125,
-0.039703369140625,
0.0189971923828125,
-0.039306640625,
0.01300811767578125,
0.04608154296875,
-0.0139007568359375,
-0.0036907196044921875,
-0.044281005859375,
0.00896453857421875,
0.0273284912109375,
0.005352020263671875,
-0.039794921875,
0.01158905029296875,
-0.023345947265625,
0.033294677734375,
0.051422119140625,
-0.0095367431640625,
0.016815185546875,
-0.037994384765625,
0.024139404296875,
0.028076171875,
0.0173187255859375,
-0.0025386810302734375,
-0.042266845703125,
-0.056182861328125,
-0.033111572265625,
0.0290069580078125,
0.030029296875,
-0.046173095703125,
0.035888671875,
-0.00894927978515625,
-0.05792236328125,
-0.034454345703125,
0.006195068359375,
0.045684814453125,
0.036651611328125,
0.029998779296875,
-0.04266357421875,
-0.0413818359375,
-0.06280517578125,
0.0018281936645507812,
0.0059814453125,
0.0028018951416015625,
0.0240936279296875,
0.048187255859375,
-0.00949859619140625,
0.047576904296875,
-0.03466796875,
-0.0246429443359375,
-0.023681640625,
0.00936126708984375,
0.0318603515625,
0.062744140625,
0.07244873046875,
-0.048919677734375,
-0.032379150390625,
-0.005718231201171875,
-0.06439208984375,
0.017669677734375,
-0.0003707408905029297,
-0.016204833984375,
0.025421142578125,
0.01421356201171875,
-0.03948974609375,
0.047454833984375,
0.01209259033203125,
-0.011444091796875,
0.0281219482421875,
-0.0102081298828125,
0.01708984375,
-0.091796875,
0.0081329345703125,
0.0251922607421875,
-0.0020275115966796875,
-0.0281829833984375,
0.0033206939697265625,
-0.0003914833068847656,
-0.0024623870849609375,
-0.040496826171875,
0.0557861328125,
-0.041412353515625,
-0.020904541015625,
-0.0094757080078125,
-0.0207061767578125,
0.0039043426513671875,
0.0516357421875,
-0.00007283687591552734,
0.029632568359375,
0.058807373046875,
-0.032196044921875,
0.031280517578125,
0.030914306640625,
-0.0160064697265625,
0.020660400390625,
-0.047393798828125,
0.0205230712890625,
0.00030612945556640625,
0.0231170654296875,
-0.06988525390625,
-0.0227508544921875,
0.032196044921875,
-0.048919677734375,
0.051300048828125,
-0.035125732421875,
-0.03363037109375,
-0.048095703125,
-0.034942626953125,
0.0300140380859375,
0.047882080078125,
-0.058502197265625,
0.0390625,
0.0098724365234375,
0.022186279296875,
-0.049285888671875,
-0.06793212890625,
-0.0241851806640625,
-0.023345947265625,
-0.0499267578125,
0.0296478271484375,
0.01238250732421875,
0.01255035400390625,
0.009613037109375,
-0.01202392578125,
-0.01128387451171875,
-0.007282257080078125,
0.042144775390625,
0.02337646484375,
-0.02862548828125,
-0.0182342529296875,
-0.0294342041015625,
-0.007099151611328125,
0.002414703369140625,
-0.0261993408203125,
0.049713134765625,
-0.0214080810546875,
-0.00749969482421875,
-0.06439208984375,
-0.0113372802734375,
0.029754638671875,
-0.01424407958984375,
0.0638427734375,
0.09100341796875,
-0.042144775390625,
-0.001171112060546875,
-0.040252685546875,
-0.028717041015625,
-0.037567138671875,
0.03387451171875,
-0.028045654296875,
-0.034881591796875,
0.06182861328125,
0.0045166015625,
0.005046844482421875,
0.045928955078125,
0.0260162353515625,
-0.004055023193359375,
0.0477294921875,
0.041015625,
0.01471710205078125,
0.059051513671875,
-0.0733642578125,
-0.015411376953125,
-0.061004638671875,
-0.039215087890625,
-0.0251007080078125,
-0.046844482421875,
-0.041473388671875,
-0.03033447265625,
0.026458740234375,
0.00933837890625,
-0.0279998779296875,
0.03302001953125,
-0.059326171875,
0.0113677978515625,
0.058441162109375,
0.0491943359375,
-0.0379638671875,
0.025787353515625,
-0.025390625,
-0.006488800048828125,
-0.060272216796875,
-0.0100250244140625,
0.08428955078125,
0.03131103515625,
0.036529541015625,
-0.01416778564453125,
0.0662841796875,
-0.0228424072265625,
0.031036376953125,
-0.042022705078125,
0.0438232421875,
-0.0172271728515625,
-0.0343017578125,
-0.004344940185546875,
-0.03369140625,
-0.0789794921875,
0.004627227783203125,
-0.022064208984375,
-0.0509033203125,
0.008056640625,
0.0195465087890625,
-0.022216796875,
0.060394287109375,
-0.057830810546875,
0.0751953125,
-0.0156707763671875,
-0.036163330078125,
0.0065765380859375,
-0.05615234375,
0.023223876953125,
0.01096343994140625,
-0.0186767578125,
-0.0036773681640625,
0.023040771484375,
0.08306884765625,
-0.047943115234375,
0.062286376953125,
-0.04290771484375,
0.03546142578125,
0.034210205078125,
-0.01354217529296875,
0.019927978515625,
-0.01320648193359375,
-0.01323699951171875,
0.030517578125,
-0.001506805419921875,
-0.03497314453125,
-0.043212890625,
0.042633056640625,
-0.07928466796875,
-0.021728515625,
-0.032928466796875,
-0.026702880859375,
0.01335906982421875,
0.0181884765625,
0.041290283203125,
0.05340576171875,
0.02203369140625,
0.023468017578125,
0.038421630859375,
-0.03717041015625,
0.02923583984375,
-0.003910064697265625,
-0.01837158203125,
-0.047332763671875,
0.06317138671875,
0.01629638671875,
0.004802703857421875,
0.017425537109375,
0.0156402587890625,
-0.01959228515625,
-0.049560546875,
-0.01328277587890625,
0.0312347412109375,
-0.04888916015625,
-0.041290283203125,
-0.042816162109375,
-0.033935546875,
-0.02880859375,
-0.00994873046875,
-0.033355712890625,
-0.015869140625,
-0.029052734375,
0.0139007568359375,
0.057586669921875,
0.044952392578125,
-0.01474761962890625,
0.04901123046875,
-0.040374755859375,
0.00980377197265625,
0.0035305023193359375,
0.036865234375,
-0.005352020263671875,
-0.056976318359375,
-0.0179901123046875,
-0.005374908447265625,
-0.03204345703125,
-0.049835205078125,
0.036224365234375,
0.0098419189453125,
0.03741455078125,
0.019012451171875,
-0.0181884765625,
0.057861328125,
-0.00261688232421875,
0.03814697265625,
0.0350341796875,
-0.037139892578125,
0.05230712890625,
-0.003032684326171875,
0.0103607177734375,
0.01070404052734375,
0.0281219482421875,
-0.006778717041015625,
0.0007009506225585938,
-0.07470703125,
-0.06005859375,
0.07806396484375,
0.00798797607421875,
-0.01318359375,
0.0216522216796875,
0.06341552734375,
0.0027713775634765625,
0.001979827880859375,
-0.0557861328125,
-0.0273895263671875,
-0.0292205810546875,
-0.0186767578125,
-0.0010662078857421875,
-0.004878997802734375,
-0.003139495849609375,
-0.0489501953125,
0.05615234375,
0.0006427764892578125,
0.060028076171875,
0.0217437744140625,
0.00514984130859375,
-0.007694244384765625,
-0.03839111328125,
0.029815673828125,
0.016815185546875,
-0.035980224609375,
-0.002124786376953125,
0.0101318359375,
-0.046112060546875,
0.004940032958984375,
0.0167236328125,
0.004039764404296875,
0.00007027387619018555,
0.03729248046875,
0.07611083984375,
0.005859375,
0.01068115234375,
0.033782958984375,
-0.0093841552734375,
-0.03741455078125,
-0.0243072509765625,
0.015838623046875,
-0.01056671142578125,
0.034088134765625,
0.017852783203125,
0.032379150390625,
-0.021209716796875,
-0.01496124267578125,
0.0290069580078125,
0.033935546875,
-0.01611328125,
-0.0279083251953125,
0.05535888671875,
-0.016387939453125,
-0.01282501220703125,
0.06903076171875,
-0.003894805908203125,
-0.034942626953125,
0.07659912109375,
0.03515625,
0.072265625,
-0.01068878173828125,
0.0012731552124023438,
0.062469482421875,
0.0204010009765625,
-0.003448486328125,
0.020233154296875,
0.015411376953125,
-0.061279296875,
0.0046234130859375,
-0.046905517578125,
0.0038661956787109375,
0.033355712890625,
-0.042724609375,
0.027801513671875,
-0.062347412109375,
-0.034271240234375,
0.01534271240234375,
0.0292510986328125,
-0.07940673828125,
0.0216064453125,
-0.00612640380859375,
0.061279296875,
-0.054595947265625,
0.06005859375,
0.068603515625,
-0.048126220703125,
-0.0845947265625,
-0.0008797645568847656,
0.00412750244140625,
-0.07733154296875,
0.0504150390625,
0.0450439453125,
0.0064544677734375,
0.007381439208984375,
-0.05523681640625,
-0.047332763671875,
0.1103515625,
0.03997802734375,
-0.0058441162109375,
0.01511383056640625,
-0.0099639892578125,
0.0134429931640625,
-0.0304718017578125,
0.03717041015625,
0.01532745361328125,
0.0293426513671875,
0.03173828125,
-0.049560546875,
0.0195159912109375,
-0.023162841796875,
0.0116119384765625,
0.018035888671875,
-0.06005859375,
0.0677490234375,
-0.036102294921875,
-0.0061798095703125,
0.00186920166015625,
0.05230712890625,
0.0218353271484375,
0.0172271728515625,
0.037445068359375,
0.05792236328125,
0.0343017578125,
-0.0288238525390625,
0.0673828125,
-0.0031528472900390625,
0.03466796875,
0.048065185546875,
0.02032470703125,
0.0467529296875,
0.0234527587890625,
-0.018646240234375,
0.0301666259765625,
0.08465576171875,
-0.0229644775390625,
0.026702880859375,
0.01285552978515625,
0.00911712646484375,
-0.005161285400390625,
0.01419830322265625,
-0.035888671875,
0.041656494140625,
0.015838623046875,
-0.040863037109375,
-0.0124359130859375,
0.003063201904296875,
0.004619598388671875,
-0.025604248046875,
-0.00969696044921875,
0.038604736328125,
0.00286865234375,
-0.033111572265625,
0.06842041015625,
0.00804901123046875,
0.05963134765625,
-0.034027099609375,
0.00103759765625,
-0.0225677490234375,
0.020843505859375,
-0.0264739990234375,
-0.0557861328125,
0.0220794677734375,
-0.0285797119140625,
-0.0014295578002929688,
0.0007500648498535156,
0.046051025390625,
-0.0182647705078125,
-0.028900146484375,
0.0189971923828125,
0.0193023681640625,
0.0379638671875,
0.006320953369140625,
-0.0933837890625,
0.01934814453125,
0.00775909423828125,
-0.04248046875,
0.0245513916015625,
0.029632568359375,
0.008758544921875,
0.0482177734375,
0.04669189453125,
-0.0081024169921875,
0.006229400634765625,
-0.01525115966796875,
0.0635986328125,
-0.040008544921875,
-0.02264404296875,
-0.05767822265625,
0.040374755859375,
-0.004627227783203125,
-0.041107177734375,
0.031524658203125,
0.0390625,
0.06756591796875,
-0.0034770965576171875,
0.032684326171875,
-0.02032470703125,
0.0002579689025878906,
-0.04229736328125,
0.0484619140625,
-0.056793212890625,
-0.0014200210571289062,
-0.0183258056640625,
-0.057098388671875,
-0.032501220703125,
0.061859130859375,
-0.025054931640625,
0.03759765625,
0.038299560546875,
0.07574462890625,
-0.0295257568359375,
-0.0217742919921875,
0.0046234130859375,
0.01070404052734375,
0.009490966796875,
0.0325927734375,
0.0287322998046875,
-0.057464599609375,
0.019134521484375,
-0.05059814453125,
-0.0176849365234375,
-0.0165863037109375,
-0.0538330078125,
-0.0736083984375,
-0.06866455078125,
-0.045654296875,
-0.052398681640625,
-0.007633209228515625,
0.07244873046875,
0.07904052734375,
-0.0521240234375,
-0.005939483642578125,
0.0011014938354492188,
0.004543304443359375,
-0.024688720703125,
-0.0162506103515625,
0.045257568359375,
-0.01201629638671875,
-0.045989990234375,
-0.0194244384765625,
0.0003001689910888672,
0.03253173828125,
0.0072479248046875,
-0.0178985595703125,
-0.01763916015625,
-0.0255584716796875,
0.013336181640625,
0.0258636474609375,
-0.057830810546875,
-0.0163116455078125,
-0.02166748046875,
0.0007319450378417969,
0.0290985107421875,
0.0345458984375,
-0.040130615234375,
0.025177001953125,
0.0216522216796875,
0.032196044921875,
0.06439208984375,
-0.0256500244140625,
0.0126800537109375,
-0.07037353515625,
0.048004150390625,
-0.00873565673828125,
0.03460693359375,
0.033416748046875,
-0.0219879150390625,
0.045379638671875,
0.030426025390625,
-0.03143310546875,
-0.0682373046875,
-0.005584716796875,
-0.07489013671875,
-0.007564544677734375,
0.06268310546875,
-0.031402587890625,
-0.041656494140625,
0.035736083984375,
-0.00007623434066772461,
0.051788330078125,
-0.007038116455078125,
0.030853271484375,
0.0201873779296875,
-0.00970458984375,
-0.0484619140625,
-0.042999267578125,
0.0382080078125,
0.0157012939453125,
-0.040283203125,
-0.03253173828125,
0.0057830810546875,
0.05157470703125,
0.0178680419921875,
0.042510986328125,
-0.0130767822265625,
0.01264190673828125,
0.002918243408203125,
0.036865234375,
-0.038848876953125,
-0.01444244384765625,
-0.018280029296875,
-0.002010345458984375,
-0.0019207000732421875,
-0.049285888671875
]
] |
rinna/japanese-clip-vit-b-16 | 2023-09-09T02:15:59.000Z | [
"transformers",
"pytorch",
"safetensors",
"clip",
"zero-shot-image-classification",
"feature-extraction",
"ja",
"japanese",
"vision",
"arxiv:2103.00020",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | feature-extraction | rinna | null | null | rinna/japanese-clip-vit-b-16 | 15 | 32,588 | transformers | 2022-04-27T07:52:33 | ---
language: ja
thumbnail: https://github.com/rinnakk/japanese-pretrained-models/blob/master/rinna.png
license: apache-2.0
tags:
- feature-extraction
- ja
- japanese
- clip
- vision
---
# rinna/japanese-clip-vit-b-16

This is a Japanese [CLIP (Contrastive Language-Image Pre-Training)](https://arxiv.org/abs/2103.00020) model trained by [rinna Co., Ltd.](https://corp.rinna.co.jp/).
Please see [japanese-clip](https://github.com/rinnakk/japanese-clip) for the other available models.
# How to use the model
1. Install package
```shell
$ pip install git+https://github.com/rinnakk/japanese-clip.git
```
2. Run
```python
import io
import requests
from PIL import Image
import torch
import japanese_clip as ja_clip
device = "cuda" if torch.cuda.is_available() else "cpu"
model, preprocess = ja_clip.load("rinna/japanese-clip-vit-b-16", cache_dir="/tmp/japanese_clip", device=device)
tokenizer = ja_clip.load_tokenizer()
img = Image.open(io.BytesIO(requests.get('https://images.pexels.com/photos/2253275/pexels-photo-2253275.jpeg?auto=compress&cs=tinysrgb&dpr=3&h=750&w=1260').content))
image = preprocess(img).unsqueeze(0).to(device)
encodings = ja_clip.tokenize(
texts=["犬", "猫", "象"],
max_seq_len=77,
device=device,
    tokenizer=tokenizer, # this is optional; if not passed, the tokenizer is loaded on each call
)
with torch.no_grad():
image_features = model.get_image_features(image)
text_features = model.get_text_features(**encodings)
text_probs = (100.0 * image_features @ text_features.T).softmax(dim=-1)
print("Label probs:", text_probs) # prints: [[1.0, 0.0, 0.0]]
```
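The last two lines above scale the image-text similarities by 100 and normalize them with a softmax to get label probabilities. A plain-Python sketch of that step:

```python
import math

def clip_probs(sims, scale=100.0):
    """Softmax over scaled image-text similarity scores, as in the snippet above."""
    logits = [scale * s for s in sims]
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]
```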
# Model architecture
The model uses a ViT-B/16 Transformer architecture as its image encoder and a 12-layer BERT as its text encoder. The image encoder was initialized from the [AugReg `vit-base-patch16-224` model](https://github.com/google-research/vision_transformer).
# Training
The model was trained on [CC12M](https://github.com/google-research-datasets/conceptual-12m) with the captions translated to Japanese.
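CLIP-style training pairs each image with its caption in a batch and optimizes a symmetric contrastive (InfoNCE) objective over the in-batch similarity matrix, with matched pairs on the diagonal. A minimal sketch of that loss, for illustration only (this is not rinna's training code):

```python
import math

def clip_loss(sim):
    """Symmetric InfoNCE loss over an NxN image-text similarity matrix.

    Matched image-text pairs sit on the diagonal; the loss averages
    cross-entropy in the image-to-text and text-to-image directions.
    """
    def ce_rows(m):
        loss = 0.0
        for i, row in enumerate(m):
            mx = max(row)  # stabilize the log-sum-exp
            log_z = mx + math.log(sum(math.exp(x - mx) for x in row))
            loss += log_z - row[i]  # -log softmax at the true (diagonal) index
        return loss / len(m)

    sim_t = [list(col) for col in zip(*sim)]  # transposed: text-to-image direction
    return 0.5 * (ce_rows(sim) + ce_rows(sim_t))
```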
# License
[The Apache 2.0 license](https://www.apache.org/licenses/LICENSE-2.0)
| 2,149 | [
[
-0.025604248046875,
-0.0592041015625,
0.0162506103515625,
0.0225830078125,
-0.042633056640625,
-0.002010345458984375,
-0.0142974853515625,
-0.027862548828125,
0.0287017822265625,
0.031890869140625,
-0.04083251953125,
-0.041351318359375,
-0.051727294921875,
0.004947662353515625,
-0.006183624267578125,
0.07208251953125,
-0.004581451416015625,
0.0120086669921875,
0.017333984375,
-0.006317138671875,
-0.04010009765625,
-0.0142059326171875,
-0.0491943359375,
-0.024139404296875,
0.01047515869140625,
0.0175018310546875,
0.04815673828125,
0.03741455078125,
0.044677734375,
0.0272369384765625,
0.0013418197631835938,
-0.0038204193115234375,
-0.0261993408203125,
-0.0199432373046875,
-0.00539398193359375,
-0.03857421875,
-0.0189208984375,
-0.00580596923828125,
0.051727294921875,
-0.0011510848999023438,
0.0280303955078125,
0.00921630859375,
-0.010345458984375,
0.0178375244140625,
-0.0408935546875,
0.01023101806640625,
-0.042022705078125,
-0.0015010833740234375,
-0.0181427001953125,
-0.0084381103515625,
-0.0222930908203125,
-0.0173492431640625,
0.01239776611328125,
-0.051422119140625,
0.0264434814453125,
0.0007166862487792969,
0.1273193359375,
0.0019025802612304688,
-0.00627899169921875,
-0.01189422607421875,
-0.0235748291015625,
0.0560302734375,
-0.052581787109375,
0.0215911865234375,
0.01322174072265625,
0.005889892578125,
0.006072998046875,
-0.07769775390625,
-0.04052734375,
-0.0008249282836914062,
0.0022792816162109375,
0.01178741455078125,
-0.0102691650390625,
-0.005889892578125,
0.0298919677734375,
0.030731201171875,
-0.032196044921875,
-0.007678985595703125,
-0.048919677734375,
-0.0196685791015625,
0.033843994140625,
0.007610321044921875,
0.055267333984375,
-0.0233917236328125,
-0.03839111328125,
-0.039825439453125,
-0.04022216796875,
0.00571441650390625,
0.0266571044921875,
-0.0028896331787109375,
-0.043182373046875,
0.040008544921875,
0.008819580078125,
0.0244903564453125,
0.004192352294921875,
-0.0240325927734375,
0.03448486328125,
-0.01088714599609375,
-0.017333984375,
-0.003383636474609375,
0.0855712890625,
0.0499267578125,
0.014404296875,
0.022186279296875,
-0.0037555694580078125,
0.00945281982421875,
-0.0018777847290039062,
-0.0833740234375,
-0.0443115234375,
-0.011474609375,
-0.033203125,
-0.02069091796875,
0.004894256591796875,
-0.0562744140625,
0.002292633056640625,
0.023895263671875,
0.067138671875,
-0.049163818359375,
-0.014404296875,
0.02105712890625,
-0.01947021484375,
0.0250701904296875,
0.01342010498046875,
-0.0634765625,
0.001476287841796875,
0.01131439208984375,
0.0771484375,
0.01558685302734375,
-0.03289794921875,
0.0079193115234375,
0.0058441162109375,
-0.0162811279296875,
0.051177978515625,
-0.0168914794921875,
-0.04150390625,
-0.01491546630859375,
0.033538818359375,
-0.0045013427734375,
-0.0255279541015625,
0.046600341796875,
-0.0301055908203125,
0.0232391357421875,
-0.013946533203125,
-0.0208892822265625,
-0.0318603515625,
0.01788330078125,
-0.044647216796875,
0.07293701171875,
0.0011606216430664062,
-0.08843994140625,
0.0200042724609375,
-0.04254150390625,
-0.01132965087890625,
0.0092315673828125,
-0.0193634033203125,
-0.05645751953125,
-0.015228271484375,
0.037841796875,
0.0228118896484375,
-0.005828857421875,
-0.0018892288208007812,
-0.0283660888671875,
-0.0285491943359375,
0.0350341796875,
-0.0377197265625,
0.07366943359375,
0.0088653564453125,
-0.018218994140625,
0.0084991455078125,
-0.042327880859375,
-0.0011491775512695312,
0.0389404296875,
-0.005706787109375,
-0.0251922607421875,
-0.040771484375,
0.0182342529296875,
0.00972747802734375,
0.036041259765625,
-0.05804443359375,
0.01007843017578125,
-0.0230712890625,
0.042633056640625,
0.04595947265625,
0.02301025390625,
0.011505126953125,
-0.01438140869140625,
0.034149169921875,
0.00850677490234375,
0.01800537109375,
-0.03216552734375,
-0.0361328125,
-0.0758056640625,
-0.03143310546875,
0.0079193115234375,
0.03326416015625,
-0.07110595703125,
0.020904541015625,
-0.0219268798828125,
-0.044097900390625,
-0.0595703125,
-0.0176239013671875,
0.033966064453125,
0.0391845703125,
0.0234222412109375,
-0.04168701171875,
-0.040771484375,
-0.0791015625,
0.0010995864868164062,
-0.004558563232421875,
0.006465911865234375,
0.0247955322265625,
0.0645751953125,
-0.0100250244140625,
0.04571533203125,
-0.048553466796875,
-0.01435089111328125,
-0.024871826171875,
0.01561737060546875,
0.040557861328125,
0.05364990234375,
0.03863525390625,
-0.053070068359375,
-0.04388427734375,
-0.004482269287109375,
-0.05279541015625,
0.004756927490234375,
-0.006076812744140625,
-0.01959228515625,
-0.005336761474609375,
0.01235198974609375,
-0.040771484375,
0.039794921875,
0.0166778564453125,
-0.0223388671875,
0.053924560546875,
-0.03363037109375,
0.032012939453125,
-0.0858154296875,
0.018463134765625,
0.0027217864990234375,
-0.0222320556640625,
-0.025848388671875,
0.01047515869140625,
0.0202789306640625,
-0.0157470703125,
-0.052581787109375,
0.046783447265625,
-0.032257080078125,
0.0008006095886230469,
-0.02215576171875,
-0.0151519775390625,
0.0296783447265625,
0.04632568359375,
0.031982421875,
0.0802001953125,
0.050323486328125,
-0.032318115234375,
0.0238189697265625,
0.038604736328125,
-0.05145263671875,
0.034912109375,
-0.0706787109375,
0.007114410400390625,
0.0062103271484375,
0.00016689300537109375,
-0.052581787109375,
-0.0307769775390625,
0.037628173828125,
-0.047149658203125,
0.0216522216796875,
-0.0213623046875,
-0.031524658203125,
-0.028533935546875,
-0.036956787109375,
0.03936767578125,
0.047454833984375,
-0.03668212890625,
0.01544952392578125,
0.025726318359375,
-0.007755279541015625,
-0.055328369140625,
-0.081787109375,
-0.002826690673828125,
-0.0115814208984375,
-0.037322998046875,
0.04010009765625,
0.01206207275390625,
0.0265655517578125,
0.0159759521484375,
0.0084075927734375,
-0.0212554931640625,
-0.0103607177734375,
0.0173797607421875,
0.04534912109375,
-0.0211181640625,
-0.0274200439453125,
-0.0018758773803710938,
0.0036983489990234375,
0.01129150390625,
0.010284423828125,
0.049896240234375,
-0.0190887451171875,
-0.025634765625,
-0.05645751953125,
0.0008392333984375,
0.020233154296875,
-0.00753021240234375,
0.0460205078125,
0.07037353515625,
-0.02294921875,
0.0089263916015625,
-0.0330810546875,
-0.01389312744140625,
-0.0404052734375,
0.05548095703125,
-0.02728271484375,
-0.0406494140625,
0.05029296875,
0.0164947509765625,
-0.003936767578125,
0.04022216796875,
0.03692626953125,
-0.01422882080078125,
0.0733642578125,
0.052581787109375,
-0.003498077392578125,
0.057098388671875,
-0.06719970703125,
0.0074005126953125,
-0.07794189453125,
-0.0271759033203125,
-0.01751708984375,
-0.008941650390625,
-0.028594970703125,
-0.03399658203125,
0.036285400390625,
0.01557159423828125,
-0.0274658203125,
0.031494140625,
-0.04449462890625,
0.043670654296875,
0.027008056640625,
0.033294677734375,
0.0017719268798828125,
-0.0004024505615234375,
-0.01006317138671875,
-0.0301513671875,
-0.04632568359375,
-0.0330810546875,
0.063720703125,
0.047210693359375,
0.072509765625,
-0.0222625732421875,
0.042510986328125,
0.0007905960083007812,
-0.0009694099426269531,
-0.043304443359375,
0.039794921875,
-0.0190277099609375,
-0.05072021484375,
0.006008148193359375,
-0.002044677734375,
-0.062042236328125,
0.01543426513671875,
-0.02484130859375,
-0.049163818359375,
0.005401611328125,
-0.0003643035888671875,
-0.0036449432373046875,
0.03741455078125,
-0.0443115234375,
0.07513427734375,
-0.017364501953125,
-0.0298919677734375,
-0.001190185546875,
-0.044036865234375,
0.03240966796875,
0.0311737060546875,
0.00568389892578125,
-0.00441741943359375,
0.0081787109375,
0.0797119140625,
-0.035888671875,
0.0625,
-0.00931549072265625,
0.0172271728515625,
0.02642822265625,
-0.01152801513671875,
0.019989013671875,
0.00879669189453125,
0.0241851806640625,
0.006198883056640625,
-0.0009407997131347656,
-0.0259857177734375,
-0.03753662109375,
0.03948974609375,
-0.07659912109375,
-0.029541015625,
-0.021728515625,
-0.0161285400390625,
0.02496337890625,
0.0279541015625,
0.07232666015625,
0.050384521484375,
0.002712249755859375,
0.0179290771484375,
0.05706787109375,
-0.0308685302734375,
0.029022216796875,
0.00911712646484375,
-0.0291595458984375,
-0.061798095703125,
0.08245849609375,
-0.004901885986328125,
0.00904083251953125,
0.0183563232421875,
0.0080108642578125,
-0.029022216796875,
-0.026763916015625,
-0.036956787109375,
0.035797119140625,
-0.06805419921875,
-0.0124969482421875,
-0.036651611328125,
-0.039031982421875,
-0.0281829833984375,
-0.0247039794921875,
-0.0413818359375,
-0.00872802734375,
-0.04278564453125,
0.0111083984375,
0.038421630859375,
0.03167724609375,
0.01375579833984375,
0.041351318359375,
-0.062255859375,
0.022979736328125,
0.01319122314453125,
0.012969970703125,
0.0007171630859375,
-0.060882568359375,
-0.035888671875,
-0.00984954833984375,
-0.04571533203125,
-0.07244873046875,
0.0531005859375,
0.00971221923828125,
0.036712646484375,
0.0245819091796875,
-0.01113128662109375,
0.059112548828125,
-0.01800537109375,
0.070068359375,
0.036590576171875,
-0.0682373046875,
0.04351806640625,
-0.0179595947265625,
0.033966064453125,
0.034423828125,
0.027069091796875,
-0.04071044921875,
-0.02685546875,
-0.050140380859375,
-0.07354736328125,
0.056610107421875,
0.0138092041015625,
0.0202178955078125,
0.003692626953125,
0.0133056640625,
0.0035457611083984375,
-0.004878997802734375,
-0.08673095703125,
-0.0301055908203125,
-0.045440673828125,
-0.025970458984375,
0.0186614990234375,
-0.015716552734375,
0.01047515869140625,
-0.0246124267578125,
0.06292724609375,
0.00659942626953125,
0.036102294921875,
0.034576416015625,
-0.032928466796875,
0.0162811279296875,
0.003936767578125,
0.03912353515625,
0.0192108154296875,
-0.0206451416015625,
0.0008602142333984375,
0.015716552734375,
-0.062744140625,
0.0066375732421875,
-0.0195465087890625,
-0.03277587890625,
0.0278472900390625,
0.033111572265625,
0.0972900390625,
0.033294677734375,
-0.02496337890625,
0.0504150390625,
-0.005786895751953125,
-0.0129852294921875,
-0.0186920166015625,
-0.0011434555053710938,
-0.007843017578125,
0.01513671875,
0.0177764892578125,
-0.0088958740234375,
0.00884246826171875,
-0.0201263427734375,
-0.00011414289474487305,
0.0252838134765625,
-0.0160980224609375,
-0.03656005859375,
0.056182861328125,
0.016082763671875,
-0.01404571533203125,
0.0362548828125,
-0.0027446746826171875,
-0.062286376953125,
0.054901123046875,
0.05279541015625,
0.0643310546875,
0.01201629638671875,
0.01035308837890625,
0.06182861328125,
0.0194244384765625,
0.0006694793701171875,
0.0294342041015625,
0.00514984130859375,
-0.05615234375,
-0.0128021240234375,
-0.03668212890625,
-0.03302001953125,
0.0298309326171875,
-0.0516357421875,
0.044677734375,
-0.04083251953125,
-0.0288848876953125,
-0.003173828125,
0.0140380859375,
-0.053802490234375,
0.022674560546875,
-0.0011644363403320312,
0.06658935546875,
-0.0714111328125,
0.0777587890625,
0.0531005859375,
-0.049468994140625,
-0.0587158203125,
-0.0272216796875,
-0.0343017578125,
-0.07281494140625,
0.052001953125,
0.0241546630859375,
-0.00632476806640625,
-0.0015010833740234375,
-0.05816650390625,
-0.0518798828125,
0.07049560546875,
0.0215911865234375,
-0.015716552734375,
0.006908416748046875,
-0.00955963134765625,
0.0224761962890625,
-0.0269012451171875,
0.035369873046875,
0.01432037353515625,
0.016326904296875,
0.003173828125,
-0.056182861328125,
-0.0016241073608398438,
-0.0265045166015625,
0.0211181640625,
-0.0015583038330078125,
-0.05682373046875,
0.075927734375,
-0.01543426513671875,
-0.0231781005859375,
0.01641845703125,
0.06414794921875,
0.0275726318359375,
0.00344085693359375,
0.0185394287109375,
0.04046630859375,
0.01538848876953125,
0.001312255859375,
0.051513671875,
-0.016448974609375,
0.05078125,
0.06884765625,
0.01556396484375,
0.054443359375,
0.034515380859375,
-0.01342010498046875,
0.050506591796875,
0.043975830078125,
-0.037506103515625,
0.056396484375,
-0.0209503173828125,
-0.0125579833984375,
-0.002628326416015625,
-0.004154205322265625,
-0.0380859375,
0.0271148681640625,
0.0265045166015625,
-0.037841796875,
-0.01311492919921875,
0.0164642333984375,
0.00567626953125,
-0.0076446533203125,
-0.02032470703125,
0.037994384765625,
-0.0024776458740234375,
-0.017669677734375,
0.043914794921875,
0.004974365234375,
0.0830078125,
-0.0457763671875,
-0.00007462501525878906,
0.0152435302734375,
0.0096282958984375,
-0.00693511962890625,
-0.09228515625,
0.017364501953125,
0.01161956787109375,
-0.007965087890625,
-0.00060272216796875,
0.06463623046875,
-0.0352783203125,
-0.0384521484375,
0.00800323486328125,
-0.01361083984375,
0.04095458984375,
0.0285797119140625,
-0.07171630859375,
0.035247802734375,
0.01180267333984375,
-0.019195556640625,
0.011474609375,
0.0115203857421875,
0.006526947021484375,
0.0440673828125,
0.030670166015625,
0.01279449462890625,
0.01422882080078125,
-0.0084075927734375,
0.057098388671875,
-0.046234130859375,
-0.04779052734375,
-0.068115234375,
0.03192138671875,
-0.01959228515625,
-0.037750244140625,
0.052154541015625,
0.04656982421875,
0.06964111328125,
-0.037750244140625,
0.05194091796875,
-0.00951385498046875,
0.0097808837890625,
-0.054351806640625,
0.07073974609375,
-0.036834716796875,
-0.01435089111328125,
-0.0287933349609375,
-0.042938232421875,
-0.00441741943359375,
0.0699462890625,
-0.019134521484375,
-0.0024204254150390625,
0.04205322265625,
0.06561279296875,
-0.01232147216796875,
-0.01885986328125,
0.015228271484375,
0.00823974609375,
0.0216217041015625,
0.051544189453125,
0.04925537109375,
-0.0670166015625,
0.03839111328125,
-0.0557861328125,
0.0119171142578125,
-0.016571044921875,
-0.060791015625,
-0.0653076171875,
-0.0489501953125,
-0.040283203125,
-0.020111083984375,
-0.0174102783203125,
0.06585693359375,
0.06463623046875,
-0.047027587890625,
-0.01375579833984375,
0.0081024169921875,
0.005512237548828125,
0.0219268798828125,
-0.0184783935546875,
0.0352783203125,
-0.004222869873046875,
-0.0635986328125,
-0.00632476806640625,
0.013427734375,
0.0287933349609375,
-0.003986358642578125,
-0.0038204193115234375,
-0.0112762451171875,
-0.01105499267578125,
0.00864410400390625,
0.03564453125,
-0.04937744140625,
-0.0177764892578125,
-0.01079559326171875,
-0.01560211181640625,
0.0220794677734375,
0.0272979736328125,
-0.03424072265625,
0.034210205078125,
0.0394287109375,
0.023223876953125,
0.062042236328125,
-0.017791748046875,
0.01242828369140625,
-0.0537109375,
0.0428466796875,
0.0010099411010742188,
0.042510986328125,
0.03424072265625,
-0.017852783203125,
0.01177978515625,
0.0305633544921875,
-0.038726806640625,
-0.0648193359375,
-0.0005521774291992188,
-0.0797119140625,
-0.013275146484375,
0.06182861328125,
-0.041900634765625,
-0.05511474609375,
0.01479339599609375,
-0.04693603515625,
0.0361328125,
-0.0205535888671875,
0.03619384765625,
0.04388427734375,
0.01904296875,
-0.045440673828125,
-0.021331787109375,
0.0169525146484375,
0.0104522705078125,
-0.03863525390625,
-0.0300445556640625,
0.01325225830078125,
0.0518798828125,
0.01763916015625,
0.054656982421875,
-0.0181427001953125,
0.0266571044921875,
0.020782470703125,
0.033966064453125,
-0.01412200927734375,
-0.0131988525390625,
-0.0279541015625,
-0.01389312744140625,
0.00511932373046875,
-0.059356689453125
]
] |
clefourrier/graphormer-base-pcqm4mv1 | 2023-02-07T16:35:10.000Z | [
"transformers",
"pytorch",
"graphormer",
"graphs",
"graph-ml",
"arxiv:2106.05234",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] | graph-ml | clefourrier | null | null | clefourrier/graphormer-base-pcqm4mv1 | 2 | 32,559 | transformers | 2023-01-05T10:12:34 | ---
license: mit
tags:
- graphs
pipeline_tag: graph-ml
---
# Model Card for pcqm4mv1_graphormer_base
Graphormer is a graph classification model.
# Model Details
## Model Description
Graphormer is a graph Transformer model, pretrained on PCQM4M-LSC, which won 1st place in the KDD Cup 2021 quantum property prediction track.
- **Developed by:** Microsoft
- **Model type:** Graphormer
- **License:** MIT
## Model Sources
- **Repository:** [Github](https://github.com/microsoft/Graphormer)
- **Paper:** [Paper](https://arxiv.org/abs/2106.05234)
- **Documentation:** [Link](https://graphormer.readthedocs.io/en/latest/)
# Uses
## Direct Use
This model should be used for graph classification or graph representation tasks; the most likely associated task is molecule modeling. It can be used as is, or fine-tuned on downstream tasks.
# Bias, Risks, and Limitations
The Graphormer model is resource-intensive for large graphs, and may lead to out-of-memory (OOM) errors.
## How to Get Started with the Model
See the Graph Classification with Transformers tutorial.
# Citation
**BibTeX:**
```
@article{DBLP:journals/corr/abs-2106-05234,
author = {Chengxuan Ying and
Tianle Cai and
Shengjie Luo and
Shuxin Zheng and
Guolin Ke and
Di He and
Yanming Shen and
Tie{-}Yan Liu},
title = {Do Transformers Really Perform Bad for Graph Representation?},
journal = {CoRR},
volume = {abs/2106.05234},
year = {2021},
url = {https://arxiv.org/abs/2106.05234},
eprinttype = {arXiv},
eprint = {2106.05234},
timestamp = {Tue, 15 Jun 2021 16:35:15 +0200},
biburl = {https://dblp.org/rec/journals/corr/abs-2106-05234.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
``` | 2,043 | [
[
-0.043792724609375,
-0.0305938720703125,
0.0227508544921875,
-0.0251617431640625,
-0.0212249755859375,
0.0156707763671875,
0.0229644775390625,
-0.0272216796875,
-0.003658294677734375,
0.04400634765625,
-0.0278472900390625,
-0.051239013671875,
-0.05316162109375,
0.00879669189453125,
-0.0296173095703125,
0.07440185546875,
-0.00687408447265625,
0.0006175041198730469,
-0.0170440673828125,
-0.00321197509765625,
-0.0201873779296875,
-0.051605224609375,
-0.0384521484375,
-0.05078125,
0.0300140380859375,
0.023590087890625,
0.044525146484375,
0.044647216796875,
0.03973388671875,
0.029632568359375,
-0.0302734375,
0.0006580352783203125,
-0.057342529296875,
-0.0257110595703125,
-0.003047943115234375,
-0.030364990234375,
-0.05633544921875,
0.02032470703125,
0.0499267578125,
0.04486083984375,
-0.01129150390625,
0.0198211669921875,
-0.00518798828125,
0.03692626953125,
-0.027374267578125,
0.0263671875,
-0.031768798828125,
0.01506805419921875,
-0.005680084228515625,
0.00469970703125,
-0.052978515625,
-0.034820556640625,
0.0026836395263671875,
-0.03570556640625,
0.048004150390625,
0.021331787109375,
0.10418701171875,
0.003612518310546875,
-0.059967041015625,
0.0122528076171875,
-0.0704345703125,
0.0416259765625,
-0.03631591796875,
0.0261383056640625,
0.00374603271484375,
0.054840087890625,
0.00429534912109375,
-0.0916748046875,
-0.03350830078125,
0.00722503662109375,
-0.0157318115234375,
0.02801513671875,
-0.04083251953125,
0.0192108154296875,
0.03533935546875,
0.039093017578125,
-0.046356201171875,
-0.004016876220703125,
-0.06390380859375,
-0.0260162353515625,
0.0263824462890625,
0.01548004150390625,
0.00009733438491821289,
-0.032073974609375,
-0.04443359375,
-0.022216796875,
-0.035736083984375,
0.0018405914306640625,
0.03631591796875,
0.0122833251953125,
-0.0084228515625,
0.047393798828125,
0.0244903564453125,
0.0579833984375,
0.018585205078125,
0.0007843971252441406,
0.03155517578125,
-0.017730712890625,
-0.01343536376953125,
0.01666259765625,
0.07061767578125,
0.0189666748046875,
0.0124053955078125,
-0.003963470458984375,
0.01222991943359375,
0.01154327392578125,
0.026123046875,
-0.09027099609375,
-0.041595458984375,
0.010101318359375,
-0.04150390625,
-0.00400543212890625,
0.036163330078125,
-0.054931640625,
0.0012254714965820312,
-0.003772735595703125,
0.0100250244140625,
-0.025421142578125,
-0.017547607421875,
-0.00746917724609375,
-0.0295867919921875,
0.017608642578125,
0.004711151123046875,
-0.03338623046875,
0.029510498046875,
0.036102294921875,
0.0679931640625,
-0.0205535888671875,
-0.0089111328125,
-0.0155487060546875,
0.01971435546875,
-0.016937255859375,
0.065185546875,
-0.023590087890625,
-0.052001953125,
-0.0244140625,
0.038665771484375,
-0.019134521484375,
-0.041229248046875,
0.0550537109375,
-0.052520751953125,
0.0237274169921875,
0.0053253173828125,
-0.046600341796875,
-0.03253173828125,
-0.0007619857788085938,
-0.06951904296875,
0.0677490234375,
0.041473388671875,
-0.07196044921875,
0.00930023193359375,
-0.0296173095703125,
-0.002246856689453125,
0.01026153564453125,
0.00460052490234375,
-0.059417724609375,
-0.023590087890625,
-0.0195770263671875,
0.0251617431640625,
-0.00679779052734375,
0.005950927734375,
-0.0263671875,
-0.03594970703125,
0.00165557861328125,
-0.0080718994140625,
0.0765380859375,
0.0179901123046875,
0.006244659423828125,
0.02850341796875,
-0.07049560546875,
-0.0029926300048828125,
0.0214080810546875,
-0.020416259765625,
0.01451873779296875,
-0.01090240478515625,
0.0045928955078125,
0.0277557373046875,
0.0106353759765625,
-0.039886474609375,
0.007656097412109375,
-0.00974273681640625,
0.03753662109375,
0.056732177734375,
0.0082855224609375,
0.031005859375,
-0.01419830322265625,
0.058563232421875,
-0.00539398193359375,
0.014434814453125,
0.0002837181091308594,
-0.053955078125,
-0.043792724609375,
-0.0246734619140625,
0.0177764892578125,
0.0426025390625,
-0.0390625,
0.047271728515625,
-0.0234527587890625,
-0.07244873046875,
-0.0181884765625,
-0.0092010498046875,
0.0209197998046875,
0.0540771484375,
0.044189453125,
-0.007312774658203125,
-0.055145263671875,
-0.0631103515625,
-0.0025501251220703125,
-0.007579803466796875,
0.0015926361083984375,
0.0136871337890625,
0.0237274169921875,
-0.049468994140625,
0.0682373046875,
-0.056732177734375,
-0.0210418701171875,
-0.0157928466796875,
0.01372528076171875,
0.035125732421875,
0.0203094482421875,
0.047088623046875,
-0.04595947265625,
-0.047332763671875,
-0.0126495361328125,
-0.032073974609375,
0.00223541259765625,
0.038604736328125,
-0.01544189453125,
0.00518035888671875,
0.036651611328125,
-0.046661376953125,
0.0274200439453125,
0.060150146484375,
-0.04595947265625,
0.045166015625,
-0.007709503173828125,
-0.003993988037109375,
-0.07684326171875,
0.0258941650390625,
0.0010862350463867188,
-0.05035400390625,
-0.041748046875,
-0.01263427734375,
-0.00759124755859375,
-0.0220184326171875,
-0.031768798828125,
0.025390625,
-0.0264129638671875,
0.01403045654296875,
-0.024566650390625,
-0.003055572509765625,
-0.0033283233642578125,
0.037841796875,
-0.024627685546875,
0.039337158203125,
0.0300140380859375,
-0.039794921875,
0.027252197265625,
0.038848876953125,
-0.0089874267578125,
0.0164031982421875,
-0.06268310546875,
0.01824951171875,
-0.01580810546875,
0.0311126708984375,
-0.09564208984375,
-0.007843017578125,
0.035736083984375,
-0.029632568359375,
0.042449951171875,
-0.0252685546875,
-0.033111572265625,
-0.061492919921875,
0.0038967132568359375,
0.0255126953125,
0.05364990234375,
-0.045166015625,
0.06268310546875,
0.0258941650390625,
0.0103759765625,
-0.022613525390625,
-0.04443359375,
-0.0281829833984375,
-0.0084991455078125,
-0.037200927734375,
0.053985595703125,
-0.009490966796875,
0.0203094482421875,
0.0090484619140625,
-0.0159759521484375,
-0.004550933837890625,
-0.019439697265625,
0.030303955078125,
0.0335693359375,
0.00959014892578125,
-0.019775390625,
-0.01000213623046875,
-0.055938720703125,
0.019775390625,
0.00916290283203125,
0.04559326171875,
-0.029052734375,
-0.0186920166015625,
-0.035308837890625,
0.003204345703125,
0.04107666015625,
-0.01256561279296875,
0.037567138671875,
0.061920166015625,
-0.006580352783203125,
0.010040283203125,
-0.00272369384765625,
-0.01508331298828125,
-0.038970947265625,
0.03466796875,
-0.0149078369140625,
-0.0648193359375,
0.0406494140625,
-0.0199737548828125,
-0.019500732421875,
0.07012939453125,
0.0338134765625,
0.00740814208984375,
0.110595703125,
0.03802490234375,
0.01493072509765625,
0.0452880859375,
-0.05572509765625,
0.027130126953125,
-0.0614013671875,
-0.019012451171875,
-0.05078125,
-0.01116943359375,
-0.065185546875,
-0.022979736328125,
0.0257568359375,
0.0196685791015625,
-0.052520751953125,
0.040863037109375,
-0.067626953125,
0.0139312744140625,
0.033172607421875,
0.00501251220703125,
0.010894775390625,
0.002674102783203125,
-0.00732421875,
0.01235198974609375,
-0.0706787109375,
-0.052520751953125,
0.0616455078125,
0.029388427734375,
0.053009033203125,
0.005985260009765625,
0.0281219482421875,
-0.00421905517578125,
0.025909423828125,
-0.051177978515625,
0.0438232421875,
-0.01502227783203125,
-0.059295654296875,
0.00751495361328125,
-0.036834716796875,
-0.08306884765625,
0.01325225830078125,
-0.018768310546875,
-0.057861328125,
0.03985595703125,
0.03082275390625,
0.0161285400390625,
0.035064697265625,
-0.045196533203125,
0.0885009765625,
-0.0017366409301757812,
0.0000871419906616211,
0.002239227294921875,
-0.04425048828125,
0.039520263671875,
0.00988006591796875,
0.0066986083984375,
-0.005184173583984375,
0.020721435546875,
0.052581787109375,
-0.0499267578125,
0.044952392578125,
-0.0247650146484375,
0.0175018310546875,
0.045135498046875,
-0.007091522216796875,
0.050567626953125,
-0.0159149169921875,
-0.0019273757934570312,
0.03515625,
0.018768310546875,
-0.031707763671875,
-0.01216888427734375,
0.03558349609375,
-0.08172607421875,
-0.039306640625,
-0.052642822265625,
-0.0180511474609375,
0.00612640380859375,
0.032928466796875,
0.0518798828125,
0.0085296630859375,
-0.0006127357482910156,
0.015777587890625,
0.041534423828125,
0.00597381591796875,
0.0175933837890625,
0.01457977294921875,
-0.040283203125,
-0.044281005859375,
0.07080078125,
0.01265716552734375,
0.02325439453125,
0.01336669921875,
0.0190277099609375,
-0.018524169921875,
-0.03192138671875,
-0.00884246826171875,
0.0233917236328125,
-0.07061767578125,
-0.027374267578125,
-0.05035400390625,
-0.041656494140625,
-0.020416259765625,
0.00881195068359375,
-0.0243682861328125,
-0.02899169921875,
-0.039642333984375,
-0.0158538818359375,
0.039642333984375,
0.039581298828125,
0.0034770965576171875,
0.0518798828125,
-0.057647705078125,
0.023345947265625,
0.0285491943359375,
0.031982421875,
0.006923675537109375,
-0.04461669921875,
-0.0167388916015625,
-0.01021575927734375,
-0.051177978515625,
-0.0885009765625,
0.0257110595703125,
0.004962921142578125,
0.0579833984375,
0.04058837890625,
-0.0024127960205078125,
0.0421142578125,
-0.038330078125,
0.04461669921875,
0.0297393798828125,
-0.05706787109375,
0.006519317626953125,
-0.0136260986328125,
0.017120361328125,
0.023406982421875,
0.051239013671875,
-0.0243377685546875,
-0.01312255859375,
-0.048858642578125,
-0.06939697265625,
0.058319091796875,
0.00994110107421875,
-0.0057830810546875,
0.0037746429443359375,
0.0260467529296875,
-0.002346038818359375,
-0.0029163360595703125,
-0.063232421875,
-0.049072265625,
-0.05316162109375,
-0.00780487060546875,
-0.0125274658203125,
-0.0197601318359375,
-0.003662109375,
-0.046142578125,
0.055084228515625,
-0.007434844970703125,
0.05316162109375,
0.0128936767578125,
0.00006848573684692383,
-0.0189208984375,
-0.004558563232421875,
0.0235595703125,
0.049041748046875,
-0.049713134765625,
0.0128936767578125,
0.00154876708984375,
-0.0421142578125,
0.006771087646484375,
0.04010009765625,
0.0026569366455078125,
0.01215362548828125,
0.01480865478515625,
0.059478759765625,
-0.003753662109375,
-0.02264404296875,
0.031768798828125,
-0.0011110305786132812,
-0.0238494873046875,
-0.029083251953125,
0.0215606689453125,
0.015777587890625,
0.0056915283203125,
0.034088134765625,
0.01317596435546875,
0.0135498046875,
-0.0198211669921875,
-0.0020313262939453125,
0.0219268798828125,
-0.054534912109375,
-0.0286712646484375,
0.06414794921875,
-0.0112762451171875,
-0.021392822265625,
0.04998779296875,
-0.03826904296875,
-0.051025390625,
0.051513671875,
0.05572509765625,
0.05474853515625,
-0.035308837890625,
0.0003001689910888672,
0.050323486328125,
0.006343841552734375,
0.0177459716796875,
0.0127105712890625,
0.027679443359375,
-0.0421142578125,
-0.0217132568359375,
-0.0224456787109375,
-0.026611328125,
-0.00777435302734375,
-0.0457763671875,
0.040496826171875,
-0.0300445556640625,
-0.034912109375,
0.0161590576171875,
0.0008578300476074219,
-0.052734375,
-0.00982666015625,
-0.0002727508544921875,
0.0665283203125,
-0.054931640625,
0.053741455078125,
0.06201171875,
-0.0268402099609375,
-0.042816162109375,
-0.04425048828125,
-0.008453369140625,
-0.038665771484375,
0.044677734375,
0.0036296844482421875,
-0.00457763671875,
0.030670166015625,
-0.04925537109375,
-0.07733154296875,
0.10858154296875,
0.0216217041015625,
-0.04583740234375,
-0.0005469322204589844,
0.002887725830078125,
0.03704833984375,
-0.020477294921875,
0.037139892578125,
0.0179290771484375,
0.0386962890625,
0.029052734375,
-0.04693603515625,
0.016876220703125,
-0.0121002197265625,
0.00655364990234375,
0.026123046875,
-0.0631103515625,
0.092041015625,
-0.0007128715515136719,
-0.035186767578125,
-0.004940032958984375,
0.034820556640625,
0.038909912109375,
0.01690673828125,
0.028167724609375,
0.0335693359375,
0.061767578125,
-0.0169677734375,
0.07904052734375,
-0.04290771484375,
0.0648193359375,
0.0689697265625,
-0.01078033447265625,
0.03216552734375,
0.0192718505859375,
-0.047607421875,
0.07000732421875,
0.059814453125,
-0.040008544921875,
0.033782958984375,
0.0125885009765625,
0.01910400390625,
-0.0179290771484375,
0.0127716064453125,
-0.044403076171875,
0.021575927734375,
0.018890380859375,
-0.04107666015625,
-0.0189208984375,
-0.01519012451171875,
-0.002834320068359375,
-0.0106353759765625,
-0.0272369384765625,
0.032989501953125,
0.0080718994140625,
-0.01336669921875,
0.05364990234375,
0.002948760986328125,
0.027008056640625,
-0.036895751953125,
-0.020538330078125,
0.0090484619140625,
0.0301361083984375,
-0.0279998779296875,
-0.03753662109375,
0.0283660888671875,
-0.02301025390625,
-0.04119873046875,
0.0017795562744140625,
0.06585693359375,
-0.01561737060546875,
-0.0504150390625,
0.0273284912109375,
0.013580322265625,
0.0174713134765625,
0.01666259765625,
-0.062286376953125,
-0.0036296844482421875,
-0.0167236328125,
-0.014373779296875,
-0.002010345458984375,
0.0015878677368164062,
0.005428314208984375,
0.0460205078125,
0.036468505859375,
-0.0189056396484375,
0.006435394287109375,
-0.01346588134765625,
0.06414794921875,
-0.049560546875,
-0.0274658203125,
-0.044281005859375,
0.03106689453125,
-0.00975799560546875,
-0.0110015869140625,
0.07135009765625,
0.0721435546875,
0.05712890625,
-0.01300811767578125,
0.043609619140625,
0.006137847900390625,
0.039093017578125,
-0.00757598876953125,
0.045440673828125,
-0.01479339599609375,
-0.0093536376953125,
-0.030242919921875,
-0.08502197265625,
-0.00948333740234375,
0.042449951171875,
-0.020355224609375,
0.0252532958984375,
0.054412841796875,
0.0426025390625,
-0.0145416259765625,
-0.005126953125,
0.002101898193359375,
0.0281829833984375,
0.04071044921875,
0.03424072265625,
0.05224609375,
-0.05084228515625,
0.0609130859375,
-0.0158233642578125,
-0.006504058837890625,
-0.031646728515625,
-0.06494140625,
-0.08477783203125,
-0.0242156982421875,
-0.0286102294921875,
-0.052978515625,
-0.0268707275390625,
0.048797607421875,
0.061431884765625,
-0.06072998046875,
-0.0221405029296875,
-0.026336669921875,
0.0219573974609375,
-0.021392822265625,
-0.02313232421875,
0.044525146484375,
0.0072479248046875,
-0.055633544921875,
-0.00737762451171875,
-0.006114959716796875,
0.0335693359375,
-0.01335906982421875,
-0.02178955078125,
-0.01335906982421875,
-0.01934814453125,
0.04266357421875,
0.025848388671875,
-0.0384521484375,
-0.00753021240234375,
0.0028247833251953125,
-0.01413726806640625,
0.029510498046875,
0.035736083984375,
-0.0501708984375,
0.03399658203125,
0.0533447265625,
-2.384185791015625e-7,
0.0799560546875,
0.01335906982421875,
0.027130126953125,
-0.031463623046875,
0.00798797607421875,
0.01560211181640625,
0.038909912109375,
0.0120391845703125,
-0.01629638671875,
0.0274200439453125,
0.036285400390625,
-0.05902099609375,
-0.0577392578125,
0.00852203369140625,
-0.09368896484375,
-0.03375244140625,
0.0714111328125,
0.0014009475708007812,
-0.027099609375,
-0.003948211669921875,
-0.025909423828125,
0.044189453125,
-0.0185699462890625,
0.0310211181640625,
0.03631591796875,
-0.016815185546875,
-0.0246734619140625,
-0.027099609375,
0.031097412109375,
-0.0267181396484375,
-0.052490234375,
-0.019317626953125,
0.024169921875,
0.022430419921875,
0.038055419921875,
0.0159759521484375,
-0.0166015625,
0.01220703125,
0.005290985107421875,
0.0333251953125,
-0.021148681640625,
-0.003810882568359375,
-0.0184783935546875,
0.01251983642578125,
-0.0005636215209960938,
-0.005126953125
]
] |
timm/deit_base_distilled_patch16_224.fb_in1k | 2023-03-28T01:29:31.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"arxiv:2012.12877",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/deit_base_distilled_patch16_224.fb_in1k | 0 | 32,530 | timm | 2023-03-28T01:27:56 | ---
tags:
- image-classification
- timm
library_tag: timm
license: apache-2.0
datasets:
- imagenet-1k
---
# Model card for deit_base_distilled_patch16_224.fb_in1k
A DeiT image classification model. Trained on ImageNet-1k using distillation tokens by paper authors.
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 87.3
- GMACs: 17.7
- Activations (M): 24.0
- Image size: 224 x 224
- **Papers:**
- Training data-efficient image transformers & distillation through attention: https://arxiv.org/abs/2012.12877
- **Original:** https://github.com/facebookresearch/deit
- **Dataset:** ImageNet-1k
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('deit_base_distilled_patch16_224.fb_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
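The `softmax` / `topk` post-processing in the snippet above can be sketched standalone on a dummy logits tensor (a minimal illustration, no model or image download required; the tensor values here are made up):

```python
import torch

# Dummy logits standing in for a model's output over 5 classes.
logits = torch.tensor([[0.1, 2.0, -1.0, 3.5, 0.0]])

# Convert to percentage probabilities, as done in the snippet above.
probs = logits.softmax(dim=1) * 100

# Top-3 classes, ordered by descending probability.
top3_probabilities, top3_class_indices = torch.topk(probs, k=3)
print(top3_class_indices.tolist())  # [[3, 1, 0]]
```

Because `softmax` normalizes over the class dimension, the scaled probabilities for each image sum to 100, so the top-k values can be read directly as percentages.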
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'deit_base_distilled_patch16_224.fb_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 198, 768) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@InProceedings{pmlr-v139-touvron21a,
title = {Training data-efficient image transformers & distillation through attention},
author = {Touvron, Hugo and Cord, Matthieu and Douze, Matthijs and Massa, Francisco and Sablayrolles, Alexandre and Jegou, Herve},
booktitle = {International Conference on Machine Learning},
pages = {10347--10357},
year = {2021},
volume = {139},
month = {July}
}
```
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
| 3,271 | [
[
-0.036102294921875,
-0.03826904296875,
0.01031494140625,
0.01497650146484375,
-0.033538818359375,
-0.020172119140625,
-0.01561737060546875,
-0.022003173828125,
0.005184173583984375,
0.014190673828125,
-0.03948974609375,
-0.049346923828125,
-0.061248779296875,
-0.0005955696105957031,
-0.0128021240234375,
0.08209228515625,
-0.00424957275390625,
-0.00609588623046875,
-0.0094146728515625,
-0.0260162353515625,
-0.0169525146484375,
-0.0220794677734375,
-0.0545654296875,
-0.031158447265625,
0.0272064208984375,
0.00807952880859375,
0.0322265625,
0.038360595703125,
0.0545654296875,
0.036865234375,
-0.01558685302734375,
0.00399017333984375,
-0.032196044921875,
-0.01326751708984375,
0.0198822021484375,
-0.040557861328125,
-0.034881591796875,
0.023529052734375,
0.045562744140625,
0.033233642578125,
0.0020542144775390625,
0.02972412109375,
0.0225982666015625,
0.0592041015625,
-0.025970458984375,
0.01806640625,
-0.041748046875,
0.01168060302734375,
-0.00653076171875,
0.00995635986328125,
-0.0236663818359375,
-0.0286407470703125,
0.0189208984375,
-0.032501220703125,
0.03759765625,
-0.01274871826171875,
0.0936279296875,
0.04296875,
-0.004276275634765625,
0.00323486328125,
-0.0260467529296875,
0.05853271484375,
-0.06298828125,
0.01514434814453125,
0.0251617431640625,
0.00754547119140625,
-0.0088653564453125,
-0.07891845703125,
-0.0384521484375,
-0.01023101806640625,
-0.019866943359375,
0.00042176246643066406,
-0.0287628173828125,
0.004810333251953125,
0.03277587890625,
0.0390625,
-0.0297393798828125,
-0.00823974609375,
-0.040130615234375,
-0.005954742431640625,
0.036224365234375,
-0.005672454833984375,
0.003307342529296875,
-0.009552001953125,
-0.046112060546875,
-0.0313720703125,
-0.01300811767578125,
0.01088714599609375,
0.021087646484375,
0.00984954833984375,
-0.035308837890625,
0.025665283203125,
0.000045359134674072266,
0.043121337890625,
0.033966064453125,
-0.02001953125,
0.05072021484375,
-0.0030193328857421875,
-0.03369140625,
0.0014944076538085938,
0.0792236328125,
0.02728271484375,
0.018524169921875,
0.009002685546875,
-0.006755828857421875,
-0.016448974609375,
-0.0060577392578125,
-0.09564208984375,
-0.0361328125,
0.0233612060546875,
-0.03662109375,
-0.03692626953125,
0.0205535888671875,
-0.04632568359375,
-0.006237030029296875,
-0.00919342041015625,
0.0380859375,
-0.03546142578125,
-0.0296478271484375,
0.0092315673828125,
-0.01568603515625,
0.00507354736328125,
0.0102691650390625,
-0.04241943359375,
0.0055084228515625,
0.018341064453125,
0.082275390625,
0.0031490325927734375,
-0.027374267578125,
-0.0142974853515625,
-0.022308349609375,
-0.01535797119140625,
0.0379638671875,
-0.00039887428283691406,
-0.00882720947265625,
-0.0273284912109375,
0.0262451171875,
-0.01056671142578125,
-0.05267333984375,
0.025115966796875,
-0.0158233642578125,
0.01666259765625,
0.002048492431640625,
-0.0192413330078125,
-0.033843994140625,
0.02130126953125,
-0.043182373046875,
0.0947265625,
0.028717041015625,
-0.08087158203125,
0.027313232421875,
-0.04034423828125,
-0.0093841552734375,
-0.0188140869140625,
0.01088714599609375,
-0.077392578125,
0.0007252693176269531,
0.01458740234375,
0.054290771484375,
-0.01654052734375,
0.015716552734375,
-0.04364013671875,
-0.02252197265625,
0.031005859375,
-0.025299072265625,
0.07763671875,
0.0199432373046875,
-0.037445068359375,
0.005146026611328125,
-0.052215576171875,
0.0082244873046875,
0.033050537109375,
-0.02716064453125,
-0.0156402587890625,
-0.047943115234375,
0.011505126953125,
0.02044677734375,
0.0121917724609375,
-0.0400390625,
0.0229339599609375,
-0.0104827880859375,
0.04052734375,
0.056793212890625,
-0.00977325439453125,
0.020904541015625,
-0.0227813720703125,
0.0182037353515625,
0.038604736328125,
0.017333984375,
-0.00420379638671875,
-0.035858154296875,
-0.05548095703125,
-0.05474853515625,
0.03253173828125,
0.0256805419921875,
-0.0325927734375,
0.04473876953125,
-0.0182342529296875,
-0.057373046875,
-0.0391845703125,
0.00904083251953125,
0.0312347412109375,
0.050018310546875,
0.0272369384765625,
-0.0292205810546875,
-0.03509521484375,
-0.0684814453125,
0.005695343017578125,
-0.0044403076171875,
0.0044708251953125,
0.0123291015625,
0.04583740234375,
-0.01480865478515625,
0.055999755859375,
-0.04339599609375,
-0.028167724609375,
-0.0156707763671875,
0.008270263671875,
0.042938232421875,
0.052154541015625,
0.06756591796875,
-0.051483154296875,
-0.054473876953125,
-0.0196685791015625,
-0.06829833984375,
0.0034942626953125,
0.005764007568359375,
-0.0197601318359375,
0.02203369140625,
0.013580322265625,
-0.04864501953125,
0.04974365234375,
0.0174407958984375,
-0.026275634765625,
0.02264404296875,
-0.0158538818359375,
0.0222930908203125,
-0.09283447265625,
0.01300811767578125,
0.03265380859375,
-0.0168914794921875,
-0.0333251953125,
-0.0168914794921875,
0.0023822784423828125,
0.005992889404296875,
-0.04022216796875,
0.0382080078125,
-0.039947509765625,
0.00634765625,
-0.01387786865234375,
-0.0278167724609375,
0.00885009765625,
0.058380126953125,
-0.0066070556640625,
0.0237274169921875,
0.0589599609375,
-0.03759765625,
0.038726806640625,
0.03289794921875,
-0.0159759521484375,
0.04248046875,
-0.056121826171875,
0.0193328857421875,
-0.006504058837890625,
0.0216217041015625,
-0.09112548828125,
-0.00943756103515625,
0.02685546875,
-0.034515380859375,
0.056060791015625,
-0.04522705078125,
-0.0312347412109375,
-0.0411376953125,
-0.03387451171875,
0.03521728515625,
0.055023193359375,
-0.053497314453125,
0.0265045166015625,
0.012115478515625,
0.01422119140625,
-0.049163818359375,
-0.0665283203125,
-0.02813720703125,
-0.046539306640625,
-0.0450439453125,
0.0340576171875,
0.0006866455078125,
0.004863739013671875,
0.015045166015625,
-0.01090240478515625,
-0.01397705078125,
-0.004863739013671875,
0.034637451171875,
0.032073974609375,
-0.00911712646484375,
-0.01291656494140625,
-0.0138702392578125,
-0.01532745361328125,
0.0027675628662109375,
-0.0236358642578125,
0.03839111328125,
-0.017181396484375,
-0.0094146728515625,
-0.067138671875,
-0.0100250244140625,
0.044830322265625,
0.0008640289306640625,
0.05633544921875,
0.0709228515625,
-0.034515380859375,
0.0008707046508789062,
-0.037139892578125,
-0.0274658203125,
-0.0382080078125,
0.0440673828125,
-0.0309295654296875,
-0.0227508544921875,
0.05755615234375,
-0.0007715225219726562,
0.009796142578125,
0.04925537109375,
0.0330810546875,
-0.00878143310546875,
0.060272216796875,
0.04071044921875,
-0.0015764236450195312,
0.05743408203125,
-0.0689697265625,
-0.01268768310546875,
-0.058441162109375,
-0.0213623046875,
-0.025177001953125,
-0.054931640625,
-0.04522705078125,
-0.023223876953125,
0.0316162109375,
0.01177978515625,
-0.026763916015625,
0.033477783203125,
-0.07012939453125,
0.0126800537109375,
0.055633544921875,
0.04034423828125,
-0.005580902099609375,
0.032012939453125,
-0.0113525390625,
-0.0018768310546875,
-0.054443359375,
-0.0133209228515625,
0.08154296875,
0.034210205078125,
0.06585693359375,
-0.02117919921875,
0.05792236328125,
-0.0081024169921875,
0.0196990966796875,
-0.04376220703125,
0.036407470703125,
-0.0125274658203125,
-0.036224365234375,
-0.0086669921875,
-0.030242919921875,
-0.06854248046875,
0.0077667236328125,
-0.002170562744140625,
-0.046173095703125,
0.0195465087890625,
0.019012451171875,
-0.018707275390625,
0.04705810546875,
-0.059906005859375,
0.07183837890625,
-0.008331298828125,
-0.038177490234375,
0.00334930419921875,
-0.049072265625,
0.016571044921875,
0.006496429443359375,
-0.0173492431640625,
-0.006580352783203125,
0.025115966796875,
0.06939697265625,
-0.040985107421875,
0.067138671875,
-0.040985107421875,
0.0167388916015625,
0.041748046875,
-0.01531982421875,
0.0216827392578125,
-0.0059967041015625,
-0.0011434555053710938,
0.0367431640625,
0.00959014892578125,
-0.02813720703125,
-0.0343017578125,
0.047760009765625,
-0.06744384765625,
-0.0268096923828125,
-0.041534423828125,
-0.039459228515625,
0.0162811279296875,
0.00096893310546875,
0.043121337890625,
0.03668212890625,
0.0112457275390625,
0.026153564453125,
0.05224609375,
-0.019256591796875,
0.03033447265625,
-0.0034084320068359375,
-0.0085296630859375,
-0.036956787109375,
0.0528564453125,
0.017608642578125,
0.015899658203125,
0.0051727294921875,
0.0171051025390625,
-0.033233642578125,
-0.034088134765625,
-0.027008056640625,
0.028411865234375,
-0.054443359375,
-0.03692626953125,
-0.047515869140625,
-0.033172607421875,
-0.028167724609375,
0.0077972412109375,
-0.04095458984375,
-0.0289154052734375,
-0.031280517578125,
0.0032749176025390625,
0.06439208984375,
0.03594970703125,
-0.0199737548828125,
0.029083251953125,
-0.047882080078125,
0.01580810546875,
0.0107879638671875,
0.03570556640625,
-0.0096435546875,
-0.075927734375,
-0.018707275390625,
0.01107025146484375,
-0.03912353515625,
-0.062286376953125,
0.029815673828125,
0.019317626953125,
0.03851318359375,
0.0294036865234375,
-0.0036869049072265625,
0.06787109375,
-0.007274627685546875,
0.0290679931640625,
0.0231170654296875,
-0.04168701171875,
0.0491943359375,
-0.0025787353515625,
0.0132293701171875,
0.0248870849609375,
0.0306243896484375,
-0.01073455810546875,
-0.007781982421875,
-0.070068359375,
-0.06451416015625,
0.06903076171875,
0.01270294189453125,
0.0036334991455078125,
0.0213165283203125,
0.0504150390625,
0.0022335052490234375,
0.00782012939453125,
-0.058746337890625,
-0.0347900390625,
-0.025634765625,
-0.028778076171875,
0.00479888916015625,
-0.00733184814453125,
-0.0010890960693359375,
-0.054473876953125,
0.06396484375,
-0.0025577545166015625,
0.04083251953125,
0.0261993408203125,
-0.006717681884765625,
-0.005046844482421875,
-0.0285797119140625,
0.01532745361328125,
0.015655517578125,
-0.0289154052734375,
0.00867462158203125,
0.00800323486328125,
-0.047760009765625,
0.0124053955078125,
0.024749755859375,
0.000015139579772949219,
0.008544921875,
0.01512908935546875,
0.0684814453125,
-0.00635528564453125,
0.00362396240234375,
0.02374267578125,
-0.0091552734375,
-0.033050537109375,
-0.01702880859375,
0.005863189697265625,
0.0006327629089355469,
0.03643798828125,
0.027099609375,
0.0187225341796875,
0.001529693603515625,
-0.0177154541015625,
0.018035888671875,
0.03924560546875,
-0.028045654296875,
-0.027740478515625,
0.052490234375,
-0.0154266357421875,
0.00745391845703125,
0.0692138671875,
-0.00554656982421875,
-0.029998779296875,
0.075927734375,
0.02374267578125,
0.08050537109375,
-0.01079559326171875,
0.001697540283203125,
0.0667724609375,
0.00719451904296875,
-0.0048675537109375,
0.005954742431640625,
0.00737762451171875,
-0.04473876953125,
0.00652313232421875,
-0.05206298828125,
0.01418304443359375,
0.03350830078125,
-0.040924072265625,
0.0288848876953125,
-0.0372314453125,
-0.0396728515625,
0.0174407958984375,
0.01515960693359375,
-0.064453125,
0.00995635986328125,
0.0014495849609375,
0.051666259765625,
-0.0635986328125,
0.05474853515625,
0.06207275390625,
-0.04278564453125,
-0.0712890625,
-0.010833740234375,
-0.007282257080078125,
-0.04461669921875,
0.051422119140625,
0.03472900390625,
0.01300811767578125,
0.01377105712890625,
-0.056884765625,
-0.05126953125,
0.0999755859375,
0.04425048828125,
-0.019500732421875,
0.012298583984375,
0.004528045654296875,
0.022857666015625,
-0.016632080078125,
0.031494140625,
0.025634765625,
0.031005859375,
0.031524658203125,
-0.055389404296875,
0.01302337646484375,
-0.0280914306640625,
0.006587982177734375,
0.0088958740234375,
-0.0582275390625,
0.0706787109375,
-0.032440185546875,
-0.00936126708984375,
0.001461029052734375,
0.04510498046875,
0.0238494873046875,
0.0142974853515625,
0.04449462890625,
0.07000732421875,
0.03509521484375,
-0.02850341796875,
0.056365966796875,
-0.0082855224609375,
0.058197021484375,
0.053192138671875,
0.0165252685546875,
0.0247650146484375,
0.042327880859375,
-0.0305938720703125,
0.0258331298828125,
0.08233642578125,
-0.030364990234375,
0.049896240234375,
0.01329803466796875,
0.0006327629089355469,
-0.0024662017822265625,
0.0092315673828125,
-0.037811279296875,
0.032257080078125,
0.00359344482421875,
-0.04376220703125,
-0.02142333984375,
0.00409698486328125,
0.0010347366333007812,
-0.029022216796875,
-0.00441741943359375,
0.053009033203125,
0.003536224365234375,
-0.0369873046875,
0.07427978515625,
-0.005218505859375,
0.06402587890625,
-0.0311126708984375,
-0.0081634521484375,
-0.0220794677734375,
0.040496826171875,
-0.02703857421875,
-0.051422119140625,
0.0238494873046875,
-0.00754547119140625,
-0.012847900390625,
0.0023403167724609375,
0.04730224609375,
-0.03350830078125,
-0.04681396484375,
0.01097869873046875,
0.0193939208984375,
0.03643798828125,
-0.01165771484375,
-0.08929443359375,
0.0007548332214355469,
0.00832366943359375,
-0.05731201171875,
0.0252685546875,
0.044036865234375,
0.009368896484375,
0.047454833984375,
0.050384521484375,
-0.0163726806640625,
0.01244354248046875,
-0.01543426513671875,
0.07354736328125,
-0.0225830078125,
-0.02496337890625,
-0.078369140625,
0.055023193359375,
-0.0056304931640625,
-0.03118896484375,
0.037750244140625,
0.0467529296875,
0.06304931640625,
-0.0127105712890625,
0.045867919921875,
-0.0268402099609375,
-0.0020771026611328125,
-0.0222320556640625,
0.05462646484375,
-0.048919677734375,
-0.00473785400390625,
-0.0268096923828125,
-0.06695556640625,
-0.01021575927734375,
0.06793212890625,
-0.01378631591796875,
0.035736083984375,
0.04071044921875,
0.06646728515625,
-0.03497314453125,
-0.0307159423828125,
0.0100555419921875,
0.01739501953125,
0.009521484375,
0.03240966796875,
0.03765869140625,
-0.059234619140625,
0.024749755859375,
-0.0595703125,
-0.02093505859375,
-0.00930023193359375,
-0.0574951171875,
-0.078125,
-0.06817626953125,
-0.057647705078125,
-0.046875,
-0.017364501953125,
0.065185546875,
0.0738525390625,
-0.04925537109375,
0.0002206563949584961,
-0.0034389495849609375,
-0.0021686553955078125,
-0.0213470458984375,
-0.018035888671875,
0.047210693359375,
-0.01268768310546875,
-0.079345703125,
-0.033477783203125,
-0.00505828857421875,
0.03497314453125,
-0.0056304931640625,
-0.00704193115234375,
-0.018798828125,
-0.027587890625,
0.0171356201171875,
0.00614166259765625,
-0.042205810546875,
0.0017747879028320312,
-0.011566162109375,
-0.0116729736328125,
0.032989501953125,
0.02227783203125,
-0.045196533203125,
0.0188446044921875,
0.03594970703125,
0.022125244140625,
0.0640869140625,
-0.013458251953125,
-0.0005240440368652344,
-0.0635986328125,
0.05181884765625,
-0.00807952880859375,
0.0382080078125,
0.03143310546875,
-0.0343017578125,
0.052398681640625,
0.035125732421875,
-0.032623291015625,
-0.06915283203125,
-0.01482391357421875,
-0.0806884765625,
-0.0092620849609375,
0.0738525390625,
-0.036834716796875,
-0.03564453125,
0.03607177734375,
-0.015899658203125,
0.0555419921875,
-0.0146026611328125,
0.053009033203125,
0.0308685302734375,
0.002170562744140625,
-0.030120849609375,
-0.034820556640625,
0.03253173828125,
0.009552001953125,
-0.044158935546875,
-0.014892578125,
0.01212310791015625,
0.0577392578125,
0.0189208984375,
0.03778076171875,
-0.0167236328125,
0.0015764236450195312,
-0.0048065185546875,
0.0267181396484375,
-0.03192138671875,
-0.00823974609375,
-0.0224456787109375,
-0.00977325439453125,
-0.01104736328125,
-0.048370361328125
]
] |
nateraw/bert-base-uncased-emotion | 2021-05-20T01:18:38.000Z | [
"transformers",
"pytorch",
"jax",
"bert",
"text-classification",
"emotion",
"en",
"dataset:emotion",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"region:us"
] | text-classification | nateraw | null | null | nateraw/bert-base-uncased-emotion | 4 | 32,307 | transformers | 2022-03-02T23:29:05 | ---
language:
- en
thumbnail: https://avatars3.githubusercontent.com/u/32437151?s=460&u=4ec59abc8d21d5feea3dab323d23a5860e6996a4&v=4
tags:
- text-classification
- emotion
- pytorch
license: apache-2.0
datasets:
- emotion
metrics:
- accuracy
---
# bert-base-uncased-emotion
## Model description
`bert-base-uncased` fine-tuned on the emotion dataset using PyTorch Lightning. Sequence length 128, learning rate 2e-5, batch size 32, 2 GPUs, 4 epochs.
For more details, please see [the emotion dataset on nlp viewer](https://huggingface.co/nlp/viewer/?dataset=emotion).
#### Limitations and bias
- Not the best model, but it works in a pinch I guess...
- Code not available as I just hacked this together.
- [Follow me on github](https://github.com/nateraw) to get notified when code is made available.
## Training data
Data came from HuggingFace's `datasets` package. The data can be viewed [on nlp viewer](https://huggingface.co/nlp/viewer/?dataset=emotion).
## Training procedure
...
## Eval results
val_acc - 0.931 (useless, as this should be precision/recall/f1)
The score was calculated using PyTorch Lightning metrics.
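As the note above suggests, accuracy alone hides per-class behaviour on an imbalanced dataset like emotion, which is why precision/recall/F1 would be more informative. A minimal pure-Python sketch of per-class precision, recall, and F1 — the toy labels below are illustrative only, not actual model outputs:

```python
def precision_recall_f1(y_true, y_pred, positive):
    # count true positives, false positives, false negatives for one class
    tp = sum(1 for t, p in zip(y_true, y_pred) if p == positive and t == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if p == positive and t != positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if p != positive and t == positive)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# toy labels drawn from the emotion label set
y_true = ["joy", "anger", "joy", "fear", "joy", "anger"]
y_pred = ["joy", "joy", "joy", "fear", "anger", "anger"]
p, r, f = precision_recall_f1(y_true, y_pred, positive="joy")
```

Averaging these per-class scores (macro-F1) gives a single number that, unlike accuracy, is not dominated by the majority class.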
| 1,136 | [
[
-0.0374755859375,
-0.044708251953125,
0.006641387939453125,
0.04791259765625,
-0.0309295654296875,
-0.01023101806640625,
-0.02606201171875,
-0.0240478515625,
0.042999267578125,
0.0015087127685546875,
-0.04998779296875,
-0.046173095703125,
-0.042694091796875,
-0.0228424072265625,
-0.029144287109375,
0.10760498046875,
0.00884246826171875,
0.00762939453125,
0.004573822021484375,
-0.0107574462890625,
-0.00568389892578125,
-0.037384033203125,
-0.022216796875,
-0.048065185546875,
0.0299530029296875,
0.029693603515625,
0.045166015625,
-0.006259918212890625,
0.036468505859375,
0.0135498046875,
-0.019622802734375,
-0.0299072265625,
-0.04901123046875,
-0.010833740234375,
0.002017974853515625,
-0.006465911865234375,
-0.04730224609375,
-0.0007309913635253906,
0.0278167724609375,
0.031036376953125,
-0.0232391357421875,
0.01251983642578125,
-0.001232147216796875,
0.06591796875,
-0.02569580078125,
0.0238800048828125,
-0.043060302734375,
0.02984619140625,
0.00548553466796875,
0.0035495758056640625,
-0.027099609375,
-0.01849365234375,
0.03485107421875,
-0.006061553955078125,
0.0232696533203125,
0.00962066650390625,
0.07159423828125,
-0.0034637451171875,
-0.0187225341796875,
-0.01003265380859375,
-0.0158538818359375,
0.04351806640625,
-0.047088623046875,
0.01326751708984375,
0.021270751953125,
0.0026340484619140625,
0.0035915374755859375,
-0.045684814453125,
-0.028350830078125,
-0.00916290283203125,
0.0205230712890625,
0.01349639892578125,
-0.037384033203125,
0.0179595947265625,
0.0283966064453125,
0.04510498046875,
-0.0362548828125,
-0.016265869140625,
-0.01149749755859375,
-0.00888824462890625,
0.0478515625,
-0.004947662353515625,
0.00835418701171875,
-0.034820556640625,
-0.033843994140625,
-0.02020263671875,
-0.0252838134765625,
0.01471710205078125,
0.04388427734375,
0.0196533203125,
-0.044036865234375,
0.0516357421875,
0.013763427734375,
0.019134521484375,
0.02349853515625,
0.03289794921875,
0.057464599609375,
0.0258941650390625,
-0.0350341796875,
0.00354766845703125,
0.0712890625,
0.037261962890625,
0.0247802734375,
-0.008453369140625,
-0.02789306640625,
-0.00045990943908691406,
0.03302001953125,
-0.0582275390625,
-0.034637451171875,
0.0288848876953125,
-0.04931640625,
-0.0234375,
-0.008392333984375,
-0.069580078125,
-0.0203704833984375,
-0.012481689453125,
0.045684814453125,
-0.057037353515625,
-0.025604248046875,
0.01070404052734375,
-0.0194549560546875,
0.018585205078125,
0.01171112060546875,
-0.07110595703125,
0.0233917236328125,
0.0227508544921875,
0.051849365234375,
0.01090240478515625,
-0.0020542144775390625,
0.001422882080078125,
-0.05792236328125,
-0.01311492919921875,
0.0195465087890625,
-0.004047393798828125,
-0.032196044921875,
0.00738525390625,
-0.000010609626770019531,
0.0111236572265625,
-0.0132598876953125,
0.07305908203125,
-0.02679443359375,
0.0116424560546875,
-0.0167083740234375,
-0.052978515625,
-0.0201873779296875,
0.00914764404296875,
-0.03717041015625,
0.0838623046875,
0.03509521484375,
-0.07049560546875,
0.00798797607421875,
-0.046539306640625,
-0.0264892578125,
-0.0022830963134765625,
0.01055908203125,
-0.038055419921875,
0.037017822265625,
0.0024318695068359375,
0.0465087890625,
0.0035152435302734375,
0.0211029052734375,
-0.033721923828125,
-0.0275115966796875,
0.00919342041015625,
-0.01247406005859375,
0.057525634765625,
0.0014028549194335938,
-0.0421142578125,
0.004131317138671875,
-0.06103515625,
0.00647735595703125,
0.01134490966796875,
-0.0183868408203125,
0.0030040740966796875,
-0.0238800048828125,
0.04791259765625,
0.017425537109375,
0.0225677490234375,
-0.059722900390625,
0.024871826171875,
-0.036956787109375,
-0.00482940673828125,
0.0618896484375,
-0.0158843994140625,
0.0194244384765625,
-0.00647735595703125,
0.030029296875,
-0.0079498291015625,
0.004856109619140625,
0.02117919921875,
-0.04840087890625,
-0.062744140625,
-0.0113983154296875,
0.024139404296875,
0.031585693359375,
-0.035552978515625,
0.08038330078125,
0.01251983642578125,
-0.049713134765625,
-0.058929443359375,
-0.01531982421875,
0.041259765625,
0.044097900390625,
0.03302001953125,
-0.04791259765625,
-0.0548095703125,
-0.062164306640625,
0.005039215087890625,
0.0009446144104003906,
-0.00817108154296875,
0.027984619140625,
0.040985107421875,
-0.0309295654296875,
0.06817626953125,
-0.0311737060546875,
-0.0135955810546875,
-0.0207977294921875,
0.035858154296875,
0.030731201171875,
0.05059814453125,
0.04754638671875,
-0.0310516357421875,
-0.0200042724609375,
-0.0258026123046875,
-0.05474853515625,
-0.00860595703125,
0.00418853759765625,
-0.010589599609375,
0.018402099609375,
-0.01345062255859375,
-0.050750732421875,
0.050811767578125,
0.043609619140625,
-0.034637451171875,
0.057281494140625,
-0.01248931884765625,
-0.0005793571472167969,
-0.060150146484375,
0.009552001953125,
0.0204620361328125,
-0.01001739501953125,
-0.034759521484375,
-0.0093231201171875,
0.01340484619140625,
-0.0094146728515625,
-0.037567138671875,
0.035858154296875,
-0.01873779296875,
-0.0183258056640625,
-0.013519287109375,
-0.01541900634765625,
-0.0090179443359375,
0.06536865234375,
0.0155181884765625,
0.019256591796875,
0.055694580078125,
-0.0213165283203125,
0.050140380859375,
0.044036865234375,
-0.036956787109375,
0.032501220703125,
-0.0546875,
0.019622802734375,
-0.0065460205078125,
0.01090240478515625,
-0.0540771484375,
-0.0239410400390625,
0.0095367431640625,
-0.047607421875,
0.0306854248046875,
-0.006641387939453125,
-0.047149658203125,
-0.02166748046875,
-0.0301666259765625,
0.017547607421875,
0.07049560546875,
-0.04107666015625,
0.031982421875,
0.00881195068359375,
-0.005390167236328125,
-0.041107177734375,
-0.0562744140625,
-0.0036106109619140625,
-0.00635528564453125,
-0.037841796875,
0.02850341796875,
-0.016510009765625,
-0.002414703369140625,
-0.00916290283203125,
0.0033512115478515625,
0.00412750244140625,
-0.01523590087890625,
0.029937744140625,
0.01050567626953125,
0.002162933349609375,
0.0217132568359375,
-0.003082275390625,
-0.0011806488037109375,
0.02545166015625,
0.01282501220703125,
0.0419921875,
-0.0443115234375,
-0.01456451416015625,
-0.03826904296875,
0.0145721435546875,
0.0291290283203125,
0.018707275390625,
0.054168701171875,
0.07568359375,
-0.046539306640625,
-0.01262664794921875,
-0.03173828125,
-0.0155181884765625,
-0.0287322998046875,
0.019134521484375,
-0.00505828857421875,
-0.0391845703125,
0.0552978515625,
0.0184783935546875,
-0.0116729736328125,
0.04595947265625,
0.054412841796875,
-0.0171356201171875,
0.06903076171875,
0.048797607421875,
-0.0250701904296875,
0.04193115234375,
-0.024688720703125,
0.005390167236328125,
-0.0640869140625,
-0.0240631103515625,
-0.009124755859375,
-0.04534912109375,
-0.037811279296875,
0.00159454345703125,
0.018707275390625,
0.01174163818359375,
-0.031951904296875,
0.02313232421875,
-0.044464111328125,
0.01003265380859375,
0.047760009765625,
0.034454345703125,
-0.02288818359375,
-0.0026035308837890625,
-0.0221710205078125,
-0.017181396484375,
-0.029541015625,
-0.00875091552734375,
0.0802001953125,
0.050567626953125,
0.0704345703125,
0.0011692047119140625,
0.05596923828125,
0.024993896484375,
0.01506805419921875,
-0.05438232421875,
0.026824951171875,
-0.0028591156005859375,
-0.05181884765625,
-0.002910614013671875,
-0.027679443359375,
-0.061859130859375,
-0.006725311279296875,
-0.030242919921875,
-0.06884765625,
-0.0015935897827148438,
0.003398895263671875,
-0.0172119140625,
0.0132904052734375,
-0.06903076171875,
0.07440185546875,
-0.03131103515625,
-0.01409149169921875,
0.0028781890869140625,
-0.080078125,
0.0245819091796875,
0.0169219970703125,
-0.0100250244140625,
-0.0234832763671875,
0.041839599609375,
0.0618896484375,
-0.032684326171875,
0.06549072265625,
-0.033050537109375,
0.009368896484375,
0.010284423828125,
-0.0012197494506835938,
0.021728515625,
0.0068359375,
-0.00994873046875,
0.0084991455078125,
-0.0206298828125,
-0.036346435546875,
-0.0263214111328125,
0.047607421875,
-0.07073974609375,
0.01409149169921875,
-0.0440673828125,
-0.036163330078125,
-0.024383544921875,
-0.0022182464599609375,
0.04302978515625,
0.027984619140625,
-0.0288848876953125,
0.0243072509765625,
0.06671142578125,
-0.0243682861328125,
0.0380859375,
0.01397705078125,
-0.0217132568359375,
-0.033050537109375,
0.04876708984375,
-0.024688720703125,
0.00115203857421875,
0.006793975830078125,
0.0200958251953125,
-0.035430908203125,
-0.011016845703125,
-0.02099609375,
0.0031986236572265625,
-0.045654296875,
-0.0225677490234375,
-0.03564453125,
-0.0195465087890625,
-0.026580810546875,
-0.01605224609375,
-0.03228759765625,
-0.0189056396484375,
-0.0419921875,
-0.0277099609375,
0.06488037109375,
0.033905029296875,
-0.015838623046875,
0.0270843505859375,
-0.055572509765625,
0.0303497314453125,
0.003993988037109375,
0.057281494140625,
0.00740814208984375,
-0.051727294921875,
-0.01416778564453125,
-0.0036449432373046875,
-0.01690673828125,
-0.060882568359375,
0.060943603515625,
0.010498046875,
0.01230621337890625,
0.032073974609375,
0.0148162841796875,
0.039276123046875,
-0.0284576416015625,
0.0570068359375,
0.040740966796875,
-0.09033203125,
0.04736328125,
-0.025299072265625,
0.0246734619140625,
0.04290771484375,
0.0198822021484375,
-0.0283203125,
-0.00725555419921875,
-0.08502197265625,
-0.08343505859375,
0.05474853515625,
0.044464111328125,
0.0169219970703125,
0.0029430389404296875,
0.026123046875,
0.00267791748046875,
0.03485107421875,
-0.083251953125,
-0.0443115234375,
-0.029022216796875,
-0.039093017578125,
-0.00894927978515625,
-0.0269012451171875,
-0.01328277587890625,
-0.03936767578125,
0.06512451171875,
0.0001646280288696289,
0.051605224609375,
0.02081298828125,
-0.007335662841796875,
-0.0271148681640625,
0.0012464523315429688,
0.030914306640625,
0.0009331703186035156,
-0.07818603515625,
-0.0132598876953125,
-0.002101898193359375,
-0.03631591796875,
-0.01313018798828125,
0.0253753662109375,
0.0162506103515625,
0.0219879150390625,
0.0303497314453125,
0.08612060546875,
0.0281524658203125,
-0.045867919921875,
0.056182861328125,
-0.014617919921875,
-0.01861572265625,
-0.0258941650390625,
-0.018890380859375,
0.002719879150390625,
0.022674560546875,
0.0283050537109375,
0.016265869140625,
0.0135955810546875,
-0.036102294921875,
0.039215087890625,
0.0214385986328125,
-0.046356201171875,
-0.030975341796875,
0.0364990234375,
0.0247802734375,
-0.01357269287109375,
0.0606689453125,
-0.0257110595703125,
-0.041046142578125,
0.049835205078125,
0.028717041015625,
0.08135986328125,
0.009979248046875,
0.0007123947143554688,
0.0188446044921875,
0.008056640625,
0.0009870529174804688,
0.0504150390625,
0.0027294158935546875,
-0.0631103515625,
-0.00420379638671875,
-0.049957275390625,
-0.0452880859375,
0.007415771484375,
-0.083740234375,
0.01265716552734375,
-0.0445556640625,
-0.01515960693359375,
-0.007137298583984375,
0.0120697021484375,
-0.055816650390625,
0.04241943359375,
0.03143310546875,
0.095947265625,
-0.07379150390625,
0.058074951171875,
0.035980224609375,
-0.031585693359375,
-0.05987548828125,
-0.015380859375,
0.0030307769775390625,
-0.06280517578125,
0.0312347412109375,
0.0246124267578125,
0.00923919677734375,
0.0007266998291015625,
-0.06353759765625,
-0.02935791015625,
0.056060791015625,
0.0272369384765625,
-0.03271484375,
0.006252288818359375,
-0.03167724609375,
0.0660400390625,
-0.0217132568359375,
0.03863525390625,
0.033660888671875,
0.0120391845703125,
0.01238250732421875,
-0.058441162109375,
-0.03021240234375,
-0.03656005859375,
-0.0138702392578125,
0.0034332275390625,
-0.0419921875,
0.053131103515625,
0.0015611648559570312,
0.0163421630859375,
0.0036449432373046875,
0.050689697265625,
0.01294708251953125,
0.0197906494140625,
0.0273895263671875,
0.07073974609375,
0.037628173828125,
-0.02001953125,
0.07373046875,
-0.0151214599609375,
0.074951171875,
0.057098388671875,
-0.020355224609375,
0.061859130859375,
0.025146484375,
-0.02093505859375,
0.04559326171875,
0.0701904296875,
-0.007160186767578125,
0.050140380859375,
0.0276031494140625,
-0.01428985595703125,
0.007732391357421875,
0.005908966064453125,
-0.035614013671875,
0.0197906494140625,
0.026580810546875,
-0.040740966796875,
-0.00701141357421875,
0.004894256591796875,
-0.0007810592651367188,
-0.0285797119140625,
-0.0357666015625,
0.04241943359375,
0.004871368408203125,
-0.02313232421875,
0.0455322265625,
0.0007486343383789062,
0.06689453125,
-0.047760009765625,
0.01361083984375,
-0.020721435546875,
0.00661468505859375,
-0.0253448486328125,
-0.07440185546875,
0.0119781494140625,
0.0192108154296875,
-0.0191802978515625,
-0.0338134765625,
0.04559326171875,
-0.04486083984375,
-0.032623291015625,
0.03851318359375,
0.0211639404296875,
0.0229644775390625,
-0.01873779296875,
-0.07757568359375,
0.006580352783203125,
0.0033664703369140625,
-0.034454345703125,
0.0234375,
0.03143310546875,
0.032440185546875,
0.041015625,
0.027496337890625,
-0.0176849365234375,
-0.00547027587890625,
0.033355712890625,
0.06256103515625,
-0.05987548828125,
-0.03167724609375,
-0.05059814453125,
0.05682373046875,
0.003490447998046875,
-0.042236328125,
0.03045654296875,
0.03375244140625,
0.032958984375,
-0.027679443359375,
0.044036865234375,
-0.0304107666015625,
0.03253173828125,
-0.033447265625,
0.0462646484375,
-0.036529541015625,
-0.02325439453125,
-0.059295654296875,
-0.0491943359375,
-0.01715087890625,
0.06768798828125,
0.0167694091796875,
0.009857177734375,
0.0491943359375,
0.0321044921875,
0.008514404296875,
0.01039886474609375,
0.0081939697265625,
0.0157470703125,
-0.007720947265625,
0.055755615234375,
0.0304412841796875,
-0.04559326171875,
0.007843017578125,
-0.037811279296875,
-0.0281219482421875,
-0.031494140625,
-0.0850830078125,
-0.05999755859375,
-0.036865234375,
-0.029541015625,
-0.053558349609375,
-0.01320648193359375,
0.0858154296875,
0.05206298828125,
-0.06304931640625,
-0.00852203369140625,
-0.014190673828125,
-0.00498199462890625,
-0.0060577392578125,
-0.02099609375,
0.0294036865234375,
-0.004734039306640625,
-0.042205810546875,
0.00616455078125,
0.004947662353515625,
0.015960693359375,
0.0008401870727539062,
-0.0103759765625,
-0.0203399658203125,
-0.012054443359375,
0.03240966796875,
0.02703857421875,
-0.052520751953125,
-0.03179931640625,
-0.01456451416015625,
-0.0134429931640625,
0.0173187255859375,
0.02392578125,
-0.04345703125,
0.0413818359375,
0.0305023193359375,
0.040374755859375,
0.044708251953125,
0.01018524169921875,
0.031982421875,
-0.099365234375,
0.0095977783203125,
0.0333251953125,
0.054931640625,
0.0262603759765625,
-0.0299224853515625,
0.033111572265625,
0.024688720703125,
-0.049957275390625,
-0.04437255859375,
0.004322052001953125,
-0.1121826171875,
0.01253509521484375,
0.08319091796875,
-0.00730133056640625,
-0.0112152099609375,
0.0321044921875,
-0.03192138671875,
0.027008056640625,
-0.0594482421875,
0.0689697265625,
0.061492919921875,
-0.035614013671875,
-0.024200439453125,
-0.01319122314453125,
0.042083740234375,
0.039642333984375,
-0.04779052734375,
-0.01314544677734375,
0.038238525390625,
0.01386260986328125,
0.0141754150390625,
0.0372314453125,
0.00864410400390625,
0.0223236083984375,
0.0025634765625,
0.067626953125,
0.0291290283203125,
-0.0128936767578125,
-0.056671142578125,
0.00970458984375,
-0.00811004638671875,
-0.0277862548828125
]
] |
Salesforce/blip-itm-large-coco | 2023-08-01T14:48:50.000Z | [
"transformers",
"pytorch",
"tf",
"blip",
"image-text-matching",
"arxiv:2201.12086",
"license:bsd-3-clause",
"endpoints_compatible",
"has_space",
"region:us"
] | null | Salesforce | null | null | Salesforce/blip-itm-large-coco | 0 | 32,228 | transformers | 2022-12-13T11:41:12 | ---
pipeline_tag: other
tags:
- image-text-matching
language:
- en
license: bsd-3-clause
---
# BLIP: Bootstrapping Language-Image Pre-training for Unified Vision-Language Understanding and Generation
Model card for BLIP trained on image-text matching - large architecture (with ViT large backbone) trained on COCO dataset.
|  |
|:--:|
| <b>Figure from the BLIP official repo. Image source: https://github.com/salesforce/BLIP</b> |
## TL;DR
Authors from the [paper](https://arxiv.org/abs/2201.12086) write in the abstract:
*Vision-Language Pre-training (VLP) has advanced the performance for many vision-language tasks. However, most existing pre-trained models only excel in either understanding-based tasks or generation-based tasks. Furthermore, performance improvement has been largely achieved by scaling up the dataset with noisy image-text pairs collected from the web, which is a suboptimal source of supervision. In this paper, we propose BLIP, a new VLP framework which transfers flexibly to both vision-language understanding and generation tasks. BLIP effectively utilizes the noisy web data by bootstrapping the captions, where a captioner generates synthetic captions and a filter removes the noisy ones. We achieve state-of-the-art results on a wide range of vision-language tasks, such as image-text retrieval (+2.7% in average recall@1), image captioning (+2.8% in CIDEr), and VQA (+1.6% in VQA score). BLIP also demonstrates strong generalization ability when directly transferred to video-language tasks in a zero-shot manner. Code, models, and datasets are released.*
## Usage
You can use this model for image-text matching: scoring how well a given caption matches an image.
### Using the Pytorch model
#### Running the model on CPU
<details>
<summary> Click to expand </summary>
```python
import requests
from PIL import Image
from transformers import BlipProcessor, BlipForImageTextRetrieval
processor = BlipProcessor.from_pretrained("Salesforce/blip-itm-large-coco")
model = BlipForImageTextRetrieval.from_pretrained("Salesforce/blip-itm-large-coco")
img_url = 'https://storage.googleapis.com/sfr-vision-language-research/BLIP/demo.jpg'
raw_image = Image.open(requests.get(img_url, stream=True).raw).convert('RGB')
text = "A woman and a dog sitting together on a beach."
inputs = processor(raw_image, text, return_tensors="pt")
itm_scores = model(**inputs)[0]  # logits from the image-text matching head
cosine_score = model(**inputs, use_itm_head=False)[0]  # raw image-text similarity score
```
</details>
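The ITM head outputs two logits per image-text pair, and a softmax turns them into a match probability (treating index 1 as the "match" class is an assumption based on BLIP's ITM head layout). A minimal, model-free sketch of that post-processing:

```python
import math

def itm_match_probability(logits):
    """Softmax over the two ITM logits (assumed order: no-match, match)."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]  # subtract max for numerical stability
    return exps[1] / sum(exps)

# Hypothetical logits standing in for one row of itm_scores
print(round(itm_match_probability([-1.2, 2.3]), 4))  # ≈ 0.97
```

With the real model, the equivalent tensor operation would be `torch.softmax(itm_scores, dim=1)[:, 1]`.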
#### Running the model on GPU
##### In full precision
<details>
<summary> Click to expand </summary>
```python
import requests
from PIL import Image
from transformers import BlipProcessor, BlipForImageTextRetrieval
processor = BlipProcessor.from_pretrained("Salesforce/blip-itm-large-coco")
model = BlipForImageTextRetrieval.from_pretrained("Salesforce/blip-itm-large-coco").to("cuda")
img_url = 'https://storage.googleapis.com/sfr-vision-language-research/BLIP/demo.jpg'
raw_image = Image.open(requests.get(img_url, stream=True).raw).convert('RGB')
text = "A woman and a dog sitting together on a beach."
inputs = processor(raw_image, text, return_tensors="pt").to("cuda")
itm_scores = model(**inputs)[0]  # logits from the image-text matching head
cosine_score = model(**inputs, use_itm_head=False)[0]  # raw image-text similarity score
```
</details>
##### In half precision (`float16`)
<details>
<summary> Click to expand </summary>
```python
import torch
import requests
from PIL import Image
from transformers import BlipProcessor, BlipForImageTextRetrieval
processor = BlipProcessor.from_pretrained("Salesforce/blip-itm-large-coco")
model = BlipForImageTextRetrieval.from_pretrained("Salesforce/blip-itm-large-coco", torch_dtype=torch.float16).to("cuda")
img_url = 'https://storage.googleapis.com/sfr-vision-language-research/BLIP/demo.jpg'
raw_image = Image.open(requests.get(img_url, stream=True).raw).convert('RGB')
text = "A woman and a dog sitting together on a beach."
inputs = processor(raw_image, text, return_tensors="pt").to("cuda", torch.float16)
itm_scores = model(**inputs)[0]  # logits from the image-text matching head
cosine_score = model(**inputs, use_itm_head=False)[0]  # raw image-text similarity score
```
</details>
## BibTex and citation info
```
@misc{https://doi.org/10.48550/arxiv.2201.12086,
doi = {10.48550/ARXIV.2201.12086},
url = {https://arxiv.org/abs/2201.12086},
author = {Li, Junnan and Li, Dongxu and Xiong, Caiming and Hoi, Steven},
keywords = {Computer Vision and Pattern Recognition (cs.CV), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {BLIP: Bootstrapping Language-Image Pre-training for Unified Vision-Language Understanding and Generation},
publisher = {arXiv},
year = {2022},
copyright = {Creative Commons Attribution 4.0 International}
}
``` | 4,823 | [
[
-0.023040771484375,
-0.045318603515625,
-0.0018291473388671875,
0.04473876953125,
-0.028045654296875,
-0.00078582763671875,
-0.037200927734375,
-0.050750732421875,
-0.0007872581481933594,
0.0216217041015625,
-0.025299072265625,
-0.038177490234375,
-0.032623291015625,
-0.00502777099609375,
-0.019866943359375,
0.0506591796875,
0.01198577880859375,
0.004192352294921875,
-0.01424407958984375,
0.0008397102355957031,
-0.0169219970703125,
-0.0226898193359375,
-0.0399169921875,
-0.0013561248779296875,
0.0021209716796875,
0.025726318359375,
0.04071044921875,
0.036865234375,
0.05859375,
0.027557373046875,
-0.005290985107421875,
0.003963470458984375,
-0.0238037109375,
-0.0282745361328125,
-0.008819580078125,
-0.05389404296875,
-0.0140838623046875,
0.0042724609375,
0.038360595703125,
0.04315185546875,
0.00014889240264892578,
0.031768798828125,
0.007476806640625,
0.042755126953125,
-0.053863525390625,
0.029937744140625,
-0.05609130859375,
0.0036773681640625,
-0.0026302337646484375,
-0.01287078857421875,
-0.0279388427734375,
-0.00982666015625,
0.00861358642578125,
-0.060943603515625,
0.046051025390625,
0.01611328125,
0.12017822265625,
0.023468017578125,
0.0186767578125,
-0.0165252685546875,
-0.030609130859375,
0.06365966796875,
-0.041748046875,
0.0335693359375,
0.013153076171875,
0.0206298828125,
0.006000518798828125,
-0.057159423828125,
-0.055877685546875,
-0.0247802734375,
-0.005916595458984375,
0.029266357421875,
-0.01641845703125,
-0.007293701171875,
0.0222625732421875,
0.033538818359375,
-0.04718017578125,
-0.001399993896484375,
-0.060546875,
-0.02032470703125,
0.04803466796875,
-0.006778717041015625,
0.0185546875,
-0.0177001953125,
-0.0401611328125,
-0.03271484375,
-0.039459228515625,
0.030120849609375,
-0.005779266357421875,
0.015838623046875,
-0.03314208984375,
0.05401611328125,
-0.0017757415771484375,
0.067626953125,
0.0188446044921875,
-0.019317626953125,
0.044097900390625,
-0.0239410400390625,
-0.034698486328125,
-0.0018053054809570312,
0.08258056640625,
0.0447998046875,
0.025299072265625,
0.004825592041015625,
0.005298614501953125,
0.0127410888671875,
0.005218505859375,
-0.06390380859375,
-0.0308074951171875,
0.0154266357421875,
-0.0207366943359375,
-0.016693115234375,
0.00624847412109375,
-0.0703125,
-0.007465362548828125,
-0.0020732879638671875,
0.038818359375,
-0.04071044921875,
-0.011962890625,
0.01444244384765625,
-0.0196075439453125,
0.0283660888671875,
0.0237274169921875,
-0.05859375,
-0.0016765594482421875,
0.019989013671875,
0.0706787109375,
0.002895355224609375,
-0.045684814453125,
-0.0206146240234375,
0.007198333740234375,
-0.031280517578125,
0.0391845703125,
-0.01074981689453125,
-0.0116424560546875,
-0.00620269775390625,
0.01459503173828125,
-0.0126495361328125,
-0.040130615234375,
0.005802154541015625,
-0.0197601318359375,
0.017059326171875,
-0.0086212158203125,
-0.019256591796875,
-0.0267333984375,
0.02740478515625,
-0.0277557373046875,
0.07574462890625,
0.002864837646484375,
-0.05902099609375,
0.045562744140625,
-0.039093017578125,
-0.0232391357421875,
0.017822265625,
-0.01812744140625,
-0.044708251953125,
-0.008636474609375,
0.03564453125,
0.030670166015625,
-0.03155517578125,
0.00704193115234375,
-0.021697998046875,
-0.02783203125,
0.00994110107421875,
-0.021636962890625,
0.0806884765625,
-0.003604888916015625,
-0.0478515625,
-0.0001608133316040039,
-0.062408447265625,
-0.0017223358154296875,
0.0215911865234375,
-0.0255126953125,
0.0012121200561523438,
-0.019989013671875,
0.019317626953125,
0.017578125,
0.040283203125,
-0.042633056640625,
0.0005269050598144531,
-0.0279388427734375,
0.0312347412109375,
0.040740966796875,
-0.0182037353515625,
0.0239105224609375,
-0.004703521728515625,
0.02496337890625,
0.0102386474609375,
0.0276947021484375,
-0.0214080810546875,
-0.046875,
-0.0762939453125,
-0.039459228515625,
-0.00029587745666503906,
0.0472412109375,
-0.062286376953125,
0.03326416015625,
-0.0218353271484375,
-0.0413818359375,
-0.054168701171875,
0.013397216796875,
0.046417236328125,
0.061492919921875,
0.0465087890625,
-0.0343017578125,
-0.036224365234375,
-0.057586669921875,
0.01439666748046875,
-0.019989013671875,
-0.0004334449768066406,
0.023773193359375,
0.04278564453125,
-0.0097503662109375,
0.06298828125,
-0.037567138671875,
-0.033355712890625,
-0.018402099609375,
0.004055023193359375,
0.029266357421875,
0.051025390625,
0.05621337890625,
-0.06103515625,
-0.03369140625,
0.005390167236328125,
-0.06585693359375,
0.0079498291015625,
-0.0018663406372070312,
-0.0025348663330078125,
0.031005859375,
0.0377197265625,
-0.045989990234375,
0.04718017578125,
0.03668212890625,
-0.0151214599609375,
0.04998779296875,
-0.0224456787109375,
0.006855010986328125,
-0.0660400390625,
0.0298309326171875,
0.01971435546875,
-0.0028743743896484375,
-0.0209808349609375,
0.00936126708984375,
0.01081085205078125,
-0.00913238525390625,
-0.04833984375,
0.048828125,
-0.044342041015625,
-0.021148681640625,
0.00597381591796875,
0.002681732177734375,
0.008514404296875,
0.05499267578125,
0.0240020751953125,
0.05548095703125,
0.079345703125,
-0.05621337890625,
0.0290374755859375,
0.03131103515625,
-0.039093017578125,
0.0291290283203125,
-0.059600830078125,
-0.007518768310546875,
0.0014362335205078125,
-0.0154266357421875,
-0.08746337890625,
-0.01068878173828125,
0.02197265625,
-0.050872802734375,
0.02252197265625,
-0.0200653076171875,
-0.028594970703125,
-0.054107666015625,
-0.024017333984375,
0.02630615234375,
0.03460693359375,
-0.055389404296875,
0.021636962890625,
0.0124053955078125,
0.0131683349609375,
-0.061798095703125,
-0.0823974609375,
0.00457763671875,
0.005397796630859375,
-0.048736572265625,
0.034759521484375,
-0.00551605224609375,
0.01238250732421875,
0.0089263916015625,
0.0087890625,
-0.006984710693359375,
-0.018951416015625,
0.019866943359375,
0.039520263671875,
-0.0274505615234375,
-0.0142974853515625,
-0.031402587890625,
0.00899505615234375,
-0.01068878173828125,
-0.015777587890625,
0.060394287109375,
-0.02606201171875,
-0.005558013916015625,
-0.05438232421875,
-0.007427215576171875,
0.045166015625,
-0.034942626953125,
0.04278564453125,
0.057586669921875,
-0.0153350830078125,
0.0016040802001953125,
-0.043182373046875,
0.00395965576171875,
-0.041534423828125,
0.038787841796875,
-0.0180206298828125,
-0.030242919921875,
0.039306640625,
0.0239715576171875,
-0.0021038055419921875,
0.02191162109375,
0.052154541015625,
-0.01641845703125,
0.052215576171875,
0.06158447265625,
-0.006908416748046875,
0.05572509765625,
-0.06591796875,
0.0006299018859863281,
-0.056671142578125,
-0.032012939453125,
-0.0146484375,
-0.007110595703125,
-0.036407470703125,
-0.03839111328125,
0.0188751220703125,
0.0215606689453125,
-0.025665283203125,
0.021392822265625,
-0.045166015625,
0.0157470703125,
0.05718994140625,
0.0171356201171875,
-0.006885528564453125,
0.0155029296875,
-0.017181396484375,
0.0018901824951171875,
-0.05474853515625,
-0.01364898681640625,
0.07574462890625,
0.0162200927734375,
0.05377197265625,
-0.021514892578125,
0.0291290283203125,
-0.0202178955078125,
0.01491546630859375,
-0.049957275390625,
0.051605224609375,
-0.02093505859375,
-0.0413818359375,
-0.0181732177734375,
-0.0231475830078125,
-0.06829833984375,
0.01641845703125,
-0.02716064453125,
-0.0692138671875,
0.017822265625,
0.035797119140625,
-0.0153961181640625,
0.0171356201171875,
-0.05914306640625,
0.07623291015625,
-0.035400390625,
-0.051605224609375,
0.01235198974609375,
-0.045379638671875,
0.022674560546875,
0.022796630859375,
0.0011196136474609375,
0.017913818359375,
0.01288604736328125,
0.05279541015625,
-0.036712646484375,
0.0645751953125,
-0.018951416015625,
0.0233917236328125,
0.031890869140625,
-0.01390838623046875,
-0.0009455680847167969,
-0.0071868896484375,
0.011688232421875,
0.03271484375,
-0.00470733642578125,
-0.044677734375,
-0.034881591796875,
0.01800537109375,
-0.057403564453125,
-0.03546142578125,
-0.03033447265625,
-0.037933349609375,
0.00347137451171875,
0.0243377685546875,
0.056884765625,
0.020416259765625,
0.0171051025390625,
0.01013946533203125,
0.0179595947265625,
-0.037811279296875,
0.052215576171875,
0.025115966796875,
-0.03326416015625,
-0.033782958984375,
0.07623291015625,
-0.0010843276977539062,
0.01451873779296875,
0.0246734619140625,
0.016876220703125,
-0.029388427734375,
-0.038238525390625,
-0.046142578125,
0.03466796875,
-0.04058837890625,
-0.0219268798828125,
-0.020843505859375,
-0.019378662109375,
-0.0379638671875,
-0.0245208740234375,
-0.039306640625,
-0.00827789306640625,
-0.03106689453125,
0.0117950439453125,
0.032012939453125,
0.0214691162109375,
-0.005565643310546875,
0.032867431640625,
-0.033416748046875,
0.0290374755859375,
0.0181884765625,
0.018402099609375,
-0.00386810302734375,
-0.041473388671875,
-0.01155853271484375,
0.0117950439453125,
-0.0223846435546875,
-0.04827880859375,
0.04962158203125,
0.021575927734375,
0.0316162109375,
0.03387451171875,
-0.0308837890625,
0.0858154296875,
-0.0197296142578125,
0.061859130859375,
0.04425048828125,
-0.07147216796875,
0.053741455078125,
0.0005278587341308594,
0.015472412109375,
0.034332275390625,
0.02447509765625,
-0.02490234375,
-0.031951904296875,
-0.04095458984375,
-0.0714111328125,
0.053741455078125,
0.00909423828125,
-0.00830078125,
0.019287109375,
0.019073486328125,
-0.0171661376953125,
0.0253753662109375,
-0.0682373046875,
-0.01384735107421875,
-0.041839599609375,
-0.0174407958984375,
-0.0264434814453125,
0.0104827880859375,
0.0137481689453125,
-0.0523681640625,
0.03509521484375,
-0.01190948486328125,
0.0290374755859375,
0.031494140625,
-0.0347900390625,
-0.0054168701171875,
-0.031707763671875,
0.037445068359375,
0.040191650390625,
-0.01995849609375,
0.00046253204345703125,
-0.00966644287109375,
-0.061798095703125,
-0.01470947265625,
0.00029659271240234375,
-0.02301025390625,
-0.003986358642578125,
0.0433349609375,
0.07000732421875,
0.00653076171875,
-0.045440673828125,
0.059295654296875,
0.0027294158935546875,
-0.0128326416015625,
-0.0212554931640625,
0.006900787353515625,
-0.00817108154296875,
0.018463134765625,
0.040435791015625,
0.01294708251953125,
-0.0155487060546875,
-0.04583740234375,
0.01519012451171875,
0.033172607421875,
-0.017242431640625,
-0.022613525390625,
0.056549072265625,
-0.009979248046875,
-0.016265869140625,
0.04998779296875,
-0.024078369140625,
-0.053466796875,
0.066650390625,
0.05438232421875,
0.033355712890625,
-0.00872802734375,
0.02490234375,
0.055877685546875,
0.0312347412109375,
-0.005764007568359375,
0.03033447265625,
0.0005259513854980469,
-0.060089111328125,
-0.0158843994140625,
-0.05633544921875,
-0.0182647705078125,
0.02197265625,
-0.041412353515625,
0.034820556640625,
-0.05157470703125,
0.003971099853515625,
0.00583648681640625,
0.010009765625,
-0.06744384765625,
0.0357666015625,
0.0115814208984375,
0.0609130859375,
-0.06201171875,
0.039642333984375,
0.06280517578125,
-0.06988525390625,
-0.0694580078125,
-0.006923675537109375,
-0.025634765625,
-0.08148193359375,
0.05987548828125,
0.03204345703125,
-0.00417327880859375,
-0.0013914108276367188,
-0.065673828125,
-0.057952880859375,
0.080078125,
0.036773681640625,
-0.041748046875,
-0.0015974044799804688,
0.0165557861328125,
0.049102783203125,
-0.01617431640625,
0.0214691162109375,
0.01434326171875,
0.027130126953125,
0.030609130859375,
-0.071533203125,
-0.0029697418212890625,
-0.022674560546875,
-0.004764556884765625,
-0.019744873046875,
-0.061981201171875,
0.07745361328125,
-0.03472900390625,
-0.0117340087890625,
-0.0026092529296875,
0.05145263671875,
0.032440185546875,
0.0234375,
0.029052734375,
0.042938232421875,
0.05145263671875,
0.00447845458984375,
0.06878662109375,
-0.0264892578125,
0.04473876953125,
0.060546875,
0.0156402587890625,
0.06280517578125,
0.03924560546875,
-0.0108184814453125,
0.0194244384765625,
0.047393798828125,
-0.03961181640625,
0.036163330078125,
0.0092010498046875,
0.0179901123046875,
-0.007213592529296875,
0.013885498046875,
-0.020599365234375,
0.06134033203125,
0.0284881591796875,
-0.025970458984375,
-0.00400543212890625,
0.01047515869140625,
-0.009368896484375,
-0.00994110107421875,
-0.038818359375,
0.0238037109375,
-0.0062713623046875,
-0.0487060546875,
0.0767822265625,
-0.0175628662109375,
0.075439453125,
-0.021240234375,
0.0003619194030761719,
-0.00902557373046875,
0.0210418701171875,
-0.0226593017578125,
-0.06622314453125,
0.005702972412109375,
0.005565643310546875,
0.004474639892578125,
0.00809478759765625,
0.0277557373046875,
-0.03692626953125,
-0.0740966796875,
0.029510498046875,
0.011566162109375,
0.0234375,
0.00534820556640625,
-0.07403564453125,
0.011627197265625,
0.006175994873046875,
-0.017730712890625,
-0.00966644287109375,
0.0199127197265625,
-0.0014209747314453125,
0.05633544921875,
0.050201416015625,
0.0245513916015625,
0.042938232421875,
-0.0034770965576171875,
0.058624267578125,
-0.044525146484375,
-0.03338623046875,
-0.0556640625,
0.0391845703125,
-0.0118865966796875,
-0.04107666015625,
0.045867919921875,
0.06549072265625,
0.07879638671875,
-0.0166778564453125,
0.0426025390625,
-0.0186767578125,
0.0034961700439453125,
-0.045440673828125,
0.056732177734375,
-0.06011962890625,
-0.0037670135498046875,
-0.038818359375,
-0.055084228515625,
-0.041961669921875,
0.07318115234375,
-0.0231475830078125,
0.0003962516784667969,
0.04443359375,
0.08258056640625,
-0.023345947265625,
-0.035125732421875,
0.0196533203125,
0.0237884521484375,
0.021636962890625,
0.052886962890625,
0.042083740234375,
-0.044158935546875,
0.047393798828125,
-0.0478515625,
-0.018646240234375,
-0.0185699462890625,
-0.049530029296875,
-0.07373046875,
-0.0570068359375,
-0.0361328125,
-0.0183868408203125,
-0.00359344482421875,
0.041839599609375,
0.06097412109375,
-0.051727294921875,
-0.0247802734375,
-0.018768310546875,
-0.0006318092346191406,
-0.0189056396484375,
-0.016632080078125,
0.043121337890625,
-0.0272216796875,
-0.061126708984375,
-0.0004858970642089844,
0.02288818359375,
0.010833740234375,
-0.0105743408203125,
0.005069732666015625,
-0.0226593017578125,
-0.0232696533203125,
0.032440185546875,
0.035552978515625,
-0.046142578125,
-0.0170440673828125,
0.0052032470703125,
-0.009307861328125,
0.033050537109375,
0.0243072509765625,
-0.04656982421875,
0.03704833984375,
0.020477294921875,
0.0260772705078125,
0.06890869140625,
-0.008056640625,
0.0080108642578125,
-0.053741455078125,
0.055084228515625,
0.007038116455078125,
0.03680419921875,
0.0433349609375,
-0.02032470703125,
0.0283660888671875,
0.0300750732421875,
-0.0152435302734375,
-0.06353759765625,
-0.0012655258178710938,
-0.099853515625,
-0.016937255859375,
0.08392333984375,
-0.0223236083984375,
-0.059600830078125,
0.0123138427734375,
-0.0206146240234375,
0.0249786376953125,
-0.01142120361328125,
0.057647705078125,
0.0181427001953125,
0.0011491775512695312,
-0.036285400390625,
-0.01678466796875,
0.0286407470703125,
0.02117919921875,
-0.04730224609375,
-0.01507568359375,
0.022796630859375,
0.038848876953125,
0.04327392578125,
0.04827880859375,
-0.0045166015625,
0.037628173828125,
0.00843048095703125,
0.04058837890625,
-0.01123809814453125,
-0.0237274169921875,
-0.01422119140625,
0.01012420654296875,
-0.01073455810546875,
-0.05047607421875
]
] |
openai-gpt | 2023-04-06T13:42:43.000Z | [
"transformers",
"pytorch",
"tf",
"rust",
"safetensors",
"openai-gpt",
"text-generation",
"en",
"arxiv:1705.11168",
"arxiv:1803.02324",
"arxiv:1910.09700",
"license:mit",
"endpoints_compatible",
"has_space",
"region:us"
] | text-generation | null | null | null | openai-gpt | 189 | 32,164 | transformers | 2022-03-02T23:29:04 | ---
language: en
license: mit
---
# OpenAI GPT
## Table of Contents
- [Model Details](#model-details)
- [How To Get Started With the Model](#how-to-get-started-with-the-model)
- [Uses](#uses)
- [Risks, Limitations and Biases](#risks-limitations-and-biases)
- [Training](#training)
- [Evaluation](#evaluation)
- [Environmental Impact](#environmental-impact)
- [Technical Specifications](#technical-specifications)
- [Citation Information](#citation-information)
- [Model Card Authors](#model-card-authors)
## Model Details
**Model Description:** `openai-gpt` is a transformer-based language model created and released by OpenAI. The model is a causal (unidirectional) transformer pre-trained using language modeling on a large corpus with long range dependencies.
- **Developed by:** Alec Radford, Karthik Narasimhan, Tim Salimans, Ilya Sutskever. See [associated research paper](https://cdn.openai.com/research-covers/language-unsupervised/language_understanding_paper.pdf) and [GitHub repo](https://github.com/openai/finetune-transformer-lm) for model developers and contributors.
- **Model Type:** Transformer-based language model
- **Language(s):** English
- **License:** [MIT License](https://github.com/openai/finetune-transformer-lm/blob/master/LICENSE)
- **Related Models:** [GPT2](https://huggingface.co/gpt2), [GPT2-Medium](https://huggingface.co/gpt2-medium), [GPT2-Large](https://huggingface.co/gpt2-large) and [GPT2-XL](https://huggingface.co/gpt2-xl)
- **Resources for more information:**
- [Research Paper](https://cdn.openai.com/research-covers/language-unsupervised/language_understanding_paper.pdf)
- [OpenAI Blog Post](https://openai.com/blog/language-unsupervised/)
- [GitHub Repo](https://github.com/openai/finetune-transformer-lm)
- Test the full generation capabilities here: https://transformer.huggingface.co/doc/gpt
## How to Get Started with the Model
Use the code below to get started with the model. You can use this model directly with a pipeline for text generation. Since the generation relies on some randomness, we
set a seed for reproducibility:
```python
>>> from transformers import pipeline, set_seed
>>> generator = pipeline('text-generation', model='openai-gpt')
>>> set_seed(42)
>>> generator("Hello, I'm a language model,", max_length=30, num_return_sequences=5)
[{'generated_text': "Hello, I'm a language model,'he said, when i was finished.'ah well,'said the man,'that's"},
{'generated_text': 'Hello, I\'m a language model, " she said. \n she reached the bottom of the shaft and leaned a little further out. it was'},
{'generated_text': 'Hello, I\'m a language model, " she laughed. " we call that a\'white girl.\'or as we are called by the'},
{'generated_text': 'Hello, I\'m a language model, " said mr pin. " an\'the ones with the funny hats don\'t. " the rest of'},
{'generated_text': 'Hello, I\'m a language model, was\'ere \'bout to do some more dancin \', " he said, then his voice lowered to'}]
```
Here is how to use this model in PyTorch:
```python
from transformers import OpenAIGPTTokenizer, OpenAIGPTModel
import torch
tokenizer = OpenAIGPTTokenizer.from_pretrained("openai-gpt")
model = OpenAIGPTModel.from_pretrained("openai-gpt")
inputs = tokenizer("Hello, my dog is cute", return_tensors="pt")
outputs = model(**inputs)
last_hidden_states = outputs.last_hidden_state
```
and in TensorFlow:
```python
from transformers import OpenAIGPTTokenizer, TFOpenAIGPTModel
tokenizer = OpenAIGPTTokenizer.from_pretrained("openai-gpt")
model = TFOpenAIGPTModel.from_pretrained("openai-gpt")
inputs = tokenizer("Hello, my dog is cute", return_tensors="tf")
outputs = model(inputs)
last_hidden_states = outputs.last_hidden_state
```
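In both snippets, `last_hidden_states` has shape `(batch, sequence_length, hidden_size)` (768 for this model); a common way to reduce it to one fixed-size vector per input is to mean-pool over the sequence dimension. A model-free sketch of that pooling on a toy tensor:

```python
def mean_pool(hidden_states):
    """Average a (seq_len, hidden) list of vectors into one fixed-size vector."""
    seq_len = len(hidden_states)
    hidden = len(hidden_states[0])
    return [sum(v[i] for v in hidden_states) / seq_len for i in range(hidden)]

toy = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]  # seq_len=3, hidden=2
print(mean_pool(toy))  # [3.0, 4.0]
```

With the real outputs, the same reduction is `last_hidden_states.mean(dim=1)` in PyTorch.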
## Uses
#### Direct Use
This model can be used for language modeling tasks.
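In practice, a language-modeling use boils down to scoring text: the model's per-step next-token logits combine into a sentence log-probability and perplexity. A model-free sketch with a toy three-token vocabulary (the numbers are illustrative, not model outputs):

```python
import math

def sequence_log_prob(step_logits, token_ids):
    """Sum the log-softmax probability of each observed next token."""
    total = 0.0
    for logits, tok in zip(step_logits, token_ids):
        m = max(logits)
        log_z = m + math.log(sum(math.exp(x - m) for x in logits))  # log-sum-exp
        total += logits[tok] - log_z
    return total

step_logits = [[2.0, 0.5, -1.0], [0.1, 3.0, 0.2]]  # two steps, vocab of 3
token_ids = [0, 1]                                  # tokens actually observed
lp = sequence_log_prob(step_logits, token_ids)
ppl = math.exp(-lp / len(token_ids))                # perplexity
```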
#### Downstream Use
Potential downstream uses of this model include tasks that leverage language models. In the [associated paper](https://cdn.openai.com/research-covers/language-unsupervised/language_understanding_paper.pdf), the model developers discuss evaluations of the model for tasks including natural language inference (NLI), question answering, semantic similarity, and text classification.
#### Misuse and Out-of-scope Use
The model was not trained to produce factual or truthful representations of people or events; using it to generate such content is therefore out of scope for this model.
## Risks, Limitations and Biases
#### Biases
**CONTENT WARNING: Readers should be aware that language generated by this model can be disturbing or offensive to some and can propagate historical and current stereotypes.**
Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)).
Predictions generated by this model can include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups. For example:
```python
>>> from transformers import pipeline, set_seed
>>> generator = pipeline('text-generation', model='openai-gpt')
>>> set_seed(42)
>>> generator("The man worked as a", max_length=10, num_return_sequences=5)
[{'generated_text': 'The man worked as a teacher for the college he'},
{'generated_text': 'The man worked as a janitor at the club.'},
{'generated_text': 'The man worked as a bodyguard in america. the'},
{'generated_text': 'The man worked as a clerk for one of the'},
{'generated_text': 'The man worked as a nurse, but there was'}]
>>> set_seed(42)
>>> generator("The woman worked as a", max_length=10, num_return_sequences=5)
[{'generated_text': 'The woman worked as a medical intern but is a'},
{'generated_text': 'The woman worked as a midwife, i know that'},
{'generated_text': 'The woman worked as a prostitute in a sex club'},
{'generated_text': 'The woman worked as a secretary for one of the'},
{'generated_text': 'The woman worked as a nurse, but she had'}]
```
This bias may also affect fine-tuned versions of this model. Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model.
#### Risks and Limitations
The model developers also wrote in a [blog post](https://openai.com/blog/language-unsupervised/) about risks and limitations of the model, including:
> - **Compute Requirements:** Many previous approaches to NLP tasks train relatively small models on a single GPU from scratch. Our approach requires an expensive pre-training step - 1 month on 8 GPUs. Luckily, this only has to be done once and we’re releasing our model so others can avoid it. It is also a large model (in comparison to prior work) and consequently uses more compute and memory — we used a 37-layer (12 block) Transformer architecture, and we train on sequences of up to 512 tokens. Most experiments were conducted on 4 and 8 GPU systems. The model does fine-tune to new tasks very quickly which helps mitigate the additional resource requirements.
> - **The limits and bias of learning about the world through text:** Books and text readily available on the internet do not contain complete or even accurate information about the world. Recent work ([Lucy and Gauthier, 2017](https://arxiv.org/abs/1705.11168)) has shown that certain kinds of information are difficult to learn via just text and other work ([Gururangan et al., 2018](https://arxiv.org/abs/1803.02324)) has shown that models learn and exploit biases in data distributions.
> - **Still brittle generalization:** Although our approach improves performance across a broad range of tasks, current deep learning NLP models still exhibit surprising and counterintuitive behavior - especially when evaluated in a systematic, adversarial, or out-of-distribution way. Our approach is not immune to these issues, though we have observed some indications of progress. Our approach shows improved lexical robustness over previous purely neural approaches to textual entailment. On the dataset introduced in Glockner et al. (2018) our model achieves 83.75%, performing similarly to KIM, which incorporates external knowledge via WordNet.
## Training
#### Training Data
The model developers [write](https://cdn.openai.com/research-covers/language-unsupervised/language_understanding_paper.pdf):
> We use the BooksCorpus dataset ([Zhu et al., 2015](https://www.cv-foundation.org/openaccess/content_iccv_2015/papers/Zhu_Aligning_Books_and_ICCV_2015_paper.pdf)) for training the language model. It contains over 7,000 unique unpublished books from a variety of genres including Adventure, Fantasy, and Romance. Crucially, it contains long stretches of contiguous text, which allows the generative model to learn to condition on long-range information.
#### Training Procedure
The model developers [write](https://cdn.openai.com/research-covers/language-unsupervised/language_understanding_paper.pdf):
> Our model largely follows the original transformer work [62]. We trained a 12-layer decoder-only transformer with masked self-attention heads (768 dimensional states and 12 attention heads). For the position-wise feed-forward networks, we used 3072 dimensional inner states. We used the Adam optimization scheme [27] with a max learning rate of 2.5e-4. The learning rate was increased linearly from zero over the first 2000 updates and annealed to 0 using a cosine schedule. We train for 100 epochs on minibatches of 64 randomly sampled, contiguous sequences of 512 tokens. Since layernorm [2] is used extensively throughout the model, a simple weight initialization of N (0, 0.02) was sufficient. We used a bytepair encoding (BPE) vocabulary with 40,000 merges [53] and residual, embedding, and attention dropouts with a rate of 0.1 for regularization. We also employed a modified version of L2 regularization proposed in [37], with w = 0.01 on all non bias or gain weights. For the activation function, we used the Gaussian Error Linear Unit (GELU) [18]. We used learned position embeddings instead of the sinusoidal version proposed in the original work. We use the ftfy library2 to clean the raw text in BooksCorpus, standardize some punctuation and whitespace, and use the spaCy tokenizer.
See the paper for further details and links to citations.
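The learning-rate schedule quoted above (linear warmup from zero over the first 2,000 updates, then cosine annealing to zero, peak 2.5e-4) can be sketched as follows; the total update count here is an arbitrary stand-in for illustration:

```python
import math

PEAK_LR = 2.5e-4     # max learning rate from the paper
WARMUP_STEPS = 2000  # linear warmup length from the paper

def lr_at(step, total_steps):
    """Linear warmup to PEAK_LR, then cosine anneal to 0."""
    if step < WARMUP_STEPS:
        return PEAK_LR * step / WARMUP_STEPS
    progress = (step - WARMUP_STEPS) / (total_steps - WARMUP_STEPS)
    return PEAK_LR * 0.5 * (1.0 + math.cos(math.pi * progress))
```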
## Evaluation
The following evaluation information is extracted from the [associated blog post](https://openai.com/blog/language-unsupervised/). See the [associated paper](https://cdn.openai.com/research-covers/language-unsupervised/language_understanding_paper.pdf) for further details.
#### Testing Data, Factors and Metrics
The model developers report that the model was evaluated on the following tasks and datasets using the listed metrics:
- **Task:** Textual Entailment
- **Datasets:** [SNLI](https://huggingface.co/datasets/snli), [MNLI Matched](https://huggingface.co/datasets/glue), [MNLI Mismatched](https://huggingface.co/datasets/glue), [SciTail](https://huggingface.co/datasets/scitail), [QNLI](https://huggingface.co/datasets/glue), [RTE](https://huggingface.co/datasets/glue)
- **Metrics:** Accuracy
- **Task:** Semantic Similarity
- **Datasets:** [STS-B](https://huggingface.co/datasets/glue), [QQP](https://huggingface.co/datasets/glue), [MRPC](https://huggingface.co/datasets/glue)
- **Metrics:** Accuracy
- **Task:** Reading Comprehension
- **Datasets:** [RACE](https://huggingface.co/datasets/race)
- **Metrics:** Accuracy
- **Task:** Commonsense Reasoning
- **Datasets:** [ROCStories](https://huggingface.co/datasets/story_cloze), [COPA](https://huggingface.co/datasets/xcopa)
- **Metrics:** Accuracy
- **Task:** Sentiment Analysis
- **Datasets:** [SST-2](https://huggingface.co/datasets/glue)
- **Metrics:** Accuracy
- **Task:** Linguistic Acceptability
- **Datasets:** [CoLA](https://huggingface.co/datasets/glue)
- **Metrics:** Accuracy
- **Task:** Multi Task Benchmark
- **Datasets:** [GLUE](https://huggingface.co/datasets/glue)
- **Metrics:** Accuracy
#### Results
The model achieves the following results (these correspond to the results after task-specific fine-tuning reported in the paper):
| Task | TE | TE | TE |TE | TE | TE | SS | SS | SS | RC | CR | CR | SA | LA | MTB |
|:--------:|:--:|:----------:|:-------------:|:-----:|:----:|:---:|:---:|:---:|:--:|:----:|:--------:|:----:|:----:|:----:|:----:|
| Dataset |SNLI|MNLI Matched|MNLI Mismatched|SciTail| QNLI | RTE |STS-B| QQP |MRPC|RACE |ROCStories|COPA | SST-2| CoLA | GLUE |
| |89.9| 82.1 | 81.4 |88.3 | 88.1 | 56.0|82.0 | 70.3|82.3|59.0 | 86.5 | 78.6 | 91.3 | 45.4 | 72.8 |
## Environmental Impact
The model developers [report that](https://openai.com/blog/language-unsupervised/):
> The total compute used to train this model was 0.96 petaflop days (pfs-days).
> 8 P600 GPU's * 30 days * 12 TFLOPS/GPU * 0.33 utilization = .96 pfs-days
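That figure can be sanity-checked with straightforward arithmetic, taking 1 pfs-day as 1,000 TFLOPS sustained for one day:

```python
# Reproduce the blog's compute estimate from its own constants
gpus, days, tflops_per_gpu, utilization = 8, 30, 12, 0.33
pfs_days = gpus * days * tflops_per_gpu * utilization / 1000
print(round(pfs_days, 2))  # ≈ 0.95; matches the reported 0.96 if utilization is taken as exactly 1/3
```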
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** 8 P600 GPUs
- **Hours used:** 720 hours (30 days)
- **Cloud Provider:** Unknown
- **Compute Region:** Unknown
- **Carbon Emitted:** Unknown
## Technical Specifications
See the [associated paper](https://cdn.openai.com/research-covers/language-unsupervised/language_understanding_paper.pdf) for details on the modeling architecture, objective, compute infrastructure, and training details.
## Citation Information
```bibtex
@article{radford2018improving,
title={Improving language understanding by generative pre-training},
author={Radford, Alec and Narasimhan, Karthik and Salimans, Tim and Sutskever, Ilya and others},
year={2018},
publisher={OpenAI}
}
```
APA:
*Radford, A., Narasimhan, K., Salimans, T., & Sutskever, I. (2018). Improving language understanding by generative pre-training.*
## Model Card Authors
This model card was written by the Hugging Face team. | 14,077 | [
[
-0.01528167724609375,
-0.05792236328125,
0.01207733154296875,
0.0104217529296875,
-0.01531982421875,
-0.035308837890625,
-0.03326416015625,
-0.026214599609375,
-0.009307861328125,
0.03424072265625,
-0.0298919677734375,
-0.03302001953125,
-0.049591064453125,
0.0037021636962890625,
-0.0252685546875,
0.09564208984375,
-0.0026187896728515625,
-0.0010471343994140625,
0.01129913330078125,
0.0016155242919921875,
-0.0185546875,
-0.032379150390625,
-0.049468994140625,
-0.035003662109375,
0.017578125,
-0.0019683837890625,
0.0518798828125,
0.041229248046875,
0.0250244140625,
0.020843505859375,
-0.016204833984375,
-0.00994873046875,
-0.03240966796875,
-0.0181884765625,
-0.004367828369140625,
-0.0301513671875,
-0.0244293212890625,
0.0101318359375,
0.04229736328125,
0.04815673828125,
-0.0129852294921875,
0.0211029052734375,
0.00948333740234375,
0.0312347412109375,
-0.0299835205078125,
0.0273590087890625,
-0.04052734375,
0.0007848739624023438,
-0.0244293212890625,
0.01045989990234375,
-0.03472900390625,
-0.0166473388671875,
0.00830841064453125,
-0.033050537109375,
0.0232696533203125,
-0.01525115966796875,
0.0908203125,
0.0190582275390625,
-0.0191650390625,
-0.0145263671875,
-0.06072998046875,
0.05706787109375,
-0.0703125,
0.0204620361328125,
0.0297088623046875,
0.0118865966796875,
-0.004116058349609375,
-0.06805419921875,
-0.05975341796875,
-0.01532745361328125,
-0.0196685791015625,
0.016082763671875,
-0.020904541015625,
-0.004909515380859375,
0.02960205078125,
0.04296875,
-0.04766845703125,
-0.00244903564453125,
-0.036102294921875,
-0.005146026611328125,
0.040130615234375,
-0.00463104248046875,
0.040496826171875,
-0.0323486328125,
-0.010406494140625,
-0.0225067138671875,
-0.03912353515625,
0.004940032958984375,
0.045013427734375,
0.031890869140625,
-0.033966064453125,
0.05267333984375,
-0.004611968994140625,
0.045654296875,
-0.01230621337890625,
-0.01251220703125,
0.0283355712890625,
-0.0267486572265625,
-0.0249786376953125,
-0.0161895751953125,
0.09307861328125,
0.0117340087890625,
0.031768798828125,
-0.01210784912109375,
-0.01080322265625,
0.00370025634765625,
0.00629425048828125,
-0.055328369140625,
-0.0238494873046875,
0.01328277587890625,
-0.043701171875,
-0.0280609130859375,
0.00616455078125,
-0.06939697265625,
-0.0013189315795898438,
-0.0070953369140625,
0.0287933349609375,
-0.0316162109375,
-0.038055419921875,
0.005512237548828125,
-0.006313323974609375,
0.016693115234375,
-0.0082550048828125,
-0.0789794921875,
0.01247406005859375,
0.034027099609375,
0.06640625,
0.0142059326171875,
-0.022491455078125,
-0.01020050048828125,
-0.007114410400390625,
-0.005016326904296875,
0.0187225341796875,
-0.018096923828125,
-0.01514434814453125,
-0.0029125213623046875,
0.0027446746826171875,
-0.01300811767578125,
-0.018402099609375,
0.0439453125,
-0.028656005859375,
0.045013427734375,
0.00897979736328125,
-0.0179595947265625,
-0.0126800537109375,
-0.004695892333984375,
-0.04730224609375,
0.08013916015625,
0.0165863037109375,
-0.082763671875,
0.011199951171875,
-0.068115234375,
-0.028717041015625,
-0.00771331787109375,
-0.005283355712890625,
-0.03350830078125,
-0.004608154296875,
0.0261688232421875,
0.0272216796875,
-0.0300750732421875,
0.040618896484375,
0.005863189697265625,
-0.0081939697265625,
0.0131683349609375,
-0.040496826171875,
0.09619140625,
0.0238494873046875,
-0.03521728515625,
0.004486083984375,
-0.048797607421875,
-0.01169586181640625,
0.038604736328125,
-0.0236358642578125,
-0.018829345703125,
-0.009124755859375,
0.0166168212890625,
0.017333984375,
0.0106658935546875,
-0.045928955078125,
0.015655517578125,
-0.056854248046875,
0.0513916015625,
0.049652099609375,
-0.0191497802734375,
0.0308990478515625,
-0.012664794921875,
0.0286407470703125,
0.003170013427734375,
0.00963592529296875,
-0.0156707763671875,
-0.05767822265625,
-0.06298828125,
-0.006206512451171875,
0.037353515625,
0.0533447265625,
-0.058807373046875,
0.041290283203125,
-0.0222625732421875,
-0.0394287109375,
-0.050872802734375,
-0.00006157159805297852,
0.039886474609375,
0.0382080078125,
0.034637451171875,
-0.0179901123046875,
-0.04833984375,
-0.04864501953125,
-0.0213623046875,
-0.0202484130859375,
-0.01715087890625,
0.0180511474609375,
0.05816650390625,
-0.027191162109375,
0.07000732421875,
-0.04547119140625,
-0.0277252197265625,
-0.033660888671875,
0.0158538818359375,
0.03704833984375,
0.0384521484375,
0.032867431640625,
-0.05596923828125,
-0.027191162109375,
-0.0022678375244140625,
-0.064208984375,
0.00046133995056152344,
-0.00499725341796875,
-0.0032806396484375,
0.027496337890625,
0.028564453125,
-0.059967041015625,
0.03173828125,
0.04052734375,
-0.0287017822265625,
0.04095458984375,
-0.011993408203125,
-0.0174713134765625,
-0.1015625,
0.0235443115234375,
0.013427734375,
0.0020618438720703125,
-0.0589599609375,
0.01279449462890625,
-0.01507568359375,
-0.018035888671875,
-0.033111572265625,
0.05291748046875,
-0.0252838134765625,
0.017059326171875,
0.0002846717834472656,
0.00231170654296875,
-0.0222930908203125,
0.0577392578125,
0.0022869110107421875,
0.05462646484375,
0.03912353515625,
-0.042510986328125,
0.01233673095703125,
0.029693603515625,
-0.0282440185546875,
0.00433349609375,
-0.0633544921875,
0.01218414306640625,
-0.020843505859375,
0.006671905517578125,
-0.059600830078125,
-0.0178070068359375,
0.036834716796875,
-0.042633056640625,
0.0269927978515625,
-0.019866943359375,
-0.041015625,
-0.03460693359375,
0.00194549560546875,
0.031097412109375,
0.055938720703125,
-0.042022705078125,
0.041412353515625,
0.032958984375,
-0.00545501708984375,
-0.051605224609375,
-0.06024169921875,
0.0026798248291015625,
-0.01070404052734375,
-0.051025390625,
0.0278778076171875,
0.0155487060546875,
-0.0174560546875,
-0.0022735595703125,
0.01666259765625,
-0.003093719482421875,
-0.00179290771484375,
0.01435089111328125,
0.0200347900390625,
-0.0207366943359375,
-0.0028095245361328125,
-0.01409149169921875,
-0.0150604248046875,
0.0021610260009765625,
-0.0298004150390625,
0.054290771484375,
-0.0171661376953125,
-0.008575439453125,
-0.03436279296875,
0.0165863037109375,
0.0215606689453125,
-0.0206146240234375,
0.064697265625,
0.06597900390625,
-0.03656005859375,
0.003795623779296875,
-0.02069091796875,
-0.0139923095703125,
-0.0338134765625,
0.039093017578125,
-0.0189971923828125,
-0.061187744140625,
0.037322998046875,
0.0210113525390625,
0.00843048095703125,
0.062347412109375,
0.04644775390625,
0.01477813720703125,
0.07623291015625,
0.05828857421875,
-0.0220794677734375,
0.03466796875,
-0.03369140625,
0.032562255859375,
-0.058197021484375,
-0.01261138916015625,
-0.04949951171875,
0.00096893310546875,
-0.068603515625,
-0.0289459228515625,
0.0151519775390625,
0.0122833251953125,
-0.028839111328125,
0.03778076171875,
-0.04425048828125,
0.0203857421875,
0.0494384765625,
0.0002067089080810547,
0.01190948486328125,
0.00481414794921875,
-0.01308441162109375,
0.00027441978454589844,
-0.0496826171875,
-0.0479736328125,
0.089599609375,
0.04779052734375,
0.0567626953125,
0.0071258544921875,
0.04046630859375,
-0.0077056884765625,
0.0234832763671875,
-0.034149169921875,
0.032501220703125,
-0.0266876220703125,
-0.046295166015625,
-0.033050537109375,
-0.04632568359375,
-0.07647705078125,
0.016357421875,
-0.0019245147705078125,
-0.078125,
0.0019779205322265625,
0.004169464111328125,
-0.0096588134765625,
0.041229248046875,
-0.054534912109375,
0.08453369140625,
-0.008209228515625,
-0.0028285980224609375,
0.0131683349609375,
-0.054534912109375,
0.041717529296875,
0.01357269287109375,
0.01091766357421875,
0.00775146484375,
0.01763916015625,
0.06549072265625,
-0.038604736328125,
0.07806396484375,
-0.01171112060546875,
-0.0025959014892578125,
0.0227813720703125,
-0.0185546875,
0.02459716796875,
-0.002964019775390625,
-0.0026760101318359375,
0.04296875,
-0.016021728515625,
-0.0186767578125,
-0.0260162353515625,
0.03839111328125,
-0.08050537109375,
-0.023895263671875,
-0.033966064453125,
-0.038238525390625,
0.02001953125,
0.0279083251953125,
0.043060302734375,
0.034210205078125,
-0.0134429931640625,
0.0009131431579589844,
0.04046630859375,
-0.040679931640625,
0.04296875,
0.01079559326171875,
-0.0176544189453125,
-0.02825927734375,
0.06829833984375,
0.0013704299926757812,
0.0156402587890625,
0.0299224853515625,
0.0235443115234375,
-0.029449462890625,
-0.021759033203125,
-0.04052734375,
0.032867431640625,
-0.05242919921875,
-0.0015878677368164062,
-0.06597900390625,
-0.03369140625,
-0.05133056640625,
0.019805908203125,
-0.021453857421875,
-0.022186279296875,
-0.0304412841796875,
0.0023899078369140625,
0.0255584716796875,
0.051788330078125,
0.0082855224609375,
0.035308837890625,
-0.044097900390625,
0.032867431640625,
0.01861572265625,
0.03424072265625,
0.003070831298828125,
-0.054412841796875,
-0.01055145263671875,
0.0164337158203125,
-0.03369140625,
-0.059478759765625,
0.04541015625,
0.01195526123046875,
0.04345703125,
0.020172119140625,
0.002044677734375,
0.028533935546875,
-0.034149169921875,
0.07257080078125,
0.01473236083984375,
-0.07025146484375,
0.046478271484375,
-0.029449462890625,
0.0273590087890625,
0.0184173583984375,
0.03033447265625,
-0.048583984375,
-0.032928466796875,
-0.054931640625,
-0.07781982421875,
0.06988525390625,
0.037567138671875,
0.02459716796875,
-0.00450897216796875,
0.02154541015625,
-0.004741668701171875,
0.01178741455078125,
-0.09918212890625,
-0.043212890625,
-0.038055419921875,
-0.02789306640625,
-0.0167999267578125,
-0.01352691650390625,
-0.0007205009460449219,
-0.02191162109375,
0.06414794921875,
-0.007640838623046875,
0.05194091796875,
0.01032257080078125,
-0.0244598388671875,
0.0031642913818359375,
0.0252227783203125,
0.046173095703125,
0.037567138671875,
-0.013427734375,
0.0003502368927001953,
0.00775146484375,
-0.050201416015625,
-0.0055389404296875,
0.021392822265625,
-0.03643798828125,
0.0035915374755859375,
0.020355224609375,
0.08935546875,
-0.009124755859375,
-0.031097412109375,
0.0460205078125,
-0.01105499267578125,
-0.012786865234375,
-0.0274505615234375,
-0.0050811767578125,
0.0064544677734375,
0.0027294158935546875,
0.0163421630859375,
0.006229400634765625,
-0.01027679443359375,
-0.0421142578125,
0.0038700103759765625,
0.0195159912109375,
-0.02783203125,
-0.0240478515625,
0.07196044921875,
0.00751495361328125,
-0.03515625,
0.06378173828125,
-0.0204315185546875,
-0.039764404296875,
0.04351806640625,
0.05133056640625,
0.08099365234375,
-0.0086822509765625,
0.017333984375,
0.045501708984375,
0.040618896484375,
-0.016326904296875,
0.0158843994140625,
0.01401519775390625,
-0.06451416015625,
-0.040802001953125,
-0.05810546875,
-0.0023860931396484375,
0.0296630859375,
-0.037200927734375,
0.02252197265625,
-0.0216064453125,
-0.00551605224609375,
-0.0008454322814941406,
-0.01824951171875,
-0.05633544921875,
0.0139007568359375,
0.003070831298828125,
0.065185546875,
-0.0804443359375,
0.06524658203125,
0.04290771484375,
-0.06524658203125,
-0.0687255859375,
0.01445770263671875,
-0.021240234375,
-0.051422119140625,
0.045196533203125,
0.02752685546875,
0.024871826171875,
0.00011646747589111328,
-0.036102294921875,
-0.05078125,
0.0697021484375,
0.0276031494140625,
-0.0204925537109375,
-0.01554107666015625,
0.025390625,
0.05108642578125,
-0.016998291015625,
0.0499267578125,
0.047149658203125,
0.03143310546875,
-0.01222991943359375,
-0.07354736328125,
0.01499176025390625,
-0.031341552734375,
0.0124359130859375,
0.01495361328125,
-0.032196044921875,
0.08721923828125,
-0.0188751220703125,
-0.0162811279296875,
0.009765625,
0.059051513671875,
-0.0044097900390625,
0.00196075439453125,
0.034881591796875,
0.053497314453125,
0.046539306640625,
-0.0257110595703125,
0.09564208984375,
-0.0135955810546875,
0.0430908203125,
0.078125,
0.0162200927734375,
0.052337646484375,
0.0211029052734375,
-0.0235443115234375,
0.038055419921875,
0.0306243896484375,
-0.00787353515625,
0.03515625,
0.01068115234375,
-0.005329132080078125,
0.00952911376953125,
-0.006229400634765625,
-0.033660888671875,
0.034393310546875,
0.01116943359375,
-0.0421142578125,
-0.016143798828125,
0.004764556884765625,
0.0102386474609375,
-0.007068634033203125,
-0.009613037109375,
0.052276611328125,
0.012664794921875,
-0.0537109375,
0.04425048828125,
0.02520751953125,
0.068603515625,
-0.044403076171875,
0.004161834716796875,
-0.00655364990234375,
0.0191192626953125,
-0.0125732421875,
-0.056488037109375,
0.0229644775390625,
0.0080413818359375,
-0.017364501953125,
-0.0265655517578125,
0.048919677734375,
-0.045135498046875,
-0.035614013671875,
0.0221099853515625,
0.021820068359375,
0.025726318359375,
-0.00940704345703125,
-0.05975341796875,
-0.0152587890625,
0.00994110107421875,
-0.02239990234375,
0.021728515625,
0.017730712890625,
-0.00353240966796875,
0.04296875,
0.053131103515625,
0.006610870361328125,
0.006366729736328125,
0.0258636474609375,
0.05218505859375,
-0.046478271484375,
-0.038726806640625,
-0.0692138671875,
0.0545654296875,
0.0003669261932373047,
-0.04071044921875,
0.049224853515625,
0.03668212890625,
0.08489990234375,
-0.0166778564453125,
0.068603515625,
-0.027313232421875,
0.0269012451171875,
-0.03521728515625,
0.0640869140625,
-0.038330078125,
0.007320404052734375,
-0.0404052734375,
-0.06658935546875,
-0.0213623046875,
0.05560302734375,
-0.024200439453125,
0.0205535888671875,
0.045135498046875,
0.0714111328125,
-0.0082550048828125,
0.0025730133056640625,
0.01056671142578125,
0.0239715576171875,
0.0263824462890625,
0.039581298828125,
0.042816162109375,
-0.0634765625,
0.055328369140625,
-0.035614013671875,
-0.0167999267578125,
-0.0012302398681640625,
-0.061279296875,
-0.072021484375,
-0.05291748046875,
-0.0260162353515625,
-0.03863525390625,
-0.00836181640625,
0.06219482421875,
0.04083251953125,
-0.06402587890625,
-0.027496337890625,
-0.020172119140625,
0.01058197021484375,
-0.015777587890625,
-0.0211639404296875,
0.0312347412109375,
-0.0152435302734375,
-0.0634765625,
0.006038665771484375,
-0.006839752197265625,
0.0134124755859375,
-0.0238494873046875,
-0.011749267578125,
-0.02899169921875,
-0.01525115966796875,
0.032623291015625,
0.027862548828125,
-0.06103515625,
-0.009613037109375,
-0.003620147705078125,
-0.014434814453125,
-0.0012035369873046875,
0.04669189453125,
-0.036376953125,
0.0230560302734375,
0.04107666015625,
0.034881591796875,
0.040863037109375,
-0.0189666748046875,
0.0452880859375,
-0.055450439453125,
0.018402099609375,
0.006011962890625,
0.03466796875,
0.0272216796875,
-0.0312347412109375,
0.0447998046875,
0.024078369140625,
-0.041015625,
-0.061187744140625,
0.0092620849609375,
-0.06201171875,
-0.0175018310546875,
0.09149169921875,
-0.0116729736328125,
-0.020477294921875,
-0.00180816650390625,
-0.01788330078125,
0.04815673828125,
-0.021087646484375,
0.043701171875,
0.03375244140625,
0.016998291015625,
-0.0133209228515625,
-0.055328369140625,
0.032470703125,
0.0258636474609375,
-0.0537109375,
-0.0002579689025878906,
0.0019483566284179688,
0.026824951171875,
0.0225982666015625,
0.050811767578125,
-0.0156402587890625,
0.002292633056640625,
-0.001766204833984375,
0.0321044921875,
0.0017719268798828125,
-0.0121917724609375,
-0.033447265625,
0.0045013427734375,
-0.01007843017578125,
-0.0175018310546875
]
] |
yahma/llama-7b-hf | 2023-04-08T14:50:03.000Z | [
"transformers",
"pytorch",
"llama",
"text-generation",
"license:other",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | yahma | null | null | yahma/llama-7b-hf | 58 | 31,968 | transformers | 2023-04-08T14:39:35 | ---
license: other
---
LLaMA-7B converted to work with git head Transformers/HuggingFace on April 8, 2023. This version should resolve the EOS token issues.
This is under a special license, please see the LICENSE file for details.
This contains the weights for the LLaMA-7b model. This model is under a non-commercial license (see the LICENSE file).
You should only use this repository if you have been granted access to the model by filling out [this form](https://docs.google.com/forms/d/e/1FAIpQLSfqNECQnMkycAp2jP4Z9TFX0cGR4uf7b_fBxjY_OjhJILlKGA/viewform?usp=send_form) but either lost your copy of the weights or got some trouble converting them to the Transformers format.
# LLaMA Model Card
## Model details
**Organization developing the model**
The FAIR team of Meta AI.
**Model date**
LLaMA was trained between December 2022 and February 2023.
**Model version**
This is version 1 of the model.
**Model type**
LLaMA is an auto-regressive language model, based on the transformer architecture. The model comes in different sizes: 7B, 13B, 33B and 65B parameters.
**Paper or resources for more information**
More information can be found in the paper “LLaMA: Open and Efficient Foundation Language Models”, available at https://research.facebook.com/publications/llama-open-and-efficient-foundation-language-models/.
**Citations details**
https://research.facebook.com/publications/llama-open-and-efficient-foundation-language-models/
**License**
Non-commercial bespoke license
**Where to send questions or comments about the model**
Questions and comments about LLaMA can be sent via the [GitHub repository](https://github.com/facebookresearch/llama) of the project, by opening an issue.
## Intended use
**Primary intended uses**
The primary use of LLaMA is research on large language models, including:
- exploring potential applications such as question answering, natural language understanding, or reading comprehension,
- understanding the capabilities and limitations of current language models, and developing techniques to improve them,
- evaluating and mitigating biases, risks, toxic and harmful content generation, and hallucinations.
**Primary intended users**
The primary intended users of the model are researchers in natural language processing, machine learning and artificial intelligence.
**Out-of-scope use cases**
LLaMA is a base, or foundational, model. As such, it should not be used on downstream applications without further risk evaluation and mitigation. In particular, our model has not been trained with human feedback, and can thus generate toxic or offensive content, incorrect information or generally unhelpful answers.
## Factors
**Relevant factors**
One of the most relevant factors for which model performance may vary is which language is used. Although we included 20 languages in the training data, most of our dataset is made of English text, and we thus expect the model to perform better for English than other languages. Relatedly, it has been shown in previous studies that performance might vary for different dialects, and we expect that it will be the case for our model.
**Evaluation factors**
As our model is trained on data from the Web, we expect that it reflects biases from this source. We thus evaluated on RAI datasets to measure biases exhibited by the model for gender, religion, race, sexual orientation, age, nationality, disability, physical appearance and socio-economic status. We also measure the toxicity of model generations, depending on the toxicity of the context used to prompt the model.
## Metrics
**Model performance measures**
We use the following measures to evaluate the model:
- Accuracy for common sense reasoning, reading comprehension, natural language understanding (MMLU), BIG-bench hard, WinoGender and CrowS-Pairs,
- Exact match for question answering,
- The toxicity score from Perspective API on RealToxicityPrompts.
**Decision thresholds**
Not applicable.
**Approaches to uncertainty and variability**
Due to the high computational requirements of training LLMs, we trained only one model of each size, and thus could not evaluate variability of pre-training.
## Evaluation datasets
The model was evaluated on the following benchmarks: BoolQ, PIQA, SIQA, HellaSwag, WinoGrande, ARC, OpenBookQA, NaturalQuestions, TriviaQA, RACE, MMLU, BIG-bench hard, GSM8k, RealToxicityPrompts, WinoGender, CrowS-Pairs.
## Training dataset
The model was trained using the following sources of data: CCNet [67%], C4 [15%], GitHub [4.5%], Wikipedia [4.5%], Books [4.5%], ArXiv [2.5%], Stack Exchange [2%]. The Wikipedia and Books domains include data in the following languages: bg, ca, cs, da, de, en, es, fr, hr, hu, it, nl, pl, pt, ro, ru, sl, sr, sv, uk. See the paper for more details about the training set and corresponding preprocessing.
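As a quick sanity check, the quoted mixture percentages sum to 100%; assuming the nominal 1T-token budget of the 7B/13B runs (an assumption taken from the hyperparameter table below, not stated in this paragraph), the implied token counts per source can be sketched as:

```python
# Training-mixture percentages quoted above for LLaMA.
mixture = {
    "CCNet": 67.0, "C4": 15.0, "GitHub": 4.5, "Wikipedia": 4.5,
    "Books": 4.5, "ArXiv": 2.5, "Stack Exchange": 2.0,
}
assert abs(sum(mixture.values()) - 100.0) < 1e-9  # percentages sum to 100

total_tokens = 1e12  # 1T tokens: the 7B/13B budget (assumption)
for source, pct in mixture.items():
    # e.g. CCNet -> 670B tokens, Stack Exchange -> 20B tokens
    print(f"{source}: {pct / 100 * total_tokens / 1e9:.0f}B tokens")
```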
## Quantitative analysis
Hyperparameters for the model architecture
<table>
<thead>
<tr>
<th >LLaMA</th> <th colspan=6>Model hyper parameters </th>
</tr>
<tr>
<th>Number of parameters</th><th>dimension</th><th>n heads</th><th>n layers</th><th>Learn rate</th><th>Batch size</th><th>n tokens</th>
</tr>
</thead>
<tbody>
<tr>
<th>7B</th><th>4096</th><th>32</th><th>32</th><th>3.0E-04</th><th>4M</th><th>1T</th>
</tr>
<tr>
<th>13B</th><th>5120</th><th>40</th><th>40</th><th>3.0E-04</th><th>4M</th><th>1T</th>
</tr>
<tr>
<th>33B</th><th>6656</th><th>52</th><th>60</th><th>1.5E-04</th><th>4M</th><th>1.4T</th>
</tr>
<tr>
<th>65B</th><th>8192</th><th>64</th><th>80</th><th>1.5E-04</th><th>4M</th><th>1.4T</th>
</tr>
</tbody>
</table>
*Table 1 - Summary of LLaMA Model Hyperparameters*
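The hyperparameters in Table 1 roughly determine the parameter counts; a back-of-the-envelope sketch for the 7B configuration, using the standard ~12·n_layers·d² dense-transformer approximation (LLaMA's SwiGLU feed-forward and exact head layout make the true count differ slightly; the 32,000-entry vocabulary is an assumption, not stated in the table):

```python
# Rough parameter-count estimate for the 7B row of Table 1.
d_model = 4096       # "dimension" column
n_layers = 32        # "n layers" column
vocab_size = 32_000  # assumed BPE vocabulary size (not in the table)

# ~12 * n_layers * d^2 covers attention (4 d^2) + a standard 4x-wide FFN (8 d^2);
# add the token-embedding matrix on top.
approx_params = 12 * n_layers * d_model**2 + vocab_size * d_model
print(f"{approx_params / 1e9:.1f}B")  # ≈ 6.6B, consistent with the nominal "7B"
```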
We present our results on nine standard common sense reasoning benchmarks in the table below.
<table>
<thead>
<tr>
<th>LLaMA</th> <th colspan=9>Reasoning tasks </th>
</tr>
<tr>
<th>Number of parameters</th> <th>BoolQ</th><th>PIQA</th><th>SIQA</th><th>HellaSwag</th><th>WinoGrande</th><th>ARC-e</th><th>ARC-c</th><th>OBQA</th><th>COPA</th>
</tr>
</thead>
<tbody>
<tr>
<th>7B</th><th>76.5</th><th>79.8</th><th>48.9</th><th>76.1</th><th>70.1</th><th>76.7</th><th>47.6</th><th>57.2</th><th>93</th>
</tr>
<tr>
<th>13B</th><th>78.1</th><th>80.1</th><th>50.4</th><th>79.2</th><th>73</th><th>78.1</th><th>52.7</th><th>56.4</th><th>94</th>
</tr>
<tr>
<th>33B</th><th>83.1</th><th>82.3</th><th>50.4</th><th>82.8</th><th>76</th><th>81.4</th><th>57.8</th><th>58.6</th><th>92</th>
</tr>
<tr>
<th>65B</th><th>85.3</th><th>82.8</th><th>52.3</th><th>84.2</th><th>77</th><th>81.5</th><th>56</th><th>60.2</th><th>94</th>
</tr>
</tbody>
</table>
*Table 2 - Summary of LLaMA Model Performance on Reasoning Tasks*
We present our results on bias in the table below. Note that lower values are better, indicating less bias.
| No | Category | FAIR LLM |
| --- | -------------------- | -------- |
| 1 | Gender | 70.6 |
| 2 | Religion | 79 |
| 3 | Race/Color | 57 |
| 4 | Sexual orientation | 81 |
| 5 | Age | 70.1 |
| 6 | Nationality | 64.2 |
| 7 | Disability | 66.7 |
| 8 | Physical appearance | 77.8 |
| 9 | Socioeconomic status | 71.5 |
| | LLaMA Average | 66.6 |
*Table 3 - Summary of bias in our model output*
## Ethical considerations
**Data**
The data used to train the model is collected from various sources, mostly from the Web. As such, it contains offensive, harmful and biased content. We thus expect the model to exhibit such biases from the training data.
**Human life**
The model is not intended to inform decisions about matters central to human life, and should not be used in such a way.
**Mitigations**
We filtered the data from the Web based on its proximity to Wikipedia text and references. For this, we used a Kneser-Ney language model and a fastText linear classifier.
**Risks and harms**
Risks and harms of large language models include the generation of harmful, offensive or biased content. These models are often prone to generating incorrect information, sometimes referred to as hallucinations. We do not expect our model to be an exception in this regard.
**Use cases**
LLaMA is a foundational model, and as such, it should not be used for downstream applications without further investigation and mitigations of risks. These risks and potential fraught use cases include, but are not limited to: generation of misinformation and generation of harmful, biased or offensive content.
| 8,831 | [
[
-0.02911376953125,
-0.051971435546875,
0.03338623046875,
0.0223236083984375,
-0.0203704833984375,
-0.0179290771484375,
0.0030879974365234375,
-0.049774169921875,
0.008575439453125,
0.031982421875,
-0.037841796875,
-0.042022705078125,
-0.05615234375,
0.016143798828125,
-0.0308380126953125,
0.0667724609375,
0.007389068603515625,
-0.006740570068359375,
-0.0124664306640625,
-0.01727294921875,
-0.0213623046875,
-0.04608154296875,
-0.03643798828125,
-0.0206146240234375,
0.03472900390625,
0.0190582275390625,
0.05120849609375,
0.051727294921875,
0.035400390625,
0.025390625,
-0.033416748046875,
0.0123138427734375,
-0.021392822265625,
-0.0111236572265625,
-0.0132598876953125,
-0.0310211181640625,
-0.04608154296875,
0.00792694091796875,
0.024444580078125,
0.0426025390625,
-0.01910400390625,
0.044219970703125,
0.003376007080078125,
0.04986572265625,
-0.05340576171875,
0.007076263427734375,
-0.044342041015625,
-0.0005521774291992188,
-0.006175994873046875,
0.006397247314453125,
-0.01071929931640625,
-0.0232086181640625,
0.0010967254638671875,
-0.054443359375,
-0.009918212890625,
0.01300811767578125,
0.09088134765625,
0.04669189453125,
-0.039306640625,
-0.000020384788513183594,
-0.0220794677734375,
0.08111572265625,
-0.0753173828125,
0.033538818359375,
0.0242767333984375,
0.0208587646484375,
-0.01201629638671875,
-0.041351318359375,
-0.056884765625,
-0.02142333984375,
0.009124755859375,
0.016357421875,
-0.0272369384765625,
-0.00991058349609375,
0.035369873046875,
0.0153350830078125,
-0.0309906005859375,
0.01244354248046875,
-0.0247344970703125,
-0.0168914794921875,
0.0653076171875,
0.0246429443359375,
0.006435394287109375,
-0.0208740234375,
-0.0389404296875,
-0.004932403564453125,
-0.044830322265625,
0.01505279541015625,
0.0269622802734375,
0.012420654296875,
-0.0205841064453125,
0.048492431640625,
-0.034271240234375,
0.0487060546875,
-0.0011425018310546875,
-0.038818359375,
0.03619384765625,
-0.0307769775390625,
-0.0274200439453125,
-0.006992340087890625,
0.059844970703125,
0.039031982421875,
0.0247955322265625,
0.010284423828125,
-0.007595062255859375,
0.0166168212890625,
0.004215240478515625,
-0.056060791015625,
0.015899658203125,
0.013458251953125,
-0.030731201171875,
-0.0194854736328125,
-0.0254058837890625,
-0.04779052734375,
-0.023529052734375,
-0.0321044921875,
0.003452301025390625,
-0.01776123046875,
-0.0259857177734375,
0.021728515625,
0.005886077880859375,
0.0275115966796875,
0.0291595458984375,
-0.044677734375,
0.021240234375,
0.03399658203125,
0.051788330078125,
-0.0236358642578125,
-0.03802490234375,
-0.0009069442749023438,
0.0114898681640625,
-0.01511383056640625,
0.0634765625,
-0.0231781005859375,
-0.0212249755859375,
-0.0213623046875,
0.0017604827880859375,
-0.006977081298828125,
-0.0433349609375,
0.0518798828125,
-0.033111572265625,
0.0056304931640625,
-0.038238525390625,
-0.0352783203125,
-0.027557373046875,
0.0423583984375,
-0.048065185546875,
0.09893798828125,
-0.010406494140625,
-0.057708740234375,
0.0223236083984375,
-0.053131103515625,
-0.0029277801513671875,
-0.0244293212890625,
-0.00095367431640625,
-0.022705078125,
-0.024932861328125,
0.0258636474609375,
0.04046630859375,
-0.049713134765625,
0.038330078125,
-0.022613525390625,
-0.040618896484375,
0.01800537109375,
-0.03765869140625,
0.08447265625,
0.0173187255859375,
-0.040679931640625,
-0.00084686279296875,
-0.07421875,
-0.00897216796875,
0.0404052734375,
-0.03802490234375,
-0.008758544921875,
-0.00885009765625,
-0.005008697509765625,
0.0260467529296875,
0.023529052734375,
-0.0355224609375,
0.007427215576171875,
-0.031494140625,
0.027435302734375,
0.055084228515625,
0.0081787109375,
0.0177001953125,
-0.0362548828125,
0.037994384765625,
0.019775390625,
0.0201873779296875,
0.0203399658203125,
-0.052032470703125,
-0.068603515625,
-0.01502227783203125,
0.014068603515625,
0.05303955078125,
-0.0185546875,
0.036834716796875,
-0.01947021484375,
-0.051666259765625,
-0.03271484375,
0.01800537109375,
0.041961669921875,
0.059112548828125,
0.034515380859375,
-0.007381439208984375,
-0.039520263671875,
-0.0782470703125,
-0.01305389404296875,
-0.02947998046875,
0.01363372802734375,
0.03192138671875,
0.050048828125,
-0.0293121337890625,
0.059417724609375,
-0.037750244140625,
-0.010406494140625,
-0.019317626953125,
-0.0203704833984375,
0.02423095703125,
0.02197265625,
0.04656982421875,
-0.051055908203125,
-0.0177764892578125,
-0.00640869140625,
-0.06927490234375,
-0.01309967041015625,
0.01416015625,
-0.01528167724609375,
0.0219879150390625,
0.022247314453125,
-0.05572509765625,
0.039825439453125,
0.040802001953125,
-0.01751708984375,
0.056854248046875,
0.022430419921875,
-0.0023651123046875,
-0.080810546875,
0.01485443115234375,
0.0173492431640625,
0.01186370849609375,
-0.030914306640625,
0.00888824462890625,
-0.00908660888671875,
-0.000057220458984375,
-0.058319091796875,
0.060455322265625,
-0.015228271484375,
0.0028209686279296875,
-0.00970458984375,
0.0047149658203125,
0.0095062255859375,
0.05438232421875,
-0.00196075439453125,
0.071044921875,
0.0307769775390625,
-0.05609130859375,
0.013275146484375,
0.01229095458984375,
-0.025360107421875,
0.0268707275390625,
-0.0672607421875,
0.003566741943359375,
0.0130462646484375,
0.024658203125,
-0.057891845703125,
-0.00792694091796875,
0.036041259765625,
-0.033203125,
-0.0120086669921875,
0.0024089813232421875,
-0.0462646484375,
-0.027130126953125,
-0.0135650634765625,
0.0216064453125,
0.034271240234375,
-0.0335693359375,
0.037506103515625,
0.026641845703125,
0.00640106201171875,
-0.0694580078125,
-0.062347412109375,
-0.0050201416015625,
-0.0272979736328125,
-0.06329345703125,
0.0151824951171875,
-0.01136016845703125,
-0.0316162109375,
-0.00626373291015625,
-0.00748443603515625,
-0.0021991729736328125,
0.01995849609375,
0.01580810546875,
0.019866943359375,
-0.00897979736328125,
0.0029811859130859375,
0.0032901763916015625,
-0.01248931884765625,
0.00847625732421875,
0.007762908935546875,
0.037506103515625,
-0.0221710205078125,
-0.041168212890625,
-0.04583740234375,
0.005794525146484375,
0.03521728515625,
-0.01580810546875,
0.05364990234375,
0.033416748046875,
-0.026947021484375,
0.004482269287109375,
-0.054656982421875,
-0.00299835205078125,
-0.037506103515625,
0.0310211181640625,
-0.021270751953125,
-0.05950927734375,
0.05047607421875,
-0.009735107421875,
-0.001026153564453125,
0.04669189453125,
0.052703857421875,
0.00675201416015625,
0.07611083984375,
0.035186767578125,
-0.01824951171875,
0.0247039794921875,
-0.0310211181640625,
0.013671875,
-0.06927490234375,
-0.0294036865234375,
-0.0341796875,
-0.01971435546875,
-0.033721923828125,
-0.0267181396484375,
0.0186920166015625,
0.0021648406982421875,
-0.051910400390625,
0.016265869140625,
-0.049407958984375,
0.022613525390625,
0.046966552734375,
0.014373779296875,
0.0269622802734375,
-0.003414154052734375,
-0.0234527587890625,
0.0021343231201171875,
-0.041778564453125,
-0.0540771484375,
0.0927734375,
0.03192138671875,
0.041229248046875,
0.007694244384765625,
0.04705810546875,
0.0029773712158203125,
0.0035381317138671875,
-0.0537109375,
0.06292724609375,
0.02252197265625,
-0.061859130859375,
-0.0166473388671875,
-0.0193634033203125,
-0.06829833984375,
0.037750244140625,
-0.00318145751953125,
-0.06317138671875,
0.028961181640625,
0.0010004043579101562,
-0.0171661376953125,
0.0260467529296875,
-0.0589599609375,
0.045379638671875,
-0.030120849609375,
-0.029144287109375,
-0.007770538330078125,
-0.044586181640625,
0.042236328125,
0.005046844482421875,
0.02056884765625,
-0.025054931640625,
-0.017364501953125,
0.071533203125,
-0.033111572265625,
0.0721435546875,
-0.00788116455078125,
0.00588226318359375,
0.034423828125,
-0.01030731201171875,
0.043212890625,
0.0003750324249267578,
-0.02813720703125,
0.038787841796875,
-0.0026760101318359375,
-0.03594970703125,
-0.033294677734375,
0.047088623046875,
-0.0792236328125,
-0.069091796875,
-0.049713134765625,
-0.04559326171875,
-0.01259613037109375,
0.01528167724609375,
0.009033203125,
-0.01299285888671875,
0.01538848876953125,
0.005706787109375,
0.047454833984375,
-0.023956298828125,
0.0230865478515625,
0.02862548828125,
-0.0025577545166015625,
-0.01751708984375,
0.0570068359375,
0.01502227783203125,
0.01605224609375,
0.0002703666687011719,
0.017181396484375,
-0.0229034423828125,
-0.04205322265625,
-0.034027099609375,
0.02880859375,
-0.057708740234375,
-0.0217132568359375,
-0.056060791015625,
-0.0221710205078125,
-0.03216552734375,
-0.0019350051879882812,
-0.0289459228515625,
-0.0277099609375,
-0.034454345703125,
-0.0225677490234375,
0.040252685546875,
0.038543701171875,
0.01486968994140625,
0.0303802490234375,
-0.035400390625,
0.00875091552734375,
0.013519287109375,
0.007740020751953125,
0.0177001953125,
-0.045196533203125,
-0.00988006591796875,
0.004245758056640625,
-0.044036865234375,
-0.0638427734375,
0.030181884765625,
-0.015777587890625,
0.062347412109375,
0.0265045166015625,
-0.000698089599609375,
0.055145263671875,
-0.006580352783203125,
0.08038330078125,
0.016387939453125,
-0.06317138671875,
0.040557861328125,
-0.025909423828125,
0.02142333984375,
0.041229248046875,
0.03924560546875,
-0.0206146240234375,
-0.0090484619140625,
-0.04949951171875,
-0.063720703125,
0.04205322265625,
0.0097808837890625,
-0.0078277587890625,
0.0032482147216796875,
0.016357421875,
-0.01001739501953125,
0.01910400390625,
-0.07476806640625,
-0.0322265625,
-0.0135650634765625,
-0.00537109375,
0.01320648193359375,
-0.017669677734375,
-0.0185546875,
-0.044158935546875,
0.050384521484375,
-0.004116058349609375,
0.0220184326171875,
0.00479888916015625,
-0.015289306640625,
0.00717926025390625,
0.01666259765625,
0.03338623046875,
0.061370849609375,
-0.0103912353515625,
-0.0113983154296875,
0.039031982421875,
-0.044586181640625,
0.0170745849609375,
0.0106353759765625,
-0.02117919921875,
-0.0250396728515625,
0.0268402099609375,
0.055450439453125,
0.01715087890625,
-0.058624267578125,
0.03369140625,
0.012359619140625,
-0.0207061767578125,
-0.0216064453125,
0.0106658935546875,
0.0169830322265625,
0.031768798828125,
0.0189666748046875,
-0.0021877288818359375,
0.0169525146484375,
-0.020538330078125,
-0.0143890380859375,
0.016357421875,
-0.00315093994140625,
-0.0109710693359375,
0.05792236328125,
0.00460052490234375,
-0.00827789306640625,
0.03662109375,
-0.008758544921875,
-0.03607177734375,
0.059417724609375,
0.040374755859375,
0.040435791015625,
-0.007495880126953125,
0.005100250244140625,
0.038726806640625,
0.0243377685546875,
-0.00594329833984375,
0.028045654296875,
0.0038928985595703125,
-0.043792724609375,
-0.02197265625,
-0.0623779296875,
-0.036834716796875,
0.007305145263671875,
-0.046051025390625,
0.0169525146484375,
-0.03802490234375,
-0.0191497802734375,
-0.01363372802734375,
0.024688720703125,
-0.064697265625,
0.032806396484375,
0.00662994384765625,
0.08062744140625,
-0.0633544921875,
0.0546875,
0.058074951171875,
-0.05291748046875,
-0.0787353515625,
-0.01120758056640625,
0.0173187255859375,
-0.07501220703125,
0.049774169921875,
0.004543304443359375,
-0.0029392242431640625,
-0.005626678466796875,
-0.055328369140625,
-0.09771728515625,
0.1123046875,
0.03387451171875,
-0.039215087890625,
0.00007289648056030273,
0.0275726318359375,
0.040374755859375,
-0.01593017578125,
0.025238037109375,
0.043792724609375,
0.0440673828125,
0.01432037353515625,
-0.0731201171875,
0.022247314453125,
-0.03802490234375,
-0.0019664764404296875,
-0.01471710205078125,
-0.0743408203125,
0.07464599609375,
-0.0074615478515625,
0.003437042236328125,
-0.0057830810546875,
0.043548583984375,
0.072021484375,
0.03961181640625,
0.035858154296875,
0.06982421875,
0.06683349609375,
0.006763458251953125,
0.08935546875,
-0.02191162109375,
0.0123443603515625,
0.062164306640625,
-0.022491455078125,
0.06719970703125,
0.03680419921875,
-0.04217529296875,
0.03802490234375,
0.06646728515625,
0.004581451416015625,
0.0214385986328125,
0.01100921630859375,
0.0017642974853515625,
-0.0173492431640625,
-0.0243377685546875,
-0.03875732421875,
0.0267181396484375,
0.0193023681640625,
-0.0237884521484375,
-0.00917816162109375,
-0.0126495361328125,
0.0152587890625,
-0.00435638427734375,
-0.01448822021484375,
0.037322998046875,
0.0126495361328125,
-0.038116455078125,
0.0638427734375,
-0.00453948974609375,
0.0623779296875,
-0.040740966796875,
0.003589630126953125,
-0.019927978515625,
0.0025157928466796875,
-0.0234832763671875,
-0.045654296875,
0.01226043701171875,
0.006603240966796875,
-0.0118560791015625,
0.0070343017578125,
0.052978515625,
0.0196685791015625,
-0.052581787109375,
0.044586181640625,
0.044342041015625,
0.0175933837890625,
-0.0054931640625,
-0.07904052734375,
0.023529052734375,
-0.0079803466796875,
-0.052093505859375,
0.031646728515625,
0.0247955322265625,
-0.025299072265625,
0.07867431640625,
0.06134033203125,
0.006534576416015625,
0.025054931640625,
0.0032405853271484375,
0.0850830078125,
-0.054412841796875,
-0.0210723876953125,
-0.05291748046875,
0.042877197265625,
-0.0018787384033203125,
-0.043609619140625,
0.0628662109375,
0.03448486328125,
0.06500244140625,
0.0217437744140625,
0.057220458984375,
0.00722503662109375,
0.037506103515625,
-0.025421142578125,
0.048187255859375,
-0.05914306640625,
0.0232391357421875,
-0.0224761962890625,
-0.0760498046875,
-0.0175933837890625,
0.04095458984375,
-0.03936767578125,
0.01364898681640625,
0.041748046875,
0.065673828125,
-0.0018863677978515625,
-0.0034427642822265625,
0.0140228271484375,
0.01898193359375,
0.0144500732421875,
0.04144287109375,
0.055389404296875,
-0.037628173828125,
0.0469970703125,
-0.032073974609375,
-0.0113983154296875,
-0.01336669921875,
-0.05438232421875,
-0.06427001953125,
-0.0328369140625,
-0.0185699462890625,
-0.02142333984375,
0.003814697265625,
0.0660400390625,
0.0489501953125,
-0.048309326171875,
-0.0279998779296875,
-0.00001239776611328125,
0.01318359375,
-0.0183563232421875,
-0.0150604248046875,
0.03057861328125,
-0.0003802776336669922,
-0.053009033203125,
0.0119781494140625,
-0.00289154052734375,
-0.00434112548828125,
-0.023529052734375,
-0.0208740234375,
-0.0472412109375,
0.0115966796875,
0.0428466796875,
0.0134735107421875,
-0.0792236328125,
-0.01036834716796875,
0.015838623046875,
-0.0145111083984375,
0.001873016357421875,
0.002384185791015625,
-0.056854248046875,
0.003143310546875,
0.0211639404296875,
0.0570068359375,
0.03936767578125,
0.0006113052368164062,
0.00873565673828125,
-0.028656005859375,
0.0186614990234375,
0.015869140625,
0.028594970703125,
0.0175933837890625,
-0.03863525390625,
0.0606689453125,
0.0305328369140625,
-0.04827880859375,
-0.074462890625,
-0.0006308555603027344,
-0.08953857421875,
-0.011810302734375,
0.10577392578125,
-0.005329132080078125,
-0.039459228515625,
0.0007719993591308594,
-0.018707275390625,
0.027862548828125,
-0.0360107421875,
0.048309326171875,
0.052032470703125,
-0.0186920166015625,
-0.01410675048828125,
-0.058624267578125,
0.01207733154296875,
0.037994384765625,
-0.059844970703125,
-0.019989013671875,
0.03045654296875,
0.02655029296875,
0.011993408203125,
0.040802001953125,
-0.00475311279296875,
0.021087646484375,
-0.00203704833984375,
0.012176513671875,
-0.0194549560546875,
-0.013427734375,
-0.01343536376953125,
-0.004970550537109375,
0.006786346435546875,
-0.008941650390625
]
] |
Yntec/photoMovieX | 2023-08-07T12:33:44.000Z | [
"diffusers",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-image",
"MagicArt35",
"license:creativeml-openrail-m",
"endpoints_compatible",
"has_space",
"diffusers:StableDiffusionPipeline",
"region:us"
] | text-to-image | Yntec | null | null | Yntec/photoMovieX | 3 | 31,863 | diffusers | 2023-08-05T09:53:28 | ---
license: creativeml-openrail-m
library_name: diffusers
pipeline_tag: text-to-image
tags:
- stable-diffusion
- stable-diffusion-diffusers
- diffusers
- text-to-image
- MagicArt35
---
# Photo Movie X
Preview and prompt:

pretty cute little girl in tricycle, Screenshot of an surreal jean 70s round minimalist architecture, Sharp light, 35mm still from a sci fi blockbuster color movie made in 2022, beautiful portrait, Dorothy, set in 1860, in front of a spaceship that has just landed on an alien planet, are all wearing, a robot stands nearby
Original page:
https://civitai.com/models/94687?modelVersionId=101000
| 736 | [
[
-0.035797119140625,
-0.02392578125,
0.0545654296875,
-0.0166168212890625,
-0.02740478515625,
0.0139923095703125,
0.045989990234375,
-0.0308074951171875,
0.032867431640625,
0.036895751953125,
-0.08544921875,
-0.0308380126953125,
-0.02142333984375,
-0.00852203369140625,
-0.056060791015625,
0.04583740234375,
0.00821685791015625,
-0.0003571510314941406,
0.010040283203125,
0.016998291015625,
-0.043701171875,
-0.002979278564453125,
-0.026947021484375,
-0.0014219284057617188,
0.003551483154296875,
0.051513671875,
0.06976318359375,
0.049652099609375,
0.0103607177734375,
0.028656005859375,
0.0053253173828125,
-0.006946563720703125,
-0.034881591796875,
-0.01387786865234375,
-0.01611328125,
-0.052337646484375,
-0.01071929931640625,
0.0257720947265625,
0.067138671875,
0.054351806640625,
0.0161590576171875,
0.0157012939453125,
-0.0156707763671875,
0.0262298583984375,
-0.034912109375,
-0.0184478759765625,
0.021759033203125,
0.0167694091796875,
-0.0147857666015625,
0.0301666259765625,
0.0082855224609375,
-0.0203704833984375,
-0.0389404296875,
-0.07257080078125,
0.0244598388671875,
0.04327392578125,
0.09466552734375,
0.01546478271484375,
-0.0162353515625,
-0.00835418701171875,
-0.047576904296875,
0.045196533203125,
0.00838470458984375,
0.0249786376953125,
-0.00998687744140625,
0.0413818359375,
-0.0149688720703125,
-0.06292724609375,
-0.07354736328125,
-0.01001739501953125,
0.006977081298828125,
-0.009185791015625,
-0.044036865234375,
-0.054046630859375,
0.0297698974609375,
0.0241851806640625,
-0.0645751953125,
-0.005832672119140625,
-0.04864501953125,
0.004123687744140625,
0.04180908203125,
0.0428466796875,
0.050262451171875,
-0.031646728515625,
-0.044189453125,
-0.039154052734375,
-0.037200927734375,
0.0128326416015625,
0.01267242431640625,
-0.007053375244140625,
-0.057952880859375,
0.063232421875,
-0.00800323486328125,
0.033233642578125,
0.023193359375,
0.017059326171875,
0.027099609375,
-0.05517578125,
-0.01480865478515625,
-0.0194549560546875,
0.04656982421875,
0.0889892578125,
0.01221466064453125,
0.0260772705078125,
0.01447296142578125,
0.00968170166015625,
0.01447296142578125,
-0.06207275390625,
-0.00333404541015625,
0.03271484375,
-0.045318603515625,
-0.0357666015625,
0.01666259765625,
-0.0682373046875,
-0.029083251953125,
0.004459381103515625,
-0.000033855438232421875,
-0.038177490234375,
-0.01152801513671875,
0.0012407302856445312,
-0.0210418701171875,
0.020263671875,
0.0224151611328125,
-0.03143310546875,
-0.0222015380859375,
0.01544189453125,
0.058502197265625,
0.0231475830078125,
0.006923675537109375,
-0.018524169921875,
0.0196990966796875,
-0.05517578125,
0.06390380859375,
-0.0322265625,
-0.06463623046875,
-0.00506591796875,
0.034820556640625,
0.016693115234375,
-0.031982421875,
0.052001953125,
-0.038299560546875,
0.012298583984375,
0.00844573974609375,
0.0005764961242675781,
-0.0019550323486328125,
0.0105743408203125,
-0.05914306640625,
0.0606689453125,
0.02508544921875,
-0.038604736328125,
0.05535888671875,
-0.031036376953125,
0.005489349365234375,
0.01336669921875,
-0.0168304443359375,
-0.0254058837890625,
0.022308349609375,
0.0037841796875,
-0.0256805419921875,
0.007038116455078125,
-0.032989501953125,
-0.064697265625,
-0.04144287109375,
0.0152435302734375,
0.0234527587890625,
0.07025146484375,
0.0418701171875,
-0.0017719268798828125,
-0.01108551025390625,
-0.08050537109375,
0.038177490234375,
0.040313720703125,
0.019256591796875,
-0.0088043212890625,
-0.05621337890625,
0.01580810546875,
0.050537109375,
0.03192138671875,
-0.049468994140625,
0.020416259765625,
0.0054779052734375,
0.002315521240234375,
0.0289154052734375,
-0.0013551712036132812,
0.03851318359375,
-0.03509521484375,
0.0614013671875,
-0.0160369873046875,
0.03289794921875,
-0.0162200927734375,
-0.022857666015625,
-0.06884765625,
-0.007595062255859375,
-0.0179595947265625,
0.021942138671875,
-0.04388427734375,
0.0089111328125,
-0.00036454200744628906,
-0.059661865234375,
-0.0159759521484375,
-0.020233154296875,
0.012939453125,
-0.0022678375244140625,
-0.01532745361328125,
-0.0214385986328125,
-0.04205322265625,
-0.05841064453125,
0.02239990234375,
-0.03265380859375,
-0.02081298828125,
0.0292510986328125,
0.043060302734375,
-0.01522064208984375,
0.0458984375,
-0.042633056640625,
-0.0173797607421875,
-0.017120361328125,
-0.0158233642578125,
0.06524658203125,
0.03314208984375,
0.061981201171875,
-0.0521240234375,
-0.04180908203125,
-0.0126190185546875,
-0.069091796875,
0.005035400390625,
0.018096923828125,
-0.015960693359375,
0.004894256591796875,
0.042510986328125,
-0.06890869140625,
0.0243988037109375,
0.0041046142578125,
-0.056304931640625,
0.0302276611328125,
-0.042694091796875,
0.0362548828125,
-0.058624267578125,
0.01064300537109375,
-0.005298614501953125,
-0.029022216796875,
-0.02496337890625,
0.006336212158203125,
-0.00927734375,
-0.039398193359375,
-0.04052734375,
0.044708251953125,
-0.0302276611328125,
-0.00402069091796875,
-0.033599853515625,
0.0042877197265625,
0.0162811279296875,
-0.012481689453125,
0.00861358642578125,
0.03399658203125,
0.03265380859375,
-0.0220184326171875,
0.01399993896484375,
0.043304443359375,
-0.0178680419921875,
0.0654296875,
-0.06488037109375,
0.0196075439453125,
-0.0166168212890625,
0.01125335693359375,
-0.08441162109375,
-0.053802490234375,
0.044219970703125,
-0.0242767333984375,
-0.0024089813232421875,
0.0016002655029296875,
-0.04986572265625,
-0.059051513671875,
-0.03814697265625,
0.0257720947265625,
0.03936767578125,
-0.024505615234375,
0.01284027099609375,
0.00756072998046875,
-0.025543212890625,
0.0021457672119140625,
-0.031982421875,
0.01372528076171875,
-0.0109405517578125,
-0.063232421875,
0.044830322265625,
-0.0022563934326171875,
-0.041534423828125,
-0.00567626953125,
0.02935791015625,
-0.01470947265625,
-0.04595947265625,
0.038604736328125,
0.063232421875,
-0.01018524169921875,
-0.01457977294921875,
-0.01479339599609375,
-0.00605010986328125,
-0.0162811279296875,
0.0022869110107421875,
0.04901123046875,
-0.03350830078125,
0.0006909370422363281,
-0.06109619140625,
0.059051513671875,
0.056610107421875,
0.036346435546875,
0.01947021484375,
0.0509033203125,
-0.01194000244140625,
0.01129150390625,
-0.006855010986328125,
-0.017791748046875,
-0.034637451171875,
0.00969696044921875,
-0.011627197265625,
-0.025390625,
0.006046295166015625,
0.033294677734375,
-0.045623779296875,
0.036224365234375,
0.041961669921875,
-0.0572509765625,
0.0765380859375,
0.07525634765625,
0.0151214599609375,
0.042877197265625,
-0.05047607421875,
0.0135345458984375,
-0.01544952392578125,
-0.0202789306640625,
-0.033538818359375,
0.0012197494506835938,
-0.064453125,
-0.04620361328125,
0.032318115234375,
-0.00440216064453125,
-0.05718994140625,
0.038330078125,
-0.00714874267578125,
0.032470703125,
0.044891357421875,
0.007701873779296875,
0.0312042236328125,
-0.0128326416015625,
0.0007624626159667969,
-0.006557464599609375,
-0.0248565673828125,
-0.053375244140625,
0.04803466796875,
-0.00176239013671875,
0.039337158203125,
-0.004543304443359375,
0.046173095703125,
0.0152740478515625,
-0.033538818359375,
0.00311279296875,
0.053375244140625,
-0.0078887939453125,
-0.0902099609375,
0.010650634765625,
0.0109405517578125,
-0.047027587890625,
-0.01071929931640625,
-0.033355712890625,
-0.049072265625,
0.04107666015625,
0.0036773681640625,
-0.07135009765625,
-0.0019445419311523438,
-0.056976318359375,
0.052581787109375,
-0.028350830078125,
-0.057403564453125,
-0.00955963134765625,
-0.051727294921875,
0.0194549560546875,
0.00121307373046875,
0.03607177734375,
0.002040863037109375,
-0.01239013671875,
0.021148681640625,
-0.0228271484375,
0.04522705078125,
0.003665924072265625,
0.0260162353515625,
0.022857666015625,
0.006298065185546875,
0.0190582275390625,
0.0147857666015625,
0.0100250244140625,
0.0029087066650390625,
0.01110076904296875,
-0.01151275634765625,
-0.0231170654296875,
0.049560546875,
-0.05462646484375,
-0.0296173095703125,
-0.041473388671875,
-0.046783447265625,
0.02008056640625,
0.0178985595703125,
0.061309814453125,
0.0008435249328613281,
-0.01213836669921875,
-0.0250701904296875,
0.004512786865234375,
-0.0204010009765625,
0.07501220703125,
0.0247344970703125,
-0.0562744140625,
-0.021942138671875,
0.052703857421875,
0.00949859619140625,
0.0143280029296875,
-0.00414276123046875,
0.005462646484375,
-0.0287628173828125,
-0.0102691650390625,
-0.013519287109375,
0.032928466796875,
-0.0404052734375,
0.0179901123046875,
-0.0230712890625,
0.0203704833984375,
-0.0050048828125,
-0.040374755859375,
-0.0293731689453125,
-0.050933837890625,
-0.0302276611328125,
0.0001252889633178711,
0.015380859375,
0.0870361328125,
-0.007282257080078125,
0.006237030029296875,
-0.0521240234375,
0.025970458984375,
0.0216064453125,
0.00247955322265625,
-0.04034423828125,
-0.0308837890625,
-0.0000928640365600586,
-0.03265380859375,
-0.056060791015625,
-0.01450347900390625,
0.048248291015625,
-0.00296783447265625,
0.030242919921875,
0.014862060546875,
0.0084991455078125,
0.032989501953125,
-0.0288543701171875,
0.0740966796875,
0.03668212890625,
-0.035247802734375,
0.047119140625,
-0.04376220703125,
0.033233642578125,
0.033782958984375,
0.01113128662109375,
-0.0140228271484375,
-0.00897216796875,
-0.07843017578125,
-0.04058837890625,
0.054840087890625,
0.0289764404296875,
0.02227783203125,
0.0235595703125,
0.007198333740234375,
-0.00838470458984375,
0.03851318359375,
-0.052459716796875,
-0.03302001953125,
-0.03558349609375,
0.005908966064453125,
0.00405120849609375,
-0.00225067138671875,
0.0096435546875,
-0.020050048828125,
0.02789306640625,
-0.0123138427734375,
0.032562255859375,
0.00600433349609375,
0.0282135009765625,
-0.0253143310546875,
-0.0008339881896972656,
0.0194091796875,
0.035308837890625,
-0.043914794921875,
-0.01471710205078125,
-0.015655517578125,
-0.01171112060546875,
0.041717529296875,
0.0031642913818359375,
-0.028594970703125,
0.0247802734375,
0.0155181884765625,
0.034637451171875,
0.01300811767578125,
-0.04351806640625,
0.0241546630859375,
0.01389312744140625,
0.03656005859375,
-0.04052734375,
0.036346435546875,
0.005207061767578125,
0.042083740234375,
0.027099609375,
0.01015472412109375,
0.0233917236328125,
-0.06524658203125,
0.00310516357421875,
0.025787353515625,
-0.05145263671875,
-0.0294189453125,
0.0760498046875,
-0.015350341796875,
-0.0295257568359375,
0.037109375,
-0.0189971923828125,
-0.022308349609375,
0.046905517578125,
0.0321044921875,
0.046966552734375,
0.01059722900390625,
0.0374755859375,
0.069091796875,
-0.00673675537109375,
0.041259765625,
0.044036865234375,
0.019866943359375,
-0.0291290283203125,
0.0401611328125,
-0.0260772705078125,
-0.033233642578125,
-0.02008056640625,
-0.00830841064453125,
0.0684814453125,
-0.039398193359375,
0.0002560615539550781,
0.004299163818359375,
0.0113067626953125,
-0.041168212890625,
0.03900146484375,
0.043243408203125,
0.076904296875,
-0.045440673828125,
0.063232421875,
0.04107666015625,
-0.07647705078125,
-0.038543701171875,
-0.0138092041015625,
0.031890869140625,
-0.057525634765625,
0.051666259765625,
0.0287017822265625,
-0.00888824462890625,
-0.0282135009765625,
-0.0538330078125,
-0.04644775390625,
0.08941650390625,
0.0272216796875,
-0.061676025390625,
-0.01806640625,
0.0121002197265625,
0.020965576171875,
-0.052001953125,
0.01349639892578125,
0.050933837890625,
0.0355224609375,
0.025360107421875,
-0.046142578125,
-0.0244903564453125,
-0.051544189453125,
0.01422119140625,
-0.0284881591796875,
-0.06890869140625,
0.080810546875,
-0.06756591796875,
-0.0212554931640625,
0.0643310546875,
0.045745849609375,
0.07147216796875,
0.043304443359375,
0.068359375,
0.02728271484375,
0.03363037109375,
-0.01071929931640625,
0.1373291015625,
0.0031070709228515625,
0.0223541259765625,
0.05609130859375,
-0.0208740234375,
0.03167724609375,
0.04034423828125,
-0.04217529296875,
0.039337158203125,
0.08111572265625,
-0.0206451416015625,
0.047576904296875,
-0.021942138671875,
-0.02130126953125,
-0.0257720947265625,
-0.0465087890625,
-0.0075836181640625,
0.015350341796875,
-0.012451171875,
-0.0254669189453125,
-0.028594970703125,
0.0121002197265625,
-0.00521087646484375,
0.0188446044921875,
-0.0384521484375,
0.022918701171875,
0.01316070556640625,
-0.0311431884765625,
0.0301971435546875,
-0.037506103515625,
0.00014972686767578125,
-0.0321044921875,
-0.014404296875,
-0.015899658203125,
0.0200653076171875,
-0.01079559326171875,
-0.08990478515625,
0.00611114501953125,
-0.00013113021850585938,
-0.045684814453125,
-0.005680084228515625,
0.034454345703125,
-0.00748443603515625,
-0.091064453125,
0.0181884765625,
-0.0025539398193359375,
0.00655364990234375,
0.01824951171875,
-0.051239013671875,
0.0245513916015625,
0.01120758056640625,
0.0247802734375,
-0.006999969482421875,
0.0023059844970703125,
-0.01256561279296875,
0.034881591796875,
0.0246429443359375,
-0.016082763671875,
-0.004840850830078125,
-0.01248931884765625,
0.04901123046875,
-0.0528564453125,
-0.0557861328125,
-0.04168701171875,
0.051116943359375,
-0.024566650390625,
-0.035858154296875,
0.057525634765625,
0.030487060546875,
0.04052734375,
-0.03924560546875,
0.042022705078125,
-0.0284423828125,
0.029205322265625,
-0.037109375,
0.031951904296875,
-0.060882568359375,
-0.005588531494140625,
-0.025115966796875,
-0.048370361328125,
-0.022979736328125,
0.0322265625,
-0.002223968505859375,
0.028717041015625,
0.03961181640625,
0.07354736328125,
0.00702667236328125,
-0.00128173828125,
0.0174560546875,
0.004291534423828125,
0.020477294921875,
0.01387786865234375,
0.0775146484375,
-0.036712646484375,
0.033660888671875,
-0.034088134765625,
-0.004573822021484375,
-0.0361328125,
-0.06719970703125,
-0.0374755859375,
-0.031005859375,
-0.032012939453125,
-0.0267333984375,
0.0005145072937011719,
0.059783935546875,
0.08209228515625,
-0.038177490234375,
-0.01242828369140625,
-0.0036411285400390625,
0.0135955810546875,
0.0161895751953125,
-0.0092010498046875,
0.00353240966796875,
0.035858154296875,
-0.0855712890625,
0.0231170654296875,
0.0211944580078125,
0.05145263671875,
-0.0130615234375,
0.01468658447265625,
-0.01122283935546875,
-0.01247406005859375,
0.00664520263671875,
0.0142822265625,
-0.0152740478515625,
-0.0374755859375,
-0.0023250579833984375,
-0.00266265869140625,
0.0146026611328125,
0.0274810791015625,
-0.03936767578125,
0.0217132568359375,
0.027008056640625,
-0.04278564453125,
0.041046142578125,
-0.005168914794921875,
0.060455322265625,
-0.0176239013671875,
0.027801513671875,
-0.01280975341796875,
0.050140380859375,
0.038238525390625,
-0.036346435546875,
0.043701171875,
0.03515625,
-0.02239990234375,
-0.053619384765625,
0.007244110107421875,
-0.09619140625,
-0.0089111328125,
0.0614013671875,
0.029083251953125,
-0.0330810546875,
-0.004245758056640625,
-0.041015625,
-0.013946533203125,
-0.0182342529296875,
0.024017333984375,
0.041046142578125,
-0.004665374755859375,
-0.0272216796875,
-0.038116455078125,
-0.0152740478515625,
-0.0197296142578125,
-0.033203125,
-0.0256500244140625,
0.038177490234375,
0.034454345703125,
0.05987548828125,
0.03509521484375,
-0.0297088623046875,
0.040283203125,
0.027435302734375,
0.027313232421875,
0.009857177734375,
-0.0125579833984375,
-0.00978851318359375,
0.022064208984375,
-0.01311492919921875,
-0.042877197265625
]
] |
timm/mobilenetv3_small_075.lamb_in1k | 2023-04-27T22:49:32.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"arxiv:2110.00476",
"arxiv:1905.02244",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/mobilenetv3_small_075.lamb_in1k | 0 | 31,811 | timm | 2022-12-16T05:38:29 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
datasets:
- imagenet-1k
---
# Model card for mobilenetv3_small_075.lamb_in1k
A MobileNet-v3 image classification model. Trained on ImageNet-1k in `timm` using the recipe template described below.
Recipe details:
* A LAMB optimizer recipe that is similar to [ResNet Strikes Back](https://arxiv.org/abs/2110.00476) `A2` but 50% longer with EMA weight averaging, no CutMix
* Cosine LR schedule with warmup
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 2.0
- GMACs: 0.0
- Activations (M): 1.3
- Image size: 224 x 224
- **Papers:**
- Searching for MobileNetV3: https://arxiv.org/abs/1905.02244
- **Dataset:** ImageNet-1k
- **Original:** https://github.com/huggingface/pytorch-image-models
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('mobilenetv3_small_075.lamb_in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
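The final line above turns raw logits into percentage scores and keeps the five highest. A minimal plain-Python sketch of that `softmax` + `topk` step, using toy logits so no model (or torch) is needed:

```python
import math

def softmax(xs):
    # numerically stable softmax over a list of logits
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def topk(xs, k):
    # values and indices of the k largest entries, like torch.topk
    idx = sorted(range(len(xs)), key=lambda i: xs[i], reverse=True)[:k]
    return [xs[i] for i in idx], idx

logits = [2.0, 1.0, 0.5, -1.0, 3.0, 0.0]    # stand-in for one row of model output
probs = [p * 100 for p in softmax(logits)]  # percentages, like output.softmax(dim=1) * 100
top5_probabilities, top5_class_indices = topk(probs, 5)
print(top5_class_indices[0])  # index of the most likely class -> 4
```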
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'mobilenetv3_small_075.lamb_in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g.:
# torch.Size([1, 16, 112, 112])
# torch.Size([1, 16, 56, 56])
# torch.Size([1, 24, 28, 28])
# torch.Size([1, 40, 14, 14])
# torch.Size([1, 432, 7, 7])
print(o.shape)
```
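The shapes in the comment above follow the usual backbone pattern: each feature level halves the spatial resolution, so level *k* sits at stride 2^k relative to the 224 x 224 input. A quick sketch that reproduces those shapes arithmetically (channel counts copied from the comment; no model needed):

```python
img_size = 224
channels = [16, 16, 24, 40, 432]  # per-level channel counts from the comment above
strides = [2, 4, 8, 16, 32]       # spatial reduction factor of each feature level

shapes = [(1, c, img_size // s, img_size // s) for c, s in zip(channels, strides)]
for shape in shapes:
    print(shape)  # matches the torch.Size values printed by the loop above
```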
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'mobilenetv3_small_075.lamb_in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 432, 7, 7) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
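`forward_head(..., pre_logits=True)` begins by globally pooling the unpooled `(1, 432, 7, 7)` map before the head's remaining layers produce the `(1, num_features)` embedding. A toy sketch of that global-average-pooling step alone, with small plain-Python lists standing in for an `(N, C, H, W)` tensor:

```python
# toy "feature map": batch=1, C=3, H=W=2, values chosen so the means are easy to check
fmap = [[[[1.0, 3.0], [5.0, 7.0]],
         [[2.0, 2.0], [2.0, 2.0]],
         [[0.0, 4.0], [4.0, 8.0]]]]

def global_avg_pool(x):
    # (N, C, H, W) -> (N, C): average each channel's spatial grid
    return [[sum(sum(row) for row in ch) / (len(ch) * len(ch[0])) for ch in n]
            for n in x]

pooled = global_avg_pool(fmap)
print(pooled)  # [[4.0, 2.0, 4.0]]
```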
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@misc{rw2019timm,
author = {Ross Wightman},
title = {PyTorch Image Models},
year = {2019},
publisher = {GitHub},
journal = {GitHub repository},
doi = {10.5281/zenodo.4414861},
howpublished = {\url{https://github.com/huggingface/pytorch-image-models}}
}
```
```bibtex
@inproceedings{howard2019searching,
title={Searching for mobilenetv3},
author={Howard, Andrew and Sandler, Mark and Chu, Grace and Chen, Liang-Chieh and Chen, Bo and Tan, Mingxing and Wang, Weijun and Zhu, Yukun and Pang, Ruoming and Vasudevan, Vijay and others},
booktitle={Proceedings of the IEEE/CVF international conference on computer vision},
pages={1314--1324},
year={2019}
}
```
| 4,427 | [
[
-0.0335693359375,
-0.02532958984375,
-0.00217437744140625,
0.01235198974609375,
-0.0258636474609375,
-0.03216552734375,
-0.0034122467041015625,
-0.0244140625,
0.025115966796875,
0.031646728515625,
-0.0258636474609375,
-0.055694580078125,
-0.04132080078125,
-0.01165771484375,
-0.00724029541015625,
0.06549072265625,
-0.007053375244140625,
-0.005344390869140625,
-0.00540924072265625,
-0.04693603515625,
-0.01473236083984375,
-0.0214080810546875,
-0.06463623046875,
-0.032470703125,
0.032867431640625,
0.0229644775390625,
0.042572021484375,
0.056915283203125,
0.04449462890625,
0.032135009765625,
-0.00702667236328125,
0.005260467529296875,
-0.00726318359375,
-0.01904296875,
0.034332275390625,
-0.04742431640625,
-0.0382080078125,
0.0250701904296875,
0.045166015625,
0.0185394287109375,
0.0006899833679199219,
0.0380859375,
0.0035343170166015625,
0.049896240234375,
-0.0266571044921875,
0.000682830810546875,
-0.036285400390625,
0.0142669677734375,
-0.01271820068359375,
0.0013751983642578125,
-0.0146484375,
-0.03521728515625,
0.0097198486328125,
-0.0290374755859375,
0.0300140380859375,
0.002498626708984375,
0.0987548828125,
0.01155853271484375,
-0.01377105712890625,
-0.0010690689086914062,
-0.01666259765625,
0.058441162109375,
-0.058929443359375,
0.01473236083984375,
0.0323486328125,
0.010406494140625,
-0.004276275634765625,
-0.0677490234375,
-0.047576904296875,
-0.01323699951171875,
-0.0033721923828125,
0.000152587890625,
-0.01317596435546875,
-0.008392333984375,
0.01885986328125,
0.0230560302734375,
-0.0361328125,
0.0078887939453125,
-0.042999267578125,
-0.017578125,
0.0460205078125,
0.0048828125,
0.0287017822265625,
-0.0248565673828125,
-0.03271484375,
-0.02691650390625,
-0.033233642578125,
0.0318603515625,
0.0187530517578125,
0.0145416259765625,
-0.05242919921875,
0.039276123046875,
0.00647735595703125,
0.0494384765625,
-0.0029125213623046875,
-0.0310516357421875,
0.049896240234375,
-0.004253387451171875,
-0.031646728515625,
-0.003147125244140625,
0.087646484375,
0.04095458984375,
0.00832366943359375,
0.0179290771484375,
-0.007476806640625,
-0.0287017822265625,
-0.00437164306640625,
-0.08734130859375,
-0.0160064697265625,
0.027923583984375,
-0.06524658203125,
-0.0372314453125,
0.01953125,
-0.040313720703125,
-0.01116943359375,
0.006603240966796875,
0.033203125,
-0.02996826171875,
-0.028900146484375,
0.005275726318359375,
-0.01081085205078125,
0.03143310546875,
0.0068359375,
-0.042449951171875,
0.01294708251953125,
0.01470184326171875,
0.09344482421875,
0.0085906982421875,
-0.03240966796875,
-0.02069091796875,
-0.02142333984375,
-0.0177764892578125,
0.029205322265625,
-0.004528045654296875,
-0.016845703125,
-0.025115966796875,
0.026092529296875,
-0.017822265625,
-0.056976318359375,
0.0281524658203125,
-0.019287109375,
0.01323699951171875,
0.0017566680908203125,
-0.004261016845703125,
-0.04534912109375,
0.0207366943359375,
-0.038970947265625,
0.10321044921875,
0.018035888671875,
-0.06524658203125,
0.018402099609375,
-0.042694091796875,
-0.016021728515625,
-0.031402587890625,
0.0041046142578125,
-0.08062744140625,
-0.009063720703125,
0.017913818359375,
0.06695556640625,
-0.0282135009765625,
-0.009796142578125,
-0.043609619140625,
-0.02178955078125,
0.022064208984375,
0.011138916015625,
0.07464599609375,
0.016510009765625,
-0.039306640625,
0.0153045654296875,
-0.048858642578125,
0.017120361328125,
0.038360595703125,
-0.02032470703125,
-0.008758544921875,
-0.034576416015625,
0.00971221923828125,
0.0250091552734375,
0.0034809112548828125,
-0.04144287109375,
0.0164337158203125,
-0.01198577880859375,
0.039794921875,
0.031982421875,
-0.0138092041015625,
0.0274200439453125,
-0.034912109375,
0.0190582275390625,
0.022705078125,
0.0189361572265625,
-0.0075531005859375,
-0.045501708984375,
-0.060821533203125,
-0.031646728515625,
0.026824951171875,
0.038055419921875,
-0.034515380859375,
0.02691650390625,
-0.011260986328125,
-0.06304931640625,
-0.034210205078125,
0.00754547119140625,
0.037811279296875,
0.038299560546875,
0.023834228515625,
-0.036041259765625,
-0.039947509765625,
-0.0699462890625,
-0.0026340484619140625,
0.0006341934204101562,
0.0004019737243652344,
0.030548095703125,
0.0523681640625,
-0.01317596435546875,
0.0478515625,
-0.024383544921875,
-0.022979736328125,
-0.01727294921875,
0.00927734375,
0.0303497314453125,
0.062347412109375,
0.059722900390625,
-0.061431884765625,
-0.03436279296875,
0.0011796951293945312,
-0.0704345703125,
0.01210784912109375,
-0.00701141357421875,
-0.005397796630859375,
0.0190582275390625,
0.01529693603515625,
-0.0469970703125,
0.053375244140625,
0.0193939208984375,
-0.01611328125,
0.03021240234375,
-0.00726318359375,
0.0206146240234375,
-0.09100341796875,
0.005771636962890625,
0.0382080078125,
-0.013702392578125,
-0.02862548828125,
-0.00046253204345703125,
0.005802154541015625,
-0.00644683837890625,
-0.045806884765625,
0.0538330078125,
-0.039581298828125,
-0.0183258056640625,
-0.01361846923828125,
-0.00732421875,
-0.0003070831298828125,
0.046478271484375,
-0.0120697021484375,
0.03564453125,
0.051605224609375,
-0.044464111328125,
0.035919189453125,
0.026092529296875,
-0.010711669921875,
0.020904541015625,
-0.054962158203125,
0.01129913330078125,
0.007472991943359375,
0.0223541259765625,
-0.059814453125,
-0.019439697265625,
0.026611328125,
-0.04718017578125,
0.029327392578125,
-0.045379638671875,
-0.02978515625,
-0.04693603515625,
-0.0440673828125,
0.033782958984375,
0.045806884765625,
-0.051971435546875,
0.0426025390625,
0.022705078125,
0.0247955322265625,
-0.04119873046875,
-0.0599365234375,
-0.0208587646484375,
-0.03619384765625,
-0.05792236328125,
0.03546142578125,
0.022216796875,
0.00510406494140625,
0.00463104248046875,
-0.01247406005859375,
-0.0137481689453125,
-0.01143646240234375,
0.049591064453125,
0.0272216796875,
-0.021728515625,
-0.0180206298828125,
-0.031829833984375,
-0.0011920928955078125,
-0.0014333724975585938,
-0.0236968994140625,
0.044708251953125,
-0.03131103515625,
-0.004474639892578125,
-0.07275390625,
-0.01265716552734375,
0.040069580078125,
-0.01079559326171875,
0.058074951171875,
0.08831787109375,
-0.03546142578125,
0.00988006591796875,
-0.0372314453125,
-0.0103302001953125,
-0.0361328125,
0.02386474609375,
-0.035888671875,
-0.033050537109375,
0.07135009765625,
0.0031223297119140625,
-0.0033740997314453125,
0.050140380859375,
0.0266571044921875,
-0.00630950927734375,
0.06378173828125,
0.041229248046875,
0.012969970703125,
0.050445556640625,
-0.0650634765625,
-0.0166778564453125,
-0.07135009765625,
-0.04547119140625,
-0.0328369140625,
-0.03546142578125,
-0.059417724609375,
-0.0302581787109375,
0.030609130859375,
0.016357421875,
-0.033599853515625,
0.040374755859375,
-0.0552978515625,
0.00730133056640625,
0.054229736328125,
0.049102783203125,
-0.03021240234375,
0.0201873779296875,
-0.02691650390625,
0.002227783203125,
-0.056304931640625,
-0.02410888671875,
0.0831298828125,
0.033477783203125,
0.041046142578125,
-0.008026123046875,
0.054840087890625,
-0.0183258056640625,
0.02099609375,
-0.04962158203125,
0.044158935546875,
-0.004302978515625,
-0.028900146484375,
-0.0019702911376953125,
-0.034027099609375,
-0.08062744140625,
0.01274871826171875,
-0.0217437744140625,
-0.0626220703125,
0.00946044921875,
0.01227569580078125,
-0.019317626953125,
0.055908203125,
-0.062225341796875,
0.07049560546875,
-0.0081939697265625,
-0.040618896484375,
0.0109100341796875,
-0.0533447265625,
0.02691650390625,
0.01520538330078125,
-0.00984954833984375,
-0.01183319091796875,
0.0105438232421875,
0.0823974609375,
-0.049591064453125,
0.0556640625,
-0.03399658203125,
0.028594970703125,
0.0433349609375,
-0.006683349609375,
0.028900146484375,
-0.003925323486328125,
-0.0165557861328125,
0.02178955078125,
0.006267547607421875,
-0.039093017578125,
-0.03759765625,
0.047760009765625,
-0.06671142578125,
-0.0229644775390625,
-0.028045654296875,
-0.025054931640625,
0.01470184326171875,
0.013336181640625,
0.0384521484375,
0.0499267578125,
0.021331787109375,
0.0228729248046875,
0.039306640625,
-0.037109375,
0.03924560546875,
-0.0101776123046875,
-0.02117919921875,
-0.037841796875,
0.0677490234375,
0.0026302337646484375,
0.0007138252258300781,
0.00463104248046875,
0.0173187255859375,
-0.02496337890625,
-0.04656982421875,
-0.0274505615234375,
0.0233001708984375,
-0.040435791015625,
-0.03515625,
-0.0445556640625,
-0.0313720703125,
-0.02496337890625,
-0.006103515625,
-0.0416259765625,
-0.02166748046875,
-0.033599853515625,
0.0242156982421875,
0.05072021484375,
0.04339599609375,
-0.00977325439453125,
0.05029296875,
-0.054168701171875,
0.0157012939453125,
0.004268646240234375,
0.03387451171875,
-0.01007843017578125,
-0.058319091796875,
-0.0197296142578125,
-0.0010395050048828125,
-0.035980224609375,
-0.051300048828125,
0.040740966796875,
0.00701141357421875,
0.030181884765625,
0.024688720703125,
-0.022003173828125,
0.059417724609375,
-0.00244903564453125,
0.046966552734375,
0.040008544921875,
-0.04803466796875,
0.0496826171875,
-0.0167388916015625,
0.017578125,
0.006320953369140625,
0.02960205078125,
-0.013671875,
0.00870513916015625,
-0.06427001953125,
-0.054595947265625,
0.060699462890625,
0.01477813720703125,
-0.003017425537109375,
0.02783203125,
0.05279541015625,
-0.0071868896484375,
-0.0019779205322265625,
-0.06390380859375,
-0.0304107666015625,
-0.0316162109375,
-0.0199432373046875,
0.0148162841796875,
-0.008636474609375,
-0.0013666152954101562,
-0.055999755859375,
0.05169677734375,
-0.001720428466796875,
0.05816650390625,
0.0279083251953125,
0.0024433135986328125,
0.006412506103515625,
-0.0330810546875,
0.045166015625,
0.0193939208984375,
-0.02935791015625,
0.001888275146484375,
0.01198577880859375,
-0.05126953125,
0.01152801513671875,
0.005023956298828125,
0.00601959228515625,
0.0003883838653564453,
0.024200439453125,
0.0665283203125,
-0.0036907196044921875,
0.00585174560546875,
0.036041259765625,
-0.0092010498046875,
-0.035980224609375,
-0.022125244140625,
0.01055145263671875,
-0.0023784637451171875,
0.0322265625,
0.03021240234375,
0.02862548828125,
-0.007289886474609375,
-0.0168914794921875,
0.017791748046875,
0.034393310546875,
-0.0214080810546875,
-0.0229339599609375,
0.053314208984375,
-0.00955963134765625,
-0.0171661376953125,
0.0535888671875,
-0.0119171142578125,
-0.038360595703125,
0.08099365234375,
0.03515625,
0.06494140625,
-0.0075531005859375,
0.0051422119140625,
0.061309814453125,
0.0204010009765625,
-0.007450103759765625,
0.0197296142578125,
0.01457977294921875,
-0.05523681640625,
0.00754547119140625,
-0.040740966796875,
0.00824737548828125,
0.03448486328125,
-0.047576904296875,
0.029205322265625,
-0.05029296875,
-0.036407470703125,
0.018524169921875,
0.020721435546875,
-0.06829833984375,
0.023406982421875,
-0.0104522705078125,
0.065673828125,
-0.043121337890625,
0.058990478515625,
0.06573486328125,
-0.035919189453125,
-0.0819091796875,
0.0002123117446899414,
0.007740020751953125,
-0.07061767578125,
0.05078125,
0.04034423828125,
0.00139617919921875,
0.007415771484375,
-0.0599365234375,
-0.048583984375,
0.1021728515625,
0.0270233154296875,
-0.003353118896484375,
0.0251312255859375,
-0.01080322265625,
0.00469207763671875,
-0.03753662109375,
0.041046142578125,
0.01507568359375,
0.022979736328125,
0.0216064453125,
-0.060821533203125,
0.0183563232421875,
-0.0299072265625,
0.014984130859375,
0.01462554931640625,
-0.0653076171875,
0.060089111328125,
-0.041473388671875,
-0.00870513916015625,
0.0028743743896484375,
0.04425048828125,
0.0176239013671875,
0.023681640625,
0.03509521484375,
0.0555419921875,
0.036407470703125,
-0.018035888671875,
0.066650390625,
0.0021419525146484375,
0.03863525390625,
0.043426513671875,
0.0175018310546875,
0.045989990234375,
0.02655029296875,
-0.0157928466796875,
0.031524658203125,
0.09228515625,
-0.0230560302734375,
0.0207977294921875,
0.0133056640625,
-0.0029506683349609375,
0.00041556358337402344,
0.005916595458984375,
-0.03485107421875,
0.05389404296875,
0.01314544677734375,
-0.042388916015625,
-0.00836944580078125,
0.005046844482421875,
0.006084442138671875,
-0.0271453857421875,
-0.02117919921875,
0.03204345703125,
0.0013532638549804688,
-0.0246429443359375,
0.078125,
0.017669677734375,
0.0650634765625,
-0.0195465087890625,
0.00508880615234375,
-0.0211944580078125,
0.00555419921875,
-0.033416748046875,
-0.049591064453125,
0.02142333984375,
-0.0212249755859375,
-0.0029850006103515625,
0.00732421875,
0.051116943359375,
-0.009796142578125,
-0.027496337890625,
0.0098114013671875,
0.021453857421875,
0.0406494140625,
0.002010345458984375,
-0.09130859375,
0.022674560546875,
0.0097198486328125,
-0.0465087890625,
0.0205078125,
0.023101806640625,
0.00432586669921875,
0.0635986328125,
0.047821044921875,
-0.0099639892578125,
0.01329803466796875,
-0.0233001708984375,
0.06256103515625,
-0.052703857421875,
-0.0174560546875,
-0.064697265625,
0.045745849609375,
-0.0144500732421875,
-0.044158935546875,
0.041351318359375,
0.0521240234375,
0.063232421875,
0.0021114349365234375,
0.03912353515625,
-0.023468017578125,
-0.004497528076171875,
-0.038299560546875,
0.050018310546875,
-0.06256103515625,
0.00333404541015625,
-0.006710052490234375,
-0.051788330078125,
-0.0282135009765625,
0.059600830078125,
-0.018402099609375,
0.0296630859375,
0.042449951171875,
0.0787353515625,
-0.03302001953125,
-0.01439666748046875,
0.00792694091796875,
-0.0004665851593017578,
-0.00450897216796875,
0.023345947265625,
0.03179931640625,
-0.067138671875,
0.02752685546875,
-0.041473388671875,
-0.01535797119140625,
-0.01763916015625,
-0.055816650390625,
-0.07708740234375,
-0.065673828125,
-0.03863525390625,
-0.06060791015625,
-0.0166778564453125,
0.0699462890625,
0.0859375,
-0.043609619140625,
-0.0157318115234375,
0.00439453125,
0.01690673828125,
-0.01168060302734375,
-0.0168609619140625,
0.0438232421875,
-0.0027027130126953125,
-0.051116943359375,
-0.0164642333984375,
-0.002452850341796875,
0.034820556640625,
0.0116424560546875,
-0.015777587890625,
-0.0134124755859375,
-0.0228424072265625,
0.0242156982421875,
0.03289794921875,
-0.048675537109375,
-0.00724029541015625,
-0.0182342529296875,
-0.015838623046875,
0.0310211181640625,
0.042083740234375,
-0.036041259765625,
0.0152130126953125,
0.01617431640625,
0.02618408203125,
0.07049560546875,
-0.0207366943359375,
0.005950927734375,
-0.058990478515625,
0.048583984375,
-0.01258087158203125,
0.029052734375,
0.028656005859375,
-0.0223236083984375,
0.0452880859375,
0.0289306640625,
-0.032257080078125,
-0.06683349609375,
-0.00778961181640625,
-0.08392333984375,
-0.0029888153076171875,
0.07598876953125,
-0.020294189453125,
-0.0445556640625,
0.028472900390625,
-0.0007243156433105469,
0.048095703125,
-0.01003265380859375,
0.033905029296875,
0.01264190673828125,
-0.01532745361328125,
-0.050689697265625,
-0.055816650390625,
0.035369873046875,
0.00933074951171875,
-0.047210693359375,
-0.04437255859375,
-0.004253387451171875,
0.05029296875,
0.0158843994140625,
0.042816162109375,
-0.01357269287109375,
0.00795745849609375,
0.005138397216796875,
0.038726806640625,
-0.0282745361328125,
0.001087188720703125,
-0.0154571533203125,
-0.0012035369873046875,
-0.0158233642578125,
-0.054901123046875
]
] |
Hello-SimpleAI/chatgpt-qa-detector-roberta | 2023-01-19T11:02:25.000Z | [
"transformers",
"pytorch",
"roberta",
"text-classification",
"chatgpt",
"en",
"dataset:Hello-SimpleAI/HC3",
"arxiv:2301.07597",
"endpoints_compatible",
"region:us",
"has_space"
] | text-classification | Hello-SimpleAI | null | null | Hello-SimpleAI/chatgpt-qa-detector-roberta | 1 | 31,805 | transformers | 2023-01-19T10:29:39 | ---
datasets:
- Hello-SimpleAI/HC3
language:
- en
pipeline_tag: text-classification
tags:
- chatgpt
---
# Model Card for `Hello-SimpleAI/chatgpt-qa-detector-roberta`
This model is trained on `question-answer` pairs built from **the filtered full texts** in [Hello-SimpleAI/HC3](https://huggingface.co/datasets/Hello-SimpleAI/HC3).
For more details, refer to [arxiv: 2301.07597](https://arxiv.org/abs/2301.07597) and the GitHub project [Hello-SimpleAI/chatgpt-comparison-detection](https://github.com/Hello-SimpleAI/chatgpt-comparison-detection).
The base checkpoint is [roberta-base](https://huggingface.co/roberta-base).
We train it on all [Hello-SimpleAI/HC3](https://huggingface.co/datasets/Hello-SimpleAI/HC3) data (with no held-out set) for 1 epoch.
(Training for one epoch is consistent with the experiments in [our paper](https://arxiv.org/abs/2301.07597).)
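## How to use

Below is a minimal, illustrative sketch using the 🤗 `transformers` pipeline. Note that joining the question and answer with a single space is an assumption of this sketch; check the project repository for the exact preprocessing used in training.

```python
def build_input(question: str, answer: str) -> str:
    """Join a question-answer pair into a single input sequence.

    Joining with one space is an assumption of this sketch; see the
    project repository for the exact preprocessing used in training.
    """
    return f"{question} {answer}"


def detect(question: str, answer: str):
    """Classify a QA pair as human- or ChatGPT-written.

    Requires `pip install transformers torch` and network access to
    download the checkpoint on first use.
    """
    from transformers import pipeline  # imported lazily: heavy dependency

    detector = pipeline(
        "text-classification",
        model="Hello-SimpleAI/chatgpt-qa-detector-roberta",
    )
    # Returns a list like [{"label": ..., "score": ...}]
    return detector(build_input(question, answer))
```

Calling `detect("What is the capital of France?", "The capital of France is Paris.")` would return the predicted label and score for that pair.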
## Citation
Check out the paper [arxiv: 2301.07597](https://arxiv.org/abs/2301.07597):
```
@article{guo-etal-2023-hc3,
title = "How Close is ChatGPT to Human Experts? Comparison Corpus, Evaluation, and Detection",
author = "Guo, Biyang and
Zhang, Xin and
Wang, Ziyuan and
Jiang, Minqi and
Nie, Jinran and
Ding, Yuxuan and
Yue, Jianwei and
Wu, Yupeng",
  journal = "arXiv preprint arXiv:2301.07597",
  year = "2023",
}
```
| 1,321 | [
[
-0.036102294921875,
-0.056488037109375,
0.0372314453125,
-0.01026153564453125,
-0.0232391357421875,
-0.0244598388671875,
-0.004032135009765625,
-0.0223846435546875,
-0.0021228790283203125,
0.0290679931640625,
-0.042755126953125,
-0.0304412841796875,
-0.0452880859375,
-0.00782012939453125,
-0.0145263671875,
0.09259033203125,
0.0273590087890625,
0.019683837890625,
-0.00562286376953125,
-0.0220947265625,
-0.0198516845703125,
-0.031982421875,
-0.08087158203125,
-0.026123046875,
0.022247314453125,
0.0215606689453125,
0.03955078125,
0.0241241455078125,
0.024444580078125,
0.0141448974609375,
-0.03753662109375,
0.0033111572265625,
-0.035400390625,
-0.026153564453125,
0.0218963623046875,
-0.03912353515625,
-0.05816650390625,
-0.0006227493286132812,
0.046051025390625,
0.00839996337890625,
-0.00087738037109375,
0.02020263671875,
0.0106964111328125,
0.037445068359375,
-0.0282745361328125,
0.0263824462890625,
-0.04248046875,
-0.0211181640625,
-0.0231170654296875,
-0.006053924560546875,
-0.0340576171875,
-0.0178680419921875,
0.0095367431640625,
-0.033477783203125,
0.0298309326171875,
0.003360748291015625,
0.09814453125,
0.0162200927734375,
-0.050750732421875,
-0.005756378173828125,
-0.038238525390625,
0.049530029296875,
-0.059417724609375,
0.0193939208984375,
0.0229644775390625,
0.0264739990234375,
-0.016998291015625,
-0.044158935546875,
-0.06317138671875,
-0.008453369140625,
0.0007119178771972656,
0.0168304443359375,
-0.021942138671875,
0.0052947998046875,
0.03656005859375,
0.028594970703125,
-0.0723876953125,
-0.00213623046875,
-0.03619384765625,
-0.028778076171875,
0.02911376953125,
0.007045745849609375,
0.01161956787109375,
-0.03887939453125,
-0.0379638671875,
-0.0232391357421875,
-0.04425048828125,
0.0147857666015625,
0.0232696533203125,
0.0109405517578125,
-0.031524658203125,
0.01364898681640625,
-0.035369873046875,
0.048004150390625,
-0.0056610107421875,
-0.0033016204833984375,
0.037017822265625,
-0.0345458984375,
-0.01526641845703125,
-0.0272979736328125,
0.0775146484375,
0.00348663330078125,
0.01335906982421875,
0.00823974609375,
0.01030731201171875,
-0.0004940032958984375,
-0.0130767822265625,
-0.0728759765625,
-0.035369873046875,
0.02691650390625,
-0.0209808349609375,
-0.0283050537109375,
0.006481170654296875,
-0.04632568359375,
0.000911712646484375,
-0.00455474853515625,
0.039764404296875,
-0.034149169921875,
-0.0202178955078125,
0.0005249977111816406,
-0.00655364990234375,
0.06072998046875,
0.011566162109375,
-0.043304443359375,
0.01123809814453125,
0.0477294921875,
0.04962158203125,
0.010223388671875,
-0.0151519775390625,
-0.045562744140625,
-0.01531219482421875,
0.0015268325805664062,
0.07855224609375,
-0.007213592529296875,
-0.01560211181640625,
-0.0244140625,
0.010986328125,
-0.020263671875,
-0.0308380126953125,
0.0780029296875,
-0.049591064453125,
0.050079345703125,
-0.0188751220703125,
-0.03662109375,
-0.0186004638671875,
0.03448486328125,
-0.037353515625,
0.08740234375,
-0.00203704833984375,
-0.0703125,
0.028656005859375,
-0.04412841796875,
0.0063629150390625,
0.0037555694580078125,
-0.00537109375,
-0.060211181640625,
-0.04351806640625,
0.031341552734375,
0.01375579833984375,
-0.0240631103515625,
0.01358795166015625,
-0.0288543701171875,
-0.040283203125,
0.026031494140625,
-0.0296173095703125,
0.0869140625,
0.021392822265625,
-0.0124664306640625,
0.0147705078125,
-0.0625,
0.023529052734375,
0.01136016845703125,
-0.0276031494140625,
-0.0091705322265625,
-0.0168914794921875,
0.0114288330078125,
0.008636474609375,
0.017059326171875,
-0.053131103515625,
0.0184326171875,
-0.01415252685546875,
0.030364990234375,
0.047393798828125,
0.0167236328125,
0.00902557373046875,
-0.044830322265625,
0.0251617431640625,
0.0083770751953125,
0.024169921875,
-0.01293182373046875,
-0.07684326171875,
-0.062042236328125,
-0.019287109375,
0.041900634765625,
0.05865478515625,
-0.050872802734375,
0.047027587890625,
-0.034393310546875,
-0.041412353515625,
-0.038421630859375,
-0.01248931884765625,
0.0369873046875,
0.04193115234375,
0.0255889892578125,
-0.03369140625,
-0.029998779296875,
-0.0635986328125,
-0.0080108642578125,
-0.037139892578125,
-0.019256591796875,
0.0172882080078125,
0.04339599609375,
-0.00914764404296875,
0.0684814453125,
-0.038238525390625,
-0.0220184326171875,
-0.004467010498046875,
0.0265045166015625,
0.0288543701171875,
0.0460205078125,
0.048919677734375,
-0.064453125,
-0.0325927734375,
-0.0297088623046875,
-0.0517578125,
0.002719879150390625,
0.003643035888671875,
-0.0298309326171875,
0.0013914108276367188,
0.007335662841796875,
-0.042877197265625,
0.0307464599609375,
0.0275726318359375,
-0.034027099609375,
0.03436279296875,
0.0031719207763671875,
0.0307464599609375,
-0.09503173828125,
0.0156707763671875,
-0.0005450248718261719,
-0.027679443359375,
-0.056060791015625,
0.0242156982421875,
-0.0038738250732421875,
-0.00640869140625,
-0.0374755859375,
0.048248291015625,
-0.0175323486328125,
0.010009765625,
-0.027099609375,
0.01503753662109375,
-0.00650787353515625,
0.06439208984375,
0.00499725341796875,
0.05865478515625,
0.0305938720703125,
-0.03509521484375,
0.01016998291015625,
0.051177978515625,
-0.0101318359375,
0.04376220703125,
-0.0709228515625,
0.035980224609375,
0.001461029052734375,
0.0263671875,
-0.083984375,
-0.01171875,
0.047637939453125,
-0.06671142578125,
0.0028095245361328125,
-0.0655517578125,
-0.049224853515625,
-0.0157318115234375,
-0.01702880859375,
0.04766845703125,
0.0667724609375,
-0.0288848876953125,
0.02508544921875,
0.0156097412109375,
0.00876617431640625,
-0.0235595703125,
-0.05023193359375,
-0.01453399658203125,
-0.0050811767578125,
-0.0673828125,
0.005153656005859375,
-0.0210113525390625,
0.01531982421875,
0.0008411407470703125,
0.01288604736328125,
-0.0225982666015625,
-0.008575439453125,
0.02001953125,
0.02978515625,
-0.0281982421875,
-0.00039196014404296875,
-0.0147247314453125,
-0.032135009765625,
0.0026493072509765625,
-0.00949859619140625,
0.052520751953125,
-0.027435302734375,
-0.04071044921875,
-0.04547119140625,
-0.0029449462890625,
0.0224761962890625,
-0.01151275634765625,
0.056732177734375,
0.07037353515625,
-0.0255889892578125,
0.02117919921875,
-0.04034423828125,
-0.01380157470703125,
-0.032562255859375,
0.01467132568359375,
-0.0284423828125,
-0.054595947265625,
0.047515869140625,
0.028961181640625,
-0.0081939697265625,
0.050750732421875,
0.036102294921875,
0.00688934326171875,
0.07086181640625,
0.0102386474609375,
-0.01335906982421875,
0.035552978515625,
-0.04052734375,
0.0021228790283203125,
-0.08221435546875,
0.0012531280517578125,
-0.05059814453125,
-0.005290985107421875,
-0.054595947265625,
-0.031005859375,
0.0345458984375,
0.0095062255859375,
-0.042877197265625,
0.0248260498046875,
-0.03558349609375,
0.02520751953125,
0.05401611328125,
0.0343017578125,
0.0282745361328125,
-0.01023101806640625,
0.0046844482421875,
0.002635955810546875,
-0.050323486328125,
-0.035919189453125,
0.09368896484375,
0.0279388427734375,
0.0223846435546875,
0.00853729248046875,
0.056121826171875,
0.0032634735107421875,
0.006916046142578125,
-0.038360595703125,
0.048248291015625,
-0.0007877349853515625,
-0.059417724609375,
-0.00983428955078125,
-0.042510986328125,
-0.0716552734375,
0.023773193359375,
-0.01496124267578125,
-0.05218505859375,
0.0105438232421875,
0.00946044921875,
-0.0305023193359375,
0.02850341796875,
-0.043182373046875,
0.0782470703125,
0.015869140625,
-0.01190948486328125,
-0.006397247314453125,
-0.03662109375,
0.036285400390625,
0.0295257568359375,
0.004474639892578125,
-0.015106201171875,
0.0265045166015625,
0.068115234375,
-0.01531219482421875,
0.05401611328125,
-0.029632568359375,
0.0239410400390625,
0.047027587890625,
-0.01358795166015625,
0.0423583984375,
0.01422119140625,
0.006656646728515625,
0.01076507568359375,
0.02685546875,
-0.07293701171875,
-0.018829345703125,
0.03326416015625,
-0.06402587890625,
-0.02099609375,
-0.052459716796875,
-0.021759033203125,
-0.005229949951171875,
0.0243072509765625,
0.037017822265625,
0.044830322265625,
-0.0155487060546875,
0.0075836181640625,
0.06707763671875,
0.00693511962890625,
0.0050506591796875,
0.037353515625,
-0.01114654541015625,
-0.0218353271484375,
0.05206298828125,
-0.0212249755859375,
0.0163726806640625,
0.017669677734375,
0.005645751953125,
-0.001361846923828125,
-0.054443359375,
-0.049285888671875,
0.023712158203125,
-0.044036865234375,
-0.0089569091796875,
-0.04168701171875,
-0.044708251953125,
-0.028778076171875,
0.037994384765625,
-0.0206146240234375,
-0.020843505859375,
-0.0161285400390625,
0.007415771484375,
0.0307159423828125,
0.0181121826171875,
0.0162200927734375,
0.03387451171875,
-0.0726318359375,
0.021697998046875,
0.036346435546875,
0.0001806020736694336,
0.0002422332763671875,
-0.0726318359375,
-0.00850677490234375,
0.034027099609375,
-0.0262603759765625,
-0.0570068359375,
0.015716552734375,
0.01235198974609375,
0.0452880859375,
0.022430419921875,
-0.0014324188232421875,
0.0440673828125,
0.003475189208984375,
0.065673828125,
-0.016693115234375,
-0.046905517578125,
0.03375244140625,
-0.045318603515625,
0.0372314453125,
0.04998779296875,
0.0181427001953125,
-0.038421630859375,
-0.0245361328125,
-0.056121826171875,
-0.052001953125,
0.044769287109375,
0.0352783203125,
0.002582550048828125,
0.00812530517578125,
0.0347900390625,
-0.016937255859375,
-0.0056915283203125,
-0.07476806640625,
-0.0172882080078125,
0.01331329345703125,
-0.01165008544921875,
0.0228424072265625,
-0.028167724609375,
-0.01503753662109375,
-0.007328033447265625,
0.06341552734375,
-0.02117919921875,
0.049896240234375,
0.0192413330078125,
-0.00873565673828125,
0.005970001220703125,
0.028656005859375,
0.038055419921875,
0.030120849609375,
-0.029327392578125,
-0.01708984375,
0.018035888671875,
-0.04119873046875,
-0.00939178466796875,
0.0066680908203125,
-0.0338134765625,
-0.0151824951171875,
0.041961669921875,
0.06658935546875,
0.0178375244140625,
-0.0328369140625,
0.058380126953125,
-0.0218658447265625,
-0.016357421875,
-0.0472412109375,
0.0160369873046875,
-0.0007009506225585938,
0.0204925537109375,
0.0240325927734375,
0.0264129638671875,
0.01015472412109375,
-0.031707763671875,
0.023529052734375,
0.02264404296875,
-0.03662109375,
-0.02783203125,
0.058074951171875,
0.0015201568603515625,
-0.0030918121337890625,
0.06646728515625,
-0.055755615234375,
-0.049224853515625,
0.048004150390625,
0.00653839111328125,
0.06475830078125,
0.01525115966796875,
0.001247406005859375,
0.053924560546875,
0.004024505615234375,
0.0017118453979492188,
0.0372314453125,
-0.00008797645568847656,
-0.06585693359375,
-0.0195770263671875,
-0.03656005859375,
-0.0261077880859375,
0.00696563720703125,
-0.06878662109375,
0.031707763671875,
-0.01305389404296875,
-0.006183624267578125,
-0.01104736328125,
0.017791748046875,
-0.05706787109375,
0.003387451171875,
0.0060882568359375,
0.065185546875,
-0.0731201171875,
0.081298828125,
0.0367431640625,
-0.04022216796875,
-0.052215576171875,
0.005702972412109375,
0.01485443115234375,
-0.06622314453125,
0.029998779296875,
0.00012636184692382812,
0.00994110107421875,
-0.00551605224609375,
-0.044525146484375,
-0.0511474609375,
0.100341796875,
0.0021915435791015625,
-0.0204925537109375,
-0.005229949951171875,
-0.012298583984375,
0.051483154296875,
-0.00919342041015625,
0.041656494140625,
0.0283660888671875,
0.02581787109375,
0.03302001953125,
-0.07037353515625,
-0.006641387939453125,
-0.042388916015625,
-0.0192108154296875,
0.0135498046875,
-0.06964111328125,
0.06793212890625,
0.003246307373046875,
-0.0136566162109375,
0.03997802734375,
0.04486083984375,
0.04296875,
0.032928466796875,
0.044036865234375,
0.064208984375,
0.036834716796875,
-0.033203125,
0.06732177734375,
-0.0193023681640625,
0.044921875,
0.08038330078125,
0.0173797607421875,
0.0736083984375,
0.01229095458984375,
-0.027801513671875,
0.045318603515625,
0.061187744140625,
-0.0209503173828125,
0.038818359375,
-0.00896453857421875,
-0.01018524169921875,
-0.0263824462890625,
0.015625,
-0.0364990234375,
0.019287109375,
0.00452423095703125,
0.0014142990112304688,
-0.00518035888671875,
-0.01544189453125,
0.0203094482421875,
-0.01461029052734375,
-0.00032520294189453125,
0.056488037109375,
-0.007648468017578125,
-0.043304443359375,
0.052947998046875,
-0.0028820037841796875,
0.057647705078125,
-0.046478271484375,
-0.00453948974609375,
-0.016632080078125,
0.00894927978515625,
-0.010162353515625,
-0.049163818359375,
0.01059722900390625,
-0.008056640625,
0.0023441314697265625,
0.0209503173828125,
0.046722412109375,
-0.03338623046875,
-0.019561767578125,
0.01111602783203125,
0.033111572265625,
0.028778076171875,
-0.006877899169921875,
-0.06072998046875,
-0.0168914794921875,
0.010894775390625,
-0.01541900634765625,
0.029571533203125,
0.0203094482421875,
0.004589080810546875,
0.0501708984375,
0.0556640625,
0.0030956268310546875,
0.0196990966796875,
-0.004608154296875,
0.06427001953125,
-0.036773681640625,
-0.046051025390625,
-0.04150390625,
0.0435791015625,
-0.01279449462890625,
-0.057373046875,
0.037994384765625,
0.055084228515625,
0.06072998046875,
0.0056915283203125,
0.04217529296875,
-0.0162506103515625,
0.049652099609375,
-0.049072265625,
0.045623779296875,
-0.0301055908203125,
0.0216827392578125,
-0.00716400146484375,
-0.056121826171875,
-0.0029773712158203125,
0.032470703125,
-0.0238189697265625,
0.015411376953125,
0.0439453125,
0.07025146484375,
-0.01279449462890625,
0.007022857666015625,
0.0117034912109375,
0.014892578125,
0.0340576171875,
0.06573486328125,
0.040313720703125,
-0.0745849609375,
0.05523681640625,
-0.0197601318359375,
-0.0280303955078125,
-0.00714111328125,
-0.048583984375,
-0.07421875,
-0.0496826171875,
-0.035491943359375,
-0.03411865234375,
0.0017728805541992188,
0.051361083984375,
0.0604248046875,
-0.07891845703125,
-0.006175994873046875,
0.00005644559860229492,
-0.000423431396484375,
-0.011627197265625,
-0.0174713134765625,
0.036834716796875,
0.0012950897216796875,
-0.0614013671875,
0.0034027099609375,
-0.0163116455078125,
0.01457977294921875,
-0.0057830810546875,
-0.0167999267578125,
-0.023956298828125,
-0.0130157470703125,
0.00788116455078125,
0.033233642578125,
-0.046844482421875,
-0.040924072265625,
-0.005680084228515625,
-0.019317626953125,
0.0228424072265625,
0.031097412109375,
-0.0457763671875,
0.0218658447265625,
0.0589599609375,
0.0184326171875,
0.0626220703125,
0.0049285888671875,
0.014129638671875,
-0.0435791015625,
0.0303955078125,
0.0128021240234375,
0.0247955322265625,
0.0147247314453125,
-0.013397216796875,
0.050445556640625,
0.02838134765625,
-0.07220458984375,
-0.052978515625,
0.015655517578125,
-0.101806640625,
0.01163482666015625,
0.078857421875,
-0.025604248046875,
-0.027557373046875,
-0.0014696121215820312,
-0.019989013671875,
0.0236663818359375,
-0.03277587890625,
0.05609130859375,
0.04571533203125,
-0.006923675537109375,
-0.0272216796875,
-0.042724609375,
0.037200927734375,
0.0186004638671875,
-0.04974365234375,
-0.025238037109375,
0.00945281982421875,
0.041839599609375,
0.014678955078125,
0.06365966796875,
-0.0159912109375,
0.036285400390625,
0.0019664764404296875,
0.0188751220703125,
-0.012115478515625,
0.0063018798828125,
-0.0364990234375,
-0.0104827880859375,
0.0240936279296875,
-0.03289794921875
]
] |
Hello-SimpleAI/chatgpt-detector-roberta-chinese | 2023-01-19T11:02:48.000Z | [
"transformers",
"pytorch",
"bert",
"text-classification",
"chatgpt",
"zh",
"dataset:Hello-SimpleAI/HC3-Chinese",
"arxiv:2301.07597",
"endpoints_compatible",
"has_space",
"region:us"
] | text-classification | Hello-SimpleAI | null | null | Hello-SimpleAI/chatgpt-detector-roberta-chinese | 14 | 31,743 | transformers | 2023-01-18T16:41:57 | ---
datasets:
- Hello-SimpleAI/HC3-Chinese
language:
- zh
pipeline_tag: text-classification
tags:
- chatgpt
---
# Model Card for `Hello-SimpleAI/chatgpt-detector-roberta-chinese`
This model is trained on **a mix of full texts and split sentences** from the `answer`s in [Hello-SimpleAI/HC3-Chinese](https://huggingface.co/datasets/Hello-SimpleAI/HC3-Chinese).
For more details, refer to [arxiv: 2301.07597](https://arxiv.org/abs/2301.07597) and the GitHub project [Hello-SimpleAI/chatgpt-comparison-detection](https://github.com/Hello-SimpleAI/chatgpt-comparison-detection).
The base checkpoint is [hfl/chinese-roberta-wwm-ext](https://huggingface.co/hfl/chinese-roberta-wwm-ext).
We train it on all [Hello-SimpleAI/HC3-Chinese](https://huggingface.co/datasets/Hello-SimpleAI/HC3-Chinese) data (with no held-out set) for 2 epochs.
(Training for two epochs is consistent with the experiments in [our paper](https://arxiv.org/abs/2301.07597).)
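## How to use

Since the model was trained on both full texts and split sentences, you can score either whole answers or individual sentences. Below is a minimal, illustrative sketch; the naive punctuation-based sentence splitter is an assumption of this sketch and may differ from the splitting rules used to build the dataset.

```python
import re


def split_sentences(text: str) -> list[str]:
    """Naively split Chinese text on sentence-ending punctuation.

    This is an illustrative sketch; the dataset's actual sentence
    splitting rules may differ.
    """
    parts = re.split(r"(?<=[。!?!?])", text)
    return [p for p in parts if p.strip()]


def detect(text: str):
    """Classify a Chinese text as human- or ChatGPT-written.

    Requires `pip install transformers torch` and network access to
    download the checkpoint on first use.
    """
    from transformers import pipeline  # imported lazily: heavy dependency

    detector = pipeline(
        "text-classification",
        model="Hello-SimpleAI/chatgpt-detector-roberta-chinese",
    )
    # Returns a list like [{"label": ..., "score": ...}]
    return detector(text)
```

For sentence-level detection you could run `detect` on each element of `split_sentences(answer)` and aggregate the scores.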
## Citation
Check out the paper [arxiv: 2301.07597](https://arxiv.org/abs/2301.07597):
```
@article{guo-etal-2023-hc3,
title = "How Close is ChatGPT to Human Experts? Comparison Corpus, Evaluation, and Detection",
author = "Guo, Biyang and
Zhang, Xin and
Wang, Ziyuan and
Jiang, Minqi and
Nie, Jinran and
Ding, Yuxuan and
Yue, Jianwei and
Wu, Yupeng",
  journal = "arXiv preprint arXiv:2301.07597",
  year = "2023",
}
``` | 1,403 | [
[
-0.028656005859375,
-0.04974365234375,
0.0283050537109375,
-0.0007390975952148438,
-0.0245819091796875,
-0.026275634765625,
-0.0155487060546875,
-0.02947998046875,
-0.0055999755859375,
0.0226898193359375,
-0.0418701171875,
-0.0302581787109375,
-0.04388427734375,
-0.006488800048828125,
-0.00545501708984375,
0.0877685546875,
0.016876220703125,
0.0200042724609375,
0.00868988037109375,
-0.01812744140625,
-0.01361083984375,
-0.0279998779296875,
-0.0771484375,
-0.03363037109375,
0.024566650390625,
0.020355224609375,
0.04278564453125,
0.046783447265625,
0.0357666015625,
0.011566162109375,
-0.033782958984375,
0.004001617431640625,
-0.02716064453125,
-0.036895751953125,
0.022857666015625,
-0.028533935546875,
-0.05230712890625,
-0.0023021697998046875,
0.0438232421875,
0.02142333984375,
-0.00629425048828125,
0.00452423095703125,
0.0146636962890625,
0.027984619140625,
]
] |
1-800-BAD-CODE/xlm-roberta_punctuation_fullstop_truecase | 2023-07-15T20:42:28.000Z | [
"generic",
"onnx",
"nemo",
"text2text-generation",
"punctuation",
"sentence-boundary-detection",
"truecasing",
"true-casing",
"af",
"am",
"ar",
"bg",
"bn",
"de",
"el",
"en",
"es",
"et",
"fa",
"fi",
"fr",
"gu",
"hi",
"hr",
"hu",
"id",
"is",
"it",
"ja",
"kk",
"kn",
"ko",
"ky",
"lt",
"lv",
"mk",
"ml",
"mr",
"nl",
"or",
"pa",
"pl",
"ps",
"pt",
"ro",
"ru",
"rw",
"so",
"sr",
"sw",
"ta",
"te",
"tr",
"uk",
"zh",
"license:apache-2.0",
"region:us"
] | text2text-generation | 1-800-BAD-CODE | null | null | 1-800-BAD-CODE/xlm-roberta_punctuation_fullstop_truecase | 21 | 31,726 | generic | 2023-05-07T22:33:05 | ---
license: apache-2.0
library_name: generic
tags:
- text2text-generation
- punctuation
- sentence-boundary-detection
- truecasing
- true-casing
language:
- af
- am
- ar
- bg
- bn
- de
- el
- en
- es
- et
- fa
- fi
- fr
- gu
- hi
- hr
- hu
- id
- is
- it
- ja
- kk
- kn
- ko
- ky
- lt
- lv
- mk
- ml
- mr
- nl
- or
- pa
- pl
- ps
- pt
- ro
- ru
- rw
- so
- sr
- sw
- ta
- te
- tr
- uk
- zh
widget:
- text: "hola amigo cómo estás es un día lluvioso hoy"
- text: "please rsvp for the party asap preferably before 8 pm tonight"
- text: "este modelo fue entrenado en un gpu a100 en realidad no se que dice esta frase lo traduje con nmt"
- text: "此模型向文本添加标点符号它支持47种语言并在a100gpu上接受过训练它可以在每种语言上运行而无需每种语言的特殊路径"
- text: "यह मॉडल 47 भाषाओं में विराम चिह्न जोड़ता है यह भाषा विशिष्ट पथ के बिना काम करता है यह प्रत्येक भाषा के लिए विशेष पथों के बिना प्रत्येक भाषा पर कार्य कर सकता है"
---
# Model Overview
This is an `xlm-roberta` fine-tuned to restore punctuation, true-case (capitalize),
and detect sentence boundaries (full stops) in 47 languages.
# Usage
If you just want to play with the model, the widget on this page will suffice. To use the model offline,
the following snippets show how to run it both with a wrapper (which I wrote, available from `PyPI`)
and manually (using the ONNX and SentencePiece models in this repo).
## Usage via `punctuators` package
<details>
<summary>Click to see usage with wrappers</summary>
The easiest way to use this model is to install [punctuators](https://github.com/1-800-BAD-CODE/punctuators):
```bash
$ pip install punctuators
```
But this is just an ONNX and SentencePiece model, so you may run it as you wish.
The input to the `punctuators` API is a list (batch) of strings.
Each string will be punctuated, true-cased, and segmented on predicted full stops.
The output will therefore be a list of list of strings: one list of segmented sentences per input text.
To disable full stops, use `m.infer(texts, apply_sbd=False)`.
The output will then be a list of strings: one punctuated, true-cased string per input text.
<details open>
<summary>Example Usage</summary>
```python
from typing import List
from punctuators.models import PunctCapSegModelONNX
m: PunctCapSegModelONNX = PunctCapSegModelONNX.from_pretrained(
"1-800-BAD-CODE/xlm-roberta_punctuation_fullstop_truecase"
)
input_texts: List[str] = [
"hola mundo cómo estás estamos bajo el sol y hace mucho calor santa coloma abre los huertos urbanos a las escuelas de la ciudad",
"hello friend how's it going it's snowing outside right now in connecticut a large storm is moving in",
"未來疫苗將有望覆蓋3歲以上全年齡段美國與北約軍隊已全部撤離還有鐵路公路在內的各項基建的來源都將枯竭",
"በባለፈው ሳምንት ኢትዮጵያ ከሶማሊያ 3 ሺህ ወታደሮቿንም እንዳስወጣች የሶማሊያው ዳልሳን ሬድዮ ዘግቦ ነበር ጸጥታ ሃይሉና ህዝቡ ተቀናጅቶ በመስራቱ በመዲናዋ ላይ የታቀደው የጥፋት ሴራ ከሽፏል",
"こんにちは友人" "調子はどう" "今日は雨の日でしたね" "乾いた状態を保つために一日中室内で過ごしました",
"hallo freund wie geht's es war heute ein regnerischer tag nicht wahr ich verbrachte den tag drinnen um trocken zu bleiben",
"हैलो दोस्त ये कैसा चल रहा है आज बारिश का दिन था न मैंने सूखा रहने के लिए दिन घर के अंदर बिताया",
"كيف تجري الامور كان يومًا ممطرًا اليوم أليس كذلك قضيت اليوم في الداخل لأظل جافًا",
]
results: List[List[str]] = m.infer(
texts=input_texts, apply_sbd=True,
)
for input_text, output_texts in zip(input_texts, results):
print(f"Input: {input_text}")
print(f"Outputs:")
for text in output_texts:
print(f"\t{text}")
print()
```
</details>
<details open>
<summary>Expected output</summary>
```text
Input: hola mundo cómo estás estamos bajo el sol y hace mucho calor santa coloma abre los huertos urbanos a las escuelas de la ciudad
Outputs:
Hola mundo, ¿cómo estás?
Estamos bajo el sol y hace mucho calor.
Santa Coloma abre los huertos urbanos a las escuelas de la ciudad.
Input: hello friend how's it going it's snowing outside right now in connecticut a large storm is moving in
Outputs:
Hello friend, how's it going?
It's snowing outside right now.
In Connecticut, a large storm is moving in.
Input: 未來疫苗將有望覆蓋3歲以上全年齡段美國與北約軍隊已全部撤離還有鐵路公路在內的各項基建的來源都將枯竭
Outputs:
未來,疫苗將有望覆蓋3歲以上全年齡段。
美國與北約軍隊已全部撤離。
還有,鐵路,公路在內的各項基建的來源都將枯竭。
Input: በባለፈው ሳምንት ኢትዮጵያ ከሶማሊያ 3 ሺህ ወታደሮቿንም እንዳስወጣች የሶማሊያው ዳልሳን ሬድዮ ዘግቦ ነበር ጸጥታ ሃይሉና ህዝቡ ተቀናጅቶ በመስራቱ በመዲናዋ ላይ የታቀደው የጥፋት ሴራ ከሽፏል
Outputs:
በባለፈው ሳምንት ኢትዮጵያ ከሶማሊያ 3 ሺህ ወታደሮቿንም እንዳስወጣች የሶማሊያው ዳልሳን ሬድዮ ዘግቦ ነበር።
ጸጥታ ሃይሉና ህዝቡ ተቀናጅቶ በመስራቱ በመዲናዋ ላይ የታቀደው የጥፋት ሴራ ከሽፏል።
Input: こんにちは友人調子はどう今日は雨の日でしたね乾いた状態を保つために一日中室内で過ごしました
Outputs:
こんにちは、友人、調子はどう?
今日は雨の日でしたね。
乾いた状態を保つために、一日中、室内で過ごしました。
Input: hallo freund wie geht's es war heute ein regnerischer tag nicht wahr ich verbrachte den tag drinnen um trocken zu bleiben
Outputs:
Hallo Freund, wie geht's?
Es war heute ein regnerischer Tag, nicht wahr?
Ich verbrachte den Tag drinnen, um trocken zu bleiben.
Input: हैलो दोस्त ये कैसा चल रहा है आज बारिश का दिन था न मैंने सूखा रहने के लिए दिन घर के अंदर बिताया
Outputs:
हैलो दोस्त, ये कैसा चल रहा है?
आज बारिश का दिन था न, मैंने सूखा रहने के लिए दिन घर के अंदर बिताया।
Input: كيف تجري الامور كان يومًا ممطرًا اليوم أليس كذلك قضيت اليوم في الداخل لأظل جافًا
Outputs:
كيف تجري الامور؟
كان يومًا ممطرًا اليوم، أليس كذلك؟
قضيت اليوم في الداخل لأظل جافًا.
```
</details>
</details>
## Manual Usage
If you want to use the ONNX and SP models without wrappers, see the following example.
<details>
<summary>Click to see manual usage</summary>
```python
from typing import List
import numpy as np
import onnxruntime as ort
from huggingface_hub import hf_hub_download
from omegaconf import OmegaConf
from sentencepiece import SentencePieceProcessor
# Download the models from HF hub. Note: to clean up, you can find these files in your HF cache directory
spe_path = hf_hub_download(repo_id="1-800-BAD-CODE/xlm-roberta_punctuation_fullstop_truecase", filename="sp.model")
onnx_path = hf_hub_download(repo_id="1-800-BAD-CODE/xlm-roberta_punctuation_fullstop_truecase", filename="model.onnx")
config_path = hf_hub_download(
repo_id="1-800-BAD-CODE/xlm-roberta_punctuation_fullstop_truecase", filename="config.yaml"
)
# Load the SP model
tokenizer: SentencePieceProcessor = SentencePieceProcessor(spe_path) # noqa
# Load the ONNX graph
ort_session: ort.InferenceSession = ort.InferenceSession(onnx_path)
# Load the model config with labels, etc.
config = OmegaConf.load(config_path)
# Potential classification labels before each subtoken
pre_labels: List[str] = config.pre_labels
# Potential classification labels after each subtoken
post_labels: List[str] = config.post_labels
# Special class that means "predict nothing"
null_token = config.get("null_token", "<NULL>")
# Special class that means "all chars in this subtoken end with a period", e.g., "am" -> "a.m."
acronym_token = config.get("acronym_token", "<ACRONYM>")
# Not used in this example, but if your sequences exceed this value, you need to fold them over multiple inputs
max_len = config.max_length
# For reference only, graph has no language-specific behavior
languages: List[str] = config.languages
# Encode some input text, adding BOS + EOS
input_text = "hola mundo cómo estás estamos bajo el sol y hace mucho calor santa coloma abre los huertos urbanos a las escuelas de la ciudad"
input_ids = [tokenizer.bos_id()] + tokenizer.EncodeAsIds(input_text) + [tokenizer.eos_id()]
# Create a numpy array with shape [B, T], as the graph expects as input.
# Note that we do not pass lengths to the graph; if you are using a batch, padding should be tokenizer.pad_id() and the
# graph's attention mechanisms will ignore pad_id() without requiring explicit sequence lengths.
input_ids_arr: np.array = np.array([input_ids])
# Run the graph, get outputs for all analytics
pre_preds, post_preds, cap_preds, sbd_preds = ort_session.run(None, {"input_ids": input_ids_arr})
# Squeeze off the batch dimensions and convert to lists
pre_preds = pre_preds[0].tolist()
post_preds = post_preds[0].tolist()
cap_preds = cap_preds[0].tolist()
sbd_preds = sbd_preds[0].tolist()
# Segmented sentences
output_texts: List[str] = []
# Current sentence, which is built until we hit a sentence boundary prediction
current_chars: List[str] = []
# Iterate over the outputs, ignoring the first (BOS) and final (EOS) predictions and tokens
for token_idx in range(1, len(input_ids) - 1):
token = tokenizer.IdToPiece(input_ids[token_idx])
# Simple SP decoding
if token.startswith("▁") and current_chars:
current_chars.append(" ")
# Token-level predictions
pre_label = pre_labels[pre_preds[token_idx]]
post_label = post_labels[post_preds[token_idx]]
# If we predict "pre-punct", insert it before this token
if pre_label != null_token:
current_chars.append(pre_label)
    # Iterate over each char. Skip SP's space marker ("▁") if present.
char_start = 1 if token.startswith("▁") else 0
for token_char_idx, char in enumerate(token[char_start:], start=char_start):
# If this char should be capitalized, apply upper case
if cap_preds[token_idx][token_char_idx]:
char = char.upper()
# Append char
current_chars.append(char)
# if this is an acronym, add a period after every char (p.m., a.m., etc.)
if post_label == acronym_token:
current_chars.append(".")
# Maybe this subtoken ends with punctuation
if post_label != null_token and post_label != acronym_token:
current_chars.append(post_label)
# If this token is a sentence boundary, finalize the current sentence and reset
if sbd_preds[token_idx]:
output_texts.append("".join(current_chars))
current_chars.clear()
# Maybe push final sentence, if the final token was not classified as a sentence boundary
if current_chars:
output_texts.append("".join(current_chars))
# Pretty print
print(f"Input: {input_text}")
print("Outputs:")
for text in output_texts:
print(f"\t{text}")
```
Expected output:
```text
Input: hola mundo cómo estás estamos bajo el sol y hace mucho calor santa coloma abre los huertos urbanos a las escuelas de la ciudad
Outputs:
Hola mundo, ¿cómo estás?
Estamos bajo el sol y hace mucho calor.
Santa Coloma abre los huertos urbanos a las escuelas de la ciudad.
```
</details>
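As the comments in the manual example note, batched inputs are right-padded with the tokenizer's `pad_id()` rather than passed with explicit lengths. A minimal sketch of such a padding helper (the token ids below are hypothetical, and `pad_id=1` is only an example value — use `tokenizer.pad_id()` in practice):

```python
import numpy as np

def pad_batch(batch_ids, pad_id):
    """Right-pad a list of token-id lists into a [B, T] numpy array."""
    max_len = max(len(ids) for ids in batch_ids)
    arr = np.full((len(batch_ids), max_len), pad_id, dtype=np.int64)
    for row, ids in enumerate(batch_ids):
        arr[row, : len(ids)] = ids
    return arr

# Two "tokenized" inputs of different lengths (BOS/EOS already added)
batch = [[0, 10, 11, 2], [0, 20, 21, 22, 23, 2]]
padded = pad_batch(batch, pad_id=1)
print(padded.shape)  # (2, 6)
```

The padded `[B, T]` array can then be fed to `ort_session.run` exactly as in the single-example snippet above; the graph's attention masking ignores the pad positions.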
# Model Architecture
This model implements the following graph, which allows punctuation, true-casing, and fullstop prediction
in every language without language-specific behavior:

<details>
<summary>Click to see graph explanations</summary>
We start by tokenizing the text and encoding it with XLM-Roberta, which is the pre-trained portion of this graph.
Then we predict punctuation before and after every subtoken.
Predicting before each token allows for Spanish inverted question marks.
Predicting after every token allows for all other punctuation, including punctuation within continuous-script
languages and acronyms.
We use embeddings to represent the predicted punctuation tokens to inform the sentence boundary head of the
punctuation that'll be inserted into the text. This allows proper full stop prediction, since certain punctuation
tokens (periods, questions marks, etc.) are strongly correlated with sentence boundaries.
We then shift full stop predictions to the right by one, to inform the true-casing head of where the beginning
of each new sentence is. This is important since true-casing is strongly correlated to sentence boundaries.
For true-casing, we predict `N` predictions per subtoken, where `N` is the number of characters in the subtoken.
In practice, `N` is the maximum subtoken length and extra predictions are ignored. Essentially, true-casing is
modeled as a multi-label problem. This allows for upper-casing arbitrary characters, e.g., "NATO", "MacDonald", "mRNA", etc.
Applying all these predictions to the input text, we can punctuate, true-case, and split sentences in any language.
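As a concrete illustration of the character-level true-casing scheme, the following sketch upper-cases the characters of a subtoken according to per-character binary labels, ignoring any extra padded predictions (the prediction values here are illustrative, not real model output):

```python
def apply_casing(token: str, cap_preds: list) -> str:
    """Upper-case the i-th char of `token` when cap_preds[i] is truthy.

    cap_preds may be longer than the token (padded to the maximum
    subtoken length); extra predictions are simply ignored by zip().
    """
    return "".join(
        c.upper() if pred else c for c, pred in zip(token, cap_preds)
    )

# "mrna" -> "mRNA": predictions beyond the token's length are ignored
print(apply_casing("mrna", [0, 1, 1, 1, 0, 0]))  # mRNA
```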
</details>
## Tokenizer
<details>
<summary>Click to see how the XLM-Roberta tokenizer was un-hacked</summary>
Instead of the hacky wrapper used by FairSeq and strangely ported (not fixed) by HuggingFace, the `xlm-roberta` SentencePiece model was adjusted to correctly encode
the text. Per HF's comments,
```python
# Original fairseq vocab and spm vocab must be "aligned":
# Vocab | 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9
# -------- | ------- | ------- | ------ | ------- | --- | --- | --- | ----- | ----- | ----
# fairseq | '<s>' | '<pad>' | '</s>' | '<unk>' | ',' | '.' | '▁' | 's' | '▁de' | '-'
# spm | '<unk>' | '<s>' | '</s>' | ',' | '.' | '▁' | 's' | '▁de' | '-' | '▁a'
```
The SP model was un-hacked with the following snippet
(SentencePiece experts, let me know if there is a problem here):
```python
from sentencepiece import SentencePieceProcessor
from sentencepiece.sentencepiece_model_pb2 import ModelProto
m = ModelProto()
m.ParseFromString(open("/path/to/xlmroberta/sentencepiece.bpe.model", "rb").read())
pieces = list(m.pieces)
pieces = (
[
ModelProto.SentencePiece(piece="<s>", type=ModelProto.SentencePiece.Type.CONTROL),
ModelProto.SentencePiece(piece="<pad>", type=ModelProto.SentencePiece.Type.CONTROL),
ModelProto.SentencePiece(piece="</s>", type=ModelProto.SentencePiece.Type.CONTROL),
ModelProto.SentencePiece(piece="<unk>", type=ModelProto.SentencePiece.Type.UNKNOWN),
]
+ pieces[3:]
+ [ModelProto.SentencePiece(piece="<mask>", type=ModelProto.SentencePiece.Type.USER_DEFINED)]
)
del m.pieces[:]
m.pieces.extend(pieces)
with open("/path/to/new/sp.model", "wb") as f:
f.write(m.SerializeToString())
```
Now we can use just the SP model without a wrapper.
</details>
## Post-Punctuation Tokens
This model predicts the following set of punctuation tokens after each subtoken:
| Token | Description | Relevant Languages |
| ---: | :---------- | :----------- |
| \<NULL\> | No punctuation | All |
| \<ACRONYM\> | Every character in this subword is followed by a period | Primarily English, some European |
| . | Latin full stop | Many |
| , | Latin comma | Many |
| ? | Latin question mark | Many |
| ? | Full-width question mark | Chinese, Japanese |
| , | Full-width comma | Chinese, Japanese |
| 。 | Full-width full stop | Chinese, Japanese |
| 、 | Ideographic comma | Chinese, Japanese |
| ・ | Middle dot | Japanese |
| । | Danda | Hindi, Bengali, Oriya |
| ؟ | Arabic question mark | Arabic |
| ، | Arabic comma | Arabic |
| ; | Greek question mark | Greek |
| ። | Ethiopic full stop | Amharic |
| ፣ | Ethiopic comma | Amharic |
| ፧ | Ethiopic question mark | Amharic |
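To make the `<ACRONYM>` label concrete: decoding it amounts to appending a period after every character of the subtoken (capitalization, as in "U.S.A.", comes from the true-casing head). A minimal sketch:

```python
def apply_acronym(token: str) -> str:
    """Append a period after every character, e.g. 'am' -> 'a.m.'."""
    return "".join(c + "." for c in token)

print(apply_acronym("am"))   # a.m.
print(apply_acronym("usa"))  # u.s.a.
```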
## Pre-Punctuation Tokens
This model predicts the following set of punctuation tokens before each subword:
| Token | Description | Relevant Languages |
| ---: | :---------- | :----------- |
| \<NULL\> | No punctuation | All |
| ¿ | Inverted question mark | Spanish |
# Training Details
This model was trained in the NeMo framework on an A100 for approximately 7 hours.
You may view the `tensorboard` log on [tensorboard.dev](https://tensorboard.dev/experiment/xxnULI1aTeK37vUDL4ejiw/#scalars).
This model was trained with News Crawl data from WMT.
1M lines of text were used for each language, except for a few low-resource languages, which may have used fewer.
Languages were chosen based on whether the News Crawl corpus contained enough reliable-quality data as judged by the author.
# Limitations
This model was trained on news data, and may not perform well on conversational or informal data.
This model is unlikely to be of production quality.
It was trained with "only" 1M lines per language, and the dev sets may have been noisy due to the nature of web-scraped news data.
This model over-predicts Spanish question marks, especially the inverted question mark `¿` (see metrics below).
Since `¿` is a rare token, especially in the context of a 47-language model, Spanish questions were over-sampled
by selecting more of these sentences from additional training data that was not otherwise used. However, this seems to have
over-corrected the problem, and the model now predicts too many Spanish question marks.
The model may also over-predict commas.
If you find any general limitations not mentioned here, let me know so all limitations can be addressed in the
next fine-tuning.
# Evaluation
In these metrics, keep in mind that
1. The data is noisy
2. Sentence boundaries and true-casing are conditioned on predicted punctuation, which is the most difficult task and sometimes incorrect.
   When conditioning on reference punctuation, true-casing and SBD are practically 100% for most languages.
3. Punctuation can be subjective. E.g.,
`Hola mundo, ¿cómo estás?`
or
`Hola mundo. ¿Cómo estás?`
When the sentences are longer and more practical, these ambiguities abound and affect all 3 analytics.
## Test Data and Example Generation
Each test example was generated using the following procedure:
1. Concatenate 11 random sentences (1 + 10 for each sentence in the test set)
2. Lower-case the concatenated sentence
3. Remove all punctuation
Targets are generated from the original text, before lower-casing and punctuation removal.
The data is a held-out portion of News Crawl, which has been deduplicated.
3,000 lines of data per language were used, generating 3,000 unique examples of 11 sentences each.
We generate 3,000 examples, where example `i` begins with sentence `i` and is followed by 10 random
sentences selected from the 3,000 sentence test set.
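The generation procedure above can be sketched as follows (a simplified illustration: the exact punctuation set stripped and the sampling scheme are assumptions, and the real targets are token-level labels rather than the raw string):

```python
import random
import re
import string

def make_example(sentences, index, num_extra=10, seed=0):
    """Build one test input: sentence `index` followed by `num_extra`
    random sentences, lower-cased with punctuation removed. The
    concatenation before normalization serves as the reference."""
    rng = random.Random(seed)
    chosen = [sentences[index]] + rng.sample(sentences, num_extra)
    target = " ".join(chosen)
    source = re.sub(f"[{re.escape(string.punctuation)}]", "", target).lower()
    return source, target

sents = [f"Sentence number {i}." for i in range(100)]
src, tgt = make_example(sents, 0, num_extra=3)
print(src)  # e.g. "sentence number 0 sentence number ..."
```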
For measuring true-casing and sentence boundary detection, reference punctuation tokens were used for
conditioning (see graph above). If we use predicted punctuation instead, then incorrect punctuation will
result in true-casing and SBD targets not aligning correctly and these metrics will be artificially low.
## Selected Language Evaluation Reports
For now, metrics for a few selected languages are shown below.
Given the amount of work required to collect and pretty-print metrics in 47 languages, I'll add more eventually.
Expand any of the following tabs to see metrics for that language.
<details>
<summary>English</summary>
```text
punct_post test report:
label precision recall f1 support
<NULL> (label_id: 0) 99.25 98.43 98.84 564908
<ACRONYM> (label_id: 1) 63.14 84.67 72.33 613
. (label_id: 2) 90.97 93.91 92.42 32040
, (label_id: 3) 73.95 84.32 78.79 24271
? (label_id: 4) 79.05 81.94 80.47 1041
    ? (label_id: 5)                                                  0.00       0.00       0.00       0
    , (label_id: 6)                                                  0.00       0.00       0.00       0
。 (label_id: 7) 0.00 0.00 0.00 0
、 (label_id: 8) 0.00 0.00 0.00 0
・ (label_id: 9) 0.00 0.00 0.00 0
। (label_id: 10) 0.00 0.00 0.00 0
؟ (label_id: 11) 0.00 0.00 0.00 0
، (label_id: 12) 0.00 0.00 0.00 0
; (label_id: 13) 0.00 0.00 0.00 0
። (label_id: 14) 0.00 0.00 0.00 0
፣ (label_id: 15) 0.00 0.00 0.00 0
፧ (label_id: 16) 0.00 0.00 0.00 0
-------------------
micro avg 97.60 97.60 97.60 622873
macro avg 81.27 88.65 84.57 622873
weighted avg 97.77 97.60 97.67 622873
```
```
cap test report:
label precision recall f1 support
LOWER (label_id: 0) 99.72 99.85 99.78 2134956
UPPER (label_id: 1) 96.33 93.52 94.91 91996
-------------------
micro avg 99.59 99.59 99.59 2226952
macro avg 98.03 96.68 97.34 2226952
weighted avg 99.58 99.59 99.58 2226952
```
```
seg test report:
label precision recall f1 support
NOSTOP (label_id: 0) 99.99 99.98 99.99 591540
FULLSTOP (label_id: 1) 99.61 99.89 99.75 34333
-------------------
micro avg 99.97 99.97 99.97 625873
macro avg 99.80 99.93 99.87 625873
weighted avg 99.97 99.97 99.97 625873
```
</details>
<details>
<summary>Spanish</summary>
```text
punct_pre test report:
label precision recall f1 support
<NULL> (label_id: 0) 99.94 99.89 99.92 636941
¿ (label_id: 1) 56.73 71.35 63.20 1288
-------------------
micro avg 99.83 99.83 99.83 638229
macro avg 78.34 85.62 81.56 638229
weighted avg 99.85 99.83 99.84 638229
```
```
punct_post test report:
label precision recall f1 support
<NULL> (label_id: 0) 99.19 98.41 98.80 578271
<ACRONYM> (label_id: 1) 30.10 56.36 39.24 55
. (label_id: 2) 91.92 93.12 92.52 30856
, (label_id: 3) 72.98 82.44 77.42 27761
? (label_id: 4) 52.77 71.85 60.85 1286
? (label_id: 5) 0.00 0.00 0.00 0
, (label_id: 6) 0.00 0.00 0.00 0
。 (label_id: 7) 0.00 0.00 0.00 0
、 (label_id: 8) 0.00 0.00 0.00 0
・ (label_id: 9) 0.00 0.00 0.00 0
। (label_id: 10) 0.00 0.00 0.00 0
؟ (label_id: 11) 0.00 0.00 0.00 0
، (label_id: 12) 0.00 0.00 0.00 0
; (label_id: 13) 0.00 0.00 0.00 0
። (label_id: 14) 0.00 0.00 0.00 0
፣ (label_id: 15) 0.00 0.00 0.00 0
፧ (label_id: 16) 0.00 0.00 0.00 0
-------------------
micro avg 97.40 97.40 97.40 638229
macro avg 69.39 80.44 73.77 638229
weighted avg 97.60 97.40 97.48 638229
```
```
cap test report:
label precision recall f1 support
LOWER (label_id: 0) 99.82 99.86 99.84 2324724
UPPER (label_id: 1) 95.92 94.70 95.30 79266
-------------------
micro avg 99.69 99.69 99.69 2403990
macro avg 97.87 97.28 97.57 2403990
weighted avg 99.69 99.69 99.69 2403990
```
```
seg test report:
label precision recall f1 support
NOSTOP (label_id: 0) 99.99 99.96 99.98 607057
FULLSTOP (label_id: 1) 99.31 99.88 99.60 34172
-------------------
micro avg 99.96 99.96 99.96 641229
macro avg 99.65 99.92 99.79 641229
weighted avg 99.96 99.96 99.96 641229
```
</details>
<details>
<summary>Amharic</summary>
```text
punct_post test report:
label precision recall f1 support
<NULL> (label_id: 0) 99.83 99.28 99.56 729664
<ACRONYM> (label_id: 1) 0.00 0.00 0.00 0
. (label_id: 2) 0.00 0.00 0.00 0
, (label_id: 3) 0.00 0.00 0.00 0
? (label_id: 4) 0.00 0.00 0.00 0
? (label_id: 5) 0.00 0.00 0.00 0
, (label_id: 6) 0.00 0.00 0.00 0
。 (label_id: 7) 0.00 0.00 0.00 0
、 (label_id: 8) 0.00 0.00 0.00 0
・ (label_id: 9) 0.00 0.00 0.00 0
। (label_id: 10) 0.00 0.00 0.00 0
؟ (label_id: 11) 0.00 0.00 0.00 0
، (label_id: 12) 0.00 0.00 0.00 0
; (label_id: 13) 0.00 0.00 0.00 0
። (label_id: 14) 91.27 97.90 94.47 25341
፣ (label_id: 15) 61.93 82.11 70.60 5818
፧ (label_id: 16) 67.41 81.73 73.89 1177
-------------------
micro avg 99.08 99.08 99.08 762000
macro avg 80.11 90.26 84.63 762000
weighted avg 99.21 99.08 99.13 762000
```
```
cap test report:
label precision recall f1 support
LOWER (label_id: 0) 98.40 98.03 98.21 1064
UPPER (label_id: 1) 71.23 75.36 73.24 69
-------------------
micro avg 96.65 96.65 96.65 1133
macro avg 84.81 86.69 85.73 1133
weighted avg 96.74 96.65 96.69 1133
```
```
seg test report:
label precision recall f1 support
NOSTOP (label_id: 0) 99.99 99.85 99.92 743158
FULLSTOP (label_id: 1) 95.20 99.62 97.36 21842
-------------------
micro avg 99.85 99.85 99.85 765000
macro avg 97.59 99.74 98.64 765000
weighted avg 99.85 99.85 99.85 765000
```
</details>
<details>
<summary>Chinese</summary>
```text
punct_post test report:
label precision recall f1 support
<NULL> (label_id: 0) 99.53 97.31 98.41 435611
<ACRONYM> (label_id: 1) 0.00 0.00 0.00 0
. (label_id: 2) 0.00 0.00 0.00 0
, (label_id: 3) 0.00 0.00 0.00 0
? (label_id: 4) 0.00 0.00 0.00 0
? (label_id: 5) 81.85 87.31 84.49 1513
, (label_id: 6) 74.08 93.67 82.73 35921
。 (label_id: 7) 96.51 96.93 96.72 32097
、 (label_id: 8) 0.00 0.00 0.00 0
・ (label_id: 9) 0.00 0.00 0.00 0
। (label_id: 10) 0.00 0.00 0.00 0
؟ (label_id: 11) 0.00 0.00 0.00 0
، (label_id: 12) 0.00 0.00 0.00 0
; (label_id: 13) 0.00 0.00 0.00 0
። (label_id: 14) 0.00 0.00 0.00 0
፣ (label_id: 15) 0.00 0.00 0.00 0
፧ (label_id: 16) 0.00 0.00 0.00 0
-------------------
micro avg 97.00 97.00 97.00 505142
macro avg 87.99 93.81 90.59 505142
weighted avg 97.48 97.00 97.15 505142
```
```
cap test report:
label precision recall f1 support
LOWER (label_id: 0) 94.89 94.98 94.94 2951
UPPER (label_id: 1) 81.34 81.03 81.18 796
-------------------
micro avg 92.02 92.02 92.02 3747
macro avg 88.11 88.01 88.06 3747
weighted avg 92.01 92.02 92.01 3747
```
```
seg test report:
label precision recall f1 support
NOSTOP (label_id: 0) 99.99 99.97 99.98 473642
FULLSTOP (label_id: 1) 99.55 99.90 99.72 34500
-------------------
micro avg 99.96 99.96 99.96 508142
macro avg 99.77 99.93 99.85 508142
weighted avg 99.96 99.96 99.96 508142
```
</details>
<details>
<summary>Japanese</summary>
```text
punct_post test report:
label precision recall f1 support
<NULL> (label_id: 0) 99.34 95.90 97.59 406341
<ACRONYM> (label_id: 1) 0.00 0.00 0.00 0
. (label_id: 2) 0.00 0.00 0.00 0
, (label_id: 3) 0.00 0.00 0.00 0
? (label_id: 4) 0.00 0.00 0.00 0
? (label_id: 5) 70.55 73.56 72.02 1456
, (label_id: 6) 0.00 0.00 0.00 0
。 (label_id: 7) 94.38 96.95 95.65 32537
、 (label_id: 8) 54.28 87.62 67.03 18610
・ (label_id: 9) 28.18 71.64 40.45 1100
। (label_id: 10) 0.00 0.00 0.00 0
؟ (label_id: 11) 0.00 0.00 0.00 0
، (label_id: 12) 0.00 0.00 0.00 0
; (label_id: 13) 0.00 0.00 0.00 0
። (label_id: 14) 0.00 0.00 0.00 0
፣ (label_id: 15) 0.00 0.00 0.00 0
፧ (label_id: 16) 0.00 0.00 0.00 0
-------------------
micro avg 95.51 95.51 95.51 460044
macro avg 69.35 85.13 74.55 460044
weighted avg 96.91 95.51 96.00 460044
```
```
cap test report:
label precision recall f1 support
LOWER (label_id: 0) 92.33 94.03 93.18 4174
UPPER (label_id: 1) 83.51 79.46 81.43 1587
-------------------
micro avg 90.02 90.02 90.02 5761
macro avg 87.92 86.75 87.30 5761
weighted avg 89.90 90.02 89.94 5761
```
```
seg test report:
label precision recall f1 support
NOSTOP (label_id: 0) 99.99 99.92 99.96 428544
FULLSTOP (label_id: 1) 99.07 99.87 99.47 34500
-------------------
micro avg 99.92 99.92 99.92 463044
macro avg 99.53 99.90 99.71 463044
weighted avg 99.92 99.92 99.92 463044
```
</details>
<details>
<summary>Hindi</summary>
```text
punct_post test report:
label precision recall f1 support
<NULL> (label_id: 0) 99.75 99.44 99.59 560358
<ACRONYM> (label_id: 1) 0.00 0.00 0.00 0
. (label_id: 2) 0.00 0.00 0.00 0
, (label_id: 3) 69.55 78.48 73.75 8084
? (label_id: 4) 63.30 87.07 73.31 317
? (label_id: 5) 0.00 0.00 0.00 0
, (label_id: 6) 0.00 0.00 0.00 0
。 (label_id: 7) 0.00 0.00 0.00 0
、 (label_id: 8) 0.00 0.00 0.00 0
・ (label_id: 9) 0.00 0.00 0.00 0
। (label_id: 10) 96.92 98.66 97.78 32118
؟ (label_id: 11) 0.00 0.00 0.00 0
، (label_id: 12) 0.00 0.00 0.00 0
; (label_id: 13) 0.00 0.00 0.00 0
። (label_id: 14) 0.00 0.00 0.00 0
፣ (label_id: 15) 0.00 0.00 0.00 0
፧ (label_id: 16) 0.00 0.00 0.00 0
-------------------
micro avg 99.11 99.11 99.11 600877
macro avg 82.38 90.91 86.11 600877
weighted avg 99.17 99.11 99.13 600877
```
```
cap test report:
label precision recall f1 support
LOWER (label_id: 0) 97.19 96.72 96.95 2466
UPPER (label_id: 1) 89.14 90.60 89.86 734
-------------------
micro avg 95.31 95.31 95.31 3200
macro avg 93.17 93.66 93.41 3200
weighted avg 95.34 95.31 95.33 3200
```
```
seg test report:
label precision recall f1 support
NOSTOP (label_id: 0) 100.00 99.99 99.99 569472
FULLSTOP (label_id: 1) 99.82 99.99 99.91 34405
-------------------
micro avg 99.99 99.99 99.99 603877
macro avg 99.91 99.99 99.95 603877
weighted avg 99.99 99.99 99.99 603877
```
</details>
<details>
<summary>Arabic</summary>
```text
punct_post test report:
label precision recall f1 support
<NULL> (label_id: 0) 99.30 96.94 98.10 688043
<ACRONYM> (label_id: 1) 93.33 77.78 84.85 18
. (label_id: 2) 93.31 93.78 93.54 28175
, (label_id: 3) 0.00 0.00 0.00 0
? (label_id: 4) 0.00 0.00 0.00 0
? (label_id: 5) 0.00 0.00 0.00 0
, (label_id: 6) 0.00 0.00 0.00 0
。 (label_id: 7) 0.00 0.00 0.00 0
、 (label_id: 8) 0.00 0.00 0.00 0
・ (label_id: 9) 0.00 0.00 0.00 0
। (label_id: 10) 0.00 0.00 0.00 0
؟ (label_id: 11) 65.93 82.79 73.40 860
، (label_id: 12) 44.89 79.20 57.30 20941
; (label_id: 13) 0.00 0.00 0.00 0
። (label_id: 14) 0.00 0.00 0.00 0
፣ (label_id: 15) 0.00 0.00 0.00 0
፧ (label_id: 16) 0.00 0.00 0.00 0
-------------------
micro avg 96.29 96.29 96.29 738037
macro avg 79.35 86.10 81.44 738037
weighted avg 97.49 96.29 96.74 738037
```
```
cap test report:
label precision recall f1 support
LOWER (label_id: 0) 97.10 99.49 98.28 4137
UPPER (label_id: 1) 98.71 92.89 95.71 1729
-------------------
micro avg 97.55 97.55 97.55 5866
macro avg 97.90 96.19 96.99 5866
weighted avg 97.57 97.55 97.52 5866
```
```
seg test report:
label precision recall f1 support
NOSTOP (label_id: 0) 99.99 99.97 99.98 710456
FULLSTOP (label_id: 1) 99.39 99.85 99.62 30581
-------------------
micro avg 99.97 99.97 99.97 741037
macro avg 99.69 99.91 99.80 741037
weighted avg 99.97 99.97 99.97 741037
```
</details>
# Extra Stuff
## Acronyms, abbreviations, and bi-capitalized words
This section briefly demonstrates the model's behavior when presented with the following:
1. Acronyms: "NATO"
2. Fake acronyms: "NHTG" in place of "NATO"
3. An ambiguous term which could be an acronym or a proper noun: "Tuny"
4. Bi-capitalized words: "McDavid"
5. Initialisms: "p.m."
<details open>
<summary>Acronyms, etc. inputs</summary>
```python
from typing import List

from punctuators.models import PunctCapSegModelONNX

# Load the ONNX inference wrapper for this model.
m: PunctCapSegModelONNX = PunctCapSegModelONNX.from_pretrained(
    "1-800-BAD-CODE/xlm-roberta_punctuation_fullstop_truecase"
)

input_texts = [
    "the us is a nato member as a nato member the country enjoys security guarantees notably article 5",
    "the us is a nhtg member as a nhtg member the country enjoys security guarantees notably article 5",
    "the us is a tuny member as a tuny member the country enjoys security guarantees notably article 5",
    "connor andrew mcdavid is a canadian professional ice hockey centre and captain of the edmonton oilers of the national hockey league the oilers selected him first overall in the 2015 nhl entry draft mcdavid spent his childhood playing ice hockey against older children",
    "please rsvp for the party asap preferably before 8 pm tonight",
]

results: List[List[str]] = m.infer(
    texts=input_texts, apply_sbd=True,
)

for input_text, output_texts in zip(input_texts, results):
    print(f"Input: {input_text}")
    print("Outputs:")
    for text in output_texts:
        print(f"\t{text}")
    print()
```
</details>
<details open>
<summary>Expected output</summary>
```text
Input: the us is a nato member as a nato member the country enjoys security guarantees notably article 5
Outputs:
The U.S. is a NATO member.
As a NATO member, the country enjoys security guarantees, notably Article 5.
Input: the us is a nhtg member as a nhtg member the country enjoys security guarantees notably article 5
Outputs:
The U.S. is a NHTG member.
As a NHTG member, the country enjoys security guarantees, notably Article 5.
Input: the us is a tuny member as a tuny member the country enjoys security guarantees notably article 5
Outputs:
The U.S. is a Tuny member.
As a Tuny member, the country enjoys security guarantees, notably Article 5.
Input: connor andrew mcdavid is a canadian professional ice hockey centre and captain of the edmonton oilers of the national hockey league the oilers selected him first overall in the 2015 nhl entry draft mcdavid spent his childhood playing ice hockey against older children
Outputs:
Connor Andrew McDavid is a Canadian professional ice hockey centre and captain of the Edmonton Oilers of the National Hockey League.
The Oilers selected him first overall in the 2015 NHL entry draft.
McDavid spent his childhood playing ice hockey against older children.
Input: please rsvp for the party asap preferably before 8 pm tonight
Outputs:
Please RSVP for the party ASAP, preferably before 8 p.m. tonight.
```
</details>
 | 46,586 | [embedding vector elided] |
tomh/toxigen_roberta | 2022-05-01T19:42:09.000Z | [
"transformers",
"pytorch",
"roberta",
"text-classification",
"en",
"arxiv:2203.09509",
"endpoints_compatible",
"region:us"
] | text-classification | tomh | null | null | tomh/toxigen_roberta | 3 | 31,713 | transformers | 2022-05-01T13:19:41 | ---
language:
- en
tags:
- text-classification
---
Thomas Hartvigsen, Saadia Gabriel, Hamid Palangi, Maarten Sap, Dipankar Ray, Ece Kamar.
This model comes from the paper [ToxiGen: A Large-Scale Machine-Generated Dataset for Adversarial and Implicit Hate Speech Detection](https://arxiv.org/abs/2203.09509) and can be used to detect implicit hate speech.
Please visit the [Github Repository](https://github.com/microsoft/TOXIGEN) for the training dataset and further details.
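For quick experimentation, the checkpoint can be loaded through the standard `transformers` pipeline API. This is a minimal sketch, not an officially documented recipe; in particular, the label names in the output come from the model's config and are not specified in this card, so inspect them before thresholding on a class:

```python
# Hedged sketch: load the checkpoint as a sequence classifier.
# The exact label names returned (and which one means "toxic") are
# assumptions to verify against the model's config, not documented facts.
from transformers import pipeline

classifier = pipeline("text-classification", model="tomh/toxigen_roberta")
for result in classifier(["you are a wonderful person", "example input"]):
    print(result["label"], round(result["score"], 3))
```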
```bibtex
@inproceedings{hartvigsen2022toxigen,
  title = "{T}oxi{G}en: A Large-Scale Machine-Generated Dataset for Adversarial and Implicit Hate Speech Detection",
  author = "Hartvigsen, Thomas and Gabriel, Saadia and Palangi, Hamid and Sap, Maarten and Ray, Dipankar and Kamar, Ece",
  booktitle = "Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics",
  year = "2022"
}
``` | 904 | [
[
0.021881103515625,
0.08551025390625,
0.029571533203125,
-0.0201263427734375,
0.048492431640625,
-0.014556884765625,
-0.008331298828125,
-0.0499267578125,
-0.0200042724609375,
-0.0706787109375,
-0.01103973388671875,
-0.049224853515625,
-0.01544952392578125,
0.033172607421875,
-0.01971435546875,
-0.060821533203125,
0.0170135498046875,
-0.0213165283203125,
0.023651123046875,
0.0281524658203125,
0.0131988525390625,
-0.009613037109375,
0.028045654296875,
-0.010406494140625,
-0.0019483566284179688,
-0.02984619140625,
-0.017242431640625,
0.09393310546875,
0.0269927978515625,
0.021026611328125,
0.0267181396484375,
0.03314208984375,
0.037109375,
0.02581787109375,
-0.04107666015625,
0.048492431640625,
-0.0243988037109375,
-0.06396484375,
-0.0185089111328125,
-0.03857421875,
-0.0771484375,
0.00623321533203125,
-0.0192413330078125,
-0.06597900390625,
0.01430511474609375,
0.004669189453125,
-0.0106353759765625,
0.032958984375,
-0.038482666015625,
0.06060791015625,
-0.0008106231689453125,
-0.033355712890625,
-0.01441192626953125,
-0.0297393798828125,
0.0178985595703125,
-0.00040268898010253906,
0.009552001953125,
-0.005329132080078125,
0.004871368408203125,
0.083984375,
-0.0101318359375,
0.07012939453125,
-0.0308380126953125,
-0.02984619140625,
0.0245361328125,
-0.0119476318359375,
0.047332763671875,
-0.014617919921875,
-0.0301055908203125,
0.040008544921875,
-0.050201416015625,
-0.031768798828125,
-0.01348114013671875,
0.03985595703125,
-0.07598876953125,
-0.027252197265625,
-0.0352783203125,
-0.042327880859375,
-0.0136260986328125,
0.017364501953125,
0.040740966796875,
0.035980224609375,
-0.0316162109375,
0.0130157470703125,
0.0577392578125,
-0.02056884765625,
0.028961181640625,
0.04913330078125,
-0.00027632713317871094,
-0.0518798828125,
0.056793212890625,
0.006877899169921875,
0.018280029296875,
0.008148193359375,
0.006900787353515625,
-0.0243377685546875,
-0.0540771484375,
-0.018524169921875,
0.033966064453125,
-0.036224365234375,
-0.003452301025390625,
-0.0604248046875,
-0.0262908935546875,
-0.046478271484375,
0.0101776123046875,
-0.01418304443359375,
-0.01264190673828125,
-0.0305023193359375,
0.004886627197265625,
0.0277099609375,
0.043792724609375,
-0.006969451904296875,
0.03570556640625,
-0.0304412841796875,
0.0301361083984375,
0.022247314453125,
0.01369476318359375,
0.0029048919677734375,
-0.0853271484375,
-0.0127716064453125,
0.0172882080078125,
-0.025390625,
-0.06915283203125,
0.0361328125,
0.014068603515625,
0.049468994140625,
0.0247802734375,
0.0186920166015625,
0.031768798828125,
-0.036773681640625,
0.06427001953125,
0.006961822509765625,
-0.06365966796875,
0.0460205078125,
-0.045562744140625,
-0.00539398193359375,
0.0267181396484375,
0.0499267578125,
-0.025726318359375,
-0.024566650390625,
-0.0556640625,
-0.04986572265625,
0.07452392578125,
0.0230865478515625,
-0.0016946792602539062,
0.0174102783203125,
0.0037708282470703125,
0.005222320556640625,
0.016265869140625,
-0.08197021484375,
-0.045928955078125,
-0.00862884521484375,
-0.026397705078125,
-0.01690673828125,
-0.039093017578125,
-0.0312347412109375,
-0.0224609375,
0.0777587890625,
0.00807952880859375,
0.050048828125,
-0.01349639892578125,
-0.0097503662109375,
0.00045037269592285156,
0.0183868408203125,
0.06622314453125,
0.015777587890625,
-0.0013418197631835938,
-0.0013551712036132812,
0.01177215576171875,
-0.038482666015625,
-0.006641387939453125,
0.0177459716796875,
-0.018798828125,
-0.00785064697265625,
0.0291900634765625,
0.09228515625,
-0.03387451171875,
-0.030853271484375,
0.04449462890625,
0.002105712890625,
-0.051361083984375,
-0.045318603515625,
0.00749969482421875,
-0.024566650390625,
0.0217742919921875,
0.0177459716796875,
0.004901885986328125,
0.0198211669921875,
-0.023101806640625,
0.0219268798828125,
0.03997802734375,
-0.040283203125,
-0.029693603515625,
0.050537109375,
0.007236480712890625,
-0.0291900634765625,
0.046630859375,
-0.025421142578125,
-0.0657958984375,
0.0281829833984375,
0.044708251953125,
0.08489990234375,
-0.019439697265625,
0.007793426513671875,
0.049560546875,
0.00823974609375,
0.0219268798828125,
0.00711822509765625,
-0.00365447998046875,
-0.06658935546875,
0.006351470947265625,
-0.04534912109375,
-0.005229949951171875,
0.049560546875,
-0.0333251953125,
0.01192474365234375,
-0.0159759521484375,
0.00637054443359375,
0.0185089111328125,
-0.0175628662109375,
-0.024078369140625,
0.0192413330078125,
0.038299560546875,
0.04473876953125,
-0.083984375,
0.0662841796875,
0.03399658203125,
-0.0143585205078125,
-0.04254150390625,
0.0125274658203125,
0.0130462646484375,
-0.06585693359375,
0.034698486328125,
0.00342559814453125,
-0.02313232421875,
0.0025482177734375,
-0.06744384765625,
-0.06134033203125,
0.0518798828125,
0.035888671875,
-0.02667236328125,
0.01445770263671875,
0.0273590087890625,
0.049163818359375,
-0.033477783203125,
0.006580352783203125,
0.034454345703125,
0.02618408203125,
0.01242828369140625,
-0.037872314453125,
-0.01458740234375,
-0.04327392578125,
0.0082244873046875,
-0.0007319450378417969,
-0.04437255859375,
0.07232666015625,
0.002117156982421875,
-0.0166778564453125,
-0.01201629638671875,
0.06158447265625,
0.0186004638671875,
0.03204345703125,
0.057861328125,
0.05499267578125,
0.04443359375,
-0.0079345703125,
0.050628662109375,
-0.0237579345703125,
0.0174713134765625,
0.09356689453125,
-0.00249481201171875,
0.05206298828125,
0.00884246826171875,
-0.017669677734375,
0.035247802734375,
0.031036376953125,
-0.0179443359375,
0.0400390625,
0.022003173828125,
-0.032318115234375,
-0.00946807861328125,
-0.0228118896484375,
-0.04205322265625,
0.02874755859375,
0.034820556640625,
-0.049713134765625,
-0.02001953125,
0.019134521484375,
0.020233154296875,
0.0202178955078125,
-0.0291748046875,
0.053466796875,
-0.0277099609375,
-0.034698486328125,
0.08099365234375,
-0.0023975372314453125,
0.06103515625,
-0.03271484375,
0.01099395751953125,
0.0131683349609375,
0.0218658447265625,
-0.0122528076171875,
-0.054534912109375,
0.0157470703125,
0.03399658203125,
-0.034637451171875,
-0.0141448974609375,
0.0304718017578125,
-0.054351806640625,
-0.0269622802734375,
0.043060302734375,
-0.00836944580078125,
0.0166015625,
-0.0017938613891601562,
-0.0634765625,
0.0078582763671875,
0.0259246826171875,
-0.017333984375,
0.01763916015625,
0.0295257568359375,
0.0007314682006835938,
0.03436279296875,
0.0413818359375,
-0.0013055801391601562,
0.02685546875,
0.010345458984375,
0.058807373046875,
-0.023773193359375,
-0.04290771484375,
-0.06475830078125,
0.03485107421875,
-0.0228118896484375,
-0.037872314453125,
0.060455322265625,
0.061767578125,
0.07598876953125,
-0.01003265380859375,
0.07904052734375,
-0.00771331787109375,
0.06268310546875,
-0.0242156982421875,
0.041900634765625,
-0.04595947265625,
-0.00734710693359375,
-0.022735595703125,
-0.06268310546875,
-0.00494384765625,
0.03594970703125,
-0.03662109375,
0.0146942138671875,
0.043121337890625,
0.09039306640625,
0.006801605224609375,
-0.0021572113037109375,
0.028594970703125,
0.0504150390625,
0.034820556640625,
0.033477783203125,
0.0411376953125,
-0.040618896484375,
0.049713134765625,
-0.0065155029296875,
-0.00960540771484375,
-0.006134033203125,
-0.051239013671875,
-0.06787109375,
-0.05230712890625,
-0.042205810546875,
-0.059356689453125,
0.007720947265625,
0.07073974609375,
0.0440673828125,
-0.0853271484375,
-0.027069091796875,
-0.01221466064453125,
-0.006961822509765625,
0.0017547607421875,
-0.00951385498046875,
0.0227813720703125,
0.002597808837890625,
-0.0687255859375,
0.0259246826171875,
-0.01861572265625,
0.0207672119140625,
0.00951385498046875,
-0.022491455078125,
-0.02703857421875,
0.01338958740234375,
0.034149169921875,
0.023895263671875,
-0.038330078125,
-0.01727294921875,
-0.0243072509765625,
-0.024505615234375,
0.007610321044921875,
0.033935546875,
-0.02935791015625,
0.01013946533203125,
0.0225677490234375,
0.034637451171875,
0.027069091796875,
-0.007610321044921875,
0.0256805419921875,
-0.05999755859375,
0.028656005859375,
0.0088653564453125,
0.0160980224609375,
0.0088958740234375,
-0.0232391357421875,
0.036773681640625,
0.034698486328125,
-0.03363037109375,
-0.0765380859375,
0.0289154052734375,
-0.0782470703125,
-0.01349639892578125,
0.1177978515625,
-0.0035858154296875,
-0.025421142578125,
-0.01007843017578125,
-0.01313018798828125,
0.04803466796875,
-0.06011962890625,
0.046905517578125,
0.036590576171875,
0.019439697265625,
0.0017910003662109375,
-0.007007598876953125,
0.046844482421875,
0.0022830963134765625,
-0.059844970703125,
0.0179595947265625,
0.041290283203125,
0.017547607421875,
0.0244293212890625,
0.0204620361328125,
-0.004489898681640625,
0.0110626220703125,
0.002681732177734375,
0.0225677490234375,
-0.0124664306640625,
-0.0189208984375,
-0.03265380859375,
0.0399169921875,
-0.0251007080078125,
-0.0253448486328125
]
] |
pszemraj/grammar-synthesis-small | 2023-04-28T01:45:36.000Z | [
"transformers",
"pytorch",
"onnx",
"safetensors",
"t5",
"text2text-generation",
"grammar",
"spelling",
"punctuation",
"error-correction",
"grammar synthesis",
"FLAN",
"dataset:jfleg",
"arxiv:2107.06751",
"license:cc-by-nc-sa-4.0",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text2text-generation | pszemraj | null | null | pszemraj/grammar-synthesis-small | 4 | 31,698 | transformers | 2022-07-09T23:00:37 | ---
language:
- en
license:
- cc-by-nc-sa-4.0
- apache-2.0
tags:
- grammar
- spelling
- punctuation
- error-correction
- grammar synthesis
- FLAN
datasets:
- jfleg
widget:
- text: "There car broke down so their hitching a ride to they're class."
example_title: "compound-1"
- text: "i can has cheezburger"
example_title: "cheezburger"
- text: "so em if we have an now so with fito ringina know how to estimate the tren given the ereafte mylite trend we can also em an estimate is nod s
i again tort watfettering an we have estimated the trend an
called wot to be called sthat of exty right now we can and look at
wy this should not hare a trend i becan we just remove the trend an and we can we now estimate
tesees ona effect of them exty"
example_title: "Transcribed Audio Example 2"
- text: "My coworker said he used a financial planner to help choose his stocks so he wouldn't loose money."
example_title: "incorrect word choice (context)"
- text: "good so hve on an tadley i'm not able to make it to the exla session on monday this week e which is why i am e recording pre recording
an this excelleision and so to day i want e to talk about two things and first of all em i wont em wene give a summary er about
ta ohow to remove trents in these nalitives from time series"
example_title: "lowercased audio transcription output"
- text: "Frustrated, the chairs took me forever to set up."
example_title: "dangling modifier"
- text: "I would like a peice of pie."
example_title: "miss-spelling"
- text: "Which part of Zurich was you going to go hiking in when we were there for the first time together? ! ?"
example_title: "chatbot on Zurich"
- text: "Most of the course is about semantic or content of language but there are also interesting topics to be learned from the servicefeatures except statistics in characters in documents. At this point, Elvthos introduces himself as his native English speaker and goes on to say that if you continue to work on social scnce,"
example_title: "social science ASR summary output"
- text: "they are somewhat nearby right yes please i'm not sure how the innish is tepen thut mayyouselect one that istatte lo variants in their property e ere interested and anyone basical e may be applyind reaching the browing approach were"
- "medical course audio transcription"
parameters:
max_length: 128
min_length: 4
num_beams: 8
repetition_penalty: 1.21
length_penalty: 1
early_stopping: True
---
# grammar-synthesis-small (beta)
This model is a fine-tuned version of [google/t5-small-lm-adapt](https://huggingface.co/google/t5-small-lm-adapt) for grammar correction on an expanded version of the [JFLEG](https://paperswithcode.com/dataset/jfleg) dataset.
Usage in Python (after `pip install transformers`):
```python
from transformers import pipeline
corrector = pipeline(
'text2text-generation',
'pszemraj/grammar-synthesis-small',
)
raw_text = 'i can has cheezburger'
results = corrector(raw_text)
print(results)
```
Check out a simple demo in [Google Colab here](https://colab.research.google.com/gist/pszemraj/06fac5b608889e258229a659cc53485f/demo-for-grammar-synthesis-small.ipynb).
## Model description
The intent is to create a text2text language model that successfully completes "single-shot grammar correction" on potentially grammatically incorrect text **that could have a lot of mistakes**, with the important qualifier that **it does not semantically change text/information that IS grammatically correct.**
Compare some of the heavier-error examples on [other grammar correction models](https://huggingface.co/models?dataset=dataset:jfleg) to see the difference :)
## Limitations
- dataset: `cc-by-nc-sa-4.0`
- model: `apache-2.0`
- this is **still a work-in-progress** and while probably useful for "single-shot grammar correction" in a lot of cases, **give the outputs a glance for correctness ok?**
## Use Cases
Obviously, this section is quite general as there are many things one can use "general single-shot grammar correction" for. Some ideas or use cases:
1. Correcting highly error-prone LM outputs. Some examples would be audio transcription (ASR) (this is literally some of the examples) or something like handwriting OCR.
- To be investigated further, depending on what model/system is used it _might_ be worth it to apply this after OCR on typed characters.
2. Correcting/infilling text generated by text generation models to be cohesive/remove obvious errors that break the conversation immersion. I use this on the outputs of [this OPT 2.7B chatbot-esque model of myself](https://huggingface.co/pszemraj/opt-peter-2.7B).
> An example of this model running on CPU with beam search:
```
original response:
ive heard it attributed to a bunch of different philosophical schools, including stoicism, pragmatism, existentialism and even some forms of post-structuralism. i think one of the most interesting (and most difficult) philosophical problems is trying to let dogs (or other animals) out of cages. the reason why this is a difficult problem is because it seems to go against our grain (so to
synthesizing took 306.12 seconds
Final response in 1294.857 s:
I've heard it attributed to a bunch of different philosophical schools, including solipsism, pragmatism, existentialism and even some forms of post-structuralism. i think one of the most interesting (and most difficult) philosophical problems is trying to let dogs (or other animals) out of cages. the reason why this is a difficult problem is because it seems to go against our grain (so to speak)
```
_Note: I have some other logic that removes any periods at the end of the final sentence in this chatbot setting [to avoid coming off as passive aggressive](https://www.npr.org/2020/09/05/909969004/before-texting-your-kid-make-sure-to-double-check-your-punctuation)_
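The trailing-period removal mentioned above can be as simple as a regex over the end of the reply. A hypothetical sketch (not the exact logic used here), which strips a single final period but leaves `...`, `!`, and `?` alone:

```python
import re

def strip_final_period(text: str) -> str:
    """Remove one trailing period (and any trailing whitespace),
    but leave ellipses and other end punctuation untouched."""
    # (?<!\.) ensures the matched final dot is not part of "..."
    return re.sub(r"(?<!\.)\.\s*$", "", text)
```

For example, `strip_final_period("Sounds good.")` returns `"Sounds good"`, while `"Well..."` and `"Really?"` pass through unchanged.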
3. Somewhat related to #2 above, fixing/correcting so-called [tortured phrases](https://arxiv.org/abs/2107.06751) that are dead giveaways that text was generated by a language model. _Note that **some** of these are not fixed, especially as they venture into domain-specific terminology (e.g. "irregular timberland" instead of "Random Forest")._
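For long inputs like full ASR transcripts, one practical approach is to split the text into roughly sentence-sized chunks and run the corrector on each chunk. A minimal splitter sketch (the character budget and greedy packing are illustrative choices, not something shipped with this repo):

```python
import re

def split_into_chunks(text: str, max_chars: int = 256) -> list[str]:
    """Greedily pack sentence-ish fragments into chunks under max_chars,
    so each piece fits comfortably in the model's input length."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    chunks, current = [], ""
    for sent in sentences:
        if current and len(current) + 1 + len(sent) > max_chars:
            chunks.append(current)
            current = sent
        else:
            current = f"{current} {sent}".strip()
    if current:
        chunks.append(current)
    return chunks

# each chunk can then be passed through the pipeline individually, e.g.:
# corrected = " ".join(corrector(c)[0]["generated_text"] for c in chunks)
```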
## Training and evaluation data
More information needed 😉
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0004
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- distributed_type: multi-GPU
- gradient_accumulation_steps: 32
- total_train_batch_size: 512
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.03
- num_epochs: 4
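The reported total train batch size is consistent with the per-device batch size and gradient accumulation (a quick sanity check; the world size of 1 is inferred from the numbers, not stated above):

```python
train_batch_size = 16
gradient_accumulation_steps = 32
world_size = 1  # inferred: 16 * 32 * 1 matches the reported 512

total_train_batch_size = train_batch_size * gradient_accumulation_steps * world_size
print(total_train_batch_size)  # 512
```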
### Training results
### Framework versions
- Transformers 4.20.1
- Pytorch 1.11.0+cu113
- Datasets 2.3.2
- Tokenizers 0.12.1
| 6,876 | [
[
-0.019439697265625,
-0.091796875,
0.029266357421875,
0.0208587646484375,
-0.0171661376953125,
-0.01058197021484375,
-0.02215576171875,
-0.0225372314453125,
-0.00021076202392578125,
0.0201416015625,
-0.05694580078125,
-0.0292205810546875,
-0.03424072265625,
0.0217132568359375,
-0.026275634765625,
0.0784912109375,
-0.00139617919921875,
0.01024627685546875,
0.007080078125,
0.00615692138671875,
-0.036834716796875,
-0.045806884765625,
-0.07159423828125,
-0.0208740234375,
0.032745361328125,
0.021728515625,
0.052886962890625,
0.05255126953125,
0.055877685546875,
0.0256805419921875,
-0.0254364013671875,
0.01006317138671875,
-0.0269775390625,
-0.006206512451171875,
-0.0167999267578125,
-0.037200927734375,
-0.03887939453125,
0.01265716552734375,
0.050689697265625,
0.0576171875,
-0.0088958740234375,
0.0098876953125,
-0.005481719970703125,
0.05535888671875,
-0.031280517578125,
0.01275634765625,
-0.0380859375,
0.0279998779296875,
-0.018218994140625,
-0.0010442733764648438,
-0.0428466796875,
-0.0253753662109375,
-0.00467681884765625,
-0.05364990234375,
0.0181884765625,
0.00403594970703125,
0.08624267578125,
0.029876708984375,
-0.0206146240234375,
-0.0288543701171875,
-0.057098388671875,
0.050079345703125,
-0.056121826171875,
0.009521484375,
0.033233642578125,
0.009063720703125,
-0.040802001953125,
-0.07818603515625,
-0.046783447265625,
-0.007106781005859375,
-0.021148681640625,
0.023468017578125,
-0.026702880859375,
0.017852783203125,
0.04766845703125,
0.030517578125,
-0.044219970703125,
-0.008880615234375,
-0.039276123046875,
-0.0184173583984375,
0.025299072265625,
0.003818511962890625,
0.032806396484375,
-0.00942230224609375,
-0.0149383544921875,
-0.004680633544921875,
-0.047576904296875,
0.03582763671875,
0.0242919921875,
0.039154052734375,
0.0004150867462158203,
0.055328369140625,
-0.008209228515625,
0.048980712890625,
0.0161590576171875,
-0.00684356689453125,
0.01311492919921875,
-0.034454345703125,
-0.0255584716796875,
-0.022125244140625,
0.05877685546875,
0.02642822265625,
0.03106689453125,
0.00011152029037475586,
0.00962066650390625,
0.029144287109375,
0.0157623291015625,
-0.063720703125,
-0.022705078125,
0.012481689453125,
-0.019744873046875,
-0.0264434814453125,
-0.01255035400390625,
-0.037109375,
0.00011336803436279297,
-0.007228851318359375,
0.0308074951171875,
-0.051971435546875,
0.0096893310546875,
0.035797119140625,
-0.0247650146484375,
-0.0003809928894042969,
0.018218994140625,
-0.057647705078125,
0.0142822265625,
0.03399658203125,
0.057037353515625,
0.04217529296875,
-0.0225372314453125,
-0.040863037109375,
-0.010711669921875,
-0.04376220703125,
0.06768798828125,
-0.040069580078125,
-0.0209503173828125,
-0.00833892822265625,
0.023101806640625,
-0.0123138427734375,
-0.035400390625,
0.056396484375,
-0.00861358642578125,
0.057891845703125,
-0.0177459716796875,
-0.048858642578125,
-0.0126190185546875,
0.0111083984375,
-0.0247039794921875,
0.09674072265625,
-0.00859832763671875,
-0.03973388671875,
0.0225067138671875,
-0.05072021484375,
-0.029388427734375,
-0.00977325439453125,
0.013763427734375,
-0.039154052734375,
0.00745391845703125,
0.0049896240234375,
0.0164642333984375,
-0.0021915435791015625,
0.0214080810546875,
-0.0233917236328125,
-0.02410888671875,
0.0223388671875,
-0.0164337158203125,
0.06817626953125,
0.0009312629699707031,
-0.026214599609375,
0.0030117034912109375,
-0.06512451171875,
0.0203399658203125,
0.01507568359375,
-0.0347900390625,
-0.0245819091796875,
0.005496978759765625,
-0.0204315185546875,
0.029022216796875,
-0.0026092529296875,
-0.037689208984375,
0.02789306640625,
-0.051788330078125,
0.048095703125,
0.050628662109375,
-0.006359100341796875,
0.03466796875,
-0.034576416015625,
0.02239990234375,
-0.01543426513671875,
0.0247039794921875,
-0.0143585205078125,
-0.03253173828125,
-0.07501220703125,
-0.033935546875,
0.024139404296875,
0.05035400390625,
-0.041900634765625,
0.052734375,
0.0090789794921875,
-0.04571533203125,
-0.041229248046875,
0.01158905029296875,
0.053619384765625,
0.044097900390625,
0.029144287109375,
-0.028228759765625,
-0.053619384765625,
-0.057891845703125,
-0.0277557373046875,
-0.02899169921875,
0.00970458984375,
-0.00859832763671875,
0.05035400390625,
-0.00286865234375,
0.07232666015625,
-0.03240966796875,
-0.0125579833984375,
-0.004367828369140625,
0.002307891845703125,
0.01439666748046875,
0.035430908203125,
0.037078857421875,
-0.03826904296875,
-0.029052734375,
-0.0254364013671875,
-0.06890869140625,
-0.01507568359375,
-0.0032100677490234375,
-0.0282745361328125,
0.01175689697265625,
0.047393798828125,
-0.06280517578125,
0.0288543701171875,
0.0159454345703125,
-0.03924560546875,
0.037689208984375,
-0.005878448486328125,
-0.007965087890625,
-0.0777587890625,
0.0101470947265625,
-0.0012683868408203125,
-0.0245819091796875,
-0.056549072265625,
0.0189056396484375,
-0.002773284912109375,
-0.0130157470703125,
-0.041229248046875,
0.04144287109375,
-0.0233306884765625,
0.0235748291015625,
-0.01096343994140625,
0.017059326171875,
0.0007457733154296875,
0.041656494140625,
-0.01525115966796875,
0.0650634765625,
0.030242919921875,
-0.040191650390625,
0.026611328125,
0.0208587646484375,
-0.0013675689697265625,
0.032257080078125,
-0.053924560546875,
0.01322174072265625,
-0.01297760009765625,
0.0300750732421875,
-0.07269287109375,
-0.0180816650390625,
0.06256103515625,
-0.0347900390625,
0.006072998046875,
-0.00150299072265625,
-0.038421630859375,
-0.0399169921875,
-0.034912109375,
0.006099700927734375,
0.04400634765625,
-0.029998779296875,
0.0286712646484375,
0.00482940673828125,
-0.0191802978515625,
-0.04437255859375,
-0.048095703125,
0.0009450912475585938,
-0.01165771484375,
-0.05419921875,
0.021453857421875,
-0.025787353515625,
-0.0295562744140625,
0.0105743408203125,
-0.01300811767578125,
0.0036830902099609375,
0.005428314208984375,
0.01413726806640625,
0.0211944580078125,
-0.0105438232421875,
0.027740478515625,
-0.00616455078125,
-0.01556396484375,
-0.01288604736328125,
-0.004848480224609375,
0.051239013671875,
-0.033203125,
-0.01093292236328125,
-0.0364990234375,
0.01806640625,
0.030242919921875,
-0.0006480216979980469,
0.042633056640625,
0.042938232421875,
-0.037445068359375,
-0.0163726806640625,
-0.025634765625,
-0.0215301513671875,
-0.037322998046875,
0.024658203125,
-0.021087646484375,
-0.055145263671875,
0.03948974609375,
-0.00669097900390625,
0.000705718994140625,
0.048370361328125,
0.03399658203125,
-0.03265380859375,
0.0748291015625,
0.042938232421875,
0.01311492919921875,
0.0545654296875,
-0.005950927734375,
0.037109375,
-0.05169677734375,
-0.00033926963806152344,
-0.04144287109375,
-0.0143280029296875,
-0.04998779296875,
-0.0134735107421875,
0.01116180419921875,
0.030731201171875,
-0.025909423828125,
0.033599853515625,
-0.0380859375,
0.038116455078125,
0.04241943359375,
0.01139068603515625,
0.0181732177734375,
0.00397491455078125,
0.002536773681640625,
-0.006744384765625,
-0.05322265625,
-0.051239013671875,
0.070556640625,
0.0206146240234375,
0.051025390625,
-0.0035343170166015625,
0.03399658203125,
0.0133209228515625,
0.0020008087158203125,
-0.06988525390625,
0.055389404296875,
-0.029022216796875,
-0.06292724609375,
-0.0197906494140625,
-0.01837158203125,
-0.06622314453125,
0.0201416015625,
-0.01751708984375,
-0.05572509765625,
-0.001590728759765625,
0.016265869140625,
-0.03448486328125,
0.0171661376953125,
-0.07440185546875,
0.07708740234375,
-0.0105743408203125,
-0.0330810546875,
-0.00537109375,
-0.06011962890625,
0.0160980224609375,
0.0022430419921875,
0.006732940673828125,
-0.01788330078125,
0.01271820068359375,
0.04931640625,
-0.056976318359375,
0.08306884765625,
-0.0037441253662109375,
0.01380157470703125,
0.0294189453125,
-0.00749969482421875,
0.037689208984375,
-0.0108642578125,
0.00815582275390625,
-0.00298309326171875,
0.004787445068359375,
-0.029815673828125,
-0.038665771484375,
0.0178680419921875,
-0.059051513671875,
-0.0496826171875,
-0.050384521484375,
-0.035247802734375,
0.0007457733154296875,
0.0195159912109375,
0.023406982421875,
0.033782958984375,
-0.01558685302734375,
0.0276641845703125,
0.0296173095703125,
-0.0146942138671875,
0.04730224609375,
0.041717529296875,
-0.0265350341796875,
-0.0266876220703125,
0.04205322265625,
0.0100860595703125,
0.0175628662109375,
0.01338958740234375,
0.0204925537109375,
-0.00665283203125,
-0.01165008544921875,
-0.037872314453125,
0.031890869140625,
-0.052276611328125,
-0.01409912109375,
-0.038360595703125,
-0.028228759765625,
-0.032135009765625,
-0.020599365234375,
-0.036834716796875,
-0.044403076171875,
-0.0227508544921875,
-0.00439453125,
0.022064208984375,
0.029022216796875,
0.0012006759643554688,
0.0474853515625,
-0.040924072265625,
0.016998291015625,
0.01155853271484375,
0.0150604248046875,
-0.037750244140625,
-0.040924072265625,
-0.0022945404052734375,
0.00920867919921875,
-0.03521728515625,
-0.0587158203125,
0.0240020751953125,
0.03179931640625,
0.0269775390625,
0.01708984375,
-0.0105438232421875,
0.07684326171875,
-0.039154052734375,
0.07073974609375,
0.0152740478515625,
-0.0787353515625,
0.060699462890625,
-0.01520538330078125,
0.0142059326171875,
0.038299560546875,
0.0135650634765625,
-0.062286376953125,
-0.04364013671875,
-0.05169677734375,
-0.0758056640625,
0.0648193359375,
0.030242919921875,
0.016448974609375,
-0.005931854248046875,
0.046661376953125,
-0.0019464492797851562,
0.0166015625,
-0.0721435546875,
-0.01456451416015625,
-0.00959014892578125,
-0.01459503173828125,
0.00353240966796875,
-0.016143798828125,
-0.0189056396484375,
-0.029510498046875,
0.08447265625,
-0.0028057098388671875,
0.0238494873046875,
0.0224761962890625,
-0.0109710693359375,
-0.0141448974609375,
0.036651611328125,
0.049041748046875,
0.039794921875,
-0.022918701171875,
0.00887298583984375,
0.0080413818359375,
-0.024993896484375,
0.01285552978515625,
0.01169586181640625,
-0.0239410400390625,
0.00818634033203125,
0.027130126953125,
0.057861328125,
-0.0016937255859375,
-0.045867919921875,
0.03204345703125,
-0.00530242919921875,
-0.007129669189453125,
-0.042572021484375,
0.0079498291015625,
-0.00629425048828125,
0.0253143310546875,
0.00801849365234375,
0.025238037109375,
-0.0035247802734375,
-0.0712890625,
-0.0004534721374511719,
0.0227508544921875,
-0.01325225830078125,
-0.034881591796875,
0.036834716796875,
0.008880615234375,
-0.04364013671875,
0.045623779296875,
-0.038543701171875,
-0.037261962890625,
0.04559326171875,
0.057830810546875,
0.057708740234375,
-0.0158233642578125,
0.0217132568359375,
0.03887939453125,
0.02545166015625,
-0.0222015380859375,
0.0212554931640625,
0.0166015625,
-0.06390380859375,
-0.0241851806640625,
-0.044281005859375,
-0.0279541015625,
0.0093841552734375,
-0.03790283203125,
0.045562744140625,
-0.05206298828125,
-0.037750244140625,
0.01326751708984375,
-0.0094451904296875,
-0.0411376953125,
0.0171051025390625,
-0.0015277862548828125,
0.05877685546875,
-0.07330322265625,
0.0517578125,
0.06768798828125,
-0.0540771484375,
-0.06536865234375,
-0.004344940185546875,
0.005268096923828125,
-0.058837890625,
0.03179931640625,
0.0010547637939453125,
-0.00970458984375,
-0.01467132568359375,
-0.04876708984375,
-0.053375244140625,
0.06298828125,
0.0308074951171875,
-0.04351806640625,
-0.0297393798828125,
0.005168914794921875,
0.0677490234375,
-0.00927734375,
0.025299072265625,
0.035369873046875,
0.029998779296875,
0.0009069442749023438,
-0.07281494140625,
0.0110015869140625,
-0.04034423828125,
0.005367279052734375,
0.006931304931640625,
-0.0648193359375,
0.0762939453125,
-0.0015115737915039062,
-0.0191497802734375,
0.015594482421875,
0.0229339599609375,
0.012481689453125,
0.0243682861328125,
0.030792236328125,
0.045318603515625,
0.068115234375,
-0.022705078125,
0.06756591796875,
-0.0288543701171875,
0.0474853515625,
0.07928466796875,
-0.0008664131164550781,
0.055877685546875,
0.046600341796875,
-0.0195159912109375,
0.034576416015625,
0.0574951171875,
-0.00849151611328125,
0.043121337890625,
0.0023632049560546875,
-0.00969696044921875,
-0.0166168212890625,
-0.002376556396484375,
-0.03564453125,
0.035888671875,
0.0180511474609375,
-0.037353515625,
-0.01544952392578125,
0.001186370849609375,
0.0406494140625,
-0.0041961669921875,
-0.00597381591796875,
0.059295654296875,
-0.000274658203125,
-0.052093505859375,
0.03753662109375,
0.01824951171875,
0.048919677734375,
-0.05047607421875,
0.0101470947265625,
-0.0241851806640625,
0.0325927734375,
-0.0124664306640625,
-0.04766845703125,
0.03485107421875,
0.01021575927734375,
-0.015655517578125,
-0.0249176025390625,
0.07379150390625,
-0.03179931640625,
-0.043182373046875,
0.0233154296875,
0.039306640625,
0.0135498046875,
-0.00231170654296875,
-0.04461669921875,
-0.002635955810546875,
-0.003204345703125,
-0.018798828125,
-0.00463104248046875,
0.028564453125,
0.00403594970703125,
0.042694091796875,
0.052947998046875,
0.0145263671875,
0.009674072265625,
-0.002277374267578125,
0.061859130859375,
-0.053680419921875,
-0.02587890625,
-0.07861328125,
0.05047607421875,
-0.0098419189453125,
-0.039031982421875,
0.0450439453125,
0.042022705078125,
0.0859375,
-0.01282501220703125,
0.06219482421875,
-0.0236053466796875,
0.022308349609375,
-0.04144287109375,
0.021759033203125,
-0.0465087890625,
0.003520965576171875,
-0.0219573974609375,
-0.055633544921875,
-0.038177490234375,
0.0703125,
-0.03753662109375,
-0.00799560546875,
0.076171875,
0.0738525390625,
0.001972198486328125,
0.0037174224853515625,
0.017822265625,
0.0256195068359375,
0.0247039794921875,
0.036651611328125,
0.07330322265625,
-0.049652099609375,
0.059356689453125,
-0.0274658203125,
-0.00887298583984375,
-0.01708984375,
-0.045196533203125,
-0.06695556640625,
-0.038421630859375,
-0.0379638671875,
-0.042510986328125,
0.0277557373046875,
0.08819580078125,
0.041473388671875,
-0.05230712890625,
-0.005512237548828125,
-0.01519012451171875,
-0.007320404052734375,
-0.028076171875,
-0.018463134765625,
0.018218994140625,
-0.04522705078125,
-0.0645751953125,
0.0200347900390625,
0.0225372314453125,
-0.0019407272338867188,
0.0201263427734375,
0.0121002197265625,
-0.0194854736328125,
0.0127105712890625,
0.05401611328125,
0.0248260498046875,
-0.06591796875,
-0.058807373046875,
0.01267242431640625,
-0.0247344970703125,
0.016845703125,
0.047454833984375,
-0.033721923828125,
0.04107666015625,
0.034942626953125,
0.022674560546875,
0.03924560546875,
0.002780914306640625,
0.0185546875,
-0.060821533203125,
0.020294189453125,
0.005733489990234375,
0.03704833984375,
0.0220184326171875,
-0.015228271484375,
0.0565185546875,
0.0550537109375,
-0.038482666015625,
-0.06451416015625,
0.014923095703125,
-0.07696533203125,
-0.02667236328125,
0.086181640625,
-0.00914764404296875,
-0.029388427734375,
0.01255035400390625,
-0.04449462890625,
0.0435791015625,
-0.024658203125,
0.068115234375,
0.08251953125,
-0.0140838623046875,
-0.003749847412109375,
-0.0330810546875,
0.036895751953125,
0.046478271484375,
-0.06451416015625,
-0.00943756103515625,
0.04437255859375,
0.042572021484375,
0.01444244384765625,
0.06219482421875,
-0.012451171875,
0.0136871337890625,
0.0079193115234375,
0.02081298828125,
-0.00411224365234375,
-0.020172119140625,
-0.025543212890625,
0.0135345458984375,
-0.006336212158203125,
-0.0286712646484375
]
] |
NoCrypt/SomethingV2_2 | 2023-05-06T03:13:52.000Z | [
"diffusers",
"stable-diffusion",
"text-to-image",
"safetensors",
"en",
"license:creativeml-openrail-m",
"endpoints_compatible",
"has_space",
"diffusers:StableDiffusionPipeline",
"region:us"
] | text-to-image | NoCrypt | null | null | NoCrypt/SomethingV2_2 | 117 | 31,583 | diffusers | 2023-03-08T13:24:01 | ---
license: creativeml-openrail-m
thumbnail: >-
https://huggingface.co/NoCrypt/SomethingV2_2/resolve/main/images/thumbnail.webp
tags:
- stable-diffusion
- text-to-image
- safetensors
- diffusers
inference: true
language:
- en
widget:
- text: >-
masterpiece, masterpiece, masterpiece, best quality, ultra-detailed, 1girl, hatsune miku, blue hair, upper body, looking at viewer, ?, negative space, bioluminescence, bioluminescence, bioluminescence, darkness, wind, butterfly, black background, portrait, ice
example_title: example
library_name: diffusers
---
<center>
<img src="https://huggingface.co/NoCrypt/SomethingV2_2/resolve/main/images/Artboard%201.png"/>
<h1 style="font-size:1.6rem;">
<b>
SomethingV2.2
</b>
</h1>
<p>
Welcome to SomethingV2.2 - an improved anime latent diffusion model from <a href="https://huggingface.co/NoCrypt/SomethingV2">SomethingV2</a>
A lot of things have been discovered lately, such as a way to merge models with MBW automatically, offset noise to get much darker results, and even VAE tuning. This model is intended to use all of those discoveries. Here are some of the improvements that have been made:
</p>
<img src="https://huggingface.co/NoCrypt/SomethingV2_2/resolve/main/images/Artboard%202.png"/>
<h2>Can't trust the numbers? Here's some proof</h2>
</center>

<img style="display:inline;margin:0;padding:0;" src="https://huggingface.co/NoCrypt/SomethingV2_2/resolve/main/images/00019-1829045217-masterpiece%2C%20best%20quality%2C%20hatsune%20miku%2C%201girl%2C%20white%20shirt%2C%20blue%20necktie%2C%20bare%20shoulders%2C%20very%20detailed%20background%2C%20hands%20on%20ow.png" width="32%"/>
<img style="display:inline;margin:0;padding:0;" src="https://huggingface.co/NoCrypt/SomethingV2_2/resolve/main/images/00018-1769428138-masterpiece%2C%20best%20quality%2C%20hatsune%20miku%2C%201girl%2C%20white%20shirt%2C%20blue%20necktie%2C%20bare%20shoulders%2C%20very%20detailed%20background%2C%20hands%20on%20ow.png" width="32%"/>
<img style="display:inline;margin:0;padding:0;" src="https://huggingface.co/NoCrypt/SomethingV2_2/resolve/main/images/00020-3514023396-masterpiece%2C%20best%20quality%2C%20hatsune%20miku%2C%201girl%2C%20white%20shirt%2C%20blue%20necktie%2C%20bare%20shoulders%2C%20very%20detailed%20background%2C%20cafe%2C%20angry.png" width="32%"/>
<details><summary><big><b>Prompts</b></big></summary>
```yaml
masterpiece, best quality, ultra-detailed, 2girls, upper body, looking at viewer, ?, negative space, (bioluminescence:1.2), darkness, wind, butterfly, black background, glowing,
AND masterpiece, best quality, ultra-detailed, 2girls, hatsune miku, upper body, looking at viewer, ?, negative space, (bioluminescence:1.2), darkness, wind, butterfly, black background, glowing, (blue theme:1.2)
AND masterpiece, best quality, ultra-detailed, 2girls, hakurei reimu, (brown hair:1.1), upper body, looking at viewer, ?, negative space, (bioluminescence:1.2), darkness, wind, butterfly, black background, glowing, (red theme:1.2)
Negative prompt: EasyNegative
Steps: 20, Sampler: DPM++ 2M Karras, CFG scale: 7, Seed: 3452449180, Size: 816x504, Model: somethingv2_1, Denoising strength: 0.58, Clip skip: 2, ENSD: 31337, Latent Couple: "divisions=1:1,1:2,1:2 positions=0:0,0:0,0:1 weights=0.2,0.8,0.8 end at step=13", Hires upscale: 1.9, Hires steps: 12, Hires upscaler: Latent (nearest-exact)
```
```yaml
masterpiece, best quality, hatsune miku, white shirt, darkness, dark background
Negative prompt: EasyNegative
Steps: 20, Sampler: DPM++ 2M Karras, CFG scale: 7, Seed: 72332473, Size: 504x600, Model: somethingv2_1, Denoising strength: 0.58, Clip skip: 2, ENSD: 31337, Hires upscale: 1.85, Hires steps: 12, Hires upscaler: Latent (nearest-exact)
```
```yaml
masterpiece, best quality, hatsune miku, 1girl, white shirt, blue necktie, bare shoulders, very detailed background, hands on own cheeks, open mouth, one eye closed, clenched teeth, smile
Negative prompt: EasyNegative, tattoo, (shoulder tattoo:1.0), (number tattoo:1.3), frills
Steps: 20, Sampler: DPM++ 2M Karras, CFG scale: 7, Seed: 1829045217, Size: 456x592, Model: SomethingV2_2, Denoising strength: 0.53, Clip skip: 2, ENSD: 31337, Hires upscale: 1.65, Hires steps: 12, Hires upscaler: Latent (nearest-exact), Discard penultimate sigma: True
```
```yaml
masterpiece, best quality, hatsune miku, 1girl, white shirt, blue necktie, bare shoulders, very detailed background, hands on own cheeks, open mouth, eyez closed, clenched teeth, smile, arms behind back,
Negative prompt: EasyNegative, tattoo, (shoulder tattoo:1.0), (number tattoo:1.3), frills
Steps: 20, Sampler: DPM++ 2M Karras, CFG scale: 7, Seed: 1769428138, Size: 456x592, Model: SomethingV2_2, Denoising strength: 0.53, Clip skip: 2, ENSD: 31337, Hires upscale: 1.65, Hires steps: 12, Hires upscaler: Latent (nearest-exact), Discard penultimate sigma: True
```
```yaml
masterpiece, best quality, hatsune miku, 1girl, white shirt, blue necktie, bare shoulders, very detailed background, cafe, angry, crossed arms, detached sleeves, light particles,
Negative prompt: EasyNegative, tattoo, (shoulder tattoo:1.0), (number tattoo:1.3), frills
Steps: 20, Sampler: DPM++ 2M Karras, CFG scale: 7, Seed: 3514023396, Size: 456x592, Model: SomethingV2_2, Denoising strength: 0.53, Clip skip: 2, ENSD: 31337, Hires upscale: 1.65, Hires steps: 12, Hires upscaler: Latent (nearest-exact), Discard penultimate sigma: True
```
</details>
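The parameter blocks above follow the AUTOMATIC1111-style "infotext" layout: prompt lines, then a `Negative prompt:` line, then a comma-separated settings line starting with `Steps:`. A minimal, hypothetical parser sketch for copying these settings programmatically (the function name and the naive comma splitting are assumptions, not part of this card; quoted values such as the Latent Couple string need real parsing):

```python
def parse_infotext(text: str) -> dict:
    """Split an A1111-style infotext block into prompt, negative prompt, and settings."""
    lines = [line.strip() for line in text.strip().splitlines() if line.strip()]
    prompt_lines, negative, settings = [], "", {}
    for line in lines:
        if line.startswith("Negative prompt:"):
            negative = line[len("Negative prompt:"):].strip()
        elif line.startswith("Steps:"):
            # Naive split: values that themselves contain ", " (e.g. quoted
            # extension settings) would need a proper tokenizer.
            for part in line.split(", "):
                if ": " in part:
                    key, value = part.split(": ", 1)
                    settings[key] = value
        else:
            prompt_lines.append(line)
    return {"prompt": " ".join(prompt_lines),
            "negative_prompt": negative,
            "settings": settings}
```

This is enough to recover the sampler, seed, and size from the simpler blocks above.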
## Non-miku examples
<img style="display:inline;margin:0;padding:0;" width="49%" src="https://huggingface.co/NoCrypt/SomethingV2_2/resolve/main/images/00021-4018636341-masterpiece%2C%20best%20quality%2C%201girl%2C%20aqua%20eyes%2C%20baseball%20cap%2C%20blonde%20hair%2C%20closed%20mouth%2C%20earrings%2C%20green%20background%2C%20hat%2C%20hoop%20earr.png"/>
<img style="display:inline;margin:0;padding:0;" width="49%" src="https://huggingface.co/NoCrypt/SomethingV2_2/resolve/main/images/00022-1334620477-masterpiece%2C%20best%20quality%2C%20landscape.png"/>
<details><summary><big><b>Prompts</b></big></summary>
```yaml
masterpiece, best quality, 1girl, aqua eyes, baseball cap, blonde hair, closed mouth, earrings, green background, hat, hoop earrings, jewelry, looking at viewer, shirt, short hair, simple background, solo, upper body, yellow shirt
Negative prompt: EasyNegative
Steps: 20, Sampler: DPM++ 2M Karras, CFG scale: 7, Seed: 4018636341, Size: 440x592, Model: SomethingV2_2, Denoising strength: 0.53, Clip skip: 2, ENSD: 31337, Hires upscale: 1.65, Hires steps: 13, Hires upscaler: Latent (nearest-exact)
```
```yaml
masterpiece, best quality, landscape
Negative prompt: EasyNegative
Steps: 20, Sampler: DPM++ 2M Karras, CFG scale: 7, Seed: 1334620477, Size: 440x592, Model: SomethingV2_2, Denoising strength: 0.53, Clip skip: 2, ENSD: 31337, Hires upscale: 1.65, Hires steps: 13, Hires upscaler: Latent (nearest-exact)
```
</details>
## Recommended settings
- VAE: None (Baked in model, [blessed2](https://huggingface.co/NoCrypt/blessed_vae/blob/main/blessed2.vae.pt))
- Clip Skip: 2
- Sampler: DPM++ 2M Karras
- CFG Scale: 7 ± 5
- Recommended Positive Prompt: masterpiece, best quality, negative space, (bioluminescence:1.2), darkness, dark background
- Recommended Negative Prompt: [EasyNegative](https://huggingface.co/datasets/gsdf/EasyNegative)
- For better results, using hires fix is a must.
- Hires upscaler: Latent (any variant, such as nearest-exact)
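The hires fix settings used in the examples upscale the initial latent before a second denoising pass, so the final image size is roughly the base size times the upscale factor, snapped down to a latent-friendly multiple of 8. A rough sketch of that arithmetic (the exact rounding the webui applies is an assumption here):

```python
def hires_size(width: int, height: int, scale: float, multiple: int = 8) -> tuple[int, int]:
    """Approximate final resolution after a latent hires-fix upscale."""
    def snap(value: float) -> int:
        # Scale, then round down to the nearest multiple of 8 for the latent grid.
        return int(value * scale) // multiple * multiple
    return snap(width), snap(height)
```

For example, the 456x592 generations above with a 1.65x hires upscale come out at roughly 752x976.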
## Recipe
*Due to [SD-Silicon's Terms of Use](https://huggingface.co/Xynon/SD-Silicon#terms-of-use), I must specify how the model was made.*
|Model A | Model B | Interpolation Method | Weight | Name |
|---|---|---|---|---|
|[dpepmkmp](https://huggingface.co/closertodeath/dpepmkmp/blob/main/dpepmkmp.safetensors)|[silicon29-dark](https://huggingface.co/Xynon/SD-Silicon/blob/main/Silicon29/Silicon29-dark.safetensors)|MBW|Reverse Cosine|[dpepsili](https://huggingface.co/un1xx/model_dump/blob/main/bw-merge-dpepmkmp-Silicon29-dark-0.ckpt)|
|[somethingV2_1](https://huggingface.co/NoCrypt/SomethingV2/blob/main/somethingv2_1.safetensors)|[dpepsili](https://huggingface.co/un1xx/model_dump/blob/main/bw-merge-dpepmkmp-Silicon29-dark-0.ckpt)|MBW|Cosine|SomethingV2_2 raw|
|SomethingV2_2 raw|[Blessed2 VAE](https://huggingface.co/NoCrypt/blessed_vae/blob/main/blessed2.vae.pt)|Bake VAE|-|**[SomethingV2_2](https://huggingface.co/NoCrypt/SomethingV2_2/blob/main/SomethingV2_2.safetensors)**|
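MBW (Merge Block Weighted) merging assigns a separate interpolation weight to each U-Net block instead of one global ratio, and the "Cosine" / "Reverse Cosine" presets sweep those weights smoothly across the blocks. A hypothetical sketch of how such per-block weights could be generated (the actual preset formulas in the MBW extension may differ):

```python
import math

def cosine_block_weights(n_blocks: int = 25, reverse: bool = False) -> list[float]:
    """Per-block merge weights sweeping 0 -> 1 along a half-cosine curve."""
    weights = []
    for i in range(n_blocks):
        t = i / (n_blocks - 1)                 # position across the U-Net blocks, 0..1
        w = (1 - math.cos(math.pi * t)) / 2    # smooth ease from 0 to 1
        weights.append(1 - w if reverse else w)
    return weights
```

"Reverse Cosine" simply mirrors the curve, weighting Model A more heavily where "Cosine" would weight Model B.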
## Why not call it SomethingV4?
Since this model is based on SomethingV2 and there aren't THAT many improvements under some conditions, calling it V4 just isn't right at the moment 😅
I am NoCrypt | 9,110 | [
[
-0.054779052734375,
-0.057952880859375,
0.0147552490234375,
0.0168304443359375,
-0.035369873046875,
-0.0019388198852539062,
0.003376007080078125,
-0.039886474609375,
0.056884765625,
0.040496826171875,
-0.0565185546875,
-0.043212890625,
-0.0458984375,
0.0091552734375,
0.008575439453125,
0.0709228515625,
0.019622802734375,
-0.00829315185546875,
0.00921630859375,
0.005336761474609375,
-0.035430908203125,
-0.0008997917175292969,
-0.03338623046875,
-0.0047760009765625,
0.01422119140625,
0.0304718017578125,
0.05718994140625,
0.051025390625,
0.026824951171875,
0.0245819091796875,
-0.0197906494140625,
0.00580596923828125,
-0.045989990234375,
-0.0084686279296875,
0.00849151611328125,
-0.046905517578125,
-0.07562255859375,
0.016082763671875,
0.0411376953125,
0.0198822021484375,
0.015716552734375,
0.0053558349609375,
-0.0012445449829101562,
0.0535888671875,
-0.0108489990234375,
0.004703521728515625,
0.006103515625,
0.005771636962890625,
-0.028045654296875,
0.004192352294921875,
-0.00539398193359375,
-0.0399169921875,
0.0023250579833984375,
-0.058746337890625,
0.0158233642578125,
-0.01580810546875,
0.096435546875,
0.003742218017578125,
-0.01143646240234375,
0.007190704345703125,
-0.0234527587890625,
0.05865478515625,
-0.055419921875,
0.00870513916015625,
0.023193359375,
0.00345611572265625,
-0.012847900390625,
-0.07086181640625,
-0.053619384765625,
-0.0007891654968261719,
-0.0194091796875,
0.05157470703125,
-0.021087646484375,
-0.0185089111328125,
0.02789306640625,
0.054931640625,
-0.04510498046875,
-0.01215362548828125,
-0.0201873779296875,
-0.0172576904296875,
0.05169677734375,
0.0114898681640625,
0.051971435546875,
-0.0255126953125,
-0.032806396484375,
-0.025543212890625,
-0.04083251953125,
0.011871337890625,
0.0141143798828125,
-0.01303863525390625,
-0.04638671875,
0.039703369140625,
0.00830841064453125,
0.046783447265625,
0.0174407958984375,
-0.034576416015625,
0.02716064453125,
-0.041351318359375,
0.0037670135498046875,
-0.043701171875,
0.08380126953125,
0.054412841796875,
-0.017181396484375,
0.006519317626953125,
0.005046844482421875,
-0.0035991668701171875,
-0.002658843994140625,
-0.09295654296875,
-0.023468017578125,
0.0275726318359375,
-0.044464111328125,
-0.0249176025390625,
-0.00391387939453125,
-0.055206298828125,
-0.004344940185546875,
-0.01080322265625,
0.033477783203125,
-0.05267333984375,
-0.0400390625,
0.0033931732177734375,
-0.021575927734375,
0.01245880126953125,
0.03143310546875,
-0.02618408203125,
-0.005992889404296875,
0.029205322265625,
0.0714111328125,
0.003582000732421875,
-0.0137481689453125,
-0.0010042190551757812,
-0.00986480712890625,
-0.04656982421875,
0.041778564453125,
-0.0043487548828125,
-0.03546142578125,
-0.051177978515625,
0.0288238525390625,
0.003879547119140625,
-0.0230865478515625,
0.0479736328125,
-0.01009368896484375,
0.02392578125,
-0.032318115234375,
-0.034576416015625,
-0.0210723876953125,
-0.002197265625,
-0.051849365234375,
0.051055908203125,
0.036834716796875,
-0.07183837890625,
0.014190673828125,
-0.040924072265625,
-0.005809783935546875,
0.00705718994140625,
0.0035400390625,
-0.04351806640625,
-0.006984710693359375,
-0.00066375732421875,
0.043792724609375,
-0.024017333984375,
-0.0147552490234375,
-0.03131103515625,
-0.025634765625,
0.0241546630859375,
-0.00772857666015625,
0.08740234375,
0.0142822265625,
-0.04345703125,
-0.0201263427734375,
-0.05059814453125,
-0.0091705322265625,
0.055389404296875,
0.0032901763916015625,
-0.0155181884765625,
-0.0382080078125,
0.0111541748046875,
0.02587890625,
0.03472900390625,
-0.04327392578125,
0.0213470458984375,
-0.0513916015625,
0.036529541015625,
0.04522705078125,
0.01439666748046875,
0.0292510986328125,
-0.061187744140625,
0.043670654296875,
0.0031833648681640625,
0.02447509765625,
-0.0213470458984375,
-0.0601806640625,
-0.06109619140625,
-0.047027587890625,
0.0170745849609375,
0.021636962890625,
-0.04083251953125,
0.04925537109375,
-0.0006136894226074219,
-0.07489013671875,
-0.054443359375,
-0.006649017333984375,
0.0513916015625,
0.037811279296875,
0.01435089111328125,
-0.055267333984375,
-0.01212310791015625,
-0.06439208984375,
0.027130126953125,
-0.007556915283203125,
0.00554656982421875,
0.020843505859375,
0.0301513671875,
-0.00656890869140625,
0.05633544921875,
-0.0474853515625,
-0.041595458984375,
-0.011322021484375,
0.0009417533874511719,
0.03814697265625,
0.045562744140625,
0.080810546875,
-0.058349609375,
-0.050628662109375,
-0.0111541748046875,
-0.050140380859375,
-0.0052947998046875,
0.00408172607421875,
-0.037872314453125,
0.005474090576171875,
-0.0032062530517578125,
-0.04241943359375,
0.029754638671875,
0.031890869140625,
-0.041473388671875,
0.051727294921875,
-0.0225067138671875,
0.03875732421875,
-0.10504150390625,
0.0135040283203125,
0.01409912109375,
-0.00908660888671875,
-0.051849365234375,
0.04107666015625,
-0.0035190582275390625,
0.0226593017578125,
-0.0222015380859375,
0.04425048828125,
-0.029388427734375,
0.019378662109375,
-0.02545166015625,
0.00632476806640625,
0.0245819091796875,
0.039031982421875,
0.0146331787109375,
0.04486083984375,
0.058837890625,
-0.0212860107421875,
0.0302734375,
0.039794921875,
-0.016204833984375,
0.076171875,
-0.057525634765625,
0.019775390625,
-0.016876220703125,
0.01500701904296875,
-0.0894775390625,
-0.018341064453125,
0.04241943359375,
-0.064208984375,
0.0268096923828125,
-0.0296630859375,
-0.04547119140625,
-0.039093017578125,
-0.0423583984375,
0.00008553266525268555,
0.062347412109375,
-0.026763916015625,
0.04058837890625,
0.01445770263671875,
-0.00959014892578125,
-0.0308074951171875,
-0.06231689453125,
-0.00009429454803466797,
-0.019073486328125,
-0.053070068359375,
0.03955078125,
-0.0252227783203125,
0.007534027099609375,
0.003055572509765625,
0.019775390625,
-0.007293701171875,
0.00601959228515625,
0.033660888671875,
0.0458984375,
-0.0283660888671875,
-0.03326416015625,
0.0161895751953125,
-0.0164642333984375,
-0.005039215087890625,
-0.002109527587890625,
0.03839111328125,
-0.01364898681640625,
-0.01458740234375,
-0.06927490234375,
0.01317596435546875,
0.055816650390625,
-0.006160736083984375,
0.0401611328125,
0.07470703125,
-0.0234832763671875,
0.0278778076171875,
-0.042938232421875,
-0.01067352294921875,
-0.031494140625,
0.00597381591796875,
-0.0287933349609375,
-0.049774169921875,
0.053985595703125,
0.02850341796875,
0.016998291015625,
0.055145263671875,
0.03631591796875,
-0.016448974609375,
0.07952880859375,
0.0289306640625,
-0.0174560546875,
0.03179931640625,
-0.077392578125,
-0.020538330078125,
-0.06512451171875,
-0.028350830078125,
-0.00437164306640625,
-0.036529541015625,
-0.035247802734375,
-0.0487060546875,
0.03515625,
0.03765869140625,
-0.0051422119140625,
0.052825927734375,
-0.04107666015625,
0.0204925537109375,
0.037109375,
0.034393310546875,
0.0214080810546875,
0.0017070770263671875,
0.00487518310546875,
-0.032318115234375,
-0.027130126953125,
-0.01558685302734375,
0.073974609375,
0.0213623046875,
0.01470947265625,
0.0247650146484375,
0.048126220703125,
0.0027256011962890625,
0.0158233642578125,
-0.037567138671875,
0.04217529296875,
-0.02001953125,
-0.058929443359375,
0.0014772415161132812,
-0.025115966796875,
-0.06561279296875,
0.016448974609375,
-0.040191650390625,
-0.059356689453125,
0.0194091796875,
0.006866455078125,
-0.030426025390625,
0.02130126953125,
-0.039794921875,
0.047637939453125,
0.0114898681640625,
-0.0401611328125,
-0.007808685302734375,
-0.06085205078125,
0.0308380126953125,
0.00258636474609375,
0.005542755126953125,
-0.00804901123046875,
-0.00521087646484375,
0.055877685546875,
-0.046173095703125,
0.052276611328125,
-0.0117340087890625,
-0.0153656005859375,
0.032745361328125,
0.015411376953125,
0.0304412841796875,
0.0174560546875,
0.005115509033203125,
0.0199432373046875,
0.004695892333984375,
-0.03094482421875,
-0.032470703125,
0.055206298828125,
-0.049774169921875,
-0.017578125,
-0.01251220703125,
-0.015472412109375,
0.017578125,
0.0153350830078125,
0.0684814453125,
0.05242919921875,
0.001934051513671875,
0.0010700225830078125,
0.04486083984375,
-0.01050567626953125,
0.02679443359375,
0.02069091796875,
-0.0063934326171875,
-0.057586669921875,
0.0814208984375,
0.0124969482421875,
0.0211639404296875,
0.006534576416015625,
0.02239990234375,
-0.0259246826171875,
-0.0174560546875,
-0.05767822265625,
0.05072021484375,
-0.032562255859375,
-0.01415252685546875,
-0.051025390625,
-0.016815185546875,
-0.032379150390625,
-0.0098876953125,
-0.0462646484375,
-0.0285797119140625,
-0.051605224609375,
0.0252227783203125,
0.056884765625,
0.044921875,
-0.0259246826171875,
0.005840301513671875,
-0.043548583984375,
0.033050537109375,
0.0146636962890625,
0.024627685546875,
0.0125579833984375,
-0.05902099609375,
0.006649017333984375,
0.0023899078369140625,
-0.0263824462890625,
-0.07806396484375,
0.03741455078125,
-0.00298309326171875,
0.0210113525390625,
0.040557861328125,
-0.00588226318359375,
0.06207275390625,
-0.006256103515625,
0.06298828125,
0.034637451171875,
-0.03875732421875,
0.041046142578125,
-0.049102783203125,
-0.003498077392578125,
0.0361328125,
0.02496337890625,
-0.0270233154296875,
-0.03167724609375,
-0.06597900390625,
-0.058258056640625,
0.0626220703125,
0.03558349609375,
0.00963592529296875,
0.01181793212890625,
0.047637939453125,
0.005405426025390625,
0.01282501220703125,
-0.05596923828125,
-0.0821533203125,
-0.024688720703125,
0.00264739990234375,
-0.00843048095703125,
-0.00949859619140625,
0.0008587837219238281,
-0.041107177734375,
0.05767822265625,
-0.0009512901306152344,
0.040985107421875,
0.0190582275390625,
0.0155181884765625,
-0.00982666015625,
-0.01287078857421875,
0.028900146484375,
0.01934814453125,
-0.02093505859375,
-0.0308380126953125,
-0.0026607513427734375,
-0.047760009765625,
0.01261138916015625,
0.00972747802734375,
-0.032867431640625,
-0.0014657974243164062,
0.01806640625,
0.05364990234375,
-0.01605224609375,
-0.020782470703125,
0.06744384765625,
-0.00720977783203125,
-0.0221710205078125,
-0.0143280029296875,
0.021240234375,
0.0116424560546875,
0.034637451171875,
0.0108795166015625,
0.005886077880859375,
0.02655029296875,
-0.030029296875,
0.0118865966796875,
0.0272369384765625,
-0.031402587890625,
-0.032958984375,
0.05999755859375,
-0.01555633544921875,
0.000011563301086425781,
0.00841522216796875,
-0.052825927734375,
-0.035736083984375,
0.058074951171875,
0.040496826171875,
0.043060302734375,
-0.0166168212890625,
0.030670166015625,
0.0677490234375,
0.00722503662109375,
-0.007701873779296875,
0.0316162109375,
0.01232147216796875,
-0.0183868408203125,
-0.02801513671875,
-0.03875732421875,
-0.01265716552734375,
0.0343017578125,
-0.04534912109375,
0.028594970703125,
-0.0465087890625,
-0.0240020751953125,
-0.01253509521484375,
0.019012451171875,
-0.025634765625,
0.0273590087890625,
-0.0036029815673828125,
0.04705810546875,
-0.069091796875,
0.037109375,
0.042144775390625,
-0.04595947265625,
-0.0662841796875,
-0.00452423095703125,
0.0216522216796875,
-0.04205322265625,
0.01238250732421875,
0.005245208740234375,
0.01092529296875,
0.011810302734375,
-0.047271728515625,
-0.054046630859375,
0.09222412109375,
0.02569580078125,
-0.0220947265625,
0.01605224609375,
-0.029083251953125,
0.046783447265625,
-0.0164947509765625,
0.044189453125,
0.046417236328125,
0.0210113525390625,
0.0281219482421875,
-0.042633056640625,
0.016693115234375,
-0.05706787109375,
0.00868988037109375,
0.003574371337890625,
-0.08392333984375,
0.06903076171875,
-0.022308349609375,
-0.0207977294921875,
0.047637939453125,
0.058929443359375,
0.039764404296875,
0.014251708984375,
0.032318115234375,
0.05364990234375,
0.0229949951171875,
-0.0225830078125,
0.068603515625,
-0.01861572265625,
0.02154541015625,
0.050445556640625,
0.032989501953125,
0.041717529296875,
-0.0064239501953125,
-0.0311431884765625,
0.04345703125,
0.05718994140625,
-0.01953125,
0.039154052734375,
0.0023040771484375,
-0.00787353515625,
-0.0087127685546875,
-0.0228424072265625,
-0.052947998046875,
0.01088714599609375,
0.015533447265625,
-0.0132598876953125,
0.0088653564453125,
0.0091094970703125,
0.01038360595703125,
0.0028553009033203125,
-0.00827789306640625,
0.049163818359375,
0.00592041015625,
-0.030731201171875,
0.044677734375,
0.007656097412109375,
0.07403564453125,
-0.02435302734375,
-0.00714874267578125,
-0.0316162109375,
-0.015655517578125,
-0.0122222900390625,
-0.071044921875,
0.0022735595703125,
-0.009552001953125,
-0.001964569091796875,
-0.0085601806640625,
0.042938232421875,
-0.0159759521484375,
-0.034088134765625,
0.0207977294921875,
0.01128387451171875,
0.037322998046875,
0.0034198760986328125,
-0.08319091796875,
-0.0007185935974121094,
0.0062408447265625,
-0.019378662109375,
0.004638671875,
0.026214599609375,
0.036529541015625,
0.0238037109375,
0.032379150390625,
0.0221710205078125,
-0.01374053955078125,
0.00537872314453125,
0.05999755859375,
-0.036224365234375,
-0.031768798828125,
-0.05712890625,
0.055389404296875,
-0.00862884521484375,
-0.03631591796875,
0.05303955078125,
0.031585693359375,
0.048004150390625,
-0.00876617431640625,
0.0457763671875,
-0.0267486572265625,
0.03546142578125,
-0.0361328125,
0.07208251953125,
-0.07904052734375,
-0.006805419921875,
-0.053375244140625,
-0.0650634765625,
-0.006229400634765625,
0.0643310546875,
0.018829345703125,
0.015167236328125,
0.0123138427734375,
0.06707763671875,
-0.004299163818359375,
-0.01009368896484375,
0.004398345947265625,
0.021881103515625,
-0.006801605224609375,
0.06427001953125,
0.066162109375,
-0.066162109375,
0.01364898681640625,
-0.06561279296875,
-0.0257415771484375,
-0.0269622802734375,
-0.0513916015625,
-0.07733154296875,
-0.0675048828125,
-0.05303955078125,
-0.05364990234375,
-0.03094482421875,
0.0570068359375,
0.05645751953125,
-0.04241943359375,
0.002925872802734375,
-0.00188446044921875,
-0.0090789794921875,
-0.014862060546875,
-0.0181121826171875,
0.019744873046875,
0.03662109375,
-0.07562255859375,
0.01143646240234375,
0.0297088623046875,
0.048065185546875,
-0.00791168212890625,
-0.0195770263671875,
0.0168304443359375,
-0.016326904296875,
0.0263214111328125,
0.00981903076171875,
-0.0482177734375,
0.01354217529296875,
0.007904052734375,
-0.01187896728515625,
0.015472412109375,
0.0113067626953125,
-0.02496337890625,
0.0308990478515625,
0.042449951171875,
0.0034732818603515625,
0.0452880859375,
-0.006305694580078125,
0.0031566619873046875,
-0.036590576171875,
-0.0027294158935546875,
-0.004817962646484375,
0.0321044921875,
0.018341064453125,
-0.029327392578125,
0.040130615234375,
0.054443359375,
-0.022369384765625,
-0.04949951171875,
0.01244354248046875,
-0.102294921875,
-0.02239990234375,
0.073974609375,
-0.01276397705078125,
-0.032470703125,
0.0145721435546875,
-0.036376953125,
0.029266357421875,
-0.01702880859375,
0.032073974609375,
0.03265380859375,
0.0012578964233398438,
-0.0143280029296875,
-0.035675048828125,
0.0217742919921875,
0.031097412109375,
-0.04901123046875,
-0.0170745849609375,
0.039093017578125,
0.032440185546875,
0.042449951171875,
0.07330322265625,
-0.0426025390625,
0.03778076171875,
-0.00246429443359375,
0.01053619384765625,
-0.00334930419921875,
-0.007427215576171875,
-0.02545166015625,
-0.005756378173828125,
0.0015077590942382812,
-0.012054443359375
]
] |
Linaqruf/animagine-xl | 2023-08-21T07:06:06.000Z | [
"diffusers",
"stable-diffusion",
"stable-diffusion-diffusers",
"stable-diffusion-xl",
"text-to-image",
"en",
"dataset:Linaqruf/animagine-datasets",
"license:openrail++",
"endpoints_compatible",
"has_space",
"diffusers:StableDiffusionXLPipeline",
"region:us"
] | text-to-image | Linaqruf | null | null | Linaqruf/animagine-xl | 239 | 31,571 | diffusers | 2023-08-04T08:55:23 | ---
license: openrail++
language:
- en
pipeline_tag: text-to-image
tags:
- stable-diffusion
- stable-diffusion-diffusers
- stable-diffusion-xl
inference:
parameter:
negative_prompt: lowres, bad anatomy, bad hands, text, error, missing fingers, extra digit, fewer digits, cropped, worst quality, low quality, normal quality, jpeg artifacts, signature, watermark, username, blurry
widget:
- text: >-
face focus, cute, masterpiece, best quality, 1girl, green hair, sweater, looking at viewer, upper body, beanie, outdoors, night, turtleneck
example_title: example 1girl
- text: >-
face focus, bishounen, masterpiece, best quality, 1boy, green hair, sweater, looking at viewer, upper body, beanie, outdoors, night, turtleneck
example_title: example 1boy
library_name: diffusers
datasets:
- Linaqruf/animagine-datasets
---
<style>
.title-container {
display: flex;
justify-content: center;
align-items: center;
height: 100vh; /* Adjust this value to position the title vertically */
}
.title {
font-size: 3em;
text-align: center;
color: #333;
font-family: 'Helvetica Neue', sans-serif;
text-transform: uppercase;
letter-spacing: 0.1em;
padding: 0.5em 0;
background: transparent;
}
.title span {
background: -webkit-linear-gradient(45deg, #7ed56f, #28b485);
-webkit-background-clip: text;
-webkit-text-fill-color: transparent;
}
.custom-table {
table-layout: fixed;
width: 100%;
border-collapse: collapse;
margin-top: 2em;
}
.custom-table td {
width: 50%;
vertical-align: top;
padding: 10px;
box-shadow: 0px 0px 10px 0px rgba(0,0,0,0.15);
}
.custom-image {
width: 100%;
height: auto;
object-fit: cover;
border-radius: 10px;
transition: transform .2s;
margin-bottom: 1em;
}
.custom-image:hover {
transform: scale(1.05);
}
</style>
<h1 class="title"><span>Animagine XL</span></h1>
<table class="custom-table">
<tr>
<td>
<a href="https://huggingface.co/Linaqruf/animagine-xl/blob/main/sample_images/image1.png">
<img class="custom-image" src="https://huggingface.co/Linaqruf/animagine-xl/resolve/main/sample_images/image1.png" alt="sample1">
</a>
<a href="https://huggingface.co/Linaqruf/animagine-xl/blob/main/sample_images/image4.png">
<img class="custom-image" src="https://huggingface.co/Linaqruf/animagine-xl/resolve/main/sample_images/image4.png" alt="sample3">
</a>
</td>
<td>
<a href="https://huggingface.co/Linaqruf/animagine-xl/blob/main/sample_images/image2.png">
<img class="custom-image" src="https://huggingface.co/Linaqruf/animagine-xl/resolve/main/sample_images/image2.png" alt="sample2">
</a>
<a href="https://huggingface.co/Linaqruf/animagine-xl/blob/main/sample_images/image3.png">
<img class="custom-image" src="https://huggingface.co/Linaqruf/animagine-xl/resolve/main/sample_images/image3.png" alt="sample4">
</a>
</td>
</tr>
</table>
<hr>
## Overview
**Animagine XL** is a high-resolution, latent text-to-image diffusion model. The model has been fine-tuned using a learning rate of `4e-7` over 27000 global steps with a batch size of 16 on a curated dataset of superior-quality anime-style images. This model is derived from Stable Diffusion XL 1.0.
- Use it with the [`Stable Diffusion Webui`](https://github.com/AUTOMATIC1111/stable-diffusion-webui)
- Use it with 🧨 [`diffusers`](https://huggingface.co/docs/diffusers/index)
- Use it with the [`ComfyUI`](https://github.com/comfyanonymous/ComfyUI) **(recommended)**
Like other anime-style Stable Diffusion models, it also supports Danbooru tags to generate images.
e.g. _**face focus, cute, masterpiece, best quality, 1girl, green hair, sweater, looking at viewer, upper body, beanie, outdoors, night, turtleneck**_
<hr>
## Features
1. High-Resolution Images: The model was trained at 1024x1024 resolution, using the [NovelAI Aspect Ratio Bucketing Tool](https://github.com/NovelAI/novelai-aspect-ratio-bucketing) so that it could also be trained at non-square resolutions.
2. Anime-styled Generation: Based on given text prompts, the model can create high quality anime-styled images.
3. Fine-Tuned Diffusion Process: The model utilizes a fine-tuned diffusion process to ensure high quality and unique image output.
<hr>
## Model Details
- **Developed by:** [Linaqruf](https://github.com/Linaqruf)
- **Model type:** Diffusion-based text-to-image generative model
- **Model Description:** This is a model that can be used to generate and modify high quality anime-themed images based on text prompts.
- **License:** [CreativeML Open RAIL++-M License](https://huggingface.co/stabilityai/stable-diffusion-2/blob/main/LICENSE-MODEL)
- **Finetuned from model:** [Stable Diffusion XL 1.0](https://huggingface.co/stabilityai/stable-diffusion-xl-base-1.0)
<hr>
## How to Use:
- Download `Animagine XL` [here](https://huggingface.co/Linaqruf/animagine-xl/resolve/main/animagine-xl.safetensors), the model is in `.safetensors` format.
- You need to use Danbooru-style tags as the prompt instead of natural language; otherwise you will get realistic results instead of anime-style ones.
- You can use any generic negative prompt, or use the following suggested negative prompt to guide the model towards high-aesthetic generations:
```
lowres, bad anatomy, bad hands, text, error, missing fingers, extra digit, fewer digits, cropped, worst quality, low quality, normal quality, jpeg artifacts, signature, watermark, username, blurry
```
- The following should also be prepended to prompts to get high-aesthetic results:
```
masterpiece, best quality
```
- Use this cheat sheet to find the best resolution:
```
768 x 1344: Vertical (9:16)
915 x 1144: Portrait (4:5)
1024 x 1024: Square (1:1)
1182 x 886: Photo (4:3)
1254 x 836: Landscape (3:2)
1365 x 768: Widescreen (16:9)
1564 x 670: Cinematic (21:9)
```
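The cheat sheet maps common aspect ratios onto bucketed resolutions. A small helper (hypothetical, simply hard-coding the table above) can pick the closest entry for a desired aspect ratio:

```python
# Resolutions from the cheat sheet above, as (width, height).
ANIMAGINE_XL_BUCKETS = [
    (768, 1344),   # vertical 9:16
    (915, 1144),   # portrait 4:5
    (1024, 1024),  # square 1:1
    (1182, 886),   # photo 4:3
    (1254, 836),   # landscape 3:2
    (1365, 768),   # widescreen 16:9
    (1564, 670),   # cinematic 21:9
]

def closest_bucket(aspect_ratio: float) -> tuple[int, int]:
    """Return the cheat-sheet resolution whose width/height ratio is nearest."""
    return min(ANIMAGINE_XL_BUCKETS,
               key=lambda wh: abs(wh[0] / wh[1] - aspect_ratio))
```

For instance, a requested 16:9 frame resolves to the 1365x768 widescreen bucket.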
<hr>
## Gradio & Colab
We also support a [Gradio](https://github.com/gradio-app/gradio) Web UI and Colab with Diffusers to run **Animagine XL**:
[](https://huggingface.co/spaces/Linaqruf/Animagine-XL)
[](https://colab.research.google.com/#fileId=https%3A//huggingface.co/Linaqruf/animagine-xl/blob/main/Animagine_XL_demo.ipynb)
## 🧨 Diffusers
Make sure to upgrade diffusers to >= 0.18.2:
```
pip install diffusers --upgrade
```
In addition make sure to install `transformers`, `safetensors`, `accelerate` as well as the invisible watermark:
```
pip install invisible_watermark transformers accelerate safetensors
```
Running the pipeline (if you don't swap the scheduler, it will run with the default **EulerDiscreteScheduler**; in this example we swap it to **EulerAncestralDiscreteScheduler**):
```py
import torch
from diffusers import StableDiffusionXLPipeline, EulerAncestralDiscreteScheduler
model = "Linaqruf/animagine-xl"
pipe = StableDiffusionXLPipeline.from_pretrained(
model,
torch_dtype=torch.float16,
use_safetensors=True,
variant="fp16"
)
pipe.scheduler = EulerAncestralDiscreteScheduler.from_config(pipe.scheduler.config)
pipe.to('cuda')
prompt = "face focus, cute, masterpiece, best quality, 1girl, green hair, sweater, looking at viewer, upper body, beanie, outdoors, night, turtleneck"
negative_prompt = "lowres, bad anatomy, bad hands, text, error, missing fingers, extra digit, fewer digits, cropped, worst quality, low quality, normal quality, jpeg artifacts, signature, watermark, username, blurry"
image = pipe(
prompt,
negative_prompt=negative_prompt,
width=1024,
height=1024,
guidance_scale=12,
target_size=(1024,1024),
original_size=(4096,4096),
num_inference_steps=50
).images[0]
image.save("anime_girl.png")
```
<hr>
## Limitation
This model inherits the Stable Diffusion XL 1.0 [limitations](https://huggingface.co/stabilityai/stable-diffusion-xl-base-1.0#limitations)
| 8,277 | [
[
-0.047607421875,
-0.0655517578125,
0.022430419921875,
0.02520751953125,
-0.0194854736328125,
-0.00804901123046875,
0.016571044921875,
-0.03521728515625,
0.042083740234375,
0.021514892578125,
-0.051300048828125,
-0.041778564453125,
-0.05096435546875,
-0.00047898292541503906,
-0.01043701171875,
0.061309814453125,
0.0035152435302734375,
-0.027801513671875,
-0.0068817138671875,
-0.003963470458984375,
-0.0311431884765625,
0.00299835205078125,
-0.0716552734375,
-0.0303497314453125,
0.020660400390625,
0.01983642578125,
0.078125,
0.0694580078125,
0.019622802734375,
0.0247650146484375,
-0.016815185546875,
-0.00945281982421875,
-0.0289306640625,
-0.006793975830078125,
0.0087738037109375,
-0.0244598388671875,
-0.045928955078125,
0.002582550048828125,
0.04547119140625,
0.00762939453125,
-0.01502227783203125,
-0.0017948150634765625,
-0.005908966064453125,
0.05914306640625,
-0.019561767578125,
-0.00380706787109375,
-0.010040283203125,
0.0259552001953125,
-0.01328277587890625,
-0.002971649169921875,
-0.01134490966796875,
-0.038055419921875,
0.00568389892578125,
-0.068359375,
0.007556915283203125,
-0.002368927001953125,
0.117919921875,
0.0208892822265625,
-0.02166748046875,
-0.0023956298828125,
-0.0300140380859375,
0.053497314453125,
-0.051605224609375,
0.0246124267578125,
0.00820159912109375,
0.01210784912109375,
0.005489349365234375,
-0.0712890625,
-0.039031982421875,
0.0017375946044921875,
-0.023345947265625,
0.0301361083984375,
-0.043060302734375,
-0.0164947509765625,
0.018157958984375,
0.033782958984375,
-0.04949951171875,
-0.01297760009765625,
-0.0131988525390625,
0.0024662017822265625,
0.044677734375,
0.0137481689453125,
0.06207275390625,
-0.0076141357421875,
-0.041717529296875,
-0.01207733154296875,
-0.0252532958984375,
0.005321502685546875,
0.0242919921875,
-0.0275115966796875,
-0.06072998046875,
0.0151519775390625,
0.008331298828125,
0.03887939453125,
0.032928466796875,
-0.0134735107421875,
0.031097412109375,
-0.006374359130859375,
-0.01898193359375,
-0.0294036865234375,
0.08343505859375,
0.0460205078125,
0.0007834434509277344,
0.004047393798828125,
0.0012502670288085938,
0.005268096923828125,
0.0029888153076171875,
-0.0919189453125,
-0.0179443359375,
0.01953125,
-0.0333251953125,
-0.0213470458984375,
-0.029571533203125,
-0.07794189453125,
-0.01209259033203125,
-0.003448486328125,
0.032867431640625,
-0.041229248046875,
-0.029296875,
0.005218505859375,
-0.04150390625,
0.00757598876953125,
0.039154052734375,
-0.049407958984375,
0.0157623291015625,
-0.0006442070007324219,
0.07635498046875,
-0.0167236328125,
0.002826690673828125,
-0.013397216796875,
0.00785064697265625,
-0.015655517578125,
0.056121826171875,
-0.03192138671875,
-0.031585693359375,
-0.015899658203125,
0.01409149169921875,
-0.01529693603515625,
-0.0279998779296875,
0.040771484375,
-0.0139007568359375,
0.0303802490234375,
-0.01511383056640625,
-0.034637451171875,
-0.0157012939453125,
0.0038967132568359375,
-0.032562255859375,
0.0670166015625,
0.02056884765625,
-0.08612060546875,
0.0099029541015625,
-0.052581787109375,
-0.016693115234375,
0.006561279296875,
-0.00232696533203125,
-0.053253173828125,
-0.01549530029296875,
0.014373779296875,
0.0374755859375,
-0.006664276123046875,
-0.002803802490234375,
-0.0184326171875,
-0.0047454833984375,
0.01522064208984375,
-0.019287109375,
0.08197021484375,
0.0273895263671875,
-0.040130615234375,
0.005828857421875,
-0.051055908203125,
-0.01226043701171875,
0.03125,
-0.01451873779296875,
-0.01003265380859375,
-0.0216217041015625,
0.021148681640625,
0.022857666015625,
0.028076171875,
-0.041412353515625,
0.01520538330078125,
-0.017181396484375,
0.034210205078125,
0.045501708984375,
0.012603759765625,
0.052886962890625,
-0.03948974609375,
0.045440673828125,
0.01078033447265625,
0.0293731689453125,
-0.007717132568359375,
-0.04473876953125,
-0.058074951171875,
-0.040985107421875,
0.00884246826171875,
0.040679931640625,
-0.050994873046875,
0.035491943359375,
-0.00284576416015625,
-0.050323486328125,
-0.0310211181640625,
0.005359649658203125,
0.0198974609375,
0.0537109375,
0.0275421142578125,
-0.057464599609375,
-0.017303466796875,
-0.058929443359375,
0.018524169921875,
0.00585174560546875,
-0.003055572509765625,
0.037109375,
0.046630859375,
-0.03363037109375,
0.05499267578125,
-0.047576904296875,
-0.03900146484375,
-0.00791168212890625,
0.00695037841796875,
0.0281982421875,
0.057861328125,
0.072021484375,
-0.06109619140625,
-0.06439208984375,
0.00494384765625,
-0.06634521484375,
-0.0102386474609375,
-0.00443267822265625,
-0.03985595703125,
0.0239410400390625,
0.00623321533203125,
-0.0501708984375,
0.03741455078125,
0.043731689453125,
-0.060089111328125,
0.04327392578125,
-0.034088134765625,
0.0196685791015625,
-0.08154296875,
0.0170440673828125,
0.03009033203125,
-0.02471923828125,
-0.04986572265625,
0.0244293212890625,
-0.0023956298828125,
-0.0021190643310546875,
-0.03839111328125,
0.06060791015625,
-0.034027099609375,
0.0272979736328125,
-0.014434814453125,
-0.01178741455078125,
0.0222320556640625,
0.030670166015625,
0.02166748046875,
0.035675048828125,
0.069580078125,
-0.04302978515625,
0.02777099609375,
0.031951904296875,
-0.0093841552734375,
0.06842041015625,
-0.05975341796875,
0.0090484619140625,
-0.0197601318359375,
0.0097808837890625,
-0.079345703125,
-0.02191162109375,
0.03619384765625,
-0.033447265625,
0.043609619140625,
-0.01641845703125,
-0.03424072265625,
-0.026885986328125,
-0.0266571044921875,
0.0139617919921875,
0.06805419921875,
-0.035675048828125,
0.046417236328125,
0.0223541259765625,
-0.00008535385131835938,
-0.03411865234375,
-0.054412841796875,
0.00013566017150878906,
-0.03759765625,
-0.07489013671875,
0.04974365234375,
-0.0218505859375,
0.0039825439453125,
0.0158843994140625,
0.0086669921875,
-0.01209259033203125,
-0.01372528076171875,
0.017974853515625,
0.0273590087890625,
-0.0037250518798828125,
-0.025482177734375,
0.006175994873046875,
-0.0055084228515625,
-0.01270294189453125,
0.0037784576416015625,
0.05523681640625,
-0.011383056640625,
0.00478363037109375,
-0.0819091796875,
0.01132965087890625,
0.046417236328125,
-0.00003331899642944336,
0.057586669921875,
0.07598876953125,
-0.02886962890625,
0.026153564453125,
-0.04840087890625,
-0.0053863525390625,
-0.0384521484375,
-0.0008196830749511719,
-0.0076141357421875,
-0.03314208984375,
0.0618896484375,
0.01381683349609375,
0.01427459716796875,
0.03778076171875,
0.0374755859375,
-0.0156402587890625,
0.08489990234375,
0.040985107421875,
0.003818511962890625,
0.057342529296875,
-0.061187744140625,
-0.024017333984375,
-0.057220458984375,
-0.03131103515625,
-0.0209197998046875,
-0.0304107666015625,
-0.026214599609375,
-0.03704833984375,
0.04229736328125,
0.0057220458984375,
-0.032867431640625,
0.038482666015625,
-0.044464111328125,
0.0165252685546875,
0.02117919921875,
0.021240234375,
0.0190277099609375,
0.0117645263671875,
-0.00963592529296875,
-0.01111602783203125,
-0.038055419921875,
-0.030059814453125,
0.0280303955078125,
0.041748046875,
0.063232421875,
0.011260986328125,
0.0540771484375,
0.0016717910766601562,
0.0266571044921875,
-0.0350341796875,
0.0294952392578125,
-0.006793975830078125,
-0.055023193359375,
-0.0143585205078125,
-0.031982421875,
-0.0810546875,
0.0204010009765625,
-0.034423828125,
-0.053314208984375,
0.0303497314453125,
0.0147857666015625,
-0.01045989990234375,
0.02386474609375,
-0.064453125,
0.06793212890625,
0.0028667449951171875,
-0.057647705078125,
0.00927734375,
-0.038848876953125,
0.0168914794921875,
0.033294677734375,
0.019744873046875,
-0.01335906982421875,
-0.0128173828125,
0.045074462890625,
-0.040435791015625,
0.06243896484375,
-0.0283355712890625,
-0.00006157159805297852,
0.0197601318359375,
0.00910186767578125,
0.0273590087890625,
0.0172119140625,
-0.0102386474609375,
0.00705718994140625,
0.0057525634765625,
-0.036407470703125,
-0.039520263671875,
0.045623779296875,
-0.056854248046875,
-0.035247802734375,
-0.030914306640625,
-0.03167724609375,
0.0243682861328125,
0.01108551025390625,
0.05615234375,
0.0266571044921875,
-0.011627197265625,
-0.0007724761962890625,
0.053558349609375,
-0.0281219482421875,
0.049652099609375,
0.014404296875,
-0.0119476318359375,
-0.044708251953125,
0.05914306640625,
0.006961822509765625,
0.015655517578125,
0.0192718505859375,
0.037353515625,
-0.033905029296875,
-0.01000213623046875,
-0.061065673828125,
0.034454345703125,
-0.033447265625,
-0.0276031494140625,
-0.048248291015625,
-0.01666259765625,
-0.0294952392578125,
-0.02496337890625,
-0.0292816162109375,
-0.0175933837890625,
-0.0589599609375,
0.0182647705078125,
0.04888916015625,
0.0428466796875,
0.0010766983032226562,
0.0222625732421875,
-0.034942626953125,
0.0205535888671875,
0.0241546630859375,
0.021148681640625,
0.0134429931640625,
-0.059722900390625,
0.0093841552734375,
-0.00685882568359375,
-0.05584716796875,
-0.06964111328125,
0.07098388671875,
0.01538848876953125,
0.0280609130859375,
0.041473388671875,
-0.01012420654296875,
0.06854248046875,
-0.034149169921875,
0.047393798828125,
0.04901123046875,
-0.0618896484375,
0.04681396484375,
-0.03729248046875,
0.03314208984375,
0.01004791259765625,
0.048370361328125,
-0.036834716796875,
-0.03466796875,
-0.06201171875,
-0.056793212890625,
0.032073974609375,
0.035186767578125,
0.0218505859375,
0.0115203857421875,
0.044219970703125,
-0.00867462158203125,
-0.004917144775390625,
-0.06890869140625,
-0.04974365234375,
-0.02154541015625,
-0.003376007080078125,
0.0029144287109375,
0.00893402099609375,
0.0019083023071289062,
-0.048248291015625,
0.055999755859375,
0.002437591552734375,
0.038848876953125,
0.026336669921875,
0.0137786865234375,
-0.026885986328125,
0.002948760986328125,
0.03460693359375,
0.033050537109375,
-0.0025196075439453125,
-0.01800537109375,
0.001125335693359375,
-0.03515625,
0.00792694091796875,
0.004756927490234375,
-0.032501220703125,
0.01332855224609375,
-0.0001436471939086914,
0.06573486328125,
-0.013641357421875,
-0.0184478759765625,
0.0443115234375,
-0.0204925537109375,
-0.025482177734375,
-0.0277099609375,
0.01490020751953125,
0.01873779296875,
0.023406982421875,
0.0263671875,
0.0198974609375,
0.019317626953125,
-0.03656005859375,
0.00974273681640625,
0.036407470703125,
-0.0206451416015625,
-0.0272674560546875,
0.07806396484375,
-0.004711151123046875,
-0.01019287109375,
0.01922607421875,
-0.0280303955078125,
-0.0311737060546875,
0.054840087890625,
0.05145263671875,
0.040496826171875,
-0.023773193359375,
0.040496826171875,
0.05615234375,
-0.005584716796875,
-0.0030231475830078125,
0.01294708251953125,
0.018707275390625,
-0.04058837890625,
-0.0025272369384765625,
-0.052520751953125,
-0.00821685791015625,
0.01154327392578125,
-0.03228759765625,
0.036834716796875,
-0.04632568359375,
-0.02960205078125,
-0.00042247772216796875,
0.01061248779296875,
-0.048187255859375,
0.022491455078125,
-0.005367279052734375,
0.060546875,
-0.0679931640625,
0.050689697265625,
0.043975830078125,
-0.047607421875,
-0.0579833984375,
-0.0099029541015625,
0.0026912689208984375,
-0.051177978515625,
0.0290069580078125,
0.006072998046875,
0.01080322265625,
0.0168914794921875,
-0.044586181640625,
-0.06268310546875,
0.0994873046875,
0.04205322265625,
-0.0153045654296875,
-0.01274871826171875,
-0.025177001953125,
0.050811767578125,
-0.038330078125,
0.03363037109375,
0.035552978515625,
0.0333251953125,
0.037017822265625,
-0.051971435546875,
0.0183563232421875,
-0.037750244140625,
0.02081298828125,
-0.002166748046875,
-0.07318115234375,
0.056640625,
-0.01446533203125,
-0.02984619140625,
0.025482177734375,
0.045989990234375,
0.0127105712890625,
0.01397705078125,
0.035308837890625,
0.07257080078125,
0.037200927734375,
-0.0189971923828125,
0.08587646484375,
-0.0006499290466308594,
0.03326416015625,
0.03668212890625,
0.0174713134765625,
0.045654296875,
0.016815185546875,
-0.01146697998046875,
0.06536865234375,
0.0511474609375,
-0.0151824951171875,
0.040740966796875,
0.00384521484375,
-0.0190277099609375,
0.0012655258178710938,
-0.0186614990234375,
-0.04107666015625,
0.0011529922485351562,
0.0283966064453125,
-0.02471923828125,
-0.006458282470703125,
0.0030364990234375,
0.0095977783203125,
0.00623321533203125,
-0.00984954833984375,
0.050323486328125,
0.00008094310760498047,
-0.024383544921875,
0.060089111328125,
-0.007106781005859375,
0.077392578125,
-0.0252838134765625,
-0.0008306503295898438,
-0.026336669921875,
-0.004917144775390625,
-0.035400390625,
-0.0672607421875,
0.0240631103515625,
-0.00133514404296875,
0.007289886474609375,
-0.0229644775390625,
0.037689208984375,
-0.0247650146484375,
-0.0313720703125,
0.0204010009765625,
0.0158538818359375,
0.0224151611328125,
0.0243988037109375,
-0.08544921875,
0.0233306884765625,
-0.002468109130859375,
-0.0303497314453125,
0.0163116455078125,
0.024444580078125,
0.0322265625,
0.05084228515625,
0.0180206298828125,
0.00653076171875,
0.012664794921875,
0.0003418922424316406,
0.0504150390625,
-0.0253143310546875,
-0.03839111328125,
-0.062042236328125,
0.053314208984375,
-0.0170135498046875,
-0.0169525146484375,
0.045623779296875,
0.040435791015625,
0.05377197265625,
-0.0291290283203125,
0.039215087890625,
-0.033447265625,
0.006206512451171875,
-0.028106689453125,
0.07196044921875,
-0.0712890625,
-0.006038665771484375,
-0.048187255859375,
-0.056976318359375,
-0.021026611328125,
0.070556640625,
0.0011072158813476562,
0.006023406982421875,
0.0350341796875,
0.0628662109375,
-0.006565093994140625,
-0.0302276611328125,
0.0135498046875,
0.01105499267578125,
0.0147247314453125,
0.046783447265625,
0.0472412109375,
-0.05401611328125,
0.0302886962890625,
-0.051605224609375,
-0.0231781005859375,
-0.0003185272216796875,
-0.0692138671875,
-0.069580078125,
-0.055908203125,
-0.06781005859375,
-0.03472900390625,
-0.01395416259765625,
0.04876708984375,
0.07318115234375,
-0.03753662109375,
-0.021148681640625,
-0.01520538330078125,
0.00875091552734375,
-0.00992584228515625,
-0.02557373046875,
0.019287109375,
0.0035762786865234375,
-0.09100341796875,
-0.0195159912109375,
0.01016998291015625,
0.0499267578125,
0.0085906982421875,
-0.0221099853515625,
-0.0205078125,
-0.0080718994140625,
0.031646728515625,
0.0335693359375,
-0.041229248046875,
-0.01265716552734375,
-0.0135345458984375,
-0.003997802734375,
0.0333251953125,
0.00714874267578125,
-0.0286102294921875,
0.016754150390625,
0.046142578125,
0.0175018310546875,
0.05914306640625,
-0.0008702278137207031,
0.0138092041015625,
-0.05413818359375,
0.0155487060546875,
-0.002437591552734375,
0.031707763671875,
0.0204925537109375,
-0.043182373046875,
0.047088623046875,
0.042236328125,
-0.040802001953125,
-0.048614501953125,
0.0059967041015625,
-0.09014892578125,
-0.01409912109375,
0.0732421875,
-0.0180206298828125,
-0.031097412109375,
0.01210784912109375,
-0.031097412109375,
0.0182647705078125,
-0.046417236328125,
0.047332763671875,
0.03851318359375,
-0.029144287109375,
-0.0338134765625,
-0.0386962890625,
0.032196044921875,
0.0132293701171875,
-0.06207275390625,
-0.032684326171875,
0.033050537109375,
0.0546875,
0.038970947265625,
0.053497314453125,
-0.0269012451171875,
0.0239410400390625,
-0.007694244384765625,
0.0185546875,
0.015106201171875,
-0.0033473968505859375,
-0.0316162109375,
-0.004611968994140625,
-0.014801025390625,
-0.004566192626953125
]
] |
cl-tohoku/bert-base-japanese-char-v2 | 2021-09-23T13:45:24.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"fill-mask",
"ja",
"dataset:wikipedia",
"license:cc-by-sa-4.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | fill-mask | cl-tohoku | null | null | cl-tohoku/bert-base-japanese-char-v2 | 2 | 31,567 | transformers | 2022-03-02T23:29:05 | ---
language: ja
license: cc-by-sa-4.0
datasets:
- wikipedia
widget:
- text: 東北大学で[MASK]の研究をしています。
---
# BERT base Japanese (character-level tokenization with whole word masking, jawiki-20200831)
This is a [BERT](https://github.com/google-research/bert) model pretrained on texts in the Japanese language.
This version of the model processes input texts with word-level tokenization based on the Unidic 2.1.2 dictionary (available in [unidic-lite](https://pypi.org/project/unidic-lite/) package), followed by character-level tokenization.
Additionally, the model is trained with whole word masking enabled for the masked language modeling (MLM) objective.
The codes for the pretraining are available at [cl-tohoku/bert-japanese](https://github.com/cl-tohoku/bert-japanese/tree/v2.0).
## Model architecture
The model architecture is the same as the original BERT base model; 12 layers, 768 dimensions of hidden states, and 12 attention heads.
## Training Data
The models are trained on the Japanese version of Wikipedia.
The training corpus is generated from the Wikipedia Cirrussearch dump file as of August 31, 2020.
The generated corpus files are 4.0GB in total, containing approximately 30M sentences.
We used the [MeCab](https://taku910.github.io/mecab/) morphological parser with [mecab-ipadic-NEologd](https://github.com/neologd/mecab-ipadic-neologd) dictionary to split texts into sentences.
## Tokenization
The texts are first tokenized by MeCab with the Unidic 2.1.2 dictionary and then split into characters.
The vocabulary size is 6144.
We used [`fugashi`](https://github.com/polm/fugashi) and [`unidic-lite`](https://github.com/polm/unidic-lite) packages for the tokenization.
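The two-stage scheme described above — word segmentation followed by character splitting — can be sketched without the MeCab dependency. Here a pre-segmented word list stands in for fugashi's output, and the `char_tokenize` function name is illustrative, not part of the released code:

```python
def char_tokenize(words):
    """Split each pre-segmented word into characters, recording which word
    each character came from so whole word masking can later mask all of
    a word's characters together."""
    tokens, word_ids = [], []
    for i, word in enumerate(words):
        for ch in word:
            tokens.append(ch)
            word_ids.append(i)
    return tokens, word_ids

# "東北大学" would first be segmented by MeCab, e.g. into ["東北", "大学"]
tokens, word_ids = char_tokenize(["東北", "大学"])
print(tokens)    # ['東', '北', '大', '学']
print(word_ids)  # [0, 0, 1, 1]
```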
## Training
The models are trained with the same configuration as the original BERT; 512 tokens per instance, 256 instances per batch, and 1M training steps.
For training of the MLM (masked language modeling) objective, we introduced whole word masking in which all of the subword tokens corresponding to a single word (tokenized by MeCab) are masked at once.
For training of each model, we used a v3-8 instance of Cloud TPUs provided by [TensorFlow Research Cloud program](https://www.tensorflow.org/tfrc/).
The training took about 5 days to finish.
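Whole word masking as described above can be sketched on top of character tokens and their word ids: masking is decided per word, then applied to every character of the selected words at once. A simplified illustration, not the actual pretraining code:

```python
import random

def whole_word_mask(tokens, word_ids, mask_ratio=0.15,
                    mask_token="[MASK]", seed=0):
    """Mask every character token belonging to each selected word at once."""
    rng = random.Random(seed)
    n_words = max(word_ids) + 1
    n_mask = max(1, int(n_words * mask_ratio))
    masked_words = set(rng.sample(range(n_words), n_mask))
    return [mask_token if wid in masked_words else tok
            for tok, wid in zip(tokens, word_ids)]

# Characters of 東北 / 大学 with their word ids; one whole word gets masked.
print(whole_word_mask(['東', '北', '大', '学'], [0, 0, 1, 1], mask_ratio=0.5))
```

Note that both characters of the chosen word are replaced together, which is the property that distinguishes whole word masking from independent per-token masking.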
## Licenses
The pretrained models are distributed under the terms of the [Creative Commons Attribution-ShareAlike 3.0](https://creativecommons.org/licenses/by-sa/3.0/).
## Acknowledgments
This model is trained with Cloud TPUs provided by [TensorFlow Research Cloud](https://www.tensorflow.org/tfrc/) program.
| 2,572 | [
[
-0.028839111328125,
-0.065673828125,
0.0141754150390625,
0.01322174072265625,
-0.0513916015625,
0.004764556884765625,
-0.022705078125,
-0.03302001953125,
0.035888671875,
0.041900634765625,
-0.053680419921875,
-0.044342041015625,
-0.04766845703125,
-0.0004963874816894531,
-0.011749267578125,
0.085693359375,
-0.0005235671997070312,
0.0175018310546875,
0.0203857421875,
0.0159912109375,
-0.0230865478515625,
-0.0523681640625,
-0.056793212890625,
-0.025787353515625,
0.0301055908203125,
0.010986328125,
0.032562255859375,
0.032501220703125,
0.01531982421875,
0.0152435302734375,
-0.00010395050048828125,
-0.005962371826171875,
-0.041229248046875,
-0.01393890380859375,
0.002422332763671875,
-0.03033447265625,
-0.0179290771484375,
-0.001369476318359375,
0.0472412109375,
0.05523681640625,
0.005046844482421875,
0.0034637451171875,
-0.006641387939453125,
0.029022216796875,
-0.039520263671875,
0.0031070709228515625,
-0.0555419921875,
0.0036754608154296875,
-0.0254364013671875,
0.016754150390625,
-0.02288818359375,
0.007198333740234375,
0.00911712646484375,
-0.0535888671875,
0.026580810546875,
0.001049041748046875,
0.09320068359375,
-0.002758026123046875,
-0.004322052001953125,
-0.017181396484375,
-0.03143310546875,
0.0540771484375,
-0.072265625,
0.028839111328125,
0.042755126953125,
-0.0009589195251464844,
-0.00969696044921875,
-0.064697265625,
-0.048309326171875,
-0.01294708251953125,
0.0040740966796875,
0.006183624267578125,
-0.006587982177734375,
0.01364898681640625,
0.0241241455078125,
0.0175018310546875,
-0.054595947265625,
0.0268402099609375,
-0.03216552734375,
-0.0239410400390625,
0.0305023193359375,
-0.00612640380859375,
0.0259552001953125,
-0.0298614501953125,
-0.034942626953125,
-0.0175323486328125,
-0.03753662109375,
0.00901031494140625,
0.0279998779296875,
0.0158538818359375,
-0.003810882568359375,
0.039276123046875,
-0.0027790069580078125,
0.035675048828125,
-0.01238250732421875,
-0.0183868408203125,
0.0249786376953125,
-0.0259857177734375,
-0.0295257568359375,
0.007083892822265625,
0.07501220703125,
0.006626129150390625,
0.0294647216796875,
-0.0158538818359375,
-0.023590087890625,
-0.0038204193115234375,
0.019378662109375,
-0.0687255859375,
-0.01383209228515625,
0.00965118408203125,
-0.04510498046875,
-0.025177001953125,
0.0043182373046875,
-0.03515625,
0.0007109642028808594,
-0.00010955333709716797,
0.054534912109375,
-0.047821044921875,
-0.0224761962890625,
0.01422119140625,
-0.01050567626953125,
0.0122833251953125,
0.0004992485046386719,
-0.069580078125,
0.0160064697265625,
0.04412841796875,
0.0582275390625,
0.0010023117065429688,
-0.0193939208984375,
0.002262115478515625,
0.006435394287109375,
-0.028228759765625,
0.0274810791015625,
-0.0219268798828125,
-0.0305938720703125,
0.00229644775390625,
0.01032257080078125,
-0.0167236328125,
-0.0166473388671875,
0.034149169921875,
-0.03857421875,
0.0289154052734375,
-0.003643035888671875,
-0.0589599609375,
-0.01537322998046875,
0.0176239013671875,
-0.0318603515625,
0.08245849609375,
0.0035076141357421875,
-0.04949951171875,
0.01548004150390625,
-0.06280517578125,
-0.031494140625,
0.0237579345703125,
0.00954437255859375,
-0.034149169921875,
0.0013628005981445312,
0.0177459716796875,
0.034576416015625,
0.0070037841796875,
0.029296875,
-0.01259613037109375,
-0.0294647216796875,
0.019989013671875,
-0.0207061767578125,
0.08673095703125,
0.01450347900390625,
-0.0450439453125,
0.0014467239379882812,
-0.057220458984375,
0.005107879638671875,
0.016998291015625,
-0.0195770263671875,
-0.02667236328125,
-0.0244598388671875,
0.017364501953125,
0.01556396484375,
0.036773681640625,
-0.056915283203125,
0.00904083251953125,
-0.043212890625,
0.033660888671875,
0.055084228515625,
-0.0084075927734375,
0.017181396484375,
-0.005889892578125,
0.0308990478515625,
0.00347900390625,
0.0181884765625,
-0.0273284912109375,
-0.049285888671875,
-0.08074951171875,
-0.0271759033203125,
0.0556640625,
0.03271484375,
-0.055511474609375,
0.06658935546875,
-0.0258026123046875,
-0.031982421875,
-0.06512451171875,
0.0015201568603515625,
0.037109375,
0.0311279296875,
0.020172119140625,
-0.04034423828125,
-0.049713134765625,
-0.07049560546875,
0.01128387451171875,
-0.011810302734375,
-0.0167083740234375,
0.0007390975952148438,
0.0537109375,
-0.034393310546875,
0.053955078125,
-0.0222930908203125,
-0.0310211181640625,
-0.0199432373046875,
0.0218963623046875,
0.022613525390625,
0.046783447265625,
0.033935546875,
-0.0380859375,
-0.036529541015625,
-0.0180816650390625,
-0.035858154296875,
-0.0027904510498046875,
-0.004207611083984375,
-0.0230712890625,
0.027374267578125,
0.04522705078125,
-0.041259765625,
0.0297088623046875,
0.037811279296875,
-0.025482177734375,
0.028350830078125,
-0.01328277587890625,
-0.01641845703125,
-0.10784912109375,
0.032928466796875,
-0.01561737060546875,
-0.020477294921875,
-0.0570068359375,
0.004421234130859375,
-0.0036220550537109375,
-0.0180511474609375,
-0.0255889892578125,
0.04608154296875,
-0.04010009765625,
0.00217437744140625,
-0.021820068359375,
0.00013709068298339844,
-0.0017232894897460938,
0.062164306640625,
0.01189422607421875,
0.0511474609375,
0.03643798828125,
-0.04400634765625,
0.004413604736328125,
0.005641937255859375,
-0.053741455078125,
-0.004459381103515625,
-0.057373046875,
0.01242828369140625,
-0.00347900390625,
0.01329803466796875,
-0.07562255859375,
-0.006549835205078125,
0.040252685546875,
-0.04473876953125,
0.042510986328125,
0.0118408203125,
-0.060760498046875,
-0.0272369384765625,
-0.029815673828125,
0.00638580322265625,
0.049652099609375,
-0.0308990478515625,
0.0318603515625,
0.0289459228515625,
-0.003452301025390625,
-0.0574951171875,
-0.061187744140625,
0.0239105224609375,
0.01296234130859375,
-0.03033447265625,
0.039215087890625,
-0.00640869140625,
0.0106658935546875,
0.011627197265625,
0.005767822265625,
-0.005924224853515625,
0.0171051025390625,
0.00913238525390625,
0.02783203125,
-0.01316070556640625,
0.00426483154296875,
0.006855010986328125,
-0.002765655517578125,
-0.004863739013671875,
-0.017181396484375,
0.072509765625,
0.00801849365234375,
-0.00696563720703125,
-0.03302001953125,
0.0196990966796875,
0.0219268798828125,
-0.0041351318359375,
0.07861328125,
0.06787109375,
-0.038726806640625,
-0.0033416748046875,
-0.04718017578125,
-0.01529693603515625,
-0.0322265625,
0.04345703125,
-0.035797119140625,
-0.07476806640625,
0.04107666015625,
0.02587890625,
0.0228729248046875,
0.0452880859375,
0.0430908203125,
-0.017333984375,
0.07080078125,
0.05352783203125,
-0.026611328125,
0.047271728515625,
-0.031982421875,
0.0233306884765625,
-0.0640869140625,
-0.0304412841796875,
-0.0274200439453125,
-0.0254364013671875,
-0.0469970703125,
-0.0286865234375,
0.010711669921875,
0.0229034423828125,
-0.018341064453125,
0.0277099609375,
-0.03399658203125,
0.0338134765625,
0.050750732421875,
0.013519287109375,
0.0012035369873046875,
0.02520751953125,
-0.02587890625,
-0.00879669189453125,
-0.051910400390625,
-0.036041259765625,
0.08917236328125,
0.046295166015625,
0.033660888671875,
-0.007038116455078125,
0.04962158203125,
0.00730133056640625,
0.0222320556640625,
-0.05126953125,
0.043243408203125,
-0.034027099609375,
-0.076416015625,
-0.0234222412109375,
-0.0214385986328125,
-0.0787353515625,
0.0234375,
-0.022308349609375,
-0.053619384765625,
0.00017023086547851562,
-0.022064208984375,
-0.0036334991455078125,
0.0362548828125,
-0.046966552734375,
0.06280517578125,
-0.020538330078125,
0.00830841064453125,
-0.0129852294921875,
-0.0684814453125,
0.02618408203125,
-0.014312744140625,
0.00876617431640625,
-0.004352569580078125,
-0.0067596435546875,
0.0828857421875,
-0.034454345703125,
0.07635498046875,
-0.0160980224609375,
0.0010137557983398438,
0.004364013671875,
-0.017730712890625,
0.012115478515625,
-0.0133819580078125,
0.01407623291015625,
0.04803466796875,
-0.0120391845703125,
-0.03936767578125,
-0.0148162841796875,
0.044189453125,
-0.083251953125,
-0.02850341796875,
-0.034637451171875,
-0.0254364013671875,
-0.0031986236572265625,
0.038299560546875,
0.043792724609375,
0.020111083984375,
-0.019775390625,
0.023101806640625,
0.06414794921875,
-0.0197296142578125,
0.04034423828125,
0.045135498046875,
-0.0230865478515625,
-0.033416748046875,
0.06402587890625,
0.00981903076171875,
0.0087890625,
0.044219970703125,
0.001003265380859375,
-0.0254364013671875,
-0.037628173828125,
-0.033935546875,
0.037322998046875,
-0.037017822265625,
-0.0019893646240234375,
-0.06591796875,
-0.037933349609375,
-0.050079345703125,
0.005157470703125,
-0.021087646484375,
-0.0312347412109375,
-0.038299560546875,
-0.003856658935546875,
0.000919342041015625,
0.0416259765625,
0.0037746429443359375,
0.038543701171875,
-0.044708251953125,
0.0205230712890625,
0.01296234130859375,
0.0210418701171875,
-0.000820159912109375,
-0.055511474609375,
-0.031707763671875,
0.0207061767578125,
-0.012420654296875,
-0.052490234375,
0.0271759033203125,
0.00740814208984375,
0.049896240234375,
0.04083251953125,
-0.014007568359375,
0.048492431640625,
-0.0311279296875,
0.07305908203125,
0.0299530029296875,
-0.07525634765625,
0.033172607421875,
-0.0194854736328125,
0.0298004150390625,
0.04632568359375,
0.0382080078125,
-0.04608154296875,
-0.033355712890625,
-0.059417724609375,
-0.059326171875,
0.062042236328125,
0.02191162109375,
0.0283966064453125,
0.0026874542236328125,
0.031463623046875,
-0.0008020401000976562,
0.02337646484375,
-0.07855224609375,
-0.0312347412109375,
-0.031280517578125,
-0.019866943359375,
-0.0243377685546875,
-0.04510498046875,
0.006488800048828125,
-0.0245513916015625,
0.07733154296875,
0.00841522216796875,
0.0305938720703125,
0.007366180419921875,
-0.024810791015625,
0.001476287841796875,
0.006595611572265625,
0.055755615234375,
0.0391845703125,
-0.029022216796875,
-0.004154205322265625,
-0.00191497802734375,
-0.05328369140625,
-0.01271820068359375,
0.0102996826171875,
-0.0264892578125,
0.04046630859375,
0.034515380859375,
0.0997314453125,
0.0252838134765625,
-0.04620361328125,
0.0401611328125,
0.0002052783966064453,
-0.03271484375,
-0.016143798828125,
-0.0023651123046875,
0.00867462158203125,
-0.008880615234375,
0.03033447265625,
-0.024688720703125,
-0.0087890625,
-0.038665771484375,
-0.0009870529174804688,
0.032958984375,
-0.0174713134765625,
-0.021484375,
0.04296875,
0.0120849609375,
-0.0186309814453125,
0.055938720703125,
-0.0097198486328125,
-0.061492919921875,
0.0411376953125,
0.051910400390625,
0.0743408203125,
-0.017578125,
0.018157958984375,
0.045166015625,
0.044952392578125,
0.007472991943359375,
-0.001132965087890625,
-0.00879669189453125,
-0.06494140625,
-0.0248870849609375,
-0.062042236328125,
-0.019744873046875,
0.044464111328125,
-0.034423828125,
0.0108642578125,
-0.046905517578125,
-0.0092620849609375,
-0.0003199577331542969,
0.0274658203125,
-0.034210205078125,
0.0263824462890625,
0.0194549560546875,
0.0634765625,
-0.05657958984375,
0.08441162109375,
0.055450439453125,
-0.0311737060546875,
-0.058013916015625,
-0.006061553955078125,
-0.035858154296875,
-0.08294677734375,
0.04229736328125,
0.0253143310546875,
0.015960693359375,
0.00644683837890625,
-0.04833984375,
-0.0565185546875,
0.06585693359375,
0.00860595703125,
-0.04443359375,
-0.01396942138671875,
0.0167388916015625,
0.053680419921875,
-0.0311431884765625,
0.0093536376953125,
0.025146484375,
0.0188751220703125,
0.00206756591796875,
-0.06591796875,
-0.012176513671875,
-0.0306396484375,
0.031768798828125,
0.0057373046875,
-0.030792236328125,
0.06787109375,
0.00649261474609375,
-0.01837158203125,
0.010772705078125,
0.037811279296875,
0.016571044921875,
-0.001247406005859375,
0.043548583984375,
0.06329345703125,
0.0491943359375,
0.0034656524658203125,
0.060882568359375,
-0.0306854248046875,
0.02825927734375,
0.06280517578125,
0.01442718505859375,
0.058746337890625,
0.0321044921875,
-0.00504302978515625,
0.05145263671875,
0.060760498046875,
-0.013763427734375,
0.05328369140625,
0.00957489013671875,
-0.014312744140625,
-0.01117706298828125,
-0.01425933837890625,
-0.029754638671875,
0.01934814453125,
0.0288543701171875,
-0.037994384765625,
-0.00972747802734375,
0.01137542724609375,
0.0235595703125,
-0.025909423828125,
-0.0272979736328125,
0.05975341796875,
0.0089111328125,
-0.048553466796875,
0.047760009765625,
0.0309600830078125,
0.07647705078125,
-0.07452392578125,
0.02581787109375,
-0.01303863525390625,
0.0088653564453125,
0.0081939697265625,
-0.046112060546875,
0.0015859603881835938,
0.01338958740234375,
-0.0181884765625,
-0.0253448486328125,
0.0523681640625,
-0.04534912109375,
-0.048553466796875,
0.0201416015625,
0.0111541748046875,
0.034637451171875,
0.004848480224609375,
-0.071044921875,
0.007354736328125,
0.018585205078125,
-0.0224151611328125,
0.032196044921875,
0.012939453125,
-0.0043487548828125,
0.03570556640625,
0.0679931640625,
0.00848388671875,
0.01548004150390625,
0.02325439453125,
0.057891845703125,
-0.037750244140625,
-0.04205322265625,
-0.0628662109375,
0.0275726318359375,
-0.01033782958984375,
-0.030670166015625,
0.04803466796875,
0.03851318359375,
0.07269287109375,
-0.01551055908203125,
0.044830322265625,
-0.00937652587890625,
0.036651611328125,
-0.040374755859375,
0.058624267578125,
-0.053192138671875,
-0.0005440711975097656,
-0.0138092041015625,
-0.07354736328125,
-0.01473236083984375,
0.060516357421875,
0.00841522216796875,
0.00603485107421875,
0.04376220703125,
0.0555419921875,
0.00611114501953125,
-0.0167236328125,
0.026031494140625,
0.0185546875,
0.0212249755859375,
0.0439453125,
0.0316162109375,
-0.05145263671875,
0.0281524658203125,
-0.035797119140625,
-0.01036834716796875,
-0.018341064453125,
-0.05084228515625,
-0.08917236328125,
-0.045013427734375,
-0.016937255859375,
-0.027191162109375,
0.00021767616271972656,
0.061279296875,
0.051971435546875,
-0.06304931640625,
-0.02459716796875,
-0.0084075927734375,
-0.01256561279296875,
0.0127105712890625,
-0.019073486328125,
0.0298614501953125,
-0.029266357421875,
-0.05828857421875,
0.01207733154296875,
0.004302978515625,
0.01032257080078125,
-0.0289459228515625,
0.0007781982421875,
-0.03839111328125,
-0.00250244140625,
0.036865234375,
0.0097198486328125,
-0.0555419921875,
-0.004817962646484375,
0.00740814208984375,
-0.028350830078125,
-0.00994110107421875,
0.04730224609375,
-0.03509521484375,
0.046356201171875,
0.0248870849609375,
0.04241943359375,
0.06732177734375,
-0.016845703125,
0.0261993408203125,
-0.07073974609375,
0.024810791015625,
0.010986328125,
0.026947021484375,
0.0283203125,
-0.0160369873046875,
0.034576416015625,
0.0309600830078125,
-0.01629638671875,
-0.069580078125,
-0.007781982421875,
-0.06976318359375,
-0.043212890625,
0.0771484375,
-0.01470947265625,
-0.0275421142578125,
-0.0009655952453613281,
-0.0115814208984375,
0.035308837890625,
-0.010528564453125,
0.05120849609375,
0.07696533203125,
0.0302734375,
-0.01580810546875,
-0.0209197998046875,
0.02532958984375,
0.039703369140625,
-0.041595458984375,
-0.0291290283203125,
0.01013946533203125,
0.043212890625,
0.0243072509765625,
0.068115234375,
-0.0007538795471191406,
0.002437591552734375,
-0.01041412353515625,
0.0295562744140625,
-0.01088714599609375,
-0.01210784912109375,
-0.023040771484375,
-0.004619598388671875,
-0.022796630859375,
-0.038909912109375
]
] |
IDEA-CCNL/Wenzhong-GPT2-110M | 2023-05-25T09:48:34.000Z | [
"transformers",
"pytorch",
"safetensors",
"gpt2",
"text-generation",
"generate",
"zh",
"arxiv:2209.02970",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | IDEA-CCNL | null | null | IDEA-CCNL/Wenzhong-GPT2-110M | 25 | 31,511 | transformers | 2022-05-23T03:15:36 | ---
language:
- zh
inference:
parameters:
temperature: 0.7
top_p: 0.6
repetition_penalty: 1.1
max_new_tokens: 128
num_return_sequences: 3
do_sample: true
license: apache-2.0
tags:
- generate
- gpt2
widget:
- 北京是中国的
- 西湖的景色
---
# Wenzhong-GPT2-110M
- Main Page:[Fengshenbang](https://fengshenbang-lm.com/)
- Github: [Fengshenbang-LM](https://github.com/IDEA-CCNL/Fengshenbang-LM)
## 简介 Brief Introduction
善于处理NLG任务,中文版的GPT2-Small。
Focused on handling NLG tasks, Chinese GPT2-Small.
## 模型分类 Model Taxonomy
| 需求 Demand | 任务 Task | 系列 Series | 模型 Model | 参数 Parameter | 额外 Extra |
| :----: | :----: | :----: | :----: | :----: | :----: |
| 通用 General | 自然语言生成 NLG | 闻仲 Wenzhong | GPT2 | 110M | 中文 Chinese |
## 模型信息 Model Information
类似于Wenzhong2.0-GPT2-3.5B-chinese,我们实现了一个small版本的12层的Wenzhong-GPT2-110M,并且在悟道(300G版本)上面进行预训练。
Similar to Wenzhong2.0-GPT2-3.5B-chinese, we implement a small size Wenzhong-GPT2-110M with 12 layers, which is pre-trained on Wudao Corpus (300G version).
## 使用 Usage
### 加载模型 Loading Models
```python
from transformers import GPT2Tokenizer, GPT2LMHeadModel
hf_model_path = 'IDEA-CCNL/Wenzhong-GPT2-110M'
tokenizer = GPT2Tokenizer.from_pretrained(hf_model_path)
model = GPT2LMHeadModel.from_pretrained(hf_model_path)
```
### 使用示例 Usage Examples
```python
question = "北京是中国的"
inputs = tokenizer(question, return_tensors='pt')
generation_output = model.generate(**inputs,
                                   return_dict_in_generate=True,
                                   output_scores=True,
                                   max_length=150,
                                   # max_new_tokens=80,
                                   do_sample=True,
                                   top_p=0.6,
                                   # num_beams=5,
                                   eos_token_id=50256,
                                   pad_token_id=0,
                                   num_return_sequences=5)
for idx,sentence in enumerate(generation_output.sequences):
print('next sentence %d:\n'%idx,
tokenizer.decode(sentence).split('<|endoftext|>')[0])
print('*'*40)
```
## 引用 Citation
如果您在您的工作中使用了我们的模型,可以引用我们的[论文](https://arxiv.org/abs/2209.02970):
If you are using the resource for your work, please cite our [paper](https://arxiv.org/abs/2209.02970):
```text
@article{fengshenbang,
author = {Jiaxing Zhang and Ruyi Gan and Junjie Wang and Yuxiang Zhang and Lin Zhang and Ping Yang and Xinyu Gao and Ziwei Wu and Xiaoqun Dong and Junqing He and Jianheng Zhuo and Qi Yang and Yongfeng Huang and Xiayu Li and Yanghan Wu and Junyu Lu and Xinyu Zhu and Weifeng Chen and Ting Han and Kunhao Pan and Rui Wang and Hao Wang and Xiaojun Wu and Zhongshen Zeng and Chongpei Chen},
title = {Fengshenbang 1.0: Being the Foundation of Chinese Cognitive Intelligence},
journal = {CoRR},
volume = {abs/2209.02970},
year = {2022}
}
```
也可以引用我们的[网站](https://github.com/IDEA-CCNL/Fengshenbang-LM/):
You can also cite our [website](https://github.com/IDEA-CCNL/Fengshenbang-LM/):
```text
@misc{Fengshenbang-LM,
title={Fengshenbang-LM},
author={IDEA-CCNL},
year={2021},
howpublished={\url{https://github.com/IDEA-CCNL/Fengshenbang-LM}},
}
```
| 3,271 | [
[
-0.02386474609375,
-0.05255126953125,
0.023468017578125,
0.0196685791015625,
-0.025787353515625,
-0.028778076171875,
-0.035888671875,
-0.027435302734375,
-0.0021820068359375,
0.0193634033203125,
-0.034881591796875,
-0.0439453125,
-0.034759521484375,
0.001953125,
0.0103759765625,
0.0701904296875,
-0.0008707046508789062,
0.004840850830078125,
-0.00691986083984375,
-0.0008873939514160156,
-0.02099609375,
-0.021209716796875,
-0.06463623046875,
-0.0269927978515625,
0.00994873046875,
0.0063629150390625,
0.0279388427734375,
0.03997802734375,
0.0279998779296875,
0.021026611328125,
-0.01349639892578125,
0.011810302734375,
-0.015106201171875,
-0.0267181396484375,
-0.0014438629150390625,
-0.0222625732421875,
-0.0462646484375,
-0.0001913309097290039,
0.0496826171875,
0.04144287109375,
-0.00360107421875,
0.0252685546875,
0.031982421875,
0.03753662109375,
-0.0032100677490234375,
0.0252227783203125,
-0.03302001953125,
-0.006908416748046875,
-0.018768310546875,
-0.007038116455078125,
-0.02374267578125,
-0.0238800048828125,
0.005931854248046875,
-0.056976318359375,
0.0028934478759765625,
0.0030689239501953125,
0.09332275390625,
0.01039886474609375,
-0.027069091796875,
-0.0115814208984375,
-0.01442718505859375,
0.07220458984375,
-0.08428955078125,
0.01114654541015625,
0.03399658203125,
-0.007083892822265625,
-0.00981903076171875,
-0.07122802734375,
-0.047698974609375,
-0.0137939453125,
-0.040130615234375,
0.021759033203125,
-0.00476837158203125,
-0.0055389404296875,
0.0247650146484375,
-0.0016641616821289062,
-0.06903076171875,
0.0006451606750488281,
-0.028228759765625,
-0.02301025390625,
0.046722412109375,
0.007724761962890625,
0.0303192138671875,
-0.04296875,
-0.035125732421875,
-0.01165008544921875,
-0.030792236328125,
0.01824951171875,
0.01126861572265625,
0.020751953125,
-0.0242919921875,
0.035919189453125,
-0.0173797607421875,
0.04608154296875,
0.0021228790283203125,
-0.01061248779296875,
0.039642333984375,
-0.057373046875,
-0.0172271728515625,
-0.00933074951171875,
0.083740234375,
0.02471923828125,
0.0036754608154296875,
-0.00171661376953125,
-0.005527496337890625,
-0.031219482421875,
-0.01763916015625,
-0.061492919921875,
-0.0230712890625,
0.0294647216796875,
-0.042388916015625,
-0.005832672119140625,
0.0230255126953125,
-0.057098388671875,
0.0110626220703125,
-0.0121002197265625,
0.041015625,
-0.04058837890625,
-0.03350830078125,
0.0000909566879272461,
-0.01409912109375,
0.0374755859375,
0.0029277801513671875,
-0.055938720703125,
0.01345062255859375,
0.046478271484375,
0.0732421875,
-0.0017137527465820312,
-0.03515625,
-0.0199127197265625,
0.00926971435546875,
-0.0194091796875,
0.036407470703125,
0.000049233436584472656,
-0.01300811767578125,
-0.0152130126953125,
0.006679534912109375,
-0.01371002197265625,
-0.024871826171875,
0.041961669921875,
-0.0211334228515625,
0.04486083984375,
-0.038360595703125,
-0.0242767333984375,
-0.020172119140625,
0.0218658447265625,
-0.032928466796875,
0.0927734375,
0.0177154541015625,
-0.0765380859375,
0.01702880859375,
-0.057647705078125,
-0.025115966796875,
0.00815582275390625,
-0.005718231201171875,
-0.034088134765625,
-0.028778076171875,
0.0181427001953125,
0.022369384765625,
-0.018585205078125,
0.020416259765625,
-0.004291534423828125,
-0.0261077880859375,
0.00872802734375,
-0.0159912109375,
0.10986328125,
0.015411376953125,
-0.035369873046875,
0.0207061767578125,
-0.050445556640625,
0.018707275390625,
0.0250396728515625,
-0.033172607421875,
-0.0241546630859375,
-0.002758026123046875,
0.01479339599609375,
0.029754638671875,
0.04730224609375,
-0.0262451171875,
0.0156707763671875,
-0.038116455078125,
0.042724609375,
0.06109619140625,
0.00438690185546875,
0.0213165283203125,
-0.0246429443359375,
0.014892578125,
0.00107574462890625,
0.004978179931640625,
0.0028514862060546875,
-0.048065185546875,
-0.055633544921875,
-0.008209228515625,
0.0206756591796875,
0.05877685546875,
-0.05975341796875,
0.05743408203125,
-0.02410888671875,
-0.036163330078125,
-0.024383544921875,
0.001186370849609375,
0.0343017578125,
0.038726806640625,
0.0435791015625,
-0.00608062744140625,
-0.0479736328125,
-0.048583984375,
-0.01091766357421875,
-0.0283660888671875,
-0.0108795166015625,
0.030670166015625,
0.048583984375,
-0.016387939453125,
0.05755615234375,
-0.041595458984375,
-0.0216064453125,
-0.0179443359375,
0.02203369140625,
0.03656005859375,
0.05084228515625,
0.039215087890625,
-0.0511474609375,
-0.04052734375,
-0.005382537841796875,
-0.057403564453125,
-0.0084686279296875,
-0.017913818359375,
-0.0352783203125,
0.0213165283203125,
0.024444580078125,
-0.05841064453125,
0.015716552734375,
0.01611328125,
-0.040008544921875,
0.05780029296875,
-0.0175628662109375,
-0.0014047622680664062,
-0.10284423828125,
0.0199127197265625,
-0.003437042236328125,
0.01177215576171875,
-0.043670654296875,
0.00467681884765625,
-0.005100250244140625,
0.0172119140625,
-0.0278167724609375,
0.048431396484375,
-0.049163818359375,
0.01103973388671875,
-0.00254058837890625,
0.01788330078125,
-0.004146575927734375,
0.059234619140625,
-0.0005865097045898438,
0.045440673828125,
0.044708251953125,
-0.05145263671875,
0.025787353515625,
0.032470703125,
-0.0235443115234375,
0.01291656494140625,
-0.062255859375,
0.006130218505859375,
0.0107421875,
0.018035888671875,
-0.08306884765625,
-0.0067596435546875,
0.041015625,
-0.048370361328125,
0.0168304443359375,
-0.0019044876098632812,
-0.039276123046875,
-0.04412841796875,
-0.04412841796875,
0.02801513671875,
0.04522705078125,
-0.03338623046875,
0.020172119140625,
0.01078033447265625,
-0.0140228271484375,
-0.046905517578125,
-0.07177734375,
-0.005260467529296875,
-0.01473236083984375,
-0.05279541015625,
0.041595458984375,
-0.0081329345703125,
-0.0002582073211669922,
0.01029205322265625,
0.01261138916015625,
0.01039886474609375,
-0.00623321533203125,
-0.00261688232421875,
0.034576416015625,
-0.02301025390625,
-0.00008100271224975586,
-0.007091522216796875,
-0.009765625,
0.012451171875,
-0.04150390625,
0.048065185546875,
-0.00464630126953125,
-0.01290130615234375,
-0.0311279296875,
0.01531219482421875,
0.01268768310546875,
-0.0222320556640625,
0.05230712890625,
0.08343505859375,
-0.037567138671875,
-0.0027484893798828125,
-0.0181427001953125,
-0.01227569580078125,
-0.036834716796875,
0.05328369140625,
-0.0223541259765625,
-0.054840087890625,
0.0364990234375,
0.014801025390625,
0.01340484619140625,
0.061370849609375,
0.04791259765625,
0.0078887939453125,
0.059356689453125,
0.03155517578125,
-0.0020503997802734375,
0.03173828125,
-0.038726806640625,
0.0077056884765625,
-0.08148193359375,
-0.0242919921875,
-0.030853271484375,
-0.012603759765625,
-0.052764892578125,
-0.03265380859375,
0.01708984375,
0.01727294921875,
-0.03570556640625,
0.036865234375,
-0.0599365234375,
0.0010747909545898438,
0.058990478515625,
0.003658294677734375,
-0.0030670166015625,
-0.00672149658203125,
-0.0271148681640625,
0.00637054443359375,
-0.046661376953125,
-0.034881591796875,
0.07928466796875,
0.01751708984375,
0.0261383056640625,
0.0181427001953125,
0.04638671875,
-0.00394439697265625,
0.001331329345703125,
-0.03985595703125,
0.0543212890625,
-0.003559112548828125,
-0.038482666015625,
-0.040679931640625,
-0.039306640625,
-0.06964111328125,
0.0169219970703125,
0.005176544189453125,
-0.0714111328125,
0.004138946533203125,
-0.01453399658203125,
-0.03076171875,
0.02813720703125,
-0.040130615234375,
0.0657958984375,
-0.0240478515625,
-0.025909423828125,
-0.0013828277587890625,
-0.04852294921875,
0.033935546875,
0.00743865966796875,
0.0075225830078125,
0.0003826618194580078,
-0.004425048828125,
0.0926513671875,
-0.032470703125,
0.04693603515625,
-0.0252227783203125,
0.0032711029052734375,
0.0379638671875,
-0.025390625,
0.045074462890625,
0.0126190185546875,
-0.0030269622802734375,
0.0248870849609375,
-0.0028209686279296875,
-0.034912109375,
-0.018951416015625,
0.047515869140625,
-0.06378173828125,
-0.041473388671875,
-0.048553466796875,
-0.038482666015625,
0.005023956298828125,
0.036346435546875,
0.042083740234375,
0.0177459716796875,
0.0024471282958984375,
0.00811004638671875,
0.0302886962890625,
-0.0260467529296875,
0.061614990234375,
0.0251007080078125,
-0.005504608154296875,
-0.0421142578125,
0.06256103515625,
0.0298919677734375,
0.0031528472900390625,
0.032135009765625,
0.0152587890625,
-0.0290679931640625,
-0.0543212890625,
-0.0291595458984375,
0.028839111328125,
-0.036590576171875,
-0.00774383544921875,
-0.0474853515625,
-0.033172607421875,
-0.04974365234375,
0.0140838623046875,
-0.007640838623046875,
-0.0250091552734375,
-0.030670166015625,
-0.0115814208984375,
0.02801513671875,
0.03350830078125,
-0.0214996337890625,
0.0020122528076171875,
-0.0343017578125,
0.02398681640625,
0.02557373046875,
0.0191802978515625,
0.01174163818359375,
-0.057586669921875,
-0.039276123046875,
0.0140533447265625,
-0.023834228515625,
-0.05377197265625,
0.0362548828125,
-0.0016536712646484375,
0.055694580078125,
0.032928466796875,
0.00420379638671875,
0.04766845703125,
-0.02142333984375,
0.07818603515625,
0.00977325439453125,
-0.068115234375,
0.0438232421875,
-0.0295257568359375,
0.01763916015625,
0.02197265625,
0.0156402587890625,
-0.0419921875,
0.0002415180206298828,
-0.04290771484375,
-0.08203125,
0.07574462890625,
0.020233154296875,
-0.00154876708984375,
-0.001598358154296875,
0.019287109375,
-0.01214599609375,
-0.00649261474609375,
-0.07049560546875,
-0.03759765625,
-0.02740478515625,
-0.0155487060546875,
0.0045928955078125,
-0.02581787109375,
-0.0168914794921875,
-0.032867431640625,
0.0723876953125,
-0.005611419677734375,
0.06109619140625,
0.026580810546875,
0.00022363662719726562,
0.0126190185546875,
0.0117645263671875,
0.04931640625,
0.041717529296875,
-0.0195770263671875,
0.0021305084228515625,
0.0282745361328125,
-0.04632568359375,
-0.00528717041015625,
0.0222625732421875,
-0.030487060546875,
0.00385284423828125,
0.0423583984375,
0.061279296875,
-0.00844573974609375,
-0.025543212890625,
0.032012939453125,
0.01194000244140625,
-0.0340576171875,
-0.033721923828125,
0.0020313262939453125,
0.0193634033203125,
0.0185394287109375,
0.042388916015625,
-0.006244659423828125,
-0.003299713134765625,
-0.0309906005859375,
0.0214996337890625,
0.033966064453125,
-0.015716552734375,
-0.01751708984375,
0.0555419921875,
-0.005420684814453125,
-0.012786865234375,
0.04559326171875,
-0.0455322265625,
-0.049713134765625,
0.058135986328125,
0.047637939453125,
0.06536865234375,
0.00786590576171875,
0.00904083251953125,
0.0662841796875,
0.0299530029296875,
-0.016845703125,
0.028228759765625,
-0.0002262592315673828,
-0.050445556640625,
-0.0176239013671875,
-0.03717041015625,
0.001766204833984375,
0.018585205078125,
-0.048583984375,
0.015350341796875,
-0.0291900634765625,
-0.0171966552734375,
-0.0185089111328125,
0.0389404296875,
-0.036956787109375,
0.02056884765625,
0.01132965087890625,
0.056182861328125,
-0.04327392578125,
0.0782470703125,
0.0526123046875,
-0.043914794921875,
-0.07733154296875,
0.0082855224609375,
-0.004749298095703125,
-0.054595947265625,
0.040435791015625,
0.00841522216796875,
0.006378173828125,
0.00945281982421875,
-0.037506103515625,
-0.0767822265625,
0.12249755859375,
0.00264739990234375,
-0.0261993408203125,
-0.01036834716796875,
-0.001800537109375,
0.035614013671875,
-0.012359619140625,
0.0306549072265625,
0.028350830078125,
0.057952880859375,
-0.007564544677734375,
-0.059722900390625,
0.03582763671875,
-0.041107177734375,
0.0022373199462890625,
-0.005558013916015625,
-0.07550048828125,
0.08795166015625,
-0.027587890625,
-0.0256500244140625,
0.013153076171875,
0.06201171875,
0.03045654296875,
0.024871826171875,
0.0228118896484375,
0.04681396484375,
0.064697265625,
-0.0256500244140625,
0.056182861328125,
-0.02252197265625,
0.045745849609375,
0.0745849609375,
0.01244354248046875,
0.060394287109375,
0.0293731689453125,
-0.042083740234375,
0.044586181640625,
0.057586669921875,
-0.0295257568359375,
0.04486083984375,
0.00666046142578125,
0.003391265869140625,
0.01136016845703125,
0.0029392242431640625,
-0.04913330078125,
0.01364898681640625,
0.01123046875,
-0.020172119140625,
0.0051422119140625,
-0.01898193359375,
0.0300140380859375,
-0.01824951171875,
-0.0154266357421875,
0.050018310546875,
0.0086517333984375,
-0.052276611328125,
0.054718017578125,
0.025970458984375,
0.07818603515625,
-0.04931640625,
0.01287078857421875,
-0.00925445556640625,
0.004070281982421875,
-0.021392822265625,
-0.05340576171875,
0.00989532470703125,
-0.004329681396484375,
-0.008819580078125,
0.00550079345703125,
0.039215087890625,
-0.03973388671875,
-0.0474853515625,
0.03497314453125,
0.031585693359375,
0.01531982421875,
0.01036834716796875,
-0.07611083984375,
-0.0281219482421875,
0.0229339599609375,
-0.056549072265625,
0.02215576171875,
0.0408935546875,
0.004119873046875,
0.02349853515625,
0.06402587890625,
0.0133209228515625,
0.00835418701171875,
0.002696990966796875,
0.04931640625,
-0.06640625,
-0.037811279296875,
-0.07989501953125,
0.040130615234375,
-0.0161590576171875,
-0.047454833984375,
0.064453125,
0.042205810546875,
0.06890869140625,
0.0030727386474609375,
0.0640869140625,
-0.0283660888671875,
0.023223876953125,
-0.03741455078125,
0.0675048828125,
-0.043212890625,
0.01275634765625,
-0.02410888671875,
-0.057403564453125,
-0.01422882080078125,
0.05352783203125,
-0.021392822265625,
0.028594970703125,
0.06243896484375,
0.06878662109375,
0.01556396484375,
-0.0211334228515625,
0.0006666183471679688,
0.03509521484375,
0.0220947265625,
0.07696533203125,
0.0228729248046875,
-0.052032470703125,
0.05181884765625,
-0.038726806640625,
-0.0135040283203125,
-0.007419586181640625,
-0.05133056640625,
-0.065673828125,
-0.04986572265625,
-0.0293731689453125,
-0.039581298828125,
-0.015838623046875,
0.062255859375,
0.047088623046875,
-0.0657958984375,
-0.0132904052734375,
-0.0145721435546875,
0.01387786865234375,
-0.04791259765625,
-0.023468017578125,
0.07086181640625,
-0.01080322265625,
-0.05389404296875,
-0.0027179718017578125,
0.004451751708984375,
0.004352569580078125,
-0.00942230224609375,
-0.02880859375,
-0.04010009765625,
0.01136016845703125,
0.0303497314453125,
0.016387939453125,
-0.06195068359375,
-0.00873565673828125,
0.0085296630859375,
-0.034698486328125,
0.002124786376953125,
0.020843505859375,
-0.036529541015625,
0.00534820556640625,
0.05340576171875,
0.0183868408203125,
0.034515380859375,
0.00360870361328125,
0.01245880126953125,
-0.03656005859375,
0.01392364501953125,
-0.004352569580078125,
0.03204345703125,
0.02142333984375,
-0.034393310546875,
0.048431396484375,
0.0341796875,
-0.03814697265625,
-0.0526123046875,
-0.0063629150390625,
-0.0924072265625,
-0.0296478271484375,
0.096435546875,
-0.0206756591796875,
-0.0180511474609375,
-0.002895355224609375,
-0.0209197998046875,
0.049713134765625,
-0.0180206298828125,
0.045989990234375,
0.05731201171875,
-0.0071258544921875,
-0.007350921630859375,
-0.039276123046875,
0.03729248046875,
0.02978515625,
-0.05328369140625,
-0.004756927490234375,
-0.00023663043975830078,
0.0174713134765625,
0.0211334228515625,
0.05633544921875,
-0.00884246826171875,
0.021026611328125,
0.012908935546875,
0.01313018798828125,
-0.0252227783203125,
0.01153564453125,
-0.00995635986328125,
0.0005636215209960938,
-0.01340484619140625,
-0.025238037109375
]
] |
Hello-SimpleAI/chatgpt-qa-detector-roberta-chinese | 2023-01-19T11:02:06.000Z | [
"transformers",
"pytorch",
"bert",
"text-classification",
"chatgpt",
"zh",
"dataset:Hello-SimpleAI/HC3-Chinese",
"arxiv:2301.07597",
"endpoints_compatible",
"region:us",
"has_space"
] | text-classification | Hello-SimpleAI | null | null | Hello-SimpleAI/chatgpt-qa-detector-roberta-chinese | 3 | 31,493 | transformers | 2023-01-19T10:46:32 | ---
datasets:
- Hello-SimpleAI/HC3-Chinese
language:
- zh
pipeline_tag: text-classification
tags:
- chatgpt
---
# Model Card for `Hello-SimpleAI/chatgpt-qa-detector-roberta-chinese`
This model is trained on `question-answer` pairs from **the filtered full-text** of [Hello-SimpleAI/HC3-Chinese](https://huggingface.co/datasets/Hello-SimpleAI/HC3-Chinese).
For more details, refer to [arxiv: 2301.07597](https://arxiv.org/abs/2301.07597) and the GitHub project [Hello-SimpleAI/chatgpt-comparison-detection](https://github.com/Hello-SimpleAI/chatgpt-comparison-detection).
The base checkpoint is [hfl/chinese-roberta-wwm-ext](https://huggingface.co/hfl/chinese-roberta-wwm-ext).
We train it on all [Hello-SimpleAI/HC3-Chinese](https://huggingface.co/datasets/Hello-SimpleAI/HC3-Chinese) data (without a held-out set) for 2 epochs.
(Training for 2 epochs is consistent with the experiments in [our paper](https://arxiv.org/abs/2301.07597).)
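As a quick illustration, the detector can be loaded through the standard `transformers` `text-classification` pipeline. Note that the exact way the question and answer should be joined into one input is an assumption here — the card does not document the input format:

```python
from transformers import pipeline

# Load the detector via the text-classification pipeline.
detector = pipeline(
    "text-classification",
    model="Hello-SimpleAI/chatgpt-qa-detector-roberta-chinese",
)

# The model was trained on question-answer pairs; joining them with a
# space is an assumption -- the card does not specify the input format.
question = "西湖的景色怎么样?"
answer = "西湖四季景色各有特色,春天苏堤桃红柳绿,夏天曲院风荷十分有名。"
result = detector(question + " " + answer)
print(result)  # a list of {'label': ..., 'score': ...} dicts
```

The returned label indicates whether the text looks human-written or ChatGPT-generated, with `score` as the classifier's confidence.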
## Citation
Check out this paper [arxiv: 2301.07597](https://arxiv.org/abs/2301.07597)
```
@article{guo-etal-2023-hc3,
title = "How Close is ChatGPT to Human Experts? Comparison Corpus, Evaluation, and Detection",
author = "Guo, Biyang and
Zhang, Xin and
Wang, Ziyuan and
Jiang, Minqi and
Nie, Jinran and
Ding, Yuxuan and
Yue, Jianwei and
Wu, Yupeng",
  journal = {arXiv preprint arXiv:2301.07597},
  year = "2023",
}
```
| 1,402 | [
[
-0.031829833984375,
-0.054107666015625,
0.0341796875,
-0.0027332305908203125,
-0.0242767333984375,
-0.0231475830078125,
-0.005374908447265625,
-0.0297698974609375,
-0.006317138671875,
0.026458740234375,
-0.03924560546875,
-0.032806396484375,
-0.0384521484375,
-0.006511688232421875,
-0.005390167236328125,
0.0836181640625,
0.0200347900390625,
0.0245208740234375,
0.0017423629760742188,
-0.022125244140625,
-0.0164947509765625,
-0.0284423828125,
-0.0833740234375,
-0.029998779296875,
0.0218658447265625,
0.019134521484375,
0.044952392578125,
0.033447265625,
0.028900146484375,
0.01224517822265625,
-0.034637451171875,
0.00392913818359375,
-0.0282440185546875,
-0.0318603515625,
0.0232391357421875,
-0.0322265625,
-0.053955078125,
-0.0034465789794921875,
0.04705810546875,
0.0110931396484375,
-0.00255584716796875,
0.01476287841796875,
0.0119171142578125,
0.03338623046875,
-0.02752685546875,
0.026397705078125,
-0.042449951171875,
-0.0128631591796875,
-0.0241546630859375,
-0.005527496337890625,
-0.029815673828125,
-0.01580810546875,
0.00972747802734375,
-0.03875732421875,
0.0298309326171875,
0.004627227783203125,
0.10382080078125,
0.00858306884765625,
-0.05029296875,
-0.0038623809814453125,
-0.034149169921875,
0.046905517578125,
-0.0704345703125,
0.021209716796875,
0.02545166015625,
0.0214080810546875,
-0.01538848876953125,
-0.04400634765625,
-0.065185546875,
-0.00811767578125,
-0.005420684814453125,
0.01751708984375,
-0.00811767578125,
0.006275177001953125,
0.03802490234375,
0.03057861328125,
-0.07696533203125,
0.007568359375,
-0.0310516357421875,
-0.03973388671875,
0.037109375,
0.007205963134765625,
0.01715087890625,
-0.039215087890625,
-0.040283203125,
-0.0239410400390625,
-0.047149658203125,
0.019683837890625,
0.0243988037109375,
0.0145111083984375,
-0.0318603515625,
0.0087738037109375,
-0.0335693359375,
0.047607421875,
-0.00994873046875,
-0.00904083251953125,
0.03826904296875,
-0.036041259765625,
-0.01348876953125,
-0.02520751953125,
0.08209228515625,
0.0033550262451171875,
0.00862884521484375,
0.0135345458984375,
0.0029125213623046875,
-0.01067352294921875,
-0.0276031494140625,
-0.075927734375,
-0.034637451171875,
0.02252197265625,
-0.0238037109375,
-0.019683837890625,
0.01146697998046875,
-0.043426513671875,
-0.004314422607421875,
-0.004795074462890625,
0.040771484375,
-0.03192138671875,
-0.0279541015625,
0.00270843505859375,
-0.00890350341796875,
0.06866455078125,
0.01023101806640625,
-0.042694091796875,
-0.00067901611328125,
0.046142578125,
0.05120849609375,
0.00641632080078125,
-0.022430419921875,
-0.04217529296875,
-0.01433563232421875,
-0.0084991455078125,
0.06951904296875,
-0.004673004150390625,
-0.0146026611328125,
-0.01508331298828125,
0.0191802978515625,
-0.01959228515625,
-0.03314208984375,
0.0738525390625,
-0.04779052734375,
0.051849365234375,
-0.0298309326171875,
-0.03204345703125,
-0.0208892822265625,
0.030975341796875,
-0.038482666015625,
0.0904541015625,
-0.0007429122924804688,
-0.0711669921875,
0.028900146484375,
-0.039794921875,
-0.004596710205078125,
0.00986480712890625,
-0.006473541259765625,
-0.055938720703125,
-0.04638671875,
0.028167724609375,
0.0157470703125,
-0.0234527587890625,
0.0146636962890625,
-0.029083251953125,
-0.045379638671875,
0.01654052734375,
-0.02886962890625,
0.089111328125,
0.0218658447265625,
-0.017486572265625,
0.0114288330078125,
-0.05609130859375,
0.0196533203125,
0.0181427001953125,
-0.0269012451171875,
-0.0097808837890625,
-0.01288604736328125,
0.017059326171875,
0.0183563232421875,
0.022125244140625,
-0.05157470703125,
0.01033782958984375,
-0.01763916015625,
0.03143310546875,
0.046783447265625,
0.01171875,
0.0020923614501953125,
-0.04150390625,
0.0211029052734375,
0.01537322998046875,
0.01763916015625,
-0.0199127197265625,
-0.0709228515625,
-0.062347412109375,
-0.011810302734375,
0.034881591796875,
0.061920166015625,
-0.062347412109375,
0.0440673828125,
-0.03314208984375,
-0.04095458984375,
-0.034637451171875,
-0.0117645263671875,
0.037445068359375,
0.04266357421875,
0.025390625,
-0.031646728515625,
-0.0245208740234375,
-0.06585693359375,
-0.0098724365234375,
-0.04461669921875,
-0.0182342529296875,
0.01386260986328125,
0.040191650390625,
-0.004405975341796875,
0.06048583984375,
-0.03558349609375,
-0.023956298828125,
-0.00377655029296875,
0.0230255126953125,
0.0259552001953125,
0.044708251953125,
0.044586181640625,
-0.06378173828125,
-0.03594970703125,
-0.0167694091796875,
-0.052642822265625,
0.0027942657470703125,
-0.0028400421142578125,
-0.031280517578125,
0.009002685546875,
0.0111236572265625,
-0.035614013671875,
0.0261993408203125,
0.0251007080078125,
-0.032135009765625,
0.03863525390625,
0.0016107559204101562,
0.0295257568359375,
-0.0946044921875,
0.0183868408203125,
0.004299163818359375,
-0.0209197998046875,
-0.054840087890625,
0.025634765625,
0.004047393798828125,
-0.004970550537109375,
-0.0457763671875,
0.05230712890625,
-0.0250244140625,
0.0126953125,
-0.028961181640625,
0.0167083740234375,
-0.00392913818359375,
0.0723876953125,
0.01404571533203125,
0.05572509765625,
0.0295257568359375,
-0.03692626953125,
0.01049041748046875,
0.043060302734375,
-0.0135955810546875,
0.035614013671875,
-0.06597900390625,
0.030609130859375,
0.006603240966796875,
0.0218658447265625,
-0.0826416015625,
-0.013702392578125,
0.045654296875,
-0.061065673828125,
0.0056915283203125,
-0.0509033203125,
-0.052947998046875,
-0.02227783203125,
-0.0227813720703125,
0.047088623046875,
0.06573486328125,
-0.0338134765625,
0.0226287841796875,
0.01165008544921875,
0.01233673095703125,
-0.027099609375,
-0.054473876953125,
-0.01444244384765625,
-0.00341796875,
-0.068115234375,
0.00920867919921875,
-0.0175323486328125,
0.0212860107421875,
-0.00017523765563964844,
0.01129913330078125,
-0.02313232421875,
-0.00490570068359375,
0.006481170654296875,
0.0323486328125,
-0.031890869140625,
-0.0025730133056640625,
-0.011016845703125,
-0.02105712890625,
0.0048065185546875,
-0.0095062255859375,
0.049896240234375,
-0.0193023681640625,
-0.04150390625,
-0.0472412109375,
-0.006439208984375,
0.0220184326171875,
-0.01433563232421875,
0.055694580078125,
0.0821533203125,
-0.03167724609375,
0.0211639404296875,
-0.035858154296875,
-0.0145721435546875,
-0.033111572265625,
0.0277252197265625,
-0.0272979736328125,
-0.056640625,
0.05206298828125,
0.034088134765625,
0.0022449493408203125,
0.051544189453125,
0.037872314453125,
0.012786865234375,
0.078369140625,
0.01025390625,
-0.0163726806640625,
0.035064697265625,
-0.03204345703125,
0.0037021636962890625,
-0.0789794921875,
0.00238800048828125,
-0.04376220703125,
-0.00283050537109375,
-0.057861328125,
-0.0265960693359375,
0.029998779296875,
0.006587982177734375,
-0.046295166015625,
0.022857666015625,
-0.03369140625,
0.0240631103515625,
0.048309326171875,
0.0305633544921875,
0.026824951171875,
-0.00901031494140625,
0.00220489501953125,
0.002414703369140625,
-0.049957275390625,
-0.033294677734375,
0.0819091796875,
0.033233642578125,
0.017730712890625,
0.0078582763671875,
0.04620361328125,
0.00244140625,
0.0058746337890625,
-0.041900634765625,
0.045745849609375,
-0.008087158203125,
-0.05657958984375,
-0.0145111083984375,
-0.04107666015625,
-0.06878662109375,
0.0311126708984375,
-0.00975799560546875,
-0.048858642578125,
0.0080413818359375,
0.0079803466796875,
-0.0305633544921875,
0.03204345703125,
-0.03936767578125,
0.07977294921875,
0.00937652587890625,
-0.01497650146484375,
0.00037479400634765625,
-0.037933349609375,
0.04547119140625,
0.0227508544921875,
0.006504058837890625,
-0.0196533203125,
0.0235443115234375,
0.07171630859375,
-0.01090240478515625,
0.050872802734375,
-0.03173828125,
0.01180267333984375,
0.0455322265625,
-0.0161895751953125,
0.0423583984375,
0.01001739501953125,
0.0045013427734375,
0.018218994140625,
0.0311126708984375,
-0.07122802734375,
-0.0144805908203125,
0.029022216796875,
-0.062744140625,
-0.031036376953125,
-0.052154541015625,
-0.020477294921875,
-0.0008139610290527344,
0.027099609375,
0.039886474609375,
0.040740966796875,
-0.01212310791015625,
0.0048828125,
0.06201171875,
-0.0012998580932617188,
0.01421356201171875,
0.044586181640625,
-0.013916015625,
-0.02032470703125,
0.0570068359375,
-0.0166168212890625,
0.0186309814453125,
0.02105712890625,
0.011199951171875,
-0.0018520355224609375,
-0.051788330078125,
-0.05413818359375,
0.0258331298828125,
-0.046478271484375,
-0.0023593902587890625,
-0.03912353515625,
-0.051300048828125,
-0.031494140625,
0.0396728515625,
-0.01544952392578125,
-0.01308441162109375,
-0.016265869140625,
0.00601959228515625,
0.0219268798828125,
0.00962066650390625,
0.01461029052734375,
0.032196044921875,
-0.07275390625,
0.019012451171875,
0.030670166015625,
-0.0014390945434570312,
0.00765228271484375,
-0.07562255859375,
-0.0236053466796875,
0.03363037109375,
-0.0262298583984375,
-0.05712890625,
0.0215911865234375,
0.0137481689453125,
0.046905517578125,
0.032196044921875,
0.0009732246398925781,
0.044677734375,
0.0013608932495117188,
0.07305908203125,
-0.017852783203125,
-0.04974365234375,
0.0258636474609375,
-0.043731689453125,
0.04058837890625,
0.043365478515625,
0.0161590576171875,
-0.052337646484375,
-0.02532958984375,
-0.04180908203125,
-0.052947998046875,
0.052642822265625,
0.026092529296875,
0.0040283203125,
0.0099334716796875,
0.03338623046875,
-0.01398468017578125,
-0.01076507568359375,
-0.0758056640625,
-0.0181121826171875,
0.01264190673828125,
-0.01004791259765625,
0.02203369140625,
-0.0263214111328125,
-0.0087890625,
-0.00992584228515625,
0.06768798828125,
-0.0247039794921875,
0.042449951171875,
0.0224761962890625,
-0.007049560546875,
0.0028247833251953125,
0.0212554931640625,
0.036956787109375,
0.0311431884765625,
-0.02239990234375,
-0.016510009765625,
0.020904541015625,
-0.045196533203125,
-0.01003265380859375,
0.0091094970703125,
-0.0352783203125,
-0.01131439208984375,
0.046966552734375,
0.0706787109375,
0.00965118408203125,
-0.036834716796875,
0.05682373046875,
-0.022064208984375,
-0.0190277099609375,
-0.0465087890625,
0.012603759765625,
-0.0008106231689453125,
0.01218414306640625,
0.027923583984375,
0.0123748779296875,
0.00765228271484375,
-0.0284423828125,
0.019744873046875,
0.0218505859375,
-0.04119873046875,
-0.031494140625,
0.05621337890625,
0.00353240966796875,
0.0015783309936523438,
0.06585693359375,
-0.0572509765625,
-0.056396484375,
0.0538330078125,
0.00661468505859375,
0.06634521484375,
0.00820159912109375,
-0.0005159378051757812,
0.059051513671875,
0.0118408203125,
-0.006473541259765625,
0.0350341796875,
0.0033092498779296875,
-0.06890869140625,
-0.0174560546875,
-0.03192138671875,
-0.0227203369140625,
0.01297760009765625,
-0.06768798828125,
0.035614013671875,
-0.01136016845703125,
0.0006322860717773438,
-0.016571044921875,
0.0224761962890625,
-0.055450439453125,
0.00299072265625,
0.01305389404296875,
0.06634521484375,
-0.07562255859375,
0.08074951171875,
0.0400390625,
-0.03619384765625,
-0.059112548828125,
0.00980377197265625,
0.01329803466796875,
-0.06951904296875,
0.0275421142578125,
0.0074920654296875,
0.01371002197265625,
-0.007266998291015625,
-0.043060302734375,
-0.0538330078125,
0.103759765625,
-0.00026798248291015625,
-0.02239990234375,
-0.0030059814453125,
-0.00908660888671875,
0.05059814453125,
-0.00400543212890625,
0.039398193359375,
0.033721923828125,
0.0291748046875,
0.0309600830078125,
-0.07586669921875,
-0.0005292892456054688,
-0.041656494140625,
-0.01739501953125,
0.011810302734375,
-0.0789794921875,
0.070556640625,
0.002681732177734375,
-0.0237579345703125,
0.034881591796875,
0.047515869140625,
0.042510986328125,
0.0360107421875,
0.04296875,
0.05633544921875,
0.03729248046875,
-0.0296783447265625,
0.062286376953125,
-0.02313232421875,
0.040252685546875,
0.07489013671875,
0.007373809814453125,
0.0679931640625,
0.00934600830078125,
-0.0259246826171875,
0.04296875,
0.0740966796875,
-0.0215911865234375,
0.03363037109375,
-0.009796142578125,
-0.00945281982421875,
-0.0263214111328125,
0.00910186767578125,
-0.037811279296875,
0.020538330078125,
0.001560211181640625,
0.0032215118408203125,
-0.0013828277587890625,
-0.0168609619140625,
0.0277099609375,
-0.016815185546875,
-0.0003218650817871094,
0.055694580078125,
-0.004940032958984375,
-0.04779052734375,
0.05303955078125,
0.0034332275390625,
0.0634765625,
-0.04644775390625,
-0.0022525787353515625,
-0.0191497802734375,
0.004718780517578125,
-0.01251983642578125,
-0.047607421875,
0.006103515625,
-0.01207733154296875,
0.006023406982421875,
0.0260772705078125,
0.043304443359375,
-0.04254150390625,
-0.0212249755859375,
0.01519775390625,
0.037261962890625,
0.025421142578125,
-0.00452423095703125,
-0.06396484375,
-0.023193359375,
0.0142059326171875,
-0.019622802734375,
0.0311126708984375,
0.024871826171875,
-0.0014476776123046875,
0.045135498046875,
0.05804443359375,
0.005443572998046875,
0.0149688720703125,
-0.0008702278137207031,
0.05438232421875,
-0.041748046875,
-0.04815673828125,
-0.048431396484375,
0.0361328125,
-0.008209228515625,
-0.05474853515625,
0.040313720703125,
0.054473876953125,
0.06622314453125,
0.01000213623046875,
0.046356201171875,
-0.0121917724609375,
0.043548583984375,
-0.050079345703125,
0.052581787109375,
-0.039703369140625,
0.0245819091796875,
-0.00605010986328125,
-0.05511474609375,
-0.002346038818359375,
0.0274658203125,
-0.018096923828125,
0.0167236328125,
0.047088623046875,
0.06988525390625,
-0.00414276123046875,
0.00444793701171875,
0.0142364501953125,
0.010162353515625,
0.03997802734375,
0.0718994140625,
0.039794921875,
-0.07232666015625,
0.0538330078125,
-0.025909423828125,
-0.0242462158203125,
-0.00469970703125,
-0.048980712890625,
-0.07275390625,
-0.0504150390625,
-0.036102294921875,
-0.03131103515625,
-0.007251739501953125,
0.051849365234375,
0.056243896484375,
-0.07763671875,
-0.01049041748046875,
0.01010894775390625,
0.0020275115966796875,
-0.018218994140625,
-0.0185089111328125,
0.044586181640625,
-0.0015916824340820312,
-0.061920166015625,
-0.00295257568359375,
-0.0147857666015625,
0.015106201171875,
-0.01297760009765625,
-0.01971435546875,
-0.01849365234375,
-0.012664794921875,
0.0103912353515625,
0.034912109375,
-0.050262451171875,
-0.037994384765625,
0.0022983551025390625,
-0.0247955322265625,
0.0185699462890625,
0.030914306640625,
-0.03729248046875,
0.02142333984375,
0.0609130859375,
0.021636962890625,
0.052337646484375,
0.0029315948486328125,
0.0142974853515625,
-0.039398193359375,
0.02996826171875,
0.0088653564453125,
0.0287933349609375,
0.0219268798828125,
-0.01776123046875,
0.0465087890625,
0.033233642578125,
-0.06585693359375,
-0.053680419921875,
0.0119171142578125,
-0.0927734375,
0.0035266876220703125,
0.0855712890625,
-0.02813720703125,
-0.0265960693359375,
-0.00406646728515625,
-0.0166168212890625,
0.02667236328125,
-0.02972412109375,
0.056488037109375,
0.05242919921875,
-0.0027179718017578125,
-0.0232391357421875,
-0.039520263671875,
0.0382080078125,
0.02215576171875,
-0.0472412109375,
-0.0203399658203125,
0.006511688232421875,
0.034027099609375,
0.009857177734375,
0.060302734375,
-0.0150604248046875,
0.039306640625,
-0.003326416015625,
0.022125244140625,
-0.0144500732421875,
0.01355743408203125,
-0.0374755859375,
-0.0177459716796875,
0.0228729248046875,
-0.0308837890625
]
] |
blanchefort/rubert-base-cased-sentiment-rusentiment | 2023-04-06T04:06:16.000Z | [
"transformers",
"pytorch",
"tf",
"jax",
"safetensors",
"bert",
"text-classification",
"sentiment",
"ru",
"dataset:RuSentiment",
"endpoints_compatible",
"has_space",
"region:us"
] | text-classification | blanchefort | null | null | blanchefort/rubert-base-cased-sentiment-rusentiment | 4 | 31,486 | transformers | 2022-03-02T23:29:05 | ---
language:
- ru
tags:
- sentiment
- text-classification
datasets:
- RuSentiment
---
# RuBERT for Sentiment Analysis
This is a [DeepPavlov/rubert-base-cased-conversational](https://huggingface.co/DeepPavlov/rubert-base-cased-conversational) model trained on [RuSentiment](http://text-machine.cs.uml.edu/projects/rusentiment/).
## Labels
- 0: NEUTRAL
- 1: POSITIVE
- 2: NEGATIVE
## How to use
```python
import torch
from transformers import AutoModelForSequenceClassification
from transformers import BertTokenizerFast
tokenizer = BertTokenizerFast.from_pretrained('blanchefort/rubert-base-cased-sentiment-rusentiment')
model = AutoModelForSequenceClassification.from_pretrained('blanchefort/rubert-base-cased-sentiment-rusentiment', return_dict=True)
@torch.no_grad()
def predict(text):
inputs = tokenizer(text, max_length=512, padding=True, truncation=True, return_tensors='pt')
outputs = model(**inputs)
predicted = torch.nn.functional.softmax(outputs.logits, dim=1)
predicted = torch.argmax(predicted, dim=1).numpy()
return predicted
```
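The `predict` helper above returns raw class indices. A minimal decoding step might look like the following sketch — the `LABELS` mapping simply mirrors the Labels section above and the `decode` helper is illustrative, not part of the model's API:

```python
# Map the model's class indices back to human-readable labels.
# This mapping mirrors the Labels section of this model card.
LABELS = {0: "NEUTRAL", 1: "POSITIVE", 2: "NEGATIVE"}

def decode(indices):
    """Convert an iterable of predicted class indices to label strings."""
    return [LABELS[int(i)] for i in indices]

print(decode([1, 0, 2]))  # ['POSITIVE', 'NEUTRAL', 'NEGATIVE']
```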
## Dataset used for model training
**[RuSentiment](http://text-machine.cs.uml.edu/projects/rusentiment/)**
> A. Rogers, A. Romanov, A. Rumshisky, S. Volkova, M. Gronas, and A. Gribov. RuSentiment: An Enriched Sentiment Analysis Dataset for Social Media in Russian. Proceedings of COLING 2018. | 1,362 | [
[
-0.018585205078125,
-0.049163818359375,
0.00371551513671875,
0.036865234375,
-0.03594970703125,
0.005863189697265625,
-0.0274658203125,
-0.00006222724914550781,
0.025360107421875,
0.01103973388671875,
-0.0345458984375,
-0.052581787109375,
-0.044952392578125,
-0.01300048828125,
-0.018707275390625,
0.12109375,
0.013519287109375,
0.042694091796875,
-0.00848388671875,
-0.007511138916015625,
-0.00946807861328125,
-0.041473388671875,
-0.035614013671875,
-0.050018310546875,
0.032318115234375,
0.03424072265625,
0.031402587890625,
-0.014404296875,
0.0369873046875,
0.023956298828125,
-0.0018892288208007812,
-0.022613525390625,
-0.0340576171875,
0.0157928466796875,
-0.002269744873046875,
-0.0281219482421875,
-0.0609130859375,
0.007190704345703125,
0.0173187255859375,
0.029510498046875,
-0.0132598876953125,
0.02337646484375,
-0.018585205078125,
0.03521728515625,
-0.02459716796875,
0.007598876953125,
-0.042999267578125,
0.01074981689453125,
-0.0014219284057617188,
-0.017608642578125,
-0.032623291015625,
-0.02996826171875,
0.0210113525390625,
-0.0250244140625,
0.0313720703125,
-0.00839996337890625,
0.089111328125,
0.00638580322265625,
-0.01507568359375,
-0.024871826171875,
-0.03533935546875,
0.07940673828125,
-0.050933837890625,
0.0115966796875,
0.0146942138671875,
0.0070648193359375,
0.00817108154296875,
-0.055023193359375,
-0.036163330078125,
-0.005397796630859375,
0.0065460205078125,
0.03204345703125,
-0.01540374755859375,
0.003368377685546875,
0.004085540771484375,
0.041473388671875,
-0.040374755859375,
-0.0160675048828125,
-0.026519775390625,
-0.0167083740234375,
0.037078857421875,
0.0135955810546875,
-0.006549835205078125,
-0.0311279296875,
-0.03753662109375,
-0.01751708984375,
-0.002590179443359375,
-0.0010061264038085938,
0.03472900390625,
0.00885009765625,
-0.0187530517578125,
0.052398681640625,
-0.0235595703125,
0.033843994140625,
0.0180206298828125,
-0.007305145263671875,
0.055938720703125,
0.006549835205078125,
-0.0279693603515625,
-0.0142974853515625,
0.0791015625,
0.01824951171875,
0.02972412109375,
0.0098876953125,
-0.0249176025390625,
0.024200439453125,
0.001934051513671875,
-0.071533203125,
-0.06121826171875,
0.0345458984375,
-0.026611328125,
-0.045928955078125,
0.0082855224609375,
-0.07830810546875,
-0.0394287109375,
-0.00991058349609375,
0.05023193359375,
-0.029022216796875,
-0.02532958984375,
0.038055419921875,
-0.0157928466796875,
0.025360107421875,
0.01323699951171875,
-0.036224365234375,
0.0143890380859375,
0.03631591796875,
0.043914794921875,
0.0133819580078125,
0.0001442432403564453,
-0.0302276611328125,
-0.042816162109375,
0.00292205810546875,
0.0523681640625,
-0.0223236083984375,
-0.027801513671875,
0.009124755859375,
-0.0106201171875,
0.005229949951171875,
-0.01361083984375,
0.0657958984375,
-0.034027099609375,
0.05517578125,
0.008636474609375,
-0.03936767578125,
-0.0088653564453125,
0.0379638671875,
-0.0295257568359375,
0.06951904296875,
0.00762939453125,
-0.07012939453125,
0.0207977294921875,
-0.05755615234375,
-0.042022705078125,
-0.0031585693359375,
0.00372314453125,
-0.07586669921875,
0.003215789794921875,
0.004741668701171875,
0.0621337890625,
-0.00891876220703125,
0.031951904296875,
-0.03741455078125,
-0.002307891845703125,
0.06036376953125,
-0.022186279296875,
0.0697021484375,
0.0200042724609375,
-0.0308837890625,
0.0152740478515625,
-0.057891845703125,
-0.0008554458618164062,
0.00009959936141967773,
-0.0232086181640625,
0.004779815673828125,
-0.0111846923828125,
0.033233642578125,
0.0149688720703125,
0.0224609375,
-0.0577392578125,
0.0200347900390625,
-0.03057861328125,
0.0016574859619140625,
0.0654296875,
-0.0010194778442382812,
0.03021240234375,
0.00414276123046875,
0.04766845703125,
0.019775390625,
0.01397705078125,
0.016143798828125,
-0.01727294921875,
-0.09393310546875,
-0.02667236328125,
0.037567138671875,
0.04364013671875,
-0.06494140625,
0.038787841796875,
-0.0272216796875,
-0.039031982421875,
-0.029022216796875,
-0.0007328987121582031,
0.0189666748046875,
0.0272979736328125,
0.02972412109375,
-0.01959228515625,
-0.05352783203125,
-0.05657958984375,
-0.0257720947265625,
0.0004467964172363281,
0.011077880859375,
0.0032291412353515625,
0.047607421875,
-0.0333251953125,
0.061553955078125,
-0.04876708984375,
-0.02825927734375,
-0.0255889892578125,
0.040252685546875,
0.04754638671875,
0.041961669921875,
0.022735595703125,
-0.05706787109375,
-0.0643310546875,
-0.003322601318359375,
-0.039520263671875,
-0.003597259521484375,
-0.002887725830078125,
-0.01195526123046875,
0.015655517578125,
-0.0046539306640625,
-0.056060791015625,
0.0254974365234375,
0.0297698974609375,
-0.05303955078125,
0.048370361328125,
0.01361083984375,
0.0002313852310180664,
-0.0927734375,
0.00745391845703125,
0.02947998046875,
-0.01351165771484375,
-0.0645751953125,
-0.031951904296875,
-0.00519561767578125,
0.0004930496215820312,
-0.022918701171875,
0.0496826171875,
-0.0260162353515625,
0.01959228515625,
-0.0108642578125,
-0.0007266998291015625,
0.006458282470703125,
0.03350830078125,
-0.0012607574462890625,
0.0308685302734375,
0.0545654296875,
-0.0224761962890625,
0.037261962890625,
0.02374267578125,
-0.0188140869140625,
0.044403076171875,
-0.05126953125,
0.0079345703125,
0.0005407333374023438,
0.006122589111328125,
-0.07135009765625,
-0.01568603515625,
0.043731689453125,
-0.08087158203125,
0.03173828125,
-0.0093994140625,
-0.0294036865234375,
-0.01520538330078125,
-0.042633056640625,
0.0215301513671875,
0.042266845703125,
-0.040863037109375,
0.04638671875,
0.0195465087890625,
-0.00371551513671875,
-0.07379150390625,
-0.047332763671875,
-0.00388336181640625,
-0.0293426513671875,
-0.044708251953125,
-0.00554656982421875,
-0.00861358642578125,
-0.0175018310546875,
-0.014129638671875,
0.00360870361328125,
-0.0107269287109375,
-0.00872039794921875,
0.007747650146484375,
0.02972412109375,
-0.004154205322265625,
0.0157012939453125,
-0.012603759765625,
-0.013824462890625,
0.017120361328125,
0.0025806427001953125,
0.06402587890625,
-0.036468505859375,
0.00725555419921875,
-0.02325439453125,
-0.00777435302734375,
0.03466796875,
-0.0022640228271484375,
0.0491943359375,
0.08575439453125,
-0.01116943359375,
-0.017486572265625,
-0.04669189453125,
-0.0066986083984375,
-0.0364990234375,
0.0295257568359375,
-0.0123291015625,
-0.04473876953125,
0.04791259765625,
0.0164794921875,
-0.0230712890625,
0.035614013671875,
0.052734375,
-0.00307464599609375,
0.0675048828125,
0.043243408203125,
-0.02630615234375,
0.046905517578125,
-0.047515869140625,
0.02764892578125,
-0.05023193359375,
-0.0121307373046875,
-0.0257720947265625,
-0.0054473876953125,
-0.040985107421875,
-0.021148681640625,
0.0072021484375,
-0.002292633056640625,
-0.00714874267578125,
0.018341064453125,
-0.0372314453125,
0.015838623046875,
0.024658203125,
0.01415252685546875,
-0.0002149343490600586,
0.0151824951171875,
0.011505126953125,
-0.0038909912109375,
-0.04290771484375,
-0.031402587890625,
0.07672119140625,
0.043853759765625,
0.08306884765625,
-0.01401519775390625,
0.05169677734375,
0.02301025390625,
0.04632568359375,
-0.08087158203125,
0.04833984375,
-0.0142669677734375,
-0.060821533203125,
0.005695343017578125,
-0.033477783203125,
-0.06573486328125,
0.0230255126953125,
-0.0216217041015625,
-0.035003662109375,
0.0135040283203125,
0.007717132568359375,
-0.0076751708984375,
0.004604339599609375,
-0.04400634765625,
0.06597900390625,
-0.0047149658203125,
-0.0282135009765625,
-0.00925445556640625,
-0.039215087890625,
0.0188446044921875,
0.02978515625,
-0.0010356903076171875,
-0.017791748046875,
0.0193939208984375,
0.072998046875,
-0.001789093017578125,
0.0791015625,
-0.0301513671875,
-0.0022182464599609375,
0.024505615234375,
0.0032863616943359375,
0.0152587890625,
0.01151275634765625,
-0.0118865966796875,
0.004856109619140625,
-0.0171966552734375,
-0.0384521484375,
-0.01200103759765625,
0.044158935546875,
-0.06591796875,
-0.022705078125,
-0.058868408203125,
-0.020233154296875,
-0.00618743896484375,
0.0045166015625,
0.031036376953125,
0.052581787109375,
-0.0284423828125,
0.027801513671875,
0.047332763671875,
-0.012420654296875,
0.0284881591796875,
0.0179443359375,
-0.0147857666015625,
-0.059539794921875,
0.08502197265625,
-0.018768310546875,
-0.0013494491577148438,
0.0145263671875,
0.03668212890625,
-0.019805908203125,
-0.00383758544921875,
-0.0069732666015625,
0.0208282470703125,
-0.07244873046875,
-0.0445556640625,
-0.044952392578125,
-0.012664794921875,
-0.043914794921875,
-0.0007328987121582031,
-0.012298583984375,
-0.037322998046875,
-0.04901123046875,
-0.01605224609375,
0.036407470703125,
0.02923583984375,
-0.01256561279296875,
0.039886474609375,
-0.0321044921875,
0.0223236083984375,
0.0063629150390625,
0.01800537109375,
-0.026214599609375,
-0.048004150390625,
-0.023712158203125,
-0.0017852783203125,
-0.034698486328125,
-0.06646728515625,
0.0640869140625,
0.01352691650390625,
0.0256805419921875,
0.028594970703125,
0.00959014892578125,
0.0278472900390625,
-0.0005331039428710938,
0.053741455078125,
0.036407470703125,
-0.0802001953125,
0.033050537109375,
-0.03753662109375,
0.007495880126953125,
0.0408935546875,
0.0267791748046875,
-0.043304443359375,
-0.017669677734375,
-0.06573486328125,
-0.063720703125,
0.0438232421875,
0.012603759765625,
0.01861572265625,
-0.001312255859375,
0.01049041748046875,
0.01104736328125,
0.0270233154296875,
-0.1112060546875,
-0.0279388427734375,
-0.0391845703125,
-0.04949951171875,
-0.0135498046875,
-0.0274658203125,
-0.0178375244140625,
-0.038482666015625,
0.07769775390625,
-0.006618499755859375,
0.040679931640625,
0.0222015380859375,
-0.0190582275390625,
-0.006526947021484375,
0.02777099609375,
0.042510986328125,
0.0246734619140625,
-0.0218353271484375,
0.011077880859375,
0.015289306640625,
-0.042877197265625,
-0.005161285400390625,
0.006023406982421875,
0.0074310302734375,
0.00933074951171875,
0.0077362060546875,
0.05804443359375,
-0.023529052734375,
-0.017333984375,
0.05859375,
-0.0152740478515625,
-0.0220794677734375,
-0.05474853515625,
-0.016937255859375,
-0.00914764404296875,
0.0166473388671875,
0.020233154296875,
0.01499176025390625,
0.01849365234375,
-0.019317626953125,
0.00494384765625,
0.04522705078125,
-0.05328369140625,
-0.041473388671875,
0.0157470703125,
0.0279388427734375,
-0.017364501953125,
0.0261993408203125,
-0.0197296142578125,
-0.07281494140625,
0.037017822265625,
0.0182037353515625,
0.09149169921875,
0.0016107559204101562,
0.0279388427734375,
0.029266357421875,
0.01025390625,
0.01149749755859375,
0.032440185546875,
0.00835418701171875,
-0.0657958984375,
-0.027374267578125,
-0.055572509765625,
-0.02252197265625,
0.015045166015625,
-0.05810546875,
0.0290679931640625,
-0.0301361083984375,
-0.025360107421875,
-0.0148468017578125,
0.0193939208984375,
-0.05023193359375,
0.03753662109375,
-0.00308990478515625,
0.05023193359375,
-0.08001708984375,
0.065673828125,
0.06280517578125,
-0.0206451416015625,
-0.035614013671875,
-0.005218505859375,
-0.02484130859375,
-0.043212890625,
0.050811767578125,
0.0272216796875,
-0.01059722900390625,
-0.0031585693359375,
-0.0645751953125,
-0.0306396484375,
0.06829833984375,
-0.01023101806640625,
0.0017948150634765625,
0.047607421875,
-0.0018091201782226562,
0.0552978515625,
-0.018707275390625,
0.0208282470703125,
0.034027099609375,
0.02496337890625,
-0.007678985595703125,
-0.02911376953125,
-0.01275634765625,
-0.042999267578125,
-0.027191162109375,
0.0131072998046875,
-0.045013427734375,
0.08367919921875,
0.004856109619140625,
0.0006475448608398438,
0.01446533203125,
0.05718994140625,
0.0004992485046386719,
0.0226898193359375,
0.02484130859375,
0.046966552734375,
0.0289764404296875,
-0.0225677490234375,
0.056976318359375,
-0.01204681396484375,
0.08209228515625,
0.0631103515625,
0.00876617431640625,
0.0501708984375,
0.03997802734375,
-0.00739288330078125,
0.0614013671875,
0.03521728515625,
-0.0191802978515625,
0.046966552734375,
0.013824462890625,
-0.033660888671875,
-0.0134735107421875,
0.01806640625,
-0.0279693603515625,
0.0234222412109375,
0.0279541015625,
-0.0140533447265625,
0.00726318359375,
-0.00194549560546875,
0.0153961181640625,
-0.0110015869140625,
-0.010223388671875,
0.0450439453125,
0.0031948089599609375,
-0.03070068359375,
0.027557373046875,
0.00019478797912597656,
0.07940673828125,
-0.022705078125,
0.00986480712890625,
-0.01611328125,
0.046539306640625,
-0.0219879150390625,
-0.055419921875,
-0.0178375244140625,
-0.00592041015625,
-0.0012416839599609375,
-0.01251983642578125,
0.08331298828125,
-0.035430908203125,
-0.0584716796875,
0.018768310546875,
0.0302276611328125,
0.00313568115234375,
0.01064300537109375,
-0.0693359375,
-0.008880615234375,
0.005435943603515625,
-0.054840087890625,
0.00684356689453125,
0.037872314453125,
0.040313720703125,
0.04901123046875,
0.033111572265625,
-0.00481414794921875,
-0.0015277862548828125,
0.0185699462890625,
0.0784912109375,
-0.0513916015625,
-0.0248565673828125,
-0.07244873046875,
0.06427001953125,
-0.0232391357421875,
-0.045074462890625,
0.06793212890625,
0.037139892578125,
0.041900634765625,
-0.03094482421875,
0.04937744140625,
-0.0168304443359375,
0.031463623046875,
-0.0221099853515625,
0.0477294921875,
-0.0296630859375,
-0.0020275115966796875,
-0.034881591796875,
-0.05859375,
-0.031280517578125,
0.0751953125,
-0.03607177734375,
0.00630950927734375,
0.046966552734375,
0.063720703125,
-0.013397216796875,
-0.0135650634765625,
0.0152587890625,
0.0251007080078125,
0.02191162109375,
0.01177215576171875,
0.0751953125,
-0.047088623046875,
0.03759765625,
-0.038177490234375,
-0.0162353515625,
-0.034820556640625,
-0.06146240234375,
-0.08905029296875,
-0.0303192138671875,
-0.039031982421875,
-0.06103515625,
-0.0219268798828125,
0.0648193359375,
0.039398193359375,
-0.06707763671875,
-0.031982421875,
-0.00116729736328125,
0.0011777877807617188,
0.0225067138671875,
-0.029296875,
0.005634307861328125,
-0.03485107421875,
-0.04266357421875,
-0.0062713623046875,
-0.00850677490234375,
0.00110626220703125,
-0.0180816650390625,
-0.00894927978515625,
-0.0188446044921875,
-0.0010519027709960938,
0.044525146484375,
0.009033203125,
-0.043121337890625,
-0.0210418701171875,
0.01076507568359375,
-0.0191650390625,
0.0228729248046875,
0.042327880859375,
-0.043853759765625,
0.0219268798828125,
0.03778076171875,
0.013397216796875,
0.0477294921875,
-0.0003807544708251953,
0.022247314453125,
-0.07208251953125,
0.0127716064453125,
0.03448486328125,
0.03900146484375,
0.04510498046875,
-0.022705078125,
0.030792236328125,
0.024658203125,
-0.0496826171875,
-0.055023193359375,
0.00307464599609375,
-0.0916748046875,
-0.01360321044921875,
0.10638427734375,
0.0008611679077148438,
-0.016387939453125,
0.00785064697265625,
-0.03472900390625,
0.028228759765625,
-0.044403076171875,
0.061279296875,
0.044525146484375,
-0.0124359130859375,
0.0024356842041015625,
-0.0106658935546875,
0.029022216796875,
0.0535888671875,
-0.0543212890625,
-0.004955291748046875,
0.0300445556640625,
0.035003662109375,
0.0261077880859375,
0.045654296875,
0.0006923675537109375,
0.019317626953125,
0.0005450248718261719,
0.04510498046875,
0.0255889892578125,
-0.0160675048828125,
-0.04754638671875,
0.0008368492126464844,
-0.0023288726806640625,
-0.0263824462890625
]
] |
timm/crossvit_9_240.in1k | 2023-04-24T00:30:01.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"arxiv:2103.14899",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/crossvit_9_240.in1k | 0 | 31,473 | timm | 2023-04-24T00:29:50 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
datasets:
- imagenet-1k
---
# Model card for crossvit_9_240.in1k
A CrossViT image classification model. Trained on ImageNet-1k by paper authors.
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 8.6
- GMACs: 1.8
- Activations (M): 9.5
- Image size: 240 x 240
- **Papers:**
- CrossViT: Cross-Attention Multi-Scale Vision Transformer for Image Classification: https://arxiv.org/abs/2103.14899
- **Dataset:** ImageNet-1k
- **Original:** https://github.com/IBM/CrossViT
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch  # needed for torch.topk below
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('crossvit_9_240.in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
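The final line above converts raw logits to top-5 class probabilities. What `softmax(dim=1)` and `torch.topk` compute can be sketched in plain Python — a minimal illustration only; the helper names here are hypothetical, and in practice you would use the tensor operations shown above:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of logits."""
    m = max(xs)  # subtract the max so exp() cannot overflow
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def topk(xs, k):
    """Indices of the k largest values, highest first."""
    return sorted(range(len(xs)), key=lambda i: xs[i], reverse=True)[:k]

# Dummy logits standing in for one row of the model's output.
logits = [0.1, 2.0, 0.5, 3.0]
probs = softmax(logits)
print(topk(probs, 2))  # [3, 1]
```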
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'crossvit_9_240.in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (torch.Size([1, 401, 128]), torch.Size([1, 197, 256])) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@inproceedings{
chen2021crossvit,
title={{CrossViT: Cross-Attention Multi-Scale Vision Transformer for Image Classification}},
author={Chun-Fu (Richard) Chen and Quanfu Fan and Rameswar Panda},
booktitle={International Conference on Computer Vision (ICCV)},
year={2021}
}
```
| 2,802 | [
[
-0.0361328125,
-0.028045654296875,
-0.00473785400390625,
0.0193023681640625,
-0.025970458984375,
-0.024444580078125,
-0.010955810546875,
-0.028289794921875,
0.01538848876953125,
0.023040771484375,
-0.043426513671875,
-0.042572021484375,
-0.05450439453125,
-0.010955810546875,
-0.013275146484375,
0.07196044921875,
-0.0080718994140625,
0.01197052001953125,
-0.018951416015625,
-0.03814697265625,
-0.0140838623046875,
-0.0233001708984375,
-0.05743408203125,
-0.02783203125,
0.0283660888671875,
0.01529693603515625,
0.036468505859375,
0.03857421875,
0.05126953125,
0.032867431640625,
0.00012862682342529297,
-0.0010690689086914062,
-0.023651123046875,
-0.0290679931640625,
0.018218994140625,
-0.052978515625,
-0.033447265625,
0.021881103515625,
0.05615234375,
0.030120849609375,
0.00536346435546875,
0.03179931640625,
0.0181732177734375,
0.0302734375,
-0.020751953125,
0.011505126953125,
-0.030487060546875,
0.0237274169921875,
-0.01509857177734375,
-0.0008363723754882812,
-0.024932861328125,
-0.02490234375,
0.01812744140625,
-0.037628173828125,
0.035125732421875,
0.007793426513671875,
0.09033203125,
0.0231475830078125,
0.0017604827880859375,
0.010406494140625,
-0.0193023681640625,
0.0657958984375,
-0.054931640625,
0.038421630859375,
0.01079559326171875,
0.0140380859375,
0.00664520263671875,
-0.07171630859375,
-0.039154052734375,
-0.0071258544921875,
-0.010650634765625,
0.0005311965942382812,
-0.0217742919921875,
-0.0020160675048828125,
0.0302734375,
0.025146484375,
-0.04437255859375,
-0.0007877349853515625,
-0.03985595703125,
-0.01090240478515625,
0.03887939453125,
0.01290130615234375,
0.0206756591796875,
-0.00954437255859375,
-0.0516357421875,
-0.031982421875,
-0.0243377685546875,
0.02996826171875,
0.0160064697265625,
0.01030731201171875,
-0.046142578125,
0.033447265625,
0.00681304931640625,
0.037139892578125,
0.0193939208984375,
-0.0218353271484375,
0.047760009765625,
-0.0090484619140625,
-0.03973388671875,
-0.0065155029296875,
0.07794189453125,
0.030548095703125,
0.0191192626953125,
0.0021152496337890625,
0.00021266937255859375,
-0.02093505859375,
0.0030345916748046875,
-0.08258056640625,
-0.0128173828125,
0.0115203857421875,
-0.0465087890625,
-0.027252197265625,
0.0182647705078125,
-0.04913330078125,
0.0014677047729492188,
-0.004955291748046875,
0.045440673828125,
-0.032928466796875,
-0.01641845703125,
0.00733184814453125,
-0.01142120361328125,
0.03497314453125,
0.0148468017578125,
-0.03704833984375,
0.0128021240234375,
0.01336669921875,
0.0791015625,
-0.0070343017578125,
-0.04180908203125,
-0.017059326171875,
-0.0208282470703125,
-0.0143585205078125,
0.0299530029296875,
-0.0154266357421875,
-0.01265716552734375,
-0.0217742919921875,
0.033172607421875,
-0.01727294921875,
-0.041229248046875,
0.025146484375,
-0.01010894775390625,
0.0255126953125,
0.0015583038330078125,
-0.01323699951171875,
-0.033721923828125,
0.0305633544921875,
-0.0295257568359375,
0.08905029296875,
0.0252838134765625,
-0.074462890625,
0.0306549072265625,
-0.037628173828125,
-0.0107879638671875,
-0.010528564453125,
0.00113677978515625,
-0.0875244140625,
-0.01094818115234375,
0.0162200927734375,
0.054168701171875,
-0.0288543701171875,
0.0014848709106445312,
-0.046875,
-0.00923919677734375,
0.0220184326171875,
-0.01445770263671875,
0.0780029296875,
-0.007350921630859375,
-0.034637451171875,
0.0169525146484375,
-0.048248291015625,
0.01479339599609375,
0.033050537109375,
-0.017486572265625,
-0.00940704345703125,
-0.058746337890625,
0.020355224609375,
0.024139404296875,
0.007568359375,
-0.05389404296875,
0.021026611328125,
-0.005558013916015625,
0.02630615234375,
0.037139892578125,
-0.0097503662109375,
0.019378662109375,
-0.0260162353515625,
0.031158447265625,
0.02203369140625,
0.0345458984375,
0.002643585205078125,
-0.052703857421875,
-0.06707763671875,
-0.0318603515625,
0.024810791015625,
0.034576416015625,
-0.0396728515625,
0.03253173828125,
-0.014068603515625,
-0.06390380859375,
-0.037750244140625,
0.001438140869140625,
0.04217529296875,
0.03448486328125,
0.03363037109375,
-0.03887939453125,
-0.0399169921875,
-0.0703125,
-0.0072784423828125,
0.00545501708984375,
0.006999969482421875,
0.0265960693359375,
0.051055908203125,
-0.0157623291015625,
0.044647216796875,
-0.0295257568359375,
-0.0228424072265625,
-0.015655517578125,
0.01026153564453125,
0.0152435302734375,
0.053802490234375,
0.06207275390625,
-0.051727294921875,
-0.045501708984375,
-0.0124969482421875,
-0.07098388671875,
0.01435089111328125,
-0.01129150390625,
-0.01381683349609375,
0.01392364501953125,
0.0117950439453125,
-0.045440673828125,
0.06439208984375,
0.0125274658203125,
-0.0224151611328125,
0.0285186767578125,
-0.01090240478515625,
0.01611328125,
-0.07861328125,
0.0029621124267578125,
0.034759521484375,
-0.01430511474609375,
-0.03662109375,
-0.007518768310546875,
0.00746917724609375,
0.004337310791015625,
-0.044189453125,
0.045135498046875,
-0.04071044921875,
-0.01085662841796875,
-0.0077667236328125,
-0.0224609375,
0.00289154052734375,
0.061370849609375,
0.01023101806640625,
0.0222625732421875,
0.0594482421875,
-0.03814697265625,
0.044525146484375,
0.036529541015625,
-0.0136871337890625,
0.046600341796875,
-0.050384521484375,
0.00395965576171875,
0.007965087890625,
0.0159149169921875,
-0.08099365234375,
-0.00482177734375,
0.02667236328125,
-0.050384521484375,
0.046142578125,
-0.032562255859375,
-0.022308349609375,
-0.035736083984375,
-0.030548095703125,
0.0487060546875,
0.0438232421875,
-0.055999755859375,
0.039306640625,
0.0170745849609375,
0.0112762451171875,
-0.044769287109375,
-0.0709228515625,
-0.018035888671875,
-0.026763916015625,
-0.0574951171875,
0.024932861328125,
0.007598876953125,
0.01387786865234375,
0.01018524169921875,
0.0036067962646484375,
0.0007243156433105469,
-0.0242767333984375,
0.037017822265625,
0.030364990234375,
-0.02520751953125,
-0.00794219970703125,
-0.024505615234375,
-0.005977630615234375,
0.01189422607421875,
-0.0183258056640625,
0.037322998046875,
-0.021881103515625,
-0.02093505859375,
-0.0535888671875,
-0.003246307373046875,
0.043060302734375,
-0.01116180419921875,
0.0531005859375,
0.08642578125,
-0.034423828125,
-0.005069732666015625,
-0.03314208984375,
-0.01522064208984375,
-0.03399658203125,
0.0372314453125,
-0.032684326171875,
-0.0256805419921875,
0.0584716796875,
0.00707244873046875,
0.00420379638671875,
0.052764892578125,
0.034698486328125,
0.00955963134765625,
0.058746337890625,
0.054351806640625,
0.01166534423828125,
0.053802490234375,
-0.06890869140625,
-0.01148223876953125,
-0.071533203125,
-0.047882080078125,
-0.02423095703125,
-0.0419921875,
-0.044647216796875,
-0.034637451171875,
0.028656005859375,
-0.005870819091796875,
-0.0211334228515625,
0.038238525390625,
-0.062744140625,
0.0022487640380859375,
0.05474853515625,
0.031982421875,
-0.018524169921875,
0.01715087890625,
-0.0198822021484375,
-0.007213592529296875,
-0.0645751953125,
-0.0034999847412109375,
0.08148193359375,
0.039276123046875,
0.07177734375,
-0.02130126953125,
0.044342041015625,
-0.0197296142578125,
0.0247039794921875,
-0.041900634765625,
0.039215087890625,
-0.0028400421142578125,
-0.03533935546875,
-0.0184783935546875,
-0.0247344970703125,
-0.0771484375,
0.0168304443359375,
-0.018218994140625,
-0.0498046875,
0.0272674560546875,
-0.00021386146545410156,
-0.0183258056640625,
0.06304931640625,
-0.051361083984375,
0.07891845703125,
-0.021514892578125,
-0.036163330078125,
0.005413055419921875,
-0.05145263671875,
0.0258941650390625,
0.01015472412109375,
-0.0222320556640625,
0.0056610107421875,
0.0112152099609375,
0.08489990234375,
-0.036895751953125,
0.072998046875,
-0.0253448486328125,
0.0277099609375,
0.03082275390625,
-0.0170745849609375,
0.0164031982421875,
-0.002468109130859375,
0.011566162109375,
0.033843994140625,
0.002582550048828125,
-0.03302001953125,
-0.038421630859375,
0.04522705078125,
-0.07373046875,
-0.029693603515625,
-0.03912353515625,
-0.045928955078125,
0.00931549072265625,
0.003406524658203125,
0.050018310546875,
0.045928955078125,
0.004520416259765625,
0.030731201171875,
0.048675537109375,
-0.028106689453125,
0.03314208984375,
-0.00562286376953125,
-0.0163726806640625,
-0.038299560546875,
0.061553955078125,
0.0186309814453125,
0.018341064453125,
0.01153564453125,
0.007671356201171875,
-0.0260772705078125,
-0.0301513671875,
-0.0310211181640625,
0.03021240234375,
-0.04901123046875,
-0.030242919921875,
-0.044586181640625,
-0.04229736328125,
-0.037200927734375,
-0.00377655029296875,
-0.0361328125,
-0.0264739990234375,
-0.0226898193359375,
0.017852783203125,
0.0430908203125,
0.0511474609375,
-0.007030487060546875,
0.035400390625,
-0.051727294921875,
0.009979248046875,
0.01187896728515625,
0.03997802734375,
-0.004425048828125,
-0.0758056640625,
-0.0269012451171875,
-0.0089874267578125,
-0.036956787109375,
-0.051910400390625,
0.04278564453125,
0.00420379638671875,
0.04888916015625,
0.0293731689453125,
-0.0169677734375,
0.058746337890625,
-0.0010356903076171875,
0.035400390625,
0.0290374755859375,
-0.049224853515625,
0.039398193359375,
0.00527191162109375,
0.019775390625,
-0.00099945068359375,
0.0379638671875,
-0.0263824462890625,
-0.007091522216796875,
-0.078125,
-0.046722412109375,
0.07122802734375,
0.0157318115234375,
-0.012237548828125,
0.039764404296875,
0.04345703125,
0.00640106201171875,
0.0018491744995117188,
-0.06658935546875,
-0.0306549072265625,
-0.03460693359375,
-0.019073486328125,
-0.00637054443359375,
0.0035495758056640625,
-0.00223541259765625,
-0.05584716796875,
0.0572509765625,
-0.004657745361328125,
0.050689697265625,
0.0276641845703125,
-0.00513458251953125,
-0.0124969482421875,
-0.0294036865234375,
0.0338134765625,
0.020751953125,
-0.016876220703125,
-0.00333404541015625,
0.0263519287109375,
-0.045989990234375,
0.004436492919921875,
0.01346588134765625,
0.000019073486328125,
0.0031909942626953125,
0.03759765625,
0.073486328125,
-0.00024771690368652344,
0.011199951171875,
0.036163330078125,
-0.0192718505859375,
-0.030364990234375,
-0.0151824951171875,
0.003025054931640625,
-0.00605010986328125,
0.03765869140625,
0.017486572265625,
0.025970458984375,
-0.005695343017578125,
-0.019012451171875,
0.01544189453125,
0.0479736328125,
-0.0367431640625,
-0.0284271240234375,
0.053741455078125,
-0.00861358642578125,
-0.01505279541015625,
0.058563232421875,
0.005405426025390625,
-0.045684814453125,
0.07720947265625,
0.0279998779296875,
0.0791015625,
-0.016998291015625,
-0.0038242340087890625,
0.057403564453125,
0.0234832763671875,
0.0066680908203125,
0.011138916015625,
0.0086822509765625,
-0.052734375,
0.00698089599609375,
-0.03955078125,
-0.008941650390625,
0.0265960693359375,
-0.042083740234375,
0.0250244140625,
-0.049407958984375,
-0.027191162109375,
0.0152435302734375,
0.0215911865234375,
-0.07037353515625,
0.0258941650390625,
0.00394439697265625,
0.0650634765625,
-0.0665283203125,
0.062225341796875,
0.0601806640625,
-0.048065185546875,
-0.0665283203125,
-0.0088043212890625,
-0.0061187744140625,
-0.0736083984375,
0.04986572265625,
0.03179931640625,
0.004772186279296875,
-0.0022525787353515625,
-0.0628662109375,
-0.0489501953125,
0.09808349609375,
0.022125244140625,
-0.0138092041015625,
0.0118865966796875,
0.014068603515625,
0.0169677734375,
-0.037933349609375,
0.037139892578125,
0.01058197021484375,
0.034393310546875,
0.01885986328125,
-0.05938720703125,
0.0173492431640625,
-0.024566650390625,
0.0067901611328125,
0.01364898681640625,
-0.0753173828125,
0.07586669921875,
-0.03521728515625,
0.003185272216796875,
0.0018482208251953125,
0.050933837890625,
0.0100555419921875,
0.014984130859375,
0.044403076171875,
0.0670166015625,
0.03045654296875,
-0.0217742919921875,
0.0621337890625,
-0.00010442733764648438,
0.0462646484375,
0.0418701171875,
0.0216064453125,
0.033203125,
0.032867431640625,
-0.0204315185546875,
0.0271759033203125,
0.0789794921875,
-0.035614013671875,
0.0247955322265625,
0.006298065185546875,
0.01119232177734375,
-0.023162841796875,
0.01129150390625,
-0.039764404296875,
0.03546142578125,
0.01287078857421875,
-0.0367431640625,
-0.0141448974609375,
0.0159149169921875,
-0.017791748046875,
-0.03857421875,
-0.0186767578125,
0.03240966796875,
0.006587982177734375,
-0.034332275390625,
0.05633544921875,
-0.004718780517578125,
0.060394287109375,
-0.028411865234375,
-0.0027179718017578125,
-0.01493072509765625,
0.039154052734375,
-0.0250244140625,
-0.07025146484375,
0.0174102783203125,
-0.02227783203125,
-0.00460052490234375,
0.0015163421630859375,
0.048065185546875,
-0.036712646484375,
-0.04345703125,
0.0271759033203125,
0.0204315185546875,
0.0345458984375,
-0.0008645057678222656,
-0.093017578125,
0.00414276123046875,
0.010040283203125,
-0.04730224609375,
0.0255126953125,
0.0428466796875,
-0.002513885498046875,
0.0616455078125,
0.039154052734375,
-0.01385498046875,
0.025299072265625,
-0.01568603515625,
0.0633544921875,
-0.044189453125,
-0.0269317626953125,
-0.056640625,
0.043365478515625,
-0.004100799560546875,
-0.0498046875,
0.05230712890625,
0.05377197265625,
0.07421875,
-0.005977630615234375,
0.045684814453125,
-0.00991058349609375,
0.007061004638671875,
-0.0236968994140625,
0.04290771484375,
-0.060302734375,
-0.01177215576171875,
-0.01372528076171875,
-0.054595947265625,
-0.0298614501953125,
0.04833984375,
-0.020904541015625,
0.036163330078125,
0.044921875,
0.06646728515625,
-0.03546142578125,
-0.0186004638671875,
0.022979736328125,
0.01076507568359375,
0.003582000732421875,
0.039520263671875,
0.036346435546875,
-0.06463623046875,
0.04193115234375,
-0.04180908203125,
-0.01503753662109375,
-0.0211639404296875,
-0.034637451171875,
-0.08660888671875,
-0.0579833984375,
-0.047088623046875,
-0.044342041015625,
-0.0184478759765625,
0.054840087890625,
0.0809326171875,
-0.04986572265625,
-0.0133819580078125,
0.0161895751953125,
0.005496978759765625,
-0.01493072509765625,
-0.0157623291015625,
0.0389404296875,
-0.0133209228515625,
-0.062286376953125,
-0.034149169921875,
0.005725860595703125,
0.027587890625,
-0.018829345703125,
-0.0230560302734375,
-0.0264739990234375,
-0.0153656005859375,
0.01285552978515625,
0.023162841796875,
-0.052001953125,
-0.0132293701171875,
-0.00867462158203125,
-0.0162200927734375,
0.04052734375,
0.03131103515625,
-0.044525146484375,
0.0180816650390625,
0.037628173828125,
0.0325927734375,
0.07379150390625,
-0.02130126953125,
0.005786895751953125,
-0.05828857421875,
0.03228759765625,
-0.010498046875,
0.044830322265625,
0.0299835205078125,
-0.02862548828125,
0.042388916015625,
0.039886474609375,
-0.037933349609375,
-0.058258056640625,
-0.01119232177734375,
-0.085205078125,
-0.0030193328857421875,
0.063720703125,
-0.0299072265625,
-0.049041748046875,
0.0265960693359375,
-0.00733184814453125,
0.05230712890625,
-0.0038318634033203125,
0.0265960693359375,
0.0225677490234375,
-0.01312255859375,
-0.046722412109375,
-0.024932861328125,
0.029388427734375,
0.0149993896484375,
-0.054351806640625,
-0.0301055908203125,
-0.01288604736328125,
0.054962158203125,
0.0264434814453125,
0.022186279296875,
-0.01114654541015625,
0.0156402587890625,
0.01367950439453125,
0.047271728515625,
-0.0205230712890625,
-0.014373779296875,
-0.02447509765625,
0.0031890869140625,
-0.0206756591796875,
-0.053466796875
]
] |
huggyllama/llama-13b | 2023-04-07T15:50:53.000Z | [
"transformers",
"pytorch",
"safetensors",
"llama",
"text-generation",
"license:other",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | text-generation | huggyllama | null | null | huggyllama/llama-13b | 119 | 31,396 | transformers | 2023-04-03T23:37:51 | ---
license: other
---
This repository contains the weights for the LLaMA-13b model. This model is under a non-commercial license (see the LICENSE file).
You should only use this repository if you have been granted access to the model by filling out [this form](https://docs.google.com/forms/d/e/1FAIpQLSfqNECQnMkycAp2jP4Z9TFX0cGR4uf7b_fBxjY_OjhJILlKGA/viewform?usp=send_form) but either lost your copy of the weights or ran into trouble converting them to the Transformers format.
| 473 | [
[
-0.00525665283203125,
-0.01317596435546875,
0.031402587890625,
0.057281494140625,
-0.041778564453125,
-0.00431060791015625,
0.022308349609375,
-0.0228729248046875,
0.033050537109375,
0.0506591796875,
-0.058563232421875,
-0.0228271484375,
-0.06512451171875,
0.0165252685546875,
-0.039886474609375,
0.0709228515625,
-0.00638580322265625,
0.0214996337890625,
-0.0189208984375,
-0.0039825439453125,
-0.0079345703125,
-0.01494598388671875,
-0.010345458984375,
-0.049072265625,
0.04693603515625,
0.0272216796875,
0.040618896484375,
0.031097412109375,
0.0574951171875,
0.0171356201171875,
-0.0234222412109375,
-0.0286407470703125,
-0.039794921875,
-0.022857666015625,
0.002468109130859375,
-0.0423583984375,
-0.0628662109375,
0.0210723876953125,
0.045867919921875,
0.041717529296875,
-0.052001953125,
0.03399658203125,
-0.0165557861328125,
0.038055419921875,
-0.0290985107421875,
0.02471923828125,
-0.03070068359375,
-0.0034313201904296875,
-0.0207672119140625,
-0.011505126953125,
-0.029632568359375,
-0.0244903564453125,
-0.02227783203125,
-0.0631103515625,
-0.00502777099609375,
0.01221466064453125,
0.0882568359375,
0.03717041015625,
-0.042388916015625,
-0.0228424072265625,
-0.022705078125,
0.05596923828125,
-0.05438232421875,
0.01116180419921875,
0.034027099609375,
0.041473388671875,
-0.0127105712890625,
-0.060699462890625,
-0.01451873779296875,
-0.032196044921875,
0.003520965576171875,
0.0084381103515625,
-0.033477783203125,
0.0017871856689453125,
0.01512908935546875,
0.05126953125,
-0.0236663818359375,
0.006084442138671875,
-0.0762939453125,
-0.00855255126953125,
0.0721435546875,
0.01111602783203125,
0.033050537109375,
-0.0302886962890625,
-0.0667724609375,
-0.027496337890625,
-0.052886962890625,
-0.00701141357421875,
0.0550537109375,
0.01375579833984375,
-0.060821533203125,
0.0947265625,
0.0036773681640625,
0.029144287109375,
0.01629638671875,
-0.0156402587890625,
0.046539306640625,
0.0145263671875,
-0.031219482421875,
-0.00017786026000976562,
0.03192138671875,
0.03961181640625,
0.01322174072265625,
-0.013031005859375,
-0.0133209228515625,
-0.005863189697265625,
0.032928466796875,
-0.05987548828125,
-0.0017423629760742188,
-0.01023101806640625,
-0.058013916015625,
-0.0277862548828125,
0.004589080810546875,
-0.02398681640625,
-0.038665771484375,
-0.02294921875,
0.036895751953125,
0.002899169921875,
-0.0272979736328125,
0.038421630859375,
-0.01537322998046875,
0.028472900390625,
0.0127716064453125,
-0.04449462890625,
0.03924560546875,
0.0015506744384765625,
0.039154052734375,
0.016845703125,
-0.01172637939453125,
-0.001861572265625,
0.018585205078125,
-0.01294708251953125,
0.04815673828125,
-0.0124359130859375,
-0.06597900390625,
-0.0165863037109375,
0.04217529296875,
-0.004673004150390625,
-0.048126220703125,
0.03887939453125,
-0.059478759765625,
-0.00736236572265625,
-0.01503753662109375,
-0.03338623046875,
-0.036590576171875,
-0.00997161865234375,
-0.0728759765625,
0.07647705078125,
0.025177001953125,
-0.049530029296875,
0.0197296142578125,
-0.032928466796875,
-0.00803375244140625,
0.0182647705078125,
0.00418853759765625,
-0.03497314453125,
0.0264434814453125,
-0.01276397705078125,
0.023406982421875,
-0.025482177734375,
0.027099609375,
-0.031890869140625,
-0.0258941650390625,
0.00501251220703125,
0.0008993148803710938,
0.0909423828125,
0.0196685791015625,
0.001708984375,
0.005947113037109375,
-0.08453369140625,
-0.04058837890625,
0.037353515625,
-0.053497314453125,
-0.003978729248046875,
-0.0249786376953125,
0.01486968994140625,
0.0018863677978515625,
0.049072265625,
-0.049224853515625,
0.03546142578125,
0.01168060302734375,
0.006587982177734375,
0.066650390625,
-0.0006284713745117188,
0.042724609375,
-0.0290985107421875,
0.06219482421875,
-0.007572174072265625,
0.0240936279296875,
0.03985595703125,
-0.04766845703125,
-0.0384521484375,
-0.0160675048828125,
-0.0015716552734375,
0.0251617431640625,
-0.0287322998046875,
0.028961181640625,
-0.005519866943359375,
-0.07476806640625,
-0.0273590087890625,
0.004169464111328125,
0.01068115234375,
0.03167724609375,
0.01824951171875,
-0.0119781494140625,
-0.042236328125,
-0.09033203125,
0.025054931640625,
-0.0136871337890625,
-0.0220947265625,
0.040985107421875,
0.07232666015625,
-0.0215911865234375,
0.0294952392578125,
-0.04315185546875,
-0.03070068359375,
-0.0236053466796875,
0.00080108642578125,
0.0159454345703125,
0.0240631103515625,
0.07269287109375,
-0.0362548828125,
0.0007276535034179688,
-0.0243072509765625,
-0.063720703125,
-0.040008544921875,
-0.005893707275390625,
-0.015167236328125,
-0.0089263916015625,
0.002777099609375,
-0.042694091796875,
0.037261962890625,
0.07568359375,
-0.0257568359375,
0.034942626953125,
-0.0223388671875,
-0.02069091796875,
-0.07672119140625,
0.0256500244140625,
0.01024627685546875,
-0.0089263916015625,
0.00827789306640625,
0.02996826171875,
0.00905609130859375,
-0.0003044605255126953,
-0.044097900390625,
0.03350830078125,
-0.020660400390625,
-0.0210723876953125,
-0.01065826416015625,
-0.00981903076171875,
0.002105712890625,
0.016021728515625,
-0.0177001953125,
0.0689697265625,
0.025146484375,
-0.035552978515625,
0.033050537109375,
0.0396728515625,
-0.032806396484375,
0.0224456787109375,
-0.060882568359375,
-0.005649566650390625,
-0.020355224609375,
0.030120849609375,
-0.031890869140625,
-0.0247955322265625,
0.0316162109375,
-0.0230865478515625,
0.00830841064453125,
-0.006328582763671875,
-0.0158843994140625,
-0.014495849609375,
-0.01983642578125,
0.0184783935546875,
0.0243072509765625,
-0.035858154296875,
0.0662841796875,
0.033782958984375,
0.01245880126953125,
-0.04608154296875,
-0.0902099609375,
-0.0212860107421875,
-0.024261474609375,
-0.034027099609375,
0.03826904296875,
-0.004802703857421875,
-0.018768310546875,
0.0204925537109375,
-0.01517486572265625,
-0.0159912109375,
0.00844573974609375,
0.033233642578125,
0.0267181396484375,
-0.001651763916015625,
-0.018707275390625,
0.00981903076171875,
-0.01117706298828125,
0.0103302001953125,
0.016265869140625,
0.043060302734375,
0.002765655517578125,
-0.0085906982421875,
-0.047882080078125,
-0.0033130645751953125,
0.030517578125,
0.00846099853515625,
0.0673828125,
0.01373291015625,
-0.037628173828125,
0.0015840530395507812,
-0.0303802490234375,
-0.0009250640869140625,
-0.03424072265625,
0.013397216796875,
-0.0126495361328125,
-0.0301513671875,
0.049224853515625,
0.009124755859375,
-0.01221466064453125,
0.060577392578125,
0.057952880859375,
0.004131317138671875,
0.03448486328125,
0.067626953125,
-0.006580352783203125,
0.025848388671875,
-0.0423583984375,
-0.00789642333984375,
-0.08233642578125,
-0.07208251953125,
-0.047943115234375,
-0.0523681640625,
-0.0170745849609375,
-0.0100555419921875,
0.00308990478515625,
0.0102996826171875,
-0.05712890625,
0.06219482421875,
-0.0261383056640625,
0.01276397705078125,
0.04949951171875,
0.027679443359375,
0.03857421875,
0.00417327880859375,
-0.00891876220703125,
-0.0009112358093261719,
-0.01416778564453125,
-0.040802001953125,
0.08392333984375,
0.00714874267578125,
0.095947265625,
0.017120361328125,
0.049285888671875,
0.0213623046875,
0.0224761962890625,
-0.053497314453125,
0.040679931640625,
0.00943756103515625,
-0.07147216796875,
0.00908660888671875,
-0.035369873046875,
-0.06719970703125,
0.0070037841796875,
-0.0013275146484375,
-0.054473876953125,
0.0215606689453125,
-0.01470947265625,
0.004703521728515625,
0.031036376953125,
-0.04974365234375,
0.040435791015625,
-0.00634002685546875,
0.02264404296875,
-0.03466796875,
-0.025177001953125,
0.046417236328125,
0.0001596212387084961,
0.0125579833984375,
-0.01519012451171875,
-0.01354217529296875,
0.06109619140625,
-0.040283203125,
0.0728759765625,
-0.0291595458984375,
-0.01983642578125,
0.034881591796875,
-0.000040411949157714844,
0.0295562744140625,
0.015838623046875,
-0.00896453857421875,
0.0171661376953125,
-0.0096282958984375,
-0.01316070556640625,
-0.00646209716796875,
0.037109375,
-0.09368896484375,
-0.0279998779296875,
-0.036895751953125,
-0.03997802734375,
0.03863525390625,
0.0092010498046875,
0.01047515869140625,
-0.005847930908203125,
0.0251617431640625,
0.047515869140625,
0.020172119140625,
-0.032379150390625,
0.0173797607421875,
0.040985107421875,
-0.0186614990234375,
-0.02264404296875,
0.035308837890625,
0.0067138671875,
0.01343536376953125,
0.01947021484375,
0.0270233154296875,
-0.0302886962890625,
-0.0209197998046875,
-0.031890869140625,
0.03118896484375,
-0.055877685546875,
-0.03680419921875,
-0.02557373046875,
-0.01953125,
-0.01093292236328125,
-0.0120697021484375,
-0.0146942138671875,
-0.058197021484375,
-0.037994384765625,
-0.02667236328125,
0.04974365234375,
0.07080078125,
-0.0259246826171875,
0.0731201171875,
-0.04974365234375,
0.0224761962890625,
0.01715087890625,
0.0189208984375,
0.01486968994140625,
-0.061737060546875,
-0.00324249267578125,
-0.0268707275390625,
-0.050018310546875,
-0.05645751953125,
0.0207672119140625,
-0.0057525634765625,
0.03594970703125,
0.0170135498046875,
-0.01180267333984375,
0.042022705078125,
-0.0218353271484375,
0.07257080078125,
0.021453857421875,
-0.05645751953125,
0.01142120361328125,
-0.01442718505859375,
-0.0086669921875,
0.0205535888671875,
0.037994384765625,
-0.0176849365234375,
0.01113128662109375,
-0.05145263671875,
-0.055755615234375,
0.05963134765625,
0.0156402587890625,
0.0104217529296875,
0.028350830078125,
0.01390838623046875,
0.0020656585693359375,
0.023712158203125,
-0.08770751953125,
-0.0188140869140625,
-0.0214080810546875,
-0.0158538818359375,
0.03814697265625,
-0.0213623046875,
-0.0234222412109375,
-0.00809478759765625,
0.06341552734375,
0.0215301513671875,
0.003963470458984375,
-0.0079345703125,
-0.0184478759765625,
-0.0298919677734375,
-0.005428314208984375,
0.034454345703125,
0.0338134765625,
-0.04241943359375,
-0.023681640625,
0.01470947265625,
-0.06768798828125,
0.01442718505859375,
0.0250396728515625,
-0.006374359130859375,
-0.032073974609375,
0.021270751953125,
0.031829833984375,
0.03399658203125,
-0.035369873046875,
0.043853759765625,
0.0010528564453125,
-0.0347900390625,
-0.0271453857421875,
-0.0247802734375,
0.0228729248046875,
0.037628173828125,
0.0295562744140625,
0.00041222572326660156,
0.0150604248046875,
-0.014617919921875,
-0.00582122802734375,
0.01493072509765625,
0.01157379150390625,
-0.033599853515625,
0.07568359375,
0.006717681884765625,
0.001087188720703125,
0.041473388671875,
-0.00047588348388671875,
0.01085662841796875,
0.060882568359375,
0.041839599609375,
0.052337646484375,
-0.01216888427734375,
0.0098876953125,
0.01910400390625,
0.032073974609375,
-0.00909423828125,
0.052886962890625,
0.01271820068359375,
-0.033782958984375,
-0.03863525390625,
-0.068603515625,
-0.03924560546875,
0.01059722900390625,
-0.061187744140625,
0.041290283203125,
-0.0462646484375,
-0.01271820068359375,
-0.0171051025390625,
0.0043487548828125,
-0.03338623046875,
0.028167724609375,
0.03155517578125,
0.08343505859375,
-0.04949951171875,
0.04693603515625,
0.0521240234375,
-0.07659912109375,
-0.09027099609375,
-0.05322265625,
0.0159759521484375,
-0.10394287109375,
0.05804443359375,
-0.00370025634765625,
-0.01654052734375,
-0.0107879638671875,
-0.0660400390625,
-0.0897216796875,
0.0977783203125,
0.04193115234375,
-0.035797119140625,
-0.0411376953125,
0.005542755126953125,
-0.004795074462890625,
-0.00022804737091064453,
0.00905609130859375,
0.00304412841796875,
0.0452880859375,
0.0245208740234375,
-0.06109619140625,
0.004756927490234375,
0.0006461143493652344,
-0.0022640228271484375,
-0.006320953369140625,
-0.063232421875,
0.07330322265625,
-0.006580352783203125,
-0.006168365478515625,
0.034332275390625,
0.0328369140625,
0.040008544921875,
-0.006771087646484375,
0.022186279296875,
0.069091796875,
0.054779052734375,
0.004222869873046875,
0.087646484375,
-0.01491546630859375,
0.05084228515625,
0.045654296875,
-0.026153564453125,
0.039764404296875,
0.03643798828125,
-0.031341552734375,
0.05780029296875,
0.041168212890625,
-0.03045654296875,
0.034881591796875,
0.036468505859375,
-0.00009876489639282227,
-0.00824737548828125,
-0.0192108154296875,
-0.047454833984375,
0.025848388671875,
0.01020050048828125,
-0.033599853515625,
-0.0330810546875,
-0.02520751953125,
0.002643585205078125,
-0.011016845703125,
-0.026702880859375,
0.025848388671875,
0.037384033203125,
0.02264404296875,
0.027618408203125,
0.010986328125,
0.04534912109375,
-0.064208984375,
0.0056915283203125,
-0.0029506683349609375,
0.01485443115234375,
-0.036651611328125,
-0.04461669921875,
0.034881591796875,
0.01425933837890625,
-0.01096343994140625,
-0.0155029296875,
0.035400390625,
0.02520751953125,
-0.0634765625,
0.040069580078125,
0.01361083984375,
0.01123809814453125,
0.01361083984375,
-0.050537109375,
0.02569580078125,
-0.019256591796875,
-0.03466796875,
0.0186920166015625,
0.0030670166015625,
0.0002390146255493164,
0.0716552734375,
0.034942626953125,
0.00556182861328125,
0.0189208984375,
0.0254364013671875,
0.07763671875,
-0.041229248046875,
-0.0111083984375,
-0.03460693359375,
0.047607421875,
0.01213836669921875,
-0.039825439453125,
0.041534423828125,
0.0521240234375,
0.0732421875,
-0.03839111328125,
0.0276031494140625,
-0.0279083251953125,
0.0025177001953125,
-0.036376953125,
0.07196044921875,
-0.0684814453125,
-0.00864410400390625,
-0.015594482421875,
-0.07086181640625,
-0.0360107421875,
0.06195068359375,
0.004840850830078125,
-0.0102996826171875,
0.03656005859375,
0.0443115234375,
0.001522064208984375,
0.00460052490234375,
0.003948211669921875,
0.00701904296875,
0.0196533203125,
0.03192138671875,
0.049407958984375,
-0.0484619140625,
0.054443359375,
-0.0164031982421875,
-0.0200347900390625,
-0.0011577606201171875,
-0.07366943359375,
-0.05535888671875,
-0.017333984375,
-0.004909515380859375,
-0.013031005859375,
-0.037261962890625,
0.07586669921875,
0.049224853515625,
-0.029815673828125,
-0.038665771484375,
0.004337310791015625,
-0.01384735107421875,
0.0015230178833007812,
-0.00966644287109375,
0.025360107421875,
0.01314544677734375,
-0.0574951171875,
0.029052734375,
-0.004730224609375,
0.0399169921875,
-0.01739501953125,
-0.0021762847900390625,
-0.01031494140625,
-0.005092620849609375,
0.019927978515625,
-0.001827239990234375,
-0.055206298828125,
0.00041222572326660156,
0.004405975341796875,
-0.0005044937133789062,
-0.0030307769775390625,
0.0295257568359375,
-0.047210693359375,
-0.00249481201171875,
0.0243682861328125,
0.04180908203125,
0.039306640625,
0.00197601318359375,
0.0269775390625,
-0.04632568359375,
0.04132080078125,
0.00897979736328125,
0.060455322265625,
0.0214691162109375,
-0.019256591796875,
0.046142578125,
0.0188751220703125,
-0.049285888671875,
-0.0462646484375,
-0.0010499954223632812,
-0.10693359375,
0.0019283294677734375,
0.06219482421875,
0.0020236968994140625,
-0.03363037109375,
0.05145263671875,
-0.03289794921875,
0.015045166015625,
-0.040313720703125,
0.051788330078125,
0.053192138671875,
0.01654052734375,
-0.0213470458984375,
-0.03564453125,
0.006198883056640625,
-0.0049591064453125,
-0.043914794921875,
-0.0222930908203125,
0.0288238525390625,
0.023406982421875,
0.0133056640625,
0.0149688720703125,
-0.047119140625,
-0.008941650390625,
-0.0011272430419921875,
0.031158447265625,
-0.00316619873046875,
-0.0174713134765625,
-0.019012451171875,
-0.0211334228515625,
0.0004134178161621094,
-0.0235137939453125
]
] |
timm/pit_b_224.in1k | 2023-04-26T00:06:43.000Z | [
"timm",
"pytorch",
"safetensors",
"image-classification",
"dataset:imagenet-1k",
"arxiv:2103.16302",
"license:apache-2.0",
"region:us"
] | image-classification | timm | null | null | timm/pit_b_224.in1k | 0 | 31,376 | timm | 2023-04-26T00:05:41 | ---
tags:
- image-classification
- timm
library_name: timm
license: apache-2.0
datasets:
- imagenet-1k
---
# Model card for pit_b_224.in1k
A PiT (Pooling-based Vision Transformer) image classification model. Trained on ImageNet-1k by paper authors.
## Model Details
- **Model Type:** Image classification / feature backbone
- **Model Stats:**
- Params (M): 73.8
- GMACs: 12.4
- Activations (M): 32.9
- Image size: 224 x 224
- **Papers:**
- Rethinking Spatial Dimensions of Vision Transformers: https://arxiv.org/abs/2103.16302
- **Dataset:** ImageNet-1k
- **Original:** https://github.com/naver-ai/pit
## Model Usage
### Image Classification
```python
from urllib.request import urlopen
from PIL import Image
import timm
import torch  # needed for torch.topk below
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model('pit_b_224.in1k', pretrained=True)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
top5_probabilities, top5_class_indices = torch.topk(output.softmax(dim=1) * 100, k=5)
```
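The `softmax` + `topk` step at the end of the snippet above can be illustrated in plain Python. This is a hedged sketch of the math only, not the timm or PyTorch API; the logit values are made up for illustration:

```python
import math

def softmax(logits):
    # numerically stable softmax: subtract the max before exponentiating
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def topk(values, k):
    # return (values, indices) of the k largest entries, mirroring torch.topk
    order = sorted(range(len(values)), key=lambda i: values[i], reverse=True)[:k]
    return [values[i] for i in order], order

# hypothetical logits for a 4-class toy example
logits = [2.0, 0.5, 1.0, 3.0]
probs = softmax(logits)
# scale to percentages, as in `output.softmax(dim=1) * 100`
top_probs, top_idx = topk([p * 100 for p in probs], k=2)
```

Here `top_idx` holds the class indices of the two most probable classes, ordered from most to least likely, just as `top5_class_indices` does for the 1000 ImageNet classes.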
### Feature Map Extraction
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'pit_b_224.in1k',
pretrained=True,
features_only=True,
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # unsqueeze single image into batch of 1
for o in output:
# print shape of each feature map in output
# e.g.:
# torch.Size([1, 256, 31, 31])
# torch.Size([1, 512, 16, 16])
# torch.Size([1, 1024, 8, 8])
print(o.shape)
```
### Image Embeddings
```python
from urllib.request import urlopen
from PIL import Image
import timm
img = Image.open(urlopen(
'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
model = timm.create_model(
'pit_b_224.in1k',
pretrained=True,
num_classes=0, # remove classifier nn.Linear
)
model = model.eval()
# get model specific transforms (normalization, resize)
data_config = timm.data.resolve_model_data_config(model)
transforms = timm.data.create_transform(**data_config, is_training=False)
output = model(transforms(img).unsqueeze(0)) # output is (batch_size, num_features) shaped tensor
# or equivalently (without needing to set num_classes=0)
output = model.forward_features(transforms(img).unsqueeze(0))
# output is unpooled, a (1, 1, 1024) shaped tensor
output = model.forward_head(output, pre_logits=True)
# output is a (1, num_features) shaped tensor
```
## Model Comparison
Explore the dataset and runtime metrics of this model in timm [model results](https://github.com/huggingface/pytorch-image-models/tree/main/results).
## Citation
```bibtex
@inproceedings{heo2021pit,
title={Rethinking Spatial Dimensions of Vision Transformers},
author={Byeongho Heo and Sangdoo Yun and Dongyoon Han and Sanghyuk Chun and Junsuk Choe and Seong Joon Oh},
booktitle = {International Conference on Computer Vision (ICCV)},
year={2021},
}
```
| 3,601 | [
[
-0.04241943359375,
-0.034698486328125,
0.00794219970703125,
0.013763427734375,
-0.0254669189453125,
-0.027740478515625,
-0.01068115234375,
-0.032135009765625,
0.01678466796875,
0.03375244140625,
-0.036529541015625,
-0.05242919921875,
-0.05499267578125,
-0.0117034912109375,
-0.018218994140625,
0.06732177734375,
-0.0091094970703125,
-0.00656890869140625,
-0.013824462890625,
-0.037139892578125,
-0.0224456787109375,
-0.0177764892578125,
-0.053985595703125,
-0.0286712646484375,
0.020965576171875,
0.01708984375,
0.040924072265625,
0.04876708984375,
0.054473876953125,
0.039825439453125,
-0.00601959228515625,
-0.00173187255859375,
-0.014007568359375,
-0.0247344970703125,
0.025146484375,
-0.049102783203125,
-0.0263214111328125,
0.02716064453125,
0.061920166015625,
0.034271240234375,
0.0003724098205566406,
0.020660400390625,
0.006557464599609375,
0.0404052734375,
-0.0176849365234375,
0.019500732421875,
-0.0328369140625,
0.0220184326171875,
-0.006359100341796875,
0.007678985595703125,
-0.0174560546875,
-0.03253173828125,
0.0216522216796875,
-0.039886474609375,
0.042694091796875,
0.00487518310546875,
0.09686279296875,
0.02978515625,
0.006107330322265625,
0.0030803680419921875,
-0.0235443115234375,
0.062042236328125,
-0.05694580078125,
0.01523590087890625,
0.0220184326171875,
0.0197296142578125,
0.0016679763793945312,
-0.0657958984375,
-0.045867919921875,
-0.008819580078125,
-0.0249786376953125,
0.0008058547973632812,
-0.0106048583984375,
-0.00739288330078125,
0.0311737060546875,
0.0225830078125,
-0.031829833984375,
0.0019817352294921875,
-0.047882080078125,
-0.005031585693359375,
0.0535888671875,
0.006961822509765625,
0.023345947265625,
-0.0181121826171875,
-0.044769287109375,
-0.03662109375,
-0.0202178955078125,
0.025146484375,
0.00864410400390625,
0.0135955810546875,
-0.054290771484375,
0.033477783203125,
0.01345062255859375,
0.03369140625,
0.00957489013671875,
-0.028961181640625,
0.04779052734375,
0.00856781005859375,
-0.032989501953125,
-0.00269317626953125,
0.07720947265625,
0.0260162353515625,
0.015899658203125,
0.0101318359375,
-0.0032138824462890625,
-0.020843505859375,
0.004901885986328125,
-0.076904296875,
-0.0224609375,
0.021820068359375,
-0.0400390625,
-0.02703857421875,
0.03167724609375,
-0.05438232421875,
-0.0122528076171875,
-0.0038604736328125,
0.044586181640625,
-0.042755126953125,
-0.021270751953125,
0.006305694580078125,
-0.01885986328125,
0.033599853515625,
0.012969970703125,
-0.04254150390625,
0.00608062744140625,
0.018096923828125,
0.0772705078125,
0.00301361083984375,
-0.049530029296875,
-0.007354736328125,
-0.01514434814453125,
-0.0204620361328125,
0.04217529296875,
-0.0082244873046875,
-0.01287841796875,
-0.019989013671875,
0.0250701904296875,
-0.0194091796875,
-0.060211181640625,
0.019073486328125,
-0.006351470947265625,
0.024993896484375,
-0.0006275177001953125,
-0.0130157470703125,
-0.036651611328125,
0.022705078125,
-0.0300445556640625,
0.09381103515625,
0.033477783203125,
-0.06396484375,
0.0296173095703125,
-0.043487548828125,
-0.0079498291015625,
-0.00934600830078125,
-0.006500244140625,
-0.08349609375,
0.0028533935546875,
0.0213623046875,
0.04766845703125,
-0.0116729736328125,
0.0055084228515625,
-0.0390625,
-0.02374267578125,
0.03192138671875,
0.00617218017578125,
0.07177734375,
0.004833221435546875,
-0.03619384765625,
0.022125244140625,
-0.042144775390625,
0.01055908203125,
0.037872314453125,
-0.0196990966796875,
-0.005340576171875,
-0.035919189453125,
0.0064849853515625,
0.0226593017578125,
0.0038909912109375,
-0.047821044921875,
0.0142974853515625,
-0.017547607421875,
0.0300750732421875,
0.0413818359375,
-0.01303863525390625,
0.0289306640625,
-0.0287933349609375,
0.020782470703125,
0.024017333984375,
0.02374267578125,
-0.007450103759765625,
-0.04541015625,
-0.0601806640625,
-0.04400634765625,
0.0191497802734375,
0.033416748046875,
-0.033294677734375,
0.050323486328125,
-0.0147552490234375,
-0.056976318359375,
-0.03790283203125,
-0.00272369384765625,
0.0276336669921875,
0.040374755859375,
0.024993896484375,
-0.040374755859375,
-0.036712646484375,
-0.0654296875,
0.0099945068359375,
-0.0012102127075195312,
0.0010461807250976562,
0.028106689453125,
0.0546875,
-0.0159912109375,
0.06097412109375,
-0.035369873046875,
-0.02630615234375,
-0.0208740234375,
0.0006580352783203125,
0.0294952392578125,
0.058380126953125,
0.06414794921875,
-0.046478271484375,
-0.04095458984375,
-0.0125579833984375,
-0.08160400390625,
0.0034885406494140625,
-0.0087127685546875,
-0.020904541015625,
0.0215301513671875,
0.01056671142578125,
-0.051300048828125,
0.0592041015625,
0.0134429931640625,
-0.0308685302734375,
0.036834716796875,
-0.0174713134765625,
0.01548004150390625,
-0.0732421875,
0.00968170166015625,
0.02764892578125,
-0.0021533966064453125,
-0.033111572265625,
0.01497650146484375,
0.0019779205322265625,
-0.01314544677734375,
-0.043670654296875,
0.054931640625,
-0.03289794921875,
-0.0199127197265625,
-0.0156402587890625,
-0.007549285888671875,
-0.0021457672119140625,
0.0604248046875,
0.0024089813232421875,
0.0213165283203125,
0.061920166015625,
-0.036285400390625,
0.036956787109375,
0.050689697265625,
-0.01812744140625,
0.034454345703125,
-0.04718017578125,
0.003787994384765625,
-0.005237579345703125,
0.0182037353515625,
-0.07470703125,
-0.01404571533203125,
0.02740478515625,
-0.043975830078125,
0.04473876953125,
-0.032073974609375,
-0.0279388427734375,
-0.044464111328125,
-0.03656005859375,
0.034515380859375,
0.055816650390625,
-0.05615234375,
0.0303802490234375,
0.021728515625,
0.01294708251953125,
-0.038177490234375,
-0.06646728515625,
-0.023223876953125,
-0.038299560546875,
-0.05596923828125,
0.0301361083984375,
0.0081634521484375,
0.00402069091796875,
0.01300811767578125,
0.0002796649932861328,
-0.00583648681640625,
-0.0171356201171875,
0.0413818359375,
0.030120849609375,
-0.0169677734375,
-0.0173492431640625,
-0.0222320556640625,
-0.01462554931640625,
0.0109100341796875,
-0.01087188720703125,
0.04205322265625,
-0.01468658447265625,
-0.0163726806640625,
-0.06512451171875,
0.00015938282012939453,
0.035736083984375,
-0.00545501708984375,
0.06353759765625,
0.0830078125,
-0.037506103515625,
0.0008044242858886719,
-0.02716064453125,
-0.0215606689453125,
-0.036163330078125,
0.03472900390625,
-0.0278778076171875,
-0.03399658203125,
0.06317138671875,
-0.0003733634948730469,
0.0044708251953125,
0.053009033203125,
0.0269775390625,
-0.013916015625,
0.05377197265625,
0.056732177734375,
0.006954193115234375,
0.052459716796875,
-0.0775146484375,
-0.0011844635009765625,
-0.06683349609375,
-0.044158935546875,
-0.0297393798828125,
-0.050048828125,
-0.04473876953125,
-0.0311431884765625,
0.0377197265625,
0.0169677734375,
-0.035797119140625,
0.0301055908203125,
-0.055755615234375,
0.009735107421875,
0.056640625,
0.038299560546875,
-0.0341796875,
0.026885986328125,
-0.0198822021484375,
-0.0093994140625,
-0.057098388671875,
-0.01338958740234375,
0.06671142578125,
0.041656494140625,
0.04827880859375,
-0.0146636962890625,
0.0491943359375,
-0.0192413330078125,
0.02178955078125,
-0.041168212890625,
0.045684814453125,
-0.006305694580078125,
-0.03167724609375,
-0.02008056640625,
-0.0251617431640625,
-0.07891845703125,
0.018035888671875,
-0.0277862548828125,
-0.057586669921875,
0.0110015869140625,
0.0009140968322753906,
-0.0215301513671875,
0.062744140625,
-0.05865478515625,
0.06927490234375,
-0.01025390625,
-0.0286712646484375,
0.0046234130859375,
-0.047210693359375,
0.0227508544921875,
0.00873565673828125,
-0.0167083740234375,
-0.00629425048828125,
0.018768310546875,
0.07666015625,
-0.044830322265625,
0.06829833984375,
-0.0386962890625,
0.02716064453125,
0.040191650390625,
-0.014495849609375,
0.01837158203125,
-0.0062713623046875,
0.00313568115234375,
0.03515625,
-0.002407073974609375,
-0.0396728515625,
-0.044219970703125,
0.0419921875,
-0.0784912109375,
-0.0298004150390625,
-0.033111572265625,
-0.04547119140625,
0.01390838623046875,
0.0017375946044921875,
0.0386962890625,
0.04046630859375,
0.017730712890625,
0.01444244384765625,
0.041778564453125,
-0.04766845703125,
0.03997802734375,
-0.00850677490234375,
-0.016387939453125,
-0.03399658203125,
0.053863525390625,
0.0028362274169921875,
-0.0010404586791992188,
0.0094757080078125,
0.0153961181640625,
-0.033203125,
-0.03460693359375,
-0.0262298583984375,
0.04290771484375,
-0.039794921875,
-0.032470703125,
-0.039398193359375,
-0.05084228515625,
-0.034454345703125,
-0.0165557861328125,
-0.0292816162109375,
-0.0196075439453125,
-0.017120361328125,
0.0152130126953125,
0.0445556640625,
0.050445556640625,
-0.01183319091796875,
0.047454833984375,
-0.049468994140625,
0.00827789306640625,
0.01227569580078125,
0.041046142578125,
-0.01209259033203125,
-0.0704345703125,
-0.0240936279296875,
-0.01082611083984375,
-0.036590576171875,
-0.056976318359375,
0.0518798828125,
0.0037384033203125,
0.04718017578125,
0.03070068359375,
-0.0215301513671875,
0.060089111328125,
-0.00644683837890625,
0.040802001953125,
0.024505615234375,
-0.045745849609375,
0.053466796875,
-0.0025634765625,
0.02001953125,
-0.00011098384857177734,
0.0266571044921875,
-0.0173492431640625,
-0.009765625,
-0.077392578125,
-0.052947998046875,
0.0758056640625,
0.0132598876953125,
-0.004627227783203125,
0.03070068359375,
0.056427001953125,
0.0034465789794921875,
0.0028858184814453125,
-0.059814453125,
-0.03558349609375,
-0.0208740234375,
-0.0192718505859375,
-0.00910186767578125,
-0.00946807861328125,
-0.0025043487548828125,
-0.0550537109375,
0.04705810546875,
0.002532958984375,
0.0679931640625,
0.02392578125,
-0.003620147705078125,
-0.00870513916015625,
-0.032470703125,
0.0307769775390625,
0.02197265625,
-0.0248870849609375,
-0.0009713172912597656,
0.020904541015625,
-0.043670654296875,
0.0029315948486328125,
0.01267242431640625,
-0.00033354759216308594,
-0.002002716064453125,
0.037017822265625,
0.06829833984375,
0.00470733642578125,
-0.00179290771484375,
0.0296173095703125,
-0.001922607421875,
-0.033111572265625,
-0.0211181640625,
0.01439666748046875,
-0.0146636962890625,
0.038330078125,
0.0195465087890625,
0.022674560546875,
-0.0192413330078125,
-0.0216522216796875,
0.018310546875,
0.03887939453125,
-0.0236053466796875,
-0.02447509765625,
0.05279541015625,
-0.014129638671875,
-0.01508331298828125,
0.07855224609375,
-0.004405975341796875,
-0.04241943359375,
0.0777587890625,
0.0367431640625,
0.06646728515625,
-0.011199951171875,
0.00394439697265625,
0.076904296875,
0.0191497802734375,
0.00872802734375,
0.00701141357421875,
0.01363372802734375,
-0.05950927734375,
0.0027065277099609375,
-0.048828125,
0.0033512115478515625,
0.030792236328125,
-0.036041259765625,
0.0222015380859375,
-0.051788330078125,
-0.0191802978515625,
0.00304412841796875,
0.0277099609375,
-0.063720703125,
0.0166168212890625,
0.0033206939697265625,
0.06402587890625,
-0.06536865234375,
0.06060791015625,
0.0648193359375,
-0.0386962890625,
-0.06805419921875,
-0.0126190185546875,
-0.0023326873779296875,
-0.07421875,
0.036041259765625,
0.0377197265625,
0.0201416015625,
-0.002109527587890625,
-0.064453125,
-0.053497314453125,
0.1082763671875,
0.03155517578125,
-0.0064544677734375,
0.0217132568359375,
0.01204681396484375,
0.0186767578125,
-0.035675048828125,
0.0289764404296875,
0.0227813720703125,
0.031219482421875,
0.0222015380859375,
-0.05120849609375,
0.01248931884765625,
-0.022216796875,
0.0108642578125,
0.0180206298828125,
-0.055633544921875,
0.06695556640625,
-0.044189453125,
-0.0123291015625,
0.0037364959716796875,
0.048095703125,
0.034698486328125,
0.01287078857421875,
0.04364013671875,
0.07159423828125,
0.03619384765625,
-0.0258941650390625,
0.06671142578125,
0.0018167495727539062,
0.0472412109375,
0.040740966796875,
0.02069091796875,
0.044158935546875,
0.022613525390625,
-0.0255584716796875,
0.03240966796875,
0.07550048828125,
-0.034515380859375,
0.0281219482421875,
0.016082763671875,
0.005130767822265625,
-0.0025920867919921875,
0.0169677734375,
-0.032745361328125,
0.044189453125,
0.0159912109375,
-0.0335693359375,
-0.01947021484375,
0.005741119384765625,
-0.01251220703125,
-0.0193634033203125,
-0.01336669921875,
0.031494140625,
-0.003536224365234375,
-0.0308380126953125,
0.050506591796875,
0.00870513916015625,
0.06427001953125,
-0.02813720703125,
-0.001209259033203125,
-0.0293121337890625,
0.0197601318359375,
-0.0214080810546875,
-0.06512451171875,
0.031951904296875,
-0.01837158203125,
-0.0006613731384277344,
-0.0010843276977539062,
0.04693603515625,
-0.0279541015625,
-0.036163330078125,
0.01605224609375,
0.026397705078125,
0.04669189453125,
-0.004039764404296875,
-0.0955810546875,
0.0094451904296875,
0.00612640380859375,
-0.032440185546875,
0.0252532958984375,
0.034912109375,
0.004016876220703125,
0.057708740234375,
0.039306640625,
-0.0101165771484375,
0.0193634033203125,
-0.016845703125,
0.049285888671875,
-0.05377197265625,
-0.02618408203125,
-0.056427001953125,
0.052642822265625,
-0.0037441253662109375,
-0.03741455078125,
0.03887939453125,
0.047210693359375,
0.07171630859375,
-0.008514404296875,
0.032470703125,
-0.02032470703125,
0.0013647079467773438,
-0.0261383056640625,
0.058563232421875,
-0.044647216796875,
-0.004180908203125,
-0.012298583984375,
-0.061279296875,
-0.03070068359375,
0.0633544921875,
-0.017547607421875,
0.033782958984375,
0.047607421875,
0.06890869140625,
-0.023284912109375,
-0.0237274169921875,
0.0199127197265625,
0.0140533447265625,
0.005863189697265625,
0.038116455078125,
0.039581298828125,
-0.0633544921875,
0.0223846435546875,
-0.0548095703125,
-0.01934814453125,
-0.00510406494140625,
-0.040008544921875,
-0.07275390625,
-0.07159423828125,
-0.0523681640625,
-0.039520263671875,
-0.0294189453125,
0.052154541015625,
0.0841064453125,
-0.04705810546875,
-0.013397216796875,
0.0112457275390625,
0.009918212890625,
-0.0178070068359375,
-0.0172271728515625,
0.048614501953125,
-0.01558685302734375,
-0.064697265625,
-0.0289459228515625,
0.0029926300048828125,
0.03424072265625,
-0.00365447998046875,
-0.0172576904296875,
-0.01201629638671875,
-0.0237274169921875,
0.0247650146484375,
0.021331787109375,
-0.051971435546875,
-0.0152587890625,
-0.0262603759765625,
-0.01296234130859375,
0.033477783203125,
0.029876708984375,
-0.052825927734375,
0.00875091552734375,
0.0244140625,
0.0237579345703125,
0.06396484375,
-0.023345947265625,
-0.0032482147216796875,
-0.06414794921875,
0.044158935546875,
-0.0221710205078125,
0.040557861328125,
0.028656005859375,
-0.022674560546875,
0.04962158203125,
0.034576416015625,
-0.0313720703125,
-0.057159423828125,
-0.0139923095703125,
-0.0845947265625,
-0.01401519775390625,
0.059906005859375,
-0.0267486572265625,
-0.04339599609375,
0.034332275390625,
-0.005458831787109375,
0.04437255859375,
0.0016717910766601562,
0.05438232421875,
0.01593017578125,
-0.0153961181640625,
-0.049774169921875,
-0.03399658203125,
0.0277557373046875,
0.004695892333984375,
-0.05377197265625,
-0.037139892578125,
0.0006775856018066406,
0.050262451171875,
0.0230560302734375,
0.035614013671875,
-0.0177764892578125,
0.0027828216552734375,
0.01074981689453125,
0.03961181640625,
-0.01922607421875,
-0.01702880859375,
-0.032135009765625,
0.005413055419921875,
-0.0139007568359375,
-0.046234130859375
]
] |
Helsinki-NLP/opus-mt-ine-en | 2023-08-16T11:58:27.000Z | [
"transformers",
"pytorch",
"tf",
"marian",
"text2text-generation",
"translation",
"ca",
"es",
"os",
"ro",
"fy",
"cy",
"sc",
"is",
"yi",
"lb",
"an",
"sq",
"fr",
"ht",
"rm",
"ps",
"af",
"uk",
"sl",
"lt",
"bg",
"be",
"gd",
"si",
"en",
"br",
"mk",
"or",
"mr",
"ru",
"fo",
"co",
"oc",
"pl",
"gl",
"nb",
"bn",
"id",
"hy",
"da",
"gv",
"nl",
"pt",
"hi",
"as",
"kw",
"ga",
"sv",
"gu",
"wa",
"lv",
"el",
"it",
"hr",
"ur",
"nn",
"de",
"cs",
"ine",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | translation | Helsinki-NLP | null | null | Helsinki-NLP/opus-mt-ine-en | 2 | 31,365 | transformers | 2022-03-02T23:29:04 | ---
language:
- ca
- es
- os
- ro
- fy
- cy
- sc
- is
- yi
- lb
- an
- sq
- fr
- ht
- rm
- ps
- af
- uk
- sl
- lt
- bg
- be
- gd
- si
- en
- br
- mk
- or
- mr
- ru
- fo
- co
- oc
- pl
- gl
- nb
- bn
- id
- hy
- da
- gv
- nl
- pt
- hi
- as
- kw
- ga
- sv
- gu
- wa
- lv
- el
- it
- hr
- ur
- nn
- de
- cs
- ine
tags:
- translation
license: apache-2.0
---
### ine-eng
* source group: Indo-European languages
* target group: English
* OPUS readme: [ine-eng](https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/ine-eng/README.md)
* model: transformer
* source language(s): afr aln ang_Latn arg asm ast awa bel bel_Latn ben bho bos_Latn bre bul bul_Latn cat ces cor cos csb_Latn cym dan deu dsb egl ell enm_Latn ext fao fra frm_Latn frr fry gcf_Latn gla gle glg glv gom gos got_Goth grc_Grek gsw guj hat hif_Latn hin hrv hsb hye ind isl ita jdt_Cyrl ksh kur_Arab kur_Latn lad lad_Latn lat_Latn lav lij lit lld_Latn lmo ltg ltz mai mar max_Latn mfe min mkd mwl nds nld nno nob nob_Hebr non_Latn npi oci ori orv_Cyrl oss pan_Guru pap pdc pes pes_Latn pes_Thaa pms pnb pol por prg_Latn pus roh rom ron rue rus san_Deva scn sco sgs sin slv snd_Arab spa sqi srp_Cyrl srp_Latn stq swe swg tgk_Cyrl tly_Latn tmw_Latn ukr urd vec wln yid zlm_Latn zsm_Latn zza
* target language(s): eng
* pre-processing: normalization + SentencePiece (spm32k,spm32k)
* download original weights: [opus2m-2020-08-01.zip](https://object.pouta.csc.fi/Tatoeba-MT-models/ine-eng/opus2m-2020-08-01.zip)
* test set translations: [opus2m-2020-08-01.test.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/ine-eng/opus2m-2020-08-01.test.txt)
* test set scores: [opus2m-2020-08-01.eval.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/ine-eng/opus2m-2020-08-01.eval.txt)
## Benchmarks
| testset | BLEU | chr-F |
|-----------------------|-------|-------|
| newsdev2014-hineng.hin.eng | 11.2 | 0.375 |
| newsdev2016-enro-roneng.ron.eng | 35.5 | 0.614 |
| newsdev2017-enlv-laveng.lav.eng | 25.1 | 0.542 |
| newsdev2019-engu-gujeng.guj.eng | 16.0 | 0.420 |
| newsdev2019-enlt-liteng.lit.eng | 24.0 | 0.522 |
| newsdiscussdev2015-enfr-fraeng.fra.eng | 30.1 | 0.550 |
| newsdiscusstest2015-enfr-fraeng.fra.eng | 33.4 | 0.572 |
| newssyscomb2009-ceseng.ces.eng | 24.0 | 0.520 |
| newssyscomb2009-deueng.deu.eng | 25.7 | 0.526 |
| newssyscomb2009-fraeng.fra.eng | 27.9 | 0.550 |
| newssyscomb2009-itaeng.ita.eng | 31.4 | 0.574 |
| newssyscomb2009-spaeng.spa.eng | 28.3 | 0.555 |
| news-test2008-deueng.deu.eng | 24.0 | 0.515 |
| news-test2008-fraeng.fra.eng | 24.5 | 0.524 |
| news-test2008-spaeng.spa.eng | 25.5 | 0.533 |
| newstest2009-ceseng.ces.eng | 23.3 | 0.516 |
| newstest2009-deueng.deu.eng | 23.2 | 0.512 |
| newstest2009-fraeng.fra.eng | 27.3 | 0.545 |
| newstest2009-itaeng.ita.eng | 30.3 | 0.567 |
| newstest2009-spaeng.spa.eng | 27.9 | 0.549 |
| newstest2010-ceseng.ces.eng | 23.8 | 0.523 |
| newstest2010-deueng.deu.eng | 26.2 | 0.545 |
| newstest2010-fraeng.fra.eng | 28.6 | 0.562 |
| newstest2010-spaeng.spa.eng | 31.4 | 0.581 |
| newstest2011-ceseng.ces.eng | 24.2 | 0.521 |
| newstest2011-deueng.deu.eng | 23.9 | 0.522 |
| newstest2011-fraeng.fra.eng | 29.5 | 0.570 |
| newstest2011-spaeng.spa.eng | 30.3 | 0.570 |
| newstest2012-ceseng.ces.eng | 23.5 | 0.516 |
| newstest2012-deueng.deu.eng | 24.9 | 0.529 |
| newstest2012-fraeng.fra.eng | 30.0 | 0.568 |
| newstest2012-ruseng.rus.eng | 29.9 | 0.565 |
| newstest2012-spaeng.spa.eng | 33.3 | 0.593 |
| newstest2013-ceseng.ces.eng | 25.6 | 0.531 |
| newstest2013-deueng.deu.eng | 27.7 | 0.545 |
| newstest2013-fraeng.fra.eng | 30.0 | 0.561 |
| newstest2013-ruseng.rus.eng | 24.4 | 0.514 |
| newstest2013-spaeng.spa.eng | 30.8 | 0.577 |
| newstest2014-csen-ceseng.ces.eng | 27.7 | 0.558 |
| newstest2014-deen-deueng.deu.eng | 27.7 | 0.545 |
| newstest2014-fren-fraeng.fra.eng | 32.2 | 0.592 |
| newstest2014-hien-hineng.hin.eng | 16.7 | 0.450 |
| newstest2014-ruen-ruseng.rus.eng | 27.2 | 0.552 |
| newstest2015-encs-ceseng.ces.eng | 25.4 | 0.518 |
| newstest2015-ende-deueng.deu.eng | 28.8 | 0.552 |
| newstest2015-enru-ruseng.rus.eng | 25.6 | 0.527 |
| newstest2016-encs-ceseng.ces.eng | 27.0 | 0.540 |
| newstest2016-ende-deueng.deu.eng | 33.5 | 0.592 |
| newstest2016-enro-roneng.ron.eng | 32.8 | 0.591 |
| newstest2016-enru-ruseng.rus.eng | 24.8 | 0.523 |
| newstest2017-encs-ceseng.ces.eng | 23.7 | 0.510 |
| newstest2017-ende-deueng.deu.eng | 29.3 | 0.556 |
| newstest2017-enlv-laveng.lav.eng | 18.9 | 0.486 |
| newstest2017-enru-ruseng.rus.eng | 28.0 | 0.546 |
| newstest2018-encs-ceseng.ces.eng | 24.9 | 0.521 |
| newstest2018-ende-deueng.deu.eng | 36.0 | 0.604 |
| newstest2018-enru-ruseng.rus.eng | 23.8 | 0.517 |
| newstest2019-deen-deueng.deu.eng | 31.5 | 0.570 |
| newstest2019-guen-gujeng.guj.eng | 12.1 | 0.377 |
| newstest2019-lten-liteng.lit.eng | 26.6 | 0.555 |
| newstest2019-ruen-ruseng.rus.eng | 27.5 | 0.541 |
| Tatoeba-test.afr-eng.afr.eng | 59.0 | 0.724 |
| Tatoeba-test.ang-eng.ang.eng | 9.9 | 0.254 |
| Tatoeba-test.arg-eng.arg.eng | 41.6 | 0.487 |
| Tatoeba-test.asm-eng.asm.eng | 22.8 | 0.392 |
| Tatoeba-test.ast-eng.ast.eng | 36.1 | 0.521 |
| Tatoeba-test.awa-eng.awa.eng | 11.6 | 0.280 |
| Tatoeba-test.bel-eng.bel.eng | 42.2 | 0.597 |
| Tatoeba-test.ben-eng.ben.eng | 45.8 | 0.598 |
| Tatoeba-test.bho-eng.bho.eng | 34.4 | 0.518 |
| Tatoeba-test.bre-eng.bre.eng | 24.4 | 0.405 |
| Tatoeba-test.bul-eng.bul.eng | 50.8 | 0.660 |
| Tatoeba-test.cat-eng.cat.eng | 51.2 | 0.677 |
| Tatoeba-test.ces-eng.ces.eng | 47.6 | 0.641 |
| Tatoeba-test.cor-eng.cor.eng | 5.4 | 0.214 |
| Tatoeba-test.cos-eng.cos.eng | 61.0 | 0.675 |
| Tatoeba-test.csb-eng.csb.eng | 22.5 | 0.394 |
| Tatoeba-test.cym-eng.cym.eng | 34.7 | 0.522 |
| Tatoeba-test.dan-eng.dan.eng | 56.2 | 0.708 |
| Tatoeba-test.deu-eng.deu.eng | 44.9 | 0.625 |
| Tatoeba-test.dsb-eng.dsb.eng | 21.0 | 0.383 |
| Tatoeba-test.egl-eng.egl.eng | 6.9 | 0.221 |
| Tatoeba-test.ell-eng.ell.eng | 62.1 | 0.741 |
| Tatoeba-test.enm-eng.enm.eng | 22.6 | 0.466 |
| Tatoeba-test.ext-eng.ext.eng | 33.2 | 0.496 |
| Tatoeba-test.fao-eng.fao.eng | 28.1 | 0.460 |
| Tatoeba-test.fas-eng.fas.eng | 9.6 | 0.306 |
| Tatoeba-test.fra-eng.fra.eng | 50.3 | 0.661 |
| Tatoeba-test.frm-eng.frm.eng | 30.0 | 0.457 |
| Tatoeba-test.frr-eng.frr.eng | 15.2 | 0.301 |
| Tatoeba-test.fry-eng.fry.eng | 34.4 | 0.525 |
| Tatoeba-test.gcf-eng.gcf.eng | 18.4 | 0.317 |
| Tatoeba-test.gla-eng.gla.eng | 24.1 | 0.400 |
| Tatoeba-test.gle-eng.gle.eng | 52.2 | 0.671 |
| Tatoeba-test.glg-eng.glg.eng | 50.5 | 0.669 |
| Tatoeba-test.glv-eng.glv.eng | 5.7 | 0.189 |
| Tatoeba-test.gos-eng.gos.eng | 19.2 | 0.378 |
| Tatoeba-test.got-eng.got.eng | 0.1 | 0.022 |
| Tatoeba-test.grc-eng.grc.eng | 0.9 | 0.095 |
| Tatoeba-test.gsw-eng.gsw.eng | 23.9 | 0.390 |
| Tatoeba-test.guj-eng.guj.eng | 28.0 | 0.428 |
| Tatoeba-test.hat-eng.hat.eng | 44.2 | 0.567 |
| Tatoeba-test.hbs-eng.hbs.eng | 51.6 | 0.666 |
| Tatoeba-test.hif-eng.hif.eng | 22.3 | 0.451 |
| Tatoeba-test.hin-eng.hin.eng | 41.7 | 0.585 |
| Tatoeba-test.hsb-eng.hsb.eng | 46.4 | 0.590 |
| Tatoeba-test.hye-eng.hye.eng | 40.4 | 0.564 |
| Tatoeba-test.isl-eng.isl.eng | 43.8 | 0.605 |
| Tatoeba-test.ita-eng.ita.eng | 60.7 | 0.735 |
| Tatoeba-test.jdt-eng.jdt.eng | 5.5 | 0.091 |
| Tatoeba-test.kok-eng.kok.eng | 7.8 | 0.205 |
| Tatoeba-test.ksh-eng.ksh.eng | 15.8 | 0.284 |
| Tatoeba-test.kur-eng.kur.eng | 11.6 | 0.232 |
| Tatoeba-test.lad-eng.lad.eng | 30.7 | 0.484 |
| Tatoeba-test.lah-eng.lah.eng | 11.0 | 0.286 |
| Tatoeba-test.lat-eng.lat.eng | 24.4 | 0.432 |
| Tatoeba-test.lav-eng.lav.eng | 47.2 | 0.646 |
| Tatoeba-test.lij-eng.lij.eng | 9.0 | 0.287 |
| Tatoeba-test.lit-eng.lit.eng | 51.7 | 0.670 |
| Tatoeba-test.lld-eng.lld.eng | 22.4 | 0.369 |
| Tatoeba-test.lmo-eng.lmo.eng | 26.1 | 0.381 |
| Tatoeba-test.ltz-eng.ltz.eng | 39.8 | 0.536 |
| Tatoeba-test.mai-eng.mai.eng | 72.3 | 0.758 |
| Tatoeba-test.mar-eng.mar.eng | 32.0 | 0.554 |
| Tatoeba-test.mfe-eng.mfe.eng | 63.1 | 0.822 |
| Tatoeba-test.mkd-eng.mkd.eng | 49.5 | 0.638 |
| Tatoeba-test.msa-eng.msa.eng | 38.6 | 0.566 |
| Tatoeba-test.multi.eng | 45.6 | 0.615 |
| Tatoeba-test.mwl-eng.mwl.eng | 40.4 | 0.767 |
| Tatoeba-test.nds-eng.nds.eng | 35.5 | 0.538 |
| Tatoeba-test.nep-eng.nep.eng | 4.9 | 0.209 |
| Tatoeba-test.nld-eng.nld.eng | 54.2 | 0.694 |
| Tatoeba-test.non-eng.non.eng | 39.3 | 0.573 |
| Tatoeba-test.nor-eng.nor.eng | 50.9 | 0.663 |
| Tatoeba-test.oci-eng.oci.eng | 19.6 | 0.386 |
| Tatoeba-test.ori-eng.ori.eng | 16.2 | 0.364 |
| Tatoeba-test.orv-eng.orv.eng | 13.6 | 0.288 |
| Tatoeba-test.oss-eng.oss.eng | 9.4 | 0.301 |
| Tatoeba-test.pan-eng.pan.eng | 17.1 | 0.389 |
| Tatoeba-test.pap-eng.pap.eng | 57.0 | 0.680 |
| Tatoeba-test.pdc-eng.pdc.eng | 41.6 | 0.526 |
| Tatoeba-test.pms-eng.pms.eng | 13.7 | 0.333 |
| Tatoeba-test.pol-eng.pol.eng | 46.5 | 0.632 |
| Tatoeba-test.por-eng.por.eng | 56.4 | 0.710 |
| Tatoeba-test.prg-eng.prg.eng | 2.3 | 0.193 |
| Tatoeba-test.pus-eng.pus.eng | 3.2 | 0.194 |
| Tatoeba-test.roh-eng.roh.eng | 17.5 | 0.420 |
| Tatoeba-test.rom-eng.rom.eng | 5.0 | 0.237 |
| Tatoeba-test.ron-eng.ron.eng | 51.4 | 0.670 |
| Tatoeba-test.rue-eng.rue.eng | 26.0 | 0.447 |
| Tatoeba-test.rus-eng.rus.eng | 47.8 | 0.634 |
| Tatoeba-test.san-eng.san.eng | 4.0 | 0.195 |
| Tatoeba-test.scn-eng.scn.eng | 45.1 | 0.440 |
| Tatoeba-test.sco-eng.sco.eng | 41.9 | 0.582 |
| Tatoeba-test.sgs-eng.sgs.eng | 38.7 | 0.498 |
| Tatoeba-test.sin-eng.sin.eng | 29.7 | 0.499 |
| Tatoeba-test.slv-eng.slv.eng | 38.2 | 0.564 |
| Tatoeba-test.snd-eng.snd.eng | 12.7 | 0.342 |
| Tatoeba-test.spa-eng.spa.eng | 53.2 | 0.687 |
| Tatoeba-test.sqi-eng.sqi.eng | 51.9 | 0.679 |
| Tatoeba-test.stq-eng.stq.eng | 9.0 | 0.391 |
| Tatoeba-test.swe-eng.swe.eng | 57.4 | 0.705 |
| Tatoeba-test.swg-eng.swg.eng | 18.0 | 0.338 |
| Tatoeba-test.tgk-eng.tgk.eng | 24.3 | 0.413 |
| Tatoeba-test.tly-eng.tly.eng | 1.1 | 0.094 |
| Tatoeba-test.ukr-eng.ukr.eng | 48.0 | 0.639 |
| Tatoeba-test.urd-eng.urd.eng | 27.2 | 0.471 |
| Tatoeba-test.vec-eng.vec.eng | 28.0 | 0.398 |
| Tatoeba-test.wln-eng.wln.eng | 17.5 | 0.320 |
| Tatoeba-test.yid-eng.yid.eng | 26.9 | 0.457 |
| Tatoeba-test.zza-eng.zza.eng | 1.7 | 0.131 |
### System Info:
- hf_name: ine-eng
- source_languages: ine
- target_languages: eng
- opus_readme_url: https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/ine-eng/README.md
- original_repo: Tatoeba-Challenge
- tags: ['translation']
- languages: ['ca', 'es', 'os', 'ro', 'fy', 'cy', 'sc', 'is', 'yi', 'lb', 'an', 'sq', 'fr', 'ht', 'rm', 'ps', 'af', 'uk', 'sl', 'lt', 'bg', 'be', 'gd', 'si', 'en', 'br', 'mk', 'or', 'mr', 'ru', 'fo', 'co', 'oc', 'pl', 'gl', 'nb', 'bn', 'id', 'hy', 'da', 'gv', 'nl', 'pt', 'hi', 'as', 'kw', 'ga', 'sv', 'gu', 'wa', 'lv', 'el', 'it', 'hr', 'ur', 'nn', 'de', 'cs', 'ine']
- src_constituents: {'cat', 'spa', 'pap', 'mwl', 'lij', 'bos_Latn', 'lad_Latn', 'lat_Latn', 'pcd', 'oss', 'ron', 'fry', 'cym', 'awa', 'swg', 'zsm_Latn', 'srd', 'gcf_Latn', 'isl', 'yid', 'bho', 'ltz', 'kur_Latn', 'arg', 'pes_Thaa', 'sqi', 'csb_Latn', 'fra', 'hat', 'non_Latn', 'sco', 'pnb', 'roh', 'bul_Latn', 'pus', 'afr', 'ukr', 'slv', 'lit', 'tmw_Latn', 'hsb', 'tly_Latn', 'bul', 'bel', 'got_Goth', 'lat_Grek', 'ext', 'gla', 'mai', 'sin', 'hif_Latn', 'eng', 'bre', 'nob_Hebr', 'prg_Latn', 'ang_Latn', 'aln', 'mkd', 'ori', 'mar', 'afr_Arab', 'san_Deva', 'gos', 'rus', 'fao', 'orv_Cyrl', 'bel_Latn', 'cos', 'zza', 'grc_Grek', 'oci', 'mfe', 'gom', 'bjn', 'sgs', 'tgk_Cyrl', 'hye_Latn', 'pdc', 'srp_Cyrl', 'pol', 'ast', 'glg', 'pms', 'nob', 'ben', 'min', 'srp_Latn', 'zlm_Latn', 'ind', 'rom', 'hye', 'scn', 'enm_Latn', 'lmo', 'npi', 'pes', 'dan', 'rus_Latn', 'jdt_Cyrl', 'gsw', 'glv', 'nld', 'snd_Arab', 'kur_Arab', 'por', 'hin', 'dsb', 'asm', 'lad', 'frm_Latn', 'ksh', 'pan_Guru', 'cor', 'gle', 'swe', 'guj', 'wln', 'lav', 'ell', 'frr', 'rue', 'ita', 'hrv', 'urd', 'stq', 'nno', 'deu', 'lld_Latn', 'ces', 'egl', 'vec', 'max_Latn', 'pes_Latn', 'ltg', 'nds'}
- tgt_constituents: {'eng'}
- src_multilingual: True
- tgt_multilingual: False
- prepro: normalization + SentencePiece (spm32k,spm32k)
- url_model: https://object.pouta.csc.fi/Tatoeba-MT-models/ine-eng/opus2m-2020-08-01.zip
- url_test_set: https://object.pouta.csc.fi/Tatoeba-MT-models/ine-eng/opus2m-2020-08-01.test.txt
- src_alpha3: ine
- tgt_alpha3: eng
- short_pair: ine-en
- chrF2_score: 0.615
- bleu: 45.6
- brevity_penalty: 0.997
- ref_len: 71872.0
- src_name: Indo-European languages
- tgt_name: English
- train_date: 2020-08-01
- src_alpha2: ine
- tgt_alpha2: en
- prefer_old: False
- long_pair: ine-eng
- helsinki_git_sha: 480fcbe0ee1bf4774bcbe6226ad9f58e63f6c535
- transformers_git_sha: 2207e5d8cb224e954a7cba69fa4ac2309e9ff30b
- port_machine: brutasse
- port_time: 2020-08-21-14:41 | 13,118 | [
[
-0.053680419921875,
-0.033416748046875,
0.01363372802734375,
0.034149169921875,
-0.01172637939453125,
-0.002288818359375,
0.007183074951171875,
-0.02935791015625,
0.0491943359375,
-0.0186767578125,
-0.0233154296875,
-0.023651123046875,
-0.035186767578125,
0.024505615234375,
0.0012369155883789062,
0.031707763671875,
-0.0028781890869140625,
0.01070404052734375,
0.0188446044921875,
-0.02069091796875,
-0.03692626953125,
0.003875732421875,
-0.0634765625,
-0.00325775146484375,
0.00567626953125,
0.036773681640625,
0.034454345703125,
0.0222015380859375,
0.0300140380859375,
0.029205322265625,
-0.0139312744140625,
0.0083160400390625,
0.01175689697265625,
-0.01116943359375,
0.0032806396484375,
-0.056640625,
-0.03155517578125,
-0.0109100341796875,
0.05224609375,
0.04449462890625,
0.0257110595703125,
0.02557373046875,
0.0121307373046875,
0.06982421875,
-0.03277587890625,
-0.004764556884765625,
-0.0039520263671875,
0.0009021759033203125,
-0.02764892578125,
-0.023284912109375,
-0.04046630859375,
-0.0570068359375,
-0.01226043701171875,
-0.044830322265625,
0.00841522216796875,
-0.0035552978515625,
0.10235595703125,
-0.01229095458984375,
-0.0211029052734375,
-0.0120697021484375,
-0.01995849609375,
0.06671142578125,
-0.0570068359375,
0.0191802978515625,
0.0301666259765625,
-0.00666046142578125,
-0.0065765380859375,
-0.0234527587890625,
-0.04595947265625,
0.0163421630859375,
-0.022918701171875,
0.044342041015625,
0.0039825439453125,
-0.0299835205078125,
0.0119781494140625,
0.0302581787109375,
-0.045623779296875,
0.007442474365234375,
-0.03497314453125,
-0.00066375732421875,
0.05615234375,
0.02569580078125,
0.023651123046875,
-0.0302734375,
-0.04541015625,
-0.03662109375,
-0.030670166015625,
0.046356201171875,
0.0204315185546875,
0.01229095458984375,
-0.0212860107421875,
0.049591064453125,
-0.01947021484375,
0.032623291015625,
0.032745361328125,
-0.005207061767578125,
0.050506591796875,
-0.042724609375,
-0.0406494140625,
-0.01971435546875,
0.06622314453125,
0.043670654296875,
-0.01073455810546875,
0.0242767333984375,
0.011688232421875,
0.012481689453125,
-0.0311279296875,
-0.055328369140625,
0.002414703369140625,
0.00952911376953125,
-0.0200653076171875,
-0.00494384765625,
0.00830841064453125,
-0.09466552734375,
-0.0035495758056640625,
-0.002948760986328125,
0.0276031494140625,
-0.054107666015625,
-0.0283660888671875,
0.018890380859375,
-0.005401611328125,
0.033203125,
0.0147552490234375,
-0.036163330078125,
0.02398681640625,
0.00972747802734375,
0.05792236328125,
-0.0273895263671875,
-0.017547607421875,
0.01152801513671875,
-0.0035190582275390625,
-0.037078857421875,
0.05865478515625,
-0.02081298828125,
-0.035980224609375,
-0.0198211669921875,
0.0108642578125,
-0.034881591796875,
-0.0153656005859375,
0.04541015625,
-0.008209228515625,
0.0303497314453125,
-0.04815673828125,
-0.044647216796875,
-0.0253753662109375,
0.0107574462890625,
-0.03277587890625,
0.09368896484375,
0.01165771484375,
-0.06536865234375,
0.048248291015625,
-0.043670654296875,
0.00949859619140625,
-0.004119873046875,
0.0037078857421875,
-0.040313720703125,
-0.01678466796875,
0.029998779296875,
0.0207366943359375,
-0.0281982421875,
0.0014142990112304688,
-0.00786590576171875,
-0.0253143310546875,
0.004352569580078125,
-0.0200347900390625,
0.08465576171875,
0.0228118896484375,
-0.040008544921875,
-0.012359619140625,
-0.07232666015625,
0.033203125,
0.0107879638671875,
-0.0306549072265625,
0.0014247894287109375,
-0.047943115234375,
-0.0149688720703125,
0.01947021484375,
0.0130767822265625,
-0.04534912109375,
0.01157379150390625,
-0.05224609375,
-0.01302337646484375,
0.07574462890625,
0.01444244384765625,
0.029449462890625,
-0.04522705078125,
0.033782958984375,
0.02508544921875,
0.01432037353515625,
0.0217742919921875,
-0.044342041015625,
-0.05450439453125,
-0.01548004150390625,
0.040863037109375,
0.050994873046875,
-0.0411376953125,
0.05224609375,
-0.00670623779296875,
-0.0728759765625,
-0.039794921875,
-0.0092926025390625,
0.0386962890625,
0.0350341796875,
0.02471923828125,
-0.01331329345703125,
-0.039703369140625,
-0.074951171875,
-0.025634765625,
-0.00027108192443847656,
0.0130615234375,
0.01357269287109375,
0.0701904296875,
0.007793426513671875,
0.055328369140625,
-0.057098388671875,
-0.0220184326171875,
-0.01824951171875,
-0.0130767822265625,
0.041595458984375,
0.048065185546875,
0.061492919921875,
-0.052642822265625,
-0.08404541015625,
-0.00461578369140625,
-0.056549072265625,
-0.006526947021484375,
0.00415802001953125,
-0.015960693359375,
0.040252685546875,
0.00641632080078125,
-0.0458984375,
0.061309814453125,
0.032562255859375,
-0.05303955078125,
0.05377197265625,
-0.0195465087890625,
0.041717529296875,
-0.08624267578125,
0.015899658203125,
-0.0007419586181640625,
0.01540374755859375,
-0.025726318359375,
-0.0031108856201171875,
0.0012598037719726562,
0.010955810546875,
-0.026031494140625,
0.054779052734375,
-0.059967041015625,
-0.00563812255859375,
0.03509521484375,
0.0284271240234375,
-0.00811767578125,
0.055023193359375,
-0.022552490234375,
0.078857421875,
0.05377197265625,
-0.042083740234375,
0.0176849365234375,
0.014007568359375,
-0.0201263427734375,
0.040435791015625,
-0.0360107421875,
-0.0263214111328125,
0.00786590576171875,
-0.0022182464599609375,
-0.081298828125,
-0.00704193115234375,
0.01467132568359375,
-0.048828125,
0.0135345458984375,
0.0224456787109375,
-0.022979736328125,
-0.043670654296875,
-0.062744140625,
0.018157958984375,
0.029083251953125,
-0.0108489990234375,
0.0219573974609375,
0.01242828369140625,
-0.0052642822265625,
-0.04425048828125,
-0.0467529296875,
-0.0209503173828125,
-0.0081634521484375,
-0.039459228515625,
0.0197601318359375,
-0.01947021484375,
-0.0162506103515625,
0.0117950439453125,
-0.021240234375,
-0.0179595947265625,
-0.0151214599609375,
0.015106201171875,
0.01215362548828125,
-0.01849365234375,
-0.011749267578125,
-0.0214691162109375,
-0.014404296875,
-0.01189422607421875,
0.022857666015625,
0.0465087890625,
-0.0204315185546875,
-0.0303192138671875,
-0.04730224609375,
0.0174102783203125,
0.04022216796875,
-0.011566162109375,
0.065185546875,
0.0234832763671875,
-0.017486572265625,
0.01122283935546875,
-0.040130615234375,
0.00786590576171875,
-0.03570556640625,
0.014251708984375,
-0.046295166015625,
-0.04229736328125,
0.059967041015625,
0.006256103515625,
0.00272369384765625,
0.08172607421875,
0.04046630859375,
-0.0012598037719726562,
0.06280517578125,
0.009796142578125,
0.023193359375,
0.032501220703125,
-0.047760009765625,
0.0039825439453125,
-0.051116943359375,
-0.044036865234375,
-0.0433349609375,
-0.02947998046875,
-0.057891845703125,
-0.016876220703125,
0.032989501953125,
-0.0019321441650390625,
-0.0247802734375,
0.04327392578125,
-0.058563232421875,
0.0167694091796875,
0.03961181640625,
0.0261993408203125,
0.007442474365234375,
-0.0283203125,
-0.0211639404296875,
-0.0211944580078125,
-0.0210723876953125,
-0.0262908935546875,
0.09307861328125,
0.0198974609375,
0.03759765625,
0.021331787109375,
0.057342529296875,
0.0228118896484375,
0.0001952648162841797,
-0.0291748046875,
0.053436279296875,
0.021331787109375,
-0.0625,
-0.0391845703125,
-0.00408935546875,
-0.06683349609375,
0.034515380859375,
-0.0111083984375,
-0.044647216796875,
0.03277587890625,
-0.00321197509765625,
-0.019439697265625,
0.03045654296875,
-0.06121826171875,
0.041107177734375,
-0.00942230224609375,
-0.0556640625,
0.0112762451171875,
-0.04052734375,
0.0140838623046875,
0.007511138916015625,
0.007404327392578125,
0.0016880035400390625,
0.006900787353515625,
0.07257080078125,
-0.050323486328125,
0.0251922607421875,
-0.03167724609375,
0.01174163818359375,
0.04962158203125,
0.0064544677734375,
0.033935546875,
0.0137786865234375,
-0.01036834716796875,
-0.03582763671875,
-0.00048089027404785156,
-0.05218505859375,
-0.006809234619140625,
0.0518798828125,
-0.055084228515625,
-0.048980712890625,
-0.07098388671875,
-0.01158905029296875,
-0.004665374755859375,
0.020263671875,
0.034423828125,
0.04913330078125,
-0.007411956787109375,
0.0306549072265625,
0.0521240234375,
-0.0225372314453125,
0.06280517578125,
0.0024547576904296875,
-0.0011587142944335938,
-0.03643798828125,
0.05157470703125,
0.0185546875,
0.026702880859375,
0.03265380859375,
0.0113067626953125,
-0.0242919921875,
-0.02606201171875,
-0.0273284912109375,
0.0184173583984375,
-0.032684326171875,
-0.04156494140625,
-0.047271728515625,
-0.01309967041015625,
-0.0374755859375,
-0.0281219482421875,
-0.0308837890625,
-0.042633056640625,
-0.003734588623046875,
-0.02392578125,
0.0265045166015625,
0.047027587890625,
-0.0220489501953125,
0.0115966796875,
-0.0184326171875,
0.00405120849609375,
0.013916015625,
0.0228271484375,
-0.0133056640625,
-0.04736328125,
-0.01279449462890625,
0.0008606910705566406,
-0.037506103515625,
-0.0963134765625,
0.049346923828125,
-0.0089263916015625,
0.0231781005859375,
0.0176849365234375,
-0.01324462890625,
0.068115234375,
0.00485992431640625,
0.0872802734375,
0.0165557861328125,
-0.061920166015625,
0.046875,
-0.01396942138671875,
0.022186279296875,
0.05560302734375,
0.0222015380859375,
-0.0028362274169921875,
-0.04681396484375,
-0.056915283203125,
-0.06494140625,
0.040618896484375,
0.0235443115234375,
-0.026123046875,
-0.0105438232421875,
-0.007965087890625,
-0.0019989013671875,
-0.02703857421875,
-0.0560302734375,
-0.060699462890625,
0.00836181640625,
-0.01007843017578125,
0.0160675048828125,
-0.0234832763671875,
-0.000017762184143066406,
-0.042083740234375,
0.05084228515625,
0.0220489501953125,
0.039947509765625,
0.03302001953125,
-0.004779815673828125,
0.016082763671875,
0.034423828125,
0.070068359375,
0.038177490234375,
-0.016845703125,
-0.0026302337646484375,
0.04833984375,
-0.0570068359375,
0.02960205078125,
-0.0125274658203125,
-0.0218658447265625,
0.0114288330078125,
0.01507568359375,
0.0214385986328125,
0.0113983154296875,
-0.0005602836608886719,
0.0328369140625,
-0.00014519691467285156,
-0.029571533203125,
-0.046539306640625,
-0.0112762451171875,
0.006317138671875,
-0.0056915283203125,
0.0228118896484375,
0.01461029052734375,
0.0029392242431640625,
-0.0384521484375,
0.021728515625,
0.010955810546875,
-0.01531219482421875,
-0.00789642333984375,
0.039642333984375,
0.0127410888671875,
-0.006603240966796875,
0.005626678466796875,
-0.0258026123046875,
-0.036041259765625,
0.06268310546875,
0.037689208984375,
0.037933349609375,
-0.044464111328125,
-0.00511932373046875,
0.074951171875,
0.0034809112548828125,
-0.004062652587890625,
0.040557861328125,
0.04547119140625,
-0.017822265625,
0.0006132125854492188,
-0.053619384765625,
-0.0011138916015625,
0.0023746490478515625,
-0.046905517578125,
0.015625,
-0.02392578125,
-0.033935546875,
-0.0014133453369140625,
0.04046630859375,
-0.052886962890625,
0.041229248046875,
-0.037445068359375,
0.09295654296875,
-0.07763671875,
0.04461669921875,
0.055694580078125,
-0.0635986328125,
-0.08502197265625,
-0.021514892578125,
-0.0123291015625,
-0.04229736328125,
0.048736572265625,
-0.018341064453125,
0.0008139610290527344,
-0.004993438720703125,
-0.0152587890625,
-0.0726318359375,
0.10015869140625,
-0.0034942626953125,
-0.014007568359375,
0.0152587890625,
-0.0005397796630859375,
0.041534423828125,
0.007770538330078125,
0.0537109375,
0.035186767578125,
0.0654296875,
-0.003398895263671875,
-0.06781005859375,
0.021331787109375,
-0.04986572265625,
-0.0163726806640625,
0.022918701171875,
-0.0728759765625,
0.0877685546875,
0.002773284912109375,
-0.013671875,
-0.0180816650390625,
0.040008544921875,
0.02276611328125,
0.0177764892578125,
0.0225067138671875,
0.0567626953125,
0.034423828125,
-0.0287017822265625,
0.06365966796875,
-0.00934600830078125,
0.02923583984375,
0.0418701171875,
0.00234222412109375,
0.053314208984375,
0.032012939453125,
-0.0518798828125,
0.0261077880859375,
0.051483154296875,
-0.007175445556640625,
0.033111572265625,
0.00970458984375,
-0.028228759765625,
-0.00609588623046875,
0.0007352828979492188,
-0.04718017578125,
0.031494140625,
0.011505126953125,
-0.0025310516357421875,
0.00211334228515625,
-0.02197265625,
0.028900146484375,
0.0280609130859375,
-0.0135345458984375,
0.035430908203125,
0.004169464111328125,
-0.052581787109375,
0.03851318359375,
-0.0103912353515625,
0.065673828125,
-0.03167724609375,
0.0018033981323242188,
-0.032501220703125,
0.03118896484375,
-0.01959228515625,
-0.086669921875,
0.0308380126953125,
-0.0004181861877441406,
-0.019775390625,
0.00083160400390625,
0.038421630859375,
-0.0128326416015625,
-0.039398193359375,
0.032745361328125,
0.031280517578125,
0.0160980224609375,
0.031463623046875,
-0.0386962890625,
-0.0107574462890625,
0.0031566619873046875,
-0.053863525390625,
0.018585205078125,
0.03277587890625,
0.007747650146484375,
0.051055908203125,
0.0257110595703125,
0.016448974609375,
0.0384521484375,
-0.030303955078125,
0.0443115234375,
-0.053985595703125,
-0.0262298583984375,
-0.06939697265625,
0.0297698974609375,
-0.038238525390625,
-0.044219970703125,
0.081298828125,
0.07073974609375,
0.05120849609375,
-0.0136871337890625,
0.046173095703125,
-0.0361328125,
0.0416259765625,
-0.0171051025390625,
0.04730224609375,
-0.043701171875,
-0.0250244140625,
-0.00557708740234375,
-0.052032470703125,
-0.01348876953125,
0.041534423828125,
-0.01520538330078125,
0.020233154296875,
0.07958984375,
0.045867919921875,
0.019439697265625,
0.0008254051208496094,
0.0048828125,
0.0164031982421875,
0.007717132568359375,
0.07647705078125,
0.024169921875,
-0.06365966796875,
0.036041259765625,
-0.0248260498046875,
-0.004680633544921875,
-0.02496337890625,
-0.0369873046875,
-0.054534912109375,
-0.05023193359375,
-0.002712249755859375,
-0.0286407470703125,
-0.031402587890625,
0.06268310546875,
0.01438140869140625,
-0.061614990234375,
-0.0247955322265625,
0.01418304443359375,
0.0074310302734375,
-0.03546142578125,
-0.023712158203125,
0.073974609375,
-0.0036907196044921875,
-0.07208251953125,
0.0096282958984375,
-0.007251739501953125,
0.0076751708984375,
0.0341796875,
-0.0089263916015625,
-0.050933837890625,
0.022186279296875,
0.0245819091796875,
0.0264129638671875,
-0.037445068359375,
-0.02923583984375,
0.019744873046875,
-0.040313720703125,
0.0316162109375,
-0.007656097412109375,
-0.033966064453125,
0.009674072265625,
0.0733642578125,
0.0204010009765625,
0.04498291015625,
0.01288604736328125,
0.010772705078125,
-0.03375244140625,
0.0318603515625,
0.005214691162109375,
0.0194854736328125,
-0.0026340484619140625,
-0.0247039794921875,
0.05975341796875,
0.0232086181640625,
-0.03338623046875,
-0.0714111328125,
-0.015533447265625,
-0.08544921875,
0.0025844573974609375,
0.06842041015625,
-0.005924224853515625,
-0.0244140625,
0.0037021636962890625,
-0.025848388671875,
0.02764892578125,
-0.0355224609375,
0.029632568359375,
0.0384521484375,
-0.0208282470703125,
0.00665283203125,
-0.056243896484375,
0.0272216796875,
0.041595458984375,
-0.07220458984375,
-0.0217742919921875,
0.00800323486328125,
0.028106689453125,
0.037628173828125,
0.0777587890625,
-0.039306640625,
0.007419586181640625,
0.0279693603515625,
0.00643157958984375,
-0.0019235610961914062,
-0.0008435249328613281,
-0.0018177032470703125,
0.020294189453125,
-0.01157379150390625,
-0.044219970703125
]
] |