id stringlengths 2 115 | author stringlengths 2 42 ⌀ | last_modified timestamp[us, tz=UTC] | downloads int64 0 8.87M | likes int64 0 3.84k | paperswithcode_id stringlengths 2 45 ⌀ | tags list | lastModified timestamp[us, tz=UTC] | createdAt stringlengths 24 24 | key stringclasses 1 value | created timestamp[us] | card stringlengths 1 1.01M | embedding list | library_name stringclasses 21 values | pipeline_tag stringclasses 27 values | mask_token null | card_data null | widget_data null | model_index null | config null | transformers_info null | spaces null | safetensors null | transformersInfo null | modelId stringlengths 5 111 ⌀ | embeddings list |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
deepseek-ai/deepseek-llm-67b-chat | deepseek-ai | 2023-11-29T11:40:59Z | 1,042 | 24 | null | [
"transformers",
"pytorch",
"llama",
"text-generation",
"license:other",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2023-11-29T11:40:59Z | 2023-11-29T03:30:00.000Z | null | null | ---
license: other
license_name: deepseek
license_link: LICENSE
---
<p align="center">
<img width="500px" alt="DeepSeek Chat" src="https://github.com/deepseek-ai/DeepSeek-LLM/blob/main/images/logo.png?raw=true">
</p>
<p align="center"><a href="https://www.deepseek.com/">[🏠Homepage]</a> | <a href="https://chat.deepseek.com/">[🤖 Chat with DeepSeek LLM]</a> | <a href="https://discord.gg/Tc7c45Zzu5">[Discord]</a> | <a href="https://github.com/deepseek-ai/DeepSeek-LLM/blob/main/images/qr.jpeg">[Wechat(微信)]</a> </p>
<hr>
### 1. Introduction to DeepSeek LLM
Introducing DeepSeek LLM, an advanced language model comprising 67 billion parameters. It has been trained from scratch on a vast dataset of 2 trillion tokens in both English and Chinese. In order to foster research, we have made DeepSeek LLM 7B/67B Base and DeepSeek LLM 7B/67B Chat open source for the research community.
### 2. Model Summary
`deepseek-llm-67b-chat` is a 67B parameter model initialized from `deepseek-llm-67b-base` and fine-tuned on extra instruction data.
- **Home Page:** [DeepSeek](https://deepseek.com/)
- **Repository:** [deepseek-ai/deepseek-LLM](https://github.com/deepseek-ai/deepseek-LLM)
- **Chat With DeepSeek LLM:** [DeepSeek-LLM](https://chat.deepseek.com/)
### 3. How to Use
Here are some examples of how to use our model.
#### Chat Completion
```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM, GenerationConfig
model_name = "deepseek-ai/deepseek-llm-67b-chat"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.bfloat16, device_map="auto")
model.generation_config = GenerationConfig.from_pretrained(model_name)
model.generation_config.pad_token_id = model.generation_config.eos_token_id
messages = [
{"role": "user", "content": "Who are you?"}
]
input_tensor = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt")
outputs = model.generate(input_tensor.to(model.device), max_new_tokens=100)
result = tokenizer.decode(outputs[0][input_tensor.shape[1]:], skip_special_tokens=True)
print(result)
```
If you prefer not to use the provided `apply_chat_template` function, you can also interact with our model by following the sample template below. Note that `messages` should be replaced by your input.
```
User: {messages[0]['content']}
Assistant: {messages[1]['content']}<|end▁of▁sentence|>User: {messages[2]['content']}
Assistant:
```
**Note:** By default (`add_special_tokens=True`), our tokenizer automatically adds a `bos_token` (`<|begin▁of▁sentence|>`) before the input text. Additionally, since the system prompt is not compatible with this version of our models, we DO NOT RECOMMEND including the system prompt in your input.
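As an illustrative sketch, the template above can also be assembled manually. The helper below is ours, not part of the released code, and the whitespace follows the template exactly as printed; the tokenizer adds the `bos_token` itself, so it is not prepended here:

```python
# Sketch: manually render a chat history into the template shown above.
# Assistant turns are terminated by the model's EOS token; the final
# "Assistant:" invites the model to generate its reply.
EOS = "<|end▁of▁sentence|>"

def build_prompt(messages):
    """Render a list of {role, content} dicts into the documented template."""
    parts = []
    for m in messages:
        if m["role"] == "user":
            parts.append(f"User: {m['content']}\n")
        else:  # assistant turn, terminated by the EOS token
            parts.append(f"Assistant: {m['content']}{EOS}")
    parts.append("Assistant:")
    return "".join(parts)

print(build_prompt([{"role": "user", "content": "Who are you?"}]))
```

The resulting string can be tokenized and passed to `model.generate` directly in place of the `apply_chat_template` output.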
### 4. License
This code repository is licensed under the MIT License. The use of DeepSeek LLM models is subject to the Model License. DeepSeek LLM supports commercial use.
See the [LICENSE-MODEL](https://github.com/deepseek-ai/deepseek-LLM/blob/main/LICENSE-MODEL) for more details.
### 5. Contact
If you have any questions, please raise an issue or contact us at [service@deepseek.com](mailto:service@deepseek.com).
| null | transformers | text-generation | null | null | null | null | null | null | null | null | null | deepseek-ai/deepseek-llm-67b-chat | [
-0.30180129408836365,
-0.8337169289588928,
0.2942453920841217,
0.3908047676086426,
-0.40847277641296387,
-0.017351511865854263,
-0.2571586072444916,
-0.46990329027175903,
0.13319823145866394,
0.16286389529705048,
-0.6958994269371033,
-0.7332366704940796,
-0.6622211337089539,
-0.21413455903... |
yentinglin/Taiwan-LLM-7B-v2.0.1-chat | yentinglin | 2023-11-29T06:02:03Z | 1,001 | 23 | null | [
"transformers",
"safetensors",
"llama",
"text-generation",
"zh",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | 2023-11-29T06:02:03Z | 2023-10-10T16:30:19.000Z | null | null |
---
# For reference on model card metadata, see the spec: https://github.com/huggingface/hub-docs/blob/main/modelcard.md?plain=1
# Doc / guide: https://huggingface.co/docs/hub/model-cards
license: apache-2.0
language:
- zh
widget:
- text: >-
A chat between a curious user and an artificial intelligence assistant.
The assistant gives helpful, detailed, and polite answers to the user's
questions. USER: 你好,請問你可以幫我寫一封推薦信嗎? ASSISTANT:
library_name: transformers
pipeline_tag: text-generation
extra_gated_heading: Acknowledge license to accept the repository.
extra_gated_prompt: Please contact the author for access.
extra_gated_button_content: Acknowledge license 同意以上內容
extra_gated_fields:
Name: text
Mail: text
Organization: text
Country: text
Any utilization of the Taiwan LLM repository mandates the explicit acknowledgment and attribution to the original author: checkbox
使用Taiwan LLM必須明確地承認和歸功於優必達株式會社 Ubitus 以及原始作者: checkbox
---
<img src="https://cdn-uploads.huggingface.co/production/uploads/5df9c78eda6d0311fd3d541f/CmusIT5OlSXvFrbTJ7l-C.png" alt="Taiwan LLM Logo" width="800" style="margin-left:'auto' margin-right:'auto' display:'block'"/>
# 🌟 Checkout [Taiwan-LLM Demo Chat-UI](http://www.twllm.com) 🌟
# Model Card for Taiwan LLM 7B v2.0.1 chat
Taiwan LLM is an advanced language model tailored for Traditional Chinese, focusing on the linguistic and cultural contexts of Taiwan.
Developed from a large base model, it's enriched with diverse Taiwanese textual sources and refined through Supervised Fine-Tuning.
This model excels in language understanding and generation, aligning closely with Taiwan's cultural nuances.
It demonstrates improved performance on various benchmarks like TC-Eval, showcasing its contextual comprehension and cultural relevance.
For detailed insights into Taiwan LLM's development and features, refer to our [technical report](https://github.com/MiuLab/Taiwan-LLaMa/blob/main/twllm_paper.pdf).
## Model description
- **Model type:** A 7B parameter GPT-like model fine-tuned on a mix of publicly available, synthetic datasets.
- **Language(s) (NLP):** Primarily Traditional Chinese (zh-tw)
- **Finetuned from model:** [yentinglin/Taiwan-LLM-7B-v2.0-base](https://huggingface.co/yentinglin/Taiwan-LLM-7B-v2.0-base)
### Model Sources
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/MiuLab/Taiwan-LLaMa
- **Demo:** https://twllm.com/
## Performance

## Intended uses
Here's how you can run the model using the `pipeline()` function from 🤗 Transformers:
```python
# pip install transformers>=4.34
# pip install accelerate
import torch
from transformers import pipeline
pipe = pipeline("text-generation", model="yentinglin/Taiwan-LLM-7B-v2.0.1-chat", torch_dtype=torch.bfloat16, device_map="auto")
# We use the tokenizer's chat template to format each message - see https://huggingface.co/docs/transformers/main/en/chat_templating
messages = [
{
"role": "system",
"content": "你是一個人工智慧助理",
},
{"role": "user", "content": "東北季風如何影響台灣氣候?"},
]
prompt = pipe.tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
outputs = pipe(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
```
### Training hyperparameters



The following hyperparameters were used during training:
- learning_rate: 5e-05
- distributed_type: multi-GPU
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.03
- num_epochs: 5.0
## Citation
If you find Taiwan LLM is useful in your work, please cite it with:
```
@inproceedings{lin-chen-2023-llm,
title = "{LLM}-Eval: Unified Multi-Dimensional Automatic Evaluation for Open-Domain Conversations with Large Language Models",
author = "Lin, Yen-Ting and Chen, Yun-Nung",
booktitle = "Proceedings of the 5th Workshop on NLP for Conversational AI (NLP4ConvAI 2023)",
month = jul,
year = "2023",
address = "Toronto, Canada",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2023.nlp4convai-1.5",
pages = "47--58"
}
@misc{taiwanllama,
author={Lin, Yen-Ting and Chen, Yun-Nung},
title={Language Models for Taiwanese Culture},
year={2023},
url={https://github.com/MiuLab/Taiwan-LLaMa},
note={Code and models available at https://github.com/MiuLab/Taiwan-LLaMa},
}
```
# Acknowledgement
Taiwan LLM v2 is conducted in collaboration with [Ubitus K.K.](http://ubitus.net). Ubitus provides valuable compute resources for the project.
| null | transformers | text-generation | null | null | null | null | null | null | null | null | null | yentinglin/Taiwan-LLM-7B-v2.0.1-chat | [
-0.384193480014801,
-0.9690145254135132,
0.3245360553264618,
0.46791544556617737,
-0.49359020590782166,
0.06574080139398575,
-0.45467880368232727,
-0.5814841389656067,
0.39335891604423523,
0.4424402117729187,
-0.4445461332798004,
-0.6840085983276367,
-0.5195119380950928,
0.0497580096125602... |
DTAI-KULeuven/robbert-2022-dutch-base | DTAI-KULeuven | 2023-11-29T10:55:44Z | 953 | 9 | null | [
"transformers",
"pytorch",
"safetensors",
"roberta",
"fill-mask",
"Dutch",
"Flemish",
"RoBERTa",
"RobBERT",
"nl",
"dataset:oscar",
"dataset:dbrd",
"dataset:lassy-ud",
"dataset:europarl-mono",
"dataset:conll2002",
"arxiv:2211.08192",
"arxiv:2001.06286",
"arxiv:1907.11692",
"arxiv:... | 2023-11-29T10:55:44Z | 2022-08-15T09:48:36.000Z | null | null | ---
language: "nl"
thumbnail: "https://github.com/iPieter/RobBERT/raw/master/res/robbert_2022_logo.png"
tags:
- Dutch
- Flemish
- RoBERTa
- RobBERT
license: mit
datasets:
- oscar
- dbrd
- lassy-ud
- europarl-mono
- conll2002
widget:
- text: "Hallo, ik ben RobBERT-2022, het nieuwe <mask> taalmodel van de KU Leuven."
---
<p align="center">
<img src="https://github.com/iPieter/RobBERT/raw/master/res/robbert_2022_logo_with_name.png" alt="RobBERT-2022: Updating a Dutch Language Model to Account for Evolving Language Use" width="75%">
</p>
# RobBERT-2022: Updating a Dutch Language Model to Account for Evolving Language Use.
RobBERT-2022 is the latest release of the [Dutch RobBERT model](https://pieter.ai/robbert/).
It was created by further pretraining the original [pdelobelle/robbert-v2-dutch-base](https://huggingface.co/pdelobelle/robbert-v2-dutch-base) model on the 2022 version of the OSCAR corpus.
Thanks to this more recent dataset, this [DTAI-KULeuven/robbert-2022-dutch-base](https://huggingface.co/DTAI-KULeuven/robbert-2022-dutch-base) model shows increased performance on several tasks related to recent events, e.g. COVID-19-related tasks.
We also found that for some tasks that do not contain more recent information than 2019, the original [pdelobelle/robbert-v2-dutch-base](https://huggingface.co/pdelobelle/robbert-v2-dutch-base) RobBERT model can still outperform this newer one.
The original RobBERT model was released in January 2020. Dutch has evolved a lot since then, for example the COVID-19 pandemic introduced a wide range of new words that were suddenly used daily. Also, many other world facts that the original model considered true have also changed. To account for this and other changes in usage, we release a new Dutch BERT model trained on data from 2022: RobBERT 2022.
More in-depth information about RobBERT-2022 can be found in our [blog post](https://pieter.ai/robbert-2022/), [our paper](http://arxiv.org/abs/2211.08192), [the original RobBERT paper](https://arxiv.org/abs/2001.06286) and [the RobBERT Github repository](https://github.com/iPieter/RobBERT).
## How to use
RobBERT-2022 and RobBERT both use the [RoBERTa](https://arxiv.org/abs/1907.11692) architecture and pre-training but with a Dutch tokenizer and training data. RoBERTa is the robustly optimized English BERT model, making it even more powerful than the original BERT model. Given this same architecture, RobBERT can easily be finetuned and run for inference using [code to finetune RoBERTa](https://huggingface.co/transformers/model_doc/roberta.html) models and most code used for BERT models, e.g. as provided by the [HuggingFace Transformers](https://huggingface.co/transformers/) library.
By default, RobBERT-2022 has the masked language model head used in training. This can be used as a zero-shot way to fill masks in sentences. It can be tested out for free on [RobBERT's hosted inference API on Hugging Face](https://huggingface.co/pdelobelle/robbert-v2-dutch-base?text=De+hoofdstad+van+Belgi%C3%AB+is+%3Cmask%3E.). You can also create a new prediction head for your own task by using any of HuggingFace's [RoBERTa runners](https://huggingface.co/transformers/v2.7.0/examples.html#language-model-training) or [their fine-tuning notebooks](https://huggingface.co/transformers/v4.1.1/notebooks.html), changing the model name to `DTAI-KULeuven/robbert-2022-dutch-base`.
```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
tokenizer = AutoTokenizer.from_pretrained("DTAI-KULeuven/robbert-2022-dutch-base")
model = AutoModelForSequenceClassification.from_pretrained("DTAI-KULeuven/robbert-2022-dutch-base")
```
You can then use most of [HuggingFace's BERT-based notebooks](https://huggingface.co/transformers/v4.1.1/notebooks.html) for finetuning RobBERT-2022 on your type of Dutch language dataset.
## Comparison of Available Dutch BERT models
There is a wide variety of Dutch BERT-based models available for fine-tuning on your tasks.
Here's a quick summary to find the one that suits your need:
- [pdelobelle/robbert-v2-dutch-base](https://huggingface.co/pdelobelle/robbert-v2-dutch-base): The RobBERT model has for years been the best performing BERT-like model for most language tasks. It is trained on a large Dutch webcrawled dataset (OSCAR) and uses the superior [RoBERTa](https://huggingface.co/docs/transformers/model_doc/roberta) architecture, which robustly optimized the original [BERT model](https://huggingface.co/docs/transformers/model_doc/bert).
- [DTAI-KULeuven/robbertje-1-gb-merged](https://huggingface.co/DTAI-KULeuven/robbertje-1-gb-merged): The RobBERTje model is a distilled version of RobBERT, about half the size and four times faster at inference. This can help deploy more scalable language models for your language task.
- [DTAI-KULeuven/robbert-2022-dutch-base](https://huggingface.co/DTAI-KULeuven/robbert-2022-dutch-base): The RobBERT-2022 is a further pre-trained RobBERT model on the OSCAR2022 dataset. It is helpful for tasks that rely on words and/or information about more recent events.
There's also the [GroNLP/bert-base-dutch-cased](https://huggingface.co/GroNLP/bert-base-dutch-cased) "BERTje" model. This model uses the outdated basic BERT model, and is trained on a smaller corpus of clean Dutch texts.
Thanks to RobBERT's more recent architecture as well as its larger and more real-world-like training corpus, most researchers and practitioners seem to achieve higher performance on their language tasks with the RobBERT model.
## Technical Details From The Paper
### Our Performance Evaluation Results
All experiments are described in more detail in our [paper](https://arxiv.org/abs/2001.06286), with the code in [our GitHub repository](https://github.com/iPieter/RobBERT).
### Sentiment analysis
Predicting whether a review is positive or negative using the [Dutch Book Reviews Dataset](https://github.com/benjaminvdb/110kDBRD).
| Model | Accuracy [%] |
|-------------------|--------------------------|
| ULMFiT | 93.8 |
| BERTje | 93.0 |
| RobBERT v2 | 94.4 |
| RobBERT 2022 | **95.1** |
### Die/Dat (coreference resolution)
We measured how well the models are able to do coreference resolution by predicting whether "die" or "dat" should be filled into a sentence.
For this, we used the [EuroParl corpus](https://www.statmt.org/europarl/).
#### Finetuning on whole dataset
| Model | Accuracy [%] | F1 [%] |
|-------------------|--------------------------|--------------|
| [Baseline](https://arxiv.org/abs/2001.02943) (LSTM) | | 75.03 |
| mBERT | 98.285 | 98.033 |
| BERTje | 98.268 | 98.014 |
| RobBERT v2 | **99.232** | **99.121** |
| RobBERT 2022 | 97.8 | |
#### Finetuning on 10K examples
We also measured the performance using only 10K training examples.
This experiment clearly illustrates that RobBERT outperforms other models when there is little data available.
| Model | Accuracy [%] | F1 [%] |
|-------------------|--------------------------|--------------|
| mBERT | 92.157 | 90.898 |
| BERTje | 93.096 | 91.279 |
| RobBERT v2 | **97.816** | **97.514** |
#### Using zero-shot word masking task
Since BERT models are pre-trained using the word masking task, we can use this to predict whether "die" or "dat" is more likely.
This experiment shows that RobBERT has internalised more information about Dutch than other models.
| Model | Accuracy [%] |
|-------------------|--------------------------|
| ZeroR | 66.70 |
| mBERT | 90.21 |
| BERTje | 94.94 |
| RobBERT v2 | **98.75** |
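Concretely, the zero-shot comparison reduces to scoring both candidates for the masked position and keeping the more probable one. A minimal sketch follows; the scores here are illustrative — in a real run they would come from a `fill-mask` pipeline restricted to the two candidates:

```python
def pick_die_dat(scores):
    """Return whichever of 'die'/'dat' gets more mask-fill probability."""
    return max(("die", "dat"), key=lambda w: scores.get(w, 0.0))

# Illustrative scores; a real run would fill these from the model, e.g.:
#   from transformers import pipeline
#   pipe = pipeline("fill-mask", model="DTAI-KULeuven/robbert-2022-dutch-base")
#   preds = pipe("Ik zie <mask> huis.", targets=["die", "dat"])
#   scores = {p["token_str"].strip(): p["score"] for p in preds}
scores = {"die": 0.07, "dat": 0.85}
print(pick_die_dat(scores))
```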
### Part-of-Speech Tagging
Using the [Lassy UD dataset](https://universaldependencies.org/treebanks/nl_lassysmall/index.html).
| Model | Accuracy [%] |
|-------------------|--------------------------|
| Frog | 91.7 |
| mBERT | **96.5** |
| BERTje | 96.3 |
| RobBERT v2 | 96.4 |
| RobBERT 2022 | 96.1 |
## Credits and citation
This project is created by [Pieter Delobelle](https://people.cs.kuleuven.be/~pieter.delobelle), [Thomas Winters](https://thomaswinters.be) and [Bettina Berendt](https://people.cs.kuleuven.be/~bettina.berendt/).
If you would like to cite our paper or model, you can use the following BibTeX:
```
@inproceedings{delobelle2022robbert2022,
doi = {10.48550/ARXIV.2211.08192},
url = {https://arxiv.org/abs/2211.08192},
author = {Delobelle, Pieter and Winters, Thomas and Berendt, Bettina},
keywords = {Computation and Language (cs.CL), Machine Learning (cs.LG), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {RobBERT-2022: Updating a Dutch Language Model to Account for Evolving Language Use},
venue = {arXiv},
year = {2022},
}
@inproceedings{delobelle2020robbert,
title = "{R}ob{BERT}: a {D}utch {R}o{BERT}a-based {L}anguage {M}odel",
author = "Delobelle, Pieter and
Winters, Thomas and
Berendt, Bettina",
booktitle = "Findings of the Association for Computational Linguistics: EMNLP 2020",
month = nov,
year = "2020",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/2020.findings-emnlp.292",
doi = "10.18653/v1/2020.findings-emnlp.292",
pages = "3255--3265"
}
``` | null | transformers | fill-mask | null | null | null | null | null | null | null | null | null | DTAI-KULeuven/robbert-2022-dutch-base | [
-0.3619382381439209,
-0.9262018203735352,
0.031682778149843216,
0.30993786454200745,
-0.20272774994373322,
-0.11032143235206604,
-0.3580532968044281,
-0.9111288189888,
0.2907639443874359,
0.3496769964694977,
-0.37686604261398315,
-0.2660410404205322,
-0.8041829466819763,
-0.024170476943254... |
allenai/tulu-2-dpo-70b | allenai | 2023-11-29T06:37:00Z | 846 | 68 | null | [
"transformers",
"pytorch",
"llama",
"text-generation",
"en",
"dataset:HuggingFaceH4/ultrafeedback_binarized",
"dataset:allenai/tulu-v2-sft-mixture",
"arxiv:2305.18290",
"arxiv:2311.10702",
"base_model:meta-llama/Llama-2-70b-hf",
"endpoints_compatible",
"has_space",
"text-generation-inference... | 2023-11-29T06:37:00Z | 2023-11-12T21:34:51.000Z | null | null | ---
model-index:
- name: tulu-2-dpo-70b
results: []
datasets:
- HuggingFaceH4/ultrafeedback_binarized
- allenai/tulu-v2-sft-mixture
language:
- en
base_model: meta-llama/Llama-2-70b-hf
---
<img src="https://huggingface.co/datasets/allenai/blog-images/resolve/main/tulu-v2/Tulu%20V2%20banner.png" alt="TuluV2 banner" width="800" style="margin-left:'auto' margin-right:'auto' display:'block'"/>
# Model Card for Tulu V2 DPO 70B
Tulu is a series of language models that are trained to act as helpful assistants.
Tulu V2 DPO 70B is a fine-tuned version of Llama 2 that was trained on a mix of publicly available, synthetic and human datasets using [Direct Preference Optimization (DPO)](https://arxiv.org/abs/2305.18290).
This model is a strong alternative to Llama 2 70b Chat.
For more details, read the paper: [Camels in a Changing Climate: Enhancing LM Adaptation with Tulu 2](https://arxiv.org/abs/2311.10702).
## Model description
- **Model type:** The flagship model of a suite of instruction and RLHF tuned chat models on a mix of publicly available, synthetic and human-created datasets.
- **Language(s) (NLP):** Primarily English
- **License:** [AI2 ImpACT](https://allenai.org/impact-license) Low-risk license.
- **Finetuned from model:** [meta-llama/Llama-2-70b-hf](https://huggingface.co/meta-llama/Llama-2-70b-hf)
### Model Sources
- **Repository:** https://github.com/allenai/open-instruct
- **DPO Recipe:** The DPO recipe is from the [Zephyr Beta](https://huggingface.co/HuggingFaceH4/zephyr-7b-beta) model
- **Model Family:** Other models and the dataset are found in the [Tulu V2 collection](https://huggingface.co/collections/allenai/tulu-v2-suite-6551b56e743e6349aab45101).
## Performance
| Model | Size | Alignment | MT-Bench (score) | AlpacaEval (win rate %) |
|-------------|-----|----|---------------|--------------|
| **Tulu-v2-7b** 🐪 | **7B** | **SFT** | **6.30** | **73.9** |
| **Tulu-v2-dpo-7b** 🐪 | **7B** | **DPO** | **6.29** | **85.1** |
| **Tulu-v2-13b** 🐪 | **13B** | **SFT** | **6.70** | **78.9** |
| **Tulu-v2-dpo-13b** 🐪 | **13B** | **DPO** | **7.00** | **89.5** |
| **Tulu-v2-70b** 🐪 | **70B** | **SFT** | **7.49** | **86.6** |
| **Tulu-v2-dpo-70b** 🐪 | **70B** | **DPO** | **7.89** | **95.1** |
## Input Format
The model is trained to use the following format (note the newlines):
```
<|user|>
Your message here!
<|assistant|>
```
For best results, format all inputs in this manner. **Make sure to include a newline after `<|assistant|>`; this can affect generation quality quite a bit.**
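As a sketch, the format above can be produced with a small helper (the function name is ours, not from the Tulu codebase); note the trailing newline after `<|assistant|>`:

```python
def format_tulu_prompt(user_message: str) -> str:
    """Render a single user turn in the Tulu V2 input format.
    The newline after <|assistant|> matters for generation quality."""
    return f"<|user|>\n{user_message}\n<|assistant|>\n"

print(repr(format_tulu_prompt("Your message here!")))
```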
## Intended uses & limitations
The model was initially fine-tuned on a filtered and preprocessed version of the [Tulu V2 mix dataset](https://huggingface.co/datasets/allenai/tulu-v2-sft-mixture), which contains a diverse range of human created instructions and synthetic dialogues generated primarily by other LLMs.
We then further aligned the model with a [Jax DPO trainer](https://github.com/hamishivi/EasyLM/blob/main/EasyLM/models/llama/llama_train_dpo.py) built on [EasyLM](https://github.com/young-geng/EasyLM) on the [openbmb/UltraFeedback](https://huggingface.co/datasets/openbmb/UltraFeedback) dataset, which contains 64k prompts and model completions that are ranked by GPT-4.
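For intuition, DPO optimizes a logistic loss over the chosen/rejected log-probability margins relative to the frozen reference model. The snippet below is a minimal numeric sketch of the objective from the cited DPO paper, not the project's actual Jax trainer:

```python
import math

def dpo_loss(policy_chosen_lp, policy_rejected_lp,
             ref_chosen_lp, ref_rejected_lp, beta=0.1):
    """DPO loss for one preference pair.

    Each argument is the summed log-probability of a full completion
    under the policy or the frozen reference model.
    """
    # Implicit reward margins relative to the reference model
    chosen_margin = policy_chosen_lp - ref_chosen_lp
    rejected_margin = policy_rejected_lp - ref_rejected_lp
    logits = beta * (chosen_margin - rejected_margin)
    # -log(sigmoid(logits)), written in a numerically stable form
    return math.log1p(math.exp(-logits))

# When the policy already prefers the chosen completion, the loss is small:
print(dpo_loss(-10.0, -30.0, -20.0, -20.0))
```

The `beta` default is illustrative; the actual value used in training is set by the recipe, not this sketch.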
<!-- You can find the datasets used for training Tulu V2 [here]()
Here's how you can run the model using the `pipeline()` function from 🤗 Transformers:
```python
# Install transformers from source - only needed for versions <= v4.34
# pip install git+https://github.com/huggingface/transformers.git
# pip install accelerate
import torch
from transformers import pipeline
pipe = pipeline("text-generation", model="HuggingFaceH4/tulu-2-dpo-70b", torch_dtype=torch.bfloat16, device_map="auto")
# We use the tokenizer's chat template to format each message - see https://huggingface.co/docs/transformers/main/en/chat_templating
messages = [
{
"role": "system",
"content": "You are a friendly chatbot who always responds in the style of a pirate",
},
{"role": "user", "content": "How many helicopters can a human eat in one sitting?"},
]
prompt = pipe.tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
outputs = pipe(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
# <|system|>
# You are a friendly chatbot who always responds in the style of a pirate.</s>
# <|user|>
# How many helicopters can a human eat in one sitting?</s>
# <|assistant|>
# Ah, me hearty matey! But yer question be a puzzler! A human cannot eat a helicopter in one sitting, as helicopters are not edible. They be made of metal, plastic, and other materials, not food!
```-->
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
The Tulu models have not been aligned to generate safe completions within the RLHF phase or deployed with in-the-loop filtering of responses like ChatGPT, so the model can produce problematic outputs (especially when prompted to do so).
The size and composition of the corpus used to train the base Llama 2 models is also unknown; however, it likely included a mix of web data and technical sources like books and code. See the [Falcon 180B model card](https://huggingface.co/tiiuae/falcon-180B#training-data) for an example of this.
### Training hyperparameters
The following hyperparameters were used during DPO training:
- learning_rate: 5e-07
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3.0
## Citation
If you find Tulu 2 is useful in your work, please cite it with:
```
@misc{ivison2023camels,
title={Camels in a Changing Climate: Enhancing LM Adaptation with Tulu 2},
author={Hamish Ivison and Yizhong Wang and Valentina Pyatkin and Nathan Lambert and Matthew Peters and Pradeep Dasigi and Joel Jang and David Wadden and Noah A. Smith and Iz Beltagy and Hannaneh Hajishirzi},
year={2023},
eprint={2311.10702},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
*Model card adapted from [Zephyr Beta](https://huggingface.co/HuggingFaceH4/zephyr-7b-beta/blob/main/README.md)* | null | transformers | text-generation | null | null | null | null | null | null | null | null | null | allenai/tulu-2-dpo-70b | [
-0.25742435455322266,
-0.7044185996055603,
-0.08892272412776947,
0.21436674892902374,
-0.3226727247238159,
0.059040363878011703,
0.007290001027286053,
-0.6877930164337158,
0.16640853881835938,
0.1527809202671051,
-0.4547768235206604,
-0.12809616327285767,
-0.6711880564689636,
0.05598389357... |
tkcho/cp-commerce-clf-kr-sku-brand-584e9c24b3db22d85b064e58672d85c8 | tkcho | 2023-11-29T19:57:15Z | 840 | 0 | null | [
"transformers",
"pytorch",
"bert",
"text-classification",
"endpoints_compatible",
"region:us"
] | 2023-11-29T19:57:15Z | 2023-11-10T13:18:42.000Z | null | null | Entry not found | null | transformers | text-classification | null | null | null | null | null | null | null | null | null | tkcho/cp-commerce-clf-kr-sku-brand-584e9c24b3db22d85b064e58672d85c8 | [
-0.3227648437023163,
-0.2256842851638794,
0.8622258305549622,
0.4346150755882263,
-0.5282991528511047,
0.7012966275215149,
0.7915719151496887,
0.07618607580661774,
0.774602472782135,
0.25632160902023315,
-0.7852813005447388,
-0.22573809325695038,
-0.910448431968689,
0.571567177772522,
-0... |
badmonk/mixmi | badmonk | 2023-11-29T04:27:54Z | 821 | 1 | null | [
"diffusers",
"endpoints_compatible",
"diffusers:StableDiffusionPipeline",
"region:us"
] | 2023-11-29T04:27:54Z | 2023-11-29T04:22:39.000Z | null | null | ---
license: creativeml-openrail-m
tags:
- text-to-image
- stable-diffusion
---
### mixmi Dreambooth model trained by badmonk with TheLastBen's fast-DreamBooth notebook
| null | diffusers | null | null | null | null | null | null | null | null | null | null | badmonk/mixmi | [
-0.44162893295288086,
-0.7936329245567322,
0.39511793851852417,
0.6592450737953186,
-0.6902657747268677,
0.13408419489860535,
0.05500367656350136,
-0.4518316090106964,
0.4639265239238739,
0.4752938449382782,
-0.3751230239868164,
-0.4421042799949646,
-0.7992390394210815,
-0.5496045351028442... |
yentinglin/Taiwan-LLM-13B-v2.0-chat | yentinglin | 2023-11-29T06:02:38Z | 812 | 7 | null | [
"transformers",
"safetensors",
"llama",
"text-generation",
"zh",
"license:apache-2.0",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | 2023-11-29T06:02:38Z | 2023-10-17T12:31:57.000Z | null | null |
---
# For reference on model card metadata, see the spec: https://github.com/huggingface/hub-docs/blob/main/modelcard.md?plain=1
# Doc / guide: https://huggingface.co/docs/hub/model-cards
license: apache-2.0
language:
- zh
widget:
- text: >-
A chat between a curious user and an artificial intelligence assistant.
The assistant gives helpful, detailed, and polite answers to the user's
questions. USER: 你好,請問你可以幫我寫一封推薦信嗎? ASSISTANT:
library_name: transformers
pipeline_tag: text-generation
extra_gated_heading: Acknowledge license to accept the repository.
extra_gated_prompt: Please contact the author for access.
extra_gated_button_content: Acknowledge license 同意以上內容
extra_gated_fields:
Name: text
Mail: text
Organization: text
Country: text
Any utilization of the Taiwan LLM repository mandates the explicit acknowledgment and attribution to the original author: checkbox
使用Taiwan LLM必須明確地承認和歸功於優必達株式會社 Ubitus 以及原始作者: checkbox
---
<img src="https://cdn-uploads.huggingface.co/production/uploads/5df9c78eda6d0311fd3d541f/CmusIT5OlSXvFrbTJ7l-C.png" alt="Taiwan LLM Logo" width="800" style="margin-left:'auto' margin-right:'auto' display:'block'"/>
# 🌟 Checkout [Taiwan-LLM Demo Chat-UI](http://www.twllm.com) 🌟
# Model Card for Taiwan LLM 13B v2.0 chat
Taiwan LLM is an advanced language model tailored for Traditional Chinese, focusing on the linguistic and cultural contexts of Taiwan.
Developed from a large base model, it's enriched with diverse Taiwanese textual sources and refined through Supervised Fine-Tuning.
This model excels in language understanding and generation, aligning closely with Taiwan's cultural nuances.
It demonstrates improved performance on various benchmarks like TC-Eval, showcasing its contextual comprehension and cultural relevance.
For detailed insights into Taiwan LLM's development and features, refer to our [technical report](https://github.com/MiuLab/Taiwan-LLaMa/blob/main/twllm_paper.pdf).
## Model description
- **Model type:** A 13B parameter GPT-like model fine-tuned on a mix of publicly available, synthetic datasets.
- **Language(s) (NLP):** Primarily Traditional Chinese (zh-tw)
- **Finetuned from model:** [yentinglin/Taiwan-LLM-13B-v2.0-base](https://huggingface.co/yentinglin/Taiwan-LLM-13B-v2.0-base)
### Model Sources
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/MiuLab/Taiwan-LLaMa
- **Demo:** https://twllm.com/
## Performance

## Intended uses
Here's how you can run the model using the `pipeline()` function from 🤗 Transformers:
```python
# pip install transformers>=4.34
# pip install accelerate
import torch
from transformers import pipeline
pipe = pipeline("text-generation", model="yentinglin/Taiwan-LLM-13B-v2.0-chat", torch_dtype=torch.bfloat16, device_map="auto")
# We use the tokenizer's chat template to format each message - see https://huggingface.co/docs/transformers/main/en/chat_templating
messages = [
{
"role": "system",
"content": "你是一個人工智慧助理",
},
{"role": "user", "content": "東北季風如何影響台灣氣候?"},
]
prompt = pipe.tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
outputs = pipe(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
```
### Training hyperparameters



The following hyperparameters were used during training:
- learning_rate: 5e-05
- distributed_type: multi-GPU
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.03
- num_epochs: 5.0
## Citation
If you find Taiwan LLM useful in your work, please cite it with:
```
@inproceedings{lin-chen-2023-llm,
title = "{LLM}-Eval: Unified Multi-Dimensional Automatic Evaluation for Open-Domain Conversations with Large Language Models",
author = "Lin, Yen-Ting and Chen, Yun-Nung",
booktitle = "Proceedings of the 5th Workshop on NLP for Conversational AI (NLP4ConvAI 2023)",
month = jul,
year = "2023",
address = "Toronto, Canada",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2023.nlp4convai-1.5",
pages = "47--58"
}
@misc{taiwanllama,
author={Lin, Yen-Ting and Chen, Yun-Nung},
title={Language Models for Taiwanese Culture},
year={2023},
url={https://github.com/MiuLab/Taiwan-LLaMa},
note={Code and models available at https://github.com/MiuLab/Taiwan-LLaMa},
}
```
# Acknowledgement
Taiwan LLM v2 is conducted in collaboration with [Ubitus K.K.](http://ubitus.net). Ubitus provides valuable compute resources for the project.
| null | transformers | text-generation | null | null | null | null | null | null | null | null | null | yentinglin/Taiwan-LLM-13B-v2.0-chat | [
-0.38043370842933655,
-0.9793344140052795,
0.32165807485580444,
0.4900784492492676,
-0.4789084494113922,
0.07184863090515137,
-0.46661460399627686,
-0.5886014699935913,
0.3897702991962433,
0.41718655824661255,
-0.4600169062614441,
-0.6736988425254822,
-0.5196977257728577,
0.048952125012874... |
badmonk/mxsa | badmonk | 2023-11-29T08:06:30Z | 798 | 1 | null | [
"diffusers",
"endpoints_compatible",
"diffusers:StableDiffusionPipeline",
"region:us"
] | 2023-11-29T08:06:30Z | 2023-11-29T07:59:06.000Z | null | null | ---
license: creativeml-openrail-m
tags:
- text-to-image
- stable-diffusion
---
### mxsa Dreambooth model trained by badmonk with TheLastBen's fast-DreamBooth notebook
| null | diffusers | null | null | null | null | null | null | null | null | null | null | badmonk/mxsa | [
-0.29198211431503296,
-0.9005880951881409,
0.6699969172477722,
0.5305265784263611,
-0.8001897931098938,
0.002319114049896598,
0.162977933883667,
-0.2279384881258011,
0.4155460000038147,
0.5127345323562622,
-0.4382941424846649,
-0.6672796607017517,
-0.7923075556755066,
-0.5198479294776917,
... |
tkcho/cp-commerce-clf-kr-sku-brand-f73bde0ee5839f7fe2c8480b83dbaff3 | tkcho | 2023-11-29T03:36:22Z | 680 | 0 | null | [
"transformers",
"pytorch",
"bert",
"text-classification",
"endpoints_compatible",
"region:us"
] | 2023-11-29T03:36:22Z | 2023-11-10T15:02:04.000Z | null | null | Entry not found | null | transformers | text-classification | null | null | null | null | null | null | null | null | null | tkcho/cp-commerce-clf-kr-sku-brand-f73bde0ee5839f7fe2c8480b83dbaff3 | [
-0.3227645754814148,
-0.22568456828594208,
0.862226128578186,
0.43461504578590393,
-0.52829909324646,
0.7012966871261597,
0.7915720343589783,
0.07618620246648788,
0.7746025323867798,
0.25632232427597046,
-0.7852811813354492,
-0.22573864459991455,
-0.910447895526886,
0.5715669393539429,
-... |
optimum/tiny_random_bert_neuronx | optimum | 2023-11-29T19:16:35Z | 640 | 0 | null | [
"transformers",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | 2023-11-29T19:16:35Z | 2023-05-31T09:11:44.000Z | null | null | Entry not found | null | transformers | feature-extraction | null | null | null | null | null | null | null | null | null | optimum/tiny_random_bert_neuronx | [
-0.3227645754814148,
-0.22568456828594208,
0.862226128578186,
0.43461504578590393,
-0.52829909324646,
0.7012966871261597,
0.7915720343589783,
0.07618620246648788,
0.7746025323867798,
0.25632232427597046,
-0.7852811813354492,
-0.22573864459991455,
-0.910447895526886,
0.5715669393539429,
-... |
OpenLLM-France/Claire-7B-0.1 | OpenLLM-France | 2023-11-29T19:02:42Z | 634 | 31 | null | [
"transformers",
"pytorch",
"falcon",
"text-generation",
"pretrained",
"conversational",
"fr",
"arxiv:2311.16840",
"base_model:tiiuae/falcon-7b",
"license:cc-by-nc-sa-4.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | 2023-11-29T19:02:42Z | 2023-11-09T17:43:49.000Z | null | null | ---
language:
- fr
license: cc-by-nc-sa-4.0
pipeline_tag: text-generation
base_model: tiiuae/falcon-7b
tags:
- pretrained
- conversational
widget:
- text: |-
- Bonjour Dominique, qu'allez-vous nous cuisiner aujourd'hui ?
- Bonjour Camille,
example_title: Request for a recipe
group: Dash
- text: |-
[Intervenant 1:] Bonjour Dominique, qu'allez-vous nous cuisiner aujourd'hui ?
[Intervenant 2:] Bonjour Camille,
example_title: Request for a recipe
group: Intervenant
- text: |-
[Camille:] Bonjour Dominique, qu'allez-vous nous cuisiner aujourd'hui ?
[Dominique:] Bonjour Camille,
example_title: Request for a recipe
group: FirstName
- text: |-
[Camille Durand:] Bonjour Dominique, qu'allez-vous nous cuisiner aujourd'hui ?
[Dominique Petit:] Bonjour Camille,
example_title: Request for a recipe
group: Named
inference:
parameters:
temperature: 1.0
max_new_tokens: 200
top_k: 10
---
# Claire-7B-0.1
**Claire-7B-0.1 is a 7B parameter causal decoder-only model built by [LINAGORA](https://labs.linagora.com/) and [OpenLLM-France](https://github.com/OpenLLM-France)**
**adapted from [Falcon-7b](https://huggingface.co/tiiuae/falcon-7b) on French conversational data.**
Quantized versions in GGUF format can be found in [TheBloke/Claire-7B-0.1-GGUF](https://huggingface.co/TheBloke/Claire-7B-0.1-GGUF).
Claire-7B-0.1 is a pretrained language model designed to be attuned to the dynamics of linguistic interactions in dialogue. Without further training, its expected use is to generate continuations of dialogues. Its main purpose is to serve as a base model for fine-tuning on dialogue generation (e.g., chat) and dialogue understanding (e.g., meeting summarization) tasks. Please note that due to its training, the model is prone to generate dialogues with disfluencies and other constructions common to spoken language.
* [Typical usage](#typical-usage)
  * [Typical prompts](#typical-prompts)
* [Training Details](#training-details)
  * [Training Data](#training-data)
  * [Training Procedure](#training-procedure)
* [Evaluation](#evaluation)
* [License](#license)
* [Acknowledgements](#acknowledgements)
* [Contact](#contact)
## Typical usage
```python
import transformers
import torch
model_name = "OpenLLM-France/Claire-7B-0.1"
tokenizer = transformers.AutoTokenizer.from_pretrained(model_name)
model = transformers.AutoModelForCausalLM.from_pretrained(model_name,
device_map="auto",
torch_dtype=torch.bfloat16,
load_in_4bit=True # For efficient inference, if supported by the GPU card
)
pipeline = transformers.pipeline("text-generation", model=model, tokenizer=tokenizer)
generation_kwargs = dict(
num_return_sequences=1, # Number of variants to generate.
return_full_text= False, # Do not include the prompt in the generated text.
max_new_tokens=200, # Maximum length for the output text.
do_sample=True, top_k=10, temperature=1.0, # Sampling parameters.
pad_token_id=tokenizer.eos_token_id, # Just to avoid a harmless warning.
)
prompt = """\
- Bonjour Dominique, qu'allez-vous nous cuisiner aujourd'hui ?
- Bonjour Camille,\
"""
completions = pipeline(prompt, **generation_kwargs)
for completion in completions:
print(prompt + " […]" + completion['generated_text'])
```
This will print something like:
```
- Bonjour Dominique, qu'allez-vous nous cuisiner aujourd'hui ?
- Bonjour Camille, […] je vous prépare un plat de saison, une daube provençale.
- Ah je ne connais pas cette recette.
- C'est très facile à préparer, vous n'avez qu'à mettre de l'eau dans une marmite, y mettre de l'oignon émincé, des carottes coupées en petits morceaux, et vous allez mettre votre viande de bœuf coupé en petits morceaux également.
- Je n'ai jamais cuisiné de viande de bœuf, mais c'est vrai que ça a l'air bien facile.
- Vous n'avez plus qu'à laisser mijoter, et ensuite il sera temps de servir les clients.
- Très bien.
```
You will need at least 6GB of VRAM to run inference using 4bit quantization (16GB of VRAM without 4bit quantization).
If you have trouble running this code, make sure you have recent versions of `torch`, `transformers` and `accelerate` (see [requirements.txt](requirements.txt)).
### Typical prompts
Claire-7B-0.1 was trained on diarized French conversations. During training, the dialogues were normalized in several formats. The possible formats for expected prompts are as follows:
A monologue can be specified as a single line prompt (though keep in mind that Claire might still return a dialogue because of its training):
```python
prompt = "Mesdames et messieurs les députés, chers collègues, bonsoir. Vous l'aurez peut-être remarqué, je cite rarement"
```
A dialogue between two speakers can be specified with one line per speech turn starting with a dash:
```python
prompt = """\
- Bonjour Dominique, qu'allez-vous nous cuisiner aujourd'hui ?
- Bonjour Camille,\
"""
```
A dialogue or multilogue (with two or more speakers) can be specified with lines that start with `[Intervenant X:]` where `X` is a number:
```python
prompt = """\
[Intervenant 1:] Bonjour Dominique, qu'allez-vous nous cuisiner aujourd'hui ?
[Intervenant 2:] Bonjour Camille,\
"""
```
A dialogue or multilogue with named speakers can be specified with lines that start with `[SpeakerName:]`
where `SpeakerName` can be a first name, a first and a last name, a nickname, a title…
```python
prompt = """\
[Mme Camille Durand:] Bonjour Dominique, qu'allez-vous nous cuisiner aujourd'hui ?
[Mr. Dominique Petit:] Bonjour Camille,\
"""
```
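All of the speaker-turn formats above can be produced programmatically. Here is a minimal helper — our own illustration, not part of the Claire codebase — that renders a list of `(speaker, text)` turns in any of the three styles:

```python
def format_dialogue(turns, style="dash"):
    """Render (speaker, text) turns in one of Claire's expected prompt formats.

    style="dash"     -> one line per turn starting with a dash (speakers implicit)
    style="named"    -> lines starting with [SpeakerName:]
    style="numbered" -> lines starting with [Intervenant X:] where X counts speakers
    """
    if style == "dash":
        return "\n".join(f"- {text}" for _, text in turns)
    if style == "named":
        return "\n".join(f"[{speaker}:] {text}" for speaker, text in turns)
    if style == "numbered":
        ids = {}  # map each speaker to a stable "Intervenant" number
        lines = []
        for speaker, text in turns:
            ids.setdefault(speaker, len(ids) + 1)
            lines.append(f"[Intervenant {ids[speaker]}:] {text}")
        return "\n".join(lines)
    raise ValueError(f"unknown style: {style}")

turns = [
    ("Camille", "Bonjour Dominique, qu'allez-vous nous cuisiner aujourd'hui ?"),
    ("Dominique", "Bonjour Camille,"),
]
print(format_dialogue(turns, style="numbered"))
```

This makes it easy to try the same dialogue under each prompt format and compare completions.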
## Training Details
### Training Data
The training dataset is available at [OpenLLM-France/Claire-Dialogue-French-0.1](https://huggingface.co/datasets/OpenLLM-France/Claire-Dialogue-French-0.1)
and described in ["The Claire French Dialogue Dataset" (2023)](https://arxiv.org/abs/2311.16840).
Claire-7B-0.1 was tuned from Falcon-7b on the following data distribution:
| **Data type** | **Words** | **Training Sampling Weight** | **Sources** |
|-------------------------------|------------|------------------------------|-----------------------------------------------------|
| Parliamentary Proceedings | 135M | 35% | Assemblée Nationale |
| Theatre | 16M | 18% | Théâtre Classique, Théâtre Gratuit |
| Interviews | 6.4M | 29% | TCOF, CFPP, CFPB (ORFEO), ACSYNT, PFC, Valibel (ORFEO), ESLO|
| Free Conversations | 2.2M | 10% | CRFP (ORFEO), OFROM (ORFEO), CID, Rhapsodie, ParisStories, PFC, CLAPI, C-ORAL-ROM (ORFEO), LinTO, ESLO |
| Meetings | 1.2M | 5% | SUMM-RE, LinTO, Réunions de travail (ORFEO) |
| Debates | 402k | <2% | FREDSum, ESLO |
| Assistance | 159k | <1% | Fleuron (ORFEO), Accueil UBS, OTG, ESLO |
| Presentation, Formal Address | 86k | <0.5% | Valibel (ORFEO), LinTO, ESLO |
Training data was augmented with the following techniques:
* varying the format used to indicate speech turns (dashes or [XXX:])
* substituting [Intervenant X:] for [SpeakerName:] or vice versa, where [SpeakerName:] might be a real name or a randomly generated name
* removing punctuation marks and/or casing (to prepare the model for transcripts produced by some Automatic Speech Recognition systems)
Long conversations were truncated at a maximum of 2048 tokens. Where possible, they were split between speaker turns.
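As a rough illustration, the punctuation/casing augmentation described above might look like the following sketch (hypothetical code, not taken from the training pipeline; note that `string.punctuation` covers only ASCII punctuation):

```python
import random
import string

def augment_line(line, rng, p_drop_punct=0.5, p_lowercase=0.5):
    """Randomly strip punctuation and/or casing from one speech turn,
    mimicking transcripts produced by some ASR systems."""
    if rng.random() < p_drop_punct:
        # Remove ASCII punctuation marks from the line.
        line = line.translate(str.maketrans("", "", string.punctuation))
    if rng.random() < p_lowercase:
        line = line.lower()
    return line

rng = random.Random(0)
print(augment_line("Bonjour Camille, comment allez-vous ?", rng))
```

Applying such perturbations independently per conversation exposes the model to both clean and ASR-like text.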
While the model has been trained and evaluated only on French dialogues, it may be able to generate conversations in other languages from the original Falcon-7b training data.
### Training Procedure
The training code is available at [https://github.com/OpenLLM-France/Lit-Claire](https://github.com/OpenLLM-France/Lit-Claire).
Claire-7B-0.1 is a causal decoder-only model trained on a causal language modeling task (i.e., predict the next token).
See [Falcon-7b](https://huggingface.co/tiiuae/falcon-7b) for more details.
Claire-7B-0.1 was trained on 1 A100 80GB GPU for about 50 GPU hours.
Hyperparameters were the following:
| **Hyperparameter** | **Value** |
|--------------------|------------|
| Precision | `bfloat16` |
| Optimizer | AdamW |
| Learning rate | 1e-4 |
| Weight decay | 1e-2 |
| Batch size | 132 |
| LoRA rank | 16 |
| LoRA alpha | 32 |
| Dropout | 0.05 |
| gradient clipping | 1 |
## Evaluation
To evaluate Claire-7B-0.1's ability to generate natural-sounding French conversations, we compared its responses to a variety of prompts with those of three other models:
* [Falcon-7b](https://huggingface.co/tiiuae/falcon-7b),
* [Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1)
* [Claire-Mistral-7B-0.1](https://huggingface.co/OpenLLM-France/Claire-Mistral-7B-0.1) (a version of Mistral-7B-v0.1 adapted in the same fashion as Claire-7B-0.1)
We tested an even mixture of monologue and dialogue-style prompts.
Each of the four generated responses was evaluated along three dimensions:
Interaction, Fluency and Relevance.
Evaluators were also asked to rank the four responses by preference.
Our results confirm that continual pre-training of Falcon-7b and Mistral-7B-v0.1 leads to improvement (relative to the base models) along all three evaluation dimensions and that Claire-7B-0.1 outperforms the adapted Mistral counterpart in the Fluency and Relevance categories
(and in the Interaction category if we focus on dialogue-style prompts).
Ranking results also reveal a clear subjective preference for Claire-7B-0.1,
as shown in the following table:
<!--| | **Claire-Falcon** | **Claire-Mistral** | **Falcon** | **Mistral** | -->
| | <span style="font-weight: normal">... over</span><br /> **Claire-Falcon** | <span style="font-weight: normal">... over</span><br /> **Claire-Mistral** | <span style="font-weight: normal">... over</span><br /> **Falcon** | <span style="font-weight: normal">... over</span><br /> **Mistral** |
|--------------------------------------|----------------------|-----------------------|---------------|---------------------|
| prefer<br /> **Claire-Falcon** ... | | **62.2%** | **63.9%** | **83.8%** |
| prefer<br /> **Claire-Mistral** ... | _34.8%_ | | **56.2%** | **75.3%** |
| prefer<br /> **Falcon** ... | _36.1%_ | _43.8%_ | | **81.4%** |
| prefer<br /> **Mistral** ... | _16.2%_ | _24.7%_ | _18.6%_ | |
(In this table,
"Claire-Falcon" stands for Claire-7B-0.1,
"Falcon", for [Falcon-7b](https://huggingface.co/tiiuae/falcon-7b),
"Mistral", for [Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1)
and "Claire-Mistral", for [Claire-Mistral-7B-0.1](https://huggingface.co/OpenLLM-France/Claire-Mistral-7B-0.1).)
Please note that the model can generate disfluencies and humorous responses as a result of its training on spoken and theatrical text.
More evaluation details will be provided in a separate publication.
## License
Given that some of the corpora used for training are only available under CC-BY-NC-SA licenses,
Claire-7B-0.1 is made available under the [CC-BY-NC-SA 4.0 license](https://creativecommons.org/licenses/by-nc-sa/4.0/).
You can find a variant of this model published under the Apache 2.0 license at [OpenLLM-France/Claire-7B-Apache-0.1](https://huggingface.co/OpenLLM-France/Claire-7B-Apache-0.1).
## Acknowledgements
This work was performed using HPC resources from GENCI–IDRIS (Grant 2023-AD011014561).
Claire-7B-0.1 was created by members of [LINAGORA](https://labs.linagora.com/) (in alphabetical order): Ismaïl Harrando, Julie Hunter, Jean-Pierre Lorré, Jérôme Louradour, Michel-Marie Maudet, Virgile Rennard, Guokan Shang.
Special thanks to partners from the OpenLLM-France community, especially Christophe Cerisara (LORIA), Pierre-Carl Langlais and Anastasia Stasenko (OpSci), and Pierre Colombo, for valuable advice.
## Contact
contact@openllm-france.fr | null | transformers | text-generation | null | null | null | null | null | null | null | null | null | OpenLLM-France/Claire-7B-0.1 | [
-0.38448378443717957,
-0.8795424103736877,
0.3410196900367737,
0.24896307289600372,
-0.04735638201236725,
-0.2578032612800598,
-0.2900561988353729,
-0.12587477266788483,
0.2092238813638687,
0.7550427913665771,
-0.6717768907546997,
-0.6947675943374634,
-0.4872341752052307,
0.245860099792480... |
tony4194/distilBERT-infoExtract | tony4194 | 2023-11-29T13:03:14Z | 612 | 0 | null | [
"transformers",
"pytorch",
"distilbert",
"token-classification",
"generated_from_trainer",
"en",
"dataset:conll2003",
"base_model:distilbert-base-cased",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2023-11-29T13:03:14Z | 2023-10-31T13:53:44.000Z | null | null | ---
license: apache-2.0
base_model: distilbert-base-cased
tags:
- generated_from_trainer
datasets:
- conll2003
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: distilBERT-infoExtract
results:
- task:
name: Token Classification
type: token-classification
dataset:
name: conll2003
type: conll2003
config: conll2003
split: validation
args: conll2003
metrics:
- name: Precision
type: precision
value: 0.9133716160787531
- name: Recall
type: recall
value: 0.9368899360484685
- name: F1
type: f1
value: 0.9249813076347928
- name: Accuracy
type: accuracy
value: 0.9832077471007241
language:
- en
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilBERT-infoExtract
This model is a fine-tuned version of [distilbert-base-cased](https://huggingface.co/distilbert-base-cased) on the conll2003 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0718
- Precision: 0.9134
- Recall: 0.9369
- F1: 0.9250
- Accuracy: 0.9832
## Model description
The model can identify person names, organizations, and locations (no temporal entity recognition). It was trained for 5 minutes on a T4 GPU on Colab.
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.0954 | 1.0 | 1756 | 0.0846 | 0.8880 | 0.9194 | 0.9034 | 0.9769 |
| 0.0498 | 2.0 | 3512 | 0.0699 | 0.9057 | 0.9310 | 0.9182 | 0.9815 |
| 0.031 | 3.0 | 5268 | 0.0718 | 0.9134 | 0.9369 | 0.9250 | 0.9832 |
### Framework versions
- Transformers 4.34.1
- Pytorch 2.1.0+cu118
- Datasets 2.14.6
- Tokenizers 0.14.1 | null | transformers | token-classification | null | null | null | null | null | null | null | null | null | tony4194/distilBERT-infoExtract | [
-0.45630669593811035,
-0.6713894605636597,
0.23850391805171967,
0.15279093384742737,
-0.21923616528511047,
-0.10395658016204834,
-0.0014353751903399825,
-0.29420825839042664,
0.06941358745098114,
0.14587636291980743,
-0.7577391266822815,
-0.6645355224609375,
-0.7291964888572693,
-0.0151007... |
tkcho/cp-commerce-clf-kr-sku-brand-998ba46f1dd295a1e3dd04def5cd2287 | tkcho | 2023-11-29T05:41:08Z | 573 | 0 | null | [
"transformers",
"pytorch",
"bert",
"text-classification",
"endpoints_compatible",
"region:us"
] | 2023-11-29T05:41:08Z | 2023-11-11T02:50:27.000Z | null | null | Entry not found | null | transformers | text-classification | null | null | null | null | null | null | null | null | null | tkcho/cp-commerce-clf-kr-sku-brand-998ba46f1dd295a1e3dd04def5cd2287 | [
-0.3227650821208954,
-0.22568479180335999,
0.8622263669967651,
0.4346153140068054,
-0.5282987952232361,
0.7012966871261597,
0.7915722727775574,
0.07618651539087296,
0.7746027112007141,
0.2563222348690033,
-0.7852821350097656,
-0.225738525390625,
-0.910447895526886,
0.5715667009353638,
-0... |
optimum/tiny_random_bert_neuron | optimum | 2023-11-29T19:26:21Z | 570 | 0 | null | [
"transformers",
"bert",
"feature-extraction",
"endpoints_compatible",
"region:us"
] | 2023-11-29T19:26:21Z | 2023-06-05T13:10:15.000Z | null | null | Entry not found | null | transformers | feature-extraction | null | null | null | null | null | null | null | null | null | optimum/tiny_random_bert_neuron | [
-0.3227650821208954,
-0.22568479180335999,
0.8622263669967651,
0.4346153140068054,
-0.5282987952232361,
0.7012966871261597,
0.7915722727775574,
0.07618651539087296,
0.7746027112007141,
0.2563222348690033,
-0.7852821350097656,
-0.225738525390625,
-0.910447895526886,
0.5715667009353638,
-0... |
tkcho/cp-commerce-clf-kr-sku-brand-a4b94aa2730451161c1b2ea6107ed86f | tkcho | 2023-11-29T17:31:29Z | 506 | 0 | null | [
"transformers",
"pytorch",
"bert",
"text-classification",
"endpoints_compatible",
"region:us"
] | 2023-11-29T17:31:29Z | 2023-11-11T06:07:30.000Z | null | null | Entry not found | null | transformers | text-classification | null | null | null | null | null | null | null | null | null | tkcho/cp-commerce-clf-kr-sku-brand-a4b94aa2730451161c1b2ea6107ed86f | [
-0.3227650821208954,
-0.22568479180335999,
0.8622263669967651,
0.4346153140068054,
-0.5282987952232361,
0.7012966871261597,
0.7915722727775574,
0.07618651539087296,
0.7746027112007141,
0.2563222348690033,
-0.7852821350097656,
-0.225738525390625,
-0.910447895526886,
0.5715667009353638,
-0... |
epfl-llm/meditron-7b | epfl-llm | 2023-11-29T16:49:04Z | 499 | 55 | null | [
"transformers",
"pytorch",
"llama",
"text-generation",
"en",
"dataset:epfl-llm/guidelines",
"arxiv:2311.16079",
"base_model:meta-llama/Llama-2-7b",
"license:llama2",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | 2023-11-29T16:49:04Z | 2023-11-08T16:03:23.000Z | null | null | ---
license: llama2
language:
- en
metrics:
- accuracy
- perplexity
datasets:
- epfl-llm/guidelines
base_model: meta-llama/Llama-2-7b
---
<img width=50% src="meditron_LOGO.png" alt="Alt text" title="Meditron-logo">
# Model Card for Meditron-7B-v1.0
Meditron is a suite of open-source medical Large Language Models (LLMs).
Meditron-7B is a 7 billion parameters model adapted to the medical domain from Llama-2-7B through continued pretraining on a comprehensively curated medical corpus, including selected PubMed articles, abstracts, a [new dataset](https://huggingface.co/datasets/epfl-llm/guidelines) of internationally-recognized medical guidelines, and general domain data from [RedPajama-v1](https://huggingface.co/datasets/togethercomputer/RedPajama-Data-1T).
Meditron-7B, finetuned on relevant training data, outperforms Llama-2-7B and PMC-Llama on multiple medical reasoning tasks.
<details open>
<summary><strong>Advisory Notice</strong></summary>
<blockquote style="background-color: #f2f2f2; padding: 10px; margin: 0 0 10px; border-left: 5px solid #ddd;">
While Meditron is designed to encode medical knowledge from sources of high-quality evidence, it is not yet adapted to deliver this knowledge appropriately, safely, or within professional actionable constraints. We recommend against deploying Meditron in medical applications without extensive use-case alignment, as well as additional testing, specifically including randomized controlled trials in real-world practice settings.
</blockquote>
</details>
## Model Details
- **Developed by:** [EPFL LLM Team](https://huggingface.co/epfl-llm)
- **Model type:** Causal decoder-only transformer language model
- **Language(s):** English (mainly)
- **Model License:** [LLAMA 2 COMMUNITY LICENSE AGREEMENT](https://huggingface.co/meta-llama/Llama-2-70b/raw/main/LICENSE.txt)
- **Code License:** [APACHE 2.0 LICENSE](LICENSE)
- **Continue-pretrained from model:** [Llama-2-7B](https://huggingface.co/meta-llama/Llama-2-7b)
- **Context length:** 2K tokens
- **Input:** Text-only data
- **Output:** Model generates text only
- **Status:** This is a static model trained on an offline dataset. Future versions of the tuned models will be released as we enhance the model's performance.
- **Knowledge Cutoff:** August 2023
### Model Sources
- **Repository:** [epflLLM/meditron](https://github.com/epfLLM/meditron)
- **Trainer:** [epflLLM/Megatron-LLM](https://github.com/epfLLM/Megatron-LLM)
- **Paper:** *[MediTron-70B: Scaling Medical Pretraining for Large Language Models](https://arxiv.org/abs/2311.16079)*
## Uses
Meditron-7B is being made available for further testing and assessment as an AI assistant to enhance clinical decision-making and enhance access to an LLM for healthcare use. Potential use cases may include but are not limited to:
- Medical exam question answering
- Supporting differential diagnosis
- Disease information (symptoms, cause, treatment) query
- General health information query
### Direct Use
It is possible to use this model to generate text, which is useful for experimentation and understanding its capabilities.
It should not be used directly for production or work that may impact people.
### Downstream Use
Meditron-7B is a foundation model that can be finetuned, instruction-tuned, or RLHF-tuned for specific downstream tasks and applications.
The main way we have used this model is finetuning for downstream question-answering tasks, but we encourage using this model for additional applications.
Specific formatting needs to be followed to prompt our finetuned models, including the `<|im_start|>`, `<|im_end|>` tags, and `system`, `question`, `answer` identifiers.
"""
<|im_start|>system
{system_message}<|im_end|>
<|im_start|>question
{prompt}<|im_end|>
<|im_start|>answer
"""
**Note**: the above formatting is not a requirement if you use your own formatting option for the finetuning of the model.
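For illustration, the template above can be assembled with a small helper (a sketch of our own; the function name is not from the Meditron repository):

```python
def build_prompt(system_message, question):
    """Assemble a Meditron chat prompt using the <|im_start|>/<|im_end|> format
    with the `system`, `question`, and `answer` identifiers."""
    return (
        f"<|im_start|>system\n{system_message}<|im_end|>\n"
        f"<|im_start|>question\n{question}<|im_end|>\n"
        f"<|im_start|>answer\n"
    )

print(build_prompt("You are a helpful medical assistant.",
                   "What are common symptoms of anemia?"))
```

The generated text should then be read up to the next `<|im_end|>` tag.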
### Out-of-Scope Use
We do not recommend using this model for natural language generation in a production environment, finetuned or otherwise.
## Truthfulness, Helpfulness, Risk, and Bias
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
We did an initial assessment of Meditron models' **Truthfulness** against baseline models and consumer-level medical models.
We use TruthfulQA (multiple choice) as the main evaluation benchmark.
We only focus on the categories that are relevant to the medical domain, including Health, Nutrition, Psychology, and Science.
For 7B models, we perform one-shot evaluations for consistent answer generation.
For 70B models, the evaluations are under the zero-shot setting.
Below, we report the detailed truthfulness performance of each category.
|Category | meditron-70b | llama-2-70b | med42-70b* | meditron-7b | llama-2-7b | PMC-llama-7b |
| --- | ------ |----- |----- |----- |----- |----- |
|Health | 81.8 | 69.1 | 83.6 | 27.3 | 16.4 | 3.6 |
|Nutrition | 77.9 | 68.8 | 62.5 | 31.1 | 12.5 | 6.3 |
|Psychology| 47.4 | 36.8 | 52.6 | 21.1 | 10.5 | 0.0 |
|Science | 77.8 | 44.4 | 33.3 | 33.3 | 11.1 | 0.0 |
|Avg | 71.2 | 54.8 | 58.0 | 28.3 | 12.6 | 2.5 |
| | | | | | | |
For a more detailed performance analysis, please see our paper.
Significant research is still required to fully explore potential bias, fairness, and safety issues with this language model.
Please recognize that our evaluation of Meditron-7B's helpfulness, risk, and bias is highly limited.
Thus, as we noted in the safety notice, we strongly advise against any deployment in medical applications without further alignment and rigorous evaluation!
### Recommendations
**IMPORTANT!**
Users (both direct and downstream) should be made aware of the risks, biases, and limitations of the model.
While this model is capable of generating natural language text, we have only begun to explore this capability and its limitations.
Understanding these limitations is especially important in a domain like medicine.
Therefore, we strongly recommend against using this model in production for natural language generation or for professional purposes related to health and medicine.
## Training Details
### Training Data
Meditron’s domain-adaptive pre-training corpus GAP-Replay combines 48.1B tokens from four corpora:
- [**Clinical Guidelines**](https://huggingface.co/datasets/epfl-llm/guidelines): a new dataset of 46K internationally-recognized clinical practice guidelines from various healthcare-related sources, including hospitals and international organizations.
- **Medical Paper Abstracts**: 16.1M abstracts extracted from closed-access PubMed and PubMed Central papers.
- **Medical Papers**: full-text articles extracted from 5M publicly available PubMed and PubMed Central papers.
- **Replay Data**: 400M tokens of general domain pretraining data sampled from [RedPajama-v1](https://huggingface.co/datasets/togethercomputer/RedPajama-Data-1T)
<img width=75% src="gap-replay.png" alt="Alt text" title="Meditron-logo">
#### Data Preprocessing
Please see the detailed preprocessing procedure in our paper.
### Training Procedure
We used the [Megatron-LLM](https://github.com/epfLLM/Megatron-LLM) distributed training library, a derivative of Nvidia's Megatron LM project, to optimize training efficiency.
Hardware consists of 1 node of 8x NVIDIA A100 (80GB) SXM GPUs connected by NVLink and NVSwitch with a single Nvidia ConnectX-6 DX network card and equipped with 2 x AMD EPYC 7543 32-Core Processors and 512 GB of RAM.
Our three-way parallelism scheme uses:
- Data Parallelism (DP -- different GPUs process different subsets of the batches) of 2,
- Pipeline Parallelism (PP -- different GPUs process different layers) of 4,
- Tensor Parallelism (TP -- different GPUs process different subtensors for matrix multiplication) of 1.
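The three degrees of parallelism multiply to give the total number of GPUs; as a quick sanity check (our own illustration, using the DP/PP/TP values listed just above):

```python
# Degrees of data, pipeline, and tensor parallelism for the 7B run.
dp, pp, tp = 2, 4, 1

# Each (dp, pp, tp) combination maps to one GPU, so the product
# must equal the total GPU count of the cluster.
total_gpus = dp * pp * tp
assert total_gpus == 8  # matches 1 node of 8x NVIDIA A100
print(total_gpus)
```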
#### Training Hyperparameters
| | |
| --- | ------ |
| bf16 | true |
| lr | 3e-4 |
| eps | 1e-5 |
| betas | \[0.9, 0.95\] |
| clip_grad | 1 |
| weight decay | 0.1 |
| DP size | 16 |
| TP size | 4 |
| PP size | 1 |
| seq length | 2048 |
| lr scheduler | cosine|
| min lr | 1e-6 |
| warmup iteration | 2000 |
| micro batch size | 10 |
| global batch size | 1600 |
| | |
#### Sizes
The model was trained in September 2023.
The model architecture is exactly Llama 2, meaning
| | |
| --- | ------ |
| Model size | 7B |
| Hidden dimension | 4096 |
| Num. attention heads | 32 |
| Num. layers | 32 |
| | |
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data & Metrics
#### Testing Data
- [MedQA (USMLE)](https://huggingface.co/datasets/bigbio/med_qa)
- [MedMCQA](https://huggingface.co/datasets/medmcqa)
- [PubMedQA](https://huggingface.co/datasets/bigbio/pubmed_qa)
- [MMLU-Medical](https://huggingface.co/datasets/lukaemon/mmlu)
- [MedQA-4-Option](https://huggingface.co/datasets/GBaker/MedQA-USMLE-4-options)
#### Metrics
- Accuracy: suited to the evaluation of multiple-choice question-answering tasks.
### Results
We finetune meditron-7b, llama-2-7b, pmc-llama-7b on each benchmark (pubmedqa, medmcqa, medqa)'s training data individually.
We report the finetuned models' performance with top token selection as the inference mode.
For MMLU-Medical, models finetuned on MedMCQA are used for inference.
For MedQA-4-Option, models finetuned on MedQA are used for inference.
For a more detailed performance analysis, please see our paper.
|Dataset | meditron-7b | llama-2-7b | pmc-llama-7b | Zephyr-7B-beta* | Mistral-7B-instruct* |
| --- | ------ |----- |----- |----- |----- |
|MMLU-Medical | 54.2 | 53.7 | 56.4 | 63.3 | 60.0 |
|PubMedQA | 74.4 | 61.8 | 59.2 | 46.0 | 17.8 |
|MedMCQA | 59.2 | 54.4 | 57.6 | 43.0 | 40.2 |
|MedQA | 47.9 | 44.0 | 42.4 | 42.8 | 32.4 |
|MedQA-4-Option| 52.0 | 49.6 | 49.2 | 48.5 | 41.1 |
|Avg | 57.5 | 52.7 | 53.0 | 48.7 | 38.3 |
| | | | | | |
**Note**: models with * are already instruction-tuned, so we exclude them from further finetuning on any training data.
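The "top token selection" inference mode mentioned above amounts to scoring each answer option by the model's score for that option's token and picking the argmax. A minimal sketch with dummy scores (our own illustration, not the evaluation harness):

```python
def top_token_answer(option_scores):
    """Pick the answer whose option token has the highest model score.

    option_scores: dict mapping option letters to the model's score
    (e.g., log-probability) for that letter as the next token.
    """
    return max(option_scores, key=option_scores.get)

# Dummy next-token scores for options A-D of one multiple-choice question.
scores = {"A": -2.1, "B": -0.4, "C": -3.3, "D": -1.7}
print(top_token_answer(scores))  # -> B
```

Accuracy is then the fraction of questions where the selected option matches the gold answer.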
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
- **Hardware Type:** 8 x NVIDIA A100 (80GB) SXM
- **Total GPU hours:** 588.8
- **Hardware Provider:** EPFL Research Computing Platform
- **Compute Region:** Switzerland
- **Carbon Emitted:** Switzerland has a carbon efficiency of 0.016 kgCO2/kWh (https://www.carbonfootprint.com/docs/2018_8_electricity_factors_august_2018_-_online_sources.pdf). 73.6 hours of training on 8 A100s amounts to 588.8 GPU-hours at a TDP of 400W. Assuming a Power Usage Effectiveness of 1.8, total emissions are estimated to be:
(400W / 1000W/kWh / GPU * 0.016 kgCO2/kWh * 73.6 h * 8 GPU) * 1.8 PUE = 6.8 kgCO2.
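The estimate can be reproduced in a few lines (using the 1.8 PUE that the formula applies):

```python
# Reproduce the carbon-emission estimate above.
tdp_kw = 400 / 1000          # A100 SXM TDP in kW
carbon_intensity = 0.016     # kgCO2 per kWh (Switzerland)
hours = 73.6                 # wall-clock training time
gpus = 8
pue = 1.8                    # power usage effectiveness

energy_kwh = tdp_kw * hours * gpus * pue      # ~423.9 kWh
emissions_kg = energy_kwh * carbon_intensity  # ~6.8 kgCO2
print(round(emissions_kg, 1))  # 6.8
```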
## Citation
**BibTeX:**
If you use Meditron or its training data, please cite our work:
```
@misc{chen2023meditron70b,
title={MEDITRON-70B: Scaling Medical Pretraining for Large Language Models},
author={Zeming Chen and Alejandro Hernández-Cano and Angelika Romanou and Antoine Bonnet and Kyle Matoba and Francesco Salvi and Matteo Pagliardini and Simin Fan and Andreas Köpf and Amirkeivan Mohtashami and Alexandre Sallinen and Alireza Sakhaeirad and Vinitra Swamy and Igor Krawczuk and Deniz Bayazit and Axel Marmet and Syrielle Montariol and Mary-Anne Hartley and Martin Jaggi and Antoine Bosselut},
year={2023},
eprint={2311.16079},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
@software{epfmedtrn,
author = {Zeming Chen and Alejandro Hernández-Cano and Angelika Romanou and Antoine Bonnet and Kyle Matoba and Francesco Salvi and Matteo Pagliardini and Simin Fan and Andreas Köpf and Amirkeivan Mohtashami and Alexandre Sallinen and Alireza Sakhaeirad and Vinitra Swamy and Igor Krawczuk and Deniz Bayazit and Axel Marmet and Syrielle Montariol and Mary-Anne Hartley and Martin Jaggi and Antoine Bosselut},
title = {MediTron-70B: Scaling Medical Pretraining for Large Language Models},
month = {November},
year = 2023,
url = {https://github.com/epfLLM/meditron}
}
``` | null | transformers | text-generation | null | null | null | null | null | null | null | null | null | epfl-llm/meditron-7b | [
-0.3902868330478668,
-0.7957016825675964,
0.48532959818840027,
-0.12763197720050812,
-0.5213152170181274,
-0.19020451605319977,
-0.2901688516139984,
-0.5983539819717407,
0.08409656584262848,
0.569098711013794,
-0.6006947159767151,
-0.6058794856071472,
-0.5912721157073975,
0.438161224126815... |
tkcho/cp-commerce-clf-kr-sku-brand-30bd0829024fe255c33474907faf28e9 | tkcho | 2023-11-29T21:11:33Z | 481 | 0 | null | [
"transformers",
"pytorch",
"bert",
"text-classification",
"endpoints_compatible",
"region:us"
] | 2023-11-29T21:11:33Z | 2023-11-11T07:45:14.000Z | null | null | Entry not found | null | transformers | text-classification | null | null | null | null | null | null | null | null | null | tkcho/cp-commerce-clf-kr-sku-brand-30bd0829024fe255c33474907faf28e9 | [
-0.3227651119232178,
-0.22568456828594208,
0.8622261881828308,
0.43461447954177856,
-0.5282989740371704,
0.7012965083122253,
0.7915719747543335,
0.0761861652135849,
0.7746025323867798,
0.25632235407829285,
-0.7852817177772522,
-0.22573819756507874,
-0.9104477763175964,
0.5715669393539429,
... |
digiplay/asyncsMIX_v2 | digiplay | 2023-11-29T18:56:45Z | 464 | 5 | null | [
"diffusers",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-image",
"license:other",
"endpoints_compatible",
"has_space",
"diffusers:StableDiffusionPipeline",
"region:us"
] | 2023-11-29T18:56:45Z | 2023-08-20T18:34:31.000Z | null | null | ---
license: other
tags:
- stable-diffusion
- stable-diffusion-diffusers
- text-to-image
- diffusers
inference: true
---
Model info:
https://civitai.com/models/114807?modelVersionId=144932
Sample image I made through Hugging Face's API:

Original Author's DEMO images :




| null | diffusers | text-to-image | null | null | null | null | null | null | null | null | null | digiplay/asyncsMIX_v2 | [
-0.7413633465766907,
-0.5590740442276001,
0.5042567253112793,
0.5204942226409912,
-0.4054664373397827,
-0.07159529626369476,
0.39658719301223755,
-0.5569695234298706,
0.8513290882110596,
0.3546072840690613,
-1.1098240613937378,
-0.6122995615005493,
-0.5725120902061462,
-0.03922732174396515... |
AunyMoons/loras-pack | AunyMoons | 2023-11-29T18:35:59Z | 428 | 1 | null | [
"diffusers",
"text-to-image",
"stable-diffusion",
"lora",
"template:sd-lora",
"base_model:stabilityai/stable-diffusion-xl-base-1.0",
"region:us"
] | 2023-11-29T18:35:59Z | 2023-11-18T17:14:42.000Z | null | null | ---
tags:
- text-to-image
- stable-diffusion
- lora
- diffusers
- template:sd-lora
widget:
- text: '-'
output:
url: images/ComfyUI_00119_.png
base_model: stabilityai/stable-diffusion-xl-base-1.0
instance_prompt: null
---
# Extreme Low Angle Perspective
<Gallery />
## Download model
Weights for this model are available in Safetensors and PyTorch formats.
[Download](/AunyMoons/loras-pack/tree/main) them in the Files & versions tab.
| null | diffusers | text-to-image | null | null | null | null | null | null | null | null | null | AunyMoons/loras-pack | [
-0.12710453569889069,
-0.4148067235946655,
0.10719165205955505,
0.10811784863471985,
-0.44499287009239197,
-0.4403470456600189,
0.4897499680519104,
-0.6308213472366333,
0.3559702932834625,
0.6936344504356384,
-0.5296828150749207,
-0.531294047832489,
-0.5421407222747803,
-0.2449319213628769... |
Yntec/Tantrum | Yntec | 2023-11-29T13:22:45Z | 425 | 0 | null | [
"diffusers",
"Anime",
"Cartoons",
"Cute",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-image",
"license:creativeml-openrail-m",
"endpoints_compatible",
"diffusers:StableDiffusionPipeline",
"region:us",
"has_space"
] | 2023-11-29T13:22:45Z | 2023-11-29T12:07:57.000Z | null | null | ---
license: creativeml-openrail-m
library_name: diffusers
pipeline_tag: text-to-image
tags:
- Anime
- Cartoons
- Cute
- stable-diffusion
- stable-diffusion-diffusers
- diffusers
- text-to-image
---
# Tantrum
I made this model on a tantrum.
Comparison:

Sample and prompt:

princess,cartoon,wearing white dress,golden crown,red shoes,orange hair,kart,blue eyes,looking at viewer,smiling,happy,sitting on racing kart,outside,forest,blue sky,extremely detailed,hdr,toadstool, | null | diffusers | text-to-image | null | null | null | null | null | null | null | null | null | Yntec/Tantrum | [
-0.6695874333381653,
-0.6321233510971069,
-0.061179567128419876,
0.43212512135505676,
-0.39090150594711304,
0.104555644094944,
0.17744408547878265,
-0.13792622089385986,
0.35568174719810486,
0.34158703684806824,
-0.6374362111091614,
-0.19536305963993073,
-0.4838319718837738,
-0.03638596460... |
FreedomIntelligence/AceGPT-7B | FreedomIntelligence | 2023-11-29T16:09:05Z | 360 | 2 | null | [
"transformers",
"pytorch",
"llama",
"text-generation",
"ar",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2023-11-29T16:09:05Z | 2023-09-14T08:56:42.000Z | null | null | ---
license: apache-2.0
language:
- ar
---
# <b>AceGPT</b>
AceGPT is a collection of fully fine-tuned generative text models based on LlaMA2, specialized for the Arabic language domain. This is the repository for the 7B pretrained model.
---
## Model Details
We have released the AceGPT family of large language models, which is a collection of fully fine-tuned generative text models based on LlaMA2, ranging from 7B to 13B parameters. Our models include two main categories: AceGPT and AceGPT-chat. AceGPT-chat is an optimized version specifically designed for dialogue applications. It is worth mentioning that our models have demonstrated superior performance compared to all currently available open-source Arabic dialogue models in multiple benchmark tests. Furthermore, in our human evaluations, our models have shown comparable satisfaction levels to some closed-source models, such as ChatGPT, in the Arabic language.
## Model Developers
We are from the School of Data Science, the Chinese University of Hong Kong, Shenzhen (CUHKSZ), the Shenzhen Research Institute of Big Data (SRIBD), and the King Abdullah University of Science and Technology (KAUST).
## Variations
The AceGPT family comes in two parameter sizes, 7B and 13B; each size has a base variant and a -chat variant.
## Input
Models input text only.
## Output
Models output text only.
## Model Evaluation Results
Experiments on Arabic MMLU and EXAMs. 'Average', 'STEM', 'Humanities', 'Social Sciences' and 'Others (Business, Health, Misc)' are Arabic MMLU columns. The best performance is in bold and the second best is underlined.
| Model | Average | STEM | Humanities | Social Sciences | Others (Business, Health, Misc) |EXAMs |
|-----------------|---------|------|------------|-----------------|---------------------------------|--------------|
| Bloomz Muennighoff et al. (2022) | 30.95 | 32.32 | 26.71 | 35.85 | 28.95 | 33.89 |
| Llama2-7B | 28.81 | 28.48 | 26.68 | 29.88 | 30.18 | 23.48 |
| Llama2-13B | 31.25 | 31.06 | 27.11 | 35.5 | 31.35 | 25.45 |
| Jais-13B-base | 30.01 | 27.85 | 25.42 | 39.7 | 27.06 | 35.67 |
| AceGPT-7B-base | 30.36 | 26.63 | 28.17 | 35.15 | 31.5 | 31.96 |
| AceGPT-13B-base | <u>37.26</u> | <u>35.16</u> | <u>30.3</u> | <u>47.34</u> | <u>36.25</u> | <u>36.63</u> |
| ChatGPT | <b>46.07</b> | <b>44.17</b> | <b>35.33</b> | <b>61.26</b> | <b>43.52</b> | <b>45.63 </b> |
---
## Samples
#### Arabic MMLU (5-shot)
فيما يلي أسئلة الاختيار من متعدد (مع الإجابات) حول جبر تجريدي
سؤال: العثور على جميع قيم c في Z_3 بحيث يكون Z_3 [x]/(x^2+c) حقلًا.
A. 0
B. 1
C. 2
D. 3
إجابة: B
سؤال: البيان رقم 1 | إذا كان aH عنصرًا في مجموعة العوامل ، فإن | aH | يقسم | a |. البيان رقم 2 | إذا كانت H و K مجموعات فرعية لـ G ، فإن HK مجموعة فرعية لـ G.
A. صحيح ، صحيح
B. خطأ ، خطأ
C. صحيح ، خطأ
D. خطأ ، صحيح
إجابة: B
سؤال: العبارة 1 | كل عنصر من مجموعة يولد مجموعة دورية من المجموعة. العبارة 2 | المجموعة المتناظرة S_10 لديها 10 عناصر.
A. صحيح، صحيح
B. خطأ، خطأ
C. صحيح، خطأ
D. خطأ، صحيح
إجابة: C
سؤال: البيان 1| كل وظيفة من مجموعة محدودة على نفسها يجب أن تكون واحدة لكل مجموعة. البيان 2 | كل فرع فرعي لمجموعة أبيلية هو أبيلي.
A. صحيح, صحيح
B. خاطئ, خاطئ
C. صحيح, خاطئ
D. خاطئ, صحيح
إجابة: A
سؤال: اعثر على خاصية الحلقة 2Z.
A. 0
B. 3
C. 12
D. 30
إجابة: A
سؤال: ما هو الدرجة للامتداد الميداني الناتج من Q(sqrt(2), sqrt(3), sqrt(18)) على Q؟
A. 0
B. 4
C. 2
D. 6
إجابة:
# You can get more detail at https://github.com/FreedomIntelligence/AceGPT/tree/main | null | transformers | text-generation | null | null | null | null | null | null | null | null | null | FreedomIntelligence/AceGPT-7B | [
-0.5995752215385437,
-0.7978371977806091,
0.48642754554748535,
0.3606192171573639,
-0.43346142768859863,
0.0789729580283165,
-0.028223995119333267,
-0.42059651017189026,
0.33293041586875916,
0.3523064851760864,
-0.34801119565963745,
-0.948062539100647,
-0.7210822105407715,
-0.0734110400080... |
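A k-shot multiple-choice prompt like the Arabic MMLU sample above can be assembled programmatically; a generic sketch (the helper names are hypothetical, not taken from the AceGPT repository):

```python
# Generic sketch of assembling a k-shot multiple-choice prompt, in the
# style of the Arabic MMLU sample above. Helper names are hypothetical,
# not taken from the AceGPT codebase.

def format_example(question, options, answer=None):
    lines = [f"سؤال: {question}"]
    lines += [f"{letter}. {text}" for letter, text in options]
    # Leave the answer slot open for the target question.
    lines.append(f"إجابة: {answer}" if answer is not None else "إجابة:")
    return "\n".join(lines)

def build_prompt(header, shots, target):
    """shots: list of (question, options, answer); target: (question, options)."""
    parts = [header]
    parts += [format_example(q, opts, ans) for q, opts, ans in shots]
    parts.append(format_example(*target))
    return "\n".join(parts)

prompt = build_prompt(
    "فيما يلي أسئلة الاختيار من متعدد (مع الإجابات) حول جبر تجريدي",
    [("سؤال تجريبي", [("A", "0"), ("B", "1")], "B")],
    ("سؤال الهدف", [("A", "0"), ("B", "4")]),
)
print(prompt.endswith("إجابة:"))  # True
```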
panopstor/SD15-768 | panopstor | 2023-11-30T00:26:54Z | 329 | 1 | null | [
"diffusers",
"text-to-image",
"en",
"license:creativeml-openrail-m",
"endpoints_compatible",
"has_space",
"diffusers:StableDiffusionPipeline",
"region:us"
] | 2023-11-30T00:26:54Z | 2023-11-06T21:21:06.000Z | null | null | ---
license: creativeml-openrail-m
language:
- en
pipeline_tag: text-to-image
---
This is a fine-tune of Stable Diffusion 1.5 with the MSE VAE from RunwayML, tuned to generate at a nominal size of 768x768 or at various aspect ratios of similar total pixel count.
Fine-tuned using EveryDream2 (https://github.com/victorchall/EveryDream2trainer) for 40 epochs across 4 sessions (10 epochs each) on 30,000 handpicked images spanning fine art, photography, and video games.
Approximate training time is 60 hours on an RTX 6000 Ada 48GB: initially trained at a nominal size of 512 with batch size 12, then 640 at batch size 12, and finally 768 at batch size 8 with 4 gradient accumulation steps.
Unet weights were tuned with the bitsandbytes AdamW8bit optimizer at a 1e-6 learning rate, constant for the first 30 epochs and cosine for the final 10.
The text encoder was tuned via backpropagation through the Unet, using the same AdamW8bit optimizer with a 2e-7 cosine learning rate for each session and a weight decay of 0.040 to account for the lower norm.
EveryDream2 implements aspect-ratio batch fitting, so cropping artifacts are greatly reduced. Higher-resolution outputs are consistent compared to the original SD1.5 checkpoint, which tends to duplicate subject matter beyond its trained 512x512 resolution.
Optimizer states are provided in .pt form for both the text encoder and the Unet; these can be loaded along with the Diffusers weights to resume training in the EveryDream2 trainer software.
| null | diffusers | text-to-image | null | null | null | null | null | null | null | null | null | panopstor/SD15-768 | [
-0.615660548210144,
-0.4740406572818756,
0.501829206943512,
0.2698715329170227,
-0.47651734948158264,
-0.23722875118255615,
-0.12525111436843872,
-0.15243299305438995,
0.24773603677749634,
0.386141836643219,
-0.8097472190856934,
-0.5170879364013672,
-0.6655675172805786,
-0.2274312973022461... |
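The SD15-768 card above credits EveryDream2's aspect-ratio batch fitting for generating at "various aspect ratios of similar total pixel count" to 768x768. A minimal, illustrative sketch of how such aspect buckets can be enumerated — this is an assumption about the general technique, not EveryDream2's actual implementation, and the function name and parameters are hypothetical:

```python
def aspect_buckets(base=768, step=64, max_ratio=2.0):
    """Enumerate (width, height) pairs whose pixel count stays close to
    base*base, with both sides divisible by `step` (a common latent-space
    alignment for SD1.x models) and an aspect ratio no wider than max_ratio."""
    target = base * base
    buckets = set()
    w = step
    while w <= base * max_ratio:
        # pick the height (a multiple of step) that keeps w*h nearest to target
        h = round(target / w / step) * step
        if h > 0 and max(w / h, h / w) <= max_ratio:
            buckets.add((w, h))
            buckets.add((h, w))  # mirrored orientation (portrait/landscape)
        w += step
    return sorted(buckets)

buckets = aspect_buckets()
```

At training time, each image would be assigned to the bucket closest to its native aspect ratio and batched only with images from the same bucket, which is what avoids the cropping artifacts the card mentions.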
tkcho/cp-commerce-clf-kr-sku-brand-7becd2b9f36c82e69f6f6a6d05f700d5 | tkcho | 2023-11-29T13:16:09Z | 329 | 0 | null | [
"transformers",
"pytorch",
"bert",
"text-classification",
"endpoints_compatible",
"region:us"
] | 2023-11-29T13:16:09Z | 2023-11-14T18:01:48.000Z | null | null | Entry not found | null | transformers | text-classification | null | null | null | null | null | null | null | null | null | tkcho/cp-commerce-clf-kr-sku-brand-7becd2b9f36c82e69f6f6a6d05f700d5 | [
-0.3227648437023163,
-0.22568459808826447,
0.8622260093688965,
0.434614896774292,
-0.5282989144325256,
0.7012966275215149,
0.7915716171264648,
0.07618634402751923,
0.7746022343635559,
0.25632208585739136,
-0.7852813005447388,
-0.22573812305927277,
-0.9104481935501099,
0.5715669393539429,
... |
tkcho/cp-commerce-clf-kr-sku-brand-a221a38f4a0e1737810c8614a283d813 | tkcho | 2023-11-30T00:51:27Z | 329 | 0 | null | [
"transformers",
"pytorch",
"bert",
"text-classification",
"endpoints_compatible",
"region:us"
] | 2023-11-30T00:51:27Z | 2023-11-15T19:40:23.000Z | null | null | Entry not found | null | transformers | text-classification | null | null | null | null | null | null | null | null | null | tkcho/cp-commerce-clf-kr-sku-brand-a221a38f4a0e1737810c8614a283d813 | [
-0.3227648437023163,
-0.22568459808826447,
0.8622260093688965,
0.434614896774292,
-0.5282989144325256,
0.7012966275215149,
0.7915716171264648,
0.07618634402751923,
0.7746022343635559,
0.25632208585739136,
-0.7852813005447388,
-0.22573812305927277,
-0.9104481935501099,
0.5715669393539429,
... |
tkcho/cp-commerce-clf-kr-sku-brand-28e825de57100851d613d3134782b13e | tkcho | 2023-11-30T01:26:46Z | 329 | 0 | null | [
"transformers",
"pytorch",
"bert",
"text-classification",
"endpoints_compatible",
"region:us"
] | 2023-11-30T01:26:46Z | 2023-11-16T22:31:28.000Z | null | null | Entry not found | null | transformers | text-classification | null | null | null | null | null | null | null | null | null | tkcho/cp-commerce-clf-kr-sku-brand-28e825de57100851d613d3134782b13e | [
-0.3227648437023163,
-0.22568459808826447,
0.8622260093688965,
0.434614896774292,
-0.5282989144325256,
0.7012966275215149,
0.7915716171264648,
0.07618634402751923,
0.7746022343635559,
0.25632208585739136,
-0.7852813005447388,
-0.22573812305927277,
-0.9104481935501099,
0.5715669393539429,
... |
tkcho/cp-commerce-clf-kr-sku-brand-e06062e9a55a4ffbdad25bc2a8a835ee | tkcho | 2023-11-29T22:59:31Z | 328 | 0 | null | [
"transformers",
"pytorch",
"bert",
"text-classification",
"endpoints_compatible",
"region:us"
] | 2023-11-29T22:59:31Z | 2023-11-14T16:37:16.000Z | null | null | Entry not found | null | transformers | text-classification | null | null | null | null | null | null | null | null | null | tkcho/cp-commerce-clf-kr-sku-brand-e06062e9a55a4ffbdad25bc2a8a835ee | [
-0.3227648437023163,
-0.22568459808826447,
0.8622260093688965,
0.434614896774292,
-0.5282989144325256,
0.7012966275215149,
0.7915716171264648,
0.07618634402751923,
0.7746022343635559,
0.25632208585739136,
-0.7852813005447388,
-0.22573812305927277,
-0.9104481935501099,
0.5715669393539429,
... |
tkcho/cp-commerce-clf-kr-sku-brand-ef8c89fddfe91b2708eab970a8fd6992 | tkcho | 2023-11-29T14:00:55Z | 328 | 0 | null | [
"transformers",
"pytorch",
"bert",
"text-classification",
"endpoints_compatible",
"region:us"
] | 2023-11-29T14:00:55Z | 2023-11-15T16:32:01.000Z | null | null | Entry not found | null | transformers | text-classification | null | null | null | null | null | null | null | null | null | tkcho/cp-commerce-clf-kr-sku-brand-ef8c89fddfe91b2708eab970a8fd6992 | [
-0.3227648437023163,
-0.22568459808826447,
0.8622260093688965,
0.434614896774292,
-0.5282989144325256,
0.7012966275215149,
0.7915716171264648,
0.07618634402751923,
0.7746022343635559,
0.25632208585739136,
-0.7852813005447388,
-0.22573812305927277,
-0.9104481935501099,
0.5715669393539429,
... |
tkcho/cp-commerce-clf-kr-sku-brand-9550ec6c53cd842f1af68dcedadd0342 | tkcho | 2023-11-29T23:18:51Z | 328 | 0 | null | [
"transformers",
"pytorch",
"bert",
"text-classification",
"endpoints_compatible",
"region:us"
] | 2023-11-29T23:18:51Z | 2023-11-15T16:46:49.000Z | null | null | Entry not found | null | transformers | text-classification | null | null | null | null | null | null | null | null | null | tkcho/cp-commerce-clf-kr-sku-brand-9550ec6c53cd842f1af68dcedadd0342 | [
-0.3227648437023163,
-0.22568459808826447,
0.8622260093688965,
0.434614896774292,
-0.5282989144325256,
0.7012966275215149,
0.7915716171264648,
0.07618634402751923,
0.7746022343635559,
0.25632208585739136,
-0.7852813005447388,
-0.22573812305927277,
-0.9104481935501099,
0.5715669393539429,
... |
tkcho/cp-commerce-clf-kr-sku-brand-5abfb3403c8e87b698e8b887fb4bc62b | tkcho | 2023-11-29T23:17:51Z | 328 | 0 | null | [
"transformers",
"pytorch",
"bert",
"text-classification",
"endpoints_compatible",
"region:us"
] | 2023-11-29T23:17:51Z | 2023-11-15T16:47:58.000Z | null | null | Entry not found | null | transformers | text-classification | null | null | null | null | null | null | null | null | null | tkcho/cp-commerce-clf-kr-sku-brand-5abfb3403c8e87b698e8b887fb4bc62b | [
-0.3227648437023163,
-0.22568459808826447,
0.8622260093688965,
0.434614896774292,
-0.5282989144325256,
0.7012966275215149,
0.7915716171264648,
0.07618634402751923,
0.7746022343635559,
0.25632208585739136,
-0.7852813005447388,
-0.22573812305927277,
-0.9104481935501099,
0.5715669393539429,
... |
tkcho/cp-commerce-clf-kr-sku-brand-4444bdf209ac0cdc04119809e58de1c5 | tkcho | 2023-11-30T00:40:25Z | 328 | 0 | null | [
"transformers",
"pytorch",
"bert",
"text-classification",
"endpoints_compatible",
"region:us"
] | 2023-11-30T00:40:25Z | 2023-11-15T19:23:38.000Z | null | null | Entry not found | null | transformers | text-classification | null | null | null | null | null | null | null | null | null | tkcho/cp-commerce-clf-kr-sku-brand-4444bdf209ac0cdc04119809e58de1c5 | [
-0.3227648437023163,
-0.22568459808826447,
0.8622260093688965,
0.434614896774292,
-0.5282989144325256,
0.7012966275215149,
0.7915716171264648,
0.07618634402751923,
0.7746022343635559,
0.25632208585739136,
-0.7852813005447388,
-0.22573812305927277,
-0.9104481935501099,
0.5715669393539429,
... |
qgyd2021/few_shot_intent | qgyd2021 | 2023-11-29T04:08:29Z | 328 | 0 | null | [
"transformers",
"pytorch",
"gpt2",
"text-generation",
"generated_from_trainer",
"base_model:qgyd2021/few_shot_intent",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | 2023-11-29T04:08:29Z | 2023-11-23T07:10:31.000Z | null | null | ---
base_model: qgyd2021/few_shot_intent
tags:
- generated_from_trainer
model-index:
- name: few_shot_intent
results: []
---
# few_shot_intent
This model is a fine-tuned version of [qgyd2021/few_shot_intent](https://huggingface.co/qgyd2021/few_shot_intent) on the [qgyd2021/few_shot_intent_sft](https://huggingface.co/datasets/qgyd2021/few_shot_intent_sft) dataset.
```python
#!/usr/bin/python3
# -*- coding: utf-8 -*-
import argparse
import os
from pathlib import Path
import platform
import re
import string
from typing import List

if platform.system() == "Windows":
    from project_settings import project_path
else:
    project_path = os.path.abspath("./")
project_path = Path(project_path)

hf_hub_cache = (project_path / "cache/huggingface/hub").as_posix()
os.environ["HUGGINGFACE_HUB_CACHE"] = hf_hub_cache

import torch
from transformers.models.auto import AutoModelForCausalLM, AutoTokenizer
from transformers.models.gpt2.modeling_gpt2 import GPT2LMHeadModel

prompt = """
意图识别。不确定时输出:不知道。
Examples:
------------
text: 打开风扇
intent: 开启
------------
text: 关闭电视
intent: 关闭
------------
text: 把风扇关了吧
intent: 关闭
------------
text: 电视开开
intent: 开启
------------
text: 天上一天
intent:
"""


def get_args():
    parser = argparse.ArgumentParser()
    parser.add_argument(
        "--pretrained_model_name_or_path",
        default="qgyd2021/few_shot_intent",
        type=str,
    )
    parser.add_argument(
        "--prompt",
        default=prompt,
        type=str,
    )
    parser.add_argument("--max_input_len", default=512, type=int)
    parser.add_argument("--max_new_tokens", default=512, type=int)
    parser.add_argument("--top_p", default=0.9, type=float)
    parser.add_argument("--temperature", default=0.35, type=float)
    parser.add_argument("--repetition_penalty", default=1.0, type=float)
    parser.add_argument("--device", default="cuda" if torch.cuda.is_available() else "cpu", type=str)
    args = parser.parse_args()
    return args


def remove_space_between_cn_en(text):
    splits = re.split(" ", text)
    if len(splits) < 2:
        return text

    result = ""
    for t in splits:
        if t == "":
            continue
        # keep the space only between two ASCII (English/number) tokens
        if re.search(f"[a-zA-Z0-9{re.escape(string.punctuation)}]$", result) and re.search("^[a-zA-Z0-9]", t):
            result += " "
            result += t
        else:
            if not result == "":
                result += t
            else:
                result = t

    if text.endswith(" "):
        result += " "
    return result


def main():
    args = get_args()

    # pretrained model
    model: GPT2LMHeadModel = AutoModelForCausalLM.from_pretrained(args.pretrained_model_name_or_path)
    tokenizer = AutoTokenizer.from_pretrained(args.pretrained_model_name_or_path)

    prompt_encoded = tokenizer(args.prompt, add_special_tokens=True)
    input_ids: List[int] = prompt_encoded["input_ids"]
    input_ids = torch.tensor([input_ids], dtype=torch.long)
    input_ids = input_ids[:, -args.max_input_len:]

    tokenizer.eos_token = tokenizer.sep_token
    tokenizer.eos_token_id = tokenizer.sep_token_id

    # generate
    with torch.no_grad():
        outputs = model.generate(
            input_ids=input_ids,
            max_new_tokens=args.max_new_tokens,
            do_sample=True,
            top_p=args.top_p,
            temperature=args.temperature,
            repetition_penalty=args.repetition_penalty,
            eos_token_id=tokenizer.sep_token_id,
            pad_token_id=tokenizer.pad_token_id,
        )
    outputs = outputs.tolist()[0][len(input_ids[0]):]
    answer = tokenizer.decode(outputs)
    answer = answer.strip().replace(tokenizer.eos_token, "").strip()
    answer = remove_space_between_cn_en(answer)
    print(answer)
    return


if __name__ == '__main__':
    main()
```
| null | transformers | text-generation | null | null | null | null | null | null | null | null | null | qgyd2021/few_shot_intent | [
-0.09834426641464233,
-0.7688739895820618,
0.4743431806564331,
0.2616889774799347,
-0.2511584460735321,
-0.3181752562522888,
-0.12777204811573029,
-0.15073664486408234,
-0.14263968169689178,
0.16724970936775208,
-0.6016799211502075,
-0.6640111804008484,
-0.5963980555534363,
0.1051697805523... |
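The few_shot_intent card above hard-codes its few-shot prompt as one string literal. A small hypothetical helper that assembles the same `text:`/`intent:` template from a list of (text, intent) example pairs — the function name and defaults are illustrative, not part of the model's API:

```python
def build_intent_prompt(examples, query, instruction="意图识别。不确定时输出:不知道。"):
    """Assemble a few-shot intent-recognition prompt in the card's template:
    an instruction line, an 'Examples:' header, then separator-delimited
    text/intent pairs, ending with the query and an open 'intent:' line."""
    sep = "------------"
    lines = [instruction, "Examples:"]
    for text, intent in examples:
        lines += [sep, f"text: {text}", f"intent: {intent}"]
    # the final block leaves `intent:` open for the model to complete
    lines += [sep, f"text: {query}", "intent:"]
    return "\n".join(lines)

prompt = build_intent_prompt(
    [("打开风扇", "开启"), ("关闭电视", "关闭")],
    "电视开开",
)
print(prompt)
```

The resulting string can be passed as the `--prompt` argument of the generation script in the card.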
tkcho/cp-commerce-clf-kr-sku-brand-7c56f6fb8a23f08a8dfb4f8dd9cefa56 | tkcho | 2023-11-29T13:04:20Z | 327 | 0 | null | [
"transformers",
"pytorch",
"bert",
"text-classification",
"endpoints_compatible",
"region:us"
] | 2023-11-29T13:04:20Z | 2023-11-14T17:32:07.000Z | null | null | Entry not found | null | transformers | text-classification | null | null | null | null | null | null | null | null | null | tkcho/cp-commerce-clf-kr-sku-brand-7c56f6fb8a23f08a8dfb4f8dd9cefa56 | [
-0.3227648437023163,
-0.2256842851638794,
0.8622258305549622,
0.4346150755882263,
-0.5282991528511047,
0.7012966275215149,
0.7915719151496887,
0.07618607580661774,
0.774602472782135,
0.25632160902023315,
-0.7852813005447388,
-0.22573809325695038,
-0.910448431968689,
0.571567177772522,
-0... |
tkcho/cp-commerce-clf-kr-sku-brand-8daf218ca89d471fe9c88f9b15f6b138 | tkcho | 2023-11-29T13:09:14Z | 327 | 0 | null | [
"transformers",
"pytorch",
"bert",
"text-classification",
"endpoints_compatible",
"region:us"
] | 2023-11-29T13:09:14Z | 2023-11-14T17:51:15.000Z | null | null | Entry not found | null | transformers | text-classification | null | null | null | null | null | null | null | null | null | tkcho/cp-commerce-clf-kr-sku-brand-8daf218ca89d471fe9c88f9b15f6b138 | [
-0.3227648437023163,
-0.2256842851638794,
0.8622258305549622,
0.4346150755882263,
-0.5282991528511047,
0.7012966275215149,
0.7915719151496887,
0.07618607580661774,
0.774602472782135,
0.25632160902023315,
-0.7852813005447388,
-0.22573809325695038,
-0.910448431968689,
0.571567177772522,
-0... |
tkcho/cp-commerce-clf-kr-sku-brand-31a5d2fefb5b552fdc9c2b1ca2dad6e5 | tkcho | 2023-11-29T13:36:42Z | 327 | 0 | null | [
"transformers",
"pytorch",
"bert",
"text-classification",
"endpoints_compatible",
"region:us"
] | 2023-11-29T13:36:42Z | 2023-11-15T16:04:55.000Z | null | null | Entry not found | null | transformers | text-classification | null | null | null | null | null | null | null | null | null | tkcho/cp-commerce-clf-kr-sku-brand-31a5d2fefb5b552fdc9c2b1ca2dad6e5 | [
-0.3227648437023163,
-0.2256842851638794,
0.8622258305549622,
0.4346150755882263,
-0.5282991528511047,
0.7012966275215149,
0.7915719151496887,
0.07618607580661774,
0.774602472782135,
0.25632160902023315,
-0.7852813005447388,
-0.22573809325695038,
-0.910448431968689,
0.571567177772522,
-0... |
tkcho/cp-commerce-clf-kr-sku-brand-fb3639769ad4bd2915a57e0fa04bb393 | tkcho | 2023-11-29T23:25:33Z | 327 | 0 | null | [
"transformers",
"pytorch",
"bert",
"text-classification",
"endpoints_compatible",
"region:us"
] | 2023-11-29T23:25:33Z | 2023-11-15T16:58:50.000Z | null | null | Entry not found | null | transformers | text-classification | null | null | null | null | null | null | null | null | null | tkcho/cp-commerce-clf-kr-sku-brand-fb3639769ad4bd2915a57e0fa04bb393 | [
-0.3227648437023163,
-0.22568459808826447,
0.8622260093688965,
0.434614896774292,
-0.5282989144325256,
0.7012966275215149,
0.7915716171264648,
0.07618634402751923,
0.7746022343635559,
0.25632208585739136,
-0.7852813005447388,
-0.22573812305927277,
-0.9104481935501099,
0.5715669393539429,
... |
tkcho/cp-commerce-clf-kr-sku-brand-9a9405632a176edb7f2c1c235ff9ef9d | tkcho | 2023-11-29T13:29:40Z | 326 | 0 | null | [
"transformers",
"pytorch",
"bert",
"text-classification",
"endpoints_compatible",
"region:us"
] | 2023-11-29T13:29:40Z | 2023-11-15T15:50:49.000Z | null | null | Entry not found | null | transformers | text-classification | null | null | null | null | null | null | null | null | null | tkcho/cp-commerce-clf-kr-sku-brand-9a9405632a176edb7f2c1c235ff9ef9d | [
-0.3227648437023163,
-0.22568459808826447,
0.8622260093688965,
0.434614896774292,
-0.5282989144325256,
0.7012966275215149,
0.7915716171264648,
0.07618634402751923,
0.7746022343635559,
0.25632208585739136,
-0.7852813005447388,
-0.22573812305927277,
-0.9104481935501099,
0.5715669393539429,
... |
tkcho/cp-commerce-clf-kr-sku-brand-9a6befa9fb8074957ed29521b3505ab5 | tkcho | 2023-11-29T23:33:04Z | 326 | 0 | null | [
"transformers",
"pytorch",
"bert",
"text-classification",
"endpoints_compatible",
"region:us"
] | 2023-11-29T23:33:04Z | 2023-11-15T17:04:41.000Z | null | null | Entry not found | null | transformers | text-classification | null | null | null | null | null | null | null | null | null | tkcho/cp-commerce-clf-kr-sku-brand-9a6befa9fb8074957ed29521b3505ab5 | [
-0.3227648437023163,
-0.22568459808826447,
0.8622260093688965,
0.434614896774292,
-0.5282989144325256,
0.7012966275215149,
0.7915716171264648,
0.07618634402751923,
0.7746022343635559,
0.25632208585739136,
-0.7852813005447388,
-0.22573812305927277,
-0.9104481935501099,
0.5715669393539429,
... |
tkcho/cp-commerce-clf-kr-sku-brand-a5e4eac139e9ebbe6cd600ce61130a5a | tkcho | 2023-11-30T00:33:09Z | 326 | 0 | null | [
"transformers",
"pytorch",
"bert",
"text-classification",
"endpoints_compatible",
"region:us"
] | 2023-11-30T00:33:09Z | 2023-11-16T21:55:11.000Z | null | null | Entry not found | null | transformers | text-classification | null | null | null | null | null | null | null | null | null | tkcho/cp-commerce-clf-kr-sku-brand-a5e4eac139e9ebbe6cd600ce61130a5a | [
-0.3227648437023163,
-0.22568459808826447,
0.8622260093688965,
0.434614896774292,
-0.5282989144325256,
0.7012966275215149,
0.7915716171264648,
0.07618634402751923,
0.7746022343635559,
0.25632208585739136,
-0.7852813005447388,
-0.22573812305927277,
-0.9104481935501099,
0.5715669393539429,
... |
tkcho/cp-commerce-clf-kr-sku-brand-8a7ec44d063e0bf7a271c9ba5fac223d | tkcho | 2023-11-29T13:15:01Z | 325 | 0 | null | [
"transformers",
"pytorch",
"bert",
"text-classification",
"endpoints_compatible",
"region:us"
] | 2023-11-29T13:15:01Z | 2023-11-14T17:57:06.000Z | null | null | Entry not found | null | transformers | text-classification | null | null | null | null | null | null | null | null | null | tkcho/cp-commerce-clf-kr-sku-brand-8a7ec44d063e0bf7a271c9ba5fac223d | [
-0.3227648437023163,
-0.22568459808826447,
0.8622260093688965,
0.434614896774292,
-0.5282989144325256,
0.7012966275215149,
0.7915716171264648,
0.07618634402751923,
0.7746022343635559,
0.25632208585739136,
-0.7852813005447388,
-0.22573812305927277,
-0.9104481935501099,
0.5715669393539429,
... |
tkcho/cp-commerce-clf-kr-sku-brand-19af1e94175009cdef6261067634f5d6 | tkcho | 2023-11-29T23:04:52Z | 325 | 0 | null | [
"transformers",
"pytorch",
"bert",
"text-classification",
"endpoints_compatible",
"region:us"
] | 2023-11-29T23:04:52Z | 2023-11-15T15:27:37.000Z | null | null | Entry not found | null | transformers | text-classification | null | null | null | null | null | null | null | null | null | tkcho/cp-commerce-clf-kr-sku-brand-19af1e94175009cdef6261067634f5d6 | [
-0.3227648437023163,
-0.22568459808826447,
0.8622260093688965,
0.434614896774292,
-0.5282989144325256,
0.7012966275215149,
0.7915716171264648,
0.07618634402751923,
0.7746022343635559,
0.25632208585739136,
-0.7852813005447388,
-0.22573812305927277,
-0.9104481935501099,
0.5715669393539429,
... |
tkcho/cp-commerce-clf-kr-sku-brand-4f27631edf951400ef30266c1ae05a97 | tkcho | 2023-11-30T00:03:01Z | 325 | 0 | null | [
"transformers",
"pytorch",
"bert",
"text-classification",
"endpoints_compatible",
"region:us"
] | 2023-11-30T00:03:01Z | 2023-11-15T18:31:57.000Z | null | null | Entry not found | null | transformers | text-classification | null | null | null | null | null | null | null | null | null | tkcho/cp-commerce-clf-kr-sku-brand-4f27631edf951400ef30266c1ae05a97 | [
-0.3227648437023163,
-0.22568459808826447,
0.8622260093688965,
0.434614896774292,
-0.5282989144325256,
0.7012966275215149,
0.7915716171264648,
0.07618634402751923,
0.7746022343635559,
0.25632208585739136,
-0.7852813005447388,
-0.22573812305927277,
-0.9104481935501099,
0.5715669393539429,
... |
tkcho/cp-commerce-clf-kr-sku-brand-57a559a6070aaf7ba418d13cf75c6496 | tkcho | 2023-11-30T00:34:26Z | 325 | 0 | null | [
"transformers",
"pytorch",
"bert",
"text-classification",
"endpoints_compatible",
"region:us"
] | 2023-11-30T00:34:26Z | 2023-11-15T19:19:32.000Z | null | null | Entry not found | null | transformers | text-classification | null | null | null | null | null | null | null | null | null | tkcho/cp-commerce-clf-kr-sku-brand-57a559a6070aaf7ba418d13cf75c6496 | [
-0.3227648437023163,
-0.22568459808826447,
0.8622260093688965,
0.434614896774292,
-0.5282989144325256,
0.7012966275215149,
0.7915716171264648,
0.07618634402751923,
0.7746022343635559,
0.25632208585739136,
-0.7852813005447388,
-0.22573812305927277,
-0.9104481935501099,
0.5715669393539429,
... |
tkcho/cp-commerce-clf-kr-sku-brand-5b46622cc755b47adfb0ba7dda5da80a | tkcho | 2023-11-29T13:19:38Z | 324 | 0 | null | [
"transformers",
"pytorch",
"bert",
"text-classification",
"endpoints_compatible",
"region:us"
] | 2023-11-29T13:19:38Z | 2023-11-15T15:30:43.000Z | null | null | Entry not found | null | transformers | text-classification | null | null | null | null | null | null | null | null | null | tkcho/cp-commerce-clf-kr-sku-brand-5b46622cc755b47adfb0ba7dda5da80a | [
-0.3227648437023163,
-0.22568459808826447,
0.8622260093688965,
0.434614896774292,
-0.5282989144325256,
0.7012966275215149,
0.7915716171264648,
0.07618634402751923,
0.7746022343635559,
0.25632208585739136,
-0.7852813005447388,
-0.22573812305927277,
-0.9104481935501099,
0.5715669393539429,
... |
tkcho/cp-commerce-clf-kr-sku-brand-12311f14a5b9639ceb2c804012c48bd8 | tkcho | 2023-11-30T01:23:51Z | 324 | 0 | null | [
"transformers",
"pytorch",
"bert",
"text-classification",
"endpoints_compatible",
"region:us"
] | 2023-11-30T01:23:51Z | 2023-11-15T20:38:45.000Z | null | null | Entry not found | null | transformers | text-classification | null | null | null | null | null | null | null | null | null | tkcho/cp-commerce-clf-kr-sku-brand-12311f14a5b9639ceb2c804012c48bd8 | [
-0.3227648437023163,
-0.22568459808826447,
0.8622260093688965,
0.434614896774292,
-0.5282989144325256,
0.7012966275215149,
0.7915716171264648,
0.07618634402751923,
0.7746022343635559,
0.25632208585739136,
-0.7852813005447388,
-0.22573812305927277,
-0.9104481935501099,
0.5715669393539429,
... |
tkcho/cp-commerce-clf-kr-sku-brand-e8a77f40d7b84df555ad3968deabf66d | tkcho | 2023-11-29T23:06:10Z | 324 | 0 | null | [
"transformers",
"pytorch",
"bert",
"text-classification",
"endpoints_compatible",
"region:us"
] | 2023-11-29T23:06:10Z | 2023-11-16T20:06:41.000Z | null | null | Entry not found | null | transformers | text-classification | null | null | null | null | null | null | null | null | null | tkcho/cp-commerce-clf-kr-sku-brand-e8a77f40d7b84df555ad3968deabf66d | [
-0.3227648437023163,
-0.22568459808826447,
0.8622260093688965,
0.434614896774292,
-0.5282989144325256,
0.7012966275215149,
0.7915716171264648,
0.07618634402751923,
0.7746022343635559,
0.25632208585739136,
-0.7852813005447388,
-0.22573812305927277,
-0.9104481935501099,
0.5715669393539429,
... |
tkcho/cp-commerce-clf-kr-sku-brand-40a561b07f27660cbefbd33362e67b74 | tkcho | 2023-11-29T23:43:24Z | 323 | 0 | null | [
"transformers",
"pytorch",
"bert",
"text-classification",
"endpoints_compatible",
"region:us"
] | 2023-11-29T23:43:24Z | 2023-11-15T17:56:47.000Z | null | null | Entry not found | null | transformers | text-classification | null | null | null | null | null | null | null | null | null | tkcho/cp-commerce-clf-kr-sku-brand-40a561b07f27660cbefbd33362e67b74 | [
-0.3227648437023163,
-0.22568459808826447,
0.8622260093688965,
0.434614896774292,
-0.5282989144325256,
0.7012966275215149,
0.7915716171264648,
0.07618634402751923,
0.7746022343635559,
0.25632208585739136,
-0.7852813005447388,
-0.22573812305927277,
-0.9104481935501099,
0.5715669393539429,
... |
tkcho/cp-commerce-clf-kr-sku-brand-48e506ad5924998af6f4d9ec3093abfb | tkcho | 2023-11-30T00:47:10Z | 323 | 0 | null | [
"transformers",
"pytorch",
"bert",
"text-classification",
"endpoints_compatible",
"region:us"
] | 2023-11-30T00:47:10Z | 2023-11-15T19:29:47.000Z | null | null | Entry not found | null | transformers | text-classification | null | null | null | null | null | null | null | null | null | tkcho/cp-commerce-clf-kr-sku-brand-48e506ad5924998af6f4d9ec3093abfb | [
-0.3227648437023163,
-0.22568459808826447,
0.8622260093688965,
0.434614896774292,
-0.5282989144325256,
0.7012966275215149,
0.7915716171264648,
0.07618634402751923,
0.7746022343635559,
0.25632208585739136,
-0.7852813005447388,
-0.22573812305927277,
-0.9104481935501099,
0.5715669393539429,
... |
tkcho/cp-commerce-clf-kr-sku-brand-6ff14addba86fb265ca21486a1c7adde | tkcho | 2023-11-29T23:40:48Z | 323 | 0 | null | [
"transformers",
"pytorch",
"bert",
"text-classification",
"endpoints_compatible",
"region:us"
] | 2023-11-29T23:40:48Z | 2023-11-16T20:37:57.000Z | null | null | Entry not found | null | transformers | text-classification | null | null | null | null | null | null | null | null | null | tkcho/cp-commerce-clf-kr-sku-brand-6ff14addba86fb265ca21486a1c7adde | [
-0.3227648437023163,
-0.22568459808826447,
0.8622260093688965,
0.434614896774292,
-0.5282989144325256,
0.7012966275215149,
0.7915716171264648,
0.07618634402751923,
0.7746022343635559,
0.25632208585739136,
-0.7852813005447388,
-0.22573812305927277,
-0.9104481935501099,
0.5715669393539429,
... |
tkcho/cp-commerce-clf-kr-sku-brand-2acf4bf239960edc13935deb78252c2d | tkcho | 2023-11-30T00:30:07Z | 323 | 0 | null | [
"transformers",
"pytorch",
"bert",
"text-classification",
"endpoints_compatible",
"region:us"
] | 2023-11-30T00:30:07Z | 2023-11-16T21:45:26.000Z | null | null | Entry not found | null | transformers | text-classification | null | null | null | null | null | null | null | null | null | tkcho/cp-commerce-clf-kr-sku-brand-2acf4bf239960edc13935deb78252c2d | [
-0.3227648437023163,
-0.22568459808826447,
0.8622260093688965,
0.434614896774292,
-0.5282989144325256,
0.7012966275215149,
0.7915716171264648,
0.07618634402751923,
0.7746022343635559,
0.25632208585739136,
-0.7852813005447388,
-0.22573812305927277,
-0.9104481935501099,
0.5715669393539429,
... |
tkcho/cp-commerce-clf-kr-sku-brand-4b29419afa206de7d309ae675449b413 | tkcho | 2023-11-30T00:23:39Z | 322 | 0 | null | [
"transformers",
"pytorch",
"bert",
"text-classification",
"endpoints_compatible",
"region:us"
] | 2023-11-30T00:23:39Z | 2023-11-15T19:00:32.000Z | null | null | Entry not found | null | transformers | text-classification | null | null | null | null | null | null | null | null | null | tkcho/cp-commerce-clf-kr-sku-brand-4b29419afa206de7d309ae675449b413 | [
-0.3227648437023163,
-0.22568459808826447,
0.8622260093688965,
0.434614896774292,
-0.5282989144325256,
0.7012966275215149,
0.7915716171264648,
0.07618634402751923,
0.7746022343635559,
0.25632208585739136,
-0.7852813005447388,
-0.22573812305927277,
-0.9104481935501099,
0.5715669393539429,
... |
tkcho/cp-commerce-clf-kr-sku-brand-dd25f953787915680aae9f3cd819d3d0 | tkcho | 2023-11-30T00:28:30Z | 322 | 0 | null | [
"transformers",
"pytorch",
"bert",
"text-classification",
"endpoints_compatible",
"region:us"
] | 2023-11-30T00:28:30Z | 2023-11-15T19:08:30.000Z | null | null | Entry not found | null | transformers | text-classification | null | null | null | null | null | null | null | null | null | tkcho/cp-commerce-clf-kr-sku-brand-dd25f953787915680aae9f3cd819d3d0 | [
-0.3227648437023163,
-0.22568459808826447,
0.8622260093688965,
0.434614896774292,
-0.5282989144325256,
0.7012966275215149,
0.7915716171264648,
0.07618634402751923,
0.7746022343635559,
0.25632208585739136,
-0.7852813005447388,
-0.22573812305927277,
-0.9104481935501099,
0.5715669393539429,
... |
tkcho/cp-commerce-clf-kr-sku-brand-d93e8f8196cec1b08bff9e68c1459354 | tkcho | 2023-11-30T01:00:27Z | 322 | 0 | null | [
"transformers",
"pytorch",
"bert",
"text-classification",
"endpoints_compatible",
"region:us"
] | 2023-11-30T01:00:27Z | 2023-11-15T19:57:12.000Z | null | null | Entry not found | null | transformers | text-classification | null | null | null | null | null | null | null | null | null | tkcho/cp-commerce-clf-kr-sku-brand-d93e8f8196cec1b08bff9e68c1459354 | [
-0.3227648437023163,
-0.22568459808826447,
0.8622260093688965,
0.434614896774292,
-0.5282989144325256,
0.7012966275215149,
0.7915716171264648,
0.07618634402751923,
0.7746022343635559,
0.25632208585739136,
-0.7852813005447388,
-0.22573812305927277,
-0.9104481935501099,
0.5715669393539429,
... |
| modelId | author | lastModified | downloads | likes | createdAt |
|---|---|---|---|---|---|
| tkcho/cp-commerce-clf-kr-sku-brand-60a2d364bcbb1d32b61864e19a2bfa65 | tkcho | 2023-11-29T13:25:20Z | 322 | 0 | 2023-11-16T20:09:49.000Z |
| tkcho/cp-commerce-clf-kr-sku-brand-f69e2f954ca192a7aade1acd5f3ee51d | tkcho | 2023-11-29T12:24:05Z | 321 | 0 | 2023-11-14T16:38:09.000Z |
| tkcho/cp-commerce-clf-kr-sku-brand-63aae5130dae2e4164c9b640e846e374 | tkcho | 2023-11-29T13:10:20Z | 321 | 0 | 2023-11-14T17:58:07.000Z |
| tkcho/cp-commerce-clf-kr-sku-brand-935203712f0fe77e21ae27e78a06de72 | tkcho | 2023-11-29T13:21:52Z | 321 | 0 | 2023-11-15T15:33:47.000Z |
| tkcho/cp-commerce-clf-kr-sku-brand-fbd2460b49b43e066c7228161e6673c3 | tkcho | 2023-11-30T00:55:46Z | 321 | 0 | 2023-11-15T19:50:33.000Z |
| tkcho/cp-commerce-clf-kr-sku-brand-a96aac225fc566b59b962bc60124e3ef | tkcho | 2023-11-30T00:41:42Z | 320 | 0 | 2023-11-15T19:32:02.000Z |
| tkcho/cp-commerce-clf-kr-sku-brand-dc05c9b83331e1dc4c5502e2e9c291d2 | tkcho | 2023-11-29T23:08:44Z | 319 | 0 | 2023-11-15T16:34:53.000Z |
| tkcho/cp-commerce-clf-kr-sku-brand-5eba916d87a4d8663d24cc51edd91492 | tkcho | 2023-11-29T23:07:42Z | 319 | 0 | 2023-11-15T16:41:59.000Z |
| tkcho/cp-commerce-clf-kr-sku-brand-e19dc06f395c94866bf2a838c55dae8c | tkcho | 2023-11-29T12:29:37Z | 318 | 0 | 2023-11-14T16:50:08.000Z |
| tkcho/cp-commerce-clf-kr-sku-brand-4aacc881ed6f1142b4b6bb0d268739f4 | tkcho | 2023-11-30T00:52:30Z | 318 | 0 | 2023-11-15T19:41:12.000Z |
| tkcho/cp-commerce-clf-kr-sku-brand-18c7bbe0742f50678c676a9c8348d404 | tkcho | 2023-11-29T23:03:11Z | 317 | 0 | 2023-11-15T15:20:54.000Z |
| tkcho/cp-commerce-clf-kr-sku-brand-8c0e5e84d1b696e2bb5de22188dec670 | tkcho | 2023-11-29T23:12:47Z | 317 | 0 | 2023-11-15T16:40:49.000Z |
| tkcho/cp-commerce-clf-kr-sku-brand-12e0a88fe5192a42a30748c614737a71 | tkcho | 2023-11-29T13:32:29Z | 316 | 0 | 2023-11-15T15:53:32.000Z |
| tkcho/cp-commerce-clf-kr-sku-brand-cc1b55f153eac371baa8d167e7ba174d | tkcho | 2023-11-29T23:16:25Z | 316 | 0 | 2023-11-15T16:44:59.000Z |
| tkcho/cp-commerce-clf-kr-sku-brand-5300a0b80631d1264c3f45a5ab443646 | tkcho | 2023-11-29T12:21:52Z | 316 | 0 | 2023-11-24T08:07:23.000Z |
| tkcho/cp-commerce-clf-kr-sku-brand-dd02f883f9272ca7097d316f451d62da | tkcho | 2023-11-29T11:43:51Z | 315 | 0 | 2023-11-14T16:01:48.000Z |
| tkcho/cp-commerce-clf-kr-sku-brand-bfe494eb44cdb3d5f85c86e75c8c8f4d | tkcho | 2023-11-29T13:30:30Z | 315 | 0 | 2023-11-15T15:51:42.000Z |
| tkcho/cp-commerce-clf-kr-sku-brand-b5685b2c99219ac059ae84cece923a2d | tkcho | 2023-11-30T01:01:54Z | 315 | 0 | 2023-11-16T22:05:25.000Z |

Every row above shares the same remaining field values: `last_modified` equals `lastModified`; tags `["transformers", "pytorch", "bert", "text-classification", "endpoints_compatible", "region:us"]`; `library_name` `transformers`; `pipeline_tag` `text-classification`; card `Entry not found`; an identical truncated embedding beginning `[-0.3227648437023163, -0.22568459808826447, 0.8622260093688965, 0.434614896774292, -0.5282989144325256, 0.7012966275215149, 0.7915716171264648, 0.07618634402751923, 0.7746022343635559, 0.25632208585739136, -0.7852813005447388, -0.22573812305927277, -0.9104481935501099, 0.5715669393539429, ...]`; all other columns are null.
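Since each row carries an embedding vector, a natural way to work with these entries is to compare them by cosine similarity. The sketch below is illustrative only: it uses the truncated embedding prefix shown in the rows above (the full vectors are not reproduced here), and the identical prefixes across rows imply a similarity of exactly 1.0.

```python
import numpy as np

# Truncated embedding prefixes as they appear in the rows above.
# Every tkcho row shown shares this identical prefix, so the
# cosine similarity between any two of them is 1.0.
a = np.array([-0.3227648437023163, -0.22568459808826447, 0.8622260093688965])
b = np.array([-0.3227648437023163, -0.22568459808826447, 0.8622260093688965])

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity: dot product of the vectors over the product of their norms."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine_similarity(a, b))  # -> 1.0 for identical vectors
```

Identical embeddings across otherwise distinct model entries usually indicate a placeholder or a shared upstream card (here, every card is "Entry not found"), so similarity search over these rows would not discriminate between them.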