Average of 2 Test Runs with 1 point for correct answer, 0.5 point for partial correct:

--Boolean: 85.0%
--Math/Logic: 82.5%
--Complex Questions (1-5): 3 (Above Average - multiple-choice, causal)
--Summarization Quality (1-5): 3 (Above Average)
--Hallucinations: No hallucinations observed in test runs.

For test run results (and a good indicator of target use cases), please see the files ("core_rag_test" and "answer_sheet") in this repo.
- **Developed by:** llmware
- **Model type:** Phi-2B
- **Language(s) (NLP):** English
- **License:** [Microsoft Research License](https://huggingface.co/microsoft/phi-2/resolve/main/LICENSE)
- **Finetuned from model:** Microsoft Phi-2B-Base

## Uses
## How to Get Started with the Model

The fastest way to get started with BLING is through direct import in transformers:

    from transformers import AutoTokenizer, AutoModelForCausalLM

    tokenizer = AutoTokenizer.from_pretrained("bling-phi-2-v0", trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained("bling-phi-2-v0", trust_remote_code=True)
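Once the tokenizer and model are loaded, a minimal inference sketch might look like the following. The `<human>:` / `<bot>:` prompt wrapper and the generation settings here are illustrative assumptions, not taken from this card; check the prompt guidance published with the model before relying on them.

```python
# Illustrative sketch (assumptions noted above): wrap a context passage and a
# question in a "<human>:" / "<bot>:" prompt, then generate a short answer.
from transformers import AutoTokenizer, AutoModelForCausalLM

model_name = "bling-phi-2-v0"
tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_name, trust_remote_code=True)

context = "The total amount of the invoice is $327.50, due by June 30."
question = "What is the total amount of the invoice?"
# Assumed prompt format: context and question wrapped in <human>/<bot> tags.
prompt = "<human>: " + context + "\n" + question + "\n<bot>:"

inputs = tokenizer(prompt, return_tensors="pt")
output = model.generate(**inputs, max_new_tokens=100, do_sample=False)

# Decode only the newly generated tokens, skipping the echoed prompt.
answer = tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)
print(answer.strip())
```

Greedy decoding (`do_sample=False`) is a reasonable default for fact-based RAG question answering, where sampling variability is usually unwanted.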