Update README.md
README.md CHANGED

@@ -10,11 +10,13 @@ DocsGPT-7B is a decoder-style transformer that is fine-tuned specifically for pr
 
 ## Model Description
 
-
-
-
-
-
+Architecture: Decoder-style Transformer
+
+Training data: Fine-tuned on approximately 1000 high-quality examples of documentation answering workflows.
+
+Base model: Fine-tuned version of [MPT-7B](https://huggingface.co/mosaicml/mpt-7b), which is pretrained from scratch on 1T tokens of English text and code.
+
+License: Apache 2.0
 
 ## Features
 
@@ -30,7 +32,7 @@ This model is best used with the MosaicML [llm-foundry repository](https://githu
 ```python
 import transformers
 model = transformers.AutoModelForCausalLM.from_pretrained(
-    '
+    'Arc53/DocsGPT-7B',
     trust_remote_code=True
 )
 ```
|