broadfield-dev committed · verified
Commit 3fa06d9 · 1 parent: e328780

Update README.md

Files changed (1): README.md (+2 −2)
README.md CHANGED

@@ -10,7 +10,7 @@ model-index:
 - name: gemma-3-270m-tuned-0105-1022-tuned-0105-1045-tuned-0105-1111-tuned-0105-1133-tuned-0105-1735
 results: []
 ---
-# gemma-3-270m-tuned-0105-1022-tuned-0105-1045-tuned-0105-1111-tuned-0105-1133-tuned-0105-1735
+# broadfield-dev/gemma-3-270m-context-qa
 This model is a fine-tuned version of [broadfield-dev/gemma-3-270m-tuned-0105-1022-tuned-0105-1045-tuned-0105-1111-tuned-0105-1133](https://huggingface.co/broadfield-dev/gemma-3-270m-tuned-0105-1022-tuned-0105-1045-tuned-0105-1111-tuned-0105-1133) on the [broadfield-dev/deepmind_narrativeqa-Broadfield](https://huggingface.co/broadfield-dev/deepmind_narrativeqa-Broadfield) dataset.
 ## Training Details
 - **Task:** CAUSAL_LM
@@ -25,7 +25,7 @@ This model is a fine-tuned version of [broadfield-dev/gemma-3-270m-tuned-0105-10
 ```python
 from transformers import AutoTokenizer, AutoModelForCausalLM
 import torch
-model_id = "broadfield-dev/gemma-3-270m-tuned-0105-1022-tuned-0105-1045-tuned-0105-1111-tuned-0105-1133-tuned-0105-1735"
+model_id = "broadfield-dev/gemma-3-270m-context-qa"
 tokenizer = AutoTokenizer.from_pretrained(model_id)
 model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float16)
 messages = [
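The README's usage snippet breaks off at `messages = [` in the diff view above. For context, here is a minimal sketch of how such a `transformers` chat-template inference call typically continues; the message contents, generation settings, and output handling below are illustrative assumptions, not part of this commit.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

# Model id as renamed in this commit.
model_id = "broadfield-dev/gemma-3-270m-context-qa"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float16)

# NarrativeQA-style context + question; the exact prompt format is an
# assumption, not taken from the model card.
messages = [
    {"role": "user", "content": "Context: ...\n\nQuestion: Who is the narrator?"},
]

# apply_chat_template builds the Gemma chat prompt and tokenizes it in one step.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)
output_ids = model.generate(input_ids, max_new_tokens=64, do_sample=False)

# Decode only the newly generated tokens, not the echoed prompt.
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

Note that running this downloads the checkpoint from the Hugging Face Hub; `torch_dtype=torch.float16` mirrors the loading call shown in the README.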