JadenLong committed (verified)
Commit fe04dea · 1 Parent(s): de3ac48

Update README.md
Files changed (1)
  1. README.md +2 -1
README.md CHANGED
@@ -69,7 +69,7 @@ print(embedding_max.shape) # expect to be 768
 ```python
 from transformers import AutoModelForSequenceClassification
 
-model_name = "JadenLong/MutBERT"
+model_name = "JadenLong/MutBERT-Human-Ref"
 model = AutoModelForSequenceClassification.from_pretrained(model_name, trust_remote_code=True, num_labels=2)
 ```
 
@@ -80,6 +80,7 @@ Allowed types for RoPE scaling are: `linear` and `dynamic`. To extend the model'
 If you want to scale your model context by 2x:
 
 ```python
+model_name = "JadenLong/MutBERT-Human-Ref"
 model = AutoModel.from_pretrained(model_name,
                                   trust_remote_code=True,
                                   rope_scaling={'type': 'dynamic','factor': 2.0}
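For reference, a minimal sketch that combines the two updated snippets, assuming the `JadenLong/MutBERT-Human-Ref` checkpoint is reachable and that its remote code accepts the `rope_scaling` override described in the README:

```python
from transformers import AutoModel, AutoModelForSequenceClassification

model_name = "JadenLong/MutBERT-Human-Ref"  # repository id from the updated README

# Sequence-classification head with two labels, as in the first updated snippet.
clf_model = AutoModelForSequenceClassification.from_pretrained(
    model_name,
    trust_remote_code=True,
    num_labels=2,
)

# Base encoder with a 2x context extension via dynamic RoPE scaling,
# as in the second updated snippet.
base_model = AutoModel.from_pretrained(
    model_name,
    trust_remote_code=True,
    rope_scaling={"type": "dynamic", "factor": 2.0},
)
```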