Text Classification
Transformers
HongruCai committed · verified
Commit d9b767b · 1 Parent(s): 98a0c76

Update README.md

Files changed (1): README.md (+2 −2)
README.md CHANGED
@@ -59,7 +59,7 @@ python inference.py \
 ## Usage Example
 
 This example shows a typical workflow for a **single user**:
-1) encode text pairs with Skywork-Reward-V2-Llama-3.1-8B into embeddings,
+1) encode text pairs with Skywork/Skywork-Reward-Llama-3.1-8B-v0.2 into embeddings,
 2) adapt the MRM on the user's few-shot examples (update `shared_weight` only),
 3) run inference on new pairs for that same user.
 
@@ -108,7 +108,7 @@ def infer_on_pairs(model, ch, rj, device="cuda"):
 
 device = "cuda" if torch.cuda.is_available() else "cpu"
 
-MODEL_PATH = "Skywork/Skywork-Reward-V2-Llama-3.1-8B"
+MODEL_PATH = "Skywork/Skywork-Reward-Llama-3.1-8B-v0.2"
 tokenizer = AutoTokenizer.from_pretrained(MODEL_PATH)
 llm = AutoModelForSequenceClassification.from_pretrained(
     MODEL_PATH, num_labels=1, torch_dtype=torch.bfloat16, device_map=device
)
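
For context, here is a minimal sketch of how the corrected model ID from this commit would be used to score a single (prompt, response) pair. This is an illustration, not code from the repo: `format_pair` is a hypothetical helper, and the chat-template input format is the standard convention for Skywork reward models rather than something this README states.

```python
def format_pair(prompt: str, response: str, tokenizer) -> str:
    """Render a (prompt, response) pair with the tokenizer's chat template,
    the usual input format for a sequence-classification reward model."""
    messages = [
        {"role": "user", "content": prompt},
        {"role": "assistant", "content": response},
    ]
    return tokenizer.apply_chat_template(messages, tokenize=False)


if __name__ == "__main__":
    # Heavy imports kept here so the helper above is importable without them.
    import torch
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    # Model ID as updated by this commit.
    MODEL_PATH = "Skywork/Skywork-Reward-Llama-3.1-8B-v0.2"
    device = "cuda" if torch.cuda.is_available() else "cpu"

    tokenizer = AutoTokenizer.from_pretrained(MODEL_PATH)
    llm = AutoModelForSequenceClassification.from_pretrained(
        MODEL_PATH, num_labels=1, torch_dtype=torch.bfloat16, device_map=device
    )

    text = format_pair("What is 2 + 2?", "2 + 2 = 4.", tokenizer)
    inputs = tokenizer(text, return_tensors="pt").to(device)
    with torch.no_grad():
        # num_labels=1 yields a single scalar logit: the reward score.
        score = llm(**inputs).logits[0][0].item()
    print(f"reward score: {score:.4f}")
```

In the README's workflow these scalar scores (or the model's hidden-state embeddings) would be computed for both the chosen and rejected side of each pair before adapting the MRM's `shared_weight`.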