Text Classification
Transformers
HongruCai committed (verified)
Commit 7a80315 · Parent: c9f95c0

Update README.md

Files changed (1): README.md (+2 −2)
```diff
@@ -58,7 +58,7 @@ python inference.py \
 ## Usage Example
 
 This example shows a typical workflow for a **single user**:
-1) encode text pairs with Skywork-Reward-V2-Llama-3.1-8B into embeddings,
+1) encode text pairs with Skywork/Skywork-Reward-Llama-3.1-8B-v0.2 into embeddings,
 2) adapt the MRM on the user's few-shot examples (update `shared_weight` only),
 3) run inference on new pairs for that same user.
@@ -107,7 +107,7 @@ def infer_on_pairs(model, ch, rj, device="cuda"):
 
 device = "cuda" if torch.cuda.is_available() else "cpu"
 
-MODEL_PATH = "Skywork/Skywork-Reward-V2-Llama-3.1-8B"
+MODEL_PATH = "Skywork/Skywork-Reward-Llama-3.1-8B-v0.2"
 tokenizer = AutoTokenizer.from_pretrained(MODEL_PATH)
 llm = AutoModelForSequenceClassification.from_pretrained(
     MODEL_PATH, num_labels=1, torch_dtype=torch.bfloat16, device_map=device
```
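After this change, the model-loading step of the README can be assembled as a small self-contained sketch. This is only an illustration of the updated snippet, not the full workflow from the repo's `inference.py`: the MRM adaptation and pair-scoring steps are omitted, and the `load_reward_model` helper name is an assumption, not part of the README.

```python
# Updated model path from this commit.
MODEL_PATH = "Skywork/Skywork-Reward-Llama-3.1-8B-v0.2"


def load_reward_model(device="cpu"):
    """Load the reward model as a single-logit sequence classifier,
    exactly as in the README's updated snippet (a sketch; assumes
    `torch` and `transformers` are installed)."""
    # Imports are deferred so the module can be inspected without torch installed.
    import torch
    from transformers import AutoTokenizer, AutoModelForSequenceClassification

    tokenizer = AutoTokenizer.from_pretrained(MODEL_PATH)
    llm = AutoModelForSequenceClassification.from_pretrained(
        MODEL_PATH, num_labels=1, torch_dtype=torch.bfloat16, device_map=device
    )
    return tokenizer, llm
```

In a typical run you would pick the device first (`"cuda" if torch.cuda.is_available() else "cpu"`, as in the README) and pass it to the helper; note that downloading the 8B checkpoint requires substantial disk and memory.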