sihuapeng committed · commit 6b0882f (verified) · parent: 931f8c7

Update README.md
Example:
```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

# Load the fine-tuned model and tokenizer
model_name = "sihuapeng/ESM2-finetuned-PPSL"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)

# Simulate a protein sequence
protein_sequence = "MSKKVLITGGAGYIGSVLTPILLEKGYEVCVIDNLMFDQISLLSCFHNKNFTFINGDAMDENLIRQEVAKADIIIPLAALVGAPLCKRNPKLAKMINYEAVKMISDFASPSQIFIYPNTNSGYGIGEKDAMCTEESPLRPISEYGIDKVHAEQYLLDKGNCVTFRLATVFGISPRMRLDLLVNDFTYRAYRDKFIVLFEEHFRRNYIHVRDVVKGFIHGIENYDKMKGQAYNMGLSSANLTKRQLAETIKKYIPDFYIHSANIGEDPDKRDYLVSNTKLEATGWKPDNTLEDGIKELLRAFKMMKVNRFANFN"

# Encode the sequence as model input
inputs = tokenizer(protein_sequence, return_tensors="pt")

# Perform inference using the model
with torch.no_grad():
    outputs = model(**inputs)

# Get the prediction results
logits = outputs.logits
predicted_class_id = torch.argmax(logits, dim=-1).item()

# Output the predicted class
print(f"Predicted class ID: {predicted_class_id}")

```
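To turn the numeric class ID into a readable label, the model's configuration usually carries an `id2label` mapping (check `model.config.id2label` for this particular checkpoint). A minimal sketch of that final step, using dummy logits and a hypothetical label mapping so it runs without downloading the model:

```python
import torch

# Dummy logits standing in for outputs.logits above (batch of 1, 3 classes);
# in real use these come from the model's forward pass.
logits = torch.tensor([[0.1, 2.5, -0.3]])

# Hypothetical mapping for illustration; the real one, if present,
# is model.config.id2label.
id2label = {0: "class_0", 1: "class_1", 2: "class_2"}

# Same argmax step as in the example, then map ID -> label
predicted_class_id = torch.argmax(logits, dim=-1).item()
predicted_label = id2label[predicted_class_id]
print(f"Predicted label: {predicted_label}")
```

With the dummy logits above, class 1 has the highest score, so `predicted_label` is `"class_1"`; with the real model, substitute `outputs.logits` and `model.config.id2label`.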

Files changed (1): README.md (+9 −0)

README.md CHANGED
```diff
@@ -1,3 +1,12 @@
 ---
 license: mit
+datasets:
+- proj-persona/PersonaHub
+language:
+- ab
+metrics:
+- accuracy
+library_name: adapter-transformers
+tags:
+- biology
 ---
```