JadenLong committed · verified
Commit b68d8d6 · 1 Parent(s): 4143b7f

Update README.md

Files changed (1): README.md (+6 −0)
README.md CHANGED
@@ -9,6 +9,8 @@ tags:
 
 **This is repository for MutBERT (pretrained with mutation data in human genome)**.
 
+**You can find all MutBERT variants at [here](https://huggingface.co/JadenLong).**
+
 ## Introduction
 
 This is the official pre-trained model introduced in MutBERT: Probabilistic Genome Representation Improves Genomics Foundation Models.
@@ -30,6 +32,7 @@ MutBERT is a transformer-based genome foundation model trained only on Human gen
 from transformers import AutoTokenizer, AutoModel
 
 model_name = "JadenLong/MutBERT"
+# Optional: JadenLong/MutBERT-Huamn-Ref, JadenLong/MutBERT-Multi
 tokenizer = AutoTokenizer.from_pretrained(model_name)
 model = AutoModel.from_pretrained(model_name, trust_remote_code=True)
 ```
@@ -45,6 +48,7 @@ import torch.nn.functional as F
 from transformers import AutoTokenizer, AutoModel
 
 model_name = "JadenLong/MutBERT"
+# Optional: JadenLong/MutBERT-Huamn-Ref, JadenLong/MutBERT-Multi
 tokenizer = AutoTokenizer.from_pretrained(model_name)
 model = AutoModel.from_pretrained(model_name, trust_remote_code=True)
 
@@ -70,6 +74,7 @@ print(embedding_max.shape) # expect to be 768
 from transformers import AutoModelForSequenceClassification
 
 model_name = "JadenLong/MutBERT"
+# Optional: JadenLong/MutBERT-Huamn-Ref, JadenLong/MutBERT-Multi
 model = AutoModelForSequenceClassification.from_pretrained(model_name, trust_remote_code=True, num_labels=2)
 ```
 
@@ -81,6 +86,7 @@ If you want to scale your model context by 2x:
 
 ```python
 model_name = "JadenLong/MutBERT"
+# Optional: JadenLong/MutBERT-Huamn-Ref, JadenLong/MutBERT-Multi
 model = AutoModel.from_pretrained(model_name,
                                   trust_remote_code=True,
                                   rope_scaling={'type': 'dynamic','factor': 2.0}
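For readers skimming this diff: the hunk anchored at `print(embedding_max.shape) # expect to be 768` refers to pooling per-token embeddings into a single sequence vector. That pooling step is model-independent, so it can be sketched with NumPy on a dummy hidden-state array (the array name and sequence length here are illustrative assumptions; the README's actual code operates on torch tensors returned by the model):

```python
import numpy as np

# Stand-in for model output "last_hidden_state":
# (batch=1, seq_len=4, hidden_dim=768)
hidden = np.random.rand(1, 4, 768)

# Mean pooling: average each feature over the token axis -> (768,)
embedding_mean = hidden.mean(axis=1)[0]

# Max pooling: element-wise maximum over the token axis -> (768,)
embedding_max = hidden.max(axis=1)[0]

print(embedding_mean.shape)  # (768,)
print(embedding_max.shape)   # (768,)
```

With a real checkpoint, `hidden` would come from `model(**tokenizer(seq, return_tensors="pt")).last_hidden_state`; the pooled vector then feeds downstream tasks such as the `AutoModelForSequenceClassification` usage shown in the later hunk.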