Add library_name to metadata
#1 by nielsr (HF Staff) - opened

README.md CHANGED
```diff
@@ -1,15 +1,16 @@
 ---
-license: apache-2.0
-datasets:
-- Moreza009/AAV_datasets
 base_model:
 - nferruz/ProtGPT2
+datasets:
+- Moreza009/AAV_datasets
+license: apache-2.0
+pipeline_tag: text-generation
+library_name: transformers
 tags:
 - biology
 - protein
 - medicine
 - capsid_engineering
-pipeline_tag: text-generation
 ---
 
 <h1 align="center">AAVGen: Precision Engineering of Adeno-associated Virus for Renal Selective Targeting</h1>
@@ -24,7 +25,7 @@
     <img src="https://img.shields.io/badge/python-3.8+-blue.svg" alt="Python 3.8+">
   </a>
   <a href="https://github.com/mohammad-gh009/AAVGen">
-    <img src="https://img.shields.io/badge/GitHub-Code-blue.svg?logo=
+    <img src="https://img.shields.io/badge/GitHub-Code-blue.svg?logo=github" alt="Github">
   </a>
   <a href="https://arxiv.org/abs/2602.18915">
     <img src="https://img.shields.io/badge/arXiv-2602.18915-b31b1b.svg" alt="arXive">
@@ -61,7 +62,9 @@
 ### Model Sources
 
 - **Repository:** [Moreza009/AAVGen](https://huggingface.co/Moreza009/AAVGen)
+- **Code:** [GitHub](https://github.com/mohammad-gh009/AAVGen)
 - **Dataset:** [Moreza009/AAV_datasets](https://huggingface.co/datasets/Moreza009/AAV_datasets)
+- **Paper:** [arXiv:2602.18915](https://arxiv.org/abs/2602.18915)
 
 ---
 
@@ -69,7 +72,8 @@
 
 ### Direct Use
 
-AAVGen can be used to generate novel AAV capsid protein sequences (VP1) by providing a start token (`<|endoftext
+AAVGen can be used to generate novel AAV capsid protein sequences (VP1) by providing a start token (`<|endoftext|>
+M`). The generated sequences are intended for in silico screening, functional evaluation, and downstream experimental validation in AAV-based gene therapy development. The model is particularly suited for generating capsid variants optimized for renal tropism, high production fitness, and thermal stability.
 
 ### Downstream Use
 
@@ -116,7 +120,8 @@
 model.eval()
 
 # Generate AAV capsid sequences
-prompt = tokenizer.eos_token + "
+prompt = tokenizer.eos_token + "
+" + "M"
 inputs = tokenizer(prompt, return_tensors="pt").to(device)
 
 with torch.no_grad():
```
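The final hunk changes how the generation prompt is built: the EOS token, a newline, and then `M`, the canonical first residue of the VP1 capsid protein, matching ProtGPT2's sequence format. A minimal sketch of that prompt layout and of cleaning a decoded generation back to a raw amino-acid string follows; `build_prompt` and `clean_sequence` are hypothetical helpers written for illustration, not part of the repository, and the only assumption carried over is ProtGPT2's `<|endoftext|>` EOS token.

```python
# ProtGPT2's end-of-text token, used here as the start-of-sequence marker.
EOS = "<|endoftext|>"

def build_prompt(eos_token: str = EOS) -> str:
    # Equivalent to the README's: tokenizer.eos_token + "\n" + "M"
    return eos_token + "\n" + "M"

def clean_sequence(decoded: str, eos_token: str = EOS) -> str:
    # Hypothetical helper: strip the special token and line breaks from a
    # decoded generation, leaving only the amino-acid sequence.
    return decoded.replace(eos_token, "").replace("\n", "").strip()

prompt = build_prompt()
# A decoded generation would resemble "<|endoftext|>\nMAAD..."; cleaning it
# yields the raw VP1-style sequence starting with "M".
example = clean_sequence(EOS + "\nMAADGYLPDWLEDTLS")
print(prompt, example)
```

Note that the model itself is loaded exactly as in the README's snippet (`AutoModelForCausalLM.from_pretrained(...)`); the sketch above only isolates the string handling around it.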