pavm595 committed
Commit 42b1a6f · verified · 1 Parent(s): d6b19e1

Update README.md

Files changed (1):
  1. README.md +2 -1
README.md CHANGED
@@ -14,7 +14,8 @@ tags:
 
 # ProtBert-BFD-SS3
 
-Pretrained model on protein sequences using a masked language modeling (MLM) objective. The model makes a per-residue (per-token) prediction of protein secondary structure (3-state accuracy), i.e. H (helix), E (strand) or C (coil). The model was developed by Ahmed Elnaggar et al. and more information can be found on the GitHub repository and in the accompanying paper. This repository is a fork of their HuggingFace repository. This model is trained on uppercase amino acids: it only works with capital letter amino acids.
+Pretrained model on protein sequences using a masked language modeling (MLM) objective. The model makes a per-residue (per-token) prediction of protein secondary structure (3-state accuracy), i.e. H (helix), E (strand) or C (coil). The model was developed by Ahmed Elnaggar et al. and more information can be found on the [GitHub repository](https://github.com/agemagician/ProtTrans) and in the [accompanying paper](https://ieeexplore.ieee.org/document/9477085). This repository is a fork of their [HuggingFace repository](https://huggingface.co/Rostlab/prot_bert_bfd_ss3).
+This model is trained on uppercase amino acids: it only works with capital letter amino acids.
 
 ## Model description
 The model has no auxiliary tasks like BERT's next-sentence prediction. Only the main objective - MLM - was used.
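Since the README stresses that the model only accepts uppercase amino acids, callers typically normalize sequences before tokenization. A minimal sketch of such a preprocessing step, assuming the ProtBert family's usual conventions (space-separated residues, rare residues U/Z/O/B mapped to X) — the helper name is illustrative, not part of this repository:

```python
import re

def preprocess_sequence(seq: str) -> str:
    """Normalize a protein sequence for a ProtBert-style tokenizer.

    Assumptions (hedged, based on the ProtBert family's documented usage):
    - the model was trained only on uppercase residues, so we uppercase;
    - rare/ambiguous residues (U, Z, O, B) are commonly mapped to X;
    - residues are separated by spaces so each becomes its own token.
    """
    seq = seq.upper()                  # model only works with capital letters
    seq = re.sub(r"[UZOB]", "X", seq)  # collapse rare residues to the unknown token
    return " ".join(seq)               # one space-separated token per residue
```

A per-residue (per-token) SS3 prediction over the result would then yield one of H, E, or C for each amino acid position.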