---
library_name: transformers
tags: []
---

# codonGPT

## Model Description

This repository ships the CodonGPT model checkpoint together with its codon-level tokenizer and the SynonymousLogitProcessor, so the constrained-generation workflow can be reproduced directly from the model card. The model is a GPT-2-style decoder pretrained on Ensembl CDS sequences; it learns synonymous codon structure along with CAI and GC biases, and is optimized for codon-aware sequence design. After pulling the snapshot, load the tokenizer and logits processor from the repo files to enable synonym-aware decoding, which encourages biologically equivalent alternatives while preserving sequence-level realism.
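To illustrate the idea behind synonym-aware decoding, the sketch below implements a minimal, self-contained version of the masking step: given a codon-level vocabulary and a target amino acid, non-synonymous codons are suppressed before sampling. This is an illustrative assumption, not the repo's actual SynonymousLogitProcessor implementation; the codon table slice, vocabulary, and function names here are hypothetical.

```python
import math

# Small slice of the standard genetic code; the full table has 64 codons.
CODON_TO_AA = {
    "TTA": "L", "TTG": "L", "CTT": "L", "CTC": "L", "CTA": "L", "CTG": "L",
    "GCT": "A", "GCC": "A", "GCA": "A", "GCG": "A",
    "ATG": "M",
}

VOCAB = list(CODON_TO_AA)  # one token per codon, in table order

def synonymous_mask(logits, target_aa):
    """Set logits of non-synonymous codons to -inf, leaving synonyms untouched."""
    return [
        logit if CODON_TO_AA[VOCAB[i]] == target_aa else -math.inf
        for i, logit in enumerate(logits)
    ]

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

# One logit per codon token in VOCAB.
logits = [0.5, 1.2, -0.3, 0.0, 0.8, 0.1, 0.2, 0.9, 0.4, -0.1, 2.0]
probs = softmax(synonymous_mask(logits, "A"))
# Only the four alanine codons (GCT, GCC, GCA, GCG) keep nonzero probability,
# so any sampled token still encodes the same amino acid.
```

In the actual workflow, a processor of this kind would be attached to `model.generate` via a `LogitsProcessorList` so that each decoding step is constrained to biologically equivalent codons.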

  • Developed by: Nanil Therapeutics Inc.
  • Model type: Transformer-based generative language model for protein-coding DNA/mRNA sequences
  • License: Free for research use