---
title: "ESM2 Long Models"
---

## ESM2 Long

ESM2 Long is an adapted version of the ESM2 architecture. It replaces global attention with local attention, allowing models to handle longer inputs. ESM2 Long models have a context size of 2,050 tokens, double that of the standard ESM2 models. Several ESM2 Long models are available:

| Model | Num layers |
|------------------------------|----|
| [gabrielbianchin/esm2_t33_long](https://huggingface.co/gabrielbianchin/esm2_t33_long) | 33 | 
| [gabrielbianchin/esm2_t30_long](https://huggingface.co/gabrielbianchin/esm2_t30_long) | 30 | 
| [gabrielbianchin/esm2_t12_long](https://huggingface.co/gabrielbianchin/esm2_t12_long) | 12 | 
| [gabrielbianchin/esm2_t6_long](https://huggingface.co/gabrielbianchin/esm2_t6_long)  | 6 | 
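To see why local attention enables the longer context, consider the number of (query, key) pairs each scheme computes. The sketch below is illustrative only (it is not the authors' implementation, and the window size of 512 is an assumption for the example): with a fixed sliding window, attended pairs grow linearly with sequence length rather than quadratically.

```python
# Illustrative sketch of local (sliding-window) vs. global attention.
# The window size below is a hypothetical value chosen for the example.

def local_attention_mask(seq_len, window):
    """Boolean mask: mask[i][j] is True if position i may attend to j."""
    return [
        [abs(i - j) <= window for j in range(seq_len)]
        for i in range(seq_len)
    ]

def attended_positions(mask):
    """Number of (query, key) pairs actually computed under the mask."""
    return sum(sum(row) for row in mask)

seq_len, window = 2050, 512  # 2,050 matches the ESM2 Long context size
local_pairs = attended_positions(local_attention_mask(seq_len, window))
global_pairs = seq_len * seq_len  # full attention computes every pair
print(local_pairs, global_pairs)
```

Because the per-token cost stays bounded by the window, doubling the context roughly doubles the attention cost instead of quadrupling it.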

For detailed information, please refer to the paper.