LKyluk committed
Commit 4c58b41 · verified · 1 Parent(s): f3b7cbd

update readme links

Files changed (1): README.md (+4 −4)
```diff
README.md CHANGED
@@ -5,9 +5,9 @@ license: mit
 # TempVerseFormer - Pre-trained Models
 
 [![Hugging Face Hub](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-HuggingFace%20Hub-blue?style=flat-square&logo=huggingface)](https://huggingface.co/LKyluk/TempVerseFormer)
-[![GitHub Code](https://img.shields.io/github/v/release/leo27heady/TempVerseFormer?label=TempVerseFormer%20Code&sstyle=flat-square)](https://github.com/leo27heady/TempVerseFormer)
-[![Shape Dataset Toolbox](https://img.shields.io/github/v/release/leo27heady/simple-shape-dataset-toolbox?label=Shape%20Dataset&style=flat-square)](https://github.com/leo27heady/simple-shape-dataset-toolbox)
-[![WandB Log Examples](https://img.shields.io/badge/WandB-Training%20Logs-blue?style=flat-square&logo=wandb)](https://wandb.ai/leo27heady)
+[![GitHub Code](https://img.shields.io/github/v/release/leo27heady/TempVerseFormer?label=TempVerseFormer&sstyle=flat-square)](https://github.com/leo27heady/TempVerseFormer)
+[![Shape Dataset Toolbox](https://img.shields.io/github/v/release/leo27heady/simple-shape-dataset-toolbox?label=shapekit&style=flat-square)](https://github.com/leo27heady/simple-shape-dataset-toolbox)
+[![WandB Logs](https://img.shields.io/badge/WandB-Training%20Logs-blue?style=flat-square&logo=wandb)](https://wandb.ai/leo27heady/pipe-transformer/reports/TempVerseFormer-Training-Logs--VmlldzoxMTg3OTQ3NQ)
 
 This repository hosts pre-trained models for **TempVerseFormer: Temporal Modeling with Reversible Transformers**, a novel architecture introduced in the research article **"Temporal Modeling with Reversible Transformers"**.
 
@@ -18,7 +18,7 @@ These models are designed for memory-efficient temporal sequence prediction, par
 This repository contains pre-trained weights for the following models, as described in the research article:
 
 * **TempFormer (Vanilla-Transformer):** A standard Vanilla Transformer architecture with temporal chaining, serving as a baseline to compare against TempVerseFormer.
-* **TempVerseFormer (Rev-Transformer):** The core Reversible Temporal Transformer architecture, leveraging reversible blocks and time-agnostic backpropagation for memory efficiency.
+* **TempVerseFormer (Rev-Transformer):** The core Reversible Temporal Transformer architecture, leveraging reversible blocks and time-agnostic backpropagation for memory efficiency.
 * **Standard Transformer (Pipe-Transformer):** A standard Transformer model that predicts only one next element at once.
 * **LSTM:** A Long Short-Term Memory network, representing a traditional recurrent sequence modeling approach.
 * **VAE Models:** Variational Autoencoder (VAE) models used for encoding and decoding images to and from a latent space:
```
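The "reversible blocks" the README credits for TempVerseFormer's memory efficiency follow the standard RevNet-style coupling: because each block's inputs can be recomputed exactly from its outputs, activations need not be stored during the forward pass. A minimal sketch of that coupling, with toy `F`/`G` functions standing in for the model's actual attention/feed-forward halves (illustrative names, not the repository's API):

```python
import numpy as np

# Toy stand-ins for the two sub-layers of a reversible block
# (hypothetical; the real model uses attention and MLP halves).
def F(x, w):
    return np.tanh(x @ w)

def G(x, w):
    return np.tanh(x @ w)

def rev_forward(x1, x2, wf, wg):
    """Additive coupling: the outputs alone determine the inputs."""
    y1 = x1 + F(x2, wf)
    y2 = x2 + G(y1, wg)
    return y1, y2

def rev_inverse(y1, y2, wf, wg):
    """Exactly reconstruct the inputs, so activations can be
    recomputed during backprop instead of being cached."""
    x2 = y2 - G(y1, wg)
    x1 = y1 - F(x2, wf)
    return x1, x2

rng = np.random.default_rng(0)
x1, x2 = rng.normal(size=(4, 8)), rng.normal(size=(4, 8))
wf, wg = rng.normal(size=(8, 8)), rng.normal(size=(8, 8))

y1, y2 = rev_forward(x1, x2, wf, wg)
r1, r2 = rev_inverse(y1, y2, wf, wg)
assert np.allclose(x1, r1) and np.allclose(x2, r2)
```

The inversion is what makes memory usage independent of sequence depth: a non-reversible block would have to keep every intermediate activation alive for the backward pass.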