---
license: mit
---

# Transformers from Scratch

<!-- Provide a quick summary of what the model is/does. -->
This project contains from-scratch implementations of a Transformer block, single-head attention, multi-head attention, and a causal mask.

## Model Details

### Model Description
<!-- Provide a longer summary of what this model is. -->
The attention implementation follows the paper "Attention Is All You Need"; the code was written to solidify understanding and to serve as a reference.
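The core operation from the paper is scaled dot-product attention, $\mathrm{Attention}(Q, K, V) = \mathrm{softmax}(QK^\top / \sqrt{d_k})V$. A minimal PyTorch sketch of it, assuming illustrative names rather than this repository's actual API:

```python
import math
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v, mask=None):
    """Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V, per the paper."""
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)   # (..., seq, seq)
    if mask is not None:
        # positions where mask == 0 may not be attended to
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = F.softmax(scores, dim=-1)                  # rows sum to 1
    return weights @ v, weights

# Example: batch of 2 sequences, length 5, model dimension 8
q = k = v = torch.randn(2, 5, 8)
out, attn = scaled_dot_product_attention(q, k, v)
```

The `1/sqrt(d_k)` scaling keeps the dot products from growing with dimension, which would otherwise push the softmax into regions with vanishing gradients.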




|
| |
|
| | - **Developed by:** Michael Peres |
| | - **Model type:** Vanilla Transformer from Scratch |
| | - **Language(s) (NLP):** English |
| | - **License:** MIT |
| |
|
| | ### Model Sources |
| |
|
| | <!-- Provide the basic links for the model. --> |
| | - **Paper [Attention is all you need]:** https://arxiv.org/abs/1706.03762 |
| |

## Uses

<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->

[More Information Needed]

## How to Get Started with the Model

Use the code below to get started with the model.

[More Information Needed]
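The repository's actual entry points are not documented here, so as a hedged illustration only, a from-scratch multi-head attention module of the kind this project implements might look and be exercised like this (class name, constructor arguments, and layer layout are all assumptions, not the repository's API):

```python
import torch
import torch.nn as nn

class MultiHeadAttention(nn.Module):
    """Hypothetical multi-head self-attention, mirroring the pieces this repo builds."""
    def __init__(self, d_model, n_heads):
        super().__init__()
        assert d_model % n_heads == 0
        self.n_heads = n_heads
        self.d_head = d_model // n_heads
        self.qkv = nn.Linear(d_model, 3 * d_model)   # fused Q, K, V projection
        self.proj = nn.Linear(d_model, d_model)      # output projection

    def forward(self, x, mask=None):
        b, t, d = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        # split each projection into heads: (b, n_heads, t, d_head)
        q, k, v = (z.view(b, t, self.n_heads, self.d_head).transpose(1, 2)
                   for z in (q, k, v))
        scores = q @ k.transpose(-2, -1) / self.d_head ** 0.5
        if mask is not None:
            scores = scores.masked_fill(mask == 0, float("-inf"))
        out = scores.softmax(dim=-1) @ v             # (b, n_heads, t, d_head)
        out = out.transpose(1, 2).contiguous().view(b, t, d)  # re-concatenate heads
        return self.proj(out)

# Usage: batch of 2 sequences, length 10, model dimension 64, 8 heads
x = torch.randn(2, 10, 64)
mha = MultiHeadAttention(d_model=64, n_heads=8)
y = mha(x)
```

Splitting `d_model` across heads keeps the total cost comparable to single-head attention while letting each head attend to different subspaces.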

## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->

Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).

- **Hardware Type:** RTX 3070 Ti
- **Hours used:** 0.1

### Model Architecture and Objective
The objective of this project was to understand Transformers and the basic self-attention mechanism: self-attention, multi-head attention, the causal mask, and the Transformer block.
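The remaining two components can be sketched together: a causal mask restricts each position to attend only to earlier positions, and a Transformer block wraps masked self-attention and a feed-forward network in residual connections. This sketch uses PyTorch's built-in `nn.MultiheadAttention` and a pre-norm layout for brevity (the repository implements these pieces by hand; names here are illustrative):

```python
import torch
import torch.nn as nn

def causal_mask(t):
    """Lower-triangular mask: position i may attend only to positions <= i."""
    return torch.tril(torch.ones(t, t, dtype=torch.bool))

class TransformerBlock(nn.Module):
    """Pre-norm decoder block: masked self-attention + feed-forward, each residual."""
    def __init__(self, d_model, n_heads, d_ff):
        super().__init__()
        self.ln1 = nn.LayerNorm(d_model)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ln2 = nn.LayerNorm(d_model)
        self.ff = nn.Sequential(
            nn.Linear(d_model, d_ff), nn.ReLU(), nn.Linear(d_ff, d_model))

    def forward(self, x):
        t = x.size(1)
        # nn.MultiheadAttention treats True entries in attn_mask as *disallowed*,
        # so invert the lower-triangular causal mask
        m = ~causal_mask(t)
        h = self.ln1(x)
        a, _ = self.attn(h, h, h, attn_mask=m)
        x = x + a                       # residual around attention
        x = x + self.ff(self.ln2(x))    # residual around feed-forward
        return x

block = TransformerBlock(d_model=32, n_heads=4, d_ff=128)
out = block(torch.randn(2, 6, 32))
```

Note that the original paper uses a post-norm arrangement; the pre-norm variant shown here is a common, more training-stable alternative.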

## Model Card Contact

- michaelperes1@gmail.com
- ec20433@qmul.ac.uk