# Attention Is All You Need

Paper: [arXiv 1706.03762](https://arxiv.org/abs/1706.03762) (Published)
The leaderboard data file has the following structure:

```json
{
  "entries": [],
  "last_updated": null,
  "version": "1.0",
  "schema": {
    "name": "Participant name",
    "team": "Team name (optional)",
    "perplexity": "Model perplexity - lower is better",
    "loss": "Final validation cross-entropy loss",
    "tokens_per_sec": "Training throughput",
    "timestamp": "ISO 8601 submission timestamp",
    "model_config": "Optional model configuration details"
  },
  "competition": {
    "name": "Transformer Hackathon",
    "training_time_minutes": 45,
    "dataset": "TinyStories",
    "github": "https://github.com/abhishekadile/Transformer_Repo-"
  }
}
```
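Reading and updating a file in this format could look like the sketch below. The filename and the sort-by-perplexity policy are assumptions for illustration, not confirmed by the repo.

```python
import json

def add_entry(path: str, entry: dict) -> None:
    """Append a submission entry and keep the board sorted by perplexity.

    `path` points at a leaderboard file with the structure shown above
    (the exact filename is up to the repo; this helper is illustrative).
    """
    with open(path) as f:
        board = json.load(f)
    board["entries"].append(entry)
    # Lower perplexity is better, so sort ascending
    board["entries"].sort(key=lambda e: e["perplexity"])
    with open(path, "w") as f:
        json.dump(board, f, indent=2)
```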
| Rank | Name | Team | Perplexity ⬇️ | Loss | Tokens/sec |
|---|---|---|---|---|---|
| 🥇 | Be the first! | - | - | - | - |
Lower perplexity = better model performance
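Perplexity is just the exponential of the cross-entropy loss, so it can be computed directly from the validation loss:

```python
import math

def perplexity(cross_entropy_loss: float) -> float:
    """Perplexity is exp(cross-entropy loss); lower loss means lower perplexity."""
    return math.exp(cross_entropy_loss)
```

For example, a validation loss of 2.0 corresponds to a perplexity of about 7.39, and a loss of 0.0 (a perfect model) gives the minimum perplexity of 1.0.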
| Category | Metric | Prize |
|---|---|---|
| 🏆 Best Performance | Lowest Perplexity | 1st Place |
| ⚡ Most Efficient | Highest Tokens/sec | Speed Award |
| 🎨 Best Generation | Text Quality | Creativity Award |
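Tokens/sec is wall-clock training throughput. A minimal sketch of measuring it, where `step_fn` stands in for one training step of your model (the helper and its arguments are illustrative, not from the repo):

```python
import time

def tokens_per_second(step_fn, tokens_per_batch: int, n_steps: int = 10) -> float:
    """Time n_steps training steps and return average tokens processed per second.

    `step_fn` is any callable that performs one training step on one batch
    (illustrative; the repo's actual training loop may differ).
    """
    start = time.perf_counter()
    for _ in range(n_steps):
        step_fn()
    elapsed = time.perf_counter() - start
    return tokens_per_batch * n_steps / elapsed
```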
```bash
# Clone the repository
git clone https://github.com/abhishekadile/Transformer_Repo-
cd Transformer_Repo-

# Install dependencies
pip install -r requirements.txt

# Run the hackathon pipeline
python run_hackathon.py
```
The script runs the full hackathon pipeline and produces a submission entry with the following fields:
| Field | Type | Description |
|---|---|---|
| `name` | string | Participant name |
| `team` | string | Team name |
| `perplexity` | float | Model perplexity (lower = better) |
| `loss` | float | Final validation loss |
| `tokens_per_sec` | float | Training speed |
| `timestamp` | string | Submission time (UTC) |
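A submission entry matching these fields can be built like this; the helper `make_entry` is a hypothetical convenience, not part of the repo:

```python
from datetime import datetime, timezone

def make_entry(name, perplexity, loss, tokens_per_sec, team=None):
    """Build a leaderboard entry with the fields listed above (hypothetical helper)."""
    return {
        "name": name,
        "team": team,
        "perplexity": perplexity,
        "loss": loss,
        "tokens_per_sec": tokens_per_sec,
        # ISO 8601 timestamp in UTC, e.g. "2024-01-01T12:00:00+00:00"
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
```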
Pass `--use-amp` to enable automatic mixed precision; model and training settings live in `config.py`.

Good luck and have fun! 🎉