rootxhacker committed (verified) · Commit cd5291e · Parent: 4197bf5

Update README.md

Files changed (1): README.md (+1 -2)
README.md CHANGED
@@ -16,9 +16,8 @@ I developed this 155.8M parameter Llama-SNN-LTC model with specific constraints:
 
 ## Model Details
 
-This project is heavily inspired by **keeeeenw/MicroLlama**, which is an awesome open-source project aimed at pretraining a 300M Llama model on a budget.
-
 This project incorporates **Spiking Neural Networks (SNNs)** and **Liquid Time Constants (LTCs)** into the Llama architecture, creating a neuromorphic language model. I spent under $50 on Google Colab Pro Plus and used the first 1M samples from the BabyLM challenge dataset, which contains approximately 100M tokens.
+This model performs on par with google/bert-large-uncased.
 
 **Model Type**: Causal Language Model with Neuromorphic Enhancements
 **Supported Languages**: English
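For readers unfamiliar with the spiking mechanism the README refers to, the core idea can be sketched with a leaky integrate-and-fire (LIF) neuron update. This is a generic, minimal illustration under assumed parameters (`tau`, `v_th`, `v_reset` are hypothetical), not the author's actual SNN implementation:

```python
# Minimal leaky integrate-and-fire (LIF) neuron step -- a generic sketch of
# how spiking neurons work, NOT the code used in this model.
def lif_step(v, x, tau=2.0, v_th=1.0, v_reset=0.0):
    """One LIF update: leak the membrane potential toward the input,
    emit a spike (1.0) and reset when the threshold is crossed."""
    v = v + (x - v) / tau              # leaky integration toward input x
    spike = 1.0 if v >= v_th else 0.0  # threshold crossing produces a spike
    if spike:
        v = v_reset                    # hard reset after spiking
    return v, spike

# Drive the neuron with a constant input and collect the spike train.
v, spikes = 0.0, []
for _ in range(10):
    v, s = lif_step(v, 1.5)
    spikes.append(s)
# With this constant drive the neuron settles into a regular spiking rhythm.
```

In an SNN-augmented transformer, activations pass through a step like this so that information is carried by sparse binary spike events rather than dense continuous values.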