PyTorch · English · llama

hunterhector committed commit f7fc21f (verified) · 1 parent: ddb27fb

Update README.md

Files changed (1): README.md (+5 −5)
README.md CHANGED

@@ -8,7 +8,10 @@ language:
 
 <img src="figures/K2.LOGO.PRIMARY.RGB.png" width="100" alt="K2-V2 model logo"/>
 
-📚 [Tech Report](https://www.llm360.ai/reports/K2_V2_report.pdf) - 📝 [Code](https://github.com/llm360/k2v2_train) - 🏢 [Project Page](https://huggingface.co/LLM360/K2-V2)
+📚 [Tech Report](https://www.llm360.ai/reports/K2_V2_report.pdf) - 📝 [Training Code](https://github.com/llm360/k2v2_train) - 🏢 [Evaluation Code](https://github.com/llm360/eval360)
+
+🗂️ [Pretraining Data: TxT360](https://huggingface.co/datasets/LLM360/TxT360) - 🗂️ [Midtraining Data: TxT360-Midas](https://huggingface.co/datasets/LLM360/TxT360-Midas) - 🗂️ [SFT Data: TxT360-3efforts](https://huggingface.co/datasets/LLM360/TxT360-Midas)
+
 
 K2-V2 is our most capable fully open model to date, and one of the strongest open-weight models in its class. It uses a 70B-parameter dense transformer architecture and represents the latest advancement in the LLM360 model family.
 
@@ -156,11 +159,8 @@ If you use K2-V2 in your research, please cite the following:
 
 ```
 @misc{llm360_k2v2_2025,
-  title = {K2-V2: A 360-Open, Reasoning-Enhanced Open LLM},
+  title = {K2-V2: A 360-Open, Reasoning-Enhanced LLM},
   author = {K2 Team},
   year = {2025},
-  archivePrefix = {arXiv},
-  eprint = {XXXX.XXXXX},
-  primaryClass = {cs.CL}
 }
 ```