Taiwan-LLM-7B-v2.0.1-chat-4bits-GPTQ

Description

This repo contains GPTQ-quantized model files for Yen-Ting Lin's Taiwan LLM v2.0.1 chat model, which is based on LLaMa2-7b.
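As a minimal sketch, a GPTQ checkpoint like this one can typically be loaded through the `transformers` GPTQ integration (which requires a GPU and the optimum/auto-gptq backend). The repo id and the generation parameters below are assumptions for illustration, not values taken from this card.

```python
# Hedged sketch: loading a 4-bit GPTQ chat checkpoint with transformers.
# Assumed repo id, inferred from the model name on this card.
REPO_ID = "yentinglin/Taiwan-LLM-7B-v2.0.1-chat-4bits-GPTQ"

# Illustrative generation defaults for a chat model (not from the card).
GEN_KWARGS = {
    "max_new_tokens": 256,
    "do_sample": True,
    "temperature": 0.7,
    "top_p": 0.9,
}

def generate_reply(prompt: str) -> str:
    # Imported inside the function so the constants above can be
    # inspected without a GPU or the model weights downloaded.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(REPO_ID)
    # device_map="auto" places the dequantized layers on available GPUs.
    model = AutoModelForCausalLM.from_pretrained(REPO_ID, device_map="auto")

    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, **GEN_KWARGS)
    return tokenizer.decode(output[0], skip_special_tokens=True)

# Example (needs GPU + weights download):
# print(generate_reply("請簡單介紹台灣。"))  # "Briefly introduce Taiwan."
```

Calling `generate_reply` downloads roughly 4 GB of quantized weights on first use; the chat template expected by the fine-tuned model is not documented here, so check the original model card before relying on raw prompts.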

Original model card: Yen-Ting Lin's Taiwan LLM based on LLaMa2-7b

Taiwan LLM based on LLaMa2-7b

Continued pretraining on 20 billion tokens of Traditional Chinese text, followed by instruction fine-tuning on millions of conversations.

This version does NOT include Common Crawl data.

🌟 Check out the new Taiwan-LLM demo Chat-UI 🌟

Collaboration with Ubitus K.K. 💪💪💪


Taiwan LLM v2 is developed in collaboration with Ubitus K.K. Ubitus provides valuable technical support and compute resources for the project.
