---
license: mit
language:
- en
base_model:
- mistralai/Mistral-7B-v0.1
pipeline_tag: text-generation
tags:
- technology
- QA
---
# TechChat

**TechChat** is a domain-specific chatbot fine-tuned from [Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1) using LoRA.
## Model Details
- **Base model:** mistralai/Mistral-7B-v0.1
- **Fine-tuning method:** LoRA (Low-Rank Adaptation)
- **Max sequence length:** 512 tokens

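LoRA avoids updating the full pretrained weight matrices: each adapted layer learns a low-rank update `B @ A` (rank `r` much smaller than the layer width), so the effective weight is `W + (alpha / r) * B @ A`. A minimal toy sketch of that idea (dimensions here are illustrative, not Mistral-7B's actual layer sizes):

```python
import numpy as np

# Toy LoRA illustration: the pretrained weight W stays frozen; only the
# low-rank factors A and B are trained. Effective weight: W + (alpha/r) * B @ A.
d_out, d_in, r = 8, 8, 2
alpha = 16

rng = np.random.default_rng(0)
W = rng.normal(size=(d_out, d_in))      # frozen pretrained weight
A = rng.normal(size=(r, d_in)) * 0.01   # trainable down-projection
B = np.zeros((d_out, r))                # trainable up-projection (zero-initialized,
                                        # so the update is a no-op at the start)

W_effective = W + (alpha / r) * B @ A

x = rng.normal(size=d_in)
# During training the full W_effective is never materialized; the forward
# pass applies the base weight and the low-rank path separately:
y = W @ x + (alpha / r) * (B @ (A @ x))

print(np.allclose(W_effective @ x, y))  # the two formulations agree
```

Because `B` starts at zero, fine-tuning begins exactly at the base model's behavior and only gradually learns a domain-specific correction.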
## Intended Use
- Technical Q&A in [your domain]
- Chat-style interactions

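For chat-style interactions, prompts are usually built from alternating turns. Mistral-7B-v0.1 is a base model with no official chat template, so the exact format used for TechChat's fine-tuning is not stated here; the sketch below is one hypothetical plain-text convention for illustration only (check the repo for the real format):

```python
def build_prompt(turns, system=None):
    """Format alternating (role, text) turns into a plain-text prompt.

    This template is an assumption for illustration -- the actual
    fine-tuning format may differ.
    """
    lines = []
    if system:
        lines.append(f"System: {system}")
    for role, text in turns:
        lines.append(f"{role.capitalize()}: {text}")
    lines.append("Assistant:")  # trailing cue for the model to respond
    return "\n".join(lines)

prompt = build_prompt(
    [("user", "Explain DNS in simple terms.")],
    system="You answer technical questions clearly.",
)
print(prompt)
```

Whatever convention is used, it must match the one seen during fine-tuning, or generation quality degrades noticeably.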
## Example
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("hari7261/TechChat")
tokenizer = AutoTokenizer.from_pretrained("hari7261/TechChat")

prompt = "Explain DNS in simple terms."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=150)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```