---
library_name: transformers
tags: []
---

<span style="color:red; font-size:20px"><b>
Attention! This is a proof-of-concept model deployed here solely for research demonstration.
Please do not use it elsewhere for any illegal purpose; otherwise, you bear full legal responsibility for any abuse.
</b></span>
|
|
# Model Card for Intel/tiny_random_llama2
|
|
This is a tiny Llama model with random weights, derived from [meta-llama/Llama-2-7b-hf](https://huggingface.co/meta-llama/Llama-2-7b-hf). It was uploaded with `IPEXModelForCausalLM` from [optimum-intel](https://github.com/huggingface/optimum-intel):
|
|
```python
from optimum.intel import IPEXModelForCausalLM

# Load the tiny random model and push it to the Hugging Face Hub
model = IPEXModelForCausalLM.from_pretrained("Intel/tiny_random_llama2")
model.push_to_hub("Intel/tiny_random_llama2_ipex_model")
```
|
|
Because its weights are random, this model is suitable for functional testing only, not for quality text generation. It is used in the [optimum-intel IPEX test suite](https://github.com/huggingface/optimum-intel/blob/main/tests/ipex/utils_tests.py).
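
For context, a "tiny random" model like this one can be built locally by shrinking the Llama architecture's configuration and initializing it with random weights. The sketch below is illustrative only; the specific dimensions (`hidden_size`, layer count, etc.) are assumptions and may not match those of `Intel/tiny_random_llama2`:

```python
from transformers import LlamaConfig, LlamaForCausalLM

# Hypothetical down-sized Llama configuration; the real repo's
# dimensions may differ. hidden_size must be divisible by num_attention_heads.
config = LlamaConfig(
    vocab_size=32000,
    hidden_size=16,
    intermediate_size=32,
    num_hidden_layers=2,
    num_attention_heads=4,
    num_key_value_heads=4,
)

# Instantiating from a config (rather than from_pretrained) gives random weights.
model = LlamaForCausalLM(config)
print(sum(p.numel() for p in model.parameters()))  # total parameter count
```

Such a model loads in seconds and exercises the same code paths (tokenization, forward pass, export) as a full-size checkpoint, which is what makes it useful for CI-style functional tests.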
|
|