license: mit
This is a fine-tuned version of phi-2, a 2.7B-parameter model from Microsoft Research. It was fine-tuned on just 1,000 samples from the "vicgalle/alpaca-gpt4" dataset for 2 epochs.
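The card does not show how the dataset records are prepared for training. As a minimal sketch, records in Alpaca-style datasets such as "vicgalle/alpaca-gpt4" follow the common instruction/input/output schema and are typically flattened into a single prompt string before tokenization. The template below is the widely used Alpaca convention, assumed here rather than confirmed by this card:

```python
# Hypothetical sketch: flatten one Alpaca-style record (instruction /
# input / output fields) into a plain-text training prompt. The exact
# template used for this model is an assumption, not stated in the card.

def build_prompt(record: dict) -> str:
    """Render one Alpaca-style record as a single training example."""
    if record.get("input"):
        # Records that carry extra context use the "with input" template.
        return (
            "Below is an instruction that describes a task, paired with an "
            "input that provides further context. Write a response that "
            "appropriately completes the request.\n\n"
            f"### Instruction:\n{record['instruction']}\n\n"
            f"### Input:\n{record['input']}\n\n"
            f"### Response:\n{record['output']}"
        )
    # Records with an empty "input" field use the shorter template.
    return (
        "Below is an instruction that describes a task. Write a response "
        "that appropriately completes the request.\n\n"
        f"### Instruction:\n{record['instruction']}\n\n"
        f"### Response:\n{record['output']}"
    )

example = {"instruction": "Name a primary color.", "input": "", "output": "Red."}
print(build_prompt(example))
```

The resulting strings would then be tokenized and fed to the trainer; with only 1,000 such prompts and 2 epochs, the model sees each example twice.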