---
license: llama2
language:
  - en
pipeline_tag: conversational
tags:
  - merge
---

This quant was made for infermatic.ai

goliath-120b-Instruct-FP8-Dynamic is a dynamic FP8 quantization of Goliath 120B, made with AutoFP8.

# Goliath 120B

An auto-regressive causal LM created by merging two finetuned Llama-2 70B models into one.

## Prompting Format

Both Vicuna and Alpaca formats will work, but because the initial and final layers belong primarily to Xwin, I expect Vicuna to work best.
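As a minimal sketch, the two formats can be built like this. The templates below follow the common community conventions for Vicuna and Alpaca prompting; the exact system-prompt wording is an assumption, not taken from this model card.

```python
def vicuna_prompt(user_msg: str) -> str:
    """Vicuna-style prompt (expected to work best for this merge).

    The system line is an assumed default, not mandated by the model.
    """
    system = ("A chat between a curious user and an artificial "
              "intelligence assistant.")
    return f"{system} USER: {user_msg} ASSISTANT:"


def alpaca_prompt(instruction: str) -> str:
    """Alpaca-style prompt (also supported)."""
    return ("Below is an instruction that describes a task. "
            "Write a response that appropriately completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n### Response:\n")


print(vicuna_prompt("Summarize the plot of Hamlet."))
print(alpaca_prompt("Summarize the plot of Hamlet."))
```

Pass the formatted string as the raw prompt to your inference backend, appending the model's reply after `ASSISTANT:` (Vicuna) or `### Response:` (Alpaca) on each turn.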