Based on the paper: Code Llama: Open Foundation Models for Code (arXiv:2308.12950)
Hugely inspired by Web App Factory.
You can try running the inference code with the provided Google Colab notebook. The inference code is shown below:
```python
# Install the required libraries
!pip install transformers bitsandbytes accelerate

# Import the necessary modules
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the model and the tokenizer
model_id = 'alxxtexxr/indowebgen-7b'
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    load_in_8bit=True,
    # load_in_4bit=True,  # use 4-bit quantization instead for low memory
    device_map='auto',
)
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Initialize the prompt
# (The Indonesian template reads: "Below is an instruction for building a
# website, along with its output: the HTML code of the generated website.")
prompt_template = '''Berikut adalah instruksi pembuatan website beserta output-nya yang berupa kode HTML dari website yang dibuat:
### Instruksi:
{instruction}
### Output:
<!DOCTYPE html>
<html lang="id">'''

# INSERT YOUR OWN INDONESIAN INSTRUCTION BELOW
# (The example instruction means: "Build a portfolio website for Budi.")
instruction = 'Buatlah website portfolio untuk Budi'
prompt = prompt_template.format(instruction=instruction)

# Generate the output
input_ids = tokenizer(prompt, return_tensors='pt').input_ids.to(model.device)
outputs = model.generate(
    input_ids,
    max_new_tokens=2400,
    do_sample=True,
    temperature=1.0,
    top_k=3,
    top_p=0.8,
    repetition_penalty=1.1,
    pad_token_id=tokenizer.unk_token_id,
)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True)[0])
```
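Note that the decoded output contains the full prompt followed by the model's completion, and the prompt itself ends with the opening `<!DOCTYPE html>` line. A minimal post-processing sketch (not part of the original notebook) is shown below: it slices out just the generated page and writes it to a file. The sample `decoded` string here is a hypothetical stand-in for `tokenizer.batch_decode(outputs, skip_special_tokens=True)[0]`.

```python
# Stand-in for the real decoded output; in the notebook this would be
# tokenizer.batch_decode(outputs, skip_special_tokens=True)[0].
decoded = (
    "Berikut adalah instruksi pembuatan website beserta output-nya "
    "yang berupa kode HTML dari website yang dibuat:\n"
    "### Instruksi:\nBuatlah website portfolio untuk Budi\n"
    "### Output:\n"
    "<!DOCTYPE html>\n"
    '<html lang="id">\n'
    "<head><title>Portfolio Budi</title></head>\n"
    "<body></body>\n"
    "</html>"
)

# The full page is everything from the first doctype onward, since the
# prompt ends with '<!DOCTYPE html>\n<html lang="id">' and the model
# continues from there.
html = decoded[decoded.index("<!DOCTYPE html>"):]

# Save the generated page so it can be opened in a browser
with open("index.html", "w", encoding="utf-8") as f:
    f.write(html)
```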