---
license: mit
pipeline_tag: graph-ml
library_name: transformers
---

# GraphJudger

This repository contains the LoRA weights for the paper [Can LLMs be Good Graph Judge for Knowledge Graph Construction?](https://arxiv.org/abs/2411.17388).

Code: https://github.com/hhy-huang/GraphJudge

## Example Loading

```python
import torch
import transformers
from transformers import LlamaForCausalLM, LlamaTokenizer
from peft import PeftModel

BASE_MODEL = "models/llama-2-7b-hf"
LORA_WEIGHTS = "models/llama2-7b-lora-genwiki-context/"

device = "cuda"
tokenizer = LlamaTokenizer.from_pretrained(BASE_MODEL)

# Load the base model in fp16 on the GPU.
model = LlamaForCausalLM.from_pretrained(
    BASE_MODEL,
    load_in_8bit=False,
).half().cuda()

pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
    torch_dtype=torch.float16,
    device=device,
)

# Attach the LoRA adapter weights to the base model.
pipeline.model = PeftModel.from_pretrained(
    model,
    LORA_WEIGHTS,
    torch_dtype=torch.float16,
).half().cuda()
```
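Once the pipeline is loaded, the model is queried with a natural-language judgment prompt for each candidate triple. The exact instruction template used during fine-tuning is defined in the GraphJudge repository; the `build_judgement_prompt` helper below is a hypothetical sketch of what such a prompt might look like, shown only to illustrate the input shape:

```python
# Hypothetical helper (not part of the GraphJudge API): format a candidate
# triple as a yes/no judgment prompt. The actual wording used for training
# lives in the GraphJudge repo and may differ.
def build_judgement_prompt(head: str, relation: str, tail: str) -> str:
    return (
        f"Is this true: {head} {relation} {tail}?\n"
        "Please answer Yes or No."
    )

prompt = build_judgement_prompt("Paris", "capital of", "France")
print(prompt)
```

A prompt like this would then be passed to the pipeline (e.g. `pipeline(prompt, max_new_tokens=8)`) and the generated "Yes"/"No" answer parsed to accept or reject the triple.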