---
license: mit
pipeline_tag: graph-ml
library_name: transformers
---
# GraphJudger
These are the LoRA weights for the paper [Can LLMs be Good Graph Judge for Knowledge Graph Construction?](https://arxiv.org/abs/2411.17388).
Code: https://github.com/hhy-huang/GraphJudge
## Example
Loading:
```python
import torch
import transformers
from transformers import LlamaForCausalLM, LlamaTokenizer
from peft import PeftModel

BASE_MODEL = "models/llama-2-7b-hf"
LORA_WEIGHTS = "models/llama2-7b-lora-genwiki-context/"

device = "cuda"
tokenizer = LlamaTokenizer.from_pretrained(BASE_MODEL)
model = LlamaForCausalLM.from_pretrained(
    BASE_MODEL,
    load_in_8bit=False,
).half().cuda()
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
    torch_dtype=torch.float16,
    device=device,
)
# Attach the LoRA adapter to the base model
pipeline.model = PeftModel.from_pretrained(
    model,
    LORA_WEIGHTS,
    torch_dtype=torch.float16,
).half().cuda()
```
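With the base model and LoRA weights loaded, the judge can be prompted to verify candidate triples. A minimal sketch of one way to do this — the `build_prompt` template below is purely illustrative; the actual instruction prompts used by GraphJudge are defined in the project repository:

```python
def build_prompt(triple: str) -> str:
    """Wrap a candidate triple in a yes/no judgment instruction.

    Hypothetical template for illustration only; see the GraphJudge
    repository for the real prompt format.
    """
    return f"Is this statement true? {triple}\nAnswer:"

prompt = build_prompt("(Paris, capital_of, France)")

# With the pipeline constructed above:
# result = pipeline(prompt, max_new_tokens=16, do_sample=False)
# print(result[0]["generated_text"])
```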