How to use jackkuo/ChatPaperGPT_32k with PEFT:
```python
from peft import PeftModel
from transformers import AutoModelForCausalLM

# ChatGLM-6B ships custom modelling code, so trust_remote_code=True is required.
base_model = AutoModelForCausalLM.from_pretrained(
    "/HOME/jack/model/chatglm-6b/", trust_remote_code=True
)
model = PeftModel.from_pretrained(base_model, "jackkuo/ChatPaperGPT_32k")
```
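The card stops after loading the adapter. A minimal sketch of running a single query afterwards — the `ask` helper, the default paths, and the generation settings (`max_new_tokens=256`) are illustrative assumptions, not part of this repository:

```python
def ask(question: str,
        base_path: str = "/HOME/jack/model/chatglm-6b/",
        adapter_id: str = "jackkuo/ChatPaperGPT_32k") -> str:
    """Load ChatGLM-6B with the LoRA adapter and answer one question."""
    # Heavy dependencies are imported lazily so the sketch can be read
    # without peft/transformers installed.
    from peft import PeftModel
    from transformers import AutoModelForCausalLM, AutoTokenizer

    # ChatGLM-6B uses custom modelling code, hence trust_remote_code=True.
    tokenizer = AutoTokenizer.from_pretrained(base_path, trust_remote_code=True)
    base = AutoModelForCausalLM.from_pretrained(base_path, trust_remote_code=True)
    model = PeftModel.from_pretrained(base, adapter_id).eval()

    inputs = tokenizer(question, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=256)
    # Drop the echoed prompt tokens; decode only the newly generated text.
    new_tokens = outputs[0][inputs["input_ids"].shape[-1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```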
Training procedure
Framework versions
PEFT 0.4.0
To use this checkpoint with https://github.com/hiyouga/ChatGLM-Efficient-Tuning/tree/main, launch that repository's web demo:
```shell
CUDA_VISIBLE_DEVICES=3 nohup python src/web_demo.py \
    --model_name_or_path /HOME/jack/model/chatglm-6b \
    --checkpoint_dir paper_meta \
    > log_web_demo.txt 2>&1 &
tail -f log_web_demo.txt
```
🚩Citation
Please cite the following paper if you use jackkuo/ChatPaperGPT_32k in your work.
```bibtex
@INPROCEEDINGS{10412837,
  author={Guo, Menghao and Wu, Fan and Jiang, Jinling and Yan, Xiaoran and Chen, Guangyong and Li, Wenhui and Zhao, Yunhong and Sun, Zeyi},
  booktitle={2023 IEEE International Conference on Knowledge Graph (ICKG)},
  title={Investigations on Scientific Literature Meta Information Extraction Using Large Language Models},
  year={2023},
  pages={249-254},
  keywords={Measurement;Knowledge graphs;Information retrieval;Data mining;Task analysis;information extraction;large language model;scientific literature},
  doi={10.1109/ICKG59574.2023.00036}
}
```