---
library_name: peft
---
## Training procedure

### Framework versions

- PEFT 0.4.0

### Usage

Use this adapter with [ChatGLM-Efficient-Tuning](https://github.com/hiyouga/ChatGLM-Efficient-Tuning/tree/main), for example to launch the web demo:
```bash
CUDA_VISIBLE_DEVICES=3 nohup python src/web_demo.py \
    --model_name_or_path /HOME/jack/model/chatglm-6b \
    --checkpoint_dir paper_meta \
    > log_web_demo.txt 2>&1 & tail -f log_web_demo.txt
```
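Alternatively, the adapter can be loaded directly with the `peft` library. A minimal sketch, assuming the base model is `THUDM/chatglm-6b` (inferred from the model path above; the prompt and device placement are illustrative, not from this card). Note this downloads the full base model and requires a GPU:

```python
from transformers import AutoModel, AutoTokenizer
from peft import PeftModel

BASE = "THUDM/chatglm-6b"  # assumed base model for this LoRA adapter

# Load the base ChatGLM-6B model and tokenizer
base_model = AutoModel.from_pretrained(BASE, trust_remote_code=True).half().cuda()
tokenizer = AutoTokenizer.from_pretrained(BASE, trust_remote_code=True)

# Attach the LoRA weights from this repository
model = PeftModel.from_pretrained(base_model, "jackkuo/ChatPaperGPT_32k")
model.eval()

# Example query (hypothetical prompt for meta-information extraction)
response, history = model.chat(tokenizer, "Extract the title and authors from this abstract: ...", history=[])
print(response)
```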

### 🚩Citation

Please cite the following paper if you use `jackkuo/ChatPaperGPT_32k` in your work.

```bibtex
@INPROCEEDINGS{10412837,
  author={Guo, Menghao and Wu, Fan and Jiang, Jinling and Yan, Xiaoran and Chen, Guangyong and Li, Wenhui and Zhao, Yunhong and Sun, Zeyi},
  booktitle={2023 IEEE International Conference on Knowledge Graph (ICKG)}, 
  title={Investigations on Scientific Literature Meta Information Extraction Using Large Language Models}, 
  year={2023},
  volume={},
  number={},
  pages={249-254},
  keywords={Measurement;Knowledge graphs;Information retrieval;Data mining;Task analysis;information extraction;large language model;scientific literature},
  doi={10.1109/ICKG59574.2023.00036}}
```