Contributor21 committed
Commit fee5c3d · 1 Parent(s): 5ad21a6
Files changed (1)
  1. README.md +1 -38
README.md CHANGED
@@ -34,34 +34,6 @@ tags:
 
  ## Usage
 
- ### Load with `transformers` (recommended)
-
- ```python
- from transformers import AutoModel, AutoTokenizer
- import torch
-
- model = AutoModel.from_pretrained("Manhph2211/Q-HEART", trust_remote_code=True)
- tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-3.2-1B-Instruct")
- model.eval()
-
- # 12-lead ECG: [batch, leads, length], 5000 = 10s × 500 Hz
- ecg = torch.randn(1, 12, 5000)
- question = "What is the heart rhythm shown in this ECG?"
- inputs = tokenizer(question, return_tensors="pt")
-
- with torch.no_grad():
-     output_ids = model.generate(
-         ecg=ecg,
-         input_ids=inputs["input_ids"],
-         attention_mask=inputs["attention_mask"],
-         max_new_tokens=50,
-     )
- answer = tokenizer.decode(output_ids[0], skip_special_tokens=True)
- print(answer)
- ```
-
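The shape comment in the snippet above can be sanity-checked with plain arithmetic (no `torch` needed): 10 s sampled at 500 Hz gives 5000 samples per lead, so one 12-lead recording batches to `[1, 12, 5000]`. A minimal check:

```python
# Sanity check for the ECG tensor shape [batch, leads, length].
seconds = 10
sampling_rate_hz = 500
samples_per_lead = seconds * sampling_rate_hz  # 5000, matching the snippet's comment

leads = 12
batch_shape = (1, leads, samples_per_lead)
print(batch_shape)  # (1, 12, 5000)
```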
- ### Load with the GitHub repo
-
  ```bash
  git clone https://github.com/manhph2211/Q-HEART.git && cd Q-HEART
  conda create -n qheart python=3.9
@@ -70,21 +42,12 @@ pip install torch --index-url https://download.pytorch.org/whl/cu118
  pip install -r requirements.txt
  ```
 
- Download the checkpoint from [here](https://huggingface.co/Manhph2211/Q-HEART) and place it at `ckpts/sample.bin`, then run evaluation:
+ Download the checkpoint from [here](https://huggingface.co/Manhph2211/Q-HEART) and place it at `ckpts/pytorch_model.bin`, then run evaluation:
 
  ```bash
  python main.py --eval --model_type meta-llama/Llama-3.2-1B-Instruct --mapping_type Transformer
  ```
 
- ## Requirements
-
- In addition to `transformers`, this model requires:
-
- ```
- peft
- einops
- ```
-
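The manual download step can also be scripted. A minimal sketch using `huggingface_hub` (`fetch_checkpoint` is our own helper name, and it assumes the checkpoint is published as `pytorch_model.bin` at the root of the `Manhph2211/Q-HEART` repo):

```python
from pathlib import Path
import shutil

def fetch_checkpoint(repo_id: str = "Manhph2211/Q-HEART",
                     filename: str = "pytorch_model.bin",
                     dest_dir: str = "ckpts") -> Path:
    """Download the checkpoint from the Hub and copy it where main.py expects it."""
    # Deferred import so the helper can be defined without huggingface_hub installed.
    from huggingface_hub import hf_hub_download  # pip install huggingface_hub
    cached = hf_hub_download(repo_id=repo_id, filename=filename)
    dest = Path(dest_dir) / filename
    dest.parent.mkdir(parents=True, exist_ok=True)
    shutil.copy(cached, dest)
    return dest
```

Calling `fetch_checkpoint()` mirrors the manual step, leaving the file at `ckpts/pytorch_model.bin` before running the evaluation command.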
  ## Citation
 
  ```bibtex