RealFalconsAI committed
Commit a35361c · verified · 1 parent: 60f0315

Update README.md

Files changed (1): README.md (+15 -10)
README.md CHANGED
@@ -100,6 +100,21 @@ The model is intended for categorizing the arc of conversation texts. It can be
 
 To use this model for inference, you need to load the fine-tuned model and tokenizer. Here is an example of how to do this using the `transformers` library:
 
+
+Running Pipeline
+```python
+# Use a pipeline as a high-level helper
+from transformers import pipeline
+
+convo1 = 'Your conversation text here.'
+pipe = pipeline("summarization", model="Falconsai/arc_of_conversation")
+res1 = pipe(convo1, max_length=2048, min_length=1024, do_sample=False)
+print(res1)
+
+```
+
+
+
 Running on CPU
 ```python
 # Load model directly
@@ -132,17 +147,7 @@ print(tokenizer.decode(outputs[0]))
 ```
 
 
-Running Pipeline
-```python
-# Use a pipeline as a high-level helper
-from transformers import pipeline
-
-convo1 = 'Your conversation text here.'
-pipe = pipeline("summarization", model="Falconsai/arc_of_conversation")
-res1 = pipe(convo1, max_length=2048, min_length=1024, do_sample=False)
-print(res1)
 
-```
 
 
 ## Training
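
For context, the "Running on CPU" example is cut off at the hunk boundary: the diff shows only `# Load model directly` at the top and `print(tokenizer.decode(outputs[0]))` in the second hunk header. A minimal sketch of how that direct-load path typically looks with the `transformers` Auto classes — the function name `summarize_arc` and the generation parameters are illustrative assumptions, not the README's verbatim code:

```python
# Sketch of the direct-load inference path hinted at by the hunk context.
# Assumptions: the standard transformers Auto classes and default (greedy)
# generation; summarize_arc is an illustrative name, not from the README.
def summarize_arc(conversation: str, max_new_tokens: int = 512) -> str:
    # Import inside the function so merely loading this file does not
    # require transformers (or the model download) to be present.
    from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("Falconsai/arc_of_conversation")
    model = AutoModelForSeq2SeqLM.from_pretrained("Falconsai/arc_of_conversation")

    # Tokenize, generate, and decode the summary of the conversation arc.
    inputs = tokenizer(conversation, return_tensors="pt", truncation=True)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

Calling `summarize_arc("Your conversation text here.")` downloads the checkpoint on first use and runs on CPU unless the model is explicitly moved to another device.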