Maxbenkre committed
Commit 92d5404 · verified · 1 Parent(s): 5bcad58

Update README.md

Files changed (1):
  1. README.md +70 -3
README.md CHANGED
@@ -1,6 +1,6 @@
  ---
  title: Qstn Gui
- emoji: 🚀
  colorFrom: red
  colorTo: red
  sdk: docker
@@ -12,6 +12,73 @@ short_description: GUI for the QSTN Framework
  license: mit
  ---

- ## QSTN GUI

- This is the GUI for the Question Framework.
  ---
  title: Qstn Gui
+ emoji: 💻
  colorFrom: red
  colorTo: red
  sdk: docker

  license: mit
  ---

+ # QSTN GUI

+ This is the GUI for the QSTN Framework.
+
+ # QSTN: A Modular Framework for Robust Questionnaire Inference with Large Language Models
+
+ <div align="center">
+
+ ![Overview](overview.svg)
+
+ </div>
+
+ QSTN is a Python framework for building robust questionnaire-based inference experiments with Large Language Models. It provides a full pipeline: perturbing prompts, choosing response generation methods, running inference, and parsing the output. QSTN supports both local inference with vLLM and remote inference via the OpenAI API.
+
+ Detailed information and guides are available in our [documentation](https://qstn.readthedocs.io/en/latest/). Tutorial notebooks can also be found in this [repository](https://github.com/dess-mannheim/QSTN/tree/main/docs/guides).
+
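The prompt-perturbation stage can be illustrated with a plain-Python sketch. Note that this is not QSTN's actual API; the function name and the specific perturbation shown (shuffling the order of the answer options) are illustrative assumptions only.

```python
import random

def perturb_option_order(question: str, options: list[str], seed: int = 0) -> str:
    """Build a questionnaire prompt with the answer options shuffled.

    Shuffling option order is one simple perturbation for probing
    whether a model's answers depend on option position.
    """
    rng = random.Random(seed)  # seeded for reproducible experiments
    shuffled = list(options)
    rng.shuffle(shuffled)
    numbered = "\n".join(f"{i + 1}. {opt}" for i, opt in enumerate(shuffled))
    return f"{question}\n{numbered}"

prompt = perturb_option_order(
    "How satisfied are you with your commute?",
    ["Very satisfied", "Satisfied", "Neutral", "Dissatisfied"],
)
print(prompt)
```

Generating one such variant per seed yields a family of prompts whose answers can be compared to measure robustness.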
+ ## Installation
+
+ To install the package and its dependencies, use `pip`:
+
+ ```bash
+ pip install qstn
+ ```
+
+ Or install from source:
+
+ ```bash
+ pip install git+https://github.com/dess-mannheim/QSTN.git
+ ```
+
+ ## Getting Started
+
+ Below is a minimal working example of how to use QSTN. It integrates easily into existing projects, requiring just three function calls. Users familiar with vLLM or the OpenAI API can use the same Model/Client calls and arguments. In this example, the model's reasoning and the generated response are parsed automatically. For more elaborate examples, see the [tutorial notebooks](https://github.com/dess-mannheim/QSTN/tree/main/docs/guides).
+
+ ```python
+ import qstn
+ import pandas as pd
+ from vllm import LLM
+
+ # 1. Prepare questionnaire and persona data
+ questionnaires = pd.read_csv("hf://datasets/qstn/ex/q.csv")
+ personas = pd.read_csv("hf://datasets/qstn/ex/p.csv")
+ prompt = (
+     "Please tell us how you feel about:\n"
+     f"{qstn.utilities.placeholder.PROMPT_QUESTIONS}"
+ )
+ interviews = [
+     qstn.prompt_builder.LLMPrompt(
+         questionnaire_source=questionnaires,
+         system_prompt=persona,
+         prompt=prompt,
+     )
+     for persona in personas.system_prompt
+ ]
+
+ # 2. Run inference
+ model = LLM("Qwen/Qwen3-4B", max_model_len=5000)
+ results = qstn.survey_manager.conduct_survey_single_item(
+     model, interviews, max_tokens=500
+ )
+
+ # 3. Parse results
+ parsed_results = qstn.parser.raw_responses(results)
+ ```
79
+
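Once responses are parsed, downstream analysis is ordinary pandas work. A minimal sketch, assuming (hypothetically) that the parsed results can be arranged with one row per persona-question pair; the column names below are illustrative, not QSTN's actual output schema.

```python
import pandas as pd

# Hypothetical parsed output: one row per (persona, question) pair.
# Column names are illustrative, not QSTN's actual schema.
df = pd.DataFrame({
    "persona": ["teacher", "teacher", "nurse", "nurse"],
    "question": ["q1", "q2", "q1", "q2"],
    "answer": [4, 2, 5, 3],
})

# Mean answer per question, aggregated across personas
means = df.groupby("question")["answer"].mean()
print(means)
```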
+ ## Citation
+
+ Authors: Maximilian Kreutner, Jens Rupprecht, Georg Ahnert, Ahmed Salem, and Markus Strohmaier
+
+ An accompanying arXiv paper is forthcoming.