---
language:
- en
pretty_name: Housing QA
---

`HousingQA` is a benchmark that evaluates LLMs' ability to answer questions about housing law in different states. There are two tasks: `knowledge_qa` and `rc_questions`.

## `knowledge_qa`

`knowledge_qa` consists of 9297 yes/no questions. Each question and answer is specific to a jurisdiction. Each row has the following structure:

```json
{
    "idx": 0,
    "state": "Alabama",
    "question": "Is there a state/territory law regulating residential evictions?",
    "answer": "Yes",
    "question_group": 69,
    "statutes": [
        {
            "citation": "ALA. CODE \u00a7 35-9A-141(11)",
            "excerpt": "(11) \u201cpremises\u201d means a dwelling unit and the structure of which it is a part and facilities and appurtenances therein and grounds, areas, and facilities held out for the use of tenants generally or whose use is promised by the rental agreement to the tenant;"
        }
    ],
    "original_question": "Is there a state/territory law regulating residential evictions?",
    "caveats": [
        ""
    ]
}
```

where:

- `idx`: unique sample index
- `state`: the state to which the question pertains
- `question`: the question
- `answer`: the answer to the question
- `question_group`: the question group; questions repeated across jurisdictions share the same `question_group` value
- `statutes`: the statutes that support the answer
  - `citation`: the statute citation
  - `excerpt`: an excerpt from the statute
- `original_question`: the original question from the LSC database. This is sometimes identical to `question`; in other cases, `question` is a version rephrased to have a yes/no answer.
- `caveats`: any caveats to the answer, from the LSC annotations

If you want to prompt an LLM to answer a question, we recommend explicitly providing the state and specifying that the question should be answered with respect to the law in 2021:

```text
Consider statutory law for {state} in the year 2021. {question}

Answer "Yes" or "No".
Answer:
```
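Assuming each row is available as a plain dict (as when iterating a loaded split), the recommended prompt can be filled in with ordinary string formatting; `build_prompt` is a hypothetical helper, not part of the dataset:

```python
# Sketch: fill the recommended knowledge_qa prompt from a row dict.
# The template mirrors the one shown above.

TEMPLATE = (
    "Consider statutory law for {state} in the year 2021. {question}\n"
    "\n"
    'Answer "Yes" or "No".\n'
    "Answer:"
)

def build_prompt(row: dict) -> str:
    """Substitute the row's state and question into the template."""
    return TEMPLATE.format(state=row["state"], question=row["question"])

row = {
    "state": "Alabama",
    "question": "Is there a state/territory law regulating residential evictions?",
}
print(build_prompt(row))
```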

## `rc_questions`

`rc_questions` consists of 6853 yes/no questions. Each question and answer is specific to a jurisdiction.

`rc_questions` is a subset of `knowledge_qa`. Each question in this subset has been annotated with a list of relevant statutes, all of which are contained in the `statutes` split.

`rc_questions` can be used in several ways:

- To evaluate **statutory comprehension**, by asking an LLM to answer the question based on the statutes provided.
- To evaluate **statute retrieval**, by using an embedding-based retriever to identify the relevant statutes for a question (from the `statutes` split).
- To evaluate **RAG**, by using an embedding-based retriever to identify the relevant statutes for a question (from the `statutes` split) and then asking an LLM to answer the question based on the retrieved statutes.
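To illustrate the shape of the retrieval setup, here is a toy sketch. A real evaluation would rank the full `statutes` split with a neural embedding model; the bag-of-words cosine similarity and the two statutes below are stand-ins of ours, not part of the benchmark:

```python
# Toy statute-retrieval sketch: bag-of-words cosine similarity stands in
# for a neural embedding model, over two hypothetical statute rows.
import math
import re
from collections import Counter

def embed(text: str) -> Counter:
    # Stand-in for an embedding model: lowercase word counts.
    return Counter(re.findall(r"[a-z]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(question: str, statutes: list[dict], k: int = 1) -> list[dict]:
    """Return the top-k statutes ranked by similarity to the question."""
    q = embed(question)
    return sorted(statutes, key=lambda s: cosine(q, embed(s["text"])), reverse=True)[:k]

statutes = [  # hypothetical rows in the shape of the `statutes` split
    {"idx": 1, "text": "A landlord may terminate a rental agreement for nonpayment of rent."},
    {"idx": 2, "text": "A permit is required to operate a motor vehicle."},
]
question = "Does state law regulate termination of a rental agreement for nonpayment of rent?"
print(retrieve(question, statutes)[0]["idx"])  # -> 1
```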

Each row has the following structure:

```json
{
    "idx": 0,
    "state": "Alabama",
    "question": "Is there a state/territory law regulating residential evictions?",
    "answer": "Yes",
    "question_group": 69,
    "statutes": [
        {
            "statute_idx": 431263,
            "citation": "ALA. CODE \u00a7 35-9A-141(11)",
            "excerpt": "(11) \u201cpremises\u201d means a dwelling unit and the structure of which it is a part and facilities and appurtenances therein and grounds, areas, and facilities held out for the use of tenants generally or whose use is promised by the rental agreement to the tenant;"
        }
    ],
    "original_question": "Is there a state/territory law regulating residential evictions?",
    "caveats": [
        ""
    ]
}
```

where:

- `idx`: unique sample index
- `state`: the state to which the question pertains
- `question`: the question
- `answer`: the answer to the question
- `question_group`: the question group; questions repeated across jurisdictions share the same `question_group` value
- `statutes`: the statutes that support the answer
  - `statute_idx`: a foreign key corresponding to the `idx` column in the `statutes` split
  - `citation`: the statute citation
  - `excerpt`: an excerpt from the statute
- `original_question`: the original question from the LSC database. This is sometimes identical to `question`; in other cases, `question` is a version rephrased to have a yes/no answer.
- `caveats`: any caveats to the answer, from the LSC annotations

If you want to prompt an LLM to answer a question, we recommend explicitly providing the state and specifying that the question should be answered with respect to the law in 2021:

```text
Consider statutory law for {state} in the year 2021. Read the following statute excerpts, which govern housing law in this state, and answer the question below.

Statutes ##################

{statute_list}

Question ##################

{question}

Answer "Yes" or "No".
Answer:
```

## `statutes`

The `statutes` split contains approximately 1.7 million statutes collected from Justia. Note:

- Not all states are represented.
- For each state, not all statutes are captured.

The data has the following columns:

- `citation`: the statute citation
- `path`: a string containing the headers for the statute
- `state`: the state to which the statute belongs
- `text`: the text of the statute
- `idx`: a unique index for the statute. This maps to the `statute_idx` column in the `rc_questions` split.