|
|
--- |
|
|
license: cc-by-4.0 |
|
|
task_categories: |
|
|
- question-answering |
|
|
language: |
|
|
- en |
|
|
size_categories: |
|
|
- 1K<n<10K |
|
|
--- |
|
|
|
|
|
<div align="center"> |
|
|
<br> |
|
|
<img align="center" src="https://cdn-uploads.huggingface.co/production/uploads/65aa8369f8111f40c03e3c61/0ZohMluOG70HW1irfGCAb.png" alt="GeoQuestions1089" width="400"/> |
|
|
</div> |
|
|
<br> |
|
|
<br> |
|
|
<div align="center"> |
|
|
A crowdsourced geospatial question-answering dataset that contains 1089 triples of natural language questions, SPARQL/GeoSPARQL queries, and their answers over YAGO2geo. |
|
|
</div> |
|
|
|
|
|
## Overview |
|
|
|
|
|
**GeoQuestions1089** is a crowdsourced geospatial question-answering dataset that targets the Knowledge Graph [YAGO2geo](https://yago2geo.di.uoa.gr/). It contains 1089 triples of geospatial questions, their answers, and the respective SPARQL/GeoSPARQL queries. |
|
|
|
|
|
It has been used to benchmark two state-of-the-art question answering engines: [GeoQA2](https://github.com/AI-team-UoA/GeoQA2) and [the engine of Hamzei et al.](https://github.com/hamzeiehsan/Questions-To-GeoSPARQL).
|
|
|
|
|
## Repository information |
|
|
|
|
|
The official code repository for this dataset is located on [GitHub](https://github.com/AI-team-UoA/GeoQuestions1089/). |
|
|
|
|
|
## Dataset |
|
|
|
|
|
The dataset is described in the following [paper](http://cgi.di.uoa.gr/~koubarak/publications/2023/ISWC_2023_GeoQuestions_paper-3.pdf) (also used to cite the dataset): |
|
|
|
|
|
``` |
|
|
@inproceedings{10.1007/978-3-031-47243-5_15,
  title     = {Benchmarking Geospatial Question Answering Engines Using the Dataset GeoQuestions1089},
  author    = {Sergios-Anestis Kefalidis and Dharmen Punjani and Eleni Tsalapati and
               Konstantinos Plas and Mariangela Pollali and Michail Mitsios and
               Myrto Tsokanaridou and Manolis Koubarakis and Pierre Maret},
  booktitle = {The Semantic Web - {ISWC} 2023 - 22nd International Semantic Web Conference,
               Athens, Greece, November 6-10, 2023, Proceedings, Part {II}},
  year      = {2023}
}
|
|
``` |
|
|
|
|
|
In brief, the *GeoQuestions1089* dataset consists of two parts, which we will refer to as *GeoQuestions_c* and *GeoQuestions_w*, both of which target the union of YAGO2 and YAGO2geo.
|
|
|
|
|
*GeoQuestions_c* consists of 1017 entries and *GeoQuestions_w* of 72. The difference between the two is that the natural language questions of *GeoQuestions_w* contain grammatical, syntactic, and spelling mistakes.
|
|
|
|
|
<!-- <center> --> |
|
|
|
|
|
| Description | Entry range |
| --- | --- |
| Triples targeting YAGO2geo (*GeoQuestions_c*) | 1-895 |
| Triples targeting YAGO2 + YAGO2geo (*GeoQuestions_c*) | 896-1017 |
| Triples with questions that contain mistakes (*GeoQuestions_w*) | 1018-1089 |
|
|
|
|
|
<!-- </center> --> |
|
|
|
|
|
### Current version of the dataset |
|
|
The aforementioned paper describes version 1.0. The latest available version is 1.1. |
|
|
|
|
|
Version 1.1 includes several enhancements: |
|
|
- Uniform query format and variable naming |
|
|
- Fixes in natural language capitalization |
|
|
- Corrections in query categorization |
|
|
- Replacement of stSPARQL functions with GeoSPARQL functions where applicable |
|
|
- Minor improvements in query correctness of existing queries |
|
|
- Replacement of a few erroneous triples (resulting from incorrect file modifications and text editing) with correct ones
|
|
|
|
|
These updates ensure greater consistency and accuracy in the dataset, making it a more reliable resource for geospatial QA research. |
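
To make the stSPARQL-to-GeoSPARQL replacement concrete, here is a minimal sketch of the kind of change involved. The stSPARQL form shown in the comment is an assumption based on the `strdf:` extension functions; the GeoSPARQL form uses the standard `geof:distance` function, and the variable names are illustrative only.

```
PREFIX geo:   <http://www.opengis.net/ont/geosparql#>
PREFIX geof:  <http://www.opengis.net/def/function/geosparql/>
PREFIX strdf: <http://strdf.di.uoa.gr/ontology#>
PREFIX uom:   <http://www.opengis.net/def/uom/OGC/1.0/>

SELECT ?a WHERE {
  ?a geo:hasGeometry/geo:asWKT ?aWKT .
  ?b geo:hasGeometry/geo:asWKT ?bWKT .
  # stSPARQL form used before the replacement (assumed):
  #   FILTER(strdf:distance(?aWKT, ?bWKT, uom:metre) <= 5000)
  # Standard GeoSPARQL form used in version 1.1:
  FILTER(geof:distance(?aWKT, ?bWKT, uom:metre) <= 5000)
}
```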
|
|
|
|
|
## Categories |
|
|
|
|
|
The questions of the dataset are split into 9 categories: |
|
|
|
|
|
<ol type="A"> |
|
|
<li>Asking for a thematic or a spatial attribute of a feature, e.g., |
|
|
<i>Where is Loch Goil located?</i></li> |
|
|
<li>Asking whether a feature is in a geospatial relation with another feature or features, e.g., |
|
|
<i>Is Liverpool east of Ireland?</i></li> |
|
|
<li>Asking for features of a given class that are in a geospatial relation with another feature, e.g., |
|
|
<i>Which counties border county Lincolnshire?</i></li> |
|
|
<li>Asking for features of a given class that are in a geospatial relation with any features of another class, e.g., |
|
|
<i>Which churches are near castles?</i></li> |
|
|
<li>Asking for features of a given class that are in a geospatial relation with an unspecified feature of another class, where either one or both is/are in another geospatial relation with an explicitly specified feature, e.g.,
|
|
<i>Which churches are near a castle in Scotland?</i></li> |
|
|
<li>As in categories C, D and E above, but with additional thematic and/or geospatial characteristics of the features expected as answers, e.g.,
|
|
<i>Which mountains in Scotland have height more than 1000 meters?</i></li> |
|
|
<li>Questions with quantities and aggregates, e.g., |
|
|
<i>What is the total area of lakes in Monaghan?</i> or <i>How many lakes are there in Monaghan?</i></li> |
|
|
<li>Questions with superlatives or comparatives, e.g., |
|
|
<i>Which is the largest island in Greece?</i></li> |
|
|
<li>Questions with quantities, aggregates, and superlatives/comparatives, e.g., |
|
|
<i>Which city in the UK has the most hospitals?</i></li> |
|
|
</ol> |
|
|
|
|
|
<!-- <center> --> |
|
|
|
|
|
| Category | GeoQuestions_c | GeoQuestions_w |
|----------|----------------|----------------|
| A | 173 | 16 |
| B | 139 | 11 |
| C | 176 | 14 |
| D | 22 | 1 |
| E | 138 | 6 |
| F | 24 | 2 |
| G | 174 | 11 |
| H | 145 | 9 |
| I | 26 | 2 |
|
|
|
|
|
<!-- </center> --> |
|
|
|
|
|
You can read more about these categories in the [paper](http://cgi.di.uoa.gr/~koubarak/publications/2023/ISWC_2023_GeoQuestions_paper-3.pdf). |
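
To illustrate what the query side of an entry can look like, the following is a hedged sketch of a GeoSPARQL query for a category C question such as *Which counties border county Lincolnshire?*. The label-based lookup, the class IRI `y2geoo:County`, and the rendering of *borders* as `geof:sfTouches` are illustrative assumptions, not the exact formulation used in GeoQuestions1089.

```
PREFIX geo:    <http://www.opengis.net/ont/geosparql#>
PREFIX geof:   <http://www.opengis.net/def/function/geosparql/>
PREFIX rdfs:   <http://www.w3.org/2000/01/rdf-schema#>
PREFIX y2geoo: <http://kr.di.uoa.gr/yago2geo/ontology/>

SELECT DISTINCT ?county WHERE {
  # Locate the target feature (hypothetical lookup by label).
  ?target rdfs:label "Lincolnshire" ;
          geo:hasGeometry/geo:asWKT ?targetWKT .
  # Candidate features of the asked-for class (y2geoo:County is a placeholder class IRI).
  ?county a y2geoo:County ;
          geo:hasGeometry/geo:asWKT ?countyWKT .
  # "borders" rendered as the topological relation sfTouches.
  FILTER(geof:sfTouches(?countyWKT, ?targetWKT))
  FILTER(?county != ?target)
}
```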
|
|
|
|
|
## Benchmark (Version 1.1) |
|
|
|
|
|
We have used the dataset to evaluate [GeoQA2](https://github.com/AI-team-UoA/GeoQA/tree/geoqa2) and the engine of [Hamzei et al.](https://github.com/hamzeiehsan/Questions-To-GeoSPARQL#translating-place-related-questions-to-geosparql-queries-thewebconf-2022). The results of the evaluation are presented below:
|
|
|
|
|
### GeoQA2 |
|
|
|
|
|
#### Combined Table: Evaluation of GeoQA2 over *GeoQuestions_c* and *GeoQuestions_w*
|
|
|
|
|
| Category | Executable Queries (C) | Correct Answers (C) | Correct Answers* (C) | Executable Queries (W) | Correct Answers (W) | Correct Answers* (W) |
|----------|------------------------|---------------------|----------------------|------------------------|---------------------|----------------------|
| A | 83.81% | 50.86% | 60.68% | 75.00% | 50.00% | 66.67% |
| B | 74.82% | 60.43% | 80.76% | 81.81% | 45.45% | 55.56% |
| C | 81.25% | 45.45% | 55.94% | 85.71% | 50.00% | 58.34% |
| D | 54.54% | 9.09% | 16.67% | 100.00% | 0.00% | 0.00% |
| E | 76.08% | 24.63% | 32.38% | 50.00% | 33.33% | 66.67% |
| F | 58.33% | 25.00% | 42.85% | 50.00% | 0.00% | 0.00% |
| G | 73.56% | 33.33% | 45.31% | 36.36% | 0.00% | 0.00% |
| H | 66.89% | 18.62% | 27.83% | 66.67% | 0.00% | 0.00% |
| I | 80.76% | 19.23% | 23.80% | 50.00% | 0.00% | 0.00% |
| **Total** | 75.61% | 37.75% | 49.93% | 68.05% | 30.55% | 44.89% |
|
|
|
|
|
##### Correct Answers* is the percentage of correct answers calculated over the number of executable queries generated by each engine (e.g., for GeoQA2 on category A of *GeoQuestions_c*: 50.86% / 83.81% ≈ 60.68%).
|
|
|
|
|
### System of Hamzei et al. |
|
|
|
|
|
#### Combined Table: Evaluation of the system of Hamzei et al. over *GeoQuestions_c* and *GeoQuestions_w*
|
|
|
|
|
|
|
|
| Category | Executable Queries (C) | Correct Answers (C) | Correct Answers* (C) | Executable Queries (W) | Correct Answers (W) | Correct Answers* (W) |
|----------|------------------------|---------------------|----------------------|------------------------|---------------------|----------------------|
| A | 82.08% | 23.12% | 28.16% | 93.75% | 6.25% | 6.67% |
| B | 94.96% | 53.23% | 56.06% | 100.00% | 54.54% | 54.54% |
| C | 81.81% | 26.13% | 31.94% | 100.00% | 14.28% | 14.28% |
| D | 81.81% | 4.54% | 5.55% | 100.00% | 0.00% | 0.00% |
| E | 92.75% | 6.52% | 7.03% | 83.34% | 0.00% | 0.00% |
| F | 62.50% | 12.50% | 20.00% | 90.90% | 0.00% | 0.00% |
| G | 80.45% | 10.34% | 12.85% | 100.00% | 0.00% | 0.00% |
| H | 77.93% | 26.89% | 34.51% | 77.78% | 0.00% | 0.00% |
| I | 84.61% | 7.96% | 9.09% | 50.00% | 0.00% | 0.00% |
| **Total** | 83.97% | 22.81% | 27.28% | 93.05% | 12.50% | 13.43% |
|
|
|
|
|
|
|
|
##### Additional benchmark results exist, and we are working on publishing them. Until then, if you would like to see more, please send a message to:
|
|
`s[dot]kefalidis[at]di[dot]uoa[dot]gr` |
|
|
|
|
|
|
|
|
## Materialization and Transpiler |
|
|
|
|
|
To improve the time performance of query execution, we pre-computed and materialized certain relations between entities in the YAGO2geo KG. |
|
|
|
|
|
The geospatial relations *within*, *crosses*, *intersects* and *borders* (and their extensions, e.g., *overlaps* and *covers*) are the most expensive to compute, whereas *north*, *south*, *east* and *west* are computed easily. Hence, we materialized the expensive relations.
|
|
|
|
|
To easily utilize these materialized relations, please see the [GitHub repository](https://github.com/AI-team-UoA/GeoQuestions1089/). |
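
The sketch below contrasts the two formulations of a *borders*-style query. It assumes, for illustration only, that the materialized relations are asserted with the standard GeoSPARQL relation properties (here `geo:sfTouches`) directly between features; the actual predicates and data files are documented in the GitHub repository.

```
PREFIX geo:  <http://www.opengis.net/ont/geosparql#>
PREFIX geof: <http://www.opengis.net/def/function/geosparql/>

# Without materialization: the topological test runs over WKT literals at query time.
SELECT ?a ?b WHERE {
  ?a geo:hasGeometry/geo:asWKT ?aWKT .
  ?b geo:hasGeometry/geo:asWKT ?bWKT .
  FILTER(geof:sfTouches(?aWKT, ?bWKT) && ?a != ?b)
}

# With materialization (assumed form): the same relation is read from a pre-computed
# triple between the two features, avoiding the expensive geometry computation.
SELECT ?a ?b WHERE {
  ?a geo:sfTouches ?b .
}
```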
|
|
|
|
|
## RDF Store |
|
|
|
|
|
To run the experiments and generate the answers for the gold and generated queries we used GraphDB. Because GraphDB does not support stSPARQL functions, we have [extended the GeoSPARQL plugin of GraphDB](https://github.com/SKefalidis/graphdb-geosparql-plugin). |
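
For instance, gold queries that compute areas cannot be expressed with GeoSPARQL 1.0 functions alone and may fall back on an stSPARQL extension function such as `strdf:area`; the sketch below is illustrative only, with the exact function choice and the restriction of `?lake` to lakes in Monaghan left as assumptions.

```
PREFIX geo:   <http://www.opengis.net/ont/geosparql#>
PREFIX strdf: <http://strdf.di.uoa.gr/ontology#>

# Sketch for a category G style question ("What is the total area of lakes in Monaghan?").
SELECT (SUM(strdf:area(?wkt)) AS ?totalArea) WHERE {
  ?lake geo:hasGeometry/geo:asWKT ?wkt .
  # ... spatial/thematic restriction of ?lake to lakes in Monaghan goes here ...
}
```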
|
|
|
|
|
## Notes |
|
|
|
|
|
### About the definition of near for distance calculations |
|
|
We decided to define *near* based on the class of the feature it refers to, assigning a different distance threshold to each class, as listed below. This is consistent with the definition of near in [GeoQuestions201](https://geoqa.di.uoa.gr/geospatial_gold_standard.html).
|
|
|
|
|
| Near to | Distance |
| --- | --- |
| City | 5 km |
| Town | 5 km |
| Bay | 1 km |
| Beach | 1 km |
| Forest | 1 km |
| Hotel | 1 km |
| Lake | 1 km |
| Landmark | 1 km |
| Village | 1 km |
| Restaurant | 500 m |
| Park | 500 m |
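
As a hedged sketch of how these thresholds translate into a query, the example below applies the 5 km *near a city* threshold with the standard `geof:distance` function. The selection of the particular city and of the hotel class is elided, and none of the variable names are taken from the dataset.

```
PREFIX geo:  <http://www.opengis.net/ont/geosparql#>
PREFIX geof: <http://www.opengis.net/def/function/geosparql/>
PREFIX uom:  <http://www.opengis.net/def/uom/OGC/1.0/>

# "Hotels near <some city>": the object of "near" is a city, so the 5 km threshold applies.
SELECT ?hotel WHERE {
  ?city  geo:hasGeometry/geo:asWKT ?cityWKT .   # the city in question (selection elided)
  ?hotel geo:hasGeometry/geo:asWKT ?hotelWKT .  # candidate hotels (class restriction elided)
  FILTER(geof:distance(?hotelWKT, ?cityWKT, uom:metre) <= 5000)
}
```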
|
|
|
|
|
### Prefixes used in GeoQuestions1089: |
|
|
|
|
|
``` |
|
|
PREFIX geo: <http://www.opengis.net/ont/geosparql#>
PREFIX geof: <http://www.opengis.net/def/function/geosparql/>
PREFIX rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
PREFIX xsd: <http://www.w3.org/2001/XMLSchema#>
PREFIX yago: <http://yago-knowledge.org/resource/>
PREFIX y2geor: <http://kr.di.uoa.gr/yago2geo/resource/>
PREFIX y2geoo: <http://kr.di.uoa.gr/yago2geo/ontology/>
PREFIX strdf: <http://strdf.di.uoa.gr/ontology#>
PREFIX uom: <http://www.opengis.net/def/uom/OGC/1.0/>
PREFIX owl: <http://www.w3.org/2002/07/owl#>
|
|
``` |
|
|
|
|
|
## License |
|
|
|
|
|
Released under the Creative Commons Attribution 4.0 International (CC BY 4.0) license.
|
|
|
|
|
Copyright © 2024 AI-Team, University of Athens |
|
|
|
|
|
|