---
license: mit
language:
- en
tags:
- OntoLearner
- ontology-learning
- web-and-internet
pretty_name: Web And Internet
---
# OntoLearner

**Web And Internet Domain Ontologies**

## Overview

The web and internet domain encompasses ontologies that describe the structure and semantics of web technologies, including the relationships and protocols that underpin linked data, web services, and online communication standards. This domain advances knowledge representation by enabling the integration and interoperability of diverse data sources, supporting more intelligent and dynamic web interactions. Through precise modeling of web semantics, it underpins robust frameworks for data exchange and enhances the semantic web's capacity to deliver contextually relevant information.

## Ontologies

| Ontology ID | Full Name | Classes | Properties | Last Updated |
|-------------|-----------|---------|------------|--------------|
| Hydra | Hydra Ontology (Hydra) | 2 | 0 | 2021-07-13 |
| SAREF | Smart Applications REFerence ontology (SAREF) | 129 | 89 | 2020-12-31 |

## Dataset Files

Each ontology directory contains the following files:

1. `.` - The original ontology file
2. `term_typings.json` - A dataset of term-to-type mappings
3. `taxonomies.json` - A dataset of taxonomic relations
4. `non_taxonomic_relations.json` - A dataset of non-taxonomic relations
5. `.rst` - Documentation describing the ontology

## Usage

These datasets are intended for ontology learning research and applications. Here is how to use them with OntoLearner.

First, install the `OntoLearner` library via pip:

```bash
pip install ontolearner
```

**How to load an ontology or LLM4OL Paradigm tasks datasets?**

```python
from ontolearner import SAREF

# Load an ontology
ontology = SAREF()
ontology.load()

# Load (or extract) LLMs4OL Paradigm tasks datasets
data = ontology.extract()
```

**How to use the loaded dataset for LLM4OL Paradigm task settings?**

```python
# Import core modules from the OntoLearner library
from ontolearner import SAREF, LearnerPipeline, train_test_split

# Load the SAREF ontology, which contains concepts related to smart applications and devices
ontology = SAREF()
ontology.load()

# Load entities, types, and structured term annotations from the ontology
data = ontology.extract()

# Split into train and test sets
train_data, test_data = train_test_split(data, test_size=0.2, random_state=42)

# Initialize a multi-component learning pipeline (retriever + LLM)
# This configuration enables a Retrieval-Augmented Generation (RAG) setup
pipeline = LearnerPipeline(
    retriever_id='sentence-transformers/all-MiniLM-L6-v2',  # Dense retriever model for nearest-neighbor search
    llm_id='Qwen/Qwen2.5-0.5B-Instruct',  # Lightweight instruction-tuned LLM for reasoning
    hf_token='...',  # Hugging Face token for accessing gated models
    batch_size=32,  # Batch size for training/prediction if supported
    top_k=5  # Number of top retrievals to include in RAG prompting
)

# Run the pipeline: training, prediction, and evaluation in one call
outputs = pipeline(
    train_data=train_data,
    test_data=test_data,
    evaluate=True,  # Compute metrics like precision, recall, and F1
    task='term-typing'  # Other options: "taxonomy-discovery" or "non-taxonomy-discovery"
)

# Print final evaluation metrics
print("Metrics:", outputs['metrics'])

# Print the total time taken for the full pipeline execution
print("Elapsed time:", outputs['elapsed_time'])

# Print all outputs (including predictions)
print(outputs)
```

For more detailed documentation, see the [![Documentation](https://img.shields.io/badge/Documentation-ontolearner.readthedocs.io-blue)](https://ontolearner.readthedocs.io)

## Citation

If you find our work helpful, please cite:
```bibtex
@inproceedings{babaei2023llms4ol,
  title={LLMs4OL: Large language models for ontology learning},
  author={Babaei Giglou, Hamed and D’Souza, Jennifer and Auer, S{\"o}ren},
  booktitle={International Semantic Web Conference},
  pages={408--427},
  year={2023},
  organization={Springer}
}
```