---
language:
- en
license: mit
size_categories:
- 10K<n<100K
task_categories:
- question-answering
- text-analysis
tags:
- knowledge-coupling
- llama2
- hotpotqa
- multi-hop-reasoning
- gradient-analysis
- ripple-effects
---
# Knowledge Coupling Analysis on HotpotQA Dataset

## Dataset Description
This dataset contains the results of a comprehensive knowledge coupling analysis performed on the HotpotQA dataset with the LLaMA2-7B model. The analysis investigates how different pieces of knowledge interact within the model's parameter space, measured through gradient-based coupling.
## Research Overview
- Model: meta-llama/Llama-2-7b-hf (layers 28-31 focused analysis)
- Dataset: HotpotQA (train + dev splits, 97,852 total samples)
- Method: Gradient-based knowledge coupling via cosine similarity
- Target Layers: model.layers.28-31.mlp.down_proj (semantically rich layers)
## Key Findings
The analysis revealed:
- Mean coupling score: 0.0222 across all knowledge piece pairs
- High coupling pairs (≥0.4 threshold): Critical for ripple effect prediction
- Layer-specific analysis focusing on MLP down-projection layers
- Comprehensive gradient analysis with 180,355,072 dimensions per knowledge piece
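The coupling score above is simply the cosine similarity between two L2-normalized gradient vectors. A minimal sketch (using toy random vectors in place of the real ~180M-dimensional gradients):

```python
import numpy as np

def coupling_score(grad_a: np.ndarray, grad_b: np.ndarray) -> float:
    """Cosine similarity between two L2-normalized gradient vectors."""
    a = grad_a / np.linalg.norm(grad_a)
    b = grad_b / np.linalg.norm(grad_b)
    return float(np.dot(a, b))

# Toy stand-ins: real gradient vectors have 180,355,072 dimensions each.
rng = np.random.default_rng(0)
g1 = rng.normal(size=1000)
g2 = rng.normal(size=1000)
score = coupling_score(g1, g2)  # in [-1, 1]; near 0 for unrelated vectors
```

High-dimensional random vectors are nearly orthogonal, which is consistent with the low mean coupling score (0.0222) reported above: most knowledge-piece pairs barely interact.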
## Files Description

### Core Results

- `global_analysis_results.json`: Comprehensive analysis summary with statistics
- `all_knowledge_pieces.json`: Complete set of processed knowledge pieces (92 MB)
- `all_coupling_pairs.csv`: All pairwise coupling measurements (245 MB)

### Supporting Files

- `dataset_info.json`: Dataset statistics and conversion details
- `coupling_analysis_config.json`: Analysis configuration and parameters
## Usage

```python
from datasets import load_dataset

# Load the knowledge coupling results
dataset = load_dataset("your-username/hotpotqa-knowledge-coupling")

# Access global analysis results
global_results = dataset["global_analysis"]

# Access knowledge pieces
knowledge_pieces = dataset["knowledge_pieces"]

# Access coupling pairs
coupling_pairs = dataset["coupling_pairs"]
```
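A common task is extracting the high-coupling pairs (≥ 0.4, the threshold noted in Key Findings) from `all_coupling_pairs.csv`. A minimal stdlib sketch; the column names used here are an assumption, so check the actual CSV header before relying on them:

```python
import csv
import io

# In-memory stand-in for all_coupling_pairs.csv; the columns "piece_a",
# "piece_b", and "coupling_score" are hypothetical names for illustration.
sample_csv = """piece_a,piece_b,coupling_score
q1,q2,0.45
q1,q3,0.02
q2,q3,0.61
"""

# For the real file, use: reader = csv.DictReader(open("all_coupling_pairs.csv"))
reader = csv.DictReader(io.StringIO(sample_csv))

THRESHOLD = 0.4  # high-coupling threshold from Key Findings
high = [row for row in reader if float(row["coupling_score"]) >= THRESHOLD]
print(len(high))  # 2 of the 3 toy pairs clear the threshold
```

For the full 245 MB file, streaming row-by-row with `csv.DictReader` (or chunked reads in pandas) avoids loading everything into memory at once.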
## Citation

If you use this dataset in your research, please cite:

```bibtex
@dataset{hotpotqa_knowledge_coupling,
  title={Knowledge Coupling Analysis on HotpotQA Dataset using LLaMA2-7B},
  author={[Your Name]},
  year={2024},
  publisher={HuggingFace},
  url={https://huggingface.co/datasets/your-username/hotpotqa-knowledge-coupling}
}
```
## Technical Details
- Gradient Computation: ∇_θ log P(answer|question) for cloze-style questions
- Coupling Measurement: Cosine similarity between L2-normalized gradients
- Memory Optimization: Focused on layers 28-31 to handle GPU memory constraints
- Hardware: NVIDIA A40 GPU (46GB VRAM)
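The per-piece gradient is taken only over the targeted `down_proj` parameters and flattened into one vector. A runnable sketch of that selection step, using a tiny stand-in module in place of the real 7B model (the real analysis would select `model.layers.{28..31}.mlp.down_proj` in `meta-llama/Llama-2-7b-hf`):

```python
import torch
import torch.nn as nn

def target_gradient(model: nn.Module, loss: torch.Tensor,
                    pattern: str = "down_proj") -> torch.Tensor:
    """Flatten gradients of a scalar loss w.r.t. parameters whose
    names contain `pattern` (here: MLP down-projection weights)."""
    params = [p for n, p in model.named_parameters() if pattern in n]
    grads = torch.autograd.grad(loss, params)
    return torch.cat([g.reshape(-1) for g in grads])

# Toy stand-in for one transformer MLP block.
class ToyMLP(nn.Module):
    def __init__(self):
        super().__init__()
        self.up_proj = nn.Linear(8, 16, bias=False)
        self.down_proj = nn.Linear(16, 8, bias=False)

    def forward(self, x):
        return self.down_proj(torch.relu(self.up_proj(x)))

model = ToyMLP()
x = torch.randn(1, 8)
# Stand-in for -log P(answer | question): any scalar differentiable loss.
loss = model(x).logsumexp(dim=-1).sum()
g = target_gradient(model, loss)  # flat vector over down_proj params (16*8 = 128)
```

Restricting to four `down_proj` matrices (each 11008 × 4096 in LLaMA2-7B) yields exactly 4 × 11008 × 4096 = 180,355,072 dimensions, matching the figure reported in Key Findings, while keeping gradient storage within A40 memory limits.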
## License

This dataset is released under the MIT License. The original HotpotQA dataset remains subject to its own licensing terms.