---
license: mit
task_categories:
- text-classification
- feature-extraction
language:
- en
size_categories:
- n<1K
tags:
- geo
- geography
- US
- Location
- Geospatial
- HNM
- Numerical
- categorical
- demographic
- social
- economic
---
# US_GeoSpatial_dataset_by_HNM
## Dataset Description
This dataset contains 56 records with 16 features.
### Dataset Summary
| Metric | Value |
|--------|-------|
| Total Rows | 56 |
| Total Columns | 16 |
| Numeric Columns | 10 |
| Categorical Columns | 6 |
| Missing Values | 0 (0.00%) |
| Duplicate Rows | 0 |
| Memory Usage | 19.51 MB |
## Dataset Structure
### Data Fields
| Column | Type | Sample/Range | Unique Values | Missing % |
|--------|------|--------------|---------------|----------|
| `geo_id` | int64 | Range: [1.00, 78.00] | 56 | 0.0% |
| `region_code` | int64 | Range: [1.00, 9.00] | 5 | 0.0% |
| `division_code` | int64 | Range: [0.00, 9.00] | 10 | 0.0% |
| `state_fips_code` | int64 | Range: [1.00, 78.00] | 56 | 0.0% |
| `state_gnis_code` | int64 | Range: [68085.00, 1802710.00] | 56 | 0.0% |
| `state` | object | Example: 'GU' | 56 | 0.0% |
| `state_name` | object | Example: 'Guam' | 56 | 0.0% |
| `lsad_code` | int64 | Range: [0.00, 0.00] | 1 | 0.0% |
| `mtfcc_feature_class_code` | object | Example: 'G4000' | 1 | 0.0% |
| `functional_status` | object | Example: 'A' | 1 | 0.0% |
| `area_land_meters` | int64 | Range: [158340389.00, 1478927050067.00] | 56 | 0.0% |
| `area_water_meters` | int64 | Range: [18687196.00, 245394222619.00] | 56 | 0.0% |
| `int_point_lat` | float64 | Range: [-14.27, 63.35] | 56 | 0.0% |
| `int_point_lon` | float64 | Range: [-170.67, 145.60] | 56 | 0.0% |
| `int_point_geom` | object | Example: 'POINT(144.7719021 13.4417451)' | 56 | 0.0% |
| `state_geom` | object | Example: 'POLYGON((144.563426 13.448065, 144.56355 13.445248' | 56 | 0.0% |
### Data Splits
This dataset contains a single split with all 56 examples.
The data comes from **US Census Bureau geographic data**, specifically the **state-level geographic identifiers and measurements** table.
## What This Dataset Contains
| Column | Meaning |
|--------|---------|
| `geo_id`, `state_fips_code` | Federal Information Processing Standard codes for US states/territories |
| `region_code`, `division_code` | Census Bureau regional classifications (4 regions, 9 divisions) |
| `state_gnis_code` | Geographic Names Information System identifiers |
| `area_land_meters`, `area_water_meters` | Land and water area measurements |
| `int_point_lat`, `int_point_lon` | Internal centroid coordinates (center point of each state) |
## Practical Use Cases
**1. Geospatial Analysis & Mapping**
- Creating choropleth maps of US states
- Calculating distances between state centers
- Building location-based services that need state boundaries
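For example, distances between state centers can be computed from `int_point_lat` / `int_point_lon` with the haversine formula. A minimal sketch (the two centroid pairs below are illustrative values in this dataset's coordinate style, not exact rows):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Illustrative centroid pairs (int_point_lat, int_point_lon)
colorado = (38.9, -105.5)
kansas = (38.5, -98.4)
print(haversine_km(*colorado, *kansas))
```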
**2. Data Enrichment & Joins**
- Joining with other datasets (population, economic, health data) using FIPS codes
- Standardizing geographic identifiers across multiple data sources
- Linking Census data with external APIs
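A typical enrichment join keys on `state_fips_code`. The sketch below merges an illustrative slice of this dataset with a hypothetical external population table (the population figures are placeholders, not sourced data):

```python
import pandas as pd

# Illustrative slice of this dataset's key columns
geo = pd.DataFrame({
    "state_fips_code": [6, 36, 48],
    "state_name": ["California", "New York", "Texas"],
})

# Hypothetical external table keyed on the same FIPS codes
population = pd.DataFrame({
    "state_fips_code": [6, 36, 48],
    "population": [39_500_000, 19_500_000, 30_000_000],
})

# Left join keeps every geographic record even if the external table has gaps
enriched = geo.merge(population, on="state_fips_code", how="left")
print(enriched)
```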
**3. Regional Analytics**
- Comparing metrics across Census regions/divisions (Northeast, Midwest, South, West)
- Grouping states for regional market analysis
- Understanding geographic distribution patterns
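Regional comparisons reduce to a groupby on `region_code`. A minimal sketch with illustrative rows (region codes follow the Census convention of 1=Northeast, 2=Midwest, 3=South, 4=West; the area figures are rounded placeholders):

```python
import pandas as pd

# Illustrative rows mirroring this dataset's schema
df = pd.DataFrame({
    "state_name": ["Maine", "Ohio", "Texas", "Oregon", "Vermont"],
    "region_code": [1, 2, 3, 4, 1],
    "area_land_meters": [79_883_000_000, 105_829_000_000, 676_587_000_000,
                         248_608_000_000, 23_871_000_000],
})

# Total land area per Census region
by_region = df.groupby("region_code")["area_land_meters"].sum()
print(by_region)
```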
**4. Environmental & Land Use Studies**
- Calculating land-to-water ratios by state
- Analyzing state sizes for resource allocation models
- Comparing population density (when combined with population data)
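The land-to-water ratio mentioned above is a single vectorized division over `area_land_meters` and `area_water_meters`. A sketch with illustrative (rounded) values:

```python
import pandas as pd

# Illustrative values in square metres, mirroring the dataset's area columns
df = pd.DataFrame({
    "state": ["MI", "AZ"],
    "area_land_meters": [146_455_000_000, 294_198_000_000],
    "area_water_meters": [103_885_000_000, 855_000_000],
})

# Ratio > 1 means more land than water; small values flag water-heavy states
df["land_water_ratio"] = df["area_land_meters"] / df["area_water_meters"]
print(df[["state", "land_water_ratio"]])
```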
**5. Machine Learning Features**
- Geographic features for predictive models
- Clustering states by location or size
- Spatial autocorrelation analysis
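Before feeding centroids and areas to a model or clusterer, the columns should be put on a comparable scale, since `area_land_meters` dwarfs latitude and longitude. A minimal z-score sketch (the three feature rows are illustrative):

```python
import numpy as np

# Illustrative feature matrix: [int_point_lat, int_point_lon, log land area]
X = np.array([
    [38.9, -105.5, np.log(268_418_000_000)],
    [27.7,  -81.5, np.log(138_949_000_000)],
    [61.3, -152.8, np.log(1_478_927_000_000)],
])

# Standardize each column (zero mean, unit variance) before clustering
X_std = (X - X.mean(axis=0)) / X.std(axis=0)
print(X_std)
```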
This dataset is essentially a **foundational geographic reference dataset** - most valuable when joined with demographic, economic, or social datasets for spatial analysis.
## Usage
### Loading the Dataset
```python
from datasets import load_dataset
# Load from Hugging Face Hub
dataset = load_dataset("Omarrran/US_GeoSpatial_Dataset_by_HNM")
# Access the data
train_data = dataset['train']
print(train_data[0])
```
### Loading as Pandas DataFrame
```python
import pandas as pd
from datasets import load_dataset
dataset = load_dataset("Omarrran/US_GeoSpatial_Dataset_by_HNM")
df = dataset['train'].to_pandas()
```
## Statistical Summary
### Numeric Columns
```
geo_id region_code division_code state_fips_code state_gnis_code lsad_code area_land_meters area_water_meters int_point_lat int_point_lon
count 56.000000 56.000000 56.000000 56.000000 5.600000e+01 56.0 5.600000e+01 5.600000e+01 56.000000 56.000000
mean 32.535714 3.232143 4.660714 32.535714 1.522958e+06 0.0 1.635888e+11 1.245427e+10 36.944973 -85.299146
std 19.075891 2.080132 2.830206 19.075891 4.648599e+05 0.0 2.173973e+11 3.503733e+10 11.055418 49.717199
min 1.000000 1.000000 0.000000 1.000000 6.808500e+04 0.0 1.583404e+08 1.868720e+07 -14.267159 -170.668267
25% 17.750000 2.000000 2.750000 17.750000 1.423460e+06 0.0 2.483234e+10 1.502762e+09 34.376997 -101.722698
50% 31.500000 3.000000 5.000000 31.500000 1.779784e+06 0.0 1.285501e+11 3.706434e+09 38.996207 -87.998035
75% 46.250000 4.000000 7.000000 46.250000 1.779800e+06 0.0 2.007729e+11 8.964466e+09 42.932463 -76.931011
max 78.000000 9.000000 9.000000 78.000000 1.802710e+06 0.0 1.478927e+12 2.453942e+11 63.347356 145.601021
```
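A table in this shape is what pandas `describe()` produces; on the full dataset it reproduces the summary above. A minimal sketch with a tiny illustrative frame:

```python
import pandas as pd

# Tiny illustrative frame; run describe() on the full dataset for the table above
df = pd.DataFrame({
    "int_point_lat": [38.9, 27.7, 61.3],
    "int_point_lon": [-105.5, -81.5, -152.8],
})

summary = df.describe()  # count, mean, std, min, quartiles, max per numeric column
print(summary)
```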
## Additional Information
### Dataset Creation
- **Source**: /content/us_state_boundaries_56_20251126_191744.csv
- **Created**: 2025-11-28
- **Format**: CSV
![image](https://cdn-uploads.huggingface.co/production/uploads/66afb3f1eaf3e876595627bf/EQZgiNiyL2TV4UngvIJYX.png)
![image](https://cdn-uploads.huggingface.co/production/uploads/66afb3f1eaf3e876595627bf/oCSwTubmkRSG4Myni1IF3.png)
![image](https://cdn-uploads.huggingface.co/production/uploads/66afb3f1eaf3e876595627bf/k7_T9B3AYU1-52Od0Gpwr.png)
### Licensing Information
This dataset is released under the MIT License.
### Citation Information
```bibtex
@dataset{us_geospatial_dataset_by_hnm,
  title     = {US_GeoSpatial_dataset_by_HNM},
  author    = {Haq Nawaz Malik},
  year      = {2025},
  publisher = {Hugging Face},
  url       = {https://huggingface.co/datasets/Omarrran/US_GeoSpatial_Dataset_by_HNM}
}
```
### Contributions
Thanks to the contributors who helped in creating this dataset.
---