
Preview of agents.csv (agent_id is int64; all other columns are string):

agent_id,agent_name,agent_type,deployment_env,region,status,traffic_tier,reliability_profile
1,agent_1,support,staging,europe,healthy,critical,stable
2,agent_2,research,staging,global,healthy,medium,volatile
3,agent_3,research,production,global,degraded,low,stable
4,agent_4,copilot,staging,north_america,watch,low,volatile
5,agent_5,research,staging,asia_pacific,watch,medium,fragile
6,agent_6,ops,production,asia_pacific,degraded,medium,fragile
7,agent_7,ops,beta,global,paused,low,stable
8,agent_8,sales,sandbox,global,healthy,low,volatile
9,agent_9,ops,sandbox,north_america,healthy,high,recovering
10,agent_10,research,beta,global,watch,low,stable
11,agent_11,support,staging,global,paused,medium,recovering
12,agent_12,ops,production,europe,watch,medium,stable
13,agent_13,support,production,europe,watch,critical,fragile
14,agent_14,support,beta,global,healthy,low,volatile
15,agent_15,research,staging,global,healthy,critical,stable
16,agent_16,support,sandbox,global,watch,critical,fragile
17,agent_17,ops,staging,europe,healthy,high,stable
18,agent_18,support,staging,north_america,watch,low,stable
19,agent_19,ops,staging,asia_pacific,degraded,critical,stable
20,agent_20,copilot,production,europe,watch,high,recovering
21,agent_21,copilot,beta,north_america,watch,high,recovering
22,agent_22,ops,beta,north_america,healthy,high,volatile
23,agent_23,copilot,production,north_america,watch,critical,stable
24,agent_24,support,staging,north_america,watch,medium,recovering
25,agent_25,research,staging,asia_pacific,paused,medium,fragile
26,agent_26,research,staging,global,degraded,critical,volatile
27,agent_27,ops,production,north_america,paused,low,recovering
28,agent_28,sales,beta,global,healthy,high,volatile
29,agent_29,sales,staging,global,degraded,critical,stable
30,agent_30,research,sandbox,global,watch,low,volatile
31,agent_31,ops,staging,north_america,healthy,critical,recovering
32,agent_32,sales,production,north_america,paused,low,recovering
33,agent_33,research,sandbox,global,paused,high,volatile
34,agent_34,ops,beta,europe,healthy,medium,recovering
35,agent_35,research,sandbox,asia_pacific,paused,critical,stable
36,agent_36,support,sandbox,global,paused,critical,fragile
37,agent_37,research,production,global,watch,critical,stable
38,agent_38,ops,beta,asia_pacific,healthy,critical,stable
39,agent_39,support,staging,europe,watch,critical,recovering
40,agent_40,ops,staging,asia_pacific,paused,critical,stable
41,agent_41,ops,production,asia_pacific,degraded,low,fragile
42,agent_42,ops,sandbox,europe,paused,critical,volatile
43,agent_43,support,staging,global,degraded,medium,recovering
44,agent_44,ops,production,europe,paused,medium,fragile
45,agent_45,support,sandbox,europe,degraded,critical,volatile
46,agent_46,ops,staging,north_america,watch,medium,fragile

Agentic AI Observability

A free sample dataset for AI agent reliability dashboards, anomaly exploration, and observability-oriented analytics workflows.

What is included

  • agents.csv: 46 rows, 8 columns
  • events.csv: 5590 rows, 8 columns
  • runs.csv: 1863 rows, 8 columns

Why this dataset is useful

  • Good starter sample for an agent reliability dashboard or observability notebook.
  • Useful for validating run-level and event-level analytics in Python, SQL, and BI tools.
  • Lightweight enough for quick experiments while still matching the core workflow of the full starter pack.
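As a sketch of the run-level analytics this sample supports, the snippet below computes a per-agent failure rate using only the standard library. The column names follow the schema listed on this card; the inline sample rows and the `run_status` value `"failed"` are illustrative assumptions, not values taken from the actual files.

```python
import csv
import io

# In-memory stand-ins for agents.csv and runs.csv; columns follow the
# schema on this card, values are invented for illustration.
agents_csv = """agent_id,agent_name,agent_type,deployment_env,region,status,traffic_tier,reliability_profile
1,agent_1,support,staging,europe,healthy,critical,stable
2,agent_2,research,staging,global,healthy,medium,volatile
"""
runs_csv = """run_id,agent_id,started_at,duration_ms,run_status,failure_risk_score,slo_breach_flag,anomaly_window_flag
101,1,2024-01-01T00:00:00,1200,success,0.12,0,0
102,1,2024-01-01T01:00:00,3400,failed,0.87,1,1
103,2,2024-01-01T02:00:00,900,success,0.05,0,0
"""

agents = {row["agent_id"]: row for row in csv.DictReader(io.StringIO(agents_csv))}
runs = list(csv.DictReader(io.StringIO(runs_csv)))

# Per-agent run counts and failure counts: the core of a reliability baseline.
totals, failures = {}, {}
for r in runs:
    aid = r["agent_id"]
    totals[aid] = totals.get(aid, 0) + 1
    failures[aid] = failures.get(aid, 0) + (r["run_status"] == "failed")

for aid, n in totals.items():
    print(agents[aid]["agent_name"], f"{failures[aid] / n:.0%}")
```

The same join (runs to agents on `agent_id`) carries over directly to pandas or a BI tool once the real CSVs are loaded.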

Starter use cases

  • Agent reliability baseline using run and event data.
  • Observability dashboard for event severity, run behavior, and agent health patterns.
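A severity rollup per agent is the kind of tile such a dashboard would start from; a minimal sketch follows, where the event rows (and the severity labels `info`/`warning`/`error`) are illustrative assumptions that only mirror the events.csv schema on this card.

```python
from collections import Counter

# Illustrative event rows shaped like events.csv on this card.
events = [
    {"event_id": 1, "run_id": 101, "agent_id": 1, "event_time": "2024-01-01T00:00:05",
     "event_type": "tool_call", "severity": "info", "component_name": "planner", "queue_depth": 3},
    {"event_id": 2, "run_id": 101, "agent_id": 1, "event_time": "2024-01-01T00:00:09",
     "event_type": "retry", "severity": "warning", "component_name": "executor", "queue_depth": 7},
    {"event_id": 3, "run_id": 102, "agent_id": 1, "event_time": "2024-01-01T01:00:02",
     "event_type": "timeout", "severity": "error", "component_name": "executor", "queue_depth": 12},
]

# Count events by (agent, severity): a typical dashboard aggregation.
by_agent = Counter((e["agent_id"], e["severity"]) for e in events)
print(by_agent)
```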

Schema overview

agents.csv

  • Rows: 46
  • Columns: agent_id, agent_name, agent_type, deployment_env, region, status, traffic_tier, reliability_profile

events.csv

  • Rows: 5590
  • Columns: event_id, run_id, agent_id, event_time, event_type, severity, component_name, queue_depth

runs.csv

  • Rows: 1863
  • Columns: run_id, agent_id, started_at, duration_ms, run_status, failure_risk_score, slo_breach_flag, anomaly_window_flag
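Since the three tables link through `agent_id` and `run_id`, the schema can also be exercised in SQL. The sqlite3 sketch below creates the tables with the column names from this card and joins them; the inserted sample values are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Table and column names taken from the schema overview on this card.
conn.executescript("""
CREATE TABLE agents (agent_id INTEGER, agent_name TEXT, agent_type TEXT,
                     deployment_env TEXT, region TEXT, status TEXT,
                     traffic_tier TEXT, reliability_profile TEXT);
CREATE TABLE runs   (run_id INTEGER, agent_id INTEGER, started_at TEXT,
                     duration_ms INTEGER, run_status TEXT, failure_risk_score REAL,
                     slo_breach_flag INTEGER, anomaly_window_flag INTEGER);
CREATE TABLE events (event_id INTEGER, run_id INTEGER, agent_id INTEGER,
                     event_time TEXT, event_type TEXT, severity TEXT,
                     component_name TEXT, queue_depth INTEGER);
""")

# Illustrative rows, one per table.
conn.execute("INSERT INTO agents VALUES (1,'agent_1','support','staging','europe','healthy','critical','stable')")
conn.execute("INSERT INTO runs   VALUES (101,1,'2024-01-01T00:00:00',1200,'success',0.12,0,0)")
conn.execute("INSERT INTO events VALUES (1,101,1,'2024-01-01T00:00:05','tool_call','info','planner',3)")

# Join events -> runs -> agents across the shared keys.
rows = conn.execute("""
    SELECT a.agent_name, r.run_status, e.event_type
    FROM events e
    JOIN runs r   ON r.run_id = e.run_id
    JOIN agents a ON a.agent_id = r.agent_id
""").fetchall()
print(rows)  # [('agent_1', 'success', 'tool_call')]
```

Loading the real CSVs into the same tables (e.g. via `.import` in the sqlite3 CLI) validates the full sample against this query.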

Free vs full version

  • Free sample: reduced rows, reduced columns, a starter notebook, and enough linked observability tables to validate the core workflow.
  • Full version: full row volume, richer feature coverage, tool and feedback tables, and extra starter assets for dashboard, SQL, and anomaly-analysis work.

Upgrade to full version

Notes

  • Contains generated data only and no real personal data.
  • Designed as a lightweight free sample for evaluation and discovery.