SidneyBissoli committed on
Commit 3c77882 · 1 Parent(s): a2a6023

Fix data access examples: point to year partitions

Files changed (1): README.md (+14 −8)
README.md CHANGED
````diff
@@ -87,15 +87,21 @@ Sys.setenv(
   AWS_DEFAULT_REGION = "auto"
 )
 
-ds <- open_dataset("s3://healthbr-data/sinasc/", format = "parquet")
+# Open a single year (fastest)
+ds <- open_dataset("s3://healthbr-data/sinasc/ano=2022/", format = "parquet")
 
-# Example: live births in São Paulo, 2022
+# Example: live births in São Paulo, 2022, by sex
 ds |>
-  filter(ano == "2022", uf == "SP") |>
+  filter(uf == "SP") |>
   count(SEXO) |>
   collect()
+
 ```
 
+> **Important:** Point to year partitions (`ano=YYYY/`), not to the dataset
+> root. The root contains `README.md` and `manifest.json`, which Arrow
+> cannot read as Parquet files.
+
 ### Python (PyArrow)
 
 ```python
@@ -109,21 +115,21 @@ s3 = fs.S3FileSystem(
   region="auto"
 )
 
+# Single year
 dataset = pds.dataset(
-    "healthbr-data/sinasc/",
+    "healthbr-data/sinasc/ano=2022/",
     filesystem=s3,
     format="parquet",
     partitioning="hive"
 )
-
-table = dataset.to_table(
-    filter=(pds.field("ano") == "2022") & (pds.field("uf") == "SP")
-)
+# Example: live births in São Paulo, 2022
+table = dataset.to_table(filter=(pds.field("uf") == "SP"))
 print(table.to_pandas().head())
 ```
 
 > **Note:** These credentials are **read-only** and safe to use in scripts.
 > The bucket does not allow anonymous S3 access — credentials are required.
+> Point to year partitions, not the dataset root (see note above).
 
 ## File structure
````
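The fix works because the dataset uses Hive-style partition paths (`ano=YYYY/`): selecting a partition directory naturally excludes the metadata files sitting at the dataset root. A minimal stdlib-only sketch of that layout (the mock directory and part filenames here are hypothetical, standing in for the real bucket contents):

```python
import tempfile
from pathlib import Path

# Build a mock Hive-partitioned layout: metadata at the root,
# Parquet parts inside ano=YYYY/ partition directories.
root = Path(tempfile.mkdtemp()) / "sinasc"
for rel in [
    "README.md",
    "manifest.json",
    "ano=2021/part-0.parquet",
    "ano=2022/part-0.parquet",
    "ano=2022/part-1.parquet",
]:
    path = root / rel
    path.parent.mkdir(parents=True, exist_ok=True)
    path.touch()

# Scanning the dataset root picks up non-Parquet files,
# which is what made Arrow fail before this commit.
root_files = sorted(p.name for p in root.iterdir() if p.is_file())
print(root_files)   # ['README.md', 'manifest.json']

# Scanning a single year partition yields only Parquet parts.
year_files = sorted(p.name for p in (root / "ano=2022").glob("*.parquet"))
print(year_files)   # ['part-0.parquet', 'part-1.parquet']
```

The same pruning is why the `filter(ano == "2022", ...)` clause became redundant in the R example: once the path names the partition, the `ano` predicate is already applied.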