ZTWHHH committed on
Commit 61773a7 · verified · 1 Parent(s): 18da568

Add files using upload-large-folder tool

This view is limited to 50 files because it contains too many changes.

Files changed (50)
  1. videochat2/lib/python3.10/site-packages/pandas/tests/extension/__init__.py +0 -0
  2. videochat2/lib/python3.10/site-packages/pandas/tests/extension/conftest.py +206 -0
  3. videochat2/lib/python3.10/site-packages/pandas/tests/extension/date/array.py +182 -0
  4. videochat2/lib/python3.10/site-packages/pandas/tests/extension/json/__init__.py +7 -0
  5. videochat2/lib/python3.10/site-packages/pandas/tests/extension/json/__pycache__/array.cpython-310.pyc +0 -0
  6. videochat2/lib/python3.10/site-packages/pandas/tests/extension/json/__pycache__/test_json.cpython-310.pyc +0 -0
  7. videochat2/lib/python3.10/site-packages/pandas/tests/extension/json/array.py +244 -0
  8. videochat2/lib/python3.10/site-packages/pandas/tests/extension/json/test_json.py +395 -0
  9. videochat2/lib/python3.10/site-packages/pandas/tests/extension/test_arrow.py +2749 -0
  10. videochat2/lib/python3.10/site-packages/pandas/tests/extension/test_boolean.py +401 -0
  11. videochat2/lib/python3.10/site-packages/pandas/tests/extension/test_categorical.py +316 -0
  12. videochat2/lib/python3.10/site-packages/pandas/tests/extension/test_common.py +80 -0
  13. videochat2/lib/python3.10/site-packages/pandas/tests/extension/test_datetime.py +194 -0
  14. videochat2/lib/python3.10/site-packages/pandas/tests/extension/test_extension.py +26 -0
  15. videochat2/lib/python3.10/site-packages/pandas/tests/extension/test_external_block.py +39 -0
  16. videochat2/lib/python3.10/site-packages/pandas/tests/extension/test_floating.py +225 -0
  17. videochat2/lib/python3.10/site-packages/pandas/tests/extension/test_integer.py +294 -0
  18. videochat2/lib/python3.10/site-packages/pandas/tests/extension/test_numpy.py +456 -0
  19. videochat2/lib/python3.10/site-packages/pandas/tests/extension/test_period.py +202 -0
  20. videochat2/lib/python3.10/site-packages/pandas/tests/extension/test_sparse.py +475 -0
  21. videochat2/lib/python3.10/site-packages/pandas/tests/extension/test_string.py +282 -0
  22. videochat2/lib/python3.10/site-packages/pandas/tests/frame/__pycache__/__init__.cpython-310.pyc +0 -0
  23. videochat2/lib/python3.10/site-packages/pandas/tests/frame/__pycache__/common.cpython-310.pyc +0 -0
  24. videochat2/lib/python3.10/site-packages/pandas/tests/frame/__pycache__/conftest.cpython-310.pyc +0 -0
  25. videochat2/lib/python3.10/site-packages/pandas/tests/frame/__pycache__/test_alter_axes.cpython-310.pyc +0 -0
  26. videochat2/lib/python3.10/site-packages/pandas/tests/frame/__pycache__/test_api.cpython-310.pyc +0 -0
  27. videochat2/lib/python3.10/site-packages/pandas/tests/frame/__pycache__/test_arithmetic.cpython-310.pyc +0 -0
  28. videochat2/lib/python3.10/site-packages/pandas/tests/frame/__pycache__/test_block_internals.cpython-310.pyc +0 -0
  29. videochat2/lib/python3.10/site-packages/pandas/tests/frame/__pycache__/test_cumulative.cpython-310.pyc +0 -0
  30. videochat2/lib/python3.10/site-packages/pandas/tests/frame/__pycache__/test_iteration.cpython-310.pyc +0 -0
  31. videochat2/lib/python3.10/site-packages/pandas/tests/frame/__pycache__/test_logical_ops.cpython-310.pyc +0 -0
  32. videochat2/lib/python3.10/site-packages/pandas/tests/frame/__pycache__/test_nonunique_indexes.cpython-310.pyc +0 -0
  33. videochat2/lib/python3.10/site-packages/pandas/tests/frame/__pycache__/test_npfuncs.cpython-310.pyc +0 -0
  34. videochat2/lib/python3.10/site-packages/pandas/tests/frame/__pycache__/test_query_eval.cpython-310.pyc +0 -0
  35. videochat2/lib/python3.10/site-packages/pandas/tests/frame/__pycache__/test_reductions.cpython-310.pyc +0 -0
  36. videochat2/lib/python3.10/site-packages/pandas/tests/frame/__pycache__/test_repr_info.cpython-310.pyc +0 -0
  37. videochat2/lib/python3.10/site-packages/pandas/tests/frame/__pycache__/test_stack_unstack.cpython-310.pyc +0 -0
  38. videochat2/lib/python3.10/site-packages/pandas/tests/frame/__pycache__/test_subclass.cpython-310.pyc +0 -0
  39. videochat2/lib/python3.10/site-packages/pandas/tests/frame/__pycache__/test_ufunc.cpython-310.pyc +0 -0
  40. videochat2/lib/python3.10/site-packages/pandas/tests/frame/__pycache__/test_unary.cpython-310.pyc +0 -0
  41. videochat2/lib/python3.10/site-packages/pandas/tests/frame/__pycache__/test_validate.cpython-310.pyc +0 -0
  42. videochat2/lib/python3.10/site-packages/pandas/tests/frame/indexing/__pycache__/__init__.cpython-310.pyc +0 -0
  43. videochat2/lib/python3.10/site-packages/pandas/tests/frame/indexing/__pycache__/test_coercion.cpython-310.pyc +0 -0
  44. videochat2/lib/python3.10/site-packages/pandas/tests/frame/indexing/__pycache__/test_delitem.cpython-310.pyc +0 -0
  45. videochat2/lib/python3.10/site-packages/pandas/tests/frame/indexing/__pycache__/test_get.cpython-310.pyc +0 -0
  46. videochat2/lib/python3.10/site-packages/pandas/tests/frame/indexing/__pycache__/test_get_value.cpython-310.pyc +0 -0
  47. videochat2/lib/python3.10/site-packages/pandas/tests/frame/indexing/__pycache__/test_getitem.cpython-310.pyc +0 -0
  48. videochat2/lib/python3.10/site-packages/pandas/tests/frame/indexing/__pycache__/test_indexing.cpython-310.pyc +0 -0
  49. videochat2/lib/python3.10/site-packages/pandas/tests/frame/indexing/__pycache__/test_insert.cpython-310.pyc +0 -0
  50. videochat2/lib/python3.10/site-packages/pandas/tests/frame/indexing/__pycache__/test_mask.cpython-310.pyc +0 -0
videochat2/lib/python3.10/site-packages/pandas/tests/extension/__init__.py ADDED
File without changes
videochat2/lib/python3.10/site-packages/pandas/tests/extension/conftest.py ADDED
@@ -0,0 +1,206 @@
+import operator
+
+import pytest
+
+from pandas import (
+    Series,
+    options,
+)
+
+
+@pytest.fixture
+def dtype():
+    """A fixture providing the ExtensionDtype to validate."""
+    raise NotImplementedError
+
+
+@pytest.fixture
+def data():
+    """
+    Length-100 array for this type.
+
+    * data[0] and data[1] should both be non missing
+    * data[0] and data[1] should not be equal
+    """
+    raise NotImplementedError
+
+
+@pytest.fixture
+def data_for_twos():
+    """Length-100 array in which all the elements are two."""
+    raise NotImplementedError
+
+
+@pytest.fixture
+def data_missing():
+    """Length-2 array with [NA, Valid]"""
+    raise NotImplementedError
+
+
+@pytest.fixture(params=["data", "data_missing"])
+def all_data(request, data, data_missing):
+    """Parametrized fixture giving 'data' and 'data_missing'"""
+    if request.param == "data":
+        return data
+    elif request.param == "data_missing":
+        return data_missing
+
+
+@pytest.fixture
+def data_repeated(data):
+    """
+    Generate many datasets.
+
+    Parameters
+    ----------
+    data : fixture implementing `data`
+
+    Returns
+    -------
+    Callable[[int], Generator]:
+        A callable that takes a `count` argument and
+        returns a generator yielding `count` datasets.
+    """
+
+    def gen(count):
+        for _ in range(count):
+            yield data
+
+    return gen
+
+
+@pytest.fixture
+def data_for_sorting():
+    """
+    Length-3 array with a known sort order.
+
+    This should be three items [B, C, A] with
+    A < B < C
+    """
+    raise NotImplementedError
+
+
+@pytest.fixture
+def data_missing_for_sorting():
+    """
+    Length-3 array with a known sort order.
+
+    This should be three items [B, NA, A] with
+    A < B and NA missing.
+    """
+    raise NotImplementedError
+
+
+@pytest.fixture
+def na_cmp():
+    """
+    Binary operator for comparing NA values.
+
+    Should return a function of two arguments that returns
+    True if both arguments are (scalar) NA for your type.
+
+    By default, uses ``operator.is_``
+    """
+    return operator.is_
+
+
+@pytest.fixture
+def na_value():
+    """The scalar missing value for this type. Default 'None'"""
+    return None
+
+
+@pytest.fixture
+def data_for_grouping():
+    """
+    Data for factorization, grouping, and unique tests.
+
+    Expected to be like [B, B, NA, NA, A, A, B, C]
+
+    Where A < B < C and NA is missing
+    """
+    raise NotImplementedError
+
+
+@pytest.fixture(params=[True, False])
+def box_in_series(request):
+    """Whether to box the data in a Series"""
+    return request.param
+
+
+@pytest.fixture(
+    params=[
+        lambda x: 1,
+        lambda x: [1] * len(x),
+        lambda x: Series([1] * len(x)),
+        lambda x: x,
+    ],
+    ids=["scalar", "list", "series", "object"],
+)
+def groupby_apply_op(request):
+    """
+    Functions to test groupby.apply().
+    """
+    return request.param
+
+
+@pytest.fixture(params=[True, False])
+def as_frame(request):
+    """
+    Boolean fixture to support Series and Series.to_frame() comparison testing.
+    """
+    return request.param
+
+
+@pytest.fixture(params=[True, False])
+def as_series(request):
+    """
+    Boolean fixture to support arr and Series(arr) comparison testing.
+    """
+    return request.param
+
+
+@pytest.fixture(params=[True, False])
+def use_numpy(request):
+    """
+    Boolean fixture to support comparison testing of ExtensionDtype array
+    and numpy array.
+    """
+    return request.param
+
+
+@pytest.fixture(params=["ffill", "bfill"])
+def fillna_method(request):
+    """
+    Parametrized fixture giving method parameters 'ffill' and 'bfill' for
+    Series.fillna(method=<method>) testing.
+    """
+    return request.param
+
+
+@pytest.fixture(params=[True, False])
+def as_array(request):
+    """
+    Boolean fixture to support ExtensionDtype _from_sequence method testing.
+    """
+    return request.param
+
+
+@pytest.fixture
+def invalid_scalar(data):
+    """
+    A scalar that *cannot* be held by this ExtensionArray.
+
+    The default should work for most subclasses, but is not guaranteed.
+
+    If the array can hold any item (i.e. object dtype), then use pytest.skip.
+    """
+    return object.__new__(object)
+
+
+@pytest.fixture
+def using_copy_on_write() -> bool:
+    """
+    Fixture to check if Copy-on-Write is enabled.
+    """
+    return options.mode.copy_on_write and options.mode.data_manager == "block"
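The conftest above mostly defines abstract fixtures that concrete extension-test modules override. One pattern worth noting is `data_repeated`, which returns a generator factory rather than the data itself. A plain-Python sketch of that pattern (illustrative only, not part of the committed diff; `make_data_repeated` is a hypothetical name):

```python
# Hypothetical stand-alone illustration of the data_repeated pattern:
# a factory that returns a callable which yields the same dataset `count` times.
def make_data_repeated(data):
    def gen(count):
        # Yield the identical dataset object `count` times, mirroring the
        # fixture's inner `gen`.
        for _ in range(count):
            yield data

    return gen


gen = make_data_repeated([1, 2, 3])
datasets = list(gen(4))
print(len(datasets))  # 4
print(datasets[0])    # [1, 2, 3]
```

Returning a factory lets a single fixture serve tests that need an arbitrary number of copies without materializing them eagerly.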
videochat2/lib/python3.10/site-packages/pandas/tests/extension/date/array.py ADDED
@@ -0,0 +1,182 @@
+import datetime as dt
+from typing import (
+    Any,
+    Optional,
+    Sequence,
+    Tuple,
+    Union,
+    cast,
+)
+
+import numpy as np
+
+from pandas._typing import (
+    Dtype,
+    PositionalIndexer,
+)
+
+from pandas.core.dtypes.dtypes import register_extension_dtype
+
+from pandas.api.extensions import (
+    ExtensionArray,
+    ExtensionDtype,
+)
+from pandas.api.types import pandas_dtype
+
+
+@register_extension_dtype
+class DateDtype(ExtensionDtype):
+    @property
+    def type(self):
+        return dt.date
+
+    @property
+    def name(self):
+        return "DateDtype"
+
+    @classmethod
+    def construct_from_string(cls, string: str):
+        if not isinstance(string, str):
+            raise TypeError(
+                f"'construct_from_string' expects a string, got {type(string)}"
+            )
+
+        if string == cls.__name__:
+            return cls()
+        else:
+            raise TypeError(f"Cannot construct a '{cls.__name__}' from '{string}'")
+
+    @classmethod
+    def construct_array_type(cls):
+        return DateArray
+
+    @property
+    def na_value(self):
+        return dt.date.min
+
+    def __repr__(self) -> str:
+        return self.name
+
+
+class DateArray(ExtensionArray):
+    def __init__(
+        self,
+        dates: Union[
+            dt.date,
+            Sequence[dt.date],
+            Tuple[np.ndarray, np.ndarray, np.ndarray],
+            np.ndarray,
+        ],
+    ) -> None:
+        if isinstance(dates, dt.date):
+            self._year = np.array([dates.year])
+            self._month = np.array([dates.month])
+            self._day = np.array([dates.day])
+            return
+
+        ldates = len(dates)
+        if isinstance(dates, list):
+            # pre-allocate the arrays since we know the size before hand
+            self._year = np.zeros(ldates, dtype=np.uint16)  # 65535 (0, 9999)
+            self._month = np.zeros(ldates, dtype=np.uint8)  # 255 (1, 12)
+            self._day = np.zeros(ldates, dtype=np.uint8)  # 255 (1, 31)
+            # populate them
+            for i, (y, m, d) in enumerate(
+                map(lambda date: (date.year, date.month, date.day), dates)
+            ):
+                self._year[i] = y
+                self._month[i] = m
+                self._day[i] = d
+
+        elif isinstance(dates, tuple):
+            # only support triples
+            if ldates != 3:
+                raise ValueError("only triples are valid")
+            # check if all elements have the same type
+            if any(map(lambda x: not isinstance(x, np.ndarray), dates)):
+                raise TypeError("invalid type")
+            ly, lm, ld = (len(cast(np.ndarray, d)) for d in dates)
+            if not ly == lm == ld:
+                raise ValueError(
+                    f"tuple members must have the same length: {(ly, lm, ld)}"
+                )
+            self._year = dates[0].astype(np.uint16)
+            self._month = dates[1].astype(np.uint8)
+            self._day = dates[2].astype(np.uint8)
+
+        elif isinstance(dates, np.ndarray) and dates.dtype == "U10":
+            self._year = np.zeros(ldates, dtype=np.uint16)  # 65535 (0, 9999)
+            self._month = np.zeros(ldates, dtype=np.uint8)  # 255 (1, 12)
+            self._day = np.zeros(ldates, dtype=np.uint8)  # 255 (1, 31)
+
+            # error: "object_" object is not iterable
+            obj = np.char.split(dates, sep="-")
+            for (i,), (y, m, d) in np.ndenumerate(obj):  # type: ignore[misc]
+                self._year[i] = int(y)
+                self._month[i] = int(m)
+                self._day[i] = int(d)
+
+        else:
+            raise TypeError(f"{type(dates)} is not supported")
+
+    @property
+    def dtype(self) -> ExtensionDtype:
+        return DateDtype()
+
+    def astype(self, dtype, copy=True):
+        dtype = pandas_dtype(dtype)
+
+        if isinstance(dtype, DateDtype):
+            data = self.copy() if copy else self
+        else:
+            data = self.to_numpy(dtype=dtype, copy=copy, na_value=dt.date.min)
+
+        return data
+
+    @property
+    def nbytes(self) -> int:
+        return self._year.nbytes + self._month.nbytes + self._day.nbytes
+
+    def __len__(self) -> int:
+        return len(self._year)  # all 3 arrays are enforced to have the same length
+
+    def __getitem__(self, item: PositionalIndexer):
+        if isinstance(item, int):
+            return dt.date(self._year[item], self._month[item], self._day[item])
+        else:
+            raise NotImplementedError("only ints are supported as indexes")
+
+    def __setitem__(self, key: Union[int, slice, np.ndarray], value: Any):
+        if not isinstance(key, int):
+            raise NotImplementedError("only ints are supported as indexes")
+
+        if not isinstance(value, dt.date):
+            raise TypeError("you can only set datetime.date types")
+
+        self._year[key] = value.year
+        self._month[key] = value.month
+        self._day[key] = value.day
+
+    def __repr__(self) -> str:
+        return f"DateArray{list(zip(self._year, self._month, self._day))}"
+
+    def copy(self) -> "DateArray":
+        return DateArray((self._year.copy(), self._month.copy(), self._day.copy()))
+
+    def isna(self) -> np.ndarray:
+        return np.logical_and(
+            np.logical_and(
+                self._year == dt.date.min.year, self._month == dt.date.min.month
+            ),
+            self._day == dt.date.min.day,
+        )
+
+    @classmethod
+    def _from_sequence(cls, scalars, *, dtype: Optional[Dtype] = None, copy=False):
+        if isinstance(scalars, dt.date):
+            pass
+        elif isinstance(scalars, DateArray):
+            pass
+        elif isinstance(scalars, np.ndarray):
+            scalars = scalars.astype("U10")  # 10 chars for yyyy-mm-dd
+        return DateArray(scalars)
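`DateArray` above stores each date as three parallel integer arrays (year/month/day) and uses `dt.date.min` as the NA sentinel, which `isna` checks component by component. A stdlib-only sketch of that representation (illustrative only, not part of the committed diff):

```python
import datetime as dt

# Parallel-sequence storage of dates, with dt.date.min (0001-01-01)
# standing in for a missing value, as in DateArray above.
dates = [dt.date(2021, 5, 17), dt.date.min, dt.date(1999, 12, 31)]
years = [d.year for d in dates]
months = [d.month for d in dates]
days = [d.day for d in dates]

# isna(): an element is missing only when all three components
# equal the corresponding component of dt.date.min.
isna = [
    (y, m, d) == (dt.date.min.year, dt.date.min.month, dt.date.min.day)
    for y, m, d in zip(years, months, days)
]
print(isna)  # [False, True, False]
```

The component-wise AND matters: a real date can share a single component with the sentinel (e.g. any January date matches on month) without being NA.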
videochat2/lib/python3.10/site-packages/pandas/tests/extension/json/__init__.py ADDED
@@ -0,0 +1,7 @@
+from pandas.tests.extension.json.array import (
+    JSONArray,
+    JSONDtype,
+    make_data,
+)
+
+__all__ = ["JSONArray", "JSONDtype", "make_data"]
videochat2/lib/python3.10/site-packages/pandas/tests/extension/json/__pycache__/array.cpython-310.pyc ADDED
Binary file (8.61 kB)
videochat2/lib/python3.10/site-packages/pandas/tests/extension/json/__pycache__/test_json.cpython-310.pyc ADDED
Binary file (14 kB)
videochat2/lib/python3.10/site-packages/pandas/tests/extension/json/array.py ADDED
@@ -0,0 +1,244 @@
+"""
+Test extension array for storing nested data in a pandas container.
+
+The JSONArray stores lists of dictionaries. The storage mechanism is a list,
+not an ndarray.
+
+Note
+----
+We currently store lists of UserDicts. Pandas has a few places
+internally that specifically check for dicts, and does non-scalar things
+in that case. We *want* the dictionaries to be treated as scalars, so we
+hack around pandas by using UserDicts.
+"""
+from __future__ import annotations
+
+from collections import (
+    UserDict,
+    abc,
+)
+import itertools
+import numbers
+import random
+import string
+import sys
+from typing import (
+    Any,
+    Mapping,
+)
+
+import numpy as np
+
+from pandas._typing import type_t
+
+from pandas.core.dtypes.cast import construct_1d_object_array_from_listlike
+from pandas.core.dtypes.common import (
+    is_bool_dtype,
+    is_list_like,
+    pandas_dtype,
+)
+
+import pandas as pd
+from pandas.api.extensions import (
+    ExtensionArray,
+    ExtensionDtype,
+)
+from pandas.core.indexers import unpack_tuple_and_ellipses
+
+
+class JSONDtype(ExtensionDtype):
+    type = abc.Mapping
+    name = "json"
+    na_value: Mapping[str, Any] = UserDict()
+
+    @classmethod
+    def construct_array_type(cls) -> type_t[JSONArray]:
+        """
+        Return the array type associated with this dtype.
+
+        Returns
+        -------
+        type
+        """
+        return JSONArray
+
+
+class JSONArray(ExtensionArray):
+    dtype = JSONDtype()
+    __array_priority__ = 1000
+
+    def __init__(self, values, dtype=None, copy=False) -> None:
+        for val in values:
+            if not isinstance(val, self.dtype.type):
+                raise TypeError("All values must be of type " + str(self.dtype.type))
+        self.data = values
+
+        # Some aliases for common attribute names to ensure pandas supports
+        # these
+        self._items = self._data = self.data
+        # those aliases are currently not working due to assumptions
+        # in internal code (GH-20735)
+        # self._values = self.values = self.data
+
+    @classmethod
+    def _from_sequence(cls, scalars, dtype=None, copy=False):
+        return cls(scalars)
+
+    @classmethod
+    def _from_factorized(cls, values, original):
+        return cls([UserDict(x) for x in values if x != ()])
+
+    def __getitem__(self, item):
+        if isinstance(item, tuple):
+            item = unpack_tuple_and_ellipses(item)
+
+        if isinstance(item, numbers.Integral):
+            return self.data[item]
+        elif isinstance(item, slice) and item == slice(None):
+            # Make sure we get a view
+            return type(self)(self.data)
+        elif isinstance(item, slice):
+            # slice
+            return type(self)(self.data[item])
+        elif not is_list_like(item):
+            # e.g. "foo" or 2.5
+            # exception message copied from numpy
+            raise IndexError(
+                r"only integers, slices (`:`), ellipsis (`...`), numpy.newaxis "
+                r"(`None`) and integer or boolean arrays are valid indices"
+            )
+        else:
+            item = pd.api.indexers.check_array_indexer(self, item)
+            if is_bool_dtype(item.dtype):
+                return self._from_sequence([x for x, m in zip(self, item) if m])
+            # integer
+            return type(self)([self.data[i] for i in item])
+
+    def __setitem__(self, key, value):
+        if isinstance(key, numbers.Integral):
+            self.data[key] = value
+        else:
+            if not isinstance(value, (type(self), abc.Sequence)):
+                # broadcast value
+                value = itertools.cycle([value])
+
+            if isinstance(key, np.ndarray) and key.dtype == "bool":
+                # masking
+                for i, (k, v) in enumerate(zip(key, value)):
+                    if k:
+                        assert isinstance(v, self.dtype.type)
+                        self.data[i] = v
+            else:
+                for k, v in zip(key, value):
+                    assert isinstance(v, self.dtype.type)
+                    self.data[k] = v
+
+    def __len__(self) -> int:
+        return len(self.data)
+
+    def __eq__(self, other):
+        return NotImplemented
+
+    def __ne__(self, other):
+        return NotImplemented
+
+    def __array__(self, dtype=None):
+        if dtype is None:
+            dtype = object
+        if dtype == object:
+            # on py38 builds it looks like numpy is inferring to a non-1D array
+            return construct_1d_object_array_from_listlike(list(self))
+        return np.asarray(self.data, dtype=dtype)
+
+    @property
+    def nbytes(self) -> int:
+        return sys.getsizeof(self.data)
+
+    def isna(self):
+        return np.array([x == self.dtype.na_value for x in self.data], dtype=bool)
+
+    def take(self, indexer, allow_fill=False, fill_value=None):
+        # re-implement here, since NumPy has trouble setting
+        # sized objects like UserDicts into scalar slots of
+        # an ndarray.
+        indexer = np.asarray(indexer)
+        msg = (
+            "Index is out of bounds or cannot do a "
+            "non-empty take from an empty array."
+        )
+
+        if allow_fill:
+            if fill_value is None:
+                fill_value = self.dtype.na_value
+            # bounds check
+            if (indexer < -1).any():
+                raise ValueError
+            try:
+                output = [
+                    self.data[loc] if loc != -1 else fill_value for loc in indexer
+                ]
+            except IndexError as err:
+                raise IndexError(msg) from err
+        else:
+            try:
+                output = [self.data[loc] for loc in indexer]
+            except IndexError as err:
+                raise IndexError(msg) from err
+
+        return self._from_sequence(output)
+
+    def copy(self):
+        return type(self)(self.data[:])
+
+    def astype(self, dtype, copy=True):
+        # NumPy has issues when all the dicts are the same length.
+        # np.array([UserDict(...), UserDict(...)]) fails,
+        # but np.array([{...}, {...}]) works, so cast.
+        from pandas.core.arrays.string_ import StringDtype
+
+        dtype = pandas_dtype(dtype)
+        # needed to add this check for the Series constructor
+        if isinstance(dtype, type(self.dtype)) and dtype == self.dtype:
+            if copy:
+                return self.copy()
+            return self
+        elif isinstance(dtype, StringDtype):
+            value = self.astype(str)  # numpy doesn't like nested dicts
+            return dtype.construct_array_type()._from_sequence(value, copy=False)
+
+        return np.array([dict(x) for x in self], dtype=dtype, copy=copy)
+
+    def unique(self):
+        # Parent method doesn't work since np.array will try to infer
+        # a 2-dim object.
+        return type(self)([dict(x) for x in {tuple(d.items()) for d in self.data}])
+
+    @classmethod
+    def _concat_same_type(cls, to_concat):
+        data = list(itertools.chain.from_iterable(x.data for x in to_concat))
+        return cls(data)
+
+    def _values_for_factorize(self):
+        frozen = self._values_for_argsort()
+        if len(frozen) == 0:
+            # factorize_array expects 1-d array, this is a len-0 2-d array.
+            frozen = frozen.ravel()
+        return frozen, ()
+
+    def _values_for_argsort(self):
+        # Bypass NumPy's shape inference to get a (N,) array of tuples.
+        frozen = [tuple(x.items()) for x in self]
+        return construct_1d_object_array_from_listlike(frozen)
+
+
+def make_data():
+    # TODO: Use a regular dict. See _NDFrameIndexer._setitem_with_indexer
+    return [
+        UserDict(
+            [
+                (random.choice(string.ascii_letters), random.randint(0, 100))
+                for _ in range(random.randint(0, 10))
+            ]
+        )
+        for _ in range(100)
+    ]
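`JSONArray.unique` above works around dicts being unhashable: each mapping is frozen into a tuple of its items, deduplicated through a set, then rebuilt as a dict. A minimal stdlib-only sketch of that trick (illustrative, not part of the committed diff):

```python
# Deduplicate unhashable dicts by freezing them into hashable item-tuples,
# as JSONArray.unique() does.
data = [{"a": 1}, {"b": 2}, {"a": 1}]
unique = [dict(t) for t in {tuple(d.items()) for d in data}]
print(len(unique))  # 2
```

Note this relies on items appearing in a consistent order; for multi-key dicts built in different insertion orders, sorting the items first (`tuple(sorted(d.items()))`) would be the safer freeze.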
videochat2/lib/python3.10/site-packages/pandas/tests/extension/json/test_json.py ADDED
@@ -0,0 +1,395 @@
+import collections
+import operator
+import sys
+
+import pytest
+
+import pandas as pd
+import pandas._testing as tm
+from pandas.tests.extension import base
+from pandas.tests.extension.json.array import (
+    JSONArray,
+    JSONDtype,
+    make_data,
+)
+
+
+@pytest.fixture
+def dtype():
+    return JSONDtype()
+
+
+@pytest.fixture
+def data():
+    """Length-100 JSONArray for semantics test."""
+    data = make_data()
+
+    # Why the while loop? NumPy is unable to construct an ndarray from
+    # equal-length ndarrays. Many of our operations involve coercing the
+    # EA to an ndarray of objects. To avoid random test failures, we ensure
+    # that our data is coercible to an ndarray. Several tests deal with only
+    # the first two elements, so that's what we'll check.
+
+    while len(data[0]) == len(data[1]):
+        data = make_data()
+
+    return JSONArray(data)
+
+
+@pytest.fixture
+def data_missing():
+    """Length 2 array with [NA, Valid]"""
+    return JSONArray([{}, {"a": 10}])
+
+
+@pytest.fixture
+def data_for_sorting():
+    return JSONArray([{"b": 1}, {"c": 4}, {"a": 2, "c": 3}])
+
+
+@pytest.fixture
+def data_missing_for_sorting():
+    return JSONArray([{"b": 1}, {}, {"a": 4}])
+
+
+@pytest.fixture
+def na_value(dtype):
+    return dtype.na_value
+
+
+@pytest.fixture
+def na_cmp():
+    return operator.eq
+
+
+@pytest.fixture
+def data_for_grouping():
+    return JSONArray(
+        [
+            {"b": 1},
+            {"b": 1},
+            {},
+            {},
+            {"a": 0, "c": 2},
+            {"a": 0, "c": 2},
+            {"b": 1},
+            {"c": 2},
+        ]
+    )
+
+
+class BaseJSON:
+    # NumPy doesn't handle an array of equal-length UserDicts.
+    # The default assert_series_equal eventually does a
+    # Series.values, which raises. We work around it by
+    # converting the UserDicts to dicts.
+    @classmethod
+    def assert_series_equal(cls, left, right, *args, **kwargs):
+        if left.dtype.name == "json":
+            assert left.dtype == right.dtype
+            left = pd.Series(
+                JSONArray(left.values.astype(object)), index=left.index, name=left.name
+            )
+            right = pd.Series(
+                JSONArray(right.values.astype(object)),
+                index=right.index,
+                name=right.name,
+            )
+        tm.assert_series_equal(left, right, *args, **kwargs)
+
+    @classmethod
+    def assert_frame_equal(cls, left, right, *args, **kwargs):
+        obj_type = kwargs.get("obj", "DataFrame")
+        tm.assert_index_equal(
+            left.columns,
+            right.columns,
+            exact=kwargs.get("check_column_type", "equiv"),
+            check_names=kwargs.get("check_names", True),
+            check_exact=kwargs.get("check_exact", False),
+            check_categorical=kwargs.get("check_categorical", True),
+            obj=f"{obj_type}.columns",
+        )
+
+        jsons = (left.dtypes == "json").index
+
+        for col in jsons:
+            cls.assert_series_equal(left[col], right[col], *args, **kwargs)
+
+        left = left.drop(columns=jsons)
+        right = right.drop(columns=jsons)
+        tm.assert_frame_equal(left, right, *args, **kwargs)
+
+
+class TestDtype(BaseJSON, base.BaseDtypeTests):
+    pass
+
+
+class TestInterface(BaseJSON, base.BaseInterfaceTests):
+    def test_custom_asserts(self):
+        # This would always trigger the KeyError from trying to put
+        # an array of equal-length UserDicts inside an ndarray.
+        data = JSONArray(
+            [
+                collections.UserDict({"a": 1}),
+                collections.UserDict({"b": 2}),
+                collections.UserDict({"c": 3}),
+            ]
+        )
+        a = pd.Series(data)
+        self.assert_series_equal(a, a)
+        self.assert_frame_equal(a.to_frame(), a.to_frame())
+
+        b = pd.Series(data.take([0, 0, 1]))
+        msg = r"Series are different"
+        with pytest.raises(AssertionError, match=msg):
+            self.assert_series_equal(a, b)
+
+        with pytest.raises(AssertionError, match=msg):
+            self.assert_frame_equal(a.to_frame(), b.to_frame())
+
+    @pytest.mark.xfail(
+        reason="comparison method not implemented for JSONArray (GH-37867)"
+    )
+    def test_contains(self, data):
+        # GH-37867
+        super().test_contains(data)
+
+
+class TestConstructors(BaseJSON, base.BaseConstructorsTests):
+    @pytest.mark.xfail(reason="not implemented constructor from dtype")
+    def test_from_dtype(self, data):
+        # construct from our dtype & string dtype
+        super().test_from_dtype(data)
+
+    @pytest.mark.xfail(reason="RecursionError, GH-33900")
+    def test_series_constructor_no_data_with_index(self, dtype, na_value):
+        # RecursionError: maximum recursion depth exceeded in comparison
+        rec_limit = sys.getrecursionlimit()
+        try:
+            # Limit to avoid stack overflow on Windows CI
+            sys.setrecursionlimit(100)
+            super().test_series_constructor_no_data_with_index(dtype, na_value)
+        finally:
+            sys.setrecursionlimit(rec_limit)
+
+    @pytest.mark.xfail(reason="RecursionError, GH-33900")
+    def test_series_constructor_scalar_na_with_index(self, dtype, na_value):
+        # RecursionError: maximum recursion depth exceeded in comparison
+        rec_limit = sys.getrecursionlimit()
+        try:
+            # Limit to avoid stack overflow on Windows CI
+            sys.setrecursionlimit(100)
+            super().test_series_constructor_scalar_na_with_index(dtype, na_value)
+        finally:
+            sys.setrecursionlimit(rec_limit)
+
+    @pytest.mark.xfail(reason="collection as scalar, GH-33901")
+    def test_series_constructor_scalar_with_index(self, data, dtype):
+        # TypeError: All values must be of type <class 'collections.abc.Mapping'>
+        rec_limit = sys.getrecursionlimit()
+        try:
+            # Limit to avoid stack overflow on Windows CI
+            sys.setrecursionlimit(100)
+            super().test_series_constructor_scalar_with_index(data, dtype)
+        finally:
+            sys.setrecursionlimit(rec_limit)
+
+
+class TestReshaping(BaseJSON, base.BaseReshapingTests):
+    @pytest.mark.xfail(reason="Different definitions of NA")
+    def test_stack(self):
+        """
+        The test does .astype(object).stack(). If we happen to have
+        any missing values in `data`, then we'll end up with different
+        rows since we consider `{}` NA, but `.astype(object)` doesn't.
+        """
+        super().test_stack()
+
+    @pytest.mark.xfail(reason="dict for NA")
+    def test_unstack(self, data, index):
+        # The base test has NaN for the expected NA value.
+        # this matches otherwise
+        return super().test_unstack(data, index)
+
+
+class TestGetitem(BaseJSON, base.BaseGetitemTests):
+    pass
+
+
+class TestIndex(BaseJSON, base.BaseIndexTests):
+    pass
+
+
+class TestMissing(BaseJSON, base.BaseMissingTests):
+    @pytest.mark.xfail(reason="Setting a dict as a scalar")
+    def test_fillna_series(self):
+        """We treat dictionaries as a mapping in fillna, not a scalar."""
+        super().test_fillna_series()
+
+    @pytest.mark.xfail(reason="Setting a dict as a scalar")
+    def test_fillna_frame(self):
+        """We treat dictionaries as a mapping in fillna, not a scalar."""
+        super().test_fillna_frame()
+
+
+unhashable = pytest.mark.xfail(reason="Unhashable")
+
+
+class TestReduce(base.BaseNoReduceTests):
+    pass
+
+
+class TestMethods(BaseJSON, base.BaseMethodsTests):
243
+ @unhashable
244
+ def test_value_counts(self, all_data, dropna):
245
+ super().test_value_counts(all_data, dropna)
246
+
247
+ @unhashable
248
+ def test_value_counts_with_normalize(self, data):
249
+ super().test_value_counts_with_normalize(data)
250
+
251
+ @unhashable
252
+ def test_sort_values_frame(self):
253
+ # TODO (EA.factorize): see if _values_for_factorize allows this.
254
+ super().test_sort_values_frame()
255
+
256
+ @pytest.mark.parametrize("ascending", [True, False])
257
+ def test_sort_values(self, data_for_sorting, ascending, sort_by_key):
258
+ super().test_sort_values(data_for_sorting, ascending, sort_by_key)
259
+
260
+ @pytest.mark.parametrize("ascending", [True, False])
261
+ def test_sort_values_missing(
262
+ self, data_missing_for_sorting, ascending, sort_by_key
263
+ ):
264
+ super().test_sort_values_missing(
265
+ data_missing_for_sorting, ascending, sort_by_key
266
+ )
267
+
268
+ @pytest.mark.xfail(reason="combine for JSONArray not supported")
269
+ def test_combine_le(self, data_repeated):
270
+ super().test_combine_le(data_repeated)
271
+
272
+ @pytest.mark.xfail(reason="combine for JSONArray not supported")
273
+ def test_combine_add(self, data_repeated):
274
+ super().test_combine_add(data_repeated)
275
+
276
+ @pytest.mark.xfail(
277
+ reason="combine for JSONArray not supported - "
278
+ "may pass depending on random data",
279
+ strict=False,
280
+ raises=AssertionError,
281
+ )
282
+ def test_combine_first(self, data):
283
+ super().test_combine_first(data)
284
+
285
+ @unhashable
286
+ def test_hash_pandas_object_works(self, data, kind):
287
+ super().test_hash_pandas_object_works(data, kind)
288
+
289
+ @pytest.mark.xfail(reason="broadcasting error")
290
+ def test_where_series(self, data, na_value):
291
+ # Fails with
292
+ # *** ValueError: operands could not be broadcast together
293
+ # with shapes (4,) (4,) (0,)
294
+ super().test_where_series(data, na_value)
295
+
296
+ @pytest.mark.xfail(reason="Can't compare dicts.")
297
+ def test_searchsorted(self, data_for_sorting):
298
+ super().test_searchsorted(data_for_sorting)
299
+
300
+ @pytest.mark.xfail(reason="Can't compare dicts.")
301
+ def test_equals(self, data, na_value, as_series):
302
+ super().test_equals(data, na_value, as_series)
303
+
304
+ @pytest.mark.skip("fill-value is interpreted as a dict of values")
305
+ def test_fillna_copy_frame(self, data_missing):
306
+ super().test_fillna_copy_frame(data_missing)
307
+
308
+
309
+ class TestCasting(BaseJSON, base.BaseCastingTests):
310
+ @pytest.mark.xfail(reason="failing on np.array(self, dtype=str)")
311
+ def test_astype_str(self):
312
+ """This currently fails in NumPy on np.array(self, dtype=str) with
313
+
314
+ *** ValueError: setting an array element with a sequence
315
+ """
316
+ super().test_astype_str()
317
+
318
+
319
+ # We intentionally don't run base.BaseSetitemTests because pandas'
320
+ # internals has trouble setting sequences of values into scalar positions.
321
+
322
+
323
+ class TestGroupby(BaseJSON, base.BaseGroupbyTests):
324
+ @unhashable
325
+ def test_groupby_extension_transform(self):
326
+ """
327
+ This currently fails in Series.name.setter, since the
328
+ name must be hashable, but the value is a dictionary.
329
+ I think this is what we want, i.e. `.name` should be the original
330
+ values, and not the values for factorization.
331
+ """
332
+ super().test_groupby_extension_transform()
333
+
334
+ @unhashable
335
+ def test_groupby_extension_apply(self):
336
+ """
337
+ This fails in Index._do_unique_check with
338
+
339
+ > hash(val)
340
+ E TypeError: unhashable type: 'UserDict' with
341
+
342
+ I suspect that once we support Index[ExtensionArray],
343
+ we'll be able to dispatch unique.
344
+ """
345
+ super().test_groupby_extension_apply()
346
+
347
+ @unhashable
348
+ def test_groupby_extension_agg(self):
349
+ """
350
+ This fails when we get to tm.assert_series_equal when left.index
351
+ contains dictionaries, which are not hashable.
352
+ """
353
+ super().test_groupby_extension_agg()
354
+
355
+ @unhashable
356
+ def test_groupby_extension_no_sort(self):
357
+ """
358
+ This fails when we get to tm.assert_series_equal when left.index
359
+ contains dictionaries, which are not hashable.
360
+ """
361
+ super().test_groupby_extension_no_sort()
362
+
363
+ @pytest.mark.xfail(reason="GH#39098: Converts agg result to object")
364
+ def test_groupby_agg_extension(self, data_for_grouping):
365
+ super().test_groupby_agg_extension(data_for_grouping)
366
+
367
+
368
+ class TestArithmeticOps(BaseJSON, base.BaseArithmeticOpsTests):
369
+ def test_arith_frame_with_scalar(self, data, all_arithmetic_operators, request):
370
+ if len(data[0]) != 1:
371
+ mark = pytest.mark.xfail(reason="raises in coercing to Series")
372
+ request.node.add_marker(mark)
373
+ super().test_arith_frame_with_scalar(data, all_arithmetic_operators)
374
+
375
+ def test_add_series_with_extension_array(self, data):
376
+ ser = pd.Series(data)
377
+ with pytest.raises(TypeError, match="unsupported"):
378
+ ser + data
379
+
380
+ @pytest.mark.xfail(reason="not implemented")
381
+ def test_divmod_series_array(self):
382
+ # GH 23287
383
+ # skipping because it is not implemented
384
+ super().test_divmod_series_array()
385
+
386
+ def _check_divmod_op(self, s, op, other, exc=NotImplementedError):
387
+ return super()._check_divmod_op(s, op, other, exc=TypeError)
388
+
389
+
390
+ class TestComparisonOps(BaseJSON, base.BaseComparisonOpsTests):
391
+ pass
392
+
393
+
394
+ class TestPrinting(BaseJSON, base.BasePrintingTests):
395
+ pass
videochat2/lib/python3.10/site-packages/pandas/tests/extension/test_arrow.py ADDED
@@ -0,0 +1,2749 @@
+"""
+This file contains a minimal set of tests for compliance with the extension
+array interface test suite, and should contain no other tests.
+The test suite for the full functionality of the array is located in
+`pandas/tests/arrays/`.
+The tests in this file are inherited from the BaseExtensionTests, and only
+minimal tweaks should be applied to get the tests passing (by overwriting a
+parent method).
+Additional tests should either be added to one of the BaseExtensionTests
+classes (if they are relevant for the extension interface for all dtypes), or
+be added to the array-specific tests in `pandas/tests/arrays/`.
+"""
+from datetime import (
+    date,
+    datetime,
+    time,
+    timedelta,
+)
+from decimal import Decimal
+from io import (
+    BytesIO,
+    StringIO,
+)
+import operator
+import pickle
+import re
+
+import numpy as np
+import pytest
+
+from pandas._libs import lib
+from pandas.compat import (
+    PY311,
+    is_ci_environment,
+    is_platform_windows,
+    pa_version_under7p0,
+    pa_version_under8p0,
+    pa_version_under9p0,
+    pa_version_under11p0,
+)
+from pandas.errors import PerformanceWarning
+
+from pandas.core.dtypes.common import is_any_int_dtype
+from pandas.core.dtypes.dtypes import CategoricalDtypeType
+
+import pandas as pd
+import pandas._testing as tm
+from pandas.api.extensions import no_default
+from pandas.api.types import (
+    is_bool_dtype,
+    is_float_dtype,
+    is_integer_dtype,
+    is_numeric_dtype,
+    is_signed_integer_dtype,
+    is_string_dtype,
+    is_unsigned_integer_dtype,
+)
+from pandas.tests.extension import base
+
+pa = pytest.importorskip("pyarrow", minversion="7.0.0")
+
+from pandas.core.arrays.arrow.array import ArrowExtensionArray
+
+from pandas.core.arrays.arrow.dtype import ArrowDtype  # isort:skip
+
+
+@pytest.fixture(params=tm.ALL_PYARROW_DTYPES, ids=str)
+def dtype(request):
+    return ArrowDtype(pyarrow_dtype=request.param)
+
+
+@pytest.fixture
+def data(dtype):
+    pa_dtype = dtype.pyarrow_dtype
+    if pa.types.is_boolean(pa_dtype):
+        data = [True, False] * 4 + [None] + [True, False] * 44 + [None] + [True, False]
+    elif pa.types.is_floating(pa_dtype):
+        data = [1.0, 0.0] * 4 + [None] + [-2.0, -1.0] * 44 + [None] + [0.5, 99.5]
+    elif pa.types.is_signed_integer(pa_dtype):
+        data = [1, 0] * 4 + [None] + [-2, -1] * 44 + [None] + [1, 99]
+    elif pa.types.is_unsigned_integer(pa_dtype):
+        data = [1, 0] * 4 + [None] + [2, 1] * 44 + [None] + [1, 99]
+    elif pa.types.is_decimal(pa_dtype):
+        data = (
+            [Decimal("1"), Decimal("0.0")] * 4
+            + [None]
+            + [Decimal("-2.0"), Decimal("-1.0")] * 44
+            + [None]
+            + [Decimal("0.5"), Decimal("33.123")]
+        )
+    elif pa.types.is_date(pa_dtype):
+        data = (
+            [date(2022, 1, 1), date(1999, 12, 31)] * 4
+            + [None]
+            + [date(2022, 1, 1), date(2022, 1, 1)] * 44
+            + [None]
+            + [date(1999, 12, 31), date(1999, 12, 31)]
+        )
+    elif pa.types.is_timestamp(pa_dtype):
+        data = (
+            [datetime(2020, 1, 1, 1, 1, 1, 1), datetime(1999, 1, 1, 1, 1, 1, 1)] * 4
+            + [None]
+            + [datetime(2020, 1, 1, 1), datetime(1999, 1, 1, 1)] * 44
+            + [None]
+            + [datetime(2020, 1, 1), datetime(1999, 1, 1)]
+        )
+    elif pa.types.is_duration(pa_dtype):
+        data = (
+            [timedelta(1), timedelta(1, 1)] * 4
+            + [None]
+            + [timedelta(-1), timedelta(0)] * 44
+            + [None]
+            + [timedelta(-10), timedelta(10)]
+        )
+    elif pa.types.is_time(pa_dtype):
+        data = (
+            [time(12, 0), time(0, 12)] * 4
+            + [None]
+            + [time(0, 0), time(1, 1)] * 44
+            + [None]
+            + [time(0, 5), time(5, 0)]
+        )
+    elif pa.types.is_string(pa_dtype):
+        data = ["a", "b"] * 4 + [None] + ["1", "2"] * 44 + [None] + ["!", ">"]
+    elif pa.types.is_binary(pa_dtype):
+        data = [b"a", b"b"] * 4 + [None] + [b"1", b"2"] * 44 + [None] + [b"!", b">"]
+    else:
+        raise NotImplementedError
+    return pd.array(data, dtype=dtype)
+
+
+@pytest.fixture
+def data_missing(data):
+    """Length-2 array with [NA, Valid]"""
+    return type(data)._from_sequence([None, data[0]], dtype=data.dtype)
+
+
+@pytest.fixture(params=["data", "data_missing"])
+def all_data(request, data, data_missing):
+    """Parametrized fixture returning 'data' or 'data_missing' integer arrays.
+
+    Used to test dtype conversion with and without missing values.
+    """
+    if request.param == "data":
+        return data
+    elif request.param == "data_missing":
+        return data_missing
+
+
+@pytest.fixture
+def data_for_grouping(dtype):
+    """
+    Data for factorization, grouping, and unique tests.
+
+    Expected to be like [B, B, NA, NA, A, A, B, C]
+
+    Where A < B < C and NA is missing
+    """
+    pa_dtype = dtype.pyarrow_dtype
+    if pa.types.is_boolean(pa_dtype):
+        A = False
+        B = True
+        C = True
+    elif pa.types.is_floating(pa_dtype):
+        A = -1.1
+        B = 0.0
+        C = 1.1
+    elif pa.types.is_signed_integer(pa_dtype):
+        A = -1
+        B = 0
+        C = 1
+    elif pa.types.is_unsigned_integer(pa_dtype):
+        A = 0
+        B = 1
+        C = 10
+    elif pa.types.is_date(pa_dtype):
+        A = date(1999, 12, 31)
+        B = date(2010, 1, 1)
+        C = date(2022, 1, 1)
+    elif pa.types.is_timestamp(pa_dtype):
+        A = datetime(1999, 1, 1, 1, 1, 1, 1)
+        B = datetime(2020, 1, 1)
+        C = datetime(2020, 1, 1, 1)
+    elif pa.types.is_duration(pa_dtype):
+        A = timedelta(-1)
+        B = timedelta(0)
+        C = timedelta(1, 4)
+    elif pa.types.is_time(pa_dtype):
+        A = time(0, 0)
+        B = time(0, 12)
+        C = time(12, 12)
+    elif pa.types.is_string(pa_dtype):
+        A = "a"
+        B = "b"
+        C = "c"
+    elif pa.types.is_binary(pa_dtype):
+        A = b"a"
+        B = b"b"
+        C = b"c"
+    elif pa.types.is_decimal(pa_dtype):
+        A = Decimal("-1.1")
+        B = Decimal("0.0")
+        C = Decimal("1.1")
+    else:
+        raise NotImplementedError
+    return pd.array([B, B, None, None, A, A, B, C], dtype=dtype)
+
+
+@pytest.fixture
+def data_for_sorting(data_for_grouping):
+    """
+    Length-3 array with a known sort order.
+
+    This should be three items [B, C, A] with
+    A < B < C
+    """
+    return type(data_for_grouping)._from_sequence(
+        [data_for_grouping[0], data_for_grouping[7], data_for_grouping[4]],
+        dtype=data_for_grouping.dtype,
+    )
+
+
+@pytest.fixture
+def data_missing_for_sorting(data_for_grouping):
+    """
+    Length-3 array with a known sort order.
+
+    This should be three items [B, NA, A] with
+    A < B and NA missing.
+    """
+    return type(data_for_grouping)._from_sequence(
+        [data_for_grouping[0], data_for_grouping[2], data_for_grouping[4]],
+        dtype=data_for_grouping.dtype,
+    )
+
+
+@pytest.fixture
+def data_for_twos(data):
+    """Length-100 array in which all the elements are two."""
+    pa_dtype = data.dtype.pyarrow_dtype
+    if pa.types.is_integer(pa_dtype) or pa.types.is_floating(pa_dtype):
+        return pd.array([2] * 100, dtype=data.dtype)
+    # tests will be xfailed where 2 is not a valid scalar for pa_dtype
+    return data
+
+
+@pytest.fixture
+def na_value():
+    """The scalar missing value for this type. Default 'None'"""
+    return pd.NA
+
+
+class TestBaseCasting(base.BaseCastingTests):
+    def test_astype_str(self, data, request):
+        pa_dtype = data.dtype.pyarrow_dtype
+        if pa.types.is_binary(pa_dtype):
+            request.node.add_marker(
+                pytest.mark.xfail(
+                    reason=f"For {pa_dtype} .astype(str) decodes.",
+                )
+            )
+        super().test_astype_str(data)
+
+
+class TestConstructors(base.BaseConstructorsTests):
+    def test_from_dtype(self, data, request):
+        pa_dtype = data.dtype.pyarrow_dtype
+        if pa.types.is_string(pa_dtype) or pa.types.is_decimal(pa_dtype):
+            if pa.types.is_string(pa_dtype):
+                reason = "ArrowDtype(pa.string()) != StringDtype('pyarrow')"
+            else:
+                reason = f"pyarrow.type_for_alias cannot infer {pa_dtype}"
+
+            request.node.add_marker(
+                pytest.mark.xfail(
+                    reason=reason,
+                )
+            )
+        super().test_from_dtype(data)
+
+    def test_from_sequence_pa_array(self, data):
+        # https://github.com/pandas-dev/pandas/pull/47034#discussion_r955500784
+        # data._data = pa.ChunkedArray
+        result = type(data)._from_sequence(data._data)
+        tm.assert_extension_array_equal(result, data)
+        assert isinstance(result._data, pa.ChunkedArray)
+
+        result = type(data)._from_sequence(data._data.combine_chunks())
+        tm.assert_extension_array_equal(result, data)
+        assert isinstance(result._data, pa.ChunkedArray)
+
+    def test_from_sequence_pa_array_notimplemented(self, request):
+        with pytest.raises(NotImplementedError, match="Converting strings to"):
+            ArrowExtensionArray._from_sequence_of_strings(
+                ["12-1"], dtype=pa.month_day_nano_interval()
+            )
+
+    def test_from_sequence_of_strings_pa_array(self, data, request):
+        pa_dtype = data.dtype.pyarrow_dtype
+        if pa.types.is_time64(pa_dtype) and pa_dtype.equals("time64[ns]") and not PY311:
+            request.node.add_marker(
+                pytest.mark.xfail(
+                    reason="Nanosecond time parsing not supported.",
+                )
+            )
+        elif pa_version_under11p0 and (
+            pa.types.is_duration(pa_dtype) or pa.types.is_decimal(pa_dtype)
+        ):
+            request.node.add_marker(
+                pytest.mark.xfail(
+                    raises=pa.ArrowNotImplementedError,
+                    reason=f"pyarrow doesn't support parsing {pa_dtype}",
+                )
+            )
+        elif pa.types.is_timestamp(pa_dtype) and pa_dtype.tz is not None:
+            if is_platform_windows() and is_ci_environment():
+                request.node.add_marker(
+                    pytest.mark.xfail(
+                        raises=pa.ArrowInvalid,
+                        reason=(
+                            "TODO: Set ARROW_TIMEZONE_DATABASE environment variable "
+                            "on CI to path to the tzdata for pyarrow."
+                        ),
+                    )
+                )
+        pa_array = data._data.cast(pa.string())
+        result = type(data)._from_sequence_of_strings(pa_array, dtype=data.dtype)
+        tm.assert_extension_array_equal(result, data)
+
+        pa_array = pa_array.combine_chunks()
+        result = type(data)._from_sequence_of_strings(pa_array, dtype=data.dtype)
+        tm.assert_extension_array_equal(result, data)
+
+
+class TestGetitemTests(base.BaseGetitemTests):
+    pass
+
+
+class TestBaseAccumulateTests(base.BaseAccumulateTests):
+    def check_accumulate(self, ser, op_name, skipna):
+        result = getattr(ser, op_name)(skipna=skipna)
+
+        if ser.dtype.kind == "m":
+            # Just check that we match the integer behavior.
+            ser = ser.astype("int64[pyarrow]")
+            result = result.astype("int64[pyarrow]")
+
+        result = result.astype("Float64")
+        expected = getattr(ser.astype("Float64"), op_name)(skipna=skipna)
+        self.assert_series_equal(result, expected, check_dtype=False)
+
+    @pytest.mark.parametrize("skipna", [True, False])
+    def test_accumulate_series_raises(self, data, all_numeric_accumulations, skipna):
+        pa_type = data.dtype.pyarrow_dtype
+        if (
+            (
+                pa.types.is_integer(pa_type)
+                or pa.types.is_floating(pa_type)
+                or pa.types.is_duration(pa_type)
+            )
+            and all_numeric_accumulations == "cumsum"
+            and not pa_version_under9p0
+        ):
+            pytest.skip("These work, are tested by test_accumulate_series.")
+
+        op_name = all_numeric_accumulations
+        ser = pd.Series(data)
+
+        with pytest.raises(NotImplementedError):
+            getattr(ser, op_name)(skipna=skipna)
+
+    @pytest.mark.parametrize("skipna", [True, False])
+    def test_accumulate_series(self, data, all_numeric_accumulations, skipna, request):
+        pa_type = data.dtype.pyarrow_dtype
+        op_name = all_numeric_accumulations
+        ser = pd.Series(data)
+
+        do_skip = False
+        if pa.types.is_string(pa_type) or pa.types.is_binary(pa_type):
+            if op_name in ["cumsum", "cumprod"]:
+                do_skip = True
+        elif pa.types.is_temporal(pa_type) and not pa.types.is_duration(pa_type):
+            if op_name in ["cumsum", "cumprod"]:
+                do_skip = True
+        elif pa.types.is_duration(pa_type):
+            if op_name == "cumprod":
+                do_skip = True
+
+        if do_skip:
+            pytest.skip(
+                "These should *not* work, we test in test_accumulate_series_raises "
+                "that these correctly raise."
+            )
+
+        if all_numeric_accumulations != "cumsum" or pa_version_under9p0:
+            if request.config.option.skip_slow:
+                # equivalent to marking these cases with @pytest.mark.slow,
+                # these xfails take a long time to run because pytest
+                # renders the exception messages even when not showing them
+                pytest.skip("pyarrow xfail slow")
+
+            request.node.add_marker(
+                pytest.mark.xfail(
+                    reason=f"{all_numeric_accumulations} not implemented",
+                    raises=NotImplementedError,
+                )
+            )
+        elif all_numeric_accumulations == "cumsum" and (
+            pa.types.is_boolean(pa_type) or pa.types.is_decimal(pa_type)
+        ):
+            request.node.add_marker(
+                pytest.mark.xfail(
+                    reason=f"{all_numeric_accumulations} not implemented for {pa_type}",
+                    raises=NotImplementedError,
+                )
+            )
+
+        self.check_accumulate(ser, op_name, skipna)
+
+
+class TestBaseNumericReduce(base.BaseNumericReduceTests):
+    def check_reduce(self, ser, op_name, skipna):
+        pa_dtype = ser.dtype.pyarrow_dtype
+        if op_name == "count":
+            result = getattr(ser, op_name)()
+        else:
+            result = getattr(ser, op_name)(skipna=skipna)
+        if pa.types.is_boolean(pa_dtype):
+            # Can't convert if ser contains NA
+            pytest.skip(
+                "pandas boolean data with NA does not fully support all reductions"
+            )
+        elif pa.types.is_integer(pa_dtype) or pa.types.is_floating(pa_dtype):
+            ser = ser.astype("Float64")
+        if op_name == "count":
+            expected = getattr(ser, op_name)()
+        else:
+            expected = getattr(ser, op_name)(skipna=skipna)
+        tm.assert_almost_equal(result, expected)
+
+    @pytest.mark.parametrize("skipna", [True, False])
+    def test_reduce_series(self, data, all_numeric_reductions, skipna, request):
+        pa_dtype = data.dtype.pyarrow_dtype
+        opname = all_numeric_reductions
+
+        ser = pd.Series(data)
+
+        should_work = True
+        if pa.types.is_temporal(pa_dtype) and opname in [
+            "sum",
+            "var",
+            "skew",
+            "kurt",
+            "prod",
+        ]:
+            if pa.types.is_duration(pa_dtype) and opname in ["sum"]:
+                # summing timedeltas is one case that *is* well-defined
+                pass
+            else:
+                should_work = False
+        elif (
+            pa.types.is_string(pa_dtype) or pa.types.is_binary(pa_dtype)
+        ) and opname in [
+            "sum",
+            "mean",
+            "median",
+            "prod",
+            "std",
+            "sem",
+            "var",
+            "skew",
+            "kurt",
+        ]:
+            should_work = False
+
+        if not should_work:
+            # matching the non-pyarrow versions, these operations *should* not
+            # work for these dtypes
+            msg = f"does not support reduction '{opname}'"
+            with pytest.raises(TypeError, match=msg):
+                getattr(ser, opname)(skipna=skipna)
+
+            return
+
+        xfail_mark = pytest.mark.xfail(
+            raises=TypeError,
+            reason=(
+                f"{all_numeric_reductions} is not implemented in "
+                f"pyarrow={pa.__version__} for {pa_dtype}"
+            ),
+        )
+        if all_numeric_reductions in {"skew", "kurt"}:
+            request.node.add_marker(xfail_mark)
+        elif (
+            all_numeric_reductions in {"var", "std", "median"}
+            and pa_version_under7p0
+            and pa.types.is_decimal(pa_dtype)
+        ):
+            request.node.add_marker(xfail_mark)
+        elif all_numeric_reductions == "sem" and pa_version_under8p0:
+            request.node.add_marker(xfail_mark)
+
+        elif pa.types.is_boolean(pa_dtype) and all_numeric_reductions in {
+            "sem",
+            "std",
+            "var",
+            "median",
+        }:
+            request.node.add_marker(xfail_mark)
+        super().test_reduce_series(data, all_numeric_reductions, skipna)
+
+    @pytest.mark.parametrize("typ", ["int64", "uint64", "float64"])
+    def test_median_not_approximate(self, typ):
+        # GH 52679
+        result = pd.Series([1, 2], dtype=f"{typ}[pyarrow]").median()
+        assert result == 1.5
+
+
+class TestBaseBooleanReduce(base.BaseBooleanReduceTests):
+    @pytest.mark.parametrize("skipna", [True, False])
+    def test_reduce_series(
+        self, data, all_boolean_reductions, skipna, na_value, request
+    ):
+        pa_dtype = data.dtype.pyarrow_dtype
+        xfail_mark = pytest.mark.xfail(
+            raises=TypeError,
+            reason=(
+                f"{all_boolean_reductions} is not implemented in "
+                f"pyarrow={pa.__version__} for {pa_dtype}"
+            ),
+        )
+        if pa.types.is_string(pa_dtype) or pa.types.is_binary(pa_dtype):
+            # We *might* want to make this behave like the non-pyarrow cases,
+            # but have not yet decided.
+            request.node.add_marker(xfail_mark)
+
+        op_name = all_boolean_reductions
+        ser = pd.Series(data)
+
+        if pa.types.is_temporal(pa_dtype) and not pa.types.is_duration(pa_dtype):
+            # xref GH#34479 we support this in our non-pyarrow datetime64 dtypes,
+            # but it isn't obvious we _should_. For now, we keep the pyarrow
+            # behavior which does not support this.
+
+            with pytest.raises(TypeError, match="does not support reduction"):
+                getattr(ser, op_name)(skipna=skipna)
+
+            return
+
+        result = getattr(ser, op_name)(skipna=skipna)
+        assert result is (op_name == "any")
+
+
+class TestBaseGroupby(base.BaseGroupbyTests):
+    def test_groupby_extension_no_sort(self, data_for_grouping, request):
+        pa_dtype = data_for_grouping.dtype.pyarrow_dtype
+        if pa.types.is_boolean(pa_dtype):
+            request.node.add_marker(
+                pytest.mark.xfail(
+                    reason=f"{pa_dtype} only has 2 unique possible values",
+                )
+            )
+        super().test_groupby_extension_no_sort(data_for_grouping)
+
+    def test_groupby_extension_transform(self, data_for_grouping, request):
+        pa_dtype = data_for_grouping.dtype.pyarrow_dtype
+        if pa.types.is_boolean(pa_dtype):
+            request.node.add_marker(
+                pytest.mark.xfail(
+                    reason=f"{pa_dtype} only has 2 unique possible values",
+                )
+            )
+        super().test_groupby_extension_transform(data_for_grouping)
+
+    @pytest.mark.parametrize("as_index", [True, False])
+    def test_groupby_extension_agg(self, as_index, data_for_grouping, request):
+        pa_dtype = data_for_grouping.dtype.pyarrow_dtype
+        if pa.types.is_boolean(pa_dtype):
+            request.node.add_marker(
+                pytest.mark.xfail(
+                    raises=ValueError,
+                    reason=f"{pa_dtype} only has 2 unique possible values",
+                )
+            )
+        super().test_groupby_extension_agg(as_index, data_for_grouping)
+
+    def test_in_numeric_groupby(self, data_for_grouping):
+        if is_string_dtype(data_for_grouping.dtype):
+            df = pd.DataFrame(
+                {
+                    "A": [1, 1, 2, 2, 3, 3, 1, 4],
+                    "B": data_for_grouping,
+                    "C": [1, 1, 1, 1, 1, 1, 1, 1],
+                }
+            )
+
+            expected = pd.Index(["C"])
+            with pytest.raises(TypeError, match="does not support"):
+                df.groupby("A").sum().columns
+            result = df.groupby("A").sum(numeric_only=True).columns
+            tm.assert_index_equal(result, expected)
+        else:
+            super().test_in_numeric_groupby(data_for_grouping)
+
+
+class TestBaseDtype(base.BaseDtypeTests):
+    def test_check_dtype(self, data, request):
+        pa_dtype = data.dtype.pyarrow_dtype
+        if pa.types.is_decimal(pa_dtype) and pa_version_under8p0:
+            request.node.add_marker(
+                pytest.mark.xfail(
+                    raises=ValueError,
+                    reason="decimal string repr affects numpy comparison",
+                )
+            )
+        super().test_check_dtype(data)
+
+    def test_construct_from_string_own_name(self, dtype, request):
+        pa_dtype = dtype.pyarrow_dtype
+        if pa.types.is_decimal(pa_dtype):
+            request.node.add_marker(
+                pytest.mark.xfail(
+                    raises=NotImplementedError,
+                    reason=f"pyarrow.type_for_alias cannot infer {pa_dtype}",
+                )
+            )
+
+        if pa.types.is_string(pa_dtype):
+            # We still support StringDtype('pyarrow') over ArrowDtype(pa.string())
+            msg = r"string\[pyarrow\] should be constructed by StringDtype"
+            with pytest.raises(TypeError, match=msg):
632
+ dtype.construct_from_string(dtype.name)
633
+
634
+ return
635
+
636
+ super().test_construct_from_string_own_name(dtype)
637
+
638
+ def test_is_dtype_from_name(self, dtype, request):
639
+ pa_dtype = dtype.pyarrow_dtype
640
+ if pa.types.is_string(pa_dtype):
641
+ # We still support StringDtype('pyarrow') over ArrowDtype(pa.string())
642
+ assert not type(dtype).is_dtype(dtype.name)
643
+ else:
644
+ if pa.types.is_decimal(pa_dtype):
645
+ request.node.add_marker(
646
+ pytest.mark.xfail(
647
+ raises=NotImplementedError,
648
+ reason=f"pyarrow.type_for_alias cannot infer {pa_dtype}",
649
+ )
650
+ )
651
+ super().test_is_dtype_from_name(dtype)
652
+
653
+ def test_construct_from_string_another_type_raises(self, dtype):
654
+ msg = r"'another_type' must end with '\[pyarrow\]'"
655
+ with pytest.raises(TypeError, match=msg):
656
+ type(dtype).construct_from_string("another_type")
657
+
658
+ def test_get_common_dtype(self, dtype, request):
659
+ pa_dtype = dtype.pyarrow_dtype
660
+ if (
661
+ pa.types.is_date(pa_dtype)
662
+ or pa.types.is_time(pa_dtype)
663
+ or (
664
+ pa.types.is_timestamp(pa_dtype)
665
+ and (pa_dtype.unit != "ns" or pa_dtype.tz is not None)
666
+ )
667
+ or (pa.types.is_duration(pa_dtype) and pa_dtype.unit != "ns")
668
+ or pa.types.is_binary(pa_dtype)
669
+ or pa.types.is_decimal(pa_dtype)
670
+ ):
671
+ request.node.add_marker(
672
+ pytest.mark.xfail(
673
+ reason=(
674
+ f"{pa_dtype} does not have associated numpy "
675
+ f"dtype findable by find_common_type"
676
+ )
677
+ )
678
+ )
679
+ super().test_get_common_dtype(dtype)
680
+
681
+ def test_is_not_string_type(self, dtype):
682
+ pa_dtype = dtype.pyarrow_dtype
683
+ if pa.types.is_string(pa_dtype):
684
+ assert is_string_dtype(dtype)
685
+ else:
686
+ super().test_is_not_string_type(dtype)
687
+
688
+
689
+ class TestBaseIndex(base.BaseIndexTests):
690
+ pass
691
+
692
+
693
+ class TestBaseInterface(base.BaseInterfaceTests):
694
+ @pytest.mark.xfail(
695
+ reason="GH 45419: pyarrow.ChunkedArray does not support views.", run=False
696
+ )
697
+ def test_view(self, data):
698
+ super().test_view(data)
699
+
700
+
701
+ class TestBaseMissing(base.BaseMissingTests):
702
+ def test_fillna_no_op_returns_copy(self, data):
703
+ data = data[~data.isna()]
704
+
705
+ valid = data[0]
706
+ result = data.fillna(valid)
707
+ assert result is not data
708
+ self.assert_extension_array_equal(result, data)
709
+ with tm.assert_produces_warning(PerformanceWarning):
710
+ result = data.fillna(method="backfill")
711
+ assert result is not data
712
+ self.assert_extension_array_equal(result, data)
713
+
714
+ def test_fillna_series_method(self, data_missing, fillna_method):
715
+ with tm.maybe_produces_warning(
716
+ PerformanceWarning, fillna_method is not None, check_stacklevel=False
717
+ ):
718
+ super().test_fillna_series_method(data_missing, fillna_method)
719
+
720
+
721
+ class TestBasePrinting(base.BasePrintingTests):
722
+ pass
723
+
724
+
725
+ class TestBaseReshaping(base.BaseReshapingTests):
726
+ @pytest.mark.xfail(
727
+ reason="GH 45419: pyarrow.ChunkedArray does not support views", run=False
728
+ )
729
+ def test_transpose(self, data):
730
+ super().test_transpose(data)
731
+
732
+
733
+ class TestBaseSetitem(base.BaseSetitemTests):
734
+ @pytest.mark.xfail(
735
+ reason="GH 45419: pyarrow.ChunkedArray does not support views", run=False
736
+ )
737
+ def test_setitem_preserves_views(self, data):
738
+ super().test_setitem_preserves_views(data)
739
+
740
+
741
+ class TestBaseParsing(base.BaseParsingTests):
742
+ @pytest.mark.parametrize("dtype_backend", ["pyarrow", no_default])
743
+ @pytest.mark.parametrize("engine", ["c", "python"])
744
+ def test_EA_types(self, engine, data, dtype_backend, request):
745
+ pa_dtype = data.dtype.pyarrow_dtype
746
+ if pa.types.is_decimal(pa_dtype):
747
+ request.node.add_marker(
748
+ pytest.mark.xfail(
749
+ raises=NotImplementedError,
750
+ reason=f"Parameterized types {pa_dtype} not supported.",
751
+ )
752
+ )
753
+ elif pa.types.is_timestamp(pa_dtype) and pa_dtype.unit in ("us", "ns"):
754
+ request.node.add_marker(
755
+ pytest.mark.xfail(
756
+ raises=ValueError,
757
+ reason="https://github.com/pandas-dev/pandas/issues/49767",
758
+ )
759
+ )
760
+ elif pa.types.is_binary(pa_dtype):
761
+ request.node.add_marker(
762
+ pytest.mark.xfail(reason="CSV parsers don't correctly handle binary")
763
+ )
764
+ elif (
765
+ pa.types.is_duration(pa_dtype)
766
+ and dtype_backend == "pyarrow"
767
+ and engine == "python"
768
+ ):
769
+ request.node.add_marker(
770
+ pytest.mark.xfail(
771
+ raises=TypeError,
772
+ reason="Invalid type for timedelta scalar: NAType",
773
+ )
774
+ )
775
+ df = pd.DataFrame({"with_dtype": pd.Series(data, dtype=str(data.dtype))})
776
+ csv_output = df.to_csv(index=False, na_rep=np.nan)
777
+ if pa.types.is_binary(pa_dtype):
778
+ csv_output = BytesIO(csv_output)
779
+ else:
780
+ csv_output = StringIO(csv_output)
781
+ result = pd.read_csv(
782
+ csv_output,
783
+ dtype={"with_dtype": str(data.dtype)},
784
+ engine=engine,
785
+ dtype_backend=dtype_backend,
786
+ )
787
+ expected = df
788
+ self.assert_frame_equal(result, expected)
789
+
790
+
791
+ class TestBaseUnaryOps(base.BaseUnaryOpsTests):
792
+ def test_invert(self, data, request):
793
+ pa_dtype = data.dtype.pyarrow_dtype
794
+ if not pa.types.is_boolean(pa_dtype):
795
+ request.node.add_marker(
796
+ pytest.mark.xfail(
797
+ raises=pa.ArrowNotImplementedError,
798
+                    reason=f"pyarrow.compute.invert does not support {pa_dtype}",
+                )
+            )
+        super().test_invert(data)
+
+
+class TestBaseMethods(base.BaseMethodsTests):
+    @pytest.mark.parametrize("periods", [1, -2])
+    def test_diff(self, data, periods, request):
+        pa_dtype = data.dtype.pyarrow_dtype
+        if pa.types.is_unsigned_integer(pa_dtype) and periods == 1:
+            request.node.add_marker(
+                pytest.mark.xfail(
+                    raises=pa.ArrowInvalid,
+                    reason=(
+                        f"diff with {pa_dtype} and periods={periods} will overflow"
+                    ),
+                )
+            )
+        super().test_diff(data, periods)
+
+    def test_value_counts_returns_pyarrow_int64(self, data):
+        # GH 51462
+        data = data[:10]
+        result = data.value_counts()
+        assert result.dtype == ArrowDtype(pa.int64())
+
+    def test_value_counts_with_normalize(self, data, request):
+        data = data[:10].unique()
+        values = np.array(data[~data.isna()])
+        ser = pd.Series(data, dtype=data.dtype)
+
+        result = ser.value_counts(normalize=True).sort_index()
+
+        expected = pd.Series(
+            [1 / len(values)] * len(values), index=result.index, name="proportion"
+        )
+        expected = expected.astype("double[pyarrow]")
+
+        self.assert_series_equal(result, expected)
+
+    def test_argmin_argmax(
+        self, data_for_sorting, data_missing_for_sorting, na_value, request
+    ):
+        pa_dtype = data_for_sorting.dtype.pyarrow_dtype
+        if pa.types.is_boolean(pa_dtype):
+            request.node.add_marker(
+                pytest.mark.xfail(
+                    reason=f"{pa_dtype} only has 2 unique possible values",
+                )
+            )
+        elif pa.types.is_decimal(pa_dtype) and pa_version_under7p0:
+            request.node.add_marker(
+                pytest.mark.xfail(
+                    reason=f"No pyarrow kernel for {pa_dtype}",
+                    raises=pa.ArrowNotImplementedError,
+                )
+            )
+        super().test_argmin_argmax(data_for_sorting, data_missing_for_sorting, na_value)
+
+    @pytest.mark.parametrize(
+        "op_name, skipna, expected",
+        [
+            ("idxmax", True, 0),
+            ("idxmin", True, 2),
+            ("argmax", True, 0),
+            ("argmin", True, 2),
+            ("idxmax", False, np.nan),
+            ("idxmin", False, np.nan),
+            ("argmax", False, -1),
+            ("argmin", False, -1),
+        ],
+    )
+    def test_argreduce_series(
+        self, data_missing_for_sorting, op_name, skipna, expected, request
+    ):
+        pa_dtype = data_missing_for_sorting.dtype.pyarrow_dtype
+        if pa.types.is_decimal(pa_dtype) and pa_version_under7p0 and skipna:
+            request.node.add_marker(
+                pytest.mark.xfail(
+                    reason=f"No pyarrow kernel for {pa_dtype}",
+                    raises=pa.ArrowNotImplementedError,
+                )
+            )
+        super().test_argreduce_series(
+            data_missing_for_sorting, op_name, skipna, expected
+        )
+
+    def test_factorize(self, data_for_grouping, request):
+        pa_dtype = data_for_grouping.dtype.pyarrow_dtype
+        if pa.types.is_boolean(pa_dtype):
+            request.node.add_marker(
+                pytest.mark.xfail(
+                    reason=f"{pa_dtype} only has 2 unique possible values",
+                )
+            )
+        super().test_factorize(data_for_grouping)
+
+    _combine_le_expected_dtype = "bool[pyarrow]"
+
+    def test_combine_add(self, data_repeated, request):
+        pa_dtype = next(data_repeated(1)).dtype.pyarrow_dtype
+        if pa.types.is_duration(pa_dtype):
+            # TODO: this fails on the scalar addition constructing 'expected'
+            # but not in the actual 'combine' call, so may be salvage-able
+            mark = pytest.mark.xfail(
+                raises=TypeError,
+                reason=f"{pa_dtype} cannot be added to {pa_dtype}",
+            )
+            request.node.add_marker(mark)
+            super().test_combine_add(data_repeated)
+
+        elif pa.types.is_temporal(pa_dtype):
+            # analogous to datetime64, these cannot be added
+            orig_data1, orig_data2 = data_repeated(2)
+            s1 = pd.Series(orig_data1)
+            s2 = pd.Series(orig_data2)
+            with pytest.raises(TypeError):
+                s1.combine(s2, lambda x1, x2: x1 + x2)
+
+        else:
+            super().test_combine_add(data_repeated)
+
+    def test_searchsorted(self, data_for_sorting, as_series, request):
+        pa_dtype = data_for_sorting.dtype.pyarrow_dtype
+        if pa.types.is_boolean(pa_dtype):
+            request.node.add_marker(
+                pytest.mark.xfail(
+                    reason=f"{pa_dtype} only has 2 unique possible values",
+                )
+            )
+        super().test_searchsorted(data_for_sorting, as_series)
+
+    def test_basic_equals(self, data):
+        # https://github.com/pandas-dev/pandas/issues/34660
+        assert pd.Series(data).equals(pd.Series(data))
+
+
+class TestBaseArithmeticOps(base.BaseArithmeticOpsTests):
+    divmod_exc = NotImplementedError
+
+    @classmethod
+    def assert_equal(cls, left, right, **kwargs):
+        if isinstance(left, pd.DataFrame):
+            left_pa_type = left.iloc[:, 0].dtype.pyarrow_dtype
+            right_pa_type = right.iloc[:, 0].dtype.pyarrow_dtype
+        else:
+            left_pa_type = left.dtype.pyarrow_dtype
+            right_pa_type = right.dtype.pyarrow_dtype
+        if pa.types.is_decimal(left_pa_type) or pa.types.is_decimal(right_pa_type):
+            # decimal precision can resize in the result type depending on data
+            # just compare the float values
+            left = left.astype("float[pyarrow]")
+            right = right.astype("float[pyarrow]")
+        tm.assert_equal(left, right, **kwargs)
+
+    def get_op_from_name(self, op_name):
+        short_opname = op_name.strip("_")
+        if short_opname == "rtruediv":
+            # use the numpy version that won't raise on division by zero
+            return lambda x, y: np.divide(y, x)
+        elif short_opname == "rfloordiv":
+            return lambda x, y: np.floor_divide(y, x)
+
+        return tm.get_op_from_name(op_name)
+
+    def _patch_combine(self, obj, other, op):
+        # BaseOpsUtil._combine can upcast expected dtype
+        # (because it generates expected on python scalars)
+        # while ArrowExtensionArray maintains original type
+        expected = base.BaseArithmeticOpsTests._combine(self, obj, other, op)
+        was_frame = False
+        if isinstance(expected, pd.DataFrame):
+            was_frame = True
+            expected_data = expected.iloc[:, 0]
+            original_dtype = obj.iloc[:, 0].dtype
+        else:
+            expected_data = expected
+            original_dtype = obj.dtype
+
+        pa_expected = pa.array(expected_data._values)
+
+        if pa.types.is_duration(pa_expected.type):
+            # pyarrow sees sequence of datetime/timedelta objects and defaults
+            # to "us" but the non-pointwise op retains unit
+            unit = original_dtype.pyarrow_dtype.unit
+            if type(other) in [datetime, timedelta] and unit in ["s", "ms"]:
+                # pydatetime/pytimedelta objects have microsecond reso, so we
+                # take the higher reso of the original and microsecond. Note
+                # this matches what we would do with DatetimeArray/TimedeltaArray
+                unit = "us"
+            pa_expected = pa_expected.cast(f"duration[{unit}]")
+        else:
+            pa_expected = pa_expected.cast(original_dtype.pyarrow_dtype)
+
+        pd_expected = type(expected_data._values)(pa_expected)
+        if was_frame:
+            expected = pd.DataFrame(
+                pd_expected, index=expected.index, columns=expected.columns
+            )
+        else:
+            expected = pd.Series(pd_expected)
+        return expected
+
+    def _is_temporal_supported(self, opname, pa_dtype):
+        return not pa_version_under8p0 and (
+            opname in ("__add__", "__radd__")
+            and pa.types.is_duration(pa_dtype)
+            or opname in ("__sub__", "__rsub__")
+            and pa.types.is_temporal(pa_dtype)
+        )
+
+    def _get_scalar_exception(self, opname, pa_dtype):
+        arrow_temporal_supported = self._is_temporal_supported(opname, pa_dtype)
+        if opname in {
+            "__mod__",
+            "__rmod__",
+        }:
+            exc = NotImplementedError
+        elif arrow_temporal_supported:
+            exc = None
+        elif opname in ["__add__", "__radd__"] and (
+            pa.types.is_string(pa_dtype) or pa.types.is_binary(pa_dtype)
+        ):
+            exc = None
+        elif not (
+            pa.types.is_floating(pa_dtype)
+            or pa.types.is_integer(pa_dtype)
+            or pa.types.is_decimal(pa_dtype)
+        ):
+            exc = pa.ArrowNotImplementedError
+        else:
+            exc = None
+        return exc
+
+    def _get_arith_xfail_marker(self, opname, pa_dtype):
+        mark = None
+
+        arrow_temporal_supported = self._is_temporal_supported(opname, pa_dtype)
+
+        if (
+            opname == "__rpow__"
+            and (
+                pa.types.is_floating(pa_dtype)
+                or pa.types.is_integer(pa_dtype)
+                or pa.types.is_decimal(pa_dtype)
+            )
+            and not pa_version_under7p0
+        ):
+            mark = pytest.mark.xfail(
+                reason=(
+                    f"GH#29997: 1**pandas.NA == 1 while 1**pyarrow.NA == NULL "
+                    f"for {pa_dtype}"
+                )
+            )
+        elif arrow_temporal_supported:
+            mark = pytest.mark.xfail(
+                raises=TypeError,
+                reason=(
+                    f"{opname} not supported between "
+                    f"pd.NA and {pa_dtype} Python scalar"
+                ),
+            )
+        elif (
+            opname == "__rfloordiv__"
+            and (pa.types.is_integer(pa_dtype) or pa.types.is_decimal(pa_dtype))
+            and not pa_version_under7p0
+        ):
+            mark = pytest.mark.xfail(
+                raises=pa.ArrowInvalid,
+                reason="divide by 0",
+            )
+        elif (
+            opname == "__rtruediv__"
+            and pa.types.is_decimal(pa_dtype)
+            and not pa_version_under7p0
+        ):
+            mark = pytest.mark.xfail(
+                raises=pa.ArrowInvalid,
+                reason="divide by 0",
+            )
+        elif (
+            opname == "__pow__"
+            and pa.types.is_decimal(pa_dtype)
+            and pa_version_under7p0
+        ):
+            mark = pytest.mark.xfail(
+                raises=pa.ArrowInvalid,
+                reason="Invalid decimal function: power_checked",
+            )
+
+        return mark
+
+    def test_arith_series_with_scalar(
+        self, data, all_arithmetic_operators, request, monkeypatch
+    ):
+        pa_dtype = data.dtype.pyarrow_dtype
+
+        if all_arithmetic_operators == "__rmod__" and (
+            pa.types.is_string(pa_dtype) or pa.types.is_binary(pa_dtype)
+        ):
+            pytest.skip("Skip testing Python string formatting")
+
+        self.series_scalar_exc = self._get_scalar_exception(
+            all_arithmetic_operators, pa_dtype
+        )
+
+        mark = self._get_arith_xfail_marker(all_arithmetic_operators, pa_dtype)
+        if mark is not None:
+            request.node.add_marker(mark)
+
+        if (
+            (
+                all_arithmetic_operators == "__floordiv__"
+                and pa.types.is_integer(pa_dtype)
+            )
+            or pa.types.is_duration(pa_dtype)
+            or pa.types.is_timestamp(pa_dtype)
+        ):
+            # BaseOpsUtil._combine always returns int64, while ArrowExtensionArray does
+            # not upcast
+            monkeypatch.setattr(TestBaseArithmeticOps, "_combine", self._patch_combine)
+        super().test_arith_series_with_scalar(data, all_arithmetic_operators)
+
+    def test_arith_frame_with_scalar(
+        self, data, all_arithmetic_operators, request, monkeypatch
+    ):
+        pa_dtype = data.dtype.pyarrow_dtype
+
+        if all_arithmetic_operators == "__rmod__" and (
+            pa.types.is_string(pa_dtype) or pa.types.is_binary(pa_dtype)
+        ):
+            pytest.skip("Skip testing Python string formatting")
+
+        self.frame_scalar_exc = self._get_scalar_exception(
+            all_arithmetic_operators, pa_dtype
+        )
+
+        mark = self._get_arith_xfail_marker(all_arithmetic_operators, pa_dtype)
+        if mark is not None:
+            request.node.add_marker(mark)
+
+        if (
+            (
+                all_arithmetic_operators == "__floordiv__"
+                and pa.types.is_integer(pa_dtype)
+            )
+            or pa.types.is_duration(pa_dtype)
+            or pa.types.is_timestamp(pa_dtype)
+        ):
+            # BaseOpsUtil._combine always returns int64, while ArrowExtensionArray does
+            # not upcast
+            monkeypatch.setattr(TestBaseArithmeticOps, "_combine", self._patch_combine)
+        super().test_arith_frame_with_scalar(data, all_arithmetic_operators)
+
+    def test_arith_series_with_array(
+        self, data, all_arithmetic_operators, request, monkeypatch
+    ):
+        pa_dtype = data.dtype.pyarrow_dtype
+
+        self.series_array_exc = self._get_scalar_exception(
+            all_arithmetic_operators, pa_dtype
+        )
+
+        if (
+            all_arithmetic_operators
+            in (
+                "__sub__",
+                "__rsub__",
+            )
+            and pa.types.is_unsigned_integer(pa_dtype)
+            and not pa_version_under7p0
+        ):
+            request.node.add_marker(
+                pytest.mark.xfail(
+                    raises=pa.ArrowInvalid,
+                    reason=(
+                        f"Implemented pyarrow.compute.subtract_checked "
+                        f"which raises on overflow for {pa_dtype}"
+                    ),
+                )
+            )
+
+        mark = self._get_arith_xfail_marker(all_arithmetic_operators, pa_dtype)
+        if mark is not None:
+            request.node.add_marker(mark)
+
+        op_name = all_arithmetic_operators
+        ser = pd.Series(data)
+        # pd.Series([ser.iloc[0]] * len(ser)) may not return ArrowExtensionArray
+        # since ser.iloc[0] is a python scalar
+        other = pd.Series(pd.array([ser.iloc[0]] * len(ser), dtype=data.dtype))
+
+        if (
+            pa.types.is_floating(pa_dtype)
+            or (
+                pa.types.is_integer(pa_dtype)
+                and all_arithmetic_operators not in ["__truediv__", "__rtruediv__"]
+            )
+            or pa.types.is_duration(pa_dtype)
+            or pa.types.is_timestamp(pa_dtype)
+        ):
+            monkeypatch.setattr(TestBaseArithmeticOps, "_combine", self._patch_combine)
+        self.check_opname(ser, op_name, other, exc=self.series_array_exc)
+
+    def test_add_series_with_extension_array(self, data, request):
+        pa_dtype = data.dtype.pyarrow_dtype
+
+        if pa.types.is_temporal(pa_dtype) and not pa.types.is_duration(pa_dtype):
+            # i.e. timestamp, date, time, but not timedelta; these *should*
+            # raise when trying to add
+            ser = pd.Series(data)
+            if pa_version_under7p0:
+                msg = "Function add_checked has no kernel matching input types"
+            else:
+                msg = "Function 'add_checked' has no kernel matching input types"
+            with pytest.raises(NotImplementedError, match=msg):
+                # TODO: this is a pa.lib.ArrowNotImplementedError, might
+                # be better to reraise a TypeError; more consistent with
+                # non-pyarrow cases
+                ser + data
+
+            return
+
+        if (pa_version_under8p0 and pa.types.is_duration(pa_dtype)) or (
+            pa.types.is_boolean(pa_dtype)
+        ):
+            request.node.add_marker(
+                pytest.mark.xfail(
+                    raises=NotImplementedError,
+                    reason=f"add_checked not implemented for {pa_dtype}",
+                )
+            )
+        elif pa_dtype.equals("int8"):
+            request.node.add_marker(
+                pytest.mark.xfail(
+                    raises=pa.ArrowInvalid,
+                    reason=f"raises on overflow for {pa_dtype}",
+                )
+            )
+        super().test_add_series_with_extension_array(data)
+
+
+class TestBaseComparisonOps(base.BaseComparisonOpsTests):
+    def test_compare_array(self, data, comparison_op, na_value):
+        ser = pd.Series(data)
+        # pd.Series([ser.iloc[0]] * len(ser)) may not return ArrowExtensionArray
+        # since ser.iloc[0] is a python scalar
+        other = pd.Series(pd.array([ser.iloc[0]] * len(ser), dtype=data.dtype))
+        if comparison_op.__name__ in ["eq", "ne"]:
+            # comparison should match point-wise comparisons
+            result = comparison_op(ser, other)
+            # Series.combine does not calculate the NA mask correctly
+            # when comparing over an array
+            assert result[8] is na_value
+            assert result[97] is na_value
+            expected = ser.combine(other, comparison_op)
+            expected[8] = na_value
+            expected[97] = na_value
+            self.assert_series_equal(result, expected)
+
+        else:
+            exc = None
+            try:
+                result = comparison_op(ser, other)
+            except Exception as err:
+                exc = err
+
+            if exc is None:
+                # Didn't error, then should match point-wise behavior
+                expected = ser.combine(other, comparison_op)
+                self.assert_series_equal(result, expected)
+            else:
+                with pytest.raises(type(exc)):
+                    ser.combine(other, comparison_op)
+
+    def test_invalid_other_comp(self, data, comparison_op):
+        # GH 48833
+        with pytest.raises(
+            NotImplementedError, match=".* not implemented for <class 'object'>"
+        ):
+            comparison_op(data, object())
+
+    @pytest.mark.parametrize("masked_dtype", ["boolean", "Int64", "Float64"])
+    def test_comp_masked_numpy(self, masked_dtype, comparison_op):
+        # GH 52625
+        data = [1, 0, None]
+        ser_masked = pd.Series(data, dtype=masked_dtype)
+        ser_pa = pd.Series(data, dtype=f"{masked_dtype.lower()}[pyarrow]")
+        result = comparison_op(ser_pa, ser_masked)
+        if comparison_op in [operator.lt, operator.gt, operator.ne]:
+            exp = [False, False, None]
+        else:
+            exp = [True, True, None]
+        expected = pd.Series(exp, dtype=ArrowDtype(pa.bool_()))
+        tm.assert_series_equal(result, expected)
+
+
+class TestLogicalOps:
+    """Various Series and DataFrame logical ops methods."""
+
+    def test_kleene_or(self):
+        a = pd.Series([True] * 3 + [False] * 3 + [None] * 3, dtype="boolean[pyarrow]")
+        b = pd.Series([True, False, None] * 3, dtype="boolean[pyarrow]")
+        result = a | b
+        expected = pd.Series(
+            [True, True, True, True, False, None, True, None, None],
+            dtype="boolean[pyarrow]",
+        )
+        tm.assert_series_equal(result, expected)
+
+        result = b | a
+        tm.assert_series_equal(result, expected)
+
+        # ensure we haven't mutated anything inplace
+        tm.assert_series_equal(
+            a,
+            pd.Series([True] * 3 + [False] * 3 + [None] * 3, dtype="boolean[pyarrow]"),
+        )
+        tm.assert_series_equal(
+            b, pd.Series([True, False, None] * 3, dtype="boolean[pyarrow]")
+        )
+
+    @pytest.mark.parametrize(
+        "other, expected",
+        [
+            (None, [True, None, None]),
+            (pd.NA, [True, None, None]),
+            (True, [True, True, True]),
+            (np.bool_(True), [True, True, True]),
+            (False, [True, False, None]),
+            (np.bool_(False), [True, False, None]),
+        ],
+    )
+    def test_kleene_or_scalar(self, other, expected):
+        a = pd.Series([True, False, None], dtype="boolean[pyarrow]")
+        result = a | other
+        expected = pd.Series(expected, dtype="boolean[pyarrow]")
+        tm.assert_series_equal(result, expected)
+
+        result = other | a
+        tm.assert_series_equal(result, expected)
+
+        # ensure we haven't mutated anything inplace
+        tm.assert_series_equal(
+            a, pd.Series([True, False, None], dtype="boolean[pyarrow]")
+        )
+
+    def test_kleene_and(self):
+        a = pd.Series([True] * 3 + [False] * 3 + [None] * 3, dtype="boolean[pyarrow]")
+        b = pd.Series([True, False, None] * 3, dtype="boolean[pyarrow]")
+        result = a & b
+        expected = pd.Series(
+            [True, False, None, False, False, False, None, False, None],
+            dtype="boolean[pyarrow]",
+        )
+        tm.assert_series_equal(result, expected)
+
+        result = b & a
+        tm.assert_series_equal(result, expected)
+
+        # ensure we haven't mutated anything inplace
+        tm.assert_series_equal(
+            a,
+            pd.Series([True] * 3 + [False] * 3 + [None] * 3, dtype="boolean[pyarrow]"),
+        )
+        tm.assert_series_equal(
+            b, pd.Series([True, False, None] * 3, dtype="boolean[pyarrow]")
+        )
+
+    @pytest.mark.parametrize(
+        "other, expected",
+        [
+            (None, [None, False, None]),
+            (pd.NA, [None, False, None]),
+            (True, [True, False, None]),
+            (False, [False, False, False]),
+            (np.bool_(True), [True, False, None]),
+            (np.bool_(False), [False, False, False]),
+        ],
+    )
+    def test_kleene_and_scalar(self, other, expected):
+        a = pd.Series([True, False, None], dtype="boolean[pyarrow]")
+        result = a & other
+        expected = pd.Series(expected, dtype="boolean[pyarrow]")
+        tm.assert_series_equal(result, expected)
+
+        result = other & a
+        tm.assert_series_equal(result, expected)
+
+        # ensure we haven't mutated anything inplace
+        tm.assert_series_equal(
+            a, pd.Series([True, False, None], dtype="boolean[pyarrow]")
+        )
+
+    def test_kleene_xor(self):
+        a = pd.Series([True] * 3 + [False] * 3 + [None] * 3, dtype="boolean[pyarrow]")
+        b = pd.Series([True, False, None] * 3, dtype="boolean[pyarrow]")
+        result = a ^ b
+        expected = pd.Series(
+            [False, True, None, True, False, None, None, None, None],
+            dtype="boolean[pyarrow]",
+        )
+        tm.assert_series_equal(result, expected)
+
+        result = b ^ a
+        tm.assert_series_equal(result, expected)
+
+        # ensure we haven't mutated anything inplace
+        tm.assert_series_equal(
+            a,
+            pd.Series([True] * 3 + [False] * 3 + [None] * 3, dtype="boolean[pyarrow]"),
+        )
+        tm.assert_series_equal(
+            b, pd.Series([True, False, None] * 3, dtype="boolean[pyarrow]")
+        )
+
+    @pytest.mark.parametrize(
+        "other, expected",
+        [
+            (None, [None, None, None]),
+            (pd.NA, [None, None, None]),
+            (True, [False, True, None]),
+            (np.bool_(True), [False, True, None]),
+            (np.bool_(False), [True, False, None]),
+        ],
+    )
+    def test_kleene_xor_scalar(self, other, expected):
+        a = pd.Series([True, False, None], dtype="boolean[pyarrow]")
+        result = a ^ other
+        expected = pd.Series(expected, dtype="boolean[pyarrow]")
+        tm.assert_series_equal(result, expected)
+
+        result = other ^ a
+        tm.assert_series_equal(result, expected)
+
+        # ensure we haven't mutated anything inplace
+        tm.assert_series_equal(
+            a, pd.Series([True, False, None], dtype="boolean[pyarrow]")
+        )
+
+    @pytest.mark.parametrize(
+        "op, exp",
+        [
+            ["__and__", True],
+            ["__or__", True],
+            ["__xor__", False],
+        ],
+    )
+    def test_logical_masked_numpy(self, op, exp):
+        # GH 52625
+        data = [True, False, None]
+        ser_masked = pd.Series(data, dtype="boolean")
+        ser_pa = pd.Series(data, dtype="boolean[pyarrow]")
+        result = getattr(ser_pa, op)(ser_masked)
+        expected = pd.Series([exp, False, None], dtype=ArrowDtype(pa.bool_()))
+        tm.assert_series_equal(result, expected)
+
+
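The `TestLogicalOps` cases above all assert Kleene (three-valued) semantics for `boolean[pyarrow]` Series: `True | NA` is `True` and `False & NA` is `False`, because the unknown operand cannot change the result, while `NA` propagates everywhere else. As a sanity check independent of pandas/pyarrow, the truth tables can be modeled in plain Python with `None` standing in for NA; the `kleene_*` helper names below are illustrative only, not part of either library.

```python
def kleene_or(x, y):
    # True dominates; otherwise an unknown operand makes the result unknown.
    if x is True or y is True:
        return True
    if x is None or y is None:
        return None
    return False

def kleene_and(x, y):
    # False dominates; otherwise an unknown operand makes the result unknown.
    if x is False or y is False:
        return False
    if x is None or y is None:
        return None
    return True

def kleene_xor(x, y):
    # XOR is never decidable when either operand is unknown.
    if x is None or y is None:
        return None
    return x != y

# The same 9 operand pairs used by test_kleene_or/test_kleene_and/test_kleene_xor:
a = [True] * 3 + [False] * 3 + [None] * 3
b = [True, False, None] * 3
print([kleene_or(x, y) for x, y in zip(a, b)])
# → [True, True, True, True, False, None, True, None, None]
```

These helpers reproduce exactly the `expected` lists asserted in the tests above, which is why each test can also verify commutativity (`a | b` equals `b | a`, and likewise for `&` and `^`).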
+def test_arrowdtype_construct_from_string_type_with_unsupported_parameters():
+    with pytest.raises(NotImplementedError, match="Passing pyarrow type"):
+        ArrowDtype.construct_from_string("not_a_real_dype[s, tz=UTC][pyarrow]")
+
+    # but as of GH#50689, timestamptz is supported
+    dtype = ArrowDtype.construct_from_string("timestamp[s, tz=UTC][pyarrow]")
+    expected = ArrowDtype(pa.timestamp("s", "UTC"))
+    assert dtype == expected
+
+    with pytest.raises(NotImplementedError, match="Passing pyarrow type"):
+        ArrowDtype.construct_from_string("decimal(7, 2)[pyarrow]")
+
+
+def test_arrowdtype_construct_from_string_type_only_one_pyarrow():
+    # GH#51225
+    invalid = "int64[pyarrow]foobar[pyarrow]"
+    msg = (
+        r"Passing pyarrow type specific parameters \(\[pyarrow\]\) in the "
+        r"string is not supported\."
+    )
+    with pytest.raises(NotImplementedError, match=msg):
+        pd.Series(range(3), dtype=invalid)
+
+
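The two `construct_from_string` tests above pin down the shape of an `ArrowDtype` string: it must end in a single `[pyarrow]` suffix, and a leftover `[pyarrow]` in the remaining alias is rejected. A simplified standalone sketch of just that suffix rule (the real parser also resolves the alias via pyarrow and accepts some parameterized types such as `timestamp[s, tz=UTC]`; `parse_arrow_dtype_string` is a hypothetical helper, not pandas API):

```python
def parse_arrow_dtype_string(string):
    # Sketch of the "<alias>[pyarrow]" suffix rule exercised above.
    if not string.endswith("[pyarrow]"):
        # mirrors: "'another_type' must end with '[pyarrow]'"
        raise TypeError(f"'{string}' must end with '[pyarrow]'")
    base = string[: -len("[pyarrow]")]
    if "[pyarrow]" in base:
        # mirrors GH#51225: only one [pyarrow] suffix is allowed
        raise NotImplementedError(
            "Passing pyarrow type specific parameters ([pyarrow]) in the "
            "string is not supported."
        )
    return base

print(parse_arrow_dtype_string("int64[pyarrow]"))
# → int64
```

Under this rule, `"int64[pyarrow]foobar[pyarrow]"` strips to `"int64[pyarrow]foobar"`, which still contains `[pyarrow]` and is therefore rejected, matching the second test.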
+@pytest.mark.parametrize(
+    "interpolation", ["linear", "lower", "higher", "nearest", "midpoint"]
+)
+@pytest.mark.parametrize("quantile", [0.5, [0.5, 0.5]])
+def test_quantile(data, interpolation, quantile, request):
+    pa_dtype = data.dtype.pyarrow_dtype
+
+    data = data.take([0, 0, 0])
+    ser = pd.Series(data)
+
+    if (
+        pa.types.is_string(pa_dtype)
+        or pa.types.is_binary(pa_dtype)
+        or pa.types.is_boolean(pa_dtype)
+    ):
+        # For string, bytes, and bool, we don't *expect* to have quantile work
+        # Note this matches the non-pyarrow behavior
+        if pa_version_under7p0:
+            msg = r"Function quantile has no kernel matching input types \(.*\)"
+        else:
+            msg = r"Function 'quantile' has no kernel matching input types \(.*\)"
+        with pytest.raises(pa.ArrowNotImplementedError, match=msg):
+            ser.quantile(q=quantile, interpolation=interpolation)
+        return
+
+    if (
+        pa.types.is_integer(pa_dtype)
+        or pa.types.is_floating(pa_dtype)
+        or (pa.types.is_decimal(pa_dtype) and not pa_version_under7p0)
+    ):
+        pass
+    elif pa.types.is_temporal(data._data.type):
+        pass
+    else:
+        request.node.add_marker(
+            pytest.mark.xfail(
+                raises=pa.ArrowNotImplementedError,
+                reason=f"quantile not supported by pyarrow for {pa_dtype}",
+            )
+        )
+    data = data.take([0, 0, 0])
+    ser = pd.Series(data)
+    result = ser.quantile(q=quantile, interpolation=interpolation)
+
+    if pa.types.is_timestamp(pa_dtype) and interpolation not in ["lower", "higher"]:
+        # rounding error will make the check below fail
+        # (e.g. '2020-01-01 01:01:01.000001' vs '2020-01-01 01:01:01.000001024'),
+        # so we'll check for now that we match the numpy analogue
+        if pa_dtype.tz:
+            pd_dtype = f"M8[{pa_dtype.unit}, {pa_dtype.tz}]"
+        else:
+            pd_dtype = f"M8[{pa_dtype.unit}]"
+        ser_np = ser.astype(pd_dtype)
+
+        expected = ser_np.quantile(q=quantile, interpolation=interpolation)
+        if quantile == 0.5:
+            if pa_dtype.unit == "us":
+                expected = expected.to_pydatetime(warn=False)
+            assert result == expected
+        else:
+            if pa_dtype.unit == "us":
+                expected = expected.dt.floor("us")
+            tm.assert_series_equal(result, expected.astype(data.dtype))
+        return
+
+    if quantile == 0.5:
+        assert result == data[0]
+    else:
+        # Just check the values
+        expected = pd.Series(data.take([0, 0]), index=[0.5, 0.5])
+        if (
+            pa.types.is_integer(pa_dtype)
+            or pa.types.is_floating(pa_dtype)
+            or pa.types.is_decimal(pa_dtype)
+        ):
+            expected = expected.astype("float64[pyarrow]")
+            result = result.astype("float64[pyarrow]")
1558
+ tm.assert_series_equal(result, expected)
1559
+
1560
+
1561
+ @pytest.mark.parametrize(
1562
+ "take_idx, exp_idx",
1563
+ [[[0, 0, 2, 2, 4, 4], [0, 4]], [[0, 0, 0, 2, 4, 4], [0]]],
1564
+ ids=["multi_mode", "single_mode"],
1565
+ )
1566
+ def test_mode_dropna_true(data_for_grouping, take_idx, exp_idx):
1567
+ data = data_for_grouping.take(take_idx)
1568
+ ser = pd.Series(data)
1569
+ result = ser.mode(dropna=True)
1570
+ expected = pd.Series(data_for_grouping.take(exp_idx))
1571
+ tm.assert_series_equal(result, expected)
1572
+
1573
+
1574
+ def test_mode_dropna_false_mode_na(data):
1575
+ # GH 50982
1576
+ more_nans = pd.Series([None, None, data[0]], dtype=data.dtype)
1577
+ result = more_nans.mode(dropna=False)
1578
+ expected = pd.Series([None], dtype=data.dtype)
1579
+ tm.assert_series_equal(result, expected)
1580
+
1581
+ expected = pd.Series([None, data[0]], dtype=data.dtype)
1582
+ result = expected.mode(dropna=False)
1583
+ tm.assert_series_equal(result, expected)
1584
+
1585
+
@pytest.mark.parametrize(
    "arrow_dtype, expected_type",
    [
        [pa.binary(), bytes],
        [pa.binary(16), bytes],
        [pa.large_binary(), bytes],
        [pa.large_string(), str],
        [pa.list_(pa.int64()), list],
        [pa.large_list(pa.int64()), list],
        [pa.map_(pa.string(), pa.int64()), list],
        [pa.struct([("f1", pa.int8()), ("f2", pa.string())]), dict],
        [pa.dictionary(pa.int64(), pa.int64()), CategoricalDtypeType],
    ],
)
def test_arrow_dtype_type(arrow_dtype, expected_type):
    # GH 51845
    # TODO: Redundant with test_getitem_scalar once arrow_dtype exists in data fixture
    assert ArrowDtype(arrow_dtype).type == expected_type


def test_is_bool_dtype():
    # GH 22667
    data = ArrowExtensionArray(pa.array([True, False, True]))
    assert is_bool_dtype(data)
    assert pd.core.common.is_bool_indexer(data)
    s = pd.Series(range(len(data)))
    result = s[data]
    expected = s[np.asarray(data)]
    tm.assert_series_equal(result, expected)


def test_is_numeric_dtype(data):
    # GH 50563
    pa_type = data.dtype.pyarrow_dtype
    if (
        pa.types.is_floating(pa_type)
        or pa.types.is_integer(pa_type)
        or pa.types.is_decimal(pa_type)
    ):
        assert is_numeric_dtype(data)
    else:
        assert not is_numeric_dtype(data)


def test_is_integer_dtype(data):
    # GH 50667
    pa_type = data.dtype.pyarrow_dtype
    if pa.types.is_integer(pa_type):
        assert is_integer_dtype(data)
    else:
        assert not is_integer_dtype(data)


def test_is_any_integer_dtype(data):
    # GH 50667
    pa_type = data.dtype.pyarrow_dtype
    if pa.types.is_integer(pa_type):
        assert is_any_int_dtype(data)
    else:
        assert not is_any_int_dtype(data)


def test_is_signed_integer_dtype(data):
    pa_type = data.dtype.pyarrow_dtype
    if pa.types.is_signed_integer(pa_type):
        assert is_signed_integer_dtype(data)
    else:
        assert not is_signed_integer_dtype(data)


def test_is_unsigned_integer_dtype(data):
    pa_type = data.dtype.pyarrow_dtype
    if pa.types.is_unsigned_integer(pa_type):
        assert is_unsigned_integer_dtype(data)
    else:
        assert not is_unsigned_integer_dtype(data)


def test_is_float_dtype(data):
    pa_type = data.dtype.pyarrow_dtype
    if pa.types.is_floating(pa_type):
        assert is_float_dtype(data)
    else:
        assert not is_float_dtype(data)

def test_pickle_roundtrip(data):
    # GH 42600
    expected = pd.Series(data)
    expected_sliced = expected.head(2)
    full_pickled = pickle.dumps(expected)
    sliced_pickled = pickle.dumps(expected_sliced)

    assert len(full_pickled) > len(sliced_pickled)

    result = pickle.loads(full_pickled)
    tm.assert_series_equal(result, expected)

    result_sliced = pickle.loads(sliced_pickled)
    tm.assert_series_equal(result_sliced, expected_sliced)


def test_astype_from_non_pyarrow(data):
    # GH49795
    pd_array = data._data.to_pandas().array
    result = pd_array.astype(data.dtype)
    assert not isinstance(pd_array.dtype, ArrowDtype)
    assert isinstance(result.dtype, ArrowDtype)
    tm.assert_extension_array_equal(result, data)


def test_astype_float_from_non_pyarrow_str():
    # GH50430
    ser = pd.Series(["1.0"])
    result = ser.astype("float64[pyarrow]")
    expected = pd.Series([1.0], dtype="float64[pyarrow]")
    tm.assert_series_equal(result, expected)


def test_to_numpy_with_defaults(data):
    # GH49973
    result = data.to_numpy()

    pa_type = data._data.type
    if pa.types.is_duration(pa_type) or pa.types.is_timestamp(pa_type):
        expected = np.array(list(data))
    else:
        expected = np.array(data._data)

    if data._hasna:
        expected = expected.astype(object)
        expected[pd.isna(data)] = pd.NA

    tm.assert_numpy_array_equal(result, expected)


def test_to_numpy_int_with_na():
    # GH51227: ensure to_numpy does not convert int to float
    data = [1, None]
    arr = pd.array(data, dtype="int64[pyarrow]")
    result = arr.to_numpy()
    expected = np.array([1, pd.NA], dtype=object)
    assert isinstance(result[0], int)
    tm.assert_numpy_array_equal(result, expected)


@pytest.mark.parametrize("na_val, exp", [(lib.no_default, np.nan), (1, 1)])
def test_to_numpy_null_array(na_val, exp):
    # GH#52443
    arr = pd.array([pd.NA, pd.NA], dtype="null[pyarrow]")
    result = arr.to_numpy(dtype="float64", na_value=na_val)
    expected = np.array([exp] * 2, dtype="float64")
    tm.assert_numpy_array_equal(result, expected)


def test_to_numpy_null_array_no_dtype():
    # GH#52443
    arr = pd.array([pd.NA, pd.NA], dtype="null[pyarrow]")
    result = arr.to_numpy(dtype=None)
    expected = np.array([pd.NA] * 2, dtype="object")
    tm.assert_numpy_array_equal(result, expected)

def test_setitem_null_slice(data):
    # GH50248
    orig = data.copy()

    result = orig.copy()
    result[:] = data[0]
    expected = ArrowExtensionArray(
        pa.array([data[0]] * len(data), type=data._data.type)
    )
    tm.assert_extension_array_equal(result, expected)

    result = orig.copy()
    result[:] = data[::-1]
    expected = data[::-1]
    tm.assert_extension_array_equal(result, expected)

    result = orig.copy()
    result[:] = data.tolist()
    expected = data
    tm.assert_extension_array_equal(result, expected)


def test_setitem_invalid_dtype(data):
    # GH50248
    pa_type = data._data.type
    if pa.types.is_string(pa_type) or pa.types.is_binary(pa_type):
        fill_value = 123
        err = TypeError
        msg = "Invalid value '123' for dtype"
    elif (
        pa.types.is_integer(pa_type)
        or pa.types.is_floating(pa_type)
        or pa.types.is_boolean(pa_type)
    ):
        fill_value = "foo"
        err = pa.ArrowInvalid
        msg = "Could not convert"
    else:
        fill_value = "foo"
        err = TypeError
        msg = "Invalid value 'foo' for dtype"
    with pytest.raises(err, match=msg):
        data[:] = fill_value


@pytest.mark.skipif(pa_version_under8p0, reason="returns object with 7.0")
def test_from_arrow_respecting_given_dtype():
    date_array = pa.array(
        [pd.Timestamp("2019-12-31"), pd.Timestamp("2019-12-31")], type=pa.date32()
    )
    result = date_array.to_pandas(
        types_mapper={pa.date32(): ArrowDtype(pa.date64())}.get
    )
    expected = pd.Series(
        [pd.Timestamp("2019-12-31"), pd.Timestamp("2019-12-31")],
        dtype=ArrowDtype(pa.date64()),
    )
    tm.assert_series_equal(result, expected)


@pytest.mark.skipif(pa_version_under8p0, reason="doesn't raise with 7")
def test_from_arrow_respecting_given_dtype_unsafe():
    array = pa.array([1.5, 2.5], type=pa.float64())
    with pytest.raises(pa.ArrowInvalid, match="Float value 1.5 was truncated"):
        array.to_pandas(types_mapper={pa.float64(): ArrowDtype(pa.int64())}.get)

def test_round():
    dtype = "float64[pyarrow]"

    ser = pd.Series([0.0, 1.23, 2.56, pd.NA], dtype=dtype)
    result = ser.round(1)
    expected = pd.Series([0.0, 1.2, 2.6, pd.NA], dtype=dtype)
    tm.assert_series_equal(result, expected)

    ser = pd.Series([123.4, pd.NA, 56.78], dtype=dtype)
    result = ser.round(-1)
    expected = pd.Series([120.0, pd.NA, 60.0], dtype=dtype)
    tm.assert_series_equal(result, expected)


def test_searchsorted_with_na_raises(data_for_sorting, as_series):
    # GH50447
    b, c, a = data_for_sorting
    arr = data_for_sorting.take([2, 0, 1])  # to get [a, b, c]
    arr[-1] = pd.NA

    if as_series:
        arr = pd.Series(arr)

    msg = (
        "searchsorted requires array to be sorted, "
        "which is impossible with NAs present."
    )
    with pytest.raises(ValueError, match=msg):
        arr.searchsorted(b)


def test_sort_values_dictionary():
    df = pd.DataFrame(
        {
            "a": pd.Series(
                ["x", "y"], dtype=ArrowDtype(pa.dictionary(pa.int32(), pa.string()))
            ),
            "b": [1, 2],
        },
    )
    expected = df.copy()
    result = df.sort_values(by=["a", "b"])
    tm.assert_frame_equal(result, expected)


@pytest.mark.parametrize("pat", ["abc", "a[a-z]{2}"])
def test_str_count(pat):
    ser = pd.Series(["abc", None], dtype=ArrowDtype(pa.string()))
    result = ser.str.count(pat)
    expected = pd.Series([1, None], dtype=ArrowDtype(pa.int32()))
    tm.assert_series_equal(result, expected)


def test_str_count_flags_unsupported():
    ser = pd.Series(["abc", None], dtype=ArrowDtype(pa.string()))
    with pytest.raises(NotImplementedError, match="count not"):
        ser.str.count("abc", flags=1)


@pytest.mark.parametrize(
    "side, str_func", [["left", "rjust"], ["right", "ljust"], ["both", "center"]]
)
def test_str_pad(side, str_func):
    ser = pd.Series(["a", None], dtype=ArrowDtype(pa.string()))
    result = ser.str.pad(width=3, side=side, fillchar="x")
    expected = pd.Series(
        [getattr("a", str_func)(3, "x"), None], dtype=ArrowDtype(pa.string())
    )
    tm.assert_series_equal(result, expected)


def test_str_pad_invalid_side():
    ser = pd.Series(["a", None], dtype=ArrowDtype(pa.string()))
    with pytest.raises(ValueError, match="Invalid side: foo"):
        ser.str.pad(3, "foo", "x")


@pytest.mark.parametrize(
    "pat, case, na, regex, exp",
    [
        ["ab", False, None, False, [True, None]],
        ["Ab", True, None, False, [False, None]],
        ["ab", False, True, False, [True, True]],
        ["a[a-z]{1}", False, None, True, [True, None]],
        ["A[a-z]{1}", True, None, True, [False, None]],
    ],
)
def test_str_contains(pat, case, na, regex, exp):
    ser = pd.Series(["abc", None], dtype=ArrowDtype(pa.string()))
    result = ser.str.contains(pat, case=case, na=na, regex=regex)
    expected = pd.Series(exp, dtype=ArrowDtype(pa.bool_()))
    tm.assert_series_equal(result, expected)


def test_str_contains_flags_unsupported():
    ser = pd.Series(["abc", None], dtype=ArrowDtype(pa.string()))
    with pytest.raises(NotImplementedError, match="contains not"):
        ser.str.contains("a", flags=1)

@pytest.mark.parametrize(
    "side, pat, na, exp",
    [
        ["startswith", "ab", None, [True, None]],
        ["startswith", "b", False, [False, False]],
        ["endswith", "b", True, [False, True]],
        ["endswith", "bc", None, [True, None]],
    ],
)
def test_str_start_ends_with(side, pat, na, exp):
    ser = pd.Series(["abc", None], dtype=ArrowDtype(pa.string()))
    result = getattr(ser.str, side)(pat, na=na)
    expected = pd.Series(exp, dtype=ArrowDtype(pa.bool_()))
    tm.assert_series_equal(result, expected)


@pytest.mark.parametrize(
    "arg_name, arg",
    [["pat", re.compile("b")], ["repl", str], ["case", False], ["flags", 1]],
)
def test_str_replace_unsupported(arg_name, arg):
    ser = pd.Series(["abc", None], dtype=ArrowDtype(pa.string()))
    kwargs = {"pat": "b", "repl": "x", "regex": True}
    kwargs[arg_name] = arg
    with pytest.raises(NotImplementedError, match="replace is not supported"):
        ser.str.replace(**kwargs)


@pytest.mark.parametrize(
    "pat, repl, n, regex, exp",
    [
        ["a", "x", -1, False, ["xbxc", None]],
        ["a", "x", 1, False, ["xbac", None]],
        ["[a-b]", "x", -1, True, ["xxxc", None]],
    ],
)
def test_str_replace(pat, repl, n, regex, exp):
    ser = pd.Series(["abac", None], dtype=ArrowDtype(pa.string()))
    result = ser.str.replace(pat, repl, n=n, regex=regex)
    expected = pd.Series(exp, dtype=ArrowDtype(pa.string()))
    tm.assert_series_equal(result, expected)


def test_str_repeat_unsupported():
    ser = pd.Series(["abc", None], dtype=ArrowDtype(pa.string()))
    with pytest.raises(NotImplementedError, match="repeat is not"):
        ser.str.repeat([1, 2])


@pytest.mark.xfail(
    pa_version_under7p0,
    reason="Unsupported for pyarrow < 7",
    raises=NotImplementedError,
)
def test_str_repeat():
    ser = pd.Series(["abc", None], dtype=ArrowDtype(pa.string()))
    result = ser.str.repeat(2)
    expected = pd.Series(["abcabc", None], dtype=ArrowDtype(pa.string()))
    tm.assert_series_equal(result, expected)


@pytest.mark.parametrize(
    "pat, case, na, exp",
    [
        ["ab", False, None, [True, None]],
        ["Ab", True, None, [False, None]],
        ["bc", True, None, [False, None]],
        ["ab", False, True, [True, True]],
        ["a[a-z]{1}", False, None, [True, None]],
        ["A[a-z]{1}", True, None, [False, None]],
    ],
)
def test_str_match(pat, case, na, exp):
    ser = pd.Series(["abc", None], dtype=ArrowDtype(pa.string()))
    result = ser.str.match(pat, case=case, na=na)
    expected = pd.Series(exp, dtype=ArrowDtype(pa.bool_()))
    tm.assert_series_equal(result, expected)


@pytest.mark.parametrize(
    "pat, case, na, exp",
    [
        ["abc", False, None, [True, None]],
        ["Abc", True, None, [False, None]],
        ["bc", True, None, [False, None]],
        ["ab", False, True, [True, True]],
        ["a[a-z]{2}", False, None, [True, None]],
        ["A[a-z]{1}", True, None, [False, None]],
    ],
)
def test_str_fullmatch(pat, case, na, exp):
    ser = pd.Series(["abc", None], dtype=ArrowDtype(pa.string()))
    result = ser.str.match(pat, case=case, na=na)
    expected = pd.Series(exp, dtype=ArrowDtype(pa.bool_()))
    tm.assert_series_equal(result, expected)


@pytest.mark.parametrize(
    "sub, start, end, exp, exp_typ",
    [["ab", 0, None, [0, None], pa.int32()], ["bc", 1, 3, [2, None], pa.int64()]],
)
def test_str_find(sub, start, end, exp, exp_typ):
    ser = pd.Series(["abc", None], dtype=ArrowDtype(pa.string()))
    result = ser.str.find(sub, start=start, end=end)
    expected = pd.Series(exp, dtype=ArrowDtype(exp_typ))
    tm.assert_series_equal(result, expected)


def test_str_find_notimplemented():
    ser = pd.Series(["abc", None], dtype=ArrowDtype(pa.string()))
    with pytest.raises(NotImplementedError, match="find not implemented"):
        ser.str.find("ab", start=1)

@pytest.mark.parametrize(
    "i, exp",
    [
        [1, ["b", "e", None]],
        [-1, ["c", "e", None]],
        [2, ["c", None, None]],
        [-3, ["a", None, None]],
        [4, [None, None, None]],
    ],
)
def test_str_get(i, exp):
    ser = pd.Series(["abc", "de", None], dtype=ArrowDtype(pa.string()))
    result = ser.str.get(i)
    expected = pd.Series(exp, dtype=ArrowDtype(pa.string()))
    tm.assert_series_equal(result, expected)


@pytest.mark.xfail(
    reason="TODO: StringMethods._validate should support Arrow list types",
    raises=AttributeError,
)
def test_str_join():
    ser = pd.Series(ArrowExtensionArray(pa.array([list("abc"), list("123"), None])))
    result = ser.str.join("=")
    expected = pd.Series(["a=b=c", "1=2=3", None], dtype=ArrowDtype(pa.string()))
    tm.assert_series_equal(result, expected)


@pytest.mark.parametrize(
    "start, stop, step, exp",
    [
        [None, 2, None, ["ab", None]],
        [None, 2, 1, ["ab", None]],
        [1, 3, 1, ["bc", None]],
    ],
)
def test_str_slice(start, stop, step, exp):
    ser = pd.Series(["abcd", None], dtype=ArrowDtype(pa.string()))
    result = ser.str.slice(start, stop, step)
    expected = pd.Series(exp, dtype=ArrowDtype(pa.string()))
    tm.assert_series_equal(result, expected)


@pytest.mark.parametrize(
    "start, stop, repl, exp",
    [
        [1, 2, "x", ["axcd", None]],
        [None, 2, "x", ["xcd", None]],
        [None, 2, None, ["cd", None]],
    ],
)
def test_str_slice_replace(start, stop, repl, exp):
    ser = pd.Series(["abcd", None], dtype=ArrowDtype(pa.string()))
    result = ser.str.slice_replace(start, stop, repl)
    expected = pd.Series(exp, dtype=ArrowDtype(pa.string()))
    tm.assert_series_equal(result, expected)


@pytest.mark.parametrize(
    "value, method, exp",
    [
        ["a1c", "isalnum", True],
        ["!|,", "isalnum", False],
        ["aaa", "isalpha", True],
        ["!!!", "isalpha", False],
        ["٠", "isdecimal", True],
        ["~!", "isdecimal", False],
        ["2", "isdigit", True],
        ["~", "isdigit", False],
        ["aaa", "islower", True],
        ["aaA", "islower", False],
        ["123", "isnumeric", True],
        ["11I", "isnumeric", False],
        [" ", "isspace", True],
        ["", "isspace", False],
        ["The That", "istitle", True],
        ["the That", "istitle", False],
        ["AAA", "isupper", True],
        ["AAc", "isupper", False],
    ],
)
def test_str_is_functions(value, method, exp):
    ser = pd.Series([value, None], dtype=ArrowDtype(pa.string()))
    result = getattr(ser.str, method)()
    expected = pd.Series([exp, None], dtype=ArrowDtype(pa.bool_()))
    tm.assert_series_equal(result, expected)


@pytest.mark.parametrize(
    "method, exp",
    [
        ["capitalize", "Abc def"],
        ["title", "Abc Def"],
        ["swapcase", "AbC Def"],
        ["lower", "abc def"],
        ["upper", "ABC DEF"],
        ["casefold", "abc def"],
    ],
)
def test_str_transform_functions(method, exp):
    ser = pd.Series(["aBc dEF", None], dtype=ArrowDtype(pa.string()))
    result = getattr(ser.str, method)()
    expected = pd.Series([exp, None], dtype=ArrowDtype(pa.string()))
    tm.assert_series_equal(result, expected)


def test_str_len():
    ser = pd.Series(["abcd", None], dtype=ArrowDtype(pa.string()))
    result = ser.str.len()
    expected = pd.Series([4, None], dtype=ArrowDtype(pa.int32()))
    tm.assert_series_equal(result, expected)

@pytest.mark.parametrize(
    "method, to_strip, val",
    [
        ["strip", None, " abc "],
        ["strip", "x", "xabcx"],
        ["lstrip", None, " abc"],
        ["lstrip", "x", "xabc"],
        ["rstrip", None, "abc "],
        ["rstrip", "x", "abcx"],
    ],
)
def test_str_strip(method, to_strip, val):
    ser = pd.Series([val, None], dtype=ArrowDtype(pa.string()))
    result = getattr(ser.str, method)(to_strip=to_strip)
    expected = pd.Series(["abc", None], dtype=ArrowDtype(pa.string()))
    tm.assert_series_equal(result, expected)


@pytest.mark.parametrize("val", ["abc123", "abc"])
def test_str_removesuffix(val):
    ser = pd.Series([val, None], dtype=ArrowDtype(pa.string()))
    result = ser.str.removesuffix("123")
    expected = pd.Series(["abc", None], dtype=ArrowDtype(pa.string()))
    tm.assert_series_equal(result, expected)


@pytest.mark.parametrize("val", ["123abc", "abc"])
def test_str_removeprefix(val):
    ser = pd.Series([val, None], dtype=ArrowDtype(pa.string()))
    result = ser.str.removeprefix("123")
    expected = pd.Series(["abc", None], dtype=ArrowDtype(pa.string()))
    tm.assert_series_equal(result, expected)


@pytest.mark.parametrize("errors", ["ignore", "strict"])
@pytest.mark.parametrize(
    "encoding, exp",
    [
        ["utf8", b"abc"],
        ["utf32", b"\xff\xfe\x00\x00a\x00\x00\x00b\x00\x00\x00c\x00\x00\x00"],
    ],
)
def test_str_encode(errors, encoding, exp):
    ser = pd.Series(["abc", None], dtype=ArrowDtype(pa.string()))
    result = ser.str.encode(encoding, errors)
    expected = pd.Series([exp, None], dtype=ArrowDtype(pa.binary()))
    tm.assert_series_equal(result, expected)


@pytest.mark.parametrize("flags", [0, 1])
def test_str_findall(flags):
    ser = pd.Series(["abc", "efg", None], dtype=ArrowDtype(pa.string()))
    result = ser.str.findall("b", flags=flags)
    expected = pd.Series([["b"], [], None], dtype=ArrowDtype(pa.list_(pa.string())))
    tm.assert_series_equal(result, expected)


@pytest.mark.parametrize("method", ["index", "rindex"])
@pytest.mark.parametrize(
    "start, end",
    [
        [0, None],
        [1, 4],
    ],
)
def test_str_r_index(method, start, end):
    ser = pd.Series(["abcba", None], dtype=ArrowDtype(pa.string()))
    result = getattr(ser.str, method)("c", start, end)
    expected = pd.Series([2, None], dtype=ArrowDtype(pa.int64()))
    tm.assert_series_equal(result, expected)

    with pytest.raises(ValueError, match="substring not found"):
        getattr(ser.str, method)("foo", start, end)


@pytest.mark.parametrize("form", ["NFC", "NFKC"])
def test_str_normalize(form):
    ser = pd.Series(["abc", None], dtype=ArrowDtype(pa.string()))
    result = ser.str.normalize(form)
    expected = ser.copy()
    tm.assert_series_equal(result, expected)


@pytest.mark.parametrize(
    "start, end",
    [
        [0, None],
        [1, 4],
    ],
)
def test_str_rfind(start, end):
    ser = pd.Series(["abcba", "foo", None], dtype=ArrowDtype(pa.string()))
    result = ser.str.rfind("c", start, end)
    expected = pd.Series([2, -1, None], dtype=ArrowDtype(pa.int64()))
    tm.assert_series_equal(result, expected)


def test_str_translate():
    ser = pd.Series(["abcba", None], dtype=ArrowDtype(pa.string()))
    result = ser.str.translate({97: "b"})
    expected = pd.Series(["bbcbb", None], dtype=ArrowDtype(pa.string()))
    tm.assert_series_equal(result, expected)


def test_str_wrap():
    ser = pd.Series(["abcba", None], dtype=ArrowDtype(pa.string()))
    result = ser.str.wrap(3)
    expected = pd.Series(["abc\nba", None], dtype=ArrowDtype(pa.string()))
    tm.assert_series_equal(result, expected)

def test_get_dummies():
    ser = pd.Series(["a|b", None, "a|c"], dtype=ArrowDtype(pa.string()))
    result = ser.str.get_dummies()
    expected = pd.DataFrame(
        [[True, True, False], [False, False, False], [True, False, True]],
        dtype=ArrowDtype(pa.bool_()),
        columns=["a", "b", "c"],
    )
    tm.assert_frame_equal(result, expected)


def test_str_partition():
    ser = pd.Series(["abcba", None], dtype=ArrowDtype(pa.string()))
    result = ser.str.partition("b")
    expected = pd.DataFrame(
        [["a", "b", "cba"], [None, None, None]], dtype=ArrowDtype(pa.string())
    )
    tm.assert_frame_equal(result, expected)

    result = ser.str.partition("b", expand=False)
    expected = pd.Series(ArrowExtensionArray(pa.array([["a", "b", "cba"], None])))
    tm.assert_series_equal(result, expected)

    result = ser.str.rpartition("b")
    expected = pd.DataFrame(
        [["abc", "b", "a"], [None, None, None]], dtype=ArrowDtype(pa.string())
    )
    tm.assert_frame_equal(result, expected)

    result = ser.str.rpartition("b", expand=False)
    expected = pd.Series(ArrowExtensionArray(pa.array([["abc", "b", "a"], None])))
    tm.assert_series_equal(result, expected)


def test_str_split():
    # GH 52401
    ser = pd.Series(["a1cbcb", "a2cbcb", None], dtype=ArrowDtype(pa.string()))
    result = ser.str.split("c")
    expected = pd.Series(
        ArrowExtensionArray(pa.array([["a1", "b", "b"], ["a2", "b", "b"], None]))
    )
    tm.assert_series_equal(result, expected)

    result = ser.str.split("c", n=1)
    expected = pd.Series(
        ArrowExtensionArray(pa.array([["a1", "bcb"], ["a2", "bcb"], None]))
    )
    tm.assert_series_equal(result, expected)

    result = ser.str.split("[1-2]", regex=True)
    expected = pd.Series(
        ArrowExtensionArray(pa.array([["a", "cbcb"], ["a", "cbcb"], None]))
    )
    tm.assert_series_equal(result, expected)

    result = ser.str.split("[1-2]", regex=True, expand=True)
    expected = pd.DataFrame(
        {
            0: ArrowExtensionArray(pa.array(["a", "a", None])),
            1: ArrowExtensionArray(pa.array(["cbcb", "cbcb", None])),
        }
    )
    tm.assert_frame_equal(result, expected)

    result = ser.str.split("1", expand=True)
    expected = pd.DataFrame(
        {
            0: ArrowExtensionArray(pa.array(["a", "a2cbcb", None])),
            1: ArrowExtensionArray(pa.array(["cbcb", None, None])),
        }
    )
    tm.assert_frame_equal(result, expected)


def test_str_rsplit():
    # GH 52401
    ser = pd.Series(["a1cbcb", "a2cbcb", None], dtype=ArrowDtype(pa.string()))
    result = ser.str.rsplit("c")
    expected = pd.Series(
        ArrowExtensionArray(pa.array([["a1", "b", "b"], ["a2", "b", "b"], None]))
    )
    tm.assert_series_equal(result, expected)

    result = ser.str.rsplit("c", n=1)
    expected = pd.Series(
        ArrowExtensionArray(pa.array([["a1cb", "b"], ["a2cb", "b"], None]))
    )
    tm.assert_series_equal(result, expected)

    result = ser.str.rsplit("c", n=1, expand=True)
    expected = pd.DataFrame(
        {
            0: ArrowExtensionArray(pa.array(["a1cb", "a2cb", None])),
            1: ArrowExtensionArray(pa.array(["b", "b", None])),
        }
    )
    tm.assert_frame_equal(result, expected)

    result = ser.str.rsplit("1", expand=True)
    expected = pd.DataFrame(
        {
            0: ArrowExtensionArray(pa.array(["a", "a2cbcb", None])),
            1: ArrowExtensionArray(pa.array(["cbcb", None, None])),
        }
    )
    tm.assert_frame_equal(result, expected)

def test_str_unsupported_extract():
    ser = pd.Series(["abc", None], dtype=ArrowDtype(pa.string()))
    with pytest.raises(
        NotImplementedError, match="str.extract not supported with pd.ArrowDtype"
    ):
        ser.str.extract(r"[ab](\d)")


@pytest.mark.parametrize("unit", ["ns", "us", "ms", "s"])
def test_duration_from_strings_with_nat(unit):
    # GH51175
    strings = ["1000", "NaT"]
    pa_type = pa.duration(unit)
    result = ArrowExtensionArray._from_sequence_of_strings(strings, dtype=pa_type)
    expected = ArrowExtensionArray(pa.array([1000, None], type=pa_type))
    tm.assert_extension_array_equal(result, expected)


def test_unsupported_dt(data):
    pa_dtype = data.dtype.pyarrow_dtype
    if not pa.types.is_temporal(pa_dtype):
        with pytest.raises(
            AttributeError, match="Can only use .dt accessor with datetimelike values"
        ):
            pd.Series(data).dt


@pytest.mark.parametrize(
    "prop, expected",
    [
        ["year", 2023],
        ["day", 2],
        ["day_of_week", 0],
        ["dayofweek", 0],
        ["weekday", 0],
        ["day_of_year", 2],
        ["dayofyear", 2],
        ["hour", 3],
        ["minute", 4],
        pytest.param(
            "is_leap_year",
            False,
            marks=pytest.mark.xfail(
                pa_version_under8p0,
                raises=NotImplementedError,
                reason="is_leap_year not implemented for pyarrow < 8.0",
            ),
        ),
        ["microsecond", 5],
        ["month", 1],
        ["nanosecond", 6],
        ["quarter", 1],
        ["second", 7],
        ["date", date(2023, 1, 2)],
        ["time", time(3, 4, 7, 5)],
    ],
)
def test_dt_properties(prop, expected):
    ser = pd.Series(
        [
            pd.Timestamp(
                year=2023,
                month=1,
                day=2,
                hour=3,
                minute=4,
                second=7,
                microsecond=5,
                nanosecond=6,
            ),
            None,
        ],
        dtype=ArrowDtype(pa.timestamp("ns")),
    )
    result = getattr(ser.dt, prop)
    exp_type = None
    if isinstance(expected, date):
        exp_type = pa.date32()
    elif isinstance(expected, time):
        exp_type = pa.time64("ns")
    expected = pd.Series(ArrowExtensionArray(pa.array([expected, None], type=exp_type)))
    tm.assert_series_equal(result, expected)


@pytest.mark.parametrize("unit", ["us", "ns"])
def test_dt_time_preserve_unit(unit):
    ser = pd.Series(
        [datetime(year=2023, month=1, day=2, hour=3), None],
        dtype=ArrowDtype(pa.timestamp(unit)),
    )
    result = ser.dt.time
    expected = pd.Series(
        ArrowExtensionArray(pa.array([time(3, 0), None], type=pa.time64(unit)))
    )
    tm.assert_series_equal(result, expected)


@pytest.mark.parametrize("tz", [None, "UTC", "US/Pacific"])
def test_dt_tz(tz):
    ser = pd.Series(
        [datetime(year=2023, month=1, day=2, hour=3), None],
        dtype=ArrowDtype(pa.timestamp("ns", tz=tz)),
    )
    result = ser.dt.tz
    assert result == tz


def test_dt_isocalendar():
    ser = pd.Series(
        [datetime(year=2023, month=1, day=2, hour=3), None],
        dtype=ArrowDtype(pa.timestamp("ns")),
    )
    result = ser.dt.isocalendar()
    expected = pd.DataFrame(
        [[2023, 1, 1], [0, 0, 0]],
        columns=["year", "week", "day"],
        dtype="int64[pyarrow]",
    )
    tm.assert_frame_equal(result, expected)


def test_dt_strftime(request):
    if is_platform_windows() and is_ci_environment():
        request.node.add_marker(
            pytest.mark.xfail(
                raises=pa.ArrowInvalid,
                reason=(
                    "TODO: Set ARROW_TIMEZONE_DATABASE environment variable "
                    "on CI to path to the tzdata for pyarrow."
                ),
            )
        )
2494
+ ser = pd.Series(
2495
+ [datetime(year=2023, month=1, day=2, hour=3), None],
2496
+ dtype=ArrowDtype(pa.timestamp("ns")),
2497
+ )
2498
+ result = ser.dt.strftime("%Y-%m-%dT%H:%M:%S")
2499
+ expected = pd.Series(
2500
+ ["2023-01-02T03:00:00.000000000", None], dtype=ArrowDtype(pa.string())
2501
+ )
2502
+ tm.assert_series_equal(result, expected)
2503
+
2504
+
2505
+ @pytest.mark.parametrize("method", ["ceil", "floor", "round"])
2506
+ def test_dt_roundlike_tz_options_not_supported(method):
2507
+ ser = pd.Series(
2508
+ [datetime(year=2023, month=1, day=2, hour=3), None],
2509
+ dtype=ArrowDtype(pa.timestamp("ns")),
2510
+ )
2511
+ with pytest.raises(NotImplementedError, match="ambiguous is not supported."):
2512
+ getattr(ser.dt, method)("1H", ambiguous="NaT")
2513
+
2514
+ with pytest.raises(NotImplementedError, match="nonexistent is not supported."):
2515
+ getattr(ser.dt, method)("1H", nonexistent="NaT")
2516
+
2517
+
2518
+ @pytest.mark.parametrize("method", ["ceil", "floor", "round"])
2519
+ def test_dt_roundlike_unsupported_freq(method):
2520
+ ser = pd.Series(
2521
+ [datetime(year=2023, month=1, day=2, hour=3), None],
2522
+ dtype=ArrowDtype(pa.timestamp("ns")),
2523
+ )
2524
+ with pytest.raises(ValueError, match="freq='1B' is not supported"):
2525
+ getattr(ser.dt, method)("1B")
2526
+
2527
+ with pytest.raises(ValueError, match="Must specify a valid frequency: None"):
2528
+ getattr(ser.dt, method)(None)
2529
+
2530
+
2531
+ @pytest.mark.xfail(
2532
+ pa_version_under7p0, reason="Methods not supported for pyarrow < 7.0"
2533
+ )
2534
+ @pytest.mark.parametrize("freq", ["D", "H", "T", "S", "L", "U", "N"])
2535
+ @pytest.mark.parametrize("method", ["ceil", "floor", "round"])
2536
+ def test_dt_ceil_year_floor(freq, method):
2537
+ ser = pd.Series(
2538
+ [datetime(year=2023, month=1, day=1), None],
2539
+ )
2540
+ pa_dtype = ArrowDtype(pa.timestamp("ns"))
2541
+ expected = getattr(ser.dt, method)(f"1{freq}").astype(pa_dtype)
2542
+ result = getattr(ser.astype(pa_dtype).dt, method)(f"1{freq}")
2543
+ tm.assert_series_equal(result, expected)
2544
+
2545
+
2546
+ def test_dt_to_pydatetime():
2547
+ # GH 51859
2548
+ data = [datetime(2022, 1, 1), datetime(2023, 1, 1)]
2549
+ ser = pd.Series(data, dtype=ArrowDtype(pa.timestamp("ns")))
2550
+
2551
+ result = ser.dt.to_pydatetime()
2552
+ expected = np.array(data, dtype=object)
2553
+ tm.assert_numpy_array_equal(result, expected)
2554
+ assert all(type(res) is datetime for res in result)
2555
+
2556
+ expected = ser.astype("datetime64[ns]").dt.to_pydatetime()
2557
+ tm.assert_numpy_array_equal(result, expected)
2558
+
2559
+
2560
+ @pytest.mark.parametrize("date_type", [32, 64])
2561
+ def test_dt_to_pydatetime_date_error(date_type):
2562
+ # GH 52812
2563
+ ser = pd.Series(
2564
+ [date(2022, 12, 31)],
2565
+ dtype=ArrowDtype(getattr(pa, f"date{date_type}")()),
2566
+ )
2567
+ with pytest.raises(ValueError, match="to_pydatetime cannot be called with"):
2568
+ ser.dt.to_pydatetime()
2569
+
2570
+
2571
+ def test_dt_tz_localize_unsupported_tz_options():
2572
+ ser = pd.Series(
2573
+ [datetime(year=2023, month=1, day=2, hour=3), None],
2574
+ dtype=ArrowDtype(pa.timestamp("ns")),
2575
+ )
2576
+ with pytest.raises(NotImplementedError, match="ambiguous='NaT' is not supported"):
2577
+ ser.dt.tz_localize("UTC", ambiguous="NaT")
2578
+
2579
+ with pytest.raises(NotImplementedError, match="nonexistent='NaT' is not supported"):
2580
+ ser.dt.tz_localize("UTC", nonexistent="NaT")
2581
+
2582
+
2583
+ def test_dt_tz_localize_none():
2584
+ ser = pd.Series(
2585
+ [datetime(year=2023, month=1, day=2, hour=3), None],
2586
+ dtype=ArrowDtype(pa.timestamp("ns", tz="US/Pacific")),
2587
+ )
2588
+ result = ser.dt.tz_localize(None)
2589
+ expected = pd.Series(
2590
+ [datetime(year=2023, month=1, day=2, hour=3), None],
2591
+ dtype=ArrowDtype(pa.timestamp("ns")),
2592
+ )
2593
+ tm.assert_series_equal(result, expected)
2594
+
2595
+
2596
+ @pytest.mark.parametrize("unit", ["us", "ns"])
2597
+ def test_dt_tz_localize(unit, request):
2598
+ if is_platform_windows() and is_ci_environment():
2599
+ request.node.add_marker(
2600
+ pytest.mark.xfail(
2601
+ raises=pa.ArrowInvalid,
2602
+ reason=(
2603
+ "TODO: Set ARROW_TIMEZONE_DATABASE environment variable "
2604
+ "on CI to path to the tzdata for pyarrow."
2605
+ ),
2606
+ )
2607
+ )
2608
+ ser = pd.Series(
2609
+ [datetime(year=2023, month=1, day=2, hour=3), None],
2610
+ dtype=ArrowDtype(pa.timestamp(unit)),
2611
+ )
2612
+ result = ser.dt.tz_localize("US/Pacific")
2613
+ exp_data = pa.array(
2614
+ [datetime(year=2023, month=1, day=2, hour=3), None], type=pa.timestamp(unit)
2615
+ )
2616
+ exp_data = pa.compute.assume_timezone(exp_data, "US/Pacific")
2617
+ expected = pd.Series(ArrowExtensionArray(exp_data))
2618
+ tm.assert_series_equal(result, expected)
2619
+
2620
+
2621
+ @pytest.mark.parametrize(
2622
+ "nonexistent, exp_date",
2623
+ [
2624
+ ["shift_forward", datetime(year=2023, month=3, day=12, hour=3)],
2625
+ ["shift_backward", pd.Timestamp("2023-03-12 01:59:59.999999999")],
2626
+ ],
2627
+ )
2628
+ def test_dt_tz_localize_nonexistent(nonexistent, exp_date, request):
2629
+ if is_platform_windows() and is_ci_environment():
2630
+ request.node.add_marker(
2631
+ pytest.mark.xfail(
2632
+ raises=pa.ArrowInvalid,
2633
+ reason=(
2634
+ "TODO: Set ARROW_TIMEZONE_DATABASE environment variable "
2635
+ "on CI to path to the tzdata for pyarrow."
2636
+ ),
2637
+ )
2638
+ )
2639
+ ser = pd.Series(
2640
+ [datetime(year=2023, month=3, day=12, hour=2, minute=30), None],
2641
+ dtype=ArrowDtype(pa.timestamp("ns")),
2642
+ )
2643
+ result = ser.dt.tz_localize("US/Pacific", nonexistent=nonexistent)
2644
+ exp_data = pa.array([exp_date, None], type=pa.timestamp("ns"))
2645
+ exp_data = pa.compute.assume_timezone(exp_data, "US/Pacific")
2646
+ expected = pd.Series(ArrowExtensionArray(exp_data))
2647
+ tm.assert_series_equal(result, expected)
2648
+
2649
+
2650
+ @pytest.mark.parametrize("skipna", [True, False])
2651
+ def test_boolean_reduce_series_all_null(all_boolean_reductions, skipna):
2652
+ # GH51624
2653
+ ser = pd.Series([None], dtype="float64[pyarrow]")
2654
+ result = getattr(ser, all_boolean_reductions)(skipna=skipna)
2655
+ if skipna:
2656
+ expected = all_boolean_reductions == "all"
2657
+ else:
2658
+ expected = pd.NA
2659
+ assert result is expected
2660
+
2661
+
2662
+ @pytest.mark.parametrize("dtype", ["string", "string[pyarrow]"])
2663
+ def test_series_from_string_array(dtype):
2664
+ arr = pa.array("the quick brown fox".split())
2665
+ ser = pd.Series(arr, dtype=dtype)
2666
+ expected = pd.Series(ArrowExtensionArray(arr), dtype=dtype)
2667
+ tm.assert_series_equal(ser, expected)
2668
+
2669
+
2670
+ def test_setitem_boolean_replace_with_mask_segfault():
2671
+ # GH#52059
2672
+ N = 145_000
2673
+ arr = ArrowExtensionArray(pa.chunked_array([np.ones((N,), dtype=np.bool_)]))
2674
+ expected = arr.copy()
2675
+ arr[np.zeros((N,), dtype=np.bool_)] = False
2676
+ assert arr._data == expected._data
2677
+
2678
+
2679
+ @pytest.mark.parametrize(
2680
+ "data, arrow_dtype",
2681
+ [
2682
+ ([b"a", b"b"], pa.large_binary()),
2683
+ (["a", "b"], pa.large_string()),
2684
+ ],
2685
+ )
2686
+ def test_conversion_large_dtypes_from_numpy_array(data, arrow_dtype):
2687
+ dtype = ArrowDtype(arrow_dtype)
2688
+ result = pd.array(np.array(data), dtype=dtype)
2689
+ expected = pd.array(data, dtype=dtype)
2690
+ tm.assert_extension_array_equal(result, expected)
2691
+
2692
+
2693
+ @pytest.mark.parametrize("pa_type", tm.ALL_INT_PYARROW_DTYPES + tm.FLOAT_PYARROW_DTYPES)
2694
+ def test_describe_numeric_data(pa_type):
2695
+ # GH 52470
2696
+ data = pd.Series([1, 2, 3], dtype=ArrowDtype(pa_type))
2697
+ result = data.describe()
2698
+ expected = pd.Series(
2699
+ [3, 2, 1, 1, 1.5, 2.0, 2.5, 3],
2700
+ dtype=ArrowDtype(pa.float64()),
2701
+ index=["count", "mean", "std", "min", "25%", "50%", "75%", "max"],
2702
+ )
2703
+ tm.assert_series_equal(result, expected)
2704
+
2705
+
2706
+ @pytest.mark.parametrize("pa_type", tm.TIMEDELTA_PYARROW_DTYPES)
2707
+ def test_describe_timedelta_data(pa_type):
2708
+ # GH53001
2709
+ data = pd.Series(range(1, 10), dtype=ArrowDtype(pa_type))
2710
+ result = data.describe()
2711
+ expected = pd.Series(
2712
+ [9] + pd.to_timedelta([5, 2, 1, 3, 5, 7, 9], unit=pa_type.unit).tolist(),
2713
+ dtype=object,
2714
+ index=["count", "mean", "std", "min", "25%", "50%", "75%", "max"],
2715
+ )
2716
+ tm.assert_series_equal(result, expected)
2717
+
2718
+
2719
+ @pytest.mark.parametrize("pa_type", tm.DATETIME_PYARROW_DTYPES)
2720
+ def test_describe_datetime_data(pa_type):
2721
+ # GH53001
2722
+ data = pd.Series(range(1, 10), dtype=ArrowDtype(pa_type))
2723
+ result = data.describe()
2724
+ expected = pd.Series(
2725
+ [9]
2726
+ + [
2727
+ pd.Timestamp(v, tz=pa_type.tz, unit=pa_type.unit)
2728
+ for v in [5, 1, 3, 5, 7, 9]
2729
+ ],
2730
+ dtype=object,
2731
+ index=["count", "mean", "min", "25%", "50%", "75%", "max"],
2732
+ )
2733
+ tm.assert_series_equal(result, expected)
2734
+
2735
+
2736
+ @pytest.mark.xfail(
2737
+ pa_version_under8p0,
2738
+ reason="Function 'add_checked' has no kernel matching input types",
2739
+ raises=pa.ArrowNotImplementedError,
2740
+ )
2741
+ def test_duration_overflow_from_ndarray_containing_nat():
2742
+ # GH52843
2743
+ data_ts = pd.to_datetime([1, None])
2744
+ data_td = pd.to_timedelta([1, None])
2745
+ ser_ts = pd.Series(data_ts, dtype=ArrowDtype(pa.timestamp("ns")))
2746
+ ser_td = pd.Series(data_td, dtype=ArrowDtype(pa.duration("ns")))
2747
+ result = ser_ts + ser_td
2748
+ expected = pd.Series([2, None], dtype=ArrowDtype(pa.timestamp("ns")))
2749
+ tm.assert_series_equal(result, expected)
videochat2/lib/python3.10/site-packages/pandas/tests/extension/test_boolean.py ADDED
@@ -0,0 +1,401 @@
+ """
+ This file contains a minimal set of tests for compliance with the extension
+ array interface test suite, and should contain no other tests.
+ The test suite for the full functionality of the array is located in
+ `pandas/tests/arrays/`.
+
+ The tests in this file are inherited from the BaseExtensionTests, and only
+ minimal tweaks should be applied to get the tests passing (by overwriting a
+ parent method).
+
+ Additional tests should either be added to one of the BaseExtensionTests
+ classes (if they are relevant for the extension interface for all dtypes), or
+ be added to the array-specific tests in `pandas/tests/arrays/`.
+
+ """
+ import numpy as np
+ import pytest
+
+ from pandas.core.dtypes.common import is_bool_dtype
+
+ import pandas as pd
+ import pandas._testing as tm
+ from pandas.core.arrays.boolean import BooleanDtype
+ from pandas.tests.extension import base
+
+
+ def make_data():
+     return [True, False] * 4 + [np.nan] + [True, False] * 44 + [np.nan] + [True, False]
+
+
+ @pytest.fixture
+ def dtype():
+     return BooleanDtype()
+
+
+ @pytest.fixture
+ def data(dtype):
+     return pd.array(make_data(), dtype=dtype)
+
+
+ @pytest.fixture
+ def data_for_twos(dtype):
+     return pd.array(np.ones(100), dtype=dtype)
+
+
+ @pytest.fixture
+ def data_missing(dtype):
+     return pd.array([np.nan, True], dtype=dtype)
+
+
+ @pytest.fixture
+ def data_for_sorting(dtype):
+     return pd.array([True, True, False], dtype=dtype)
+
+
+ @pytest.fixture
+ def data_missing_for_sorting(dtype):
+     return pd.array([True, np.nan, False], dtype=dtype)
+
+
+ @pytest.fixture
+ def na_cmp():
+     # we are pd.NA
+     return lambda x, y: x is pd.NA and y is pd.NA
+
+
+ @pytest.fixture
+ def na_value():
+     return pd.NA
+
+
+ @pytest.fixture
+ def data_for_grouping(dtype):
+     b = True
+     a = False
+     na = np.nan
+     return pd.array([b, b, na, na, a, a, b], dtype=dtype)
+
+
+ class TestDtype(base.BaseDtypeTests):
+     pass
+
+
+ class TestInterface(base.BaseInterfaceTests):
+     pass
+
+
+ class TestConstructors(base.BaseConstructorsTests):
+     pass
+
+
+ class TestGetitem(base.BaseGetitemTests):
+     pass
+
+
+ class TestSetitem(base.BaseSetitemTests):
+     pass
+
+
+ class TestIndex(base.BaseIndexTests):
+     pass
+
+
+ class TestMissing(base.BaseMissingTests):
+     pass
+
+
+ class TestArithmeticOps(base.BaseArithmeticOpsTests):
+     implements = {"__sub__", "__rsub__"}
+
+     def check_opname(self, s, op_name, other, exc=None):
+         # overwriting to indicate ops don't raise an error
+         exc = None
+         if op_name.strip("_").lstrip("r") in ["pow", "truediv", "floordiv"]:
+             # match behavior with non-masked bool dtype
+             exc = NotImplementedError
+         super().check_opname(s, op_name, other, exc=exc)
+
+     def _check_op(self, obj, op, other, op_name, exc=NotImplementedError):
+         if exc is None:
+             if op_name in self.implements:
+                 msg = r"numpy boolean subtract"
+                 with pytest.raises(TypeError, match=msg):
+                     op(obj, other)
+                 return
+
+             result = op(obj, other)
+             expected = self._combine(obj, other, op)
+
+             if op_name in (
+                 "__floordiv__",
+                 "__rfloordiv__",
+                 "__pow__",
+                 "__rpow__",
+                 "__mod__",
+                 "__rmod__",
+             ):
+                 # combine keeps boolean type
+                 expected = expected.astype("Int8")
+             elif op_name in ("__truediv__", "__rtruediv__"):
+                 # combine with bools does not generate the correct result
+                 # (numpy behaviour for div is to regard the bools as numeric)
+                 expected = self._combine(obj.astype(float), other, op)
+                 expected = expected.astype("Float64")
+             if op_name == "__rpow__":
+                 # for rpow, combine does not propagate NaN
+                 expected[result.isna()] = np.nan
+             self.assert_equal(result, expected)
+         else:
+             with pytest.raises(exc):
+                 op(obj, other)
+
+     @pytest.mark.xfail(
+         reason="Inconsistency between floordiv and divmod; we raise for floordiv "
+         "but not for divmod. This matches what we do for non-masked bool dtype."
+     )
+     def test_divmod_series_array(self, data, data_for_twos):
+         super().test_divmod_series_array(data, data_for_twos)
+
+     @pytest.mark.xfail(
+         reason="Inconsistency between floordiv and divmod; we raise for floordiv "
+         "but not for divmod. This matches what we do for non-masked bool dtype."
+     )
+     def test_divmod(self, data):
+         super().test_divmod(data)
+
+
+ class TestComparisonOps(base.BaseComparisonOpsTests):
+     def check_opname(self, s, op_name, other, exc=None):
+         # overwriting to indicate ops don't raise an error
+         super().check_opname(s, op_name, other, exc=None)
+
+
+ class TestReshaping(base.BaseReshapingTests):
+     pass
+
+
+ class TestMethods(base.BaseMethodsTests):
+     _combine_le_expected_dtype = "boolean"
+
+     def test_factorize(self, data_for_grouping):
+         # override because we only have 2 unique values
+         labels, uniques = pd.factorize(data_for_grouping, use_na_sentinel=True)
+         expected_labels = np.array([0, 0, -1, -1, 1, 1, 0], dtype=np.intp)
+         expected_uniques = data_for_grouping.take([0, 4])
+
+         tm.assert_numpy_array_equal(labels, expected_labels)
+         self.assert_extension_array_equal(uniques, expected_uniques)
+
+     def test_searchsorted(self, data_for_sorting, as_series):
+         # override because we only have 2 unique values
+         data_for_sorting = pd.array([True, False], dtype="boolean")
+         b, a = data_for_sorting
+         arr = type(data_for_sorting)._from_sequence([a, b])
+
+         if as_series:
+             arr = pd.Series(arr)
+         assert arr.searchsorted(a) == 0
+         assert arr.searchsorted(a, side="right") == 1
+
+         assert arr.searchsorted(b) == 1
+         assert arr.searchsorted(b, side="right") == 2
+
+         result = arr.searchsorted(arr.take([0, 1]))
+         expected = np.array([0, 1], dtype=np.intp)
+
+         tm.assert_numpy_array_equal(result, expected)
+
+         # sorter
+         sorter = np.array([1, 0])
+         assert data_for_sorting.searchsorted(a, sorter=sorter) == 0
+
+     def test_argmin_argmax(self, data_for_sorting, data_missing_for_sorting):
+         # override because there are only 2 unique values
+
+         # data_for_sorting -> [B, C, A] with A < B < C -> here True, True, False
+         assert data_for_sorting.argmax() == 0
+         assert data_for_sorting.argmin() == 2
+
+         # with repeated values -> first occurrence
+         data = data_for_sorting.take([2, 0, 0, 1, 1, 2])
+         assert data.argmax() == 1
+         assert data.argmin() == 0
+
+         # with missing values
+         # data_missing_for_sorting -> [B, NA, A] with A < B and NA missing.
+         assert data_missing_for_sorting.argmax() == 0
+         assert data_missing_for_sorting.argmin() == 2
+
+
+ class TestCasting(base.BaseCastingTests):
+     pass
+
+
+ class TestGroupby(base.BaseGroupbyTests):
+     """
+     Groupby-specific tests are overridden because boolean only has 2
+     unique values, base tests uses 3 groups.
+     """
+
+     def test_grouping_grouper(self, data_for_grouping):
+         df = pd.DataFrame(
+             {"A": ["B", "B", None, None, "A", "A", "B"], "B": data_for_grouping}
+         )
+         gr1 = df.groupby("A").grouper.groupings[0]
+         gr2 = df.groupby("B").grouper.groupings[0]
+
+         tm.assert_numpy_array_equal(gr1.grouping_vector, df.A.values)
+         tm.assert_extension_array_equal(gr2.grouping_vector, data_for_grouping)
+
+     @pytest.mark.parametrize("as_index", [True, False])
+     def test_groupby_extension_agg(self, as_index, data_for_grouping):
+         df = pd.DataFrame({"A": [1, 1, 2, 2, 3, 3, 1], "B": data_for_grouping})
+         result = df.groupby("B", as_index=as_index).A.mean()
+         _, uniques = pd.factorize(data_for_grouping, sort=True)
+
+         if as_index:
+             index = pd.Index(uniques, name="B")
+             expected = pd.Series([3.0, 1.0], index=index, name="A")
+             self.assert_series_equal(result, expected)
+         else:
+             expected = pd.DataFrame({"B": uniques, "A": [3.0, 1.0]})
+             self.assert_frame_equal(result, expected)
+
+     def test_groupby_agg_extension(self, data_for_grouping):
+         # GH#38980 groupby agg on extension type fails for non-numeric types
+         df = pd.DataFrame({"A": [1, 1, 2, 2, 3, 3, 1], "B": data_for_grouping})
+
+         expected = df.iloc[[0, 2, 4]]
+         expected = expected.set_index("A")
+
+         result = df.groupby("A").agg({"B": "first"})
+         self.assert_frame_equal(result, expected)
+
+         result = df.groupby("A").agg("first")
+         self.assert_frame_equal(result, expected)
+
+         result = df.groupby("A").first()
+         self.assert_frame_equal(result, expected)
+
+     def test_groupby_extension_no_sort(self, data_for_grouping):
+         df = pd.DataFrame({"A": [1, 1, 2, 2, 3, 3, 1], "B": data_for_grouping})
+         result = df.groupby("B", sort=False).A.mean()
+         _, index = pd.factorize(data_for_grouping, sort=False)
+
+         index = pd.Index(index, name="B")
+         expected = pd.Series([1.0, 3.0], index=index, name="A")
+         self.assert_series_equal(result, expected)
+
+     def test_groupby_extension_transform(self, data_for_grouping):
+         valid = data_for_grouping[~data_for_grouping.isna()]
+         df = pd.DataFrame({"A": [1, 1, 3, 3, 1], "B": valid})
+
+         result = df.groupby("B").A.transform(len)
+         expected = pd.Series([3, 3, 2, 2, 3], name="A")
+
+         self.assert_series_equal(result, expected)
+
+     def test_groupby_extension_apply(self, data_for_grouping, groupby_apply_op):
+         df = pd.DataFrame({"A": [1, 1, 2, 2, 3, 3, 1], "B": data_for_grouping})
+         df.groupby("B", group_keys=False).apply(groupby_apply_op)
+         df.groupby("B", group_keys=False).A.apply(groupby_apply_op)
+         df.groupby("A", group_keys=False).apply(groupby_apply_op)
+         df.groupby("A", group_keys=False).B.apply(groupby_apply_op)
+
+     def test_groupby_apply_identity(self, data_for_grouping):
+         df = pd.DataFrame({"A": [1, 1, 2, 2, 3, 3, 1], "B": data_for_grouping})
+         result = df.groupby("A").B.apply(lambda x: x.array)
+         expected = pd.Series(
+             [
+                 df.B.iloc[[0, 1, 6]].array,
+                 df.B.iloc[[2, 3]].array,
+                 df.B.iloc[[4, 5]].array,
+             ],
+             index=pd.Index([1, 2, 3], name="A"),
+             name="B",
+         )
+         self.assert_series_equal(result, expected)
+
+     def test_in_numeric_groupby(self, data_for_grouping):
+         df = pd.DataFrame(
+             {
+                 "A": [1, 1, 2, 2, 3, 3, 1],
+                 "B": data_for_grouping,
+                 "C": [1, 1, 1, 1, 1, 1, 1],
+             }
+         )
+         result = df.groupby("A").sum().columns
+
+         if data_for_grouping.dtype._is_numeric:
+             expected = pd.Index(["B", "C"])
+         else:
+             expected = pd.Index(["C"])
+
+         tm.assert_index_equal(result, expected)
+
+     @pytest.mark.parametrize("min_count", [0, 10])
+     def test_groupby_sum_mincount(self, data_for_grouping, min_count):
+         df = pd.DataFrame({"A": [1, 1, 2, 2, 3, 3, 1], "B": data_for_grouping})
+         result = df.groupby("A").sum(min_count=min_count)
+         if min_count == 0:
+             expected = pd.DataFrame(
+                 {"B": pd.array([3, 0, 0], dtype="Int64")},
+                 index=pd.Index([1, 2, 3], name="A"),
+             )
+             tm.assert_frame_equal(result, expected)
+         else:
+             expected = pd.DataFrame(
+                 {"B": pd.array([pd.NA] * 3, dtype="Int64")},
+                 index=pd.Index([1, 2, 3], name="A"),
+             )
+             tm.assert_frame_equal(result, expected)
+
+
+ class TestNumericReduce(base.BaseNumericReduceTests):
+     def check_reduce(self, s, op_name, skipna):
+         if op_name == "count":
+             result = getattr(s, op_name)()
+             expected = getattr(s.astype("float64"), op_name)()
+         else:
+             result = getattr(s, op_name)(skipna=skipna)
+             expected = getattr(s.astype("float64"), op_name)(skipna=skipna)
+             # override parent function to cast to bool for min/max
+             if np.isnan(expected):
+                 expected = pd.NA
+             elif op_name in ("min", "max"):
+                 expected = bool(expected)
+         tm.assert_almost_equal(result, expected)
+
+
+ class TestBooleanReduce(base.BaseBooleanReduceTests):
+     pass
+
+
+ class TestPrinting(base.BasePrintingTests):
+     pass
+
+
+ class TestUnaryOps(base.BaseUnaryOpsTests):
+     pass
+
+
+ class TestAccumulation(base.BaseAccumulateTests):
+     def check_accumulate(self, s, op_name, skipna):
+         result = getattr(s, op_name)(skipna=skipna)
+         expected = getattr(pd.Series(s.astype("float64")), op_name)(skipna=skipna)
+         tm.assert_series_equal(result, expected, check_dtype=False)
+         if op_name in ("cummin", "cummax"):
+             assert is_bool_dtype(result)
+
+     @pytest.mark.parametrize("skipna", [True, False])
+     def test_accumulate_series_raises(self, data, all_numeric_accumulations, skipna):
+         pass
+
+
+ class TestParsing(base.BaseParsingTests):
+     pass
+
+
+ class Test2DCompat(base.Dim2CompatTests):
+     pass
videochat2/lib/python3.10/site-packages/pandas/tests/extension/test_categorical.py ADDED
@@ -0,0 +1,316 @@
+ """
+ This file contains a minimal set of tests for compliance with the extension
+ array interface test suite, and should contain no other tests.
+ The test suite for the full functionality of the array is located in
+ `pandas/tests/arrays/`.
+
+ The tests in this file are inherited from the BaseExtensionTests, and only
+ minimal tweaks should be applied to get the tests passing (by overwriting a
+ parent method).
+
+ Additional tests should either be added to one of the BaseExtensionTests
+ classes (if they are relevant for the extension interface for all dtypes), or
+ be added to the array-specific tests in `pandas/tests/arrays/`.
+
+ """
+ import string
+
+ import numpy as np
+ import pytest
+
+ import pandas as pd
+ from pandas import (
+     Categorical,
+     CategoricalIndex,
+     Timestamp,
+ )
+ import pandas._testing as tm
+ from pandas.api.types import CategoricalDtype
+ from pandas.tests.extension import base
+
+
+ def make_data():
+     while True:
+         values = np.random.choice(list(string.ascii_letters), size=100)
+         # ensure we meet the requirements
+         # 1. first two not null
+         # 2. first and second are different
+         if values[0] != values[1]:
+             break
+     return values
+
+
+ @pytest.fixture
+ def dtype():
+     return CategoricalDtype()
+
+
+ @pytest.fixture
+ def data():
+     """Length-100 array for this type.
+
+     * data[0] and data[1] should both be non missing
+     * data[0] and data[1] should not be equal
+     """
+     return Categorical(make_data())
+
+
+ @pytest.fixture
+ def data_missing():
+     """Length 2 array with [NA, Valid]"""
+     return Categorical([np.nan, "A"])
+
+
+ @pytest.fixture
+ def data_for_sorting():
+     return Categorical(["A", "B", "C"], categories=["C", "A", "B"], ordered=True)
+
+
+ @pytest.fixture
+ def data_missing_for_sorting():
+     return Categorical(["A", None, "B"], categories=["B", "A"], ordered=True)
+
+
+ @pytest.fixture
+ def na_value():
+     return np.nan
+
+
+ @pytest.fixture
+ def data_for_grouping():
+     return Categorical(["a", "a", None, None, "b", "b", "a", "c"])
+
+
+ class TestDtype(base.BaseDtypeTests):
+     pass
+
+
+ class TestInterface(base.BaseInterfaceTests):
+     @pytest.mark.xfail(reason="Memory usage doesn't match")
+     def test_memory_usage(self, data):
+         # Is this deliberate?
+         super().test_memory_usage(data)
+
+     def test_contains(self, data, data_missing):
+         # GH-37867
+         # na value handling in Categorical.__contains__ is deprecated.
+         # See base.BaseInterFaceTests.test_contains for more details.
+
+         na_value = data.dtype.na_value
+         # ensure data without missing values
+         data = data[~data.isna()]
+
+         # first elements are non-missing
+         assert data[0] in data
+         assert data_missing[0] in data_missing
+
+         # check the presence of na_value
+         assert na_value in data_missing
+         assert na_value not in data
+
+         # Categoricals can contain other nan-likes than na_value
+         for na_value_obj in tm.NULL_OBJECTS:
+             if na_value_obj is na_value:
+                 continue
+             assert na_value_obj not in data
+             assert na_value_obj in data_missing  # this line differs from super method
+
+
+ class TestConstructors(base.BaseConstructorsTests):
+     def test_empty(self, dtype):
+         cls = dtype.construct_array_type()
+         result = cls._empty((4,), dtype=dtype)
+
+         assert isinstance(result, cls)
+         # the dtype we passed is not initialized, so will not match the
+         # dtype on our result.
+         assert result.dtype == CategoricalDtype([])
+
+
+ class TestReshaping(base.BaseReshapingTests):
+     pass
+
+
+ class TestGetitem(base.BaseGetitemTests):
+     @pytest.mark.skip(reason="Backwards compatibility")
+     def test_getitem_scalar(self, data):
+         # CategoricalDtype.type isn't "correct" since it should
+         # be a parent of the elements (object). But don't want
+         # to break things by changing.
+         super().test_getitem_scalar(data)
+
+
+ class TestSetitem(base.BaseSetitemTests):
+     pass
+
+
+ class TestIndex(base.BaseIndexTests):
+     pass
+
+
+ class TestMissing(base.BaseMissingTests):
+     pass
+
+
+ class TestReduce(base.BaseNoReduceTests):
+     pass
+
+
+ class TestAccumulate(base.BaseAccumulateTests):
+     @pytest.mark.parametrize("skipna", [True, False])
+     def test_accumulate_series(self, data, all_numeric_accumulations, skipna):
+         pass
+
+
+ class TestMethods(base.BaseMethodsTests):
+     @pytest.mark.xfail(reason="Unobserved categories included")
+     def test_value_counts(self, all_data, dropna):
+         return super().test_value_counts(all_data, dropna)
+
+     def test_combine_add(self, data_repeated):
+         # GH 20825
+         # When adding categoricals in combine, result is a string
+         orig_data1, orig_data2 = data_repeated(2)
+         s1 = pd.Series(orig_data1)
+         s2 = pd.Series(orig_data2)
+         result = s1.combine(s2, lambda x1, x2: x1 + x2)
+         expected = pd.Series(
+             [a + b for (a, b) in zip(list(orig_data1), list(orig_data2))]
+         )
+         self.assert_series_equal(result, expected)
+
+         val = s1.iloc[0]
+         result = s1.combine(val, lambda x1, x2: x1 + x2)
+         expected = pd.Series([a + val for a in list(orig_data1)])
+         self.assert_series_equal(result, expected)
+
+
+ class TestCasting(base.BaseCastingTests):
+     @pytest.mark.parametrize("cls", [Categorical, CategoricalIndex])
+     @pytest.mark.parametrize("values", [[1, np.nan], [Timestamp("2000"), pd.NaT]])
+     def test_cast_nan_to_int(self, cls, values):
+         # GH 28406
+         s = cls(values)
+
+         msg = "Cannot (cast|convert)"
+         with pytest.raises((ValueError, TypeError), match=msg):
+             s.astype(int)
+
+     @pytest.mark.parametrize(
+         "expected",
+         [
+             pd.Series(["2019", "2020"], dtype="datetime64[ns, UTC]"),
+             pd.Series([0, 0], dtype="timedelta64[ns]"),
+             pd.Series([pd.Period("2019"), pd.Period("2020")], dtype="period[A-DEC]"),
+             pd.Series([pd.Interval(0, 1), pd.Interval(1, 2)], dtype="interval"),
+             pd.Series([1, np.nan], dtype="Int64"),
+         ],
+     )
+     def test_cast_category_to_extension_dtype(self, expected):
+         # GH 28668
+         result = expected.astype("category").astype(expected.dtype)
+
+         tm.assert_series_equal(result, expected)
+
+     @pytest.mark.parametrize(
+         "dtype, expected",
+         [
+             (
+                 "datetime64[ns]",
+                 np.array(["2015-01-01T00:00:00.000000000"], dtype="datetime64[ns]"),
+             ),
+             (
+                 "datetime64[ns, MET]",
+                 pd.DatetimeIndex(
+                     [Timestamp("2015-01-01 00:00:00+0100", tz="MET")]
+                 ).array,
+             ),
+         ],
+     )
+     def test_consistent_casting(self, dtype, expected):
+         # GH 28448
+         result = Categorical(["2015-01-01"]).astype(dtype)
+         assert result == expected
+
+
+ class TestArithmeticOps(base.BaseArithmeticOpsTests):
+     def test_arith_frame_with_scalar(self, data, all_arithmetic_operators, request):
+         # frame & scalar
+         op_name = all_arithmetic_operators
240
+ if op_name == "__rmod__":
241
+ request.node.add_marker(
242
+ pytest.mark.xfail(
243
+ reason="rmod never called when string is first argument"
244
+ )
245
+ )
246
+ super().test_arith_frame_with_scalar(data, op_name)
247
+
248
+ def test_arith_series_with_scalar(self, data, all_arithmetic_operators, request):
249
+ op_name = all_arithmetic_operators
250
+ if op_name == "__rmod__":
251
+ request.node.add_marker(
252
+ pytest.mark.xfail(
253
+ reason="rmod never called when string is first argument"
254
+ )
255
+ )
256
+ super().test_arith_series_with_scalar(data, op_name)
257
+
258
+ def test_add_series_with_extension_array(self, data):
259
+ ser = pd.Series(data)
260
+ with pytest.raises(TypeError, match="cannot perform|unsupported operand"):
261
+ ser + data
262
+
263
+ def test_divmod_series_array(self):
264
+ # GH 23287
265
+ # skipping because it is not implemented
266
+ pass
267
+
268
+ def _check_divmod_op(self, s, op, other, exc=NotImplementedError):
269
+ return super()._check_divmod_op(s, op, other, exc=TypeError)
270
+
271
+
272
+ class TestComparisonOps(base.BaseComparisonOpsTests):
273
+ def _compare_other(self, s, data, op, other):
274
+ op_name = f"__{op.__name__}__"
275
+ if op_name == "__eq__":
276
+ result = op(s, other)
277
+ expected = s.combine(other, lambda x, y: x == y)
278
+ assert (result == expected).all()
279
+
280
+ elif op_name == "__ne__":
281
+ result = op(s, other)
282
+ expected = s.combine(other, lambda x, y: x != y)
283
+ assert (result == expected).all()
284
+
285
+ else:
286
+ msg = "Unordered Categoricals can only compare equality or not"
287
+ with pytest.raises(TypeError, match=msg):
288
+ op(data, other)
289
+
290
+ @pytest.mark.parametrize(
291
+ "categories",
292
+ [["a", "b"], [0, 1], [Timestamp("2019"), Timestamp("2020")]],
293
+ )
294
+ def test_not_equal_with_na(self, categories):
295
+ # https://github.com/pandas-dev/pandas/issues/32276
296
+ c1 = Categorical.from_codes([-1, 0], categories=categories)
297
+ c2 = Categorical.from_codes([0, 1], categories=categories)
298
+
299
+ result = c1 != c2
300
+
301
+ assert result.all()
302
+
303
+
304
+ class TestParsing(base.BaseParsingTests):
305
+ pass
306
+
307
+
308
+ class Test2DCompat(base.NDArrayBacked2DTests):
309
+ def test_repr_2d(self, data):
310
+ # Categorical __repr__ doesn't include "Categorical", so we need
311
+ # to special-case
312
+ res = repr(data.reshape(1, -1))
313
+ assert res.count("\nCategories") == 1
314
+
315
+ res = repr(data.reshape(-1, 1))
316
+ assert res.count("\nCategories") == 1
videochat2/lib/python3.10/site-packages/pandas/tests/extension/test_common.py ADDED
@@ -0,0 +1,80 @@
+import numpy as np
+import pytest
+
+from pandas.core.dtypes import dtypes
+from pandas.core.dtypes.common import is_extension_array_dtype
+
+import pandas as pd
+import pandas._testing as tm
+from pandas.core.arrays import ExtensionArray
+
+
+class DummyDtype(dtypes.ExtensionDtype):
+    pass
+
+
+class DummyArray(ExtensionArray):
+    def __init__(self, data) -> None:
+        self.data = data
+
+    def __array__(self, dtype):
+        return self.data
+
+    @property
+    def dtype(self):
+        return DummyDtype()
+
+    def astype(self, dtype, copy=True):
+        # we don't support anything but a single dtype
+        if isinstance(dtype, DummyDtype):
+            if copy:
+                return type(self)(self.data)
+            return self
+
+        return np.array(self, dtype=dtype, copy=copy)
+
+
+class TestExtensionArrayDtype:
+    @pytest.mark.parametrize(
+        "values",
+        [
+            pd.Categorical([]),
+            pd.Categorical([]).dtype,
+            pd.Series(pd.Categorical([])),
+            DummyDtype(),
+            DummyArray(np.array([1, 2])),
+        ],
+    )
+    def test_is_extension_array_dtype(self, values):
+        assert is_extension_array_dtype(values)
+
+    @pytest.mark.parametrize("values", [np.array([]), pd.Series(np.array([]))])
+    def test_is_not_extension_array_dtype(self, values):
+        assert not is_extension_array_dtype(values)
+
+
+def test_astype():
+    arr = DummyArray(np.array([1, 2, 3]))
+    expected = np.array([1, 2, 3], dtype=object)
+
+    result = arr.astype(object)
+    tm.assert_numpy_array_equal(result, expected)
+
+    result = arr.astype("object")
+    tm.assert_numpy_array_equal(result, expected)
+
+
+def test_astype_no_copy():
+    arr = DummyArray(np.array([1, 2, 3], dtype=np.int64))
+    result = arr.astype(arr.dtype, copy=False)
+
+    assert arr is result
+
+    result = arr.astype(arr.dtype)
+    assert arr is not result
+
+
+@pytest.mark.parametrize("dtype", [dtypes.CategoricalDtype(), dtypes.IntervalDtype()])
+def test_is_extension_array_dtype(dtype):
+    assert isinstance(dtype, dtypes.ExtensionDtype)
+    assert is_extension_array_dtype(dtype)
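The `DummyDtype`/`DummyArray` pair in the file above exercises the minimal surface `is_extension_array_dtype` inspects: it only needs a `.dtype` attribute that is an `ExtensionDtype` instance. A standalone sketch of the same pattern using the public extension API (`DemoDtype`/`DemoArray` are illustrative names, not pandas classes):

```python
import numpy as np
from pandas.api.extensions import ExtensionArray, ExtensionDtype
from pandas.api.types import is_extension_array_dtype


class DemoDtype(ExtensionDtype):
    # minimal metadata pandas needs for an extension dtype
    name = "demo"
    type = object

    @classmethod
    def construct_array_type(cls):
        return DemoArray


class DemoArray(ExtensionArray):
    def __init__(self, data):
        self._data = np.asarray(data, dtype=object)

    @property
    def dtype(self):
        # is_extension_array_dtype dispatches on this attribute
        return DemoDtype()

    def __len__(self):
        return len(self._data)


arr = DemoArray([1, 2, 3])
print(is_extension_array_dtype(arr))            # True: arr.dtype is an ExtensionDtype
print(is_extension_array_dtype(np.array([1])))  # False: plain ndarray
```

Note the sketch implements only the pieces the dtype check touches; a real `ExtensionArray` subclass must implement the rest of the interface (`__getitem__`, `isna`, `take`, ...).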
videochat2/lib/python3.10/site-packages/pandas/tests/extension/test_datetime.py ADDED
@@ -0,0 +1,194 @@
+"""
+This file contains a minimal set of tests for compliance with the extension
+array interface test suite, and should contain no other tests.
+The test suite for the full functionality of the array is located in
+`pandas/tests/arrays/`.
+
+The tests in this file are inherited from the BaseExtensionTests, and only
+minimal tweaks should be applied to get the tests passing (by overwriting a
+parent method).
+
+Additional tests should either be added to one of the BaseExtensionTests
+classes (if they are relevant for the extension interface for all dtypes), or
+be added to the array-specific tests in `pandas/tests/arrays/`.
+
+"""
+import numpy as np
+import pytest
+
+from pandas.core.dtypes.dtypes import DatetimeTZDtype
+
+import pandas as pd
+from pandas.core.arrays import DatetimeArray
+from pandas.tests.extension import base
+
+
+@pytest.fixture(params=["US/Central"])
+def dtype(request):
+    return DatetimeTZDtype(unit="ns", tz=request.param)
+
+
+@pytest.fixture
+def data(dtype):
+    data = DatetimeArray(pd.date_range("2000", periods=100, tz=dtype.tz), dtype=dtype)
+    return data
+
+
+@pytest.fixture
+def data_missing(dtype):
+    return DatetimeArray(
+        np.array(["NaT", "2000-01-01"], dtype="datetime64[ns]"), dtype=dtype
+    )
+
+
+@pytest.fixture
+def data_for_sorting(dtype):
+    a = pd.Timestamp("2000-01-01")
+    b = pd.Timestamp("2000-01-02")
+    c = pd.Timestamp("2000-01-03")
+    return DatetimeArray(np.array([b, c, a], dtype="datetime64[ns]"), dtype=dtype)
+
+
+@pytest.fixture
+def data_missing_for_sorting(dtype):
+    a = pd.Timestamp("2000-01-01")
+    b = pd.Timestamp("2000-01-02")
+    return DatetimeArray(np.array([b, "NaT", a], dtype="datetime64[ns]"), dtype=dtype)
+
+
+@pytest.fixture
+def data_for_grouping(dtype):
+    """
+    Expected to be like [B, B, NA, NA, A, A, B, C]
+
+    Where A < B < C and NA is missing
+    """
+    a = pd.Timestamp("2000-01-01")
+    b = pd.Timestamp("2000-01-02")
+    c = pd.Timestamp("2000-01-03")
+    na = "NaT"
+    return DatetimeArray(
+        np.array([b, b, na, na, a, a, b, c], dtype="datetime64[ns]"), dtype=dtype
+    )
+
+
+@pytest.fixture
+def na_cmp():
+    def cmp(a, b):
+        return a is pd.NaT and a is b
+
+    return cmp
+
+
+@pytest.fixture
+def na_value():
+    return pd.NaT
+
+
+# ----------------------------------------------------------------------------
+class BaseDatetimeTests:
+    pass
+
+
+# ----------------------------------------------------------------------------
+# Tests
+class TestDatetimeDtype(BaseDatetimeTests, base.BaseDtypeTests):
+    pass
+
+
+class TestConstructors(BaseDatetimeTests, base.BaseConstructorsTests):
+    def test_series_constructor(self, data):
+        # Series construction drops any .freq attr
+        data = data._with_freq(None)
+        super().test_series_constructor(data)
+
+
+class TestGetitem(BaseDatetimeTests, base.BaseGetitemTests):
+    pass
+
+
+class TestIndex(base.BaseIndexTests):
+    pass
+
+
+class TestMethods(BaseDatetimeTests, base.BaseMethodsTests):
+    def test_combine_add(self, data_repeated):
+        # Timestamp.__add__(Timestamp) not defined
+        pass
+
+
+class TestInterface(BaseDatetimeTests, base.BaseInterfaceTests):
+    pass
+
+
+class TestArithmeticOps(BaseDatetimeTests, base.BaseArithmeticOpsTests):
+    implements = {"__sub__", "__rsub__"}
+
+    def test_arith_frame_with_scalar(self, data, all_arithmetic_operators):
+        # frame & scalar
+        if all_arithmetic_operators in self.implements:
+            df = pd.DataFrame({"A": data})
+            self.check_opname(df, all_arithmetic_operators, data[0], exc=None)
+        else:
+            # ... but not the rest.
+            super().test_arith_frame_with_scalar(data, all_arithmetic_operators)
+
+    def test_arith_series_with_scalar(self, data, all_arithmetic_operators):
+        if all_arithmetic_operators in self.implements:
+            ser = pd.Series(data)
+            self.check_opname(ser, all_arithmetic_operators, ser.iloc[0], exc=None)
+        else:
+            # ... but not the rest.
+            super().test_arith_series_with_scalar(data, all_arithmetic_operators)
+
+    def test_add_series_with_extension_array(self, data):
+        # Datetime + Datetime not implemented
+        ser = pd.Series(data)
+        msg = "cannot add DatetimeArray and DatetimeArray"
+        with pytest.raises(TypeError, match=msg):
+            ser + data
+
+    def test_arith_series_with_array(self, data, all_arithmetic_operators):
+        if all_arithmetic_operators in self.implements:
+            ser = pd.Series(data)
+            self.check_opname(ser, all_arithmetic_operators, ser.iloc[0], exc=None)
+        else:
+            # ... but not the rest.
+            super().test_arith_series_with_scalar(data, all_arithmetic_operators)
+
+    def test_divmod_series_array(self):
+        # GH 23287
+        # skipping because it is not implemented
+        pass
+
+
+class TestCasting(BaseDatetimeTests, base.BaseCastingTests):
+    pass
+
+
+class TestComparisonOps(BaseDatetimeTests, base.BaseComparisonOpsTests):
+    pass
+
+
+class TestMissing(BaseDatetimeTests, base.BaseMissingTests):
+    pass
+
+
+class TestReshaping(BaseDatetimeTests, base.BaseReshapingTests):
+    pass
+
+
+class TestSetitem(BaseDatetimeTests, base.BaseSetitemTests):
+    pass
+
+
+class TestGroupby(BaseDatetimeTests, base.BaseGroupbyTests):
+    pass
+
+
+class TestPrinting(BaseDatetimeTests, base.BasePrintingTests):
+    pass
+
+
+class Test2DCompat(BaseDatetimeTests, base.NDArrayBacked2DTests):
+    pass
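The fixtures in the file above all revolve around one tz-aware dtype. As a small hedged illustration of that setup, the same dtype can be built and compared directly against a `Series` produced by `date_range`:

```python
import pandas as pd

# the fixture's dtype: nanosecond datetimes localized to US/Central
dtype = pd.DatetimeTZDtype(unit="ns", tz="US/Central")

ser = pd.Series(pd.date_range("2000", periods=3, tz=dtype.tz))

print(ser.dtype == dtype)  # True: the Series carries the tz-aware dtype
print(str(ser.dtype))      # datetime64[ns, US/Central]
```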
videochat2/lib/python3.10/site-packages/pandas/tests/extension/test_extension.py ADDED
@@ -0,0 +1,26 @@
+"""
+Tests for behavior if an author does *not* implement EA methods.
+"""
+import numpy as np
+import pytest
+
+from pandas.core.arrays import ExtensionArray
+
+
+class MyEA(ExtensionArray):
+    def __init__(self, values) -> None:
+        self._values = values
+
+
+@pytest.fixture
+def data():
+    arr = np.arange(10)
+    return MyEA(arr)
+
+
+class TestExtensionArray:
+    def test_errors(self, data, all_arithmetic_operators):
+        # invalid ops
+        op_name = all_arithmetic_operators
+        with pytest.raises(AttributeError):
+            getattr(data, op_name)
videochat2/lib/python3.10/site-packages/pandas/tests/extension/test_external_block.py ADDED
@@ -0,0 +1,39 @@
+import numpy as np
+import pytest
+
+from pandas._libs.internals import BlockPlacement
+import pandas.util._test_decorators as td
+
+import pandas as pd
+from pandas.core.internals import BlockManager
+from pandas.core.internals.blocks import ExtensionBlock
+
+pytestmark = td.skip_array_manager_invalid_test
+
+
+class CustomBlock(ExtensionBlock):
+    _holder = np.ndarray
+
+    # Cannot override final attribute "_can_hold_na"
+    @property  # type: ignore[misc]
+    def _can_hold_na(self) -> bool:
+        return False
+
+
+@pytest.fixture
+def df():
+    df1 = pd.DataFrame({"a": [1, 2, 3]})
+    blocks = df1._mgr.blocks
+    values = np.arange(3, dtype="int64")
+    bp = BlockPlacement(slice(1, 2))
+    custom_block = CustomBlock(values, placement=bp, ndim=2)
+    blocks = blocks + (custom_block,)
+    block_manager = BlockManager(blocks, [pd.Index(["a", "b"]), df1.index])
+    return pd.DataFrame(block_manager)
+
+
+def test_concat_axis1(df):
+    # GH17954
+    df2 = pd.DataFrame({"c": [0.1, 0.2, 0.3]})
+    res = pd.concat([df, df2], axis=1)
+    assert isinstance(res._mgr.blocks[1], CustomBlock)
videochat2/lib/python3.10/site-packages/pandas/tests/extension/test_floating.py ADDED
@@ -0,0 +1,225 @@
+"""
+This file contains a minimal set of tests for compliance with the extension
+array interface test suite, and should contain no other tests.
+The test suite for the full functionality of the array is located in
+`pandas/tests/arrays/`.
+
+The tests in this file are inherited from the BaseExtensionTests, and only
+minimal tweaks should be applied to get the tests passing (by overwriting a
+parent method).
+
+Additional tests should either be added to one of the BaseExtensionTests
+classes (if they are relevant for the extension interface for all dtypes), or
+be added to the array-specific tests in `pandas/tests/arrays/`.
+
+"""
+import numpy as np
+import pytest
+
+from pandas.core.dtypes.common import is_extension_array_dtype
+
+import pandas as pd
+import pandas._testing as tm
+from pandas.api.types import is_float_dtype
+from pandas.core.arrays.floating import (
+    Float32Dtype,
+    Float64Dtype,
+)
+from pandas.tests.extension import base
+
+
+def make_data():
+    return (
+        list(np.arange(0.1, 0.9, 0.1))
+        + [pd.NA]
+        + list(np.arange(1, 9.8, 0.1))
+        + [pd.NA]
+        + [9.9, 10.0]
+    )
+
+
+@pytest.fixture(params=[Float32Dtype, Float64Dtype])
+def dtype(request):
+    return request.param()
+
+
+@pytest.fixture
+def data(dtype):
+    return pd.array(make_data(), dtype=dtype)
+
+
+@pytest.fixture
+def data_for_twos(dtype):
+    return pd.array(np.ones(100) * 2, dtype=dtype)
+
+
+@pytest.fixture
+def data_missing(dtype):
+    return pd.array([pd.NA, 0.1], dtype=dtype)
+
+
+@pytest.fixture
+def data_for_sorting(dtype):
+    return pd.array([0.1, 0.2, 0.0], dtype=dtype)
+
+
+@pytest.fixture
+def data_missing_for_sorting(dtype):
+    return pd.array([0.1, pd.NA, 0.0], dtype=dtype)
+
+
+@pytest.fixture
+def na_cmp():
+    # we are pd.NA
+    return lambda x, y: x is pd.NA and y is pd.NA
+
+
+@pytest.fixture
+def na_value():
+    return pd.NA
+
+
+@pytest.fixture
+def data_for_grouping(dtype):
+    b = 0.1
+    a = 0.0
+    c = 0.2
+    na = pd.NA
+    return pd.array([b, b, na, na, a, a, b, c], dtype=dtype)
+
+
+class TestDtype(base.BaseDtypeTests):
+    pass
+
+
+class TestArithmeticOps(base.BaseArithmeticOpsTests):
+    def check_opname(self, s, op_name, other, exc=None):
+        # overwriting to indicate ops don't raise an error
+        super().check_opname(s, op_name, other, exc=None)
+
+    def _check_op(self, s, op, other, op_name, exc=NotImplementedError):
+        if exc is None:
+            sdtype = tm.get_dtype(s)
+            if (
+                hasattr(other, "dtype")
+                and not is_extension_array_dtype(other.dtype)
+                and is_float_dtype(other.dtype)
+            ):
+                # other is np.float64 and would therefore always result in
+                # upcasting, so keeping other as same numpy_dtype
+                other = other.astype(sdtype.numpy_dtype)
+
+            result = op(s, other)
+            expected = self._combine(s, other, op)
+
+            # combine method result in 'biggest' (float64) dtype
+            expected = expected.astype(sdtype)
+
+            self.assert_equal(result, expected)
+        else:
+            with pytest.raises(exc):
+                op(s, other)
+
+    def _check_divmod_op(self, s, op, other, exc=None):
+        super()._check_divmod_op(s, op, other, None)
+
+
+class TestComparisonOps(base.BaseComparisonOpsTests):
+    # TODO: share with IntegerArray?
+    def _check_op(self, s, op, other, op_name, exc=NotImplementedError):
+        if exc is None:
+            result = op(s, other)
+            # Override to do the astype to boolean
+            expected = s.combine(other, op).astype("boolean")
+            self.assert_series_equal(result, expected)
+        else:
+            with pytest.raises(exc):
+                op(s, other)
+
+    def check_opname(self, s, op_name, other, exc=None):
+        super().check_opname(s, op_name, other, exc=None)
+
+    def _compare_other(self, s, data, op, other):
+        op_name = f"__{op.__name__}__"
+        self.check_opname(s, op_name, other)
+
+
+class TestInterface(base.BaseInterfaceTests):
+    pass
+
+
+class TestConstructors(base.BaseConstructorsTests):
+    pass
+
+
+class TestReshaping(base.BaseReshapingTests):
+    pass
+
+
+class TestGetitem(base.BaseGetitemTests):
+    pass
+
+
+class TestSetitem(base.BaseSetitemTests):
+    pass
+
+
+class TestIndex(base.BaseIndexTests):
+    pass
+
+
+class TestMissing(base.BaseMissingTests):
+    pass
+
+
+class TestMethods(base.BaseMethodsTests):
+    _combine_le_expected_dtype = object  # TODO: can we make this boolean?
+
+
+class TestCasting(base.BaseCastingTests):
+    pass
+
+
+class TestGroupby(base.BaseGroupbyTests):
+    pass
+
+
+class TestNumericReduce(base.BaseNumericReduceTests):
+    def check_reduce(self, s, op_name, skipna):
+        # overwrite to ensure pd.NA is tested instead of np.nan
+        # https://github.com/pandas-dev/pandas/issues/30958
+        if op_name == "count":
+            result = getattr(s, op_name)()
+            expected = getattr(s.dropna().astype(s.dtype.numpy_dtype), op_name)()
+        else:
+            result = getattr(s, op_name)(skipna=skipna)
+            expected = getattr(s.dropna().astype(s.dtype.numpy_dtype), op_name)(
+                skipna=skipna
+            )
+            if not skipna and s.isna().any():
+                expected = pd.NA
+        tm.assert_almost_equal(result, expected)
+
+
+@pytest.mark.skip(reason="Tested in tests/reductions/test_reductions.py")
+class TestBooleanReduce(base.BaseBooleanReduceTests):
+    pass
+
+
+class TestPrinting(base.BasePrintingTests):
+    pass
+
+
+class TestParsing(base.BaseParsingTests):
+    pass
+
+
+@pytest.mark.filterwarnings("ignore:overflow encountered in reduce:RuntimeWarning")
+class Test2DCompat(base.Dim2CompatTests):
+    pass
+
+
+class TestAccumulation(base.BaseAccumulateTests):
+    @pytest.mark.parametrize("skipna", [True, False])
+    def test_accumulate_series_raises(self, data, all_numeric_accumulations, skipna):
+        pass
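The `check_reduce` override in the file above exists because masked float reductions return `pd.NA` rather than `np.nan` when missing values are not skipped. A quick sketch of that behavior:

```python
import pandas as pd

s = pd.Series([0.1, pd.NA, 0.3], dtype="Float64")

# count ignores missing values; sum skips NA by default but
# propagates it when skipna=False
print(s.count())                      # 2
print(s.sum())                        # 0.4
print(s.sum(skipna=False) is pd.NA)   # True
```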
videochat2/lib/python3.10/site-packages/pandas/tests/extension/test_integer.py ADDED
@@ -0,0 +1,294 @@
+"""
+This file contains a minimal set of tests for compliance with the extension
+array interface test suite, and should contain no other tests.
+The test suite for the full functionality of the array is located in
+`pandas/tests/arrays/`.
+
+The tests in this file are inherited from the BaseExtensionTests, and only
+minimal tweaks should be applied to get the tests passing (by overwriting a
+parent method).
+
+Additional tests should either be added to one of the BaseExtensionTests
+classes (if they are relevant for the extension interface for all dtypes), or
+be added to the array-specific tests in `pandas/tests/arrays/`.
+
+"""
+import numpy as np
+import pytest
+
+from pandas.compat import (
+    IS64,
+    is_platform_windows,
+)
+
+import pandas as pd
+import pandas._testing as tm
+from pandas.api.types import (
+    is_extension_array_dtype,
+    is_integer_dtype,
+)
+from pandas.core.arrays.integer import (
+    Int8Dtype,
+    Int16Dtype,
+    Int32Dtype,
+    Int64Dtype,
+    UInt8Dtype,
+    UInt16Dtype,
+    UInt32Dtype,
+    UInt64Dtype,
+)
+from pandas.tests.extension import base
+
+
+def make_data():
+    return list(range(1, 9)) + [pd.NA] + list(range(10, 98)) + [pd.NA] + [99, 100]
+
+
+@pytest.fixture(
+    params=[
+        Int8Dtype,
+        Int16Dtype,
+        Int32Dtype,
+        Int64Dtype,
+        UInt8Dtype,
+        UInt16Dtype,
+        UInt32Dtype,
+        UInt64Dtype,
+    ]
+)
+def dtype(request):
+    return request.param()
+
+
+@pytest.fixture
+def data(dtype):
+    return pd.array(make_data(), dtype=dtype)
+
+
+@pytest.fixture
+def data_for_twos(dtype):
+    return pd.array(np.ones(100) * 2, dtype=dtype)
+
+
+@pytest.fixture
+def data_missing(dtype):
+    return pd.array([pd.NA, 1], dtype=dtype)
+
+
+@pytest.fixture
+def data_for_sorting(dtype):
+    return pd.array([1, 2, 0], dtype=dtype)
+
+
+@pytest.fixture
+def data_missing_for_sorting(dtype):
+    return pd.array([1, pd.NA, 0], dtype=dtype)
+
+
+@pytest.fixture
+def na_cmp():
+    # we are pd.NA
+    return lambda x, y: x is pd.NA and y is pd.NA
+
+
+@pytest.fixture
+def na_value():
+    return pd.NA
+
+
+@pytest.fixture
+def data_for_grouping(dtype):
+    b = 1
+    a = 0
+    c = 2
+    na = pd.NA
+    return pd.array([b, b, na, na, a, a, b, c], dtype=dtype)
+
+
+class TestDtype(base.BaseDtypeTests):
+    pass
+
+
+class TestArithmeticOps(base.BaseArithmeticOpsTests):
+    def check_opname(self, s, op_name, other, exc=None):
+        # overwriting to indicate ops don't raise an error
+        super().check_opname(s, op_name, other, exc=None)
+
+    def _check_op(self, s, op, other, op_name, exc=NotImplementedError):
+        if exc is None:
+            sdtype = tm.get_dtype(s)
+
+            if (
+                hasattr(other, "dtype")
+                and not is_extension_array_dtype(other.dtype)
+                and is_integer_dtype(other.dtype)
+                and sdtype.is_unsigned_integer
+            ):
+                # TODO: comment below is inaccurate; other can be int8, int16, ...
+                # and the trouble is that e.g. if s is UInt8 and other is int8,
+                # then result is UInt16
+                # other is np.int64 and would therefore always result in
+                # upcasting, so keeping other as same numpy_dtype
+                other = other.astype(sdtype.numpy_dtype)
+
+            result = op(s, other)
+            expected = self._combine(s, other, op)
+
+            if op_name in ("__rtruediv__", "__truediv__", "__div__"):
+                expected = expected.fillna(np.nan).astype("Float64")
+            else:
+                # combine method result in 'biggest' (int64) dtype
+                expected = expected.astype(sdtype)
+
+            self.assert_equal(result, expected)
+        else:
+            with pytest.raises(exc):
+                op(s, other)
+
+    def _check_divmod_op(self, s, op, other, exc=None):
+        super()._check_divmod_op(s, op, other, None)
+
+
+class TestComparisonOps(base.BaseComparisonOpsTests):
+    def _check_op(self, s, op, other, op_name, exc=NotImplementedError):
+        if exc is None:
+            result = op(s, other)
+            # Override to do the astype to boolean
+            expected = s.combine(other, op).astype("boolean")
+            self.assert_series_equal(result, expected)
+        else:
+            with pytest.raises(exc):
+                op(s, other)
+
+    def check_opname(self, s, op_name, other, exc=None):
+        super().check_opname(s, op_name, other, exc=None)
+
+    def _compare_other(self, s, data, op, other):
+        op_name = f"__{op.__name__}__"
+        self.check_opname(s, op_name, other)
+
+
+class TestInterface(base.BaseInterfaceTests):
+    pass
+
+
+class TestConstructors(base.BaseConstructorsTests):
+    pass
+
+
+class TestReshaping(base.BaseReshapingTests):
+    pass
+
+    # for test_concat_mixed_dtypes test
+    # concat of an Integer and Int coerces to object dtype
+    # TODO(jreback) once integrated this would
+
+
+class TestGetitem(base.BaseGetitemTests):
+    pass
+
+
+class TestSetitem(base.BaseSetitemTests):
+    pass
+
+
+class TestIndex(base.BaseIndexTests):
+    pass
+
+
+class TestMissing(base.BaseMissingTests):
+    pass
+
+
+class TestMethods(base.BaseMethodsTests):
+    _combine_le_expected_dtype = object  # TODO: can we make this boolean?
+
+
+class TestCasting(base.BaseCastingTests):
+    pass
+
+
+class TestGroupby(base.BaseGroupbyTests):
+    pass
+
+
+class TestNumericReduce(base.BaseNumericReduceTests):
+    def check_reduce(self, s, op_name, skipna):
+        # overwrite to ensure pd.NA is tested instead of np.nan
+        # https://github.com/pandas-dev/pandas/issues/30958
+        if op_name == "count":
+            result = getattr(s, op_name)()
+            expected = getattr(s.dropna().astype("int64"), op_name)()
+        else:
+            result = getattr(s, op_name)(skipna=skipna)
+            expected = getattr(s.dropna().astype("int64"), op_name)(skipna=skipna)
+            if not skipna and s.isna().any():
+                expected = pd.NA
+        tm.assert_almost_equal(result, expected)
+
+
+@pytest.mark.skip(reason="Tested in tests/reductions/test_reductions.py")
+class TestBooleanReduce(base.BaseBooleanReduceTests):
+    pass
+
+
+class TestAccumulation(base.BaseAccumulateTests):
+    def check_accumulate(self, s, op_name, skipna):
+        # overwrite to ensure pd.NA is tested instead of np.nan
+        # https://github.com/pandas-dev/pandas/issues/30958
+        length = 64
+        if not IS64 or is_platform_windows():
+            if not s.dtype.itemsize == 8:
+                length = 32
+
+        if s.dtype.name.startswith("U"):
+            expected_dtype = f"UInt{length}"
+        else:
+            expected_dtype = f"Int{length}"
+
+        if op_name == "cumsum":
+            result = getattr(s, op_name)(skipna=skipna)
+            expected = pd.Series(
+                pd.array(
+                    getattr(s.astype("float64"), op_name)(skipna=skipna),
+                    dtype=expected_dtype,
+                )
+            )
+            tm.assert_series_equal(result, expected)
+        elif op_name in ["cummax", "cummin"]:
+            result = getattr(s, op_name)(skipna=skipna)
+            expected = pd.Series(
+                pd.array(
+                    getattr(s.astype("float64"), op_name)(skipna=skipna),
+                    dtype=s.dtype,
+                )
+            )
+            tm.assert_series_equal(result, expected)
+        elif op_name == "cumprod":
+            result = getattr(s[:12], op_name)(skipna=skipna)
+            expected = pd.Series(
+                pd.array(
+                    getattr(s[:12].astype("float64"), op_name)(skipna=skipna),
+                    dtype=expected_dtype,
+                )
+            )
+            tm.assert_series_equal(result, expected)
+
+        else:
+            raise NotImplementedError(f"{op_name} not supported")
+
+    @pytest.mark.parametrize("skipna", [True, False])
+    def test_accumulate_series_raises(self, data, all_numeric_accumulations, skipna):
+        pass
+
+
+class TestPrinting(base.BasePrintingTests):
+    pass
+
+
+class TestParsing(base.BaseParsingTests):
+    pass
+
+
+class Test2DCompat(base.Dim2CompatTests):
+    pass
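The `_check_op` override in the file above pins down the result dtypes of masked-integer arithmetic: most ops keep an integer dtype and propagate `pd.NA`, while true division always upcasts to the masked float dtype. A minimal sketch:

```python
import pandas as pd

s = pd.Series([1, pd.NA, 4], dtype="Int8")

# addition with a scalar keeps the masked integer dtype and propagates NA
print((s + 1).dtype)          # Int8
print((s + 1)[1] is pd.NA)    # True

# true division produces the masked float dtype
print((s / 2).dtype)          # Float64
```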
videochat2/lib/python3.10/site-packages/pandas/tests/extension/test_numpy.py ADDED
@@ -0,0 +1,456 @@
+"""
+This file contains a minimal set of tests for compliance with the extension
+array interface test suite, and should contain no other tests.
+The test suite for the full functionality of the array is located in
+`pandas/tests/arrays/`.
+
+The tests in this file are inherited from the BaseExtensionTests, and only
+minimal tweaks should be applied to get the tests passing (by overwriting a
+parent method).
+
+Additional tests should either be added to one of the BaseExtensionTests
+classes (if they are relevant for the extension interface for all dtypes), or
+be added to the array-specific tests in `pandas/tests/arrays/`.
+
+Note: we do not bother with base.BaseIndexTests because PandasArray
+will never be held in an Index.
+"""
+import numpy as np
+import pytest
+
+from pandas.core.dtypes.cast import can_hold_element
+from pandas.core.dtypes.dtypes import (
+    ExtensionDtype,
+    PandasDtype,
+)
+
+import pandas as pd
+import pandas._testing as tm
+from pandas.api.types import is_object_dtype
+from pandas.core.arrays.numpy_ import PandasArray
+from pandas.core.internals import blocks
+from pandas.tests.extension import base
+
+
+def _can_hold_element_patched(obj, element) -> bool:
+    if isinstance(element, PandasArray):
+        element = element.to_numpy()
+    return can_hold_element(obj, element)
+
+
+orig_assert_attr_equal = tm.assert_attr_equal
+
+
+def _assert_attr_equal(attr: str, left, right, obj: str = "Attributes"):
+    """
+    Patch tm.assert_attr_equal so PandasDtype("object") is close enough to
+    np.dtype("object").
+    """
+    if attr == "dtype":
+        lattr = getattr(left, "dtype", None)
+        rattr = getattr(right, "dtype", None)
+        if isinstance(lattr, PandasDtype) and not isinstance(rattr, PandasDtype):
+            left = left.astype(lattr.numpy_dtype)
+        elif isinstance(rattr, PandasDtype) and not isinstance(lattr, PandasDtype):
+            right = right.astype(rattr.numpy_dtype)
+
+    orig_assert_attr_equal(attr, left, right, obj)
+
+
+@pytest.fixture(params=["float", "object"])
+def dtype(request):
+    return PandasDtype(np.dtype(request.param))
+
+
+@pytest.fixture
+def allow_in_pandas(monkeypatch):
+    """
+    A monkeypatch that tells pandas to let us in.
+
+    By default, passing a PandasArray to an index / series / frame
+    constructor will unbox that PandasArray to an ndarray, and treat
+    it as a non-EA column. We don't want people using EAs without
+    reason.
+
+    The mechanism for this is a check against ABCPandasArray
+    in each constructor.
+
+    But, for testing, we need to allow them in pandas. So we patch
+    the _typ of PandasArray, so that we evade the ABCPandasArray
+    check.
+    """
+    with monkeypatch.context() as m:
+        m.setattr(PandasArray, "_typ", "extension")
+        m.setattr(blocks, "can_hold_element", _can_hold_element_patched)
+        m.setattr(tm.asserters, "assert_attr_equal", _assert_attr_equal)
+        yield
+
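The fixture above relies on the context-manager form of monkeypatching, which rolls back every `setattr` when the context exits, so the patched `_typ` never leaks into other tests. A minimal standalone sketch of that pattern (using `pytest.MonkeyPatch` directly, assuming pytest >= 6.2; not part of the diff):

```python
from pytest import MonkeyPatch

class Target:
    flag = "original"

with MonkeyPatch.context() as m:
    m.setattr(Target, "flag", "patched")
    assert Target.flag == "patched"  # visible only inside the context

# every setattr is undone when the context exits
assert Target.flag == "original"
```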
+
+
+@pytest.fixture
+def data(allow_in_pandas, dtype):
+    if dtype.numpy_dtype == "object":
+        return pd.Series([(i,) for i in range(100)]).array
+    return PandasArray(np.arange(1, 101, dtype=dtype._dtype))
+
+
+@pytest.fixture
+def data_missing(allow_in_pandas, dtype):
+    if dtype.numpy_dtype == "object":
+        return PandasArray(np.array([np.nan, (1,)], dtype=object))
+    return PandasArray(np.array([np.nan, 1.0]))
+
+
+@pytest.fixture
+def na_value():
+    return np.nan
+
+
+@pytest.fixture
+def na_cmp():
+    def cmp(a, b):
+        return np.isnan(a) and np.isnan(b)
+
+    return cmp
+
+
+@pytest.fixture
+def data_for_sorting(allow_in_pandas, dtype):
+    """Length-3 array with a known sort order.
+
+    This should be three items [B, C, A] with
+    A < B < C
+    """
+    if dtype.numpy_dtype == "object":
+        # Use an empty tuple for first element, then remove,
+        # to disable np.array's shape inference.
+        return PandasArray(np.array([(), (2,), (3,), (1,)], dtype=object)[1:])
+    return PandasArray(np.array([1, 2, 0]))
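The empty-tuple trick in the fixture above exists because `np.array` applies shape inference to a list of equal-length tuples and produces a 2-D array; prepending a ragged element forces a 1-D object array of tuples. A standalone sketch of the pitfall (not part of the diff):

```python
import numpy as np

# equal-length tuples: np.array infers a 2-D shape
naive = np.array([(1,), (2,), (3,)], dtype=object)
print(naive.shape)  # (3, 1)

# a leading empty tuple makes the input ragged, forcing a 1-D object array;
# slicing it off leaves the three tuples intact
trick = np.array([(), (1,), (2,), (3,)], dtype=object)[1:]
print(trick.shape)  # (3,)
print(trick[0])     # (1,)
```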
+
+
+@pytest.fixture
+def data_missing_for_sorting(allow_in_pandas, dtype):
+    """Length-3 array with a known sort order.
+
+    This should be three items [B, NA, A] with
+    A < B and NA missing.
+    """
+    if dtype.numpy_dtype == "object":
+        return PandasArray(np.array([(1,), np.nan, (0,)], dtype=object))
+    return PandasArray(np.array([1, np.nan, 0]))
+
+
+@pytest.fixture
+def data_for_grouping(allow_in_pandas, dtype):
+    """Data for factorization, grouping, and unique tests.
+
+    Expected to be like [B, B, NA, NA, A, A, B, C]
+
+    Where A < B < C and NA is missing
+    """
+    if dtype.numpy_dtype == "object":
+        a, b, c = (1,), (2,), (3,)
+    else:
+        a, b, c = np.arange(3)
+    return PandasArray(
+        np.array([b, b, np.nan, np.nan, a, a, b, c], dtype=dtype.numpy_dtype)
+    )
+
+
+@pytest.fixture
+def skip_numpy_object(dtype, request):
+    """
+    Tests for PandasArray with nested data. Users typically won't create
+    these objects via `pd.array`, but they can show up through `.array`
+    on a Series with nested data. Many of the base tests fail, as they aren't
+    appropriate for nested data.
+
+    This fixture allows these tests to be skipped when used as a usefixtures
+    marker to either an individual test or a test class.
+    """
+    if dtype == "object":
+        mark = pytest.mark.xfail(reason="Fails for object dtype")
+        request.node.add_marker(mark)
+
+
+skip_nested = pytest.mark.usefixtures("skip_numpy_object")
+
+
+class BaseNumPyTests:
+    @classmethod
+    def assert_series_equal(cls, left, right, *args, **kwargs):
+        # base class tests hard-code expected values with numpy dtypes,
+        # whereas we generally want the corresponding PandasDtype
+        if (
+            isinstance(right, pd.Series)
+            and not isinstance(right.dtype, ExtensionDtype)
+            and isinstance(left.dtype, PandasDtype)
+        ):
+            right = right.astype(PandasDtype(right.dtype))
+        return tm.assert_series_equal(left, right, *args, **kwargs)
+
+
+class TestCasting(BaseNumPyTests, base.BaseCastingTests):
+    pass
+
+
+class TestConstructors(BaseNumPyTests, base.BaseConstructorsTests):
+    @pytest.mark.skip(reason="We don't register our dtype")
+    # We don't want to register. This test should probably be split in two.
+    def test_from_dtype(self, data):
+        pass
+
+    @skip_nested
+    def test_series_constructor_scalar_with_index(self, data, dtype):
+        # ValueError: Length of passed values is 1, index implies 3.
+        super().test_series_constructor_scalar_with_index(data, dtype)
+
+
+class TestDtype(BaseNumPyTests, base.BaseDtypeTests):
+    def test_check_dtype(self, data, request):
+        if data.dtype.numpy_dtype == "object":
+            request.node.add_marker(
+                pytest.mark.xfail(
+                    reason=f"PandasArray expectedly clashes with a "
+                    f"NumPy name: {data.dtype.numpy_dtype}"
+                )
+            )
+        super().test_check_dtype(data)
+
+    def test_is_not_object_type(self, dtype, request):
+        if dtype.numpy_dtype == "object":
+            # Different from BaseDtypeTests.test_is_not_object_type
+            # because PandasDtype(object) is an object type
+            assert is_object_dtype(dtype)
+        else:
+            super().test_is_not_object_type(dtype)
+
+
+class TestGetitem(BaseNumPyTests, base.BaseGetitemTests):
+    @skip_nested
+    def test_getitem_scalar(self, data):
+        # AssertionError
+        super().test_getitem_scalar(data)
+
+
+class TestGroupby(BaseNumPyTests, base.BaseGroupbyTests):
+    pass
+
+
+class TestInterface(BaseNumPyTests, base.BaseInterfaceTests):
+    @skip_nested
+    def test_array_interface(self, data):
+        # NumPy array shape inference
+        super().test_array_interface(data)
+
+
+class TestMethods(BaseNumPyTests, base.BaseMethodsTests):
+    @skip_nested
+    def test_shift_fill_value(self, data):
+        # np.array shape inference. Shift implementation fails.
+        super().test_shift_fill_value(data)
+
+    @skip_nested
+    def test_fillna_copy_frame(self, data_missing):
+        # The "scalar" for this array isn't a scalar.
+        super().test_fillna_copy_frame(data_missing)
+
+    @skip_nested
+    def test_fillna_copy_series(self, data_missing):
+        # The "scalar" for this array isn't a scalar.
+        super().test_fillna_copy_series(data_missing)
+
+    @skip_nested
+    def test_searchsorted(self, data_for_sorting, as_series):
+        # Test setup fails.
+        super().test_searchsorted(data_for_sorting, as_series)
+
+    @pytest.mark.xfail(reason="PandasArray.diff may fail on dtype")
+    def test_diff(self, data, periods):
+        return super().test_diff(data, periods)
+
+    def test_insert(self, data, request):
+        if data.dtype.numpy_dtype == object:
+            mark = pytest.mark.xfail(reason="Dimension mismatch in np.concatenate")
+            request.node.add_marker(mark)
+
+        super().test_insert(data)
+
+    @skip_nested
+    def test_insert_invalid(self, data, invalid_scalar):
+        # PandasArray[object] can hold anything, so skip
+        super().test_insert_invalid(data, invalid_scalar)
+
+
+class TestArithmetics(BaseNumPyTests, base.BaseArithmeticOpsTests):
+    divmod_exc = None
+    series_scalar_exc = None
+    frame_scalar_exc = None
+    series_array_exc = None
+
+    @skip_nested
+    def test_divmod(self, data):
+        super().test_divmod(data)
+
+    @skip_nested
+    def test_divmod_series_array(self, data):
+        ser = pd.Series(data)
+        self._check_divmod_op(ser, divmod, data, exc=None)
+
+    @skip_nested
+    def test_arith_series_with_scalar(self, data, all_arithmetic_operators):
+        super().test_arith_series_with_scalar(data, all_arithmetic_operators)
+
+    def test_arith_series_with_array(self, data, all_arithmetic_operators, request):
+        opname = all_arithmetic_operators
+        if data.dtype.numpy_dtype == object and opname not in ["__add__", "__radd__"]:
+            mark = pytest.mark.xfail(reason="Fails for object dtype")
+            request.node.add_marker(mark)
+        super().test_arith_series_with_array(data, all_arithmetic_operators)
+
+    @skip_nested
+    def test_arith_frame_with_scalar(self, data, all_arithmetic_operators):
+        super().test_arith_frame_with_scalar(data, all_arithmetic_operators)
+
+
+class TestPrinting(BaseNumPyTests, base.BasePrintingTests):
+    pass
+
+
+class TestNumericReduce(BaseNumPyTests, base.BaseNumericReduceTests):
+    def check_reduce(self, s, op_name, skipna):
+        result = getattr(s, op_name)(skipna=skipna)
+        # avoid coercing int -> float. Just cast to the actual numpy type.
+        expected = getattr(s.astype(s.dtype._dtype), op_name)(skipna=skipna)
+        tm.assert_almost_equal(result, expected)
+
+    @pytest.mark.parametrize("skipna", [True, False])
+    def test_reduce_series(self, data, all_boolean_reductions, skipna):
+        super().test_reduce_series(data, all_boolean_reductions, skipna)
+
+
+@skip_nested
+class TestBooleanReduce(BaseNumPyTests, base.BaseBooleanReduceTests):
+    pass
+
+
+class TestMissing(BaseNumPyTests, base.BaseMissingTests):
+    @skip_nested
+    def test_fillna_series(self, data_missing):
+        # Non-scalar "scalar" values.
+        super().test_fillna_series(data_missing)
+
+    @skip_nested
+    def test_fillna_frame(self, data_missing):
+        # Non-scalar "scalar" values.
+        super().test_fillna_frame(data_missing)
+
+
+class TestReshaping(BaseNumPyTests, base.BaseReshapingTests):
+    @pytest.mark.parametrize(
+        "in_frame",
+        [
+            True,
+            pytest.param(
+                False,
+                marks=pytest.mark.xfail(reason="PandasArray inconsistently extracted"),
+            ),
+        ],
+    )
+    def test_concat(self, data, in_frame):
+        super().test_concat(data, in_frame)
+
+
+class TestSetitem(BaseNumPyTests, base.BaseSetitemTests):
+    @skip_nested
+    def test_setitem_invalid(self, data, invalid_scalar):
+        # object dtype can hold anything, so doesn't raise
+        super().test_setitem_invalid(data, invalid_scalar)
+
+    @skip_nested
+    def test_setitem_sequence_broadcasts(self, data, box_in_series):
+        # ValueError: cannot set using a list-like indexer with a different
+        # length than the value
+        super().test_setitem_sequence_broadcasts(data, box_in_series)
+
+    @skip_nested
+    @pytest.mark.parametrize("setter", ["loc", None])
+    def test_setitem_mask_broadcast(self, data, setter):
+        # ValueError: cannot set using a list-like indexer with a different
+        # length than the value
+        super().test_setitem_mask_broadcast(data, setter)
+
+    @skip_nested
+    def test_setitem_scalar_key_sequence_raise(self, data):
+        # Failed: DID NOT RAISE <class 'ValueError'>
+        super().test_setitem_scalar_key_sequence_raise(data)
+
+    # TODO: there is some issue with PandasArray, therefore,
+    # skip the setitem test for now, and fix it later (GH 31446)
+
+    @skip_nested
+    @pytest.mark.parametrize(
+        "mask",
+        [
+            np.array([True, True, True, False, False]),
+            pd.array([True, True, True, False, False], dtype="boolean"),
+        ],
+        ids=["numpy-array", "boolean-array"],
+    )
+    def test_setitem_mask(self, data, mask, box_in_series):
+        super().test_setitem_mask(data, mask, box_in_series)
+
+    @skip_nested
+    @pytest.mark.parametrize(
+        "idx",
+        [[0, 1, 2], pd.array([0, 1, 2], dtype="Int64"), np.array([0, 1, 2])],
+        ids=["list", "integer-array", "numpy-array"],
+    )
+    def test_setitem_integer_array(self, data, idx, box_in_series):
+        super().test_setitem_integer_array(data, idx, box_in_series)
+
+    @pytest.mark.parametrize(
+        "idx, box_in_series",
+        [
+            ([0, 1, 2, pd.NA], False),
+            pytest.param([0, 1, 2, pd.NA], True, marks=pytest.mark.xfail),
+            (pd.array([0, 1, 2, pd.NA], dtype="Int64"), False),
+            (pd.array([0, 1, 2, pd.NA], dtype="Int64"), False),
+        ],
+        ids=["list-False", "list-True", "integer-array-False", "integer-array-True"],
+    )
+    def test_setitem_integer_with_missing_raises(self, data, idx, box_in_series):
+        super().test_setitem_integer_with_missing_raises(data, idx, box_in_series)
+
+    @skip_nested
+    def test_setitem_slice(self, data, box_in_series):
+        super().test_setitem_slice(data, box_in_series)
+
+    @skip_nested
+    def test_setitem_loc_iloc_slice(self, data):
+        super().test_setitem_loc_iloc_slice(data)
+
+    def test_setitem_with_expansion_dataframe_column(self, data, full_indexer):
+        # https://github.com/pandas-dev/pandas/issues/32395
+        df = expected = pd.DataFrame({"data": pd.Series(data)})
+        result = pd.DataFrame(index=df.index)
+
+        # because result has object dtype, the attempt to do setting inplace
+        # is successful, and object dtype is retained
+        key = full_indexer(df)
+        result.loc[key, "data"] = df["data"]
+
+        # base class method has expected = df; PandasArray behaves oddly because
+        # we patch _typ for these tests.
+        if data.dtype.numpy_dtype != object:
+            if not isinstance(key, slice) or key != slice(None):
+                expected = pd.DataFrame({"data": data.to_numpy()})
+        self.assert_frame_equal(result, expected)
+
+
+@skip_nested
+class TestParsing(BaseNumPyTests, base.BaseParsingTests):
+    pass
+
+
+class Test2DCompat(BaseNumPyTests, base.NDArrayBacked2DTests):
+    pass
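For reference, the object-dtype path exercised throughout this file, a numpy-backed extension array holding Python tuples, is what `.array` returns on an object Series. A minimal standalone sketch (the concrete class name varies across pandas versions, so only public behaviour is shown; not part of the diff):

```python
import pandas as pd

ser = pd.Series([(i,) for i in range(3)])  # object-dtype Series of tuples
arr = ser.array                            # numpy-backed extension array

print(str(arr.dtype))  # object
print(arr[1])          # (1,)
print(len(arr))        # 3
```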
videochat2/lib/python3.10/site-packages/pandas/tests/extension/test_period.py ADDED
@@ -0,0 +1,202 @@
+"""
+This file contains a minimal set of tests for compliance with the extension
+array interface test suite, and should contain no other tests.
+The test suite for the full functionality of the array is located in
+`pandas/tests/arrays/`.
+
+The tests in this file are inherited from the BaseExtensionTests, and only
+minimal tweaks should be applied to get the tests passing (by overwriting a
+parent method).
+
+Additional tests should either be added to one of the BaseExtensionTests
+classes (if they are relevant for the extension interface for all dtypes), or
+be added to the array-specific tests in `pandas/tests/arrays/`.
+
+"""
+import numpy as np
+import pytest
+
+from pandas._libs import iNaT
+from pandas.compat import is_platform_windows
+from pandas.compat.numpy import np_version_gte1p24
+
+from pandas.core.dtypes.dtypes import PeriodDtype
+
+import pandas as pd
+import pandas._testing as tm
+from pandas.core.arrays import PeriodArray
+from pandas.tests.extension import base
+
+
+@pytest.fixture(params=["D", "2D"])
+def dtype(request):
+    return PeriodDtype(freq=request.param)
+
+
+@pytest.fixture
+def data(dtype):
+    return PeriodArray(np.arange(1970, 2070), freq=dtype.freq)
+
+
+@pytest.fixture
+def data_for_twos(dtype):
+    return PeriodArray(np.ones(100) * 2, freq=dtype.freq)
+
+
+@pytest.fixture
+def data_for_sorting(dtype):
+    return PeriodArray([2018, 2019, 2017], freq=dtype.freq)
+
+
+@pytest.fixture
+def data_missing(dtype):
+    return PeriodArray([iNaT, 2017], freq=dtype.freq)
+
+
+@pytest.fixture
+def data_missing_for_sorting(dtype):
+    return PeriodArray([2018, iNaT, 2017], freq=dtype.freq)
+
+
+@pytest.fixture
+def data_for_grouping(dtype):
+    B = 2018
+    NA = iNaT
+    A = 2017
+    C = 2019
+    return PeriodArray([B, B, NA, NA, A, A, B, C], freq=dtype.freq)
+
+
+@pytest.fixture
+def na_value():
+    return pd.NaT
+
+
+class BasePeriodTests:
+    pass
+
+
+class TestPeriodDtype(BasePeriodTests, base.BaseDtypeTests):
+    pass
+
+
+class TestConstructors(BasePeriodTests, base.BaseConstructorsTests):
+    pass
+
+
+class TestGetitem(BasePeriodTests, base.BaseGetitemTests):
+    pass
+
+
+class TestIndex(base.BaseIndexTests):
+    pass
+
+
+class TestMethods(BasePeriodTests, base.BaseMethodsTests):
+    def test_combine_add(self, data_repeated):
+        # Period + Period is not defined.
+        pass
+
+    @pytest.mark.parametrize("periods", [1, -2])
+    def test_diff(self, data, periods):
+        if is_platform_windows() and np_version_gte1p24:
+            with tm.assert_produces_warning(RuntimeWarning, check_stacklevel=False):
+                super().test_diff(data, periods)
+        else:
+            super().test_diff(data, periods)
+
+
+class TestInterface(BasePeriodTests, base.BaseInterfaceTests):
+    pass
+
+
+class TestArithmeticOps(BasePeriodTests, base.BaseArithmeticOpsTests):
+    implements = {"__sub__", "__rsub__"}
+
+    def test_arith_frame_with_scalar(self, data, all_arithmetic_operators):
+        # frame & scalar
+        if all_arithmetic_operators in self.implements:
+            df = pd.DataFrame({"A": data})
+            self.check_opname(df, all_arithmetic_operators, data[0], exc=None)
+        else:
+            # ... but not the rest.
+            super().test_arith_frame_with_scalar(data, all_arithmetic_operators)
+
+    def test_arith_series_with_scalar(self, data, all_arithmetic_operators):
+        # we implement substitution...
+        if all_arithmetic_operators in self.implements:
+            s = pd.Series(data)
+            self.check_opname(s, all_arithmetic_operators, s.iloc[0], exc=None)
+        else:
+            # ... but not the rest.
+            super().test_arith_series_with_scalar(data, all_arithmetic_operators)
+
+    def test_arith_series_with_array(self, data, all_arithmetic_operators):
+        if all_arithmetic_operators in self.implements:
+            s = pd.Series(data)
+            self.check_opname(s, all_arithmetic_operators, s.iloc[0], exc=None)
+        else:
+            # ... but not the rest.
+            super().test_arith_series_with_scalar(data, all_arithmetic_operators)
+
+    def _check_divmod_op(self, s, op, other, exc=NotImplementedError):
+        super()._check_divmod_op(s, op, other, exc=TypeError)
+
+    def test_add_series_with_extension_array(self, data):
+        # we don't implement + for Period
+        s = pd.Series(data)
+        msg = (
+            r"unsupported operand type\(s\) for \+: "
+            r"\'PeriodArray\' and \'PeriodArray\'"
+        )
+        with pytest.raises(TypeError, match=msg):
+            s + data
+
+    def test_direct_arith_with_ndframe_returns_not_implemented(
+        self, data, frame_or_series
+    ):
+        # Override to use __sub__ instead of __add__
+        other = pd.Series(data)
+        if frame_or_series is pd.DataFrame:
+            other = other.to_frame()
+
+        result = data.__sub__(other)
+        assert result is NotImplemented
+
+
+class TestCasting(BasePeriodTests, base.BaseCastingTests):
+    pass
+
+
+class TestComparisonOps(BasePeriodTests, base.BaseComparisonOpsTests):
+    pass
+
+
+class TestMissing(BasePeriodTests, base.BaseMissingTests):
+    pass
+
+
+class TestReshaping(BasePeriodTests, base.BaseReshapingTests):
+    pass
+
+
+class TestSetitem(BasePeriodTests, base.BaseSetitemTests):
+    pass
+
+
+class TestGroupby(BasePeriodTests, base.BaseGroupbyTests):
+    pass
+
+
+class TestPrinting(BasePeriodTests, base.BasePrintingTests):
+    pass
+
+
+class TestParsing(BasePeriodTests, base.BaseParsingTests):
+    @pytest.mark.parametrize("engine", ["c", "python"])
+    def test_EA_types(self, engine, data):
+        super().test_EA_types(engine, data)
+
+
+class Test2DCompat(BasePeriodTests, base.NDArrayBacked2DTests):
+    pass
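As the overrides above note, Period arithmetic implements only `__sub__`/`__rsub__`: subtracting two Periods of the same frequency yields an offset, while addition raises `TypeError`. A minimal standalone sketch (scalar `Period`, not part of the diff):

```python
import pandas as pd

a = pd.Period("2019-01-03", freq="D")
b = pd.Period("2019-01-01", freq="D")

diff = a - b   # subtraction is defined and yields a DateOffset
print(diff.n)  # 2

try:
    a + b      # addition of two Periods is not defined
except TypeError as err:
    print("unsupported operand" in str(err))  # True
```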
videochat2/lib/python3.10/site-packages/pandas/tests/extension/test_sparse.py ADDED
@@ -0,0 +1,475 @@
+"""
+This file contains a minimal set of tests for compliance with the extension
+array interface test suite, and should contain no other tests.
+The test suite for the full functionality of the array is located in
+`pandas/tests/arrays/`.
+
+The tests in this file are inherited from the BaseExtensionTests, and only
+minimal tweaks should be applied to get the tests passing (by overwriting a
+parent method).
+
+Additional tests should either be added to one of the BaseExtensionTests
+classes (if they are relevant for the extension interface for all dtypes), or
+be added to the array-specific tests in `pandas/tests/arrays/`.
+
+"""
+
+import numpy as np
+import pytest
+
+from pandas.errors import PerformanceWarning
+
+import pandas as pd
+from pandas import SparseDtype
+import pandas._testing as tm
+from pandas.arrays import SparseArray
+from pandas.tests.extension import base
+
+
+def make_data(fill_value):
+    if np.isnan(fill_value):
+        data = np.random.uniform(size=100)
+    else:
+        data = np.random.randint(1, 100, size=100)
+        if data[0] == data[1]:
+            data[0] += 1
+
+    data[2::3] = fill_value
+    return data
+
+
+@pytest.fixture
+def dtype():
+    return SparseDtype()
+
+
+@pytest.fixture(params=[0, np.nan])
+def data(request):
+    """Length-100 SparseArray for semantics test."""
+    res = SparseArray(make_data(request.param), fill_value=request.param)
+    return res
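The `[0, np.nan]` params above toggle the `fill_value`, which decides which entries a `SparseArray` actually stores versus leaves implicit. A minimal standalone sketch of that behaviour (not part of the diff):

```python
import numpy as np
from pandas.arrays import SparseArray

arr = SparseArray([0, 0, 1, 2], fill_value=0)

print(arr.density)              # 0.5 -- only the two non-fill values are stored
print(arr.to_dense().tolist())  # [0, 0, 1, 2]

# with a NaN fill_value, the NaNs become the implicit entries instead
arr_nan = SparseArray([np.nan, np.nan, 1.0, 2.0], fill_value=np.nan)
print(arr_nan.density)          # 0.5
```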
+
+
+@pytest.fixture
+def data_for_twos():
+    return SparseArray(np.ones(100) * 2)
+
+
+@pytest.fixture(params=[0, np.nan])
+def data_missing(request):
+    """Length 2 array with [NA, Valid]"""
+    return SparseArray([np.nan, 1], fill_value=request.param)
+
+
+@pytest.fixture(params=[0, np.nan])
+def data_repeated(request):
+    """Return different versions of data for count times"""
+
+    def gen(count):
+        for _ in range(count):
+            yield SparseArray(make_data(request.param), fill_value=request.param)
+
+    yield gen
+
+
+@pytest.fixture(params=[0, np.nan])
+def data_for_sorting(request):
+    return SparseArray([2, 3, 1], fill_value=request.param)
+
+
+@pytest.fixture(params=[0, np.nan])
+def data_missing_for_sorting(request):
+    return SparseArray([2, np.nan, 1], fill_value=request.param)
+
+
+@pytest.fixture
+def na_value():
+    return np.nan
+
+
+@pytest.fixture
+def na_cmp():
+    return lambda left, right: pd.isna(left) and pd.isna(right)
+
+
+@pytest.fixture(params=[0, np.nan])
+def data_for_grouping(request):
+    return SparseArray([1, 1, np.nan, np.nan, 2, 2, 1, 3], fill_value=request.param)
+
+
+@pytest.fixture(params=[0, np.nan])
+def data_for_compare(request):
+    return SparseArray([0, 0, np.nan, -2, -1, 4, 2, 3, 0, 0], fill_value=request.param)
+
+
+class BaseSparseTests:
+    def _check_unsupported(self, data):
+        if data.dtype == SparseDtype(int, 0):
+            pytest.skip("Can't store nan in int array.")
+
+    @pytest.mark.xfail(reason="SparseArray does not support setitem")
+    def test_ravel(self, data):
+        super().test_ravel(data)
+
+
+class TestDtype(BaseSparseTests, base.BaseDtypeTests):
+    def test_array_type_with_arg(self, data, dtype):
+        assert dtype.construct_array_type() is SparseArray
+
+
+class TestInterface(BaseSparseTests, base.BaseInterfaceTests):
+    def test_copy(self, data):
+        # __setitem__ does not work, so we only have a smoke-test
+        data.copy()
+
+    def test_view(self, data):
+        # __setitem__ does not work, so we only have a smoke-test
+        data.view()
+
+
+class TestConstructors(BaseSparseTests, base.BaseConstructorsTests):
+    pass
+
+
+class TestReshaping(BaseSparseTests, base.BaseReshapingTests):
+    def test_concat_mixed_dtypes(self, data):
+        # https://github.com/pandas-dev/pandas/issues/20762
+        # This should be the same, aside from concat([sparse, float])
+        df1 = pd.DataFrame({"A": data[:3]})
+        df2 = pd.DataFrame({"A": [1, 2, 3]})
+        df3 = pd.DataFrame({"A": ["a", "b", "c"]}).astype("category")
+        dfs = [df1, df2, df3]
+
+        # dataframes
+        result = pd.concat(dfs)
+        expected = pd.concat(
+            [x.apply(lambda s: np.asarray(s).astype(object)) for x in dfs]
+        )
+        self.assert_frame_equal(result, expected)
+
+    @pytest.mark.parametrize(
+        "columns",
+        [
+            ["A", "B"],
+            pd.MultiIndex.from_tuples(
+                [("A", "a"), ("A", "b")], names=["outer", "inner"]
+            ),
+        ],
+    )
+    def test_stack(self, data, columns):
+        super().test_stack(data, columns)
+
+    def test_concat_columns(self, data, na_value):
+        self._check_unsupported(data)
+        super().test_concat_columns(data, na_value)
+
+    def test_concat_extension_arrays_copy_false(self, data, na_value):
+        self._check_unsupported(data)
+        super().test_concat_extension_arrays_copy_false(data, na_value)
+
+    def test_align(self, data, na_value):
+        self._check_unsupported(data)
+        super().test_align(data, na_value)
+
+    def test_align_frame(self, data, na_value):
+        self._check_unsupported(data)
+        super().test_align_frame(data, na_value)
+
+    def test_align_series_frame(self, data, na_value):
+        self._check_unsupported(data)
+        super().test_align_series_frame(data, na_value)
+
+    def test_merge(self, data, na_value):
+        self._check_unsupported(data)
+        super().test_merge(data, na_value)
+
+    @pytest.mark.xfail(reason="SparseArray does not support setitem")
+    def test_transpose(self, data):
+        super().test_transpose(data)
+
+
+class TestGetitem(BaseSparseTests, base.BaseGetitemTests):
+    def test_get(self, data):
+        ser = pd.Series(data, index=[2 * i for i in range(len(data))])
+        if np.isnan(ser.values.fill_value):
+            assert np.isnan(ser.get(4)) and np.isnan(ser.iloc[2])
+        else:
+            assert ser.get(4) == ser.iloc[2]
+        assert ser.get(2) == ser.iloc[1]
+
+    def test_reindex(self, data, na_value):
+        self._check_unsupported(data)
+        super().test_reindex(data, na_value)
+
+
+# Skipping TestSetitem, since we don't implement it.
+
+
+class TestIndex(base.BaseIndexTests):
+    pass
+
+
+class TestMissing(BaseSparseTests, base.BaseMissingTests):
+    def test_isna(self, data_missing):
+        sarr = SparseArray(data_missing)
+        expected_dtype = SparseDtype(bool, pd.isna(data_missing.dtype.fill_value))
+        expected = SparseArray([True, False], dtype=expected_dtype)
+        result = sarr.isna()
+        tm.assert_sp_array_equal(result, expected)
+
+        # test isna for arr without na
+        sarr = sarr.fillna(0)
+        expected_dtype = SparseDtype(bool, pd.isna(data_missing.dtype.fill_value))
+        expected = SparseArray([False, False], fill_value=False, dtype=expected_dtype)
+        self.assert_equal(sarr.isna(), expected)
+
+    def test_fillna_limit_pad(self, data_missing):
+        with tm.assert_produces_warning(PerformanceWarning, check_stacklevel=False):
+            super().test_fillna_limit_pad(data_missing)
+
+    def test_fillna_limit_backfill(self, data_missing):
+        with tm.assert_produces_warning(PerformanceWarning, check_stacklevel=False):
+            super().test_fillna_limit_backfill(data_missing)
+
+    def test_fillna_no_op_returns_copy(self, data, request):
+        if np.isnan(data.fill_value):
+            request.node.add_marker(
+                pytest.mark.xfail(reason="returns array with different fill value")
+            )
+        with tm.assert_produces_warning(PerformanceWarning, check_stacklevel=False):
+            super().test_fillna_no_op_returns_copy(data)
+
+    def test_fillna_series_method(self, data_missing):
+        with tm.assert_produces_warning(PerformanceWarning, check_stacklevel=False):
+            super().test_fillna_limit_backfill(data_missing)
+
+    @pytest.mark.xfail(reason="Unsupported")
+    def test_fillna_series(self):
+        # this one looks doable.
+        super().test_fillna_series()
+
+    def test_fillna_frame(self, data_missing):
+        # Have to override to specify that fill_value will change.
+        fill_value = data_missing[1]
+
+        result = pd.DataFrame({"A": data_missing, "B": [1, 2]}).fillna(fill_value)
+
+        if pd.isna(data_missing.fill_value):
+            dtype = SparseDtype(data_missing.dtype, fill_value)
+        else:
+            dtype = data_missing.dtype
+
+        expected = pd.DataFrame(
+            {
+                "A": data_missing._from_sequence([fill_value, fill_value], dtype=dtype),
+                "B": [1, 2],
+            }
+        )
+
+        self.assert_frame_equal(result, expected)
+
+
+class TestMethods(BaseSparseTests, base.BaseMethodsTests):
+    _combine_le_expected_dtype = "Sparse[bool]"
+
+    def test_fillna_copy_frame(self, data_missing, using_copy_on_write):
+        arr = data_missing.take([1, 1])
+        df = pd.DataFrame({"A": arr}, copy=False)
+
+        filled_val = df.iloc[0, 0]
+        result = df.fillna(filled_val)
+
+        if hasattr(df._mgr, "blocks"):
+            if using_copy_on_write:
+                assert df.values.base is result.values.base
+            else:
+                assert df.values.base is not result.values.base
+        assert df.A._values.to_dense() is arr.to_dense()
+
+    def test_fillna_copy_series(self, data_missing, using_copy_on_write):
+        arr = data_missing.take([1, 1])
+        ser = pd.Series(arr, copy=False)
+
+        filled_val = ser[0]
+        result = ser.fillna(filled_val)
+
+        if using_copy_on_write:
+            assert ser._values is result._values
+
+        else:
+            assert ser._values is not result._values
+        assert ser._values.to_dense() is arr.to_dense()
+
+    @pytest.mark.xfail(reason="Not Applicable")
+    def test_fillna_length_mismatch(self, data_missing):
+        super().test_fillna_length_mismatch(data_missing)
+
+    def test_where_series(self, data, na_value):
308
+ assert data[0] != data[1]
309
+ cls = type(data)
310
+ a, b = data[:2]
311
+
312
+ ser = pd.Series(cls._from_sequence([a, a, b, b], dtype=data.dtype))
313
+
314
+ cond = np.array([True, True, False, False])
315
+ result = ser.where(cond)
316
+
317
+ new_dtype = SparseDtype("float", 0.0)
318
+ expected = pd.Series(
319
+ cls._from_sequence([a, a, na_value, na_value], dtype=new_dtype)
320
+ )
321
+ self.assert_series_equal(result, expected)
322
+
323
+ other = cls._from_sequence([a, b, a, b], dtype=data.dtype)
324
+ cond = np.array([True, False, True, True])
325
+ result = ser.where(cond, other)
326
+ expected = pd.Series(cls._from_sequence([a, b, b, b], dtype=data.dtype))
327
+ self.assert_series_equal(result, expected)
328
+
329
+ def test_combine_first(self, data, request):
330
+ if data.dtype.subtype == "int":
331
+ # Right now this is upcasted to float, just like combine_first
332
+ # for Series[int]
333
+ mark = pytest.mark.xfail(
334
+ reason="TODO(SparseArray.__setitem__) will preserve dtype."
335
+ )
336
+ request.node.add_marker(mark)
337
+ super().test_combine_first(data)
338
+
339
+ def test_searchsorted(self, data_for_sorting, as_series):
340
+ with tm.assert_produces_warning(PerformanceWarning, check_stacklevel=False):
341
+ super().test_searchsorted(data_for_sorting, as_series)
342
+
343
+ def test_shift_0_periods(self, data):
344
+ # GH#33856 shifting with periods=0 should return a copy, not same obj
345
+ result = data.shift(0)
346
+
347
+ data._sparse_values[0] = data._sparse_values[1]
348
+ assert result._sparse_values[0] != result._sparse_values[1]
349
+
350
+ @pytest.mark.parametrize("method", ["argmax", "argmin"])
351
+ def test_argmin_argmax_all_na(self, method, data, na_value):
352
+ # overriding because Sparse[int64, 0] cannot handle na_value
353
+ self._check_unsupported(data)
354
+ super().test_argmin_argmax_all_na(method, data, na_value)
355
+
356
+ @pytest.mark.parametrize("box", [pd.array, pd.Series, pd.DataFrame])
357
+ def test_equals(self, data, na_value, as_series, box):
358
+ self._check_unsupported(data)
359
+ super().test_equals(data, na_value, as_series, box)
360
+
361
+
362
+ class TestCasting(BaseSparseTests, base.BaseCastingTests):
363
+ def test_astype_str(self, data):
364
+ # pre-2.0 this would give a SparseDtype even if the user asked
365
+ # for a non-sparse dtype.
366
+ result = pd.Series(data[:5]).astype(str)
367
+ expected = pd.Series([str(x) for x in data[:5]], dtype=object)
368
+ self.assert_series_equal(result, expected)
369
+
370
+ @pytest.mark.xfail(raises=TypeError, reason="no sparse StringDtype")
371
+ def test_astype_string(self, data):
372
+ super().test_astype_string(data)
373
+
374
+
375
+ class TestArithmeticOps(BaseSparseTests, base.BaseArithmeticOpsTests):
376
+ series_scalar_exc = None
377
+ frame_scalar_exc = None
378
+ divmod_exc = None
379
+ series_array_exc = None
380
+
381
+ def _skip_if_different_combine(self, data):
382
+ if data.fill_value == 0:
383
+ # arith ops call on dtype.fill_value so that the sparsity
384
+ # is maintained. Combine can't be called on a dtype in
385
+ # general, so we can't make the expected. This is tested elsewhere
386
+ pytest.skip("Incorrected expected from Series.combine and tested elsewhere")
387
+
388
+ def test_arith_series_with_scalar(self, data, all_arithmetic_operators):
389
+ self._skip_if_different_combine(data)
390
+ super().test_arith_series_with_scalar(data, all_arithmetic_operators)
391
+
392
+ def test_arith_series_with_array(self, data, all_arithmetic_operators):
393
+ self._skip_if_different_combine(data)
394
+ super().test_arith_series_with_array(data, all_arithmetic_operators)
395
+
396
+ def test_arith_frame_with_scalar(self, data, all_arithmetic_operators, request):
397
+ if data.dtype.fill_value != 0:
398
+ pass
399
+ elif all_arithmetic_operators.strip("_") not in [
400
+ "mul",
401
+ "rmul",
402
+ "floordiv",
403
+ "rfloordiv",
404
+ "pow",
405
+ "mod",
406
+ "rmod",
407
+ ]:
408
+ mark = pytest.mark.xfail(reason="result dtype.fill_value mismatch")
409
+ request.node.add_marker(mark)
410
+ super().test_arith_frame_with_scalar(data, all_arithmetic_operators)
411
+
412
+ def _check_divmod_op(self, ser, op, other, exc=NotImplementedError):
413
+ # We implement divmod
414
+ super()._check_divmod_op(ser, op, other, exc=None)
415
+
416
+
417
+ class TestComparisonOps(BaseSparseTests):
418
+ def _compare_other(self, data_for_compare: SparseArray, comparison_op, other):
419
+ op = comparison_op
420
+
421
+ result = op(data_for_compare, other)
422
+ assert isinstance(result, SparseArray)
423
+ assert result.dtype.subtype == np.bool_
424
+
425
+ if isinstance(other, SparseArray):
426
+ fill_value = op(data_for_compare.fill_value, other.fill_value)
427
+ else:
428
+ fill_value = np.all(
429
+ op(np.asarray(data_for_compare.fill_value), np.asarray(other))
430
+ )
431
+
432
+ expected = SparseArray(
433
+ op(data_for_compare.to_dense(), np.asarray(other)),
434
+ fill_value=fill_value,
435
+ dtype=np.bool_,
436
+ )
437
+ tm.assert_sp_array_equal(result, expected)
438
+
439
+ def test_scalar(self, data_for_compare: SparseArray, comparison_op):
440
+ self._compare_other(data_for_compare, comparison_op, 0)
441
+ self._compare_other(data_for_compare, comparison_op, 1)
442
+ self._compare_other(data_for_compare, comparison_op, -1)
443
+ self._compare_other(data_for_compare, comparison_op, np.nan)
444
+
445
+ @pytest.mark.xfail(reason="Wrong indices")
446
+ def test_array(self, data_for_compare: SparseArray, comparison_op):
447
+ arr = np.linspace(-4, 5, 10)
448
+ self._compare_other(data_for_compare, comparison_op, arr)
449
+
450
+ @pytest.mark.xfail(reason="Wrong indices")
451
+ def test_sparse_array(self, data_for_compare: SparseArray, comparison_op):
452
+ arr = data_for_compare + 1
453
+ self._compare_other(data_for_compare, comparison_op, arr)
454
+ arr = data_for_compare * 2
455
+ self._compare_other(data_for_compare, comparison_op, arr)
456
+
457
+
458
+ class TestPrinting(BaseSparseTests, base.BasePrintingTests):
459
+ @pytest.mark.xfail(reason="Different repr")
460
+ def test_array_repr(self, data, size):
461
+ super().test_array_repr(data, size)
462
+
463
+
464
+ class TestParsing(BaseSparseTests, base.BaseParsingTests):
465
+ @pytest.mark.parametrize("engine", ["c", "python"])
466
+ def test_EA_types(self, engine, data):
467
+ expected_msg = r".*must implement _from_sequence_of_strings.*"
468
+ with pytest.raises(NotImplementedError, match=expected_msg):
469
+ super().test_EA_types(engine, data)
470
+
471
+
472
+ class TestNoNumericAccumulations(base.BaseAccumulateTests):
473
+ @pytest.mark.parametrize("skipna", [True, False])
474
+ def test_accumulate_series(self, data, all_numeric_accumulations, skipna):
475
+ pass
videochat2/lib/python3.10/site-packages/pandas/tests/extension/test_string.py ADDED
@@ -0,0 +1,282 @@
+ """
+ This file contains a minimal set of tests for compliance with the extension
+ array interface test suite, and should contain no other tests.
+ The test suite for the full functionality of the array is located in
+ `pandas/tests/arrays/`.
+
+ The tests in this file are inherited from the BaseExtensionTests, and only
+ minimal tweaks should be applied to get the tests passing (by overwriting a
+ parent method).
+
+ Additional tests should either be added to one of the BaseExtensionTests
+ classes (if they are relevant for the extension interface for all dtypes), or
+ be added to the array-specific tests in `pandas/tests/arrays/`.
+
+ """
+ import string
+
+ import numpy as np
+ import pytest
+
+ from pandas.errors import PerformanceWarning
+
+ import pandas as pd
+ import pandas._testing as tm
+ from pandas.api.types import is_string_dtype
+ from pandas.core.arrays import ArrowStringArray
+ from pandas.core.arrays.string_ import StringDtype
+ from pandas.tests.extension import base
+
+
+ def split_array(arr):
+     if arr.dtype.storage != "pyarrow":
+         pytest.skip("only applicable for pyarrow chunked array n/a")
+
+     def _split_array(arr):
+         import pyarrow as pa
+
+         arrow_array = arr._data
+         split = len(arrow_array) // 2
+         arrow_array = pa.chunked_array(
+             [*arrow_array[:split].chunks, *arrow_array[split:].chunks]
+         )
+         assert arrow_array.num_chunks == 2
+         return type(arr)(arrow_array)
+
+     return _split_array(arr)
+
+
+ @pytest.fixture(params=[True, False])
+ def chunked(request):
+     return request.param
+
+
+ @pytest.fixture
+ def dtype(string_storage):
+     return StringDtype(storage=string_storage)
+
+
+ @pytest.fixture
+ def data(dtype, chunked):
+     strings = np.random.choice(list(string.ascii_letters), size=100)
+     while strings[0] == strings[1]:
+         strings = np.random.choice(list(string.ascii_letters), size=100)
+
+     arr = dtype.construct_array_type()._from_sequence(strings)
+     return split_array(arr) if chunked else arr
+
+
+ @pytest.fixture
+ def data_missing(dtype, chunked):
+     """Length 2 array with [NA, Valid]"""
+     arr = dtype.construct_array_type()._from_sequence([pd.NA, "A"])
+     return split_array(arr) if chunked else arr
+
+
+ @pytest.fixture
+ def data_for_sorting(dtype, chunked):
+     arr = dtype.construct_array_type()._from_sequence(["B", "C", "A"])
+     return split_array(arr) if chunked else arr
+
+
+ @pytest.fixture
+ def data_missing_for_sorting(dtype, chunked):
+     arr = dtype.construct_array_type()._from_sequence(["B", pd.NA, "A"])
+     return split_array(arr) if chunked else arr
+
+
+ @pytest.fixture
+ def na_value():
+     return pd.NA
+
+
+ @pytest.fixture
+ def data_for_grouping(dtype, chunked):
+     arr = dtype.construct_array_type()._from_sequence(
+         ["B", "B", pd.NA, pd.NA, "A", "A", "B", "C"]
+     )
+     return split_array(arr) if chunked else arr
+
+
+ class TestDtype(base.BaseDtypeTests):
+     def test_eq_with_str(self, dtype):
+         assert dtype == f"string[{dtype.storage}]"
+         super().test_eq_with_str(dtype)
+
+     def test_is_not_string_type(self, dtype):
+         # Different from BaseDtypeTests.test_is_not_string_type
+         # because StringDtype is a string type
+         assert is_string_dtype(dtype)
+
+
+ class TestInterface(base.BaseInterfaceTests):
+     def test_view(self, data, request):
+         if data.dtype.storage == "pyarrow":
+             pytest.skip(reason="2D support not implemented for ArrowStringArray")
+         super().test_view(data)
+
+
+ class TestConstructors(base.BaseConstructorsTests):
+     def test_from_dtype(self, data):
+         # base test uses string representation of dtype
+         pass
+
+     def test_constructor_from_list(self):
+         # GH 27673
+         pytest.importorskip("pyarrow", minversion="1.0.0")
+         result = pd.Series(["E"], dtype=StringDtype(storage="pyarrow"))
+         assert isinstance(result.dtype, StringDtype)
+         assert result.dtype.storage == "pyarrow"
+
+
+ class TestReshaping(base.BaseReshapingTests):
+     def test_transpose(self, data, request):
+         if data.dtype.storage == "pyarrow":
+             pytest.skip(reason="2D support not implemented for ArrowStringArray")
+         super().test_transpose(data)
+
+
+ class TestGetitem(base.BaseGetitemTests):
+     pass
+
+
+ class TestSetitem(base.BaseSetitemTests):
+     def test_setitem_preserves_views(self, data, request):
+         if data.dtype.storage == "pyarrow":
+             pytest.skip(reason="2D support not implemented for ArrowStringArray")
+         super().test_setitem_preserves_views(data)
+
+
+ class TestIndex(base.BaseIndexTests):
+     pass
+
+
+ class TestMissing(base.BaseMissingTests):
+     def test_dropna_array(self, data_missing):
+         result = data_missing.dropna()
+         expected = data_missing[[1]]
+         self.assert_extension_array_equal(result, expected)
+
+     def test_fillna_no_op_returns_copy(self, data):
+         data = data[~data.isna()]
+
+         valid = data[0]
+         result = data.fillna(valid)
+         assert result is not data
+         self.assert_extension_array_equal(result, data)
+
+         with tm.maybe_produces_warning(
+             PerformanceWarning, data.dtype.storage == "pyarrow"
+         ):
+             result = data.fillna(method="backfill")
+         assert result is not data
+         self.assert_extension_array_equal(result, data)
+
+     def test_fillna_series_method(self, data_missing, fillna_method):
+         with tm.maybe_produces_warning(
+             PerformanceWarning,
+             fillna_method is not None and data_missing.dtype.storage == "pyarrow",
+             check_stacklevel=False,
+         ):
+             super().test_fillna_series_method(data_missing, fillna_method)
+
+
+ class TestNoReduce(base.BaseNoReduceTests):
+     @pytest.mark.parametrize("skipna", [True, False])
+     def test_reduce_series_numeric(self, data, all_numeric_reductions, skipna):
+         op_name = all_numeric_reductions
+
+         if op_name in ["min", "max"]:
+             return None
+
+         ser = pd.Series(data)
+         with pytest.raises(TypeError):
+             getattr(ser, op_name)(skipna=skipna)
+
+
+ class TestMethods(base.BaseMethodsTests):
+     def test_value_counts_with_normalize(self, data):
+         data = data[:10].unique()
+         values = np.array(data[~data.isna()])
+         ser = pd.Series(data, dtype=data.dtype)
+
+         result = ser.value_counts(normalize=True).sort_index()
+
+         expected = pd.Series(
+             [1 / len(values)] * len(values), index=result.index, name="proportion"
+         )
+         if getattr(data.dtype, "storage", "") == "pyarrow":
+             expected = expected.astype("double[pyarrow]")
+         else:
+             expected = expected.astype("Float64")
+
+         self.assert_series_equal(result, expected)
+
+
+ class TestCasting(base.BaseCastingTests):
+     pass
+
+
+ class TestComparisonOps(base.BaseComparisonOpsTests):
+     def _compare_other(self, ser, data, op, other):
+         op_name = f"__{op.__name__}__"
+         result = getattr(ser, op_name)(other)
+         dtype = "boolean[pyarrow]" if ser.dtype.storage == "pyarrow" else "boolean"
+         expected = getattr(ser.astype(object), op_name)(other).astype(dtype)
+         self.assert_series_equal(result, expected)
+
+     def test_compare_scalar(self, data, comparison_op):
+         ser = pd.Series(data)
+         self._compare_other(ser, data, comparison_op, "abc")
+
+
+ class TestParsing(base.BaseParsingTests):
+     pass
+
+
+ class TestPrinting(base.BasePrintingTests):
+     pass
+
+
+ class TestGroupBy(base.BaseGroupbyTests):
+     @pytest.mark.parametrize("as_index", [True, False])
+     def test_groupby_extension_agg(self, as_index, data_for_grouping):
+         df = pd.DataFrame({"A": [1, 1, 2, 2, 3, 3, 1, 4], "B": data_for_grouping})
+         result = df.groupby("B", as_index=as_index).A.mean()
+         _, uniques = pd.factorize(data_for_grouping, sort=True)
+
+         if as_index:
+             index = pd.Index(uniques, name="B")
+             expected = pd.Series([3.0, 1.0, 4.0], index=index, name="A")
+             self.assert_series_equal(result, expected)
+         else:
+             expected = pd.DataFrame({"B": uniques, "A": [3.0, 1.0, 4.0]})
+             self.assert_frame_equal(result, expected)
+
+     @pytest.mark.filterwarnings("ignore:Falling back:pandas.errors.PerformanceWarning")
+     def test_groupby_extension_apply(self, data_for_grouping, groupby_apply_op):
+         super().test_groupby_extension_apply(data_for_grouping, groupby_apply_op)
+
+
+ class Test2DCompat(base.Dim2CompatTests):
+     @pytest.fixture(autouse=True)
+     def arrow_not_supported(self, data, request):
+         if isinstance(data, ArrowStringArray):
+             pytest.skip(reason="2D support not implemented for ArrowStringArray")
+
+
+ def test_searchsorted_with_na_raises(data_for_sorting, as_series):
+     # GH50447
+     b, c, a = data_for_sorting
+     arr = data_for_sorting.take([2, 0, 1])  # to get [a, b, c]
+     arr[-1] = pd.NA
+
+     if as_series:
+         arr = pd.Series(arr)
+
+     msg = (
+         "searchsorted requires array to be sorted, "
+         "which is impossible with NAs present."
+     )
+     with pytest.raises(ValueError, match=msg):
+         arr.searchsorted(b)
videochat2/lib/python3.10/site-packages/pandas/tests/frame/__pycache__/__init__.cpython-310.pyc ADDED
Binary file (174 Bytes).
videochat2/lib/python3.10/site-packages/pandas/tests/frame/__pycache__/common.cpython-310.pyc ADDED
Binary file (2.28 kB).
videochat2/lib/python3.10/site-packages/pandas/tests/frame/__pycache__/conftest.cpython-310.pyc ADDED
Binary file (9.01 kB).
videochat2/lib/python3.10/site-packages/pandas/tests/frame/__pycache__/test_alter_axes.cpython-310.pyc ADDED
Binary file (1.22 kB).
videochat2/lib/python3.10/site-packages/pandas/tests/frame/__pycache__/test_api.cpython-310.pyc ADDED
Binary file (11.8 kB).
videochat2/lib/python3.10/site-packages/pandas/tests/frame/__pycache__/test_arithmetic.cpython-310.pyc ADDED
Binary file (57.7 kB).
videochat2/lib/python3.10/site-packages/pandas/tests/frame/__pycache__/test_block_internals.cpython-310.pyc ADDED
Binary file (11.3 kB).
videochat2/lib/python3.10/site-packages/pandas/tests/frame/__pycache__/test_cumulative.cpython-310.pyc ADDED
Binary file (2.42 kB).
videochat2/lib/python3.10/site-packages/pandas/tests/frame/__pycache__/test_iteration.cpython-310.pyc ADDED
Binary file (5.26 kB).
videochat2/lib/python3.10/site-packages/pandas/tests/frame/__pycache__/test_logical_ops.cpython-310.pyc ADDED
Binary file (4.83 kB).
videochat2/lib/python3.10/site-packages/pandas/tests/frame/__pycache__/test_nonunique_indexes.cpython-310.pyc ADDED
Binary file (10.1 kB).
videochat2/lib/python3.10/site-packages/pandas/tests/frame/__pycache__/test_npfuncs.cpython-310.pyc ADDED
Binary file (1.28 kB).
videochat2/lib/python3.10/site-packages/pandas/tests/frame/__pycache__/test_query_eval.cpython-310.pyc ADDED
Binary file (46.7 kB).
videochat2/lib/python3.10/site-packages/pandas/tests/frame/__pycache__/test_reductions.cpython-310.pyc ADDED
Binary file (47.8 kB).
videochat2/lib/python3.10/site-packages/pandas/tests/frame/__pycache__/test_repr_info.cpython-310.pyc ADDED
Binary file (12.9 kB).
videochat2/lib/python3.10/site-packages/pandas/tests/frame/__pycache__/test_stack_unstack.cpython-310.pyc ADDED
Binary file (59.5 kB).
videochat2/lib/python3.10/site-packages/pandas/tests/frame/__pycache__/test_subclass.cpython-310.pyc ADDED
Binary file (20.7 kB).
videochat2/lib/python3.10/site-packages/pandas/tests/frame/__pycache__/test_ufunc.cpython-310.pyc ADDED
Binary file (8.69 kB).
videochat2/lib/python3.10/site-packages/pandas/tests/frame/__pycache__/test_unary.cpython-310.pyc ADDED
Binary file (5.11 kB).
videochat2/lib/python3.10/site-packages/pandas/tests/frame/__pycache__/test_validate.cpython-310.pyc ADDED
Binary file (1.39 kB).
videochat2/lib/python3.10/site-packages/pandas/tests/frame/indexing/__pycache__/__init__.cpython-310.pyc ADDED
Binary file (183 Bytes).
videochat2/lib/python3.10/site-packages/pandas/tests/frame/indexing/__pycache__/test_coercion.cpython-310.pyc ADDED
Binary file (4.78 kB).
videochat2/lib/python3.10/site-packages/pandas/tests/frame/indexing/__pycache__/test_delitem.cpython-310.pyc ADDED
Binary file (2.09 kB).
videochat2/lib/python3.10/site-packages/pandas/tests/frame/indexing/__pycache__/test_get.cpython-310.pyc ADDED
Binary file (988 Bytes).
videochat2/lib/python3.10/site-packages/pandas/tests/frame/indexing/__pycache__/test_get_value.cpython-310.pyc ADDED
Binary file (1.14 kB).
videochat2/lib/python3.10/site-packages/pandas/tests/frame/indexing/__pycache__/test_getitem.cpython-310.pyc ADDED
Binary file (15 kB).
videochat2/lib/python3.10/site-packages/pandas/tests/frame/indexing/__pycache__/test_indexing.cpython-310.pyc ADDED
Binary file (53.2 kB).
videochat2/lib/python3.10/site-packages/pandas/tests/frame/indexing/__pycache__/test_insert.cpython-310.pyc ADDED
Binary file (3.99 kB).
videochat2/lib/python3.10/site-packages/pandas/tests/frame/indexing/__pycache__/test_mask.cpython-310.pyc ADDED
Binary file (5.38 kB).