ZTWHHH committed
Commit 9b098a8 · verified · 1 parent: 62f131f

Add files using upload-large-folder tool

This view is limited to 50 files because the commit contains too many changes.

Files changed (50)
  1. deepseek/lib/python3.10/site-packages/llvmlite-0.43.0.dist-info/METADATA +137 -0
  2. deepseek/lib/python3.10/site-packages/pandas/__pycache__/__init__.cpython-310.pyc +0 -0
  3. deepseek/lib/python3.10/site-packages/pandas/__pycache__/_typing.cpython-310.pyc +0 -0
  4. deepseek/lib/python3.10/site-packages/pandas/_config/__init__.py +57 -0
  5. deepseek/lib/python3.10/site-packages/pandas/_config/__pycache__/__init__.cpython-310.pyc +0 -0
  6. deepseek/lib/python3.10/site-packages/pandas/_config/__pycache__/display.cpython-310.pyc +0 -0
  7. deepseek/lib/python3.10/site-packages/pandas/_config/__pycache__/localization.cpython-310.pyc +0 -0
  8. deepseek/lib/python3.10/site-packages/pandas/_config/config.py +948 -0
  9. deepseek/lib/python3.10/site-packages/pandas/_config/dates.py +25 -0
  10. deepseek/lib/python3.10/site-packages/pandas/_config/localization.py +172 -0
  11. deepseek/lib/python3.10/site-packages/pandas/_testing/__pycache__/__init__.cpython-310.pyc +0 -0
  12. deepseek/lib/python3.10/site-packages/pandas/_testing/__pycache__/compat.cpython-310.pyc +0 -0
  13. deepseek/lib/python3.10/site-packages/pandas/_testing/__pycache__/contexts.cpython-310.pyc +0 -0
  14. deepseek/lib/python3.10/site-packages/pandas/_testing/_warnings.py +232 -0
  15. deepseek/lib/python3.10/site-packages/pandas/api/__pycache__/__init__.cpython-310.pyc +0 -0
  16. deepseek/lib/python3.10/site-packages/pandas/api/extensions/__init__.py +33 -0
  17. deepseek/lib/python3.10/site-packages/pandas/api/extensions/__pycache__/__init__.cpython-310.pyc +0 -0
  18. deepseek/lib/python3.10/site-packages/pandas/api/indexers/__init__.py +17 -0
  19. deepseek/lib/python3.10/site-packages/pandas/api/interchange/__pycache__/__init__.cpython-310.pyc +0 -0
  20. deepseek/lib/python3.10/site-packages/pandas/api/typing/__pycache__/__init__.cpython-310.pyc +0 -0
  21. deepseek/lib/python3.10/site-packages/pandas/compat/__init__.py +199 -0
  22. deepseek/lib/python3.10/site-packages/pandas/compat/__pycache__/_constants.cpython-310.pyc +0 -0
  23. deepseek/lib/python3.10/site-packages/pandas/compat/__pycache__/pyarrow.cpython-310.pyc +0 -0
  24. deepseek/lib/python3.10/site-packages/pandas/compat/compressors.py +77 -0
  25. deepseek/lib/python3.10/site-packages/pandas/compat/numpy/__pycache__/function.cpython-310.pyc +0 -0
  26. deepseek/lib/python3.10/site-packages/pandas/compat/numpy/function.py +418 -0
  27. deepseek/lib/python3.10/site-packages/pandas/compat/pyarrow.py +29 -0
  28. deepseek/lib/python3.10/site-packages/pandas/plotting/_matplotlib/__init__.py +93 -0
  29. deepseek/lib/python3.10/site-packages/pandas/plotting/_matplotlib/__pycache__/hist.cpython-310.pyc +0 -0
  30. deepseek/lib/python3.10/site-packages/pandas/plotting/_matplotlib/__pycache__/timeseries.cpython-310.pyc +0 -0
  31. deepseek/lib/python3.10/site-packages/pandas/plotting/_matplotlib/__pycache__/tools.cpython-310.pyc +0 -0
  32. deepseek/lib/python3.10/site-packages/pandas/plotting/_matplotlib/converter.py +1139 -0
  33. deepseek/lib/python3.10/site-packages/pandas/plotting/_matplotlib/style.py +278 -0
  34. deepseek/lib/python3.10/site-packages/pandas/plotting/_matplotlib/timeseries.py +370 -0
  35. deepseek/lib/python3.10/site-packages/pandas/tests/computation/__init__.py +0 -0
  36. deepseek/lib/python3.10/site-packages/pandas/tests/computation/__pycache__/__init__.cpython-310.pyc +0 -0
  37. deepseek/lib/python3.10/site-packages/pandas/tests/computation/__pycache__/test_compat.cpython-310.pyc +0 -0
  38. deepseek/lib/python3.10/site-packages/pandas/tests/computation/__pycache__/test_eval.cpython-310.pyc +0 -0
  39. deepseek/lib/python3.10/site-packages/pandas/tests/computation/test_compat.py +32 -0
  40. deepseek/lib/python3.10/site-packages/pandas/tests/computation/test_eval.py +2001 -0
  41. deepseek/lib/python3.10/site-packages/pandas/tests/series/__pycache__/test_api.cpython-310.pyc +0 -0
  42. deepseek/lib/python3.10/site-packages/pandas/tests/series/__pycache__/test_constructors.cpython-310.pyc +0 -0
  43. deepseek/lib/python3.10/site-packages/pandas/tests/series/__pycache__/test_missing.cpython-310.pyc +0 -0
  44. deepseek/lib/python3.10/site-packages/pandas/tests/series/__pycache__/test_reductions.cpython-310.pyc +0 -0
  45. deepseek/lib/python3.10/site-packages/pandas/tests/series/__pycache__/test_unary.cpython-310.pyc +0 -0
  46. deepseek/lib/python3.10/site-packages/rpds_py-0.22.3.dist-info/REQUESTED +0 -0
  47. deepseek/lib/python3.10/site-packages/rpds_py-0.22.3.dist-info/WHEEL +4 -0
  48. deepseekvl2/lib/python3.10/site-packages/torch/utils/__pycache__/_contextlib.cpython-310.pyc +0 -0
  49. deepseekvl2/lib/python3.10/site-packages/torch/utils/__pycache__/_cuda_trace.cpython-310.pyc +0 -0
  50. deepseekvl2/lib/python3.10/site-packages/torch/utils/__pycache__/_pytree.cpython-310.pyc +0 -0
deepseek/lib/python3.10/site-packages/llvmlite-0.43.0.dist-info/METADATA ADDED
@@ -0,0 +1,137 @@
+ Metadata-Version: 2.1
+ Name: llvmlite
+ Version: 0.43.0
+ Summary: lightweight wrapper around basic LLVM functionality
+ Home-page: http://llvmlite.readthedocs.io
+ License: BSD
+ Project-URL: Source, https://github.com/numba/llvmlite
+ Classifier: Development Status :: 4 - Beta
+ Classifier: Intended Audience :: Developers
+ Classifier: Operating System :: OS Independent
+ Classifier: Programming Language :: Python
+ Classifier: Programming Language :: Python :: 3
+ Classifier: Programming Language :: Python :: 3.9
+ Classifier: Programming Language :: Python :: 3.10
+ Classifier: Programming Language :: Python :: 3.11
+ Classifier: Programming Language :: Python :: 3.12
+ Classifier: Topic :: Software Development :: Code Generators
+ Classifier: Topic :: Software Development :: Compilers
+ Requires-Python: >=3.9
+ License-File: LICENSE
+ License-File: LICENSE.thirdparty
+
+ ========
+ llvmlite
+ ========
+
+ .. image:: https://dev.azure.com/numba/numba/_apis/build/status/numba.llvmlite?branchName=main
+    :target: https://dev.azure.com/numba/numba/_build/latest?definitionId=2&branchName=main
+    :alt: Azure Pipelines
+ .. image:: https://codeclimate.com/github/numba/llvmlite/badges/gpa.svg
+    :target: https://codeclimate.com/github/numba/llvmlite
+    :alt: Code Climate
+ .. image:: https://coveralls.io/repos/github/numba/llvmlite/badge.svg
+    :target: https://coveralls.io/github/numba/llvmlite
+    :alt: Coveralls.io
+ .. image:: https://readthedocs.org/projects/llvmlite/badge/
+    :target: https://llvmlite.readthedocs.io
+    :alt: Readthedocs.io
+
+ A Lightweight LLVM Python Binding for Writing JIT Compilers
+ -----------------------------------------------------------
+
+ .. _llvmpy: https://github.com/llvmpy/llvmpy
+
+ llvmlite is a project originally tailored for Numba_'s needs, using the
+ following approach:
+
+ * A small C wrapper around the parts of the LLVM C++ API we need that are
+   not already exposed by the LLVM C API.
+ * A ctypes Python wrapper around the C API.
+ * A pure Python implementation of the subset of the LLVM IR builder that we
+   need for Numba.
+
+ Why llvmlite
+ ============
+
+ The old llvmpy_ binding exposes a lot of LLVM APIs, but the mapping of
+ C++-style memory management to Python is error-prone. Numba_ and many JIT
+ compilers do not need a full LLVM API. Only the IR builder, optimizer,
+ and JIT compiler APIs are necessary.
+
+ Key Benefits
+ ============
+
+ * The IR builder is pure Python code and decoupled from LLVM's
+   frequently-changing C++ APIs.
+ * Materializing an LLVM module calls LLVM's IR parser, which provides
+   better error messages than step-by-step IR building through the C++
+   API (no more segfaults or process aborts).
+ * Most of llvmlite uses the LLVM C API, which is small but very stable
+   (low maintenance when changing LLVM version).
+ * The binding is not a Python C-extension, but a plain DLL accessed using
+   ctypes (no need to wrestle with Python's compiler requirements and C++ 11
+   compatibility).
+ * The Python binding layer has sane memory management.
+ * llvmlite is faster than llvmpy thanks to a much simpler architecture
+   (the Numba_ test suite runs twice as fast as it did).
+
+ Compatibility
+ =============
+
+ llvmlite has been tested with Python 3.9 -- 3.12 and is likely to work with
+ later versions.
+
+ As of version 0.41.0, llvmlite requires LLVM 14.x.x on all architectures.
+
+ Historical compatibility table:
+
+ ================= ========================
+ llvmlite versions compatible LLVM versions
+ ================= ========================
+ 0.41.0 - ...      14.x.x
+ 0.40.0 - 0.40.1   11.x.x and 14.x.x (12.x.x and 13.x.x untested but may work)
+ 0.37.0 - 0.39.1   11.x.x
+ 0.34.0 - 0.36.0   10.0.x (9.0.x for ``aarch64`` only)
+ 0.33.0            9.0.x
+ 0.29.0 - 0.32.0   7.0.x, 7.1.x, 8.0.x
+ 0.27.0 - 0.28.0   7.0.x
+ 0.23.0 - 0.26.0   6.0.x
+ 0.21.0 - 0.22.0   5.0.x
+ 0.17.0 - 0.20.0   4.0.x
+ 0.16.0 - 0.17.0   3.9.x
+ 0.13.0 - 0.15.0   3.8.x
+ 0.9.0 - 0.12.1    3.7.x
+ 0.6.0 - 0.8.0     3.6.x
+ 0.1.0 - 0.5.1     3.5.x
+ ================= ========================
+
+ Documentation
+ =============
+
+ You'll find the documentation at http://llvmlite.pydata.org
+
+
+ Pre-built binaries
+ ==================
+
+ We recommend you use the binaries provided by the Numba_ team for
+ the Conda_ package manager. You can find them in Numba's `anaconda.org
+ channel <https://anaconda.org/numba>`_. For example::
+
+     $ conda install --channel=numba llvmlite
+
+ (or, simply, the official llvmlite package provided in the Anaconda_
+ distribution)
+
+ .. _Numba: http://numba.pydata.org/
+ .. _Conda: http://conda.pydata.org/
+ .. _Anaconda: http://docs.continuum.io/anaconda/index.html
+
+
+ Other build methods
+ ===================
+
+ If you don't want to use our pre-built packages, you can compile
+ and install llvmlite yourself. The documentation will teach you how:
+ http://llvmlite.pydata.org/en/latest/install/index.html
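The README's point about shipping a plain shared library wrapped with ctypes (rather than a Python C-extension) can be illustrated with the standard library alone. This is a minimal sketch of the pattern, not llvmlite's actual binding code, and it assumes a POSIX system where libc symbols are visible in the current process:

```python
import ctypes

# Load symbols from the current process; on POSIX systems this exposes libc.
# llvmlite applies the same idea to its own shared library, wrapping the
# exported C functions with ctypes instead of compiling a C-extension.
libc = ctypes.CDLL(None)

# Declare the C signature before calling, so ctypes converts arguments safely.
libc.labs.restype = ctypes.c_long
libc.labs.argtypes = [ctypes.c_long]

print(libc.labs(-42))  # 42
```

Because no extension module is compiled, the Python side needs no compiler at install time; only the pre-built shared library and a matching set of ctypes declarations.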
deepseek/lib/python3.10/site-packages/pandas/__pycache__/__init__.cpython-310.pyc ADDED
Binary file (6.95 kB).
 
deepseek/lib/python3.10/site-packages/pandas/__pycache__/_typing.cpython-310.pyc ADDED
Binary file (11.5 kB).
 
deepseek/lib/python3.10/site-packages/pandas/_config/__init__.py ADDED
@@ -0,0 +1,57 @@
+ """
+ pandas._config is considered explicitly upstream of everything else in pandas,
+ and should have no intra-pandas dependencies.
+
+ Importing `dates` and `display` ensures that keys needed by _libs
+ are initialized.
+ """
+ __all__ = [
+     "config",
+     "detect_console_encoding",
+     "get_option",
+     "set_option",
+     "reset_option",
+     "describe_option",
+     "option_context",
+     "options",
+     "using_copy_on_write",
+     "warn_copy_on_write",
+ ]
+ from pandas._config import config
+ from pandas._config import dates  # pyright: ignore[reportUnusedImport] # noqa: F401
+ from pandas._config.config import (
+     _global_config,
+     describe_option,
+     get_option,
+     option_context,
+     options,
+     reset_option,
+     set_option,
+ )
+ from pandas._config.display import detect_console_encoding
+
+
+ def using_copy_on_write() -> bool:
+     _mode_options = _global_config["mode"]
+     return (
+         _mode_options["copy_on_write"] is True
+         and _mode_options["data_manager"] == "block"
+     )
+
+
+ def warn_copy_on_write() -> bool:
+     _mode_options = _global_config["mode"]
+     return (
+         _mode_options["copy_on_write"] == "warn"
+         and _mode_options["data_manager"] == "block"
+     )
+
+
+ def using_nullable_dtypes() -> bool:
+     _mode_options = _global_config["mode"]
+     return _mode_options["nullable_dtypes"]
+
+
+ def using_pyarrow_string_dtype() -> bool:
+     _mode_options = _global_config["future"]
+     return _mode_options["infer_string"]
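The copy-on-write helpers in this file just read flags out of the nested `_global_config` dict. A standalone sketch of the same logic (using a hypothetical config dict, not pandas' real one) shows why the `is True` comparison matters: the `"warn"` setting is a truthy string but must not count as copy-on-write being fully enabled:

```python
# Hypothetical stand-in for pandas' _global_config: a nested dict of mode flags.
_demo_config = {"mode": {"copy_on_write": "warn", "data_manager": "block"}}

def using_copy_on_write(config=_demo_config) -> bool:
    # `is True` deliberately rejects the truthy string "warn".
    mode = config["mode"]
    return mode["copy_on_write"] is True and mode["data_manager"] == "block"

def warn_copy_on_write(config=_demo_config) -> bool:
    # The warn-only mode is matched by string equality instead.
    mode = config["mode"]
    return mode["copy_on_write"] == "warn" and mode["data_manager"] == "block"

print(using_copy_on_write(), warn_copy_on_write())  # False True
```

With `copy_on_write` set to `"warn"`, only the warning path is active; a plain truthiness check would incorrectly report both as enabled.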
deepseek/lib/python3.10/site-packages/pandas/_config/__pycache__/__init__.cpython-310.pyc ADDED
Binary file (1.5 kB).
 
deepseek/lib/python3.10/site-packages/pandas/_config/__pycache__/display.cpython-310.pyc ADDED
Binary file (1.38 kB).
 
deepseek/lib/python3.10/site-packages/pandas/_config/__pycache__/localization.cpython-310.pyc ADDED
Binary file (4.82 kB).
 
deepseek/lib/python3.10/site-packages/pandas/_config/config.py ADDED
@@ -0,0 +1,948 @@
+ """
+ The config module holds package-wide configurables and provides
+ a uniform API for working with them.
+
+ Overview
+ ========
+
+ This module supports the following requirements:
+ - options are referenced using keys in dot.notation, e.g. "x.y.option_z".
+ - keys are case-insensitive.
+ - functions should accept partial/regex keys, when unambiguous.
+ - options can be registered by modules at import time.
+ - options can be registered at init-time (via core.config_init)
+ - options have a default value, and (optionally) a description and
+   validation function associated with them.
+ - options can be deprecated, in which case referencing them
+   should produce a warning.
+ - deprecated options can optionally be rerouted to a replacement
+   so that accessing a deprecated option reroutes to a differently
+   named option.
+ - options can be reset to their default value.
+ - all options can be reset to their default value at once.
+ - all options in a certain sub-namespace can be reset at once.
+ - the user can set / get / reset or ask for the description of an option.
+ - a developer can register and mark an option as deprecated.
+ - you can register a callback to be invoked when the option value
+   is set or reset. Changing the stored value is considered misuse, but
+   is not verboten.
+
+ Implementation
+ ==============
+
+ - Data is stored using nested dictionaries, and should be accessed
+   through the provided API.
+
+ - "Registered options" and "Deprecated options" have metadata associated
+   with them, which are stored in auxiliary dictionaries keyed on the
+   fully-qualified key, e.g. "x.y.z.option".
+
+ - the config_init module is imported by the package's __init__.py file.
+   placing any register_option() calls there will ensure those options
+   are available as soon as pandas is loaded. If you use register_option
+   in a module, it will only be available after that module is imported,
+   which you should be aware of.
+
+ - `config_prefix` is a context_manager (for use with the `with` keyword)
+   which can save developers some typing, see the docstring.
+
+ """
+
+ from __future__ import annotations
+
+ from contextlib import (
+     ContextDecorator,
+     contextmanager,
+ )
+ import re
+ from typing import (
+     TYPE_CHECKING,
+     Any,
+     Callable,
+     Generic,
+     NamedTuple,
+     cast,
+ )
+ import warnings
+
+ from pandas._typing import (
+     F,
+     T,
+ )
+ from pandas.util._exceptions import find_stack_level
+
+ if TYPE_CHECKING:
+     from collections.abc import (
+         Generator,
+         Iterable,
+     )
+
+
+ class DeprecatedOption(NamedTuple):
+     key: str
+     msg: str | None
+     rkey: str | None
+     removal_ver: str | None
+
+
+ class RegisteredOption(NamedTuple):
+     key: str
+     defval: object
+     doc: str
+     validator: Callable[[object], Any] | None
+     cb: Callable[[str], Any] | None
+
+
+ # holds deprecated option metadata
+ _deprecated_options: dict[str, DeprecatedOption] = {}
+
+ # holds registered option metadata
+ _registered_options: dict[str, RegisteredOption] = {}
+
+ # holds the current values for registered options
+ _global_config: dict[str, Any] = {}
+
+ # keys which have a special meaning
+ _reserved_keys: list[str] = ["all"]
+
+
+ class OptionError(AttributeError, KeyError):
+     """
+     Exception raised for pandas.options.
+
+     Backwards compatible with KeyError checks.
+
+     Examples
+     --------
+     >>> pd.options.context
+     Traceback (most recent call last):
+     OptionError: No such option
+     """
+
+
+ #
+ # User API
+
+
+ def _get_single_key(pat: str, silent: bool) -> str:
+     keys = _select_options(pat)
+     if len(keys) == 0:
+         if not silent:
+             _warn_if_deprecated(pat)
+         raise OptionError(f"No such key(s): {repr(pat)}")
+     if len(keys) > 1:
+         raise OptionError("Pattern matched multiple keys")
+     key = keys[0]
+
+     if not silent:
+         _warn_if_deprecated(key)
+
+     key = _translate_key(key)
+
+     return key
+
+
+ def _get_option(pat: str, silent: bool = False) -> Any:
+     key = _get_single_key(pat, silent)
+
+     # walk the nested dict
+     root, k = _get_root(key)
+     return root[k]
+
+
+ def _set_option(*args, **kwargs) -> None:
+     # must have at least one (pat, value) pair; deal with constraints later
+     nargs = len(args)
+     if not nargs or nargs % 2 != 0:
+         raise ValueError("Must provide an even number of non-keyword arguments")
+
+     # default to false
+     silent = kwargs.pop("silent", False)
+
+     if kwargs:
+         kwarg = next(iter(kwargs.keys()))
+         raise TypeError(f'_set_option() got an unexpected keyword argument "{kwarg}"')
+
+     for k, v in zip(args[::2], args[1::2]):
+         key = _get_single_key(k, silent)
+
+         o = _get_registered_option(key)
+         if o and o.validator:
+             o.validator(v)
+
+         # walk the nested dict
+         root, k_root = _get_root(key)
+         root[k_root] = v
+
+         if o.cb:
+             if silent:
+                 with warnings.catch_warnings(record=True):
+                     o.cb(key)
+             else:
+                 o.cb(key)
+
+
+ def _describe_option(pat: str = "", _print_desc: bool = True) -> str | None:
+     keys = _select_options(pat)
+     if len(keys) == 0:
+         raise OptionError("No such key(s)")
+
+     s = "\n".join([_build_option_description(k) for k in keys])
+
+     if _print_desc:
+         print(s)
+         return None
+     return s
+
+
+ def _reset_option(pat: str, silent: bool = False) -> None:
+     keys = _select_options(pat)
+
+     if len(keys) == 0:
+         raise OptionError("No such key(s)")
+
+     if len(keys) > 1 and len(pat) < 4 and pat != "all":
+         raise ValueError(
+             "You must specify at least 4 characters when "
+             "resetting multiple keys, use the special keyword "
+             '"all" to reset all the options to their default value'
+         )
+
+     for k in keys:
+         _set_option(k, _registered_options[k].defval, silent=silent)
+
+
+ def get_default_val(pat: str):
+     key = _get_single_key(pat, silent=True)
+     return _get_registered_option(key).defval
+
+
+ class DictWrapper:
+     """provide attribute-style access to a nested dict"""
+
+     d: dict[str, Any]
+
+     def __init__(self, d: dict[str, Any], prefix: str = "") -> None:
+         object.__setattr__(self, "d", d)
+         object.__setattr__(self, "prefix", prefix)
+
+     def __setattr__(self, key: str, val: Any) -> None:
+         prefix = object.__getattribute__(self, "prefix")
+         if prefix:
+             prefix += "."
+         prefix += key
+         # you can't set new keys
+         # and you can't overwrite subtrees
+         if key in self.d and not isinstance(self.d[key], dict):
+             _set_option(prefix, val)
+         else:
+             raise OptionError("You can only set the value of existing options")
+
+     def __getattr__(self, key: str):
+         prefix = object.__getattribute__(self, "prefix")
+         if prefix:
+             prefix += "."
+         prefix += key
+         try:
+             v = object.__getattribute__(self, "d")[key]
+         except KeyError as err:
+             raise OptionError("No such option") from err
+         if isinstance(v, dict):
+             return DictWrapper(v, prefix)
+         else:
+             return _get_option(prefix)
+
+     def __dir__(self) -> list[str]:
+         return list(self.d.keys())
+
+
+ # For user convenience, we'd like to have the available options described
+ # in the docstring. For dev convenience we'd like to generate the docstrings
+ # dynamically instead of maintaining them by hand. To this end, we use the
+ # class below which wraps functions inside a callable, and converts
+ # __doc__ into a property function. The docstrings below are templates
+ # using the py2.6+ advanced formatting syntax to plug in a concise list
+ # of options, and option descriptions.
+
+
+ class CallableDynamicDoc(Generic[T]):
+     def __init__(self, func: Callable[..., T], doc_tmpl: str) -> None:
+         self.__doc_tmpl__ = doc_tmpl
+         self.__func__ = func
+
+     def __call__(self, *args, **kwds) -> T:
+         return self.__func__(*args, **kwds)
+
+     # error: Signature of "__doc__" incompatible with supertype "object"
+     @property
+     def __doc__(self) -> str:  # type: ignore[override]
+         opts_desc = _describe_option("all", _print_desc=False)
+         opts_list = pp_options_list(list(_registered_options.keys()))
+         return self.__doc_tmpl__.format(opts_desc=opts_desc, opts_list=opts_list)
+
+
+ _get_option_tmpl = """
+ get_option(pat)
+
+ Retrieves the value of the specified option.
+
+ Available options:
+
+ {opts_list}
+
+ Parameters
+ ----------
+ pat : str
+     Regexp which should match a single option.
+     Note: partial matches are supported for convenience, but unless you use the
+     full option name (e.g. x.y.z.option_name), your code may break in future
+     versions if new options with similar names are introduced.
+
+ Returns
+ -------
+ result : the value of the option
+
+ Raises
+ ------
+ OptionError : if no such option exists
+
+ Notes
+ -----
+ Please reference the :ref:`User Guide <options>` for more information.
+
+ The available options with their descriptions:
+
+ {opts_desc}
+
+ Examples
+ --------
+ >>> pd.get_option('display.max_columns')  # doctest: +SKIP
+ 4
+ """
+
+ _set_option_tmpl = """
+ set_option(pat, value)
+
+ Sets the value of the specified option.
+
+ Available options:
+
+ {opts_list}
+
+ Parameters
+ ----------
+ pat : str
+     Regexp which should match a single option.
+     Note: partial matches are supported for convenience, but unless you use the
+     full option name (e.g. x.y.z.option_name), your code may break in future
+     versions if new options with similar names are introduced.
+ value : object
+     New value of option.
+
+ Returns
+ -------
+ None
+
+ Raises
+ ------
+ OptionError if no such option exists
+
+ Notes
+ -----
+ Please reference the :ref:`User Guide <options>` for more information.
+
+ The available options with their descriptions:
+
+ {opts_desc}
+
+ Examples
+ --------
+ >>> pd.set_option('display.max_columns', 4)
+ >>> df = pd.DataFrame([[1, 2, 3, 4, 5], [6, 7, 8, 9, 10]])
+ >>> df
+    0  1  ...  3   4
+ 0  1  2  ...  4   5
+ 1  6  7  ...  9  10
+ [2 rows x 5 columns]
+ >>> pd.reset_option('display.max_columns')
+ """
+
+ _describe_option_tmpl = """
+ describe_option(pat, _print_desc=False)
+
+ Prints the description for one or more registered options.
+
+ Call with no arguments to get a listing for all registered options.
+
+ Available options:
+
+ {opts_list}
+
+ Parameters
+ ----------
+ pat : str
+     Regexp pattern. All matching keys will have their description displayed.
+ _print_desc : bool, default True
+     If True (default) the description(s) will be printed to stdout.
+     Otherwise, the description(s) will be returned as a unicode string
+     (for testing).
+
+ Returns
+ -------
+ None by default, the description(s) as a unicode string if _print_desc
+ is False
+
+ Notes
+ -----
+ Please reference the :ref:`User Guide <options>` for more information.
+
+ The available options with their descriptions:
+
+ {opts_desc}
+
+ Examples
+ --------
+ >>> pd.describe_option('display.max_columns')  # doctest: +SKIP
+ display.max_columns : int
+     If max_cols is exceeded, switch to truncate view...
+ """
+
+ _reset_option_tmpl = """
+ reset_option(pat)
+
+ Reset one or more options to their default value.
+
+ Pass "all" as argument to reset all options.
+
+ Available options:
+
+ {opts_list}
+
+ Parameters
+ ----------
+ pat : str/regex
+     If specified only options matching `prefix*` will be reset.
+     Note: partial matches are supported for convenience, but unless you
+     use the full option name (e.g. x.y.z.option_name), your code may break
+     in future versions if new options with similar names are introduced.
+
+ Returns
+ -------
+ None
+
+ Notes
+ -----
+ Please reference the :ref:`User Guide <options>` for more information.
+
+ The available options with their descriptions:
+
+ {opts_desc}
+
+ Examples
+ --------
+ >>> pd.reset_option('display.max_columns')  # doctest: +SKIP
+ """
+
+ # bind the functions with their docstrings into a Callable
+ # and use that as the functions exposed in pd.api
+ get_option = CallableDynamicDoc(_get_option, _get_option_tmpl)
+ set_option = CallableDynamicDoc(_set_option, _set_option_tmpl)
+ reset_option = CallableDynamicDoc(_reset_option, _reset_option_tmpl)
+ describe_option = CallableDynamicDoc(_describe_option, _describe_option_tmpl)
+ options = DictWrapper(_global_config)
+
+ #
+ # Functions for use by pandas developers, in addition to the user API
+
+
+ class option_context(ContextDecorator):
+     """
+     Context manager to temporarily set options in the `with` statement context.
+
+     You need to invoke as ``option_context(pat, val, [(pat, val), ...])``.
+
+     Examples
+     --------
+     >>> from pandas import option_context
+     >>> with option_context('display.max_rows', 10, 'display.max_columns', 5):
+     ...     pass
+     """
+
+     def __init__(self, *args) -> None:
+         if len(args) % 2 != 0 or len(args) < 2:
+             raise ValueError(
+                 "Need to invoke as option_context(pat, val, [(pat, val), ...])."
+             )
+
+         self.ops = list(zip(args[::2], args[1::2]))
+
+     def __enter__(self) -> None:
+         self.undo = [(pat, _get_option(pat)) for pat, val in self.ops]
+
+         for pat, val in self.ops:
+             _set_option(pat, val, silent=True)
+
+     def __exit__(self, *args) -> None:
+         if self.undo:
+             for pat, val in self.undo:
+                 _set_option(pat, val, silent=True)
+
+
+ def register_option(
+     key: str,
+     defval: object,
+     doc: str = "",
+     validator: Callable[[object], Any] | None = None,
+     cb: Callable[[str], Any] | None = None,
+ ) -> None:
+     """
+     Register an option in the package-wide pandas config object
+
+     Parameters
+     ----------
+     key : str
+         Fully-qualified key, e.g. "x.y.option_z".
+     defval : object
+         Default value of the option.
+     doc : str
+         Description of the option.
+     validator : Callable, optional
+         Function of a single argument; should raise `ValueError` if
+         called with a value which is not a legal value for the option.
+     cb
+         a function of a single argument "key", which is called
+         immediately after an option value is set/reset. key is
+         the full name of the option.
+
+     Raises
+     ------
+     ValueError if `validator` is specified and `defval` is not a valid value.
+
+     """
+     import keyword
+     import tokenize
+
+     key = key.lower()
+
+     if key in _registered_options:
+         raise OptionError(f"Option '{key}' has already been registered")
+     if key in _reserved_keys:
+         raise OptionError(f"Option '{key}' is a reserved key")
+
+     # the default value should be legal
+     if validator:
+         validator(defval)
+
+     # walk the nested dict, creating dicts as needed along the path
+     path = key.split(".")
+
+     for k in path:
+         if not re.match("^" + tokenize.Name + "$", k):
+             raise ValueError(f"{k} is not a valid identifier")
+         if keyword.iskeyword(k):
+             raise ValueError(f"{k} is a python keyword")
+
+     cursor = _global_config
+     msg = "Path prefix to option '{option}' is already an option"
+
+     for i, p in enumerate(path[:-1]):
+         if not isinstance(cursor, dict):
+             raise OptionError(msg.format(option=".".join(path[:i])))
+         if p not in cursor:
+             cursor[p] = {}
+         cursor = cursor[p]
+
+     if not isinstance(cursor, dict):
+         raise OptionError(msg.format(option=".".join(path[:-1])))
+
+     cursor[path[-1]] = defval  # initialize
+
+     # save the option metadata
+     _registered_options[key] = RegisteredOption(
+         key=key, defval=defval, doc=doc, validator=validator, cb=cb
+     )
+
+
+ def deprecate_option(
+     key: str,
+     msg: str | None = None,
+     rkey: str | None = None,
+     removal_ver: str | None = None,
+ ) -> None:
+     """
+     Mark option `key` as deprecated; if code attempts to access this option,
+     a warning will be produced, using `msg` if given, or a default message
+     if not.
+     If `rkey` is given, any access to the key will be re-routed to `rkey`.
+
+     Neither the existence of `key` nor that of `rkey` is checked. If they
+     do not exist, any subsequent access will fail as usual, after the
+     deprecation warning is given.
+
+     Parameters
+     ----------
+     key : str
+         Name of the option to be deprecated.
+         Must be a fully-qualified option name (e.g. "x.y.z.rkey").
+     msg : str, optional
+         Warning message to output when the key is referenced.
+         If no message is given a default message will be emitted.
+     rkey : str, optional
+         Name of an option to reroute access to.
+         If specified, any referenced `key` will be
+         re-routed to `rkey` including set/get/reset.
+         rkey must be a fully-qualified option name (e.g. "x.y.z.rkey").
+         Used by the default message if no `msg` is specified.
+     removal_ver : str, optional
+         Specifies the version in which this option will
+         be removed. Used by the default message if no `msg` is specified.
+
+     Raises
+     ------
+     OptionError
+         If the specified key has already been deprecated.
+     """
+     key = key.lower()
+
+     if key in _deprecated_options:
+         raise OptionError(f"Option '{key}' has already been defined as deprecated.")
+
+     _deprecated_options[key] = DeprecatedOption(key, msg, rkey, removal_ver)
+
+
+ #
+ # functions internal to the module
+
+
+ def _select_options(pat: str) -> list[str]:
+     """
+     returns a list of keys matching `pat`
+
621
+ if pat=="all", returns all registered options
622
+ """
623
+ # short-circuit for exact key
624
+ if pat in _registered_options:
625
+ return [pat]
626
+
627
+ # else look through all of them
628
+ keys = sorted(_registered_options.keys())
629
+ if pat == "all": # reserved key
630
+ return keys
631
+
632
+ return [k for k in keys if re.search(pat, k, re.I)]
633
+
634
+
635
+ def _get_root(key: str) -> tuple[dict[str, Any], str]:
636
+ path = key.split(".")
637
+ cursor = _global_config
638
+ for p in path[:-1]:
639
+ cursor = cursor[p]
640
+ return cursor, path[-1]
641
+
642
+
643
+ def _is_deprecated(key: str) -> bool:
644
+ """Returns True if the given option has been deprecated"""
645
+ key = key.lower()
646
+ return key in _deprecated_options
647
+
648
+
649
+ def _get_deprecated_option(key: str):
650
+ """
651
+ Retrieves the metadata for a deprecated option, if `key` is deprecated.
652
+
653
+ Returns
654
+ -------
655
+ DeprecatedOption (namedtuple) if key is deprecated, None otherwise
656
+ """
657
+ try:
658
+ d = _deprecated_options[key]
659
+ except KeyError:
660
+ return None
661
+ else:
662
+ return d
663
+
664
+
665
+ def _get_registered_option(key: str):
666
+ """
667
+ Retrieves the option metadata if `key` is a registered option.
668
+
669
+ Returns
670
+ -------
671
+ RegisteredOption (namedtuple) if key is deprecated, None otherwise
672
+ """
673
+ return _registered_options.get(key)
674
+
675
+
676
+ def _translate_key(key: str) -> str:
677
+ """
678
+ if key id deprecated and a replacement key defined, will return the
679
+ replacement key, otherwise returns `key` as - is
680
+ """
681
+ d = _get_deprecated_option(key)
682
+ if d:
683
+ return d.rkey or key
684
+ else:
685
+ return key
686
+
687
+
688
+ def _warn_if_deprecated(key: str) -> bool:
689
+ """
690
+ Checks if `key` is a deprecated option and if so, prints a warning.
691
+
692
+ Returns
693
+ -------
694
+ bool - True if `key` is deprecated, False otherwise.
695
+ """
696
+ d = _get_deprecated_option(key)
697
+ if d:
698
+ if d.msg:
699
+ warnings.warn(
700
+ d.msg,
701
+ FutureWarning,
702
+ stacklevel=find_stack_level(),
703
+ )
704
+ else:
705
+ msg = f"'{key}' is deprecated"
706
+ if d.removal_ver:
707
+ msg += f" and will be removed in {d.removal_ver}"
708
+ if d.rkey:
709
+ msg += f", please use '{d.rkey}' instead."
710
+ else:
711
+ msg += ", please refrain from using it."
712
+
713
+ warnings.warn(msg, FutureWarning, stacklevel=find_stack_level())
714
+ return True
715
+ return False
716
+
717
+
718
+ def _build_option_description(k: str) -> str:
719
+ """Builds a formatted description of a registered option and prints it"""
720
+ o = _get_registered_option(k)
721
+ d = _get_deprecated_option(k)
722
+
723
+ s = f"{k} "
724
+
725
+ if o.doc:
726
+ s += "\n".join(o.doc.strip().split("\n"))
727
+ else:
728
+ s += "No description available."
729
+
730
+ if o:
731
+ s += f"\n [default: {o.defval}] [currently: {_get_option(k, True)}]"
732
+
733
+ if d:
734
+ rkey = d.rkey or ""
735
+ s += "\n (Deprecated"
736
+ s += f", use `{rkey}` instead."
737
+ s += ")"
738
+
739
+ return s
740
+
741
+
742
+ def pp_options_list(keys: Iterable[str], width: int = 80, _print: bool = False):
743
+ """Builds a concise listing of available options, grouped by prefix"""
744
+ from itertools import groupby
745
+ from textwrap import wrap
746
+
747
+ def pp(name: str, ks: Iterable[str]) -> list[str]:
748
+ pfx = "- " + name + ".[" if name else ""
749
+ ls = wrap(
750
+ ", ".join(ks),
751
+ width,
752
+ initial_indent=pfx,
753
+ subsequent_indent=" ",
754
+ break_long_words=False,
755
+ )
756
+ if ls and ls[-1] and name:
757
+ ls[-1] = ls[-1] + "]"
758
+ return ls
759
+
760
+ ls: list[str] = []
761
+ singles = [x for x in sorted(keys) if x.find(".") < 0]
762
+ if singles:
763
+ ls += pp("", singles)
764
+ keys = [x for x in keys if x.find(".") >= 0]
765
+
766
+ for k, g in groupby(sorted(keys), lambda x: x[: x.rfind(".")]):
767
+ ks = [x[len(k) + 1 :] for x in list(g)]
768
+ ls += pp(k, ks)
769
+ s = "\n".join(ls)
770
+ if _print:
771
+ print(s)
772
+ else:
773
+ return s
774
+
775
+
776
+ #
777
+ # helpers
778
+
779
+
780
+ @contextmanager
781
+ def config_prefix(prefix: str) -> Generator[None, None, None]:
782
+ """
783
+ contextmanager for multiple invocations of API with a common prefix
784
+
785
+ supported API functions: (register / get / set )__option
786
+
787
+ Warning: This is not thread - safe, and won't work properly if you import
788
+ the API functions into your module using the "from x import y" construct.
789
+
790
+ Example
791
+ -------
792
+ import pandas._config.config as cf
793
+ with cf.config_prefix("display.font"):
794
+ cf.register_option("color", "red")
795
+ cf.register_option("size", " 5 pt")
796
+ cf.set_option(size, " 6 pt")
797
+ cf.get_option(size)
798
+ ...
799
+
800
+ etc'
801
+
802
+ will register options "display.font.color", "display.font.size", set the
803
+ value of "display.font.size"... and so on.
804
+ """
805
+ # Note: reset_option relies on set_option, and on key directly
806
+ # it does not fit in to this monkey-patching scheme
807
+
808
+ global register_option, get_option, set_option
809
+
810
+ def wrap(func: F) -> F:
811
+ def inner(key: str, *args, **kwds):
812
+ pkey = f"{prefix}.{key}"
813
+ return func(pkey, *args, **kwds)
814
+
815
+ return cast(F, inner)
816
+
817
+ _register_option = register_option
818
+ _get_option = get_option
819
+ _set_option = set_option
820
+ set_option = wrap(set_option)
821
+ get_option = wrap(get_option)
822
+ register_option = wrap(register_option)
823
+ try:
824
+ yield
825
+ finally:
826
+ set_option = _set_option
827
+ get_option = _get_option
828
+ register_option = _register_option
829
+
830
+
831
+ # These factories and methods are handy for use as the validator
832
+ # arg in register_option
833
+
834
+
835
+ def is_type_factory(_type: type[Any]) -> Callable[[Any], None]:
836
+ """
837
+
838
+ Parameters
839
+ ----------
840
+ `_type` - a type to be compared against (e.g. type(x) == `_type`)
841
+
842
+ Returns
843
+ -------
844
+ validator - a function of a single argument x , which raises
845
+ ValueError if type(x) is not equal to `_type`
846
+
847
+ """
848
+
849
+ def inner(x) -> None:
850
+ if type(x) != _type:
851
+ raise ValueError(f"Value must have type '{_type}'")
852
+
853
+ return inner
854
+
855
+
856
+ def is_instance_factory(_type) -> Callable[[Any], None]:
857
+ """
858
+
859
+ Parameters
860
+ ----------
861
+ `_type` - the type to be checked against
862
+
863
+ Returns
864
+ -------
865
+ validator - a function of a single argument x , which raises
866
+ ValueError if x is not an instance of `_type`
867
+
868
+ """
869
+ if isinstance(_type, (tuple, list)):
870
+ _type = tuple(_type)
871
+ type_repr = "|".join(map(str, _type))
872
+ else:
873
+ type_repr = f"'{_type}'"
874
+
875
+ def inner(x) -> None:
876
+ if not isinstance(x, _type):
877
+ raise ValueError(f"Value must be an instance of {type_repr}")
878
+
879
+ return inner
880
+
881
+
882
+ def is_one_of_factory(legal_values) -> Callable[[Any], None]:
883
+ callables = [c for c in legal_values if callable(c)]
884
+ legal_values = [c for c in legal_values if not callable(c)]
885
+
886
+ def inner(x) -> None:
887
+ if x not in legal_values:
888
+ if not any(c(x) for c in callables):
889
+ uvals = [str(lval) for lval in legal_values]
890
+ pp_values = "|".join(uvals)
891
+ msg = f"Value must be one of {pp_values}"
892
+ if len(callables):
893
+ msg += " or a callable"
894
+ raise ValueError(msg)
895
+
896
+ return inner
897
+
898
+
899
+ def is_nonnegative_int(value: object) -> None:
900
+ """
901
+ Verify that value is None or a positive int.
902
+
903
+ Parameters
904
+ ----------
905
+ value : None or int
906
+ The `value` to be checked.
907
+
908
+ Raises
909
+ ------
910
+ ValueError
911
+ When the value is not None or is a negative integer
912
+ """
913
+ if value is None:
914
+ return
915
+
916
+ elif isinstance(value, int):
917
+ if value >= 0:
918
+ return
919
+
920
+ msg = "Value must be a nonnegative integer or None"
921
+ raise ValueError(msg)
922
+
923
+
924
+ # common type validators, for convenience
925
+ # usage: register_option(... , validator = is_int)
926
+ is_int = is_type_factory(int)
927
+ is_bool = is_type_factory(bool)
928
+ is_float = is_type_factory(float)
929
+ is_str = is_type_factory(str)
930
+ is_text = is_instance_factory((str, bytes))
931
+
932
+
933
+ def is_callable(obj) -> bool:
934
+ """
935
+
936
+ Parameters
937
+ ----------
938
+ `obj` - the object to be checked
939
+
940
+ Returns
941
+ -------
942
+ validator - returns True if object is callable
943
+ raises ValueError otherwise.
944
+
945
+ """
946
+ if not callable(obj):
947
+ raise ValueError("Value must be a callable")
948
+ return True
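The validator factories above (`is_type_factory`, `is_one_of_factory`, and friends) all follow the same closure pattern: a factory captures the constraint and returns a one-argument function that raises `ValueError` on illegal values. A minimal standalone sketch of that pattern, restated outside pandas (these names mirror the functions above but are redefined here for illustration):

```python
from typing import Any, Callable


def is_type_factory(_type: type) -> Callable[[Any], None]:
    """Return a validator that rejects values whose exact type differs."""

    def inner(x: Any) -> None:
        # Exact type check, so e.g. bool is not accepted where int is required.
        if type(x) is not _type:
            raise ValueError(f"Value must have type '{_type}'")

    return inner


def is_one_of_factory(legal_values) -> Callable[[Any], None]:
    """Return a validator accepting listed literals or values passing a callable."""
    callables = [c for c in legal_values if callable(c)]
    literals = [c for c in legal_values if not callable(c)]

    def inner(x: Any) -> None:
        if x not in literals and not any(c(x) for c in callables):
            raise ValueError(f"Value must be one of {'|'.join(map(str, literals))}")

    return inner


is_int = is_type_factory(int)
is_int(3)  # a legal value passes silently

mode_validator = is_one_of_factory(["warn", "raise", callable])
mode_validator("warn")  # listed literal
mode_validator(print)   # accepted via the callable escape hatch
```

Passing such a validator as the `validator=` argument to `register_option` is what makes `set_option` reject illegal values at assignment time.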
deepseek/lib/python3.10/site-packages/pandas/_config/dates.py ADDED
@@ -0,0 +1,25 @@
+ """
+ config for datetime formatting
+ """
+ from __future__ import annotations
+
+ from pandas._config import config as cf
+
+ pc_date_dayfirst_doc = """
+ : boolean
+     When True, prints and parses dates with the day first, eg 20/01/2005
+ """
+
+ pc_date_yearfirst_doc = """
+ : boolean
+     When True, prints and parses dates with the year first, eg 2005/01/20
+ """
+
+ with cf.config_prefix("display"):
+     # Needed upstream of `_libs` because these are used in tslibs.parsing
+     cf.register_option(
+         "date_dayfirst", False, pc_date_dayfirst_doc, validator=cf.is_bool
+     )
+     cf.register_option(
+         "date_yearfirst", False, pc_date_yearfirst_doc, validator=cf.is_bool
+     )
deepseek/lib/python3.10/site-packages/pandas/_config/localization.py ADDED
@@ -0,0 +1,172 @@
+ """
+ Helpers for configuring locale settings.
+
+ Name `localization` is chosen to avoid overlap with builtin `locale` module.
+ """
+ from __future__ import annotations
+
+ from contextlib import contextmanager
+ import locale
+ import platform
+ import re
+ import subprocess
+ from typing import TYPE_CHECKING
+
+ from pandas._config.config import options
+
+ if TYPE_CHECKING:
+     from collections.abc import Generator
+
+
+ @contextmanager
+ def set_locale(
+     new_locale: str | tuple[str, str], lc_var: int = locale.LC_ALL
+ ) -> Generator[str | tuple[str, str], None, None]:
+     """
+     Context manager for temporarily setting a locale.
+
+     Parameters
+     ----------
+     new_locale : str or tuple
+         A string of the form <language_country>.<encoding>. For example to set
+         the current locale to US English with a UTF8 encoding, you would pass
+         "en_US.UTF-8".
+     lc_var : int, default `locale.LC_ALL`
+         The category of the locale being set.
+
+     Notes
+     -----
+     This is useful when you want to run a particular block of code under a
+     particular locale, without globally setting the locale. This probably isn't
+     thread-safe.
+     """
+     # getlocale is not always compliant with setlocale, use setlocale. GH#46595
+     current_locale = locale.setlocale(lc_var)
+
+     try:
+         locale.setlocale(lc_var, new_locale)
+         normalized_code, normalized_encoding = locale.getlocale()
+         if normalized_code is not None and normalized_encoding is not None:
+             yield f"{normalized_code}.{normalized_encoding}"
+         else:
+             yield new_locale
+     finally:
+         locale.setlocale(lc_var, current_locale)
+
+
+ def can_set_locale(lc: str, lc_var: int = locale.LC_ALL) -> bool:
+     """
+     Check to see if we can set a locale, and subsequently get the locale,
+     without raising an Exception.
+
+     Parameters
+     ----------
+     lc : str
+         The locale to attempt to set.
+     lc_var : int, default `locale.LC_ALL`
+         The category of the locale being set.
+
+     Returns
+     -------
+     bool
+         Whether the passed locale can be set
+     """
+     try:
+         with set_locale(lc, lc_var=lc_var):
+             pass
+     except (ValueError, locale.Error):
+         # horrible name for an Exception subclass
+         return False
+     else:
+         return True
+
+
+ def _valid_locales(locales: list[str] | str, normalize: bool) -> list[str]:
+     """
+     Return a list of normalized locales that do not throw an ``Exception``
+     when set.
+
+     Parameters
+     ----------
+     locales : str
+         A string where each locale is separated by a newline.
+     normalize : bool
+         Whether to call ``locale.normalize`` on each locale.
+
+     Returns
+     -------
+     valid_locales : list
+         A list of valid locales.
+     """
+     return [
+         loc
+         for loc in (
+             locale.normalize(loc.strip()) if normalize else loc.strip()
+             for loc in locales
+         )
+         if can_set_locale(loc)
+     ]
+
+
+ def get_locales(
+     prefix: str | None = None,
+     normalize: bool = True,
+ ) -> list[str]:
+     """
+     Get all the locales that are available on the system.
+
+     Parameters
+     ----------
+     prefix : str
+         If not ``None`` then return only those locales with the prefix
+         provided. For example to get all English language locales (those that
+         start with ``"en"``), pass ``prefix="en"``.
+     normalize : bool
+         Call ``locale.normalize`` on the resulting list of available locales.
+         If ``True``, only locales that can be set without throwing an
+         ``Exception`` are returned.
+
+     Returns
+     -------
+     locales : list of strings
+         A list of locale strings that can be set with ``locale.setlocale()``.
+         For example::
+
+             locale.setlocale(locale.LC_ALL, locale_string)
+
+         On error will return an empty list (no locale available, e.g. Windows)
+
+     """
+     if platform.system() in ("Linux", "Darwin"):
+         raw_locales = subprocess.check_output(["locale", "-a"])
+     else:
+         # Other platforms e.g. windows platforms don't define "locale -a"
+         # Note: is_platform_windows causes circular import here
+         return []
+
+     try:
+         # raw_locales is "\n" separated list of locales
+         # it may contain non-decodable parts, so split
+         # extract what we can and then rejoin.
+         split_raw_locales = raw_locales.split(b"\n")
+         out_locales = []
+         for x in split_raw_locales:
+             try:
+                 out_locales.append(str(x, encoding=options.display.encoding))
+             except UnicodeError:
+                 # 'locale -a' is used to populate 'raw_locales' and on
+                 # Redhat 7 Linux (and maybe others) prints locale names
+                 # using windows-1252 encoding. Bug only triggered by
+                 # a few special characters and when there is an
+                 # extensive list of installed locales.
+                 out_locales.append(str(x, encoding="windows-1252"))
+
+     except TypeError:
+         pass
+
+     if prefix is None:
+         return _valid_locales(out_locales, normalize)
+
+     pattern = re.compile(f"{prefix}.*")
+     found = pattern.findall("\n".join(out_locales))
+     return _valid_locales(found, normalize)
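The core of `set_locale` above is a save/set/restore dance around `locale.setlocale`: the query form (no second argument) returns the current setting string, which can later be passed back to restore it. A condensed standalone sketch of that pattern, using the always-available `"C"` locale (the helper name `temp_locale` is illustrative, not a pandas API):

```python
from contextlib import contextmanager
import locale


@contextmanager
def temp_locale(new_locale: str, lc_var: int = locale.LC_ALL):
    # Query form: with no second argument, setlocale returns the
    # current setting string without changing anything.
    current = locale.setlocale(lc_var)
    try:
        locale.setlocale(lc_var, new_locale)
        yield new_locale
    finally:
        # Restore whatever was set before, even if the body raised.
        locale.setlocale(lc_var, current)


with temp_locale("C"):
    # Inside the block, all locale categories report "C".
    assert locale.setlocale(locale.LC_ALL) == "C"
```

Restoring via the saved query string (rather than `locale.getlocale()`) is what the GH#46595 comment in the code above is about: `getlocale` does not always round-trip through `setlocale`.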
deepseek/lib/python3.10/site-packages/pandas/_testing/__pycache__/__init__.cpython-310.pyc ADDED
Binary file (14.3 kB). View file
 
deepseek/lib/python3.10/site-packages/pandas/_testing/__pycache__/compat.cpython-310.pyc ADDED
Binary file (939 Bytes). View file
 
deepseek/lib/python3.10/site-packages/pandas/_testing/__pycache__/contexts.cpython-310.pyc ADDED
Binary file (6.23 kB). View file
 
deepseek/lib/python3.10/site-packages/pandas/_testing/_warnings.py ADDED
@@ -0,0 +1,232 @@
+ from __future__ import annotations
+
+ from contextlib import (
+     contextmanager,
+     nullcontext,
+ )
+ import inspect
+ import re
+ import sys
+ from typing import (
+     TYPE_CHECKING,
+     Literal,
+     cast,
+ )
+ import warnings
+
+ from pandas.compat import PY311
+
+ if TYPE_CHECKING:
+     from collections.abc import (
+         Generator,
+         Sequence,
+     )
+
+
+ @contextmanager
+ def assert_produces_warning(
+     expected_warning: type[Warning] | bool | tuple[type[Warning], ...] | None = Warning,
+     filter_level: Literal[
+         "error", "ignore", "always", "default", "module", "once"
+     ] = "always",
+     check_stacklevel: bool = True,
+     raise_on_extra_warnings: bool = True,
+     match: str | None = None,
+ ) -> Generator[list[warnings.WarningMessage], None, None]:
+     """
+     Context manager for running code expected to either raise a specific warning,
+     multiple specific warnings, or not raise any warnings. Verifies that the code
+     raises the expected warning(s), and that it does not raise any other unexpected
+     warnings. It is basically a wrapper around ``warnings.catch_warnings``.
+
+     Parameters
+     ----------
+     expected_warning : {Warning, False, tuple[Warning, ...], None}, default Warning
+         The type of Exception raised. ``exception.Warning`` is the base
+         class for all warnings. To raise multiple types of exceptions,
+         pass them as a tuple. To check that no warning is returned,
+         specify ``False`` or ``None``.
+     filter_level : str or None, default "always"
+         Specifies whether warnings are ignored, displayed, or turned
+         into errors.
+         Valid values are:
+
+         * "error" - turns matching warnings into exceptions
+         * "ignore" - discard the warning
+         * "always" - always emit a warning
+         * "default" - print the warning the first time it is generated
+           from each location
+         * "module" - print the warning the first time it is generated
+           from each module
+         * "once" - print the warning the first time it is generated
+
+     check_stacklevel : bool, default True
+         If True, displays the line that called the function containing
+         the warning to show where the function is called. Otherwise, the
+         line that implements the function is displayed.
+     raise_on_extra_warnings : bool, default True
+         Whether extra warnings not of the type `expected_warning` should
+         cause the test to fail.
+     match : str, optional
+         Match warning message.
+
+     Examples
+     --------
+     >>> import warnings
+     >>> with assert_produces_warning():
+     ...     warnings.warn(UserWarning())
+     ...
+     >>> with assert_produces_warning(False):
+     ...     warnings.warn(RuntimeWarning())
+     ...
+     Traceback (most recent call last):
+         ...
+     AssertionError: Caused unexpected warning(s): ['RuntimeWarning'].
+     >>> with assert_produces_warning(UserWarning):
+     ...     warnings.warn(RuntimeWarning())
+     Traceback (most recent call last):
+         ...
+     AssertionError: Did not see expected warning of class 'UserWarning'.
+
+     .. warning:: This is *not* thread-safe.
+     """
+     __tracebackhide__ = True
+
+     with warnings.catch_warnings(record=True) as w:
+         warnings.simplefilter(filter_level)
+         try:
+             yield w
+         finally:
+             if expected_warning:
+                 expected_warning = cast(type[Warning], expected_warning)
+                 _assert_caught_expected_warning(
+                     caught_warnings=w,
+                     expected_warning=expected_warning,
+                     match=match,
+                     check_stacklevel=check_stacklevel,
+                 )
+             if raise_on_extra_warnings:
+                 _assert_caught_no_extra_warnings(
+                     caught_warnings=w,
+                     expected_warning=expected_warning,
+                 )
+
+
+ def maybe_produces_warning(warning: type[Warning], condition: bool, **kwargs):
+     """
+     Return a context manager that possibly checks a warning based on the condition
+     """
+     if condition:
+         return assert_produces_warning(warning, **kwargs)
+     else:
+         return nullcontext()
+
+
+ def _assert_caught_expected_warning(
+     *,
+     caught_warnings: Sequence[warnings.WarningMessage],
+     expected_warning: type[Warning],
+     match: str | None,
+     check_stacklevel: bool,
+ ) -> None:
+     """Assert that there was the expected warning among the caught warnings."""
+     saw_warning = False
+     matched_message = False
+     unmatched_messages = []
+
+     for actual_warning in caught_warnings:
+         if issubclass(actual_warning.category, expected_warning):
+             saw_warning = True
+
+             if check_stacklevel:
+                 _assert_raised_with_correct_stacklevel(actual_warning)
+
+             if match is not None:
+                 if re.search(match, str(actual_warning.message)):
+                     matched_message = True
+                 else:
+                     unmatched_messages.append(actual_warning.message)
+
+     if not saw_warning:
+         raise AssertionError(
+             f"Did not see expected warning of class "
+             f"{repr(expected_warning.__name__)}"
+         )
+
+     if match and not matched_message:
+         raise AssertionError(
+             f"Did not see warning {repr(expected_warning.__name__)} "
+             f"matching '{match}'. The emitted warning messages are "
+             f"{unmatched_messages}"
+         )
+
+
+ def _assert_caught_no_extra_warnings(
+     *,
+     caught_warnings: Sequence[warnings.WarningMessage],
+     expected_warning: type[Warning] | bool | tuple[type[Warning], ...] | None,
+ ) -> None:
+     """Assert that no extra warnings apart from the expected ones are caught."""
+     extra_warnings = []
+
+     for actual_warning in caught_warnings:
+         if _is_unexpected_warning(actual_warning, expected_warning):
+             # GH#38630 pytest.filterwarnings does not suppress these.
+             if actual_warning.category == ResourceWarning:
+                 # GH 44732: Don't make the CI flaky by filtering SSL-related
+                 # ResourceWarning from dependencies
+                 if "unclosed <ssl.SSLSocket" in str(actual_warning.message):
+                     continue
+                 # GH 44844: Matplotlib leaves font files open during the entire process
+                 # upon import. Don't make CI flaky if ResourceWarning raised
+                 # due to these open files.
+                 if any("matplotlib" in mod for mod in sys.modules):
+                     continue
+             if PY311 and actual_warning.category == EncodingWarning:
+                 # EncodingWarnings are checked in the CI
+                 # pyproject.toml errors on EncodingWarnings in pandas
+                 # Ignore EncodingWarnings from other libraries
+                 continue
+             extra_warnings.append(
+                 (
+                     actual_warning.category.__name__,
+                     actual_warning.message,
+                     actual_warning.filename,
+                     actual_warning.lineno,
+                 )
+             )
+
+     if extra_warnings:
+         raise AssertionError(f"Caused unexpected warning(s): {repr(extra_warnings)}")
+
+
+ def _is_unexpected_warning(
+     actual_warning: warnings.WarningMessage,
+     expected_warning: type[Warning] | bool | tuple[type[Warning], ...] | None,
+ ) -> bool:
+     """Check if the actual warning issued is unexpected."""
+     if actual_warning and not expected_warning:
+         return True
+     expected_warning = cast(type[Warning], expected_warning)
+     return bool(not issubclass(actual_warning.category, expected_warning))
+
+
+ def _assert_raised_with_correct_stacklevel(
+     actual_warning: warnings.WarningMessage,
+ ) -> None:
+     # https://stackoverflow.com/questions/17407119/python-inspect-stack-is-slow
+     frame = inspect.currentframe()
+     for _ in range(4):
+         frame = frame.f_back  # type: ignore[union-attr]
+     try:
+         caller_filename = inspect.getfile(frame)  # type: ignore[arg-type]
+     finally:
+         # See note in
+         # https://docs.python.org/3/library/inspect.html#inspect.Traceback
+         del frame
+     msg = (
+         "Warning not set with correct stacklevel. "
+         f"File where warning is raised: {actual_warning.filename} != "
+         f"{caller_filename}. Warning message: {actual_warning.message}"
+     )
+     assert actual_warning.filename == caller_filename, msg
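Stripped of the stacklevel and extra-warning checks, `assert_produces_warning` reduces to recording warnings with `warnings.catch_warnings(record=True)` and then asserting the expected category appeared. A minimal standalone sketch of that core (the name `expect_warning` is illustrative, not part of pandas):

```python
from contextlib import contextmanager
import warnings


@contextmanager
def expect_warning(category: type[Warning]):
    # record=True makes caught warnings land in `caught` instead of
    # being printed; "always" disables once-per-location suppression.
    with warnings.catch_warnings(record=True) as caught:
        warnings.simplefilter("always")
        yield caught
    # After the block, verify at least one warning of the right class.
    if not any(issubclass(w.category, category) for w in caught):
        raise AssertionError(
            f"Did not see expected warning of class '{category.__name__}'"
        )


with expect_warning(UserWarning):
    warnings.warn("something changed", UserWarning)
```

The real helper additionally checks the warning's `filename` against the caller's frame (the `check_stacklevel` path) and fails on unrelated warnings, which is what makes it useful as a strict test fixture.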
deepseek/lib/python3.10/site-packages/pandas/api/__pycache__/__init__.cpython-310.pyc ADDED
Binary file (366 Bytes). View file
 
deepseek/lib/python3.10/site-packages/pandas/api/extensions/__init__.py ADDED
@@ -0,0 +1,33 @@
+ """
+ Public API for extending pandas objects.
+ """
+
+ from pandas._libs.lib import no_default
+
+ from pandas.core.dtypes.base import (
+     ExtensionDtype,
+     register_extension_dtype,
+ )
+
+ from pandas.core.accessor import (
+     register_dataframe_accessor,
+     register_index_accessor,
+     register_series_accessor,
+ )
+ from pandas.core.algorithms import take
+ from pandas.core.arrays import (
+     ExtensionArray,
+     ExtensionScalarOpsMixin,
+ )
+
+ __all__ = [
+     "no_default",
+     "ExtensionDtype",
+     "register_extension_dtype",
+     "register_dataframe_accessor",
+     "register_index_accessor",
+     "register_series_accessor",
+     "take",
+     "ExtensionArray",
+     "ExtensionScalarOpsMixin",
+ ]
deepseek/lib/python3.10/site-packages/pandas/api/extensions/__pycache__/__init__.cpython-310.pyc ADDED
Binary file (731 Bytes). View file
 
deepseek/lib/python3.10/site-packages/pandas/api/indexers/__init__.py ADDED
@@ -0,0 +1,17 @@
+ """
+ Public API for Rolling Window Indexers.
+ """
+
+ from pandas.core.indexers import check_array_indexer
+ from pandas.core.indexers.objects import (
+     BaseIndexer,
+     FixedForwardWindowIndexer,
+     VariableOffsetWindowIndexer,
+ )
+
+ __all__ = [
+     "check_array_indexer",
+     "BaseIndexer",
+     "FixedForwardWindowIndexer",
+     "VariableOffsetWindowIndexer",
+ ]
deepseek/lib/python3.10/site-packages/pandas/api/interchange/__pycache__/__init__.cpython-310.pyc ADDED
Binary file (423 Bytes). View file
 
deepseek/lib/python3.10/site-packages/pandas/api/typing/__pycache__/__init__.cpython-310.pyc ADDED
Binary file (1.07 kB). View file
 
deepseek/lib/python3.10/site-packages/pandas/compat/__init__.py ADDED
@@ -0,0 +1,199 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ """
2
+ compat
3
+ ======
4
+
5
+ Cross-compatible functions for different versions of Python.
6
+
7
+ Other items:
8
+ * platform checker
9
+ """
10
+ from __future__ import annotations
11
+
12
+ import os
13
+ import platform
14
+ import sys
15
+ from typing import TYPE_CHECKING
16
+
17
+ from pandas.compat._constants import (
18
+ IS64,
19
+ ISMUSL,
20
+ PY310,
21
+ PY311,
22
+ PY312,
23
+ PYPY,
24
+ )
25
+ import pandas.compat.compressors
26
+ from pandas.compat.numpy import is_numpy_dev
27
+ from pandas.compat.pyarrow import (
28
+ pa_version_under10p1,
29
+ pa_version_under11p0,
30
+ pa_version_under13p0,
31
+ pa_version_under14p0,
32
+ pa_version_under14p1,
33
+ pa_version_under16p0,
34
+ pa_version_under17p0,
35
+ )
36
+
37
+ if TYPE_CHECKING:
38
+ from pandas._typing import F
39
+
40
+
41
+ def set_function_name(f: F, name: str, cls: type) -> F:
42
+ """
43
+ Bind the name/qualname attributes of the function.
44
+ """
45
+ f.__name__ = name
46
+ f.__qualname__ = f"{cls.__name__}.{name}"
47
+ f.__module__ = cls.__module__
48
+ return f
49
+
50
+
51
+ def is_platform_little_endian() -> bool:
+     """
+     Checking if the running platform is little endian.
+ 
+     Returns
+     -------
+     bool
+         True if the running platform is little endian.
+     """
+     return sys.byteorder == "little"
+ 
+ 
+ def is_platform_windows() -> bool:
+     """
+     Checking if the running platform is windows.
+ 
+     Returns
+     -------
+     bool
+         True if the running platform is windows.
+     """
+     return sys.platform in ["win32", "cygwin"]
+ 
+ 
+ def is_platform_linux() -> bool:
+     """
+     Checking if the running platform is linux.
+ 
+     Returns
+     -------
+     bool
+         True if the running platform is linux.
+     """
+     return sys.platform == "linux"
+ 
+ 
+ def is_platform_mac() -> bool:
+     """
+     Checking if the running platform is mac.
+ 
+     Returns
+     -------
+     bool
+         True if the running platform is mac.
+     """
+     return sys.platform == "darwin"
+ 
+ 
+ def is_platform_arm() -> bool:
+     """
+     Checking if the running platform uses ARM architecture.
+ 
+     Returns
+     -------
+     bool
+         True if the running platform uses ARM architecture.
+     """
+     return platform.machine() in ("arm64", "aarch64") or platform.machine().startswith(
+         "armv"
+     )
+ 
+ 
+ def is_platform_power() -> bool:
+     """
+     Checking if the running platform uses Power architecture.
+ 
+     Returns
+     -------
+     bool
+         True if the running platform uses Power architecture.
+     """
+     return platform.machine() in ("ppc64", "ppc64le")
+ 
+ 
+ def is_ci_environment() -> bool:
+     """
+     Checking if running in a continuous integration environment by checking
+     the PANDAS_CI environment variable.
+ 
+     Returns
+     -------
+     bool
+         True if running in a continuous integration environment.
+     """
+     return os.environ.get("PANDAS_CI", "0") == "1"
+ 
+ 
+ def get_lzma_file() -> type[pandas.compat.compressors.LZMAFile]:
+     """
+     Importing the `LZMAFile` class from the `lzma` module.
+ 
+     Returns
+     -------
+     class
+         The `LZMAFile` class from the `lzma` module.
+ 
+     Raises
+     ------
+     RuntimeError
+         If the `lzma` module was not imported correctly, or didn't exist.
+     """
+     if not pandas.compat.compressors.has_lzma:
+         raise RuntimeError(
+             "lzma module not available. "
+             "A Python re-install with the proper dependencies, "
+             "might be required to solve this issue."
+         )
+     return pandas.compat.compressors.LZMAFile
+ 
+ 
+ def get_bz2_file() -> type[pandas.compat.compressors.BZ2File]:
+     """
+     Importing the `BZ2File` class from the `bz2` module.
+ 
+     Returns
+     -------
+     class
+         The `BZ2File` class from the `bz2` module.
+ 
+     Raises
+     ------
+     RuntimeError
+         If the `bz2` module was not imported correctly, or didn't exist.
+     """
+     if not pandas.compat.compressors.has_bz2:
+         raise RuntimeError(
+             "bz2 module not available. "
+             "A Python re-install with the proper dependencies, "
+             "might be required to solve this issue."
+         )
+     return pandas.compat.compressors.BZ2File
+ 
+ 
+ __all__ = [
+     "is_numpy_dev",
+     "pa_version_under10p1",
+     "pa_version_under11p0",
+     "pa_version_under13p0",
+     "pa_version_under14p0",
+     "pa_version_under14p1",
+     "pa_version_under16p0",
+     "pa_version_under17p0",
+     "IS64",
+     "ISMUSL",
+     "PY310",
+     "PY311",
+     "PY312",
+     "PYPY",
+ ]
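The platform helpers in this hunk are thin wrappers over stdlib probes. As a quick standalone check (the function names below are illustrative stand-ins, not the pandas functions themselves):

```python
import os
import platform
import sys

# The same stdlib probes the helpers above wrap.
def little_endian() -> bool:
    return sys.byteorder == "little"

def windows() -> bool:
    return sys.platform in ["win32", "cygwin"]

def arm() -> bool:
    m = platform.machine()
    return m in ("arm64", "aarch64") or m.startswith("armv")

def pandas_ci() -> bool:
    # PANDAS_CI is the environment variable the pandas CI sets to "1"
    return os.environ.get("PANDAS_CI", "0") == "1"

print(little_endian(), windows(), arm(), pandas_ci())
```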
deepseek/lib/python3.10/site-packages/pandas/compat/__pycache__/_constants.cpython-310.pyc ADDED
Binary file (705 Bytes).
 
deepseek/lib/python3.10/site-packages/pandas/compat/__pycache__/pyarrow.cpython-310.pyc ADDED
Binary file (896 Bytes).
 
deepseek/lib/python3.10/site-packages/pandas/compat/compressors.py ADDED
@@ -0,0 +1,77 @@
+ """
+ Patched ``BZ2File`` and ``LZMAFile`` to handle pickle protocol 5.
+ """
+ 
+ from __future__ import annotations
+ 
+ from pickle import PickleBuffer
+ 
+ from pandas.compat._constants import PY310
+ 
+ try:
+     import bz2
+ 
+     has_bz2 = True
+ except ImportError:
+     has_bz2 = False
+ 
+ try:
+     import lzma
+ 
+     has_lzma = True
+ except ImportError:
+     has_lzma = False
+ 
+ 
+ def flatten_buffer(
+     b: bytes | bytearray | memoryview | PickleBuffer,
+ ) -> bytes | bytearray | memoryview:
+     """
+     Return some 1-D `uint8` typed buffer.
+ 
+     Coerces anything that does not match that description to one that does
+     without copying if possible (otherwise will copy).
+     """
+ 
+     if isinstance(b, (bytes, bytearray)):
+         return b
+ 
+     if not isinstance(b, PickleBuffer):
+         b = PickleBuffer(b)
+ 
+     try:
+         # coerce to 1-D `uint8` C-contiguous `memoryview` zero-copy
+         return b.raw()
+     except BufferError:
+         # perform in-memory copy if buffer is not contiguous
+         return memoryview(b).tobytes("A")
+ 
+ 
+ if has_bz2:
+ 
+     class BZ2File(bz2.BZ2File):
+         if not PY310:
+ 
+             def write(self, b) -> int:
+                 # Workaround issue where `bz2.BZ2File` expects `len`
+                 # to return the number of bytes in `b` by converting
+                 # `b` into something that meets that constraint with
+                 # minimal copying.
+                 #
+                 # Note: This is fixed in Python 3.10.
+                 return super().write(flatten_buffer(b))
+ 
+ 
+ if has_lzma:
+ 
+     class LZMAFile(lzma.LZMAFile):
+         if not PY310:
+ 
+             def write(self, b) -> int:
+                 # Workaround issue where `lzma.LZMAFile` expects `len`
+                 # to return the number of bytes in `b` by converting
+                 # `b` into something that meets that constraint with
+                 # minimal copying.
+                 #
+                 # Note: This is fixed in Python 3.10.
+                 return super().write(flatten_buffer(b))
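`flatten_buffer` is self-contained enough to exercise on its own; a copy of its logic is reproduced below so it can be run without pandas installed:

```python
from pickle import PickleBuffer

# Standalone copy of the flatten_buffer logic from the hunk above.
def flatten_buffer(b):
    if isinstance(b, (bytes, bytearray)):
        return b
    if not isinstance(b, PickleBuffer):
        b = PickleBuffer(b)
    try:
        # zero-copy 1-D uint8 view when the buffer is contiguous
        return b.raw()
    except BufferError:
        # fall back to an in-memory copy for non-contiguous buffers
        return memoryview(b).tobytes("A")

print(flatten_buffer(b"abc"))                       # bytes pass through unchanged
print(bytes(flatten_buffer(memoryview(b"hello"))))  # contiguous view, no copy
```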
deepseek/lib/python3.10/site-packages/pandas/compat/numpy/__pycache__/function.cpython-310.pyc ADDED
Binary file (10.5 kB).
 
deepseek/lib/python3.10/site-packages/pandas/compat/numpy/function.py ADDED
@@ -0,0 +1,418 @@
+ """
+ For compatibility with numpy libraries, pandas functions or methods have to
+ accept '*args' and '**kwargs' parameters to accommodate numpy arguments that
+ are not actually used or respected in the pandas implementation.
+ 
+ To ensure that users do not abuse these parameters, validation is performed in
+ 'validators.py' to make sure that any extra parameters passed correspond ONLY
+ to those in the numpy signature. Part of that validation includes whether or
+ not the user attempted to pass in non-default values for these extraneous
+ parameters. As we want to discourage users from relying on these parameters
+ when calling the pandas implementation, we want them only to pass in the
+ default values for these parameters.
+ 
+ This module provides a set of commonly used default arguments for functions and
+ methods that are spread throughout the codebase. This module will make it
+ easier to adjust to future upstream changes in the analogous numpy signatures.
+ """
+ from __future__ import annotations
+ 
+ from typing import (
+     TYPE_CHECKING,
+     Any,
+     TypeVar,
+     cast,
+     overload,
+ )
+ 
+ import numpy as np
+ from numpy import ndarray
+ 
+ from pandas._libs.lib import (
+     is_bool,
+     is_integer,
+ )
+ from pandas.errors import UnsupportedFunctionCall
+ from pandas.util._validators import (
+     validate_args,
+     validate_args_and_kwargs,
+     validate_kwargs,
+ )
+ 
+ if TYPE_CHECKING:
+     from pandas._typing import (
+         Axis,
+         AxisInt,
+     )
+ 
+     AxisNoneT = TypeVar("AxisNoneT", Axis, None)
+ 
+ 
+ class CompatValidator:
+     def __init__(
+         self,
+         defaults,
+         fname=None,
+         method: str | None = None,
+         max_fname_arg_count=None,
+     ) -> None:
+         self.fname = fname
+         self.method = method
+         self.defaults = defaults
+         self.max_fname_arg_count = max_fname_arg_count
+ 
+     def __call__(
+         self,
+         args,
+         kwargs,
+         fname=None,
+         max_fname_arg_count=None,
+         method: str | None = None,
+     ) -> None:
+         if not args and not kwargs:
+             return None
+ 
+         fname = self.fname if fname is None else fname
+         max_fname_arg_count = (
+             self.max_fname_arg_count
+             if max_fname_arg_count is None
+             else max_fname_arg_count
+         )
+         method = self.method if method is None else method
+ 
+         if method == "args":
+             validate_args(fname, args, max_fname_arg_count, self.defaults)
+         elif method == "kwargs":
+             validate_kwargs(fname, kwargs, self.defaults)
+         elif method == "both":
+             validate_args_and_kwargs(
+                 fname, args, kwargs, max_fname_arg_count, self.defaults
+             )
+         else:
+             raise ValueError(f"invalid validation method '{method}'")
+ 
+ 
+ ARGMINMAX_DEFAULTS = {"out": None}
+ validate_argmin = CompatValidator(
+     ARGMINMAX_DEFAULTS, fname="argmin", method="both", max_fname_arg_count=1
+ )
+ validate_argmax = CompatValidator(
+     ARGMINMAX_DEFAULTS, fname="argmax", method="both", max_fname_arg_count=1
+ )
+ 
+ 
+ def process_skipna(skipna: bool | ndarray | None, args) -> tuple[bool, Any]:
+     if isinstance(skipna, ndarray) or skipna is None:
+         args = (skipna,) + args
+         skipna = True
+ 
+     return skipna, args
+ 
+ 
+ def validate_argmin_with_skipna(skipna: bool | ndarray | None, args, kwargs) -> bool:
+     """
+     If 'Series.argmin' is called via the 'numpy' library, the third parameter
+     in its signature is 'out', which takes either an ndarray or 'None', so
+     check if the 'skipna' parameter is either an instance of ndarray or is
+     None, since 'skipna' itself should be a boolean
+     """
+     skipna, args = process_skipna(skipna, args)
+     validate_argmin(args, kwargs)
+     return skipna
+ 
+ 
+ def validate_argmax_with_skipna(skipna: bool | ndarray | None, args, kwargs) -> bool:
+     """
+     If 'Series.argmax' is called via the 'numpy' library, the third parameter
+     in its signature is 'out', which takes either an ndarray or 'None', so
+     check if the 'skipna' parameter is either an instance of ndarray or is
+     None, since 'skipna' itself should be a boolean
+     """
+     skipna, args = process_skipna(skipna, args)
+     validate_argmax(args, kwargs)
+     return skipna
+ 
+ 
+ ARGSORT_DEFAULTS: dict[str, int | str | None] = {}
+ ARGSORT_DEFAULTS["axis"] = -1
+ ARGSORT_DEFAULTS["kind"] = "quicksort"
+ ARGSORT_DEFAULTS["order"] = None
+ ARGSORT_DEFAULTS["kind"] = None
+ ARGSORT_DEFAULTS["stable"] = None
+ 
+ 
+ validate_argsort = CompatValidator(
+     ARGSORT_DEFAULTS, fname="argsort", max_fname_arg_count=0, method="both"
+ )
+ 
+ # two different signatures of argsort, this second validation for when the
+ # `kind` param is supported
+ ARGSORT_DEFAULTS_KIND: dict[str, int | None] = {}
+ ARGSORT_DEFAULTS_KIND["axis"] = -1
+ ARGSORT_DEFAULTS_KIND["order"] = None
+ ARGSORT_DEFAULTS_KIND["stable"] = None
+ validate_argsort_kind = CompatValidator(
+     ARGSORT_DEFAULTS_KIND, fname="argsort", max_fname_arg_count=0, method="both"
+ )
+ 
+ 
+ def validate_argsort_with_ascending(ascending: bool | int | None, args, kwargs) -> bool:
+     """
+     If 'Categorical.argsort' is called via the 'numpy' library, the first
+     parameter in its signature is 'axis', which takes either an integer or
+     'None', so check if the 'ascending' parameter has either integer type or is
+     None, since 'ascending' itself should be a boolean
+     """
+     if is_integer(ascending) or ascending is None:
+         args = (ascending,) + args
+         ascending = True
+ 
+     validate_argsort_kind(args, kwargs, max_fname_arg_count=3)
+     ascending = cast(bool, ascending)
+     return ascending
+ 
+ 
+ CLIP_DEFAULTS: dict[str, Any] = {"out": None}
+ validate_clip = CompatValidator(
+     CLIP_DEFAULTS, fname="clip", method="both", max_fname_arg_count=3
+ )
+ 
+ 
+ @overload
+ def validate_clip_with_axis(axis: ndarray, args, kwargs) -> None:
+     ...
+ 
+ 
+ @overload
+ def validate_clip_with_axis(axis: AxisNoneT, args, kwargs) -> AxisNoneT:
+     ...
+ 
+ 
+ def validate_clip_with_axis(
+     axis: ndarray | AxisNoneT, args, kwargs
+ ) -> AxisNoneT | None:
+     """
+     If 'NDFrame.clip' is called via the numpy library, the third parameter in
+     its signature is 'out', which can take an ndarray, so check if the 'axis'
+     parameter is an instance of ndarray, since 'axis' itself should either be
+     an integer or None
+     """
+     if isinstance(axis, ndarray):
+         args = (axis,) + args
+         # error: Incompatible types in assignment (expression has type "None",
+         # variable has type "Union[ndarray[Any, Any], str, int]")
+         axis = None  # type: ignore[assignment]
+ 
+     validate_clip(args, kwargs)
+     # error: Incompatible return value type (got "Union[ndarray[Any, Any],
+     # str, int]", expected "Union[str, int, None]")
+     return axis  # type: ignore[return-value]
+ 
+ 
+ CUM_FUNC_DEFAULTS: dict[str, Any] = {}
+ CUM_FUNC_DEFAULTS["dtype"] = None
+ CUM_FUNC_DEFAULTS["out"] = None
+ validate_cum_func = CompatValidator(
+     CUM_FUNC_DEFAULTS, method="both", max_fname_arg_count=1
+ )
+ validate_cumsum = CompatValidator(
+     CUM_FUNC_DEFAULTS, fname="cumsum", method="both", max_fname_arg_count=1
+ )
+ 
+ 
+ def validate_cum_func_with_skipna(skipna: bool, args, kwargs, name) -> bool:
+     """
+     If this function is called via the 'numpy' library, the third parameter in
+     its signature is 'dtype', which takes either a 'numpy' dtype or 'None', so
+     check if the 'skipna' parameter is a boolean or not
+     """
+     if not is_bool(skipna):
+         args = (skipna,) + args
+         skipna = True
+     elif isinstance(skipna, np.bool_):
+         skipna = bool(skipna)
+ 
+     validate_cum_func(args, kwargs, fname=name)
+     return skipna
+ 
+ 
+ ALLANY_DEFAULTS: dict[str, bool | None] = {}
+ ALLANY_DEFAULTS["dtype"] = None
+ ALLANY_DEFAULTS["out"] = None
+ ALLANY_DEFAULTS["keepdims"] = False
+ ALLANY_DEFAULTS["axis"] = None
+ validate_all = CompatValidator(
+     ALLANY_DEFAULTS, fname="all", method="both", max_fname_arg_count=1
+ )
+ validate_any = CompatValidator(
+     ALLANY_DEFAULTS, fname="any", method="both", max_fname_arg_count=1
+ )
+ 
+ LOGICAL_FUNC_DEFAULTS = {"out": None, "keepdims": False}
+ validate_logical_func = CompatValidator(LOGICAL_FUNC_DEFAULTS, method="kwargs")
+ 
+ MINMAX_DEFAULTS = {"axis": None, "dtype": None, "out": None, "keepdims": False}
+ validate_min = CompatValidator(
+     MINMAX_DEFAULTS, fname="min", method="both", max_fname_arg_count=1
+ )
+ validate_max = CompatValidator(
+     MINMAX_DEFAULTS, fname="max", method="both", max_fname_arg_count=1
+ )
+ 
+ RESHAPE_DEFAULTS: dict[str, str] = {"order": "C"}
+ validate_reshape = CompatValidator(
+     RESHAPE_DEFAULTS, fname="reshape", method="both", max_fname_arg_count=1
+ )
+ 
+ REPEAT_DEFAULTS: dict[str, Any] = {"axis": None}
+ validate_repeat = CompatValidator(
+     REPEAT_DEFAULTS, fname="repeat", method="both", max_fname_arg_count=1
+ )
+ 
+ ROUND_DEFAULTS: dict[str, Any] = {"out": None}
+ validate_round = CompatValidator(
+     ROUND_DEFAULTS, fname="round", method="both", max_fname_arg_count=1
+ )
+ 
+ SORT_DEFAULTS: dict[str, int | str | None] = {}
+ SORT_DEFAULTS["axis"] = -1
+ SORT_DEFAULTS["kind"] = "quicksort"
+ SORT_DEFAULTS["order"] = None
+ validate_sort = CompatValidator(SORT_DEFAULTS, fname="sort", method="kwargs")
+ 
+ STAT_FUNC_DEFAULTS: dict[str, Any | None] = {}
+ STAT_FUNC_DEFAULTS["dtype"] = None
+ STAT_FUNC_DEFAULTS["out"] = None
+ 
+ SUM_DEFAULTS = STAT_FUNC_DEFAULTS.copy()
+ SUM_DEFAULTS["axis"] = None
+ SUM_DEFAULTS["keepdims"] = False
+ SUM_DEFAULTS["initial"] = None
+ 
+ PROD_DEFAULTS = SUM_DEFAULTS.copy()
+ 
+ MEAN_DEFAULTS = SUM_DEFAULTS.copy()
+ 
+ MEDIAN_DEFAULTS = STAT_FUNC_DEFAULTS.copy()
+ MEDIAN_DEFAULTS["overwrite_input"] = False
+ MEDIAN_DEFAULTS["keepdims"] = False
+ 
+ STAT_FUNC_DEFAULTS["keepdims"] = False
+ 
+ validate_stat_func = CompatValidator(STAT_FUNC_DEFAULTS, method="kwargs")
+ validate_sum = CompatValidator(
+     SUM_DEFAULTS, fname="sum", method="both", max_fname_arg_count=1
+ )
+ validate_prod = CompatValidator(
+     PROD_DEFAULTS, fname="prod", method="both", max_fname_arg_count=1
+ )
+ validate_mean = CompatValidator(
+     MEAN_DEFAULTS, fname="mean", method="both", max_fname_arg_count=1
+ )
+ validate_median = CompatValidator(
+     MEDIAN_DEFAULTS, fname="median", method="both", max_fname_arg_count=1
+ )
+ 
+ STAT_DDOF_FUNC_DEFAULTS: dict[str, bool | None] = {}
+ STAT_DDOF_FUNC_DEFAULTS["dtype"] = None
+ STAT_DDOF_FUNC_DEFAULTS["out"] = None
+ STAT_DDOF_FUNC_DEFAULTS["keepdims"] = False
+ validate_stat_ddof_func = CompatValidator(STAT_DDOF_FUNC_DEFAULTS, method="kwargs")
+ 
+ TAKE_DEFAULTS: dict[str, str | None] = {}
+ TAKE_DEFAULTS["out"] = None
+ TAKE_DEFAULTS["mode"] = "raise"
+ validate_take = CompatValidator(TAKE_DEFAULTS, fname="take", method="kwargs")
+ 
+ 
+ def validate_take_with_convert(convert: ndarray | bool | None, args, kwargs) -> bool:
+     """
+     If this function is called via the 'numpy' library, the third parameter in
+     its signature is 'axis', which takes either an ndarray or 'None', so check
+     if the 'convert' parameter is either an instance of ndarray or is None
+     """
+     if isinstance(convert, ndarray) or convert is None:
+         args = (convert,) + args
+         convert = True
+ 
+     validate_take(args, kwargs, max_fname_arg_count=3, method="both")
+     return convert
+ 
+ 
+ TRANSPOSE_DEFAULTS = {"axes": None}
+ validate_transpose = CompatValidator(
+     TRANSPOSE_DEFAULTS, fname="transpose", method="both", max_fname_arg_count=0
+ )
+ 
+ 
+ def validate_groupby_func(name: str, args, kwargs, allowed=None) -> None:
+     """
+     'args' and 'kwargs' should be empty, except for allowed kwargs because all
+     of their necessary parameters are explicitly listed in the function
+     signature
+     """
+     if allowed is None:
+         allowed = []
+ 
+     kwargs = set(kwargs) - set(allowed)
+ 
+     if len(args) + len(kwargs) > 0:
+         raise UnsupportedFunctionCall(
+             "numpy operations are not valid with groupby. "
+             f"Use .groupby(...).{name}() instead"
+         )
+ 
+ 
+ RESAMPLER_NUMPY_OPS = ("min", "max", "sum", "prod", "mean", "std", "var")
+ 
+ 
+ def validate_resampler_func(method: str, args, kwargs) -> None:
+     """
+     'args' and 'kwargs' should be empty because all of their necessary
+     parameters are explicitly listed in the function signature
+     """
+     if len(args) + len(kwargs) > 0:
+         if method in RESAMPLER_NUMPY_OPS:
+             raise UnsupportedFunctionCall(
+                 "numpy operations are not valid with resample. "
+                 f"Use .resample(...).{method}() instead"
+             )
+         raise TypeError("too many arguments passed in")
+ 
+ 
+ def validate_minmax_axis(axis: AxisInt | None, ndim: int = 1) -> None:
+     """
+     Ensure that the axis argument passed to min, max, argmin, or argmax is zero
+     or None, as otherwise it will be incorrectly ignored.
+ 
+     Parameters
+     ----------
+     axis : int or None
+     ndim : int, default 1
+ 
+     Raises
+     ------
+     ValueError
+     """
+     if axis is None:
+         return
+     if axis >= ndim or (axis < 0 and ndim + axis < 0):
+         raise ValueError(f"`axis` must be fewer than the number of dimensions ({ndim})")
+ 
+ 
+ _validation_funcs = {
+     "median": validate_median,
+     "mean": validate_mean,
+     "min": validate_min,
+     "max": validate_max,
+     "sum": validate_sum,
+     "prod": validate_prod,
+ }
+ 
+ 
+ def validate_func(fname, args, kwargs) -> None:
+     if fname not in _validation_funcs:
+         return validate_stat_func(args, kwargs, fname=fname)
+ 
+     validation_func = _validation_funcs[fname]
+     return validation_func(args, kwargs)
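The `CompatValidator` machinery ultimately rejects any numpy-compat argument whose value differs from its documented default. A simplified sketch of that check (`validate_kwargs` here is a hypothetical stand-in, not the `pandas.util._validators` function):

```python
def validate_kwargs(fname, kwargs, defaults):
    # Reject unknown keywords, and reject non-default values for known
    # ones, mirroring the contract described in the module docstring above.
    for key, value in kwargs.items():
        if key not in defaults:
            raise TypeError(f"{fname}() got an unexpected keyword argument '{key}'")
        if value != defaults[key]:
            raise ValueError(
                f"the '{key}' parameter is not supported in "
                f"the pandas implementation of {fname}()"
            )

validate_kwargs("sum", {"out": None}, {"out": None, "keepdims": False})  # passes
try:
    validate_kwargs("sum", {"keepdims": True}, {"out": None, "keepdims": False})
except ValueError as err:
    print(err)
```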
deepseek/lib/python3.10/site-packages/pandas/compat/pyarrow.py ADDED
@@ -0,0 +1,29 @@
+ """ support pyarrow compatibility across versions """
+ 
+ from __future__ import annotations
+ 
+ from pandas.util.version import Version
+ 
+ try:
+     import pyarrow as pa
+ 
+     _palv = Version(Version(pa.__version__).base_version)
+     pa_version_under10p1 = _palv < Version("10.0.1")
+     pa_version_under11p0 = _palv < Version("11.0.0")
+     pa_version_under12p0 = _palv < Version("12.0.0")
+     pa_version_under13p0 = _palv < Version("13.0.0")
+     pa_version_under14p0 = _palv < Version("14.0.0")
+     pa_version_under14p1 = _palv < Version("14.0.1")
+     pa_version_under15p0 = _palv < Version("15.0.0")
+     pa_version_under16p0 = _palv < Version("16.0.0")
+     pa_version_under17p0 = _palv < Version("17.0.0")
+ except ImportError:
+     pa_version_under10p1 = True
+     pa_version_under11p0 = True
+     pa_version_under12p0 = True
+     pa_version_under13p0 = True
+     pa_version_under14p0 = True
+     pa_version_under14p1 = True
+     pa_version_under15p0 = True
+     pa_version_under16p0 = True
+     pa_version_under17p0 = True
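The flag block compares against the base version so pre-release suffixes like `14.0.1.dev0` don't affect ordering. A rough stdlib-only illustration of that idea (the real code uses `pandas.util.version.Version`; `base_release` below is a hypothetical stand-in):

```python
def base_release(version: str) -> tuple[int, ...]:
    # Keep only the leading numeric segments, so "14.0.1.dev0" -> (14, 0, 1).
    parts = []
    for piece in version.split("."):
        if piece.isdigit():
            parts.append(int(piece))
        else:
            break
    return tuple(parts)

pa_version = "14.0.1.dev0"  # hypothetical installed version string
pa_version_under14p1 = base_release(pa_version) < base_release("14.0.1")
pa_version_under15p0 = base_release(pa_version) < base_release("15.0.0")
print(pa_version_under14p1, pa_version_under15p0)  # False True
```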
deepseek/lib/python3.10/site-packages/pandas/plotting/_matplotlib/__init__.py ADDED
@@ -0,0 +1,93 @@
+ from __future__ import annotations
+ 
+ from typing import TYPE_CHECKING
+ 
+ from pandas.plotting._matplotlib.boxplot import (
+     BoxPlot,
+     boxplot,
+     boxplot_frame,
+     boxplot_frame_groupby,
+ )
+ from pandas.plotting._matplotlib.converter import (
+     deregister,
+     register,
+ )
+ from pandas.plotting._matplotlib.core import (
+     AreaPlot,
+     BarhPlot,
+     BarPlot,
+     HexBinPlot,
+     LinePlot,
+     PiePlot,
+     ScatterPlot,
+ )
+ from pandas.plotting._matplotlib.hist import (
+     HistPlot,
+     KdePlot,
+     hist_frame,
+     hist_series,
+ )
+ from pandas.plotting._matplotlib.misc import (
+     andrews_curves,
+     autocorrelation_plot,
+     bootstrap_plot,
+     lag_plot,
+     parallel_coordinates,
+     radviz,
+     scatter_matrix,
+ )
+ from pandas.plotting._matplotlib.tools import table
+ 
+ if TYPE_CHECKING:
+     from pandas.plotting._matplotlib.core import MPLPlot
+ 
+ PLOT_CLASSES: dict[str, type[MPLPlot]] = {
+     "line": LinePlot,
+     "bar": BarPlot,
+     "barh": BarhPlot,
+     "box": BoxPlot,
+     "hist": HistPlot,
+     "kde": KdePlot,
+     "area": AreaPlot,
+     "pie": PiePlot,
+     "scatter": ScatterPlot,
+     "hexbin": HexBinPlot,
+ }
+ 
+ 
+ def plot(data, kind, **kwargs):
+     # Importing pyplot at the top of the file (before the converters are
+     # registered) causes problems in matplotlib 2 (converters seem to not
+     # work)
+     import matplotlib.pyplot as plt
+ 
+     if kwargs.pop("reuse_plot", False):
+         ax = kwargs.get("ax")
+         if ax is None and len(plt.get_fignums()) > 0:
+             with plt.rc_context():
+                 ax = plt.gca()
+             kwargs["ax"] = getattr(ax, "left_ax", ax)
+     plot_obj = PLOT_CLASSES[kind](data, **kwargs)
+     plot_obj.generate()
+     plot_obj.draw()
+     return plot_obj.result
+ 
+ 
+ __all__ = [
+     "plot",
+     "hist_series",
+     "hist_frame",
+     "boxplot",
+     "boxplot_frame",
+     "boxplot_frame_groupby",
+     "table",
+     "andrews_curves",
+     "autocorrelation_plot",
+     "bootstrap_plot",
+     "lag_plot",
+     "parallel_coordinates",
+     "radviz",
+     "scatter_matrix",
+     "register",
+     "deregister",
+ ]
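`plot()` is a small dispatch: look up the kind in `PLOT_CLASSES`, instantiate, then generate/draw and return the result. The shape of that dispatch, sketched with a hypothetical stand-in class rather than the real `MPLPlot` subclasses:

```python
# Hypothetical stand-in for an MPLPlot subclass; only the dispatch
# shape of plot() is reproduced here, no actual rendering happens.
class LinePlot:
    def __init__(self, data, **kwargs):
        self.data = data
        self.result = None

    def generate(self):
        self.result = f"line({self.data})"

    def draw(self):
        pass  # real code pushes to the matplotlib canvas here

PLOT_CLASSES = {"line": LinePlot}

def plot(data, kind, **kwargs):
    plot_obj = PLOT_CLASSES[kind](data, **kwargs)
    plot_obj.generate()
    plot_obj.draw()
    return plot_obj.result

print(plot([1, 2, 3], "line"))  # line([1, 2, 3])
```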
deepseek/lib/python3.10/site-packages/pandas/plotting/_matplotlib/__pycache__/hist.cpython-310.pyc ADDED
Binary file (12.8 kB).
 
deepseek/lib/python3.10/site-packages/pandas/plotting/_matplotlib/__pycache__/timeseries.cpython-310.pyc ADDED
Binary file (8.04 kB).
 
deepseek/lib/python3.10/site-packages/pandas/plotting/_matplotlib/__pycache__/tools.cpython-310.pyc ADDED
Binary file (11.8 kB).
 
deepseek/lib/python3.10/site-packages/pandas/plotting/_matplotlib/converter.py ADDED
@@ -0,0 +1,1139 @@
1
+ from __future__ import annotations
2
+
3
+ import contextlib
4
+ import datetime as pydt
5
+ from datetime import (
6
+ datetime,
7
+ timedelta,
8
+ tzinfo,
9
+ )
10
+ import functools
11
+ from typing import (
12
+ TYPE_CHECKING,
13
+ Any,
14
+ cast,
15
+ )
16
+ import warnings
17
+
18
+ import matplotlib.dates as mdates
19
+ from matplotlib.ticker import (
20
+ AutoLocator,
21
+ Formatter,
22
+ Locator,
23
+ )
24
+ from matplotlib.transforms import nonsingular
25
+ import matplotlib.units as munits
26
+ import numpy as np
27
+
28
+ from pandas._libs import lib
29
+ from pandas._libs.tslibs import (
30
+ Timestamp,
31
+ to_offset,
32
+ )
33
+ from pandas._libs.tslibs.dtypes import (
34
+ FreqGroup,
35
+ periods_per_day,
36
+ )
37
+ from pandas._typing import (
38
+ F,
39
+ npt,
40
+ )
41
+
42
+ from pandas.core.dtypes.common import (
43
+ is_float,
44
+ is_float_dtype,
45
+ is_integer,
46
+ is_integer_dtype,
47
+ is_nested_list_like,
48
+ )
49
+
50
+ from pandas import (
51
+ Index,
52
+ Series,
53
+ get_option,
54
+ )
55
+ import pandas.core.common as com
56
+ from pandas.core.indexes.datetimes import date_range
57
+ from pandas.core.indexes.period import (
58
+ Period,
59
+ PeriodIndex,
60
+ period_range,
61
+ )
62
+ import pandas.core.tools.datetimes as tools
63
+
64
+ if TYPE_CHECKING:
65
+ from collections.abc import Generator
66
+
67
+ from matplotlib.axis import Axis
68
+
69
+ from pandas._libs.tslibs.offsets import BaseOffset
70
+
71
+
72
+ _mpl_units = {} # Cache for units overwritten by us
73
+
74
+
75
+ def get_pairs():
76
+ pairs = [
77
+ (Timestamp, DatetimeConverter),
78
+ (Period, PeriodConverter),
79
+ (pydt.datetime, DatetimeConverter),
80
+ (pydt.date, DatetimeConverter),
81
+ (pydt.time, TimeConverter),
82
+ (np.datetime64, DatetimeConverter),
83
+ ]
84
+ return pairs
85
+
86
+
87
+ def register_pandas_matplotlib_converters(func: F) -> F:
88
+ """
89
+ Decorator applying pandas_converters.
90
+ """
91
+
92
+ @functools.wraps(func)
93
+ def wrapper(*args, **kwargs):
94
+ with pandas_converters():
95
+ return func(*args, **kwargs)
96
+
97
+ return cast(F, wrapper)
98
+
99
+
100
+ @contextlib.contextmanager
101
+ def pandas_converters() -> Generator[None, None, None]:
102
+ """
103
+ Context manager registering pandas' converters for a plot.
104
+
105
+ See Also
106
+ --------
107
+ register_pandas_matplotlib_converters : Decorator that applies this.
108
+ """
109
+ value = get_option("plotting.matplotlib.register_converters")
110
+
111
+ if value:
112
+ # register for True or "auto"
113
+ register()
114
+ try:
115
+ yield
116
+ finally:
117
+ if value == "auto":
118
+ # only deregister for "auto"
119
+ deregister()
120
+
121
+
122
+ def register() -> None:
123
+ pairs = get_pairs()
124
+ for type_, cls in pairs:
125
+ # Cache previous converter if present
126
+ if type_ in munits.registry and not isinstance(munits.registry[type_], cls):
127
+ previous = munits.registry[type_]
128
+ _mpl_units[type_] = previous
129
+ # Replace with pandas converter
130
+ munits.registry[type_] = cls()
131
+
132
+
133
+ def deregister() -> None:
134
+ # Renamed in pandas.plotting.__init__
135
+ for type_, cls in get_pairs():
136
+ # We use type to catch our classes directly, no inheritance
137
+ if type(munits.registry.get(type_)) is cls:
138
+ munits.registry.pop(type_)
139
+
140
+ # restore the old keys
141
+ for unit, formatter in _mpl_units.items():
142
+ if type(formatter) not in {DatetimeConverter, PeriodConverter, TimeConverter}:
143
+ # make it idempotent by excluding ours.
144
+ munits.registry[unit] = formatter
145
+
146
+
147
+ def _to_ordinalf(tm: pydt.time) -> float:
148
+ tot_sec = tm.hour * 3600 + tm.minute * 60 + tm.second + tm.microsecond / 10**6
149
+ return tot_sec
150
+
151
+
152
+ def time2num(d):
153
+ if isinstance(d, str):
154
+ parsed = Timestamp(d)
155
+ return _to_ordinalf(parsed.time())
156
+ if isinstance(d, pydt.time):
157
+ return _to_ordinalf(d)
158
+ return d
159
+
160
+
161
+ class TimeConverter(munits.ConversionInterface):
162
+ @staticmethod
163
+ def convert(value, unit, axis):
164
+ valid_types = (str, pydt.time)
165
+ if isinstance(value, valid_types) or is_integer(value) or is_float(value):
166
+ return time2num(value)
167
+ if isinstance(value, Index):
168
+ return value.map(time2num)
169
+ if isinstance(value, (list, tuple, np.ndarray, Index)):
170
+ return [time2num(x) for x in value]
171
+ return value
172
+
173
+ @staticmethod
174
+ def axisinfo(unit, axis) -> munits.AxisInfo | None:
175
+ if unit != "time":
176
+ return None
177
+
178
+ majloc = AutoLocator()
179
+ majfmt = TimeFormatter(majloc)
180
+ return munits.AxisInfo(majloc=majloc, majfmt=majfmt, label="time")
181
+
182
+ @staticmethod
183
+ def default_units(x, axis) -> str:
184
+ return "time"
185
+
186
+
187
+ # time formatter
188
+ class TimeFormatter(Formatter):
189
+ def __init__(self, locs) -> None:
190
+ self.locs = locs
191
+
192
+ def __call__(self, x, pos: int | None = 0) -> str:
193
+ """
194
+ Return the time of day as a formatted string.
195
+
196
+ Parameters
197
+ ----------
198
+ x : float
199
+ The time of day specified as seconds since 00:00 (midnight),
200
+ with up to microsecond precision.
201
+ pos
202
+ Unused
203
+
204
+ Returns
205
+ -------
206
+ str
207
+ A string in HH:MM:SS.mmmuuu format. Microseconds,
208
+ milliseconds and seconds are only displayed if non-zero.
209
+ """
210
+ fmt = "%H:%M:%S.%f"
211
+ s = int(x)
212
+ msus = round((x - s) * 10**6)
213
+ ms = msus // 1000
214
+ us = msus % 1000
215
+ m, s = divmod(s, 60)
216
+ h, m = divmod(m, 60)
217
+ _, h = divmod(h, 24)
218
+ if us != 0:
219
+ return pydt.time(h, m, s, msus).strftime(fmt)
220
+ elif ms != 0:
221
+ return pydt.time(h, m, s, msus).strftime(fmt)[:-3]
222
+ elif s != 0:
223
+ return pydt.time(h, m, s).strftime("%H:%M:%S")
224
+
225
+ return pydt.time(h, m).strftime("%H:%M")
+
+
+ # Period Conversion
+
+
+ class PeriodConverter(mdates.DateConverter):
+     @staticmethod
+     def convert(values, units, axis):
+         if is_nested_list_like(values):
+             values = [PeriodConverter._convert_1d(v, units, axis) for v in values]
+         else:
+             values = PeriodConverter._convert_1d(values, units, axis)
+         return values
+
+     @staticmethod
+     def _convert_1d(values, units, axis):
+         if not hasattr(axis, "freq"):
+             raise TypeError("Axis must have `freq` set to convert to Periods")
+         valid_types = (str, datetime, Period, pydt.date, pydt.time, np.datetime64)
+         with warnings.catch_warnings():
+             warnings.filterwarnings(
+                 "ignore", "Period with BDay freq is deprecated", category=FutureWarning
+             )
+             warnings.filterwarnings(
+                 "ignore", r"PeriodDtype\[B\] is deprecated", category=FutureWarning
+             )
+             if (
+                 isinstance(values, valid_types)
+                 or is_integer(values)
+                 or is_float(values)
+             ):
+                 return get_datevalue(values, axis.freq)
+             elif isinstance(values, PeriodIndex):
+                 return values.asfreq(axis.freq).asi8
+             elif isinstance(values, Index):
+                 return values.map(lambda x: get_datevalue(x, axis.freq))
+             elif lib.infer_dtype(values, skipna=False) == "period":
+                 # https://github.com/pandas-dev/pandas/issues/24304
+                 # convert ndarray[period] -> PeriodIndex
+                 return PeriodIndex(values, freq=axis.freq).asi8
+             elif isinstance(values, (list, tuple, np.ndarray, Index)):
+                 return [get_datevalue(x, axis.freq) for x in values]
+             return values
+
+
+ def get_datevalue(date, freq):
+     if isinstance(date, Period):
+         return date.asfreq(freq).ordinal
+     elif isinstance(date, (str, datetime, pydt.date, pydt.time, np.datetime64)):
+         return Period(date, freq).ordinal
+     elif (
+         is_integer(date)
+         or is_float(date)
+         or (isinstance(date, (np.ndarray, Index)) and (date.size == 1))
+     ):
+         return date
+     elif date is None:
+         return None
+     raise ValueError(f"Unrecognizable date '{date}'")
+
+
+ # Datetime Conversion
+ class DatetimeConverter(mdates.DateConverter):
+     @staticmethod
+     def convert(values, unit, axis):
+         # values might be a 1-d array, or a list-like of arrays.
+         if is_nested_list_like(values):
+             values = [DatetimeConverter._convert_1d(v, unit, axis) for v in values]
+         else:
+             values = DatetimeConverter._convert_1d(values, unit, axis)
+         return values
+
+     @staticmethod
+     def _convert_1d(values, unit, axis):
+         def try_parse(values):
+             try:
+                 return mdates.date2num(tools.to_datetime(values))
+             except Exception:
+                 return values
+
+         if isinstance(values, (datetime, pydt.date, np.datetime64, pydt.time)):
+             return mdates.date2num(values)
+         elif is_integer(values) or is_float(values):
+             return values
+         elif isinstance(values, str):
+             return try_parse(values)
+         elif isinstance(values, (list, tuple, np.ndarray, Index, Series)):
+             if isinstance(values, Series):
+                 # https://github.com/matplotlib/matplotlib/issues/11391
+                 # Series was skipped. Convert to DatetimeIndex to get asi8
+                 values = Index(values)
+             if isinstance(values, Index):
+                 values = values.values
+             if not isinstance(values, np.ndarray):
+                 values = com.asarray_tuplesafe(values)
+
+             if is_integer_dtype(values) or is_float_dtype(values):
+                 return values
+
+             try:
+                 values = tools.to_datetime(values)
+             except Exception:
+                 pass
+
+             values = mdates.date2num(values)
+
+         return values
+
+     @staticmethod
+     def axisinfo(unit: tzinfo | None, axis) -> munits.AxisInfo:
+         """
+         Return the :class:`~matplotlib.units.AxisInfo` for *unit*.
+
+         *unit* is a tzinfo instance or None.
+         The *axis* argument is required but not used.
+         """
+         tz = unit
+
+         majloc = PandasAutoDateLocator(tz=tz)
+         majfmt = PandasAutoDateFormatter(majloc, tz=tz)
+         datemin = pydt.date(2000, 1, 1)
+         datemax = pydt.date(2010, 1, 1)
+
+         return munits.AxisInfo(
+             majloc=majloc, majfmt=majfmt, label="", default_limits=(datemin, datemax)
+         )
+
+
+ class PandasAutoDateFormatter(mdates.AutoDateFormatter):
+     def __init__(self, locator, tz=None, defaultfmt: str = "%Y-%m-%d") -> None:
+         mdates.AutoDateFormatter.__init__(self, locator, tz, defaultfmt)
+
+
+ class PandasAutoDateLocator(mdates.AutoDateLocator):
+     def get_locator(self, dmin, dmax):
+         """Pick the best locator based on a distance."""
+         tot_sec = (dmax - dmin).total_seconds()
+
+         if abs(tot_sec) < self.minticks:
+             self._freq = -1
+             locator = MilliSecondLocator(self.tz)
+             locator.set_axis(self.axis)
+
+             # error: Item "None" of "Axis | _DummyAxis | _AxisWrapper | None"
+             # has no attribute "get_data_interval"
+             locator.axis.set_view_interval(  # type: ignore[union-attr]
+                 *self.axis.get_view_interval()  # type: ignore[union-attr]
+             )
+             locator.axis.set_data_interval(  # type: ignore[union-attr]
+                 *self.axis.get_data_interval()  # type: ignore[union-attr]
+             )
+             return locator
+
+         return mdates.AutoDateLocator.get_locator(self, dmin, dmax)
+
+     def _get_unit(self):
+         return MilliSecondLocator.get_unit_generic(self._freq)
+
+
+ class MilliSecondLocator(mdates.DateLocator):
+     UNIT = 1.0 / (24 * 3600 * 1000)
+
+     def __init__(self, tz) -> None:
+         mdates.DateLocator.__init__(self, tz)
+         self._interval = 1.0
+
+     def _get_unit(self):
+         return self.get_unit_generic(-1)
+
+     @staticmethod
+     def get_unit_generic(freq):
+         unit = mdates.RRuleLocator.get_unit_generic(freq)
+         if unit < 0:
+             return MilliSecondLocator.UNIT
+         return unit
+
+     def __call__(self):
+         # if no data have been set, this will tank with a ValueError
+         try:
+             dmin, dmax = self.viewlim_to_dt()
+         except ValueError:
+             return []
+
+         # We need to cap at the endpoints of valid datetime
+         nmax, nmin = mdates.date2num((dmax, dmin))
+
+         num = (nmax - nmin) * 86400 * 1000
+         max_millis_ticks = 6
+         for interval in [1, 10, 50, 100, 200, 500]:
+             if num <= interval * (max_millis_ticks - 1):
+                 self._interval = interval
+                 break
+         else:
+             # We went through the whole loop without breaking,
+             # default to 1000ms (1 second)
+             self._interval = 1000.0
+
+         estimate = (nmax - nmin) / (self._get_unit() * self._get_interval())
+
+         if estimate > self.MAXTICKS * 2:
+             raise RuntimeError(
+                 "MillisecondLocator estimated to generate "
+                 f"{estimate:.0f} ticks from {dmin} to {dmax}: exceeds Locator.MAXTICKS"
+                 f"* 2 ({self.MAXTICKS * 2:d}) "
+             )
+
+         interval = self._get_interval()
+         freq = f"{interval}ms"
+         tz = self.tz.tzname(None)
+         st = dmin.replace(tzinfo=None)
+         ed = dmax.replace(tzinfo=None)
+         all_dates = date_range(start=st, end=ed, freq=freq, tz=tz).astype(object)
+
+         try:
+             if len(all_dates) > 0:
+                 locs = self.raise_if_exceeds(mdates.date2num(all_dates))
+                 return locs
+         except Exception:  # pragma: no cover
+             pass
+
+         lims = mdates.date2num([dmin, dmax])
+         return lims
+
+     def _get_interval(self):
+         return self._interval
+
+     def autoscale(self):
+         """
+         Set the view limits to include the data range.
+         """
+         # We need to cap at the endpoints of valid datetime
+         dmin, dmax = self.datalim_to_dt()
+
+         vmin = mdates.date2num(dmin)
+         vmax = mdates.date2num(dmax)
+
+         return self.nonsingular(vmin, vmax)
+
+
+ def _from_ordinal(x, tz: tzinfo | None = None) -> datetime:
+     ix = int(x)
+     dt = datetime.fromordinal(ix)
+     remainder = float(x) - ix
+     hour, remainder = divmod(24 * remainder, 1)
+     minute, remainder = divmod(60 * remainder, 1)
+     second, remainder = divmod(60 * remainder, 1)
+     microsecond = int(1_000_000 * remainder)
+     if microsecond < 10:
+         microsecond = 0  # compensate for rounding errors
+     dt = datetime(
+         dt.year, dt.month, dt.day, int(hour), int(minute), int(second), microsecond
+     )
+     if tz is not None:
+         dt = dt.astimezone(tz)
+
+     if microsecond > 999990:  # compensate for rounding errors
+         dt += timedelta(microseconds=1_000_000 - microsecond)
+
+     return dt
+
+
+ # Fixed frequency dynamic tick locators and formatters
+
+ # -------------------------------------------------------------------------
+ # --- Locators ---
+ # -------------------------------------------------------------------------
+
+
+ def _get_default_annual_spacing(nyears) -> tuple[int, int]:
+     """
+     Returns a default spacing between consecutive ticks for annual data.
+     """
+     if nyears < 11:
+         (min_spacing, maj_spacing) = (1, 1)
+     elif nyears < 20:
+         (min_spacing, maj_spacing) = (1, 2)
+     elif nyears < 50:
+         (min_spacing, maj_spacing) = (1, 5)
+     elif nyears < 100:
+         (min_spacing, maj_spacing) = (5, 10)
+     elif nyears < 200:
+         (min_spacing, maj_spacing) = (5, 25)
+     elif nyears < 600:
+         (min_spacing, maj_spacing) = (10, 50)
+     else:
+         factor = nyears // 1000 + 1
+         (min_spacing, maj_spacing) = (factor * 20, factor * 100)
+     return (min_spacing, maj_spacing)
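`_get_default_annual_spacing` is a pure lookup table, so its behavior is easy to check directly. A minimal standalone copy (renamed `annual_spacing`; not part of the pandas API) returning the same (minor, major) year spacings:

```python
def annual_spacing(nyears: float) -> tuple[int, int]:
    """Return (minor, major) tick spacing in years, mirroring the table above."""
    if nyears < 11:
        return (1, 1)
    elif nyears < 20:
        return (1, 2)
    elif nyears < 50:
        return (1, 5)
    elif nyears < 100:
        return (5, 10)
    elif nyears < 200:
        return (5, 25)
    elif nyears < 600:
        return (10, 50)
    # beyond 600 years, scale the spacing with the span
    factor = int(nyears) // 1000 + 1
    return (factor * 20, factor * 100)


print(annual_spacing(30))  # (1, 5)
print(annual_spacing(2500))  # factor = 3 -> (60, 300)
```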
+
+
+ def _period_break(dates: PeriodIndex, period: str) -> npt.NDArray[np.intp]:
+     """
+     Returns the indices where the given period changes.
+
+     Parameters
+     ----------
+     dates : PeriodIndex
+         Array of intervals to monitor.
+     period : str
+         Name of the period to monitor.
+     """
+     mask = _period_break_mask(dates, period)
+     return np.nonzero(mask)[0]
+
+
+ def _period_break_mask(dates: PeriodIndex, period: str) -> npt.NDArray[np.bool_]:
+     current = getattr(dates, period)
+     previous = getattr(dates - 1 * dates.freq, period)
+     return current != previous
+
+
+ def has_level_label(label_flags: npt.NDArray[np.intp], vmin: float) -> bool:
+     """
+     Returns true if the ``label_flags`` indicate there is at least one label
+     for this level.
+
+     If the minimum view limit is not an exact integer, then the first tick
+     label won't be shown, so we must adjust for that.
+     """
+     if label_flags.size == 0 or (
+         label_flags.size == 1 and label_flags[0] == 0 and vmin % 1 > 0.0
+     ):
+         return False
+     else:
+         return True
+
+
+ def _get_periods_per_ymd(freq: BaseOffset) -> tuple[int, int, int]:
+     # error: "BaseOffset" has no attribute "_period_dtype_code"
+     dtype_code = freq._period_dtype_code  # type: ignore[attr-defined]
+     freq_group = FreqGroup.from_period_dtype_code(dtype_code)
+
+     ppd = -1  # placeholder for above-day freqs
+
+     if dtype_code >= FreqGroup.FR_HR.value:
+         # error: "BaseOffset" has no attribute "_creso"
+         ppd = periods_per_day(freq._creso)  # type: ignore[attr-defined]
+         ppm = 28 * ppd
+         ppy = 365 * ppd
+     elif freq_group == FreqGroup.FR_BUS:
+         ppm = 19
+         ppy = 261
+     elif freq_group == FreqGroup.FR_DAY:
+         ppm = 28
+         ppy = 365
+     elif freq_group == FreqGroup.FR_WK:
+         ppm = 3
+         ppy = 52
+     elif freq_group == FreqGroup.FR_MTH:
+         ppm = 1
+         ppy = 12
+     elif freq_group == FreqGroup.FR_QTR:
+         ppm = -1  # placeholder
+         ppy = 4
+     elif freq_group == FreqGroup.FR_ANN:
+         ppm = -1  # placeholder
+         ppy = 1
+     else:
+         raise NotImplementedError(f"Unsupported frequency: {dtype_code}")
+
+     return ppd, ppm, ppy
+
+
+ @functools.cache
+ def _daily_finder(vmin: float, vmax: float, freq: BaseOffset) -> np.ndarray:
+     # error: "BaseOffset" has no attribute "_period_dtype_code"
+     dtype_code = freq._period_dtype_code  # type: ignore[attr-defined]
+
+     periodsperday, periodspermonth, periodsperyear = _get_periods_per_ymd(freq)
+
+     # save this for later usage
+     vmin_orig = vmin
+     (vmin, vmax) = (int(vmin), int(vmax))
+     span = vmax - vmin + 1
+
+     with warnings.catch_warnings():
+         warnings.filterwarnings(
+             "ignore", "Period with BDay freq is deprecated", category=FutureWarning
+         )
+         warnings.filterwarnings(
+             "ignore", r"PeriodDtype\[B\] is deprecated", category=FutureWarning
+         )
+         dates_ = period_range(
+             start=Period(ordinal=vmin, freq=freq),
+             end=Period(ordinal=vmax, freq=freq),
+             freq=freq,
+         )
+
+     # Initialize the output
+     info = np.zeros(
+         span, dtype=[("val", np.int64), ("maj", bool), ("min", bool), ("fmt", "|S20")]
+     )
+     info["val"][:] = dates_.asi8
+     info["fmt"][:] = ""
+     info["maj"][[0, -1]] = True
+     # .. and set some shortcuts
+     info_maj = info["maj"]
+     info_min = info["min"]
+     info_fmt = info["fmt"]
+
+     def first_label(label_flags):
+         if (label_flags[0] == 0) and (label_flags.size > 1) and ((vmin_orig % 1) > 0.0):
+             return label_flags[1]
+         else:
+             return label_flags[0]
+
+     # Case 1. Less than a month
+     if span <= periodspermonth:
+         day_start = _period_break(dates_, "day")
+         month_start = _period_break(dates_, "month")
+         year_start = _period_break(dates_, "year")
+
+         def _hour_finder(label_interval: int, force_year_start: bool) -> None:
+             target = dates_.hour
+             mask = _period_break_mask(dates_, "hour")
+             info_maj[day_start] = True
+             info_min[mask & (target % label_interval == 0)] = True
+             info_fmt[mask & (target % label_interval == 0)] = "%H:%M"
+             info_fmt[day_start] = "%H:%M\n%d-%b"
+             info_fmt[year_start] = "%H:%M\n%d-%b\n%Y"
+             if force_year_start and not has_level_label(year_start, vmin_orig):
+                 info_fmt[first_label(day_start)] = "%H:%M\n%d-%b\n%Y"
+
+         def _minute_finder(label_interval: int) -> None:
+             target = dates_.minute
+             hour_start = _period_break(dates_, "hour")
+             mask = _period_break_mask(dates_, "minute")
+             info_maj[hour_start] = True
+             info_min[mask & (target % label_interval == 0)] = True
+             info_fmt[mask & (target % label_interval == 0)] = "%H:%M"
+             info_fmt[day_start] = "%H:%M\n%d-%b"
+             info_fmt[year_start] = "%H:%M\n%d-%b\n%Y"
+
+         def _second_finder(label_interval: int) -> None:
+             target = dates_.second
+             minute_start = _period_break(dates_, "minute")
+             mask = _period_break_mask(dates_, "second")
+             info_maj[minute_start] = True
+             info_min[mask & (target % label_interval == 0)] = True
+             info_fmt[mask & (target % label_interval == 0)] = "%H:%M:%S"
+             info_fmt[day_start] = "%H:%M:%S\n%d-%b"
+             info_fmt[year_start] = "%H:%M:%S\n%d-%b\n%Y"
+
+         if span < periodsperday / 12000:
+             _second_finder(1)
+         elif span < periodsperday / 6000:
+             _second_finder(2)
+         elif span < periodsperday / 2400:
+             _second_finder(5)
+         elif span < periodsperday / 1200:
+             _second_finder(10)
+         elif span < periodsperday / 800:
+             _second_finder(15)
+         elif span < periodsperday / 400:
+             _second_finder(30)
+         elif span < periodsperday / 150:
+             _minute_finder(1)
+         elif span < periodsperday / 70:
+             _minute_finder(2)
+         elif span < periodsperday / 24:
+             _minute_finder(5)
+         elif span < periodsperday / 12:
+             _minute_finder(15)
+         elif span < periodsperday / 6:
+             _minute_finder(30)
+         elif span < periodsperday / 2.5:
+             _hour_finder(1, False)
+         elif span < periodsperday / 1.5:
+             _hour_finder(2, False)
+         elif span < periodsperday * 1.25:
+             _hour_finder(3, False)
+         elif span < periodsperday * 2.5:
+             _hour_finder(6, True)
+         elif span < periodsperday * 4:
+             _hour_finder(12, True)
+         else:
+             info_maj[month_start] = True
+             info_min[day_start] = True
+             info_fmt[day_start] = "%d"
+             info_fmt[month_start] = "%d\n%b"
+             info_fmt[year_start] = "%d\n%b\n%Y"
+             if not has_level_label(year_start, vmin_orig):
+                 if not has_level_label(month_start, vmin_orig):
+                     info_fmt[first_label(day_start)] = "%d\n%b\n%Y"
+                 else:
+                     info_fmt[first_label(month_start)] = "%d\n%b\n%Y"
+
+     # Case 2. Less than three months
+     elif span <= periodsperyear // 4:
+         month_start = _period_break(dates_, "month")
+         info_maj[month_start] = True
+         if dtype_code < FreqGroup.FR_HR.value:
+             info["min"] = True
+         else:
+             day_start = _period_break(dates_, "day")
+             info["min"][day_start] = True
+         week_start = _period_break(dates_, "week")
+         year_start = _period_break(dates_, "year")
+         info_fmt[week_start] = "%d"
+         info_fmt[month_start] = "\n\n%b"
+         info_fmt[year_start] = "\n\n%b\n%Y"
+         if not has_level_label(year_start, vmin_orig):
+             if not has_level_label(month_start, vmin_orig):
+                 info_fmt[first_label(week_start)] = "\n\n%b\n%Y"
+             else:
+                 info_fmt[first_label(month_start)] = "\n\n%b\n%Y"
+     # Case 3. Less than 14 months ...............
+     elif span <= 1.15 * periodsperyear:
+         year_start = _period_break(dates_, "year")
+         month_start = _period_break(dates_, "month")
+         week_start = _period_break(dates_, "week")
+         info_maj[month_start] = True
+         info_min[week_start] = True
+         info_min[year_start] = False
+         info_min[month_start] = False
+         info_fmt[month_start] = "%b"
+         info_fmt[year_start] = "%b\n%Y"
+         if not has_level_label(year_start, vmin_orig):
+             info_fmt[first_label(month_start)] = "%b\n%Y"
+     # Case 4. Less than 2.5 years ...............
+     elif span <= 2.5 * periodsperyear:
+         year_start = _period_break(dates_, "year")
+         quarter_start = _period_break(dates_, "quarter")
+         month_start = _period_break(dates_, "month")
+         info_maj[quarter_start] = True
+         info_min[month_start] = True
+         info_fmt[quarter_start] = "%b"
+         info_fmt[year_start] = "%b\n%Y"
+     # Case 5. Less than 4 years .................
+     elif span <= 4 * periodsperyear:
+         year_start = _period_break(dates_, "year")
+         month_start = _period_break(dates_, "month")
+         info_maj[year_start] = True
+         info_min[month_start] = True
+         info_min[year_start] = False
+
+         month_break = dates_[month_start].month
+         jan_or_jul = month_start[(month_break == 1) | (month_break == 7)]
+         info_fmt[jan_or_jul] = "%b"
+         info_fmt[year_start] = "%b\n%Y"
+     # Case 6. Less than 11 years ................
+     elif span <= 11 * periodsperyear:
+         year_start = _period_break(dates_, "year")
+         quarter_start = _period_break(dates_, "quarter")
+         info_maj[year_start] = True
+         info_min[quarter_start] = True
+         info_min[year_start] = False
+         info_fmt[year_start] = "%Y"
+     # Case 7. More than 11 years ................
+     else:
+         year_start = _period_break(dates_, "year")
+         year_break = dates_[year_start].year
+         nyears = span / periodsperyear
+         (min_anndef, maj_anndef) = _get_default_annual_spacing(nyears)
+         major_idx = year_start[(year_break % maj_anndef == 0)]
+         info_maj[major_idx] = True
+         minor_idx = year_start[(year_break % min_anndef == 0)]
+         info_min[minor_idx] = True
+         info_fmt[major_idx] = "%Y"
+
+     return info
+
+
+ @functools.cache
+ def _monthly_finder(vmin: float, vmax: float, freq: BaseOffset) -> np.ndarray:
+     _, _, periodsperyear = _get_periods_per_ymd(freq)
+
+     vmin_orig = vmin
+     (vmin, vmax) = (int(vmin), int(vmax))
+     span = vmax - vmin + 1
+
+     # Initialize the output
+     info = np.zeros(
+         span, dtype=[("val", int), ("maj", bool), ("min", bool), ("fmt", "|S8")]
+     )
+     info["val"] = np.arange(vmin, vmax + 1)
+     dates_ = info["val"]
+     info["fmt"] = ""
+     year_start = (dates_ % 12 == 0).nonzero()[0]
+     info_maj = info["maj"]
+     info_fmt = info["fmt"]
+
+     if span <= 1.15 * periodsperyear:
+         info_maj[year_start] = True
+         info["min"] = True
+
+         info_fmt[:] = "%b"
+         info_fmt[year_start] = "%b\n%Y"
+
+         if not has_level_label(year_start, vmin_orig):
+             if dates_.size > 1:
+                 idx = 1
+             else:
+                 idx = 0
+             info_fmt[idx] = "%b\n%Y"
+
+     elif span <= 2.5 * periodsperyear:
+         quarter_start = (dates_ % 3 == 0).nonzero()
+         info_maj[year_start] = True
+         # TODO: Check the following : is it really info['fmt'] ?
+         # 2023-09-15 this is reached in test_finder_monthly
+         info["fmt"][quarter_start] = True
+         info["min"] = True
+
+         info_fmt[quarter_start] = "%b"
+         info_fmt[year_start] = "%b\n%Y"
+
+     elif span <= 4 * periodsperyear:
+         info_maj[year_start] = True
+         info["min"] = True
+
+         jan_or_jul = (dates_ % 12 == 0) | (dates_ % 12 == 6)
+         info_fmt[jan_or_jul] = "%b"
+         info_fmt[year_start] = "%b\n%Y"
+
+     elif span <= 11 * periodsperyear:
+         quarter_start = (dates_ % 3 == 0).nonzero()
+         info_maj[year_start] = True
+         info["min"][quarter_start] = True
+
+         info_fmt[year_start] = "%Y"
+
+     else:
+         nyears = span / periodsperyear
+         (min_anndef, maj_anndef) = _get_default_annual_spacing(nyears)
+         years = dates_[year_start] // 12 + 1
+         major_idx = year_start[(years % maj_anndef == 0)]
+         info_maj[major_idx] = True
+         info["min"][year_start[(years % min_anndef == 0)]] = True
+
+         info_fmt[major_idx] = "%Y"
+
+     return info
+
+
+ @functools.cache
+ def _quarterly_finder(vmin: float, vmax: float, freq: BaseOffset) -> np.ndarray:
+     _, _, periodsperyear = _get_periods_per_ymd(freq)
+     vmin_orig = vmin
+     (vmin, vmax) = (int(vmin), int(vmax))
+     span = vmax - vmin + 1
+
+     info = np.zeros(
+         span, dtype=[("val", int), ("maj", bool), ("min", bool), ("fmt", "|S8")]
+     )
+     info["val"] = np.arange(vmin, vmax + 1)
+     info["fmt"] = ""
+     dates_ = info["val"]
+     info_maj = info["maj"]
+     info_fmt = info["fmt"]
+     year_start = (dates_ % 4 == 0).nonzero()[0]
+
+     if span <= 3.5 * periodsperyear:
+         info_maj[year_start] = True
+         info["min"] = True
+
+         info_fmt[:] = "Q%q"
+         info_fmt[year_start] = "Q%q\n%F"
+         if not has_level_label(year_start, vmin_orig):
+             if dates_.size > 1:
+                 idx = 1
+             else:
+                 idx = 0
+             info_fmt[idx] = "Q%q\n%F"
+
+     elif span <= 11 * periodsperyear:
+         info_maj[year_start] = True
+         info["min"] = True
+         info_fmt[year_start] = "%F"
+
+     else:
+         # https://github.com/pandas-dev/pandas/pull/47602
+         years = dates_[year_start] // 4 + 1970
+         nyears = span / periodsperyear
+         (min_anndef, maj_anndef) = _get_default_annual_spacing(nyears)
+         major_idx = year_start[(years % maj_anndef == 0)]
+         info_maj[major_idx] = True
+         info["min"][year_start[(years % min_anndef == 0)]] = True
+         info_fmt[major_idx] = "%F"
+
+     return info
+
+
+ @functools.cache
+ def _annual_finder(vmin: float, vmax: float, freq: BaseOffset) -> np.ndarray:
+     # Note: small difference here vs other finders in adding 1 to vmax
+     (vmin, vmax) = (int(vmin), int(vmax + 1))
+     span = vmax - vmin + 1
+
+     info = np.zeros(
+         span, dtype=[("val", int), ("maj", bool), ("min", bool), ("fmt", "|S8")]
+     )
+     info["val"] = np.arange(vmin, vmax + 1)
+     info["fmt"] = ""
+     dates_ = info["val"]
+
+     (min_anndef, maj_anndef) = _get_default_annual_spacing(span)
+     major_idx = dates_ % maj_anndef == 0
+     minor_idx = dates_ % min_anndef == 0
+     info["maj"][major_idx] = True
+     info["min"][minor_idx] = True
+     info["fmt"][major_idx] = "%Y"
+
+     return info
+
+
+ def get_finder(freq: BaseOffset):
+     # error: "BaseOffset" has no attribute "_period_dtype_code"
+     dtype_code = freq._period_dtype_code  # type: ignore[attr-defined]
+     fgroup = FreqGroup.from_period_dtype_code(dtype_code)
+
+     if fgroup == FreqGroup.FR_ANN:
+         return _annual_finder
+     elif fgroup == FreqGroup.FR_QTR:
+         return _quarterly_finder
+     elif fgroup == FreqGroup.FR_MTH:
+         return _monthly_finder
+     elif (dtype_code >= FreqGroup.FR_BUS.value) or fgroup == FreqGroup.FR_WK:
+         return _daily_finder
+     else:  # pragma: no cover
+         raise NotImplementedError(f"Unsupported frequency: {dtype_code}")
+
+
+ class TimeSeries_DateLocator(Locator):
+     """
+     Locates the ticks along an axis controlled by a :class:`Series`.
+
+     Parameters
+     ----------
+     freq : BaseOffset
+         Valid frequency specifier.
+     minor_locator : {False, True}, optional
+         Whether the locator is for minor ticks (True) or not.
+     dynamic_mode : {True, False}, optional
+         Whether the locator should work in dynamic mode.
+     base : {int}, optional
+     quarter : {int}, optional
+     month : {int}, optional
+     day : {int}, optional
+     """
+
+     axis: Axis
+
+     def __init__(
+         self,
+         freq: BaseOffset,
+         minor_locator: bool = False,
+         dynamic_mode: bool = True,
+         base: int = 1,
+         quarter: int = 1,
+         month: int = 1,
+         day: int = 1,
+         plot_obj=None,
+     ) -> None:
+         freq = to_offset(freq, is_period=True)
+         self.freq = freq
+         self.base = base
+         (self.quarter, self.month, self.day) = (quarter, month, day)
+         self.isminor = minor_locator
+         self.isdynamic = dynamic_mode
+         self.offset = 0
+         self.plot_obj = plot_obj
+         self.finder = get_finder(freq)
+
+     def _get_default_locs(self, vmin, vmax):
+         """Returns the default locations of ticks."""
+         locator = self.finder(vmin, vmax, self.freq)
+
+         if self.isminor:
+             return np.compress(locator["min"], locator["val"])
+         return np.compress(locator["maj"], locator["val"])
+
+     def __call__(self):
+         """Return the locations of the ticks."""
+         # axis calls Locator.set_axis inside set_m<xxxx>_formatter
+
+         vi = tuple(self.axis.get_view_interval())
+         vmin, vmax = vi
+         if vmax < vmin:
+             vmin, vmax = vmax, vmin
+         if self.isdynamic:
+             locs = self._get_default_locs(vmin, vmax)
+         else:  # pragma: no cover
+             base = self.base
+             (d, m) = divmod(vmin, base)
+             vmin = (d + 1) * base
+             # error: No overload variant of "range" matches argument types "float",
+             # "float", "int"
+             locs = list(range(vmin, vmax + 1, base))  # type: ignore[call-overload]
+         return locs
+
+     def autoscale(self):
+         """
+         Sets the view limits to the nearest multiples of base that contain the
+         data.
+         """
+         # requires matplotlib >= 0.98.0
+         (vmin, vmax) = self.axis.get_data_interval()
+
+         locs = self._get_default_locs(vmin, vmax)
+         (vmin, vmax) = locs[[0, -1]]
+         if vmin == vmax:
+             vmin -= 1
+             vmax += 1
+         return nonsingular(vmin, vmax)
+
+
+ # -------------------------------------------------------------------------
+ # --- Formatter ---
+ # -------------------------------------------------------------------------
+
+
+ class TimeSeries_DateFormatter(Formatter):
+     """
+     Formats the ticks along an axis controlled by a :class:`PeriodIndex`.
+
+     Parameters
+     ----------
+     freq : BaseOffset
+         Valid frequency specifier.
+     minor_locator : bool, default False
+         Whether the current formatter should apply to minor ticks (True) or
+         major ticks (False).
+     dynamic_mode : bool, default True
+         Whether the formatter works in dynamic mode or not.
+     """
+
+     axis: Axis
+
+     def __init__(
+         self,
+         freq: BaseOffset,
+         minor_locator: bool = False,
+         dynamic_mode: bool = True,
+         plot_obj=None,
+     ) -> None:
+         freq = to_offset(freq, is_period=True)
+         self.format = None
+         self.freq = freq
+         self.locs: list[Any] = []  # unused, for matplotlib compat
+         self.formatdict: dict[Any, Any] | None = None
+         self.isminor = minor_locator
+         self.isdynamic = dynamic_mode
+         self.offset = 0
+         self.plot_obj = plot_obj
+         self.finder = get_finder(freq)
+
+     def _set_default_format(self, vmin, vmax):
+         """Returns the default ticks spacing."""
+         info = self.finder(vmin, vmax, self.freq)
+
+         if self.isminor:
+             format = np.compress(info["min"] & np.logical_not(info["maj"]), info)
+         else:
+             format = np.compress(info["maj"], info)
+         self.formatdict = {x: f for (x, _, _, f) in format}
+         return self.formatdict
+
+     def set_locs(self, locs) -> None:
+         """Sets the locations of the ticks"""
+         # don't actually use the locs. This is just needed to work with
+         # matplotlib. Force to use vmin, vmax
+
+         self.locs = locs
+
+         (vmin, vmax) = tuple(self.axis.get_view_interval())
+         if vmax < vmin:
+             (vmin, vmax) = (vmax, vmin)
+         self._set_default_format(vmin, vmax)
+
+     def __call__(self, x, pos: int | None = 0) -> str:
+         if self.formatdict is None:
+             return ""
+         else:
+             fmt = self.formatdict.pop(x, "")
+             if isinstance(fmt, np.bytes_):
+                 fmt = fmt.decode("utf-8")
+             with warnings.catch_warnings():
+                 warnings.filterwarnings(
+                     "ignore",
+                     "Period with BDay freq is deprecated",
+                     category=FutureWarning,
+                 )
+                 period = Period(ordinal=int(x), freq=self.freq)
+             assert isinstance(period, Period)
+             return period.strftime(fmt)
+
+
+ class TimeSeries_TimedeltaFormatter(Formatter):
+     """
+     Formats the ticks along an axis controlled by a :class:`TimedeltaIndex`.
+     """
+
+     axis: Axis
+
+     @staticmethod
+     def format_timedelta_ticks(x, pos, n_decimals: int) -> str:
+         """
+         Convert seconds to 'D days HH:MM:SS.F'
+         """
+         s, ns = divmod(x, 10**9)  # TODO(non-nano): this looks like it assumes ns
+         m, s = divmod(s, 60)
+         h, m = divmod(m, 60)
+         d, h = divmod(h, 24)
+         decimals = int(ns * 10 ** (n_decimals - 9))
+         s = f"{int(h):02d}:{int(m):02d}:{int(s):02d}"
+         if n_decimals > 0:
+             s += f".{decimals:0{n_decimals}d}"
+         if d != 0:
+             s = f"{int(d):d} days {s}"
+         return s
+
+     def __call__(self, x, pos: int | None = 0) -> str:
+         (vmin, vmax) = tuple(self.axis.get_view_interval())
+         n_decimals = min(int(np.ceil(np.log10(100 * 10**9 / abs(vmax - vmin)))), 9)
+         return self.format_timedelta_ticks(x, pos, n_decimals)
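`format_timedelta_ticks` is a pure function of the tick value (in nanoseconds) and the decimal count, so its output can be sanity-checked standalone. The sketch below restates the same arithmetic without the class (and drops the unused `pos` argument):

```python
def format_timedelta_ticks(x: int, n_decimals: int) -> str:
    """Convert a nanosecond tick value to 'D days HH:MM:SS.F'."""
    s, ns = divmod(x, 10**9)  # whole seconds and leftover nanoseconds
    m, s = divmod(s, 60)
    h, m = divmod(m, 60)
    d, h = divmod(h, 24)
    decimals = int(ns * 10 ** (n_decimals - 9))  # keep n_decimals fractional digits
    out = f"{int(h):02d}:{int(m):02d}:{int(s):02d}"
    if n_decimals > 0:
        out += f".{decimals:0{n_decimals}d}"
    if d != 0:
        out = f"{int(d):d} days {out}"
    return out


print(format_timedelta_ticks(90_061_500_000_000, 3))  # 1 days 01:01:01.500
print(format_timedelta_ticks(61 * 10**9, 0))  # 00:01:01
```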
deepseek/lib/python3.10/site-packages/pandas/plotting/_matplotlib/style.py ADDED
@@ -0,0 +1,278 @@
+ from __future__ import annotations
+
+ from collections.abc import (
+     Collection,
+     Iterator,
+ )
+ import itertools
+ from typing import (
+     TYPE_CHECKING,
+     cast,
+ )
+ import warnings
+
+ import matplotlib as mpl
+ import matplotlib.colors
+ import numpy as np
+
+ from pandas._typing import MatplotlibColor as Color
+ from pandas.util._exceptions import find_stack_level
+
+ from pandas.core.dtypes.common import is_list_like
+
+ import pandas.core.common as com
+
+ if TYPE_CHECKING:
+     from matplotlib.colors import Colormap
+
+
+ def get_standard_colors(
+     num_colors: int,
+     colormap: Colormap | None = None,
+     color_type: str = "default",
+     color: dict[str, Color] | Color | Collection[Color] | None = None,
+ ):
+     """
+     Get standard colors based on `colormap`, `color_type` or `color` inputs.
+
+     Parameters
+     ----------
+     num_colors : int
+         Minimum number of colors to be returned.
+         Ignored if `color` is a dictionary.
+     colormap : :py:class:`matplotlib.colors.Colormap`, optional
+         Matplotlib colormap.
+         When provided, the resulting colors will be derived from the colormap.
+     color_type : {"default", "random"}, optional
+         Type of colors to derive. Used if provided `color` and `colormap` are None.
+         Ignored if either `color` or `colormap` is not None.
+     color : dict or str or sequence, optional
+         Color(s) to be used for deriving a sequence of colors.
+         Can either be a dictionary, or a single color (single color string,
+         or sequence of floats representing a single color),
+         or a sequence of colors.
+
+     Returns
+     -------
+     dict or list
+         Standard colors. Can either be a mapping if `color` was a dictionary,
+         or a list of colors with a length of `num_colors` or more.
+
+     Warns
+     -----
+     UserWarning
+         If both `colormap` and `color` are provided.
+         Parameter `color` will override.
+     """
+     if isinstance(color, dict):
+         return color
+
+     colors = _derive_colors(
+         color=color,
+         colormap=colormap,
+         color_type=color_type,
+         num_colors=num_colors,
+     )
+
+     return list(_cycle_colors(colors, num_colors=num_colors))
+
+
+ def _derive_colors(
+     *,
+     color: Color | Collection[Color] | None,
+     colormap: str | Colormap | None,
+     color_type: str,
+     num_colors: int,
+ ) -> list[Color]:
+     """
+     Derive colors from either `colormap`, `color_type` or `color` inputs.
+
+     Get a list of colors either from `colormap`, or from `color`,
+     or from `color_type` (if both `colormap` and `color` are None).
+
+     Parameters
+     ----------
+     color : str or sequence, optional
+         Color(s) to be used for deriving a sequence of colors.
+         Can either be a single color (single color string, or sequence of floats
+         representing a single color), or a sequence of colors.
+     colormap : :py:class:`matplotlib.colors.Colormap`, optional
+         Matplotlib colormap.
+         When provided, the resulting colors will be derived from the colormap.
+     color_type : {"default", "random"}, optional
+         Type of colors to derive. Used if provided `color` and `colormap` are None.
+         Ignored if either `color` or `colormap` is not None.
+     num_colors : int
+         Number of colors to be extracted.
+
+     Returns
+     -------
+     list
+         List of colors extracted.
+
+     Warns
+     -----
+     UserWarning
+         If both `colormap` and `color` are provided.
+         Parameter `color` will override.
+     """
+     if color is None and colormap is not None:
+         return _get_colors_from_colormap(colormap, num_colors=num_colors)
+     elif color is not None:
+         if colormap is not None:
+             warnings.warn(
+                 "'color' and 'colormap' cannot be used simultaneously. Using 'color'",
+                 stacklevel=find_stack_level(),
+             )
+         return _get_colors_from_color(color)
+     else:
+         return _get_colors_from_color_type(color_type, num_colors=num_colors)
+
+
+ def _cycle_colors(colors: list[Color], num_colors: int) -> Iterator[Color]:
+     """Cycle colors until achieving max of `num_colors` or length of `colors`.
+
+     Extra colors will be ignored by matplotlib if there are more colors
+     than needed and nothing needs to be done here.
+     """
+     max_colors = max(num_colors, len(colors))
+     yield from itertools.islice(itertools.cycle(colors), max_colors)
+
+
+ def _get_colors_from_colormap(
+     colormap: str | Colormap,
+     num_colors: int,
+ ) -> list[Color]:
+     """Get colors from colormap."""
+     cmap = _get_cmap_instance(colormap)
+     return [cmap(num) for num in np.linspace(0, 1, num=num_colors)]
+
+
+ def _get_cmap_instance(colormap: str | Colormap) -> Colormap:
+     """Get instance of matplotlib colormap."""
+     if isinstance(colormap, str):
+         cmap = colormap
+         colormap = mpl.colormaps[colormap]
+         if colormap is None:
+             raise ValueError(f"Colormap {cmap} is not recognized")
+     return colormap
+
+
+ def _get_colors_from_color(
+     color: Color | Collection[Color],
+ ) -> list[Color]:
+     """Get colors from user input color."""
+     if len(color) == 0:
+         raise ValueError(f"Invalid color argument: {color}")
+
+     if _is_single_color(color):
+         color = cast(Color, color)
+         return [color]
+
+     color = cast(Collection[Color], color)
+     return list(_gen_list_of_colors_from_iterable(color))
+
+
+ def _is_single_color(color: Color | Collection[Color]) -> bool:
+     """Check if `color` is a single color, not a sequence of colors.
+
+     Single color is of these kinds:
+         - Named color "red", "C0", "firebrick"
+         - Alias "g"
+         - Sequence of floats, such as (0.1, 0.2, 0.3) or (0.1, 0.2, 0.3, 0.4).
+
+     See Also
+     --------
+     _is_single_string_color
+     """
+     if isinstance(color, str) and _is_single_string_color(color):
+         # GH #36972
+         return True
+
+     if _is_floats_color(color):
+         return True
+
+     return False
+
+
+ def _gen_list_of_colors_from_iterable(color: Collection[Color]) -> Iterator[Color]:
+     """
+     Yield colors from string of several letters or from collection of colors.
+     """
+     for x in color:
+         if _is_single_color(x):
+             yield x
+         else:
+             raise ValueError(f"Invalid color {x}")
+
+
+ def _is_floats_color(color: Color | Collection[Color]) -> bool:
+     """Check if color comprises a sequence of floats representing color."""
+     return bool(
+         is_list_like(color)
+         and (len(color) == 3 or len(color) == 4)
+         and all(isinstance(x, (int, float)) for x in color)
+     )
+
+
+ def _get_colors_from_color_type(color_type: str, num_colors: int) -> list[Color]:
+     """Get colors from user input color type."""
+     if color_type == "default":
+         return _get_default_colors(num_colors)
+     elif color_type == "random":
+         return _get_random_colors(num_colors)
+     else:
+         raise ValueError("color_type must be either 'default' or 'random'")
+
+
+ def _get_default_colors(num_colors: int) -> list[Color]:
+     """Get `num_colors` of default colors from matplotlib rc params."""
+     import matplotlib.pyplot as plt
+
+     colors = [c["color"] for c in plt.rcParams["axes.prop_cycle"]]
+     return colors[0:num_colors]
+
+
+ def _get_random_colors(num_colors: int) -> list[Color]:
+     """Get `num_colors` of random colors."""
+     return [_random_color(num) for num in range(num_colors)]
+
+
+ def _random_color(column: int) -> list[float]:
+     """Get a random color represented as a list of length 3"""
+     # GH17525 use common._random_state to avoid resetting the seed
+     rs = com.random_state(column)
+     return rs.rand(3).tolist()
+
+
+ def _is_single_string_color(color: Color) -> bool:
+     """Check if `color` is a single string color.
+
+     Examples of single string colors:
+         - 'r'
+         - 'g'
+         - 'red'
+         - 'green'
+         - 'C3'
+         - 'firebrick'
+
+     Parameters
+     ----------
+     color : Color
+         Color string or sequence of floats.
+
+     Returns
+     -------
+     bool
+         True if `color` looks like a valid color.
+         False otherwise.
+     """
+     conv = matplotlib.colors.ColorConverter()
+     try:
+         # error: Argument 1 to "to_rgba" of "ColorConverter" has incompatible type
+         # "str | Sequence[float]"; expected "tuple[float, float, float] | ..."
+         conv.to_rgba(color)  # type: ignore[arg-type]
+     except ValueError:
+         return False
+     else:
+         return True
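`get_standard_colors` pads a short palette by cycling it until at least `num_colors` entries exist. A minimal standalone sketch of that `itertools.cycle`/`islice` idiom (hypothetical helper name):

```python
import itertools

def cycle_colors(colors: list, num_colors: int) -> list:
    # Repeat the palette until at least num_colors entries are produced;
    # surplus entries are harmless since matplotlib ignores unused colors.
    max_colors = max(num_colors, len(colors))
    return list(itertools.islice(itertools.cycle(colors), max_colors))

print(cycle_colors(["r", "g", "b"], 5))  # ['r', 'g', 'b', 'r', 'g']
```

Note that when the palette is already longer than `num_colors`, the whole palette is returned unchanged rather than truncated.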
deepseek/lib/python3.10/site-packages/pandas/plotting/_matplotlib/timeseries.py ADDED
@@ -0,0 +1,370 @@
+ # TODO: Use the fact that axis can have units to simplify the process
+
+ from __future__ import annotations
+
+ import functools
+ from typing import (
+     TYPE_CHECKING,
+     Any,
+     cast,
+ )
+ import warnings
+
+ import numpy as np
+
+ from pandas._libs.tslibs import (
+     BaseOffset,
+     Period,
+     to_offset,
+ )
+ from pandas._libs.tslibs.dtypes import (
+     OFFSET_TO_PERIOD_FREQSTR,
+     FreqGroup,
+ )
+
+ from pandas.core.dtypes.generic import (
+     ABCDatetimeIndex,
+     ABCPeriodIndex,
+     ABCTimedeltaIndex,
+ )
+
+ from pandas.io.formats.printing import pprint_thing
+ from pandas.plotting._matplotlib.converter import (
+     TimeSeries_DateFormatter,
+     TimeSeries_DateLocator,
+     TimeSeries_TimedeltaFormatter,
+ )
+ from pandas.tseries.frequencies import (
+     get_period_alias,
+     is_subperiod,
+     is_superperiod,
+ )
+
+ if TYPE_CHECKING:
+     from datetime import timedelta
+
+     from matplotlib.axes import Axes
+
+     from pandas._typing import NDFrameT
+
+     from pandas import (
+         DataFrame,
+         DatetimeIndex,
+         Index,
+         PeriodIndex,
+         Series,
+     )
+
+ # ---------------------------------------------------------------------
+ # Plotting functions and monkey patches
+
+
+ def maybe_resample(series: Series, ax: Axes, kwargs: dict[str, Any]):
+     # resample against axes freq if necessary
+
+     if "how" in kwargs:
+         raise ValueError(
+             "'how' is not a valid keyword for plotting functions. If plotting "
+             "multiple objects on shared axes, resample manually first."
+         )
+
+     freq, ax_freq = _get_freq(ax, series)
+
+     if freq is None:  # pragma: no cover
+         raise ValueError("Cannot use dynamic axis without frequency info")
+
+     # Convert DatetimeIndex to PeriodIndex
+     if isinstance(series.index, ABCDatetimeIndex):
+         series = series.to_period(freq=freq)
+
+     if ax_freq is not None and freq != ax_freq:
+         if is_superperiod(freq, ax_freq):  # upsample input
+             series = series.copy()
+             # error: "Index" has no attribute "asfreq"
+             series.index = series.index.asfreq(  # type: ignore[attr-defined]
+                 ax_freq, how="s"
+             )
+             freq = ax_freq
+         elif _is_sup(freq, ax_freq):  # one is weekly
+             # Resampling with PeriodDtype is deprecated, so we convert to
+             # DatetimeIndex, resample, then convert back.
+             ser_ts = series.to_timestamp()
+             ser_d = ser_ts.resample("D").last().dropna()
+             ser_freq = ser_d.resample(ax_freq).last().dropna()
+             series = ser_freq.to_period(ax_freq)
+             freq = ax_freq
+         elif is_subperiod(freq, ax_freq) or _is_sub(freq, ax_freq):
+             _upsample_others(ax, freq, kwargs)
+         else:  # pragma: no cover
+             raise ValueError("Incompatible frequency conversion")
+     return freq, series
+
+
+ def _is_sub(f1: str, f2: str) -> bool:
+     return (f1.startswith("W") and is_subperiod("D", f2)) or (
+         f2.startswith("W") and is_subperiod(f1, "D")
+     )
+
+
+ def _is_sup(f1: str, f2: str) -> bool:
+     return (f1.startswith("W") and is_superperiod("D", f2)) or (
+         f2.startswith("W") and is_superperiod(f1, "D")
+     )
+
+
+ def _upsample_others(ax: Axes, freq: BaseOffset, kwargs: dict[str, Any]) -> None:
+     legend = ax.get_legend()
+     lines, labels = _replot_ax(ax, freq)
+     _replot_ax(ax, freq)
+
+     other_ax = None
+     if hasattr(ax, "left_ax"):
+         other_ax = ax.left_ax
+     if hasattr(ax, "right_ax"):
+         other_ax = ax.right_ax
+
+     if other_ax is not None:
+         rlines, rlabels = _replot_ax(other_ax, freq)
+         lines.extend(rlines)
+         labels.extend(rlabels)
+
+     if legend is not None and kwargs.get("legend", True) and len(lines) > 0:
+         title: str | None = legend.get_title().get_text()
+         if title == "None":
+             title = None
+         ax.legend(lines, labels, loc="best", title=title)
+
+
+ def _replot_ax(ax: Axes, freq: BaseOffset):
+     data = getattr(ax, "_plot_data", None)
+
+     # clear current axes and data
+     # TODO #54485
+     ax._plot_data = []  # type: ignore[attr-defined]
+     ax.clear()
+
+     decorate_axes(ax, freq)
+
+     lines = []
+     labels = []
+     if data is not None:
+         for series, plotf, kwds in data:
+             series = series.copy()
+             idx = series.index.asfreq(freq, how="S")
+             series.index = idx
+             # TODO #54485
+             ax._plot_data.append((series, plotf, kwds))  # type: ignore[attr-defined]
+
+             # for tsplot
+             if isinstance(plotf, str):
+                 from pandas.plotting._matplotlib import PLOT_CLASSES
+
+                 plotf = PLOT_CLASSES[plotf]._plot
+
+             lines.append(plotf(ax, series.index._mpl_repr(), series.values, **kwds)[0])
+             labels.append(pprint_thing(series.name))
+
+     return lines, labels
+
+
+ def decorate_axes(ax: Axes, freq: BaseOffset) -> None:
+     """Initialize axes for time-series plotting"""
+     if not hasattr(ax, "_plot_data"):
+         # TODO #54485
+         ax._plot_data = []  # type: ignore[attr-defined]
+
+     # TODO #54485
+     ax.freq = freq  # type: ignore[attr-defined]
+     xaxis = ax.get_xaxis()
+     # TODO #54485
+     xaxis.freq = freq  # type: ignore[attr-defined]
+
+
+ def _get_ax_freq(ax: Axes):
+     """
+     Get the freq attribute of the ax object if set.
+     Also checks shared axes (eg when using secondary yaxis, sharex=True
+     or twinx)
+     """
+     ax_freq = getattr(ax, "freq", None)
+     if ax_freq is None:
+         # check for left/right ax in case of secondary yaxis
+         if hasattr(ax, "left_ax"):
+             ax_freq = getattr(ax.left_ax, "freq", None)
+         elif hasattr(ax, "right_ax"):
+             ax_freq = getattr(ax.right_ax, "freq", None)
+     if ax_freq is None:
+         # check if a shared ax (sharex/twinx) has already freq set
+         shared_axes = ax.get_shared_x_axes().get_siblings(ax)
+         if len(shared_axes) > 1:
+             for shared_ax in shared_axes:
+                 ax_freq = getattr(shared_ax, "freq", None)
+                 if ax_freq is not None:
+                     break
+     return ax_freq
+
+
+ def _get_period_alias(freq: timedelta | BaseOffset | str) -> str | None:
+     if isinstance(freq, BaseOffset):
+         freqstr = freq.name
+     else:
+         freqstr = to_offset(freq, is_period=True).rule_code
+
+     return get_period_alias(freqstr)
+
+
+ def _get_freq(ax: Axes, series: Series):
+     # get frequency from data
+     freq = getattr(series.index, "freq", None)
+     if freq is None:
+         freq = getattr(series.index, "inferred_freq", None)
+         freq = to_offset(freq, is_period=True)
+
+     ax_freq = _get_ax_freq(ax)
+
+     # use axes freq if no data freq
+     if freq is None:
+         freq = ax_freq
+
+     # get the period frequency
+     freq = _get_period_alias(freq)
+     return freq, ax_freq
+
+
+ def use_dynamic_x(ax: Axes, data: DataFrame | Series) -> bool:
+     freq = _get_index_freq(data.index)
+     ax_freq = _get_ax_freq(ax)
+
+     if freq is None:  # convert irregular if axes has freq info
+         freq = ax_freq
+     # do not use tsplot if irregular was plotted first
+     elif (ax_freq is None) and (len(ax.get_lines()) > 0):
+         return False
+
+     if freq is None:
+         return False
+
+     freq_str = _get_period_alias(freq)
+
+     if freq_str is None:
+         return False
+
+     # FIXME: hack this for 0.10.1, creating more technical debt...sigh
+     if isinstance(data.index, ABCDatetimeIndex):
+         # error: "BaseOffset" has no attribute "_period_dtype_code"
+         freq_str = OFFSET_TO_PERIOD_FREQSTR.get(freq_str, freq_str)
+         base = to_offset(
+             freq_str, is_period=True
+         )._period_dtype_code  # type: ignore[attr-defined]
+         x = data.index
+         if base <= FreqGroup.FR_DAY.value:
+             return x[:1].is_normalized
+         period = Period(x[0], freq_str)
+         assert isinstance(period, Period)
+         return period.to_timestamp().tz_localize(x.tz) == x[0]
+     return True
+
+
+ def _get_index_freq(index: Index) -> BaseOffset | None:
+     freq = getattr(index, "freq", None)
+     if freq is None:
+         freq = getattr(index, "inferred_freq", None)
+         if freq == "B":
+             # error: "Index" has no attribute "dayofweek"
+             weekdays = np.unique(index.dayofweek)  # type: ignore[attr-defined]
+             if (5 in weekdays) or (6 in weekdays):
+                 freq = None
+
+     freq = to_offset(freq)
+     return freq
+
+
+ def maybe_convert_index(ax: Axes, data: NDFrameT) -> NDFrameT:
+     # tsplot converts automatically, but don't want to convert index
+     # over and over for DataFrames
+     if isinstance(data.index, (ABCDatetimeIndex, ABCPeriodIndex)):
+         freq: str | BaseOffset | None = data.index.freq
+
+         if freq is None:
+             # We only get here for DatetimeIndex
+             data.index = cast("DatetimeIndex", data.index)
+             freq = data.index.inferred_freq
+             freq = to_offset(freq)
+
+         if freq is None:
+             freq = _get_ax_freq(ax)
+
+         if freq is None:
+             raise ValueError("Could not get frequency alias for plotting")
+
+         freq_str = _get_period_alias(freq)
+
+         with warnings.catch_warnings():
+             # suppress Period[B] deprecation warning
+             # TODO: need to find an alternative to this before the deprecation
+             # is enforced!
+             warnings.filterwarnings(
+                 "ignore",
+                 r"PeriodDtype\[B\] is deprecated",
+                 category=FutureWarning,
+             )
+
+             if isinstance(data.index, ABCDatetimeIndex):
+                 data = data.tz_localize(None).to_period(freq=freq_str)
+             elif isinstance(data.index, ABCPeriodIndex):
+                 data.index = data.index.asfreq(freq=freq_str)
+     return data
+
+
+ # Patch methods for subplot.
+
+
+ def _format_coord(freq, t, y) -> str:
+     time_period = Period(ordinal=int(t), freq=freq)
+     return f"t = {time_period} y = {y:8f}"
+
+
+ def format_dateaxis(
+     subplot, freq: BaseOffset, index: DatetimeIndex | PeriodIndex
+ ) -> None:
+     """
+     Pretty-formats the date axis (x-axis).
+
+     Major and minor ticks are automatically set for the frequency of the
+     current underlying series. As the dynamic mode is activated by
+     default, changing the limits of the x axis will intelligently change
+     the positions of the ticks.
+     """
+     from matplotlib import pylab
+
+     # handle index specific formatting
+     # Note: DatetimeIndex does not use this
+     # interface. DatetimeIndex uses matplotlib.date directly
+     if isinstance(index, ABCPeriodIndex):
+         majlocator = TimeSeries_DateLocator(
+             freq, dynamic_mode=True, minor_locator=False, plot_obj=subplot
+         )
+         minlocator = TimeSeries_DateLocator(
+             freq, dynamic_mode=True, minor_locator=True, plot_obj=subplot
+         )
+         subplot.xaxis.set_major_locator(majlocator)
+         subplot.xaxis.set_minor_locator(minlocator)
+
+         majformatter = TimeSeries_DateFormatter(
+             freq, dynamic_mode=True, minor_locator=False, plot_obj=subplot
+         )
+         minformatter = TimeSeries_DateFormatter(
+             freq, dynamic_mode=True, minor_locator=True, plot_obj=subplot
+         )
+         subplot.xaxis.set_major_formatter(majformatter)
+         subplot.xaxis.set_minor_formatter(minformatter)
+
+         # x and y coord info
+         subplot.format_coord = functools.partial(_format_coord, freq)
+
+     elif isinstance(index, ABCTimedeltaIndex):
+         subplot.xaxis.set_major_formatter(TimeSeries_TimedeltaFormatter())
+     else:
+         raise TypeError("index type not supported")
+
+     pylab.draw_if_interactive()
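In `_get_index_freq` above, an inferred business-day (`"B"`) frequency is discarded whenever any timestamp falls on a weekend. That weekday check can be sketched in isolation (hypothetical helper name; pandas encodes Monday as 0 through Sunday as 6):

```python
import numpy as np

def business_freq_is_plausible(dayofweek) -> bool:
    # "B" (business daily) only makes sense if no tick lands on
    # Saturday (5) or Sunday (6).
    weekdays = np.unique(np.asarray(dayofweek))
    return not ((5 in weekdays) or (6 in weekdays))

print(business_freq_is_plausible([0, 1, 2, 3, 4]))  # True
print(business_freq_is_plausible([4, 5]))           # False
```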
deepseek/lib/python3.10/site-packages/pandas/tests/computation/__init__.py ADDED
File without changes
deepseek/lib/python3.10/site-packages/pandas/tests/computation/__pycache__/__init__.cpython-310.pyc ADDED
Binary file (178 Bytes). View file
 
deepseek/lib/python3.10/site-packages/pandas/tests/computation/__pycache__/test_compat.cpython-310.pyc ADDED
Binary file (1.09 kB). View file
 
deepseek/lib/python3.10/site-packages/pandas/tests/computation/__pycache__/test_eval.cpython-310.pyc ADDED
Binary file (58.7 kB). View file
 
deepseek/lib/python3.10/site-packages/pandas/tests/computation/test_compat.py ADDED
@@ -0,0 +1,32 @@
+ import pytest
+
+ from pandas.compat._optional import VERSIONS
+
+ import pandas as pd
+ from pandas.core.computation import expr
+ from pandas.core.computation.engines import ENGINES
+ from pandas.util.version import Version
+
+
+ def test_compat():
+     # test we have compat with our version of numexpr
+
+     from pandas.core.computation.check import NUMEXPR_INSTALLED
+
+     ne = pytest.importorskip("numexpr")
+
+     ver = ne.__version__
+     if Version(ver) < Version(VERSIONS["numexpr"]):
+         assert not NUMEXPR_INSTALLED
+     else:
+         assert NUMEXPR_INSTALLED
+
+
+ @pytest.mark.parametrize("engine", ENGINES)
+ @pytest.mark.parametrize("parser", expr.PARSERS)
+ def test_invalid_numexpr_version(engine, parser):
+     if engine == "numexpr":
+         pytest.importorskip("numexpr")
+     a, b = 1, 2  # noqa: F841
+     res = pd.eval("a + b", engine=engine, parser=parser)
+     assert res == 3
deepseek/lib/python3.10/site-packages/pandas/tests/computation/test_eval.py ADDED
@@ -0,0 +1,2001 @@
1
+ from __future__ import annotations
2
+
3
+ from functools import reduce
4
+ from itertools import product
5
+ import operator
6
+
7
+ import numpy as np
8
+ import pytest
9
+
10
+ from pandas.compat import PY312
11
+ from pandas.errors import (
12
+ NumExprClobberingError,
13
+ PerformanceWarning,
14
+ UndefinedVariableError,
15
+ )
16
+ import pandas.util._test_decorators as td
17
+
18
+ from pandas.core.dtypes.common import (
19
+ is_bool,
20
+ is_float,
21
+ is_list_like,
22
+ is_scalar,
23
+ )
24
+
25
+ import pandas as pd
26
+ from pandas import (
27
+ DataFrame,
28
+ Index,
29
+ Series,
30
+ date_range,
31
+ period_range,
32
+ timedelta_range,
33
+ )
34
+ import pandas._testing as tm
35
+ from pandas.core.computation import (
36
+ expr,
37
+ pytables,
38
+ )
39
+ from pandas.core.computation.engines import ENGINES
40
+ from pandas.core.computation.expr import (
41
+ BaseExprVisitor,
42
+ PandasExprVisitor,
43
+ PythonExprVisitor,
44
+ )
45
+ from pandas.core.computation.expressions import (
46
+ NUMEXPR_INSTALLED,
47
+ USE_NUMEXPR,
48
+ )
49
+ from pandas.core.computation.ops import (
50
+ ARITH_OPS_SYMS,
51
+ SPECIAL_CASE_ARITH_OPS_SYMS,
52
+ _binary_math_ops,
53
+ _binary_ops_dict,
54
+ _unary_math_ops,
55
+ )
56
+ from pandas.core.computation.scope import DEFAULT_GLOBALS
57
+
58
+
59
+ @pytest.fixture(
60
+ params=(
61
+ pytest.param(
62
+ engine,
63
+ marks=[
64
+ pytest.mark.skipif(
65
+ engine == "numexpr" and not USE_NUMEXPR,
66
+ reason=f"numexpr enabled->{USE_NUMEXPR}, "
67
+ f"installed->{NUMEXPR_INSTALLED}",
68
+ ),
69
+ td.skip_if_no("numexpr"),
70
+ ],
71
+ )
72
+ for engine in ENGINES
73
+ )
74
+ )
75
+ def engine(request):
76
+ return request.param
77
+
78
+
79
+ @pytest.fixture(params=expr.PARSERS)
80
+ def parser(request):
81
+ return request.param
82
+
83
+
84
+ def _eval_single_bin(lhs, cmp1, rhs, engine):
85
+ c = _binary_ops_dict[cmp1]
86
+ if ENGINES[engine].has_neg_frac:
87
+ try:
88
+ return c(lhs, rhs)
89
+ except ValueError as e:
90
+ if str(e).startswith(
91
+ "negative number cannot be raised to a fractional power"
92
+ ):
93
+ return np.nan
94
+ raise
95
+ return c(lhs, rhs)
96
+
97
+
98
+ # TODO: using range(5) here is a kludge
99
+ @pytest.fixture(
100
+ params=list(range(5)),
101
+ ids=["DataFrame", "Series", "SeriesNaN", "DataFrameNaN", "float"],
102
+ )
103
+ def lhs(request):
104
+ nan_df1 = DataFrame(np.random.default_rng(2).standard_normal((10, 5)))
105
+ nan_df1[nan_df1 > 0.5] = np.nan
106
+
107
+ opts = (
108
+ DataFrame(np.random.default_rng(2).standard_normal((10, 5))),
109
+ Series(np.random.default_rng(2).standard_normal(5)),
110
+ Series([1, 2, np.nan, np.nan, 5]),
111
+ nan_df1,
112
+ np.random.default_rng(2).standard_normal(),
113
+ )
114
+ return opts[request.param]
115
+
116
+
117
+ rhs = lhs
118
+ midhs = lhs
119
+
120
+
121
+ @pytest.fixture
122
+ def idx_func_dict():
123
+ return {
124
+ "i": lambda n: Index(np.arange(n), dtype=np.int64),
125
+ "f": lambda n: Index(np.arange(n), dtype=np.float64),
126
+ "s": lambda n: Index([f"{i}_{chr(i)}" for i in range(97, 97 + n)]),
127
+ "dt": lambda n: date_range("2020-01-01", periods=n),
128
+ "td": lambda n: timedelta_range("1 day", periods=n),
129
+ "p": lambda n: period_range("2020-01-01", periods=n, freq="D"),
130
+ }
131
+
132
+
133
+ class TestEval:
134
+ @pytest.mark.parametrize(
135
+ "cmp1",
136
+ ["!=", "==", "<=", ">=", "<", ">"],
137
+ ids=["ne", "eq", "le", "ge", "lt", "gt"],
138
+ )
139
+ @pytest.mark.parametrize("cmp2", [">", "<"], ids=["gt", "lt"])
140
+ @pytest.mark.parametrize("binop", expr.BOOL_OPS_SYMS)
141
+ def test_complex_cmp_ops(self, cmp1, cmp2, binop, lhs, rhs, engine, parser):
142
+ if parser == "python" and binop in ["and", "or"]:
143
+ msg = "'BoolOp' nodes are not implemented"
144
+ with pytest.raises(NotImplementedError, match=msg):
145
+ ex = f"(lhs {cmp1} rhs) {binop} (lhs {cmp2} rhs)"
146
+ pd.eval(ex, engine=engine, parser=parser)
147
+ return
148
+
149
+ lhs_new = _eval_single_bin(lhs, cmp1, rhs, engine)
150
+ rhs_new = _eval_single_bin(lhs, cmp2, rhs, engine)
151
+ expected = _eval_single_bin(lhs_new, binop, rhs_new, engine)
152
+
153
+ ex = f"(lhs {cmp1} rhs) {binop} (lhs {cmp2} rhs)"
154
+ result = pd.eval(ex, engine=engine, parser=parser)
155
+ tm.assert_equal(result, expected)
156
+
157
+ @pytest.mark.parametrize("cmp_op", expr.CMP_OPS_SYMS)
158
+ def test_simple_cmp_ops(self, cmp_op, lhs, rhs, engine, parser):
159
+ lhs = lhs < 0
160
+ rhs = rhs < 0
161
+
162
+ if parser == "python" and cmp_op in ["in", "not in"]:
163
+ msg = "'(In|NotIn)' nodes are not implemented"
164
+
165
+ with pytest.raises(NotImplementedError, match=msg):
166
+ ex = f"lhs {cmp_op} rhs"
167
+ pd.eval(ex, engine=engine, parser=parser)
168
+ return
169
+
170
+ ex = f"lhs {cmp_op} rhs"
171
+ msg = "|".join(
172
+ [
173
+ r"only list-like( or dict-like)? objects are allowed to be "
174
+ r"passed to (DataFrame\.)?isin\(\), you passed a "
175
+ r"(`|')bool(`|')",
176
+ "argument of type 'bool' is not iterable",
177
+ ]
178
+ )
179
+ if cmp_op in ("in", "not in") and not is_list_like(rhs):
180
+ with pytest.raises(TypeError, match=msg):
181
+ pd.eval(
182
+ ex,
183
+ engine=engine,
184
+ parser=parser,
185
+ local_dict={"lhs": lhs, "rhs": rhs},
186
+ )
187
+ else:
188
+ expected = _eval_single_bin(lhs, cmp_op, rhs, engine)
189
+ result = pd.eval(ex, engine=engine, parser=parser)
190
+ tm.assert_equal(result, expected)
191
+
192
+    @pytest.mark.parametrize("op", expr.CMP_OPS_SYMS)
+    def test_compound_invert_op(self, op, lhs, rhs, request, engine, parser):
+        if parser == "python" and op in ["in", "not in"]:
+            msg = "'(In|NotIn)' nodes are not implemented"
+            with pytest.raises(NotImplementedError, match=msg):
+                ex = f"~(lhs {op} rhs)"
+                pd.eval(ex, engine=engine, parser=parser)
+            return
+
+        if (
+            is_float(lhs)
+            and not is_float(rhs)
+            and op in ["in", "not in"]
+            and engine == "python"
+            and parser == "pandas"
+        ):
+            mark = pytest.mark.xfail(
+                reason="Looks like expected is negative, unclear whether "
+                "expected is incorrect or result is incorrect"
+            )
+            request.applymarker(mark)
+        skip_these = ["in", "not in"]
+        ex = f"~(lhs {op} rhs)"
+
+        msg = "|".join(
+            [
+                r"only list-like( or dict-like)? objects are allowed to be "
+                r"passed to (DataFrame\.)?isin\(\), you passed a "
+                r"(`|')float(`|')",
+                "argument of type 'float' is not iterable",
+            ]
+        )
+        if is_scalar(rhs) and op in skip_these:
+            with pytest.raises(TypeError, match=msg):
+                pd.eval(
+                    ex,
+                    engine=engine,
+                    parser=parser,
+                    local_dict={"lhs": lhs, "rhs": rhs},
+                )
+        else:
+            # compound
+            if is_scalar(lhs) and is_scalar(rhs):
+                lhs, rhs = (np.array([x]) for x in (lhs, rhs))
+            expected = _eval_single_bin(lhs, op, rhs, engine)
+            if is_scalar(expected):
+                expected = not expected
+            else:
+                expected = ~expected
+            result = pd.eval(ex, engine=engine, parser=parser)
+            tm.assert_almost_equal(expected, result)
+
+    @pytest.mark.parametrize("cmp1", ["<", ">"])
+    @pytest.mark.parametrize("cmp2", ["<", ">"])
+    def test_chained_cmp_op(self, cmp1, cmp2, lhs, midhs, rhs, engine, parser):
+        mid = midhs
+        if parser == "python":
+            ex1 = f"lhs {cmp1} mid {cmp2} rhs"
+            msg = "'BoolOp' nodes are not implemented"
+            with pytest.raises(NotImplementedError, match=msg):
+                pd.eval(ex1, engine=engine, parser=parser)
+            return
+
+        lhs_new = _eval_single_bin(lhs, cmp1, mid, engine)
+        rhs_new = _eval_single_bin(mid, cmp2, rhs, engine)
+
+        if lhs_new is not None and rhs_new is not None:
+            ex1 = f"lhs {cmp1} mid {cmp2} rhs"
+            ex2 = f"lhs {cmp1} mid and mid {cmp2} rhs"
+            ex3 = f"(lhs {cmp1} mid) & (mid {cmp2} rhs)"
+            expected = _eval_single_bin(lhs_new, "&", rhs_new, engine)
+
+            for ex in (ex1, ex2, ex3):
+                result = pd.eval(ex, engine=engine, parser=parser)
+
+                tm.assert_almost_equal(result, expected)
+    @pytest.mark.parametrize(
+        "arith1", sorted(set(ARITH_OPS_SYMS).difference(SPECIAL_CASE_ARITH_OPS_SYMS))
+    )
+    def test_binary_arith_ops(self, arith1, lhs, rhs, engine, parser):
+        ex = f"lhs {arith1} rhs"
+        result = pd.eval(ex, engine=engine, parser=parser)
+        expected = _eval_single_bin(lhs, arith1, rhs, engine)
+
+        tm.assert_almost_equal(result, expected)
+        ex = f"lhs {arith1} rhs {arith1} rhs"
+        result = pd.eval(ex, engine=engine, parser=parser)
+        nlhs = _eval_single_bin(lhs, arith1, rhs, engine)
+        try:
+            nlhs, ghs = nlhs.align(rhs)
+        except (ValueError, TypeError, AttributeError):
+            # ValueError: series frame or frame series align
+            # TypeError, AttributeError: series or frame with scalar align
+            return
+        else:
+            if engine == "numexpr":
+                import numexpr as ne
+
+                # direct numpy comparison
+                expected = ne.evaluate(f"nlhs {arith1} ghs")
+                # Update assert statement due to unreliable numerical
+                # precision component (GH37328)
+                # TODO: update testing code so that assert_almost_equal statement
+                # can be replaced again by the assert_numpy_array_equal statement
+                tm.assert_almost_equal(result.values, expected)
+            else:
+                expected = eval(f"nlhs {arith1} ghs")
+                tm.assert_almost_equal(result, expected)
+
+    # modulus, pow, and floor division require special casing
+
+    def test_modulus(self, lhs, rhs, engine, parser):
+        ex = r"lhs % rhs"
+        result = pd.eval(ex, engine=engine, parser=parser)
+        expected = lhs % rhs
+        tm.assert_almost_equal(result, expected)
+
+        if engine == "numexpr":
+            import numexpr as ne
+
+            expected = ne.evaluate(r"expected % rhs")
+            if isinstance(result, (DataFrame, Series)):
+                tm.assert_almost_equal(result.values, expected)
+            else:
+                tm.assert_almost_equal(result, expected.item())
+        else:
+            expected = _eval_single_bin(expected, "%", rhs, engine)
+            tm.assert_almost_equal(result, expected)
+
+    def test_floor_division(self, lhs, rhs, engine, parser):
+        ex = "lhs // rhs"
+
+        if engine == "python":
+            res = pd.eval(ex, engine=engine, parser=parser)
+            expected = lhs // rhs
+            tm.assert_equal(res, expected)
+        else:
+            msg = (
+                r"unsupported operand type\(s\) for //: 'VariableNode' and "
+                "'VariableNode'"
+            )
+            with pytest.raises(TypeError, match=msg):
+                pd.eval(
+                    ex,
+                    local_dict={"lhs": lhs, "rhs": rhs},
+                    engine=engine,
+                    parser=parser,
+                )
+    @td.skip_if_windows
+    def test_pow(self, lhs, rhs, engine, parser):
+        # odd failure on win32 platform, so skip
+        ex = "lhs ** rhs"
+        expected = _eval_single_bin(lhs, "**", rhs, engine)
+        result = pd.eval(ex, engine=engine, parser=parser)
+
+        if (
+            is_scalar(lhs)
+            and is_scalar(rhs)
+            and isinstance(expected, (complex, np.complexfloating))
+            and np.isnan(result)
+        ):
+            msg = "(DataFrame.columns|numpy array) are different"
+            with pytest.raises(AssertionError, match=msg):
+                tm.assert_numpy_array_equal(result, expected)
+        else:
+            tm.assert_almost_equal(result, expected)
+
+            ex = "(lhs ** rhs) ** rhs"
+            result = pd.eval(ex, engine=engine, parser=parser)
+
+            middle = _eval_single_bin(lhs, "**", rhs, engine)
+            expected = _eval_single_bin(middle, "**", rhs, engine)
+            tm.assert_almost_equal(result, expected)
+
+    def test_check_single_invert_op(self, lhs, engine, parser):
+        # simple
+        try:
+            elb = lhs.astype(bool)
+        except AttributeError:
+            elb = np.array([bool(lhs)])
+        expected = ~elb
+        result = pd.eval("~elb", engine=engine, parser=parser)
+        tm.assert_almost_equal(expected, result)
+
+    def test_frame_invert(self, engine, parser):
+        expr = "~lhs"
+
+        # ~ ##
+        # frame
+        # float always raises
+        lhs = DataFrame(np.random.default_rng(2).standard_normal((5, 2)))
+        if engine == "numexpr":
+            msg = "couldn't find matching opcode for 'invert_dd'"
+            with pytest.raises(NotImplementedError, match=msg):
+                pd.eval(expr, engine=engine, parser=parser)
+        else:
+            msg = "ufunc 'invert' not supported for the input types"
+            with pytest.raises(TypeError, match=msg):
+                pd.eval(expr, engine=engine, parser=parser)
+
+        # int raises on numexpr
+        lhs = DataFrame(np.random.default_rng(2).integers(5, size=(5, 2)))
+        if engine == "numexpr":
+            msg = "couldn't find matching opcode for 'invert"
+            with pytest.raises(NotImplementedError, match=msg):
+                pd.eval(expr, engine=engine, parser=parser)
+        else:
+            expect = ~lhs
+            result = pd.eval(expr, engine=engine, parser=parser)
+            tm.assert_frame_equal(expect, result)
+
+        # bool always works
+        lhs = DataFrame(np.random.default_rng(2).standard_normal((5, 2)) > 0.5)
+        expect = ~lhs
+        result = pd.eval(expr, engine=engine, parser=parser)
+        tm.assert_frame_equal(expect, result)
+
+        # object raises
+        lhs = DataFrame(
+            {"b": ["a", 1, 2.0], "c": np.random.default_rng(2).standard_normal(3) > 0.5}
+        )
+        if engine == "numexpr":
+            with pytest.raises(ValueError, match="unknown type object"):
+                pd.eval(expr, engine=engine, parser=parser)
+        else:
+            msg = "bad operand type for unary ~: 'str'"
+            with pytest.raises(TypeError, match=msg):
+                pd.eval(expr, engine=engine, parser=parser)
+    def test_series_invert(self, engine, parser):
+        # ~ ####
+        expr = "~lhs"
+
+        # series
+        # float raises
+        lhs = Series(np.random.default_rng(2).standard_normal(5))
+        if engine == "numexpr":
+            msg = "couldn't find matching opcode for 'invert_dd'"
+            with pytest.raises(NotImplementedError, match=msg):
+                result = pd.eval(expr, engine=engine, parser=parser)
+        else:
+            msg = "ufunc 'invert' not supported for the input types"
+            with pytest.raises(TypeError, match=msg):
+                pd.eval(expr, engine=engine, parser=parser)
+
+        # int raises on numexpr
+        lhs = Series(np.random.default_rng(2).integers(5, size=5))
+        if engine == "numexpr":
+            msg = "couldn't find matching opcode for 'invert"
+            with pytest.raises(NotImplementedError, match=msg):
+                pd.eval(expr, engine=engine, parser=parser)
+        else:
+            expect = ~lhs
+            result = pd.eval(expr, engine=engine, parser=parser)
+            tm.assert_series_equal(expect, result)
+
+        # bool
+        lhs = Series(np.random.default_rng(2).standard_normal(5) > 0.5)
+        expect = ~lhs
+        result = pd.eval(expr, engine=engine, parser=parser)
+        tm.assert_series_equal(expect, result)
+
+        # float
+        # int
+        # bool
+
+        # object
+        lhs = Series(["a", 1, 2.0])
+        if engine == "numexpr":
+            with pytest.raises(ValueError, match="unknown type object"):
+                pd.eval(expr, engine=engine, parser=parser)
+        else:
+            msg = "bad operand type for unary ~: 'str'"
+            with pytest.raises(TypeError, match=msg):
+                pd.eval(expr, engine=engine, parser=parser)
+
+    def test_frame_negate(self, engine, parser):
+        expr = "-lhs"
+
+        # float
+        lhs = DataFrame(np.random.default_rng(2).standard_normal((5, 2)))
+        expect = -lhs
+        result = pd.eval(expr, engine=engine, parser=parser)
+        tm.assert_frame_equal(expect, result)
+
+        # int
+        lhs = DataFrame(np.random.default_rng(2).integers(5, size=(5, 2)))
+        expect = -lhs
+        result = pd.eval(expr, engine=engine, parser=parser)
+        tm.assert_frame_equal(expect, result)
+
+        # bool doesn't work with numexpr but works elsewhere
+        lhs = DataFrame(np.random.default_rng(2).standard_normal((5, 2)) > 0.5)
+        if engine == "numexpr":
+            msg = "couldn't find matching opcode for 'neg_bb'"
+            with pytest.raises(NotImplementedError, match=msg):
+                pd.eval(expr, engine=engine, parser=parser)
+        else:
+            expect = -lhs
+            result = pd.eval(expr, engine=engine, parser=parser)
+            tm.assert_frame_equal(expect, result)
+
+    def test_series_negate(self, engine, parser):
+        expr = "-lhs"
+
+        # float
+        lhs = Series(np.random.default_rng(2).standard_normal(5))
+        expect = -lhs
+        result = pd.eval(expr, engine=engine, parser=parser)
+        tm.assert_series_equal(expect, result)
+
+        # int
+        lhs = Series(np.random.default_rng(2).integers(5, size=5))
+        expect = -lhs
+        result = pd.eval(expr, engine=engine, parser=parser)
+        tm.assert_series_equal(expect, result)
+
+        # bool doesn't work with numexpr but works elsewhere
+        lhs = Series(np.random.default_rng(2).standard_normal(5) > 0.5)
+        if engine == "numexpr":
+            msg = "couldn't find matching opcode for 'neg_bb'"
+            with pytest.raises(NotImplementedError, match=msg):
+                pd.eval(expr, engine=engine, parser=parser)
+        else:
+            expect = -lhs
+            result = pd.eval(expr, engine=engine, parser=parser)
+            tm.assert_series_equal(expect, result)
+    @pytest.mark.parametrize(
+        "lhs",
+        [
+            # Float
+            DataFrame(np.random.default_rng(2).standard_normal((5, 2))),
+            # Int
+            DataFrame(np.random.default_rng(2).integers(5, size=(5, 2))),
+            # bool doesn't work with numexpr but works elsewhere
+            DataFrame(np.random.default_rng(2).standard_normal((5, 2)) > 0.5),
+        ],
+    )
+    def test_frame_pos(self, lhs, engine, parser):
+        expr = "+lhs"
+        expect = lhs
+
+        result = pd.eval(expr, engine=engine, parser=parser)
+        tm.assert_frame_equal(expect, result)
+
+    @pytest.mark.parametrize(
+        "lhs",
+        [
+            # Float
+            Series(np.random.default_rng(2).standard_normal(5)),
+            # Int
+            Series(np.random.default_rng(2).integers(5, size=5)),
+            # bool doesn't work with numexpr but works elsewhere
+            Series(np.random.default_rng(2).standard_normal(5) > 0.5),
+        ],
+    )
+    def test_series_pos(self, lhs, engine, parser):
+        expr = "+lhs"
+        expect = lhs
+
+        result = pd.eval(expr, engine=engine, parser=parser)
+        tm.assert_series_equal(expect, result)
+
+    def test_scalar_unary(self, engine, parser):
+        msg = "bad operand type for unary ~: 'float'"
+        warn = None
+        if PY312 and not (engine == "numexpr" and parser == "pandas"):
+            warn = DeprecationWarning
+        with pytest.raises(TypeError, match=msg):
+            pd.eval("~1.0", engine=engine, parser=parser)
+
+        assert pd.eval("-1.0", parser=parser, engine=engine) == -1.0
+        assert pd.eval("+1.0", parser=parser, engine=engine) == +1.0
+        assert pd.eval("~1", parser=parser, engine=engine) == ~1
+        assert pd.eval("-1", parser=parser, engine=engine) == -1
+        assert pd.eval("+1", parser=parser, engine=engine) == +1
+        with tm.assert_produces_warning(
+            warn, match="Bitwise inversion", check_stacklevel=False
+        ):
+            assert pd.eval("~True", parser=parser, engine=engine) == ~True
+        with tm.assert_produces_warning(
+            warn, match="Bitwise inversion", check_stacklevel=False
+        ):
+            assert pd.eval("~False", parser=parser, engine=engine) == ~False
+        assert pd.eval("-True", parser=parser, engine=engine) == -True
+        assert pd.eval("-False", parser=parser, engine=engine) == -False
+        assert pd.eval("+True", parser=parser, engine=engine) == +True
+        assert pd.eval("+False", parser=parser, engine=engine) == +False
+    def test_unary_in_array(self):
+        # GH 11235
+        # TODO: 2022-01-29: result return list with numexpr 2.7.3 in CI
+        # but cannot reproduce locally
+        result = np.array(
+            pd.eval("[-True, True, +True, -False, False, +False, -37, 37, ~37, +37]"),
+            dtype=np.object_,
+        )
+        expected = np.array(
+            [
+                -True,
+                True,
+                +True,
+                -False,
+                False,
+                +False,
+                -37,
+                37,
+                ~37,
+                +37,
+            ],
+            dtype=np.object_,
+        )
+        tm.assert_numpy_array_equal(result, expected)
+
+    @pytest.mark.parametrize("dtype", [np.float32, np.float64])
+    @pytest.mark.parametrize("expr", ["x < -0.1", "-5 > x"])
+    def test_float_comparison_bin_op(self, dtype, expr):
+        # GH 16363
+        df = DataFrame({"x": np.array([0], dtype=dtype)})
+        res = df.eval(expr)
+        assert res.values == np.array([False])
+
+    def test_unary_in_function(self):
+        # GH 46471
+        df = DataFrame({"x": [0, 1, np.nan]})
+
+        result = df.eval("x.fillna(-1)")
+        expected = df.x.fillna(-1)
+        # column name becomes None if using numexpr
+        # only check names when the engine is not numexpr
+        tm.assert_series_equal(result, expected, check_names=not USE_NUMEXPR)
+
+        result = df.eval("x.shift(1, fill_value=-1)")
+        expected = df.x.shift(1, fill_value=-1)
+        tm.assert_series_equal(result, expected, check_names=not USE_NUMEXPR)
+
+    @pytest.mark.parametrize(
+        "ex",
+        (
+            "1 or 2",
+            "1 and 2",
+            "a and b",
+            "a or b",
+            "1 or 2 and (3 + 2) > 3",
+            "2 * x > 2 or 1 and 2",
+            "2 * df > 3 and 1 or a",
+        ),
+    )
+    def test_disallow_scalar_bool_ops(self, ex, engine, parser):
+        x, a, b = np.random.default_rng(2).standard_normal(3), 1, 2  # noqa: F841
+        df = DataFrame(np.random.default_rng(2).standard_normal((3, 2)))  # noqa: F841
+
+        msg = "cannot evaluate scalar only bool ops|'BoolOp' nodes are not"
+        with pytest.raises(NotImplementedError, match=msg):
+            pd.eval(ex, engine=engine, parser=parser)
+    def test_identical(self, engine, parser):
+        # see gh-10546
+        x = 1
+        result = pd.eval("x", engine=engine, parser=parser)
+        assert result == 1
+        assert is_scalar(result)
+
+        x = 1.5
+        result = pd.eval("x", engine=engine, parser=parser)
+        assert result == 1.5
+        assert is_scalar(result)
+
+        x = False
+        result = pd.eval("x", engine=engine, parser=parser)
+        assert not result
+        assert is_bool(result)
+        assert is_scalar(result)
+
+        x = np.array([1])
+        result = pd.eval("x", engine=engine, parser=parser)
+        tm.assert_numpy_array_equal(result, np.array([1]))
+        assert result.shape == (1,)
+
+        x = np.array([1.5])
+        result = pd.eval("x", engine=engine, parser=parser)
+        tm.assert_numpy_array_equal(result, np.array([1.5]))
+        assert result.shape == (1,)
+
+        x = np.array([False])  # noqa: F841
+        result = pd.eval("x", engine=engine, parser=parser)
+        tm.assert_numpy_array_equal(result, np.array([False]))
+        assert result.shape == (1,)
+
+    def test_line_continuation(self, engine, parser):
+        # GH 11149
+        exp = """1 + 2 * \
+        5 - 1 + 2 """
+        result = pd.eval(exp, engine=engine, parser=parser)
+        assert result == 12
+
+    def test_float_truncation(self, engine, parser):
+        # GH 14241
+        exp = "1000000000.006"
+        result = pd.eval(exp, engine=engine, parser=parser)
+        expected = np.float64(exp)
+        assert result == expected
+
+        df = DataFrame({"A": [1000000000.0009, 1000000000.0011, 1000000000.0015]})
+        cutoff = 1000000000.0006
+        result = df.query(f"A < {cutoff:.4f}")
+        assert result.empty
+
+        cutoff = 1000000000.0010
+        result = df.query(f"A > {cutoff:.4f}")
+        expected = df.loc[[1, 2], :]
+        tm.assert_frame_equal(expected, result)
+
+        exact = 1000000000.0011
+        result = df.query(f"A == {exact:.4f}")
+        expected = df.loc[[1], :]
+        tm.assert_frame_equal(expected, result)
+
+    def test_disallow_python_keywords(self):
+        # GH 18221
+        df = DataFrame([[0, 0, 0]], columns=["foo", "bar", "class"])
+        msg = "Python keyword not valid identifier in numexpr query"
+        with pytest.raises(SyntaxError, match=msg):
+            df.query("class == 0")
+
+        df = DataFrame()
+        df.index.name = "lambda"
+        with pytest.raises(SyntaxError, match=msg):
+            df.query("lambda == 0")
+
+    def test_true_false_logic(self):
+        # GH 25823
+        # This behavior is deprecated in Python 3.12
+        with tm.maybe_produces_warning(
+            DeprecationWarning, PY312, check_stacklevel=False
+        ):
+            assert pd.eval("not True") == -2
+            assert pd.eval("not False") == -1
+            assert pd.eval("True and not True") == 0
+
+    def test_and_logic_string_match(self):
+        # GH 25823
+        event = Series({"a": "hello"})
+        assert pd.eval(f"{event.str.match('hello').a}")
+        assert pd.eval(f"{event.str.match('hello').a and event.str.match('hello').a}")
+
+
+# -------------------------------------
+# gh-12388: Typecasting rules consistency with python
+
+
+class TestTypeCasting:
+    @pytest.mark.parametrize("op", ["+", "-", "*", "**", "/"])
+    # maybe someday... numexpr has too many upcasting rules now
+    # chain(*(np.core.sctypes[x] for x in ['uint', 'int', 'float']))
+    @pytest.mark.parametrize("left_right", [("df", "3"), ("3", "df")])
+    def test_binop_typecasting(
+        self, engine, parser, op, complex_or_float_dtype, left_right, request
+    ):
+        # GH#21374
+        dtype = complex_or_float_dtype
+        df = DataFrame(np.random.default_rng(2).standard_normal((5, 3)), dtype=dtype)
+        left, right = left_right
+        s = f"{left} {op} {right}"
+        res = pd.eval(s, engine=engine, parser=parser)
+        if dtype == "complex64" and engine == "numexpr":
+            mark = pytest.mark.xfail(
+                reason="numexpr issue with complex that are upcast "
+                "to complex 128 "
+                "https://github.com/pydata/numexpr/issues/492"
+            )
+            request.applymarker(mark)
+        assert df.values.dtype == dtype
+        assert res.values.dtype == dtype
+        tm.assert_frame_equal(res, eval(s), check_exact=False)
+
+
+# -------------------------------------
+# Basic and complex alignment
+
+
+def should_warn(*args):
+    not_mono = not any(map(operator.attrgetter("is_monotonic_increasing"), args))
+    only_one_dt = reduce(
+        operator.xor, (issubclass(x.dtype.type, np.datetime64) for x in args)
+    )
+    return not_mono and only_one_dt
+
+
+class TestAlignment:
+    index_types = ["i", "s", "dt"]
+    lhs_index_types = index_types + ["s"]  # 'p'
+
+    def test_align_nested_unary_op(self, engine, parser):
+        s = "df * ~2"
+        df = DataFrame(np.random.default_rng(2).standard_normal((5, 3)))
+        res = pd.eval(s, engine=engine, parser=parser)
+        tm.assert_frame_equal(res, df * ~2)
+    @pytest.mark.filterwarnings("always::RuntimeWarning")
+    @pytest.mark.parametrize("lr_idx_type", lhs_index_types)
+    @pytest.mark.parametrize("rr_idx_type", index_types)
+    @pytest.mark.parametrize("c_idx_type", index_types)
+    def test_basic_frame_alignment(
+        self, engine, parser, lr_idx_type, rr_idx_type, c_idx_type, idx_func_dict
+    ):
+        df = DataFrame(
+            np.random.default_rng(2).standard_normal((10, 10)),
+            index=idx_func_dict[lr_idx_type](10),
+            columns=idx_func_dict[c_idx_type](10),
+        )
+        df2 = DataFrame(
+            np.random.default_rng(2).standard_normal((20, 10)),
+            index=idx_func_dict[rr_idx_type](20),
+            columns=idx_func_dict[c_idx_type](10),
+        )
+        # only warns if not monotonic and not sortable
+        if should_warn(df.index, df2.index):
+            with tm.assert_produces_warning(RuntimeWarning):
+                res = pd.eval("df + df2", engine=engine, parser=parser)
+        else:
+            res = pd.eval("df + df2", engine=engine, parser=parser)
+        tm.assert_frame_equal(res, df + df2)
+
+    @pytest.mark.parametrize("r_idx_type", lhs_index_types)
+    @pytest.mark.parametrize("c_idx_type", lhs_index_types)
+    def test_frame_comparison(
+        self, engine, parser, r_idx_type, c_idx_type, idx_func_dict
+    ):
+        df = DataFrame(
+            np.random.default_rng(2).standard_normal((10, 10)),
+            index=idx_func_dict[r_idx_type](10),
+            columns=idx_func_dict[c_idx_type](10),
+        )
+        res = pd.eval("df < 2", engine=engine, parser=parser)
+        tm.assert_frame_equal(res, df < 2)
+
+        df3 = DataFrame(
+            np.random.default_rng(2).standard_normal(df.shape),
+            index=df.index,
+            columns=df.columns,
+        )
+        res = pd.eval("df < df3", engine=engine, parser=parser)
+        tm.assert_frame_equal(res, df < df3)
+
+    @pytest.mark.filterwarnings("ignore::RuntimeWarning")
+    @pytest.mark.parametrize("r1", lhs_index_types)
+    @pytest.mark.parametrize("c1", index_types)
+    @pytest.mark.parametrize("r2", index_types)
+    @pytest.mark.parametrize("c2", index_types)
+    def test_medium_complex_frame_alignment(
+        self, engine, parser, r1, c1, r2, c2, idx_func_dict
+    ):
+        df = DataFrame(
+            np.random.default_rng(2).standard_normal((3, 2)),
+            index=idx_func_dict[r1](3),
+            columns=idx_func_dict[c1](2),
+        )
+        df2 = DataFrame(
+            np.random.default_rng(2).standard_normal((4, 2)),
+            index=idx_func_dict[r2](4),
+            columns=idx_func_dict[c2](2),
+        )
+        df3 = DataFrame(
+            np.random.default_rng(2).standard_normal((5, 2)),
+            index=idx_func_dict[r2](5),
+            columns=idx_func_dict[c2](2),
+        )
+        if should_warn(df.index, df2.index, df3.index):
+            with tm.assert_produces_warning(RuntimeWarning):
+                res = pd.eval("df + df2 + df3", engine=engine, parser=parser)
+        else:
+            res = pd.eval("df + df2 + df3", engine=engine, parser=parser)
+        tm.assert_frame_equal(res, df + df2 + df3)
+
+    @pytest.mark.filterwarnings("ignore::RuntimeWarning")
+    @pytest.mark.parametrize("index_name", ["index", "columns"])
+    @pytest.mark.parametrize("c_idx_type", index_types)
+    @pytest.mark.parametrize("r_idx_type", lhs_index_types)
+    def test_basic_frame_series_alignment(
+        self, engine, parser, index_name, r_idx_type, c_idx_type, idx_func_dict
+    ):
+        df = DataFrame(
+            np.random.default_rng(2).standard_normal((10, 10)),
+            index=idx_func_dict[r_idx_type](10),
+            columns=idx_func_dict[c_idx_type](10),
+        )
+        index = getattr(df, index_name)
+        s = Series(np.random.default_rng(2).standard_normal(5), index[:5])
+
+        if should_warn(df.index, s.index):
+            with tm.assert_produces_warning(RuntimeWarning):
+                res = pd.eval("df + s", engine=engine, parser=parser)
+        else:
+            res = pd.eval("df + s", engine=engine, parser=parser)
+
+        if r_idx_type == "dt" or c_idx_type == "dt":
+            expected = df.add(s) if engine == "numexpr" else df + s
+        else:
+            expected = df + s
+        tm.assert_frame_equal(res, expected)
+    @pytest.mark.parametrize("index_name", ["index", "columns"])
+    @pytest.mark.parametrize(
+        "r_idx_type, c_idx_type",
+        list(product(["i", "s"], ["i", "s"])) + [("dt", "dt")],
+    )
+    @pytest.mark.filterwarnings("ignore::RuntimeWarning")
+    def test_basic_series_frame_alignment(
+        self, request, engine, parser, index_name, r_idx_type, c_idx_type, idx_func_dict
+    ):
+        if (
+            engine == "numexpr"
+            and parser in ("pandas", "python")
+            and index_name == "index"
+            and r_idx_type == "i"
+            and c_idx_type == "s"
+        ):
+            reason = (
+                f"Flaky column ordering when engine={engine}, "
+                f"parser={parser}, index_name={index_name}, "
+                f"r_idx_type={r_idx_type}, c_idx_type={c_idx_type}"
+            )
+            request.applymarker(pytest.mark.xfail(reason=reason, strict=False))
+        df = DataFrame(
+            np.random.default_rng(2).standard_normal((10, 7)),
+            index=idx_func_dict[r_idx_type](10),
+            columns=idx_func_dict[c_idx_type](7),
+        )
+        index = getattr(df, index_name)
+        s = Series(np.random.default_rng(2).standard_normal(5), index[:5])
+        if should_warn(s.index, df.index):
+            with tm.assert_produces_warning(RuntimeWarning):
+                res = pd.eval("s + df", engine=engine, parser=parser)
+        else:
+            res = pd.eval("s + df", engine=engine, parser=parser)
+
+        if r_idx_type == "dt" or c_idx_type == "dt":
+            expected = df.add(s) if engine == "numexpr" else s + df
+        else:
+            expected = s + df
+        tm.assert_frame_equal(res, expected)
+
+    @pytest.mark.filterwarnings("ignore::RuntimeWarning")
+    @pytest.mark.parametrize("c_idx_type", index_types)
+    @pytest.mark.parametrize("r_idx_type", lhs_index_types)
+    @pytest.mark.parametrize("index_name", ["index", "columns"])
+    @pytest.mark.parametrize("op", ["+", "*"])
+    def test_series_frame_commutativity(
+        self, engine, parser, index_name, op, r_idx_type, c_idx_type, idx_func_dict
+    ):
+        df = DataFrame(
+            np.random.default_rng(2).standard_normal((10, 10)),
+            index=idx_func_dict[r_idx_type](10),
+            columns=idx_func_dict[c_idx_type](10),
+        )
+        index = getattr(df, index_name)
+        s = Series(np.random.default_rng(2).standard_normal(5), index[:5])
+
+        lhs = f"s {op} df"
+        rhs = f"df {op} s"
+        if should_warn(df.index, s.index):
+            with tm.assert_produces_warning(RuntimeWarning):
+                a = pd.eval(lhs, engine=engine, parser=parser)
+            with tm.assert_produces_warning(RuntimeWarning):
+                b = pd.eval(rhs, engine=engine, parser=parser)
+        else:
+            a = pd.eval(lhs, engine=engine, parser=parser)
+            b = pd.eval(rhs, engine=engine, parser=parser)
+
+        if r_idx_type != "dt" and c_idx_type != "dt":
+            if engine == "numexpr":
+                tm.assert_frame_equal(a, b)
+
+    @pytest.mark.filterwarnings("always::RuntimeWarning")
+    @pytest.mark.parametrize("r1", lhs_index_types)
+    @pytest.mark.parametrize("c1", index_types)
+    @pytest.mark.parametrize("r2", index_types)
+    @pytest.mark.parametrize("c2", index_types)
+    def test_complex_series_frame_alignment(
+        self, engine, parser, r1, c1, r2, c2, idx_func_dict
+    ):
+        n = 3
+        m1 = 5
+        m2 = 2 * m1
+        df = DataFrame(
+            np.random.default_rng(2).standard_normal((m1, n)),
+            index=idx_func_dict[r1](m1),
+            columns=idx_func_dict[c1](n),
+        )
+        df2 = DataFrame(
+            np.random.default_rng(2).standard_normal((m2, n)),
+            index=idx_func_dict[r2](m2),
+            columns=idx_func_dict[c2](n),
+        )
+        index = df2.columns
+        ser = Series(np.random.default_rng(2).standard_normal(n), index[:n])
+
+        if r2 == "dt" or c2 == "dt":
+            if engine == "numexpr":
+                expected2 = df2.add(ser)
+            else:
+                expected2 = df2 + ser
+        else:
+            expected2 = df2 + ser
+
+        if r1 == "dt" or c1 == "dt":
+            if engine == "numexpr":
+                expected = expected2.add(df)
+            else:
+                expected = expected2 + df
+        else:
+            expected = expected2 + df
+
+        if should_warn(df2.index, ser.index, df.index):
+            with tm.assert_produces_warning(RuntimeWarning):
+                res = pd.eval("df2 + ser + df", engine=engine, parser=parser)
+        else:
+            res = pd.eval("df2 + ser + df", engine=engine, parser=parser)
+        assert res.shape == expected.shape
+        tm.assert_frame_equal(res, expected)
+    def test_performance_warning_for_poor_alignment(self, engine, parser):
+        df = DataFrame(np.random.default_rng(2).standard_normal((1000, 10)))
+        s = Series(np.random.default_rng(2).standard_normal(10000))
+        if engine == "numexpr":
+            seen = PerformanceWarning
+        else:
+            seen = False
+
+        with tm.assert_produces_warning(seen):
+            pd.eval("df + s", engine=engine, parser=parser)
+
+        s = Series(np.random.default_rng(2).standard_normal(1000))
+        with tm.assert_produces_warning(False):
+            pd.eval("df + s", engine=engine, parser=parser)
+
+        df = DataFrame(np.random.default_rng(2).standard_normal((10, 10000)))
+        s = Series(np.random.default_rng(2).standard_normal(10000))
+        with tm.assert_produces_warning(False):
+            pd.eval("df + s", engine=engine, parser=parser)
+
+        df = DataFrame(np.random.default_rng(2).standard_normal((10, 10)))
+        s = Series(np.random.default_rng(2).standard_normal(10000))
+
+        is_python_engine = engine == "python"
+
+        if not is_python_engine:
+            wrn = PerformanceWarning
+        else:
+            wrn = False
+
+        with tm.assert_produces_warning(wrn) as w:
+            pd.eval("df + s", engine=engine, parser=parser)
+
+        if not is_python_engine:
+            assert len(w) == 1
+            msg = str(w[0].message)
+            logged = np.log10(s.size - df.shape[1])
+            expected = (
+                f"Alignment difference on axis 1 is larger "
+                f"than an order of magnitude on term 'df', "
+                f"by more than {logged:.4g}; performance may suffer."
+            )
+            assert msg == expected
+
+
+# ------------------------------------
+# Slightly more complex ops
+
+
+class TestOperations:
+    def eval(self, *args, **kwargs):
+        kwargs["level"] = kwargs.pop("level", 0) + 1
+        return pd.eval(*args, **kwargs)
+ def test_simple_arith_ops(self, engine, parser):
1072
+ exclude_arith = []
1073
+ if parser == "python":
1074
+ exclude_arith = ["in", "not in"]
1075
+
1076
+ arith_ops = [
1077
+ op
1078
+ for op in expr.ARITH_OPS_SYMS + expr.CMP_OPS_SYMS
1079
+ if op not in exclude_arith
1080
+ ]
1081
+
1082
+ ops = (op for op in arith_ops if op != "//")
1083
+
1084
+ for op in ops:
1085
+ ex = f"1 {op} 1"
1086
+ ex2 = f"x {op} 1"
1087
+ ex3 = f"1 {op} (x + 1)"
1088
+
1089
+ if op in ("in", "not in"):
1090
+ msg = "argument of type 'int' is not iterable"
1091
+ with pytest.raises(TypeError, match=msg):
1092
+ pd.eval(ex, engine=engine, parser=parser)
1093
+ else:
1094
+ expec = _eval_single_bin(1, op, 1, engine)
1095
+ x = self.eval(ex, engine=engine, parser=parser)
1096
+ assert x == expec
1097
+
1098
+ expec = _eval_single_bin(x, op, 1, engine)
1099
+ y = self.eval(ex2, local_dict={"x": x}, engine=engine, parser=parser)
1100
+ assert y == expec
1101
+
1102
+ expec = _eval_single_bin(1, op, x + 1, engine)
1103
+ y = self.eval(ex3, local_dict={"x": x}, engine=engine, parser=parser)
1104
+ assert y == expec
1105
+
1106
+    @pytest.mark.parametrize("rhs", [True, False])
+    @pytest.mark.parametrize("lhs", [True, False])
+    @pytest.mark.parametrize("op", expr.BOOL_OPS_SYMS)
+    def test_simple_bool_ops(self, rhs, lhs, op, engine, parser):
+        ex = f"{lhs} {op} {rhs}"
+
+        if parser == "python" and op in ["and", "or"]:
+            msg = "'BoolOp' nodes are not implemented"
+            with pytest.raises(NotImplementedError, match=msg):
+                self.eval(ex, engine=engine, parser=parser)
+            return
+
+        res = self.eval(ex, engine=engine, parser=parser)
+        exp = eval(ex)
+        assert res == exp
+
+    @pytest.mark.parametrize("rhs", [True, False])
+    @pytest.mark.parametrize("lhs", [True, False])
+    @pytest.mark.parametrize("op", expr.BOOL_OPS_SYMS)
+    def test_bool_ops_with_constants(self, rhs, lhs, op, engine, parser):
+        ex = f"{lhs} {op} {rhs}"
+
+        if parser == "python" and op in ["and", "or"]:
+            msg = "'BoolOp' nodes are not implemented"
+            with pytest.raises(NotImplementedError, match=msg):
+                self.eval(ex, engine=engine, parser=parser)
+            return
+
+        res = self.eval(ex, engine=engine, parser=parser)
+        exp = eval(ex)
+        assert res == exp
+
+    def test_4d_ndarray_fails(self):
+        x = np.random.default_rng(2).standard_normal((3, 4, 5, 6))
+        y = Series(np.random.default_rng(2).standard_normal(10))
+        msg = "N-dimensional objects, where N > 2, are not supported with eval"
+        with pytest.raises(NotImplementedError, match=msg):
+            self.eval("x + y", local_dict={"x": x, "y": y})
+
+    def test_constant(self):
+        x = self.eval("1")
+        assert x == 1
+
+    def test_single_variable(self):
+        df = DataFrame(np.random.default_rng(2).standard_normal((10, 2)))
+        df2 = self.eval("df", local_dict={"df": df})
+        tm.assert_frame_equal(df, df2)
+
+    def test_failing_subscript_with_name_error(self):
+        df = DataFrame(np.random.default_rng(2).standard_normal((5, 3)))  # noqa: F841
+        with pytest.raises(NameError, match="name 'x' is not defined"):
+            self.eval("df[x > 2] > 2")
+
+    def test_lhs_expression_subscript(self):
+        df = DataFrame(np.random.default_rng(2).standard_normal((5, 3)))
+        result = self.eval("(df + 1)[df > 2]", local_dict={"df": df})
+        expected = (df + 1)[df > 2]
+        tm.assert_frame_equal(result, expected)
+
+    def test_attr_expression(self):
+        df = DataFrame(
+            np.random.default_rng(2).standard_normal((5, 3)), columns=list("abc")
+        )
+        expr1 = "df.a < df.b"
+        expec1 = df.a < df.b
+        expr2 = "df.a + df.b + df.c"
+        expec2 = df.a + df.b + df.c
+        expr3 = "df.a + df.b + df.c[df.b < 0]"
+        expec3 = df.a + df.b + df.c[df.b < 0]
+        exprs = expr1, expr2, expr3
+        expecs = expec1, expec2, expec3
+        for e, expec in zip(exprs, expecs):
+            tm.assert_series_equal(expec, self.eval(e, local_dict={"df": df}))
+
+    def test_assignment_fails(self):
+        df = DataFrame(
+            np.random.default_rng(2).standard_normal((5, 3)), columns=list("abc")
+        )
+        df2 = DataFrame(np.random.default_rng(2).standard_normal((5, 3)))
+        expr1 = "df = df2"
+        msg = "cannot assign without a target object"
+        with pytest.raises(ValueError, match=msg):
+            self.eval(expr1, local_dict={"df": df, "df2": df2})
+
+    def test_assignment_column_multiple_raise(self):
+        df = DataFrame(
+            np.random.default_rng(2).standard_normal((5, 2)), columns=list("ab")
+        )
+        # multiple assignees
+        with pytest.raises(SyntaxError, match="invalid syntax"):
+            df.eval("d c = a + b")
+
+    def test_assignment_column_invalid_assign(self):
+        df = DataFrame(
+            np.random.default_rng(2).standard_normal((5, 2)), columns=list("ab")
+        )
+        # invalid assignees
+        msg = "left hand side of an assignment must be a single name"
+        with pytest.raises(SyntaxError, match=msg):
+            df.eval("d,c = a + b")
+
+    def test_assignment_column_invalid_assign_function_call(self):
+        df = DataFrame(
+            np.random.default_rng(2).standard_normal((5, 2)), columns=list("ab")
+        )
+        msg = "cannot assign to function call"
+        with pytest.raises(SyntaxError, match=msg):
+            df.eval('Timestamp("20131001") = a + b')
+
+    def test_assignment_single_assign_existing(self):
+        df = DataFrame(
+            np.random.default_rng(2).standard_normal((5, 2)), columns=list("ab")
+        )
+        # single assignment - existing variable
+        expected = df.copy()
+        expected["a"] = expected["a"] + expected["b"]
+        df.eval("a = a + b", inplace=True)
+        tm.assert_frame_equal(df, expected)
+
+    def test_assignment_single_assign_new(self):
+        df = DataFrame(
+            np.random.default_rng(2).standard_normal((5, 2)), columns=list("ab")
+        )
+        # single assignment - new variable
+        expected = df.copy()
+        expected["c"] = expected["a"] + expected["b"]
+        df.eval("c = a + b", inplace=True)
+        tm.assert_frame_equal(df, expected)
+
+    def test_assignment_single_assign_local_overlap(self):
+        df = DataFrame(
+            np.random.default_rng(2).standard_normal((5, 2)), columns=list("ab")
+        )
+        df = df.copy()
+        a = 1  # noqa: F841
+        df.eval("a = 1 + b", inplace=True)
+
+        expected = df.copy()
+        expected["a"] = 1 + expected["b"]
+        tm.assert_frame_equal(df, expected)
+
+    def test_assignment_single_assign_name(self):
+        df = DataFrame(
+            np.random.default_rng(2).standard_normal((5, 2)), columns=list("ab")
+        )
+
+        a = 1  # noqa: F841
+        old_a = df.a.copy()
+        df.eval("a = a + b", inplace=True)
+        result = old_a + df.b
+        tm.assert_series_equal(result, df.a, check_names=False)
+        assert result.name is None
+
+    def test_assignment_multiple_raises(self):
+        df = DataFrame(
+            np.random.default_rng(2).standard_normal((5, 2)), columns=list("ab")
+        )
+        # multiple assignment
+        df.eval("c = a + b", inplace=True)
+        msg = "can only assign a single expression"
+        with pytest.raises(SyntaxError, match=msg):
+            df.eval("c = a = b")
+
+    def test_assignment_explicit(self):
+        df = DataFrame(
+            np.random.default_rng(2).standard_normal((5, 2)), columns=list("ab")
+        )
+        # explicit targets
+        self.eval("c = df.a + df.b", local_dict={"df": df}, target=df, inplace=True)
+        expected = df.copy()
+        expected["c"] = expected["a"] + expected["b"]
+        tm.assert_frame_equal(df, expected)
+
+    def test_column_in(self):
+        # GH 11235
+        df = DataFrame({"a": [11], "b": [-32]})
+        result = df.eval("a in [11, -32]")
+        expected = Series([True])
+        # TODO: 2022-01-29: Name check failed with numexpr 2.7.3 in CI
+        # but cannot reproduce locally
+        tm.assert_series_equal(result, expected, check_names=False)
+
+    @pytest.mark.xfail(reason="Unknown: Omitted test_ in name prior.")
+    def test_assignment_not_inplace(self):
+        # see gh-9297
+        df = DataFrame(
+            np.random.default_rng(2).standard_normal((5, 2)), columns=list("ab")
+        )
+
+        actual = df.eval("c = a + b", inplace=False)
+        assert actual is not None
+
+        expected = df.copy()
+        expected["c"] = expected["a"] + expected["b"]
+        tm.assert_frame_equal(df, expected)
+
+    def test_multi_line_expression(self, warn_copy_on_write):
+        # GH 11149
+        df = DataFrame({"a": [1, 2, 3], "b": [4, 5, 6]})
+        expected = df.copy()
+
+        expected["c"] = expected["a"] + expected["b"]
+        expected["d"] = expected["c"] + expected["b"]
+        answer = df.eval(
+            """
+        c = a + b
+        d = c + b""",
+            inplace=True,
+        )
+        tm.assert_frame_equal(expected, df)
+        assert answer is None
+
+        expected["a"] = expected["a"] - 1
+        expected["e"] = expected["a"] + 2
+        answer = df.eval(
+            """
+        a = a - 1
+        e = a + 2""",
+            inplace=True,
+        )
+        tm.assert_frame_equal(expected, df)
+        assert answer is None
+
+        # multi-line not valid if not all assignments
+        msg = "Multi-line expressions are only valid if all expressions contain"
+        with pytest.raises(ValueError, match=msg):
+            df.eval(
+                """
+            a = b + 2
+            b - 2""",
+                inplace=False,
+            )
+
+    def test_multi_line_expression_not_inplace(self):
+        # GH 11149
+        df = DataFrame({"a": [1, 2, 3], "b": [4, 5, 6]})
+        expected = df.copy()
+
+        expected["c"] = expected["a"] + expected["b"]
+        expected["d"] = expected["c"] + expected["b"]
+        df = df.eval(
+            """
+        c = a + b
+        d = c + b""",
+            inplace=False,
+        )
+        tm.assert_frame_equal(expected, df)
+
+        expected["a"] = expected["a"] - 1
+        expected["e"] = expected["a"] + 2
+        df = df.eval(
+            """
+        a = a - 1
+        e = a + 2""",
+            inplace=False,
+        )
+        tm.assert_frame_equal(expected, df)
+
+    def test_multi_line_expression_local_variable(self):
+        # GH 15342
+        df = DataFrame({"a": [1, 2, 3], "b": [4, 5, 6]})
+        expected = df.copy()
+
+        local_var = 7
+        expected["c"] = expected["a"] * local_var
+        expected["d"] = expected["c"] + local_var
+        answer = df.eval(
+            """
+        c = a * @local_var
+        d = c + @local_var
+        """,
+            inplace=True,
+        )
+        tm.assert_frame_equal(expected, df)
+        assert answer is None
+
+    def test_multi_line_expression_callable_local_variable(self):
+        # 26426
+        df = DataFrame({"a": [1, 2, 3], "b": [4, 5, 6]})
+
+        def local_func(a, b):
+            return b
+
+        expected = df.copy()
+        expected["c"] = expected["a"] * local_func(1, 7)
+        expected["d"] = expected["c"] + local_func(1, 7)
+        answer = df.eval(
+            """
+        c = a * @local_func(1, 7)
+        d = c + @local_func(1, 7)
+        """,
+            inplace=True,
+        )
+        tm.assert_frame_equal(expected, df)
+        assert answer is None
+
+    def test_multi_line_expression_callable_local_variable_with_kwargs(self):
+        # 26426
+        df = DataFrame({"a": [1, 2, 3], "b": [4, 5, 6]})
+
+        def local_func(a, b):
+            return b
+
+        expected = df.copy()
+        expected["c"] = expected["a"] * local_func(b=7, a=1)
+        expected["d"] = expected["c"] + local_func(b=7, a=1)
+        answer = df.eval(
+            """
+        c = a * @local_func(b=7, a=1)
+        d = c + @local_func(b=7, a=1)
+        """,
+            inplace=True,
+        )
+        tm.assert_frame_equal(expected, df)
+        assert answer is None
+
+    def test_assignment_in_query(self):
+        # GH 8664
+        df = DataFrame({"a": [1, 2, 3], "b": [4, 5, 6]})
+        df_orig = df.copy()
+        msg = "cannot assign without a target object"
+        with pytest.raises(ValueError, match=msg):
+            df.query("a = 1")
+        tm.assert_frame_equal(df, df_orig)
+
+    def test_query_inplace(self):
+        # see gh-11149
+        df = DataFrame({"a": [1, 2, 3], "b": [4, 5, 6]})
+        expected = df.copy()
+        expected = expected[expected["a"] == 2]
+        df.query("a == 2", inplace=True)
+        tm.assert_frame_equal(expected, df)
+
+        df = {}
+        expected = {"a": 3}
+
+        self.eval("a = 1 + 2", target=df, inplace=True)
+        tm.assert_dict_equal(df, expected)
+
+    @pytest.mark.parametrize("invalid_target", [1, "cat", [1, 2], np.array([]), (1, 3)])
+    def test_cannot_item_assign(self, invalid_target):
+        msg = "Cannot assign expression output to target"
+        expression = "a = 1 + 2"
+
+        with pytest.raises(ValueError, match=msg):
+            self.eval(expression, target=invalid_target, inplace=True)
+
+        if hasattr(invalid_target, "copy"):
+            with pytest.raises(ValueError, match=msg):
+                self.eval(expression, target=invalid_target, inplace=False)
+
+    @pytest.mark.parametrize("invalid_target", [1, "cat", (1, 3)])
+    def test_cannot_copy_item(self, invalid_target):
+        msg = "Cannot return a copy of the target"
+        expression = "a = 1 + 2"
+
+        with pytest.raises(ValueError, match=msg):
+            self.eval(expression, target=invalid_target, inplace=False)
+
+    @pytest.mark.parametrize("target", [1, "cat", [1, 2], np.array([]), (1, 3), {1: 2}])
+    def test_inplace_no_assignment(self, target):
+        expression = "1 + 2"
+
+        assert self.eval(expression, target=target, inplace=False) == 3
+
+        msg = "Cannot operate inplace if there is no assignment"
+        with pytest.raises(ValueError, match=msg):
+            self.eval(expression, target=target, inplace=True)
+
+    def test_basic_period_index_boolean_expression(self):
+        df = DataFrame(
+            np.random.default_rng(2).standard_normal((2, 2)),
+            columns=period_range("2020-01-01", freq="D", periods=2),
+        )
+        e = df < 2
+        r = self.eval("df < 2", local_dict={"df": df})
+        x = df < 2
+
+        tm.assert_frame_equal(r, e)
+        tm.assert_frame_equal(x, e)
+
+    def test_basic_period_index_subscript_expression(self):
+        df = DataFrame(
+            np.random.default_rng(2).standard_normal((2, 2)),
+            columns=period_range("2020-01-01", freq="D", periods=2),
+        )
+        r = self.eval("df[df < 2 + 3]", local_dict={"df": df})
+        e = df[df < 2 + 3]
+        tm.assert_frame_equal(r, e)
+
+    def test_nested_period_index_subscript_expression(self):
+        df = DataFrame(
+            np.random.default_rng(2).standard_normal((2, 2)),
+            columns=period_range("2020-01-01", freq="D", periods=2),
+        )
+        r = self.eval("df[df[df < 2] < 2] + df * 2", local_dict={"df": df})
+        e = df[df[df < 2] < 2] + df * 2
+        tm.assert_frame_equal(r, e)
+
+    def test_date_boolean(self, engine, parser):
+        df = DataFrame(np.random.default_rng(2).standard_normal((5, 3)))
+        df["dates1"] = date_range("1/1/2012", periods=5)
+        res = self.eval(
+            "df.dates1 < 20130101",
+            local_dict={"df": df},
+            engine=engine,
+            parser=parser,
+        )
+        expec = df.dates1 < "20130101"
+        tm.assert_series_equal(res, expec, check_names=False)
+
+    def test_simple_in_ops(self, engine, parser):
+        if parser != "python":
+            res = pd.eval("1 in [1, 2]", engine=engine, parser=parser)
+            assert res
+
+            res = pd.eval("2 in (1, 2)", engine=engine, parser=parser)
+            assert res
+
+            res = pd.eval("3 in (1, 2)", engine=engine, parser=parser)
+            assert not res
+
+            res = pd.eval("3 not in (1, 2)", engine=engine, parser=parser)
+            assert res
+
+            res = pd.eval("[3] not in (1, 2)", engine=engine, parser=parser)
+            assert res
+
+            res = pd.eval("[3] in ([3], 2)", engine=engine, parser=parser)
+            assert res
+
+            res = pd.eval("[[3]] in [[[3]], 2]", engine=engine, parser=parser)
+            assert res
+
+            res = pd.eval("(3,) in [(3,), 2]", engine=engine, parser=parser)
+            assert res
+
+            res = pd.eval("(3,) not in [(3,), 2]", engine=engine, parser=parser)
+            assert not res
+
+            res = pd.eval("[(3,)] in [[(3,)], 2]", engine=engine, parser=parser)
+            assert res
+        else:
+            msg = "'In' nodes are not implemented"
+            with pytest.raises(NotImplementedError, match=msg):
+                pd.eval("1 in [1, 2]", engine=engine, parser=parser)
+            with pytest.raises(NotImplementedError, match=msg):
+                pd.eval("2 in (1, 2)", engine=engine, parser=parser)
+            with pytest.raises(NotImplementedError, match=msg):
+                pd.eval("3 in (1, 2)", engine=engine, parser=parser)
+            with pytest.raises(NotImplementedError, match=msg):
+                pd.eval("[(3,)] in (1, 2, [(3,)])", engine=engine, parser=parser)
+            msg = "'NotIn' nodes are not implemented"
+            with pytest.raises(NotImplementedError, match=msg):
+                pd.eval("3 not in (1, 2)", engine=engine, parser=parser)
+            with pytest.raises(NotImplementedError, match=msg):
+                pd.eval("[3] not in (1, 2, [[3]])", engine=engine, parser=parser)
+
+    def test_check_many_exprs(self, engine, parser):
+        a = 1  # noqa: F841
+        expr = " * ".join("a" * 33)
+        expected = 1
+        res = pd.eval(expr, engine=engine, parser=parser)
+        assert res == expected
+
+    @pytest.mark.parametrize(
+        "expr",
+        [
+            "df > 2 and df > 3",
+            "df > 2 or df > 3",
+            "not df > 2",
+        ],
+    )
+    def test_fails_and_or_not(self, expr, engine, parser):
+        df = DataFrame(np.random.default_rng(2).standard_normal((5, 3)))
+        if parser == "python":
+            msg = "'BoolOp' nodes are not implemented"
+            if "not" in expr:
+                msg = "'Not' nodes are not implemented"
+
+            with pytest.raises(NotImplementedError, match=msg):
+                pd.eval(
+                    expr,
+                    local_dict={"df": df},
+                    parser=parser,
+                    engine=engine,
+                )
+        else:
+            # smoke-test, should not raise
+            pd.eval(
+                expr,
+                local_dict={"df": df},
+                parser=parser,
+                engine=engine,
+            )
+
+    @pytest.mark.parametrize("char", ["|", "&"])
+    def test_fails_ampersand_pipe(self, char, engine, parser):
+        df = DataFrame(np.random.default_rng(2).standard_normal((5, 3)))  # noqa: F841
+        ex = f"(df + 2)[df > 1] > 0 {char} (df > 0)"
+        if parser == "python":
+            msg = "cannot evaluate scalar only bool ops"
+            with pytest.raises(NotImplementedError, match=msg):
+                pd.eval(ex, parser=parser, engine=engine)
+        else:
+            # smoke-test, should not raise
+            pd.eval(ex, parser=parser, engine=engine)
+
+
+class TestMath:
+    def eval(self, *args, **kwargs):
+        kwargs["level"] = kwargs.pop("level", 0) + 1
+        return pd.eval(*args, **kwargs)
+
+    @pytest.mark.skipif(
+        not NUMEXPR_INSTALLED, reason="Unary ops only implemented for numexpr"
+    )
+    @pytest.mark.parametrize("fn", _unary_math_ops)
+    def test_unary_functions(self, fn):
+        df = DataFrame({"a": np.random.default_rng(2).standard_normal(10)})
+        a = df.a
+
+        expr = f"{fn}(a)"
+        got = self.eval(expr)
+        with np.errstate(all="ignore"):
+            expect = getattr(np, fn)(a)
+        tm.assert_series_equal(got, expect, check_names=False)
+
+    @pytest.mark.parametrize("fn", _binary_math_ops)
+    def test_binary_functions(self, fn):
+        df = DataFrame(
+            {
+                "a": np.random.default_rng(2).standard_normal(10),
+                "b": np.random.default_rng(2).standard_normal(10),
+            }
+        )
+        a = df.a
+        b = df.b
+
+        expr = f"{fn}(a, b)"
+        got = self.eval(expr)
+        with np.errstate(all="ignore"):
+            expect = getattr(np, fn)(a, b)
+        tm.assert_almost_equal(got, expect, check_names=False)
+
+    def test_df_use_case(self, engine, parser):
+        df = DataFrame(
+            {
+                "a": np.random.default_rng(2).standard_normal(10),
+                "b": np.random.default_rng(2).standard_normal(10),
+            }
+        )
+        df.eval(
+            "e = arctan2(sin(a), b)",
+            engine=engine,
+            parser=parser,
+            inplace=True,
+        )
+        got = df.e
+        expect = np.arctan2(np.sin(df.a), df.b)
+        tm.assert_series_equal(got, expect, check_names=False)
+
+    def test_df_arithmetic_subexpression(self, engine, parser):
+        df = DataFrame(
+            {
+                "a": np.random.default_rng(2).standard_normal(10),
+                "b": np.random.default_rng(2).standard_normal(10),
+            }
+        )
+        df.eval("e = sin(a + b)", engine=engine, parser=parser, inplace=True)
+        got = df.e
+        expect = np.sin(df.a + df.b)
+        tm.assert_series_equal(got, expect, check_names=False)
+
+    @pytest.mark.parametrize(
+        "dtype, expect_dtype",
+        [
+            (np.int32, np.float64),
+            (np.int64, np.float64),
+            (np.float32, np.float32),
+            (np.float64, np.float64),
+            pytest.param(np.complex128, np.complex128, marks=td.skip_if_windows),
+        ],
+    )
+    def test_result_types(self, dtype, expect_dtype, engine, parser):
+        # xref https://github.com/pandas-dev/pandas/issues/12293
+        # this fails on Windows, apparently a floating point precision issue
+
+        # Did not test complex64 because DataFrame is converting it to
+        # complex128. Due to https://github.com/pandas-dev/pandas/issues/10952
+        df = DataFrame(
+            {"a": np.random.default_rng(2).standard_normal(10).astype(dtype)}
+        )
+        assert df.a.dtype == dtype
+        df.eval("b = sin(a)", engine=engine, parser=parser, inplace=True)
+        got = df.b
+        expect = np.sin(df.a)
+        assert expect.dtype == got.dtype
+        assert expect_dtype == got.dtype
+        tm.assert_series_equal(got, expect, check_names=False)
+
+    def test_undefined_func(self, engine, parser):
+        df = DataFrame({"a": np.random.default_rng(2).standard_normal(10)})
+        msg = '"mysin" is not a supported function'
+
+        with pytest.raises(ValueError, match=msg):
+            df.eval("mysin(a)", engine=engine, parser=parser)
+
+    def test_keyword_arg(self, engine, parser):
+        df = DataFrame({"a": np.random.default_rng(2).standard_normal(10)})
+        msg = 'Function "sin" does not support keyword arguments'
+
+        with pytest.raises(TypeError, match=msg):
+            df.eval("sin(x=a)", engine=engine, parser=parser)
+
+
+_var_s = np.random.default_rng(2).standard_normal(10)
+
+
+class TestScope:
+    def test_global_scope(self, engine, parser):
+        e = "_var_s * 2"
+        tm.assert_numpy_array_equal(
+            _var_s * 2, pd.eval(e, engine=engine, parser=parser)
+        )
+
+    def test_no_new_locals(self, engine, parser):
+        x = 1
+        lcls = locals().copy()
+        pd.eval("x + 1", local_dict=lcls, engine=engine, parser=parser)
+        lcls2 = locals().copy()
+        lcls2.pop("lcls")
+        assert lcls == lcls2
+
+    def test_no_new_globals(self, engine, parser):
+        x = 1  # noqa: F841
+        gbls = globals().copy()
+        pd.eval("x + 1", engine=engine, parser=parser)
+        gbls2 = globals().copy()
+        assert gbls == gbls2
+
+    def test_empty_locals(self, engine, parser):
+        # GH 47084
+        x = 1  # noqa: F841
+        msg = "name 'x' is not defined"
+        with pytest.raises(UndefinedVariableError, match=msg):
+            pd.eval("x + 1", engine=engine, parser=parser, local_dict={})
+
+    def test_empty_globals(self, engine, parser):
+        # GH 47084
+        msg = "name '_var_s' is not defined"
+        e = "_var_s * 2"
+        with pytest.raises(UndefinedVariableError, match=msg):
+            pd.eval(e, engine=engine, parser=parser, global_dict={})
+
+
+@td.skip_if_no("numexpr")
+def test_invalid_engine():
+    msg = "Invalid engine 'asdf' passed"
+    with pytest.raises(KeyError, match=msg):
+        pd.eval("x + y", local_dict={"x": 1, "y": 2}, engine="asdf")
+
+
+@td.skip_if_no("numexpr")
+@pytest.mark.parametrize(
+    ("use_numexpr", "expected"),
+    (
+        (True, "numexpr"),
+        (False, "python"),
+    ),
+)
+def test_numexpr_option_respected(use_numexpr, expected):
+    # GH 32556
+    from pandas.core.computation.eval import _check_engine
+
+    with pd.option_context("compute.use_numexpr", use_numexpr):
+        result = _check_engine(None)
+        assert result == expected
+
+
+@td.skip_if_no("numexpr")
+def test_numexpr_option_incompatible_op():
+    # GH 32556
+    with pd.option_context("compute.use_numexpr", False):
+        df = DataFrame(
+            {"A": [True, False, True, False, None, None], "B": [1, 2, 3, 4, 5, 6]}
+        )
+        result = df.query("A.isnull()")
+        expected = DataFrame({"A": [None, None], "B": [5, 6]}, index=[4, 5])
+        tm.assert_frame_equal(result, expected)
+
+
+@td.skip_if_no("numexpr")
+def test_invalid_parser():
+    msg = "Invalid parser 'asdf' passed"
+    with pytest.raises(KeyError, match=msg):
+        pd.eval("x + y", local_dict={"x": 1, "y": 2}, parser="asdf")
+
+
+_parsers: dict[str, type[BaseExprVisitor]] = {
+    "python": PythonExprVisitor,
+    "pytables": pytables.PyTablesExprVisitor,
+    "pandas": PandasExprVisitor,
+}
+
+
+@pytest.mark.parametrize("engine", ENGINES)
+@pytest.mark.parametrize("parser", _parsers)
+def test_disallowed_nodes(engine, parser):
+    VisitorClass = _parsers[parser]
+    inst = VisitorClass("x + 1", engine, parser)
+
+    for ops in VisitorClass.unsupported_nodes:
+        msg = "nodes are not implemented"
+        with pytest.raises(NotImplementedError, match=msg):
+            getattr(inst, ops)()
+
+
+def test_syntax_error_exprs(engine, parser):
+    e = "s +"
+    with pytest.raises(SyntaxError, match="invalid syntax"):
+        pd.eval(e, engine=engine, parser=parser)
+
+
+def test_name_error_exprs(engine, parser):
+    e = "s + t"
+    msg = "name 's' is not defined"
+    with pytest.raises(NameError, match=msg):
+        pd.eval(e, engine=engine, parser=parser)
+
+
+@pytest.mark.parametrize("express", ["a + @b", "@a + b", "@a + @b"])
+def test_invalid_local_variable_reference(engine, parser, express):
+    a, b = 1, 2  # noqa: F841
+
+    if parser != "pandas":
+        with pytest.raises(SyntaxError, match="The '@' prefix is only"):
+            pd.eval(express, engine=engine, parser=parser)
+    else:
+        with pytest.raises(SyntaxError, match="The '@' prefix is not"):
+            pd.eval(express, engine=engine, parser=parser)
+
+
+def test_numexpr_builtin_raises(engine, parser):
+    sin, dotted_line = 1, 2
+    if engine == "numexpr":
+        msg = "Variables in expression .+"
+        with pytest.raises(NumExprClobberingError, match=msg):
+            pd.eval("sin + dotted_line", engine=engine, parser=parser)
+    else:
+        res = pd.eval("sin + dotted_line", engine=engine, parser=parser)
+        assert res == sin + dotted_line
+
+
+def test_bad_resolver_raises(engine, parser):
+    cannot_resolve = 42, 3.0
+    with pytest.raises(TypeError, match="Resolver of type .+"):
+        pd.eval("1 + 2", resolvers=cannot_resolve, engine=engine, parser=parser)
+
+
+def test_empty_string_raises(engine, parser):
+    # GH 13139
+    with pytest.raises(ValueError, match="expr cannot be an empty string"):
+        pd.eval("", engine=engine, parser=parser)
+
+
+def test_more_than_one_expression_raises(engine, parser):
+    with pytest.raises(SyntaxError, match="only a single expression is allowed"):
+        pd.eval("1 + 1; 2 + 2", engine=engine, parser=parser)
+
+
+@pytest.mark.parametrize("cmp", ("and", "or"))
+@pytest.mark.parametrize("lhs", (int, float))
+@pytest.mark.parametrize("rhs", (int, float))
+def test_bool_ops_fails_on_scalars(lhs, cmp, rhs, engine, parser):
+    gen = {
+        int: lambda: np.random.default_rng(2).integers(10),
+        float: np.random.default_rng(2).standard_normal,
+    }
+
+    mid = gen[lhs]()  # noqa: F841
+    lhs = gen[lhs]()
+    rhs = gen[rhs]()
+
+    ex1 = f"lhs {cmp} mid {cmp} rhs"
+    ex2 = f"lhs {cmp} mid and mid {cmp} rhs"
+    ex3 = f"(lhs {cmp} mid) & (mid {cmp} rhs)"
+    for ex in (ex1, ex2, ex3):
+        msg = "cannot evaluate scalar only bool ops|'BoolOp' nodes are not"
+        with pytest.raises(NotImplementedError, match=msg):
+            pd.eval(ex, engine=engine, parser=parser)
+
+
+@pytest.mark.parametrize(
+    "other",
+    [
+        "'x'",
+        "...",
+    ],
+)
+def test_equals_various(other):
+    df = DataFrame({"A": ["a", "b", "c"]}, dtype=object)
+    result = df.eval(f"A == {other}")
+    expected = Series([False, False, False], name="A")
+    if USE_NUMEXPR:
+        # https://github.com/pandas-dev/pandas/issues/10239
+        # lose name with numexpr engine. Remove when that's fixed.
+        expected.name = None
+    tm.assert_series_equal(result, expected)
+
+
+def test_inf(engine, parser):
+    s = "inf + 1"
+    expected = np.inf
+    result = pd.eval(s, engine=engine, parser=parser)
+    assert result == expected
+
+
+@pytest.mark.parametrize("column", ["Temp(°C)", "Capacitance(μF)"])
+def test_query_token(engine, column):
+    # See: https://github.com/pandas-dev/pandas/pull/42826
+    df = DataFrame(
+        np.random.default_rng(2).standard_normal((5, 2)), columns=[column, "b"]
+    )
+    expected = df[df[column] > 5]
+    query_string = f"`{column}` > 5"
+    result = df.query(query_string, engine=engine)
+    tm.assert_frame_equal(result, expected)
+
+
+def test_negate_lt_eq_le(engine, parser):
+    df = DataFrame([[0, 10], [1, 20]], columns=["cat", "count"])
+    expected = df[~(df.cat > 0)]
+
+    result = df.query("~(cat > 0)", engine=engine, parser=parser)
+    tm.assert_frame_equal(result, expected)
+
+    if parser == "python":
+        msg = "'Not' nodes are not implemented"
+        with pytest.raises(NotImplementedError, match=msg):
+            df.query("not (cat > 0)", engine=engine, parser=parser)
+    else:
+        result = df.query("not (cat > 0)", engine=engine, parser=parser)
+        tm.assert_frame_equal(result, expected)
+
+
+@pytest.mark.parametrize(
+    "column",
+    DEFAULT_GLOBALS.keys(),
+)
+def test_eval_no_support_column_name(request, column):
+    # GH 44603
+    if column in ["True", "False", "inf", "Inf"]:
+        request.applymarker(
+            pytest.mark.xfail(
+                raises=KeyError,
+                reason=f"GH 47859 DataFrame eval not supported with {column}",
+            )
+        )
+
+    df = DataFrame(
+        np.random.default_rng(2).integers(0, 100, size=(10, 2)),
+        columns=[column, "col1"],
+    )
+    expected = df[df[column] > 6]
+    result = df.query(f"{column}>6")
+
+    tm.assert_frame_equal(result, expected)
+
+
+def test_set_inplace(using_copy_on_write, warn_copy_on_write):
+    # https://github.com/pandas-dev/pandas/issues/47449
+    # Ensure we don't only update the DataFrame inplace, but also the actual
+    # column values, such that references to this column also get updated
+    df = DataFrame({"A": [1, 2, 3], "B": [4, 5, 6], "C": [7, 8, 9]})
+    result_view = df[:]
+    ser = df["A"]
+    with tm.assert_cow_warning(warn_copy_on_write):
+        df.eval("A = B + C", inplace=True)
+    expected = DataFrame({"A": [11, 13, 15], "B": [4, 5, 6], "C": [7, 8, 9]})
+    tm.assert_frame_equal(df, expected)
+    if not using_copy_on_write:
+        tm.assert_series_equal(ser, expected["A"])
+        tm.assert_series_equal(result_view["A"], expected["A"])
+    else:
+        expected = Series([1, 2, 3], name="A")
+        tm.assert_series_equal(ser, expected)
+        tm.assert_series_equal(result_view["A"], expected)
+
+
+class TestValidate:
+    @pytest.mark.parametrize("value", [1, "True", [1, 2, 3], 5.0])
+    def test_validate_bool_args(self, value):
+        msg = 'For argument "inplace" expected type bool, received type'
+        with pytest.raises(ValueError, match=msg):
+            pd.eval("2+2", inplace=value)
deepseek/lib/python3.10/site-packages/pandas/tests/series/__pycache__/test_api.cpython-310.pyc ADDED
Binary file (10.6 kB).
deepseek/lib/python3.10/site-packages/pandas/tests/series/__pycache__/test_constructors.cpython-310.pyc ADDED
Binary file (78.4 kB).
deepseek/lib/python3.10/site-packages/pandas/tests/series/__pycache__/test_missing.cpython-310.pyc ADDED
Binary file (3.69 kB).
deepseek/lib/python3.10/site-packages/pandas/tests/series/__pycache__/test_reductions.cpython-310.pyc ADDED
Binary file (6.8 kB).
deepseek/lib/python3.10/site-packages/pandas/tests/series/__pycache__/test_unary.cpython-310.pyc ADDED
Binary file (1.93 kB).
deepseek/lib/python3.10/site-packages/rpds_py-0.22.3.dist-info/REQUESTED ADDED
File without changes
deepseek/lib/python3.10/site-packages/rpds_py-0.22.3.dist-info/WHEEL ADDED
@@ -0,0 +1,4 @@
+Wheel-Version: 1.0
+Generator: maturin (1.7.8)
+Root-Is-Purelib: false
+Tag: cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64
deepseekvl2/lib/python3.10/site-packages/torch/utils/__pycache__/_contextlib.cpython-310.pyc ADDED
Binary file (4.83 kB).
deepseekvl2/lib/python3.10/site-packages/torch/utils/__pycache__/_cuda_trace.cpython-310.pyc ADDED
Binary file (3.88 kB).
deepseekvl2/lib/python3.10/site-packages/torch/utils/__pycache__/_pytree.cpython-310.pyc ADDED
Binary file (11.3 kB).