| content | path | filename | language | size_bytes | quality_score | complexity | documentation_ratio | repository | stars | created_date | license | is_test | file_hash |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
\n\n | .venv\Lib\site-packages\matplotlib\tests\__pycache__\test_getattr.cpython-313.pyc | test_getattr.cpython-313.pyc | Other | 2,088 | 0.95 | 0.034483 | 0 | python-kit | 964 | 2023-10-30T20:23:55.736408 | MIT | true | 21f798dd712fc1256075212439ef070e |
\n\n | .venv\Lib\site-packages\matplotlib\tests\__pycache__\test_gridspec.cpython-313.pyc | test_gridspec.cpython-313.pyc | Other | 3,021 | 0.8 | 0 | 0.054054 | node-utils | 565 | 2024-01-28T18:21:24.534695 | MIT | true | e51d7a87ddb7793d711cc0c29c4b2666 |
\n\n | .venv\Lib\site-packages\matplotlib\tests\__pycache__\test_image.cpython-313.pyc | test_image.cpython-313.pyc | Other | 101,599 | 0.75 | 0.005319 | 0.023596 | awesome-app | 561 | 2024-07-27T19:00:16.821043 | Apache-2.0 | true | 2379886f238bc23e32cc5166ab1be637 |
\n\n | .venv\Lib\site-packages\matplotlib\tests\__pycache__\test_legend.cpython-313.pyc | test_legend.cpython-313.pyc | Other | 96,976 | 0.6 | 0.001175 | 0.030451 | node-utils | 953 | 2025-01-14T18:35:28.521942 | BSD-3-Clause | true | c55e1126aa86584ee14dcfc3a46a8cd3 |
\n\n | .venv\Lib\site-packages\matplotlib\tests\__pycache__\test_lines.cpython-313.pyc | test_lines.cpython-313.pyc | Other | 26,206 | 0.8 | 0.011952 | 0.021552 | python-kit | 622 | 2023-08-13T23:20:40.111668 | GPL-3.0 | true | a46f2c86045e29c25448197dee60c7bd |
\n\n | .venv\Lib\site-packages\matplotlib\tests\__pycache__\test_marker.cpython-313.pyc | test_marker.cpython-313.pyc | Other | 16,107 | 0.8 | 0 | 0.039474 | python-kit | 216 | 2023-08-16T17:14:10.779660 | MIT | true | 2d9a885c7faae4f06f954540c4260d0e |
\n\n | .venv\Lib\site-packages\matplotlib\tests\__pycache__\test_mathtext.cpython-313.pyc | test_mathtext.cpython-313.pyc | Other | 28,093 | 0.95 | 0.01676 | 0.040698 | vue-tools | 30 | 2024-03-07T05:16:01.216856 | BSD-3-Clause | true | 92fa99bae273400548558b36938013d2 |
\n\n | .venv\Lib\site-packages\matplotlib\tests\__pycache__\test_matplotlib.cpython-313.pyc | test_matplotlib.cpython-313.pyc | Other | 4,858 | 0.95 | 0.022222 | 0 | node-utils | 154 | 2024-02-24T00:54:54.229609 | BSD-3-Clause | true | 33b1221027587c7930f68eadf8cac49c |
\n\n | .venv\Lib\site-packages\matplotlib\tests\__pycache__\test_mlab.cpython-313.pyc | test_mlab.cpython-313.pyc | Other | 59,498 | 0.6 | 0.025271 | 0.032193 | awesome-app | 132 | 2024-04-30T01:44:11.106929 | MIT | true | 67c739e336c9f26cc1345599ca8b22b7 |
\n\n | .venv\Lib\site-packages\matplotlib\tests\__pycache__\test_multivariate_colormaps.cpython-313.pyc | test_multivariate_colormaps.cpython-313.pyc | Other | 28,841 | 0.8 | 0.00542 | 0.064607 | react-lib | 463 | 2024-03-15T20:08:44.035143 | GPL-3.0 | true | 6b352d7ada743ffb79b2ac1c4366d5a3 |
\n\n | .venv\Lib\site-packages\matplotlib\tests\__pycache__\test_offsetbox.cpython-313.pyc | test_offsetbox.cpython-313.pyc | Other | 21,170 | 0.8 | 0 | 0.033149 | react-lib | 13 | 2025-04-18T17:05:19.517193 | MIT | true | 49b58af19cc3f67df5e3f068055f4799 |
\n\n | .venv\Lib\site-packages\matplotlib\tests\__pycache__\test_patches.cpython-313.pyc | test_patches.cpython-313.pyc | Other | 49,085 | 0.8 | 0 | 0.022358 | react-lib | 258 | 2024-11-11T01:00:48.428086 | BSD-3-Clause | true | c3eec6ab893aee9c160f37e8e4ecd6e4 |
\n\n | .venv\Lib\site-packages\matplotlib\tests\__pycache__\test_path.cpython-313.pyc | test_path.cpython-313.pyc | Other | 34,159 | 0.8 | 0 | 0.008746 | react-lib | 906 | 2025-05-16T22:35:16.571643 | MIT | true | 76acb29c66c552d3272c9481c0d3337e |
\n\n | .venv\Lib\site-packages\matplotlib\tests\__pycache__\test_patheffects.cpython-313.pyc | test_patheffects.cpython-313.pyc | Other | 13,284 | 0.8 | 0 | 0.007576 | python-kit | 750 | 2023-08-03T10:53:14.280231 | Apache-2.0 | true | 2346ef375307bf8be89edf1ba19fb4a2 |
\n\n | .venv\Lib\site-packages\matplotlib\tests\__pycache__\test_pickle.cpython-313.pyc | test_pickle.cpython-313.pyc | Other | 19,797 | 0.95 | 0 | 0.026667 | python-kit | 980 | 2025-02-17T00:40:17.311624 | MIT | true | 06226e564793af5ef75b682d2d1df3d6 |
\n\n | .venv\Lib\site-packages\matplotlib\tests\__pycache__\test_png.cpython-313.pyc | test_png.cpython-313.pyc | Other | 3,130 | 0.8 | 0 | 0 | node-utils | 334 | 2025-05-29T18:51:20.493174 | Apache-2.0 | true | 26394da3f0de789049d29c83767d510f |
\n\n | .venv\Lib\site-packages\matplotlib\tests\__pycache__\test_polar.cpython-313.pyc | test_polar.cpython-313.pyc | Other | 30,667 | 0.8 | 0.003497 | 0.070039 | python-kit | 708 | 2024-09-22T09:31:31.541245 | MIT | true | 74978a906b5c242bc896a154b3463ff4 |
\n\n | .venv\Lib\site-packages\matplotlib\tests\__pycache__\test_preprocess_data.cpython-313.pyc | test_preprocess_data.cpython-313.pyc | Other | 14,073 | 0.95 | 0.005495 | 0 | react-lib | 962 | 2025-06-11T01:19:30.914653 | BSD-3-Clause | true | c459e850ab20ff13b55dedcf8352d7ad |
\n\n | .venv\Lib\site-packages\matplotlib\tests\__pycache__\test_pyplot.cpython-313.pyc | test_pyplot.cpython-313.pyc | Other | 26,148 | 0.95 | 0.00365 | 0.021008 | awesome-app | 401 | 2025-01-04T05:43:17.513307 | BSD-3-Clause | true | ee9944f62a50adcef03707540be200fb |
\n\n | .venv\Lib\site-packages\matplotlib\tests\__pycache__\test_quiver.cpython-313.pyc | test_quiver.cpython-313.pyc | Other | 21,767 | 0.8 | 0.003922 | 0.00813 | python-kit | 773 | 2024-02-12T21:21:11.913962 | BSD-3-Clause | true | 7883fd38f75b407945d0bd79418d6fe8 |
\n\n | .venv\Lib\site-packages\matplotlib\tests\__pycache__\test_rcparams.cpython-313.pyc | test_rcparams.cpython-313.pyc | Other | 32,289 | 0.95 | 0.017241 | 0.031802 | python-kit | 343 | 2024-07-09T16:17:19.939026 | Apache-2.0 | true | ced1150616537717ed641bf118b4c290 |
\n\n | .venv\Lib\site-packages\matplotlib\tests\__pycache__\test_sankey.cpython-313.pyc | test_sankey.cpython-313.pyc | Other | 6,066 | 0.8 | 0 | 0.015625 | node-utils | 341 | 2024-03-28T16:36:34.848267 | MIT | true | b11744a10a1929fe3e8e0aee585b5245 |
\n\n | .venv\Lib\site-packages\matplotlib\tests\__pycache__\test_scale.cpython-313.pyc | test_scale.cpython-313.pyc | Other | 17,359 | 0.95 | 0.005714 | 0.116279 | awesome-app | 112 | 2025-05-20T08:59:30.635146 | GPL-3.0 | true | 00eaf9fba45b695e32d5797e5d4ada3e |
\n\n | .venv\Lib\site-packages\matplotlib\tests\__pycache__\test_simplification.cpython-313.pyc | test_simplification.cpython-313.pyc | Other | 33,832 | 0.8 | 0 | 0.014706 | awesome-app | 156 | 2025-06-12T07:32:03.794843 | Apache-2.0 | true | a4625de3e8c8829fcd742302f4d6bfeb |
\n\n | .venv\Lib\site-packages\matplotlib\tests\__pycache__\test_skew.cpython-313.pyc | test_skew.cpython-313.pyc | Other | 9,984 | 0.8 | 0.014493 | 0.015385 | vue-tools | 806 | 2025-01-27T09:57:08.463300 | MIT | true | 3b38def1d53129ab5c4873bed4dd3739 |
\n\n | .venv\Lib\site-packages\matplotlib\tests\__pycache__\test_sphinxext.cpython-313.pyc | test_sphinxext.cpython-313.pyc | Other | 11,953 | 0.95 | 0.034188 | 0.008929 | node-utils | 423 | 2023-08-24T23:30:42.798153 | GPL-3.0 | true | 4adda4df52640d3bae7ad4c785444c6e |
\n\n | .venv\Lib\site-packages\matplotlib\tests\__pycache__\test_spines.cpython-313.pyc | test_spines.cpython-313.pyc | Other | 11,012 | 0.8 | 0 | 0.0125 | node-utils | 358 | 2023-09-28T13:46:09.956788 | Apache-2.0 | true | 362d2f5f282a6cb78271fd0a599a55d4 |
\n\n | .venv\Lib\site-packages\matplotlib\tests\__pycache__\test_streamplot.cpython-313.pyc | test_streamplot.cpython-313.pyc | Other | 11,300 | 0.8 | 0 | 0 | react-lib | 461 | 2023-08-26T13:35:11.481611 | GPL-3.0 | true | f6f4699ae7983d17ffad3ed9340d019d |
\n\n | .venv\Lib\site-packages\matplotlib\tests\__pycache__\test_style.cpython-313.pyc | test_style.cpython-313.pyc | Other | 12,564 | 0.8 | 0 | 0 | react-lib | 157 | 2023-08-07T21:48:56.360252 | GPL-3.0 | true | 4ff44df9eb9558e9b16f0ea214b8e907 |
\n\n | .venv\Lib\site-packages\matplotlib\tests\__pycache__\test_subplots.cpython-313.pyc | test_subplots.cpython-313.pyc | Other | 15,659 | 0.8 | 0 | 0.036765 | vue-tools | 386 | 2024-10-05T15:51:56.183802 | GPL-3.0 | true | 162de70ddcbc647070176aa5973481ac |
\n\n | .venv\Lib\site-packages\matplotlib\tests\__pycache__\test_table.cpython-313.pyc | test_table.cpython-313.pyc | Other | 13,920 | 0.8 | 0 | 0.025 | vue-tools | 367 | 2023-07-26T03:43:09.870826 | MIT | true | 6f4b553e64bff278bb96325e0f8cc4cf |
\n\n | .venv\Lib\site-packages\matplotlib\tests\__pycache__\test_testing.cpython-313.pyc | test_testing.cpython-313.pyc | Other | 2,410 | 0.95 | 0.076923 | 0.041667 | python-kit | 513 | 2024-03-11T03:59:07.917695 | BSD-3-Clause | true | 03a2e807821882296a712768c792e3a1 |
\n\n | .venv\Lib\site-packages\matplotlib\tests\__pycache__\test_texmanager.cpython-313.pyc | test_texmanager.cpython-313.pyc | Other | 3,773 | 0.95 | 0 | 0 | awesome-app | 271 | 2025-03-06T07:36:06.892920 | MIT | true | 281d993658a6f7af5bec23edb8a4682f |
\n\n | .venv\Lib\site-packages\matplotlib\tests\__pycache__\test_text.cpython-313.pyc | test_text.cpython-313.pyc | Other | 59,345 | 0.6 | 0.002137 | 0.037123 | node-utils | 463 | 2025-04-23T13:43:18.270224 | MIT | true | 249b09e60d3cd2aee0ac38235faca5c4 |
\n\n | .venv\Lib\site-packages\matplotlib\tests\__pycache__\test_textpath.cpython-313.pyc | test_textpath.cpython-313.pyc | Other | 884 | 0.8 | 0 | 0 | vue-tools | 624 | 2024-09-13T03:47:51.470544 | Apache-2.0 | true | 009db2f268492285fbac867568c888b7 |
\n\n | .venv\Lib\site-packages\matplotlib\tests\__pycache__\test_ticker.cpython-313.pyc | test_ticker.cpython-313.pyc | Other | 104,586 | 0.6 | 0.014134 | 0.007417 | vue-tools | 472 | 2024-12-15T14:29:04.489490 | GPL-3.0 | true | 8feaa5fedfc8321880687997be90e7e9 |
\n\n | .venv\Lib\site-packages\matplotlib\tests\__pycache__\test_tightlayout.cpython-313.pyc | test_tightlayout.cpython-313.pyc | Other | 22,278 | 0.8 | 0.021186 | 0.065421 | node-utils | 308 | 2025-03-06T13:39:37.709303 | BSD-3-Clause | true | 3a7bca3122bb1f721c98398d3623eb7e |
\n\n | .venv\Lib\site-packages\matplotlib\tests\__pycache__\test_transforms.cpython-313.pyc | test_transforms.cpython-313.pyc | Other | 79,146 | 0.75 | 0.007117 | 0.014706 | react-lib | 958 | 2025-05-15T13:04:25.324425 | BSD-3-Clause | true | 08fdcd1749f7b72d655777cbccc8e536 |
\n\n | .venv\Lib\site-packages\matplotlib\tests\__pycache__\test_triangulation.cpython-313.pyc | test_triangulation.cpython-313.pyc | Other | 77,302 | 0.75 | 0.000952 | 0.017382 | react-lib | 837 | 2024-10-09T19:36:43.555768 | BSD-3-Clause | true | 63c9335414687b0bddca0fac3e5272c1 |
\n\n | .venv\Lib\site-packages\matplotlib\tests\__pycache__\test_type1font.cpython-313.pyc | test_type1font.cpython-313.pyc | Other | 10,895 | 0.8 | 0.010204 | 0.043956 | react-lib | 124 | 2024-05-25T04:28:42.569942 | MIT | true | c946b00287600f33aae8117a55289e64 |
\n\n | .venv\Lib\site-packages\matplotlib\tests\__pycache__\test_units.cpython-313.pyc | test_units.cpython-313.pyc | Other | 22,756 | 0.95 | 0 | 0.005988 | react-lib | 237 | 2024-10-01T10:55:34.424383 | Apache-2.0 | true | 1050a3e57c1abfd47c2fb3a00cc5d240 |
\n\n | .venv\Lib\site-packages\matplotlib\tests\__pycache__\test_usetex.cpython-313.pyc | test_usetex.cpython-313.pyc | Other | 10,757 | 0.8 | 0.010526 | 0.073171 | vue-tools | 86 | 2024-04-09T12:00:37.618389 | BSD-3-Clause | true | eb7c2debad5553ffca64111547b6ae4f |
\n\n | .venv\Lib\site-packages\matplotlib\tests\__pycache__\test_widgets.cpython-313.pyc | test_widgets.cpython-313.pyc | Other | 81,617 | 0.75 | 0.004796 | 0.040943 | awesome-app | 718 | 2024-10-12T02:37:06.900773 | MIT | true | cba69b6191e6244086457c31903c6fb5 |
\n\n | .venv\Lib\site-packages\matplotlib\tests\__pycache__\__init__.cpython-313.pyc | __init__.cpython-313.pyc | Other | 609 | 0.7 | 0 | 0 | awesome-app | 157 | 2025-06-13T10:49:22.649107 | MIT | true | 06cf224937230bd4ca1f70649d7f0175 |
import sys\n\nimport numpy as np\n\nfrom matplotlib import _api\n\n\nclass Triangulation:\n """\n An unstructured triangular grid consisting of npoints points and\n ntri triangles. The triangles can either be specified by the user\n or automatically generated using a Delaunay triangulation.\n\n Parameters\n ----------\n x, y : (npoints,) array-like\n Coordinates of grid points.\n triangles : (ntri, 3) array-like of int, optional\n For each triangle, the indices of the three points that make\n up the triangle, ordered in an anticlockwise manner. If not\n specified, the Delaunay triangulation is calculated.\n mask : (ntri,) array-like of bool, optional\n Which triangles are masked out.\n\n Attributes\n ----------\n triangles : (ntri, 3) array of int\n For each triangle, the indices of the three points that make\n up the triangle, ordered in an anticlockwise manner. If you want to\n take the *mask* into account, use `get_masked_triangles` instead.\n mask : (ntri, 3) array of bool or None\n Masked out triangles.\n is_delaunay : bool\n Whether the Triangulation is a calculated Delaunay\n triangulation (where *triangles* was not specified) or not.\n\n Notes\n -----\n For a Triangulation to be valid it must not have duplicate points,\n triangles formed from colinear points, or overlapping triangles.\n """\n def __init__(self, x, y, triangles=None, mask=None):\n from matplotlib import _qhull\n\n self.x = np.asarray(x, dtype=np.float64)\n self.y = np.asarray(y, dtype=np.float64)\n if self.x.shape != self.y.shape or self.x.ndim != 1:\n raise ValueError("x and y must be equal-length 1D arrays, but "\n f"found shapes {self.x.shape!r} and "\n f"{self.y.shape!r}")\n\n self.mask = None\n self._edges = None\n self._neighbors = None\n self.is_delaunay = False\n\n if triangles is None:\n # No triangulation specified, so use matplotlib._qhull to obtain\n # Delaunay triangulation.\n self.triangles, self._neighbors = _qhull.delaunay(x, y, sys.flags.verbose)\n self.is_delaunay = True\n 
else:\n # Triangulation specified. Copy, since we may correct triangle\n # orientation.\n try:\n self.triangles = np.array(triangles, dtype=np.int32, order='C')\n except ValueError as e:\n raise ValueError('triangles must be a (N, 3) int array, not '\n f'{triangles!r}') from e\n if self.triangles.ndim != 2 or self.triangles.shape[1] != 3:\n raise ValueError(\n 'triangles must be a (N, 3) int array, but found shape '\n f'{self.triangles.shape!r}')\n if self.triangles.max() >= len(self.x):\n raise ValueError(\n 'triangles are indices into the points and must be in the '\n f'range 0 <= i < {len(self.x)} but found value '\n f'{self.triangles.max()}')\n if self.triangles.min() < 0:\n raise ValueError(\n 'triangles are indices into the points and must be in the '\n f'range 0 <= i < {len(self.x)} but found value '\n f'{self.triangles.min()}')\n\n # Underlying C++ object is not created until first needed.\n self._cpp_triangulation = None\n\n # Default TriFinder not created until needed.\n self._trifinder = None\n\n self.set_mask(mask)\n\n def calculate_plane_coefficients(self, z):\n """\n Calculate plane equation coefficients for all unmasked triangles from\n the point (x, y) coordinates and specified z-array of shape (npoints).\n The returned array has shape (npoints, 3) and allows z-value at (x, y)\n position in triangle tri to be calculated using\n ``z = array[tri, 0] * x + array[tri, 1] * y + array[tri, 2]``.\n """\n return self.get_cpp_triangulation().calculate_plane_coefficients(z)\n\n @property\n def edges(self):\n """\n Return integer array of shape (nedges, 2) containing all edges of\n non-masked triangles.\n\n Each row defines an edge by its start point index and end point\n index. Each edge appears only once, i.e. 
for an edge between points\n *i* and *j*, there will only be either *(i, j)* or *(j, i)*.\n """\n if self._edges is None:\n self._edges = self.get_cpp_triangulation().get_edges()\n return self._edges\n\n def get_cpp_triangulation(self):\n """\n Return the underlying C++ Triangulation object, creating it\n if necessary.\n """\n from matplotlib import _tri\n if self._cpp_triangulation is None:\n self._cpp_triangulation = _tri.Triangulation(\n # For unset arrays use empty tuple which has size of zero.\n self.x, self.y, self.triangles,\n self.mask if self.mask is not None else (),\n self._edges if self._edges is not None else (),\n self._neighbors if self._neighbors is not None else (),\n not self.is_delaunay)\n return self._cpp_triangulation\n\n def get_masked_triangles(self):\n """\n Return an array of triangles taking the mask into account.\n """\n if self.mask is not None:\n return self.triangles[~self.mask]\n else:\n return self.triangles\n\n @staticmethod\n def get_from_args_and_kwargs(*args, **kwargs):\n """\n Return a Triangulation object from the args and kwargs, and\n the remaining args and kwargs with the consumed values removed.\n\n There are two alternatives: either the first argument is a\n Triangulation object, in which case it is returned, or the args\n and kwargs are sufficient to create a new Triangulation to\n return. 
In the latter case, see Triangulation.__init__ for\n the possible args and kwargs.\n """\n if isinstance(args[0], Triangulation):\n triangulation, *args = args\n if 'triangles' in kwargs:\n _api.warn_external(\n "Passing the keyword 'triangles' has no effect when also "\n "passing a Triangulation")\n if 'mask' in kwargs:\n _api.warn_external(\n "Passing the keyword 'mask' has no effect when also "\n "passing a Triangulation")\n else:\n x, y, triangles, mask, args, kwargs = \\n Triangulation._extract_triangulation_params(args, kwargs)\n triangulation = Triangulation(x, y, triangles, mask)\n return triangulation, args, kwargs\n\n @staticmethod\n def _extract_triangulation_params(args, kwargs):\n x, y, *args = args\n # Check triangles in kwargs then args.\n triangles = kwargs.pop('triangles', None)\n from_args = False\n if triangles is None and args:\n triangles = args[0]\n from_args = True\n if triangles is not None:\n try:\n triangles = np.asarray(triangles, dtype=np.int32)\n except ValueError:\n triangles = None\n if triangles is not None and (triangles.ndim != 2 or\n triangles.shape[1] != 3):\n triangles = None\n if triangles is not None and from_args:\n args = args[1:] # Consumed first item in args.\n # Check for mask in kwargs.\n mask = kwargs.pop('mask', None)\n return x, y, triangles, mask, args, kwargs\n\n def get_trifinder(self):\n """\n Return the default `matplotlib.tri.TriFinder` of this\n triangulation, creating it if necessary. This allows the same\n TriFinder object to be easily shared.\n """\n if self._trifinder is None:\n # Default TriFinder class.\n from matplotlib.tri._trifinder import TrapezoidMapTriFinder\n self._trifinder = TrapezoidMapTriFinder(self)\n return self._trifinder\n\n @property\n def neighbors(self):\n """\n Return integer array of shape (ntri, 3) containing neighbor triangles.\n\n For each triangle, the indices of the three triangles that\n share the same edges, or -1 if there is no such neighboring\n triangle. 
``neighbors[i, j]`` is the triangle that is the neighbor\n to the edge from point index ``triangles[i, j]`` to point index\n ``triangles[i, (j+1)%3]``.\n """\n if self._neighbors is None:\n self._neighbors = self.get_cpp_triangulation().get_neighbors()\n return self._neighbors\n\n def set_mask(self, mask):\n """\n Set or clear the mask array.\n\n Parameters\n ----------\n mask : None or bool array of length ntri\n """\n if mask is None:\n self.mask = None\n else:\n self.mask = np.asarray(mask, dtype=bool)\n if self.mask.shape != (self.triangles.shape[0],):\n raise ValueError('mask array must have same length as '\n 'triangles array')\n\n # Set mask in C++ Triangulation.\n if self._cpp_triangulation is not None:\n self._cpp_triangulation.set_mask(\n self.mask if self.mask is not None else ())\n\n # Clear derived fields so they are recalculated when needed.\n self._edges = None\n self._neighbors = None\n\n # Recalculate TriFinder if it exists.\n if self._trifinder is not None:\n self._trifinder._initialize()\n | .venv\Lib\site-packages\matplotlib\tri\_triangulation.py | _triangulation.py | Python | 9,784 | 0.95 | 0.190283 | 0.06422 | awesome-app | 665 | 2024-04-14T01:43:29.530211 | MIT | false | b3e66c1ff7701647231e5755f754723a |
from matplotlib import _tri\nfrom matplotlib.tri._trifinder import TriFinder\n\nimport numpy as np\nfrom numpy.typing import ArrayLike\nfrom typing import Any\n\nclass Triangulation:\n x: np.ndarray\n y: np.ndarray\n mask: np.ndarray | None\n is_delaunay: bool\n triangles: np.ndarray\n def __init__(\n self,\n x: ArrayLike,\n y: ArrayLike,\n triangles: ArrayLike | None = ...,\n mask: ArrayLike | None = ...,\n ) -> None: ...\n def calculate_plane_coefficients(self, z: ArrayLike) -> np.ndarray: ...\n @property\n def edges(self) -> np.ndarray: ...\n def get_cpp_triangulation(self) -> _tri.Triangulation: ...\n def get_masked_triangles(self) -> np.ndarray: ...\n @staticmethod\n def get_from_args_and_kwargs(\n *args, **kwargs\n ) -> tuple[Triangulation, tuple[Any, ...], dict[str, Any]]: ...\n def get_trifinder(self) -> TriFinder: ...\n @property\n def neighbors(self) -> np.ndarray: ...\n def set_mask(self, mask: None | ArrayLike) -> None: ...\n | .venv\Lib\site-packages\matplotlib\tri\_triangulation.pyi | _triangulation.pyi | Other | 1,017 | 0.85 | 0.30303 | 0.032258 | node-utils | 737 | 2025-01-16T22:13:07.001945 | GPL-3.0 | false | 51fd74ae5b7f2e68251807f872155616 |
import numpy as np\n\nfrom matplotlib import _docstring\nfrom matplotlib.contour import ContourSet\nfrom matplotlib.tri._triangulation import Triangulation\n\n\n@_docstring.interpd\nclass TriContourSet(ContourSet):\n """\n Create and store a set of contour lines or filled regions for\n a triangular grid.\n\n This class is typically not instantiated directly by the user but by\n `~.Axes.tricontour` and `~.Axes.tricontourf`.\n\n %(contour_set_attributes)s\n """\n def __init__(self, ax, *args, **kwargs):\n """\n Draw triangular grid contour lines or filled regions,\n depending on whether keyword arg *filled* is False\n (default) or True.\n\n The first argument of the initializer must be an `~.axes.Axes`\n object. The remaining arguments and keyword arguments\n are described in the docstring of `~.Axes.tricontour`.\n """\n super().__init__(ax, *args, **kwargs)\n\n def _process_args(self, *args, **kwargs):\n """\n Process args and kwargs.\n """\n if isinstance(args[0], TriContourSet):\n C = args[0]._contour_generator\n if self.levels is None:\n self.levels = args[0].levels\n self.zmin = args[0].zmin\n self.zmax = args[0].zmax\n self._mins = args[0]._mins\n self._maxs = args[0]._maxs\n else:\n from matplotlib import _tri\n tri, z = self._contour_args(args, kwargs)\n C = _tri.TriContourGenerator(tri.get_cpp_triangulation(), z)\n self._mins = [tri.x.min(), tri.y.min()]\n self._maxs = [tri.x.max(), tri.y.max()]\n\n self._contour_generator = C\n return kwargs\n\n def _contour_args(self, args, kwargs):\n tri, args, kwargs = Triangulation.get_from_args_and_kwargs(*args,\n **kwargs)\n z, *args = args\n z = np.ma.asarray(z)\n if z.shape != tri.x.shape:\n raise ValueError('z array must have same length as triangulation x'\n ' and y arrays')\n\n # z values must be finite, only need to check points that are included\n # in the triangulation.\n z_check = z[np.unique(tri.get_masked_triangles())]\n if np.ma.is_masked(z_check):\n raise ValueError('z must not contain masked points 
within the '\n 'triangulation')\n if not np.isfinite(z_check).all():\n raise ValueError('z array must not contain non-finite values '\n 'within the triangulation')\n\n z = np.ma.masked_invalid(z, copy=False)\n self.zmax = float(z_check.max())\n self.zmin = float(z_check.min())\n if self.logscale and self.zmin <= 0:\n func = 'contourf' if self.filled else 'contour'\n raise ValueError(f'Cannot {func} log of negative values.')\n self._process_contour_level_args(args, z.dtype)\n return (tri, z)\n\n\n_docstring.interpd.register(_tricontour_doc="""\nDraw contour %%(type)s on an unstructured triangular grid.\n\nCall signatures::\n\n %%(func)s(triangulation, z, [levels], ...)\n %%(func)s(x, y, z, [levels], *, [triangles=triangles], [mask=mask], ...)\n\nThe triangular grid can be specified either by passing a `.Triangulation`\nobject as the first parameter, or by passing the points *x*, *y* and\noptionally the *triangles* and a *mask*. See `.Triangulation` for an\nexplanation of these parameters. If neither of *triangulation* or\n*triangles* are given, the triangulation is calculated on the fly.\n\nIt is possible to pass *triangles* positionally, i.e.\n``%%(func)s(x, y, triangles, z, ...)``. However, this is discouraged. For more\nclarity, pass *triangles* via keyword argument.\n\nParameters\n----------\ntriangulation : `.Triangulation`, optional\n An already created triangular grid.\n\nx, y, triangles, mask\n Parameters defining the triangular grid. See `.Triangulation`.\n This is mutually exclusive with specifying *triangulation*.\n\nz : array-like\n The height values over which the contour is drawn. Color-mapping is\n controlled by *cmap*, *norm*, *vmin*, and *vmax*.\n\n .. note::\n All values in *z* must be finite. 
Hence, nan and inf values must\n either be removed or `~.Triangulation.set_mask` be used.\n\nlevels : int or array-like, optional\n Determines the number and positions of the contour lines / regions.\n\n If an int *n*, use `~matplotlib.ticker.MaxNLocator`, which tries to\n automatically choose no more than *n+1* "nice" contour levels between\n between minimum and maximum numeric values of *Z*.\n\n If array-like, draw contour lines at the specified levels. The values must\n be in increasing order.\n\nReturns\n-------\n`~matplotlib.tri.TriContourSet`\n\nOther Parameters\n----------------\ncolors : :mpltype:`color` or list of :mpltype:`color`, optional\n The colors of the levels, i.e., the contour %%(type)s.\n\n The sequence is cycled for the levels in ascending order. If the sequence\n is shorter than the number of levels, it is repeated.\n\n As a shortcut, single color strings may be used in place of one-element\n lists, i.e. ``'red'`` instead of ``['red']`` to color all levels with the\n same color. This shortcut does only work for color strings, not for other\n ways of specifying colors.\n\n By default (value *None*), the colormap specified by *cmap* will be used.\n\nalpha : float, default: 1\n The alpha blending value, between 0 (transparent) and 1 (opaque).\n\n%(cmap_doc)s\n\n This parameter is ignored if *colors* is set.\n\n%(norm_doc)s\n\n This parameter is ignored if *colors* is set.\n\n%(vmin_vmax_doc)s\n\n If *vmin* or *vmax* are not given, the default color scaling is based on\n *levels*.\n\n This parameter is ignored if *colors* is set.\n\norigin : {*None*, 'upper', 'lower', 'image'}, default: None\n Determines the orientation and exact position of *z* by specifying the\n position of ``z[0, 0]``. 
This is only relevant, if *X*, *Y* are not given.\n\n - *None*: ``z[0, 0]`` is at X=0, Y=0 in the lower left corner.\n - 'lower': ``z[0, 0]`` is at X=0.5, Y=0.5 in the lower left corner.\n - 'upper': ``z[0, 0]`` is at X=N+0.5, Y=0.5 in the upper left corner.\n - 'image': Use the value from :rc:`image.origin`.\n\nextent : (x0, x1, y0, y1), optional\n If *origin* is not *None*, then *extent* is interpreted as in `.imshow`: it\n gives the outer pixel boundaries. In this case, the position of z[0, 0] is\n the center of the pixel, not a corner. If *origin* is *None*, then\n (*x0*, *y0*) is the position of z[0, 0], and (*x1*, *y1*) is the position\n of z[-1, -1].\n\n This argument is ignored if *X* and *Y* are specified in the call to\n contour.\n\nlocator : ticker.Locator subclass, optional\n The locator is used to determine the contour levels if they are not given\n explicitly via *levels*.\n Defaults to `~.ticker.MaxNLocator`.\n\nextend : {'neither', 'both', 'min', 'max'}, default: 'neither'\n Determines the ``%%(func)s``-coloring of values that are outside the\n *levels* range.\n\n If 'neither', values outside the *levels* range are not colored. If 'min',\n 'max' or 'both', color the values below, above or below and above the\n *levels* range.\n\n Values below ``min(levels)`` and above ``max(levels)`` are mapped to the\n under/over values of the `.Colormap`. Note that most colormaps do not have\n dedicated colors for these by default, so that the over and under values\n are the edge values of the colormap. You may want to set these values\n explicitly using `.Colormap.set_under` and `.Colormap.set_over`.\n\n .. note::\n\n An existing `.TriContourSet` does not get notified if properties of its\n colormap are changed. Therefore, an explicit call to\n `.ContourSet.changed()` is needed after modifying the colormap. 
The\n explicit call can be left out, if a colorbar is assigned to the\n `.TriContourSet` because it internally calls `.ContourSet.changed()`.\n\nxunits, yunits : registered units, optional\n Override axis units by specifying an instance of a\n :class:`matplotlib.units.ConversionInterface`.\n\nantialiased : bool, optional\n Enable antialiasing, overriding the defaults. For\n filled contours, the default is *True*. For line contours,\n it is taken from :rc:`lines.antialiased`.""" % _docstring.interpd.params)\n\n\n@_docstring.Substitution(func='tricontour', type='lines')\n@_docstring.interpd\ndef tricontour(ax, *args, **kwargs):\n """\n %(_tricontour_doc)s\n\n linewidths : float or array-like, default: :rc:`contour.linewidth`\n The line width of the contour lines.\n\n If a number, all levels will be plotted with this linewidth.\n\n If a sequence, the levels in ascending order will be plotted with\n the linewidths in the order specified.\n\n If None, this falls back to :rc:`lines.linewidth`.\n\n linestyles : {*None*, 'solid', 'dashed', 'dashdot', 'dotted'}, optional\n If *linestyles* is *None*, the default is 'solid' unless the lines are\n monochrome. In that case, negative contours will take their linestyle\n from :rc:`contour.negative_linestyle` setting.\n\n *linestyles* can also be an iterable of the above strings specifying a\n set of linestyles to be used. 
If this iterable is shorter than the\n number of contour levels it will be repeated as necessary.\n """\n kwargs['filled'] = False\n return TriContourSet(ax, *args, **kwargs)\n\n\n@_docstring.Substitution(func='tricontourf', type='regions')\n@_docstring.interpd\ndef tricontourf(ax, *args, **kwargs):\n """\n %(_tricontour_doc)s\n\n hatches : list[str], optional\n A list of crosshatch patterns to use on the filled areas.\n If None, no hatching will be added to the contour.\n\n Notes\n -----\n `.tricontourf` fills intervals that are closed at the top; that is, for\n boundaries *z1* and *z2*, the filled region is::\n\n z1 < Z <= z2\n\n except for the lowest interval, which is closed on both sides (i.e. it\n includes the lowest value).\n """\n kwargs['filled'] = True\n return TriContourSet(ax, *args, **kwargs)\n | .venv\Lib\site-packages\matplotlib\tri\_tricontour.py | _tricontour.py | Python | 10,220 | 0.95 | 0.114815 | 0.038647 | vue-tools | 139 | 2023-08-22T17:30:29.626302 | MIT | false | de3527f59617c614db657394b33f3477 |
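The `tricontour`/`tricontourf` functions in `_tricontour.py` above are normally reached through the `Axes` methods; when only `x`, `y`, `z` are passed, the triangulation is computed on the fly, as the docstring states. A minimal headless sketch (the random sample data is illustrative):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend; nothing is displayed
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 200)
y = rng.uniform(-1, 1, 200)
z = np.hypot(x, y)  # one finite height value per point, as required

fig, ax = plt.subplots()
# No Triangulation or triangles given: Delaunay is calculated on the fly.
cs = ax.tricontourf(x, y, z, levels=8)
fig.colorbar(cs, ax=ax)

# Filled intervals are closed at the top (z1 < Z <= z2), except the lowest.
print(cs.zmin, cs.zmax)  # data range used for the level locator
```

Passing `levels=8` hands the choice of "nice" boundaries to `MaxNLocator`, so the resulting number of levels may differ slightly from 8.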
from matplotlib.axes import Axes\nfrom matplotlib.contour import ContourSet\nfrom matplotlib.tri._triangulation import Triangulation\n\nfrom numpy.typing import ArrayLike\nfrom typing import overload\n\n# TODO: more explicit args/kwargs (for all things in this module)?\n\nclass TriContourSet(ContourSet):\n def __init__(self, ax: Axes, *args, **kwargs) -> None: ...\n\n@overload\ndef tricontour(\n ax: Axes,\n triangulation: Triangulation,\n z: ArrayLike,\n levels: int | ArrayLike = ...,\n **kwargs\n) -> TriContourSet: ...\n@overload\ndef tricontour(\n ax: Axes,\n x: ArrayLike,\n y: ArrayLike,\n z: ArrayLike,\n levels: int | ArrayLike = ...,\n *,\n triangles: ArrayLike = ...,\n mask: ArrayLike = ...,\n **kwargs\n) -> TriContourSet: ...\n@overload\ndef tricontourf(\n ax: Axes,\n triangulation: Triangulation,\n z: ArrayLike,\n levels: int | ArrayLike = ...,\n **kwargs\n) -> TriContourSet: ...\n@overload\ndef tricontourf(\n ax: Axes,\n x: ArrayLike,\n y: ArrayLike,\n z: ArrayLike,\n levels: int | ArrayLike = ...,\n *,\n triangles: ArrayLike = ...,\n mask: ArrayLike = ...,\n **kwargs\n) -> TriContourSet: ...\n | .venv\Lib\site-packages\matplotlib\tri\_tricontour.pyi | _tricontour.pyi | Other | 1,155 | 0.95 | 0.134615 | 0.145833 | react-lib | 817 | 2024-09-07T01:18:14.830251 | BSD-3-Clause | false | 177a840511a7eedbc9e3aeff9eeb1c58 |
import numpy as np\n\nfrom matplotlib import _api\nfrom matplotlib.tri import Triangulation\n\n\nclass TriFinder:\n """\n Abstract base class for classes used to find the triangles of a\n Triangulation in which (x, y) points lie.\n\n Rather than instantiate an object of a class derived from TriFinder, it is\n usually better to use the function `.Triangulation.get_trifinder`.\n\n Derived classes implement __call__(x, y) where x and y are array-like point\n coordinates of the same shape.\n """\n\n def __init__(self, triangulation):\n _api.check_isinstance(Triangulation, triangulation=triangulation)\n self._triangulation = triangulation\n\n def __call__(self, x, y):\n raise NotImplementedError\n\n\nclass TrapezoidMapTriFinder(TriFinder):\n """\n `~matplotlib.tri.TriFinder` class implemented using the trapezoid\n map algorithm from the book "Computational Geometry, Algorithms and\n Applications", second edition, by M. de Berg, M. van Kreveld, M. Overmars\n and O. Schwarzkopf.\n\n The triangulation must be valid, i.e. it must not have duplicate points,\n triangles formed from colinear points, or overlapping triangles. 
The\n algorithm has some tolerance to triangles formed from colinear points, but\n this should not be relied upon.\n """\n\n def __init__(self, triangulation):\n from matplotlib import _tri\n super().__init__(triangulation)\n self._cpp_trifinder = _tri.TrapezoidMapTriFinder(\n triangulation.get_cpp_triangulation())\n self._initialize()\n\n def __call__(self, x, y):\n """\n Return an array containing the indices of the triangles in which the\n specified *x*, *y* points lie, or -1 for points that do not lie within\n a triangle.\n\n *x*, *y* are array-like x and y coordinates of the same shape and any\n number of dimensions.\n\n Returns an integer array with the same shape as *x* and *y*.\n """\n x = np.asarray(x, dtype=np.float64)\n y = np.asarray(y, dtype=np.float64)\n if x.shape != y.shape:\n raise ValueError("x and y must be array-like with the same shape")\n\n # C++ does the heavy lifting, and expects 1D arrays.\n indices = (self._cpp_trifinder.find_many(x.ravel(), y.ravel())\n .reshape(x.shape))\n return indices\n\n def _get_tree_stats(self):\n """\n Return a Python list containing the statistics about the node tree:\n 0: number of nodes (tree size)\n 1: number of unique nodes\n 2: number of trapezoids (tree leaf nodes)\n 3: number of unique trapezoids\n 4: maximum parent count (max number of times a node is repeated in\n tree)\n 5: maximum depth of tree (one more than the maximum number of\n comparisons needed to search through the tree)\n 6: mean of all trapezoid depths (one more than the average number\n of comparisons needed to search through the tree)\n """\n return self._cpp_trifinder.get_tree_stats()\n\n def _initialize(self):\n """\n Initialize the underlying C++ object. 
Can be called multiple times if,\n for example, the triangulation is modified.\n """\n self._cpp_trifinder.initialize()\n\n def _print_tree(self):\n """\n Print a text representation of the node tree, which is useful for\n debugging purposes.\n """\n self._cpp_trifinder.print_tree()\n | .venv\Lib\site-packages\matplotlib\tri\_trifinder.py | _trifinder.py | Python | 3,522 | 0.95 | 0.197917 | 0.025641 | react-lib | 418 | 2024-09-05T17:11:56.734797 | GPL-3.0 | false | cf4b0f1ba33a13640f480be21f809c70 |
from matplotlib.tri import Triangulation\nfrom numpy.typing import ArrayLike\n\nclass TriFinder:\n def __init__(self, triangulation: Triangulation) -> None: ...\n def __call__(self, x: ArrayLike, y: ArrayLike) -> ArrayLike: ...\n\nclass TrapezoidMapTriFinder(TriFinder):\n def __init__(self, triangulation: Triangulation) -> None: ...\n def __call__(self, x: ArrayLike, y: ArrayLike) -> ArrayLike: ...\n | .venv\Lib\site-packages\matplotlib\tri\_trifinder.pyi | _trifinder.pyi | Other | 405 | 0.85 | 0.6 | 0 | awesome-app | 40 | 2024-09-08T17:46:03.692956 | GPL-3.0 | false | 60f7fb1e7e5745a01e49e04a32a7b64d |
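A TriFinder is what feeds the interpolators defined in `_triinterpolate.py` below. A minimal sketch (illustration values only): `LinearTriInterpolator` reproduces a plane exactly and masks points outside the triangulation; `CubicTriInterpolator` is a drop-in replacement when a C1-smooth interpolant is wanted:

```python
import numpy as np
from matplotlib.tri import Triangulation, LinearTriInterpolator

# Two triangles over the unit square; z sampled from an exact plane.
x = np.array([0.0, 1.0, 0.0, 1.0])
y = np.array([0.0, 0.0, 1.0, 1.0])
z = 2.0 * x + 3.0 * y + 1.0

tri = Triangulation(x, y, triangles=[[0, 1, 3], [0, 3, 2]])
interp = LinearTriInterpolator(tri, z)  # uses tri.get_trifinder() internally

# One point inside, one outside (the outside value comes back masked).
vals = interp(np.array([0.3, 2.0]), np.array([0.2, 2.0]))
# Gradient of the plane is constant: dz/dx = 2, dz/dy = 3.
dzdx, dzdy = interp.gradient(np.array([0.3]), np.array([0.2]))
```

Because each triangle is represented by a plane, the linear interpolant recovers `2*0.3 + 3*0.2 + 1 = 2.2` at (0.3, 0.2) to floating-point precision.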
"""\nInterpolation inside triangular grids.\n"""\n\nimport numpy as np\n\nfrom matplotlib import _api\nfrom matplotlib.tri import Triangulation\nfrom matplotlib.tri._trifinder import TriFinder\nfrom matplotlib.tri._tritools import TriAnalyzer\n\n__all__ = ('TriInterpolator', 'LinearTriInterpolator', 'CubicTriInterpolator')\n\n\nclass TriInterpolator:\n """\n Abstract base class for classes used to interpolate on a triangular grid.\n\n Derived classes implement the following methods:\n\n - ``__call__(x, y)``,\n where x, y are array-like point coordinates of the same shape, and\n that returns a masked array of the same shape containing the\n interpolated z-values.\n\n - ``gradient(x, y)``,\n where x, y are array-like point coordinates of the same\n shape, and that returns a list of 2 masked arrays of the same shape\n containing the 2 derivatives of the interpolator (derivatives of\n interpolated z values with respect to x and y).\n """\n\n def __init__(self, triangulation, z, trifinder=None):\n _api.check_isinstance(Triangulation, triangulation=triangulation)\n self._triangulation = triangulation\n\n self._z = np.asarray(z)\n if self._z.shape != self._triangulation.x.shape:\n raise ValueError("z array must have same length as triangulation x"\n " and y arrays")\n\n _api.check_isinstance((TriFinder, None), trifinder=trifinder)\n self._trifinder = trifinder or self._triangulation.get_trifinder()\n\n # Default scaling factors : 1.0 (= no scaling)\n # Scaling may be used for interpolations for which the order of\n # magnitude of x, y has an impact on the interpolant definition.\n # Please refer to :meth:`_interpolate_multikeys` for details.\n self._unit_x = 1.0\n self._unit_y = 1.0\n\n # Default triangle renumbering: None (= no renumbering)\n # Renumbering may be used to avoid unnecessary computations\n # if complex calculations are done inside the Interpolator.\n # Please refer to :meth:`_interpolate_multikeys` for details.\n self._tri_renum = None\n\n # __call__ and 
gradient docstrings are shared by all subclasses\n # (except, if needed, relevant additions).\n # However, these methods are only implemented in subclasses to avoid\n # confusion in the documentation.\n _docstring__call__ = """\n Returns a masked array containing interpolated values at the specified\n (x, y) points.\n\n Parameters\n ----------\n x, y : array-like\n x and y coordinates of the same shape and any number of\n dimensions.\n\n Returns\n -------\n np.ma.array\n Masked array of the same shape as *x* and *y*; values corresponding\n to (*x*, *y*) points outside of the triangulation are masked out.\n\n """\n\n _docstringgradient = r"""\n Returns a list of 2 masked arrays containing interpolated derivatives\n at the specified (x, y) points.\n\n Parameters\n ----------\n x, y : array-like\n x and y coordinates of the same shape and any number of\n dimensions.\n\n Returns\n -------\n dzdx, dzdy : np.ma.array\n 2 masked arrays of the same shape as *x* and *y*; values\n corresponding to (x, y) points outside of the triangulation\n are masked out.\n The first returned array contains the values of\n :math:`\frac{\partial z}{\partial x}` and the second those of\n :math:`\frac{\partial z}{\partial y}`.\n\n """\n\n def _interpolate_multikeys(self, x, y, tri_index=None,\n return_keys=('z',)):\n """\n Versatile (private) method defined for all TriInterpolators.\n\n :meth:`_interpolate_multikeys` is a wrapper around method\n :meth:`_interpolate_single_key` (to be defined in the child\n subclasses).\n :meth:`_interpolate_single_key` actually performs the interpolation,\n but only for 1-dimensional inputs and at valid locations (inside\n unmasked triangles of the triangulation).\n\n The purpose of :meth:`_interpolate_multikeys` is to implement the\n following common tasks needed in all subclass implementations:\n\n - calculation of containing triangles\n - dealing with more than one interpolation request at the same\n location (e.g., if the 2 derivatives are requested, it 
is\n unnecessary to compute the containing triangles twice)\n - scaling according to self._unit_x, self._unit_y\n - dealing with points outside of the grid (with fill value np.nan)\n - dealing with multi-dimensional *x*, *y* arrays: flattening for\n the :meth:`_interpolate_single_key` call and final reshaping.\n\n (Note that np.vectorize could do most of those things very well for\n you, but it does it by function evaluations over successive tuples of\n the input arrays. Therefore, this tends to be more time-consuming than\n using optimized numpy functions - e.g., np.dot - which can be used\n easily on the flattened inputs, in the child-subclass methods\n :meth:`_interpolate_single_key`.)\n\n It is guaranteed that the calls to :meth:`_interpolate_single_key`\n will be done with flattened (1-d) array-like input parameters *x*, *y*\n and with flattened, valid `tri_index` arrays (no -1 index allowed).\n\n Parameters\n ----------\n x, y : array-like\n x and y coordinates where interpolated values are requested.\n tri_index : array-like of int, optional\n Array of the containing triangle indices, same shape as\n *x* and *y*. Defaults to None. 
If None, these indices\n will be computed by a TriFinder instance.\n (Note: For points outside the grid, tri_index[ipt] shall be -1).\n return_keys : tuple of keys from {'z', 'dzdx', 'dzdy'}\n Defines the interpolation arrays to return, and in which order.\n\n Returns\n -------\n list of arrays\n Each array-like contains the expected interpolated values in the\n order defined by the *return_keys* parameter.\n """\n # Flattening and rescaling input arrays x, y\n # (initial shape is stored for output)\n x = np.asarray(x, dtype=np.float64)\n y = np.asarray(y, dtype=np.float64)\n sh_ret = x.shape\n if x.shape != y.shape:\n raise ValueError("x and y shall have same shapes."\n f" Given: {x.shape} and {y.shape}")\n x = np.ravel(x)\n y = np.ravel(y)\n x_scaled = x/self._unit_x\n y_scaled = y/self._unit_y\n size_ret = np.size(x_scaled)\n\n # Computes & ravels the element indexes and extracts the valid ones.\n if tri_index is None:\n tri_index = self._trifinder(x, y)\n else:\n if tri_index.shape != sh_ret:\n raise ValueError(\n "tri_index array is provided and shall"\n " have same shape as x and y. 
Given: "\n f"{tri_index.shape} and {sh_ret}")\n tri_index = np.ravel(tri_index)\n\n mask_in = (tri_index != -1)\n if self._tri_renum is None:\n valid_tri_index = tri_index[mask_in]\n else:\n valid_tri_index = self._tri_renum[tri_index[mask_in]]\n valid_x = x_scaled[mask_in]\n valid_y = y_scaled[mask_in]\n\n ret = []\n for return_key in return_keys:\n # Find the return index associated with the key.\n try:\n return_index = {'z': 0, 'dzdx': 1, 'dzdy': 2}[return_key]\n except KeyError as err:\n raise ValueError("return_keys items shall take values in"\n " {'z', 'dzdx', 'dzdy'}") from err\n\n # Sets the scale factor for f & df components\n scale = [1., 1./self._unit_x, 1./self._unit_y][return_index]\n\n # Computes the interpolation\n ret_loc = np.empty(size_ret, dtype=np.float64)\n ret_loc[~mask_in] = np.nan\n ret_loc[mask_in] = self._interpolate_single_key(\n return_key, valid_tri_index, valid_x, valid_y) * scale\n ret += [np.ma.masked_invalid(ret_loc.reshape(sh_ret), copy=False)]\n\n return ret\n\n def _interpolate_single_key(self, return_key, tri_index, x, y):\n """\n Interpolate at points belonging to the triangulation\n (inside an unmasked triangle).\n\n Parameters\n ----------\n return_key : {'z', 'dzdx', 'dzdy'}\n The requested values (z or its derivatives).\n tri_index : 1D int array\n Valid triangle index (cannot be -1).\n x, y : 1D arrays, same shape as `tri_index`\n Valid locations where interpolation is requested.\n\n Returns\n -------\n 1-d array\n Returned array of the same size as *tri_index*\n """\n raise NotImplementedError("TriInterpolator subclasses should"\n " implement _interpolate_single_key!")\n\n\nclass LinearTriInterpolator(TriInterpolator):\n """\n Linear interpolator on a triangular grid.\n\n Each triangle is represented by a plane so that an interpolated value at\n point (x, y) lies on the plane of the triangle containing (x, y).\n Interpolated values are therefore continuous across the triangulation, but\n their first derivatives are 
discontinuous at edges between triangles.\n\n Parameters\n ----------\n triangulation : `~matplotlib.tri.Triangulation`\n The triangulation to interpolate over.\n z : (npoints,) array-like\n Array of values, defined at grid points, to interpolate between.\n trifinder : `~matplotlib.tri.TriFinder`, optional\n If this is not specified, the Triangulation's default TriFinder will\n be used by calling `.Triangulation.get_trifinder`.\n\n Methods\n -------\n `__call__` (x, y) : Returns interpolated values at (x, y) points.\n `gradient` (x, y) : Returns interpolated derivatives at (x, y) points.\n\n """\n def __init__(self, triangulation, z, trifinder=None):\n super().__init__(triangulation, z, trifinder)\n\n # Store plane coefficients for fast interpolation calculations.\n self._plane_coefficients = \\n self._triangulation.calculate_plane_coefficients(self._z)\n\n def __call__(self, x, y):\n return self._interpolate_multikeys(x, y, tri_index=None,\n return_keys=('z',))[0]\n __call__.__doc__ = TriInterpolator._docstring__call__\n\n def gradient(self, x, y):\n return self._interpolate_multikeys(x, y, tri_index=None,\n return_keys=('dzdx', 'dzdy'))\n gradient.__doc__ = TriInterpolator._docstringgradient\n\n def _interpolate_single_key(self, return_key, tri_index, x, y):\n _api.check_in_list(['z', 'dzdx', 'dzdy'], return_key=return_key)\n if return_key == 'z':\n return (self._plane_coefficients[tri_index, 0]*x +\n self._plane_coefficients[tri_index, 1]*y +\n self._plane_coefficients[tri_index, 2])\n elif return_key == 'dzdx':\n return self._plane_coefficients[tri_index, 0]\n else: # 'dzdy'\n return self._plane_coefficients[tri_index, 1]\n\n\nclass CubicTriInterpolator(TriInterpolator):\n r"""\n Cubic interpolator on a triangular grid.\n\n In one-dimension - on a segment - a cubic interpolating function is\n defined by the values of the function and its derivative at both ends.\n This is almost the same in 2D inside a triangle, except that the values\n of the function and its 
2 derivatives have to be defined at each triangle\n node.\n\n The CubicTriInterpolator takes the value of the function at each node -\n provided by the user - and internally computes the value of the\n derivatives, resulting in a smooth interpolation.\n (As a special feature, the user can also impose the value of the\n derivatives at each node, but this is not supposed to be the common\n usage.)\n\n Parameters\n ----------\n triangulation : `~matplotlib.tri.Triangulation`\n The triangulation to interpolate over.\n z : (npoints,) array-like\n Array of values, defined at grid points, to interpolate between.\n kind : {'min_E', 'geom', 'user'}, optional\n Choice of the smoothing algorithm, in order to compute\n the interpolant derivatives (defaults to 'min_E'):\n\n - if 'min_E': (default) The derivatives at each node are computed\n to minimize a bending energy.\n - if 'geom': The derivatives at each node are computed as a\n weighted average of relevant triangle normals. To be used for\n speed optimization (large grids).\n - if 'user': The user provides the argument *dz*; no computation\n is then needed.\n\n trifinder : `~matplotlib.tri.TriFinder`, optional\n If not specified, the Triangulation's default TriFinder will\n be used by calling `.Triangulation.get_trifinder`.\n dz : tuple of array-likes (dzdx, dzdy), optional\n Used only if *kind* ='user'. 
In this case *dz* must be provided as\n (dzdx, dzdy) where dzdx, dzdy are arrays of the same shape as *z* and\n are the interpolant first derivatives at the *triangulation* points.\n\n Methods\n -------\n `__call__` (x, y) : Returns interpolated values at (x, y) points.\n `gradient` (x, y) : Returns interpolated derivatives at (x, y) points.\n\n Notes\n -----\n This note is a bit technical and details how the cubic interpolation is\n computed.\n\n The interpolation is based on a Clough-Tocher subdivision scheme of\n the *triangulation* mesh (to make it clearer, each triangle of the\n grid will be divided in 3 child-triangles, and on each child triangle\n the interpolated function is a cubic polynomial of the 2 coordinates).\n This technique originates from FEM (Finite Element Method) analysis;\n the element used is a reduced Hsieh-Clough-Tocher (HCT)\n element. Its shape functions are described in [1]_.\n The assembled function is guaranteed to be C1-smooth, i.e. it is\n continuous and its first derivatives are also continuous (this\n is easy to show inside the triangles but is also true when crossing the\n edges).\n\n In the default case (*kind* ='min_E'), the interpolant minimizes a\n curvature energy on the functional space generated by the HCT element\n shape functions - with imposed values but arbitrary derivatives at each\n node. The minimized functional is the integral of the so-called total\n curvature (implementation based on an algorithm from [2]_ - PCG sparse\n solver):\n\n .. math::\n\n E(z) = \frac{1}{2} \int_{\Omega} \left(\n \left( \frac{\partial^2{z}}{\partial{x}^2} \right)^2 +\n \left( \frac{\partial^2{z}}{\partial{y}^2} \right)^2 +\n 2\left( \frac{\partial^2{z}}{\partial{y}\partial{x}} \right)^2\n \right) dx\,dy\n\n If the case *kind* ='geom' is chosen by the user, a simple geometric\n approximation is used (weighted average of the triangle normal\n vectors), which could improve speed on very large grids.\n\n References\n ----------\n .. 
[1] Michel Bernadou, Kamal Hassan, "Basis functions for general\n Hsieh-Clough-Tocher triangles, complete or reduced.",\n International Journal for Numerical Methods in Engineering,\n 17(5):784 - 789. 2.01.\n .. [2] C.T. Kelley, "Iterative Methods for Optimization".\n\n """\n def __init__(self, triangulation, z, kind='min_E', trifinder=None,\n dz=None):\n super().__init__(triangulation, z, trifinder)\n\n # Loads the underlying c++ _triangulation.\n # (During loading, reordering of triangulation._triangles may occur so\n # that all final triangles are now anti-clockwise)\n self._triangulation.get_cpp_triangulation()\n\n # To build the stiffness matrix and avoid zero-energy spurious modes\n # we will only store internally the valid (unmasked) triangles and\n # the necessary (used) points coordinates.\n # 2 renumbering tables need to be computed and stored:\n # - a triangle renum table in order to translate the result from a\n # TriFinder instance into the internal stored triangle number.\n # - a node renum table to overwrite the self._z values into the new\n # (used) node numbering.\n tri_analyzer = TriAnalyzer(self._triangulation)\n (compressed_triangles, compressed_x, compressed_y, tri_renum,\n node_renum) = tri_analyzer._get_compressed_triangulation()\n self._triangles = compressed_triangles\n self._tri_renum = tri_renum\n # Taking into account the node renumbering in self._z:\n valid_node = (node_renum != -1)\n self._z[node_renum[valid_node]] = self._z[valid_node]\n\n # Computing scale factors\n self._unit_x = np.ptp(compressed_x)\n self._unit_y = np.ptp(compressed_y)\n self._pts = np.column_stack([compressed_x / self._unit_x,\n compressed_y / self._unit_y])\n # Computing triangle points\n self._tris_pts = self._pts[self._triangles]\n # Computing eccentricities\n self._eccs = self._compute_tri_eccentricities(self._tris_pts)\n # Computing dof estimations for HCT triangle shape function\n _api.check_in_list(['user', 'geom', 'min_E'], kind=kind)\n self._dof = 
self._compute_dof(kind, dz=dz)\n # Loading HCT element\n self._ReferenceElement = _ReducedHCT_Element()\n\n def __call__(self, x, y):\n return self._interpolate_multikeys(x, y, tri_index=None,\n return_keys=('z',))[0]\n __call__.__doc__ = TriInterpolator._docstring__call__\n\n def gradient(self, x, y):\n return self._interpolate_multikeys(x, y, tri_index=None,\n return_keys=('dzdx', 'dzdy'))\n gradient.__doc__ = TriInterpolator._docstringgradient\n\n def _interpolate_single_key(self, return_key, tri_index, x, y):\n _api.check_in_list(['z', 'dzdx', 'dzdy'], return_key=return_key)\n tris_pts = self._tris_pts[tri_index]\n alpha = self._get_alpha_vec(x, y, tris_pts)\n ecc = self._eccs[tri_index]\n dof = np.expand_dims(self._dof[tri_index], axis=1)\n if return_key == 'z':\n return self._ReferenceElement.get_function_values(\n alpha, ecc, dof)\n else: # 'dzdx', 'dzdy'\n J = self._get_jacobian(tris_pts)\n dzdx = self._ReferenceElement.get_function_derivatives(\n alpha, J, ecc, dof)\n if return_key == 'dzdx':\n return dzdx[:, 0, 0]\n else:\n return dzdx[:, 1, 0]\n\n def _compute_dof(self, kind, dz=None):\n """\n Compute and return nodal dofs according to kind.\n\n Parameters\n ----------\n kind : {'min_E', 'geom', 'user'}\n Choice of the _DOF_estimator subclass to estimate the gradient.\n dz : tuple of array-likes (dzdx, dzdy), optional\n Used only if *kind*=user; in this case passed to the\n :class:`_DOF_estimator_user`.\n\n Returns\n -------\n array-like, shape (npts, 2)\n Estimation of the gradient at triangulation nodes (stored as\n degree of freedoms of reduced-HCT triangle elements).\n """\n if kind == 'user':\n if dz is None:\n raise ValueError("For a CubicTriInterpolator with "\n "*kind*='user', a valid *dz* "\n "argument is expected.")\n TE = _DOF_estimator_user(self, dz=dz)\n elif kind == 'geom':\n TE = _DOF_estimator_geom(self)\n else: # 'min_E', checked in __init__\n TE = _DOF_estimator_min_E(self)\n return TE.compute_dof_from_df()\n\n @staticmethod\n def 
_get_alpha_vec(x, y, tris_pts):\n """\n Fast (vectorized) function to compute barycentric coordinates alpha.\n\n Parameters\n ----------\n x, y : array-like of dim 1 (shape (nx,))\n Coordinates of the points whose barycentric coordinates are\n requested.\n tris_pts : array like of dim 3 (shape: (nx, 3, 2))\n Coordinates of the containing triangles apexes.\n\n Returns\n -------\n array of dim 2 (shape (nx, 3))\n Barycentric coordinates of the points inside the containing\n triangles.\n """\n ndim = tris_pts.ndim-2\n\n a = tris_pts[:, 1, :] - tris_pts[:, 0, :]\n b = tris_pts[:, 2, :] - tris_pts[:, 0, :]\n abT = np.stack([a, b], axis=-1)\n ab = _transpose_vectorized(abT)\n OM = np.stack([x, y], axis=1) - tris_pts[:, 0, :]\n\n metric = ab @ abT\n # Here we try to deal with the colinear cases.\n # metric_inv is in this case set to the Moore-Penrose pseudo-inverse\n # meaning that we will still return a set of valid barycentric\n # coordinates.\n metric_inv = _pseudo_inv22sym_vectorized(metric)\n Covar = ab @ _transpose_vectorized(np.expand_dims(OM, ndim))\n ksi = metric_inv @ Covar\n alpha = _to_matrix_vectorized([\n [1-ksi[:, 0, 0]-ksi[:, 1, 0]], [ksi[:, 0, 0]], [ksi[:, 1, 0]]])\n return alpha\n\n @staticmethod\n def _get_jacobian(tris_pts):\n """\n Fast (vectorized) function to compute triangle jacobian matrix.\n\n Parameters\n ----------\n tris_pts : array like of dim 3 (shape: (nx, 3, 2))\n Coordinates of the containing triangles apexes.\n\n Returns\n -------\n array of dim 3 (shape (nx, 2, 2))\n J[itri, :, :] is the jacobian matrix at apex 0 of the triangle\n itri, so that the following (matrix) relationship holds:\n [dz/dksi] = [J] x [dz/dx]\n with x: global coordinates\n ksi: element parametric coordinates in triangle first apex\n local basis.\n """\n a = np.array(tris_pts[:, 1, :] - tris_pts[:, 0, :])\n b = np.array(tris_pts[:, 2, :] - tris_pts[:, 0, :])\n J = 
_to_matrix_vectorized([[a[:, 0], a[:, 1]],\n [b[:, 0], b[:, 1]]])\n return J\n\n @staticmethod\n def _compute_tri_eccentricities(tris_pts):\n """\n Compute triangle eccentricities.\n\n Parameters\n ----------\n tris_pts : array like of dim 3 (shape: (nx, 3, 2))\n Coordinates of the triangles apexes.\n\n Returns\n -------\n array like of dim 2 (shape: (nx, 3))\n The so-called eccentricity parameters [1] needed for HCT triangular\n element.\n """\n a = np.expand_dims(tris_pts[:, 2, :] - tris_pts[:, 1, :], axis=2)\n b = np.expand_dims(tris_pts[:, 0, :] - tris_pts[:, 2, :], axis=2)\n c = np.expand_dims(tris_pts[:, 1, :] - tris_pts[:, 0, :], axis=2)\n # Do not use np.squeeze, this is dangerous if only one triangle\n # in the triangulation...\n dot_a = (_transpose_vectorized(a) @ a)[:, 0, 0]\n dot_b = (_transpose_vectorized(b) @ b)[:, 0, 0]\n dot_c = (_transpose_vectorized(c) @ c)[:, 0, 0]\n # Note that this line will raise a warning for dot_a, dot_b or dot_c\n # zeros, but we choose not to support triangles with duplicate points.\n return _to_matrix_vectorized([[(dot_c-dot_b) / dot_a],\n [(dot_a-dot_c) / dot_b],\n [(dot_b-dot_a) / dot_c]])\n\n\n# FEM element used for interpolation and for solving minimisation\n# problem (Reduced HCT element)\nclass _ReducedHCT_Element:\n """\n Implementation of reduced HCT triangular element with explicit shape\n functions.\n\n Computes z, dz, d2z and the element stiffness matrix for bending energy:\n E(f) = integral( (d2z/dx2 + d2z/dy2)**2 dA)\n\n *** Reference for the shape functions: ***\n [1] Basis functions for general Hsieh-Clough-Tocher _triangles, complete or\n reduced.\n Michel Bernadou, Kamal Hassan\n International Journal for Numerical Methods in Engineering.\n 17(5):784 - 789. 
2.01\n\n *** Element description: ***\n 9 dofs: z and dz given at 3 apex\n C1 (conform)\n\n """\n # 1) Loads matrices to generate shape functions as a function of\n # triangle eccentricities - based on [1] p.11 '''\n M = np.array([\n [ 0.00, 0.00, 0.00, 4.50, 4.50, 0.00, 0.00, 0.00, 0.00, 0.00],\n [-0.25, 0.00, 0.00, 0.50, 1.25, 0.00, 0.00, 0.00, 0.00, 0.00],\n [-0.25, 0.00, 0.00, 1.25, 0.50, 0.00, 0.00, 0.00, 0.00, 0.00],\n [ 0.50, 1.00, 0.00, -1.50, 0.00, 3.00, 3.00, 0.00, 0.00, 3.00],\n [ 0.00, 0.00, 0.00, -0.25, 0.25, 0.00, 1.00, 0.00, 0.00, 0.50],\n [ 0.25, 0.00, 0.00, -0.50, -0.25, 1.00, 0.00, 0.00, 0.00, 1.00],\n [ 0.50, 0.00, 1.00, 0.00, -1.50, 0.00, 0.00, 3.00, 3.00, 3.00],\n [ 0.25, 0.00, 0.00, -0.25, -0.50, 0.00, 0.00, 0.00, 1.00, 1.00],\n [ 0.00, 0.00, 0.00, 0.25, -0.25, 0.00, 0.00, 1.00, 0.00, 0.50]])\n M0 = np.array([\n [ 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00],\n [ 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00],\n [ 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00],\n [-1.00, 0.00, 0.00, 1.50, 1.50, 0.00, 0.00, 0.00, 0.00, -3.00],\n [-0.50, 0.00, 0.00, 0.75, 0.75, 0.00, 0.00, 0.00, 0.00, -1.50],\n [ 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00],\n [ 1.00, 0.00, 0.00, -1.50, -1.50, 0.00, 0.00, 0.00, 0.00, 3.00],\n [ 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00],\n [ 0.50, 0.00, 0.00, -0.75, -0.75, 0.00, 0.00, 0.00, 0.00, 1.50]])\n M1 = np.array([\n [-0.50, 0.00, 0.00, 1.50, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00],\n [ 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00],\n [-0.25, 0.00, 0.00, 0.75, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00],\n [ 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00],\n [ 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00],\n [ 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00],\n [ 0.50, 0.00, 0.00, -1.50, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00],\n [ 0.25, 0.00, 0.00, -0.75, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00],\n [ 0.00, 0.00, 0.00, 
0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00]])\n M2 = np.array([\n [ 0.50, 0.00, 0.00, 0.00, -1.50, 0.00, 0.00, 0.00, 0.00, 0.00],\n [ 0.25, 0.00, 0.00, 0.00, -0.75, 0.00, 0.00, 0.00, 0.00, 0.00],\n [ 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00],\n [-0.50, 0.00, 0.00, 0.00, 1.50, 0.00, 0.00, 0.00, 0.00, 0.00],\n [ 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00],\n [-0.25, 0.00, 0.00, 0.00, 0.75, 0.00, 0.00, 0.00, 0.00, 0.00],\n [ 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00],\n [ 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00],\n [ 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00]])\n\n # 2) Loads matrices to rotate components of gradient & Hessian\n # vectors in the reference basis of triangle first apex (a0)\n rotate_dV = np.array([[ 1., 0.], [ 0., 1.],\n [ 0., 1.], [-1., -1.],\n [-1., -1.], [ 1., 0.]])\n\n rotate_d2V = np.array([[1., 0., 0.], [0., 1., 0.], [ 0., 0., 1.],\n [0., 1., 0.], [1., 1., 1.], [ 0., -2., -1.],\n [1., 1., 1.], [1., 0., 0.], [-2., 0., -1.]])\n\n # 3) Loads Gauss points & weights on the 3 sub-_triangles for P2\n # exact integral - 3 points on each subtriangles.\n # NOTE: as the 2nd derivative is discontinuous , we really need those 9\n # points!\n n_gauss = 9\n gauss_pts = np.array([[13./18., 4./18., 1./18.],\n [ 4./18., 13./18., 1./18.],\n [ 7./18., 7./18., 4./18.],\n [ 1./18., 13./18., 4./18.],\n [ 1./18., 4./18., 13./18.],\n [ 4./18., 7./18., 7./18.],\n [ 4./18., 1./18., 13./18.],\n [13./18., 1./18., 4./18.],\n [ 7./18., 4./18., 7./18.]], dtype=np.float64)\n gauss_w = np.ones([9], dtype=np.float64) / 9.\n\n # 4) Stiffness matrix for curvature energy\n E = np.array([[1., 0., 0.], [0., 1., 0.], [0., 0., 2.]])\n\n # 5) Loads the matrix to compute DOF_rot from tri_J at apex 0\n J0_to_J1 = np.array([[-1., 1.], [-1., 0.]])\n J0_to_J2 = np.array([[ 0., -1.], [ 1., -1.]])\n\n def get_function_values(self, alpha, ecc, dofs):\n """\n Parameters\n ----------\n alpha : is a (N x 3 x 1) array 
(array of column-matrices) of\n barycentric coordinates,\n ecc : is a (N x 3 x 1) array (array of column-matrices) of triangle\n eccentricities,\n dofs : is a (N x 1 x 9) arrays (arrays of row-matrices) of computed\n degrees of freedom.\n\n Returns\n -------\n Returns the N-array of interpolated function values.\n """\n subtri = np.argmin(alpha, axis=1)[:, 0]\n ksi = _roll_vectorized(alpha, -subtri, axis=0)\n E = _roll_vectorized(ecc, -subtri, axis=0)\n x = ksi[:, 0, 0]\n y = ksi[:, 1, 0]\n z = ksi[:, 2, 0]\n x_sq = x*x\n y_sq = y*y\n z_sq = z*z\n V = _to_matrix_vectorized([\n [x_sq*x], [y_sq*y], [z_sq*z], [x_sq*z], [x_sq*y], [y_sq*x],\n [y_sq*z], [z_sq*y], [z_sq*x], [x*y*z]])\n prod = self.M @ V\n prod += _scalar_vectorized(E[:, 0, 0], self.M0 @ V)\n prod += _scalar_vectorized(E[:, 1, 0], self.M1 @ V)\n prod += _scalar_vectorized(E[:, 2, 0], self.M2 @ V)\n s = _roll_vectorized(prod, 3*subtri, axis=0)\n return (dofs @ s)[:, 0, 0]\n\n def get_function_derivatives(self, alpha, J, ecc, dofs):\n """\n Parameters\n ----------\n *alpha* is a (N x 3 x 1) array (array of column-matrices of\n barycentric coordinates)\n *J* is a (N x 2 x 2) array of jacobian matrices (jacobian matrix at\n triangle first apex)\n *ecc* is a (N x 3 x 1) array (array of column-matrices of triangle\n eccentricities)\n *dofs* is a (N x 1 x 9) arrays (arrays of row-matrices) of computed\n degrees of freedom.\n\n Returns\n -------\n Returns the values of interpolated function derivatives [dz/dx, dz/dy]\n in global coordinates at locations alpha, as a column-matrices of\n shape (N x 2 x 1).\n """\n subtri = np.argmin(alpha, axis=1)[:, 0]\n ksi = _roll_vectorized(alpha, -subtri, axis=0)\n E = _roll_vectorized(ecc, -subtri, axis=0)\n x = ksi[:, 0, 0]\n y = ksi[:, 1, 0]\n z = ksi[:, 2, 0]\n x_sq = x*x\n y_sq = y*y\n z_sq = z*z\n dV = _to_matrix_vectorized([\n [ -3.*x_sq, -3.*x_sq],\n [ 3.*y_sq, 0.],\n [ 0., 3.*z_sq],\n [ -2.*x*z, -2.*x*z+x_sq],\n [-2.*x*y+x_sq, -2.*x*y],\n [ 2.*x*y-y_sq, -y_sq],\n [ 
2.*y*z, y_sq],\n [ z_sq, 2.*y*z],\n [ -z_sq, 2.*x*z-z_sq],\n [ x*z-y*z, x*y-y*z]])\n # Puts back dV in first apex basis\n dV = dV @ _extract_submatrices(\n self.rotate_dV, subtri, block_size=2, axis=0)\n\n prod = self.M @ dV\n prod += _scalar_vectorized(E[:, 0, 0], self.M0 @ dV)\n prod += _scalar_vectorized(E[:, 1, 0], self.M1 @ dV)\n prod += _scalar_vectorized(E[:, 2, 0], self.M2 @ dV)\n dsdksi = _roll_vectorized(prod, 3*subtri, axis=0)\n dfdksi = dofs @ dsdksi\n # In global coordinates:\n # Here we try to deal with the simplest colinear cases, returning a\n # null matrix.\n J_inv = _safe_inv22_vectorized(J)\n dfdx = J_inv @ _transpose_vectorized(dfdksi)\n return dfdx\n\n def get_function_hessians(self, alpha, J, ecc, dofs):\n """\n Parameters\n ----------\n *alpha* is a (N x 3 x 1) array (array of column-matrices) of\n barycentric coordinates\n *J* is a (N x 2 x 2) array of jacobian matrices (jacobian matrix at\n triangle first apex)\n *ecc* is a (N x 3 x 1) array (array of column-matrices) of triangle\n eccentricities\n *dofs* is a (N x 1 x 9) arrays (arrays of row-matrices) of computed\n degrees of freedom.\n\n Returns\n -------\n Returns the values of interpolated function 2nd-derivatives\n [d2z/dx2, d2z/dy2, d2z/dxdy] in global coordinates at locations alpha,\n as a column-matrices of shape (N x 3 x 1).\n """\n d2sdksi2 = self.get_d2Sidksij2(alpha, ecc)\n d2fdksi2 = dofs @ d2sdksi2\n H_rot = self.get_Hrot_from_J(J)\n d2fdx2 = d2fdksi2 @ H_rot\n return _transpose_vectorized(d2fdx2)\n\n def get_d2Sidksij2(self, alpha, ecc):\n """\n Parameters\n ----------\n *alpha* is a (N x 3 x 1) array (array of column-matrices) of\n barycentric coordinates\n *ecc* is a (N x 3 x 1) array (array of column-matrices) of triangle\n eccentricities\n\n Returns\n -------\n Returns the arrays d2sdksi2 (N x 3 x 1) Hessian of shape functions\n expressed in covariant coordinates in first apex basis.\n """\n subtri = np.argmin(alpha, axis=1)[:, 0]\n ksi = _roll_vectorized(alpha, -subtri, 
axis=0)\n E = _roll_vectorized(ecc, -subtri, axis=0)\n x = ksi[:, 0, 0]\n y = ksi[:, 1, 0]\n z = ksi[:, 2, 0]\n d2V = _to_matrix_vectorized([\n [ 6.*x, 6.*x, 6.*x],\n [ 6.*y, 0., 0.],\n [ 0., 6.*z, 0.],\n [ 2.*z, 2.*z-4.*x, 2.*z-2.*x],\n [2.*y-4.*x, 2.*y, 2.*y-2.*x],\n [2.*x-4.*y, 0., -2.*y],\n [ 2.*z, 0., 2.*y],\n [ 0., 2.*y, 2.*z],\n [ 0., 2.*x-4.*z, -2.*z],\n [ -2.*z, -2.*y, x-y-z]])\n # Puts back d2V in first apex basis\n d2V = d2V @ _extract_submatrices(\n self.rotate_d2V, subtri, block_size=3, axis=0)\n prod = self.M @ d2V\n prod += _scalar_vectorized(E[:, 0, 0], self.M0 @ d2V)\n prod += _scalar_vectorized(E[:, 1, 0], self.M1 @ d2V)\n prod += _scalar_vectorized(E[:, 2, 0], self.M2 @ d2V)\n d2sdksi2 = _roll_vectorized(prod, 3*subtri, axis=0)\n return d2sdksi2\n\n def get_bending_matrices(self, J, ecc):\n """\n Parameters\n ----------\n *J* is a (N x 2 x 2) array of jacobian matrices (jacobian matrix at\n triangle first apex)\n *ecc* is a (N x 3 x 1) array (array of column-matrices) of triangle\n eccentricities\n\n Returns\n -------\n Returns the element K matrices for bending energy expressed in\n GLOBAL nodal coordinates.\n K_ij = integral [ (d2zi/dx2 + d2zi/dy2) * (d2zj/dx2 + d2zj/dy2) dA]\n tri_J is needed to rotate dofs from local basis to global basis\n """\n n = np.size(ecc, 0)\n\n # 1) matrix to rotate dofs in global coordinates\n J1 = self.J0_to_J1 @ J\n J2 = self.J0_to_J2 @ J\n DOF_rot = np.zeros([n, 9, 9], dtype=np.float64)\n DOF_rot[:, 0, 0] = 1\n DOF_rot[:, 3, 3] = 1\n DOF_rot[:, 6, 6] = 1\n DOF_rot[:, 1:3, 1:3] = J\n DOF_rot[:, 4:6, 4:6] = J1\n DOF_rot[:, 7:9, 7:9] = J2\n\n # 2) matrix to rotate Hessian in global coordinates.\n H_rot, area = self.get_Hrot_from_J(J, return_area=True)\n\n # 3) Computes stiffness matrix\n # Gauss quadrature.\n K = np.zeros([n, 9, 9], dtype=np.float64)\n weights = self.gauss_w\n pts = self.gauss_pts\n for igauss in range(self.n_gauss):\n alpha = np.tile(pts[igauss, :], n).reshape(n, 3)\n alpha = np.expand_dims(alpha, 
2)\n weight = weights[igauss]\n d2Skdksi2 = self.get_d2Sidksij2(alpha, ecc)\n d2Skdx2 = d2Skdksi2 @ H_rot\n K += weight * (d2Skdx2 @ self.E @ _transpose_vectorized(d2Skdx2))\n\n # 4) With nodal (not elem) dofs\n K = _transpose_vectorized(DOF_rot) @ K @ DOF_rot\n\n # 5) Need the area to compute total element energy\n return _scalar_vectorized(area, K)\n\n def get_Hrot_from_J(self, J, return_area=False):\n """\n Parameters\n ----------\n *J* is a (N x 2 x 2) array of jacobian matrices (jacobian matrix at\n triangle first apex)\n\n Returns\n -------\n Returns H_rot used to rotate Hessian from local basis of first apex,\n to global coordinates.\n If *return_area* is True, also returns the triangle area (0.5*det(J))\n """\n # Here we try to deal with the simplest colinear cases; a null\n # energy and area is imposed.\n J_inv = _safe_inv22_vectorized(J)\n Ji00 = J_inv[:, 0, 0]\n Ji11 = J_inv[:, 1, 1]\n Ji10 = J_inv[:, 1, 0]\n Ji01 = J_inv[:, 0, 1]\n H_rot = _to_matrix_vectorized([\n [Ji00*Ji00, Ji10*Ji10, Ji00*Ji10],\n [Ji01*Ji01, Ji11*Ji11, Ji01*Ji11],\n [2*Ji00*Ji01, 2*Ji11*Ji10, Ji00*Ji11+Ji10*Ji01]])\n if not return_area:\n return H_rot\n else:\n area = 0.5 * (J[:, 0, 0]*J[:, 1, 1] - J[:, 0, 1]*J[:, 1, 0])\n return H_rot, area\n\n def get_Kff_and_Ff(self, J, ecc, triangles, Uc):\n """\n Build K and F for the following elliptic formulation:\n minimization of curvature energy with value of function at node\n imposed and derivatives 'free'.\n\n Build the global Kff matrix in coo format.\n Build the full Ff vec Ff = - Kfc x Uc.\n\n Parameters\n ----------\n *J* is a (N x 2 x 2) array of jacobian matrices (jacobian matrix at\n triangle first apex)\n *ecc* is a (N x 3 x 1) array (array of column-matrices) of triangle\n eccentricities\n *triangles* is a (N x 3) array of node indices.\n *Uc* is a (N x 3) array of imposed displacements at nodes\n\n Returns\n -------\n (Kff_rows, Kff_cols, Kff_vals) Kff matrix in coo format - Duplicate\n (row, col) entries must be summed.\n Ff: 
force vector - dim npts * 3\n """\n ntri = np.size(ecc, 0)\n vec_range = np.arange(ntri, dtype=np.int32)\n c_indices = np.full(ntri, -1, dtype=np.int32) # for unused dofs, -1\n f_dof = [1, 2, 4, 5, 7, 8]\n c_dof = [0, 3, 6]\n\n # vals, rows and cols indices in global dof numbering\n f_dof_indices = _to_matrix_vectorized([[\n c_indices, triangles[:, 0]*2, triangles[:, 0]*2+1,\n c_indices, triangles[:, 1]*2, triangles[:, 1]*2+1,\n c_indices, triangles[:, 2]*2, triangles[:, 2]*2+1]])\n\n expand_indices = np.ones([ntri, 9, 1], dtype=np.int32)\n f_row_indices = _transpose_vectorized(expand_indices @ f_dof_indices)\n f_col_indices = expand_indices @ f_dof_indices\n K_elem = self.get_bending_matrices(J, ecc)\n\n # Extracting sub-matrices\n # Explanation & notations:\n # * Subscript f denotes 'free' degrees of freedom (i.e. dz/dx, dz/dy)\n # * Subscript c denotes 'condensed' (imposed) degrees of freedom\n # (i.e. z at all nodes)\n # * F = [Ff, Fc] is the force vector\n # * U = [Uf, Uc] is the imposed dof vector\n # [ Kff Kfc ]\n # * K = [ ] is the Laplacian stiffness matrix\n # [ Kcf Kcc ]\n # * As F = K x U one gets straightforwardly: Ff = - Kfc x Uc\n\n # Computing Kff stiffness matrix in sparse coo format\n Kff_vals = np.ravel(K_elem[np.ix_(vec_range, f_dof, f_dof)])\n Kff_rows = np.ravel(f_row_indices[np.ix_(vec_range, f_dof, f_dof)])\n Kff_cols = np.ravel(f_col_indices[np.ix_(vec_range, f_dof, f_dof)])\n\n # Computing Ff force vector in sparse coo format\n Kfc_elem = K_elem[np.ix_(vec_range, f_dof, c_dof)]\n Uc_elem = np.expand_dims(Uc, axis=2)\n Ff_elem = -(Kfc_elem @ Uc_elem)[:, :, 0]\n Ff_indices = f_dof_indices[np.ix_(vec_range, [0], f_dof)][:, 0, :]\n\n # Extracting Ff force vector in dense format\n # We have to sum duplicate indices - using bincount\n Ff = np.bincount(np.ravel(Ff_indices), weights=np.ravel(Ff_elem))\n return Kff_rows, Kff_cols, Kff_vals, Ff\n\n\n# :class:_DOF_estimator, _DOF_estimator_user, _DOF_estimator_geom,\n# _DOF_estimator_min_E\n# 
Private classes used to compute the degree of freedom of each triangular\n# element for the TriCubicInterpolator.\nclass _DOF_estimator:\n """\n Abstract base class for classes used to estimate a function's first\n derivatives, and deduce the dofs for a CubicTriInterpolator using a\n reduced HCT element formulation.\n\n Derived classes implement ``compute_df(self, **kwargs)``, returning\n ``np.vstack([dfx, dfy]).T`` where ``dfx, dfy`` are the estimation of the 2\n gradient coordinates.\n """\n def __init__(self, interpolator, **kwargs):\n _api.check_isinstance(CubicTriInterpolator, interpolator=interpolator)\n self._pts = interpolator._pts\n self._tris_pts = interpolator._tris_pts\n self.z = interpolator._z\n self._triangles = interpolator._triangles\n (self._unit_x, self._unit_y) = (interpolator._unit_x,\n interpolator._unit_y)\n self.dz = self.compute_dz(**kwargs)\n self.compute_dof_from_df()\n\n def compute_dz(self, **kwargs):\n raise NotImplementedError\n\n def compute_dof_from_df(self):\n """\n Compute reduced-HCT elements degrees of freedom, from the gradient.\n """\n J = CubicTriInterpolator._get_jacobian(self._tris_pts)\n tri_z = self.z[self._triangles]\n tri_dz = self.dz[self._triangles]\n tri_dof = self.get_dof_vec(tri_z, tri_dz, J)\n return tri_dof\n\n @staticmethod\n def get_dof_vec(tri_z, tri_dz, J):\n """\n Compute the dof vector of a triangle, from the value of f, df and\n of the local Jacobian at each node.\n\n Parameters\n ----------\n tri_z : shape (3,) array\n f nodal values.\n tri_dz : shape (3, 2) array\n df/dx, df/dy nodal values.\n J\n Jacobian matrix in local basis of apex 0.\n\n Returns\n -------\n dof : shape (9,) array\n For each apex ``iapex``::\n\n dof[iapex*3+0] = f(Ai)\n dof[iapex*3+1] = df(Ai).(AiAi+)\n dof[iapex*3+2] = df(Ai).(AiAi-)\n """\n npt = tri_z.shape[0]\n dof = np.zeros([npt, 9], dtype=np.float64)\n J1 = _ReducedHCT_Element.J0_to_J1 @ J\n J2 = _ReducedHCT_Element.J0_to_J2 @ J\n\n col0 = J @ np.expand_dims(tri_dz[:, 0, :], 
axis=2)\n col1 = J1 @ np.expand_dims(tri_dz[:, 1, :], axis=2)\n col2 = J2 @ np.expand_dims(tri_dz[:, 2, :], axis=2)\n\n dfdksi = _to_matrix_vectorized([\n [col0[:, 0, 0], col1[:, 0, 0], col2[:, 0, 0]],\n [col0[:, 1, 0], col1[:, 1, 0], col2[:, 1, 0]]])\n dof[:, 0:7:3] = tri_z\n dof[:, 1:8:3] = dfdksi[:, 0]\n dof[:, 2:9:3] = dfdksi[:, 1]\n return dof\n\n\nclass _DOF_estimator_user(_DOF_estimator):\n """dz is imposed by user; accounts for scaling if any."""\n\n def compute_dz(self, dz):\n (dzdx, dzdy) = dz\n dzdx = dzdx * self._unit_x\n dzdy = dzdy * self._unit_y\n return np.vstack([dzdx, dzdy]).T\n\n\nclass _DOF_estimator_geom(_DOF_estimator):\n """Fast 'geometric' approximation, recommended for large arrays."""\n\n def compute_dz(self):\n """\n self.dz is computed as a weighted average over the _triangles sharing a\n common node. On each triangle itri f is first assumed linear (= ~f),\n which allows computing d~f[itri].\n The following approximation of the df nodal values is then proposed:\n df[ipt] = SUM ( w[itri] x d~f[itri] , for itri sharing apex ipt)\n The weighted coeffs. 
w[itri] are proportional to the angle of the\n triangle itri at apex ipt\n """\n el_geom_w = self.compute_geom_weights()\n el_geom_grad = self.compute_geom_grads()\n\n # Sum of weights coeffs\n w_node_sum = np.bincount(np.ravel(self._triangles),\n weights=np.ravel(el_geom_w))\n\n # Sum of weighted df = (dfx, dfy)\n dfx_el_w = np.empty_like(el_geom_w)\n dfy_el_w = np.empty_like(el_geom_w)\n for iapex in range(3):\n dfx_el_w[:, iapex] = el_geom_w[:, iapex]*el_geom_grad[:, 0]\n dfy_el_w[:, iapex] = el_geom_w[:, iapex]*el_geom_grad[:, 1]\n dfx_node_sum = np.bincount(np.ravel(self._triangles),\n weights=np.ravel(dfx_el_w))\n dfy_node_sum = np.bincount(np.ravel(self._triangles),\n weights=np.ravel(dfy_el_w))\n\n # Estimation of df\n dfx_estim = dfx_node_sum/w_node_sum\n dfy_estim = dfy_node_sum/w_node_sum\n return np.vstack([dfx_estim, dfy_estim]).T\n\n def compute_geom_weights(self):\n """\n Build the (nelems, 3) weights coeffs of _triangles angles,\n renormalized so that np.sum(weights, axis=1) == np.ones(nelems)\n """\n weights = np.zeros([np.size(self._triangles, 0), 3])\n tris_pts = self._tris_pts\n for ipt in range(3):\n p0 = tris_pts[:, ipt % 3, :]\n p1 = tris_pts[:, (ipt+1) % 3, :]\n p2 = tris_pts[:, (ipt-1) % 3, :]\n alpha1 = np.arctan2(p1[:, 1]-p0[:, 1], p1[:, 0]-p0[:, 0])\n alpha2 = np.arctan2(p2[:, 1]-p0[:, 1], p2[:, 0]-p0[:, 0])\n # In the below formula we could take modulo 2. but\n # modulo 1. is safer regarding round-off errors (flat triangles).\n angle = np.abs(((alpha2-alpha1) / np.pi) % 1)\n # Weight proportional to angle up np.pi/2; null weight for\n # degenerated cases 0 and np.pi (note that *angle* is normalized\n # by np.pi).\n weights[:, ipt] = 0.5 - np.abs(angle-0.5)\n return weights\n\n def compute_geom_grads(self):\n """\n Compute the (global) gradient component of f assumed linear (~f).\n returns array df of shape (nelems, 2)\n df[ielem].dM[ielem] = dz[ielem] i.e. 
df = dz x dM = dM.T^-1 x dz\n """\n tris_pts = self._tris_pts\n tris_f = self.z[self._triangles]\n\n dM1 = tris_pts[:, 1, :] - tris_pts[:, 0, :]\n dM2 = tris_pts[:, 2, :] - tris_pts[:, 0, :]\n dM = np.dstack([dM1, dM2])\n # Here we try to deal with the simplest colinear cases: a null\n # gradient is assumed in this case.\n dM_inv = _safe_inv22_vectorized(dM)\n\n dZ1 = tris_f[:, 1] - tris_f[:, 0]\n dZ2 = tris_f[:, 2] - tris_f[:, 0]\n dZ = np.vstack([dZ1, dZ2]).T\n df = np.empty_like(dZ)\n\n # With np.einsum: could be ej,eji -> ej\n df[:, 0] = dZ[:, 0]*dM_inv[:, 0, 0] + dZ[:, 1]*dM_inv[:, 1, 0]\n df[:, 1] = dZ[:, 0]*dM_inv[:, 0, 1] + dZ[:, 1]*dM_inv[:, 1, 1]\n return df\n\n\nclass _DOF_estimator_min_E(_DOF_estimator_geom):\n """\n The 'smoothest' approximation, df is computed through global minimization\n of the bending energy:\n E(f) = integral[(d2z/dx2 + d2z/dy2 + 2 d2z/dxdy)**2 dA]\n """\n def __init__(self, Interpolator):\n self._eccs = Interpolator._eccs\n super().__init__(Interpolator)\n\n def compute_dz(self):\n """\n Elliptic solver for bending energy minimization.\n Uses a dedicated 'toy' sparse Jacobi PCG solver.\n """\n # Initial guess for iterative PCG solver.\n dz_init = super().compute_dz()\n Uf0 = np.ravel(dz_init)\n\n reference_element = _ReducedHCT_Element()\n J = CubicTriInterpolator._get_jacobian(self._tris_pts)\n eccs = self._eccs\n triangles = self._triangles\n Uc = self.z[self._triangles]\n\n # Building stiffness matrix and force vector in coo format\n Kff_rows, Kff_cols, Kff_vals, Ff = reference_element.get_Kff_and_Ff(\n J, eccs, triangles, Uc)\n\n # Building sparse matrix and solving minimization problem\n # We could use scipy.sparse direct solver; however to avoid this\n # external dependency an implementation of a simple PCG solver with\n # a simple diagonal Jacobi preconditioner is implemented.\n tol = 1.e-10\n n_dof = Ff.shape[0]\n Kff_coo = _Sparse_Matrix_coo(Kff_vals, Kff_rows, Kff_cols,\n shape=(n_dof, n_dof))\n Kff_coo.compress_csc()\n 
Uf, err = _cg(A=Kff_coo, b=Ff, x0=Uf0, tol=tol)\n # If the PCG did not converge, we return the best guess between Uf0\n # and Uf.\n err0 = np.linalg.norm(Kff_coo.dot(Uf0) - Ff)\n if err0 < err:\n # Maybe a good occasion to raise a warning here ?\n _api.warn_external("In TriCubicInterpolator initialization, "\n "PCG sparse solver did not converge after "\n "1000 iterations. `geom` approximation is "\n "used instead of `min_E`")\n Uf = Uf0\n\n # Building dz from Uf\n dz = np.empty([self._pts.shape[0], 2], dtype=np.float64)\n dz[:, 0] = Uf[::2]\n dz[:, 1] = Uf[1::2]\n return dz\n\n\n# The following private :class:_Sparse_Matrix_coo and :func:_cg provide\n# a PCG sparse solver for (symmetric) elliptic problems.\nclass _Sparse_Matrix_coo:\n def __init__(self, vals, rows, cols, shape):\n """\n Create a sparse matrix in coo format.\n *vals*: arrays of values of non-null entries of the matrix\n *rows*: int arrays of rows of non-null entries of the matrix\n *cols*: int arrays of cols of non-null entries of the matrix\n *shape*: 2-tuple (n, m) of matrix shape\n """\n self.n, self.m = shape\n self.vals = np.asarray(vals, dtype=np.float64)\n self.rows = np.asarray(rows, dtype=np.int32)\n self.cols = np.asarray(cols, dtype=np.int32)\n\n def dot(self, V):\n """\n Dot product of self by a vector *V* in sparse-dense to dense format\n *V* dense vector of shape (self.m,).\n """\n assert V.shape == (self.m,)\n return np.bincount(self.rows,\n weights=self.vals*V[self.cols],\n minlength=self.m)\n\n def compress_csc(self):\n """\n Compress rows, cols, vals / summing duplicates. Sort for csc format.\n """\n _, unique, indices = np.unique(\n self.rows + self.n*self.cols,\n return_index=True, return_inverse=True)\n self.rows = self.rows[unique]\n self.cols = self.cols[unique]\n self.vals = np.bincount(indices, weights=self.vals)\n\n def compress_csr(self):\n """\n Compress rows, cols, vals / summing duplicates. 
Sort for csr format.\n """\n _, unique, indices = np.unique(\n self.m*self.rows + self.cols,\n return_index=True, return_inverse=True)\n self.rows = self.rows[unique]\n self.cols = self.cols[unique]\n self.vals = np.bincount(indices, weights=self.vals)\n\n def to_dense(self):\n """\n Return a dense matrix representing self, mainly for debugging purposes.\n """\n ret = np.zeros([self.n, self.m], dtype=np.float64)\n nvals = self.vals.size\n for i in range(nvals):\n ret[self.rows[i], self.cols[i]] += self.vals[i]\n return ret\n\n def __str__(self):\n return self.to_dense().__str__()\n\n @property\n def diag(self):\n """Return the (dense) vector of the diagonal elements."""\n in_diag = (self.rows == self.cols)\n diag = np.zeros(min(self.n, self.m), dtype=np.float64) # default 0.\n diag[self.rows[in_diag]] = self.vals[in_diag]\n return diag\n\n\ndef _cg(A, b, x0=None, tol=1.e-10, maxiter=1000):\n """\n Use Preconditioned Conjugate Gradient iteration to solve A x = b.\n A simple Jacobi (diagonal) preconditioner is used.\n\n Parameters\n ----------\n A : _Sparse_Matrix_coo\n *A* must have been compressed beforehand by the compress_csc or\n compress_csr method.\n b : array\n Right hand side of the linear system.\n x0 : array, optional\n Starting guess for the solution. Defaults to the zero vector.\n tol : float, optional\n Tolerance to achieve. The algorithm terminates when the relative\n residual is below tol. Default is 1e-10.\n maxiter : int, optional\n Maximum number of iterations. Iteration will stop after *maxiter*\n steps even if the specified tolerance has not been achieved. 
Defaults\n to 1000.\n\n Returns\n -------\n x : array\n The converged solution.\n err : float\n The absolute error np.linalg.norm(A.dot(x) - b)\n """\n n = b.size\n assert A.n == n\n assert A.m == n\n b_norm = np.linalg.norm(b)\n\n # Jacobi pre-conditioner\n kvec = A.diag\n # For diag elem < 1e-6 we keep 1e-6.\n kvec = np.maximum(kvec, 1e-6)\n\n # Initial guess\n if x0 is None:\n x = np.zeros(n)\n else:\n x = x0\n\n r = b - A.dot(x)\n w = r/kvec\n\n p = np.zeros(n)\n beta = 0.0\n rho = np.dot(r, w)\n k = 0\n\n # Following C. T. Kelley\n while (np.sqrt(abs(rho)) > tol*b_norm) and (k < maxiter):\n p = w + beta*p\n z = A.dot(p)\n alpha = rho/np.dot(p, z)\n r = r - alpha*z\n w = r/kvec\n rhoold = rho\n rho = np.dot(r, w)\n x = x + alpha*p\n beta = rho/rhoold\n # err = np.linalg.norm(A.dot(x) - b) # absolute accuracy - not used\n k += 1\n err = np.linalg.norm(A.dot(x) - b)\n return x, err\n\n\n# The following private functions:\n# :func:`_safe_inv22_vectorized`\n# :func:`_pseudo_inv22sym_vectorized`\n# :func:`_scalar_vectorized`\n# :func:`_transpose_vectorized`\n# :func:`_roll_vectorized`\n# :func:`_to_matrix_vectorized`\n# :func:`_extract_submatrices`\n# provide fast numpy implementation of some standard operations on arrays of\n# matrices - stored as (:, n_rows, n_cols)-shaped np.arrays.\n\n# Development note: Dealing with pathologic 'flat' triangles in the\n# CubicTriInterpolator code and impact on (2, 2)-matrix inversion functions\n# :func:`_safe_inv22_vectorized` and :func:`_pseudo_inv22sym_vectorized`.\n#\n# Goals:\n# 1) The CubicTriInterpolator should be able to handle flat or almost flat\n# triangles without raising an error,\n# 2) These degenerated triangles should have no impact on the automatic dof\n# calculation (associated with null weight for the _DOF_estimator_geom and\n# with null energy for the _DOF_estimator_min_E),\n# 3) Linear patch test should be passed exactly on degenerated meshes,\n# 4) Interpolation (with :meth:`_interpolate_single_key` or\n# 
:meth:`_interpolate_multi_key`) shall be correctly handled even *inside*\n# the pathologic triangles, to interact correctly with a TriRefiner class.\n#\n# Difficulties:\n# Flat triangles have rank-deficient *J* (so-called jacobian matrix) and\n# *metric* (the metric tensor = J x J.T). Computation of the local\n# tangent plane is also problematic.\n#\n# Implementation:\n# Most of the time, when computing the inverse of a rank-deficient matrix it\n# is safe to simply return the null matrix (which is the implementation in\n# :func:`_safe_inv22_vectorized`). This is because of point 2), itself\n# enforced by:\n# - null area hence null energy in :class:`_DOF_estimator_min_E`\n# - angles close or equal to 0 or np.pi hence null weight in\n# :class:`_DOF_estimator_geom`.\n# Note that the function angle -> weight is continuous and maximum for an\n# angle np.pi/2 (refer to :meth:`compute_geom_weights`)\n# The exception is the computation of barycentric coordinates, which is done\n# by inversion of the *metric* matrix. In this case, we need to compute a set\n# of valid coordinates (1 among numerous possibilities), to ensure point 4).\n# We benefit here from the symmetry of metric = J x J.T, which makes it easier\n# to compute a pseudo-inverse in :func:`_pseudo_inv22sym_vectorized`\ndef _safe_inv22_vectorized(M):\n """\n Inversion of arrays of (2, 2) matrices, returns 0 for rank-deficient\n matrices.\n\n *M* : array of (2, 2) matrices to inverse, shape (n, 2, 2)\n """\n _api.check_shape((None, 2, 2), M=M)\n M_inv = np.empty_like(M)\n prod1 = M[:, 0, 0]*M[:, 1, 1]\n delta = prod1 - M[:, 0, 1]*M[:, 1, 0]\n\n # We set delta_inv to 0. 
in case of a rank deficient matrix; a\n # rank-deficient input matrix *M* will lead to a null matrix in output\n rank2 = (np.abs(delta) > 1e-8*np.abs(prod1))\n if np.all(rank2):\n # Normal 'optimized' flow.\n delta_inv = 1./delta\n else:\n # 'Pathologic' flow.\n delta_inv = np.zeros(M.shape[0])\n delta_inv[rank2] = 1./delta[rank2]\n\n M_inv[:, 0, 0] = M[:, 1, 1]*delta_inv\n M_inv[:, 0, 1] = -M[:, 0, 1]*delta_inv\n M_inv[:, 1, 0] = -M[:, 1, 0]*delta_inv\n M_inv[:, 1, 1] = M[:, 0, 0]*delta_inv\n return M_inv\n\n\ndef _pseudo_inv22sym_vectorized(M):\n """\n Inversion of arrays of (2, 2) SYMMETRIC matrices; returns the\n (Moore-Penrose) pseudo-inverse for rank-deficient matrices.\n\n In case M is of rank 1, we have M = trace(M) x P where P is the orthogonal\n projection on Im(M), and we return trace(M)^-1 x P == M / trace(M)**2\n In case M is of rank 0, we return the null matrix.\n\n *M* : array of (2, 2) matrices to inverse, shape (n, 2, 2)\n """\n _api.check_shape((None, 2, 2), M=M)\n M_inv = np.empty_like(M)\n prod1 = M[:, 0, 0]*M[:, 1, 1]\n delta = prod1 - M[:, 0, 1]*M[:, 1, 0]\n rank2 = (np.abs(delta) > 1e-8*np.abs(prod1))\n\n if np.all(rank2):\n # Normal 'optimized' flow.\n M_inv[:, 0, 0] = M[:, 1, 1] / delta\n M_inv[:, 0, 1] = -M[:, 0, 1] / delta\n M_inv[:, 1, 0] = -M[:, 1, 0] / delta\n M_inv[:, 1, 1] = M[:, 0, 0] / delta\n else:\n # 'Pathologic' flow.\n # Here we have to deal with 2 sub-cases\n # 1) First sub-case: matrices of rank 2:\n delta = delta[rank2]\n M_inv[rank2, 0, 0] = M[rank2, 1, 1] / delta\n M_inv[rank2, 0, 1] = -M[rank2, 0, 1] / delta\n M_inv[rank2, 1, 0] = -M[rank2, 1, 0] / delta\n M_inv[rank2, 1, 1] = M[rank2, 0, 0] / delta\n # 2) Second sub-case: rank-deficient matrices of rank 0 and 1:\n rank01 = ~rank2\n tr = M[rank01, 0, 0] + M[rank01, 1, 1]\n tr_zeros = (np.abs(tr) < 1.e-8)\n sq_tr_inv = (1.-tr_zeros) / (tr**2+tr_zeros)\n # sq_tr_inv = 1. 
/ tr**2\n M_inv[rank01, 0, 0] = M[rank01, 0, 0] * sq_tr_inv\n M_inv[rank01, 0, 1] = M[rank01, 0, 1] * sq_tr_inv\n M_inv[rank01, 1, 0] = M[rank01, 1, 0] * sq_tr_inv\n M_inv[rank01, 1, 1] = M[rank01, 1, 1] * sq_tr_inv\n\n return M_inv\n\n\ndef _scalar_vectorized(scalar, M):\n """\n Scalar product between scalars and matrices.\n """\n return scalar[:, np.newaxis, np.newaxis]*M\n\n\ndef _transpose_vectorized(M):\n """\n Transposition of an array of matrices *M*.\n """\n return np.transpose(M, [0, 2, 1])\n\n\ndef _roll_vectorized(M, roll_indices, axis):\n """\n Roll an array of matrices along *axis* (0: rows, 1: columns) according to\n an array of indices *roll_indices*.\n """\n assert axis in [0, 1]\n ndim = M.ndim\n assert ndim == 3\n ndim_roll = roll_indices.ndim\n assert ndim_roll == 1\n sh = M.shape\n r, c = sh[-2:]\n assert sh[0] == roll_indices.shape[0]\n vec_indices = np.arange(sh[0], dtype=np.int32)\n\n # Builds the rolled matrix\n M_roll = np.empty_like(M)\n if axis == 0:\n for ir in range(r):\n for ic in range(c):\n M_roll[:, ir, ic] = M[vec_indices, (-roll_indices+ir) % r, ic]\n else: # 1\n for ir in range(r):\n for ic in range(c):\n M_roll[:, ir, ic] = M[vec_indices, ir, (-roll_indices+ic) % c]\n return M_roll\n\n\ndef _to_matrix_vectorized(M):\n """\n Build an array of matrices from individuals np.arrays of identical shapes.\n\n Parameters\n ----------\n M\n ncols-list of nrows-lists of shape sh.\n\n Returns\n -------\n M_res : np.array of shape (sh, nrow, ncols)\n *M_res* satisfies ``M_res[..., i, j] = M[i][j]``.\n """\n assert isinstance(M, (tuple, list))\n assert all(isinstance(item, (tuple, list)) for item in M)\n c_vec = np.asarray([len(item) for item in M])\n assert np.all(c_vec-c_vec[0] == 0)\n r = len(M)\n c = c_vec[0]\n M00 = np.asarray(M[0][0])\n dt = M00.dtype\n sh = [M00.shape[0], r, c]\n M_ret = np.empty(sh, dtype=dt)\n for irow in range(r):\n for icol in range(c):\n M_ret[:, irow, icol] = np.asarray(M[irow][icol])\n return M_ret\n\n\ndef 
_extract_submatrices(M, block_indices, block_size, axis):\n """\n Extract selected blocks of a matrix *M* depending on parameters\n *block_indices* and *block_size*.\n\n Returns the array of extracted matrices *M_res* so that ::\n\n M_res[..., ir, :] = M[(block_indices*block_size+ir), :]\n """\n assert block_indices.ndim == 1\n assert axis in [0, 1]\n\n r, c = M.shape\n if axis == 0:\n sh = [block_indices.shape[0], block_size, c]\n else: # 1\n sh = [block_indices.shape[0], r, block_size]\n\n dt = M.dtype\n M_res = np.empty(sh, dtype=dt)\n if axis == 0:\n for ir in range(block_size):\n M_res[:, ir, :] = M[(block_indices*block_size+ir), :]\n else: # 1\n for ic in range(block_size):\n M_res[:, :, ic] = M[:, (block_indices*block_size+ic)]\n\n return M_res\n | .venv\Lib\site-packages\matplotlib\tri\_triinterpolate.py | _triinterpolate.py | Python | 62,445 | 0.75 | 0.114994 | 0.147959 | vue-tools | 617 | 2024-03-07T21:28:52.730654 | GPL-3.0 | false | 8c4b8fe5fd565957b8bfb15cc1991a32
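The `_Sparse_Matrix_coo.dot` method in the file above builds a sparse matrix-vector product entirely out of `np.bincount`. A standalone sketch of the same trick (variable and function names are ours, not matplotlib's), checked against a dense reference product:

```python
import numpy as np

# Sketch of the bincount-based sparse mat-vec: each stored entry vals[k] at
# position (rows[k], cols[k]) contributes vals[k] * V[cols[k]] to output row
# rows[k]; np.bincount sums entries sharing a row, which also makes duplicate
# (row, col) pairs of the coo format harmless.
def coo_matvec(vals, rows, cols, V, n_rows):
    return np.bincount(rows, weights=vals * V[cols], minlength=n_rows)

vals = np.array([1.0, 2.0, 3.0, 4.0])      # note the duplicate (0, 0) entry
rows = np.array([0, 0, 1, 2])
cols = np.array([0, 0, 2, 1])
V = np.array([1.0, 2.0, 3.0])

dense = np.zeros((3, 3))
np.add.at(dense, (rows, cols), vals)        # dense reference, duplicates summed
result = coo_matvec(vals, rows, cols, V, 3)
assert np.allclose(result, dense @ V)       # -> [3.0, 9.0, 8.0]
```

The same summation property is why `get_Kff_and_Ff` can emit duplicate (row, col) entries per element and leave the assembly to the sparse format.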
from matplotlib.tri import Triangulation, TriFinder\n\nfrom typing import Literal\nimport numpy as np\nfrom numpy.typing import ArrayLike\n\nclass TriInterpolator:\n def __init__(\n self,\n triangulation: Triangulation,\n z: ArrayLike,\n trifinder: TriFinder | None = ...,\n ) -> None: ...\n # __call__ and gradient are not actually implemented by the ABC, but are specified as required\n def __call__(self, x: ArrayLike, y: ArrayLike) -> np.ma.MaskedArray: ...\n def gradient(\n self, x: ArrayLike, y: ArrayLike\n ) -> tuple[np.ma.MaskedArray, np.ma.MaskedArray]: ...\n\nclass LinearTriInterpolator(TriInterpolator): ...\n\nclass CubicTriInterpolator(TriInterpolator):\n def __init__(\n self,\n triangulation: Triangulation,\n z: ArrayLike,\n kind: Literal["min_E", "geom", "user"] = ...,\n trifinder: TriFinder | None = ...,\n dz: tuple[ArrayLike, ArrayLike] | None = ...,\n ) -> None: ...\n\n__all__ = ('TriInterpolator', 'LinearTriInterpolator', 'CubicTriInterpolator')\n | .venv\Lib\site-packages\matplotlib\tri\_triinterpolate.pyi | _triinterpolate.pyi | Other | 1,044 | 0.95 | 0.21875 | 0.037037 | vue-tools | 725 | 2025-03-21T12:52:29.745015 | BSD-3-Clause | false | f465aa06e4ca9a43ddb8cffe6f043757 |
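The stub above only declares signatures; a minimal usage sketch of the declared interface (`__call__` returning a masked array, `gradient` returning a pair of masked arrays). For a linear field the linear interpolator is exact inside the convex hull, so the expected values follow directly from the field:

```python
import numpy as np
from matplotlib.tri import Triangulation, LinearTriInterpolator

x = np.array([0.0, 1.0, 0.0, 1.0])
y = np.array([0.0, 0.0, 1.0, 1.0])
z = 2.0 * x + 3.0 * y                       # exactly representable field
tri = Triangulation(x, y)                   # Delaunay triangulation on the fly
interp = LinearTriInterpolator(tri, z)

vals = interp(np.array([0.25]), np.array([0.25]))            # masked array
dzdx, dzdy = interp.gradient(np.array([0.25]), np.array([0.25]))
assert np.isclose(vals[0], 1.25)            # 2*0.25 + 3*0.25
assert np.isclose(dzdx[0], 2.0) and np.isclose(dzdy[0], 3.0)
```

`CubicTriInterpolator` is called the same way; only the construction differs (`kind`, optional `dz`).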
import numpy as np\n\nfrom matplotlib import _api, _docstring\nfrom matplotlib.collections import PolyCollection, TriMesh\nfrom matplotlib.tri._triangulation import Triangulation\n\n\n@_docstring.interpd\ndef tripcolor(ax, *args, alpha=1.0, norm=None, cmap=None, vmin=None,\n vmax=None, shading='flat', facecolors=None, **kwargs):\n """\n Create a pseudocolor plot of an unstructured triangular grid.\n\n Call signatures::\n\n tripcolor(triangulation, c, *, ...)\n tripcolor(x, y, c, *, [triangles=triangles], [mask=mask], ...)\n\n The triangular grid can be specified either by passing a `.Triangulation`\n object as the first parameter, or by passing the points *x*, *y* and\n optionally the *triangles* and a *mask*. See `.Triangulation` for an\n explanation of these parameters.\n\n It is possible to pass the triangles positionally, i.e.\n ``tripcolor(x, y, triangles, c, ...)``. However, this is discouraged.\n For more clarity, pass *triangles* via keyword argument.\n\n If neither of *triangulation* or *triangles* are given, the triangulation\n is calculated on the fly. In this case, it does not make sense to provide\n colors at the triangle faces via *c* or *facecolors* because there are\n multiple possible triangulations for a group of points and you don't know\n which triangles will be constructed.\n\n Parameters\n ----------\n triangulation : `.Triangulation`\n An already created triangular grid.\n x, y, triangles, mask\n Parameters defining the triangular grid. See `.Triangulation`.\n This is mutually exclusive with specifying *triangulation*.\n c : array-like\n The color values, either for the points or for the triangles. Which one\n is automatically inferred from the length of *c*, i.e. does it match\n the number of points or the number of triangles. 
If there are the same\n number of points and triangles in the triangulation it is assumed that\n color values are defined at points; to force the use of color values at\n triangles use the keyword argument ``facecolors=c`` instead of just\n ``c``.\n This parameter is position-only.\n facecolors : array-like, optional\n Can be used alternatively to *c* to specify colors at the triangle\n faces. This parameter takes precedence over *c*.\n shading : {'flat', 'gouraud'}, default: 'flat'\n If 'flat' and the color values *c* are defined at points, the color\n values used for each triangle are from the mean c of the triangle's\n three points. If *shading* is 'gouraud' then color values must be\n defined at points.\n %(cmap_doc)s\n\n %(norm_doc)s\n\n %(vmin_vmax_doc)s\n\n %(colorizer_doc)s\n\n Returns\n -------\n `~matplotlib.collections.PolyCollection` or `~matplotlib.collections.TriMesh`\n The result depends on *shading*: For ``shading='flat'`` the result is a\n `.PolyCollection`, for ``shading='gouraud'`` the result is a `.TriMesh`.\n\n Other Parameters\n ----------------\n **kwargs : `~matplotlib.collections.Collection` properties\n\n %(Collection:kwdoc)s\n """\n _api.check_in_list(['flat', 'gouraud'], shading=shading)\n\n tri, args, kwargs = Triangulation.get_from_args_and_kwargs(*args, **kwargs)\n\n # Parse the color to be in one of (the other variable will be None):\n # - facecolors: if specified at the triangle faces\n # - point_colors: if specified at the points\n if facecolors is not None:\n if args:\n _api.warn_external(\n "Positional parameter c has no effect when the keyword "\n "facecolors is given")\n point_colors = None\n if len(facecolors) != len(tri.triangles):\n raise ValueError("The length of facecolors must match the number "\n "of triangles")\n else:\n # Color from positional parameter c\n if not args:\n raise TypeError(\n "tripcolor() missing 1 required positional argument: 'c'; or "\n "1 required keyword-only argument: 'facecolors'")\n elif 
len(args) > 1:\n raise TypeError(f"Unexpected positional parameters: {args[1:]!r}")\n c = np.asarray(args[0])\n if len(c) == len(tri.x):\n # having this before the len(tri.triangles) comparison gives\n # precedence to nodes if there are as many nodes as triangles\n point_colors = c\n facecolors = None\n elif len(c) == len(tri.triangles):\n point_colors = None\n facecolors = c\n else:\n raise ValueError('The length of c must match either the number '\n 'of points or the number of triangles')\n\n # Handling of linewidths, shading, edgecolors and antialiased as\n # in Axes.pcolor\n linewidths = (0.25,)\n if 'linewidth' in kwargs:\n kwargs['linewidths'] = kwargs.pop('linewidth')\n kwargs.setdefault('linewidths', linewidths)\n\n edgecolors = 'none'\n if 'edgecolor' in kwargs:\n kwargs['edgecolors'] = kwargs.pop('edgecolor')\n ec = kwargs.setdefault('edgecolors', edgecolors)\n\n if 'antialiased' in kwargs:\n kwargs['antialiaseds'] = kwargs.pop('antialiased')\n if 'antialiaseds' not in kwargs and ec.lower() == "none":\n kwargs['antialiaseds'] = False\n\n if shading == 'gouraud':\n if facecolors is not None:\n raise ValueError(\n "shading='gouraud' can only be used when the colors "\n "are specified at the points, not at the faces.")\n collection = TriMesh(tri, alpha=alpha, array=point_colors,\n cmap=cmap, norm=norm, **kwargs)\n else: # 'flat'\n # Vertices of triangles.\n maskedTris = tri.get_masked_triangles()\n verts = np.stack((tri.x[maskedTris], tri.y[maskedTris]), axis=-1)\n\n # Color values.\n if facecolors is None:\n # One color per triangle, the mean of the 3 vertex color values.\n colors = point_colors[maskedTris].mean(axis=1)\n elif tri.mask is not None:\n # Remove color values of masked triangles.\n colors = facecolors[~tri.mask]\n else:\n colors = facecolors\n collection = PolyCollection(verts, alpha=alpha, array=colors,\n cmap=cmap, norm=norm, **kwargs)\n\n collection._scale_norm(norm, vmin, vmax)\n ax.grid(False)\n\n minx = tri.x.min()\n maxx = tri.x.max()\n 
miny = tri.y.min()\n maxy = tri.y.max()\n corners = (minx, miny), (maxx, maxy)\n ax.update_datalim(corners)\n ax.autoscale_view()\n ax.add_collection(collection)\n return collection\n | .venv\Lib\site-packages\matplotlib\tri\_tripcolor.py | _tripcolor.py | Python | 6,705 | 0.95 | 0.131737 | 0.090909 | node-utils | 473 | 2024-03-31T22:23:39.666214 | GPL-3.0 | false | 950e80ac5683c79a743129aea784a10d |
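A minimal numpy-only sketch of the 'flat' shading color step in `_tripcolor.py` above (the `point_colors[maskedTris].mean(axis=1)` line): when colors are given per point, each triangle's face color is the mean of its 3 vertex values. The triangle indices and color values below are illustrative, not taken from the source.

```python
import numpy as np

# Two triangles sharing an edge, one color value per point (illustrative).
triangles = np.array([[0, 1, 2], [1, 3, 2]])
point_colors = np.array([0.0, 1.0, 2.0, 3.0])

# Flat shading: face color = mean of the 3 vertex color values.
face_colors = point_colors[triangles].mean(axis=1)
print(face_colors)  # [1. 2.]
```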
from matplotlib.axes import Axes\nfrom matplotlib.collections import PolyCollection, TriMesh\nfrom matplotlib.colors import Normalize, Colormap\nfrom matplotlib.tri._triangulation import Triangulation\n\nfrom numpy.typing import ArrayLike\n\nfrom typing import overload, Literal\n\n@overload\ndef tripcolor(\n ax: Axes,\n triangulation: Triangulation,\n c: ArrayLike = ...,\n *,\n alpha: float = ...,\n norm: str | Normalize | None = ...,\n cmap: str | Colormap | None = ...,\n vmin: float | None = ...,\n vmax: float | None = ...,\n shading: Literal["flat"] = ...,\n facecolors: ArrayLike | None = ...,\n **kwargs\n) -> PolyCollection: ...\n@overload\ndef tripcolor(\n ax: Axes,\n x: ArrayLike,\n y: ArrayLike,\n c: ArrayLike = ...,\n *,\n alpha: float = ...,\n norm: str | Normalize | None = ...,\n cmap: str | Colormap | None = ...,\n vmin: float | None = ...,\n vmax: float | None = ...,\n shading: Literal["flat"] = ...,\n facecolors: ArrayLike | None = ...,\n **kwargs\n) -> PolyCollection: ...\n@overload\ndef tripcolor(\n ax: Axes,\n triangulation: Triangulation,\n c: ArrayLike = ...,\n *,\n alpha: float = ...,\n norm: str | Normalize | None = ...,\n cmap: str | Colormap | None = ...,\n vmin: float | None = ...,\n vmax: float | None = ...,\n shading: Literal["gouraud"],\n facecolors: ArrayLike | None = ...,\n **kwargs\n) -> TriMesh: ...\n@overload\ndef tripcolor(\n ax: Axes,\n x: ArrayLike,\n y: ArrayLike,\n c: ArrayLike = ...,\n *,\n alpha: float = ...,\n norm: str | Normalize | None = ...,\n cmap: str | Colormap | None = ...,\n vmin: float | None = ...,\n vmax: float | None = ...,\n shading: Literal["gouraud"],\n facecolors: ArrayLike | None = ...,\n **kwargs\n) -> TriMesh: ...\n | .venv\Lib\site-packages\matplotlib\tri\_tripcolor.pyi | _tripcolor.pyi | Other | 1,781 | 0.85 | 0.056338 | 0.117647 | awesome-app | 612 | 2025-02-02T00:49:40.412481 | GPL-3.0 | false | 50f75e19d48f48ade00a1b5c3203fb66 |
import numpy as np\nfrom matplotlib.tri._triangulation import Triangulation\nimport matplotlib.cbook as cbook\nimport matplotlib.lines as mlines\n\n\ndef triplot(ax, *args, **kwargs):\n """\n Draw an unstructured triangular grid as lines and/or markers.\n\n Call signatures::\n\n triplot(triangulation, ...)\n triplot(x, y, [triangles], *, [mask=mask], ...)\n\n The triangular grid can be specified either by passing a `.Triangulation`\n object as the first parameter, or by passing the points *x*, *y* and\n optionally the *triangles* and a *mask*. If neither *triangulation* nor\n *triangles* is given, the triangulation is calculated on the fly.\n\n Parameters\n ----------\n triangulation : `.Triangulation`\n An already created triangular grid.\n x, y, triangles, mask\n Parameters defining the triangular grid. See `.Triangulation`.\n This is mutually exclusive with specifying *triangulation*.\n other_parameters\n All other args and kwargs are forwarded to `~.Axes.plot`.\n\n Returns\n -------\n lines : `~matplotlib.lines.Line2D`\n The drawn triangle edges.\n markers : `~matplotlib.lines.Line2D`\n The drawn node markers.\n """\n import matplotlib.axes\n\n tri, args, kwargs = Triangulation.get_from_args_and_kwargs(*args, **kwargs)\n x, y, edges = (tri.x, tri.y, tri.edges)\n\n # Decode plot format string, e.g., 'ro-'\n fmt = args[0] if args else ""\n linestyle, marker, color = matplotlib.axes._base._process_plot_format(fmt)\n\n # Insert plot format string into a copy of kwargs (kwargs values prevail).\n kw = cbook.normalize_kwargs(kwargs, mlines.Line2D)\n for key, val in zip(('linestyle', 'marker', 'color'),\n (linestyle, marker, color)):\n if val is not None:\n kw.setdefault(key, val)\n\n # Draw lines without markers.\n # Note 1: If we drew markers here, most markers would be drawn more than\n # once as they belong to several edges.\n # Note 2: We insert nan values in the flattened edges arrays rather than\n # plotting directly (triang.x[edges].T, triang.y[edges].T)\n 
# as it considerably speeds-up code execution.\n linestyle = kw['linestyle']\n kw_lines = {\n **kw,\n 'marker': 'None', # No marker to draw.\n 'zorder': kw.get('zorder', 1), # Path default zorder is used.\n }\n if linestyle not in [None, 'None', '', ' ']:\n tri_lines_x = np.insert(x[edges], 2, np.nan, axis=1)\n tri_lines_y = np.insert(y[edges], 2, np.nan, axis=1)\n tri_lines = ax.plot(tri_lines_x.ravel(), tri_lines_y.ravel(),\n **kw_lines)\n else:\n tri_lines = ax.plot([], [], **kw_lines)\n\n # Draw markers separately.\n marker = kw['marker']\n kw_markers = {\n **kw,\n 'linestyle': 'None', # No line to draw.\n }\n kw_markers.pop('label', None)\n if marker not in [None, 'None', '', ' ']:\n tri_markers = ax.plot(x, y, **kw_markers)\n else:\n tri_markers = ax.plot([], [], **kw_markers)\n\n return tri_lines + tri_markers\n | .venv\Lib\site-packages\matplotlib\tri\_triplot.py | _triplot.py | Python | 3,102 | 0.95 | 0.069767 | 0.178082 | node-utils | 224 | 2024-07-06T09:31:52.038512 | BSD-3-Clause | false | 1799e18aba55deec625421685fb4bb08 |
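A numpy-only sketch of the NaN-insertion trick used in `_triplot.py` above: appending a NaN after each edge's two endpoint coordinates lets a single `plot` call draw every edge with a break in between, which is what makes the flattened-array approach fast. The points and edge list below are illustrative.

```python
import numpy as np

# Three points and an illustrative edge list (pairs of point indices).
x = np.array([0.0, 1.0, 0.0])
y = np.array([0.0, 0.0, 1.0])
edges = np.array([[0, 1], [1, 2], [2, 0]])

# Insert a NaN as a third column so each edge becomes (x0, x1, nan);
# ravel() then yields one long polyline with breaks at every NaN.
tri_lines_x = np.insert(x[edges], 2, np.nan, axis=1)
tri_lines_y = np.insert(y[edges], 2, np.nan, axis=1)
print(tri_lines_x.ravel())
```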
from matplotlib.tri._triangulation import Triangulation\nfrom matplotlib.axes import Axes\nfrom matplotlib.lines import Line2D\n\nfrom typing import overload\nfrom numpy.typing import ArrayLike\n\n@overload\ndef triplot(\n ax: Axes, triangulation: Triangulation, *args, **kwargs\n) -> tuple[Line2D, Line2D]: ...\n@overload\ndef triplot(\n ax: Axes, x: ArrayLike, y: ArrayLike, triangles: ArrayLike = ..., *args, **kwargs\n) -> tuple[Line2D, Line2D]: ...\n | .venv\Lib\site-packages\matplotlib\tri\_triplot.pyi | _triplot.pyi | Other | 446 | 0.85 | 0.133333 | 0 | react-lib | 650 | 2024-10-28T17:06:41.724474 | BSD-3-Clause | false | 665b38dbc000e91da098c75a6b2a0203 |
"""\nMesh refinement for triangular grids.\n"""\n\nimport numpy as np\n\nfrom matplotlib import _api\nfrom matplotlib.tri._triangulation import Triangulation\nimport matplotlib.tri._triinterpolate\n\n\nclass TriRefiner:\n """\n Abstract base class for classes implementing mesh refinement.\n\n A TriRefiner encapsulates a Triangulation object and provides tools for\n mesh refinement and interpolation.\n\n Derived classes must implement:\n\n - ``refine_triangulation(return_tri_index=False, **kwargs)`` , where\n the optional keyword arguments *kwargs* are defined in each\n TriRefiner concrete implementation, and which returns:\n\n - a refined triangulation,\n - optionally (depending on *return_tri_index*), for each\n point of the refined triangulation: the index of\n the initial triangulation triangle to which it belongs.\n\n - ``refine_field(z, triinterpolator=None, **kwargs)``, where:\n\n - *z* array of field values (to refine) defined at the base\n triangulation nodes,\n - *triinterpolator* is an optional `~matplotlib.tri.TriInterpolator`,\n - the other optional keyword arguments *kwargs* are defined in\n each TriRefiner concrete implementation;\n\n and which returns (as a tuple) a refined triangular mesh and the\n interpolated values of the field at the refined triangulation nodes.\n """\n\n def __init__(self, triangulation):\n _api.check_isinstance(Triangulation, triangulation=triangulation)\n self._triangulation = triangulation\n\n\nclass UniformTriRefiner(TriRefiner):\n """\n Uniform mesh refinement by recursive subdivisions.\n\n Parameters\n ----------\n triangulation : `~matplotlib.tri.Triangulation`\n The encapsulated triangulation (to be refined)\n """\n# See Also\n# --------\n# :class:`~matplotlib.tri.CubicTriInterpolator` and\n# :class:`~matplotlib.tri.TriAnalyzer`.\n# """\n def __init__(self, triangulation):\n super().__init__(triangulation)\n\n def refine_triangulation(self, return_tri_index=False, subdiv=3):\n """\n Compute a uniformly refined 
triangulation *refi_triangulation* of\n the encapsulated :attr:`triangulation`.\n\n This function refines the encapsulated triangulation by splitting each\n father triangle into 4 child sub-triangles built on the edges midside\n nodes, recursing *subdiv* times. In the end, each triangle is hence\n divided into ``4**subdiv`` child triangles.\n\n Parameters\n ----------\n return_tri_index : bool, default: False\n Whether an index table indicating the father triangle index of each\n point is returned.\n subdiv : int, default: 3\n Recursion level for the subdivision.\n Each triangle is divided into ``4**subdiv`` child triangles;\n hence, the default results in 64 refined subtriangles for each\n triangle of the initial triangulation.\n\n Returns\n -------\n refi_triangulation : `~matplotlib.tri.Triangulation`\n The refined triangulation.\n found_index : int array\n Index of the initial triangulation containing triangle, for each\n point of *refi_triangulation*.\n Returned only if *return_tri_index* is set to True.\n """\n refi_triangulation = self._triangulation\n ntri = refi_triangulation.triangles.shape[0]\n\n # Computes the triangulation ancestors numbers in the reference\n # triangulation.\n ancestors = np.arange(ntri, dtype=np.int32)\n for _ in range(subdiv):\n refi_triangulation, ancestors = self._refine_triangulation_once(\n refi_triangulation, ancestors)\n refi_npts = refi_triangulation.x.shape[0]\n refi_triangles = refi_triangulation.triangles\n\n # Now we compute found_index table if needed\n if return_tri_index:\n # We have to initialize found_index with -1 because some nodes\n # may very well belong to no triangle at all, e.g., in case of\n # Delaunay Triangulation with DuplicatePointWarning.\n found_index = np.full(refi_npts, -1, dtype=np.int32)\n tri_mask = self._triangulation.mask\n if tri_mask is None:\n found_index[refi_triangles] = np.repeat(ancestors,\n 3).reshape(-1, 3)\n else:\n # There is a subtlety here: we want to avoid whenever possible\n # that 
refined points container is a masked triangle (which\n # would result in artifacts in plots).\n # So we impose the numbering from masked ancestors first,\n # then overwrite it with unmasked ancestor numbers.\n ancestor_mask = tri_mask[ancestors]\n found_index[refi_triangles[ancestor_mask, :]\n ] = np.repeat(ancestors[ancestor_mask],\n 3).reshape(-1, 3)\n found_index[refi_triangles[~ancestor_mask, :]\n ] = np.repeat(ancestors[~ancestor_mask],\n 3).reshape(-1, 3)\n return refi_triangulation, found_index\n else:\n return refi_triangulation\n\n def refine_field(self, z, triinterpolator=None, subdiv=3):\n """\n Refine a field defined on the encapsulated triangulation.\n\n Parameters\n ----------\n z : (npoints,) array-like\n Values of the field to refine, defined at the nodes of the\n encapsulated triangulation. (``n_points`` is the number of points\n in the initial triangulation)\n triinterpolator : `~matplotlib.tri.TriInterpolator`, optional\n Interpolator used for field interpolation. If not specified,\n a `~matplotlib.tri.CubicTriInterpolator` will be used.\n subdiv : int, default: 3\n Recursion level for the subdivision.\n Each triangle is divided into ``4**subdiv`` child triangles.\n\n Returns\n -------\n refi_tri : `~matplotlib.tri.Triangulation`\n The returned refined triangulation.\n refi_z : 1D array of length: *refi_tri* node count.\n The returned interpolated field (at *refi_tri* nodes).\n """\n if triinterpolator is None:\n interp = matplotlib.tri.CubicTriInterpolator(\n self._triangulation, z)\n else:\n _api.check_isinstance(matplotlib.tri.TriInterpolator,\n triinterpolator=triinterpolator)\n interp = triinterpolator\n\n refi_tri, found_index = self.refine_triangulation(\n subdiv=subdiv, return_tri_index=True)\n refi_z = interp._interpolate_multikeys(\n refi_tri.x, refi_tri.y, tri_index=found_index)[0]\n return refi_tri, refi_z\n\n @staticmethod\n def _refine_triangulation_once(triangulation, ancestors=None):\n """\n Refine a `.Triangulation` by splitting 
each triangle into 4\n child triangles built on the edge midside nodes.\n\n Masked triangles, if present, are also split, but their children are\n returned masked.\n\n If *ancestors* is not provided, returns only a new triangulation:\n child_triangulation.\n\n If the array-like key table *ancestors* is given, it shall be of shape\n (ntri,) where ntri is the number of *triangulation* triangles.\n In this case, the function returns\n (child_triangulation, child_ancestors)\n child_ancestors is defined so that the 4 child triangles share\n the same index as their father: child_ancestors.shape = (4 * ntri,).\n """\n\n x = triangulation.x\n y = triangulation.y\n\n # According to tri.triangulation doc:\n # neighbors[i, j] is the triangle that is the neighbor\n # to the edge from point index triangles[i, j] to point\n # index triangles[i, (j+1)%3].\n neighbors = triangulation.neighbors\n triangles = triangulation.triangles\n npts = np.shape(x)[0]\n ntri = np.shape(triangles)[0]\n if ancestors is not None:\n ancestors = np.asarray(ancestors)\n if np.shape(ancestors) != (ntri,):\n raise ValueError(\n "Incompatible shapes provided for "\n "triangulation.triangles and ancestors: "\n f"{np.shape(triangles)} and {np.shape(ancestors)}")\n\n # Initializing tables refi_x and refi_y of the refined triangulation\n # points\n # hint: each new midside node is shared by 2 triangles, except on\n # border edges.\n borders = np.sum(neighbors == -1)\n added_pts = (3*ntri + borders) // 2\n refi_npts = npts + added_pts\n refi_x = np.zeros(refi_npts)\n refi_y = np.zeros(refi_npts)\n\n # First part of refi_x, refi_y is just the initial points\n refi_x[:npts] = x\n refi_y[:npts] = y\n\n # Second part contains the edge midside nodes.\n # Each edge belongs to 1 triangle (if border edge) or is shared by 2\n # triangles (interior edge).\n # We first build 3 * ntri arrays of edge starting nodes (edge_elems,\n # edge_apexes); we then extract only the masters to avoid 
overlaps.\n # The so-called 'master' is the triangle with the larger index\n # The 'slave' is the triangle with the smaller index\n # (can be -1 if border edge)\n # For slave and master we will identify the apex pointing to the edge\n # start\n edge_elems = np.tile(np.arange(ntri, dtype=np.int32), 3)\n edge_apexes = np.repeat(np.arange(3, dtype=np.int32), ntri)\n edge_neighbors = neighbors[edge_elems, edge_apexes]\n mask_masters = (edge_elems > edge_neighbors)\n\n # Identifying the "masters" and adding to refi_x, refi_y vec\n masters = edge_elems[mask_masters]\n apex_masters = edge_apexes[mask_masters]\n x_add = (x[triangles[masters, apex_masters]] +\n x[triangles[masters, (apex_masters+1) % 3]]) * 0.5\n y_add = (y[triangles[masters, apex_masters]] +\n y[triangles[masters, (apex_masters+1) % 3]]) * 0.5\n refi_x[npts:] = x_add\n refi_y[npts:] = y_add\n\n # Building the new triangles; each old triangle hosts\n # 4 new triangles\n # there are 6 pts to identify per 'old' triangle, 3 new_pt_corner and\n # 3 new_pt_midside\n new_pt_corner = triangles\n\n # What is the index in refi_x, refi_y of the midside point of apex iapex\n # of elem ielem ?\n # If ielem is the apex master: simple count, given the way refi_x was\n # built.\n # If ielem is the apex slave: we do not know yet, but we soon will,\n # using the neighbors table.\n new_pt_midside = np.empty([ntri, 3], dtype=np.int32)\n cum_sum = npts\n for imid in range(3):\n mask_st_loc = (imid == apex_masters)\n n_masters_loc = np.sum(mask_st_loc)\n elem_masters_loc = masters[mask_st_loc]\n new_pt_midside[:, imid][elem_masters_loc] = np.arange(\n n_masters_loc, dtype=np.int32) + cum_sum\n cum_sum += n_masters_loc\n\n # Now dealing with slave elems.\n # for each slave element we identify the master and then the inode\n # once slave_masters is identified, slave_masters_apex is such that:\n # neighbors[slaves_masters, slave_masters_apex] == slaves\n mask_slaves = np.logical_not(mask_masters)\n slaves = 
edge_elems[mask_slaves]\n slaves_masters = edge_neighbors[mask_slaves]\n diff_table = np.abs(neighbors[slaves_masters, :] -\n np.outer(slaves, np.ones(3, dtype=np.int32)))\n slave_masters_apex = np.argmin(diff_table, axis=1)\n slaves_apex = edge_apexes[mask_slaves]\n new_pt_midside[slaves, slaves_apex] = new_pt_midside[\n slaves_masters, slave_masters_apex]\n\n # Builds the 4 child triangles\n child_triangles = np.empty([ntri*4, 3], dtype=np.int32)\n child_triangles[0::4, :] = np.vstack([\n new_pt_corner[:, 0], new_pt_midside[:, 0],\n new_pt_midside[:, 2]]).T\n child_triangles[1::4, :] = np.vstack([\n new_pt_corner[:, 1], new_pt_midside[:, 1],\n new_pt_midside[:, 0]]).T\n child_triangles[2::4, :] = np.vstack([\n new_pt_corner[:, 2], new_pt_midside[:, 2],\n new_pt_midside[:, 1]]).T\n child_triangles[3::4, :] = np.vstack([\n new_pt_midside[:, 0], new_pt_midside[:, 1],\n new_pt_midside[:, 2]]).T\n child_triangulation = Triangulation(refi_x, refi_y, child_triangles)\n\n # Builds the child mask\n if triangulation.mask is not None:\n child_triangulation.set_mask(np.repeat(triangulation.mask, 4))\n\n if ancestors is None:\n return child_triangulation\n else:\n return child_triangulation, np.repeat(ancestors, 4)\n | .venv\Lib\site-packages\matplotlib\tri\_trirefine.py | _trirefine.py | Python | 13,178 | 0.95 | 0.120521 | 0.191729 | node-utils | 399 | 2025-07-06T19:32:00.588616 | BSD-3-Clause | false | c9feef2d80a820bb7b87d7a02a2a602e |
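A numpy-only sketch of the bookkeeping in `_refine_triangulation_once` above, for the simplest case of a single unmasked triangle: its 3 border edges contribute `(3*ntri + borders) // 2` midside nodes, and each refinement level replaces every triangle with 4 children, giving `4**subdiv` children overall. The coordinates are illustrative.

```python
import numpy as np

# One triangle with vertices (0,0), (1,0), (0,1); all 3 edges are borders.
x = np.array([0.0, 1.0, 0.0])
y = np.array([0.0, 0.0, 1.0])
ntri, borders = 1, 3

# Number of midside nodes added by one refinement step.
added_pts = (3 * ntri + borders) // 2
print(added_pts)  # 3

# Midside nodes of edges (0,1), (1,2), (2,0).
mid_x = (x + np.roll(x, -1)) * 0.5
mid_y = (y + np.roll(y, -1)) * 0.5
print(mid_x, mid_y)

# With the default subdiv=3, each triangle ends up with 4**3 children.
print(4 ** 3)  # 64
```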
from typing import Literal, overload\n\nimport numpy as np\nfrom numpy.typing import ArrayLike\n\nfrom matplotlib.tri._triangulation import Triangulation\nfrom matplotlib.tri._triinterpolate import TriInterpolator\n\nclass TriRefiner:\n def __init__(self, triangulation: Triangulation) -> None: ...\n\nclass UniformTriRefiner(TriRefiner):\n def __init__(self, triangulation: Triangulation) -> None: ...\n @overload\n def refine_triangulation(\n self, *, return_tri_index: Literal[True], subdiv: int = ...\n ) -> tuple[Triangulation, np.ndarray]: ...\n @overload\n def refine_triangulation(\n self, return_tri_index: Literal[False] = ..., subdiv: int = ...\n ) -> Triangulation: ...\n @overload\n def refine_triangulation(\n self, return_tri_index: bool = ..., subdiv: int = ...\n ) -> tuple[Triangulation, np.ndarray] | Triangulation: ...\n def refine_field(\n self,\n z: ArrayLike,\n triinterpolator: TriInterpolator | None = ...,\n subdiv: int = ...,\n ) -> tuple[Triangulation, np.ndarray]: ...\n | .venv\Lib\site-packages\matplotlib\tri\_trirefine.pyi | _trirefine.pyi | Other | 1,056 | 0.85 | 0.258065 | 0 | awesome-app | 117 | 2025-01-12T01:29:58.426674 | GPL-3.0 | false | b36776dea450e4885bf52679390b68b3 |
"""\nTools for triangular grids.\n"""\n\nimport numpy as np\n\nfrom matplotlib import _api\nfrom matplotlib.tri import Triangulation\n\n\nclass TriAnalyzer:\n """\n Define basic tools for triangular mesh analysis and improvement.\n\n A TriAnalyzer encapsulates a `.Triangulation` object and provides basic\n tools for mesh analysis and mesh improvement.\n\n Attributes\n ----------\n scale_factors\n\n Parameters\n ----------\n triangulation : `~matplotlib.tri.Triangulation`\n The encapsulated triangulation to analyze.\n """\n\n def __init__(self, triangulation):\n _api.check_isinstance(Triangulation, triangulation=triangulation)\n self._triangulation = triangulation\n\n @property\n def scale_factors(self):\n """\n Factors to rescale the triangulation into a unit square.\n\n Returns\n -------\n (float, float)\n Scaling factors (kx, ky) so that the triangulation\n ``[triangulation.x * kx, triangulation.y * ky]``\n fits exactly inside a unit square.\n """\n compressed_triangles = self._triangulation.get_masked_triangles()\n node_used = (np.bincount(np.ravel(compressed_triangles),\n minlength=self._triangulation.x.size) != 0)\n return (1 / np.ptp(self._triangulation.x[node_used]),\n 1 / np.ptp(self._triangulation.y[node_used]))\n\n def circle_ratios(self, rescale=True):\n """\n Return a measure of the triangulation triangles flatness.\n\n The ratio of the incircle radius over the circumcircle radius is a\n widely used indicator of a triangle flatness.\n It is always ``<= 0.5`` and ``== 0.5`` only for equilateral\n triangles. 
Circle ratios below 0.01 denote very flat triangles.\n\n To avoid unduly low values due to a difference of scale between the 2\n axes, the triangular mesh can first be rescaled to fit inside a unit\n square with `scale_factors` (only if *rescale* is True, which is\n its default value).\n\n Parameters\n ----------\n rescale : bool, default: True\n If True, internally rescale (based on `scale_factors`), so that the\n (unmasked) triangles fit exactly inside a unit square mesh.\n\n Returns\n -------\n masked array\n Ratio of the incircle radius over the circumcircle radius, for\n each 'rescaled' triangle of the encapsulated triangulation.\n Values corresponding to masked triangles are masked out.\n\n """\n # Coords rescaling\n if rescale:\n (kx, ky) = self.scale_factors\n else:\n (kx, ky) = (1.0, 1.0)\n pts = np.vstack([self._triangulation.x*kx,\n self._triangulation.y*ky]).T\n tri_pts = pts[self._triangulation.triangles]\n # Computes the 3 side lengths\n a = tri_pts[:, 1, :] - tri_pts[:, 0, :]\n b = tri_pts[:, 2, :] - tri_pts[:, 1, :]\n c = tri_pts[:, 0, :] - tri_pts[:, 2, :]\n a = np.hypot(a[:, 0], a[:, 1])\n b = np.hypot(b[:, 0], b[:, 1])\n c = np.hypot(c[:, 0], c[:, 1])\n # circumcircle and incircle radii\n s = (a+b+c)*0.5\n prod = s*(a+b-s)*(a+c-s)*(b+c-s)\n # We have to deal with flat triangles with infinite circum_radius\n bool_flat = (prod == 0.)\n if np.any(bool_flat):\n # Pathologic flow\n ntri = tri_pts.shape[0]\n circum_radius = np.empty(ntri, dtype=np.float64)\n circum_radius[bool_flat] = np.inf\n abc = a*b*c\n circum_radius[~bool_flat] = abc[~bool_flat] / (\n 4.0*np.sqrt(prod[~bool_flat]))\n else:\n # Normal optimized flow\n circum_radius = (a*b*c) / (4.0*np.sqrt(prod))\n in_radius = (a*b*c) / (4.0*circum_radius*s)\n circle_ratio = in_radius/circum_radius\n mask = self._triangulation.mask\n if mask is None:\n return circle_ratio\n else:\n return np.ma.array(circle_ratio, mask=mask)\n\n def get_flat_tri_mask(self, min_circle_ratio=0.01, rescale=True):\n 
"""\n Eliminate excessively flat border triangles from the triangulation.\n\n Returns a mask *new_mask* which can be used to clean the encapsulated\n triangulation of its border-located flat triangles\n (according to their :meth:`circle_ratios`).\n This mask is meant to be subsequently applied to the triangulation\n using `.Triangulation.set_mask`.\n *new_mask* is an extension of the initial triangulation mask\n in the sense that an initially masked triangle will remain masked.\n\n The *new_mask* array is computed recursively; at each step flat\n triangles are removed only if they share a side with the current mesh\n border. Thus, no new holes in the triangulated domain will be created.\n\n Parameters\n ----------\n min_circle_ratio : float, default: 0.01\n Border triangles with incircle/circumcircle radii ratio r/R will\n be removed if r/R < *min_circle_ratio*.\n rescale : bool, default: True\n If True, first internally rescale (based on `scale_factors`) so\n that the (unmasked) triangles fit exactly inside a unit square\n mesh. 
This rescaling accounts for the difference of scale which\n might exist between the 2 axes.\n\n Returns\n -------\n array of bool\n Mask to apply to encapsulated triangulation.\n All the initially masked triangles remain masked in the\n *new_mask*.\n\n Notes\n -----\n The rationale behind this function is that a Delaunay\n triangulation - of an unstructured set of points - sometimes contains\n almost flat triangles at its border, leading to artifacts in plots\n (especially for high-resolution contouring).\n Masked with the computed *new_mask*, the encapsulated\n triangulation would contain no more unmasked border triangles\n with a circle ratio below *min_circle_ratio*, thus improving the\n mesh quality for subsequent plots or interpolation.\n """\n # Recursively computes the mask_current_borders, true if a triangle is\n # at the border of the mesh OR touching the border through a chain of\n # invalid-aspect-ratio triangles.\n ntri = self._triangulation.triangles.shape[0]\n mask_bad_ratio = self.circle_ratios(rescale) < min_circle_ratio\n\n current_mask = self._triangulation.mask\n if current_mask is None:\n current_mask = np.zeros(ntri, dtype=bool)\n valid_neighbors = np.copy(self._triangulation.neighbors)\n renum_neighbors = np.arange(ntri, dtype=np.int32)\n nadd = -1\n while nadd != 0:\n # The active wavefront is the triangles from the border (unmasked\n # but with at least 1 neighbor equal to -1).\n wavefront = (np.min(valid_neighbors, axis=1) == -1) & ~current_mask\n # The elements from the active wavefront will be masked if their\n # circle ratio is bad.\n added_mask = wavefront & mask_bad_ratio\n current_mask = added_mask | current_mask\n nadd = np.sum(added_mask)\n\n # now we have to update the tables valid_neighbors\n valid_neighbors[added_mask, :] = -1\n renum_neighbors[added_mask] = -1\n valid_neighbors = np.where(valid_neighbors == -1, -1,\n renum_neighbors[valid_neighbors])\n\n return np.ma.filled(current_mask, True)\n\n def 
_get_compressed_triangulation(self):\n """\n Compress (if masked) the encapsulated triangulation.\n\n Returns minimal-length triangles array (*compressed_triangles*) and\n coordinates arrays (*compressed_x*, *compressed_y*) that can still\n describe the unmasked triangles of the encapsulated triangulation.\n\n Returns\n -------\n compressed_triangles : array-like\n the returned compressed triangulation triangles\n compressed_x : array-like\n the returned compressed triangulation 1st coordinate\n compressed_y : array-like\n the returned compressed triangulation 2nd coordinate\n tri_renum : int array\n renumbering table to translate the triangle numbers from the\n encapsulated triangulation into the new (compressed) renumbering.\n -1 for masked triangles (deleted from *compressed_triangles*).\n node_renum : int array\n renumbering table to translate the point numbers from the\n encapsulated triangulation into the new (compressed) renumbering.\n -1 for unused points (i.e. those deleted from *compressed_x* and\n *compressed_y*).\n\n """\n # Valid triangles and renumbering\n tri_mask = self._triangulation.mask\n compressed_triangles = self._triangulation.get_masked_triangles()\n ntri = self._triangulation.triangles.shape[0]\n if tri_mask is not None:\n tri_renum = self._total_to_compress_renum(~tri_mask)\n else:\n tri_renum = np.arange(ntri, dtype=np.int32)\n\n # Valid nodes and renumbering\n valid_node = (np.bincount(np.ravel(compressed_triangles),\n minlength=self._triangulation.x.size) != 0)\n compressed_x = self._triangulation.x[valid_node]\n compressed_y = self._triangulation.y[valid_node]\n node_renum = self._total_to_compress_renum(valid_node)\n\n # Now renumbering the valid triangles nodes\n compressed_triangles = node_renum[compressed_triangles]\n\n return (compressed_triangles, compressed_x, compressed_y, tri_renum,\n node_renum)\n\n @staticmethod\n def _total_to_compress_renum(valid):\n """\n Parameters\n ----------\n valid : 1D bool array\n Validity 
mask.\n\n Returns\n -------\n int array\n Array so that (`valid_array` being a compressed array\n based on a `masked_array` with mask ~*valid*):\n\n - For all i with valid[i] = True:\n valid_array[renum[i]] = masked_array[i]\n - For all i with valid[i] = False:\n renum[i] = -1 (invalid value)\n """\n renum = np.full(np.size(valid), -1, dtype=np.int32)\n n_valid = np.sum(valid)\n renum[valid] = np.arange(n_valid, dtype=np.int32)\n return renum\n | .venv\Lib\site-packages\matplotlib\tri\_tritools.py | _tritools.py | Python | 10,575 | 0.95 | 0.114068 | 0.087719 | node-utils | 666 | 2025-01-08T21:18:10.415648 | Apache-2.0 | false | 39d1ae53c40258f7bd17a1879f0bf2d2 |
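A numpy-only sketch of the incircle/circumcircle math used by `circle_ratios` in `_tritools.py` above, checked against the one case the docstring pins down: an equilateral triangle, whose ratio r/R is exactly 0.5 (the maximum). The side length of 1.0 is illustrative.

```python
import numpy as np

# Equilateral triangle with unit side lengths.
a = b = c = 1.0
s = (a + b + c) * 0.5                                # semi-perimeter
prod = s * (a + b - s) * (a + c - s) * (b + c - s)   # = area**2 (Heron)

# Same formulas as circle_ratios: R = abc / (4*area), r = area / s.
circum_radius = (a * b * c) / (4.0 * np.sqrt(prod))
in_radius = (a * b * c) / (4.0 * circum_radius * s)
print(in_radius / circum_radius)  # ~0.5 for an equilateral triangle
```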
from matplotlib.tri import Triangulation\n\nimport numpy as np\n\nclass TriAnalyzer:\n def __init__(self, triangulation: Triangulation) -> None: ...\n @property\n def scale_factors(self) -> tuple[float, float]: ...\n def circle_ratios(self, rescale: bool = ...) -> np.ndarray: ...\n def get_flat_tri_mask(\n self, min_circle_ratio: float = ..., rescale: bool = ...\n ) -> np.ndarray: ...\n | .venv\Lib\site-packages\matplotlib\tri\_tritools.pyi | _tritools.pyi | Other | 402 | 0.85 | 0.416667 | 0 | awesome-app | 409 | 2025-03-27T02:52:18.156392 | BSD-3-Clause | false | 29ab742af970ea0d2b766c3a59ca8ac7 |
"""\nUnstructured triangular grid functions.\n"""\n\nfrom ._triangulation import Triangulation\nfrom ._tricontour import TriContourSet, tricontour, tricontourf\nfrom ._trifinder import TriFinder, TrapezoidMapTriFinder\nfrom ._triinterpolate import (TriInterpolator, LinearTriInterpolator,\n CubicTriInterpolator)\nfrom ._tripcolor import tripcolor\nfrom ._triplot import triplot\nfrom ._trirefine import TriRefiner, UniformTriRefiner\nfrom ._tritools import TriAnalyzer\n\n\n__all__ = ["Triangulation",\n "TriContourSet", "tricontour", "tricontourf",\n "TriFinder", "TrapezoidMapTriFinder",\n "TriInterpolator", "LinearTriInterpolator", "CubicTriInterpolator",\n "tripcolor",\n "triplot",\n "TriRefiner", "UniformTriRefiner",\n "TriAnalyzer"]\n | .venv\Lib\site-packages\matplotlib\tri\__init__.py | __init__.py | Python | 820 | 0.85 | 0 | 0 | vue-tools | 605 | 2025-06-04T18:24:53.433084 | Apache-2.0 | false | ed0ea5ee986eee4c878de67bfcec8e16 |
"""\nHelper functions for deprecating parts of the Matplotlib API.\n\nThis documentation is only relevant for Matplotlib developers, not for users.\n\n.. warning::\n\n This module is for internal use only. Do not use it in your own code.\n We may change the API at any time with no warning.\n\n"""\n\nimport contextlib\nimport functools\nimport inspect\nimport math\nimport warnings\n\n\nclass MatplotlibDeprecationWarning(DeprecationWarning):\n """A class for issuing deprecation warnings for Matplotlib users."""\n\n\ndef _generate_deprecation_warning(\n since, message='', name='', alternative='', pending=False, obj_type='',\n addendum='', *, removal=''):\n if pending:\n if removal:\n raise ValueError("A pending deprecation cannot have a scheduled removal")\n elif removal == '':\n macro, meso, *_ = since.split('.')\n removal = f'{macro}.{int(meso) + 2}'\n if not message:\n message = (\n ("The %(name)s %(obj_type)s" if obj_type else "%(name)s") +\n (" will be deprecated in a future version" if pending else\n (" was deprecated in Matplotlib %(since)s" +\n (" and will be removed in %(removal)s" if removal else ""))) +\n "." +\n (" Use %(alternative)s instead." if alternative else "") +\n (" %(addendum)s" if addendum else ""))\n warning_cls = PendingDeprecationWarning if pending else MatplotlibDeprecationWarning\n return warning_cls(message % dict(\n func=name, name=name, obj_type=obj_type, since=since, removal=removal,\n alternative=alternative, addendum=addendum))\n\n\ndef warn_deprecated(\n since, *, message='', name='', alternative='', pending=False,\n obj_type='', addendum='', removal=''):\n """\n Display a standardized deprecation.\n\n Parameters\n ----------\n since : str\n The release at which this API became deprecated.\n message : str, optional\n Override the default deprecation message. 
The ``%(since)s``,\n ``%(name)s``, ``%(alternative)s``, ``%(obj_type)s``, ``%(addendum)s``,\n and ``%(removal)s`` format specifiers will be replaced by the values\n of the respective arguments passed to this function.\n name : str, optional\n The name of the deprecated object.\n alternative : str, optional\n An alternative API that the user may use in place of the deprecated\n API. The deprecation warning will tell the user about this alternative\n if provided.\n pending : bool, optional\n If True, uses a PendingDeprecationWarning instead of a\n DeprecationWarning. Cannot be used together with *removal*.\n obj_type : str, optional\n The object type being deprecated.\n addendum : str, optional\n Additional text appended directly to the final message.\n removal : str, optional\n The expected removal version. With the default (an empty string), a\n removal version is automatically computed from *since*. Set to other\n Falsy values to not schedule a removal date. Cannot be used together\n with *pending*.\n\n Examples\n --------\n ::\n\n # To warn of the deprecation of "matplotlib.name_of_module"\n warn_deprecated('1.4.0', name='matplotlib.name_of_module',\n obj_type='module')\n """\n warning = _generate_deprecation_warning(\n since, message, name, alternative, pending, obj_type, addendum,\n removal=removal)\n from . 
import warn_external\n warn_external(warning, category=MatplotlibDeprecationWarning)\n\n\ndef deprecated(since, *, message='', name='', alternative='', pending=False,\n obj_type=None, addendum='', removal=''):\n """\n Decorator to mark a function, a class, or a property as deprecated.\n\n When deprecating a classmethod, a staticmethod, or a property, the\n ``@deprecated`` decorator should go *under* ``@classmethod`` and\n ``@staticmethod`` (i.e., `deprecated` should directly decorate the\n underlying callable), but *over* ``@property``.\n\n When deprecating a class ``C`` intended to be used as a base class in a\n multiple inheritance hierarchy, ``C`` *must* define an ``__init__`` method\n (if ``C`` instead inherited its ``__init__`` from its own base class, then\n ``@deprecated`` would mess up ``__init__`` inheritance when installing its\n own (deprecation-emitting) ``C.__init__``).\n\n Parameters are the same as for `warn_deprecated`, except that *obj_type*\n defaults to 'class' if decorating a class, 'attribute' if decorating a\n property, and 'function' otherwise.\n\n Examples\n --------\n ::\n\n @deprecated('1.4.0')\n def the_function_to_deprecate():\n pass\n """\n\n def deprecate(obj, message=message, name=name, alternative=alternative,\n pending=pending, obj_type=obj_type, addendum=addendum):\n from matplotlib._api import classproperty\n\n if isinstance(obj, type):\n if obj_type is None:\n obj_type = "class"\n func = obj.__init__\n name = name or obj.__name__\n old_doc = obj.__doc__\n\n def finalize(wrapper, new_doc):\n try:\n obj.__doc__ = new_doc\n except AttributeError: # Can't set on some extension objects.\n pass\n obj.__init__ = functools.wraps(obj.__init__)(wrapper)\n return obj\n\n elif isinstance(obj, (property, classproperty)):\n if obj_type is None:\n obj_type = "attribute"\n func = None\n name = name or obj.fget.__name__\n old_doc = obj.__doc__\n\n class _deprecated_property(type(obj)):\n def __get__(self, instance, owner=None):\n if instance is 
not None or owner is not None \\n and isinstance(self, classproperty):\n emit_warning()\n return super().__get__(instance, owner)\n\n def __set__(self, instance, value):\n if instance is not None:\n emit_warning()\n return super().__set__(instance, value)\n\n def __delete__(self, instance):\n if instance is not None:\n emit_warning()\n return super().__delete__(instance)\n\n def __set_name__(self, owner, set_name):\n nonlocal name\n if name == "<lambda>":\n name = set_name\n\n def finalize(_, new_doc):\n return _deprecated_property(\n fget=obj.fget, fset=obj.fset, fdel=obj.fdel, doc=new_doc)\n\n else:\n if obj_type is None:\n obj_type = "function"\n func = obj\n name = name or obj.__name__\n old_doc = func.__doc__\n\n def finalize(wrapper, new_doc):\n wrapper = functools.wraps(func)(wrapper)\n wrapper.__doc__ = new_doc\n return wrapper\n\n def emit_warning():\n warn_deprecated(\n since, message=message, name=name, alternative=alternative,\n pending=pending, obj_type=obj_type, addendum=addendum,\n removal=removal)\n\n def wrapper(*args, **kwargs):\n emit_warning()\n return func(*args, **kwargs)\n\n old_doc = inspect.cleandoc(old_doc or '').strip('\n')\n\n notes_header = '\nNotes\n-----'\n second_arg = ' '.join([t.strip() for t in\n (message, f"Use {alternative} instead."\n if alternative else "", addendum) if t])\n new_doc = (f"[*Deprecated*] {old_doc}\n"\n f"{notes_header if notes_header not in old_doc else ''}\n"\n f".. 
deprecated:: {since}\n"\n f" {second_arg}")\n\n if not old_doc:\n # This is to prevent a spurious 'unexpected unindent' warning from\n # docutils when the original docstring was blank.\n new_doc += r'\ '\n\n return finalize(wrapper, new_doc)\n\n return deprecate\n\n\nclass deprecate_privatize_attribute:\n """\n Helper to deprecate public access to an attribute (or method).\n\n This helper should only be used at class scope, as follows::\n\n class Foo:\n attr = _deprecate_privatize_attribute(*args, **kwargs)\n\n where *all* parameters are forwarded to `deprecated`. This form makes\n ``attr`` a property which forwards read and write access to ``self._attr``\n (same name but with a leading underscore), with a deprecation warning.\n Note that the attribute name is derived from *the name this helper is\n assigned to*. This helper also works for deprecating methods.\n """\n\n def __init__(self, *args, **kwargs):\n self.deprecator = deprecated(*args, **kwargs)\n\n def __set_name__(self, owner, name):\n setattr(owner, name, self.deprecator(\n property(lambda self: getattr(self, f"_{name}"),\n lambda self, value: setattr(self, f"_{name}", value)),\n name=name))\n\n\n# Used by _copy_docstring_and_deprecators to redecorate pyplot wrappers and\n# boilerplate.py to retrieve original signatures. It may seem natural to store\n# this information as an attribute on the wrapper, but if the wrapper gets\n# itself functools.wraps()ed, then such attributes are silently propagated to\n# the outer wrapper, which is not desired.\nDECORATORS = {}\n\n\ndef rename_parameter(since, old, new, func=None):\n """\n Decorator indicating that parameter *old* of *func* is renamed to *new*.\n\n The actual implementation of *func* should use *new*, not *old*. 
If *old*\n is passed to *func*, a DeprecationWarning is emitted, and its value is\n used, even if *new* is also passed by keyword (this is to simplify pyplot\n wrapper functions, which always pass *new* explicitly to the Axes method).\n If *new* is also passed but positionally, a TypeError will be raised by the\n underlying function during argument binding.\n\n Examples\n --------\n ::\n\n @_api.rename_parameter("3.1", "bad_name", "good_name")\n def func(good_name): ...\n """\n\n decorator = functools.partial(rename_parameter, since, old, new)\n\n if func is None:\n return decorator\n\n signature = inspect.signature(func)\n assert old not in signature.parameters, (\n f"Matplotlib internal error: {old!r} cannot be a parameter for "\n f"{func.__name__}()")\n assert new in signature.parameters, (\n f"Matplotlib internal error: {new!r} must be a parameter for "\n f"{func.__name__}()")\n\n @functools.wraps(func)\n def wrapper(*args, **kwargs):\n if old in kwargs:\n warn_deprecated(\n since, message=f"The {old!r} parameter of {func.__name__}() "\n f"has been renamed {new!r} since Matplotlib {since}; support "\n f"for the old name will be dropped in %(removal)s.")\n kwargs[new] = kwargs.pop(old)\n return func(*args, **kwargs)\n\n # wrapper() must keep the same documented signature as func(): if we\n # instead made both *old* and *new* appear in wrapper()'s signature, they\n # would both show up in the pyplot function for an Axes method as well and\n # pyplot would explicitly pass both arguments to the Axes method.\n\n DECORATORS[wrapper] = decorator\n return wrapper\n\n\nclass _deprecated_parameter_class:\n def __repr__(self):\n return "<deprecated parameter>"\n\n\n_deprecated_parameter = _deprecated_parameter_class()\n\n\ndef delete_parameter(since, name, func=None, **kwargs):\n """\n Decorator indicating that parameter *name* of *func* is being deprecated.\n\n The actual implementation of *func* should keep the *name* parameter in its\n signature, or accept a 
``**kwargs`` argument (through which *name* would be\n passed).\n\n Parameters that come after the deprecated parameter effectively become\n keyword-only (as they cannot be passed positionally without triggering the\n DeprecationWarning on the deprecated parameter), and should be marked as\n such after the deprecation period has passed and the deprecated parameter\n is removed.\n\n Parameters other than *since*, *name*, and *func* are keyword-only and\n forwarded to `.warn_deprecated`.\n\n Examples\n --------\n ::\n\n @_api.delete_parameter("3.1", "unused")\n def func(used_arg, other_arg, unused, more_args): ...\n """\n\n decorator = functools.partial(delete_parameter, since, name, **kwargs)\n\n if func is None:\n return decorator\n\n signature = inspect.signature(func)\n # Name of `**kwargs` parameter of the decorated function, typically\n # "kwargs" if such a parameter exists, or None if the decorated function\n # doesn't accept `**kwargs`.\n kwargs_name = next((param.name for param in signature.parameters.values()\n if param.kind == inspect.Parameter.VAR_KEYWORD), None)\n if name in signature.parameters:\n kind = signature.parameters[name].kind\n is_varargs = kind is inspect.Parameter.VAR_POSITIONAL\n is_varkwargs = kind is inspect.Parameter.VAR_KEYWORD\n if not is_varargs and not is_varkwargs:\n name_idx = (\n # Deprecated parameter can't be passed positionally.\n math.inf if kind is inspect.Parameter.KEYWORD_ONLY\n # If call site has no more than this number of parameters, the\n # deprecated parameter can't have been passed positionally.\n else [*signature.parameters].index(name))\n func.__signature__ = signature = signature.replace(parameters=[\n param.replace(default=_deprecated_parameter)\n if param.name == name else param\n for param in signature.parameters.values()])\n else:\n name_idx = -1 # Deprecated parameter can always have been passed.\n else:\n is_varargs = is_varkwargs = False\n # Deprecated parameter can't be passed positionally.\n name_idx = 
math.inf\n assert kwargs_name, (\n f"Matplotlib internal error: {name!r} must be a parameter for "\n f"{func.__name__}()")\n\n addendum = kwargs.pop('addendum', None)\n\n @functools.wraps(func)\n def wrapper(*inner_args, **inner_kwargs):\n if len(inner_args) <= name_idx and name not in inner_kwargs:\n # Early return in the simple, non-deprecated case (much faster than\n # calling bind()).\n return func(*inner_args, **inner_kwargs)\n arguments = signature.bind(*inner_args, **inner_kwargs).arguments\n if is_varargs and arguments.get(name):\n warn_deprecated(\n since, message=f"Additional positional arguments to "\n f"{func.__name__}() are deprecated since %(since)s and "\n f"support for them will be removed in %(removal)s.")\n elif is_varkwargs and arguments.get(name):\n warn_deprecated(\n since, message=f"Additional keyword arguments to "\n f"{func.__name__}() are deprecated since %(since)s and "\n f"support for them will be removed in %(removal)s.")\n # We cannot just check `name not in arguments` because the pyplot\n # wrappers always pass all arguments explicitly.\n elif any(name in d and d[name] != _deprecated_parameter\n for d in [arguments, arguments.get(kwargs_name, {})]):\n deprecation_addendum = (\n f"If any parameter follows {name!r}, they should be passed as "\n f"keyword, not positionally.")\n warn_deprecated(\n since,\n name=repr(name),\n obj_type=f"parameter of {func.__name__}()",\n addendum=(addendum + " " + deprecation_addendum) if addendum\n else deprecation_addendum,\n **kwargs)\n return func(*inner_args, **inner_kwargs)\n\n DECORATORS[wrapper] = decorator\n return wrapper\n\n\ndef make_keyword_only(since, name, func=None):\n """\n Decorator indicating that passing parameter *name* (or any of the following\n ones) positionally to *func* is being deprecated.\n\n When used on a method that has a pyplot wrapper, this should be the\n outermost decorator, so that :file:`boilerplate.py` can access the original\n signature.\n """\n\n decorator = 
functools.partial(make_keyword_only, since, name)\n\n if func is None:\n return decorator\n\n signature = inspect.signature(func)\n POK = inspect.Parameter.POSITIONAL_OR_KEYWORD\n KWO = inspect.Parameter.KEYWORD_ONLY\n assert (name in signature.parameters\n and signature.parameters[name].kind == POK), (\n f"Matplotlib internal error: {name!r} must be a positional-or-keyword "\n f"parameter for {func.__name__}(). If this error happens on a function with a "\n f"pyplot wrapper, make sure make_keyword_only() is the outermost decorator.")\n names = [*signature.parameters]\n name_idx = names.index(name)\n kwonly = [name for name in names[name_idx:]\n if signature.parameters[name].kind == POK]\n\n @functools.wraps(func)\n def wrapper(*args, **kwargs):\n # Don't use signature.bind here, as it would fail when stacked with\n # rename_parameter and an "old" argument name is passed in\n # (signature.bind would fail, but the actual call would succeed).\n if len(args) > name_idx:\n warn_deprecated(\n since, message="Passing the %(name)s %(obj_type)s "\n "positionally is deprecated since Matplotlib %(since)s; the "\n "parameter will become keyword-only in %(removal)s.",\n name=name, obj_type=f"parameter of {func.__name__}()")\n return func(*args, **kwargs)\n\n # Don't modify *func*'s signature, as boilerplate.py needs it.\n wrapper.__signature__ = signature.replace(parameters=[\n param.replace(kind=KWO) if param.name in kwonly else param\n for param in signature.parameters.values()])\n DECORATORS[wrapper] = decorator\n return wrapper\n\n\ndef deprecate_method_override(method, obj, *, allow_empty=False, **kwargs):\n """\n Return ``obj.method`` with a deprecation if it was overridden, else None.\n\n Parameters\n ----------\n method\n An unbound method, i.e. an expression of the form\n ``Class.method_name``. 
Remember that within the body of a method, one\n can always use ``__class__`` to refer to the class that is currently\n being defined.\n obj\n Either an object of the class where *method* is defined, or a subclass\n of that class.\n allow_empty : bool, default: False\n Whether to allow overrides by "empty" methods without emitting a\n warning.\n **kwargs\n Additional parameters passed to `warn_deprecated` to generate the\n deprecation warning; must at least include the "since" key.\n """\n\n def empty(): pass\n def empty_with_docstring(): """doc"""\n\n name = method.__name__\n bound_child = getattr(obj, name)\n bound_base = (\n method # If obj is a class, then we need to use unbound methods.\n if isinstance(bound_child, type(empty)) and isinstance(obj, type)\n else method.__get__(obj))\n if (bound_child != bound_base\n and (not allow_empty\n or (getattr(getattr(bound_child, "__code__", None),\n "co_code", None)\n not in [empty.__code__.co_code,\n empty_with_docstring.__code__.co_code]))):\n warn_deprecated(**{"name": name, "obj_type": "method", **kwargs})\n return bound_child\n return None\n\n\n@contextlib.contextmanager\ndef suppress_matplotlib_deprecation_warning():\n with warnings.catch_warnings():\n warnings.simplefilter("ignore", MatplotlibDeprecationWarning)\n yield\n | .venv\Lib\site-packages\matplotlib\_api\deprecation.py | deprecation.py | Python | 20,091 | 0.95 | 0.249509 | 0.069378 | node-utils | 151 | 2024-08-24T00:18:56.827644 | BSD-3-Clause | false | 7abc4427880f13e071c33c7f21f575f1 |
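The `deprecation.py` row above documents a decorator-based deprecation pattern (`@deprecated` wrapping the callable and emitting a warning on each call). A minimal standalone sketch of that pattern, using only the stdlib and covering just the plain-function case — not matplotlib's actual implementation, which also handles classes and properties:

```python
import functools
import warnings


class MatplotlibDeprecationWarning(DeprecationWarning):
    """Stand-in for matplotlib._api.MatplotlibDeprecationWarning."""


def deprecated(since, *, alternative=''):
    # Sketch of the function branch of the decorator documented above:
    # wrap the callable so each call emits a deprecation warning first.
    def deprecate(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            msg = (f"{func.__name__} was deprecated in Matplotlib {since}."
                   + (f" Use {alternative} instead." if alternative else ""))
            warnings.warn(msg, MatplotlibDeprecationWarning, stacklevel=2)
            return func(*args, **kwargs)
        return wrapper
    return deprecate


@deprecated("3.5", alternative="new_func")
def old_func(x):
    # Hypothetical deprecated function used for illustration only.
    return x * 2
```

Calling `old_func(3)` still returns `6`, but also emits a `MatplotlibDeprecationWarning` pointing at the caller (via `stacklevel=2`), which is the behavior the real `emit_warning()`/`wrapper()` pair in the source implements.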
from collections.abc import Callable\nimport contextlib\nfrom typing import Any, Literal, ParamSpec, TypedDict, TypeVar, overload\nfrom typing_extensions import (\n Unpack, # < Py 3.11\n)\n\n_P = ParamSpec("_P")\n_R = TypeVar("_R")\n_T = TypeVar("_T")\n\nclass MatplotlibDeprecationWarning(DeprecationWarning): ...\n\nclass DeprecationKwargs(TypedDict, total=False):\n message: str\n alternative: str\n pending: bool\n obj_type: str\n addendum: str\n removal: str | Literal[False]\n\nclass NamedDeprecationKwargs(DeprecationKwargs, total=False):\n name: str\n\ndef warn_deprecated(since: str, **kwargs: Unpack[NamedDeprecationKwargs]) -> None: ...\ndef deprecated(\n since: str, **kwargs: Unpack[NamedDeprecationKwargs]\n) -> Callable[[_T], _T]: ...\n\nclass deprecate_privatize_attribute(Any):\n def __init__(self, since: str, **kwargs: Unpack[NamedDeprecationKwargs]): ...\n def __set_name__(self, owner: type[object], name: str) -> None: ...\n\nDECORATORS: dict[Callable, Callable] = ...\n\n@overload\ndef rename_parameter(\n since: str, old: str, new: str, func: None = ...\n) -> Callable[[Callable[_P, _R]], Callable[_P, _R]]: ...\n@overload\ndef rename_parameter(\n since: str, old: str, new: str, func: Callable[_P, _R]\n) -> Callable[_P, _R]: ...\n\nclass _deprecated_parameter_class: ...\n\n_deprecated_parameter: _deprecated_parameter_class\n\n@overload\ndef delete_parameter(\n since: str, name: str, func: None = ..., **kwargs: Unpack[DeprecationKwargs]\n) -> Callable[[Callable[_P, _R]], Callable[_P, _R]]: ...\n@overload\ndef delete_parameter(\n since: str, name: str, func: Callable[_P, _R], **kwargs: Unpack[DeprecationKwargs]\n) -> Callable[_P, _R]: ...\n@overload\ndef make_keyword_only(\n since: str, name: str, func: None = ...\n) -> Callable[[Callable[_P, _R]], Callable[_P, _R]]: ...\n@overload\ndef make_keyword_only(\n since: str, name: str, func: Callable[_P, _R]\n) -> Callable[_P, _R]: ...\ndef deprecate_method_override(\n method: Callable[_P, _R],\n obj: object | 
type,\n *,\n allow_empty: bool = ...,\n since: str,\n **kwargs: Unpack[NamedDeprecationKwargs]\n) -> Callable[_P, _R]: ...\ndef suppress_matplotlib_deprecation_warning() -> (\n contextlib.AbstractContextManager[None]\n): ...\n | .venv\Lib\site-packages\matplotlib\_api\deprecation.pyi | deprecation.pyi | Other | 2,217 | 0.95 | 0.226667 | 0.03125 | react-lib | 203 | 2024-11-12T12:16:33.316959 | BSD-3-Clause | false | b87e2d768a2858c7ff834e0044dc7ba4 |
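The stub row above types `removal` as `str | Literal[False]`, and the implementation row computes a default removal version when it is left as the empty string: two feature releases after *since*. A small sketch mirroring that computation from `_generate_deprecation_warning`:

```python
def default_removal(since):
    # Mirrors the default-removal logic in _generate_deprecation_warning:
    # take major.minor from *since* and schedule removal two minors later.
    macro, meso, *_ = since.split('.')
    return f"{macro}.{int(meso) + 2}"
```

So a deprecation introduced in `"3.5"` (or `"3.5.1"`) defaults to removal in `"3.7"`, unless *removal* is passed explicitly or set to a falsy value to leave removal unscheduled.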
"""\nHelper functions for managing the Matplotlib API.\n\nThis documentation is only relevant for Matplotlib developers, not for users.\n\n.. warning::\n\n This module and its submodules are for internal use only. Do not use them\n in your own code. We may change the API at any time with no warning.\n\n"""\n\nimport functools\nimport itertools\nimport pathlib\nimport re\nimport sys\nimport warnings\n\nfrom .deprecation import ( # noqa: F401\n deprecated, warn_deprecated,\n rename_parameter, delete_parameter, make_keyword_only,\n deprecate_method_override, deprecate_privatize_attribute,\n suppress_matplotlib_deprecation_warning,\n MatplotlibDeprecationWarning)\n\n\nclass classproperty:\n """\n Like `property`, but also triggers on access via the class, and it is the\n *class* that's passed as argument.\n\n Examples\n --------\n ::\n\n class C:\n @classproperty\n def foo(cls):\n return cls.__name__\n\n assert C.foo == "C"\n """\n\n def __init__(self, fget, fset=None, fdel=None, doc=None):\n self._fget = fget\n if fset is not None or fdel is not None:\n raise ValueError('classproperty only implements fget.')\n self.fset = fset\n self.fdel = fdel\n # docs are ignored for now\n self._doc = doc\n\n def __get__(self, instance, owner):\n return self._fget(owner)\n\n @property\n def fget(self):\n return self._fget\n\n\n# In the following check_foo() functions, the first parameter is positional-only to make\n# e.g. 
`_api.check_isinstance([...], types=foo)` work.\n\ndef check_isinstance(types, /, **kwargs):\n """\n For each *key, value* pair in *kwargs*, check that *value* is an instance\n of one of *types*; if not, raise an appropriate TypeError.\n\n As a special case, a ``None`` entry in *types* is treated as NoneType.\n\n Examples\n --------\n >>> _api.check_isinstance((SomeClass, None), arg=arg)\n """\n none_type = type(None)\n types = ((types,) if isinstance(types, type) else\n (none_type,) if types is None else\n tuple(none_type if tp is None else tp for tp in types))\n\n def type_name(tp):\n return ("None" if tp is none_type\n else tp.__qualname__ if tp.__module__ == "builtins"\n else f"{tp.__module__}.{tp.__qualname__}")\n\n for k, v in kwargs.items():\n if not isinstance(v, types):\n names = [*map(type_name, types)]\n if "None" in names: # Move it to the end for better wording.\n names.remove("None")\n names.append("None")\n raise TypeError(\n "{!r} must be an instance of {}, not a {}".format(\n k,\n ", ".join(names[:-1]) + " or " + names[-1]\n if len(names) > 1 else names[0],\n type_name(type(v))))\n\n\ndef check_in_list(values, /, *, _print_supported_values=True, **kwargs):\n """\n For each *key, value* pair in *kwargs*, check that *value* is in *values*;\n if not, raise an appropriate ValueError.\n\n Parameters\n ----------\n values : iterable\n Sequence of values to check on.\n _print_supported_values : bool, default: True\n Whether to print *values* when raising ValueError.\n **kwargs : dict\n *key, value* pairs as keyword arguments to find in *values*.\n\n Raises\n ------\n ValueError\n If any *value* in *kwargs* is not found in *values*.\n\n Examples\n --------\n >>> _api.check_in_list(["foo", "bar"], arg=arg, other_arg=other_arg)\n """\n if not kwargs:\n raise TypeError("No argument to check!")\n for key, val in kwargs.items():\n if val not in values:\n msg = f"{val!r} is not a valid value for {key}"\n if _print_supported_values:\n msg += f"; supported values 
are {', '.join(map(repr, values))}"\n raise ValueError(msg)\n\n\ndef check_shape(shape, /, **kwargs):\n """\n For each *key, value* pair in *kwargs*, check that *value* has the shape *shape*;\n if not, raise an appropriate ValueError.\n\n *None* in the shape is treated as a "free" size that can have any length.\n e.g. (None, 2) -> (N, 2)\n\n The values checked must be numpy arrays.\n\n Examples\n --------\n To check for (N, 2) shaped arrays\n\n >>> _api.check_shape((None, 2), arg=arg, other_arg=other_arg)\n """\n for k, v in kwargs.items():\n data_shape = v.shape\n\n if (len(data_shape) != len(shape)\n or any(s != t and t is not None for s, t in zip(data_shape, shape))):\n dim_labels = iter(itertools.chain(\n 'NMLKJIH',\n (f"D{i}" for i in itertools.count())))\n text_shape = ", ".join([str(n) if n is not None else next(dim_labels)\n for n in shape[::-1]][::-1])\n if len(shape) == 1:\n text_shape += ","\n\n raise ValueError(\n f"{k!r} must be {len(shape)}D with shape ({text_shape}), "\n f"but your input has shape {v.shape}"\n )\n\n\ndef check_getitem(mapping, /, **kwargs):\n """\n *kwargs* must consist of a single *key, value* pair. 
If *key* is in\n *mapping*, return ``mapping[value]``; else, raise an appropriate\n ValueError.\n\n Examples\n --------\n >>> _api.check_getitem({"foo": "bar"}, arg=arg)\n """\n if len(kwargs) != 1:\n raise ValueError("check_getitem takes a single keyword argument")\n (k, v), = kwargs.items()\n try:\n return mapping[v]\n except KeyError:\n raise ValueError(\n f"{v!r} is not a valid value for {k}; supported values are "\n f"{', '.join(map(repr, mapping))}") from None\n\n\ndef caching_module_getattr(cls):\n """\n Helper decorator for implementing module-level ``__getattr__`` as a class.\n\n This decorator must be used at the module toplevel as follows::\n\n @caching_module_getattr\n class __getattr__: # The class *must* be named ``__getattr__``.\n @property # Only properties are taken into account.\n def name(self): ...\n\n The ``__getattr__`` class will be replaced by a ``__getattr__``\n function such that trying to access ``name`` on the module will\n resolve the corresponding property (which may be decorated e.g. with\n ``_api.deprecated`` for deprecating module globals). The properties are\n all implicitly cached. 
Moreover, a suitable AttributeError is generated\n and raised if no property with the given name exists.\n """\n\n assert cls.__name__ == "__getattr__"\n # Don't accidentally export cls dunders.\n props = {name: prop for name, prop in vars(cls).items()\n if isinstance(prop, property)}\n instance = cls()\n\n @functools.cache\n def __getattr__(name):\n if name in props:\n return props[name].__get__(instance)\n raise AttributeError(\n f"module {cls.__module__!r} has no attribute {name!r}")\n\n return __getattr__\n\n\ndef define_aliases(alias_d, cls=None):\n """\n Class decorator for defining property aliases.\n\n Use as ::\n\n @_api.define_aliases({"property": ["alias", ...], ...})\n class C: ...\n\n For each property, if the corresponding ``get_property`` is defined in the\n class so far, an alias named ``get_alias`` will be defined; the same will\n be done for setters. If neither the getter nor the setter exists, an\n exception will be raised.\n\n The alias map is stored as the ``_alias_map`` attribute on the class and\n can be used by `.normalize_kwargs` (which assumes that higher priority\n aliases come last).\n """\n if cls is None: # Return the actual class decorator.\n return functools.partial(define_aliases, alias_d)\n\n def make_alias(name): # Enforce a closure over *name*.\n @functools.wraps(getattr(cls, name))\n def method(self, *args, **kwargs):\n return getattr(self, name)(*args, **kwargs)\n return method\n\n for prop, aliases in alias_d.items():\n exists = False\n for prefix in ["get_", "set_"]:\n if prefix + prop in vars(cls):\n exists = True\n for alias in aliases:\n method = make_alias(prefix + prop)\n method.__name__ = prefix + alias\n method.__doc__ = f"Alias for `{prefix + prop}`."\n setattr(cls, prefix + alias, method)\n if not exists:\n raise ValueError(\n f"Neither getter nor setter exists for {prop!r}")\n\n def get_aliased_and_aliases(d):\n return {*d, *(alias for aliases in d.values() for alias in aliases)}\n\n preexisting_aliases = 
getattr(cls, "_alias_map", {})\n conflicting = (get_aliased_and_aliases(preexisting_aliases)\n & get_aliased_and_aliases(alias_d))\n if conflicting:\n # Need to decide on conflict resolution policy.\n raise NotImplementedError(\n f"Parent class already defines conflicting aliases: {conflicting}")\n cls._alias_map = {**preexisting_aliases, **alias_d}\n return cls\n\n\ndef select_matching_signature(funcs, *args, **kwargs):\n """\n Select and call the function that accepts ``*args, **kwargs``.\n\n *funcs* is a list of functions which should not raise any exception (other\n than `TypeError` if the arguments passed do not match their signature).\n\n `select_matching_signature` tries to call each of the functions in *funcs*\n with ``*args, **kwargs`` (in the order in which they are given). Calls\n that fail with a `TypeError` are silently skipped. As soon as a call\n succeeds, `select_matching_signature` returns its return value. If no\n function accepts ``*args, **kwargs``, then the `TypeError` raised by the\n last failing call is re-raised.\n\n Callers should normally make sure that any ``*args, **kwargs`` can only\n bind a single *func* (to avoid any ambiguity), although this is not checked\n by `select_matching_signature`.\n\n Notes\n -----\n `select_matching_signature` is intended to help implementing\n signature-overloaded functions. In general, such functions should be\n avoided, except for back-compatibility concerns. A typical use pattern is\n ::\n\n def my_func(*args, **kwargs):\n params = select_matching_signature(\n [lambda old1, old2: locals(), lambda new: locals()],\n *args, **kwargs)\n if "old1" in params:\n warn_deprecated(...)\n old1, old2 = params.values() # note that locals() is ordered.\n else:\n new, = params.values()\n # do things with params\n\n which allows *my_func* to be called either with two parameters (*old1* and\n *old2*) or a single one (*new*). 
Note that the new signature is given\n last, so that callers get a `TypeError` corresponding to the new signature\n if the arguments they passed in do not match any signature.\n """\n # Rather than relying on locals() ordering, one could have just used func's\n # signature (``bound = inspect.signature(func).bind(*args, **kwargs);\n # bound.apply_defaults(); return bound``) but that is significantly slower.\n for i, func in enumerate(funcs):\n try:\n return func(*args, **kwargs)\n except TypeError:\n if i == len(funcs) - 1:\n raise\n\n\ndef nargs_error(name, takes, given):\n """Generate a TypeError to be raised by function calls with wrong arity."""\n return TypeError(f"{name}() takes {takes} positional arguments but "\n f"{given} were given")\n\n\ndef kwarg_error(name, kw):\n """\n Generate a TypeError to be raised by function calls with wrong kwarg.\n\n Parameters\n ----------\n name : str\n The name of the calling function.\n kw : str or Iterable[str]\n Either the invalid keyword argument name, or an iterable yielding\n invalid keyword arguments (e.g., a ``kwargs`` dict).\n """\n if not isinstance(kw, str):\n kw = next(iter(kw))\n return TypeError(f"{name}() got an unexpected keyword argument '{kw}'")\n\n\ndef recursive_subclasses(cls):\n """Yield *cls* and direct and indirect subclasses of *cls*."""\n yield cls\n for subcls in cls.__subclasses__():\n yield from recursive_subclasses(subcls)\n\n\ndef warn_external(message, category=None):\n """\n `warnings.warn` wrapper that sets *stacklevel* to "outside Matplotlib".\n\n The original emitter of the warning can be obtained by patching this\n function back to `warnings.warn`, i.e. 
``_api.warn_external =\n warnings.warn`` (or ``functools.partial(warnings.warn, stacklevel=2)``,\n etc.).\n """\n kwargs = {}\n if sys.version_info[:2] >= (3, 12):\n # Go to Python's `site-packages` or `lib` from an editable install.\n basedir = pathlib.Path(__file__).parents[2]\n kwargs['skip_file_prefixes'] = (str(basedir / 'matplotlib'),\n str(basedir / 'mpl_toolkits'))\n else:\n frame = sys._getframe()\n for stacklevel in itertools.count(1):\n if frame is None:\n # when called in embedded context may hit frame is None\n kwargs['stacklevel'] = stacklevel\n break\n if not re.match(r"\A(matplotlib|mpl_toolkits)(\Z|\.(?!tests\.))",\n # Work around sphinx-gallery not setting __name__.\n frame.f_globals.get("__name__", "")):\n kwargs['stacklevel'] = stacklevel\n break\n frame = frame.f_back\n # preemptively break reference cycle between locals and the frame\n del frame\n warnings.warn(message, category, **kwargs)\n | .venv\Lib\site-packages\matplotlib\_api\__init__.py | __init__.py | Python | 13,799 | 0.95 | 0.283887 | 0.069182 | python-kit | 751 | 2025-05-04T19:27:15.297397 | MIT | false | a7ef9174a2ec024171262f5eae61bcb4 |
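The `_api/__init__.py` row above documents a family of keyword-driven validators (`check_isinstance`, `check_in_list`, `check_shape`, `check_getitem`) that report the offending *parameter name* in the error. A minimal standalone sketch of the `check_in_list` pattern — a simplification of the real helper, omitting the `_print_supported_values` switch:

```python
def check_in_list(values, /, **kwargs):
    # Each keyword argument is a (parameter-name, value) pair to validate;
    # the error message names the parameter, matching the helper above.
    if not kwargs:
        raise TypeError("No argument to check!")
    for key, val in kwargs.items():
        if val not in values:
            raise ValueError(
                f"{val!r} is not a valid value for {key}; supported values "
                f"are {', '.join(map(repr, values))}")
```

Passing the value under its parameter name, e.g. `check_in_list(["foo", "bar"], orientation=orientation)`, is what lets the raised `ValueError` say which argument was wrong without the caller building the message by hand.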
from collections.abc import Callable, Generator, Iterable, Mapping, Sequence\nfrom typing import Any, TypeVar, overload\nfrom typing_extensions import Self # < Py 3.11\n\nfrom numpy.typing import NDArray\n\nfrom .deprecation import ( # noqa: F401, re-exported API\n deprecated as deprecated,\n warn_deprecated as warn_deprecated,\n rename_parameter as rename_parameter,\n delete_parameter as delete_parameter,\n make_keyword_only as make_keyword_only,\n deprecate_method_override as deprecate_method_override,\n deprecate_privatize_attribute as deprecate_privatize_attribute,\n suppress_matplotlib_deprecation_warning as suppress_matplotlib_deprecation_warning,\n MatplotlibDeprecationWarning as MatplotlibDeprecationWarning,\n)\n\n_T = TypeVar("_T")\n\nclass classproperty(Any):\n def __init__(\n self,\n fget: Callable[[_T], Any],\n fset: None = ...,\n fdel: None = ...,\n doc: str | None = None,\n ): ...\n @overload\n def __get__(self, instance: None, owner: None) -> Self: ...\n @overload\n def __get__(self, instance: object, owner: type[object]) -> Any: ...\n @property\n def fget(self) -> Callable[[_T], Any]: ...\n\ndef check_isinstance(\n types: type | tuple[type | None, ...], /, **kwargs: Any\n) -> None: ...\ndef check_in_list(\n values: Sequence[Any], /, *, _print_supported_values: bool = ..., **kwargs: Any\n) -> None: ...\ndef check_shape(shape: tuple[int | None, ...], /, **kwargs: NDArray) -> None: ...\ndef check_getitem(mapping: Mapping[Any, Any], /, **kwargs: Any) -> Any: ...\ndef caching_module_getattr(cls: type) -> Callable[[str], Any]: ...\n@overload\ndef define_aliases(\n alias_d: dict[str, list[str]], cls: None = ...\n) -> Callable[[type[_T]], type[_T]]: ...\n@overload\ndef define_aliases(alias_d: dict[str, list[str]], cls: type[_T]) -> type[_T]: ...\ndef select_matching_signature(\n funcs: list[Callable], *args: Any, **kwargs: Any\n) -> Any: ...\ndef nargs_error(name: str, takes: int | str, given: int) -> TypeError: ...\ndef kwarg_error(name: str, kw: str | Iterable[str]) -> TypeError: ...\ndef recursive_subclasses(cls: type) -> Generator[type, None, None]: ...\ndef warn_external(\n message: str | Warning, category: type[Warning] | None = ...\n) -> None: ...\n | .venv\Lib\site-packages\matplotlib\_api\__init__.pyi | __init__.pyi | Other | 2,246 | 0.95 | 0.288136 | 0 | vue-tools | 836 | 2024-11-24T02:14:44.399049 | Apache-2.0 | false | b2cb7dd7fa1f3437825853010095361d
\n\n | .venv\Lib\site-packages\matplotlib\_api\__pycache__\deprecation.cpython-313.pyc | deprecation.cpython-313.pyc | Other | 23,586 | 0.95 | 0.104839 | 0.008772 | awesome-app | 101 | 2023-10-27T12:17:47.370597 | MIT | false | 211ee2eb944ce15b12598a4fb1e0a483 |
\n\n | .venv\Lib\site-packages\matplotlib\_api\__pycache__\__init__.cpython-313.pyc | __init__.cpython-313.pyc | Other | 17,237 | 0.95 | 0.166667 | 0.046512 | vue-tools | 152 | 2023-12-22T14:08:25.348670 | GPL-3.0 | false | 4c1c6f8a6defcf4707c4851540a363f1 |
\n\n | .venv\Lib\site-packages\matplotlib\__pycache__\animation.cpython-313.pyc | animation.cpython-313.pyc | Other | 78,467 | 0.75 | 0.091892 | 0.011152 | vue-tools | 714 | 2024-07-03T04:49:59.302035 | BSD-3-Clause | false | 12cf11ce748bc676f49a119c38448a39 |
\n\n | .venv\Lib\site-packages\matplotlib\__pycache__\artist.cpython-313.pyc | artist.cpython-313.pyc | Other | 72,635 | 0.75 | 0.109228 | 0.013363 | awesome-app | 789 | 2024-03-31T07:25:49.684300 | GPL-3.0 | false | c13dfa36c2708f15fec268d0de1e40d4 |
\n\n | .venv\Lib\site-packages\matplotlib\__pycache__\backend_managers.cpython-313.pyc | backend_managers.cpython-313.pyc | Other | 14,762 | 0.95 | 0.076577 | 0.010582 | vue-tools | 452 | 2025-02-05T05:12:46.840247 | GPL-3.0 | false | 3048955bde8281667ad99739cefa0d75 |
\n\n | .venv\Lib\site-packages\matplotlib\__pycache__\backend_tools.cpython-313.pyc | backend_tools.cpython-313.pyc | Other | 51,261 | 0.95 | 0.067961 | 0.034667 | python-kit | 610 | 2025-02-21T12:19:49.412661 | BSD-3-Clause | false | afaa34320c33bcd9edb55a71652079c4 |
\n\n | .venv\Lib\site-packages\matplotlib\__pycache__\bezier.cpython-313.pyc | bezier.cpython-313.pyc | Other | 20,759 | 0.95 | 0.048338 | 0.010033 | node-utils | 827 | 2025-02-03T05:47:35.559420 | GPL-3.0 | false | d2629efecaedf71e8894ffd2e483e578 |
\n\n | .venv\Lib\site-packages\matplotlib\__pycache__\category.cpython-313.pyc | category.cpython-313.pyc | Other | 9,880 | 0.95 | 0.044444 | 0 | react-lib | 268 | 2024-06-16T16:54:08.370711 | MIT | false | d2918f250ae1f0eff0b33a691158abb7 |
\n\n | .venv\Lib\site-packages\matplotlib\__pycache__\cbook.cpython-313.pyc | cbook.cpython-313.pyc | Other | 96,267 | 0.75 | 0.063766 | 0.005973 | awesome-app | 983 | 2024-12-28T21:36:00.966902 | GPL-3.0 | false | a404d2a2a918d1292cc103a91f659614 |
\n\n | .venv\Lib\site-packages\matplotlib\__pycache__\cm.cpython-313.pyc | cm.cpython-313.pyc | Other | 11,433 | 0.95 | 0.09596 | 0.00625 | react-lib | 507 | 2024-02-27T11:44:56.819425 | GPL-3.0 | false | c8be1efa9b413d1aa8bab151c16844a5 |
\n\n | .venv\Lib\site-packages\matplotlib\__pycache__\colorbar.cpython-313.pyc | colorbar.cpython-313.pyc | Other | 72,681 | 0.75 | 0.064057 | 0.024161 | vue-tools | 274 | 2023-08-05T20:23:34.450240 | MIT | false | 50281a35a6fcec5bb3a45ebd20ecdd75 |
\n\n | .venv\Lib\site-packages\matplotlib\__pycache__\colorizer.cpython-313.pyc | colorizer.cpython-313.pyc | Other | 29,227 | 0.95 | 0.099723 | 0.003155 | node-utils | 67 | 2025-05-14T08:41:28.407611 | GPL-3.0 | false | 475155d3d63ed13cd9c6b393344e8617 |
\n\n | .venv\Lib\site-packages\matplotlib\__pycache__\container.cpython-313.pyc | container.cpython-313.pyc | Other | 6,515 | 0.95 | 0.099099 | 0.010638 | node-utils | 783 | 2024-08-20T10:24:36.448160 | MIT | false | aa236aa0e8e4e70367e97997bfd1e60f |
\n\n | .venv\Lib\site-packages\matplotlib\__pycache__\contour.cpython-313.pyc | contour.cpython-313.pyc | Other | 77,454 | 0.75 | 0.060377 | 0.017544 | vue-tools | 31 | 2025-03-29T01:48:27.688958 | BSD-3-Clause | false | 2bd03f5274c54b1f5ba74cfc9ec9da6f |
\n\n | .venv\Lib\site-packages\matplotlib\__pycache__\dates.cpython-313.pyc | dates.cpython-313.pyc | Other | 73,888 | 0.75 | 0.061379 | 0.025027 | awesome-app | 208 | 2024-04-11T01:00:47.653658 | GPL-3.0 | false | 83cd078918a21f80fd67da945bcc8d9b |
\n\n | .venv\Lib\site-packages\matplotlib\__pycache__\dviread.cpython-313.pyc | dviread.cpython-313.pyc | Other | 53,216 | 0.95 | 0.071895 | 0.007246 | react-lib | 90 | 2023-12-14T15:51:16.359703 | MIT | false | fba54b1552b55e104428974e186b23f9 |
\n\n | .venv\Lib\site-packages\matplotlib\__pycache__\font_manager.cpython-313.pyc | font_manager.cpython-313.pyc | Other | 63,548 | 0.75 | 0.042117 | 0.006211 | awesome-app | 801 | 2023-11-07T22:26:41.751807 | GPL-3.0 | false | 8d9773ec5a2f255f0c95cba6358b10a5 |
\n\n | .venv\Lib\site-packages\matplotlib\__pycache__\gridspec.cpython-313.pyc | gridspec.cpython-313.pyc | Other | 34,804 | 0.95 | 0.045045 | 0.012469 | python-kit | 934 | 2024-08-11T14:10:36.340933 | MIT | false | 684e55abd01d4d521b2d513341df2a44 |
\n\n | .venv\Lib\site-packages\matplotlib\__pycache__\hatch.cpython-313.pyc | hatch.cpython-313.pyc | Other | 12,395 | 0.95 | 0.033333 | 0 | react-lib | 231 | 2024-11-05T17:46:54.065144 | Apache-2.0 | false | 03687bb996153ce50bdbb587402c4e18 |
\n\n | .venv\Lib\site-packages\matplotlib\__pycache__\image.cpython-313.pyc | image.cpython-313.pyc | Other | 82,232 | 0.75 | 0.058879 | 0.017436 | react-lib | 973 | 2024-09-30T04:23:42.881349 | MIT | false | 5ad833d0ea0697a1d0dcd62a82db9357 |
\n\n | .venv\Lib\site-packages\matplotlib\__pycache__\inset.cpython-313.pyc | inset.cpython-313.pyc | Other | 12,156 | 0.95 | 0.032258 | 0.014184 | node-utils | 783 | 2023-08-24T21:24:31.467194 | BSD-3-Clause | false | 25a601026d6c070bf9a1e598b68e5948 |
\n\n | .venv\Lib\site-packages\matplotlib\__pycache__\layout_engine.cpython-313.pyc | layout_engine.cpython-313.pyc | Other | 12,821 | 0.95 | 0.091324 | 0 | vue-tools | 245 | 2025-03-14T15:17:15.628980 | BSD-3-Clause | false | 4b2e320fd88133461ee139a3063bb69c |
\n\n | .venv\Lib\site-packages\matplotlib\__pycache__\legend.cpython-313.pyc | legend.cpython-313.pyc | Other | 58,610 | 0.75 | 0.058333 | 0.012759 | node-utils | 296 | 2024-06-16T21:48:04.223500 | BSD-3-Clause | false | 7cf2a68060722daa9d46283cdccacbc6 |
\n\n | .venv\Lib\site-packages\matplotlib\__pycache__\legend_handler.cpython-313.pyc | legend_handler.cpython-313.pyc | Other | 33,977 | 0.95 | 0.077889 | 0.02139 | react-lib | 122 | 2024-01-22T22:46:18.320688 | GPL-3.0 | false | 2e4f72cc9bd04891a3842d8cdd9da236 |
\n\n | .venv\Lib\site-packages\matplotlib\__pycache__\lines.cpython-313.pyc | lines.cpython-313.pyc | Other | 66,833 | 0.75 | 0.047991 | 0.012821 | python-kit | 260 | 2024-12-08T22:14:10.871870 | Apache-2.0 | false | 016f21cbd754d18dd8f970a9aa54018f |
\n\n | .venv\Lib\site-packages\matplotlib\__pycache__\markers.cpython-313.pyc | markers.cpython-313.pyc | Other | 45,779 | 0.95 | 0.015054 | 0.016129 | node-utils | 978 | 2023-10-10T00:17:20.503116 | Apache-2.0 | false | 540b8226cc48de37533df64a62b7141a |