| column | dtype | range / values |
|---|---|---|
| content | string | lengths 1–103k (nullable) |
| path | string | lengths 8–216 |
| filename | string | lengths 2–179 |
| language | string | 15 classes |
| size_bytes | int64 | 2–189k |
| quality_score | float64 | 0.5–0.95 |
| complexity | float64 | 0–1 |
| documentation_ratio | float64 | 0–1 |
| repository | string | 5 classes |
| stars | int64 | 0–1k |
| created_date | date | 2023-07-10 19:21:08 – 2025-07-09 19:11:45 |
| license | string | 4 classes |
| is_test | bool | 2 classes |
| file_hash | string | length 32 |
Metadata-Version: 2.1
Name: MarkupSafe
Version: 3.0.2
Summary: Safely add untrusted strings to HTML/XML markup.
Maintainer-email: Pallets <contact@palletsprojects.com>
License: Copyright 2010 Pallets

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are
met:

1.  Redistributions of source code must retain the above copyright
    notice, this list of conditions and the following disclaimer.

2.  Redistributions in binary form must reproduce the above copyright
    notice, this list of conditions and the following disclaimer in the
    documentation and/or other materials provided with the distribution.

3.  Neither the name of the copyright holder nor the names of its
    contributors may be used to endorse or promote products derived from
    this software without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A
PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED
TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF
LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

Project-URL: Donate, https://palletsprojects.com/donate
Project-URL: Documentation, https://markupsafe.palletsprojects.com/
Project-URL: Changes, https://markupsafe.palletsprojects.com/changes/
Project-URL: Source, https://github.com/pallets/markupsafe/
Project-URL: Chat, https://discord.gg/pallets
Classifier: Development Status :: 5 - Production/Stable
Classifier: Environment :: Web Environment
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: BSD License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python
Classifier: Topic :: Internet :: WWW/HTTP :: Dynamic Content
Classifier: Topic :: Text Processing :: Markup :: HTML
Classifier: Typing :: Typed
Requires-Python: >=3.9
Description-Content-Type: text/markdown
License-File: LICENSE.txt

# MarkupSafe

MarkupSafe implements a text object that escapes characters so it is
safe to use in HTML and XML. Characters that have special meanings are
replaced so that they display as the actual characters. This mitigates
injection attacks, meaning untrusted user input can safely be displayed
on a page.

## Examples

```pycon
>>> from markupsafe import Markup, escape

>>> # escape replaces special characters and wraps in Markup
>>> escape("<script>alert(document.cookie);</script>")
Markup('&lt;script&gt;alert(document.cookie);&lt;/script&gt;')

>>> # wrap in Markup to mark text "safe" and prevent escaping
>>> Markup("<strong>Hello</strong>")
Markup('<strong>Hello</strong>')

>>> escape(Markup("<strong>Hello</strong>"))
Markup('<strong>Hello</strong>')

>>> # Markup is a str subclass
>>> # methods and operators escape their arguments
>>> template = Markup("Hello <em>{name}</em>")
>>> template.format(name='"World"')
Markup('Hello <em>&#34;World&#34;</em>')
```

## Donate

The Pallets organization develops and supports MarkupSafe and other
popular packages. In order to grow the community of contributors and
users, and allow the maintainers to devote more time to the projects,
[please donate today][].

[please donate today]: https://palletsprojects.com/donate

| .venv\Lib\site-packages\MarkupSafe-3.0.2.dist-info\METADATA | METADATA | Other | 4,067 | 0.95 | 0 | 0.040541 | react-lib | 224 | 2024-04-20T08:55:47.180647 | GPL-3.0 | false | 3de07a4580a3efb0a940645dfee90342 |
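The escaping rules shown in the README can be sketched with the standard library alone, for environments where MarkupSafe is not installed. This is an illustrative re-implementation of the five-character substitution MarkupSafe performs (`& < > " '`), not the library's actual (C-accelerated) code, and `naive_escape` is a hypothetical helper name:

```python
def naive_escape(s: str) -> str:
    # Mirrors the five characters MarkupSafe replaces; '&' must go first
    # so the other replacements' ampersands are not double-escaped.
    return (s.replace("&", "&amp;")
             .replace("<", "&lt;")
             .replace(">", "&gt;")
             .replace('"', "&#34;")
             .replace("'", "&#39;"))

print(naive_escape('<script>alert(document.cookie);</script>'))
# → &lt;script&gt;alert(document.cookie);&lt;/script&gt;
```

Unlike the real `markupsafe.escape`, this sketch returns a plain `str` rather than a `Markup` instance, so it carries no "already safe" marker.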
MarkupSafe-3.0.2.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
MarkupSafe-3.0.2.dist-info/LICENSE.txt,sha256=RjHsDbX9kKVH4zaBcmTGeYIUM4FG-KyUtKV_lu6MnsQ,1503
MarkupSafe-3.0.2.dist-info/METADATA,sha256=nhoabjupBG41j_JxPCJ3ylgrZ6Fx8oMCFbiLF9Kafqc,4067
MarkupSafe-3.0.2.dist-info/RECORD,,
MarkupSafe-3.0.2.dist-info/WHEEL,sha256=-v_yZ08fSknsoT62oIKG9wp1eCBV9_ao2rO4BeIReTY,101
MarkupSafe-3.0.2.dist-info/top_level.txt,sha256=qy0Plje5IJuvsCBjejJyhDCjEAdcDLK_2agVcex8Z6U,11
markupsafe/__init__.py,sha256=pREerPwvinB62tNCMOwqxBS2YHV6R52Wcq1d-rB4Z5o,13609
markupsafe/__pycache__/__init__.cpython-313.pyc,,
markupsafe/__pycache__/_native.cpython-313.pyc,,
markupsafe/_native.py,sha256=2ptkJ40yCcp9kq3L1NqpgjfpZB-obniYKFFKUOkHh4Q,218
markupsafe/_speedups.c,sha256=SglUjn40ti9YgQAO--OgkSyv9tXq9vvaHyVhQows4Ok,4353
markupsafe/_speedups.cp313-win_amd64.pyd,sha256=7MA12j0aUiSeNpFy-98h_pPSqgCpLeRacgp3I-j00Yo,13312
markupsafe/_speedups.pyi,sha256=LSDmXYOefH4HVpAXuL8sl7AttLw0oXh1njVoVZp2wqQ,42
markupsafe/py.typed,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0

| .venv\Lib\site-packages\MarkupSafe-3.0.2.dist-info\RECORD | RECORD | Other | 1,095 | 0.7 | 0 | 0 | node-utils | 729 | 2025-06-17T17:49:49.953531 | GPL-3.0 | false | 10637be1cca71fc7cb542534ad052ca4 |
markupsafe

| .venv\Lib\site-packages\MarkupSafe-3.0.2.dist-info\top_level.txt | top_level.txt | Other | 11 | 0.5 | 0 | 0 | python-kit | 858 | 2025-01-26T06:36:51.989501 | BSD-3-Clause | false | 5862354c9fbb5b15204672c79808e25c |
Wheel-Version: 1.0
Generator: setuptools (75.2.0)
Root-Is-Purelib: false
Tag: cp313-cp313-win_amd64

| .venv\Lib\site-packages\MarkupSafe-3.0.2.dist-info\WHEEL | WHEEL | Other | 101 | 0.7 | 0 | 0 | vue-tools | 96 | 2025-02-07T08:09:08.986630 | GPL-3.0 | false | c16db81da71b13b0ef4d8a11883c1abd |
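The hashes in the RECORD file above follow the wheel format specification: each entry is the URL-safe base64 encoding of the file's SHA-256 digest, with trailing `=` padding stripped. A minimal sketch of computing such a hash (the helper name `record_hash` is illustrative):

```python
import base64
import hashlib


def record_hash(data: bytes) -> str:
    """Return a wheel-RECORD-style hash: urlsafe base64 of SHA-256, unpadded."""
    digest = hashlib.sha256(data).digest()
    return base64.urlsafe_b64encode(digest).rstrip(b"=").decode("ascii")


# A zero-byte file (such as markupsafe/py.typed) hashes to the well-known
# SHA-256 of the empty string, matching its RECORD entry above.
print(record_hash(b""))
# → 47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU
```

An installer verifies an installation by re-hashing each file on disk and comparing against the RECORD entry, which is why RECORD lists itself with an empty hash field.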
import abc
import base64
import contextlib
from io import BytesIO, TextIOWrapper
import itertools
import logging
from pathlib import Path
import shutil
import subprocess
import sys
from tempfile import TemporaryDirectory
import uuid
import warnings

import numpy as np
from PIL import Image

import matplotlib as mpl
from matplotlib._animation_data import (
    DISPLAY_TEMPLATE, INCLUDED_FRAMES, JS_INCLUDE, STYLE_INCLUDE)
from matplotlib import _api, cbook
import matplotlib.colors as mcolors

_log = logging.getLogger(__name__)

# Process creation flag for subprocess to prevent it raising a terminal
# window. See for example https://stackoverflow.com/q/24130623/
subprocess_creation_flags = (
    subprocess.CREATE_NO_WINDOW if sys.platform == 'win32' else 0)


def adjusted_figsize(w, h, dpi, n):
    """
    Compute figure size so that pixels are a multiple of n.

    Parameters
    ----------
    w, h : float
        Size in inches.

    dpi : float
        The dpi.

    n : int
        The target multiple.

    Returns
    -------
    wnew, hnew : float
        The new figure size in inches.
    """

    # this may be simplified if / when we adopt consistent rounding for
    # pixel size across the whole library
    def correct_roundoff(x, dpi, n):
        if int(x*dpi) % n != 0:
            if int(np.nextafter(x, np.inf)*dpi) % n == 0:
                x = np.nextafter(x, np.inf)
            elif int(np.nextafter(x, -np.inf)*dpi) % n == 0:
                x = np.nextafter(x, -np.inf)
        return x

    wnew = int(w * dpi / n) * n / dpi
    hnew = int(h * dpi / n) * n / dpi
    return correct_roundoff(wnew, dpi, n), correct_roundoff(hnew, dpi, n)


class MovieWriterRegistry:
    """Registry of available writer classes by human readable name."""

    def __init__(self):
        self._registered = dict()

    def register(self, name):
        """
        Decorator for registering a class under a name.

        Example use::

            @registry.register(name)
            class Foo:
                pass
        """
        def wrapper(writer_cls):
            self._registered[name] = writer_cls
            return writer_cls
        return wrapper

    def is_available(self, name):
        """
        Check if given writer is available by name.

        Parameters
        ----------
        name : str

        Returns
        -------
        bool
        """
        try:
            cls = self._registered[name]
        except KeyError:
            return False
        return cls.isAvailable()

    def __iter__(self):
        """Iterate over names of available writer classes."""
        for name in self._registered:
            if self.is_available(name):
                yield name

    def list(self):
        """Get a list of available MovieWriters."""
        return [*self]

    def __getitem__(self, name):
        """Get an available writer class from its name."""
        if self.is_available(name):
            return self._registered[name]
        raise RuntimeError(f"Requested MovieWriter ({name}) not available")


writers = MovieWriterRegistry()


class AbstractMovieWriter(abc.ABC):
    """
    Abstract base class for writing movies, providing a way to grab frames by
    calling `~AbstractMovieWriter.grab_frame`.

    `setup` is called to start the process and `finish` is called afterwards.
    `saving` is provided as a context manager to facilitate this process as ::

        with moviewriter.saving(fig, outfile='myfile.mp4', dpi=100):
            # Iterate over frames
            moviewriter.grab_frame(**savefig_kwargs)

    The use of the context manager ensures that `setup` and `finish` are
    performed as necessary.

    An instance of a concrete subclass of this class can be given as the
    ``writer`` argument of `Animation.save()`.
    """

    def __init__(self, fps=5, metadata=None, codec=None, bitrate=None):
        self.fps = fps
        self.metadata = metadata if metadata is not None else {}
        self.codec = mpl._val_or_rc(codec, 'animation.codec')
        self.bitrate = mpl._val_or_rc(bitrate, 'animation.bitrate')

    @abc.abstractmethod
    def setup(self, fig, outfile, dpi=None):
        """
        Setup for writing the movie file.

        Parameters
        ----------
        fig : `~matplotlib.figure.Figure`
            The figure object that contains the information for frames.
        outfile : str
            The filename of the resulting movie file.
        dpi : float, default: ``fig.dpi``
            The DPI (or resolution) for the file.  This controls the size
            in pixels of the resulting movie file.
        """
        # Check that path is valid
        Path(outfile).parent.resolve(strict=True)
        self.outfile = outfile
        self.fig = fig
        if dpi is None:
            dpi = self.fig.dpi
        self.dpi = dpi

    @property
    def frame_size(self):
        """A tuple ``(width, height)`` in pixels of a movie frame."""
        w, h = self.fig.get_size_inches()
        return int(w * self.dpi), int(h * self.dpi)

    def _supports_transparency(self):
        """
        Whether this writer supports transparency.

        Writers may consult output file type and codec to determine this at
        runtime.
        """
        return False

    @abc.abstractmethod
    def grab_frame(self, **savefig_kwargs):
        """
        Grab the image information from the figure and save as a movie frame.

        All keyword arguments in *savefig_kwargs* are passed on to the
        `~.Figure.savefig` call that saves the figure.  However, several
        keyword arguments that are supported by `~.Figure.savefig` may not be
        passed as they are controlled by the MovieWriter:

        - *dpi*, *bbox_inches*:  These may not be passed because each frame
          of the animation must be exactly the same size in pixels.
        - *format*: This is controlled by the MovieWriter.
        """

    @abc.abstractmethod
    def finish(self):
        """Finish any processing for writing the movie."""

    @contextlib.contextmanager
    def saving(self, fig, outfile, dpi, *args, **kwargs):
        """
        Context manager to facilitate writing the movie file.

        ``*args, **kw`` are any parameters that should be passed to `setup`.
        """
        if mpl.rcParams['savefig.bbox'] == 'tight':
            _log.info("Disabling savefig.bbox = 'tight', as it may cause "
                      "frame size to vary, which is inappropriate for "
                      "animation.")

        # This particular sequence is what contextlib.contextmanager wants
        self.setup(fig, outfile, dpi, *args, **kwargs)
        with mpl.rc_context({'savefig.bbox': None}):
            try:
                yield self
            finally:
                self.finish()


class MovieWriter(AbstractMovieWriter):
    """
    Base class for writing movies.

    This is a base class for MovieWriter subclasses that write movie frame
    data to a pipe.  You cannot instantiate this class directly.
    See examples for how to use its subclasses.

    Attributes
    ----------
    frame_format : str
        The format used in writing frame data, defaults to 'rgba'.
    fig : `~matplotlib.figure.Figure`
        The figure to capture data from.
        This must be provided by the subclasses.
    """

    # Builtin writer subclasses additionally define the _exec_key and
    # _args_key attributes, which indicate the rcParams entries where the
    # path to the executable and additional command-line arguments to the
    # executable are stored.  Third-party writers cannot meaningfully set
    # these as they cannot extend rcParams with new keys.

    # Pipe-based writers only support RGBA, but file-based ones support more
    # formats.
    supported_formats = ["rgba"]

    def __init__(self, fps=5, codec=None, bitrate=None, extra_args=None,
                 metadata=None):
        """
        Parameters
        ----------
        fps : int, default: 5
            Movie frame rate (per second).
        codec : str or None, default: :rc:`animation.codec`
            The codec to use.
        bitrate : int, default: :rc:`animation.bitrate`
            The bitrate of the movie, in kilobits per second.  Higher values
            mean higher quality movies, but increase the file size.  A value
            of -1 lets the underlying movie encoder select the bitrate.
        extra_args : list of str or None, optional
            Extra command-line arguments passed to the underlying movie
            encoder.  These arguments are passed last to the encoder, just
            before the filename.  The default, None, means to use
            :rc:`animation.[name-of-encoder]_args` for the builtin writers.
        metadata : dict[str, str], default: {}
            A dictionary of keys and values for metadata to include in the
            output file.  Some keys that may be of use include:
            title, artist, genre, subject, copyright, srcform, comment.
        """
        if type(self) is MovieWriter:
            # TODO MovieWriter is still an abstract class and needs to be
            #      extended with a mixin. This should be clearer in naming
            #      and description. For now, just give a reasonable error
            #      message to users.
            raise TypeError(
                'MovieWriter cannot be instantiated directly. Please use one '
                'of its subclasses.')

        super().__init__(fps=fps, metadata=metadata, codec=codec,
                         bitrate=bitrate)
        self.frame_format = self.supported_formats[0]
        self.extra_args = extra_args

    def _adjust_frame_size(self):
        if self.codec == 'h264':
            wo, ho = self.fig.get_size_inches()
            w, h = adjusted_figsize(wo, ho, self.dpi, 2)
            if (wo, ho) != (w, h):
                self.fig.set_size_inches(w, h, forward=True)
                _log.info('figure size in inches has been adjusted '
                          'from %s x %s to %s x %s', wo, ho, w, h)
        else:
            w, h = self.fig.get_size_inches()
        _log.debug('frame size in pixels is %s x %s', *self.frame_size)
        return w, h

    def setup(self, fig, outfile, dpi=None):
        # docstring inherited
        super().setup(fig, outfile, dpi=dpi)
        self._w, self._h = self._adjust_frame_size()
        # Run here so that grab_frame() can write the data to a pipe. This
        # eliminates the need for temp files.
        self._run()

    def _run(self):
        # Uses subprocess to call the program for assembling frames into a
        # movie file.  *args* returns the sequence of command line arguments
        # from a few configuration options.
        command = self._args()
        _log.info('MovieWriter._run: running command: %s',
                  cbook._pformat_subprocess(command))
        PIPE = subprocess.PIPE
        self._proc = subprocess.Popen(
            command, stdin=PIPE, stdout=PIPE, stderr=PIPE,
            creationflags=subprocess_creation_flags)

    def finish(self):
        """Finish any processing for writing the movie."""
        out, err = self._proc.communicate()
        # Use the encoding/errors that universal_newlines would use.
        out = TextIOWrapper(BytesIO(out)).read()
        err = TextIOWrapper(BytesIO(err)).read()
        if out:
            _log.log(
                logging.WARNING if self._proc.returncode else logging.DEBUG,
                "MovieWriter stdout:\n%s", out)
        if err:
            _log.log(
                logging.WARNING if self._proc.returncode else logging.DEBUG,
                "MovieWriter stderr:\n%s", err)
        if self._proc.returncode:
            raise subprocess.CalledProcessError(
                self._proc.returncode, self._proc.args, out, err)

    def grab_frame(self, **savefig_kwargs):
        # docstring inherited
        _validate_grabframe_kwargs(savefig_kwargs)
        _log.debug('MovieWriter.grab_frame: Grabbing frame.')
        # Readjust the figure size in case it has been changed by the user.
        # All frames must have the same size to save the movie correctly.
        self.fig.set_size_inches(self._w, self._h)
        # Save the figure data to the sink, using the frame format and dpi.
        self.fig.savefig(self._proc.stdin, format=self.frame_format,
                         dpi=self.dpi, **savefig_kwargs)

    def _args(self):
        """Assemble list of encoder-specific command-line arguments."""
        raise NotImplementedError("args needs to be implemented by subclass.")

    @classmethod
    def bin_path(cls):
        """
        Return the binary path to the commandline tool used by a specific
        subclass.  This is a class method so that the tool can be looked for
        before making a particular MovieWriter subclass available.
        """
        return str(mpl.rcParams[cls._exec_key])

    @classmethod
    def isAvailable(cls):
        """Return whether a MovieWriter subclass is actually available."""
        return shutil.which(cls.bin_path()) is not None


class FileMovieWriter(MovieWriter):
    """
    `MovieWriter` for writing to individual files and stitching at the end.

    This must be sub-classed to be useful.
    """
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.frame_format = mpl.rcParams['animation.frame_format']

    def setup(self, fig, outfile, dpi=None, frame_prefix=None):
        """
        Setup for writing the movie file.

        Parameters
        ----------
        fig : `~matplotlib.figure.Figure`
            The figure to grab the rendered frames from.
        outfile : str
            The filename of the resulting movie file.
        dpi : float, default: ``fig.dpi``
            The dpi of the output file.  This, with the figure size,
            controls the size in pixels of the resulting movie file.
        frame_prefix : str, optional
            The filename prefix to use for temporary files.  If *None* (the
            default), files are written to a temporary directory which is
            deleted by `finish`; if not *None*, no temporary files are
            deleted.
        """
        # Check that path is valid
        Path(outfile).parent.resolve(strict=True)
        self.fig = fig
        self.outfile = outfile
        if dpi is None:
            dpi = self.fig.dpi
        self.dpi = dpi
        self._adjust_frame_size()

        if frame_prefix is None:
            self._tmpdir = TemporaryDirectory()
            self.temp_prefix = str(Path(self._tmpdir.name, 'tmp'))
        else:
            self._tmpdir = None
            self.temp_prefix = frame_prefix
        self._frame_counter = 0  # used for generating sequential file names
        self._temp_paths = list()
        self.fname_format_str = '%s%%07d.%s'

    def __del__(self):
        if hasattr(self, '_tmpdir') and self._tmpdir:
            self._tmpdir.cleanup()

    @property
    def frame_format(self):
        """
        Format (png, jpeg, etc.) to use for saving the frames, which can be
        decided by the individual subclasses.
        """
        return self._frame_format

    @frame_format.setter
    def frame_format(self, frame_format):
        if frame_format in self.supported_formats:
            self._frame_format = frame_format
        else:
            _api.warn_external(
                f"Ignoring file format {frame_format!r} which is not "
                f"supported by {type(self).__name__}; using "
                f"{self.supported_formats[0]} instead.")
            self._frame_format = self.supported_formats[0]

    def _base_temp_name(self):
        # Generates a template name (without number) given the frame format
        # for extension and the prefix.
        return self.fname_format_str % (self.temp_prefix, self.frame_format)

    def grab_frame(self, **savefig_kwargs):
        # docstring inherited
        # Creates a filename for saving using basename and counter.
        _validate_grabframe_kwargs(savefig_kwargs)
        path = Path(self._base_temp_name() % self._frame_counter)
        self._temp_paths.append(path)  # Record the filename for later use.
        self._frame_counter += 1  # Ensures each created name is unique.
        _log.debug('FileMovieWriter.grab_frame: Grabbing frame %d to path=%s',
                   self._frame_counter, path)
        with open(path, 'wb') as sink:  # Save figure to the sink.
            self.fig.savefig(sink, format=self.frame_format, dpi=self.dpi,
                             **savefig_kwargs)

    def finish(self):
        # Call run here now that all frame grabbing is done.  All temp files
        # are available to be assembled.
        try:
            self._run()
            super().finish()
        finally:
            if self._tmpdir:
                _log.debug(
                    'MovieWriter: clearing temporary path=%s', self._tmpdir
                )
                self._tmpdir.cleanup()


@writers.register('pillow')
class PillowWriter(AbstractMovieWriter):
    def _supports_transparency(self):
        return True

    @classmethod
    def isAvailable(cls):
        return True

    def setup(self, fig, outfile, dpi=None):
        super().setup(fig, outfile, dpi=dpi)
        self._frames = []

    def grab_frame(self, **savefig_kwargs):
        _validate_grabframe_kwargs(savefig_kwargs)
        buf = BytesIO()
        self.fig.savefig(
            buf, **{**savefig_kwargs, "format": "rgba", "dpi": self.dpi})
        im = Image.frombuffer(
            "RGBA", self.frame_size, buf.getbuffer(), "raw", "RGBA", 0, 1)
        if im.getextrema()[3][0] < 255:
            # This frame has transparency, so we'll just add it as is.
            self._frames.append(im)
        else:
            # Without transparency, we switch to RGB mode, which converts to
            # P mode a little better if needed (specifically, this helps
            # with GIF output.)
            self._frames.append(im.convert("RGB"))

    def finish(self):
        self._frames[0].save(
            self.outfile, save_all=True, append_images=self._frames[1:],
            duration=int(1000 / self.fps), loop=0)


# Base class of ffmpeg information.  Has the config keys and the common set
# of arguments that controls the *output* side of things.
class FFMpegBase:
    """
    Mixin class for FFMpeg output.

    This is a base class for the concrete `FFMpegWriter` and
    `FFMpegFileWriter` classes.
    """

    _exec_key = 'animation.ffmpeg_path'
    _args_key = 'animation.ffmpeg_args'

    def _supports_transparency(self):
        suffix = Path(self.outfile).suffix
        if suffix in {'.apng', '.avif', '.gif', '.webm', '.webp'}:
            return True
        # This list was found by going through `ffmpeg -codecs` for video
        # encoders, running them with _supports_transparency() forced to
        # True, and checking that the "Pixel format" in Kdenlive included
        # alpha.  Note this is not a guarantee that transparency will work;
        # you may also need to pass `-pix_fmt`, but we trust the user has
        # done so if they are asking for these formats.
        return self.codec in {
            'apng', 'avrp', 'bmp', 'cfhd', 'dpx', 'ffv1', 'ffvhuff', 'gif',
            'huffyuv', 'jpeg2000', 'ljpeg', 'png', 'prores', 'prores_aw',
            'prores_ks', 'qtrle', 'rawvideo', 'targa', 'tiff', 'utvideo',
            'v408', }

    @property
    def output_args(self):
        args = []
        suffix = Path(self.outfile).suffix
        if suffix in {'.apng', '.avif', '.gif', '.webm', '.webp'}:
            self.codec = suffix[1:]
        else:
            args.extend(['-vcodec', self.codec])
        extra_args = (self.extra_args if self.extra_args is not None
                      else mpl.rcParams[self._args_key])
        # For h264, the default format is yuv444p, which is not compatible
        # with quicktime (and others). Specifying yuv420p fixes playback on
        # iOS, as well as HTML5 video in firefox and safari (on both Windows
        # and macOS). Also fixes internet explorer. This is as of 2015/10/29.
        if self.codec == 'h264' and '-pix_fmt' not in extra_args:
            args.extend(['-pix_fmt', 'yuv420p'])
        # For GIF, we're telling FFmpeg to split the video stream, to
        # generate a palette, and then use it for encoding.
        elif self.codec == 'gif' and '-filter_complex' not in extra_args:
            args.extend(['-filter_complex',
                         'split [a][b];[a] palettegen [p];[b][p] paletteuse'])
        # For AVIF, we're telling FFmpeg to split the video stream, extract
        # the alpha, in order to place it in a secondary stream, as needed
        # by AVIF-in-FFmpeg.
        elif self.codec == 'avif' and '-filter_complex' not in extra_args:
            args.extend(['-filter_complex',
                         'split [rgb][rgba]; [rgba] alphaextract [alpha]',
                         '-map', '[rgb]', '-map', '[alpha]'])
        if self.bitrate > 0:
            args.extend(['-b', '%dk' % self.bitrate])  # %dk: bitrate in kbps.
        for k, v in self.metadata.items():
            args.extend(['-metadata', f'{k}={v}'])
        args.extend(extra_args)

        return args + ['-y', self.outfile]


# Combine FFMpeg options with pipe-based writing
@writers.register('ffmpeg')
class FFMpegWriter(FFMpegBase, MovieWriter):
    """
    Pipe-based ffmpeg writer.

    Frames are streamed directly to ffmpeg via a pipe and written in a single
    pass.

    This effectively works as a slideshow input to ffmpeg with the fps passed
    as ``-framerate``, so see also `their notes on frame rates`_ for further
    details.

    .. _their notes on frame rates: https://trac.ffmpeg.org/wiki/Slideshow#Framerates
    """
    def _args(self):
        # Returns the command line parameters for subprocess to use
        # ffmpeg to create a movie using a pipe.
        args = [self.bin_path(), '-f', 'rawvideo', '-vcodec', 'rawvideo',
                '-s', '%dx%d' % self.frame_size,
                '-pix_fmt', self.frame_format,
                '-framerate', str(self.fps)]
        # Logging is quieted because subprocess.PIPE has limited buffer size.
        # If you have a lot of frames in your animation and set logging to
        # DEBUG, you will have a buffer overrun.
        if _log.getEffectiveLevel() > logging.DEBUG:
            args += ['-loglevel', 'error']
        args += ['-i', 'pipe:'] + self.output_args
        return args


# Combine FFMpeg options with temp file-based writing
@writers.register('ffmpeg_file')
class FFMpegFileWriter(FFMpegBase, FileMovieWriter):
    """
    File-based ffmpeg writer.

    Frames are written to temporary files on disk and then stitched together
    at the end.

    This effectively works as a slideshow input to ffmpeg with the fps passed
    as ``-framerate``, so see also `their notes on frame rates`_ for further
    details.

    .. _their notes on frame rates: https://trac.ffmpeg.org/wiki/Slideshow#Framerates
    """
    supported_formats = ['png', 'jpeg', 'tiff', 'raw', 'rgba']

    def _args(self):
        # Returns the command line parameters for subprocess to use
        # ffmpeg to create a movie using a collection of temp images
        args = []
        # For raw frames, we need to explicitly tell ffmpeg the metadata.
        if self.frame_format in {'raw', 'rgba'}:
            args += [
                '-f', 'image2', '-vcodec', 'rawvideo',
                '-video_size', '%dx%d' % self.frame_size,
                '-pixel_format', 'rgba',
            ]
        args += ['-framerate', str(self.fps), '-i', self._base_temp_name()]
        if not self._tmpdir:
            args += ['-frames:v', str(self._frame_counter)]
        # Logging is quieted because subprocess.PIPE has limited buffer size.
        # If you have a lot of frames in your animation and set logging to
        # DEBUG, you will have a buffer overrun.
        if _log.getEffectiveLevel() > logging.DEBUG:
            args += ['-loglevel', 'error']
        return [self.bin_path(), *args, *self.output_args]


# Base class for animated GIFs with ImageMagick
class ImageMagickBase:
    """
    Mixin class for ImageMagick output.

    This is a base class for the concrete `ImageMagickWriter` and
    `ImageMagickFileWriter` classes, which define an ``input_names``
    attribute (or property) specifying the input names passed to ImageMagick.
    """

    _exec_key = 'animation.convert_path'
    _args_key = 'animation.convert_args'

    def _supports_transparency(self):
        suffix = Path(self.outfile).suffix
        return suffix in {'.apng', '.avif', '.gif', '.webm', '.webp'}

    def _args(self):
        # ImageMagick does not recognize "raw".
        fmt = "rgba" if self.frame_format == "raw" else self.frame_format
        extra_args = (self.extra_args if self.extra_args is not None
                      else mpl.rcParams[self._args_key])
        return [
            self.bin_path(),
            "-size", "%ix%i" % self.frame_size,
            "-depth", "8",
            "-delay", str(100 / self.fps),
            "-loop", "0",
            f"{fmt}:{self.input_names}",
            *extra_args,
            self.outfile,
        ]

    @classmethod
    def bin_path(cls):
        binpath = super().bin_path()
        if binpath == 'convert':
            binpath = mpl._get_executable_info('magick').executable
        return binpath

    @classmethod
    def isAvailable(cls):
        try:
            return super().isAvailable()
        except mpl.ExecutableNotFoundError as _enf:
            # May be raised by get_executable_info.
            _log.debug('ImageMagick unavailable due to: %s', _enf)
            return False


# Combine ImageMagick options with pipe-based writing
@writers.register('imagemagick')
class ImageMagickWriter(ImageMagickBase, MovieWriter):
    """
    Pipe-based animated gif writer.

    Frames are streamed directly to ImageMagick via a pipe and written
    in a single pass.
    """

    input_names = "-"  # stdin


# Combine ImageMagick options with temp file-based writing
@writers.register('imagemagick_file')
class ImageMagickFileWriter(ImageMagickBase, FileMovieWriter):
    """
    File-based animated gif writer.

    Frames are written to temporary files on disk and then stitched
    together at the end.
    """

    supported_formats = ['png', 'jpeg', 'tiff', 'raw', 'rgba']
    input_names = property(
        lambda self: f'{self.temp_prefix}*.{self.frame_format}')


# Taken directly from jakevdp's JSAnimation package at
# http://github.com/jakevdp/JSAnimation
def _included_frames(frame_count, frame_format, frame_dir):
    return INCLUDED_FRAMES.format(Nframes=frame_count,
                                  frame_dir=frame_dir,
                                  frame_format=frame_format)


def _embedded_frames(frame_list, frame_format):
    """frame_list should be a list of base64-encoded png files"""
    if frame_format == 'svg':
        # Fix MIME type for svg
        frame_format = 'svg+xml'
    template = '  frames[{0}] = "data:image/{1};base64,{2}"\n'
    return "\n" + "".join(
        template.format(i, frame_format, frame_data.replace('\n', '\\\n'))
        for i, frame_data in enumerate(frame_list))


@writers.register('html')
class HTMLWriter(FileMovieWriter):
    """Writer for JavaScript-based HTML movies."""

    supported_formats = ['png', 'jpeg', 'tiff', 'svg']

    @classmethod
    def isAvailable(cls):
        return True

    def __init__(self, fps=30, codec=None, bitrate=None, extra_args=None,
                 metadata=None, embed_frames=False, default_mode='loop',
                 embed_limit=None):

        if extra_args:
            _log.warning("HTMLWriter ignores 'extra_args'")
        extra_args = ()  # Don't lookup nonexistent rcParam[args_key].
        self.embed_frames = embed_frames
        self.default_mode = default_mode.lower()
        _api.check_in_list(['loop', 'once', 'reflect'],
                           default_mode=self.default_mode)

        # Save embed limit, which is given in MB
        self._bytes_limit = mpl._val_or_rc(embed_limit,
                                           'animation.embed_limit')
        # Convert from MB to bytes
        self._bytes_limit *= 1024 * 1024

        super().__init__(fps, codec, bitrate, extra_args, metadata)

    def setup(self, fig, outfile, dpi=None, frame_dir=None):
        outfile = Path(outfile)
        _api.check_in_list(['.html', '.htm'],
                           outfile_extension=outfile.suffix)

        self._saved_frames = []
        self._total_bytes = 0
        self._hit_limit = False

        if not self.embed_frames:
            if frame_dir is None:
                frame_dir = outfile.with_name(outfile.stem + '_frames')
            frame_dir.mkdir(parents=True, exist_ok=True)
            frame_prefix = frame_dir / 'frame'
        else:
            frame_prefix = None

        super().setup(fig, outfile, dpi, frame_prefix)
        self._clear_temp = False

    def grab_frame(self, **savefig_kwargs):
        _validate_grabframe_kwargs(savefig_kwargs)
        if self.embed_frames:
            # Just stop processing if we hit the limit
            if self._hit_limit:
                return
            f = BytesIO()
            self.fig.savefig(f, format=self.frame_format,
                             dpi=self.dpi, **savefig_kwargs)
            imgdata64 = base64.encodebytes(f.getvalue()).decode('ascii')
            self._total_bytes += len(imgdata64)
            if self._total_bytes >= self._bytes_limit:
                _log.warning(
                    "Animation size has reached %s bytes, exceeding the "
                    "limit of %s. If you're sure you want a larger animation "
                    "embedded, set the animation.embed_limit rc parameter to "
                    "a larger value (in MB). This and further frames will be "
                    "dropped.", self._total_bytes, self._bytes_limit)
                self._hit_limit = True
            else:
                self._saved_frames.append(imgdata64)
        else:
            return super().grab_frame(**savefig_kwargs)

    def finish(self):
        # save the frames to an html file
        if self.embed_frames:
            fill_frames = _embedded_frames(self._saved_frames,
                                           self.frame_format)
            frame_count = len(self._saved_frames)
        else:
            # temp names is filled by FileMovieWriter
            frame_count = len(self._temp_paths)
            fill_frames = _included_frames(
                frame_count, self.frame_format,
                self._temp_paths[0].parent.relative_to(self.outfile.parent))
        mode_dict = dict(once_checked='',
                         loop_checked='',
                         reflect_checked='')
        mode_dict[self.default_mode + '_checked'] = 'checked'

        interval = 1000 // self.fps

        with open(self.outfile, 'w') as of:
            of.write(JS_INCLUDE + STYLE_INCLUDE)
            of.write(DISPLAY_TEMPLATE.format(id=uuid.uuid4().hex,
                                             Nframes=frame_count,
                                             fill_frames=fill_frames,
                                             interval=interval,
                                             **mode_dict))

        # Duplicate the temporary file clean up logic from
        # FileMovieWriter.finish.  We cannot call the inherited version of
        # finish because it assumes that there is a subprocess that we either
        # need to call to merge many frames together or that there is a
        # subprocess call that we need to clean up.
        if self._tmpdir:
            _log.debug('MovieWriter: clearing temporary path=%s',
                       self._tmpdir)
            self._tmpdir.cleanup()


class Animation:
    """
    A base class for Animations.

    This class is not usable as is, and should be subclassed to provide
    needed behavior.

    .. note::

        You must store the created Animation in a variable that lives as
        long as the animation should run.  Otherwise, the Animation object
        will be garbage-collected and the animation stops.

    Parameters
    ----------
    fig : `~matplotlib.figure.Figure`
        The figure object used to get needed events, such as draw or resize.

    event_source : object, optional
        A class that can run a callback when desired events
        are generated, as well as be stopped and started.

        Examples include timers (see `TimedAnimation`) and file
        system notifications.

    blit : bool, default: False
        Whether blitting is used to optimize drawing.  If the backend does
        not support blitting, then this parameter has no effect.

    See Also
    --------
    FuncAnimation, ArtistAnimation
    """

    def __init__(self, fig, event_source=None, blit=False):
        self._draw_was_started = False

        self._fig = fig
        # Disables blitting for backends that don't support it.  This
        # allows users to request it if available, but still have a
        # fallback that works if it is not.
        self._blit = blit and fig.canvas.supports_blit

        # These are the basics of the animation.  The frame sequence
        # represents information for each frame of the animation and depends
        # on how the drawing is handled by the subclasses.  The event source
        # fires events that cause the frame sequence to be iterated.
        self.frame_seq = self.new_frame_seq()
        self.event_source = event_source

        # Instead of starting the event source now, we connect to the
        # figure's draw_event, so that we only start once the figure has
        # been drawn.
        self._first_draw_id = fig.canvas.mpl_connect('draw_event',
                                                     self._start)

        # Connect to the figure's close_event so that we don't continue to
        # fire events and try to draw to a deleted figure.
        self._close_id = self._fig.canvas.mpl_connect('close_event',
                                                      self._stop)
        if self._blit:
            self._setup_blit()

    def __del__(self):
        if not getattr(self, '_draw_was_started', True):
            warnings.warn(
                'Animation was deleted without rendering anything. This is '
                'most likely not intended.
To prevent deletion, assign the '\n 'Animation to a variable, e.g. `anim`, that exists until you '\n 'output the Animation using `plt.show()` or '\n '`anim.save()`.'\n )\n\n def _start(self, *args):\n """\n Starts the interactive animation: adds the draw-frame command to the\n GUI handler and calls show to start the event loop.\n """\n # Do not start the event source while the figure is being saved.\n if self._fig.canvas.is_saving():\n return\n # First disconnect our draw event handler\n self._fig.canvas.mpl_disconnect(self._first_draw_id)\n\n # Now do any initial draw\n self._init_draw()\n\n # Add our callback for stepping the animation and\n # actually start the event_source.\n self.event_source.add_callback(self._step)\n self.event_source.start()\n\n def _stop(self, *args):\n # On stop we disconnect all of our events.\n if self._blit:\n self._fig.canvas.mpl_disconnect(self._resize_id)\n self._fig.canvas.mpl_disconnect(self._close_id)\n self.event_source.remove_callback(self._step)\n self.event_source = None\n\n def save(self, filename, writer=None, fps=None, dpi=None, codec=None,\n bitrate=None, extra_args=None, metadata=None, extra_anim=None,\n savefig_kwargs=None, *, progress_callback=None):\n """\n Save the animation as a movie file by drawing every frame.\n\n Parameters\n ----------\n filename : str\n The output filename, e.g., :file:`mymovie.mp4`.\n\n writer : `MovieWriter` or str, default: :rc:`animation.writer`\n A `MovieWriter` instance to use or a key that identifies a\n class to use, such as 'ffmpeg'.\n\n fps : int, optional\n Movie frame rate (per second). If not set, the frame rate is\n inferred from the animation's frame interval.\n\n dpi : float, default: :rc:`savefig.dpi`\n Controls the dots per inch for the movie frames. Together with\n the figure's size in inches, this controls the size of the movie.\n\n codec : str, default: :rc:`animation.codec`\n The video codec to use. 
Not all codecs are supported by a given\n `MovieWriter`.\n\n bitrate : int, default: :rc:`animation.bitrate`\n The bitrate of the movie, in kilobits per second. Higher values\n mean higher-quality movies but larger files. A value\n of -1 lets the underlying movie encoder select the bitrate.\n\n extra_args : list of str or None, optional\n Extra command-line arguments passed to the underlying movie encoder. These\n arguments are passed last to the encoder, just before the output filename.\n The default, None, means to use :rc:`animation.[name-of-encoder]_args` for\n the builtin writers.\n\n metadata : dict[str, str], default: {}\n Dictionary of keys and values for metadata to include in\n the output file. Some keys that may be of use include:\n title, artist, genre, subject, copyright, srcform, comment.\n\n extra_anim : list, default: []\n Additional `Animation` objects that should be included\n in the saved movie file. These need to be from the same\n `.Figure` instance. Also, animation frames are\n simply combined, so there should be a 1:1 correspondence\n between the frames from the different animations.\n\n savefig_kwargs : dict, default: {}\n Keyword arguments passed to each `~.Figure.savefig` call used to\n save the individual frames.\n\n progress_callback : function, optional\n A callback function that will be called for every frame to report\n the saving progress. It must have the signature ::\n\n def func(current_frame: int, total_frames: int) -> Any\n\n where *current_frame* is the current frame number and *total_frames* is the\n total number of frames to be saved. *total_frames* is set to None if the\n total number of frames cannot be determined. 
Return values may exist but are\n ignored.\n\n Example code to write the progress to stdout::\n\n progress_callback = lambda i, n: print(f'Saving frame {i}/{n}')\n\n Notes\n -----\n *fps*, *codec*, *bitrate*, *extra_args* and *metadata* are used to\n construct a `.MovieWriter` instance and can only be passed if\n *writer* is a string. If they are passed as non-*None* and *writer*\n is a `.MovieWriter`, a `RuntimeError` will be raised.\n """\n\n all_anim = [self]\n if extra_anim is not None:\n all_anim.extend(anim for anim in extra_anim\n if anim._fig is self._fig)\n\n # Disable "Animation was deleted without rendering" warning.\n for anim in all_anim:\n anim._draw_was_started = True\n\n if writer is None:\n writer = mpl.rcParams['animation.writer']\n elif (not isinstance(writer, str) and\n any(arg is not None\n for arg in (fps, codec, bitrate, extra_args, metadata))):\n raise RuntimeError('Passing in values for arguments '\n 'fps, codec, bitrate, extra_args, or metadata '\n 'is not supported when writer is an existing '\n 'MovieWriter instance. These should instead be '\n 'passed as arguments when creating the '\n 'MovieWriter instance.')\n\n if savefig_kwargs is None:\n savefig_kwargs = {}\n else:\n # we are going to mutate this below\n savefig_kwargs = dict(savefig_kwargs)\n\n if fps is None and hasattr(self, '_interval'):\n # Convert interval in ms to frames per second\n fps = 1000. 
/ self._interval\n\n # Reuse the savefig DPI for ours if none is given.\n dpi = mpl._val_or_rc(dpi, 'savefig.dpi')\n if dpi == 'figure':\n dpi = self._fig.dpi\n\n writer_kwargs = {}\n if codec is not None:\n writer_kwargs['codec'] = codec\n if bitrate is not None:\n writer_kwargs['bitrate'] = bitrate\n if extra_args is not None:\n writer_kwargs['extra_args'] = extra_args\n if metadata is not None:\n writer_kwargs['metadata'] = metadata\n\n # If we have the name of a writer, instantiate an instance of the\n # registered class.\n if isinstance(writer, str):\n try:\n writer_cls = writers[writer]\n except RuntimeError: # Raised if not available.\n writer_cls = PillowWriter # Always available.\n _log.warning("MovieWriter %s unavailable; using Pillow "\n "instead.", writer)\n writer = writer_cls(fps, **writer_kwargs)\n _log.info('Animation.save using %s', type(writer))\n\n if 'bbox_inches' in savefig_kwargs:\n _log.warning("Warning: discarding the 'bbox_inches' argument in "\n "'savefig_kwargs' as it may cause frame size "\n "to vary, which is inappropriate for animation.")\n savefig_kwargs.pop('bbox_inches')\n\n # Create a new sequence of frames for saved data. This is different\n # from new_frame_seq() to give the ability to save 'live' generated\n # frame information to be saved later.\n # TODO: Right now, after closing the figure, saving a movie won't work\n # since GUI widgets are gone. 
Either need to remove extra code to\n # allow for this non-existent use case or find a way to make it work.\n\n def _pre_composite_to_white(color):\n r, g, b, a = mcolors.to_rgba(color)\n return a * np.array([r, g, b]) + 1 - a\n\n # canvas._is_saving = True makes the draw_event animation-starting\n # callback a no-op; canvas.manager = None prevents resizing the GUI\n # widget (both are likewise done in savefig()).\n with (writer.saving(self._fig, filename, dpi),\n cbook._setattr_cm(self._fig.canvas, _is_saving=True, manager=None)):\n if not writer._supports_transparency():\n facecolor = savefig_kwargs.get('facecolor',\n mpl.rcParams['savefig.facecolor'])\n if facecolor == 'auto':\n facecolor = self._fig.get_facecolor()\n savefig_kwargs['facecolor'] = _pre_composite_to_white(facecolor)\n savefig_kwargs['transparent'] = False # just to be safe!\n\n for anim in all_anim:\n anim._init_draw() # Clear the initial frame\n frame_number = 0\n # TODO: Currently only FuncAnimation has a save_count\n # attribute. Can we generalize this to all Animations?\n save_count_list = [getattr(a, '_save_count', None)\n for a in all_anim]\n if None in save_count_list:\n total_frames = None\n else:\n total_frames = sum(save_count_list)\n for data in zip(*[a.new_saved_frame_seq() for a in all_anim]):\n for anim, d in zip(all_anim, data):\n # TODO: See if turning off blit is really necessary\n anim._draw_next_frame(d, blit=False)\n if progress_callback is not None:\n progress_callback(frame_number, total_frames)\n frame_number += 1\n writer.grab_frame(**savefig_kwargs)\n\n def _step(self, *args):\n """\n Handler for getting events. 
By default, gets the next frame in the\n sequence and hands the data off to be drawn.\n """\n # Returns True to indicate that the event source should continue to\n # call _step, until the frame sequence reaches the end of iteration,\n # at which point False will be returned.\n try:\n framedata = next(self.frame_seq)\n self._draw_next_frame(framedata, self._blit)\n return True\n except StopIteration:\n return False\n\n def new_frame_seq(self):\n """Return a new sequence of frame information."""\n # Default implementation is just an iterator over self._framedata\n return iter(self._framedata)\n\n def new_saved_frame_seq(self):\n """Return a new sequence of saved/cached frame information."""\n # Default is the same as the regular frame sequence\n return self.new_frame_seq()\n\n def _draw_next_frame(self, framedata, blit):\n # Breaks down the drawing of the next frame into steps of pre- and\n # post- draw, as well as the drawing of the frame itself.\n self._pre_draw(framedata, blit)\n self._draw_frame(framedata)\n self._post_draw(framedata, blit)\n\n def _init_draw(self):\n # Initial draw to clear the frame. 
Also used by the blitting code\n # when a clean base is required.\n self._draw_was_started = True\n\n def _pre_draw(self, framedata, blit):\n # Perform any cleaning or whatnot before the drawing of the frame.\n # This default implementation allows blit to clear the frame.\n if blit:\n self._blit_clear(self._drawn_artists)\n\n def _draw_frame(self, framedata):\n # Performs actual drawing of the frame.\n raise NotImplementedError('Needs to be implemented by subclasses to'\n ' actually make an animation.')\n\n def _post_draw(self, framedata, blit):\n # After the frame is rendered, this handles the actual flushing of\n # the draw, which can be a direct draw_idle() or make use of the\n # blitting.\n if blit and self._drawn_artists:\n self._blit_draw(self._drawn_artists)\n else:\n self._fig.canvas.draw_idle()\n\n # The rest of the code in this class is to facilitate easy blitting\n def _blit_draw(self, artists):\n # Handles blitted drawing, which renders only the artists given instead\n # of the entire figure.\n updated_ax = {a.axes for a in artists}\n # Enumerate artists to cache Axes backgrounds. We do not draw\n # artists yet to not cache foreground from plots with shared Axes\n for ax in updated_ax:\n # If we haven't cached the background for the current view of this\n # Axes object, do so now. This might not always be reliable, but\n # it's an attempt to automate the process.\n cur_view = ax._get_view()\n view, bg = self._blit_cache.get(ax, (object(), None))\n if cur_view != view:\n self._blit_cache[ax] = (\n cur_view, ax.figure.canvas.copy_from_bbox(ax.bbox))\n # Make a separate pass to draw foreground.\n for a in artists:\n a.axes.draw_artist(a)\n # After rendering all the needed artists, blit each Axes individually.\n for ax in updated_ax:\n ax.figure.canvas.blit(ax.bbox)\n\n def _blit_clear(self, artists):\n # Get a list of the Axes that need clearing from the artists that\n # have been drawn. 
Grab the appropriate saved background from the\n # cache and restore.\n axes = {a.axes for a in artists}\n for ax in axes:\n try:\n view, bg = self._blit_cache[ax]\n except KeyError:\n continue\n if ax._get_view() == view:\n ax.figure.canvas.restore_region(bg)\n else:\n self._blit_cache.pop(ax)\n\n def _setup_blit(self):\n # Setting up the blit requires: a cache of the background for the Axes\n self._blit_cache = dict()\n self._drawn_artists = []\n # _post_draw needs to be called first to initialize the renderer\n self._post_draw(None, self._blit)\n # Then we need to clear the Frame for the initial draw\n # This is typically handled in _on_resize because QT and Tk\n # emit a resize event on launch, but the macosx backend does not,\n # thus we force it here for everyone for consistency\n self._init_draw()\n # Connect to future resize events\n self._resize_id = self._fig.canvas.mpl_connect('resize_event',\n self._on_resize)\n\n def _on_resize(self, event):\n # On resize, we need to disable the resize event handling so we don't\n # get too many events. Also stop the animation events, so that\n # we're paused. Reset the cache and re-init. Set up an event handler\n # to catch once the draw has actually taken place.\n self._fig.canvas.mpl_disconnect(self._resize_id)\n self.event_source.stop()\n self._blit_cache.clear()\n self._init_draw()\n self._resize_id = self._fig.canvas.mpl_connect('draw_event',\n self._end_redraw)\n\n def _end_redraw(self, event):\n # Now that the redraw has happened, do the post draw flushing and\n # blit handling. 
Then re-enable all of the original events.\n self._post_draw(None, False)\n self.event_source.start()\n self._fig.canvas.mpl_disconnect(self._resize_id)\n self._resize_id = self._fig.canvas.mpl_connect('resize_event',\n self._on_resize)\n\n def to_html5_video(self, embed_limit=None):\n """\n Convert the animation to an HTML5 ``<video>`` tag.\n\n This saves the animation as an h264 video, encoded in base64\n directly into the HTML5 video tag. This respects :rc:`animation.writer`\n and :rc:`animation.bitrate`. This also makes use of the\n *interval* to control the speed, and uses the *repeat*\n parameter to decide whether to loop.\n\n Parameters\n ----------\n embed_limit : float, optional\n Limit, in MB, of the returned animation. No animation is created\n if the limit is exceeded.\n Defaults to :rc:`animation.embed_limit` = 20.0.\n\n Returns\n -------\n str\n An HTML5 video tag with the animation embedded as base64 encoded\n h264 video.\n If the *embed_limit* is exceeded, this returns the string\n "Video too large to embed."\n """\n VIDEO_TAG = r'''<video {size} {options}>\n <source type="video/mp4" src="data:video/mp4;base64,{video}">\n Your browser does not support the video tag.\n</video>'''\n # Cache the rendering of the video as HTML\n if not hasattr(self, '_base64_video'):\n # Save embed limit, which is given in MB\n embed_limit = mpl._val_or_rc(embed_limit, 'animation.embed_limit')\n\n # Convert from MB to bytes\n embed_limit *= 1024 * 1024\n\n # Can't open a NamedTemporaryFile twice on Windows, so use a\n # TemporaryDirectory instead.\n with TemporaryDirectory() as tmpdir:\n path = Path(tmpdir, "temp.m4v")\n # We create a writer manually so that we can get the\n # appropriate size for the tag\n Writer = writers[mpl.rcParams['animation.writer']]\n writer = Writer(codec='h264',\n bitrate=mpl.rcParams['animation.bitrate'],\n fps=1000. 
/ self._interval)\n self.save(str(path), writer=writer)\n # Now open and base64 encode.\n vid64 = base64.encodebytes(path.read_bytes())\n\n vid_len = len(vid64)\n if vid_len >= embed_limit:\n _log.warning(\n "Animation movie is %s bytes, exceeding the limit of %s. "\n "If you're sure you want a large animation embedded, set "\n "the animation.embed_limit rc parameter to a larger value "\n "(in MB).", vid_len, embed_limit)\n else:\n self._base64_video = vid64.decode('ascii')\n self._video_size = 'width="{}" height="{}"'.format(\n *writer.frame_size)\n\n # If we exceeded the size, this attribute won't exist\n if hasattr(self, '_base64_video'):\n # Default HTML5 options are to autoplay and display video controls\n options = ['controls', 'autoplay']\n\n # If we're set to repeat, make it loop\n if getattr(self, '_repeat', False):\n options.append('loop')\n\n return VIDEO_TAG.format(video=self._base64_video,\n size=self._video_size,\n options=' '.join(options))\n else:\n return 'Video too large to embed.'\n\n def to_jshtml(self, fps=None, embed_frames=True, default_mode=None):\n """\n Generate HTML representation of the animation.\n\n Parameters\n ----------\n fps : int, optional\n Movie frame rate (per second). If not set, the frame rate is\n inferred from the animation's frame interval.\n embed_frames : bool, optional\n Whether to embed the frames in the HTML file as base64-encoded\n data rather than writing them to separate image files.\n default_mode : str, optional\n What to do when the animation ends. Must be one of ``{'loop',\n 'once', 'reflect'}``. 
Defaults to ``'loop'`` if the *repeat*\n parameter is True, otherwise ``'once'``.\n\n Returns\n -------\n str\n An HTML representation of the animation embedded as a js object, as\n produced by the `.HTMLWriter`.\n """\n if fps is None and hasattr(self, '_interval'):\n # Convert interval in ms to frames per second\n fps = 1000 / self._interval\n\n # If we're not given a default mode, choose one based on the value of\n # the _repeat attribute\n if default_mode is None:\n default_mode = 'loop' if getattr(self, '_repeat',\n False) else 'once'\n\n if not hasattr(self, "_html_representation"):\n # Can't open a NamedTemporaryFile twice on Windows, so use a\n # TemporaryDirectory instead.\n with TemporaryDirectory() as tmpdir:\n path = Path(tmpdir, "temp.html")\n writer = HTMLWriter(fps=fps,\n embed_frames=embed_frames,\n default_mode=default_mode)\n self.save(str(path), writer=writer)\n self._html_representation = path.read_text()\n\n return self._html_representation\n\n def _repr_html_(self):\n """IPython display hook for rendering."""\n fmt = mpl.rcParams['animation.html']\n if fmt == 'html5':\n return self.to_html5_video()\n elif fmt == 'jshtml':\n return self.to_jshtml()\n\n def pause(self):\n """Pause the animation."""\n self.event_source.stop()\n if self._blit:\n for artist in self._drawn_artists:\n artist.set_animated(False)\n\n def resume(self):\n """Resume the animation."""\n self.event_source.start()\n if self._blit:\n for artist in self._drawn_artists:\n artist.set_animated(True)\n\n\nclass TimedAnimation(Animation):\n """\n `Animation` subclass for time-based animation.\n\n A new frame is drawn every *interval* milliseconds.\n\n .. note::\n\n You must store the created Animation in a variable that lives as long\n as the animation should run. 
Otherwise, the Animation object will be\n garbage-collected and the animation stops.\n\n Parameters\n ----------\n fig : `~matplotlib.figure.Figure`\n The figure object used to get needed events, such as draw or resize.\n interval : int, default: 200\n Delay between frames in milliseconds.\n repeat_delay : int, default: 0\n The delay in milliseconds between consecutive animation runs, if\n *repeat* is True.\n repeat : bool, default: True\n Whether the animation repeats when the sequence of frames is completed.\n blit : bool, default: False\n Whether blitting is used to optimize drawing.\n """\n def __init__(self, fig, interval=200, repeat_delay=0, repeat=True,\n event_source=None, *args, **kwargs):\n self._interval = interval\n # Undocumented support for repeat_delay = None as backcompat.\n self._repeat_delay = repeat_delay if repeat_delay is not None else 0\n self._repeat = repeat\n # If we're not given an event source, create a new timer. This permits\n # sharing timers between animation objects for syncing animations.\n if event_source is None:\n event_source = fig.canvas.new_timer(interval=self._interval)\n super().__init__(fig, event_source=event_source, *args, **kwargs)\n\n def _step(self, *args):\n """Handler for getting events."""\n # Extends the _step() method for the Animation class. If\n # Animation._step signals that it reached the end and we want to\n # repeat, we refresh the frame sequence and return True. If\n # _repeat_delay is set, change the event_source's interval to our loop\n # delay and set the callback to one which will then set the interval\n # back.\n still_going = super()._step(*args)\n if not still_going:\n if self._repeat:\n # Restart the draw loop\n self._init_draw()\n self.frame_seq = self.new_frame_seq()\n self.event_source.interval = self._repeat_delay\n return True\n else:\n # We are done with the animation. 
Call pause to remove\n # animated flags from artists that were using blitting\n self.pause()\n if self._blit:\n # Remove the resize callback if we were blitting\n self._fig.canvas.mpl_disconnect(self._resize_id)\n self._fig.canvas.mpl_disconnect(self._close_id)\n self.event_source = None\n return False\n\n self.event_source.interval = self._interval\n return True\n\n\nclass ArtistAnimation(TimedAnimation):\n """\n `TimedAnimation` subclass that creates an animation by using a fixed\n set of `.Artist` objects.\n\n Before creating an instance, all plotting should have taken place\n and the relevant artists saved.\n\n .. note::\n\n You must store the created Animation in a variable that lives as long\n as the animation should run. Otherwise, the Animation object will be\n garbage-collected and the animation stops.\n\n Parameters\n ----------\n fig : `~matplotlib.figure.Figure`\n The figure object used to get needed events, such as draw or resize.\n artists : list\n Each list entry is a collection of `.Artist` objects that are made\n visible on the corresponding frame. 
Other artists are made invisible.\n interval : int, default: 200\n Delay between frames in milliseconds.\n repeat_delay : int, default: 0\n The delay in milliseconds between consecutive animation runs, if\n *repeat* is True.\n repeat : bool, default: True\n Whether the animation repeats when the sequence of frames is completed.\n blit : bool, default: False\n Whether blitting is used to optimize drawing.\n """\n\n def __init__(self, fig, artists, *args, **kwargs):\n # Internal list of artists drawn in the most recent frame.\n self._drawn_artists = []\n\n # Use the list of artists as the framedata, which will be iterated\n # over by the machinery.\n self._framedata = artists\n super().__init__(fig, *args, **kwargs)\n\n def _init_draw(self):\n super()._init_draw()\n # Make all the artists involved in *any* frame invisible\n figs = set()\n for f in self.new_frame_seq():\n for artist in f:\n artist.set_visible(False)\n artist.set_animated(self._blit)\n # Assemble a list of unique figures that need flushing\n if artist.get_figure() not in figs:\n figs.add(artist.get_figure())\n\n # Flush the needed figures\n for fig in figs:\n fig.canvas.draw_idle()\n\n def _pre_draw(self, framedata, blit):\n """Clears artists from the last frame."""\n if blit:\n # Let blit handle clearing\n self._blit_clear(self._drawn_artists)\n else:\n # Otherwise, make all the artists from the previous frame invisible\n for artist in self._drawn_artists:\n artist.set_visible(False)\n\n def _draw_frame(self, artists):\n # Save the artists that were passed in as framedata for the other\n # steps (esp. blitting) to use.\n self._drawn_artists = artists\n\n # Make all the artists from the current frame visible\n for artist in artists:\n artist.set_visible(True)\n\n\nclass FuncAnimation(TimedAnimation):\n """\n `TimedAnimation` subclass that makes an animation by repeatedly calling\n a function *func*.\n\n .. 
note::\n\n You must store the created Animation in a variable that lives as long\n as the animation should run. Otherwise, the Animation object will be\n garbage-collected and the animation stops.\n\n Parameters\n ----------\n fig : `~matplotlib.figure.Figure`\n The figure object used to get needed events, such as draw or resize.\n\n func : callable\n The function to call at each frame. The first argument will\n be the next value in *frames*. Any additional positional\n arguments can be supplied using `functools.partial` or via the *fargs*\n parameter.\n\n The required signature is::\n\n def func(frame, *fargs) -> iterable_of_artists\n\n It is often more convenient to provide the arguments using\n `functools.partial`. In this way it is also possible to pass keyword\n arguments. To pass a function with both positional and keyword\n arguments, set all arguments as keyword arguments, just leaving the\n *frame* argument unset::\n\n def func(frame, art, *, y=None):\n ...\n\n ani = FuncAnimation(fig, partial(func, art=ln, y='foo'))\n\n If ``blit == True``, *func* must return an iterable of all artists\n that were modified or created. This information is used by the blitting\n algorithm to determine which parts of the figure have to be updated.\n The return value is unused if ``blit == False`` and may be omitted in\n that case.\n\n frames : iterable, int, generator function, or None, optional\n Source of data to pass *func* and each frame of the animation\n\n - If an iterable, then simply use the values provided. 
If the\n iterable has a length, it will override the *save_count* kwarg.\n\n - If an integer, then equivalent to passing ``range(frames)``\n\n - If a generator function, then must have the signature::\n\n def gen_function() -> obj\n\n - If *None*, then equivalent to passing ``itertools.count``.\n\n In all of these cases, the values in *frames* are simply passed through\n to the user-supplied *func* and thus can be of any type.\n\n init_func : callable, optional\n A function used to draw a clear frame. If not given, the results of\n drawing from the first item in the frames sequence will be used. This\n function will be called once before the first frame.\n\n The required signature is::\n\n def init_func() -> iterable_of_artists\n\n If ``blit == True``, *init_func* must return an iterable of artists\n to be re-drawn. This information is used by the blitting algorithm to\n determine which parts of the figure have to be updated. The return\n value is unused if ``blit == False`` and may be omitted in that case.\n\n fargs : tuple or None, optional\n Additional arguments to pass to each call to *func*. Note: the use of\n `functools.partial` is preferred over *fargs*. See *func* for details.\n\n save_count : int, optional\n Fallback for the number of values from *frames* to cache. This is\n only used if the number of frames cannot be inferred from *frames*,\n i.e. when it's an iterator without length or a generator.\n\n interval : int, default: 200\n Delay between frames in milliseconds.\n\n repeat_delay : int, default: 0\n The delay in milliseconds between consecutive animation runs, if\n *repeat* is True.\n\n repeat : bool, default: True\n Whether the animation repeats when the sequence of frames is completed.\n\n blit : bool, default: False\n Whether blitting is used to optimize drawing. 
Note: when using\n blitting, any animated artists will be drawn according to their zorder;\n however, they will be drawn on top of any previous artists, regardless\n of their zorder.\n\n cache_frame_data : bool, default: True\n Whether frame data is cached. Disabling cache might be helpful when\n frames contain large objects.\n """\n def __init__(self, fig, func, frames=None, init_func=None, fargs=None,\n save_count=None, *, cache_frame_data=True, **kwargs):\n if fargs:\n self._args = fargs\n else:\n self._args = ()\n self._func = func\n self._init_func = init_func\n\n # Amount of framedata to keep around for saving movies. This is only\n # used if we don't know how many frames there will be: in the case\n # of no generator or in the case of a callable.\n self._save_count = save_count\n # Set up a function that creates a new iterable when needed. If nothing\n # is passed in for frames, just use itertools.count, which will just\n # keep counting from 0. A callable passed in for frames is assumed to\n # be a generator. 
An iterable will be used as is, and anything else\n # will be treated as a number of frames.\n if frames is None:\n self._iter_gen = itertools.count\n elif callable(frames):\n self._iter_gen = frames\n elif np.iterable(frames):\n if kwargs.get('repeat', True):\n self._tee_from = frames\n def iter_frames(frames=frames):\n this, self._tee_from = itertools.tee(self._tee_from, 2)\n yield from this\n self._iter_gen = iter_frames\n else:\n self._iter_gen = lambda: iter(frames)\n if hasattr(frames, '__len__'):\n self._save_count = len(frames)\n if save_count is not None:\n _api.warn_external(\n f"You passed in an explicit {save_count=} "\n "which is being ignored in favor of "\n f"{len(frames)=}."\n )\n else:\n self._iter_gen = lambda: iter(range(frames))\n self._save_count = frames\n if save_count is not None:\n _api.warn_external(\n f"You passed in an explicit {save_count=} which is being "\n f"ignored in favor of {frames=}."\n )\n if self._save_count is None and cache_frame_data:\n _api.warn_external(\n f"{frames=!r} which we cannot infer the length of, "\n "did not pass an explicit *save_count* "\n f"and passed {cache_frame_data=}. To avoid a possibly "\n "unbounded cache, frame data caching has been disabled. "\n "To suppress this warning either pass "\n "`cache_frame_data=False` or `save_count=MAX_FRAMES`."\n )\n cache_frame_data = False\n\n self._cache_frame_data = cache_frame_data\n\n # Needs to be initialized so the draw functions work without checking\n self._save_seq = []\n\n super().__init__(fig, **kwargs)\n\n # Need to reset the saved seq, since right now it will contain data\n # for a single frame from init, which is not what we want.\n self._save_seq = []\n\n def new_frame_seq(self):\n # Use the generating function to generate a new frame sequence\n return self._iter_gen()\n\n def new_saved_frame_seq(self):\n # Generate an iterator for the sequence of saved data. 
If there are\n # no saved frames, generate a new frame sequence and take the first\n # save_count entries in it.\n if self._save_seq:\n # While iterating we are going to update _save_seq\n # so make a copy to safely iterate over\n self._old_saved_seq = list(self._save_seq)\n return iter(self._old_saved_seq)\n else:\n if self._save_count is None:\n frame_seq = self.new_frame_seq()\n\n def gen():\n try:\n while True:\n yield next(frame_seq)\n except StopIteration:\n pass\n return gen()\n else:\n return itertools.islice(self.new_frame_seq(), self._save_count)\n\n def _init_draw(self):\n super()._init_draw()\n # Initialize the drawing either using the given init_func or by\n # calling the draw function with the first item of the frame sequence.\n # For blitting, the init_func should return a sequence of modified\n # artists.\n if self._init_func is None:\n try:\n frame_data = next(self.new_frame_seq())\n except StopIteration:\n # we can't start the iteration, it may have already been\n # exhausted by a previous save or just be 0 length.\n # warn and bail.\n warnings.warn(\n "Can not start iterating the frames for the initial draw. "\n "This can be caused by passing in a 0 length sequence "\n "for *frames*.\n\n"\n "If you passed *frames* as a generator "\n "it may be exhausted due to a previous display or save."\n )\n return\n self._draw_frame(frame_data)\n else:\n self._drawn_artists = self._init_func()\n if self._blit:\n if self._drawn_artists is None:\n raise RuntimeError('The init_func must return a '\n 'sequence of Artist objects.')\n for a in self._drawn_artists:\n a.set_animated(self._blit)\n self._save_seq = []\n\n def _draw_frame(self, framedata):\n if self._cache_frame_data:\n # Save the data for potential saving of movies.\n self._save_seq.append(framedata)\n self._save_seq = self._save_seq[-self._save_count:]\n\n # Call the func with framedata and args. 
If blitting is desired,\n # func needs to return a sequence of any artists that were modified.\n self._drawn_artists = self._func(framedata, *self._args)\n\n if self._blit:\n\n err = RuntimeError('The animation function must return a sequence '\n 'of Artist objects.')\n try:\n # check that we got a sequence\n iter(self._drawn_artists)\n except TypeError:\n raise err from None\n\n # check that each item is an Artist\n for i in self._drawn_artists:\n if not isinstance(i, mpl.artist.Artist):\n raise err\n\n self._drawn_artists = sorted(self._drawn_artists,\n key=lambda x: x.get_zorder())\n\n for a in self._drawn_artists:\n a.set_animated(self._blit)\n\n\ndef _validate_grabframe_kwargs(savefig_kwargs):\n if mpl.rcParams['savefig.bbox'] == 'tight':\n raise ValueError(\n f"{mpl.rcParams['savefig.bbox']=} must not be 'tight' as it "\n "may cause frame size to vary, which is inappropriate for animation."\n )\n for k in ('dpi', 'bbox_inches', 'format'):\n if k in savefig_kwargs:\n raise TypeError(\n f"grab_frame got an unexpected keyword argument {k!r}"\n )\n | .venv\Lib\site-packages\matplotlib\animation.py | animation.py | Python | 73,114 | 0.75 | 0.218321 | 0.156697 | node-utils | 376 | 2024-06-22T00:11:02.789192 | BSD-3-Clause | false | cc6a0783c9f7a6cc4986eed38de442c1
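The `iter_frames` closure in `FuncAnimation.__init__` above re-tees the remaining frame stream each time a new sequence is requested, which is how `repeat=True` can work even when *frames* is a one-shot iterator. A minimal standalone sketch of that pattern (the `make_repeatable` helper name is mine, not matplotlib's):

```python
import itertools

def make_repeatable(iterable):
    # Keep the "rest of the stream" in a mutable cell; each call to
    # iter_frames() splits it with itertools.tee, consumes one branch,
    # and stores the other branch for the next replay.
    state = {"tee_from": iterable}

    def iter_frames():
        this, state["tee_from"] = itertools.tee(state["tee_from"], 2)
        yield from this

    return iter_frames

frames = iter([0, 1, 2])   # a one-shot iterator
gen = make_repeatable(frames)
first = list(gen())        # consumes one tee'd branch
second = list(gen())       # replays the same frames from the saved branch
```

Like the original, this trades memory for replayability: `tee` buffers items internally, which is exactly why the code also warns about unbounded frame caching.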
import abc\nfrom collections.abc import Callable, Collection, Iterable, Sequence, Generator\nimport contextlib\nfrom pathlib import Path\nfrom matplotlib.artist import Artist\nfrom matplotlib.backend_bases import TimerBase\nfrom matplotlib.figure import Figure\n\nfrom typing import Any\n\nsubprocess_creation_flags: int\n\ndef adjusted_figsize(w: float, h: float, dpi: float, n: int) -> tuple[float, float]: ...\n\nclass MovieWriterRegistry:\n def __init__(self) -> None: ...\n def register(\n self, name: str\n ) -> Callable[[type[AbstractMovieWriter]], type[AbstractMovieWriter]]: ...\n def is_available(self, name: str) -> bool: ...\n def __iter__(self) -> Generator[str, None, None]: ...\n def list(self) -> list[str]: ...\n def __getitem__(self, name: str) -> type[AbstractMovieWriter]: ...\n\nwriters: MovieWriterRegistry\n\nclass AbstractMovieWriter(abc.ABC, metaclass=abc.ABCMeta):\n fps: int\n metadata: dict[str, str]\n codec: str\n bitrate: int\n def __init__(\n self,\n fps: int = ...,\n metadata: dict[str, str] | None = ...,\n codec: str | None = ...,\n bitrate: int | None = ...,\n ) -> None: ...\n outfile: str | Path\n fig: Figure\n dpi: float\n\n @abc.abstractmethod\n def setup(self, fig: Figure, outfile: str | Path, dpi: float | None = ...) 
-> None: ...\n @property\n def frame_size(self) -> tuple[int, int]: ...\n @abc.abstractmethod\n def grab_frame(self, **savefig_kwargs) -> None: ...\n @abc.abstractmethod\n def finish(self) -> None: ...\n @contextlib.contextmanager\n def saving(\n self, fig: Figure, outfile: str | Path, dpi: float | None, *args, **kwargs\n ) -> Generator[AbstractMovieWriter, None, None]: ...\n\nclass MovieWriter(AbstractMovieWriter):\n supported_formats: list[str]\n frame_format: str\n extra_args: list[str] | None\n def __init__(\n self,\n fps: int = ...,\n codec: str | None = ...,\n bitrate: int | None = ...,\n extra_args: list[str] | None = ...,\n metadata: dict[str, str] | None = ...,\n ) -> None: ...\n def setup(self, fig: Figure, outfile: str | Path, dpi: float | None = ...) -> None: ...\n def grab_frame(self, **savefig_kwargs) -> None: ...\n def finish(self) -> None: ...\n @classmethod\n def bin_path(cls) -> str: ...\n @classmethod\n def isAvailable(cls) -> bool: ...\n\nclass FileMovieWriter(MovieWriter):\n fig: Figure\n outfile: str | Path\n dpi: float\n temp_prefix: str\n fname_format_str: str\n def setup(\n self,\n fig: Figure,\n outfile: str | Path,\n dpi: float | None = ...,\n frame_prefix: str | Path | None = ...,\n ) -> None: ...\n def __del__(self) -> None: ...\n @property\n def frame_format(self) -> str: ...\n @frame_format.setter\n def frame_format(self, frame_format: str) -> None: ...\n\nclass PillowWriter(AbstractMovieWriter):\n @classmethod\n def isAvailable(cls) -> bool: ...\n def setup(\n self, fig: Figure, outfile: str | Path, dpi: float | None = ...\n ) -> None: ...\n def grab_frame(self, **savefig_kwargs) -> None: ...\n def finish(self) -> None: ...\n\nclass FFMpegBase:\n codec: str\n @property\n def output_args(self) -> list[str]: ...\n\nclass FFMpegWriter(FFMpegBase, MovieWriter): ...\n\nclass FFMpegFileWriter(FFMpegBase, FileMovieWriter):\n supported_formats: list[str]\n\nclass ImageMagickBase:\n @classmethod\n def bin_path(cls) -> str: ...\n 
@classmethod\n def isAvailable(cls) -> bool: ...\n\nclass ImageMagickWriter(ImageMagickBase, MovieWriter):\n input_names: str\n\nclass ImageMagickFileWriter(ImageMagickBase, FileMovieWriter):\n supported_formats: list[str]\n @property\n def input_names(self) -> str: ...\n\nclass HTMLWriter(FileMovieWriter):\n supported_formats: list[str]\n @classmethod\n def isAvailable(cls) -> bool: ...\n embed_frames: bool\n default_mode: str\n def __init__(\n self,\n fps: int = ...,\n codec: str | None = ...,\n bitrate: int | None = ...,\n extra_args: list[str] | None = ...,\n metadata: dict[str, str] | None = ...,\n embed_frames: bool = ...,\n default_mode: str = ...,\n embed_limit: float | None = ...,\n ) -> None: ...\n def setup(\n self,\n fig: Figure,\n outfile: str | Path,\n dpi: float | None = ...,\n frame_dir: str | Path | None = ...,\n ) -> None: ...\n def grab_frame(self, **savefig_kwargs): ...\n def finish(self) -> None: ...\n\nclass Animation:\n frame_seq: Iterable[Artist]\n event_source: Any\n def __init__(\n self, fig: Figure, event_source: Any | None = ..., blit: bool = ...\n ) -> None: ...\n def __del__(self) -> None: ...\n def save(\n self,\n filename: str | Path,\n writer: AbstractMovieWriter | str | None = ...,\n fps: int | None = ...,\n dpi: float | None = ...,\n codec: str | None = ...,\n bitrate: int | None = ...,\n extra_args: list[str] | None = ...,\n metadata: dict[str, str] | None = ...,\n extra_anim: list[Animation] | None = ...,\n savefig_kwargs: dict[str, Any] | None = ...,\n *,\n progress_callback: Callable[[int, int], Any] | None = ...\n ) -> None: ...\n def new_frame_seq(self) -> Iterable[Artist]: ...\n def new_saved_frame_seq(self) -> Iterable[Artist]: ...\n def to_html5_video(self, embed_limit: float | None = ...) 
-> str: ...\n def to_jshtml(\n self,\n fps: int | None = ...,\n embed_frames: bool = ...,\n default_mode: str | None = ...,\n ) -> str: ...\n def _repr_html_(self) -> str: ...\n def pause(self) -> None: ...\n def resume(self) -> None: ...\n\nclass TimedAnimation(Animation):\n def __init__(\n self,\n fig: Figure,\n interval: int = ...,\n repeat_delay: int = ...,\n repeat: bool = ...,\n event_source: TimerBase | None = ...,\n *args,\n **kwargs\n ) -> None: ...\n\nclass ArtistAnimation(TimedAnimation):\n def __init__(self, fig: Figure, artists: Sequence[Collection[Artist]], *args, **kwargs) -> None: ...\n\nclass FuncAnimation(TimedAnimation):\n def __init__(\n self,\n fig: Figure,\n func: Callable[..., Iterable[Artist]],\n frames: Iterable | int | Callable[[], Generator] | None = ...,\n init_func: Callable[[], Iterable[Artist]] | None = ...,\n fargs: tuple[Any, ...] | None = ...,\n save_count: int | None = ...,\n *,\n cache_frame_data: bool = ...,\n **kwargs\n ) -> None: ...\n | .venv\Lib\site-packages\matplotlib\animation.pyi | animation.pyi | Other | 6,566 | 0.85 | 0.299539 | 0.02551 | node-utils | 977 | 2024-12-22T04:39:02.464675 | MIT | false | 36274df713137653bbab853049ae6279 |
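The stub above declares the writer lifecycle that `Animation.save` relies on: `setup`, repeated `grab_frame` calls, then `finish`, with `saving` wrapping all three as a context manager. A toy illustration of that protocol, assuming a fake writer (`ToyWriter` and its in-memory frame list are my invention, not a matplotlib class):

```python
import contextlib

class ToyWriter:
    def setup(self, fig, outfile, dpi=None):
        # Real writers open a pipe or temp files here.
        self.outfile = outfile
        self.frames = []
        self.finished = False

    def grab_frame(self, **savefig_kwargs):
        # Real writers render the figure and feed the bytes to an encoder.
        self.frames.append(savefig_kwargs)

    def finish(self):
        self.finished = True

    @contextlib.contextmanager
    def saving(self, fig, outfile, dpi, *args, **kwargs):
        # Mirrors the shape of AbstractMovieWriter.saving: set up,
        # yield self, and always finish even if frame grabbing raises.
        self.setup(fig, outfile, dpi, *args, **kwargs)
        try:
            yield self
        finally:
            self.finish()

writer = ToyWriter()
with writer.saving(fig=None, outfile="movie.toy", dpi=100) as w:
    for _ in range(3):
        w.grab_frame()
```

The `try`/`finally` around the `yield` is the important part of the design: the encoder is flushed and closed even when an exception escapes the frame loop.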
from collections import namedtuple\nimport contextlib\nfrom functools import cache, reduce, wraps\nimport inspect\nfrom inspect import Signature, Parameter\nimport logging\nfrom numbers import Number, Real\nimport operator\nimport re\nimport warnings\n\nimport numpy as np\n\nimport matplotlib as mpl\nfrom . import _api, cbook\nfrom .path import Path\nfrom .transforms import (BboxBase, Bbox, IdentityTransform, Transform, TransformedBbox,\n TransformedPatchPath, TransformedPath)\n\n_log = logging.getLogger(__name__)\n\n\ndef _prevent_rasterization(draw):\n # We assume that by default artists are not allowed to rasterize (unless\n # its draw method is explicitly decorated). If it is being drawn after a\n # rasterized artist and it has reached a raster_depth of 0, we stop\n # rasterization so that it does not affect the behavior of normal artist\n # (e.g., change in dpi).\n\n @wraps(draw)\n def draw_wrapper(artist, renderer, *args, **kwargs):\n if renderer._raster_depth == 0 and renderer._rasterizing:\n # Only stop when we are not in a rasterized parent\n # and something has been rasterized since last stop.\n renderer.stop_rasterizing()\n renderer._rasterizing = False\n\n return draw(artist, renderer, *args, **kwargs)\n\n draw_wrapper._supports_rasterization = False\n return draw_wrapper\n\n\ndef allow_rasterization(draw):\n """\n Decorator for Artist.draw method. Provides routines\n that run before and after the draw call. 
The before and after functions\n are useful for changing artist-dependent renderer attributes or making\n other setup function calls, such as starting and flushing a mixed-mode\n renderer.\n """\n\n @wraps(draw)\n def draw_wrapper(artist, renderer):\n try:\n if artist.get_rasterized():\n if renderer._raster_depth == 0 and not renderer._rasterizing:\n renderer.start_rasterizing()\n renderer._rasterizing = True\n renderer._raster_depth += 1\n else:\n if renderer._raster_depth == 0 and renderer._rasterizing:\n # Only stop when we are not in a rasterized parent\n # and something has been rasterized since last stop\n renderer.stop_rasterizing()\n renderer._rasterizing = False\n\n if artist.get_agg_filter() is not None:\n renderer.start_filter()\n\n return draw(artist, renderer)\n finally:\n if artist.get_agg_filter() is not None:\n renderer.stop_filter(artist.get_agg_filter())\n if artist.get_rasterized():\n renderer._raster_depth -= 1\n if (renderer._rasterizing and (fig := artist.get_figure(root=True)) and\n fig.suppressComposite):\n # restart rasterizing to prevent merging\n renderer.stop_rasterizing()\n renderer.start_rasterizing()\n\n draw_wrapper._supports_rasterization = True\n return draw_wrapper\n\n\ndef _finalize_rasterization(draw):\n """\n Decorator for Artist.draw method.
Needed on the outermost artist, i.e.\n Figure, to finish up if the renderer is still in rasterized mode.\n """\n @wraps(draw)\n def draw_wrapper(artist, renderer, *args, **kwargs):\n result = draw(artist, renderer, *args, **kwargs)\n if renderer._rasterizing:\n renderer.stop_rasterizing()\n renderer._rasterizing = False\n return result\n return draw_wrapper\n\n\ndef _stale_axes_callback(self, val):\n if self.axes:\n self.axes.stale = val\n\n\n_XYPair = namedtuple("_XYPair", "x y")\n\n\nclass _Unset:\n def __repr__(self):\n return "<UNSET>"\n_UNSET = _Unset()\n\n\nclass Artist:\n """\n Abstract base class for objects that render into a FigureCanvas.\n\n Typically, all visible elements in a figure are subclasses of Artist.\n """\n\n zorder = 0\n\n def __init_subclass__(cls):\n\n # Decorate draw() method so that all artists are able to stop\n # rasterization when necessary. If the artist's draw method is already\n # decorated (has a `_supports_rasterization` attribute), it won't be\n # decorated.\n\n if not hasattr(cls.draw, "_supports_rasterization"):\n cls.draw = _prevent_rasterization(cls.draw)\n\n # Inject custom set() methods into the subclass with signature and\n # docstring based on the subclasses' properties.\n\n if not hasattr(cls.set, '_autogenerated_signature'):\n # Don't overwrite cls.set if the subclass or one of its parents\n # has defined a set method itself.\n # If there was no explicit definition, cls.set is inherited from\n # the hierarchy of auto-generated set methods, which hold the\n # flag _autogenerated_signature.\n return\n\n cls.set = lambda self, **kwargs: Artist.set(self, **kwargs)\n cls.set.__name__ = "set"\n cls.set.__qualname__ = f"{cls.__qualname__}.set"\n cls._update_set_signature_and_docstring()\n\n _PROPERTIES_EXCLUDED_FROM_SET = [\n 'navigate_mode', # not a user-facing function\n 'figure', # changing the figure is such a profound operation\n # that we don't want this in set()\n '3d_properties', # cannot be used as a keyword due to
leading digit\n ]\n\n @classmethod\n def _update_set_signature_and_docstring(cls):\n """\n Update the signature of the set function to list all properties\n as keyword arguments.\n\n Property aliases are not listed in the signature for brevity, but\n are still accepted as keyword arguments.\n """\n cls.set.__signature__ = Signature(\n [Parameter("self", Parameter.POSITIONAL_OR_KEYWORD),\n *[Parameter(prop, Parameter.KEYWORD_ONLY, default=_UNSET)\n for prop in ArtistInspector(cls).get_setters()\n if prop not in Artist._PROPERTIES_EXCLUDED_FROM_SET]])\n cls.set._autogenerated_signature = True\n\n cls.set.__doc__ = (\n "Set multiple properties at once.\n\n"\n "Supported properties are\n\n"\n + kwdoc(cls))\n\n def __init__(self):\n self._stale = True\n self.stale_callback = None\n self._axes = None\n self._parent_figure = None\n\n self._transform = None\n self._transformSet = False\n self._visible = True\n self._animated = False\n self._alpha = None\n self.clipbox = None\n self._clippath = None\n self._clipon = True\n self._label = ''\n self._picker = None\n self._rasterized = False\n self._agg_filter = None\n # Normally, artist classes need to be queried for mouseover info if and\n # only if they override get_cursor_data.\n self._mouseover = type(self).get_cursor_data != Artist.get_cursor_data\n self._callbacks = cbook.CallbackRegistry(signals=["pchanged"])\n try:\n self.axes = None\n except AttributeError:\n # Handle self.axes as a read-only property, as in Figure.\n pass\n self._remove_method = None\n self._url = None\n self._gid = None\n self._snap = None\n self._sketch = mpl.rcParams['path.sketch']\n self._path_effects = mpl.rcParams['path.effects']\n self._sticky_edges = _XYPair([], [])\n self._in_layout = True\n\n def __getstate__(self):\n d = self.__dict__.copy()\n d['stale_callback'] = None\n return d\n\n def remove(self):\n """\n Remove the artist from the figure if possible.\n\n The effect will not be visible until the figure is redrawn, e.g.,\n with 
`.FigureCanvasBase.draw_idle`. Call `~.axes.Axes.relim` to\n update the Axes limits if desired.\n\n Note: `~.axes.Axes.relim` will not see collections even if the\n collection was added to the Axes with *autolim* = True.\n\n Note: there is no support for removing the artist's legend entry.\n """\n\n # There is no method to set the callback. Instead, the parent should\n # set the _remove_method attribute directly. This would be a\n # protected attribute if Python supported that sort of thing. The\n # callback has one parameter, which is the child to be removed.\n if self._remove_method is not None:\n self._remove_method(self)\n # clear stale callback\n self.stale_callback = None\n _ax_flag = False\n if hasattr(self, 'axes') and self.axes:\n # remove from the mouse hit list\n self.axes._mouseover_set.discard(self)\n self.axes.stale = True\n self.axes = None # decouple the artist from the Axes\n _ax_flag = True\n\n if (fig := self.get_figure(root=False)) is not None:\n if not _ax_flag:\n fig.stale = True\n self._parent_figure = None\n\n else:\n raise NotImplementedError('cannot remove artist')\n # TODO: the fix for the collections relim problem is to move the\n # limits calculation into the artist itself, including the property of\n # whether or not the artist should affect the limits. 
Then there will\n # be no distinction between axes.add_line, axes.add_patch, etc.\n # TODO: add legend support\n\n def have_units(self):\n """Return whether units are set on any axis."""\n ax = self.axes\n return ax and any(axis.have_units() for axis in ax._axis_map.values())\n\n def convert_xunits(self, x):\n """\n Convert *x* using the unit type of the xaxis.\n\n If the artist is not contained in an Axes or if the xaxis does not\n have units, *x* itself is returned.\n """\n ax = getattr(self, 'axes', None)\n if ax is None or ax.xaxis is None:\n return x\n return ax.xaxis.convert_units(x)\n\n def convert_yunits(self, y):\n """\n Convert *y* using the unit type of the yaxis.\n\n If the artist is not contained in an Axes or if the yaxis does not\n have units, *y* itself is returned.\n """\n ax = getattr(self, 'axes', None)\n if ax is None or ax.yaxis is None:\n return y\n return ax.yaxis.convert_units(y)\n\n @property\n def axes(self):\n """The `~.axes.Axes` instance the artist resides in, or *None*."""\n return self._axes\n\n @axes.setter\n def axes(self, new_axes):\n if (new_axes is not None and self._axes is not None\n and new_axes != self._axes):\n raise ValueError("Can not reset the Axes. 
You are probably trying to reuse "\n "an artist in more than one Axes which is not supported")\n self._axes = new_axes\n if new_axes is not None and new_axes is not self:\n self.stale_callback = _stale_axes_callback\n\n @property\n def stale(self):\n """\n Whether the artist is 'stale' and needs to be re-drawn for the output\n to match the internal state of the artist.\n """\n return self._stale\n\n @stale.setter\n def stale(self, val):\n self._stale = val\n\n # if the artist is animated it does not take normal part in the\n # draw stack and is not expected to be drawn as part of the normal\n # draw loop (when not saving) so do not propagate this change\n if self._animated:\n return\n\n if val and self.stale_callback is not None:\n self.stale_callback(self, val)\n\n def get_window_extent(self, renderer=None):\n """\n Get the artist's bounding box in display space.\n\n The bounding box's width and height are nonnegative.\n\n Subclasses should override for inclusion in the bounding box\n "tight" calculation. Default is to return an empty bounding\n box at 0, 0.\n\n Be careful when using this function: the results will not update\n if the window extent of the artist changes. The extent\n can change due to any changes in the transform stack, such as\n changing the Axes limits, the figure size, or the canvas used\n (as is done when saving a figure).
This can lead to unexpected\n behavior where interactive figures will look fine on the screen,\n but will save incorrectly.\n """\n return Bbox([[0, 0], [0, 0]])\n\n def get_tightbbox(self, renderer=None):\n """\n Like `.Artist.get_window_extent`, but includes any clipping.\n\n Parameters\n ----------\n renderer : `~matplotlib.backend_bases.RendererBase` subclass, optional\n renderer that will be used to draw the figures (i.e.\n ``fig.canvas.get_renderer()``)\n\n Returns\n -------\n `.Bbox` or None\n The enclosing bounding box (in figure pixel coordinates).\n Returns None if clipping results in no intersection.\n """\n bbox = self.get_window_extent(renderer)\n if self.get_clip_on():\n clip_box = self.get_clip_box()\n if clip_box is not None:\n bbox = Bbox.intersection(bbox, clip_box)\n clip_path = self.get_clip_path()\n if clip_path is not None and bbox is not None:\n clip_path = clip_path.get_fully_transformed_path()\n bbox = Bbox.intersection(bbox, clip_path.get_extents())\n return bbox\n\n def add_callback(self, func):\n """\n Add a callback function that will be called whenever one of the\n `.Artist`'s properties changes.\n\n Parameters\n ----------\n func : callable\n The callback function. It must have the signature::\n\n def func(artist: Artist) -> Any\n\n where *artist* is the calling `.Artist`. Return values may exist\n but are ignored.\n\n Returns\n -------\n int\n The observer id associated with the callback. 
This id can be\n used for removing the callback with `.remove_callback` later.\n\n See Also\n --------\n remove_callback\n """\n # Wrapping func in a lambda ensures it can be connected multiple times\n # and never gets weakref-gc'ed.\n return self._callbacks.connect("pchanged", lambda: func(self))\n\n def remove_callback(self, oid):\n """\n Remove a callback based on its observer id.\n\n See Also\n --------\n add_callback\n """\n self._callbacks.disconnect(oid)\n\n def pchanged(self):\n """\n Call all of the registered callbacks.\n\n This function is triggered internally when a property is changed.\n\n See Also\n --------\n add_callback\n remove_callback\n """\n self._callbacks.process("pchanged")\n\n def is_transform_set(self):\n """\n Return whether the Artist has an explicitly set transform.\n\n This is *True* after `.set_transform` has been called.\n """\n return self._transformSet\n\n def set_transform(self, t):\n """\n Set the artist transform.\n\n Parameters\n ----------\n t : `~matplotlib.transforms.Transform`\n """\n self._transform = t\n self._transformSet = True\n self.pchanged()\n self.stale = True\n\n def get_transform(self):\n """Return the `.Transform` instance used by this artist."""\n if self._transform is None:\n self._transform = IdentityTransform()\n elif (not isinstance(self._transform, Transform)\n and hasattr(self._transform, '_as_mpl_transform')):\n self._transform = self._transform._as_mpl_transform(self.axes)\n return self._transform\n\n def get_children(self):\n r"""Return a list of the child `.Artist`\s of this `.Artist`."""\n return []\n\n def _different_canvas(self, event):\n """\n Check whether an *event* occurred on a canvas other than this artist's canvas.\n\n If this method returns True, the event definitely occurred on a different\n canvas; if it returns False, either it occurred on the same canvas, or we may\n not have enough information to know.\n\n Subclasses should start their definition of `contains` as follows::\n\n if
self._different_canvas(mouseevent):\n return False, {}\n # subclass-specific implementation follows\n """\n return (getattr(event, "canvas", None) is not None\n and (fig := self.get_figure(root=True)) is not None\n and event.canvas is not fig.canvas)\n\n def contains(self, mouseevent):\n """\n Test whether the artist contains the mouse event.\n\n Parameters\n ----------\n mouseevent : `~matplotlib.backend_bases.MouseEvent`\n\n Returns\n -------\n contains : bool\n Whether any values are within the radius.\n details : dict\n An artist-specific dictionary of details of the event context,\n such as which points are contained in the pick radius. See the\n individual Artist subclasses for details.\n """\n _log.warning("%r needs 'contains' method", self.__class__.__name__)\n return False, {}\n\n def pickable(self):\n """\n Return whether the artist is pickable.\n\n See Also\n --------\n .Artist.set_picker, .Artist.get_picker, .Artist.pick\n """\n return self.get_figure(root=False) is not None and self._picker is not None\n\n def pick(self, mouseevent):\n """\n Process a pick event.\n\n Each child artist will fire a pick event if *mouseevent* is over\n the artist and the artist has picker set.\n\n See Also\n --------\n .Artist.set_picker, .Artist.get_picker, .Artist.pickable\n """\n from .backend_bases import PickEvent # Circular import.\n # Pick self\n if self.pickable():\n picker = self.get_picker()\n if callable(picker):\n inside, prop = picker(self, mouseevent)\n else:\n inside, prop = self.contains(mouseevent)\n if inside:\n PickEvent("pick_event", self.get_figure(root=True).canvas,\n mouseevent, self, **prop)._process()\n\n # Pick children\n for a in self.get_children():\n # make sure the event happened in the same Axes\n ax = getattr(a, 'axes', None)\n if (isinstance(a, mpl.figure.SubFigure)\n or mouseevent.inaxes is None or ax is None\n or mouseevent.inaxes == ax):\n # we need to check if mouseevent.inaxes is None\n # because some objects associated with an Axes 
(e.g., a\n # tick label) can be outside the bounding box of the\n # Axes and inaxes will be None\n # also check that ax is None so that it traverses objects\n # which do not have an axes property but whose children might\n a.pick(mouseevent)\n\n def set_picker(self, picker):\n """\n Define the picking behavior of the artist.\n\n Parameters\n ----------\n picker : None or bool or float or callable\n This can be one of the following:\n\n - *None*: Picking is disabled for this artist (default).\n\n - A boolean: If *True* then picking will be enabled and the\n artist will fire a pick event if the mouse event is over\n the artist.\n\n - A float: If picker is a number it is interpreted as an\n epsilon tolerance in points and the artist will fire\n off an event if its data is within epsilon of the mouse\n event. For some artists like lines and patch collections,\n the artist may provide additional data to the pick event\n that is generated, e.g., the indices of the data within\n epsilon of the pick event.\n\n - A function: If picker is callable, it is a user-supplied\n function which determines whether the artist is hit by the\n mouse event::\n\n hit, props = picker(artist, mouseevent)\n\n to determine the hit test.
If the mouse event is over the\n artist, return *hit=True*; *props* is a dictionary of\n properties you want added to the PickEvent attributes.\n """\n self._picker = picker\n\n def get_picker(self):\n """\n Return the picking behavior of the artist.\n\n The possible values are described in `.Artist.set_picker`.\n\n See Also\n --------\n .Artist.set_picker, .Artist.pickable, .Artist.pick\n """\n return self._picker\n\n def get_url(self):\n """Return the url."""\n return self._url\n\n def set_url(self, url):\n """\n Set the url for the artist.\n\n Parameters\n ----------\n url : str\n """\n self._url = url\n\n def get_gid(self):\n """Return the group id."""\n return self._gid\n\n def set_gid(self, gid):\n """\n Set the (group) id for the artist.\n\n Parameters\n ----------\n gid : str\n """\n self._gid = gid\n\n def get_snap(self):\n """\n Return the snap setting.\n\n See `.set_snap` for details.\n """\n if mpl.rcParams['path.snap']:\n return self._snap\n else:\n return False\n\n def set_snap(self, snap):\n """\n Set the snapping behavior.\n\n Snapping aligns positions with the pixel grid, which results in\n clearer images. For example, if a black line of 1px width was\n defined at a position in between two pixels, the resulting image\n would contain the interpolated value of that line in the pixel grid,\n which would be a grey value on both adjacent pixel positions.
In\n contrast, snapping will move the line to the nearest integer pixel\n value, so that the resulting image will really contain a 1px wide\n black line.\n\n Snapping is currently only supported by the Agg and MacOSX backends.\n\n Parameters\n ----------\n snap : bool or None\n Possible values:\n\n - *True*: Snap vertices to the nearest pixel center.\n - *False*: Do not modify vertex positions.\n - *None*: (auto) If the path contains only rectilinear line\n segments, round to the nearest pixel center.\n """\n self._snap = snap\n self.stale = True\n\n def get_sketch_params(self):\n """\n Return the sketch parameters for the artist.\n\n Returns\n -------\n tuple or None\n\n A 3-tuple with the following elements:\n\n - *scale*: The amplitude of the wiggle perpendicular to the\n source line.\n - *length*: The length of the wiggle along the line.\n - *randomness*: The scale factor by which the length is\n shrunken or expanded.\n\n Returns *None* if no sketch parameters were set.\n """\n return self._sketch\n\n def set_sketch_params(self, scale=None, length=None, randomness=None):\n """\n Set the sketch parameters.\n\n Parameters\n ----------\n scale : float, optional\n The amplitude of the wiggle perpendicular to the source\n line, in pixels. If scale is `None`, or not provided, no\n sketch filter will be applied.\n length : float, optional\n The length of the wiggle along the line, in pixels\n (default 128.0)\n randomness : float, optional\n The scale factor by which the length is shrunken or\n expanded (default 16.0)\n\n The PGF backend uses this argument as an RNG seed and not as\n described above. Using the same seed yields the same random shape.\n\n ..
ACCEPTS: (scale: float, length: float, randomness: float)\n """\n if scale is None:\n self._sketch = None\n else:\n self._sketch = (scale, length or 128.0, randomness or 16.0)\n self.stale = True\n\n def set_path_effects(self, path_effects):\n """\n Set the path effects.\n\n Parameters\n ----------\n path_effects : list of `.AbstractPathEffect`\n """\n self._path_effects = path_effects\n self.stale = True\n\n def get_path_effects(self):\n return self._path_effects\n\n def get_figure(self, root=False):\n """\n Return the `.Figure` or `.SubFigure` instance the artist belongs to.\n\n Parameters\n ----------\n root : bool, default=False\n If False, return the (Sub)Figure this artist is on. If True,\n return the root Figure for a nested tree of SubFigures.\n """\n if root and self._parent_figure is not None:\n return self._parent_figure.get_figure(root=True)\n\n return self._parent_figure\n\n def set_figure(self, fig):\n """\n Set the `.Figure` or `.SubFigure` instance the artist belongs to.\n\n Parameters\n ----------\n fig : `~matplotlib.figure.Figure` or `~matplotlib.figure.SubFigure`\n """\n # if this is a no-op just return\n if self._parent_figure is fig:\n return\n # if we currently have a figure (the case of both `self.figure`\n # and *fig* being None is taken care of above) then the user is\n # trying to change the figure an artist is associated with, which\n # is not allowed for the same reason as adding the same instance\n # to more than one Axes\n if self._parent_figure is not None:\n raise RuntimeError("Can not put single artist in "\n "more than one figure")\n self._parent_figure = fig\n if self._parent_figure and self._parent_figure is not self:\n self.pchanged()\n self.stale = True\n\n figure = property(get_figure, set_figure,\n doc=("The (Sub)Figure that the artist is on.
For more "\n "control, use the `get_figure` method."))\n\n def set_clip_box(self, clipbox):\n """\n Set the artist's clip `.Bbox`.\n\n Parameters\n ----------\n clipbox : `~matplotlib.transforms.BboxBase` or None\n Will typically be created from a `.TransformedBbox`. For instance,\n ``TransformedBbox(Bbox([[0, 0], [1, 1]]), ax.transAxes)`` is the default\n clipping for an artist added to an Axes.\n\n """\n _api.check_isinstance((BboxBase, None), clipbox=clipbox)\n if clipbox != self.clipbox:\n self.clipbox = clipbox\n self.pchanged()\n self.stale = True\n\n def set_clip_path(self, path, transform=None):\n """\n Set the artist's clip path.\n\n Parameters\n ----------\n path : `~matplotlib.patches.Patch` or `.Path` or `.TransformedPath` or None\n The clip path. If given a `.Path`, *transform* must be provided as\n well. If *None*, a previously set clip path is removed.\n transform : `~matplotlib.transforms.Transform`, optional\n Only used if *path* is a `.Path`, in which case the given `.Path`\n is converted to a `.TransformedPath` using *transform*.\n\n Notes\n -----\n For efficiency, if *path* is a `.Rectangle` this method will set the\n clipping box to the corresponding rectangle and set the clipping path\n to ``None``.\n\n For technical reasons (support of `~.Artist.set`), a tuple\n (*path*, *transform*) is also accepted as a single positional\n parameter.\n\n .. 
ACCEPTS: Patch or (Path, Transform) or None\n """\n from matplotlib.patches import Patch, Rectangle\n\n success = False\n if transform is None:\n if isinstance(path, Rectangle):\n self.clipbox = TransformedBbox(Bbox.unit(),\n path.get_transform())\n self._clippath = None\n success = True\n elif isinstance(path, Patch):\n self._clippath = TransformedPatchPath(path)\n success = True\n elif isinstance(path, tuple):\n path, transform = path\n\n if path is None:\n self._clippath = None\n success = True\n elif isinstance(path, Path):\n self._clippath = TransformedPath(path, transform)\n success = True\n elif isinstance(path, TransformedPatchPath):\n self._clippath = path\n success = True\n elif isinstance(path, TransformedPath):\n self._clippath = path\n success = True\n\n if not success:\n raise TypeError(\n "Invalid arguments to set_clip_path, of type "\n f"{type(path).__name__} and {type(transform).__name__}")\n # This may result in the callbacks being hit twice, but guarantees they\n # will be hit at least once.\n self.pchanged()\n self.stale = True\n\n def get_alpha(self):\n """\n Return the alpha value used for blending - not supported on all\n backends.\n """\n return self._alpha\n\n def get_visible(self):\n """Return the visibility."""\n return self._visible\n\n def get_animated(self):\n """Return whether the artist is animated."""\n return self._animated\n\n def get_in_layout(self):\n """\n Return boolean flag, ``True`` if artist is included in layout\n calculations.\n\n E.g. :ref:`constrainedlayout_guide`,\n `.Figure.tight_layout()`, and\n ``fig.savefig(fname, bbox_inches='tight')``.\n """\n return self._in_layout\n\n def _fully_clipped_to_axes(self):\n """\n Return a boolean flag, ``True`` if the artist is clipped to the Axes\n and can thus be skipped in layout calculations. 
Requires `get_clip_on`\n is True, one of `clip_box` or `clip_path` is set, ``clip_box.extents``\n is equivalent to ``ax.bbox.extents`` (if set), and ``clip_path._patch``\n is equivalent to ``ax.patch`` (if set).\n """\n # Note that ``clip_path.get_fully_transformed_path().get_extents()``\n # cannot be directly compared to ``axes.bbox.extents`` because the\n # extents may be undefined (i.e. equivalent to ``Bbox.null()``)\n # before the associated artist is drawn, and this method is meant\n # to determine whether ``axes.get_tightbbox()`` may bypass drawing\n clip_box = self.get_clip_box()\n clip_path = self.get_clip_path()\n return (self.axes is not None\n and self.get_clip_on()\n and (clip_box is not None or clip_path is not None)\n and (clip_box is None\n or np.all(clip_box.extents == self.axes.bbox.extents))\n and (clip_path is None\n or isinstance(clip_path, TransformedPatchPath)\n and clip_path._patch is self.axes.patch))\n\n def get_clip_on(self):\n """Return whether the artist uses clipping."""\n return self._clipon\n\n def get_clip_box(self):\n """Return the clipbox."""\n return self.clipbox\n\n def get_clip_path(self):\n """Return the clip path."""\n return self._clippath\n\n def get_transformed_clip_path_and_affine(self):\n """\n Return the clip path with the non-affine part of its\n transformation applied, and the remaining affine part of its\n transformation.\n """\n if self._clippath is not None:\n return self._clippath.get_transformed_path_and_affine()\n return None, None\n\n def set_clip_on(self, b):\n """\n Set whether the artist uses clipping.\n\n When False, artists will be visible outside the Axes which\n can lead to unexpected results.\n\n Parameters\n ----------\n b : bool\n """\n self._clipon = b\n # This may result in the callbacks being hit twice, but ensures they\n # are hit at least once\n self.pchanged()\n self.stale = True\n\n def _set_gc_clip(self, gc):\n """Set the clip properly for the gc."""\n if self._clipon:\n if self.clipbox is not 
None:\n gc.set_clip_rectangle(self.clipbox)\n gc.set_clip_path(self._clippath)\n else:\n gc.set_clip_rectangle(None)\n gc.set_clip_path(None)\n\n def get_rasterized(self):\n """Return whether the artist is to be rasterized."""\n return self._rasterized\n\n def set_rasterized(self, rasterized):\n """\n Force rasterized (bitmap) drawing for vector graphics output.\n\n Rasterized drawing is not supported by all artists. If you try to\n enable this on an artist that does not support it, the command has no\n effect and a warning will be issued.\n\n This setting is ignored for pixel-based output.\n\n See also :doc:`/gallery/misc/rasterization_demo`.\n\n Parameters\n ----------\n rasterized : bool\n """\n supports_rasterization = getattr(self.draw,\n "_supports_rasterization", False)\n if rasterized and not supports_rasterization:\n _api.warn_external(f"Rasterization of '{self}' will be ignored")\n\n self._rasterized = rasterized\n\n def get_agg_filter(self):\n """Return filter function to be used for agg filter."""\n return self._agg_filter\n\n def set_agg_filter(self, filter_func):\n """\n Set the agg filter.\n\n Parameters\n ----------\n filter_func : callable\n A filter function, which takes a (m, n, depth) float array\n and a dpi value, and returns a (m, n, depth) array and two\n offsets from the bottom left corner of the image\n\n .. 
ACCEPTS: a filter function, which takes a (m, n, 3) float array\n and a dpi value, and returns a (m, n, 3) array and two offsets\n from the bottom left corner of the image\n """\n self._agg_filter = filter_func\n self.stale = True\n\n def draw(self, renderer):\n """\n Draw the Artist (and its children) using the given renderer.\n\n This has no effect if the artist is not visible (`.Artist.get_visible`\n returns False).\n\n Parameters\n ----------\n renderer : `~matplotlib.backend_bases.RendererBase` subclass.\n\n Notes\n -----\n This method is overridden in the Artist subclasses.\n """\n if not self.get_visible():\n return\n self.stale = False\n\n def set_alpha(self, alpha):\n """\n Set the alpha value used for blending - not supported on all backends.\n\n Parameters\n ----------\n alpha : float or None\n *alpha* must be within the 0-1 range, inclusive.\n """\n if alpha is not None and not isinstance(alpha, Real):\n raise TypeError(\n f'alpha must be numeric or None, not {type(alpha)}')\n if alpha is not None and not (0 <= alpha <= 1):\n raise ValueError(f'alpha ({alpha}) is outside 0-1 range')\n if alpha != self._alpha:\n self._alpha = alpha\n self.pchanged()\n self.stale = True\n\n def _set_alpha_for_array(self, alpha):\n """\n Set the alpha value used for blending - not supported on all backends.\n\n Parameters\n ----------\n alpha : array-like or float or None\n All values must be within the 0-1 range, inclusive.\n Masked values and nans are not supported.\n """\n if isinstance(alpha, str):\n raise TypeError("alpha must be numeric or None, not a string")\n if not np.iterable(alpha):\n Artist.set_alpha(self, alpha)\n return\n alpha = np.asarray(alpha)\n if not (0 <= alpha.min() and alpha.max() <= 1):\n raise ValueError('alpha must be between 0 and 1, inclusive, '\n f'but min is {alpha.min()}, max is {alpha.max()}')\n self._alpha = alpha\n self.pchanged()\n self.stale = True\n\n def set_visible(self, b):\n """\n Set the artist's visibility.\n\n Parameters\n 
----------\n b : bool\n """\n if b != self._visible:\n self._visible = b\n self.pchanged()\n self.stale = True\n\n def set_animated(self, b):\n """\n Set whether the artist is intended to be used in an animation.\n\n If True, the artist is excluded from regular drawing of the figure.\n You have to call `.Figure.draw_artist` / `.Axes.draw_artist`\n explicitly on the artist. This approach is used to speed up animations\n using blitting.\n\n See also `matplotlib.animation` and\n :ref:`blitting`.\n\n Parameters\n ----------\n b : bool\n """\n if self._animated != b:\n self._animated = b\n self.pchanged()\n\n def set_in_layout(self, in_layout):\n """\n Set if artist is to be included in layout calculations,\n E.g. :ref:`constrainedlayout_guide`,\n `.Figure.tight_layout()`, and\n ``fig.savefig(fname, bbox_inches='tight')``.\n\n Parameters\n ----------\n in_layout : bool\n """\n self._in_layout = in_layout\n\n def get_label(self):\n """Return the label used for this artist in the legend."""\n return self._label\n\n def set_label(self, s):\n """\n Set a label that will be displayed in the legend.\n\n Parameters\n ----------\n s : object\n *s* will be converted to a string by calling `str`.\n """\n label = str(s) if s is not None else None\n if label != self._label:\n self._label = label\n self.pchanged()\n self.stale = True\n\n def get_zorder(self):\n """Return the artist's zorder."""\n return self.zorder\n\n def set_zorder(self, level):\n """\n Set the zorder for the artist. 
Artists with lower zorder\n values are drawn first.\n\n Parameters\n ----------\n level : float\n """\n if level is None:\n level = self.__class__.zorder\n if level != self.zorder:\n self.zorder = level\n self.pchanged()\n self.stale = True\n\n @property\n def sticky_edges(self):\n """\n ``x`` and ``y`` sticky edge lists for autoscaling.\n\n When performing autoscaling, if a data limit coincides with a value in\n the corresponding sticky_edges list, then no margin will be added--the\n view limit "sticks" to the edge. A typical use case is histograms,\n where one usually expects no margin on the bottom edge (0) of the\n histogram.\n\n Moreover, margin expansion "bumps" against sticky edges and cannot\n cross them. For example, if the upper data limit is 1.0, the upper\n view limit computed by simple margin application is 1.2, but there is a\n sticky edge at 1.1, then the actual upper view limit will be 1.1.\n\n This attribute cannot be assigned to; however, the ``x`` and ``y``\n lists can be modified in place as needed.\n\n Examples\n --------\n >>> artist.sticky_edges.x[:] = (xmin, xmax)\n >>> artist.sticky_edges.y[:] = (ymin, ymax)\n\n """\n return self._sticky_edges\n\n def update_from(self, other):\n """Copy properties from *other* to *self*."""\n self._transform = other._transform\n self._transformSet = other._transformSet\n self._visible = other._visible\n self._alpha = other._alpha\n self.clipbox = other.clipbox\n self._clipon = other._clipon\n self._clippath = other._clippath\n self._label = other._label\n self._sketch = other._sketch\n self._path_effects = other._path_effects\n self.sticky_edges.x[:] = other.sticky_edges.x.copy()\n self.sticky_edges.y[:] = other.sticky_edges.y.copy()\n self.pchanged()\n self.stale = True\n\n def properties(self):\n """Return a dictionary of all the properties of the artist."""\n return ArtistInspector(self).properties()\n\n def _update_props(self, props, errfmt):\n """\n Helper for `.Artist.set` and `.Artist.update`.\n\n 
*errfmt* is used to generate error messages for invalid property\n names; it gets formatted with ``type(self)`` for "{cls}" and the\n property name for "{prop_name}".\n """\n ret = []\n with cbook._setattr_cm(self, eventson=False):\n for k, v in props.items():\n # Allow attributes we want to be able to update through\n # art.update, art.set, setp.\n if k == "axes":\n ret.append(setattr(self, k, v))\n else:\n func = getattr(self, f"set_{k}", None)\n if not callable(func):\n raise AttributeError(\n errfmt.format(cls=type(self), prop_name=k),\n name=k)\n ret.append(func(v))\n if ret:\n self.pchanged()\n self.stale = True\n return ret\n\n def update(self, props):\n """\n Update this artist's properties from the dict *props*.\n\n Parameters\n ----------\n props : dict\n """\n return self._update_props(\n props, "{cls.__name__!r} object has no property {prop_name!r}")\n\n def _internal_update(self, kwargs):\n """\n Update artist properties without prenormalizing them, but generating\n errors as if calling `set`.\n\n The lack of prenormalization is to maintain backcompatibility.\n """\n return self._update_props(\n kwargs, "{cls.__name__}.set() got an unexpected keyword argument "\n "{prop_name!r}")\n\n def set(self, **kwargs):\n # docstring and signature are auto-generated via\n # Artist._update_set_signature_and_docstring() at the end of the\n # module.\n return self._internal_update(cbook.normalize_kwargs(kwargs, self))\n\n @contextlib.contextmanager\n def _cm_set(self, **kwargs):\n """\n `.Artist.set` context-manager that restores original values at exit.\n """\n orig_vals = {k: getattr(self, f"get_{k}")() for k in kwargs}\n try:\n self.set(**kwargs)\n yield\n finally:\n self.set(**orig_vals)\n\n def findobj(self, match=None, include_self=True):\n """\n Find artist objects.\n\n Recursively find all `.Artist` instances contained in the artist.\n\n Parameters\n ----------\n match\n A filter criterion for the matches. 
This can be\n\n - *None*: Return all objects contained in artist.\n - A function with signature ``def match(artist: Artist) -> bool``.\n The result will only contain artists for which the function\n returns *True*.\n - A class instance: e.g., `.Line2D`. The result will only contain\n artists of this class or its subclasses (``isinstance`` check).\n\n include_self : bool\n Include *self* in the list to be checked for a match.\n\n Returns\n -------\n list of `.Artist`\n\n """\n if match is None: # always return True\n def matchfunc(x):\n return True\n elif isinstance(match, type) and issubclass(match, Artist):\n def matchfunc(x):\n return isinstance(x, match)\n elif callable(match):\n matchfunc = match\n else:\n raise ValueError('match must be None, a matplotlib.artist.Artist '\n 'subclass, or a callable')\n\n artists = reduce(operator.iadd,\n [c.findobj(matchfunc) for c in self.get_children()], [])\n if include_self and matchfunc(self):\n artists.append(self)\n return artists\n\n def get_cursor_data(self, event):\n """\n Return the cursor data for a given event.\n\n .. note::\n This method is intended to be overridden by artist subclasses.\n As an end-user of Matplotlib you will most likely not call this\n method yourself.\n\n Cursor data can be used by Artists to provide additional context\n information for a given event. The default implementation just returns\n *None*.\n\n Subclasses can override the method and return arbitrary data. However,\n when doing so, they must ensure that `.format_cursor_data` can convert\n the data to a string representation.\n\n The only current use case is displaying the z-value of an `.AxesImage`\n in the status bar of a plot window, while moving the mouse.\n\n Parameters\n ----------\n event : `~matplotlib.backend_bases.MouseEvent`\n\n See Also\n --------\n format_cursor_data\n\n """\n return None\n\n def format_cursor_data(self, data):\n """\n Return a string representation of *data*.\n\n .. 
note::\n This method is intended to be overridden by artist subclasses.\n As an end-user of Matplotlib you will most likely not call this\n method yourself.\n\n The default implementation converts ints and floats and arrays of ints\n and floats into a comma-separated string enclosed in square brackets,\n unless the artist has an associated colorbar, in which case scalar\n values are formatted using the colorbar's formatter.\n\n See Also\n --------\n get_cursor_data\n """\n if np.ndim(data) == 0 and hasattr(self, "_format_cursor_data_override"):\n # workaround for ScalarMappable to be able to define its own\n # format_cursor_data(). See ScalarMappable._format_cursor_data_override\n # for details.\n return self._format_cursor_data_override(data)\n else:\n try:\n data[0]\n except (TypeError, IndexError):\n data = [data]\n data_str = ', '.join(f'{item:0.3g}' for item in data\n if isinstance(item, Number))\n return "[" + data_str + "]"\n\n def get_mouseover(self):\n """\n Return whether this artist is queried for custom context information\n when the mouse cursor moves over it.\n """\n return self._mouseover\n\n def set_mouseover(self, mouseover):\n """\n Set whether this artist is queried for custom context information when\n the mouse cursor moves over it.\n\n Parameters\n ----------\n mouseover : bool\n\n See Also\n --------\n get_cursor_data\n .ToolCursorPosition\n .NavigationToolbar2\n """\n self._mouseover = bool(mouseover)\n ax = self.axes\n if ax:\n if self._mouseover:\n ax._mouseover_set.add(self)\n else:\n ax._mouseover_set.discard(self)\n\n mouseover = property(get_mouseover, set_mouseover) # backcompat.\n\n\ndef _get_tightbbox_for_layout_only(obj, *args, **kwargs):\n """\n Matplotlib's `.Axes.get_tightbbox` and `.Axis.get_tightbbox` support a\n *for_layout_only* kwarg; this helper tries to use the kwarg but skips it\n when encountering third-party subclasses that do not support it.\n """\n try:\n return obj.get_tightbbox(*args, **{**kwargs, 
"for_layout_only": True})\n except TypeError:\n return obj.get_tightbbox(*args, **kwargs)\n\n\nclass ArtistInspector:\n """\n A helper class to inspect an `~matplotlib.artist.Artist` and return\n information about its settable properties and their current values.\n """\n\n def __init__(self, o):\n r"""\n Initialize the artist inspector with an `Artist` or an iterable of\n `Artist`\s. If an iterable is used, we assume it is a homogeneous\n sequence (all `Artist`\s are of the same type) and it is your\n responsibility to make sure this is so.\n """\n if not isinstance(o, Artist):\n if np.iterable(o):\n o = list(o)\n if len(o):\n o = o[0]\n\n self.oorig = o\n if not isinstance(o, type):\n o = type(o)\n self.o = o\n\n self.aliasd = self.get_aliases()\n\n def get_aliases(self):\n """\n Get a dict mapping property fullnames to sets of aliases for each alias\n in the :class:`~matplotlib.artist.ArtistInspector`.\n\n e.g., for lines::\n\n {'markerfacecolor': {'mfc'},\n 'linewidth' : {'lw'},\n }\n """\n names = [name for name in dir(self.o)\n if name.startswith(('set_', 'get_'))\n and callable(getattr(self.o, name))]\n aliases = {}\n for name in names:\n func = getattr(self.o, name)\n if not self.is_alias(func):\n continue\n propname = re.search(f"`({name[:4]}.*)`", # get_.*/set_.*\n inspect.getdoc(func)).group(1)\n aliases.setdefault(propname[4:], set()).add(name[4:])\n return aliases\n\n _get_valid_values_regex = re.compile(\n r"\n\s*(?:\.\.\s+)?ACCEPTS:\s*((?:.|\n)*?)(?:$|(?:\n\n))"\n )\n\n def get_valid_values(self, attr):\n """\n Get the legal arguments for the setter associated with *attr*.\n\n This is done by querying the docstring of the setter for a line that\n begins with "ACCEPTS:" or ".. 
ACCEPTS:", and then by looking for a\n numpydoc-style documentation for the setter's first argument.\n """\n\n name = 'set_%s' % attr\n if not hasattr(self.o, name):\n raise AttributeError(f'{self.o} has no function {name}')\n func = getattr(self.o, name)\n\n if hasattr(func, '_kwarg_doc'):\n return func._kwarg_doc\n\n docstring = inspect.getdoc(func)\n if docstring is None:\n return 'unknown'\n\n if docstring.startswith('Alias for '):\n return None\n\n match = self._get_valid_values_regex.search(docstring)\n if match is not None:\n return re.sub("\n *", " ", match.group(1))\n\n # Much faster than list(inspect.signature(func).parameters)[1],\n # although barely relevant wrt. matplotlib's total import time.\n param_name = func.__code__.co_varnames[1]\n # We could set the presence * based on whether the parameter is a\n # varargs (it can't be a varkwargs) but it's not really worth it.\n match = re.search(fr"(?m)^ *\*?{param_name} : (.+)", docstring)\n if match:\n return match.group(1)\n\n return 'unknown'\n\n def _replace_path(self, source_class):\n """\n Changes the full path to the public API path that is used\n in sphinx. 
This is needed for links to work.\n """\n replace_dict = {'_base._AxesBase': 'Axes',\n '_axes.Axes': 'Axes'}\n for key, value in replace_dict.items():\n source_class = source_class.replace(key, value)\n return source_class\n\n def get_setters(self):\n """\n Get the attribute strings with setters for object.\n\n For example, for a line, return ``['markerfacecolor', 'linewidth',\n ....]``.\n """\n setters = []\n for name in dir(self.o):\n if not name.startswith('set_'):\n continue\n func = getattr(self.o, name)\n if (not callable(func)\n or self.number_of_parameters(func) < 2\n or self.is_alias(func)):\n continue\n setters.append(name[4:])\n return setters\n\n @staticmethod\n @cache\n def number_of_parameters(func):\n """Return number of parameters of the callable *func*."""\n return len(inspect.signature(func).parameters)\n\n @staticmethod\n @cache\n def is_alias(method):\n """\n Return whether the object *method* is an alias for another method.\n """\n\n ds = inspect.getdoc(method)\n if ds is None:\n return False\n\n return ds.startswith('Alias for ')\n\n def aliased_name(self, s):\n """\n Return 'PROPNAME or alias' if *s* has an alias, else return 'PROPNAME'.\n\n For example, for the line markerfacecolor property, which has an\n alias, return 'markerfacecolor or mfc' and for the transform\n property, which does not, return 'transform'.\n """\n aliases = ''.join(' or %s' % x for x in sorted(self.aliasd.get(s, [])))\n return s + aliases\n\n _NOT_LINKABLE = {\n # A set of property setter methods that are not available in our\n # current docs. 
This is a workaround used to prevent trying to link\n # these setters which would lead to "target reference not found"\n # warnings during doc build.\n 'matplotlib.image._ImageBase.set_alpha',\n 'matplotlib.image._ImageBase.set_array',\n 'matplotlib.image._ImageBase.set_data',\n 'matplotlib.image._ImageBase.set_filternorm',\n 'matplotlib.image._ImageBase.set_filterrad',\n 'matplotlib.image._ImageBase.set_interpolation',\n 'matplotlib.image._ImageBase.set_interpolation_stage',\n 'matplotlib.image._ImageBase.set_resample',\n 'matplotlib.text._AnnotationBase.set_annotation_clip',\n }\n\n def aliased_name_rest(self, s, target):\n """\n Return 'PROPNAME or alias' if *s* has an alias, else return 'PROPNAME',\n formatted for reST.\n\n For example, for the line markerfacecolor property, which has an\n alias, return 'markerfacecolor or mfc' and for the transform\n property, which does not, return 'transform'.\n """\n # workaround to prevent "reference target not found"\n if target in self._NOT_LINKABLE:\n return f'``{s}``'\n\n aliases = ''.join(\n f' or :meth:`{a} <{target}>`' for a in sorted(self.aliasd.get(s, [])))\n return f':meth:`{s} <{target}>`{aliases}'\n\n def pprint_setters(self, prop=None, leadingspace=2):\n """\n If *prop* is *None*, return a list of strings of all settable\n properties and their valid values.\n\n If *prop* is not *None*, it is a valid property name and that\n property will be returned as a string of property : valid\n values.\n """\n if leadingspace:\n pad = ' ' * leadingspace\n else:\n pad = ''\n if prop is not None:\n accepts = self.get_valid_values(prop)\n return f'{pad}{prop}: {accepts}'\n\n lines = []\n for prop in sorted(self.get_setters()):\n accepts = self.get_valid_values(prop)\n name = self.aliased_name(prop)\n lines.append(f'{pad}{name}: {accepts}')\n return lines\n\n def pprint_setters_rest(self, prop=None, leadingspace=4):\n """\n If *prop* is *None*, return a list of reST-formatted strings of all\n settable properties and their 
valid values.\n\n If *prop* is not *None*, it is a valid property name and that\n property will be returned as a string of "property : valid"\n values.\n """\n if leadingspace:\n pad = ' ' * leadingspace\n else:\n pad = ''\n if prop is not None:\n accepts = self.get_valid_values(prop)\n return f'{pad}{prop}: {accepts}'\n\n prop_and_qualnames = []\n for prop in sorted(self.get_setters()):\n # Find the parent method which actually provides the docstring.\n for cls in self.o.__mro__:\n method = getattr(cls, f"set_{prop}", None)\n if method and method.__doc__ is not None:\n break\n else: # No docstring available.\n method = getattr(self.o, f"set_{prop}")\n prop_and_qualnames.append(\n (prop, f"{method.__module__}.{method.__qualname__}"))\n\n names = [self.aliased_name_rest(prop, target)\n .replace('_base._AxesBase', 'Axes')\n .replace('_axes.Axes', 'Axes')\n for prop, target in prop_and_qualnames]\n accepts = [self.get_valid_values(prop)\n for prop, _ in prop_and_qualnames]\n\n col0_len = max(len(n) for n in names)\n col1_len = max(len(a) for a in accepts)\n table_formatstr = pad + ' ' + '=' * col0_len + ' ' + '=' * col1_len\n\n return [\n '',\n pad + '.. 
table::',\n pad + ' :class: property-table',\n '',\n table_formatstr,\n pad + ' ' + 'Property'.ljust(col0_len)\n + ' ' + 'Description'.ljust(col1_len),\n table_formatstr,\n *[pad + ' ' + n.ljust(col0_len) + ' ' + a.ljust(col1_len)\n for n, a in zip(names, accepts)],\n table_formatstr,\n '',\n ]\n\n def properties(self):\n """Return a dictionary mapping property name -> value."""\n o = self.oorig\n getters = [name for name in dir(o)\n if name.startswith('get_') and callable(getattr(o, name))]\n getters.sort()\n d = {}\n for name in getters:\n func = getattr(o, name)\n if self.is_alias(func):\n continue\n try:\n with warnings.catch_warnings():\n warnings.simplefilter('ignore')\n val = func()\n except Exception:\n continue\n else:\n d[name[4:]] = val\n return d\n\n def pprint_getters(self):\n """Return the getters and actual values as list of strings."""\n lines = []\n for name, val in sorted(self.properties().items()):\n if getattr(val, 'shape', ()) != () and len(val) > 6:\n s = str(val[:6]) + '...'\n else:\n s = str(val)\n s = s.replace('\n', ' ')\n if len(s) > 50:\n s = s[:50] + '...'\n name = self.aliased_name(name)\n lines.append(f' {name} = {s}')\n return lines\n\n\ndef getp(obj, property=None):\n """\n Return the value of an `.Artist`'s *property*, or print all of them.\n\n Parameters\n ----------\n obj : `~matplotlib.artist.Artist`\n The queried artist; e.g., a `.Line2D`, a `.Text`, or an `~.axes.Axes`.\n\n property : str or None, default: None\n If *property* is 'somename', this function returns\n ``obj.get_somename()``.\n\n If it's None (or unset), it *prints* all gettable properties from\n *obj*. Many properties have aliases for shorter typing, e.g. 'lw' is\n an alias for 'linewidth'. 
In the output, aliases and full property\n names will be listed as:\n\n property or alias = value\n\n e.g.:\n\n linewidth or lw = 2\n\n See Also\n --------\n setp\n """\n if property is None:\n insp = ArtistInspector(obj)\n ret = insp.pprint_getters()\n print('\n'.join(ret))\n return\n return getattr(obj, 'get_' + property)()\n\n# alias\nget = getp\n\n\ndef setp(obj, *args, file=None, **kwargs):\n """\n Set one or more properties on an `.Artist`, or list allowed values.\n\n Parameters\n ----------\n obj : `~matplotlib.artist.Artist` or list of `.Artist`\n The artist(s) whose properties are being set or queried. When setting\n properties, all artists are affected; when querying the allowed values,\n only the first instance in the sequence is queried.\n\n For example, two lines can be made thicker and red with a single call:\n\n >>> x = arange(0, 1, 0.01)\n >>> lines = plot(x, sin(2*pi*x), x, sin(4*pi*x))\n >>> setp(lines, linewidth=2, color='r')\n\n file : file-like, default: `sys.stdout`\n Where `setp` writes its output when asked to list allowed values.\n\n >>> with open('output.log') as file:\n ... setp(line, file=file)\n\n The default, ``None``, means `sys.stdout`.\n\n *args, **kwargs\n The properties to set. The following combinations are supported:\n\n - Set the linestyle of a line to be dashed:\n\n >>> line, = plot([1, 2, 3])\n >>> setp(line, linestyle='--')\n\n - Set multiple properties at once:\n\n >>> setp(line, linewidth=2, color='r')\n\n - List allowed values for a line's linestyle:\n\n >>> setp(line, 'linestyle')\n linestyle: {'-', '--', '-.', ':', '', (offset, on-off-seq), ...}\n\n - List all properties that can be set, and their allowed values:\n\n >>> setp(line)\n agg_filter: a filter function, ...\n [long output listing omitted]\n\n `setp` also supports MATLAB style string/value pairs. 
For example, the\n following are equivalent:\n\n >>> setp(lines, 'linewidth', 2, 'color', 'r') # MATLAB style\n >>> setp(lines, linewidth=2, color='r') # Python style\n\n See Also\n --------\n getp\n """\n\n if isinstance(obj, Artist):\n objs = [obj]\n else:\n objs = list(cbook.flatten(obj))\n\n if not objs:\n return\n\n insp = ArtistInspector(objs[0])\n\n if not kwargs and len(args) < 2:\n if args:\n print(insp.pprint_setters(prop=args[0]), file=file)\n else:\n print('\n'.join(insp.pprint_setters()), file=file)\n return\n\n if len(args) % 2:\n raise ValueError('The set args must be string, value pairs')\n\n funcvals = dict(zip(args[::2], args[1::2]))\n ret = [o.update(funcvals) for o in objs] + [o.set(**kwargs) for o in objs]\n return list(cbook.flatten(ret))\n\n\ndef kwdoc(artist):\n r"""\n Inspect an `~matplotlib.artist.Artist` class (using `.ArtistInspector`) and\n return information about its settable properties and their current values.\n\n Parameters\n ----------\n artist : `~matplotlib.artist.Artist` or an iterable of `Artist`\s\n\n Returns\n -------\n str\n The settable properties of *artist*, as plain text if\n :rc:`docstring.hardcopy` is False and as a rst table (intended for\n use in Sphinx) if it is True.\n """\n ai = ArtistInspector(artist)\n return ('\n'.join(ai.pprint_setters_rest(leadingspace=4))\n if mpl.rcParams['docstring.hardcopy'] else\n 'Properties:\n' + '\n'.join(ai.pprint_setters(leadingspace=4)))\n\n# We defer this to the end of the module, because it needs ArtistInspector\n# to be defined.\nArtist._update_set_signature_and_docstring()\n | .venv\Lib\site-packages\matplotlib\artist.py | artist.py | Python | 63,293 | 0.75 | 0.200539 | 0.062541 | react-lib | 467 | 2025-04-17T04:34:47.126307 | GPL-3.0 | false | 028dfd988bc21957adc0e4c52250aa22
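The alias discovery used by `ArtistInspector.get_aliases` and `is_alias` in the file above rests on a docstring convention: an alias setter's docstring starts with ``Alias for `` and names the real setter in backticks. A minimal, self-contained sketch of that mechanism — `FakeLine` and its setters are hypothetical stand-ins, not matplotlib classes — looks like this:

```python
import inspect
import re

class FakeLine:
    """Hypothetical stand-in mimicking matplotlib's alias convention."""

    def set_linewidth(self, w):
        """Set the line width."""
        self._lw = w

    def set_lw(self, w):
        """Alias for `set_linewidth`."""
        self.set_linewidth(w)

def is_alias(method):
    # Mirrors ArtistInspector.is_alias: an alias is recognized purely by
    # its docstring prefix.
    doc = inspect.getdoc(method)
    return doc is not None and doc.startswith("Alias for ")

def get_aliases(cls):
    # Mirrors ArtistInspector.get_aliases: map each full property name to
    # the set of short aliases that forward to it.
    aliases = {}
    for name in dir(cls):
        if not name.startswith(("set_", "get_")):
            continue
        func = getattr(cls, name)
        if not callable(func) or not is_alias(func):
            continue
        # Pull the backticked `set_.*`/`get_.*` target out of the docstring,
        # as the regex in get_aliases does.
        target = re.search(r"`((?:set|get)_\w+)`", inspect.getdoc(func)).group(1)
        aliases.setdefault(target[4:], set()).add(name[4:])
    return aliases

print(get_aliases(FakeLine))  # {'linewidth': {'lw'}}
```

This is why `get_valid_values` returns ``None`` for aliases: their docstrings carry no ``ACCEPTS:`` line of their own, only the pointer to the canonical setter.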
# .venv\Lib\site-packages\matplotlib\artist.pyi

from .axes._base import _AxesBase
from .backend_bases import RendererBase, MouseEvent
from .figure import Figure, SubFigure
from .path import Path
from .patches import Patch
from .patheffects import AbstractPathEffect
from .transforms import (
    BboxBase,
    Bbox,
    Transform,
    TransformedPatchPath,
    TransformedPath,
)

import numpy as np

from collections.abc import Callable, Iterable
from typing import Any, Literal, NamedTuple, TextIO, overload, TypeVar
from numpy.typing import ArrayLike

_T_Artist = TypeVar("_T_Artist", bound=Artist)

def allow_rasterization(draw): ...

class _XYPair(NamedTuple):
    x: ArrayLike
    y: ArrayLike

class _Unset: ...

class Artist:
    zorder: float
    stale_callback: Callable[[Artist, bool], None] | None
    @property
    def figure(self) -> Figure | SubFigure: ...
    clipbox: BboxBase | None
    def __init__(self) -> None: ...
    def remove(self) -> None: ...
    def have_units(self) -> bool: ...
    # TODO units
    def convert_xunits(self, x): ...
    def convert_yunits(self, y): ...
    @property
    def axes(self) -> _AxesBase | None: ...
    @axes.setter
    def axes(self, new_axes: _AxesBase | None) -> None: ...
    @property
    def stale(self) -> bool: ...
    @stale.setter
    def stale(self, val: bool) -> None: ...
    def get_window_extent(self, renderer: RendererBase | None = ...) -> Bbox: ...
    def get_tightbbox(self, renderer: RendererBase | None = ...) -> Bbox | None: ...
    def add_callback(self, func: Callable[[Artist], Any]) -> int: ...
    def remove_callback(self, oid: int) -> None: ...
    def pchanged(self) -> None: ...
    def is_transform_set(self) -> bool: ...
    def set_transform(self, t: Transform | None) -> None: ...
    def get_transform(self) -> Transform: ...
    def get_children(self) -> list[Artist]: ...
    # TODO can these dicts be type narrowed? e.g. str keys
    def contains(self, mouseevent: MouseEvent) -> tuple[bool, dict[Any, Any]]: ...
    def pickable(self) -> bool: ...
    def pick(self, mouseevent: MouseEvent) -> None: ...
    def set_picker(
        self,
        picker: None
        | bool
        | float
        | Callable[[Artist, MouseEvent], tuple[bool, dict[Any, Any]]],
    ) -> None: ...
    def get_picker(
        self,
    ) -> None | bool | float | Callable[
        [Artist, MouseEvent], tuple[bool, dict[Any, Any]]
    ]: ...
    def get_url(self) -> str | None: ...
    def set_url(self, url: str | None) -> None: ...
    def get_gid(self) -> str | None: ...
    def set_gid(self, gid: str | None) -> None: ...
    def get_snap(self) -> bool | None: ...
    def set_snap(self, snap: bool | None) -> None: ...
    def get_sketch_params(self) -> tuple[float, float, float] | None: ...
    def set_sketch_params(
        self,
        scale: float | None = ...,
        length: float | None = ...,
        randomness: float | None = ...,
    ) -> None: ...
    def set_path_effects(self, path_effects: list[AbstractPathEffect]) -> None: ...
    def get_path_effects(self) -> list[AbstractPathEffect]: ...
    @overload
    def get_figure(self, root: Literal[True]) -> Figure | None: ...
    @overload
    def get_figure(self, root: Literal[False]) -> Figure | SubFigure | None: ...
    @overload
    def get_figure(self, root: bool = ...) -> Figure | SubFigure | None: ...
    def set_figure(self, fig: Figure | SubFigure) -> None: ...
    def set_clip_box(self, clipbox: BboxBase | None) -> None: ...
    def set_clip_path(
        self,
        path: Patch | Path | TransformedPath | TransformedPatchPath | None,
        transform: Transform | None = ...,
    ) -> None: ...
    def get_alpha(self) -> float | None: ...
    def get_visible(self) -> bool: ...
    def get_animated(self) -> bool: ...
    def get_in_layout(self) -> bool: ...
    def get_clip_on(self) -> bool: ...
    def get_clip_box(self) -> Bbox | None: ...
    def get_clip_path(
        self,
    ) -> Patch | Path | TransformedPath | TransformedPatchPath | None: ...
    def get_transformed_clip_path_and_affine(
        self,
    ) -> tuple[None, None] | tuple[Path, Transform]: ...
    def set_clip_on(self, b: bool) -> None: ...
    def get_rasterized(self) -> bool: ...
    def set_rasterized(self, rasterized: bool) -> None: ...
    def get_agg_filter(self) -> Callable[[ArrayLike, float], tuple[np.ndarray, float, float]] | None: ...
    def set_agg_filter(
        self, filter_func: Callable[[ArrayLike, float], tuple[np.ndarray, float, float]] | None
    ) -> None: ...
    def draw(self, renderer: RendererBase) -> None: ...
    def set_alpha(self, alpha: float | None) -> None: ...
    def set_visible(self, b: bool) -> None: ...
    def set_animated(self, b: bool) -> None: ...
    def set_in_layout(self, in_layout: bool) -> None: ...
    def get_label(self) -> object: ...
    def set_label(self, s: object) -> None: ...
    def get_zorder(self) -> float: ...
    def set_zorder(self, level: float) -> None: ...
    @property
    def sticky_edges(self) -> _XYPair: ...
    def update_from(self, other: Artist) -> None: ...
    def properties(self) -> dict[str, Any]: ...
    def update(self, props: dict[str, Any]) -> list[Any]: ...
    def _internal_update(self, kwargs: Any) -> list[Any]: ...
    def set(self, **kwargs: Any) -> list[Any]: ...

    @overload
    def findobj(
        self,
        match: None | Callable[[Artist], bool] = ...,
        include_self: bool = ...,
    ) -> list[Artist]: ...

    @overload
    def findobj(
        self,
        match: type[_T_Artist],
        include_self: bool = ...,
    ) -> list[_T_Artist]: ...

    def get_cursor_data(self, event: MouseEvent) -> Any: ...
    def format_cursor_data(self, data: Any) -> str: ...
    def get_mouseover(self) -> bool: ...
    def set_mouseover(self, mouseover: bool) -> None: ...
    @property
    def mouseover(self) -> bool: ...
    @mouseover.setter
    def mouseover(self, mouseover: bool) -> None: ...

class ArtistInspector:
    oorig: Artist | type[Artist]
    o: type[Artist]
    aliasd: dict[str, set[str]]
    def __init__(
        self, o: Artist | type[Artist] | Iterable[Artist | type[Artist]]
    ) -> None: ...
    def get_aliases(self) -> dict[str, set[str]]: ...
    def get_valid_values(self, attr: str) -> str | None: ...
    def get_setters(self) -> list[str]: ...
    @staticmethod
    def number_of_parameters(func: Callable) -> int: ...
    @staticmethod
    def is_alias(method: Callable) -> bool: ...
    def aliased_name(self, s: str) -> str: ...
    def aliased_name_rest(self, s: str, target: str) -> str: ...
    @overload
    def pprint_setters(
        self, prop: None = ..., leadingspace: int = ...
    ) -> list[str]: ...
    @overload
    def pprint_setters(self, prop: str, leadingspace: int = ...) -> str: ...
    @overload
    def pprint_setters_rest(
        self, prop: None = ..., leadingspace: int = ...
    ) -> list[str]: ...
    @overload
    def pprint_setters_rest(self, prop: str, leadingspace: int = ...) -> str: ...
    def properties(self) -> dict[str, Any]: ...
    def pprint_getters(self) -> list[str]: ...

def getp(obj: Artist, property: str | None = ...) -> Any: ...

get = getp

def setp(obj: Artist, *args, file: TextIO | None = ..., **kwargs) -> list[Any] | None: ...
def kwdoc(artist: Artist | type[Artist] | Iterable[Artist | type[Artist]]) -> str: ...
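The `Artist.findobj` stub above pairs a predicate overload returning `list[Artist]` with a `type[_T_Artist]` overload returning `list[_T_Artist]`, which is why `ax.findobj(Line2D)` type-checks as `list[Line2D]`. A minimal stdlib-only sketch of the same overload pattern — the `Node`/`Tree` classes here are hypothetical stand-ins, not matplotlib API:

```python
from __future__ import annotations

from collections.abc import Callable
from typing import TypeVar, overload


class Node:
    """Toy stand-in for matplotlib's Artist tree."""
    def __init__(self, children: list[Node] | None = None) -> None:
        self.children = children or []


class Leaf(Node):
    pass


_T_Node = TypeVar("_T_Node", bound=Node)


class Tree:
    def __init__(self, root: Node) -> None:
        self.root = root

    @overload
    def findobj(self, match: Callable[[Node], bool] | None = ...) -> list[Node]: ...
    @overload
    def findobj(self, match: type[_T_Node]) -> list[_T_Node]: ...
    def findobj(self, match=None):
        # Flatten the tree, then filter by predicate or isinstance check,
        # mirroring the two overloads declared above.
        def walk(n: Node) -> list[Node]:
            out = [n]
            for c in n.children:
                out.extend(walk(c))
            return out

        nodes = walk(self.root)
        if match is None:
            return nodes
        if isinstance(match, type):
            return [n for n in nodes if isinstance(n, match)]
        return [n for n in nodes if match(n)]


tree = Tree(Node([Leaf(), Node([Leaf()])]))
leaves = tree.findobj(Leaf)  # type checkers infer list[Leaf]
```

The runtime implementation is a single untyped function; the `@overload` stubs exist only for the type checker, exactly as in the `.pyi` file above.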
# .venv\Lib\site-packages\matplotlib\axis.pyi

from collections.abc import Callable, Iterable, Sequence
import datetime
from typing import Any, Literal, overload
from typing_extensions import Self  # < Py 3.11

import numpy as np
from numpy.typing import ArrayLike

import matplotlib.artist as martist
from matplotlib import cbook
from matplotlib.axes import Axes
from matplotlib.backend_bases import RendererBase
from matplotlib.lines import Line2D
from matplotlib.text import Text
from matplotlib.ticker import Locator, Formatter
from matplotlib.transforms import Transform, Bbox
from matplotlib.typing import ColorType
from matplotlib.units import ConversionInterface


GRIDLINE_INTERPOLATION_STEPS: int

class Tick(martist.Artist):
    axes: Axes
    tick1line: Line2D
    tick2line: Line2D
    gridline: Line2D
    label1: Text
    label2: Text
    def __init__(
        self,
        axes: Axes,
        loc: float,
        *,
        size: float | None = ...,
        width: float | None = ...,
        color: ColorType | None = ...,
        tickdir: Literal["in", "inout", "out"] | None = ...,
        pad: float | None = ...,
        labelsize: float | None = ...,
        labelcolor: ColorType | None = ...,
        labelfontfamily: str | Sequence[str] | None = ...,
        zorder: float | None = ...,
        gridOn: bool | None = ...,
        tick1On: bool = ...,
        tick2On: bool = ...,
        label1On: bool = ...,
        label2On: bool = ...,
        major: bool = ...,
        labelrotation: float = ...,
        grid_color: ColorType | None = ...,
        grid_linestyle: str | None = ...,
        grid_linewidth: float | None = ...,
        grid_alpha: float | None = ...,
        **kwargs
    ) -> None: ...
    def get_tickdir(self) -> Literal["in", "inout", "out"]: ...
    def get_tick_padding(self) -> float: ...
    def get_children(self) -> list[martist.Artist]: ...
    stale: bool
    def set_pad(self, val: float) -> None: ...
    def get_pad(self) -> float: ...
    def get_loc(self) -> float: ...
    def set_url(self, url: str | None) -> None: ...
    def get_view_interval(self) -> ArrayLike: ...
    def update_position(self, loc: float) -> None: ...

class XTick(Tick):
    __name__: str
    def __init__(self, *args, **kwargs) -> None: ...
    stale: bool
    def update_position(self, loc: float) -> None: ...
    def get_view_interval(self) -> np.ndarray: ...

class YTick(Tick):
    __name__: str
    def __init__(self, *args, **kwargs) -> None: ...
    stale: bool
    def update_position(self, loc: float) -> None: ...
    def get_view_interval(self) -> np.ndarray: ...

class Ticker:
    def __init__(self) -> None: ...
    @property
    def locator(self) -> Locator | None: ...
    @locator.setter
    def locator(self, locator: Locator) -> None: ...
    @property
    def formatter(self) -> Formatter | None: ...
    @formatter.setter
    def formatter(self, formatter: Formatter) -> None: ...

class _LazyTickList:
    def __init__(self, major: bool) -> None: ...
    @overload
    def __get__(self, instance: None, owner: None) -> Self: ...
    @overload
    def __get__(self, instance: Axis, owner: type[Axis]) -> list[Tick]: ...

class Axis(martist.Artist):
    OFFSETTEXTPAD: int
    isDefault_label: bool
    axes: Axes
    major: Ticker
    minor: Ticker
    callbacks: cbook.CallbackRegistry
    label: Text
    offsetText: Text
    labelpad: float
    pickradius: float
    def __init__(self, axes, *, pickradius: float = ..., clear: bool = ...) -> None: ...
    @property
    def isDefault_majloc(self) -> bool: ...
    @isDefault_majloc.setter
    def isDefault_majloc(self, value: bool) -> None: ...
    @property
    def isDefault_majfmt(self) -> bool: ...
    @isDefault_majfmt.setter
    def isDefault_majfmt(self, value: bool) -> None: ...
    @property
    def isDefault_minloc(self) -> bool: ...
    @isDefault_minloc.setter
    def isDefault_minloc(self, value: bool) -> None: ...
    @property
    def isDefault_minfmt(self) -> bool: ...
    @isDefault_minfmt.setter
    def isDefault_minfmt(self, value: bool) -> None: ...
    majorTicks: _LazyTickList
    minorTicks: _LazyTickList
    def get_remove_overlapping_locs(self) -> bool: ...
    def set_remove_overlapping_locs(self, val: bool) -> None: ...
    @property
    def remove_overlapping_locs(self) -> bool: ...
    @remove_overlapping_locs.setter
    def remove_overlapping_locs(self, val: bool) -> None: ...
    stale: bool
    def set_label_coords(
        self, x: float, y: float, transform: Transform | None = ...
    ) -> None: ...
    def get_transform(self) -> Transform: ...
    def get_scale(self) -> str: ...
    def limit_range_for_scale(
        self, vmin: float, vmax: float
    ) -> tuple[float, float]: ...
    def get_children(self) -> list[martist.Artist]: ...
    # TODO units
    converter: Any
    units: Any
    def clear(self) -> None: ...
    def reset_ticks(self) -> None: ...
    def minorticks_on(self) -> None: ...
    def minorticks_off(self) -> None: ...
    def set_tick_params(
        self,
        which: Literal["major", "minor", "both"] = ...,
        reset: bool = ...,
        **kwargs
    ) -> None: ...
    def get_tick_params(
        self, which: Literal["major", "minor"] = ...
    ) -> dict[str, Any]: ...
    def get_view_interval(self) -> tuple[float, float]: ...
    def set_view_interval(
        self, vmin: float, vmax: float, ignore: bool = ...
    ) -> None: ...
    def get_data_interval(self) -> tuple[float, float]: ...
    def set_data_interval(
        self, vmin: float, vmax: float, ignore: bool = ...
    ) -> None: ...
    def get_inverted(self) -> bool: ...
    def set_inverted(self, inverted: bool) -> None: ...
    def set_default_intervals(self) -> None: ...
    def get_tightbbox(
        self, renderer: RendererBase | None = ..., *, for_layout_only: bool = ...
    ) -> Bbox | None: ...
    def get_tick_padding(self) -> float: ...
    def get_gridlines(self) -> list[Line2D]: ...
    def get_label(self) -> Text: ...
    def get_offset_text(self) -> Text: ...
    def get_pickradius(self) -> float: ...
    def get_majorticklabels(self) -> list[Text]: ...
    def get_minorticklabels(self) -> list[Text]: ...
    def get_ticklabels(
        self, minor: bool = ..., which: Literal["major", "minor", "both"] | None = ...
    ) -> list[Text]: ...
    def get_majorticklines(self) -> list[Line2D]: ...
    def get_minorticklines(self) -> list[Line2D]: ...
    def get_ticklines(self, minor: bool = ...) -> list[Line2D]: ...
    def get_majorticklocs(self) -> np.ndarray: ...
    def get_minorticklocs(self) -> np.ndarray: ...
    def get_ticklocs(self, *, minor: bool = ...) -> np.ndarray: ...
    def get_ticks_direction(self, minor: bool = ...) -> np.ndarray: ...
    def get_label_text(self) -> str: ...
    def get_major_locator(self) -> Locator: ...
    def get_minor_locator(self) -> Locator: ...
    def get_major_formatter(self) -> Formatter: ...
    def get_minor_formatter(self) -> Formatter: ...
    def get_major_ticks(self, numticks: int | None = ...) -> list[Tick]: ...
    def get_minor_ticks(self, numticks: int | None = ...) -> list[Tick]: ...
    def grid(
        self,
        visible: bool | None = ...,
        which: Literal["major", "minor", "both"] = ...,
        **kwargs
    ) -> None: ...
    # TODO units
    def update_units(self, data): ...
    def have_units(self) -> bool: ...
    def convert_units(self, x): ...
    def get_converter(self) -> ConversionInterface | None: ...
    def set_converter(self, converter: ConversionInterface) -> None: ...
    def set_units(self, u) -> None: ...
    def get_units(self): ...
    def set_label_text(
        self, label: str, fontdict: dict[str, Any] | None = ..., **kwargs
    ) -> Text: ...
    def set_major_formatter(
        self, formatter: Formatter | str | Callable[[float, float], str]
    ) -> None: ...
    def set_minor_formatter(
        self, formatter: Formatter | str | Callable[[float, float], str]
    ) -> None: ...
    def set_major_locator(self, locator: Locator) -> None: ...
    def set_minor_locator(self, locator: Locator) -> None: ...
    def set_pickradius(self, pickradius: float) -> None: ...
    def set_ticklabels(
        self,
        labels: Iterable[str | Text],
        *,
        minor: bool = ...,
        fontdict: dict[str, Any] | None = ...,
        **kwargs
    ) -> list[Text]: ...
    def set_ticks(
        self,
        ticks: ArrayLike,
        labels: Iterable[str] | None = ...,
        *,
        minor: bool = ...,
        **kwargs
    ) -> list[Tick]: ...
    def axis_date(self, tz: str | datetime.tzinfo | None = ...) -> None: ...
    def get_tick_space(self) -> int: ...
    def get_label_position(self) -> Literal["top", "bottom"]: ...
    def set_label_position(
        self, position: Literal["top", "bottom", "left", "right"]
    ) -> None: ...
    def get_minpos(self) -> float: ...

class XAxis(Axis):
    __name__: str
    axis_name: str
    def __init__(self, *args, **kwargs) -> None: ...
    label_position: Literal["bottom", "top"]
    stale: bool
    def set_label_position(self, position: Literal["bottom", "top"]) -> None: ...  # type: ignore[override]
    def set_ticks_position(
        self, position: Literal["top", "bottom", "both", "default", "none"]
    ) -> None: ...
    def tick_top(self) -> None: ...
    def tick_bottom(self) -> None: ...
    def get_ticks_position(self) -> Literal["top", "bottom", "default", "unknown"]: ...
    def get_tick_space(self) -> int: ...

class YAxis(Axis):
    __name__: str
    axis_name: str
    def __init__(self, *args, **kwargs) -> None: ...
    label_position: Literal["left", "right"]
    stale: bool
    def set_label_position(self, position: Literal["left", "right"]) -> None: ...  # type: ignore[override]
    def set_offset_position(self, position: Literal["left", "right"]) -> None: ...
    def set_ticks_position(
        self, position: Literal["left", "right", "both", "default", "none"]
    ) -> None: ...
    def tick_right(self) -> None: ...
    def tick_left(self) -> None: ...
    def get_ticks_position(self) -> Literal["left", "right", "default", "unknown"]: ...
    def get_tick_space(self) -> int: ...
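The `_LazyTickList` stub types a descriptor whose `__get__` is overloaded: class access returns the descriptor itself, instance access returns a materialized `list[Tick]`. A stdlib sketch of that lazy-list descriptor shape — `LazyList` and the toy `Axis` here are hypothetical, illustrating the pattern rather than matplotlib's implementation:

```python
from __future__ import annotations

from typing import Generic, TypeVar, overload

T = TypeVar("T")


class LazyList(Generic[T]):
    """Descriptor that builds a per-instance list on first access,
    mirroring the __get__ overload shape of _LazyTickList above."""

    def __init__(self, factory) -> None:
        self._factory = factory  # callable(instance) -> list
        self._name = ""

    def __set_name__(self, owner: type, name: str) -> None:
        self._name = name

    @overload
    def __get__(self, instance: None, owner: type) -> LazyList[T]: ...
    @overload
    def __get__(self, instance: object, owner: type) -> list[T]: ...
    def __get__(self, instance, owner):
        if instance is None:
            # class access returns the descriptor object itself
            return self
        # instance access: build once and cache in the instance dict so
        # this (non-data) descriptor is bypassed on later lookups
        value = self._factory(instance)
        instance.__dict__[self._name] = value
        return value


class Axis:
    majorTicks: LazyList[int] = LazyList(lambda self: [0, 1, 2])


ax = Axis()
```

Because `LazyList` defines no `__set__`, the cached entry in `ax.__dict__` shadows the descriptor after the first access, so the factory runs only once per instance.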
# .venv\Lib\site-packages\matplotlib\backend_bases.pyi

from enum import Enum, IntEnum
import os
from matplotlib import (
    cbook,
    transforms,
    widgets,
    _api,
)
from matplotlib.artist import Artist
from matplotlib.axes import Axes
from matplotlib.backend_managers import ToolManager
from matplotlib.backend_tools import Cursors, ToolBase
from matplotlib.colorbar import Colorbar
from matplotlib.figure import Figure
from matplotlib.font_manager import FontProperties
from matplotlib.path import Path
from matplotlib.texmanager import TexManager
from matplotlib.text import Text
from matplotlib.transforms import Bbox, BboxBase, Transform, TransformedPath

from collections.abc import Callable, Iterable, Sequence
from typing import Any, IO, Literal, NamedTuple, TypeVar
from numpy.typing import ArrayLike
from .typing import ColorType, LineStyleType, CapStyleType, JoinStyleType

def register_backend(
    format: str, backend: str | type[FigureCanvasBase], description: str | None = ...
) -> None: ...
def get_registered_canvas_class(format: str) -> type[FigureCanvasBase]: ...

class RendererBase:
    def __init__(self) -> None: ...
    def open_group(self, s: str, gid: str | None = ...) -> None: ...
    def close_group(self, s: str) -> None: ...
    def draw_path(
        self,
        gc: GraphicsContextBase,
        path: Path,
        transform: Transform,
        rgbFace: ColorType | None = ...,
    ) -> None: ...
    def draw_markers(
        self,
        gc: GraphicsContextBase,
        marker_path: Path,
        marker_trans: Transform,
        path: Path,
        trans: Transform,
        rgbFace: ColorType | None = ...,
    ) -> None: ...
    def draw_path_collection(
        self,
        gc: GraphicsContextBase,
        master_transform: Transform,
        paths: Sequence[Path],
        all_transforms: Sequence[ArrayLike],
        offsets: ArrayLike | Sequence[ArrayLike],
        offset_trans: Transform,
        facecolors: ColorType | Sequence[ColorType],
        edgecolors: ColorType | Sequence[ColorType],
        linewidths: float | Sequence[float],
        linestyles: LineStyleType | Sequence[LineStyleType],
        antialiaseds: bool | Sequence[bool],
        urls: str | Sequence[str],
        offset_position: Any,
    ) -> None: ...
    def draw_quad_mesh(
        self,
        gc: GraphicsContextBase,
        master_transform: Transform,
        meshWidth,
        meshHeight,
        coordinates: ArrayLike,
        offsets: ArrayLike | Sequence[ArrayLike],
        offsetTrans: Transform,
        facecolors: Sequence[ColorType],
        antialiased: bool,
        edgecolors: Sequence[ColorType] | ColorType | None,
    ) -> None: ...
    def draw_gouraud_triangles(
        self,
        gc: GraphicsContextBase,
        triangles_array: ArrayLike,
        colors_array: ArrayLike,
        transform: Transform,
    ) -> None: ...
    def get_image_magnification(self) -> float: ...
    def draw_image(
        self,
        gc: GraphicsContextBase,
        x: float,
        y: float,
        im: ArrayLike,
        transform: transforms.Affine2DBase | None = ...,
    ) -> None: ...
    def option_image_nocomposite(self) -> bool: ...
    def option_scale_image(self) -> bool: ...
    def draw_tex(
        self,
        gc: GraphicsContextBase,
        x: float,
        y: float,
        s: str,
        prop: FontProperties,
        angle: float,
        *,
        mtext: Text | None = ...
    ) -> None: ...
    def draw_text(
        self,
        gc: GraphicsContextBase,
        x: float,
        y: float,
        s: str,
        prop: FontProperties,
        angle: float,
        ismath: bool | Literal["TeX"] = ...,
        mtext: Text | None = ...,
    ) -> None: ...
    def get_text_width_height_descent(
        self, s: str, prop: FontProperties, ismath: bool | Literal["TeX"]
    ) -> tuple[float, float, float]: ...
    def flipy(self) -> bool: ...
    def get_canvas_width_height(self) -> tuple[float, float]: ...
    def get_texmanager(self) -> TexManager: ...
    def new_gc(self) -> GraphicsContextBase: ...
    def points_to_pixels(self, points: ArrayLike) -> ArrayLike: ...
    def start_rasterizing(self) -> None: ...
    def stop_rasterizing(self) -> None: ...
    def start_filter(self) -> None: ...
    def stop_filter(self, filter_func) -> None: ...

class GraphicsContextBase:
    def __init__(self) -> None: ...
    def copy_properties(self, gc: GraphicsContextBase) -> None: ...
    def restore(self) -> None: ...
    def get_alpha(self) -> float: ...
    def get_antialiased(self) -> int: ...
    def get_capstyle(self) -> Literal["butt", "projecting", "round"]: ...
    def get_clip_rectangle(self) -> Bbox | None: ...
    def get_clip_path(
        self,
    ) -> tuple[TransformedPath, Transform] | tuple[None, None]: ...
    def get_dashes(self) -> tuple[float, ArrayLike | None]: ...
    def get_forced_alpha(self) -> bool: ...
    def get_joinstyle(self) -> Literal["miter", "round", "bevel"]: ...
    def get_linewidth(self) -> float: ...
    def get_rgb(self) -> tuple[float, float, float, float]: ...
    def get_url(self) -> str | None: ...
    def get_gid(self) -> int | None: ...
    def get_snap(self) -> bool | None: ...
    def set_alpha(self, alpha: float) -> None: ...
    def set_antialiased(self, b: bool) -> None: ...
    def set_capstyle(self, cs: CapStyleType) -> None: ...
    def set_clip_rectangle(self, rectangle: Bbox | None) -> None: ...
    def set_clip_path(self, path: TransformedPath | None) -> None: ...
    def set_dashes(self, dash_offset: float, dash_list: ArrayLike | None) -> None: ...
    def set_foreground(self, fg: ColorType, isRGBA: bool = ...) -> None: ...
    def set_joinstyle(self, js: JoinStyleType) -> None: ...
    def set_linewidth(self, w: float) -> None: ...
    def set_url(self, url: str | None) -> None: ...
    def set_gid(self, id: int | None) -> None: ...
    def set_snap(self, snap: bool | None) -> None: ...
    def set_hatch(self, hatch: str | None) -> None: ...
    def get_hatch(self) -> str | None: ...
    def get_hatch_path(self, density: float = ...) -> Path: ...
    def get_hatch_color(self) -> ColorType: ...
    def set_hatch_color(self, hatch_color: ColorType) -> None: ...
    def get_hatch_linewidth(self) -> float: ...
    def set_hatch_linewidth(self, hatch_linewidth: float) -> None: ...
    def get_sketch_params(self) -> tuple[float, float, float] | None: ...
    def set_sketch_params(
        self,
        scale: float | None = ...,
        length: float | None = ...,
        randomness: float | None = ...,
    ) -> None: ...

class TimerBase:
    callbacks: list[tuple[Callable, tuple, dict[str, Any]]]
    def __init__(
        self,
        interval: int | None = ...,
        callbacks: list[tuple[Callable, tuple, dict[str, Any]]] | None = ...,
    ) -> None: ...
    def __del__(self) -> None: ...
    def start(self, interval: int | None = ...) -> None: ...
    def stop(self) -> None: ...
    @property
    def interval(self) -> int: ...
    @interval.setter
    def interval(self, interval: int) -> None: ...
    @property
    def single_shot(self) -> bool: ...
    @single_shot.setter
    def single_shot(self, ss: bool) -> None: ...
    def add_callback(self, func: Callable, *args, **kwargs) -> Callable: ...
    def remove_callback(self, func: Callable, *args, **kwargs) -> None: ...

class Event:
    name: str
    canvas: FigureCanvasBase
    guiEvent: Any
    def __init__(
        self, name: str, canvas: FigureCanvasBase, guiEvent: Any | None = ...
    ) -> None: ...

class DrawEvent(Event):
    renderer: RendererBase
    def __init__(
        self, name: str, canvas: FigureCanvasBase, renderer: RendererBase
    ) -> None: ...

class ResizeEvent(Event):
    width: int
    height: int
    def __init__(self, name: str, canvas: FigureCanvasBase) -> None: ...

class CloseEvent(Event): ...

class LocationEvent(Event):
    x: int
    y: int
    inaxes: Axes | None
    xdata: float | None
    ydata: float | None
    def __init__(
        self,
        name: str,
        canvas: FigureCanvasBase,
        x: int,
        y: int,
        guiEvent: Any | None = ...,
        *,
        modifiers: Iterable[str] | None = ...,
    ) -> None: ...

class MouseButton(IntEnum):
    LEFT = 1
    MIDDLE = 2
    RIGHT = 3
    BACK = 8
    FORWARD = 9

class MouseEvent(LocationEvent):
    button: MouseButton | Literal["up", "down"] | None
    key: str | None
    step: float
    dblclick: bool
    def __init__(
        self,
        name: str,
        canvas: FigureCanvasBase,
        x: int,
        y: int,
        button: MouseButton | Literal["up", "down"] | None = ...,
        key: str | None = ...,
        step: float = ...,
        dblclick: bool = ...,
        guiEvent: Any | None = ...,
        *,
        buttons: Iterable[MouseButton] | None = ...,
        modifiers: Iterable[str] | None = ...,
    ) -> None: ...

class PickEvent(Event):
    mouseevent: MouseEvent
    artist: Artist
    def __init__(
        self,
        name: str,
        canvas: FigureCanvasBase,
        mouseevent: MouseEvent,
        artist: Artist,
        guiEvent: Any | None = ...,
        **kwargs
    ) -> None: ...

class KeyEvent(LocationEvent):
    key: str | None
    def __init__(
        self,
        name: str,
        canvas: FigureCanvasBase,
        key: str | None,
        x: int = ...,
        y: int = ...,
        guiEvent: Any | None = ...,
    ) -> None: ...

class FigureCanvasBase:
    required_interactive_framework: str | None

    @_api.classproperty
    def manager_class(cls) -> type[FigureManagerBase]: ...
    events: list[str]
    fixed_dpi: None | float
    filetypes: dict[str, str]

    @_api.classproperty
    def supports_blit(cls) -> bool: ...

    figure: Figure
    manager: None | FigureManagerBase
    widgetlock: widgets.LockDraw
    mouse_grabber: None | Axes
    toolbar: None | NavigationToolbar2
    def __init__(self, figure: Figure | None = ...) -> None: ...
    @property
    def callbacks(self) -> cbook.CallbackRegistry: ...
    @property
    def button_pick_id(self) -> int: ...
    @property
    def scroll_pick_id(self) -> int: ...
    @classmethod
    def new_manager(cls, figure: Figure, num: int | str) -> FigureManagerBase: ...
    def is_saving(self) -> bool: ...
    def blit(self, bbox: BboxBase | None = ...) -> None: ...
    def inaxes(self, xy: tuple[float, float]) -> Axes | None: ...
    def grab_mouse(self, ax: Axes) -> None: ...
    def release_mouse(self, ax: Axes) -> None: ...
    def set_cursor(self, cursor: Cursors) -> None: ...
    def draw(self, *args, **kwargs) -> None: ...
    def draw_idle(self, *args, **kwargs) -> None: ...
    @property
    def device_pixel_ratio(self) -> float: ...
    def get_width_height(self, *, physical: bool = ...) -> tuple[int, int]: ...
    @classmethod
    def get_supported_filetypes(cls) -> dict[str, str]: ...
    @classmethod
    def get_supported_filetypes_grouped(cls) -> dict[str, list[str]]: ...
    def print_figure(
        self,
        filename: str | os.PathLike | IO,
        dpi: float | None = ...,
        facecolor: ColorType | Literal["auto"] | None = ...,
        edgecolor: ColorType | Literal["auto"] | None = ...,
        orientation: str = ...,
        format: str | None = ...,
        *,
        bbox_inches: Literal["tight"] | Bbox | None = ...,
        pad_inches: float | None = ...,
        bbox_extra_artists: list[Artist] | None = ...,
        backend: str | None = ...,
        **kwargs
    ) -> Any: ...
    @classmethod
    def get_default_filetype(cls) -> str: ...
    def get_default_filename(self) -> str: ...
    _T = TypeVar("_T", bound=FigureCanvasBase)
    def mpl_connect(self, s: str, func: Callable[[Event], Any]) -> int: ...
    def mpl_disconnect(self, cid: int) -> None: ...
    def new_timer(
        self,
        interval: int | None = ...,
        callbacks: list[tuple[Callable, tuple, dict[str, Any]]] | None = ...,
    ) -> TimerBase: ...
    def flush_events(self) -> None: ...
    def start_event_loop(self, timeout: float = ...) -> None: ...
    def stop_event_loop(self) -> None: ...

def key_press_handler(
    event: KeyEvent,
    canvas: FigureCanvasBase | None = ...,
    toolbar: NavigationToolbar2 | None = ...,
) -> None: ...
def button_press_handler(
    event: MouseEvent,
    canvas: FigureCanvasBase | None = ...,
    toolbar: NavigationToolbar2 | None = ...,
) -> None: ...

class NonGuiException(Exception): ...

class FigureManagerBase:
    canvas: FigureCanvasBase
    num: int | str
    key_press_handler_id: int | None
    button_press_handler_id: int | None
    toolmanager: ToolManager | None
    toolbar: NavigationToolbar2 | ToolContainerBase | None
    def __init__(self, canvas: FigureCanvasBase, num: int | str) -> None: ...
    @classmethod
    def create_with_canvas(
        cls, canvas_class: type[FigureCanvasBase], figure: Figure, num: int | str
    ) -> FigureManagerBase: ...
    @classmethod
    def start_main_loop(cls) -> None: ...
    @classmethod
    def pyplot_show(cls, *, block: bool | None = ...) -> None: ...
    def show(self) -> None: ...
    def destroy(self) -> None: ...
    def full_screen_toggle(self) -> None: ...
    def resize(self, w: int, h: int) -> None: ...
    def get_window_title(self) -> str: ...
    def set_window_title(self, title: str) -> None: ...

cursors = Cursors

class _Mode(str, Enum):
    NONE = ""
    PAN = "pan/zoom"
    ZOOM = "zoom rect"

class NavigationToolbar2:
    toolitems: tuple[tuple[str, ...] | tuple[None, ...], ...]
    UNKNOWN_SAVED_STATUS: object
    canvas: FigureCanvasBase
    mode: _Mode
    def __init__(self, canvas: FigureCanvasBase) -> None: ...
    def set_message(self, s: str) -> None: ...
    def draw_rubberband(
        self, event: Event, x0: float, y0: float, x1: float, y1: float
    ) -> None: ...
    def remove_rubberband(self) -> None: ...
    def home(self, *args) -> None: ...
    def back(self, *args) -> None: ...
    def forward(self, *args) -> None: ...
    def mouse_move(self, event: MouseEvent) -> None: ...
    def pan(self, *args) -> None: ...

    class _PanInfo(NamedTuple):
        button: MouseButton
        axes: list[Axes]
        cid: int
    def press_pan(self, event: Event) -> None: ...
    def drag_pan(self, event: Event) -> None: ...
    def release_pan(self, event: Event) -> None: ...
    def zoom(self, *args) -> None: ...

    class _ZoomInfo(NamedTuple):
        direction: Literal["in", "out"]
        start_xy: tuple[float, float]
        axes: list[Axes]
        cid: int
        cbar: Colorbar
    def press_zoom(self, event: Event) -> None: ...
    def drag_zoom(self, event: Event) -> None: ...
    def release_zoom(self, event: Event) -> None: ...
    def push_current(self) -> None: ...
    subplot_tool: widgets.SubplotTool
    def configure_subplots(self, *args): ...
    def save_figure(self, *args) -> str | None | object: ...
    def update(self) -> None: ...
    def set_history_buttons(self) -> None: ...

class ToolContainerBase:
    toolmanager: ToolManager
    def __init__(self, toolmanager: ToolManager) -> None: ...
    def add_tool(self, tool: ToolBase, group: str, position: int = ...) -> None: ...
    def trigger_tool(self, name: str) -> None: ...
    def add_toolitem(
        self,
        name: str,
        group: str,
        position: int,
        image: str,
        description: str,
        toggle: bool,
    ) -> None: ...
    def toggle_toolitem(self, name: str, toggled: bool) -> None: ...
    def remove_toolitem(self, name: str) -> None: ...
    def set_message(self, s: str) -> None: ...

class _Backend:
    backend_version: str
    FigureCanvas: type[FigureCanvasBase] | None
    FigureManager: type[FigureManagerBase]
    mainloop: None | Callable[[], Any]
    @classmethod
    def new_figure_manager(cls, num: int | str, *args, **kwargs) -> FigureManagerBase: ...
    @classmethod
    def new_figure_manager_given_figure(cls, num: int | str, figure: Figure) -> FigureManagerBase: ...
    @classmethod
    def draw_if_interactive(cls) -> None: ...
    @classmethod
    def show(cls, *, block: bool | None = ...) -> None: ...
    @staticmethod
    def export(cls) -> type[_Backend]: ...

class ShowBase(_Backend):
    def __call__(self, block: bool | None = ...) -> None: ...
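`FigureCanvasBase.mpl_connect` returns an integer callback id that `mpl_disconnect` later consumes, with the callback keyed by an event-name string. A minimal stdlib sketch of that connect/disconnect contract — this toy `CallbackRegistry` only mirrors the interface shape, not the real `matplotlib.cbook.CallbackRegistry` (which also does weak-reference bookkeeping):

```python
from collections.abc import Callable
from itertools import count
from typing import Any


class Event:
    """Bare-bones event carrying only its signal name."""
    def __init__(self, name: str) -> None:
        self.name = name


class CallbackRegistry:
    def __init__(self) -> None:
        self._cid = count(1)  # monotonically increasing callback ids
        self._callbacks: dict[str, dict[int, Callable[[Event], Any]]] = {}

    def connect(self, signal: str, func: Callable[[Event], Any]) -> int:
        cid = next(self._cid)
        self._callbacks.setdefault(signal, {})[cid] = func
        return cid

    def disconnect(self, cid: int) -> None:
        # cids are globally unique, so search every signal bucket
        for funcs in self._callbacks.values():
            funcs.pop(cid, None)

    def process(self, signal: str, event: Event) -> None:
        for func in list(self._callbacks.get(signal, {}).values()):
            func(event)


registry = CallbackRegistry()
seen: list[str] = []
cid = registry.connect("key_press_event", lambda e: seen.append(e.name))
registry.process("key_press_event", Event("key_press_event"))
registry.disconnect(cid)
registry.process("key_press_event", Event("key_press_event"))  # no-op now
```

Returning an opaque cid rather than requiring the caller to keep the original function object is what lets `mpl_disconnect` work even when the callback was a lambda.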
# .venv\Lib\site-packages\matplotlib\backend_managers.py (truncated)

from matplotlib import _api, backend_tools, cbook, widgets


class ToolEvent:
    """Event for tool manipulation (add/remove)."""
    def __init__(self, name, sender, tool, data=None):
        self.name = name
        self.sender = sender
        self.tool = tool
        self.data = data


class ToolTriggerEvent(ToolEvent):
    """Event to inform that a tool has been triggered."""
    def __init__(self, name, sender, tool, canvasevent=None, data=None):
        super().__init__(name, sender, tool, data)
        self.canvasevent = canvasevent


class ToolManagerMessageEvent:
    """
    Event carrying messages from toolmanager.

    Messages usually get displayed to the user by the toolbar.
    """
    def __init__(self, name, sender, message):
        self.name = name
        self.sender = sender
        self.message = message


class ToolManager:
    """
    Manager for actions triggered by user interactions (key press, toolbar
    clicks, ...) on a Figure.

    Attributes
    ----------
    figure : `.Figure`
    keypresslock : `~matplotlib.widgets.LockDraw`
        `.LockDraw` object to know if the `canvas` key_press_event is locked.
    messagelock : `~matplotlib.widgets.LockDraw`
        `.LockDraw` object to know if the message is available to write.
    """

    def __init__(self, figure=None):
        self._key_press_handler_id = None

        self._tools = {}
        self._keys = {}
        self._toggled = {}
        self._callbacks = cbook.CallbackRegistry()

        # to process keypress event
        self.keypresslock = widgets.LockDraw()
        self.messagelock = widgets.LockDraw()

        self._figure = None
        self.set_figure(figure)

    @property
    def canvas(self):
        """Canvas managed by FigureManager."""
        if not self._figure:
            return None
        return self._figure.canvas

    @property
    def figure(self):
        """Figure that holds the canvas."""
        return self._figure

    @figure.setter
    def figure(self, figure):
        self.set_figure(figure)

    def set_figure(self, figure, update_tools=True):
        """
        Bind the given figure to the tools.

        Parameters
        ----------
        figure : `.Figure`
        update_tools : bool, default: True
            Force tools to update figure.
        """
        if self._key_press_handler_id:
            self.canvas.mpl_disconnect(self._key_press_handler_id)
        self._figure = figure
        if figure:
            self._key_press_handler_id = self.canvas.mpl_connect(
                'key_press_event', self._key_press)
        if update_tools:
            for tool in self._tools.values():
                tool.figure = figure

    def toolmanager_connect(self, s, func):
        """
        Connect event with string *s* to *func*.

        Parameters
        ----------
        s : str
            The name of the event. The following events are recognized:

            - 'tool_message_event'
            - 'tool_removed_event'
            - 'tool_added_event'

            For every tool added a new event is created

            - 'tool_trigger_TOOLNAME', where TOOLNAME is the id of the tool.

        func : callable
            Callback function for the toolmanager event with signature::

                def func(event: ToolEvent) -> Any

        Returns
        -------
        cid
            The callback id for the connection. This can be used in
            `.toolmanager_disconnect`.
        """
        return self._callbacks.connect(s, func)

    def toolmanager_disconnect(self, cid):
        """
        Disconnect callback id *cid*.

        Example usage::

            cid = toolmanager.toolmanager_connect('tool_trigger_zoom', onpress)
            # ...later
            toolmanager.toolmanager_disconnect(cid)
        """
        return self._callbacks.disconnect(cid)

    def message_event(self, message, sender=None):
        """Emit a `ToolManagerMessageEvent`."""
        if sender is None:
            sender = self

        s = 'tool_message_event'
        event = ToolManagerMessageEvent(s, sender, message)
        self._callbacks.process(s, event)

    @property
    def active_toggle(self):
        """Currently toggled tools."""
        return self._toggled

    def get_tool_keymap(self, name):
        """
        Return the keymap associated with the specified tool.

        Parameters
        ----------
        name : str
            Name of the Tool.

        Returns
        -------
        list of str
            List of keys associated with the tool.
        """
        keys = [k for k, i in self._keys.items() if i == name]
        return keys

    def _remove_keys(self, name):
        for k in self.get_tool_keymap(name):
            del self._keys[k]

    def update_keymap(self, name, key):
        """
        Set the keymap to associate with the specified tool.

        Parameters
        ----------
        name : str
            Name of the Tool.
        key : str or list of str
            Keys to associate with the tool.
        """
        if name not in self._tools:
            raise KeyError(f'{name!r} not in Tools')
        self._remove_keys(name)
        if isinstance(key, str):
            key = [key]
        for k in key:
            if k in self._keys:
                _api.warn_external(
                    f'Key {k} changed from {self._keys[k]} to {name}')
            self._keys[k] = name

    def remove_tool(self, name):
        """
        Remove tool named *name*.

        Parameters
        ----------
        name : str
            Name of the tool.
        """
        tool = self.get_tool(name)
        if getattr(tool, 'toggled', False):  # If it's a toggled toggle tool, untoggle
            self.trigger_tool(tool, 'toolmanager')
        self._remove_keys(name)
        event = ToolEvent('tool_removed_event', self, tool)
        self._callbacks.process(event.name, event)
        del self._tools[name]

    def add_tool(self, name, tool, *args, **kwargs):
        """
        Add *tool* to `ToolManager`.

        If successful, adds a new event ``tool_trigger_{name}`` where
        ``{name}`` is the *name* of the tool; the event is fired every time the
        tool is triggered.

        Parameters
        ----------
        name : str
            Name of the tool, treated as the ID, has to be unique.
        tool : type
            Class of the tool to be added. A subclass will be used
            instead if one was registered for the current canvas class.
        *args, **kwargs
            Passed to the *tool*'s constructor.

        See Also
        --------
        matplotlib.backend_tools.ToolBase : The base class for tools.
        """
        tool_cls = backend_tools._find_tool_class(type(self.canvas), tool)
        if not tool_cls:
            raise ValueError('Impossible to find class for %s' % str(tool))

        if name in self._tools:
            _api.warn_external('A "Tool class" with the same name already '
                               'exists, not added')
            return self._tools[name]

        tool_obj = tool_cls(self, name, *args, **kwargs)
        self._tools[name] = tool_obj

        if tool_obj.default_keymap is not None:
            self.update_keymap(name, tool_obj.default_keymap)

        # For toggle tools init the radio_group in self._toggled
        if isinstance(tool_obj, backend_tools.ToolToggleBase):
            # None group is not mutually exclusive, a set is used to keep track
            # of all toggled tools in this group
            if tool_obj.radio_group is None:
                self._toggled.setdefault(None, set())
            else:
                self._toggled.setdefault(tool_obj.radio_group, None)

            # If initially toggled
            if tool_obj.toggled:
                self._handle_toggle(tool_obj, None, None)
        tool_obj.set_figure(self.figure)

        event = ToolEvent('tool_added_event', self, tool_obj)
        self._callbacks.process(event.name, event)

        return tool_obj

    def _handle_toggle(self, tool, canvasevent, data):
        """
        Toggle tools, need to untoggle prior to using other Toggle tool.
        Called from trigger_tool.

        Parameters
        ----------
        tool : `.ToolBase`
        canvasevent : Event
            Original Canvas event or None.
        data : object
            Extra data to pass to the tool when triggering.
        """
        radio_group = tool.radio_group
        # radio_group None is not mutually exclusive
        # just keep track of toggled tools in this group
        if radio_group is None:
            if tool.name in self._toggled[None]:
                self._toggled[None].remove(tool.name)
            else:
                self._toggled[None].add(tool.name)
            return

        # If the tool already has a toggled
state, untoggle it\n if self._toggled[radio_group] == tool.name:\n toggled = None\n # If no tool was toggled in the radio_group\n # toggle it\n elif self._toggled[radio_group] is None:\n toggled = tool.name\n # Other tool in the radio_group is toggled\n else:\n # Untoggle previously toggled tool\n self.trigger_tool(self._toggled[radio_group],\n self,\n canvasevent,\n data)\n toggled = tool.name\n\n # Keep track of the toggled tool in the radio_group\n self._toggled[radio_group] = toggled\n\n def trigger_tool(self, name, sender=None, canvasevent=None, data=None):\n """\n Trigger a tool and emit the ``tool_trigger_{name}`` event.\n\n Parameters\n ----------\n name : str\n Name of the tool.\n sender : object\n Object that wishes to trigger the tool.\n canvasevent : Event\n Original Canvas event or None.\n data : object\n Extra data to pass to the tool when triggering.\n """\n tool = self.get_tool(name)\n if tool is None:\n return\n\n if sender is None:\n sender = self\n\n if isinstance(tool, backend_tools.ToolToggleBase):\n self._handle_toggle(tool, canvasevent, data)\n\n tool.trigger(sender, canvasevent, data) # Actually trigger Tool.\n\n s = 'tool_trigger_%s' % name\n event = ToolTriggerEvent(s, sender, tool, canvasevent, data)\n self._callbacks.process(s, event)\n\n def _key_press(self, event):\n if event.key is None or self.keypresslock.locked():\n return\n\n name = self._keys.get(event.key, None)\n if name is None:\n return\n self.trigger_tool(name, canvasevent=event)\n\n @property\n def tools(self):\n """A dict mapping tool name -> controlled tool."""\n return self._tools\n\n def get_tool(self, name, warn=True):\n """\n Return the tool object with the given name.\n\n For convenience, this passes tool objects through.\n\n Parameters\n ----------\n name : str or `.ToolBase`\n Name of the tool, or the tool itself.\n warn : bool, default: True\n Whether a warning should be emitted it no tool with the given name\n exists.\n\n Returns\n -------\n `.ToolBase` or None\n 
The tool or None if no tool with the given name exists.\n """\n if (isinstance(name, backend_tools.ToolBase)\n and name.name in self._tools):\n return name\n if name not in self._tools:\n if warn:\n _api.warn_external(\n f"ToolManager does not control tool {name!r}")\n return None\n return self._tools[name]\n | .venv\Lib\site-packages\matplotlib\backend_managers.py | backend_managers.py | Python | 11,795 | 0.95 | 0.191214 | 0.047771 | vue-tools | 873 | 2023-11-14T02:16:45.638372 | MIT | false | 7961223900fa6749734bdda9a3bb9266 |
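The flow above — tools stored by name, keys mapped to tool names, and a `tool_trigger_<name>` event fired after each trigger — can be sketched without matplotlib at all. This is a minimal, matplotlib-free illustration; the `MiniToolManager` class and its method names are hypothetical and only mirror the structure of `ToolManager`:

```python
# Minimal sketch of ToolManager's dispatch: a name -> tool map, a key ->
# name map, and a per-event callback list keyed 'tool_trigger_<name>'.

class MiniToolManager:
    def __init__(self):
        self._tools = {}       # tool name -> tool callable
        self._keys = {}        # key -> tool name
        self._callbacks = {}   # event name -> list of callbacks

    def connect(self, event_name, func):
        """Register *func* for *event_name* (cf. toolmanager_connect)."""
        self._callbacks.setdefault(event_name, []).append(func)

    def add_tool(self, name, tool, keymap=()):
        """Store the tool and bind its keys (cf. add_tool/update_keymap)."""
        self._tools[name] = tool
        for key in keymap:
            self._keys[key] = name

    def key_press(self, key):
        """Route a key press to the bound tool, if any (cf. _key_press)."""
        name = self._keys.get(key)
        if name is not None:
            self.trigger_tool(name)

    def trigger_tool(self, name):
        """Run the tool, then emit its 'tool_trigger_<name>' event."""
        self._tools[name]()
        for func in self._callbacks.get(f'tool_trigger_{name}', []):
            func(name)

log = []
tm = MiniToolManager()
tm.add_tool('home', lambda: log.append('home ran'), keymap=('h',))
tm.connect('tool_trigger_home', lambda name: log.append(f'{name} event'))
tm.key_press('h')
print(log)  # ['home ran', 'home event']
```

Note the ordering matches the real `trigger_tool`: the tool itself runs before its trigger event is processed, so event listeners observe the tool's post-trigger state.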
# .venv\Lib\site-packages\matplotlib\backend_managers.pyi

from matplotlib import backend_tools, widgets
from matplotlib.backend_bases import FigureCanvasBase
from matplotlib.figure import Figure

from collections.abc import Callable, Iterable
from typing import Any, TypeVar

class ToolEvent:
    name: str
    sender: Any
    tool: backend_tools.ToolBase
    data: Any
    def __init__(self, name, sender, tool, data: Any | None = ...) -> None: ...

class ToolTriggerEvent(ToolEvent):
    canvasevent: ToolEvent
    def __init__(
        self,
        name,
        sender,
        tool,
        canvasevent: ToolEvent | None = ...,
        data: Any | None = ...,
    ) -> None: ...

class ToolManagerMessageEvent:
    name: str
    sender: Any
    message: str
    def __init__(self, name: str, sender: Any, message: str) -> None: ...

class ToolManager:
    keypresslock: widgets.LockDraw
    messagelock: widgets.LockDraw
    def __init__(self, figure: Figure | None = ...) -> None: ...
    @property
    def canvas(self) -> FigureCanvasBase | None: ...
    @property
    def figure(self) -> Figure | None: ...
    @figure.setter
    def figure(self, figure: Figure) -> None: ...
    def set_figure(self, figure: Figure, update_tools: bool = ...) -> None: ...
    def toolmanager_connect(self, s: str, func: Callable[[ToolEvent], Any]) -> int: ...
    def toolmanager_disconnect(self, cid: int) -> None: ...
    def message_event(self, message: str, sender: Any | None = ...) -> None: ...
    @property
    def active_toggle(self) -> dict[str | None, list[str] | str]: ...
    def get_tool_keymap(self, name: str) -> list[str]: ...
    def update_keymap(self, name: str, key: str | Iterable[str]) -> None: ...
    def remove_tool(self, name: str) -> None: ...
    _T = TypeVar("_T", bound=backend_tools.ToolBase)
    def add_tool(self, name: str, tool: type[_T], *args, **kwargs) -> _T: ...
    def trigger_tool(
        self,
        name: str | backend_tools.ToolBase,
        sender: Any | None = ...,
        canvasevent: ToolEvent | None = ...,
        data: Any | None = ...,
    ) -> None: ...
    @property
    def tools(self) -> dict[str, backend_tools.ToolBase]: ...
    def get_tool(
        self, name: str | backend_tools.ToolBase, warn: bool = ...
    ) -> backend_tools.ToolBase | None: ...
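The `_T` TypeVar in the `add_tool` stub is what lets a type checker infer the concrete tool class from the argument rather than widening to `ToolBase`. A small standalone illustration of the same pattern (the `ToolBase`/`ZoomTool` classes and `add_tool` function here are hypothetical stand-ins, not the matplotlib API):

```python
from typing import TypeVar

class ToolBase: ...
class ZoomTool(ToolBase): ...

# Bound TypeVar: accepts any ToolBase subclass and returns that same type.
_T = TypeVar("_T", bound=ToolBase)

def add_tool(name: str, tool: type[_T]) -> _T:
    # Instantiating the class the caller passed preserves its type: a
    # checker infers add_tool("zoom", ZoomTool) as ZoomTool, not ToolBase.
    return tool()

zoom = add_tool("zoom", ZoomTool)
print(type(zoom).__name__)  # ZoomTool
```

Without the TypeVar, every `add_tool` call site would need a cast to recover subclass-specific attributes.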
"""\nAbstract base classes define the primitives for Tools.\nThese tools are used by `matplotlib.backend_managers.ToolManager`\n\n:class:`ToolBase`\n Simple stateless tool\n\n:class:`ToolToggleBase`\n Tool that has two states, only one Toggle tool can be\n active at any given time for the same\n `matplotlib.backend_managers.ToolManager`\n"""\n\nimport enum\nimport functools\nimport re\nimport time\nfrom types import SimpleNamespace\nimport uuid\nfrom weakref import WeakKeyDictionary\n\nimport numpy as np\n\nimport matplotlib as mpl\nfrom matplotlib._pylab_helpers import Gcf\nfrom matplotlib import _api, cbook\n\n\nclass Cursors(enum.IntEnum): # Must subclass int for the macOS backend.\n """Backend-independent cursor types."""\n POINTER = enum.auto()\n HAND = enum.auto()\n SELECT_REGION = enum.auto()\n MOVE = enum.auto()\n WAIT = enum.auto()\n RESIZE_HORIZONTAL = enum.auto()\n RESIZE_VERTICAL = enum.auto()\ncursors = Cursors # Backcompat.\n\n\n# _tool_registry, _register_tool_class, and _find_tool_class implement a\n# mechanism through which ToolManager.add_tool can determine whether a subclass\n# of the requested tool class has been registered (either for the current\n# canvas class or for a parent class), in which case that tool subclass will be\n# instantiated instead. This is the mechanism used e.g. 
to allow different\n# GUI backends to implement different specializations for ConfigureSubplots.\n\n\n_tool_registry = set()\n\n\ndef _register_tool_class(canvas_cls, tool_cls=None):\n """Decorator registering *tool_cls* as a tool class for *canvas_cls*."""\n if tool_cls is None:\n return functools.partial(_register_tool_class, canvas_cls)\n _tool_registry.add((canvas_cls, tool_cls))\n return tool_cls\n\n\ndef _find_tool_class(canvas_cls, tool_cls):\n """Find a subclass of *tool_cls* registered for *canvas_cls*."""\n for canvas_parent in canvas_cls.__mro__:\n for tool_child in _api.recursive_subclasses(tool_cls):\n if (canvas_parent, tool_child) in _tool_registry:\n return tool_child\n return tool_cls\n\n\n# Views positions tool\n_views_positions = 'viewpos'\n\n\nclass ToolBase:\n """\n Base tool class.\n\n A base tool, only implements `trigger` method or no method at all.\n The tool is instantiated by `matplotlib.backend_managers.ToolManager`.\n """\n\n default_keymap = None\n """\n Keymap to associate with this tool.\n\n ``list[str]``: List of keys that will trigger this tool when a keypress\n event is emitted on ``self.figure.canvas``. Note that this attribute is\n looked up on the instance, and can therefore be a property (this is used\n e.g. by the built-in tools to load the rcParams at instantiation time).\n """\n\n description = None\n """\n Description of the Tool.\n\n `str`: Tooltip used if the Tool is included in a Toolbar.\n """\n\n image = None\n """\n Icon filename.\n\n ``str | None``: Filename of the Toolbar icon; either absolute, or relative to the\n directory containing the Python source file where the ``Tool.image`` class attribute\n is defined (in the latter case, this cannot be defined as an instance attribute).\n In either case, the extension is optional; leaving it off lets individual backends\n select the icon format they prefer. 
If None, the *name* is used as a label in the\n toolbar button.\n """\n\n def __init__(self, toolmanager, name):\n self._name = name\n self._toolmanager = toolmanager\n self._figure = None\n\n name = property(\n lambda self: self._name,\n doc="The tool id (str, must be unique among tools of a tool manager).")\n toolmanager = property(\n lambda self: self._toolmanager,\n doc="The `.ToolManager` that controls this tool.")\n canvas = property(\n lambda self: self._figure.canvas if self._figure is not None else None,\n doc="The canvas of the figure affected by this tool, or None.")\n\n def set_figure(self, figure):\n self._figure = figure\n\n figure = property(\n lambda self: self._figure,\n # The setter must explicitly call self.set_figure so that subclasses can\n # meaningfully override it.\n lambda self, figure: self.set_figure(figure),\n doc="The Figure affected by this tool, or None.")\n\n def _make_classic_style_pseudo_toolbar(self):\n """\n Return a placeholder object with a single `canvas` attribute.\n\n This is useful to reuse the implementations of tools already provided\n by the classic Toolbars.\n """\n return SimpleNamespace(canvas=self.canvas)\n\n def trigger(self, sender, event, data=None):\n """\n Called when this tool gets used.\n\n This method is called by `.ToolManager.trigger_tool`.\n\n Parameters\n ----------\n event : `.Event`\n The canvas event that caused this tool to be called.\n sender : object\n Object that requested the tool to be triggered.\n data : object\n Extra data.\n """\n pass\n\n\nclass ToolToggleBase(ToolBase):\n """\n Toggleable tool.\n\n Every time it is triggered, it switches between enable and disable.\n\n Parameters\n ----------\n ``*args``\n Variable length argument to be used by the Tool.\n ``**kwargs``\n `toggled` if present and True, sets the initial state of the Tool\n Arbitrary keyword arguments to be consumed by the Tool\n """\n\n radio_group = None\n """\n Attribute to group 'radio' like tools (mutually exclusive).\n\n 
`str` that identifies the group or **None** if not belonging to a group.\n """\n\n cursor = None\n """Cursor to use when the tool is active."""\n\n default_toggled = False\n """Default of toggled state."""\n\n def __init__(self, *args, **kwargs):\n self._toggled = kwargs.pop('toggled', self.default_toggled)\n super().__init__(*args, **kwargs)\n\n def trigger(self, sender, event, data=None):\n """Calls `enable` or `disable` based on `toggled` value."""\n if self._toggled:\n self.disable(event)\n else:\n self.enable(event)\n self._toggled = not self._toggled\n\n def enable(self, event=None):\n """\n Enable the toggle tool.\n\n `trigger` calls this method when `toggled` is False.\n """\n pass\n\n def disable(self, event=None):\n """\n Disable the toggle tool.\n\n `trigger` call this method when `toggled` is True.\n\n This can happen in different circumstances.\n\n * Click on the toolbar tool button.\n * Call to `matplotlib.backend_managers.ToolManager.trigger_tool`.\n * Another `ToolToggleBase` derived tool is triggered\n (from the same `.ToolManager`).\n """\n pass\n\n @property\n def toggled(self):\n """State of the toggled tool."""\n return self._toggled\n\n def set_figure(self, figure):\n toggled = self.toggled\n if toggled:\n if self.figure:\n self.trigger(self, None)\n else:\n # if no figure the internal state is not changed\n # we change it here so next call to trigger will change it back\n self._toggled = False\n super().set_figure(figure)\n if toggled:\n if figure:\n self.trigger(self, None)\n else:\n # if there is no figure, trigger won't change the internal\n # state we change it back\n self._toggled = True\n\n\nclass ToolSetCursor(ToolBase):\n """\n Change to the current cursor while inaxes.\n\n This tool, keeps track of all `ToolToggleBase` derived tools, and updates\n the cursor when a tool gets triggered.\n """\n def __init__(self, *args, **kwargs):\n super().__init__(*args, **kwargs)\n self._id_drag = None\n self._current_tool = None\n 
self._default_cursor = cursors.POINTER\n self._last_cursor = self._default_cursor\n self.toolmanager.toolmanager_connect('tool_added_event',\n self._add_tool_cbk)\n for tool in self.toolmanager.tools.values(): # process current tools\n self._add_tool_cbk(mpl.backend_managers.ToolEvent(\n 'tool_added_event', self.toolmanager, tool))\n\n def set_figure(self, figure):\n if self._id_drag:\n self.canvas.mpl_disconnect(self._id_drag)\n super().set_figure(figure)\n if figure:\n self._id_drag = self.canvas.mpl_connect(\n 'motion_notify_event', self._set_cursor_cbk)\n\n def _add_tool_cbk(self, event):\n """Process every newly added tool."""\n if getattr(event.tool, 'cursor', None) is not None:\n self.toolmanager.toolmanager_connect(\n f'tool_trigger_{event.tool.name}', self._tool_trigger_cbk)\n\n def _tool_trigger_cbk(self, event):\n self._current_tool = event.tool if event.tool.toggled else None\n self._set_cursor_cbk(event.canvasevent)\n\n def _set_cursor_cbk(self, event):\n if not event or not self.canvas:\n return\n if (self._current_tool and getattr(event, "inaxes", None)\n and event.inaxes.get_navigate()):\n if self._last_cursor != self._current_tool.cursor:\n self.canvas.set_cursor(self._current_tool.cursor)\n self._last_cursor = self._current_tool.cursor\n elif self._last_cursor != self._default_cursor:\n self.canvas.set_cursor(self._default_cursor)\n self._last_cursor = self._default_cursor\n\n\nclass ToolCursorPosition(ToolBase):\n """\n Send message with the current pointer position.\n\n This tool runs in the background reporting the position of the cursor.\n """\n def __init__(self, *args, **kwargs):\n self._id_drag = None\n super().__init__(*args, **kwargs)\n\n def set_figure(self, figure):\n if self._id_drag:\n self.canvas.mpl_disconnect(self._id_drag)\n super().set_figure(figure)\n if figure:\n self._id_drag = self.canvas.mpl_connect(\n 'motion_notify_event', self.send_message)\n\n def send_message(self, event):\n """Call 
`matplotlib.backend_managers.ToolManager.message_event`."""\n if self.toolmanager.messagelock.locked():\n return\n\n from matplotlib.backend_bases import NavigationToolbar2\n message = NavigationToolbar2._mouse_event_to_message(event)\n self.toolmanager.message_event(message, self)\n\n\nclass RubberbandBase(ToolBase):\n """Draw and remove a rubberband."""\n def trigger(self, sender, event, data=None):\n """Call `draw_rubberband` or `remove_rubberband` based on data."""\n if not self.figure.canvas.widgetlock.available(sender):\n return\n if data is not None:\n self.draw_rubberband(*data)\n else:\n self.remove_rubberband()\n\n def draw_rubberband(self, *data):\n """\n Draw rubberband.\n\n This method must get implemented per backend.\n """\n raise NotImplementedError\n\n def remove_rubberband(self):\n """\n Remove rubberband.\n\n This method should get implemented per backend.\n """\n pass\n\n\nclass ToolQuit(ToolBase):\n """Tool to call the figure manager destroy method."""\n\n description = 'Quit the figure'\n default_keymap = property(lambda self: mpl.rcParams['keymap.quit'])\n\n def trigger(self, sender, event, data=None):\n Gcf.destroy_fig(self.figure)\n\n\nclass ToolQuitAll(ToolBase):\n """Tool to call the figure manager destroy method."""\n\n description = 'Quit all figures'\n default_keymap = property(lambda self: mpl.rcParams['keymap.quit_all'])\n\n def trigger(self, sender, event, data=None):\n Gcf.destroy_all()\n\n\nclass ToolGrid(ToolBase):\n """Tool to toggle the major grids of the figure."""\n\n description = 'Toggle major grids'\n default_keymap = property(lambda self: mpl.rcParams['keymap.grid'])\n\n def trigger(self, sender, event, data=None):\n sentinel = str(uuid.uuid4())\n # Trigger grid switching by temporarily setting :rc:`keymap.grid`\n # to a unique key and sending an appropriate event.\n with (cbook._setattr_cm(event, key=sentinel),\n mpl.rc_context({'keymap.grid': sentinel})):\n mpl.backend_bases.key_press_handler(event, 
self.figure.canvas)\n\n\nclass ToolMinorGrid(ToolBase):\n """Tool to toggle the major and minor grids of the figure."""\n\n description = 'Toggle major and minor grids'\n default_keymap = property(lambda self: mpl.rcParams['keymap.grid_minor'])\n\n def trigger(self, sender, event, data=None):\n sentinel = str(uuid.uuid4())\n # Trigger grid switching by temporarily setting :rc:`keymap.grid_minor`\n # to a unique key and sending an appropriate event.\n with (cbook._setattr_cm(event, key=sentinel),\n mpl.rc_context({'keymap.grid_minor': sentinel})):\n mpl.backend_bases.key_press_handler(event, self.figure.canvas)\n\n\nclass ToolFullScreen(ToolBase):\n """Tool to toggle full screen."""\n\n description = 'Toggle fullscreen mode'\n default_keymap = property(lambda self: mpl.rcParams['keymap.fullscreen'])\n\n def trigger(self, sender, event, data=None):\n self.figure.canvas.manager.full_screen_toggle()\n\n\nclass AxisScaleBase(ToolToggleBase):\n """Base Tool to toggle between linear and logarithmic."""\n\n def trigger(self, sender, event, data=None):\n if event.inaxes is None:\n return\n super().trigger(sender, event, data)\n\n def enable(self, event=None):\n self.set_scale(event.inaxes, 'log')\n self.figure.canvas.draw_idle()\n\n def disable(self, event=None):\n self.set_scale(event.inaxes, 'linear')\n self.figure.canvas.draw_idle()\n\n\nclass ToolYScale(AxisScaleBase):\n """Tool to toggle between linear and logarithmic scales on the Y axis."""\n\n description = 'Toggle scale Y axis'\n default_keymap = property(lambda self: mpl.rcParams['keymap.yscale'])\n\n def set_scale(self, ax, scale):\n ax.set_yscale(scale)\n\n\nclass ToolXScale(AxisScaleBase):\n """Tool to toggle between linear and logarithmic scales on the X axis."""\n\n description = 'Toggle scale X axis'\n default_keymap = property(lambda self: mpl.rcParams['keymap.xscale'])\n\n def set_scale(self, ax, scale):\n ax.set_xscale(scale)\n\n\nclass ToolViewsPositions(ToolBase):\n """\n Auxiliary Tool to handle 
changes in views and positions.\n\n Runs in the background and should get used by all the tools that\n need to access the figure's history of views and positions, e.g.\n\n * `ToolZoom`\n * `ToolPan`\n * `ToolHome`\n * `ToolBack`\n * `ToolForward`\n """\n\n def __init__(self, *args, **kwargs):\n self.views = WeakKeyDictionary()\n self.positions = WeakKeyDictionary()\n self.home_views = WeakKeyDictionary()\n super().__init__(*args, **kwargs)\n\n def add_figure(self, figure):\n """Add the current figure to the stack of views and positions."""\n\n if figure not in self.views:\n self.views[figure] = cbook._Stack()\n self.positions[figure] = cbook._Stack()\n self.home_views[figure] = WeakKeyDictionary()\n # Define Home\n self.push_current(figure)\n # Make sure we add a home view for new Axes as they're added\n figure.add_axobserver(lambda fig: self.update_home_views(fig))\n\n def clear(self, figure):\n """Reset the Axes stack."""\n if figure in self.views:\n self.views[figure].clear()\n self.positions[figure].clear()\n self.home_views[figure].clear()\n self.update_home_views()\n\n def update_view(self):\n """\n Update the view limits and position for each Axes from the current\n stack position. 
If any Axes are present in the figure that aren't in\n the current stack position, use the home view limits for those Axes and\n don't update *any* positions.\n """\n\n views = self.views[self.figure]()\n if views is None:\n return\n pos = self.positions[self.figure]()\n if pos is None:\n return\n home_views = self.home_views[self.figure]\n all_axes = self.figure.get_axes()\n for a in all_axes:\n if a in views:\n cur_view = views[a]\n else:\n cur_view = home_views[a]\n a._set_view(cur_view)\n\n if set(all_axes).issubset(pos):\n for a in all_axes:\n # Restore both the original and modified positions\n a._set_position(pos[a][0], 'original')\n a._set_position(pos[a][1], 'active')\n\n self.figure.canvas.draw_idle()\n\n def push_current(self, figure=None):\n """\n Push the current view limits and position onto their respective stacks.\n """\n if not figure:\n figure = self.figure\n views = WeakKeyDictionary()\n pos = WeakKeyDictionary()\n for a in figure.get_axes():\n views[a] = a._get_view()\n pos[a] = self._axes_pos(a)\n self.views[figure].push(views)\n self.positions[figure].push(pos)\n\n def _axes_pos(self, ax):\n """\n Return the original and modified positions for the specified Axes.\n\n Parameters\n ----------\n ax : matplotlib.axes.Axes\n The `.Axes` to get the positions for.\n\n Returns\n -------\n original_position, modified_position\n A tuple of the original and modified positions.\n """\n\n return (ax.get_position(True).frozen(),\n ax.get_position().frozen())\n\n def update_home_views(self, figure=None):\n """\n Make sure that ``self.home_views`` has an entry for all Axes present\n in the figure.\n """\n\n if not figure:\n figure = self.figure\n for a in figure.get_axes():\n if a not in self.home_views[figure]:\n self.home_views[figure][a] = a._get_view()\n\n def home(self):\n """Recall the first view and position from the stack."""\n self.views[self.figure].home()\n self.positions[self.figure].home()\n\n def back(self):\n """Back one step in the stack of 
views and positions."""\n self.views[self.figure].back()\n self.positions[self.figure].back()\n\n def forward(self):\n """Forward one step in the stack of views and positions."""\n self.views[self.figure].forward()\n self.positions[self.figure].forward()\n\n\nclass ViewsPositionsBase(ToolBase):\n """Base class for `ToolHome`, `ToolBack` and `ToolForward`."""\n\n _on_trigger = None\n\n def trigger(self, sender, event, data=None):\n self.toolmanager.get_tool(_views_positions).add_figure(self.figure)\n getattr(self.toolmanager.get_tool(_views_positions),\n self._on_trigger)()\n self.toolmanager.get_tool(_views_positions).update_view()\n\n\nclass ToolHome(ViewsPositionsBase):\n """Restore the original view limits."""\n\n description = 'Reset original view'\n image = 'mpl-data/images/home'\n default_keymap = property(lambda self: mpl.rcParams['keymap.home'])\n _on_trigger = 'home'\n\n\nclass ToolBack(ViewsPositionsBase):\n """Move back up the view limits stack."""\n\n description = 'Back to previous view'\n image = 'mpl-data/images/back'\n default_keymap = property(lambda self: mpl.rcParams['keymap.back'])\n _on_trigger = 'back'\n\n\nclass ToolForward(ViewsPositionsBase):\n """Move forward in the view lim stack."""\n\n description = 'Forward to next view'\n image = 'mpl-data/images/forward'\n default_keymap = property(lambda self: mpl.rcParams['keymap.forward'])\n _on_trigger = 'forward'\n\n\nclass ConfigureSubplotsBase(ToolBase):\n """Base tool for the configuration of subplots."""\n\n description = 'Configure subplots'\n image = 'mpl-data/images/subplots'\n\n\nclass SaveFigureBase(ToolBase):\n """Base tool for figure saving."""\n\n description = 'Save the figure'\n image = 'mpl-data/images/filesave'\n default_keymap = property(lambda self: mpl.rcParams['keymap.save'])\n\n\nclass ZoomPanBase(ToolToggleBase):\n """Base class for `ToolZoom` and `ToolPan`."""\n def __init__(self, *args):\n super().__init__(*args)\n self._button_pressed = None\n self._xypress = None\n 
self._idPress = None\n self._idRelease = None\n self._idScroll = None\n self.base_scale = 2.\n self.scrollthresh = .5 # .5 second scroll threshold\n self.lastscroll = time.time()-self.scrollthresh\n\n def enable(self, event=None):\n """Connect press/release events and lock the canvas."""\n self.figure.canvas.widgetlock(self)\n self._idPress = self.figure.canvas.mpl_connect(\n 'button_press_event', self._press)\n self._idRelease = self.figure.canvas.mpl_connect(\n 'button_release_event', self._release)\n self._idScroll = self.figure.canvas.mpl_connect(\n 'scroll_event', self.scroll_zoom)\n\n def disable(self, event=None):\n """Release the canvas and disconnect press/release events."""\n self._cancel_action()\n self.figure.canvas.widgetlock.release(self)\n self.figure.canvas.mpl_disconnect(self._idPress)\n self.figure.canvas.mpl_disconnect(self._idRelease)\n self.figure.canvas.mpl_disconnect(self._idScroll)\n\n def trigger(self, sender, event, data=None):\n self.toolmanager.get_tool(_views_positions).add_figure(self.figure)\n super().trigger(sender, event, data)\n new_navigate_mode = self.name.upper() if self.toggled else None\n for ax in self.figure.axes:\n ax.set_navigate_mode(new_navigate_mode)\n\n def scroll_zoom(self, event):\n # https://gist.github.com/tacaswell/3144287\n if event.inaxes is None:\n return\n\n if event.button == 'up':\n # deal with zoom in\n scl = self.base_scale\n elif event.button == 'down':\n # deal with zoom out\n scl = 1/self.base_scale\n else:\n # deal with something that should never happen\n scl = 1\n\n ax = event.inaxes\n ax._set_view_from_bbox([event.x, event.y, scl])\n\n # If last scroll was done within the timing threshold, delete the\n # previous view\n if (time.time()-self.lastscroll) < self.scrollthresh:\n self.toolmanager.get_tool(_views_positions).back()\n\n self.figure.canvas.draw_idle() # force re-draw\n\n self.lastscroll = time.time()\n self.toolmanager.get_tool(_views_positions).push_current()\n\n\nclass 
ToolZoom(ZoomPanBase):\n """A Tool for zooming using a rectangle selector."""\n\n description = 'Zoom to rectangle'\n image = 'mpl-data/images/zoom_to_rect'\n default_keymap = property(lambda self: mpl.rcParams['keymap.zoom'])\n cursor = cursors.SELECT_REGION\n radio_group = 'default'\n\n def __init__(self, *args):\n super().__init__(*args)\n self._ids_zoom = []\n\n def _cancel_action(self):\n for zoom_id in self._ids_zoom:\n self.figure.canvas.mpl_disconnect(zoom_id)\n self.toolmanager.trigger_tool('rubberband', self)\n self.figure.canvas.draw_idle()\n self._xypress = None\n self._button_pressed = None\n self._ids_zoom = []\n return\n\n def _press(self, event):\n """Callback for mouse button presses in zoom-to-rectangle mode."""\n\n # If we're already in the middle of a zoom, pressing another\n # button works to "cancel"\n if self._ids_zoom:\n self._cancel_action()\n\n if event.button == 1:\n self._button_pressed = 1\n elif event.button == 3:\n self._button_pressed = 3\n else:\n self._cancel_action()\n return\n\n x, y = event.x, event.y\n\n self._xypress = []\n for i, a in enumerate(self.figure.get_axes()):\n if (x is not None and y is not None and a.in_axes(event) and\n a.get_navigate() and a.can_zoom()):\n self._xypress.append((x, y, a, i, a._get_view()))\n\n id1 = self.figure.canvas.mpl_connect(\n 'motion_notify_event', self._mouse_move)\n id2 = self.figure.canvas.mpl_connect(\n 'key_press_event', self._switch_on_zoom_mode)\n id3 = self.figure.canvas.mpl_connect(\n 'key_release_event', self._switch_off_zoom_mode)\n\n self._ids_zoom = id1, id2, id3\n self._zoom_mode = event.key\n\n def _switch_on_zoom_mode(self, event):\n self._zoom_mode = event.key\n self._mouse_move(event)\n\n def _switch_off_zoom_mode(self, event):\n self._zoom_mode = None\n self._mouse_move(event)\n\n def _mouse_move(self, event):\n """Callback for mouse moves in zoom-to-rectangle mode."""\n\n if self._xypress:\n x, y = event.x, event.y\n lastx, lasty, a, ind, view = self._xypress[0]\n (x1, 
y1), (x2, y2) = np.clip(\n [[lastx, lasty], [x, y]], a.bbox.min, a.bbox.max)\n if self._zoom_mode == "x":\n y1, y2 = a.bbox.intervaly\n elif self._zoom_mode == "y":\n x1, x2 = a.bbox.intervalx\n self.toolmanager.trigger_tool(\n 'rubberband', self, data=(x1, y1, x2, y2))\n\n def _release(self, event):\n """Callback for mouse button releases in zoom-to-rectangle mode."""\n\n for zoom_id in self._ids_zoom:\n self.figure.canvas.mpl_disconnect(zoom_id)\n self._ids_zoom = []\n\n if not self._xypress:\n self._cancel_action()\n return\n\n done_ax = []\n\n for cur_xypress in self._xypress:\n x, y = event.x, event.y\n lastx, lasty, a, _ind, view = cur_xypress\n # ignore singular clicks - 5 pixels is a threshold\n if abs(x - lastx) < 5 or abs(y - lasty) < 5:\n self._cancel_action()\n return\n\n # detect twinx, twiny Axes and avoid double zooming\n twinx = any(a.get_shared_x_axes().joined(a, a1) for a1 in done_ax)\n twiny = any(a.get_shared_y_axes().joined(a, a1) for a1 in done_ax)\n done_ax.append(a)\n\n if self._button_pressed == 1:\n direction = 'in'\n elif self._button_pressed == 3:\n direction = 'out'\n else:\n continue\n\n a._set_view_from_bbox((lastx, lasty, x, y), direction,\n self._zoom_mode, twinx, twiny)\n\n self._zoom_mode = None\n self.toolmanager.get_tool(_views_positions).push_current()\n self._cancel_action()\n\n\nclass ToolPan(ZoomPanBase):\n """Pan Axes with left mouse, zoom with right."""\n\n default_keymap = property(lambda self: mpl.rcParams['keymap.pan'])\n description = 'Pan axes with left mouse, zoom with right'\n image = 'mpl-data/images/move'\n cursor = cursors.MOVE\n radio_group = 'default'\n\n def __init__(self, *args):\n super().__init__(*args)\n self._id_drag = None\n\n def _cancel_action(self):\n self._button_pressed = None\n self._xypress = []\n self.figure.canvas.mpl_disconnect(self._id_drag)\n self.toolmanager.messagelock.release(self)\n self.figure.canvas.draw_idle()\n\n def _press(self, event):\n if event.button == 1:\n self._button_pressed 
= 1\n elif event.button == 3:\n self._button_pressed = 3\n else:\n self._cancel_action()\n return\n\n x, y = event.x, event.y\n\n self._xypress = []\n for i, a in enumerate(self.figure.get_axes()):\n if (x is not None and y is not None and a.in_axes(event) and\n a.get_navigate() and a.can_pan()):\n a.start_pan(x, y, event.button)\n self._xypress.append((a, i))\n self.toolmanager.messagelock(self)\n self._id_drag = self.figure.canvas.mpl_connect(\n 'motion_notify_event', self._mouse_move)\n\n def _release(self, event):\n if self._button_pressed is None:\n self._cancel_action()\n return\n\n self.figure.canvas.mpl_disconnect(self._id_drag)\n self.toolmanager.messagelock.release(self)\n\n for a, _ind in self._xypress:\n a.end_pan()\n if not self._xypress:\n self._cancel_action()\n return\n\n self.toolmanager.get_tool(_views_positions).push_current()\n self._cancel_action()\n\n def _mouse_move(self, event):\n for a, _ind in self._xypress:\n # safer to use the button recorded at the _press than the current\n # button: multiple buttons can get pressed during motion...\n a.drag_pan(self._button_pressed, event.key, event.x, event.y)\n self.toolmanager.canvas.draw_idle()\n\n\nclass ToolHelpBase(ToolBase):\n description = 'Print tool list, shortcuts and description'\n default_keymap = property(lambda self: mpl.rcParams['keymap.help'])\n image = 'mpl-data/images/help'\n\n @staticmethod\n def format_shortcut(key_sequence):\n """\n Convert a shortcut string from the notation used in rc config to the\n standard notation for displaying shortcuts, e.g. 
'ctrl+a' -> 'Ctrl+A'.\n """\n return (key_sequence if len(key_sequence) == 1 else\n re.sub(r"\+[A-Z]", r"+Shift\g<0>", key_sequence).title())\n\n def _format_tool_keymap(self, name):\n keymaps = self.toolmanager.get_tool_keymap(name)\n return ", ".join(self.format_shortcut(keymap) for keymap in keymaps)\n\n def _get_help_entries(self):\n return [(name, self._format_tool_keymap(name), tool.description)\n for name, tool in sorted(self.toolmanager.tools.items())\n if tool.description]\n\n def _get_help_text(self):\n entries = self._get_help_entries()\n entries = ["{}: {}\n\t{}".format(*entry) for entry in entries]\n return "\n".join(entries)\n\n def _get_help_html(self):\n fmt = "<tr><td>{}</td><td>{}</td><td>{}</td></tr>"\n rows = [fmt.format(\n "<b>Action</b>", "<b>Shortcuts</b>", "<b>Description</b>")]\n rows += [fmt.format(*row) for row in self._get_help_entries()]\n return ("<style>td {padding: 0px 4px}</style>"\n "<table><thead>" + rows[0] + "</thead>"\n "<tbody>" + "".join(rows[1:]) + "</tbody></table>")\n\n\nclass ToolCopyToClipboardBase(ToolBase):\n """Tool to copy the figure to the clipboard."""\n\n description = 'Copy the canvas figure to clipboard'\n default_keymap = property(lambda self: mpl.rcParams['keymap.copy'])\n\n def trigger(self, *args, **kwargs):\n message = "Copy tool is not available"\n self.toolmanager.message_event(message, self)\n\n\ndefault_tools = {'home': ToolHome, 'back': ToolBack, 'forward': ToolForward,\n 'zoom': ToolZoom, 'pan': ToolPan,\n 'subplots': ConfigureSubplotsBase,\n 'save': SaveFigureBase,\n 'grid': ToolGrid,\n 'grid_minor': ToolMinorGrid,\n 'fullscreen': ToolFullScreen,\n 'quit': ToolQuit,\n 'quit_all': ToolQuitAll,\n 'xscale': ToolXScale,\n 'yscale': ToolYScale,\n 'position': ToolCursorPosition,\n _views_positions: ToolViewsPositions,\n 'cursor': ToolSetCursor,\n 'rubberband': RubberbandBase,\n 'help': ToolHelpBase,\n 'copy': ToolCopyToClipboardBase,\n }\n\ndefault_toolbar_tools = [['navigation', ['home', 'back', 'forward']],\n 
['zoompan', ['pan', 'zoom', 'subplots']],\n ['io', ['save', 'help']]]\n\n\ndef add_tools_to_manager(toolmanager, tools=default_tools):\n """\n Add multiple tools to a `.ToolManager`.\n\n Parameters\n ----------\n toolmanager : `.backend_managers.ToolManager`\n Manager to which the tools are added.\n tools : {str: class_like}, optional\n The tools to add in a {name: tool} dict, see\n `.backend_managers.ToolManager.add_tool` for more info.\n """\n\n for name, tool in tools.items():\n toolmanager.add_tool(name, tool)\n\n\ndef add_tools_to_container(container, tools=default_toolbar_tools):\n """\n Add multiple tools to the container.\n\n Parameters\n ----------\n container : Container\n `.backend_bases.ToolContainerBase` object that will get the tools\n added.\n tools : list, optional\n List in the form ``[[group1, [tool1, tool2 ...]], [group2, [...]]]``\n where the tools ``[tool1, tool2, ...]`` will display in group1.\n See `.backend_bases.ToolContainerBase.add_tool` for details.\n """\n\n for group, grouptools in tools:\n for position, tool in enumerate(grouptools):\n container.add_tool(tool, group, position)\n | .venv\Lib\site-packages\matplotlib\backend_tools.py | backend_tools.py | Python | 33,186 | 0.95 | 0.208417 | 0.051216 | react-lib | 308 | 2024-02-20T06:11:58.202649 | MIT | false | 284f40feb31473b10f65a776f8440ca1 |
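Editor's note: `ToolHelpBase.format_shortcut` in the backend_tools.py row above converts rc-style key strings to display notation. A minimal standalone sketch of the same regex logic (an illustration mirroring the source, not matplotlib itself):

```python
import re

def format_shortcut(key_sequence):
    # Single characters pass through unchanged; otherwise an uppercase
    # letter after '+' implies Shift, then the sequence is title-cased.
    if len(key_sequence) == 1:
        return key_sequence
    return re.sub(r"\+[A-Z]", r"+Shift\g<0>", key_sequence).title()

print(format_shortcut("ctrl+a"))  # -> Ctrl+A
print(format_shortcut("ctrl+A"))  # -> Ctrl+Shift+A
```

The `\g<0>` backreference re-inserts the matched `+A` after the injected `+Shift`, which is why an explicit uppercase letter gains a Shift modifier.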
import enum\nfrom matplotlib import cbook\nfrom matplotlib.axes import Axes\nfrom matplotlib.backend_bases import ToolContainerBase, FigureCanvasBase\nfrom matplotlib.backend_managers import ToolManager, ToolEvent\nfrom matplotlib.figure import Figure\nfrom matplotlib.scale import ScaleBase\n\nfrom typing import Any, cast\n\nclass Cursors(enum.IntEnum):\n POINTER = cast(int, ...)\n HAND = cast(int, ...)\n SELECT_REGION = cast(int, ...)\n MOVE = cast(int, ...)\n WAIT = cast(int, ...)\n RESIZE_HORIZONTAL = cast(int, ...)\n RESIZE_VERTICAL = cast(int, ...)\n\ncursors = Cursors\n\nclass ToolBase:\n @property\n def default_keymap(self) -> list[str] | None: ...\n description: str | None\n image: str | None\n def __init__(self, toolmanager: ToolManager, name: str) -> None: ...\n @property\n def name(self) -> str: ...\n @property\n def toolmanager(self) -> ToolManager: ...\n @property\n def canvas(self) -> FigureCanvasBase | None: ...\n @property\n def figure(self) -> Figure | None: ...\n @figure.setter\n def figure(self, figure: Figure | None) -> None: ...\n def set_figure(self, figure: Figure | None) -> None: ...\n def trigger(self, sender: Any, event: ToolEvent, data: Any = ...) -> None: ...\n\nclass ToolToggleBase(ToolBase):\n radio_group: str | None\n cursor: Cursors | None\n default_toggled: bool\n def __init__(self, *args, **kwargs) -> None: ...\n def enable(self, event: ToolEvent | None = ...) -> None: ...\n def disable(self, event: ToolEvent | None = ...) 
-> None: ...\n @property\n def toggled(self) -> bool: ...\n def set_figure(self, figure: Figure | None) -> None: ...\n\nclass ToolSetCursor(ToolBase): ...\n\nclass ToolCursorPosition(ToolBase):\n def send_message(self, event: ToolEvent) -> None: ...\n\nclass RubberbandBase(ToolBase):\n def draw_rubberband(self, *data) -> None: ...\n def remove_rubberband(self) -> None: ...\n\nclass ToolQuit(ToolBase): ...\nclass ToolQuitAll(ToolBase): ...\nclass ToolGrid(ToolBase): ...\nclass ToolMinorGrid(ToolBase): ...\nclass ToolFullScreen(ToolBase): ...\n\nclass AxisScaleBase(ToolToggleBase):\n def enable(self, event: ToolEvent | None = ...) -> None: ...\n def disable(self, event: ToolEvent | None = ...) -> None: ...\n\nclass ToolYScale(AxisScaleBase):\n def set_scale(self, ax: Axes, scale: str | ScaleBase) -> None: ...\n\nclass ToolXScale(AxisScaleBase):\n def set_scale(self, ax, scale: str | ScaleBase) -> None: ...\n\nclass ToolViewsPositions(ToolBase):\n views: dict[Figure | Axes, cbook._Stack]\n positions: dict[Figure | Axes, cbook._Stack]\n home_views: dict[Figure, dict[Axes, tuple[float, float, float, float]]]\n def add_figure(self, figure: Figure) -> None: ...\n def clear(self, figure: Figure) -> None: ...\n def update_view(self) -> None: ...\n def push_current(self, figure: Figure | None = ...) -> None: ...\n def update_home_views(self, figure: Figure | None = ...) -> None: ...\n def home(self) -> None: ...\n def back(self) -> None: ...\n def forward(self) -> None: ...\n\nclass ViewsPositionsBase(ToolBase): ...\nclass ToolHome(ViewsPositionsBase): ...\nclass ToolBack(ViewsPositionsBase): ...\nclass ToolForward(ViewsPositionsBase): ...\nclass ConfigureSubplotsBase(ToolBase): ...\nclass SaveFigureBase(ToolBase): ...\n\nclass ZoomPanBase(ToolToggleBase):\n base_scale: float\n scrollthresh: float\n lastscroll: float\n def __init__(self, *args) -> None: ...\n def enable(self, event: ToolEvent | None = ...) -> None: ...\n def disable(self, event: ToolEvent | None = ...) 
-> None: ...\n def scroll_zoom(self, event: ToolEvent) -> None: ...\n\nclass ToolZoom(ZoomPanBase): ...\nclass ToolPan(ZoomPanBase): ...\n\nclass ToolHelpBase(ToolBase):\n @staticmethod\n def format_shortcut(key_sequence: str) -> str: ...\n\nclass ToolCopyToClipboardBase(ToolBase): ...\n\ndefault_tools: dict[str, ToolBase]\ndefault_toolbar_tools: list[list[str | list[str]]]\n\ndef add_tools_to_manager(\n toolmanager: ToolManager, tools: dict[str, type[ToolBase]] = ...\n) -> None: ...\ndef add_tools_to_container(container: ToolContainerBase, tools: list[Any] = ...) -> None: ...\n | .venv\Lib\site-packages\matplotlib\backend_tools.pyi | backend_tools.pyi | Other | 4,122 | 0.85 | 0.512397 | 0 | node-utils | 530 | 2024-03-24T10:31:38.121114 | MIT | false | f9d515494a7fcfb191d3bb20f7a8d134 |
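Editor's note: the nested `default_toolbar_tools` structure consumed by `add_tools_to_container` can be exercised with a stub container. `FakeContainer` below is a hypothetical stand-in for `ToolContainerBase`, used only to record calls:

```python
default_toolbar_tools = [['navigation', ['home', 'back', 'forward']],
                         ['zoompan', ['pan', 'zoom', 'subplots']],
                         ['io', ['save', 'help']]]

class FakeContainer:
    """Hypothetical stand-in recording add_tool calls."""
    def __init__(self):
        self.calls = []

    def add_tool(self, tool, group, position):
        self.calls.append((tool, group, position))

def add_tools_to_container(container, tools=default_toolbar_tools):
    # Same loop as the source: each tool gets its index within its group.
    for group, grouptools in tools:
        for position, tool in enumerate(grouptools):
            container.add_tool(tool, group, position)

c = FakeContainer()
add_tools_to_container(c)
# c.calls now holds 8 (tool, group, position) tuples, e.g. ('home', 'navigation', 0)
```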
"""\nA module providing some utility functions regarding Bézier path manipulation.\n"""\n\nfrom functools import lru_cache\nimport math\nimport warnings\n\nimport numpy as np\n\nfrom matplotlib import _api\n\n\n# same algorithm as 3.8's math.comb\n@np.vectorize\n@lru_cache(maxsize=128)\ndef _comb(n, k):\n if k > n:\n return 0\n k = min(k, n - k)\n i = np.arange(1, k + 1)\n return np.prod((n + 1 - i)/i).astype(int)\n\n\nclass NonIntersectingPathException(ValueError):\n pass\n\n\n# some functions\n\n\ndef get_intersection(cx1, cy1, cos_t1, sin_t1,\n cx2, cy2, cos_t2, sin_t2):\n """\n Return the intersection between the line through (*cx1*, *cy1*) at angle\n *t1* and the line through (*cx2*, *cy2*) at angle *t2*.\n """\n\n # line1 => sin_t1 * (x - cx1) - cos_t1 * (y - cy1) = 0.\n # line1 => sin_t1 * x + cos_t1 * y = sin_t1*cx1 - cos_t1*cy1\n\n line1_rhs = sin_t1 * cx1 - cos_t1 * cy1\n line2_rhs = sin_t2 * cx2 - cos_t2 * cy2\n\n # rhs matrix\n a, b = sin_t1, -cos_t1\n c, d = sin_t2, -cos_t2\n\n ad_bc = a * d - b * c\n if abs(ad_bc) < 1e-12:\n raise ValueError("Given lines do not intersect. 
Please verify that "\n "the angles are not equal or differ by 180 degrees.")\n\n # rhs_inverse\n a_, b_ = d, -b\n c_, d_ = -c, a\n a_, b_, c_, d_ = (k / ad_bc for k in [a_, b_, c_, d_])\n\n x = a_ * line1_rhs + b_ * line2_rhs\n y = c_ * line1_rhs + d_ * line2_rhs\n\n return x, y\n\n\ndef get_normal_points(cx, cy, cos_t, sin_t, length):\n """\n For a line passing through (*cx*, *cy*) and having an angle *t*, return\n locations of the two points located along its perpendicular line at the\n distance of *length*.\n """\n\n if length == 0.:\n return cx, cy, cx, cy\n\n cos_t1, sin_t1 = sin_t, -cos_t\n cos_t2, sin_t2 = -sin_t, cos_t\n\n x1, y1 = length * cos_t1 + cx, length * sin_t1 + cy\n x2, y2 = length * cos_t2 + cx, length * sin_t2 + cy\n\n return x1, y1, x2, y2\n\n\n# BEZIER routines\n\n# subdividing bezier curve\n# http://www.cs.mtu.edu/~shene/COURSES/cs3621/NOTES/spline/Bezier/bezier-sub.html\n\n\ndef _de_casteljau1(beta, t):\n next_beta = beta[:-1] * (1 - t) + beta[1:] * t\n return next_beta\n\n\ndef split_de_casteljau(beta, t):\n """\n Split a Bézier segment defined by its control points *beta* into two\n separate segments divided at *t* and return their control points.\n """\n beta = np.asarray(beta)\n beta_list = [beta]\n while True:\n beta = _de_casteljau1(beta, t)\n beta_list.append(beta)\n if len(beta) == 1:\n break\n left_beta = [beta[0] for beta in beta_list]\n right_beta = [beta[-1] for beta in reversed(beta_list)]\n\n return left_beta, right_beta\n\n\ndef find_bezier_t_intersecting_with_closedpath(\n bezier_point_at_t, inside_closedpath, t0=0., t1=1., tolerance=0.01):\n """\n Find the intersection of the Bézier curve with a closed path.\n\n The intersection point *t* is approximated by two parameters *t0*, *t1*\n such that *t0* <= *t* <= *t1*.\n\n Search starts from *t0* and *t1* and uses a simple bisecting algorithm\n therefore one of the end points must be inside the path while the other\n doesn't. 
The search stops when the distance of the points parametrized by\n *t0* and *t1* gets smaller than the given *tolerance*.\n\n Parameters\n ----------\n bezier_point_at_t : callable\n A function returning x, y coordinates of the Bézier at parameter *t*.\n It must have the signature::\n\n bezier_point_at_t(t: float) -> tuple[float, float]\n\n inside_closedpath : callable\n A function returning True if a given point (x, y) is inside the\n closed path. It must have the signature::\n\n inside_closedpath(point: tuple[float, float]) -> bool\n\n t0, t1 : float\n Start parameters for the search.\n\n tolerance : float\n Maximal allowed distance between the final points.\n\n Returns\n -------\n t0, t1 : float\n The Bézier path parameters.\n """\n start = bezier_point_at_t(t0)\n end = bezier_point_at_t(t1)\n\n start_inside = inside_closedpath(start)\n end_inside = inside_closedpath(end)\n\n if start_inside == end_inside and start != end:\n raise NonIntersectingPathException(\n "Both points are on the same side of the closed path")\n\n while True:\n\n # return if the distance is smaller than the tolerance\n if np.hypot(start[0] - end[0], start[1] - end[1]) < tolerance:\n return t0, t1\n\n # calculate the middle point\n middle_t = 0.5 * (t0 + t1)\n middle = bezier_point_at_t(middle_t)\n middle_inside = inside_closedpath(middle)\n\n if start_inside ^ middle_inside:\n t1 = middle_t\n if end == middle:\n # Edge case where infinite loop is possible\n # Caused by large numbers relative to tolerance\n return t0, t1\n end = middle\n else:\n t0 = middle_t\n if start == middle:\n # Edge case where infinite loop is possible\n # Caused by large numbers relative to tolerance\n return t0, t1\n start = middle\n start_inside = middle_inside\n\n\nclass BezierSegment:\n """\n A d-dimensional Bézier segment.\n\n Parameters\n ----------\n control_points : (N, d) array\n Location of the *N* control points.\n """\n\n def __init__(self, control_points):\n self._cpoints = np.asarray(control_points)\n 
self._N, self._d = self._cpoints.shape\n self._orders = np.arange(self._N)\n coeff = [math.factorial(self._N - 1)\n // (math.factorial(i) * math.factorial(self._N - 1 - i))\n for i in range(self._N)]\n self._px = (self._cpoints.T * coeff).T\n\n def __call__(self, t):\n """\n Evaluate the Bézier curve at point(s) *t* in [0, 1].\n\n Parameters\n ----------\n t : (k,) array-like\n Points at which to evaluate the curve.\n\n Returns\n -------\n (k, d) array\n Value of the curve for each point in *t*.\n """\n t = np.asarray(t)\n return (np.power.outer(1 - t, self._orders[::-1])\n * np.power.outer(t, self._orders)) @ self._px\n\n def point_at_t(self, t):\n """\n Evaluate the curve at a single point, returning a tuple of *d* floats.\n """\n return tuple(self(t))\n\n @property\n def control_points(self):\n """The control points of the curve."""\n return self._cpoints\n\n @property\n def dimension(self):\n """The dimension of the curve."""\n return self._d\n\n @property\n def degree(self):\n """Degree of the polynomial. One less the number of control points."""\n return self._N - 1\n\n @property\n def polynomial_coefficients(self):\n r"""\n The polynomial coefficients of the Bézier curve.\n\n .. warning:: Follows opposite convention from `numpy.polyval`.\n\n Returns\n -------\n (n+1, d) array\n Coefficients after expanding in polynomial basis, where :math:`n`\n is the degree of the Bézier curve and :math:`d` its dimension.\n These are the numbers (:math:`C_j`) such that the curve can be\n written :math:`\sum_{j=0}^n C_j t^j`.\n\n Notes\n -----\n The coefficients are calculated as\n\n .. math::\n\n {n \choose j} \sum_{i=0}^j (-1)^{i+j} {j \choose i} P_i\n\n where :math:`P_i` are the control points of the curve.\n """\n n = self.degree\n # matplotlib uses n <= 4. 
overflow plausible starting around n = 15.\n if n > 10:\n warnings.warn("Polynomial coefficients formula unstable for high "\n "order Bezier curves!", RuntimeWarning)\n P = self.control_points\n j = np.arange(n+1)[:, None]\n i = np.arange(n+1)[None, :] # _comb is non-zero for i <= j\n prefactor = (-1)**(i + j) * _comb(j, i) # j on axis 0, i on axis 1\n return _comb(n, j) * prefactor @ P # j on axis 0, self.dimension on 1\n\n def axis_aligned_extrema(self):\n """\n Return the dimension and location of the curve's interior extrema.\n\n The extrema are the points along the curve where one of its partial\n derivatives is zero.\n\n Returns\n -------\n dims : array of int\n Index :math:`i` of the partial derivative which is zero at each\n interior extrema.\n dzeros : array of float\n Of same size as dims. The :math:`t` such that :math:`d/dx_i B(t) =\n 0`\n """\n n = self.degree\n if n <= 1:\n return np.array([]), np.array([])\n Cj = self.polynomial_coefficients\n dCj = np.arange(1, n+1)[:, None] * Cj[1:]\n dims = []\n roots = []\n for i, pi in enumerate(dCj.T):\n r = np.roots(pi[::-1])\n roots.append(r)\n dims.append(np.full_like(r, i))\n roots = np.concatenate(roots)\n dims = np.concatenate(dims)\n in_range = np.isreal(roots) & (roots >= 0) & (roots <= 1)\n return dims[in_range], np.real(roots)[in_range]\n\n\ndef split_bezier_intersecting_with_closedpath(\n bezier, inside_closedpath, tolerance=0.01):\n """\n Split a Bézier curve into two at the intersection with a closed path.\n\n Parameters\n ----------\n bezier : (N, 2) array-like\n Control points of the Bézier segment. See `.BezierSegment`.\n inside_closedpath : callable\n A function returning True if a given point (x, y) is inside the\n closed path. See also `.find_bezier_t_intersecting_with_closedpath`.\n tolerance : float\n The tolerance for the intersection. 
See also\n `.find_bezier_t_intersecting_with_closedpath`.\n\n Returns\n -------\n left, right\n Lists of control points for the two Bézier segments.\n """\n\n bz = BezierSegment(bezier)\n bezier_point_at_t = bz.point_at_t\n\n t0, t1 = find_bezier_t_intersecting_with_closedpath(\n bezier_point_at_t, inside_closedpath, tolerance=tolerance)\n\n _left, _right = split_de_casteljau(bezier, (t0 + t1) / 2.)\n return _left, _right\n\n\n# matplotlib specific\n\n\ndef split_path_inout(path, inside, tolerance=0.01, reorder_inout=False):\n """\n Divide a path into two segments at the point where ``inside(x, y)`` becomes\n False.\n """\n from .path import Path\n path_iter = path.iter_segments()\n\n ctl_points, command = next(path_iter)\n begin_inside = inside(ctl_points[-2:]) # true if begin point is inside\n\n ctl_points_old = ctl_points\n\n iold = 0\n i = 1\n\n for ctl_points, command in path_iter:\n iold = i\n i += len(ctl_points) // 2\n if inside(ctl_points[-2:]) != begin_inside:\n bezier_path = np.concatenate([ctl_points_old[-2:], ctl_points])\n break\n ctl_points_old = ctl_points\n else:\n raise ValueError("The path does not intersect with the patch")\n\n bp = bezier_path.reshape((-1, 2))\n left, right = split_bezier_intersecting_with_closedpath(\n bp, inside, tolerance)\n if len(left) == 2:\n codes_left = [Path.LINETO]\n codes_right = [Path.MOVETO, Path.LINETO]\n elif len(left) == 3:\n codes_left = [Path.CURVE3, Path.CURVE3]\n codes_right = [Path.MOVETO, Path.CURVE3, Path.CURVE3]\n elif len(left) == 4:\n codes_left = [Path.CURVE4, Path.CURVE4, Path.CURVE4]\n codes_right = [Path.MOVETO, Path.CURVE4, Path.CURVE4, Path.CURVE4]\n else:\n raise AssertionError("This should never be reached")\n\n verts_left = left[1:]\n verts_right = right[:]\n\n if path.codes is None:\n path_in = Path(np.concatenate([path.vertices[:i], verts_left]))\n path_out = Path(np.concatenate([verts_right, path.vertices[i:]]))\n\n else:\n path_in = Path(np.concatenate([path.vertices[:iold], 
verts_left]),\n np.concatenate([path.codes[:iold], codes_left]))\n\n path_out = Path(np.concatenate([verts_right, path.vertices[i:]]),\n np.concatenate([codes_right, path.codes[i:]]))\n\n if reorder_inout and not begin_inside:\n path_in, path_out = path_out, path_in\n\n return path_in, path_out\n\n\ndef inside_circle(cx, cy, r):\n """\n Return a function that checks whether a point is in a circle with center\n (*cx*, *cy*) and radius *r*.\n\n The returned function has the signature::\n\n f(xy: tuple[float, float]) -> bool\n """\n r2 = r ** 2\n\n def _f(xy):\n x, y = xy\n return (x - cx) ** 2 + (y - cy) ** 2 < r2\n return _f\n\n\n# quadratic Bezier lines\n\ndef get_cos_sin(x0, y0, x1, y1):\n dx, dy = x1 - x0, y1 - y0\n d = (dx * dx + dy * dy) ** .5\n # Account for divide by zero\n if d == 0:\n return 0.0, 0.0\n return dx / d, dy / d\n\n\ndef check_if_parallel(dx1, dy1, dx2, dy2, tolerance=1.e-5):\n """\n Check if two lines are parallel.\n\n Parameters\n ----------\n dx1, dy1, dx2, dy2 : float\n The gradients *dy*/*dx* of the two lines.\n tolerance : float\n The angular tolerance in radians up to which the lines are considered\n parallel.\n\n Returns\n -------\n is_parallel\n - 1 if two lines are parallel in same direction.\n - -1 if two lines are parallel in opposite direction.\n - False otherwise.\n """\n theta1 = np.arctan2(dx1, dy1)\n theta2 = np.arctan2(dx2, dy2)\n dtheta = abs(theta1 - theta2)\n if dtheta < tolerance:\n return 1\n elif abs(dtheta - np.pi) < tolerance:\n return -1\n else:\n return False\n\n\ndef get_parallels(bezier2, width):\n """\n Given the quadratic Bézier control points *bezier2*, returns\n control points of quadratic Bézier lines roughly parallel to given\n one separated by *width*.\n """\n\n # The parallel Bezier lines are constructed by following ways.\n # c1 and c2 are control points representing the start and end of the\n # Bezier line.\n # cm is the middle point\n\n c1x, c1y = bezier2[0]\n cmx, cmy = bezier2[1]\n c2x, c2y = 
bezier2[2]\n\n parallel_test = check_if_parallel(c1x - cmx, c1y - cmy,\n cmx - c2x, cmy - c2y)\n\n if parallel_test == -1:\n _api.warn_external(\n "Lines do not intersect. A straight line is used instead.")\n cos_t1, sin_t1 = get_cos_sin(c1x, c1y, c2x, c2y)\n cos_t2, sin_t2 = cos_t1, sin_t1\n else:\n # t1 and t2 is the angle between c1 and cm, cm, c2. They are\n # also an angle of the tangential line of the path at c1 and c2\n cos_t1, sin_t1 = get_cos_sin(c1x, c1y, cmx, cmy)\n cos_t2, sin_t2 = get_cos_sin(cmx, cmy, c2x, c2y)\n\n # find c1_left, c1_right which are located along the lines\n # through c1 and perpendicular to the tangential lines of the\n # Bezier path at a distance of width. Same thing for c2_left and\n # c2_right with respect to c2.\n c1x_left, c1y_left, c1x_right, c1y_right = (\n get_normal_points(c1x, c1y, cos_t1, sin_t1, width)\n )\n c2x_left, c2y_left, c2x_right, c2y_right = (\n get_normal_points(c2x, c2y, cos_t2, sin_t2, width)\n )\n\n # find cm_left which is the intersecting point of a line through\n # c1_left with angle t1 and a line through c2_left with angle\n # t2. 
Same with cm_right.\n try:\n cmx_left, cmy_left = get_intersection(c1x_left, c1y_left, cos_t1,\n sin_t1, c2x_left, c2y_left,\n cos_t2, sin_t2)\n cmx_right, cmy_right = get_intersection(c1x_right, c1y_right, cos_t1,\n sin_t1, c2x_right, c2y_right,\n cos_t2, sin_t2)\n except ValueError:\n # Special case straight lines, i.e., angle between two lines is\n # less than the threshold used by get_intersection (we don't use\n # check_if_parallel as the threshold is not the same).\n cmx_left, cmy_left = (\n 0.5 * (c1x_left + c2x_left), 0.5 * (c1y_left + c2y_left)\n )\n cmx_right, cmy_right = (\n 0.5 * (c1x_right + c2x_right), 0.5 * (c1y_right + c2y_right)\n )\n\n # the parallel Bezier lines are created with control points of\n # [c1_left, cm_left, c2_left] and [c1_right, cm_right, c2_right]\n path_left = [(c1x_left, c1y_left),\n (cmx_left, cmy_left),\n (c2x_left, c2y_left)]\n path_right = [(c1x_right, c1y_right),\n (cmx_right, cmy_right),\n (c2x_right, c2y_right)]\n\n return path_left, path_right\n\n\ndef find_control_points(c1x, c1y, mmx, mmy, c2x, c2y):\n """\n Find control points of the Bézier curve passing through (*c1x*, *c1y*),\n (*mmx*, *mmy*), and (*c2x*, *c2y*), at parametric values 0, 0.5, and 1.\n """\n cmx = .5 * (4 * mmx - (c1x + c2x))\n cmy = .5 * (4 * mmy - (c1y + c2y))\n return [(c1x, c1y), (cmx, cmy), (c2x, c2y)]\n\n\ndef make_wedged_bezier2(bezier2, width, w1=1., wm=0.5, w2=0.):\n """\n Being similar to `get_parallels`, returns control points of two quadratic\n Bézier lines having a width roughly parallel to given one separated by\n *width*.\n """\n\n # c1, cm, c2\n c1x, c1y = bezier2[0]\n cmx, cmy = bezier2[1]\n c3x, c3y = bezier2[2]\n\n # t1 and t2 is the angle between c1 and cm, cm, c3.\n # They are also an angle of the tangential line of the path at c1 and c3\n cos_t1, sin_t1 = get_cos_sin(c1x, c1y, cmx, cmy)\n cos_t2, sin_t2 = get_cos_sin(cmx, cmy, c3x, c3y)\n\n # find c1_left, c1_right which are located along the lines\n # through c1 and perpendicular 
to the tangential lines of the\n # Bezier path at a distance of width. Same thing for c3_left and\n # c3_right with respect to c3.\n c1x_left, c1y_left, c1x_right, c1y_right = (\n get_normal_points(c1x, c1y, cos_t1, sin_t1, width * w1)\n )\n c3x_left, c3y_left, c3x_right, c3y_right = (\n get_normal_points(c3x, c3y, cos_t2, sin_t2, width * w2)\n )\n\n # find c12, c23 and c123 which are middle points of c1-cm, cm-c3 and\n # c12-c23\n c12x, c12y = (c1x + cmx) * .5, (c1y + cmy) * .5\n c23x, c23y = (cmx + c3x) * .5, (cmy + c3y) * .5\n c123x, c123y = (c12x + c23x) * .5, (c12y + c23y) * .5\n\n # tangential angle of c123 (angle between c12 and c23)\n cos_t123, sin_t123 = get_cos_sin(c12x, c12y, c23x, c23y)\n\n c123x_left, c123y_left, c123x_right, c123y_right = (\n get_normal_points(c123x, c123y, cos_t123, sin_t123, width * wm)\n )\n\n path_left = find_control_points(c1x_left, c1y_left,\n c123x_left, c123y_left,\n c3x_left, c3y_left)\n path_right = find_control_points(c1x_right, c1y_right,\n c123x_right, c123y_right,\n c3x_right, c3y_right)\n\n return path_left, path_right\n | .venv\Lib\site-packages\matplotlib\bezier.py | bezier.py | Python | 19,049 | 0.95 | 0.122924 | 0.109244 | python-kit | 345 | 2025-06-01T22:59:40.698202 | GPL-3.0 | false | 9b0124f80b777f58cd766db0f9620969 |
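Editor's note: `split_de_casteljau` in the bezier.py row above repeatedly interpolates adjacent control points. A self-contained sketch of the same algorithm (NumPy only, names mirror the source) shows that the split point is shared by both halves:

```python
import numpy as np

def de_casteljau_step(beta, t):
    # One round of linear interpolation between adjacent control points.
    return beta[:-1] * (1 - t) + beta[1:] * t

def split_de_casteljau(beta, t):
    beta = np.asarray(beta, dtype=float)
    beta_list = [beta]
    while len(beta) > 1:
        beta = de_casteljau_step(beta, t)
        beta_list.append(beta)
    left = np.array([b[0] for b in beta_list])
    right = np.array([b[-1] for b in reversed(beta_list)])
    return left, right

quad = [(0, 0), (1, 2), (2, 0)]  # quadratic Bézier control points
left, right = split_de_casteljau(quad, 0.5)
# left ends where right begins: the curve point at t = 0.5, here (1, 1)
```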
from collections.abc import Callable\nfrom typing import Literal\n\nimport numpy as np\nfrom numpy.typing import ArrayLike\n\nfrom .path import Path\n\nclass NonIntersectingPathException(ValueError): ...\n\ndef get_intersection(\n cx1: float,\n cy1: float,\n cos_t1: float,\n sin_t1: float,\n cx2: float,\n cy2: float,\n cos_t2: float,\n sin_t2: float,\n) -> tuple[float, float]: ...\ndef get_normal_points(\n cx: float, cy: float, cos_t: float, sin_t: float, length: float\n) -> tuple[float, float, float, float]: ...\ndef split_de_casteljau(beta: ArrayLike, t: float) -> tuple[np.ndarray, np.ndarray]: ...\ndef find_bezier_t_intersecting_with_closedpath(\n bezier_point_at_t: Callable[[float], tuple[float, float]],\n inside_closedpath: Callable[[tuple[float, float]], bool],\n t0: float = ...,\n t1: float = ...,\n tolerance: float = ...,\n) -> tuple[float, float]: ...\n\n# TODO make generic over d, the dimension? ndarraydim\nclass BezierSegment:\n def __init__(self, control_points: ArrayLike) -> None: ...\n def __call__(self, t: ArrayLike) -> np.ndarray: ...\n def point_at_t(self, t: float) -> tuple[float, ...]: ...\n @property\n def control_points(self) -> np.ndarray: ...\n @property\n def dimension(self) -> int: ...\n @property\n def degree(self) -> int: ...\n @property\n def polynomial_coefficients(self) -> np.ndarray: ...\n def axis_aligned_extrema(self) -> tuple[np.ndarray, np.ndarray]: ...\n\ndef split_bezier_intersecting_with_closedpath(\n bezier: ArrayLike,\n inside_closedpath: Callable[[tuple[float, float]], bool],\n tolerance: float = ...,\n) -> tuple[np.ndarray, np.ndarray]: ...\ndef split_path_inout(\n path: Path,\n inside: Callable[[tuple[float, float]], bool],\n tolerance: float = ...,\n reorder_inout: bool = ...,\n) -> tuple[Path, Path]: ...\ndef inside_circle(\n cx: float, cy: float, r: float\n) -> Callable[[tuple[float, float]], bool]: ...\ndef get_cos_sin(x0: float, y0: float, x1: float, y1: float) -> tuple[float, float]: ...\ndef check_if_parallel(\n 
dx1: float, dy1: float, dx2: float, dy2: float, tolerance: float = ...\n) -> Literal[-1, False, 1]: ...\ndef get_parallels(\n bezier2: ArrayLike, width: float\n) -> tuple[list[tuple[float, float]], list[tuple[float, float]]]: ...\ndef find_control_points(\n c1x: float, c1y: float, mmx: float, mmy: float, c2x: float, c2y: float\n) -> list[tuple[float, float]]: ...\ndef make_wedged_bezier2(\n bezier2: ArrayLike, width: float, w1: float = ..., wm: float = ..., w2: float = ...\n) -> tuple[list[tuple[float, float]], list[tuple[float, float]]]: ...\n | .venv\Lib\site-packages\matplotlib\bezier.pyi | bezier.pyi | Other | 2,586 | 0.95 | 0.297297 | 0.014706 | awesome-app | 299 | 2023-09-09T22:23:12.029736 | MIT | false | 9340091a23fac23008619cf130b817f8 |
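Editor's note: the stubbed `BezierSegment.__call__` evaluates the curve in the Bernstein basis. The same formula as the implementation in the bezier.py row, rebuilt as a free function for illustration (not matplotlib's API):

```python
import math
import numpy as np

def bezier_at(control_points, t):
    # B(t) = sum_i C(n, i) * (1-t)^(n-i) * t^i * P_i, vectorized over t.
    P = np.asarray(control_points, dtype=float)
    N = len(P)
    orders = np.arange(N)
    coeff = [math.comb(N - 1, i) for i in orders]
    px = (P.T * coeff).T                       # binomially weighted points
    t = np.atleast_1d(t)
    return (np.power.outer(1 - t, orders[::-1])
            * np.power.outer(t, orders)) @ px  # (k, d) array

pts = bezier_at([(0, 0), (1, 2), (2, 0)], [0.0, 0.5, 1.0])
# endpoints equal the first/last control points; the midpoint is (1, 1)
```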
"""\nPlotting of string "category" data: ``plot(['d', 'f', 'a'], [1, 2, 3])`` will\nplot three points with x-axis values of 'd', 'f', 'a'.\n\nSee :doc:`/gallery/lines_bars_and_markers/categorical_variables` for an\nexample.\n\nThe module uses Matplotlib's `matplotlib.units` mechanism to convert from\nstrings to integers and provides a tick locator, a tick formatter, and the\n`.UnitData` class that creates and stores the string-to-integer mapping.\n"""\n\nfrom collections import OrderedDict\nimport dateutil.parser\nimport itertools\nimport logging\n\nimport numpy as np\n\nfrom matplotlib import _api, cbook, ticker, units\n\n\n_log = logging.getLogger(__name__)\n\n\nclass StrCategoryConverter(units.ConversionInterface):\n @staticmethod\n def convert(value, unit, axis):\n """\n Convert strings in *value* to floats using mapping information stored\n in the *unit* object.\n\n Parameters\n ----------\n value : str or iterable\n Value or list of values to be converted.\n unit : `.UnitData`\n An object mapping strings to integers.\n axis : `~matplotlib.axis.Axis`\n The axis on which the converted value is plotted.\n\n .. 
note:: *axis* is unused.\n\n Returns\n -------\n float or `~numpy.ndarray` of float\n """\n if unit is None:\n raise ValueError(\n 'Missing category information for StrCategoryConverter; '\n 'this might be caused by unintendedly mixing categorical and '\n 'numeric data')\n StrCategoryConverter._validate_unit(unit)\n # dtype = object preserves numerical pass throughs\n values = np.atleast_1d(np.array(value, dtype=object))\n # force an update so it also does type checking\n unit.update(values)\n s = np.vectorize(unit._mapping.__getitem__, otypes=[float])(values)\n return s if not cbook.is_scalar_or_string(value) else s[0]\n\n @staticmethod\n def axisinfo(unit, axis):\n """\n Set the default axis ticks and labels.\n\n Parameters\n ----------\n unit : `.UnitData`\n object string unit information for value\n axis : `~matplotlib.axis.Axis`\n axis for which information is being set\n\n .. note:: *axis* is not used\n\n Returns\n -------\n `~matplotlib.units.AxisInfo`\n Information to support default tick labeling\n\n """\n StrCategoryConverter._validate_unit(unit)\n # locator and formatter take mapping dict because\n # args need to be pass by reference for updates\n majloc = StrCategoryLocator(unit._mapping)\n majfmt = StrCategoryFormatter(unit._mapping)\n return units.AxisInfo(majloc=majloc, majfmt=majfmt)\n\n @staticmethod\n def default_units(data, axis):\n """\n Set and update the `~matplotlib.axis.Axis` units.\n\n Parameters\n ----------\n data : str or iterable of str\n axis : `~matplotlib.axis.Axis`\n axis on which the data is plotted\n\n Returns\n -------\n `.UnitData`\n object storing string to integer mapping\n """\n # the conversion call stack is default_units -> axis_info -> convert\n if axis.units is None:\n axis.set_units(UnitData(data))\n else:\n axis.units.update(data)\n return axis.units\n\n @staticmethod\n def _validate_unit(unit):\n if not hasattr(unit, '_mapping'):\n raise ValueError(\n f'Provided unit "{unit}" is not valid for a categorical '\n 
'converter, as it does not have a _mapping attribute.')\n\n\nclass StrCategoryLocator(ticker.Locator):\n """Tick at every integer mapping of the string data."""\n def __init__(self, units_mapping):\n """\n Parameters\n ----------\n units_mapping : dict\n Mapping of category names (str) to indices (int).\n """\n self._units = units_mapping\n\n def __call__(self):\n # docstring inherited\n return list(self._units.values())\n\n def tick_values(self, vmin, vmax):\n # docstring inherited\n return self()\n\n\nclass StrCategoryFormatter(ticker.Formatter):\n """String representation of the data at every tick."""\n def __init__(self, units_mapping):\n """\n Parameters\n ----------\n units_mapping : dict\n Mapping of category names (str) to indices (int).\n """\n self._units = units_mapping\n\n def __call__(self, x, pos=None):\n # docstring inherited\n return self.format_ticks([x])[0]\n\n def format_ticks(self, values):\n # docstring inherited\n r_mapping = {v: self._text(k) for k, v in self._units.items()}\n return [r_mapping.get(round(val), '') for val in values]\n\n @staticmethod\n def _text(value):\n """Convert text values into utf-8 or ascii strings."""\n if isinstance(value, bytes):\n value = value.decode(encoding='utf-8')\n elif not isinstance(value, str):\n value = str(value)\n return value\n\n\nclass UnitData:\n def __init__(self, data=None):\n """\n Create mapping between unique categorical values and integer ids.\n\n Parameters\n ----------\n data : iterable\n sequence of string values\n """\n self._mapping = OrderedDict()\n self._counter = itertools.count()\n if data is not None:\n self.update(data)\n\n @staticmethod\n def _str_is_convertible(val):\n """\n Helper method to check whether a string can be parsed as float or date.\n """\n try:\n float(val)\n except ValueError:\n try:\n dateutil.parser.parse(val)\n except (ValueError, TypeError):\n # TypeError if dateutil >= 2.8.1 else ValueError\n return False\n return True\n\n def update(self, data):\n """\n Map new 
values to integer identifiers.\n\n Parameters\n ----------\n data : iterable of str or bytes\n\n Raises\n ------\n TypeError\n If elements in *data* are neither str nor bytes.\n """\n data = np.atleast_1d(np.array(data, dtype=object))\n # check if convertible to number:\n convertible = True\n for val in OrderedDict.fromkeys(data):\n # OrderedDict just iterates over unique values in data.\n _api.check_isinstance((str, bytes), value=val)\n if convertible:\n # this will only be called so long as convertible is True.\n convertible = self._str_is_convertible(val)\n if val not in self._mapping:\n self._mapping[val] = next(self._counter)\n if data.size and convertible:\n _log.info('Using categorical units to plot a list of strings '\n 'that are all parsable as floats or dates. If these '\n 'strings should be plotted as numbers, cast to the '\n 'appropriate data type before plotting.')\n\n\n# Register the converter with Matplotlib's unit framework\n# Intentionally set to a single instance\nunits.registry[str] = \\n units.registry[np.str_] = \\n units.registry[bytes] = \\n units.registry[np.bytes_] = StrCategoryConverter()\n | .venv\Lib\site-packages\matplotlib\category.py | category.py | Python | 7,377 | 0.95 | 0.174468 | 0.076531 | react-lib | 876 | 2023-09-04T00:56:29.872111 | BSD-3-Clause | false | 9afaaba238f10af08b4f3c2104c680cb |
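The string-to-integer mapping implemented by `UnitData` and `StrCategoryConverter` above can be sketched in isolation. `MiniUnitData` below is a hypothetical toy class for illustration only, not part of matplotlib's API; it mirrors the essential behavior (insertion-ordered ids, type checking, and on-the-fly registration of unseen categories during conversion) without the NumPy vectorization:

```python
from itertools import count


class MiniUnitData:
    """Toy sketch of UnitData's category -> integer mapping (illustration only)."""

    def __init__(self, data=()):
        self._mapping = {}       # category -> int, insertion-ordered (dicts preserve order)
        self._counter = count()  # hands out the next free integer id
        self.update(data)

    def update(self, data):
        # Like UnitData.update: type-check and assign ids to unseen categories.
        for val in data:
            if not isinstance(val, (str, bytes)):
                raise TypeError(f"categorical data must be str or bytes, got {val!r}")
            if val not in self._mapping:
                self._mapping[val] = next(self._counter)

    def convert(self, values):
        # Like StrCategoryConverter.convert: update first, then map to floats,
        # so unseen categories are appended rather than raising KeyError.
        self.update(values)
        return [float(self._mapping[v]) for v in values]


units_data = MiniUnitData(['d', 'f', 'a'])
print(units_data.convert(['d', 'f', 'a']))    # [0.0, 1.0, 2.0]
print(units_data.convert(['a', 'new', 'd']))  # [2.0, 3.0, 0.0] -- 'new' gets id 3
```

This is why plotting `['d', 'f', 'a']` places ticks at positions 0, 1, 2: the locator ticks at the mapped integers and the formatter inverts the mapping to recover the labels.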
"""\nA collection of utility functions and classes. Originally, many\n(but not all) were from the Python Cookbook -- hence the name cbook.\n"""\n\nimport collections\nimport collections.abc\nimport contextlib\nimport functools\nimport gzip\nimport itertools\nimport math\nimport operator\nimport os\nfrom pathlib import Path\nimport shlex\nimport subprocess\nimport sys\nimport time\nimport traceback\nimport types\nimport weakref\n\nimport numpy as np\n\ntry:\n from numpy.exceptions import VisibleDeprecationWarning # numpy >= 1.25\nexcept ImportError:\n from numpy import VisibleDeprecationWarning\n\nimport matplotlib\nfrom matplotlib import _api, _c_internal_utils\n\n\nclass _ExceptionInfo:\n """\n A class to carry exception information around.\n\n This is used to store and later raise exceptions. It's an alternative to\n directly storing Exception instances that circumvents traceback-related\n issues: caching tracebacks can keep user's objects in local namespaces\n alive indefinitely, which can lead to very surprising memory issues for\n users and result in incorrect tracebacks.\n """\n\n def __init__(self, cls, *args):\n self._cls = cls\n self._args = args\n\n @classmethod\n def from_exception(cls, exc):\n return cls(type(exc), *exc.args)\n\n def to_exception(self):\n return self._cls(*self._args)\n\n\ndef _get_running_interactive_framework():\n """\n Return the interactive framework whose event loop is currently running, if\n any, or "headless" if no event loop can be started, or None.\n\n Returns\n -------\n Optional[str]\n One of the following values: "qt", "gtk3", "gtk4", "wx", "tk",\n "macosx", "headless", ``None``.\n """\n # Use ``sys.modules.get(name)`` rather than ``name in sys.modules`` as\n # entries can also have been explicitly set to None.\n QtWidgets = (\n sys.modules.get("PyQt6.QtWidgets")\n or sys.modules.get("PySide6.QtWidgets")\n or sys.modules.get("PyQt5.QtWidgets")\n or sys.modules.get("PySide2.QtWidgets")\n )\n if QtWidgets and 
QtWidgets.QApplication.instance():\n return "qt"\n Gtk = sys.modules.get("gi.repository.Gtk")\n if Gtk:\n if Gtk.MAJOR_VERSION == 4:\n from gi.repository import GLib\n if GLib.main_depth():\n return "gtk4"\n if Gtk.MAJOR_VERSION == 3 and Gtk.main_level():\n return "gtk3"\n wx = sys.modules.get("wx")\n if wx and wx.GetApp():\n return "wx"\n tkinter = sys.modules.get("tkinter")\n if tkinter:\n codes = {tkinter.mainloop.__code__, tkinter.Misc.mainloop.__code__}\n for frame in sys._current_frames().values():\n while frame:\n if frame.f_code in codes:\n return "tk"\n frame = frame.f_back\n # Preemptively break reference cycle between locals and the frame.\n del frame\n macosx = sys.modules.get("matplotlib.backends._macosx")\n if macosx and macosx.event_loop_is_running():\n return "macosx"\n if not _c_internal_utils.display_is_valid():\n return "headless"\n return None\n\n\ndef _exception_printer(exc):\n if _get_running_interactive_framework() in ["headless", None]:\n raise exc\n else:\n traceback.print_exc()\n\n\nclass _StrongRef:\n """\n Wrapper similar to a weakref, but keeping a strong reference to the object.\n """\n\n def __init__(self, obj):\n self._obj = obj\n\n def __call__(self):\n return self._obj\n\n def __eq__(self, other):\n return isinstance(other, _StrongRef) and self._obj == other._obj\n\n def __hash__(self):\n return hash(self._obj)\n\n\ndef _weak_or_strong_ref(func, callback):\n """\n Return a `WeakMethod` wrapping *func* if possible, else a `_StrongRef`.\n """\n try:\n return weakref.WeakMethod(func, callback)\n except TypeError:\n return _StrongRef(func)\n\n\nclass _UnhashDict:\n """\n A minimal dict-like class that also supports unhashable keys, storing them\n in a list of key-value pairs.\n\n This class only implements the interface needed for `CallbackRegistry`, and\n tries to minimize the overhead for the hashable case.\n """\n\n def __init__(self, pairs):\n self._dict = {}\n self._pairs = []\n for k, v in pairs:\n self[k] = v\n\n def 
__setitem__(self, key, value):\n try:\n self._dict[key] = value\n except TypeError:\n for i, (k, v) in enumerate(self._pairs):\n if k == key:\n self._pairs[i] = (key, value)\n break\n else:\n self._pairs.append((key, value))\n\n def __getitem__(self, key):\n try:\n return self._dict[key]\n except TypeError:\n pass\n for k, v in self._pairs:\n if k == key:\n return v\n raise KeyError(key)\n\n def pop(self, key, *args):\n try:\n if key in self._dict:\n return self._dict.pop(key)\n except TypeError:\n for i, (k, v) in enumerate(self._pairs):\n if k == key:\n del self._pairs[i]\n return v\n if args:\n return args[0]\n raise KeyError(key)\n\n def __iter__(self):\n yield from self._dict\n for k, v in self._pairs:\n yield k\n\n\nclass CallbackRegistry:\n """\n Handle registering, processing, blocking, and disconnecting\n for a set of signals and callbacks:\n\n >>> def oneat(x):\n ... print('eat', x)\n >>> def ondrink(x):\n ... print('drink', x)\n\n >>> from matplotlib.cbook import CallbackRegistry\n >>> callbacks = CallbackRegistry()\n\n >>> id_eat = callbacks.connect('eat', oneat)\n >>> id_drink = callbacks.connect('drink', ondrink)\n\n >>> callbacks.process('drink', 123)\n drink 123\n >>> callbacks.process('eat', 456)\n eat 456\n >>> callbacks.process('be merry', 456) # nothing will be called\n\n >>> callbacks.disconnect(id_eat)\n >>> callbacks.process('eat', 456) # nothing will be called\n\n >>> with callbacks.blocked(signal='drink'):\n ... callbacks.process('drink', 123) # nothing will be called\n >>> callbacks.process('drink', 123)\n drink 123\n\n In practice, one should always disconnect all callbacks when they are\n no longer needed to avoid dangling references (and thus memory leaks).\n However, real code in Matplotlib rarely does so, and due to its design,\n it is rather difficult to place this kind of code. 
To get around this,\n and prevent this class of memory leaks, we instead store weak references\n to bound methods only, so when the destination object needs to die, the\n CallbackRegistry won't keep it alive.\n\n Parameters\n ----------\n exception_handler : callable, optional\n If not None, *exception_handler* must be a function that takes an\n `Exception` as single parameter. It gets called with any `Exception`\n raised by the callbacks during `CallbackRegistry.process`, and may\n either re-raise the exception or handle it in another manner.\n\n The default handler prints the exception (with `traceback.print_exc`) if\n an interactive event loop is running; it re-raises the exception if no\n interactive event loop is running.\n\n signals : list, optional\n If not None, *signals* is a list of signals that this registry handles:\n attempting to `process` or to `connect` to a signal not in the list\n throws a `ValueError`. The default, None, does not restrict the\n handled signals.\n """\n\n # We maintain two mappings:\n # callbacks: signal -> {cid -> weakref-to-callback}\n # _func_cid_map: {(signal, weakref-to-callback) -> cid}\n\n def __init__(self, exception_handler=_exception_printer, *, signals=None):\n self._signals = None if signals is None else list(signals) # Copy it.\n self.exception_handler = exception_handler\n self.callbacks = {}\n self._cid_gen = itertools.count()\n self._func_cid_map = _UnhashDict([])\n # A hidden variable that marks cids that need to be pickled.\n self._pickled_cids = set()\n\n def __getstate__(self):\n return {\n **vars(self),\n # In general, callbacks may not be pickled, so we just drop them,\n # unless directed otherwise by self._pickled_cids.\n "callbacks": {s: {cid: proxy() for cid, proxy in d.items()\n if cid in self._pickled_cids}\n for s, d in self.callbacks.items()},\n # It is simpler to reconstruct this from callbacks in __setstate__.\n "_func_cid_map": None,\n "_cid_gen": next(self._cid_gen)\n }\n\n def __setstate__(self, 
state):\n cid_count = state.pop('_cid_gen')\n vars(self).update(state)\n self.callbacks = {\n s: {cid: _weak_or_strong_ref(func, functools.partial(self._remove_proxy, s))\n for cid, func in d.items()}\n for s, d in self.callbacks.items()}\n self._func_cid_map = _UnhashDict(\n ((s, proxy), cid)\n for s, d in self.callbacks.items() for cid, proxy in d.items())\n self._cid_gen = itertools.count(cid_count)\n\n def connect(self, signal, func):\n """Register *func* to be called when signal *signal* is generated."""\n if self._signals is not None:\n _api.check_in_list(self._signals, signal=signal)\n proxy = _weak_or_strong_ref(func, functools.partial(self._remove_proxy, signal))\n try:\n return self._func_cid_map[signal, proxy]\n except KeyError:\n cid = self._func_cid_map[signal, proxy] = next(self._cid_gen)\n self.callbacks.setdefault(signal, {})[cid] = proxy\n return cid\n\n def _connect_picklable(self, signal, func):\n """\n Like `.connect`, but the callback is kept when pickling/unpickling.\n\n Currently internal-use only.\n """\n cid = self.connect(signal, func)\n self._pickled_cids.add(cid)\n return cid\n\n # Keep a reference to sys.is_finalizing, as sys may have been cleared out\n # at that point.\n def _remove_proxy(self, signal, proxy, *, _is_finalizing=sys.is_finalizing):\n if _is_finalizing():\n # Weakrefs can't be properly torn down at that point anymore.\n return\n cid = self._func_cid_map.pop((signal, proxy), None)\n if cid is not None:\n del self.callbacks[signal][cid]\n self._pickled_cids.discard(cid)\n else: # Not found\n return\n if len(self.callbacks[signal]) == 0: # Clean up empty dicts\n del self.callbacks[signal]\n\n def disconnect(self, cid):\n """\n Disconnect the callback registered with callback id *cid*.\n\n No error is raised if such a callback does not exist.\n """\n self._pickled_cids.discard(cid)\n for signal, proxy in self._func_cid_map:\n if self._func_cid_map[signal, proxy] == cid:\n break\n else: # Not found\n return\n assert 
self.callbacks[signal][cid] == proxy\n del self.callbacks[signal][cid]\n self._func_cid_map.pop((signal, proxy))\n if len(self.callbacks[signal]) == 0: # Clean up empty dicts\n del self.callbacks[signal]\n\n def process(self, s, *args, **kwargs):\n """\n Process signal *s*.\n\n All of the functions registered to receive callbacks on *s* will be\n called with ``*args`` and ``**kwargs``.\n """\n if self._signals is not None:\n _api.check_in_list(self._signals, signal=s)\n for ref in list(self.callbacks.get(s, {}).values()):\n func = ref()\n if func is not None:\n try:\n func(*args, **kwargs)\n # this does not capture KeyboardInterrupt, SystemExit,\n # and GeneratorExit\n except Exception as exc:\n if self.exception_handler is not None:\n self.exception_handler(exc)\n else:\n raise\n\n @contextlib.contextmanager\n def blocked(self, *, signal=None):\n """\n Block callback signals from being processed.\n\n A context manager to temporarily block/disable callback signals\n from being processed by the registered listeners.\n\n Parameters\n ----------\n signal : str, optional\n The callback signal to block. 
The default is to block all signals.\n """\n orig = self.callbacks\n try:\n if signal is None:\n # Empty out the callbacks\n self.callbacks = {}\n else:\n # Only remove the specific signal\n self.callbacks = {k: orig[k] for k in orig if k != signal}\n yield\n finally:\n self.callbacks = orig\n\n\nclass silent_list(list):\n """\n A list with a short ``repr()``.\n\n This is meant to be used for a homogeneous list of artists, so that they\n don't cause long, meaningless output.\n\n Instead of ::\n\n [<matplotlib.lines.Line2D object at 0x7f5749fed3c8>,\n <matplotlib.lines.Line2D object at 0x7f5749fed4e0>,\n <matplotlib.lines.Line2D object at 0x7f5758016550>]\n\n one will get ::\n\n <a list of 3 Line2D objects>\n\n If ``self.type`` is None, the type name is obtained from the first item in\n the list (if any).\n """\n\n def __init__(self, type, seq=None):\n self.type = type\n if seq is not None:\n self.extend(seq)\n\n def __repr__(self):\n if self.type is not None or len(self) != 0:\n tp = self.type if self.type is not None else type(self[0]).__name__\n return f"<a list of {len(self)} {tp} objects>"\n else:\n return "<an empty list>"\n\n\ndef _local_over_kwdict(\n local_var, kwargs, *keys,\n warning_cls=_api.MatplotlibDeprecationWarning):\n out = local_var\n for key in keys:\n kwarg_val = kwargs.pop(key, None)\n if kwarg_val is not None:\n if out is None:\n out = kwarg_val\n else:\n _api.warn_external(f'"{key}" keyword argument will be ignored',\n warning_cls)\n return out\n\n\ndef strip_math(s):\n """\n Remove latex formatting from mathtext.\n\n Only handles fully math and fully non-math strings.\n """\n if len(s) >= 2 and s[0] == s[-1] == "$":\n s = s[1:-1]\n for tex, plain in [\n (r"\times", "x"), # Specifically for Formatter support.\n (r"\mathdefault", ""),\n (r"\rm", ""),\n (r"\cal", ""),\n (r"\tt", ""),\n (r"\it", ""),\n ("\\", ""),\n ("{", ""),\n ("}", ""),\n ]:\n s = s.replace(tex, plain)\n return s\n\n\ndef _strip_comment(s):\n """Strip everything from the 
first unquoted #."""\n pos = 0\n while True:\n quote_pos = s.find('"', pos)\n hash_pos = s.find('#', pos)\n if quote_pos < 0:\n without_comment = s if hash_pos < 0 else s[:hash_pos]\n return without_comment.strip()\n elif 0 <= hash_pos < quote_pos:\n return s[:hash_pos].strip()\n else:\n closing_quote_pos = s.find('"', quote_pos + 1)\n if closing_quote_pos < 0:\n raise ValueError(\n f"Missing closing quote in: {s!r}. If you need a double-"\n 'quote inside a string, use escaping: e.g. "the \" char"')\n pos = closing_quote_pos + 1 # behind closing quote\n\n\ndef is_writable_file_like(obj):\n """Return whether *obj* looks like a file object with a *write* method."""\n return callable(getattr(obj, 'write', None))\n\n\ndef file_requires_unicode(x):\n """\n Return whether the given writable file-like object requires Unicode to be\n written to it.\n """\n try:\n x.write(b'')\n except TypeError:\n return True\n else:\n return False\n\n\ndef to_filehandle(fname, flag='r', return_opened=False, encoding=None):\n """\n Convert a path to an open file handle or pass-through a file-like object.\n\n Consider using `open_file_cm` instead, as it allows one to properly close\n newly created file objects more easily.\n\n Parameters\n ----------\n fname : str or path-like or file-like\n If `str` or `os.PathLike`, the file is opened using the flags specified\n by *flag* and *encoding*. If a file-like object, it is passed through.\n flag : str, default: 'r'\n Passed as the *mode* argument to `open` when *fname* is `str` or\n `os.PathLike`; ignored if *fname* is file-like.\n return_opened : bool, default: False\n If True, return both the file object and a boolean indicating whether\n this was a new file (that the caller needs to close). 
If False, return\n only the new file.\n encoding : str or None, default: None\n Passed as the *mode* argument to `open` when *fname* is `str` or\n `os.PathLike`; ignored if *fname* is file-like.\n\n Returns\n -------\n fh : file-like\n opened : bool\n *opened* is only returned if *return_opened* is True.\n """\n if isinstance(fname, os.PathLike):\n fname = os.fspath(fname)\n if isinstance(fname, str):\n if fname.endswith('.gz'):\n fh = gzip.open(fname, flag)\n elif fname.endswith('.bz2'):\n # python may not be compiled with bz2 support,\n # bury import until we need it\n import bz2\n fh = bz2.BZ2File(fname, flag)\n else:\n fh = open(fname, flag, encoding=encoding)\n opened = True\n elif hasattr(fname, 'seek'):\n fh = fname\n opened = False\n else:\n raise ValueError('fname must be a PathLike or file handle')\n if return_opened:\n return fh, opened\n return fh\n\n\ndef open_file_cm(path_or_file, mode="r", encoding=None):\n r"""Pass through file objects and context-manage path-likes."""\n fh, opened = to_filehandle(path_or_file, mode, True, encoding)\n return fh if opened else contextlib.nullcontext(fh)\n\n\ndef is_scalar_or_string(val):\n """Return whether the given object is a scalar or string like."""\n return isinstance(val, str) or not np.iterable(val)\n\n\ndef get_sample_data(fname, asfileobj=True):\n """\n Return a sample data file. *fname* is a path relative to the\n :file:`mpl-data/sample_data` directory. If *asfileobj* is `True`\n return a file object, otherwise just a file path.\n\n Sample data files are stored in the 'mpl-data/sample_data' directory within\n the Matplotlib package.\n\n If the filename ends in .gz, the file is implicitly ungzipped. 
If the\n filename ends with .npy or .npz, and *asfileobj* is `True`, the file is\n loaded with `numpy.load`.\n """\n path = _get_data_path('sample_data', fname)\n if asfileobj:\n suffix = path.suffix.lower()\n if suffix == '.gz':\n return gzip.open(path)\n elif suffix in ['.npy', '.npz']:\n return np.load(path)\n elif suffix in ['.csv', '.xrc', '.txt']:\n return path.open('r')\n else:\n return path.open('rb')\n else:\n return str(path)\n\n\ndef _get_data_path(*args):\n """\n Return the `pathlib.Path` to a resource file provided by Matplotlib.\n\n ``*args`` specify a path relative to the base data path.\n """\n return Path(matplotlib.get_data_path(), *args)\n\n\ndef flatten(seq, scalarp=is_scalar_or_string):\n """\n Return a generator of flattened nested containers.\n\n For example:\n\n >>> from matplotlib.cbook import flatten\n >>> l = (('John', ['Hunter']), (1, 23), [[([42, (5, 23)], )]])\n >>> print(list(flatten(l)))\n ['John', 'Hunter', 1, 23, 42, 5, 23]\n\n By: Composite of Holger Krekel and Luther Blissett\n From: https://code.activestate.com/recipes/121294-simple-generator-for-flattening-nested-containers/\n and Recipe 1.12 in cookbook\n """ # noqa: E501\n for item in seq:\n if scalarp(item) or item is None:\n yield item\n else:\n yield from flatten(item, scalarp)\n\n\nclass _Stack:\n """\n Stack of elements with a movable cursor.\n\n Mimics home/back/forward in a web browser.\n """\n\n def __init__(self):\n self._pos = -1\n self._elements = []\n\n def clear(self):\n """Empty the stack."""\n self._pos = -1\n self._elements = []\n\n def __call__(self):\n """Return the current element, or None."""\n return self._elements[self._pos] if self._elements else None\n\n def __len__(self):\n return len(self._elements)\n\n def __getitem__(self, ind):\n return self._elements[ind]\n\n def forward(self):\n """Move the position forward and return the current element."""\n self._pos = min(self._pos + 1, len(self._elements) - 1)\n return self()\n\n def back(self):\n """Move 
the position back and return the current element."""\n self._pos = max(self._pos - 1, 0)\n return self()\n\n def push(self, o):\n """\n Push *o* to the stack after the current position, and return *o*.\n\n Discard all later elements.\n """\n self._elements[self._pos + 1:] = [o]\n self._pos = len(self._elements) - 1\n return o\n\n def home(self):\n """\n Push the first element onto the top of the stack.\n\n The first element is returned.\n """\n return self.push(self._elements[0]) if self._elements else None\n\n\ndef safe_masked_invalid(x, copy=False):\n x = np.array(x, subok=True, copy=copy)\n if not x.dtype.isnative:\n # If we have already made a copy, do the byteswap in place, else make a\n # copy with the byte order swapped.\n # Swap to native order.\n x = x.byteswap(inplace=copy).view(x.dtype.newbyteorder('N'))\n try:\n xm = np.ma.masked_where(~(np.isfinite(x)), x, copy=False)\n except TypeError:\n return x\n return xm\n\n\ndef print_cycles(objects, outstream=sys.stdout, show_progress=False):\n """\n Print loops of cyclic references in the given *objects*.\n\n It is often useful to pass in ``gc.garbage`` to find the cycles that are\n preventing some objects from being garbage collected.\n\n Parameters\n ----------\n objects\n A list of objects to find cycles in.\n outstream\n The stream for output.\n show_progress : bool\n If True, print the number of objects reached as they are found.\n """\n import gc\n\n def print_path(path):\n for i, step in enumerate(path):\n # next "wraps around"\n next = path[(i + 1) % len(path)]\n\n outstream.write(" %s -- " % type(step))\n if isinstance(step, dict):\n for key, val in step.items():\n if val is next:\n outstream.write(f"[{key!r}]")\n break\n if key is next:\n outstream.write(f"[key] = {val!r}")\n break\n elif isinstance(step, list):\n outstream.write("[%d]" % step.index(next))\n elif isinstance(step, tuple):\n outstream.write("( tuple )")\n else:\n outstream.write(repr(step))\n outstream.write(" ->\n")\n 
outstream.write("\n")\n\n def recurse(obj, start, all, current_path):\n if show_progress:\n outstream.write("%d\r" % len(all))\n\n all[id(obj)] = None\n\n referents = gc.get_referents(obj)\n for referent in referents:\n # If we've found our way back to the start, this is\n # a cycle, so print it out\n if referent is start:\n print_path(current_path)\n\n # Don't go back through the original list of objects, or\n # through temporary references to the object, since those\n # are just an artifact of the cycle detector itself.\n elif referent is objects or isinstance(referent, types.FrameType):\n continue\n\n # We haven't seen this object before, so recurse\n elif id(referent) not in all:\n recurse(referent, start, all, current_path + [obj])\n\n for obj in objects:\n outstream.write(f"Examining: {obj!r}\n")\n recurse(obj, obj, {}, [])\n\n\nclass Grouper:\n """\n A disjoint-set data structure.\n\n Objects can be joined using :meth:`join`, tested for connectedness\n using :meth:`joined`, and all disjoint sets can be retrieved by\n using the object as an iterator.\n\n The objects being joined must be hashable and weak-referenceable.\n\n Examples\n --------\n >>> from matplotlib.cbook import Grouper\n >>> class Foo:\n ... def __init__(self, s):\n ... self.s = s\n ... def __repr__(self):\n ... 
return self.s\n ...\n >>> a, b, c, d, e, f = [Foo(x) for x in 'abcdef']\n >>> grp = Grouper()\n >>> grp.join(a, b)\n >>> grp.join(b, c)\n >>> grp.join(d, e)\n >>> list(grp)\n [[a, b, c], [d, e]]\n >>> grp.joined(a, b)\n True\n >>> grp.joined(a, c)\n True\n >>> grp.joined(a, d)\n False\n """\n\n def __init__(self, init=()):\n self._mapping = weakref.WeakKeyDictionary(\n {x: weakref.WeakSet([x]) for x in init})\n self._ordering = weakref.WeakKeyDictionary()\n for x in init:\n if x not in self._ordering:\n self._ordering[x] = len(self._ordering)\n self._next_order = len(self._ordering) # Plain int to simplify pickling.\n\n def __getstate__(self):\n return {\n **vars(self),\n # Convert weak refs to strong ones.\n "_mapping": {k: set(v) for k, v in self._mapping.items()},\n "_ordering": {**self._ordering},\n }\n\n def __setstate__(self, state):\n vars(self).update(state)\n # Convert strong refs to weak ones.\n self._mapping = weakref.WeakKeyDictionary(\n {k: weakref.WeakSet(v) for k, v in self._mapping.items()})\n self._ordering = weakref.WeakKeyDictionary(self._ordering)\n\n def __contains__(self, item):\n return item in self._mapping\n\n def join(self, a, *args):\n """\n Join given arguments into the same set. 
Accepts one or more arguments.\n """\n mapping = self._mapping\n try:\n set_a = mapping[a]\n except KeyError:\n set_a = mapping[a] = weakref.WeakSet([a])\n self._ordering[a] = self._next_order\n self._next_order += 1\n for arg in args:\n try:\n set_b = mapping[arg]\n except KeyError:\n set_b = mapping[arg] = weakref.WeakSet([arg])\n self._ordering[arg] = self._next_order\n self._next_order += 1\n if set_b is not set_a:\n if len(set_b) > len(set_a):\n set_a, set_b = set_b, set_a\n set_a.update(set_b)\n for elem in set_b:\n mapping[elem] = set_a\n\n def joined(self, a, b):\n """Return whether *a* and *b* are members of the same set."""\n return (self._mapping.get(a, object()) is self._mapping.get(b))\n\n def remove(self, a):\n """Remove *a* from the grouper, doing nothing if it is not there."""\n self._mapping.pop(a, {a}).remove(a)\n self._ordering.pop(a, None)\n\n def __iter__(self):\n """\n Iterate over each of the disjoint sets as a list.\n\n The iterator is invalid if interleaved with calls to join().\n """\n unique_groups = {id(group): group for group in self._mapping.values()}\n for group in unique_groups.values():\n yield sorted(group, key=self._ordering.__getitem__)\n\n def get_siblings(self, a):\n """Return all of the items joined with *a*, including itself."""\n siblings = self._mapping.get(a, [a])\n return sorted(siblings, key=self._ordering.get)\n\n\nclass GrouperView:\n """Immutable view over a `.Grouper`."""\n\n def __init__(self, grouper): self._grouper = grouper\n def __contains__(self, item): return item in self._grouper\n def __iter__(self): return iter(self._grouper)\n\n def joined(self, a, b):\n """\n Return whether *a* and *b* are members of the same set.\n """\n return self._grouper.joined(a, b)\n\n def get_siblings(self, a):\n """\n Return all of the items joined with *a*, including itself.\n """\n return self._grouper.get_siblings(a)\n\n\ndef simple_linear_interpolation(a, steps):\n """\n Resample an array with ``steps - 1`` points between 
original point pairs.\n\n Along each column of *a*, ``(steps - 1)`` points are introduced between\n each original values; the values are linearly interpolated.\n\n Parameters\n ----------\n a : array, shape (n, ...)\n steps : int\n\n Returns\n -------\n array\n shape ``((n - 1) * steps + 1, ...)``\n """\n fps = a.reshape((len(a), -1))\n xp = np.arange(len(a)) * steps\n x = np.arange((len(a) - 1) * steps + 1)\n return (np.column_stack([np.interp(x, xp, fp) for fp in fps.T])\n .reshape((len(x),) + a.shape[1:]))\n\n\ndef delete_masked_points(*args):\n """\n Find all masked and/or non-finite points in a set of arguments,\n and return the arguments with only the unmasked points remaining.\n\n Arguments can be in any of 5 categories:\n\n 1) 1-D masked arrays\n 2) 1-D ndarrays\n 3) ndarrays with more than one dimension\n 4) other non-string iterables\n 5) anything else\n\n The first argument must be in one of the first four categories;\n any argument with a length differing from that of the first\n argument (and hence anything in category 5) then will be\n passed through unchanged.\n\n Masks are obtained from all arguments of the correct length\n in categories 1, 2, and 4; a point is bad if masked in a masked\n array or if it is a nan or inf. 
No attempt is made to\n extract a mask from categories 2, 3, and 4 if `numpy.isfinite`\n does not yield a Boolean array.\n\n All input arguments that are not passed unchanged are returned\n as ndarrays after removing the points or rows corresponding to\n masks in any of the arguments.\n\n A vastly simpler version of this function was originally\n written as a helper for Axes.scatter().\n\n """\n if not len(args):\n return ()\n if is_scalar_or_string(args[0]):\n raise ValueError("First argument must be a sequence")\n nrecs = len(args[0])\n margs = []\n seqlist = [False] * len(args)\n for i, x in enumerate(args):\n if not isinstance(x, str) and np.iterable(x) and len(x) == nrecs:\n seqlist[i] = True\n if isinstance(x, np.ma.MaskedArray):\n if x.ndim > 1:\n raise ValueError("Masked arrays must be 1-D")\n else:\n x = np.asarray(x)\n margs.append(x)\n masks = [] # List of masks that are True where good.\n for i, x in enumerate(margs):\n if seqlist[i]:\n if x.ndim > 1:\n continue # Don't try to get nan locations unless 1-D.\n if isinstance(x, np.ma.MaskedArray):\n masks.append(~np.ma.getmaskarray(x)) # invert the mask\n xd = x.data\n else:\n xd = x\n try:\n mask = np.isfinite(xd)\n if isinstance(mask, np.ndarray):\n masks.append(mask)\n except Exception: # Fixme: put in tuple of possible exceptions?\n pass\n if len(masks):\n mask = np.logical_and.reduce(masks)\n igood = mask.nonzero()[0]\n if len(igood) < nrecs:\n for i, x in enumerate(margs):\n if seqlist[i]:\n margs[i] = x[igood]\n for i, x in enumerate(margs):\n if seqlist[i] and isinstance(x, np.ma.MaskedArray):\n margs[i] = x.filled()\n return margs\n\n\ndef _combine_masks(*args):\n """\n Find all masked and/or non-finite points in a set of arguments,\n and return the arguments as masked arrays with a common mask.\n\n Arguments can be in any of 5 categories:\n\n 1) 1-D masked arrays\n 2) 1-D ndarrays\n 3) ndarrays with more than one dimension\n 4) other non-string iterables\n 5) anything else\n\n The first argument 
must be in one of the first four categories;\n any argument with a length differing from that of the first\n argument (and hence anything in category 5) then will be\n passed through unchanged.\n\n Masks are obtained from all arguments of the correct length\n in categories 1, 2, and 4; a point is bad if masked in a masked\n array or if it is a nan or inf. No attempt is made to\n extract a mask from categories 2 and 4 if `numpy.isfinite`\n does not yield a Boolean array. Category 3 is included to\n support RGB or RGBA ndarrays, which are assumed to have only\n valid values and which are passed through unchanged.\n\n All input arguments that are not passed unchanged are returned\n as masked arrays if any masked points are found, otherwise as\n ndarrays.\n\n """\n if not len(args):\n return ()\n if is_scalar_or_string(args[0]):\n raise ValueError("First argument must be a sequence")\n nrecs = len(args[0])\n margs = [] # Output args; some may be modified.\n seqlist = [False] * len(args) # Flags: True if output will be masked.\n masks = [] # List of masks.\n for i, x in enumerate(args):\n if is_scalar_or_string(x) or len(x) != nrecs:\n margs.append(x) # Leave it unmodified.\n else:\n if isinstance(x, np.ma.MaskedArray) and x.ndim > 1:\n raise ValueError("Masked arrays must be 1-D")\n try:\n x = np.asanyarray(x)\n except (VisibleDeprecationWarning, ValueError):\n # NumPy 1.19 raises a warning about ragged arrays, but we want\n # to accept basically anything here.\n x = np.asanyarray(x, dtype=object)\n if x.ndim == 1:\n x = safe_masked_invalid(x)\n seqlist[i] = True\n if np.ma.is_masked(x):\n masks.append(np.ma.getmaskarray(x))\n margs.append(x) # Possibly modified.\n if len(masks):\n mask = np.logical_or.reduce(masks)\n for i, x in enumerate(margs):\n if seqlist[i]:\n margs[i] = np.ma.array(x, mask=mask)\n return margs\n\n\ndef _broadcast_with_masks(*args, compress=False):\n """\n Broadcast inputs, combining all masked arrays.\n\n Parameters\n ----------\n *args : 
array-like\n The inputs to broadcast.\n compress : bool, default: False\n Whether to compress the masked arrays. If False, the masked values\n are replaced by NaNs.\n\n Returns\n -------\n list of array-like\n The broadcasted and masked inputs.\n """\n # extract the masks, if any\n masks = [k.mask for k in args if isinstance(k, np.ma.MaskedArray)]\n # broadcast to match the shape\n bcast = np.broadcast_arrays(*args, *masks)\n inputs = bcast[:len(args)]\n masks = bcast[len(args):]\n if masks:\n # combine the masks into one\n mask = np.logical_or.reduce(masks)\n # put mask on and compress\n if compress:\n inputs = [np.ma.array(k, mask=mask).compressed()\n for k in inputs]\n else:\n inputs = [np.ma.array(k, mask=mask, dtype=float).filled(np.nan).ravel()\n for k in inputs]\n else:\n inputs = [np.ravel(k) for k in inputs]\n return inputs\n\n\ndef boxplot_stats(X, whis=1.5, bootstrap=None, labels=None, autorange=False):\n r"""\n Return a list of dictionaries of statistics used to draw a series of box\n and whisker plots using `~.Axes.bxp`.\n\n Parameters\n ----------\n X : array-like\n Data that will be represented in the boxplots. Should have 2 or\n fewer dimensions.\n\n whis : float or (float, float), default: 1.5\n The position of the whiskers.\n\n If a float, the lower whisker is at the lowest datum above\n ``Q1 - whis*(Q3-Q1)``, and the upper whisker at the highest datum below\n ``Q3 + whis*(Q3-Q1)``, where Q1 and Q3 are the first and third\n quartiles. The default value of ``whis = 1.5`` corresponds to Tukey's\n original definition of boxplots.\n\n If a pair of floats, they indicate the percentiles at which to draw the\n whiskers (e.g., (5, 95)). 
In particular, setting this to (0, 100)\n results in whiskers covering the whole range of the data.\n\n In the edge case where ``Q1 == Q3``, *whis* is automatically set to\n (0, 100) (cover the whole range of the data) if *autorange* is True.\n\n Beyond the whiskers, data are considered outliers and are plotted as\n individual points.\n\n bootstrap : int, optional\n Number of times the confidence intervals around the median\n should be bootstrapped (percentile method).\n\n labels : list of str, optional\n Labels for each dataset. Length must be compatible with\n dimensions of *X*.\n\n autorange : bool, optional (False)\n When `True` and the data are distributed such that the 25th and 75th\n percentiles are equal, ``whis`` is set to (0, 100) such that the\n whisker ends are at the minimum and maximum of the data.\n\n Returns\n -------\n list of dict\n A list of dictionaries containing the results for each column\n of data. Keys of each dictionary are the following:\n\n ======== ===================================\n Key Value Description\n ======== ===================================\n label tick label for the boxplot\n mean arithmetic mean value\n med 50th percentile\n q1 first quartile (25th percentile)\n q3 third quartile (75th percentile)\n iqr interquartile range\n cilo lower notch around the median\n cihi upper notch around the median\n whislo end of the lower whisker\n whishi end of the upper whisker\n fliers outliers\n ======== ===================================\n\n Notes\n -----\n Non-bootstrapping approach to confidence interval uses Gaussian-based\n asymptotic approximation:\n\n .. math::\n\n \mathrm{med} \pm 1.57 \times \frac{\mathrm{iqr}}{\sqrt{N}}\n\n General approach from:\n McGill, R., Tukey, J.W., and Larsen, W.A. 
(1978) "Variations of\n Boxplots", The American Statistician, 32:12-16.\n """\n\n def _bootstrap_median(data, N=5000):\n # determine 95% confidence intervals of the median\n M = len(data)\n percentiles = [2.5, 97.5]\n\n bs_index = np.random.randint(M, size=(N, M))\n bsData = data[bs_index]\n estimate = np.median(bsData, axis=1, overwrite_input=True)\n\n CI = np.percentile(estimate, percentiles)\n return CI\n\n def _compute_conf_interval(data, med, iqr, bootstrap):\n if bootstrap is not None:\n # Do a bootstrap estimate of notch locations.\n # get conf. intervals around median\n CI = _bootstrap_median(data, N=bootstrap)\n notch_min = CI[0]\n notch_max = CI[1]\n else:\n\n N = len(data)\n notch_min = med - 1.57 * iqr / np.sqrt(N)\n notch_max = med + 1.57 * iqr / np.sqrt(N)\n\n return notch_min, notch_max\n\n # output is a list of dicts\n bxpstats = []\n\n # convert X to a list of lists\n X = _reshape_2D(X, "X")\n\n ncols = len(X)\n if labels is None:\n labels = itertools.repeat(None)\n elif len(labels) != ncols:\n raise ValueError("Dimensions of labels and X must be compatible")\n\n input_whis = whis\n for ii, (x, label) in enumerate(zip(X, labels)):\n\n # empty dict\n stats = {}\n if label is not None:\n stats['label'] = label\n\n # restore whis to the input values in case it got changed in the loop\n whis = input_whis\n\n # note tricksiness, append up here and then mutate below\n bxpstats.append(stats)\n\n # if empty, bail\n if len(x) == 0:\n stats['fliers'] = np.array([])\n stats['mean'] = np.nan\n stats['med'] = np.nan\n stats['q1'] = np.nan\n stats['q3'] = np.nan\n stats['iqr'] = np.nan\n stats['cilo'] = np.nan\n stats['cihi'] = np.nan\n stats['whislo'] = np.nan\n stats['whishi'] = np.nan\n continue\n\n # up-convert to an array, just to be safe\n x = np.ma.asarray(x)\n x = x.data[~x.mask].ravel()\n\n # arithmetic mean\n stats['mean'] = np.mean(x)\n\n # medians and quartiles\n q1, med, q3 = np.percentile(x, [25, 50, 75])\n\n # interquartile range\n stats['iqr'] = 
q3 - q1\n if stats['iqr'] == 0 and autorange:\n whis = (0, 100)\n\n # conf. interval around median\n stats['cilo'], stats['cihi'] = _compute_conf_interval(\n x, med, stats['iqr'], bootstrap\n )\n\n # lowest/highest non-outliers\n if np.iterable(whis) and not isinstance(whis, str):\n loval, hival = np.percentile(x, whis)\n elif np.isreal(whis):\n loval = q1 - whis * stats['iqr']\n hival = q3 + whis * stats['iqr']\n else:\n raise ValueError('whis must be a float or list of percentiles')\n\n # get high extreme\n wiskhi = x[x <= hival]\n if len(wiskhi) == 0 or np.max(wiskhi) < q3:\n stats['whishi'] = q3\n else:\n stats['whishi'] = np.max(wiskhi)\n\n # get low extreme\n wisklo = x[x >= loval]\n if len(wisklo) == 0 or np.min(wisklo) > q1:\n stats['whislo'] = q1\n else:\n stats['whislo'] = np.min(wisklo)\n\n # compute a single array of outliers\n stats['fliers'] = np.concatenate([\n x[x < stats['whislo']],\n x[x > stats['whishi']],\n ])\n\n # add in the remaining stats\n stats['q1'], stats['med'], stats['q3'] = q1, med, q3\n\n return bxpstats\n\n\n#: Maps short codes for line style to their full name used by backends.\nls_mapper = {'-': 'solid', '--': 'dashed', '-.': 'dashdot', ':': 'dotted'}\n#: Maps full names for line styles used by backends to their short codes.\nls_mapper_r = {v: k for k, v in ls_mapper.items()}\n\n\ndef contiguous_regions(mask):\n """\n Return a list of (ind0, ind1) such that ``mask[ind0:ind1].all()`` is\n True and we cover all such regions.\n """\n mask = np.asarray(mask, dtype=bool)\n\n if not mask.size:\n return []\n\n # Find the indices of region changes, and correct offset\n idx, = np.nonzero(mask[:-1] != mask[1:])\n idx += 1\n\n # List operations are faster for moderately sized arrays\n idx = idx.tolist()\n\n # Add first and/or last index if needed\n if mask[0]:\n idx = [0] + idx\n if mask[-1]:\n idx.append(len(mask))\n\n return list(zip(idx[::2], idx[1::2]))\n\n\ndef is_math_text(s):\n """\n Return whether the string *s* contains math 
expressions.\n\n This is done by checking whether *s* contains an even number of\n non-escaped dollar signs.\n """\n s = str(s)\n dollar_count = s.count(r'$') - s.count(r'\$')\n even_dollars = (dollar_count > 0 and dollar_count % 2 == 0)\n return even_dollars\n\n\ndef _to_unmasked_float_array(x):\n """\n Convert a sequence to a float array; if input was a masked array, masked\n values are converted to nans.\n """\n if hasattr(x, 'mask'):\n return np.ma.asarray(x, float).filled(np.nan)\n else:\n return np.asarray(x, float)\n\n\ndef _check_1d(x):\n """Convert scalars to 1D arrays; pass-through arrays as is."""\n # Unpack in case of e.g. Pandas or xarray object\n x = _unpack_to_numpy(x)\n # plot requires `shape` and `ndim`. If passed an\n # object that doesn't provide them, then force to numpy array.\n # Note this will strip unit information.\n if (not hasattr(x, 'shape') or\n not hasattr(x, 'ndim') or\n len(x.shape) < 1):\n return np.atleast_1d(x)\n else:\n return x\n\n\ndef _reshape_2D(X, name):\n """\n Use Fortran ordering to convert ndarrays and lists of iterables to lists of\n 1D arrays.\n\n Lists of iterables are converted by applying `numpy.asanyarray` to each of\n their elements. 1D ndarrays are returned in a singleton list containing\n them. 2D ndarrays are converted to the list of their *columns*.\n\n *name* is used to generate the error message for invalid inputs.\n """\n\n # Unpack in case of e.g. 
Pandas or xarray object\n X = _unpack_to_numpy(X)\n\n # Iterate over columns for ndarrays.\n if isinstance(X, np.ndarray):\n X = X.transpose()\n\n if len(X) == 0:\n return [[]]\n elif X.ndim == 1 and np.ndim(X[0]) == 0:\n # 1D array of scalars: directly return it.\n return [X]\n elif X.ndim in [1, 2]:\n # 2D array, or 1D array of iterables: flatten them first.\n return [np.reshape(x, -1) for x in X]\n else:\n raise ValueError(f'{name} must have 2 or fewer dimensions')\n\n # Iterate over list of iterables.\n if len(X) == 0:\n return [[]]\n\n result = []\n is_1d = True\n for xi in X:\n # check if this is iterable, except for strings which we\n # treat as singletons.\n if not isinstance(xi, str):\n try:\n iter(xi)\n except TypeError:\n pass\n else:\n is_1d = False\n xi = np.asanyarray(xi)\n nd = np.ndim(xi)\n if nd > 1:\n raise ValueError(f'{name} must have 2 or fewer dimensions')\n result.append(xi.reshape(-1))\n\n if is_1d:\n # 1D array of scalars: directly return it.\n return [np.reshape(result, -1)]\n else:\n # 2D array, or 1D array of iterables: use flattened version.\n return result\n\n\ndef violin_stats(X, method, points=100, quantiles=None):\n """\n Return a list of dictionaries of data which can be used to draw a series\n of violin plots.\n\n See the ``Returns`` section below to view the required keys of the\n dictionary.\n\n Users can skip this function and pass a user-defined set of dictionaries\n with the same keys to `~.axes.Axes.violinplot` instead of using Matplotlib\n to do the calculations. See the *Returns* section below for the keys\n that must be present in the dictionaries.\n\n Parameters\n ----------\n X : array-like\n Sample data that will be used to produce the gaussian kernel density\n estimates. Must have 2 or fewer dimensions.\n\n method : callable\n The method used to calculate the kernel density estimate for each\n column of data. 
When called via ``method(v, coords)``, it should\n return a vector of the values of the KDE evaluated at the values\n specified in coords.\n\n points : int, default: 100\n Defines the number of points to evaluate each of the gaussian kernel\n density estimates at.\n\n quantiles : array-like, default: None\n Defines (if not None) a list of floats in interval [0, 1] for each\n column of data, which represents the quantiles that will be rendered\n for that column of data. Must have 2 or fewer dimensions. 1D array will\n be treated as a singleton list containing them.\n\n Returns\n -------\n list of dict\n A list of dictionaries containing the results for each column of data.\n The dictionaries contain at least the following:\n\n - coords: A list of scalars containing the coordinates this particular\n kernel density estimate was evaluated at.\n - vals: A list of scalars containing the values of the kernel density\n estimate at each of the coordinates given in *coords*.\n - mean: The mean value for this column of data.\n - median: The median value for this column of data.\n - min: The minimum value for this column of data.\n - max: The maximum value for this column of data.\n - quantiles: The quantile values for this column of data.\n """\n\n # List of dictionaries describing each of the violins.\n vpstats = []\n\n # Want X to be a list of data sequences\n X = _reshape_2D(X, "X")\n\n # Want quantiles to be as the same shape as data sequences\n if quantiles is not None and len(quantiles) != 0:\n quantiles = _reshape_2D(quantiles, "quantiles")\n # Else, mock quantiles if it's none or empty\n else:\n quantiles = [[]] * len(X)\n\n # quantiles should have the same size as dataset\n if len(X) != len(quantiles):\n raise ValueError("List of violinplot statistics and quantiles values"\n " must have the same length")\n\n # Zip x and quantiles\n for (x, q) in zip(X, quantiles):\n # Dictionary of results for this distribution\n stats = {}\n\n # Calculate basic stats for the 
distribution\n min_val = np.min(x)\n max_val = np.max(x)\n quantile_val = np.percentile(x, 100 * q)\n\n # Evaluate the kernel density estimate\n coords = np.linspace(min_val, max_val, points)\n stats['vals'] = method(x, coords)\n stats['coords'] = coords\n\n # Store additional statistics for this distribution\n stats['mean'] = np.mean(x)\n stats['median'] = np.median(x)\n stats['min'] = min_val\n stats['max'] = max_val\n stats['quantiles'] = np.atleast_1d(quantile_val)\n\n # Append to output\n vpstats.append(stats)\n\n return vpstats\n\n\ndef pts_to_prestep(x, *args):\n """\n Convert continuous line to pre-steps.\n\n Given a set of ``N`` points, convert to ``2N - 1`` points, which when\n connected linearly give a step function which changes values at the\n beginning of the intervals.\n\n Parameters\n ----------\n x : array\n The x location of the steps. May be empty.\n\n y1, ..., yp : array\n y arrays to be turned into steps; all must be the same length as ``x``.\n\n Returns\n -------\n array\n The x and y values converted to steps in the same order as the input;\n can be unpacked as ``x_out, y1_out, ..., yp_out``. If the input is\n length ``N``, each of these arrays will be length ``2N - 1``. For\n ``N=0``, the length will be 0.\n\n Examples\n --------\n >>> x_s, y1_s, y2_s = pts_to_prestep(x, y1, y2)\n """\n steps = np.zeros((1 + len(args), max(2 * len(x) - 1, 0)))\n # In all `pts_to_*step` functions, only assign once using *x* and *args*,\n # as converting to an array may be expensive.\n steps[0, 0::2] = x\n steps[0, 1::2] = steps[0, 0:-2:2]\n steps[1:, 0::2] = args\n steps[1:, 1::2] = steps[1:, 2::2]\n return steps\n\n\ndef pts_to_poststep(x, *args):\n """\n Convert continuous line to post-steps.\n\n Given a set of ``N`` points convert to ``2N - 1`` points, which when\n connected linearly give a step function which changes values at the end of\n the intervals.\n\n Parameters\n ----------\n x : array\n The x location of the steps. 
May be empty.\n\n y1, ..., yp : array\n y arrays to be turned into steps; all must be the same length as ``x``.\n\n Returns\n -------\n array\n The x and y values converted to steps in the same order as the input;\n can be unpacked as ``x_out, y1_out, ..., yp_out``. If the input is\n length ``N``, each of these arrays will be length ``2N - 1``. For\n ``N=0``, the length will be 0.\n\n Examples\n --------\n >>> x_s, y1_s, y2_s = pts_to_poststep(x, y1, y2)\n """\n steps = np.zeros((1 + len(args), max(2 * len(x) - 1, 0)))\n steps[0, 0::2] = x\n steps[0, 1::2] = steps[0, 2::2]\n steps[1:, 0::2] = args\n steps[1:, 1::2] = steps[1:, 0:-2:2]\n return steps\n\n\ndef pts_to_midstep(x, *args):\n """\n Convert continuous line to mid-steps.\n\n Given a set of ``N`` points convert to ``2N`` points which when connected\n linearly give a step function which changes values at the middle of the\n intervals.\n\n Parameters\n ----------\n x : array\n The x location of the steps. May be empty.\n\n y1, ..., yp : array\n y arrays to be turned into steps; all must be the same length as\n ``x``.\n\n Returns\n -------\n array\n The x and y values converted to steps in the same order as the input;\n can be unpacked as ``x_out, y1_out, ..., yp_out``. 
If the input is\n length ``N``, each of these arrays will be length ``2N``.\n\n Examples\n --------\n >>> x_s, y1_s, y2_s = pts_to_midstep(x, y1, y2)\n """\n steps = np.zeros((1 + len(args), 2 * len(x)))\n x = np.asanyarray(x)\n steps[0, 1:-1:2] = steps[0, 2::2] = (x[:-1] + x[1:]) / 2\n steps[0, :1] = x[:1] # Also works for zero-sized input.\n steps[0, -1:] = x[-1:]\n steps[1:, 0::2] = args\n steps[1:, 1::2] = steps[1:, 0::2]\n return steps\n\n\nSTEP_LOOKUP_MAP = {'default': lambda x, y: (x, y),\n 'steps': pts_to_prestep,\n 'steps-pre': pts_to_prestep,\n 'steps-post': pts_to_poststep,\n 'steps-mid': pts_to_midstep}\n\n\ndef index_of(y):\n """\n A helper function to create reasonable x values for the given *y*.\n\n This is used for plotting (x, y) if x values are not explicitly given.\n\n First try ``y.index`` (assuming *y* is a `pandas.Series`), if that\n fails, use ``range(len(y))``.\n\n This will be extended in the future to deal with more types of\n labeled data.\n\n Parameters\n ----------\n y : float or array-like\n\n Returns\n -------\n x, y : ndarray\n The x and y values to plot.\n """\n try:\n return y.index.to_numpy(), y.to_numpy()\n except AttributeError:\n pass\n try:\n y = _check_1d(y)\n except (VisibleDeprecationWarning, ValueError):\n # NumPy 1.19 will warn on ragged input, and we can't actually use it.\n pass\n else:\n return np.arange(y.shape[0], dtype=float), y\n raise ValueError('Input could not be cast to an at-least-1D NumPy array')\n\n\ndef safe_first_element(obj):\n """\n Return the first element in *obj*.\n\n This is a type-independent way of obtaining the first element,\n supporting both index access and the iterator protocol.\n """\n if isinstance(obj, collections.abc.Iterator):\n # needed to accept `array.flat` as input.\n # np.flatiter reports as an instance of collections.Iterator but can still be\n # indexed via []. 
This has the side effect of re-setting the iterator, but\n # that is acceptable.\n try:\n return obj[0]\n except TypeError:\n pass\n raise RuntimeError("matplotlib does not support generators as input")\n return next(iter(obj))\n\n\ndef _safe_first_finite(obj):\n """\n Return the first finite element in *obj* if one is available.\n Otherwise, return the first element.\n\n This is a method for internal use.\n\n This is a type-independent way of obtaining the first finite element, supporting\n both index access and the iterator protocol.\n """\n def safe_isfinite(val):\n if val is None:\n return False\n try:\n return math.isfinite(val)\n except (TypeError, ValueError):\n # if the outer object is 2d, then val is a 1d array, and\n # - math.isfinite(numpy.zeros(3)) raises TypeError\n # - math.isfinite(torch.zeros(3)) raises ValueError\n pass\n try:\n return np.isfinite(val) if np.isscalar(val) else True\n except TypeError:\n # This is something that NumPy cannot make heads or tails of,\n # assume "finite"\n return True\n\n if isinstance(obj, np.flatiter):\n # TODO do the finite filtering on this\n return obj[0]\n elif isinstance(obj, collections.abc.Iterator):\n raise RuntimeError("matplotlib does not support generators as input")\n else:\n for val in obj:\n if safe_isfinite(val):\n return val\n return safe_first_element(obj)\n\n\ndef sanitize_sequence(data):\n """\n Convert dictview objects to list. Other inputs are returned unchanged.\n """\n return (list(data) if isinstance(data, collections.abc.MappingView)\n else data)\n\n\ndef normalize_kwargs(kw, alias_mapping=None):\n """\n Helper function to normalize kwarg inputs.\n\n Parameters\n ----------\n kw : dict or None\n A dict of keyword arguments. 
None is explicitly supported and treated\n as an empty dict, to support functions with an optional parameter of\n the form ``props=None``.\n\n alias_mapping : dict or Artist subclass or Artist instance, optional\n A mapping from a canonical name to a list of aliases, in order of\n precedence from lowest to highest.\n\n If the canonical value is not in the list it is assumed to have the\n highest priority.\n\n If an Artist subclass or instance is passed, use its properties alias\n mapping.\n\n Raises\n ------\n TypeError\n To match what Python raises if invalid arguments/keyword arguments are\n passed to a callable.\n """\n from matplotlib.artist import Artist\n\n if kw is None:\n return {}\n\n # deal with default value of alias_mapping\n if alias_mapping is None:\n alias_mapping = {}\n elif (isinstance(alias_mapping, type) and issubclass(alias_mapping, Artist)\n or isinstance(alias_mapping, Artist)):\n alias_mapping = getattr(alias_mapping, "_alias_map", {})\n\n to_canonical = {alias: canonical\n for canonical, alias_list in alias_mapping.items()\n for alias in alias_list}\n canonical_to_seen = {}\n ret = {} # output dictionary\n\n for k, v in kw.items():\n canonical = to_canonical.get(k, k)\n if canonical in canonical_to_seen:\n raise TypeError(f"Got both {canonical_to_seen[canonical]!r} and "\n f"{k!r}, which are aliases of one another")\n canonical_to_seen[canonical] = k\n ret[canonical] = v\n\n return ret\n\n\n@contextlib.contextmanager\ndef _lock_path(path):\n """\n Context manager for locking a path.\n\n Usage::\n\n with _lock_path(path):\n ...\n\n Another thread or process that attempts to lock the same path will wait\n until this context manager is exited.\n\n The lock is implemented by creating a temporary file in the parent\n directory, so that directory must exist and be writable.\n """\n path = Path(path)\n lock_path = path.with_name(path.name + ".matplotlib-lock")\n retries = 50\n sleeptime = 0.1\n for _ in range(retries):\n try:\n with 
lock_path.open("xb"):\n break\n except FileExistsError:\n time.sleep(sleeptime)\n else:\n raise TimeoutError("""\\nLock error: Matplotlib failed to acquire the following lock file:\n {}\nThis may be due to another process holding this lock file. If you are sure no\nother Matplotlib process is running, remove this file and try again.""".format(\n lock_path))\n try:\n yield\n finally:\n lock_path.unlink()\n\n\ndef _topmost_artist(\n artists,\n _cached_max=functools.partial(max, key=operator.attrgetter("zorder"))):\n """\n Get the topmost artist of a list.\n\n In case of a tie, return the *last* of the tied artists, as it will be\n drawn on top of the others. `max` returns the first maximum in case of\n ties, so we need to iterate over the list in reverse order.\n """\n return _cached_max(reversed(artists))\n\n\ndef _str_equal(obj, s):\n """\n Return whether *obj* is a string equal to string *s*.\n\n This helper solely exists to handle the case where *obj* is a numpy array,\n because in such cases, a naive ``obj == s`` would yield an array, which\n cannot be used in a boolean context.\n """\n return isinstance(obj, str) and obj == s\n\n\ndef _str_lower_equal(obj, s):\n """\n Return whether *obj* is a string equal, when lowercased, to string *s*.\n\n This helper solely exists to handle the case where *obj* is a numpy array,\n because in such cases, a naive ``obj == s`` would yield an array, which\n cannot be used in a boolean context.\n """\n return isinstance(obj, str) and obj.lower() == s\n\n\ndef _array_perimeter(arr):\n """\n Get the elements on the perimeter of *arr*.\n\n Parameters\n ----------\n arr : ndarray, shape (M, N)\n The input array.\n\n Returns\n -------\n ndarray, shape (2*(M - 1) + 2*(N - 1),)\n The elements on the perimeter of the array::\n\n [arr[0, 0], ..., arr[0, -1], ..., arr[-1, -1], ..., arr[-1, 0], ...]\n\n Examples\n --------\n >>> i, j = np.ogrid[:3, :4]\n >>> a = i*10 + j\n >>> a\n array([[ 0, 1, 2, 3],\n [10, 11, 12, 13],\n [20, 21, 22, 
23]])\n >>> _array_perimeter(a)\n array([ 0, 1, 2, 3, 13, 23, 22, 21, 20, 10])\n """\n # note we use Python's half-open ranges to avoid repeating\n # the corners\n forward = np.s_[0:-1] # [0 ... -1)\n backward = np.s_[-1:0:-1] # [-1 ... 0)\n return np.concatenate((\n arr[0, forward],\n arr[forward, -1],\n arr[-1, backward],\n arr[backward, 0],\n ))\n\n\ndef _unfold(arr, axis, size, step):\n """\n Append an extra dimension containing sliding windows along *axis*.\n\n All windows are of size *size* and begin with every *step* elements.\n\n Parameters\n ----------\n arr : ndarray, shape (N_1, ..., N_k)\n The input array\n axis : int\n Axis along which the windows are extracted\n size : int\n Size of the windows\n step : int\n Stride between first elements of subsequent windows.\n\n Returns\n -------\n ndarray, shape (N_1, ..., 1 + (N_axis-size)/step, ..., N_k, size)\n\n Examples\n --------\n >>> i, j = np.ogrid[:3, :7]\n >>> a = i*10 + j\n >>> a\n array([[ 0, 1, 2, 3, 4, 5, 6],\n [10, 11, 12, 13, 14, 15, 16],\n [20, 21, 22, 23, 24, 25, 26]])\n >>> _unfold(a, axis=1, size=3, step=2)\n array([[[ 0, 1, 2],\n [ 2, 3, 4],\n [ 4, 5, 6]],\n [[10, 11, 12],\n [12, 13, 14],\n [14, 15, 16]],\n [[20, 21, 22],\n [22, 23, 24],\n [24, 25, 26]]])\n """\n new_shape = [*arr.shape, size]\n new_strides = [*arr.strides, arr.strides[axis]]\n new_shape[axis] = (new_shape[axis] - size) // step + 1\n new_strides[axis] = new_strides[axis] * step\n return np.lib.stride_tricks.as_strided(arr,\n shape=new_shape,\n strides=new_strides,\n writeable=False)\n\n\ndef _array_patch_perimeters(x, rstride, cstride):\n """\n Extract perimeters of patches from *arr*.\n\n Extracted patches are of size (*rstride* + 1) x (*cstride* + 1) and\n share perimeters with their neighbors. 
The ordering of the vertices matches\n that returned by ``_array_perimeter``.\n\n Parameters\n ----------\n x : ndarray, shape (N, M)\n Input array\n rstride : int\n Vertical (row) stride between corresponding elements of each patch\n cstride : int\n Horizontal (column) stride between corresponding elements of each patch\n\n Returns\n -------\n ndarray, shape (N/rstride * M/cstride, 2 * (rstride + cstride))\n """\n assert rstride > 0 and cstride > 0\n assert (x.shape[0] - 1) % rstride == 0\n assert (x.shape[1] - 1) % cstride == 0\n # We build up each perimeter from four half-open intervals. Here is an\n # illustrated explanation for rstride == cstride == 3\n #\n # T T T R\n # L R\n # L R\n # L B B B\n #\n # where T means that this element will be in the top array, R for right,\n # B for bottom and L for left. Each of the arrays below has a shape of:\n #\n # (number of perimeters that can be extracted vertically,\n # number of perimeters that can be extracted horizontally,\n # cstride for top and bottom and rstride for left and right)\n #\n # Note that _unfold doesn't incur any memory copies, so the only costly\n # operation here is the np.concatenate.\n top = _unfold(x[:-1:rstride, :-1], 1, cstride, cstride)\n bottom = _unfold(x[rstride::rstride, 1:], 1, cstride, cstride)[..., ::-1]\n right = _unfold(x[:-1, cstride::cstride], 0, rstride, rstride)\n left = _unfold(x[1:, :-1:cstride], 0, rstride, rstride)[..., ::-1]\n return (np.concatenate((top, right, bottom, left), axis=2)\n .reshape(-1, 2 * (rstride + cstride)))\n\n\n@contextlib.contextmanager\ndef _setattr_cm(obj, **kwargs):\n """\n Temporarily set some attributes; restore original state at context exit.\n """\n sentinel = object()\n origs = {}\n for attr in kwargs:\n orig = getattr(obj, attr, sentinel)\n if attr in obj.__dict__ or orig is sentinel:\n # if we are pulling from the instance dict or the object\n # does not have this attribute we can trust the above\n origs[attr] = orig\n else:\n # if the attribute 
is not in the instance dict it must be\n # from the class level\n cls_orig = getattr(type(obj), attr)\n # if we are dealing with a property (but not a general descriptor)\n # we want to set the original value back.\n if isinstance(cls_orig, property):\n origs[attr] = orig\n # otherwise this is _something_ we are going to shadow at\n # the instance dict level from higher up in the MRO. We\n # are going to assume we can delattr(obj, attr) to clean\n # up after ourselves. It is possible that this code will\n # fail if used with a non-property custom descriptor which\n # implements __set__ (and __delete__ does not act like a\n # stack). However, this is an internal tool and we do not\n # currently have any custom descriptors.\n else:\n origs[attr] = sentinel\n\n try:\n for attr, val in kwargs.items():\n setattr(obj, attr, val)\n yield\n finally:\n for attr, orig in origs.items():\n if orig is sentinel:\n delattr(obj, attr)\n else:\n setattr(obj, attr, orig)\n\n\nclass _OrderedSet(collections.abc.MutableSet):\n def __init__(self):\n self._od = collections.OrderedDict()\n\n def __contains__(self, key):\n return key in self._od\n\n def __iter__(self):\n return iter(self._od)\n\n def __len__(self):\n return len(self._od)\n\n def add(self, key):\n self._od.pop(key, None)\n self._od[key] = None\n\n def discard(self, key):\n self._od.pop(key, None)\n\n\n# Agg's buffers are unmultiplied RGBA8888, which neither PyQt<=5.1 nor cairo\n# support; however, both do support premultiplied ARGB32.\n\n\ndef _premultiplied_argb32_to_unmultiplied_rgba8888(buf):\n """\n Convert a premultiplied ARGB32 buffer to an unmultiplied RGBA8888 buffer.\n """\n rgba = np.take( # .take() ensures C-contiguity of the result.\n buf,\n [2, 1, 0, 3] if sys.byteorder == "little" else [1, 2, 3, 0], axis=2)\n rgb = rgba[..., :-1]\n alpha = rgba[..., -1]\n # Un-premultiply alpha. 
The formula is the same as in cairo-png.c.\n mask = alpha != 0\n for channel in np.rollaxis(rgb, -1):\n channel[mask] = (\n (channel[mask].astype(int) * 255 + alpha[mask] // 2)\n // alpha[mask])\n return rgba\n\n\ndef _unmultiplied_rgba8888_to_premultiplied_argb32(rgba8888):\n """\n Convert an unmultiplied RGBA8888 buffer to a premultiplied ARGB32 buffer.\n """\n if sys.byteorder == "little":\n argb32 = np.take(rgba8888, [2, 1, 0, 3], axis=2)\n rgb24 = argb32[..., :-1]\n alpha8 = argb32[..., -1:]\n else:\n argb32 = np.take(rgba8888, [3, 0, 1, 2], axis=2)\n alpha8 = argb32[..., :1]\n rgb24 = argb32[..., 1:]\n # Only bother premultiplying when the alpha channel is not fully opaque,\n # as the cost is not negligible. The unsafe cast is needed to do the\n # multiplication in-place in an integer buffer.\n if alpha8.min() != 0xff:\n np.multiply(rgb24, alpha8 / 0xff, out=rgb24, casting="unsafe")\n return argb32\n\n\ndef _get_nonzero_slices(buf):\n """\n Return the bounds of the nonzero region of a 2D array as a pair of slices.\n\n ``buf[_get_nonzero_slices(buf)]`` is the smallest sub-rectangle in *buf*\n that encloses all non-zero entries in *buf*. 
If *buf* is fully zero, then\n ``(slice(0, 0), slice(0, 0))`` is returned.\n """\n x_nz, = buf.any(axis=0).nonzero()\n y_nz, = buf.any(axis=1).nonzero()\n if len(x_nz) and len(y_nz):\n l, r = x_nz[[0, -1]]\n b, t = y_nz[[0, -1]]\n return slice(b, t + 1), slice(l, r + 1)\n else:\n return slice(0, 0), slice(0, 0)\n\n\ndef _pformat_subprocess(command):\n """Pretty-format a subprocess command for printing/logging purposes."""\n return (command if isinstance(command, str)\n else " ".join(shlex.quote(os.fspath(arg)) for arg in command))\n\n\ndef _check_and_log_subprocess(command, logger, **kwargs):\n """\n Run *command*, returning its stdout output if it succeeds.\n\n If it fails (exits with nonzero return code), raise an exception whose text\n includes the failed command and captured stdout and stderr output.\n\n Regardless of the return code, the command is logged at DEBUG level on\n *logger*. In case of success, the output is likewise logged.\n """\n logger.debug('%s', _pformat_subprocess(command))\n proc = subprocess.run(command, capture_output=True, **kwargs)\n if proc.returncode:\n stdout = proc.stdout\n if isinstance(stdout, bytes):\n stdout = stdout.decode()\n stderr = proc.stderr\n if isinstance(stderr, bytes):\n stderr = stderr.decode()\n raise RuntimeError(\n f"The command\n"\n f" {_pformat_subprocess(command)}\n"\n f"failed and generated the following output:\n"\n f"{stdout}\n"\n f"and the following error:\n"\n f"{stderr}")\n if proc.stdout:\n logger.debug("stdout:\n%s", proc.stdout)\n if proc.stderr:\n logger.debug("stderr:\n%s", proc.stderr)\n return proc.stdout\n\n\ndef _setup_new_guiapp():\n """\n Perform OS-dependent setup when Matplotlib creates a new GUI application.\n """\n # Windows: If no explicit app user model id has been set yet (so we're not\n # already embedded), then set it to "matplotlib", so that taskbar icons are\n # correct.\n try:\n _c_internal_utils.Win32_GetCurrentProcessExplicitAppUserModelID()\n except OSError:\n 
_c_internal_utils.Win32_SetCurrentProcessExplicitAppUserModelID(\n "matplotlib")\n\n\ndef _format_approx(number, precision):\n """\n Format the number with at most the number of decimals given as precision.\n Remove trailing zeros and possibly the decimal point.\n """\n return f'{number:.{precision}f}'.rstrip('0').rstrip('.') or '0'\n\n\ndef _g_sig_digits(value, delta):\n """\n Return the number of significant digits to %g-format *value*, assuming that\n it is known with an error of *delta*.\n """\n if delta == 0:\n if value == 0:\n # if both value and delta are 0, np.spacing below returns 5e-324\n # which results in rather silly results\n return 3\n # delta = 0 may occur when trying to format values over a tiny range;\n # in that case, replace it by the distance to the closest float.\n delta = abs(np.spacing(value))\n # If e.g. value = 45.67 and delta = 0.02, then we want to round to 2 digits\n # after the decimal point (floor(log10(0.02)) = -2); 45.67 contributes 2\n # digits before the decimal point (floor(log10(45.67)) + 1 = 2): the total\n # is 4 significant digits. 
A value of 0 contributes 1 "digit" before the\n # decimal point.\n # For inf or nan, the precision doesn't matter.\n return max(\n 0,\n (math.floor(math.log10(abs(value))) + 1 if value else 1)\n - math.floor(math.log10(delta))) if math.isfinite(value) else 0\n\n\ndef _unikey_or_keysym_to_mplkey(unikey, keysym):\n """\n Convert a Unicode key or X keysym to a Matplotlib key name.\n\n The Unicode key is checked first; this avoids having to list most printable\n keysyms such as ``EuroSign``.\n """\n # For non-printable characters, gtk3 passes "\0" whereas tk passes an empty string "".\n if unikey and unikey.isprintable():\n return unikey\n key = keysym.lower()\n if key.startswith("kp_"): # keypad_x (including kp_enter).\n key = key[3:]\n if key.startswith("page_"): # page_{up,down}\n key = key.replace("page_", "page")\n if key.endswith(("_l", "_r")): # alt_l, ctrl_l, shift_l.\n key = key[:-2]\n if sys.platform == "darwin" and key == "meta":\n # meta should be reported as command on mac\n key = "cmd"\n key = {\n "return": "enter",\n "prior": "pageup", # Used by tk.\n "next": "pagedown", # Used by tk.\n }.get(key, key)\n return key\n\n\n@functools.cache\ndef _make_class_factory(mixin_class, fmt, attr_name=None):\n """\n Return a function that creates picklable classes inheriting from a mixin.\n\n After ::\n\n factory = _make_class_factory(FooMixin, fmt, attr_name)\n FooAxes = factory(Axes)\n\n ``FooAxes`` is a class that inherits from ``FooMixin`` and ``Axes`` and **is\n picklable** (picklability is what differentiates this from a plain call to\n `type`). 
Its ``__name__`` is set to ``fmt.format(Axes.__name__)`` and the\n base class is stored in the ``attr_name`` attribute, if not None.\n\n Moreover, the return value of ``factory`` is memoized: calls with the same\n ``Axes`` class always return the same subclass.\n """\n\n @functools.cache\n def class_factory(axes_class):\n # if we have already wrapped this class, declare victory!\n if issubclass(axes_class, mixin_class):\n return axes_class\n\n # The parameter is named "axes_class" for backcompat but is really just\n # a base class; no axes semantics are used.\n base_class = axes_class\n\n class subcls(mixin_class, base_class):\n # Better approximation than __module__ = "matplotlib.cbook".\n __module__ = mixin_class.__module__\n\n def __reduce__(self):\n return (_picklable_class_constructor,\n (mixin_class, fmt, attr_name, base_class),\n self.__getstate__())\n\n subcls.__name__ = subcls.__qualname__ = fmt.format(base_class.__name__)\n if attr_name is not None:\n setattr(subcls, attr_name, base_class)\n return subcls\n\n class_factory.__module__ = mixin_class.__module__\n return class_factory\n\n\ndef _picklable_class_constructor(mixin_class, fmt, attr_name, base_class):\n """Internal helper for _make_class_factory."""\n factory = _make_class_factory(mixin_class, fmt, attr_name)\n cls = factory(base_class)\n return cls.__new__(cls)\n\n\ndef _is_torch_array(x):\n """Check if 'x' is a PyTorch Tensor."""\n try:\n # we're intentionally not attempting to import torch. If somebody\n # has created a torch array, torch should already be in sys.modules\n return isinstance(x, sys.modules['torch'].Tensor)\n except Exception: # TypeError, KeyError, AttributeError, maybe others?\n # we're attempting to access attributes on imported modules which\n # may have arbitrary user code, so we deliberately catch all exceptions\n return False\n\n\ndef _is_jax_array(x):\n """Check if 'x' is a JAX Array."""\n try:\n # we're intentionally not attempting to import jax. 
If somebody\n # has created a jax array, jax should already be in sys.modules\n return isinstance(x, sys.modules['jax'].Array)\n except Exception: # TypeError, KeyError, AttributeError, maybe others?\n # we're attempting to access attributes on imported modules which\n # may have arbitrary user code, so we deliberately catch all exceptions\n return False\n\n\ndef _is_tensorflow_array(x):\n """Check if 'x' is a TensorFlow Tensor or Variable."""\n try:\n # we're intentionally not attempting to import TensorFlow. If somebody\n # has created a TensorFlow array, TensorFlow should already be in sys.modules\n # we use `is_tensor` to not depend on the class structure of TensorFlow\n # arrays, as `tf.Variables` are not instances of `tf.Tensor`\n # (they both convert the same way)\n return bool(sys.modules['tensorflow'].is_tensor(x))\n except Exception: # TypeError, KeyError, AttributeError, maybe others?\n # we're attempting to access attributes on imported modules which\n # may have arbitrary user code, so we deliberately catch all exceptions\n return False\n\n\ndef _unpack_to_numpy(x):\n """Internal helper to extract data from e.g. 
pandas and xarray objects."""\n if isinstance(x, np.ndarray):\n # If numpy, return directly\n return x\n if hasattr(x, 'to_numpy'):\n # Assume that any to_numpy() method actually returns a numpy array\n return x.to_numpy()\n if hasattr(x, 'values'):\n xtmp = x.values\n # For example a dict has a 'values' attribute, but it is not a property\n # so in this case we do not want to return a function\n if isinstance(xtmp, np.ndarray):\n return xtmp\n if _is_torch_array(x) or _is_jax_array(x) or _is_tensorflow_array(x):\n # using np.asarray() instead of explicitly __array__(), as the latter is\n # only _one_ of many methods, and it's the last resort, see also\n # https://numpy.org/devdocs/user/basics.interoperability.html#using-arbitrary-objects-in-numpy\n # therefore, let arrays do better if they can\n xtmp = np.asarray(x)\n\n # In case np.asarray method does not return a numpy array in future\n if isinstance(xtmp, np.ndarray):\n return xtmp\n return x\n\n\ndef _auto_format_str(fmt, value):\n """\n Apply *value* to the format string *fmt*.\n\n This works both with unnamed %-style formatting and\n unnamed {}-style formatting. %-style formatting has priority.\n If *fmt* is %-style formattable that will be used. Otherwise,\n {}-formatting is applied. Strings without formatting placeholders\n are passed through as is.\n\n Examples\n --------\n >>> _auto_format_str('%.2f m', 0.2)\n '0.20 m'\n >>> _auto_format_str('{} m', 0.2)\n '0.2 m'\n >>> _auto_format_str('const', 0.2)\n 'const'\n >>> _auto_format_str('%d or {}', 0.2)\n '0 or {}'\n """\n try:\n return fmt % (value,)\n except (TypeError, ValueError):\n return fmt.format(value)\n\n\ndef _is_pandas_dataframe(x):\n """Check if 'x' is a Pandas DataFrame."""\n try:\n # we're intentionally not attempting to import Pandas. 
If somebody\n # has created a Pandas DataFrame, Pandas should already be in sys.modules\n return isinstance(x, sys.modules['pandas'].DataFrame)\n except Exception: # TypeError, KeyError, AttributeError, maybe others?\n # we're attempting to access attributes on imported modules which\n # may have arbitrary user code, so we deliberately catch all exceptions\n return False\n | .venv\Lib\site-packages\matplotlib\cbook.py | cbook.py | Python | 80,161 | 0.75 | 0.2 | 0.095713 | python-kit | 678 | 2024-09-02T09:29:12.916708 | Apache-2.0 | false | e42dafc0f7442bff4c0127c9b3198b5e |
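The `_get_nonzero_slices` helper in the cbook.py chunk above is self-contained enough to demonstrate in isolation. The sketch below reimplements its bounding-box logic as a standalone function (the name `get_nonzero_slices` and the demo array are illustrative, not part of matplotlib's public API):

```python
import numpy as np

def get_nonzero_slices(buf):
    """Return (row_slice, col_slice) bounding all nonzero entries of a 2D array."""
    # Columns (x) that contain any nonzero value, and likewise for rows (y).
    x_nz, = buf.any(axis=0).nonzero()
    y_nz, = buf.any(axis=1).nonzero()
    if len(x_nz) and len(y_nz):
        l, r = x_nz[[0, -1]]   # leftmost / rightmost nonzero column
        b, t = y_nz[[0, -1]]   # first / last nonzero row
        return slice(int(b), int(t) + 1), slice(int(l), int(r) + 1)
    # Fully-zero input: an empty rectangle.
    return slice(0, 0), slice(0, 0)

buf = np.zeros((5, 6))
buf[1:3, 2:5] = 1.0
print(get_nonzero_slices(buf))  # (slice(1, 3, None), slice(2, 5, None))
```

Indexing with the result, `buf[get_nonzero_slices(buf)]`, yields the smallest sub-rectangle enclosing all nonzero entries — here the 2x3 block of ones.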
import collections.abc\nfrom collections.abc import Callable, Collection, Generator, Iterable, Iterator\nimport contextlib\nimport os\nfrom pathlib import Path\n\nfrom matplotlib.artist import Artist\n\nimport numpy as np\nfrom numpy.typing import ArrayLike\n\nfrom typing import (\n Any,\n Generic,\n IO,\n Literal,\n TypeVar,\n overload,\n)\n\n_T = TypeVar("_T")\n\ndef _get_running_interactive_framework() -> str | None: ...\n\nclass CallbackRegistry:\n exception_handler: Callable[[Exception], Any]\n callbacks: dict[Any, dict[int, Any]]\n def __init__(\n self,\n exception_handler: Callable[[Exception], Any] | None = ...,\n *,\n signals: Iterable[Any] | None = ...,\n ) -> None: ...\n def connect(self, signal: Any, func: Callable) -> int: ...\n def disconnect(self, cid: int) -> None: ...\n def process(self, s: Any, *args, **kwargs) -> None: ...\n def blocked(\n self, *, signal: Any | None = ...\n ) -> contextlib.AbstractContextManager[None]: ...\n\nclass silent_list(list[_T]):\n type: str | None\n def __init__(self, type: str | None, seq: Iterable[_T] | None = ...) 
-> None: ...\n\ndef strip_math(s: str) -> str: ...\ndef is_writable_file_like(obj: Any) -> bool: ...\ndef file_requires_unicode(x: Any) -> bool: ...\n@overload\ndef to_filehandle(\n fname: str | os.PathLike | IO,\n flag: str = ...,\n return_opened: Literal[False] = ...,\n encoding: str | None = ...,\n) -> IO: ...\n@overload\ndef to_filehandle(\n fname: str | os.PathLike | IO,\n flag: str,\n return_opened: Literal[True],\n encoding: str | None = ...,\n) -> tuple[IO, bool]: ...\n@overload\ndef to_filehandle(\n fname: str | os.PathLike | IO,\n *, # if flag given, will match previous sig\n return_opened: Literal[True],\n encoding: str | None = ...,\n) -> tuple[IO, bool]: ...\ndef open_file_cm(\n path_or_file: str | os.PathLike | IO,\n mode: str = ...,\n encoding: str | None = ...,\n) -> contextlib.AbstractContextManager[IO]: ...\ndef is_scalar_or_string(val: Any) -> bool: ...\n@overload\ndef get_sample_data(\n fname: str | os.PathLike, asfileobj: Literal[True] = ...\n) -> np.ndarray | IO: ...\n@overload\ndef get_sample_data(fname: str | os.PathLike, asfileobj: Literal[False]) -> str: ...\ndef _get_data_path(*args: Path | str) -> Path: ...\ndef flatten(\n seq: Iterable[Any], scalarp: Callable[[Any], bool] = ...\n) -> Generator[Any, None, None]: ...\n\nclass _Stack(Generic[_T]):\n def __init__(self) -> None: ...\n def clear(self) -> None: ...\n def __call__(self) -> _T: ...\n def __len__(self) -> int: ...\n def __getitem__(self, ind: int) -> _T: ...\n def forward(self) -> _T: ...\n def back(self) -> _T: ...\n def push(self, o: _T) -> _T: ...\n def home(self) -> _T: ...\n\ndef safe_masked_invalid(x: ArrayLike, copy: bool = ...) -> np.ndarray: ...\ndef print_cycles(\n objects: Iterable[Any], outstream: IO = ..., show_progress: bool = ...\n) -> None: ...\n\nclass Grouper(Generic[_T]):\n def __init__(self, init: Iterable[_T] = ...) 
-> None: ...\n def __contains__(self, item: _T) -> bool: ...\n def join(self, a: _T, *args: _T) -> None: ...\n def joined(self, a: _T, b: _T) -> bool: ...\n def remove(self, a: _T) -> None: ...\n def __iter__(self) -> Iterator[list[_T]]: ...\n def get_siblings(self, a: _T) -> list[_T]: ...\n\nclass GrouperView(Generic[_T]):\n def __init__(self, grouper: Grouper[_T]) -> None: ...\n def __contains__(self, item: _T) -> bool: ...\n def __iter__(self) -> Iterator[list[_T]]: ...\n def joined(self, a: _T, b: _T) -> bool: ...\n def get_siblings(self, a: _T) -> list[_T]: ...\n\ndef simple_linear_interpolation(a: ArrayLike, steps: int) -> np.ndarray: ...\ndef delete_masked_points(*args): ...\ndef _broadcast_with_masks(*args: ArrayLike, compress: bool = ...) -> list[ArrayLike]: ...\ndef boxplot_stats(\n X: ArrayLike,\n whis: float | tuple[float, float] = ...,\n bootstrap: int | None = ...,\n labels: ArrayLike | None = ...,\n autorange: bool = ...,\n) -> list[dict[str, Any]]: ...\n\nls_mapper: dict[str, str]\nls_mapper_r: dict[str, str]\n\ndef contiguous_regions(mask: ArrayLike) -> list[np.ndarray]: ...\ndef is_math_text(s: str) -> bool: ...\ndef violin_stats(\n X: ArrayLike, method: Callable, points: int = ..., quantiles: ArrayLike | None = ...\n) -> list[dict[str, Any]]: ...\ndef pts_to_prestep(x: ArrayLike, *args: ArrayLike) -> np.ndarray: ...\ndef pts_to_poststep(x: ArrayLike, *args: ArrayLike) -> np.ndarray: ...\ndef pts_to_midstep(x: np.ndarray, *args: np.ndarray) -> np.ndarray: ...\n\nSTEP_LOOKUP_MAP: dict[str, Callable]\n\ndef index_of(y: float | ArrayLike) -> tuple[np.ndarray, np.ndarray]: ...\ndef safe_first_element(obj: Collection[_T]) -> _T: ...\ndef sanitize_sequence(data): ...\ndef normalize_kwargs(\n kw: dict[str, Any],\n alias_mapping: dict[str, list[str]] | type[Artist] | Artist | None = ...,\n) -> dict[str, Any]: ...\ndef _lock_path(path: str | os.PathLike) -> contextlib.AbstractContextManager[None]: ...\ndef _str_equal(obj: Any, s: str) -> bool: ...\ndef 
_str_lower_equal(obj: Any, s: str) -> bool: ...\ndef _array_perimeter(arr: np.ndarray) -> np.ndarray: ...\ndef _unfold(arr: np.ndarray, axis: int, size: int, step: int) -> np.ndarray: ...\ndef _array_patch_perimeters(x: np.ndarray, rstride: int, cstride: int) -> np.ndarray: ...\ndef _setattr_cm(obj: Any, **kwargs) -> contextlib.AbstractContextManager[None]: ...\n\nclass _OrderedSet(collections.abc.MutableSet):\n def __init__(self) -> None: ...\n def __contains__(self, key) -> bool: ...\n def __iter__(self): ...\n def __len__(self) -> int: ...\n def add(self, key) -> None: ...\n def discard(self, key) -> None: ...\n\ndef _setup_new_guiapp() -> None: ...\ndef _format_approx(number: float, precision: int) -> str: ...\ndef _g_sig_digits(value: float, delta: float) -> int: ...\ndef _unikey_or_keysym_to_mplkey(unikey: str, keysym: str) -> str: ...\ndef _is_torch_array(x: Any) -> bool: ...\ndef _is_jax_array(x: Any) -> bool: ...\ndef _unpack_to_numpy(x: Any) -> Any: ...\ndef _auto_format_str(fmt: str, value: Any) -> str: ...\n | .venv\Lib\site-packages\matplotlib\cbook.pyi | cbook.pyi | Other | 6,037 | 0.95 | 0.485549 | 0.012987 | awesome-app | 345 | 2024-12-28T03:37:19.011788 | GPL-3.0 | false | ece8cdbbb2b4fe042e152f2cfa2f45b5 |
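The `to_filehandle` stub in the chunk above uses `typing.overload` with `Literal` flag values so a type checker knows whether a bare handle or a `(handle, opened)` pair comes back. A minimal standalone sketch of the same pattern (the function `as_filehandle` and its behavior are illustrative, not matplotlib's):

```python
import io
from typing import IO, Literal, overload

@overload
def as_filehandle(text: str, return_opened: Literal[False] = ...) -> IO[str]: ...
@overload
def as_filehandle(text: str, return_opened: Literal[True]) -> tuple[IO[str], bool]: ...

def as_filehandle(text: str, return_opened: bool = False):
    # A single runtime implementation backs both declared signatures;
    # the @overload-decorated stubs exist only for the type checker.
    fh = io.StringIO(text)
    return (fh, True) if return_opened else fh

fh, opened = as_filehandle("hello", True)
print(opened, fh.read())  # True hello
```

The payoff is at call sites: `as_filehandle(s)` type-checks as `IO[str]`, while `as_filehandle(s, True)` type-checks as the tuple, without any runtime branching in the caller.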
"""\nBuiltin colormaps, colormap handling utilities, and the `ScalarMappable` mixin.\n\n.. seealso::\n\n :doc:`/gallery/color/colormap_reference` for a list of builtin colormaps.\n\n :ref:`colormap-manipulation` for examples of how to make\n colormaps.\n\n :ref:`colormaps` an in-depth discussion of choosing\n colormaps.\n\n :ref:`colormapnorms` for more details about data normalization.\n"""\n\nfrom collections.abc import Mapping\n\nimport matplotlib as mpl\nfrom matplotlib import _api, colors\n# TODO make this warn on access\nfrom matplotlib.colorizer import _ScalarMappable as ScalarMappable # noqa\nfrom matplotlib._cm import datad\nfrom matplotlib._cm_listed import cmaps as cmaps_listed\nfrom matplotlib._cm_multivar import cmap_families as multivar_cmaps\nfrom matplotlib._cm_bivar import cmaps as bivar_cmaps\n\n\n_LUTSIZE = mpl.rcParams['image.lut']\n\n\ndef _gen_cmap_registry():\n """\n Generate a dict mapping standard colormap names to standard colormaps, as\n well as the reversed colormaps.\n """\n cmap_d = {**cmaps_listed}\n for name, spec in datad.items():\n cmap_d[name] = ( # Precache the cmaps at a fixed lutsize..\n colors.LinearSegmentedColormap(name, spec, _LUTSIZE)\n if 'red' in spec else\n colors.ListedColormap(spec['listed'], name)\n if 'listed' in spec else\n colors.LinearSegmentedColormap.from_list(name, spec, _LUTSIZE))\n\n # Register colormap aliases for gray and grey.\n aliases = {\n # alias -> original name\n 'grey': 'gray',\n 'gist_grey': 'gist_gray',\n 'gist_yerg': 'gist_yarg',\n 'Grays': 'Greys',\n }\n for alias, original_name in aliases.items():\n cmap = cmap_d[original_name].copy()\n cmap.name = alias\n cmap_d[alias] = cmap\n\n # Generate reversed cmaps.\n for cmap in list(cmap_d.values()):\n rmap = cmap.reversed()\n cmap_d[rmap.name] = rmap\n return cmap_d\n\n\nclass ColormapRegistry(Mapping):\n r"""\n Container for colormaps that are known to Matplotlib by name.\n\n The universal registry instance is `matplotlib.colormaps`. 
There should be\n no need for users to instantiate `.ColormapRegistry` themselves.\n\n Read access uses a dict-like interface mapping names to `.Colormap`\s::\n\n import matplotlib as mpl\n cmap = mpl.colormaps['viridis']\n\n Returned `.Colormap`\s are copies, so that their modification does not\n change the global definition of the colormap.\n\n Additional colormaps can be added via `.ColormapRegistry.register`::\n\n mpl.colormaps.register(my_colormap)\n\n To get a list of all registered colormaps, you can do::\n\n from matplotlib import colormaps\n list(colormaps)\n """\n def __init__(self, cmaps):\n self._cmaps = cmaps\n self._builtin_cmaps = tuple(cmaps)\n\n def __getitem__(self, item):\n try:\n return self._cmaps[item].copy()\n except KeyError:\n raise KeyError(f"{item!r} is not a known colormap name") from None\n\n def __iter__(self):\n return iter(self._cmaps)\n\n def __len__(self):\n return len(self._cmaps)\n\n def __str__(self):\n return ('ColormapRegistry; available colormaps:\n' +\n ', '.join(f"'{name}'" for name in self))\n\n def __call__(self):\n """\n Return a list of the registered colormap names.\n\n This exists only for backward-compatibility in `.pyplot` which had a\n ``plt.colormaps()`` method. The recommended way to get this list is\n now ``list(colormaps)``.\n """\n return list(self)\n\n def register(self, cmap, *, name=None, force=False):\n """\n Register a new colormap.\n\n The colormap name can then be used as a string argument to any ``cmap``\n parameter in Matplotlib. It is also available in ``pyplot.get_cmap``.\n\n The colormap registry stores a copy of the given colormap, so that\n future changes to the original colormap instance do not affect the\n registered colormap. Think of this as the registry taking a snapshot\n of the colormap at registration.\n\n Parameters\n ----------\n cmap : matplotlib.colors.Colormap\n The colormap to register.\n\n name : str, optional\n The name for the colormap. 
If not given, ``cmap.name`` is used.\n\n force : bool, default: False\n If False, a ValueError is raised if trying to overwrite an already\n registered name. True supports overwriting registered colormaps\n other than the builtin colormaps.\n """\n _api.check_isinstance(colors.Colormap, cmap=cmap)\n\n name = name or cmap.name\n if name in self:\n if not force:\n # don't allow registering an already existing cmap\n # unless explicitly asked to\n raise ValueError(\n f'A colormap named "{name}" is already registered.')\n elif name in self._builtin_cmaps:\n # We don't allow overriding a builtin.\n raise ValueError("Re-registering the builtin cmap "\n f"{name!r} is not allowed.")\n\n # Warn that we are updating an already existing colormap\n _api.warn_external(f"Overwriting the cmap {name!r} "\n "that was already in the registry.")\n\n self._cmaps[name] = cmap.copy()\n # Someone may set the extremes of a builtin colormap and want to register it\n # with a different name for future lookups. The object would still have the\n # builtin name, so we should update it to the registered name\n if self._cmaps[name].name != name:\n self._cmaps[name].name = name\n\n def unregister(self, name):\n """\n Remove a colormap from the registry.\n\n You cannot remove built-in colormaps.\n\n If the named colormap is not registered, this returns with no error;\n trying to de-register a built-in colormap raises a ValueError.\n\n .. warning::\n\n Colormap names are currently a shared namespace that may be used\n by multiple packages. Use `unregister` only if you know you\n have registered that name before. 
In particular, do not\n unregister just in case to clean the name before registering a\n new colormap.\n\n Parameters\n ----------\n name : str\n The name of the colormap to be removed.\n\n Raises\n ------\n ValueError\n If you try to remove a default built-in colormap.\n """\n if name in self._builtin_cmaps:\n raise ValueError(f"cannot unregister {name!r} which is a builtin "\n "colormap.")\n self._cmaps.pop(name, None)\n\n def get_cmap(self, cmap):\n """\n Return a color map specified through *cmap*.\n\n Parameters\n ----------\n cmap : str or `~matplotlib.colors.Colormap` or None\n\n - if a `.Colormap`, return it\n - if a string, look it up in ``mpl.colormaps``\n - if None, return the Colormap defined in :rc:`image.cmap`\n\n Returns\n -------\n Colormap\n """\n # get the default color map\n if cmap is None:\n return self[mpl.rcParams["image.cmap"]]\n\n # if the user passed in a Colormap, simply return it\n if isinstance(cmap, colors.Colormap):\n return cmap\n if isinstance(cmap, str):\n # it's a colormap name: validate it, then look it up\n _api.check_in_list(sorted(_colormaps), cmap=cmap)\n return self[cmap]\n raise TypeError(\n 'get_cmap expects None or an instance of a str or Colormap. ' +\n f'You passed {cmap!r} of type {type(cmap)}'\n )\n\n\n# public access to the colormaps should be via `matplotlib.colormaps`. For now,\n# we still create the registry here, but that should stay an implementation\n# detail.\n_colormaps = ColormapRegistry(_gen_cmap_registry())\nglobals().update(_colormaps)\n\n_multivar_colormaps = ColormapRegistry(multivar_cmaps)\n\n_bivar_colormaps = ColormapRegistry(bivar_cmaps)\n\n\n# This is an exact copy of pyplot.get_cmap(). It was removed in 3.9, but apparently\n# caused more user trouble than expected. 
Re-added for 3.9.1 and extended the\n# deprecation period for two additional minor releases.\n@_api.deprecated(\n '3.7',\n removal='3.11',\n alternative="``matplotlib.colormaps[name]`` or ``matplotlib.colormaps.get_cmap()``"\n " or ``pyplot.get_cmap()``"\n )\ndef get_cmap(name=None, lut=None):\n """\n Get a colormap instance, defaulting to rc values if *name* is None.\n\n Parameters\n ----------\n name : `~matplotlib.colors.Colormap` or str or None, default: None\n If a `.Colormap` instance, it will be returned. Otherwise, the name of\n a colormap known to Matplotlib, which will be resampled by *lut*. The\n default, None, means :rc:`image.cmap`.\n lut : int or None, default: None\n If *name* is not already a Colormap instance and *lut* is not None, the\n colormap will be resampled to have *lut* entries in the lookup table.\n\n Returns\n -------\n Colormap\n """\n if name is None:\n name = mpl.rcParams['image.cmap']\n if isinstance(name, colors.Colormap):\n return name\n _api.check_in_list(sorted(_colormaps), name=name)\n if lut is None:\n return _colormaps[name]\n else:\n return _colormaps[name].resampled(lut)\n\n\ndef _ensure_cmap(cmap):\n """\n Ensure that we have a `.Colormap` object.\n\n For internal use to preserve type stability of errors.\n\n Parameters\n ----------\n cmap : None, str, Colormap\n\n - if a `Colormap`, return it\n - if a string, look it up in mpl.colormaps\n - if None, look up the default color map in mpl.colormaps\n\n Returns\n -------\n Colormap\n\n """\n if isinstance(cmap, colors.Colormap):\n return cmap\n cmap_name = cmap if cmap is not None else mpl.rcParams["image.cmap"]\n # use check_in_list to ensure type stability of the exception raised by\n # the internal usage of this (ValueError vs KeyError)\n if cmap_name not in _colormaps:\n _api.check_in_list(sorted(_colormaps), cmap=cmap_name)\n return mpl.colormaps[cmap_name]\n | .venv\Lib\site-packages\matplotlib\cm.py | cm.py | Python | 10,350 | 0.95 | 0.183871 | 0.089796 | vue-tools | 705 
| 2023-08-28T01:29:51.152705 | BSD-3-Clause | false | 51cfb0c4b6e345a22eea1666cdb88c74 |
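The copy-on-read and copy-on-register semantics that `ColormapRegistry` documents ("Returned Colormaps are copies"; "the registry taking a snapshot of the colormap at registration") can be sketched with plain dicts. `TinyRegistry` below is a hypothetical stand-in, with lists playing the role of colormap objects:

```python
import copy
from collections.abc import Mapping

class TinyRegistry(Mapping):
    """Mapping of names to values; reads and writes go through copies."""
    def __init__(self, items):
        self._items = dict(items)
        self._builtin = frozenset(items)  # builtins may never be replaced/removed

    def __getitem__(self, name):
        try:
            return copy.copy(self._items[name])  # caller mutations don't leak back
        except KeyError:
            raise KeyError(f"{name!r} is not a known name") from None

    def __iter__(self):
        return iter(self._items)

    def __len__(self):
        return len(self._items)

    def register(self, value, *, name, force=False):
        # Overwriting needs force=True, and builtins can never be overwritten.
        if name in self._items and (not force or name in self._builtin):
            raise ValueError(f"{name!r} is already registered")
        self._items[name] = copy.copy(value)  # snapshot at registration time

    def unregister(self, name):
        if name in self._builtin:
            raise ValueError(f"cannot unregister builtin {name!r}")
        self._items.pop(name, None)  # silently ignores unknown names

reg = TinyRegistry({"viridis": ["#440154", "#fde725"]})
cmap = reg["viridis"]
cmap.append("#ffffff")      # mutating the returned copy...
print(len(reg["viridis"]))  # 2  ...does not change the registry
```

The two `copy.copy` calls are the whole point: neither mutating a value after registering it nor mutating a looked-up value can alter the global definition, mirroring the behavior the real registry promises.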
from collections.abc import Iterator, Mapping\nfrom matplotlib import colors\nfrom matplotlib.colorizer import _ScalarMappable\n\n\nclass ColormapRegistry(Mapping[str, colors.Colormap]):\n def __init__(self, cmaps: Mapping[str, colors.Colormap]) -> None: ...\n def __getitem__(self, item: str) -> colors.Colormap: ...\n def __iter__(self) -> Iterator[str]: ...\n def __len__(self) -> int: ...\n def __call__(self) -> list[str]: ...\n def register(\n self, cmap: colors.Colormap, *, name: str | None = ..., force: bool = ...\n ) -> None: ...\n def unregister(self, name: str) -> None: ...\n def get_cmap(self, cmap: str | colors.Colormap) -> colors.Colormap: ...\n\n_colormaps: ColormapRegistry = ...\n_multivar_colormaps: ColormapRegistry = ...\n_bivar_colormaps: ColormapRegistry = ...\n\ndef get_cmap(name: str | colors.Colormap | None = ..., lut: int | None = ...) -> colors.Colormap: ...\n\nScalarMappable = _ScalarMappable\n | .venv\Lib\site-packages\matplotlib\cm.pyi | cm.pyi | Other | 939 | 0.85 | 0.416667 | 0 | vue-tools | 343 | 2025-02-05T09:20:04.843238 | GPL-3.0 | false | 458a9f2b6547094465e5f72c0d801564 |
"""\nClasses for the efficient drawing of large collections of objects that\nshare most properties, e.g., a large number of line segments or\npolygons.\n\nThe classes are not meant to be as flexible as their single element\ncounterparts (e.g., you may not be able to select all line styles) but\nthey are meant to be fast for common use cases (e.g., a large set of solid\nline segments).\n"""\n\nimport itertools\nimport functools\nimport math\nfrom numbers import Number, Real\nimport warnings\n\nimport numpy as np\n\nimport matplotlib as mpl\nfrom . import (_api, _path, artist, cbook, colorizer as mcolorizer, colors as mcolors,\n _docstring, hatch as mhatch, lines as mlines, path as mpath, transforms)\nfrom ._enums import JoinStyle, CapStyle\n\n\n# "color" is excluded; it is a compound setter, and its docstring differs\n# in LineCollection.\n@_api.define_aliases({\n "antialiased": ["antialiaseds", "aa"],\n "edgecolor": ["edgecolors", "ec"],\n "facecolor": ["facecolors", "fc"],\n "linestyle": ["linestyles", "dashes", "ls"],\n "linewidth": ["linewidths", "lw"],\n "offset_transform": ["transOffset"],\n})\nclass Collection(mcolorizer.ColorizingArtist):\n r"""\n Base class for Collections. Must be subclassed to be usable.\n\n A Collection represents a sequence of `.Patch`\es that can be drawn\n more efficiently together than individually. For example, when a single\n path is being drawn repeatedly at different offsets, the renderer can\n typically execute a ``draw_marker()`` call much more efficiently than a\n series of repeated calls to ``draw_path()`` with the offsets put in\n one-by-one.\n\n Most properties of a collection can be configured per-element. Therefore,\n Collections have "plural" versions of many of the properties of a `.Patch`\n (e.g. `.Collection.get_paths` instead of `.Patch.get_path`). 
Exceptions are\n the *zorder*, *hatch*, *pickradius*, *capstyle* and *joinstyle* properties,\n which can only be set globally for the whole collection.\n\n Besides these exceptions, all properties can be specified as single values\n (applying to all elements) or sequences of values. The property of the\n ``i``\th element of the collection is::\n\n prop[i % len(prop)]\n\n Each Collection can optionally be used as its own `.ScalarMappable` by\n passing the *norm* and *cmap* parameters to its constructor. If the\n Collection's `.ScalarMappable` matrix ``_A`` has been set (via a call\n to `.Collection.set_array`), then at draw time this internal scalar\n mappable will be used to set the ``facecolors`` and ``edgecolors``,\n ignoring those that were manually passed in.\n """\n #: Either a list of 3x3 arrays or an Nx3x3 array (representing N\n #: transforms), suitable for the `all_transforms` argument to\n #: `~matplotlib.backend_bases.RendererBase.draw_path_collection`;\n #: each 3x3 array is used to initialize an\n #: `~matplotlib.transforms.Affine2D` object.\n #: Each kind of collection defines this based on its arguments.\n _transforms = np.empty((0, 3, 3))\n\n # Whether to draw an edge by default. Set on a\n # subclass-by-subclass basis.\n _edge_default = False\n\n @_docstring.interpd\n def __init__(self, *,\n edgecolors=None,\n facecolors=None,\n linewidths=None,\n linestyles='solid',\n capstyle=None,\n joinstyle=None,\n antialiaseds=None,\n offsets=None,\n offset_transform=None,\n norm=None, # optional for ScalarMappable\n cmap=None, # ditto\n colorizer=None,\n pickradius=5.0,\n hatch=None,\n urls=None,\n zorder=1,\n **kwargs\n ):\n """\n Parameters\n ----------\n edgecolors : :mpltype:`color` or list of colors, default: :rc:`patch.edgecolor`\n Edge color for each patch making up the collection. 
The special\n value 'face' can be passed to make the edgecolor match the\n facecolor.\n facecolors : :mpltype:`color` or list of colors, default: :rc:`patch.facecolor`\n Face color for each patch making up the collection.\n linewidths : float or list of floats, default: :rc:`patch.linewidth`\n Line width for each patch making up the collection.\n linestyles : str or tuple or list thereof, default: 'solid'\n Valid strings are ['solid', 'dashed', 'dashdot', 'dotted', '-',\n '--', '-.', ':']. Dash tuples should be of the form::\n\n (offset, onoffseq),\n\n where *onoffseq* is an even length tuple of on and off ink lengths\n in points. For examples, see\n :doc:`/gallery/lines_bars_and_markers/linestyles`.\n capstyle : `.CapStyle`-like, default: 'butt'\n Style to use for capping lines for all paths in the collection.\n Allowed values are %(CapStyle)s.\n joinstyle : `.JoinStyle`-like, default: 'round'\n Style to use for joining lines for all paths in the collection.\n Allowed values are %(JoinStyle)s.\n antialiaseds : bool or list of bool, default: :rc:`patch.antialiased`\n Whether each patch in the collection should be drawn with\n antialiasing.\n offsets : (float, float) or list thereof, default: (0, 0)\n A vector by which to translate each patch after rendering (default\n is no translation). The translation is performed in screen (pixel)\n coordinates (i.e. after the Artist's transform is applied).\n offset_transform : `~.Transform`, default: `.IdentityTransform`\n A single transform which will be applied to each *offsets* vector\n before it is used.\n cmap, norm\n Data normalization and colormapping parameters. See\n `.ScalarMappable` for a detailed description.\n hatch : str, optional\n Hatching pattern to use in filled paths, if any. Valid strings are\n ['/', '\\', '|', '-', '+', 'x', 'o', 'O', '.', '*']. 
See\n :doc:`/gallery/shapes_and_collections/hatch_style_reference` for\n the meaning of each hatch type.\n pickradius : float, default: 5.0\n If ``pickradius <= 0``, then `.Collection.contains` will return\n ``True`` whenever the test point is inside of one of the polygons\n formed by the control points of a Path in the Collection. On the\n other hand, if it is greater than 0, then we instead check if the\n test point is contained in a stroke of width ``2*pickradius``\n following any of the Paths in the Collection.\n urls : list of str, default: None\n A URL for each patch to link to once drawn. Currently only works\n for the SVG backend. See :doc:`/gallery/misc/hyperlinks_sgskip` for\n examples.\n zorder : float, default: 1\n The drawing order, shared by all Patches in the Collection. See\n :doc:`/gallery/misc/zorder_demo` for all defaults and examples.\n **kwargs\n Remaining keyword arguments will be used to set properties as\n ``Collection.set_{key}(val)`` for each key-value pair in *kwargs*.\n """\n\n super().__init__(self._get_colorizer(cmap, norm, colorizer))\n # list of un-scaled dash patterns\n # this is needed for scaling the dash pattern by linewidth\n self._us_linestyles = [(0, None)]\n # list of dash patterns\n self._linestyles = [(0, None)]\n # list of unbroadcast/scaled linewidths\n self._us_lw = [0]\n self._linewidths = [0]\n\n self._gapcolor = None # Currently only used by LineCollection.\n\n # Flags set by _set_mappable_flags: are colors from mapping an array?\n self._face_is_mapped = None\n self._edge_is_mapped = None\n self._mapped_colors = None # calculated in update_scalarmappable\n self._hatch_color = mcolors.to_rgba(mpl.rcParams['hatch.color'])\n self._hatch_linewidth = mpl.rcParams['hatch.linewidth']\n self.set_facecolor(facecolors)\n self.set_edgecolor(edgecolors)\n self.set_linewidth(linewidths)\n self.set_linestyle(linestyles)\n self.set_antialiased(antialiaseds)\n self.set_pickradius(pickradius)\n self.set_urls(urls)\n 
self.set_hatch(hatch)\n self.set_zorder(zorder)\n\n if capstyle:\n self.set_capstyle(capstyle)\n else:\n self._capstyle = None\n\n if joinstyle:\n self.set_joinstyle(joinstyle)\n else:\n self._joinstyle = None\n\n if offsets is not None:\n offsets = np.asanyarray(offsets, float)\n # Broadcast (2,) -> (1, 2) but nothing else.\n if offsets.shape == (2,):\n offsets = offsets[None, :]\n\n self._offsets = offsets\n self._offset_transform = offset_transform\n\n self._path_effects = None\n self._internal_update(kwargs)\n self._paths = None\n\n def get_paths(self):\n return self._paths\n\n def set_paths(self, paths):\n self._paths = paths\n self.stale = True\n\n def get_transforms(self):\n return self._transforms\n\n def get_offset_transform(self):\n """Return the `.Transform` instance used by this artist offset."""\n if self._offset_transform is None:\n self._offset_transform = transforms.IdentityTransform()\n elif (not isinstance(self._offset_transform, transforms.Transform)\n and hasattr(self._offset_transform, '_as_mpl_transform')):\n self._offset_transform = \\n self._offset_transform._as_mpl_transform(self.axes)\n return self._offset_transform\n\n def set_offset_transform(self, offset_transform):\n """\n Set the artist offset transform.\n\n Parameters\n ----------\n offset_transform : `.Transform`\n """\n self._offset_transform = offset_transform\n\n def get_datalim(self, transData):\n # Calculate the data limits and return them as a `.Bbox`.\n #\n # This operation depends on the transforms for the data in the\n # collection and whether the collection has offsets:\n #\n # 1. offsets = None, transform child of transData: use the paths for\n # the automatic limits (i.e. for LineCollection in streamline).\n # 2. offsets != None: offset_transform is child of transData:\n #\n # a. transform is child of transData: use the path + offset for\n # limits (i.e for bar).\n # b. transform is not a child of transData: just use the offsets\n # for the limits (i.e. 
for scatter)\n #\n # 3. otherwise return a null Bbox.\n\n transform = self.get_transform()\n offset_trf = self.get_offset_transform()\n if not (isinstance(offset_trf, transforms.IdentityTransform)\n or offset_trf.contains_branch(transData)):\n # if the offsets are in some coords other than data,\n # then don't use them for autoscaling.\n return transforms.Bbox.null()\n\n paths = self.get_paths()\n if not len(paths):\n # No paths to transform\n return transforms.Bbox.null()\n\n if not transform.is_affine:\n paths = [transform.transform_path_non_affine(p) for p in paths]\n # Don't convert transform to transform.get_affine() here because\n # we may have transform.contains_branch(transData) but not\n # transforms.get_affine().contains_branch(transData). But later,\n # be careful to only apply the affine part that remains.\n\n offsets = self.get_offsets()\n\n if any(transform.contains_branch_seperately(transData)):\n # collections that are just in data units (like quiver)\n # can properly have the axes limits set by their shape +\n # offset. LineCollections that have no offsets can\n # also use this algorithm (like streamplot).\n if isinstance(offsets, np.ma.MaskedArray):\n offsets = offsets.filled(np.nan)\n # get_path_collection_extents handles nan but not masked arrays\n return mpath.get_path_collection_extents(\n transform.get_affine() - transData, paths,\n self.get_transforms(),\n offset_trf.transform_non_affine(offsets),\n offset_trf.get_affine().frozen())\n\n # NOTE: None is the default case where no offsets were passed in\n if self._offsets is not None:\n # this is for collections that have their paths (shapes)\n # in physical, axes-relative, or figure-relative units\n # (i.e. like scatter). 
We can't uniquely set limits based on\n # those shapes, so we just set the limits based on their\n # location.\n offsets = (offset_trf - transData).transform(offsets)\n # note A-B means A B^{-1}\n offsets = np.ma.masked_invalid(offsets)\n if not offsets.mask.all():\n bbox = transforms.Bbox.null()\n bbox.update_from_data_xy(offsets)\n return bbox\n return transforms.Bbox.null()\n\n def get_window_extent(self, renderer=None):\n # TODO: check to ensure that this does not fail for\n # cases other than scatter plot legend\n return self.get_datalim(transforms.IdentityTransform())\n\n def _prepare_points(self):\n # Helper for drawing and hit testing.\n\n transform = self.get_transform()\n offset_trf = self.get_offset_transform()\n offsets = self.get_offsets()\n paths = self.get_paths()\n\n if self.have_units():\n paths = []\n for path in self.get_paths():\n vertices = path.vertices\n xs, ys = vertices[:, 0], vertices[:, 1]\n xs = self.convert_xunits(xs)\n ys = self.convert_yunits(ys)\n paths.append(mpath.Path(np.column_stack([xs, ys]), path.codes))\n xs = self.convert_xunits(offsets[:, 0])\n ys = self.convert_yunits(offsets[:, 1])\n offsets = np.ma.column_stack([xs, ys])\n\n if not transform.is_affine:\n paths = [transform.transform_path_non_affine(path)\n for path in paths]\n transform = transform.get_affine()\n if not offset_trf.is_affine:\n offsets = offset_trf.transform_non_affine(offsets)\n # This might have changed an ndarray into a masked array.\n offset_trf = offset_trf.get_affine()\n\n if isinstance(offsets, np.ma.MaskedArray):\n offsets = offsets.filled(np.nan)\n # Changing from a masked array to nan-filled ndarray\n # is probably most efficient at this point.\n\n return transform, offset_trf, offsets, paths\n\n @artist.allow_rasterization\n def draw(self, renderer):\n if not self.get_visible():\n return\n renderer.open_group(self.__class__.__name__, self.get_gid())\n\n self.update_scalarmappable()\n\n transform, offset_trf, offsets, paths = 
self._prepare_points()\n\n gc = renderer.new_gc()\n self._set_gc_clip(gc)\n gc.set_snap(self.get_snap())\n\n if self._hatch:\n gc.set_hatch(self._hatch)\n gc.set_hatch_color(self._hatch_color)\n gc.set_hatch_linewidth(self._hatch_linewidth)\n\n if self.get_sketch_params() is not None:\n gc.set_sketch_params(*self.get_sketch_params())\n\n if self.get_path_effects():\n from matplotlib.patheffects import PathEffectRenderer\n renderer = PathEffectRenderer(self.get_path_effects(), renderer)\n\n # If the collection is made up of a single shape/color/stroke,\n # it can be rendered once and blitted multiple times, using\n # `draw_markers` rather than `draw_path_collection`. This is\n # *much* faster for Agg, and results in smaller file sizes in\n # PDF/SVG/PS.\n\n trans = self.get_transforms()\n facecolors = self.get_facecolor()\n edgecolors = self.get_edgecolor()\n do_single_path_optimization = False\n if (len(paths) == 1 and len(trans) <= 1 and\n len(facecolors) == 1 and len(edgecolors) == 1 and\n len(self._linewidths) == 1 and\n all(ls[1] is None for ls in self._linestyles) and\n len(self._antialiaseds) == 1 and len(self._urls) == 1 and\n self.get_hatch() is None):\n if len(trans):\n combined_transform = transforms.Affine2D(trans[0]) + transform\n else:\n combined_transform = transform\n extents = paths[0].get_extents(combined_transform)\n if (extents.width < self.get_figure(root=True).bbox.width\n and extents.height < self.get_figure(root=True).bbox.height):\n do_single_path_optimization = True\n\n if self._joinstyle:\n gc.set_joinstyle(self._joinstyle)\n\n if self._capstyle:\n gc.set_capstyle(self._capstyle)\n\n if do_single_path_optimization:\n gc.set_foreground(tuple(edgecolors[0]))\n gc.set_linewidth(self._linewidths[0])\n gc.set_dashes(*self._linestyles[0])\n gc.set_antialiased(self._antialiaseds[0])\n gc.set_url(self._urls[0])\n renderer.draw_markers(\n gc, paths[0], combined_transform.frozen(),\n mpath.Path(offsets), offset_trf, tuple(facecolors[0]))\n else:\n 
if self._gapcolor is not None:\n # First draw paths within the gaps.\n ipaths, ilinestyles = self._get_inverse_paths_linestyles()\n renderer.draw_path_collection(\n gc, transform.frozen(), ipaths,\n self.get_transforms(), offsets, offset_trf,\n [mcolors.to_rgba("none")], self._gapcolor,\n self._linewidths, ilinestyles,\n self._antialiaseds, self._urls,\n "screen")\n\n renderer.draw_path_collection(\n gc, transform.frozen(), paths,\n self.get_transforms(), offsets, offset_trf,\n self.get_facecolor(), self.get_edgecolor(),\n self._linewidths, self._linestyles,\n self._antialiaseds, self._urls,\n "screen") # offset_position, kept for backcompat.\n\n gc.restore()\n renderer.close_group(self.__class__.__name__)\n self.stale = False\n\n def set_pickradius(self, pickradius):\n """\n Set the pick radius used for containment tests.\n\n Parameters\n ----------\n pickradius : float\n Pick radius, in points.\n """\n if not isinstance(pickradius, Real):\n raise ValueError(\n f"pickradius must be a real-valued number, not {pickradius!r}")\n self._pickradius = pickradius\n\n def get_pickradius(self):\n return self._pickradius\n\n def contains(self, mouseevent):\n """\n Test whether the mouse event occurred in the collection.\n\n Returns ``bool, dict(ind=itemlist)``, where every item in itemlist\n contains the event.\n """\n if self._different_canvas(mouseevent) or not self.get_visible():\n return False, {}\n pickradius = (\n float(self._picker)\n if isinstance(self._picker, Number) and\n self._picker is not True # the bool, not just nonzero or 1\n else self._pickradius)\n if self.axes:\n self.axes._unstale_viewLim()\n transform, offset_trf, offsets, paths = self._prepare_points()\n # Tests if the point is contained on one of the polygons formed\n # by the control points of each of the paths. A point is considered\n # "on" a path if it would lie within a stroke of width 2*pickradius\n # following the path. 
If pickradius <= 0, then we instead simply check\n # if the point is *inside* of the path instead.\n ind = _path.point_in_path_collection(\n mouseevent.x, mouseevent.y, pickradius,\n transform.frozen(), paths, self.get_transforms(),\n offsets, offset_trf, pickradius <= 0)\n return len(ind) > 0, dict(ind=ind)\n\n def set_urls(self, urls):\n """\n Parameters\n ----------\n urls : list of str or None\n\n Notes\n -----\n URLs are currently only implemented by the SVG backend. They are\n ignored by all other backends.\n """\n self._urls = urls if urls is not None else [None]\n self.stale = True\n\n def get_urls(self):\n """\n Return a list of URLs, one for each element of the collection.\n\n The list contains *None* for elements without a URL. See\n :doc:`/gallery/misc/hyperlinks_sgskip` for an example.\n """\n return self._urls\n\n def set_hatch(self, hatch):\n r"""\n Set the hatching pattern\n\n *hatch* can be one of::\n\n / - diagonal hatching\n \ - back diagonal\n | - vertical\n - - horizontal\n + - crossed\n x - crossed diagonal\n o - small circle\n O - large circle\n . - dots\n * - stars\n\n Letters can be combined, in which case all the specified\n hatchings are done. 
If same letter repeats, it increases the\n density of hatching of that pattern.\n\n Unlike other properties such as linewidth and colors, hatching\n can only be specified for the collection as a whole, not separately\n for each member.\n\n Parameters\n ----------\n hatch : {'/', '\\', '|', '-', '+', 'x', 'o', 'O', '.', '*'}\n """\n # Use validate_hatch(list) after deprecation.\n mhatch._validate_hatch_pattern(hatch)\n self._hatch = hatch\n self.stale = True\n\n def get_hatch(self):\n """Return the current hatching pattern."""\n return self._hatch\n\n def set_hatch_linewidth(self, lw):\n """Set the hatch linewidth."""\n self._hatch_linewidth = lw\n\n def get_hatch_linewidth(self):\n """Return the hatch linewidth."""\n return self._hatch_linewidth\n\n def set_offsets(self, offsets):\n """\n Set the offsets for the collection.\n\n Parameters\n ----------\n offsets : (N, 2) or (2,) array-like\n """\n offsets = np.asanyarray(offsets)\n if offsets.shape == (2,): # Broadcast (2,) -> (1, 2) but nothing else.\n offsets = offsets[None, :]\n cstack = (np.ma.column_stack if isinstance(offsets, np.ma.MaskedArray)\n else np.column_stack)\n self._offsets = cstack(\n (np.asanyarray(self.convert_xunits(offsets[:, 0]), float),\n np.asanyarray(self.convert_yunits(offsets[:, 1]), float)))\n self.stale = True\n\n def get_offsets(self):\n """Return the offsets for the collection."""\n # Default to zeros in the no-offset (None) case\n return np.zeros((1, 2)) if self._offsets is None else self._offsets\n\n def _get_default_linewidth(self):\n # This may be overridden in a subclass.\n return mpl.rcParams['patch.linewidth'] # validated as float\n\n def set_linewidth(self, lw):\n """\n Set the linewidth(s) for the collection. 
*lw* can be a scalar\n or a sequence; if it is a sequence the patches will cycle\n through the sequence\n\n Parameters\n ----------\n lw : float or list of floats\n """\n if lw is None:\n lw = self._get_default_linewidth()\n # get the un-scaled/broadcast lw\n self._us_lw = np.atleast_1d(lw)\n\n # scale all of the dash patterns.\n self._linewidths, self._linestyles = self._bcast_lwls(\n self._us_lw, self._us_linestyles)\n self.stale = True\n\n def set_linestyle(self, ls):\n """\n Set the linestyle(s) for the collection.\n\n =========================== =================\n linestyle description\n =========================== =================\n ``'-'`` or ``'solid'`` solid line\n ``'--'`` or ``'dashed'`` dashed line\n ``'-.'`` or ``'dashdot'`` dash-dotted line\n ``':'`` or ``'dotted'`` dotted line\n =========================== =================\n\n Alternatively a dash tuple of the following form can be provided::\n\n (offset, onoffseq),\n\n where ``onoffseq`` is an even length tuple of on and off ink in points.\n\n Parameters\n ----------\n ls : str or tuple or list thereof\n Valid values for individual linestyles include {'-', '--', '-.',\n ':', '', (offset, on-off-seq)}. 
See `.Line2D.set_linestyle` for a\n complete description.\n """\n # get the list of raw 'unscaled' dash patterns\n self._us_linestyles = mlines._get_dash_patterns(ls)\n\n # broadcast and scale the lw and dash patterns\n self._linewidths, self._linestyles = self._bcast_lwls(\n self._us_lw, self._us_linestyles)\n\n @_docstring.interpd\n def set_capstyle(self, cs):\n """\n Set the `.CapStyle` for the collection (for all its elements).\n\n Parameters\n ----------\n cs : `.CapStyle` or %(CapStyle)s\n """\n self._capstyle = CapStyle(cs)\n\n @_docstring.interpd\n def get_capstyle(self):\n """\n Return the cap style for the collection (for all its elements).\n\n Returns\n -------\n %(CapStyle)s or None\n """\n return self._capstyle.name if self._capstyle else None\n\n @_docstring.interpd\n def set_joinstyle(self, js):\n """\n Set the `.JoinStyle` for the collection (for all its elements).\n\n Parameters\n ----------\n js : `.JoinStyle` or %(JoinStyle)s\n """\n self._joinstyle = JoinStyle(js)\n\n @_docstring.interpd\n def get_joinstyle(self):\n """\n Return the join style for the collection (for all its elements).\n\n Returns\n -------\n %(JoinStyle)s or None\n """\n return self._joinstyle.name if self._joinstyle else None\n\n @staticmethod\n def _bcast_lwls(linewidths, dashes):\n """\n Internal helper function to broadcast + scale ls/lw\n\n In the collection drawing code, the linewidth and linestyle are cycled\n through as circular buffers (via ``v[i % len(v)]``). 
Thus, if we are\n going to scale the dash pattern at set time (not draw time) we need to\n do the broadcasting now and expand both lists to be the same length.\n\n Parameters\n ----------\n linewidths : list\n line widths of collection\n dashes : list\n dash specification (offset, (dash pattern tuple))\n\n Returns\n -------\n linewidths, dashes : list\n Will be the same length, dashes are scaled by paired linewidth\n """\n if mpl.rcParams['_internal.classic_mode']:\n return linewidths, dashes\n # make sure they are the same length so we can zip them\n if len(dashes) != len(linewidths):\n l_dashes = len(dashes)\n l_lw = len(linewidths)\n gcd = math.gcd(l_dashes, l_lw)\n dashes = list(dashes) * (l_lw // gcd)\n linewidths = list(linewidths) * (l_dashes // gcd)\n\n # scale the dash patterns\n dashes = [mlines._scale_dashes(o, d, lw)\n for (o, d), lw in zip(dashes, linewidths)]\n\n return linewidths, dashes\n\n def get_antialiased(self):\n """\n Get the antialiasing state for rendering.\n\n Returns\n -------\n array of bools\n """\n return self._antialiaseds\n\n def set_antialiased(self, aa):\n """\n Set the antialiasing state for rendering.\n\n Parameters\n ----------\n aa : bool or list of bools\n """\n if aa is None:\n aa = self._get_default_antialiased()\n self._antialiaseds = np.atleast_1d(np.asarray(aa, bool))\n self.stale = True\n\n def _get_default_antialiased(self):\n # This may be overridden in a subclass.\n return mpl.rcParams['patch.antialiased']\n\n def set_color(self, c):\n """\n Set both the edgecolor and the facecolor.\n\n Parameters\n ----------\n c : :mpltype:`color` or list of RGBA tuples\n\n See Also\n --------\n Collection.set_facecolor, Collection.set_edgecolor\n For setting the edge or face color individually.\n """\n self.set_facecolor(c)\n self.set_edgecolor(c)\n\n def _get_default_facecolor(self):\n # This may be overridden in a subclass.\n return mpl.rcParams['patch.facecolor']\n\n def _set_facecolor(self, c):\n if c is None:\n c = 
self._get_default_facecolor()\n\n self._facecolors = mcolors.to_rgba_array(c, self._alpha)\n self.stale = True\n\n def set_facecolor(self, c):\n """\n Set the facecolor(s) of the collection. *c* can be a color (all patches\n have same color), or a sequence of colors; if it is a sequence the\n patches will cycle through the sequence.\n\n If *c* is 'none', the patch will not be filled.\n\n Parameters\n ----------\n c : :mpltype:`color` or list of :mpltype:`color`\n """\n if isinstance(c, str) and c.lower() in ("none", "face"):\n c = c.lower()\n self._original_facecolor = c\n self._set_facecolor(c)\n\n def get_facecolor(self):\n return self._facecolors\n\n def get_edgecolor(self):\n if cbook._str_equal(self._edgecolors, 'face'):\n return self.get_facecolor()\n else:\n return self._edgecolors\n\n def _get_default_edgecolor(self):\n # This may be overridden in a subclass.\n return mpl.rcParams['patch.edgecolor']\n\n def _set_edgecolor(self, c):\n set_hatch_color = True\n if c is None:\n if (mpl.rcParams['patch.force_edgecolor']\n or self._edge_default\n or cbook._str_equal(self._original_facecolor, 'none')):\n c = self._get_default_edgecolor()\n else:\n c = 'none'\n set_hatch_color = False\n if cbook._str_lower_equal(c, 'face'):\n self._edgecolors = 'face'\n self.stale = True\n return\n self._edgecolors = mcolors.to_rgba_array(c, self._alpha)\n if set_hatch_color and len(self._edgecolors):\n self._hatch_color = tuple(self._edgecolors[0])\n self.stale = True\n\n def set_edgecolor(self, c):\n """\n Set the edgecolor(s) of the collection.\n\n Parameters\n ----------\n c : :mpltype:`color` or list of :mpltype:`color` or 'face'\n The collection edgecolor(s). If a sequence, the patches cycle\n through it. 
If 'face', match the facecolor.\n """\n # We pass through a default value for use in LineCollection.\n # This allows us to maintain None as the default indicator in\n # _original_edgecolor.\n if isinstance(c, str) and c.lower() in ("none", "face"):\n c = c.lower()\n self._original_edgecolor = c\n self._set_edgecolor(c)\n\n def set_alpha(self, alpha):\n """\n Set the transparency of the collection.\n\n Parameters\n ----------\n alpha : float or array of float or None\n If not None, *alpha* values must be between 0 and 1, inclusive.\n If an array is provided, its length must match the number of\n elements in the collection. Masked values and nans are not\n supported.\n """\n artist.Artist._set_alpha_for_array(self, alpha)\n self._set_facecolor(self._original_facecolor)\n self._set_edgecolor(self._original_edgecolor)\n\n set_alpha.__doc__ = artist.Artist._set_alpha_for_array.__doc__\n\n def get_linewidth(self):\n return self._linewidths\n\n def get_linestyle(self):\n return self._linestyles\n\n def _set_mappable_flags(self):\n """\n Determine whether edges and/or faces are color-mapped.\n\n This is a helper for update_scalarmappable.\n It sets Boolean flags '_edge_is_mapped' and '_face_is_mapped'.\n\n Returns\n -------\n mapping_change : bool\n True if either flag is True, or if a flag has changed.\n """\n # The flags are initialized to None to ensure this returns True\n # the first time it is called.\n edge0 = self._edge_is_mapped\n face0 = self._face_is_mapped\n # After returning, the flags must be Booleans, not None.\n self._edge_is_mapped = False\n self._face_is_mapped = False\n if self._A is not None:\n if not cbook._str_equal(self._original_facecolor, 'none'):\n self._face_is_mapped = True\n if cbook._str_equal(self._original_edgecolor, 'face'):\n self._edge_is_mapped = True\n else:\n if self._original_edgecolor is None:\n self._edge_is_mapped = True\n\n mapped = self._face_is_mapped or self._edge_is_mapped\n changed = (edge0 is None or face0 is None\n or 
self._edge_is_mapped != edge0\n or self._face_is_mapped != face0)\n return mapped or changed\n\n def update_scalarmappable(self):\n """\n Update colors from the scalar mappable array, if any.\n\n Assign colors to edges and faces based on the array and/or\n colors that were directly set, as appropriate.\n """\n if not self._set_mappable_flags():\n return\n # Allow possibility to call 'self.set_array(None)'.\n if self._A is not None:\n # QuadMesh can map 2d arrays (but pcolormesh supplies 1d array)\n if self._A.ndim > 1 and not isinstance(self, _MeshData):\n raise ValueError('Collections can only map rank 1 arrays')\n if np.iterable(self._alpha):\n if self._alpha.size != self._A.size:\n raise ValueError(\n f'Data array shape, {self._A.shape} '\n 'is incompatible with alpha array shape, '\n f'{self._alpha.shape}. '\n 'This can occur with the deprecated '\n 'behavior of the "flat" shading option, '\n 'in which a row and/or column of the data '\n 'array is dropped.')\n # pcolormesh, scatter, maybe others flatten their _A\n self._alpha = self._alpha.reshape(self._A.shape)\n self._mapped_colors = self.to_rgba(self._A, self._alpha)\n\n if self._face_is_mapped:\n self._facecolors = self._mapped_colors\n else:\n self._set_facecolor(self._original_facecolor)\n if self._edge_is_mapped:\n self._edgecolors = self._mapped_colors\n else:\n self._set_edgecolor(self._original_edgecolor)\n self.stale = True\n\n def get_fill(self):\n """Return whether face is colored."""\n return not cbook._str_lower_equal(self._original_facecolor, "none")\n\n def update_from(self, other):\n """Copy properties from other to self."""\n\n artist.Artist.update_from(self, other)\n self._antialiaseds = other._antialiaseds\n self._mapped_colors = other._mapped_colors\n self._edge_is_mapped = other._edge_is_mapped\n self._original_edgecolor = other._original_edgecolor\n self._edgecolors = other._edgecolors\n self._face_is_mapped = other._face_is_mapped\n self._original_facecolor = other._original_facecolor\n 
self._facecolors = other._facecolors\n self._linewidths = other._linewidths\n self._linestyles = other._linestyles\n self._us_linestyles = other._us_linestyles\n self._pickradius = other._pickradius\n self._hatch = other._hatch\n\n # update_from for scalarmappable\n self._A = other._A\n self.norm = other.norm\n self.cmap = other.cmap\n self.stale = True\n\n\nclass _CollectionWithSizes(Collection):\n """\n Base class for collections that have an array of sizes.\n """\n _factor = 1.0\n\n def get_sizes(self):\n """\n Return the sizes ('areas') of the elements in the collection.\n\n Returns\n -------\n array\n The 'area' of each element.\n """\n return self._sizes\n\n def set_sizes(self, sizes, dpi=72.0):\n """\n Set the sizes of each member of the collection.\n\n Parameters\n ----------\n sizes : `numpy.ndarray` or None\n The size to set for each element of the collection. The\n value is the 'area' of the element.\n dpi : float, default: 72\n The dpi of the canvas.\n """\n if sizes is None:\n self._sizes = np.array([])\n self._transforms = np.empty((0, 3, 3))\n else:\n self._sizes = np.asarray(sizes)\n self._transforms = np.zeros((len(self._sizes), 3, 3))\n scale = np.sqrt(self._sizes) * dpi / 72.0 * self._factor\n self._transforms[:, 0, 0] = scale\n self._transforms[:, 1, 1] = scale\n self._transforms[:, 2, 2] = 1.0\n self.stale = True\n\n @artist.allow_rasterization\n def draw(self, renderer):\n self.set_sizes(self._sizes, self.get_figure(root=True).dpi)\n super().draw(renderer)\n\n\nclass PathCollection(_CollectionWithSizes):\n r"""\n A collection of `~.path.Path`\s, as created by e.g. `~.Axes.scatter`.\n """\n\n def __init__(self, paths, sizes=None, **kwargs):\n """\n Parameters\n ----------\n paths : list of `.path.Path`\n The paths that will make up the `.Collection`.\n sizes : array-like\n The factor by which to scale each drawn `~.path.Path`. 
One unit\n squared in the Path's data space is scaled to be ``sizes**2``\n points when rendered.\n **kwargs\n Forwarded to `.Collection`.\n """\n\n super().__init__(**kwargs)\n self.set_paths(paths)\n self.set_sizes(sizes)\n self.stale = True\n\n def get_paths(self):\n return self._paths\n\n def legend_elements(self, prop="colors", num="auto",\n fmt=None, func=lambda x: x, **kwargs):\n """\n Create legend handles and labels for a PathCollection.\n\n Each legend handle is a `.Line2D` representing the Path that was drawn,\n and each label is a string that represents the Path.\n\n This is useful for obtaining a legend for a `~.Axes.scatter` plot;\n e.g.::\n\n scatter = plt.scatter([1, 2, 3], [4, 5, 6], c=[7, 2, 3])\n plt.legend(*scatter.legend_elements())\n\n creates three legend elements, one for each color with the numerical\n values passed to *c* as the labels.\n\n Also see the :ref:`automatedlegendcreation` example.\n\n Parameters\n ----------\n prop : {"colors", "sizes"}, default: "colors"\n If "colors", the legend handles will show the different colors of\n the collection. If "sizes", the legend will show the different\n sizes. To set both, use *kwargs* to directly edit the `.Line2D`\n properties.\n num : int, None, "auto" (default), array-like, or `~.ticker.Locator`\n Target number of elements to create.\n If None, use all unique elements of the mappable array. If an\n integer, target to use *num* elements in the normed range.\n If *"auto"*, try to determine which option better suits the nature\n of the data.\n The number of created elements may slightly deviate from *num* due\n to a `~.ticker.Locator` being used to find useful locations.\n If a list or array, use exactly those elements for the legend.\n Finally, a `~.ticker.Locator` can be provided.\n fmt : str, `~matplotlib.ticker.Formatter`, or None (default)\n The format or formatter to use for the labels. If a string, it must\n be a valid input for a `.StrMethodFormatter`. 
If None (the default),\n use a `.ScalarFormatter`.\n func : function, default: ``lambda x: x``\n Function to calculate the labels. Often the size (or color)\n argument to `~.Axes.scatter` will have been pre-processed by the\n user using a function ``s = f(x)`` to make the markers visible;\n e.g. ``size = np.log10(x)``. Providing the inverse of this\n function here allows that pre-processing to be inverted, so that\n the legend labels have the correct values; e.g. ``func = lambda\n x: 10**x``.\n **kwargs\n Allowed keyword arguments are *color* and *size*. E.g. it may be\n useful to set the color of the markers if *prop="sizes"* is used;\n similarly to set the size of the markers if *prop="colors"* is\n used. Any further parameters are passed onto the `.Line2D`\n instance. This may be useful to e.g. specify a different\n *markeredgecolor* or *alpha* for the legend handles.\n\n Returns\n -------\n handles : list of `.Line2D`\n Visual representation of each element of the legend.\n labels : list of str\n The string labels for elements of the legend.\n """\n handles = []\n labels = []\n hasarray = self.get_array() is not None\n if fmt is None:\n fmt = mpl.ticker.ScalarFormatter(useOffset=False, useMathText=True)\n elif isinstance(fmt, str):\n fmt = mpl.ticker.StrMethodFormatter(fmt)\n fmt.create_dummy_axis()\n\n if prop == "colors":\n if not hasarray:\n warnings.warn("Collection without array used. Make sure to "\n "specify the values to be colormapped via the "\n "`c` argument.")\n return handles, labels\n u = np.unique(self.get_array())\n size = kwargs.pop("size", mpl.rcParams["lines.markersize"])\n elif prop == "sizes":\n u = np.unique(self.get_sizes())\n color = kwargs.pop("color", "k")\n else:\n raise ValueError("Valid values for `prop` are 'colors' or "\n f"'sizes'. 
You supplied '{prop}' instead.")\n\n fu = func(u)\n fmt.axis.set_view_interval(fu.min(), fu.max())\n fmt.axis.set_data_interval(fu.min(), fu.max())\n if num == "auto":\n num = 9\n if len(u) <= num:\n num = None\n if num is None:\n values = u\n label_values = func(values)\n else:\n if prop == "colors":\n arr = self.get_array()\n elif prop == "sizes":\n arr = self.get_sizes()\n if isinstance(num, mpl.ticker.Locator):\n loc = num\n elif np.iterable(num):\n loc = mpl.ticker.FixedLocator(num)\n else:\n num = int(num)\n loc = mpl.ticker.MaxNLocator(nbins=num, min_n_ticks=num-1,\n steps=[1, 2, 2.5, 3, 5, 6, 8, 10])\n label_values = loc.tick_values(func(arr).min(), func(arr).max())\n cond = ((label_values >= func(arr).min()) &\n (label_values <= func(arr).max()))\n label_values = label_values[cond]\n yarr = np.linspace(arr.min(), arr.max(), 256)\n xarr = func(yarr)\n ix = np.argsort(xarr)\n values = np.interp(label_values, xarr[ix], yarr[ix])\n\n kw = {"markeredgewidth": self.get_linewidths()[0],\n "alpha": self.get_alpha(),\n **kwargs}\n\n for val, lab in zip(values, label_values):\n if prop == "colors":\n color = self.cmap(self.norm(val))\n elif prop == "sizes":\n size = np.sqrt(val)\n if np.isclose(size, 0.0):\n continue\n h = mlines.Line2D([0], [0], ls="", color=color, ms=size,\n marker=self.get_paths()[0], **kw)\n handles.append(h)\n if hasattr(fmt, "set_locs"):\n fmt.set_locs(label_values)\n l = fmt(lab)\n labels.append(l)\n\n return handles, labels\n\n\nclass PolyCollection(_CollectionWithSizes):\n\n def __init__(self, verts, sizes=None, *, closed=True, **kwargs):\n """\n Parameters\n ----------\n verts : list of array-like\n The sequence of polygons [*verts0*, *verts1*, ...] where each\n element *verts_i* defines the vertices of polygon *i* as a 2D\n array-like of shape (M, 2).\n sizes : array-like, default: None\n Squared scaling factors for the polygons. 
The coordinates of each\n polygon *verts_i* are multiplied by the square-root of the\n corresponding entry in *sizes* (i.e., *sizes* specify the scaling\n of areas). The scaling is applied before the Artist master\n transform.\n closed : bool, default: True\n Whether the polygon should be closed by adding a CLOSEPOLY\n connection at the end.\n **kwargs\n Forwarded to `.Collection`.\n """\n super().__init__(**kwargs)\n self.set_sizes(sizes)\n self.set_verts(verts, closed)\n self.stale = True\n\n def set_verts(self, verts, closed=True):\n """\n Set the vertices of the polygons.\n\n Parameters\n ----------\n verts : list of array-like\n The sequence of polygons [*verts0*, *verts1*, ...] where each\n element *verts_i* defines the vertices of polygon *i* as a 2D\n array-like of shape (M, 2).\n closed : bool, default: True\n Whether the polygon should be closed by adding a CLOSEPOLY\n connection at the end.\n """\n self.stale = True\n if isinstance(verts, np.ma.MaskedArray):\n verts = verts.astype(float).filled(np.nan)\n\n # No need to do anything fancy if the path isn't closed.\n if not closed:\n self._paths = [mpath.Path(xy) for xy in verts]\n return\n\n # Fast path for arrays\n if isinstance(verts, np.ndarray) and len(verts.shape) == 3:\n verts_pad = np.concatenate((verts, verts[:, :1]), axis=1)\n # Creating the codes once is much faster than having Path do it\n # separately each time by passing closed=True.\n codes = np.empty(verts_pad.shape[1], dtype=mpath.Path.code_type)\n codes[:] = mpath.Path.LINETO\n codes[0] = mpath.Path.MOVETO\n codes[-1] = mpath.Path.CLOSEPOLY\n self._paths = [mpath.Path(xy, codes) for xy in verts_pad]\n return\n\n self._paths = []\n for xy in verts:\n if len(xy):\n self._paths.append(mpath.Path._create_closed(xy))\n else:\n self._paths.append(mpath.Path(xy))\n\n set_paths = set_verts\n\n def set_verts_and_codes(self, verts, codes):\n """Initialize vertices with path codes."""\n if len(verts) != len(codes):\n raise ValueError("'codes' must be 
a 1D list or array "\n "with the same length as 'verts'")\n self._paths = [mpath.Path(xy, cds) if len(xy) else mpath.Path(xy)\n for xy, cds in zip(verts, codes)]\n self.stale = True\n\n\nclass FillBetweenPolyCollection(PolyCollection):\n """\n `.PolyCollection` that fills the area between two x- or y-curves.\n """\n def __init__(\n self, t_direction, t, f1, f2, *,\n where=None, interpolate=False, step=None, **kwargs):\n """\n Parameters\n ----------\n t_direction : {{'x', 'y'}}\n The axes on which the variable lies.\n\n - 'x': the curves are ``(t, f1)`` and ``(t, f2)``.\n - 'y': the curves are ``(f1, t)`` and ``(f2, t)``.\n\n t : array-like\n The ``t_direction`` coordinates of the nodes defining the curves.\n\n f1 : array-like or float\n The other coordinates of the nodes defining the first curve.\n\n f2 : array-like or float\n The other coordinates of the nodes defining the second curve.\n\n where : array-like of bool, optional\n Define *where* to exclude some {dir} regions from being filled.\n The filled regions are defined by the coordinates ``t[where]``.\n More precisely, fill between ``t[i]`` and ``t[i+1]`` if\n ``where[i] and where[i+1]``. Note that this definition implies\n that an isolated *True* value between two *False* values in *where*\n will not result in filling. Both sides of the *True* position\n remain unfilled due to the adjacent *False* values.\n\n interpolate : bool, default: False\n This option is only relevant if *where* is used and the two curves\n are crossing each other.\n\n Semantically, *where* is often used for *f1* > *f2* or\n similar. By default, the nodes of the polygon defining the filled\n region will only be placed at the positions in the *t* array.\n Such a polygon cannot describe the above semantics close to the\n intersection. 
The t-sections containing the intersection are\n simply clipped.\n\n Setting *interpolate* to *True* will calculate the actual\n intersection point and extend the filled region up to this point.\n\n step : {{'pre', 'post', 'mid'}}, optional\n Define *step* if the filling should be a step function,\n i.e. constant in between *t*. The value determines where the\n step will occur:\n\n - 'pre': The f value is continued constantly to the left from\n every *t* position, i.e. the interval ``(t[i-1], t[i]]`` has the\n value ``f[i]``.\n - 'post': The f value is continued constantly to the right from\n every *t* position, i.e. the interval ``[t[i], t[i+1])`` has the\n value ``f[i]``.\n - 'mid': Steps occur half-way between the *t* positions.\n\n **kwargs\n Forwarded to `.PolyCollection`.\n\n See Also\n --------\n .Axes.fill_between, .Axes.fill_betweenx\n """\n self.t_direction = t_direction\n self._interpolate = interpolate\n self._step = step\n verts = self._make_verts(t, f1, f2, where)\n super().__init__(verts, **kwargs)\n\n @staticmethod\n def _f_dir_from_t(t_direction):\n """The direction that is other than `t_direction`."""\n if t_direction == "x":\n return "y"\n elif t_direction == "y":\n return "x"\n else:\n msg = f"t_direction must be 'x' or 'y', got {t_direction!r}"\n raise ValueError(msg)\n\n @property\n def _f_direction(self):\n """The direction that is other than `self.t_direction`."""\n return self._f_dir_from_t(self.t_direction)\n\n def set_data(self, t, f1, f2, *, where=None):\n """\n Set new values for the two bounding curves.\n\n Parameters\n ----------\n t : array-like\n The ``self.t_direction`` coordinates of the nodes defining the curves.\n\n f1 : array-like or float\n The other coordinates of the nodes defining the first curve.\n\n f2 : array-like or float\n The other coordinates of the nodes defining the second curve.\n\n where : array-like of bool, optional\n Define *where* to exclude some {dir} regions from being filled.\n The filled regions are 
defined by the coordinates ``t[where]``.\n More precisely, fill between ``t[i]`` and ``t[i+1]`` if\n ``where[i] and where[i+1]``. Note that this definition implies\n that an isolated *True* value between two *False* values in *where*\n will not result in filling. Both sides of the *True* position\n remain unfilled due to the adjacent *False* values.\n\n See Also\n --------\n .PolyCollection.set_verts, .Line2D.set_data\n """\n t, f1, f2 = self.axes._fill_between_process_units(\n self.t_direction, self._f_direction, t, f1, f2)\n\n verts = self._make_verts(t, f1, f2, where)\n self.set_verts(verts)\n\n def get_datalim(self, transData):\n """Calculate the data limits and return them as a `.Bbox`."""\n datalim = transforms.Bbox.null()\n datalim.update_from_data_xy((self.get_transform() - transData).transform(\n np.concatenate([self._bbox, [self._bbox.minpos]])))\n return datalim\n\n def _make_verts(self, t, f1, f2, where):\n """\n Make verts that can be forwarded to `.PolyCollection`.\n """\n self._validate_shapes(self.t_direction, self._f_direction, t, f1, f2)\n\n where = self._get_data_mask(t, f1, f2, where)\n t, f1, f2 = np.broadcast_arrays(np.atleast_1d(t), f1, f2, subok=True)\n\n self._bbox = transforms.Bbox.null()\n self._bbox.update_from_data_xy(self._fix_pts_xy_order(np.concatenate([\n np.stack((t[where], f[where]), axis=-1) for f in (f1, f2)])))\n\n return [\n self._make_verts_for_region(t, f1, f2, idx0, idx1)\n for idx0, idx1 in cbook.contiguous_regions(where)\n ]\n\n def _get_data_mask(self, t, f1, f2, where):\n """\n Return a bool array, with True at all points that should eventually be rendered.\n\n The array is True at a point if none of the data inputs\n *t*, *f1*, *f2* is masked and if the input *where* is true at that point.\n """\n if where is None:\n where = True\n else:\n where = np.asarray(where, dtype=bool)\n if where.size != t.size:\n msg = "where size ({}) does not match {!r} size ({})".format(\n where.size, self.t_direction, t.size)\n raise 
ValueError(msg)\n return where & ~functools.reduce(\n np.logical_or, map(np.ma.getmaskarray, [t, f1, f2]))\n\n @staticmethod\n def _validate_shapes(t_dir, f_dir, t, f1, f2):\n """Validate that t, f1 and f2 are 1-dimensional and have the same length."""\n names = (d + s for d, s in zip((t_dir, f_dir, f_dir), ("", "1", "2")))\n for name, array in zip(names, [t, f1, f2]):\n if array.ndim > 1:\n raise ValueError(f"{name!r} is not 1-dimensional")\n if t.size > 1 and array.size > 1 and t.size != array.size:\n msg = "{!r} has size {}, but {!r} has an unequal size of {}".format(\n t_dir, t.size, name, array.size)\n raise ValueError(msg)\n\n def _make_verts_for_region(self, t, f1, f2, idx0, idx1):\n """\n Make ``verts`` for a contiguous region between ``idx0`` and ``idx1``, taking\n into account ``step`` and ``interpolate``.\n """\n t_slice = t[idx0:idx1]\n f1_slice = f1[idx0:idx1]\n f2_slice = f2[idx0:idx1]\n if self._step is not None:\n step_func = cbook.STEP_LOOKUP_MAP["steps-" + self._step]\n t_slice, f1_slice, f2_slice = step_func(t_slice, f1_slice, f2_slice)\n\n if self._interpolate:\n start = self._get_interpolating_points(t, f1, f2, idx0)\n end = self._get_interpolating_points(t, f1, f2, idx1)\n else:\n # Handle scalar f2 (e.g. 
0): the fill should go all\n # the way down to 0 even if none of the f1 sample points do.\n start = t_slice[0], f2_slice[0]\n end = t_slice[-1], f2_slice[-1]\n\n pts = np.concatenate((\n np.asarray([start]),\n np.stack((t_slice, f1_slice), axis=-1),\n np.asarray([end]),\n np.stack((t_slice, f2_slice), axis=-1)[::-1]))\n\n return self._fix_pts_xy_order(pts)\n\n @classmethod\n def _get_interpolating_points(cls, t, f1, f2, idx):\n """Calculate interpolating points."""\n im1 = max(idx - 1, 0)\n t_values = t[im1:idx+1]\n diff_values = f1[im1:idx+1] - f2[im1:idx+1]\n f1_values = f1[im1:idx+1]\n\n if len(diff_values) == 2:\n if np.ma.is_masked(diff_values[1]):\n return t[im1], f1[im1]\n elif np.ma.is_masked(diff_values[0]):\n return t[idx], f1[idx]\n\n diff_root_t = cls._get_diff_root(0, diff_values, t_values)\n diff_root_f = cls._get_diff_root(diff_root_t, t_values, f1_values)\n return diff_root_t, diff_root_f\n\n @staticmethod\n def _get_diff_root(x, xp, fp):\n """Interpolate *fp* at *x* from the sample points (*xp*, *fp*)."""\n order = xp.argsort()\n return np.interp(x, xp[order], fp[order])\n\n def _fix_pts_xy_order(self, pts):\n """\n Fix pts calculation results with `self.t_direction`.\n\n In the workflow, it is assumed that `self.t_direction` is 'x'. 
If this\n is not true, we need to exchange the coordinates.\n """\n return pts[:, ::-1] if self.t_direction == "y" else pts\n\n\nclass RegularPolyCollection(_CollectionWithSizes):\n """A collection of n-sided regular polygons."""\n\n _path_generator = mpath.Path.unit_regular_polygon\n _factor = np.pi ** (-1/2)\n\n def __init__(self,\n numsides,\n *,\n rotation=0,\n sizes=(1,),\n **kwargs):\n """\n Parameters\n ----------\n numsides : int\n The number of sides of the polygon.\n rotation : float\n The rotation of the polygon in radians.\n sizes : tuple of float\n The area of the circle circumscribing the polygon in points^2.\n **kwargs\n Forwarded to `.Collection`.\n\n Examples\n --------\n See :doc:`/gallery/event_handling/lasso_demo` for a complete example::\n\n offsets = np.random.rand(20, 2)\n facecolors = [cm.jet(x) for x in np.random.rand(20)]\n\n collection = RegularPolyCollection(\n numsides=5, # a pentagon\n rotation=0, sizes=(50,),\n facecolors=facecolors,\n edgecolors=("black",),\n linewidths=(1,),\n offsets=offsets,\n offset_transform=ax.transData,\n )\n """\n super().__init__(**kwargs)\n self.set_sizes(sizes)\n self._numsides = numsides\n self._paths = [self._path_generator(numsides)]\n self._rotation = rotation\n self.set_transform(transforms.IdentityTransform())\n\n def get_numsides(self):\n return self._numsides\n\n def get_rotation(self):\n return self._rotation\n\n @artist.allow_rasterization\n def draw(self, renderer):\n self.set_sizes(self._sizes, self.get_figure(root=True).dpi)\n self._transforms = [\n transforms.Affine2D(x).rotate(-self._rotation).get_matrix()\n for x in self._transforms\n ]\n # Explicitly not super().draw, because set_sizes must be called before\n # updating self._transforms.\n Collection.draw(self, renderer)\n\n\nclass StarPolygonCollection(RegularPolyCollection):\n """Draw a collection of regular stars with *numsides* points."""\n _path_generator = mpath.Path.unit_regular_star\n\n\nclass 
AsteriskPolygonCollection(RegularPolyCollection):\n """Draw a collection of regular asterisks with *numsides* points."""\n _path_generator = mpath.Path.unit_regular_asterisk\n\n\nclass LineCollection(Collection):\n r"""\n Represents a sequence of `.Line2D`\s that should be drawn together.\n\n This class extends `.Collection` to represent a sequence of\n `.Line2D`\s instead of just a sequence of `.Patch`\s.\n Just as in `.Collection`, each property of a *LineCollection* may be either\n a single value or a list of values. This list is then used cyclically for\n each element of the LineCollection, so the property of the ``i``\th element\n of the collection is::\n\n prop[i % len(prop)]\n\n The properties of each member of a *LineCollection* default to their values\n in :rc:`lines.*` instead of :rc:`patch.*`, and the property *colors* is\n added in place of *edgecolors*.\n """\n\n _edge_default = True\n\n def __init__(self, segments, # Can be None.\n *,\n zorder=2, # Collection.zorder is 1\n **kwargs\n ):\n """\n Parameters\n ----------\n segments : list of (N, 2) array-like\n A sequence ``[line0, line1, ...]`` where each line is a (N, 2)-shape\n array-like containing points::\n\n line0 = [(x0, y0), (x1, y1), ...]\n\n Each line can contain a different number of points.\n linewidths : float or list of float, default: :rc:`lines.linewidth`\n The width of each line in points.\n colors : :mpltype:`color` or list of color, default: :rc:`lines.color`\n A sequence of RGBA tuples (e.g., arbitrary color strings, etc, not\n allowed).\n antialiaseds : bool or list of bool, default: :rc:`lines.antialiased`\n Whether to use antialiasing for each line.\n zorder : float, default: 2\n zorder of the lines once drawn.\n\n facecolors : :mpltype:`color` or list of :mpltype:`color`, default: 'none'\n When setting *facecolors*, each line is interpreted as a boundary\n for an area, implicitly closing the path from the last point to the\n first point. 
The enclosed area is filled with *facecolor*.\n In order to manually specify what should count as the "interior" of\n each line, please use `.PathCollection` instead, where the\n "interior" can be specified by appropriate usage of\n `~.path.Path.CLOSEPOLY`.\n\n **kwargs\n Forwarded to `.Collection`.\n """\n # Unfortunately, mplot3d needs this explicit setting of 'facecolors'.\n kwargs.setdefault('facecolors', 'none')\n super().__init__(\n zorder=zorder,\n **kwargs)\n self.set_segments(segments)\n\n def set_segments(self, segments):\n if segments is None:\n return\n\n self._paths = [mpath.Path(seg) if isinstance(seg, np.ma.MaskedArray)\n else mpath.Path(np.asarray(seg, float))\n for seg in segments]\n self.stale = True\n\n set_verts = set_segments # for compatibility with PolyCollection\n set_paths = set_segments\n\n def get_segments(self):\n """\n Returns\n -------\n list\n List of segments in the LineCollection. Each list item contains an\n array of vertices.\n """\n segments = []\n\n for path in self._paths:\n vertices = [\n vertex\n for vertex, _\n # Never simplify here, we want to get the data-space values\n # back and there is no way to know the "right" simplification\n # threshold so never try.\n in path.iter_segments(simplify=False)\n ]\n vertices = np.asarray(vertices)\n segments.append(vertices)\n\n return segments\n\n def _get_default_linewidth(self):\n return mpl.rcParams['lines.linewidth']\n\n def _get_default_antialiased(self):\n return mpl.rcParams['lines.antialiased']\n\n def _get_default_edgecolor(self):\n return mpl.rcParams['lines.color']\n\n def _get_default_facecolor(self):\n return 'none'\n\n def set_alpha(self, alpha):\n # docstring inherited\n super().set_alpha(alpha)\n if self._gapcolor is not None:\n self.set_gapcolor(self._original_gapcolor)\n\n def set_color(self, c):\n """\n Set the edgecolor(s) of the LineCollection.\n\n Parameters\n ----------\n c : :mpltype:`color` or list of :mpltype:`color`\n Single color (all lines have same 
color), or a\n sequence of RGBA tuples; if it is a sequence the lines will\n cycle through the sequence.\n """\n self.set_edgecolor(c)\n\n set_colors = set_color\n\n def get_color(self):\n return self._edgecolors\n\n get_colors = get_color # for compatibility with old versions\n\n def set_gapcolor(self, gapcolor):\n """\n Set a color to fill the gaps in the dashed line style.\n\n .. note::\n\n Striped lines are created by drawing two interleaved dashed lines.\n There can be overlaps between those two, which may result in\n artifacts when using transparency.\n\n This functionality is experimental and may change.\n\n Parameters\n ----------\n gapcolor : :mpltype:`color` or list of :mpltype:`color` or None\n The color with which to fill the gaps. If None, the gaps are\n unfilled.\n """\n self._original_gapcolor = gapcolor\n self._set_gapcolor(gapcolor)\n\n def _set_gapcolor(self, gapcolor):\n if gapcolor is not None:\n gapcolor = mcolors.to_rgba_array(gapcolor, self._alpha)\n self._gapcolor = gapcolor\n self.stale = True\n\n def get_gapcolor(self):\n return self._gapcolor\n\n def _get_inverse_paths_linestyles(self):\n """\n Returns the path and pattern for the gaps in the non-solid lines.\n\n This path and pattern is the inverse of the path and pattern used to\n construct the non-solid lines. For solid lines, we set the inverse path\n to nans to prevent drawing an inverse line.\n """\n path_patterns = [\n (mpath.Path(np.full((1, 2), np.nan)), ls)\n if ls == (0, None) else\n (path, mlines._get_inverse_dash_pattern(*ls))\n for (path, ls) in\n zip(self._paths, itertools.cycle(self._linestyles))]\n\n return zip(*path_patterns)\n\n\nclass EventCollection(LineCollection):\n """\n A collection of locations along a single axis at which an "event" occurred.\n\n The events are given by a 1-dimensional array. 
They do not have an\n amplitude and are displayed as parallel lines.\n """\n\n _edge_default = True\n\n def __init__(self,\n positions, # Cannot be None.\n orientation='horizontal',\n *,\n lineoffset=0,\n linelength=1,\n linewidth=None,\n color=None,\n linestyle='solid',\n antialiased=None,\n **kwargs\n ):\n """\n Parameters\n ----------\n positions : 1D array-like\n Each value is an event.\n orientation : {'horizontal', 'vertical'}, default: 'horizontal'\n The sequence of events is plotted along this direction.\n The marker lines of the single events are along the orthogonal\n direction.\n lineoffset : float, default: 0\n The offset of the center of the markers from the origin, in the\n direction orthogonal to *orientation*.\n linelength : float, default: 1\n The total height of the marker (i.e. the marker stretches from\n ``lineoffset - linelength/2`` to ``lineoffset + linelength/2``).\n linewidth : float or list thereof, default: :rc:`lines.linewidth`\n The line width of the event lines, in points.\n color : :mpltype:`color` or list of :mpltype:`color`, default: :rc:`lines.color`\n The color of the event lines.\n linestyle : str or tuple or list thereof, default: 'solid'\n Valid strings are ['solid', 'dashed', 'dashdot', 'dotted',\n '-', '--', '-.', ':']. Dash tuples should be of the form::\n\n (offset, onoffseq),\n\n where *onoffseq* is an even length tuple of on and off ink\n in points.\n antialiased : bool or list thereof, default: :rc:`lines.antialiased`\n Whether to use antialiasing for drawing the lines.\n **kwargs\n Forwarded to `.LineCollection`.\n\n Examples\n --------\n .. 
plot:: gallery/lines_bars_and_markers/eventcollection_demo.py\n """\n super().__init__([],\n linewidths=linewidth, linestyles=linestyle,\n colors=color, antialiaseds=antialiased,\n **kwargs)\n self._is_horizontal = True # Initial value, may be switched below.\n self._linelength = linelength\n self._lineoffset = lineoffset\n self.set_orientation(orientation)\n self.set_positions(positions)\n\n def get_positions(self):\n """\n Return a list containing the floating-point values of the positions.\n """\n pos = 0 if self.is_horizontal() else 1\n return [segment[0, pos] for segment in self.get_segments()]\n\n def set_positions(self, positions):\n """Set the positions of the events."""\n if positions is None:\n positions = []\n if np.ndim(positions) != 1:\n raise ValueError('positions must be one-dimensional')\n lineoffset = self.get_lineoffset()\n linelength = self.get_linelength()\n pos_idx = 0 if self.is_horizontal() else 1\n segments = np.empty((len(positions), 2, 2))\n segments[:, :, pos_idx] = np.sort(positions)[:, None]\n segments[:, 0, 1 - pos_idx] = lineoffset + linelength / 2\n segments[:, 1, 1 - pos_idx] = lineoffset - linelength / 2\n self.set_segments(segments)\n\n def add_positions(self, position):\n """Add one or more events at the specified positions."""\n if position is None or (hasattr(position, '__len__') and\n len(position) == 0):\n return\n positions = self.get_positions()\n positions = np.hstack([positions, np.asanyarray(position)])\n self.set_positions(positions)\n extend_positions = append_positions = add_positions\n\n def is_horizontal(self):\n """True if the eventcollection is horizontal, False if vertical."""\n return self._is_horizontal\n\n def get_orientation(self):\n """\n Return the orientation of the event line ('horizontal' or 'vertical').\n """\n return 'horizontal' if self.is_horizontal() else 'vertical'\n\n def switch_orientation(self):\n """\n Switch the orientation of the event line, either from vertical to\n horizontal or vice 
versa.\n """\n segments = self.get_segments()\n for i, segment in enumerate(segments):\n segments[i] = np.fliplr(segment)\n self.set_segments(segments)\n self._is_horizontal = not self.is_horizontal()\n self.stale = True\n\n def set_orientation(self, orientation):\n """\n Set the orientation of the event line.\n\n Parameters\n ----------\n orientation : {'horizontal', 'vertical'}\n """\n is_horizontal = _api.check_getitem(\n {"horizontal": True, "vertical": False},\n orientation=orientation)\n if is_horizontal == self.is_horizontal():\n return\n self.switch_orientation()\n\n def get_linelength(self):\n """Return the length of the lines used to mark each event."""\n return self._linelength\n\n def set_linelength(self, linelength):\n """Set the length of the lines used to mark each event."""\n if linelength == self.get_linelength():\n return\n lineoffset = self.get_lineoffset()\n segments = self.get_segments()\n pos = 1 if self.is_horizontal() else 0\n for segment in segments:\n segment[0, pos] = lineoffset + linelength / 2.\n segment[1, pos] = lineoffset - linelength / 2.\n self.set_segments(segments)\n self._linelength = linelength\n\n def get_lineoffset(self):\n """Return the offset of the lines used to mark each event."""\n return self._lineoffset\n\n def set_lineoffset(self, lineoffset):\n """Set the offset of the lines used to mark each event."""\n if lineoffset == self.get_lineoffset():\n return\n linelength = self.get_linelength()\n segments = self.get_segments()\n pos = 1 if self.is_horizontal() else 0\n for segment in segments:\n segment[0, pos] = lineoffset + linelength / 2.\n segment[1, pos] = lineoffset - linelength / 2.\n self.set_segments(segments)\n self._lineoffset = lineoffset\n\n def get_linewidth(self):\n """Get the width of the lines used to mark each event."""\n return super().get_linewidth()[0]\n\n def get_linewidths(self):\n return super().get_linewidth()\n\n def get_color(self):\n """Return the color of the lines used to mark each 
event."""\n return self.get_colors()[0]\n\n\nclass CircleCollection(_CollectionWithSizes):\n """A collection of circles, drawn using splines."""\n\n _factor = np.pi ** (-1/2)\n\n def __init__(self, sizes, **kwargs):\n """\n Parameters\n ----------\n sizes : float or array-like\n The area of each circle in points^2.\n **kwargs\n Forwarded to `.Collection`.\n """\n super().__init__(**kwargs)\n self.set_sizes(sizes)\n self.set_transform(transforms.IdentityTransform())\n self._paths = [mpath.Path.unit_circle()]\n\n\nclass EllipseCollection(Collection):\n """A collection of ellipses, drawn using splines."""\n\n def __init__(self, widths, heights, angles, *, units='points', **kwargs):\n """\n Parameters\n ----------\n widths : array-like\n The lengths of the first axes (e.g., major axis lengths).\n heights : array-like\n The lengths of second axes.\n angles : array-like\n The angles of the first axes, degrees CCW from the x-axis.\n units : {'points', 'inches', 'dots', 'width', 'height', 'x', 'y', 'xy'}\n The units in which majors and minors are given; 'width' and\n 'height' refer to the dimensions of the axes, while 'x' and 'y'\n refer to the *offsets* data units. 'xy' differs from all others in\n that the angle as plotted varies with the aspect ratio, and equals\n the specified angle only when the aspect ratio is unity. 
Hence\n it behaves the same as the `~.patches.Ellipse` with\n ``axes.transData`` as its transform.\n **kwargs\n Forwarded to `Collection`.\n """\n super().__init__(**kwargs)\n self.set_widths(widths)\n self.set_heights(heights)\n self.set_angles(angles)\n self._units = units\n self.set_transform(transforms.IdentityTransform())\n self._transforms = np.empty((0, 3, 3))\n self._paths = [mpath.Path.unit_circle()]\n\n def _set_transforms(self):\n """Calculate transforms immediately before drawing."""\n\n ax = self.axes\n fig = self.get_figure(root=False)\n\n if self._units == 'xy':\n sc = 1\n elif self._units == 'x':\n sc = ax.bbox.width / ax.viewLim.width\n elif self._units == 'y':\n sc = ax.bbox.height / ax.viewLim.height\n elif self._units == 'inches':\n sc = fig.dpi\n elif self._units == 'points':\n sc = fig.dpi / 72.0\n elif self._units == 'width':\n sc = ax.bbox.width\n elif self._units == 'height':\n sc = ax.bbox.height\n elif self._units == 'dots':\n sc = 1.0\n else:\n raise ValueError(f'Unrecognized units: {self._units!r}')\n\n self._transforms = np.zeros((len(self._widths), 3, 3))\n widths = self._widths * sc\n heights = self._heights * sc\n sin_angle = np.sin(self._angles)\n cos_angle = np.cos(self._angles)\n self._transforms[:, 0, 0] = widths * cos_angle\n self._transforms[:, 0, 1] = heights * -sin_angle\n self._transforms[:, 1, 0] = widths * sin_angle\n self._transforms[:, 1, 1] = heights * cos_angle\n self._transforms[:, 2, 2] = 1.0\n\n _affine = transforms.Affine2D\n if self._units == 'xy':\n m = ax.transData.get_affine().get_matrix().copy()\n m[:2, 2:] = 0\n self.set_transform(_affine(m))\n\n def set_widths(self, widths):\n """Set the lengths of the first axes (e.g., major axis)."""\n self._widths = 0.5 * np.asarray(widths).ravel()\n self.stale = True\n\n def set_heights(self, heights):\n """Set the lengths of second axes (e.g., minor axes)."""\n self._heights = 0.5 * np.asarray(heights).ravel()\n self.stale = True\n\n def set_angles(self, angles):\n 
"""Set the angles of the first axes, degrees CCW from the x-axis."""\n self._angles = np.deg2rad(angles).ravel()\n self.stale = True\n\n def get_widths(self):\n """Get the lengths of the first axes (e.g., major axis)."""\n return self._widths * 2\n\n def get_heights(self):\n """Set the lengths of second axes (e.g., minor axes)."""\n return self._heights * 2\n\n def get_angles(self):\n """Get the angles of the first axes, degrees CCW from the x-axis."""\n return np.rad2deg(self._angles)\n\n @artist.allow_rasterization\n def draw(self, renderer):\n self._set_transforms()\n super().draw(renderer)\n\n\nclass PatchCollection(Collection):\n """\n A generic collection of patches.\n\n PatchCollection draws faster than a large number of equivalent individual\n Patches. It also makes it easier to assign a colormap to a heterogeneous\n collection of patches.\n """\n\n def __init__(self, patches, *, match_original=False, **kwargs):\n """\n Parameters\n ----------\n patches : list of `.Patch`\n A sequence of Patch objects. This list may include\n a heterogeneous assortment of different patch types.\n\n match_original : bool, default: False\n If True, use the colors and linewidths of the original\n patches. 
If False, new colors may be assigned by\n providing the standard collection arguments, facecolor,\n edgecolor, linewidths, norm or cmap.\n\n **kwargs\n All other parameters are forwarded to `.Collection`.\n\n If any of *edgecolors*, *facecolors*, *linewidths*, *antialiaseds*\n are None, they default to their `.rcParams` patch setting, in\n sequence form.\n\n Notes\n -----\n The use of `~matplotlib.cm.ScalarMappable` functionality is optional.\n If the `~matplotlib.cm.ScalarMappable` matrix ``_A`` has been set (via\n a call to `~.ScalarMappable.set_array`), at draw time a call to scalar\n mappable will be made to set the face colors.\n """\n\n if match_original:\n def determine_facecolor(patch):\n if patch.get_fill():\n return patch.get_facecolor()\n return [0, 0, 0, 0]\n\n kwargs['facecolors'] = [determine_facecolor(p) for p in patches]\n kwargs['edgecolors'] = [p.get_edgecolor() for p in patches]\n kwargs['linewidths'] = [p.get_linewidth() for p in patches]\n kwargs['linestyles'] = [p.get_linestyle() for p in patches]\n kwargs['antialiaseds'] = [p.get_antialiased() for p in patches]\n\n super().__init__(**kwargs)\n\n self.set_paths(patches)\n\n def set_paths(self, patches):\n paths = [p.get_transform().transform_path(p.get_path())\n for p in patches]\n self._paths = paths\n\n\nclass TriMesh(Collection):\n """\n Class for the efficient drawing of a triangular mesh using Gouraud shading.\n\n A triangular mesh is a `~matplotlib.tri.Triangulation` object.\n """\n def __init__(self, triangulation, **kwargs):\n super().__init__(**kwargs)\n self._triangulation = triangulation\n self._shading = 'gouraud'\n\n self._bbox = transforms.Bbox.unit()\n\n # Unfortunately this requires a copy, unless Triangulation\n # was rewritten.\n xy = np.hstack((triangulation.x.reshape(-1, 1),\n triangulation.y.reshape(-1, 1)))\n self._bbox.update_from_data_xy(xy)\n\n def get_paths(self):\n if self._paths is None:\n self.set_paths()\n return self._paths\n\n def set_paths(self):\n self._paths 
= self.convert_mesh_to_paths(self._triangulation)\n\n @staticmethod\n def convert_mesh_to_paths(tri):\n """\n Convert a given mesh into a sequence of `.Path` objects.\n\n This function is primarily of use to implementers of backends that do\n not directly support meshes.\n """\n triangles = tri.get_masked_triangles()\n verts = np.stack((tri.x[triangles], tri.y[triangles]), axis=-1)\n return [mpath.Path(x) for x in verts]\n\n @artist.allow_rasterization\n def draw(self, renderer):\n if not self.get_visible():\n return\n renderer.open_group(self.__class__.__name__, gid=self.get_gid())\n transform = self.get_transform()\n\n # Get a list of triangles and the color at each vertex.\n tri = self._triangulation\n triangles = tri.get_masked_triangles()\n\n verts = np.stack((tri.x[triangles], tri.y[triangles]), axis=-1)\n\n self.update_scalarmappable()\n colors = self._facecolors[triangles]\n\n gc = renderer.new_gc()\n self._set_gc_clip(gc)\n gc.set_linewidth(self.get_linewidth()[0])\n renderer.draw_gouraud_triangles(gc, verts, colors, transform.frozen())\n gc.restore()\n renderer.close_group(self.__class__.__name__)\n\n\nclass _MeshData:\n r"""\n Class for managing the two dimensional coordinates of Quadrilateral meshes\n and the associated data with them. This class is a mixin and is intended to\n be used with another collection that will implement the draw separately.\n\n A quadrilateral mesh is a grid of M by N adjacent quadrilaterals that are\n defined via a (M+1, N+1) grid of vertices. The quadrilateral (m, n) is\n defined by the vertices ::\n\n (m+1, n) ----------- (m+1, n+1)\n / /\n / /\n / /\n (m, n) -------- (m, n+1)\n\n The mesh need not be regular and the polygons need not be convex.\n\n Parameters\n ----------\n coordinates : (M+1, N+1, 2) array-like\n The vertices. 
``coordinates[m, n]`` specifies the (x, y) coordinates\n of vertex (m, n).\n\n shading : {'flat', 'gouraud'}, default: 'flat'\n """\n def __init__(self, coordinates, *, shading='flat'):\n _api.check_shape((None, None, 2), coordinates=coordinates)\n self._coordinates = coordinates\n self._shading = shading\n\n def set_array(self, A):\n """\n Set the data values.\n\n Parameters\n ----------\n A : array-like\n The mesh data. Supported array shapes are:\n\n - (M, N) or (M*N,): a mesh with scalar data. The values are mapped\n to colors using normalization and a colormap. See parameters\n *norm*, *cmap*, *vmin*, *vmax*.\n - (M, N, 3): an image with RGB values (0-1 float or 0-255 int).\n - (M, N, 4): an image with RGBA values (0-1 float or 0-255 int),\n i.e. including transparency.\n\n If the values are provided as a 2D grid, the shape must match the\n coordinates grid. If the values are 1D, they are reshaped to 2D.\n M, N follow from the coordinates grid, where the coordinates grid\n shape is (M, N) for 'gouraud' *shading* and (M+1, N+1) for 'flat'\n shading.\n """\n height, width = self._coordinates.shape[0:-1]\n if self._shading == 'flat':\n h, w = height - 1, width - 1\n else:\n h, w = height, width\n ok_shapes = [(h, w, 3), (h, w, 4), (h, w), (h * w,)]\n if A is not None:\n shape = np.shape(A)\n if shape not in ok_shapes:\n raise ValueError(\n f"For X ({width}) and Y ({height}) with {self._shading} "\n f"shading, A should have shape "\n f"{' or '.join(map(str, ok_shapes))}, not {A.shape}")\n return super().set_array(A)\n\n def get_coordinates(self):\n """\n Return the vertices of the mesh as an (M+1, N+1, 2) array.\n\n M, N are the number of quadrilaterals in the rows / columns of the\n mesh, corresponding to (M+1, N+1) vertices.\n The last dimension specifies the components (x, y).\n """\n return self._coordinates\n\n def get_edgecolor(self):\n # docstring inherited\n # Note that we want to return an array of shape (N*M, 4)\n # a flattened RGBA collection\n return 
super().get_edgecolor().reshape(-1, 4)\n\n def get_facecolor(self):\n # docstring inherited\n # Note that we want to return an array of shape (N*M, 4)\n # a flattened RGBA collection\n return super().get_facecolor().reshape(-1, 4)\n\n @staticmethod\n def _convert_mesh_to_paths(coordinates):\n """\n Convert a given mesh into a sequence of `.Path` objects.\n\n This function is primarily of use to implementers of backends that do\n not directly support quadmeshes.\n """\n if isinstance(coordinates, np.ma.MaskedArray):\n c = coordinates.data\n else:\n c = coordinates\n points = np.concatenate([\n c[:-1, :-1],\n c[:-1, 1:],\n c[1:, 1:],\n c[1:, :-1],\n c[:-1, :-1]\n ], axis=2).reshape((-1, 5, 2))\n return [mpath.Path(x) for x in points]\n\n def _convert_mesh_to_triangles(self, coordinates):\n """\n Convert a given mesh into a sequence of triangles, each point\n with its own color. The result can be used to construct a call to\n `~.RendererBase.draw_gouraud_triangles`.\n """\n if isinstance(coordinates, np.ma.MaskedArray):\n p = coordinates.data\n else:\n p = coordinates\n\n p_a = p[:-1, :-1]\n p_b = p[:-1, 1:]\n p_c = p[1:, 1:]\n p_d = p[1:, :-1]\n p_center = (p_a + p_b + p_c + p_d) / 4.0\n triangles = np.concatenate([\n p_a, p_b, p_center,\n p_b, p_c, p_center,\n p_c, p_d, p_center,\n p_d, p_a, p_center,\n ], axis=2).reshape((-1, 3, 2))\n\n c = self.get_facecolor().reshape((*coordinates.shape[:2], 4))\n z = self.get_array()\n mask = z.mask if np.ma.is_masked(z) else None\n if mask is not None:\n c[mask, 3] = np.nan\n c_a = c[:-1, :-1]\n c_b = c[:-1, 1:]\n c_c = c[1:, 1:]\n c_d = c[1:, :-1]\n c_center = (c_a + c_b + c_c + c_d) / 4.0\n colors = np.concatenate([\n c_a, c_b, c_center,\n c_b, c_c, c_center,\n c_c, c_d, c_center,\n c_d, c_a, c_center,\n ], axis=2).reshape((-1, 3, 4))\n tmask = np.isnan(colors[..., 2, 3])\n return triangles[~tmask], colors[~tmask]\n\n\nclass QuadMesh(_MeshData, Collection):\n r"""\n Class for the efficient drawing of a quadrilateral mesh.\n\n 
A quadrilateral mesh is a grid of M by N adjacent quadrilaterals that are\n defined via a (M+1, N+1) grid of vertices. The quadrilateral (m, n) is\n defined by the vertices ::\n\n (m+1, n) ----------- (m+1, n+1)\n / /\n / /\n / /\n (m, n) -------- (m, n+1)\n\n The mesh need not be regular and the polygons need not be convex.\n\n Parameters\n ----------\n coordinates : (M+1, N+1, 2) array-like\n The vertices. ``coordinates[m, n]`` specifies the (x, y) coordinates\n of vertex (m, n).\n\n antialiased : bool, default: True\n\n shading : {'flat', 'gouraud'}, default: 'flat'\n\n Notes\n -----\n Unlike other `.Collection`\s, the default *pickradius* of `.QuadMesh` is 0,\n i.e. `~.Artist.contains` checks whether the test point is within any of the\n mesh quadrilaterals.\n\n """\n\n def __init__(self, coordinates, *, antialiased=True, shading='flat',\n **kwargs):\n kwargs.setdefault("pickradius", 0)\n super().__init__(coordinates=coordinates, shading=shading)\n Collection.__init__(self, **kwargs)\n\n self._antialiased = antialiased\n self._bbox = transforms.Bbox.unit()\n self._bbox.update_from_data_xy(self._coordinates.reshape(-1, 2))\n self.set_mouseover(False)\n\n def get_paths(self):\n if self._paths is None:\n self.set_paths()\n return self._paths\n\n def set_paths(self):\n self._paths = self._convert_mesh_to_paths(self._coordinates)\n self.stale = True\n\n def get_datalim(self, transData):\n return (self.get_transform() - transData).transform_bbox(self._bbox)\n\n @artist.allow_rasterization\n def draw(self, renderer):\n if not self.get_visible():\n return\n renderer.open_group(self.__class__.__name__, self.get_gid())\n transform = self.get_transform()\n offset_trf = self.get_offset_transform()\n offsets = self.get_offsets()\n\n if self.have_units():\n xs = self.convert_xunits(offsets[:, 0])\n ys = self.convert_yunits(offsets[:, 1])\n offsets = np.column_stack([xs, ys])\n\n self.update_scalarmappable()\n\n if not transform.is_affine:\n coordinates = 
self._coordinates.reshape((-1, 2))\n coordinates = transform.transform(coordinates)\n coordinates = coordinates.reshape(self._coordinates.shape)\n transform = transforms.IdentityTransform()\n else:\n coordinates = self._coordinates\n\n if not offset_trf.is_affine:\n offsets = offset_trf.transform_non_affine(offsets)\n offset_trf = offset_trf.get_affine()\n\n gc = renderer.new_gc()\n gc.set_snap(self.get_snap())\n self._set_gc_clip(gc)\n gc.set_linewidth(self.get_linewidth()[0])\n\n if self._shading == 'gouraud':\n triangles, colors = self._convert_mesh_to_triangles(coordinates)\n renderer.draw_gouraud_triangles(\n gc, triangles, colors, transform.frozen())\n else:\n renderer.draw_quad_mesh(\n gc, transform.frozen(),\n coordinates.shape[1] - 1, coordinates.shape[0] - 1,\n coordinates, offsets, offset_trf,\n # Backends expect flattened rgba arrays (n*m, 4) for fc and ec\n self.get_facecolor().reshape((-1, 4)),\n self._antialiased, self.get_edgecolors().reshape((-1, 4)))\n gc.restore()\n renderer.close_group(self.__class__.__name__)\n self.stale = False\n\n def get_cursor_data(self, event):\n contained, info = self.contains(event)\n if contained and self.get_array() is not None:\n return self.get_array().ravel()[info["ind"]]\n return None\n\n\nclass PolyQuadMesh(_MeshData, PolyCollection):\n """\n Class for drawing a quadrilateral mesh as individual Polygons.\n\n A quadrilateral mesh is a grid of M by N adjacent quadrilaterals that are\n defined via a (M+1, N+1) grid of vertices. The quadrilateral (m, n) is\n defined by the vertices ::\n\n (m+1, n) ----------- (m+1, n+1)\n / /\n / /\n / /\n (m, n) -------- (m, n+1)\n\n The mesh need not be regular and the polygons need not be convex.\n\n Parameters\n ----------\n coordinates : (M+1, N+1, 2) array-like\n The vertices. 
``coordinates[m, n]`` specifies the (x, y) coordinates\n of vertex (m, n).\n\n Notes\n -----\n Unlike `.QuadMesh`, this class will draw each cell as an individual Polygon.\n This is significantly slower, but allows for more flexibility when wanting\n to add additional properties to the cells, such as hatching.\n\n Another difference from `.QuadMesh` is that if any of the vertices or data\n of a cell are masked, that Polygon will **not** be drawn and it won't be in\n the list of paths returned.\n """\n\n def __init__(self, coordinates, **kwargs):\n super().__init__(coordinates=coordinates)\n PolyCollection.__init__(self, verts=[], **kwargs)\n # Setting the verts updates the paths of the PolyCollection\n # This is called after the initializers to make sure the kwargs\n # have all been processed and available for the masking calculations\n self._set_unmasked_verts()\n\n def _get_unmasked_polys(self):\n """Get the unmasked regions using the coordinates and array"""\n # mask(X) | mask(Y)\n mask = np.any(np.ma.getmaskarray(self._coordinates), axis=-1)\n\n # We want the shape of the polygon, which is the corner of each X/Y array\n mask = (mask[0:-1, 0:-1] | mask[1:, 1:] | mask[0:-1, 1:] | mask[1:, 0:-1])\n arr = self.get_array()\n if arr is not None:\n arr = np.ma.getmaskarray(arr)\n if arr.ndim == 3:\n # RGB(A) case\n mask |= np.any(arr, axis=-1)\n elif arr.ndim == 2:\n mask |= arr\n else:\n mask |= arr.reshape(self._coordinates[:-1, :-1, :].shape[:2])\n return ~mask\n\n def _set_unmasked_verts(self):\n X = self._coordinates[..., 0]\n Y = self._coordinates[..., 1]\n\n unmask = self._get_unmasked_polys()\n X1 = np.ma.filled(X[:-1, :-1])[unmask]\n Y1 = np.ma.filled(Y[:-1, :-1])[unmask]\n X2 = np.ma.filled(X[1:, :-1])[unmask]\n Y2 = np.ma.filled(Y[1:, :-1])[unmask]\n X3 = np.ma.filled(X[1:, 1:])[unmask]\n Y3 = np.ma.filled(Y[1:, 1:])[unmask]\n X4 = np.ma.filled(X[:-1, 1:])[unmask]\n Y4 = np.ma.filled(Y[:-1, 1:])[unmask]\n npoly = len(X1)\n\n xy = np.ma.stack([X1, Y1, X2, 
Y2, X3, Y3, X4, Y4, X1, Y1], axis=-1)\n        verts = xy.reshape((npoly, 5, 2))\n        self.set_verts(verts)\n\n    def get_edgecolor(self):\n        # docstring inherited\n        # We only want to return the edgecolors of the polygons\n        # that were drawn.\n        ec = super().get_edgecolor()\n        unmasked_polys = self._get_unmasked_polys().ravel()\n        if len(ec) != len(unmasked_polys):\n            # Mapping is off\n            return ec\n        return ec[unmasked_polys, :]\n\n    def get_facecolor(self):\n        # docstring inherited\n        # We only want to return the facecolors of the polygons\n        # that were drawn.\n        fc = super().get_facecolor()\n        unmasked_polys = self._get_unmasked_polys().ravel()\n        if len(fc) != len(unmasked_polys):\n            # Mapping is off\n            return fc\n        return fc[unmasked_polys, :]\n\n    def set_array(self, A):\n        # docstring inherited\n        prev_unmask = self._get_unmasked_polys()\n        super().set_array(A)\n        # If the mask has changed at all we need to update\n        # the set of Polys that we are drawing\n        if not np.array_equal(prev_unmask, self._get_unmasked_polys()):\n            self._set_unmasked_verts()\n
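The flattened ``(N*M, 4)`` RGBA convention that `QuadMesh.get_facecolor` documents above can be seen on a mesh produced by `~.axes.Axes.pcolormesh`. A minimal sketch, assuming matplotlib and numpy are available (the grid sizes are illustrative):

```python
import matplotlib
matplotlib.use("Agg")  # headless backend so the sketch runs anywhere
import matplotlib.pyplot as plt
import numpy as np

# A (M+1, N+1) = (4, 5) vertex grid defines a 3x4 mesh of quadrilaterals.
x, y = np.meshgrid(np.arange(5), np.arange(4))
z = np.arange(12).reshape(3, 4)  # one value per quad ('flat' shading)

fig, ax = plt.subplots()
mesh = ax.pcolormesh(x, y, z)  # returns a QuadMesh
fig.canvas.draw()              # draw triggers update_scalarmappable()

print(type(mesh).__name__)           # QuadMesh
print(mesh.get_coordinates().shape)  # (4, 5, 2)
print(mesh.get_facecolor().shape)    # (12, 4): flattened RGBA, one row per quad
```

Note that the face colors are only resolved from the array and norm once `update_scalarmappable` has run, which is why the sketch draws the figure first.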
from collections.abc import Callable, Iterable, Sequence\nfrom typing import Literal\n\nimport numpy as np\nfrom numpy.typing import ArrayLike, NDArray\n\nfrom . import colorizer, transforms\nfrom .backend_bases import MouseEvent\nfrom .artist import Artist\nfrom .colors import Normalize, Colormap\nfrom .lines import Line2D\nfrom .path import Path\nfrom .patches import Patch\nfrom .ticker import Locator, Formatter\nfrom .tri import Triangulation\nfrom .typing import ColorType, LineStyleType, CapStyleType, JoinStyleType\n\nclass Collection(colorizer.ColorizingArtist):\n def __init__(\n self,\n *,\n edgecolors: ColorType | Sequence[ColorType] | None = ...,\n facecolors: ColorType | Sequence[ColorType] | None = ...,\n linewidths: float | Sequence[float] | None = ...,\n linestyles: LineStyleType | Sequence[LineStyleType] = ...,\n capstyle: CapStyleType | None = ...,\n joinstyle: JoinStyleType | None = ...,\n antialiaseds: bool | Sequence[bool] | None = ...,\n offsets: tuple[float, float] | Sequence[tuple[float, float]] | None = ...,\n offset_transform: transforms.Transform | None = ...,\n norm: Normalize | None = ...,\n cmap: Colormap | None = ...,\n colorizer: colorizer.Colorizer | None = ...,\n pickradius: float = ...,\n hatch: str | None = ...,\n urls: Sequence[str] | None = ...,\n zorder: float = ...,\n **kwargs\n ) -> None: ...\n def get_paths(self) -> Sequence[Path]: ...\n def set_paths(self, paths: Sequence[Path]) -> None: ...\n def get_transforms(self) -> Sequence[transforms.Transform]: ...\n def get_offset_transform(self) -> transforms.Transform: ...\n def set_offset_transform(self, offset_transform: transforms.Transform) -> None: ...\n def get_datalim(self, transData: transforms.Transform) -> transforms.Bbox: ...\n def set_pickradius(self, pickradius: float) -> None: ...\n def get_pickradius(self) -> float: ...\n def set_urls(self, urls: Sequence[str | None]) -> None: ...\n def get_urls(self) -> Sequence[str | None]: ...\n def set_hatch(self, hatch: str) -> 
None: ...\n def get_hatch(self) -> str: ...\n def set_hatch_linewidth(self, lw: float) -> None: ...\n def get_hatch_linewidth(self) -> float: ...\n def set_offsets(self, offsets: ArrayLike) -> None: ...\n def get_offsets(self) -> ArrayLike: ...\n def set_linewidth(self, lw: float | Sequence[float]) -> None: ...\n def set_linestyle(self, ls: LineStyleType | Sequence[LineStyleType]) -> None: ...\n def set_capstyle(self, cs: CapStyleType) -> None: ...\n def get_capstyle(self) -> Literal["butt", "projecting", "round"] | None: ...\n def set_joinstyle(self, js: JoinStyleType) -> None: ...\n def get_joinstyle(self) -> Literal["miter", "round", "bevel"] | None: ...\n def set_antialiased(self, aa: bool | Sequence[bool]) -> None: ...\n def get_antialiased(self) -> NDArray[np.bool_]: ...\n def set_color(self, c: ColorType | Sequence[ColorType]) -> None: ...\n def set_facecolor(self, c: ColorType | Sequence[ColorType]) -> None: ...\n def get_facecolor(self) -> ColorType | Sequence[ColorType]: ...\n def get_edgecolor(self) -> ColorType | Sequence[ColorType]: ...\n def set_edgecolor(self, c: ColorType | Sequence[ColorType]) -> None: ...\n def set_alpha(self, alpha: float | Sequence[float] | None) -> None: ...\n def get_linewidth(self) -> float | Sequence[float]: ...\n def get_linestyle(self) -> LineStyleType | Sequence[LineStyleType]: ...\n def update_scalarmappable(self) -> None: ...\n def get_fill(self) -> bool: ...\n def update_from(self, other: Artist) -> None: ...\n\nclass _CollectionWithSizes(Collection):\n def get_sizes(self) -> np.ndarray: ...\n def set_sizes(self, sizes: ArrayLike | None, dpi: float = ...) 
-> None: ...\n\nclass PathCollection(_CollectionWithSizes):\n def __init__(\n self, paths: Sequence[Path], sizes: ArrayLike | None = ..., **kwargs\n ) -> None: ...\n def set_paths(self, paths: Sequence[Path]) -> None: ...\n def get_paths(self) -> Sequence[Path]: ...\n def legend_elements(\n self,\n prop: Literal["colors", "sizes"] = ...,\n num: int | Literal["auto"] | ArrayLike | Locator = ...,\n fmt: str | Formatter | None = ...,\n func: Callable[[ArrayLike], ArrayLike] = ...,\n **kwargs,\n ) -> tuple[list[Line2D], list[str]]: ...\n\nclass PolyCollection(_CollectionWithSizes):\n def __init__(\n self,\n verts: Sequence[ArrayLike],\n sizes: ArrayLike | None = ...,\n *,\n closed: bool = ...,\n **kwargs\n ) -> None: ...\n def set_verts(\n self, verts: Sequence[ArrayLike | Path], closed: bool = ...\n ) -> None: ...\n def set_paths(self, verts: Sequence[Path], closed: bool = ...) -> None: ...\n def set_verts_and_codes(\n self, verts: Sequence[ArrayLike | Path], codes: Sequence[int]\n ) -> None: ...\n\nclass FillBetweenPolyCollection(PolyCollection):\n def __init__(\n self,\n t_direction: Literal["x", "y"],\n t: ArrayLike,\n f1: ArrayLike,\n f2: ArrayLike,\n *,\n where: Sequence[bool] | None = ...,\n interpolate: bool = ...,\n step: Literal["pre", "post", "mid"] | None = ...,\n **kwargs,\n ) -> None: ...\n def set_data(\n self,\n t: ArrayLike,\n f1: ArrayLike,\n f2: ArrayLike,\n *,\n where: Sequence[bool] | None = ...,\n ) -> None: ...\n def get_datalim(self, transData: transforms.Transform) -> transforms.Bbox: ...\n\nclass RegularPolyCollection(_CollectionWithSizes):\n def __init__(\n self, numsides: int, *, rotation: float = ..., sizes: ArrayLike = ..., **kwargs\n ) -> None: ...\n def get_numsides(self) -> int: ...\n def get_rotation(self) -> float: ...\n\nclass StarPolygonCollection(RegularPolyCollection): ...\nclass AsteriskPolygonCollection(RegularPolyCollection): ...\n\nclass LineCollection(Collection):\n def __init__(\n self, segments: Sequence[ArrayLike], *, 
zorder: float = ..., **kwargs\n ) -> None: ...\n def set_segments(self, segments: Sequence[ArrayLike] | None) -> None: ...\n def set_verts(self, segments: Sequence[ArrayLike] | None) -> None: ...\n def set_paths(self, segments: Sequence[ArrayLike] | None) -> None: ... # type: ignore[override]\n def get_segments(self) -> list[np.ndarray]: ...\n def set_color(self, c: ColorType | Sequence[ColorType]) -> None: ...\n def set_colors(self, c: ColorType | Sequence[ColorType]) -> None: ...\n def set_gapcolor(self, gapcolor: ColorType | Sequence[ColorType] | None) -> None: ...\n def get_color(self) -> ColorType | Sequence[ColorType]: ...\n def get_colors(self) -> ColorType | Sequence[ColorType]: ...\n def get_gapcolor(self) -> ColorType | Sequence[ColorType] | None: ...\n\n\nclass EventCollection(LineCollection):\n def __init__(\n self,\n positions: ArrayLike,\n orientation: Literal["horizontal", "vertical"] = ...,\n *,\n lineoffset: float = ...,\n linelength: float = ...,\n linewidth: float | Sequence[float] | None = ...,\n color: ColorType | Sequence[ColorType] | None = ...,\n linestyle: LineStyleType | Sequence[LineStyleType] = ...,\n antialiased: bool | Sequence[bool] | None = ...,\n **kwargs\n ) -> None: ...\n def get_positions(self) -> list[float]: ...\n def set_positions(self, positions: Sequence[float] | None) -> None: ...\n def add_positions(self, position: Sequence[float] | None) -> None: ...\n def extend_positions(self, position: Sequence[float] | None) -> None: ...\n def append_positions(self, position: Sequence[float] | None) -> None: ...\n def is_horizontal(self) -> bool: ...\n def get_orientation(self) -> Literal["horizontal", "vertical"]: ...\n def switch_orientation(self) -> None: ...\n def set_orientation(\n self, orientation: Literal["horizontal", "vertical"]\n ) -> None: ...\n def get_linelength(self) -> float | Sequence[float]: ...\n def set_linelength(self, linelength: float | Sequence[float]) -> None: ...\n def get_lineoffset(self) -> float: ...\n def 
set_lineoffset(self, lineoffset: float) -> None: ...\n def get_linewidth(self) -> float: ...\n def get_linewidths(self) -> Sequence[float]: ...\n def get_color(self) -> ColorType: ...\n\nclass CircleCollection(_CollectionWithSizes):\n def __init__(self, sizes: float | ArrayLike, **kwargs) -> None: ...\n\nclass EllipseCollection(Collection):\n def __init__(\n self,\n widths: ArrayLike,\n heights: ArrayLike,\n angles: ArrayLike,\n *,\n units: Literal[\n "points", "inches", "dots", "width", "height", "x", "y", "xy"\n ] = ...,\n **kwargs\n ) -> None: ...\n def set_widths(self, widths: ArrayLike) -> None: ...\n def set_heights(self, heights: ArrayLike) -> None: ...\n def set_angles(self, angles: ArrayLike) -> None: ...\n def get_widths(self) -> ArrayLike: ...\n def get_heights(self) -> ArrayLike: ...\n def get_angles(self) -> ArrayLike: ...\n\nclass PatchCollection(Collection):\n def __init__(\n self, patches: Iterable[Patch], *, match_original: bool = ..., **kwargs\n ) -> None: ...\n def set_paths(self, patches: Iterable[Patch]) -> None: ... # type: ignore[override]\n\nclass TriMesh(Collection):\n def __init__(self, triangulation: Triangulation, **kwargs) -> None: ...\n def get_paths(self) -> list[Path]: ...\n # Parent class has an argument, perhaps add a noop arg?\n def set_paths(self) -> None: ... 
# type: ignore[override]\n    @staticmethod\n    def convert_mesh_to_paths(tri: Triangulation) -> list[Path]: ...\n\nclass _MeshData:\n    def __init__(\n        self,\n        coordinates: ArrayLike,\n        *,\n        shading: Literal["flat", "gouraud"] = ...,\n    ) -> None: ...\n    def set_array(self, A: ArrayLike | None) -> None: ...\n    def get_coordinates(self) -> ArrayLike: ...\n    def get_facecolor(self) -> ColorType | Sequence[ColorType]: ...\n    def get_edgecolor(self) -> ColorType | Sequence[ColorType]: ...\n\nclass QuadMesh(_MeshData, Collection):\n    def __init__(\n        self,\n        coordinates: ArrayLike,\n        *,\n        antialiased: bool = ...,\n        shading: Literal["flat", "gouraud"] = ...,\n        **kwargs\n    ) -> None: ...\n    def get_paths(self) -> list[Path]: ...\n    # Parent class has an argument, perhaps add a noop arg?\n    def set_paths(self) -> None: ...  # type: ignore[override]\n    def get_datalim(self, transData: transforms.Transform) -> transforms.Bbox: ...\n    def get_cursor_data(self, event: MouseEvent) -> float: ...\n\nclass PolyQuadMesh(_MeshData, PolyCollection):\n    def __init__(\n        self,\n        coordinates: ArrayLike,\n        **kwargs\n    ) -> None: ...\n
"""\nColorbars are a visualization of the mapping from scalar values to colors.\nIn Matplotlib they are drawn into a dedicated `~.axes.Axes`.\n\n.. note::\n Colorbars are typically created through `.Figure.colorbar` or its pyplot\n wrapper `.pyplot.colorbar`, which internally use `.Colorbar` together with\n `.make_axes_gridspec` (for `.GridSpec`-positioned Axes) or `.make_axes` (for\n non-`.GridSpec`-positioned Axes).\n\n End-users most likely won't need to directly use this module's API.\n"""\n\nimport logging\n\nimport numpy as np\n\nimport matplotlib as mpl\nfrom matplotlib import _api, cbook, collections, cm, colors, contour, ticker\nimport matplotlib.artist as martist\nimport matplotlib.patches as mpatches\nimport matplotlib.path as mpath\nimport matplotlib.spines as mspines\nimport matplotlib.transforms as mtransforms\nfrom matplotlib import _docstring\n\n_log = logging.getLogger(__name__)\n\n_docstring.interpd.register(\n _make_axes_kw_doc="""\nlocation : None or {'left', 'right', 'top', 'bottom'}\n The location, relative to the parent Axes, where the colorbar Axes\n is created. It also determines the *orientation* of the colorbar\n (colorbars on the left and right are vertical, colorbars at the top\n and bottom are horizontal). If None, the location will come from the\n *orientation* if it is set (vertical colorbars on the right, horizontal\n ones at the bottom), or default to 'right' if *orientation* is unset.\n\norientation : None or {'vertical', 'horizontal'}\n The orientation of the colorbar. 
It is preferable to set the *location*\n of the colorbar, as that also determines the *orientation*; passing\n incompatible values for *location* and *orientation* raises an exception.\n\nfraction : float, default: 0.15\n Fraction of original Axes to use for colorbar.\n\nshrink : float, default: 1.0\n Fraction by which to multiply the size of the colorbar.\n\naspect : float, default: 20\n Ratio of long to short dimensions.\n\npad : float, default: 0.05 if vertical, 0.15 if horizontal\n Fraction of original Axes between colorbar and new image Axes.\n\nanchor : (float, float), optional\n The anchor point of the colorbar Axes.\n Defaults to (0.0, 0.5) if vertical; (0.5, 1.0) if horizontal.\n\npanchor : (float, float), or *False*, optional\n The anchor point of the colorbar parent Axes. If *False*, the parent\n axes' anchor will be unchanged.\n Defaults to (1.0, 0.5) if vertical; (0.5, 0.0) if horizontal.""",\n _colormap_kw_doc="""\nextend : {'neither', 'both', 'min', 'max'}\n Make pointed end(s) for out-of-range values (unless 'neither'). These are\n set for a given colormap using the colormap set_under and set_over methods.\n\nextendfrac : {*None*, 'auto', length, lengths}\n If set to *None*, both the minimum and maximum triangular colorbar\n extensions will have a length of 5% of the interior colorbar length (this\n is the default setting).\n\n If set to 'auto', makes the triangular colorbar extensions the same lengths\n as the interior boxes (when *spacing* is set to 'uniform') or the same\n lengths as the respective adjacent interior boxes (when *spacing* is set to\n 'proportional').\n\n If a scalar, indicates the length of both the minimum and maximum\n triangular colorbar extensions as a fraction of the interior colorbar\n length. 
A two-element sequence of fractions may also be given, indicating\n the lengths of the minimum and maximum colorbar extensions respectively as\n a fraction of the interior colorbar length.\n\nextendrect : bool\n If *False* the minimum and maximum colorbar extensions will be triangular\n (the default). If *True* the extensions will be rectangular.\n\nticks : None or list of ticks or Locator\n If None, ticks are determined automatically from the input.\n\nformat : None or str or Formatter\n If None, `~.ticker.ScalarFormatter` is used.\n Format strings, e.g., ``"%4.2e"`` or ``"{x:.2e}"``, are supported.\n An alternative `~.ticker.Formatter` may be given instead.\n\ndrawedges : bool\n Whether to draw lines at color boundaries.\n\nlabel : str\n The label on the colorbar's long axis.\n\nboundaries, values : None or a sequence\n If unset, the colormap will be displayed on a 0-1 scale.\n If sequences, *values* must have a length 1 less than *boundaries*. For\n each region delimited by adjacent entries in *boundaries*, the color mapped\n to the corresponding value in *values* will be used. The size of each\n region is determined by the *spacing* parameter.\n Normally only useful for indexed colors (i.e. 
``norm=NoNorm()``) or other\n unusual circumstances.\n\nspacing : {'uniform', 'proportional'}\n For discrete colorbars (`.BoundaryNorm` or contours), 'uniform' gives each\n color the same space; 'proportional' makes the space proportional to the\n data interval.""")\n\n\ndef _set_ticks_on_axis_warn(*args, **kwargs):\n # a top level function which gets put in at the axes'\n # set_xticks and set_yticks by Colorbar.__init__.\n _api.warn_external("Use the colorbar set_ticks() method instead.")\n\n\nclass _ColorbarSpine(mspines.Spine):\n def __init__(self, axes):\n self._ax = axes\n super().__init__(axes, 'colorbar', mpath.Path(np.empty((0, 2))))\n mpatches.Patch.set_transform(self, axes.transAxes)\n\n def get_window_extent(self, renderer=None):\n # This Spine has no Axis associated with it, and doesn't need to adjust\n # its location, so we can directly get the window extent from the\n # super-super-class.\n return mpatches.Patch.get_window_extent(self, renderer=renderer)\n\n def set_xy(self, xy):\n self._path = mpath.Path(xy, closed=True)\n self._xy = xy\n self.stale = True\n\n def draw(self, renderer):\n ret = mpatches.Patch.draw(self, renderer)\n self.stale = False\n return ret\n\n\nclass _ColorbarAxesLocator:\n """\n Shrink the Axes if there are triangular or rectangular extends.\n """\n def __init__(self, cbar):\n self._cbar = cbar\n self._orig_locator = cbar.ax._axes_locator\n\n def __call__(self, ax, renderer):\n if self._orig_locator is not None:\n pos = self._orig_locator(ax, renderer)\n else:\n pos = ax.get_position(original=True)\n if self._cbar.extend == 'neither':\n return pos\n\n y, extendlen = self._cbar._proportional_y()\n if not self._cbar._extend_lower():\n extendlen[0] = 0\n if not self._cbar._extend_upper():\n extendlen[1] = 0\n len = sum(extendlen) + 1\n shrink = 1 / len\n offset = extendlen[0] / len\n # we need to reset the aspect ratio of the axes to account\n # of the extends...\n if hasattr(ax, '_colorbar_info'):\n aspect = 
ax._colorbar_info['aspect']\n else:\n aspect = False\n # now shrink and/or offset to take into account the\n # extend tri/rectangles.\n if self._cbar.orientation == 'vertical':\n if aspect:\n self._cbar.ax.set_box_aspect(aspect*shrink)\n pos = pos.shrunk(1, shrink).translated(0, offset * pos.height)\n else:\n if aspect:\n self._cbar.ax.set_box_aspect(1/(aspect * shrink))\n pos = pos.shrunk(shrink, 1).translated(offset * pos.width, 0)\n return pos\n\n def get_subplotspec(self):\n # make tight_layout happy..\n return (\n self._cbar.ax.get_subplotspec()\n or getattr(self._orig_locator, "get_subplotspec", lambda: None)())\n\n\n@_docstring.interpd\nclass Colorbar:\n r"""\n Draw a colorbar in an existing Axes.\n\n Typically, colorbars are created using `.Figure.colorbar` or\n `.pyplot.colorbar` and associated with `.ScalarMappable`\s (such as an\n `.AxesImage` generated via `~.axes.Axes.imshow`).\n\n In order to draw a colorbar not associated with other elements in the\n figure, e.g. when showing a colormap by itself, one can create an empty\n `.ScalarMappable`, or directly pass *cmap* and *norm* instead of *mappable*\n to `Colorbar`.\n\n Useful public methods are :meth:`set_label` and :meth:`add_lines`.\n\n Attributes\n ----------\n ax : `~matplotlib.axes.Axes`\n The `~.axes.Axes` instance in which the colorbar is drawn.\n lines : list\n A list of `.LineCollection` (empty if no lines were drawn).\n dividers : `.LineCollection`\n A LineCollection (empty if *drawedges* is ``False``).\n """\n\n n_rasterize = 50 # rasterize solids if number of colors >= n_rasterize\n\n def __init__(\n self, ax, mappable=None, *,\n alpha=None,\n location=None,\n extend=None,\n extendfrac=None,\n extendrect=False,\n ticks=None,\n format=None,\n values=None,\n boundaries=None,\n spacing='uniform',\n drawedges=False,\n label='',\n cmap=None, norm=None, # redundant with *mappable*\n orientation=None, ticklocation='auto', # redundant with *location*\n ):\n """\n Parameters\n ----------\n ax : 
`~matplotlib.axes.Axes`\n The `~.axes.Axes` instance in which the colorbar is drawn.\n\n mappable : `.ScalarMappable`\n The mappable whose colormap and norm will be used.\n\n To show the colors versus index instead of on a 0-1 scale, set the\n mappable's norm to ``colors.NoNorm()``.\n\n alpha : float\n The colorbar transparency between 0 (transparent) and 1 (opaque).\n\n location : None or {'left', 'right', 'top', 'bottom'}\n Set the colorbar's *orientation* and *ticklocation*. Colorbars on\n the left and right are vertical, colorbars at the top and bottom\n are horizontal. The *ticklocation* is the same as *location*, so if\n *location* is 'top', the ticks are on the top. *orientation* and/or\n *ticklocation* can be provided as well and overrides the value set by\n *location*, but there will be an error for incompatible combinations.\n\n .. versionadded:: 3.7\n\n %(_colormap_kw_doc)s\n\n Other Parameters\n ----------------\n cmap : `~matplotlib.colors.Colormap`, default: :rc:`image.cmap`\n The colormap to use. This parameter is ignored, unless *mappable* is\n None.\n\n norm : `~matplotlib.colors.Normalize`\n The normalization to use. This parameter is ignored, unless *mappable*\n is None.\n\n orientation : None or {'vertical', 'horizontal'}\n If None, use the value determined by *location*. If both\n *orientation* and *location* are None then defaults to 'vertical'.\n\n ticklocation : {'auto', 'left', 'right', 'top', 'bottom'}\n The location of the colorbar ticks. The *ticklocation* must match\n *orientation*. For example, a horizontal colorbar can only have ticks\n at the top or the bottom. If 'auto', the ticks will be the same as\n *location*, so a colorbar to the left will have ticks to the left. 
If\n *location* is None, the ticks will be at the bottom for a horizontal\n colorbar and at the right for a vertical.\n """\n if mappable is None:\n mappable = cm.ScalarMappable(norm=norm, cmap=cmap)\n\n self.mappable = mappable\n cmap = mappable.cmap\n norm = mappable.norm\n\n filled = True\n if isinstance(mappable, contour.ContourSet):\n cs = mappable\n alpha = cs.get_alpha()\n boundaries = cs._levels\n values = cs.cvalues\n extend = cs.extend\n filled = cs.filled\n if ticks is None:\n ticks = ticker.FixedLocator(cs.levels, nbins=10)\n elif isinstance(mappable, martist.Artist):\n alpha = mappable.get_alpha()\n\n mappable.colorbar = self\n mappable.colorbar_cid = mappable.callbacks.connect(\n 'changed', self.update_normal)\n\n location_orientation = _get_orientation_from_location(location)\n\n _api.check_in_list(\n [None, 'vertical', 'horizontal'], orientation=orientation)\n _api.check_in_list(\n ['auto', 'left', 'right', 'top', 'bottom'],\n ticklocation=ticklocation)\n _api.check_in_list(\n ['uniform', 'proportional'], spacing=spacing)\n\n if location_orientation is not None and orientation is not None:\n if location_orientation != orientation:\n raise TypeError(\n "location and orientation are mutually exclusive")\n else:\n orientation = orientation or location_orientation or "vertical"\n\n self.ax = ax\n self.ax._axes_locator = _ColorbarAxesLocator(self)\n\n if extend is None:\n if (not isinstance(mappable, contour.ContourSet)\n and getattr(cmap, 'colorbar_extend', False) is not False):\n extend = cmap.colorbar_extend\n elif hasattr(norm, 'extend'):\n extend = norm.extend\n else:\n extend = 'neither'\n self.alpha = None\n # Call set_alpha to handle array-like alphas properly\n self.set_alpha(alpha)\n self.cmap = cmap\n self.norm = norm\n self.values = values\n self.boundaries = boundaries\n self.extend = extend\n self._inside = _api.check_getitem(\n {'neither': slice(0, None), 'both': slice(1, -1),\n 'min': slice(1, None), 'max': slice(0, -1)},\n 
extend=extend)\n self.spacing = spacing\n self.orientation = orientation\n self.drawedges = drawedges\n self._filled = filled\n self.extendfrac = extendfrac\n self.extendrect = extendrect\n self._extend_patches = []\n self.solids = None\n self.solids_patches = []\n self.lines = []\n\n for spine in self.ax.spines.values():\n spine.set_visible(False)\n self.outline = self.ax.spines['outline'] = _ColorbarSpine(self.ax)\n\n self.dividers = collections.LineCollection(\n [],\n colors=[mpl.rcParams['axes.edgecolor']],\n linewidths=[0.5 * mpl.rcParams['axes.linewidth']],\n clip_on=False)\n self.ax.add_collection(self.dividers)\n\n self._locator = None\n self._minorlocator = None\n self._formatter = None\n self._minorformatter = None\n\n if ticklocation == 'auto':\n ticklocation = _get_ticklocation_from_orientation(\n orientation) if location is None else location\n self.ticklocation = ticklocation\n\n self.set_label(label)\n self._reset_locator_formatter_scale()\n\n if np.iterable(ticks):\n self._locator = ticker.FixedLocator(ticks, nbins=len(ticks))\n else:\n self._locator = ticks\n\n if isinstance(format, str):\n # Check format between FormatStrFormatter and StrMethodFormatter\n try:\n self._formatter = ticker.FormatStrFormatter(format)\n _ = self._formatter(0)\n except (TypeError, ValueError):\n self._formatter = ticker.StrMethodFormatter(format)\n else:\n self._formatter = format # Assume it is a Formatter or None\n self._draw_all()\n\n if isinstance(mappable, contour.ContourSet) and not mappable.filled:\n self.add_lines(mappable)\n\n # Link the Axes and Colorbar for interactive use\n self.ax._colorbar = self\n # Don't navigate on any of these types of mappables\n if (isinstance(self.norm, (colors.BoundaryNorm, colors.NoNorm)) or\n isinstance(self.mappable, contour.ContourSet)):\n self.ax.set_navigate(False)\n\n # These are the functions that set up interactivity on this colorbar\n self._interactive_funcs = ["_get_view", "_set_view",\n "_set_view_from_bbox", 
"drag_pan"]\n for x in self._interactive_funcs:\n setattr(self.ax, x, getattr(self, x))\n # Set the cla function to the cbar's method to override it\n self.ax.cla = self._cbar_cla\n # Callbacks for the extend calculations to handle inverting the axis\n self._extend_cid1 = self.ax.callbacks.connect(\n "xlim_changed", self._do_extends)\n self._extend_cid2 = self.ax.callbacks.connect(\n "ylim_changed", self._do_extends)\n\n @property\n def long_axis(self):\n """Axis that has decorations (ticks, etc) on it."""\n if self.orientation == 'vertical':\n return self.ax.yaxis\n return self.ax.xaxis\n\n @property\n def locator(self):\n """Major tick `.Locator` for the colorbar."""\n return self.long_axis.get_major_locator()\n\n @locator.setter\n def locator(self, loc):\n self.long_axis.set_major_locator(loc)\n self._locator = loc\n\n @property\n def minorlocator(self):\n """Minor tick `.Locator` for the colorbar."""\n return self.long_axis.get_minor_locator()\n\n @minorlocator.setter\n def minorlocator(self, loc):\n self.long_axis.set_minor_locator(loc)\n self._minorlocator = loc\n\n @property\n def formatter(self):\n """Major tick label `.Formatter` for the colorbar."""\n return self.long_axis.get_major_formatter()\n\n @formatter.setter\n def formatter(self, fmt):\n self.long_axis.set_major_formatter(fmt)\n self._formatter = fmt\n\n @property\n def minorformatter(self):\n """Minor tick `.Formatter` for the colorbar."""\n return self.long_axis.get_minor_formatter()\n\n @minorformatter.setter\n def minorformatter(self, fmt):\n self.long_axis.set_minor_formatter(fmt)\n self._minorformatter = fmt\n\n def _cbar_cla(self):\n """Function to clear the interactive colorbar state."""\n for x in self._interactive_funcs:\n delattr(self.ax, x)\n # We now restore the old cla() back and can call it directly\n del self.ax.cla\n self.ax.cla()\n\n def update_normal(self, mappable=None):\n """\n Update solid patches, lines, etc.\n\n This is meant to be called when the norm of the image or 
contour plot\n to which this colorbar belongs changes.\n\n If the norm on the mappable is different than before, this resets the\n locator and formatter for the axis, so if these have been customized,\n they will need to be customized again. However, if the norm only\n changes values of *vmin*, *vmax* or *cmap* then the old formatter\n and locator will be preserved.\n """\n if mappable:\n # The mappable keyword argument exists because\n # ScalarMappable.changed() emits self.callbacks.process('changed', self)\n # in contrast, ColorizingArtist (and Colorizer) does not use this keyword.\n # [ColorizingArtist.changed() emits self.callbacks.process('changed')]\n # Also, there is no test where self.mappable == mappable is not True\n # and possibly no use case.\n # Therefore, the mappable keyword can be deprecated if cm.ScalarMappable\n # is removed.\n self.mappable = mappable\n _log.debug('colorbar update normal %r %r', self.mappable.norm, self.norm)\n self.set_alpha(self.mappable.get_alpha())\n self.cmap = self.mappable.cmap\n if self.mappable.norm != self.norm:\n self.norm = self.mappable.norm\n self._reset_locator_formatter_scale()\n\n self._draw_all()\n if isinstance(self.mappable, contour.ContourSet):\n CS = self.mappable\n if not CS.filled:\n self.add_lines(CS)\n self.stale = True\n\n def _draw_all(self):\n """\n Calculate any free parameters based on the current cmap and norm,\n and do all the drawing.\n """\n if self.orientation == 'vertical':\n if mpl.rcParams['ytick.minor.visible']:\n self.minorticks_on()\n else:\n if mpl.rcParams['xtick.minor.visible']:\n self.minorticks_on()\n self.long_axis.set(label_position=self.ticklocation,\n ticks_position=self.ticklocation)\n self._short_axis().set_ticks([])\n self._short_axis().set_ticks([], minor=True)\n\n # Set self._boundaries and self._values, including extensions.\n # self._boundaries are the edges of each square of color, and\n # self._values are the value to map into the norm to get the\n # color:\n 
self._process_values()\n # Set self.vmin and self.vmax to first and last boundary, excluding\n # extensions:\n self.vmin, self.vmax = self._boundaries[self._inside][[0, -1]]\n # Compute the X/Y mesh.\n X, Y = self._mesh()\n # draw the extend triangles, and shrink the inner Axes to accommodate.\n # also adds the outline path to self.outline spine:\n self._do_extends()\n lower, upper = self.vmin, self.vmax\n if self.long_axis.get_inverted():\n # If the axis is inverted, we need to swap the vmin/vmax\n lower, upper = upper, lower\n if self.orientation == 'vertical':\n self.ax.set_xlim(0, 1)\n self.ax.set_ylim(lower, upper)\n else:\n self.ax.set_ylim(0, 1)\n self.ax.set_xlim(lower, upper)\n\n # set up the tick locators and formatters. A bit complicated because\n # boundary norms + uniform spacing requires a manual locator.\n self.update_ticks()\n\n if self._filled:\n ind = np.arange(len(self._values))\n if self._extend_lower():\n ind = ind[1:]\n if self._extend_upper():\n ind = ind[:-1]\n self._add_solids(X, Y, self._values[ind, np.newaxis])\n\n def _add_solids(self, X, Y, C):\n """Draw the colors; optionally add separators."""\n # Cleanup previously set artists.\n if self.solids is not None:\n self.solids.remove()\n for solid in self.solids_patches:\n solid.remove()\n # Add new artist(s), based on mappable type. 
Use individual patches if\n # hatching is needed, pcolormesh otherwise.\n mappable = getattr(self, 'mappable', None)\n if (isinstance(mappable, contour.ContourSet)\n and any(hatch is not None for hatch in mappable.hatches)):\n self._add_solids_patches(X, Y, C, mappable)\n else:\n self.solids = self.ax.pcolormesh(\n X, Y, C, cmap=self.cmap, norm=self.norm, alpha=self.alpha,\n edgecolors='none', shading='flat')\n if not self.drawedges:\n if len(self._y) >= self.n_rasterize:\n self.solids.set_rasterized(True)\n self._update_dividers()\n\n def _update_dividers(self):\n if not self.drawedges:\n self.dividers.set_segments([])\n return\n # Place all *internal* dividers.\n if self.orientation == 'vertical':\n lims = self.ax.get_ylim()\n bounds = (lims[0] < self._y) & (self._y < lims[1])\n else:\n lims = self.ax.get_xlim()\n bounds = (lims[0] < self._y) & (self._y < lims[1])\n y = self._y[bounds]\n # And then add outer dividers if extensions are on.\n if self._extend_lower():\n y = np.insert(y, 0, lims[0])\n if self._extend_upper():\n y = np.append(y, lims[1])\n X, Y = np.meshgrid([0, 1], y)\n if self.orientation == 'vertical':\n segments = np.dstack([X, Y])\n else:\n segments = np.dstack([Y, X])\n self.dividers.set_segments(segments)\n\n def _add_solids_patches(self, X, Y, C, mappable):\n hatches = mappable.hatches * (len(C) + 1) # Have enough hatches.\n if self._extend_lower():\n # remove first hatch that goes into the extend patch\n hatches = hatches[1:]\n patches = []\n for i in range(len(X) - 1):\n xy = np.array([[X[i, 0], Y[i, 1]],\n [X[i, 1], Y[i, 0]],\n [X[i + 1, 1], Y[i + 1, 0]],\n [X[i + 1, 0], Y[i + 1, 1]]])\n patch = mpatches.PathPatch(mpath.Path(xy),\n facecolor=self.cmap(self.norm(C[i][0])),\n hatch=hatches[i], linewidth=0,\n antialiased=False, alpha=self.alpha)\n self.ax.add_patch(patch)\n patches.append(patch)\n self.solids_patches = patches\n\n def _do_extends(self, ax=None):\n """\n Add the extend tri/rectangles on the outside of the Axes.\n\n ax is 
unused, but required due to the callbacks on xlim/ylim changed\n """\n # Clean up any previous extend patches\n for patch in self._extend_patches:\n patch.remove()\n self._extend_patches = []\n # extend lengths are fraction of the *inner* part of colorbar,\n # not the total colorbar:\n _, extendlen = self._proportional_y()\n bot = 0 - (extendlen[0] if self._extend_lower() else 0)\n top = 1 + (extendlen[1] if self._extend_upper() else 0)\n\n # xyout is the outline of the colorbar including the extend patches:\n if not self.extendrect:\n # triangle:\n xyout = np.array([[0, 0], [0.5, bot], [1, 0],\n [1, 1], [0.5, top], [0, 1], [0, 0]])\n else:\n # rectangle:\n xyout = np.array([[0, 0], [0, bot], [1, bot], [1, 0],\n [1, 1], [1, top], [0, top], [0, 1],\n [0, 0]])\n\n if self.orientation == 'horizontal':\n xyout = xyout[:, ::-1]\n\n # xyout is the path for the spine:\n self.outline.set_xy(xyout)\n if not self._filled:\n return\n\n # Make extend triangles or rectangles filled patches. These are\n # defined in the outer parent axes' coordinates:\n mappable = getattr(self, 'mappable', None)\n if (isinstance(mappable, contour.ContourSet)\n and any(hatch is not None for hatch in mappable.hatches)):\n hatches = mappable.hatches * (len(self._y) + 1)\n else:\n hatches = [None] * (len(self._y) + 1)\n\n if self._extend_lower():\n if not self.extendrect:\n # triangle\n xy = np.array([[0, 0], [0.5, bot], [1, 0]])\n else:\n # rectangle\n xy = np.array([[0, 0], [0, bot], [1., bot], [1, 0]])\n if self.orientation == 'horizontal':\n xy = xy[:, ::-1]\n # add the patch\n val = -1 if self.long_axis.get_inverted() else 0\n color = self.cmap(self.norm(self._values[val]))\n patch = mpatches.PathPatch(\n mpath.Path(xy), facecolor=color, alpha=self.alpha,\n linewidth=0, antialiased=False,\n transform=self.ax.transAxes,\n hatch=hatches[0], clip_on=False,\n # Place it right behind the standard patches, which is\n # needed if we updated the extends\n zorder=np.nextafter(self.ax.patch.zorder, 
-np.inf))\n self.ax.add_patch(patch)\n self._extend_patches.append(patch)\n # remove first hatch that goes into the extend patch\n hatches = hatches[1:]\n if self._extend_upper():\n if not self.extendrect:\n # triangle\n xy = np.array([[0, 1], [0.5, top], [1, 1]])\n else:\n # rectangle\n xy = np.array([[0, 1], [0, top], [1, top], [1, 1]])\n if self.orientation == 'horizontal':\n xy = xy[:, ::-1]\n # add the patch\n val = 0 if self.long_axis.get_inverted() else -1\n color = self.cmap(self.norm(self._values[val]))\n hatch_idx = len(self._y) - 1\n patch = mpatches.PathPatch(\n mpath.Path(xy), facecolor=color, alpha=self.alpha,\n linewidth=0, antialiased=False,\n transform=self.ax.transAxes, hatch=hatches[hatch_idx],\n clip_on=False,\n # Place it right behind the standard patches, which is\n # needed if we updated the extends\n zorder=np.nextafter(self.ax.patch.zorder, -np.inf))\n self.ax.add_patch(patch)\n self._extend_patches.append(patch)\n\n self._update_dividers()\n\n def add_lines(self, *args, **kwargs):\n """\n Draw lines on the colorbar.\n\n The lines are appended to the list :attr:`lines`.\n\n Parameters\n ----------\n levels : array-like\n The positions of the lines.\n colors : :mpltype:`color` or list of :mpltype:`color`\n Either a single color applying to all lines or one color value for\n each line.\n linewidths : float or array-like\n Either a single linewidth applying to all lines or one linewidth\n for each line.\n erase : bool, default: True\n Whether to remove any previously added lines.\n\n Notes\n -----\n Alternatively, this method can also be called with the signature\n ``colorbar.add_lines(contour_set, erase=True)``, in which case\n *levels*, *colors*, and *linewidths* are taken from *contour_set*.\n """\n params = _api.select_matching_signature(\n [lambda self, CS, erase=True: locals(),\n lambda self, levels, colors, linewidths, erase=True: locals()],\n self, *args, **kwargs)\n if "CS" in params:\n self, cs, erase = params.values()\n if not 
isinstance(cs, contour.ContourSet) or cs.filled:\n raise ValueError("If a single artist is passed to add_lines, "\n "it must be a ContourSet of lines")\n # TODO: Make colorbar lines auto-follow changes in contour lines.\n return self.add_lines(\n cs.levels,\n cs.to_rgba(cs.cvalues, cs.alpha),\n cs.get_linewidths(),\n erase=erase)\n else:\n self, levels, colors, linewidths, erase = params.values()\n\n y = self._locate(levels)\n rtol = (self._y[-1] - self._y[0]) * 1e-10\n igood = (y < self._y[-1] + rtol) & (y > self._y[0] - rtol)\n y = y[igood]\n if np.iterable(colors):\n colors = np.asarray(colors)[igood]\n if np.iterable(linewidths):\n linewidths = np.asarray(linewidths)[igood]\n X, Y = np.meshgrid([0, 1], y)\n if self.orientation == 'vertical':\n xy = np.stack([X, Y], axis=-1)\n else:\n xy = np.stack([Y, X], axis=-1)\n col = collections.LineCollection(xy, linewidths=linewidths,\n colors=colors)\n\n if erase and self.lines:\n for lc in self.lines:\n lc.remove()\n self.lines = []\n self.lines.append(col)\n\n # make a clip path that is just a linewidth bigger than the Axes...\n fac = np.max(linewidths) / 72\n xy = np.array([[0, 0], [1, 0], [1, 1], [0, 1], [0, 0]])\n inches = self.ax.get_figure().dpi_scale_trans\n # do in inches:\n xy = inches.inverted().transform(self.ax.transAxes.transform(xy))\n xy[[0, 1, 4], 1] -= fac\n xy[[2, 3], 1] += fac\n # back to axes units...\n xy = self.ax.transAxes.inverted().transform(inches.transform(xy))\n col.set_clip_path(mpath.Path(xy, closed=True),\n self.ax.transAxes)\n self.ax.add_collection(col)\n self.stale = True\n\n def update_ticks(self):\n """\n Set up the ticks and ticklabels. 
This should not be needed by users.\n """\n # Get the locator and formatter; defaults to self._locator if not None.\n self._get_ticker_locator_formatter()\n self.long_axis.set_major_locator(self._locator)\n self.long_axis.set_minor_locator(self._minorlocator)\n self.long_axis.set_major_formatter(self._formatter)\n\n def _get_ticker_locator_formatter(self):\n """\n Return the ``locator`` and ``formatter`` of the colorbar.\n\n If they have not been defined (i.e. are *None*), the formatter and\n locator are retrieved from the axis, or from the value of the\n boundaries for a boundary norm.\n\n Called by update_ticks...\n """\n locator = self._locator\n formatter = self._formatter\n minorlocator = self._minorlocator\n if isinstance(self.norm, colors.BoundaryNorm):\n b = self.norm.boundaries\n if locator is None:\n locator = ticker.FixedLocator(b, nbins=10)\n if minorlocator is None:\n minorlocator = ticker.FixedLocator(b)\n elif isinstance(self.norm, colors.NoNorm):\n if locator is None:\n # put ticks on integers between the boundaries of NoNorm\n nv = len(self._values)\n base = 1 + int(nv / 10)\n locator = ticker.IndexLocator(base=base, offset=.5)\n elif self.boundaries is not None:\n b = self._boundaries[self._inside]\n if locator is None:\n locator = ticker.FixedLocator(b, nbins=10)\n else: # most cases:\n if locator is None:\n # we haven't set the locator explicitly, so use the default\n # for this axis:\n locator = self.long_axis.get_major_locator()\n if minorlocator is None:\n minorlocator = self.long_axis.get_minor_locator()\n\n if minorlocator is None:\n minorlocator = ticker.NullLocator()\n\n if formatter is None:\n formatter = self.long_axis.get_major_formatter()\n\n self._locator = locator\n self._formatter = formatter\n self._minorlocator = minorlocator\n _log.debug('locator: %r', locator)\n\n def set_ticks(self, ticks, *, labels=None, minor=False, **kwargs):\n """\n Set tick locations.\n\n Parameters\n ----------\n ticks : 1D array-like\n List of tick 
locations.\n labels : list of str, optional\n List of tick labels. If not set, the labels show the data value.\n minor : bool, default: False\n If ``False``, set the major ticks; if ``True``, the minor ticks.\n **kwargs\n `.Text` properties for the labels. These take effect only if you\n pass *labels*. In other cases, please use `~.Axes.tick_params`.\n """\n if np.iterable(ticks):\n self.long_axis.set_ticks(ticks, labels=labels, minor=minor,\n **kwargs)\n self._locator = self.long_axis.get_major_locator()\n else:\n self._locator = ticks\n self.long_axis.set_major_locator(self._locator)\n self.stale = True\n\n def get_ticks(self, minor=False):\n """\n Return the ticks as a list of locations.\n\n Parameters\n ----------\n minor : boolean, default: False\n if True return the minor ticks.\n """\n if minor:\n return self.long_axis.get_minorticklocs()\n else:\n return self.long_axis.get_majorticklocs()\n\n def set_ticklabels(self, ticklabels, *, minor=False, **kwargs):\n """\n [*Discouraged*] Set tick labels.\n\n .. admonition:: Discouraged\n\n The use of this method is discouraged, because of the dependency\n on tick positions. In most cases, you'll want to use\n ``set_ticks(positions, labels=labels)`` instead.\n\n If you are using this method, you should always fix the tick\n positions before, e.g. by using `.Colorbar.set_ticks` or by\n explicitly setting a `~.ticker.FixedLocator` on the long axis\n of the colorbar. 
Otherwise, ticks are free to move and the\n labels may end up in unexpected positions.\n\n Parameters\n ----------\n ticklabels : sequence of str or of `.Text`\n Texts for labeling each tick location in the sequence set by\n `.Colorbar.set_ticks`; the number of labels must match the number\n of locations.\n\n update_ticks : bool, default: True\n This keyword argument is ignored and will be removed.\n Deprecated\n\n minor : bool\n If True, set minor ticks instead of major ticks.\n\n **kwargs\n `.Text` properties for the labels.\n """\n self.long_axis.set_ticklabels(ticklabels, minor=minor, **kwargs)\n\n def minorticks_on(self):\n """\n Turn on colorbar minor ticks.\n """\n self.ax.minorticks_on()\n self._short_axis().set_minor_locator(ticker.NullLocator())\n\n def minorticks_off(self):\n """Turn the minor ticks of the colorbar off."""\n self._minorlocator = ticker.NullLocator()\n self.long_axis.set_minor_locator(self._minorlocator)\n\n def set_label(self, label, *, loc=None, **kwargs):\n """\n Add a label to the long axis of the colorbar.\n\n Parameters\n ----------\n label : str\n The label text.\n loc : str, optional\n The location of the label.\n\n - For horizontal orientation one of {'left', 'center', 'right'}\n - For vertical orientation one of {'bottom', 'center', 'top'}\n\n Defaults to :rc:`xaxis.labellocation` or :rc:`yaxis.labellocation`\n depending on the orientation.\n **kwargs\n Keyword arguments are passed to `~.Axes.set_xlabel` /\n `~.Axes.set_ylabel`.\n Supported keywords are *labelpad* and `.Text` properties.\n """\n if self.orientation == "vertical":\n self.ax.set_ylabel(label, loc=loc, **kwargs)\n else:\n self.ax.set_xlabel(label, loc=loc, **kwargs)\n self.stale = True\n\n def set_alpha(self, alpha):\n """\n Set the transparency between 0 (transparent) and 1 (opaque).\n\n If an array is provided, *alpha* will be set to None to use the\n transparency values associated with the colormap.\n """\n self.alpha = None if isinstance(alpha, np.ndarray) else 
alpha\n\n def _set_scale(self, scale, **kwargs):\n """\n Set the colorbar long axis scale.\n\n Parameters\n ----------\n scale : {"linear", "log", "symlog", "logit", ...} or `.ScaleBase`\n The axis scale type to apply.\n\n **kwargs\n Different keyword arguments are accepted, depending on the scale.\n See the respective class keyword arguments:\n\n - `matplotlib.scale.LinearScale`\n - `matplotlib.scale.LogScale`\n - `matplotlib.scale.SymmetricalLogScale`\n - `matplotlib.scale.LogitScale`\n - `matplotlib.scale.FuncScale`\n - `matplotlib.scale.AsinhScale`\n\n Notes\n -----\n By default, Matplotlib supports the above-mentioned scales.\n Additionally, custom scales may be registered using\n `matplotlib.scale.register_scale`. These scales can then also\n be used here.\n """\n self.long_axis._set_axes_scale(scale, **kwargs)\n\n def remove(self):\n """\n Remove this colorbar from the figure.\n\n If the colorbar was created with ``use_gridspec=True`` the previous\n gridspec is restored.\n """\n if hasattr(self.ax, '_colorbar_info'):\n parents = self.ax._colorbar_info['parents']\n for a in parents:\n if self.ax in a._colorbars:\n a._colorbars.remove(self.ax)\n\n self.ax.remove()\n\n self.mappable.callbacks.disconnect(self.mappable.colorbar_cid)\n self.mappable.colorbar = None\n self.mappable.colorbar_cid = None\n # Remove the extension callbacks\n self.ax.callbacks.disconnect(self._extend_cid1)\n self.ax.callbacks.disconnect(self._extend_cid2)\n\n try:\n ax = self.mappable.axes\n except AttributeError:\n return\n try:\n subplotspec = self.ax.get_subplotspec().get_gridspec()._subplot_spec\n except AttributeError: # use_gridspec was False\n pos = ax.get_position(original=True)\n ax._set_position(pos)\n else: # use_gridspec was True\n ax.set_subplotspec(subplotspec)\n\n def _process_values(self):\n """\n Set `_boundaries` and `_values` based on the self.boundaries and\n self.values if not None, or based on the size of the colormap and\n the vmin/vmax of the norm.\n """\n if 
self.values is not None:\n # set self._boundaries from the values...\n self._values = np.array(self.values)\n if self.boundaries is None:\n # bracket values by 1/2 dv:\n b = np.zeros(len(self.values) + 1)\n b[1:-1] = 0.5 * (self._values[:-1] + self._values[1:])\n b[0] = 2.0 * b[1] - b[2]\n b[-1] = 2.0 * b[-2] - b[-3]\n self._boundaries = b\n return\n self._boundaries = np.array(self.boundaries)\n return\n\n # otherwise values are set from the boundaries\n if isinstance(self.norm, colors.BoundaryNorm):\n b = self.norm.boundaries\n elif isinstance(self.norm, colors.NoNorm):\n # NoNorm has N blocks, so N+1 boundaries, centered on integers:\n b = np.arange(self.cmap.N + 1) - .5\n elif self.boundaries is not None:\n b = self.boundaries\n else:\n # otherwise make the boundaries from the size of the cmap:\n N = self.cmap.N + 1\n b, _ = self._uniform_y(N)\n # add extra boundaries if needed:\n if self._extend_lower():\n b = np.hstack((b[0] - 1, b))\n if self._extend_upper():\n b = np.hstack((b, b[-1] + 1))\n\n # transform from 0-1 to vmin-vmax:\n if self.mappable.get_array() is not None:\n self.mappable.autoscale_None()\n if not self.norm.scaled():\n # If we still aren't scaled after autoscaling, use 0, 1 as default\n self.norm.vmin = 0\n self.norm.vmax = 1\n self.norm.vmin, self.norm.vmax = mtransforms.nonsingular(\n self.norm.vmin, self.norm.vmax, expander=0.1)\n if (not isinstance(self.norm, colors.BoundaryNorm) and\n (self.boundaries is None)):\n b = self.norm.inverse(b)\n\n self._boundaries = np.asarray(b, dtype=float)\n self._values = 0.5 * (self._boundaries[:-1] + self._boundaries[1:])\n if isinstance(self.norm, colors.NoNorm):\n self._values = (self._values + 0.00001).astype(np.int16)\n\n def _mesh(self):\n """\n Return the coordinate arrays for the colorbar pcolormesh/patches.\n\n These are scaled between vmin and vmax, and already handle colorbar\n orientation.\n """\n y, _ = self._proportional_y()\n # Use the vmin and vmax of the colorbar, which may not be the 
same\n # as the norm. There are situations where the colormap has a\n # narrower range than the colorbar and we want to accommodate the\n # extra contours.\n if (isinstance(self.norm, (colors.BoundaryNorm, colors.NoNorm))\n or self.boundaries is not None):\n # not using a norm.\n y = y * (self.vmax - self.vmin) + self.vmin\n else:\n # Update the norm values in a context manager as it is only\n # a temporary change and we don't want to propagate any signals\n # attached to the norm (callbacks.blocked).\n with (self.norm.callbacks.blocked(),\n cbook._setattr_cm(self.norm, vmin=self.vmin, vmax=self.vmax)):\n y = self.norm.inverse(y)\n self._y = y\n X, Y = np.meshgrid([0., 1.], y)\n if self.orientation == 'vertical':\n return (X, Y)\n else:\n return (Y, X)\n\n def _forward_boundaries(self, x):\n # map boundaries equally between 0 and 1...\n b = self._boundaries\n y = np.interp(x, b, np.linspace(0, 1, len(b)))\n # the following avoids ticks in the extends:\n eps = (b[-1] - b[0]) * 1e-6\n # map these _well_ out of bounds to keep any ticks out\n # of the extends region...\n y[x < b[0]-eps] = -1\n y[x > b[-1]+eps] = 2\n return y\n\n def _inverse_boundaries(self, x):\n # invert the above...\n b = self._boundaries\n return np.interp(x, np.linspace(0, 1, len(b)), b)\n\n def _reset_locator_formatter_scale(self):\n """\n Reset the locator et al to defaults. 
Any user-hardcoded changes\n need to be re-entered if this gets called (either at init, or when\n the mappable normal gets changed: Colorbar.update_normal)\n """\n self._process_values()\n self._locator = None\n self._minorlocator = None\n self._formatter = None\n self._minorformatter = None\n if (isinstance(self.mappable, contour.ContourSet) and\n isinstance(self.norm, colors.LogNorm)):\n # if contours have lognorm, give them a log scale...\n self._set_scale('log')\n elif (self.boundaries is not None or\n isinstance(self.norm, colors.BoundaryNorm)):\n if self.spacing == 'uniform':\n funcs = (self._forward_boundaries, self._inverse_boundaries)\n self._set_scale('function', functions=funcs)\n elif self.spacing == 'proportional':\n self._set_scale('linear')\n elif getattr(self.norm, '_scale', None):\n # use the norm's scale (if it exists and is not None):\n self._set_scale(self.norm._scale)\n elif type(self.norm) is colors.Normalize:\n # plain Normalize:\n self._set_scale('linear')\n else:\n # norm._scale is None or not an attr: derive the scale from\n # the Norm:\n funcs = (self.norm, self.norm.inverse)\n self._set_scale('function', functions=funcs)\n\n def _locate(self, x):\n """\n Given a set of color data values, return their\n corresponding colorbar data coordinates.\n """\n if isinstance(self.norm, (colors.NoNorm, colors.BoundaryNorm)):\n b = self._boundaries\n xn = x\n else:\n # Do calculations using normalized coordinates so\n # as to make the interpolation more accurate.\n b = self.norm(self._boundaries, clip=False).filled()\n xn = self.norm(x, clip=False).filled()\n\n bunique = b[self._inside]\n yunique = self._y\n\n z = np.interp(xn, bunique, yunique)\n return z\n\n # trivial helpers\n\n def _uniform_y(self, N):\n """\n Return colorbar data coordinates for *N* uniformly\n spaced boundaries, plus extension lengths if required.\n """\n automin = automax = 1. 
/ (N - 1.)\n extendlength = self._get_extension_lengths(self.extendfrac,\n automin, automax,\n default=0.05)\n y = np.linspace(0, 1, N)\n return y, extendlength\n\n def _proportional_y(self):\n """\n Return colorbar data coordinates for the boundaries of\n a proportional colorbar, plus extension lengths if required:\n """\n if (isinstance(self.norm, colors.BoundaryNorm) or\n self.boundaries is not None):\n y = (self._boundaries - self._boundaries[self._inside][0])\n y = y / (self._boundaries[self._inside][-1] -\n self._boundaries[self._inside][0])\n # need yscaled the same as the axes scale to get\n # the extend lengths.\n if self.spacing == 'uniform':\n yscaled = self._forward_boundaries(self._boundaries)\n else:\n yscaled = y\n else:\n y = self.norm(self._boundaries.copy())\n y = np.ma.filled(y, np.nan)\n # the norm and the scale should be the same...\n yscaled = y\n y = y[self._inside]\n yscaled = yscaled[self._inside]\n # normalize from 0..1:\n norm = colors.Normalize(y[0], y[-1])\n y = np.ma.filled(norm(y), np.nan)\n norm = colors.Normalize(yscaled[0], yscaled[-1])\n yscaled = np.ma.filled(norm(yscaled), np.nan)\n # make the lower and upper extend lengths proportional to the lengths\n # of the first and last boundary spacing (if extendfrac='auto'):\n automin = yscaled[1] - yscaled[0]\n automax = yscaled[-1] - yscaled[-2]\n extendlength = [0, 0]\n if self._extend_lower() or self._extend_upper():\n extendlength = self._get_extension_lengths(\n self.extendfrac, automin, automax, default=0.05)\n return y, extendlength\n\n def _get_extension_lengths(self, frac, automin, automax, default=0.05):\n """\n Return the lengths of colorbar extensions.\n\n This is a helper method for _uniform_y and _proportional_y.\n """\n # Set the default value.\n extendlength = np.array([default, default])\n if isinstance(frac, str):\n _api.check_in_list(['auto'], extendfrac=frac.lower())\n # Use the provided values when 'auto' is required.\n extendlength[:] = [automin, automax]\n elif 
frac is not None:\n try:\n # Try to set min and max extension fractions directly.\n extendlength[:] = frac\n # If frac is a sequence containing None then NaN may\n # be encountered. This is an error.\n if np.isnan(extendlength).any():\n raise ValueError()\n except (TypeError, ValueError) as err:\n # Raise an error on encountering an invalid value for frac.\n raise ValueError('invalid value for extendfrac') from err\n return extendlength\n\n def _extend_lower(self):\n """Return whether the lower limit is open ended."""\n minmax = "max" if self.long_axis.get_inverted() else "min"\n return self.extend in ('both', minmax)\n\n def _extend_upper(self):\n """Return whether the upper limit is open ended."""\n minmax = "min" if self.long_axis.get_inverted() else "max"\n return self.extend in ('both', minmax)\n\n def _short_axis(self):\n """Return the short axis"""\n if self.orientation == 'vertical':\n return self.ax.xaxis\n return self.ax.yaxis\n\n def _get_view(self):\n # docstring inherited\n # An interactive view for a colorbar is the norm's vmin/vmax\n return self.norm.vmin, self.norm.vmax\n\n def _set_view(self, view):\n # docstring inherited\n # An interactive view for a colorbar is the norm's vmin/vmax\n self.norm.vmin, self.norm.vmax = view\n\n def _set_view_from_bbox(self, bbox, direction='in',\n mode=None, twinx=False, twiny=False):\n # docstring inherited\n # For colorbars, we use the zoom bbox to scale the norm's vmin/vmax\n new_xbound, new_ybound = self.ax._prepare_view_from_bbox(\n bbox, direction=direction, mode=mode, twinx=twinx, twiny=twiny)\n if self.orientation == 'horizontal':\n self.norm.vmin, self.norm.vmax = new_xbound\n elif self.orientation == 'vertical':\n self.norm.vmin, self.norm.vmax = new_ybound\n\n def drag_pan(self, button, key, x, y):\n # docstring inherited\n points = self.ax._get_pan_points(button, key, x, y)\n if points is not None:\n if self.orientation == 'horizontal':\n self.norm.vmin, self.norm.vmax = points[:, 0]\n elif 
self.orientation == 'vertical':\n self.norm.vmin, self.norm.vmax = points[:, 1]\n\n\nColorbarBase = Colorbar # Backcompat API\n\n\ndef _normalize_location_orientation(location, orientation):\n if location is None:\n location = _get_ticklocation_from_orientation(orientation)\n loc_settings = _api.check_getitem({\n "left": {"location": "left", "anchor": (1.0, 0.5),\n "panchor": (0.0, 0.5), "pad": 0.10},\n "right": {"location": "right", "anchor": (0.0, 0.5),\n "panchor": (1.0, 0.5), "pad": 0.05},\n "top": {"location": "top", "anchor": (0.5, 0.0),\n "panchor": (0.5, 1.0), "pad": 0.05},\n "bottom": {"location": "bottom", "anchor": (0.5, 1.0),\n "panchor": (0.5, 0.0), "pad": 0.15},\n }, location=location)\n loc_settings["orientation"] = _get_orientation_from_location(location)\n if orientation is not None and orientation != loc_settings["orientation"]:\n # Allow the user to pass both if they are consistent.\n raise TypeError("location and orientation are mutually exclusive")\n return loc_settings\n\n\ndef _get_orientation_from_location(location):\n return _api.check_getitem(\n {None: None, "left": "vertical", "right": "vertical",\n "top": "horizontal", "bottom": "horizontal"}, location=location)\n\n\ndef _get_ticklocation_from_orientation(orientation):\n return _api.check_getitem(\n {None: "right", "vertical": "right", "horizontal": "bottom"},\n orientation=orientation)\n\n\n@_docstring.interpd\ndef make_axes(parents, location=None, orientation=None, fraction=0.15,\n shrink=1.0, aspect=20, **kwargs):\n """\n Create an `~.axes.Axes` suitable for a colorbar.\n\n The Axes is placed in the figure of the *parents* Axes, by resizing and\n repositioning *parents*.\n\n Parameters\n ----------\n parents : `~matplotlib.axes.Axes` or iterable or `numpy.ndarray` of `~.axes.Axes`\n The Axes to use as parents for placing the colorbar.\n %(_make_axes_kw_doc)s\n\n Returns\n -------\n cax : `~matplotlib.axes.Axes`\n The child Axes.\n kwargs : dict\n The reduced keyword dictionary to be 
passed when creating the colorbar\n instance.\n """\n loc_settings = _normalize_location_orientation(location, orientation)\n # put appropriate values into the kwargs dict for passing back to\n # the Colorbar class\n kwargs['orientation'] = loc_settings['orientation']\n location = kwargs['ticklocation'] = loc_settings['location']\n\n anchor = kwargs.pop('anchor', loc_settings['anchor'])\n panchor = kwargs.pop('panchor', loc_settings['panchor'])\n aspect0 = aspect\n # turn parents into a list if it is not already. Note we cannot\n # use .flatten or .ravel as these copy the references rather than\n # reuse them, leading to a memory leak\n if isinstance(parents, np.ndarray):\n parents = list(parents.flat)\n elif np.iterable(parents):\n parents = list(parents)\n else:\n parents = [parents]\n\n fig = parents[0].get_figure()\n\n pad0 = 0.05 if fig.get_constrained_layout() else loc_settings['pad']\n pad = kwargs.pop('pad', pad0)\n\n if not all(fig is ax.get_figure() for ax in parents):\n raise ValueError('Unable to create a colorbar Axes as not all '\n 'parents share the same figure.')\n\n # take a bounding box around all of the given Axes\n parents_bbox = mtransforms.Bbox.union(\n [ax.get_position(original=True).frozen() for ax in parents])\n\n pb = parents_bbox\n if location in ('left', 'right'):\n if location == 'left':\n pbcb, _, pb1 = pb.splitx(fraction, fraction + pad)\n else:\n pb1, _, pbcb = pb.splitx(1 - fraction - pad, 1 - fraction)\n pbcb = pbcb.shrunk(1.0, shrink).anchored(anchor, pbcb)\n else:\n if location == 'bottom':\n pbcb, _, pb1 = pb.splity(fraction, fraction + pad)\n else:\n pb1, _, pbcb = pb.splity(1 - fraction - pad, 1 - fraction)\n pbcb = pbcb.shrunk(shrink, 1.0).anchored(anchor, pbcb)\n\n # define the aspect ratio in terms of y's per x rather than x's per y\n aspect = 1.0 / aspect\n\n # define a transform which takes us from old axes coordinates to\n # new axes coordinates\n shrinking_trans = mtransforms.BboxTransform(parents_bbox, pb1)\n\n # 
transform each of the Axes in parents using the new transform\n for ax in parents:\n new_posn = shrinking_trans.transform(ax.get_position(original=True))\n new_posn = mtransforms.Bbox(new_posn)\n ax._set_position(new_posn)\n if panchor is not False:\n ax.set_anchor(panchor)\n\n cax = fig.add_axes(pbcb, label="<colorbar>")\n for a in parents:\n # tell the parent it has a colorbar\n a._colorbars += [cax]\n cax._colorbar_info = dict(\n parents=parents,\n location=location,\n shrink=shrink,\n anchor=anchor,\n panchor=panchor,\n fraction=fraction,\n aspect=aspect0,\n pad=pad)\n # and we need to set the aspect ratio by hand...\n cax.set_anchor(anchor)\n cax.set_box_aspect(aspect)\n cax.set_aspect('auto')\n\n return cax, kwargs\n\n\n@_docstring.interpd\ndef make_axes_gridspec(parent, *, location=None, orientation=None,\n fraction=0.15, shrink=1.0, aspect=20, **kwargs):\n """\n Create an `~.axes.Axes` suitable for a colorbar.\n\n The Axes is placed in the figure of the *parent* Axes, by resizing and\n repositioning *parent*.\n\n This function is similar to `.make_axes` and mostly compatible with it.\n Primary differences are\n\n - `.make_axes_gridspec` requires the *parent* to have a subplotspec.\n - `.make_axes` positions the Axes in figure coordinates;\n `.make_axes_gridspec` positions it using a subplotspec.\n - `.make_axes` updates the position of the parent. 
`.make_axes_gridspec`\n replaces the parent gridspec with a new one.\n\n Parameters\n ----------\n parent : `~matplotlib.axes.Axes`\n The Axes to use as parent for placing the colorbar.\n %(_make_axes_kw_doc)s\n\n Returns\n -------\n cax : `~matplotlib.axes.Axes`\n The child Axes.\n kwargs : dict\n The reduced keyword dictionary to be passed when creating the colorbar\n instance.\n """\n\n loc_settings = _normalize_location_orientation(location, orientation)\n kwargs['orientation'] = loc_settings['orientation']\n location = kwargs['ticklocation'] = loc_settings['location']\n\n aspect0 = aspect\n anchor = kwargs.pop('anchor', loc_settings['anchor'])\n panchor = kwargs.pop('panchor', loc_settings['panchor'])\n pad = kwargs.pop('pad', loc_settings["pad"])\n wh_space = 2 * pad / (1 - pad)\n\n if location in ('left', 'right'):\n gs = parent.get_subplotspec().subgridspec(\n 3, 2, wspace=wh_space, hspace=0,\n height_ratios=[(1-anchor[1])*(1-shrink), shrink, anchor[1]*(1-shrink)])\n if location == 'left':\n gs.set_width_ratios([fraction, 1 - fraction - pad])\n ss_main = gs[:, 1]\n ss_cb = gs[1, 0]\n else:\n gs.set_width_ratios([1 - fraction - pad, fraction])\n ss_main = gs[:, 0]\n ss_cb = gs[1, 1]\n else:\n gs = parent.get_subplotspec().subgridspec(\n 2, 3, hspace=wh_space, wspace=0,\n width_ratios=[anchor[0]*(1-shrink), shrink, (1-anchor[0])*(1-shrink)])\n if location == 'top':\n gs.set_height_ratios([fraction, 1 - fraction - pad])\n ss_main = gs[1, :]\n ss_cb = gs[0, 1]\n else:\n gs.set_height_ratios([1 - fraction - pad, fraction])\n ss_main = gs[0, :]\n ss_cb = gs[1, 1]\n aspect = 1 / aspect\n\n parent.set_subplotspec(ss_main)\n if panchor is not False:\n parent.set_anchor(panchor)\n\n fig = parent.get_figure()\n cax = fig.add_subplot(ss_cb, label="<colorbar>")\n cax.set_anchor(anchor)\n cax.set_box_aspect(aspect)\n cax.set_aspect('auto')\n cax._colorbar_info = dict(\n location=location,\n parents=[parent],\n shrink=shrink,\n anchor=anchor,\n panchor=panchor,\n 
fraction=fraction,\n aspect=aspect0,\n pad=pad)\n\n return cax, kwargs\n | .venv\Lib\site-packages\matplotlib\colorbar.py | colorbar.py | Python | 60,620 | 0.75 | 0.176583 | 0.109064 | awesome-app | 496 | 2023-07-26T03:04:29.502839 | MIT | false | dee127ee158b2117f6a40f733ed1a5e7 |
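The `make_axes` source in the row above shrinks the parents' bounding box and carves off a strip for the colorbar via `splitx`/`splity`, driven by *fraction* and *pad*. The same arithmetic can be sketched with plain 1-D intervals (an illustrative reduction, not matplotlib's `Bbox` API; the helper name is hypothetical):

```python
def split_for_colorbar(x0, x1, fraction, pad, location):
    """Split a parent interval [x0, x1] into (parent, colorbar) sub-intervals,
    mirroring the fraction/pad bookkeeping in make_axes (illustration only)."""
    width = x1 - x0
    if location == "right":
        # parent keeps 1 - fraction - pad of the width; colorbar gets the
        # trailing `fraction`, with `pad` left empty between them.
        parent = (x0, x0 + (1 - fraction - pad) * width)
        cb = (x0 + (1 - fraction) * width, x1)
    elif location == "left":
        cb = (x0, x0 + fraction * width)
        parent = (x0 + (fraction + pad) * width, x1)
    else:
        raise ValueError(f"unsupported location: {location!r}")
    return parent, cb

# With the defaults fraction=0.15, pad=0.05: parent keeps 80% of the width,
# the colorbar occupies the last 15%, and 5% is left as the gap.
parent, cb = split_for_colorbar(0.0, 1.0, fraction=0.15, pad=0.05, location="right")
```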
import matplotlib.spines as mspines\nfrom matplotlib import cm, collections, colors, contour, colorizer\nfrom matplotlib.axes import Axes\nfrom matplotlib.axis import Axis\nfrom matplotlib.backend_bases import RendererBase\nfrom matplotlib.patches import Patch\nfrom matplotlib.ticker import Locator, Formatter\nfrom matplotlib.transforms import Bbox\n\nimport numpy as np\nfrom numpy.typing import ArrayLike\nfrom collections.abc import Sequence\nfrom typing import Any, Literal, overload\nfrom .typing import ColorType\n\nclass _ColorbarSpine(mspines.Spines):\n def __init__(self, axes: Axes): ...\n def get_window_extent(self, renderer: RendererBase | None = ...) -> Bbox:...\n def set_xy(self, xy: ArrayLike) -> None: ...\n def draw(self, renderer: RendererBase | None) -> None:...\n\n\nclass Colorbar:\n n_rasterize: int\n mappable: cm.ScalarMappable | colorizer.ColorizingArtist\n ax: Axes\n alpha: float | None\n cmap: colors.Colormap\n norm: colors.Normalize\n values: Sequence[float] | None\n boundaries: Sequence[float] | None\n extend: Literal["neither", "both", "min", "max"]\n spacing: Literal["uniform", "proportional"]\n orientation: Literal["vertical", "horizontal"]\n drawedges: bool\n extendfrac: Literal["auto"] | float | Sequence[float] | None\n extendrect: bool\n solids: None | collections.QuadMesh\n solids_patches: list[Patch]\n lines: list[collections.LineCollection]\n outline: _ColorbarSpine\n dividers: collections.LineCollection\n ticklocation: Literal["left", "right", "top", "bottom"]\n def __init__(\n self,\n ax: Axes,\n mappable: cm.ScalarMappable | colorizer.ColorizingArtist | None = ...,\n *,\n cmap: str | colors.Colormap | None = ...,\n norm: colors.Normalize | None = ...,\n alpha: float | None = ...,\n values: Sequence[float] | None = ...,\n boundaries: Sequence[float] | None = ...,\n orientation: Literal["vertical", "horizontal"] | None = ...,\n ticklocation: Literal["auto", "left", "right", "top", "bottom"] = ...,\n extend: Literal["neither", "both", 
"min", "max"] | None = ...,\n spacing: Literal["uniform", "proportional"] = ...,\n ticks: Sequence[float] | Locator | None = ...,\n format: str | Formatter | None = ...,\n drawedges: bool = ...,\n extendfrac: Literal["auto"] | float | Sequence[float] | None = ...,\n extendrect: bool = ...,\n label: str = ...,\n location: Literal["left", "right", "top", "bottom"] | None = ...\n ) -> None: ...\n @property\n def long_axis(self) -> Axis: ...\n @property\n def locator(self) -> Locator: ...\n @locator.setter\n def locator(self, loc: Locator) -> None: ...\n @property\n def minorlocator(self) -> Locator: ...\n @minorlocator.setter\n def minorlocator(self, loc: Locator) -> None: ...\n @property\n def formatter(self) -> Formatter: ...\n @formatter.setter\n def formatter(self, fmt: Formatter) -> None: ...\n @property\n def minorformatter(self) -> Formatter: ...\n @minorformatter.setter\n def minorformatter(self, fmt: Formatter) -> None: ...\n def update_normal(self, mappable: cm.ScalarMappable | None = ...) -> None: ...\n @overload\n def add_lines(self, CS: contour.ContourSet, erase: bool = ...) -> None: ...\n @overload\n def add_lines(\n self,\n levels: ArrayLike,\n colors: ColorType | Sequence[ColorType],\n linewidths: float | ArrayLike,\n erase: bool = ...,\n ) -> None: ...\n def update_ticks(self) -> None: ...\n def set_ticks(\n self,\n ticks: Sequence[float] | Locator,\n *,\n labels: Sequence[str] | None = ...,\n minor: bool = ...,\n **kwargs\n ) -> None: ...\n def get_ticks(self, minor: bool = ...) 
-> np.ndarray: ...\n def set_ticklabels(\n self,\n ticklabels: Sequence[str],\n *,\n minor: bool = ...,\n **kwargs\n ) -> None: ...\n def minorticks_on(self) -> None: ...\n def minorticks_off(self) -> None: ...\n def set_label(self, label: str, *, loc: str | None = ..., **kwargs) -> None: ...\n def set_alpha(self, alpha: float | np.ndarray) -> None: ...\n def remove(self) -> None: ...\n def drag_pan(self, button: Any, key: Any, x: float, y: float) -> None: ...\n\nColorbarBase = Colorbar\n\ndef make_axes(\n parents: Axes | list[Axes] | np.ndarray,\n location: Literal["left", "right", "top", "bottom"] | None = ...,\n orientation: Literal["vertical", "horizontal"] | None = ...,\n fraction: float = ...,\n shrink: float = ...,\n aspect: float = ...,\n **kwargs\n) -> tuple[Axes, dict[str, Any]]: ...\ndef make_axes_gridspec(\n parent: Axes,\n *,\n location: Literal["left", "right", "top", "bottom"] | None = ...,\n orientation: Literal["vertical", "horizontal"] | None = ...,\n fraction: float = ...,\n shrink: float = ...,\n aspect: float = ...,\n **kwargs\n) -> tuple[Axes, dict[str, Any]]: ...\n | .venv\Lib\site-packages\matplotlib\colorbar.pyi | colorbar.pyi | Other | 4,966 | 0.85 | 0.223022 | 0.06015 | node-utils | 93 | 2024-06-21T18:07:46.402535 | BSD-3-Clause | false | 8e4dffe85b9c2cc751a15956650c2234 |
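The stub row above declares several `@overload` pairs (e.g. `Colorbar.add_lines`, `Colormap.__call__`) whose single runtime implementation lives in the corresponding `.py` module. A minimal sketch of how such typed overloads pair with one untyped implementation (hypothetical function, for illustration only):

```python
from typing import Sequence, overload

@overload
def scale(x: float) -> float: ...
@overload
def scale(x: Sequence[float]) -> list[float]: ...

def scale(x):
    """Single runtime implementation backing both typed overloads above.

    Type checkers see only the @overload signatures; at runtime the
    implementation dispatches on the argument type itself.
    """
    if isinstance(x, (int, float)):
        return 2.0 * x
    return [2.0 * v for v in x]
```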
"""\nThe Colorizer class which handles the data to color pipeline via a\nnormalization and a colormap.\n\n.. admonition:: Provisional status of colorizer\n\n The ``colorizer`` module and classes in this file are considered\n provisional and may change at any time without a deprecation period.\n\n.. seealso::\n\n :doc:`/gallery/color/colormap_reference` for a list of builtin colormaps.\n\n :ref:`colormap-manipulation` for examples of how to make colormaps.\n\n :ref:`colormaps` for an in-depth discussion of choosing colormaps.\n\n :ref:`colormapnorms` for more details about data normalization.\n\n"""\n\nimport functools\n\nimport numpy as np\nfrom numpy import ma\n\nfrom matplotlib import _api, colors, cbook, scale, artist\nimport matplotlib as mpl\n\nmpl._docstring.interpd.register(\n colorizer_doc="""\\ncolorizer : `~matplotlib.colorizer.Colorizer` or None, default: None\n The Colorizer object used to map color to data. If None, a Colorizer\n object is created from a *norm* and *cmap*.""",\n )\n\n\nclass Colorizer:\n """\n Data to color pipeline.\n\n This pipeline is accessible via `.Colorizer.to_rgba` and executed via\n the `.Colorizer.norm` and `.Colorizer.cmap` attributes.\n\n Parameters\n ----------\n cmap: colorbar.Colorbar or str or None, default: None\n The colormap used to color data.\n\n norm: colors.Normalize or str or None, default: None\n The normalization used to normalize the data\n """\n def __init__(self, cmap=None, norm=None):\n\n self._cmap = None\n self._set_cmap(cmap)\n\n self._id_norm = None\n self._norm = None\n self.norm = norm\n\n self.callbacks = cbook.CallbackRegistry(signals=["changed"])\n self.colorbar = None\n\n def _scale_norm(self, norm, vmin, vmax, A):\n """\n Helper for initial scaling.\n\n Used by public functions that create a ScalarMappable and support\n parameters *vmin*, *vmax* and *norm*. 
This makes sure that a *norm*\n will take precedence over *vmin*, *vmax*.\n\n Note that this method does not set the norm.\n """\n if vmin is not None or vmax is not None:\n self.set_clim(vmin, vmax)\n if isinstance(norm, colors.Normalize):\n raise ValueError(\n "Passing a Normalize instance simultaneously with "\n "vmin/vmax is not supported. Please pass vmin/vmax "\n "directly to the norm when creating it.")\n\n # always resolve the autoscaling so we have concrete limits\n # rather than deferring to draw time.\n self.autoscale_None(A)\n\n @property\n def norm(self):\n return self._norm\n\n @norm.setter\n def norm(self, norm):\n _api.check_isinstance((colors.Normalize, str, None), norm=norm)\n if norm is None:\n norm = colors.Normalize()\n elif isinstance(norm, str):\n try:\n scale_cls = scale._scale_mapping[norm]\n except KeyError:\n raise ValueError(\n "Invalid norm str name; the following values are "\n f"supported: {', '.join(scale._scale_mapping)}"\n ) from None\n norm = _auto_norm_from_scale(scale_cls)()\n\n if norm is self.norm:\n # We aren't updating anything\n return\n\n in_init = self.norm is None\n # Remove the current callback and connect to the new one\n if not in_init:\n self.norm.callbacks.disconnect(self._id_norm)\n self._norm = norm\n self._id_norm = self.norm.callbacks.connect('changed',\n self.changed)\n if not in_init:\n self.changed()\n\n def to_rgba(self, x, alpha=None, bytes=False, norm=True):\n """\n Return a normalized RGBA array corresponding to *x*.\n\n In the normal case, *x* is a 1D or 2D sequence of scalars, and\n the corresponding `~numpy.ndarray` of RGBA values will be returned,\n based on the norm and colormap set for this Colorizer.\n\n There is one special case, for handling images that are already\n RGB or RGBA, such as might have been read from an image file.\n If *x* is an `~numpy.ndarray` with 3 dimensions,\n and the last dimension is either 3 or 4, then it will be\n treated as an RGB or RGBA array, and no mapping will be 
done.\n The array can be `~numpy.uint8`, or it can be floats with\n values in the 0-1 range; otherwise a ValueError will be raised.\n Any NaNs or masked elements will be set to 0 alpha.\n If the last dimension is 3, the *alpha* kwarg (defaulting to 1)\n will be used to fill in the transparency. If the last dimension\n is 4, the *alpha* kwarg is ignored; it does not\n replace the preexisting alpha. A ValueError will be raised\n if the third dimension is other than 3 or 4.\n\n In either case, if *bytes* is *False* (default), the RGBA\n array will be floats in the 0-1 range; if it is *True*,\n the returned RGBA array will be `~numpy.uint8` in the 0 to 255 range.\n\n If norm is False, no normalization of the input data is\n performed, and it is assumed to be in the range (0-1).\n\n """\n # First check for special case, image input:\n if isinstance(x, np.ndarray) and x.ndim == 3:\n return self._pass_image_data(x, alpha, bytes, norm)\n\n # Otherwise run norm -> colormap pipeline\n x = ma.asarray(x)\n if norm:\n x = self.norm(x)\n rgba = self.cmap(x, alpha=alpha, bytes=bytes)\n return rgba\n\n @staticmethod\n def _pass_image_data(x, alpha=None, bytes=False, norm=True):\n """\n Helper function to pass ndarray of shape (...,3) or (..., 4)\n through `to_rgba()`, see `to_rgba()` for docstring.\n """\n if x.shape[2] == 3:\n if alpha is None:\n alpha = 1\n if x.dtype == np.uint8:\n alpha = np.uint8(alpha * 255)\n m, n = x.shape[:2]\n xx = np.empty(shape=(m, n, 4), dtype=x.dtype)\n xx[:, :, :3] = x\n xx[:, :, 3] = alpha\n elif x.shape[2] == 4:\n xx = x\n else:\n raise ValueError("Third dimension must be 3 or 4")\n if xx.dtype.kind == 'f':\n # If any of R, G, B, or A is nan, set to 0\n if np.any(nans := np.isnan(x)):\n if x.shape[2] == 4:\n xx = xx.copy()\n xx[np.any(nans, axis=2), :] = 0\n\n if norm and (xx.max() > 1 or xx.min() < 0):\n raise ValueError("Floating point image RGB values "\n "must be in the 0..1 range.")\n if bytes:\n xx = (xx * 255).astype(np.uint8)\n elif 
xx.dtype == np.uint8:\n if not bytes:\n xx = xx.astype(np.float32) / 255\n else:\n raise ValueError("Image RGB array must be uint8 or "\n "floating point; found %s" % xx.dtype)\n # Account for any masked entries in the original array\n # If any of R, G, B, or A are masked for an entry, we set alpha to 0\n if np.ma.is_masked(x):\n xx[np.any(np.ma.getmaskarray(x), axis=2), 3] = 0\n return xx\n\n def autoscale(self, A):\n """\n Autoscale the scalar limits on the norm instance using the\n current array\n """\n if A is None:\n raise TypeError('You must first set_array for mappable')\n # If the norm's limits are updated self.changed() will be called\n # through the callbacks attached to the norm\n self.norm.autoscale(A)\n\n def autoscale_None(self, A):\n """\n Autoscale the scalar limits on the norm instance using the\n current array, changing only limits that are None\n """\n if A is None:\n raise TypeError('You must first set_array for mappable')\n # If the norm's limits are updated self.changed() will be called\n # through the callbacks attached to the norm\n self.norm.autoscale_None(A)\n\n def _set_cmap(self, cmap):\n """\n Set the colormap for luminance data.\n\n Parameters\n ----------\n cmap : `.Colormap` or str or None\n """\n # bury import to avoid circular imports\n from matplotlib import cm\n in_init = self._cmap is None\n self._cmap = cm._ensure_cmap(cmap)\n if not in_init:\n self.changed() # Things are not set up properly yet.\n\n @property\n def cmap(self):\n return self._cmap\n\n @cmap.setter\n def cmap(self, cmap):\n self._set_cmap(cmap)\n\n def set_clim(self, vmin=None, vmax=None):\n """\n Set the norm limits for image scaling.\n\n Parameters\n ----------\n vmin, vmax : float\n The limits.\n\n The limits may also be passed as a tuple (*vmin*, *vmax*) as a\n single positional argument.\n\n .. 
ACCEPTS: (vmin: float, vmax: float)\n """\n # If the norm's limits are updated self.changed() will be called\n # through the callbacks attached to the norm, this causes an inconsistent\n # state, to prevent this blocked context manager is used\n if vmax is None:\n try:\n vmin, vmax = vmin\n except (TypeError, ValueError):\n pass\n\n orig_vmin_vmax = self.norm.vmin, self.norm.vmax\n\n # Blocked context manager prevents callbacks from being triggered\n # until both vmin and vmax are updated\n with self.norm.callbacks.blocked(signal='changed'):\n if vmin is not None:\n self.norm.vmin = colors._sanitize_extrema(vmin)\n if vmax is not None:\n self.norm.vmax = colors._sanitize_extrema(vmax)\n\n # emit a update signal if the limits are changed\n if orig_vmin_vmax != (self.norm.vmin, self.norm.vmax):\n self.norm.callbacks.process('changed')\n\n def get_clim(self):\n """\n Return the values (min, max) that are mapped to the colormap limits.\n """\n return self.norm.vmin, self.norm.vmax\n\n def changed(self):\n """\n Call this whenever the mappable is changed to notify all the\n callbackSM listeners to the 'changed' signal.\n """\n self.callbacks.process('changed')\n self.stale = True\n\n @property\n def vmin(self):\n return self.get_clim()[0]\n\n @vmin.setter\n def vmin(self, vmin):\n self.set_clim(vmin=vmin)\n\n @property\n def vmax(self):\n return self.get_clim()[1]\n\n @vmax.setter\n def vmax(self, vmax):\n self.set_clim(vmax=vmax)\n\n @property\n def clip(self):\n return self.norm.clip\n\n @clip.setter\n def clip(self, clip):\n self.norm.clip = clip\n\n\nclass _ColorizerInterface:\n """\n Base class that contains the interface to `Colorizer` objects from\n a `ColorizingArtist` or `.cm.ScalarMappable`.\n\n Note: This class only contain functions that interface the .colorizer\n attribute. 
Other functions that as shared between `.ColorizingArtist`\n and `.cm.ScalarMappable` are not included.\n """\n def _scale_norm(self, norm, vmin, vmax):\n self._colorizer._scale_norm(norm, vmin, vmax, self._A)\n\n def to_rgba(self, x, alpha=None, bytes=False, norm=True):\n """\n Return a normalized RGBA array corresponding to *x*.\n\n In the normal case, *x* is a 1D or 2D sequence of scalars, and\n the corresponding `~numpy.ndarray` of RGBA values will be returned,\n based on the norm and colormap set for this Colorizer.\n\n There is one special case, for handling images that are already\n RGB or RGBA, such as might have been read from an image file.\n If *x* is an `~numpy.ndarray` with 3 dimensions,\n and the last dimension is either 3 or 4, then it will be\n treated as an RGB or RGBA array, and no mapping will be done.\n The array can be `~numpy.uint8`, or it can be floats with\n values in the 0-1 range; otherwise a ValueError will be raised.\n Any NaNs or masked elements will be set to 0 alpha.\n If the last dimension is 3, the *alpha* kwarg (defaulting to 1)\n will be used to fill in the transparency. If the last dimension\n is 4, the *alpha* kwarg is ignored; it does not\n replace the preexisting alpha. 
A ValueError will be raised\n if the third dimension is other than 3 or 4.\n\n In either case, if *bytes* is *False* (default), the RGBA\n array will be floats in the 0-1 range; if it is *True*,\n the returned RGBA array will be `~numpy.uint8` in the 0 to 255 range.\n\n If norm is False, no normalization of the input data is\n performed, and it is assumed to be in the range (0-1).\n\n """\n return self._colorizer.to_rgba(x, alpha=alpha, bytes=bytes, norm=norm)\n\n def get_clim(self):\n """\n Return the values (min, max) that are mapped to the colormap limits.\n """\n return self._colorizer.get_clim()\n\n def set_clim(self, vmin=None, vmax=None):\n """\n Set the norm limits for image scaling.\n\n Parameters\n ----------\n vmin, vmax : float\n The limits.\n\n For scalar data, the limits may also be passed as a\n tuple (*vmin*, *vmax*) as a single positional argument.\n\n .. ACCEPTS: (vmin: float, vmax: float)\n """\n # If the norm's limits are updated self.changed() will be called\n # through the callbacks attached to the norm\n self._colorizer.set_clim(vmin, vmax)\n\n def get_alpha(self):\n try:\n return super().get_alpha()\n except AttributeError:\n return 1\n\n @property\n def cmap(self):\n return self._colorizer.cmap\n\n @cmap.setter\n def cmap(self, cmap):\n self._colorizer.cmap = cmap\n\n def get_cmap(self):\n """Return the `.Colormap` instance."""\n return self._colorizer.cmap\n\n def set_cmap(self, cmap):\n """\n Set the colormap for luminance data.\n\n Parameters\n ----------\n cmap : `.Colormap` or str or None\n """\n self.cmap = cmap\n\n @property\n def norm(self):\n return self._colorizer.norm\n\n @norm.setter\n def norm(self, norm):\n self._colorizer.norm = norm\n\n def set_norm(self, norm):\n """\n Set the normalization instance.\n\n Parameters\n ----------\n norm : `.Normalize` or str or None\n\n Notes\n -----\n If there are any colorbars using the mappable for this norm, setting\n the norm of the mappable will reset the norm, locator, and formatters\n 
on the colorbar to default.\n """\n self.norm = norm\n\n def autoscale(self):\n """\n Autoscale the scalar limits on the norm instance using the\n current array\n """\n self._colorizer.autoscale(self._A)\n\n def autoscale_None(self):\n """\n Autoscale the scalar limits on the norm instance using the\n current array, changing only limits that are None\n """\n self._colorizer.autoscale_None(self._A)\n\n @property\n def colorbar(self):\n """\n The last colorbar associated with this object. May be None\n """\n return self._colorizer.colorbar\n\n @colorbar.setter\n def colorbar(self, colorbar):\n self._colorizer.colorbar = colorbar\n\n def _format_cursor_data_override(self, data):\n # This function overwrites Artist.format_cursor_data(). We cannot\n # implement cm.ScalarMappable.format_cursor_data() directly, because\n # most cm.ScalarMappable subclasses inherit from Artist first and from\n # cm.ScalarMappable second, so Artist.format_cursor_data would always\n # have precedence over cm.ScalarMappable.format_cursor_data.\n\n # Note if cm.ScalarMappable is depreciated, this functionality should be\n # implemented as format_cursor_data() on ColorizingArtist.\n n = self.cmap.N\n if np.ma.getmask(data):\n return "[]"\n normed = self.norm(data)\n if np.isfinite(normed):\n if isinstance(self.norm, colors.BoundaryNorm):\n # not an invertible normalization mapping\n cur_idx = np.argmin(np.abs(self.norm.boundaries - data))\n neigh_idx = max(0, cur_idx - 1)\n # use max diff to prevent delta == 0\n delta = np.diff(\n self.norm.boundaries[neigh_idx:cur_idx + 2]\n ).max()\n elif self.norm.vmin == self.norm.vmax:\n # singular norms, use delta of 10% of only value\n delta = np.abs(self.norm.vmin * .1)\n else:\n # Midpoints of neighboring color intervals.\n neighbors = self.norm.inverse(\n (int(normed * n) + np.array([0, 1])) / n)\n delta = abs(neighbors - data).max()\n g_sig_digits = cbook._g_sig_digits(data, delta)\n else:\n g_sig_digits = 3 # Consistent with default below.\n return 
f"[{data:-#.{g_sig_digits}g}]"\n\n\nclass _ScalarMappable(_ColorizerInterface):\n """\n A mixin class to map one or multiple sets of scalar data to RGBA.\n\n The ScalarMappable applies data normalization before returning RGBA colors from\n the given `~matplotlib.colors.Colormap`.\n """\n\n # _ScalarMappable exists for compatibility with\n # code written before the introduction of the Colorizer\n # and ColorizingArtist classes.\n\n # _ScalarMappable can be depreciated so that ColorizingArtist\n # inherits directly from _ColorizerInterface.\n # in this case, the following changes should occur:\n # __init__() has its functionality moved to ColorizingArtist.\n # set_array(), get_array(), _get_colorizer() and\n # _check_exclusionary_keywords() are moved to ColorizingArtist.\n # changed() can be removed so long as colorbar.Colorbar\n # is changed to connect to the colorizer instead of the\n # ScalarMappable/ColorizingArtist,\n # otherwise changed() can be moved to ColorizingArtist.\n def __init__(self, norm=None, cmap=None, *, colorizer=None, **kwargs):\n """\n Parameters\n ----------\n norm : `.Normalize` (or subclass thereof) or str or None\n The normalizing object which scales data, typically into the\n interval ``[0, 1]``.\n If a `str`, a `.Normalize` subclass is dynamically generated based\n on the scale with the corresponding name.\n If *None*, *norm* defaults to a *colors.Normalize* object which\n initializes its scaling based on the first data processed.\n cmap : str or `~matplotlib.colors.Colormap`\n The colormap used to map normalized data values to RGBA colors.\n """\n super().__init__(**kwargs)\n self._A = None\n self._colorizer = self._get_colorizer(colorizer=colorizer, norm=norm, cmap=cmap)\n\n self.colorbar = None\n self._id_colorizer = self._colorizer.callbacks.connect('changed', self.changed)\n self.callbacks = cbook.CallbackRegistry(signals=["changed"])\n\n def set_array(self, A):\n """\n Set the value array from array-like *A*.\n\n Parameters\n 
----------\n A : array-like or None\n The values that are mapped to colors.\n\n The base class `.ScalarMappable` does not make any assumptions on\n the dimensionality and shape of the value array *A*.\n """\n if A is None:\n self._A = None\n return\n\n A = cbook.safe_masked_invalid(A, copy=True)\n if not np.can_cast(A.dtype, float, "same_kind"):\n raise TypeError(f"Image data of dtype {A.dtype} cannot be "\n "converted to float")\n\n self._A = A\n if not self.norm.scaled():\n self._colorizer.autoscale_None(A)\n\n def get_array(self):\n """\n Return the array of values, that are mapped to colors.\n\n The base class `.ScalarMappable` does not make any assumptions on\n the dimensionality and shape of the array.\n """\n return self._A\n\n def changed(self):\n """\n Call this whenever the mappable is changed to notify all the\n callbackSM listeners to the 'changed' signal.\n """\n self.callbacks.process('changed', self)\n self.stale = True\n\n @staticmethod\n def _check_exclusionary_keywords(colorizer, **kwargs):\n """\n Raises a ValueError if any kwarg is not None while colorizer is not None\n """\n if colorizer is not None:\n if any([val is not None for val in kwargs.values()]):\n raise ValueError("The `colorizer` keyword cannot be used simultaneously"\n " with any of the following keywords: "\n + ", ".join(f'`{key}`' for key in kwargs.keys()))\n\n @staticmethod\n def _get_colorizer(cmap, norm, colorizer):\n if isinstance(colorizer, Colorizer):\n _ScalarMappable._check_exclusionary_keywords(\n Colorizer, cmap=cmap, norm=norm\n )\n return colorizer\n return Colorizer(cmap, norm)\n\n# The docstrings here must be generic enough to apply to all relevant methods.\nmpl._docstring.interpd.register(\n cmap_doc="""\\ncmap : str or `~matplotlib.colors.Colormap`, default: :rc:`image.cmap`\n The Colormap instance or registered colormap name used to map scalar data\n to colors.""",\n norm_doc="""\\nnorm : str or `~matplotlib.colors.Normalize`, optional\n The normalization method 
used to scale scalar data to the [0, 1] range\n before mapping to colors using *cmap*. By default, a linear scaling is\n used, mapping the lowest value to 0 and the highest to 1.\n\n If given, this can be one of the following:\n\n - An instance of `.Normalize` or one of its subclasses\n (see :ref:`colormapnorms`).\n - A scale name, i.e. one of "linear", "log", "symlog", "logit", etc. For a\n list of available scales, call `matplotlib.scale.get_scale_names()`.\n In that case, a suitable `.Normalize` subclass is dynamically generated\n and instantiated.""",\n vmin_vmax_doc="""\\nvmin, vmax : float, optional\n When using scalar data and no explicit *norm*, *vmin* and *vmax* define\n the data range that the colormap covers. By default, the colormap covers\n the complete value range of the supplied data. It is an error to use\n *vmin*/*vmax* when a *norm* instance is given (but using a `str` *norm*\n name together with *vmin*/*vmax* is acceptable).""",\n)\n\n\nclass ColorizingArtist(_ScalarMappable, artist.Artist):\n """\n Base class for artists that make map data to color using a `.colorizer.Colorizer`.\n\n The `.colorizer.Colorizer` applies data normalization before\n returning RGBA colors from a `~matplotlib.colors.Colormap`.\n\n """\n def __init__(self, colorizer, **kwargs):\n """\n Parameters\n ----------\n colorizer : `.colorizer.Colorizer`\n """\n _api.check_isinstance(Colorizer, colorizer=colorizer)\n super().__init__(colorizer=colorizer, **kwargs)\n\n @property\n def colorizer(self):\n return self._colorizer\n\n @colorizer.setter\n def colorizer(self, cl):\n _api.check_isinstance(Colorizer, colorizer=cl)\n self._colorizer.callbacks.disconnect(self._id_colorizer)\n self._colorizer = cl\n self._id_colorizer = cl.callbacks.connect('changed', self.changed)\n\n def _set_colorizer_check_keywords(self, colorizer, **kwargs):\n """\n Raises a ValueError if any kwarg is not None while colorizer is not None.\n """\n self._check_exclusionary_keywords(colorizer, **kwargs)\n 
self.colorizer = colorizer\n\n\ndef _auto_norm_from_scale(scale_cls):\n """\n Automatically generate a norm class from *scale_cls*.\n\n This differs from `.colors.make_norm_from_scale` in the following points:\n\n - This function is not a class decorator, but directly returns a norm class\n (as if decorating `.Normalize`).\n - The scale is automatically constructed with ``nonpositive="mask"``, if it\n supports such a parameter, to work around the difference in defaults\n between standard scales (which use "clip") and norms (which use "mask").\n\n Note that ``make_norm_from_scale`` caches the generated norm classes\n (not the instances) and reuses them for later calls. For example,\n ``type(_auto_norm_from_scale("log")) == LogNorm``.\n """\n # Actually try to construct an instance, to verify whether\n # ``nonpositive="mask"`` is supported.\n try:\n norm = colors.make_norm_from_scale(\n functools.partial(scale_cls, nonpositive="mask"))(\n colors.Normalize)()\n except TypeError:\n norm = colors.make_norm_from_scale(scale_cls)(\n colors.Normalize)()\n return type(norm)\n | .venv\Lib\site-packages\matplotlib\colorizer.py | colorizer.py | Python | 25,180 | 0.95 | 0.203414 | 0.085763 | node-utils | 136 | 2024-06-02T22:15:00.209660 | GPL-3.0 | false | 6caa4186a52c2c10c229edac5b3bbdbc |
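The `Colorizer.to_rgba` docstring in the row above describes a two-stage pipeline: the norm maps data into [0, 1], then the colormap looks each normalized value up as an RGBA color. A stdlib-only sketch of that pipeline, with a toy lookup table standing in for a real `Colormap` (hypothetical helper names, not the matplotlib API):

```python
def normalize(values, vmin, vmax, clip=False):
    """Linear norm analogous to colors.Normalize: map data into [0, 1]."""
    out = []
    for v in values:
        t = (v - vmin) / (vmax - vmin)
        if clip:
            t = min(1.0, max(0.0, t))
        out.append(t)
    return out

def apply_cmap(normed, lut):
    """The 'cmap' step: index each normalized value into a color table."""
    n = len(lut)
    return [lut[min(n - 1, max(0, int(t * n)))] for t in normed]

# Toy 4-entry grayscale LUT standing in for a Colormap.
lut = [(i / 3, i / 3, i / 3, 1.0) for i in range(4)]
rgba = apply_cmap(normalize([0, 5, 10], vmin=0, vmax=10), lut)
```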
from matplotlib import cbook, colorbar, colors, artist\n\nfrom typing import overload\nimport numpy as np\nfrom numpy.typing import ArrayLike\n\n\nclass Colorizer:\n colorbar: colorbar.Colorbar | None\n callbacks: cbook.CallbackRegistry\n def __init__(\n self,\n cmap: str | colors.Colormap | None = ...,\n norm: str | colors.Normalize | None = ...,\n ) -> None: ...\n @property\n def norm(self) -> colors.Normalize: ...\n @norm.setter\n def norm(self, norm: colors.Normalize | str | None) -> None: ...\n def to_rgba(\n self,\n x: np.ndarray,\n alpha: float | ArrayLike | None = ...,\n bytes: bool = ...,\n norm: bool = ...,\n ) -> np.ndarray: ...\n def autoscale(self, A: ArrayLike) -> None: ...\n def autoscale_None(self, A: ArrayLike) -> None: ...\n @property\n def cmap(self) -> colors.Colormap: ...\n @cmap.setter\n def cmap(self, cmap: colors.Colormap | str | None) -> None: ...\n def get_clim(self) -> tuple[float, float]: ...\n def set_clim(self, vmin: float | tuple[float, float] | None = ..., vmax: float | None = ...) -> None: ...\n def changed(self) -> None: ...\n @property\n def vmin(self) -> float | None: ...\n @vmin.setter\n def vmin(self, value: float | None) -> None: ...\n @property\n def vmax(self) -> float | None: ...\n @vmax.setter\n def vmax(self, value: float | None) -> None: ...\n @property\n def clip(self) -> bool: ...\n @clip.setter\n def clip(self, value: bool) -> None: ...\n\n\nclass _ColorizerInterface:\n cmap: colors.Colormap\n colorbar: colorbar.Colorbar | None\n callbacks: cbook.CallbackRegistry\n def to_rgba(\n self,\n x: np.ndarray,\n alpha: float | ArrayLike | None = ...,\n bytes: bool = ...,\n norm: bool = ...,\n ) -> np.ndarray: ...\n def get_clim(self) -> tuple[float, float]: ...\n def set_clim(self, vmin: float | tuple[float, float] | None = ..., vmax: float | None = ...) 
-> None: ...\n def get_alpha(self) -> float | None: ...\n def get_cmap(self) -> colors.Colormap: ...\n def set_cmap(self, cmap: str | colors.Colormap) -> None: ...\n @property\n def norm(self) -> colors.Normalize: ...\n @norm.setter\n def norm(self, norm: colors.Normalize | str | None) -> None: ...\n def set_norm(self, norm: colors.Normalize | str | None) -> None: ...\n def autoscale(self) -> None: ...\n def autoscale_None(self) -> None: ...\n\n\nclass _ScalarMappable(_ColorizerInterface):\n def __init__(\n self,\n norm: colors.Normalize | None = ...,\n cmap: str | colors.Colormap | None = ...,\n *,\n colorizer: Colorizer | None = ...,\n **kwargs\n ) -> None: ...\n def set_array(self, A: ArrayLike | None) -> None: ...\n def get_array(self) -> np.ndarray | None: ...\n def changed(self) -> None: ...\n\n\nclass ColorizingArtist(_ScalarMappable, artist.Artist):\n callbacks: cbook.CallbackRegistry\n def __init__(\n self,\n colorizer: Colorizer,\n **kwargs\n ) -> None: ...\n def set_array(self, A: ArrayLike | None) -> None: ...\n def get_array(self) -> np.ndarray | None: ...\n def changed(self) -> None: ...\n @property\n def colorizer(self) -> Colorizer: ...\n @colorizer.setter\n def colorizer(self, cl: Colorizer) -> None: ...\n | .venv\Lib\site-packages\matplotlib\colorizer.pyi | colorizer.pyi | Other | 3,308 | 0.85 | 0.411765 | 0.032258 | react-lib | 218 | 2025-02-06T14:41:33.616539 | Apache-2.0 | false | 14e349fcde012ea9769dacb609909a11 |
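Both `Colorizer.set_clim` and `_ColorizerInterface.set_clim` in the stubs above accept the limits either as two arguments or as a single `(vmin, vmax)` tuple in the first position. The unpacking convention can be sketched as follows (hypothetical helper, mirroring the try/except in the `colorizer.py` row earlier):

```python
def coerce_clim(vmin=None, vmax=None):
    """Resolve set_clim-style arguments to a (vmin, vmax) pair.

    A (vmin, vmax) tuple passed as the single first positional argument is
    unpacked; a plain scalar is left as vmin with vmax unset.
    """
    if vmax is None:
        try:
            vmin, vmax = vmin
        except (TypeError, ValueError):
            pass  # vmin is a scalar (or None), not an unpackable pair
    return vmin, vmax
```

Both call styles, `coerce_clim((lo, hi))` and `coerce_clim(lo, hi)`, resolve to the same pair.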
# File: .venv\Lib\site-packages\matplotlib\colors.pyi

from collections.abc import Callable, Iterable, Iterator, Mapping, Sequence
from matplotlib import cbook, scale
import re

from typing import Any, Literal, overload
from .typing import ColorType

import numpy as np
from numpy.typing import ArrayLike

# Explicitly export colors dictionaries which are imported in the impl
BASE_COLORS: dict[str, ColorType]
CSS4_COLORS: dict[str, ColorType]
TABLEAU_COLORS: dict[str, ColorType]
XKCD_COLORS: dict[str, ColorType]

class _ColorMapping(dict[str, ColorType]):
    cache: dict[tuple[ColorType, float | None], tuple[float, float, float, float]]
    def __init__(self, mapping) -> None: ...
    def __setitem__(self, key, value) -> None: ...
    def __delitem__(self, key) -> None: ...

def get_named_colors_mapping() -> _ColorMapping: ...

class ColorSequenceRegistry(Mapping):
    def __init__(self) -> None: ...
    def __getitem__(self, item: str) -> list[ColorType]: ...
    def __iter__(self) -> Iterator[str]: ...
    def __len__(self) -> int: ...
    def register(self, name: str, color_list: Iterable[ColorType]) -> None: ...
    def unregister(self, name: str) -> None: ...

_color_sequences: ColorSequenceRegistry = ...

def is_color_like(c: Any) -> bool: ...
def same_color(c1: ColorType, c2: ColorType) -> bool: ...
def to_rgba(
    c: ColorType, alpha: float | None = ...
) -> tuple[float, float, float, float]: ...
def to_rgba_array(
    c: ColorType | ArrayLike, alpha: float | ArrayLike | None = ...
) -> np.ndarray: ...
def to_rgb(c: ColorType) -> tuple[float, float, float]: ...
def to_hex(c: ColorType, keep_alpha: bool = ...) -> str: ...

cnames: dict[str, ColorType]
hexColorPattern: re.Pattern
rgb2hex = to_hex
hex2color = to_rgb

class ColorConverter:
    colors: _ColorMapping
    cache: dict[tuple[ColorType, float | None], tuple[float, float, float, float]]
    @staticmethod
    def to_rgb(c: ColorType) -> tuple[float, float, float]: ...
    @staticmethod
    def to_rgba(
        c: ColorType, alpha: float | None = ...
    ) -> tuple[float, float, float, float]: ...
    @staticmethod
    def to_rgba_array(
        c: ColorType | ArrayLike, alpha: float | ArrayLike | None = ...
    ) -> np.ndarray: ...

colorConverter: ColorConverter

class Colormap:
    name: str
    N: int
    colorbar_extend: bool
    def __init__(self, name: str, N: int = ...) -> None: ...
    @overload
    def __call__(
        self, X: Sequence[float] | np.ndarray, alpha: ArrayLike | None = ..., bytes: bool = ...
    ) -> np.ndarray: ...
    @overload
    def __call__(
        self, X: float, alpha: float | None = ..., bytes: bool = ...
    ) -> tuple[float, float, float, float]: ...
    @overload
    def __call__(
        self, X: ArrayLike, alpha: ArrayLike | None = ..., bytes: bool = ...
    ) -> tuple[float, float, float, float] | np.ndarray: ...
    def __copy__(self) -> Colormap: ...
    def __eq__(self, other: object) -> bool: ...
    def get_bad(self) -> np.ndarray: ...
    def set_bad(self, color: ColorType = ..., alpha: float | None = ...) -> None: ...
    def get_under(self) -> np.ndarray: ...
    def set_under(self, color: ColorType = ..., alpha: float | None = ...) -> None: ...
    def get_over(self) -> np.ndarray: ...
    def set_over(self, color: ColorType = ..., alpha: float | None = ...) -> None: ...
    def set_extremes(
        self,
        *,
        bad: ColorType | None = ...,
        under: ColorType | None = ...,
        over: ColorType | None = ...
    ) -> None: ...
    def with_extremes(
        self,
        *,
        bad: ColorType | None = ...,
        under: ColorType | None = ...,
        over: ColorType | None = ...
    ) -> Colormap: ...
    def is_gray(self) -> bool: ...
    def resampled(self, lutsize: int) -> Colormap: ...
    def reversed(self, name: str | None = ...) -> Colormap: ...
    def _repr_html_(self) -> str: ...
    def _repr_png_(self) -> bytes: ...
    def copy(self) -> Colormap: ...

class LinearSegmentedColormap(Colormap):
    monochrome: bool
    def __init__(
        self,
        name: str,
        segmentdata: dict[
            Literal["red", "green", "blue", "alpha"], Sequence[tuple[float, ...]]
        ],
        N: int = ...,
        gamma: float = ...,
    ) -> None: ...
    def set_gamma(self, gamma: float) -> None: ...
    @staticmethod
    def from_list(
        name: str, colors: ArrayLike | Sequence[tuple[float, ColorType]], N: int = ..., gamma: float = ...
    ) -> LinearSegmentedColormap: ...
    def resampled(self, lutsize: int) -> LinearSegmentedColormap: ...
    def reversed(self, name: str | None = ...) -> LinearSegmentedColormap: ...

class ListedColormap(Colormap):
    monochrome: bool
    colors: ArrayLike | ColorType
    def __init__(
        self, colors: ArrayLike | ColorType, name: str = ..., N: int | None = ...
    ) -> None: ...
    def resampled(self, lutsize: int) -> ListedColormap: ...
    def reversed(self, name: str | None = ...) -> ListedColormap: ...

class MultivarColormap:
    name: str
    n_variates: int
    def __init__(self, colormaps: list[Colormap], combination_mode: Literal['sRGB_add', 'sRGB_sub'], name: str = ...) -> None: ...
    @overload
    def __call__(
        self, X: Sequence[Sequence[float]] | np.ndarray, alpha: ArrayLike | None = ..., bytes: bool = ..., clip: bool = ...
    ) -> np.ndarray: ...
    @overload
    def __call__(
        self, X: Sequence[float], alpha: float | None = ..., bytes: bool = ..., clip: bool = ...
    ) -> tuple[float, float, float, float]: ...
    @overload
    def __call__(
        self, X: ArrayLike, alpha: ArrayLike | None = ..., bytes: bool = ..., clip: bool = ...
    ) -> tuple[float, float, float, float] | np.ndarray: ...
    def copy(self) -> MultivarColormap: ...
    def __copy__(self) -> MultivarColormap: ...
    def __eq__(self, other: Any) -> bool: ...
    def __getitem__(self, item: int) -> Colormap: ...
    def __iter__(self) -> Iterator[Colormap]: ...
    def __len__(self) -> int: ...
    def get_bad(self) -> np.ndarray: ...
    def resampled(self, lutshape: Sequence[int | None]) -> MultivarColormap: ...
    def with_extremes(
        self,
        *,
        bad: ColorType | None = ...,
        under: Sequence[ColorType] | None = ...,
        over: Sequence[ColorType] | None = ...
    ) -> MultivarColormap: ...
    @property
    def combination_mode(self) -> str: ...
    def _repr_html_(self) -> str: ...
    def _repr_png_(self) -> bytes: ...

class BivarColormap:
    name: str
    N: int
    M: int
    n_variates: int
    def __init__(
        self, N: int = ..., M: int | None = ..., shape: Literal['square', 'circle', 'ignore', 'circleignore'] = ...,
        origin: Sequence[float] = ..., name: str = ...
    ) -> None: ...
    @overload
    def __call__(
        self, X: Sequence[Sequence[float]] | np.ndarray, alpha: ArrayLike | None = ..., bytes: bool = ...
    ) -> np.ndarray: ...
    @overload
    def __call__(
        self, X: Sequence[float], alpha: float | None = ..., bytes: bool = ...
    ) -> tuple[float, float, float, float]: ...
    @overload
    def __call__(
        self, X: ArrayLike, alpha: ArrayLike | None = ..., bytes: bool = ...
    ) -> tuple[float, float, float, float] | np.ndarray: ...
    @property
    def lut(self) -> np.ndarray: ...
    @property
    def shape(self) -> str: ...
    @property
    def origin(self) -> tuple[float, float]: ...
    def copy(self) -> BivarColormap: ...
    def __copy__(self) -> BivarColormap: ...
    def __getitem__(self, item: int) -> Colormap: ...
    def __eq__(self, other: Any) -> bool: ...
    def get_bad(self) -> np.ndarray: ...
    def get_outside(self) -> np.ndarray: ...
    def resampled(self, lutshape: Sequence[int | None], transposed: bool = ...) -> BivarColormap: ...
    def transposed(self) -> BivarColormap: ...
    def reversed(self, axis_0: bool = ..., axis_1: bool = ...) -> BivarColormap: ...
    def with_extremes(
        self,
        *,
        bad: ColorType | None = ...,
        outside: ColorType | None = ...,
        shape: str | None = ...,
        origin: None | Sequence[float] = ...,
    ) -> MultivarColormap: ...
    def _repr_html_(self) -> str: ...
    def _repr_png_(self) -> bytes: ...

class SegmentedBivarColormap(BivarColormap):
    def __init__(
        self, patch: np.ndarray, N: int = ..., shape: Literal['square', 'circle', 'ignore', 'circleignore'] = ...,
        origin: Sequence[float] = ..., name: str = ...
    ) -> None: ...

class BivarColormapFromImage(BivarColormap):
    def __init__(
        self, lut: np.ndarray, shape: Literal['square', 'circle', 'ignore', 'circleignore'] = ...,
        origin: Sequence[float] = ..., name: str = ...
    ) -> None: ...

class Normalize:
    callbacks: cbook.CallbackRegistry
    def __init__(
        self, vmin: float | None = ..., vmax: float | None = ..., clip: bool = ...
    ) -> None: ...
    @property
    def vmin(self) -> float | None: ...
    @vmin.setter
    def vmin(self, value: float | None) -> None: ...
    @property
    def vmax(self) -> float | None: ...
    @vmax.setter
    def vmax(self, value: float | None) -> None: ...
    @property
    def clip(self) -> bool: ...
    @clip.setter
    def clip(self, value: bool) -> None: ...
    @staticmethod
    def process_value(value: ArrayLike) -> tuple[np.ma.MaskedArray, bool]: ...
    @overload
    def __call__(self, value: float, clip: bool | None = ...) -> float: ...
    @overload
    def __call__(self, value: np.ndarray, clip: bool | None = ...) -> np.ma.MaskedArray: ...
    @overload
    def __call__(self, value: ArrayLike, clip: bool | None = ...) -> ArrayLike: ...
    @overload
    def inverse(self, value: float) -> float: ...
    @overload
    def inverse(self, value: np.ndarray) -> np.ma.MaskedArray: ...
    @overload
    def inverse(self, value: ArrayLike) -> ArrayLike: ...
    def autoscale(self, A: ArrayLike) -> None: ...
    def autoscale_None(self, A: ArrayLike) -> None: ...
    def scaled(self) -> bool: ...

class TwoSlopeNorm(Normalize):
    def __init__(
        self, vcenter: float, vmin: float | None = ..., vmax: float | None = ...
    ) -> None: ...
    @property
    def vcenter(self) -> float: ...
    @vcenter.setter
    def vcenter(self, value: float) -> None: ...
    def autoscale_None(self, A: ArrayLike) -> None: ...

class CenteredNorm(Normalize):
    def __init__(
        self, vcenter: float = ..., halfrange: float | None = ..., clip: bool = ...
    ) -> None: ...
    @property
    def vcenter(self) -> float: ...
    @vcenter.setter
    def vcenter(self, vcenter: float) -> None: ...
    @property
    def halfrange(self) -> float: ...
    @halfrange.setter
    def halfrange(self, halfrange: float) -> None: ...

@overload
def make_norm_from_scale(
    scale_cls: type[scale.ScaleBase],
    base_norm_cls: type[Normalize],
    *,
    init: Callable | None = ...
) -> type[Normalize]: ...
@overload
def make_norm_from_scale(
    scale_cls: type[scale.ScaleBase],
    base_norm_cls: None = ...,
    *,
    init: Callable | None = ...
) -> Callable[[type[Normalize]], type[Normalize]]: ...

class FuncNorm(Normalize):
    def __init__(
        self,
        functions: tuple[Callable, Callable],
        vmin: float | None = ...,
        vmax: float | None = ...,
        clip: bool = ...,
    ) -> None: ...

class LogNorm(Normalize): ...

class SymLogNorm(Normalize):
    def __init__(
        self,
        linthresh: float,
        linscale: float = ...,
        vmin: float | None = ...,
        vmax: float | None = ...,
        clip: bool = ...,
        *,
        base: float = ...,
    ) -> None: ...
    @property
    def linthresh(self) -> float: ...
    @linthresh.setter
    def linthresh(self, value: float) -> None: ...

class AsinhNorm(Normalize):
    def __init__(
        self,
        linear_width: float = ...,
        vmin: float | None = ...,
        vmax: float | None = ...,
        clip: bool = ...,
    ) -> None: ...
    @property
    def linear_width(self) -> float: ...
    @linear_width.setter
    def linear_width(self, value: float) -> None: ...

class PowerNorm(Normalize):
    gamma: float
    def __init__(
        self,
        gamma: float,
        vmin: float | None = ...,
        vmax: float | None = ...,
        clip: bool = ...,
    ) -> None: ...

class BoundaryNorm(Normalize):
    boundaries: np.ndarray
    N: int
    Ncmap: int
    extend: Literal["neither", "both", "min", "max"]
    def __init__(
        self,
        boundaries: ArrayLike,
        ncolors: int,
        clip: bool = ...,
        *,
        extend: Literal["neither", "both", "min", "max"] = ...
    ) -> None: ...

class NoNorm(Normalize): ...

def rgb_to_hsv(arr: ArrayLike) -> np.ndarray: ...
def hsv_to_rgb(hsv: ArrayLike) -> np.ndarray: ...

class LightSource:
    azdeg: float
    altdeg: float
    hsv_min_val: float
    hsv_max_val: float
    hsv_min_sat: float
    hsv_max_sat: float
    def __init__(
        self,
        azdeg: float = ...,
        altdeg: float = ...,
        hsv_min_val: float = ...,
        hsv_max_val: float = ...,
        hsv_min_sat: float = ...,
        hsv_max_sat: float = ...,
    ) -> None: ...
    @property
    def direction(self) -> np.ndarray: ...
    def hillshade(
        self,
        elevation: ArrayLike,
        vert_exag: float = ...,
        dx: float = ...,
        dy: float = ...,
        fraction: float = ...,
    ) -> np.ndarray: ...
    def shade_normals(
        self, normals: np.ndarray, fraction: float = ...
    ) -> np.ndarray: ...
    def shade(
        self,
        data: ArrayLike,
        cmap: Colormap,
        norm: Normalize | None = ...,
        blend_mode: Literal["hsv", "overlay", "soft"] | Callable = ...,
        vmin: float | None = ...,
        vmax: float | None = ...,
        vert_exag: float = ...,
        dx: float = ...,
        dy: float = ...,
        fraction: float = ...,
        **kwargs
    ) -> np.ndarray: ...
    def shade_rgb(
        self,
        rgb: ArrayLike,
        elevation: ArrayLike,
        fraction: float = ...,
        blend_mode: Literal["hsv", "overlay", "soft"] | Callable = ...,
        vert_exag: float = ...,
        dx: float = ...,
        dy: float = ...,
        **kwargs
    ) -> np.ndarray: ...
    def blend_hsv(
        self,
        rgb: ArrayLike,
        intensity: ArrayLike,
        hsv_max_sat: float | None = ...,
        hsv_max_val: float | None = ...,
        hsv_min_val: float | None = ...,
        hsv_min_sat: float | None = ...,
    ) -> ArrayLike: ...
    def blend_soft_light(
        self, rgb: np.ndarray, intensity: np.ndarray
    ) -> np.ndarray: ...
    def blend_overlay(self, rgb: np.ndarray, intensity: np.ndarray) -> np.ndarray: ...

def from_levels_and_colors(
    levels: Sequence[float],
    colors: Sequence[ColorType],
    extend: Literal["neither", "min", "max", "both"] = ...,
) -> tuple[ListedColormap, BoundaryNorm]: ...
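As a minimal sketch of how several of the functions stubbed above fit together, here is the color-conversion path (`to_hex`, `to_rgba`) and the construction of a discrete colormap plus `BoundaryNorm` via `from_levels_and_colors` (assumes Matplotlib is installed):

```python
import matplotlib.colors as mcolors

# Color conversion: named color -> hex string / RGBA tuple.
hex_red = mcolors.to_hex("red")            # "#ff0000"
r, g, b, a = mcolors.to_rgba("red", alpha=0.5)

# Discrete colormap + norm from level edges: values in [0, 1) map to the
# first color, values in [1, 2) to the second.
cmap, norm = mcolors.from_levels_and_colors([0, 1, 2], ["red", "blue"])
idx_lo = int(norm(0.5))   # index of the first color
idx_hi = int(norm(1.5))   # index of the second color
```

The `(cmap, norm)` pair returned here is exactly what plotting functions such as `pcolormesh` accept through their `cmap=`/`norm=` keyword arguments.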
# File: .venv\Lib\site-packages\matplotlib\container.py

from matplotlib import cbook
from matplotlib.artist import Artist


class Container(tuple):
    """
    Base class for containers.

    Containers are classes that collect semantically related Artists such as
    the bars of a bar plot.
    """

    def __repr__(self):
        return f"<{type(self).__name__} object of {len(self)} artists>"

    def __new__(cls, *args, **kwargs):
        return tuple.__new__(cls, args[0])

    def __init__(self, kl, label=None):
        self._callbacks = cbook.CallbackRegistry(signals=["pchanged"])
        self._remove_method = None
        self._label = str(label) if label is not None else None

    def remove(self):
        for c in cbook.flatten(
                self, scalarp=lambda x: isinstance(x, Artist)):
            if c is not None:
                c.remove()
        if self._remove_method:
            self._remove_method(self)

    def get_children(self):
        return [child for child in cbook.flatten(self) if child is not None]

    get_label = Artist.get_label
    set_label = Artist.set_label
    add_callback = Artist.add_callback
    remove_callback = Artist.remove_callback
    pchanged = Artist.pchanged


class BarContainer(Container):
    """
    Container for the artists of bar plots (e.g. created by `.Axes.bar`).

    The container can be treated as a tuple of the *patches* themselves.
    Additionally, you can access these and further parameters by the
    attributes.

    Attributes
    ----------
    patches : list of :class:`~matplotlib.patches.Rectangle`
        The artists of the bars.

    errorbar : None or :class:`~matplotlib.container.ErrorbarContainer`
        A container for the error bar artists if error bars are present.
        *None* otherwise.

    datavalues : None or array-like
        The underlying data values corresponding to the bars.

    orientation : {'vertical', 'horizontal'}, default: None
        If 'vertical', the bars are assumed to be vertical.
        If 'horizontal', the bars are assumed to be horizontal.
    """

    def __init__(self, patches, errorbar=None, *, datavalues=None,
                 orientation=None, **kwargs):
        self.patches = patches
        self.errorbar = errorbar
        self.datavalues = datavalues
        self.orientation = orientation
        super().__init__(patches, **kwargs)


class ErrorbarContainer(Container):
    """
    Container for the artists of error bars (e.g. created by `.Axes.errorbar`).

    The container can be treated as the *lines* tuple itself.
    Additionally, you can access these and further parameters by the
    attributes.

    Attributes
    ----------
    lines : tuple
        Tuple of ``(data_line, caplines, barlinecols)``.

        - data_line : A `~matplotlib.lines.Line2D` instance of x, y plot markers
          and/or line.
        - caplines : A tuple of `~matplotlib.lines.Line2D` instances of the error
          bar caps.
        - barlinecols : A tuple of `~matplotlib.collections.LineCollection` with the
          horizontal and vertical error ranges.

    has_xerr, has_yerr : bool
        ``True`` if the errorbar has x/y errors.
    """

    def __init__(self, lines, has_xerr=False, has_yerr=False, **kwargs):
        self.lines = lines
        self.has_xerr = has_xerr
        self.has_yerr = has_yerr
        super().__init__(lines, **kwargs)


class StemContainer(Container):
    """
    Container for the artists created in a :meth:`.Axes.stem` plot.

    The container can be treated like a namedtuple ``(markerline, stemlines,
    baseline)``.

    Attributes
    ----------
    markerline : `~matplotlib.lines.Line2D`
        The artist of the markers at the stem heads.

    stemlines : `~matplotlib.collections.LineCollection`
        The artists of the vertical lines for all stems.

    baseline : `~matplotlib.lines.Line2D`
        The artist of the horizontal baseline.
    """

    def __init__(self, markerline_stemlines_baseline, **kwargs):
        """
        Parameters
        ----------
        markerline_stemlines_baseline : tuple
            Tuple of ``(markerline, stemlines, baseline)``.
            ``markerline`` contains the `.Line2D` of the markers,
            ``stemlines`` is a `.LineCollection` of the main lines,
            ``baseline`` is the `.Line2D` of the baseline.
        """
        markerline, stemlines, baseline = markerline_stemlines_baseline
        self.markerline = markerline
        self.stemlines = stemlines
        self.baseline = baseline
        super().__init__(markerline_stemlines_baseline, **kwargs)


# File: .venv\Lib\site-packages\matplotlib\container.pyi

from matplotlib.artist import Artist
from matplotlib.lines import Line2D
from matplotlib.collections import LineCollection
from matplotlib.patches import Rectangle

from collections.abc import Callable
from typing import Any, Literal
from numpy.typing import ArrayLike

class Container(tuple):
    def __new__(cls, *args, **kwargs): ...
    def __init__(self, kl, label: Any | None = ...) -> None: ...
    def remove(self) -> None: ...
    def get_children(self) -> list[Artist]: ...
    def get_label(self) -> str | None: ...
    def set_label(self, s: Any) -> None: ...
    def add_callback(self, func: Callable[[Artist], Any]) -> int: ...
    def remove_callback(self, oid: int) -> None: ...
    def pchanged(self) -> None: ...

class BarContainer(Container):
    patches: list[Rectangle]
    errorbar: None | ErrorbarContainer
    datavalues: None | ArrayLike
    orientation: None | Literal["vertical", "horizontal"]
    def __init__(
        self,
        patches: list[Rectangle],
        errorbar: ErrorbarContainer | None = ...,
        *,
        datavalues: ArrayLike | None = ...,
        orientation: Literal["vertical", "horizontal"] | None = ...,
        **kwargs
    ) -> None: ...

class ErrorbarContainer(Container):
    lines: tuple[Line2D, tuple[Line2D, ...], tuple[LineCollection, ...]]
    has_xerr: bool
    has_yerr: bool
    def __init__(
        self,
        lines: tuple[Line2D, tuple[Line2D, ...], tuple[LineCollection, ...]],
        has_xerr: bool = ...,
        has_yerr: bool = ...,
        **kwargs
    ) -> None: ...

class StemContainer(Container):
    markerline: Line2D
    stemlines: LineCollection
    baseline: Line2D
    def __init__(
        self,
        markerline_stemlines_baseline: tuple[Line2D, LineCollection, Line2D],
        **kwargs
    ) -> None: ...
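Because `Container` subclasses `tuple`, the object returned by `Axes.bar` can be iterated, indexed, and queried through the attributes documented above. A minimal sketch, run headlessly with the Agg backend (assumes Matplotlib is installed):

```python
import matplotlib
matplotlib.use("Agg")  # headless backend; no display needed
import matplotlib.pyplot as plt
from matplotlib.container import BarContainer

fig, ax = plt.subplots()
bars = ax.bar(["a", "b", "c"], [1, 2, 3], label="demo")

# A BarContainer is a tuple of the bar patches, with extra attributes.
assert isinstance(bars, BarContainer)
n_bars = len(bars)                            # tuple protocol: 3 rectangles
heights = [p.get_height() for p in bars.patches]
label = bars.get_label()                      # borrowed from Artist.get_label
plt.close(fig)
```

`remove()` on the container removes every child artist at once, which is why legends and interactive tools treat the container as a single logical entity.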
# File: matplotlib/contour.py

"""
Classes to support contour plotting and labelling for the Axes class.
"""

from contextlib import ExitStack
import functools
import math
from numbers import Integral

import numpy as np
from numpy import ma

import matplotlib as mpl
from matplotlib import _api, _docstring
from matplotlib.backend_bases import MouseButton
from matplotlib.lines import Line2D
from matplotlib.path import Path
from matplotlib.text import Text
import matplotlib.ticker as ticker
import matplotlib.cm as cm
import matplotlib.colors as mcolors
import matplotlib.collections as mcoll
import matplotlib.font_manager as font_manager
import matplotlib.cbook as cbook
import matplotlib.patches as mpatches
import matplotlib.transforms as mtransforms


def _contour_labeler_event_handler(cs, inline, inline_spacing, event):
    canvas = cs.axes.get_figure(root=True).canvas
    is_button = event.name == "button_press_event"
    is_key = event.name == "key_press_event"
    # Quit (even if not in infinite mode; this is consistent with
    # MATLAB and sometimes quite useful, but will require the user to
    # test how many points were actually returned before using data).
    if (is_button and event.button == MouseButton.MIDDLE
            or is_key and event.key in ["escape", "enter"]):
        canvas.stop_event_loop()
    # Pop last click.
    elif (is_button and event.button == MouseButton.RIGHT
            or is_key and event.key in ["backspace", "delete"]):
        # Unfortunately, if one is doing inline labels, then there is currently
        # no way to fix the broken contour - once humpty-dumpty is broken, he
        # can't be put back together.  In inline mode, this does nothing.
        if not inline:
            cs.pop_label()
            canvas.draw()
    # Add new click.
    elif (is_button and event.button == MouseButton.LEFT
            # On macOS/gtk, some keys return None.
            or is_key and event.key is not None):
        if cs.axes.contains(event)[0]:
            cs.add_label_near(event.x, event.y, transform=False,
                              inline=inline, inline_spacing=inline_spacing)
            canvas.draw()


class ContourLabeler:
    """Mixin to provide labelling capability to `.ContourSet`."""

    def clabel(self, levels=None, *,
               fontsize=None, inline=True, inline_spacing=5, fmt=None,
               colors=None, use_clabeltext=False, manual=False,
               rightside_up=True, zorder=None):
        """
        Label a contour plot.

        Adds labels to line contours in this `.ContourSet` (which inherits from
        this mixin class).

        Parameters
        ----------
        levels : array-like, optional
            A list of level values that should be labeled. The list must be
            a subset of ``cs.levels``. If not given, all levels are labeled.

        fontsize : str or float, default: :rc:`font.size`
            Size in points or relative size e.g., 'smaller', 'x-large'.
            See `.Text.set_size` for accepted string values.

        colors : :mpltype:`color` or colors or None, default: None
            The label colors:

            - If *None*, the color of each label matches the color of
              the corresponding contour.

            - If one string color, e.g., *colors* = 'r' or *colors* =
              'red', all labels will be plotted in this color.

            - If a tuple of colors (string, float, RGB, etc), different labels
              will be plotted in different colors in the order specified.

        inline : bool, default: True
            If ``True`` the underlying contour is removed where the label is
            placed.

        inline_spacing : float, default: 5
            Space in pixels to leave on each side of label when placing inline.

            This spacing will be exact for labels at locations where the
            contour is straight, less so for labels on curved contours.

        fmt : `.Formatter` or str or callable or dict, optional
            How the levels are formatted:

            - If a `.Formatter`, it is used to format all levels at once, using
              its `.Formatter.format_ticks` method.
            - If a str, it is interpreted as a %-style format string.
            - If a callable, it is called with one level at a time and should
              return the corresponding label.
            - If a dict, it should directly map levels to labels.

            The default is to use a standard `.ScalarFormatter`.

        manual : bool or iterable, default: False
            If ``True``, contour labels will be placed manually using
            mouse clicks. Click the first button near a contour to
            add a label, click the second button (or potentially both
            mouse buttons at once) to finish adding labels. The third
            button can be used to remove the last label added, but
            only if labels are not inline. Alternatively, the keyboard
            can be used to select label locations (enter to end label
            placement, delete or backspace act like the third mouse button,
            and any other key will select a label location).

            *manual* can also be an iterable object of (x, y) tuples.
            Contour labels will be created as if mouse is clicked at each
            (x, y) position.

        rightside_up : bool, default: True
            If ``True``, label rotations will always be plus
            or minus 90 degrees from level.

        use_clabeltext : bool, default: False
            If ``True``, use `.Text.set_transform_rotates_text` to ensure that
            label rotation is updated whenever the Axes aspect changes.

        zorder : float or None, default: ``(2 + contour.get_zorder())``
            zorder of the contour labels.

        Returns
        -------
        labels
            A list of `.Text` instances for the labels.
        """

        # Based on the input arguments, clabel() adds a list of "label
        # specific" attributes to the ContourSet object.  These attributes are
        # all of the form label* and names should be fairly self explanatory.
        #
        # Once these attributes are set, clabel passes control to the labels()
        # method (for automatic label placement) or blocking_input_loop and
        # _contour_labeler_event_handler (for manual label placement).

        if fmt is None:
            fmt = ticker.ScalarFormatter(useOffset=False)
            fmt.create_dummy_axis()
        self.labelFmt = fmt
        self._use_clabeltext = use_clabeltext
        self.labelManual = manual
        self.rightside_up = rightside_up
        self._clabel_zorder = 2 + self.get_zorder() if zorder is None else zorder

        if levels is None:
            levels = self.levels
            indices = list(range(len(self.cvalues)))
        else:
            levlabs = list(levels)
            indices, levels = [], []
            for i, lev in enumerate(self.levels):
                if lev in levlabs:
                    indices.append(i)
                    levels.append(lev)
            if len(levels) < len(levlabs):
                raise ValueError(f"Specified levels {levlabs} don't match "
                                 f"available levels {self.levels}")
        self.labelLevelList = levels
        self.labelIndiceList = indices

        self._label_font_props = font_manager.FontProperties(size=fontsize)

        if colors is None:
            self.labelMappable = self
            self.labelCValueList = np.take(self.cvalues, self.labelIndiceList)
        else:
            cmap = mcolors.ListedColormap(colors, N=len(self.labelLevelList))
            self.labelCValueList = list(range(len(self.labelLevelList)))
            self.labelMappable = cm.ScalarMappable(cmap=cmap,
                                                   norm=mcolors.NoNorm())

        self.labelXYs = []

        if np.iterable(manual):
            for x, y in manual:
                self.add_label_near(x, y, inline, inline_spacing)
        elif manual:
            print('Select label locations manually using first mouse button.')
            print('End manual selection with second mouse button.')
            if not inline:
                print('Remove last label by clicking third mouse button.')
            mpl._blocking_input.blocking_input_loop(
                self.axes.get_figure(root=True),
                ["button_press_event", "key_press_event"],
                timeout=-1, handler=functools.partial(
                    _contour_labeler_event_handler,
                    self, inline, inline_spacing))
        else:
            self.labels(inline, inline_spacing)

        return cbook.silent_list('text.Text', self.labelTexts)

    def print_label(self, linecontour, labelwidth):
        """Return whether a contour is long enough to hold a label."""
        return (len(linecontour) > 10 * labelwidth
                or (len(linecontour)
                    and (np.ptp(linecontour, axis=0) > 1.2 * labelwidth).any()))

    def too_close(self, x, y, lw):
        """Return whether a label is already near this location."""
        thresh = (1.2 * lw) ** 2
        return any((x - loc[0]) ** 2 + (y - loc[1]) ** 2 < thresh
                   for loc in self.labelXYs)

    def _get_nth_label_width(self, nth):
        """Return the width of the *nth* label, in pixels."""
        fig = self.axes.get_figure(root=False)
        renderer = fig.get_figure(root=True)._get_renderer()
        return (Text(0, 0,
                     self.get_text(self.labelLevelList[nth], self.labelFmt),
                     figure=fig, fontproperties=self._label_font_props)
                .get_window_extent(renderer).width)

    def get_text(self, lev, fmt):
        """Get the text of the label."""
        if isinstance(lev, str):
            return lev
        elif isinstance(fmt, dict):
            return fmt.get(lev, '%1.3f')
        elif callable(getattr(fmt, "format_ticks", None)):
            return fmt.format_ticks([*self.labelLevelList, lev])[-1]
        elif callable(fmt):
            return fmt(lev)
        else:
            return fmt % lev

    def locate_label(self, linecontour, labelwidth):
        """
        Find good place to draw a label (relatively flat part of the contour).
        """
        ctr_size = len(linecontour)
        n_blocks = int(np.ceil(ctr_size / labelwidth)) if labelwidth > 1 else 1
        block_size = ctr_size if n_blocks == 1 else int(labelwidth)
        # Split contour into blocks of length ``block_size``, filling the last
        # block by cycling the contour start (per `np.resize` semantics).  (Due
        # to cycling, the index returned is taken modulo ctr_size.)
        xx = np.resize(linecontour[:, 0], (n_blocks, block_size))
        yy = np.resize(linecontour[:, 1], (n_blocks, block_size))
        yfirst = yy[:, :1]
        ylast = yy[:, -1:]
        xfirst = xx[:, :1]
        xlast = xx[:, -1:]
        s = (yfirst - yy) * (xlast - xfirst) - (xfirst - xx) * (ylast - yfirst)
        l = np.hypot(xlast - xfirst, ylast - yfirst)
        # Ignore warning that divide by zero throws, as this is a valid option
        with np.errstate(divide='ignore', invalid='ignore'):
            distances = (abs(s) / l).sum(axis=-1)
        # Labels are drawn in the middle of the block (``hbsize``) where the
        # contour is the closest (per ``distances``) to a straight line, but
        # not `too_close()` to a preexisting label.
        hbsize = block_size // 2
        adist = np.argsort(distances)
        # If all candidates are `too_close()`, go back to the straightest part
        # (``adist[0]``).
        for idx in np.append(adist, adist[0]):
            x, y = xx[idx, hbsize], yy[idx, hbsize]
            if not self.too_close(x, y, labelwidth):
                break
        return x, y, (idx * block_size + hbsize) % ctr_size

    def _split_path_and_get_label_rotation(self, path, idx, screen_pos, lw, spacing=5):
        """
        Prepare for insertion of a label at index *idx* of *path*.

        Parameters
        ----------
        path : Path
            The path where the label will be inserted, in data space.
        idx : int
            The vertex index after which the label will be inserted.
        screen_pos : (float, float)
            The position where the label will be inserted, in screen space.
        lw : float
            The label width, in screen space.
        spacing : float
            Extra spacing around the label, in screen space.

        Returns
        -------
        path : Path
            The path, broken so that the label can be drawn over it.
        angle : float
            The rotation of the label.

        Notes
        -----
        Both tasks are done together to avoid calculating path lengths multiple times,
        which is relatively costly.

        The method used here involves computing the path length along the contour in
        pixel coordinates and then looking (label width / 2) away from central point to
        determine rotation and then to break contour if desired.  The extra spacing is
        taken into account when breaking the path, but not when computing the angle.
        """
        xys = path.vertices
        codes = path.codes

        # Insert a vertex at idx/pos (converting back to data space), if there isn't yet
        # a vertex there.  With infinite precision one could also always insert the
        # extra vertex (it will get masked out by the label below anyways), but floating
        # point inaccuracies (the point can have undergone a data->screen->data
        # transform loop) can slightly shift the point and e.g. shift the angle computed
        # below from exactly zero to nonzero.
        pos = self.get_transform().inverted().transform(screen_pos)
        if not np.allclose(pos, xys[idx]):
            xys = np.insert(xys, idx, pos, axis=0)
            codes = np.insert(codes, idx, Path.LINETO)

        # Find the connected component where the label will be inserted.  Note that a
        # path always starts with a MOVETO, and we consider there's an implicit
        # MOVETO (closing the last path) at the end.
        movetos = (codes == Path.MOVETO).nonzero()[0]
        start = movetos[movetos <= idx][-1]
        try:
            stop = movetos[movetos > idx][0]
        except IndexError:
            stop = len(codes)

        # Restrict ourselves to the connected component.
        cc_xys = xys[start:stop]
        idx -= start

        # If the path is closed, rotate it s.t. it starts at the label.
        is_closed_path = codes[stop - 1] == Path.CLOSEPOLY
        if is_closed_path:
            cc_xys = np.concatenate([cc_xys[idx:-1], cc_xys[:idx+1]])
            idx = 0

        # Like np.interp, but additionally vectorized over fp.
        def interp_vec(x, xp, fp): return [np.interp(x, xp, col) for col in fp.T]

        # Use cumulative path lengths ("cpl") as curvilinear coordinate along contour.
        screen_xys = self.get_transform().transform(cc_xys)
        path_cpls = np.insert(
            np.cumsum(np.hypot(*np.diff(screen_xys, axis=0).T)), 0, 0)
        path_cpls -= path_cpls[idx]

        # Use linear interpolation to get end coordinates of label.
        target_cpls = np.array([-lw/2, lw/2])
        if is_closed_path:  # For closed paths, target from the other end.
            target_cpls[0] += (path_cpls[-1] - path_cpls[0])
        (sx0, sx1), (sy0, sy1) = interp_vec(target_cpls, path_cpls, screen_xys)
        angle = np.rad2deg(np.arctan2(sy1 - sy0, sx1 - sx0))  # Screen space.
        if self.rightside_up:  # Fix angle so text is never upside-down
            angle = (angle + 90) % 180 - 90

        target_cpls += [-spacing, +spacing]  # Expand range by spacing.

        # Get indices near points of interest; use -1 as out of bounds marker.
        i0, i1 = np.interp(target_cpls, path_cpls, range(len(path_cpls)),
                           left=-1, right=-1)
        i0 = math.floor(i0)
        i1 = math.ceil(i1)
        (x0, x1), (y0, y1) = interp_vec(target_cpls, path_cpls, cc_xys)

        # Actually break contours (dropping zero-len parts).
        new_xy_blocks = []
        new_code_blocks = []
        if is_closed_path:
            if i0 != -1 and i1 != -1:
                # This is probably wrong in the case that the entire contour would
                # be discarded, but ensures that a valid path is returned and is
                # consistent with behavior of mpl <3.8
                points = cc_xys[i1:i0+1]
                new_xy_blocks.extend([[(x1, y1)], points, [(x0, y0)]])
                nlines = len(points) + 1
                new_code_blocks.extend([[Path.MOVETO], [Path.LINETO] * nlines])
        else:
            if i0 != -1:
                new_xy_blocks.extend([cc_xys[:i0 + 1], [(x0, y0)]])
                new_code_blocks.extend([[Path.MOVETO], [Path.LINETO]
* (i0 + 1)])\n if i1 != -1:\n new_xy_blocks.extend([[(x1, y1)], cc_xys[i1:]])\n new_code_blocks.extend([\n [Path.MOVETO], [Path.LINETO] * (len(cc_xys) - i1)])\n\n # Back to the full path.\n xys = np.concatenate([xys[:start], *new_xy_blocks, xys[stop:]])\n codes = np.concatenate([codes[:start], *new_code_blocks, codes[stop:]])\n\n return angle, Path(xys, codes)\n\n def add_label(self, x, y, rotation, lev, cvalue):\n """Add a contour label, respecting whether *use_clabeltext* was set."""\n data_x, data_y = self.axes.transData.inverted().transform((x, y))\n t = Text(\n data_x, data_y,\n text=self.get_text(lev, self.labelFmt),\n rotation=rotation,\n horizontalalignment='center', verticalalignment='center',\n zorder=self._clabel_zorder,\n color=self.labelMappable.to_rgba(cvalue, alpha=self.get_alpha()),\n fontproperties=self._label_font_props,\n clip_box=self.axes.bbox)\n if self._use_clabeltext:\n data_rotation, = self.axes.transData.inverted().transform_angles(\n [rotation], [[x, y]])\n t.set(rotation=data_rotation, transform_rotates_text=True)\n self.labelTexts.append(t)\n self.labelCValues.append(cvalue)\n self.labelXYs.append((x, y))\n # Add label to plot here - useful for manual mode label selection\n self.axes.add_artist(t)\n\n def add_label_near(self, x, y, inline=True, inline_spacing=5,\n transform=None):\n """\n Add a label near the point ``(x, y)``.\n\n Parameters\n ----------\n x, y : float\n The approximate location of the label.\n inline : bool, default: True\n If *True* remove the segment of the contour beneath the label.\n inline_spacing : int, default: 5\n Space in pixels to leave on each side of label when placing\n inline. This spacing will be exact for labels at locations where\n the contour is straight, less so for labels on curved contours.\n transform : `.Transform` or `False`, default: ``self.axes.transData``\n A transform applied to ``(x, y)`` before labeling. The default\n causes ``(x, y)`` to be interpreted as data coordinates. 
`False`\n is a synonym for `.IdentityTransform`; i.e. ``(x, y)`` should be\n interpreted as display coordinates.\n """\n\n if transform is None:\n transform = self.axes.transData\n if transform:\n x, y = transform.transform((x, y))\n\n idx_level_min, idx_vtx_min, proj = self._find_nearest_contour(\n (x, y), self.labelIndiceList)\n path = self._paths[idx_level_min]\n level = self.labelIndiceList.index(idx_level_min)\n label_width = self._get_nth_label_width(level)\n rotation, path = self._split_path_and_get_label_rotation(\n path, idx_vtx_min, proj, label_width, inline_spacing)\n self.add_label(*proj, rotation, self.labelLevelList[idx_level_min],\n self.labelCValueList[idx_level_min])\n\n if inline:\n self._paths[idx_level_min] = path\n\n def pop_label(self, index=-1):\n """Defaults to removing last label, but any index can be supplied"""\n self.labelCValues.pop(index)\n t = self.labelTexts.pop(index)\n t.remove()\n\n def labels(self, inline, inline_spacing):\n for idx, (icon, lev, cvalue) in enumerate(zip(\n self.labelIndiceList,\n self.labelLevelList,\n self.labelCValueList,\n )):\n trans = self.get_transform()\n label_width = self._get_nth_label_width(idx)\n additions = []\n for subpath in self._paths[icon]._iter_connected_components():\n screen_xys = trans.transform(subpath.vertices)\n # Check if long enough for a label\n if self.print_label(screen_xys, label_width):\n x, y, idx = self.locate_label(screen_xys, label_width)\n rotation, path = self._split_path_and_get_label_rotation(\n subpath, idx, (x, y),\n label_width, inline_spacing)\n self.add_label(x, y, rotation, lev, cvalue) # Really add label.\n if inline: # If inline, add new contours\n additions.append(path)\n else: # If not adding label, keep old path\n additions.append(subpath)\n # After looping over all segments on a contour, replace old path by new one\n # if inlining.\n if inline:\n self._paths[icon] = Path.make_compound_path(*additions)\n\n def remove(self):\n super().remove()\n for text in 
self.labelTexts:\n text.remove()\n\n\ndef _find_closest_point_on_path(xys, p):\n """\n Parameters\n ----------\n xys : (N, 2) array-like\n Coordinates of vertices.\n p : (float, float)\n Coordinates of point.\n\n Returns\n -------\n d2min : float\n Minimum square distance of *p* to *xys*.\n proj : (float, float)\n Projection of *p* onto *xys*.\n imin : (int, int)\n Consecutive indices of vertices of segment in *xys* where *proj* is.\n Segments are considered as including their end-points; i.e. if the\n closest point on the path is a node in *xys* with index *i*, this\n returns ``(i-1, i)``. For the special case where *xys* is a single\n point, this returns ``(0, 0)``.\n """\n if len(xys) == 1:\n return (((p - xys[0]) ** 2).sum(), xys[0], (0, 0))\n dxys = xys[1:] - xys[:-1] # Individual segment vectors.\n norms = (dxys ** 2).sum(axis=1)\n norms[norms == 0] = 1 # For zero-length segment, replace 0/0 by 0/1.\n rel_projs = np.clip( # Project onto each segment in relative 0-1 coords.\n ((p - xys[:-1]) * dxys).sum(axis=1) / norms,\n 0, 1)[:, None]\n projs = xys[:-1] + rel_projs * dxys # Projs. onto each segment, in (x, y).\n d2s = ((projs - p) ** 2).sum(axis=1) # Squared distances.\n imin = np.argmin(d2s)\n return (d2s[imin], projs[imin], (imin, imin+1))\n\n\n_docstring.interpd.register(contour_set_attributes=r"""\nAttributes\n----------\nlevels : array\n The values of the contour levels.\n\nlayers : array\n Same as levels for line contours; half-way between\n levels for filled contours. 
See ``ContourSet._process_colors``.\n""")\n\n\n@_docstring.interpd\nclass ContourSet(ContourLabeler, mcoll.Collection):\n """\n Store a set of contour lines or filled regions.\n\n User-callable method: `~.Axes.clabel`\n\n Parameters\n ----------\n ax : `~matplotlib.axes.Axes`\n\n levels : [level0, level1, ..., leveln]\n A list of floating point numbers indicating the contour levels.\n\n allsegs : [level0segs, level1segs, ...]\n List of all the polygon segments for all the *levels*.\n For contour lines ``len(allsegs) == len(levels)``, and for\n filled contour regions ``len(allsegs) == len(levels)-1``. The lists\n should look like ::\n\n level0segs = [polygon0, polygon1, ...]\n polygon0 = [[x0, y0], [x1, y1], ...]\n\n allkinds : ``None`` or [level0kinds, level1kinds, ...]\n Optional list of all the polygon vertex kinds (code types), as\n described and used in Path. This is used to allow multiply-\n connected paths such as holes within filled polygons.\n If not ``None``, ``len(allkinds) == len(allsegs)``.
The lists\n should look like ::\n\n level0kinds = [polygon0kinds, ...]\n polygon0kinds = [vertexcode0, vertexcode1, ...]\n\n If *allkinds* is not ``None``, usually all polygons for a\n particular contour level are grouped together so that\n ``level0segs = [polygon0]`` and ``level0kinds = [polygon0kinds]``.\n\n **kwargs\n Keyword arguments are as described in the docstring of\n `~.Axes.contour`.\n\n %(contour_set_attributes)s\n """\n\n def __init__(self, ax, *args,\n levels=None, filled=False, linewidths=None, linestyles=None,\n hatches=(None,), alpha=None, origin=None, extent=None,\n cmap=None, colors=None, norm=None, vmin=None, vmax=None,\n colorizer=None, extend='neither', antialiased=None, nchunk=0,\n locator=None, transform=None, negative_linestyles=None, clip_path=None,\n **kwargs):\n """\n Draw contour lines or filled regions, depending on\n whether keyword arg *filled* is ``False`` (default) or ``True``.\n\n Call signature::\n\n ContourSet(ax, levels, allsegs, [allkinds], **kwargs)\n\n Parameters\n ----------\n ax : `~matplotlib.axes.Axes`\n The `~.axes.Axes` object to draw on.\n\n levels : [level0, level1, ..., leveln]\n A list of floating point numbers indicating the contour\n levels.\n\n allsegs : [level0segs, level1segs, ...]\n List of all the polygon segments for all the *levels*.\n For contour lines ``len(allsegs) == len(levels)``, and for\n filled contour regions ``len(allsegs) = len(levels)-1``. The lists\n should look like ::\n\n level0segs = [polygon0, polygon1, ...]\n polygon0 = [[x0, y0], [x1, y1], ...]\n\n allkinds : [level0kinds, level1kinds, ...], optional\n Optional list of all the polygon vertex kinds (code types), as\n described and used in Path. This is used to allow multiply-\n connected paths such as holes within filled polygons.\n If not ``None``, ``len(allkinds) == len(allsegs)``. 
The lists\n should look like ::\n\n level0kinds = [polygon0kinds, ...]\n polygon0kinds = [vertexcode0, vertexcode1, ...]\n\n If *allkinds* is not ``None``, usually all polygons for a\n particular contour level are grouped together so that\n ``level0segs = [polygon0]`` and ``level0kinds = [polygon0kinds]``.\n\n **kwargs\n Keyword arguments are as described in the docstring of\n `~.Axes.contour`.\n """\n if antialiased is None and filled:\n # Eliminate artifacts; we are not stroking the boundaries.\n antialiased = False\n # The default for line contours will be taken from the\n # LineCollection default, which uses :rc:`lines.antialiased`.\n super().__init__(\n antialiaseds=antialiased,\n alpha=alpha,\n clip_path=clip_path,\n transform=transform,\n colorizer=colorizer,\n )\n self.axes = ax\n self.levels = levels\n self.filled = filled\n self.hatches = hatches\n self.origin = origin\n self.extent = extent\n self.colors = colors\n self.extend = extend\n\n self.nchunk = nchunk\n self.locator = locator\n\n if colorizer:\n self._set_colorizer_check_keywords(colorizer, cmap=cmap,\n norm=norm, vmin=vmin,\n vmax=vmax, colors=colors)\n norm = colorizer.norm\n cmap = colorizer.cmap\n if (isinstance(norm, mcolors.LogNorm)\n or isinstance(self.locator, ticker.LogLocator)):\n self.logscale = True\n if norm is None:\n norm = mcolors.LogNorm()\n else:\n self.logscale = False\n\n _api.check_in_list([None, 'lower', 'upper', 'image'], origin=origin)\n if self.extent is not None and len(self.extent) != 4:\n raise ValueError(\n "If given, 'extent' must be None or (x0, x1, y0, y1)")\n if self.colors is not None and cmap is not None:\n raise ValueError('Either colors or cmap must be None')\n if self.origin == 'image':\n self.origin = mpl.rcParams['image.origin']\n\n self._orig_linestyles = linestyles # Only kept for user access.\n self.negative_linestyles = negative_linestyles\n # If negative_linestyles was not defined as a keyword argument, define\n # negative_linestyles with rcParams\n 
if self.negative_linestyles is None:\n self.negative_linestyles = \\n mpl.rcParams['contour.negative_linestyle']\n\n kwargs = self._process_args(*args, **kwargs)\n self._process_levels()\n\n self._extend_min = self.extend in ['min', 'both']\n self._extend_max = self.extend in ['max', 'both']\n if self.colors is not None:\n if mcolors.is_color_like(self.colors):\n color_sequence = [self.colors]\n else:\n color_sequence = self.colors\n\n ncolors = len(self.levels)\n if self.filled:\n ncolors -= 1\n i0 = 0\n\n # Handle the case where colors are given for the extended\n # parts of the contour.\n\n use_set_under_over = False\n # if we are extending the lower end, and we've been given enough\n # colors then skip the first color in the resulting cmap. For the\n # extend_max case we don't need to worry about passing more colors\n # than ncolors as ListedColormap will clip.\n total_levels = (ncolors +\n int(self._extend_min) +\n int(self._extend_max))\n if (len(color_sequence) == total_levels and\n (self._extend_min or self._extend_max)):\n use_set_under_over = True\n if self._extend_min:\n i0 = 1\n\n cmap = mcolors.ListedColormap(color_sequence[i0:None], N=ncolors)\n\n if use_set_under_over:\n if self._extend_min:\n cmap.set_under(color_sequence[0])\n if self._extend_max:\n cmap.set_over(color_sequence[-1])\n\n # label lists must be initialized here\n self.labelTexts = []\n self.labelCValues = []\n\n self.set_cmap(cmap)\n if norm is not None:\n self.set_norm(norm)\n with self.norm.callbacks.blocked(signal="changed"):\n if vmin is not None:\n self.norm.vmin = vmin\n if vmax is not None:\n self.norm.vmax = vmax\n self.norm._changed()\n self._process_colors()\n\n if self._paths is None:\n self._paths = self._make_paths_from_contour_generator()\n\n if self.filled:\n if linewidths is not None:\n _api.warn_external('linewidths is ignored by contourf')\n # Lower and upper contour levels.\n lowers, uppers = self._get_lowers_and_uppers()\n self.set(\n edgecolor="none",\n # Default 
zorder taken from Collection\n zorder=kwargs.pop("zorder", 1),\n )\n\n else:\n self.set(\n facecolor="none",\n linewidths=self._process_linewidths(linewidths),\n linestyle=self._process_linestyles(linestyles),\n # Default zorder taken from LineCollection, which is higher\n # than for filled contours so that lines are displayed on top.\n zorder=kwargs.pop("zorder", 2),\n label="_nolegend_",\n )\n\n self.axes.add_collection(self, autolim=False)\n self.sticky_edges.x[:] = [self._mins[0], self._maxs[0]]\n self.sticky_edges.y[:] = [self._mins[1], self._maxs[1]]\n self.axes.update_datalim([self._mins, self._maxs])\n self.axes.autoscale_view(tight=True)\n\n self.changed() # set the colors\n\n if kwargs:\n _api.warn_external(\n 'The following kwargs were not used by contour: ' +\n ", ".join(map(repr, kwargs))\n )\n\n allsegs = property(lambda self: [\n [subp.vertices for subp in p._iter_connected_components()]\n for p in self.get_paths()])\n allkinds = property(lambda self: [\n [subp.codes for subp in p._iter_connected_components()]\n for p in self.get_paths()])\n alpha = property(lambda self: self.get_alpha())\n linestyles = property(lambda self: self._orig_linestyles)\n\n def get_transform(self):\n """Return the `.Transform` instance used by this ContourSet."""\n if self._transform is None:\n self._transform = self.axes.transData\n elif (not isinstance(self._transform, mtransforms.Transform)\n and hasattr(self._transform, '_as_mpl_transform')):\n self._transform = self._transform._as_mpl_transform(self.axes)\n return self._transform\n\n def __getstate__(self):\n state = self.__dict__.copy()\n # the C object _contour_generator cannot currently be pickled. 
This\n # isn't a big issue as it is not actually used once the contour has\n # been calculated.\n state['_contour_generator'] = None\n return state\n\n def legend_elements(self, variable_name='x', str_format=str):\n """\n Return a list of artists and labels suitable for passing\n to `~.Axes.legend` which represent this ContourSet.\n\n The labels have the form "0 < x <= 1" stating the data ranges which\n the artists represent.\n\n Parameters\n ----------\n variable_name : str\n The string used inside the inequality on the labels.\n str_format : function: float -> str\n Function used to format the numbers in the labels.\n\n Returns\n -------\n artists : list[`.Artist`]\n A list of the artists.\n labels : list[str]\n A list of the labels.\n """\n artists = []\n labels = []\n\n if self.filled:\n lowers, uppers = self._get_lowers_and_uppers()\n n_levels = len(self._paths)\n for idx in range(n_levels):\n artists.append(mpatches.Rectangle(\n (0, 0), 1, 1,\n facecolor=self.get_facecolor()[idx],\n hatch=self.hatches[idx % len(self.hatches)],\n ))\n lower = str_format(lowers[idx])\n upper = str_format(uppers[idx])\n if idx == 0 and self.extend in ('min', 'both'):\n labels.append(fr'${variable_name} \leq {lower}$')\n elif idx == n_levels - 1 and self.extend in ('max', 'both'):\n labels.append(fr'${variable_name} > {upper}$')\n else:\n labels.append(fr'${lower} < {variable_name} \leq {upper}$')\n else:\n for idx, level in enumerate(self.levels):\n artists.append(Line2D(\n [], [],\n color=self.get_edgecolor()[idx],\n linewidth=self.get_linewidths()[idx],\n linestyle=self.get_linestyles()[idx],\n ))\n labels.append(fr'${variable_name} = {str_format(level)}$')\n\n return artists, labels\n\n def _process_args(self, *args, **kwargs):\n """\n Process *args* and *kwargs*; override in derived classes.\n\n Must set self.levels, self.zmin and self.zmax, and update Axes limits.\n """\n self.levels = args[0]\n allsegs = args[1]\n allkinds = args[2] if len(args) > 2 else
None\n self.zmax = np.max(self.levels)\n self.zmin = np.min(self.levels)\n\n if allkinds is None:\n allkinds = [[None] * len(segs) for segs in allsegs]\n\n # Check lengths of levels and allsegs.\n if self.filled:\n if len(allsegs) != len(self.levels) - 1:\n raise ValueError('there must be one fewer segment list than '\n 'levels')\n else:\n if len(allsegs) != len(self.levels):\n raise ValueError('there must be the same number of segment '\n 'lists as levels')\n\n # Check length of allkinds.\n if len(allkinds) != len(allsegs):\n raise ValueError('allkinds has a different length than allsegs')\n\n # Determine x, y bounds and update axes data limits.\n flatseglist = [s for seg in allsegs for s in seg]\n points = np.concatenate(flatseglist, axis=0)\n self._mins = points.min(axis=0)\n self._maxs = points.max(axis=0)\n\n # Each entry in (allsegs, allkinds) is a list of (segs, kinds): segs is a list\n # of (N, 2) arrays of xy coordinates, kinds is a list of arrays of corresponding\n # pathcodes. However, kinds can also be None, in which case all paths in that\n # list are codeless (this case is normalized above).
These lists are used to\n # construct paths, which then get concatenated.\n self._paths = [Path.make_compound_path(*map(Path, segs, kinds))\n for segs, kinds in zip(allsegs, allkinds)]\n\n return kwargs\n\n def _make_paths_from_contour_generator(self):\n """Compute ``paths`` using C extension."""\n if self._paths is not None:\n return self._paths\n cg = self._contour_generator\n empty_path = Path(np.empty((0, 2)))\n vertices_and_codes = (\n map(cg.create_filled_contour, *self._get_lowers_and_uppers())\n if self.filled else\n map(cg.create_contour, self.levels))\n return [Path(np.concatenate(vs), np.concatenate(cs)) if len(vs) else empty_path\n for vs, cs in vertices_and_codes]\n\n def _get_lowers_and_uppers(self):\n """\n Return ``(lowers, uppers)`` for filled contours.\n """\n lowers = self._levels[:-1]\n if self.zmin == lowers[0]:\n # Include minimum values in lowest interval\n lowers = lowers.copy() # so we don't change self._levels\n if self.logscale:\n lowers[0] = 0.99 * self.zmin\n else:\n lowers[0] -= 1\n uppers = self._levels[1:]\n return (lowers, uppers)\n\n def changed(self):\n if not hasattr(self, "cvalues"):\n self._process_colors() # Sets cvalues.\n # Force an autoscale immediately because self.to_rgba() calls\n # autoscale_None() internally with the data passed to it,\n # so if vmin/vmax are not set yet, this would override them with\n # content from *cvalues* rather than levels like we want\n self.norm.autoscale_None(self.levels)\n self.set_array(self.cvalues)\n self.update_scalarmappable()\n alphas = np.broadcast_to(self.get_alpha(), len(self.cvalues))\n for label, cv, alpha in zip(self.labelTexts, self.labelCValues, alphas):\n label.set_alpha(alpha)\n label.set_color(self.labelMappable.to_rgba(cv))\n super().changed()\n\n def _autolev(self, N):\n """\n Select contour levels to span the data.\n\n The target number of levels, *N*, is used only when the\n scale is not log and default locator is used.\n\n We need two more levels for filled contours 
than for\n line contours, because for the latter we need to specify\n the lower and upper boundary of each range. For example,\n a single contour boundary, say at z = 0, requires only\n one contour line, but two filled regions, and therefore\n three levels to provide boundaries for both regions.\n """\n if self.locator is None:\n if self.logscale:\n self.locator = ticker.LogLocator()\n else:\n self.locator = ticker.MaxNLocator(N + 1, min_n_ticks=1)\n\n lev = self.locator.tick_values(self.zmin, self.zmax)\n\n try:\n if self.locator._symmetric:\n return lev\n except AttributeError:\n pass\n\n # Trim excess levels the locator may have supplied.\n under = np.nonzero(lev < self.zmin)[0]\n i0 = under[-1] if len(under) else 0\n over = np.nonzero(lev > self.zmax)[0]\n i1 = over[0] + 1 if len(over) else len(lev)\n if self.extend in ('min', 'both'):\n i0 += 1\n if self.extend in ('max', 'both'):\n i1 -= 1\n\n if i1 - i0 < 3:\n i0, i1 = 0, len(lev)\n\n return lev[i0:i1]\n\n def _process_contour_level_args(self, args, z_dtype):\n """\n Determine the contour levels and store in self.levels.\n """\n if self.levels is None:\n if args:\n levels_arg = args[0]\n elif np.issubdtype(z_dtype, bool):\n if self.filled:\n levels_arg = [0, .5, 1]\n else:\n levels_arg = [.5]\n else:\n levels_arg = 7 # Default, hard-wired.\n else:\n levels_arg = self.levels\n if isinstance(levels_arg, Integral):\n self.levels = self._autolev(levels_arg)\n else:\n self.levels = np.asarray(levels_arg, np.float64)\n if self.filled and len(self.levels) < 2:\n raise ValueError("Filled contours require at least 2 levels.")\n if len(self.levels) > 1 and np.min(np.diff(self.levels)) <= 0.0:\n raise ValueError("Contour levels must be increasing")\n\n def _process_levels(self):\n """\n Assign values to :attr:`layers` based on :attr:`levels`,\n adding extended layers as needed if contours are filled.\n\n For line contours, layers simply coincide with levels;\n a line is a thin layer. 
No extended levels are needed\n with line contours.\n """\n # Make a private _levels to include extended regions; we\n # want to leave the original levels attribute unchanged.\n # (Colorbar needs this even for line contours.)\n self._levels = list(self.levels)\n\n if self.logscale:\n lower, upper = 1e-250, 1e250\n else:\n lower, upper = -1e250, 1e250\n\n if self.extend in ('both', 'min'):\n self._levels.insert(0, lower)\n if self.extend in ('both', 'max'):\n self._levels.append(upper)\n self._levels = np.asarray(self._levels)\n\n if not self.filled:\n self.layers = self.levels\n return\n\n # Layer values are mid-way between levels in screen space.\n if self.logscale:\n # Avoid overflow by taking sqrt before multiplying.\n self.layers = (np.sqrt(self._levels[:-1])\n * np.sqrt(self._levels[1:]))\n else:\n self.layers = 0.5 * (self._levels[:-1] + self._levels[1:])\n\n def _process_colors(self):\n """\n Color argument processing for contouring.\n\n Note that we base the colormapping on the contour levels\n and layers, not on the actual range of the Z values. This\n means we don't have to worry about bad values in Z, and we\n always have the full dynamic range available for the selected\n levels.\n\n The color is based on the midpoint of the layer, except for\n extended end layers. By default, the norm vmin and vmax\n are the extreme values of the non-extended levels. Hence,\n the layer color extremes are not the extreme values of\n the colormap itself, but approach those values as the number\n of levels increases. 
An advantage of this scheme is that\n line contours, when added to filled contours, take on\n colors that are consistent with those of the filled regions;\n for example, a contour line on the boundary between two\n regions will have a color intermediate between those\n of the regions.\n\n """\n self.monochrome = self.cmap.monochrome\n if self.colors is not None:\n # Generate integers for direct indexing.\n i0, i1 = 0, len(self.levels)\n if self.filled:\n i1 -= 1\n # Out of range indices for over and under:\n if self.extend in ('both', 'min'):\n i0 -= 1\n if self.extend in ('both', 'max'):\n i1 += 1\n self.cvalues = list(range(i0, i1))\n self.set_norm(mcolors.NoNorm())\n else:\n self.cvalues = self.layers\n self.norm.autoscale_None(self.levels)\n self.set_array(self.cvalues)\n self.update_scalarmappable()\n if self.extend in ('both', 'max', 'min'):\n self.norm.clip = False\n\n def _process_linewidths(self, linewidths):\n Nlev = len(self.levels)\n if linewidths is None:\n default_linewidth = mpl.rcParams['contour.linewidth']\n if default_linewidth is None:\n default_linewidth = mpl.rcParams['lines.linewidth']\n return [default_linewidth] * Nlev\n elif not np.iterable(linewidths):\n return [linewidths] * Nlev\n else:\n linewidths = list(linewidths)\n return (linewidths * math.ceil(Nlev / len(linewidths)))[:Nlev]\n\n def _process_linestyles(self, linestyles):\n Nlev = len(self.levels)\n if linestyles is None:\n tlinestyles = ['solid'] * Nlev\n if self.monochrome:\n eps = - (self.zmax - self.zmin) * 1e-15\n for i, lev in enumerate(self.levels):\n if lev < eps:\n tlinestyles[i] = self.negative_linestyles\n else:\n if isinstance(linestyles, str):\n tlinestyles = [linestyles] * Nlev\n elif np.iterable(linestyles):\n tlinestyles = list(linestyles)\n if len(tlinestyles) < Nlev:\n nreps = int(np.ceil(Nlev / len(linestyles)))\n tlinestyles = tlinestyles * nreps\n if len(tlinestyles) > Nlev:\n tlinestyles = tlinestyles[:Nlev]\n else:\n raise ValueError("Unrecognized type for 
linestyles kwarg")\n return tlinestyles\n\n def _find_nearest_contour(self, xy, indices=None):\n """\n Find the point in the unfilled contour plot that is closest (in screen\n space) to point *xy*.\n\n Parameters\n ----------\n xy : tuple[float, float]\n The reference point (in screen space).\n indices : list of int or None, default: None\n Indices of contour levels to consider. If None (the default), all levels\n are considered.\n\n Returns\n -------\n idx_level_min : int\n The index of the contour level closest to *xy*.\n idx_vtx_min : int\n The index of the `.Path` segment closest to *xy* (at that level).\n proj : (float, float)\n The point in the contour plot closest to *xy*.\n """\n\n # Convert each contour segment to pixel coordinates and then compare the given\n # point to those coordinates for each contour. This is fast enough in normal\n # cases, but speedups may be possible.\n\n if self.filled:\n raise ValueError("Method does not support filled contours")\n\n if indices is None:\n indices = range(len(self._paths))\n\n d2min = np.inf\n idx_level_min = idx_vtx_min = proj_min = None\n\n for idx_level in indices:\n path = self._paths[idx_level]\n idx_vtx_start = 0\n for subpath in path._iter_connected_components():\n if not len(subpath.vertices):\n continue\n lc = self.get_transform().transform(subpath.vertices)\n d2, proj, leg = _find_closest_point_on_path(lc, xy)\n if d2 < d2min:\n d2min = d2\n idx_level_min = idx_level\n idx_vtx_min = leg[1] + idx_vtx_start\n proj_min = proj\n idx_vtx_start += len(subpath)\n\n return idx_level_min, idx_vtx_min, proj_min\n\n def find_nearest_contour(self, x, y, indices=None, pixel=True):\n """\n Find the point in the contour plot that is closest to ``(x, y)``.\n\n This method does not support filled contours.\n\n Parameters\n ----------\n x, y : float\n The reference point.\n indices : list of int or None, default: None\n Indices of contour levels to consider. 
            If None (the default), all levels are considered.
        pixel : bool, default: True
            If *True*, measure distance in pixel (screen) space, which is
            useful for manual contour labeling; else, measure distance in
            axes space.

        Returns
        -------
        path : int
            The index of the path that is closest to ``(x, y)``.  Each path
            corresponds to one contour level.
        subpath : int
            The index within that closest path of the subpath that is
            closest to ``(x, y)``.  Each subpath corresponds to one unbroken
            contour line.
        index : int
            The index of the vertex within that subpath that is closest to
            ``(x, y)``.
        xmin, ymin : float
            The point in the contour plot that is closest to ``(x, y)``.
        d2 : float
            The squared distance from ``(xmin, ymin)`` to ``(x, y)``.
        """
        segment = index = d2 = None

        with ExitStack() as stack:
            if not pixel:
                # _find_nearest_contour works in pixel space. We want axes
                # space, so effectively disable the transformation here by
                # setting to identity.
                stack.enter_context(self._cm_set(
                    transform=mtransforms.IdentityTransform()))

            i_level, i_vtx, (xmin, ymin) = self._find_nearest_contour(
                (x, y), indices)

        if i_level is not None:
            cc_cumlens = np.cumsum(
                [*map(len, self._paths[i_level]._iter_connected_components())])
            segment = cc_cumlens.searchsorted(i_vtx, "right")
            index = i_vtx if segment == 0 else i_vtx - cc_cumlens[segment - 1]
            d2 = (xmin-x)**2 + (ymin-y)**2

        return (i_level, segment, index, xmin, ymin, d2)

    def draw(self, renderer):
        paths = self._paths
        n_paths = len(paths)
        if not self.filled or all(hatch is None for hatch in self.hatches):
            super().draw(renderer)
            return
        # In presence of hatching, draw contours one at a time.
        edgecolors = self.get_edgecolors()
        if edgecolors.size == 0:
            edgecolors = ("none",)
        for idx in range(n_paths):
            with cbook._setattr_cm(self, _paths=[paths[idx]]), self._cm_set(
                hatch=self.hatches[idx % len(self.hatches)],
                array=[self.get_array()[idx]],
                linewidths=[self.get_linewidths()[idx % len(self.get_linewidths())]],
                linestyles=[self.get_linestyles()[idx % len(self.get_linestyles())]],
                edgecolors=edgecolors[idx % len(edgecolors)],
            ):
                super().draw(renderer)


@_docstring.interpd
class QuadContourSet(ContourSet):
    """
    Create and store a set of contour lines or filled regions.

    This class is typically not instantiated directly by the user but by
    `~.Axes.contour` and `~.Axes.contourf`.

    %(contour_set_attributes)s
    """

    def _process_args(self, *args, corner_mask=None, algorithm=None, **kwargs):
        """
        Process args and kwargs.
        """
        if args and isinstance(args[0], QuadContourSet):
            if self.levels is None:
                self.levels = args[0].levels
            self.zmin = args[0].zmin
            self.zmax = args[0].zmax
            self._corner_mask = args[0]._corner_mask
            contour_generator = args[0]._contour_generator
            self._mins = args[0]._mins
            self._maxs = args[0]._maxs
            self._algorithm = args[0]._algorithm
        else:
            import contourpy

            if algorithm is None:
                algorithm = mpl.rcParams['contour.algorithm']
            mpl.rcParams.validate["contour.algorithm"](algorithm)
            self._algorithm = algorithm

            if corner_mask is None:
                if self._algorithm == "mpl2005":
                    # mpl2005 does not support corner_mask=True so if not
                    # specifically requested then disable it.
                    corner_mask = False
                else:
                    corner_mask = mpl.rcParams['contour.corner_mask']
            self._corner_mask = corner_mask

            x, y, z = self._contour_args(args, kwargs)

            contour_generator = contourpy.contour_generator(
                x, y, z, name=self._algorithm, corner_mask=self._corner_mask,
                line_type=contourpy.LineType.SeparateCode,
                fill_type=contourpy.FillType.OuterCode,
                chunk_size=self.nchunk)

            t = self.get_transform()

            # if the transform is not trans data, and some part of it
            # contains transData, transform the xs and ys to data coordinates
            if (t != self.axes.transData and
                    any(t.contains_branch_seperately(self.axes.transData))):
                trans_to_data = t - self.axes.transData
                pts = np.vstack([x.flat, y.flat]).T
                transformed_pts = trans_to_data.transform(pts)
                x = transformed_pts[..., 0]
                y = transformed_pts[..., 1]

            self._mins = [ma.min(x), ma.min(y)]
            self._maxs = [ma.max(x), ma.max(y)]

        self._contour_generator = contour_generator

        return kwargs

    def _contour_args(self, args, kwargs):
        if self.filled:
            fn = 'contourf'
        else:
            fn = 'contour'
        nargs = len(args)

        if 0 < nargs <= 2:
            z, *args = args
            z = ma.asarray(z)
            x, y = self._initialize_x_y(z)
        elif 2 < nargs <= 4:
            x, y, z_orig, *args = args
            x, y, z = self._check_xyz(x, y, z_orig, kwargs)
        else:
            raise _api.nargs_error(fn, takes="from 1 to 4", given=nargs)
        z = ma.masked_invalid(z, copy=False)
        self.zmax = z.max().astype(float)
        self.zmin = z.min().astype(float)
        if self.logscale and self.zmin <= 0:
            z = ma.masked_where(z <= 0, z)
            _api.warn_external('Log scale: values of z <= 0 have been masked')
            self.zmin = z.min().astype(float)
        self._process_contour_level_args(args, z.dtype)
        return (x, y, z)

    def _check_xyz(self, x, y, z, kwargs):
        """
        Check that the shapes of the input arrays match; if x and y are 1D,
        convert them to 2D using meshgrid.
        """
        x, y = self.axes._process_unit_info([("x", x), ("y", y)], kwargs)

        x = np.asarray(x, dtype=np.float64)
        y = np.asarray(y, dtype=np.float64)
        z = ma.asarray(z)

        if z.ndim != 2:
            raise TypeError(f"Input z must be 2D, not {z.ndim}D")
        if z.shape[0] < 2 or z.shape[1] < 2:
            raise TypeError(f"Input z must be at least a (2, 2) shaped array, "
                            f"but has shape {z.shape}")
        Ny, Nx = z.shape

        if x.ndim != y.ndim:
            raise TypeError(f"Number of dimensions of x ({x.ndim}) and y "
                            f"({y.ndim}) do not match")
        if x.ndim == 1:
            nx, = x.shape
            ny, = y.shape
            if nx != Nx:
                raise TypeError(f"Length of x ({nx}) must match number of "
                                f"columns in z ({Nx})")
            if ny != Ny:
                raise TypeError(f"Length of y ({ny}) must match number of "
                                f"rows in z ({Ny})")
            x, y = np.meshgrid(x, y)
        elif x.ndim == 2:
            if x.shape != z.shape:
                raise TypeError(
                    f"Shapes of x {x.shape} and z {z.shape} do not match")
            if y.shape != z.shape:
                raise TypeError(
                    f"Shapes of y {y.shape} and z {z.shape} do not match")
        else:
            raise TypeError(f"Inputs x and y must be 1D or 2D, not {x.ndim}D")

        return x, y, z

    def _initialize_x_y(self, z):
        """
        Return X, Y arrays such that contour(Z) will match imshow(Z)
        if origin is not None.
        The center of pixel Z[i, j] depends on origin:
        if origin is None, x = j, y = i;
        if origin is 'lower', x = j + 0.5, y = i + 0.5;
        if origin is 'upper', x = j + 0.5, y = Nrows - i - 0.5
        If extent is not None, x and y will be scaled to match,
        as in imshow.
        If origin is None and extent is not None, then extent
        will give the minimum and maximum values of x and y.
        """
        if z.ndim != 2:
            raise TypeError(f"Input z must be 2D, not {z.ndim}D")
        elif z.shape[0] < 2 or z.shape[1] < 2:
            raise TypeError(f"Input z must be at least a (2, 2) shaped array, "
                            f"but has shape {z.shape}")
        else:
            Ny, Nx = z.shape
        if self.origin is None:  # Not for image-matching.
            if self.extent is None:
                return np.meshgrid(np.arange(Nx), np.arange(Ny))
            else:
                x0, x1, y0, y1 = self.extent
                x = np.linspace(x0, x1, Nx)
                y = np.linspace(y0, y1, Ny)
                return np.meshgrid(x, y)
        # Match image behavior:
        if self.extent is None:
            x0, x1, y0, y1 = (0, Nx, 0, Ny)
        else:
            x0, x1, y0, y1 = self.extent
        dx = (x1 - x0) / Nx
        dy = (y1 - y0) / Ny
        x = x0 + (np.arange(Nx) + 0.5) * dx
        y = y0 + (np.arange(Ny) + 0.5) * dy
        if self.origin == 'upper':
            y = y[::-1]
        return np.meshgrid(x, y)


_docstring.interpd.register(contour_doc="""
`.contour` and `.contourf` draw contour lines and filled contours,
respectively.  Except as noted, function signatures and return values
are the same for both versions.

Parameters
----------
X, Y : array-like, optional
    The coordinates of the values in *Z*.

    *X* and *Y* must both be 2D with the same shape as *Z* (e.g.
    created via `numpy.meshgrid`), or they must both be 1-D such
    that ``len(X) == N`` is the number of columns in *Z* and
    ``len(Y) == M`` is the number of rows in *Z*.

    *X* and *Y* must both be ordered monotonically.

    If not given, they are assumed to be integer indices, i.e.
    ``X = range(N)``, ``Y = range(M)``.

Z : (M, N) array-like
    The height values over which the contour is drawn.  Color-mapping is
    controlled by *cmap*, *norm*, *vmin*, and *vmax*.

levels : int or array-like, optional
    Determines the number and positions of the contour lines / regions.

    If an int *n*, use `~matplotlib.ticker.MaxNLocator`, which tries
    to automatically choose no more than *n+1* "nice" contour levels
    between minimum and maximum numeric values of *Z*.

    If array-like, draw contour lines at the specified levels.
    The values must be in increasing order.

Returns
-------
`~.contour.QuadContourSet`

Other Parameters
----------------
corner_mask : bool, default: :rc:`contour.corner_mask`
    Enable/disable corner masking, which only has an effect if *Z* is
    a masked array.  If ``False``, any quad touching a masked point is
    masked out.  If ``True``, only the triangular corners of quads
    nearest those points are always masked out; other triangular
    corners comprising three unmasked points are contoured as usual.

colors : :mpltype:`color` or list of :mpltype:`color`, optional
    The colors of the levels, i.e. the lines for `.contour` and the
    areas for `.contourf`.

    The sequence is cycled for the levels in ascending order.  If the
    sequence is shorter than the number of levels, it's repeated.

    As a shortcut, a single color may be used in place of one-element
    lists, i.e. ``'red'`` instead of ``['red']`` to color all levels
    with the same color.

    .. versionchanged:: 3.10
        Previously a single color had to be expressed as a string, but
        now any valid color format may be passed.

    By default (value *None*), the colormap specified by *cmap*
    will be used.

alpha : float, default: 1
    The alpha blending value, between 0 (transparent) and 1 (opaque).

%(cmap_doc)s

    This parameter is ignored if *colors* is set.

%(norm_doc)s

    This parameter is ignored if *colors* is set.

%(vmin_vmax_doc)s

    If *vmin* or *vmax* are not given, the default color scaling is
    based on *levels*.

    This parameter is ignored if *colors* is set.

%(colorizer_doc)s

    This parameter is ignored if *colors* is set.

origin : {*None*, 'upper', 'lower', 'image'}, default: None
    Determines the orientation and exact position of *Z* by specifying
    the position of ``Z[0, 0]``.  This is only relevant if *X*, *Y*
    are not given.

    - *None*: ``Z[0, 0]`` is at X=0, Y=0 in the lower left corner.
    - 'lower': ``Z[0, 0]`` is at X=0.5, Y=0.5 in the lower left corner.
    - 'upper': ``Z[0, 0]`` is at X=N+0.5, Y=0.5 in the upper left
      corner.
    - 'image': Use the value from :rc:`image.origin`.

extent : (x0, x1, y0, y1), optional
    If *origin* is not *None*, then *extent* is interpreted as in
    `.imshow`: it gives the outer pixel boundaries.  In this case, the
    position of Z[0, 0] is the center of the pixel, not a corner.  If
    *origin* is *None*, then (*x0*, *y0*) is the position of Z[0, 0],
    and (*x1*, *y1*) is the position of Z[-1, -1].

    This argument is ignored if *X* and *Y* are specified in the call
    to contour.

locator : ticker.Locator subclass, optional
    The locator is used to determine the contour levels if they
    are not given explicitly via *levels*.
    Defaults to `~.ticker.MaxNLocator`.

extend : {'neither', 'both', 'min', 'max'}, default: 'neither'
    Determines the ``contourf``-coloring of values that are outside the
    *levels* range.

    If 'neither', values outside the *levels* range are not colored.
    If 'min', 'max' or 'both', color the values below, above or below
    and above the *levels* range.

    Values below ``min(levels)`` and above ``max(levels)`` are mapped
    to the under/over values of the `.Colormap`.  Note that most
    colormaps do not have dedicated colors for these by default, so
    that the over and under values are the edge values of the colormap.
    You may want to set these values explicitly using
    `.Colormap.set_under` and `.Colormap.set_over`.

    .. note::

        An existing `.QuadContourSet` does not get notified if
        properties of its colormap are changed.  Therefore, an explicit
        call to `~.ContourSet.changed()` is needed after modifying the
        colormap.  The explicit call can be left out if a colorbar is
        assigned to the `.QuadContourSet`, because it internally calls
        `~.ContourSet.changed()`.

    Example::

        x = np.arange(1, 10)
        y = x.reshape(-1, 1)
        h = x * y

        cs = plt.contourf(h, levels=[10, 30, 50],
            colors=['#808080', '#A0A0A0', '#C0C0C0'], extend='both')
        cs.cmap.set_over('red')
        cs.cmap.set_under('blue')
        cs.changed()

xunits, yunits : registered units, optional
    Override axis units by specifying an instance of a
    :class:`matplotlib.units.ConversionInterface`.

antialiased : bool, optional
    Enable antialiasing, overriding the defaults.  For filled
    contours, the default is *False*.  For line contours, it is taken
    from :rc:`lines.antialiased`.

nchunk : int >= 0, optional
    If 0, no subdivision of the domain.  Specify a positive integer to
    divide the domain into subdomains of *nchunk* by *nchunk* quads.
    Chunking reduces the maximum length of polygons generated by the
    contouring algorithm, which reduces the rendering workload passed
    on to the backend and also requires slightly less RAM.  It can
    however introduce rendering artifacts at chunk boundaries depending
    on the backend, the *antialiased* flag and value of *alpha*.

linewidths : float or array-like, default: :rc:`contour.linewidth`
    *Only applies to* `.contour`.

    The line width of the contour lines.

    If a number, all levels will be plotted with this linewidth.

    If a sequence, the levels in ascending order will be plotted with
    the linewidths in the order specified.

    If None, this falls back to :rc:`lines.linewidth`.

linestyles : {*None*, 'solid', 'dashed', 'dashdot', 'dotted'}, optional
    *Only applies to* `.contour`.

    If *linestyles* is *None*, the default is 'solid' unless the lines
    are monochrome.  In that case, negative contours will instead take
    their linestyle from the *negative_linestyles* argument.

    *linestyles* can also be an iterable of the above strings
    specifying a set of linestyles to be used.  If this iterable is
    shorter than the number of contour levels it will be repeated as
    necessary.

negative_linestyles : {*None*, 'solid', 'dashed', 'dashdot', 'dotted'}, \
optional
    *Only applies to* `.contour`.

    If *linestyles* is *None* and the lines are monochrome, this
    argument specifies the line style for negative contours.

    If *negative_linestyles* is *None*, the default is taken from
    :rc:`contour.negative_linestyle`.

    *negative_linestyles* can also be an iterable of the above strings
    specifying a set of linestyles to be used.  If this iterable is
    shorter than the number of contour levels it will be repeated as
    necessary.

hatches : list[str], optional
    *Only applies to* `.contourf`.

    A list of cross hatch patterns to use on the filled areas.
    If None, no hatching will be added to the contour.

algorithm : {'mpl2005', 'mpl2014', 'serial', 'threaded'}, optional
    Which contouring algorithm to use to calculate the contour lines
    and polygons.  The algorithms are implemented in
    `ContourPy <https://github.com/contourpy/contourpy>`_; consult the
    `ContourPy documentation <https://contourpy.readthedocs.io>`_ for
    further information.

    The default is taken from :rc:`contour.algorithm`.

clip_path : `~matplotlib.patches.Patch` or `.Path` or `.TransformedPath`
    Set the clip path.  See `~matplotlib.artist.Artist.set_clip_path`.

    .. versionadded:: 3.8

data : indexable object, optional
    DATA_PARAMETER_PLACEHOLDER

Notes
-----
1. `.contourf` differs from the MATLAB version in that it does not draw
   the polygon edges.  To draw edges, add line contours with calls to
   `.contour`.

2. `.contourf` fills intervals that are closed at the top; that is, for
   boundaries *z1* and *z2*, the filled region is::

       z1 < Z <= z2

   except for the lowest interval, which is closed on both sides (i.e.
   it includes the lowest value).

3. `.contour` and `.contourf` use a `marching squares
   <https://en.wikipedia.org/wiki/Marching_squares>`_ algorithm to
   compute contour locations.  More information can be found in the
   `ContourPy documentation <https://contourpy.readthedocs.io>`_.
""" % _docstring.interpd.params)

| .venv\Lib\site-packages\matplotlib\contour.py | contour.py | Python | 68,391 |
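Note 2 above (intervals "closed at the top", with the lowest interval closed on both sides) can be checked without Matplotlib at all. The following is a small numpy sketch of that convention; the names `levels`, `Z`, and `region` are illustrative, not part of the library API, and `np.digitize(..., right=True)` is used here as an equivalent formulation, not the library's actual implementation.

```python
import numpy as np

# Level boundaries as in the contourf example above.
levels = np.array([10.0, 30.0, 50.0])
Z = np.array([10.0, 10.5, 30.0, 30.5, 50.0])

# right=True bins by levels[i-1] < Z <= levels[i], i.e. intervals
# "closed at the top" as in note 2.  Subtracting 1 turns the bin index
# into the index of the filled region between consecutive levels.
region = np.digitize(Z, levels, right=True) - 1

# The lowest interval is closed on both sides: Z == min(levels) belongs
# to region 0 rather than falling below all levels.
region[Z == levels[0]] = 0
```

Here ``region`` comes out as ``[0, 0, 0, 1, 1]``: the values 10, 10.5, and 30 land in the first filled band (``10 < Z <= 30``, extended to include 10 itself), while 30.5 and 50 land in ``30 < Z <= 50``.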
import matplotlib.cm as cm
from matplotlib.artist import Artist
from matplotlib.axes import Axes
from matplotlib.collections import Collection, PathCollection
from matplotlib.colorizer import Colorizer, ColorizingArtist
from matplotlib.colors import Colormap, Normalize
from matplotlib.path import Path
from matplotlib.patches import Patch
from matplotlib.text import Text
from matplotlib.transforms import Transform, TransformedPatchPath, TransformedPath
from matplotlib.ticker import Locator, Formatter

from numpy.typing import ArrayLike
import numpy as np
from collections.abc import Callable, Iterable, Sequence
from typing import Literal
from .typing import ColorType


class ContourLabeler:
    labelFmt: str | Formatter | Callable[[float], str] | dict[float, str]
    labelManual: bool | Iterable[tuple[float, float]]
    rightside_up: bool
    labelLevelList: list[float]
    labelIndiceList: list[int]
    labelMappable: cm.ScalarMappable | ColorizingArtist
    labelCValueList: list[ColorType]
    labelXYs: list[tuple[float, float]]
    def clabel(
        self,
        levels: ArrayLike | None = ...,
        *,
        fontsize: str | float | None = ...,
        inline: bool = ...,
        inline_spacing: float = ...,
        fmt: str | Formatter | Callable[[float], str] | dict[float, str] | None = ...,
        colors: ColorType | Sequence[ColorType] | None = ...,
        use_clabeltext: bool = ...,
        manual: bool | Iterable[tuple[float, float]] = ...,
        rightside_up: bool = ...,
        zorder: float | None = ...
    ) -> list[Text]: ...
    def print_label(self, linecontour: ArrayLike, labelwidth: float) -> bool: ...
    def too_close(self, x: float, y: float, lw: float) -> bool: ...
    def get_text(
        self,
        lev: float,
        fmt: str | Formatter | Callable[[float], str] | dict[float, str],
    ) -> str: ...
    def locate_label(
        self, linecontour: ArrayLike, labelwidth: float
    ) -> tuple[float, float, float]: ...
    def add_label(
        self, x: float, y: float, rotation: float, lev: float, cvalue: ColorType
    ) -> None: ...
    def add_label_near(
        self,
        x: float,
        y: float,
        inline: bool = ...,
        inline_spacing: int = ...,
        transform: Transform | Literal[False] | None = ...,
    ) -> None: ...
    def pop_label(self, index: int = ...) -> None: ...
    def labels(self, inline: bool, inline_spacing: int) -> None: ...
    def remove(self) -> None: ...

class ContourSet(ContourLabeler, Collection):
    axes: Axes
    levels: Iterable[float]
    filled: bool
    linewidths: float | ArrayLike | None
    hatches: Iterable[str | None]
    origin: Literal["upper", "lower", "image"] | None
    extent: tuple[float, float, float, float] | None
    colors: ColorType | Sequence[ColorType]
    extend: Literal["neither", "both", "min", "max"]
    nchunk: int
    locator: Locator | None
    logscale: bool
    negative_linestyles: None | Literal[
        "solid", "dashed", "dashdot", "dotted"
    ] | Iterable[Literal["solid", "dashed", "dashdot", "dotted"]]
    clip_path: Patch | Path | TransformedPath | TransformedPatchPath | None
    labelTexts: list[Text]
    labelCValues: list[ColorType]

    @property
    def allkinds(self) -> list[list[np.ndarray | None]]: ...
    @property
    def allsegs(self) -> list[list[np.ndarray]]: ...
    @property
    def alpha(self) -> float | None: ...
    @property
    def linestyles(self) -> (
        None |
        Literal["solid", "dashed", "dashdot", "dotted"] |
        Iterable[Literal["solid", "dashed", "dashdot", "dotted"]]
    ): ...

    def __init__(
        self,
        ax: Axes,
        *args,
        levels: Iterable[float] | None = ...,
        filled: bool = ...,
        linewidths: float | ArrayLike | None = ...,
        linestyles: Literal["solid", "dashed", "dashdot", "dotted"]
        | Iterable[Literal["solid", "dashed", "dashdot", "dotted"]]
        | None = ...,
        hatches: Iterable[str | None] = ...,
        alpha: float | None = ...,
        origin: Literal["upper", "lower", "image"] | None = ...,
        extent: tuple[float, float, float, float] | None = ...,
        cmap: str | Colormap | None = ...,
        colors: ColorType | Sequence[ColorType] | None = ...,
        norm: str | Normalize | None = ...,
        vmin: float | None = ...,
        vmax: float | None = ...,
        colorizer: Colorizer | None = ...,
        extend: Literal["neither", "both", "min", "max"] = ...,
        antialiased: bool | None = ...,
        nchunk: int = ...,
        locator: Locator | None = ...,
        transform: Transform | None = ...,
        negative_linestyles: Literal["solid", "dashed", "dashdot", "dotted"]
        | Iterable[Literal["solid", "dashed", "dashdot", "dotted"]]
        | None = ...,
        clip_path: Patch | Path | TransformedPath | TransformedPatchPath | None = ...,
        **kwargs
    ) -> None: ...
    def legend_elements(
        self, variable_name: str = ..., str_format: Callable[[float], str] = ...
    ) -> tuple[list[Artist], list[str]]: ...
    def find_nearest_contour(
        self, x: float, y: float, indices: Iterable[int] | None = ..., pixel: bool = ...
    ) -> tuple[int, int, int, float, float, float]: ...

class QuadContourSet(ContourSet): ...

| .venv\Lib\site-packages\matplotlib\contour.pyi | contour.pyi | Python | 5,300 |
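The ``(segment, index)`` pair in the tuple that `find_nearest_contour` returns (typed above) comes from the cumulative-subpath-length arithmetic shown in the implementation: a `cumsum` of subpath lengths plus a `searchsorted`. A standalone numpy sketch of that bookkeeping, using hypothetical subpath lengths (the helper name `locate_vertex` is illustrative, not a library function):

```python
import numpy as np

# Hypothetical lengths of the unbroken contour lines (subpaths) that
# make up one contour level's path: 5, 3, and 4 vertices.
cc_cumlens = np.cumsum([5, 3, 4])  # cumulative lengths: 5, 8, 12

def locate_vertex(i_vtx):
    # Same arithmetic as find_nearest_contour: searchsorted with
    # side "right" picks the subpath containing global vertex i_vtx,
    # and subtracting the preceding cumulative length yields the
    # offset within that subpath.
    segment = int(cc_cumlens.searchsorted(i_vtx, "right"))
    index = i_vtx if segment == 0 else i_vtx - int(cc_cumlens[segment - 1])
    return segment, index
```

For example, ``locate_vertex(5)`` gives ``(1, 0)``: global vertex 5 is the first vertex of the second subpath, which is exactly how the returned ``subpath`` and ``index`` values are related.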
"""\nMatplotlib provides sophisticated date plotting capabilities, standing on the\nshoulders of python :mod:`datetime` and the add-on module dateutil_.\n\nBy default, Matplotlib uses the units machinery described in\n`~matplotlib.units` to convert `datetime.datetime`, and `numpy.datetime64`\nobjects when plotted on an x- or y-axis. The user does not\nneed to do anything for dates to be formatted, but dates often have strict\nformatting needs, so this module provides many tick locators and formatters.\nA basic example using `numpy.datetime64` is::\n\n import numpy as np\n\n times = np.arange(np.datetime64('2001-01-02'),\n np.datetime64('2002-02-03'), np.timedelta64(75, 'm'))\n y = np.random.randn(len(times))\n\n fig, ax = plt.subplots()\n ax.plot(times, y)\n\n.. seealso::\n\n - :doc:`/gallery/text_labels_and_annotations/date`\n - :doc:`/gallery/ticks/date_concise_formatter`\n - :doc:`/gallery/ticks/date_demo_convert`\n\n.. _date-format:\n\nMatplotlib date format\n----------------------\n\nMatplotlib represents dates using floating point numbers specifying the number\nof days since a default epoch of 1970-01-01 UTC; for example,\n1970-01-01, 06:00 is the floating point number 0.25. The formatters and\nlocators require the use of `datetime.datetime` objects, so only dates between\nyear 0001 and 9999 can be represented. Microsecond precision\nis achievable for (approximately) 70 years on either side of the epoch, and\n20 microseconds for the rest of the allowable range of dates (year 0001 to\n9999). The epoch can be changed at import time via `.dates.set_epoch` or\n:rc:`date.epoch` to other dates if necessary; see\n:doc:`/gallery/ticks/date_precision_and_epochs` for a discussion.\n\n.. note::\n\n Before Matplotlib 3.3, the epoch was 0000-12-31 which lost modern\n microsecond precision and also made the default axis limit of 0 an invalid\n datetime. In 3.3 the epoch was changed as above. 
To convert old\n ordinal floats to the new epoch, users can do::\n\n new_ordinal = old_ordinal + mdates.date2num(np.datetime64('0000-12-31'))\n\n\nThere are a number of helper functions to convert between :mod:`datetime`\nobjects and Matplotlib dates:\n\n.. currentmodule:: matplotlib.dates\n\n.. autosummary::\n :nosignatures:\n\n datestr2num\n date2num\n num2date\n num2timedelta\n drange\n set_epoch\n get_epoch\n\n.. note::\n\n Like Python's `datetime.datetime`, Matplotlib uses the Gregorian calendar\n for all conversions between dates and floating point numbers. This practice\n is not universal, and calendar differences can cause confusing\n differences between what Python and Matplotlib give as the number of days\n since 0001-01-01 and what other software and databases yield. For\n example, the US Naval Observatory uses a calendar that switches\n from Julian to Gregorian in October, 1582. Hence, using their\n calculator, the number of days between 0001-01-01 and 2006-04-01 is\n 732403, whereas using the Gregorian calendar via the datetime\n module we find::\n\n In [1]: date(2006, 4, 1).toordinal() - date(1, 1, 1).toordinal()\n Out[1]: 732401\n\nAll the Matplotlib date converters, locators and formatters are timezone aware.\nIf no explicit timezone is provided, :rc:`timezone` is assumed, provided as a\nstring. If you want to use a different timezone, pass the *tz* keyword\nargument of `num2date` to any date tick locators or formatters you create. This\ncan be either a `datetime.tzinfo` instance or a string with the timezone name\nthat can be parsed by `~dateutil.tz.gettz`.\n\nA wide range of specific and general purpose date tick locators and\nformatters are provided in this module. See\n:mod:`matplotlib.ticker` for general information on tick locators\nand formatters. These are described below.\n\nThe dateutil_ module provides additional code to handle date ticking, making it\neasy to place ticks on any kinds of dates. See examples below.\n\n.. 
_dateutil: https://dateutil.readthedocs.io\n\n.. _date-locators:\n\nDate tick locators\n------------------\n\nMost of the date tick locators can locate single or multiple ticks. For example::\n\n # import constants for the days of the week\n from matplotlib.dates import MO, TU, WE, TH, FR, SA, SU\n\n # tick on Mondays every week\n loc = WeekdayLocator(byweekday=MO, tz=tz)\n\n # tick on Mondays and Saturdays\n loc = WeekdayLocator(byweekday=(MO, SA))\n\nIn addition, most of the constructors take an interval argument::\n\n # tick on Mondays every second week\n loc = WeekdayLocator(byweekday=MO, interval=2)\n\nThe rrule locator allows completely general date ticking::\n\n # tick every 5th easter\n rule = rrulewrapper(YEARLY, byeaster=1, interval=5)\n loc = RRuleLocator(rule)\n\nThe available date tick locators are:\n\n* `MicrosecondLocator`: Locate microseconds.\n\n* `SecondLocator`: Locate seconds.\n\n* `MinuteLocator`: Locate minutes.\n\n* `HourLocator`: Locate hours.\n\n* `DayLocator`: Locate specified days of the month.\n\n* `WeekdayLocator`: Locate days of the week, e.g., MO, TU.\n\n* `MonthLocator`: Locate months, e.g., 7 for July.\n\n* `YearLocator`: Locate years that are multiples of base.\n\n* `RRuleLocator`: Locate using a `rrulewrapper`.\n `rrulewrapper` is a simple wrapper around dateutil_'s `dateutil.rrule`\n which allow almost arbitrary date tick specifications.\n See :doc:`rrule example </gallery/ticks/date_demo_rrule>`.\n\n* `AutoDateLocator`: On autoscale, this class picks the best `DateLocator`\n (e.g., `RRuleLocator`) to set the view limits and the tick locations. If\n called with ``interval_multiples=True`` it will make ticks line up with\n sensible multiples of the tick intervals. For example, if the interval is\n 4 hours, it will pick hours 0, 4, 8, etc. as ticks. This behaviour is not\n guaranteed by default.\n\n.. 
_date-formatters:\n\nDate formatters\n---------------\n\nThe available date formatters are:\n\n* `AutoDateFormatter`: attempts to figure out the best format to use. This is\n most useful when used with the `AutoDateLocator`.\n\n* `ConciseDateFormatter`: also attempts to figure out the best format to use,\n and to make the format as compact as possible while still having complete\n date information. This is most useful when used with the `AutoDateLocator`.\n\n* `DateFormatter`: use `~datetime.datetime.strftime` format strings.\n"""\n\nimport datetime\nimport functools\nimport logging\nimport re\n\nfrom dateutil.rrule import (rrule, MO, TU, WE, TH, FR, SA, SU, YEARLY,\n MONTHLY, WEEKLY, DAILY, HOURLY, MINUTELY,\n SECONDLY)\nfrom dateutil.relativedelta import relativedelta\nimport dateutil.parser\nimport dateutil.tz\nimport numpy as np\n\nimport matplotlib as mpl\nfrom matplotlib import _api, cbook, ticker, units\n\n__all__ = ('datestr2num', 'date2num', 'num2date', 'num2timedelta', 'drange',\n 'set_epoch', 'get_epoch', 'DateFormatter', 'ConciseDateFormatter',\n 'AutoDateFormatter', 'DateLocator', 'RRuleLocator',\n 'AutoDateLocator', 'YearLocator', 'MonthLocator', 'WeekdayLocator',\n 'DayLocator', 'HourLocator', 'MinuteLocator',\n 'SecondLocator', 'MicrosecondLocator',\n 'rrule', 'MO', 'TU', 'WE', 'TH', 'FR', 'SA', 'SU',\n 'YEARLY', 'MONTHLY', 'WEEKLY', 'DAILY',\n 'HOURLY', 'MINUTELY', 'SECONDLY', 'MICROSECONDLY', 'relativedelta',\n 'DateConverter', 'ConciseDateConverter', 'rrulewrapper')\n\n\n_log = logging.getLogger(__name__)\nUTC = datetime.timezone.utc\n\n\ndef _get_tzinfo(tz=None):\n """\n Generate `~datetime.tzinfo` from a string or return `~datetime.tzinfo`.\n If None, retrieve the preferred timezone from the rcParams dictionary.\n """\n tz = mpl._val_or_rc(tz, 'timezone')\n if tz == 'UTC':\n return UTC\n if isinstance(tz, str):\n tzinfo = dateutil.tz.gettz(tz)\n if tzinfo is None:\n raise ValueError(f"{tz} is not a valid timezone as parsed by"\n " 
dateutil.tz.gettz.")\n return tzinfo\n if isinstance(tz, datetime.tzinfo):\n return tz\n raise TypeError(f"tz must be string or tzinfo subclass, not {tz!r}.")\n\n\n# Time-related constants.\nEPOCH_OFFSET = float(datetime.datetime(1970, 1, 1).toordinal())\n# EPOCH_OFFSET is not used by matplotlib\nMICROSECONDLY = SECONDLY + 1\nHOURS_PER_DAY = 24.\nMIN_PER_HOUR = 60.\nSEC_PER_MIN = 60.\nMONTHS_PER_YEAR = 12.\n\nDAYS_PER_WEEK = 7.\nDAYS_PER_MONTH = 30.\nDAYS_PER_YEAR = 365.0\n\nMINUTES_PER_DAY = MIN_PER_HOUR * HOURS_PER_DAY\n\nSEC_PER_HOUR = SEC_PER_MIN * MIN_PER_HOUR\nSEC_PER_DAY = SEC_PER_HOUR * HOURS_PER_DAY\nSEC_PER_WEEK = SEC_PER_DAY * DAYS_PER_WEEK\n\nMUSECONDS_PER_DAY = 1e6 * SEC_PER_DAY\n\nMONDAY, TUESDAY, WEDNESDAY, THURSDAY, FRIDAY, SATURDAY, SUNDAY = (\n MO, TU, WE, TH, FR, SA, SU)\nWEEKDAYS = (MONDAY, TUESDAY, WEDNESDAY, THURSDAY, FRIDAY, SATURDAY, SUNDAY)\n\n# default epoch: passed to np.datetime64...\n_epoch = None\n\n\ndef _reset_epoch_test_example():\n """\n Reset the Matplotlib date epoch so it can be set again.\n\n Only for use in tests and examples.\n """\n global _epoch\n _epoch = None\n\n\ndef set_epoch(epoch):\n """\n Set the epoch (origin for dates) for datetime calculations.\n\n The default epoch is :rc:`date.epoch`.\n\n If microsecond accuracy is desired, the date being plotted needs to be\n within approximately 70 years of the epoch. Matplotlib internally\n represents dates as days since the epoch, so floating point dynamic\n range needs to be within a factor of 2^52.\n\n `~.dates.set_epoch` must be called before any dates are converted\n (i.e. 
near the import section) or a RuntimeError will be raised.\n\n See also :doc:`/gallery/ticks/date_precision_and_epochs`.\n\n Parameters\n ----------\n epoch : str\n valid UTC date parsable by `numpy.datetime64` (do not include\n timezone).\n\n """\n global _epoch\n if _epoch is not None:\n raise RuntimeError('set_epoch must be called before dates plotted.')\n _epoch = epoch\n\n\ndef get_epoch():\n """\n Get the epoch used by `.dates`.\n\n Returns\n -------\n epoch : str\n String for the epoch (parsable by `numpy.datetime64`).\n """\n global _epoch\n\n _epoch = mpl._val_or_rc(_epoch, 'date.epoch')\n return _epoch\n\n\ndef _dt64_to_ordinalf(d):\n """\n Convert `numpy.datetime64` or an `numpy.ndarray` of those types to\n Gregorian date as UTC float relative to the epoch (see `.get_epoch`).\n Roundoff is float64 precision. Practically: microseconds for dates\n between 290301 BC, 294241 AD, milliseconds for larger dates\n (see `numpy.datetime64`).\n """\n\n # the "extra" ensures that we at least allow the dynamic range out to\n # seconds. That should get out to +/-2e11 years.\n dseconds = d.astype('datetime64[s]')\n extra = (d - dseconds).astype('timedelta64[ns]')\n t0 = np.datetime64(get_epoch(), 's')\n dt = (dseconds - t0).astype(np.float64)\n dt += extra.astype(np.float64) / 1.0e9\n dt = dt / SEC_PER_DAY\n\n NaT_int = np.datetime64('NaT').astype(np.int64)\n d_int = d.astype(np.int64)\n dt[d_int == NaT_int] = np.nan\n return dt\n\n\ndef _from_ordinalf(x, tz=None):\n """\n Convert Gregorian float of the date, preserving hours, minutes,\n seconds and microseconds. 
Return value is a `.datetime`.\n\n The input date *x* is a float in ordinal days at UTC, and the output will\n be the specified `.datetime` object corresponding to that time in\n timezone *tz*, or if *tz* is ``None``, in the timezone specified in\n :rc:`timezone`.\n """\n\n tz = _get_tzinfo(tz)\n\n dt = (np.datetime64(get_epoch()) +\n np.timedelta64(int(np.round(x * MUSECONDS_PER_DAY)), 'us'))\n if dt < np.datetime64('0001-01-01') or dt >= np.datetime64('10000-01-01'):\n raise ValueError(f'Date ordinal {x} converts to {dt} (using '\n f'epoch {get_epoch()}), but Matplotlib dates must be '\n 'between year 0001 and 9999.')\n # convert from datetime64 to datetime:\n dt = dt.tolist()\n\n # datetime64 is always UTC:\n dt = dt.replace(tzinfo=dateutil.tz.gettz('UTC'))\n # but maybe we are working in a different timezone so move.\n dt = dt.astimezone(tz)\n # fix round off errors\n if np.abs(x) > 70 * 365:\n # if x is big, round off to nearest twenty microseconds.\n # This avoids floating point roundoff error\n ms = round(dt.microsecond / 20) * 20\n if ms == 1000000:\n dt = dt.replace(microsecond=0) + datetime.timedelta(seconds=1)\n else:\n dt = dt.replace(microsecond=ms)\n\n return dt\n\n\n# a version of _from_ordinalf that can operate on numpy arrays\n_from_ordinalf_np_vectorized = np.vectorize(_from_ordinalf, otypes="O")\n# a version of dateutil.parser.parse that can operate on numpy arrays\n_dateutil_parser_parse_np_vectorized = np.vectorize(dateutil.parser.parse)\n\n\ndef datestr2num(d, default=None):\n """\n Convert a date string to a datenum using `dateutil.parser.parse`.\n\n Parameters\n ----------\n d : str or sequence of str\n The dates to convert.\n\n default : datetime.datetime, optional\n The default date to use when fields are missing in *d*.\n """\n if isinstance(d, str):\n dt = dateutil.parser.parse(d, default=default)\n return date2num(dt)\n else:\n if default is not None:\n d = [date2num(dateutil.parser.parse(s, default=default))\n for s in d]\n return 
np.asarray(d)\n d = np.asarray(d)\n if not d.size:\n return d\n return date2num(_dateutil_parser_parse_np_vectorized(d))\n\n\ndef date2num(d):\n """\n Convert datetime objects to Matplotlib dates.\n\n Parameters\n ----------\n d : `datetime.datetime` or `numpy.datetime64` or sequences of these\n\n Returns\n -------\n float or sequence of floats\n Number of days since the epoch. See `.get_epoch` for the\n epoch, which can be changed by :rc:`date.epoch` or `.set_epoch`. If\n the epoch is "1970-01-01T00:00:00" (default) then noon Jan 1 1970\n ("1970-01-01T12:00:00") returns 0.5.\n\n Notes\n -----\n The Gregorian calendar is assumed; this is not universal practice.\n For details see the module docstring.\n """\n # Unpack in case of e.g. Pandas or xarray object\n d = cbook._unpack_to_numpy(d)\n\n # make an iterable, but save state to unpack later:\n iterable = np.iterable(d)\n if not iterable:\n d = [d]\n\n masked = np.ma.is_masked(d)\n mask = np.ma.getmask(d)\n d = np.asarray(d)\n\n # convert to datetime64 arrays, if not already:\n if not np.issubdtype(d.dtype, np.datetime64):\n # datetime arrays\n if not d.size:\n # deals with an empty array...\n return d\n tzi = getattr(d[0], 'tzinfo', None)\n if tzi is not None:\n # make datetime naive:\n d = [dt.astimezone(UTC).replace(tzinfo=None) for dt in d]\n d = np.asarray(d)\n d = d.astype('datetime64[us]')\n\n d = np.ma.masked_array(d, mask=mask) if masked else d\n d = _dt64_to_ordinalf(d)\n\n return d if iterable else d[0]\n\n\ndef num2date(x, tz=None):\n """\n Convert Matplotlib dates to `~datetime.datetime` objects.\n\n Parameters\n ----------\n x : float or sequence of floats\n Number of days (fraction part represents hours, minutes, seconds)\n since the epoch. See `.get_epoch` for the\n epoch, which can be changed by :rc:`date.epoch` or `.set_epoch`.\n tz : str or `~datetime.tzinfo`, default: :rc:`timezone`\n Timezone of *x*. 
If a string, *tz* is passed to `dateutil.tz`.\n\n    Returns\n    -------\n    `~datetime.datetime` or sequence of `~datetime.datetime`\n        Dates are returned in timezone *tz*.\n\n        If *x* is a sequence, a sequence of `~datetime.datetime` objects will\n        be returned.\n\n    Notes\n    -----\n    The Gregorian calendar is assumed; this is not universal practice.\n    For details, see the module docstring.\n    """\n    tz = _get_tzinfo(tz)\n    return _from_ordinalf_np_vectorized(x, tz).tolist()\n\n\n_ordinalf_to_timedelta_np_vectorized = np.vectorize(\n    lambda x: datetime.timedelta(days=x), otypes="O")\n\n\ndef num2timedelta(x):\n    """\n    Convert number of days to a `~datetime.timedelta` object.\n\n    If *x* is a sequence, a sequence of `~datetime.timedelta` objects will\n    be returned.\n\n    Parameters\n    ----------\n    x : float, sequence of floats\n        Number of days. The fraction part represents hours, minutes, seconds.\n\n    Returns\n    -------\n    `datetime.timedelta` or list[`datetime.timedelta`]\n    """\n    return _ordinalf_to_timedelta_np_vectorized(x).tolist()\n\n\ndef drange(dstart, dend, delta):\n    """\n    Return a sequence of equally spaced Matplotlib dates.\n\n    The dates start at *dstart* and reach up to, but not including, *dend*.\n    They are spaced by *delta*.\n\n    Parameters\n    ----------\n    dstart, dend : `~datetime.datetime`\n        The date limits.\n    delta : `datetime.timedelta`\n        Spacing of the dates.\n\n    Returns\n    -------\n    `numpy.array`\n        An array of floats representing Matplotlib dates.\n\n    """\n    f1 = date2num(dstart)\n    f2 = date2num(dend)\n    step = delta.total_seconds() / SEC_PER_DAY\n\n    # calculate the difference between dend and dstart in units of delta\n    num = int(np.ceil((f2 - f1) / step))\n\n    # calculate the end of the interval which will be generated\n    dinterval_end = dstart + num * delta\n\n    # ensure that a half-open interval [dstart, dend) is generated\n    if dinterval_end >= dend:\n        # if the endpoint is greater than or equal to dend,\n        # just subtract one delta\n        dinterval_end -= delta\n        num -= 1\n\n    f2 = 
date2num(dinterval_end) # new float-endpoint\n return np.linspace(f1, f2, num + 1)\n\n\ndef _wrap_in_tex(text):\n p = r'([a-zA-Z]+)'\n ret_text = re.sub(p, r'}$\1$\\mathdefault{', text)\n\n # Braces ensure symbols are not spaced like binary operators.\n ret_text = ret_text.replace('-', '{-}').replace(':', '{:}')\n # To not concatenate space between numbers.\n ret_text = ret_text.replace(' ', r'\;')\n ret_text = '$\\mathdefault{' + ret_text + '}$'\n ret_text = ret_text.replace('$\\mathdefault{}$', '')\n return ret_text\n\n\n## date tick locators and formatters ###\n\n\nclass DateFormatter(ticker.Formatter):\n """\n Format a tick (in days since the epoch) with a\n `~datetime.datetime.strftime` format string.\n """\n\n def __init__(self, fmt, tz=None, *, usetex=None):\n """\n Parameters\n ----------\n fmt : str\n `~datetime.datetime.strftime` format string\n tz : str or `~datetime.tzinfo`, default: :rc:`timezone`\n Ticks timezone. If a string, *tz* is passed to `dateutil.tz`.\n usetex : bool, default: :rc:`text.usetex`\n To enable/disable the use of TeX's math mode for rendering the\n results of the formatter.\n """\n self.tz = _get_tzinfo(tz)\n self.fmt = fmt\n self._usetex = mpl._val_or_rc(usetex, 'text.usetex')\n\n def __call__(self, x, pos=0):\n result = num2date(x, self.tz).strftime(self.fmt)\n return _wrap_in_tex(result) if self._usetex else result\n\n def set_tzinfo(self, tz):\n self.tz = _get_tzinfo(tz)\n\n\nclass ConciseDateFormatter(ticker.Formatter):\n """\n A `.Formatter` which attempts to figure out the best format to use for the\n date, and to make it as compact as possible, but still be complete. 
This is\n    most useful when used with the `AutoDateLocator`::\n\n    >>> locator = AutoDateLocator()\n    >>> formatter = ConciseDateFormatter(locator)\n\n    Parameters\n    ----------\n    locator : `.ticker.Locator`\n        Locator that this axis is using.\n\n    tz : str or `~datetime.tzinfo`, default: :rc:`timezone`\n        Ticks timezone, passed to `.dates.num2date`.\n\n    formats : list of 6 strings, optional\n        Format strings for 6 levels of tick labelling: mostly years,\n        months, days, hours, minutes, and seconds. Strings use\n        the same format codes as `~datetime.datetime.strftime`. Default is\n        ``['%Y', '%b', '%d', '%H:%M', '%H:%M', '%S.%f']``\n\n    zero_formats : list of 6 strings, optional\n        Format strings for tick labels that are "zeros" for a given tick\n        level. For instance, if most ticks are months, ticks around 1 Jan 2005\n        will be labeled "Dec", "2005", "Feb". The default is\n        ``['', '%Y', '%b', '%b-%d', '%H:%M', '%H:%M']``\n\n    offset_formats : list of 6 strings, optional\n        Format strings for the 6 levels that are applied to the "offset"\n        string found on the right side of an x-axis, or top of a y-axis.\n        Combined with the tick labels, this should completely specify the\n        date. The default is::\n\n            ['', '%Y', '%Y-%b', '%Y-%b-%d', '%Y-%b-%d', '%Y-%b-%d %H:%M']\n\n    show_offset : bool, default: True\n        Whether to show the offset.\n\n    usetex : bool, default: :rc:`text.usetex`\n        To enable/disable the use of TeX's math mode for rendering the results\n        of the formatter.\n\n    Examples\n    --------\n    See :doc:`/gallery/ticks/date_concise_formatter`\n\n    .. 
plot::\n\n import datetime\n import matplotlib.dates as mdates\n\n base = datetime.datetime(2005, 2, 1)\n dates = np.array([base + datetime.timedelta(hours=(2 * i))\n for i in range(732)])\n N = len(dates)\n np.random.seed(19680801)\n y = np.cumsum(np.random.randn(N))\n\n fig, ax = plt.subplots(constrained_layout=True)\n locator = mdates.AutoDateLocator()\n formatter = mdates.ConciseDateFormatter(locator)\n ax.xaxis.set_major_locator(locator)\n ax.xaxis.set_major_formatter(formatter)\n\n ax.plot(dates, y)\n ax.set_title('Concise Date Formatter')\n\n """\n\n def __init__(self, locator, tz=None, formats=None, offset_formats=None,\n zero_formats=None, show_offset=True, *, usetex=None):\n """\n Autoformat the date labels. The default format is used to form an\n initial string, and then redundant elements are removed.\n """\n self._locator = locator\n self._tz = tz\n self.defaultfmt = '%Y'\n # there are 6 levels with each level getting a specific format\n # 0: mostly years, 1: months, 2: days,\n # 3: hours, 4: minutes, 5: seconds\n if formats:\n if len(formats) != 6:\n raise ValueError('formats argument must be a list of '\n '6 format strings (or None)')\n self.formats = formats\n else:\n self.formats = ['%Y', # ticks are mostly years\n '%b', # ticks are mostly months\n '%d', # ticks are mostly days\n '%H:%M', # hrs\n '%H:%M', # min\n '%S.%f', # secs\n ]\n # fmt for zeros ticks at this level. These are\n # ticks that should be labeled w/ info the level above.\n # like 1 Jan can just be labelled "Jan". 
02:02:00 can\n        # just be labeled 02:02.\n        if zero_formats:\n            if len(zero_formats) != 6:\n                raise ValueError('zero_formats argument must be a list of '\n                                 '6 format strings (or None)')\n            self.zero_formats = zero_formats\n        elif formats:\n            # use the user's formats for the zero tick formats\n            self.zero_formats = [''] + self.formats[:-1]\n        else:\n            # make the defaults a bit nicer:\n            self.zero_formats = [''] + self.formats[:-1]\n            self.zero_formats[3] = '%b-%d'\n\n        if offset_formats:\n            if len(offset_formats) != 6:\n                raise ValueError('offset_formats argument must be a list of '\n                                 '6 format strings (or None)')\n            self.offset_formats = offset_formats\n        else:\n            self.offset_formats = ['',\n                                   '%Y',\n                                   '%Y-%b',\n                                   '%Y-%b-%d',\n                                   '%Y-%b-%d',\n                                   '%Y-%b-%d %H:%M']\n        self.offset_string = ''\n        self.show_offset = show_offset\n        self._usetex = mpl._val_or_rc(usetex, 'text.usetex')\n\n    def __call__(self, x, pos=None):\n        formatter = DateFormatter(self.defaultfmt, self._tz,\n                                  usetex=self._usetex)\n        return formatter(x, pos=pos)\n\n    def format_ticks(self, values):\n        tickdatetime = [num2date(value, tz=self._tz) for value in values]\n        tickdate = np.array([tdt.timetuple()[:6] for tdt in tickdatetime])\n\n        # basic algorithm:\n        # 1) only display a part of the date if it changes over the ticks.\n        # 2) don't display the smaller part of the date if:\n        # it is always the same or if it is the start of the\n        # year, month, day etc.\n        # fmt for most ticks at this level\n        fmts = self.formats\n        # format beginnings of days, months, years, etc.\n        zerofmts = self.zero_formats\n        # offset fmts are for the offset string shown in the upper left\n        # or lower right of the axes.\n        offsetfmts = self.offset_formats\n        show_offset = self.show_offset\n\n        # determine the level we will label at:\n        # mostly 0: years, 1: months, 2: days,\n        # 3: hours, 4: minutes, 5: seconds, 6: microseconds\n        for level in range(5, -1, -1):\n            unique = np.unique(tickdate[:, level])\n            if len(unique) > 1:\n                # if 1 is included in unique, the year is shown in 
ticks\n if level < 2 and np.any(unique == 1):\n show_offset = False\n break\n elif level == 0:\n # all tickdate are the same, so only micros might be different\n # set to the most precise (6: microseconds doesn't exist...)\n level = 5\n\n # level is the basic level we will label at.\n # now loop through and decide the actual ticklabels\n zerovals = [0, 1, 1, 0, 0, 0, 0]\n labels = [''] * len(tickdate)\n for nn in range(len(tickdate)):\n if level < 5:\n if tickdate[nn][level] == zerovals[level]:\n fmt = zerofmts[level]\n else:\n fmt = fmts[level]\n else:\n # special handling for seconds + microseconds\n if (tickdatetime[nn].second == tickdatetime[nn].microsecond\n == 0):\n fmt = zerofmts[level]\n else:\n fmt = fmts[level]\n labels[nn] = tickdatetime[nn].strftime(fmt)\n\n # special handling of seconds and microseconds:\n # strip extra zeros and decimal if possible.\n # this is complicated by two factors. 1) we have some level-4 strings\n # here (i.e. 03:00, '0.50000', '1.000') 2) we would like to have the\n # same number of decimals for each string (i.e. 0.5 and 1.0).\n if level >= 5:\n trailing_zeros = min(\n (len(s) - len(s.rstrip('0')) for s in labels if '.' in s),\n default=None)\n if trailing_zeros:\n for nn in range(len(labels)):\n if '.' 
in labels[nn]:\n labels[nn] = labels[nn][:-trailing_zeros].rstrip('.')\n\n if show_offset:\n # set the offset string:\n if (self._locator.axis and\n self._locator.axis.__name__ in ('xaxis', 'yaxis')\n and self._locator.axis.get_inverted()):\n self.offset_string = tickdatetime[0].strftime(offsetfmts[level])\n else:\n self.offset_string = tickdatetime[-1].strftime(offsetfmts[level])\n if self._usetex:\n self.offset_string = _wrap_in_tex(self.offset_string)\n else:\n self.offset_string = ''\n\n if self._usetex:\n return [_wrap_in_tex(l) for l in labels]\n else:\n return labels\n\n def get_offset(self):\n return self.offset_string\n\n def format_data_short(self, value):\n return num2date(value, tz=self._tz).strftime('%Y-%m-%d %H:%M:%S')\n\n\nclass AutoDateFormatter(ticker.Formatter):\n """\n A `.Formatter` which attempts to figure out the best format to use. This\n is most useful when used with the `AutoDateLocator`.\n\n `.AutoDateFormatter` has a ``.scale`` dictionary that maps tick scales (the\n interval in days between one major tick) to format strings; this dictionary\n defaults to ::\n\n self.scaled = {\n DAYS_PER_YEAR: rcParams['date.autoformatter.year'],\n DAYS_PER_MONTH: rcParams['date.autoformatter.month'],\n 1: rcParams['date.autoformatter.day'],\n 1 / HOURS_PER_DAY: rcParams['date.autoformatter.hour'],\n 1 / MINUTES_PER_DAY: rcParams['date.autoformatter.minute'],\n 1 / SEC_PER_DAY: rcParams['date.autoformatter.second'],\n 1 / MUSECONDS_PER_DAY: rcParams['date.autoformatter.microsecond'],\n }\n\n The formatter uses the format string corresponding to the lowest key in\n the dictionary that is greater or equal to the current scale. Dictionary\n entries can be customized::\n\n locator = AutoDateLocator()\n formatter = AutoDateFormatter(locator)\n formatter.scaled[1/(24*60)] = '%M:%S' # only show min and sec\n\n Custom callables can also be used instead of format strings. 
The following\n    example shows a custom format function that strips trailing zeros\n    from decimal seconds and adds the date to the first ticklabel::\n\n        def my_format_function(x, pos=None):\n            x = matplotlib.dates.num2date(x)\n            if pos == 0:\n                fmt = '%D %H:%M:%S.%f'\n            else:\n                fmt = '%H:%M:%S.%f'\n            label = x.strftime(fmt)\n            label = label.rstrip("0")\n            label = label.rstrip(".")\n            return label\n\n        formatter.scaled[1/(24*60)] = my_format_function\n    """\n\n    # This can be improved by providing some user-level direction on\n    # how to choose the best format (precedence, etc.).\n\n    # Perhaps a 'struct' that has a field for each time-type where a\n    # zero would indicate "don't show" and a number would indicate\n    # "show" with some sort of priority. Same priorities could mean\n    # show all with the same priority.\n\n    # Or more simply, perhaps just a format string for each\n    # possibility...\n\n    def __init__(self, locator, tz=None, defaultfmt='%Y-%m-%d', *,\n                 usetex=None):\n        """\n        Autoformat the date labels.\n\n        Parameters\n        ----------\n        locator : `.ticker.Locator`\n            Locator that this axis is using.\n\n        tz : str or `~datetime.tzinfo`, default: :rc:`timezone`\n            Ticks timezone. If a string, *tz* is passed to `dateutil.tz`.\n\n        defaultfmt : str\n            The default format to use if none of the values in ``self.scaled``\n            are greater than the unit returned by ``locator._get_unit()``.\n\n        usetex : bool, default: :rc:`text.usetex`\n            To enable/disable the use of TeX's math mode for rendering the\n            results of the formatter. 
If any entries in ``self.scaled`` are set\n as functions, then it is up to the customized function to enable or\n disable TeX's math mode itself.\n """\n self._locator = locator\n self._tz = tz\n self.defaultfmt = defaultfmt\n self._formatter = DateFormatter(self.defaultfmt, tz)\n rcParams = mpl.rcParams\n self._usetex = mpl._val_or_rc(usetex, 'text.usetex')\n self.scaled = {\n DAYS_PER_YEAR: rcParams['date.autoformatter.year'],\n DAYS_PER_MONTH: rcParams['date.autoformatter.month'],\n 1: rcParams['date.autoformatter.day'],\n 1 / HOURS_PER_DAY: rcParams['date.autoformatter.hour'],\n 1 / MINUTES_PER_DAY: rcParams['date.autoformatter.minute'],\n 1 / SEC_PER_DAY: rcParams['date.autoformatter.second'],\n 1 / MUSECONDS_PER_DAY: rcParams['date.autoformatter.microsecond']\n }\n\n def _set_locator(self, locator):\n self._locator = locator\n\n def __call__(self, x, pos=None):\n try:\n locator_unit_scale = float(self._locator._get_unit())\n except AttributeError:\n locator_unit_scale = 1\n # Pick the first scale which is greater than the locator unit.\n fmt = next((fmt for scale, fmt in sorted(self.scaled.items())\n if scale >= locator_unit_scale),\n self.defaultfmt)\n\n if isinstance(fmt, str):\n self._formatter = DateFormatter(fmt, self._tz, usetex=self._usetex)\n result = self._formatter(x, pos)\n elif callable(fmt):\n result = fmt(x, pos)\n else:\n raise TypeError(f'Unexpected type passed to {self!r}.')\n\n return result\n\n\nclass rrulewrapper:\n """\n A simple wrapper around a `dateutil.rrule` allowing flexible\n date tick specifications.\n """\n def __init__(self, freq, tzinfo=None, **kwargs):\n """\n Parameters\n ----------\n freq : {YEARLY, MONTHLY, WEEKLY, DAILY, HOURLY, MINUTELY, SECONDLY}\n Tick frequency. These constants are defined in `dateutil.rrule`,\n but they are accessible from `matplotlib.dates` as well.\n tzinfo : `datetime.tzinfo`, optional\n Time zone information. 
The default is None.\n **kwargs\n Additional keyword arguments are passed to the `dateutil.rrule`.\n """\n kwargs['freq'] = freq\n self._base_tzinfo = tzinfo\n\n self._update_rrule(**kwargs)\n\n def set(self, **kwargs):\n """Set parameters for an existing wrapper."""\n self._construct.update(kwargs)\n\n self._update_rrule(**self._construct)\n\n def _update_rrule(self, **kwargs):\n tzinfo = self._base_tzinfo\n\n # rrule does not play nicely with timezones - especially pytz time\n # zones, it's best to use naive zones and attach timezones once the\n # datetimes are returned\n if 'dtstart' in kwargs:\n dtstart = kwargs['dtstart']\n if dtstart.tzinfo is not None:\n if tzinfo is None:\n tzinfo = dtstart.tzinfo\n else:\n dtstart = dtstart.astimezone(tzinfo)\n\n kwargs['dtstart'] = dtstart.replace(tzinfo=None)\n\n if 'until' in kwargs:\n until = kwargs['until']\n if until.tzinfo is not None:\n if tzinfo is not None:\n until = until.astimezone(tzinfo)\n else:\n raise ValueError('until cannot be aware if dtstart '\n 'is naive and tzinfo is None')\n\n kwargs['until'] = until.replace(tzinfo=None)\n\n self._construct = kwargs.copy()\n self._tzinfo = tzinfo\n self._rrule = rrule(**self._construct)\n\n def _attach_tzinfo(self, dt, tzinfo):\n # pytz zones are attached by "localizing" the datetime\n if hasattr(tzinfo, 'localize'):\n return tzinfo.localize(dt, is_dst=True)\n\n return dt.replace(tzinfo=tzinfo)\n\n def _aware_return_wrapper(self, f, returns_list=False):\n """Decorator function that allows rrule methods to handle tzinfo."""\n # This is only necessary if we're actually attaching a tzinfo\n if self._tzinfo is None:\n return f\n\n # All datetime arguments must be naive. 
If they are not naive, they are\n # converted to the _tzinfo zone before dropping the zone.\n def normalize_arg(arg):\n if isinstance(arg, datetime.datetime) and arg.tzinfo is not None:\n if arg.tzinfo is not self._tzinfo:\n arg = arg.astimezone(self._tzinfo)\n\n return arg.replace(tzinfo=None)\n\n return arg\n\n def normalize_args(args, kwargs):\n args = tuple(normalize_arg(arg) for arg in args)\n kwargs = {kw: normalize_arg(arg) for kw, arg in kwargs.items()}\n\n return args, kwargs\n\n # There are two kinds of functions we care about - ones that return\n # dates and ones that return lists of dates.\n if not returns_list:\n def inner_func(*args, **kwargs):\n args, kwargs = normalize_args(args, kwargs)\n dt = f(*args, **kwargs)\n return self._attach_tzinfo(dt, self._tzinfo)\n else:\n def inner_func(*args, **kwargs):\n args, kwargs = normalize_args(args, kwargs)\n dts = f(*args, **kwargs)\n return [self._attach_tzinfo(dt, self._tzinfo) for dt in dts]\n\n return functools.wraps(f)(inner_func)\n\n def __getattr__(self, name):\n if name in self.__dict__:\n return self.__dict__[name]\n\n f = getattr(self._rrule, name)\n\n if name in {'after', 'before'}:\n return self._aware_return_wrapper(f)\n elif name in {'xafter', 'xbefore', 'between'}:\n return self._aware_return_wrapper(f, returns_list=True)\n else:\n return f\n\n def __setstate__(self, state):\n self.__dict__.update(state)\n\n\nclass DateLocator(ticker.Locator):\n """\n Determines the tick locations when plotting dates.\n\n This class is subclassed by other Locators and\n is not meant to be used on its own.\n """\n hms0d = {'byhour': 0, 'byminute': 0, 'bysecond': 0}\n\n def __init__(self, tz=None):\n """\n Parameters\n ----------\n tz : str or `~datetime.tzinfo`, default: :rc:`timezone`\n Ticks timezone. 
If a string, *tz* is passed to `dateutil.tz`.\n """\n self.tz = _get_tzinfo(tz)\n\n def set_tzinfo(self, tz):\n """\n Set timezone info.\n\n Parameters\n ----------\n tz : str or `~datetime.tzinfo`, default: :rc:`timezone`\n Ticks timezone. If a string, *tz* is passed to `dateutil.tz`.\n """\n self.tz = _get_tzinfo(tz)\n\n def datalim_to_dt(self):\n """Convert axis data interval to datetime objects."""\n dmin, dmax = self.axis.get_data_interval()\n if dmin > dmax:\n dmin, dmax = dmax, dmin\n\n return num2date(dmin, self.tz), num2date(dmax, self.tz)\n\n def viewlim_to_dt(self):\n """Convert the view interval to datetime objects."""\n vmin, vmax = self.axis.get_view_interval()\n if vmin > vmax:\n vmin, vmax = vmax, vmin\n return num2date(vmin, self.tz), num2date(vmax, self.tz)\n\n def _get_unit(self):\n """\n Return how many days a unit of the locator is; used for\n intelligent autoscaling.\n """\n return 1\n\n def _get_interval(self):\n """\n Return the number of units for each tick.\n """\n return 1\n\n def nonsingular(self, vmin, vmax):\n """\n Given the proposed upper and lower extent, adjust the range\n if it is too close to being singular (i.e. 
a range of ~0).\n """\n if not np.isfinite(vmin) or not np.isfinite(vmax):\n # Except if there is no data, then use 1970 as default.\n return (date2num(datetime.date(1970, 1, 1)),\n date2num(datetime.date(1970, 1, 2)))\n if vmax < vmin:\n vmin, vmax = vmax, vmin\n unit = self._get_unit()\n interval = self._get_interval()\n if abs(vmax - vmin) < 1e-6:\n vmin -= 2 * unit * interval\n vmax += 2 * unit * interval\n return vmin, vmax\n\n\nclass RRuleLocator(DateLocator):\n # use the dateutil rrule instance\n\n def __init__(self, o, tz=None):\n super().__init__(tz)\n self.rule = o\n\n def __call__(self):\n # if no data have been set, this will tank with a ValueError\n try:\n dmin, dmax = self.viewlim_to_dt()\n except ValueError:\n return []\n\n return self.tick_values(dmin, dmax)\n\n def tick_values(self, vmin, vmax):\n start, stop = self._create_rrule(vmin, vmax)\n dates = self.rule.between(start, stop, True)\n if len(dates) == 0:\n return date2num([vmin, vmax])\n return self.raise_if_exceeds(date2num(dates))\n\n def _create_rrule(self, vmin, vmax):\n # set appropriate rrule dtstart and until and return\n # start and end\n delta = relativedelta(vmax, vmin)\n\n # We need to cap at the endpoints of valid datetime\n try:\n start = vmin - delta\n except (ValueError, OverflowError):\n # cap\n start = datetime.datetime(1, 1, 1, 0, 0, 0,\n tzinfo=datetime.timezone.utc)\n\n try:\n stop = vmax + delta\n except (ValueError, OverflowError):\n # cap\n stop = datetime.datetime(9999, 12, 31, 23, 59, 59,\n tzinfo=datetime.timezone.utc)\n\n self.rule.set(dtstart=start, until=stop)\n\n return vmin, vmax\n\n def _get_unit(self):\n # docstring inherited\n freq = self.rule._rrule._freq\n return self.get_unit_generic(freq)\n\n @staticmethod\n def get_unit_generic(freq):\n if freq == YEARLY:\n return DAYS_PER_YEAR\n elif freq == MONTHLY:\n return DAYS_PER_MONTH\n elif freq == WEEKLY:\n return DAYS_PER_WEEK\n elif freq == DAILY:\n return 1.0\n elif freq == HOURLY:\n return 1.0 / 
HOURS_PER_DAY\n elif freq == MINUTELY:\n return 1.0 / MINUTES_PER_DAY\n elif freq == SECONDLY:\n return 1.0 / SEC_PER_DAY\n else:\n # error\n return -1 # or should this just return '1'?\n\n def _get_interval(self):\n return self.rule._rrule._interval\n\n\nclass AutoDateLocator(DateLocator):\n """\n On autoscale, this class picks the best `DateLocator` to set the view\n limits and the tick locations.\n\n Attributes\n ----------\n intervald : dict\n\n Mapping of tick frequencies to multiples allowed for that ticking.\n The default is ::\n\n self.intervald = {\n YEARLY : [1, 2, 4, 5, 10, 20, 40, 50, 100, 200, 400, 500,\n 1000, 2000, 4000, 5000, 10000],\n MONTHLY : [1, 2, 3, 4, 6],\n DAILY : [1, 2, 3, 7, 14, 21],\n HOURLY : [1, 2, 3, 4, 6, 12],\n MINUTELY: [1, 5, 10, 15, 30],\n SECONDLY: [1, 5, 10, 15, 30],\n MICROSECONDLY: [1, 2, 5, 10, 20, 50, 100, 200, 500,\n 1000, 2000, 5000, 10000, 20000, 50000,\n 100000, 200000, 500000, 1000000],\n }\n\n where the keys are defined in `dateutil.rrule`.\n\n The interval is used to specify multiples that are appropriate for\n the frequency of ticking. For instance, every 7 days is sensible\n for daily ticks, but for minutes/seconds, 15 or 30 make sense.\n\n When customizing, you should only modify the values for the existing\n keys. You should not add or delete entries.\n\n Example for forcing ticks every 3 hours::\n\n locator = AutoDateLocator()\n locator.intervald[HOURLY] = [3] # only show every 3 hours\n """\n\n def __init__(self, tz=None, minticks=5, maxticks=None,\n interval_multiples=True):\n """\n Parameters\n ----------\n tz : str or `~datetime.tzinfo`, default: :rc:`timezone`\n Ticks timezone. If a string, *tz* is passed to `dateutil.tz`.\n minticks : int\n The minimum number of ticks desired; controls whether ticks occur\n yearly, monthly, etc.\n maxticks : int\n The maximum number of ticks desired; controls the interval between\n ticks (ticking every other, every 3, etc.). 
For fine-grained\n control, this can be a dictionary mapping individual rrule\n frequency constants (YEARLY, MONTHLY, etc.) to their own maximum\n number of ticks. This can be used to keep the number of ticks\n appropriate to the format chosen in `AutoDateFormatter`. Any\n frequency not specified in this dictionary is given a default\n value.\n interval_multiples : bool, default: True\n Whether ticks should be chosen to be multiple of the interval,\n locking them to 'nicer' locations. For example, this will force\n the ticks to be at hours 0, 6, 12, 18 when hourly ticking is done\n at 6 hour intervals.\n """\n super().__init__(tz=tz)\n self._freq = YEARLY\n self._freqs = [YEARLY, MONTHLY, DAILY, HOURLY, MINUTELY,\n SECONDLY, MICROSECONDLY]\n self.minticks = minticks\n\n self.maxticks = {YEARLY: 11, MONTHLY: 12, DAILY: 11, HOURLY: 12,\n MINUTELY: 11, SECONDLY: 11, MICROSECONDLY: 8}\n if maxticks is not None:\n try:\n self.maxticks.update(maxticks)\n except TypeError:\n # Assume we were given an integer. Use this as the maximum\n # number of ticks for every frequency and create a\n # dictionary for this\n self.maxticks = dict.fromkeys(self._freqs, maxticks)\n self.interval_multiples = interval_multiples\n self.intervald = {\n YEARLY: [1, 2, 4, 5, 10, 20, 40, 50, 100, 200, 400, 500,\n 1000, 2000, 4000, 5000, 10000],\n MONTHLY: [1, 2, 3, 4, 6],\n DAILY: [1, 2, 3, 7, 14, 21],\n HOURLY: [1, 2, 3, 4, 6, 12],\n MINUTELY: [1, 5, 10, 15, 30],\n SECONDLY: [1, 5, 10, 15, 30],\n MICROSECONDLY: [1, 2, 5, 10, 20, 50, 100, 200, 500, 1000, 2000,\n 5000, 10000, 20000, 50000, 100000, 200000, 500000,\n 1000000],\n }\n if interval_multiples:\n # Swap "3" for "4" in the DAILY list; If we use 3 we get bad\n # tick loc for months w/ 31 days: 1, 4, ..., 28, 31, 1\n # If we use 4 then we get: 1, 5, ... 
25, 29, 1\n self.intervald[DAILY] = [1, 2, 4, 7, 14]\n\n self._byranges = [None, range(1, 13), range(1, 32),\n range(0, 24), range(0, 60), range(0, 60), None]\n\n def __call__(self):\n # docstring inherited\n dmin, dmax = self.viewlim_to_dt()\n locator = self.get_locator(dmin, dmax)\n return locator()\n\n def tick_values(self, vmin, vmax):\n return self.get_locator(vmin, vmax).tick_values(vmin, vmax)\n\n def nonsingular(self, vmin, vmax):\n # whatever is thrown at us, we can scale the unit.\n # But default nonsingular date plots at an ~4 year period.\n if not np.isfinite(vmin) or not np.isfinite(vmax):\n # Except if there is no data, then use 1970 as default.\n return (date2num(datetime.date(1970, 1, 1)),\n date2num(datetime.date(1970, 1, 2)))\n if vmax < vmin:\n vmin, vmax = vmax, vmin\n if vmin == vmax:\n vmin = vmin - DAYS_PER_YEAR * 2\n vmax = vmax + DAYS_PER_YEAR * 2\n return vmin, vmax\n\n def _get_unit(self):\n if self._freq in [MICROSECONDLY]:\n return 1. / MUSECONDS_PER_DAY\n else:\n return RRuleLocator.get_unit_generic(self._freq)\n\n def get_locator(self, dmin, dmax):\n """Pick the best locator based on a distance."""\n delta = relativedelta(dmax, dmin)\n tdelta = dmax - dmin\n\n # take absolute difference\n if dmin > dmax:\n delta = -delta\n tdelta = -tdelta\n # The following uses a mix of calls to relativedelta and timedelta\n # methods because there is incomplete overlap in the functionality of\n # these similar functions, and it's best to avoid doing our own math\n # whenever possible.\n numYears = float(delta.years)\n numMonths = numYears * MONTHS_PER_YEAR + delta.months\n numDays = tdelta.days # Avoids estimates of days/month, days/year.\n numHours = numDays * HOURS_PER_DAY + delta.hours\n numMinutes = numHours * MIN_PER_HOUR + delta.minutes\n numSeconds = np.floor(tdelta.total_seconds())\n numMicroseconds = np.floor(tdelta.total_seconds() * 1e6)\n\n nums = [numYears, numMonths, numDays, numHours, numMinutes,\n numSeconds, numMicroseconds]\n\n 
use_rrule_locator = [True] * 6 + [False]\n\n        # Default setting of bymonth, etc. to pass to rrule\n        # [unused (for year), bymonth, bymonthday, byhour, byminute,\n        #  bysecond, unused (for microseconds)]\n        byranges = [None, 1, 1, 0, 0, 0, None]\n\n        # Loop over all the frequencies and try to find one that gives at\n        # least minticks tick positions. Once this is found, look for\n        # an interval from a list specific to that frequency that gives no\n        # more than maxticks tick positions. Also, set up some ranges\n        # (bymonth, etc.) as appropriate to be passed to rrulewrapper.\n        for i, (freq, num) in enumerate(zip(self._freqs, nums)):\n            # If this particular frequency doesn't give enough ticks, continue\n            if num < self.minticks:\n                # Since we're not using this particular frequency, set\n                # the corresponding by_ to None so the rrule can act as\n                # appropriate\n                byranges[i] = None\n                continue\n\n            # Find the first available interval that doesn't give too many\n            # ticks\n            for interval in self.intervald[freq]:\n                if num <= interval * (self.maxticks[freq] - 1):\n                    break\n            else:\n                if not (self.interval_multiples and freq == DAILY):\n                    _api.warn_external(\n                        f"AutoDateLocator was unable to pick an appropriate "\n                        f"interval for this date range. It may be necessary "\n                        f"to add an interval value to the AutoDateLocator's "\n                        f"intervald dictionary. Defaulting to {interval}.")\n\n            # Set some parameters as appropriate\n            self._freq = freq\n\n            if self._byranges[i] and self.interval_multiples:\n                byranges[i] = self._byranges[i][::interval]\n                if i in (DAILY, WEEKLY):\n                    if interval == 14:\n                        # just mark the 1st and 15th. 
Avoids 30th.\n byranges[i] = [1, 15]\n elif interval == 7:\n byranges[i] = [1, 8, 15, 22]\n\n interval = 1\n else:\n byranges[i] = self._byranges[i]\n break\n else:\n interval = 1\n\n if (freq == YEARLY) and self.interval_multiples:\n locator = YearLocator(interval, tz=self.tz)\n elif use_rrule_locator[i]:\n _, bymonth, bymonthday, byhour, byminute, bysecond, _ = byranges\n rrule = rrulewrapper(self._freq, interval=interval,\n dtstart=dmin, until=dmax,\n bymonth=bymonth, bymonthday=bymonthday,\n byhour=byhour, byminute=byminute,\n bysecond=bysecond)\n\n locator = RRuleLocator(rrule, tz=self.tz)\n else:\n locator = MicrosecondLocator(interval, tz=self.tz)\n if date2num(dmin) > 70 * 365 and interval < 1000:\n _api.warn_external(\n 'Plotting microsecond time intervals for dates far from '\n f'the epoch (time origin: {get_epoch()}) is not well-'\n 'supported. See matplotlib.dates.set_epoch to change the '\n 'epoch.')\n\n locator.set_axis(self.axis)\n return locator\n\n\nclass YearLocator(RRuleLocator):\n """\n Make ticks on a given day of each year that is a multiple of base.\n\n Examples::\n\n # Tick every year on Jan 1st\n locator = YearLocator()\n\n # Tick every 5 years on July 4th\n locator = YearLocator(5, month=7, day=4)\n """\n def __init__(self, base=1, month=1, day=1, tz=None):\n """\n Parameters\n ----------\n base : int, default: 1\n Mark ticks every *base* years.\n month : int, default: 1\n The month on which to place the ticks, starting from 1. Default is\n January.\n day : int, default: 1\n The day on which to place the ticks.\n tz : str or `~datetime.tzinfo`, default: :rc:`timezone`\n Ticks timezone. 
If a string, *tz* is passed to `dateutil.tz`.\n """\n rule = rrulewrapper(YEARLY, interval=base, bymonth=month,\n bymonthday=day, **self.hms0d)\n super().__init__(rule, tz=tz)\n self.base = ticker._Edge_integer(base, 0)\n\n def _create_rrule(self, vmin, vmax):\n # 'start' needs to be a multiple of the interval to create ticks on\n # interval multiples when the tick frequency is YEARLY\n ymin = max(self.base.le(vmin.year) * self.base.step, 1)\n ymax = min(self.base.ge(vmax.year) * self.base.step, 9999)\n\n c = self.rule._construct\n replace = {'year': ymin,\n 'month': c.get('bymonth', 1),\n 'day': c.get('bymonthday', 1),\n 'hour': 0, 'minute': 0, 'second': 0}\n\n start = vmin.replace(**replace)\n stop = start.replace(year=ymax)\n self.rule.set(dtstart=start, until=stop)\n\n return start, stop\n\n\nclass MonthLocator(RRuleLocator):\n """\n Make ticks on occurrences of each month, e.g., 1, 3, 12.\n """\n def __init__(self, bymonth=None, bymonthday=1, interval=1, tz=None):\n """\n Parameters\n ----------\n bymonth : int or list of int, default: all months\n Ticks will be placed on every month in *bymonth*. Default is\n ``range(1, 13)``, i.e. every month.\n bymonthday : int, default: 1\n The day on which to place the ticks.\n interval : int, default: 1\n The interval between each iteration. For example, if\n ``interval=2``, mark every second occurrence.\n tz : str or `~datetime.tzinfo`, default: :rc:`timezone`\n Ticks timezone. If a string, *tz* is passed to `dateutil.tz`.\n """\n if bymonth is None:\n bymonth = range(1, 13)\n\n rule = rrulewrapper(MONTHLY, bymonth=bymonth, bymonthday=bymonthday,\n interval=interval, **self.hms0d)\n super().__init__(rule, tz=tz)\n\n\nclass WeekdayLocator(RRuleLocator):\n """\n Make ticks on occurrences of each weekday.\n """\n\n def __init__(self, byweekday=1, interval=1, tz=None):\n """\n Parameters\n ----------\n byweekday : int or list of int, default: all days\n Ticks will be placed on every weekday in *byweekday*. 
Default is\n every day.\n\n Elements of *byweekday* must be one of MO, TU, WE, TH, FR, SA,\n SU, the constants from :mod:`dateutil.rrule`, which have been\n imported into the :mod:`matplotlib.dates` namespace.\n interval : int, default: 1\n The interval between each iteration. For example, if\n ``interval=2``, mark every second occurrence.\n tz : str or `~datetime.tzinfo`, default: :rc:`timezone`\n Ticks timezone. If a string, *tz* is passed to `dateutil.tz`.\n """\n rule = rrulewrapper(DAILY, byweekday=byweekday,\n interval=interval, **self.hms0d)\n super().__init__(rule, tz=tz)\n\n\nclass DayLocator(RRuleLocator):\n """\n Make ticks on occurrences of each day of the month. For example,\n 1, 15, 30.\n """\n def __init__(self, bymonthday=None, interval=1, tz=None):\n """\n Parameters\n ----------\n bymonthday : int or list of int, default: all days\n Ticks will be placed on every day in *bymonthday*. Default is\n ``bymonthday=range(1, 32)``, i.e., every day of the month.\n interval : int, default: 1\n The interval between each iteration. For example, if\n ``interval=2``, mark every second occurrence.\n tz : str or `~datetime.tzinfo`, default: :rc:`timezone`\n Ticks timezone. If a string, *tz* is passed to `dateutil.tz`.\n """\n if interval != int(interval) or interval < 1:\n raise ValueError("interval must be an integer greater than 0")\n if bymonthday is None:\n bymonthday = range(1, 32)\n\n rule = rrulewrapper(DAILY, bymonthday=bymonthday,\n interval=interval, **self.hms0d)\n super().__init__(rule, tz=tz)\n\n\nclass HourLocator(RRuleLocator):\n """\n Make ticks on occurrences of each hour.\n """\n def __init__(self, byhour=None, interval=1, tz=None):\n """\n Parameters\n ----------\n byhour : int or list of int, default: all hours\n Ticks will be placed on every hour in *byhour*. Default is\n ``byhour=range(24)``, i.e., every hour.\n interval : int, default: 1\n The interval between each iteration. 
For example, if\n ``interval=2``, mark every second occurrence.\n tz : str or `~datetime.tzinfo`, default: :rc:`timezone`\n Ticks timezone. If a string, *tz* is passed to `dateutil.tz`.\n """\n if byhour is None:\n byhour = range(24)\n\n rule = rrulewrapper(HOURLY, byhour=byhour, interval=interval,\n byminute=0, bysecond=0)\n super().__init__(rule, tz=tz)\n\n\nclass MinuteLocator(RRuleLocator):\n """\n Make ticks on occurrences of each minute.\n """\n def __init__(self, byminute=None, interval=1, tz=None):\n """\n Parameters\n ----------\n byminute : int or list of int, default: all minutes\n Ticks will be placed on every minute in *byminute*. Default is\n ``byminute=range(60)``, i.e., every minute.\n interval : int, default: 1\n The interval between each iteration. For example, if\n ``interval=2``, mark every second occurrence.\n tz : str or `~datetime.tzinfo`, default: :rc:`timezone`\n Ticks timezone. If a string, *tz* is passed to `dateutil.tz`.\n """\n if byminute is None:\n byminute = range(60)\n\n rule = rrulewrapper(MINUTELY, byminute=byminute, interval=interval,\n bysecond=0)\n super().__init__(rule, tz=tz)\n\n\nclass SecondLocator(RRuleLocator):\n """\n Make ticks on occurrences of each second.\n """\n def __init__(self, bysecond=None, interval=1, tz=None):\n """\n Parameters\n ----------\n bysecond : int or list of int, default: all seconds\n Ticks will be placed on every second in *bysecond*. Default is\n ``bysecond = range(60)``, i.e., every second.\n interval : int, default: 1\n The interval between each iteration. For example, if\n ``interval=2``, mark every second occurrence.\n tz : str or `~datetime.tzinfo`, default: :rc:`timezone`\n Ticks timezone. 
If a string, *tz* is passed to `dateutil.tz`.\n """\n if bysecond is None:\n bysecond = range(60)\n\n rule = rrulewrapper(SECONDLY, bysecond=bysecond, interval=interval)\n super().__init__(rule, tz=tz)\n\n\nclass MicrosecondLocator(DateLocator):\n """\n Make ticks on regular intervals of one or more microsecond(s).\n\n .. note::\n\n By default, Matplotlib uses a floating point representation of time in\n days since the epoch, so plotting data with\n microsecond time resolution does not work well for\n dates that are far (about 70 years) from the epoch (check with\n `~.dates.get_epoch`).\n\n If you want sub-microsecond resolution time plots, it is strongly\n recommended to use floating point seconds, not datetime-like\n time representation.\n\n If you really must use datetime.datetime() or similar and still\n need microsecond precision, change the time origin via\n `.dates.set_epoch` to something closer to the dates being plotted.\n See :doc:`/gallery/ticks/date_precision_and_epochs`.\n\n """\n def __init__(self, interval=1, tz=None):\n """\n Parameters\n ----------\n interval : int, default: 1\n The interval between each iteration. For example, if\n ``interval=2``, mark every second occurrence.\n tz : str or `~datetime.tzinfo`, default: :rc:`timezone`\n Ticks timezone. 
If a string, *tz* is passed to `dateutil.tz`.\n """\n super().__init__(tz=tz)\n self._interval = interval\n self._wrapped_locator = ticker.MultipleLocator(interval)\n\n def set_axis(self, axis):\n self._wrapped_locator.set_axis(axis)\n return super().set_axis(axis)\n\n def __call__(self):\n # if no data have been set, this will tank with a ValueError\n try:\n dmin, dmax = self.viewlim_to_dt()\n except ValueError:\n return []\n\n return self.tick_values(dmin, dmax)\n\n def tick_values(self, vmin, vmax):\n nmin, nmax = date2num((vmin, vmax))\n t0 = np.floor(nmin)\n nmax = nmax - t0\n nmin = nmin - t0\n nmin *= MUSECONDS_PER_DAY\n nmax *= MUSECONDS_PER_DAY\n\n ticks = self._wrapped_locator.tick_values(nmin, nmax)\n\n ticks = ticks / MUSECONDS_PER_DAY + t0\n return ticks\n\n def _get_unit(self):\n # docstring inherited\n return 1. / MUSECONDS_PER_DAY\n\n def _get_interval(self):\n # docstring inherited\n return self._interval\n\n\nclass DateConverter(units.ConversionInterface):\n """\n Converter for `datetime.date` and `datetime.datetime` data, or for\n date/time data represented as it would be converted by `date2num`.\n\n The 'unit' tag for such data is None or a `~datetime.tzinfo` instance.\n """\n\n def __init__(self, *, interval_multiples=True):\n self._interval_multiples = interval_multiples\n super().__init__()\n\n def axisinfo(self, unit, axis):\n """\n Return the `~matplotlib.units.AxisInfo` for *unit*.\n\n *unit* is a `~datetime.tzinfo` instance or None.\n The *axis* argument is required but not used.\n """\n tz = unit\n\n majloc = AutoDateLocator(tz=tz,\n interval_multiples=self._interval_multiples)\n majfmt = AutoDateFormatter(majloc, tz=tz)\n datemin = datetime.date(1970, 1, 1)\n datemax = datetime.date(1970, 1, 2)\n\n return units.AxisInfo(majloc=majloc, majfmt=majfmt, label='',\n default_limits=(datemin, datemax))\n\n @staticmethod\n def convert(value, unit, axis):\n """\n If *value* is not already a number or sequence of numbers, convert it\n with 
`date2num`.\n\n The *unit* and *axis* arguments are not used.\n """\n return date2num(value)\n\n @staticmethod\n def default_units(x, axis):\n """\n Return the `~datetime.tzinfo` instance of *x* or of its first element,\n or None\n """\n if isinstance(x, np.ndarray):\n x = x.ravel()\n\n try:\n x = cbook._safe_first_finite(x)\n except (TypeError, StopIteration):\n pass\n\n try:\n return x.tzinfo\n except AttributeError:\n pass\n return None\n\n\nclass ConciseDateConverter(DateConverter):\n # docstring inherited\n\n def __init__(self, formats=None, zero_formats=None, offset_formats=None,\n show_offset=True, *, interval_multiples=True):\n self._formats = formats\n self._zero_formats = zero_formats\n self._offset_formats = offset_formats\n self._show_offset = show_offset\n self._interval_multiples = interval_multiples\n super().__init__()\n\n def axisinfo(self, unit, axis):\n # docstring inherited\n tz = unit\n majloc = AutoDateLocator(tz=tz,\n interval_multiples=self._interval_multiples)\n majfmt = ConciseDateFormatter(majloc, tz=tz, formats=self._formats,\n zero_formats=self._zero_formats,\n offset_formats=self._offset_formats,\n show_offset=self._show_offset)\n datemin = datetime.date(1970, 1, 1)\n datemax = datetime.date(1970, 1, 2)\n return units.AxisInfo(majloc=majloc, majfmt=majfmt, label='',\n default_limits=(datemin, datemax))\n\n\nclass _SwitchableDateConverter:\n """\n Helper converter-like object that generates and dispatches to\n temporary ConciseDateConverter or DateConverter instances based on\n :rc:`date.converter` and :rc:`date.interval_multiples`.\n """\n\n @staticmethod\n def _get_converter():\n converter_cls = {\n "concise": ConciseDateConverter, "auto": DateConverter}[\n mpl.rcParams["date.converter"]]\n interval_multiples = mpl.rcParams["date.interval_multiples"]\n return converter_cls(interval_multiples=interval_multiples)\n\n def axisinfo(self, *args, **kwargs):\n return self._get_converter().axisinfo(*args, **kwargs)\n\n def default_units(self, 
*args, **kwargs):\n return self._get_converter().default_units(*args, **kwargs)\n\n def convert(self, *args, **kwargs):\n return self._get_converter().convert(*args, **kwargs)\n\n\nunits.registry[np.datetime64] = \\n units.registry[datetime.date] = \\n units.registry[datetime.datetime] = \\n _SwitchableDateConverter()\n
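The locators and converters above all operate on matplotlib's floating-point "days since epoch" representation of time, which is also what the MicrosecondLocator note warns about for dates far from the epoch. A minimal stand-alone sketch of that representation, using only the standard library; `simple_date2num` and `EPOCH` are illustrative names for this sketch, not matplotlib's actual API:

```python
import datetime

# Matplotlib's date machinery stores times as floating-point days since an
# epoch (1970-01-01 by default; see get_epoch/set_epoch).  This sketch is
# an illustration of that representation, not the real date2num.
EPOCH = datetime.datetime(1970, 1, 1)
SEC_PER_DAY = 24 * 60 * 60
MUSECONDS_PER_DAY = SEC_PER_DAY * 10**6


def simple_date2num(d):
    """Return *d* as floating-point days since EPOCH (naive datetimes only)."""
    return (d - EPOCH).total_seconds() / SEC_PER_DAY


# Near the epoch a float64 day count resolves microseconds comfortably;
# tens of thousands of days away, the spacing between adjacent floats
# approaches the microsecond scale -- which is why MicrosecondLocator
# warns when date2num(dmin) > 70 * 365.
noon = simple_date2num(datetime.datetime(1970, 1, 2, 12))
print(noon)  # -> 1.5 (one and a half days after the epoch)
```

Tick values computed in these day units are what `MicrosecondLocator.tick_values` rescales by `MUSECONDS_PER_DAY` before delegating to a `MultipleLocator`.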
"""\nA module for reading dvi files output by TeX. Several limitations make\nthis not (currently) useful as a general-purpose dvi preprocessor, but\nit is currently used by the pdf backend for processing usetex text.\n\nInterface::\n\n with Dvi(filename, 72) as dvi:\n # iterate over pages:\n for page in dvi:\n w, h, d = page.width, page.height, page.descent\n for x, y, font, glyph, width in page.text:\n fontname = font.texname\n pointsize = font.size\n ...\n for x, y, height, width in page.boxes:\n ...\n"""\n\nfrom collections import namedtuple\nimport enum\nfrom functools import lru_cache, partial, wraps\nimport logging\nimport os\nfrom pathlib import Path\nimport re\nimport struct\nimport subprocess\nimport sys\n\nimport numpy as np\n\nfrom matplotlib import _api, cbook\n\n_log = logging.getLogger(__name__)\n\n# Many dvi related files are looked for by external processes, require\n# additional parsing, and are used many times per rendering, which is why they\n# are cached using lru_cache().\n\n# Dvi is a bytecode format documented in\n# https://ctan.org/pkg/dvitype\n# https://texdoc.org/serve/dvitype.pdf/0\n#\n# The file consists of a preamble, some number of pages, a postamble,\n# and a finale. Different opcodes are allowed in different contexts,\n# so the Dvi object has a parser state:\n#\n# pre: expecting the preamble\n# outer: between pages (followed by a page or the postamble,\n# also e.g. font definitions are allowed)\n# page: processing a page\n# post_post: state after the postamble (our current implementation\n# just stops reading)\n# finale: the finale (unimplemented in our current implementation)\n\n_dvistate = enum.Enum('DviState', 'pre outer inpage post_post finale')\n\n# The marks on a page consist of text and boxes. 
A page also has dimensions.\nPage = namedtuple('Page', 'text boxes height width descent')\nBox = namedtuple('Box', 'x y height width')\n\n\n# Also a namedtuple, for backcompat.\nclass Text(namedtuple('Text', 'x y font glyph width')):\n """\n A glyph in the dvi file.\n\n The *x* and *y* attributes directly position the glyph. The *font*,\n *glyph*, and *width* attributes are kept public for back-compatibility,\n but users wanting to draw the glyph themselves are encouraged to instead\n load the font specified by `font_path` at `font_size`, warp it with the\n effects specified by `font_effects`, and load the glyph specified by\n `glyph_name_or_index`.\n """\n\n def _get_pdftexmap_entry(self):\n return PsfontsMap(find_tex_file("pdftex.map"))[self.font.texname]\n\n @property\n def font_path(self):\n """The `~pathlib.Path` to the font for this glyph."""\n psfont = self._get_pdftexmap_entry()\n if psfont.filename is None:\n raise ValueError("No usable font file found for {} ({}); "\n "the font may lack a Type-1 version"\n .format(psfont.psname.decode("ascii"),\n psfont.texname.decode("ascii")))\n return Path(psfont.filename)\n\n @property\n def font_size(self):\n """The font size."""\n return self.font.size\n\n @property\n def font_effects(self):\n """\n The "font effects" dict for this glyph.\n\n This dict contains the values for this glyph of SlantFont and\n ExtendFont (if any), read off :file:`pdftex.map`.\n """\n return self._get_pdftexmap_entry().effects\n\n @property\n def glyph_name_or_index(self):\n """\n Either the glyph name or the native charmap glyph index.\n\n If :file:`pdftex.map` specifies an encoding for this glyph's font, that\n is a mapping of glyph indices to Adobe glyph names; use it to convert\n dvi indices to glyph names. 
Callers can then convert glyph names to\n glyph indices (with FT_Get_Name_Index/get_name_index), and load the\n glyph using FT_Load_Glyph/load_glyph.\n\n If :file:`pdftex.map` specifies no encoding, the indices directly map\n to the font's "native" charmap; glyphs should directly load using\n FT_Load_Char/load_char after selecting the native charmap.\n """\n entry = self._get_pdftexmap_entry()\n return (_parse_enc(entry.encoding)[self.glyph]\n if entry.encoding is not None else self.glyph)\n\n\n# Opcode argument parsing\n#\n# Each of the following functions takes a Dvi object and delta, which is the\n# difference between the opcode and the minimum opcode with the same meaning.\n# Dvi opcodes often encode the number of argument bytes in this delta.\n_arg_mapping = dict(\n # raw: Return delta as is.\n raw=lambda dvi, delta: delta,\n # u1: Read 1 byte as an unsigned number.\n u1=lambda dvi, delta: dvi._read_arg(1, signed=False),\n # u4: Read 4 bytes as an unsigned number.\n u4=lambda dvi, delta: dvi._read_arg(4, signed=False),\n # s4: Read 4 bytes as a signed number.\n s4=lambda dvi, delta: dvi._read_arg(4, signed=True),\n # slen: Read delta bytes as a signed number, or None if delta is None.\n slen=lambda dvi, delta: dvi._read_arg(delta, signed=True) if delta else None,\n # slen1: Read (delta + 1) bytes as a signed number.\n slen1=lambda dvi, delta: dvi._read_arg(delta + 1, signed=True),\n # ulen1: Read (delta + 1) bytes as an unsigned number.\n ulen1=lambda dvi, delta: dvi._read_arg(delta + 1, signed=False),\n # olen1: Read (delta + 1) bytes as an unsigned number if less than 4 bytes,\n # as a signed number if 4 bytes.\n olen1=lambda dvi, delta: dvi._read_arg(delta + 1, signed=(delta == 3)),\n)\n\n\ndef _dispatch(table, min, max=None, state=None, args=('raw',)):\n """\n Decorator for dispatch by opcode. 
Sets the values in *table*\n from *min* to *max* to this method, adds a check that the Dvi state\n matches *state* if not None, reads arguments from the file according\n to *args*.\n\n Parameters\n ----------\n table : dict[int, callable]\n The dispatch table to be filled in.\n\n min, max : int\n Range of opcodes that calls the registered function; *max* defaults to\n *min*.\n\n state : _dvistate, optional\n State of the Dvi object in which these opcodes are allowed.\n\n args : list[str], default: ['raw']\n Sequence of argument specifications:\n\n - 'raw': opcode minus minimum\n - 'u1': read one unsigned byte\n - 'u4': read four bytes, treat as an unsigned number\n - 's4': read four bytes, treat as a signed number\n - 'slen': read (opcode - minimum) bytes, treat as signed\n - 'slen1': read (opcode - minimum + 1) bytes, treat as signed\n - 'ulen1': read (opcode - minimum + 1) bytes, treat as unsigned\n - 'olen1': read (opcode - minimum + 1) bytes, treat as unsigned\n if under four bytes, signed if four bytes\n """\n def decorate(method):\n get_args = [_arg_mapping[x] for x in args]\n\n @wraps(method)\n def wrapper(self, byte):\n if state is not None and self.state != state:\n raise ValueError("state precondition failed")\n return method(self, *[f(self, byte-min) for f in get_args])\n if max is None:\n table[min] = wrapper\n else:\n for i in range(min, max+1):\n assert table[i] is None\n table[i] = wrapper\n return wrapper\n return decorate\n\n\nclass Dvi:\n """\n A reader for a dvi ("device-independent") file, as produced by TeX.\n\n The current implementation can only iterate through pages in order,\n and does not even attempt to verify the postamble.\n\n This class can be used as a context manager to close the underlying\n file upon exit. Pages can be read via iteration. Here is an overly\n simple way to extract text without trying to detect whitespace::\n\n >>> with matplotlib.dviread.Dvi('input.dvi', 72) as dvi:\n ... for page in dvi:\n ... 
print(''.join(chr(t.glyph) for t in page.text))\n """\n # dispatch table\n _dtable = [None] * 256\n _dispatch = partial(_dispatch, _dtable)\n\n def __init__(self, filename, dpi):\n """\n Read the data from the file named *filename* and convert\n TeX's internal units to units of *dpi* per inch.\n *dpi* only sets the units and does not limit the resolution.\n Use None to return TeX's internal units.\n """\n _log.debug('Dvi: %s', filename)\n self.file = open(filename, 'rb')\n self.dpi = dpi\n self.fonts = {}\n self.state = _dvistate.pre\n self._missing_font = None\n\n def __enter__(self):\n """Context manager enter method, does nothing."""\n return self\n\n def __exit__(self, etype, evalue, etrace):\n """\n Context manager exit method, closes the underlying file if it is open.\n """\n self.close()\n\n def __iter__(self):\n """\n Iterate through the pages of the file.\n\n Yields\n ------\n Page\n Details of all the text and box objects on the page.\n The Page tuple contains lists of Text and Box tuples and\n the page dimensions, and the Text and Box tuples contain\n coordinates transformed into a standard Cartesian\n coordinate system at the dpi value given when initializing.\n The coordinates are floating point numbers, but otherwise\n precision is not lost and coordinate values are not clipped to\n integers.\n """\n while self._read():\n yield self._output()\n\n def close(self):\n """Close the underlying file if it is open."""\n if not self.file.closed:\n self.file.close()\n\n def _output(self):\n """\n Output the text and boxes belonging to the most recent page.\n page = dvi._output()\n """\n minx = miny = np.inf\n maxx = maxy = -np.inf\n maxy_pure = -np.inf\n for elt in self.text + self.boxes:\n if isinstance(elt, Box):\n x, y, h, w = elt\n e = 0 # zero depth\n else: # glyph\n x, y, font, g, w = elt\n h, e = font._height_depth_of(g)\n minx = min(minx, x)\n miny = min(miny, y - h)\n maxx = max(maxx, x + w)\n maxy = max(maxy, y + e)\n maxy_pure = max(maxy_pure, y)\n 
if self._baseline_v is not None:\n maxy_pure = self._baseline_v # This should normally be the case.\n self._baseline_v = None\n\n if not self.text and not self.boxes: # Avoid infs/nans from inf+/-inf.\n return Page(text=[], boxes=[], width=0, height=0, descent=0)\n\n if self.dpi is None:\n # special case for ease of debugging: output raw dvi coordinates\n return Page(text=self.text, boxes=self.boxes,\n width=maxx-minx, height=maxy_pure-miny,\n descent=maxy-maxy_pure)\n\n # convert from TeX's "scaled points" to dpi units\n d = self.dpi / (72.27 * 2**16)\n descent = (maxy - maxy_pure) * d\n\n text = [Text((x-minx)*d, (maxy-y)*d - descent, f, g, w*d)\n for (x, y, f, g, w) in self.text]\n boxes = [Box((x-minx)*d, (maxy-y)*d - descent, h*d, w*d)\n for (x, y, h, w) in self.boxes]\n\n return Page(text=text, boxes=boxes, width=(maxx-minx)*d,\n height=(maxy_pure-miny)*d, descent=descent)\n\n def _read(self):\n """\n Read one page from the file. Return True if successful,\n False if there were no more pages.\n """\n # Pages appear to start with the sequence\n # bop (begin of page)\n # xxx comment\n # <push, ..., pop> # if using chemformula\n # down\n # push\n # down\n # <push, push, xxx, right, xxx, pop, pop> # if using xcolor\n # down\n # push\n # down (possibly multiple)\n # push <= here, v is the baseline position.\n # etc.\n # (dviasm is useful to explore this structure.)\n # Thus, we use the vertical position at the first time the stack depth\n # reaches 3, while at least three "downs" have been executed (excluding\n # those popped out (corresponding to the chemformula preamble)), as the\n # baseline (the "down" count is necessary to handle xcolor).\n down_stack = [0]\n self._baseline_v = None\n while True:\n byte = self.file.read(1)[0]\n self._dtable[byte](self, byte)\n if self._missing_font:\n raise self._missing_font.to_exception()\n name = self._dtable[byte].__name__\n if name == "_push":\n down_stack.append(down_stack[-1])\n elif name == "_pop":\n 
down_stack.pop()\n elif name == "_down":\n down_stack[-1] += 1\n if (self._baseline_v is None\n and len(getattr(self, "stack", [])) == 3\n and down_stack[-1] >= 4):\n self._baseline_v = self.v\n if byte == 140: # end of page\n return True\n if self.state is _dvistate.post_post: # end of file\n self.close()\n return False\n\n def _read_arg(self, nbytes, signed=False):\n """\n Read and return a big-endian integer *nbytes* long.\n Signedness is determined by the *signed* keyword.\n """\n return int.from_bytes(self.file.read(nbytes), "big", signed=signed)\n\n @_dispatch(min=0, max=127, state=_dvistate.inpage)\n def _set_char_immediate(self, char):\n self._put_char_real(char)\n if isinstance(self.fonts[self.f], cbook._ExceptionInfo):\n return\n self.h += self.fonts[self.f]._width_of(char)\n\n @_dispatch(min=128, max=131, state=_dvistate.inpage, args=('olen1',))\n def _set_char(self, char):\n self._put_char_real(char)\n if isinstance(self.fonts[self.f], cbook._ExceptionInfo):\n return\n self.h += self.fonts[self.f]._width_of(char)\n\n @_dispatch(132, state=_dvistate.inpage, args=('s4', 's4'))\n def _set_rule(self, a, b):\n self._put_rule_real(a, b)\n self.h += b\n\n @_dispatch(min=133, max=136, state=_dvistate.inpage, args=('olen1',))\n def _put_char(self, char):\n self._put_char_real(char)\n\n def _put_char_real(self, char):\n font = self.fonts[self.f]\n if isinstance(font, cbook._ExceptionInfo):\n self._missing_font = font\n elif font._vf is None:\n self.text.append(Text(self.h, self.v, font, char,\n font._width_of(char)))\n else:\n scale = font._scale\n for x, y, f, g, w in font._vf[char].text:\n newf = DviFont(scale=_mul2012(scale, f._scale),\n tfm=f._tfm, texname=f.texname, vf=f._vf)\n self.text.append(Text(self.h + _mul2012(x, scale),\n self.v + _mul2012(y, scale),\n newf, g, newf._width_of(g)))\n self.boxes.extend([Box(self.h + _mul2012(x, scale),\n self.v + _mul2012(y, scale),\n _mul2012(a, scale), _mul2012(b, scale))\n for x, y, a, b in 
font._vf[char].boxes])\n\n @_dispatch(137, state=_dvistate.inpage, args=('s4', 's4'))\n def _put_rule(self, a, b):\n self._put_rule_real(a, b)\n\n def _put_rule_real(self, a, b):\n if a > 0 and b > 0:\n self.boxes.append(Box(self.h, self.v, a, b))\n\n @_dispatch(138)\n def _nop(self, _):\n pass\n\n @_dispatch(139, state=_dvistate.outer, args=('s4',)*11)\n def _bop(self, c0, c1, c2, c3, c4, c5, c6, c7, c8, c9, p):\n self.state = _dvistate.inpage\n self.h = self.v = self.w = self.x = self.y = self.z = 0\n self.stack = []\n self.text = [] # list of Text objects\n self.boxes = [] # list of Box objects\n\n @_dispatch(140, state=_dvistate.inpage)\n def _eop(self, _):\n self.state = _dvistate.outer\n del self.h, self.v, self.w, self.x, self.y, self.z, self.stack\n\n @_dispatch(141, state=_dvistate.inpage)\n def _push(self, _):\n self.stack.append((self.h, self.v, self.w, self.x, self.y, self.z))\n\n @_dispatch(142, state=_dvistate.inpage)\n def _pop(self, _):\n self.h, self.v, self.w, self.x, self.y, self.z = self.stack.pop()\n\n @_dispatch(min=143, max=146, state=_dvistate.inpage, args=('slen1',))\n def _right(self, b):\n self.h += b\n\n @_dispatch(min=147, max=151, state=_dvistate.inpage, args=('slen',))\n def _right_w(self, new_w):\n if new_w is not None:\n self.w = new_w\n self.h += self.w\n\n @_dispatch(min=152, max=156, state=_dvistate.inpage, args=('slen',))\n def _right_x(self, new_x):\n if new_x is not None:\n self.x = new_x\n self.h += self.x\n\n @_dispatch(min=157, max=160, state=_dvistate.inpage, args=('slen1',))\n def _down(self, a):\n self.v += a\n\n @_dispatch(min=161, max=165, state=_dvistate.inpage, args=('slen',))\n def _down_y(self, new_y):\n if new_y is not None:\n self.y = new_y\n self.v += self.y\n\n @_dispatch(min=166, max=170, state=_dvistate.inpage, args=('slen',))\n def _down_z(self, new_z):\n if new_z is not None:\n self.z = new_z\n self.v += self.z\n\n @_dispatch(min=171, max=234, state=_dvistate.inpage)\n def _fnt_num_immediate(self, k):\n 
self.f = k\n\n @_dispatch(min=235, max=238, state=_dvistate.inpage, args=('olen1',))\n def _fnt_num(self, new_f):\n self.f = new_f\n\n @_dispatch(min=239, max=242, args=('ulen1',))\n def _xxx(self, datalen):\n special = self.file.read(datalen)\n _log.debug(\n 'Dvi._xxx: encountered special: %s',\n ''.join([chr(ch) if 32 <= ch < 127 else '<%02x>' % ch\n for ch in special]))\n\n @_dispatch(min=243, max=246, args=('olen1', 'u4', 'u4', 'u4', 'u1', 'u1'))\n def _fnt_def(self, k, c, s, d, a, l):\n self._fnt_def_real(k, c, s, d, a, l)\n\n def _fnt_def_real(self, k, c, s, d, a, l):\n n = self.file.read(a + l)\n fontname = n[-l:].decode('ascii')\n try:\n tfm = _tfmfile(fontname)\n except FileNotFoundError as exc:\n # Explicitly allow defining missing fonts for Vf support; we only\n # register an error when trying to load a glyph from a missing font\n # and throw that error in Dvi._read. For Vf, _finalize_packet\n # checks whether a missing glyph has been used, and in that case\n # skips the glyph definition.\n self.fonts[k] = cbook._ExceptionInfo.from_exception(exc)\n return\n if c != 0 and tfm.checksum != 0 and c != tfm.checksum:\n raise ValueError(f'tfm checksum mismatch: {n}')\n try:\n vf = _vffile(fontname)\n except FileNotFoundError:\n vf = None\n self.fonts[k] = DviFont(scale=s, tfm=tfm, texname=n, vf=vf)\n\n @_dispatch(247, state=_dvistate.pre, args=('u1', 'u4', 'u4', 'u4', 'u1'))\n def _pre(self, i, num, den, mag, k):\n self.file.read(k) # comment in the dvi file\n if i != 2:\n raise ValueError(f"Unknown dvi format {i}")\n if num != 25400000 or den != 7227 * 2**16:\n raise ValueError("Nonstandard units in dvi file")\n # meaning: TeX always uses those exact values, so it\n # should be enough for us to support those\n # (There are 72.27 pt to an inch so 7227 pt =\n # 7227 * 2**16 sp to 100 in. 
The numerator is multiplied\n # by 10^5 to get units of 10**-7 meters.)\n if mag != 1000:\n raise ValueError("Nonstandard magnification in dvi file")\n # meaning: LaTeX seems to frown on setting \mag, so\n # I think we can assume this is constant\n self.state = _dvistate.outer\n\n @_dispatch(248, state=_dvistate.outer)\n def _post(self, _):\n self.state = _dvistate.post_post\n # TODO: actually read the postamble and finale?\n # currently post_post just triggers closing the file\n\n @_dispatch(249)\n def _post_post(self, _):\n raise NotImplementedError\n\n @_dispatch(min=250, max=255)\n def _malformed(self, offset):\n raise ValueError(f"unknown command: byte {250 + offset}")\n\n\nclass DviFont:\n """\n Encapsulation of a font that a DVI file can refer to.\n\n This class holds a font's texname and size, supports comparison,\n and knows the widths of glyphs in the same units as the AFM file.\n There are also internal attributes (for use by dviread.py) that\n are *not* used for comparison.\n\n The size is in Adobe points (converted from TeX points).\n\n Parameters\n ----------\n scale : float\n Factor by which the font is scaled from its natural size.\n tfm : Tfm\n TeX font metrics for this font\n texname : bytes\n Name of the font as used internally by TeX and friends, as an ASCII\n bytestring. 
This is usually very different from any external font\n names; `PsfontsMap` can be used to find the external name of the font.\n vf : Vf\n A TeX "virtual font" file, or None if this font is not virtual.\n\n Attributes\n ----------\n texname : bytes\n size : float\n Size of the font in Adobe points, converted from the slightly\n smaller TeX points.\n widths : list\n Widths of glyphs in glyph-space units, typically 1/1000ths of\n the point size.\n\n """\n __slots__ = ('texname', 'size', 'widths', '_scale', '_vf', '_tfm')\n\n def __init__(self, scale, tfm, texname, vf):\n _api.check_isinstance(bytes, texname=texname)\n self._scale = scale\n self._tfm = tfm\n self.texname = texname\n self._vf = vf\n self.size = scale * (72.0 / (72.27 * 2**16))\n try:\n nchars = max(tfm.width) + 1\n except ValueError:\n nchars = 0\n self.widths = [(1000*tfm.width.get(char, 0)) >> 20\n for char in range(nchars)]\n\n def __eq__(self, other):\n return (type(self) is type(other)\n and self.texname == other.texname and self.size == other.size)\n\n def __ne__(self, other):\n return not self.__eq__(other)\n\n def __repr__(self):\n return f"<{type(self).__name__}: {self.texname}>"\n\n def _width_of(self, char):\n """Width of char in dvi units."""\n width = self._tfm.width.get(char, None)\n if width is not None:\n return _mul2012(width, self._scale)\n _log.debug('No width for char %d in font %s.', char, self.texname)\n return 0\n\n def _height_depth_of(self, char):\n """Height and depth of char in dvi units."""\n result = []\n for metric, name in ((self._tfm.height, "height"),\n (self._tfm.depth, "depth")):\n value = metric.get(char, None)\n if value is None:\n _log.debug('No %s for char %d in font %s',\n name, char, self.texname)\n result.append(0)\n else:\n result.append(_mul2012(value, self._scale))\n # cmsyXX (symbols font) glyph 0 ("minus") has a nonzero descent\n # so that TeX aligns equations properly\n # (https://tex.stackexchange.com/q/526103/)\n # but we actually care about the 
rasterization depth to align\n # the dvipng-generated images.\n if re.match(br'^cmsy\d+$', self.texname) and char == 0:\n result[-1] = 0\n return result\n\n\nclass Vf(Dvi):\n r"""\n A virtual font (\*.vf file) containing subroutines for dvi files.\n\n Parameters\n ----------\n filename : str or path-like\n\n Notes\n -----\n The virtual font format is a derivative of dvi:\n http://mirrors.ctan.org/info/knuth/virtual-fonts\n This class reuses some of the machinery of `Dvi`\n but replaces the `_read` loop and dispatch mechanism.\n\n Examples\n --------\n ::\n\n vf = Vf(filename)\n glyph = vf[code]\n glyph.text, glyph.boxes, glyph.width\n """\n\n def __init__(self, filename):\n super().__init__(filename, 0)\n try:\n self._first_font = None\n self._chars = {}\n self._read()\n finally:\n self.close()\n\n def __getitem__(self, code):\n return self._chars[code]\n\n def _read(self):\n """\n Read one page from the file. Return True if successful,\n False if there were no more pages.\n """\n packet_char = packet_ends = None\n packet_len = packet_width = None\n while True:\n byte = self.file.read(1)[0]\n # If we are in a packet, execute the dvi instructions\n if self.state is _dvistate.inpage:\n byte_at = self.file.tell()-1\n if byte_at == packet_ends:\n self._finalize_packet(packet_char, packet_width)\n packet_len = packet_char = packet_width = None\n # fall through to out-of-packet code\n elif byte_at > packet_ends:\n raise ValueError("Packet length mismatch in vf file")\n else:\n if byte in (139, 140) or byte >= 243:\n raise ValueError(f"Inappropriate opcode {byte} in vf file")\n Dvi._dtable[byte](self, byte)\n continue\n\n # We are outside a packet\n if byte < 242: # a short packet (length given by byte)\n packet_len = byte\n packet_char = self._read_arg(1)\n packet_width = self._read_arg(3)\n packet_ends = self._init_packet(byte)\n self.state = _dvistate.inpage\n elif byte == 242: # a long packet\n packet_len = self._read_arg(4)\n packet_char = self._read_arg(4)\n 
packet_width = self._read_arg(4)\n self._init_packet(packet_len)\n elif 243 <= byte <= 246:\n k = self._read_arg(byte - 242, byte == 246)\n c = self._read_arg(4)\n s = self._read_arg(4)\n d = self._read_arg(4)\n a = self._read_arg(1)\n l = self._read_arg(1)\n self._fnt_def_real(k, c, s, d, a, l)\n if self._first_font is None:\n self._first_font = k\n elif byte == 247: # preamble\n i = self._read_arg(1)\n k = self._read_arg(1)\n x = self.file.read(k)\n cs = self._read_arg(4)\n ds = self._read_arg(4)\n self._pre(i, x, cs, ds)\n elif byte == 248: # postamble (just some number of 248s)\n break\n else:\n raise ValueError(f"Unknown vf opcode {byte}")\n\n def _init_packet(self, pl):\n if self.state != _dvistate.outer:\n raise ValueError("Misplaced packet in vf file")\n self.h = self.v = self.w = self.x = self.y = self.z = 0\n self.stack = []\n self.text = []\n self.boxes = []\n self.f = self._first_font\n self._missing_font = None\n return self.file.tell() + pl\n\n def _finalize_packet(self, packet_char, packet_width):\n if not self._missing_font: # Otherwise we don't have full glyph definition.\n self._chars[packet_char] = Page(\n text=self.text, boxes=self.boxes, width=packet_width,\n height=None, descent=None)\n self.state = _dvistate.outer\n\n def _pre(self, i, x, cs, ds):\n if self.state is not _dvistate.pre:\n raise ValueError("pre command in middle of vf file")\n if i != 202:\n raise ValueError(f"Unknown vf format {i}")\n if len(x):\n _log.debug('vf file comment: %s', x)\n self.state = _dvistate.outer\n # cs = checksum, ds = design size\n\n\ndef _mul2012(num1, num2):\n """Multiply two numbers in 20.12 fixed point format."""\n # Separated into a function because >> has surprising precedence\n return (num1*num2) >> 20\n\n\nclass Tfm:\n """\n A TeX Font Metric file.\n\n This implementation covers only the bare minimum needed by the Dvi class.\n\n Parameters\n ----------\n filename : str or path-like\n\n Attributes\n ----------\n checksum : int\n Used for verifying 
against the dvi file.\n design_size : int\n Design size of the font (unknown units)\n width, height, depth : dict\n Dimensions of each character, need to be scaled by the factor\n specified in the dvi file. These are dicts because indexing may\n not start from 0.\n """\n __slots__ = ('checksum', 'design_size', 'width', 'height', 'depth')\n\n def __init__(self, filename):\n _log.debug('opening tfm file %s', filename)\n with open(filename, 'rb') as file:\n header1 = file.read(24)\n lh, bc, ec, nw, nh, nd = struct.unpack('!6H', header1[2:14])\n _log.debug('lh=%d, bc=%d, ec=%d, nw=%d, nh=%d, nd=%d',\n lh, bc, ec, nw, nh, nd)\n header2 = file.read(4*lh)\n self.checksum, self.design_size = struct.unpack('!2I', header2[:8])\n # there is also encoding information etc.\n char_info = file.read(4*(ec-bc+1))\n widths = struct.unpack(f'!{nw}i', file.read(4*nw))\n heights = struct.unpack(f'!{nh}i', file.read(4*nh))\n depths = struct.unpack(f'!{nd}i', file.read(4*nd))\n self.width = {}\n self.height = {}\n self.depth = {}\n for idx, char in enumerate(range(bc, ec+1)):\n byte0 = char_info[4*idx]\n byte1 = char_info[4*idx+1]\n self.width[char] = widths[byte0]\n self.height[char] = heights[byte1 >> 4]\n self.depth[char] = depths[byte1 & 0xf]\n\n\nPsFont = namedtuple('PsFont', 'texname psname effects encoding filename')\n\n\nclass PsfontsMap:\n """\n A psfonts.map formatted file, mapping TeX fonts to PS fonts.\n\n Parameters\n ----------\n filename : str or path-like\n\n Notes\n -----\n For historical reasons, TeX knows many Type-1 fonts by different\n names than the outside world. (For one thing, the names have to\n fit in eight characters.) Also, TeX's native fonts are not Type-1\n but Metafont, which is nontrivial to convert to PostScript except\n as a bitmap. While high-quality conversions to Type-1 format exist\n and are shipped with modern TeX distributions, we need to know\n which Type-1 fonts are the counterparts of which native fonts. 
For\n    these reasons a mapping is needed from internal font names to font\n    file names.\n\n    A texmf tree typically includes mapping files called e.g.\n    :file:`psfonts.map`, :file:`pdftex.map`, or :file:`dvipdfm.map`.\n    The file :file:`psfonts.map` is used by :program:`dvips`,\n    :file:`pdftex.map` by :program:`pdfTeX`, and :file:`dvipdfm.map`\n    by :program:`dvipdfm`. :file:`psfonts.map` might avoid embedding\n    the 35 PostScript fonts (i.e., have no filename for them, as in\n    the Times-Bold example below), while the pdf-related files perhaps\n    only avoid the "Base 14" pdf fonts. But the user may have\n    configured these files differently.\n\n    Examples\n    --------\n    >>> map = PsfontsMap(find_tex_file('pdftex.map'))\n    >>> entry = map[b'ptmbo8r']\n    >>> entry.texname\n    b'ptmbo8r'\n    >>> entry.psname\n    b'Times-Bold'\n    >>> entry.encoding\n    '/usr/local/texlive/2008/texmf-dist/fonts/enc/dvips/base/8r.enc'\n    >>> entry.effects\n    {'slant': 0.16700000000000001}\n    >>> entry.filename\n    """\n    __slots__ = ('_filename', '_unparsed', '_parsed')\n\n    # Create a filename -> PsfontsMap cache, so that calling\n    # `PsfontsMap(filename)` with the same filename a second time immediately\n    # returns the same object.\n    @lru_cache\n    def __new__(cls, filename):\n        self = object.__new__(cls)\n        self._filename = os.fsdecode(filename)\n        # Some TeX distributions have enormous pdftex.map files which would\n        # take hundreds of milliseconds to parse, but it is easy enough to just\n        # store the unparsed lines (keyed by the first word, which is the\n        # texname) and parse them on-demand.\n        with open(filename, 'rb') as file:\n            self._unparsed = {}\n            for line in file:\n                tfmname = line.split(b' ', 1)[0]\n                self._unparsed.setdefault(tfmname, []).append(line)\n        self._parsed = {}\n        return self\n\n    def __getitem__(self, texname):\n        assert isinstance(texname, bytes)\n        if texname in self._unparsed:\n            for line in self._unparsed.pop(texname):\n                if self._parse_and_cache_line(line):\n                    break\n        try:\n            return
self._parsed[texname]\n except KeyError:\n raise LookupError(\n f"An associated PostScript font (required by Matplotlib) "\n f"could not be found for TeX font {texname.decode('ascii')!r} "\n f"in {self._filename!r}; this problem can often be solved by "\n f"installing a suitable PostScript font package in your TeX "\n f"package manager") from None\n\n def _parse_and_cache_line(self, line):\n """\n Parse a line in the font mapping file.\n\n The format is (partially) documented at\n http://mirrors.ctan.org/systems/doc/pdftex/manual/pdftex-a.pdf\n https://tug.org/texinfohtml/dvips.html#psfonts_002emap\n Each line can have the following fields:\n\n - tfmname (first, only required field),\n - psname (defaults to tfmname, must come immediately after tfmname if\n present),\n - fontflags (integer, must come immediately after psname if present,\n ignored by us),\n - special (SlantFont and ExtendFont, only field that is double-quoted),\n - fontfile, encodingfile (optional, prefixed by <, <<, or <[; << always\n precedes a font, <[ always precedes an encoding, < can precede either\n but then an encoding file must have extension .enc; < and << also\n request different font subsetting behaviors but we ignore that; < can\n be separated from the filename by whitespace).\n\n special, fontfile, and encodingfile can appear in any order.\n """\n # If the map file specifies multiple encodings for a font, we\n # follow pdfTeX in choosing the last one specified. 
Such\n # entries are probably mistakes but they have occurred.\n # https://tex.stackexchange.com/q/10826/\n\n if not line or line.startswith((b" ", b"%", b"*", b";", b"#")):\n return\n tfmname = basename = special = encodingfile = fontfile = None\n is_subsetted = is_t1 = is_truetype = False\n matches = re.finditer(br'"([^"]*)(?:"|$)|(\S+)', line)\n for match in matches:\n quoted, unquoted = match.groups()\n if unquoted:\n if unquoted.startswith(b"<<"): # font\n fontfile = unquoted[2:]\n elif unquoted.startswith(b"<["): # encoding\n encodingfile = unquoted[2:]\n elif unquoted.startswith(b"<"): # font or encoding\n word = (\n # <foo => foo\n unquoted[1:]\n # < by itself => read the next word\n or next(filter(None, next(matches).groups())))\n if word.endswith(b".enc"):\n encodingfile = word\n else:\n fontfile = word\n is_subsetted = True\n elif tfmname is None:\n tfmname = unquoted\n elif basename is None:\n basename = unquoted\n elif quoted:\n special = quoted\n effects = {}\n if special:\n words = reversed(special.split())\n for word in words:\n if word == b"SlantFont":\n effects["slant"] = float(next(words))\n elif word == b"ExtendFont":\n effects["extend"] = float(next(words))\n\n # Verify some properties of the line that would cause it to be ignored\n # otherwise.\n if fontfile is not None:\n if fontfile.endswith((b".ttf", b".ttc")):\n is_truetype = True\n elif not fontfile.endswith(b".otf"):\n is_t1 = True\n elif basename is not None:\n is_t1 = True\n if is_truetype and is_subsetted and encodingfile is None:\n return\n if not is_t1 and ("slant" in effects or "extend" in effects):\n return\n if abs(effects.get("slant", 0)) > 1:\n return\n if abs(effects.get("extend", 0)) > 2:\n return\n\n if basename is None:\n basename = tfmname\n if encodingfile is not None:\n encodingfile = find_tex_file(encodingfile)\n if fontfile is not None:\n fontfile = find_tex_file(fontfile)\n self._parsed[tfmname] = PsFont(\n texname=tfmname, psname=basename, effects=effects,\n 
encoding=encodingfile, filename=fontfile)\n return True\n\n\ndef _parse_enc(path):\n r"""\n Parse a \*.enc file referenced from a psfonts.map style file.\n\n The format supported by this function is a tiny subset of PostScript.\n\n Parameters\n ----------\n path : `os.PathLike`\n\n Returns\n -------\n list\n The nth entry of the list is the PostScript glyph name of the nth\n glyph.\n """\n no_comments = re.sub("%.*", "", Path(path).read_text(encoding="ascii"))\n array = re.search(r"(?s)\[(.*)\]", no_comments).group(1)\n lines = [line for line in array.split() if line]\n if all(line.startswith("/") for line in lines):\n return [line[1:] for line in lines]\n else:\n raise ValueError(f"Failed to parse {path} as Postscript encoding")\n\n\nclass _LuatexKpsewhich:\n @lru_cache # A singleton.\n def __new__(cls):\n self = object.__new__(cls)\n self._proc = self._new_proc()\n return self\n\n def _new_proc(self):\n return subprocess.Popen(\n ["luatex", "--luaonly",\n str(cbook._get_data_path("kpsewhich.lua"))],\n stdin=subprocess.PIPE, stdout=subprocess.PIPE)\n\n def search(self, filename):\n if self._proc.poll() is not None: # Dead, restart it.\n self._proc = self._new_proc()\n self._proc.stdin.write(os.fsencode(filename) + b"\n")\n self._proc.stdin.flush()\n out = self._proc.stdout.readline().rstrip()\n return None if out == b"nil" else os.fsdecode(out)\n\n\n@lru_cache\ndef find_tex_file(filename):\n """\n Find a file in the texmf tree using kpathsea_.\n\n The kpathsea library, provided by most existing TeX distributions, both\n on Unix-like systems and on Windows (MikTeX), is invoked via a long-lived\n luatex process if luatex is installed, or via kpsewhich otherwise.\n\n .. 
_kpathsea: https://www.tug.org/kpathsea/\n\n Parameters\n ----------\n filename : str or path-like\n\n Raises\n ------\n FileNotFoundError\n If the file is not found.\n """\n\n # we expect these to always be ascii encoded, but use utf-8\n # out of caution\n if isinstance(filename, bytes):\n filename = filename.decode('utf-8', errors='replace')\n\n try:\n lk = _LuatexKpsewhich()\n except FileNotFoundError:\n lk = None # Fallback to directly calling kpsewhich, as below.\n\n if lk:\n path = lk.search(filename)\n else:\n if sys.platform == 'win32':\n # On Windows only, kpathsea can use utf-8 for cmd args and output.\n # The `command_line_encoding` environment variable is set to force\n # it to always use utf-8 encoding. See Matplotlib issue #11848.\n kwargs = {'env': {**os.environ, 'command_line_encoding': 'utf-8'},\n 'encoding': 'utf-8'}\n else: # On POSIX, run through the equivalent of os.fsdecode().\n kwargs = {'encoding': sys.getfilesystemencoding(),\n 'errors': 'surrogateescape'}\n\n try:\n path = (cbook._check_and_log_subprocess(['kpsewhich', filename],\n _log, **kwargs)\n .rstrip('\n'))\n except (FileNotFoundError, RuntimeError):\n path = None\n\n if path:\n return path\n else:\n raise FileNotFoundError(\n f"Matplotlib's TeX implementation searched for a file named "\n f"{filename!r} in your texmf tree, but could not find it")\n\n\n@lru_cache\ndef _fontfile(cls, suffix, texname):\n return cls(find_tex_file(texname + suffix))\n\n\n_tfmfile = partial(_fontfile, Tfm, ".tfm")\n_vffile = partial(_fontfile, Vf, ".vf")\n\n\nif __name__ == '__main__':\n from argparse import ArgumentParser\n import itertools\n\n parser = ArgumentParser()\n parser.add_argument("filename")\n parser.add_argument("dpi", nargs="?", type=float, default=None)\n args = parser.parse_args()\n with Dvi(args.filename, args.dpi) as dvi:\n fontmap = PsfontsMap(find_tex_file('pdftex.map'))\n for page in dvi:\n print(f"=== new page === "\n f"(w: {page.width}, h: {page.height}, d: {page.descent})")\n for 
font, group in itertools.groupby(\n page.text, lambda text: text.font):\n print(f"font: {font.texname.decode('latin-1')!r}\t"\n f"scale: {font._scale / 2 ** 20}")\n print("x", "y", "glyph", "chr", "w", "(glyphs)", sep="\t")\n for text in group:\n print(text.x, text.y, text.glyph,\n chr(text.glyph) if chr(text.glyph).isprintable()\n else ".",\n text.width, sep="\t")\n if page.boxes:\n print("x", "y", "h", "w", "", "(boxes)", sep="\t")\n for box in page.boxes:\n print(box.x, box.y, box.height, box.width, sep="\t")\n | .venv\Lib\site-packages\matplotlib\dviread.py | dviread.py | Python | 42,590 | 0.95 | 0.206321 | 0.105691 | node-utils | 317 | 2024-07-31T23:15:30.371934 | BSD-3-Clause | false | 6e779d21801a1c1d74984e14848af6b7 |
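The `_mul2012` helper in the row above multiplies TeX fixed-point quantities by shifting the 40-bit fraction of the product back down by 20 bits, i.e. one unit is `1 << 20`. A minimal sketch of the same arithmetic, with round-trip conversion helpers; the names `ONE`, `to_fix`, `from_fix`, and `mul_fix` are illustrative and not part of the original module:

```python
ONE = 1 << 20  # 1.0 in TeX's fixed-point convention (20 fractional bits)

def to_fix(x: float) -> int:
    """Convert a float to fixed point (rounding, for illustration only)."""
    return round(x * ONE)

def from_fix(n: int) -> float:
    """Convert a fixed-point integer back to a float."""
    return n / ONE

def mul_fix(a: int, b: int) -> int:
    # Same shape as _mul2012: the product carries 40 fractional bits,
    # so shift 20 of them back off to stay in the same format.
    return (a * b) >> 20

# 1.5 * 2.0 == 3.0, computed entirely in fixed point
result = from_fix(mul_fix(to_fix(1.5), to_fix(2.0)))
```

This also shows why `_mul2012` is a separate function: without the parentheses around `num1*num2`, the low precedence of `>>` would silently change the result.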
from pathlib import Path\nimport io\nimport os\nfrom enum import Enum\nfrom collections.abc import Generator\n\nfrom typing import NamedTuple\nfrom typing_extensions import Self # < Py 3.11\n\nclass _dvistate(Enum):\n pre = ...\n outer = ...\n inpage = ...\n post_post = ...\n finale = ...\n\nclass Page(NamedTuple):\n text: list[Text]\n boxes: list[Box]\n height: int\n width: int\n descent: int\n\nclass Box(NamedTuple):\n x: int\n y: int\n height: int\n width: int\n\nclass Text(NamedTuple):\n x: int\n y: int\n font: DviFont\n glyph: int\n width: int\n @property\n def font_path(self) -> Path: ...\n @property\n def font_size(self) -> float: ...\n @property\n def font_effects(self) -> dict[str, float]: ...\n @property\n def glyph_name_or_index(self) -> int | str: ...\n\nclass Dvi:\n file: io.BufferedReader\n dpi: float | None\n fonts: dict[int, DviFont]\n state: _dvistate\n def __init__(self, filename: str | os.PathLike, dpi: float | None) -> None: ...\n def __enter__(self) -> Self: ...\n def __exit__(self, etype, evalue, etrace) -> None: ...\n def __iter__(self) -> Generator[Page, None, None]: ...\n def close(self) -> None: ...\n\nclass DviFont:\n texname: bytes\n size: float\n widths: list[int]\n def __init__(\n self, scale: float, tfm: Tfm, texname: bytes, vf: Vf | None\n ) -> None: ...\n def __eq__(self, other: object) -> bool: ...\n def __ne__(self, other: object) -> bool: ...\n\nclass Vf(Dvi):\n def __init__(self, filename: str | os.PathLike) -> None: ...\n def __getitem__(self, code: int) -> Page: ...\n\nclass Tfm:\n checksum: int\n design_size: int\n width: dict[int, int]\n height: dict[int, int]\n depth: dict[int, int]\n def __init__(self, filename: str | os.PathLike) -> None: ...\n\nclass PsFont(NamedTuple):\n texname: bytes\n psname: bytes\n effects: dict[str, float]\n encoding: None | bytes\n filename: str\n\nclass PsfontsMap:\n def __new__(cls, filename: str | os.PathLike) -> Self: ...\n def __getitem__(self, texname: bytes) -> PsFont: ...\n\ndef 
find_tex_file(filename: str | os.PathLike) -> str: ...\n | .venv\Lib\site-packages\matplotlib\dviread.pyi | dviread.pyi | Other | 2,139 | 0.95 | 0.314607 | 0 | react-lib | 157 | 2023-10-08T05:43:53.871640 | Apache-2.0 | false | 4ffafc63e653fac19bffa88aa00775cc |
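The `Tfm` stub above exposes `width`, `height`, and `depth` as dicts keyed by character code; in the implementation each 4-byte `char_info` record packs the width index in byte 0 and the height/depth indices in the high and low nibbles of byte 1. A minimal sketch of that decoding, under the assumption that only those two bytes matter (the `decode_char_info` helper is illustrative, not a matplotlib API):

```python
def decode_char_info(char_info: bytes, bc: int) -> dict[int, tuple[int, int, int]]:
    """
    Map each character code (starting at *bc*, the first code in the tfm
    file) to its (width_index, height_index, depth_index) triple.
    """
    out = {}
    for idx in range(len(char_info) // 4):
        byte0 = char_info[4 * idx]       # width index
        byte1 = char_info[4 * idx + 1]   # height index (high nibble), depth index (low nibble)
        out[bc + idx] = (byte0, byte1 >> 4, byte1 & 0xF)
    return out
```

The resulting indices are then looked up in the `widths`/`heights`/`depths` arrays unpacked from the rest of the file, which is why the attributes are dicts rather than lists: character codes need not start at 0.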
from collections.abc import Callable, Hashable, Iterable, Sequence\nimport os\nfrom typing import Any, IO, Literal, TypeVar, overload\n\nimport numpy as np\nfrom numpy.typing import ArrayLike\n\nfrom matplotlib.artist import Artist\nfrom matplotlib.axes import Axes\nfrom matplotlib.backend_bases import (\n FigureCanvasBase,\n MouseButton,\n MouseEvent,\n RendererBase,\n)\nfrom matplotlib.colors import Colormap, Normalize\nfrom matplotlib.colorbar import Colorbar\nfrom matplotlib.colorizer import ColorizingArtist, Colorizer\nfrom matplotlib.cm import ScalarMappable\nfrom matplotlib.gridspec import GridSpec, SubplotSpec, SubplotParams as SubplotParams\nfrom matplotlib.image import _ImageBase, FigureImage\nfrom matplotlib.layout_engine import LayoutEngine\nfrom matplotlib.legend import Legend\nfrom matplotlib.lines import Line2D\nfrom matplotlib.patches import Rectangle, Patch\nfrom matplotlib.text import Text\nfrom matplotlib.transforms import Affine2D, Bbox, BboxBase, Transform\nfrom .typing import ColorType, HashableList\n\n_T = TypeVar("_T")\n\nclass FigureBase(Artist):\n artists: list[Artist]\n lines: list[Line2D]\n patches: list[Patch]\n texts: list[Text]\n images: list[_ImageBase]\n legends: list[Legend]\n subfigs: list[SubFigure]\n stale: bool\n suppressComposite: bool | None\n def __init__(self, **kwargs) -> None: ...\n def autofmt_xdate(\n self,\n bottom: float = ...,\n rotation: int = ...,\n ha: Literal["left", "center", "right"] = ...,\n which: Literal["major", "minor", "both"] = ...,\n ) -> None: ...\n def get_children(self) -> list[Artist]: ...\n def contains(self, mouseevent: MouseEvent) -> tuple[bool, dict[Any, Any]]: ...\n def suptitle(self, t: str, **kwargs) -> Text: ...\n def get_suptitle(self) -> str: ...\n def supxlabel(self, t: str, **kwargs) -> Text: ...\n def get_supxlabel(self) -> str: ...\n def supylabel(self, t: str, **kwargs) -> Text: ...\n def get_supylabel(self) -> str: ...\n def get_edgecolor(self) -> ColorType: ...\n def 
get_facecolor(self) -> ColorType: ...\n def get_frameon(self) -> bool: ...\n def set_linewidth(self, linewidth: float) -> None: ...\n def get_linewidth(self) -> float: ...\n def set_edgecolor(self, color: ColorType) -> None: ...\n def set_facecolor(self, color: ColorType) -> None: ...\n @overload\n def get_figure(self, root: Literal[True]) -> Figure: ...\n @overload\n def get_figure(self, root: Literal[False]) -> Figure | SubFigure: ...\n @overload\n def get_figure(self, root: bool = ...) -> Figure | SubFigure: ...\n def set_frameon(self, b: bool) -> None: ...\n @property\n def frameon(self) -> bool: ...\n @frameon.setter\n def frameon(self, b: bool) -> None: ...\n def add_artist(self, artist: Artist, clip: bool = ...) -> Artist: ...\n @overload\n def add_axes(self, ax: Axes) -> Axes: ...\n @overload\n def add_axes(\n self,\n rect: tuple[float, float, float, float],\n projection: None | str = ...,\n polar: bool = ...,\n **kwargs\n ) -> Axes: ...\n\n # TODO: docstring indicates SubplotSpec a valid arg, but none of the listed signatures appear to be that\n @overload\n def add_subplot(\n self, nrows: int, ncols: int, index: int | tuple[int, int], **kwargs\n ) -> Axes: ...\n @overload\n def add_subplot(self, pos: int, **kwargs) -> Axes: ...\n @overload\n def add_subplot(self, ax: Axes, **kwargs) -> Axes: ...\n @overload\n def add_subplot(self, ax: SubplotSpec, **kwargs) -> Axes: ...\n @overload\n def add_subplot(self, **kwargs) -> Axes: ...\n @overload\n def subplots(\n self,\n nrows: Literal[1] = ...,\n ncols: Literal[1] = ...,\n *,\n sharex: bool | Literal["none", "all", "row", "col"] = ...,\n sharey: bool | Literal["none", "all", "row", "col"] = ...,\n squeeze: Literal[True] = ...,\n width_ratios: Sequence[float] | None = ...,\n height_ratios: Sequence[float] | None = ...,\n subplot_kw: dict[str, Any] | None = ...,\n gridspec_kw: dict[str, Any] | None = ...,\n ) -> Axes: ...\n @overload\n def subplots(\n self,\n nrows: int = ...,\n ncols: int = ...,\n *,\n sharex: 
bool | Literal["none", "all", "row", "col"] = ...,\n sharey: bool | Literal["none", "all", "row", "col"] = ...,\n squeeze: Literal[False],\n width_ratios: Sequence[float] | None = ...,\n height_ratios: Sequence[float] | None = ...,\n subplot_kw: dict[str, Any] | None = ...,\n gridspec_kw: dict[str, Any] | None = ...,\n ) -> np.ndarray: ... # TODO numpy/numpy#24738\n @overload\n def subplots(\n self,\n nrows: int = ...,\n ncols: int = ...,\n *,\n sharex: bool | Literal["none", "all", "row", "col"] = ...,\n sharey: bool | Literal["none", "all", "row", "col"] = ...,\n squeeze: bool = ...,\n width_ratios: Sequence[float] | None = ...,\n height_ratios: Sequence[float] | None = ...,\n subplot_kw: dict[str, Any] | None = ...,\n gridspec_kw: dict[str, Any] | None = ...,\n ) -> Any: ...\n def delaxes(self, ax: Axes) -> None: ...\n def clear(self, keep_observers: bool = ...) -> None: ...\n def clf(self, keep_observers: bool = ...) -> None: ...\n\n @overload\n def legend(self) -> Legend: ...\n @overload\n def legend(self, handles: Iterable[Artist], labels: Iterable[str], **kwargs) -> Legend: ...\n @overload\n def legend(self, *, handles: Iterable[Artist], **kwargs) -> Legend: ...\n @overload\n def legend(self, labels: Iterable[str], **kwargs) -> Legend: ...\n @overload\n def legend(self, **kwargs) -> Legend: ...\n\n def text(\n self,\n x: float,\n y: float,\n s: str,\n fontdict: dict[str, Any] | None = ...,\n **kwargs\n ) -> Text: ...\n def colorbar(\n self,\n mappable: ScalarMappable | ColorizingArtist,\n cax: Axes | None = ...,\n ax: Axes | Iterable[Axes] | None = ...,\n use_gridspec: bool = ...,\n **kwargs\n ) -> Colorbar: ...\n def subplots_adjust(\n self,\n left: float | None = ...,\n bottom: float | None = ...,\n right: float | None = ...,\n top: float | None = ...,\n wspace: float | None = ...,\n hspace: float | None = ...,\n ) -> None: ...\n def align_xlabels(self, axs: Iterable[Axes] | None = ...) 
-> None: ...\n def align_ylabels(self, axs: Iterable[Axes] | None = ...) -> None: ...\n def align_titles(self, axs: Iterable[Axes] | None = ...) -> None: ...\n def align_labels(self, axs: Iterable[Axes] | None = ...) -> None: ...\n def add_gridspec(self, nrows: int = ..., ncols: int = ..., **kwargs) -> GridSpec: ...\n @overload\n def subfigures(\n self,\n nrows: int = ...,\n ncols: int = ...,\n squeeze: Literal[False] = ...,\n wspace: float | None = ...,\n hspace: float | None = ...,\n width_ratios: ArrayLike | None = ...,\n height_ratios: ArrayLike | None = ...,\n **kwargs\n ) -> np.ndarray: ...\n @overload\n def subfigures(\n self,\n nrows: int = ...,\n ncols: int = ...,\n squeeze: Literal[True] = ...,\n wspace: float | None = ...,\n hspace: float | None = ...,\n width_ratios: ArrayLike | None = ...,\n height_ratios: ArrayLike | None = ...,\n **kwargs\n ) -> np.ndarray | SubFigure: ...\n def add_subfigure(self, subplotspec: SubplotSpec, **kwargs) -> SubFigure: ...\n def sca(self, a: Axes) -> Axes: ...\n def gca(self) -> Axes: ...\n def _gci(self) -> ColorizingArtist | None: ...\n def _process_projection_requirements(\n self, *, axes_class=None, polar=False, projection=None, **kwargs\n ) -> tuple[type[Axes], dict[str, Any]]: ...\n def get_default_bbox_extra_artists(self) -> list[Artist]: ...\n def get_tightbbox(\n self,\n renderer: RendererBase | None = ...,\n *,\n bbox_extra_artists: Iterable[Artist] | None = ...,\n ) -> Bbox: ...\n @overload\n def subplot_mosaic(\n self,\n mosaic: str,\n *,\n sharex: bool = ...,\n sharey: bool = ...,\n width_ratios: ArrayLike | None = ...,\n height_ratios: ArrayLike | None = ...,\n empty_sentinel: str = ...,\n subplot_kw: dict[str, Any] | None = ...,\n per_subplot_kw: dict[str | tuple[str, ...], dict[str, Any]] | None = ...,\n gridspec_kw: dict[str, Any] | None = ...,\n ) -> dict[str, Axes]: ...\n @overload\n def subplot_mosaic(\n self,\n mosaic: list[HashableList[_T]],\n *,\n sharex: bool = ...,\n sharey: bool = ...,\n 
width_ratios: ArrayLike | None = ...,\n height_ratios: ArrayLike | None = ...,\n empty_sentinel: _T = ...,\n subplot_kw: dict[str, Any] | None = ...,\n per_subplot_kw: dict[_T | tuple[_T, ...], dict[str, Any]] | None = ...,\n gridspec_kw: dict[str, Any] | None = ...,\n ) -> dict[_T, Axes]: ...\n @overload\n def subplot_mosaic(\n self,\n mosaic: list[HashableList[Hashable]],\n *,\n sharex: bool = ...,\n sharey: bool = ...,\n width_ratios: ArrayLike | None = ...,\n height_ratios: ArrayLike | None = ...,\n empty_sentinel: Any = ...,\n subplot_kw: dict[str, Any] | None = ...,\n per_subplot_kw: dict[Hashable | tuple[Hashable, ...], dict[str, Any]] | None = ...,\n gridspec_kw: dict[str, Any] | None = ...,\n ) -> dict[Hashable, Axes]: ...\n\nclass SubFigure(FigureBase):\n @property\n def figure(self) -> Figure: ...\n subplotpars: SubplotParams\n dpi_scale_trans: Affine2D\n transFigure: Transform\n bbox_relative: Bbox\n figbbox: BboxBase\n bbox: BboxBase\n transSubfigure: Transform\n patch: Rectangle\n def __init__(\n self,\n parent: Figure | SubFigure,\n subplotspec: SubplotSpec,\n *,\n facecolor: ColorType | None = ...,\n edgecolor: ColorType | None = ...,\n linewidth: float = ...,\n frameon: bool | None = ...,\n **kwargs\n ) -> None: ...\n @property\n def canvas(self) -> FigureCanvasBase: ...\n @property\n def dpi(self) -> float: ...\n @dpi.setter\n def dpi(self, value: float) -> None: ...\n def get_dpi(self) -> float: ...\n def set_dpi(self, val) -> None: ...\n def get_constrained_layout(self) -> bool: ...\n def get_constrained_layout_pads(\n self, relative: bool = ...\n ) -> tuple[float, float, float, float]: ...\n def get_layout_engine(self) -> LayoutEngine: ...\n @property # type: ignore[misc]\n def axes(self) -> list[Axes]: ... 
# type: ignore[override]\n def get_axes(self) -> list[Axes]: ...\n\nclass Figure(FigureBase):\n @property\n def figure(self) -> Figure: ...\n bbox_inches: Bbox\n dpi_scale_trans: Affine2D\n bbox: BboxBase\n figbbox: BboxBase\n transFigure: Transform\n transSubfigure: Transform\n patch: Rectangle\n subplotpars: SubplotParams\n def __init__(\n self,\n figsize: tuple[float, float] | None = ...,\n dpi: float | None = ...,\n *,\n facecolor: ColorType | None = ...,\n edgecolor: ColorType | None = ...,\n linewidth: float = ...,\n frameon: bool | None = ...,\n subplotpars: SubplotParams | None = ...,\n tight_layout: bool | dict[str, Any] | None = ...,\n constrained_layout: bool | dict[str, Any] | None = ...,\n layout: Literal["constrained", "compressed", "tight"]\n | LayoutEngine\n | None = ...,\n **kwargs\n ) -> None: ...\n def pick(self, mouseevent: MouseEvent) -> None: ...\n def set_layout_engine(\n self,\n layout: Literal["constrained", "compressed", "tight", "none"]\n | LayoutEngine\n | None = ...,\n **kwargs\n ) -> None: ...\n def get_layout_engine(self) -> LayoutEngine | None: ...\n def _repr_html_(self) -> str | None: ...\n def show(self, warn: bool = ...) -> None: ...\n @property\n def number(self) -> int | str: ...\n @number.setter\n def number(self, num: int | str) -> None: ...\n @property # type: ignore[misc]\n def axes(self) -> list[Axes]: ... 
# type: ignore[override]\n def get_axes(self) -> list[Axes]: ...\n @property\n def dpi(self) -> float: ...\n @dpi.setter\n def dpi(self, dpi: float) -> None: ...\n def get_tight_layout(self) -> bool: ...\n def get_constrained_layout_pads(\n self, relative: bool = ...\n ) -> tuple[float, float, float, float]: ...\n def get_constrained_layout(self) -> bool: ...\n canvas: FigureCanvasBase\n def set_canvas(self, canvas: FigureCanvasBase) -> None: ...\n def figimage(\n self,\n X: ArrayLike,\n xo: int = ...,\n yo: int = ...,\n alpha: float | None = ...,\n norm: str | Normalize | None = ...,\n cmap: str | Colormap | None = ...,\n vmin: float | None = ...,\n vmax: float | None = ...,\n origin: Literal["upper", "lower"] | None = ...,\n resize: bool = ...,\n *,\n colorizer: Colorizer | None = ...,\n **kwargs\n ) -> FigureImage: ...\n def set_size_inches(\n self, w: float | tuple[float, float], h: float | None = ..., forward: bool = ...\n ) -> None: ...\n def get_size_inches(self) -> np.ndarray: ...\n def get_figwidth(self) -> float: ...\n def get_figheight(self) -> float: ...\n def get_dpi(self) -> float: ...\n def set_dpi(self, val: float) -> None: ...\n def set_figwidth(self, val: float, forward: bool = ...) -> None: ...\n def set_figheight(self, val: float, forward: bool = ...) -> None: ...\n def clear(self, keep_observers: bool = ...) -> None: ...\n def draw_without_rendering(self) -> None: ...\n def draw_artist(self, a: Artist) -> None: ...\n def add_axobserver(self, func: Callable[[Figure], Any]) -> None: ...\n def savefig(\n self,\n fname: str | os.PathLike | IO,\n *,\n transparent: bool | None = ...,\n **kwargs\n ) -> None: ...\n def ginput(\n self,\n n: int = ...,\n timeout: float = ...,\n show_clicks: bool = ...,\n mouse_add: MouseButton = ...,\n mouse_pop: MouseButton = ...,\n mouse_stop: MouseButton = ...,\n ) -> list[tuple[int, int]]: ...\n def waitforbuttonpress(self, timeout: float = ...) 
-> None | bool: ...\n def tight_layout(\n self,\n *,\n pad: float = ...,\n h_pad: float | None = ...,\n w_pad: float | None = ...,\n rect: tuple[float, float, float, float] | None = ...\n ) -> None: ...\n\ndef figaspect(\n arg: float | ArrayLike,\n) -> np.ndarray[tuple[Literal[2]], np.dtype[np.float64]]: ...\n | .venv\Lib\site-packages\matplotlib\figure.pyi | figure.pyi | Other | 14,738 | 0.95 | 0.264775 | 0.05569 | python-kit | 514 | 2023-09-03T20:50:05.523938 | Apache-2.0 | false | 7b20173b57608f2c823659ac01163588 |
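The `subplots` overloads in the stub above encode a runtime behavior: with `squeeze=True` (the default) a 1x1 grid is returned as a bare object rather than an array, which is why the `Literal[True]`/`Literal[False]` overloads return different types. A hedged toy sketch of the same overload pattern, using plain lists instead of numpy arrays; `make_grid` is an illustrative stand-in, not matplotlib API:

```python
from typing import Literal, overload

@overload
def make_grid(n: int, squeeze: Literal[True] = ...) -> object: ...
@overload
def make_grid(n: int, squeeze: Literal[False]) -> list[object]: ...

def make_grid(n: int, squeeze: bool = True):
    """Mimic the squeeze behavior: a lone cell is unwrapped when squeeze=True."""
    cells = [object() for _ in range(n)]
    if squeeze and n == 1:
        return cells[0]  # single cell: hand back the bare object
    return cells         # otherwise keep the container
```

A type checker then knows that `make_grid(1, squeeze=False)` is always a list, while the default call may not be, mirroring why the stub needs three separate `subplots` signatures.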
"""\nA module for finding, managing, and using fonts across platforms.\n\nThis module provides a single `FontManager` instance, ``fontManager``, that can\nbe shared across backends and platforms. The `findfont`\nfunction returns the best TrueType (TTF) font file in the local or\nsystem font path that matches the specified `FontProperties`\ninstance. The `FontManager` also handles Adobe Font Metrics\n(AFM) font files for use by the PostScript backend.\nThe `FontManager.addfont` function adds a custom font from a file without\ninstalling it into your operating system.\n\nThe design is based on the `W3C Cascading Style Sheet, Level 1 (CSS1)\nfont specification <http://www.w3.org/TR/1998/REC-CSS2-19980512/>`_.\nFuture versions may implement the Level 2 or 2.1 specifications.\n"""\n\n# KNOWN ISSUES\n#\n# - documentation\n# - font variant is untested\n# - font stretch is incomplete\n# - font size is incomplete\n# - default font algorithm needs improvement and testing\n# - setWeights function needs improvement\n# - 'light' is an invalid weight value, remove it.\n\nfrom __future__ import annotations\n\nfrom base64 import b64encode\nimport copy\nimport dataclasses\nfrom functools import lru_cache\nimport functools\nfrom io import BytesIO\nimport json\nimport logging\nfrom numbers import Number\nimport os\nfrom pathlib import Path\nimport plistlib\nimport re\nimport subprocess\nimport sys\nimport threading\n\nimport matplotlib as mpl\nfrom matplotlib import _api, _afm, cbook, ft2font\nfrom matplotlib._fontconfig_pattern import (\n parse_fontconfig_pattern, generate_fontconfig_pattern)\nfrom matplotlib.rcsetup import _validators\n\n_log = logging.getLogger(__name__)\n\nfont_scalings = {\n 'xx-small': 0.579,\n 'x-small': 0.694,\n 'small': 0.833,\n 'medium': 1.0,\n 'large': 1.200,\n 'x-large': 1.440,\n 'xx-large': 1.728,\n 'larger': 1.2,\n 'smaller': 0.833,\n None: 1.0,\n}\nstretch_dict = {\n 'ultra-condensed': 100,\n 'extra-condensed': 200,\n 'condensed': 300,\n 
'semi-condensed': 400,\n 'normal': 500,\n 'semi-expanded': 600,\n 'semi-extended': 600,\n 'expanded': 700,\n 'extended': 700,\n 'extra-expanded': 800,\n 'extra-extended': 800,\n 'ultra-expanded': 900,\n 'ultra-extended': 900,\n}\nweight_dict = {\n 'ultralight': 100,\n 'light': 200,\n 'normal': 400,\n 'regular': 400,\n 'book': 400,\n 'medium': 500,\n 'roman': 500,\n 'semibold': 600,\n 'demibold': 600,\n 'demi': 600,\n 'bold': 700,\n 'heavy': 800,\n 'extra bold': 800,\n 'black': 900,\n}\n_weight_regexes = [\n # From fontconfig's FcFreeTypeQueryFaceInternal; not the same as\n # weight_dict!\n ("thin", 100),\n ("extralight", 200),\n ("ultralight", 200),\n ("demilight", 350),\n ("semilight", 350),\n ("light", 300), # Needs to come *after* demi/semilight!\n ("book", 380),\n ("regular", 400),\n ("normal", 400),\n ("medium", 500),\n ("demibold", 600),\n ("demi", 600),\n ("semibold", 600),\n ("extrabold", 800),\n ("superbold", 800),\n ("ultrabold", 800),\n ("bold", 700), # Needs to come *after* extra/super/ultrabold!\n ("ultrablack", 1000),\n ("superblack", 1000),\n ("extrablack", 1000),\n (r"\bultra", 1000),\n ("black", 900), # Needs to come *after* ultra/super/extrablack!\n ("heavy", 900),\n]\nfont_family_aliases = {\n 'serif',\n 'sans-serif',\n 'sans serif',\n 'cursive',\n 'fantasy',\n 'monospace',\n 'sans',\n}\n\n# OS Font paths\ntry:\n _HOME = Path.home()\nexcept Exception: # Exceptions thrown by home() are not specified...\n _HOME = Path(os.devnull) # Just an arbitrary path with no children.\nMSFolders = \\n r'Software\Microsoft\Windows\CurrentVersion\Explorer\Shell Folders'\nMSFontDirectories = [\n r'SOFTWARE\Microsoft\Windows NT\CurrentVersion\Fonts',\n r'SOFTWARE\Microsoft\Windows\CurrentVersion\Fonts']\nMSUserFontDirectories = [\n str(_HOME / 'AppData/Local/Microsoft/Windows/Fonts'),\n str(_HOME / 'AppData/Roaming/Microsoft/Windows/Fonts'),\n]\nX11FontDirectories = [\n # an old standard installation point\n "/usr/X11R6/lib/X11/fonts/TTF/",\n 
"/usr/X11/lib/X11/fonts",\n # here is the new standard location for fonts\n "/usr/share/fonts/",\n # documented as a good place to install new fonts\n "/usr/local/share/fonts/",\n # common application, not really useful\n "/usr/lib/openoffice/share/fonts/truetype/",\n # user fonts\n str((Path(os.environ.get('XDG_DATA_HOME') or _HOME / ".local/share"))\n / "fonts"),\n str(_HOME / ".fonts"),\n]\nOSXFontDirectories = [\n "/Library/Fonts/",\n "/Network/Library/Fonts/",\n "/System/Library/Fonts/",\n # fonts installed via MacPorts\n "/opt/local/share/fonts",\n # user fonts\n str(_HOME / "Library/Fonts"),\n]\n\n\ndef get_fontext_synonyms(fontext):\n """\n Return a list of file extensions that are synonyms for\n the given file extension *fileext*.\n """\n return {\n 'afm': ['afm'],\n 'otf': ['otf', 'ttc', 'ttf'],\n 'ttc': ['otf', 'ttc', 'ttf'],\n 'ttf': ['otf', 'ttc', 'ttf'],\n }[fontext]\n\n\ndef list_fonts(directory, extensions):\n """\n Return a list of all fonts matching any of the extensions, found\n recursively under the directory.\n """\n extensions = ["." + ext for ext in extensions]\n return [os.path.join(dirpath, filename)\n # os.walk ignores access errors, unlike Path.glob.\n for dirpath, _, filenames in os.walk(directory)\n for filename in filenames\n if Path(filename).suffix.lower() in extensions]\n\n\ndef win32FontDirectory():\n r"""\n Return the user-specified font directory for Win32. 
This is\n looked up from the registry key ::\n\n \\HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Explorer\Shell Folders\Fonts\n\n If the key is not found, ``%WINDIR%\Fonts`` will be returned.\n """ # noqa: E501\n import winreg\n try:\n with winreg.OpenKey(winreg.HKEY_CURRENT_USER, MSFolders) as user:\n return winreg.QueryValueEx(user, 'Fonts')[0]\n except OSError:\n return os.path.join(os.environ['WINDIR'], 'Fonts')\n\n\ndef _get_win32_installed_fonts():\n """List the font paths known to the Windows registry."""\n import winreg\n items = set()\n # Search and resolve fonts listed in the registry.\n for domain, base_dirs in [\n (winreg.HKEY_LOCAL_MACHINE, [win32FontDirectory()]), # System.\n (winreg.HKEY_CURRENT_USER, MSUserFontDirectories), # User.\n ]:\n for base_dir in base_dirs:\n for reg_path in MSFontDirectories:\n try:\n with winreg.OpenKey(domain, reg_path) as local:\n for j in range(winreg.QueryInfoKey(local)[1]):\n # value may contain the filename of the font or its\n # absolute path.\n key, value, tp = winreg.EnumValue(local, j)\n if not isinstance(value, str):\n continue\n try:\n # If the value is already an absolute path,\n # it is not changed further.\n path = Path(base_dir, value).resolve()\n except RuntimeError:\n # Don't fail with invalid entries.\n continue\n items.add(path)\n except (OSError, MemoryError):\n continue\n return items\n\n\n@lru_cache\ndef _get_fontconfig_fonts():\n """Cache and list the font paths known to ``fc-list``."""\n try:\n if b'--format' not in subprocess.check_output(['fc-list', '--help']):\n _log.warning( # fontconfig 2.7 implemented --format.\n 'Matplotlib needs fontconfig>=2.7 to query system fonts.')\n return []\n out = subprocess.check_output(['fc-list', '--format=%{file}\\n'])\n except (OSError, subprocess.CalledProcessError):\n return []\n return [Path(os.fsdecode(fname)) for fname in out.split(b'\n')]\n\n\n@lru_cache\ndef _get_macos_fonts():\n """Cache and list the font paths known to 
``system_profiler SPFontsDataType``."""\n try:\n d, = plistlib.loads(\n subprocess.check_output(["system_profiler", "-xml", "SPFontsDataType"]))\n except (OSError, subprocess.CalledProcessError, plistlib.InvalidFileException):\n return []\n return [Path(entry["path"]) for entry in d["_items"]]\n\n\ndef findSystemFonts(fontpaths=None, fontext='ttf'):\n """\n Search for fonts in the specified font paths. If no paths are\n given, a standard set of system paths is used, as well as the\n list of fonts tracked by fontconfig if fontconfig is installed and\n available. A list of TrueType fonts is returned by default, with\n AFM fonts as an option.\n """\n fontfiles = set()\n fontexts = get_fontext_synonyms(fontext)\n\n if fontpaths is None:\n if sys.platform == 'win32':\n installed_fonts = _get_win32_installed_fonts()\n fontpaths = []\n else:\n installed_fonts = _get_fontconfig_fonts()\n if sys.platform == 'darwin':\n installed_fonts += _get_macos_fonts()\n fontpaths = [*X11FontDirectories, *OSXFontDirectories]\n else:\n fontpaths = X11FontDirectories\n fontfiles.update(str(path) for path in installed_fonts\n if path.suffix.lower()[1:] in fontexts)\n\n elif isinstance(fontpaths, str):\n fontpaths = [fontpaths]\n\n for path in fontpaths:\n fontfiles.update(map(os.path.abspath, list_fonts(path, fontexts)))\n\n return [fname for fname in fontfiles if os.path.exists(fname)]\n\n\n@dataclasses.dataclass(frozen=True)\nclass FontEntry:\n """\n A class for storing font properties.\n\n It is used when populating the font lookup dictionary.\n """\n\n fname: str = ''\n name: str = ''\n style: str = 'normal'\n variant: str = 'normal'\n weight: str | int = 'normal'\n stretch: str = 'normal'\n size: str = 'medium'\n\n def _repr_html_(self) -> str:\n png_stream = self._repr_png_()\n png_b64 = b64encode(png_stream).decode()\n return f"<img src=\"data:image/png;base64, {png_b64}\" />"\n\n def _repr_png_(self) -> bytes:\n from matplotlib.figure import Figure # Circular import.\n fig = 
Figure()\n font_path = Path(self.fname) if self.fname != '' else None\n fig.text(0, 0, self.name, font=font_path)\n with BytesIO() as buf:\n fig.savefig(buf, bbox_inches='tight', transparent=True)\n return buf.getvalue()\n\n\ndef ttfFontProperty(font):\n """\n Extract information from a TrueType font file.\n\n Parameters\n ----------\n font : `.FT2Font`\n The TrueType font file from which information will be extracted.\n\n Returns\n -------\n `FontEntry`\n The extracted font properties.\n\n """\n name = font.family_name\n\n # Styles are: italic, oblique, and normal (default)\n\n sfnt = font.get_sfnt()\n mac_key = (1, # platform: macintosh\n 0, # id: roman\n 0) # langid: english\n ms_key = (3, # platform: microsoft\n 1, # id: unicode_cs\n 0x0409) # langid: english_united_states\n\n # These tables are actually mac_roman-encoded, but mac_roman support may be\n # missing in some alternative Python implementations and we are only going\n # to look for ASCII substrings, where any ASCII-compatible encoding works\n # - or big-endian UTF-16, since important Microsoft fonts use that.\n sfnt2 = (sfnt.get((*mac_key, 2), b'').decode('latin-1').lower() or\n sfnt.get((*ms_key, 2), b'').decode('utf_16_be').lower())\n sfnt4 = (sfnt.get((*mac_key, 4), b'').decode('latin-1').lower() or\n sfnt.get((*ms_key, 4), b'').decode('utf_16_be').lower())\n\n if sfnt4.find('oblique') >= 0:\n style = 'oblique'\n elif sfnt4.find('italic') >= 0:\n style = 'italic'\n elif sfnt2.find('regular') >= 0:\n style = 'normal'\n elif ft2font.StyleFlags.ITALIC in font.style_flags:\n style = 'italic'\n else:\n style = 'normal'\n\n # Variants are: small-caps and normal (default)\n\n # !!!! 
Untested\n if name.lower() in ['capitals', 'small-caps']:\n variant = 'small-caps'\n else:\n variant = 'normal'\n\n # The weight-guessing algorithm is directly translated from fontconfig\n # 2.13.1's FcFreeTypeQueryFaceInternal (fcfreetype.c).\n wws_subfamily = 22\n typographic_subfamily = 16\n font_subfamily = 2\n styles = [\n sfnt.get((*mac_key, wws_subfamily), b'').decode('latin-1'),\n sfnt.get((*mac_key, typographic_subfamily), b'').decode('latin-1'),\n sfnt.get((*mac_key, font_subfamily), b'').decode('latin-1'),\n sfnt.get((*ms_key, wws_subfamily), b'').decode('utf-16-be'),\n sfnt.get((*ms_key, typographic_subfamily), b'').decode('utf-16-be'),\n sfnt.get((*ms_key, font_subfamily), b'').decode('utf-16-be'),\n ]\n styles = [*filter(None, styles)] or [font.style_name]\n\n def get_weight(): # From fontconfig's FcFreeTypeQueryFaceInternal.\n # OS/2 table weight.\n os2 = font.get_sfnt_table("OS/2")\n if os2 and os2["version"] != 0xffff:\n return os2["usWeightClass"]\n # PostScript font info weight.\n try:\n ps_font_info_weight = (\n font.get_ps_font_info()["weight"].replace(" ", "") or "")\n except ValueError:\n pass\n else:\n for regex, weight in _weight_regexes:\n if re.fullmatch(regex, ps_font_info_weight, re.I):\n return weight\n # Style name weight.\n for style in styles:\n style = style.replace(" ", "")\n for regex, weight in _weight_regexes:\n if re.search(regex, style, re.I):\n return weight\n if ft2font.StyleFlags.BOLD in font.style_flags:\n return 700 # "bold"\n return 500 # "medium", not "regular"!\n\n weight = int(get_weight())\n\n # Stretch can be absolute and relative\n # Absolute stretches are: ultra-condensed, extra-condensed, condensed,\n # semi-condensed, normal, semi-expanded, expanded, extra-expanded,\n # and ultra-expanded.\n # Relative stretches are: wider, narrower\n # Child value is: inherit\n\n if any(word in sfnt4 for word in ['narrow', 'condensed', 'cond']):\n stretch = 'condensed'\n elif 'demi cond' in sfnt4:\n stretch = 
'semi-condensed'\n elif any(word in sfnt4 for word in ['wide', 'expanded', 'extended']):\n stretch = 'expanded'\n else:\n stretch = 'normal'\n\n # Sizes can be absolute and relative.\n # Absolute sizes are: xx-small, x-small, small, medium, large, x-large,\n # and xx-large.\n # Relative sizes are: larger, smaller\n # Length value is an absolute font size, e.g., 12pt\n # Percentage values are in 'em's. Most robust specification.\n\n if not font.scalable:\n raise NotImplementedError("Non-scalable fonts are not supported")\n size = 'scalable'\n\n return FontEntry(font.fname, name, style, variant, weight, stretch, size)\n\n\ndef afmFontProperty(fontpath, font):\n """\n Extract information from an AFM font file.\n\n Parameters\n ----------\n fontpath : str\n The filename corresponding to *font*.\n font : AFM\n The AFM font file from which information will be extracted.\n\n Returns\n -------\n `FontEntry`\n The extracted font properties.\n """\n\n name = font.get_familyname()\n fontname = font.get_fontname().lower()\n\n # Styles are: italic, oblique, and normal (default)\n\n if font.get_angle() != 0 or 'italic' in name.lower():\n style = 'italic'\n elif 'oblique' in name.lower():\n style = 'oblique'\n else:\n style = 'normal'\n\n # Variants are: small-caps and normal (default)\n\n # !!!! 
Untested\n if name.lower() in ['capitals', 'small-caps']:\n variant = 'small-caps'\n else:\n variant = 'normal'\n\n weight = font.get_weight().lower()\n if weight not in weight_dict:\n weight = 'normal'\n\n # Stretch can be absolute and relative\n # Absolute stretches are: ultra-condensed, extra-condensed, condensed,\n # semi-condensed, normal, semi-expanded, expanded, extra-expanded,\n # and ultra-expanded.\n # Relative stretches are: wider, narrower\n # Child value is: inherit\n if 'demi cond' in fontname:\n stretch = 'semi-condensed'\n elif any(word in fontname for word in ['narrow', 'cond']):\n stretch = 'condensed'\n elif any(word in fontname for word in ['wide', 'expanded', 'extended']):\n stretch = 'expanded'\n else:\n stretch = 'normal'\n\n # Sizes can be absolute and relative.\n # Absolute sizes are: xx-small, x-small, small, medium, large, x-large,\n # and xx-large.\n # Relative sizes are: larger, smaller\n # Length value is an absolute font size, e.g., 12pt\n # Percentage values are in 'em's. Most robust specification.\n\n # All AFM fonts are apparently scalable.\n\n size = 'scalable'\n\n return FontEntry(fontpath, name, style, variant, weight, stretch, size)\n\n\ndef _cleanup_fontproperties_init(init_method):\n """\n A decorator to limit the call signature to a single positional argument\n or alternatively only keyword arguments.\n\n We still accept but deprecate all other call signatures.\n\n When the deprecation expires we can switch the signature to::\n\n __init__(self, pattern=None, /, *, family=None, style=None, ...)\n\n plus a runtime check that pattern is not used alongside the\n keyword arguments. 
This eventually results in the two possible\n call signatures::\n\n FontProperties(pattern)\n FontProperties(family=..., size=..., ...)\n\n """\n @functools.wraps(init_method)\n def wrapper(self, *args, **kwargs):\n # multiple args with at least some positional ones\n if len(args) > 1 or len(args) == 1 and kwargs:\n # Note: Both cases were previously handled as individual properties.\n # Therefore, we do not mention the case of font properties here.\n _api.warn_deprecated(\n "3.10",\n message="Passing individual properties to FontProperties() "\n "positionally was deprecated in Matplotlib %(since)s and "\n "will be removed in %(removal)s. Please pass all properties "\n "via keyword arguments."\n )\n # single non-string arg -> clearly a family not a pattern\n if len(args) == 1 and not kwargs and not cbook.is_scalar_or_string(args[0]):\n # Case font-family list passed as single argument\n _api.warn_deprecated(\n "3.10",\n message="Passing family as positional argument to FontProperties() "\n "was deprecated in Matplotlib %(since)s and will be removed "\n "in %(removal)s. Please pass family names as a keyword "\n "argument."\n )\n # Note on single string arg:\n # This has been interpreted as pattern so far. We are already raising if a\n # non-pattern compatible family string was given. Therefore, we do not need\n # to warn for this case.\n return init_method(self, *args, **kwargs)\n\n return wrapper\n\n\nclass FontProperties:\n """\n A class for storing and manipulating font properties.\n\n The font properties are the six properties described in the\n `W3C Cascading Style Sheet, Level 1\n <http://www.w3.org/TR/1998/REC-CSS2-19980512/>`_ font\n specification and *math_fontfamily* for math fonts:\n\n - family: A list of font names in decreasing order of priority.\n The items may include a generic font family name, either 'sans-serif',\n 'serif', 'cursive', 'fantasy', or 'monospace'. 
In that case, the actual\n font to be used will be looked up from the associated rcParam during the\n search process in `.findfont`. Default: :rc:`font.family`\n\n - style: Either 'normal', 'italic' or 'oblique'.\n Default: :rc:`font.style`\n\n - variant: Either 'normal' or 'small-caps'.\n Default: :rc:`font.variant`\n\n - stretch: A numeric value in the range 0-1000 or one of\n 'ultra-condensed', 'extra-condensed', 'condensed',\n 'semi-condensed', 'normal', 'semi-expanded', 'expanded',\n 'extra-expanded' or 'ultra-expanded'. Default: :rc:`font.stretch`\n\n - weight: A numeric value in the range 0-1000 or one of\n 'ultralight', 'light', 'normal', 'regular', 'book', 'medium',\n 'roman', 'semibold', 'demibold', 'demi', 'bold', 'heavy',\n 'extra bold', 'black'. Default: :rc:`font.weight`\n\n - size: Either a relative value of 'xx-small', 'x-small',\n 'small', 'medium', 'large', 'x-large', 'xx-large' or an\n absolute font size, e.g., 10. Default: :rc:`font.size`\n\n - math_fontfamily: The family of fonts used to render math text.\n Supported values are: 'dejavusans', 'dejavuserif', 'cm',\n 'stix', 'stixsans' and 'custom'. Default: :rc:`mathtext.fontset`\n\n Alternatively, a font may be specified using the absolute path to a font\n file, by using the *fname* kwarg. However, in this case, it is typically\n simpler to just pass the path (as a `pathlib.Path`, not a `str`) to the\n *font* kwarg of the `.Text` object.\n\n The preferred usage of font sizes is to use the relative values,\n e.g., 'large', instead of absolute font sizes, e.g., 12. 
This\n approach allows all text sizes to be made larger or smaller based\n on the font manager's default font size.\n\n This class accepts a single positional string as a fontconfig_ pattern_,\n or alternatively individual properties as keyword arguments::\n\n FontProperties(pattern)\n FontProperties(*, family=None, style=None, variant=None, ...)\n\n This support does not depend on fontconfig; we are merely borrowing its\n pattern syntax for use here.\n\n .. _fontconfig: https://www.freedesktop.org/wiki/Software/fontconfig/\n .. _pattern:\n https://www.freedesktop.org/software/fontconfig/fontconfig-user.html\n\n Note that Matplotlib's internal font manager and fontconfig use a\n different algorithm to look up fonts, so the results of the same pattern\n may be different in Matplotlib than in other applications that use\n fontconfig.\n """\n\n @_cleanup_fontproperties_init\n def __init__(self, family=None, style=None, variant=None, weight=None,\n stretch=None, size=None,\n fname=None, # if set, it's a hardcoded filename to use\n math_fontfamily=None):\n self.set_family(family)\n self.set_style(style)\n self.set_variant(variant)\n self.set_weight(weight)\n self.set_stretch(stretch)\n self.set_file(fname)\n self.set_size(size)\n self.set_math_fontfamily(math_fontfamily)\n # Treat family as a fontconfig pattern if it is the only parameter\n # provided. 
Even in that case, call the other setters first to set\n # attributes not specified by the pattern to the rcParams defaults.\n if (isinstance(family, str)\n and style is None and variant is None and weight is None\n and stretch is None and size is None and fname is None):\n self.set_fontconfig_pattern(family)\n\n @classmethod\n def _from_any(cls, arg):\n """\n Generic constructor which can build a `.FontProperties` from any of the\n following:\n\n - a `.FontProperties`: it is passed through as is;\n - `None`: a `.FontProperties` using rc values is used;\n - an `os.PathLike`: it is used as path to the font file;\n - a `str`: it is parsed as a fontconfig pattern;\n - a `dict`: it is passed as ``**kwargs`` to `.FontProperties`.\n """\n if arg is None:\n return cls()\n elif isinstance(arg, cls):\n return arg\n elif isinstance(arg, os.PathLike):\n return cls(fname=arg)\n elif isinstance(arg, str):\n return cls(arg)\n else:\n return cls(**arg)\n\n def __hash__(self):\n l = (tuple(self.get_family()),\n self.get_slant(),\n self.get_variant(),\n self.get_weight(),\n self.get_stretch(),\n self.get_size(),\n self.get_file(),\n self.get_math_fontfamily())\n return hash(l)\n\n def __eq__(self, other):\n return hash(self) == hash(other)\n\n def __str__(self):\n return self.get_fontconfig_pattern()\n\n def get_family(self):\n """\n Return a list of individual font family names or generic family names.\n\n The font families or generic font families (which will be resolved\n from their respective rcParams when searching for a matching font) in\n the order of preference.\n """\n return self._family\n\n def get_name(self):\n """\n Return the name of the font that best matches the font properties.\n """\n return get_font(findfont(self)).family_name\n\n def get_style(self):\n """\n Return the font style. Values are: 'normal', 'italic' or 'oblique'.\n """\n return self._slant\n\n def get_variant(self):\n """\n Return the font variant. 
Values are: 'normal' or 'small-caps'.\n """\n return self._variant\n\n def get_weight(self):\n """\n Return the font weight. Options are: a numeric value in the\n range 0-1000 or one of 'light', 'normal', 'regular', 'book',\n 'medium', 'roman', 'semibold', 'demibold', 'demi', 'bold',\n 'heavy', 'extra bold', 'black'.\n """\n return self._weight\n\n def get_stretch(self):\n """\n Return the font stretch or width. Options are: 'ultra-condensed',\n 'extra-condensed', 'condensed', 'semi-condensed', 'normal',\n 'semi-expanded', 'expanded', 'extra-expanded', 'ultra-expanded'.\n """\n return self._stretch\n\n def get_size(self):\n """\n Return the font size.\n """\n return self._size\n\n def get_file(self):\n """\n Return the filename of the associated font.\n """\n return self._file\n\n def get_fontconfig_pattern(self):\n """\n Get a fontconfig_ pattern_ suitable for looking up the font as\n specified with fontconfig's ``fc-match`` utility.\n\n This support does not depend on fontconfig; we are merely borrowing its\n pattern syntax for use here.\n """\n return generate_fontconfig_pattern(self)\n\n def set_family(self, family):\n """\n Change the font family. Can be either an alias (generic name\n in CSS parlance), such as: 'serif', 'sans-serif', 'cursive',\n 'fantasy', or 'monospace', a real font name or a list of real\n font names. Real font names are not supported when\n :rc:`text.usetex` is `True`. 
Default: :rc:`font.family`\n """\n if family is None:\n family = mpl.rcParams['font.family']\n if isinstance(family, str):\n family = [family]\n self._family = family\n\n def set_style(self, style):\n """\n Set the font style.\n\n Parameters\n ----------\n style : {'normal', 'italic', 'oblique'}, default: :rc:`font.style`\n """\n if style is None:\n style = mpl.rcParams['font.style']\n _api.check_in_list(['normal', 'italic', 'oblique'], style=style)\n self._slant = style\n\n def set_variant(self, variant):\n """\n Set the font variant.\n\n Parameters\n ----------\n variant : {'normal', 'small-caps'}, default: :rc:`font.variant`\n """\n if variant is None:\n variant = mpl.rcParams['font.variant']\n _api.check_in_list(['normal', 'small-caps'], variant=variant)\n self._variant = variant\n\n def set_weight(self, weight):\n """\n Set the font weight.\n\n Parameters\n ----------\n weight : int or {'ultralight', 'light', 'normal', 'regular', 'book', \\n'medium', 'roman', 'semibold', 'demibold', 'demi', 'bold', 'heavy', \\n'extra bold', 'black'}, default: :rc:`font.weight`\n If int, must be in the range 0-1000.\n """\n if weight is None:\n weight = mpl.rcParams['font.weight']\n if weight in weight_dict:\n self._weight = weight\n return\n try:\n weight = int(weight)\n except ValueError:\n pass\n else:\n if 0 <= weight <= 1000:\n self._weight = weight\n return\n raise ValueError(f"{weight=} is invalid")\n\n def set_stretch(self, stretch):\n """\n Set the font stretch or width.\n\n Parameters\n ----------\n stretch : int or {'ultra-condensed', 'extra-condensed', 'condensed', \\n'semi-condensed', 'normal', 'semi-expanded', 'expanded', 'extra-expanded', \\n'ultra-expanded'}, default: :rc:`font.stretch`\n If int, must be in the range 0-1000.\n """\n if stretch is None:\n stretch = mpl.rcParams['font.stretch']\n if stretch in stretch_dict:\n self._stretch = stretch\n return\n try:\n stretch = int(stretch)\n except ValueError:\n pass\n else:\n if 0 <= stretch <= 1000:\n 
self._stretch = stretch\n return\n raise ValueError(f"{stretch=} is invalid")\n\n def set_size(self, size):\n """\n Set the font size.\n\n Parameters\n ----------\n size : float or {'xx-small', 'x-small', 'small', 'medium', \\n'large', 'x-large', 'xx-large'}, default: :rc:`font.size`\n If a float, the font size in points. The string values denote\n sizes relative to the default font size.\n """\n if size is None:\n size = mpl.rcParams['font.size']\n try:\n size = float(size)\n except ValueError:\n try:\n scale = font_scalings[size]\n except KeyError as err:\n raise ValueError(\n "Size is invalid. Valid font size are "\n + ", ".join(map(str, font_scalings))) from err\n else:\n size = scale * FontManager.get_default_size()\n if size < 1.0:\n _log.info('Fontsize %1.2f < 1.0 pt not allowed by FreeType. '\n 'Setting fontsize = 1 pt', size)\n size = 1.0\n self._size = size\n\n def set_file(self, file):\n """\n Set the filename of the fontfile to use. In this case, all\n other properties will be ignored.\n """\n self._file = os.fspath(file) if file is not None else None\n\n def set_fontconfig_pattern(self, pattern):\n """\n Set the properties by parsing a fontconfig_ *pattern*.\n\n This support does not depend on fontconfig; we are merely borrowing its\n pattern syntax for use here.\n """\n for key, val in parse_fontconfig_pattern(pattern).items():\n if type(val) is list:\n getattr(self, "set_" + key)(val[0])\n else:\n getattr(self, "set_" + key)(val)\n\n def get_math_fontfamily(self):\n """\n Return the name of the font family used for math text.\n\n The default font is :rc:`mathtext.fontset`.\n """\n return self._math_fontfamily\n\n def set_math_fontfamily(self, fontfamily):\n """\n Set the font family for text in math mode.\n\n If not set explicitly, :rc:`mathtext.fontset` will be used.\n\n Parameters\n ----------\n fontfamily : str\n The name of the font family.\n\n Available font families are defined in the\n :ref:`default matplotlibrc file 
<customizing-with-matplotlibrc-files>`.\n\n See Also\n --------\n .text.Text.get_math_fontfamily\n """\n if fontfamily is None:\n fontfamily = mpl.rcParams['mathtext.fontset']\n else:\n valid_fonts = _validators['mathtext.fontset'].valid.values()\n # _check_in_list() Validates the parameter math_fontfamily as\n # if it were passed to rcParams['mathtext.fontset']\n _api.check_in_list(valid_fonts, math_fontfamily=fontfamily)\n self._math_fontfamily = fontfamily\n\n def copy(self):\n """Return a copy of self."""\n return copy.copy(self)\n\n # Aliases\n set_name = set_family\n get_slant = get_style\n set_slant = set_style\n get_size_in_points = get_size\n\n\nclass _JSONEncoder(json.JSONEncoder):\n def default(self, o):\n if isinstance(o, FontManager):\n return dict(o.__dict__, __class__='FontManager')\n elif isinstance(o, FontEntry):\n d = dict(o.__dict__, __class__='FontEntry')\n try:\n # Cache paths of fonts shipped with Matplotlib relative to the\n # Matplotlib data path, which helps in the presence of venvs.\n d["fname"] = str(Path(d["fname"]).relative_to(mpl.get_data_path()))\n except ValueError:\n pass\n return d\n else:\n return super().default(o)\n\n\ndef _json_decode(o):\n cls = o.pop('__class__', None)\n if cls is None:\n return o\n elif cls == 'FontManager':\n r = FontManager.__new__(FontManager)\n r.__dict__.update(o)\n return r\n elif cls == 'FontEntry':\n if not os.path.isabs(o['fname']):\n o['fname'] = os.path.join(mpl.get_data_path(), o['fname'])\n r = FontEntry(**o)\n return r\n else:\n raise ValueError("Don't know how to deserialize __class__=%s" % cls)\n\n\ndef json_dump(data, filename):\n """\n Dump `FontManager` *data* as JSON to the file named *filename*.\n\n See Also\n --------\n json_load\n\n Notes\n -----\n File paths that are children of the Matplotlib data path (typically, fonts\n shipped with Matplotlib) are stored relative to that data path (to remain\n valid across virtualenvs).\n\n This function temporarily locks the output file to 
prevent multiple\n processes from overwriting one another's output.\n """\n try:\n with cbook._lock_path(filename), open(filename, 'w') as fh:\n json.dump(data, fh, cls=_JSONEncoder, indent=2)\n except OSError as e:\n _log.warning('Could not save font_manager cache %s', e)\n\n\ndef json_load(filename):\n """\n Load a `FontManager` from the JSON file named *filename*.\n\n See Also\n --------\n json_dump\n """\n with open(filename) as fh:\n return json.load(fh, object_hook=_json_decode)\n\n\nclass FontManager:\n """\n On import, the `FontManager` singleton instance creates a list of ttf and\n afm fonts and caches their `FontProperties`. The `FontManager.findfont`\n method does a nearest neighbor search to find the font that most closely\n matches the specification. If no good enough match is found, the default\n font is returned.\n\n Fonts added with the `FontManager.addfont` method will not persist in the\n cache; therefore, `addfont` will need to be called every time Matplotlib is\n imported. 
This method should only be used if and when a font cannot be\n installed on your operating system by other means.\n\n Notes\n -----\n The `FontManager.addfont` method must be called on the global `FontManager`\n instance.\n\n Example usage::\n\n import matplotlib.pyplot as plt\n from matplotlib import font_manager\n\n font_dirs = ["/resources/fonts"] # The path to the custom font directory.\n font_files = font_manager.findSystemFonts(fontpaths=font_dirs)\n\n for font_file in font_files:\n font_manager.fontManager.addfont(font_file)\n """\n # Increment this version number whenever the font cache data\n # format or behavior has changed and requires existing font\n # cache files to be rebuilt.\n __version__ = 390\n\n def __init__(self, size=None, weight='normal'):\n self._version = self.__version__\n\n self.__default_weight = weight\n self.default_size = size\n\n # Create list of font paths.\n paths = [cbook._get_data_path('fonts', subdir)\n for subdir in ['ttf', 'afm', 'pdfcorefonts']]\n _log.debug('font search path %s', paths)\n\n self.defaultFamily = {\n 'ttf': 'DejaVu Sans',\n 'afm': 'Helvetica'}\n\n self.afmlist = []\n self.ttflist = []\n\n # Delay the warning by 5s.\n timer = threading.Timer(5, lambda: _log.warning(\n 'Matplotlib is building the font cache; this may take a moment.'))\n timer.start()\n try:\n for fontext in ["afm", "ttf"]:\n for path in [*findSystemFonts(paths, fontext=fontext),\n *findSystemFonts(fontext=fontext)]:\n try:\n self.addfont(path)\n except OSError as exc:\n _log.info("Failed to open font file %s: %s", path, exc)\n except Exception as exc:\n _log.info("Failed to extract font properties from %s: "\n "%s", path, exc)\n finally:\n timer.cancel()\n\n def addfont(self, path):\n """\n Cache the properties of the font at *path* to make it available to the\n `FontManager`. 
The type of font is inferred from the path suffix.\n\n Parameters\n ----------\n path : str or path-like\n\n Notes\n -----\n This method is useful for adding a custom font without installing it in\n your operating system. See the `FontManager` singleton instance for\n usage and caveats about this function.\n """\n # Convert to string in case of a path as\n # afmFontProperty and FT2Font expect this\n path = os.fsdecode(path)\n if Path(path).suffix.lower() == ".afm":\n with open(path, "rb") as fh:\n font = _afm.AFM(fh)\n prop = afmFontProperty(path, font)\n self.afmlist.append(prop)\n else:\n font = ft2font.FT2Font(path)\n prop = ttfFontProperty(font)\n self.ttflist.append(prop)\n self._findfont_cached.cache_clear()\n\n @property\n def defaultFont(self):\n # Lazily evaluated (findfont then caches the result) to avoid including\n # the venv path in the json serialization.\n return {ext: self.findfont(family, fontext=ext)\n for ext, family in self.defaultFamily.items()}\n\n def get_default_weight(self):\n """\n Return the default font weight.\n """\n return self.__default_weight\n\n @staticmethod\n def get_default_size():\n """\n Return the default font size.\n """\n return mpl.rcParams['font.size']\n\n def set_default_weight(self, weight):\n """\n Set the default font weight. The initial value is 'normal'.\n """\n self.__default_weight = weight\n\n @staticmethod\n def _expand_aliases(family):\n if family in ('sans', 'sans serif'):\n family = 'sans-serif'\n return mpl.rcParams['font.' 
+ family]\n\n # Each of the scoring functions below should return a value between\n # 0.0 (perfect match) and 1.0 (terrible match)\n def score_family(self, families, family2):\n """\n Return a match score between the list of font families in\n *families* and the font family name *family2*.\n\n An exact match at the head of the list returns 0.0.\n\n A match further down the list will return between 0 and 1.\n\n No match will return 1.0.\n """\n if not isinstance(families, (list, tuple)):\n families = [families]\n elif len(families) == 0:\n return 1.0\n family2 = family2.lower()\n step = 1 / len(families)\n for i, family1 in enumerate(families):\n family1 = family1.lower()\n if family1 in font_family_aliases:\n options = [*map(str.lower, self._expand_aliases(family1))]\n if family2 in options:\n idx = options.index(family2)\n return (i + (idx / len(options))) * step\n elif family1 == family2:\n # The score should be weighted by where in the\n # list the font was found.\n return i * step\n return 1.0\n\n def score_style(self, style1, style2):\n """\n Return a match score between *style1* and *style2*.\n\n An exact match returns 0.0.\n\n A match between 'italic' and 'oblique' returns 0.1.\n\n No match returns 1.0.\n """\n if style1 == style2:\n return 0.0\n elif (style1 in ('italic', 'oblique')\n and style2 in ('italic', 'oblique')):\n return 0.1\n return 1.0\n\n def score_variant(self, variant1, variant2):\n """\n Return a match score between *variant1* and *variant2*.\n\n An exact match returns 0.0, otherwise 1.0.\n """\n if variant1 == variant2:\n return 0.0\n else:\n return 1.0\n\n def score_stretch(self, stretch1, stretch2):\n """\n Return a match score between *stretch1* and *stretch2*.\n\n The result is the absolute value of the difference between the\n CSS numeric values of *stretch1* and *stretch2*, normalized\n between 0.0 and 1.0.\n """\n try:\n stretchval1 = int(stretch1)\n except ValueError:\n stretchval1 = stretch_dict.get(stretch1, 500)\n try:\n 
stretchval2 = int(stretch2)\n except ValueError:\n stretchval2 = stretch_dict.get(stretch2, 500)\n return abs(stretchval1 - stretchval2) / 1000.0\n\n def score_weight(self, weight1, weight2):\n """\n Return a match score between *weight1* and *weight2*.\n\n The result is 0.0 if both *weight1* and *weight2* are given as strings\n and have the same value.\n\n Otherwise, the result is the absolute value of the difference between\n the CSS numeric values of *weight1* and *weight2*, normalized between\n 0.05 and 1.0.\n """\n # exact match of the weight names, e.g. weight1 == weight2 == "regular"\n if cbook._str_equal(weight1, weight2):\n return 0.0\n w1 = weight1 if isinstance(weight1, Number) else weight_dict[weight1]\n w2 = weight2 if isinstance(weight2, Number) else weight_dict[weight2]\n return 0.95 * (abs(w1 - w2) / 1000) + 0.05\n\n def score_size(self, size1, size2):\n """\n Return a match score between *size1* and *size2*.\n\n If *size2* (the size specified in the font file) is 'scalable', this\n function always returns 0.0, since any font size can be generated.\n\n Otherwise, the result is the absolute distance between *size1* and\n *size2*, normalized so that the usual range of font sizes (6pt -\n 72pt) will lie between 0.0 and 1.0.\n """\n if size2 == 'scalable':\n return 0.0\n # *size1* may still be a relative size name; convert it to points.\n try:\n sizeval1 = float(size1)\n except ValueError:\n sizeval1 = self.default_size * font_scalings[size1]\n try:\n sizeval2 = float(size2)\n except ValueError:\n return 1.0\n return abs(sizeval1 - sizeval2) / 72\n\n def findfont(self, prop, fontext='ttf', directory=None,\n fallback_to_default=True, rebuild_if_missing=True):\n """\n Find the path to the font file most closely matching the given font properties.\n\n Parameters\n ----------\n prop : str or `~matplotlib.font_manager.FontProperties`\n The font properties to search for. 
This can be either a\n `.FontProperties` object or a string defining a\n `fontconfig patterns`_.\n\n fontext : {'ttf', 'afm'}, default: 'ttf'\n The extension of the font file:\n\n - 'ttf': TrueType and OpenType fonts (.ttf, .ttc, .otf)\n - 'afm': Adobe Font Metrics (.afm)\n\n directory : str, optional\n If given, only search this directory and its subdirectories.\n\n fallback_to_default : bool\n If True, will fall back to the default font family (usually\n "DejaVu Sans" or "Helvetica") if the first lookup hard-fails.\n\n rebuild_if_missing : bool\n Whether to rebuild the font cache and search again if the first\n match appears to point to a nonexisting font (i.e., the font cache\n contains outdated entries).\n\n Returns\n -------\n str\n The filename of the best matching font.\n\n Notes\n -----\n This performs a nearest neighbor search. Each font is given a\n similarity score to the target font properties. The first font with\n the highest score is returned. If no matches below a certain\n threshold are found, the default font (usually DejaVu Sans) is\n returned.\n\n The result is cached, so subsequent lookups don't have to\n perform the O(n) nearest neighbor search.\n\n See the `W3C Cascading Style Sheet, Level 1\n <http://www.w3.org/TR/1998/REC-CSS2-19980512/>`_ documentation\n for a description of the font finding algorithm.\n\n .. 
_fontconfig patterns:\n https://www.freedesktop.org/software/fontconfig/fontconfig-user.html\n """\n # Pass the relevant rcParams (and the font manager, as `self`) to\n # _findfont_cached so to prevent using a stale cache entry after an\n # rcParam was changed.\n rc_params = tuple(tuple(mpl.rcParams[key]) for key in [\n "font.serif", "font.sans-serif", "font.cursive", "font.fantasy",\n "font.monospace"])\n ret = self._findfont_cached(\n prop, fontext, directory, fallback_to_default, rebuild_if_missing,\n rc_params)\n if isinstance(ret, cbook._ExceptionInfo):\n raise ret.to_exception()\n return ret\n\n def get_font_names(self):\n """Return the list of available fonts."""\n return list({font.name for font in self.ttflist})\n\n def _find_fonts_by_props(self, prop, fontext='ttf', directory=None,\n fallback_to_default=True, rebuild_if_missing=True):\n """\n Find the paths to the font files most closely matching the given properties.\n\n Parameters\n ----------\n prop : str or `~matplotlib.font_manager.FontProperties`\n The font properties to search for. This can be either a\n `.FontProperties` object or a string defining a\n `fontconfig patterns`_.\n\n fontext : {'ttf', 'afm'}, default: 'ttf'\n The extension of the font file:\n\n - 'ttf': TrueType and OpenType fonts (.ttf, .ttc, .otf)\n - 'afm': Adobe Font Metrics (.afm)\n\n directory : str, optional\n If given, only search this directory and its subdirectories.\n\n fallback_to_default : bool\n If True, will fall back to the default font family (usually\n "DejaVu Sans" or "Helvetica") if none of the families were found.\n\n rebuild_if_missing : bool\n Whether to rebuild the font cache and search again if the first\n match appears to point to a nonexisting font (i.e., the font cache\n contains outdated entries).\n\n Returns\n -------\n list[str]\n The paths of the fonts found.\n\n Notes\n -----\n This is an extension/wrapper of the original findfont API, which only\n returns a single font for given font properties. 
Instead, this API\n returns a list of filepaths of multiple fonts which closely match the\n given font properties. Since this internally uses the original API,\n there's no change to the logic of performing the nearest neighbor\n search. See `findfont` for more details.\n """\n\n prop = FontProperties._from_any(prop)\n\n fpaths = []\n for family in prop.get_family():\n cprop = prop.copy()\n cprop.set_family(family) # set current prop's family\n\n try:\n fpaths.append(\n self.findfont(\n cprop, fontext, directory,\n fallback_to_default=False, # don't fallback to default\n rebuild_if_missing=rebuild_if_missing,\n )\n )\n except ValueError:\n if family in font_family_aliases:\n _log.warning(\n "findfont: Generic family %r not found because "\n "none of the following families were found: %s",\n family, ", ".join(self._expand_aliases(family))\n )\n else:\n _log.warning("findfont: Font family %r not found.", family)\n\n # only add default family if no other font was found and\n # fallback_to_default is enabled\n if not fpaths:\n if fallback_to_default:\n dfamily = self.defaultFamily[fontext]\n cprop = prop.copy()\n cprop.set_family(dfamily)\n fpaths.append(\n self.findfont(\n cprop, fontext, directory,\n fallback_to_default=True,\n rebuild_if_missing=rebuild_if_missing,\n )\n )\n else:\n raise ValueError("Failed to find any font, and fallback "\n "to the default font was disabled")\n\n return fpaths\n\n @lru_cache(1024)\n def _findfont_cached(self, prop, fontext, directory, fallback_to_default,\n rebuild_if_missing, rc_params):\n\n prop = FontProperties._from_any(prop)\n\n fname = prop.get_file()\n if fname is not None:\n return fname\n\n if fontext == 'afm':\n fontlist = self.afmlist\n else:\n fontlist = self.ttflist\n\n best_score = 1e64\n best_font = None\n\n _log.debug('findfont: Matching %s.', prop)\n for font in fontlist:\n if (directory is not None and\n Path(directory) not in Path(font.fname).parents):\n continue\n # Matching family should have top priority, so 
multiply it by 10.\n score = (self.score_family(prop.get_family(), font.name) * 10\n + self.score_style(prop.get_style(), font.style)\n + self.score_variant(prop.get_variant(), font.variant)\n + self.score_weight(prop.get_weight(), font.weight)\n + self.score_stretch(prop.get_stretch(), font.stretch)\n + self.score_size(prop.get_size(), font.size))\n _log.debug('findfont: score(%s) = %s', font, score)\n if score < best_score:\n best_score = score\n best_font = font\n if score == 0:\n break\n\n if best_font is None or best_score >= 10.0:\n if fallback_to_default:\n _log.warning(\n 'findfont: Font family %s not found. Falling back to %s.',\n prop.get_family(), self.defaultFamily[fontext])\n for family in map(str.lower, prop.get_family()):\n if family in font_family_aliases:\n _log.warning(\n "findfont: Generic family %r not found because "\n "none of the following families were found: %s",\n family, ", ".join(self._expand_aliases(family)))\n default_prop = prop.copy()\n default_prop.set_family(self.defaultFamily[fontext])\n return self.findfont(default_prop, fontext, directory,\n fallback_to_default=False)\n else:\n # This return instead of raise is intentional, as we wish to\n # cache that it was not found, which will not occur if it was\n # actually raised.\n return cbook._ExceptionInfo(\n ValueError,\n f"Failed to find font {prop}, and fallback to the default font was "\n f"disabled"\n )\n else:\n _log.debug('findfont: Matching %s to %s (%r) with score of %f.',\n prop, best_font.name, best_font.fname, best_score)\n result = best_font.fname\n\n if not os.path.isfile(result):\n if rebuild_if_missing:\n _log.info(\n 'findfont: Found a missing font file. 
Rebuilding cache.')\n new_fm = _load_fontmanager(try_read_cache=False)\n # Replace self by the new fontmanager, because users may have\n # a reference to this specific instance.\n # TODO: _load_fontmanager should really be (used by) a method\n # modifying the instance in place.\n vars(self).update(vars(new_fm))\n return self.findfont(\n prop, fontext, directory, rebuild_if_missing=False)\n else:\n # This return instead of raise is intentional, as we wish to\n # cache that it was not found, which will not occur if it was\n # actually raised.\n return cbook._ExceptionInfo(ValueError, "No valid font could be found")\n\n return _cached_realpath(result)\n\n\n@lru_cache\ndef is_opentype_cff_font(filename):\n """\n Return whether the given font is a Postscript Compact Font Format Font\n embedded in an OpenType wrapper. Used by the PostScript and PDF backends\n that cannot subset these fonts.\n """\n if os.path.splitext(filename)[1].lower() == '.otf':\n with open(filename, 'rb') as fd:\n return fd.read(4) == b"OTTO"\n else:\n return False\n\n\n@lru_cache(64)\ndef _get_font(font_filepaths, hinting_factor, *, _kerning_factor, thread_id):\n first_fontpath, *rest = font_filepaths\n return ft2font.FT2Font(\n first_fontpath, hinting_factor,\n _fallback_list=[\n ft2font.FT2Font(\n fpath, hinting_factor,\n _kerning_factor=_kerning_factor\n )\n for fpath in rest\n ],\n _kerning_factor=_kerning_factor\n )\n\n\n# FT2Font objects cannot be used across fork()s because they reference the same\n# FT_Library object. 
While invalidating *all* existing FT2Fonts after a fork\n# would be too complicated to be worth it, the main way FT2Fonts get reused is\n# via the cache of _get_font, which we can empty upon forking (not on Windows,\n# which has no fork() or register_at_fork()).\nif hasattr(os, "register_at_fork"):\n os.register_at_fork(after_in_child=_get_font.cache_clear)\n\n\n@lru_cache(64)\ndef _cached_realpath(path):\n # Resolving the path avoids embedding the font twice in pdf/ps output if a\n # single font is selected using two different relative paths.\n return os.path.realpath(path)\n\n\ndef get_font(font_filepaths, hinting_factor=None):\n """\n Get an `.ft2font.FT2Font` object given a list of file paths.\n\n Parameters\n ----------\n font_filepaths : Iterable[str, Path, bytes], str, Path, bytes\n Relative or absolute paths to the font files to be used.\n\n If a single string, bytes, or `pathlib.Path`, then it will be treated\n as a list with that entry only.\n\n If more than one filepath is passed, then the returned FT2Font object\n will fall back through the fonts, in the order given, to find a needed\n glyph.\n\n Returns\n -------\n `.ft2font.FT2Font`\n\n """\n if isinstance(font_filepaths, (str, Path, bytes)):\n paths = (_cached_realpath(font_filepaths),)\n else:\n paths = tuple(_cached_realpath(fname) for fname in font_filepaths)\n\n if hinting_factor is None:\n hinting_factor = mpl.rcParams['text.hinting_factor']\n\n return _get_font(\n # must be a tuple to be cached\n paths,\n hinting_factor,\n _kerning_factor=mpl.rcParams['text.kerning_factor'],\n # also key on the thread ID to prevent segfaults with multi-threading\n thread_id=threading.get_ident()\n )\n\n\ndef _load_fontmanager(*, try_read_cache=True):\n fm_path = Path(\n mpl.get_cachedir(), f"fontlist-v{FontManager.__version__}.json")\n if try_read_cache:\n try:\n fm = json_load(fm_path)\n except Exception:\n pass\n else:\n if getattr(fm, "_version", object()) == FontManager.__version__:\n _log.debug("Using 
fontManager instance from %s", fm_path)\n return fm\n fm = FontManager()\n json_dump(fm, fm_path)\n _log.info("generated new fontManager")\n return fm\n\n\nfontManager = _load_fontmanager()\nfindfont = fontManager.findfont\nget_font_names = fontManager.get_font_names\n | .venv\Lib\site-packages\matplotlib\font_manager.py | font_manager.py | Python | 57,651 | 0.75 | 0.150152 | 0.09065 | react-lib | 260 | 2023-08-22T13:08:06.239479 | BSD-3-Clause | false | 1c61d3c9cceac69d43dba9dd6c9e6793 |
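The `score_weight` and `score_stretch` arithmetic in the `font_manager.py` row above can be exercised on its own. The sketch below reimplements both functions outside matplotlib; `weight_dict` and `stretch_dict` here are small excerpts of the CSS value tables that module defines, and the `cbook._str_equal` helper is inlined as a plain string comparison.

```python
# Standalone sketch of FontManager.score_weight / score_stretch arithmetic.
# weight_dict / stretch_dict are small excerpts of the tables in
# matplotlib.font_manager; cbook._str_equal is inlined.
from numbers import Number

weight_dict = {'light': 200, 'normal': 400, 'regular': 400, 'bold': 700}
stretch_dict = {'condensed': 300, 'normal': 500, 'expanded': 700}


def score_weight(weight1, weight2):
    # Exact match of weight names (e.g. 'regular' == 'regular') is perfect.
    if isinstance(weight1, str) and isinstance(weight2, str) and weight1 == weight2:
        return 0.0
    w1 = weight1 if isinstance(weight1, Number) else weight_dict[weight1]
    w2 = weight2 if isinstance(weight2, Number) else weight_dict[weight2]
    # Numeric distance squeezed into [0.05, 1.0], so a name-level exact
    # match (score 0.0) always beats any numeric near-miss.
    return 0.95 * (abs(w1 - w2) / 1000) + 0.05


def score_stretch(stretch1, stretch2):
    def css_value(s):
        try:
            return int(s)
        except ValueError:
            return stretch_dict.get(s, 500)  # unknown names score as 'normal'
    return abs(css_value(stretch1) - css_value(stretch2)) / 1000.0


print(score_weight('regular', 'regular'))      # 0.0
print(score_weight('normal', 'bold'))          # ~0.335
print(score_stretch('condensed', 'expanded'))  # 0.4
```

The `0.95 * d + 0.05` shape is the detail worth noticing: even two numerically identical weights given in different forms (e.g. `400` vs `'normal'`) score 0.05, slightly worse than a verbatim string match.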
from dataclasses import dataclass\nimport os\n\nfrom matplotlib._afm import AFM\nfrom matplotlib import ft2font\n\nfrom pathlib import Path\n\nfrom collections.abc import Iterable\nfrom typing import Any, Literal\n\nfont_scalings: dict[str | None, float]\nstretch_dict: dict[str, int]\nweight_dict: dict[str, int]\nfont_family_aliases: set[str]\nMSFolders: str\nMSFontDirectories: list[str]\nMSUserFontDirectories: list[str]\nX11FontDirectories: list[str]\nOSXFontDirectories: list[str]\n\ndef get_fontext_synonyms(fontext: str) -> list[str]: ...\ndef list_fonts(directory: str, extensions: Iterable[str]) -> list[str]: ...\ndef win32FontDirectory() -> str: ...\ndef _get_fontconfig_fonts() -> list[Path]: ...\ndef findSystemFonts(\n fontpaths: Iterable[str | os.PathLike | Path] | None = ..., fontext: str = ...\n) -> list[str]: ...\n@dataclass\nclass FontEntry:\n fname: str = ...\n name: str = ...\n style: str = ...\n variant: str = ...\n weight: str | int = ...\n stretch: str = ...\n size: str = ...\n def _repr_html_(self) -> str: ...\n def _repr_png_(self) -> bytes: ...\n\ndef ttfFontProperty(font: ft2font.FT2Font) -> FontEntry: ...\ndef afmFontProperty(fontpath: str, font: AFM) -> FontEntry: ...\n\nclass FontProperties:\n def __init__(\n self,\n family: str | Iterable[str] | None = ...,\n style: Literal["normal", "italic", "oblique"] | None = ...,\n variant: Literal["normal", "small-caps"] | None = ...,\n weight: int | str | None = ...,\n stretch: int | str | None = ...,\n size: float | str | None = ...,\n fname: str | os.PathLike | Path | None = ...,\n math_fontfamily: str | None = ...,\n ) -> None: ...\n def __hash__(self) -> int: ...\n def __eq__(self, other: object) -> bool: ...\n def get_family(self) -> list[str]: ...\n def get_name(self) -> str: ...\n def get_style(self) -> Literal["normal", "italic", "oblique"]: ...\n def get_variant(self) -> Literal["normal", "small-caps"]: ...\n def get_weight(self) -> int | str: ...\n def get_stretch(self) -> int | str: ...\n 
def get_size(self) -> float: ...\n def get_file(self) -> str | bytes | None: ...\n def get_fontconfig_pattern(self) -> dict[str, list[Any]]: ...\n def set_family(self, family: str | Iterable[str] | None) -> None: ...\n def set_style(\n self, style: Literal["normal", "italic", "oblique"] | None\n ) -> None: ...\n def set_variant(self, variant: Literal["normal", "small-caps"] | None) -> None: ...\n def set_weight(self, weight: int | str | None) -> None: ...\n def set_stretch(self, stretch: int | str | None) -> None: ...\n def set_size(self, size: float | str | None) -> None: ...\n def set_file(self, file: str | os.PathLike | Path | None) -> None: ...\n def set_fontconfig_pattern(self, pattern: str) -> None: ...\n def get_math_fontfamily(self) -> str: ...\n def set_math_fontfamily(self, fontfamily: str | None) -> None: ...\n def copy(self) -> FontProperties: ...\n # Aliases\n set_name = set_family\n get_slant = get_style\n set_slant = set_style\n get_size_in_points = get_size\n\ndef json_dump(data: FontManager, filename: str | Path | os.PathLike) -> None: ...\ndef json_load(filename: str | Path | os.PathLike) -> FontManager: ...\n\nclass FontManager:\n __version__: int\n default_size: float | None\n defaultFamily: dict[str, str]\n afmlist: list[FontEntry]\n ttflist: list[FontEntry]\n def __init__(self, size: float | None = ..., weight: str = ...) 
-> None: ...\n def addfont(self, path: str | Path | os.PathLike) -> None: ...\n @property\n def defaultFont(self) -> dict[str, str]: ...\n def get_default_weight(self) -> str: ...\n @staticmethod\n def get_default_size() -> float: ...\n def set_default_weight(self, weight: str) -> None: ...\n def score_family(\n self, families: str | list[str] | tuple[str], family2: str\n ) -> float: ...\n def score_style(self, style1: str, style2: str) -> float: ...\n def score_variant(self, variant1: str, variant2: str) -> float: ...\n def score_stretch(self, stretch1: str | int, stretch2: str | int) -> float: ...\n def score_weight(self, weight1: str | float, weight2: str | float) -> float: ...\n def score_size(self, size1: str | float, size2: str | float) -> float: ...\n def findfont(\n self,\n prop: str | FontProperties,\n fontext: Literal["ttf", "afm"] = ...,\n directory: str | None = ...,\n fallback_to_default: bool = ...,\n rebuild_if_missing: bool = ...,\n ) -> str: ...\n def get_font_names(self) -> list[str]: ...\n\ndef is_opentype_cff_font(filename: str) -> bool: ...\ndef get_font(\n font_filepaths: Iterable[str | Path | bytes] | str | Path | bytes,\n hinting_factor: int | None = ...,\n) -> ft2font.FT2Font: ...\n\nfontManager: FontManager\n\ndef findfont(\n prop: str | FontProperties,\n fontext: Literal["ttf", "afm"] = ...,\n directory: str | None = ...,\n fallback_to_default: bool = ...,\n rebuild_if_missing: bool = ...,\n) -> str: ...\ndef get_font_names() -> list[str]: ...\n | .venv\Lib\site-packages\matplotlib\font_manager.pyi | font_manager.pyi | Other | 5,052 | 0.95 | 0.404412 | 0.008065 | awesome-app | 866 | 2024-08-15T03:31:40.818704 | BSD-3-Clause | false | ab315a5405517e2130d80b23184a41a8 |
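The `FontProperties` stub above ends with class-level aliases (`set_name = set_family`, `get_slant = get_style`, ...). That pattern is just a second name bound to the same function object; the toy class below (not matplotlib's, a minimal stand-in) shows how it behaves.

```python
# Standalone sketch of the class-level alias convention in the
# FontProperties stub (set_name = set_family, etc.). Props is a toy
# stand-in, not matplotlib's FontProperties.
class Props:
    def __init__(self):
        self._family = None

    def set_family(self, family):
        self._family = family

    def get_family(self):
        return self._family

    # Alias: the same function object under a second name, so both names
    # remain behaviorally identical (and identical to type checkers too,
    # which is why the .pyi can declare the alias as an assignment).
    set_name = set_family


p = Props()
p.set_name("serif")
print(p.get_family())                      # serif
print(Props.set_name is Props.set_family)  # True
```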
from enum import Enum, Flag\nimport sys\nfrom typing import BinaryIO, Literal, TypedDict, final, overload, cast\nfrom typing_extensions import Buffer # < Py 3.12\n\nimport numpy as np\nfrom numpy.typing import NDArray\n\n__freetype_build_type__: str\n__freetype_version__: str\n\nclass FaceFlags(Flag):\n SCALABLE = cast(int, ...)\n FIXED_SIZES = cast(int, ...)\n FIXED_WIDTH = cast(int, ...)\n SFNT = cast(int, ...)\n HORIZONTAL = cast(int, ...)\n VERTICAL = cast(int, ...)\n KERNING = cast(int, ...)\n FAST_GLYPHS = cast(int, ...)\n MULTIPLE_MASTERS = cast(int, ...)\n GLYPH_NAMES = cast(int, ...)\n EXTERNAL_STREAM = cast(int, ...)\n HINTER = cast(int, ...)\n CID_KEYED = cast(int, ...)\n TRICKY = cast(int, ...)\n COLOR = cast(int, ...)\n # VARIATION = cast(int, ...) # FT 2.9\n # SVG = cast(int, ...) # FT 2.12\n # SBIX = cast(int, ...) # FT 2.12\n # SBIX_OVERLAY = cast(int, ...) # FT 2.12\n\nclass Kerning(Enum):\n DEFAULT = cast(int, ...)\n UNFITTED = cast(int, ...)\n UNSCALED = cast(int, ...)\n\nclass LoadFlags(Flag):\n DEFAULT = cast(int, ...)\n NO_SCALE = cast(int, ...)\n NO_HINTING = cast(int, ...)\n RENDER = cast(int, ...)\n NO_BITMAP = cast(int, ...)\n VERTICAL_LAYOUT = cast(int, ...)\n FORCE_AUTOHINT = cast(int, ...)\n CROP_BITMAP = cast(int, ...)\n PEDANTIC = cast(int, ...)\n IGNORE_GLOBAL_ADVANCE_WIDTH = cast(int, ...)\n NO_RECURSE = cast(int, ...)\n IGNORE_TRANSFORM = cast(int, ...)\n MONOCHROME = cast(int, ...)\n LINEAR_DESIGN = cast(int, ...)\n NO_AUTOHINT = cast(int, ...)\n COLOR = cast(int, ...)\n COMPUTE_METRICS = cast(int, ...) # FT 2.6.1\n # BITMAP_METRICS_ONLY = cast(int, ...) # FT 2.7.1\n # NO_SVG = cast(int, ...) 
# FT 2.13.1\n # The following should be unique, but the above can be OR'd together.\n TARGET_NORMAL = cast(int, ...)\n TARGET_LIGHT = cast(int, ...)\n TARGET_MONO = cast(int, ...)\n TARGET_LCD = cast(int, ...)\n TARGET_LCD_V = cast(int, ...)\n\nclass StyleFlags(Flag):\n NORMAL = cast(int, ...)\n ITALIC = cast(int, ...)\n BOLD = cast(int, ...)\n\nclass _SfntHeadDict(TypedDict):\n version: tuple[int, int]\n fontRevision: tuple[int, int]\n checkSumAdjustment: int\n magicNumber: int\n flags: int\n unitsPerEm: int\n created: tuple[int, int]\n modified: tuple[int, int]\n xMin: int\n yMin: int\n xMax: int\n yMax: int\n macStyle: int\n lowestRecPPEM: int\n fontDirectionHint: int\n indexToLocFormat: int\n glyphDataFormat: int\n\nclass _SfntMaxpDict(TypedDict):\n version: tuple[int, int]\n numGlyphs: int\n maxPoints: int\n maxContours: int\n maxComponentPoints: int\n maxComponentContours: int\n maxZones: int\n maxTwilightPoints: int\n maxStorage: int\n maxFunctionDefs: int\n maxInstructionDefs: int\n maxStackElements: int\n maxSizeOfInstructions: int\n maxComponentElements: int\n maxComponentDepth: int\n\nclass _SfntOs2Dict(TypedDict):\n version: int\n xAvgCharWidth: int\n usWeightClass: int\n usWidthClass: int\n fsType: int\n ySubscriptXSize: int\n ySubscriptYSize: int\n ySubscriptXOffset: int\n ySubscriptYOffset: int\n ySuperscriptXSize: int\n ySuperscriptYSize: int\n ySuperscriptXOffset: int\n ySuperscriptYOffset: int\n yStrikeoutSize: int\n yStrikeoutPosition: int\n sFamilyClass: int\n panose: bytes\n ulCharRange: tuple[int, int, int, int]\n achVendID: bytes\n fsSelection: int\n fsFirstCharIndex: int\n fsLastCharIndex: int\n\nclass _SfntHheaDict(TypedDict):\n version: tuple[int, int]\n ascent: int\n descent: int\n lineGap: int\n advanceWidthMax: int\n minLeftBearing: int\n minRightBearing: int\n xMaxExtent: int\n caretSlopeRise: int\n caretSlopeRun: int\n caretOffset: int\n metricDataFormat: int\n numOfLongHorMetrics: int\n\nclass _SfntVheaDict(TypedDict):\n version: 
tuple[int, int]\n vertTypoAscender: int\n vertTypoDescender: int\n vertTypoLineGap: int\n advanceHeightMax: int\n minTopSideBearing: int\n minBottomSizeBearing: int\n yMaxExtent: int\n caretSlopeRise: int\n caretSlopeRun: int\n caretOffset: int\n metricDataFormat: int\n numOfLongVerMetrics: int\n\nclass _SfntPostDict(TypedDict):\n format: tuple[int, int]\n italicAngle: tuple[int, int]\n underlinePosition: int\n underlineThickness: int\n isFixedPitch: int\n minMemType42: int\n maxMemType42: int\n minMemType1: int\n maxMemType1: int\n\nclass _SfntPcltDict(TypedDict):\n version: tuple[int, int]\n fontNumber: int\n pitch: int\n xHeight: int\n style: int\n typeFamily: int\n capHeight: int\n symbolSet: int\n typeFace: bytes\n characterComplement: bytes\n strokeWeight: int\n widthType: int\n serifStyle: int\n\n@final\nclass FT2Font(Buffer):\n def __init__(\n self,\n filename: str | BinaryIO,\n hinting_factor: int = ...,\n *,\n _fallback_list: list[FT2Font] | None = ...,\n _kerning_factor: int = ...\n ) -> None: ...\n if sys.version_info[:2] >= (3, 12):\n def __buffer__(self, flags: int) -> memoryview: ...\n def _get_fontmap(self, string: str) -> dict[str, FT2Font]: ...\n def clear(self) -> None: ...\n def draw_glyph_to_bitmap(\n self, image: FT2Image, x: int, y: int, glyph: Glyph, antialiased: bool = ...\n ) -> None: ...\n def draw_glyphs_to_bitmap(self, antialiased: bool = ...) 
-> None: ...\n def get_bitmap_offset(self) -> tuple[int, int]: ...\n def get_char_index(self, codepoint: int) -> int: ...\n def get_charmap(self) -> dict[int, int]: ...\n def get_descent(self) -> int: ...\n def get_glyph_name(self, index: int) -> str: ...\n def get_image(self) -> NDArray[np.uint8]: ...\n def get_kerning(self, left: int, right: int, mode: Kerning) -> int: ...\n def get_name_index(self, name: str) -> int: ...\n def get_num_glyphs(self) -> int: ...\n def get_path(self) -> tuple[NDArray[np.float64], NDArray[np.int8]]: ...\n def get_ps_font_info(\n self,\n ) -> tuple[str, str, str, str, str, int, int, int, int]: ...\n def get_sfnt(self) -> dict[tuple[int, int, int, int], bytes]: ...\n @overload\n def get_sfnt_table(self, name: Literal["head"]) -> _SfntHeadDict | None: ...\n @overload\n def get_sfnt_table(self, name: Literal["maxp"]) -> _SfntMaxpDict | None: ...\n @overload\n def get_sfnt_table(self, name: Literal["OS/2"]) -> _SfntOs2Dict | None: ...\n @overload\n def get_sfnt_table(self, name: Literal["hhea"]) -> _SfntHheaDict | None: ...\n @overload\n def get_sfnt_table(self, name: Literal["vhea"]) -> _SfntVheaDict | None: ...\n @overload\n def get_sfnt_table(self, name: Literal["post"]) -> _SfntPostDict | None: ...\n @overload\n def get_sfnt_table(self, name: Literal["pclt"]) -> _SfntPcltDict | None: ...\n def get_width_height(self) -> tuple[int, int]: ...\n def load_char(self, charcode: int, flags: LoadFlags = ...) -> Glyph: ...\n def load_glyph(self, glyphindex: int, flags: LoadFlags = ...) 
-> Glyph: ...\n def select_charmap(self, i: int) -> None: ...\n def set_charmap(self, i: int) -> None: ...\n def set_size(self, ptsize: float, dpi: float) -> None: ...\n def set_text(\n self, string: str, angle: float = ..., flags: LoadFlags = ...\n ) -> NDArray[np.float64]: ...\n @property\n def ascender(self) -> int: ...\n @property\n def bbox(self) -> tuple[int, int, int, int]: ...\n @property\n def descender(self) -> int: ...\n @property\n def face_flags(self) -> FaceFlags: ...\n @property\n def family_name(self) -> str: ...\n @property\n def fname(self) -> str: ...\n @property\n def height(self) -> int: ...\n @property\n def max_advance_height(self) -> int: ...\n @property\n def max_advance_width(self) -> int: ...\n @property\n def num_charmaps(self) -> int: ...\n @property\n def num_faces(self) -> int: ...\n @property\n def num_fixed_sizes(self) -> int: ...\n @property\n def num_glyphs(self) -> int: ...\n @property\n def num_named_instances(self) -> int: ...\n @property\n def postscript_name(self) -> str: ...\n @property\n def scalable(self) -> bool: ...\n @property\n def style_flags(self) -> StyleFlags: ...\n @property\n def style_name(self) -> str: ...\n @property\n def underline_position(self) -> int: ...\n @property\n def underline_thickness(self) -> int: ...\n @property\n def units_per_EM(self) -> int: ...\n\n@final\nclass FT2Image(Buffer):\n def __init__(self, width: int, height: int) -> None: ...\n def draw_rect_filled(self, x0: int, y0: int, x1: int, y1: int) -> None: ...\n if sys.version_info[:2] >= (3, 12):\n def __buffer__(self, flags: int) -> memoryview: ...\n\n@final\nclass Glyph:\n @property\n def width(self) -> int: ...\n @property\n def height(self) -> int: ...\n @property\n def horiBearingX(self) -> int: ...\n @property\n def horiBearingY(self) -> int: ...\n @property\n def horiAdvance(self) -> int: ...\n @property\n def linearHoriAdvance(self) -> int: ...\n @property\n def vertBearingX(self) -> int: ...\n @property\n def vertBearingY(self) 
-> int: ...\n @property\n def vertAdvance(self) -> int: ...\n @property\n def bbox(self) -> tuple[int, int, int, int]: ...\n | .venv\Lib\site-packages\matplotlib\ft2font.pyi | ft2font.pyi | Other | 9,253 | 0.95 | 0.262821 | 0.027027 | awesome-app | 233 | 2025-03-14T05:23:15.462943 | GPL-3.0 | false | 0f95bb3b1ece8effda8141c73d324f49 |
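The `ft2font.pyi` row above declares `FaceFlags`/`LoadFlags` as `enum.Flag` subclasses (with `cast(int, ...)` placeholders, since the real values come from the C extension). A minimal sketch of the same pattern, using bit values that mirror FreeType's `FT_LOAD_*` layout for illustration:

```python
# Standalone sketch of the Flag-enum style used for LoadFlags in the
# ft2font stub. The bit values mirror FreeType's FT_LOAD_* constants but
# are hard-coded here for illustration only.
from enum import Flag


class LoadFlags(Flag):
    DEFAULT = 0
    NO_SCALE = 1
    NO_HINTING = 2
    RENDER = 4
    NO_BITMAP = 8
    FORCE_AUTOHINT = 32


# Flags OR together, as the stub's comment ("the above can be OR'd
# together") describes; membership tests check individual bits.
flags = LoadFlags.NO_HINTING | LoadFlags.FORCE_AUTOHINT
print(flags.value)                    # 34
print(LoadFlags.NO_HINTING in flags)  # True
print(LoadFlags.RENDER in flags)      # False
```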
r"""\n:mod:`~matplotlib.gridspec` contains classes that help to layout multiple\n`~.axes.Axes` in a grid-like pattern within a figure.\n\nThe `GridSpec` specifies the overall grid structure. Individual cells within\nthe grid are referenced by `SubplotSpec`\s.\n\nOften, users need not access this module directly, and can use higher-level\nmethods like `~.pyplot.subplots`, `~.pyplot.subplot_mosaic` and\n`~.Figure.subfigures`. See the tutorial :ref:`arranging_axes` for a guide.\n"""\n\nimport copy\nimport logging\nfrom numbers import Integral\n\nimport numpy as np\n\nimport matplotlib as mpl\nfrom matplotlib import _api, _pylab_helpers, _tight_layout\nfrom matplotlib.transforms import Bbox\n\n_log = logging.getLogger(__name__)\n\n\nclass GridSpecBase:\n """\n A base class of GridSpec that specifies the geometry of the grid\n that a subplot will be placed.\n """\n\n def __init__(self, nrows, ncols, height_ratios=None, width_ratios=None):\n """\n Parameters\n ----------\n nrows, ncols : int\n The number of rows and columns of the grid.\n width_ratios : array-like of length *ncols*, optional\n Defines the relative widths of the columns. Each column gets a\n relative width of ``width_ratios[i] / sum(width_ratios)``.\n If not given, all columns will have the same width.\n height_ratios : array-like of length *nrows*, optional\n Defines the relative heights of the rows. 
Each row gets a\n relative height of ``height_ratios[i] / sum(height_ratios)``.\n If not given, all rows will have the same height.\n """\n if not isinstance(nrows, Integral) or nrows <= 0:\n raise ValueError(\n f"Number of rows must be a positive integer, not {nrows!r}")\n if not isinstance(ncols, Integral) or ncols <= 0:\n raise ValueError(\n f"Number of columns must be a positive integer, not {ncols!r}")\n self._nrows, self._ncols = nrows, ncols\n self.set_height_ratios(height_ratios)\n self.set_width_ratios(width_ratios)\n\n def __repr__(self):\n height_arg = (f', height_ratios={self._row_height_ratios!r}'\n if len(set(self._row_height_ratios)) != 1 else '')\n width_arg = (f', width_ratios={self._col_width_ratios!r}'\n if len(set(self._col_width_ratios)) != 1 else '')\n return '{clsname}({nrows}, {ncols}{optionals})'.format(\n clsname=self.__class__.__name__,\n nrows=self._nrows,\n ncols=self._ncols,\n optionals=height_arg + width_arg,\n )\n\n nrows = property(lambda self: self._nrows,\n doc="The number of rows in the grid.")\n ncols = property(lambda self: self._ncols,\n doc="The number of columns in the grid.")\n\n def get_geometry(self):\n """\n Return a tuple containing the number of rows and columns in the grid.\n """\n return self._nrows, self._ncols\n\n def get_subplot_params(self, figure=None):\n # Must be implemented in subclasses\n pass\n\n def new_subplotspec(self, loc, rowspan=1, colspan=1):\n """\n Create and return a `.SubplotSpec` instance.\n\n Parameters\n ----------\n loc : (int, int)\n The position of the subplot in the grid as\n ``(row_index, column_index)``.\n rowspan, colspan : int, default: 1\n The number of rows and columns the subplot should span in the grid.\n """\n loc1, loc2 = loc\n subplotspec = self[loc1:loc1+rowspan, loc2:loc2+colspan]\n return subplotspec\n\n def set_width_ratios(self, width_ratios):\n """\n Set the relative widths of the columns.\n\n *width_ratios* must be of length *ncols*. 
Each column gets a relative\n width of ``width_ratios[i] / sum(width_ratios)``.\n """\n if width_ratios is None:\n width_ratios = [1] * self._ncols\n elif len(width_ratios) != self._ncols:\n raise ValueError('Expected the given number of width ratios to '\n 'match the number of columns of the grid')\n self._col_width_ratios = width_ratios\n\n def get_width_ratios(self):\n """\n Return the width ratios.\n\n This is *None* if no width ratios have been set explicitly.\n """\n return self._col_width_ratios\n\n def set_height_ratios(self, height_ratios):\n """\n Set the relative heights of the rows.\n\n *height_ratios* must be of length *nrows*. Each row gets a relative\n height of ``height_ratios[i] / sum(height_ratios)``.\n """\n if height_ratios is None:\n height_ratios = [1] * self._nrows\n elif len(height_ratios) != self._nrows:\n raise ValueError('Expected the given number of height ratios to '\n 'match the number of rows of the grid')\n self._row_height_ratios = height_ratios\n\n def get_height_ratios(self):\n """\n Return the height ratios.\n\n This is *None* if no height ratios have been set explicitly.\n """\n return self._row_height_ratios\n\n def get_grid_positions(self, fig):\n """\n Return the positions of the grid cells in figure coordinates.\n\n Parameters\n ----------\n fig : `~matplotlib.figure.Figure`\n The figure the grid should be applied to. 
The subplot parameters\n            (margins and spacing between subplots) are taken from *fig*.\n\n        Returns\n        -------\n        bottoms, tops, lefts, rights : array\n            The bottom, top, left, right positions of the grid cells in\n            figure coordinates.\n        """\n        nrows, ncols = self.get_geometry()\n        subplot_params = self.get_subplot_params(fig)\n        left = subplot_params.left\n        right = subplot_params.right\n        bottom = subplot_params.bottom\n        top = subplot_params.top\n        wspace = subplot_params.wspace\n        hspace = subplot_params.hspace\n        tot_width = right - left\n        tot_height = top - bottom\n\n        # calculate accumulated heights of rows\n        cell_h = tot_height / (nrows + hspace*(nrows-1))\n        sep_h = hspace * cell_h\n        norm = cell_h * nrows / sum(self._row_height_ratios)\n        cell_heights = [r * norm for r in self._row_height_ratios]\n        sep_heights = [0] + ([sep_h] * (nrows-1))\n        cell_hs = np.cumsum(np.column_stack([sep_heights, cell_heights]).flat)\n\n        # calculate accumulated widths of columns\n        cell_w = tot_width / (ncols + wspace*(ncols-1))\n        sep_w = wspace * cell_w\n        norm = cell_w * ncols / sum(self._col_width_ratios)\n        cell_widths = [r * norm for r in self._col_width_ratios]\n        sep_widths = [0] + ([sep_w] * (ncols-1))\n        cell_ws = np.cumsum(np.column_stack([sep_widths, cell_widths]).flat)\n\n        fig_tops, fig_bottoms = (top - cell_hs).reshape((-1, 2)).T\n        fig_lefts, fig_rights = (left + cell_ws).reshape((-1, 2)).T\n        return fig_bottoms, fig_tops, fig_lefts, fig_rights\n\n    @staticmethod\n    def _check_gridspec_exists(figure, nrows, ncols):\n        """\n        Check if the figure already has a gridspec with these dimensions,\n        or create a new one\n        """\n        for ax in figure.get_axes():\n            gs = ax.get_gridspec()\n            if gs is not None:\n                if hasattr(gs, 'get_topmost_subplotspec'):\n                    # This is needed for colorbar gridspec layouts.\n                    # This is probably OK because this whole logic tree\n                    # is for when the user is doing simple things with the\n                    # add_subplot command. 
For complicated layouts\n # like subgridspecs the proper gridspec is passed in...\n gs = gs.get_topmost_subplotspec().get_gridspec()\n if gs.get_geometry() == (nrows, ncols):\n return gs\n # else gridspec not found:\n return GridSpec(nrows, ncols, figure=figure)\n\n def __getitem__(self, key):\n """Create and return a `.SubplotSpec` instance."""\n nrows, ncols = self.get_geometry()\n\n def _normalize(key, size, axis): # Includes last index.\n orig_key = key\n if isinstance(key, slice):\n start, stop, _ = key.indices(size)\n if stop > start:\n return start, stop - 1\n raise IndexError("GridSpec slice would result in no space "\n "allocated for subplot")\n else:\n if key < 0:\n key = key + size\n if 0 <= key < size:\n return key, key\n elif axis is not None:\n raise IndexError(f"index {orig_key} is out of bounds for "\n f"axis {axis} with size {size}")\n else: # flat index\n raise IndexError(f"index {orig_key} is out of bounds for "\n f"GridSpec with size {size}")\n\n if isinstance(key, tuple):\n try:\n k1, k2 = key\n except ValueError as err:\n raise ValueError("Unrecognized subplot spec") from err\n num1, num2 = np.ravel_multi_index(\n [_normalize(k1, nrows, 0), _normalize(k2, ncols, 1)],\n (nrows, ncols))\n else: # Single key\n num1, num2 = _normalize(key, nrows * ncols, None)\n\n return SubplotSpec(self, num1, num2)\n\n def subplots(self, *, sharex=False, sharey=False, squeeze=True,\n subplot_kw=None):\n """\n Add all subplots specified by this `GridSpec` to its parent figure.\n\n See `.Figure.subplots` for detailed documentation.\n """\n\n figure = self.figure\n\n if figure is None:\n raise ValueError("GridSpec.subplots() only works for GridSpecs "\n "created with a parent figure")\n\n if not isinstance(sharex, str):\n sharex = "all" if sharex else "none"\n if not isinstance(sharey, str):\n sharey = "all" if sharey else "none"\n\n _api.check_in_list(["all", "row", "col", "none", False, True],\n sharex=sharex, sharey=sharey)\n if subplot_kw is None:\n subplot_kw 
= {}\n # don't mutate kwargs passed by user...\n subplot_kw = subplot_kw.copy()\n\n # Create array to hold all Axes.\n axarr = np.empty((self._nrows, self._ncols), dtype=object)\n for row in range(self._nrows):\n for col in range(self._ncols):\n shared_with = {"none": None, "all": axarr[0, 0],\n "row": axarr[row, 0], "col": axarr[0, col]}\n subplot_kw["sharex"] = shared_with[sharex]\n subplot_kw["sharey"] = shared_with[sharey]\n axarr[row, col] = figure.add_subplot(\n self[row, col], **subplot_kw)\n\n # turn off redundant tick labeling\n if sharex in ["col", "all"]:\n for ax in axarr.flat:\n ax._label_outer_xaxis(skip_non_rectangular_axes=True)\n if sharey in ["row", "all"]:\n for ax in axarr.flat:\n ax._label_outer_yaxis(skip_non_rectangular_axes=True)\n\n if squeeze:\n # Discarding unneeded dimensions that equal 1. If we only have one\n # subplot, just return it instead of a 1-element array.\n return axarr.item() if axarr.size == 1 else axarr.squeeze()\n else:\n # Returned axis array will be always 2-d, even if nrows=ncols=1.\n return axarr\n\n\nclass GridSpec(GridSpecBase):\n """\n A grid layout to place subplots within a figure.\n\n The location of the grid cells is determined in a similar way to\n `.SubplotParams` using *left*, *right*, *top*, *bottom*, *wspace*\n and *hspace*.\n\n Indexing a GridSpec instance returns a `.SubplotSpec`.\n """\n def __init__(self, nrows, ncols, figure=None,\n left=None, bottom=None, right=None, top=None,\n wspace=None, hspace=None,\n width_ratios=None, height_ratios=None):\n """\n Parameters\n ----------\n nrows, ncols : int\n The number of rows and columns of the grid.\n\n figure : `.Figure`, optional\n Only used for constrained layout to create a proper layoutgrid.\n\n left, right, top, bottom : float, optional\n Extent of the subplots as a fraction of figure width or height.\n Left cannot be larger than right, and bottom cannot be larger than\n top. 
If not given, the values will be inferred from a figure or\n rcParams at draw time. See also `GridSpec.get_subplot_params`.\n\n wspace : float, optional\n The amount of width reserved for space between subplots,\n expressed as a fraction of the average axis width.\n If not given, the values will be inferred from a figure or\n rcParams when necessary. See also `GridSpec.get_subplot_params`.\n\n hspace : float, optional\n The amount of height reserved for space between subplots,\n expressed as a fraction of the average axis height.\n If not given, the values will be inferred from a figure or\n rcParams when necessary. See also `GridSpec.get_subplot_params`.\n\n width_ratios : array-like of length *ncols*, optional\n Defines the relative widths of the columns. Each column gets a\n relative width of ``width_ratios[i] / sum(width_ratios)``.\n If not given, all columns will have the same width.\n\n height_ratios : array-like of length *nrows*, optional\n Defines the relative heights of the rows. Each row gets a\n relative height of ``height_ratios[i] / sum(height_ratios)``.\n If not given, all rows will have the same height.\n\n """\n self.left = left\n self.bottom = bottom\n self.right = right\n self.top = top\n self.wspace = wspace\n self.hspace = hspace\n self.figure = figure\n\n super().__init__(nrows, ncols,\n width_ratios=width_ratios,\n height_ratios=height_ratios)\n\n _AllowedKeys = ["left", "bottom", "right", "top", "wspace", "hspace"]\n\n def update(self, **kwargs):\n """\n Update the subplot parameters of the grid.\n\n Parameters that are not explicitly given are not changed. 
Setting a\n parameter to *None* resets it to :rc:`figure.subplot.*`.\n\n Parameters\n ----------\n left, right, top, bottom : float or None, optional\n Extent of the subplots as a fraction of figure width or height.\n wspace, hspace : float, optional\n Spacing between the subplots as a fraction of the average subplot\n width / height.\n """\n for k, v in kwargs.items():\n if k in self._AllowedKeys:\n setattr(self, k, v)\n else:\n raise AttributeError(f"{k} is an unknown keyword")\n for figmanager in _pylab_helpers.Gcf.figs.values():\n for ax in figmanager.canvas.figure.axes:\n if ax.get_subplotspec() is not None:\n ss = ax.get_subplotspec().get_topmost_subplotspec()\n if ss.get_gridspec() == self:\n fig = ax.get_figure(root=False)\n ax._set_position(ax.get_subplotspec().get_position(fig))\n\n def get_subplot_params(self, figure=None):\n """\n Return the `.SubplotParams` for the GridSpec.\n\n In order of precedence the values are taken from\n\n - non-*None* attributes of the GridSpec\n - the provided *figure*\n - :rc:`figure.subplot.*`\n\n Note that the ``figure`` attribute of the GridSpec is always ignored.\n """\n if figure is None:\n kw = {k: mpl.rcParams["figure.subplot."+k]\n for k in self._AllowedKeys}\n subplotpars = SubplotParams(**kw)\n else:\n subplotpars = copy.copy(figure.subplotpars)\n\n subplotpars.update(**{k: getattr(self, k) for k in self._AllowedKeys})\n\n return subplotpars\n\n def locally_modified_subplot_params(self):\n """\n Return a list of the names of the subplot parameters explicitly set\n in the GridSpec.\n\n This is a subset of the attributes of `.SubplotParams`.\n """\n return [k for k in self._AllowedKeys if getattr(self, k)]\n\n def tight_layout(self, figure, renderer=None,\n pad=1.08, h_pad=None, w_pad=None, rect=None):\n """\n Adjust subplot parameters to give specified padding.\n\n Parameters\n ----------\n figure : `.Figure`\n The figure.\n renderer : `.RendererBase` subclass, optional\n The renderer to be used.\n pad : float\n 
Padding between the figure edge and the edges of subplots, as a\n fraction of the font-size.\n h_pad, w_pad : float, optional\n Padding (height/width) between edges of adjacent subplots.\n Defaults to *pad*.\n rect : tuple (left, bottom, right, top), default: None\n (left, bottom, right, top) rectangle in normalized figure\n coordinates that the whole subplots area (including labels) will\n fit into. Default (None) is the whole figure.\n """\n if renderer is None:\n renderer = figure._get_renderer()\n kwargs = _tight_layout.get_tight_layout_figure(\n figure, figure.axes,\n _tight_layout.get_subplotspec_list(figure.axes, grid_spec=self),\n renderer, pad=pad, h_pad=h_pad, w_pad=w_pad, rect=rect)\n if kwargs:\n self.update(**kwargs)\n\n\nclass GridSpecFromSubplotSpec(GridSpecBase):\n """\n GridSpec whose subplot layout parameters are inherited from the\n location specified by a given SubplotSpec.\n """\n def __init__(self, nrows, ncols,\n subplot_spec,\n wspace=None, hspace=None,\n height_ratios=None, width_ratios=None):\n """\n Parameters\n ----------\n nrows, ncols : int\n Number of rows and number of columns of the grid.\n subplot_spec : SubplotSpec\n Spec from which the layout parameters are inherited.\n wspace, hspace : float, optional\n See `GridSpec` for more details. 
If not specified default values\n (from the figure or rcParams) are used.\n height_ratios : array-like of length *nrows*, optional\n See `GridSpecBase` for details.\n width_ratios : array-like of length *ncols*, optional\n See `GridSpecBase` for details.\n """\n self._wspace = wspace\n self._hspace = hspace\n if isinstance(subplot_spec, SubplotSpec):\n self._subplot_spec = subplot_spec\n else:\n raise TypeError(\n "subplot_spec must be type SubplotSpec, "\n "usually from GridSpec, or axes.get_subplotspec.")\n self.figure = self._subplot_spec.get_gridspec().figure\n super().__init__(nrows, ncols,\n width_ratios=width_ratios,\n height_ratios=height_ratios)\n\n def get_subplot_params(self, figure=None):\n """Return a dictionary of subplot layout parameters."""\n hspace = (self._hspace if self._hspace is not None\n else figure.subplotpars.hspace if figure is not None\n else mpl.rcParams["figure.subplot.hspace"])\n wspace = (self._wspace if self._wspace is not None\n else figure.subplotpars.wspace if figure is not None\n else mpl.rcParams["figure.subplot.wspace"])\n\n figbox = self._subplot_spec.get_position(figure)\n left, bottom, right, top = figbox.extents\n\n return SubplotParams(left=left, right=right,\n bottom=bottom, top=top,\n wspace=wspace, hspace=hspace)\n\n def get_topmost_subplotspec(self):\n """\n Return the topmost `.SubplotSpec` instance associated with the subplot.\n """\n return self._subplot_spec.get_topmost_subplotspec()\n\n\nclass SubplotSpec:\n """\n The location of a subplot in a `GridSpec`.\n\n .. note::\n\n Likely, you will never instantiate a `SubplotSpec` yourself. Instead,\n you will typically obtain one from a `GridSpec` using item-access.\n\n Parameters\n ----------\n gridspec : `~matplotlib.gridspec.GridSpec`\n The GridSpec, which the subplot is referencing.\n num1, num2 : int\n The subplot will occupy the *num1*-th cell of the given\n *gridspec*. 
If *num2* is provided, the subplot will span between\n *num1*-th cell and *num2*-th cell **inclusive**.\n\n The index starts from 0.\n """\n def __init__(self, gridspec, num1, num2=None):\n self._gridspec = gridspec\n self.num1 = num1\n self.num2 = num2\n\n def __repr__(self):\n return (f"{self.get_gridspec()}["\n f"{self.rowspan.start}:{self.rowspan.stop}, "\n f"{self.colspan.start}:{self.colspan.stop}]")\n\n @staticmethod\n def _from_subplot_args(figure, args):\n """\n Construct a `.SubplotSpec` from a parent `.Figure` and either\n\n - a `.SubplotSpec` -- returned as is;\n - one or three numbers -- a MATLAB-style subplot specifier.\n """\n if len(args) == 1:\n arg, = args\n if isinstance(arg, SubplotSpec):\n return arg\n elif not isinstance(arg, Integral):\n raise ValueError(\n f"Single argument to subplot must be a three-digit "\n f"integer, not {arg!r}")\n try:\n rows, cols, num = map(int, str(arg))\n except ValueError:\n raise ValueError(\n f"Single argument to subplot must be a three-digit "\n f"integer, not {arg!r}") from None\n elif len(args) == 3:\n rows, cols, num = args\n else:\n raise _api.nargs_error("subplot", takes="1 or 3", given=len(args))\n\n gs = GridSpec._check_gridspec_exists(figure, rows, cols)\n if gs is None:\n gs = GridSpec(rows, cols, figure=figure)\n if isinstance(num, tuple) and len(num) == 2:\n if not all(isinstance(n, Integral) for n in num):\n raise ValueError(\n f"Subplot specifier tuple must contain integers, not {num}"\n )\n i, j = num\n else:\n if not isinstance(num, Integral) or num < 1 or num > rows*cols:\n raise ValueError(\n f"num must be an integer with 1 <= num <= {rows*cols}, "\n f"not {num!r}"\n )\n i = j = num\n return gs[i-1:j]\n\n # num2 is a property only to handle the case where it is None and someone\n # mutates num1.\n\n @property\n def num2(self):\n return self.num1 if self._num2 is None else self._num2\n\n @num2.setter\n def num2(self, value):\n self._num2 = value\n\n def get_gridspec(self):\n return 
self._gridspec\n\n def get_geometry(self):\n """\n Return the subplot geometry as tuple ``(n_rows, n_cols, start, stop)``.\n\n The indices *start* and *stop* define the range of the subplot within\n the `GridSpec`. *stop* is inclusive (i.e. for a single cell\n ``start == stop``).\n """\n rows, cols = self.get_gridspec().get_geometry()\n return rows, cols, self.num1, self.num2\n\n @property\n def rowspan(self):\n """The rows spanned by this subplot, as a `range` object."""\n ncols = self.get_gridspec().ncols\n return range(self.num1 // ncols, self.num2 // ncols + 1)\n\n @property\n def colspan(self):\n """The columns spanned by this subplot, as a `range` object."""\n ncols = self.get_gridspec().ncols\n # We explicitly support num2 referring to a column on num1's *left*, so\n # we must sort the column indices here so that the range makes sense.\n c1, c2 = sorted([self.num1 % ncols, self.num2 % ncols])\n return range(c1, c2 + 1)\n\n def is_first_row(self):\n return self.rowspan.start == 0\n\n def is_last_row(self):\n return self.rowspan.stop == self.get_gridspec().nrows\n\n def is_first_col(self):\n return self.colspan.start == 0\n\n def is_last_col(self):\n return self.colspan.stop == self.get_gridspec().ncols\n\n def get_position(self, figure):\n """\n Update the subplot position from ``figure.subplotpars``.\n """\n gridspec = self.get_gridspec()\n nrows, ncols = gridspec.get_geometry()\n rows, cols = np.unravel_index([self.num1, self.num2], (nrows, ncols))\n fig_bottoms, fig_tops, fig_lefts, fig_rights = \\n gridspec.get_grid_positions(figure)\n\n fig_bottom = fig_bottoms[rows].min()\n fig_top = fig_tops[rows].max()\n fig_left = fig_lefts[cols].min()\n fig_right = fig_rights[cols].max()\n return Bbox.from_extents(fig_left, fig_bottom, fig_right, fig_top)\n\n def get_topmost_subplotspec(self):\n """\n Return the topmost `SubplotSpec` instance associated with the subplot.\n """\n gridspec = self.get_gridspec()\n if hasattr(gridspec, "get_topmost_subplotspec"):\n 
return gridspec.get_topmost_subplotspec()\n else:\n return self\n\n def __eq__(self, other):\n """\n Two SubplotSpecs are considered equal if they refer to the same\n position(s) in the same `GridSpec`.\n """\n # other may not even have the attributes we are checking.\n return ((self._gridspec, self.num1, self.num2)\n == (getattr(other, "_gridspec", object()),\n getattr(other, "num1", object()),\n getattr(other, "num2", object())))\n\n def __hash__(self):\n return hash((self._gridspec, self.num1, self.num2))\n\n def subgridspec(self, nrows, ncols, **kwargs):\n """\n Create a GridSpec within this subplot.\n\n The created `.GridSpecFromSubplotSpec` will have this `SubplotSpec` as\n a parent.\n\n Parameters\n ----------\n nrows : int\n Number of rows in grid.\n\n ncols : int\n Number of columns in grid.\n\n Returns\n -------\n `.GridSpecFromSubplotSpec`\n\n Other Parameters\n ----------------\n **kwargs\n All other parameters are passed to `.GridSpecFromSubplotSpec`.\n\n See Also\n --------\n matplotlib.pyplot.subplots\n\n Examples\n --------\n Adding three subplots in the space occupied by a single subplot::\n\n fig = plt.figure()\n gs0 = fig.add_gridspec(3, 1)\n ax1 = fig.add_subplot(gs0[0])\n ax2 = fig.add_subplot(gs0[1])\n gssub = gs0[2].subgridspec(1, 3)\n for i in range(3):\n fig.add_subplot(gssub[0, i])\n """\n return GridSpecFromSubplotSpec(nrows, ncols, self, **kwargs)\n\n\nclass SubplotParams:\n """\n Parameters defining the positioning of a subplots grid in a figure.\n """\n\n def __init__(self, left=None, bottom=None, right=None, top=None,\n wspace=None, hspace=None):\n """\n Defaults are given by :rc:`figure.subplot.[name]`.\n\n Parameters\n ----------\n left : float\n The position of the left edge of the subplots,\n as a fraction of the figure width.\n right : float\n The position of the right edge of the subplots,\n as a fraction of the figure width.\n bottom : float\n The position of the bottom edge of the subplots,\n as a fraction of the figure 
height.\n top : float\n The position of the top edge of the subplots,\n as a fraction of the figure height.\n wspace : float\n The width of the padding between subplots,\n as a fraction of the average Axes width.\n hspace : float\n The height of the padding between subplots,\n as a fraction of the average Axes height.\n """\n for key in ["left", "bottom", "right", "top", "wspace", "hspace"]:\n setattr(self, key, mpl.rcParams[f"figure.subplot.{key}"])\n self.update(left, bottom, right, top, wspace, hspace)\n\n def update(self, left=None, bottom=None, right=None, top=None,\n wspace=None, hspace=None):\n """\n Update the dimensions of the passed parameters. *None* means unchanged.\n """\n if ((left if left is not None else self.left)\n >= (right if right is not None else self.right)):\n raise ValueError('left cannot be >= right')\n if ((bottom if bottom is not None else self.bottom)\n >= (top if top is not None else self.top)):\n raise ValueError('bottom cannot be >= top')\n if left is not None:\n self.left = left\n if right is not None:\n self.right = right\n if bottom is not None:\n self.bottom = bottom\n if top is not None:\n self.top = top\n if wspace is not None:\n self.wspace = wspace\n if hspace is not None:\n self.hspace = hspace\n | .venv\Lib\site-packages\matplotlib\gridspec.py | gridspec.py | Python | 29,786 | 0.95 | 0.181472 | 0.037202 | python-kit | 596 | 2024-11-09T07:34:00.909642 | BSD-3-Clause | false | b83a8280bd10ac6f968b9e92c1ea2a54 |
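The gridspec.py row above defines `GridSpec` indexing (`__getitem__` returning a `SubplotSpec`) and the row/column span properties. A minimal, illustrative sketch of that API; the grid shape, figure size, and ratio values below are arbitrary demo choices:

```python
# Sketch of the GridSpec API from the gridspec.py row above; the grid
# shape and width_ratios are arbitrary illustration values.
import matplotlib
matplotlib.use("Agg")  # headless backend so the sketch runs without a display
import matplotlib.pyplot as plt
from matplotlib.gridspec import GridSpec

fig = plt.figure(figsize=(6, 4))
gs = GridSpec(2, 2, figure=fig, width_ratios=[2, 1], hspace=0.3)

ax_top = fig.add_subplot(gs[0, :])    # SubplotSpec spanning both columns
ax_left = fig.add_subplot(gs[1, 0])
ax_right = fig.add_subplot(gs[1, 1])

print(gs.get_geometry())              # (2, 2)
print(gs[0, :].colspan)               # range(0, 2)
```

Indexing with a `(row, col)` tuple goes through the `_normalize` helper shown above, so negative indices and slices behave like NumPy indexing, with out-of-range keys raising `IndexError`.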
from typing import Any, Literal, overload\n\nfrom numpy.typing import ArrayLike\nimport numpy as np\n\nfrom matplotlib.axes import Axes\nfrom matplotlib.backend_bases import RendererBase\nfrom matplotlib.figure import Figure\nfrom matplotlib.transforms import Bbox\n\nclass GridSpecBase:\n def __init__(\n self,\n nrows: int,\n ncols: int,\n height_ratios: ArrayLike | None = ...,\n width_ratios: ArrayLike | None = ...,\n ) -> None: ...\n @property\n def nrows(self) -> int: ...\n @property\n def ncols(self) -> int: ...\n def get_geometry(self) -> tuple[int, int]: ...\n def get_subplot_params(self, figure: Figure | None = ...) -> SubplotParams: ...\n def new_subplotspec(\n self, loc: tuple[int, int], rowspan: int = ..., colspan: int = ...\n ) -> SubplotSpec: ...\n def set_width_ratios(self, width_ratios: ArrayLike | None) -> None: ...\n def get_width_ratios(self) -> ArrayLike: ...\n def set_height_ratios(self, height_ratios: ArrayLike | None) -> None: ...\n def get_height_ratios(self) -> ArrayLike: ...\n def get_grid_positions(\n self, fig: Figure\n ) -> tuple[np.ndarray, np.ndarray, np.ndarray, np.ndarray]: ...\n @staticmethod\n def _check_gridspec_exists(figure: Figure, nrows: int, ncols: int) -> GridSpec: ...\n def __getitem__(\n self, key: tuple[int | slice, int | slice] | slice | int\n ) -> SubplotSpec: ...\n @overload\n def subplots(\n self,\n *,\n sharex: bool | Literal["all", "row", "col", "none"] = ...,\n sharey: bool | Literal["all", "row", "col", "none"] = ...,\n squeeze: Literal[False],\n subplot_kw: dict[str, Any] | None = ...\n ) -> np.ndarray: ...\n @overload\n def subplots(\n self,\n *,\n sharex: bool | Literal["all", "row", "col", "none"] = ...,\n sharey: bool | Literal["all", "row", "col", "none"] = ...,\n squeeze: Literal[True] = ...,\n subplot_kw: dict[str, Any] | None = ...\n ) -> np.ndarray | Axes: ...\n\nclass GridSpec(GridSpecBase):\n left: float | None\n bottom: float | None\n right: float | None\n top: float | None\n wspace: float | None\n 
hspace: float | None\n figure: Figure | None\n def __init__(\n self,\n nrows: int,\n ncols: int,\n figure: Figure | None = ...,\n left: float | None = ...,\n bottom: float | None = ...,\n right: float | None = ...,\n top: float | None = ...,\n wspace: float | None = ...,\n hspace: float | None = ...,\n width_ratios: ArrayLike | None = ...,\n height_ratios: ArrayLike | None = ...,\n ) -> None: ...\n def update(self, **kwargs: float | None) -> None: ...\n def locally_modified_subplot_params(self) -> list[str]: ...\n def tight_layout(\n self,\n figure: Figure,\n renderer: RendererBase | None = ...,\n pad: float = ...,\n h_pad: float | None = ...,\n w_pad: float | None = ...,\n rect: tuple[float, float, float, float] | None = ...,\n ) -> None: ...\n\nclass GridSpecFromSubplotSpec(GridSpecBase):\n figure: Figure | None\n def __init__(\n self,\n nrows: int,\n ncols: int,\n subplot_spec: SubplotSpec,\n wspace: float | None = ...,\n hspace: float | None = ...,\n height_ratios: ArrayLike | None = ...,\n width_ratios: ArrayLike | None = ...,\n ) -> None: ...\n def get_topmost_subplotspec(self) -> SubplotSpec: ...\n\nclass SubplotSpec:\n num1: int\n def __init__(\n self, gridspec: GridSpecBase, num1: int, num2: int | None = ...\n ) -> None: ...\n @staticmethod\n def _from_subplot_args(figure, args): ...\n @property\n def num2(self) -> int: ...\n @num2.setter\n def num2(self, value: int) -> None: ...\n def get_gridspec(self) -> GridSpecBase: ...\n def get_geometry(self) -> tuple[int, int, int, int]: ...\n @property\n def rowspan(self) -> range: ...\n @property\n def colspan(self) -> range: ...\n def is_first_row(self) -> bool: ...\n def is_last_row(self) -> bool: ...\n def is_first_col(self) -> bool: ...\n def is_last_col(self) -> bool: ...\n def get_position(self, figure: Figure) -> Bbox: ...\n def get_topmost_subplotspec(self) -> SubplotSpec: ...\n def __eq__(self, other: object) -> bool: ...\n def __hash__(self) -> int: ...\n def subgridspec(\n self, nrows: int, ncols: int, 
**kwargs\n ) -> GridSpecFromSubplotSpec: ...\n\nclass SubplotParams:\n def __init__(\n self,\n left: float | None = ...,\n bottom: float | None = ...,\n right: float | None = ...,\n top: float | None = ...,\n wspace: float | None = ...,\n hspace: float | None = ...,\n ) -> None: ...\n left: float\n right: float\n bottom: float\n top: float\n wspace: float\n hspace: float\n def update(\n self,\n left: float | None = ...,\n bottom: float | None = ...,\n right: float | None = ...,\n top: float | None = ...,\n wspace: float | None = ...,\n hspace: float | None = ...,\n ) -> None: ...\n | .venv\Lib\site-packages\matplotlib\gridspec.pyi | gridspec.pyi | Other | 5,099 | 0.85 | 0.28125 | 0.013072 | vue-tools | 11 | 2025-03-03T01:03:44.078448 | GPL-3.0 | false | f25ddc4740087f72e4ca369264bc4eef |
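The gridspec.pyi row above declares two `subplots` overloads keyed on `squeeze`. A short runtime illustration of the difference; the grid shapes below are arbitrary:

```python
# With squeeze=False, GridSpecBase.subplots always returns a 2-D object
# array; with squeeze=True (the default), size-1 dimensions collapse and a
# 1x1 grid yields a bare Axes. Grid shapes here are arbitrary demo values.
import matplotlib
matplotlib.use("Agg")
import matplotlib.pyplot as plt
import numpy as np

arr = plt.figure().add_gridspec(2, 3).subplots(squeeze=False)
print(arr.shape)                      # (2, 3)

ax = plt.figure().add_gridspec(1, 1).subplots()
print(isinstance(ax, np.ndarray))     # False: a single Axes, not an array
```

This is why the `Literal[False]` overload can promise `np.ndarray` while the default overload must be typed `np.ndarray | Axes`.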
"""Contains classes for generating hatch patterns."""\n\nimport numpy as np\n\nfrom matplotlib import _api\nfrom matplotlib.path import Path\n\n\nclass HatchPatternBase:\n """The base class for a hatch pattern."""\n pass\n\n\nclass HorizontalHatch(HatchPatternBase):\n def __init__(self, hatch, density):\n self.num_lines = int((hatch.count('-') + hatch.count('+')) * density)\n self.num_vertices = self.num_lines * 2\n\n def set_vertices_and_codes(self, vertices, codes):\n steps, stepsize = np.linspace(0.0, 1.0, self.num_lines, False,\n retstep=True)\n steps += stepsize / 2.\n vertices[0::2, 0] = 0.0\n vertices[0::2, 1] = steps\n vertices[1::2, 0] = 1.0\n vertices[1::2, 1] = steps\n codes[0::2] = Path.MOVETO\n codes[1::2] = Path.LINETO\n\n\nclass VerticalHatch(HatchPatternBase):\n def __init__(self, hatch, density):\n self.num_lines = int((hatch.count('|') + hatch.count('+')) * density)\n self.num_vertices = self.num_lines * 2\n\n def set_vertices_and_codes(self, vertices, codes):\n steps, stepsize = np.linspace(0.0, 1.0, self.num_lines, False,\n retstep=True)\n steps += stepsize / 2.\n vertices[0::2, 0] = steps\n vertices[0::2, 1] = 0.0\n vertices[1::2, 0] = steps\n vertices[1::2, 1] = 1.0\n codes[0::2] = Path.MOVETO\n codes[1::2] = Path.LINETO\n\n\nclass NorthEastHatch(HatchPatternBase):\n def __init__(self, hatch, density):\n self.num_lines = int(\n (hatch.count('/') + hatch.count('x') + hatch.count('X')) * density)\n if self.num_lines:\n self.num_vertices = (self.num_lines + 1) * 2\n else:\n self.num_vertices = 0\n\n def set_vertices_and_codes(self, vertices, codes):\n steps = np.linspace(-0.5, 0.5, self.num_lines + 1)\n vertices[0::2, 0] = 0.0 + steps\n vertices[0::2, 1] = 0.0 - steps\n vertices[1::2, 0] = 1.0 + steps\n vertices[1::2, 1] = 1.0 - steps\n codes[0::2] = Path.MOVETO\n codes[1::2] = Path.LINETO\n\n\nclass SouthEastHatch(HatchPatternBase):\n def __init__(self, hatch, density):\n self.num_lines = int(\n (hatch.count('\\') + hatch.count('x') + 
hatch.count('X'))\n * density)\n if self.num_lines:\n self.num_vertices = (self.num_lines + 1) * 2\n else:\n self.num_vertices = 0\n\n def set_vertices_and_codes(self, vertices, codes):\n steps = np.linspace(-0.5, 0.5, self.num_lines + 1)\n vertices[0::2, 0] = 0.0 + steps\n vertices[0::2, 1] = 1.0 + steps\n vertices[1::2, 0] = 1.0 + steps\n vertices[1::2, 1] = 0.0 + steps\n codes[0::2] = Path.MOVETO\n codes[1::2] = Path.LINETO\n\n\nclass Shapes(HatchPatternBase):\n filled = False\n\n def __init__(self, hatch, density):\n if self.num_rows == 0:\n self.num_shapes = 0\n self.num_vertices = 0\n else:\n self.num_shapes = ((self.num_rows // 2 + 1) * (self.num_rows + 1) +\n (self.num_rows // 2) * self.num_rows)\n self.num_vertices = (self.num_shapes *\n len(self.shape_vertices) *\n (1 if self.filled else 2))\n\n def set_vertices_and_codes(self, vertices, codes):\n offset = 1.0 / self.num_rows\n shape_vertices = self.shape_vertices * offset * self.size\n shape_codes = self.shape_codes\n if not self.filled:\n shape_vertices = np.concatenate( # Forward, then backward.\n [shape_vertices, shape_vertices[::-1] * 0.9])\n shape_codes = np.concatenate([shape_codes, shape_codes])\n vertices_parts = []\n codes_parts = []\n for row in range(self.num_rows + 1):\n if row % 2 == 0:\n cols = np.linspace(0, 1, self.num_rows + 1)\n else:\n cols = np.linspace(offset / 2, 1 - offset / 2, self.num_rows)\n row_pos = row * offset\n for col_pos in cols:\n vertices_parts.append(shape_vertices + [col_pos, row_pos])\n codes_parts.append(shape_codes)\n np.concatenate(vertices_parts, out=vertices)\n np.concatenate(codes_parts, out=codes)\n\n\nclass Circles(Shapes):\n def __init__(self, hatch, density):\n path = Path.unit_circle()\n self.shape_vertices = path.vertices\n self.shape_codes = path.codes\n super().__init__(hatch, density)\n\n\nclass SmallCircles(Circles):\n size = 0.2\n\n def __init__(self, hatch, density):\n self.num_rows = (hatch.count('o')) * density\n super().__init__(hatch, 
density)\n\n\nclass LargeCircles(Circles):\n size = 0.35\n\n def __init__(self, hatch, density):\n self.num_rows = (hatch.count('O')) * density\n super().__init__(hatch, density)\n\n\nclass SmallFilledCircles(Circles):\n size = 0.1\n filled = True\n\n def __init__(self, hatch, density):\n self.num_rows = (hatch.count('.')) * density\n super().__init__(hatch, density)\n\n\nclass Stars(Shapes):\n size = 1.0 / 3.0\n filled = True\n\n def __init__(self, hatch, density):\n self.num_rows = (hatch.count('*')) * density\n path = Path.unit_regular_star(5)\n self.shape_vertices = path.vertices\n self.shape_codes = np.full(len(self.shape_vertices), Path.LINETO,\n dtype=Path.code_type)\n self.shape_codes[0] = Path.MOVETO\n super().__init__(hatch, density)\n\n_hatch_types = [\n HorizontalHatch,\n VerticalHatch,\n NorthEastHatch,\n SouthEastHatch,\n SmallCircles,\n LargeCircles,\n SmallFilledCircles,\n Stars\n ]\n\n\ndef _validate_hatch_pattern(hatch):\n valid_hatch_patterns = set(r'-+|/\xXoO.*')\n if hatch is not None:\n invalids = set(hatch).difference(valid_hatch_patterns)\n if invalids:\n valid = ''.join(sorted(valid_hatch_patterns))\n invalids = ''.join(sorted(invalids))\n _api.warn_deprecated(\n '3.4',\n removal='3.11', # one release after custom hatches (#20690)\n message=f'hatch must consist of a string of "{valid}" or '\n 'None, but found the following invalid values '\n f'"{invalids}". Passing invalid values is deprecated '\n 'since %(since)s and will become an error in %(removal)s.'\n )\n\n\ndef get_path(hatchpattern, density=6):\n """\n Given a hatch specifier, *hatchpattern*, generates Path to render\n the hatch in a unit square. 
*density* is the number of lines per\n unit square.\n """\n density = int(density)\n\n patterns = [hatch_type(hatchpattern, density)\n for hatch_type in _hatch_types]\n num_vertices = sum([pattern.num_vertices for pattern in patterns])\n\n if num_vertices == 0:\n return Path(np.empty((0, 2)))\n\n vertices = np.empty((num_vertices, 2))\n codes = np.empty(num_vertices, Path.code_type)\n\n cursor = 0\n for pattern in patterns:\n if pattern.num_vertices != 0:\n vertices_chunk = vertices[cursor:cursor + pattern.num_vertices]\n codes_chunk = codes[cursor:cursor + pattern.num_vertices]\n pattern.set_vertices_and_codes(vertices_chunk, codes_chunk)\n cursor += pattern.num_vertices\n\n return Path(vertices, codes)\n | .venv\Lib\site-packages\matplotlib\hatch.py | hatch.py | Python | 7,453 | 0.95 | 0.204444 | 0.005525 | vue-tools | 3 | 2024-09-11T12:06:03.074614 | GPL-3.0 | false | 9de3eb4188f6bda44032648f8b8c3363 |
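The hatch.py row above ends with `get_path`, the module's entry point that composes all pattern classes into one `Path` for the unit square. A small sketch of calling it directly, using the default density from the signature above:

```python
# Sketch of hatch.get_path, which renderers drive internally. For '/',
# only NorthEastHatch contributes: num_lines = 1 * density diagonals,
# giving (num_lines + 1) * 2 vertices; '-' gives num_lines * 2.
from matplotlib.hatch import get_path

p = get_path('/', density=6)   # 6 NE diagonals -> (6 + 1) * 2 vertices
print(p.vertices.shape)        # (14, 2)

p2 = get_path('-', density=6)  # 6 horizontal lines -> 6 * 2 vertices
print(p2.vertices.shape)       # (12, 2)
```

Each pattern class only reports `num_vertices` and fills a pre-allocated chunk, which is why `get_path` can allocate the vertex and code arrays once and hand out slices.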
from matplotlib.path import Path\n\nimport numpy as np\nfrom numpy.typing import ArrayLike\n\nclass HatchPatternBase: ...\n\nclass HorizontalHatch(HatchPatternBase):\n num_lines: int\n num_vertices: int\n def __init__(self, hatch: str, density: int) -> None: ...\n def set_vertices_and_codes(self, vertices: ArrayLike, codes: ArrayLike) -> None: ...\n\nclass VerticalHatch(HatchPatternBase):\n num_lines: int\n num_vertices: int\n def __init__(self, hatch: str, density: int) -> None: ...\n def set_vertices_and_codes(self, vertices: ArrayLike, codes: ArrayLike) -> None: ...\n\nclass NorthEastHatch(HatchPatternBase):\n num_lines: int\n num_vertices: int\n def __init__(self, hatch: str, density: int) -> None: ...\n def set_vertices_and_codes(self, vertices: ArrayLike, codes: ArrayLike) -> None: ...\n\nclass SouthEastHatch(HatchPatternBase):\n num_lines: int\n num_vertices: int\n def __init__(self, hatch: str, density: int) -> None: ...\n def set_vertices_and_codes(self, vertices: ArrayLike, codes: ArrayLike) -> None: ...\n\nclass Shapes(HatchPatternBase):\n filled: bool\n num_shapes: int\n num_vertices: int\n def __init__(self, hatch: str, density: int) -> None: ...\n def set_vertices_and_codes(self, vertices: ArrayLike, codes: ArrayLike) -> None: ...\n\nclass Circles(Shapes):\n shape_vertices: np.ndarray\n shape_codes: np.ndarray\n def __init__(self, hatch: str, density: int) -> None: ...\n\nclass SmallCircles(Circles):\n size: float\n num_rows: int\n def __init__(self, hatch: str, density: int) -> None: ...\n\nclass LargeCircles(Circles):\n size: float\n num_rows: int\n def __init__(self, hatch: str, density: int) -> None: ...\n\nclass SmallFilledCircles(Circles):\n size: float\n filled: bool\n num_rows: int\n def __init__(self, hatch: str, density: int) -> None: ...\n\nclass Stars(Shapes):\n size: float\n filled: bool\n num_rows: int\n shape_vertices: np.ndarray\n shape_codes: np.ndarray\n def __init__(self, hatch: str, density: int) -> None: ...\n\ndef 
get_path(hatchpattern: str, density: int = ...) -> Path: ...\n | .venv\Lib\site-packages\matplotlib\hatch.pyi | hatch.pyi | Other | 2,098 | 0.85 | 0.397059 | 0 | vue-tools | 919 | 2023-09-03T17:47:36.891398 | Apache-2.0 | false | 161f276bc98a0598a2bc5af50b805674 |
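The hatch.pyi stubs above describe the internal pattern classes; in user code the same machinery is reached through the `hatch` keyword on filled artists. A minimal sketch with arbitrary demo data:

```python
# User-facing side of the hatch module: filled patches accept a hatch
# string, and the renderer resolves it via get_path at draw time.
# The bar heights below are arbitrary demo values.
import matplotlib
matplotlib.use("Agg")  # headless backend for scripted use
import matplotlib.pyplot as plt

fig, ax = plt.subplots()
bars = ax.bar([0, 1, 2], [3, 1, 2], hatch='//', edgecolor='black')
print(bars[0].get_hatch())     # //
```

Repeating a character (here `'//'`) increases the pattern density, since each pattern class multiplies its character count by the density.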
"""\nThe image module supports basic image loading, rescaling and display\noperations.\n"""\n\nimport math\nimport os\nimport logging\nfrom pathlib import Path\nimport warnings\n\nimport numpy as np\nimport PIL.Image\nimport PIL.PngImagePlugin\n\nimport matplotlib as mpl\nfrom matplotlib import _api, cbook\n# For clarity, names from _image are given explicitly in this module\nfrom matplotlib import _image\n# For user convenience, the names from _image are also imported into\n# the image namespace\nfrom matplotlib._image import * # noqa: F401, F403\nimport matplotlib.artist as martist\nimport matplotlib.colorizer as mcolorizer\nfrom matplotlib.backend_bases import FigureCanvasBase\nimport matplotlib.colors as mcolors\nfrom matplotlib.transforms import (\n Affine2D, BboxBase, Bbox, BboxTransform, BboxTransformTo,\n IdentityTransform, TransformedBbox)\n\n_log = logging.getLogger(__name__)\n\n# map interpolation strings to module constants\n_interpd_ = {\n 'auto': _image.NEAREST, # this will use nearest or Hanning...\n 'none': _image.NEAREST, # fall back to nearest when not supported\n 'nearest': _image.NEAREST,\n 'bilinear': _image.BILINEAR,\n 'bicubic': _image.BICUBIC,\n 'spline16': _image.SPLINE16,\n 'spline36': _image.SPLINE36,\n 'hanning': _image.HANNING,\n 'hamming': _image.HAMMING,\n 'hermite': _image.HERMITE,\n 'kaiser': _image.KAISER,\n 'quadric': _image.QUADRIC,\n 'catrom': _image.CATROM,\n 'gaussian': _image.GAUSSIAN,\n 'bessel': _image.BESSEL,\n 'mitchell': _image.MITCHELL,\n 'sinc': _image.SINC,\n 'lanczos': _image.LANCZOS,\n 'blackman': _image.BLACKMAN,\n 'antialiased': _image.NEAREST, # this will use nearest or Hanning...\n}\n\ninterpolations_names = set(_interpd_)\n\n\ndef composite_images(images, renderer, magnification=1.0):\n """\n Composite a number of RGBA images into one. The images are\n composited in the order in which they appear in the *images* list.\n\n Parameters\n ----------\n images : list of Images\n Each must have a `make_image` method. 
For each image,\n `can_composite` should return `True`, though this is not\n enforced by this function. Each image must have a purely\n affine transformation with no shear.\n\n renderer : `.RendererBase`\n\n magnification : float, default: 1\n The additional magnification to apply for the renderer in use.\n\n Returns\n -------\n image : (M, N, 4) `numpy.uint8` array\n The composited RGBA image.\n offset_x, offset_y : float\n The (left, bottom) offset where the composited image should be placed\n in the output figure.\n """\n if len(images) == 0:\n return np.empty((0, 0, 4), dtype=np.uint8), 0, 0\n\n parts = []\n bboxes = []\n for image in images:\n data, x, y, trans = image.make_image(renderer, magnification)\n if data is not None:\n x *= magnification\n y *= magnification\n parts.append((data, x, y, image._get_scalar_alpha()))\n bboxes.append(\n Bbox([[x, y], [x + data.shape[1], y + data.shape[0]]]))\n\n if len(parts) == 0:\n return np.empty((0, 0, 4), dtype=np.uint8), 0, 0\n\n bbox = Bbox.union(bboxes)\n\n output = np.zeros(\n (int(bbox.height), int(bbox.width), 4), dtype=np.uint8)\n\n for data, x, y, alpha in parts:\n trans = Affine2D().translate(x - bbox.x0, y - bbox.y0)\n _image.resample(data, output, trans, _image.NEAREST,\n resample=False, alpha=alpha)\n\n return output, bbox.x0 / magnification, bbox.y0 / magnification\n\n\ndef _draw_list_compositing_images(\n renderer, parent, artists, suppress_composite=None):\n """\n Draw a sorted list of artists, compositing images into a single\n image where possible.\n\n For internal Matplotlib use only: It is here to reduce duplication\n between `Figure.draw` and `Axes.draw`, but otherwise should not be\n generally useful.\n """\n has_images = any(isinstance(x, _ImageBase) for x in artists)\n\n # override the renderer default if suppressComposite is not None\n not_composite = (suppress_composite if suppress_composite is not None\n else renderer.option_image_nocomposite())\n\n if not_composite or not has_images:\n for 
a in artists:\n a.draw(renderer)\n else:\n # Composite any adjacent images together\n image_group = []\n mag = renderer.get_image_magnification()\n\n def flush_images():\n if len(image_group) == 1:\n image_group[0].draw(renderer)\n elif len(image_group) > 1:\n data, l, b = composite_images(image_group, renderer, mag)\n if data.size != 0:\n gc = renderer.new_gc()\n gc.set_clip_rectangle(parent.bbox)\n gc.set_clip_path(parent.get_clip_path())\n renderer.draw_image(gc, round(l), round(b), data)\n gc.restore()\n del image_group[:]\n\n for a in artists:\n if (isinstance(a, _ImageBase) and a.can_composite() and\n a.get_clip_on() and not a.get_clip_path()):\n image_group.append(a)\n else:\n flush_images()\n a.draw(renderer)\n flush_images()\n\n\ndef _resample(\n image_obj, data, out_shape, transform, *, resample=None, alpha=1):\n """\n Convenience wrapper around `._image.resample` to resample *data* to\n *out_shape* (with a third dimension if *data* is RGBA) that takes care of\n allocating the output array and fetching the relevant properties from the\n Image object *image_obj*.\n """\n # AGG can only handle coordinates smaller than 24-bit signed integers,\n # so raise errors if the input data is larger than _image.resample can\n # handle.\n msg = ('Data with more than {n} cannot be accurately displayed. '\n 'Downsampling to less than {n} before displaying. 
'\n 'To remove this warning, manually downsample your data.')\n if data.shape[1] > 2**23:\n warnings.warn(msg.format(n='2**23 columns'))\n step = int(np.ceil(data.shape[1] / 2**23))\n data = data[:, ::step]\n transform = Affine2D().scale(step, 1) + transform\n if data.shape[0] > 2**24:\n warnings.warn(msg.format(n='2**24 rows'))\n step = int(np.ceil(data.shape[0] / 2**24))\n data = data[::step, :]\n transform = Affine2D().scale(1, step) + transform\n # decide if we need to apply anti-aliasing if the data is upsampled:\n # compare the number of displayed pixels to the number of\n # the data pixels.\n interpolation = image_obj.get_interpolation()\n if interpolation in ['antialiased', 'auto']:\n # don't antialias if upsampling by an integer number or\n # if zooming in more than a factor of 3\n pos = np.array([[0, 0], [data.shape[1], data.shape[0]]])\n disp = transform.transform(pos)\n dispx = np.abs(np.diff(disp[:, 0]))\n dispy = np.abs(np.diff(disp[:, 1]))\n if ((dispx > 3 * data.shape[1] or\n dispx == data.shape[1] or\n dispx == 2 * data.shape[1]) and\n (dispy > 3 * data.shape[0] or\n dispy == data.shape[0] or\n dispy == 2 * data.shape[0])):\n interpolation = 'nearest'\n else:\n interpolation = 'hanning'\n out = np.zeros(out_shape + data.shape[2:], data.dtype) # 2D->2D, 3D->3D.\n if resample is None:\n resample = image_obj.get_resample()\n _image.resample(data, out, transform,\n _interpd_[interpolation],\n resample,\n alpha,\n image_obj.get_filternorm(),\n image_obj.get_filterrad())\n return out\n\n\ndef _rgb_to_rgba(A):\n """\n Convert an RGB image to RGBA, as required by the image resample C++\n extension.\n """\n rgba = np.zeros((A.shape[0], A.shape[1], 4), dtype=A.dtype)\n rgba[:, :, :3] = A\n if rgba.dtype == np.uint8:\n rgba[:, :, 3] = 255\n else:\n rgba[:, :, 3] = 1.0\n return rgba\n\n\nclass _ImageBase(mcolorizer.ColorizingArtist):\n """\n Base class for images.\n\n interpolation and cmap default to their rc settings\n\n cmap is a colors.Colormap instance\n 
norm is a colors.Normalize instance to map luminance to 0-1\n\n extent is data axes (left, right, bottom, top) for making image plots\n registered with data plots. Default is to label the pixel\n centers with the zero-based row and column indices.\n\n Additional kwargs are matplotlib.artist properties\n """\n zorder = 0\n\n def __init__(self, ax,\n cmap=None,\n norm=None,\n colorizer=None,\n interpolation=None,\n origin=None,\n filternorm=True,\n filterrad=4.0,\n resample=False,\n *,\n interpolation_stage=None,\n **kwargs\n ):\n super().__init__(self._get_colorizer(cmap, norm, colorizer))\n if origin is None:\n origin = mpl.rcParams['image.origin']\n _api.check_in_list(["upper", "lower"], origin=origin)\n self.origin = origin\n self.set_filternorm(filternorm)\n self.set_filterrad(filterrad)\n self.set_interpolation(interpolation)\n self.set_interpolation_stage(interpolation_stage)\n self.set_resample(resample)\n self.axes = ax\n\n self._imcache = None\n\n self._internal_update(kwargs)\n\n def __str__(self):\n try:\n shape = self.get_shape()\n return f"{type(self).__name__}(shape={shape!r})"\n except RuntimeError:\n return type(self).__name__\n\n def __getstate__(self):\n # Save some space on the pickle by not saving the cache.\n return {**super().__getstate__(), "_imcache": None}\n\n def get_size(self):\n """Return the size of the image as tuple (numrows, numcols)."""\n return self.get_shape()[:2]\n\n def get_shape(self):\n """\n Return the shape of the image as tuple (numrows, numcols, channels).\n """\n if self._A is None:\n raise RuntimeError('You must first set the image array')\n\n return self._A.shape\n\n def set_alpha(self, alpha):\n """\n Set the alpha value used for blending - not supported on all backends.\n\n Parameters\n ----------\n alpha : float or 2D array-like or None\n """\n martist.Artist._set_alpha_for_array(self, alpha)\n if np.ndim(alpha) not in (0, 2):\n raise TypeError('alpha must be a float, two-dimensional '\n 'array, or None')\n 
self._imcache = None\n\n def _get_scalar_alpha(self):\n """\n Get a scalar alpha value to be applied to the artist as a whole.\n\n If the alpha value is a matrix, the method returns 1.0 because pixels\n have individual alpha values (see `~._ImageBase._make_image` for\n details). If the alpha value is a scalar, the method returns said value\n to be applied to the artist as a whole because pixels do not have\n individual alpha values.\n """\n return 1.0 if self._alpha is None or np.ndim(self._alpha) > 0 \\n else self._alpha\n\n def changed(self):\n """\n Call this whenever the mappable is changed so observers can update.\n """\n self._imcache = None\n super().changed()\n\n def _make_image(self, A, in_bbox, out_bbox, clip_bbox, magnification=1.0,\n unsampled=False, round_to_pixel_border=True):\n """\n Normalize, rescale, and colormap the image *A* from the given *in_bbox*\n (in data space), to the given *out_bbox* (in pixel space) clipped to\n the given *clip_bbox* (also in pixel space), and magnified by the\n *magnification* factor.\n\n Parameters\n ----------\n A : ndarray\n\n - a (M, N) array interpreted as scalar (greyscale) image,\n with one of the dtypes `~numpy.float32`, `~numpy.float64`,\n `~numpy.float128`, `~numpy.uint16` or `~numpy.uint8`.\n - (M, N, 4) RGBA image with a dtype of `~numpy.float32`,\n `~numpy.float64`, `~numpy.float128`, or `~numpy.uint8`.\n\n in_bbox : `~matplotlib.transforms.Bbox`\n\n out_bbox : `~matplotlib.transforms.Bbox`\n\n clip_bbox : `~matplotlib.transforms.Bbox`\n\n magnification : float, default: 1\n\n unsampled : bool, default: False\n If True, the image will not be scaled, but an appropriate\n affine transformation will be returned instead.\n\n round_to_pixel_border : bool, default: True\n If True, the output image size will be rounded to the nearest pixel\n boundary. 
This makes the images align correctly with the Axes.\n It should not be used if exact scaling is needed, such as for\n `.FigureImage`.\n\n Returns\n -------\n image : (M, N, 4) `numpy.uint8` array\n The RGBA image, resampled unless *unsampled* is True.\n x, y : float\n The upper left corner where the image should be drawn, in pixel\n space.\n trans : `~matplotlib.transforms.Affine2D`\n The affine transformation from image to pixel space.\n """\n if A is None:\n raise RuntimeError('You must first set the image '\n 'array or the image attribute')\n if A.size == 0:\n raise RuntimeError("_make_image must get a non-empty image. "\n "Your Artist's draw method must filter before "\n "this method is called.")\n\n clipped_bbox = Bbox.intersection(out_bbox, clip_bbox)\n\n if clipped_bbox is None:\n return None, 0, 0, None\n\n out_width_base = clipped_bbox.width * magnification\n out_height_base = clipped_bbox.height * magnification\n\n if out_width_base == 0 or out_height_base == 0:\n return None, 0, 0, None\n\n if self.origin == 'upper':\n # Flip the input image using a transform. This avoids the\n # problem with flipping the array, which results in a copy\n # when it is converted to contiguous in the C wrapper\n t0 = Affine2D().translate(0, -A.shape[0]).scale(1, -1)\n else:\n t0 = IdentityTransform()\n\n t0 += (\n Affine2D()\n .scale(\n in_bbox.width / A.shape[1],\n in_bbox.height / A.shape[0])\n .translate(in_bbox.x0, in_bbox.y0)\n + self.get_transform())\n\n t = (t0\n + (Affine2D()\n .translate(-clipped_bbox.x0, -clipped_bbox.y0)\n .scale(magnification)))\n\n # So that the image is aligned with the edge of the Axes, we want to\n # round up the output width to the next integer. 
This also means\n # scaling the transform slightly to account for the extra subpixel.\n if ((not unsampled) and t.is_affine and round_to_pixel_border and\n (out_width_base % 1.0 != 0.0 or out_height_base % 1.0 != 0.0)):\n out_width = math.ceil(out_width_base)\n out_height = math.ceil(out_height_base)\n extra_width = (out_width - out_width_base) / out_width_base\n extra_height = (out_height - out_height_base) / out_height_base\n t += Affine2D().scale(1.0 + extra_width, 1.0 + extra_height)\n else:\n out_width = int(out_width_base)\n out_height = int(out_height_base)\n out_shape = (out_height, out_width)\n\n if not unsampled:\n if not (A.ndim == 2 or A.ndim == 3 and A.shape[-1] in (3, 4)):\n raise ValueError(f"Invalid shape {A.shape} for image data")\n\n # if antialiased, this needs to change as window sizes\n # change:\n interpolation_stage = self._interpolation_stage\n if interpolation_stage in ['antialiased', 'auto']:\n pos = np.array([[0, 0], [A.shape[1], A.shape[0]]])\n disp = t.transform(pos)\n dispx = np.abs(np.diff(disp[:, 0])) / A.shape[1]\n dispy = np.abs(np.diff(disp[:, 1])) / A.shape[0]\n if (dispx < 3) or (dispy < 3):\n interpolation_stage = 'rgba'\n else:\n interpolation_stage = 'data'\n\n if A.ndim == 2 and interpolation_stage == 'data':\n # if we are a 2D array, then we are running through the\n # norm + colormap transformation. 
However, in general the\n # input data is not going to match the size on the screen so we\n # have to resample to the correct number of pixels\n\n if A.dtype.kind == 'f': # Float dtype: scale to same dtype.\n scaled_dtype = np.dtype("f8" if A.dtype.itemsize > 4 else "f4")\n if scaled_dtype.itemsize < A.dtype.itemsize:\n _api.warn_external(f"Casting input data from {A.dtype}"\n f" to {scaled_dtype} for imshow.")\n else: # Int dtype, likely.\n # TODO slice input array first\n # Scale to appropriately sized float: use float32 if the\n # dynamic range is small, to limit the memory footprint.\n da = A.max().astype("f8") - A.min().astype("f8")\n scaled_dtype = "f8" if da > 1e8 else "f4"\n\n # resample the input data to the correct resolution and shape\n A_resampled = _resample(self, A.astype(scaled_dtype), out_shape, t)\n\n # if using NoNorm, cast back to the original datatype\n if isinstance(self.norm, mcolors.NoNorm):\n A_resampled = A_resampled.astype(A.dtype)\n\n # Compute out_mask (what screen pixels include "bad" data\n # pixels) and out_alpha (to what extent screen pixels are\n # covered by data pixels: 0 outside the data extent, 1 inside\n # (even for bad data), and intermediate values at the edges).\n mask = (np.where(A.mask, np.float32(np.nan), np.float32(1))\n if A.mask.shape == A.shape # nontrivial mask\n else np.ones_like(A, np.float32))\n # we always have to interpolate the mask to account for\n # non-affine transformations\n out_alpha = _resample(self, mask, out_shape, t, resample=True)\n del mask # Make sure we don't use mask anymore!\n out_mask = np.isnan(out_alpha)\n out_alpha[out_mask] = 1\n # Apply the pixel-by-pixel alpha values if present\n alpha = self.get_alpha()\n if alpha is not None and np.ndim(alpha) > 0:\n out_alpha *= _resample(self, alpha, out_shape, t, resample=True)\n # mask and run through the norm\n resampled_masked = np.ma.masked_array(A_resampled, out_mask)\n output = self.norm(resampled_masked)\n else:\n if A.ndim == 2: # 
interpolation_stage = 'rgba'\n self.norm.autoscale_None(A)\n A = self.to_rgba(A)\n alpha = self.get_alpha()\n if alpha is None: # alpha parameter not specified\n if A.shape[2] == 3: # image has no alpha channel\n output_alpha = 255 if A.dtype == np.uint8 else 1.0\n else:\n output_alpha = _resample( # resample alpha channel\n self, A[..., 3], out_shape, t)\n output = _resample( # resample rgb channels\n self, _rgb_to_rgba(A[..., :3]), out_shape, t)\n elif np.ndim(alpha) > 0: # Array alpha\n # user-specified array alpha overrides the existing alpha channel\n output_alpha = _resample(self, alpha, out_shape, t)\n output = _resample(\n self, _rgb_to_rgba(A[..., :3]), out_shape, t)\n else: # Scalar alpha\n if A.shape[2] == 3: # broadcast scalar alpha\n output_alpha = (255 * alpha) if A.dtype == np.uint8 else alpha\n else: # or apply scalar alpha to existing alpha channel\n output_alpha = _resample(self, A[..., 3], out_shape, t) * alpha\n output = _resample(\n self, _rgb_to_rgba(A[..., :3]), out_shape, t)\n output[..., 3] = output_alpha # recombine rgb and alpha\n\n # output is now either a 2D array of normed (int or float) data\n # or an RGBA array of re-sampled input\n output = self.to_rgba(output, bytes=True, norm=False)\n # output is now a correctly sized RGBA array of uint8\n\n # Apply alpha *after* if the input was greyscale without a mask\n if A.ndim == 2:\n alpha = self._get_scalar_alpha()\n alpha_channel = output[:, :, 3]\n alpha_channel[:] = ( # Assignment will cast to uint8.\n alpha_channel.astype(np.float32) * out_alpha * alpha)\n\n else:\n if self._imcache is None:\n self._imcache = self.to_rgba(A, bytes=True, norm=(A.ndim == 2))\n output = self._imcache\n\n # Subset the input image to only the part that will be displayed.\n subset = TransformedBbox(clip_bbox, t0.inverted()).frozen()\n output = output[\n int(max(subset.ymin, 0)):\n int(min(subset.ymax + 1, output.shape[0])),\n int(max(subset.xmin, 0)):\n int(min(subset.xmax + 1, output.shape[1]))]\n\n t = 
Affine2D().translate(\n int(max(subset.xmin, 0)), int(max(subset.ymin, 0))) + t\n\n return output, clipped_bbox.x0, clipped_bbox.y0, t\n\n def make_image(self, renderer, magnification=1.0, unsampled=False):\n """\n Normalize, rescale, and colormap this image's data for rendering using\n *renderer*, with the given *magnification*.\n\n If *unsampled* is True, the image will not be scaled, but an\n appropriate affine transformation will be returned instead.\n\n Returns\n -------\n image : (M, N, 4) `numpy.uint8` array\n The RGBA image, resampled unless *unsampled* is True.\n x, y : float\n The upper left corner where the image should be drawn, in pixel\n space.\n trans : `~matplotlib.transforms.Affine2D`\n The affine transformation from image to pixel space.\n """\n raise NotImplementedError('The make_image method must be overridden')\n\n def _check_unsampled_image(self):\n """\n Return whether the image is better to be drawn unsampled.\n\n The derived class needs to override it.\n """\n return False\n\n @martist.allow_rasterization\n def draw(self, renderer):\n # if not visible, declare victory and return\n if not self.get_visible():\n self.stale = False\n return\n # for empty images, there is nothing to draw!\n if self.get_array().size == 0:\n self.stale = False\n return\n # actually render the image.\n gc = renderer.new_gc()\n self._set_gc_clip(gc)\n gc.set_alpha(self._get_scalar_alpha())\n gc.set_url(self.get_url())\n gc.set_gid(self.get_gid())\n if (renderer.option_scale_image() # Renderer supports transform kwarg.\n and self._check_unsampled_image()\n and self.get_transform().is_affine):\n im, l, b, trans = self.make_image(renderer, unsampled=True)\n if im is not None:\n trans = Affine2D().scale(im.shape[1], im.shape[0]) + trans\n renderer.draw_image(gc, l, b, im, trans)\n else:\n im, l, b, trans = self.make_image(\n renderer, renderer.get_image_magnification())\n if im is not None:\n renderer.draw_image(gc, l, b, im)\n gc.restore()\n self.stale = False\n\n def 
contains(self, mouseevent):\n """Test whether the mouse event occurred within the image."""\n if (self._different_canvas(mouseevent)\n # This doesn't work for figimage.\n or not self.axes.contains(mouseevent)[0]):\n return False, {}\n # TODO: make sure this is consistent with patch and patch\n # collection on nonlinear transformed coordinates.\n # TODO: consider returning image coordinates (shouldn't\n # be too difficult given that the image is rectilinear\n trans = self.get_transform().inverted()\n x, y = trans.transform([mouseevent.x, mouseevent.y])\n xmin, xmax, ymin, ymax = self.get_extent()\n # This checks xmin <= x <= xmax *or* xmax <= x <= xmin.\n inside = (x is not None and (x - xmin) * (x - xmax) <= 0\n and y is not None and (y - ymin) * (y - ymax) <= 0)\n return inside, {}\n\n def write_png(self, fname):\n """Write the image to png file *fname*."""\n im = self.to_rgba(self._A[::-1] if self.origin == 'lower' else self._A,\n bytes=True, norm=True)\n PIL.Image.fromarray(im).save(fname, format="png")\n\n @staticmethod\n def _normalize_image_array(A):\n """\n Check validity of image-like input *A* and normalize it to a format suitable for\n Image subclasses.\n """\n A = cbook.safe_masked_invalid(A, copy=True)\n if A.dtype != np.uint8 and not np.can_cast(A.dtype, float, "same_kind"):\n raise TypeError(f"Image data of dtype {A.dtype} cannot be "\n f"converted to float")\n if A.ndim == 3 and A.shape[-1] == 1:\n A = A.squeeze(-1) # If just (M, N, 1), assume scalar and apply colormap.\n if not (A.ndim == 2 or A.ndim == 3 and A.shape[-1] in [3, 4]):\n raise TypeError(f"Invalid shape {A.shape} for image data")\n if A.ndim == 3:\n # If the input data has values outside the valid range (after\n # normalisation), we issue a warning and then clip X to the bounds\n # - otherwise casting wraps extreme values, hiding outliers and\n # making reliable interpretation impossible.\n high = 255 if np.issubdtype(A.dtype, np.integer) else 1\n if A.min() < 0 or high < A.max():\n 
_log.warning(\n 'Clipping input data to the valid range for imshow with '\n 'RGB data ([0..1] for floats or [0..255] for integers). '\n 'Got range [%s..%s].',\n A.min(), A.max()\n )\n A = np.clip(A, 0, high)\n # Cast unsupported integer types to uint8\n if A.dtype != np.uint8 and np.issubdtype(A.dtype, np.integer):\n A = A.astype(np.uint8)\n return A\n\n def set_data(self, A):\n """\n Set the image array.\n\n Note that this function does *not* update the normalization used.\n\n Parameters\n ----------\n A : array-like or `PIL.Image.Image`\n """\n if isinstance(A, PIL.Image.Image):\n A = pil_to_array(A) # Needed e.g. to apply png palette.\n self._A = self._normalize_image_array(A)\n self._imcache = None\n self.stale = True\n\n def set_array(self, A):\n """\n Retained for backwards compatibility - use set_data instead.\n\n Parameters\n ----------\n A : array-like\n """\n # This also needs to be here to override the inherited\n # cm.ScalarMappable.set_array method so it is not invoked by mistake.\n self.set_data(A)\n\n def get_interpolation(self):\n """\n Return the interpolation method the image uses when resizing.\n\n One of 'auto', 'antialiased', 'nearest', 'bilinear', 'bicubic',\n 'spline16', 'spline36', 'hanning', 'hamming', 'hermite', 'kaiser',\n 'quadric', 'catrom', 'gaussian', 'bessel', 'mitchell', 'sinc', 'lanczos',\n or 'none'.\n """\n return self._interpolation\n\n def set_interpolation(self, s):\n """\n Set the interpolation method the image uses when resizing.\n\n If None, use :rc:`image.interpolation`. If 'none', the image is\n shown as is without interpolating. 
'none' is only supported in\n agg, ps and pdf backends and will fall back to 'nearest' mode\n for other backends.\n\n Parameters\n ----------\n s : {'auto', 'nearest', 'bilinear', 'bicubic', 'spline16', \\n'spline36', 'hanning', 'hamming', 'hermite', 'kaiser', 'quadric', 'catrom', \\n'gaussian', 'bessel', 'mitchell', 'sinc', 'lanczos', 'none'} or None\n """\n s = mpl._val_or_rc(s, 'image.interpolation').lower()\n _api.check_in_list(interpolations_names, interpolation=s)\n self._interpolation = s\n self.stale = True\n\n def get_interpolation_stage(self):\n """\n Return when interpolation happens during the transform to RGBA.\n\n One of 'data', 'rgba', 'auto'.\n """\n return self._interpolation_stage\n\n def set_interpolation_stage(self, s):\n """\n Set when interpolation happens during the transform to RGBA.\n\n Parameters\n ----------\n s : {'data', 'rgba', 'auto'} or None\n Whether to apply up/downsampling interpolation in data or RGBA\n space. If None, use :rc:`image.interpolation_stage`.\n If 'auto' we will check upsampling rate and if less\n than 3 then use 'rgba', otherwise use 'data'.\n """\n s = mpl._val_or_rc(s, 'image.interpolation_stage')\n _api.check_in_list(['data', 'rgba', 'auto'], s=s)\n self._interpolation_stage = s\n self.stale = True\n\n def can_composite(self):\n """Return whether the image can be composited with its neighbors."""\n trans = self.get_transform()\n return (\n self._interpolation != 'none' and\n trans.is_affine and\n trans.is_separable)\n\n def set_resample(self, v):\n """\n Set whether image resampling is used.\n\n Parameters\n ----------\n v : bool or None\n If None, use :rc:`image.resample`.\n """\n v = mpl._val_or_rc(v, 'image.resample')\n self._resample = v\n self.stale = True\n\n def get_resample(self):\n """Return whether image resampling is used."""\n return self._resample\n\n def set_filternorm(self, filternorm):\n """\n Set whether the resize filter normalizes the weights.\n\n See help for `~.Axes.imshow`.\n\n Parameters\n 
----------\n filternorm : bool\n """\n self._filternorm = bool(filternorm)\n self.stale = True\n\n def get_filternorm(self):\n """Return whether the resize filter normalizes the weights."""\n return self._filternorm\n\n def set_filterrad(self, filterrad):\n """\n Set the resize filter radius only applicable to some\n interpolation schemes -- see help for imshow\n\n Parameters\n ----------\n filterrad : positive float\n """\n r = float(filterrad)\n if r <= 0:\n raise ValueError("The filter radius must be a positive number")\n self._filterrad = r\n self.stale = True\n\n def get_filterrad(self):\n """Return the filterrad setting."""\n return self._filterrad\n\n\nclass AxesImage(_ImageBase):\n """\n An image with pixels on a regular grid, attached to an Axes.\n\n Parameters\n ----------\n ax : `~matplotlib.axes.Axes`\n The Axes the image will belong to.\n cmap : str or `~matplotlib.colors.Colormap`, default: :rc:`image.cmap`\n The Colormap instance or registered colormap name used to map scalar\n data to colors.\n norm : str or `~matplotlib.colors.Normalize`\n Maps luminance to 0-1.\n interpolation : str, default: :rc:`image.interpolation`\n Supported values are 'none', 'auto', 'nearest', 'bilinear',\n 'bicubic', 'spline16', 'spline36', 'hanning', 'hamming', 'hermite',\n 'kaiser', 'quadric', 'catrom', 'gaussian', 'bessel', 'mitchell',\n 'sinc', 'lanczos', 'blackman'.\n interpolation_stage : {'data', 'rgba'}, default: 'data'\n If 'data', interpolation\n is carried out on the data provided by the user. If 'rgba', the\n interpolation is carried out after the colormapping has been\n applied (visual interpolation).\n origin : {'upper', 'lower'}, default: :rc:`image.origin`\n Place the [0, 0] index of the array in the upper left or lower left\n corner of the Axes. The convention 'upper' is typically used for\n matrices and images.\n extent : tuple, optional\n The data axes (left, right, bottom, top) for making image plots\n registered with data plots. 
Default is to label the pixel\n centers with the zero-based row and column indices.\n filternorm : bool, default: True\n A parameter for the antigrain image resize filter\n (see the antigrain documentation).\n If filternorm is set, the filter normalizes integer values and corrects\n the rounding errors. It doesn't do anything with the source floating\n point values, it corrects only integers according to the rule of 1.0\n which means that any sum of pixel weights must be equal to 1.0. So,\n the filter function must produce a graph of the proper shape.\n filterrad : float > 0, default: 4\n The filter radius for filters that have a radius parameter, i.e. when\n interpolation is one of: 'sinc', 'lanczos' or 'blackman'.\n resample : bool, default: False\n When True, use a full resampling method. When False, only resample when\n the output image is larger than the input image.\n **kwargs : `~matplotlib.artist.Artist` properties\n """\n\n def __init__(self, ax,\n *,\n cmap=None,\n norm=None,\n colorizer=None,\n interpolation=None,\n origin=None,\n extent=None,\n filternorm=True,\n filterrad=4.0,\n resample=False,\n interpolation_stage=None,\n **kwargs\n ):\n\n self._extent = extent\n\n super().__init__(\n ax,\n cmap=cmap,\n norm=norm,\n colorizer=colorizer,\n interpolation=interpolation,\n origin=origin,\n filternorm=filternorm,\n filterrad=filterrad,\n resample=resample,\n interpolation_stage=interpolation_stage,\n **kwargs\n )\n\n def get_window_extent(self, renderer=None):\n x0, x1, y0, y1 = self._extent\n bbox = Bbox.from_extents([x0, y0, x1, y1])\n return bbox.transformed(self.get_transform())\n\n def make_image(self, renderer, magnification=1.0, unsampled=False):\n # docstring inherited\n trans = self.get_transform()\n # image is created in the canvas coordinate.\n x1, x2, y1, y2 = self.get_extent()\n bbox = Bbox(np.array([[x1, y1], [x2, y2]]))\n transformed_bbox = TransformedBbox(bbox, trans)\n clip = ((self.get_clip_box() or self.axes.bbox) if 
self.get_clip_on()\n else self.get_figure(root=True).bbox)\n return self._make_image(self._A, bbox, transformed_bbox, clip,\n magnification, unsampled=unsampled)\n\n def _check_unsampled_image(self):\n """Return whether the image would be better drawn unsampled."""\n return self.get_interpolation() == "none"\n\n def set_extent(self, extent, **kwargs):\n """\n Set the image extent.\n\n Parameters\n ----------\n extent : 4-tuple of float\n The position and size of the image as tuple\n ``(left, right, bottom, top)`` in data coordinates.\n **kwargs\n Other parameters from which unit info (i.e., the *xunits*,\n *yunits*, *zunits* (for 3D Axes), *runits* and *thetaunits* (for\n polar Axes) entries are applied, if present.\n\n Notes\n -----\n This updates `.Axes.dataLim`, and, if autoscaling, sets `.Axes.viewLim`\n to tightly fit the image, regardless of `~.Axes.dataLim`. Autoscaling\n state is not changed, so a subsequent call to `.Axes.autoscale_view`\n will redo the autoscaling in accord with `~.Axes.dataLim`.\n """\n (xmin, xmax), (ymin, ymax) = self.axes._process_unit_info(\n [("x", [extent[0], extent[1]]),\n ("y", [extent[2], extent[3]])],\n kwargs)\n if kwargs:\n raise _api.kwarg_error("set_extent", kwargs)\n xmin = self.axes._validate_converted_limits(\n xmin, self.convert_xunits)\n xmax = self.axes._validate_converted_limits(\n xmax, self.convert_xunits)\n ymin = self.axes._validate_converted_limits(\n ymin, self.convert_yunits)\n ymax = self.axes._validate_converted_limits(\n ymax, self.convert_yunits)\n extent = [xmin, xmax, ymin, ymax]\n\n self._extent = extent\n corners = (xmin, ymin), (xmax, ymax)\n self.axes.update_datalim(corners)\n self.sticky_edges.x[:] = [xmin, xmax]\n self.sticky_edges.y[:] = [ymin, ymax]\n if self.axes.get_autoscalex_on():\n self.axes.set_xlim((xmin, xmax), auto=None)\n if self.axes.get_autoscaley_on():\n self.axes.set_ylim((ymin, ymax), auto=None)\n self.stale = True\n\n def get_extent(self):\n """Return the image extent as tuple 
(left, right, bottom, top)."""\n if self._extent is not None:\n return self._extent\n else:\n sz = self.get_size()\n numrows, numcols = sz\n if self.origin == 'upper':\n return (-0.5, numcols-0.5, numrows-0.5, -0.5)\n else:\n return (-0.5, numcols-0.5, -0.5, numrows-0.5)\n\n def get_cursor_data(self, event):\n """\n Return the image value at the event position or *None* if the event is\n outside the image.\n\n See Also\n --------\n matplotlib.artist.Artist.get_cursor_data\n """\n xmin, xmax, ymin, ymax = self.get_extent()\n if self.origin == 'upper':\n ymin, ymax = ymax, ymin\n arr = self.get_array()\n data_extent = Bbox([[xmin, ymin], [xmax, ymax]])\n array_extent = Bbox([[0, 0], [arr.shape[1], arr.shape[0]]])\n trans = self.get_transform().inverted()\n trans += BboxTransform(boxin=data_extent, boxout=array_extent)\n point = trans.transform([event.x, event.y])\n if any(np.isnan(point)):\n return None\n j, i = point.astype(int)\n # Clip the coordinates at array bounds\n if not (0 <= i < arr.shape[0]) or not (0 <= j < arr.shape[1]):\n return None\n else:\n return arr[i, j]\n\n\nclass NonUniformImage(AxesImage):\n """\n An image with pixels on a rectilinear grid.\n\n In contrast to `.AxesImage`, where pixels are on a regular grid,\n NonUniformImage allows rows and columns with individual heights / widths.\n\n See also :doc:`/gallery/images_contours_and_fields/image_nonuniform`.\n """\n\n def __init__(self, ax, *, interpolation='nearest', **kwargs):\n """\n Parameters\n ----------\n ax : `~matplotlib.axes.Axes`\n The Axes the image will belong to.\n interpolation : {'nearest', 'bilinear'}, default: 'nearest'\n The interpolation scheme used in the resampling.\n **kwargs\n All other keyword arguments are identical to those of `.AxesImage`.\n """\n super().__init__(ax, **kwargs)\n self.set_interpolation(interpolation)\n\n def _check_unsampled_image(self):\n """Return False. 
Do not use unsampled image."""\n return False\n\n def make_image(self, renderer, magnification=1.0, unsampled=False):\n # docstring inherited\n if self._A is None:\n raise RuntimeError('You must first set the image array')\n if unsampled:\n raise ValueError('unsampled not supported on NonUniformImage')\n A = self._A\n if A.ndim == 2:\n if A.dtype != np.uint8:\n A = self.to_rgba(A, bytes=True)\n else:\n A = np.repeat(A[:, :, np.newaxis], 4, 2)\n A[:, :, 3] = 255\n else:\n if A.dtype != np.uint8:\n A = (255*A).astype(np.uint8)\n if A.shape[2] == 3:\n B = np.zeros(tuple([*A.shape[0:2], 4]), np.uint8)\n B[:, :, 0:3] = A\n B[:, :, 3] = 255\n A = B\n l, b, r, t = self.axes.bbox.extents\n width = int(((round(r) + 0.5) - (round(l) - 0.5)) * magnification)\n height = int(((round(t) + 0.5) - (round(b) - 0.5)) * magnification)\n\n invertedTransform = self.axes.transData.inverted()\n x_pix = invertedTransform.transform(\n [(x, b) for x in np.linspace(l, r, width)])[:, 0]\n y_pix = invertedTransform.transform(\n [(l, y) for y in np.linspace(b, t, height)])[:, 1]\n\n if self._interpolation == "nearest":\n x_mid = (self._Ax[:-1] + self._Ax[1:]) / 2\n y_mid = (self._Ay[:-1] + self._Ay[1:]) / 2\n x_int = x_mid.searchsorted(x_pix)\n y_int = y_mid.searchsorted(y_pix)\n # The following is equal to `A[y_int[:, None], x_int[None, :]]`,\n # but many times faster. 
Both casting to uint32 (to have an\n # effectively 1D array) and manual index flattening matter.\n im = (\n np.ascontiguousarray(A).view(np.uint32).ravel()[\n np.add.outer(y_int * A.shape[1], x_int)]\n .view(np.uint8).reshape((height, width, 4)))\n else: # self._interpolation == "bilinear"\n # Use np.interp to compute x_int/x_float has similar speed.\n x_int = np.clip(\n self._Ax.searchsorted(x_pix) - 1, 0, len(self._Ax) - 2)\n y_int = np.clip(\n self._Ay.searchsorted(y_pix) - 1, 0, len(self._Ay) - 2)\n idx_int = np.add.outer(y_int * A.shape[1], x_int)\n x_frac = np.clip(\n np.divide(x_pix - self._Ax[x_int], np.diff(self._Ax)[x_int],\n dtype=np.float32), # Downcasting helps with speed.\n 0, 1)\n y_frac = np.clip(\n np.divide(y_pix - self._Ay[y_int], np.diff(self._Ay)[y_int],\n dtype=np.float32),\n 0, 1)\n f00 = np.outer(1 - y_frac, 1 - x_frac)\n f10 = np.outer(y_frac, 1 - x_frac)\n f01 = np.outer(1 - y_frac, x_frac)\n f11 = np.outer(y_frac, x_frac)\n im = np.empty((height, width, 4), np.uint8)\n for chan in range(4):\n ac = A[:, :, chan].reshape(-1) # reshape(-1) avoids a copy.\n # Shifting the buffer start (`ac[offset:]`) avoids an array\n # addition (`ac[idx_int + offset]`).\n buf = f00 * ac[idx_int]\n buf += f10 * ac[A.shape[1]:][idx_int]\n buf += f01 * ac[1:][idx_int]\n buf += f11 * ac[A.shape[1] + 1:][idx_int]\n im[:, :, chan] = buf # Implicitly casts to uint8.\n return im, l, b, IdentityTransform()\n\n def set_data(self, x, y, A):\n """\n Set the grid for the pixel centers, and the pixel values.\n\n Parameters\n ----------\n x, y : 1D array-like\n Monotonic arrays of shapes (N,) and (M,), respectively, specifying\n pixel centers.\n A : array-like\n (M, N) `~numpy.ndarray` or masked array of values to be\n colormapped, or (M, N, 3) RGB array, or (M, N, 4) RGBA array.\n """\n A = self._normalize_image_array(A)\n x = np.array(x, np.float32)\n y = np.array(y, np.float32)\n if not (x.ndim == y.ndim == 1 and A.shape[:2] == y.shape + x.shape):\n raise 
TypeError("Axes don't match array shape")\n self._A = A\n self._Ax = x\n self._Ay = y\n self._imcache = None\n self.stale = True\n\n def set_array(self, *args):\n raise NotImplementedError('Method not supported')\n\n def set_interpolation(self, s):\n """\n Parameters\n ----------\n s : {'nearest', 'bilinear'} or None\n If None, use :rc:`image.interpolation`.\n """\n if s is not None and s not in ('nearest', 'bilinear'):\n raise NotImplementedError('Only nearest neighbor and '\n 'bilinear interpolations are supported')\n super().set_interpolation(s)\n\n def get_extent(self):\n if self._A is None:\n raise RuntimeError('Must set data first')\n return self._Ax[0], self._Ax[-1], self._Ay[0], self._Ay[-1]\n\n def set_filternorm(self, filternorm):\n pass\n\n def set_filterrad(self, filterrad):\n pass\n\n def set_norm(self, norm):\n if self._A is not None:\n raise RuntimeError('Cannot change colors after loading data')\n super().set_norm(norm)\n\n def set_cmap(self, cmap):\n if self._A is not None:\n raise RuntimeError('Cannot change colors after loading data')\n super().set_cmap(cmap)\n\n def get_cursor_data(self, event):\n # docstring inherited\n x, y = event.xdata, event.ydata\n if (x < self._Ax[0] or x > self._Ax[-1] or\n y < self._Ay[0] or y > self._Ay[-1]):\n return None\n j = np.searchsorted(self._Ax, x) - 1\n i = np.searchsorted(self._Ay, y) - 1\n return self._A[i, j]\n\n\nclass PcolorImage(AxesImage):\n """\n Make a pcolor-style plot with an irregular rectangular grid.\n\n This uses a variation of the original irregular image code,\n and it is used by pcolorfast for the corresponding grid type.\n """\n\n def __init__(self, ax,\n x=None,\n y=None,\n A=None,\n *,\n cmap=None,\n norm=None,\n colorizer=None,\n **kwargs\n ):\n """\n Parameters\n ----------\n ax : `~matplotlib.axes.Axes`\n The Axes the image will belong to.\n x, y : 1D array-like, optional\n Monotonic arrays of length N+1 and M+1, respectively, specifying\n rectangle boundaries. 
If not given, will default to\n ``range(N + 1)`` and ``range(M + 1)``, respectively.\n A : array-like\n The data to be color-coded. The interpretation depends on the\n shape:\n\n - (M, N) `~numpy.ndarray` or masked array: values to be colormapped\n - (M, N, 3): RGB array\n - (M, N, 4): RGBA array\n\n cmap : str or `~matplotlib.colors.Colormap`, default: :rc:`image.cmap`\n The Colormap instance or registered colormap name used to map\n scalar data to colors.\n norm : str or `~matplotlib.colors.Normalize`\n Maps luminance to 0-1.\n **kwargs : `~matplotlib.artist.Artist` properties\n """\n super().__init__(ax, norm=norm, cmap=cmap, colorizer=colorizer)\n self._internal_update(kwargs)\n if A is not None:\n self.set_data(x, y, A)\n\n def make_image(self, renderer, magnification=1.0, unsampled=False):\n # docstring inherited\n if self._A is None:\n raise RuntimeError('You must first set the image array')\n if unsampled:\n raise ValueError('unsampled not supported on PColorImage')\n\n if self._imcache is None:\n A = self.to_rgba(self._A, bytes=True)\n self._imcache = np.pad(A, [(1, 1), (1, 1), (0, 0)], "constant")\n padded_A = self._imcache\n bg = mcolors.to_rgba(self.axes.patch.get_facecolor(), 0)\n bg = (np.array(bg) * 255).astype(np.uint8)\n if (padded_A[0, 0] != bg).all():\n padded_A[[0, -1], :] = padded_A[:, [0, -1]] = bg\n\n l, b, r, t = self.axes.bbox.extents\n width = (round(r) + 0.5) - (round(l) - 0.5)\n height = (round(t) + 0.5) - (round(b) - 0.5)\n width = round(width * magnification)\n height = round(height * magnification)\n vl = self.axes.viewLim\n\n x_pix = np.linspace(vl.x0, vl.x1, width)\n y_pix = np.linspace(vl.y0, vl.y1, height)\n x_int = self._Ax.searchsorted(x_pix)\n y_int = self._Ay.searchsorted(y_pix)\n im = ( # See comment in NonUniformImage.make_image re: performance.\n padded_A.view(np.uint32).ravel()[\n np.add.outer(y_int * padded_A.shape[1], x_int)]\n .view(np.uint8).reshape((height, width, 4)))\n return im, l, b, IdentityTransform()\n\n def 
_check_unsampled_image(self):\n return False\n\n def set_data(self, x, y, A):\n """\n Set the grid for the rectangle boundaries, and the data values.\n\n Parameters\n ----------\n x, y : 1D array-like, optional\n Monotonic arrays of length N+1 and M+1, respectively, specifying\n rectangle boundaries. If not given, will default to\n ``range(N + 1)`` and ``range(M + 1)``, respectively.\n A : array-like\n The data to be color-coded. The interpretation depends on the\n shape:\n\n - (M, N) `~numpy.ndarray` or masked array: values to be colormapped\n - (M, N, 3): RGB array\n - (M, N, 4): RGBA array\n """\n A = self._normalize_image_array(A)\n x = np.arange(0., A.shape[1] + 1) if x is None else np.array(x, float).ravel()\n y = np.arange(0., A.shape[0] + 1) if y is None else np.array(y, float).ravel()\n if A.shape[:2] != (y.size - 1, x.size - 1):\n raise ValueError(\n "Axes don't match array shape. Got %s, expected %s." %\n (A.shape[:2], (y.size - 1, x.size - 1)))\n # For efficient cursor readout, ensure x and y are increasing.\n if x[-1] < x[0]:\n x = x[::-1]\n A = A[:, ::-1]\n if y[-1] < y[0]:\n y = y[::-1]\n A = A[::-1]\n self._A = A\n self._Ax = x\n self._Ay = y\n self._imcache = None\n self.stale = True\n\n def set_array(self, *args):\n raise NotImplementedError('Method not supported')\n\n def get_cursor_data(self, event):\n # docstring inherited\n x, y = event.xdata, event.ydata\n if (x < self._Ax[0] or x > self._Ax[-1] or\n y < self._Ay[0] or y > self._Ay[-1]):\n return None\n j = np.searchsorted(self._Ax, x) - 1\n i = np.searchsorted(self._Ay, y) - 1\n return self._A[i, j]\n\n\nclass FigureImage(_ImageBase):\n """An image attached to a figure."""\n\n zorder = 0\n\n _interpolation = 'nearest'\n\n def __init__(self, fig,\n *,\n cmap=None,\n norm=None,\n colorizer=None,\n offsetx=0,\n offsety=0,\n origin=None,\n **kwargs\n ):\n """\n cmap is a colors.Colormap instance\n norm is a colors.Normalize instance to map luminance to 0-1\n\n kwargs are an optional list of 
Artist keyword args\n """\n super().__init__(\n None,\n norm=norm,\n cmap=cmap,\n colorizer=colorizer,\n origin=origin\n )\n self.set_figure(fig)\n self.ox = offsetx\n self.oy = offsety\n self._internal_update(kwargs)\n self.magnification = 1.0\n\n def get_extent(self):\n """Return the image extent as tuple (left, right, bottom, top)."""\n numrows, numcols = self.get_size()\n return (-0.5 + self.ox, numcols-0.5 + self.ox,\n -0.5 + self.oy, numrows-0.5 + self.oy)\n\n def make_image(self, renderer, magnification=1.0, unsampled=False):\n # docstring inherited\n fig = self.get_figure(root=True)\n fac = renderer.dpi/fig.dpi\n # fac here is to account for pdf, eps, svg backends where\n # figure.dpi is set to 72. This means we need to scale the\n # image (using magnification) and offset it appropriately.\n bbox = Bbox([[self.ox/fac, self.oy/fac],\n [(self.ox/fac + self._A.shape[1]),\n (self.oy/fac + self._A.shape[0])]])\n width, height = fig.get_size_inches()\n width *= renderer.dpi\n height *= renderer.dpi\n clip = Bbox([[0, 0], [width, height]])\n return self._make_image(\n self._A, bbox, bbox, clip, magnification=magnification / fac,\n unsampled=unsampled, round_to_pixel_border=False)\n\n def set_data(self, A):\n """Set the image array."""\n super().set_data(A)\n self.stale = True\n\n\nclass BboxImage(_ImageBase):\n """The Image class whose size is determined by the given bbox."""\n\n def __init__(self, bbox,\n *,\n cmap=None,\n norm=None,\n colorizer=None,\n interpolation=None,\n origin=None,\n filternorm=True,\n filterrad=4.0,\n resample=False,\n **kwargs\n ):\n """\n cmap is a colors.Colormap instance\n norm is a colors.Normalize instance to map luminance to 0-1\n\n kwargs are an optional list of Artist keyword args\n """\n super().__init__(\n None,\n cmap=cmap,\n norm=norm,\n colorizer=colorizer,\n interpolation=interpolation,\n origin=origin,\n filternorm=filternorm,\n filterrad=filterrad,\n resample=resample,\n **kwargs\n )\n self.bbox = bbox\n\n def 
get_window_extent(self, renderer=None):\n if renderer is None:\n renderer = self.get_figure()._get_renderer()\n\n if isinstance(self.bbox, BboxBase):\n return self.bbox\n elif callable(self.bbox):\n return self.bbox(renderer)\n else:\n raise ValueError("Unknown type of bbox")\n\n def contains(self, mouseevent):\n """Test whether the mouse event occurred within the image."""\n if self._different_canvas(mouseevent) or not self.get_visible():\n return False, {}\n x, y = mouseevent.x, mouseevent.y\n inside = self.get_window_extent().contains(x, y)\n return inside, {}\n\n def make_image(self, renderer, magnification=1.0, unsampled=False):\n # docstring inherited\n width, height = renderer.get_canvas_width_height()\n bbox_in = self.get_window_extent(renderer).frozen()\n bbox_in._points /= [width, height]\n bbox_out = self.get_window_extent(renderer)\n clip = Bbox([[0, 0], [width, height]])\n self._transform = BboxTransformTo(clip)\n return self._make_image(\n self._A,\n bbox_in, bbox_out, clip, magnification, unsampled=unsampled)\n\n\ndef imread(fname, format=None):\n """\n Read an image from a file into an array.\n\n .. note::\n\n This function exists for historical reasons. It is recommended to\n use `PIL.Image.open` instead for loading images.\n\n Parameters\n ----------\n fname : str or file-like\n The image file to read: a filename, a URL or a file-like object opened\n in read-binary mode.\n\n Passing a URL is deprecated. Please open the URL\n for reading and pass the result to Pillow, e.g. with\n ``np.array(PIL.Image.open(urllib.request.urlopen(url)))``.\n format : str, optional\n The image file format assumed for reading the data. The image is\n loaded as a PNG file if *format* is set to "png", if *fname* is a path\n or opened file with a ".png" extension, or if it is a URL. In all\n other cases, *format* is ignored and the format is auto-detected by\n `PIL.Image.open`.\n\n Returns\n -------\n `numpy.array`\n The image data. 
The returned array has shape\n\n - (M, N) for grayscale images.\n - (M, N, 3) for RGB images.\n - (M, N, 4) for RGBA images.\n\n PNG images are returned as float arrays (0-1). All other formats are\n returned as int arrays, with a bit depth determined by the file's\n contents.\n """\n # hide imports to speed initial import on systems with slow linkers\n from urllib import parse\n\n if format is None:\n if isinstance(fname, str):\n parsed = parse.urlparse(fname)\n # If the string is a URL (Windows paths appear as if they have a\n # length-1 scheme), assume png.\n if len(parsed.scheme) > 1:\n ext = 'png'\n else:\n ext = Path(fname).suffix.lower()[1:]\n elif hasattr(fname, 'geturl'): # Returned by urlopen().\n # We could try to parse the url's path and use the extension, but\n # returning png is consistent with the block above. Note that this\n # if clause has to come before checking for fname.name as\n # urlopen("file:///...") also has a name attribute (with the fixed\n # value "<urllib response>").\n ext = 'png'\n elif hasattr(fname, 'name'):\n ext = Path(fname.name).suffix.lower()[1:]\n else:\n ext = 'png'\n else:\n ext = format\n img_open = (\n PIL.PngImagePlugin.PngImageFile if ext == 'png' else PIL.Image.open)\n if isinstance(fname, str) and len(parse.urlparse(fname).scheme) > 1:\n # Pillow doesn't handle URLs directly.\n raise ValueError(\n "Please open the URL for reading and pass the "\n "result to Pillow, e.g. with "\n "``np.array(PIL.Image.open(urllib.request.urlopen(url)))``."\n )\n with img_open(fname) as image:\n return (_pil_png_to_float_array(image)\n if isinstance(image, PIL.PngImagePlugin.PngImageFile) else\n pil_to_array(image))\n\n\ndef imsave(fname, arr, vmin=None, vmax=None, cmap=None, format=None,\n origin=None, dpi=100, *, metadata=None, pil_kwargs=None):\n """\n Colormap and save an array as an image file.\n\n RGB(A) images are passed through. Single channel images will be\n colormapped according to *cmap* and *norm*.\n\n .. 
note::\n\n If you want to save a single channel image as gray scale please use an\n image I/O library (such as pillow, tifffile, or imageio) directly.\n\n Parameters\n ----------\n fname : str or path-like or file-like\n A path or a file-like object to store the image in.\n If *format* is not set, then the output format is inferred from the\n extension of *fname*, if any, and from :rc:`savefig.format` otherwise.\n If *format* is set, it determines the output format.\n arr : array-like\n The image data. Accepts NumPy arrays or sequences\n (e.g., lists or tuples). The shape can be one of\n MxN (luminance), MxNx3 (RGB) or MxNx4 (RGBA).\n vmin, vmax : float, optional\n *vmin* and *vmax* set the color scaling for the image by fixing the\n values that map to the colormap color limits. If either *vmin*\n or *vmax* is None, that limit is determined from the *arr*\n min/max value.\n cmap : str or `~matplotlib.colors.Colormap`, default: :rc:`image.cmap`\n A Colormap instance or registered colormap name. The colormap\n maps scalar data to colors. It is ignored for RGB(A) data.\n format : str, optional\n The file format, e.g. 'png', 'pdf', 'svg', ... The behavior when this\n is unset is documented under *fname*.\n origin : {'upper', 'lower'}, default: :rc:`image.origin`\n Indicates whether the ``(0, 0)`` index of the array is in the upper\n left or lower left corner of the Axes.\n dpi : float\n The DPI to store in the metadata of the file. This does not affect the\n resolution of the output image. Depending on file format, this may be\n rounded to the nearest integer.\n metadata : dict, optional\n Metadata in the image file. The supported keys depend on the output\n format, see the documentation of the respective backends for more\n information.\n Currently only supported for "png", "pdf", "ps", "eps", and "svg".\n pil_kwargs : dict, optional\n Keyword arguments passed to `PIL.Image.Image.save`. 
If the 'pnginfo'\n key is present, it completely overrides *metadata*, including the\n default 'Software' key.\n """\n from matplotlib.figure import Figure\n\n # Normalizing input (e.g., list or tuples) to NumPy array if needed\n arr = np.asanyarray(arr)\n\n if isinstance(fname, os.PathLike):\n fname = os.fspath(fname)\n if format is None:\n format = (Path(fname).suffix[1:] if isinstance(fname, str)\n else mpl.rcParams["savefig.format"]).lower()\n if format in ["pdf", "ps", "eps", "svg"]:\n # Vector formats that are not handled by PIL.\n if pil_kwargs is not None:\n raise ValueError(\n f"Cannot use 'pil_kwargs' when saving to {format}")\n fig = Figure(dpi=dpi, frameon=False)\n fig.figimage(arr, cmap=cmap, vmin=vmin, vmax=vmax, origin=origin,\n resize=True)\n fig.savefig(fname, dpi=dpi, format=format, transparent=True,\n metadata=metadata)\n else:\n # Don't bother creating an image; this avoids rounding errors on the\n # size when dividing and then multiplying by dpi.\n if origin is None:\n origin = mpl.rcParams["image.origin"]\n else:\n _api.check_in_list(('upper', 'lower'), origin=origin)\n if origin == "lower":\n arr = arr[::-1]\n if (isinstance(arr, memoryview) and arr.format == "B"\n and arr.ndim == 3 and arr.shape[-1] == 4):\n # Such an ``arr`` would also be handled fine by sm.to_rgba below\n # (after casting with asarray), but it is useful to special-case it\n # because that's what backend_agg passes, and can be in fact used\n # as is, saving a few operations.\n rgba = arr\n else:\n sm = mcolorizer.Colorizer(cmap=cmap)\n sm.set_clim(vmin, vmax)\n rgba = sm.to_rgba(arr, bytes=True)\n if pil_kwargs is None:\n pil_kwargs = {}\n else:\n # we modify this below, so make a copy (don't modify caller's dict)\n pil_kwargs = pil_kwargs.copy()\n pil_shape = (rgba.shape[1], rgba.shape[0])\n rgba = np.require(rgba, requirements='C')\n image = PIL.Image.frombuffer(\n "RGBA", pil_shape, rgba, "raw", "RGBA", 0, 1)\n if format == "png":\n # Only use the metadata kwarg if 
pnginfo is not set, because the\n # semantics of duplicate keys in pnginfo is unclear.\n if "pnginfo" in pil_kwargs:\n if metadata:\n _api.warn_external("'metadata' is overridden by the "\n "'pnginfo' entry in 'pil_kwargs'.")\n else:\n metadata = {\n "Software": (f"Matplotlib version{mpl.__version__}, "\n f"https://matplotlib.org/"),\n **(metadata if metadata is not None else {}),\n }\n pil_kwargs["pnginfo"] = pnginfo = PIL.PngImagePlugin.PngInfo()\n for k, v in metadata.items():\n if v is not None:\n pnginfo.add_text(k, v)\n elif metadata is not None:\n raise ValueError(f"metadata not supported for format {format!r}")\n if format in ["jpg", "jpeg"]:\n format = "jpeg" # Pillow doesn't recognize "jpg".\n facecolor = mpl.rcParams["savefig.facecolor"]\n if cbook._str_equal(facecolor, "auto"):\n facecolor = mpl.rcParams["figure.facecolor"]\n color = tuple(int(x * 255) for x in mcolors.to_rgb(facecolor))\n background = PIL.Image.new("RGB", pil_shape, color)\n background.paste(image, image)\n image = background\n pil_kwargs.setdefault("format", format)\n pil_kwargs.setdefault("dpi", (dpi, dpi))\n image.save(fname, **pil_kwargs)\n\n\ndef pil_to_array(pilImage):\n """\n Load a `PIL image`_ and return it as a numpy int array.\n\n .. 
_PIL image: https://pillow.readthedocs.io/en/latest/reference/Image.html\n\n Returns\n -------\n numpy.array\n\n The array shape depends on the image type:\n\n - (M, N) for grayscale images.\n - (M, N, 3) for RGB images.\n - (M, N, 4) for RGBA images.\n """\n if pilImage.mode in ['RGBA', 'RGBX', 'RGB', 'L']:\n # return MxNx4 RGBA, MxNx3 RGB, or MxN luminance array\n return np.asarray(pilImage)\n elif pilImage.mode.startswith('I;16'):\n # return MxN luminance array of uint16\n raw = pilImage.tobytes('raw', pilImage.mode)\n if pilImage.mode.endswith('B'):\n x = np.frombuffer(raw, '>u2')\n else:\n x = np.frombuffer(raw, '<u2')\n return x.reshape(pilImage.size[::-1]).astype('=u2')\n else: # try to convert to an rgba image\n try:\n pilImage = pilImage.convert('RGBA')\n except ValueError as err:\n raise RuntimeError('Unknown image mode') from err\n return np.asarray(pilImage) # return MxNx4 RGBA array\n\n\ndef _pil_png_to_float_array(pil_png):\n """Convert a PIL `PNGImageFile` to a 0-1 float array."""\n # Unlike pil_to_array this converts to 0-1 float32s for backcompat with the\n # old libpng-based loader.\n # The supported rawmodes are from PIL.PngImagePlugin._MODES. 
When\n # mode == "RGB(A)", the 16-bit raw data has already been coarsened to 8-bit\n # by Pillow.\n mode = pil_png.mode\n rawmode = pil_png.png.im_rawmode\n if rawmode == "1": # Grayscale.\n return np.asarray(pil_png, np.float32)\n if rawmode == "L;2": # Grayscale.\n return np.divide(pil_png, 2**2 - 1, dtype=np.float32)\n if rawmode == "L;4": # Grayscale.\n return np.divide(pil_png, 2**4 - 1, dtype=np.float32)\n if rawmode == "L": # Grayscale.\n return np.divide(pil_png, 2**8 - 1, dtype=np.float32)\n if rawmode == "I;16B": # Grayscale.\n return np.divide(pil_png, 2**16 - 1, dtype=np.float32)\n if mode == "RGB": # RGB.\n return np.divide(pil_png, 2**8 - 1, dtype=np.float32)\n if mode == "P": # Palette.\n return np.divide(pil_png.convert("RGBA"), 2**8 - 1, dtype=np.float32)\n if mode == "LA": # Grayscale + alpha.\n return np.divide(pil_png.convert("RGBA"), 2**8 - 1, dtype=np.float32)\n if mode == "RGBA": # RGBA.\n return np.divide(pil_png, 2**8 - 1, dtype=np.float32)\n raise ValueError(f"Unknown PIL rawmode: {rawmode}")\n\n\ndef thumbnail(infile, thumbfile, scale=0.1, interpolation='bilinear',\n preview=False):\n """\n Make a thumbnail of image in *infile* with output filename *thumbfile*.\n\n See :doc:`/gallery/misc/image_thumbnail_sgskip`.\n\n Parameters\n ----------\n infile : str or file-like\n The image file. Matplotlib relies on Pillow_ for image reading, and\n thus supports a wide range of file formats, including PNG, JPG, TIFF\n and others.\n\n .. _Pillow: https://python-pillow.github.io\n\n thumbfile : str or file-like\n The thumbnail filename.\n\n scale : float, default: 0.1\n The scale factor for the thumbnail.\n\n interpolation : str, default: 'bilinear'\n The interpolation scheme used in the resampling. 
See the\n *interpolation* parameter of `~.Axes.imshow` for possible values.\n\n preview : bool, default: False\n If True, the default backend (presumably a user interface\n backend) will be used which will cause a figure to be raised if\n `~matplotlib.pyplot.show` is called. If it is False, the figure is\n created using `.FigureCanvasBase` and the drawing backend is selected\n as `.Figure.savefig` would normally do.\n\n Returns\n -------\n `.Figure`\n The figure instance containing the thumbnail.\n """\n\n im = imread(infile)\n rows, cols, depth = im.shape\n\n # This doesn't really matter (it cancels in the end) but the API needs it.\n dpi = 100\n\n height = rows / dpi * scale\n width = cols / dpi * scale\n\n if preview:\n # Let the UI backend do everything.\n import matplotlib.pyplot as plt\n fig = plt.figure(figsize=(width, height), dpi=dpi)\n else:\n from matplotlib.figure import Figure\n fig = Figure(figsize=(width, height), dpi=dpi)\n FigureCanvasBase(fig)\n\n ax = fig.add_axes([0, 0, 1, 1], aspect='auto',\n frameon=False, xticks=[], yticks=[])\n ax.imshow(im, aspect='auto', resample=True, interpolation=interpolation)\n fig.savefig(thumbfile, dpi=dpi)\n return fig\n | .venv\Lib\site-packages\matplotlib\image.py | image.py | Python | 69,765 | 0.75 | 0.173572 | 0.085038 | node-utils | 644 | 2024-06-07T11:12:12.744272 | MIT | false | e004408721bd7a998333769a94be8e34 |
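The `image.py` row above relies on a performance trick in `NonUniformImage.make_image`: nearest-neighbor resampling is done by `searchsorted` against the midpoints between pixel centers, and the (H, W, 4) RGBA gather is flattened by viewing the buffer as `uint32`. A minimal, standalone sketch of that technique (the function name `nearest_gather` is ours, not matplotlib API):

```python
import numpy as np

def nearest_gather(A, Ax, Ay, x_pix, y_pix):
    """Nearest-neighbor gather in the style of NonUniformImage.make_image.

    A is an (M, N, 4) uint8 RGBA buffer; Ax/Ay are monotonic pixel-center
    coordinates; x_pix/y_pix are the output sample positions.
    """
    # searchsorted against the midpoints between adjacent centers picks, for
    # each sample position, the index of the nearest pixel center.
    x_int = ((Ax[:-1] + Ax[1:]) / 2).searchsorted(x_pix)
    y_int = ((Ay[:-1] + Ay[1:]) / 2).searchsorted(y_pix)
    # Viewing each RGBA pixel as one uint32 makes the buffer effectively 1D,
    # so a single flat fancy-index with manually flattened indices replaces
    # the 2D gather A[y_int[:, None], x_int[None, :]].
    return (np.ascontiguousarray(A).view(np.uint32).ravel()[
                np.add.outer(y_int * A.shape[1], x_int)]
            .view(np.uint8).reshape((len(y_pix), len(x_pix), 4)))
```

The pack/unpack round trip is endianness-safe, and the result is bit-identical to the plain 2D fancy-indexing it replaces; the source comment claims the flat form is "many times faster" because it avoids a 2D advanced-indexing pass.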
from collections.abc import Callable, Sequence\nimport os\nimport pathlib\nfrom typing import Any, BinaryIO, Literal\n\nimport numpy as np\nfrom numpy.typing import ArrayLike, NDArray\nimport PIL.Image\n\nfrom matplotlib.axes import Axes\nfrom matplotlib import colorizer\nfrom matplotlib.backend_bases import RendererBase, MouseEvent\nfrom matplotlib.colorizer import Colorizer\nfrom matplotlib.colors import Colormap, Normalize\nfrom matplotlib.figure import Figure\nfrom matplotlib.transforms import Affine2D, BboxBase, Bbox, Transform\n\n#\n# These names are re-exported from matplotlib._image.\n#\n\nBESSEL: int\nBICUBIC: int\nBILINEAR: int\nBLACKMAN: int\nCATROM: int\nGAUSSIAN: int\nHAMMING: int\nHANNING: int\nHERMITE: int\nKAISER: int\nLANCZOS: int\nMITCHELL: int\nNEAREST: int\nQUADRIC: int\nSINC: int\nSPLINE16: int\nSPLINE36: int\n\ndef resample(\n input_array: NDArray[np.float32] | NDArray[np.float64] | NDArray[np.int8],\n output_array: NDArray[np.float32] | NDArray[np.float64] | NDArray[np.int8],\n transform: Transform,\n interpolation: int = ...,\n resample: bool = ...,\n alpha: float = ...,\n norm: bool = ...,\n radius: float = ...,\n) -> None: ...\n\n#\n# END names re-exported from matplotlib._image.\n#\n\ninterpolations_names: set[str]\n\ndef composite_images(\n images: Sequence[_ImageBase], renderer: RendererBase, magnification: float = ...\n) -> tuple[np.ndarray, float, float]: ...\n\nclass _ImageBase(colorizer.ColorizingArtist):\n zorder: float\n origin: Literal["upper", "lower"]\n axes: Axes\n def __init__(\n self,\n ax: Axes,\n cmap: str | Colormap | None = ...,\n norm: str | Normalize | None = ...,\n colorizer: Colorizer | None = ...,\n interpolation: str | None = ...,\n origin: Literal["upper", "lower"] | None = ...,\n filternorm: bool = ...,\n filterrad: float = ...,\n resample: bool | None = ...,\n *,\n interpolation_stage: Literal["data", "rgba", "auto"] | None = ...,\n **kwargs\n ) -> None: ...\n def get_size(self) -> tuple[int, int]: ...\n def 
set_alpha(self, alpha: float | ArrayLike | None) -> None: ...\n def changed(self) -> None: ...\n def make_image(\n self, renderer: RendererBase, magnification: float = ..., unsampled: bool = ...\n ) -> tuple[np.ndarray, float, float, Affine2D]: ...\n def draw(self, renderer: RendererBase) -> None: ...\n def write_png(self, fname: str | pathlib.Path | BinaryIO) -> None: ...\n def set_data(self, A: ArrayLike | None) -> None: ...\n def set_array(self, A: ArrayLike | None) -> None: ...\n def get_shape(self) -> tuple[int, int, int]: ...\n def get_interpolation(self) -> str: ...\n def set_interpolation(self, s: str | None) -> None: ...\n def get_interpolation_stage(self) -> Literal["data", "rgba", "auto"]: ...\n def set_interpolation_stage(self, s: Literal["data", "rgba", "auto"]) -> None: ...\n def can_composite(self) -> bool: ...\n def set_resample(self, v: bool | None) -> None: ...\n def get_resample(self) -> bool: ...\n def set_filternorm(self, filternorm: bool) -> None: ...\n def get_filternorm(self) -> bool: ...\n def set_filterrad(self, filterrad: float) -> None: ...\n def get_filterrad(self) -> float: ...\n\nclass AxesImage(_ImageBase):\n def __init__(\n self,\n ax: Axes,\n *,\n cmap: str | Colormap | None = ...,\n norm: str | Normalize | None = ...,\n colorizer: Colorizer | None = ...,\n interpolation: str | None = ...,\n origin: Literal["upper", "lower"] | None = ...,\n extent: tuple[float, float, float, float] | None = ...,\n filternorm: bool = ...,\n filterrad: float = ...,\n resample: bool = ...,\n interpolation_stage: Literal["data", "rgba", "auto"] | None = ...,\n **kwargs\n ) -> None: ...\n def get_window_extent(self, renderer: RendererBase | None = ...) 
-> Bbox: ...\n def make_image(\n self, renderer: RendererBase, magnification: float = ..., unsampled: bool = ...\n ) -> tuple[np.ndarray, float, float, Affine2D]: ...\n def set_extent(\n self, extent: tuple[float, float, float, float], **kwargs\n ) -> None: ...\n def get_extent(self) -> tuple[float, float, float, float]: ...\n def get_cursor_data(self, event: MouseEvent) -> None | float: ...\n\nclass NonUniformImage(AxesImage):\n mouseover: bool\n def __init__(\n self, ax: Axes, *, interpolation: Literal["nearest", "bilinear"] = ..., **kwargs\n ) -> None: ...\n def set_data(self, x: ArrayLike, y: ArrayLike, A: ArrayLike) -> None: ... # type: ignore[override]\n # more limited interpolation available here than base class\n def set_interpolation(self, s: Literal["nearest", "bilinear"]) -> None: ... # type: ignore[override]\n\nclass PcolorImage(AxesImage):\n def __init__(\n self,\n ax: Axes,\n x: ArrayLike | None = ...,\n y: ArrayLike | None = ...,\n A: ArrayLike | None = ...,\n *,\n cmap: str | Colormap | None = ...,\n norm: str | Normalize | None = ...,\n colorizer: Colorizer | None = ...,\n **kwargs\n ) -> None: ...\n def set_data(self, x: ArrayLike, y: ArrayLike, A: ArrayLike) -> None: ... 
# type: ignore[override]\n\nclass FigureImage(_ImageBase):\n zorder: float\n figure: Figure\n ox: float\n oy: float\n magnification: float\n def __init__(\n self,\n fig: Figure,\n *,\n cmap: str | Colormap | None = ...,\n norm: str | Normalize | None = ...,\n colorizer: Colorizer | None = ...,\n offsetx: int = ...,\n offsety: int = ...,\n origin: Literal["upper", "lower"] | None = ...,\n **kwargs\n ) -> None: ...\n def get_extent(self) -> tuple[float, float, float, float]: ...\n\nclass BboxImage(_ImageBase):\n bbox: BboxBase\n def __init__(\n self,\n bbox: BboxBase | Callable[[RendererBase | None], Bbox],\n *,\n cmap: str | Colormap | None = ...,\n norm: str | Normalize | None = ...,\n colorizer: Colorizer | None = ...,\n interpolation: str | None = ...,\n origin: Literal["upper", "lower"] | None = ...,\n filternorm: bool = ...,\n filterrad: float = ...,\n resample: bool = ...,\n **kwargs\n ) -> None: ...\n def get_window_extent(self, renderer: RendererBase | None = ...) -> Bbox: ...\n\ndef imread(\n fname: str | pathlib.Path | BinaryIO, format: str | None = ...\n) -> np.ndarray: ...\ndef imsave(\n fname: str | os.PathLike | BinaryIO,\n arr: ArrayLike,\n vmin: float | None = ...,\n vmax: float | None = ...,\n cmap: str | Colormap | None = ...,\n format: str | None = ...,\n origin: Literal["upper", "lower"] | None = ...,\n dpi: float = ...,\n *,\n metadata: dict[str, str] | None = ...,\n pil_kwargs: dict[str, Any] | None = ...\n) -> None: ...\ndef pil_to_array(pilImage: PIL.Image.Image) -> np.ndarray: ...\ndef thumbnail(\n infile: str | BinaryIO,\n thumbfile: str | BinaryIO,\n scale: float = ...,\n interpolation: str = ...,\n preview: bool = ...,\n) -> Figure: ...\n | .venv\Lib\site-packages\matplotlib\image.pyi | image.pyi | Other | 7,066 | 0.95 | 0.227907 | 0.09 | react-lib | 444 | 2024-11-17T08:11:44.812912 | Apache-2.0 | false | ae08b1b589efe9e3d4ee9de99f302fe9 |
"""\nThe inset module defines the InsetIndicator class, which draws the rectangle and\nconnectors required for `.Axes.indicate_inset` and `.Axes.indicate_inset_zoom`.\n"""\n\nfrom . import _api, artist, transforms\nfrom matplotlib.patches import ConnectionPatch, PathPatch, Rectangle\nfrom matplotlib.path import Path\n\n\n_shared_properties = ('alpha', 'edgecolor', 'linestyle', 'linewidth')\n\n\nclass InsetIndicator(artist.Artist):\n """\n An artist to highlight an area of interest.\n\n An inset indicator is a rectangle on the plot at the position indicated by\n *bounds* that optionally has lines that connect the rectangle to an inset\n Axes (`.Axes.inset_axes`).\n\n .. versionadded:: 3.10\n """\n zorder = 4.99\n\n def __init__(self, bounds=None, inset_ax=None, zorder=None, **kwargs):\n """\n Parameters\n ----------\n bounds : [x0, y0, width, height], optional\n Lower-left corner of rectangle to be marked, and its width\n and height. If not set, the bounds will be calculated from the\n data limits of inset_ax, which must be supplied.\n\n inset_ax : `~.axes.Axes`, optional\n An optional inset Axes to draw connecting lines to. Two lines are\n drawn connecting the indicator box to the inset Axes on corners\n chosen so as to not overlap with the indicator box.\n\n zorder : float, default: 4.99\n Drawing order of the rectangle and connector lines. 
The default,\n 4.99, is just below the default level of inset Axes.\n\n **kwargs\n Other keyword arguments are passed on to the `.Rectangle` patch.\n """\n if bounds is None and inset_ax is None:\n raise ValueError("At least one of bounds or inset_ax must be supplied")\n\n self._inset_ax = inset_ax\n\n if bounds is None:\n # Work out bounds from inset_ax\n self._auto_update_bounds = True\n bounds = self._bounds_from_inset_ax()\n else:\n self._auto_update_bounds = False\n\n x, y, width, height = bounds\n\n self._rectangle = Rectangle((x, y), width, height, clip_on=False, **kwargs)\n\n # Connector positions cannot be calculated till the artist has been added\n # to an axes, so just make an empty list for now.\n self._connectors = []\n\n super().__init__()\n self.set_zorder(zorder)\n\n # Initial style properties for the artist should match the rectangle.\n for prop in _shared_properties:\n setattr(self, f'_{prop}', artist.getp(self._rectangle, prop))\n\n def _shared_setter(self, prop, val):\n """\n Helper function to set the same style property on the artist and its children.\n """\n setattr(self, f'_{prop}', val)\n\n artist.setp([self._rectangle, *self._connectors], prop, val)\n\n def set_alpha(self, alpha):\n # docstring inherited\n self._shared_setter('alpha', alpha)\n\n def set_edgecolor(self, color):\n """\n Set the edge color of the rectangle and the connectors.\n\n Parameters\n ----------\n color : :mpltype:`color` or None\n """\n self._shared_setter('edgecolor', color)\n\n def set_color(self, c):\n """\n Set the edgecolor of the rectangle and the connectors, and the\n facecolor for the rectangle.\n\n Parameters\n ----------\n c : :mpltype:`color`\n """\n self._shared_setter('edgecolor', c)\n self._shared_setter('facecolor', c)\n\n def set_linewidth(self, w):\n """\n Set the linewidth in points of the rectangle and the connectors.\n\n Parameters\n ----------\n w : float or None\n """\n self._shared_setter('linewidth', w)\n\n def set_linestyle(self, ls):\n """\n 
Set the linestyle of the rectangle and the connectors.\n\n ========================================== =================\n linestyle description\n ========================================== =================\n ``'-'`` or ``'solid'`` solid line\n ``'--'`` or ``'dashed'`` dashed line\n ``'-.'`` or ``'dashdot'`` dash-dotted line\n ``':'`` or ``'dotted'`` dotted line\n ``'none'``, ``'None'``, ``' '``, or ``''`` draw nothing\n ========================================== =================\n\n Alternatively a dash tuple of the following form can be provided::\n\n (offset, onoffseq)\n\n where ``onoffseq`` is an even length tuple of on and off ink in points.\n\n Parameters\n ----------\n ls : {'-', '--', '-.', ':', '', (offset, on-off-seq), ...}\n The line style.\n """\n self._shared_setter('linestyle', ls)\n\n def _bounds_from_inset_ax(self):\n xlim = self._inset_ax.get_xlim()\n ylim = self._inset_ax.get_ylim()\n return (xlim[0], ylim[0], xlim[1] - xlim[0], ylim[1] - ylim[0])\n\n def _update_connectors(self):\n (x, y) = self._rectangle.get_xy()\n width = self._rectangle.get_width()\n height = self._rectangle.get_height()\n\n existing_connectors = self._connectors or [None] * 4\n\n # connect the inset_axes to the rectangle\n for xy_inset_ax, existing in zip([(0, 0), (0, 1), (1, 0), (1, 1)],\n existing_connectors):\n # inset_ax positions are in axes coordinates\n # The 0, 1 values define the four edges if the inset_ax\n # lower_left, upper_left, lower_right upper_right.\n ex, ey = xy_inset_ax\n if self.axes.xaxis.get_inverted():\n ex = 1 - ex\n if self.axes.yaxis.get_inverted():\n ey = 1 - ey\n xy_data = x + ex * width, y + ey * height\n if existing is None:\n # Create new connection patch with styles inherited from the\n # parent artist.\n p = ConnectionPatch(\n xyA=xy_inset_ax, coordsA=self._inset_ax.transAxes,\n xyB=xy_data, coordsB=self.axes.transData,\n arrowstyle="-",\n edgecolor=self._edgecolor, alpha=self.get_alpha(),\n linestyle=self._linestyle, 
linewidth=self._linewidth)\n self._connectors.append(p)\n else:\n # Only update positioning of existing connection patch. We\n # do not want to override any style settings made by the user.\n existing.xy1 = xy_inset_ax\n existing.xy2 = xy_data\n existing.coords1 = self._inset_ax.transAxes\n existing.coords2 = self.axes.transData\n\n if existing is None:\n # decide which two of the lines to keep visible....\n pos = self._inset_ax.get_position()\n bboxins = pos.transformed(self.get_figure(root=False).transSubfigure)\n rectbbox = transforms.Bbox.from_bounds(x, y, width, height).transformed(\n self._rectangle.get_transform())\n x0 = rectbbox.x0 < bboxins.x0\n x1 = rectbbox.x1 < bboxins.x1\n y0 = rectbbox.y0 < bboxins.y0\n y1 = rectbbox.y1 < bboxins.y1\n self._connectors[0].set_visible(x0 ^ y0)\n self._connectors[1].set_visible(x0 == y1)\n self._connectors[2].set_visible(x1 == y0)\n self._connectors[3].set_visible(x1 ^ y1)\n\n @property\n def rectangle(self):\n """`.Rectangle`: the indicator frame."""\n return self._rectangle\n\n @property\n def connectors(self):\n """\n 4-tuple of `.patches.ConnectionPatch` or None\n The four connector lines connecting to (lower_left, upper_left,\n lower_right upper_right) corners of *inset_ax*. 
Two lines are\n set with visibility to *False*, but the user can set the\n visibility to True if the automatic choice is not deemed correct.\n """\n if self._inset_ax is None:\n return\n\n if self._auto_update_bounds:\n self._rectangle.set_bounds(self._bounds_from_inset_ax())\n self._update_connectors()\n return tuple(self._connectors)\n\n def draw(self, renderer):\n # docstring inherited\n conn_same_style = []\n\n # Figure out which connectors have the same style as the box, so should\n # be drawn as a single path.\n for conn in self.connectors or []:\n if conn.get_visible():\n drawn = False\n for s in _shared_properties:\n if artist.getp(self._rectangle, s) != artist.getp(conn, s):\n # Draw this connector by itself\n conn.draw(renderer)\n drawn = True\n break\n\n if not drawn:\n # Connector has same style as box.\n conn_same_style.append(conn)\n\n if conn_same_style:\n # Since at least one connector has the same style as the rectangle, draw\n # them as a compound path.\n artists = [self._rectangle] + conn_same_style\n paths = [a.get_transform().transform_path(a.get_path()) for a in artists]\n path = Path.make_compound_path(*paths)\n\n # Create a temporary patch to draw the path.\n p = PathPatch(path)\n p.update_from(self._rectangle)\n p.set_transform(transforms.IdentityTransform())\n p.draw(renderer)\n\n return\n\n # Just draw the rectangle\n self._rectangle.draw(renderer)\n\n @_api.deprecated(\n '3.10',\n message=('Since Matplotlib 3.10 indicate_inset_[zoom] returns a single '\n 'InsetIndicator artist with a rectangle property and a connectors '\n 'property. From 3.12 it will no longer be possible to unpack the '\n 'return value into two elements.'))\n def __getitem__(self, key):\n return [self._rectangle, self.connectors][key]\n | .venv\Lib\site-packages\matplotlib\inset.py | inset.py | Python | 10,154 | 0.95 | 0.144981 | 0.113636 | react-lib | 994 | 2024-04-16T02:06:04.481000 | Apache-2.0 | false | b348d4fcddffacf15e7426042974ca31 |
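The `_shared_setter` helper in the `inset.py` row above caches each style property on the parent artist and fans it out to the rectangle and connector children. A minimal pure-Python sketch of that propagation pattern (hypothetical `StyledGroup`/`FakeArtist` names, no matplotlib dependency):

```python
class FakeArtist:
    """Stand-in for a matplotlib child artist (hypothetical)."""


class StyledGroup:
    """Sketch of InsetIndicator's shared-style pattern: the parent keeps
    a cached copy of each property and pushes updates to all children."""

    _shared_properties = ('alpha', 'edgecolor', 'linewidth', 'linestyle')

    def __init__(self, *children):
        self._children = list(children)
        for prop in self._shared_properties:
            setattr(self, f'_{prop}', None)   # cached parent-side copy

    def _shared_setter(self, prop, val):
        setattr(self, f'_{prop}', val)        # update the cache
        for child in self._children:          # then every child artist
            setattr(child, prop, val)

    def set_linewidth(self, w):
        self._shared_setter('linewidth', w)


rect, conn = FakeArtist(), FakeArtist()
group = StyledGroup(rect, conn)
group.set_linewidth(2.0)
print(rect.linewidth, conn.linewidth)  # 2.0 2.0
```

The parent-side `_linewidth` cache is what `_update_connectors` reads when it creates a new `ConnectionPatch` with the inherited style.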
from . import artist\nfrom .axes import Axes\nfrom .backend_bases import RendererBase\nfrom .patches import ConnectionPatch, Rectangle\n\nfrom .typing import ColorType, LineStyleType\n\nclass InsetIndicator(artist.Artist):\n def __init__(\n self,\n bounds: tuple[float, float, float, float] | None = ...,\n inset_ax: Axes | None = ...,\n zorder: float | None = ...,\n **kwargs\n ) -> None: ...\n def set_alpha(self, alpha: float | None) -> None: ...\n def set_edgecolor(self, color: ColorType | None) -> None: ...\n def set_color(self, c: ColorType | None) -> None: ...\n def set_linewidth(self, w: float | None) -> None: ...\n def set_linestyle(self, ls: LineStyleType | None) -> None: ...\n @property\n def rectangle(self) -> Rectangle: ...\n @property\n def connectors(self) -> tuple[ConnectionPatch, ConnectionPatch, ConnectionPatch, ConnectionPatch] | None: ...\n def draw(self, renderer: RendererBase) -> None: ...\n | .venv\Lib\site-packages\matplotlib\inset.pyi | inset.pyi | Other | 968 | 0.85 | 0.4 | 0.043478 | python-kit | 666 | 2023-10-12T10:11:33.537836 | GPL-3.0 | false | 727e02d2b4441ecbcb511a1e57eb374e |
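The deprecated `__getitem__` at the end of the `inset.py` row lets legacy callers keep unpacking the indicator into two elements while that usage is phased out. A sketch of the same back-compat trick in plain Python (hypothetical `PairCompat` class; a plain `warnings.warn` stands in for `_api.deprecated`):

```python
import warnings


class PairCompat:
    """A single object that can still be unpacked as ``a, b = obj``
    during a deprecation period (sketch, not matplotlib code)."""

    def __init__(self, rectangle, connectors):
        self.rectangle = rectangle
        self.connectors = connectors

    def __getitem__(self, key):
        warnings.warn("unpacking into two elements is deprecated",
                      DeprecationWarning, stacklevel=2)
        # The IndexError raised at key == 2 is what terminates unpacking.
        return [self.rectangle, self.connectors][key]


obj = PairCompat("rect", ("c0", "c1", "c2", "c3"))
with warnings.catch_warnings():
    warnings.simplefilter("ignore")
    # Legacy sequence protocol: __getitem__(0), __getitem__(1), then stop.
    r, c = obj
print(r)  # rect
```

Because the class defines only `__getitem__`, tuple unpacking falls back to the old sequence-iteration protocol, which is exactly what keeps `rect, conns = indicator` working.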
"""\nClasses to layout elements in a `.Figure`.\n\nFigures have a ``layout_engine`` property that holds a subclass of\n`~.LayoutEngine` defined here (or *None* for no layout). At draw time\n``figure.get_layout_engine().execute()`` is called, the goal of which is\nusually to rearrange Axes on the figure to produce a pleasing layout. This is\nlike a ``draw`` callback but with two differences. First, when printing we\ndisable the layout engine for the final draw. Second, it is useful to know the\nlayout engine while the figure is being created. In particular, colorbars are\nmade differently with different layout engines (for historical reasons).\n\nMatplotlib has two built-in layout engines:\n\n- `.TightLayoutEngine` was the first layout engine added to Matplotlib.\n See also :ref:`tight_layout_guide`.\n- `.ConstrainedLayoutEngine` is more modern and generally gives better results.\n See also :ref:`constrainedlayout_guide`.\n\nThird parties can create their own layout engine by subclassing `.LayoutEngine`.\n"""\n\nfrom contextlib import nullcontext\n\nimport matplotlib as mpl\n\nfrom matplotlib._constrained_layout import do_constrained_layout\nfrom matplotlib._tight_layout import (get_subplotspec_list,\n get_tight_layout_figure)\n\n\nclass LayoutEngine:\n """\n Base class for Matplotlib layout engines.\n\n A layout engine can be passed to a figure at instantiation or at any time\n with `~.figure.Figure.set_layout_engine`. Once attached to a figure, the\n layout engine ``execute`` function is called at draw time by\n `~.figure.Figure.draw`, providing a special draw-time hook.\n\n .. 
note::\n\n However, note that layout engines affect the creation of colorbars, so\n `~.figure.Figure.set_layout_engine` should be called before any\n colorbars are created.\n\n Currently, there are two properties of `LayoutEngine` classes that are\n consulted while manipulating the figure:\n\n - ``engine.colorbar_gridspec`` tells `.Figure.colorbar` whether to make the\n axes using the gridspec method (see `.colorbar.make_axes_gridspec`) or\n not (see `.colorbar.make_axes`);\n - ``engine.adjust_compatible`` stops `.Figure.subplots_adjust` from being\n run if it is not compatible with the layout engine.\n\n To implement a custom `LayoutEngine`:\n\n 1. override ``_adjust_compatible`` and ``_colorbar_gridspec``\n 2. override `LayoutEngine.set` to update *self._params*\n 3. override `LayoutEngine.execute` with your implementation\n\n """\n # override these in subclass\n _adjust_compatible = None\n _colorbar_gridspec = None\n\n def __init__(self, **kwargs):\n super().__init__(**kwargs)\n self._params = {}\n\n def set(self, **kwargs):\n """\n Set the parameters for the layout engine.\n """\n raise NotImplementedError\n\n @property\n def colorbar_gridspec(self):\n """\n Return a boolean if the layout engine creates colorbars using a\n gridspec.\n """\n if self._colorbar_gridspec is None:\n raise NotImplementedError\n return self._colorbar_gridspec\n\n @property\n def adjust_compatible(self):\n """\n Return a boolean if the layout engine is compatible with\n `~.Figure.subplots_adjust`.\n """\n if self._adjust_compatible is None:\n raise NotImplementedError\n return self._adjust_compatible\n\n def get(self):\n """\n Return copy of the parameters for the layout engine.\n """\n return dict(self._params)\n\n def execute(self, fig):\n """\n Execute the layout on the figure given by *fig*.\n """\n # subclasses must implement this.\n raise NotImplementedError\n\n\nclass PlaceHolderLayoutEngine(LayoutEngine):\n """\n This layout engine does not adjust the figure layout at all.\n\n 
The purpose of this `.LayoutEngine` is to act as a placeholder when the user removes\n a layout engine to ensure an incompatible `.LayoutEngine` cannot be set later.\n\n Parameters\n ----------\n adjust_compatible, colorbar_gridspec : bool\n Allow the PlaceHolderLayoutEngine to mirror the behavior of whatever\n layout engine it is replacing.\n\n """\n def __init__(self, adjust_compatible, colorbar_gridspec, **kwargs):\n self._adjust_compatible = adjust_compatible\n self._colorbar_gridspec = colorbar_gridspec\n super().__init__(**kwargs)\n\n def execute(self, fig):\n """\n Do nothing.\n """\n return\n\n\nclass TightLayoutEngine(LayoutEngine):\n """\n Implements the ``tight_layout`` geometry management. See\n :ref:`tight_layout_guide` for details.\n """\n _adjust_compatible = True\n _colorbar_gridspec = True\n\n def __init__(self, *, pad=1.08, h_pad=None, w_pad=None,\n rect=(0, 0, 1, 1), **kwargs):\n """\n Initialize tight_layout engine.\n\n Parameters\n ----------\n pad : float, default: 1.08\n Padding between the figure edge and the edges of subplots, as a\n fraction of the font size.\n h_pad, w_pad : float\n Padding (height/width) between edges of adjacent subplots.\n Defaults to *pad*.\n rect : tuple (left, bottom, right, top), default: (0, 0, 1, 1).\n rectangle in normalized figure coordinates that the subplots\n (including labels) will fit into.\n """\n super().__init__(**kwargs)\n for td in ['pad', 'h_pad', 'w_pad', 'rect']:\n # initialize these in case None is passed in above:\n self._params[td] = None\n self.set(pad=pad, h_pad=h_pad, w_pad=w_pad, rect=rect)\n\n def execute(self, fig):\n """\n Execute tight_layout.\n\n This decides the subplot parameters given the padding that\n will allow the Axes labels to not be covered by other labels\n and Axes.\n\n Parameters\n ----------\n fig : `.Figure` to perform layout on.\n\n See Also\n --------\n .figure.Figure.tight_layout\n .pyplot.tight_layout\n """\n info = self._params\n renderer = fig._get_renderer()\n with 
getattr(renderer, "_draw_disabled", nullcontext)():\n kwargs = get_tight_layout_figure(\n fig, fig.axes, get_subplotspec_list(fig.axes), renderer,\n pad=info['pad'], h_pad=info['h_pad'], w_pad=info['w_pad'],\n rect=info['rect'])\n if kwargs:\n fig.subplots_adjust(**kwargs)\n\n def set(self, *, pad=None, w_pad=None, h_pad=None, rect=None):\n """\n Set the pads for tight_layout.\n\n Parameters\n ----------\n pad : float\n Padding between the figure edge and the edges of subplots, as a\n fraction of the font size.\n w_pad, h_pad : float\n Padding (width/height) between edges of adjacent subplots.\n Defaults to *pad*.\n rect : tuple (left, bottom, right, top)\n rectangle in normalized figure coordinates that the subplots\n (including labels) will fit into.\n """\n for td in self.set.__kwdefaults__:\n if locals()[td] is not None:\n self._params[td] = locals()[td]\n\n\nclass ConstrainedLayoutEngine(LayoutEngine):\n """\n Implements the ``constrained_layout`` geometry management. See\n :ref:`constrainedlayout_guide` for details.\n """\n\n _adjust_compatible = False\n _colorbar_gridspec = False\n\n def __init__(self, *, h_pad=None, w_pad=None,\n hspace=None, wspace=None, rect=(0, 0, 1, 1),\n compress=False, **kwargs):\n """\n Initialize ``constrained_layout`` settings.\n\n Parameters\n ----------\n h_pad, w_pad : float\n Padding around the Axes elements in inches.\n Default to :rc:`figure.constrained_layout.h_pad` and\n :rc:`figure.constrained_layout.w_pad`.\n hspace, wspace : float\n Fraction of the figure to dedicate to space between the\n axes. 
These are evenly spread between the gaps between the Axes.\n A value of 0.2 for a three-column layout would have a space\n of 0.1 of the figure width between each column.\n If h/wspace < h/w_pad, then the pads are used instead.\n Default to :rc:`figure.constrained_layout.hspace` and\n :rc:`figure.constrained_layout.wspace`.\n rect : tuple of 4 floats\n Rectangle in figure coordinates to perform constrained layout in\n (left, bottom, width, height), each from 0-1.\n compress : bool\n Whether to shift Axes so that white space in between them is\n removed. This is useful for simple grids of fixed-aspect Axes (e.g.\n a grid of images). See :ref:`compressed_layout`.\n """\n super().__init__(**kwargs)\n # set the defaults:\n self.set(w_pad=mpl.rcParams['figure.constrained_layout.w_pad'],\n h_pad=mpl.rcParams['figure.constrained_layout.h_pad'],\n wspace=mpl.rcParams['figure.constrained_layout.wspace'],\n hspace=mpl.rcParams['figure.constrained_layout.hspace'],\n rect=(0, 0, 1, 1))\n # set anything that was passed in (None will be ignored):\n self.set(w_pad=w_pad, h_pad=h_pad, wspace=wspace, hspace=hspace,\n rect=rect)\n self._compress = compress\n\n def execute(self, fig):\n """\n Perform constrained_layout and move and resize Axes accordingly.\n\n Parameters\n ----------\n fig : `.Figure` to perform layout on.\n """\n width, height = fig.get_size_inches()\n # pads are relative to the current state of the figure...\n w_pad = self._params['w_pad'] / width\n h_pad = self._params['h_pad'] / height\n\n return do_constrained_layout(fig, w_pad=w_pad, h_pad=h_pad,\n wspace=self._params['wspace'],\n hspace=self._params['hspace'],\n rect=self._params['rect'],\n compress=self._compress)\n\n def set(self, *, h_pad=None, w_pad=None,\n hspace=None, wspace=None, rect=None):\n """\n Set the pads for constrained_layout.\n\n Parameters\n ----------\n h_pad, w_pad : float\n Padding around the Axes elements in inches.\n Default to :rc:`figure.constrained_layout.h_pad` and\n 
:rc:`figure.constrained_layout.w_pad`.\n hspace, wspace : float\n Fraction of the figure to dedicate to space between the\n axes. These are evenly spread between the gaps between the Axes.\n A value of 0.2 for a three-column layout would have a space\n of 0.1 of the figure width between each column.\n If h/wspace < h/w_pad, then the pads are used instead.\n Default to :rc:`figure.constrained_layout.hspace` and\n :rc:`figure.constrained_layout.wspace`.\n rect : tuple of 4 floats\n Rectangle in figure coordinates to perform constrained layout in\n (left, bottom, width, height), each from 0-1.\n """\n for td in self.set.__kwdefaults__:\n if locals()[td] is not None:\n self._params[td] = locals()[td]\n | .venv\Lib\site-packages\matplotlib\layout_engine.py | layout_engine.py | Python | 11,433 | 0.95 | 0.148867 | 0.023077 | vue-tools | 406 | 2025-05-13T04:08:16.540362 | Apache-2.0 | false | 7afee1126a72b86f757fb09ecb3debd8 |
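Both `TightLayoutEngine.set` and `ConstrainedLayoutEngine.set` in the `layout_engine.py` row above use the same idiom: iterate over the method's own keyword-only defaults via `self.set.__kwdefaults__` and copy any non-None argument into `_params`. A stripped-down sketch of that idiom (hypothetical `Params` class):

```python
class Params:
    """Sketch of the layout-engine set() idiom: the parameter list is
    read from the method's own __kwdefaults__, so adding a new keyword
    never requires touching the loop body."""

    def __init__(self):
        self._params = {'pad': 1.08, 'h_pad': None, 'w_pad': None}

    def set(self, *, pad=None, h_pad=None, w_pad=None):
        # __kwdefaults__ maps each keyword-only parameter name to its
        # default; iterating it yields 'pad', 'h_pad', 'w_pad'.
        for name in self.set.__kwdefaults__:
            if locals()[name] is not None:   # None means "leave as-is"
                self._params[name] = locals()[name]


p = Params()
p.set(h_pad=0.5)
print(p._params)  # {'pad': 1.08, 'h_pad': 0.5, 'w_pad': None}
```

The None-means-unchanged convention is what lets `ConstrainedLayoutEngine.__init__` first call `set()` with rcParams defaults and then call it again with the user's arguments, where unset ones are simply ignored.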
from matplotlib.figure import Figure\n\nfrom typing import Any\n\nclass LayoutEngine:\n def __init__(self, **kwargs: Any) -> None: ...\n def set(self) -> None: ...\n @property\n def colorbar_gridspec(self) -> bool: ...\n @property\n def adjust_compatible(self) -> bool: ...\n def get(self) -> dict[str, Any]: ...\n def execute(self, fig: Figure) -> None: ...\n\nclass PlaceHolderLayoutEngine(LayoutEngine):\n def __init__(\n self, adjust_compatible: bool, colorbar_gridspec: bool, **kwargs: Any\n ) -> None: ...\n def execute(self, fig: Figure) -> None: ...\n\nclass TightLayoutEngine(LayoutEngine):\n def __init__(\n self,\n *,\n pad: float = ...,\n h_pad: float | None = ...,\n w_pad: float | None = ...,\n rect: tuple[float, float, float, float] = ...,\n **kwargs: Any\n ) -> None: ...\n def execute(self, fig: Figure) -> None: ...\n def set(\n self,\n *,\n pad: float | None = ...,\n w_pad: float | None = ...,\n h_pad: float | None = ...,\n rect: tuple[float, float, float, float] | None = ...\n ) -> None: ...\n\nclass ConstrainedLayoutEngine(LayoutEngine):\n def __init__(\n self,\n *,\n h_pad: float | None = ...,\n w_pad: float | None = ...,\n hspace: float | None = ...,\n wspace: float | None = ...,\n rect: tuple[float, float, float, float] = ...,\n compress: bool = ...,\n **kwargs: Any\n ) -> None: ...\n def execute(self, fig: Figure) -> Any: ...\n def set(\n self,\n *,\n h_pad: float | None = ...,\n w_pad: float | None = ...,\n hspace: float | None = ...,\n wspace: float | None = ...,\n rect: tuple[float, float, float, float] | None = ...\n ) -> None: ...\n | .venv\Lib\site-packages\matplotlib\layout_engine.pyi | layout_engine.pyi | Other | 1,788 | 0.85 | 0.290323 | 0.105263 | awesome-app | 791 | 2024-04-30T17:59:38.523161 | MIT | false | 1208632d22cb43a97ba5bd820a049a66 |
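The stub row above spells out the `LayoutEngine` contract: subclasses fill in the `_adjust_compatible` / `_colorbar_gridspec` flags and override `execute`, and `PlaceHolderLayoutEngine` exists only to preserve those flags while doing nothing. A compressed pure-Python sketch of that contract (illustrative names, no matplotlib dependency):

```python
class LayoutEngine:
    # Subclasses must override both flags; the base class refuses to
    # answer until they do.
    _adjust_compatible = None
    _colorbar_gridspec = None

    @property
    def adjust_compatible(self):
        if self._adjust_compatible is None:
            raise NotImplementedError
        return self._adjust_compatible

    def execute(self, fig):
        raise NotImplementedError


class NoOpEngine(LayoutEngine):
    """Like PlaceHolderLayoutEngine: does nothing at draw time but
    remembers the compatibility flags of the engine it replaced."""

    def __init__(self, adjust_compatible):
        self._adjust_compatible = adjust_compatible

    def execute(self, fig):
        return  # intentionally a no-op


engine = NoOpEngine(adjust_compatible=False)
print(engine.adjust_compatible)  # False
```

Keeping the flag on the placeholder is the point: code such as `Figure.subplots_adjust` consults `adjust_compatible` and must see the same answer the removed engine would have given.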
"""\nThe legend module defines the Legend class, which is responsible for\ndrawing legends associated with Axes and/or figures.\n\n.. important::\n\n It is unlikely that you would ever create a Legend instance manually.\n Most users would normally create a legend via the `~.Axes.legend`\n function. For more details on legends there is also a :ref:`legend guide\n <legend_guide>`.\n\nThe `Legend` class is a container of legend handles and legend texts.\n\nThe legend handler map specifies how to create legend handles from artists\n(lines, patches, etc.) in the Axes or figures. Default legend handlers are\ndefined in the :mod:`~matplotlib.legend_handler` module. While not all artist\ntypes are covered by the default legend handlers, custom legend handlers can be\ndefined to support arbitrary objects.\n\nSee the :ref`<legend_guide>` for more\ninformation.\n"""\n\nimport itertools\nimport logging\nimport numbers\nimport time\n\nimport numpy as np\n\nimport matplotlib as mpl\nfrom matplotlib import _api, _docstring, cbook, colors, offsetbox\nfrom matplotlib.artist import Artist, allow_rasterization\nfrom matplotlib.cbook import silent_list\nfrom matplotlib.font_manager import FontProperties\nfrom matplotlib.lines import Line2D\nfrom matplotlib.patches import (Patch, Rectangle, Shadow, FancyBboxPatch,\n StepPatch)\nfrom matplotlib.collections import (\n Collection, CircleCollection, LineCollection, PathCollection,\n PolyCollection, RegularPolyCollection)\nfrom matplotlib.text import Text\nfrom matplotlib.transforms import Bbox, BboxBase, TransformedBbox\nfrom matplotlib.transforms import BboxTransformTo, BboxTransformFrom\nfrom matplotlib.offsetbox import (\n AnchoredOffsetbox, DraggableOffsetBox,\n HPacker, VPacker,\n DrawingArea, TextArea,\n)\nfrom matplotlib.container import ErrorbarContainer, BarContainer, StemContainer\nfrom . 
import legend_handler\n\n\nclass DraggableLegend(DraggableOffsetBox):\n def __init__(self, legend, use_blit=False, update="loc"):\n """\n Wrapper around a `.Legend` to support mouse dragging.\n\n Parameters\n ----------\n legend : `.Legend`\n The `.Legend` instance to wrap.\n use_blit : bool, optional\n Use blitting for faster image composition. For details see\n :ref:`func-animation`.\n update : {'loc', 'bbox'}, optional\n If "loc", update the *loc* parameter of the legend upon finalizing.\n If "bbox", update the *bbox_to_anchor* parameter.\n """\n self.legend = legend\n\n _api.check_in_list(["loc", "bbox"], update=update)\n self._update = update\n\n super().__init__(legend, legend._legend_box, use_blit=use_blit)\n\n def finalize_offset(self):\n if self._update == "loc":\n self._update_loc(self.get_loc_in_canvas())\n elif self._update == "bbox":\n self._update_bbox_to_anchor(self.get_loc_in_canvas())\n\n def _update_loc(self, loc_in_canvas):\n bbox = self.legend.get_bbox_to_anchor()\n # if bbox has zero width or height, the transformation is\n # ill-defined. Fall back to the default bbox_to_anchor.\n if bbox.width == 0 or bbox.height == 0:\n self.legend.set_bbox_to_anchor(None)\n bbox = self.legend.get_bbox_to_anchor()\n _bbox_transform = BboxTransformFrom(bbox)\n self.legend._loc = tuple(_bbox_transform.transform(loc_in_canvas))\n\n def _update_bbox_to_anchor(self, loc_in_canvas):\n loc_in_bbox = self.legend.axes.transAxes.transform(loc_in_canvas)\n self.legend.set_bbox_to_anchor(loc_in_bbox)\n\n\n_legend_kw_doc_base = """\nbbox_to_anchor : `.BboxBase`, 2-tuple, or 4-tuple of floats\n Box that is used to position the legend in conjunction with *loc*.\n Defaults to ``axes.bbox`` (if called as a method to `.Axes.legend`) or\n ``figure.bbox`` (if ``figure.legend``). 
This argument allows arbitrary\n placement of the legend.\n\n Bbox coordinates are interpreted in the coordinate system given by\n *bbox_transform*, with the default transform\n Axes or Figure coordinates, depending on which ``legend`` is called.\n\n If a 4-tuple or `.BboxBase` is given, then it specifies the bbox\n ``(x, y, width, height)`` that the legend is placed in.\n To put the legend in the best location in the bottom right\n quadrant of the Axes (or figure)::\n\n loc='best', bbox_to_anchor=(0.5, 0., 0.5, 0.5)\n\n A 2-tuple ``(x, y)`` places the corner of the legend specified by *loc* at\n x, y. For example, to put the legend's upper right-hand corner in the\n center of the Axes (or figure) the following keywords can be used::\n\n loc='upper right', bbox_to_anchor=(0.5, 0.5)\n\nncols : int, default: 1\n The number of columns that the legend has.\n\n For backward compatibility, the spelling *ncol* is also supported\n but it is discouraged. If both are given, *ncols* takes precedence.\n\nprop : None or `~matplotlib.font_manager.FontProperties` or dict\n The font properties of the legend. If None (default), the current\n :data:`matplotlib.rcParams` will be used.\n\nfontsize : int or {'xx-small', 'x-small', 'small', 'medium', 'large', \\n'x-large', 'xx-large'}\n The font size of the legend. If the value is numeric the size will be the\n absolute font size in points. String values are relative to the current\n default font size. This argument is only used if *prop* is not specified.\n\nlabelcolor : str or list, default: :rc:`legend.labelcolor`\n The color of the text in the legend. Either a valid color string\n (for example, 'red'), or a list of color strings. The labelcolor can\n also be made to match the color of the line or marker using 'linecolor',\n 'markerfacecolor' (or 'mfc'), or 'markeredgecolor' (or 'mec').\n\n Labelcolor can be set globally using :rc:`legend.labelcolor`. 
If None,\n use :rc:`text.color`.\n\nnumpoints : int, default: :rc:`legend.numpoints`\n The number of marker points in the legend when creating a legend\n entry for a `.Line2D` (line).\n\nscatterpoints : int, default: :rc:`legend.scatterpoints`\n The number of marker points in the legend when creating\n a legend entry for a `.PathCollection` (scatter plot).\n\nscatteryoffsets : iterable of floats, default: ``[0.375, 0.5, 0.3125]``\n The vertical offset (relative to the font size) for the markers\n created for a scatter plot legend entry. 0.0 is at the base of the\n legend text, and 1.0 is at the top. To draw all markers at the\n same height, set to ``[0.5]``.\n\nmarkerscale : float, default: :rc:`legend.markerscale`\n The relative size of legend markers compared to the originally drawn ones.\n\nmarkerfirst : bool, default: True\n If *True*, legend marker is placed to the left of the legend label.\n If *False*, legend marker is placed to the right of the legend label.\n\nreverse : bool, default: False\n If *True*, the legend labels are displayed in reverse order from the input.\n If *False*, the legend labels are displayed in the same order as the input.\n\n ..
versionadded:: 3.7\n\nframeon : bool, default: :rc:`legend.frameon`\n Whether the legend should be drawn on a patch (frame).\n\nfancybox : bool, default: :rc:`legend.fancybox`\n Whether round edges should be enabled around the `.FancyBboxPatch` which\n makes up the legend's background.\n\nshadow : None, bool or dict, default: :rc:`legend.shadow`\n Whether to draw a shadow behind the legend.\n The shadow can be configured using `.Patch` keywords.\n Customization via :rc:`legend.shadow` is currently not supported.\n\nframealpha : float, default: :rc:`legend.framealpha`\n The alpha transparency of the legend's background.\n If *shadow* is activated and *framealpha* is ``None``, the default value is\n ignored.\n\nfacecolor : "inherit" or color, default: :rc:`legend.facecolor`\n The legend's background color.\n If ``"inherit"``, use :rc:`axes.facecolor`.\n\nedgecolor : "inherit" or color, default: :rc:`legend.edgecolor`\n The legend's background patch edge color.\n If ``"inherit"``, use :rc:`axes.edgecolor`.\n\nmode : {"expand", None}\n If *mode* is set to ``"expand"`` the legend will be horizontally\n expanded to fill the Axes area (or *bbox_to_anchor* if it defines\n the legend's size).\n\nbbox_transform : None or `~matplotlib.transforms.Transform`\n The transform for the bounding box (*bbox_to_anchor*). For a value\n of ``None`` (default) the Axes'\n :data:`~matplotlib.axes.Axes.transAxes` transform will be used.\n\ntitle : str or None\n The legend's title. Default is no title (``None``).\n\ntitle_fontproperties : None or `~matplotlib.font_manager.FontProperties` or dict\n The font properties of the legend's title.
If None (default), the\n *title_fontsize* argument will be used if present; if *title_fontsize* is\n also None, the current :rc:`legend.title_fontsize` will be used.\n\ntitle_fontsize : int or {'xx-small', 'x-small', 'small', 'medium', 'large', \\n'x-large', 'xx-large'}, default: :rc:`legend.title_fontsize`\n The font size of the legend's title.\n Note: This cannot be combined with *title_fontproperties*. If you want\n to set the fontsize alongside other font properties, use the *size*\n parameter in *title_fontproperties*.\n\nalignment : {'center', 'left', 'right'}, default: 'center'\n The alignment of the legend title and the box of entries. The entries\n are aligned as a single block, so that markers are always lined up.\n\nborderpad : float, default: :rc:`legend.borderpad`\n The fractional whitespace inside the legend border, in font-size units.\n\nlabelspacing : float, default: :rc:`legend.labelspacing`\n The vertical space between the legend entries, in font-size units.\n\nhandlelength : float, default: :rc:`legend.handlelength`\n The length of the legend handles, in font-size units.\n\nhandleheight : float, default: :rc:`legend.handleheight`\n The height of the legend handles, in font-size units.\n\nhandletextpad : float, default: :rc:`legend.handletextpad`\n The pad between the legend handle and text, in font-size units.\n\nborderaxespad : float, default: :rc:`legend.borderaxespad`\n The pad between the Axes and legend border, in font-size units.\n\ncolumnspacing : float, default: :rc:`legend.columnspacing`\n The spacing between columns, in font-size units.\n\nhandler_map : dict or None\n The custom dictionary mapping instances or types to a legend\n handler.
This *handler_map* updates the default handler map\n found at `matplotlib.legend.Legend.get_legend_handler_map`.\n\ndraggable : bool, default: False\n Whether the legend can be dragged with the mouse.\n"""\n\n_loc_doc_base = """\nloc : str or pair of floats, default: {default}\n The location of the legend.\n\n The strings ``'upper left'``, ``'upper right'``, ``'lower left'``,\n ``'lower right'`` place the legend at the corresponding corner of the\n {parent}.\n\n The strings ``'upper center'``, ``'lower center'``, ``'center left'``,\n ``'center right'`` place the legend at the center of the corresponding edge\n of the {parent}.\n\n The string ``'center'`` places the legend at the center of the {parent}.\n{best}\n The location can also be a 2-tuple giving the coordinates of the lower-left\n corner of the legend in {parent} coordinates (in which case *bbox_to_anchor*\n will be ignored).\n\n For back-compatibility, ``'center right'`` (but no other location) can also\n be spelled ``'right'``, and each "string" location can also be given as a\n numeric value:\n\n ================== =============\n Location String Location Code\n ================== =============\n 'best' (Axes only) 0\n 'upper right' 1\n 'upper left' 2\n 'lower left' 3\n 'lower right' 4\n 'right' 5\n 'center left' 6\n 'center right' 7\n 'lower center' 8\n 'upper center' 9\n 'center' 10\n ================== =============\n {outside}"""\n\n_loc_doc_best = """\n The string ``'best'`` places the legend at the location, among the nine\n locations defined so far, with the minimum overlap with other drawn\n artists. 
This option can be quite slow for plots with large amounts of\n data; your plotting speed may benefit from providing a specific location.\n"""\n\n_legend_kw_axes_st = (\n _loc_doc_base.format(parent='axes', default=':rc:`legend.loc`',\n best=_loc_doc_best, outside='') +\n _legend_kw_doc_base)\n_docstring.interpd.register(_legend_kw_axes=_legend_kw_axes_st)\n\n_outside_doc = """\n If a figure is using the constrained layout manager, the string codes\n of the *loc* keyword argument can get better layout behaviour using the\n prefix 'outside'. There is ambiguity at the corners, so 'outside\n upper right' will make space for the legend above the rest of the\n axes in the layout, and 'outside right upper' will make space on the\n right side of the layout. In addition to the values of *loc*\n listed above, we have 'outside right upper', 'outside right lower',\n 'outside left upper', and 'outside left lower'. See\n :ref:`legend_guide` for more details.\n"""\n\n_legend_kw_figure_st = (\n _loc_doc_base.format(parent='figure', default="'upper right'",\n best='', outside=_outside_doc) +\n _legend_kw_doc_base)\n_docstring.interpd.register(_legend_kw_figure=_legend_kw_figure_st)\n\n_legend_kw_both_st = (\n _loc_doc_base.format(parent='axes/figure',\n default=":rc:`legend.loc` for Axes, 'upper right' for Figure",\n best=_loc_doc_best, outside=_outside_doc) +\n _legend_kw_doc_base)\n_docstring.interpd.register(_legend_kw_doc=_legend_kw_both_st)\n\n_legend_kw_set_loc_st = (\n _loc_doc_base.format(parent='axes/figure',\n default=":rc:`legend.loc` for Axes, 'upper right' for Figure",\n best=_loc_doc_best, outside=_outside_doc))\n_docstring.interpd.register(_legend_kw_set_loc_doc=_legend_kw_set_loc_st)\n\n\nclass Legend(Artist):\n """\n Place a legend on the figure/axes.\n """\n\n # 'best' is only implemented for Axes legends\n codes = {'best': 0, **AnchoredOffsetbox.codes}\n zorder = 5\n\n def __str__(self):\n return "Legend"\n\n @_docstring.interpd\n def __init__(\n self, parent, 
handles, labels,\n *,\n loc=None,\n numpoints=None, # number of points in the legend line\n markerscale=None, # relative size of legend markers vs. original\n markerfirst=True, # left/right ordering of legend marker and label\n reverse=False, # reverse ordering of legend marker and label\n scatterpoints=None, # number of scatter points\n scatteryoffsets=None,\n prop=None, # properties for the legend texts\n fontsize=None, # keyword to set font size directly\n labelcolor=None, # keyword to set the text color\n\n # spacing & pad defined as a fraction of the font-size\n borderpad=None, # whitespace inside the legend border\n labelspacing=None, # vertical space between the legend entries\n handlelength=None, # length of the legend handles\n handleheight=None, # height of the legend handles\n handletextpad=None, # pad between the legend handle and text\n borderaxespad=None, # pad between the Axes and legend border\n columnspacing=None, # spacing between columns\n\n ncols=1, # number of columns\n mode=None, # horizontal distribution of columns: None or "expand"\n\n fancybox=None, # True: fancy box, False: rounded box, None: rcParam\n shadow=None,\n title=None, # legend title\n title_fontsize=None, # legend title font size\n framealpha=None, # set frame alpha\n edgecolor=None, # frame patch edgecolor\n facecolor=None, # frame patch facecolor\n\n bbox_to_anchor=None, # bbox to which the legend will be anchored\n bbox_transform=None, # transform for the bbox\n frameon=None, # draw frame\n handler_map=None,\n title_fontproperties=None, # properties for the legend title\n alignment="center", # control the alignment within the legend box\n ncol=1, # synonym for ncols (backward compatibility)\n draggable=False # whether the legend can be dragged with the mouse\n ):\n """\n Parameters\n ----------\n parent : `~matplotlib.axes.Axes` or `.Figure`\n The artist that contains the legend.\n\n handles : list of (`.Artist` or tuple of `.Artist`)\n A list of Artists (lines, patches) to 
be added to the legend.\n\n labels : list of str\n A list of labels to show next to the artists. The length of handles\n and labels should be the same. If they are not, they are truncated\n to the length of the shorter list.\n\n Other Parameters\n ----------------\n %(_legend_kw_doc)s\n\n Attributes\n ----------\n legend_handles\n List of `.Artist` objects added as legend entries.\n\n .. versionadded:: 3.7\n """\n # local import only to avoid circularity\n from matplotlib.axes import Axes\n from matplotlib.figure import FigureBase\n\n super().__init__()\n\n if prop is None:\n self.prop = FontProperties(size=mpl._val_or_rc(fontsize, "legend.fontsize"))\n else:\n self.prop = FontProperties._from_any(prop)\n if isinstance(prop, dict) and "size" not in prop:\n self.prop.set_size(mpl.rcParams["legend.fontsize"])\n\n self._fontsize = self.prop.get_size_in_points()\n\n self.texts = []\n self.legend_handles = []\n self._legend_title_box = None\n\n #: A dictionary with the extra handler mappings for this Legend\n #: instance.\n self._custom_handler_map = handler_map\n\n self.numpoints = mpl._val_or_rc(numpoints, 'legend.numpoints')\n self.markerscale = mpl._val_or_rc(markerscale, 'legend.markerscale')\n self.scatterpoints = mpl._val_or_rc(scatterpoints, 'legend.scatterpoints')\n self.borderpad = mpl._val_or_rc(borderpad, 'legend.borderpad')\n self.labelspacing = mpl._val_or_rc(labelspacing, 'legend.labelspacing')\n self.handlelength = mpl._val_or_rc(handlelength, 'legend.handlelength')\n self.handleheight = mpl._val_or_rc(handleheight, 'legend.handleheight')\n self.handletextpad = mpl._val_or_rc(handletextpad, 'legend.handletextpad')\n self.borderaxespad = mpl._val_or_rc(borderaxespad, 'legend.borderaxespad')\n self.columnspacing = mpl._val_or_rc(columnspacing, 'legend.columnspacing')\n self.shadow = mpl._val_or_rc(shadow, 'legend.shadow')\n\n if reverse:\n labels = [*reversed(labels)]\n handles = [*reversed(handles)]\n\n if len(handles) < 2:\n ncols = 1\n self._ncols = 
ncols if ncols != 1 else ncol\n\n if self.numpoints <= 0:\n raise ValueError("numpoints must be > 0; it was %d" % numpoints)\n\n # introduce y-offset for handles of the scatter plot\n if scatteryoffsets is None:\n self._scatteryoffsets = np.array([3. / 8., 4. / 8., 2.5 / 8.])\n else:\n self._scatteryoffsets = np.asarray(scatteryoffsets)\n reps = self.scatterpoints // len(self._scatteryoffsets) + 1\n self._scatteryoffsets = np.tile(self._scatteryoffsets,\n reps)[:self.scatterpoints]\n\n # _legend_box is a VPacker instance that contains all\n # legend items and will be initialized from _init_legend_box()\n # method.\n self._legend_box = None\n\n if isinstance(parent, Axes):\n self.isaxes = True\n self.axes = parent\n self.set_figure(parent.get_figure(root=False))\n elif isinstance(parent, FigureBase):\n self.isaxes = False\n self.set_figure(parent)\n else:\n raise TypeError(\n "Legend needs either Axes or FigureBase as parent"\n )\n self.parent = parent\n\n self._mode = mode\n self.set_bbox_to_anchor(bbox_to_anchor, bbox_transform)\n\n # Figure out if self.shadow is valid\n # If shadow was None, rcParams loads False\n # So it shouldn't be None here\n\n self._shadow_props = {'ox': 2, 'oy': -2} # default location offsets\n if isinstance(self.shadow, dict):\n self._shadow_props.update(self.shadow)\n self.shadow = True\n elif self.shadow in (0, 1, True, False):\n self.shadow = bool(self.shadow)\n else:\n raise ValueError(\n 'Legend shadow must be a dict or bool, not '\n f'{self.shadow!r} of type {type(self.shadow)}.'\n )\n\n # We use FancyBboxPatch to draw a legend frame. 
The location\n # and size of the box will be updated during the drawing time.\n\n facecolor = mpl._val_or_rc(facecolor, "legend.facecolor")\n if facecolor == 'inherit':\n facecolor = mpl.rcParams["axes.facecolor"]\n\n edgecolor = mpl._val_or_rc(edgecolor, "legend.edgecolor")\n if edgecolor == 'inherit':\n edgecolor = mpl.rcParams["axes.edgecolor"]\n\n fancybox = mpl._val_or_rc(fancybox, "legend.fancybox")\n\n self.legendPatch = FancyBboxPatch(\n xy=(0, 0), width=1, height=1,\n facecolor=facecolor, edgecolor=edgecolor,\n # If shadow is used, default to alpha=1 (#8943).\n alpha=(framealpha if framealpha is not None\n else 1 if shadow\n else mpl.rcParams["legend.framealpha"]),\n # The width and height of the legendPatch will be set (in draw())\n # to the length that includes the padding. Thus we set pad=0 here.\n boxstyle=("round,pad=0,rounding_size=0.2" if fancybox\n else "square,pad=0"),\n mutation_scale=self._fontsize,\n snap=True,\n visible=mpl._val_or_rc(frameon, "legend.frameon")\n )\n self._set_artist_props(self.legendPatch)\n\n _api.check_in_list(["center", "left", "right"], alignment=alignment)\n self._alignment = alignment\n\n # init with null renderer\n self._init_legend_box(handles, labels, markerfirst)\n\n # Set legend location\n self.set_loc(loc)\n\n # figure out title font properties:\n if title_fontsize is not None and title_fontproperties is not None:\n raise ValueError(\n "title_fontsize and title_fontproperties can't be specified "\n "at the same time. Only use one of them. 
")\n title_prop_fp = FontProperties._from_any(title_fontproperties)\n if isinstance(title_fontproperties, dict):\n if "size" not in title_fontproperties:\n title_fontsize = mpl.rcParams["legend.title_fontsize"]\n title_prop_fp.set_size(title_fontsize)\n elif title_fontsize is not None:\n title_prop_fp.set_size(title_fontsize)\n elif not isinstance(title_fontproperties, FontProperties):\n title_fontsize = mpl.rcParams["legend.title_fontsize"]\n title_prop_fp.set_size(title_fontsize)\n\n self.set_title(title, prop=title_prop_fp)\n\n self._draggable = None\n self.set_draggable(state=draggable)\n\n # set the text color\n\n color_getters = { # getter function depends on line or patch\n 'linecolor': ['get_color', 'get_facecolor'],\n 'markerfacecolor': ['get_markerfacecolor', 'get_facecolor'],\n 'mfc': ['get_markerfacecolor', 'get_facecolor'],\n 'markeredgecolor': ['get_markeredgecolor', 'get_edgecolor'],\n 'mec': ['get_markeredgecolor', 'get_edgecolor'],\n }\n labelcolor = mpl._val_or_rc(labelcolor, 'legend.labelcolor')\n if labelcolor is None:\n labelcolor = mpl.rcParams['text.color']\n if isinstance(labelcolor, str) and labelcolor in color_getters:\n getter_names = color_getters[labelcolor]\n for handle, text in zip(self.legend_handles, self.texts):\n try:\n if handle.get_array() is not None:\n continue\n except AttributeError:\n pass\n for getter_name in getter_names:\n try:\n color = getattr(handle, getter_name)()\n if isinstance(color, np.ndarray):\n if (\n color.shape[0] == 1\n or np.isclose(color, color[0]).all()\n ):\n text.set_color(color[0])\n else:\n pass\n else:\n text.set_color(color)\n break\n except AttributeError:\n pass\n elif cbook._str_equal(labelcolor, 'none'):\n for text in self.texts:\n text.set_color(labelcolor)\n elif np.iterable(labelcolor):\n for text, color in zip(self.texts,\n itertools.cycle(\n colors.to_rgba_array(labelcolor))):\n text.set_color(color)\n else:\n raise ValueError(f"Invalid labelcolor: {labelcolor!r}")\n\n def 
_set_artist_props(self, a):\n """\n Set the boilerplate props for artists added to Axes.\n """\n a.set_figure(self.get_figure(root=False))\n if self.isaxes:\n a.axes = self.axes\n\n a.set_transform(self.get_transform())\n\n @_docstring.interpd\n def set_loc(self, loc=None):\n """\n Set the location of the legend.\n\n .. versionadded:: 3.8\n\n Parameters\n ----------\n %(_legend_kw_set_loc_doc)s\n """\n loc0 = loc\n self._loc_used_default = loc is None\n if loc is None:\n loc = mpl.rcParams["legend.loc"]\n if not self.isaxes and loc in [0, 'best']:\n loc = 'upper right'\n\n type_err_message = ("loc must be string, coordinate tuple, or"\n f" an integer 0-10, not {loc!r}")\n\n # handle outside legends:\n self._outside_loc = None\n if isinstance(loc, str):\n if loc.split()[0] == 'outside':\n # strip outside:\n loc = loc.split('outside ')[1]\n # strip "center" at the beginning\n self._outside_loc = loc.replace('center ', '')\n # strip first\n self._outside_loc = self._outside_loc.split()[0]\n locs = loc.split()\n if len(locs) > 1 and locs[0] in ('right', 'left'):\n # locs doesn't accept "left upper", etc, so swap\n if locs[0] != 'center':\n locs = locs[::-1]\n loc = locs[0] + ' ' + locs[1]\n # check that loc is in acceptable strings\n loc = _api.check_getitem(self.codes, loc=loc)\n elif np.iterable(loc):\n # coerce iterable into tuple\n loc = tuple(loc)\n # validate the tuple represents Real coordinates\n if len(loc) != 2 or not all(isinstance(e, numbers.Real) for e in loc):\n raise ValueError(type_err_message)\n elif isinstance(loc, int):\n # validate the integer represents a string numeric value\n if loc < 0 or loc > 10:\n raise ValueError(type_err_message)\n else:\n # all other cases are invalid values of loc\n raise ValueError(type_err_message)\n\n if self.isaxes and self._outside_loc:\n raise ValueError(\n f"'outside' option for loc='{loc0}' keyword argument only "\n "works for figure legends")\n\n if not self.isaxes and loc == 0:\n raise ValueError(\n "Automatic 
legend placement (loc='best') not implemented for "\n "figure legend")\n\n tmp = self._loc_used_default\n self._set_loc(loc)\n self._loc_used_default = tmp # ignore changes done by _set_loc\n\n def _set_loc(self, loc):\n # find_offset function will be provided to _legend_box and\n # _legend_box will draw itself at the location of the return\n # value of the find_offset.\n self._loc_used_default = False\n self._loc_real = loc\n self.stale = True\n self._legend_box.set_offset(self._findoffset)\n\n def set_ncols(self, ncols):\n """Set the number of columns."""\n self._ncols = ncols\n\n def _get_loc(self):\n return self._loc_real\n\n _loc = property(_get_loc, _set_loc)\n\n def _findoffset(self, width, height, xdescent, ydescent, renderer):\n """Helper function to locate the legend."""\n\n if self._loc == 0: # "best".\n x, y = self._find_best_position(width, height, renderer)\n elif self._loc in Legend.codes.values(): # Fixed location.\n bbox = Bbox.from_bounds(0, 0, width, height)\n x, y = self._get_anchored_bbox(self._loc, bbox,\n self.get_bbox_to_anchor(),\n renderer)\n else: # Axes or figure coordinates.\n fx, fy = self._loc\n bbox = self.get_bbox_to_anchor()\n x, y = bbox.x0 + bbox.width * fx, bbox.y0 + bbox.height * fy\n\n return x + xdescent, y + ydescent\n\n @allow_rasterization\n def draw(self, renderer):\n # docstring inherited\n if not self.get_visible():\n return\n\n renderer.open_group('legend', gid=self.get_gid())\n\n fontsize = renderer.points_to_pixels(self._fontsize)\n\n # if mode == fill, set the width of the legend_box to the\n # width of the parent (minus pads)\n if self._mode in ["expand"]:\n pad = 2 * (self.borderaxespad + self.borderpad) * fontsize\n self._legend_box.set_width(self.get_bbox_to_anchor().width - pad)\n\n # update the location and size of the legend. 
This needs to\n # be done in any case to clip the figure right.\n bbox = self._legend_box.get_window_extent(renderer)\n self.legendPatch.set_bounds(bbox.bounds)\n self.legendPatch.set_mutation_scale(fontsize)\n\n # self.shadow is validated in __init__\n # So by here it is a bool and self._shadow_props contains any configs\n\n if self.shadow:\n Shadow(self.legendPatch, **self._shadow_props).draw(renderer)\n\n self.legendPatch.draw(renderer)\n self._legend_box.draw(renderer)\n\n renderer.close_group('legend')\n self.stale = False\n\n # _default_handler_map defines the default mapping between plot\n # elements and the legend handlers.\n\n _default_handler_map = {\n StemContainer: legend_handler.HandlerStem(),\n ErrorbarContainer: legend_handler.HandlerErrorbar(),\n Line2D: legend_handler.HandlerLine2D(),\n Patch: legend_handler.HandlerPatch(),\n StepPatch: legend_handler.HandlerStepPatch(),\n LineCollection: legend_handler.HandlerLineCollection(),\n RegularPolyCollection: legend_handler.HandlerRegularPolyCollection(),\n CircleCollection: legend_handler.HandlerCircleCollection(),\n BarContainer: legend_handler.HandlerPatch(\n update_func=legend_handler.update_from_first_child),\n tuple: legend_handler.HandlerTuple(),\n PathCollection: legend_handler.HandlerPathCollection(),\n PolyCollection: legend_handler.HandlerPolyCollection()\n }\n\n # (get|set|update)_default_handler_maps are public interfaces to\n # modify the default handler map.\n\n @classmethod\n def get_default_handler_map(cls):\n """Return the global default handler map, shared by all legends."""\n return cls._default_handler_map\n\n @classmethod\n def set_default_handler_map(cls, handler_map):\n """Set the global default handler map, shared by all legends."""\n cls._default_handler_map = handler_map\n\n @classmethod\n def update_default_handler_map(cls, handler_map):\n """Update the global default handler map, shared by all legends."""\n cls._default_handler_map.update(handler_map)\n\n def 
get_legend_handler_map(self):\n """Return this legend instance's handler map."""\n default_handler_map = self.get_default_handler_map()\n return ({**default_handler_map, **self._custom_handler_map}\n if self._custom_handler_map else default_handler_map)\n\n @staticmethod\n def get_legend_handler(legend_handler_map, orig_handle):\n """\n Return a legend handler from *legend_handler_map* that\n corresponds to *orig_handle*.\n\n *legend_handler_map* should be a dictionary object (that is\n returned by the get_legend_handler_map method).\n\n It first checks if the *orig_handle* itself is a key in the\n *legend_handler_map* and returns the associated value.\n Otherwise, it checks for each of the classes in its\n method-resolution-order. If no matching key is found, it\n returns ``None``.\n """\n try:\n return legend_handler_map[orig_handle]\n except (TypeError, KeyError): # TypeError if unhashable.\n pass\n for handle_type in type(orig_handle).mro():\n try:\n return legend_handler_map[handle_type]\n except KeyError:\n pass\n return None\n\n def _init_legend_box(self, handles, labels, markerfirst=True):\n """\n Initialize the legend_box. The legend_box is an instance of\n the OffsetBox, which is packed with legend handles and\n texts. Once packed, their location is calculated during the\n drawing time.\n """\n\n fontsize = self._fontsize\n\n # legend_box is a HPacker, horizontally packed with columns.\n # Each column is a VPacker, vertically packed with legend items.\n # Each legend item is a HPacker packed with:\n # - handlebox: a DrawingArea which contains the legend handle.\n # - labelbox: a TextArea which contains the legend text.\n\n text_list = [] # the list of text instances\n handle_list = [] # the list of handle instances\n handles_and_labels = []\n\n # The approximate height and descent of text. 
These values are\n # only used for plotting the legend handle.\n descent = 0.35 * fontsize * (self.handleheight - 0.7) # heuristic.\n height = fontsize * self.handleheight - descent\n # each handle needs to be drawn inside a box of (x, y, w, h) =\n # (0, -descent, width, height). And their coordinates should\n # be given in the display coordinates.\n\n # The transformation of each handle will be automatically set\n # to self.get_transform(). If the artist does not use its\n # default transform (e.g., Collections), you need to\n # manually set their transform to the self.get_transform().\n legend_handler_map = self.get_legend_handler_map()\n\n for orig_handle, label in zip(handles, labels):\n handler = self.get_legend_handler(legend_handler_map, orig_handle)\n if handler is None:\n _api.warn_external(\n "Legend does not support handles for "\n f"{type(orig_handle).__name__} "\n "instances.\nA proxy artist may be used "\n "instead.\nSee: https://matplotlib.org/"\n "stable/users/explain/axes/legend_guide.html"\n "#controlling-the-legend-entries")\n # No handle for this artist, so we just defer to None.\n handle_list.append(None)\n else:\n textbox = TextArea(label, multilinebaseline=True,\n textprops=dict(\n verticalalignment='baseline',\n horizontalalignment='left',\n fontproperties=self.prop))\n handlebox = DrawingArea(width=self.handlelength * fontsize,\n height=height,\n xdescent=0., ydescent=descent)\n\n text_list.append(textbox._text)\n # Create the artist for the legend which represents the\n # original artist/handle.\n handle_list.append(handler.legend_artist(self, orig_handle,\n fontsize, handlebox))\n handles_and_labels.append((handlebox, textbox))\n\n columnbox = []\n # array_split splits n handles_and_labels into ncols columns, with the\n # first n%ncols columns having an extra entry. 
filter(len, ...)\n # handles the case where n < ncols: the last ncols-n columns are empty\n # and get filtered out.\n for handles_and_labels_column in filter(\n len, np.array_split(handles_and_labels, self._ncols)):\n # pack handlebox and labelbox into itembox\n itemboxes = [HPacker(pad=0,\n sep=self.handletextpad * fontsize,\n children=[h, t] if markerfirst else [t, h],\n align="baseline")\n for h, t in handles_and_labels_column]\n # pack columnbox\n alignment = "baseline" if markerfirst else "right"\n columnbox.append(VPacker(pad=0,\n sep=self.labelspacing * fontsize,\n align=alignment,\n children=itemboxes))\n\n mode = "expand" if self._mode == "expand" else "fixed"\n sep = self.columnspacing * fontsize\n self._legend_handle_box = HPacker(pad=0,\n sep=sep, align="baseline",\n mode=mode,\n children=columnbox)\n self._legend_title_box = TextArea("")\n self._legend_box = VPacker(pad=self.borderpad * fontsize,\n sep=self.labelspacing * fontsize,\n align=self._alignment,\n children=[self._legend_title_box,\n self._legend_handle_box])\n self._legend_box.set_figure(self.get_figure(root=False))\n self._legend_box.axes = self.axes\n self.texts = text_list\n self.legend_handles = handle_list\n\n def _auto_legend_data(self, renderer):\n """\n Return display coordinates for hit testing for "best" positioning.\n\n Returns\n -------\n bboxes\n List of bounding boxes of all patches.\n lines\n List of `.Path` corresponding to each line.\n offsets\n List of (x, y) offsets of all collection.\n """\n assert self.isaxes # always holds, as this is only called internally\n bboxes = []\n lines = []\n offsets = []\n for artist in self.parent._children:\n if isinstance(artist, Line2D):\n lines.append(\n artist.get_transform().transform_path(artist.get_path()))\n elif isinstance(artist, Rectangle):\n bboxes.append(\n artist.get_bbox().transformed(artist.get_data_transform()))\n elif isinstance(artist, Patch):\n lines.append(\n artist.get_transform().transform_path(artist.get_path()))\n 
elif isinstance(artist, PolyCollection):\n lines.extend(artist.get_transform().transform_path(path)\n for path in artist.get_paths())\n elif isinstance(artist, Collection):\n transform, transOffset, hoffsets, _ = artist._prepare_points()\n if len(hoffsets):\n offsets.extend(transOffset.transform(hoffsets))\n elif isinstance(artist, Text):\n bboxes.append(artist.get_window_extent(renderer))\n\n return bboxes, lines, offsets\n\n def get_children(self):\n # docstring inherited\n return [self._legend_box, self.get_frame()]\n\n def get_frame(self):\n """Return the `~.patches.Rectangle` used to frame the legend."""\n return self.legendPatch\n\n def get_lines(self):\n r"""Return the list of `~.lines.Line2D`\s in the legend."""\n return [h for h in self.legend_handles if isinstance(h, Line2D)]\n\n def get_patches(self):\n r"""Return the list of `~.patches.Patch`\s in the legend."""\n return silent_list('Patch',\n [h for h in self.legend_handles\n if isinstance(h, Patch)])\n\n def get_texts(self):\n r"""Return the list of `~.text.Text`\s in the legend."""\n return silent_list('Text', self.texts)\n\n def set_alignment(self, alignment):\n """\n Set the alignment of the legend title and the box of entries.\n\n The entries are aligned as a single block, so that markers always\n line up.\n\n Parameters\n ----------\n alignment : {'center', 'left', 'right'}\n\n """\n _api.check_in_list(["center", "left", "right"], alignment=alignment)\n self._alignment = alignment\n self._legend_box.align = alignment\n\n def get_alignment(self):\n """Get the alignment value of the legend box."""\n return self._legend_box.align\n\n def set_title(self, title, prop=None):\n """\n Set legend title and title style.\n\n Parameters\n ----------\n title : str\n The legend title.\n\n prop : `.font_manager.FontProperties` or `str` or `pathlib.Path`\n The font properties of the legend title.\n If a `str`, it is interpreted as a fontconfig pattern parsed by\n `.FontProperties`. 
If a `pathlib.Path`, it is interpreted as the\n absolute path to a font file.\n\n """\n self._legend_title_box._text.set_text(title)\n if title:\n self._legend_title_box._text.set_visible(True)\n self._legend_title_box.set_visible(True)\n else:\n self._legend_title_box._text.set_visible(False)\n self._legend_title_box.set_visible(False)\n\n if prop is not None:\n self._legend_title_box._text.set_fontproperties(prop)\n\n self.stale = True\n\n def get_title(self):\n """Return the `.Text` instance for the legend title."""\n return self._legend_title_box._text\n\n def get_window_extent(self, renderer=None):\n # docstring inherited\n if renderer is None:\n renderer = self.get_figure(root=True)._get_renderer()\n return self._legend_box.get_window_extent(renderer=renderer)\n\n def get_tightbbox(self, renderer=None):\n # docstring inherited\n return self._legend_box.get_window_extent(renderer)\n\n def get_frame_on(self):\n """Get whether the legend box patch is drawn."""\n return self.legendPatch.get_visible()\n\n def set_frame_on(self, b):\n """\n Set whether the legend box patch is drawn.\n\n Parameters\n ----------\n b : bool\n """\n self.legendPatch.set_visible(b)\n self.stale = True\n\n draw_frame = set_frame_on # Backcompat alias.\n\n def get_bbox_to_anchor(self):\n """Return the bbox that the legend will be anchored to."""\n if self._bbox_to_anchor is None:\n return self.parent.bbox\n else:\n return self._bbox_to_anchor\n\n def set_bbox_to_anchor(self, bbox, transform=None):\n """\n Set the bbox that the legend will be anchored to.\n\n Parameters\n ----------\n bbox : `~matplotlib.transforms.BboxBase` or tuple\n The bounding box can be specified in the following ways:\n\n - A `.BboxBase` instance\n - A tuple of ``(left, bottom, width, height)`` in the given\n transform (normalized axes coordinate if None)\n - A tuple of ``(left, bottom)`` where the width and height will be\n assumed to be zero.\n - *None*, to remove the bbox anchoring, and use the parent bbox.\n\n 
transform : `~matplotlib.transforms.Transform`, optional\n A transform to apply to the bounding box. If not specified, this\n will use a transform to the bounding box of the parent.\n """\n if bbox is None:\n self._bbox_to_anchor = None\n return\n elif isinstance(bbox, BboxBase):\n self._bbox_to_anchor = bbox\n else:\n try:\n l = len(bbox)\n except TypeError as err:\n raise ValueError(f"Invalid bbox: {bbox}") from err\n\n if l == 2:\n bbox = [bbox[0], bbox[1], 0, 0]\n\n self._bbox_to_anchor = Bbox.from_bounds(*bbox)\n\n if transform is None:\n transform = BboxTransformTo(self.parent.bbox)\n\n self._bbox_to_anchor = TransformedBbox(self._bbox_to_anchor,\n transform)\n self.stale = True\n\n def _get_anchored_bbox(self, loc, bbox, parentbbox, renderer):\n """\n Place the *bbox* inside the *parentbbox* according to a given\n location code. Return the (x, y) coordinate of the bbox.\n\n Parameters\n ----------\n loc : int\n A location code in range(1, 11). This corresponds to the possible\n values for ``self._loc``, excluding "best".\n bbox : `~matplotlib.transforms.Bbox`\n bbox to be placed, in display coordinates.\n parentbbox : `~matplotlib.transforms.Bbox`\n A parent box which will contain the bbox, in display coordinates.\n """\n return offsetbox._get_anchored_bbox(\n loc, bbox, parentbbox,\n self.borderaxespad * renderer.points_to_pixels(self._fontsize))\n\n def _find_best_position(self, width, height, renderer):\n """Determine the best location to place the legend."""\n assert self.isaxes # always holds, as this is only called internally\n\n start_time = time.perf_counter()\n\n bboxes, lines, offsets = self._auto_legend_data(renderer)\n\n bbox = Bbox.from_bounds(0, 0, width, height)\n\n candidates = []\n for idx in range(1, len(self.codes)):\n l, b = self._get_anchored_bbox(idx, bbox,\n self.get_bbox_to_anchor(),\n renderer)\n legendBox = Bbox.from_bounds(l, b, width, height)\n # XXX TODO: If markers are present, it would be good to take them\n # into account when 
checking vertex overlaps in the next line.\n badness = (sum(legendBox.count_contains(line.vertices)\n for line in lines)\n + legendBox.count_contains(offsets)\n + legendBox.count_overlaps(bboxes)\n + sum(line.intersects_bbox(legendBox, filled=False)\n for line in lines))\n # Include the index to favor lower codes in case of a tie.\n candidates.append((badness, idx, (l, b)))\n if badness == 0:\n break\n\n _, _, (l, b) = min(candidates)\n\n if self._loc_used_default and time.perf_counter() - start_time > 1:\n _api.warn_external(\n 'Creating legend with loc="best" can be slow with large '\n 'amounts of data.')\n\n return l, b\n\n def contains(self, mouseevent):\n return self.legendPatch.contains(mouseevent)\n\n def set_draggable(self, state, use_blit=False, update='loc'):\n """\n Enable or disable mouse dragging support of the legend.\n\n Parameters\n ----------\n state : bool\n Whether mouse dragging is enabled.\n use_blit : bool, optional\n Use blitting for faster image composition. For details see\n :ref:`func-animation`.\n update : {'loc', 'bbox'}, optional\n The legend parameter to be changed when dragged:\n\n - 'loc': update the *loc* parameter of the legend\n - 'bbox': update the *bbox_to_anchor* parameter of the legend\n\n Returns\n -------\n `.DraggableLegend` or *None*\n If *state* is ``True`` this returns the `.DraggableLegend` helper\n instance. 
Otherwise this returns *None*.\n """\n if state:\n if self._draggable is None:\n self._draggable = DraggableLegend(self,\n use_blit,\n update=update)\n else:\n if self._draggable is not None:\n self._draggable.disconnect()\n self._draggable = None\n return self._draggable\n\n def get_draggable(self):\n """Return ``True`` if the legend is draggable, ``False`` otherwise."""\n return self._draggable is not None\n\n\n# Helper functions to parse legend arguments for both `figure.legend` and\n# `axes.legend`:\ndef _get_legend_handles(axs, legend_handler_map=None):\n """Yield artists that can be used as handles in a legend."""\n handles_original = []\n for ax in axs:\n handles_original += [\n *(a for a in ax._children\n if isinstance(a, (Line2D, Patch, Collection, Text))),\n *ax.containers]\n # support parasite Axes:\n if hasattr(ax, 'parasites'):\n for axx in ax.parasites:\n handles_original += [\n *(a for a in axx._children\n if isinstance(a, (Line2D, Patch, Collection, Text))),\n *axx.containers]\n\n handler_map = {**Legend.get_default_handler_map(),\n **(legend_handler_map or {})}\n has_handler = Legend.get_legend_handler\n for handle in handles_original:\n label = handle.get_label()\n if label != '_nolegend_' and has_handler(handler_map, handle):\n yield handle\n elif (label and not label.startswith('_') and\n not has_handler(handler_map, handle)):\n _api.warn_external(\n "Legend does not support handles for "\n f"{type(handle).__name__} "\n "instances.\nSee: https://matplotlib.org/stable/"\n "tutorials/intermediate/legend_guide.html"\n "#implementing-a-custom-legend-handler")\n continue\n\n\ndef _get_legend_handles_labels(axs, legend_handler_map=None):\n """Return handles and labels for legend."""\n handles = []\n labels = []\n for handle in _get_legend_handles(axs, legend_handler_map):\n label = handle.get_label()\n if label and not label.startswith('_'):\n handles.append(handle)\n labels.append(label)\n return handles, labels\n\n\ndef _parse_legend_args(axs, 
*args, handles=None, labels=None, **kwargs):\n """\n Get the handles and labels from the calls to either ``figure.legend``\n or ``axes.legend``.\n\n The parser is a bit involved because we support::\n\n legend()\n legend(labels)\n legend(handles, labels)\n legend(labels=labels)\n legend(handles=handles)\n legend(handles=handles, labels=labels)\n\n The behavior for a mixture of positional and keyword handles and labels\n is undefined and issues a warning; it will be an error in the future.\n\n Parameters\n ----------\n axs : list of `.Axes`\n If handles are not given explicitly, the artists in these Axes are\n used as handles.\n *args : tuple\n Positional parameters passed to ``legend()``.\n handles\n The value of the keyword argument ``legend(handles=...)``, or *None*\n if that keyword argument was not used.\n labels\n The value of the keyword argument ``legend(labels=...)``, or *None*\n if that keyword argument was not used.\n **kwargs\n All other keyword arguments passed to ``legend()``.\n\n Returns\n -------\n handles : list of (`.Artist` or tuple of `.Artist`)\n The legend handles.\n labels : list of str\n The legend labels.\n kwargs : dict\n *kwargs* with keywords handles and labels removed.\n\n """\n log = logging.getLogger(__name__)\n\n handlers = kwargs.get('handler_map')\n\n if (handles is not None or labels is not None) and args:\n _api.warn_deprecated("3.9", message=(\n "You have mixed positional and keyword arguments, some input may "\n "be discarded. 
This is deprecated since %(since)s and will "\n "become an error in %(removal)s."))\n\n if (hasattr(handles, "__len__") and\n hasattr(labels, "__len__") and\n len(handles) != len(labels)):\n _api.warn_external(f"Mismatched number of handles and labels: "\n f"len(handles) = {len(handles)} "\n f"len(labels) = {len(labels)}")\n # if got both handles and labels as kwargs, make same length\n if handles and labels:\n handles, labels = zip(*zip(handles, labels))\n\n elif handles is not None and labels is None:\n labels = [handle.get_label() for handle in handles]\n\n elif labels is not None and handles is None:\n # Get as many handles as there are labels.\n handles = [handle for handle, label\n in zip(_get_legend_handles(axs, handlers), labels)]\n\n elif len(args) == 0: # 0 args: automatically detect labels and handles.\n handles, labels = _get_legend_handles_labels(axs, handlers)\n if not handles:\n _api.warn_external(\n "No artists with labels found to put in legend. Note that "\n "artists whose label start with an underscore are ignored "\n "when legend() is called with no argument.")\n\n elif len(args) == 1: # 1 arg: user defined labels, automatic handle detection.\n labels, = args\n if any(isinstance(l, Artist) for l in labels):\n raise TypeError("A single argument passed to legend() must be a "\n "list of labels, but found an Artist in there.")\n\n # Get as many handles as there are labels.\n handles = [handle for handle, label\n in zip(_get_legend_handles(axs, handlers), labels)]\n\n elif len(args) == 2: # 2 args: user defined handles and labels.\n handles, labels = args[:2]\n\n else:\n raise _api.nargs_error('legend', '0-2', len(args))\n\n return handles, labels, kwargs\n | .venv\Lib\site-packages\matplotlib\legend.py | legend.py | Python | 55,311 | 0.75 | 0.152555 | 0.08348 | awesome-app | 117 | 2024-05-12T14:20:55.639413 | BSD-3-Clause | false | b92a0cf92576803deae4aff1b26de65d |
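The handle/label pairing rules implemented by `_parse_legend_args` above (labels only, handles only, or both) can be sketched without any matplotlib machinery. The `FakeHandle` class and `parse` helper below are hypothetical stand-ins for real Artists and the real parser; they mirror only the zip-based pairing and truncation logic, not the warning and validation paths.

```python
# Hypothetical stand-ins: FakeHandle mimics an Artist's get_label(),
# and parse() mirrors only the handle/label pairing rules of
# _parse_legend_args, not the full warning/validation logic.

class FakeHandle:
    def __init__(self, label):
        self._label = label

    def get_label(self):
        return self._label


def parse(handles=None, labels=None, available=()):
    if handles is not None and labels is None:
        # legend(handles=...): labels come from each handle.
        labels = [h.get_label() for h in handles]
    elif labels is not None and handles is None:
        # legend(labels=...): take as many handles as there are labels;
        # zip() truncates to the shorter sequence.
        handles = [h for h, _ in zip(available, labels)]
    elif handles is not None and labels is not None:
        # legend(handles=..., labels=...): equalize the two lengths.
        handles, labels = map(list, zip(*zip(handles, labels)))
    return handles, labels
```

Note how `zip(*zip(handles, labels))` is the same idiom the real parser uses to trim mismatched keyword `handles` and `labels` to a common length.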
from matplotlib.axes import Axes\nfrom matplotlib.artist import Artist\nfrom matplotlib.backend_bases import MouseEvent\nfrom matplotlib.figure import Figure\nfrom matplotlib.font_manager import FontProperties\nfrom matplotlib.legend_handler import HandlerBase\nfrom matplotlib.lines import Line2D\nfrom matplotlib.offsetbox import (\n DraggableOffsetBox,\n)\nfrom matplotlib.patches import FancyBboxPatch, Patch, Rectangle\nfrom matplotlib.text import Text\nfrom matplotlib.transforms import (\n BboxBase,\n Transform,\n)\n\n\nimport pathlib\nfrom collections.abc import Iterable\nfrom typing import Any, Literal, overload\nfrom .typing import ColorType\n\nclass DraggableLegend(DraggableOffsetBox):\n legend: Legend\n def __init__(\n self, legend: Legend, use_blit: bool = ..., update: Literal["loc", "bbox"] = ...\n ) -> None: ...\n def finalize_offset(self) -> None: ...\n\nclass Legend(Artist):\n codes: dict[str, int]\n zorder: float\n prop: FontProperties\n texts: list[Text]\n legend_handles: list[Artist | None]\n numpoints: int\n markerscale: float\n scatterpoints: int\n borderpad: float\n labelspacing: float\n handlelength: float\n handleheight: float\n handletextpad: float\n borderaxespad: float\n columnspacing: float\n shadow: bool\n isaxes: bool\n axes: Axes\n parent: Axes | Figure\n legendPatch: FancyBboxPatch\n def __init__(\n self,\n parent: Axes | Figure,\n handles: Iterable[Artist | tuple[Artist, ...]],\n labels: Iterable[str],\n *,\n loc: str | tuple[float, float] | int | None = ...,\n numpoints: int | None = ...,\n markerscale: float | None = ...,\n markerfirst: bool = ...,\n reverse: bool = ...,\n scatterpoints: int | None = ...,\n scatteryoffsets: Iterable[float] | None = ...,\n prop: FontProperties | dict[str, Any] | None = ...,\n fontsize: float | str | None = ...,\n labelcolor: ColorType\n | Iterable[ColorType]\n | Literal["linecolor", "markerfacecolor", "mfc", "markeredgecolor", "mec"]\n | None = ...,\n borderpad: float | None = ...,\n labelspacing: 
float | None = ...,\n handlelength: float | None = ...,\n handleheight: float | None = ...,\n handletextpad: float | None = ...,\n borderaxespad: float | None = ...,\n columnspacing: float | None = ...,\n ncols: int = ...,\n mode: Literal["expand"] | None = ...,\n fancybox: bool | None = ...,\n shadow: bool | dict[str, Any] | None = ...,\n title: str | None = ...,\n title_fontsize: float | None = ...,\n framealpha: float | None = ...,\n edgecolor: Literal["inherit"] | ColorType | None = ...,\n facecolor: Literal["inherit"] | ColorType | None = ...,\n bbox_to_anchor: BboxBase\n | tuple[float, float]\n | tuple[float, float, float, float]\n | None = ...,\n bbox_transform: Transform | None = ...,\n frameon: bool | None = ...,\n handler_map: dict[Artist | type, HandlerBase] | None = ...,\n title_fontproperties: FontProperties | dict[str, Any] | None = ...,\n alignment: Literal["center", "left", "right"] = ...,\n ncol: int = ...,\n draggable: bool = ...\n ) -> None: ...\n def contains(self, mouseevent: MouseEvent) -> tuple[bool, dict[Any, Any]]: ...\n def set_ncols(self, ncols: int) -> None: ...\n @classmethod\n def get_default_handler_map(cls) -> dict[type, HandlerBase]: ...\n @classmethod\n def set_default_handler_map(cls, handler_map: dict[type, HandlerBase]) -> None: ...\n @classmethod\n def update_default_handler_map(\n cls, handler_map: dict[type, HandlerBase]\n ) -> None: ...\n def get_legend_handler_map(self) -> dict[type, HandlerBase]: ...\n @staticmethod\n def get_legend_handler(\n legend_handler_map: dict[type, HandlerBase], orig_handle: Any\n ) -> HandlerBase | None: ...\n def get_children(self) -> list[Artist]: ...\n def get_frame(self) -> Rectangle: ...\n def get_lines(self) -> list[Line2D]: ...\n def get_patches(self) -> list[Patch]: ...\n def get_texts(self) -> list[Text]: ...\n def set_alignment(self, alignment: Literal["center", "left", "right"]) -> None: ...\n def get_alignment(self) -> Literal["center", "left", "right"]: ...\n def set_loc(self, loc: 
str | tuple[float, float] | int | None = ...) -> None: ...\n def set_title(\n self, title: str, prop: FontProperties | str | pathlib.Path | None = ...\n ) -> None: ...\n def get_title(self) -> Text: ...\n def get_frame_on(self) -> bool: ...\n def set_frame_on(self, b: bool) -> None: ...\n draw_frame = set_frame_on\n def get_bbox_to_anchor(self) -> BboxBase: ...\n def set_bbox_to_anchor(\n self,\n bbox: BboxBase\n | tuple[float, float]\n | tuple[float, float, float, float]\n | None,\n transform: Transform | None = ...\n ) -> None: ...\n @overload\n def set_draggable(\n self,\n state: Literal[True],\n use_blit: bool = ...,\n update: Literal["loc", "bbox"] = ...,\n ) -> DraggableLegend: ...\n @overload\n def set_draggable(\n self,\n state: Literal[False],\n use_blit: bool = ...,\n update: Literal["loc", "bbox"] = ...,\n ) -> None: ...\n def get_draggable(self) -> bool: ...\n | .venv\Lib\site-packages\matplotlib\legend.pyi | legend.pyi | Other | 5,364 | 0.85 | 0.190789 | 0.006757 | node-utils | 695 | 2025-03-02T08:12:13.193556 | MIT | false | 7047eb40201fa511bff967ff3dcf4ab9 |
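The `Legend` stub above exposes `handler_map`, `loc`, and `numpoints` on the constructor; a short usage sketch of overriding the handler for one artist (hedged: assumes matplotlib is installed and uses the headless Agg backend; the line data and label are made up for illustration):

```python
import matplotlib
matplotlib.use("Agg")  # headless backend, no display needed
import matplotlib.pyplot as plt
from matplotlib.legend_handler import HandlerLine2D

fig, ax = plt.subplots()
# Hypothetical data; the label "data" is arbitrary.
(line,) = ax.plot([0, 1], [0, 1], marker="o", label="data")
# Per-legend override: draw a single marker for this line's legend entry.
leg = ax.legend(handler_map={line: HandlerLine2D(numpoints=1)},
                loc="upper left")
```

`handler_map` keys may be either artist instances (as here) or artist types, matching the `dict[Artist | type, HandlerBase]` annotation in the stub.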
"""\nDefault legend handlers.\n\n.. important::\n\n This is a low-level legend API, which most end users do not need.\n\n We recommend that you are familiar with the :ref:`legend guide\n <legend_guide>` before reading this documentation.\n\nLegend handlers are expected to be a callable object with a following\nsignature::\n\n legend_handler(legend, orig_handle, fontsize, handlebox)\n\nWhere *legend* is the legend itself, *orig_handle* is the original\nplot, *fontsize* is the fontsize in pixels, and *handlebox* is an\n`.OffsetBox` instance. Within the call, you should create relevant\nartists (using relevant properties from the *legend* and/or\n*orig_handle*) and add them into the *handlebox*. The artists need to\nbe scaled according to the *fontsize* (note that the size is in pixels,\ni.e., this is dpi-scaled value).\n\nThis module includes definition of several legend handler classes\nderived from the base class (HandlerBase) with the following method::\n\n def legend_artist(self, legend, orig_handle, fontsize, handlebox)\n"""\n\nfrom itertools import cycle\n\nimport numpy as np\n\nfrom matplotlib import cbook\nfrom matplotlib.lines import Line2D\nfrom matplotlib.patches import Rectangle\nimport matplotlib.collections as mcoll\n\n\ndef update_from_first_child(tgt, src):\n first_child = next(iter(src.get_children()), None)\n if first_child is not None:\n tgt.update_from(first_child)\n\n\nclass HandlerBase:\n """\n A base class for default legend handlers.\n\n The derived classes are meant to override *create_artists* method, which\n has the following signature::\n\n def create_artists(self, legend, orig_handle,\n xdescent, ydescent, width, height, fontsize,\n trans):\n\n The overridden method needs to create artists of the given\n transform that fits in the given dimension (xdescent, ydescent,\n width, height) that are scaled by fontsize if necessary.\n\n """\n def __init__(self, xpad=0., ypad=0., update_func=None):\n """\n Parameters\n ----------\n xpad : float, 
optional\n Padding in x-direction.\n ypad : float, optional\n Padding in y-direction.\n update_func : callable, optional\n Function for updating the legend handler properties from another\n legend handler, used by `~HandlerBase.update_prop`.\n """\n self._xpad, self._ypad = xpad, ypad\n self._update_prop_func = update_func\n\n def _update_prop(self, legend_handle, orig_handle):\n if self._update_prop_func is None:\n self._default_update_prop(legend_handle, orig_handle)\n else:\n self._update_prop_func(legend_handle, orig_handle)\n\n def _default_update_prop(self, legend_handle, orig_handle):\n legend_handle.update_from(orig_handle)\n\n def update_prop(self, legend_handle, orig_handle, legend):\n\n self._update_prop(legend_handle, orig_handle)\n\n legend._set_artist_props(legend_handle)\n legend_handle.set_clip_box(None)\n legend_handle.set_clip_path(None)\n\n def adjust_drawing_area(self, legend, orig_handle,\n xdescent, ydescent, width, height, fontsize,\n ):\n xdescent = xdescent - self._xpad * fontsize\n ydescent = ydescent - self._ypad * fontsize\n width = width - self._xpad * fontsize\n height = height - self._ypad * fontsize\n return xdescent, ydescent, width, height\n\n def legend_artist(self, legend, orig_handle,\n fontsize, handlebox):\n """\n Return the artist that this HandlerBase generates for the given\n original artist/handle.\n\n Parameters\n ----------\n legend : `~matplotlib.legend.Legend`\n The legend for which these legend artists are being created.\n orig_handle : :class:`matplotlib.artist.Artist` or similar\n The object for which these legend artists are being created.\n fontsize : int\n The fontsize in pixels. The artists being created should\n be scaled according to the given fontsize.\n handlebox : `~matplotlib.offsetbox.OffsetBox`\n The box which has been created to hold this legend entry's\n artists. 
Artists created in the `legend_artist` method must\n be added to this handlebox inside this method.\n\n """\n xdescent, ydescent, width, height = self.adjust_drawing_area(\n legend, orig_handle,\n handlebox.xdescent, handlebox.ydescent,\n handlebox.width, handlebox.height,\n fontsize)\n artists = self.create_artists(legend, orig_handle,\n xdescent, ydescent, width, height,\n fontsize, handlebox.get_transform())\n\n # create_artists will return a list of artists.\n for a in artists:\n handlebox.add_artist(a)\n\n # we only return the first artist\n return artists[0]\n\n def create_artists(self, legend, orig_handle,\n xdescent, ydescent, width, height, fontsize,\n trans):\n """\n Return the legend artists generated.\n\n Parameters\n ----------\n legend : `~matplotlib.legend.Legend`\n The legend for which these legend artists are being created.\n orig_handle : `~matplotlib.artist.Artist` or similar\n The object for which these legend artists are being created.\n xdescent, ydescent, width, height : int\n The rectangle (*xdescent*, *ydescent*, *width*, *height*) that the\n legend artists being created should fit within.\n fontsize : int\n The fontsize in pixels. 
The legend artists being created should\n be scaled according to the given fontsize.\n trans : `~matplotlib.transforms.Transform`\n The transform that is applied to the legend artists being created.\n Typically from unit coordinates in the handler box to screen\n coordinates.\n """\n raise NotImplementedError('Derived must override')\n\n\nclass HandlerNpoints(HandlerBase):\n """\n A legend handler that shows *numpoints* points in the legend entry.\n """\n\n def __init__(self, marker_pad=0.3, numpoints=None, **kwargs):\n """\n Parameters\n ----------\n marker_pad : float\n Padding between points in legend entry.\n numpoints : int\n Number of points to show in legend entry.\n **kwargs\n Keyword arguments forwarded to `.HandlerBase`.\n """\n super().__init__(**kwargs)\n\n self._numpoints = numpoints\n self._marker_pad = marker_pad\n\n def get_numpoints(self, legend):\n if self._numpoints is None:\n return legend.numpoints\n else:\n return self._numpoints\n\n def get_xdata(self, legend, xdescent, ydescent, width, height, fontsize):\n numpoints = self.get_numpoints(legend)\n if numpoints > 1:\n # we put some pad here to compensate the size of the marker\n pad = self._marker_pad * fontsize\n xdata = np.linspace(-xdescent + pad,\n -xdescent + width - pad,\n numpoints)\n xdata_marker = xdata\n else:\n xdata = [-xdescent, -xdescent + width]\n xdata_marker = [-xdescent + 0.5 * width]\n return xdata, xdata_marker\n\n\nclass HandlerNpointsYoffsets(HandlerNpoints):\n """\n A legend handler that shows *numpoints* in the legend, and allows them to\n be individually offset in the y-direction.\n """\n\n def __init__(self, numpoints=None, yoffsets=None, **kwargs):\n """\n Parameters\n ----------\n numpoints : int\n Number of points to show in legend entry.\n yoffsets : array of floats\n Length *numpoints* list of y offsets for each point in\n legend entry.\n **kwargs\n Keyword arguments forwarded to `.HandlerNpoints`.\n """\n super().__init__(numpoints=numpoints, **kwargs)\n 
self._yoffsets = yoffsets\n\n def get_ydata(self, legend, xdescent, ydescent, width, height, fontsize):\n if self._yoffsets is None:\n ydata = height * legend._scatteryoffsets\n else:\n ydata = height * np.asarray(self._yoffsets)\n\n return ydata\n\n\nclass HandlerLine2DCompound(HandlerNpoints):\n """\n Original handler for `.Line2D` instances, that relies on combining\n a line-only with a marker-only artist. May be deprecated in the future.\n """\n\n def create_artists(self, legend, orig_handle,\n xdescent, ydescent, width, height, fontsize,\n trans):\n # docstring inherited\n xdata, xdata_marker = self.get_xdata(legend, xdescent, ydescent,\n width, height, fontsize)\n\n ydata = np.full_like(xdata, ((height - ydescent) / 2))\n legline = Line2D(xdata, ydata)\n\n self.update_prop(legline, orig_handle, legend)\n legline.set_drawstyle('default')\n legline.set_marker("")\n\n legline_marker = Line2D(xdata_marker, ydata[:len(xdata_marker)])\n self.update_prop(legline_marker, orig_handle, legend)\n legline_marker.set_linestyle('None')\n if legend.markerscale != 1:\n newsz = legline_marker.get_markersize() * legend.markerscale\n legline_marker.set_markersize(newsz)\n # we don't want to add this to the return list because\n # the texts and handles are assumed to be in one-to-one\n # correspondence.\n legline._legmarker = legline_marker\n\n legline.set_transform(trans)\n legline_marker.set_transform(trans)\n\n return [legline, legline_marker]\n\n\nclass HandlerLine2D(HandlerNpoints):\n """\n Handler for `.Line2D` instances.\n\n See Also\n --------\n HandlerLine2DCompound : An earlier handler implementation, which used one\n artist for the line and another for the marker(s).\n """\n\n def create_artists(self, legend, orig_handle,\n xdescent, ydescent, width, height, fontsize,\n trans):\n # docstring inherited\n xdata, xdata_marker = self.get_xdata(legend, xdescent, ydescent,\n width, height, fontsize)\n\n markevery = None\n if self.get_numpoints(legend) == 1:\n # Special 
case: one wants a single marker in the center\n # and a line that extends on both sides. One will use a\n # 3 points line, but only mark the #1 (i.e. middle) point.\n xdata = np.linspace(xdata[0], xdata[-1], 3)\n markevery = [1]\n\n ydata = np.full_like(xdata, (height - ydescent) / 2)\n legline = Line2D(xdata, ydata, markevery=markevery)\n\n self.update_prop(legline, orig_handle, legend)\n\n if legend.markerscale != 1:\n newsz = legline.get_markersize() * legend.markerscale\n legline.set_markersize(newsz)\n\n legline.set_transform(trans)\n\n return [legline]\n\n\nclass HandlerPatch(HandlerBase):\n """\n Handler for `.Patch` instances.\n """\n\n def __init__(self, patch_func=None, **kwargs):\n """\n Parameters\n ----------\n patch_func : callable, optional\n The function that creates the legend key artist.\n *patch_func* should have the signature::\n\n def patch_func(legend=legend, orig_handle=orig_handle,\n xdescent=xdescent, ydescent=ydescent,\n width=width, height=height, fontsize=fontsize)\n\n Subsequently, the created artist will have its ``update_prop``\n method called and the appropriate transform will be applied.\n\n **kwargs\n Keyword arguments forwarded to `.HandlerBase`.\n """\n super().__init__(**kwargs)\n self._patch_func = patch_func\n\n def _create_patch(self, legend, orig_handle,\n xdescent, ydescent, width, height, fontsize):\n if self._patch_func is None:\n p = Rectangle(xy=(-xdescent, -ydescent),\n width=width, height=height)\n else:\n p = self._patch_func(legend=legend, orig_handle=orig_handle,\n xdescent=xdescent, ydescent=ydescent,\n width=width, height=height, fontsize=fontsize)\n return p\n\n def create_artists(self, legend, orig_handle,\n xdescent, ydescent, width, height, fontsize, trans):\n # docstring inherited\n p = self._create_patch(legend, orig_handle,\n xdescent, ydescent, width, height, fontsize)\n self.update_prop(p, orig_handle, legend)\n p.set_transform(trans)\n return [p]\n\n\nclass HandlerStepPatch(HandlerBase):\n """\n Handler 
for `~.matplotlib.patches.StepPatch` instances.\n """\n\n @staticmethod\n def _create_patch(orig_handle, xdescent, ydescent, width, height):\n return Rectangle(xy=(-xdescent, -ydescent), width=width,\n height=height, color=orig_handle.get_facecolor())\n\n @staticmethod\n def _create_line(orig_handle, width, height):\n # Unfilled StepPatch should show as a line\n legline = Line2D([0, width], [height/2, height/2],\n color=orig_handle.get_edgecolor(),\n linestyle=orig_handle.get_linestyle(),\n linewidth=orig_handle.get_linewidth(),\n )\n\n # Overwrite manually because patch and line properties don't mix\n legline.set_drawstyle('default')\n legline.set_marker("")\n return legline\n\n def create_artists(self, legend, orig_handle,\n xdescent, ydescent, width, height, fontsize, trans):\n # docstring inherited\n if orig_handle.get_fill() or (orig_handle.get_hatch() is not None):\n p = self._create_patch(orig_handle, xdescent, ydescent, width,\n height)\n self.update_prop(p, orig_handle, legend)\n else:\n p = self._create_line(orig_handle, width, height)\n p.set_transform(trans)\n return [p]\n\n\nclass HandlerLineCollection(HandlerLine2D):\n """\n Handler for `.LineCollection` instances.\n """\n def get_numpoints(self, legend):\n if self._numpoints is None:\n return legend.scatterpoints\n else:\n return self._numpoints\n\n def _default_update_prop(self, legend_handle, orig_handle):\n lw = orig_handle.get_linewidths()[0]\n dashes = orig_handle._us_linestyles[0]\n color = orig_handle.get_colors()[0]\n legend_handle.set_color(color)\n legend_handle.set_linestyle(dashes)\n legend_handle.set_linewidth(lw)\n\n def create_artists(self, legend, orig_handle,\n xdescent, ydescent, width, height, fontsize, trans):\n # docstring inherited\n xdata, xdata_marker = self.get_xdata(legend, xdescent, ydescent,\n width, height, fontsize)\n ydata = np.full_like(xdata, (height - ydescent) / 2)\n legline = Line2D(xdata, ydata)\n\n self.update_prop(legline, orig_handle, legend)\n 
legline.set_transform(trans)\n\n return [legline]\n\n\nclass HandlerRegularPolyCollection(HandlerNpointsYoffsets):\n r"""Handler for `.RegularPolyCollection`\s."""\n\n def __init__(self, yoffsets=None, sizes=None, **kwargs):\n super().__init__(yoffsets=yoffsets, **kwargs)\n\n self._sizes = sizes\n\n def get_numpoints(self, legend):\n if self._numpoints is None:\n return legend.scatterpoints\n else:\n return self._numpoints\n\n def get_sizes(self, legend, orig_handle,\n xdescent, ydescent, width, height, fontsize):\n if self._sizes is None:\n handle_sizes = orig_handle.get_sizes()\n if not len(handle_sizes):\n handle_sizes = [1]\n size_max = max(handle_sizes) * legend.markerscale ** 2\n size_min = min(handle_sizes) * legend.markerscale ** 2\n\n numpoints = self.get_numpoints(legend)\n if numpoints < 4:\n sizes = [.5 * (size_max + size_min), size_max,\n size_min][:numpoints]\n else:\n rng = (size_max - size_min)\n sizes = rng * np.linspace(0, 1, numpoints) + size_min\n else:\n sizes = self._sizes\n\n return sizes\n\n def update_prop(self, legend_handle, orig_handle, legend):\n\n self._update_prop(legend_handle, orig_handle)\n\n legend_handle.set_figure(legend.get_figure(root=False))\n # legend._set_artist_props(legend_handle)\n legend_handle.set_clip_box(None)\n legend_handle.set_clip_path(None)\n\n def create_collection(self, orig_handle, sizes, offsets, offset_transform):\n return type(orig_handle)(\n orig_handle.get_numsides(),\n rotation=orig_handle.get_rotation(), sizes=sizes,\n offsets=offsets, offset_transform=offset_transform,\n )\n\n def create_artists(self, legend, orig_handle,\n xdescent, ydescent, width, height, fontsize,\n trans):\n # docstring inherited\n xdata, xdata_marker = self.get_xdata(legend, xdescent, ydescent,\n width, height, fontsize)\n\n ydata = self.get_ydata(legend, xdescent, ydescent,\n width, height, fontsize)\n\n sizes = self.get_sizes(legend, orig_handle, xdescent, ydescent,\n width, height, fontsize)\n\n p = self.create_collection(\n 
orig_handle, sizes,\n offsets=list(zip(xdata_marker, ydata)), offset_transform=trans)\n\n self.update_prop(p, orig_handle, legend)\n p.set_offset_transform(trans)\n return [p]\n\n\nclass HandlerPathCollection(HandlerRegularPolyCollection):\n r"""Handler for `.PathCollection`\s, which are used by `~.Axes.scatter`."""\n\n def create_collection(self, orig_handle, sizes, offsets, offset_transform):\n return type(orig_handle)(\n [orig_handle.get_paths()[0]], sizes=sizes,\n offsets=offsets, offset_transform=offset_transform,\n )\n\n\nclass HandlerCircleCollection(HandlerRegularPolyCollection):\n r"""Handler for `.CircleCollection`\s."""\n\n def create_collection(self, orig_handle, sizes, offsets, offset_transform):\n return type(orig_handle)(\n sizes, offsets=offsets, offset_transform=offset_transform)\n\n\nclass HandlerErrorbar(HandlerLine2D):\n """Handler for Errorbars."""\n\n def __init__(self, xerr_size=0.5, yerr_size=None,\n marker_pad=0.3, numpoints=None, **kwargs):\n\n self._xerr_size = xerr_size\n self._yerr_size = yerr_size\n\n super().__init__(marker_pad=marker_pad, numpoints=numpoints, **kwargs)\n\n def get_err_size(self, legend, xdescent, ydescent,\n width, height, fontsize):\n xerr_size = self._xerr_size * fontsize\n\n if self._yerr_size is None:\n yerr_size = xerr_size\n else:\n yerr_size = self._yerr_size * fontsize\n\n return xerr_size, yerr_size\n\n def create_artists(self, legend, orig_handle,\n xdescent, ydescent, width, height, fontsize,\n trans):\n # docstring inherited\n plotlines, caplines, barlinecols = orig_handle\n\n xdata, xdata_marker = self.get_xdata(legend, xdescent, ydescent,\n width, height, fontsize)\n\n ydata = np.full_like(xdata, (height - ydescent) / 2)\n legline = Line2D(xdata, ydata)\n\n xdata_marker = np.asarray(xdata_marker)\n ydata_marker = np.asarray(ydata[:len(xdata_marker)])\n\n xerr_size, yerr_size = self.get_err_size(legend, xdescent, ydescent,\n width, height, fontsize)\n\n legline_marker = Line2D(xdata_marker, 
ydata_marker)\n\n # when plotlines are None (only errorbars are drawn), we just\n # make legline invisible.\n if plotlines is None:\n legline.set_visible(False)\n legline_marker.set_visible(False)\n else:\n self.update_prop(legline, plotlines, legend)\n\n legline.set_drawstyle('default')\n legline.set_marker('none')\n\n self.update_prop(legline_marker, plotlines, legend)\n legline_marker.set_linestyle('None')\n\n if legend.markerscale != 1:\n newsz = legline_marker.get_markersize() * legend.markerscale\n legline_marker.set_markersize(newsz)\n\n handle_barlinecols = []\n handle_caplines = []\n\n if orig_handle.has_xerr:\n verts = [((x - xerr_size, y), (x + xerr_size, y))\n for x, y in zip(xdata_marker, ydata_marker)]\n coll = mcoll.LineCollection(verts)\n self.update_prop(coll, barlinecols[0], legend)\n handle_barlinecols.append(coll)\n\n if caplines:\n capline_left = Line2D(xdata_marker - xerr_size, ydata_marker)\n capline_right = Line2D(xdata_marker + xerr_size, ydata_marker)\n self.update_prop(capline_left, caplines[0], legend)\n self.update_prop(capline_right, caplines[0], legend)\n capline_left.set_marker("|")\n capline_right.set_marker("|")\n\n handle_caplines.append(capline_left)\n handle_caplines.append(capline_right)\n\n if orig_handle.has_yerr:\n verts = [((x, y - yerr_size), (x, y + yerr_size))\n for x, y in zip(xdata_marker, ydata_marker)]\n coll = mcoll.LineCollection(verts)\n self.update_prop(coll, barlinecols[0], legend)\n handle_barlinecols.append(coll)\n\n if caplines:\n capline_left = Line2D(xdata_marker, ydata_marker - yerr_size)\n capline_right = Line2D(xdata_marker, ydata_marker + yerr_size)\n self.update_prop(capline_left, caplines[0], legend)\n self.update_prop(capline_right, caplines[0], legend)\n capline_left.set_marker("_")\n capline_right.set_marker("_")\n\n handle_caplines.append(capline_left)\n handle_caplines.append(capline_right)\n\n artists = [\n *handle_barlinecols, *handle_caplines, legline, legline_marker,\n ]\n for artist in 
artists:\n artist.set_transform(trans)\n return artists\n\n\nclass HandlerStem(HandlerNpointsYoffsets):\n """\n Handler for plots produced by `~.Axes.stem`.\n """\n\n def __init__(self, marker_pad=0.3, numpoints=None,\n bottom=None, yoffsets=None, **kwargs):\n """\n Parameters\n ----------\n marker_pad : float, default: 0.3\n Padding between points in legend entry.\n numpoints : int, optional\n Number of points to show in legend entry.\n bottom : float, optional\n\n yoffsets : array of floats, optional\n Length *numpoints* list of y offsets for each point in\n legend entry.\n **kwargs\n Keyword arguments forwarded to `.HandlerNpointsYoffsets`.\n """\n super().__init__(marker_pad=marker_pad, numpoints=numpoints,\n yoffsets=yoffsets, **kwargs)\n self._bottom = bottom\n\n def get_ydata(self, legend, xdescent, ydescent, width, height, fontsize):\n if self._yoffsets is None:\n ydata = height * (0.5 * legend._scatteryoffsets + 0.5)\n else:\n ydata = height * np.asarray(self._yoffsets)\n\n return ydata\n\n def create_artists(self, legend, orig_handle,\n xdescent, ydescent, width, height, fontsize,\n trans):\n # docstring inherited\n markerline, stemlines, baseline = orig_handle\n # Check to see if the stemcontainer is storing lines as a list or a\n # LineCollection. 
Eventually using a list will be removed, and this\n # logic can also be removed.\n using_linecoll = isinstance(stemlines, mcoll.LineCollection)\n\n xdata, xdata_marker = self.get_xdata(legend, xdescent, ydescent,\n width, height, fontsize)\n\n ydata = self.get_ydata(legend, xdescent, ydescent,\n width, height, fontsize)\n\n if self._bottom is None:\n bottom = 0.\n else:\n bottom = self._bottom\n\n leg_markerline = Line2D(xdata_marker, ydata[:len(xdata_marker)])\n self.update_prop(leg_markerline, markerline, legend)\n\n leg_stemlines = [Line2D([x, x], [bottom, y])\n for x, y in zip(xdata_marker, ydata)]\n\n if using_linecoll:\n # change the function used by update_prop() from the default\n # to one that handles LineCollection\n with cbook._setattr_cm(\n self, _update_prop_func=self._copy_collection_props):\n for line in leg_stemlines:\n self.update_prop(line, stemlines, legend)\n\n else:\n for lm, m in zip(leg_stemlines, stemlines):\n self.update_prop(lm, m, legend)\n\n leg_baseline = Line2D([np.min(xdata), np.max(xdata)],\n [bottom, bottom])\n self.update_prop(leg_baseline, baseline, legend)\n\n artists = [*leg_stemlines, leg_baseline, leg_markerline]\n for artist in artists:\n artist.set_transform(trans)\n return artists\n\n def _copy_collection_props(self, legend_handle, orig_handle):\n """\n Copy properties from the `.LineCollection` *orig_handle* to the\n `.Line2D` *legend_handle*.\n """\n legend_handle.set_color(orig_handle.get_color()[0])\n legend_handle.set_linestyle(orig_handle.get_linestyle()[0])\n\n\nclass HandlerTuple(HandlerBase):\n """\n Handler for Tuple.\n """\n\n def __init__(self, ndivide=1, pad=None, **kwargs):\n """\n Parameters\n ----------\n ndivide : int or None, default: 1\n The number of sections to divide the legend area into. 
If None,\n use the length of the input tuple.\n pad : float, default: :rc:`legend.borderpad`\n Padding in units of fraction of font size.\n **kwargs\n Keyword arguments forwarded to `.HandlerBase`.\n """\n self._ndivide = ndivide\n self._pad = pad\n super().__init__(**kwargs)\n\n def create_artists(self, legend, orig_handle,\n xdescent, ydescent, width, height, fontsize,\n trans):\n # docstring inherited\n handler_map = legend.get_legend_handler_map()\n\n if self._ndivide is None:\n ndivide = len(orig_handle)\n else:\n ndivide = self._ndivide\n\n if self._pad is None:\n pad = legend.borderpad * fontsize\n else:\n pad = self._pad * fontsize\n\n if ndivide > 1:\n width = (width - pad * (ndivide - 1)) / ndivide\n\n xds_cycle = cycle(xdescent - (width + pad) * np.arange(ndivide))\n\n a_list = []\n for handle1 in orig_handle:\n handler = legend.get_legend_handler(handler_map, handle1)\n _a_list = handler.create_artists(\n legend, handle1,\n next(xds_cycle), ydescent, width, height, fontsize, trans)\n a_list.extend(_a_list)\n\n return a_list\n\n\nclass HandlerPolyCollection(HandlerBase):\n """\n Handler for `.PolyCollection` used in `~.Axes.fill_between` and\n `~.Axes.stackplot`.\n """\n def _update_prop(self, legend_handle, orig_handle):\n def first_color(colors):\n if colors.size == 0:\n return (0, 0, 0, 0)\n return tuple(colors[0])\n\n def get_first(prop_array):\n if len(prop_array):\n return prop_array[0]\n else:\n return None\n\n # orig_handle is a PolyCollection and legend_handle is a Patch.\n # Directly set Patch color attributes (must be RGBA tuples).\n legend_handle._facecolor = first_color(orig_handle.get_facecolor())\n legend_handle._edgecolor = first_color(orig_handle.get_edgecolor())\n legend_handle._original_facecolor = orig_handle._original_facecolor\n legend_handle._original_edgecolor = orig_handle._original_edgecolor\n legend_handle._fill = orig_handle.get_fill()\n legend_handle._hatch = orig_handle.get_hatch()\n # Hatch color is anomalous in having no 
getters and setters.\n legend_handle._hatch_color = orig_handle._hatch_color\n # Setters are fine for the remaining attributes.\n legend_handle.set_linewidth(get_first(orig_handle.get_linewidths()))\n legend_handle.set_linestyle(get_first(orig_handle.get_linestyles()))\n legend_handle.set_transform(get_first(orig_handle.get_transforms()))\n legend_handle.set_figure(orig_handle.get_figure())\n # Alpha is already taken into account by the color attributes.\n\n def create_artists(self, legend, orig_handle,\n xdescent, ydescent, width, height, fontsize, trans):\n # docstring inherited\n p = Rectangle(xy=(-xdescent, -ydescent),\n width=width, height=height)\n self.update_prop(p, orig_handle, legend)\n p.set_transform(trans)\n return [p]\n | .venv\Lib\site-packages\matplotlib\legend_handler.py | legend_handler.py | Python | 29,931 | 0.95 | 0.163592 | 0.064516 | node-utils | 719 | 2024-01-03T19:07:19.330953 | Apache-2.0 | false | aa30c16968d30523888c8d5d48625fb3 |
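The marker layout used by the handlers above comes from `HandlerNpoints.get_xdata`: points are spaced evenly inside the handler box, inset by `marker_pad * fontsize`, with a single point centered instead. A minimal stdlib sketch of that spacing logic (a re-derivation for illustration, not the matplotlib implementation itself):

```python
def handler_xdata(xdescent, width, fontsize, numpoints, marker_pad=0.3):
    # Mirrors HandlerNpoints.get_xdata: place `numpoints` markers evenly
    # across the handler box, padded by marker_pad * fontsize on each side.
    if numpoints > 1:
        pad = marker_pad * fontsize
        step = (width - 2 * pad) / (numpoints - 1)
        xdata = [-xdescent + pad + i * step for i in range(numpoints)]
        return xdata, xdata
    # Single point: the line spans the full box, the marker sits at center.
    return [-xdescent, -xdescent + width], [-xdescent + 0.5 * width]
```

With `width=20`, `fontsize=10`, and three points, the pad is 3 and the markers land at x = 3, 10, 17, matching the `np.linspace(-xdescent + pad, -xdescent + width - pad, numpoints)` call in the source.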
from collections.abc import Callable, Sequence\nfrom matplotlib.artist import Artist\nfrom matplotlib.legend import Legend\nfrom matplotlib.offsetbox import OffsetBox\nfrom matplotlib.transforms import Transform\n\nfrom typing import TypeVar\n\nfrom numpy.typing import ArrayLike\n\ndef update_from_first_child(tgt: Artist, src: Artist) -> None: ...\n\nclass HandlerBase:\n def __init__(\n self,\n xpad: float = ...,\n ypad: float = ...,\n update_func: Callable[[Artist, Artist], None] | None = ...,\n ) -> None: ...\n def update_prop(\n self, legend_handle: Artist, orig_handle: Artist, legend: Legend\n ) -> None: ...\n def adjust_drawing_area(\n self,\n legend: Legend,\n orig_handle: Artist,\n xdescent: float,\n ydescent: float,\n width: float,\n height: float,\n fontsize: float,\n ) -> tuple[float, float, float, float]: ...\n def legend_artist(\n self, legend: Legend, orig_handle: Artist, fontsize: float, handlebox: OffsetBox\n ) -> Artist: ...\n def create_artists(\n self,\n legend: Legend,\n orig_handle: Artist,\n xdescent: float,\n ydescent: float,\n width: float,\n height: float,\n fontsize: float,\n trans: Transform,\n ) -> Sequence[Artist]: ...\n\nclass HandlerNpoints(HandlerBase):\n def __init__(\n self, marker_pad: float = ..., numpoints: int | None = ..., **kwargs\n ) -> None: ...\n def get_numpoints(self, legend: Legend) -> int | None: ...\n def get_xdata(\n self,\n legend: Legend,\n xdescent: float,\n ydescent: float,\n width: float,\n height: float,\n fontsize: float,\n ) -> tuple[ArrayLike, ArrayLike]: ...\n\nclass HandlerNpointsYoffsets(HandlerNpoints):\n def __init__(\n self,\n numpoints: int | None = ...,\n yoffsets: Sequence[float] | None = ...,\n **kwargs\n ) -> None: ...\n def get_ydata(\n self,\n legend: Legend,\n xdescent: float,\n ydescent: float,\n width: float,\n height: float,\n fontsize: float,\n ) -> ArrayLike: ...\n\nclass HandlerLine2DCompound(HandlerNpoints):\n def create_artists(\n self,\n legend: Legend,\n orig_handle: Artist,\n 
        xdescent: float,
        ydescent: float,
        width: float,
        height: float,
        fontsize: float,
        trans: Transform,
    ) -> Sequence[Artist]: ...

class HandlerLine2D(HandlerNpoints):
    def create_artists(
        self,
        legend: Legend,
        orig_handle: Artist,
        xdescent: float,
        ydescent: float,
        width: float,
        height: float,
        fontsize: float,
        trans: Transform,
    ) -> Sequence[Artist]: ...

class HandlerPatch(HandlerBase):
    def __init__(self, patch_func: Callable | None = ..., **kwargs) -> None: ...
    def create_artists(
        self,
        legend: Legend,
        orig_handle: Artist,
        xdescent: float,
        ydescent: float,
        width: float,
        height: float,
        fontsize: float,
        trans: Transform,
    ) -> Sequence[Artist]: ...

class HandlerStepPatch(HandlerBase):
    def create_artists(
        self,
        legend: Legend,
        orig_handle: Artist,
        xdescent: float,
        ydescent: float,
        width: float,
        height: float,
        fontsize: float,
        trans: Transform,
    ) -> Sequence[Artist]: ...

class HandlerLineCollection(HandlerLine2D):
    def get_numpoints(self, legend: Legend) -> int: ...
    def create_artists(
        self,
        legend: Legend,
        orig_handle: Artist,
        xdescent: float,
        ydescent: float,
        width: float,
        height: float,
        fontsize: float,
        trans: Transform,
    ) -> Sequence[Artist]: ...

_T = TypeVar("_T", bound=Artist)

class HandlerRegularPolyCollection(HandlerNpointsYoffsets):
    def __init__(
        self,
        yoffsets: Sequence[float] | None = ...,
        sizes: Sequence[float] | None = ...,
        **kwargs
    ) -> None: ...
    def get_numpoints(self, legend: Legend) -> int: ...
    def get_sizes(
        self,
        legend: Legend,
        orig_handle: Artist,
        xdescent: float,
        ydescent: float,
        width: float,
        height: float,
        fontsize: float,
    ) -> Sequence[float]: ...
    def update_prop(
        self, legend_handle, orig_handle: Artist, legend: Legend
    ) -> None: ...
    def create_collection(
        self,
        orig_handle: _T,
        sizes: Sequence[float] | None,
        offsets: Sequence[float] | None,
        offset_transform: Transform,
    ) -> _T: ...
    def create_artists(
        self,
        legend: Legend,
        orig_handle: Artist,
        xdescent: float,
        ydescent: float,
        width: float,
        height: float,
        fontsize: float,
        trans: Transform,
    ) -> Sequence[Artist]: ...

class HandlerPathCollection(HandlerRegularPolyCollection):
    def create_collection(
        self,
        orig_handle: _T,
        sizes: Sequence[float] | None,
        offsets: Sequence[float] | None,
        offset_transform: Transform,
    ) -> _T: ...

class HandlerCircleCollection(HandlerRegularPolyCollection):
    def create_collection(
        self,
        orig_handle: _T,
        sizes: Sequence[float] | None,
        offsets: Sequence[float] | None,
        offset_transform: Transform,
    ) -> _T: ...

class HandlerErrorbar(HandlerLine2D):
    def __init__(
        self,
        xerr_size: float = ...,
        yerr_size: float | None = ...,
        marker_pad: float = ...,
        numpoints: int | None = ...,
        **kwargs
    ) -> None: ...
    def get_err_size(
        self,
        legend: Legend,
        xdescent: float,
        ydescent: float,
        width: float,
        height: float,
        fontsize: float,
    ) -> tuple[float, float]: ...
    def create_artists(
        self,
        legend: Legend,
        orig_handle: Artist,
        xdescent: float,
        ydescent: float,
        width: float,
        height: float,
        fontsize: float,
        trans: Transform,
    ) -> Sequence[Artist]: ...

class HandlerStem(HandlerNpointsYoffsets):
    def __init__(
        self,
        marker_pad: float = ...,
        numpoints: int | None = ...,
        bottom: float | None = ...,
        yoffsets: Sequence[float] | None = ...,
        **kwargs
    ) -> None: ...
    def get_ydata(
        self,
        legend: Legend,
        xdescent: float,
        ydescent: float,
        width: float,
        height: float,
        fontsize: float,
    ) -> ArrayLike: ...
    def create_artists(
        self,
        legend: Legend,
        orig_handle: Artist,
        xdescent: float,
        ydescent: float,
        width: float,
        height: float,
        fontsize: float,
        trans: Transform,
    ) -> Sequence[Artist]: ...

class HandlerTuple(HandlerBase):
    def __init__(
        self, ndivide: int | None = ..., pad: float | None = ..., **kwargs
    ) -> None: ...
    def create_artists(
        self,
        legend: Legend,
        orig_handle: Artist,
        xdescent: float,
        ydescent: float,
        width: float,
        height: float,
        fontsize: float,
        trans: Transform,
    ) -> Sequence[Artist]: ...

class HandlerPolyCollection(HandlerBase):
    def create_artists(
        self,
        legend: Legend,
        orig_handle: Artist,
        xdescent: float,
        ydescent: float,
        width: float,
        height: float,
        fontsize: float,
        trans: Transform,
    ) -> Sequence[Artist]: ...

| .venv\Lib\site-packages\matplotlib\legend_handler.pyi | legend_handler.pyi | Other | 7,655 | 0.85 | 0.170068 | 0.014545 | react-lib | 941 | 2023-08-15T09:56:32.894611 | GPL-3.0 | false | 1faf05ee5f07d5c5d1016596c4fc95d7 |
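The stub above declares `_T = TypeVar("_T", bound=Artist)` so that each `create_collection` override is annotated to return the same concrete artist type it receives, not just a generic `Artist`. A minimal sketch of that typing pattern, using stand-in classes (`Artist`, `PathCollection` here are placeholders, not the real matplotlib classes):

```python
from typing import TypeVar

class Artist:                   # stand-in for matplotlib.artist.Artist
    pass

class PathCollection(Artist):   # stand-in for a concrete collection type
    pass

_T = TypeVar("_T", bound=Artist)

def create_collection(orig_handle: _T) -> _T:
    # Returning an instance of type(orig_handle) preserves the concrete
    # subclass, which is what the `-> _T` annotation in the stub promises:
    # pass in a PathCollection, get a PathCollection back.
    return type(orig_handle)()

handle = create_collection(PathCollection())
print(type(handle).__name__)  # PathCollection
```

Because `_T` is bound to `Artist`, a type checker will also reject calls with non-`Artist` arguments while still narrowing the return type at each call site.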
"""\n2D lines with support for a variety of line styles, markers, colors, etc.\n"""\n\nimport copy\n\nfrom numbers import Integral, Number, Real\nimport logging\n\nimport numpy as np\n\nimport matplotlib as mpl\nfrom . import _api, cbook, colors as mcolors, _docstring\nfrom .artist import Artist, allow_rasterization\nfrom .cbook import (\n _to_unmasked_float_array, ls_mapper, ls_mapper_r, STEP_LOOKUP_MAP)\nfrom .markers import MarkerStyle\nfrom .path import Path\nfrom .transforms import Bbox, BboxTransformTo, TransformedPath\nfrom ._enums import JoinStyle, CapStyle\n\n# Imported here for backward compatibility, even though they don't\n# really belong.\nfrom . import _path\nfrom .markers import ( # noqa\n CARETLEFT, CARETRIGHT, CARETUP, CARETDOWN,\n CARETLEFTBASE, CARETRIGHTBASE, CARETUPBASE, CARETDOWNBASE,\n TICKLEFT, TICKRIGHT, TICKUP, TICKDOWN)\n\n_log = logging.getLogger(__name__)\n\n\ndef _get_dash_pattern(style):\n """Convert linestyle to dash pattern."""\n # go from short hand -> full strings\n if isinstance(style, str):\n style = ls_mapper.get(style, style)\n # un-dashed styles\n if style in ['solid', 'None']:\n offset = 0\n dashes = None\n # dashed styles\n elif style in ['dashed', 'dashdot', 'dotted']:\n offset = 0\n dashes = tuple(mpl.rcParams[f'lines.{style}_pattern'])\n #\n elif isinstance(style, tuple):\n offset, dashes = style\n if offset is None:\n raise ValueError(f'Unrecognized linestyle: {style!r}')\n else:\n raise ValueError(f'Unrecognized linestyle: {style!r}')\n\n # normalize offset to be positive and shorter than the dash cycle\n if dashes is not None:\n dsum = sum(dashes)\n if dsum:\n offset %= dsum\n\n return offset, dashes\n\n\ndef _get_dash_patterns(styles):\n """Convert linestyle or sequence of linestyles to list of dash patterns."""\n try:\n patterns = [_get_dash_pattern(styles)]\n except ValueError:\n try:\n patterns = [_get_dash_pattern(x) for x in styles]\n except ValueError as err:\n emsg = f'Do not know how to convert {styles!r} to 
dashes'\n raise ValueError(emsg) from err\n\n return patterns\n\n\ndef _get_inverse_dash_pattern(offset, dashes):\n """Return the inverse of the given dash pattern, for filling the gaps."""\n # Define the inverse pattern by moving the last gap to the start of the\n # sequence.\n gaps = dashes[-1:] + dashes[:-1]\n # Set the offset so that this new first segment is skipped\n # (see backend_bases.GraphicsContextBase.set_dashes for offset definition).\n offset_gaps = offset + dashes[-1]\n\n return offset_gaps, gaps\n\n\ndef _scale_dashes(offset, dashes, lw):\n if not mpl.rcParams['lines.scale_dashes']:\n return offset, dashes\n scaled_offset = offset * lw\n scaled_dashes = ([x * lw if x is not None else None for x in dashes]\n if dashes is not None else None)\n return scaled_offset, scaled_dashes\n\n\ndef segment_hits(cx, cy, x, y, radius):\n """\n Return the indices of the segments in the polyline with coordinates (*cx*,\n *cy*) that are within a distance *radius* of the point (*x*, *y*).\n """\n # Process single points specially\n if len(x) <= 1:\n res, = np.nonzero((cx - x) ** 2 + (cy - y) ** 2 <= radius ** 2)\n return res\n\n # We need to lop the last element off a lot.\n xr, yr = x[:-1], y[:-1]\n\n # Only look at line segments whose nearest point to C on the line\n # lies within the segment.\n dx, dy = x[1:] - xr, y[1:] - yr\n Lnorm_sq = dx ** 2 + dy ** 2 # Possibly want to eliminate Lnorm==0\n u = ((cx - xr) * dx + (cy - yr) * dy) / Lnorm_sq\n candidates = (u >= 0) & (u <= 1)\n\n # Note that there is a little area near one side of each point\n # which will be near neither segment, and another which will\n # be near both, depending on the angle of the lines. 
The\n # following radius test eliminates these ambiguities.\n point_hits = (cx - x) ** 2 + (cy - y) ** 2 <= radius ** 2\n candidates = candidates & ~(point_hits[:-1] | point_hits[1:])\n\n # For those candidates which remain, determine how far they lie away\n # from the line.\n px, py = xr + u * dx, yr + u * dy\n line_hits = (cx - px) ** 2 + (cy - py) ** 2 <= radius ** 2\n line_hits = line_hits & candidates\n points, = point_hits.ravel().nonzero()\n lines, = line_hits.ravel().nonzero()\n return np.concatenate((points, lines))\n\n\ndef _mark_every_path(markevery, tpath, affine, ax):\n """\n Helper function that sorts out how to deal the input\n `markevery` and returns the points where markers should be drawn.\n\n Takes in the `markevery` value and the line path and returns the\n sub-sampled path.\n """\n # pull out the two bits of data we want from the path\n codes, verts = tpath.codes, tpath.vertices\n\n def _slice_or_none(in_v, slc):\n """Helper function to cope with `codes` being an ndarray or `None`."""\n if in_v is None:\n return None\n return in_v[slc]\n\n # if just an int, assume starting at 0 and make a tuple\n if isinstance(markevery, Integral):\n markevery = (0, markevery)\n # if just a float, assume starting at 0.0 and make a tuple\n elif isinstance(markevery, Real):\n markevery = (0.0, markevery)\n\n if isinstance(markevery, tuple):\n if len(markevery) != 2:\n raise ValueError('`markevery` is a tuple but its len is not 2; '\n f'markevery={markevery}')\n start, step = markevery\n # if step is an int, old behavior\n if isinstance(step, Integral):\n # tuple of 2 int is for backwards compatibility,\n if not isinstance(start, Integral):\n raise ValueError(\n '`markevery` is a tuple with len 2 and second element is '\n 'an int, but the first element is not an int; '\n f'markevery={markevery}')\n # just return, we are done here\n\n return Path(verts[slice(start, None, step)],\n _slice_or_none(codes, slice(start, None, step)))\n\n elif isinstance(step, Real):\n 
            if not isinstance(start, Real):
                raise ValueError(
                    '`markevery` is a tuple with len 2 and second element is '
                    'a float, but the first element is not a float or an int; '
                    f'markevery={markevery}')
            if ax is None:
                raise ValueError(
                    "markevery is specified relative to the Axes size, but "
                    "the line does not have an Axes as parent")

            # calc cumulative distance along path (in display coords):
            fin = np.isfinite(verts).all(axis=1)
            fverts = verts[fin]
            disp_coords = affine.transform(fverts)

            delta = np.empty((len(disp_coords), 2))
            delta[0, :] = 0
            delta[1:, :] = disp_coords[1:, :] - disp_coords[:-1, :]
            delta = np.hypot(*delta.T).cumsum()
            # calc distance between markers along path based on the Axes
            # bounding box diagonal being a distance of unity:
            (x0, y0), (x1, y1) = ax.transAxes.transform([[0, 0], [1, 1]])
            scale = np.hypot(x1 - x0, y1 - y0)
            marker_delta = np.arange(start * scale, delta[-1], step * scale)
            # find closest actual data point that is closest to
            # the theoretical distance along the path:
            inds = np.abs(delta[np.newaxis, :] - marker_delta[:, np.newaxis])
            inds = inds.argmin(axis=1)
            inds = np.unique(inds)
            # return, we are done here
            return Path(fverts[inds], _slice_or_none(codes, inds))
        else:
            raise ValueError(
                f"markevery={markevery!r} is a tuple with len 2, but its "
                f"second element is not an int or a float")

    elif isinstance(markevery, slice):
        # mazel tov, it's already a slice, just return
        return Path(verts[markevery], _slice_or_none(codes, markevery))

    elif np.iterable(markevery):
        # fancy indexing
        try:
            return Path(verts[markevery], _slice_or_none(codes, markevery))
        except (ValueError, IndexError) as err:
            raise ValueError(
                f"markevery={markevery!r} is iterable but not a valid numpy "
                f"fancy index") from err
    else:
        raise ValueError(f"markevery={markevery!r} is not a recognized value")


@_docstring.interpd
@_api.define_aliases({
    "antialiased": ["aa"],
    "color": ["c"],
    "drawstyle": ["ds"],
    "linestyle": ["ls"],
    "linewidth": ["lw"],
    "markeredgecolor": ["mec"],
    "markeredgewidth": ["mew"],
    "markerfacecolor": ["mfc"],
    "markerfacecoloralt": ["mfcalt"],
    "markersize": ["ms"],
})
class Line2D(Artist):
    """
    A line - the line can have both a solid linestyle connecting all
    the vertices, and a marker at each vertex.  Additionally, the
    drawing of the solid line is influenced by the drawstyle, e.g., one
    can create "stepped" lines in various styles.
    """

    lineStyles = _lineStyles = {  # hidden names deprecated
        '-': '_draw_solid',
        '--': '_draw_dashed',
        '-.': '_draw_dash_dot',
        ':': '_draw_dotted',
        'None': '_draw_nothing',
        ' ': '_draw_nothing',
        '': '_draw_nothing',
    }

    _drawStyles_l = {
        'default': '_draw_lines',
        'steps-mid': '_draw_steps_mid',
        'steps-pre': '_draw_steps_pre',
        'steps-post': '_draw_steps_post',
    }

    _drawStyles_s = {
        'steps': '_draw_steps_pre',
    }

    # drawStyles should now be deprecated.
    drawStyles = {**_drawStyles_l, **_drawStyles_s}
    # Need a list ordered with long names first:
    drawStyleKeys = [*_drawStyles_l, *_drawStyles_s]

    # Referenced here to maintain API.  These are defined in
    # MarkerStyle
    markers = MarkerStyle.markers
    filled_markers = MarkerStyle.filled_markers
    fillStyles = MarkerStyle.fillstyles

    zorder = 2

    _subslice_optim_min_size = 1000

    def __str__(self):
        if self._label != "":
            return f"Line2D({self._label})"
        elif self._x is None:
            return "Line2D()"
        elif len(self._x) > 3:
            return "Line2D(({:g},{:g}),({:g},{:g}),...,({:g},{:g}))".format(
                self._x[0], self._y[0],
                self._x[1], self._y[1],
                self._x[-1], self._y[-1])
        else:
            return "Line2D(%s)" % ",".join(
                map("({:g},{:g})".format, self._x, self._y))

    def __init__(self, xdata, ydata, *,
                 linewidth=None,  # all Nones default to rc
                 linestyle=None,
                 color=None,
                 gapcolor=None,
                 marker=None,
                 markersize=None,
                 markeredgewidth=None,
                 markeredgecolor=None,
                 markerfacecolor=None,
                 markerfacecoloralt='none',
                 fillstyle=None,
                 antialiased=None,
                 dash_capstyle=None,
                 solid_capstyle=None,
                 dash_joinstyle=None,
                 solid_joinstyle=None,
                 pickradius=5,
                 drawstyle=None,
                 markevery=None,
                 **kwargs
                 ):
        """
        Create a `.Line2D` instance with *x* and *y* data in sequences of
        *xdata*, *ydata*.

        Additional keyword arguments are `.Line2D` properties:

        %(Line2D:kwdoc)s

        See :meth:`set_linestyle` for a description of the line styles,
        :meth:`set_marker` for a description of the markers, and
        :meth:`set_drawstyle` for a description of the draw styles.

        """
        super().__init__()

        # Convert sequences to NumPy arrays.
        if not np.iterable(xdata):
            raise RuntimeError('xdata must be a sequence')
        if not np.iterable(ydata):
            raise RuntimeError('ydata must be a sequence')

        if linewidth is None:
            linewidth = mpl.rcParams['lines.linewidth']

        if linestyle is None:
            linestyle = mpl.rcParams['lines.linestyle']
        if marker is None:
            marker = mpl.rcParams['lines.marker']
        if color is None:
            color = mpl.rcParams['lines.color']

        if markersize is None:
            markersize = mpl.rcParams['lines.markersize']
        if antialiased is None:
            antialiased = mpl.rcParams['lines.antialiased']
        if dash_capstyle is None:
            dash_capstyle = mpl.rcParams['lines.dash_capstyle']
        if dash_joinstyle is None:
            dash_joinstyle = mpl.rcParams['lines.dash_joinstyle']
        if solid_capstyle is None:
            solid_capstyle = mpl.rcParams['lines.solid_capstyle']
        if solid_joinstyle is None:
            solid_joinstyle = mpl.rcParams['lines.solid_joinstyle']

        if drawstyle is None:
            drawstyle = 'default'

        self._dashcapstyle = None
        self._dashjoinstyle = None
        self._solidjoinstyle = None
        self._solidcapstyle = None
        self.set_dash_capstyle(dash_capstyle)
        self.set_dash_joinstyle(dash_joinstyle)
        self.set_solid_capstyle(solid_capstyle)
        self.set_solid_joinstyle(solid_joinstyle)

        self._linestyles = None
        self._drawstyle = None
        self._linewidth = linewidth
        self._unscaled_dash_pattern = (0, None)  # offset, dash
        self._dash_pattern = (0, None)  # offset, dash (scaled by linewidth)

        self.set_linewidth(linewidth)
        self.set_linestyle(linestyle)
        self.set_drawstyle(drawstyle)

        self._color = None
        self.set_color(color)
        if marker is None:
            marker = 'none'  # Default.
        if not isinstance(marker, MarkerStyle):
            self._marker = MarkerStyle(marker, fillstyle)
        else:
            self._marker = marker

        self._gapcolor = None
        self.set_gapcolor(gapcolor)

        self._markevery = None
        self._markersize = None
        self._antialiased = None

        self.set_markevery(markevery)
        self.set_antialiased(antialiased)
        self.set_markersize(markersize)

        self._markeredgecolor = None
        self._markeredgewidth = None
        self._markerfacecolor = None
        self._markerfacecoloralt = None

        self.set_markerfacecolor(markerfacecolor)  # Normalizes None to rc.
        self.set_markerfacecoloralt(markerfacecoloralt)
        self.set_markeredgecolor(markeredgecolor)  # Normalizes None to rc.
        self.set_markeredgewidth(markeredgewidth)

        # update kwargs before updating data to give the caller a
        # chance to init axes (and hence unit support)
        self._internal_update(kwargs)
        self.pickradius = pickradius
        self.ind_offset = 0
        if (isinstance(self._picker, Number) and
                not isinstance(self._picker, bool)):
            self._pickradius = self._picker

        self._xorig = np.asarray([])
        self._yorig = np.asarray([])
        self._invalidx = True
        self._invalidy = True
        self._x = None
        self._y = None
        self._xy = None
        self._path = None
        self._transformed_path = None
        self._subslice = False
        self._x_filled = None  # used in subslicing; only x is needed

        self.set_data(xdata, ydata)

    def contains(self, mouseevent):
        """
        Test whether *mouseevent* occurred on the line.

        An event is deemed to have occurred "on" the line if it is less
        than ``self.pickradius`` (default: 5 points) away from it.  Use
        `~.Line2D.get_pickradius` or `~.Line2D.set_pickradius` to get or set
        the pick radius.

        Parameters
        ----------
        mouseevent : `~matplotlib.backend_bases.MouseEvent`

        Returns
        -------
        contains : bool
            Whether any values are within the radius.
        details : dict
            A dictionary ``{'ind': pointlist}``, where *pointlist* is a
            list of points of the line that are within the pickradius around
            the event position.

        TODO: sort returned indices by distance
        """
        if self._different_canvas(mouseevent):
            return False, {}

        # Make sure we have data to plot
        if self._invalidy or self._invalidx:
            self.recache()
        if len(self._xy) == 0:
            return False, {}

        # Convert points to pixels
        transformed_path = self._get_transformed_path()
        path, affine = transformed_path.get_transformed_path_and_affine()
        path = affine.transform_path(path)
        xy = path.vertices
        xt = xy[:, 0]
        yt = xy[:, 1]

        # Convert pick radius from points to pixels
        fig = self.get_figure(root=True)
        if fig is None:
            _log.warning('no figure set when check if mouse is on line')
            pixels = self._pickradius
        else:
            pixels = fig.dpi / 72. * self._pickradius

        # The math involved in checking for containment (here and inside of
        # segment_hits) assumes that it is OK to overflow, so temporarily set
        # the error flags accordingly.
        with np.errstate(all='ignore'):
            # Check for collision
            if self._linestyle in ['None', None]:
                # If no line, return the nearby point(s)
                ind, = np.nonzero(
                    (xt - mouseevent.x) ** 2 + (yt - mouseevent.y) ** 2
                    <= pixels ** 2)
            else:
                # If line, return the nearby segment(s)
                ind = segment_hits(mouseevent.x, mouseevent.y, xt, yt, pixels)
                if self._drawstyle.startswith("steps"):
                    ind //= 2

        ind += self.ind_offset

        # Return the point(s) within radius
        return len(ind) > 0, dict(ind=ind)

    def get_pickradius(self):
        """
        Return the pick radius used for containment tests.

        See `.contains` for more details.
        """
        return self._pickradius

    def set_pickradius(self, pickradius):
        """
        Set the pick radius used for containment tests.

        See `.contains` for more details.

        Parameters
        ----------
        pickradius : float
            Pick radius, in points.
        """
        if not isinstance(pickradius, Real) or pickradius < 0:
            raise ValueError("pick radius should be a distance")
        self._pickradius = pickradius

    pickradius = property(get_pickradius, set_pickradius)

    def get_fillstyle(self):
        """
        Return the marker fill style.

        See also `~.Line2D.set_fillstyle`.
        """
        return self._marker.get_fillstyle()

    def set_fillstyle(self, fs):
        """
        Set the marker fill style.

        Parameters
        ----------
        fs : {'full', 'left', 'right', 'bottom', 'top', 'none'}
            Possible values:

            - 'full': Fill the whole marker with the *markerfacecolor*.
            - 'left', 'right', 'bottom', 'top': Fill the marker half at
              the given side with the *markerfacecolor*.  The other
              half of the marker is filled with *markerfacecoloralt*.
            - 'none': No filling.

            For examples see :ref:`marker_fill_styles`.
        """
        self.set_marker(MarkerStyle(self._marker.get_marker(), fs))
        self.stale = True

    def set_markevery(self, every):
        """
        Set the markevery property to subsample the plot when using markers.

        e.g., if ``every=5``, every 5-th marker will be plotted.

        Parameters
        ----------
        every : None or int or (int, int) or slice or list[int] or float or \
(float, float) or list[bool]
            Which markers to plot.

            - ``every=None``: every point will be plotted.
            - ``every=N``: every N-th marker will be plotted starting with
              marker 0.
            - ``every=(start, N)``: every N-th marker, starting at index
              *start*, will be plotted.
            - ``every=slice(start, end, N)``: every N-th marker, starting at
              index *start*, up to but not including index *end*, will be
              plotted.
            - ``every=[i, j, m, ...]``: only markers at the given indices
              will be plotted.
            - ``every=[True, False, True, ...]``: only positions that are True
              will be plotted.  The list must have the same length as the data
              points.
            - ``every=0.1`` (i.e. a float): markers will be spaced at
              approximately equal visual distances along the line; the distance
              along the line between markers is determined by multiplying the
              display-coordinate distance of the Axes bounding-box diagonal
              by the value of *every*.
            - ``every=(0.5, 0.1)`` (i.e. a length-2 tuple of float): similar
              to ``every=0.1`` but the first marker will be offset along the
              line by 0.5 multiplied by the
              display-coordinate-diagonal-distance along the line.

            For examples see
            :doc:`/gallery/lines_bars_and_markers/markevery_demo`.

        Notes
        -----
        Setting *markevery* will still only draw markers at actual data points.
        While the float argument form aims for uniform visual spacing, it has
        to coerce from the ideal spacing to the nearest available data point.
        Depending on the number and distribution of data points, the result
        may still not look evenly spaced.

        When using a start offset to specify the first marker, the offset will
        be from the first data point, which may be different from the first
        visible data point if the plot is zoomed in.

        If zooming in on a plot when using float arguments, then the actual
        data points that have markers will change because the distance between
        markers is always determined from the display-coordinate
        axes-bounding-box diagonal, regardless of the actual axes data limits.

        """
        self._markevery = every
        self.stale = True

    def get_markevery(self):
        """
        Return the markevery setting for marker subsampling.

        See also `~.Line2D.set_markevery`.
        """
        return self._markevery

    def set_picker(self, p):
        """
        Set the event picker details for the line.

        Parameters
        ----------
        p : float or callable[[Artist, Event], tuple[bool, dict]]
            If a float, it is used as the pick radius in points.
        """
        if not callable(p):
            self.set_pickradius(p)
        self._picker = p

    def get_bbox(self):
        """Get the bounding box of this line."""
        bbox = Bbox([[0, 0], [0, 0]])
        bbox.update_from_data_xy(self.get_xydata())
        return bbox

    def get_window_extent(self, renderer=None):
        bbox = Bbox([[0, 0], [0, 0]])
        trans_data_to_xy = self.get_transform().transform
        bbox.update_from_data_xy(trans_data_to_xy(self.get_xydata()),
                                 ignore=True)
        # correct for marker size, if any
        if self._marker:
            ms = (self._markersize / 72.0 * self.get_figure(root=True).dpi) * 0.5
            bbox = bbox.padded(ms)
        return bbox

    def set_data(self, *args):
        """
        Set the x and y data.

        Parameters
        ----------
        *args : (2, N) array or two 1D arrays

        See Also
        --------
        set_xdata
        set_ydata
        """
        if len(args) == 1:
            (x, y), = args
        else:
            x, y = args

        self.set_xdata(x)
        self.set_ydata(y)

    def recache_always(self):
        self.recache(always=True)

    def recache(self, always=False):
        if always or self._invalidx:
            xconv = self.convert_xunits(self._xorig)
            x = _to_unmasked_float_array(xconv).ravel()
        else:
            x = self._x
        if always or self._invalidy:
            yconv = self.convert_yunits(self._yorig)
            y = _to_unmasked_float_array(yconv).ravel()
        else:
            y = self._y

        self._xy = np.column_stack(np.broadcast_arrays(x, y)).astype(float)
        self._x, self._y = self._xy.T  # views

        self._subslice = False
        if (self.axes
                and len(x) > self._subslice_optim_min_size
                and _path.is_sorted_and_has_non_nan(x)
                and self.axes.name == 'rectilinear'
                and self.axes.get_xscale() == 'linear'
                and self._markevery is None
                and self.get_clip_on()
                and self.get_transform() == self.axes.transData):
            self._subslice = True
            nanmask = np.isnan(x)
            if nanmask.any():
                self._x_filled = self._x.copy()
                indices = np.arange(len(x))
                self._x_filled[nanmask] = np.interp(
                    indices[nanmask], indices[~nanmask], self._x[~nanmask])
            else:
                self._x_filled = self._x

        if self._path is not None:
            interpolation_steps = self._path._interpolation_steps
        else:
            interpolation_steps = 1
        xy = STEP_LOOKUP_MAP[self._drawstyle](*self._xy.T)
        self._path = Path(np.asarray(xy).T,
                          _interpolation_steps=interpolation_steps)
        self._transformed_path = None
        self._invalidx = False
        self._invalidy = False

    def _transform_path(self, subslice=None):
        """
        Put a TransformedPath instance at self._transformed_path;
        all invalidation of the transform is then handled by the
        TransformedPath instance.
        """
        # Masked arrays are now handled by the Path class itself
        if subslice is not None:
            xy = STEP_LOOKUP_MAP[self._drawstyle](*self._xy[subslice, :].T)
            _path = Path(np.asarray(xy).T,
                         _interpolation_steps=self._path._interpolation_steps)
        else:
            _path = self._path
        self._transformed_path = TransformedPath(_path, self.get_transform())

    def _get_transformed_path(self):
        """Return this line's `~matplotlib.transforms.TransformedPath`."""
        if self._transformed_path is None:
            self._transform_path()
        return self._transformed_path

    def set_transform(self, t):
        # docstring inherited
        self._invalidx = True
        self._invalidy = True
        super().set_transform(t)

    @allow_rasterization
    def draw(self, renderer):
        # docstring inherited

        if not self.get_visible():
            return

        if self._invalidy or self._invalidx:
            self.recache()
        self.ind_offset = 0  # Needed for contains() method.
        if self._subslice and self.axes:
            x0, x1 = self.axes.get_xbound()
            i0 = self._x_filled.searchsorted(x0, 'left')
            i1 = self._x_filled.searchsorted(x1, 'right')
            subslice = slice(max(i0 - 1, 0), i1 + 1)
            self.ind_offset = subslice.start
            self._transform_path(subslice)
        else:
            subslice = None

        if self.get_path_effects():
            from matplotlib.patheffects import PathEffectRenderer
            renderer = PathEffectRenderer(self.get_path_effects(), renderer)

        renderer.open_group('line2d', self.get_gid())
        if self._lineStyles[self._linestyle] != '_draw_nothing':
            tpath, affine = (self._get_transformed_path()
                             .get_transformed_path_and_affine())
            if len(tpath.vertices):
                gc = renderer.new_gc()
                self._set_gc_clip(gc)
                gc.set_url(self.get_url())

                gc.set_antialiased(self._antialiased)
                gc.set_linewidth(self._linewidth)

                if self.is_dashed():
                    cap = self._dashcapstyle
                    join = self._dashjoinstyle
                else:
                    cap = self._solidcapstyle
                    join = self._solidjoinstyle
                gc.set_joinstyle(join)
                gc.set_capstyle(cap)
                gc.set_snap(self.get_snap())
                if self.get_sketch_params() is not None:
                    gc.set_sketch_params(*self.get_sketch_params())

                # We first draw a path within the gaps if needed.
                if self.is_dashed() and self._gapcolor is not None:
                    lc_rgba = mcolors.to_rgba(self._gapcolor, self._alpha)
                    gc.set_foreground(lc_rgba, isRGBA=True)

                    offset_gaps, gaps = _get_inverse_dash_pattern(
                        *self._dash_pattern)

                    gc.set_dashes(offset_gaps, gaps)
                    renderer.draw_path(gc, tpath, affine.frozen())

                lc_rgba = mcolors.to_rgba(self._color, self._alpha)
                gc.set_foreground(lc_rgba, isRGBA=True)

                gc.set_dashes(*self._dash_pattern)
                renderer.draw_path(gc, tpath, affine.frozen())
                gc.restore()

        if self._marker and self._markersize > 0:
            gc = renderer.new_gc()
            self._set_gc_clip(gc)
            gc.set_url(self.get_url())
            gc.set_linewidth(self._markeredgewidth)
            gc.set_antialiased(self._antialiased)

            ec_rgba = mcolors.to_rgba(
                self.get_markeredgecolor(), self._alpha)
            fc_rgba = mcolors.to_rgba(
                self._get_markerfacecolor(), self._alpha)
            fcalt_rgba = mcolors.to_rgba(
                self._get_markerfacecolor(alt=True), self._alpha)
            # If the edgecolor is "auto", it is set according to the *line*
            # color but inherits the alpha value of the *face* color, if any.
            if (cbook._str_equal(self._markeredgecolor, "auto")
                    and not cbook._str_lower_equal(
                        self.get_markerfacecolor(), "none")):
                ec_rgba = ec_rgba[:3] + (fc_rgba[3],)
            gc.set_foreground(ec_rgba, isRGBA=True)
            if self.get_sketch_params() is not None:
                scale, length, randomness = self.get_sketch_params()
                gc.set_sketch_params(scale/2, length/2, 2*randomness)

            marker = self._marker

            # Markers *must* be drawn ignoring the drawstyle (but don't pay the
            # recaching if drawstyle is already "default").
            if self.get_drawstyle() != "default":
                with cbook._setattr_cm(
                        self, _drawstyle="default", _transformed_path=None):
                    self.recache()
                    self._transform_path(subslice)
                    tpath, affine = (self._get_transformed_path()
                                     .get_transformed_points_and_affine())
            else:
                tpath, affine = (self._get_transformed_path()
                                 .get_transformed_points_and_affine())

            if len(tpath.vertices):
                # subsample the markers if markevery is not None
                markevery = self.get_markevery()
                if markevery is not None:
                    subsampled = _mark_every_path(
                        markevery, tpath, affine, self.axes)
                else:
                    subsampled = tpath

                snap = marker.get_snap_threshold()
                if isinstance(snap, Real):
                    snap = renderer.points_to_pixels(self._markersize) >= snap
                gc.set_snap(snap)
                gc.set_joinstyle(marker.get_joinstyle())
                gc.set_capstyle(marker.get_capstyle())
                marker_path = marker.get_path()
                marker_trans = marker.get_transform()
                w = renderer.points_to_pixels(self._markersize)

                if cbook._str_equal(marker.get_marker(), ","):
                    gc.set_linewidth(0)
                else:
                    # Don't scale for pixels, and don't stroke them
                    marker_trans = marker_trans.scale(w)
                renderer.draw_markers(gc, marker_path, marker_trans,
                                      subsampled, affine.frozen(),
                                      fc_rgba)

                alt_marker_path = marker.get_alt_path()
                if alt_marker_path:
                    alt_marker_trans = marker.get_alt_transform()
                    alt_marker_trans = alt_marker_trans.scale(w)
                    renderer.draw_markers(
                        gc, alt_marker_path, alt_marker_trans, subsampled,
                        affine.frozen(), fcalt_rgba)

            gc.restore()

        renderer.close_group('line2d')
        self.stale = False

    def get_antialiased(self):
        """Return whether antialiased rendering is used."""
        return self._antialiased

    def get_color(self):
        """
        Return the line color.

        See also `~.Line2D.set_color`.
        """
        return self._color

    def get_drawstyle(self):
        """
        Return the drawstyle.

        See also `~.Line2D.set_drawstyle`.
        """
        return self._drawstyle

    def get_gapcolor(self):
        """
        Return the line gapcolor.

        See also `~.Line2D.set_gapcolor`.
        """
        return self._gapcolor

    def get_linestyle(self):
        """
        Return the linestyle.

        See also `~.Line2D.set_linestyle`.
        """
        return self._linestyle

    def get_linewidth(self):
        """
        Return the linewidth in points.

        See also `~.Line2D.set_linewidth`.
        """
        return self._linewidth

    def get_marker(self):
        """
        Return the line marker.

        See also `~.Line2D.set_marker`.
        """
        return self._marker.get_marker()

    def get_markeredgecolor(self):
        """
        Return the marker edge color.

        See also `~.Line2D.set_markeredgecolor`.
        """
        mec = self._markeredgecolor
        if cbook._str_equal(mec, 'auto'):
            if mpl.rcParams['_internal.classic_mode']:
                if self._marker.get_marker() in ('.', ','):
                    return self._color
                if (self._marker.is_filled()
                        and self._marker.get_fillstyle() != 'none'):
                    return 'k'  # Bad hard-wired default...
            return self._color
        else:
            return mec

    def get_markeredgewidth(self):
        """
        Return the marker edge width in points.

        See also `~.Line2D.set_markeredgewidth`.
        """
        return self._markeredgewidth

    def _get_markerfacecolor(self, alt=False):
        if self._marker.get_fillstyle() == 'none':
            return 'none'
        fc = self._markerfacecoloralt if alt else self._markerfacecolor
        if cbook._str_lower_equal(fc, 'auto'):
            return self._color
        else:
            return fc

    def get_markerfacecolor(self):
        """
        Return the marker face color.

        See also `~.Line2D.set_markerfacecolor`.
        """
        return self._get_markerfacecolor(alt=False)

    def get_markerfacecoloralt(self):
        """
        Return the alternate marker face color.

        See also `~.Line2D.set_markerfacecoloralt`.
        """
        return self._get_markerfacecolor(alt=True)

    def get_markersize(self):
        """
        Return the marker size in points.

        See also `~.Line2D.set_markersize`.
        """
        return self._markersize

    def get_data(self, orig=True):
        """
        Return the line data as an ``(xdata, ydata)`` pair.

        If *orig* is *True*, return the original data.
        """
        return self.get_xdata(orig=orig), self.get_ydata(orig=orig)

    def get_xdata(self, orig=True):
        """
        Return the xdata.

        If *orig* is *True*, return the original data, else the
        processed data.
        """
        if orig:
            return self._xorig
        if self._invalidx:
            self.recache()
        return self._x

    def get_ydata(self, orig=True):
        """
        Return the ydata.

        If *orig* is *True*, return the original data, else the
        processed data.
        """
        if orig:
            return self._yorig
        if self._invalidy:
            self.recache()
        return self._y

    def get_path(self):
        """Return the `~matplotlib.path.Path` associated with this line."""
        if self._invalidy or self._invalidx:
            self.recache()
        return self._path

    def get_xydata(self):
        """Return the *xy* data as a (N, 2) array."""
        if self._invalidy or self._invalidx:
            self.recache()
        return self._xy

    def set_antialiased(self, b):
        """
        Set whether to use antialiased rendering.

        Parameters
        ----------
        b : bool
        """
        if self._antialiased != b:
            self.stale = True
        self._antialiased = b

    def set_color(self, color):
        """
        Set the color of the line.

        Parameters
        ----------
        color : :mpltype:`color`
        """
        mcolors._check_color_like(color=color)
        self._color = color
        self.stale = True

    def set_drawstyle(self, drawstyle):
        """
        Set the drawstyle of the plot.

        The drawstyle determines how the points are connected.

        Parameters
        ----------
        drawstyle : {'default', 'steps', 'steps-pre', 'steps-mid', \
'steps-post'}, default: 'default'
            For 'default', the points are connected with straight lines.

            The steps variants connect the points with step-like lines,
            i.e. horizontal lines with vertical steps.  They differ in the
            location of the step:

            - 'steps-pre': The step is at the beginning of the line segment,
              i.e. the line will be at the y-value of the point to the right.
            - 'steps-mid': The step is halfway between the points.
            - 'steps-post': The step is at the end of the line segment,
              i.e. the line will be at the y-value of the point to the left.
            - 'steps' is equal to 'steps-pre' and is maintained for
              backward-compatibility.

            For examples see :doc:`/gallery/lines_bars_and_markers/step_demo`.
        """
        if drawstyle is None:
            drawstyle = 'default'
        _api.check_in_list(self.drawStyles, drawstyle=drawstyle)
        if self._drawstyle != drawstyle:
            self.stale = True
            # invalidate to trigger a recache of the path
            self._invalidx = True
        self._drawstyle = drawstyle

    def set_gapcolor(self, gapcolor):
        """
        Set a color to fill the gaps in the dashed line style.

        .. note::

            Striped lines are created by drawing two interleaved dashed lines.
            There can be overlaps between those two, which may result in
            artifacts when using transparency.

            This functionality is experimental and may change.

        Parameters
        ----------
        gapcolor : :mpltype:`color` or None
            The color with which to fill the gaps.  If None, the gaps are
            unfilled.
        """
        if gapcolor is not None:
            mcolors._check_color_like(color=gapcolor)
        self._gapcolor = gapcolor
        self.stale = True

    def set_linewidth(self, w):
        """
        Set the line width in points.

        Parameters
        ----------
        w : float
            Line width, in points.
        """
        w = float(w)
        if self._linewidth != w:
            self.stale = True
            self._linewidth = w
            self._dash_pattern = _scale_dashes(*self._unscaled_dash_pattern, w)

    def set_linestyle(self, ls):
        """
        Set the linestyle of the line.

        Parameters
        ----------
        ls : {'-', '--', '-.', ':', '', (offset, on-off-seq), ...}
            Possible values:

            - A string:

              ========================================== =================
              linestyle                                  description
              ========================================== =================
              ``'-'`` or ``'solid'``                     solid line
              ``'--'`` or ``'dashed'``                   dashed line
              ``'-.'`` or ``'dashdot'``                  dash-dotted line
              ``':'`` or ``'dotted'``                    dotted line
              ``'none'``, ``'None'``, ``' '``, or ``''`` draw nothing
              ========================================== =================

            - Alternatively a dash tuple of the following form can be
              provided::

                  (offset, onoffseq)

              where ``onoffseq`` is an even length tuple of on and off ink
              in points.  See also :meth:`set_dashes`.

            For examples see :doc:`/gallery/lines_bars_and_markers/linestyles`.
        """
        if isinstance(ls, str):
            if ls in [' ', '', 'none']:
                ls = 'None'
            _api.check_in_list([*self._lineStyles, *ls_mapper_r], ls=ls)
            if ls not in self._lineStyles:
                ls = ls_mapper_r[ls]
            self._linestyle = ls
        else:
            self._linestyle = '--'
        self._unscaled_dash_pattern = _get_dash_pattern(ls)
        self._dash_pattern = _scale_dashes(
            *self._unscaled_dash_pattern, self._linewidth)
        self.stale = True

    @_docstring.interpd
    def set_marker(self, marker):
        """
        Set the line marker.

        Parameters
        ----------
        marker : marker style string, `~.path.Path` or `~.markers.MarkerStyle`
            See `~matplotlib.markers` for full description of possible
            arguments.
        """
        self._marker = MarkerStyle(marker, self._marker.get_fillstyle())
        self.stale = True

    def _set_markercolor(self, name, has_rcdefault, val):
        if val is None:
            val = mpl.rcParams[f"lines.{name}"] if has_rcdefault else "auto"
        attr = f"_{name}"
        current = getattr(self, attr)
        if current is None:
            self.stale = True
        else:
            neq = current != val
            # Much faster than `np.any(current != val)` if no arrays are used.
            if neq.any() if isinstance(neq, np.ndarray) else neq:
                self.stale = True
        setattr(self, attr, val)

    def set_markeredgecolor(self, ec):
        """
        Set the marker edge color.

        Parameters
        ----------
        ec : :mpltype:`color`
        """
        self._set_markercolor("markeredgecolor", True, ec)

    def set_markerfacecolor(self, fc):
        """
        Set the marker face color.

        Parameters
        ----------
        fc : :mpltype:`color`
        """
        self._set_markercolor("markerfacecolor", True, fc)

    def set_markerfacecoloralt(self, fc):
        """
        Set the alternate marker face color.

        Parameters
        ----------
        fc :
:mpltype:`color`\n """\n self._set_markercolor("markerfacecoloralt", False, fc)\n\n def set_markeredgewidth(self, ew):\n """\n Set the marker edge width in points.\n\n Parameters\n ----------\n ew : float\n Marker edge width, in points.\n """\n if ew is None:\n ew = mpl.rcParams['lines.markeredgewidth']\n if self._markeredgewidth != ew:\n self.stale = True\n self._markeredgewidth = ew\n\n def set_markersize(self, sz):\n """\n Set the marker size in points.\n\n Parameters\n ----------\n sz : float\n Marker size, in points.\n """\n sz = float(sz)\n if self._markersize != sz:\n self.stale = True\n self._markersize = sz\n\n def set_xdata(self, x):\n """\n Set the data array for x.\n\n Parameters\n ----------\n x : 1D array\n\n See Also\n --------\n set_data\n set_ydata\n """\n if not np.iterable(x):\n raise RuntimeError('x must be a sequence')\n self._xorig = copy.copy(x)\n self._invalidx = True\n self.stale = True\n\n def set_ydata(self, y):\n """\n Set the data array for y.\n\n Parameters\n ----------\n y : 1D array\n\n See Also\n --------\n set_data\n set_xdata\n """\n if not np.iterable(y):\n raise RuntimeError('y must be a sequence')\n self._yorig = copy.copy(y)\n self._invalidy = True\n self.stale = True\n\n def set_dashes(self, seq):\n """\n Set the dash sequence.\n\n The dash sequence is a sequence of floats of even length describing\n the length of dashes and spaces in points.\n\n For example, (5, 2, 1, 2) describes a sequence of 5 point and 1 point\n dashes separated by 2 point spaces.\n\n See also `~.Line2D.set_gapcolor`, which allows those spaces to be\n filled with a color.\n\n Parameters\n ----------\n seq : sequence of floats (on/off ink in points) or (None, None)\n If *seq* is empty or ``(None, None)``, the linestyle will be set\n to solid.\n """\n if seq == (None, None) or len(seq) == 0:\n self.set_linestyle('-')\n else:\n self.set_linestyle((0, seq))\n\n def update_from(self, other):\n """Copy properties from *other* to self."""\n 
super().update_from(other)\n self._linestyle = other._linestyle\n self._linewidth = other._linewidth\n self._color = other._color\n self._gapcolor = other._gapcolor\n self._markersize = other._markersize\n self._markerfacecolor = other._markerfacecolor\n self._markerfacecoloralt = other._markerfacecoloralt\n self._markeredgecolor = other._markeredgecolor\n self._markeredgewidth = other._markeredgewidth\n self._unscaled_dash_pattern = other._unscaled_dash_pattern\n self._dash_pattern = other._dash_pattern\n self._dashcapstyle = other._dashcapstyle\n self._dashjoinstyle = other._dashjoinstyle\n self._solidcapstyle = other._solidcapstyle\n self._solidjoinstyle = other._solidjoinstyle\n\n self._marker = MarkerStyle(marker=other._marker)\n self._drawstyle = other._drawstyle\n\n @_docstring.interpd\n def set_dash_joinstyle(self, s):\n """\n How to join segments of the line if it `~Line2D.is_dashed`.\n\n The default joinstyle is :rc:`lines.dash_joinstyle`.\n\n Parameters\n ----------\n s : `.JoinStyle` or %(JoinStyle)s\n """\n js = JoinStyle(s)\n if self._dashjoinstyle != js:\n self.stale = True\n self._dashjoinstyle = js\n\n @_docstring.interpd\n def set_solid_joinstyle(self, s):\n """\n How to join segments if the line is solid (not `~Line2D.is_dashed`).\n\n The default joinstyle is :rc:`lines.solid_joinstyle`.\n\n Parameters\n ----------\n s : `.JoinStyle` or %(JoinStyle)s\n """\n js = JoinStyle(s)\n if self._solidjoinstyle != js:\n self.stale = True\n self._solidjoinstyle = js\n\n def get_dash_joinstyle(self):\n """\n Return the `.JoinStyle` for dashed lines.\n\n See also `~.Line2D.set_dash_joinstyle`.\n """\n return self._dashjoinstyle.name\n\n def get_solid_joinstyle(self):\n """\n Return the `.JoinStyle` for solid lines.\n\n See also `~.Line2D.set_solid_joinstyle`.\n """\n return self._solidjoinstyle.name\n\n @_docstring.interpd\n def set_dash_capstyle(self, s):\n """\n How to draw the end caps if the line is 
`~Line2D.is_dashed`.\n\n The default capstyle is :rc:`lines.dash_capstyle`.\n\n Parameters\n ----------\n s : `.CapStyle` or %(CapStyle)s\n """\n cs = CapStyle(s)\n if self._dashcapstyle != cs:\n self.stale = True\n self._dashcapstyle = cs\n\n @_docstring.interpd\n def set_solid_capstyle(self, s):\n """\n How to draw the end caps if the line is solid (not `~Line2D.is_dashed`)\n\n The default capstyle is :rc:`lines.solid_capstyle`.\n\n Parameters\n ----------\n s : `.CapStyle` or %(CapStyle)s\n """\n cs = CapStyle(s)\n if self._solidcapstyle != cs:\n self.stale = True\n self._solidcapstyle = cs\n\n def get_dash_capstyle(self):\n """\n Return the `.CapStyle` for dashed lines.\n\n See also `~.Line2D.set_dash_capstyle`.\n """\n return self._dashcapstyle.name\n\n def get_solid_capstyle(self):\n """\n Return the `.CapStyle` for solid lines.\n\n See also `~.Line2D.set_solid_capstyle`.\n """\n return self._solidcapstyle.name\n\n def is_dashed(self):\n """\n Return whether line has a dashed linestyle.\n\n A custom linestyle is assumed to be dashed, we do not inspect the\n ``onoffseq`` directly.\n\n See also `~.Line2D.set_linestyle`.\n """\n return self._linestyle in ('--', '-.', ':')\n\n\nclass AxLine(Line2D):\n """\n A helper class that implements `~.Axes.axline`, by recomputing the artist\n transform at draw time.\n """\n\n def __init__(self, xy1, xy2, slope, **kwargs):\n """\n Parameters\n ----------\n xy1 : (float, float)\n The first set of (x, y) coordinates for the line to pass through.\n xy2 : (float, float) or None\n The second set of (x, y) coordinates for the line to pass through.\n Both *xy2* and *slope* must be passed, but one of them must be None.\n slope : float or None\n The slope of the line. 
Both *xy2* and *slope* must be passed, but one of\n them must be None.\n """\n super().__init__([0, 1], [0, 1], **kwargs)\n\n if (xy2 is None and slope is None or\n xy2 is not None and slope is not None):\n raise TypeError(\n "Exactly one of 'xy2' and 'slope' must be given")\n\n self._slope = slope\n self._xy1 = xy1\n self._xy2 = xy2\n\n def get_transform(self):\n ax = self.axes\n points_transform = self._transform - ax.transData + ax.transScale\n\n if self._xy2 is not None:\n # two points were given\n (x1, y1), (x2, y2) = \\n points_transform.transform([self._xy1, self._xy2])\n dx = x2 - x1\n dy = y2 - y1\n if dx == 0:\n if dy == 0:\n raise ValueError(\n f"Cannot draw a line through two identical points "\n f"(x={(x1, x2)}, y={(y1, y2)})")\n slope = np.inf\n else:\n slope = dy / dx\n else:\n # one point and a slope were given\n x1, y1 = points_transform.transform(self._xy1)\n slope = self._slope\n (vxlo, vylo), (vxhi, vyhi) = ax.transScale.transform(ax.viewLim)\n # General case: find intersections with view limits in either\n # direction, and draw between the middle two points.\n if slope == 0:\n start = vxlo, y1\n stop = vxhi, y1\n elif np.isinf(slope):\n start = x1, vylo\n stop = x1, vyhi\n else:\n _, start, stop, _ = sorted([\n (vxlo, y1 + (vxlo - x1) * slope),\n (vxhi, y1 + (vxhi - x1) * slope),\n (x1 + (vylo - y1) / slope, vylo),\n (x1 + (vyhi - y1) / slope, vyhi),\n ])\n return (BboxTransformTo(Bbox([start, stop]))\n + ax.transLimits + ax.transAxes)\n\n def draw(self, renderer):\n self._transformed_path = None # Force regen.\n super().draw(renderer)\n\n def get_xy1(self):\n """Return the *xy1* value of the line."""\n return self._xy1\n\n def get_xy2(self):\n """Return the *xy2* value of the line."""\n return self._xy2\n\n def get_slope(self):\n """Return the *slope* value of the line."""\n return self._slope\n\n def set_xy1(self, *args, **kwargs):\n """\n Set the *xy1* value of the line.\n\n Parameters\n ----------\n xy1 : tuple[float, float]\n Points for 
the line to pass through.\n """\n params = _api.select_matching_signature([\n lambda self, x, y: locals(), lambda self, xy1: locals(),\n ], self, *args, **kwargs)\n if "x" in params:\n _api.warn_deprecated("3.10", message=(\n "Passing x and y separately to AxLine.set_xy1 is deprecated since "\n "%(since)s; pass them as a single tuple instead."))\n xy1 = params["x"], params["y"]\n else:\n xy1 = params["xy1"]\n self._xy1 = xy1\n\n def set_xy2(self, *args, **kwargs):\n """\n Set the *xy2* value of the line.\n\n .. note::\n\n You can only set *xy2* if the line was created using the *xy2*\n parameter. If the line was created using *slope*, please use\n `~.AxLine.set_slope`.\n\n Parameters\n ----------\n xy2 : tuple[float, float]\n Points for the line to pass through.\n """\n if self._slope is None:\n params = _api.select_matching_signature([\n lambda self, x, y: locals(), lambda self, xy2: locals(),\n ], self, *args, **kwargs)\n if "x" in params:\n _api.warn_deprecated("3.10", message=(\n "Passing x and y separately to AxLine.set_xy2 is deprecated since "\n "%(since)s; pass them as a single tuple instead."))\n xy2 = params["x"], params["y"]\n else:\n xy2 = params["xy2"]\n self._xy2 = xy2\n else:\n raise ValueError("Cannot set an 'xy2' value while 'slope' is set;"\n " they differ but their functionalities overlap")\n\n def set_slope(self, slope):\n """\n Set the *slope* value of the line.\n\n .. note::\n\n You can only set *slope* if the line was created using the *slope*\n parameter. 
If the line was created using *xy2*, please use\n `~.AxLine.set_xy2`.\n\n Parameters\n ----------\n slope : float\n The slope of the line.\n """\n if self._xy2 is None:\n self._slope = slope\n else:\n raise ValueError("Cannot set a 'slope' value while 'xy2' is set;"\n " they differ but their functionalities overlap")\n\n\nclass VertexSelector:\n """\n Manage the callbacks to maintain a list of selected vertices for `.Line2D`.\n Derived classes should override the `process_selected` method to do\n something with the picks.\n\n Here is an example which highlights the selected verts with red circles::\n\n import numpy as np\n import matplotlib.pyplot as plt\n import matplotlib.lines as lines\n\n class HighlightSelected(lines.VertexSelector):\n def __init__(self, line, fmt='ro', **kwargs):\n super().__init__(line)\n self.markers, = self.axes.plot([], [], fmt, **kwargs)\n\n def process_selected(self, ind, xs, ys):\n self.markers.set_data(xs, ys)\n self.canvas.draw()\n\n fig, ax = plt.subplots()\n x, y = np.random.rand(2, 30)\n line, = ax.plot(x, y, 'bs-', picker=5)\n\n selector = HighlightSelected(line)\n plt.show()\n """\n\n def __init__(self, line):\n """\n Parameters\n ----------\n line : `~matplotlib.lines.Line2D`\n The line must already have been added to an `~.axes.Axes` and must\n have its picker property set.\n """\n if line.axes is None:\n raise RuntimeError('You must first add the line to the Axes')\n if line.get_picker() is None:\n raise RuntimeError('You must first set the picker property '\n 'of the line')\n self.axes = line.axes\n self.line = line\n self.cid = self.canvas.callbacks._connect_picklable(\n 'pick_event', self.onpick)\n self.ind = set()\n\n canvas = property(lambda self: self.axes.get_figure(root=True).canvas)\n\n def process_selected(self, ind, xs, ys):\n """\n Default "do nothing" implementation of the `process_selected` method.\n\n Parameters\n ----------\n ind : list of int\n The indices of the selected vertices.\n xs, ys : array-like\n The 
coordinates of the selected vertices.\n """\n pass\n\n def onpick(self, event):\n """When the line is picked, update the set of selected indices."""\n if event.artist is not self.line:\n return\n self.ind ^= set(event.ind)\n ind = sorted(self.ind)\n xdata, ydata = self.line.get_data()\n self.process_selected(ind, xdata[ind], ydata[ind])\n\n\nlineStyles = Line2D._lineStyles\nlineMarkers = MarkerStyle.markers\ndrawStyles = Line2D.drawStyles\nfillStyles = MarkerStyle.fillstyles\n | .venv\Lib\site-packages\matplotlib\lines.py | lines.py | Python | 57,920 | 0.75 | 0.155814 | 0.052851 | node-utils | 538 | 2025-02-13T09:32:58.910618 | MIT | false | e32e2304e54e618b06d592ba3578d822 |
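As a minimal sketch of how the `Line2D` dash API defined above fits together (assuming matplotlib is importable; the Agg backend avoids any display requirement): `set_dashes` with a non-empty sequence routes through `set_linestyle((0, seq))`, so a custom dash pattern reports its linestyle as `'--'`, while an empty sequence (or `(None, None)`) resets the line to solid.

```python
import matplotlib
matplotlib.use("Agg")  # headless backend; no display needed
import matplotlib.pyplot as plt

fig, ax = plt.subplots()
line, = ax.plot([0, 1, 2], [0, 1, 0])

# A dash sequence of on/off ink lengths in points; set_dashes()
# forwards this to set_linestyle((0, seq)), so the line counts as dashed.
line.set_dashes([5, 2, 1, 2])
print(line.get_linestyle())  # custom dash tuples report as '--'
print(line.is_dashed())

# An empty sequence resets the linestyle to solid.
line.set_dashes(())
print(line.get_linestyle())
```

Note that `is_dashed` only inspects the linestyle string, not the dash sequence itself, which matches the docstring above.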
from .artist import Artist\nfrom .axes import Axes\nfrom .backend_bases import MouseEvent, FigureCanvasBase\nfrom .path import Path\nfrom .transforms import Bbox\n\nfrom collections.abc import Callable, Sequence\nfrom typing import Any, Literal, overload\nfrom .typing import (\n ColorType,\n DrawStyleType,\n FillStyleType,\n LineStyleType,\n CapStyleType,\n JoinStyleType,\n MarkEveryType,\n MarkerType,\n)\nfrom numpy.typing import ArrayLike\n\ndef segment_hits(\n cx: ArrayLike, cy: ArrayLike, x: ArrayLike, y: ArrayLike, radius: ArrayLike\n) -> ArrayLike: ...\n\nclass Line2D(Artist):\n lineStyles: dict[str, str]\n drawStyles: dict[str, str]\n drawStyleKeys: list[str]\n markers: dict[str | int, str]\n filled_markers: tuple[str, ...]\n fillStyles: tuple[str, ...]\n zorder: float\n ind_offset: float\n def __init__(\n self,\n xdata: ArrayLike,\n ydata: ArrayLike,\n *,\n linewidth: float | None = ...,\n linestyle: LineStyleType | None = ...,\n color: ColorType | None = ...,\n gapcolor: ColorType | None = ...,\n marker: MarkerType | None = ...,\n markersize: float | None = ...,\n markeredgewidth: float | None = ...,\n markeredgecolor: ColorType | None = ...,\n markerfacecolor: ColorType | None = ...,\n markerfacecoloralt: ColorType = ...,\n fillstyle: FillStyleType | None = ...,\n antialiased: bool | None = ...,\n dash_capstyle: CapStyleType | None = ...,\n solid_capstyle: CapStyleType | None = ...,\n dash_joinstyle: JoinStyleType | None = ...,\n solid_joinstyle: JoinStyleType | None = ...,\n pickradius: float = ...,\n drawstyle: DrawStyleType | None = ...,\n markevery: MarkEveryType | None = ...,\n **kwargs\n ) -> None: ...\n def contains(self, mouseevent: MouseEvent) -> tuple[bool, dict]: ...\n def get_pickradius(self) -> float: ...\n def set_pickradius(self, pickradius: float) -> None: ...\n pickradius: float\n def get_fillstyle(self) -> FillStyleType: ...\n stale: bool\n def set_fillstyle(self, fs: FillStyleType) -> None: ...\n def set_markevery(self, every: 
MarkEveryType) -> None: ...\n def get_markevery(self) -> MarkEveryType: ...\n def set_picker(\n self, p: None | bool | float | Callable[[Artist, MouseEvent], tuple[bool, dict]]\n ) -> None: ...\n def get_bbox(self) -> Bbox: ...\n @overload\n def set_data(self, args: ArrayLike) -> None: ...\n @overload\n def set_data(self, x: ArrayLike, y: ArrayLike) -> None: ...\n def recache_always(self) -> None: ...\n def recache(self, always: bool = ...) -> None: ...\n def get_antialiased(self) -> bool: ...\n def get_color(self) -> ColorType: ...\n def get_drawstyle(self) -> DrawStyleType: ...\n def get_gapcolor(self) -> ColorType: ...\n def get_linestyle(self) -> LineStyleType: ...\n def get_linewidth(self) -> float: ...\n def get_marker(self) -> MarkerType: ...\n def get_markeredgecolor(self) -> ColorType: ...\n def get_markeredgewidth(self) -> float: ...\n def get_markerfacecolor(self) -> ColorType: ...\n def get_markerfacecoloralt(self) -> ColorType: ...\n def get_markersize(self) -> float: ...\n def get_data(self, orig: bool = ...) -> tuple[ArrayLike, ArrayLike]: ...\n def get_xdata(self, orig: bool = ...) -> ArrayLike: ...\n def get_ydata(self, orig: bool = ...) 
-> ArrayLike: ...\n def get_path(self) -> Path: ...\n def get_xydata(self) -> ArrayLike: ...\n def set_antialiased(self, b: bool) -> None: ...\n def set_color(self, color: ColorType) -> None: ...\n def set_drawstyle(self, drawstyle: DrawStyleType | None) -> None: ...\n def set_gapcolor(self, gapcolor: ColorType | None) -> None: ...\n def set_linewidth(self, w: float) -> None: ...\n def set_linestyle(self, ls: LineStyleType) -> None: ...\n def set_marker(self, marker: MarkerType) -> None: ...\n def set_markeredgecolor(self, ec: ColorType | None) -> None: ...\n def set_markerfacecolor(self, fc: ColorType | None) -> None: ...\n def set_markerfacecoloralt(self, fc: ColorType | None) -> None: ...\n def set_markeredgewidth(self, ew: float | None) -> None: ...\n def set_markersize(self, sz: float) -> None: ...\n def set_xdata(self, x: ArrayLike) -> None: ...\n def set_ydata(self, y: ArrayLike) -> None: ...\n def set_dashes(self, seq: Sequence[float] | tuple[None, None]) -> None: ...\n def update_from(self, other: Artist) -> None: ...\n def set_dash_joinstyle(self, s: JoinStyleType) -> None: ...\n def set_solid_joinstyle(self, s: JoinStyleType) -> None: ...\n def get_dash_joinstyle(self) -> Literal["miter", "round", "bevel"]: ...\n def get_solid_joinstyle(self) -> Literal["miter", "round", "bevel"]: ...\n def set_dash_capstyle(self, s: CapStyleType) -> None: ...\n def set_solid_capstyle(self, s: CapStyleType) -> None: ...\n def get_dash_capstyle(self) -> Literal["butt", "projecting", "round"]: ...\n def get_solid_capstyle(self) -> Literal["butt", "projecting", "round"]: ...\n def is_dashed(self) -> bool: ...\n\nclass AxLine(Line2D):\n def __init__(\n self,\n xy1: tuple[float, float],\n xy2: tuple[float, float] | None,\n slope: float | None,\n **kwargs\n ) -> None: ...\n def get_xy1(self) -> tuple[float, float] | None: ...\n def get_xy2(self) -> tuple[float, float] | None: ...\n def get_slope(self) -> float: ...\n def set_xy1(self, xy1: tuple[float, float]) -> None: ...\n 
def set_xy2(self, xy2: tuple[float, float]) -> None: ...\n def set_slope(self, slope: float) -> None: ...\n\nclass VertexSelector:\n axes: Axes\n line: Line2D\n cid: int\n ind: set[int]\n def __init__(self, line: Line2D) -> None: ...\n @property\n def canvas(self) -> FigureCanvasBase: ...\n def process_selected(\n self, ind: Sequence[int], xs: ArrayLike, ys: ArrayLike\n ) -> None: ...\n def onpick(self, event: Any) -> None: ...\n\nlineStyles: dict[str, str]\nlineMarkers: dict[str | int, str]\ndrawStyles: dict[str, str]\nfillStyles: tuple[FillStyleType, ...]\n | .venv\Lib\site-packages\matplotlib\lines.pyi | lines.pyi | Other | 6,081 | 0.85 | 0.464052 | 0.020408 | node-utils | 613 | 2023-08-21T17:14:19.974260 | BSD-3-Clause | false | dea06c319496e32ab8cc5c613ccef3b3 |
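The stub above declares two `@overload`s for `Line2D.set_data`: a single `(xdata, ydata)` argument, or separate *x* and *y* arguments. A short sketch of both call forms at runtime (assuming matplotlib is importable; Agg backend so no display is needed):

```python
import matplotlib
matplotlib.use("Agg")  # headless backend
import matplotlib.pyplot as plt

fig, ax = plt.subplots()
line, = ax.plot([0, 1], [0, 1])

# Overload 1: a single (xdata, ydata) pair.
line.set_data(([1, 2, 3], [4, 5, 6]))

# Overload 2: separate x and y arguments.
line.set_data([1, 2, 3], [9, 8, 7])

# get_data(orig=True) returns the original (unprocessed) arrays.
x, y = line.get_data()
```

Both forms store a copy of the input and invalidate the cached path, so the next draw recomputes it.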
r"""\nFunctions to handle markers; used by the marker functionality of\n`~matplotlib.axes.Axes.plot`, `~matplotlib.axes.Axes.scatter`, and\n`~matplotlib.axes.Axes.errorbar`.\n\nAll possible markers are defined here:\n\n============================== ====== =========================================\nmarker symbol description\n============================== ====== =========================================\n``"."`` |m00| point\n``","`` |m01| pixel\n``"o"`` |m02| circle\n``"v"`` |m03| triangle_down\n``"^"`` |m04| triangle_up\n``"<"`` |m05| triangle_left\n``">"`` |m06| triangle_right\n``"1"`` |m07| tri_down\n``"2"`` |m08| tri_up\n``"3"`` |m09| tri_left\n``"4"`` |m10| tri_right\n``"8"`` |m11| octagon\n``"s"`` |m12| square\n``"p"`` |m13| pentagon\n``"P"`` |m23| plus (filled)\n``"*"`` |m14| star\n``"h"`` |m15| hexagon1\n``"H"`` |m16| hexagon2\n``"+"`` |m17| plus\n``"x"`` |m18| x\n``"X"`` |m24| x (filled)\n``"D"`` |m19| diamond\n``"d"`` |m20| thin_diamond\n``"|"`` |m21| vline\n``"_"`` |m22| hline\n``0`` (``TICKLEFT``) |m25| tickleft\n``1`` (``TICKRIGHT``) |m26| tickright\n``2`` (``TICKUP``) |m27| tickup\n``3`` (``TICKDOWN``) |m28| tickdown\n``4`` (``CARETLEFT``) |m29| caretleft\n``5`` (``CARETRIGHT``) |m30| caretright\n``6`` (``CARETUP``) |m31| caretup\n``7`` (``CARETDOWN``) |m32| caretdown\n``8`` (``CARETLEFTBASE``) |m33| caretleft (centered at base)\n``9`` (``CARETRIGHTBASE``) |m34| caretright (centered at base)\n``10`` (``CARETUPBASE``) |m35| caretup (centered at base)\n``11`` (``CARETDOWNBASE``) |m36| caretdown (centered at base)\n``"none"`` or ``"None"`` nothing\n``" "`` or ``""`` nothing\n``"$...$"`` |m37| Render the string using mathtext.\n E.g ``"$f$"`` for marker showing the\n letter ``f``.\n``verts`` A list of (x, y) pairs used for Path\n vertices. 
The center of the marker is\n located at (0, 0) and the size is\n normalized, such that the created path\n is encapsulated inside the unit cell.\n``path`` A `~matplotlib.path.Path` instance.\n``(numsides, 0, angle)`` A regular polygon with ``numsides``\n sides, rotated by ``angle``.\n``(numsides, 1, angle)`` A star-like symbol with ``numsides``\n sides, rotated by ``angle``.\n``(numsides, 2, angle)`` An asterisk with ``numsides`` sides,\n rotated by ``angle``.\n============================== ====== =========================================\n\nNote that special symbols can be defined via the\n:ref:`STIX math font <mathtext>`,\ne.g. ``"$\u266B$"``. For an overview of the STIX font symbols refer to the\n`STIX font table <http://www.stixfonts.org/allGlyphs.html>`_.\nAlso see the :doc:`/gallery/text_labels_and_annotations/stix_fonts_demo`.\n\nInteger numbers from ``0`` to ``11`` create lines and triangles. Those are\nequally accessible via capitalized variables, like ``CARETDOWNBASE``.\nHence the following are equivalent::\n\n plt.plot([1, 2, 3], marker=11)\n plt.plot([1, 2, 3], marker=matplotlib.markers.CARETDOWNBASE)\n\nMarker join and cap styles can be customized by creating a new instance of\nMarkerStyle.\nA MarkerStyle can also have a custom `~matplotlib.transforms.Transform`\nallowing it to be arbitrarily rotated or offset.\n\nExamples showing the use of markers:\n\n* :doc:`/gallery/lines_bars_and_markers/marker_reference`\n* :doc:`/gallery/lines_bars_and_markers/scatter_star_poly`\n* :doc:`/gallery/lines_bars_and_markers/multivariate_marker_plot`\n\n.. |m00| image:: /_static/markers/m00.png\n.. |m01| image:: /_static/markers/m01.png\n.. |m02| image:: /_static/markers/m02.png\n.. |m03| image:: /_static/markers/m03.png\n.. |m04| image:: /_static/markers/m04.png\n.. |m05| image:: /_static/markers/m05.png\n.. |m06| image:: /_static/markers/m06.png\n.. |m07| image:: /_static/markers/m07.png\n.. |m08| image:: /_static/markers/m08.png\n.. 
|m09| image:: /_static/markers/m09.png\n.. |m10| image:: /_static/markers/m10.png\n.. |m11| image:: /_static/markers/m11.png\n.. |m12| image:: /_static/markers/m12.png\n.. |m13| image:: /_static/markers/m13.png\n.. |m14| image:: /_static/markers/m14.png\n.. |m15| image:: /_static/markers/m15.png\n.. |m16| image:: /_static/markers/m16.png\n.. |m17| image:: /_static/markers/m17.png\n.. |m18| image:: /_static/markers/m18.png\n.. |m19| image:: /_static/markers/m19.png\n.. |m20| image:: /_static/markers/m20.png\n.. |m21| image:: /_static/markers/m21.png\n.. |m22| image:: /_static/markers/m22.png\n.. |m23| image:: /_static/markers/m23.png\n.. |m24| image:: /_static/markers/m24.png\n.. |m25| image:: /_static/markers/m25.png\n.. |m26| image:: /_static/markers/m26.png\n.. |m27| image:: /_static/markers/m27.png\n.. |m28| image:: /_static/markers/m28.png\n.. |m29| image:: /_static/markers/m29.png\n.. |m30| image:: /_static/markers/m30.png\n.. |m31| image:: /_static/markers/m31.png\n.. |m32| image:: /_static/markers/m32.png\n.. |m33| image:: /_static/markers/m33.png\n.. |m34| image:: /_static/markers/m34.png\n.. |m35| image:: /_static/markers/m35.png\n.. |m36| image:: /_static/markers/m36.png\n.. |m37| image:: /_static/markers/m37.png\n"""\nimport copy\n\nfrom collections.abc import Sized\n\nimport numpy as np\n\nimport matplotlib as mpl\nfrom . import _api, cbook\nfrom .path import Path\nfrom .transforms import IdentityTransform, Affine2D\nfrom ._enums import JoinStyle, CapStyle\n\n# special-purpose marker identifiers:\n(TICKLEFT, TICKRIGHT, TICKUP, TICKDOWN,\n CARETLEFT, CARETRIGHT, CARETUP, CARETDOWN,\n CARETLEFTBASE, CARETRIGHTBASE, CARETUPBASE, CARETDOWNBASE) = range(12)\n\n_empty_path = Path(np.empty((0, 2)))\n\n\nclass MarkerStyle:\n """\n A class representing marker types.\n\n Instances are immutable. 
If you need to change anything, create a new\n instance.\n\n Attributes\n ----------\n markers : dict\n All known markers.\n filled_markers : tuple\n All known filled markers. This is a subset of *markers*.\n fillstyles : tuple\n The supported fillstyles.\n """\n\n markers = {\n '.': 'point',\n ',': 'pixel',\n 'o': 'circle',\n 'v': 'triangle_down',\n '^': 'triangle_up',\n '<': 'triangle_left',\n '>': 'triangle_right',\n '1': 'tri_down',\n '2': 'tri_up',\n '3': 'tri_left',\n '4': 'tri_right',\n '8': 'octagon',\n 's': 'square',\n 'p': 'pentagon',\n '*': 'star',\n 'h': 'hexagon1',\n 'H': 'hexagon2',\n '+': 'plus',\n 'x': 'x',\n 'D': 'diamond',\n 'd': 'thin_diamond',\n '|': 'vline',\n '_': 'hline',\n 'P': 'plus_filled',\n 'X': 'x_filled',\n TICKLEFT: 'tickleft',\n TICKRIGHT: 'tickright',\n TICKUP: 'tickup',\n TICKDOWN: 'tickdown',\n CARETLEFT: 'caretleft',\n CARETRIGHT: 'caretright',\n CARETUP: 'caretup',\n CARETDOWN: 'caretdown',\n CARETLEFTBASE: 'caretleftbase',\n CARETRIGHTBASE: 'caretrightbase',\n CARETUPBASE: 'caretupbase',\n CARETDOWNBASE: 'caretdownbase',\n "None": 'nothing',\n "none": 'nothing',\n ' ': 'nothing',\n '': 'nothing'\n }\n\n # Just used for informational purposes. 
is_filled()\n # is calculated in the _set_* functions.\n filled_markers = (\n '.', 'o', 'v', '^', '<', '>', '8', 's', 'p', '*', 'h', 'H', 'D', 'd',\n 'P', 'X')\n\n fillstyles = ('full', 'left', 'right', 'bottom', 'top', 'none')\n _half_fillstyles = ('left', 'right', 'bottom', 'top')\n\n def __init__(self, marker,\n fillstyle=None, transform=None, capstyle=None, joinstyle=None):\n """\n Parameters\n ----------\n marker : str, array-like, Path, MarkerStyle\n - Another instance of `MarkerStyle` copies the details of that *marker*.\n - For other possible marker values, see the module docstring\n `matplotlib.markers`.\n\n fillstyle : str, default: :rc:`markers.fillstyle`\n One of 'full', 'left', 'right', 'bottom', 'top', 'none'.\n\n transform : `~matplotlib.transforms.Transform`, optional\n Transform that will be combined with the native transform of the\n marker.\n\n capstyle : `.CapStyle` or %(CapStyle)s, optional\n Cap style that will override the default cap style of the marker.\n\n joinstyle : `.JoinStyle` or %(JoinStyle)s, optional\n Join style that will override the default join style of the marker.\n """\n self._marker_function = None\n self._user_transform = transform\n self._user_capstyle = CapStyle(capstyle) if capstyle is not None else None\n self._user_joinstyle = JoinStyle(joinstyle) if joinstyle is not None else None\n self._set_fillstyle(fillstyle)\n self._set_marker(marker)\n\n def _recache(self):\n if self._marker_function is None:\n return\n self._path = _empty_path\n self._transform = IdentityTransform()\n self._alt_path = None\n self._alt_transform = None\n self._snap_threshold = None\n self._joinstyle = JoinStyle.round\n self._capstyle = self._user_capstyle or CapStyle.butt\n # Initial guess: Assume the marker is filled unless the fillstyle is\n # set to 'none'. 
The marker function will override this for unfilled\n # markers.\n self._filled = self._fillstyle != 'none'\n self._marker_function()\n\n def __bool__(self):\n return bool(len(self._path.vertices))\n\n def is_filled(self):\n return self._filled\n\n def get_fillstyle(self):\n return self._fillstyle\n\n def _set_fillstyle(self, fillstyle):\n """\n Set the fillstyle.\n\n Parameters\n ----------\n fillstyle : {'full', 'left', 'right', 'bottom', 'top', 'none'}\n The part of the marker surface that is colored with\n markerfacecolor.\n """\n if fillstyle is None:\n fillstyle = mpl.rcParams['markers.fillstyle']\n _api.check_in_list(self.fillstyles, fillstyle=fillstyle)\n self._fillstyle = fillstyle\n\n def get_joinstyle(self):\n return self._joinstyle.name\n\n def get_capstyle(self):\n return self._capstyle.name\n\n def get_marker(self):\n return self._marker\n\n def _set_marker(self, marker):\n """\n Set the marker.\n\n Parameters\n ----------\n marker : str, array-like, Path, MarkerStyle\n - Another instance of `MarkerStyle` copies the details of that *marker*.\n - For other possible marker values see the module docstring\n `matplotlib.markers`.\n """\n if isinstance(marker, str) and cbook.is_math_text(marker):\n self._marker_function = self._set_mathtext_path\n elif isinstance(marker, (int, str)) and marker in self.markers:\n self._marker_function = getattr(self, '_set_' + self.markers[marker])\n elif (isinstance(marker, np.ndarray) and marker.ndim == 2 and\n marker.shape[1] == 2):\n self._marker_function = self._set_vertices\n elif isinstance(marker, Path):\n self._marker_function = self._set_path_marker\n elif (isinstance(marker, Sized) and len(marker) in (2, 3) and\n marker[1] in (0, 1, 2)):\n self._marker_function = self._set_tuple_marker\n elif isinstance(marker, MarkerStyle):\n self.__dict__ = copy.deepcopy(marker.__dict__)\n else:\n try:\n Path(marker)\n self._marker_function = self._set_vertices\n except ValueError as err:\n raise ValueError(\n f'Unrecognized 
marker style {marker!r}') from err\n\n if not isinstance(marker, MarkerStyle):\n self._marker = marker\n self._recache()\n\n def get_path(self):\n """\n Return a `.Path` for the primary part of the marker.\n\n For unfilled markers this is the whole marker, for filled markers,\n this is the area to be drawn with *markerfacecolor*.\n """\n return self._path\n\n def get_transform(self):\n """\n Return the transform to be applied to the `.Path` from\n `MarkerStyle.get_path()`.\n """\n if self._user_transform is None:\n return self._transform.frozen()\n else:\n return (self._transform + self._user_transform).frozen()\n\n def get_alt_path(self):\n """\n Return a `.Path` for the alternate part of the marker.\n\n For unfilled markers, this is *None*; for filled markers, this is the\n area to be drawn with *markerfacecoloralt*.\n """\n return self._alt_path\n\n def get_alt_transform(self):\n """\n Return the transform to be applied to the `.Path` from\n `MarkerStyle.get_alt_path()`.\n """\n if self._user_transform is None:\n return self._alt_transform.frozen()\n else:\n return (self._alt_transform + self._user_transform).frozen()\n\n def get_snap_threshold(self):\n return self._snap_threshold\n\n def get_user_transform(self):\n """Return user supplied part of marker transform."""\n if self._user_transform is not None:\n return self._user_transform.frozen()\n\n def transformed(self, transform):\n """\n Return a new version of this marker with the transform applied.\n\n Parameters\n ----------\n transform : `~matplotlib.transforms.Affine2D`\n Transform will be combined with current user supplied transform.\n """\n new_marker = MarkerStyle(self)\n if new_marker._user_transform is not None:\n new_marker._user_transform += transform\n else:\n new_marker._user_transform = transform\n return new_marker\n\n def rotated(self, *, deg=None, rad=None):\n """\n Return a new version of this marker rotated by specified angle.\n\n Parameters\n ----------\n deg : float, optional\n Rotation 
angle in degrees.\n\n rad : float, optional\n Rotation angle in radians.\n\n .. note:: You must specify exactly one of deg or rad.\n """\n if deg is None and rad is None:\n raise ValueError('One of deg or rad is required')\n if deg is not None and rad is not None:\n raise ValueError('Only one of deg and rad can be supplied')\n new_marker = MarkerStyle(self)\n if new_marker._user_transform is None:\n new_marker._user_transform = Affine2D()\n\n if deg is not None:\n new_marker._user_transform.rotate_deg(deg)\n if rad is not None:\n new_marker._user_transform.rotate(rad)\n\n return new_marker\n\n def scaled(self, sx, sy=None):\n """\n Return new marker scaled by specified scale factors.\n\n If *sy* is not given, the same scale is applied in both the *x*- and\n *y*-directions.\n\n Parameters\n ----------\n sx : float\n *X*-direction scaling factor.\n sy : float, optional\n *Y*-direction scaling factor.\n """\n if sy is None:\n sy = sx\n\n new_marker = MarkerStyle(self)\n _transform = new_marker._user_transform or Affine2D()\n new_marker._user_transform = _transform.scale(sx, sy)\n return new_marker\n\n def _set_nothing(self):\n self._filled = False\n\n def _set_custom_marker(self, path):\n rescale = np.max(np.abs(path.vertices)) # max of x's and y's.\n self._transform = Affine2D().scale(0.5 / rescale)\n self._path = path\n\n def _set_path_marker(self):\n self._set_custom_marker(self._marker)\n\n def _set_vertices(self):\n self._set_custom_marker(Path(self._marker))\n\n def _set_tuple_marker(self):\n marker = self._marker\n if len(marker) == 2:\n numsides, rotation = marker[0], 0.0\n elif len(marker) == 3:\n numsides, rotation = marker[0], marker[2]\n symstyle = marker[1]\n if symstyle == 0:\n self._path = Path.unit_regular_polygon(numsides)\n self._joinstyle = self._user_joinstyle or JoinStyle.miter\n elif symstyle == 1:\n self._path = Path.unit_regular_star(numsides)\n self._joinstyle = self._user_joinstyle or JoinStyle.bevel\n elif symstyle == 2:\n self._path = 
Path.unit_regular_asterisk(numsides)\n self._filled = False\n self._joinstyle = self._user_joinstyle or JoinStyle.bevel\n else:\n raise ValueError(f"Unexpected tuple marker: {marker}")\n self._transform = Affine2D().scale(0.5).rotate_deg(rotation)\n\n def _set_mathtext_path(self):\n """\n Draw mathtext markers '$...$' using `.TextPath` object.\n\n Submitted by tcb\n """\n from matplotlib.text import TextPath\n\n # again, the properties could be initialised just once outside\n # this function\n text = TextPath(xy=(0, 0), s=self.get_marker(),\n usetex=mpl.rcParams['text.usetex'])\n if len(text.vertices) == 0:\n return\n\n bbox = text.get_extents()\n max_dim = max(bbox.width, bbox.height)\n self._transform = (\n Affine2D()\n .translate(-bbox.xmin + 0.5 * -bbox.width, -bbox.ymin + 0.5 * -bbox.height)\n .scale(1.0 / max_dim))\n self._path = text\n self._snap = False\n\n def _half_fill(self):\n return self.get_fillstyle() in self._half_fillstyles\n\n def _set_circle(self, size=1.0):\n self._transform = Affine2D().scale(0.5 * size)\n self._snap_threshold = np.inf\n if not self._half_fill():\n self._path = Path.unit_circle()\n else:\n self._path = self._alt_path = Path.unit_circle_righthalf()\n fs = self.get_fillstyle()\n self._transform.rotate_deg(\n {'right': 0, 'top': 90, 'left': 180, 'bottom': 270}[fs])\n self._alt_transform = self._transform.frozen().rotate_deg(180.)\n\n def _set_point(self):\n self._set_circle(size=0.5)\n\n def _set_pixel(self):\n self._path = Path.unit_rectangle()\n # Ideally, you'd want -0.5, -0.5 here, but then the snapping\n # algorithm in the Agg backend will round this to a 2x2\n # rectangle from (-1, -1) to (1, 1). By offsetting it\n # slightly, we can force it to be (0, 0) to (1, 1), which both\n # makes it only be a single pixel and places it correctly\n # aligned to 1-width stroking (i.e. the ticks). 
This hack is\n # the best of a number of bad alternatives, mainly because the\n # backends are not aware of what marker is actually being used\n # beyond just its path data.\n self._transform = Affine2D().translate(-0.49999, -0.49999)\n self._snap_threshold = None\n\n _triangle_path = Path._create_closed([[0, 1], [-1, -1], [1, -1]])\n # Going down halfway looks to small. Golden ratio is too far.\n _triangle_path_u = Path._create_closed([[0, 1], [-3/5, -1/5], [3/5, -1/5]])\n _triangle_path_d = Path._create_closed(\n [[-3/5, -1/5], [3/5, -1/5], [1, -1], [-1, -1]])\n _triangle_path_l = Path._create_closed([[0, 1], [0, -1], [-1, -1]])\n _triangle_path_r = Path._create_closed([[0, 1], [0, -1], [1, -1]])\n\n def _set_triangle(self, rot, skip):\n self._transform = Affine2D().scale(0.5).rotate_deg(rot)\n self._snap_threshold = 5.0\n\n if not self._half_fill():\n self._path = self._triangle_path\n else:\n mpaths = [self._triangle_path_u,\n self._triangle_path_l,\n self._triangle_path_d,\n self._triangle_path_r]\n\n fs = self.get_fillstyle()\n if fs == 'top':\n self._path = mpaths[(0 + skip) % 4]\n self._alt_path = mpaths[(2 + skip) % 4]\n elif fs == 'bottom':\n self._path = mpaths[(2 + skip) % 4]\n self._alt_path = mpaths[(0 + skip) % 4]\n elif fs == 'left':\n self._path = mpaths[(1 + skip) % 4]\n self._alt_path = mpaths[(3 + skip) % 4]\n else:\n self._path = mpaths[(3 + skip) % 4]\n self._alt_path = mpaths[(1 + skip) % 4]\n\n self._alt_transform = self._transform\n\n self._joinstyle = self._user_joinstyle or JoinStyle.miter\n\n def _set_triangle_up(self):\n return self._set_triangle(0.0, 0)\n\n def _set_triangle_down(self):\n return self._set_triangle(180.0, 2)\n\n def _set_triangle_left(self):\n return self._set_triangle(90.0, 3)\n\n def _set_triangle_right(self):\n return self._set_triangle(270.0, 1)\n\n def _set_square(self):\n self._transform = Affine2D().translate(-0.5, -0.5)\n self._snap_threshold = 2.0\n if not self._half_fill():\n self._path = 
Path.unit_rectangle()\n else:\n # Build a bottom filled square out of two rectangles, one filled.\n self._path = Path([[0.0, 0.0], [1.0, 0.0], [1.0, 0.5],\n [0.0, 0.5], [0.0, 0.0]])\n self._alt_path = Path([[0.0, 0.5], [1.0, 0.5], [1.0, 1.0],\n [0.0, 1.0], [0.0, 0.5]])\n fs = self.get_fillstyle()\n rotate = {'bottom': 0, 'right': 90, 'top': 180, 'left': 270}[fs]\n self._transform.rotate_deg(rotate)\n self._alt_transform = self._transform\n\n self._joinstyle = self._user_joinstyle or JoinStyle.miter\n\n def _set_diamond(self):\n self._transform = Affine2D().translate(-0.5, -0.5).rotate_deg(45)\n self._snap_threshold = 5.0\n if not self._half_fill():\n self._path = Path.unit_rectangle()\n else:\n self._path = Path([[0, 0], [1, 0], [1, 1], [0, 0]])\n self._alt_path = Path([[0, 0], [0, 1], [1, 1], [0, 0]])\n fs = self.get_fillstyle()\n rotate = {'right': 0, 'top': 90, 'left': 180, 'bottom': 270}[fs]\n self._transform.rotate_deg(rotate)\n self._alt_transform = self._transform\n self._joinstyle = self._user_joinstyle or JoinStyle.miter\n\n def _set_thin_diamond(self):\n self._set_diamond()\n self._transform.scale(0.6, 1.0)\n\n def _set_pentagon(self):\n self._transform = Affine2D().scale(0.5)\n self._snap_threshold = 5.0\n\n polypath = Path.unit_regular_polygon(5)\n\n if not self._half_fill():\n self._path = polypath\n else:\n verts = polypath.vertices\n y = (1 + np.sqrt(5)) / 4.\n top = Path(verts[[0, 1, 4, 0]])\n bottom = Path(verts[[1, 2, 3, 4, 1]])\n left = Path([verts[0], verts[1], verts[2], [0, -y], verts[0]])\n right = Path([verts[0], verts[4], verts[3], [0, -y], verts[0]])\n self._path, self._alt_path = {\n 'top': (top, bottom), 'bottom': (bottom, top),\n 'left': (left, right), 'right': (right, left),\n }[self.get_fillstyle()]\n self._alt_transform = self._transform\n\n self._joinstyle = self._user_joinstyle or JoinStyle.miter\n\n def _set_star(self):\n self._transform = Affine2D().scale(0.5)\n self._snap_threshold = 5.0\n\n polypath = Path.unit_regular_star(5, 
innerCircle=0.381966)\n\n if not self._half_fill():\n self._path = polypath\n else:\n verts = polypath.vertices\n top = Path(np.concatenate([verts[0:4], verts[7:10], verts[0:1]]))\n bottom = Path(np.concatenate([verts[3:8], verts[3:4]]))\n left = Path(np.concatenate([verts[0:6], verts[0:1]]))\n right = Path(np.concatenate([verts[0:1], verts[5:10], verts[0:1]]))\n self._path, self._alt_path = {\n 'top': (top, bottom), 'bottom': (bottom, top),\n 'left': (left, right), 'right': (right, left),\n }[self.get_fillstyle()]\n self._alt_transform = self._transform\n\n self._joinstyle = self._user_joinstyle or JoinStyle.bevel\n\n def _set_hexagon1(self):\n self._transform = Affine2D().scale(0.5)\n self._snap_threshold = None\n\n polypath = Path.unit_regular_polygon(6)\n\n if not self._half_fill():\n self._path = polypath\n else:\n verts = polypath.vertices\n # not drawing inside lines\n x = np.abs(np.cos(5 * np.pi / 6.))\n top = Path(np.concatenate([[(-x, 0)], verts[[1, 0, 5]], [(x, 0)]]))\n bottom = Path(np.concatenate([[(-x, 0)], verts[2:5], [(x, 0)]]))\n left = Path(verts[0:4])\n right = Path(verts[[0, 5, 4, 3]])\n self._path, self._alt_path = {\n 'top': (top, bottom), 'bottom': (bottom, top),\n 'left': (left, right), 'right': (right, left),\n }[self.get_fillstyle()]\n self._alt_transform = self._transform\n\n self._joinstyle = self._user_joinstyle or JoinStyle.miter\n\n def _set_hexagon2(self):\n self._transform = Affine2D().scale(0.5).rotate_deg(30)\n self._snap_threshold = None\n\n polypath = Path.unit_regular_polygon(6)\n\n if not self._half_fill():\n self._path = polypath\n else:\n verts = polypath.vertices\n # not drawing inside lines\n x, y = np.sqrt(3) / 4, 3 / 4.\n top = Path(verts[[1, 0, 5, 4, 1]])\n bottom = Path(verts[1:5])\n left = Path(np.concatenate([\n [(x, y)], verts[:3], [(-x, -y), (x, y)]]))\n right = Path(np.concatenate([\n [(x, y)], verts[5:2:-1], [(-x, -y)]]))\n self._path, self._alt_path = {\n 'top': (top, bottom), 'bottom': (bottom, top),\n 'left': 
(left, right), 'right': (right, left),\n }[self.get_fillstyle()]\n self._alt_transform = self._transform\n\n self._joinstyle = self._user_joinstyle or JoinStyle.miter\n\n def _set_octagon(self):\n self._transform = Affine2D().scale(0.5)\n self._snap_threshold = 5.0\n\n polypath = Path.unit_regular_polygon(8)\n\n if not self._half_fill():\n self._transform.rotate_deg(22.5)\n self._path = polypath\n else:\n x = np.sqrt(2.) / 4.\n self._path = self._alt_path = Path(\n [[0, -1], [0, 1], [-x, 1], [-1, x],\n [-1, -x], [-x, -1], [0, -1]])\n fs = self.get_fillstyle()\n self._transform.rotate_deg(\n {'left': 0, 'bottom': 90, 'right': 180, 'top': 270}[fs])\n self._alt_transform = self._transform.frozen().rotate_deg(180.0)\n\n self._joinstyle = self._user_joinstyle or JoinStyle.miter\n\n _line_marker_path = Path([[0.0, -1.0], [0.0, 1.0]])\n\n def _set_vline(self):\n self._transform = Affine2D().scale(0.5)\n self._snap_threshold = 1.0\n self._filled = False\n self._path = self._line_marker_path\n\n def _set_hline(self):\n self._set_vline()\n self._transform = self._transform.rotate_deg(90)\n\n _tickhoriz_path = Path([[0.0, 0.0], [1.0, 0.0]])\n\n def _set_tickleft(self):\n self._transform = Affine2D().scale(-1.0, 1.0)\n self._snap_threshold = 1.0\n self._filled = False\n self._path = self._tickhoriz_path\n\n def _set_tickright(self):\n self._transform = Affine2D().scale(1.0, 1.0)\n self._snap_threshold = 1.0\n self._filled = False\n self._path = self._tickhoriz_path\n\n _tickvert_path = Path([[-0.0, 0.0], [-0.0, 1.0]])\n\n def _set_tickup(self):\n self._transform = Affine2D().scale(1.0, 1.0)\n self._snap_threshold = 1.0\n self._filled = False\n self._path = self._tickvert_path\n\n def _set_tickdown(self):\n self._transform = Affine2D().scale(1.0, -1.0)\n self._snap_threshold = 1.0\n self._filled = False\n self._path = self._tickvert_path\n\n _tri_path = Path([[0.0, 0.0], [0.0, -1.0],\n [0.0, 0.0], [0.8, 0.5],\n [0.0, 0.0], [-0.8, 0.5]],\n [Path.MOVETO, Path.LINETO,\n 
Path.MOVETO, Path.LINETO,\n Path.MOVETO, Path.LINETO])\n\n def _set_tri_down(self):\n self._transform = Affine2D().scale(0.5)\n self._snap_threshold = 5.0\n self._filled = False\n self._path = self._tri_path\n\n def _set_tri_up(self):\n self._set_tri_down()\n self._transform = self._transform.rotate_deg(180)\n\n def _set_tri_left(self):\n self._set_tri_down()\n self._transform = self._transform.rotate_deg(270)\n\n def _set_tri_right(self):\n self._set_tri_down()\n self._transform = self._transform.rotate_deg(90)\n\n _caret_path = Path([[-1.0, 1.5], [0.0, 0.0], [1.0, 1.5]])\n\n def _set_caretdown(self):\n self._transform = Affine2D().scale(0.5)\n self._snap_threshold = 3.0\n self._filled = False\n self._path = self._caret_path\n self._joinstyle = self._user_joinstyle or JoinStyle.miter\n\n def _set_caretup(self):\n self._set_caretdown()\n self._transform = self._transform.rotate_deg(180)\n\n def _set_caretleft(self):\n self._set_caretdown()\n self._transform = self._transform.rotate_deg(270)\n\n def _set_caretright(self):\n self._set_caretdown()\n self._transform = self._transform.rotate_deg(90)\n\n _caret_path_base = Path([[-1.0, 0.0], [0.0, -1.5], [1.0, 0]])\n\n def _set_caretdownbase(self):\n self._set_caretdown()\n self._path = self._caret_path_base\n\n def _set_caretupbase(self):\n self._set_caretdownbase()\n self._transform = self._transform.rotate_deg(180)\n\n def _set_caretleftbase(self):\n self._set_caretdownbase()\n self._transform = self._transform.rotate_deg(270)\n\n def _set_caretrightbase(self):\n self._set_caretdownbase()\n self._transform = self._transform.rotate_deg(90)\n\n _plus_path = Path([[-1.0, 0.0], [1.0, 0.0],\n [0.0, -1.0], [0.0, 1.0]],\n [Path.MOVETO, Path.LINETO,\n Path.MOVETO, Path.LINETO])\n\n def _set_plus(self):\n self._transform = Affine2D().scale(0.5)\n self._snap_threshold = 1.0\n self._filled = False\n self._path = self._plus_path\n\n _x_path = Path([[-1.0, -1.0], [1.0, 1.0],\n [-1.0, 1.0], [1.0, -1.0]],\n [Path.MOVETO, 
Path.LINETO,\n Path.MOVETO, Path.LINETO])\n\n def _set_x(self):\n self._transform = Affine2D().scale(0.5)\n self._snap_threshold = 3.0\n self._filled = False\n self._path = self._x_path\n\n _plus_filled_path = Path._create_closed(np.array([\n (-1, -3), (+1, -3), (+1, -1), (+3, -1), (+3, +1), (+1, +1),\n (+1, +3), (-1, +3), (-1, +1), (-3, +1), (-3, -1), (-1, -1)]) / 6)\n _plus_filled_path_t = Path._create_closed(np.array([\n (+3, 0), (+3, +1), (+1, +1), (+1, +3),\n (-1, +3), (-1, +1), (-3, +1), (-3, 0)]) / 6)\n\n def _set_plus_filled(self):\n self._transform = Affine2D()\n self._snap_threshold = 5.0\n self._joinstyle = self._user_joinstyle or JoinStyle.miter\n if not self._half_fill():\n self._path = self._plus_filled_path\n else:\n # Rotate top half path to support all partitions\n self._path = self._alt_path = self._plus_filled_path_t\n fs = self.get_fillstyle()\n self._transform.rotate_deg(\n {'top': 0, 'left': 90, 'bottom': 180, 'right': 270}[fs])\n self._alt_transform = self._transform.frozen().rotate_deg(180)\n\n _x_filled_path = Path._create_closed(np.array([\n (-1, -2), (0, -1), (+1, -2), (+2, -1), (+1, 0), (+2, +1),\n (+1, +2), (0, +1), (-1, +2), (-2, +1), (-1, 0), (-2, -1)]) / 4)\n _x_filled_path_t = Path._create_closed(np.array([\n (+1, 0), (+2, +1), (+1, +2), (0, +1),\n (-1, +2), (-2, +1), (-1, 0)]) / 4)\n\n def _set_x_filled(self):\n self._transform = Affine2D()\n self._snap_threshold = 5.0\n self._joinstyle = self._user_joinstyle or JoinStyle.miter\n if not self._half_fill():\n self._path = self._x_filled_path\n else:\n # Rotate top half path to support all partitions\n self._path = self._alt_path = self._x_filled_path_t\n fs = self.get_fillstyle()\n self._transform.rotate_deg(\n {'top': 0, 'left': 90, 'bottom': 180, 'right': 270}[fs])\n self._alt_transform = self._transform.frozen().rotate_deg(180)\n | .venv\Lib\site-packages\matplotlib\markers.py | markers.py | Python | 33,708 | 0.95 | 0.118943 | 0.037613 | python-kit | 778 | 
2023-08-05T20:14:43.962515 | GPL-3.0 | false | 356bd4872835539063b2ea93207350d9 |
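The tuple-marker branch in `_set_tuple_marker` above builds its geometry from `Path.unit_regular_polygon(numsides)` and then applies `Affine2D().scale(0.5).rotate_deg(rotation)`. As a rough, matplotlib-free sketch of what that unit polygon looks like, the vertices can be generated directly with NumPy (the `unit_polygon` helper below is illustrative, not part of the module):

```python
import numpy as np

def unit_polygon(numsides, rotation_deg=0.0):
    """Vertices of a regular polygon inscribed in the unit circle.

    Illustrative stand-in for Path.unit_regular_polygon plus the
    rotate_deg(rotation) part of the tuple-marker transform (the 0.5
    scaling is omitted here for clarity).
    """
    # Start at the top (90 degrees) so the polygon is "point-up",
    # matching the convention of matplotlib's regular polygon markers.
    theta = (np.deg2rad(rotation_deg) + np.pi / 2
             + 2 * np.pi * np.arange(numsides) / numsides)
    return np.column_stack([np.cos(theta), np.sin(theta)])

verts = unit_polygon(4)  # a square standing on one corner
assert verts.shape == (4, 2)
# Every vertex lies on the unit circle.
assert np.allclose(np.hypot(verts[:, 0], verts[:, 1]), 1.0)
```

This is only the geometry half of a tuple marker such as `(4, 0, 45)`; the real code also sets the join style and the half-size scaling via the marker transform.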
from typing import Literal\n\nfrom .path import Path\nfrom .transforms import Affine2D, Transform\n\nfrom numpy.typing import ArrayLike\nfrom .typing import CapStyleType, FillStyleType, JoinStyleType\n\nTICKLEFT: int\nTICKRIGHT: int\nTICKUP: int\nTICKDOWN: int\nCARETLEFT: int\nCARETRIGHT: int\nCARETUP: int\nCARETDOWN: int\nCARETLEFTBASE: int\nCARETRIGHTBASE: int\nCARETUPBASE: int\nCARETDOWNBASE: int\n\nclass MarkerStyle:\n markers: dict[str | int, str]\n filled_markers: tuple[str, ...]\n fillstyles: tuple[FillStyleType, ...]\n\n def __init__(\n self,\n marker: str | ArrayLike | Path | MarkerStyle,\n fillstyle: FillStyleType | None = ...,\n transform: Transform | None = ...,\n capstyle: CapStyleType | None = ...,\n joinstyle: JoinStyleType | None = ...,\n ) -> None: ...\n def __bool__(self) -> bool: ...\n def is_filled(self) -> bool: ...\n def get_fillstyle(self) -> FillStyleType: ...\n def get_joinstyle(self) -> Literal["miter", "round", "bevel"]: ...\n def get_capstyle(self) -> Literal["butt", "projecting", "round"]: ...\n def get_marker(self) -> str | ArrayLike | Path | None: ...\n def get_path(self) -> Path: ...\n def get_transform(self) -> Transform: ...\n def get_alt_path(self) -> Path | None: ...\n def get_alt_transform(self) -> Transform: ...\n def get_snap_threshold(self) -> float | None: ...\n def get_user_transform(self) -> Transform | None: ...\n def transformed(self, transform: Affine2D) -> MarkerStyle: ...\n def rotated(\n self, *, deg: float | None = ..., rad: float | None = ...\n ) -> MarkerStyle: ...\n def scaled(self, sx: float, sy: float | None = ...) -> MarkerStyle: ...\n | .venv\Lib\site-packages\matplotlib\markers.pyi | markers.pyi | Other | 1,678 | 0.85 | 0.333333 | 0 | node-utils | 520 | 2025-05-20T02:02:21.062713 | GPL-3.0 | false | 34dd16014dd29849db073ab50c63819d |
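`MarkerStyle._set_marker` dispatches on the *marker* argument's shape and type: math-text strings, keys in the `markers` table, `(N, 2)` vertex arrays, `Path` objects, and `(numsides, style[, rotation])` tuples each select a different `_set_*` function. A simplified, matplotlib-free sketch of that dispatch order (the category names and the reduced `known_keys` set are illustrative; the real code uses `cbook.is_math_text` and the full `markers` table, and also handles `Path` and `MarkerStyle` inputs):

```python
import numpy as np

def classify_marker(marker, known_keys=frozenset({'o', 's', '^', 'D'})):
    """Simplified mirror of MarkerStyle._set_marker's dispatch order."""
    if isinstance(marker, str) and marker.startswith('$') and marker.endswith('$'):
        return 'mathtext'    # e.g. r'$\alpha$' rendered via TextPath
    if isinstance(marker, (int, str)) and marker in known_keys:
        return 'named'       # e.g. 'o', 's' from the markers table
    if isinstance(marker, np.ndarray) and marker.ndim == 2 and marker.shape[1] == 2:
        return 'vertices'    # custom (N, 2) vertex list
    if isinstance(marker, tuple) and len(marker) in (2, 3) and marker[1] in (0, 1, 2):
        return 'tuple'       # (numsides, style[, rotation])
    raise ValueError(f'Unrecognized marker style {marker!r}')

assert classify_marker('o') == 'named'
assert classify_marker((5, 1, 45)) == 'tuple'
assert classify_marker(np.zeros((3, 2))) == 'vertices'
```

Note the ordering matters just as in the original: the math-text and named-key checks run before the array and tuple checks, so a string can never fall through to the geometric branches.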
r"""\nA module for parsing a subset of the TeX math syntax and rendering it to a\nMatplotlib backend.\n\nFor a tutorial of its usage, see :ref:`mathtext`. This\ndocument is primarily concerned with implementation details.\n\nThe module uses pyparsing_ to parse the TeX expression.\n\n.. _pyparsing: https://pypi.org/project/pyparsing/\n\nThe Bakoma distribution of the TeX Computer Modern fonts, and STIX\nfonts are supported. There is experimental support for using\narbitrary fonts, but results may vary without proper tweaking and\nmetrics for those fonts.\n"""\n\nimport functools\nimport logging\n\nimport matplotlib as mpl\nfrom matplotlib import _api, _mathtext\nfrom matplotlib.ft2font import LoadFlags\nfrom matplotlib.font_manager import FontProperties\nfrom ._mathtext import ( # noqa: F401, reexported API\n RasterParse, VectorParse, get_unicode_index)\n\n_log = logging.getLogger(__name__)\n\n\nget_unicode_index.__module__ = __name__\n\n##############################################################################\n# MAIN\n\n\nclass MathTextParser:\n _parser = None\n _font_type_mapping = {\n 'cm': _mathtext.BakomaFonts,\n 'dejavuserif': _mathtext.DejaVuSerifFonts,\n 'dejavusans': _mathtext.DejaVuSansFonts,\n 'stix': _mathtext.StixFonts,\n 'stixsans': _mathtext.StixSansFonts,\n 'custom': _mathtext.UnicodeFonts,\n }\n\n def __init__(self, output):\n """\n Create a MathTextParser for the given backend *output*.\n\n Parameters\n ----------\n output : {"path", "agg"}\n Whether to return a `VectorParse` ("path") or a\n `RasterParse` ("agg", or its synonym "macosx").\n """\n self._output_type = _api.check_getitem(\n {"path": "vector", "agg": "raster", "macosx": "raster"},\n output=output.lower())\n\n def parse(self, s, dpi=72, prop=None, *, antialiased=None):\n """\n Parse the given math expression *s* at the given *dpi*. 
If *prop* is\n provided, it is a `.FontProperties` object specifying the "default"\n font to use in the math expression, used for all non-math text.\n\n The results are cached, so multiple calls to `parse`\n with the same expression should be fast.\n\n Depending on the *output* type, this returns either a `VectorParse` or\n a `RasterParse`.\n """\n # lru_cache can't decorate parse() directly because prop is\n # mutable, so we key the cache using an internal copy (see\n # Text._get_text_metrics_with_cache for a similar case); likewise,\n # we need to check the mutable state of the text.antialiased and\n # text.hinting rcParams.\n prop = prop.copy() if prop is not None else None\n antialiased = mpl._val_or_rc(antialiased, 'text.antialiased')\n from matplotlib.backends import backend_agg\n load_glyph_flags = {\n "vector": LoadFlags.NO_HINTING,\n "raster": backend_agg.get_hinting_flag(),\n }[self._output_type]\n return self._parse_cached(s, dpi, prop, antialiased, load_glyph_flags)\n\n @functools.lru_cache(50)\n def _parse_cached(self, s, dpi, prop, antialiased, load_glyph_flags):\n if prop is None:\n prop = FontProperties()\n fontset_class = _api.check_getitem(\n self._font_type_mapping, fontset=prop.get_math_fontfamily())\n fontset = fontset_class(prop, load_glyph_flags)\n fontsize = prop.get_size_in_points()\n\n if self._parser is None: # Cache the parser globally.\n self.__class__._parser = _mathtext.Parser()\n\n box = self._parser.parse(s, fontset, fontsize, dpi)\n output = _mathtext.ship(box)\n if self._output_type == "vector":\n return output.to_vector()\n elif self._output_type == "raster":\n return output.to_raster(antialiased=antialiased)\n\n\ndef math_to_image(s, filename_or_obj, prop=None, dpi=None, format=None,\n *, color=None):\n """\n Given a math expression, renders it in a closely-clipped bounding\n box to an image file.\n\n Parameters\n ----------\n s : str\n A math expression. 
The math portion must be enclosed in dollar signs.\n filename_or_obj : str or path-like or file-like\n Where to write the image data.\n prop : `.FontProperties`, optional\n The size and style of the text.\n dpi : float, optional\n The output dpi. If not set, the dpi is determined as for\n `.Figure.savefig`.\n format : str, optional\n The output format, e.g., 'svg', 'pdf', 'ps' or 'png'. If not set, the\n format is determined as for `.Figure.savefig`.\n color : str, optional\n Foreground color, defaults to :rc:`text.color`.\n """\n from matplotlib import figure\n\n parser = MathTextParser('path')\n width, height, depth, _, _ = parser.parse(s, dpi=72, prop=prop)\n\n fig = figure.Figure(figsize=(width / 72.0, height / 72.0))\n fig.text(0, depth/height, s, fontproperties=prop, color=color)\n fig.savefig(filename_or_obj, dpi=dpi, format=format)\n\n return depth\n | .venv\Lib\site-packages\matplotlib\mathtext.py | mathtext.py | Python | 5,104 | 0.95 | 0.121429 | 0.070175 | node-utils | 103 | 2024-04-08T10:28:56.692566 | GPL-3.0 | false | 8735713e478304d126eef159b5b3da2d |
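`MathTextParser.parse` notes that `functools.lru_cache` cannot decorate it directly because *prop* is mutable, so the public method snapshots its arguments and delegates to a cached private helper keyed on the copies. The same pattern in isolation (the `Style` and `Renderer` classes here are stand-ins, not matplotlib types):

```python
import functools
from dataclasses import dataclass

@dataclass(frozen=True)  # frozen -> hashable, so it is usable as a cache key
class Style:
    size: float = 10.0

class Renderer:
    def render(self, text, style=None):
        # Resolve defaults and snapshot any mutable/ambient state *before*
        # hitting the cache, mirroring MathTextParser.parse copying `prop`
        # and resolving the antialiased rcParam up front.
        key = style if style is not None else Style()
        return self._render_cached(text, key)

    @functools.lru_cache(50)
    def _render_cached(self, text, style):
        return f'{text}@{style.size}'

r = Renderer()
assert r.render('x') == 'x@10.0'
assert r.render('x') == 'x@10.0'  # second call is served from the cache
assert r._render_cached.cache_info().hits >= 1
```

Because the cached helper only ever sees immutable keys, later mutation of the caller's objects cannot silently return stale results.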
import os\nfrom typing import Generic, IO, Literal, TypeVar, overload\n\nfrom matplotlib.font_manager import FontProperties\nfrom matplotlib.typing import ColorType\n\n# Re-exported API from _mathtext.\nfrom ._mathtext import (\n RasterParse as RasterParse,\n VectorParse as VectorParse,\n get_unicode_index as get_unicode_index,\n)\n\n_ParseType = TypeVar("_ParseType", RasterParse, VectorParse)\n\nclass MathTextParser(Generic[_ParseType]):\n @overload\n def __init__(self: MathTextParser[VectorParse], output: Literal["path"]) -> None: ...\n @overload\n def __init__(self: MathTextParser[RasterParse], output: Literal["agg", "raster", "macosx"]) -> None: ...\n def parse(\n self, s: str, dpi: float = ..., prop: FontProperties | None = ..., *, antialiased: bool | None = ...\n ) -> _ParseType: ...\n\ndef math_to_image(\n s: str,\n filename_or_obj: str | os.PathLike | IO,\n prop: FontProperties | None = ...,\n dpi: float | None = ...,\n format: str | None = ...,\n *,\n color: ColorType | None = ...\n) -> float: ...\n | .venv\Lib\site-packages\matplotlib\mathtext.pyi | mathtext.pyi | Other | 1,045 | 0.95 | 0.151515 | 0.071429 | react-lib | 540 | 2024-12-27T19:56:20.501373 | BSD-3-Clause | false | c36b77f9de26ca5113b146907e773bc5 |
"""\nNumerical Python functions written for compatibility with MATLAB\ncommands with the same names. Most numerical Python functions can be found in\nthe `NumPy`_ and `SciPy`_ libraries. What remains here is code for performing\nspectral computations and kernel density estimations.\n\n.. _NumPy: https://numpy.org\n.. _SciPy: https://www.scipy.org\n\nSpectral functions\n------------------\n\n`cohere`\n Coherence (normalized cross spectral density)\n\n`csd`\n Cross spectral density using Welch's average periodogram\n\n`detrend`\n Remove the mean or best fit line from an array\n\n`psd`\n Power spectral density using Welch's average periodogram\n\n`specgram`\n Spectrogram (spectrum over segments of time)\n\n`complex_spectrum`\n Return the complex-valued frequency spectrum of a signal\n\n`magnitude_spectrum`\n Return the magnitude of the frequency spectrum of a signal\n\n`angle_spectrum`\n Return the angle (wrapped phase) of the frequency spectrum of a signal\n\n`phase_spectrum`\n Return the phase (unwrapped angle) of the frequency spectrum of a signal\n\n`detrend_mean`\n Remove the mean from a line.\n\n`detrend_linear`\n Remove the best fit line from a line.\n\n`detrend_none`\n Return the original line.\n"""\n\nimport functools\nfrom numbers import Number\n\nimport numpy as np\n\nfrom matplotlib import _api, _docstring, cbook\n\n\ndef window_hanning(x):\n """\n Return *x* times the Hanning (or Hann) window of len(*x*).\n\n See Also\n --------\n window_none : Another window algorithm.\n """\n return np.hanning(len(x))*x\n\n\ndef window_none(x):\n """\n No window function; simply return *x*.\n\n See Also\n --------\n window_hanning : Another window algorithm.\n """\n return x\n\n\ndef detrend(x, key=None, axis=None):\n """\n Return *x* with its trend removed.\n\n Parameters\n ----------\n x : array or sequence\n Array or sequence containing the data.\n\n key : {'default', 'constant', 'mean', 'linear', 'none'} or function\n The detrending algorithm to use. 
'default', 'mean', and 'constant' are\n the same as `detrend_mean`. 'linear' is the same as `detrend_linear`.\n 'none' is the same as `detrend_none`. The default is 'mean'. See the\n corresponding functions for more details regarding the algorithms. Can\n also be a function that carries out the detrend operation.\n\n axis : int\n The axis along which to do the detrending.\n\n See Also\n --------\n detrend_mean : Implementation of the 'mean' algorithm.\n detrend_linear : Implementation of the 'linear' algorithm.\n detrend_none : Implementation of the 'none' algorithm.\n """\n if key is None or key in ['constant', 'mean', 'default']:\n return detrend(x, key=detrend_mean, axis=axis)\n elif key == 'linear':\n return detrend(x, key=detrend_linear, axis=axis)\n elif key == 'none':\n return detrend(x, key=detrend_none, axis=axis)\n elif callable(key):\n x = np.asarray(x)\n if axis is not None and axis + 1 > x.ndim:\n raise ValueError(f'axis(={axis}) out of bounds')\n if (axis is None and x.ndim == 0) or (not axis and x.ndim == 1):\n return key(x)\n # try to use the 'axis' argument if the function supports it,\n # otherwise use apply_along_axis to do it\n try:\n return key(x, axis=axis)\n except TypeError:\n return np.apply_along_axis(key, axis=axis, arr=x)\n else:\n raise ValueError(\n f"Unknown value for key: {key!r}, must be one of: 'default', "\n f"'constant', 'mean', 'linear', or a function")\n\n\ndef detrend_mean(x, axis=None):\n """\n Return *x* minus the mean(*x*).\n\n Parameters\n ----------\n x : array or sequence\n Array or sequence containing the data\n Can have any dimensionality\n\n axis : int\n The axis along which to take the mean. 
See `numpy.mean` for a\n description of this argument.\n\n See Also\n --------\n detrend_linear : Another detrend algorithm.\n detrend_none : Another detrend algorithm.\n detrend : A wrapper around all the detrend algorithms.\n """\n x = np.asarray(x)\n\n if axis is not None and axis+1 > x.ndim:\n raise ValueError('axis(=%s) out of bounds' % axis)\n\n return x - x.mean(axis, keepdims=True)\n\n\ndef detrend_none(x, axis=None):\n """\n Return *x*: no detrending.\n\n Parameters\n ----------\n x : any object\n An object containing the data\n\n axis : int\n This parameter is ignored.\n It is included for compatibility with detrend_mean\n\n See Also\n --------\n detrend_mean : Another detrend algorithm.\n detrend_linear : Another detrend algorithm.\n detrend : A wrapper around all the detrend algorithms.\n """\n return x\n\n\ndef detrend_linear(y):\n """\n Return *x* minus best fit line; 'linear' detrending.\n\n Parameters\n ----------\n y : 0-D or 1-D array or sequence\n Array or sequence containing the data\n\n See Also\n --------\n detrend_mean : Another detrend algorithm.\n detrend_none : Another detrend algorithm.\n detrend : A wrapper around all the detrend algorithms.\n """\n # This is faster than an algorithm based on linalg.lstsq.\n y = np.asarray(y)\n\n if y.ndim > 1:\n raise ValueError('y cannot have ndim > 1')\n\n # short-circuit 0-D array.\n if not y.ndim:\n return np.array(0., dtype=y.dtype)\n\n x = np.arange(y.size, dtype=float)\n\n C = np.cov(x, y, bias=1)\n b = C[0, 1]/C[0, 0]\n\n a = y.mean() - b*x.mean()\n return y - (b*x + a)\n\n\ndef _spectral_helper(x, y=None, NFFT=None, Fs=None, detrend_func=None,\n window=None, noverlap=None, pad_to=None,\n sides=None, scale_by_freq=None, mode=None):\n """\n Private helper implementing the common parts between the psd, csd,\n spectrogram and complex, magnitude, angle, and phase spectrums.\n """\n if y is None:\n # if y is None use x for y\n same_data = True\n else:\n # The checks for if y is x are so that we can 
use the same function to\n # implement the core of psd(), csd(), and spectrogram() without doing\n # extra calculations. We return the unaveraged Pxy, freqs, and t.\n same_data = y is x\n\n if Fs is None:\n Fs = 2\n if noverlap is None:\n noverlap = 0\n if detrend_func is None:\n detrend_func = detrend_none\n if window is None:\n window = window_hanning\n\n # if NFFT is set to None use the whole signal\n if NFFT is None:\n NFFT = 256\n\n if noverlap >= NFFT:\n raise ValueError('noverlap must be less than NFFT')\n\n if mode is None or mode == 'default':\n mode = 'psd'\n _api.check_in_list(\n ['default', 'psd', 'complex', 'magnitude', 'angle', 'phase'],\n mode=mode)\n\n if not same_data and mode != 'psd':\n raise ValueError("x and y must be equal if mode is not 'psd'")\n\n # Make sure we're dealing with a numpy array. If y and x were the same\n # object to start with, keep them that way\n x = np.asarray(x)\n if not same_data:\n y = np.asarray(y)\n\n if sides is None or sides == 'default':\n if np.iscomplexobj(x):\n sides = 'twosided'\n else:\n sides = 'onesided'\n _api.check_in_list(['default', 'onesided', 'twosided'], sides=sides)\n\n # zero pad x and y up to NFFT if they are shorter than NFFT\n if len(x) < NFFT:\n n = len(x)\n x = np.resize(x, NFFT)\n x[n:] = 0\n\n if not same_data and len(y) < NFFT:\n n = len(y)\n y = np.resize(y, NFFT)\n y[n:] = 0\n\n if pad_to is None:\n pad_to = NFFT\n\n if mode != 'psd':\n scale_by_freq = False\n elif scale_by_freq is None:\n scale_by_freq = True\n\n # For real x, ignore the negative frequencies unless told otherwise\n if sides == 'twosided':\n numFreqs = pad_to\n if pad_to % 2:\n freqcenter = (pad_to - 1)//2 + 1\n else:\n freqcenter = pad_to//2\n scaling_factor = 1.\n elif sides == 'onesided':\n if pad_to % 2:\n numFreqs = (pad_to + 1)//2\n else:\n numFreqs = pad_to//2 + 1\n scaling_factor = 2.\n\n if not np.iterable(window):\n window = window(np.ones(NFFT, x.dtype))\n if len(window) != NFFT:\n raise ValueError(\n "The window 
length must match the data's first dimension")\n\n result = np.lib.stride_tricks.sliding_window_view(\n x, NFFT, axis=0)[::NFFT - noverlap].T\n result = detrend(result, detrend_func, axis=0)\n result = result * window.reshape((-1, 1))\n result = np.fft.fft(result, n=pad_to, axis=0)[:numFreqs, :]\n freqs = np.fft.fftfreq(pad_to, 1/Fs)[:numFreqs]\n\n if not same_data:\n # if same_data is False, mode must be 'psd'\n resultY = np.lib.stride_tricks.sliding_window_view(\n y, NFFT, axis=0)[::NFFT - noverlap].T\n resultY = detrend(resultY, detrend_func, axis=0)\n resultY = resultY * window.reshape((-1, 1))\n resultY = np.fft.fft(resultY, n=pad_to, axis=0)[:numFreqs, :]\n result = np.conj(result) * resultY\n elif mode == 'psd':\n result = np.conj(result) * result\n elif mode == 'magnitude':\n result = np.abs(result) / window.sum()\n elif mode == 'angle' or mode == 'phase':\n # we unwrap the phase later to handle the onesided vs. twosided case\n result = np.angle(result)\n elif mode == 'complex':\n result /= window.sum()\n\n if mode == 'psd':\n\n # Also include scaling factors for one-sided densities and dividing by\n # the sampling frequency, if desired. Scale everything, except the DC\n # component and the NFFT/2 component:\n\n # if we have a even number of frequencies, don't scale NFFT/2\n if not NFFT % 2:\n slc = slice(1, -1, None)\n # if we have an odd number, just don't scale DC\n else:\n slc = slice(1, None, None)\n\n result[slc] *= scaling_factor\n\n # MATLAB divides by the sampling frequency so that density function\n # has units of dB/Hz and can be integrated by the plotted frequency\n # values. 
Perform the same scaling here.\n if scale_by_freq:\n result /= Fs\n # Scale the spectrum by the norm of the window to compensate for\n # windowing loss; see Bendat & Piersol Sec 11.5.2.\n result /= (window**2).sum()\n else:\n # In this case, preserve power in the segment, not amplitude\n result /= window.sum()**2\n\n t = np.arange(NFFT/2, len(x) - NFFT/2 + 1, NFFT - noverlap)/Fs\n\n if sides == 'twosided':\n # center the frequency range at zero\n freqs = np.roll(freqs, -freqcenter, axis=0)\n result = np.roll(result, -freqcenter, axis=0)\n elif not pad_to % 2:\n # get the last value correctly, it is negative otherwise\n freqs[-1] *= -1\n\n # we unwrap the phase here to handle the onesided vs. twosided case\n if mode == 'phase':\n result = np.unwrap(result, axis=0)\n\n return result, freqs, t\n\n\ndef _single_spectrum_helper(\n mode, x, Fs=None, window=None, pad_to=None, sides=None):\n """\n Private helper implementing the commonality between the complex, magnitude,\n angle, and phase spectrums.\n """\n _api.check_in_list(['complex', 'magnitude', 'angle', 'phase'], mode=mode)\n\n if pad_to is None:\n pad_to = len(x)\n\n spec, freqs, _ = _spectral_helper(x=x, y=None, NFFT=len(x), Fs=Fs,\n detrend_func=detrend_none, window=window,\n noverlap=0, pad_to=pad_to,\n sides=sides,\n scale_by_freq=False,\n mode=mode)\n if mode != 'complex':\n spec = spec.real\n\n if spec.ndim == 2 and spec.shape[1] == 1:\n spec = spec[:, 0]\n\n return spec, freqs\n\n\n# Split out these keyword docs so that they can be used elsewhere\n_docstring.interpd.register(\n Spectral="""\\nFs : float, default: 2\n The sampling frequency (samples per time unit). It is used to calculate\n the Fourier frequencies, *freqs*, in cycles per time unit.\n\nwindow : callable or ndarray, default: `.window_hanning`\n A function or a vector of length *NFFT*. 
To create window vectors see\n `.window_hanning`, `.window_none`, `numpy.blackman`, `numpy.hamming`,\n `numpy.bartlett`, `scipy.signal`, `scipy.signal.get_window`, etc. If a\n function is passed as the argument, it must take a data segment as an\n argument and return the windowed version of the segment.\n\nsides : {'default', 'onesided', 'twosided'}, optional\n Which sides of the spectrum to return. 'default' is one-sided for real\n data and two-sided for complex data. 'onesided' forces the return of a\n one-sided spectrum, while 'twosided' forces two-sided.""",\n\n Single_Spectrum="""\\npad_to : int, optional\n The number of points to which the data segment is padded when performing\n the FFT. While not increasing the actual resolution of the spectrum (the\n minimum distance between resolvable peaks), this can give more points in\n the plot, allowing for more detail. This corresponds to the *n* parameter\n in the call to `~numpy.fft.fft`. The default is None, which sets *pad_to*\n equal to the length of the input signal (i.e. no padding).""",\n\n PSD="""\\npad_to : int, optional\n The number of points to which the data segment is padded when performing\n the FFT. This can be different from *NFFT*, which specifies the number\n of data points used. While not increasing the actual resolution of the\n spectrum (the minimum distance between resolvable peaks), this can give\n more points in the plot, allowing for more detail. This corresponds to\n the *n* parameter in the call to `~numpy.fft.fft`. The default is None,\n which sets *pad_to* equal to *NFFT*.\n\nNFFT : int, default: 256\n The number of data points used in each block for the FFT. A power of 2 is\n most efficient. This should *NOT* be used to get zero padding, or the\n scaling of the result will be incorrect; use *pad_to* for this instead.\n\ndetrend : {'none', 'mean', 'linear'} or callable, default: 'none'\n The function applied to each segment before fft-ing, designed to remove\n the mean or linear trend. 
Unlike in MATLAB, where the *detrend* parameter\n is a vector, in Matplotlib it is a function. The :mod:`~matplotlib.mlab`\n module defines `.detrend_none`, `.detrend_mean`, and `.detrend_linear`,\n but you can use a custom function as well. You can also use a string to\n choose one of the functions: 'none' calls `.detrend_none`. 'mean' calls\n `.detrend_mean`. 'linear' calls `.detrend_linear`.\n\nscale_by_freq : bool, default: True\n Whether the resulting density values should be scaled by the scaling\n frequency, which gives density in units of 1/Hz. This allows for\n integration over the returned frequency values. The default is True for\n MATLAB compatibility.""")\n\n\n@_docstring.interpd\ndef psd(x, NFFT=None, Fs=None, detrend=None, window=None,\n noverlap=None, pad_to=None, sides=None, scale_by_freq=None):\n r"""\n Compute the power spectral density.\n\n The power spectral density :math:`P_{xx}` by Welch's average\n periodogram method. The vector *x* is divided into *NFFT* length\n segments. Each segment is detrended by function *detrend* and\n windowed by function *window*. *noverlap* gives the length of\n the overlap between segments. 
The :math:`|\mathrm{fft}(i)|^2`\n of each segment :math:`i` are averaged to compute :math:`P_{xx}`.\n\n If len(*x*) < *NFFT*, it will be zero padded to *NFFT*.\n\n Parameters\n ----------\n x : 1-D array or sequence\n Array or sequence containing the data\n\n %(Spectral)s\n\n %(PSD)s\n\n noverlap : int, default: 0 (no overlap)\n The number of points of overlap between segments.\n\n Returns\n -------\n Pxx : 1-D array\n The values for the power spectrum :math:`P_{xx}` (real valued)\n\n freqs : 1-D array\n The frequencies corresponding to the elements in *Pxx*\n\n References\n ----------\n Bendat & Piersol -- Random Data: Analysis and Measurement Procedures, John\n Wiley & Sons (1986)\n\n See Also\n --------\n specgram\n `specgram` differs in the default overlap; in not returning the mean of\n the segment periodograms; and in returning the times of the segments.\n\n magnitude_spectrum : returns the magnitude spectrum.\n\n csd : returns the spectral density between two signals.\n """\n Pxx, freqs = csd(x=x, y=None, NFFT=NFFT, Fs=Fs, detrend=detrend,\n window=window, noverlap=noverlap, pad_to=pad_to,\n sides=sides, scale_by_freq=scale_by_freq)\n return Pxx.real, freqs\n\n\n@_docstring.interpd\ndef csd(x, y, NFFT=None, Fs=None, detrend=None, window=None,\n noverlap=None, pad_to=None, sides=None, scale_by_freq=None):\n """\n Compute the cross-spectral density.\n\n The cross spectral density :math:`P_{xy}` by Welch's average\n periodogram method. The vectors *x* and *y* are divided into\n *NFFT* length segments. Each segment is detrended by function\n *detrend* and windowed by function *window*. *noverlap* gives\n the length of the overlap between segments. 
The product of\n the direct FFTs of *x* and *y* are averaged over each segment\n to compute :math:`P_{xy}`, with a scaling to correct for power\n loss due to windowing.\n\n If len(*x*) < *NFFT* or len(*y*) < *NFFT*, they will be zero\n padded to *NFFT*.\n\n Parameters\n ----------\n x, y : 1-D arrays or sequences\n Arrays or sequences containing the data\n\n %(Spectral)s\n\n %(PSD)s\n\n noverlap : int, default: 0 (no overlap)\n The number of points of overlap between segments.\n\n Returns\n -------\n Pxy : 1-D array\n The values for the cross spectrum :math:`P_{xy}` before scaling (real\n valued)\n\n freqs : 1-D array\n The frequencies corresponding to the elements in *Pxy*\n\n References\n ----------\n Bendat & Piersol -- Random Data: Analysis and Measurement Procedures, John\n Wiley & Sons (1986)\n\n See Also\n --------\n psd : equivalent to setting ``y = x``.\n """\n if NFFT is None:\n NFFT = 256\n Pxy, freqs, _ = _spectral_helper(x=x, y=y, NFFT=NFFT, Fs=Fs,\n detrend_func=detrend, window=window,\n noverlap=noverlap, pad_to=pad_to,\n sides=sides, scale_by_freq=scale_by_freq,\n mode='psd')\n\n if Pxy.ndim == 2:\n if Pxy.shape[1] > 1:\n Pxy = Pxy.mean(axis=1)\n else:\n Pxy = Pxy[:, 0]\n return Pxy, freqs\n\n\n_single_spectrum_docs = """\\nCompute the {quantity} of *x*.\nData is padded to a length of *pad_to* and the windowing function *window* is\napplied to the signal.\n\nParameters\n----------\nx : 1-D array or sequence\n Array or sequence containing the data\n\n{Spectral}\n\n{Single_Spectrum}\n\nReturns\n-------\nspectrum : 1-D array\n The {quantity}.\nfreqs : 1-D array\n The frequencies corresponding to the elements in *spectrum*.\n\nSee Also\n--------\npsd\n Returns the power spectral density.\ncomplex_spectrum\n Returns the complex-valued frequency spectrum.\nmagnitude_spectrum\n Returns the absolute value of the `complex_spectrum`.\nangle_spectrum\n Returns the angle of the `complex_spectrum`.\nphase_spectrum\n Returns the phase (unwrapped angle) of the 
`complex_spectrum`.\nspecgram\n Can return the complex spectrum of segments within the signal.\n"""\n\n\ncomplex_spectrum = functools.partial(_single_spectrum_helper, "complex")\ncomplex_spectrum.__doc__ = _single_spectrum_docs.format(\n quantity="complex-valued frequency spectrum",\n **_docstring.interpd.params)\nmagnitude_spectrum = functools.partial(_single_spectrum_helper, "magnitude")\nmagnitude_spectrum.__doc__ = _single_spectrum_docs.format(\n quantity="magnitude (absolute value) of the frequency spectrum",\n **_docstring.interpd.params)\nangle_spectrum = functools.partial(_single_spectrum_helper, "angle")\nangle_spectrum.__doc__ = _single_spectrum_docs.format(\n quantity="angle of the frequency spectrum (wrapped phase spectrum)",\n **_docstring.interpd.params)\nphase_spectrum = functools.partial(_single_spectrum_helper, "phase")\nphase_spectrum.__doc__ = _single_spectrum_docs.format(\n quantity="phase of the frequency spectrum (unwrapped phase spectrum)",\n **_docstring.interpd.params)\n\n\n@_docstring.interpd\ndef specgram(x, NFFT=None, Fs=None, detrend=None, window=None,\n noverlap=None, pad_to=None, sides=None, scale_by_freq=None,\n mode=None):\n """\n Compute a spectrogram.\n\n Compute and plot a spectrogram of data in *x*. Data are split into\n *NFFT* length segments and the spectrum of each section is\n computed. 
The windowing function *window* is applied to each\n segment, and the amount of overlap of each segment is\n specified with *noverlap*.\n\n Parameters\n ----------\n x : array-like\n 1-D array or sequence.\n\n %(Spectral)s\n\n %(PSD)s\n\n noverlap : int, default: 128\n The number of points of overlap between blocks.\n mode : str, default: 'psd'\n What sort of spectrum to use:\n 'psd'\n Returns the power spectral density.\n 'complex'\n Returns the complex-valued frequency spectrum.\n 'magnitude'\n Returns the magnitude spectrum.\n 'angle'\n Returns the phase spectrum without unwrapping.\n 'phase'\n Returns the phase spectrum with unwrapping.\n\n Returns\n -------\n spectrum : array-like\n 2D array, columns are the periodograms of successive segments.\n\n freqs : array-like\n 1-D array, frequencies corresponding to the rows in *spectrum*.\n\n t : array-like\n 1-D array, the times corresponding to midpoints of segments\n (i.e. the columns in *spectrum*).\n\n See Also\n --------\n psd : differs in the overlap and in the return values.\n complex_spectrum : similar, but with complex-valued frequencies.\n magnitude_spectrum : similar to single segment when *mode* is 'magnitude'.\n angle_spectrum : similar to single segment when *mode* is 'angle'.\n phase_spectrum : similar to single segment when *mode* is 'phase'.\n\n Notes\n -----\n *detrend* and *scale_by_freq* only apply when *mode* is set to 'psd'.\n\n """\n if noverlap is None:\n noverlap = 128 # default in _spectral_helper() is noverlap = 0\n if NFFT is None:\n NFFT = 256 # same default as in _spectral_helper()\n if len(x) <= NFFT:\n _api.warn_external("Only one segment is calculated since parameter "\n f"NFFT (={NFFT}) >= signal length (={len(x)}).")\n\n spec, freqs, t = _spectral_helper(x=x, y=None, NFFT=NFFT, Fs=Fs,\n detrend_func=detrend, window=window,\n noverlap=noverlap, pad_to=pad_to,\n sides=sides,\n scale_by_freq=scale_by_freq,\n mode=mode)\n\n if mode != 'complex':\n spec = spec.real # Needed since helper 
implements generically\n\n return spec, freqs, t\n\n\n@_docstring.interpd\ndef cohere(x, y, NFFT=256, Fs=2, detrend=detrend_none, window=window_hanning,\n noverlap=0, pad_to=None, sides='default', scale_by_freq=None):\n r"""\n The coherence between *x* and *y*. Coherence is the normalized\n cross spectral density:\n\n .. math::\n\n C_{xy} = \frac{|P_{xy}|^2}{P_{xx}P_{yy}}\n\n Parameters\n ----------\n x, y\n Array or sequence containing the data\n\n %(Spectral)s\n\n %(PSD)s\n\n noverlap : int, default: 0 (no overlap)\n The number of points of overlap between segments.\n\n Returns\n -------\n Cxy : 1-D array\n The coherence vector.\n freqs : 1-D array\n The frequencies for the elements in *Cxy*.\n\n See Also\n --------\n :func:`psd`, :func:`csd` :\n For information about the methods used to compute :math:`P_{xy}`,\n :math:`P_{xx}` and :math:`P_{yy}`.\n """\n if len(x) < 2 * NFFT:\n raise ValueError(\n "Coherence is calculated by averaging over *NFFT* length "\n "segments. Your signal is too short for your choice of *NFFT*.")\n Pxx, f = psd(x, NFFT, Fs, detrend, window, noverlap, pad_to, sides,\n scale_by_freq)\n Pyy, f = psd(y, NFFT, Fs, detrend, window, noverlap, pad_to, sides,\n scale_by_freq)\n Pxy, f = csd(x, y, NFFT, Fs, detrend, window, noverlap, pad_to, sides,\n scale_by_freq)\n Cxy = np.abs(Pxy) ** 2 / (Pxx * Pyy)\n return Cxy, f\n\n\nclass GaussianKDE:\n """\n Representation of a kernel-density estimate using Gaussian kernels.\n\n Parameters\n ----------\n dataset : array-like\n Datapoints to estimate from. In case of univariate data this is a 1-D\n array, otherwise a 2D array with shape (# of dims, # of data).\n bw_method : {'scott', 'silverman'} or float or callable, optional\n The method used to calculate the estimator bandwidth. If a\n float, this will be used directly as `kde.factor`. If a\n callable, it should take a `GaussianKDE` instance as only\n parameter and return a float. 
If None (default), 'scott' is used.\n\n Attributes\n ----------\n dataset : ndarray\n The dataset passed to the constructor.\n dim : int\n Number of dimensions.\n num_dp : int\n Number of datapoints.\n factor : float\n The bandwidth factor, obtained from `kde.covariance_factor`, with which\n the covariance matrix is multiplied.\n covariance : ndarray\n The covariance matrix of *dataset*, scaled by the calculated bandwidth\n (`kde.factor`).\n inv_cov : ndarray\n The inverse of *covariance*.\n\n Methods\n -------\n kde.evaluate(points) : ndarray\n Evaluate the estimated pdf on a provided set of points.\n kde(points) : ndarray\n Same as kde.evaluate(points)\n """\n\n # This implementation with minor modification was too good to pass up.\n # from scipy: https://github.com/scipy/scipy/blob/master/scipy/stats/kde.py\n\n def __init__(self, dataset, bw_method=None):\n self.dataset = np.atleast_2d(dataset)\n if not np.array(self.dataset).size > 1:\n raise ValueError("`dataset` input should have multiple elements.")\n\n self.dim, self.num_dp = np.array(self.dataset).shape\n\n if bw_method is None:\n pass\n elif cbook._str_equal(bw_method, 'scott'):\n self.covariance_factor = self.scotts_factor\n elif cbook._str_equal(bw_method, 'silverman'):\n self.covariance_factor = self.silverman_factor\n elif isinstance(bw_method, Number):\n self._bw_method = 'use constant'\n self.covariance_factor = lambda: bw_method\n elif callable(bw_method):\n self._bw_method = bw_method\n self.covariance_factor = lambda: self._bw_method(self)\n else:\n raise ValueError("`bw_method` should be 'scott', 'silverman', a "\n "scalar or a callable")\n\n # Computes the covariance matrix for each Gaussian kernel using\n # covariance_factor().\n\n self.factor = self.covariance_factor()\n # Cache covariance and inverse covariance of the data\n if not hasattr(self, '_data_inv_cov'):\n self.data_covariance = np.atleast_2d(\n np.cov(\n self.dataset,\n rowvar=1,\n bias=False))\n self.data_inv_cov = 
np.linalg.inv(self.data_covariance)\n\n self.covariance = self.data_covariance * self.factor ** 2\n self.inv_cov = self.data_inv_cov / self.factor ** 2\n self.norm_factor = (np.sqrt(np.linalg.det(2 * np.pi * self.covariance))\n * self.num_dp)\n\n def scotts_factor(self):\n return np.power(self.num_dp, -1. / (self.dim + 4))\n\n def silverman_factor(self):\n return np.power(\n self.num_dp * (self.dim + 2.0) / 4.0, -1. / (self.dim + 4))\n\n # Default method to calculate bandwidth, can be overwritten by subclass\n covariance_factor = scotts_factor\n\n def evaluate(self, points):\n """\n Evaluate the estimated pdf on a set of points.\n\n Parameters\n ----------\n points : (# of dimensions, # of points)-array\n Alternatively, a (# of dimensions,) vector can be passed in and\n treated as a single point.\n\n Returns\n -------\n (# of points,)-array\n The values at each point.\n\n Raises\n ------\n ValueError : if the dimensionality of the input points is different\n than the dimensionality of the KDE.\n\n """\n points = np.atleast_2d(points)\n\n dim, num_m = np.array(points).shape\n if dim != self.dim:\n raise ValueError(f"points have dimension {dim}, dataset has "\n f"dimension {self.dim}")\n\n result = np.zeros(num_m)\n\n if num_m >= self.num_dp:\n # there are more points than data, so loop over data\n for i in range(self.num_dp):\n diff = self.dataset[:, i, np.newaxis] - points\n tdiff = np.dot(self.inv_cov, diff)\n energy = np.sum(diff * tdiff, axis=0) / 2.0\n result = result + np.exp(-energy)\n else:\n # loop over points\n for i in range(num_m):\n diff = self.dataset - points[:, i, np.newaxis]\n tdiff = np.dot(self.inv_cov, diff)\n energy = np.sum(diff * tdiff, axis=0) / 2.0\n result[i] = np.sum(np.exp(-energy), axis=0)\n\n result = result / self.norm_factor\n\n return result\n\n __call__ = evaluate\n | .venv\Lib\site-packages\matplotlib\mlab.py | mlab.py | Python | 30,210 | 0.95 | 0.135816 | 0.064033 | vue-tools | 581 | 2025-02-17T03:29:11.769432 | GPL-3.0 | false | 
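The averaged-periodogram procedure documented in `psd` above (slice the signal into *NFFT*-length segments, window each one, FFT, scale everything except the DC and Nyquist bins, then average) can be sketched in plain NumPy. This is a simplified illustration, not Matplotlib's implementation: the name `welch_psd` is hypothetical, and it assumes a real-valued 1-D signal, a Hann window, and the default one-sided, density-scaled output.

```python
import numpy as np

def welch_psd(x, nfft=256, fs=2.0, noverlap=0):
    """Simplified one-sided Welch PSD (density scaling): segment,
    window, FFT, scale, and average, as described above."""
    x = np.asarray(x, float)
    window = np.hanning(nfft)  # stand-in for window_hanning
    step = nfft - noverlap
    nseg = (len(x) - nfft) // step + 1
    segs = np.stack([x[i * step:i * step + nfft] for i in range(nseg)])
    spec = np.fft.rfft(segs * window, n=nfft, axis=1)
    pxx = (np.abs(spec) ** 2).mean(axis=0)
    # Double everything except the DC and Nyquist components, then
    # divide by Fs and the window norm to get a density (1/Hz units).
    pxx[1:-1] *= 2
    pxx /= fs * (window ** 2).sum()
    freqs = np.fft.rfftfreq(nfft, 1 / fs)
    return pxx, freqs

fs = 1024
t = np.arange(4096) / fs
x = np.sin(2 * np.pi * 128 * t)
pxx, freqs = welch_psd(x, nfft=256, fs=fs)
print(freqs[np.argmax(pxx)])  # → 128.0, the frequency of the tone
```

Because of the density scaling, integrating `pxx` over `freqs` recovers the mean signal power (here 0.5 for a unit-amplitude sine), which is the "can be integrated by the plotted frequency values" property noted in the source comments.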
from collections.abc import Callable\nfrom typing import Literal\n\nimport numpy as np\nfrom numpy.typing import ArrayLike\n\ndef window_hanning(x: ArrayLike) -> ArrayLike: ...\ndef window_none(x: ArrayLike) -> ArrayLike: ...\ndef detrend(\n x: ArrayLike,\n key: Literal["default", "constant", "mean", "linear", "none"]\n | Callable[[ArrayLike, int | None], ArrayLike]\n | None = ...,\n axis: int | None = ...,\n) -> ArrayLike: ...\ndef detrend_mean(x: ArrayLike, axis: int | None = ...) -> ArrayLike: ...\ndef detrend_none(x: ArrayLike, axis: int | None = ...) -> ArrayLike: ...\ndef detrend_linear(y: ArrayLike) -> ArrayLike: ...\ndef psd(\n x: ArrayLike,\n NFFT: int | None = ...,\n Fs: float | None = ...,\n detrend: Literal["none", "mean", "linear"]\n | Callable[[ArrayLike, int | None], ArrayLike]\n | None = ...,\n window: Callable[[ArrayLike], ArrayLike] | ArrayLike | None = ...,\n noverlap: int | None = ...,\n pad_to: int | None = ...,\n sides: Literal["default", "onesided", "twosided"] | None = ...,\n scale_by_freq: bool | None = ...,\n) -> tuple[ArrayLike, ArrayLike]: ...\ndef csd(\n x: ArrayLike,\n y: ArrayLike | None,\n NFFT: int | None = ...,\n Fs: float | None = ...,\n detrend: Literal["none", "mean", "linear"]\n | Callable[[ArrayLike, int | None], ArrayLike]\n | None = ...,\n window: Callable[[ArrayLike], ArrayLike] | ArrayLike | None = ...,\n noverlap: int | None = ...,\n pad_to: int | None = ...,\n sides: Literal["default", "onesided", "twosided"] | None = ...,\n scale_by_freq: bool | None = ...,\n) -> tuple[ArrayLike, ArrayLike]: ...\n\n# These are functools.partial objects at runtime; declare them as\n# functions here, with the parameters _single_spectrum_helper accepts.\ndef complex_spectrum(\n x: ArrayLike,\n Fs: float | None = ...,\n window: Callable[[ArrayLike], ArrayLike] | ArrayLike | None = ...,\n pad_to: int | None = ...,\n sides: Literal["default", "onesided", "twosided"] | None = ...,\n) -> tuple[ArrayLike, ArrayLike]: ...\ndef magnitude_spectrum(\n x: ArrayLike,\n Fs: float | None = ...,\n window: Callable[[ArrayLike], ArrayLike] | ArrayLike | None = ...,\n pad_to: int | None = ...,\n sides: Literal["default", "onesided", "twosided"] | None = ...,\n) -> tuple[ArrayLike, ArrayLike]: ...\ndef angle_spectrum(\n x: ArrayLike,\n Fs: float | None = ...,\n window: Callable[[ArrayLike], ArrayLike] | ArrayLike | None = ...,\n pad_to: int | None = ...,\n sides: Literal["default", "onesided", "twosided"] | None = ...,\n) -> tuple[ArrayLike, ArrayLike]: ...\ndef phase_spectrum(\n x: ArrayLike,\n Fs: float | None = ...,\n window: Callable[[ArrayLike], ArrayLike] | ArrayLike | None = ...,\n pad_to: int | None = ...,\n sides: Literal["default", "onesided", "twosided"] | None = ...,\n) -> tuple[ArrayLike, ArrayLike]: ...\n\ndef specgram(\n x: ArrayLike,\n NFFT: int | None = ...,\n Fs: float | None = ...,\n detrend: Literal["none", "mean", "linear"] | 
Callable[[ArrayLike, int | None], ArrayLike] | None = ...,\n window: Callable[[ArrayLike], ArrayLike] | ArrayLike | None = ...,\n noverlap: int | None = ...,\n pad_to: int | None = ...,\n sides: Literal["default", "onesided", "twosided"] | None = ...,\n scale_by_freq: bool | None = ...,\n mode: Literal["psd", "complex", "magnitude", "angle", "phase"] | None = ...,\n) -> tuple[ArrayLike, ArrayLike, ArrayLike]: ...\ndef cohere(\n x: ArrayLike,\n y: ArrayLike,\n NFFT: int = ...,\n Fs: float = ...,\n detrend: Literal["none", "mean", "linear"] | Callable[[ArrayLike, int | None], ArrayLike] = ...,\n window: Callable[[ArrayLike], ArrayLike] | ArrayLike = ...,\n noverlap: int = ...,\n pad_to: int | None = ...,\n sides: Literal["default", "onesided", "twosided"] = ...,\n scale_by_freq: bool | None = ...,\n) -> tuple[ArrayLike, ArrayLike]: ...\n\nclass GaussianKDE:\n dataset: ArrayLike\n dim: int\n num_dp: int\n factor: float\n data_covariance: ArrayLike\n data_inv_cov: ArrayLike\n covariance: ArrayLike\n inv_cov: ArrayLike\n norm_factor: float\n def __init__(\n self,\n dataset: ArrayLike,\n bw_method: Literal["scott", "silverman"]\n | float\n | Callable[[GaussianKDE], float]\n | None = ...,\n ) -> None: ...\n def scotts_factor(self) -> float: ...\n def silverman_factor(self) -> float: ...\n def covariance_factor(self) -> float: ...\n def evaluate(self, points: ArrayLike) -> np.ndarray: ...\n def __call__(self, points: ArrayLike) -> np.ndarray: ...\n | .venv\Lib\site-packages\matplotlib\mlab.pyi | mlab.pyi | Other | 3,583 | 0.85 | 0.17 | 0 | python-kit | 833 | 2023-09-21T03:26:54.463466 | MIT | false | aca3c8ece134214bcedd4f85a9aa3545 |
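For the univariate case, the `GaussianKDE.evaluate` loop above reduces to summing Gaussian kernels whose bandwidth comes from Scott's factor ``n**(-1/(dim+4))``. A minimal pure-NumPy sketch (the helper name `kde_1d` is hypothetical; it mirrors the class's `scotts_factor`, `covariance`, and `norm_factor` for ``dim == 1`` only):

```python
import numpy as np

def kde_1d(dataset, points):
    """1-D Gaussian KDE mirroring GaussianKDE.evaluate with
    Scott's-rule bandwidth, for the univariate case."""
    dataset = np.asarray(dataset, float)
    points = np.asarray(points, float)
    n = dataset.size
    factor = n ** (-1.0 / 5.0)            # scotts_factor with dim=1
    cov = np.cov(dataset, bias=False) * factor ** 2
    inv_cov = 1.0 / cov                   # scalar "inverse covariance"
    norm = np.sqrt(2 * np.pi * cov) * n   # sqrt(det(2*pi*cov)) * num_dp
    diff = dataset[:, None] - points[None, :]
    energy = 0.5 * inv_cov * diff ** 2
    return np.exp(-energy).sum(axis=0) / norm

rng = np.random.default_rng(0)
data = rng.normal(0.0, 1.0, 1000)
grid = np.linspace(-4, 4, 81)
density = kde_1d(data, grid)
# The estimate integrates to ~1 and peaks near the sample mean.
```

The broadcasting step (`dataset[:, None] - points[None, :]`) plays the role of the per-point loops in `evaluate`; the class loops over whichever of the two axes is shorter purely to bound memory use.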
r"""\nContainer classes for `.Artist`\s.\n\n`OffsetBox`\n The base of all container artists defined in this module.\n\n`AnchoredOffsetbox`, `AnchoredText`\n Anchor and align an arbitrary `.Artist` or a text relative to the parent\n axes or a specific anchor point.\n\n`DrawingArea`\n A container with fixed width and height. Children have a fixed position\n inside the container and may be clipped.\n\n`HPacker`, `VPacker`\n Containers for laying out their children vertically or horizontally.\n\n`PaddedBox`\n A container to add padding around an `.Artist`.\n\n`TextArea`\n Contains a single `.Text` instance.\n"""\n\nimport functools\n\nimport numpy as np\n\nimport matplotlib as mpl\nfrom matplotlib import _api, _docstring\nimport matplotlib.artist as martist\nimport matplotlib.path as mpath\nimport matplotlib.text as mtext\nimport matplotlib.transforms as mtransforms\nfrom matplotlib.font_manager import FontProperties\nfrom matplotlib.image import BboxImage\nfrom matplotlib.patches import (\n FancyBboxPatch, FancyArrowPatch, bbox_artist as mbbox_artist)\nfrom matplotlib.transforms import Bbox, BboxBase, TransformedBbox\n\n\nDEBUG = False\n\n\ndef _compat_get_offset(meth):\n """\n Decorator for the get_offset method of OffsetBox and subclasses, that\n allows supporting both the new signature (self, bbox, renderer) and the old\n signature (self, width, height, xdescent, ydescent, renderer).\n """\n sigs = [lambda self, width, height, xdescent, ydescent, renderer: locals(),\n lambda self, bbox, renderer: locals()]\n\n @functools.wraps(meth)\n def get_offset(self, *args, **kwargs):\n params = _api.select_matching_signature(sigs, self, *args, **kwargs)\n bbox = (params["bbox"] if "bbox" in params else\n Bbox.from_bounds(-params["xdescent"], -params["ydescent"],\n params["width"], params["height"]))\n return meth(params["self"], bbox, params["renderer"])\n return get_offset\n\n\n# for debugging use\ndef _bbox_artist(*args, **kwargs):\n if DEBUG:\n mbbox_artist(*args, 
**kwargs)\n\n\ndef _get_packed_offsets(widths, total, sep, mode="fixed"):\n r"""\n Pack boxes specified by their *widths*.\n\n For simplicity of the description, the terminology used here assumes a\n horizontal layout, but the function works equally for a vertical layout.\n\n There are three packing *mode*\s:\n\n - 'fixed': The elements are packed tight to the left with a spacing of\n *sep* in between. If *total* is *None* the returned total will be the\n right edge of the last box. A non-*None* total will be passed unchecked\n to the output. In particular this means that right edge of the last\n box may be further to the right than the returned total.\n\n - 'expand': Distribute the boxes with equal spacing so that the left edge\n of the first box is at 0, and the right edge of the last box is at\n *total*. The parameter *sep* is ignored in this mode. A total of *None*\n is accepted and considered equal to 1. The total is returned unchanged\n (except for the conversion *None* to 1). If the total is smaller than\n the sum of the widths, the laid out boxes will overlap.\n\n - 'equal': If *total* is given, the total space is divided in N equal\n ranges and each box is left-aligned within its subspace.\n Otherwise (*total* is *None*), *sep* must be provided and each box is\n left-aligned in its subspace of width ``(max(widths) + sep)``. The\n total width is then calculated to be ``N * (max(widths) + sep)``.\n\n Parameters\n ----------\n widths : list of float\n Widths of boxes to be packed.\n total : float or None\n Intended total length. 
*None* if not used.\n sep : float or None\n Spacing between boxes.\n mode : {'fixed', 'expand', 'equal'}\n The packing mode.\n\n Returns\n -------\n total : float\n The total width needed to accommodate the laid out boxes.\n offsets : array of float\n The left offsets of the boxes.\n """\n _api.check_in_list(["fixed", "expand", "equal"], mode=mode)\n\n if mode == "fixed":\n offsets_ = np.cumsum([0] + [w + sep for w in widths])\n offsets = offsets_[:-1]\n if total is None:\n total = offsets_[-1] - sep\n return total, offsets\n\n elif mode == "expand":\n # This is a bit of a hack to avoid a TypeError when *total*\n # is None and used in conjunction with tight layout.\n if total is None:\n total = 1\n if len(widths) > 1:\n sep = (total - sum(widths)) / (len(widths) - 1)\n else:\n sep = 0\n offsets_ = np.cumsum([0] + [w + sep for w in widths])\n offsets = offsets_[:-1]\n return total, offsets\n\n elif mode == "equal":\n maxh = max(widths)\n if total is None:\n if sep is None:\n raise ValueError("total and sep cannot both be None when "\n "using layout mode 'equal'")\n total = (maxh + sep) * len(widths)\n else:\n sep = total / len(widths) - maxh\n offsets = (maxh + sep) * np.arange(len(widths))\n return total, offsets\n\n\ndef _get_aligned_offsets(yspans, height, align="baseline"):\n """\n Align boxes each specified by their ``(y0, y1)`` spans.\n\n For simplicity of the description, the terminology used here assumes a\n horizontal layout (i.e., vertical alignment), but the function works\n equally for a vertical layout.\n\n Parameters\n ----------\n yspans\n List of (y0, y1) spans of boxes to be aligned.\n height : float or None\n Intended total height. If None, the maximum of the heights\n (``y1 - y0``) in *yspans* is used.\n align : {'baseline', 'left', 'top', 'right', 'bottom', 'center'}\n The alignment anchor of the boxes.\n\n Returns\n -------\n (y0, y1)\n y range spanned by the packing. 
If a *height* was originally passed\n in, then for all alignments other than "baseline", a span of ``(0,\n height)`` is used without checking that it is actually large enough).\n descent\n The descent of the packing.\n offsets\n The bottom offsets of the boxes.\n """\n\n _api.check_in_list(\n ["baseline", "left", "top", "right", "bottom", "center"], align=align)\n if height is None:\n height = max(y1 - y0 for y0, y1 in yspans)\n\n if align == "baseline":\n yspan = (min(y0 for y0, y1 in yspans), max(y1 for y0, y1 in yspans))\n offsets = [0] * len(yspans)\n elif align in ["left", "bottom"]:\n yspan = (0, height)\n offsets = [-y0 for y0, y1 in yspans]\n elif align in ["right", "top"]:\n yspan = (0, height)\n offsets = [height - y1 for y0, y1 in yspans]\n elif align == "center":\n yspan = (0, height)\n offsets = [(height - (y1 - y0)) * .5 - y0 for y0, y1 in yspans]\n\n return yspan, offsets\n\n\nclass OffsetBox(martist.Artist):\n """\n The OffsetBox is a simple container artist.\n\n The child artists are meant to be drawn at a relative position to its\n parent.\n\n Being an artist itself, all parameters are passed on to `.Artist`.\n """\n def __init__(self, *args, **kwargs):\n super().__init__(*args)\n self._internal_update(kwargs)\n # Clipping has not been implemented in the OffsetBox family, so\n # disable the clip flag for consistency. 
It can always be turned back\n # on to zero effect.\n self.set_clip_on(False)\n self._children = []\n self._offset = (0, 0)\n\n def set_figure(self, fig):\n """\n Set the `.Figure` for the `.OffsetBox` and all its children.\n\n Parameters\n ----------\n fig : `~matplotlib.figure.Figure`\n """\n super().set_figure(fig)\n for c in self.get_children():\n c.set_figure(fig)\n\n @martist.Artist.axes.setter\n def axes(self, ax):\n # TODO deal with this better\n martist.Artist.axes.fset(self, ax)\n for c in self.get_children():\n if c is not None:\n c.axes = ax\n\n def contains(self, mouseevent):\n """\n Delegate the mouse event contains-check to the children.\n\n As a container, the `.OffsetBox` does not respond itself to\n mouseevents.\n\n Parameters\n ----------\n mouseevent : `~matplotlib.backend_bases.MouseEvent`\n\n Returns\n -------\n contains : bool\n Whether any values are within the radius.\n details : dict\n An artist-specific dictionary of details of the event context,\n such as which points are contained in the pick radius. See the\n individual Artist subclasses for details.\n\n See Also\n --------\n .Artist.contains\n """\n if self._different_canvas(mouseevent):\n return False, {}\n for c in self.get_children():\n a, b = c.contains(mouseevent)\n if a:\n return a, b\n return False, {}\n\n def set_offset(self, xy):\n """\n Set the offset.\n\n Parameters\n ----------\n xy : (float, float) or callable\n The (x, y) coordinates of the offset in display units. These can\n either be given explicitly as a tuple (x, y), or by providing a\n function that converts the extent into the offset. 
This function\n must have the signature::\n\n def offset(width, height, xdescent, ydescent, renderer) \\n-> (float, float)\n """\n self._offset = xy\n self.stale = True\n\n @_compat_get_offset\n def get_offset(self, bbox, renderer):\n """\n Return the offset as a tuple (x, y).\n\n The extent parameters have to be provided to handle the case where the\n offset is dynamically determined by a callable (see\n `~.OffsetBox.set_offset`).\n\n Parameters\n ----------\n bbox : `.Bbox`\n renderer : `.RendererBase` subclass\n """\n return (\n self._offset(bbox.width, bbox.height, -bbox.x0, -bbox.y0, renderer)\n if callable(self._offset)\n else self._offset)\n\n def set_width(self, width):\n """\n Set the width of the box.\n\n Parameters\n ----------\n width : float\n """\n self.width = width\n self.stale = True\n\n def set_height(self, height):\n """\n Set the height of the box.\n\n Parameters\n ----------\n height : float\n """\n self.height = height\n self.stale = True\n\n def get_visible_children(self):\n r"""Return a list of the visible child `.Artist`\s."""\n return [c for c in self._children if c.get_visible()]\n\n def get_children(self):\n r"""Return a list of the child `.Artist`\s."""\n return self._children\n\n def _get_bbox_and_child_offsets(self, renderer):\n """\n Return the bbox of the offsetbox and the child offsets.\n\n The bbox should satisfy ``x0 <= x1 and y0 <= y1``.\n\n Parameters\n ----------\n renderer : `.RendererBase` subclass\n\n Returns\n -------\n bbox\n list of (xoffset, yoffset) pairs\n """\n raise NotImplementedError(\n "get_bbox_and_offsets must be overridden in derived classes")\n\n def get_bbox(self, renderer):\n """Return the bbox of the offsetbox, ignoring parent offsets."""\n bbox, offsets = self._get_bbox_and_child_offsets(renderer)\n return bbox\n\n def get_window_extent(self, renderer=None):\n # docstring inherited\n if renderer is None:\n renderer = self.get_figure(root=True)._get_renderer()\n bbox = self.get_bbox(renderer)\n try: # Some 
subclasses redefine get_offset to take no args.\n px, py = self.get_offset(bbox, renderer)\n except TypeError:\n px, py = self.get_offset()\n return bbox.translated(px, py)\n\n def draw(self, renderer):\n """\n Update the location of children if necessary and draw them\n to the given *renderer*.\n """\n bbox, offsets = self._get_bbox_and_child_offsets(renderer)\n px, py = self.get_offset(bbox, renderer)\n for c, (ox, oy) in zip(self.get_visible_children(), offsets):\n c.set_offset((px + ox, py + oy))\n c.draw(renderer)\n _bbox_artist(self, renderer, fill=False, props=dict(pad=0.))\n self.stale = False\n\n\nclass PackerBase(OffsetBox):\n def __init__(self, pad=0., sep=0., width=None, height=None,\n align="baseline", mode="fixed", children=None):\n """\n Parameters\n ----------\n pad : float, default: 0.0\n The boundary padding in points.\n\n sep : float, default: 0.0\n The spacing between items in points.\n\n width, height : float, optional\n Width and height of the container box in pixels, calculated if\n *None*.\n\n align : {'top', 'bottom', 'left', 'right', 'center', 'baseline'}, \\ndefault: 'baseline'\n Alignment of boxes.\n\n mode : {'fixed', 'expand', 'equal'}, default: 'fixed'\n The packing mode.\n\n - 'fixed' packs the given `.Artist`\\s tight with *sep* spacing.\n - 'expand' uses the maximal available space to distribute the\n artists with equal spacing in between.\n - 'equal': Each artist an equal fraction of the available space\n and is left-aligned (or top-aligned) therein.\n\n children : list of `.Artist`\n The artists to pack.\n\n Notes\n -----\n *pad* and *sep* are in points and will be scaled with the renderer\n dpi, while *width* and *height* are in pixels.\n """\n super().__init__()\n self.height = height\n self.width = width\n self.sep = sep\n self.pad = pad\n self.mode = mode\n self.align = align\n self._children = children\n\n\nclass VPacker(PackerBase):\n """\n VPacker packs its children vertically, automatically adjusting their\n relative 
positions at draw time.\n\n .. code-block:: none\n\n +---------+\n | Child 1 |\n | Child 2 |\n | Child 3 |\n +---------+\n """\n\n def _get_bbox_and_child_offsets(self, renderer):\n # docstring inherited\n dpicor = renderer.points_to_pixels(1.)\n pad = self.pad * dpicor\n sep = self.sep * dpicor\n\n if self.width is not None:\n for c in self.get_visible_children():\n if isinstance(c, PackerBase) and c.mode == "expand":\n c.set_width(self.width)\n\n bboxes = [c.get_bbox(renderer) for c in self.get_visible_children()]\n (x0, x1), xoffsets = _get_aligned_offsets(\n [bbox.intervalx for bbox in bboxes], self.width, self.align)\n height, yoffsets = _get_packed_offsets(\n [bbox.height for bbox in bboxes], self.height, sep, self.mode)\n\n yoffsets = height - (yoffsets + [bbox.y1 for bbox in bboxes])\n ydescent = yoffsets[0]\n yoffsets = yoffsets - ydescent\n\n return (\n Bbox.from_bounds(x0, -ydescent, x1 - x0, height).padded(pad),\n [*zip(xoffsets, yoffsets)])\n\n\nclass HPacker(PackerBase):\n """\n HPacker packs its children horizontally, automatically adjusting their\n relative positions at draw time.\n\n .. 
code-block:: none\n\n +-------------------------------+\n | Child 1 Child 2 Child 3 |\n +-------------------------------+\n """\n\n def _get_bbox_and_child_offsets(self, renderer):\n # docstring inherited\n dpicor = renderer.points_to_pixels(1.)\n pad = self.pad * dpicor\n sep = self.sep * dpicor\n\n bboxes = [c.get_bbox(renderer) for c in self.get_visible_children()]\n if not bboxes:\n return Bbox.from_bounds(0, 0, 0, 0).padded(pad), []\n\n (y0, y1), yoffsets = _get_aligned_offsets(\n [bbox.intervaly for bbox in bboxes], self.height, self.align)\n width, xoffsets = _get_packed_offsets(\n [bbox.width for bbox in bboxes], self.width, sep, self.mode)\n\n x0 = bboxes[0].x0\n xoffsets -= ([bbox.x0 for bbox in bboxes] - x0)\n\n return (Bbox.from_bounds(x0, y0, width, y1 - y0).padded(pad),\n [*zip(xoffsets, yoffsets)])\n\n\nclass PaddedBox(OffsetBox):\n """\n A container to add a padding around an `.Artist`.\n\n The `.PaddedBox` contains a `.FancyBboxPatch` that is used to visualize\n it when rendering.\n\n .. code-block:: none\n\n +----------------------------+\n | |\n | |\n | |\n | <--pad--> Artist |\n | ^ |\n | pad |\n | v |\n +----------------------------+\n\n Attributes\n ----------\n pad : float\n The padding in points.\n patch : `.FancyBboxPatch`\n When *draw_frame* is True, this `.FancyBboxPatch` is made visible and\n creates a border around the box.\n """\n\n def __init__(self, child, pad=0., *, draw_frame=False, patch_attrs=None):\n """\n Parameters\n ----------\n child : `~matplotlib.artist.Artist`\n The contained `.Artist`.\n pad : float, default: 0.0\n The padding in points. 
This will be scaled with the renderer dpi.\n In contrast, *width* and *height* are in *pixels* and thus not\n scaled.\n draw_frame : bool\n Whether to draw the contained `.FancyBboxPatch`.\n patch_attrs : dict or None\n Additional parameters passed to the contained `.FancyBboxPatch`.\n """\n super().__init__()\n self.pad = pad\n self._children = [child]\n self.patch = FancyBboxPatch(\n xy=(0.0, 0.0), width=1., height=1.,\n facecolor='w', edgecolor='k',\n mutation_scale=1, # self.prop.get_size_in_points(),\n snap=True,\n visible=draw_frame,\n boxstyle="square,pad=0",\n )\n if patch_attrs is not None:\n self.patch.update(patch_attrs)\n\n def _get_bbox_and_child_offsets(self, renderer):\n # docstring inherited.\n pad = self.pad * renderer.points_to_pixels(1.)\n return (self._children[0].get_bbox(renderer).padded(pad), [(0, 0)])\n\n def draw(self, renderer):\n # docstring inherited\n bbox, offsets = self._get_bbox_and_child_offsets(renderer)\n px, py = self.get_offset(bbox, renderer)\n for c, (ox, oy) in zip(self.get_visible_children(), offsets):\n c.set_offset((px + ox, py + oy))\n\n self.draw_frame(renderer)\n\n for c in self.get_visible_children():\n c.draw(renderer)\n\n self.stale = False\n\n def update_frame(self, bbox, fontsize=None):\n self.patch.set_bounds(bbox.bounds)\n if fontsize:\n self.patch.set_mutation_scale(fontsize)\n self.stale = True\n\n def draw_frame(self, renderer):\n # update the location and size of the legend\n self.update_frame(self.get_window_extent(renderer))\n self.patch.draw(renderer)\n\n\nclass DrawingArea(OffsetBox):\n """\n The DrawingArea can contain any Artist as a child. The DrawingArea\n has a fixed width and height. The position of children relative to\n the parent is fixed. 
The children can be clipped at the\n boundaries of the parent.\n """\n\n def __init__(self, width, height, xdescent=0., ydescent=0., clip=False):\n """\n Parameters\n ----------\n width, height : float\n Width and height of the container box.\n xdescent, ydescent : float\n Descent of the box in x- and y-direction.\n clip : bool\n Whether to clip the children to the box.\n """\n super().__init__()\n self.width = width\n self.height = height\n self.xdescent = xdescent\n self.ydescent = ydescent\n self._clip_children = clip\n self.offset_transform = mtransforms.Affine2D()\n self.dpi_transform = mtransforms.Affine2D()\n\n @property\n def clip_children(self):\n """\n If the children of this DrawingArea should be clipped\n by DrawingArea bounding box.\n """\n return self._clip_children\n\n @clip_children.setter\n def clip_children(self, val):\n self._clip_children = bool(val)\n self.stale = True\n\n def get_transform(self):\n """\n Return the `~matplotlib.transforms.Transform` applied to the children.\n """\n return self.dpi_transform + self.offset_transform\n\n def set_transform(self, t):\n """\n set_transform is ignored.\n """\n\n def set_offset(self, xy):\n """\n Set the offset of the container.\n\n Parameters\n ----------\n xy : (float, float)\n The (x, y) coordinates of the offset in display units.\n """\n self._offset = xy\n self.offset_transform.clear()\n self.offset_transform.translate(xy[0], xy[1])\n self.stale = True\n\n def get_offset(self):\n """Return offset of the container."""\n return self._offset\n\n def get_bbox(self, renderer):\n # docstring inherited\n dpi_cor = renderer.points_to_pixels(1.)\n return Bbox.from_bounds(\n -self.xdescent * dpi_cor, -self.ydescent * dpi_cor,\n self.width * dpi_cor, self.height * dpi_cor)\n\n def add_artist(self, a):\n """Add an `.Artist` to the container box."""\n self._children.append(a)\n if not a.is_transform_set():\n a.set_transform(self.get_transform())\n if self.axes is not None:\n a.axes = self.axes\n fig = 
self.get_figure(root=False)\n if fig is not None:\n a.set_figure(fig)\n\n def draw(self, renderer):\n # docstring inherited\n\n dpi_cor = renderer.points_to_pixels(1.)\n self.dpi_transform.clear()\n self.dpi_transform.scale(dpi_cor)\n\n # At this point the DrawingArea has a transform\n # to the display space so the path created is\n # good for clipping children\n tpath = mtransforms.TransformedPath(\n mpath.Path([[0, 0], [0, self.height],\n [self.width, self.height],\n [self.width, 0]]),\n self.get_transform())\n for c in self._children:\n if self._clip_children and not (c.clipbox or c._clippath):\n c.set_clip_path(tpath)\n c.draw(renderer)\n\n _bbox_artist(self, renderer, fill=False, props=dict(pad=0.))\n self.stale = False\n\n\nclass TextArea(OffsetBox):\n """\n The TextArea is a container artist for a single Text instance.\n\n The text is placed at (0, 0) with baseline+left alignment, by default. The\n width and height of the TextArea instance is the width and height of its\n child text.\n """\n\n def __init__(self, s,\n *,\n textprops=None,\n multilinebaseline=False,\n ):\n """\n Parameters\n ----------\n s : str\n The text to be displayed.\n textprops : dict, default: {}\n Dictionary of keyword parameters to be passed to the `.Text`\n instance in the TextArea.\n multilinebaseline : bool, default: False\n Whether the baseline for multiline text is adjusted so that it\n is (approximately) center-aligned with single-line text.\n """\n if textprops is None:\n textprops = {}\n self._text = mtext.Text(0, 0, s, **textprops)\n super().__init__()\n self._children = [self._text]\n self.offset_transform = mtransforms.Affine2D()\n self._baseline_transform = mtransforms.Affine2D()\n self._text.set_transform(self.offset_transform +\n self._baseline_transform)\n self._multilinebaseline = multilinebaseline\n\n def set_text(self, s):\n """Set the text of this area as a string."""\n self._text.set_text(s)\n self.stale = True\n\n def get_text(self):\n """Return the string 
representation of this area's text."""\n return self._text.get_text()\n\n def set_multilinebaseline(self, t):\n """\n Set multilinebaseline.\n\n If True, the baseline for multiline text is adjusted so that it is\n (approximately) center-aligned with single-line text. This is used\n e.g. by the legend implementation so that single-line labels are\n baseline-aligned, but multiline labels are "center"-aligned with them.\n """\n self._multilinebaseline = t\n self.stale = True\n\n def get_multilinebaseline(self):\n """\n Get multilinebaseline.\n """\n return self._multilinebaseline\n\n def set_transform(self, t):\n """\n set_transform is ignored.\n """\n\n def set_offset(self, xy):\n """\n Set the offset of the container.\n\n Parameters\n ----------\n xy : (float, float)\n The (x, y) coordinates of the offset in display units.\n """\n self._offset = xy\n self.offset_transform.clear()\n self.offset_transform.translate(xy[0], xy[1])\n self.stale = True\n\n def get_offset(self):\n """Return offset of the container."""\n return self._offset\n\n def get_bbox(self, renderer):\n _, h_, d_ = mtext._get_text_metrics_with_cache(\n renderer, "lp", self._text._fontproperties,\n ismath="TeX" if self._text.get_usetex() else False,\n dpi=self.get_figure(root=True).dpi)\n\n bbox, info, yd = self._text._get_layout(renderer)\n w, h = bbox.size\n\n self._baseline_transform.clear()\n\n if len(info) > 1 and self._multilinebaseline:\n yd_new = 0.5 * h - 0.5 * (h_ - d_)\n self._baseline_transform.translate(0, yd - yd_new)\n yd = yd_new\n else: # single line\n h_d = max(h_ - d_, h - yd)\n h = h_d + yd\n\n ha = self._text.get_horizontalalignment()\n x0 = {"left": 0, "center": -w / 2, "right": -w}[ha]\n\n return Bbox.from_bounds(x0, -yd, w, h)\n\n def draw(self, renderer):\n # docstring inherited\n self._text.draw(renderer)\n _bbox_artist(self, renderer, fill=False, props=dict(pad=0.))\n self.stale = False\n\n\nclass AuxTransformBox(OffsetBox):\n """\n Offset Box with the aux_transform. 
Its children will be\n transformed with the aux_transform first then will be\n offsetted. The absolute coordinate of the aux_transform is meaning\n as it will be automatically adjust so that the left-lower corner\n of the bounding box of children will be set to (0, 0) before the\n offset transform.\n\n It is similar to drawing area, except that the extent of the box\n is not predetermined but calculated from the window extent of its\n children. Furthermore, the extent of the children will be\n calculated in the transformed coordinate.\n """\n def __init__(self, aux_transform):\n self.aux_transform = aux_transform\n super().__init__()\n self.offset_transform = mtransforms.Affine2D()\n # ref_offset_transform makes offset_transform always relative to the\n # lower-left corner of the bbox of its children.\n self.ref_offset_transform = mtransforms.Affine2D()\n\n def add_artist(self, a):\n """Add an `.Artist` to the container box."""\n self._children.append(a)\n a.set_transform(self.get_transform())\n self.stale = True\n\n def get_transform(self):\n """\n Return the :class:`~matplotlib.transforms.Transform` applied\n to the children\n """\n return (self.aux_transform\n + self.ref_offset_transform\n + self.offset_transform)\n\n def set_transform(self, t):\n """\n set_transform is ignored.\n """\n\n def set_offset(self, xy):\n """\n Set the offset of the container.\n\n Parameters\n ----------\n xy : (float, float)\n The (x, y) coordinates of the offset in display units.\n """\n self._offset = xy\n self.offset_transform.clear()\n self.offset_transform.translate(xy[0], xy[1])\n self.stale = True\n\n def get_offset(self):\n """Return offset of the container."""\n return self._offset\n\n def get_bbox(self, renderer):\n # clear the offset transforms\n _off = self.offset_transform.get_matrix() # to be restored later\n self.ref_offset_transform.clear()\n self.offset_transform.clear()\n # calculate the extent\n bboxes = [c.get_window_extent(renderer) for c in self._children]\n ub 
= Bbox.union(bboxes)\n # adjust ref_offset_transform\n self.ref_offset_transform.translate(-ub.x0, -ub.y0)\n # restore offset transform\n self.offset_transform.set_matrix(_off)\n return Bbox.from_bounds(0, 0, ub.width, ub.height)\n\n def draw(self, renderer):\n # docstring inherited\n for c in self._children:\n c.draw(renderer)\n _bbox_artist(self, renderer, fill=False, props=dict(pad=0.))\n self.stale = False\n\n\nclass AnchoredOffsetbox(OffsetBox):\n """\n An offset box placed according to location *loc*.\n\n AnchoredOffsetbox has a single child. When multiple children are needed,\n use an extra OffsetBox to enclose them. By default, the offset box is\n anchored against its parent Axes. You may explicitly specify the\n *bbox_to_anchor*.\n """\n zorder = 5 # zorder of the legend\n\n # Location codes\n codes = {'upper right': 1,\n 'upper left': 2,\n 'lower left': 3,\n 'lower right': 4,\n 'right': 5,\n 'center left': 6,\n 'center right': 7,\n 'lower center': 8,\n 'upper center': 9,\n 'center': 10,\n }\n\n def __init__(self, loc, *,\n pad=0.4, borderpad=0.5,\n child=None, prop=None, frameon=True,\n bbox_to_anchor=None,\n bbox_transform=None,\n **kwargs):\n """\n Parameters\n ----------\n loc : str\n The box location. Valid locations are\n 'upper left', 'upper center', 'upper right',\n 'center left', 'center', 'center right',\n 'lower left', 'lower center', 'lower right'.\n For backward compatibility, numeric values are accepted as well.\n See the parameter *loc* of `.Legend` for details.\n pad : float, default: 0.4\n Padding around the child as fraction of the fontsize.\n borderpad : float, default: 0.5\n Padding between the offsetbox frame and the *bbox_to_anchor*.\n child : `.OffsetBox`\n The box that will be anchored.\n prop : `.FontProperties`\n This is only used as a reference for paddings. 
If not given,\n :rc:`legend.fontsize` is used.\n frameon : bool\n Whether to draw a frame around the box.\n bbox_to_anchor : `.BboxBase`, 2-tuple, or 4-tuple of floats\n Box that is used to position the legend in conjunction with *loc*.\n bbox_transform : None or :class:`matplotlib.transforms.Transform`\n The transform for the bounding box (*bbox_to_anchor*).\n **kwargs\n All other parameters are passed on to `.OffsetBox`.\n\n Notes\n -----\n See `.Legend` for a detailed description of the anchoring mechanism.\n """\n super().__init__(**kwargs)\n\n self.set_bbox_to_anchor(bbox_to_anchor, bbox_transform)\n self.set_child(child)\n\n if isinstance(loc, str):\n loc = _api.check_getitem(self.codes, loc=loc)\n\n self.loc = loc\n self.borderpad = borderpad\n self.pad = pad\n\n if prop is None:\n self.prop = FontProperties(size=mpl.rcParams["legend.fontsize"])\n else:\n self.prop = FontProperties._from_any(prop)\n if isinstance(prop, dict) and "size" not in prop:\n self.prop.set_size(mpl.rcParams["legend.fontsize"])\n\n self.patch = FancyBboxPatch(\n xy=(0.0, 0.0), width=1., height=1.,\n facecolor='w', edgecolor='k',\n mutation_scale=self.prop.get_size_in_points(),\n snap=True,\n visible=frameon,\n boxstyle="square,pad=0",\n )\n\n def set_child(self, child):\n """Set the child to be anchored."""\n self._child = child\n if child is not None:\n child.axes = self.axes\n self.stale = True\n\n def get_child(self):\n """Return the child."""\n return self._child\n\n def get_children(self):\n """Return the list of children."""\n return [self._child]\n\n def get_bbox(self, renderer):\n # docstring inherited\n fontsize = renderer.points_to_pixels(self.prop.get_size_in_points())\n pad = self.pad * fontsize\n return self.get_child().get_bbox(renderer).padded(pad)\n\n def get_bbox_to_anchor(self):\n """Return the bbox that the box is anchored to."""\n if self._bbox_to_anchor is None:\n return self.axes.bbox\n else:\n transform = self._bbox_to_anchor_transform\n if transform is None:\n 
return self._bbox_to_anchor\n else:\n return TransformedBbox(self._bbox_to_anchor, transform)\n\n def set_bbox_to_anchor(self, bbox, transform=None):\n """\n Set the bbox that the box is anchored to.\n\n *bbox* can be a Bbox instance, a list of [left, bottom, width,\n height], or a list of [left, bottom] where the width and\n height will be assumed to be zero. The bbox will be\n transformed to display coordinate by the given transform.\n """\n if bbox is None or isinstance(bbox, BboxBase):\n self._bbox_to_anchor = bbox\n else:\n try:\n l = len(bbox)\n except TypeError as err:\n raise ValueError(f"Invalid bbox: {bbox}") from err\n\n if l == 2:\n bbox = [bbox[0], bbox[1], 0, 0]\n\n self._bbox_to_anchor = Bbox.from_bounds(*bbox)\n\n self._bbox_to_anchor_transform = transform\n self.stale = True\n\n @_compat_get_offset\n def get_offset(self, bbox, renderer):\n # docstring inherited\n pad = (self.borderpad\n * renderer.points_to_pixels(self.prop.get_size_in_points()))\n bbox_to_anchor = self.get_bbox_to_anchor()\n x0, y0 = _get_anchored_bbox(\n self.loc, Bbox.from_bounds(0, 0, bbox.width, bbox.height),\n bbox_to_anchor, pad)\n return x0 - bbox.x0, y0 - bbox.y0\n\n def update_frame(self, bbox, fontsize=None):\n self.patch.set_bounds(bbox.bounds)\n if fontsize:\n self.patch.set_mutation_scale(fontsize)\n\n def draw(self, renderer):\n # docstring inherited\n if not self.get_visible():\n return\n\n # update the location and size of the legend\n bbox = self.get_window_extent(renderer)\n fontsize = renderer.points_to_pixels(self.prop.get_size_in_points())\n self.update_frame(bbox, fontsize)\n self.patch.draw(renderer)\n\n px, py = self.get_offset(self.get_bbox(renderer), renderer)\n self.get_child().set_offset((px, py))\n self.get_child().draw(renderer)\n self.stale = False\n\n\ndef _get_anchored_bbox(loc, bbox, parentbbox, borderpad):\n """\n Return the (x, y) position of the *bbox* anchored at the *parentbbox* with\n the *loc* code with the *borderpad*.\n """\n # This is 
only called internally and *loc* should already have been\n # validated. If 0 (None), we just let ``bbox.anchored`` raise.\n c = [None, "NE", "NW", "SW", "SE", "E", "W", "E", "S", "N", "C"][loc]\n container = parentbbox.padded(-borderpad)\n return bbox.anchored(c, container=container).p0\n\n\nclass AnchoredText(AnchoredOffsetbox):\n """\n AnchoredOffsetbox with Text.\n """\n\n def __init__(self, s, loc, *, pad=0.4, borderpad=0.5, prop=None, **kwargs):\n """\n Parameters\n ----------\n s : str\n Text.\n\n loc : str\n Location code. See `AnchoredOffsetbox`.\n\n pad : float, default: 0.4\n Padding around the text as fraction of the fontsize.\n\n borderpad : float, default: 0.5\n Spacing between the offsetbox frame and the *bbox_to_anchor*.\n\n prop : dict, optional\n Dictionary of keyword parameters to be passed to the\n `~matplotlib.text.Text` instance contained inside AnchoredText.\n\n **kwargs\n All other parameters are passed to `AnchoredOffsetbox`.\n """\n\n if prop is None:\n prop = {}\n badkwargs = {'va', 'verticalalignment'}\n if badkwargs & set(prop):\n raise ValueError(\n 'Mixing verticalalignment with AnchoredText is not supported.')\n\n self.txt = TextArea(s, textprops=prop)\n fp = self.txt._text.get_fontproperties()\n super().__init__(\n loc, pad=pad, borderpad=borderpad, child=self.txt, prop=fp,\n **kwargs)\n\n\nclass OffsetImage(OffsetBox):\n\n def __init__(self, arr, *,\n zoom=1,\n cmap=None,\n norm=None,\n interpolation=None,\n origin=None,\n filternorm=True,\n filterrad=4.0,\n resample=False,\n dpi_cor=True,\n **kwargs\n ):\n\n super().__init__()\n self._dpi_cor = dpi_cor\n\n self.image = BboxImage(bbox=self.get_window_extent,\n cmap=cmap,\n norm=norm,\n interpolation=interpolation,\n origin=origin,\n filternorm=filternorm,\n filterrad=filterrad,\n resample=resample,\n **kwargs\n )\n\n self._children = [self.image]\n\n self.set_zoom(zoom)\n self.set_data(arr)\n\n def set_data(self, arr):\n self._data = np.asarray(arr)\n 
self.image.set_data(self._data)\n self.stale = True\n\n def get_data(self):\n return self._data\n\n def set_zoom(self, zoom):\n self._zoom = zoom\n self.stale = True\n\n def get_zoom(self):\n return self._zoom\n\n def get_offset(self):\n """Return offset of the container."""\n return self._offset\n\n def get_children(self):\n return [self.image]\n\n def get_bbox(self, renderer):\n dpi_cor = renderer.points_to_pixels(1.) if self._dpi_cor else 1.\n zoom = self.get_zoom()\n data = self.get_data()\n ny, nx = data.shape[:2]\n w, h = dpi_cor * nx * zoom, dpi_cor * ny * zoom\n return Bbox.from_bounds(0, 0, w, h)\n\n def draw(self, renderer):\n # docstring inherited\n self.image.draw(renderer)\n # bbox_artist(self, renderer, fill=False, props=dict(pad=0.))\n self.stale = False\n\n\nclass AnnotationBbox(martist.Artist, mtext._AnnotationBase):\n """\n Container for an `OffsetBox` referring to a specific position *xy*.\n\n Optionally an arrow pointing from the offsetbox to *xy* can be drawn.\n\n This is like `.Annotation`, but with `OffsetBox` instead of `.Text`.\n """\n\n zorder = 3\n\n def __str__(self):\n return f"AnnotationBbox({self.xy[0]:g},{self.xy[1]:g})"\n\n @_docstring.interpd\n def __init__(self, offsetbox, xy, xybox=None, xycoords='data', boxcoords=None, *,\n frameon=True, pad=0.4, # FancyBboxPatch boxstyle.\n annotation_clip=None,\n box_alignment=(0.5, 0.5),\n bboxprops=None,\n arrowprops=None,\n fontsize=None,\n **kwargs):\n """\n Parameters\n ----------\n offsetbox : `OffsetBox`\n\n xy : (float, float)\n The point *(x, y)* to annotate. The coordinate system is determined\n by *xycoords*.\n\n xybox : (float, float), default: *xy*\n The position *(x, y)* to place the text at. The coordinate system\n is determined by *boxcoords*.\n\n xycoords : single or two-tuple of str or `.Artist` or `.Transform` or \\ncallable, default: 'data'\n The coordinate system that *xy* is given in. 
See the parameter\n *xycoords* in `.Annotation` for a detailed description.\n\n boxcoords : single or two-tuple of str or `.Artist` or `.Transform` \\nor callable, default: value of *xycoords*\n The coordinate system that *xybox* is given in. See the parameter\n *textcoords* in `.Annotation` for a detailed description.\n\n frameon : bool, default: True\n By default, the text is surrounded by a white `.FancyBboxPatch`\n (accessible as the ``patch`` attribute of the `.AnnotationBbox`).\n If *frameon* is set to False, this patch is made invisible.\n\n annotation_clip: bool or None, default: None\n Whether to clip (i.e. not draw) the annotation when the annotation\n point *xy* is outside the Axes area.\n\n - If *True*, the annotation will be clipped when *xy* is outside\n the Axes.\n - If *False*, the annotation will always be drawn.\n - If *None*, the annotation will be clipped when *xy* is outside\n the Axes and *xycoords* is 'data'.\n\n pad : float, default: 0.4\n Padding around the offsetbox.\n\n box_alignment : (float, float)\n A tuple of two floats for a vertical and horizontal alignment of\n the offset box w.r.t. the *boxcoords*.\n The lower-left corner is (0, 0) and upper-right corner is (1, 1).\n\n bboxprops : dict, optional\n A dictionary of properties to set for the annotation bounding box,\n for example *boxstyle* and *alpha*. See `.FancyBboxPatch` for\n details.\n\n arrowprops: dict, optional\n Arrow properties, see `.Annotation` for description.\n\n fontsize: float or str, optional\n Translated to points and passed as *mutation_scale* into\n `.FancyBboxPatch` to scale attributes of the box style (e.g. pad\n or rounding_size). The name is chosen in analogy to `.Text` where\n *fontsize* defines the mutation scale as well. If not given,\n :rc:`legend.fontsize` is used. See `.Text.set_fontsize` for valid\n values.\n\n **kwargs\n Other `AnnotationBbox` properties. 
See `.AnnotationBbox.set` for\n a list.\n """\n\n martist.Artist.__init__(self)\n mtext._AnnotationBase.__init__(\n self, xy, xycoords=xycoords, annotation_clip=annotation_clip)\n\n self.offsetbox = offsetbox\n self.arrowprops = arrowprops.copy() if arrowprops is not None else None\n self.set_fontsize(fontsize)\n self.xybox = xybox if xybox is not None else xy\n self.boxcoords = boxcoords if boxcoords is not None else xycoords\n self._box_alignment = box_alignment\n\n if arrowprops is not None:\n self._arrow_relpos = self.arrowprops.pop("relpos", (0.5, 0.5))\n self.arrow_patch = FancyArrowPatch((0, 0), (1, 1),\n **self.arrowprops)\n else:\n self._arrow_relpos = None\n self.arrow_patch = None\n\n self.patch = FancyBboxPatch( # frame\n xy=(0.0, 0.0), width=1., height=1.,\n facecolor='w', edgecolor='k',\n mutation_scale=self.prop.get_size_in_points(),\n snap=True,\n visible=frameon,\n )\n self.patch.set_boxstyle("square", pad=pad)\n if bboxprops:\n self.patch.set(**bboxprops)\n\n self._internal_update(kwargs)\n\n @property\n def xyann(self):\n return self.xybox\n\n @xyann.setter\n def xyann(self, xyann):\n self.xybox = xyann\n self.stale = True\n\n @property\n def anncoords(self):\n return self.boxcoords\n\n @anncoords.setter\n def anncoords(self, coords):\n self.boxcoords = coords\n self.stale = True\n\n def contains(self, mouseevent):\n if self._different_canvas(mouseevent):\n return False, {}\n if not self._check_xy(None):\n return False, {}\n return self.offsetbox.contains(mouseevent)\n # self.arrow_patch is currently not checked as this can be a line - JJ\n\n def get_children(self):\n children = [self.offsetbox, self.patch]\n if self.arrow_patch:\n children.append(self.arrow_patch)\n return children\n\n def set_figure(self, fig):\n if self.arrow_patch is not None:\n self.arrow_patch.set_figure(fig)\n self.offsetbox.set_figure(fig)\n martist.Artist.set_figure(self, fig)\n\n def set_fontsize(self, s=None):\n """\n Set the fontsize in points.\n\n If *s* is not 
given, reset to :rc:`legend.fontsize`.\n """\n if s is None:\n s = mpl.rcParams["legend.fontsize"]\n\n self.prop = FontProperties(size=s)\n self.stale = True\n\n def get_fontsize(self):\n """Return the fontsize in points."""\n return self.prop.get_size_in_points()\n\n def get_window_extent(self, renderer=None):\n # docstring inherited\n if renderer is None:\n renderer = self.get_figure(root=True)._get_renderer()\n self.update_positions(renderer)\n return Bbox.union([child.get_window_extent(renderer)\n for child in self.get_children()])\n\n def get_tightbbox(self, renderer=None):\n # docstring inherited\n if renderer is None:\n renderer = self.get_figure(root=True)._get_renderer()\n self.update_positions(renderer)\n return Bbox.union([child.get_tightbbox(renderer)\n for child in self.get_children()])\n\n def update_positions(self, renderer):\n """Update pixel positions for the annotated point, the text, and the arrow."""\n\n ox0, oy0 = self._get_xy(renderer, self.xybox, self.boxcoords)\n bbox = self.offsetbox.get_bbox(renderer)\n fw, fh = self._box_alignment\n self.offsetbox.set_offset(\n (ox0 - fw*bbox.width - bbox.x0, oy0 - fh*bbox.height - bbox.y0))\n\n bbox = self.offsetbox.get_window_extent(renderer)\n self.patch.set_bounds(bbox.bounds)\n\n mutation_scale = renderer.points_to_pixels(self.get_fontsize())\n self.patch.set_mutation_scale(mutation_scale)\n\n if self.arrowprops:\n # Use FancyArrowPatch if self.arrowprops has "arrowstyle" key.\n\n # Adjust the starting point of the arrow relative to the textbox.\n # TODO: Rotation needs to be accounted.\n arrow_begin = bbox.p0 + bbox.size * self._arrow_relpos\n arrow_end = self._get_position_xy(renderer)\n # The arrow (from arrow_begin to arrow_end) will be first clipped\n # by patchA and patchB, then shrunk by shrinkA and shrinkB (in\n # points). 
            # If patch A is not set, self.bbox_patch is used.
            self.arrow_patch.set_positions(arrow_begin, arrow_end)

            if "mutation_scale" in self.arrowprops:
                mutation_scale = renderer.points_to_pixels(
                    self.arrowprops["mutation_scale"])
                # Else, use fontsize-based mutation_scale defined above.
                self.arrow_patch.set_mutation_scale(mutation_scale)

            patchA = self.arrowprops.get("patchA", self.patch)
            self.arrow_patch.set_patchA(patchA)

    def draw(self, renderer):
        # docstring inherited
        if not self.get_visible() or not self._check_xy(renderer):
            return
        renderer.open_group(self.__class__.__name__, gid=self.get_gid())
        self.update_positions(renderer)
        if self.arrow_patch is not None:
            if (self.arrow_patch.get_figure(root=False) is None and
                    (fig := self.get_figure(root=False)) is not None):
                self.arrow_patch.set_figure(fig)
            self.arrow_patch.draw(renderer)
        self.patch.draw(renderer)
        self.offsetbox.draw(renderer)
        renderer.close_group(self.__class__.__name__)
        self.stale = False


class DraggableBase:
    """
    Helper base class for a draggable artist (legend, offsetbox).

    Derived classes must override the following methods::

        def save_offset(self):
            '''
            Called when the object is picked for dragging; should save the
            reference position of the artist.
            '''

        def update_offset(self, dx, dy):
            '''
            Called during the dragging; (*dx*, *dy*) is the pixel offset from
            the point where the mouse drag started.
            '''

    Optionally, you may override the following method::

        def finalize_offset(self):
            '''Called when the mouse is released.'''

    In the current implementation of `.DraggableLegend` and
    `DraggableAnnotation`, `update_offset` places the artists in display
    coordinates, and `finalize_offset` recalculates their position in axes
    coordinates and sets a relevant attribute.
    """

    def __init__(self, ref_artist, use_blit=False):
        self.ref_artist = ref_artist
        if not ref_artist.pickable():
            ref_artist.set_picker(self._picker)
        self.got_artist = False
        self._use_blit = use_blit and self.canvas.supports_blit
        callbacks = self.canvas.callbacks
        self._disconnectors = [
            functools.partial(
                callbacks.disconnect, callbacks._connect_picklable(name, func))
            for name, func in [
                ("pick_event", self.on_pick),
                ("button_release_event", self.on_release),
                ("motion_notify_event", self.on_motion),
            ]
        ]

    @staticmethod
    def _picker(artist, mouseevent):
        # A custom picker to prevent dragging on mouse scroll events
        return (artist.contains(mouseevent) and mouseevent.name != "scroll_event"), {}

    # A property, not an attribute, to maintain picklability.
    canvas = property(lambda self: self.ref_artist.get_figure(root=True).canvas)
    cids = property(lambda self: [
        disconnect.args[0] for disconnect in self._disconnectors[:2]])

    def on_motion(self, evt):
        if self._check_still_parented() and self.got_artist:
            dx = evt.x - self.mouse_x
            dy = evt.y - self.mouse_y
            self.update_offset(dx, dy)
            if self._use_blit:
                self.canvas.restore_region(self.background)
                self.ref_artist.draw(
                    self.ref_artist.get_figure(root=True)._get_renderer())
                self.canvas.blit()
            else:
                self.canvas.draw()

    def on_pick(self, evt):
        if self._check_still_parented():
            if evt.artist == self.ref_artist:
                self.mouse_x = evt.mouseevent.x
                self.mouse_y = evt.mouseevent.y
                self.save_offset()
                self.got_artist = True
            if self.got_artist and self._use_blit:
                self.ref_artist.set_animated(True)
                self.canvas.draw()
                fig = self.ref_artist.get_figure(root=False)
                self.background = self.canvas.copy_from_bbox(fig.bbox)
                self.ref_artist.draw(fig._get_renderer())
                self.canvas.blit()

    def on_release(self, event):
        if self._check_still_parented() and self.got_artist:
            self.finalize_offset()
            self.got_artist = False
            if self._use_blit:
                self.canvas.restore_region(self.background)
                self.ref_artist.draw(self.ref_artist.figure._get_renderer())
                self.canvas.blit()
                self.ref_artist.set_animated(False)

    def _check_still_parented(self):
        if self.ref_artist.get_figure(root=False) is None:
            self.disconnect()
            return False
        else:
            return True

    def disconnect(self):
        """Disconnect the callbacks."""
        for disconnector in self._disconnectors:
            disconnector()

    def save_offset(self):
        pass

    def update_offset(self, dx, dy):
        pass

    def finalize_offset(self):
        pass


class DraggableOffsetBox(DraggableBase):
    def __init__(self, ref_artist, offsetbox, use_blit=False):
        super().__init__(ref_artist, use_blit=use_blit)
        self.offsetbox = offsetbox

    def save_offset(self):
        offsetbox = self.offsetbox
        renderer = offsetbox.get_figure(root=True)._get_renderer()
        offset = offsetbox.get_offset(offsetbox.get_bbox(renderer), renderer)
        self.offsetbox_x, self.offsetbox_y = offset
        self.offsetbox.set_offset(offset)

    def update_offset(self, dx, dy):
        loc_in_canvas = self.offsetbox_x + dx, self.offsetbox_y + dy
        self.offsetbox.set_offset(loc_in_canvas)

    def get_loc_in_canvas(self):
        offsetbox = self.offsetbox
        renderer = offsetbox.get_figure(root=True)._get_renderer()
        bbox = offsetbox.get_bbox(renderer)
        ox, oy = offsetbox._offset
        loc_in_canvas = (ox + bbox.x0, oy + bbox.y0)
        return loc_in_canvas


class DraggableAnnotation(DraggableBase):
    def __init__(self, annotation, use_blit=False):
        super().__init__(annotation, use_blit=use_blit)
        self.annotation = annotation

    def save_offset(self):
        ann = self.annotation
        self.ox, self.oy = ann.get_transform().transform(ann.xyann)

    def update_offset(self, dx, dy):
        ann = self.annotation
        ann.xyann = ann.get_transform().inverted().transform(
            (self.ox + dx, self.oy + dy))

# File: .venv\Lib\site-packages\matplotlib\offsetbox.py (Python, MIT license)
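The `DraggableBase` contract above (override `save_offset`/`update_offset`, optionally `finalize_offset`) is what `DraggableAnnotation` implements, and it is what `Annotation.draggable()` wires up for you. A minimal usage sketch, not part of this module, using a headless Agg backend and illustrative coordinates:

```python
import matplotlib
matplotlib.use("Agg")  # headless backend, so no GUI is needed for the sketch
import matplotlib.pyplot as plt

fig, ax = plt.subplots()
ann = ax.annotate("drag me", xy=(0.5, 0.5), xytext=(0.2, 0.8),
                  arrowprops=dict(arrowstyle="->"))
# Annotation.draggable() wraps the annotation in a DraggableAnnotation,
# whose save_offset/update_offset come from the DraggableBase protocol above.
dragger = ann.draggable()
```

In an interactive backend, picking the annotation fires `on_pick` (which calls `save_offset`), mouse motion fires `on_motion` (which calls `update_offset` with pixel deltas), and release fires `on_release`.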
import matplotlib.artist as martist
from matplotlib.backend_bases import RendererBase, Event, FigureCanvasBase
from matplotlib.colors import Colormap, Normalize
import matplotlib.text as mtext
from matplotlib.figure import Figure, SubFigure
from matplotlib.font_manager import FontProperties
from matplotlib.image import BboxImage
from matplotlib.patches import FancyArrowPatch, FancyBboxPatch
from matplotlib.transforms import Bbox, BboxBase, Transform
from matplotlib.typing import CoordsType

import numpy as np
from numpy.typing import ArrayLike
from collections.abc import Callable, Sequence
from typing import Any, Literal, overload

DEBUG: bool

def _get_packed_offsets(
    widths: Sequence[float],
    total: float | None,
    sep: float | None,
    mode: Literal["fixed", "expand", "equal"] = ...,
) -> tuple[float, np.ndarray]: ...

class OffsetBox(martist.Artist):
    width: float | None
    height: float | None
    def __init__(self, *args, **kwargs) -> None: ...
    def set_figure(self, fig: Figure | SubFigure) -> None: ...
    def set_offset(
        self,
        xy: tuple[float, float]
        | Callable[[float, float, float, float, RendererBase], tuple[float, float]],
    ) -> None: ...

    @overload
    def get_offset(self, bbox: Bbox, renderer: RendererBase) -> tuple[float, float]: ...
    @overload
    def get_offset(
        self,
        width: float,
        height: float,
        xdescent: float,
        ydescent: float,
        renderer: RendererBase
    ) -> tuple[float, float]: ...

    def set_width(self, width: float) -> None: ...
    def set_height(self, height: float) -> None: ...
    def get_visible_children(self) -> list[martist.Artist]: ...
    def get_children(self) -> list[martist.Artist]: ...
    def get_bbox(self, renderer: RendererBase) -> Bbox: ...
    def get_window_extent(self, renderer: RendererBase | None = ...) -> Bbox: ...

class PackerBase(OffsetBox):
    height: float | None
    width: float | None
    sep: float | None
    pad: float | None
    mode: Literal["fixed", "expand", "equal"]
    align: Literal["top", "bottom", "left", "right", "center", "baseline"]
    def __init__(
        self,
        pad: float | None = ...,
        sep: float | None = ...,
        width: float | None = ...,
        height: float | None = ...,
        align: Literal["top", "bottom", "left", "right", "center", "baseline"] = ...,
        mode: Literal["fixed", "expand", "equal"] = ...,
        children: list[martist.Artist] | None = ...,
    ) -> None: ...

class VPacker(PackerBase): ...
class HPacker(PackerBase): ...

class PaddedBox(OffsetBox):
    pad: float | None
    patch: FancyBboxPatch
    def __init__(
        self,
        child: martist.Artist,
        pad: float | None = ...,
        *,
        draw_frame: bool = ...,
        patch_attrs: dict[str, Any] | None = ...,
    ) -> None: ...
    def update_frame(self, bbox: Bbox, fontsize: float | None = ...) -> None: ...
    def draw_frame(self, renderer: RendererBase) -> None: ...

class DrawingArea(OffsetBox):
    width: float
    height: float
    xdescent: float
    ydescent: float
    offset_transform: Transform
    dpi_transform: Transform
    def __init__(
        self,
        width: float,
        height: float,
        xdescent: float = ...,
        ydescent: float = ...,
        clip: bool = ...,
    ) -> None: ...
    @property
    def clip_children(self) -> bool: ...
    @clip_children.setter
    def clip_children(self, val: bool) -> None: ...
    def get_transform(self) -> Transform: ...

    # does not accept all options of superclass
    def set_offset(self, xy: tuple[float, float]) -> None: ...  # type: ignore[override]
    def get_offset(self) -> tuple[float, float]: ...  # type: ignore[override]
    def add_artist(self, a: martist.Artist) -> None: ...

class TextArea(OffsetBox):
    offset_transform: Transform
    def __init__(
        self,
        s: str,
        *,
        textprops: dict[str, Any] | None = ...,
        multilinebaseline: bool = ...,
    ) -> None: ...
    def set_text(self, s: str) -> None: ...
    def get_text(self) -> str: ...
    def set_multilinebaseline(self, t: bool) -> None: ...
    def get_multilinebaseline(self) -> bool: ...

    # does not accept all options of superclass
    def set_offset(self, xy: tuple[float, float]) -> None: ...  # type: ignore[override]
    def get_offset(self) -> tuple[float, float]: ...  # type: ignore[override]

class AuxTransformBox(OffsetBox):
    aux_transform: Transform
    offset_transform: Transform
    ref_offset_transform: Transform
    def __init__(self, aux_transform: Transform) -> None: ...
    def add_artist(self, a: martist.Artist) -> None: ...
    def get_transform(self) -> Transform: ...

    # does not accept all options of superclass
    def set_offset(self, xy: tuple[float, float]) -> None: ...  # type: ignore[override]
    def get_offset(self) -> tuple[float, float]: ...  # type: ignore[override]

class AnchoredOffsetbox(OffsetBox):
    zorder: float
    codes: dict[str, int]
    loc: int
    borderpad: float
    pad: float
    prop: FontProperties
    patch: FancyBboxPatch
    def __init__(
        self,
        loc: str,
        *,
        pad: float = ...,
        borderpad: float = ...,
        child: OffsetBox | None = ...,
        prop: FontProperties | None = ...,
        frameon: bool = ...,
        bbox_to_anchor: BboxBase
        | tuple[float, float]
        | tuple[float, float, float, float]
        | None = ...,
        bbox_transform: Transform | None = ...,
        **kwargs
    ) -> None: ...
    def set_child(self, child: OffsetBox | None) -> None: ...
    def get_child(self) -> OffsetBox | None: ...
    def get_children(self) -> list[martist.Artist]: ...
    def get_bbox_to_anchor(self) -> Bbox: ...
    def set_bbox_to_anchor(
        self, bbox: BboxBase, transform: Transform | None = ...
    ) -> None: ...
    def update_frame(self, bbox: Bbox, fontsize: float | None = ...) -> None: ...

class AnchoredText(AnchoredOffsetbox):
    txt: TextArea
    def __init__(
        self,
        s: str,
        loc: str,
        *,
        pad: float = ...,
        borderpad: float = ...,
        prop: dict[str, Any] | None = ...,
        **kwargs
    ) -> None: ...

class OffsetImage(OffsetBox):
    image: BboxImage
    def __init__(
        self,
        arr: ArrayLike,
        *,
        zoom: float = ...,
        cmap: Colormap | str | None = ...,
        norm: Normalize | str | None = ...,
        interpolation: str | None = ...,
        origin: Literal["upper", "lower"] | None = ...,
        filternorm: bool = ...,
        filterrad: float = ...,
        resample: bool = ...,
        dpi_cor: bool = ...,
        **kwargs
    ) -> None: ...
    stale: bool
    def set_data(self, arr: ArrayLike | None) -> None: ...
    def get_data(self) -> ArrayLike | None: ...
    def set_zoom(self, zoom: float) -> None: ...
    def get_zoom(self) -> float: ...
    def get_children(self) -> list[martist.Artist]: ...
    def get_offset(self) -> tuple[float, float]: ...  # type: ignore[override]

class AnnotationBbox(martist.Artist, mtext._AnnotationBase):
    zorder: float
    offsetbox: OffsetBox
    arrowprops: dict[str, Any] | None
    xybox: tuple[float, float]
    boxcoords: CoordsType
    arrow_patch: FancyArrowPatch | None
    patch: FancyBboxPatch
    prop: FontProperties
    def __init__(
        self,
        offsetbox: OffsetBox,
        xy: tuple[float, float],
        xybox: tuple[float, float] | None = ...,
        xycoords: CoordsType = ...,
        boxcoords: CoordsType | None = ...,
        *,
        frameon: bool = ...,
        pad: float = ...,
        annotation_clip: bool | None = ...,
        box_alignment: tuple[float, float] = ...,
        bboxprops: dict[str, Any] | None = ...,
        arrowprops: dict[str, Any] | None = ...,
        fontsize: float | str | None = ...,
        **kwargs
    ) -> None: ...
    @property
    def xyann(self) -> tuple[float, float]: ...
    @xyann.setter
    def xyann(self, xyann: tuple[float, float]) -> None: ...
    @property
    def anncoords(
        self,
    ) -> CoordsType: ...
    @anncoords.setter
    def anncoords(
        self,
        coords: CoordsType,
    ) -> None: ...
    def get_children(self) -> list[martist.Artist]: ...
    def set_figure(self, fig: Figure | SubFigure) -> None: ...
    def set_fontsize(self, s: str | float | None = ...) -> None: ...
    def get_fontsize(self) -> float: ...
    def get_tightbbox(self, renderer: RendererBase | None = ...) -> Bbox: ...
    def update_positions(self, renderer: RendererBase) -> None: ...

class DraggableBase:
    ref_artist: martist.Artist
    got_artist: bool
    mouse_x: int
    mouse_y: int
    background: Any

    @property
    def canvas(self) -> FigureCanvasBase: ...
    @property
    def cids(self) -> list[int]: ...

    def __init__(self, ref_artist: martist.Artist, use_blit: bool = ...) -> None: ...
    def on_motion(self, evt: Event) -> None: ...
    def on_pick(self, evt: Event) -> None: ...
    def on_release(self, event: Event) -> None: ...
    def disconnect(self) -> None: ...
    def save_offset(self) -> None: ...
    def update_offset(self, dx: float, dy: float) -> None: ...
    def finalize_offset(self) -> None: ...

class DraggableOffsetBox(DraggableBase):
    offsetbox: OffsetBox
    def __init__(
        self, ref_artist: martist.Artist, offsetbox: OffsetBox, use_blit: bool = ...
    ) -> None: ...
    def save_offset(self) -> None: ...
    def update_offset(self, dx: float, dy: float) -> None: ...
    def get_loc_in_canvas(self) -> tuple[float, float]: ...

class DraggableAnnotation(DraggableBase):
    annotation: mtext.Annotation
    def __init__(self, annotation: mtext.Annotation, use_blit: bool = ...) -> None: ...
    def save_offset(self) -> None: ...
    def update_offset(self, dx: float, dy: float) -> None: ...

# File: .venv\Lib\site-packages\matplotlib\offsetbox.pyi (type stubs, MIT license)
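The stubs above declare the public layout classes (`TextArea`, `VPacker`/`HPacker`, `AnnotationBbox`). A minimal sketch, not part of the stub file, stacking two text areas and anchoring them in data coordinates; the backend choice and the coordinates are illustrative assumptions:

```python
import matplotlib
matplotlib.use("Agg")  # headless backend so the sketch runs without a display
import matplotlib.pyplot as plt
from matplotlib.offsetbox import TextArea, VPacker, AnnotationBbox

fig, ax = plt.subplots()
# VPacker stacks its children vertically; pad/sep are in points.
box = VPacker(children=[TextArea("line 1"), TextArea("line 2")],
              align="left", pad=2, sep=4)
# AnnotationBbox places the offset box at a point in data coordinates.
ab = AnnotationBbox(box, (0.5, 0.5), frameon=True)
ax.add_artist(ab)
fig.canvas.draw()  # triggers layout of the offset boxes
```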
from . import artist
from .axes import Axes
from .backend_bases import RendererBase, MouseEvent
from .path import Path
from .transforms import Transform, Bbox

from typing import Any, Literal, overload

import numpy as np
from numpy.typing import ArrayLike
from .typing import ColorType, LineStyleType, CapStyleType, JoinStyleType

class Patch(artist.Artist):
    zorder: float
    def __init__(
        self,
        *,
        edgecolor: ColorType | None = ...,
        facecolor: ColorType | None = ...,
        color: ColorType | None = ...,
        linewidth: float | None = ...,
        linestyle: LineStyleType | None = ...,
        antialiased: bool | None = ...,
        hatch: str | None = ...,
        fill: bool = ...,
        capstyle: CapStyleType | None = ...,
        joinstyle: JoinStyleType | None = ...,
        **kwargs,
    ) -> None: ...
    def get_verts(self) -> ArrayLike: ...
    def contains(self, mouseevent: MouseEvent, radius: float | None = None) -> tuple[bool, dict[Any, Any]]: ...
    def contains_point(
        self, point: tuple[float, float], radius: float | None = ...
    ) -> bool: ...
    def contains_points(
        self, points: ArrayLike, radius: float | None = ...
    ) -> np.ndarray: ...
    def get_extents(self) -> Bbox: ...
    def get_transform(self) -> Transform: ...
    def get_data_transform(self) -> Transform: ...
    def get_patch_transform(self) -> Transform: ...
    def get_antialiased(self) -> bool: ...
    def get_edgecolor(self) -> ColorType: ...
    def get_facecolor(self) -> ColorType: ...
    def get_linewidth(self) -> float: ...
    def get_linestyle(self) -> LineStyleType: ...
    def set_antialiased(self, aa: bool | None) -> None: ...
    def set_edgecolor(self, color: ColorType | None) -> None: ...
    def set_facecolor(self, color: ColorType | None) -> None: ...
    def set_color(self, c: ColorType | None) -> None: ...
    def set_alpha(self, alpha: float | None) -> None: ...
    def set_linewidth(self, w: float | None) -> None: ...
    def set_linestyle(self, ls: LineStyleType | None) -> None: ...
    def set_fill(self, b: bool) -> None: ...
    def get_fill(self) -> bool: ...
    fill = property(get_fill, set_fill)
    def set_capstyle(self, s: CapStyleType) -> None: ...
    def get_capstyle(self) -> Literal["butt", "projecting", "round"]: ...
    def set_joinstyle(self, s: JoinStyleType) -> None: ...
    def get_joinstyle(self) -> Literal["miter", "round", "bevel"]: ...
    def set_hatch(self, hatch: str) -> None: ...
    def set_hatch_linewidth(self, lw: float) -> None: ...
    def get_hatch_linewidth(self) -> float: ...
    def get_hatch(self) -> str: ...
    def get_path(self) -> Path: ...

class Shadow(Patch):
    patch: Patch
    def __init__(self, patch: Patch, ox: float, oy: float, *, shade: float = ..., **kwargs) -> None: ...

class Rectangle(Patch):
    angle: float
    def __init__(
        self,
        xy: tuple[float, float],
        width: float,
        height: float,
        *,
        angle: float = ...,
        rotation_point: Literal["xy", "center"] | tuple[float, float] = ...,
        **kwargs,
    ) -> None: ...
    @property
    def rotation_point(self) -> Literal["xy", "center"] | tuple[float, float]: ...
    @rotation_point.setter
    def rotation_point(
        self, value: Literal["xy", "center"] | tuple[float, float]
    ) -> None: ...
    def get_x(self) -> float: ...
    def get_y(self) -> float: ...
    def get_xy(self) -> tuple[float, float]: ...
    def get_corners(self) -> np.ndarray: ...
    def get_center(self) -> np.ndarray: ...
    def get_width(self) -> float: ...
    def get_height(self) -> float: ...
    def get_angle(self) -> float: ...
    def set_x(self, x: float) -> None: ...
    def set_y(self, y: float) -> None: ...
    def set_angle(self, angle: float) -> None: ...
    def set_xy(self, xy: tuple[float, float]) -> None: ...
    def set_width(self, w: float) -> None: ...
    def set_height(self, h: float) -> None: ...
    @overload
    def set_bounds(self, args: tuple[float, float, float, float], /) -> None: ...
    @overload
    def set_bounds(
        self, left: float, bottom: float, width: float, height: float, /
    ) -> None: ...
    def get_bbox(self) -> Bbox: ...
    xy = property(get_xy, set_xy)

class RegularPolygon(Patch):
    xy: tuple[float, float]
    numvertices: int
    orientation: float
    radius: float
    def __init__(
        self,
        xy: tuple[float, float],
        numVertices: int,
        *,
        radius: float = ...,
        orientation: float = ...,
        **kwargs,
    ) -> None: ...

class PathPatch(Patch):
    def __init__(self, path: Path, **kwargs) -> None: ...
    def set_path(self, path: Path) -> None: ...

class StepPatch(PathPatch):
    orientation: Literal["vertical", "horizontal"]
    def __init__(
        self,
        values: ArrayLike,
        edges: ArrayLike,
        *,
        orientation: Literal["vertical", "horizontal"] = ...,
        baseline: float = ...,
        **kwargs,
    ) -> None: ...

    # NamedTuple StairData, defined in body of method
    def get_data(self) -> tuple[np.ndarray, np.ndarray, float]: ...
    def set_data(
        self,
        values: ArrayLike | None = ...,
        edges: ArrayLike | None = ...,
        baseline: float | None = ...,
    ) -> None: ...

class Polygon(Patch):
    def __init__(self, xy: ArrayLike, *, closed: bool = ..., **kwargs) -> None: ...
    def get_closed(self) -> bool: ...
    def set_closed(self, closed: bool) -> None: ...
    def get_xy(self) -> np.ndarray: ...
    def set_xy(self, xy: ArrayLike) -> None: ...
    xy = property(get_xy, set_xy)

class Wedge(Patch):
    center: tuple[float, float]
    r: float
    theta1: float
    theta2: float
    width: float | None
    def __init__(
        self,
        center: tuple[float, float],
        r: float,
        theta1: float,
        theta2: float,
        *,
        width: float | None = ...,
        **kwargs,
    ) -> None: ...
    def set_center(self, center: tuple[float, float]) -> None: ...
    def set_radius(self, radius: float) -> None: ...
    def set_theta1(self, theta1: float) -> None: ...
    def set_theta2(self, theta2: float) -> None: ...
    def set_width(self, width: float | None) -> None: ...

class Arrow(Patch):
    def __init__(
        self, x: float, y: float, dx: float, dy: float, *, width: float = ..., **kwargs
    ) -> None: ...
    def set_data(
        self,
        x: float | None = ...,
        y: float | None = ...,
        dx: float | None = ...,
        dy: float | None = ...,
        width: float | None = ...,
    ) -> None: ...

class FancyArrow(Polygon):
    def __init__(
        self,
        x: float,
        y: float,
        dx: float,
        dy: float,
        *,
        width: float = ...,
        length_includes_head: bool = ...,
        head_width: float | None = ...,
        head_length: float | None = ...,
        shape: Literal["full", "left", "right"] = ...,
        overhang: float = ...,
        head_starts_at_zero: bool = ...,
        **kwargs,
    ) -> None: ...
    def set_data(
        self,
        *,
        x: float | None = ...,
        y: float | None = ...,
        dx: float | None = ...,
        dy: float | None = ...,
        width: float | None = ...,
        head_width: float | None = ...,
        head_length: float | None = ...,
    ) -> None: ...

class CirclePolygon(RegularPolygon):
    def __init__(
        self,
        xy: tuple[float, float],
        radius: float = ...,
        *,
        resolution: int = ...,
        **kwargs,
    ) -> None: ...

class Ellipse(Patch):
    def __init__(
        self,
        xy: tuple[float, float],
        width: float,
        height: float,
        *,
        angle: float = ...,
        **kwargs,
    ) -> None: ...
    def set_center(self, xy: tuple[float, float]) -> None: ...
    def get_center(self) -> float: ...
    center = property(get_center, set_center)

    def set_width(self, width: float) -> None: ...
    def get_width(self) -> float: ...
    width = property(get_width, set_width)

    def set_height(self, height: float) -> None: ...
    def get_height(self) -> float: ...
    height = property(get_height, set_height)

    def set_angle(self, angle: float) -> None: ...
    def get_angle(self) -> float: ...
    angle = property(get_angle, set_angle)

    def get_corners(self) -> np.ndarray: ...

    def get_vertices(self) -> list[tuple[float, float]]: ...
    def get_co_vertices(self) -> list[tuple[float, float]]: ...

class Annulus(Patch):
    a: float
    b: float
    def __init__(
        self,
        xy: tuple[float, float],
        r: float | tuple[float, float],
        width: float,
        angle: float = ...,
        **kwargs,
    ) -> None: ...
    def set_center(self, xy: tuple[float, float]) -> None: ...
    def get_center(self) -> tuple[float, float]: ...
    center = property(get_center, set_center)

    def set_width(self, width: float) -> None: ...
    def get_width(self) -> float: ...
    width = property(get_width, set_width)

    def set_angle(self, angle: float) -> None: ...
    def get_angle(self) -> float: ...
    angle = property(get_angle, set_angle)

    def set_semimajor(self, a: float) -> None: ...
    def set_semiminor(self, b: float) -> None: ...
    def set_radii(self, r: float | tuple[float, float]) -> None: ...
    def get_radii(self) -> tuple[float, float]: ...
    radii = property(get_radii, set_radii)

class Circle(Ellipse):
    def __init__(
        self, xy: tuple[float, float], radius: float = ..., **kwargs
    ) -> None: ...
    def set_radius(self, radius: float) -> None: ...
    def get_radius(self) -> float: ...
    radius = property(get_radius, set_radius)

class Arc(Ellipse):
    theta1: float
    theta2: float
    def __init__(
        self,
        xy: tuple[float, float],
        width: float,
        height: float,
        *,
        angle: float = ...,
        theta1: float = ...,
        theta2: float = ...,
        **kwargs,
    ) -> None: ...

def bbox_artist(
    artist: artist.Artist,
    renderer: RendererBase,
    props: dict[str, Any] | None = ...,
    fill: bool = ...,
) -> None: ...
def draw_bbox(
    bbox: Bbox,
    renderer: RendererBase,
    color: ColorType = ...,
    trans: Transform | None = ...,
) -> None: ...

class _Style:
    def __new__(cls, stylename, **kwargs): ...
    @classmethod
    def get_styles(cls) -> dict[str, type]: ...
    @classmethod
    def pprint_styles(cls) -> str: ...
    @classmethod
    def register(cls, name: str, style: type) -> None: ...

class BoxStyle(_Style):
    class Square(BoxStyle):
        pad: float
        def __init__(self, pad: float = ...) -> None: ...
        def __call__(
            self,
            x0: float,
            y0: float,
            width: float,
            height: float,
            mutation_size: float,
        ) -> Path: ...

    class Circle(BoxStyle):
        pad: float
        def __init__(self, pad: float = ...) -> None: ...
        def __call__(
            self,
            x0: float,
            y0: float,
            width: float,
            height: float,
            mutation_size: float,
        ) -> Path: ...

    class Ellipse(BoxStyle):
        pad: float
        def __init__(self, pad: float = ...) -> None: ...
        def __call__(
            self,
            x0: float,
            y0: float,
            width: float,
            height: float,
            mutation_size: float,
        ) -> Path: ...

    class LArrow(BoxStyle):
        pad: float
        def __init__(self, pad: float = ...) -> None: ...
        def __call__(
            self,
            x0: float,
            y0: float,
            width: float,
            height: float,
            mutation_size: float,
        ) -> Path: ...

    class RArrow(LArrow):
        def __call__(
            self,
            x0: float,
            y0: float,
            width: float,
            height: float,
            mutation_size: float,
        ) -> Path: ...

    class DArrow(BoxStyle):
        pad: float
        def __init__(self, pad: float = ...) -> None: ...
        def __call__(
            self,
            x0: float,
            y0: float,
            width: float,
            height: float,
            mutation_size: float,
        ) -> Path: ...

    class Round(BoxStyle):
        pad: float
        rounding_size: float | None
        def __init__(
            self, pad: float = ..., rounding_size: float | None = ...
        ) -> None: ...
        def __call__(
            self,
            x0: float,
            y0: float,
            width: float,
            height: float,
            mutation_size: float,
        ) -> Path: ...

    class Round4(BoxStyle):
        pad: float
        rounding_size: float | None
        def __init__(
            self, pad: float = ..., rounding_size: float | None = ...
        ) -> None: ...
        def __call__(
            self,
            x0: float,
            y0: float,
            width: float,
            height: float,
            mutation_size: float,
        ) -> Path: ...

    class Sawtooth(BoxStyle):
        pad: float
        tooth_size: float | None
        def __init__(
            self, pad: float = ..., tooth_size: float | None = ...
        ) -> None: ...
        def __call__(
            self,
            x0: float,
            y0: float,
            width: float,
            height: float,
            mutation_size: float,
        ) -> Path: ...

    class Roundtooth(Sawtooth):
        def __call__(
            self,
            x0: float,
            y0: float,
            width: float,
            height: float,
            mutation_size: float,
        ) -> Path: ...

class ConnectionStyle(_Style):
    class _Base(ConnectionStyle):
        def __call__(
            self,
            posA: tuple[float, float],
            posB: tuple[float, float],
            shrinkA: float = ...,
            shrinkB: float = ...,
            patchA: Patch | None = ...,
            patchB: Patch | None = ...,
        ) -> Path: ...

    class Arc3(_Base):
        rad: float
        def __init__(self, rad: float = ...) -> None: ...
        def connect(
            self, posA: tuple[float, float], posB: tuple[float, float]
        ) -> Path: ...

    class Angle3(_Base):
        angleA: float
        angleB: float
        def __init__(self, angleA: float = ..., angleB: float = ...) -> None: ...
        def connect(
            self, posA: tuple[float, float], posB: tuple[float, float]
        ) -> Path: ...

    class Angle(_Base):
        angleA: float
        angleB: float
        rad: float
        def __init__(
            self, angleA: float = ..., angleB: float = ..., rad: float = ...
        ) -> None: ...
        def connect(
            self, posA: tuple[float, float], posB: tuple[float, float]
        ) -> Path: ...

    class Arc(_Base):
        angleA: float
        angleB: float
        armA: float | None
        armB: float | None
        rad: float
        def __init__(
            self,
            angleA: float = ...,
            angleB: float = ...,
            armA: float | None = ...,
            armB: float | None = ...,
            rad: float = ...,
        ) -> None: ...
        def connect(
            self, posA: tuple[float, float], posB: tuple[float, float]
        ) -> Path: ...

    class Bar(_Base):
        armA: float
        armB: float
        fraction: float
        angle: float | None
        def __init__(
            self,
            armA: float = ...,
            armB: float = ...,
            fraction: float = ...,
            angle: float | None = ...,
        ) -> None: ...
        def connect(
            self, posA: tuple[float, float], posB: tuple[float, float]
        ) -> Path: ...

class ArrowStyle(_Style):
    class _Base(ArrowStyle):
        @staticmethod
        def ensure_quadratic_bezier(path: Path) -> list[float]: ...
        def transmute(
            self, path: Path, mutation_size: float, linewidth: float
        ) -> tuple[Path, bool]: ...
        def __call__(
            self,
            path: Path,
            mutation_size: float,
            linewidth: float,
            aspect_ratio: float = ...,
        ) -> tuple[Path, bool]: ...

    class _Curve(_Base):
        arrow: str
        fillbegin: bool
        fillend: bool
        def __init__(
            self,
            head_length: float = ...,
            head_width: float = ...,
            widthA: float = ...,
            widthB: float = ...,
            lengthA: float = ...,
            lengthB: float = ...,
            angleA: float | None = ...,
            angleB: float | None = ...,
            scaleA: float | None = ...,
            scaleB: float | None = ...,
        ) -> None: ...

    class Curve(_Curve):
        def __init__(self) -> None: ...

    class CurveA(_Curve):
        arrow: str

    class CurveB(_Curve):
        arrow: str

    class CurveAB(_Curve):
        arrow: str

    class CurveFilledA(_Curve):
        arrow: str

    class CurveFilledB(_Curve):
        arrow: str

    class CurveFilledAB(_Curve):
        arrow: str

    class BracketA(_Curve):
        arrow: str
        def __init__(
            self, widthA: float = ..., lengthA: float = ..., angleA: float = ...
        ) -> None: ...

    class BracketB(_Curve):
        arrow: str
        def __init__(
            self, widthB: float = ..., lengthB: float = ..., angleB: float = ...
        ) -> None: ...

    class BracketAB(_Curve):
        arrow: str
        def __init__(
            self,
            widthA: float = ...,
            lengthA: float = ...,
            angleA: float = ...,
            widthB: float = ...,
            lengthB: float = ...,
            angleB: float = ...,
        ) -> None: ...

    class BarAB(_Curve):
        arrow: str
        def __init__(
            self,
            widthA: float = ...,
            angleA: float = ...,
            widthB: float = ...,
            angleB: float = ...,
        ) -> None: ...

    class BracketCurve(_Curve):
        arrow: str
        def __init__(
            self, widthA: float = ..., lengthA: float = ..., angleA: float | None = ...
        ) -> None: ...

    class CurveBracket(_Curve):
        arrow: str
        def __init__(
            self, widthB: float = ..., lengthB: float = ..., angleB: float | None = ...
        ) -> None: ...

    class Simple(_Base):
        def __init__(
            self,
            head_length: float = ...,
            head_width: float = ...,
            tail_width: float = ...,
        ) -> None: ...

    class Fancy(_Base):
        def __init__(
            self,
            head_length: float = ...,
            head_width: float = ...,
            tail_width: float = ...,
        ) -> None: ...

    class Wedge(_Base):
        tail_width: float
        shrink_factor: float
        def __init__(
            self, tail_width: float = ..., shrink_factor: float = ...
        ) -> None: ...

class FancyBboxPatch(Patch):
    def __init__(
        self,
        xy: tuple[float, float],
        width: float,
        height: float,
        boxstyle: str | BoxStyle = ...,
        *,
        mutation_scale: float = ...,
        mutation_aspect: float = ...,
        **kwargs,
    ) -> None: ...
    def set_boxstyle(self, boxstyle: str | BoxStyle | None = ..., **kwargs) -> None: ...
    def get_boxstyle(self) -> BoxStyle: ...
    def set_mutation_scale(self, scale: float) -> None: ...
    def get_mutation_scale(self) -> float: ...
    def set_mutation_aspect(self, aspect: float) -> None: ...
    def get_mutation_aspect(self) -> float: ...
    def get_x(self) -> float: ...
    def get_y(self) -> float: ...
    def get_width(self) -> float: ...
    def get_height(self) -> float: ...
    def set_x(self, x: float) -> None: ...
    def set_y(self, y: float) -> None: ...
    def set_width(self, w: float) -> None: ...
    def set_height(self, h: float) -> None: ...
    @overload
    def set_bounds(self, args: tuple[float, float, float, float], /) -> None: ...
    @overload
    def set_bounds(
        self, left: float, bottom: float, width: float, height: float, /
    ) -> None: ...
    def get_bbox(self) -> Bbox: ...

class FancyArrowPatch(Patch):
    patchA: Patch
    patchB: Patch
    shrinkA: float
    shrinkB: float
    def __init__(
        self,
        posA: tuple[float, float] | None = ...,
        posB: tuple[float, float] | None = ...,
        *,
        path: Path | None = ...,
        arrowstyle: str | ArrowStyle = ...,
        connectionstyle: str | ConnectionStyle = ...,
        patchA: Patch | None = ...,
        patchB: Patch | None = ...,
        shrinkA: float = ...,
        shrinkB: float = ...,
        mutation_scale: float = ...,
        mutation_aspect: float | None = ...,
        **kwargs,
    ) -> None: ...
    def set_positions(
        self, posA: tuple[float, float], posB: tuple[float, float]
    ) -> None: ...
    def set_patchA(self, patchA: Patch) -> None: ...
    def set_patchB(self, patchB: Patch) -> None: ...
    def set_connectionstyle(self, connectionstyle: str | ConnectionStyle | None = ..., **kwargs) -> None: ...
    def get_connectionstyle(self) -> ConnectionStyle: ...
    def set_arrowstyle(self, arrowstyle: str | ArrowStyle | None = ..., **kwargs) -> None: ...
    def get_arrowstyle(self) -> ArrowStyle: ...
    def set_mutation_scale(self, scale: float) -> None: ...
    def get_mutation_scale(self) -> float: ...
    def set_mutation_aspect(self, aspect: float | None) -> None: ...
    def get_mutation_aspect(self) -> float: ...

class ConnectionPatch(FancyArrowPatch):
    xy1: tuple[float, float]
    xy2: tuple[float, float]
    coords1: str | Transform
    coords2: str | Transform | None
    axesA: Axes | None
    axesB: Axes | None
    def __init__(
        self,
        xyA: tuple[float, float],
        xyB: tuple[float, float],
        coordsA: str | Transform,
        coordsB: str | Transform | None = ...,
        *,
        axesA: Axes | None = ...,
        axesB: Axes | None = ...,
        arrowstyle: str | ArrowStyle = ...,
        connectionstyle: str | ConnectionStyle = ...,
        patchA: Patch | None = ...,
        patchB: Patch | None = ...,
        shrinkA: float = ...,
        shrinkB: float = ...,
        mutation_scale: float = ...,
        mutation_aspect: float | None = ...,
        clip_on: bool = ...,
        **kwargs,
    ) -> None: ...
    def set_annotation_clip(self, b: bool | None) -> None: ...
    def get_annotation_clip(self) -> bool | None: ...

# File: .venv\Lib\site-packages\matplotlib\patches.pyi (type stubs, MIT license)
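The signatures above cover the main patch primitives. A minimal sketch, not part of the stub file, exercising a few of them (`Rectangle` with the keyword-only `angle`, `Circle`, and `FancyArrowPatch` with an `arrowstyle` string); the coordinates and style values are illustrative:

```python
import matplotlib
matplotlib.use("Agg")  # headless backend for the sketch
import matplotlib.pyplot as plt
from matplotlib.patches import Rectangle, Circle, FancyArrowPatch

fig, ax = plt.subplots()
# angle and rotation_point are keyword-only, per the stub signature.
rect = Rectangle((0.1, 0.1), 0.4, 0.2, angle=15,
                 facecolor="none", edgecolor="black")
circ = Circle((0.7, 0.7), radius=0.1)
# FancyArrowPatch accepts an ArrowStyle spec string and a mutation scale.
arrow = FancyArrowPatch((0.1, 0.8), (0.6, 0.3),
                        arrowstyle="->", mutation_scale=15)
for p in (rect, circ, arrow):
    ax.add_patch(p)
```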
r"""
A module for dealing with the polylines used throughout Matplotlib.

The primary class for polyline handling in Matplotlib is `Path`. Almost all
vector drawing makes use of `Path`\s somewhere in the drawing pipeline.

Whilst a `Path` instance itself cannot be drawn, some `.Artist` subclasses,
such as `.PathPatch` and `.PathCollection`, can be used for convenient `Path`
visualisation.
"""

import copy
from functools import lru_cache
from weakref import WeakValueDictionary

import numpy as np

import matplotlib as mpl
from . import _api, _path
from .cbook import _to_unmasked_float_array, simple_linear_interpolation
from .bezier import BezierSegment


class Path:
    """
    A series of possibly disconnected, possibly closed, line and curve
    segments.

    The underlying storage is made up of two parallel numpy arrays:

    - *vertices*: an (N, 2) float array of vertices
    - *codes*: an N-length `numpy.uint8` array of path codes, or None

    These two arrays always have the same length in the first dimension.
    For example, to represent a cubic curve, you must provide three
    vertices and three `CURVE4` codes.

    The code types are:

    - `STOP` : 1 vertex (ignored)
        A marker for the end of the entire path (currently not required and
        ignored)

    - `MOVETO` : 1 vertex
        Pick up the pen and move to the given vertex.

    - `LINETO` : 1 vertex
        Draw a line from the current position to the given vertex.

    - `CURVE3` : 1 control point, 1 endpoint
        Draw a quadratic Bézier curve from the current position, with the
        given control point, to the given end point.

    - `CURVE4` : 2 control points, 1 endpoint
        Draw a cubic Bézier curve from the current position, with the given
        control points, to the given end point.

    - `CLOSEPOLY` : 1 vertex (ignored)
        Draw a line segment to the start point of the current polyline.

    If *codes* is None, it is interpreted as a `MOVETO` followed by a series
    of `LINETO`.

    Users of Path objects should not access the vertices and codes arrays
    directly.  Instead, they should use `iter_segments` or `cleaned` to get
    the vertex/code pairs.  This helps, in particular, to consistently
    handle the case of *codes* being None.

    Some behavior of Path objects can be controlled by rcParams. See the
    rcParams whose keys start with 'path.'.

    .. note::

        The vertices and codes arrays should be treated as
        immutable -- there are a number of optimizations and assumptions
        made up front in the constructor that will not change when the
        data changes.
    """

    code_type = np.uint8

    # Path codes
    STOP = code_type(0)         # 1 vertex
    MOVETO = code_type(1)       # 1 vertex
    LINETO = code_type(2)       # 1 vertex
    CURVE3 = code_type(3)       # 2 vertices
    CURVE4 = code_type(4)       # 3 vertices
    CLOSEPOLY = code_type(79)   # 1 vertex

    #: A dictionary mapping Path codes to the number of vertices that the
    #: code expects.
    NUM_VERTICES_FOR_CODE = {STOP: 1,
                             MOVETO: 1,
                             LINETO: 1,
                             CURVE3: 2,
                             CURVE4: 3,
                             CLOSEPOLY: 1}

    def __init__(self, vertices, codes=None, _interpolation_steps=1,
                 closed=False, readonly=False):
        """
        Create a new path with the given vertices and codes.

        Parameters
        ----------
        vertices : (N, 2) array-like
            The path vertices, as an array, masked array or sequence of
            pairs.  Masked values, if any, will be converted to NaNs, which
            are then handled correctly by the Agg PathIterator and other
            consumers of path data, such as :meth:`iter_segments`.
        codes : array-like or None, optional
            N-length array of integers representing the codes of the path.
            If not None, codes must be the same length as vertices.
            If None, *vertices* will be treated as a series of line
            segments.
        _interpolation_steps : int, optional
            Used as a hint to certain projections, such as Polar, that this
            path should be linearly interpolated immediately before
            drawing.  This attribute is primarily an implementation detail
            and is not intended for public use.
        closed : bool, optional
            If *codes* is None and closed is True, vertices will be treated
            as line segments of a closed polygon.
            Note that the last vertex will then be ignored (as the
            corresponding code will be set to `CLOSEPOLY`).
        readonly : bool, optional
            Makes the path behave in an immutable way and sets the vertices
            and codes as read-only arrays.
        """
        vertices = _to_unmasked_float_array(vertices)
        _api.check_shape((None, 2), vertices=vertices)

        if codes is not None and len(vertices):
            codes = np.asarray(codes, self.code_type)
            if codes.ndim != 1 or len(codes) != len(vertices):
                raise ValueError("'codes' must be a 1D list or array with "
                                 "the same length as 'vertices'. "
                                 f"Your vertices have shape {vertices.shape} "
                                 f"but your codes have shape {codes.shape}")
            if len(codes) and codes[0] != self.MOVETO:
                raise ValueError("The first element of 'codes' must be "
                                 f"equal to 'MOVETO' ({self.MOVETO}). "
                                 f"Your first code is {codes[0]}")
        elif closed and len(vertices):
            codes = np.empty(len(vertices), dtype=self.code_type)
            codes[0] = self.MOVETO
            codes[1:-1] = self.LINETO
            codes[-1] = self.CLOSEPOLY

        self._vertices = vertices
        self._codes = codes
        self._interpolation_steps = _interpolation_steps
        self._update_values()

        if readonly:
            self._vertices.flags.writeable = False
            if self._codes is not None:
                self._codes.flags.writeable = False
            self._readonly = True
        else:
            self._readonly = False

    @classmethod
    def _fast_from_codes_and_verts(cls, verts, codes, internals_from=None):
        """
        Create a Path instance without the expense of calling the
        constructor.

        Parameters
        ----------
        verts : array-like
        codes : array
        internals_from : Path or None
            If not None, another `Path` from which the attributes
            ``should_simplify``, ``simplify_threshold``, and
            ``interpolation_steps`` will be copied.  Note that ``readonly``
            is never copied, and always set to ``False`` by this
            constructor.
        """
        pth = cls.__new__(cls)
        pth._vertices = _to_unmasked_float_array(verts)
        pth._codes = codes
        pth._readonly = False
        if internals_from is not None:
            pth._should_simplify = internals_from._should_simplify
            pth._simplify_threshold = internals_from._simplify_threshold
            pth._interpolation_steps = internals_from._interpolation_steps
        else:
            pth._should_simplify = True
            pth._simplify_threshold = mpl.rcParams['path.simplify_threshold']
            pth._interpolation_steps = 1
        return pth

    @classmethod
    def _create_closed(cls, vertices):
        """
        Create a closed polygonal path going through *vertices*.

        Unlike ``Path(..., closed=True)``, *vertices* should **not** end
        with an entry for `CLOSEPOLY`; this entry is added by
        `._create_closed`.
        """
        v = _to_unmasked_float_array(vertices)
        return cls(np.concatenate([v, v[:1]]), closed=True)

    def _update_values(self):
        self._simplify_threshold = mpl.rcParams['path.simplify_threshold']
        self._should_simplify = (
            self._simplify_threshold > 0 and
            mpl.rcParams['path.simplify'] and
            len(self._vertices) >= 128 and
            (self._codes is None or np.all(self._codes <= Path.LINETO))
        )

    @property
    def vertices(self):
        """The vertices of the `Path` as an (N, 2) array."""
        return self._vertices

    @vertices.setter
    def vertices(self, vertices):
        if self._readonly:
            raise AttributeError("Can't set vertices on a readonly Path")
        self._vertices = vertices
        self._update_values()

    @property
    def codes(self):
        """
        The list of codes in the `Path` as a 1D array.

        Each code is one of `STOP`, `MOVETO`, `LINETO`, `CURVE3`, `CURVE4`
        or `CLOSEPOLY`.
        For codes that correspond to more than one vertex (`CURVE3` and
        `CURVE4`), that code will be repeated so that the length of
        `vertices` and `codes` is always the same.
        """
        return self._codes

    @codes.setter
    def codes(self, codes):
        if self._readonly:
            raise AttributeError("Can't set codes on a readonly Path")
        self._codes = codes
        self._update_values()

    @property
    def simplify_threshold(self):
        """
        The fraction of a pixel difference below which vertices will
        be simplified out.
        """
        return self._simplify_threshold

    @simplify_threshold.setter
    def simplify_threshold(self, threshold):
        self._simplify_threshold = threshold

    @property
    def should_simplify(self):
        """
        `True` if the vertices array should be simplified.
        """
        return self._should_simplify

    @should_simplify.setter
    def should_simplify(self, should_simplify):
        self._should_simplify = should_simplify

    @property
    def readonly(self):
        """
        `True` if the `Path` is read-only.
        """
        return self._readonly

    def copy(self):
        """
        Return a shallow copy of the `Path`, which will share the
        vertices and codes with the source `Path`.
        """
        return copy.copy(self)

    def __deepcopy__(self, memo=None):
        """
        Return a deepcopy of the `Path`.  The `Path` will not be
        readonly, even if the source `Path` is.
        """
        # Deepcopying arrays (vertices, codes) strips the writeable=False
        # flag.
        p = copy.deepcopy(super(), memo)
        p._readonly = False
        return p

    deepcopy = __deepcopy__

    @classmethod
    def make_compound_path_from_polys(cls, XY):
        """
        Make a compound `Path` object to draw a number of polygons with
        equal numbers of sides.

        .. plot:: gallery/misc/histogram_path.py

        Parameters
        ----------
        XY : (numpolys, numsides, 2) array
        """
        # for each poly: 1 for the MOVETO, (numsides-1) for the LINETO, 1
        # for the CLOSEPOLY; the vert for the closepoly is ignored but we
        # still need it to keep the codes aligned with the vertices
        numpolys, numsides, two = XY.shape
        if two != 2:
            raise ValueError("The third dimension of 'XY' must be 2")
        stride = numsides + 1
        nverts = numpolys * stride
        verts = np.zeros((nverts, 2))
        codes = np.full(nverts, cls.LINETO, dtype=cls.code_type)
        codes[0::stride] = cls.MOVETO
        codes[numsides::stride] = cls.CLOSEPOLY
        for i in range(numsides):
            verts[i::stride] = XY[:, i]
        return cls(verts, codes)

    @classmethod
    def make_compound_path(cls, *args):
        r"""
        Concatenate a list of `Path`\s into a single `Path`, removing all
        `STOP`\s.
        """
        if not args:
            return Path(np.empty([0, 2], dtype=np.float32))
        vertices = np.concatenate([path.vertices for path in args])
        codes = np.empty(len(vertices), dtype=cls.code_type)
        i = 0
        for path in args:
            size = len(path.vertices)
            if path.codes is None:
                if size:
                    codes[i] = cls.MOVETO
                    codes[i+1:i+size] = cls.LINETO
            else:
                codes[i:i+size] = path.codes
            i += size
        not_stop_mask = codes != cls.STOP  # Remove STOPs, as internal STOPs are a bug.
        return cls(vertices[not_stop_mask], codes[not_stop_mask])

    def __repr__(self):
        return f"Path({self.vertices!r}, {self.codes!r})"

    def __len__(self):
        return len(self.vertices)

    def iter_segments(self, transform=None, remove_nans=True, clip=None,
                      snap=False, stroke_width=1.0, simplify=None,
                      curves=True, sketch=None):
        """
        Iterate over all curve segments in the path.

        Each iteration returns a pair ``(vertices, code)``, where
        ``vertices`` is a sequence of 1-3 coordinate pairs, and ``code`` is
        a `Path` code.

        Additionally, this method can provide a number of standard cleanups
        and conversions to the path.

        Parameters
        ----------
        transform : None or :class:`~matplotlib.transforms.Transform`
            If not None, the given affine transformation will be applied
            to the path.
        remove_nans : bool, optional
            Whether to remove all NaNs from the path and skip over them
            using MOVETO commands.
        clip : None or (float, float, float, float), optional
            If not None, must be a four-tuple (x1, y1, x2, y2)
            defining a rectangle in which to clip the path.
        snap : None or bool, optional
            If True, snap all nodes to pixels; if False, don't snap them.
            If None, snap if the path contains only segments
            parallel to the x or y axes, and no more than 1024 of them.
        stroke_width : float, optional
            The width of the stroke being drawn (used for path snapping).
        simplify : None or bool, optional
            Whether to simplify the path by removing vertices
            that do not affect its appearance.  If None, use the
            :attr:`should_simplify` attribute.  See also
            :rc:`path.simplify` and :rc:`path.simplify_threshold`.
        curves : bool, optional
            If True, curve segments will be returned as curve segments.
            If False, all curves will be converted to line segments.
        sketch : None or sequence, optional
            If not None, must be a 3-tuple of the form
            (scale, length, randomness), representing the sketch
            parameters.
        """
        if not len(self):
            return

        cleaned = self.cleaned(transform=transform,
                               remove_nans=remove_nans, clip=clip,
                               snap=snap, stroke_width=stroke_width,
                               simplify=simplify, curves=curves,
                               sketch=sketch)

        # Cache these object lookups for performance in the loop.
        NUM_VERTICES_FOR_CODE = self.NUM_VERTICES_FOR_CODE
        STOP = self.STOP

        vertices = iter(cleaned.vertices)
        codes = iter(cleaned.codes)
        for curr_vertices, code in zip(vertices, codes):
            if code == STOP:
                break
            extra_vertices = NUM_VERTICES_FOR_CODE[code] - 1
            if extra_vertices:
                for i in range(extra_vertices):
                    next(codes)
                    curr_vertices = np.append(curr_vertices, next(vertices))
            yield curr_vertices, code

    def iter_bezier(self, **kwargs):
        """
        Iterate over each Bézier curve (lines included) in a `Path`.

        Parameters
        ----------
        **kwargs
            Forwarded to `.iter_segments`.

        Yields
        ------
        B : `~matplotlib.bezier.BezierSegment`
            The Bézier curves that make up the current path.  Note in
            particular that freestanding points are Bézier curves of order
            0, and lines are Bézier curves of order 1 (with two control
            points).
        code : `~matplotlib.path.Path.code_type`
            The code describing what kind of curve is being returned.
            `MOVETO`, `LINETO`, `CURVE3`, and `CURVE4` correspond to
            Bézier curves with 1, 2, 3, and 4 control points
            (respectively).  `CLOSEPOLY` is a `LINETO` with the control
            points correctly chosen based on the start/end points of the
            current stroke.
        """
        first_vert = None
        prev_vert = None
        for verts, code in self.iter_segments(**kwargs):
            if first_vert is None:
                if code != Path.MOVETO:
                    raise ValueError("Malformed path, must start with MOVETO.")
            if code == Path.MOVETO:  # a point is like "CURVE1"
                first_vert = verts
                yield BezierSegment(np.array([first_vert])), code
            elif code == Path.LINETO:  # "CURVE2"
                yield BezierSegment(np.array([prev_vert, verts])), code
            elif code == Path.CURVE3:
                yield BezierSegment(np.array([prev_vert, verts[:2],
                                              verts[2:]])), code
            elif code == Path.CURVE4:
                yield BezierSegment(np.array([prev_vert, verts[:2],
                                              verts[2:4], verts[4:]])), code
            elif code == Path.CLOSEPOLY:
                yield BezierSegment(np.array([prev_vert, first_vert])), code
            elif code == Path.STOP:
                return
            else:
                raise ValueError(f"Invalid Path.code_type: {code}")
            prev_vert = verts[-2:]

    def _iter_connected_components(self):
        """Return subpaths split at MOVETOs."""
        if self.codes is None:
            yield self
        else:
            idxs = np.append((self.codes == Path.MOVETO).nonzero()[0],
                             len(self.codes))
            for sl in map(slice, idxs, idxs[1:]):
                yield Path._fast_from_codes_and_verts(
                    self.vertices[sl], self.codes[sl], self)

    def cleaned(self, transform=None, remove_nans=False, clip=None,
                *, simplify=False, curves=False,
                stroke_width=1.0, snap=False, sketch=None):
        """
        Return a new `Path` with vertices and codes cleaned according to
        the parameters.

        See Also
        --------
        Path.iter_segments : for details of the keyword arguments.
        """
        vertices, codes = _path.cleanup_path(
            self, transform, remove_nans, clip, snap, stroke_width,
            simplify, curves, sketch)
        pth = Path._fast_from_codes_and_verts(vertices, codes, self)
        if not simplify:
            pth._should_simplify = False
        return pth

    def transformed(self, transform):
        """
        Return a transformed copy of the path.

        See Also
        --------
        matplotlib.transforms.TransformedPath
            A specialized path class that will cache the transformed result
            and automatically update when the transform changes.
        """
        return Path(transform.transform(self.vertices), self.codes,
                    self._interpolation_steps)

    def contains_point(self, point, transform=None, radius=0.0):
        """
        Return whether the area enclosed by the path contains the given
        point.

        The path is always treated as closed; i.e. if the last code is not
        `CLOSEPOLY` an implicit segment connecting the last vertex to the
        first vertex is assumed.

        Parameters
        ----------
        point : (float, float)
            The point (x, y) to check.
        transform : `~matplotlib.transforms.Transform`, optional
            If not ``None``, *point* will be compared to ``self``
            transformed by *transform*; i.e. for a correct check,
            *transform* should transform the path into the coordinate
            system of *point*.
        radius : float, default: 0
            Additional margin on the path in coordinates of *point*.
            The path is extended tangentially by *radius/2*; i.e. if you
            would draw the path with a linewidth of *radius*, all points
            on the line would still be considered to be contained in the
            area.
            Conversely, negative values shrink the area: Points on the
            imaginary line will be considered outside the area.

        Returns
        -------
        bool

        Notes
        -----
        The current algorithm has some limitations:

        - The result is undefined for points exactly at the boundary
          (i.e. at the path shifted by *radius/2*).
        - The result is undefined if there is no enclosed area, i.e. all
          vertices are on a straight line.
        - If bounding lines start to cross each other due to *radius*
          shift, the result is not guaranteed to be correct.
        """
        if transform is not None:
            transform = transform.frozen()
        # `point_in_path` does not handle nonlinear transforms, so we
        # transform the path ourselves.  If *transform* is affine, letting
        # `point_in_path` handle the transform avoids allocating an extra
        # buffer.
        if transform and not transform.is_affine:
            self = transform.transform_path(self)
            transform = None
        return _path.point_in_path(point[0], point[1], radius, self,
                                   transform)

    def contains_points(self, points, transform=None, radius=0.0):
        """
        Return whether the area enclosed by the path contains the given
        points.

        The path is always treated as closed; i.e. if the last code is not
        `CLOSEPOLY` an implicit segment connecting the last vertex to the
        first vertex is assumed.

        Parameters
        ----------
        points : (N, 2) array
            The points to check.  Columns contain x and y values.
        transform : `~matplotlib.transforms.Transform`, optional
            If not ``None``, *points* will be compared to ``self``
            transformed by *transform*; i.e. for a correct check,
            *transform* should transform the path into the coordinate
            system of *points*.
        radius : float, default: 0
            Additional margin on the path in coordinates of *points*.
            The path is extended tangentially by *radius/2*; i.e. if you
            would draw the path with a linewidth of *radius*, all points
            on the line would still be considered to be contained in the
            area.
            Conversely, negative values shrink the area: Points on the
            imaginary line will be considered outside the area.

        Returns
        -------
        length-N bool array

        Notes
        -----
        The current algorithm has some limitations:

        - The result is undefined for points exactly at the boundary
          (i.e. at the path shifted by *radius/2*).
        - The result is undefined if there is no enclosed area, i.e. all
          vertices are on a straight line.
        - If bounding lines start to cross each other due to *radius*
          shift, the result is not guaranteed to be correct.
        """
        if transform is not None:
            transform = transform.frozen()
        result = _path.points_in_path(points, radius, self, transform)
        return result.astype('bool')

    def contains_path(self, path, transform=None):
        """
        Return whether this (closed) path completely contains the given
        path.

        If *transform* is not ``None``, the path will be transformed
        before checking for containment.
        """
        if transform is not None:
            transform = transform.frozen()
        return _path.path_in_path(self, None, path, transform)

    def get_extents(self, transform=None, **kwargs):
        """
        Get Bbox of the path.

        Parameters
        ----------
        transform : `~matplotlib.transforms.Transform`, optional
            Transform to apply to path before computing extents, if any.
        **kwargs
            Forwarded to `.iter_bezier`.

        Returns
        -------
        matplotlib.transforms.Bbox
            The extents of the path Bbox([[xmin, ymin], [xmax, ymax]])
        """
        from .transforms import Bbox
        if transform is not None:
            self = transform.transform_path(self)
        if self.codes is None:
            xys = self.vertices
        elif len(np.intersect1d(self.codes,
                                [Path.CURVE3, Path.CURVE4])) == 0:
            # Optimization for the straight line case.
            # Instead of iterating through each curve, consider
            # each line segment's end-points
            # (recall that STOP and CLOSEPOLY vertices are ignored)
            xys = self.vertices[np.isin(self.codes,
                                        [Path.MOVETO, Path.LINETO])]
        else:
            xys = []
            for curve, code in self.iter_bezier(**kwargs):
                # places where the derivative is zero can be extrema
                _, dzeros = curve.axis_aligned_extrema()
                # as can the ends of the curve
                xys.append(curve([0, *dzeros, 1]))
            xys = np.concatenate(xys)
        if len(xys):
            return Bbox([xys.min(axis=0), xys.max(axis=0)])
        else:
            return Bbox.null()

    def intersects_path(self, other, filled=True):
        """
        Return whether this path intersects another given path.

        If *filled* is True, then this also returns True if one path
        completely encloses the other (i.e., the paths are treated as
        filled).
        """
        return _path.path_intersects_path(self, other, filled)

    def intersects_bbox(self, bbox, filled=True):
        """
        Return whether this path intersects a given `~.transforms.Bbox`.

        If *filled* is True, then this also returns True if the path
        completely encloses the `.Bbox` (i.e., the path is treated as
        filled).

        The bounding box is always considered filled.
        """
        return _path.path_intersects_rectangle(
            self, bbox.x0, bbox.y0, bbox.x1, bbox.y1, filled)

    def interpolated(self, steps):
        """
        Return a new path with each segment divided into *steps* parts.

        Codes other than `LINETO`, `MOVETO`, and `CLOSEPOLY` are not
        handled correctly.

        Parameters
        ----------
        steps : int
            The number of segments in the new path for each in the
            original.

        Returns
        -------
        Path
            The interpolated path.
        """
        if steps == 1 or len(self) == 0:
            return self

        if self.codes is not None and self.MOVETO in self.codes[1:]:
            return self.make_compound_path(
                *(p.interpolated(steps)
                  for p in self._iter_connected_components()))

        if self.codes is not None and self.CLOSEPOLY in self.codes and not np.all(
                self.vertices[self.codes == self.CLOSEPOLY] == self.vertices[0]):
            vertices = self.vertices.copy()
            vertices[self.codes == self.CLOSEPOLY] = vertices[0]
        else:
            vertices = self.vertices

        vertices = simple_linear_interpolation(vertices, steps)
        codes = self.codes
        if codes is not None:
            new_codes = np.full((len(codes) - 1) * steps + 1, Path.LINETO,
                                dtype=self.code_type)
            new_codes[0::steps] = codes
        else:
            new_codes = None
        return Path(vertices, new_codes)

    def to_polygons(self, transform=None, width=0, height=0,
                    closed_only=True):
        """
        Convert this path to a list of polygons or polylines.  Each
        polygon/polyline is an (N, 2) array of vertices.  In other words,
        each polygon has no `MOVETO` instructions or curves.  This
        is useful for displaying in backends that do not support
        compound paths or Bézier curves.

        If *width* and *height* are both non-zero then the lines will
        be simplified so that vertices outside of (0, 0), (width,
        height) will be clipped.

        The resulting polygons will be simplified if the
        :attr:`Path.should_simplify` attribute of the path is `True`.

        If *closed_only* is `True` (default), only closed polygons,
        with the last point being the same as the first point, will be
        returned.  Any unclosed polylines in the path will be
        explicitly closed.
        If *closed_only* is `False`, any unclosed polygons in the path
        will be returned as unclosed polygons, and the closed polygons
        will be returned explicitly closed by setting the last point to
        the same as the first point.
        """
        if len(self.vertices) == 0:
            return []

        if transform is not None:
            transform = transform.frozen()

        if self.codes is None and (width == 0 or height == 0):
            vertices = self.vertices
            if closed_only:
                if len(vertices) < 3:
                    return []
                elif np.any(vertices[0] != vertices[-1]):
                    vertices = [*vertices, vertices[0]]

            if transform is None:
                return [vertices]
            else:
                return [transform.transform(vertices)]

        # Deal with the case where there are curves and/or multiple
        # subpaths (using extension code)
        return _path.convert_path_to_polygons(
            self, transform, width, height, closed_only)

    _unit_rectangle = None

    @classmethod
    def unit_rectangle(cls):
        """
        Return a `Path` instance of the unit rectangle from (0, 0) to
        (1, 1).
        """
        if cls._unit_rectangle is None:
            cls._unit_rectangle = cls([[0, 0], [1, 0], [1, 1], [0, 1],
                                       [0, 0]],
                                      closed=True, readonly=True)
        return cls._unit_rectangle

    _unit_regular_polygons = WeakValueDictionary()

    @classmethod
    def unit_regular_polygon(cls, numVertices):
        """
        Return a :class:`Path` instance for a unit regular polygon with
        the given *numVertices* such that the circumscribing circle has
        radius 1.0, centered at (0, 0).
        """
        if numVertices <= 16:
            path = cls._unit_regular_polygons.get(numVertices)
        else:
            path = None
        if path is None:
            theta = ((2 * np.pi / numVertices) * np.arange(numVertices + 1)
                     # This initial rotation is to make sure the polygon
                     # always "points-up".
                     + np.pi / 2)
            verts = np.column_stack((np.cos(theta), np.sin(theta)))
            path = cls(verts, closed=True, readonly=True)
            if numVertices <= 16:
                cls._unit_regular_polygons[numVertices] = path
        return path

    _unit_regular_stars = WeakValueDictionary()

    @classmethod
    def unit_regular_star(cls, numVertices, innerCircle=0.5):
        """
        Return a :class:`Path` for a unit regular star with the given
        numVertices and radius of 1.0, centered at (0, 0).
        """
        if numVertices <= 16:
            path = cls._unit_regular_stars.get((numVertices, innerCircle))
        else:
            path = None
        if path is None:
            ns2 = numVertices * 2
            theta = (2 * np.pi / ns2 * np.arange(ns2 + 1))
            # This initial rotation is to make sure the polygon always
            # "points-up"
            theta += np.pi / 2.0
            r = np.ones(ns2 + 1)
            r[1::2] = innerCircle
            verts = (r * np.vstack((np.cos(theta), np.sin(theta)))).T
            path = cls(verts, closed=True, readonly=True)
            if numVertices <= 16:
                cls._unit_regular_stars[(numVertices, innerCircle)] = path
        return path

    @classmethod
    def unit_regular_asterisk(cls, numVertices):
        """
        Return a :class:`Path` for a unit regular asterisk with the given
        numVertices and radius of 1.0, centered at (0, 0).
        """
        return cls.unit_regular_star(numVertices, 0.0)

    _unit_circle = None

    @classmethod
    def unit_circle(cls):
        """
        Return the readonly :class:`Path` of the unit circle.

        For most cases, :func:`Path.circle` will be what you want.
        """
        if cls._unit_circle is None:
            cls._unit_circle = cls.circle(center=(0, 0), radius=1,
                                          readonly=True)
        return cls._unit_circle

    @classmethod
    def circle(cls, center=(0., 0.), radius=1., readonly=False):
        """
        Return a `Path` representing a circle of a given radius and
        center.

        Parameters
        ----------
        center : (float, float), default: (0, 0)
            The center of the circle.
        radius : float, default: 1
            The radius of the circle.
        readonly : bool
            Whether the created path should have the "readonly" argument
            set when creating the Path instance.

        Notes
        -----
        The circle is approximated using 8 cubic Bézier curves, as
        described in

        Lancaster, Don.  `Approximating a Circle or an Ellipse Using Four
        Bezier Cubic Splines <https://www.tinaja.com/glib/ellipse4.pdf>`_.
        """
        MAGIC = 0.2652031
        SQRTHALF = np.sqrt(0.5)
        MAGIC45 = SQRTHALF * MAGIC

        vertices = np.array([[0.0, -1.0],

                             [MAGIC, -1.0],
                             [SQRTHALF-MAGIC45, -SQRTHALF-MAGIC45],
                             [SQRTHALF, -SQRTHALF],

                             [SQRTHALF+MAGIC45, -SQRTHALF+MAGIC45],
                             [1.0, -MAGIC],
                             [1.0, 0.0],

                             [1.0, MAGIC],
                             [SQRTHALF+MAGIC45, SQRTHALF-MAGIC45],
                             [SQRTHALF, SQRTHALF],

                             [SQRTHALF-MAGIC45, SQRTHALF+MAGIC45],
                             [MAGIC, 1.0],
                             [0.0, 1.0],

                             [-MAGIC, 1.0],
                             [-SQRTHALF+MAGIC45, SQRTHALF+MAGIC45],
                             [-SQRTHALF, SQRTHALF],

                             [-SQRTHALF-MAGIC45, SQRTHALF-MAGIC45],
                             [-1.0, MAGIC],
                             [-1.0, 0.0],

                             [-1.0, -MAGIC],
                             [-SQRTHALF-MAGIC45, -SQRTHALF+MAGIC45],
                             [-SQRTHALF, -SQRTHALF],

                             [-SQRTHALF+MAGIC45, -SQRTHALF-MAGIC45],
                             [-MAGIC, -1.0],
                             [0.0, -1.0],

                             [0.0, -1.0]],
                            dtype=float)

        codes = [cls.CURVE4] * 26
        codes[0] = cls.MOVETO
        codes[-1] = cls.CLOSEPOLY
        return Path(vertices * radius + center, codes, readonly=readonly)

    _unit_circle_righthalf = None

    @classmethod
    def unit_circle_righthalf(cls):
        """
        Return a `Path` of the right half of a unit circle.

        See `Path.circle` for the reference on the approximation used.
        """
        if cls._unit_circle_righthalf is None:
            MAGIC = 0.2652031
            SQRTHALF = np.sqrt(0.5)
            MAGIC45 = SQRTHALF * MAGIC

            vertices = np.array(
                [[0.0, -1.0],

                 [MAGIC, -1.0],
                 [SQRTHALF-MAGIC45, -SQRTHALF-MAGIC45],
                 [SQRTHALF, -SQRTHALF],

                 [SQRTHALF+MAGIC45, -SQRTHALF+MAGIC45],
                 [1.0, -MAGIC],
                 [1.0, 0.0],

                 [1.0, MAGIC],
                 [SQRTHALF+MAGIC45, SQRTHALF-MAGIC45],
                 [SQRTHALF, SQRTHALF],

                 [SQRTHALF-MAGIC45, SQRTHALF+MAGIC45],
                 [MAGIC, 1.0],
                 [0.0, 1.0],

                 [0.0, -1.0]],

                float)

            codes = np.full(14, cls.CURVE4, dtype=cls.code_type)
            codes[0] = cls.MOVETO
            codes[-1] = cls.CLOSEPOLY

            cls._unit_circle_righthalf = cls(vertices, codes, readonly=True)
        return cls._unit_circle_righthalf
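The eighth-circle construction used by `circle` above can be sanity-checked numerically. The sketch below (stdlib-only, independent of this module; the `bezier` helper is illustrative, not part of matplotlib) re-evaluates the first `CURVE4` segment with the standard cubic Bézier polynomial and measures its radial deviation from a true unit circle:

```python
import math

# The same constants Path.circle uses for its eighth-circle segments.
MAGIC = 0.2652031
SQRTHALF = math.sqrt(0.5)
MAGIC45 = SQRTHALF * MAGIC

# Start point plus the three CURVE4 vertices of the first segment,
# running from (0, -1) to (sqrt(1/2), -sqrt(1/2)).
P = [(0.0, -1.0),
     (MAGIC, -1.0),
     (SQRTHALF - MAGIC45, -SQRTHALF - MAGIC45),
     (SQRTHALF, -SQRTHALF)]

def bezier(t):
    """Standard cubic Bezier polynomial over the four control points."""
    b = [(1 - t) ** 3, 3 * (1 - t) ** 2 * t, 3 * (1 - t) * t ** 2, t ** 3]
    return (sum(w * p[0] for w, p in zip(b, P)),
            sum(w * p[1] for w, p in zip(b, P)))

# Maximum radial deviation from a true unit circle along the segment.
max_err = max(abs(math.hypot(*bezier(i / 200)) - 1.0) for i in range(201))
```

The endpoints lie exactly on the circle, and the interior deviation stays tiny (well below 1e-4 for a 45-degree segment); the remaining seven segments are reflections of this one, so the bound applies to the whole path.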
    @classmethod
    def arc(cls, theta1, theta2, n=None, is_wedge=False):
        """
        Return a `Path` for the unit circle arc from angles *theta1* to
        *theta2* (in degrees).

        *theta2* is unwrapped to produce the shortest arc within 360
        degrees.  That is, if *theta2* > *theta1* + 360, the arc will be
        from *theta1* to *theta2* - 360 and not a full circle plus some
        extra overlap.

        If *n* is provided, it is the number of spline segments to make.
        If *n* is not provided, the number of spline segments is
        determined based on the delta between *theta1* and *theta2*.

        Masionobe, L.  2003.  `Drawing an elliptical arc using
        polylines, quadratic or cubic Bezier curves
        <https://web.archive.org/web/20190318044212/http://www.spaceroots.org/documents/ellipse/index.html>`_.
        """
        halfpi = np.pi * 0.5

        eta1 = theta1
        eta2 = theta2 - 360 * np.floor((theta2 - theta1) / 360)
        # Ensure 2pi range is not flattened to 0 due to floating-point
        # errors, but don't try to expand existing 0 range.
        if theta2 != theta1 and eta2 <= eta1:
            eta2 += 360
        eta1, eta2 = np.deg2rad([eta1, eta2])

        # number of curve segments to make
        if n is None:
            n = int(2 ** np.ceil((eta2 - eta1) / halfpi))
        if n < 1:
            raise ValueError("n must be >= 1 or None")

        deta = (eta2 - eta1) / n
        t = np.tan(0.5 * deta)
        alpha = np.sin(deta) * (np.sqrt(4.0 + 3.0 * t * t) - 1) / 3.0

        steps = np.linspace(eta1, eta2, n + 1, True)
        cos_eta = np.cos(steps)
        sin_eta = np.sin(steps)

        xA = cos_eta[:-1]
        yA = sin_eta[:-1]
        xA_dot = -yA
        yA_dot = xA

        xB = cos_eta[1:]
        yB = sin_eta[1:]
        xB_dot = -yB
        yB_dot = xB

        if is_wedge:
            length = n * 3 + 4
            vertices = np.zeros((length, 2), float)
            codes = np.full(length, cls.CURVE4, dtype=cls.code_type)
            vertices[1] = [xA[0], yA[0]]
            codes[0:2] = [cls.MOVETO, cls.LINETO]
            codes[-2:] = [cls.LINETO, cls.CLOSEPOLY]
            vertex_offset = 2
            end = length - 2
        else:
            length = n * 3 + 1
            vertices = np.empty((length, 2), float)
            codes = np.full(length, cls.CURVE4, dtype=cls.code_type)
            vertices[0] = [xA[0], yA[0]]
            codes[0] = cls.MOVETO
            vertex_offset = 1
            end = length

        vertices[vertex_offset:end:3, 0] = xA + alpha * xA_dot
        vertices[vertex_offset:end:3, 1] = yA + alpha * yA_dot
        vertices[vertex_offset+1:end:3, 0] = xB - alpha * xB_dot
        vertices[vertex_offset+1:end:3, 1] = yB - alpha * yB_dot
        vertices[vertex_offset+2:end:3, 0] = xB
        vertices[vertex_offset+2:end:3, 1] = yB

        return cls(vertices, codes, readonly=True)

    @classmethod
    def wedge(cls, theta1, theta2, n=None):
        """
        Return a `Path` for the unit circle wedge from angles *theta1* to
        *theta2* (in degrees).

        *theta2* is unwrapped to produce the shortest wedge within 360
        degrees.  That is, if *theta2* > *theta1* + 360, the wedge will be
        from *theta1* to *theta2* - 360 and not a full circle plus some
        extra overlap.

        If *n* is provided, it is the number of spline segments to make.
        If *n* is not provided, the number of spline segments is
        determined based on the delta between *theta1* and *theta2*.

        See `Path.arc` for the reference on the approximation used.
        """
        return cls.arc(theta1, theta2, n, True)

    @staticmethod
    @lru_cache(8)
    def hatch(hatchpattern, density=6):
        """
        Given a hatch specifier, *hatchpattern*, generate a `Path` that
        can be used in a repeated hatching pattern.  *density* is the
        number of lines per unit square.
        """
        from matplotlib.hatch import get_path
        return (get_path(hatchpattern, density)
                if hatchpattern is not None else None)

    def clip_to_bbox(self, bbox, inside=True):
        """
        Clip the path to the given bounding box.

        The path must be made up of one or more closed polygons.
        This algorithm will not behave correctly for unclosed paths.

        If *inside* is `True`, clip to the inside of the box, otherwise
        to the outside of the box.
        """
        verts = _path.clip_path_to_rect(self, bbox, inside)
        paths = [Path(poly) for poly in verts]
        return self.make_compound_path(*paths)


def get_path_collection_extents(
        master_transform, paths, transforms, offsets, offset_transform):
    r"""
    Get bounding box of a `.PathCollection`\s internal objects.

    That is, given a sequence of `Path`\s, `.Transform`\s objects, and
    offsets, as found in a `.PathCollection`, return the bounding box that
    encapsulates all of them.

    Parameters
    ----------
    master_transform : `~matplotlib.transforms.Transform`
        Global transformation applied to all paths.
    paths : list of `Path`
    transforms : list of `~matplotlib.transforms.Affine2DBase`
        If non-empty, this overrides *master_transform*.
    offsets : (N, 2) array-like
    offset_transform : `~matplotlib.transforms.Affine2DBase`
        Transform applied to the offsets before offsetting the path.

    Notes
    -----
    The way that *paths*, *transforms* and *offsets* are combined follows
    the same method as for collections: each is iterated over
    independently, so if you have 3 paths (A, B, C), 2 transforms (α, β)
    and 1 offset (O), their combinations are as follows:

    - (A, α, O)
    - (B, β, O)
    - (C, α, O)
    """
    from .transforms import Bbox
    if len(paths) == 0:
        raise ValueError("No paths provided")
    if len(offsets) == 0:
        raise ValueError("No offsets provided")
    extents, minpos = _path.get_path_collection_extents(
        master_transform, paths, np.atleast_3d(transforms),
        offsets, offset_transform)
    return Bbox.from_extents(*extents, minpos=minpos)

.venv\Lib\site-packages\matplotlib\path.py | path.py | Python | 42,717 bytes | vue-tools | 2024-02-21 | MIT
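The vertex/code pairing rules documented in the `Path` class docstring can be exercised with a small stand-alone sketch. The `iter_segments` function below is a simplified, pure-Python stand-in for the method of the same name above (no cleanup, snapping, or simplification; it only reproduces the grouping logic):

```python
# Path codes, with the same values the class defines.
STOP, MOVETO, LINETO, CURVE3, CURVE4, CLOSEPOLY = 0, 1, 2, 3, 4, 79
NUM_VERTICES_FOR_CODE = {STOP: 1, MOVETO: 1, LINETO: 1,
                         CURVE3: 2, CURVE4: 3, CLOSEPOLY: 1}

def iter_segments(vertices, codes):
    """Group (vertices, code) pairs the way Path.iter_segments does:
    a multi-vertex code (CURVE3/CURVE4) occupies repeated slots in
    *codes*, and its extra vertices are folded into one segment."""
    verts = iter(vertices)
    cods = iter(codes)
    for vertex, code in zip(verts, cods):
        if code == STOP:
            break
        segment = list(vertex)
        for _ in range(NUM_VERTICES_FOR_CODE[code] - 1):
            next(cods)                   # skip the repeated code slot
            segment.extend(next(verts))  # pull in the extra vertex
        yield segment, code

# A closed unit square followed by one quadratic Bezier segment.
vertices = [(0, 0), (1, 0), (1, 1), (0, 1), (0, 0),
            (2, 0), (2.5, 1), (3, 0)]
codes = [MOVETO, LINETO, LINETO, LINETO, CLOSEPOLY,
         MOVETO, CURVE3, CURVE3]
segments = list(iter_segments(vertices, codes))
```

The two `CURVE3` slots collapse into a single four-number segment `[2.5, 1, 3, 0]`, matching the docstring's rule that a repeated code keeps *vertices* and *codes* the same length while `iter_segments` yields one entry per drawing command.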
from .bezier import BezierSegment\nfrom .transforms import Affine2D, Transform, Bbox\nfrom collections.abc import Generator, Iterable, Sequence\n\nimport numpy as np\nfrom numpy.typing import ArrayLike\n\nfrom typing import Any, overload\n\nclass Path:\n code_type: type[np.uint8]\n STOP: np.uint8\n MOVETO: np.uint8\n LINETO: np.uint8\n CURVE3: np.uint8\n CURVE4: np.uint8\n CLOSEPOLY: np.uint8\n NUM_VERTICES_FOR_CODE: dict[np.uint8, int]\n\n def __init__(\n self,\n vertices: ArrayLike,\n codes: ArrayLike | None = ...,\n _interpolation_steps: int = ...,\n closed: bool = ...,\n readonly: bool = ...,\n ) -> None: ...\n @property\n def vertices(self) -> ArrayLike: ...\n @vertices.setter\n def vertices(self, vertices: ArrayLike) -> None: ...\n @property\n def codes(self) -> ArrayLike | None: ...\n @codes.setter\n def codes(self, codes: ArrayLike) -> None: ...\n @property\n def simplify_threshold(self) -> float: ...\n @simplify_threshold.setter\n def simplify_threshold(self, threshold: float) -> None: ...\n @property\n def should_simplify(self) -> bool: ...\n @should_simplify.setter\n def should_simplify(self, should_simplify: bool) -> None: ...\n @property\n def readonly(self) -> bool: ...\n def copy(self) -> Path: ...\n def __deepcopy__(self, memo: dict[int, Any] | None = ...) 
-> Path: ...\n deepcopy = __deepcopy__\n\n @classmethod\n def make_compound_path_from_polys(cls, XY: ArrayLike) -> Path: ...\n @classmethod\n def make_compound_path(cls, *args: Path) -> Path: ...\n def __len__(self) -> int: ...\n def iter_segments(\n self,\n transform: Transform | None = ...,\n remove_nans: bool = ...,\n clip: tuple[float, float, float, float] | None = ...,\n snap: bool | None = ...,\n stroke_width: float = ...,\n simplify: bool | None = ...,\n curves: bool = ...,\n sketch: tuple[float, float, float] | None = ...,\n ) -> Generator[tuple[np.ndarray, np.uint8], None, None]: ...\n def iter_bezier(self, **kwargs) -> Generator[BezierSegment, None, None]: ...\n def cleaned(\n self,\n transform: Transform | None = ...,\n remove_nans: bool = ...,\n clip: tuple[float, float, float, float] | None = ...,\n *,\n simplify: bool | None = ...,\n curves: bool = ...,\n stroke_width: float = ...,\n snap: bool | None = ...,\n sketch: tuple[float, float, float] | None = ...\n ) -> Path: ...\n def transformed(self, transform: Transform) -> Path: ...\n def contains_point(\n self,\n point: tuple[float, float],\n transform: Transform | None = ...,\n radius: float = ...,\n ) -> bool: ...\n def contains_points(\n self, points: ArrayLike, transform: Transform | None = ..., radius: float = ...\n ) -> np.ndarray: ...\n def contains_path(self, path: Path, transform: Transform | None = ...) -> bool: ...\n def get_extents(self, transform: Transform | None = ..., **kwargs) -> Bbox: ...\n def intersects_path(self, other: Path, filled: bool = ...) -> bool: ...\n def intersects_bbox(self, bbox: Bbox, filled: bool = ...) 
-> bool: ...\n def interpolated(self, steps: int) -> Path: ...\n def to_polygons(\n self,\n transform: Transform | None = ...,\n width: float = ...,\n height: float = ...,\n closed_only: bool = ...,\n ) -> list[ArrayLike]: ...\n @classmethod\n def unit_rectangle(cls) -> Path: ...\n @classmethod\n def unit_regular_polygon(cls, numVertices: int) -> Path: ...\n @classmethod\n def unit_regular_star(cls, numVertices: int, innerCircle: float = ...) -> Path: ...\n @classmethod\n def unit_regular_asterisk(cls, numVertices: int) -> Path: ...\n @classmethod\n def unit_circle(cls) -> Path: ...\n @classmethod\n def circle(\n cls,\n center: tuple[float, float] = ...,\n radius: float = ...,\n readonly: bool = ...,\n ) -> Path: ...\n @classmethod\n def unit_circle_righthalf(cls) -> Path: ...\n @classmethod\n def arc(\n cls, theta1: float, theta2: float, n: int | None = ..., is_wedge: bool = ...\n ) -> Path: ...\n @classmethod\n def wedge(cls, theta1: float, theta2: float, n: int | None = ...) -> Path: ...\n @overload\n @staticmethod\n def hatch(hatchpattern: str, density: float = ...) -> Path: ...\n @overload\n @staticmethod\n def hatch(hatchpattern: None, density: float = ...) -> None: ...\n def clip_to_bbox(self, bbox: Bbox, inside: bool = ...) -> Path: ...\n\ndef get_path_collection_extents(\n master_transform: Transform,\n paths: Sequence[Path],\n transforms: Iterable[Affine2D],\n offsets: ArrayLike,\n offset_transform: Affine2D,\n) -> Bbox: ...\n | .venv\Lib\site-packages\matplotlib\path.pyi | path.pyi | Other | 4,777 | 0.85 | 0.292857 | 0.007463 | vue-tools | 542 | 2024-10-09T21:55:28.111521 | GPL-3.0 | false | b1c3e9c7279394b6f72902eb2f0b885c |
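The stub above types `Path.hatch` with two `@overload` declarations so that a `str` pattern yields a `Path` and `None` yields `None`. A small self-contained sketch of that overload pattern, using a stand-in `Path` class rather than the real matplotlib one:

```python
from typing import Optional, overload

class Path:  # stand-in for matplotlib.path.Path, for illustration only
    pass

@overload
def hatch(pattern: str, density: float = 6) -> Path: ...
@overload
def hatch(pattern: None, density: float = 6) -> None: ...
def hatch(pattern: Optional[str], density: float = 6) -> Optional[Path]:
    # Runtime behaviour mirrors the stub: None in, None out.
    return Path() if pattern is not None else None

print(type(hatch("/")).__name__)  # Path
print(hatch(None))                # None
```

Type checkers use the overloads to narrow the return type at each call site; the final implementation signature is what actually runs.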
"""\nDefines classes for path effects. The path effects are supported in `.Text`,\n`.Line2D` and `.Patch`.\n\n.. seealso::\n :ref:`patheffects_guide`\n"""\n\nfrom matplotlib.backend_bases import RendererBase\nfrom matplotlib import colors as mcolors\nfrom matplotlib import patches as mpatches\nfrom matplotlib import transforms as mtransforms\nfrom matplotlib.path import Path\nimport numpy as np\n\n\nclass AbstractPathEffect:\n """\n A base class for path effects.\n\n Subclasses should override the ``draw_path`` method to add effect\n functionality.\n """\n\n def __init__(self, offset=(0., 0.)):\n """\n Parameters\n ----------\n offset : (float, float), default: (0, 0)\n The (x, y) offset to apply to the path, measured in points.\n """\n self._offset = offset\n\n def _offset_transform(self, renderer):\n """Apply the offset to the given transform."""\n return mtransforms.Affine2D().translate(\n *map(renderer.points_to_pixels, self._offset))\n\n def _update_gc(self, gc, new_gc_dict):\n """\n Update the given GraphicsContext with the given dict of properties.\n\n The keys in the dictionary are used to identify the appropriate\n ``set_`` method on the *gc*.\n """\n new_gc_dict = new_gc_dict.copy()\n\n dashes = new_gc_dict.pop("dashes", None)\n if dashes:\n gc.set_dashes(**dashes)\n\n for k, v in new_gc_dict.items():\n set_method = getattr(gc, 'set_' + k, None)\n if not callable(set_method):\n raise AttributeError(f'Unknown property {k}')\n set_method(v)\n return gc\n\n def draw_path(self, renderer, gc, tpath, affine, rgbFace=None):\n """\n Derived should override this method. 
The arguments are the same\n as :meth:`matplotlib.backend_bases.RendererBase.draw_path`\n except the first argument is a renderer.\n """\n # Get the real renderer, not a PathEffectRenderer.\n if isinstance(renderer, PathEffectRenderer):\n renderer = renderer._renderer\n return renderer.draw_path(gc, tpath, affine, rgbFace)\n\n\nclass PathEffectRenderer(RendererBase):\n """\n Implements a Renderer which contains another renderer.\n\n This proxy then intercepts draw calls, calling the appropriate\n :class:`AbstractPathEffect` draw method.\n\n .. note::\n Not all methods have been overridden on this RendererBase subclass.\n It may be necessary to add further methods to extend the PathEffects\n capabilities further.\n """\n\n def __init__(self, path_effects, renderer):\n """\n Parameters\n ----------\n path_effects : iterable of :class:`AbstractPathEffect`\n The path effects which this renderer represents.\n renderer : `~matplotlib.backend_bases.RendererBase` subclass\n\n """\n self._path_effects = path_effects\n self._renderer = renderer\n\n def copy_with_path_effect(self, path_effects):\n return self.__class__(path_effects, self._renderer)\n\n def __getattribute__(self, name):\n if name in ['flipy', 'get_canvas_width_height', 'new_gc',\n 'points_to_pixels', '_text2path', 'height', 'width']:\n return getattr(self._renderer, name)\n else:\n return object.__getattribute__(self, name)\n\n def draw_path(self, gc, tpath, affine, rgbFace=None):\n for path_effect in self._path_effects:\n path_effect.draw_path(self._renderer, gc, tpath, affine,\n rgbFace)\n\n def draw_markers(\n self, gc, marker_path, marker_trans, path, *args, **kwargs):\n # We do a little shimmy so that all markers are drawn for each path\n # effect in turn. 
Essentially, we induce recursion (depth 1) which is\n # terminated once we have just a single path effect to work with.\n if len(self._path_effects) == 1:\n # Call the base path effect function - this uses the unoptimised\n # approach of calling "draw_path" multiple times.\n return super().draw_markers(gc, marker_path, marker_trans, path,\n *args, **kwargs)\n\n for path_effect in self._path_effects:\n renderer = self.copy_with_path_effect([path_effect])\n # Recursively call this method, only next time we will only have\n # one path effect.\n renderer.draw_markers(gc, marker_path, marker_trans, path,\n *args, **kwargs)\n\n def draw_path_collection(self, gc, master_transform, paths, *args,\n **kwargs):\n # We do a little shimmy so that all paths are drawn for each path\n # effect in turn. Essentially, we induce recursion (depth 1) which is\n # terminated once we have just a single path effect to work with.\n if len(self._path_effects) == 1:\n # Call the base path effect function - this uses the unoptimised\n # approach of calling "draw_path" multiple times.\n return super().draw_path_collection(gc, master_transform, paths,\n *args, **kwargs)\n\n for path_effect in self._path_effects:\n renderer = self.copy_with_path_effect([path_effect])\n # Recursively call this method, only next time we will only have\n # one path effect.\n renderer.draw_path_collection(gc, master_transform, paths,\n *args, **kwargs)\n\n def open_group(self, s, gid=None):\n return self._renderer.open_group(s, gid)\n\n def close_group(self, s):\n return self._renderer.close_group(s)\n\n\nclass Normal(AbstractPathEffect):\n """\n The "identity" PathEffect.\n\n The Normal PathEffect's sole purpose is to draw the original artist with\n no special path effect.\n """\n\n\ndef _subclass_with_normal(effect_class):\n """\n Create a PathEffect class combining *effect_class* and a normal draw.\n """\n\n class withEffect(effect_class):\n def draw_path(self, renderer, gc, tpath, affine, rgbFace):\n 
super().draw_path(renderer, gc, tpath, affine, rgbFace)\n renderer.draw_path(gc, tpath, affine, rgbFace)\n\n withEffect.__name__ = f"with{effect_class.__name__}"\n withEffect.__qualname__ = f"with{effect_class.__name__}"\n withEffect.__doc__ = f"""\n A shortcut PathEffect for applying `.{effect_class.__name__}` and then\n drawing the original Artist.\n\n With this class you can use ::\n\n artist.set_path_effects([patheffects.with{effect_class.__name__}()])\n\n as a shortcut for ::\n\n artist.set_path_effects([patheffects.{effect_class.__name__}(),\n patheffects.Normal()])\n """\n # Docstring inheritance doesn't work for locally-defined subclasses.\n withEffect.draw_path.__doc__ = effect_class.draw_path.__doc__\n return withEffect\n\n\nclass Stroke(AbstractPathEffect):\n """A line based PathEffect which re-draws a stroke."""\n\n def __init__(self, offset=(0, 0), **kwargs):\n """\n The path will be stroked with its gc updated with the given\n keyword arguments, i.e., the keyword arguments should be valid\n gc parameter values.\n """\n super().__init__(offset)\n self._gc = kwargs\n\n def draw_path(self, renderer, gc, tpath, affine, rgbFace):\n """Draw the path with updated gc."""\n gc0 = renderer.new_gc() # Don't modify gc, but a copy!\n gc0.copy_properties(gc)\n gc0 = self._update_gc(gc0, self._gc)\n renderer.draw_path(\n gc0, tpath, affine + self._offset_transform(renderer), rgbFace)\n gc0.restore()\n\n\nwithStroke = _subclass_with_normal(effect_class=Stroke)\n\n\nclass SimplePatchShadow(AbstractPathEffect):\n """A simple shadow via a filled patch."""\n\n def __init__(self, offset=(2, -2),\n shadow_rgbFace=None, alpha=None,\n rho=0.3, **kwargs):\n """\n Parameters\n ----------\n offset : (float, float), default: (2, -2)\n The (x, y) offset of the shadow in points.\n shadow_rgbFace : :mpltype:`color`\n The shadow color.\n alpha : float, default: 0.3\n The alpha transparency of the created shadow patch.\n rho : float, default: 0.3\n A scale factor to apply to the 
rgbFace color if *shadow_rgbFace*\n is not specified.\n **kwargs\n Extra keywords are stored and passed through to\n :meth:`AbstractPathEffect._update_gc`.\n\n """\n super().__init__(offset)\n\n if shadow_rgbFace is None:\n self._shadow_rgbFace = shadow_rgbFace\n else:\n self._shadow_rgbFace = mcolors.to_rgba(shadow_rgbFace)\n\n if alpha is None:\n alpha = 0.3\n\n self._alpha = alpha\n self._rho = rho\n\n #: The dictionary of keywords to update the graphics collection with.\n self._gc = kwargs\n\n def draw_path(self, renderer, gc, tpath, affine, rgbFace):\n """\n Overrides the standard draw_path to add the shadow offset and\n necessary color changes for the shadow.\n """\n gc0 = renderer.new_gc() # Don't modify gc, but a copy!\n gc0.copy_properties(gc)\n\n if self._shadow_rgbFace is None:\n r, g, b = (rgbFace or (1., 1., 1.))[:3]\n # Scale the colors by a factor to improve the shadow effect.\n shadow_rgbFace = (r * self._rho, g * self._rho, b * self._rho)\n else:\n shadow_rgbFace = self._shadow_rgbFace\n\n gc0.set_foreground("none")\n gc0.set_alpha(self._alpha)\n gc0.set_linewidth(0)\n\n gc0 = self._update_gc(gc0, self._gc)\n renderer.draw_path(\n gc0, tpath, affine + self._offset_transform(renderer),\n shadow_rgbFace)\n gc0.restore()\n\n\nwithSimplePatchShadow = _subclass_with_normal(effect_class=SimplePatchShadow)\n\n\nclass SimpleLineShadow(AbstractPathEffect):\n """A simple shadow via a line."""\n\n def __init__(self, offset=(2, -2),\n shadow_color='k', alpha=0.3, rho=0.3, **kwargs):\n """\n Parameters\n ----------\n offset : (float, float), default: (2, -2)\n The (x, y) offset to apply to the path, in points.\n shadow_color : :mpltype:`color`, default: 'black'\n The shadow color.\n A value of ``None`` takes the original artist's color\n with a scale factor of *rho*.\n alpha : float, default: 0.3\n The alpha transparency of the created shadow patch.\n rho : float, default: 0.3\n A scale factor to apply to the rgbFace color if *shadow_color*\n is ``None``.\n 
**kwargs\n Extra keywords are stored and passed through to\n :meth:`AbstractPathEffect._update_gc`.\n """\n super().__init__(offset)\n if shadow_color is None:\n self._shadow_color = shadow_color\n else:\n self._shadow_color = mcolors.to_rgba(shadow_color)\n self._alpha = alpha\n self._rho = rho\n #: The dictionary of keywords to update the graphics collection with.\n self._gc = kwargs\n\n def draw_path(self, renderer, gc, tpath, affine, rgbFace):\n """\n Overrides the standard draw_path to add the shadow offset and\n necessary color changes for the shadow.\n """\n gc0 = renderer.new_gc() # Don't modify gc, but a copy!\n gc0.copy_properties(gc)\n\n if self._shadow_color is None:\n r, g, b = (gc0.get_foreground() or (1., 1., 1.))[:3]\n # Scale the colors by a factor to improve the shadow effect.\n shadow_rgbFace = (r * self._rho, g * self._rho, b * self._rho)\n else:\n shadow_rgbFace = self._shadow_color\n\n gc0.set_foreground(shadow_rgbFace)\n gc0.set_alpha(self._alpha)\n\n gc0 = self._update_gc(gc0, self._gc)\n renderer.draw_path(\n gc0, tpath, affine + self._offset_transform(renderer))\n gc0.restore()\n\n\nclass PathPatchEffect(AbstractPathEffect):\n """\n Draws a `.PathPatch` instance whose Path comes from the original\n PathEffect artist.\n """\n\n def __init__(self, offset=(0, 0), **kwargs):\n """\n Parameters\n ----------\n offset : (float, float), default: (0, 0)\n The (x, y) offset to apply to the path, in points.\n **kwargs\n All keyword arguments are passed through to the\n :class:`~matplotlib.patches.PathPatch` constructor. 
The\n properties which cannot be overridden are "path", "clip_box"\n "transform" and "clip_path".\n """\n super().__init__(offset=offset)\n self.patch = mpatches.PathPatch([], **kwargs)\n\n def draw_path(self, renderer, gc, tpath, affine, rgbFace):\n self.patch._path = tpath\n self.patch.set_transform(affine + self._offset_transform(renderer))\n self.patch.set_clip_box(gc.get_clip_rectangle())\n clip_path = gc.get_clip_path()\n if clip_path and self.patch.get_clip_path() is None:\n self.patch.set_clip_path(*clip_path)\n self.patch.draw(renderer)\n\n\nclass TickedStroke(AbstractPathEffect):\n """\n A line-based PathEffect which draws a path with a ticked style.\n\n This line style is frequently used to represent constraints in\n optimization. The ticks may be used to indicate that one side\n of the line is invalid or to represent a closed boundary of a\n domain (i.e. a wall or the edge of a pipe).\n\n The spacing, length, and angle of ticks can be controlled.\n\n This line style is sometimes referred to as a hatched line.\n\n See also the :doc:`/gallery/misc/tickedstroke_demo` example.\n """\n\n def __init__(self, offset=(0, 0),\n spacing=10.0, angle=45.0, length=np.sqrt(2),\n **kwargs):\n """\n Parameters\n ----------\n offset : (float, float), default: (0, 0)\n The (x, y) offset to apply to the path, in points.\n spacing : float, default: 10.0\n The spacing between ticks in points.\n angle : float, default: 45.0\n The angle between the path and the tick in degrees. The angle\n is measured as if you were an ant walking along the curve, with\n zero degrees pointing directly ahead, 90 to your left, -90\n to your right, and 180 behind you. 
To change side of the ticks,\n change sign of the angle.\n length : float, default: 1.414\n The length of the tick relative to spacing.\n Recommended length = 1.414 (sqrt(2)) when angle=45, length=1.0\n when angle=90 and length=2.0 when angle=60.\n **kwargs\n Extra keywords are stored and passed through to\n :meth:`AbstractPathEffect._update_gc`.\n\n Examples\n --------\n See :doc:`/gallery/misc/tickedstroke_demo`.\n """\n super().__init__(offset)\n\n self._spacing = spacing\n self._angle = angle\n self._length = length\n self._gc = kwargs\n\n def draw_path(self, renderer, gc, tpath, affine, rgbFace):\n """Draw the path with updated gc."""\n # Do not modify the input! Use copy instead.\n gc0 = renderer.new_gc()\n gc0.copy_properties(gc)\n\n gc0 = self._update_gc(gc0, self._gc)\n trans = affine + self._offset_transform(renderer)\n\n theta = -np.radians(self._angle)\n trans_matrix = np.array([[np.cos(theta), -np.sin(theta)],\n [np.sin(theta), np.cos(theta)]])\n\n # Convert spacing parameter to pixels.\n spacing_px = renderer.points_to_pixels(self._spacing)\n\n # Transform before evaluation because to_polygons works at resolution\n # of one -- assuming it is working in pixel space.\n transpath = affine.transform_path(tpath)\n\n # Evaluate path to straight line segments that can be used to\n # construct line ticks.\n polys = transpath.to_polygons(closed_only=False)\n\n for p in polys:\n x = p[:, 0]\n y = p[:, 1]\n\n # Can not interpolate points or draw line if only one point in\n # polyline.\n if x.size < 2:\n continue\n\n # Find distance between points on the line\n ds = np.hypot(x[1:] - x[:-1], y[1:] - y[:-1])\n\n # Build parametric coordinate along curve\n s = np.concatenate(([0.0], np.cumsum(ds)))\n s_total = s[-1]\n\n num = int(np.ceil(s_total / spacing_px)) - 1\n # Pick parameter values for ticks.\n s_tick = np.linspace(spacing_px/2, s_total - spacing_px/2, num)\n\n # Find points along the parameterized curve\n x_tick = np.interp(s_tick, s, x)\n y_tick = 
np.interp(s_tick, s, y)\n\n # Find unit vectors in local direction of curve\n delta_s = self._spacing * .001\n u = (np.interp(s_tick + delta_s, s, x) - x_tick) / delta_s\n v = (np.interp(s_tick + delta_s, s, y) - y_tick) / delta_s\n\n # Normalize slope into unit slope vector.\n n = np.hypot(u, v)\n mask = n == 0\n n[mask] = 1.0\n\n uv = np.array([u / n, v / n]).T\n uv[mask] = np.array([0, 0]).T\n\n # Rotate and scale unit vector into tick vector\n dxy = np.dot(uv, trans_matrix) * self._length * spacing_px\n\n # Build tick endpoints\n x_end = x_tick + dxy[:, 0]\n y_end = y_tick + dxy[:, 1]\n\n # Interleave ticks to form Path vertices\n xyt = np.empty((2 * num, 2), dtype=x_tick.dtype)\n xyt[0::2, 0] = x_tick\n xyt[1::2, 0] = x_end\n xyt[0::2, 1] = y_tick\n xyt[1::2, 1] = y_end\n\n # Build up vector of Path codes\n codes = np.tile([Path.MOVETO, Path.LINETO], num)\n\n # Construct and draw resulting path\n h = Path(xyt, codes)\n # Transform back to data space during render\n renderer.draw_path(gc0, h, affine.inverted() + trans, rgbFace)\n\n gc0.restore()\n\n\nwithTickedStroke = _subclass_with_normal(effect_class=TickedStroke)\n | .venv\Lib\site-packages\matplotlib\patheffects.py | patheffects.py | Python | 18,387 | 0.95 | 0.142857 | 0.12439 | node-utils | 449 | 2025-04-14T17:45:25.866243 | BSD-3-Clause | false | f8b443ee3de8fd8e6b516ba5a9b21770 |
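`TickedStroke.draw_path` above places ticks by building a cumulative arc-length parameter `s` along the polyline and interpolating `x(s)` and `y(s)` at evenly spaced parameter values, inset by half a spacing at each end. A stdlib-only sketch of that placement step (the `tick_positions` helper is hypothetical; the real code uses `np.cumsum`/`np.interp`):

```python
import math

def tick_positions(xs, ys, spacing):
    """Anchor points at even arc-length intervals along a polyline,
    mirroring TickedStroke's parameterization."""
    # Cumulative arc length along the polyline.
    s = [0.0]
    for i in range(1, len(xs)):
        s.append(s[-1] + math.hypot(xs[i] - xs[i-1], ys[i] - ys[i-1]))
    total = s[-1]
    num = math.ceil(total / spacing) - 1
    if num <= 0:
        return []
    ticks = []
    for k in range(num):
        # Equivalent of np.linspace(spacing/2, total - spacing/2, num).
        t = spacing / 2 if num == 1 else spacing / 2 + k * (total - spacing) / (num - 1)
        # Linear interpolation of x(s), y(s) at parameter t.
        j = next(i for i in range(1, len(s)) if s[i] >= t)
        frac = (t - s[j-1]) / (s[j] - s[j-1])
        ticks.append((xs[j-1] + frac * (xs[j] - xs[j-1]),
                      ys[j-1] + frac * (ys[j] - ys[j-1])))
    return ticks

# Straight segment from (0, 0) to (10, 0) with spacing 2:
print(tick_positions([0.0, 10.0], [0.0, 0.0], 2.0))
```

For the straight-line example the anchors land at x = 1, 3.67, 6.33 and 9, i.e. four ticks centred within the 10-unit run.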
from collections.abc import Iterable, Sequence\nfrom typing import Any\n\nfrom matplotlib.backend_bases import RendererBase, GraphicsContextBase\nfrom matplotlib.path import Path\nfrom matplotlib.patches import Patch\nfrom matplotlib.transforms import Transform\n\nfrom matplotlib.typing import ColorType\n\nclass AbstractPathEffect:\n def __init__(self, offset: tuple[float, float] = ...) -> None: ...\n def draw_path(\n self,\n renderer: RendererBase,\n gc: GraphicsContextBase,\n tpath: Path,\n affine: Transform,\n rgbFace: ColorType | None = ...,\n ) -> None: ...\n\nclass PathEffectRenderer(RendererBase):\n def __init__(\n self, path_effects: Iterable[AbstractPathEffect], renderer: RendererBase\n ) -> None: ...\n def copy_with_path_effect(self, path_effects: Iterable[AbstractPathEffect]) -> PathEffectRenderer: ...\n def draw_path(\n self,\n gc: GraphicsContextBase,\n tpath: Path,\n affine: Transform,\n rgbFace: ColorType | None = ...,\n ) -> None: ...\n def draw_markers(\n self,\n gc: GraphicsContextBase,\n marker_path: Path,\n marker_trans: Transform,\n path: Path,\n *args,\n **kwargs\n ) -> None: ...\n def draw_path_collection(\n self,\n gc: GraphicsContextBase,\n master_transform: Transform,\n paths: Sequence[Path],\n *args,\n **kwargs\n ) -> None: ...\n def __getattribute__(self, name: str) -> Any: ...\n\nclass Normal(AbstractPathEffect): ...\n\nclass Stroke(AbstractPathEffect):\n def __init__(self, offset: tuple[float, float] = ..., **kwargs) -> None: ...\n # rgbFace becomes non-optional\n def draw_path(self, renderer: RendererBase, gc: GraphicsContextBase, tpath: Path, affine: Transform, rgbFace: ColorType) -> None: ... 
# type: ignore[override]\n\nclass withStroke(Stroke): ...\n\nclass SimplePatchShadow(AbstractPathEffect):\n def __init__(\n self,\n offset: tuple[float, float] = ...,\n shadow_rgbFace: ColorType | None = ...,\n alpha: float | None = ...,\n rho: float = ...,\n **kwargs\n ) -> None: ...\n # rgbFace becomes non-optional\n def draw_path(self, renderer: RendererBase, gc: GraphicsContextBase, tpath: Path, affine: Transform, rgbFace: ColorType) -> None: ... # type: ignore[override]\n\nclass withSimplePatchShadow(SimplePatchShadow): ...\n\nclass SimpleLineShadow(AbstractPathEffect):\n def __init__(\n self,\n offset: tuple[float, float] = ...,\n shadow_color: ColorType = ...,\n alpha: float = ...,\n rho: float = ...,\n **kwargs\n ) -> None: ...\n # rgbFace becomes non-optional\n def draw_path(self, renderer: RendererBase, gc: GraphicsContextBase, tpath: Path, affine: Transform, rgbFace: ColorType) -> None: ... # type: ignore[override]\n\nclass PathPatchEffect(AbstractPathEffect):\n patch: Patch\n def __init__(self, offset: tuple[float, float] = ..., **kwargs) -> None: ...\n # rgbFace becomes non-optional\n def draw_path(self, renderer: RendererBase, gc: GraphicsContextBase, tpath: Path, affine: Transform, rgbFace: ColorType) -> None: ... # type: ignore[override]\n\nclass TickedStroke(AbstractPathEffect):\n def __init__(\n self,\n offset: tuple[float, float] = ...,\n spacing: float = ...,\n angle: float = ...,\n length: float = ...,\n **kwargs\n ) -> None: ...\n # rgbFace becomes non-optional\n def draw_path(self, renderer: RendererBase, gc: GraphicsContextBase, tpath: Path, affine: Transform, rgbFace: ColorType) -> None: ... # type: ignore[override]\n\nclass withTickedStroke(TickedStroke): ...\n | .venv\Lib\site-packages\matplotlib\patheffects.pyi | patheffects.pyi | Other | 3,664 | 0.95 | 0.273585 | 0.129032 | vue-tools | 588 | 2024-01-20T11:30:38.010248 | BSD-3-Clause | false | 356c78e4c45aaf4e05ed9dd7a2ce3ba2 |
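Each `draw_path` override in the stub above is annotated `# type: ignore[override]` because making *rgbFace* required narrows the base-class signature, which violates the substitution rule type checkers enforce. A minimal sketch of the same pattern with hypothetical classes:

```python
from typing import Optional

class Base:
    def draw(self, color: Optional[str] = None) -> str:
        return f"base draw with {color}"

class Strict(Base):
    # Narrowing an optional parameter to required is flagged by type
    # checkers as an unsafe override, hence the suppression comment.
    def draw(self, color: str) -> str:  # type: ignore[override]
        return f"strict draw with {color}"

print(Strict().draw("red"))  # strict draw with red
```

At runtime nothing changes; the comment only silences the static checker's `[override]` diagnostic on that line.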
"""\n`pylab` is a historic interface and its use is strongly discouraged. The equivalent\nreplacement is `matplotlib.pyplot`. See :ref:`api_interfaces` for a full overview\nof Matplotlib interfaces.\n\n`pylab` was designed to support a MATLAB-like way of working with all plotting related\nfunctions directly available in the global namespace. This was achieved through a\nwildcard import (``from pylab import *``).\n\n.. warning::\n The use of `pylab` is discouraged for the following reasons:\n\n ``from pylab import *`` imports all the functions from `matplotlib.pyplot`, `numpy`,\n `numpy.fft`, `numpy.linalg`, and `numpy.random`, and some additional functions into\n the global namespace.\n\n Such a pattern is considered bad practice in modern python, as it clutters the global\n namespace. Even more severely, in the case of `pylab`, this will overwrite some\n builtin functions (e.g. the builtin `sum` will be replaced by `numpy.sum`), which\n can lead to unexpected behavior.\n\n"""\n\nfrom matplotlib.cbook import flatten, silent_list\n\nimport matplotlib as mpl\n\nfrom matplotlib.dates import (\n date2num, num2date, datestr2num, drange, DateFormatter, DateLocator,\n RRuleLocator, YearLocator, MonthLocator, WeekdayLocator, DayLocator,\n HourLocator, MinuteLocator, SecondLocator, rrule, MO, TU, WE, TH, FR,\n SA, SU, YEARLY, MONTHLY, WEEKLY, DAILY, HOURLY, MINUTELY, SECONDLY,\n relativedelta)\n\n# bring all the symbols in so folks can import them from\n# pylab in one fell swoop\n\n## We are still importing too many things from mlab; more cleanup is needed.\n\nfrom matplotlib.mlab import (\n detrend, detrend_linear, detrend_mean, detrend_none, window_hanning,\n window_none)\n\nfrom matplotlib import cbook, mlab, pyplot as plt\nfrom matplotlib.pyplot import *\n\nfrom numpy import *\nfrom numpy.fft import *\nfrom numpy.random import *\nfrom numpy.linalg import *\n\nimport numpy as np\nimport numpy.ma as ma\n\n# don't let numpy's datetime hide stdlib\nimport datetime\n\n# This 
is needed, or bytes will be numpy.random.bytes from\n# "from numpy.random import *" above\nbytes = __import__("builtins").bytes\n# We also don't want the numpy version of these functions\nabs = __import__("builtins").abs\nbool = __import__("builtins").bool\nmax = __import__("builtins").max\nmin = __import__("builtins").min\npow = __import__("builtins").pow\nround = __import__("builtins").round\n | .venv\Lib\site-packages\matplotlib\pylab.py | pylab.py | Python | 2,369 | 0.95 | 0.029851 | 0.137255 | awesome-app | 94 | 2025-01-17T04:17:57.564565 | GPL-3.0 | false | ac533bb6ef2ce6c06878cc64d656da56 |
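The tail of `pylab.py` above restores shadowed builtins by reaching through `__import__("builtins")`. A stdlib-only sketch of both the problem and the fix, using a module-level `sum` as a stand-in for the name `from numpy import *` would clobber:

```python
# Simulate the shadowing a wildcard import can cause: this module-level
# `sum` hides the builtin (a stand-in for numpy.sum).
def sum(iterable, axis=None):
    return "shadowed"

assert sum([1, 2, 3]) == "shadowed"

# pylab's fix: rebind the name from the builtins module.
sum = __import__("builtins").sum
print(sum([1, 2, 3]))  # 6
```

The same one-liner is applied in `pylab.py` to `bytes`, `abs`, `bool`, `max`, `min`, `pow` and `round`.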
"""\nSupport for plotting vector fields.\n\nPresently this contains Quiver and Barb. Quiver plots an arrow in the\ndirection of the vector, with the size of the arrow related to the\nmagnitude of the vector.\n\nBarbs are like quiver in that they point along a vector, but\nthe magnitude of the vector is given schematically by the presence of barbs\nor flags on the barb.\n\nThis will also become a home for things such as standard\ndeviation ellipses, which can and will be derived very easily from\nthe Quiver code.\n"""\n\nimport math\n\nimport numpy as np\nfrom numpy import ma\n\nfrom matplotlib import _api, cbook, _docstring\nimport matplotlib.artist as martist\nimport matplotlib.collections as mcollections\nfrom matplotlib.patches import CirclePolygon\nimport matplotlib.text as mtext\nimport matplotlib.transforms as transforms\n\n\n_quiver_doc = """\nPlot a 2D field of arrows.\n\nCall signature::\n\n quiver([X, Y], U, V, [C], /, **kwargs)\n\n*X*, *Y* define the arrow locations, *U*, *V* define the arrow directions, and\n*C* optionally sets the color. The arguments *X*, *Y*, *U*, *V*, *C* are\npositional-only.\n\n**Arrow length**\n\nThe default settings auto-scales the length of the arrows to a reasonable size.\nTo change this behavior see the *scale* and *scale_units* parameters.\n\n**Arrow shape**\n\nThe arrow shape is determined by *width*, *headwidth*, *headlength* and\n*headaxislength*. See the notes below.\n\n**Arrow styling**\n\nEach arrow is internally represented by a filled polygon with a default edge\nlinewidth of 0. As a result, an arrow is rather a filled area, not a line with\na head, and `.PolyCollection` properties like *linewidth*, *edgecolor*,\n*facecolor*, etc. 
act accordingly.\n\n\nParameters\n----------\nX, Y : 1D or 2D array-like, optional\n The x and y coordinates of the arrow locations.\n\n If not given, they will be generated as a uniform integer meshgrid based\n on the dimensions of *U* and *V*.\n\n If *X* and *Y* are 1D but *U*, *V* are 2D, *X*, *Y* are expanded to 2D\n using ``X, Y = np.meshgrid(X, Y)``. In this case ``len(X)`` and ``len(Y)``\n must match the column and row dimensions of *U* and *V*.\n\nU, V : 1D or 2D array-like\n The x and y direction components of the arrow vectors. The interpretation\n of these components (in data or in screen space) depends on *angles*.\n\n *U* and *V* must have the same number of elements, matching the number of\n arrow locations in *X*, *Y*. *U* and *V* may be masked. Locations masked\n in any of *U*, *V*, and *C* will not be drawn.\n\nC : 1D or 2D array-like, optional\n Numeric data that defines the arrow colors by colormapping via *norm* and\n *cmap*.\n\n This does not support explicit colors. If you want to set colors directly,\n use *color* instead. The size of *C* must match the number of arrow\n locations.\n\nangles : {'uv', 'xy'} or array-like, default: 'uv'\n Method for determining the angle of the arrows.\n\n - 'uv': Arrow directions are based on\n :ref:`display coordinates <coordinate-systems>`; i.e. a 45° angle will\n always show up as diagonal on the screen, irrespective of figure or Axes\n aspect ratio or Axes data ranges. This is useful when the arrows represent\n a quantity whose direction is not tied to the x and y data coordinates.\n\n If *U* == *V* the orientation of the arrow on the plot is 45 degrees\n counter-clockwise from the horizontal axis (positive to the right).\n\n - 'xy': Arrow direction in data coordinates, i.e. the arrows point from\n (x, y) to (x+u, y+v). 
This is ideal for vector fields or gradient plots\n where the arrows should directly represent movements or gradients in the\n x and y directions.\n\n - Arbitrary angles may be specified explicitly as an array of values\n in degrees, counter-clockwise from the horizontal axis.\n\n In this case *U*, *V* is only used to determine the length of the\n arrows.\n\n For example, ``angles=[30, 60, 90]`` will orient the arrows at 30, 60, and 90\n degrees respectively, regardless of the *U* and *V* components.\n\n Note: inverting a data axis will correspondingly invert the\n arrows only with ``angles='xy'``.\n\npivot : {'tail', 'mid', 'middle', 'tip'}, default: 'tail'\n The part of the arrow that is anchored to the *X*, *Y* grid. The arrow\n rotates about this point.\n\n 'mid' is a synonym for 'middle'.\n\nscale : float, optional\n Scales the length of the arrow inversely.\n\n Number of data values represented by one unit of arrow length on the plot.\n For example, if the data represents velocity in meters per second (m/s), the\n scale parameter determines how many meters per second correspond to one unit of\n arrow length relative to the width of the plot.\n Smaller scale parameter makes the arrow longer.\n\n By default, an autoscaling algorithm is used to scale the arrow length to a\n reasonable size, which is based on the average vector length and the number of\n vectors.\n\n The arrow length unit is given by the *scale_units* parameter.\n\nscale_units : {'width', 'height', 'dots', 'inches', 'x', 'y', 'xy'}, default: 'width'\n\n The physical image unit, which is used for rendering the scaled arrow data *U*, *V*.\n\n The rendered arrow length is given by\n\n length in x direction = $\\frac{u}{\\mathrm{scale}} \\mathrm{scale_unit}$\n\n length in y direction = $\\frac{v}{\\mathrm{scale}} \\mathrm{scale_unit}$\n\n For example, ``(u, v) = (0.5, 0)`` with ``scale=10, scale_unit="width"`` results\n in a horizontal arrow with a length of *0.5 / 10 * "width"*, i.e. 
0.05 times the\n Axes width.\n\n Supported values are:\n\n - 'width' or 'height': The arrow length is scaled relative to the width or height\n of the Axes.\n For example, ``scale_units='width', scale=1.0`` will result in an arrow length\n equal to the width of the Axes.\n\n - 'dots': The arrow length is measured in display dots (pixels).\n\n - 'inches': Arrow lengths are scaled based on the DPI (dots per inch) of the figure.\n This ensures that the arrows have a consistent physical size on the figure,\n in inches, regardless of data values or plot scaling.\n For example, ``(u, v) = (1, 0)`` with ``scale_units='inches', scale=2`` results\n in a 0.5 inch-long arrow.\n\n - 'x' or 'y': The arrow length is scaled relative to the x or y axis units.\n For example, ``(u, v) = (0, 1)`` with ``scale_units='x', scale=1`` results\n in a vertical arrow with a length of 1 x-axis unit.\n\n - 'xy': The arrow length will be the same as for 'x' or 'y' units.\n This is useful for creating vectors in the x-y plane where *u* and *v* have\n the same units as *x* and *y*; in that case, use\n ``angles='xy', scale_units='xy', scale=1``.\n\n Note: Setting *scale_units* without setting *scale* does not have any effect because\n the scale units only differ by a constant factor and that is rescaled through\n autoscaling.\n\nunits : {'width', 'height', 'dots', 'inches', 'x', 'y', 'xy'}, default: 'width'\n Affects the arrow size (except for the length). 
In particular, the shaft\n *width* is measured in multiples of this unit.\n\n Supported values are:\n\n - 'width', 'height': The width or height of the Axes.\n - 'dots', 'inches': Pixels or inches based on the figure dpi.\n - 'x', 'y', 'xy': *X*, *Y* or :math:`\sqrt{X^2 + Y^2}` in data units.\n\n The following table summarizes how these values affect the visible arrow\n size under zooming and figure size changes:\n\n ================= ================= ==================\n units zoom figure size change\n ================= ================= ==================\n 'x', 'y', 'xy' arrow size scales —\n 'width', 'height' — arrow size scales\n 'dots', 'inches' — —\n ================= ================= ==================\n\nwidth : float, optional\n Shaft width in arrow units. All head parameters are relative to *width*.\n\n The default depends on the choice of *units* above and the number of vectors;\n a typical starting value is about 0.005 times the width of the plot.\n\nheadwidth : float, default: 3\n Head width as a multiple of shaft *width*. See the notes below.\n\nheadlength : float, default: 5\n Head length as a multiple of shaft *width*. See the notes below.\n\nheadaxislength : float, default: 4.5\n Head length at the shaft intersection as a multiple of shaft *width*.\n See the notes below.\n\nminshaft : float, default: 1\n Length below which the arrow scales, in units of head length. Do not\n set this to less than 1, or small arrows will look terrible!\n\nminlength : float, default: 1\n Minimum length as a multiple of shaft width; if an arrow length\n is less than this, plot a dot (hexagon) of this diameter instead.\n\ncolor : :mpltype:`color` or list of :mpltype:`color`, optional\n Explicit color(s) for the arrows. 
If *C* has been set, *color* has no\n effect.\n\n This is a synonym for the `.PolyCollection` *facecolor* parameter.\n\nOther Parameters\n----------------\ndata : indexable object, optional\n DATA_PARAMETER_PLACEHOLDER\n\n**kwargs : `~matplotlib.collections.PolyCollection` properties, optional\n All other keyword arguments are passed on to `.PolyCollection`:\n\n %(PolyCollection:kwdoc)s\n\nReturns\n-------\n`~matplotlib.quiver.Quiver`\n\nSee Also\n--------\n.Axes.quiverkey : Add a key to a quiver plot.\n\nNotes\n-----\n\n**Arrow shape**\n\nThe arrow is drawn as a polygon using the nodes as shown below. The values\n*headwidth*, *headlength*, and *headaxislength* are in units of *width*.\n\n.. image:: /_static/quiver_sizes.svg\n :width: 500px\n\nThe defaults give a slightly swept-back arrow. Here are some guidelines on how to\nget other head shapes:\n\n- To make the head a triangle, make *headaxislength* the same as *headlength*.\n- To make the arrow more pointed, reduce *headwidth* or increase *headlength*\n and *headaxislength*.\n- To make the head smaller relative to the shaft, scale down all the head\n parameters proportionally.\n- To remove the head completely, set all *head* parameters to 0.\n- To get a diamond-shaped head, make *headaxislength* larger than *headlength*.\n- Warning: For *headaxislength* < (*headlength* / *headwidth*), the "headaxis"\n nodes (i.e. 
the ones connecting the head with the shaft) will protrude out\n of the head in forward direction so that the arrow head looks broken.\n""" % _docstring.interpd.params\n\n_docstring.interpd.register(quiver_doc=_quiver_doc)\n\n\nclass QuiverKey(martist.Artist):\n """Labelled arrow for use as a quiver plot scale key."""\n halign = {'N': 'center', 'S': 'center', 'E': 'left', 'W': 'right'}\n valign = {'N': 'bottom', 'S': 'top', 'E': 'center', 'W': 'center'}\n pivot = {'N': 'middle', 'S': 'middle', 'E': 'tip', 'W': 'tail'}\n\n def __init__(self, Q, X, Y, U, label,\n *, angle=0, coordinates='axes', color=None, labelsep=0.1,\n labelpos='N', labelcolor=None, fontproperties=None,\n zorder=None, **kwargs):\n """\n Add a key to a quiver plot.\n\n The positioning of the key depends on *X*, *Y*, *coordinates*, and\n *labelpos*. If *labelpos* is 'N' or 'S', *X*, *Y* give the position of\n the middle of the key arrow. If *labelpos* is 'E', *X*, *Y* positions\n the head, and if *labelpos* is 'W', *X*, *Y* positions the tail; in\n either of these two cases, *X*, *Y* is somewhere in the middle of the\n arrow+label key object.\n\n Parameters\n ----------\n Q : `~matplotlib.quiver.Quiver`\n A `.Quiver` object as returned by a call to `~.Axes.quiver()`.\n X, Y : float\n The location of the key.\n U : float\n The length of the key.\n label : str\n The key label (e.g., length and units of the key).\n angle : float, default: 0\n The angle of the key arrow, in degrees anti-clockwise from the\n horizontal axis.\n coordinates : {'axes', 'figure', 'data', 'inches'}, default: 'axes'\n Coordinate system and units for *X*, *Y*: 'axes' and 'figure' are\n normalized coordinate systems with (0, 0) in the lower left and\n (1, 1) in the upper right; 'data' are the axes data coordinates\n (used for the locations of the vectors in the quiver plot itself);\n 'inches' is position in the figure in inches, with (0, 0) at the\n lower left corner.\n color : :mpltype:`color`\n Overrides face and edge colors 
from *Q*.\n labelpos : {'N', 'S', 'E', 'W'}\n Position the label above, below, to the right, to the left of the\n arrow, respectively.\n labelsep : float, default: 0.1\n Distance in inches between the arrow and the label.\n labelcolor : :mpltype:`color`, default: :rc:`text.color`\n Label color.\n fontproperties : dict, optional\n A dictionary with keyword arguments accepted by the\n `~matplotlib.font_manager.FontProperties` initializer:\n *family*, *style*, *variant*, *size*, *weight*.\n zorder : float\n The zorder of the key. The default is 0.1 above *Q*.\n **kwargs\n Any additional keyword arguments are used to override vector\n properties taken from *Q*.\n """\n super().__init__()\n self.Q = Q\n self.X = X\n self.Y = Y\n self.U = U\n self.angle = angle\n self.coord = coordinates\n self.color = color\n self.label = label\n self._labelsep_inches = labelsep\n\n self.labelpos = labelpos\n self.labelcolor = labelcolor\n self.fontproperties = fontproperties or dict()\n self.kw = kwargs\n self.text = mtext.Text(\n text=label,\n horizontalalignment=self.halign[self.labelpos],\n verticalalignment=self.valign[self.labelpos],\n fontproperties=self.fontproperties)\n if self.labelcolor is not None:\n self.text.set_color(self.labelcolor)\n self._dpi_at_last_init = None\n self.zorder = zorder if zorder is not None else Q.zorder + 0.1\n\n @property\n def labelsep(self):\n return self._labelsep_inches * self.Q.axes.get_figure(root=True).dpi\n\n def _init(self):\n if True: # self._dpi_at_last_init != self.axes.get_figure().dpi\n if self.Q._dpi_at_last_init != self.Q.axes.get_figure(root=True).dpi:\n self.Q._init()\n self._set_transform()\n with cbook._setattr_cm(self.Q, pivot=self.pivot[self.labelpos],\n # Hack: save and restore the Umask\n Umask=ma.nomask):\n u = self.U * np.cos(np.radians(self.angle))\n v = self.U * np.sin(np.radians(self.angle))\n self.verts = self.Q._make_verts([[0., 0.]],\n np.array([u]), np.array([v]), 'uv')\n kwargs = self.Q.polykw\n 
kwargs.update(self.kw)\n self.vector = mcollections.PolyCollection(\n self.verts,\n offsets=[(self.X, self.Y)],\n offset_transform=self.get_transform(),\n **kwargs)\n if self.color is not None:\n self.vector.set_color(self.color)\n self.vector.set_transform(self.Q.get_transform())\n self.vector.set_figure(self.get_figure())\n self._dpi_at_last_init = self.Q.axes.get_figure(root=True).dpi\n\n def _text_shift(self):\n return {\n "N": (0, +self.labelsep),\n "S": (0, -self.labelsep),\n "E": (+self.labelsep, 0),\n "W": (-self.labelsep, 0),\n }[self.labelpos]\n\n @martist.allow_rasterization\n def draw(self, renderer):\n self._init()\n self.vector.draw(renderer)\n pos = self.get_transform().transform((self.X, self.Y))\n self.text.set_position(pos + self._text_shift())\n self.text.draw(renderer)\n self.stale = False\n\n def _set_transform(self):\n fig = self.Q.axes.get_figure(root=False)\n self.set_transform(_api.check_getitem({\n "data": self.Q.axes.transData,\n "axes": self.Q.axes.transAxes,\n "figure": fig.transFigure,\n "inches": fig.dpi_scale_trans,\n }, coordinates=self.coord))\n\n def set_figure(self, fig):\n super().set_figure(fig)\n self.text.set_figure(fig)\n\n def contains(self, mouseevent):\n if self._different_canvas(mouseevent):\n return False, {}\n # Maybe the dictionary should allow one to\n # distinguish between a text hit and a vector hit.\n if (self.text.contains(mouseevent)[0] or\n self.vector.contains(mouseevent)[0]):\n return True, {}\n return False, {}\n\n\ndef _parse_args(*args, caller_name='function'):\n """\n Helper function to parse positional parameters for colored vector plots.\n\n This is currently used for Quiver and Barbs.\n\n Parameters\n ----------\n *args : list\n list of 2-5 arguments. 
Depending on their number they are parsed to::\n\n U, V\n U, V, C\n X, Y, U, V\n X, Y, U, V, C\n\n caller_name : str\n Name of the calling method (used in error messages).\n """\n X = Y = C = None\n\n nargs = len(args)\n if nargs == 2:\n # The use of atleast_1d allows for handling scalar arguments while also\n # keeping masked arrays\n U, V = np.atleast_1d(*args)\n elif nargs == 3:\n U, V, C = np.atleast_1d(*args)\n elif nargs == 4:\n X, Y, U, V = np.atleast_1d(*args)\n elif nargs == 5:\n X, Y, U, V, C = np.atleast_1d(*args)\n else:\n raise _api.nargs_error(caller_name, takes="from 2 to 5", given=nargs)\n\n nr, nc = (1, U.shape[0]) if U.ndim == 1 else U.shape\n\n if X is not None:\n X = X.ravel()\n Y = Y.ravel()\n if len(X) == nc and len(Y) == nr:\n X, Y = (a.ravel() for a in np.meshgrid(X, Y))\n elif len(X) != len(Y):\n raise ValueError('X and Y must be the same size, but '\n f'X.size is {X.size} and Y.size is {Y.size}.')\n else:\n indexgrid = np.meshgrid(np.arange(nc), np.arange(nr))\n X, Y = (np.ravel(a) for a in indexgrid)\n # Size validation for U, V, C is left to the set_UVC method.\n return X, Y, U, V, C\n\n\ndef _check_consistent_shapes(*arrays):\n all_shapes = {a.shape for a in arrays}\n if len(all_shapes) != 1:\n raise ValueError('The shapes of the passed in arrays do not match')\n\n\nclass Quiver(mcollections.PolyCollection):\n """\n Specialized PolyCollection for arrows.\n\n The only API method is set_UVC(), which can be used\n to change the size, orientation, and color of the\n arrows; their locations are fixed when the class is\n instantiated. Possibly this method will be useful\n in animations.\n\n Much of the work in this class is done in the draw()\n method so that as much information as possible is available\n about the plot. 
In subsequent draw() calls, recalculation\n is limited to things that might have changed, so there\n should be no performance penalty from putting the calculations\n in the draw() method.\n """\n\n _PIVOT_VALS = ('tail', 'middle', 'tip')\n\n @_docstring.Substitution(_quiver_doc)\n def __init__(self, ax, *args,\n scale=None, headwidth=3, headlength=5, headaxislength=4.5,\n minshaft=1, minlength=1, units='width', scale_units=None,\n angles='uv', width=None, color='k', pivot='tail', **kwargs):\n """\n The constructor takes one required argument, an Axes\n instance, followed by the args and kwargs described\n by the following pyplot interface documentation:\n %s\n """\n self._axes = ax # The attr actually set by the Artist.axes property.\n X, Y, U, V, C = _parse_args(*args, caller_name='quiver')\n self.X = X\n self.Y = Y\n self.XY = np.column_stack((X, Y))\n self.N = len(X)\n self.scale = scale\n self.headwidth = headwidth\n self.headlength = float(headlength)\n self.headaxislength = headaxislength\n self.minshaft = minshaft\n self.minlength = minlength\n self.units = units\n self.scale_units = scale_units\n self.angles = angles\n self.width = width\n\n if pivot.lower() == 'mid':\n pivot = 'middle'\n self.pivot = pivot.lower()\n _api.check_in_list(self._PIVOT_VALS, pivot=self.pivot)\n\n self.transform = kwargs.pop('transform', ax.transData)\n kwargs.setdefault('facecolors', color)\n kwargs.setdefault('linewidths', (0,))\n super().__init__([], offsets=self.XY, offset_transform=self.transform,\n closed=False, **kwargs)\n self.polykw = kwargs\n self.set_UVC(U, V, C)\n self._dpi_at_last_init = None\n\n def _init(self):\n """\n Initialization delayed until first draw;\n allow time for axes setup.\n """\n # It seems that there are not enough event notifications\n # available to have this work on an as-needed basis at present.\n if True: # self._dpi_at_last_init != self.axes.figure.dpi\n trans = self._set_transform()\n self.span = 
trans.inverted().transform_bbox(self.axes.bbox).width\n if self.width is None:\n sn = np.clip(math.sqrt(self.N), 8, 25)\n self.width = 0.06 * self.span / sn\n\n # _make_verts sets self.scale if not already specified\n if (self._dpi_at_last_init != self.axes.get_figure(root=True).dpi\n and self.scale is None):\n self._make_verts(self.XY, self.U, self.V, self.angles)\n\n self._dpi_at_last_init = self.axes.get_figure(root=True).dpi\n\n def get_datalim(self, transData):\n trans = self.get_transform()\n offset_trf = self.get_offset_transform()\n full_transform = (trans - transData) + (offset_trf - transData)\n XY = full_transform.transform(self.XY)\n bbox = transforms.Bbox.null()\n bbox.update_from_data_xy(XY, ignore=True)\n return bbox\n\n @martist.allow_rasterization\n def draw(self, renderer):\n self._init()\n verts = self._make_verts(self.XY, self.U, self.V, self.angles)\n self.set_verts(verts, closed=False)\n super().draw(renderer)\n self.stale = False\n\n def set_UVC(self, U, V, C=None):\n # We need to ensure we have a copy, not a reference\n # to an array that might change before draw().\n U = ma.masked_invalid(U, copy=True).ravel()\n V = ma.masked_invalid(V, copy=True).ravel()\n if C is not None:\n C = ma.masked_invalid(C, copy=True).ravel()\n for name, var in zip(('U', 'V', 'C'), (U, V, C)):\n if not (var is None or var.size == self.N or var.size == 1):\n raise ValueError(f'Argument {name} has a size {var.size}'\n f' which does not match {self.N},'\n ' the number of arrow positions')\n\n mask = ma.mask_or(U.mask, V.mask, copy=False, shrink=True)\n if C is not None:\n mask = ma.mask_or(mask, C.mask, copy=False, shrink=True)\n if mask is ma.nomask:\n C = C.filled()\n else:\n C = ma.array(C, mask=mask, copy=False)\n self.U = U.filled(1)\n self.V = V.filled(1)\n self.Umask = mask\n if C is not None:\n self.set_array(C)\n self.stale = True\n\n def _dots_per_unit(self, units):\n """Return a scale factor for converting from units to pixels."""\n bb = self.axes.bbox\n 
vl = self.axes.viewLim\n return _api.check_getitem({\n 'x': bb.width / vl.width,\n 'y': bb.height / vl.height,\n 'xy': np.hypot(*bb.size) / np.hypot(*vl.size),\n 'width': bb.width,\n 'height': bb.height,\n 'dots': 1.,\n 'inches': self.axes.get_figure(root=True).dpi,\n }, units=units)\n\n def _set_transform(self):\n """\n Set the PolyCollection transform to go\n from arrow width units to pixels.\n """\n dx = self._dots_per_unit(self.units)\n self._trans_scale = dx # pixels per arrow width unit\n trans = transforms.Affine2D().scale(dx)\n self.set_transform(trans)\n return trans\n\n # Calculate angles and lengths for segment between (x, y), (x+u, y+v)\n def _angles_lengths(self, XY, U, V, eps=1):\n xy = self.axes.transData.transform(XY)\n uv = np.column_stack((U, V))\n xyp = self.axes.transData.transform(XY + eps * uv)\n dxy = xyp - xy\n angles = np.arctan2(dxy[:, 1], dxy[:, 0])\n lengths = np.hypot(*dxy.T) / eps\n return angles, lengths\n\n # XY is stacked [X, Y].\n # See quiver() doc for meaning of X, Y, U, V, angles.\n def _make_verts(self, XY, U, V, angles):\n uv = (U + V * 1j)\n str_angles = angles if isinstance(angles, str) else ''\n if str_angles == 'xy' and self.scale_units == 'xy':\n # Here eps is 1 so that if we get U, V by diffing\n # the X, Y arrays, the vectors will connect the\n # points, regardless of the axis scaling (including log).\n angles, lengths = self._angles_lengths(XY, U, V, eps=1)\n elif str_angles == 'xy' or self.scale_units == 'xy':\n # Calculate eps based on the extents of the plot\n # so that we don't end up with roundoff error from\n # adding a small number to a large.\n eps = np.abs(self.axes.dataLim.extents).max() * 0.001\n angles, lengths = self._angles_lengths(XY, U, V, eps=eps)\n\n if str_angles and self.scale_units == 'xy':\n a = lengths\n else:\n a = np.abs(uv)\n\n if self.scale is None:\n sn = max(10, math.sqrt(self.N))\n if self.Umask is not ma.nomask:\n amean = a[~self.Umask].mean()\n else:\n amean = a.mean()\n # crude 
auto-scaling\n # scale is typical arrow length as a multiple of the arrow width\n scale = 1.8 * amean * sn / self.span\n\n if self.scale_units is None:\n if self.scale is None:\n self.scale = scale\n widthu_per_lenu = 1.0\n else:\n if self.scale_units == 'xy':\n dx = 1\n else:\n dx = self._dots_per_unit(self.scale_units)\n widthu_per_lenu = dx / self._trans_scale\n if self.scale is None:\n self.scale = scale * widthu_per_lenu\n length = a * (widthu_per_lenu / (self.scale * self.width))\n X, Y = self._h_arrows(length)\n if str_angles == 'xy':\n theta = angles\n elif str_angles == 'uv':\n theta = np.angle(uv)\n else:\n theta = ma.masked_invalid(np.deg2rad(angles)).filled(0)\n theta = theta.reshape((-1, 1)) # for broadcasting\n xy = (X + Y * 1j) * np.exp(1j * theta) * self.width\n XY = np.stack((xy.real, xy.imag), axis=2)\n if self.Umask is not ma.nomask:\n XY = ma.array(XY)\n XY[self.Umask] = ma.masked\n # This might be handled more efficiently with nans, given\n # that nans will end up in the paths anyway.\n\n return XY\n\n def _h_arrows(self, length):\n """Length is in arrow width units."""\n # It might be possible to streamline the code\n # and speed it up a bit by using complex (x, y)\n # instead of separate arrays; but any gain would be slight.\n minsh = self.minshaft * self.headlength\n N = len(length)\n length = length.reshape(N, 1)\n # This number is chosen based on when pixel values overflow in Agg\n # causing rendering errors\n # length = np.minimum(length, 2 ** 16)\n np.clip(length, 0, 2 ** 16, out=length)\n # x, y: normal horizontal arrow\n x = np.array([0, -self.headaxislength,\n -self.headlength, 0],\n np.float64)\n x = x + np.array([0, 1, 1, 1]) * length\n y = 0.5 * np.array([1, 1, self.headwidth, 0], np.float64)\n y = np.repeat(y[np.newaxis, :], N, axis=0)\n # x0, y0: arrow without shaft, for short vectors\n x0 = np.array([0, minsh - self.headaxislength,\n minsh - self.headlength, minsh], np.float64)\n y0 = 0.5 * np.array([1, 1, self.headwidth, 0], 
np.float64)\n ii = [0, 1, 2, 3, 2, 1, 0, 0]\n X = x[:, ii]\n Y = y[:, ii]\n Y[:, 3:-1] *= -1\n X0 = x0[ii]\n Y0 = y0[ii]\n Y0[3:-1] *= -1\n shrink = length / minsh if minsh != 0. else 0.\n X0 = shrink * X0[np.newaxis, :]\n Y0 = shrink * Y0[np.newaxis, :]\n short = np.repeat(length < minsh, 8, axis=1)\n # Now select X0, Y0 if short, otherwise X, Y\n np.copyto(X, X0, where=short)\n np.copyto(Y, Y0, where=short)\n if self.pivot == 'middle':\n X -= 0.5 * X[:, 3, np.newaxis]\n elif self.pivot == 'tip':\n # numpy bug? using -= does not work here unless we multiply by a\n # float first, as with 'mid'.\n X = X - X[:, 3, np.newaxis]\n elif self.pivot != 'tail':\n _api.check_in_list(["middle", "tip", "tail"], pivot=self.pivot)\n\n tooshort = length < self.minlength\n if tooshort.any():\n # Use a heptagonal dot:\n th = np.arange(0, 8, 1, np.float64) * (np.pi / 3.0)\n x1 = np.cos(th) * self.minlength * 0.5\n y1 = np.sin(th) * self.minlength * 0.5\n X1 = np.repeat(x1[np.newaxis, :], N, axis=0)\n Y1 = np.repeat(y1[np.newaxis, :], N, axis=0)\n tooshort = np.repeat(tooshort, 8, 1)\n np.copyto(X, X1, where=tooshort)\n np.copyto(Y, Y1, where=tooshort)\n # Mask handling is deferred to the caller, _make_verts.\n return X, Y\n\n\n_barbs_doc = r"""\nPlot a 2D field of wind barbs.\n\nCall signature::\n\n barbs([X, Y], U, V, [C], /, **kwargs)\n\nWhere *X*, *Y* define the barb locations, *U*, *V* define the barb\ndirections, and *C* optionally sets the color.\n\nThe arguments *X*, *Y*, *U*, *V*, *C* are positional-only and may be\n1D or 2D. *U*, *V*, *C* may be masked arrays, but masked *X*, *Y*\nare not supported at present.\n\nBarbs are traditionally used in meteorology as a way to plot the speed\nand direction of wind observations, but can technically be used to\nplot any two dimensional vector quantity. 
As opposed to arrows, which\ngive vector magnitude by the length of the arrow, the barbs give more\nquantitative information about the vector magnitude by putting slanted\nlines or a triangle for various increments in magnitude, as shown\nschematically below::\n\n : /\ \\n : / \ \\n : / \ \ \\n : / \ \ \\n : ------------------------------\n\nThe largest increment is given by a triangle (or "flag"). After those\ncome full lines (barbs). The smallest increment is a half line. There\nis only, of course, ever at most 1 half line. If the magnitude is\nsmall and only needs a single half-line and no full lines or\ntriangles, the half-line is offset from the end of the barb so that it\ncan be easily distinguished from barbs with a single full line. The\nmagnitude for the barb shown above would nominally be 65, using the\nstandard increments of 50, 10, and 5.\n\nSee also https://en.wikipedia.org/wiki/Wind_barb.\n\nParameters\n----------\nX, Y : 1D or 2D array-like, optional\n The x and y coordinates of the barb locations. See *pivot* for how the\n barbs are drawn to the x, y positions.\n\n If not given, they will be generated as a uniform integer meshgrid based\n on the dimensions of *U* and *V*.\n\n If *X* and *Y* are 1D but *U*, *V* are 2D, *X*, *Y* are expanded to 2D\n using ``X, Y = np.meshgrid(X, Y)``. In this case ``len(X)`` and ``len(Y)``\n must match the column and row dimensions of *U* and *V*.\n\nU, V : 1D or 2D array-like\n The x and y components of the barb shaft.\n\nC : 1D or 2D array-like, optional\n Numeric data that defines the barb colors by colormapping via *norm* and\n *cmap*.\n\n This does not support explicit colors. If you want to set colors directly,\n use *barbcolor* instead.\n\nlength : float, default: 7\n Length of the barb in points; the other parts of the barb\n are scaled against this.\n\npivot : {'tip', 'middle'} or float, default: 'tip'\n The part of the arrow that is anchored to the *X*, *Y* grid. The barb\n rotates about this point. 
This can also be a number, which shifts the\n start of the barb that many points away from the grid point.\n\nbarbcolor : :mpltype:`color` or color sequence\n The color of all parts of the barb except for the flags. This parameter\n is analogous to the *edgecolor* parameter for polygons, which can be used\n instead. However, this parameter will override *facecolor*.\n\nflagcolor : :mpltype:`color` or color sequence\n The color of any flags on the barb. This parameter is analogous to the\n *facecolor* parameter for polygons, which can be used instead. However,\n this parameter will override *facecolor*. If this is not set (and *C* has\n not been set either) then *flagcolor* will be set to match *barbcolor* so that the\n barb has a uniform color. If *C* has been set, *flagcolor* has no effect.\n\nsizes : dict, optional\n A dictionary of coefficients specifying the ratio of a given\n feature to the length of the barb. Only those values one wishes to\n override need to be included. These features include:\n\n - 'spacing' - space between features (flags, full/half barbs)\n - 'height' - height (distance from shaft to top) of a flag or full barb\n - 'width' - width of a flag, twice the width of a full barb\n - 'emptybarb' - radius of the circle used for low magnitudes\n\nfill_empty : bool, default: False\n Whether the empty barbs (circles) that are drawn should be filled with\n the flag color. If they are not filled, the center is transparent.\n\nrounding : bool, default: True\n Whether the vector magnitude should be rounded when allocating barb\n components. If True, the magnitude is rounded to the nearest multiple\n of the half-barb increment. If False, the magnitude is simply truncated\n to the next lowest multiple.\n\nbarb_increments : dict, optional\n A dictionary of increments specifying values to associate with\n different parts of the barb. 
Only those values one wishes to\n override need to be included.\n\n - 'half' - half barbs (Default is 5)\n - 'full' - full barbs (Default is 10)\n - 'flag' - flags (default is 50)\n\nflip_barb : bool or array-like of bool, default: False\n Whether the lines and flags should point opposite to normal.\n Normal behavior is for the barbs and lines to point right (comes from wind\n barbs having these features point towards low pressure in the Northern\n Hemisphere).\n\n A single value is applied to all barbs. Individual barbs can be flipped by\n passing a bool array of the same size as *U* and *V*.\n\nReturns\n-------\nbarbs : `~matplotlib.quiver.Barbs`\n\nOther Parameters\n----------------\ndata : indexable object, optional\n DATA_PARAMETER_PLACEHOLDER\n\n**kwargs\n The barbs can further be customized using `.PolyCollection` keyword\n arguments:\n\n %(PolyCollection:kwdoc)s\n""" % _docstring.interpd.params\n\n_docstring.interpd.register(barbs_doc=_barbs_doc)\n\n\nclass Barbs(mcollections.PolyCollection):\n """\n Specialized PolyCollection for barbs.\n\n The only API method is :meth:`set_UVC`, which can be used to\n change the size, orientation, and color of the arrows. Locations\n are changed using the :meth:`set_offsets` collection method.\n Possibly this method will be useful in animations.\n\n There is one internal function :meth:`_find_tails` which finds\n exactly what should be put on the barb given the vector magnitude.\n From there :meth:`_make_barbs` is used to find the vertices of the\n polygon to represent the barb based on this information.\n """\n\n # This may be an abuse of polygons here to render what is essentially maybe\n # 1 triangle and a series of lines. 
It works fine as far as I can tell\n # however.\n\n @_docstring.interpd\n def __init__(self, ax, *args,\n pivot='tip', length=7, barbcolor=None, flagcolor=None,\n sizes=None, fill_empty=False, barb_increments=None,\n rounding=True, flip_barb=False, **kwargs):\n """\n The constructor takes one required argument, an Axes\n instance, followed by the args and kwargs described\n by the following pyplot interface documentation:\n %(barbs_doc)s\n """\n self.sizes = sizes or dict()\n self.fill_empty = fill_empty\n self.barb_increments = barb_increments or dict()\n self.rounding = rounding\n self.flip = np.atleast_1d(flip_barb)\n transform = kwargs.pop('transform', ax.transData)\n self._pivot = pivot\n self._length = length\n\n # Flagcolor and barbcolor provide convenience parameters for\n # setting the facecolor and edgecolor, respectively, of the barb\n # polygon. We also work here to make the flag the same color as the\n # rest of the barb by default\n\n if None in (barbcolor, flagcolor):\n kwargs['edgecolors'] = 'face'\n if flagcolor:\n kwargs['facecolors'] = flagcolor\n elif barbcolor:\n kwargs['facecolors'] = barbcolor\n else:\n # Set to facecolor passed in or default to black\n kwargs.setdefault('facecolors', 'k')\n else:\n kwargs['edgecolors'] = barbcolor\n kwargs['facecolors'] = flagcolor\n\n # Explicitly set a line width if we're not given one, otherwise\n # polygons are not outlined and we get no barbs\n if 'linewidth' not in kwargs and 'lw' not in kwargs:\n kwargs['linewidth'] = 1\n\n # Parse out the data arrays from the various configurations supported\n x, y, u, v, c = _parse_args(*args, caller_name='barbs')\n self.x = x\n self.y = y\n xy = np.column_stack((x, y))\n\n # Make a collection\n barb_size = self._length ** 2 / 4 # Empirically determined\n super().__init__(\n [], (barb_size,), offsets=xy, offset_transform=transform, **kwargs)\n self.set_transform(transforms.IdentityTransform())\n\n self.set_UVC(u, v, c)\n\n def _find_tails(self, mag, rounding=True, 
half=5, full=10, flag=50):\n """\n Find how many of each of the tail pieces are necessary.\n\n Parameters\n ----------\n mag : `~numpy.ndarray`\n Vector magnitudes; must be non-negative (and an actual ndarray).\n rounding : bool, default: True\n Whether to round or to truncate to the nearest half-barb.\n half, full, flag : float, defaults: 5, 10, 50\n Increments for a half-barb, a barb, and a flag.\n\n Returns\n -------\n n_flags, n_barbs : int array\n For each entry in *mag*, the number of flags and barbs.\n half_flag : bool array\n For each entry in *mag*, whether a half-barb is needed.\n empty_flag : bool array\n For each entry in *mag*, whether nothing is drawn.\n """\n # If rounding, round to the nearest multiple of half, the smallest\n # increment\n if rounding:\n mag = half * np.around(mag / half)\n n_flags, mag = divmod(mag, flag)\n n_barb, mag = divmod(mag, full)\n half_flag = mag >= half\n empty_flag = ~(half_flag | (n_flags > 0) | (n_barb > 0))\n return n_flags.astype(int), n_barb.astype(int), half_flag, empty_flag\n\n def _make_barbs(self, u, v, nflags, nbarbs, half_barb, empty_flag, length,\n pivot, sizes, fill_empty, flip):\n """\n Create the wind barbs.\n\n Parameters\n ----------\n u, v\n Components of the vector in the x and y directions, respectively.\n\n nflags, nbarbs, half_barb, empty_flag\n Respectively, the number of flags, number of barbs, flag for\n half a barb, and flag for empty barb, ostensibly obtained from\n :meth:`_find_tails`.\n\n length\n The length of the barb staff in points.\n\n pivot : {"tip", "middle"} or number\n The point on the barb around which the entire barb should be\n rotated. If a number, the start of the barb is shifted by that\n many points from the origin.\n\n sizes : dict\n Coefficients specifying the ratio of a given feature to the length\n of the barb. 
These features include:\n\n - *spacing*: space between features (flags, full/half barbs).\n - *height*: distance from the shaft to the top of a flag or full barb.\n - *width*: width of a flag, twice the width of a full barb.\n - *emptybarb*: radius of the circle used for low magnitudes.\n\n fill_empty : bool\n Whether the circle representing an empty barb should be filled or\n not (this changes the drawing of the polygon).\n\n flip : list of bool\n Whether the features should be flipped to the other side of the\n barb (useful for winds in the southern hemisphere).\n\n Returns\n -------\n list of arrays of vertices\n Polygon vertices for each of the wind barbs. These polygons have\n been rotated to properly align with the vector direction.\n """\n\n # These control the spacing and size of barb elements relative to the\n # length of the shaft\n spacing = length * sizes.get('spacing', 0.125)\n full_height = length * sizes.get('height', 0.4)\n full_width = length * sizes.get('width', 0.25)\n empty_rad = length * sizes.get('emptybarb', 0.15)\n\n # Controls y point where to pivot the barb.\n pivot_points = dict(tip=0.0, middle=-length / 2.)\n\n endx = 0.0\n try:\n endy = float(pivot)\n except ValueError:\n endy = pivot_points[pivot.lower()]\n\n # Get the appropriate angle for the vector components. The offset is\n # due to the way the barb is initially drawn, going down the y-axis.\n # This makes sense in a meteorological mode of thinking since there 0\n # degrees corresponds to north (the y-axis traditionally)\n angles = -(ma.arctan2(v, u) + np.pi / 2)\n\n # Used for low magnitude. We just get the vertices, so if we make it\n # out here, it can be reused. 
The center set here should put the\n # center of the circle at the location(offset), rather than at the\n # same point as the barb pivot; this seems more sensible.\n circ = CirclePolygon((0, 0), radius=empty_rad).get_verts()\n if fill_empty:\n empty_barb = circ\n else:\n # If we don't want the empty one filled, we make a degenerate\n # polygon that wraps back over itself\n empty_barb = np.concatenate((circ, circ[::-1]))\n\n barb_list = []\n for index, angle in np.ndenumerate(angles):\n # If the vector magnitude is too weak to draw anything, plot an\n # empty circle instead\n if empty_flag[index]:\n # We can skip the transform since the circle has no preferred\n # orientation\n barb_list.append(empty_barb)\n continue\n\n poly_verts = [(endx, endy)]\n offset = length\n\n # Handle if this barb should be flipped\n barb_height = -full_height if flip[index] else full_height\n\n # Add vertices for each flag\n for i in range(nflags[index]):\n # The spacing that works for the barbs is a little to much for\n # the flags, but this only occurs when we have more than 1\n # flag.\n if offset != length:\n offset += spacing / 2.\n poly_verts.extend(\n [[endx, endy + offset],\n [endx + barb_height, endy - full_width / 2 + offset],\n [endx, endy - full_width + offset]])\n\n offset -= full_width + spacing\n\n # Add vertices for each barb. 
These really are lines, but works\n # great adding 3 vertices that basically pull the polygon out and\n # back down the line\n for i in range(nbarbs[index]):\n poly_verts.extend(\n [(endx, endy + offset),\n (endx + barb_height, endy + offset + full_width / 2),\n (endx, endy + offset)])\n\n offset -= spacing\n\n # Add the vertices for half a barb, if needed\n if half_barb[index]:\n # If the half barb is the first on the staff, traditionally it\n # is offset from the end to make it easy to distinguish from a\n # barb with a full one\n if offset == length:\n poly_verts.append((endx, endy + offset))\n offset -= 1.5 * spacing\n poly_verts.extend(\n [(endx, endy + offset),\n (endx + barb_height / 2, endy + offset + full_width / 4),\n (endx, endy + offset)])\n\n # Rotate the barb according the angle. Making the barb first and\n # then rotating it made the math for drawing the barb really easy.\n # Also, the transform framework makes doing the rotation simple.\n poly_verts = transforms.Affine2D().rotate(-angle).transform(\n poly_verts)\n barb_list.append(poly_verts)\n\n return barb_list\n\n def set_UVC(self, U, V, C=None):\n # We need to ensure we have a copy, not a reference to an array that\n # might change before draw().\n self.u = ma.masked_invalid(U, copy=True).ravel()\n self.v = ma.masked_invalid(V, copy=True).ravel()\n\n # Flip needs to have the same number of entries as everything else.\n # Use broadcast_to to avoid a bloated array of identical values.\n # (can't rely on actual broadcasting)\n if len(self.flip) == 1:\n flip = np.broadcast_to(self.flip, self.u.shape)\n else:\n flip = self.flip\n\n if C is not None:\n c = ma.masked_invalid(C, copy=True).ravel()\n x, y, u, v, c, flip = cbook.delete_masked_points(\n self.x.ravel(), self.y.ravel(), self.u, self.v, c,\n flip.ravel())\n _check_consistent_shapes(x, y, u, v, c, flip)\n else:\n x, y, u, v, flip = cbook.delete_masked_points(\n self.x.ravel(), self.y.ravel(), self.u, self.v, flip.ravel())\n 
_check_consistent_shapes(x, y, u, v, flip)\n\n magnitude = np.hypot(u, v)\n flags, barbs, halves, empty = self._find_tails(\n magnitude, self.rounding, **self.barb_increments)\n\n # Get the vertices for each of the barbs\n\n plot_barbs = self._make_barbs(u, v, flags, barbs, halves, empty,\n self._length, self._pivot, self.sizes,\n self.fill_empty, flip)\n self.set_verts(plot_barbs)\n\n # Set the color array\n if C is not None:\n self.set_array(c)\n\n # Update the offsets in case the masked data changed\n xy = np.column_stack((x, y))\n self._offsets = xy\n self.stale = True\n\n def set_offsets(self, xy):\n """\n Set the offsets for the barb polygons. This saves the offsets passed\n in and masks them as appropriate for the existing U/V data.\n\n Parameters\n ----------\n xy : sequence of pairs of floats\n """\n self.x = xy[:, 0]\n self.y = xy[:, 1]\n x, y, u, v = cbook.delete_masked_points(\n self.x.ravel(), self.y.ravel(), self.u, self.v)\n _check_consistent_shapes(x, y, u, v)\n xy = np.column_stack((x, y))\n super().set_offsets(xy)\n self.stale = True\n | .venv\Lib\site-packages\matplotlib\quiver.py | quiver.py | Python | 48,675 | 0.95 | 0.122964 | 0.111002 | react-lib | 294 | 2024-10-04T04:50:52.325842 | BSD-3-Clause | false | 99f863b767c49a7488f4ffa9635fee65 |
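The `divmod` decomposition in `Barbs._find_tails` above can be reproduced standalone with plain NumPy. The helper below is a sketch of that logic using the default increments (half=5, full=10, flag=50); it is not Matplotlib's API, just the same arithmetic extracted for illustration.

```python
import numpy as np

def find_tails(mag, rounding=True, half=5, full=10, flag=50):
    """Decompose wind magnitudes into flags, full barbs, and a half barb.

    Mirrors the divmod logic of Barbs._find_tails: optionally round to the
    nearest half increment, peel off flags (50), then full barbs (10), and
    mark whatever remains as a half barb (>= 5) or as an empty circle.
    """
    mag = np.asarray(mag, dtype=float)
    if rounding:
        mag = half * np.around(mag / half)
    n_flags, mag = np.divmod(mag, flag)
    n_barbs, mag = np.divmod(mag, full)
    half_flag = mag >= half
    empty_flag = ~(half_flag | (n_flags > 0) | (n_barbs > 0))
    return n_flags.astype(int), n_barbs.astype(int), half_flag, empty_flag

# A 65-unit wind decomposes into 1 flag (50) + 1 full barb (10) + 1 half (5).
flags, barbs, halves, empty = find_tails(np.array([0.0, 7.0, 65.0]))
print(flags, barbs, halves, empty)
```

Note that rounding happens before the decomposition, so 7.0 rounds down to 5.0 and draws as a lone half barb rather than a full one.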
import matplotlib.artist as martist\nimport matplotlib.collections as mcollections\nfrom matplotlib.axes import Axes\nfrom matplotlib.figure import Figure, SubFigure\nfrom matplotlib.text import Text\nfrom matplotlib.transforms import Transform, Bbox\n\n\nimport numpy as np\nfrom numpy.typing import ArrayLike\nfrom collections.abc import Sequence\nfrom typing import Any, Literal, overload\nfrom matplotlib.typing import ColorType\n\nclass QuiverKey(martist.Artist):\n halign: dict[Literal["N", "S", "E", "W"], Literal["left", "center", "right"]]\n valign: dict[Literal["N", "S", "E", "W"], Literal["top", "center", "bottom"]]\n pivot: dict[Literal["N", "S", "E", "W"], Literal["middle", "tip", "tail"]]\n Q: Quiver\n X: float\n Y: float\n U: float\n angle: float\n coord: Literal["axes", "figure", "data", "inches"]\n color: ColorType | None\n label: str\n labelpos: Literal["N", "S", "E", "W"]\n labelcolor: ColorType | None\n fontproperties: dict[str, Any]\n kw: dict[str, Any]\n text: Text\n zorder: float\n def __init__(\n self,\n Q: Quiver,\n X: float,\n Y: float,\n U: float,\n label: str,\n *,\n angle: float = ...,\n coordinates: Literal["axes", "figure", "data", "inches"] = ...,\n color: ColorType | None = ...,\n labelsep: float = ...,\n labelpos: Literal["N", "S", "E", "W"] = ...,\n labelcolor: ColorType | None = ...,\n fontproperties: dict[str, Any] | None = ...,\n zorder: float | None = ...,\n **kwargs\n ) -> None: ...\n @property\n def labelsep(self) -> float: ...\n def set_figure(self, fig: Figure | SubFigure) -> None: ...\n\nclass Quiver(mcollections.PolyCollection):\n X: ArrayLike\n Y: ArrayLike\n XY: ArrayLike\n U: ArrayLike\n V: ArrayLike\n Umask: ArrayLike\n N: int\n scale: float | None\n headwidth: float\n headlength: float\n headaxislength: float\n minshaft: float\n minlength: float\n units: Literal["width", "height", "dots", "inches", "x", "y", "xy"]\n scale_units: Literal["width", "height", "dots", "inches", "x", "y", "xy"] | None\n angles: Literal["uv", 
"xy"] | ArrayLike\n width: float | None\n pivot: Literal["tail", "middle", "tip"]\n transform: Transform\n polykw: dict[str, Any]\n\n @overload\n def __init__(\n self,\n ax: Axes,\n U: ArrayLike,\n V: ArrayLike,\n C: ArrayLike = ...,\n *,\n scale: float | None = ...,\n headwidth: float = ...,\n headlength: float = ...,\n headaxislength: float = ...,\n minshaft: float = ...,\n minlength: float = ...,\n units: Literal["width", "height", "dots", "inches", "x", "y", "xy"] = ...,\n scale_units: Literal["width", "height", "dots", "inches", "x", "y", "xy"]\n | None = ...,\n angles: Literal["uv", "xy"] | ArrayLike = ...,\n width: float | None = ...,\n color: ColorType | Sequence[ColorType] = ...,\n pivot: Literal["tail", "mid", "middle", "tip"] = ...,\n **kwargs\n ) -> None: ...\n @overload\n def __init__(\n self,\n ax: Axes,\n X: ArrayLike,\n Y: ArrayLike,\n U: ArrayLike,\n V: ArrayLike,\n C: ArrayLike = ...,\n *,\n scale: float | None = ...,\n headwidth: float = ...,\n headlength: float = ...,\n headaxislength: float = ...,\n minshaft: float = ...,\n minlength: float = ...,\n units: Literal["width", "height", "dots", "inches", "x", "y", "xy"] = ...,\n scale_units: Literal["width", "height", "dots", "inches", "x", "y", "xy"]\n | None = ...,\n angles: Literal["uv", "xy"] | ArrayLike = ...,\n width: float | None = ...,\n color: ColorType | Sequence[ColorType] = ...,\n pivot: Literal["tail", "mid", "middle", "tip"] = ...,\n **kwargs\n ) -> None: ...\n def get_datalim(self, transData: Transform) -> Bbox: ...\n def set_UVC(\n self, U: ArrayLike, V: ArrayLike, C: ArrayLike | None = ...\n ) -> None: ...\n\nclass Barbs(mcollections.PolyCollection):\n sizes: dict[str, float]\n fill_empty: bool\n barb_increments: dict[str, float]\n rounding: bool\n flip: np.ndarray\n x: ArrayLike\n y: ArrayLike\n u: ArrayLike\n v: ArrayLike\n\n @overload\n def __init__(\n self,\n ax: Axes,\n U: ArrayLike,\n V: ArrayLike,\n C: ArrayLike = ...,\n *,\n pivot: str = ...,\n length: int = ...,\n 
barbcolor: ColorType | Sequence[ColorType] | None = ...,\n flagcolor: ColorType | Sequence[ColorType] | None = ...,\n sizes: dict[str, float] | None = ...,\n fill_empty: bool = ...,\n barb_increments: dict[str, float] | None = ...,\n rounding: bool = ...,\n flip_barb: bool | ArrayLike = ...,\n **kwargs\n ) -> None: ...\n @overload\n def __init__(\n self,\n ax: Axes,\n X: ArrayLike,\n Y: ArrayLike,\n U: ArrayLike,\n V: ArrayLike,\n C: ArrayLike = ...,\n *,\n pivot: str = ...,\n length: int = ...,\n barbcolor: ColorType | Sequence[ColorType] | None = ...,\n flagcolor: ColorType | Sequence[ColorType] | None = ...,\n sizes: dict[str, float] | None = ...,\n fill_empty: bool = ...,\n barb_increments: dict[str, float] | None = ...,\n rounding: bool = ...,\n flip_barb: bool | ArrayLike = ...,\n **kwargs\n ) -> None: ...\n def set_UVC(\n self, U: ArrayLike, V: ArrayLike, C: ArrayLike | None = ...\n ) -> None: ...\n def set_offsets(self, xy: ArrayLike) -> None: ...\n | .venv\Lib\site-packages\matplotlib\quiver.pyi | quiver.pyi | Other | 5,640 | 0.85 | 0.076087 | 0.056497 | react-lib | 586 | 2024-12-02T07:37:57.706226 | MIT | false | bdc03b8e57e02e67cad9908d7b698b94 |
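The stub above declares two `@overload`ed `__init__` signatures for `Quiver` and `Barbs` because the same call site accepts either `(U, V[, C])` or `(X, Y, U, V[, C])`. The sketch below shows that overload pattern on a hypothetical free function (`parse_args` is not a Matplotlib name): the ellipsis-bodied stubs constrain type checkers only, while a single runtime implementation dispatches on argument count.

```python
from typing import overload

@overload
def parse_args(u: list[float], v: list[float]) -> tuple[list[int], list[float]]: ...
@overload
def parse_args(x: list[int], y: list[float],
               u: list[float], v: list[float]) -> tuple[list[int], list[float]]: ...

def parse_args(*args):
    # Runtime implementation; the @overload stubs above are erased at runtime
    # and exist only so type checkers see both accepted call shapes.
    if len(args) == 2:
        u, v = args
        # No positions given: default to one grid point per vector.
        x = list(range(len(u)))
        y = [0.0] * len(u)
    elif len(args) == 4:
        x, y, u, v = args
    else:
        raise TypeError("expected (U, V) or (X, Y, U, V)")
    return x, y

print(parse_args([1.0, 2.0], [3.0, 4.0]))
```

In a `.pyi` stub file only the overloaded signatures appear, exactly as in the `Quiver.__init__` declarations above; the implementation lives in the corresponding `.py` module.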
"""\nThe rcsetup module contains the validation code for customization using\nMatplotlib's rc settings.\n\nEach rc setting is assigned a function used to validate any attempted changes\nto that setting. The validation functions are defined in the rcsetup module,\nand are used to construct the rcParams global object which stores the settings\nand is referenced throughout Matplotlib.\n\nThe default values of the rc settings are set in the default matplotlibrc file.\nAny additions or deletions to the parameter set listed here should also be\npropagated to the :file:`lib/matplotlib/mpl-data/matplotlibrc` in Matplotlib's\nroot source directory.\n"""\n\nimport ast\nfrom functools import lru_cache, reduce\nfrom numbers import Real\nimport operator\nimport os\nimport re\n\nimport numpy as np\n\nfrom matplotlib import _api, cbook\nfrom matplotlib.backends import BackendFilter, backend_registry\nfrom matplotlib.cbook import ls_mapper\nfrom matplotlib.colors import Colormap, is_color_like\nfrom matplotlib._fontconfig_pattern import parse_fontconfig_pattern\nfrom matplotlib._enums import JoinStyle, CapStyle\n\n# Don't let the original cycler collide with our validating cycler\nfrom cycler import Cycler, cycler as ccycler\n\n\n@_api.caching_module_getattr\nclass __getattr__:\n @_api.deprecated(\n "3.9",\n alternative="``matplotlib.backends.backend_registry.list_builtin"\n "(matplotlib.backends.BackendFilter.INTERACTIVE)``")\n @property\n def interactive_bk(self):\n return backend_registry.list_builtin(BackendFilter.INTERACTIVE)\n\n @_api.deprecated(\n "3.9",\n alternative="``matplotlib.backends.backend_registry.list_builtin"\n "(matplotlib.backends.BackendFilter.NON_INTERACTIVE)``")\n @property\n def non_interactive_bk(self):\n return backend_registry.list_builtin(BackendFilter.NON_INTERACTIVE)\n\n @_api.deprecated(\n "3.9",\n alternative="``matplotlib.backends.backend_registry.list_builtin()``")\n @property\n def all_backends(self):\n return 
backend_registry.list_builtin()\n\n\nclass ValidateInStrings:\n def __init__(self, key, valid, ignorecase=False, *,\n _deprecated_since=None):\n """*valid* is a list of legal strings."""\n self.key = key\n self.ignorecase = ignorecase\n self._deprecated_since = _deprecated_since\n\n def func(s):\n if ignorecase:\n return s.lower()\n else:\n return s\n self.valid = {func(k): k for k in valid}\n\n def __call__(self, s):\n if self._deprecated_since:\n name, = (k for k, v in globals().items() if v is self)\n _api.warn_deprecated(\n self._deprecated_since, name=name, obj_type="function")\n if self.ignorecase and isinstance(s, str):\n s = s.lower()\n if s in self.valid:\n return self.valid[s]\n msg = (f"{s!r} is not a valid value for {self.key}; supported values "\n f"are {[*self.valid.values()]}")\n if (isinstance(s, str)\n and (s.startswith('"') and s.endswith('"')\n or s.startswith("'") and s.endswith("'"))\n and s[1:-1] in self.valid):\n msg += "; remove quotes surrounding your string"\n raise ValueError(msg)\n\n\n@lru_cache\ndef _listify_validator(scalar_validator, allow_stringlist=False, *,\n n=None, doc=None):\n def f(s):\n if isinstance(s, str):\n try:\n val = [scalar_validator(v.strip()) for v in s.split(',')\n if v.strip()]\n except Exception:\n if allow_stringlist:\n # Sometimes, a list of colors might be a single string\n # of single-letter colornames. 
So give that a shot.\n val = [scalar_validator(v.strip()) for v in s if v.strip()]\n else:\n raise\n # Allow any ordered sequence type -- generators, np.ndarray, pd.Series\n # -- but not sets, whose iteration order is non-deterministic.\n elif np.iterable(s) and not isinstance(s, (set, frozenset)):\n # The condition on this list comprehension will preserve the\n # behavior of filtering out any empty strings (behavior was\n # from the original validate_stringlist()), while allowing\n # any non-string/text scalar values such as numbers and arrays.\n val = [scalar_validator(v) for v in s\n if not isinstance(v, str) or v]\n else:\n raise ValueError(\n f"Expected str or other non-set iterable, but got {s}")\n if n is not None and len(val) != n:\n raise ValueError(\n f"Expected {n} values, but there are {len(val)} values in {s}")\n return val\n\n try:\n f.__name__ = f"{scalar_validator.__name__}list"\n except AttributeError: # class instance.\n f.__name__ = f"{type(scalar_validator).__name__}List"\n f.__qualname__ = f.__qualname__.rsplit(".", 1)[0] + "." 
+ f.__name__\n f.__doc__ = doc if doc is not None else scalar_validator.__doc__\n return f\n\n\ndef validate_any(s):\n return s\nvalidate_anylist = _listify_validator(validate_any)\n\n\ndef _validate_date(s):\n try:\n np.datetime64(s)\n return s\n except ValueError:\n raise ValueError(\n f'{s!r} should be a string that can be parsed by numpy.datetime64')\n\n\ndef validate_bool(b):\n """Convert b to ``bool`` or raise."""\n if isinstance(b, str):\n b = b.lower()\n if b in ('t', 'y', 'yes', 'on', 'true', '1', 1, True):\n return True\n elif b in ('f', 'n', 'no', 'off', 'false', '0', 0, False):\n return False\n else:\n raise ValueError(f'Cannot convert {b!r} to bool')\n\n\ndef validate_axisbelow(s):\n try:\n return validate_bool(s)\n except ValueError:\n if isinstance(s, str):\n if s == 'line':\n return 'line'\n raise ValueError(f'{s!r} cannot be interpreted as'\n ' True, False, or "line"')\n\n\ndef validate_dpi(s):\n """Confirm s is string 'figure' or convert s to float or raise."""\n if s == 'figure':\n return s\n try:\n return float(s)\n except ValueError as e:\n raise ValueError(f'{s!r} is not string "figure" and '\n f'could not convert {s!r} to float') from e\n\n\ndef _make_type_validator(cls, *, allow_none=False):\n """\n Return a validator that converts inputs to *cls* or raises (and possibly\n allows ``None`` as well).\n """\n\n def validator(s):\n if (allow_none and\n (s is None or cbook._str_lower_equal(s, "none"))):\n return None\n if cls is str and not isinstance(s, str):\n raise ValueError(f'Could not convert {s!r} to str')\n try:\n return cls(s)\n except (TypeError, ValueError) as e:\n raise ValueError(\n f'Could not convert {s!r} to {cls.__name__}') from e\n\n validator.__name__ = f"validate_{cls.__name__}"\n if allow_none:\n validator.__name__ += "_or_None"\n validator.__qualname__ = (\n validator.__qualname__.rsplit(".", 1)[0] + "." 
+ validator.__name__)\n return validator\n\n\nvalidate_string = _make_type_validator(str)\nvalidate_string_or_None = _make_type_validator(str, allow_none=True)\nvalidate_stringlist = _listify_validator(\n validate_string, doc='return a list of strings')\nvalidate_int = _make_type_validator(int)\nvalidate_int_or_None = _make_type_validator(int, allow_none=True)\nvalidate_float = _make_type_validator(float)\nvalidate_float_or_None = _make_type_validator(float, allow_none=True)\nvalidate_floatlist = _listify_validator(\n validate_float, doc='return a list of floats')\n\n\ndef _validate_marker(s):\n try:\n return validate_int(s)\n except ValueError as e:\n try:\n return validate_string(s)\n except ValueError as e:\n raise ValueError('Supported markers are [string, int]') from e\n\n\n_validate_markerlist = _listify_validator(\n _validate_marker, doc='return a list of markers')\n\n\ndef _validate_pathlike(s):\n if isinstance(s, (str, os.PathLike)):\n # Store value as str because savefig.directory needs to distinguish\n # between "" (cwd) and "." 
(cwd, but gets updated by user selections).\n return os.fsdecode(s)\n else:\n return validate_string(s)\n\n\ndef validate_fonttype(s):\n """\n Confirm that this is a Postscript or PDF font type that we know how to\n convert to.\n """\n fonttypes = {'type3': 3,\n 'truetype': 42}\n try:\n fonttype = validate_int(s)\n except ValueError:\n try:\n return fonttypes[s.lower()]\n except KeyError as e:\n raise ValueError('Supported Postscript/PDF font types are %s'\n % list(fonttypes)) from e\n else:\n if fonttype not in fonttypes.values():\n raise ValueError(\n 'Supported Postscript/PDF font types are %s' %\n list(fonttypes.values()))\n return fonttype\n\n\n_auto_backend_sentinel = object()\n\n\ndef validate_backend(s):\n if s is _auto_backend_sentinel or backend_registry.is_valid_backend(s):\n return s\n else:\n msg = (f"'{s}' is not a valid value for backend; supported values are "\n f"{backend_registry.list_all()}")\n raise ValueError(msg)\n\n\ndef _validate_toolbar(s):\n s = ValidateInStrings(\n 'toolbar', ['None', 'toolbar2', 'toolmanager'], ignorecase=True)(s)\n if s == 'toolmanager':\n _api.warn_external(\n "Treat the new Tool classes introduced in v1.5 as experimental "\n "for now; the API and rcParam may change in future versions.")\n return s\n\n\ndef validate_color_or_inherit(s):\n """Return a valid color arg."""\n if cbook._str_equal(s, 'inherit'):\n return s\n return validate_color(s)\n\n\ndef validate_color_or_auto(s):\n if cbook._str_equal(s, 'auto'):\n return s\n return validate_color(s)\n\n\ndef validate_color_for_prop_cycle(s):\n # N-th color cycle syntax can't go into the color cycle.\n if isinstance(s, str) and re.match("^C[0-9]$", s):\n raise ValueError(f"Cannot put cycle reference ({s!r}) in prop_cycler")\n return validate_color(s)\n\n\ndef _validate_color_or_linecolor(s):\n if cbook._str_equal(s, 'linecolor'):\n return s\n elif cbook._str_equal(s, 'mfc') or cbook._str_equal(s, 'markerfacecolor'):\n return 'markerfacecolor'\n elif cbook._str_equal(s, 
'mec') or cbook._str_equal(s, 'markeredgecolor'):\n return 'markeredgecolor'\n elif s is None:\n return None\n elif isinstance(s, str) and len(s) == 6 or len(s) == 8:\n stmp = '#' + s\n if is_color_like(stmp):\n return stmp\n if s.lower() == 'none':\n return None\n elif is_color_like(s):\n return s\n\n raise ValueError(f'{s!r} does not look like a color arg')\n\n\ndef validate_color(s):\n """Return a valid color arg."""\n if isinstance(s, str):\n if s.lower() == 'none':\n return 'none'\n if len(s) == 6 or len(s) == 8:\n stmp = '#' + s\n if is_color_like(stmp):\n return stmp\n\n if is_color_like(s):\n return s\n\n # If it is still valid, it must be a tuple (as a string from matplotlibrc).\n try:\n color = ast.literal_eval(s)\n except (SyntaxError, ValueError):\n pass\n else:\n if is_color_like(color):\n return color\n\n raise ValueError(f'{s!r} does not look like a color arg')\n\n\nvalidate_colorlist = _listify_validator(\n validate_color, allow_stringlist=True, doc='return a list of colorspecs')\n\n\ndef _validate_cmap(s):\n _api.check_isinstance((str, Colormap), cmap=s)\n return s\n\n\ndef validate_aspect(s):\n if s in ('auto', 'equal'):\n return s\n try:\n return float(s)\n except ValueError as e:\n raise ValueError('not a valid aspect specification') from e\n\n\ndef validate_fontsize_None(s):\n if s is None or s == 'None':\n return None\n else:\n return validate_fontsize(s)\n\n\ndef validate_fontsize(s):\n fontsizes = ['xx-small', 'x-small', 'small', 'medium', 'large',\n 'x-large', 'xx-large', 'smaller', 'larger']\n if isinstance(s, str):\n s = s.lower()\n if s in fontsizes:\n return s\n try:\n return float(s)\n except ValueError as e:\n raise ValueError("%s is not a valid font size. Valid font sizes "\n "are %s." 
% (s, ", ".join(fontsizes))) from e\n\n\nvalidate_fontsizelist = _listify_validator(validate_fontsize)\n\n\ndef validate_fontweight(s):\n weights = [\n 'ultralight', 'light', 'normal', 'regular', 'book', 'medium', 'roman',\n 'semibold', 'demibold', 'demi', 'bold', 'heavy', 'extra bold', 'black']\n # Note: Historically, weights have been case-sensitive in Matplotlib\n if s in weights:\n return s\n try:\n return int(s)\n except (ValueError, TypeError) as e:\n raise ValueError(f'{s} is not a valid font weight.') from e\n\n\ndef validate_fontstretch(s):\n stretchvalues = [\n 'ultra-condensed', 'extra-condensed', 'condensed', 'semi-condensed',\n 'normal', 'semi-expanded', 'expanded', 'extra-expanded',\n 'ultra-expanded']\n # Note: Historically, stretchvalues have been case-sensitive in Matplotlib\n if s in stretchvalues:\n return s\n try:\n return int(s)\n except (ValueError, TypeError) as e:\n raise ValueError(f'{s} is not a valid font stretch.') from e\n\n\ndef validate_font_properties(s):\n parse_fontconfig_pattern(s)\n return s\n\n\ndef _validate_mathtext_fallback(s):\n _fallback_fonts = ['cm', 'stix', 'stixsans']\n if isinstance(s, str):\n s = s.lower()\n if s is None or s == 'none':\n return None\n elif s.lower() in _fallback_fonts:\n return s\n else:\n raise ValueError(\n f"{s} is not a valid fallback font name. Valid fallback font "\n f"names are {','.join(_fallback_fonts)}. 
Passing 'None' will turn "\n "fallback off.")\n\n\ndef validate_whiskers(s):\n try:\n return _listify_validator(validate_float, n=2)(s)\n except (TypeError, ValueError):\n try:\n return float(s)\n except ValueError as e:\n raise ValueError("Not a valid whisker value [float, "\n "(float, float)]") from e\n\n\ndef validate_ps_distiller(s):\n if isinstance(s, str):\n s = s.lower()\n if s in ('none', None, 'false', False):\n return None\n else:\n return ValidateInStrings('ps.usedistiller', ['ghostscript', 'xpdf'])(s)\n\n\n# A validator dedicated to the named line styles, based on the items in\n# ls_mapper, and a list of possible strings read from Line2D.set_linestyle\n_validate_named_linestyle = ValidateInStrings(\n 'linestyle',\n [*ls_mapper.keys(), *ls_mapper.values(), 'None', 'none', ' ', ''],\n ignorecase=True)\n\n\ndef _validate_linestyle(ls):\n """\n A validator for all possible line styles, the named ones *and*\n the on-off ink sequences.\n """\n if isinstance(ls, str):\n try: # Look first for a valid named line style, like '--' or 'solid'.\n return _validate_named_linestyle(ls)\n except ValueError:\n pass\n try:\n ls = ast.literal_eval(ls) # Parsing matplotlibrc.\n except (SyntaxError, ValueError):\n pass # Will error with the ValueError at the end.\n\n def _is_iterable_not_string_like(x):\n # Explicitly exclude bytes/bytearrays so that they are not\n # nonsensically interpreted as sequences of numbers (codepoints).\n return np.iterable(x) and not isinstance(x, (str, bytes, bytearray))\n\n if _is_iterable_not_string_like(ls):\n if len(ls) == 2 and _is_iterable_not_string_like(ls[1]):\n # (offset, (on, off, on, off, ...))\n offset, onoff = ls\n else:\n # For backcompat: (on, off, on, off, ...); the offset is implicit.\n offset = 0\n onoff = ls\n\n if (isinstance(offset, Real)\n and len(onoff) % 2 == 0\n and all(isinstance(elem, Real) for elem in onoff)):\n return (offset, onoff)\n\n raise ValueError(f"linestyle {ls!r} is not a valid on-off ink 
sequence.")\n\n\nvalidate_fillstyle = ValidateInStrings(\n 'markers.fillstyle', ['full', 'left', 'right', 'bottom', 'top', 'none'])\n\n\nvalidate_fillstylelist = _listify_validator(validate_fillstyle)\n\n\ndef validate_markevery(s):\n """\n Validate the markevery property of a Line2D object.\n\n Parameters\n ----------\n s : None, int, (int, int), slice, float, (float, float), or list[int]\n\n Returns\n -------\n None, int, (int, int), slice, float, (float, float), or list[int]\n """\n # Validate s against type slice float int and None\n if isinstance(s, (slice, float, int, type(None))):\n return s\n # Validate s against type tuple\n if isinstance(s, tuple):\n if (len(s) == 2\n and (all(isinstance(e, int) for e in s)\n or all(isinstance(e, float) for e in s))):\n return s\n else:\n raise TypeError(\n "'markevery' tuple must be pair of ints or of floats")\n # Validate s against type list\n if isinstance(s, list):\n if all(isinstance(e, int) for e in s):\n return s\n else:\n raise TypeError(\n "'markevery' list must have all elements of type int")\n raise TypeError("'markevery' is of an invalid type")\n\n\nvalidate_markeverylist = _listify_validator(validate_markevery)\n\n\ndef validate_bbox(s):\n if isinstance(s, str):\n s = s.lower()\n if s == 'tight':\n return s\n if s == 'standard':\n return None\n raise ValueError("bbox should be 'tight' or 'standard'")\n elif s is not None:\n # Backwards compatibility. 
None is equivalent to 'standard'.\n raise ValueError("bbox should be 'tight' or 'standard'")\n return s\n\n\ndef validate_sketch(s):\n\n if isinstance(s, str):\n s = s.lower().strip()\n if s.startswith("(") and s.endswith(")"):\n s = s[1:-1]\n if s == 'none' or s is None:\n return None\n try:\n return tuple(_listify_validator(validate_float, n=3)(s))\n except ValueError as exc:\n raise ValueError("Expected a (scale, length, randomness) tuple") from exc\n\n\ndef _validate_greaterthan_minushalf(s):\n s = validate_float(s)\n if s > -0.5:\n return s\n else:\n raise RuntimeError(f'Value must be >-0.5; got {s}')\n\n\ndef _validate_greaterequal0_lessequal1(s):\n s = validate_float(s)\n if 0 <= s <= 1:\n return s\n else:\n raise RuntimeError(f'Value must be >=0 and <=1; got {s}')\n\n\ndef _validate_int_greaterequal0(s):\n s = validate_int(s)\n if s >= 0:\n return s\n else:\n raise RuntimeError(f'Value must be >=0; got {s}')\n\n\ndef validate_hatch(s):\n r"""\n Validate a hatch pattern.\n A hatch pattern string can have any sequence of the following\n characters: ``\ / | - + * . 
x o O``.\n """\n if not isinstance(s, str):\n raise ValueError("Hatch pattern must be a string")\n _api.check_isinstance(str, hatch_pattern=s)\n unknown = set(s) - {'\\', '/', '|', '-', '+', '*', '.', 'x', 'o', 'O'}\n if unknown:\n raise ValueError("Unknown hatch symbol(s): %s" % list(unknown))\n return s\n\n\nvalidate_hatchlist = _listify_validator(validate_hatch)\nvalidate_dashlist = _listify_validator(validate_floatlist)\n\n\ndef _validate_minor_tick_ndivs(n):\n """\n Validate ndiv parameter related to the minor ticks.\n It controls the number of minor ticks to be placed between\n two major ticks.\n """\n\n if cbook._str_lower_equal(n, 'auto'):\n return n\n try:\n n = _validate_int_greaterequal0(n)\n return n\n except (RuntimeError, ValueError):\n pass\n\n raise ValueError("'tick.minor.ndivs' must be 'auto' or non-negative int")\n\n\n_prop_validators = {\n 'color': _listify_validator(validate_color_for_prop_cycle,\n allow_stringlist=True),\n 'linewidth': validate_floatlist,\n 'linestyle': _listify_validator(_validate_linestyle),\n 'facecolor': validate_colorlist,\n 'edgecolor': validate_colorlist,\n 'joinstyle': _listify_validator(JoinStyle),\n 'capstyle': _listify_validator(CapStyle),\n 'fillstyle': validate_fillstylelist,\n 'markerfacecolor': validate_colorlist,\n 'markersize': validate_floatlist,\n 'markeredgewidth': validate_floatlist,\n 'markeredgecolor': validate_colorlist,\n 'markevery': validate_markeverylist,\n 'alpha': validate_floatlist,\n 'marker': _validate_markerlist,\n 'hatch': validate_hatchlist,\n 'dashes': validate_dashlist,\n }\n_prop_aliases = {\n 'c': 'color',\n 'lw': 'linewidth',\n 'ls': 'linestyle',\n 'fc': 'facecolor',\n 'ec': 'edgecolor',\n 'mfc': 'markerfacecolor',\n 'mec': 'markeredgecolor',\n 'mew': 'markeredgewidth',\n 'ms': 'markersize',\n }\n\n\ndef cycler(*args, **kwargs):\n """\n Create a `~cycler.Cycler` object much like :func:`cycler.cycler`,\n but includes input validation.\n\n Call signatures::\n\n cycler(cycler)\n 
cycler(label=values, label2=values2, ...)\n cycler(label, values)\n\n Form 1 copies a given `~cycler.Cycler` object.\n\n Form 2 creates a `~cycler.Cycler` which cycles over one or more\n properties simultaneously. If multiple properties are given, their\n value lists must have the same length.\n\n Form 3 creates a `~cycler.Cycler` for a single property. This form\n exists for compatibility with the original cycler. Its use is\n discouraged in favor of the kwarg form, i.e. ``cycler(label=values)``.\n\n Parameters\n ----------\n cycler : Cycler\n Copy constructor for Cycler.\n\n label : str\n The property key. Must be a valid `.Artist` property.\n For example, 'color' or 'linestyle'. Aliases are allowed,\n such as 'c' for 'color' and 'lw' for 'linewidth'.\n\n values : iterable\n Finite-length iterable of the property values. These values\n are validated and will raise a ValueError if invalid.\n\n Returns\n -------\n Cycler\n A new :class:`~cycler.Cycler` for the given properties.\n\n Examples\n --------\n Creating a cycler for a single property:\n\n >>> c = cycler(color=['red', 'green', 'blue'])\n\n Creating a cycler for simultaneously cycling over multiple properties\n (e.g. red circle, green plus, blue cross):\n\n >>> c = cycler(color=['red', 'green', 'blue'],\n ... 
marker=['o', '+', 'x'])\n\n """\n if args and kwargs:\n raise TypeError("cycler() can only accept positional OR keyword "\n "arguments -- not both.")\n elif not args and not kwargs:\n raise TypeError("cycler() must have positional OR keyword arguments")\n\n if len(args) == 1:\n if not isinstance(args[0], Cycler):\n raise TypeError("If only one positional argument given, it must "\n "be a Cycler instance.")\n return validate_cycler(args[0])\n elif len(args) == 2:\n pairs = [(args[0], args[1])]\n elif len(args) > 2:\n raise _api.nargs_error('cycler', '0-2', len(args))\n else:\n pairs = kwargs.items()\n\n validated = []\n for prop, vals in pairs:\n norm_prop = _prop_aliases.get(prop, prop)\n validator = _prop_validators.get(norm_prop, None)\n if validator is None:\n raise TypeError("Unknown artist property: %s" % prop)\n vals = validator(vals)\n # We will normalize the property names as well to reduce\n # the amount of alias handling code elsewhere.\n validated.append((norm_prop, vals))\n\n return reduce(operator.add, (ccycler(k, v) for k, v in validated))\n\n\nclass _DunderChecker(ast.NodeVisitor):\n def visit_Attribute(self, node):\n if node.attr.startswith("__") and node.attr.endswith("__"):\n raise ValueError("cycler strings with dunders are forbidden")\n self.generic_visit(node)\n\n\n# A validator dedicated to the named legend loc\n_validate_named_legend_loc = ValidateInStrings(\n 'legend.loc',\n [\n "best",\n "upper right", "upper left", "lower left", "lower right", "right",\n "center left", "center right", "lower center", "upper center",\n "center"],\n ignorecase=True)\n\n\ndef _validate_legend_loc(loc):\n """\n Confirm that loc is a type which rc.Params["legend.loc"] supports.\n\n .. 
versionadded:: 3.8\n\n Parameters\n ----------\n loc : str | int | (float, float) | str((float, float))\n The location of the legend.\n\n Returns\n -------\n loc : str | int | (float, float) or raise ValueError exception\n The location of the legend.\n """\n if isinstance(loc, str):\n try:\n return _validate_named_legend_loc(loc)\n except ValueError:\n pass\n try:\n loc = ast.literal_eval(loc)\n except (SyntaxError, ValueError):\n pass\n if isinstance(loc, int):\n if 0 <= loc <= 10:\n return loc\n if isinstance(loc, tuple):\n if len(loc) == 2 and all(isinstance(e, Real) for e in loc):\n return loc\n raise ValueError(f"{loc} is not a valid legend location.")\n\n\ndef validate_cycler(s):\n """Return a Cycler object from a string repr or the object itself."""\n if isinstance(s, str):\n # TODO: We might want to rethink this...\n # While I think I have it quite locked down, it is execution of\n # arbitrary code without sanitation.\n # Combine this with the possibility that rcparams might come from the\n # internet (future plans), this could be downright dangerous.\n # I locked it down by only having the 'cycler()' function available.\n # UPDATE: Partly plugging a security hole.\n # I really should have read this:\n # https://nedbatchelder.com/blog/201206/eval_really_is_dangerous.html\n # We should replace this eval with a combo of PyParsing and\n # ast.literal_eval()\n try:\n _DunderChecker().visit(ast.parse(s))\n s = eval(s, {'cycler': cycler, '__builtins__': {}})\n except BaseException as e:\n raise ValueError(f"{s!r} is not a valid cycler construction: {e}"\n ) from e\n # Should make sure what comes from the above eval()\n # is a Cycler object.\n if isinstance(s, Cycler):\n cycler_inst = s\n else:\n raise ValueError(f"Object is not a string or Cycler instance: {s!r}")\n\n unknowns = cycler_inst.keys - (set(_prop_validators) | set(_prop_aliases))\n if unknowns:\n raise ValueError("Unknown artist properties: %s" % unknowns)\n\n # Not a full validation, but it'll at 
least normalize property names\n # A fuller validation would require v0.10 of cycler.\n checker = set()\n for prop in cycler_inst.keys:\n norm_prop = _prop_aliases.get(prop, prop)\n if norm_prop != prop and norm_prop in cycler_inst.keys:\n raise ValueError(f"Cannot specify both {norm_prop!r} and alias "\n f"{prop!r} in the same prop_cycle")\n if norm_prop in checker:\n raise ValueError(f"Another property was already aliased to "\n f"{norm_prop!r}. Collision normalizing {prop!r}.")\n checker.update([norm_prop])\n\n # This is just an extra-careful check, just in case there is some\n # edge-case I haven't thought of.\n assert len(checker) == len(cycler_inst.keys)\n\n # Now, it should be safe to mutate this cycler\n for prop in cycler_inst.keys:\n norm_prop = _prop_aliases.get(prop, prop)\n cycler_inst.change_key(prop, norm_prop)\n\n for key, vals in cycler_inst.by_key().items():\n _prop_validators[key](vals)\n\n return cycler_inst\n\n\ndef validate_hist_bins(s):\n valid_strs = ["auto", "sturges", "fd", "doane", "scott", "rice", "sqrt"]\n if isinstance(s, str) and s in valid_strs:\n return s\n try:\n return int(s)\n except (TypeError, ValueError):\n pass\n try:\n return validate_floatlist(s)\n except ValueError:\n pass\n raise ValueError(f"'hist.bins' must be one of {valid_strs}, an int or"\n " a sequence of floats")\n\n\nclass _ignorecase(list):\n """A marker class indicating that a list-of-str is case-insensitive."""\n\n\ndef _convert_validator_spec(key, conv):\n if isinstance(conv, list):\n ignorecase = isinstance(conv, _ignorecase)\n return ValidateInStrings(key, conv, ignorecase=ignorecase)\n else:\n return conv\n\n\n# Mapping of rcParams to validators.\n# Converters given as lists or _ignorecase are converted to ValidateInStrings\n# immediately below.\n# The rcParams defaults are defined in lib/matplotlib/mpl-data/matplotlibrc, which\n# gets copied to matplotlib/mpl-data/matplotlibrc by the setup script.\n_validators = {\n "backend": validate_backend,\n 
"backend_fallback": validate_bool,\n "figure.hooks": validate_stringlist,\n "toolbar": _validate_toolbar,\n "interactive": validate_bool,\n "timezone": validate_string,\n\n "webagg.port": validate_int,\n "webagg.address": validate_string,\n "webagg.open_in_browser": validate_bool,\n "webagg.port_retries": validate_int,\n\n # line props\n "lines.linewidth": validate_float, # line width in points\n "lines.linestyle": _validate_linestyle, # solid line\n "lines.color": validate_color, # first color in color cycle\n "lines.marker": _validate_marker, # marker name\n "lines.markerfacecolor": validate_color_or_auto, # default color\n "lines.markeredgecolor": validate_color_or_auto, # default color\n "lines.markeredgewidth": validate_float,\n "lines.markersize": validate_float, # markersize, in points\n "lines.antialiased": validate_bool, # antialiased (no jaggies)\n "lines.dash_joinstyle": JoinStyle,\n "lines.solid_joinstyle": JoinStyle,\n "lines.dash_capstyle": CapStyle,\n "lines.solid_capstyle": CapStyle,\n "lines.dashed_pattern": validate_floatlist,\n "lines.dashdot_pattern": validate_floatlist,\n "lines.dotted_pattern": validate_floatlist,\n "lines.scale_dashes": validate_bool,\n\n # marker props\n "markers.fillstyle": validate_fillstyle,\n\n ## pcolor(mesh) props:\n "pcolor.shading": ["auto", "flat", "nearest", "gouraud"],\n "pcolormesh.snap": validate_bool,\n\n ## patch props\n "patch.linewidth": validate_float, # line width in points\n "patch.edgecolor": validate_color,\n "patch.force_edgecolor": validate_bool,\n "patch.facecolor": validate_color, # first color in cycle\n "patch.antialiased": validate_bool, # antialiased (no jaggies)\n\n ## hatch props\n "hatch.color": validate_color,\n "hatch.linewidth": validate_float,\n\n ## Histogram properties\n "hist.bins": validate_hist_bins,\n\n ## Boxplot properties\n "boxplot.notch": validate_bool,\n "boxplot.vertical": validate_bool,\n "boxplot.whiskers": validate_whiskers,\n "boxplot.bootstrap": validate_int_or_None,\n 
"boxplot.patchartist": validate_bool,\n "boxplot.showmeans": validate_bool,\n "boxplot.showcaps": validate_bool,\n "boxplot.showbox": validate_bool,\n "boxplot.showfliers": validate_bool,\n "boxplot.meanline": validate_bool,\n\n "boxplot.flierprops.color": validate_color,\n "boxplot.flierprops.marker": _validate_marker,\n "boxplot.flierprops.markerfacecolor": validate_color_or_auto,\n "boxplot.flierprops.markeredgecolor": validate_color,\n "boxplot.flierprops.markeredgewidth": validate_float,\n "boxplot.flierprops.markersize": validate_float,\n "boxplot.flierprops.linestyle": _validate_linestyle,\n "boxplot.flierprops.linewidth": validate_float,\n\n "boxplot.boxprops.color": validate_color,\n "boxplot.boxprops.linewidth": validate_float,\n "boxplot.boxprops.linestyle": _validate_linestyle,\n\n "boxplot.whiskerprops.color": validate_color,\n "boxplot.whiskerprops.linewidth": validate_float,\n "boxplot.whiskerprops.linestyle": _validate_linestyle,\n\n "boxplot.capprops.color": validate_color,\n "boxplot.capprops.linewidth": validate_float,\n "boxplot.capprops.linestyle": _validate_linestyle,\n\n "boxplot.medianprops.color": validate_color,\n "boxplot.medianprops.linewidth": validate_float,\n "boxplot.medianprops.linestyle": _validate_linestyle,\n\n "boxplot.meanprops.color": validate_color,\n "boxplot.meanprops.marker": _validate_marker,\n "boxplot.meanprops.markerfacecolor": validate_color,\n "boxplot.meanprops.markeredgecolor": validate_color,\n "boxplot.meanprops.markersize": validate_float,\n "boxplot.meanprops.linestyle": _validate_linestyle,\n "boxplot.meanprops.linewidth": validate_float,\n\n ## font props\n "font.family": validate_stringlist, # used by text object\n "font.style": validate_string,\n "font.variant": validate_string,\n "font.stretch": validate_fontstretch,\n "font.weight": validate_fontweight,\n "font.size": validate_float, # Base font size in points\n "font.serif": validate_stringlist,\n "font.sans-serif": validate_stringlist,\n "font.cursive": 
validate_stringlist,\n "font.fantasy": validate_stringlist,\n "font.monospace": validate_stringlist,\n\n # text props\n "text.color": validate_color,\n "text.usetex": validate_bool,\n "text.latex.preamble": validate_string,\n "text.hinting": ["default", "no_autohint", "force_autohint",\n "no_hinting", "auto", "native", "either", "none"],\n "text.hinting_factor": validate_int,\n "text.kerning_factor": validate_int,\n "text.antialiased": validate_bool,\n "text.parse_math": validate_bool,\n\n "mathtext.cal": validate_font_properties,\n "mathtext.rm": validate_font_properties,\n "mathtext.tt": validate_font_properties,\n "mathtext.it": validate_font_properties,\n "mathtext.bf": validate_font_properties,\n "mathtext.bfit": validate_font_properties,\n "mathtext.sf": validate_font_properties,\n "mathtext.fontset": ["dejavusans", "dejavuserif", "cm", "stix",\n "stixsans", "custom"],\n "mathtext.default": ["rm", "cal", "bfit", "it", "tt", "sf", "bf", "default",\n "bb", "frak", "scr", "regular"],\n "mathtext.fallback": _validate_mathtext_fallback,\n\n "image.aspect": validate_aspect, # equal, auto, a number\n "image.interpolation": validate_string,\n "image.interpolation_stage": ["auto", "data", "rgba"],\n "image.cmap": _validate_cmap, # gray, jet, etc.\n "image.lut": validate_int, # lookup table\n "image.origin": ["upper", "lower"],\n "image.resample": validate_bool,\n # Specify whether vector graphics backends will combine all images on a\n # set of Axes into a single composite image\n "image.composite_image": validate_bool,\n\n # contour props\n "contour.negative_linestyle": _validate_linestyle,\n "contour.corner_mask": validate_bool,\n "contour.linewidth": validate_float_or_None,\n "contour.algorithm": ["mpl2005", "mpl2014", "serial", "threaded"],\n\n # errorbar props\n "errorbar.capsize": validate_float,\n\n # axis props\n # alignment of x/y axis title\n "xaxis.labellocation": ["left", "center", "right"],\n "yaxis.labellocation": ["bottom", "center", "top"],\n\n # Axes 
props\n "axes.axisbelow": validate_axisbelow,\n "axes.facecolor": validate_color, # background color\n "axes.edgecolor": validate_color, # edge color\n "axes.linewidth": validate_float, # edge linewidth\n\n "axes.spines.left": validate_bool, # Set visibility of axes spines,\n "axes.spines.right": validate_bool, # i.e., the lines around the chart\n "axes.spines.bottom": validate_bool, # denoting data boundary.\n "axes.spines.top": validate_bool,\n\n "axes.titlesize": validate_fontsize, # Axes title fontsize\n "axes.titlelocation": ["left", "center", "right"], # Axes title alignment\n "axes.titleweight": validate_fontweight, # Axes title font weight\n "axes.titlecolor": validate_color_or_auto, # Axes title font color\n # title location, axes units, None means auto\n "axes.titley": validate_float_or_None,\n # pad from Axes top decoration to title in points\n "axes.titlepad": validate_float,\n "axes.grid": validate_bool, # display grid or not\n "axes.grid.which": ["minor", "both", "major"], # which grids are drawn\n "axes.grid.axis": ["x", "y", "both"], # grid type\n "axes.labelsize": validate_fontsize, # fontsize of x & y labels\n "axes.labelpad": validate_float, # space between label and axis\n "axes.labelweight": validate_fontweight, # fontsize of x & y labels\n "axes.labelcolor": validate_color, # color of axis label\n # use scientific notation if log10 of the axis range is smaller than the\n # first or larger than the second\n "axes.formatter.limits": _listify_validator(validate_int, n=2),\n # use current locale to format ticks\n "axes.formatter.use_locale": validate_bool,\n "axes.formatter.use_mathtext": validate_bool,\n # minimum exponent to format in scientific notation\n "axes.formatter.min_exponent": validate_int,\n "axes.formatter.useoffset": validate_bool,\n "axes.formatter.offset_threshold": validate_int,\n "axes.unicode_minus": validate_bool,\n # This entry can be either a cycler object or a string repr of a\n # cycler-object, which gets eval()'ed to 
create the object.\n "axes.prop_cycle": validate_cycler,\n # If "data", axes limits are set close to the data.\n # If "round_numbers" axes limits are set to the nearest round numbers.\n "axes.autolimit_mode": ["data", "round_numbers"],\n "axes.xmargin": _validate_greaterthan_minushalf, # margin added to xaxis\n "axes.ymargin": _validate_greaterthan_minushalf, # margin added to yaxis\n "axes.zmargin": _validate_greaterthan_minushalf, # margin added to zaxis\n\n "polaraxes.grid": validate_bool, # display polar grid or not\n "axes3d.grid": validate_bool, # display 3d grid\n "axes3d.automargin": validate_bool, # automatically add margin when\n # manually setting 3D axis limits\n\n "axes3d.xaxis.panecolor": validate_color, # 3d background pane\n "axes3d.yaxis.panecolor": validate_color, # 3d background pane\n "axes3d.zaxis.panecolor": validate_color, # 3d background pane\n\n "axes3d.mouserotationstyle": ["azel", "trackball", "sphere", "arcball"],\n "axes3d.trackballsize": validate_float,\n "axes3d.trackballborder": validate_float,\n\n # scatter props\n "scatter.marker": _validate_marker,\n "scatter.edgecolors": validate_string,\n\n "date.epoch": _validate_date,\n "date.autoformatter.year": validate_string,\n "date.autoformatter.month": validate_string,\n "date.autoformatter.day": validate_string,\n "date.autoformatter.hour": validate_string,\n "date.autoformatter.minute": validate_string,\n "date.autoformatter.second": validate_string,\n "date.autoformatter.microsecond": validate_string,\n\n 'date.converter': ['auto', 'concise'],\n # for auto date locator, choose interval_multiples\n 'date.interval_multiples': validate_bool,\n\n # legend properties\n "legend.fancybox": validate_bool,\n "legend.loc": _validate_legend_loc,\n\n # the number of points in the legend line\n "legend.numpoints": validate_int,\n # the number of points in the legend line for scatter\n "legend.scatterpoints": validate_int,\n "legend.fontsize": validate_fontsize,\n "legend.title_fontsize": 
validate_fontsize_None,\n # color of the legend\n "legend.labelcolor": _validate_color_or_linecolor,\n # the relative size of legend markers vs. original\n "legend.markerscale": validate_float,\n # using dict in rcParams not yet supported, so make sure it is bool\n "legend.shadow": validate_bool,\n # whether or not to draw a frame around legend\n "legend.frameon": validate_bool,\n # alpha value of the legend frame\n "legend.framealpha": validate_float_or_None,\n\n ## the following dimensions are in fraction of the font size\n "legend.borderpad": validate_float, # units are fontsize\n # the vertical space between the legend entries\n "legend.labelspacing": validate_float,\n # the length of the legend lines\n "legend.handlelength": validate_float,\n # the length of the legend lines\n "legend.handleheight": validate_float,\n # the space between the legend line and legend text\n "legend.handletextpad": validate_float,\n # the border between the Axes and legend edge\n "legend.borderaxespad": validate_float,\n # the border between the Axes and legend edge\n "legend.columnspacing": validate_float,\n "legend.facecolor": validate_color_or_inherit,\n "legend.edgecolor": validate_color_or_inherit,\n\n # tick properties\n "xtick.top": validate_bool, # draw ticks on top side\n "xtick.bottom": validate_bool, # draw ticks on bottom side\n "xtick.labeltop": validate_bool, # draw label on top\n "xtick.labelbottom": validate_bool, # draw label on bottom\n "xtick.major.size": validate_float, # major xtick size in points\n "xtick.minor.size": validate_float, # minor xtick size in points\n "xtick.major.width": validate_float, # major xtick width in points\n "xtick.minor.width": validate_float, # minor xtick width in points\n "xtick.major.pad": validate_float, # distance to label in points\n "xtick.minor.pad": validate_float, # distance to label in points\n "xtick.color": validate_color, # color of xticks\n "xtick.labelcolor": validate_color_or_inherit, # color of xtick labels\n 
"xtick.minor.visible": validate_bool, # visibility of minor xticks\n "xtick.minor.top": validate_bool, # draw top minor xticks\n "xtick.minor.bottom": validate_bool, # draw bottom minor xticks\n "xtick.major.top": validate_bool, # draw top major xticks\n "xtick.major.bottom": validate_bool, # draw bottom major xticks\n # number of minor xticks\n "xtick.minor.ndivs": _validate_minor_tick_ndivs,\n "xtick.labelsize": validate_fontsize, # fontsize of xtick labels\n "xtick.direction": ["out", "in", "inout"], # direction of xticks\n "xtick.alignment": ["center", "right", "left"],\n\n "ytick.left": validate_bool, # draw ticks on left side\n "ytick.right": validate_bool, # draw ticks on right side\n "ytick.labelleft": validate_bool, # draw tick labels on left side\n "ytick.labelright": validate_bool, # draw tick labels on right side\n "ytick.major.size": validate_float, # major ytick size in points\n "ytick.minor.size": validate_float, # minor ytick size in points\n "ytick.major.width": validate_float, # major ytick width in points\n "ytick.minor.width": validate_float, # minor ytick width in points\n "ytick.major.pad": validate_float, # distance to label in points\n "ytick.minor.pad": validate_float, # distance to label in points\n "ytick.color": validate_color, # color of yticks\n "ytick.labelcolor": validate_color_or_inherit, # color of ytick labels\n "ytick.minor.visible": validate_bool, # visibility of minor yticks\n "ytick.minor.left": validate_bool, # draw left minor yticks\n "ytick.minor.right": validate_bool, # draw right minor yticks\n "ytick.major.left": validate_bool, # draw left major yticks\n "ytick.major.right": validate_bool, # draw right major yticks\n # number of minor yticks\n "ytick.minor.ndivs": _validate_minor_tick_ndivs,\n "ytick.labelsize": validate_fontsize, # fontsize of ytick labels\n "ytick.direction": ["out", "in", "inout"], # direction of yticks\n "ytick.alignment": [\n "center", "top", "bottom", "baseline", "center_baseline"],\n\n 
"grid.color": validate_color, # grid color\n "grid.linestyle": _validate_linestyle, # solid\n "grid.linewidth": validate_float, # in points\n "grid.alpha": validate_float,\n\n ## figure props\n # figure title\n "figure.titlesize": validate_fontsize,\n "figure.titleweight": validate_fontweight,\n\n # figure labels\n "figure.labelsize": validate_fontsize,\n "figure.labelweight": validate_fontweight,\n\n # figure size in inches: width by height\n "figure.figsize": _listify_validator(validate_float, n=2),\n "figure.dpi": validate_float,\n "figure.facecolor": validate_color,\n "figure.edgecolor": validate_color,\n "figure.frameon": validate_bool,\n "figure.autolayout": validate_bool,\n "figure.max_open_warning": validate_int,\n "figure.raise_window": validate_bool,\n "macosx.window_mode": ["system", "tab", "window"],\n\n "figure.subplot.left": validate_float,\n "figure.subplot.right": validate_float,\n "figure.subplot.bottom": validate_float,\n "figure.subplot.top": validate_float,\n "figure.subplot.wspace": validate_float,\n "figure.subplot.hspace": validate_float,\n\n "figure.constrained_layout.use": validate_bool, # run constrained_layout?\n # wspace and hspace are fraction of adjacent subplots to use for space.\n # Much smaller than above because we don't need room for the text.\n "figure.constrained_layout.hspace": validate_float,\n "figure.constrained_layout.wspace": validate_float,\n # buffer around the Axes, in inches.\n "figure.constrained_layout.h_pad": validate_float,\n "figure.constrained_layout.w_pad": validate_float,\n\n ## Saving figure's properties\n 'savefig.dpi': validate_dpi,\n 'savefig.facecolor': validate_color_or_auto,\n 'savefig.edgecolor': validate_color_or_auto,\n 'savefig.orientation': ['landscape', 'portrait'],\n "savefig.format": validate_string,\n "savefig.bbox": validate_bbox, # "tight", or "standard" (= None)\n "savefig.pad_inches": validate_float,\n # default directory in savefig dialog box\n "savefig.directory": _validate_pathlike,\n 
"savefig.transparent": validate_bool,\n\n "tk.window_focus": validate_bool, # Maintain shell focus for TkAgg\n\n # Set the papersize/type\n "ps.papersize": _ignorecase(\n ["figure", "letter", "legal", "ledger",\n *[f"{ab}{i}" for ab in "ab" for i in range(11)]]),\n "ps.useafm": validate_bool,\n # use ghostscript or xpdf to distill ps output\n "ps.usedistiller": validate_ps_distiller,\n "ps.distiller.res": validate_int, # dpi\n "ps.fonttype": validate_fonttype, # 3 (Type3) or 42 (Truetype)\n "pdf.compression": validate_int, # 0-9 compression level; 0 to disable\n "pdf.inheritcolor": validate_bool, # skip color setting commands\n # use only the 14 PDF core fonts embedded in every PDF viewing application\n "pdf.use14corefonts": validate_bool,\n "pdf.fonttype": validate_fonttype, # 3 (Type3) or 42 (Truetype)\n\n "pgf.texsystem": ["xelatex", "lualatex", "pdflatex"], # latex variant used\n "pgf.rcfonts": validate_bool, # use mpl's rc settings for font config\n "pgf.preamble": validate_string, # custom LaTeX preamble\n\n # write raster image data into the svg file\n "svg.image_inline": validate_bool,\n "svg.fonttype": ["none", "path"], # save text as text ("none") or "paths"\n "svg.hashsalt": validate_string_or_None,\n "svg.id": validate_string_or_None,\n\n # set this when you want to generate hardcopy docstring\n "docstring.hardcopy": validate_bool,\n\n "path.simplify": validate_bool,\n "path.simplify_threshold": _validate_greaterequal0_lessequal1,\n "path.snap": validate_bool,\n "path.sketch": validate_sketch,\n "path.effects": validate_anylist,\n "agg.path.chunksize": validate_int, # 0 to disable chunking\n\n # key-mappings (multi-character mappings should be a list/tuple)\n "keymap.fullscreen": validate_stringlist,\n "keymap.home": validate_stringlist,\n "keymap.back": validate_stringlist,\n "keymap.forward": validate_stringlist,\n "keymap.pan": validate_stringlist,\n "keymap.zoom": validate_stringlist,\n "keymap.save": validate_stringlist,\n "keymap.quit": 
validate_stringlist,\n "keymap.quit_all": validate_stringlist, # e.g.: "W", "cmd+W", "Q"\n "keymap.grid": validate_stringlist,\n "keymap.grid_minor": validate_stringlist,\n "keymap.yscale": validate_stringlist,\n "keymap.xscale": validate_stringlist,\n "keymap.help": validate_stringlist,\n "keymap.copy": validate_stringlist,\n\n # Animation settings\n "animation.html": ["html5", "jshtml", "none"],\n # Limit, in MB, of size of base64 encoded animation in HTML\n # (i.e. IPython notebook)\n "animation.embed_limit": validate_float,\n "animation.writer": validate_string,\n "animation.codec": validate_string,\n "animation.bitrate": validate_int,\n # Controls image format when frames are written to disk\n "animation.frame_format": ["png", "jpeg", "tiff", "raw", "rgba", "ppm",\n "sgi", "bmp", "pbm", "svg"],\n # Path to ffmpeg binary. If just binary name, subprocess uses $PATH.\n "animation.ffmpeg_path": _validate_pathlike,\n # Additional arguments for ffmpeg movie writer (using pipes)\n "animation.ffmpeg_args": validate_stringlist,\n # Path to convert binary. If just binary name, subprocess uses $PATH.\n "animation.convert_path": _validate_pathlike,\n # Additional arguments for convert movie writer (using pipes)\n "animation.convert_args": validate_stringlist,\n\n # Classic (pre 2.0) compatibility mode\n # This is used for things that are hard to make backward compatible\n # with a sane rcParam alone. This does *not* turn on classic mode\n # altogether. For that use `matplotlib.style.use("classic")`.\n "_internal.classic_mode": validate_bool\n}\n_hardcoded_defaults = { # Defaults not inferred from\n # lib/matplotlib/mpl-data/matplotlibrc...\n # ... because they are private:\n "_internal.classic_mode": False,\n # ... 
because they are deprecated:\n # No current deprecations.\n # backend is handled separately when constructing rcParamsDefault.\n}\n_validators = {k: _convert_validator_spec(k, conv)\n for k, conv in _validators.items()}\n | .venv\Lib\site-packages\matplotlib\rcsetup.py | rcsetup.py | Python | 51,606 | 0.75 | 0.157549 | 0.113715 | python-kit | 566 | 2025-01-16T04:18:44.983331 | GPL-3.0 | false | 2c312b335f7e0daa1d29c68e068bae8f |
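The `_validators` table above mixes callables with plain lists of strings; `_convert_validator_spec` turns each list into a `ValidateInStrings` instance, case-insensitively when the list is wrapped in the `_ignorecase` marker class. A minimal self-contained sketch of that pattern — `make_validator` and `_IgnoreCase` are illustrative names, not matplotlib API:

```python
class _IgnoreCase(list):
    """Marker: a list of strings that should match case-insensitively."""

def make_validator(key, spec):
    """Turn a validator spec (callable or list of strings) into a callable."""
    if not isinstance(spec, list):
        return spec  # already a callable validator; pass through unchanged
    fold = isinstance(spec, _IgnoreCase)
    # Map each value's comparison form back to its canonical spelling.
    valid = {(s.lower() if fold else s): s for s in spec}
    def check(value):
        key_form = value.lower() if fold else value
        if key_form not in valid:
            raise ValueError(f"{value!r} is not a valid value for {key}; "
                             f"supported values are {list(valid.values())}")
        return valid[key_form]
    return check

# Specs modeled on entries from the table above.
shading = make_validator("pcolor.shading", ["auto", "flat", "nearest", "gouraud"])
papersize = make_validator("ps.papersize", _IgnoreCase(["letter", "legal", "a4"]))
```

Keeping the spec as a plain list in the table and converting it in one comprehension at the bottom (as the real module does) keeps the table itself declarative and easy to scan.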
from cycler import Cycler\n\nfrom collections.abc import Callable, Iterable\nfrom typing import Any, Literal, TypeVar\nfrom matplotlib.typing import ColorType, LineStyleType, MarkEveryType\n\ninteractive_bk: list[str]\nnon_interactive_bk: list[str]\nall_backends: list[str]\n\n_T = TypeVar("_T")\n\ndef _listify_validator(s: Callable[[Any], _T]) -> Callable[[Any], list[_T]]: ...\n\nclass ValidateInStrings:\n key: str\n ignorecase: bool\n valid: dict[str, str]\n def __init__(\n self,\n key: str,\n valid: Iterable[str],\n ignorecase: bool = ...,\n *,\n _deprecated_since: str | None = ...\n ) -> None: ...\n def __call__(self, s: Any) -> str: ...\n\ndef validate_any(s: Any) -> Any: ...\ndef validate_anylist(s: Any) -> list[Any]: ...\ndef validate_bool(b: Any) -> bool: ...\ndef validate_axisbelow(s: Any) -> bool | Literal["line"]: ...\ndef validate_dpi(s: Any) -> Literal["figure"] | float: ...\ndef validate_string(s: Any) -> str: ...\ndef validate_string_or_None(s: Any) -> str | None: ...\ndef validate_stringlist(s: Any) -> list[str]: ...\ndef validate_int(s: Any) -> int: ...\ndef validate_int_or_None(s: Any) -> int | None: ...\ndef validate_float(s: Any) -> float: ...\ndef validate_float_or_None(s: Any) -> float | None: ...\ndef validate_floatlist(s: Any) -> list[float]: ...\ndef _validate_marker(s: Any) -> int | str: ...\ndef _validate_markerlist(s: Any) -> list[int | str]: ...\ndef validate_fonttype(s: Any) -> int: ...\n\n_auto_backend_sentinel: object\n\ndef validate_backend(s: Any) -> str: ...\ndef validate_color_or_inherit(s: Any) -> Literal["inherit"] | ColorType: ...\ndef validate_color_or_auto(s: Any) -> ColorType | Literal["auto"]: ...\ndef validate_color_for_prop_cycle(s: Any) -> ColorType: ...\ndef validate_color(s: Any) -> ColorType: ...\ndef validate_colorlist(s: Any) -> list[ColorType]: ...\ndef _validate_color_or_linecolor(\n s: Any,\n) -> ColorType | Literal["linecolor", "markerfacecolor", "markeredgecolor"] | None: ...\ndef validate_aspect(s: Any) -> 
Literal["auto", "equal"] | float: ...\ndef validate_fontsize_None(\n s: Any,\n) -> Literal[\n "xx-small",\n "x-small",\n "small",\n "medium",\n "large",\n "x-large",\n "xx-large",\n "smaller",\n "larger",\n] | float | None: ...\ndef validate_fontsize(\n s: Any,\n) -> Literal[\n "xx-small",\n "x-small",\n "small",\n "medium",\n "large",\n "x-large",\n "xx-large",\n "smaller",\n "larger",\n] | float: ...\ndef validate_fontsizelist(\n s: Any,\n) -> list[\n Literal[\n "xx-small",\n "x-small",\n "small",\n "medium",\n "large",\n "x-large",\n "xx-large",\n "smaller",\n "larger",\n ]\n | float\n]: ...\ndef validate_fontweight(\n s: Any,\n) -> Literal[\n "ultralight",\n "light",\n "normal",\n "regular",\n "book",\n "medium",\n "roman",\n "semibold",\n "demibold",\n "demi",\n "bold",\n "heavy",\n "extra bold",\n "black",\n] | int: ...\ndef validate_fontstretch(\n s: Any,\n) -> Literal[\n "ultra-condensed",\n "extra-condensed",\n "condensed",\n "semi-condensed",\n "normal",\n "semi-expanded",\n "expanded",\n "extra-expanded",\n "ultra-expanded",\n] | int: ...\ndef validate_font_properties(s: Any) -> dict[str, Any]: ...\ndef validate_whiskers(s: Any) -> list[float] | float: ...\ndef validate_ps_distiller(s: Any) -> None | Literal["ghostscript", "xpdf"]: ...\n\nvalidate_fillstyle: ValidateInStrings\n\ndef validate_fillstylelist(\n s: Any,\n) -> list[Literal["full", "left", "right", "bottom", "top", "none"]]: ...\ndef validate_markevery(s: Any) -> MarkEveryType: ...\ndef _validate_linestyle(s: Any) -> LineStyleType: ...\ndef validate_markeverylist(s: Any) -> list[MarkEveryType]: ...\ndef validate_bbox(s: Any) -> Literal["tight", "standard"] | None: ...\ndef validate_sketch(s: Any) -> None | tuple[float, float, float]: ...\ndef validate_hatch(s: Any) -> str: ...\ndef validate_hatchlist(s: Any) -> list[str]: ...\ndef validate_dashlist(s: Any) -> list[list[float]]: ...\n\n# TODO: copy cycler overloads?\ndef cycler(*args, **kwargs) -> Cycler: ...\ndef validate_cycler(s: Any) -> 
Cycler: ...\ndef validate_hist_bins(\n s: Any,\n) -> Literal["auto", "sturges", "fd", "doane", "scott", "rice", "sqrt"] | int | list[\n float\n]: ...\n\n# At runtime is added in __init__.py\ndefaultParams: dict[str, Any]\n | .venv\Lib\site-packages\matplotlib\rcsetup.pyi | rcsetup.pyi | Other | 4,337 | 0.95 | 0.301887 | 0.020408 | vue-tools | 691 | 2024-06-10T04:09:51.964870 | GPL-3.0 | false | ce412f8750fe7cc81107a2a0dbf82c0a |
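The stub above types `_listify_validator` as lifting a scalar validator `Callable[[Any], _T]` into a `Callable[[Any], list[_T]]`, which is how `validate_floatlist`, `validate_stringlist`, and friends are built. A hedged toy version of that lifting — `listify_validator` here is an illustrative reimplementation, not matplotlib's actual function, which handles more input forms:

```python
def listify_validator(scalar_validator):
    """Lift a per-element validator into a whole-list validator (sketch)."""
    def validate_list(value):
        if isinstance(value, str):
            # matplotlibrc files store list values as comma-separated strings.
            value = [v.strip() for v in value.split(",") if v.strip()]
        return [scalar_validator(v) for v in value]
    return validate_list

# Stand-in for rcsetup.validate_float; float() accepts both str and numbers.
validate_floatlist = listify_validator(float)
```

For example, `validate_floatlist("1.5, 2")` and `validate_floatlist([3, "4"])` both come back as plain lists of floats, matching the `list[_T]` return type in the stub.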
"""\nModule for creating Sankey diagrams using Matplotlib.\n"""\n\nimport logging\nfrom types import SimpleNamespace\n\nimport numpy as np\n\nimport matplotlib as mpl\nfrom matplotlib.path import Path\nfrom matplotlib.patches import PathPatch\nfrom matplotlib.transforms import Affine2D\nfrom matplotlib import _docstring\n\n_log = logging.getLogger(__name__)\n\n__author__ = "Kevin L. Davies"\n__credits__ = ["Yannick Copin"]\n__license__ = "BSD"\n__version__ = "2011/09/16"\n\n# Angles [deg/90]\nRIGHT = 0\nUP = 1\n# LEFT = 2\nDOWN = 3\n\n\nclass Sankey:\n """\n Sankey diagram.\n\n Sankey diagrams are a specific type of flow diagram, in which\n the width of the arrows is shown proportionally to the flow\n quantity. They are typically used to visualize energy or\n material or cost transfers between processes.\n `Wikipedia (6/1/2011) <https://en.wikipedia.org/wiki/Sankey_diagram>`_\n\n """\n\n def __init__(self, ax=None, scale=1.0, unit='', format='%G', gap=0.25,\n radius=0.1, shoulder=0.03, offset=0.15, head_angle=100,\n margin=0.4, tolerance=1e-6, **kwargs):\n """\n Create a new Sankey instance.\n\n The optional arguments listed below are applied to all subdiagrams so\n that there is consistent alignment and formatting.\n\n In order to draw a complex Sankey diagram, create an instance of\n `Sankey` by calling it without any kwargs::\n\n sankey = Sankey()\n\n Then add simple Sankey sub-diagrams::\n\n sankey.add() # 1\n sankey.add() # 2\n #...\n sankey.add() # n\n\n Finally, create the full diagram::\n\n sankey.finish()\n\n Or, instead, simply daisy-chain those calls::\n\n Sankey().add().add... .add().finish()\n\n Other Parameters\n ----------------\n ax : `~matplotlib.axes.Axes`\n Axes onto which the data should be plotted. If *ax* isn't\n provided, new Axes will be created.\n scale : float\n Scaling factor for the flows. *scale* sizes the width of the paths\n in order to maintain proper layout. The same scale is applied to\n all subdiagrams. 
The value should be chosen such that the product\n of the scale and the sum of the inputs is approximately 1.0 (and\n the product of the scale and the sum of the outputs is\n approximately -1.0).\n unit : str\n The physical unit associated with the flow quantities. If *unit*\n is None, then none of the quantities are labeled.\n format : str or callable\n A Python number formatting string or callable used to label the\n flows with their quantities (i.e., a number times a unit, where the\n unit is given). If a format string is given, the label will be\n ``format % quantity``. If a callable is given, it will be called\n with ``quantity`` as an argument.\n gap : float\n Space between paths that break in/break away to/from the top or\n bottom.\n radius : float\n Inner radius of the vertical paths.\n shoulder : float\n Size of the shoulders of output arrows.\n offset : float\n Text offset (from the dip or tip of the arrow).\n head_angle : float\n Angle, in degrees, of the arrow heads (and negative of the angle of\n the tails).\n margin : float\n Minimum space between Sankey outlines and the edge of the plot\n area.\n tolerance : float\n Acceptable maximum of the magnitude of the sum of flows. The\n magnitude of the sum of connected flows cannot be greater than\n *tolerance*.\n **kwargs\n Any additional keyword arguments will be passed to `add`, which\n will create the first subdiagram.\n\n See Also\n --------\n Sankey.add\n Sankey.finish\n\n Examples\n --------\n .. 
plot:: gallery/specialty_plots/sankey_basics.py\n """\n # Check the arguments.\n if gap < 0:\n raise ValueError(\n "'gap' is negative, which is not allowed because it would "\n "cause the paths to overlap")\n if radius > gap:\n raise ValueError(\n "'radius' is greater than 'gap', which is not allowed because "\n "it would cause the paths to overlap")\n if head_angle < 0:\n raise ValueError(\n "'head_angle' is negative, which is not allowed because it "\n "would cause inputs to look like outputs and vice versa")\n if tolerance < 0:\n raise ValueError(\n "'tolerance' is negative, but it must be a magnitude")\n\n # Create Axes if necessary.\n if ax is None:\n import matplotlib.pyplot as plt\n fig = plt.figure()\n ax = fig.add_subplot(1, 1, 1, xticks=[], yticks=[])\n\n self.diagrams = []\n\n # Store the inputs.\n self.ax = ax\n self.unit = unit\n self.format = format\n self.scale = scale\n self.gap = gap\n self.radius = radius\n self.shoulder = shoulder\n self.offset = offset\n self.margin = margin\n self.pitch = np.tan(np.pi * (1 - head_angle / 180.0) / 2.0)\n self.tolerance = tolerance\n\n # Initialize the vertices of tight box around the diagram(s).\n self.extent = np.array((np.inf, -np.inf, np.inf, -np.inf))\n\n # If there are any kwargs, create the first subdiagram.\n if len(kwargs):\n self.add(**kwargs)\n\n def _arc(self, quadrant=0, cw=True, radius=1, center=(0, 0)):\n """\n Return the codes and vertices for a rotated, scaled, and translated\n 90 degree arc.\n\n Other Parameters\n ----------------\n quadrant : {0, 1, 2, 3}, default: 0\n Uses 0-based indexing (0, 1, 2, or 3).\n cw : bool, default: True\n If True, the arc vertices are produced clockwise; counter-clockwise\n otherwise.\n radius : float, default: 1\n The radius of the arc.\n center : (float, float), default: (0, 0)\n (x, y) tuple of the arc's center.\n """\n # Note: It would be possible to use matplotlib's transforms to rotate,\n # scale, and translate the arc, but since the angles are discrete,\n # 
it's just as easy and maybe more efficient to do it here.\n ARC_CODES = [Path.LINETO,\n Path.CURVE4,\n Path.CURVE4,\n Path.CURVE4,\n Path.CURVE4,\n Path.CURVE4,\n Path.CURVE4]\n # Vertices of a cubic Bezier curve approximating a 90 deg arc\n # These can be determined by Path.arc(0, 90).\n ARC_VERTICES = np.array([[1.00000000e+00, 0.00000000e+00],\n [1.00000000e+00, 2.65114773e-01],\n [8.94571235e-01, 5.19642327e-01],\n [7.07106781e-01, 7.07106781e-01],\n [5.19642327e-01, 8.94571235e-01],\n [2.65114773e-01, 1.00000000e+00],\n # Insignificant\n # [6.12303177e-17, 1.00000000e+00]])\n [0.00000000e+00, 1.00000000e+00]])\n if quadrant in (0, 2):\n if cw:\n vertices = ARC_VERTICES\n else:\n vertices = ARC_VERTICES[:, ::-1] # Swap x and y.\n else: # 1, 3\n # Negate x.\n if cw:\n # Swap x and y.\n vertices = np.column_stack((-ARC_VERTICES[:, 1],\n ARC_VERTICES[:, 0]))\n else:\n vertices = np.column_stack((-ARC_VERTICES[:, 0],\n ARC_VERTICES[:, 1]))\n if quadrant > 1:\n radius = -radius # Rotate 180 deg.\n return list(zip(ARC_CODES, radius * vertices +\n np.tile(center, (ARC_VERTICES.shape[0], 1))))\n\n def _add_input(self, path, angle, flow, length):\n """\n Add an input to a path and return its tip and label locations.\n """\n if angle is None:\n return [0, 0], [0, 0]\n else:\n x, y = path[-1][1] # Use the last point as a reference.\n dipdepth = (flow / 2) * self.pitch\n if angle == RIGHT:\n x -= length\n dip = [x + dipdepth, y + flow / 2.0]\n path.extend([(Path.LINETO, [x, y]),\n (Path.LINETO, dip),\n (Path.LINETO, [x, y + flow]),\n (Path.LINETO, [x + self.gap, y + flow])])\n label_location = [dip[0] - self.offset, dip[1]]\n else: # Vertical\n x -= self.gap\n if angle == UP:\n sign = 1\n else:\n sign = -1\n\n dip = [x - flow / 2, y - sign * (length - dipdepth)]\n if angle == DOWN:\n quadrant = 2\n else:\n quadrant = 1\n\n # Inner arc isn't needed if inner radius is zero\n if self.radius:\n path.extend(self._arc(quadrant=quadrant,\n cw=angle == UP,\n radius=self.radius,\n 
center=(x + self.radius,\n y - sign * self.radius)))\n else:\n path.append((Path.LINETO, [x, y]))\n path.extend([(Path.LINETO, [x, y - sign * length]),\n (Path.LINETO, dip),\n (Path.LINETO, [x - flow, y - sign * length])])\n path.extend(self._arc(quadrant=quadrant,\n cw=angle == DOWN,\n radius=flow + self.radius,\n center=(x + self.radius,\n y - sign * self.radius)))\n path.append((Path.LINETO, [x - flow, y + sign * flow]))\n label_location = [dip[0], dip[1] - sign * self.offset]\n\n return dip, label_location\n\n def _add_output(self, path, angle, flow, length):\n """\n Append an output to a path and return its tip and label locations.\n\n .. note:: *flow* is negative for an output.\n """\n if angle is None:\n return [0, 0], [0, 0]\n else:\n x, y = path[-1][1] # Use the last point as a reference.\n tipheight = (self.shoulder - flow / 2) * self.pitch\n if angle == RIGHT:\n x += length\n tip = [x + tipheight, y + flow / 2.0]\n path.extend([(Path.LINETO, [x, y]),\n (Path.LINETO, [x, y + self.shoulder]),\n (Path.LINETO, tip),\n (Path.LINETO, [x, y - self.shoulder + flow]),\n (Path.LINETO, [x, y + flow]),\n (Path.LINETO, [x - self.gap, y + flow])])\n label_location = [tip[0] + self.offset, tip[1]]\n else: # Vertical\n x += self.gap\n if angle == UP:\n sign, quadrant = 1, 3\n else:\n sign, quadrant = -1, 0\n\n tip = [x - flow / 2.0, y + sign * (length + tipheight)]\n # Inner arc isn't needed if inner radius is zero\n if self.radius:\n path.extend(self._arc(quadrant=quadrant,\n cw=angle == UP,\n radius=self.radius,\n center=(x - self.radius,\n y + sign * self.radius)))\n else:\n path.append((Path.LINETO, [x, y]))\n path.extend([(Path.LINETO, [x, y + sign * length]),\n (Path.LINETO, [x - self.shoulder,\n y + sign * length]),\n (Path.LINETO, tip),\n (Path.LINETO, [x + self.shoulder - flow,\n y + sign * length]),\n (Path.LINETO, [x - flow, y + sign * length])])\n path.extend(self._arc(quadrant=quadrant,\n cw=angle == DOWN,\n radius=self.radius - flow,\n center=(x - 
self.radius,\n y + sign * self.radius)))\n path.append((Path.LINETO, [x - flow, y + sign * flow]))\n label_location = [tip[0], tip[1] + sign * self.offset]\n return tip, label_location\n\n def _revert(self, path, first_action=Path.LINETO):\n """\n A path is not simply reversible by path[::-1] since the code\n specifies an action to take from the **previous** point.\n """\n reverse_path = []\n next_code = first_action\n for code, position in path[::-1]:\n reverse_path.append((next_code, position))\n next_code = code\n return reverse_path\n # This might be more efficient, but it fails because 'tuple' object\n # doesn't support item assignment:\n # path[1] = path[1][-1:0:-1]\n # path[1][0] = first_action\n # path[2] = path[2][::-1]\n # return path\n\n @_docstring.interpd\n def add(self, patchlabel='', flows=None, orientations=None, labels='',\n trunklength=1.0, pathlengths=0.25, prior=None, connect=(0, 0),\n rotation=0, **kwargs):\n """\n Add a simple Sankey diagram with flows at the same hierarchical level.\n\n Parameters\n ----------\n patchlabel : str\n Label to be placed at the center of the diagram.\n Note that *label* (not *patchlabel*) can be passed as keyword\n argument to create an entry in the legend.\n\n flows : list of float\n Array of flow values. By convention, inputs are positive and\n outputs are negative.\n\n Flows are placed along the top of the diagram from the inside out\n in order of their index within *flows*. They are placed along the\n sides of the diagram from the top down and along the bottom from\n the outside in.\n\n If the sum of the inputs and outputs is\n nonzero, the discrepancy will appear as a cubic Bézier curve along\n the top and bottom edges of the trunk.\n\n orientations : list of {-1, 0, 1}\n List of orientations of the flows (or a single orientation to be\n used for all flows). 
Valid values are 0 (inputs from\n the left, outputs to the right), 1 (from and to the top) or -1\n (from and to the bottom).\n\n labels : list of (str or None)\n List of labels for the flows (or a single label to be used for all\n flows). Each label may be *None* (no label), or a labeling string.\n If an entry is a (possibly empty) string, then the quantity for the\n corresponding flow will be shown below the string. However, if\n the *unit* of the main diagram is None, then quantities are never\n shown, regardless of the value of this argument.\n\n trunklength : float\n Length between the bases of the input and output groups (in\n data-space units).\n\n pathlengths : list of float\n List of lengths of the vertical arrows before break-in or after\n break-away. If a single value is given, then it will be applied to\n the first (inside) paths on the top and bottom, and the length of\n all other arrows will be justified accordingly. The *pathlengths*\n are not applied to the horizontal inputs and outputs.\n\n prior : int\n Index of the prior diagram to which this diagram should be\n connected.\n\n connect : (int, int)\n A (prior, this) tuple indexing the flow of the prior diagram and\n the flow of this diagram which should be connected. If this is the\n first diagram or *prior* is *None*, *connect* will be ignored.\n\n rotation : float\n Angle of rotation of the diagram in degrees. The interpretation of\n the *orientations* argument will be rotated accordingly (e.g., if\n *rotation* == 90, an *orientations* entry of 1 means to/from the\n left). *rotation* is ignored if this diagram is connected to an\n existing one (using *prior* and *connect*).\n\n Returns\n -------\n Sankey\n The current `.Sankey` instance.\n\n Other Parameters\n ----------------\n **kwargs\n Additional keyword arguments set `matplotlib.patches.PathPatch`\n properties, listed below. 
For example, one may want to use\n ``fill=False`` or ``label="A legend entry"``.\n\n %(Patch:kwdoc)s\n\n See Also\n --------\n Sankey.finish\n """\n # Check and preprocess the arguments.\n flows = np.array([1.0, -1.0]) if flows is None else np.array(flows)\n n = flows.shape[0] # Number of flows\n if rotation is None:\n rotation = 0\n else:\n # In the code below, angles are expressed in deg/90.\n rotation /= 90.0\n if orientations is None:\n orientations = 0\n try:\n orientations = np.broadcast_to(orientations, n)\n except ValueError:\n raise ValueError(\n f"The shapes of 'flows' {np.shape(flows)} and 'orientations' "\n f"{np.shape(orientations)} are incompatible"\n ) from None\n try:\n labels = np.broadcast_to(labels, n)\n except ValueError:\n raise ValueError(\n f"The shapes of 'flows' {np.shape(flows)} and 'labels' "\n f"{np.shape(labels)} are incompatible"\n ) from None\n if trunklength < 0:\n raise ValueError(\n "'trunklength' is negative, which is not allowed because it "\n "would cause poor layout")\n if abs(np.sum(flows)) > self.tolerance:\n _log.info("The sum of the flows is nonzero (%f; patchlabel=%r); "\n "is the system not at steady state?",\n np.sum(flows), patchlabel)\n scaled_flows = self.scale * flows\n gain = sum(max(flow, 0) for flow in scaled_flows)\n loss = sum(min(flow, 0) for flow in scaled_flows)\n if prior is not None:\n if prior < 0:\n raise ValueError("The index of the prior diagram is negative")\n if min(connect) < 0:\n raise ValueError(\n "At least one of the connection indices is negative")\n if prior >= len(self.diagrams):\n raise ValueError(\n f"The index of the prior diagram is {prior}, but there "\n f"are only {len(self.diagrams)} other diagrams")\n if connect[0] >= len(self.diagrams[prior].flows):\n raise ValueError(\n "The connection index to the source diagram is {}, but "\n "that diagram has only {} flows".format(\n connect[0], len(self.diagrams[prior].flows)))\n if connect[1] >= n:\n raise ValueError(\n f"The connection index to 
this diagram is {connect[1]}, "\n f"but this diagram has only {n} flows")\n if self.diagrams[prior].angles[connect[0]] is None:\n raise ValueError(\n f"The connection cannot be made, which may occur if the "\n f"magnitude of flow {connect[0]} of diagram {prior} is "\n f"less than the specified tolerance")\n flow_error = (self.diagrams[prior].flows[connect[0]] +\n flows[connect[1]])\n if abs(flow_error) >= self.tolerance:\n raise ValueError(\n f"The scaled sum of the connected flows is {flow_error}, "\n f"which is not within the tolerance ({self.tolerance})")\n\n # Determine if the flows are inputs.\n are_inputs = [None] * n\n for i, flow in enumerate(flows):\n if flow >= self.tolerance:\n are_inputs[i] = True\n elif flow <= -self.tolerance:\n are_inputs[i] = False\n else:\n _log.info(\n "The magnitude of flow %d (%f) is below the tolerance "\n "(%f).\nIt will not be shown, and it cannot be used in a "\n "connection.", i, flow, self.tolerance)\n\n # Determine the angles of the arrows (before rotation).\n angles = [None] * n\n for i, (orient, is_input) in enumerate(zip(orientations, are_inputs)):\n if orient == 1:\n if is_input:\n angles[i] = DOWN\n elif is_input is False:\n # Be specific since is_input can be None.\n angles[i] = UP\n elif orient == 0:\n if is_input is not None:\n angles[i] = RIGHT\n else:\n if orient != -1:\n raise ValueError(\n f"The value of orientations[{i}] is {orient}, "\n f"but it must be -1, 0, or 1")\n if is_input:\n angles[i] = UP\n elif is_input is False:\n angles[i] = DOWN\n\n # Justify the lengths of the paths.\n if np.iterable(pathlengths):\n if len(pathlengths) != n:\n raise ValueError(\n f"The lengths of 'flows' ({n}) and 'pathlengths' "\n f"({len(pathlengths)}) are incompatible")\n else: # Make pathlengths into a list.\n urlength = pathlengths\n ullength = pathlengths\n lrlength = pathlengths\n lllength = pathlengths\n d = dict(RIGHT=pathlengths)\n pathlengths = [d.get(angle, 0) for angle in angles]\n # Determine the lengths of the 
top-side arrows\n # from the middle outwards.\n for i, (angle, is_input, flow) in enumerate(zip(angles, are_inputs,\n scaled_flows)):\n if angle == DOWN and is_input:\n pathlengths[i] = ullength\n ullength += flow\n elif angle == UP and is_input is False:\n pathlengths[i] = urlength\n urlength -= flow # Flow is negative for outputs.\n # Determine the lengths of the bottom-side arrows\n # from the middle outwards.\n for i, (angle, is_input, flow) in enumerate(reversed(list(zip(\n angles, are_inputs, scaled_flows)))):\n if angle == UP and is_input:\n pathlengths[n - i - 1] = lllength\n lllength += flow\n elif angle == DOWN and is_input is False:\n pathlengths[n - i - 1] = lrlength\n lrlength -= flow\n # Determine the lengths of the left-side arrows\n # from the bottom upwards.\n has_left_input = False\n for i, (angle, is_input, spec) in enumerate(reversed(list(zip(\n angles, are_inputs, zip(scaled_flows, pathlengths))))):\n if angle == RIGHT:\n if is_input:\n if has_left_input:\n pathlengths[n - i - 1] = 0\n else:\n has_left_input = True\n # Determine the lengths of the right-side arrows\n # from the top downwards.\n has_right_output = False\n for i, (angle, is_input, spec) in enumerate(zip(\n angles, are_inputs, list(zip(scaled_flows, pathlengths)))):\n if angle == RIGHT:\n if is_input is False:\n if has_right_output:\n pathlengths[i] = 0\n else:\n has_right_output = True\n\n # Begin the subpaths, and smooth the transition if the sum of the flows\n # is nonzero.\n urpath = [(Path.MOVETO, [(self.gap - trunklength / 2.0), # Upper right\n gain / 2.0]),\n (Path.LINETO, [(self.gap - trunklength / 2.0) / 2.0,\n gain / 2.0]),\n (Path.CURVE4, [(self.gap - trunklength / 2.0) / 8.0,\n gain / 2.0]),\n (Path.CURVE4, [(trunklength / 2.0 - self.gap) / 8.0,\n -loss / 2.0]),\n (Path.LINETO, [(trunklength / 2.0 - self.gap) / 2.0,\n -loss / 2.0]),\n (Path.LINETO, [(trunklength / 2.0 - self.gap),\n -loss / 2.0])]\n llpath = [(Path.LINETO, [(trunklength / 2.0 - self.gap), # Lower 
left\n loss / 2.0]),\n (Path.LINETO, [(trunklength / 2.0 - self.gap) / 2.0,\n loss / 2.0]),\n (Path.CURVE4, [(trunklength / 2.0 - self.gap) / 8.0,\n loss / 2.0]),\n (Path.CURVE4, [(self.gap - trunklength / 2.0) / 8.0,\n -gain / 2.0]),\n (Path.LINETO, [(self.gap - trunklength / 2.0) / 2.0,\n -gain / 2.0]),\n (Path.LINETO, [(self.gap - trunklength / 2.0),\n -gain / 2.0])]\n lrpath = [(Path.LINETO, [(trunklength / 2.0 - self.gap), # Lower right\n loss / 2.0])]\n ulpath = [(Path.LINETO, [self.gap - trunklength / 2.0, # Upper left\n gain / 2.0])]\n\n # Add the subpaths and assign the locations of the tips and labels.\n tips = np.zeros((n, 2))\n label_locations = np.zeros((n, 2))\n # Add the top-side inputs and outputs from the middle outwards.\n for i, (angle, is_input, spec) in enumerate(zip(\n angles, are_inputs, list(zip(scaled_flows, pathlengths)))):\n if angle == DOWN and is_input:\n tips[i, :], label_locations[i, :] = self._add_input(\n ulpath, angle, *spec)\n elif angle == UP and is_input is False:\n tips[i, :], label_locations[i, :] = self._add_output(\n urpath, angle, *spec)\n # Add the bottom-side inputs and outputs from the middle outwards.\n for i, (angle, is_input, spec) in enumerate(reversed(list(zip(\n angles, are_inputs, list(zip(scaled_flows, pathlengths)))))):\n if angle == UP and is_input:\n tip, label_location = self._add_input(llpath, angle, *spec)\n tips[n - i - 1, :] = tip\n label_locations[n - i - 1, :] = label_location\n elif angle == DOWN and is_input is False:\n tip, label_location = self._add_output(lrpath, angle, *spec)\n tips[n - i - 1, :] = tip\n label_locations[n - i - 1, :] = label_location\n # Add the left-side inputs from the bottom upwards.\n has_left_input = False\n for i, (angle, is_input, spec) in enumerate(reversed(list(zip(\n angles, are_inputs, list(zip(scaled_flows, pathlengths)))))):\n if angle == RIGHT and is_input:\n if not has_left_input:\n # Make sure the lower path extends\n # at least as far as the upper one.\n if 
llpath[-1][1][0] > ulpath[-1][1][0]:\n llpath.append((Path.LINETO, [ulpath[-1][1][0],\n llpath[-1][1][1]]))\n has_left_input = True\n tip, label_location = self._add_input(llpath, angle, *spec)\n tips[n - i - 1, :] = tip\n label_locations[n - i - 1, :] = label_location\n # Add the right-side outputs from the top downwards.\n has_right_output = False\n for i, (angle, is_input, spec) in enumerate(zip(\n angles, are_inputs, list(zip(scaled_flows, pathlengths)))):\n if angle == RIGHT and is_input is False:\n if not has_right_output:\n # Make sure the upper path extends\n # at least as far as the lower one.\n if urpath[-1][1][0] < lrpath[-1][1][0]:\n urpath.append((Path.LINETO, [lrpath[-1][1][0],\n urpath[-1][1][1]]))\n has_right_output = True\n tips[i, :], label_locations[i, :] = self._add_output(\n urpath, angle, *spec)\n # Trim any hanging vertices.\n if not has_left_input:\n ulpath.pop()\n llpath.pop()\n if not has_right_output:\n lrpath.pop()\n urpath.pop()\n\n # Concatenate the subpaths in the correct order (clockwise from top).\n path = (urpath + self._revert(lrpath) + llpath + self._revert(ulpath) +\n [(Path.CLOSEPOLY, urpath[0][1])])\n\n # Create a patch with the Sankey outline.\n codes, vertices = zip(*path)\n vertices = np.array(vertices)\n\n def _get_angle(a, r):\n if a is None:\n return None\n else:\n return a + r\n\n if prior is None:\n if rotation != 0: # By default, none of this is needed.\n angles = [_get_angle(angle, rotation) for angle in angles]\n rotate = Affine2D().rotate_deg(rotation * 90).transform_affine\n tips = rotate(tips)\n label_locations = rotate(label_locations)\n vertices = rotate(vertices)\n text = self.ax.text(0, 0, s=patchlabel, ha='center', va='center')\n else:\n rotation = (self.diagrams[prior].angles[connect[0]] -\n angles[connect[1]])\n angles = [_get_angle(angle, rotation) for angle in angles]\n rotate = Affine2D().rotate_deg(rotation * 90).transform_affine\n tips = rotate(tips)\n offset = self.diagrams[prior].tips[connect[0]] - 
tips[connect[1]]\n translate = Affine2D().translate(*offset).transform_affine\n tips = translate(tips)\n label_locations = translate(rotate(label_locations))\n vertices = translate(rotate(vertices))\n kwds = dict(s=patchlabel, ha='center', va='center')\n text = self.ax.text(*offset, **kwds)\n if mpl.rcParams['_internal.classic_mode']:\n fc = kwargs.pop('fc', kwargs.pop('facecolor', '#bfd1d4'))\n lw = kwargs.pop('lw', kwargs.pop('linewidth', 0.5))\n else:\n fc = kwargs.pop('fc', kwargs.pop('facecolor', None))\n lw = kwargs.pop('lw', kwargs.pop('linewidth', None))\n if fc is None:\n fc = self.ax._get_patches_for_fill.get_next_color()\n patch = PathPatch(Path(vertices, codes), fc=fc, lw=lw, **kwargs)\n self.ax.add_patch(patch)\n\n # Add the path labels.\n texts = []\n for number, angle, label, location in zip(flows, angles, labels,\n label_locations):\n if label is None or angle is None:\n label = ''\n elif self.unit is not None:\n if isinstance(self.format, str):\n quantity = self.format % abs(number) + self.unit\n elif callable(self.format):\n quantity = self.format(number)\n else:\n raise TypeError(\n 'format must be callable or a format string')\n if label != '':\n label += "\n"\n label += quantity\n texts.append(self.ax.text(x=location[0], y=location[1],\n s=label,\n ha='center', va='center'))\n # Text objects are placed even if they are empty (as long as the magnitude\n # of the corresponding flow is larger than the tolerance) in case the\n # user wants to provide labels later.\n\n # Expand the size of the diagram if necessary.\n self.extent = (min(np.min(vertices[:, 0]),\n np.min(label_locations[:, 0]),\n self.extent[0]),\n max(np.max(vertices[:, 0]),\n np.max(label_locations[:, 0]),\n self.extent[1]),\n min(np.min(vertices[:, 1]),\n np.min(label_locations[:, 1]),\n self.extent[2]),\n max(np.max(vertices[:, 1]),\n np.max(label_locations[:, 1]),\n self.extent[3]))\n # Include both vertices _and_ label locations in the extents; there are\n # cases where either could 
determine the margins (e.g., arrow shoulders).\n\n # Add this diagram as a subdiagram.\n self.diagrams.append(\n SimpleNamespace(patch=patch, flows=flows, angles=angles, tips=tips,\n text=text, texts=texts))\n\n # Allow a daisy-chained call structure (see docstring for the class).\n return self\n\n def finish(self):\n """\n Adjust the Axes and return a list of information about the Sankey\n subdiagram(s).\n\n Returns a list of subdiagrams with the following fields:\n\n ======== =============================================================\n Field Description\n ======== =============================================================\n *patch* Sankey outline (a `~matplotlib.patches.PathPatch`).\n *flows* Flow values (positive for input, negative for output).\n *angles* List of angles of the arrows [deg/90].\n For example, if the diagram has not been rotated,\n an input to the top side has an angle of 3 (DOWN),\n and an output from the top side has an angle of 1 (UP).\n If a flow has been skipped (because its magnitude is less\n than *tolerance*), then its angle will be *None*.\n *tips* (N, 2)-array of the (x, y) positions of the tips (or "dips")\n of the flow paths.\n If the magnitude of a flow is less the *tolerance* of this\n `Sankey` instance, the flow is skipped and its tip will be at\n the center of the diagram.\n *text* `.Text` instance for the diagram label.\n *texts* List of `.Text` instances for the flow labels.\n ======== =============================================================\n\n See Also\n --------\n Sankey.add\n """\n self.ax.axis([self.extent[0] - self.margin,\n self.extent[1] + self.margin,\n self.extent[2] - self.margin,\n self.extent[3] + self.margin])\n self.ax.set_aspect('equal', adjustable='datalim')\n return self.diagrams\n | .venv\Lib\site-packages\matplotlib\sankey.py | sankey.py | Python | 36,151 | 0.95 | 0.14742 | 0.096904 | node-utils | 991 | 2024-02-01T03:57:11.175691 | MIT | false | 4d47a752cbb627c210ad24389f933abe |
from matplotlib.axes import Axes\n\nfrom collections.abc import Callable, Iterable\nfrom typing import Any\nfrom typing_extensions import Self # < Py 3.11\n\nimport numpy as np\n\n__license__: str\n__credits__: list[str]\n__author__: str\n__version__: str\n\nRIGHT: int\nUP: int\nDOWN: int\n\n# TODO typing units\nclass Sankey:\n diagrams: list[Any]\n ax: Axes\n unit: Any\n format: str | Callable[[float], str]\n scale: float\n gap: float\n radius: float\n shoulder: float\n offset: float\n margin: float\n pitch: float\n tolerance: float\n extent: np.ndarray\n def __init__(\n self,\n ax: Axes | None = ...,\n scale: float = ...,\n unit: Any = ...,\n format: str | Callable[[float], str] = ...,\n gap: float = ...,\n radius: float = ...,\n shoulder: float = ...,\n offset: float = ...,\n head_angle: float = ...,\n margin: float = ...,\n tolerance: float = ...,\n **kwargs\n ) -> None: ...\n def add(\n self,\n patchlabel: str = ...,\n flows: Iterable[float] | None = ...,\n orientations: Iterable[int] | None = ...,\n labels: str | Iterable[str | None] = ...,\n trunklength: float = ...,\n pathlengths: float | Iterable[float] = ...,\n prior: int | None = ...,\n connect: tuple[int, int] = ...,\n rotation: float = ...,\n **kwargs\n ) -> Self: ...\n def finish(self) -> list[Any]: ...\n | .venv\Lib\site-packages\matplotlib\sankey.pyi | sankey.pyi | Other | 1,451 | 0.95 | 0.065574 | 0.053571 | react-lib | 717 | 2025-03-30T14:41:27.360641 | Apache-2.0 | false | 6ec9d94ba2461b18d31738f8b8b55473 |
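The `add(...) -> Self` annotation in the `sankey.pyi` stub above exists because `Sankey.add` returns the instance, allowing daisy-chained calls ending in `finish()`. A minimal standalone sketch of that fluent pattern (the `Chain` class here is a hypothetical illustration, not part of matplotlib):

```python
class Chain:
    """Toy fluent-interface class mirroring Sankey's add()/finish() shape."""

    def __init__(self):
        self.flows = []

    def add(self, *flows) -> "Chain":
        self.flows.extend(flows)
        return self  # returning self is what makes .add(...).add(...) legal

    def finish(self):
        return self.flows

print(Chain().add(100, -60).add(-40).finish())  # → [100, -60, -40]
```

Annotating the return type as `Self` (rather than the class name) keeps the chain correctly typed for subclasses as well, which is why the stub imports it from `typing_extensions` for Python versions before 3.11.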
"""\nScales define the distribution of data values on an axis, e.g. a log scaling.\n\nThe mapping is implemented through `.Transform` subclasses.\n\nThe following scales are built-in:\n\n.. _builtin_scales:\n\n============= ===================== ================================ =================================\nName Class Transform Inverted transform\n============= ===================== ================================ =================================\n"asinh" `AsinhScale` `AsinhTransform` `InvertedAsinhTransform`\n"function" `FuncScale` `FuncTransform` `FuncTransform`\n"functionlog" `FuncScaleLog` `FuncTransform` + `LogTransform` `InvertedLogTransform` + `FuncTransform`\n"linear" `LinearScale` `.IdentityTransform` `.IdentityTransform`\n"log" `LogScale` `LogTransform` `InvertedLogTransform`\n"logit" `LogitScale` `LogitTransform` `LogisticTransform`\n"symlog" `SymmetricalLogScale` `SymmetricalLogTransform` `InvertedSymmetricalLogTransform`\n============= ===================== ================================ =================================\n\nA user will often only use the scale name, e.g. 
when setting the scale through\n`~.Axes.set_xscale`: ``ax.set_xscale("log")``.\n\nSee also the :ref:`scales examples <sphx_glr_gallery_scales>` in the documentation.\n\nCustom scaling can be achieved through `FuncScale`, or by creating your own\n`ScaleBase` subclass and corresponding transforms (see :doc:`/gallery/scales/custom_scale`).\nThird parties can register their scales by name through `register_scale`.\n""" # noqa: E501\n\nimport inspect\nimport textwrap\n\nimport numpy as np\n\nimport matplotlib as mpl\nfrom matplotlib import _api, _docstring\nfrom matplotlib.ticker import (\n NullFormatter, ScalarFormatter, LogFormatterSciNotation, LogitFormatter,\n NullLocator, LogLocator, AutoLocator, AutoMinorLocator,\n SymmetricalLogLocator, AsinhLocator, LogitLocator)\nfrom matplotlib.transforms import Transform, IdentityTransform\n\n\nclass ScaleBase:\n """\n The base class for all scales.\n\n Scales are separable transformations, working on a single dimension.\n\n Subclasses should override\n\n :attr:`name`\n The scale's name.\n :meth:`get_transform`\n A method returning a `.Transform`, which converts data coordinates to\n scaled coordinates. This transform should be invertible, so that e.g.\n mouse positions can be converted back to data coordinates.\n :meth:`set_default_locators_and_formatters`\n A method that sets default locators and formatters for an `~.axis.Axis`\n that uses this scale.\n :meth:`limit_range_for_scale`\n An optional method that "fixes" the axis range to acceptable values,\n e.g. restricting log-scaled axes to positive values.\n """\n\n def __init__(self, axis):\n r"""\n Construct a new scale.\n\n Notes\n -----\n The following note is for scale implementers.\n\n For back-compatibility reasons, scales take an `~matplotlib.axis.Axis`\n object as first argument. 
However, this argument should not\n be used: a single scale object should be usable by multiple\n `~matplotlib.axis.Axis`\es at the same time.\n """\n\n def get_transform(self):\n """\n Return the `.Transform` object associated with this scale.\n """\n raise NotImplementedError()\n\n def set_default_locators_and_formatters(self, axis):\n """\n Set the locators and formatters of *axis* to instances suitable for\n this scale.\n """\n raise NotImplementedError()\n\n def limit_range_for_scale(self, vmin, vmax, minpos):\n """\n Return the range *vmin*, *vmax*, restricted to the\n domain supported by this scale (if any).\n\n *minpos* should be the minimum positive value in the data.\n This is used by log scales to determine a minimum value.\n """\n return vmin, vmax\n\n\nclass LinearScale(ScaleBase):\n """\n The default linear scale.\n """\n\n name = 'linear'\n\n def __init__(self, axis):\n # This method is present only to prevent inheritance of the base class'\n # constructor docstring, which would otherwise end up interpolated into\n # the docstring of Axis.set_scale.\n """\n """ # noqa: D419\n\n def set_default_locators_and_formatters(self, axis):\n # docstring inherited\n axis.set_major_locator(AutoLocator())\n axis.set_major_formatter(ScalarFormatter())\n axis.set_minor_formatter(NullFormatter())\n # update the minor locator for x and y axis based on rcParams\n if (axis.axis_name == 'x' and mpl.rcParams['xtick.minor.visible'] or\n axis.axis_name == 'y' and mpl.rcParams['ytick.minor.visible']):\n axis.set_minor_locator(AutoMinorLocator())\n else:\n axis.set_minor_locator(NullLocator())\n\n def get_transform(self):\n """\n Return the transform for linear scaling, which is just the\n `~matplotlib.transforms.IdentityTransform`.\n """\n return IdentityTransform()\n\n\nclass FuncTransform(Transform):\n """\n A simple transform that takes an arbitrary function for the\n forward and inverse transform.\n """\n\n input_dims = output_dims = 1\n\n def __init__(self, forward, 
inverse):\n """\n Parameters\n ----------\n forward : callable\n The forward function for the transform. This function must have\n an inverse and, for best behavior, be monotonic.\n It must have the signature::\n\n def forward(values: array-like) -> array-like\n\n inverse : callable\n The inverse of the forward function. Signature as ``forward``.\n """\n super().__init__()\n if callable(forward) and callable(inverse):\n self._forward = forward\n self._inverse = inverse\n else:\n raise ValueError('arguments to FuncTransform must be functions')\n\n def transform_non_affine(self, values):\n return self._forward(values)\n\n def inverted(self):\n return FuncTransform(self._inverse, self._forward)\n\n\nclass FuncScale(ScaleBase):\n """\n Provide an arbitrary scale with user-supplied function for the axis.\n """\n\n name = 'function'\n\n def __init__(self, axis, functions):\n """\n Parameters\n ----------\n axis : `~matplotlib.axis.Axis`\n The axis for the scale.\n functions : (callable, callable)\n two-tuple of the forward and inverse functions for the scale.\n The forward function must be monotonic.\n\n Both functions must have the signature::\n\n def forward(values: array-like) -> array-like\n """\n forward, inverse = functions\n transform = FuncTransform(forward, inverse)\n self._transform = transform\n\n def get_transform(self):\n """Return the `.FuncTransform` associated with this scale."""\n return self._transform\n\n def set_default_locators_and_formatters(self, axis):\n # docstring inherited\n axis.set_major_locator(AutoLocator())\n axis.set_major_formatter(ScalarFormatter())\n axis.set_minor_formatter(NullFormatter())\n # update the minor locator for x and y axis based on rcParams\n if (axis.axis_name == 'x' and mpl.rcParams['xtick.minor.visible'] or\n axis.axis_name == 'y' and mpl.rcParams['ytick.minor.visible']):\n axis.set_minor_locator(AutoMinorLocator())\n else:\n axis.set_minor_locator(NullLocator())\n\n\nclass LogTransform(Transform):\n input_dims = 
output_dims = 1\n\n def __init__(self, base, nonpositive='clip'):\n super().__init__()\n if base <= 0 or base == 1:\n raise ValueError('The log base cannot be <= 0 or == 1')\n self.base = base\n self._clip = _api.check_getitem(\n {"clip": True, "mask": False}, nonpositive=nonpositive)\n\n def __str__(self):\n return "{}(base={}, nonpositive={!r})".format(\n type(self).__name__, self.base, "clip" if self._clip else "mask")\n\n def transform_non_affine(self, values):\n # Ignore invalid values due to nans being passed to the transform.\n with np.errstate(divide="ignore", invalid="ignore"):\n log = {np.e: np.log, 2: np.log2, 10: np.log10}.get(self.base)\n if log: # If possible, do everything in a single call to NumPy.\n out = log(values)\n else:\n out = np.log(values)\n out /= np.log(self.base)\n if self._clip:\n # SVG spec says that conforming viewers must support values up\n # to 3.4e38 (C float); however experiments suggest that\n # Inkscape (which uses cairo for rendering) runs into cairo's\n # 24-bit limit (which is apparently shared by Agg).\n # Ghostscript (used for pdf rendering) appears to overflow even\n # earlier, with the max value around 2 ** 15 for the tests to\n # pass. On the other hand, in practice, we want to clip beyond\n # np.log10(np.nextafter(0, 1)) ~ -323\n # so 1000 seems safe.\n out[values <= 0] = -1000\n return out\n\n def inverted(self):\n return InvertedLogTransform(self.base)\n\n\nclass InvertedLogTransform(Transform):\n input_dims = output_dims = 1\n\n def __init__(self, base):\n super().__init__()\n self.base = base\n\n def __str__(self):\n return f"{type(self).__name__}(base={self.base})"\n\n def transform_non_affine(self, values):\n return np.power(self.base, values)\n\n def inverted(self):\n return LogTransform(self.base)\n\n\nclass LogScale(ScaleBase):\n """\n A standard logarithmic scale. 
Care is taken to only plot positive values.\n """\n name = 'log'\n\n def __init__(self, axis, *, base=10, subs=None, nonpositive="clip"):\n """\n Parameters\n ----------\n axis : `~matplotlib.axis.Axis`\n The axis for the scale.\n base : float, default: 10\n The base of the logarithm.\n nonpositive : {'clip', 'mask'}, default: 'clip'\n Determines the behavior for non-positive values. They can either\n be masked as invalid, or clipped to a very small positive number.\n subs : sequence of int, default: None\n Where to place the subticks between each major tick. For example,\n in a log10 scale, ``[2, 3, 4, 5, 6, 7, 8, 9]`` will place 8\n logarithmically spaced minor ticks between each major tick.\n """\n self._transform = LogTransform(base, nonpositive)\n self.subs = subs\n\n base = property(lambda self: self._transform.base)\n\n def set_default_locators_and_formatters(self, axis):\n # docstring inherited\n axis.set_major_locator(LogLocator(self.base))\n axis.set_major_formatter(LogFormatterSciNotation(self.base))\n axis.set_minor_locator(LogLocator(self.base, self.subs))\n axis.set_minor_formatter(\n LogFormatterSciNotation(self.base,\n labelOnlyBase=(self.subs is not None)))\n\n def get_transform(self):\n """Return the `.LogTransform` associated with this scale."""\n return self._transform\n\n def limit_range_for_scale(self, vmin, vmax, minpos):\n """Limit the domain to positive values."""\n if not np.isfinite(minpos):\n minpos = 1e-300 # Should rarely (if ever) have a visible effect.\n\n return (minpos if vmin <= 0 else vmin,\n minpos if vmax <= 0 else vmax)\n\n\nclass FuncScaleLog(LogScale):\n """\n Provide an arbitrary scale with user-supplied function for the axis and\n then put on a logarithmic axes.\n """\n\n name = 'functionlog'\n\n def __init__(self, axis, functions, base=10):\n """\n Parameters\n ----------\n axis : `~matplotlib.axis.Axis`\n The axis for the scale.\n functions : (callable, callable)\n two-tuple of the forward and inverse functions for the 
scale.\n The forward function must be monotonic.\n\n Both functions must have the signature::\n\n def forward(values: array-like) -> array-like\n\n base : float, default: 10\n Logarithmic base of the scale.\n """\n forward, inverse = functions\n self.subs = None\n self._transform = FuncTransform(forward, inverse) + LogTransform(base)\n\n @property\n def base(self):\n return self._transform._b.base # Base of the LogTransform.\n\n def get_transform(self):\n """Return the `.Transform` associated with this scale."""\n return self._transform\n\n\nclass SymmetricalLogTransform(Transform):\n input_dims = output_dims = 1\n\n def __init__(self, base, linthresh, linscale):\n super().__init__()\n if base <= 1.0:\n raise ValueError("'base' must be larger than 1")\n if linthresh <= 0.0:\n raise ValueError("'linthresh' must be positive")\n if linscale <= 0.0:\n raise ValueError("'linscale' must be positive")\n self.base = base\n self.linthresh = linthresh\n self.linscale = linscale\n self._linscale_adj = (linscale / (1.0 - self.base ** -1))\n self._log_base = np.log(base)\n\n def transform_non_affine(self, values):\n abs_a = np.abs(values)\n with np.errstate(divide="ignore", invalid="ignore"):\n out = np.sign(values) * self.linthresh * (\n self._linscale_adj +\n np.log(abs_a / self.linthresh) / self._log_base)\n inside = abs_a <= self.linthresh\n out[inside] = values[inside] * self._linscale_adj\n return out\n\n def inverted(self):\n return InvertedSymmetricalLogTransform(self.base, self.linthresh,\n self.linscale)\n\n\nclass InvertedSymmetricalLogTransform(Transform):\n input_dims = output_dims = 1\n\n def __init__(self, base, linthresh, linscale):\n super().__init__()\n symlog = SymmetricalLogTransform(base, linthresh, linscale)\n self.base = base\n self.linthresh = linthresh\n self.invlinthresh = symlog.transform(linthresh)\n self.linscale = linscale\n self._linscale_adj = (linscale / (1.0 - self.base ** -1))\n\n def transform_non_affine(self, values):\n abs_a = 
np.abs(values)\n with np.errstate(divide="ignore", invalid="ignore"):\n out = np.sign(values) * self.linthresh * (\n np.power(self.base,\n abs_a / self.linthresh - self._linscale_adj))\n inside = abs_a <= self.invlinthresh\n out[inside] = values[inside] / self._linscale_adj\n return out\n\n def inverted(self):\n return SymmetricalLogTransform(self.base,\n self.linthresh, self.linscale)\n\n\nclass SymmetricalLogScale(ScaleBase):\n """\n The symmetrical logarithmic scale is logarithmic in both the\n positive and negative directions from the origin.\n\n Since the values close to zero tend toward infinity, there is a\n need to have a range around zero that is linear. The parameter\n *linthresh* allows the user to specify the size of this range\n (-*linthresh*, *linthresh*).\n\n See :doc:`/gallery/scales/symlog_demo` for a detailed description.\n\n Parameters\n ----------\n base : float, default: 10\n The base of the logarithm.\n\n linthresh : float, default: 2\n Defines the range ``(-x, x)``, within which the plot is linear.\n This avoids having the plot go to infinity around zero.\n\n subs : sequence of int\n Where to place the subticks between each major tick.\n For example, in a log10 scale: ``[2, 3, 4, 5, 6, 7, 8, 9]`` will place\n 8 logarithmically spaced minor ticks between each major tick.\n\n linscale : float, optional\n This allows the linear range ``(-linthresh, linthresh)`` to be\n stretched relative to the logarithmic range. Its value is the number of\n decades to use for each half of the linear range. 
For example, when\n *linscale* == 1.0 (the default), the space used for the positive and\n negative halves of the linear range will be equal to one decade in\n the logarithmic range.\n """\n name = 'symlog'\n\n def __init__(self, axis, *, base=10, linthresh=2, subs=None, linscale=1):\n self._transform = SymmetricalLogTransform(base, linthresh, linscale)\n self.subs = subs\n\n base = property(lambda self: self._transform.base)\n linthresh = property(lambda self: self._transform.linthresh)\n linscale = property(lambda self: self._transform.linscale)\n\n def set_default_locators_and_formatters(self, axis):\n # docstring inherited\n axis.set_major_locator(SymmetricalLogLocator(self.get_transform()))\n axis.set_major_formatter(LogFormatterSciNotation(self.base))\n axis.set_minor_locator(SymmetricalLogLocator(self.get_transform(),\n self.subs))\n axis.set_minor_formatter(NullFormatter())\n\n def get_transform(self):\n """Return the `.SymmetricalLogTransform` associated with this scale."""\n return self._transform\n\n\nclass AsinhTransform(Transform):\n """Inverse hyperbolic-sine transformation used by `.AsinhScale`"""\n input_dims = output_dims = 1\n\n def __init__(self, linear_width):\n super().__init__()\n if linear_width <= 0.0:\n raise ValueError("Scale parameter 'linear_width' " +\n "must be strictly positive")\n self.linear_width = linear_width\n\n def transform_non_affine(self, values):\n return self.linear_width * np.arcsinh(values / self.linear_width)\n\n def inverted(self):\n return InvertedAsinhTransform(self.linear_width)\n\n\nclass InvertedAsinhTransform(Transform):\n """Hyperbolic sine transformation used by `.AsinhScale`"""\n input_dims = output_dims = 1\n\n def __init__(self, linear_width):\n super().__init__()\n self.linear_width = linear_width\n\n def transform_non_affine(self, values):\n return self.linear_width * np.sinh(values / self.linear_width)\n\n def inverted(self):\n return AsinhTransform(self.linear_width)\n\n\nclass AsinhScale(ScaleBase):\n 
"""\n A quasi-logarithmic scale based on the inverse hyperbolic sine (asinh)\n\n For values close to zero, this is essentially a linear scale,\n but for large magnitude values (either positive or negative)\n it is asymptotically logarithmic. The transition between these\n linear and logarithmic regimes is smooth, and has no discontinuities\n in the function gradient in contrast to\n the `.SymmetricalLogScale` ("symlog") scale.\n\n Specifically, the transformation of an axis coordinate :math:`a` is\n :math:`a \\rightarrow a_0 \\sinh^{-1} (a / a_0)` where :math:`a_0`\n is the effective width of the linear region of the transformation.\n In that region, the transformation is\n :math:`a \\rightarrow a + \\mathcal{O}(a^3)`.\n For large values of :math:`a` the transformation behaves as\n :math:`a \\rightarrow a_0 \\, \\mathrm{sgn}(a) \\ln |a| + \\mathcal{O}(1)`.\n\n .. note::\n\n This API is provisional and may be revised in the future\n based on early user feedback.\n """\n\n name = 'asinh'\n\n auto_tick_multipliers = {\n 3: (2, ),\n 4: (2, ),\n 5: (2, ),\n 8: (2, 4),\n 10: (2, 5),\n 16: (2, 4, 8),\n 64: (4, 16),\n 1024: (256, 512)\n }\n\n def __init__(self, axis, *, linear_width=1.0,\n base=10, subs='auto', **kwargs):\n """\n Parameters\n ----------\n linear_width : float, default: 1\n The scale parameter (elsewhere referred to as :math:`a_0`)\n defining the extent of the quasi-linear region,\n and the coordinate values beyond which the transformation\n becomes asymptotically logarithmic.\n base : int, default: 10\n The number base used for rounding tick locations\n on a logarithmic scale. If this is less than one,\n then rounding is to the nearest integer multiple\n of powers of ten.\n subs : sequence of int\n Multiples of the number base used for minor ticks.\n If set to 'auto', this will use built-in defaults,\n e.g. 
(2, 5) for base=10.\n """\n super().__init__(axis)\n self._transform = AsinhTransform(linear_width)\n self._base = int(base)\n if subs == 'auto':\n self._subs = self.auto_tick_multipliers.get(self._base)\n else:\n self._subs = subs\n\n linear_width = property(lambda self: self._transform.linear_width)\n\n def get_transform(self):\n return self._transform\n\n def set_default_locators_and_formatters(self, axis):\n axis.set(major_locator=AsinhLocator(self.linear_width,\n base=self._base),\n minor_locator=AsinhLocator(self.linear_width,\n base=self._base,\n subs=self._subs),\n minor_formatter=NullFormatter())\n if self._base > 1:\n axis.set_major_formatter(LogFormatterSciNotation(self._base))\n else:\n axis.set_major_formatter('{x:.3g}')\n\n\nclass LogitTransform(Transform):\n input_dims = output_dims = 1\n\n def __init__(self, nonpositive='mask'):\n super().__init__()\n _api.check_in_list(['mask', 'clip'], nonpositive=nonpositive)\n self._nonpositive = nonpositive\n self._clip = {"clip": True, "mask": False}[nonpositive]\n\n def transform_non_affine(self, values):\n """logit transform (base 10), masked or clipped"""\n with np.errstate(divide="ignore", invalid="ignore"):\n out = np.log10(values / (1 - values))\n if self._clip: # See LogTransform for choice of clip value.\n out[values <= 0] = -1000\n out[1 <= values] = 1000\n return out\n\n def inverted(self):\n return LogisticTransform(self._nonpositive)\n\n def __str__(self):\n return f"{type(self).__name__}({self._nonpositive!r})"\n\n\nclass LogisticTransform(Transform):\n input_dims = output_dims = 1\n\n def __init__(self, nonpositive='mask'):\n super().__init__()\n self._nonpositive = nonpositive\n\n def transform_non_affine(self, values):\n """logistic transform (base 10)"""\n return 1.0 / (1 + 10**(-values))\n\n def inverted(self):\n return LogitTransform(self._nonpositive)\n\n def __str__(self):\n return f"{type(self).__name__}({self._nonpositive!r})"\n\n\nclass LogitScale(ScaleBase):\n """\n Logit scale for 
data between zero and one, both excluded.\n\n This scale is similar to a log scale close to zero and to one, and almost\n linear around 0.5. It maps the interval ]0, 1[ onto ]-infty, +infty[.\n """\n name = 'logit'\n\n def __init__(self, axis, nonpositive='mask', *,\n one_half=r"\frac{1}{2}", use_overline=False):\n r"""\n Parameters\n ----------\n axis : `~matplotlib.axis.Axis`\n Currently unused.\n nonpositive : {'mask', 'clip'}\n Determines the behavior for values beyond the open interval ]0, 1[.\n They can either be masked as invalid, or clipped to a number very\n close to 0 or 1.\n use_overline : bool, default: False\n Indicate the usage of survival notation (\overline{x}) in place of\n standard notation (1-x) for probability close to one.\n one_half : str, default: r"\frac{1}{2}"\n The string used for ticks formatter to represent 1/2.\n """\n self._transform = LogitTransform(nonpositive)\n self._use_overline = use_overline\n self._one_half = one_half\n\n def get_transform(self):\n """Return the `.LogitTransform` associated with this scale."""\n return self._transform\n\n def set_default_locators_and_formatters(self, axis):\n # docstring inherited\n # ..., 0.01, 0.1, 0.5, 0.9, 0.99, ...\n axis.set_major_locator(LogitLocator())\n axis.set_major_formatter(\n LogitFormatter(\n one_half=self._one_half,\n use_overline=self._use_overline\n )\n )\n axis.set_minor_locator(LogitLocator(minor=True))\n axis.set_minor_formatter(\n LogitFormatter(\n minor=True,\n one_half=self._one_half,\n use_overline=self._use_overline\n )\n )\n\n def limit_range_for_scale(self, vmin, vmax, minpos):\n """\n Limit the domain to values between 0 and 1 (excluded).\n """\n if not np.isfinite(minpos):\n minpos = 1e-7 # Should rarely (if ever) have a visible effect.\n return (minpos if vmin <= 0 else vmin,\n 1 - minpos if vmax >= 1 else vmax)\n\n\n_scale_mapping = {\n 'linear': LinearScale,\n 'log': LogScale,\n 'symlog': SymmetricalLogScale,\n 'asinh': AsinhScale,\n 'logit': LogitScale,\n 
'function': FuncScale,\n 'functionlog': FuncScaleLog,\n }\n\n\ndef get_scale_names():\n """Return the names of the available scales."""\n return sorted(_scale_mapping)\n\n\ndef scale_factory(scale, axis, **kwargs):\n """\n Return a scale class by name.\n\n Parameters\n ----------\n scale : {%(names)s}\n axis : `~matplotlib.axis.Axis`\n """\n scale_cls = _api.check_getitem(_scale_mapping, scale=scale)\n return scale_cls(axis, **kwargs)\n\n\nif scale_factory.__doc__:\n scale_factory.__doc__ = scale_factory.__doc__ % {\n "names": ", ".join(map(repr, get_scale_names()))}\n\n\ndef register_scale(scale_class):\n """\n Register a new kind of scale.\n\n Parameters\n ----------\n scale_class : subclass of `ScaleBase`\n The scale to register.\n """\n _scale_mapping[scale_class.name] = scale_class\n\n\ndef _get_scale_docs():\n """\n Helper function for generating docstrings related to scales.\n """\n docs = []\n for name, scale_class in _scale_mapping.items():\n docstring = inspect.getdoc(scale_class.__init__) or ""\n docs.extend([\n f" {name!r}",\n "",\n textwrap.indent(docstring, " " * 8),\n ""\n ])\n return "\n".join(docs)\n\n\n_docstring.interpd.register(\n scale_type='{%s}' % ', '.join([repr(x) for x in get_scale_names()]),\n scale_docs=_get_scale_docs().rstrip(),\n )\n | .venv\Lib\site-packages\matplotlib\scale.py | scale.py | Python
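The `register_scale` / `scale_factory` machinery above is the extension point for user-defined scales. A minimal sketch of how the pieces fit together: subclass `ScaleBase`, pair it with a `Transform` subclass, register it, and select it by name via `set_yscale()`. The `SqrtScale` here is a hypothetical example scale, not part of matplotlib; it assumes matplotlib is installed.

```python
import matplotlib
matplotlib.use("Agg")  # headless backend for the example
import numpy as np
import matplotlib.pyplot as plt
from matplotlib import scale as mscale
from matplotlib.ticker import AutoLocator, ScalarFormatter
from matplotlib.transforms import Transform


class SqrtTransform(Transform):
    input_dims = output_dims = 1

    def transform_non_affine(self, values):
        # Signed square root, so negative data stays finite.
        return np.sign(values) * np.sqrt(np.abs(values))

    def inverted(self):
        return SquareTransform()


class SquareTransform(Transform):
    input_dims = output_dims = 1

    def transform_non_affine(self, values):
        return np.sign(values) * values**2

    def inverted(self):
        return SqrtTransform()


class SqrtScale(mscale.ScaleBase):
    """Hypothetical square-root scale, for illustration only."""
    name = "sqrt"

    def get_transform(self):
        return SqrtTransform()

    def set_default_locators_and_formatters(self, axis):
        axis.set_major_locator(AutoLocator())
        axis.set_major_formatter(ScalarFormatter())


mscale.register_scale(SqrtScale)
assert "sqrt" in mscale.get_scale_names()

fig, ax = plt.subplots()
ax.plot([0, 1, 4, 9], [0, 1, 4, 9])
ax.set_yscale("sqrt")  # looked up in _scale_mapping via scale_factory
assert ax.get_yscale() == "sqrt"
```

Because `scale_factory` instantiates the class with `scale_cls(axis, **kwargs)`, any keyword arguments passed to `set_yscale` are forwarded to the scale's `__init__`.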
from matplotlib.axis import Axis\nfrom matplotlib.transforms import Transform\n\nfrom collections.abc import Callable, Iterable\nfrom typing import Literal\nfrom numpy.typing import ArrayLike\n\nclass ScaleBase:\n def __init__(self, axis: Axis | None) -> None: ...\n def get_transform(self) -> Transform: ...\n def set_default_locators_and_formatters(self, axis: Axis) -> None: ...\n def limit_range_for_scale(\n self, vmin: float, vmax: float, minpos: float\n ) -> tuple[float, float]: ...\n\nclass LinearScale(ScaleBase):\n name: str\n\nclass FuncTransform(Transform):\n input_dims: int\n output_dims: int\n def __init__(\n self,\n forward: Callable[[ArrayLike], ArrayLike],\n inverse: Callable[[ArrayLike], ArrayLike],\n ) -> None: ...\n def inverted(self) -> FuncTransform: ...\n\nclass FuncScale(ScaleBase):\n name: str\n def __init__(\n self,\n axis: Axis | None,\n functions: tuple[\n Callable[[ArrayLike], ArrayLike], Callable[[ArrayLike], ArrayLike]\n ],\n ) -> None: ...\n\nclass LogTransform(Transform):\n input_dims: int\n output_dims: int\n base: float\n def __init__(\n self, base: float, nonpositive: Literal["clip", "mask"] = ...\n ) -> None: ...\n def inverted(self) -> InvertedLogTransform: ...\n\nclass InvertedLogTransform(Transform):\n input_dims: int\n output_dims: int\n base: float\n def __init__(self, base: float) -> None: ...\n def inverted(self) -> LogTransform: ...\n\nclass LogScale(ScaleBase):\n name: str\n subs: Iterable[int] | None\n def __init__(\n self,\n axis: Axis | None,\n *,\n base: float = ...,\n subs: Iterable[int] | None = ...,\n nonpositive: Literal["clip", "mask"] = ...\n ) -> None: ...\n @property\n def base(self) -> float: ...\n def get_transform(self) -> Transform: ...\n\nclass FuncScaleLog(LogScale):\n def __init__(\n self,\n axis: Axis | None,\n functions: tuple[\n Callable[[ArrayLike], ArrayLike], Callable[[ArrayLike], ArrayLike]\n ],\n base: float = ...,\n ) -> None: ...\n @property\n def base(self) -> float: ...\n def 
get_transform(self) -> Transform: ...\n\nclass SymmetricalLogTransform(Transform):\n input_dims: int\n output_dims: int\n base: float\n linthresh: float\n linscale: float\n def __init__(self, base: float, linthresh: float, linscale: float) -> None: ...\n def inverted(self) -> InvertedSymmetricalLogTransform: ...\n\nclass InvertedSymmetricalLogTransform(Transform):\n input_dims: int\n output_dims: int\n base: float\n linthresh: float\n invlinthresh: float\n linscale: float\n def __init__(self, base: float, linthresh: float, linscale: float) -> None: ...\n def inverted(self) -> SymmetricalLogTransform: ...\n\nclass SymmetricalLogScale(ScaleBase):\n name: str\n subs: Iterable[int] | None\n def __init__(\n self,\n axis: Axis | None,\n *,\n base: float = ...,\n linthresh: float = ...,\n subs: Iterable[int] | None = ...,\n linscale: float = ...\n ) -> None: ...\n @property\n def base(self) -> float: ...\n @property\n def linthresh(self) -> float: ...\n @property\n def linscale(self) -> float: ...\n def get_transform(self) -> SymmetricalLogTransform: ...\n\nclass AsinhTransform(Transform):\n input_dims: int\n output_dims: int\n linear_width: float\n def __init__(self, linear_width: float) -> None: ...\n def inverted(self) -> InvertedAsinhTransform: ...\n\nclass InvertedAsinhTransform(Transform):\n input_dims: int\n output_dims: int\n linear_width: float\n def __init__(self, linear_width: float) -> None: ...\n def inverted(self) -> AsinhTransform: ...\n\nclass AsinhScale(ScaleBase):\n name: str\n auto_tick_multipliers: dict[int, tuple[int, ...]]\n def __init__(\n self,\n axis: Axis | None,\n *,\n linear_width: float = ...,\n base: float = ...,\n subs: Iterable[int] | Literal["auto"] | None = ...,\n **kwargs\n ) -> None: ...\n @property\n def linear_width(self) -> float: ...\n def get_transform(self) -> AsinhTransform: ...\n\nclass LogitTransform(Transform):\n input_dims: int\n output_dims: int\n def __init__(self, nonpositive: Literal["mask", "clip"] = ...) 
-> None: ...\n def inverted(self) -> LogisticTransform: ...\n\nclass LogisticTransform(Transform):\n input_dims: int\n output_dims: int\n def __init__(self, nonpositive: Literal["mask", "clip"] = ...) -> None: ...\n def inverted(self) -> LogitTransform: ...\n\nclass LogitScale(ScaleBase):\n name: str\n def __init__(\n self,\n axis: Axis | None,\n nonpositive: Literal["mask", "clip"] = ...,\n *,\n one_half: str = ...,\n use_overline: bool = ...\n ) -> None: ...\n def get_transform(self) -> LogitTransform: ...\n\ndef get_scale_names() -> list[str]: ...\ndef scale_factory(scale: str, axis: Axis, **kwargs) -> ScaleBase: ...\ndef register_scale(scale_class: type[ScaleBase]) -> None: ...\n | .venv\Lib\site-packages\matplotlib\scale.pyi | scale.pyi
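A quick sketch exercising the public surface the stub file declares (assuming matplotlib is installed): `LogTransform` with `nonpositive="clip"` maps nonpositive input to the sentinel value -1000 described in `scale.py`, and `inverted()` round-trips positive values.

```python
import numpy as np
from matplotlib.scale import LogTransform, get_scale_names

tr = LogTransform(base=10, nonpositive="clip")
out = tr.transform_non_affine(np.array([0.0, 1.0, 100.0]))
assert out[0] == -1000            # nonpositive input is clipped, not masked
assert out[1] == 0.0 and out[2] == 2.0   # log10(1), log10(100)

# inverted() returns an InvertedLogTransform; the pair round-trips.
inv = tr.inverted()
x = np.array([0.1, 1.0, 10.0])
assert np.allclose(inv.transform_non_affine(tr.transform_non_affine(x)), x)

# The registry exposed by get_scale_names() includes the built-in scales.
assert {"linear", "log", "symlog", "logit"} <= set(get_scale_names())
```

With `nonpositive="mask"` the same inputs would instead come back as NaN/-inf (invalid), which matplotlib then drops from the plot.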
from collections.abc import MutableMapping\nimport functools\n\nimport numpy as np\n\nimport matplotlib as mpl\nfrom matplotlib import _api, _docstring\nfrom matplotlib.artist import allow_rasterization\nimport matplotlib.transforms as mtransforms\nimport matplotlib.patches as mpatches\nimport matplotlib.path as mpath\n\n\nclass Spine(mpatches.Patch):\n """\n An axis spine -- the line noting the data area boundaries.\n\n Spines are the lines connecting the axis tick marks and noting the\n boundaries of the data area. They can be placed at arbitrary\n positions. See `~.Spine.set_position` for more information.\n\n The default position is ``('outward', 0)``.\n\n Spines are subclasses of `.Patch`, and inherit much of their behavior.\n\n Spines draw a line, a circle, or an arc depending on if\n `~.Spine.set_patch_line`, `~.Spine.set_patch_circle`, or\n `~.Spine.set_patch_arc` has been called. Line-like is the default.\n\n For examples see :ref:`spines_examples`.\n """\n def __str__(self):\n return "Spine"\n\n @_docstring.interpd\n def __init__(self, axes, spine_type, path, **kwargs):\n """\n Parameters\n ----------\n axes : `~matplotlib.axes.Axes`\n The `~.axes.Axes` instance containing the spine.\n spine_type : str\n The spine type.\n path : `~matplotlib.path.Path`\n The `.Path` instance used to draw the spine.\n\n Other Parameters\n ----------------\n **kwargs\n Valid keyword arguments are:\n\n %(Patch:kwdoc)s\n """\n super().__init__(**kwargs)\n self.axes = axes\n self.set_figure(self.axes.get_figure(root=False))\n self.spine_type = spine_type\n self.set_facecolor('none')\n self.set_edgecolor(mpl.rcParams['axes.edgecolor'])\n self.set_linewidth(mpl.rcParams['axes.linewidth'])\n self.set_capstyle('projecting')\n self.axis = None\n\n self.set_zorder(2.5)\n self.set_transform(self.axes.transData) # default transform\n\n self._bounds = None # default bounds\n\n # Defer initial position determination. 
(Not much support for\n # non-rectangular axes is currently implemented, and this lets\n # them pass through the spines machinery without errors.)\n self._position = None\n _api.check_isinstance(mpath.Path, path=path)\n self._path = path\n\n # To support drawing both linear and circular spines, this\n # class implements Patch behavior three ways. If\n # self._patch_type == 'line', behave like a mpatches.PathPatch\n # instance. If self._patch_type == 'circle', behave like a\n # mpatches.Ellipse instance. If self._patch_type == 'arc', behave like\n # a mpatches.Arc instance.\n self._patch_type = 'line'\n\n # Behavior copied from mpatches.Ellipse:\n # Note: This cannot be calculated until this is added to an Axes\n self._patch_transform = mtransforms.IdentityTransform()\n\n def set_patch_arc(self, center, radius, theta1, theta2):\n """Set the spine to be arc-like."""\n self._patch_type = 'arc'\n self._center = center\n self._width = radius * 2\n self._height = radius * 2\n self._theta1 = theta1\n self._theta2 = theta2\n self._path = mpath.Path.arc(theta1, theta2)\n # arc drawn on axes transform\n self.set_transform(self.axes.transAxes)\n self.stale = True\n\n def set_patch_circle(self, center, radius):\n """Set the spine to be circular."""\n self._patch_type = 'circle'\n self._center = center\n self._width = radius * 2\n self._height = radius * 2\n # circle drawn on axes transform\n self.set_transform(self.axes.transAxes)\n self.stale = True\n\n def set_patch_line(self):\n """Set the spine to be linear."""\n self._patch_type = 'line'\n self.stale = True\n\n # Behavior copied from mpatches.Ellipse:\n def _recompute_transform(self):\n """\n Notes\n -----\n This cannot be called until after this has been added to an Axes,\n otherwise unit conversion will fail. 
This makes it very important to\n call the accessor method and not directly access the transformation\n member variable.\n """\n assert self._patch_type in ('arc', 'circle')\n center = (self.convert_xunits(self._center[0]),\n self.convert_yunits(self._center[1]))\n width = self.convert_xunits(self._width)\n height = self.convert_yunits(self._height)\n self._patch_transform = mtransforms.Affine2D() \\n .scale(width * 0.5, height * 0.5) \\n .translate(*center)\n\n def get_patch_transform(self):\n if self._patch_type in ('arc', 'circle'):\n self._recompute_transform()\n return self._patch_transform\n else:\n return super().get_patch_transform()\n\n def get_window_extent(self, renderer=None):\n """\n Return the window extent of the spines in display space, including\n padding for ticks (but not their labels)\n\n See Also\n --------\n matplotlib.axes.Axes.get_tightbbox\n matplotlib.axes.Axes.get_window_extent\n """\n # make sure the location is updated so that transforms etc are correct:\n self._adjust_location()\n bb = super().get_window_extent(renderer=renderer)\n if self.axis is None or not self.axis.get_visible():\n return bb\n bboxes = [bb]\n drawn_ticks = self.axis._update_ticks()\n\n major_tick = next(iter({*drawn_ticks} & {*self.axis.majorTicks}), None)\n minor_tick = next(iter({*drawn_ticks} & {*self.axis.minorTicks}), None)\n for tick in [major_tick, minor_tick]:\n if tick is None:\n continue\n bb0 = bb.frozen()\n tickl = tick._size\n tickdir = tick._tickdir\n if tickdir == 'out':\n padout = 1\n padin = 0\n elif tickdir == 'in':\n padout = 0\n padin = 1\n else:\n padout = 0.5\n padin = 0.5\n dpi = self.get_figure(root=True).dpi\n padout = padout * tickl / 72 * dpi\n padin = padin * tickl / 72 * dpi\n\n if tick.tick1line.get_visible():\n if self.spine_type == 'left':\n bb0.x0 = bb0.x0 - padout\n bb0.x1 = bb0.x1 + padin\n elif self.spine_type == 'bottom':\n bb0.y0 = bb0.y0 - padout\n bb0.y1 = bb0.y1 + padin\n\n if tick.tick2line.get_visible():\n if 
self.spine_type == 'right':\n bb0.x1 = bb0.x1 + padout\n bb0.x0 = bb0.x0 - padin\n elif self.spine_type == 'top':\n bb0.y1 = bb0.y1 + padout\n bb0.y0 = bb0.y0 - padout\n bboxes.append(bb0)\n\n return mtransforms.Bbox.union(bboxes)\n\n def get_path(self):\n return self._path\n\n def _ensure_position_is_set(self):\n if self._position is None:\n # default position\n self._position = ('outward', 0.0) # in points\n self.set_position(self._position)\n\n def register_axis(self, axis):\n """\n Register an axis.\n\n An axis should be registered with its corresponding spine from\n the Axes instance. This allows the spine to clear any axis\n properties when needed.\n """\n self.axis = axis\n self.stale = True\n\n def clear(self):\n """Clear the current spine."""\n self._clear()\n if self.axis is not None:\n self.axis.clear()\n\n def _clear(self):\n """\n Clear things directly related to the spine.\n\n In this way it is possible to avoid clearing the Axis as well when calling\n from library code where it is known that the Axis is cleared separately.\n """\n self._position = None # clear position\n\n def _adjust_location(self):\n """Automatically set spine bounds to the view interval."""\n\n if self.spine_type == 'circle':\n return\n\n if self._bounds is not None:\n low, high = self._bounds\n elif self.spine_type in ('left', 'right'):\n low, high = self.axes.viewLim.intervaly\n elif self.spine_type in ('top', 'bottom'):\n low, high = self.axes.viewLim.intervalx\n else:\n raise ValueError(f'unknown spine spine_type: {self.spine_type}')\n\n if self._patch_type == 'arc':\n if self.spine_type in ('bottom', 'top'):\n try:\n direction = self.axes.get_theta_direction()\n except AttributeError:\n direction = 1\n try:\n offset = self.axes.get_theta_offset()\n except AttributeError:\n offset = 0\n low = low * direction + offset\n high = high * direction + offset\n if low > high:\n low, high = high, low\n\n self._path = mpath.Path.arc(np.rad2deg(low), np.rad2deg(high))\n\n if 
self.spine_type == 'bottom':\n rmin, rmax = self.axes.viewLim.intervaly\n try:\n rorigin = self.axes.get_rorigin()\n except AttributeError:\n rorigin = rmin\n scaled_diameter = (rmin - rorigin) / (rmax - rorigin)\n self._height = scaled_diameter\n self._width = scaled_diameter\n\n else:\n raise ValueError('unable to set bounds for spine "%s"' %\n self.spine_type)\n else:\n v1 = self._path.vertices\n assert v1.shape == (2, 2), 'unexpected vertices shape'\n if self.spine_type in ['left', 'right']:\n v1[0, 1] = low\n v1[1, 1] = high\n elif self.spine_type in ['bottom', 'top']:\n v1[0, 0] = low\n v1[1, 0] = high\n else:\n raise ValueError('unable to set bounds for spine "%s"' %\n self.spine_type)\n\n @allow_rasterization\n def draw(self, renderer):\n self._adjust_location()\n ret = super().draw(renderer)\n self.stale = False\n return ret\n\n def set_position(self, position):\n """\n Set the position of the spine.\n\n Spine position is specified by a 2 tuple of (position type,\n amount). The position types are:\n\n * 'outward': place the spine out from the data area by the specified\n number of points. 
(Negative values place the spine inwards.)\n * 'axes': place the spine at the specified Axes coordinate (0 to 1).\n * 'data': place the spine at the specified data coordinate.\n\n Additionally, shorthand notations define a special positions:\n\n * 'center' -> ``('axes', 0.5)``\n * 'zero' -> ``('data', 0.0)``\n\n Examples\n --------\n :doc:`/gallery/spines/spine_placement_demo`\n """\n if position in ('center', 'zero'): # special positions\n pass\n else:\n if len(position) != 2:\n raise ValueError("position should be 'center' or 2-tuple")\n if position[0] not in ['outward', 'axes', 'data']:\n raise ValueError("position[0] should be one of 'outward', "\n "'axes', or 'data' ")\n self._position = position\n self.set_transform(self.get_spine_transform())\n if self.axis is not None:\n self.axis.reset_ticks()\n self.stale = True\n\n def get_position(self):\n """Return the spine position."""\n self._ensure_position_is_set()\n return self._position\n\n def get_spine_transform(self):\n """Return the spine transform."""\n self._ensure_position_is_set()\n\n position = self._position\n if isinstance(position, str):\n if position == 'center':\n position = ('axes', 0.5)\n elif position == 'zero':\n position = ('data', 0)\n assert len(position) == 2, 'position should be 2-tuple'\n position_type, amount = position\n _api.check_in_list(['axes', 'outward', 'data'],\n position_type=position_type)\n if self.spine_type in ['left', 'right']:\n base_transform = self.axes.get_yaxis_transform(which='grid')\n elif self.spine_type in ['top', 'bottom']:\n base_transform = self.axes.get_xaxis_transform(which='grid')\n else:\n raise ValueError(f'unknown spine spine_type: {self.spine_type!r}')\n\n if position_type == 'outward':\n if amount == 0: # short circuit commonest case\n return base_transform\n else:\n offset_vec = {'left': (-1, 0), 'right': (1, 0),\n 'bottom': (0, -1), 'top': (0, 1),\n }[self.spine_type]\n # calculate x and y offset in dots\n offset_dots = amount * np.array(offset_vec) / 
72\n return (base_transform\n + mtransforms.ScaledTranslation(\n *offset_dots, self.get_figure(root=False).dpi_scale_trans))\n elif position_type == 'axes':\n if self.spine_type in ['left', 'right']:\n # keep y unchanged, fix x at amount\n return (mtransforms.Affine2D.from_values(0, 0, 0, 1, amount, 0)\n + base_transform)\n elif self.spine_type in ['bottom', 'top']:\n # keep x unchanged, fix y at amount\n return (mtransforms.Affine2D.from_values(1, 0, 0, 0, 0, amount)\n + base_transform)\n elif position_type == 'data':\n if self.spine_type in ('right', 'top'):\n # The right and top spines have a default position of 1 in\n # axes coordinates. When specifying the position in data\n # coordinates, we need to calculate the position relative to 0.\n amount -= 1\n if self.spine_type in ('left', 'right'):\n return mtransforms.blended_transform_factory(\n mtransforms.Affine2D().translate(amount, 0)\n + self.axes.transData,\n self.axes.transData)\n elif self.spine_type in ('bottom', 'top'):\n return mtransforms.blended_transform_factory(\n self.axes.transData,\n mtransforms.Affine2D().translate(0, amount)\n + self.axes.transData)\n\n def set_bounds(self, low=None, high=None):\n """\n Set the spine bounds.\n\n Parameters\n ----------\n low : float or None, optional\n The lower spine bound. Passing *None* leaves the limit unchanged.\n\n The bounds may also be passed as the tuple (*low*, *high*) as the\n first positional argument.\n\n .. ACCEPTS: (low: float, high: float)\n\n high : float or None, optional\n The higher spine bound. 
Passing *None* leaves the limit unchanged.\n """\n if self.spine_type == 'circle':\n raise ValueError(\n 'set_bounds() method incompatible with circular spines')\n if high is None and np.iterable(low):\n low, high = low\n old_low, old_high = self.get_bounds() or (None, None)\n if low is None:\n low = old_low\n if high is None:\n high = old_high\n self._bounds = (low, high)\n self.stale = True\n\n def get_bounds(self):\n """Get the bounds of the spine."""\n return self._bounds\n\n @classmethod\n def linear_spine(cls, axes, spine_type, **kwargs):\n """Create and return a linear `Spine`."""\n # all values of 0.999 get replaced upon call to set_bounds()\n if spine_type == 'left':\n path = mpath.Path([(0.0, 0.999), (0.0, 0.999)])\n elif spine_type == 'right':\n path = mpath.Path([(1.0, 0.999), (1.0, 0.999)])\n elif spine_type == 'bottom':\n path = mpath.Path([(0.999, 0.0), (0.999, 0.0)])\n elif spine_type == 'top':\n path = mpath.Path([(0.999, 1.0), (0.999, 1.0)])\n else:\n raise ValueError('unable to make path for spine "%s"' % spine_type)\n result = cls(axes, spine_type, path, **kwargs)\n result.set_visible(mpl.rcParams[f'axes.spines.{spine_type}'])\n\n return result\n\n @classmethod\n def arc_spine(cls, axes, spine_type, center, radius, theta1, theta2,\n **kwargs):\n """Create and return an arc `Spine`."""\n path = mpath.Path.arc(theta1, theta2)\n result = cls(axes, spine_type, path, **kwargs)\n result.set_patch_arc(center, radius, theta1, theta2)\n return result\n\n @classmethod\n def circular_spine(cls, axes, center, radius, **kwargs):\n """Create and return a circular `Spine`."""\n path = mpath.Path.unit_circle()\n spine_type = 'circle'\n result = cls(axes, spine_type, path, **kwargs)\n result.set_patch_circle(center, radius)\n return result\n\n def set_color(self, c):\n """\n Set the edgecolor.\n\n Parameters\n ----------\n c : :mpltype:`color`\n\n Notes\n -----\n This method does not modify the facecolor (which defaults to "none"),\n unlike the 
`.Patch.set_color` method defined in the parent class. Use\n `.Patch.set_facecolor` to set the facecolor.\n """\n self.set_edgecolor(c)\n self.stale = True\n\n\nclass SpinesProxy:\n """\n A proxy to broadcast ``set_*()`` and ``set()`` method calls to contained `.Spines`.\n\n The proxy cannot be used for any other operations on its members.\n\n The supported methods are determined dynamically based on the contained\n spines. If not all spines support a given method, it's executed only on\n the subset of spines that support it.\n """\n def __init__(self, spine_dict):\n self._spine_dict = spine_dict\n\n def __getattr__(self, name):\n broadcast_targets = [spine for spine in self._spine_dict.values()\n if hasattr(spine, name)]\n if (name != 'set' and not name.startswith('set_')) or not broadcast_targets:\n raise AttributeError(\n f"'SpinesProxy' object has no attribute '{name}'")\n\n def x(_targets, _funcname, *args, **kwargs):\n for spine in _targets:\n getattr(spine, _funcname)(*args, **kwargs)\n x = functools.partial(x, broadcast_targets, name)\n x.__doc__ = broadcast_targets[0].__doc__\n return x\n\n def __dir__(self):\n names = []\n for spine in self._spine_dict.values():\n names.extend(name\n for name in dir(spine) if name.startswith('set_'))\n return list(sorted(set(names)))\n\n\nclass Spines(MutableMapping):\n r"""\n The container of all `.Spine`\s in an Axes.\n\n The interface is dict-like mapping names (e.g. 
'left') to `.Spine` objects.\n Additionally, it implements some pandas.Series-like features like accessing\n elements by attribute::\n\n spines['top'].set_visible(False)\n spines.top.set_visible(False)\n\n Multiple spines can be addressed simultaneously by passing a list::\n\n spines[['top', 'right']].set_visible(False)\n\n Use an open slice to address all spines::\n\n spines[:].set_visible(False)\n\n The latter two indexing methods will return a `SpinesProxy` that broadcasts all\n ``set_*()`` and ``set()`` calls to its members, but cannot be used for any other\n operation.\n """\n def __init__(self, **kwargs):\n self._dict = kwargs\n\n @classmethod\n def from_dict(cls, d):\n return cls(**d)\n\n def __getstate__(self):\n return self._dict\n\n def __setstate__(self, state):\n self.__init__(**state)\n\n def __getattr__(self, name):\n try:\n return self._dict[name]\n except KeyError:\n raise AttributeError(\n f"'Spines' object does not contain a '{name}' spine")\n\n def __getitem__(self, key):\n if isinstance(key, list):\n unknown_keys = [k for k in key if k not in self._dict]\n if unknown_keys:\n raise KeyError(', '.join(unknown_keys))\n return SpinesProxy({k: v for k, v in self._dict.items()\n if k in key})\n if isinstance(key, tuple):\n raise ValueError('Multiple spines must be passed as a single list')\n if isinstance(key, slice):\n if key.start is None and key.stop is None and key.step is None:\n return SpinesProxy(self._dict)\n else:\n raise ValueError(\n 'Spines does not support slicing except for the fully '\n 'open slice [:] to access all spines.')\n return self._dict[key]\n\n def __setitem__(self, key, value):\n # TODO: Do we want to deprecate adding spines?\n self._dict[key] = value\n\n def __delitem__(self, key):\n # TODO: Do we want to deprecate deleting spines?\n del self._dict[key]\n\n def __iter__(self):\n return iter(self._dict)\n\n def __len__(self):\n return len(self._dict)\n | .venv\Lib\site-packages\matplotlib\spines.py | spines.py | Python | 
21,643 | 0.95 | 0.181208 | 0.065347 | python-kit | 92 | 2023-07-15T18:50:45.742644 | MIT | false | b9a8cdb0ba337165188062347b2c96dd |
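The `SpinesProxy.__getattr__` in the file above dynamically broadcasts ``set_*()`` calls to every contained spine that supports the named method. A minimal standalone sketch of that pattern, using a hypothetical `FakeSpine` stand-in rather than real matplotlib `Spine` objects:

```python
import functools

class FakeSpine:
    """Hypothetical stand-in for matplotlib's Spine; only tracks visibility."""
    def __init__(self):
        self.visible = True

    def set_visible(self, b):
        self.visible = b

class MiniProxy:
    """Broadcasts set_*() calls to all contained spines, like SpinesProxy."""
    def __init__(self, spine_dict):
        self._spine_dict = spine_dict

    def __getattr__(self, name):
        # Only broadcast to spines that actually have this setter.
        targets = [s for s in self._spine_dict.values() if hasattr(s, name)]
        if not name.startswith('set_') or not targets:
            raise AttributeError(f"'MiniProxy' object has no attribute {name!r}")

        def broadcast(_targets, _funcname, *args, **kwargs):
            for spine in _targets:
                getattr(spine, _funcname)(*args, **kwargs)
        return functools.partial(broadcast, targets, name)

spines = {'top': FakeSpine(), 'right': FakeSpine()}
MiniProxy(spines).set_visible(False)   # one call hides both spines
```

This mirrors why `spines[['top', 'right']].set_visible(False)` works in the real `Spines` container: indexing with a list returns a proxy, and attribute lookup on the proxy fans the call out to each member.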
from collections.abc import Callable, Iterator, MutableMapping\nfrom typing import Literal, TypeVar, overload\n\nimport matplotlib.patches as mpatches\nfrom matplotlib.axes import Axes\nfrom matplotlib.axis import Axis\nfrom matplotlib.path import Path\nfrom matplotlib.transforms import Transform\nfrom matplotlib.typing import ColorType\n\nclass Spine(mpatches.Patch):\n axes: Axes\n spine_type: str\n axis: Axis | None\n def __init__(self, axes: Axes, spine_type: str, path: Path, **kwargs) -> None: ...\n def set_patch_arc(\n self, center: tuple[float, float], radius: float, theta1: float, theta2: float\n ) -> None: ...\n def set_patch_circle(self, center: tuple[float, float], radius: float) -> None: ...\n def set_patch_line(self) -> None: ...\n def get_patch_transform(self) -> Transform: ...\n def get_path(self) -> Path: ...\n def register_axis(self, axis: Axis) -> None: ...\n def clear(self) -> None: ...\n def set_position(\n self,\n position: Literal["center", "zero"]\n | tuple[Literal["outward", "axes", "data"], float],\n ) -> None: ...\n def get_position(\n self,\n ) -> Literal["center", "zero"] | tuple[\n Literal["outward", "axes", "data"], float\n ]: ...\n def get_spine_transform(self) -> Transform: ...\n def set_bounds(self, low: float | None = ..., high: float | None = ...) 
-> None: ...\n def get_bounds(self) -> tuple[float, float]: ...\n\n _T = TypeVar("_T", bound=Spine)\n @classmethod\n def linear_spine(\n cls: type[_T],\n axes: Axes,\n spine_type: Literal["left", "right", "bottom", "top"],\n **kwargs\n ) -> _T: ...\n @classmethod\n def arc_spine(\n cls: type[_T],\n axes: Axes,\n spine_type: Literal["left", "right", "bottom", "top"],\n center: tuple[float, float],\n radius: float,\n theta1: float,\n theta2: float,\n **kwargs\n ) -> _T: ...\n @classmethod\n def circular_spine(\n cls: type[_T], axes: Axes, center: tuple[float, float], radius: float, **kwargs\n ) -> _T: ...\n def set_color(self, c: ColorType | None) -> None: ...\n\nclass SpinesProxy:\n def __init__(self, spine_dict: dict[str, Spine]) -> None: ...\n def __getattr__(self, name: str) -> Callable[..., None]: ...\n def __dir__(self) -> list[str]: ...\n\nclass Spines(MutableMapping[str, Spine]):\n def __init__(self, **kwargs: Spine) -> None: ...\n @classmethod\n def from_dict(cls, d: dict[str, Spine]) -> Spines: ...\n def __getattr__(self, name: str) -> Spine: ...\n @overload\n def __getitem__(self, key: str) -> Spine: ...\n @overload\n def __getitem__(self, key: list[str]) -> SpinesProxy: ...\n @overload\n def __getitem__(self, key: slice) -> SpinesProxy: ...\n def __setitem__(self, key: str, value: Spine) -> None: ...\n def __delitem__(self, key: str) -> None: ...\n def __iter__(self) -> Iterator[str]: ...\n def __len__(self) -> int: ...\n | .venv\Lib\site-packages\matplotlib\spines.pyi | spines.pyi | Other | 2,951 | 0.85 | 0.39759 | 0.025641 | node-utils | 824 | 2025-03-17T22:35:11.672835 | MIT | false | b7da98f445ffa1cbdac4abf6d8524b4a |
"""\nStacked area plot for 1D arrays inspired by Douglas Y'barbo's stackoverflow\nanswer:\nhttps://stackoverflow.com/q/2225995/\n\n(https://stackoverflow.com/users/66549/doug)\n"""\n\nimport itertools\n\nimport numpy as np\n\nfrom matplotlib import _api\n\n__all__ = ['stackplot']\n\n\ndef stackplot(axes, x, *args,\n labels=(), colors=None, hatch=None, baseline='zero',\n **kwargs):\n """\n Draw a stacked area plot or a streamgraph.\n\n Parameters\n ----------\n x : (N,) array-like\n\n y : (M, N) array-like\n The data can be either stacked or unstacked. Each of the following\n calls is legal::\n\n stackplot(x, y) # where y has shape (M, N) e.g. y = [y1, y2, y3, y4]\n stackplot(x, y1, y2, y3, y4) # where y1, y2, y3, y4 have length N\n\n baseline : {'zero', 'sym', 'wiggle', 'weighted_wiggle'}\n Method used to calculate the baseline:\n\n - ``'zero'``: Constant zero baseline, i.e. a simple stacked plot.\n - ``'sym'``: Symmetric around zero and is sometimes called\n 'ThemeRiver'.\n - ``'wiggle'``: Minimizes the sum of the squared slopes.\n - ``'weighted_wiggle'``: Does the same but weights to account for\n size of each layer. It is also called 'Streamgraph'-layout. More\n details can be found at http://leebyron.com/streamgraph/.\n\n labels : list of str, optional\n A sequence of labels to assign to each data series. If unspecified,\n then no labels will be applied to artists.\n\n colors : list of :mpltype:`color`, optional\n A sequence of colors to be cycled through and used to color the stacked\n areas. The sequence need not be exactly the same length as the number\n of provided *y*, in which case the colors will repeat from the\n beginning.\n\n If not specified, the colors from the Axes property cycle will be used.\n\n hatch : list of str, default: None\n A sequence of hatching styles. 
See\n :doc:`/gallery/shapes_and_collections/hatch_style_reference`.\n The sequence will be cycled through for filling the\n stacked areas from bottom to top.\n It need not be exactly the same length as the number\n of provided *y*, in which case the styles will repeat from the\n beginning.\n\n .. versionadded:: 3.9\n Support for list input\n\n data : indexable object, optional\n DATA_PARAMETER_PLACEHOLDER\n\n **kwargs\n All other keyword arguments are passed to `.Axes.fill_between`.\n\n Returns\n -------\n list of `.PolyCollection`\n A list of `.PolyCollection` instances, one for each element in the\n stacked area plot.\n """\n\n y = np.vstack(args)\n\n labels = iter(labels)\n if colors is not None:\n colors = itertools.cycle(colors)\n else:\n colors = (axes._get_lines.get_next_color() for _ in y)\n\n if hatch is None or isinstance(hatch, str):\n hatch = itertools.cycle([hatch])\n else:\n hatch = itertools.cycle(hatch)\n\n # Assume data passed has not been 'stacked', so stack it here.\n # We'll need a float buffer for the upcoming calculations.\n stack = np.cumsum(y, axis=0, dtype=np.promote_types(y.dtype, np.float32))\n\n _api.check_in_list(['zero', 'sym', 'wiggle', 'weighted_wiggle'],\n baseline=baseline)\n if baseline == 'zero':\n first_line = 0.\n\n elif baseline == 'sym':\n first_line = -np.sum(y, 0) * 0.5\n stack += first_line[None, :]\n\n elif baseline == 'wiggle':\n m = y.shape[0]\n first_line = (y * (m - 0.5 - np.arange(m)[:, None])).sum(0)\n first_line /= -m\n stack += first_line\n\n elif baseline == 'weighted_wiggle':\n total = np.sum(y, 0)\n # multiply by 1/total (or zero) to avoid infinities in the division:\n inv_total = np.zeros_like(total)\n mask = total > 0\n inv_total[mask] = 1.0 / total[mask]\n increase = np.hstack((y[:, 0:1], np.diff(y)))\n below_size = total - stack\n below_size += 0.5 * y\n move_up = below_size * inv_total\n move_up[:, 0] = 0.5\n center = (move_up - 0.5) * increase\n center = np.cumsum(center.sum(0))\n first_line = center - 
0.5 * total\n stack += first_line\n\n # Color between x = 0 and the first array.\n coll = axes.fill_between(x, first_line, stack[0, :],\n facecolor=next(colors),\n hatch=next(hatch),\n label=next(labels, None),\n **kwargs)\n coll.sticky_edges.y[:] = [0]\n r = [coll]\n\n # Color between array i-1 and array i\n for i in range(len(y) - 1):\n r.append(axes.fill_between(x, stack[i, :], stack[i + 1, :],\n facecolor=next(colors),\n hatch=next(hatch),\n label=next(labels, None),\n **kwargs))\n return r\n | .venv\Lib\site-packages\matplotlib\stackplot.py | stackplot.py | Python | 4,997 | 0.95 | 0.081633 | 0.076923 | react-lib | 319 | 2024-03-25T15:12:42.280263 | GPL-3.0 | false | 80a2d3daafa95d79b428b4a80f766ca3 |
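The baseline arithmetic in `stackplot` above is easy to check in isolation. For instance, with ``baseline='sym'`` the cumulative stack is shifted by ``-0.5 * total`` so the band becomes symmetric about zero. A numpy-only sketch of just that step, outside matplotlib:

```python
import numpy as np

y = np.array([[1., 2., 3.],
              [3., 2., 1.]])                # two series, three x positions

stack = np.cumsum(y, axis=0, dtype=float)   # running totals, bottom to top
first_line = -np.sum(y, axis=0) * 0.5       # 'sym' baseline: centre on zero
stack += first_line[None, :]

# Lower edge is first_line, upper edge is stack[-1]; they mirror each other.
print(first_line)      # [-2. -2. -2.]
print(stack[-1])       # [2. 2. 2.]
```

The fill calls in the source then paint between `first_line` and `stack[0]`, and between each consecutive pair of rows of `stack`, which is why the ``'sym'`` layout reads as a "ThemeRiver" centered on the x-axis.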
from matplotlib.axes import Axes\nfrom matplotlib.collections import PolyCollection\n\nfrom collections.abc import Iterable\nfrom typing import Literal\nfrom numpy.typing import ArrayLike\nfrom matplotlib.typing import ColorType\n\ndef stackplot(\n axes: Axes,\n x: ArrayLike,\n *args: ArrayLike,\n labels: Iterable[str] = ...,\n colors: Iterable[ColorType] | None = ...,\n hatch: Iterable[str] | str | None = ...,\n baseline: Literal["zero", "sym", "wiggle", "weighted_wiggle"] = ...,\n **kwargs\n) -> list[PolyCollection]: ...\n\n__all__ = ['stackplot']\n | .venv\Lib\site-packages\matplotlib\stackplot.pyi | stackplot.pyi | Other | 561 | 0.85 | 0.05 | 0.117647 | node-utils | 267 | 2025-02-19T19:40:20.203038 | Apache-2.0 | false | 62f3ce25f392aed360469c32de87b434 |
"""\nStreamline plotting for 2D vector fields.\n\n"""\n\nimport numpy as np\n\nimport matplotlib as mpl\nfrom matplotlib import _api, cm, patches\nimport matplotlib.colors as mcolors\nimport matplotlib.collections as mcollections\nimport matplotlib.lines as mlines\n\n\n__all__ = ['streamplot']\n\n\ndef streamplot(axes, x, y, u, v, density=1, linewidth=None, color=None,\n cmap=None, norm=None, arrowsize=1, arrowstyle='-|>',\n minlength=0.1, transform=None, zorder=None, start_points=None,\n maxlength=4.0, integration_direction='both',\n broken_streamlines=True):\n """\n Draw streamlines of a vector flow.\n\n Parameters\n ----------\n x, y : 1D/2D arrays\n Evenly spaced strictly increasing arrays to make a grid. If 2D, all\n rows of *x* must be equal and all columns of *y* must be equal; i.e.,\n they must be as if generated by ``np.meshgrid(x_1d, y_1d)``.\n u, v : 2D arrays\n *x* and *y*-velocities. The number of rows and columns must match\n the length of *y* and *x*, respectively.\n density : float or (float, float)\n Controls the closeness of streamlines. When ``density = 1``, the domain\n is divided into a 30x30 grid. *density* linearly scales this grid.\n Each cell in the grid can have, at most, one traversing streamline.\n For different densities in each direction, use a tuple\n (density_x, density_y).\n linewidth : float or 2D array\n The width of the streamlines. With a 2D array the line width can be\n varied across the grid. The array must have the same shape as *u*\n and *v*.\n color : :mpltype:`color` or 2D array\n The streamline color. If given an array, its values are converted to\n colors using *cmap* and *norm*. The array must have the same shape\n as *u* and *v*.\n cmap, norm\n Data normalization and colormapping parameters for *color*; only used\n if *color* is an array of floats. 
See `~.Axes.imshow` for a detailed\n description.\n arrowsize : float\n Scaling factor for the arrow size.\n arrowstyle : str\n Arrow style specification.\n See `~matplotlib.patches.FancyArrowPatch`.\n minlength : float\n Minimum length of streamline in axes coordinates.\n start_points : (N, 2) array\n Coordinates of starting points for the streamlines in data coordinates\n (the same coordinates as the *x* and *y* arrays).\n zorder : float\n The zorder of the streamlines and arrows.\n Artists with lower zorder values are drawn first.\n maxlength : float\n Maximum length of streamline in axes coordinates.\n integration_direction : {'forward', 'backward', 'both'}, default: 'both'\n Integrate the streamline in forward, backward or both directions.\n data : indexable object, optional\n DATA_PARAMETER_PLACEHOLDER\n broken_streamlines : boolean, default: True\n If False, forces streamlines to continue until they\n leave the plot domain. If True, they may be terminated if they\n come too close to another streamline.\n\n Returns\n -------\n StreamplotSet\n Container object with attributes\n\n - ``lines``: `.LineCollection` of streamlines\n\n - ``arrows``: `.PatchCollection` containing `.FancyArrowPatch`\n objects representing the arrows half-way along streamlines.\n\n This container will probably change in the future to allow changes\n to the colormap, alpha, etc. 
for both lines and arrows, but these\n changes should be backward compatible.\n """\n grid = Grid(x, y)\n mask = StreamMask(density)\n dmap = DomainMap(grid, mask)\n\n if zorder is None:\n zorder = mlines.Line2D.zorder\n\n # default to data coordinates\n if transform is None:\n transform = axes.transData\n\n if color is None:\n color = axes._get_lines.get_next_color()\n\n if linewidth is None:\n linewidth = mpl.rcParams['lines.linewidth']\n\n line_kw = {}\n arrow_kw = dict(arrowstyle=arrowstyle, mutation_scale=10 * arrowsize)\n\n _api.check_in_list(['both', 'forward', 'backward'],\n integration_direction=integration_direction)\n\n if integration_direction == 'both':\n maxlength /= 2.\n\n use_multicolor_lines = isinstance(color, np.ndarray)\n if use_multicolor_lines:\n if color.shape != grid.shape:\n raise ValueError("If 'color' is given, it must match the shape of "\n "the (x, y) grid")\n line_colors = [[]] # Empty entry allows concatenation of zero arrays.\n color = np.ma.masked_invalid(color)\n else:\n line_kw['color'] = color\n arrow_kw['color'] = color\n\n if isinstance(linewidth, np.ndarray):\n if linewidth.shape != grid.shape:\n raise ValueError("If 'linewidth' is given, it must match the "\n "shape of the (x, y) grid")\n line_kw['linewidth'] = []\n else:\n line_kw['linewidth'] = linewidth\n arrow_kw['linewidth'] = linewidth\n\n line_kw['zorder'] = zorder\n arrow_kw['zorder'] = zorder\n\n # Sanity checks.\n if u.shape != grid.shape or v.shape != grid.shape:\n raise ValueError("'u' and 'v' must match the shape of the (x, y) grid")\n\n u = np.ma.masked_invalid(u)\n v = np.ma.masked_invalid(v)\n\n integrate = _get_integrator(u, v, dmap, minlength, maxlength,\n integration_direction)\n\n trajectories = []\n if start_points is None:\n for xm, ym in _gen_starting_points(mask.shape):\n if mask[ym, xm] == 0:\n xg, yg = dmap.mask2grid(xm, ym)\n t = integrate(xg, yg, broken_streamlines)\n if t is not None:\n trajectories.append(t)\n else:\n sp2 = 
np.asanyarray(start_points, dtype=float).copy()\n\n # Check if start_points are outside the data boundaries\n for xs, ys in sp2:\n if not (grid.x_origin <= xs <= grid.x_origin + grid.width and\n grid.y_origin <= ys <= grid.y_origin + grid.height):\n raise ValueError(f"Starting point ({xs}, {ys}) outside of "\n "data boundaries")\n\n # Convert start_points from data to array coords\n # Shift the seed points from the bottom left of the data so that\n # data2grid works properly.\n sp2[:, 0] -= grid.x_origin\n sp2[:, 1] -= grid.y_origin\n\n for xs, ys in sp2:\n xg, yg = dmap.data2grid(xs, ys)\n # Floating point issues can cause xg, yg to be slightly out of\n # bounds for xs, ys on the upper boundaries. Because we have\n # already checked that the starting points are within the original\n # grid, clip the xg, yg to the grid to work around this issue\n xg = np.clip(xg, 0, grid.nx - 1)\n yg = np.clip(yg, 0, grid.ny - 1)\n\n t = integrate(xg, yg, broken_streamlines)\n if t is not None:\n trajectories.append(t)\n\n if use_multicolor_lines:\n if norm is None:\n norm = mcolors.Normalize(color.min(), color.max())\n cmap = cm._ensure_cmap(cmap)\n\n streamlines = []\n arrows = []\n for t in trajectories:\n tgx, tgy = t.T\n # Rescale from grid-coordinates to data-coordinates.\n tx, ty = dmap.grid2data(tgx, tgy)\n tx += grid.x_origin\n ty += grid.y_origin\n\n # Create multiple tiny segments if varying width or color is given\n if isinstance(linewidth, np.ndarray) or use_multicolor_lines:\n points = np.transpose([tx, ty]).reshape(-1, 1, 2)\n streamlines.extend(np.hstack([points[:-1], points[1:]]))\n else:\n points = np.transpose([tx, ty])\n streamlines.append(points)\n\n # Add arrows halfway along each trajectory.\n s = np.cumsum(np.hypot(np.diff(tx), np.diff(ty)))\n n = np.searchsorted(s, s[-1] / 2.)\n arrow_tail = (tx[n], ty[n])\n arrow_head = (np.mean(tx[n:n + 2]), np.mean(ty[n:n + 2]))\n\n if isinstance(linewidth, np.ndarray):\n line_widths = interpgrid(linewidth, tgx, 
tgy)[:-1]\n line_kw['linewidth'].extend(line_widths)\n arrow_kw['linewidth'] = line_widths[n]\n\n if use_multicolor_lines:\n color_values = interpgrid(color, tgx, tgy)[:-1]\n line_colors.append(color_values)\n arrow_kw['color'] = cmap(norm(color_values[n]))\n\n p = patches.FancyArrowPatch(\n arrow_tail, arrow_head, transform=transform, **arrow_kw)\n arrows.append(p)\n\n lc = mcollections.LineCollection(\n streamlines, transform=transform, **line_kw)\n lc.sticky_edges.x[:] = [grid.x_origin, grid.x_origin + grid.width]\n lc.sticky_edges.y[:] = [grid.y_origin, grid.y_origin + grid.height]\n if use_multicolor_lines:\n lc.set_array(np.ma.hstack(line_colors))\n lc.set_cmap(cmap)\n lc.set_norm(norm)\n axes.add_collection(lc)\n\n ac = mcollections.PatchCollection(arrows)\n # Adding the collection itself is broken; see #2341.\n for p in arrows:\n axes.add_patch(p)\n\n axes.autoscale_view()\n stream_container = StreamplotSet(lc, ac)\n return stream_container\n\n\nclass StreamplotSet:\n\n def __init__(self, lines, arrows):\n self.lines = lines\n self.arrows = arrows\n\n\n# Coordinate definitions\n# ========================\n\nclass DomainMap:\n """\n Map representing different coordinate systems.\n\n Coordinate definitions:\n\n * axes-coordinates goes from 0 to 1 in the domain.\n * data-coordinates are specified by the input x-y coordinates.\n * grid-coordinates goes from 0 to N and 0 to M for an N x M grid,\n where N and M match the shape of the input data.\n * mask-coordinates goes from 0 to N and 0 to M for an N x M mask,\n where N and M are user-specified to control the density of streamlines.\n\n This class also has methods for adding trajectories to the StreamMask.\n Before adding a trajectory, run `start_trajectory` to keep track of regions\n crossed by a given trajectory. 
Later, if you decide the trajectory is bad\n (e.g., if the trajectory is very short) just call `undo_trajectory`.\n """\n\n def __init__(self, grid, mask):\n self.grid = grid\n self.mask = mask\n # Constants for conversion between grid- and mask-coordinates\n self.x_grid2mask = (mask.nx - 1) / (grid.nx - 1)\n self.y_grid2mask = (mask.ny - 1) / (grid.ny - 1)\n\n self.x_mask2grid = 1. / self.x_grid2mask\n self.y_mask2grid = 1. / self.y_grid2mask\n\n self.x_data2grid = 1. / grid.dx\n self.y_data2grid = 1. / grid.dy\n\n def grid2mask(self, xi, yi):\n """Return nearest space in mask-coords from given grid-coords."""\n return round(xi * self.x_grid2mask), round(yi * self.y_grid2mask)\n\n def mask2grid(self, xm, ym):\n return xm * self.x_mask2grid, ym * self.y_mask2grid\n\n def data2grid(self, xd, yd):\n return xd * self.x_data2grid, yd * self.y_data2grid\n\n def grid2data(self, xg, yg):\n return xg / self.x_data2grid, yg / self.y_data2grid\n\n def start_trajectory(self, xg, yg, broken_streamlines=True):\n xm, ym = self.grid2mask(xg, yg)\n self.mask._start_trajectory(xm, ym, broken_streamlines)\n\n def reset_start_point(self, xg, yg):\n xm, ym = self.grid2mask(xg, yg)\n self.mask._current_xy = (xm, ym)\n\n def update_trajectory(self, xg, yg, broken_streamlines=True):\n if not self.grid.within_grid(xg, yg):\n raise InvalidIndexError\n xm, ym = self.grid2mask(xg, yg)\n self.mask._update_trajectory(xm, ym, broken_streamlines)\n\n def undo_trajectory(self):\n self.mask._undo_trajectory()\n\n\nclass Grid:\n """Grid of data."""\n def __init__(self, x, y):\n\n if np.ndim(x) == 1:\n pass\n elif np.ndim(x) == 2:\n x_row = x[0]\n if not np.allclose(x_row, x):\n raise ValueError("The rows of 'x' must be equal")\n x = x_row\n else:\n raise ValueError("'x' can have at maximum 2 dimensions")\n\n if np.ndim(y) == 1:\n pass\n elif np.ndim(y) == 2:\n yt = np.transpose(y) # Also works for nested lists.\n y_col = yt[0]\n if not np.allclose(y_col, yt):\n raise ValueError("The columns of 'y' 
must be equal")\n y = y_col\n else:\n raise ValueError("'y' can have at maximum 2 dimensions")\n\n if not (np.diff(x) > 0).all():\n raise ValueError("'x' must be strictly increasing")\n if not (np.diff(y) > 0).all():\n raise ValueError("'y' must be strictly increasing")\n\n self.nx = len(x)\n self.ny = len(y)\n\n self.dx = x[1] - x[0]\n self.dy = y[1] - y[0]\n\n self.x_origin = x[0]\n self.y_origin = y[0]\n\n self.width = x[-1] - x[0]\n self.height = y[-1] - y[0]\n\n if not np.allclose(np.diff(x), self.width / (self.nx - 1)):\n raise ValueError("'x' values must be equally spaced")\n if not np.allclose(np.diff(y), self.height / (self.ny - 1)):\n raise ValueError("'y' values must be equally spaced")\n\n @property\n def shape(self):\n return self.ny, self.nx\n\n def within_grid(self, xi, yi):\n """Return whether (*xi*, *yi*) is a valid index of the grid."""\n # Note that xi/yi can be floats; so, for example, we can't simply check\n # `xi < self.nx` since *xi* can be `self.nx - 1 < xi < self.nx`\n return 0 <= xi <= self.nx - 1 and 0 <= yi <= self.ny - 1\n\n\nclass StreamMask:\n """\n Mask to keep track of discrete regions crossed by streamlines.\n\n The resolution of this grid determines the approximate spacing between\n trajectories. 
Streamlines are only allowed to pass through zeroed cells:\n When a streamline enters a cell, that cell is set to 1, and no new\n streamlines are allowed to enter.\n """\n\n def __init__(self, density):\n try:\n self.nx, self.ny = (30 * np.broadcast_to(density, 2)).astype(int)\n except ValueError as err:\n raise ValueError("'density' must be a scalar or be of length "\n "2") from err\n if self.nx < 0 or self.ny < 0:\n raise ValueError("'density' must be positive")\n self._mask = np.zeros((self.ny, self.nx))\n self.shape = self._mask.shape\n\n self._current_xy = None\n\n def __getitem__(self, args):\n return self._mask[args]\n\n def _start_trajectory(self, xm, ym, broken_streamlines=True):\n """Start recording streamline trajectory"""\n self._traj = []\n self._update_trajectory(xm, ym, broken_streamlines)\n\n def _undo_trajectory(self):\n """Remove current trajectory from mask"""\n for t in self._traj:\n self._mask[t] = 0\n\n def _update_trajectory(self, xm, ym, broken_streamlines=True):\n """\n Update current trajectory position in mask.\n\n If the new position has already been filled, raise `InvalidIndexError`.\n """\n if self._current_xy != (xm, ym):\n if self[ym, xm] == 0:\n self._traj.append((ym, xm))\n self._mask[ym, xm] = 1\n self._current_xy = (xm, ym)\n else:\n if broken_streamlines:\n raise InvalidIndexError\n else:\n pass\n\n\nclass InvalidIndexError(Exception):\n pass\n\n\nclass TerminateTrajectory(Exception):\n pass\n\n\n# Integrator definitions\n# =======================\n\ndef _get_integrator(u, v, dmap, minlength, maxlength, integration_direction):\n\n # rescale velocity onto grid-coordinates for integrations.\n u, v = dmap.data2grid(u, v)\n\n # speed (path length) will be in axes-coordinates\n u_ax = u / (dmap.grid.nx - 1)\n v_ax = v / (dmap.grid.ny - 1)\n speed = np.ma.sqrt(u_ax ** 2 + v_ax ** 2)\n\n def forward_time(xi, yi):\n if not dmap.grid.within_grid(xi, yi):\n raise OutOfBounds\n ds_dt = interpgrid(speed, xi, yi)\n if ds_dt == 0:\n raise 
TerminateTrajectory()\n dt_ds = 1. / ds_dt\n ui = interpgrid(u, xi, yi)\n vi = interpgrid(v, xi, yi)\n return ui * dt_ds, vi * dt_ds\n\n def backward_time(xi, yi):\n dxi, dyi = forward_time(xi, yi)\n return -dxi, -dyi\n\n def integrate(x0, y0, broken_streamlines=True):\n """\n Return x, y grid-coordinates of trajectory based on starting point.\n\n Integrate both forward and backward in time from starting point in\n grid coordinates.\n\n Integration is terminated when a trajectory reaches a domain boundary\n or when it crosses into an already occupied cell in the StreamMask. The\n resulting trajectory is None if it is shorter than `minlength`.\n """\n\n stotal, xy_traj = 0., []\n\n try:\n dmap.start_trajectory(x0, y0, broken_streamlines)\n except InvalidIndexError:\n return None\n if integration_direction in ['both', 'backward']:\n s, xyt = _integrate_rk12(x0, y0, dmap, backward_time, maxlength,\n broken_streamlines)\n stotal += s\n xy_traj += xyt[::-1]\n\n if integration_direction in ['both', 'forward']:\n dmap.reset_start_point(x0, y0)\n s, xyt = _integrate_rk12(x0, y0, dmap, forward_time, maxlength,\n broken_streamlines)\n stotal += s\n xy_traj += xyt[1:]\n\n if stotal > minlength:\n return np.broadcast_arrays(xy_traj, np.empty((1, 2)))[0]\n else: # reject short trajectories\n dmap.undo_trajectory()\n return None\n\n return integrate\n\n\nclass OutOfBounds(IndexError):\n pass\n\n\ndef _integrate_rk12(x0, y0, dmap, f, maxlength, broken_streamlines=True):\n """\n 2nd-order Runge-Kutta algorithm with adaptive step size.\n\n This method is also referred to as the improved Euler's method, or Heun's\n method. This method is favored over higher-order methods because:\n\n 1. 
To get decent looking trajectories and to sample every mask cell\n on the trajectory we need a small timestep, so a lower order\n solver doesn't hurt us unless the data is *very* high resolution.\n In fact, for cases where the user inputs\n data smaller or of similar grid size to the mask grid, the higher\n order corrections are negligible because of the very fast linear\n interpolation used in `interpgrid`.\n\n 2. For high resolution input data (i.e. beyond the mask\n resolution), we must reduce the timestep. Therefore, an adaptive\n timestep is more suited to the problem as this would be very hard\n to judge automatically otherwise.\n\n This integrator is about 1.5 - 2x as fast as RK4 and RK45 solvers (using\n similar Python implementations) in most setups.\n """\n # This error is below that needed to match the RK4 integrator. It\n # is set for visual reasons -- too low and corners start\n # appearing ugly and jagged. Can be tuned.\n maxerror = 0.003\n\n # This limit is important (for all integrators) to avoid the\n # trajectory skipping some mask cells. We could relax this\n # condition if we use the code which is commented out below to\n # increment the location gradually. However, due to the efficient\n # nature of the interpolation, this doesn't boost speed by much\n # for quite a bit of complexity.\n maxds = min(1. / dmap.mask.nx, 1. 
/ dmap.mask.ny, 0.1)\n\n ds = maxds\n stotal = 0\n xi = x0\n yi = y0\n xyf_traj = []\n\n while True:\n try:\n if dmap.grid.within_grid(xi, yi):\n xyf_traj.append((xi, yi))\n else:\n raise OutOfBounds\n\n # Compute the two intermediate gradients.\n # f should raise OutOfBounds if the locations given are\n # outside the grid.\n k1x, k1y = f(xi, yi)\n k2x, k2y = f(xi + ds * k1x, yi + ds * k1y)\n\n except OutOfBounds:\n # Out of the domain during this step.\n # Take an Euler step to the boundary to improve neatness\n # unless the trajectory is currently empty.\n if xyf_traj:\n ds, xyf_traj = _euler_step(xyf_traj, dmap, f)\n stotal += ds\n break\n except TerminateTrajectory:\n break\n\n dx1 = ds * k1x\n dy1 = ds * k1y\n dx2 = ds * 0.5 * (k1x + k2x)\n dy2 = ds * 0.5 * (k1y + k2y)\n\n ny, nx = dmap.grid.shape\n # Error is normalized to the axes coordinates\n error = np.hypot((dx2 - dx1) / (nx - 1), (dy2 - dy1) / (ny - 1))\n\n # Only save step if within error tolerance\n if error < maxerror:\n xi += dx2\n yi += dy2\n try:\n dmap.update_trajectory(xi, yi, broken_streamlines)\n except InvalidIndexError:\n break\n if stotal + ds > maxlength:\n break\n stotal += ds\n\n # recalculate stepsize based on step error\n if error == 0:\n ds = maxds\n else:\n ds = min(maxds, 0.85 * ds * (maxerror / error) ** 0.5)\n\n return stotal, xyf_traj\n\n\ndef _euler_step(xyf_traj, dmap, f):\n """Simple Euler integration step that extends streamline to boundary."""\n ny, nx = dmap.grid.shape\n xi, yi = xyf_traj[-1]\n cx, cy = f(xi, yi)\n if cx == 0:\n dsx = np.inf\n elif cx < 0:\n dsx = xi / -cx\n else:\n dsx = (nx - 1 - xi) / cx\n if cy == 0:\n dsy = np.inf\n elif cy < 0:\n dsy = yi / -cy\n else:\n dsy = (ny - 1 - yi) / cy\n ds = min(dsx, dsy)\n xyf_traj.append((xi + cx * ds, yi + cy * ds))\n return ds, xyf_traj\n\n\n# Utility functions\n# ========================\n\ndef interpgrid(a, xi, yi):\n """Fast 2D, linear interpolation on an integer grid"""\n\n Ny, Nx = np.shape(a)\n if isinstance(xi, 
np.ndarray):\n x = xi.astype(int)\n y = yi.astype(int)\n # Check that xn, yn don't exceed max index\n xn = np.clip(x + 1, 0, Nx - 1)\n yn = np.clip(y + 1, 0, Ny - 1)\n else:\n x = int(xi)\n y = int(yi)\n # conditional is faster than clipping for integers\n if x == (Nx - 1):\n xn = x\n else:\n xn = x + 1\n if y == (Ny - 1):\n yn = y\n else:\n yn = y + 1\n\n a00 = a[y, x]\n a01 = a[y, xn]\n a10 = a[yn, x]\n a11 = a[yn, xn]\n xt = xi - x\n yt = yi - y\n a0 = a00 * (1 - xt) + a01 * xt\n a1 = a10 * (1 - xt) + a11 * xt\n ai = a0 * (1 - yt) + a1 * yt\n\n if not isinstance(xi, np.ndarray):\n if np.ma.is_masked(ai):\n raise TerminateTrajectory\n\n return ai\n\n\ndef _gen_starting_points(shape):\n """\n Yield starting points for streamlines.\n\n Trying points on the boundary first gives higher quality streamlines.\n This algorithm starts with a point on the mask corner and spirals inward.\n This algorithm is inefficient, but fast compared to rest of streamplot.\n """\n ny, nx = shape\n xfirst = 0\n yfirst = 1\n xlast = nx - 1\n ylast = ny - 1\n x, y = 0, 0\n direction = 'right'\n for i in range(nx * ny):\n yield x, y\n\n if direction == 'right':\n x += 1\n if x >= xlast:\n xlast -= 1\n direction = 'up'\n elif direction == 'up':\n y += 1\n if y >= ylast:\n ylast -= 1\n direction = 'left'\n elif direction == 'left':\n x -= 1\n if x <= xfirst:\n xfirst += 1\n direction = 'down'\n elif direction == 'down':\n y -= 1\n if y <= yfirst:\n yfirst += 1\n direction = 'right'\n | .venv\Lib\site-packages\matplotlib\streamplot.py | streamplot.py | Python | 24,011 | 0.95 | 0.188202 | 0.086207 | react-lib | 716 | 2023-09-06T05:33:58.341525 | BSD-3-Clause | false | 9b709c9065d167d687c441b87b360af9 |
from matplotlib.axes import Axes\nfrom matplotlib.colors import Normalize, Colormap\nfrom matplotlib.collections import LineCollection, PatchCollection\nfrom matplotlib.patches import ArrowStyle\nfrom matplotlib.transforms import Transform\n\nfrom typing import Literal\nfrom numpy.typing import ArrayLike\nfrom .typing import ColorType\n\ndef streamplot(\n axes: Axes,\n x: ArrayLike,\n y: ArrayLike,\n u: ArrayLike,\n v: ArrayLike,\n density: float | tuple[float, float] = ...,\n linewidth: float | ArrayLike | None = ...,\n color: ColorType | ArrayLike | None = ...,\n cmap: str | Colormap | None = ...,\n norm: str | Normalize | None = ...,\n arrowsize: float = ...,\n arrowstyle: str | ArrowStyle = ...,\n minlength: float = ...,\n transform: Transform | None = ...,\n zorder: float | None = ...,\n start_points: ArrayLike | None = ...,\n maxlength: float = ...,\n integration_direction: Literal["forward", "backward", "both"] = ...,\n broken_streamlines: bool = ...,\n) -> StreamplotSet: ...\n\nclass StreamplotSet:\n lines: LineCollection\n arrows: PatchCollection\n def __init__(self, lines: LineCollection, arrows: PatchCollection) -> None: ...\n\nclass DomainMap:\n grid: Grid\n mask: StreamMask\n x_grid2mask: float\n y_grid2mask: float\n x_mask2grid: float\n y_mask2grid: float\n x_data2grid: float\n y_data2grid: float\n def __init__(self, grid: Grid, mask: StreamMask) -> None: ...\n def grid2mask(self, xi: float, yi: float) -> tuple[int, int]: ...\n def mask2grid(self, xm: float, ym: float) -> tuple[float, float]: ...\n def data2grid(self, xd: float, yd: float) -> tuple[float, float]: ...\n def grid2data(self, xg: float, yg: float) -> tuple[float, float]: ...\n def start_trajectory(\n self, xg: float, yg: float, broken_streamlines: bool = ...\n ) -> None: ...\n def reset_start_point(self, xg: float, yg: float) -> None: ...\n def update_trajectory(self, xg, yg, broken_streamlines: bool = ...) 
-> None: ...\n def undo_trajectory(self) -> None: ...\n\nclass Grid:\n nx: int\n ny: int\n dx: float\n dy: float\n x_origin: float\n y_origin: float\n width: float\n height: float\n def __init__(self, x: ArrayLike, y: ArrayLike) -> None: ...\n @property\n def shape(self) -> tuple[int, int]: ...\n def within_grid(self, xi: float, yi: float) -> bool: ...\n\nclass StreamMask:\n nx: int\n ny: int\n shape: tuple[int, int]\n def __init__(self, density: float | tuple[float, float]) -> None: ...\n def __getitem__(self, args): ...\n\nclass InvalidIndexError(Exception): ...\nclass TerminateTrajectory(Exception): ...\nclass OutOfBounds(IndexError): ...\n\n__all__ = ['streamplot']\n | .venv\Lib\site-packages\matplotlib\streamplot.pyi | streamplot.pyi | Other | 2,690 | 0.85 | 0.27381 | 0 | python-kit | 734 | 2024-06-15T12:38:26.379355 | GPL-3.0 | false | 399cdedef2e4b3d22558ff438e22bae2 |
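The stub above declares the public `streamplot` entry point, normally reached through `Axes.streamplot`. A minimal usage sketch on a rotational field (the grid size, field, and density are arbitrary illustrative choices):

```python
import matplotlib
matplotlib.use("Agg")  # headless backend so this runs without a display
import matplotlib.pyplot as plt
import numpy as np

# A simple rotational vector field on an evenly spaced grid,
# as streamplot requires.
y, x = np.mgrid[-2:2:21j, -2:2:21j]
u, v = -y, x

fig, ax = plt.subplots()
sps = ax.streamplot(x, y, u, v, density=1.0)

# The return value is a StreamplotSet bundling the line and arrow artists.
print(type(sps).__name__)  # StreamplotSet
plt.close(fig)
```

`sps.lines` is the `LineCollection` of streamline segments and `sps.arrows` the `PatchCollection` of direction arrows, matching the attributes declared in the stub.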
# Original code by:\n# John Gill <jng@europe.renre.com>\n# Copyright 2004 John Gill and John Hunter\n#\n# Subsequent changes:\n# The Matplotlib development team\n# Copyright The Matplotlib development team\n\n"""\nTables drawing.\n\n.. note::\n The table implementation in Matplotlib is lightly maintained. For a more\n featureful table implementation, you may wish to try `blume\n <https://github.com/swfiua/blume>`_.\n\nUse the factory function `~matplotlib.table.table` to create a ready-made\ntable from texts. If you need more control, use the `.Table` class and its\nmethods.\n\nThe table consists of a grid of cells, which are indexed by (row, column).\nThe cell (0, 0) is positioned at the top left.\n\nThanks to John Gill for providing the class and table.\n"""\n\nimport numpy as np\n\nfrom . import _api, _docstring\nfrom .artist import Artist, allow_rasterization\nfrom .patches import Rectangle\nfrom .text import Text\nfrom .transforms import Bbox\nfrom .path import Path\n\nfrom .cbook import _is_pandas_dataframe\n\n\nclass Cell(Rectangle):\n """\n A cell is a `.Rectangle` with some associated `.Text`.\n\n As a user, you'll most likely not create cells yourself. 
Instead, you\n should use either the `~matplotlib.table.table` factory function or\n `.Table.add_cell`.\n """\n\n PAD = 0.1\n """Padding between text and rectangle."""\n\n _edges = 'BRTL'\n _edge_aliases = {'open': '',\n 'closed': _edges, # default\n 'horizontal': 'BT',\n 'vertical': 'RL'\n }\n\n def __init__(self, xy, width, height, *,\n edgecolor='k', facecolor='w',\n fill=True,\n text='',\n loc='right',\n fontproperties=None,\n visible_edges='closed',\n ):\n """\n Parameters\n ----------\n xy : 2-tuple\n The position of the bottom left corner of the cell.\n width : float\n The cell width.\n height : float\n The cell height.\n edgecolor : :mpltype:`color`, default: 'k'\n The color of the cell border.\n facecolor : :mpltype:`color`, default: 'w'\n The cell facecolor.\n fill : bool, default: True\n Whether the cell background is filled.\n text : str, optional\n The cell text.\n loc : {'right', 'center', 'left'}\n The alignment of the text within the cell.\n fontproperties : dict, optional\n A dict defining the font properties of the text. 
Supported keys and\n values are the keyword arguments accepted by `.FontProperties`.\n visible_edges : {'closed', 'open', 'horizontal', 'vertical'} or \\nsubstring of 'BRTL'\n The cell edges to be drawn with a line: a substring of 'BRTL'\n (bottom, right, top, left), or one of 'open' (no edges drawn),\n 'closed' (all edges drawn), 'horizontal' (bottom and top),\n 'vertical' (right and left).\n """\n\n # Call base\n super().__init__(xy, width=width, height=height, fill=fill,\n edgecolor=edgecolor, facecolor=facecolor)\n self.set_clip_on(False)\n self.visible_edges = visible_edges\n\n # Create text object\n self._loc = loc\n self._text = Text(x=xy[0], y=xy[1], clip_on=False,\n text=text, fontproperties=fontproperties,\n horizontalalignment=loc, verticalalignment='center')\n\n def set_transform(self, t):\n super().set_transform(t)\n # the text does not get the transform!\n self.stale = True\n\n def set_figure(self, fig):\n super().set_figure(fig)\n self._text.set_figure(fig)\n\n def get_text(self):\n """Return the cell `.Text` instance."""\n return self._text\n\n def set_fontsize(self, size):\n """Set the text fontsize."""\n self._text.set_fontsize(size)\n self.stale = True\n\n def get_fontsize(self):\n """Return the cell fontsize."""\n return self._text.get_fontsize()\n\n def auto_set_font_size(self, renderer):\n """Shrink font size until the text fits into the cell width."""\n fontsize = self.get_fontsize()\n required = self.get_required_width(renderer)\n while fontsize > 1 and required > self.get_width():\n fontsize -= 1\n self.set_fontsize(fontsize)\n required = self.get_required_width(renderer)\n\n return fontsize\n\n @allow_rasterization\n def draw(self, renderer):\n if not self.get_visible():\n return\n # draw the rectangle\n super().draw(renderer)\n # position the text\n self._set_text_position(renderer)\n self._text.draw(renderer)\n self.stale = False\n\n def _set_text_position(self, renderer):\n """Set text up so it is drawn in the right place."""\n bbox = 
self.get_window_extent(renderer)\n # center vertically\n y = bbox.y0 + bbox.height / 2\n # position horizontally\n loc = self._text.get_horizontalalignment()\n if loc == 'center':\n x = bbox.x0 + bbox.width / 2\n elif loc == 'left':\n x = bbox.x0 + bbox.width * self.PAD\n else: # right.\n x = bbox.x0 + bbox.width * (1 - self.PAD)\n self._text.set_position((x, y))\n\n def get_text_bounds(self, renderer):\n """\n Return the text bounds as *(x, y, width, height)* in table coordinates.\n """\n return (self._text.get_window_extent(renderer)\n .transformed(self.get_data_transform().inverted())\n .bounds)\n\n def get_required_width(self, renderer):\n """Return the minimal required width for the cell."""\n l, b, w, h = self.get_text_bounds(renderer)\n return w * (1.0 + (2.0 * self.PAD))\n\n @_docstring.interpd\n def set_text_props(self, **kwargs):\n """\n Update the text properties.\n\n Valid keyword arguments are:\n\n %(Text:kwdoc)s\n """\n self._text._internal_update(kwargs)\n self.stale = True\n\n @property\n def visible_edges(self):\n """\n The cell edges to be drawn with a line.\n\n Reading this property returns a substring of 'BRTL' (bottom, right,\n top, left').\n\n When setting this property, you can use a substring of 'BRTL' or one\n of {'open', 'closed', 'horizontal', 'vertical'}.\n """\n return self._visible_edges\n\n @visible_edges.setter\n def visible_edges(self, value):\n if value is None:\n self._visible_edges = self._edges\n elif value in self._edge_aliases:\n self._visible_edges = self._edge_aliases[value]\n else:\n if any(edge not in self._edges for edge in value):\n raise ValueError('Invalid edge param {}, must only be one of '\n '{} or string of {}'.format(\n value,\n ", ".join(self._edge_aliases),\n ", ".join(self._edges)))\n self._visible_edges = value\n self.stale = True\n\n def get_path(self):\n """Return a `.Path` for the `.visible_edges`."""\n codes = [Path.MOVETO]\n codes.extend(\n Path.LINETO if edge in self._visible_edges else Path.MOVETO\n for 
edge in self._edges)\n if Path.MOVETO not in codes[1:]: # All sides are visible\n codes[-1] = Path.CLOSEPOLY\n return Path(\n [[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0], [0.0, 0.0]],\n codes,\n readonly=True\n )\n\n\nCustomCell = Cell # Backcompat. alias.\n\n\nclass Table(Artist):\n """\n A table of cells.\n\n The table consists of a grid of cells, which are indexed by (row, column).\n\n For a simple table, you'll have a full grid of cells with indices from\n (0, 0) to (num_rows-1, num_cols-1), in which the cell (0, 0) is positioned\n at the top left. However, you can also add cells with negative indices.\n You don't have to add a cell to every grid position, so you can create\n tables that have holes.\n\n *Note*: You'll usually not create an empty table from scratch. Instead use\n `~matplotlib.table.table` to create a table from data.\n """\n codes = {'best': 0,\n 'upper right': 1, # default\n 'upper left': 2,\n 'lower left': 3,\n 'lower right': 4,\n 'center left': 5,\n 'center right': 6,\n 'lower center': 7,\n 'upper center': 8,\n 'center': 9,\n 'top right': 10,\n 'top left': 11,\n 'bottom left': 12,\n 'bottom right': 13,\n 'right': 14,\n 'left': 15,\n 'top': 16,\n 'bottom': 17,\n }\n """Possible values where to place the table relative to the Axes."""\n\n FONTSIZE = 10\n\n AXESPAD = 0.02\n """The border between the Axes and the table edge in Axes units."""\n\n def __init__(self, ax, loc=None, bbox=None, **kwargs):\n """\n Parameters\n ----------\n ax : `~matplotlib.axes.Axes`\n The `~.axes.Axes` to plot the table into.\n loc : str, optional\n The position of the cell with respect to *ax*. This must be one of\n the `~.Table.codes`.\n bbox : `.Bbox` or [xmin, ymin, width, height], optional\n A bounding box to draw the table into. 
If this is not *None*, this\n overrides *loc*.\n\n Other Parameters\n ----------------\n **kwargs\n `.Artist` properties.\n """\n\n super().__init__()\n\n if isinstance(loc, str):\n if loc not in self.codes:\n raise ValueError(\n "Unrecognized location {!r}. Valid locations are\n\t{}"\n .format(loc, '\n\t'.join(self.codes)))\n loc = self.codes[loc]\n self.set_figure(ax.get_figure(root=False))\n self._axes = ax\n self._loc = loc\n self._bbox = bbox\n\n # use axes coords\n ax._unstale_viewLim()\n self.set_transform(ax.transAxes)\n\n self._cells = {}\n self._edges = None\n self._autoColumns = []\n self._autoFontsize = True\n self._internal_update(kwargs)\n\n self.set_clip_on(False)\n\n def add_cell(self, row, col, *args, **kwargs):\n """\n Create a cell and add it to the table.\n\n Parameters\n ----------\n row : int\n Row index.\n col : int\n Column index.\n *args, **kwargs\n All other parameters are passed on to `Cell`.\n\n Returns\n -------\n `.Cell`\n The created cell.\n\n """\n xy = (0, 0)\n cell = Cell(xy, visible_edges=self.edges, *args, **kwargs)\n self[row, col] = cell\n return cell\n\n def __setitem__(self, position, cell):\n """\n Set a custom cell in a given position.\n """\n _api.check_isinstance(Cell, cell=cell)\n try:\n row, col = position[0], position[1]\n except Exception as err:\n raise KeyError('Only tuples length 2 are accepted as '\n 'coordinates') from err\n cell.set_figure(self.get_figure(root=False))\n cell.set_transform(self.get_transform())\n cell.set_clip_on(False)\n self._cells[row, col] = cell\n self.stale = True\n\n def __getitem__(self, position):\n """Retrieve a custom cell from a given position."""\n return self._cells[position]\n\n @property\n def edges(self):\n """\n The default value of `~.Cell.visible_edges` for newly added\n cells using `.add_cell`.\n\n Notes\n -----\n This setting does currently only affect newly created cells using\n `.add_cell`.\n\n To change existing cells, you have to set their edges explicitly::\n\n for c in 
tab.get_celld().values():\n c.visible_edges = 'horizontal'\n\n """\n return self._edges\n\n @edges.setter\n def edges(self, value):\n self._edges = value\n self.stale = True\n\n def _approx_text_height(self):\n return (self.FONTSIZE / 72.0 * self.get_figure(root=True).dpi /\n self._axes.bbox.height * 1.2)\n\n @allow_rasterization\n def draw(self, renderer):\n # docstring inherited\n\n # Need a renderer to do hit tests on mouseevent; assume the last one\n # will do\n if renderer is None:\n renderer = self.get_figure(root=True)._get_renderer()\n if renderer is None:\n raise RuntimeError('No renderer defined')\n\n if not self.get_visible():\n return\n renderer.open_group('table', gid=self.get_gid())\n self._update_positions(renderer)\n\n for key in sorted(self._cells):\n self._cells[key].draw(renderer)\n\n renderer.close_group('table')\n self.stale = False\n\n def _get_grid_bbox(self, renderer):\n """\n Get a bbox, in axes coordinates for the cells.\n\n Only include those in the range (0, 0) to (maxRow, maxCol).\n """\n boxes = [cell.get_window_extent(renderer)\n for (row, col), cell in self._cells.items()\n if row >= 0 and col >= 0]\n bbox = Bbox.union(boxes)\n return bbox.transformed(self.get_transform().inverted())\n\n def contains(self, mouseevent):\n # docstring inherited\n if self._different_canvas(mouseevent):\n return False, {}\n # TODO: Return index of the cell containing the cursor so that the user\n # doesn't have to bind to each one individually.\n renderer = self.get_figure(root=True)._get_renderer()\n if renderer is not None:\n boxes = [cell.get_window_extent(renderer)\n for (row, col), cell in self._cells.items()\n if row >= 0 and col >= 0]\n bbox = Bbox.union(boxes)\n return bbox.contains(mouseevent.x, mouseevent.y), {}\n else:\n return False, {}\n\n def get_children(self):\n """Return the Artists contained by the table."""\n return list(self._cells.values())\n\n def get_window_extent(self, renderer=None):\n # docstring inherited\n if renderer is 
None:\n renderer = self.get_figure(root=True)._get_renderer()\n self._update_positions(renderer)\n boxes = [cell.get_window_extent(renderer)\n for cell in self._cells.values()]\n return Bbox.union(boxes)\n\n def _do_cell_alignment(self):\n """\n Calculate row heights and column widths; position cells accordingly.\n """\n # Calculate row/column widths\n widths = {}\n heights = {}\n for (row, col), cell in self._cells.items():\n height = heights.setdefault(row, 0.0)\n heights[row] = max(height, cell.get_height())\n width = widths.setdefault(col, 0.0)\n widths[col] = max(width, cell.get_width())\n\n # work out left position for each column\n xpos = 0\n lefts = {}\n for col in sorted(widths):\n lefts[col] = xpos\n xpos += widths[col]\n\n ypos = 0\n bottoms = {}\n for row in sorted(heights, reverse=True):\n bottoms[row] = ypos\n ypos += heights[row]\n\n # set cell positions\n for (row, col), cell in self._cells.items():\n cell.set_x(lefts[col])\n cell.set_y(bottoms[row])\n\n def auto_set_column_width(self, col):\n """\n Automatically set the widths of given columns to optimal sizes.\n\n Parameters\n ----------\n col : int or sequence of ints\n The indices of the columns to auto-scale.\n """\n col1d = np.atleast_1d(col)\n if not np.issubdtype(col1d.dtype, np.integer):\n raise TypeError("col must be an int or sequence of ints.")\n for cell in col1d:\n self._autoColumns.append(cell)\n\n self.stale = True\n\n def _auto_set_column_width(self, col, renderer):\n """Automatically set width for column."""\n cells = [cell for key, cell in self._cells.items() if key[1] == col]\n max_width = max((cell.get_required_width(renderer) for cell in cells),\n default=0)\n for cell in cells:\n cell.set_width(max_width)\n\n def auto_set_font_size(self, value=True):\n """Automatically set font size."""\n self._autoFontsize = value\n self.stale = True\n\n def _auto_set_font_size(self, renderer):\n\n if len(self._cells) == 0:\n return\n fontsize = 
next(iter(self._cells.values())).get_fontsize()\n cells = []\n for key, cell in self._cells.items():\n # ignore auto-sized columns\n if key[1] in self._autoColumns:\n continue\n size = cell.auto_set_font_size(renderer)\n fontsize = min(fontsize, size)\n cells.append(cell)\n\n # now set all fontsizes equal\n for cell in self._cells.values():\n cell.set_fontsize(fontsize)\n\n def scale(self, xscale, yscale):\n """Scale column widths by *xscale* and row heights by *yscale*."""\n for c in self._cells.values():\n c.set_width(c.get_width() * xscale)\n c.set_height(c.get_height() * yscale)\n\n def set_fontsize(self, size):\n """\n Set the font size, in points, of the cell text.\n\n Parameters\n ----------\n size : float\n\n Notes\n -----\n As long as auto font size has not been disabled, the value will be\n clipped such that the text fits horizontally into the cell.\n\n You can disable this behavior using `.auto_set_font_size`.\n\n >>> the_table.auto_set_font_size(False)\n >>> the_table.set_fontsize(20)\n\n However, there is no automatic scaling of the row height so that the\n text may exceed the cell boundary.\n """\n for cell in self._cells.values():\n cell.set_fontsize(size)\n self.stale = True\n\n def _offset(self, ox, oy):\n """Move all the artists by ox, oy (axes coords)."""\n for c in self._cells.values():\n x, y = c.get_x(), c.get_y()\n c.set_x(x + ox)\n c.set_y(y + oy)\n\n def _update_positions(self, renderer):\n # called from renderer to allow more precise estimates of\n # widths and heights with get_window_extent\n\n # Do any auto width setting\n for col in self._autoColumns:\n self._auto_set_column_width(col, renderer)\n\n if self._autoFontsize:\n self._auto_set_font_size(renderer)\n\n # Align all the cells\n self._do_cell_alignment()\n\n bbox = self._get_grid_bbox(renderer)\n l, b, w, h = bbox.bounds\n\n if self._bbox is not None:\n # Position according to bbox\n if isinstance(self._bbox, Bbox):\n rl, rb, rw, rh = self._bbox.bounds\n else:\n rl, rb, rw, rh = 
self._bbox\n self.scale(rw / w, rh / h)\n ox = rl - l\n oy = rb - b\n self._do_cell_alignment()\n else:\n # Position using loc\n (BEST, UR, UL, LL, LR, CL, CR, LC, UC, C,\n TR, TL, BL, BR, R, L, T, B) = range(len(self.codes))\n # defaults for center\n ox = (0.5 - w / 2) - l\n oy = (0.5 - h / 2) - b\n if self._loc in (UL, LL, CL): # left\n ox = self.AXESPAD - l\n if self._loc in (BEST, UR, LR, R, CR): # right\n ox = 1 - (l + w + self.AXESPAD)\n if self._loc in (BEST, UR, UL, UC): # upper\n oy = 1 - (b + h + self.AXESPAD)\n if self._loc in (LL, LR, LC): # lower\n oy = self.AXESPAD - b\n if self._loc in (LC, UC, C): # center x\n ox = (0.5 - w / 2) - l\n if self._loc in (CL, CR, C): # center y\n oy = (0.5 - h / 2) - b\n\n if self._loc in (TL, BL, L): # out left\n ox = - (l + w)\n if self._loc in (TR, BR, R): # out right\n ox = 1.0 - l\n if self._loc in (TR, TL, T): # out top\n oy = 1.0 - b\n if self._loc in (BL, BR, B): # out bottom\n oy = - (b + h)\n\n self._offset(ox, oy)\n\n def get_celld(self):\n r"""\n Return a dict of cells in the table mapping *(row, column)* to\n `.Cell`\s.\n\n Notes\n -----\n You can also directly index into the Table object to access individual\n cells::\n\n cell = table[row, col]\n\n """\n return self._cells\n\n\n@_docstring.interpd\ndef table(ax,\n cellText=None, cellColours=None,\n cellLoc='right', colWidths=None,\n rowLabels=None, rowColours=None, rowLoc='left',\n colLabels=None, colColours=None, colLoc='center',\n loc='bottom', bbox=None, edges='closed',\n **kwargs):\n """\n Add a table to an `~.axes.Axes`.\n\n At least one of *cellText* or *cellColours* must be specified. These\n parameters must be 2D lists, in which the outer lists define the rows and\n the inner list define the column values per row. 
Each row must have the\n same number of elements.\n\n The table can optionally have row and column headers, which are configured\n using *rowLabels*, *rowColours*, *rowLoc* and *colLabels*, *colColours*,\n *colLoc* respectively.\n\n For finer grained control over tables, use the `.Table` class and add it to\n the Axes with `.Axes.add_table`.\n\n Parameters\n ----------\n cellText : 2D list of str or pandas.DataFrame, optional\n The texts to place into the table cells.\n\n *Note*: Line breaks in the strings are currently not accounted for and\n will result in the text exceeding the cell boundaries.\n\n cellColours : 2D list of :mpltype:`color`, optional\n The background colors of the cells.\n\n cellLoc : {'right', 'center', 'left'}\n The alignment of the text within the cells.\n\n colWidths : list of float, optional\n The column widths in units of the axes. If not given, all columns will\n have a width of *1 / ncols*.\n\n rowLabels : list of str, optional\n The text of the row header cells.\n\n rowColours : list of :mpltype:`color`, optional\n The colors of the row header cells.\n\n rowLoc : {'left', 'center', 'right'}\n The text alignment of the row header cells.\n\n colLabels : list of str, optional\n The text of the column header cells.\n\n colColours : list of :mpltype:`color`, optional\n The colors of the column header cells.\n\n colLoc : {'center', 'left', 'right'}\n The text alignment of the column header cells.\n\n loc : str, default: 'bottom'\n The position of the cell with respect to *ax*. This must be one of\n the `~.Table.codes`.\n\n bbox : `.Bbox` or [xmin, ymin, width, height], optional\n A bounding box to draw the table into. If this is not *None*, this\n overrides *loc*.\n\n edges : {'closed', 'open', 'horizontal', 'vertical'} or substring of 'BRTL'\n The cell edges to be drawn with a line. 
See also\n `~.Cell.visible_edges`.\n\n Returns\n -------\n `~matplotlib.table.Table`\n The created table.\n\n Other Parameters\n ----------------\n **kwargs\n `.Table` properties.\n\n %(Table:kwdoc)s\n """\n\n if cellColours is None and cellText is None:\n raise ValueError('At least one argument from "cellColours" or '\n '"cellText" must be provided to create a table.')\n\n # Check we have some cellText\n if cellText is None:\n # assume just colours are needed\n rows = len(cellColours)\n cols = len(cellColours[0])\n cellText = [[''] * cols] * rows\n\n # Check if we have a Pandas DataFrame\n if _is_pandas_dataframe(cellText):\n # if rowLabels/colLabels are empty, use DataFrame entries.\n # Otherwise, throw an error.\n if rowLabels is None:\n rowLabels = cellText.index\n else:\n raise ValueError("rowLabels cannot be used alongside Pandas DataFrame")\n if colLabels is None:\n colLabels = cellText.columns\n else:\n raise ValueError("colLabels cannot be used alongside Pandas DataFrame")\n # Update cellText with only values\n cellText = cellText.values\n\n rows = len(cellText)\n cols = len(cellText[0])\n for row in cellText:\n if len(row) != cols:\n raise ValueError(f"Each row in 'cellText' must have {cols} "\n "columns")\n\n if cellColours is not None:\n if len(cellColours) != rows:\n raise ValueError(f"'cellColours' must have {rows} rows")\n for row in cellColours:\n if len(row) != cols:\n raise ValueError("Each row in 'cellColours' must have "\n f"{cols} columns")\n else:\n cellColours = ['w' * cols] * rows\n\n # Set colwidths if not given\n if colWidths is None:\n colWidths = [1.0 / cols] * cols\n\n # Fill in missing information for column\n # and row labels\n rowLabelWidth = 0\n if rowLabels is None:\n if rowColours is not None:\n rowLabels = [''] * rows\n rowLabelWidth = colWidths[0]\n elif rowColours is None:\n rowColours = 'w' * rows\n\n if rowLabels is not None:\n if len(rowLabels) != rows:\n raise ValueError(f"'rowLabels' must be of length {rows}")\n\n # If we 
have column labels, need to shift\n # the text and colour arrays down 1 row\n offset = 1\n if colLabels is None:\n if colColours is not None:\n colLabels = [''] * cols\n else:\n offset = 0\n elif colColours is None:\n colColours = 'w' * cols\n\n # Set up cell colours if not given\n if cellColours is None:\n cellColours = ['w' * cols] * rows\n\n # Now create the table\n table = Table(ax, loc, bbox, **kwargs)\n table.edges = edges\n height = table._approx_text_height()\n\n # Add the cells\n for row in range(rows):\n for col in range(cols):\n table.add_cell(row + offset, col,\n width=colWidths[col], height=height,\n text=cellText[row][col],\n facecolor=cellColours[row][col],\n loc=cellLoc)\n # Do column labels\n if colLabels is not None:\n for col in range(cols):\n table.add_cell(0, col,\n width=colWidths[col], height=height,\n text=colLabels[col], facecolor=colColours[col],\n loc=colLoc)\n\n # Do row labels\n if rowLabels is not None:\n for row in range(rows):\n table.add_cell(row + offset, -1,\n width=rowLabelWidth or 1e-15, height=height,\n text=rowLabels[row], facecolor=rowColours[row],\n loc=rowLoc)\n if rowLabelWidth == 0:\n table.auto_set_column_width(-1)\n\n # set_fontsize is only effective after cells are added\n if "fontsize" in kwargs:\n table.set_fontsize(kwargs["fontsize"])\n\n ax.add_table(table)\n return table\n | .venv\Lib\site-packages\matplotlib\table.py | table.py | Python | 27,744 | 0.95 | 0.169031 | 0.082504 | vue-tools | 604 | 2023-11-17T03:58:54.177734 | MIT | false | e1e0c7e1ebcf2aca18b79751fd022161 |
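The `table` factory above is most often reached through `Axes.table`, which forwards to it. A minimal sketch (cell texts and labels are arbitrary):

```python
import matplotlib
matplotlib.use("Agg")  # headless backend
import matplotlib.pyplot as plt

fig, ax = plt.subplots()
ax.axis("off")

# With colLabels present, the factory shifts data cells down one row:
# column headers sit in row 0, row labels in column -1, data from (1, 0).
tab = ax.table(cellText=[["1.1", "1.2"], ["2.1", "2.2"]],
               rowLabels=["r0", "r1"],
               colLabels=["c0", "c1"],
               loc="center")

cells = tab.get_celld()
print(cells[1, 0].get_text().get_text())  # -> 1.1, the first data cell
plt.close(fig)
```

Indexing the table directly (`tab[1, 0]`) is equivalent to going through `get_celld()`, per the `Table.__getitem__` implementation above.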
from .artist import Artist\nfrom .axes import Axes\nfrom .backend_bases import RendererBase\nfrom .patches import Rectangle\nfrom .path import Path\nfrom .text import Text\nfrom .transforms import Bbox\nfrom .typing import ColorType\n\nfrom collections.abc import Sequence\nfrom typing import Any, Literal, TYPE_CHECKING\n\nfrom pandas import DataFrame\n\nclass Cell(Rectangle):\n PAD: float\n def __init__(\n self,\n xy: tuple[float, float],\n width: float,\n height: float,\n *,\n edgecolor: ColorType = ...,\n facecolor: ColorType = ...,\n fill: bool = ...,\n text: str = ...,\n loc: Literal["left", "center", "right"] = ...,\n fontproperties: dict[str, Any] | None = ...,\n visible_edges: str | None = ...\n ) -> None: ...\n def get_text(self) -> Text: ...\n def set_fontsize(self, size: float) -> None: ...\n def get_fontsize(self) -> float: ...\n def auto_set_font_size(self, renderer: RendererBase) -> float: ...\n def get_text_bounds(\n self, renderer: RendererBase\n ) -> tuple[float, float, float, float]: ...\n def get_required_width(self, renderer: RendererBase) -> float: ...\n def set_text_props(self, **kwargs) -> None: ...\n @property\n def visible_edges(self) -> str: ...\n @visible_edges.setter\n def visible_edges(self, value: str | None) -> None: ...\n def get_path(self) -> Path: ...\n\nCustomCell = Cell\n\nclass Table(Artist):\n codes: dict[str, int]\n FONTSIZE: float\n AXESPAD: float\n def __init__(\n self, ax: Axes, loc: str | None = ..., bbox: Bbox | None = ..., **kwargs\n ) -> None: ...\n def add_cell(self, row: int, col: int, *args, **kwargs) -> Cell: ...\n def __setitem__(self, position: tuple[int, int], cell: Cell) -> None: ...\n def __getitem__(self, position: tuple[int, int]) -> Cell: ...\n @property\n def edges(self) -> str | None: ...\n @edges.setter\n def edges(self, value: str | None) -> None: ...\n def draw(self, renderer) -> None: ...\n def get_children(self) -> list[Artist]: ...\n def get_window_extent(self, renderer: RendererBase | None = ...) 
-> Bbox: ...\n def auto_set_column_width(self, col: int | Sequence[int]) -> None: ...\n def auto_set_font_size(self, value: bool = ...) -> None: ...\n def scale(self, xscale: float, yscale: float) -> None: ...\n def set_fontsize(self, size: float) -> None: ...\n def get_celld(self) -> dict[tuple[int, int], Cell]: ...\n\ndef table(\n ax: Axes,\n cellText: Sequence[Sequence[str]] | DataFrame | None = ...,\n cellColours: Sequence[Sequence[ColorType]] | None = ...,\n cellLoc: Literal["left", "center", "right"] = ...,\n colWidths: Sequence[float] | None = ...,\n rowLabels: Sequence[str] | None = ...,\n rowColours: Sequence[ColorType] | None = ...,\n rowLoc: Literal["left", "center", "right"] = ...,\n colLabels: Sequence[str] | None = ...,\n colColours: Sequence[ColorType] | None = ...,\n colLoc: Literal["left", "center", "right"] = ...,\n loc: str = ...,\n bbox: Bbox | None = ...,\n edges: str = ...,\n **kwargs\n) -> Table: ...\n | .venv\Lib\site-packages\matplotlib\table.pyi | table.pyi | Other | 3,098 | 0.85 | 0.321839 | 0.024691 | react-lib | 664 | 2025-02-19T12:45:39.472043 | GPL-3.0 | false | b386bb5d4d6363fac143686a692a3a24 |
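The layout pass in `Table._do_cell_alignment` reduces to per-column and per-row maxima followed by running offsets. A standalone sketch of that logic; the dict-of-sizes representation and the `layout_cells` name are illustrative assumptions, not matplotlib API:

```python
def layout_cells(sizes):
    """Given {(row, col): (width, height)}, return {(row, col): (x, y)},
    placing cells the way Table._do_cell_alignment does: column widths and
    row heights are per-column / per-row maxima, and rows stack downward."""
    widths, heights = {}, {}
    for (row, col), (w, h) in sizes.items():
        widths[col] = max(widths.get(col, 0.0), w)
        heights[row] = max(heights.get(row, 0.0), h)

    # Left edge of each column is the running sum of widths to its left.
    lefts, xpos = {}, 0.0
    for col in sorted(widths):
        lefts[col] = xpos
        xpos += widths[col]

    # Rows are iterated in reverse so row 0 ends up on top.
    bottoms, ypos = {}, 0.0
    for row in sorted(heights, reverse=True):
        bottoms[row] = ypos
        ypos += heights[row]

    return {(r, c): (lefts[c], bottoms[r]) for (r, c) in sizes}

pos = layout_cells({(0, 0): (2, 1), (0, 1): (3, 1),
                    (1, 0): (2, 2), (1, 1): (1, 2)})
print(pos[(0, 0)])  # -> (0.0, 2.0): row 0 sits above row 1, which is 2 tall
```

Because each column takes the maximum cell width, a single wide cell widens its whole column, which is exactly why `_auto_set_column_width` only needs to set one shared width per column.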
r"""\nSupport for embedded TeX expressions in Matplotlib.\n\nRequirements:\n\n* LaTeX.\n* \*Agg backends: dvipng>=1.6.\n* PS backend: PSfrag, dvips, and Ghostscript>=9.0.\n* PDF and SVG backends: if LuaTeX is present, it will be used to speed up some\n post-processing steps, but note that it is not used to parse the TeX string\n itself (only LaTeX is supported).\n\nTo enable TeX rendering of all text in your Matplotlib figure, set\n:rc:`text.usetex` to True.\n\nTeX and dvipng/dvips processing results are cached\nin ~/.matplotlib/tex.cache for reuse between sessions.\n\n`TexManager.get_rgba` can also be used to directly obtain raster output as RGBA\nNumPy arrays.\n"""\n\nimport functools\nimport hashlib\nimport logging\nimport os\nfrom pathlib import Path\nimport subprocess\nfrom tempfile import TemporaryDirectory\n\nimport numpy as np\n\nimport matplotlib as mpl\nfrom matplotlib import cbook, dviread\n\n_log = logging.getLogger(__name__)\n\n\ndef _usepackage_if_not_loaded(package, *, option=None):\n """\n Output LaTeX code that loads a package (possibly with an option) if it\n hasn't been loaded yet.\n\n LaTeX cannot load twice a package with different options, so this helper\n can be used to protect against users loading arbitrary packages/options in\n their custom preamble.\n """\n option = f"[{option}]" if option is not None else ""\n return (\n r"\makeatletter"\n r"\@ifpackageloaded{%(package)s}{}{\usepackage%(option)s{%(package)s}}"\n r"\makeatother"\n ) % {"package": package, "option": option}\n\n\nclass TexManager:\n """\n Convert strings to dvi files using TeX, caching the results to a directory.\n\n The cache directory is called ``tex.cache`` and is located in the directory\n returned by `.get_cachedir`.\n\n Repeated calls to this constructor always return the same instance.\n """\n\n _texcache = os.path.join(mpl.get_cachedir(), 'tex.cache')\n _grey_arrayd = {}\n\n _font_families = ('serif', 'sans-serif', 'cursive', 'monospace')\n _font_preambles = {\n 
'new century schoolbook': r'\renewcommand{\rmdefault}{pnc}',\n 'bookman': r'\renewcommand{\rmdefault}{pbk}',\n 'times': r'\usepackage{mathptmx}',\n 'palatino': r'\usepackage{mathpazo}',\n 'zapf chancery': r'\usepackage{chancery}',\n 'cursive': r'\usepackage{chancery}',\n 'charter': r'\usepackage{charter}',\n 'serif': '',\n 'sans-serif': '',\n 'helvetica': r'\usepackage{helvet}',\n 'avant garde': r'\usepackage{avant}',\n 'courier': r'\usepackage{courier}',\n # Loading the type1ec package ensures that cm-super is installed, which\n # is necessary for Unicode computer modern. (It also allows the use of\n # computer modern at arbitrary sizes, but that's just a side effect.)\n 'monospace': r'\usepackage{type1ec}',\n 'computer modern roman': r'\usepackage{type1ec}',\n 'computer modern sans serif': r'\usepackage{type1ec}',\n 'computer modern typewriter': r'\usepackage{type1ec}',\n }\n _font_types = {\n 'new century schoolbook': 'serif',\n 'bookman': 'serif',\n 'times': 'serif',\n 'palatino': 'serif',\n 'zapf chancery': 'cursive',\n 'charter': 'serif',\n 'helvetica': 'sans-serif',\n 'avant garde': 'sans-serif',\n 'courier': 'monospace',\n 'computer modern roman': 'serif',\n 'computer modern sans serif': 'sans-serif',\n 'computer modern typewriter': 'monospace',\n }\n\n @functools.lru_cache # Always return the same instance.\n def __new__(cls):\n Path(cls._texcache).mkdir(parents=True, exist_ok=True)\n return object.__new__(cls)\n\n @classmethod\n def _get_font_family_and_reduced(cls):\n """Return the font family name and whether the font is reduced."""\n ff = mpl.rcParams['font.family']\n ff_val = ff[0].lower() if len(ff) == 1 else None\n if len(ff) == 1 and ff_val in cls._font_families:\n return ff_val, False\n elif len(ff) == 1 and ff_val in cls._font_preambles:\n return cls._font_types[ff_val], True\n else:\n _log.info('font.family must be one of (%s) when text.usetex is '\n 'True. 
serif will be used by default.',\n ', '.join(cls._font_families))\n return 'serif', False\n\n @classmethod\n def _get_font_preamble_and_command(cls):\n requested_family, is_reduced_font = cls._get_font_family_and_reduced()\n\n preambles = {}\n for font_family in cls._font_families:\n if is_reduced_font and font_family == requested_family:\n preambles[font_family] = cls._font_preambles[\n mpl.rcParams['font.family'][0].lower()]\n else:\n rcfonts = mpl.rcParams[f"font.{font_family}"]\n for i, font in enumerate(map(str.lower, rcfonts)):\n if font in cls._font_preambles:\n preambles[font_family] = cls._font_preambles[font]\n _log.debug(\n 'family: %s, package: %s, font: %s, skipped: %s',\n font_family, cls._font_preambles[font], rcfonts[i],\n ', '.join(rcfonts[:i]),\n )\n break\n else:\n _log.info('No LaTeX-compatible font found for the %s font '\n 'family in rcParams. Using default.',\n font_family)\n preambles[font_family] = cls._font_preambles[font_family]\n\n # The following packages and commands need to be included in the latex\n # file's preamble:\n cmd = {preambles[family]\n for family in ['serif', 'sans-serif', 'monospace']}\n if requested_family == 'cursive':\n cmd.add(preambles['cursive'])\n cmd.add(r'\usepackage{type1cm}')\n preamble = '\n'.join(sorted(cmd))\n fontcmd = (r'\sffamily' if requested_family == 'sans-serif' else\n r'\ttfamily' if requested_family == 'monospace' else\n r'\rmfamily')\n return preamble, fontcmd\n\n @classmethod\n def get_basefile(cls, tex, fontsize, dpi=None):\n """\n Return a filename based on a hash of the string, fontsize, and dpi.\n """\n src = cls._get_tex_source(tex, fontsize) + str(dpi)\n filehash = hashlib.sha256(\n src.encode('utf-8'),\n usedforsecurity=False\n ).hexdigest()\n filepath = Path(cls._texcache)\n\n num_letters, num_levels = 2, 2\n for i in range(0, num_letters*num_levels, num_letters):\n filepath = filepath / Path(filehash[i:i+2])\n\n filepath.mkdir(parents=True, exist_ok=True)\n return os.path.join(filepath, 
filehash)\n\n @classmethod\n def get_font_preamble(cls):\n """\n Return a string containing font configuration for the tex preamble.\n """\n font_preamble, command = cls._get_font_preamble_and_command()\n return font_preamble\n\n @classmethod\n def get_custom_preamble(cls):\n """Return a string containing user additions to the tex preamble."""\n return mpl.rcParams['text.latex.preamble']\n\n @classmethod\n def _get_tex_source(cls, tex, fontsize):\n """Return the complete TeX source for processing a TeX string."""\n font_preamble, fontcmd = cls._get_font_preamble_and_command()\n baselineskip = 1.25 * fontsize\n return "\n".join([\n r"\documentclass{article}",\n r"% Pass-through \mathdefault, which is used in non-usetex mode",\n r"% to use the default text font but was historically suppressed",\n r"% in usetex mode.",\n r"\newcommand{\mathdefault}[1]{#1}",\n font_preamble,\n r"\usepackage[utf8]{inputenc}",\n r"\DeclareUnicodeCharacter{2212}{\ensuremath{-}}",\n r"% geometry is loaded before the custom preamble as ",\n r"% convert_psfrags relies on a custom preamble to change the ",\n r"% geometry.",\n r"\usepackage[papersize=72in, margin=1in]{geometry}",\n cls.get_custom_preamble(),\n r"% Use the `underscore` package to take care of underscores in text.",\n r"% The [strings] option allows the use of underscores in file names.",\n _usepackage_if_not_loaded("underscore", option="strings"),\n r"% Custom packages (e.g. 
newtxtext) may already have loaded ",\n r"% textcomp with different options.",\n _usepackage_if_not_loaded("textcomp"),\n r"\pagestyle{empty}",\n r"\begin{document}",\n r"% The empty hbox ensures that a page is printed even for empty",\n r"% inputs, except when using psfrag which gets confused by it.",\n r"% matplotlibbaselinemarker is used by dviread to detect the",\n r"% last line's baseline.",\n rf"\fontsize{{{fontsize}}}{{{baselineskip}}}%",\n r"\ifdefined\psfrag\else\hbox{}\fi%",\n rf"{{{fontcmd} {tex}}}%",\n r"\end{document}",\n ])\n\n @classmethod\n def make_tex(cls, tex, fontsize):\n """\n Generate a tex file to render the tex string at a specific font size.\n\n Return the file name.\n """\n texfile = cls.get_basefile(tex, fontsize) + ".tex"\n Path(texfile).write_text(cls._get_tex_source(tex, fontsize),\n encoding='utf-8')\n return texfile\n\n @classmethod\n def _run_checked_subprocess(cls, command, tex, *, cwd=None):\n _log.debug(cbook._pformat_subprocess(command))\n try:\n report = subprocess.check_output(\n command, cwd=cwd if cwd is not None else cls._texcache,\n stderr=subprocess.STDOUT)\n except FileNotFoundError as exc:\n raise RuntimeError(\n f'Failed to process string with tex because {command[0]} '\n 'could not be found') from exc\n except subprocess.CalledProcessError as exc:\n raise RuntimeError(\n '{prog} was not able to process the following string:\n'\n '{tex!r}\n\n'\n 'Here is the full command invocation and its output:\n\n'\n '{format_command}\n\n'\n '{exc}\n\n'.format(\n prog=command[0],\n format_command=cbook._pformat_subprocess(command),\n tex=tex.encode('unicode_escape'),\n exc=exc.output.decode('utf-8', 'backslashreplace'))\n ) from None\n _log.debug(report)\n return report\n\n @classmethod\n def make_dvi(cls, tex, fontsize):\n """\n Generate a dvi file containing latex's layout of tex string.\n\n Return the file name.\n """\n basefile = cls.get_basefile(tex, fontsize)\n dvifile = '%s.dvi' % basefile\n if not os.path.exists(dvifile):\n 
texfile = Path(cls.make_tex(tex, fontsize))\n # Generate the dvi in a temporary directory to avoid race\n # conditions e.g. if multiple processes try to process the same tex\n # string at the same time. Having tmpdir be a subdirectory of the\n # final output dir ensures that they are on the same filesystem,\n # and thus replace() works atomically. It also allows referring to\n # the texfile with a relative path (for pathological MPLCONFIGDIRs,\n # the absolute path may contain characters (e.g. ~) that TeX does\n # not support; n.b. relative paths cannot traverse parents, or it\n # will be blocked when `openin_any = p` in texmf.cnf).\n cwd = Path(dvifile).parent\n with TemporaryDirectory(dir=cwd) as tmpdir:\n tmppath = Path(tmpdir)\n cls._run_checked_subprocess(\n ["latex", "-interaction=nonstopmode", "--halt-on-error",\n f"--output-directory={tmppath.name}",\n f"{texfile.name}"], tex, cwd=cwd)\n (tmppath / Path(dvifile).name).replace(dvifile)\n return dvifile\n\n @classmethod\n def make_png(cls, tex, fontsize, dpi):\n """\n Generate a png file containing latex's rendering of tex string.\n\n Return the file name.\n """\n basefile = cls.get_basefile(tex, fontsize, dpi)\n pngfile = '%s.png' % basefile\n # see get_rgba for a discussion of the background\n if not os.path.exists(pngfile):\n dvifile = cls.make_dvi(tex, fontsize)\n cmd = ["dvipng", "-bg", "Transparent", "-D", str(dpi),\n "-T", "tight", "-o", pngfile, dvifile]\n # When testing, disable FreeType rendering for reproducibility; but\n # dvipng 1.16 has a bug (fixed in f3ff241) that breaks --freetype0\n # mode, so for it we keep FreeType enabled; the image will be\n # slightly off.\n if (getattr(mpl, "_called_from_pytest", False) and\n mpl._get_executable_info("dvipng").raw_version != "1.16"):\n cmd.insert(1, "--freetype0")\n cls._run_checked_subprocess(cmd, tex)\n return pngfile\n\n @classmethod\n def get_grey(cls, tex, fontsize=None, dpi=None):\n """Return the alpha channel."""\n if not fontsize:\n fontsize = 
mpl.rcParams['font.size']\n if not dpi:\n dpi = mpl.rcParams['savefig.dpi']\n key = cls._get_tex_source(tex, fontsize), dpi\n alpha = cls._grey_arrayd.get(key)\n if alpha is None:\n pngfile = cls.make_png(tex, fontsize, dpi)\n rgba = mpl.image.imread(os.path.join(cls._texcache, pngfile))\n cls._grey_arrayd[key] = alpha = rgba[:, :, -1]\n return alpha\n\n @classmethod\n def get_rgba(cls, tex, fontsize=None, dpi=None, rgb=(0, 0, 0)):\n r"""\n Return latex's rendering of the tex string as an RGBA array.\n\n Examples\n --------\n >>> texmanager = TexManager()\n >>> s = r"\TeX\ is $\displaystyle\sum_n\frac{-e^{i\pi}}{2^n}$!"\n >>> Z = texmanager.get_rgba(s, fontsize=12, dpi=80, rgb=(1, 0, 0))\n """\n alpha = cls.get_grey(tex, fontsize, dpi)\n rgba = np.empty((*alpha.shape, 4))\n rgba[..., :3] = mpl.colors.to_rgb(rgb)\n rgba[..., -1] = alpha\n return rgba\n\n @classmethod\n def get_text_width_height_descent(cls, tex, fontsize, renderer=None):\n """Return width, height and descent of the text."""\n if tex.strip() == '':\n return 0, 0, 0\n dvifile = cls.make_dvi(tex, fontsize)\n dpi_fraction = renderer.points_to_pixels(1.) if renderer else 1\n with dviread.Dvi(dvifile, 72 * dpi_fraction) as dvi:\n page, = dvi\n # A total height (including the descent) needs to be returned.\n return page.width, page.height + page.descent, page.descent\n | .venv\Lib\site-packages\matplotlib\texmanager.py | texmanager.py | Python | 15,033 | 0.95 | 0.144022 | 0.073171 | react-lib | 855 | 2024-10-31T02:06:38.029939 | MIT | false | b6d8edb9b9ca037927dfccc0df928fb7 |
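The hashing-and-sharding scheme in `TexManager.get_basefile` above can be sketched with the standard library alone. The helper name `cache_path_for` and the fixed `cache_root` default are illustrative, not Matplotlib API; the real method additionally passes `usedforsecurity=False` to `hashlib.sha256` and creates the directories as a side effect, both omitted here to keep the sketch pure.

```python
import hashlib
import os


def cache_path_for(tex_source, dpi, cache_root="tex.cache"):
    """Sketch of the sharding scheme in TexManager.get_basefile: hash the
    full TeX source plus the dpi, then nest the cache file two directory
    levels deep (two hex characters per level) so that no single
    directory accumulates an unbounded number of entries."""
    filehash = hashlib.sha256(
        (tex_source + str(dpi)).encode("utf-8")).hexdigest()
    num_letters, num_levels = 2, 2
    shards = [filehash[i:i + num_letters]
              for i in range(0, num_letters * num_levels, num_letters)]
    return os.path.join(cache_root, *shards, filehash)


# Identical inputs map to the identical cache path; any change to the
# source or the dpi moves the entry elsewhere, so stale results are
# never reused.
path = cache_path_for(r"$\sum_n 2^{-n}$", dpi=100)
```

Because the key is derived from the complete generated TeX source (which embeds the font preamble and custom preamble), changing any rcParam that affects rendering automatically invalidates the cache entry.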
from .backend_bases import RendererBase\n\nfrom matplotlib.typing import ColorType\n\nimport numpy as np\n\nclass TexManager:\n texcache: str\n @classmethod\n def get_basefile(\n cls, tex: str, fontsize: float, dpi: float | None = ...\n ) -> str: ...\n @classmethod\n def get_font_preamble(cls) -> str: ...\n @classmethod\n def get_custom_preamble(cls) -> str: ...\n @classmethod\n def make_tex(cls, tex: str, fontsize: float) -> str: ...\n @classmethod\n def make_dvi(cls, tex: str, fontsize: float) -> str: ...\n @classmethod\n def make_png(cls, tex: str, fontsize: float, dpi: float) -> str: ...\n @classmethod\n def get_grey(\n cls, tex: str, fontsize: float | None = ..., dpi: float | None = ...\n ) -> np.ndarray: ...\n @classmethod\n def get_rgba(\n cls,\n tex: str,\n fontsize: float | None = ...,\n dpi: float | None = ...,\n rgb: ColorType = ...,\n ) -> np.ndarray: ...\n @classmethod\n def get_text_width_height_descent(\n cls, tex: str, fontsize, renderer: RendererBase | None = ...\n ) -> tuple[int, int, int]: ...\n | .venv\Lib\site-packages\matplotlib\texmanager.pyi | texmanager.pyi | Other | 1,116 | 0.85 | 0.263158 | 0 | react-lib | 240 | 2024-05-04T02:04:44.981531 | MIT | false | ca635b0c787b29553c371543a424516c |
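The stub above documents a class whose constructor always returns the same instance; in `texmanager.py` this is achieved by decorating `__new__` with `functools.lru_cache`. A minimal stand-alone sketch of that pattern (the class name `CachedSingleton` is an illustrative stand-in, not a Matplotlib class):

```python
import functools


class CachedSingleton:
    """Sketch of the singleton idiom used by TexManager: because
    functools.lru_cache memoizes __new__ on its arguments (here, just
    cls), every call to the argument-less constructor returns the
    object created on the first call."""

    @functools.lru_cache  # Always return the same instance.
    def __new__(cls):
        # One-time setup (TexManager creates its cache dir here).
        return object.__new__(cls)


a = CachedSingleton()
b = CachedSingleton()
```

Note that `__init__` would still run on every call if one were defined, which is presumably one reason `TexManager` keeps its state in class attributes and classmethods rather than instance state.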
"""\nClasses for including text in a figure.\n"""\n\nimport functools\nimport logging\nimport math\nfrom numbers import Real\nimport weakref\n\nimport numpy as np\n\nimport matplotlib as mpl\nfrom . import _api, artist, cbook, _docstring\nfrom .artist import Artist\nfrom .font_manager import FontProperties\nfrom .patches import FancyArrowPatch, FancyBboxPatch, Rectangle\nfrom .textpath import TextPath, TextToPath # noqa # Logically located here\nfrom .transforms import (\n Affine2D, Bbox, BboxBase, BboxTransformTo, IdentityTransform, Transform)\n\n\n_log = logging.getLogger(__name__)\n\n\ndef _get_textbox(text, renderer):\n """\n Calculate the bounding box of the text.\n\n The bbox position takes text rotation into account, but the width and\n height are those of the unrotated box (unlike `.Text.get_window_extent`).\n """\n # TODO : This function may move into the Text class as a method. As a\n # matter of fact, the information from the _get_textbox function\n # should be available during the Text._get_layout() call, which is\n # called within the _get_textbox. 
So, it would be better to move this\n # function into a method, with some refactoring of the _get_layout method.\n\n projected_xs = []\n projected_ys = []\n\n theta = np.deg2rad(text.get_rotation())\n tr = Affine2D().rotate(-theta)\n\n _, parts, d = text._get_layout(renderer)\n\n for t, wh, x, y in parts:\n w, h = wh\n\n xt1, yt1 = tr.transform((x, y))\n yt1 -= d\n xt2, yt2 = xt1 + w, yt1 + h\n\n projected_xs.extend([xt1, xt2])\n projected_ys.extend([yt1, yt2])\n\n xt_box, yt_box = min(projected_xs), min(projected_ys)\n w_box, h_box = max(projected_xs) - xt_box, max(projected_ys) - yt_box\n\n x_box, y_box = Affine2D().rotate(theta).transform((xt_box, yt_box))\n\n return x_box, y_box, w_box, h_box\n\n\ndef _get_text_metrics_with_cache(renderer, text, fontprop, ismath, dpi):\n """Call ``renderer.get_text_width_height_descent``, caching the results."""\n # Cached based on a copy of fontprop so that later in-place mutations of\n # the passed-in argument do not mess up the cache.\n return _get_text_metrics_with_cache_impl(\n weakref.ref(renderer), text, fontprop.copy(), ismath, dpi)\n\n\n@functools.lru_cache(4096)\ndef _get_text_metrics_with_cache_impl(\n renderer_ref, text, fontprop, ismath, dpi):\n # dpi is unused, but participates in cache invalidation (via the renderer).\n return renderer_ref().get_text_width_height_descent(text, fontprop, ismath)\n\n\n@_docstring.interpd\n@_api.define_aliases({\n "color": ["c"],\n "fontproperties": ["font", "font_properties"],\n "fontfamily": ["family"],\n "fontname": ["name"],\n "fontsize": ["size"],\n "fontstretch": ["stretch"],\n "fontstyle": ["style"],\n "fontvariant": ["variant"],\n "fontweight": ["weight"],\n "horizontalalignment": ["ha"],\n "verticalalignment": ["va"],\n "multialignment": ["ma"],\n})\nclass Text(Artist):\n """Handle storing and drawing of text in window or data coordinates."""\n\n zorder = 3\n _charsize_cache = dict()\n\n def __repr__(self):\n return f"Text({self._x}, {self._y}, {self._text!r})"\n\n def 
__init__(self,\n x=0, y=0, text='', *,\n color=None, # defaults to rc params\n verticalalignment='baseline',\n horizontalalignment='left',\n multialignment=None,\n fontproperties=None, # defaults to FontProperties()\n rotation=None,\n linespacing=None,\n rotation_mode=None,\n usetex=None, # defaults to rcParams['text.usetex']\n wrap=False,\n transform_rotates_text=False,\n parse_math=None, # defaults to rcParams['text.parse_math']\n antialiased=None, # defaults to rcParams['text.antialiased']\n **kwargs\n ):\n """\n Create a `.Text` instance at *x*, *y* with string *text*.\n\n The text is aligned relative to the anchor point (*x*, *y*) according\n to ``horizontalalignment`` (default: 'left') and ``verticalalignment``\n (default: 'baseline'). See also\n :doc:`/gallery/text_labels_and_annotations/text_alignment`.\n\n While Text accepts the 'label' keyword argument, by default it is not\n added to the handles of a legend.\n\n Valid keyword arguments are:\n\n %(Text:kwdoc)s\n """\n super().__init__()\n self._x, self._y = x, y\n self._text = ''\n self._reset_visual_defaults(\n text=text,\n color=color,\n fontproperties=fontproperties,\n usetex=usetex,\n parse_math=parse_math,\n wrap=wrap,\n verticalalignment=verticalalignment,\n horizontalalignment=horizontalalignment,\n multialignment=multialignment,\n rotation=rotation,\n transform_rotates_text=transform_rotates_text,\n linespacing=linespacing,\n rotation_mode=rotation_mode,\n antialiased=antialiased\n )\n self.update(kwargs)\n\n def _reset_visual_defaults(\n self,\n text='',\n color=None,\n fontproperties=None,\n usetex=None,\n parse_math=None,\n wrap=False,\n verticalalignment='baseline',\n horizontalalignment='left',\n multialignment=None,\n rotation=None,\n transform_rotates_text=False,\n linespacing=None,\n rotation_mode=None,\n antialiased=None\n ):\n self.set_text(text)\n self.set_color(mpl._val_or_rc(color, "text.color"))\n self.set_fontproperties(fontproperties)\n self.set_usetex(usetex)\n 
self.set_parse_math(mpl._val_or_rc(parse_math, 'text.parse_math'))\n self.set_wrap(wrap)\n self.set_verticalalignment(verticalalignment)\n self.set_horizontalalignment(horizontalalignment)\n self._multialignment = multialignment\n self.set_rotation(rotation)\n self._transform_rotates_text = transform_rotates_text\n self._bbox_patch = None # a FancyBboxPatch instance\n self._renderer = None\n if linespacing is None:\n linespacing = 1.2 # Maybe use rcParam later.\n self.set_linespacing(linespacing)\n self.set_rotation_mode(rotation_mode)\n self.set_antialiased(antialiased if antialiased is not None else\n mpl.rcParams['text.antialiased'])\n\n def update(self, kwargs):\n # docstring inherited\n ret = []\n kwargs = cbook.normalize_kwargs(kwargs, Text)\n sentinel = object() # bbox can be None, so use another sentinel.\n # Update fontproperties first, as it has lowest priority.\n fontproperties = kwargs.pop("fontproperties", sentinel)\n if fontproperties is not sentinel:\n ret.append(self.set_fontproperties(fontproperties))\n # Update bbox last, as it depends on font properties.\n bbox = kwargs.pop("bbox", sentinel)\n ret.extend(super().update(kwargs))\n if bbox is not sentinel:\n ret.append(self.set_bbox(bbox))\n return ret\n\n def __getstate__(self):\n d = super().__getstate__()\n # remove the cached _renderer (if it exists)\n d['_renderer'] = None\n return d\n\n def contains(self, mouseevent):\n """\n Return whether the mouse event occurred inside the axis-aligned\n bounding-box of the text.\n """\n if (self._different_canvas(mouseevent) or not self.get_visible()\n or self._renderer is None):\n return False, {}\n # Explicitly use Text.get_window_extent(self) and not\n # self.get_window_extent() so that Annotation.contains does not\n # accidentally cover the entire annotation bounding box.\n bbox = Text.get_window_extent(self)\n inside = (bbox.x0 <= mouseevent.x <= bbox.x1\n and bbox.y0 <= mouseevent.y <= bbox.y1)\n cattr = {}\n # if the text has a surrounding patch, 
also check containment for it,\n # and merge the results with the results for the text.\n if self._bbox_patch:\n patch_inside, patch_cattr = self._bbox_patch.contains(mouseevent)\n inside = inside or patch_inside\n cattr["bbox_patch"] = patch_cattr\n return inside, cattr\n\n def _get_xy_display(self):\n """\n Get the (possibly unit converted) transformed x, y in display coords.\n """\n x, y = self.get_unitless_position()\n return self.get_transform().transform((x, y))\n\n def _get_multialignment(self):\n if self._multialignment is not None:\n return self._multialignment\n else:\n return self._horizontalalignment\n\n def _char_index_at(self, x):\n """\n Calculate the index closest to the coordinate x in display space.\n\n The position of text[index] is assumed to be the sum of the widths\n of all preceding characters text[:index].\n\n This works only on single line texts.\n """\n if not self._text:\n return 0\n\n text = self._text\n\n fontproperties = str(self._fontproperties)\n if fontproperties not in Text._charsize_cache:\n Text._charsize_cache[fontproperties] = dict()\n\n charsize_cache = Text._charsize_cache[fontproperties]\n for char in set(text):\n if char not in charsize_cache:\n self.set_text(char)\n bb = self.get_window_extent()\n charsize_cache[char] = bb.x1 - bb.x0\n\n self.set_text(text)\n bb = self.get_window_extent()\n\n size_accum = np.cumsum([0] + [charsize_cache[x] for x in text])\n std_x = x - bb.x0\n return (np.abs(size_accum - std_x)).argmin()\n\n def get_rotation(self):\n """Return the text angle in degrees between 0 and 360."""\n if self.get_transform_rotates_text():\n return self.get_transform().transform_angles(\n [self._rotation], [self.get_unitless_position()]).item(0)\n else:\n return self._rotation\n\n def get_transform_rotates_text(self):\n """\n Return whether rotations of the transform affect the text direction.\n """\n return self._transform_rotates_text\n\n def set_rotation_mode(self, m):\n """\n Set text rotation mode.\n\n 
Parameters\n ----------\n m : {None, 'default', 'anchor'}\n If ``"default"``, the text will be first rotated, then aligned according\n to their horizontal and vertical alignments. If ``"anchor"``, then\n alignment occurs before rotation. Passing ``None`` will set the rotation\n mode to ``"default"``.\n """\n if m is None:\n m = "default"\n else:\n _api.check_in_list(("anchor", "default"), rotation_mode=m)\n self._rotation_mode = m\n self.stale = True\n\n def get_rotation_mode(self):\n """Return the text rotation mode."""\n return self._rotation_mode\n\n def set_antialiased(self, antialiased):\n """\n Set whether to use antialiased rendering.\n\n Parameters\n ----------\n antialiased : bool\n\n Notes\n -----\n Antialiasing will be determined by :rc:`text.antialiased`\n and the parameter *antialiased* will have no effect if the text contains\n math expressions.\n """\n self._antialiased = antialiased\n self.stale = True\n\n def get_antialiased(self):\n """Return whether antialiased rendering is used."""\n return self._antialiased\n\n def update_from(self, other):\n # docstring inherited\n super().update_from(other)\n self._color = other._color\n self._multialignment = other._multialignment\n self._verticalalignment = other._verticalalignment\n self._horizontalalignment = other._horizontalalignment\n self._fontproperties = other._fontproperties.copy()\n self._usetex = other._usetex\n self._rotation = other._rotation\n self._transform_rotates_text = other._transform_rotates_text\n self._picker = other._picker\n self._linespacing = other._linespacing\n self._antialiased = other._antialiased\n self.stale = True\n\n def _get_layout(self, renderer):\n """\n Return the extent (bbox) of the text together with\n multiple-alignment information. 
Note that it returns an extent\n of a rotated text when necessary.\n """\n thisx, thisy = 0.0, 0.0\n lines = self._get_wrapped_text().split("\n") # Ensures lines is not empty.\n\n ws = []\n hs = []\n xs = []\n ys = []\n\n # Full vertical extent of font, including ascenders and descenders:\n _, lp_h, lp_d = _get_text_metrics_with_cache(\n renderer, "lp", self._fontproperties,\n ismath="TeX" if self.get_usetex() else False,\n dpi=self.get_figure(root=True).dpi)\n min_dy = (lp_h - lp_d) * self._linespacing\n\n for i, line in enumerate(lines):\n clean_line, ismath = self._preprocess_math(line)\n if clean_line:\n w, h, d = _get_text_metrics_with_cache(\n renderer, clean_line, self._fontproperties,\n ismath=ismath, dpi=self.get_figure(root=True).dpi)\n else:\n w = h = d = 0\n\n # For multiline text, increase the line spacing when the text\n # net-height (excluding baseline) is larger than that of a "l"\n # (e.g., use of superscripts), which seems what TeX does.\n h = max(h, lp_h)\n d = max(d, lp_d)\n\n ws.append(w)\n hs.append(h)\n\n # Metrics of the last line that are needed later:\n baseline = (h - d) - thisy\n\n if i == 0:\n # position at baseline\n thisy = -(h - d)\n else:\n # put baseline a good distance from bottom of previous line\n thisy -= max(min_dy, (h - d) * self._linespacing)\n\n xs.append(thisx) # == 0.\n ys.append(thisy)\n\n thisy -= d\n\n # Metrics of the last line that are needed later:\n descent = d\n\n # Bounding box definition:\n width = max(ws)\n xmin = 0\n xmax = width\n ymax = 0\n ymin = ys[-1] - descent # baseline of last line minus its descent\n\n # get the rotation matrix\n M = Affine2D().rotate_deg(self.get_rotation())\n\n # now offset the individual text lines within the box\n malign = self._get_multialignment()\n if malign == 'left':\n offset_layout = [(x, y) for x, y in zip(xs, ys)]\n elif malign == 'center':\n offset_layout = [(x + width / 2 - w / 2, y)\n for x, y, w in zip(xs, ys, ws)]\n elif malign == 'right':\n offset_layout = [(x + 
width - w, y)\n for x, y, w in zip(xs, ys, ws)]\n\n # the corners of the unrotated bounding box\n corners_horiz = np.array(\n [(xmin, ymin), (xmin, ymax), (xmax, ymax), (xmax, ymin)])\n\n # now rotate the bbox\n corners_rotated = M.transform(corners_horiz)\n # compute the bounds of the rotated box\n xmin = corners_rotated[:, 0].min()\n xmax = corners_rotated[:, 0].max()\n ymin = corners_rotated[:, 1].min()\n ymax = corners_rotated[:, 1].max()\n width = xmax - xmin\n height = ymax - ymin\n\n # Now move the box to the target position offset the display\n # bbox by alignment\n halign = self._horizontalalignment\n valign = self._verticalalignment\n\n rotation_mode = self.get_rotation_mode()\n if rotation_mode != "anchor":\n # compute the text location in display coords and the offsets\n # necessary to align the bbox with that location\n if halign == 'center':\n offsetx = (xmin + xmax) / 2\n elif halign == 'right':\n offsetx = xmax\n else:\n offsetx = xmin\n\n if valign == 'center':\n offsety = (ymin + ymax) / 2\n elif valign == 'top':\n offsety = ymax\n elif valign == 'baseline':\n offsety = ymin + descent\n elif valign == 'center_baseline':\n offsety = ymin + height - baseline / 2.0\n else:\n offsety = ymin\n else:\n xmin1, ymin1 = corners_horiz[0]\n xmax1, ymax1 = corners_horiz[2]\n\n if halign == 'center':\n offsetx = (xmin1 + xmax1) / 2.0\n elif halign == 'right':\n offsetx = xmax1\n else:\n offsetx = xmin1\n\n if valign == 'center':\n offsety = (ymin1 + ymax1) / 2.0\n elif valign == 'top':\n offsety = ymax1\n elif valign == 'baseline':\n offsety = ymax1 - baseline\n elif valign == 'center_baseline':\n offsety = ymax1 - baseline / 2.0\n else:\n offsety = ymin1\n\n offsetx, offsety = M.transform((offsetx, offsety))\n\n xmin -= offsetx\n ymin -= offsety\n\n bbox = Bbox.from_bounds(xmin, ymin, width, height)\n\n # now rotate the positions around the first (x, y) position\n xys = M.transform(offset_layout) - (offsetx, offsety)\n\n return bbox, list(zip(lines, zip(ws, 
hs), *xys.T)), descent\n\n def set_bbox(self, rectprops):\n """\n Draw a bounding box around self.\n\n Parameters\n ----------\n rectprops : dict with properties for `.patches.FancyBboxPatch`\n The default boxstyle is 'square'. The mutation\n scale of the `.patches.FancyBboxPatch` is set to the fontsize.\n\n Examples\n --------\n ::\n\n t.set_bbox(dict(facecolor='red', alpha=0.5))\n """\n\n if rectprops is not None:\n props = rectprops.copy()\n boxstyle = props.pop("boxstyle", None)\n pad = props.pop("pad", None)\n if boxstyle is None:\n boxstyle = "square"\n if pad is None:\n pad = 4 # points\n pad /= self.get_size() # to fraction of font size\n else:\n if pad is None:\n pad = 0.3\n # boxstyle could be a callable or a string\n if isinstance(boxstyle, str) and "pad" not in boxstyle:\n boxstyle += ",pad=%0.2f" % pad\n self._bbox_patch = FancyBboxPatch(\n (0, 0), 1, 1,\n boxstyle=boxstyle, transform=IdentityTransform(), **props)\n else:\n self._bbox_patch = None\n\n self._update_clip_properties()\n\n def get_bbox_patch(self):\n """\n Return the bbox Patch, or None if the `.patches.FancyBboxPatch`\n is not made.\n """\n return self._bbox_patch\n\n def update_bbox_position_size(self, renderer):\n """\n Update the location and the size of the bbox.\n\n This method should be used when the position and size of the bbox needs\n to be updated before actually drawing the bbox.\n """\n if self._bbox_patch:\n # don't use self.get_unitless_position here, which refers to text\n # position in Text:\n posx = float(self.convert_xunits(self._x))\n posy = float(self.convert_yunits(self._y))\n posx, posy = self.get_transform().transform((posx, posy))\n\n x_box, y_box, w_box, h_box = _get_textbox(self, renderer)\n self._bbox_patch.set_bounds(0., 0., w_box, h_box)\n self._bbox_patch.set_transform(\n Affine2D()\n .rotate_deg(self.get_rotation())\n .translate(posx + x_box, posy + y_box))\n fontsize_in_pixel = renderer.points_to_pixels(self.get_size())\n 
self._bbox_patch.set_mutation_scale(fontsize_in_pixel)\n\n def _update_clip_properties(self):\n if self._bbox_patch:\n clipprops = dict(clip_box=self.clipbox,\n clip_path=self._clippath,\n clip_on=self._clipon)\n self._bbox_patch.update(clipprops)\n\n def set_clip_box(self, clipbox):\n # docstring inherited.\n super().set_clip_box(clipbox)\n self._update_clip_properties()\n\n def set_clip_path(self, path, transform=None):\n # docstring inherited.\n super().set_clip_path(path, transform)\n self._update_clip_properties()\n\n def set_clip_on(self, b):\n # docstring inherited.\n super().set_clip_on(b)\n self._update_clip_properties()\n\n def get_wrap(self):\n """Return whether the text can be wrapped."""\n return self._wrap\n\n def set_wrap(self, wrap):\n """\n Set whether the text can be wrapped.\n\n Wrapping makes sure the text is confined to the (sub)figure box. It\n does not take into account any other artists.\n\n Parameters\n ----------\n wrap : bool\n\n Notes\n -----\n Wrapping does not work together with\n ``savefig(..., bbox_inches='tight')`` (which is also used internally\n by ``%matplotlib inline`` in IPython/Jupyter). 
The 'tight' setting\n rescales the canvas to accommodate all content and happens before\n wrapping.\n """\n self._wrap = wrap\n\n def _get_wrap_line_width(self):\n """\n Return the maximum line width for wrapping text based on the current\n orientation.\n """\n x0, y0 = self.get_transform().transform(self.get_position())\n figure_box = self.get_figure().get_window_extent()\n\n # Calculate available width based on text alignment\n alignment = self.get_horizontalalignment()\n self.set_rotation_mode('anchor')\n rotation = self.get_rotation()\n\n left = self._get_dist_to_box(rotation, x0, y0, figure_box)\n right = self._get_dist_to_box(\n (180 + rotation) % 360, x0, y0, figure_box)\n\n if alignment == 'left':\n line_width = left\n elif alignment == 'right':\n line_width = right\n else:\n line_width = 2 * min(left, right)\n\n return line_width\n\n def _get_dist_to_box(self, rotation, x0, y0, figure_box):\n """\n Return the distance from the given points to the boundaries of a\n rotated box, in pixels.\n """\n if rotation > 270:\n quad = rotation - 270\n h1 = (y0 - figure_box.y0) / math.cos(math.radians(quad))\n h2 = (figure_box.x1 - x0) / math.cos(math.radians(90 - quad))\n elif rotation > 180:\n quad = rotation - 180\n h1 = (x0 - figure_box.x0) / math.cos(math.radians(quad))\n h2 = (y0 - figure_box.y0) / math.cos(math.radians(90 - quad))\n elif rotation > 90:\n quad = rotation - 90\n h1 = (figure_box.y1 - y0) / math.cos(math.radians(quad))\n h2 = (x0 - figure_box.x0) / math.cos(math.radians(90 - quad))\n else:\n h1 = (figure_box.x1 - x0) / math.cos(math.radians(rotation))\n h2 = (figure_box.y1 - y0) / math.cos(math.radians(90 - rotation))\n\n return min(h1, h2)\n\n def _get_rendered_text_width(self, text):\n """\n Return the width of a given text string, in pixels.\n """\n\n w, h, d = _get_text_metrics_with_cache(\n self._renderer, text, self.get_fontproperties(),\n cbook.is_math_text(text),\n self.get_figure(root=True).dpi)\n return math.ceil(w)\n\n def 
_get_wrapped_text(self):\n """\n Return a copy of the text string with new lines added so that the text\n is wrapped relative to the parent figure (if `get_wrap` is True).\n """\n if not self.get_wrap():\n return self.get_text()\n\n # Not fit to handle breaking up latex syntax correctly, so\n # ignore latex for now.\n if self.get_usetex():\n return self.get_text()\n\n # Build the line incrementally, for a more accurate measure of length\n line_width = self._get_wrap_line_width()\n wrapped_lines = []\n\n # New lines in the user's text force a split\n unwrapped_lines = self.get_text().split('\n')\n\n # Now wrap each individual unwrapped line\n for unwrapped_line in unwrapped_lines:\n\n sub_words = unwrapped_line.split(' ')\n # Remove items from sub_words as we go, so stop when empty\n while len(sub_words) > 0:\n if len(sub_words) == 1:\n # Only one word, so just add it to the end\n wrapped_lines.append(sub_words.pop(0))\n continue\n\n for i in range(2, len(sub_words) + 1):\n # Get width of all words up to and including here\n line = ' '.join(sub_words[:i])\n current_width = self._get_rendered_text_width(line)\n\n # If all these words are too wide, append all not including\n # last word\n if current_width > line_width:\n wrapped_lines.append(' '.join(sub_words[:i - 1]))\n sub_words = sub_words[i - 1:]\n break\n\n # Otherwise if all words fit in the width, append them all\n elif i == len(sub_words):\n wrapped_lines.append(' '.join(sub_words[:i]))\n sub_words = []\n break\n\n return '\n'.join(wrapped_lines)\n\n @artist.allow_rasterization\n def draw(self, renderer):\n # docstring inherited\n\n if renderer is not None:\n self._renderer = renderer\n if not self.get_visible():\n return\n if self.get_text() == '':\n return\n\n renderer.open_group('text', self.get_gid())\n\n with self._cm_set(text=self._get_wrapped_text()):\n bbox, info, descent = self._get_layout(renderer)\n trans = self.get_transform()\n\n # don't use self.get_position here, which refers to text\n # 
position in Text:\n x, y = self._x, self._y\n if np.ma.is_masked(x):\n x = np.nan\n if np.ma.is_masked(y):\n y = np.nan\n posx = float(self.convert_xunits(x))\n posy = float(self.convert_yunits(y))\n posx, posy = trans.transform((posx, posy))\n if np.isnan(posx) or np.isnan(posy):\n return # don't throw a warning here\n if not np.isfinite(posx) or not np.isfinite(posy):\n _log.warning("posx and posy should be finite values")\n return\n canvasw, canvash = renderer.get_canvas_width_height()\n\n # Update the location and size of the bbox\n # (`.patches.FancyBboxPatch`), and draw it.\n if self._bbox_patch:\n self.update_bbox_position_size(renderer)\n self._bbox_patch.draw(renderer)\n\n gc = renderer.new_gc()\n gc.set_foreground(self.get_color())\n gc.set_alpha(self.get_alpha())\n gc.set_url(self._url)\n gc.set_antialiased(self._antialiased)\n self._set_gc_clip(gc)\n\n angle = self.get_rotation()\n\n for line, wh, x, y in info:\n\n mtext = self if len(info) == 1 else None\n x = x + posx\n y = y + posy\n if renderer.flipy():\n y = canvash - y\n clean_line, ismath = self._preprocess_math(line)\n\n if self.get_path_effects():\n from matplotlib.patheffects import PathEffectRenderer\n textrenderer = PathEffectRenderer(\n self.get_path_effects(), renderer)\n else:\n textrenderer = renderer\n\n if self.get_usetex():\n textrenderer.draw_tex(gc, x, y, clean_line,\n self._fontproperties, angle,\n mtext=mtext)\n else:\n textrenderer.draw_text(gc, x, y, clean_line,\n self._fontproperties, angle,\n ismath=ismath, mtext=mtext)\n\n gc.restore()\n renderer.close_group('text')\n self.stale = False\n\n def get_color(self):\n """Return the color of the text."""\n return self._color\n\n def get_fontproperties(self):\n """Return the `.font_manager.FontProperties`."""\n return self._fontproperties\n\n def get_fontfamily(self):\n """\n Return the list of font families used for font lookup.\n\n See Also\n --------\n .font_manager.FontProperties.get_family\n """\n return 
self._fontproperties.get_family()\n\n def get_fontname(self):\n """\n Return the font name as a string.\n\n See Also\n --------\n .font_manager.FontProperties.get_name\n """\n return self._fontproperties.get_name()\n\n def get_fontstyle(self):\n """\n Return the font style as a string.\n\n See Also\n --------\n .font_manager.FontProperties.get_style\n """\n return self._fontproperties.get_style()\n\n def get_fontsize(self):\n """\n Return the font size as an integer.\n\n See Also\n --------\n .font_manager.FontProperties.get_size_in_points\n """\n return self._fontproperties.get_size_in_points()\n\n def get_fontvariant(self):\n """\n Return the font variant as a string.\n\n See Also\n --------\n .font_manager.FontProperties.get_variant\n """\n return self._fontproperties.get_variant()\n\n def get_fontweight(self):\n """\n Return the font weight as a string or a number.\n\n See Also\n --------\n .font_manager.FontProperties.get_weight\n """\n return self._fontproperties.get_weight()\n\n def get_stretch(self):\n """\n Return the font stretch as a string or a number.\n\n See Also\n --------\n .font_manager.FontProperties.get_stretch\n """\n return self._fontproperties.get_stretch()\n\n def get_horizontalalignment(self):\n """\n Return the horizontal alignment as a string. 
Will be one of\n 'left', 'center' or 'right'.\n """\n return self._horizontalalignment\n\n def get_unitless_position(self):\n """Return the (x, y) unitless position of the text."""\n # This will get the position with all unit information stripped away.\n # This is here for convenience since it is done in several locations.\n x = float(self.convert_xunits(self._x))\n y = float(self.convert_yunits(self._y))\n return x, y\n\n def get_position(self):\n """Return the (x, y) position of the text."""\n # This should return the same data (possible unitized) as was\n # specified with 'set_x' and 'set_y'.\n return self._x, self._y\n\n def get_text(self):\n """Return the text string."""\n return self._text\n\n def get_verticalalignment(self):\n """\n Return the vertical alignment as a string. Will be one of\n 'top', 'center', 'bottom', 'baseline' or 'center_baseline'.\n """\n return self._verticalalignment\n\n def get_window_extent(self, renderer=None, dpi=None):\n """\n Return the `.Bbox` bounding the text, in display units.\n\n In addition to being used internally, this is useful for specifying\n clickable regions in a png file on a web page.\n\n Parameters\n ----------\n renderer : Renderer, optional\n A renderer is needed to compute the bounding box. If the artist\n has already been drawn, the renderer is cached; thus, it is only\n necessary to pass this argument when calling `get_window_extent`\n before the first draw. In practice, it is usually easier to\n trigger a draw first, e.g. by calling\n `~.Figure.draw_without_rendering` or ``plt.show()``.\n\n dpi : float, optional\n The dpi value for computing the bbox, defaults to\n ``self.get_figure(root=True).dpi`` (*not* the renderer dpi); should be set\n e.g. 
to match regions with a figure saved with a custom dpi value.\n """\n if not self.get_visible():\n return Bbox.unit()\n\n fig = self.get_figure(root=True)\n if dpi is None:\n dpi = fig.dpi\n if self.get_text() == '':\n with cbook._setattr_cm(fig, dpi=dpi):\n tx, ty = self._get_xy_display()\n return Bbox.from_bounds(tx, ty, 0, 0)\n\n if renderer is not None:\n self._renderer = renderer\n if self._renderer is None:\n self._renderer = fig._get_renderer()\n if self._renderer is None:\n raise RuntimeError(\n "Cannot get window extent of text w/o renderer. You likely "\n "want to call 'figure.draw_without_rendering()' first.")\n\n with cbook._setattr_cm(fig, dpi=dpi):\n bbox, info, descent = self._get_layout(self._renderer)\n x, y = self.get_unitless_position()\n x, y = self.get_transform().transform((x, y))\n bbox = bbox.translated(x, y)\n return bbox\n\n def set_backgroundcolor(self, color):\n """\n Set the background color of the text by updating the bbox.\n\n Parameters\n ----------\n color : :mpltype:`color`\n\n See Also\n --------\n .set_bbox : To change the position of the bounding box\n """\n if self._bbox_patch is None:\n self.set_bbox(dict(facecolor=color, edgecolor=color))\n else:\n self._bbox_patch.update(dict(facecolor=color))\n\n self._update_clip_properties()\n self.stale = True\n\n def set_color(self, color):\n """\n Set the foreground color of the text.\n\n Parameters\n ----------\n color : :mpltype:`color`\n """\n # "auto" is only supported by axisartist, but we can just let it error\n # out at draw time for simplicity.\n if not cbook._str_equal(color, "auto"):\n mpl.colors._check_color_like(color=color)\n self._color = color\n self.stale = True\n\n def set_horizontalalignment(self, align):\n """\n Set the horizontal alignment relative to the anchor point.\n\n See also :doc:`/gallery/text_labels_and_annotations/text_alignment`.\n\n Parameters\n ----------\n align : {'left', 'center', 'right'}\n """\n _api.check_in_list(['center', 'right', 'left'], 
align=align)\n self._horizontalalignment = align\n self.stale = True\n\n def set_multialignment(self, align):\n """\n Set the text alignment for multiline texts.\n\n The layout of the bounding box of all the lines is determined by the\n horizontalalignment and verticalalignment properties. This property\n controls the alignment of the text lines within that box.\n\n Parameters\n ----------\n align : {'left', 'right', 'center'}\n """\n _api.check_in_list(['center', 'right', 'left'], align=align)\n self._multialignment = align\n self.stale = True\n\n def set_linespacing(self, spacing):\n """\n Set the line spacing as a multiple of the font size.\n\n The default line spacing is 1.2.\n\n Parameters\n ----------\n spacing : float (multiple of font size)\n """\n _api.check_isinstance(Real, spacing=spacing)\n self._linespacing = spacing\n self.stale = True\n\n def set_fontfamily(self, fontname):\n """\n Set the font family. Can be either a single string, or a list of\n strings in decreasing priority. Each string may be either a real font\n name or a generic font class name. 
If the latter, the specific font\n names will be looked up in the corresponding rcParams.\n\n If a `Text` instance is constructed with ``fontfamily=None``, then the\n font is set to :rc:`font.family`, and the\n same is done when `set_fontfamily()` is called on an existing\n `Text` instance.\n\n Parameters\n ----------\n fontname : {FONTNAME, 'serif', 'sans-serif', 'cursive', 'fantasy', \\n'monospace'}\n\n See Also\n --------\n .font_manager.FontProperties.set_family\n """\n self._fontproperties.set_family(fontname)\n self.stale = True\n\n def set_fontvariant(self, variant):\n """\n Set the font variant.\n\n Parameters\n ----------\n variant : {'normal', 'small-caps'}\n\n See Also\n --------\n .font_manager.FontProperties.set_variant\n """\n self._fontproperties.set_variant(variant)\n self.stale = True\n\n def set_fontstyle(self, fontstyle):\n """\n Set the font style.\n\n Parameters\n ----------\n fontstyle : {'normal', 'italic', 'oblique'}\n\n See Also\n --------\n .font_manager.FontProperties.set_style\n """\n self._fontproperties.set_style(fontstyle)\n self.stale = True\n\n def set_fontsize(self, fontsize):\n """\n Set the font size.\n\n Parameters\n ----------\n fontsize : float or {'xx-small', 'x-small', 'small', 'medium', \\n'large', 'x-large', 'xx-large'}\n If a float, the fontsize in points. The string values denote sizes\n relative to the default font size.\n\n See Also\n --------\n .font_manager.FontProperties.set_size\n """\n self._fontproperties.set_size(fontsize)\n self.stale = True\n\n def get_math_fontfamily(self):\n """\n Return the font family name for math text rendered by Matplotlib.\n\n The default value is :rc:`mathtext.fontset`.\n\n See Also\n --------\n set_math_fontfamily\n """\n return self._fontproperties.get_math_fontfamily()\n\n def set_math_fontfamily(self, fontfamily):\n """\n Set the font family for math text rendered by Matplotlib.\n\n This does only affect Matplotlib's own math renderer. 
It has no effect\n when rendering with TeX (``usetex=True``).\n\n Parameters\n ----------\n fontfamily : str\n The name of the font family.\n\n Available font families are defined in the\n :ref:`default matplotlibrc file\n <customizing-with-matplotlibrc-files>`.\n\n See Also\n --------\n get_math_fontfamily\n """\n self._fontproperties.set_math_fontfamily(fontfamily)\n\n def set_fontweight(self, weight):\n """\n Set the font weight.\n\n Parameters\n ----------\n weight : {a numeric value in range 0-1000, 'ultralight', 'light', \\n'normal', 'regular', 'book', 'medium', 'roman', 'semibold', 'demibold', \\n'demi', 'bold', 'heavy', 'extra bold', 'black'}\n\n See Also\n --------\n .font_manager.FontProperties.set_weight\n """\n self._fontproperties.set_weight(weight)\n self.stale = True\n\n def set_fontstretch(self, stretch):\n """\n Set the font stretch (horizontal condensation or expansion).\n\n Parameters\n ----------\n stretch : {a numeric value in range 0-1000, 'ultra-condensed', \\n'extra-condensed', 'condensed', 'semi-condensed', 'normal', 'semi-expanded', \\n'expanded', 'extra-expanded', 'ultra-expanded'}\n\n See Also\n --------\n .font_manager.FontProperties.set_stretch\n """\n self._fontproperties.set_stretch(stretch)\n self.stale = True\n\n def set_position(self, xy):\n """\n Set the (*x*, *y*) position of the text.\n\n Parameters\n ----------\n xy : (float, float)\n """\n self.set_x(xy[0])\n self.set_y(xy[1])\n\n def set_x(self, x):\n """\n Set the *x* position of the text.\n\n Parameters\n ----------\n x : float\n """\n self._x = x\n self.stale = True\n\n def set_y(self, y):\n """\n Set the *y* position of the text.\n\n Parameters\n ----------\n y : float\n """\n self._y = y\n self.stale = True\n\n def set_rotation(self, s):\n """\n Set the rotation of the text.\n\n Parameters\n ----------\n s : float or {'vertical', 'horizontal'}\n The rotation angle in degrees in mathematically positive direction\n (counterclockwise). 
'horizontal' equals 0, 'vertical' equals 90.\n """\n if isinstance(s, Real):\n self._rotation = float(s) % 360\n elif cbook._str_equal(s, 'horizontal') or s is None:\n self._rotation = 0.\n elif cbook._str_equal(s, 'vertical'):\n self._rotation = 90.\n else:\n raise ValueError("rotation must be 'vertical', 'horizontal' or "\n f"a number, not {s}")\n self.stale = True\n\n def set_transform_rotates_text(self, t):\n """\n Whether rotations of the transform affect the text direction.\n\n Parameters\n ----------\n t : bool\n """\n self._transform_rotates_text = t\n self.stale = True\n\n def set_verticalalignment(self, align):\n """\n Set the vertical alignment relative to the anchor point.\n\n See also :doc:`/gallery/text_labels_and_annotations/text_alignment`.\n\n Parameters\n ----------\n align : {'baseline', 'bottom', 'center', 'center_baseline', 'top'}\n """\n _api.check_in_list(\n ['top', 'bottom', 'center', 'baseline', 'center_baseline'],\n align=align)\n self._verticalalignment = align\n self.stale = True\n\n def set_text(self, s):\n r"""\n Set the text string *s*.\n\n It may contain newlines (``\n``) or math in LaTeX syntax.\n\n Parameters\n ----------\n s : object\n Any object gets converted to its `str` representation, except for\n ``None`` which is converted to an empty string.\n """\n s = '' if s is None else str(s)\n if s != self._text:\n self._text = s\n self.stale = True\n\n def _preprocess_math(self, s):\n """\n Return the string *s* after mathtext preprocessing, and the kind of\n mathtext support needed.\n\n - If *self* is configured to use TeX, return *s* unchanged except that\n a single space gets escaped, and the flag "TeX".\n - Otherwise, if *s* is mathtext (has an even number of unescaped dollar\n signs) and ``parse_math`` is not set to False, return *s* and the\n flag True.\n - Otherwise, return *s* with dollar signs unescaped, and the flag\n False.\n """\n if self.get_usetex():\n if s == " ":\n s = r"\ "\n return s, "TeX"\n elif not 
self.get_parse_math():\n return s, False\n elif cbook.is_math_text(s):\n return s, True\n else:\n return s.replace(r"\$", "$"), False\n\n def set_fontproperties(self, fp):\n """\n Set the font properties that control the text.\n\n Parameters\n ----------\n fp : `.font_manager.FontProperties` or `str` or `pathlib.Path`\n If a `str`, it is interpreted as a fontconfig pattern parsed by\n `.FontProperties`. If a `pathlib.Path`, it is interpreted as the\n absolute path to a font file.\n """\n self._fontproperties = FontProperties._from_any(fp).copy()\n self.stale = True\n\n @_docstring.kwarg_doc("bool, default: :rc:`text.usetex`")\n def set_usetex(self, usetex):\n """\n Parameters\n ----------\n usetex : bool or None\n Whether to render using TeX, ``None`` means to use\n :rc:`text.usetex`.\n """\n if usetex is None:\n self._usetex = mpl.rcParams['text.usetex']\n else:\n self._usetex = bool(usetex)\n self.stale = True\n\n def get_usetex(self):\n """Return whether this `Text` object uses TeX for rendering."""\n return self._usetex\n\n def set_parse_math(self, parse_math):\n """\n Override switch to disable any mathtext parsing for this `Text`.\n\n Parameters\n ----------\n parse_math : bool\n If False, this `Text` will never use mathtext. 
If True, mathtext\n will be used if there is an even number of unescaped dollar signs.\n """\n self._parse_math = bool(parse_math)\n\n def get_parse_math(self):\n """Return whether mathtext parsing is considered for this `Text`."""\n return self._parse_math\n\n def set_fontname(self, fontname):\n """\n Alias for `set_fontfamily`.\n\n One-way alias only: the getter differs.\n\n Parameters\n ----------\n fontname : {FONTNAME, 'serif', 'sans-serif', 'cursive', 'fantasy', \\n'monospace'}\n\n See Also\n --------\n .font_manager.FontProperties.set_family\n\n """\n self.set_fontfamily(fontname)\n\n\nclass OffsetFrom:\n """Callable helper class for working with `Annotation`."""\n\n def __init__(self, artist, ref_coord, unit="points"):\n """\n Parameters\n ----------\n artist : `~matplotlib.artist.Artist` or `.BboxBase` or `.Transform`\n The object to compute the offset from.\n\n ref_coord : (float, float)\n If *artist* is an `.Artist` or `.BboxBase`, this value is\n the location of the offset origin in fractions of the\n *artist* bounding box.\n\n If *artist* is a transform, the offset origin is the\n transform applied to this value.\n\n unit : {'points', 'pixels'}, default: 'points'\n The screen units to use (pixels or points) for the offset input.\n """\n self._artist = artist\n x, y = ref_coord # Make copy when ref_coord is an array (and check the shape).\n self._ref_coord = x, y\n self.set_unit(unit)\n\n def set_unit(self, unit):\n """\n Set the unit for input to the transform used by ``__call__``.\n\n Parameters\n ----------\n unit : {'points', 'pixels'}\n """\n _api.check_in_list(["points", "pixels"], unit=unit)\n self._unit = unit\n\n def get_unit(self):\n """Return the unit for input to the transform used by ``__call__``."""\n return self._unit\n\n def __call__(self, renderer):\n """\n Return the offset transform.\n\n Parameters\n ----------\n renderer : `RendererBase`\n The renderer to use to compute the offset.\n\n Returns\n -------\n `Transform`\n Maps (x, y) 
in pixel or point units to screen units\n relative to the given artist.\n """\n if isinstance(self._artist, Artist):\n bbox = self._artist.get_window_extent(renderer)\n xf, yf = self._ref_coord\n x = bbox.x0 + bbox.width * xf\n y = bbox.y0 + bbox.height * yf\n elif isinstance(self._artist, BboxBase):\n bbox = self._artist\n xf, yf = self._ref_coord\n x = bbox.x0 + bbox.width * xf\n y = bbox.y0 + bbox.height * yf\n elif isinstance(self._artist, Transform):\n x, y = self._artist.transform(self._ref_coord)\n else:\n _api.check_isinstance((Artist, BboxBase, Transform), artist=self._artist)\n scale = 1 if self._unit == "pixels" else renderer.points_to_pixels(1)\n return Affine2D().scale(scale).translate(x, y)\n\n\nclass _AnnotationBase:\n def __init__(self,\n xy,\n xycoords='data',\n annotation_clip=None):\n\n x, y = xy # Make copy when xy is an array (and check the shape).\n self.xy = x, y\n self.xycoords = xycoords\n self.set_annotation_clip(annotation_clip)\n\n self._draggable = None\n\n def _get_xy(self, renderer, xy, coords):\n x, y = xy\n xcoord, ycoord = coords if isinstance(coords, tuple) else (coords, coords)\n if xcoord == 'data':\n x = float(self.convert_xunits(x))\n if ycoord == 'data':\n y = float(self.convert_yunits(y))\n return self._get_xy_transform(renderer, coords).transform((x, y))\n\n def _get_xy_transform(self, renderer, coords):\n\n if isinstance(coords, tuple):\n xcoord, ycoord = coords\n from matplotlib.transforms import blended_transform_factory\n tr1 = self._get_xy_transform(renderer, xcoord)\n tr2 = self._get_xy_transform(renderer, ycoord)\n return blended_transform_factory(tr1, tr2)\n elif callable(coords):\n tr = coords(renderer)\n if isinstance(tr, BboxBase):\n return BboxTransformTo(tr)\n elif isinstance(tr, Transform):\n return tr\n else:\n raise TypeError(\n f"xycoords callable must return a BboxBase or Transform, not a "\n f"{type(tr).__name__}")\n elif isinstance(coords, Artist):\n bbox = coords.get_window_extent(renderer)\n return 
BboxTransformTo(bbox)\n elif isinstance(coords, BboxBase):\n return BboxTransformTo(coords)\n elif isinstance(coords, Transform):\n return coords\n elif not isinstance(coords, str):\n raise TypeError(\n f"'xycoords' must be an instance of str, tuple[str, str], Artist, "\n f"Transform, or Callable, not a {type(coords).__name__}")\n\n if coords == 'data':\n return self.axes.transData\n elif coords == 'polar':\n from matplotlib.projections import PolarAxes\n tr = PolarAxes.PolarTransform(apply_theta_transforms=False)\n trans = tr + self.axes.transData\n return trans\n\n try:\n bbox_name, unit = coords.split()\n except ValueError: # i.e. len(coords.split()) != 2.\n raise ValueError(f"{coords!r} is not a valid coordinate") from None\n\n bbox0, xy0 = None, None\n\n # if unit is offset-like\n if bbox_name == "figure":\n bbox0 = self.get_figure(root=False).figbbox\n elif bbox_name == "subfigure":\n bbox0 = self.get_figure(root=False).bbox\n elif bbox_name == "axes":\n bbox0 = self.axes.bbox\n\n # reference x, y in display coordinate\n if bbox0 is not None:\n xy0 = bbox0.p0\n elif bbox_name == "offset":\n xy0 = self._get_position_xy(renderer)\n else:\n raise ValueError(f"{coords!r} is not a valid coordinate")\n\n if unit == "points":\n tr = Affine2D().scale(\n self.get_figure(root=True).dpi / 72) # dpi/72 dots per point\n elif unit == "pixels":\n tr = Affine2D()\n elif unit == "fontsize":\n tr = Affine2D().scale(\n self.get_size() * self.get_figure(root=True).dpi / 72)\n elif unit == "fraction":\n tr = Affine2D().scale(*bbox0.size)\n else:\n raise ValueError(f"{unit!r} is not a recognized unit")\n\n return tr.translate(*xy0)\n\n def set_annotation_clip(self, b):\n """\n Set the annotation's clipping behavior.\n\n Parameters\n ----------\n b : bool or None\n - True: The annotation will be clipped when ``self.xy`` is\n outside the Axes.\n - False: The annotation will always be drawn.\n - None: The annotation will be clipped when ``self.xy`` is\n outside the Axes and 
``self.xycoords == "data"``.\n """\n self._annotation_clip = b\n\n def get_annotation_clip(self):\n """\n Return the annotation's clipping behavior.\n\n See `set_annotation_clip` for the meaning of return values.\n """\n return self._annotation_clip\n\n def _get_position_xy(self, renderer):\n """Return the pixel position of the annotated point."""\n return self._get_xy(renderer, self.xy, self.xycoords)\n\n def _check_xy(self, renderer=None):\n """Check whether the annotation at *xy_pixel* should be drawn."""\n if renderer is None:\n renderer = self.get_figure(root=True)._get_renderer()\n b = self.get_annotation_clip()\n if b or (b is None and self.xycoords == "data"):\n # check if self.xy is inside the Axes.\n xy_pixel = self._get_position_xy(renderer)\n return self.axes.contains_point(xy_pixel)\n return True\n\n def draggable(self, state=None, use_blit=False):\n """\n Set whether the annotation is draggable with the mouse.\n\n Parameters\n ----------\n state : bool or None\n - True or False: set the draggability.\n - None: toggle the draggability.\n use_blit : bool, default: False\n Use blitting for faster image composition. 
For details see\n :ref:`func-animation`.\n\n Returns\n -------\n DraggableAnnotation or None\n If the annotation is draggable, the corresponding\n `.DraggableAnnotation` helper is returned.\n """\n from matplotlib.offsetbox import DraggableAnnotation\n is_draggable = self._draggable is not None\n\n # if state is None we'll toggle\n if state is None:\n state = not is_draggable\n\n if state:\n if self._draggable is None:\n self._draggable = DraggableAnnotation(self, use_blit)\n else:\n if self._draggable is not None:\n self._draggable.disconnect()\n self._draggable = None\n\n return self._draggable\n\n\nclass Annotation(Text, _AnnotationBase):\n """\n An `.Annotation` is a `.Text` that can refer to a specific position *xy*.\n Optionally an arrow pointing from the text to *xy* can be drawn.\n\n Attributes\n ----------\n xy\n The annotated position.\n xycoords\n The coordinate system for *xy*.\n arrow_patch\n A `.FancyArrowPatch` to point from *xytext* to *xy*.\n """\n\n def __str__(self):\n return f"Annotation({self.xy[0]:g}, {self.xy[1]:g}, {self._text!r})"\n\n def __init__(self, text, xy,\n xytext=None,\n xycoords='data',\n textcoords=None,\n arrowprops=None,\n annotation_clip=None,\n **kwargs):\n """\n Annotate the point *xy* with text *text*.\n\n In the simplest form, the text is placed at *xy*.\n\n Optionally, the text can be displayed in another position *xytext*.\n An arrow pointing from the text to the annotated point *xy* can then\n be added by defining *arrowprops*.\n\n Parameters\n ----------\n text : str\n The text of the annotation.\n\n xy : (float, float)\n The point *(x, y)* to annotate. The coordinate system is determined\n by *xycoords*.\n\n xytext : (float, float), default: *xy*\n The position *(x, y)* to place the text at. The coordinate system\n is determined by *textcoords*.\n\n xycoords : single or two-tuple of str or `.Artist` or `.Transform` or \\ncallable, default: 'data'\n\n The coordinate system that *xy* is given in. 
The following types\n of values are supported:\n\n - One of the following strings:\n\n ==================== ============================================\n Value Description\n ==================== ============================================\n 'figure points' Points from the lower left of the figure\n 'figure pixels' Pixels from the lower left of the figure\n 'figure fraction' Fraction of figure from lower left\n 'subfigure points' Points from the lower left of the subfigure\n 'subfigure pixels' Pixels from the lower left of the subfigure\n 'subfigure fraction' Fraction of subfigure from lower left\n 'axes points' Points from lower left corner of the Axes\n 'axes pixels' Pixels from lower left corner of the Axes\n 'axes fraction' Fraction of Axes from lower left\n 'data' Use the coordinate system of the object\n being annotated (default)\n 'polar' *(theta, r)* if not native 'data'\n coordinates\n ==================== ============================================\n\n Note that 'subfigure pixels' and 'figure pixels' are the same\n for the parent figure, so users who want code that is usable in\n a subfigure can use 'subfigure pixels'.\n\n - An `.Artist`: *xy* is interpreted as a fraction of the artist's\n `~matplotlib.transforms.Bbox`. E.g. *(0, 0)* would be the lower\n left corner of the bounding box and *(0.5, 1)* would be the\n center top of the bounding box.\n\n - A `.Transform` to transform *xy* to screen coordinates.\n\n - A function with one of the following signatures::\n\n def transform(renderer) -> Bbox\n def transform(renderer) -> Transform\n\n where *renderer* is a `.RendererBase` subclass.\n\n The result of the function is interpreted like the `.Artist` and\n `.Transform` cases above.\n\n - A tuple *(xcoords, ycoords)* specifying separate coordinate\n systems for *x* and *y*. 
*xcoords* and *ycoords* must each be\n of one of the above described types.\n\n See :ref:`plotting-guide-annotation` for more details.\n\n textcoords : single or two-tuple of str or `.Artist` or `.Transform` \\nor callable, default: value of *xycoords*\n The coordinate system that *xytext* is given in.\n\n All *xycoords* values are valid as well as the following strings:\n\n ================= =================================================\n Value Description\n ================= =================================================\n 'offset points' Offset, in points, from the *xy* value\n 'offset pixels' Offset, in pixels, from the *xy* value\n 'offset fontsize' Offset, relative to fontsize, from the *xy* value\n ================= =================================================\n\n arrowprops : dict, optional\n The properties used to draw a `.FancyArrowPatch` arrow between the\n positions *xy* and *xytext*. Defaults to None, i.e. no arrow is\n drawn.\n\n For historical reasons there are two different ways to specify\n arrows, "simple" and "fancy":\n\n **Simple arrow:**\n\n If *arrowprops* does not contain the key 'arrowstyle' the\n allowed keys are:\n\n ========== =================================================\n Key Description\n ========== =================================================\n width The width of the arrow in points\n headwidth The width of the base of the arrow head in points\n headlength The length of the arrow head in points\n shrink Fraction of total length to shrink from both ends\n ? 
Any `.FancyArrowPatch` property\n ========== =================================================\n\n The arrow is attached to the edge of the text box, the exact\n position (corners or centers) depending on where it's pointing to.\n\n **Fancy arrow:**\n\n This is used if 'arrowstyle' is provided in the *arrowprops*.\n\n Valid keys are the following `.FancyArrowPatch` parameters:\n\n =============== ===================================\n Key Description\n =============== ===================================\n arrowstyle The arrow style\n connectionstyle The connection style\n relpos See below; default is (0.5, 0.5)\n patchA Default is bounding box of the text\n patchB Default is None\n shrinkA In points. Default is 2 points\n shrinkB In points. Default is 2 points\n mutation_scale Default is text size (in points)\n mutation_aspect Default is 1\n ? Any `.FancyArrowPatch` property\n =============== ===================================\n\n The exact starting point position of the arrow is defined by\n *relpos*. It's a tuple of relative coordinates of the text box,\n where (0, 0) is the lower left corner and (1, 1) is the upper\n right corner. Values <0 and >1 are supported and specify points\n outside the text box. By default (0.5, 0.5), so the starting point\n is centered in the text box.\n\n annotation_clip : bool or None, default: None\n Whether to clip (i.e. 
not draw) the annotation when the annotation\n point *xy* is outside the Axes area.\n\n - If *True*, the annotation will be clipped when *xy* is outside\n the Axes.\n - If *False*, the annotation will always be drawn.\n - If *None*, the annotation will be clipped when *xy* is outside\n the Axes and *xycoords* is 'data'.\n\n **kwargs\n Additional kwargs are passed to `.Text`.\n\n Returns\n -------\n `.Annotation`\n\n See Also\n --------\n :ref:`annotations`\n\n """\n _AnnotationBase.__init__(self,\n xy,\n xycoords=xycoords,\n annotation_clip=annotation_clip)\n # warn about wonky input data\n if (xytext is None and\n textcoords is not None and\n textcoords != xycoords):\n _api.warn_external("You have used the `textcoords` kwarg, but "\n "not the `xytext` kwarg. This can lead to "\n "surprising results.")\n\n # clean up textcoords and assign default\n if textcoords is None:\n textcoords = self.xycoords\n self._textcoords = textcoords\n\n # cleanup xytext defaults\n if xytext is None:\n xytext = self.xy\n x, y = xytext\n\n self.arrowprops = arrowprops\n if arrowprops is not None:\n arrowprops = arrowprops.copy()\n if "arrowstyle" in arrowprops:\n self._arrow_relpos = arrowprops.pop("relpos", (0.5, 0.5))\n else:\n # modified YAArrow API to be used with FancyArrowPatch\n for key in ['width', 'headwidth', 'headlength', 'shrink']:\n arrowprops.pop(key, None)\n self.arrow_patch = FancyArrowPatch((0, 0), (1, 1), **arrowprops)\n else:\n self.arrow_patch = None\n\n # Must come last, as some kwargs may be propagated to arrow_patch.\n Text.__init__(self, x, y, text, **kwargs)\n\n def contains(self, mouseevent):\n if self._different_canvas(mouseevent):\n return False, {}\n contains, tinfo = Text.contains(self, mouseevent)\n if self.arrow_patch is not None:\n in_patch, _ = self.arrow_patch.contains(mouseevent)\n contains = contains or in_patch\n return contains, tinfo\n\n @property\n def xycoords(self):\n return self._xycoords\n\n @xycoords.setter\n def xycoords(self, xycoords):\n 
def is_offset(s):\n return isinstance(s, str) and s.startswith("offset")\n\n if (isinstance(xycoords, tuple) and any(map(is_offset, xycoords))\n or is_offset(xycoords)):\n raise ValueError("xycoords cannot be an offset coordinate")\n self._xycoords = xycoords\n\n @property\n def xyann(self):\n """\n The text position.\n\n See also *xytext* in `.Annotation`.\n """\n return self.get_position()\n\n @xyann.setter\n def xyann(self, xytext):\n self.set_position(xytext)\n\n def get_anncoords(self):\n """\n Return the coordinate system to use for `.Annotation.xyann`.\n\n See also *xycoords* in `.Annotation`.\n """\n return self._textcoords\n\n def set_anncoords(self, coords):\n """\n Set the coordinate system to use for `.Annotation.xyann`.\n\n See also *xycoords* in `.Annotation`.\n """\n self._textcoords = coords\n\n anncoords = property(get_anncoords, set_anncoords, doc="""\n The coordinate system to use for `.Annotation.xyann`.""")\n\n def set_figure(self, fig):\n # docstring inherited\n if self.arrow_patch is not None:\n self.arrow_patch.set_figure(fig)\n Artist.set_figure(self, fig)\n\n def update_positions(self, renderer):\n """\n Update the pixel positions of the annotation text and the arrow patch.\n """\n # generate transformation\n self.set_transform(self._get_xy_transform(renderer, self.anncoords))\n\n arrowprops = self.arrowprops\n if arrowprops is None:\n return\n\n bbox = Text.get_window_extent(self, renderer)\n\n arrow_end = x1, y1 = self._get_position_xy(renderer) # Annotated pos.\n\n ms = arrowprops.get("mutation_scale", self.get_size())\n self.arrow_patch.set_mutation_scale(ms)\n\n if "arrowstyle" not in arrowprops:\n # Approximately simulate the YAArrow.\n shrink = arrowprops.get('shrink', 0.0)\n width = arrowprops.get('width', 4)\n headwidth = arrowprops.get('headwidth', 12)\n headlength = arrowprops.get('headlength', 12)\n\n # NB: ms is in pts\n stylekw = dict(head_length=headlength / ms,\n head_width=headwidth / ms,\n tail_width=width / ms)\n\n 
self.arrow_patch.set_arrowstyle('simple', **stylekw)\n\n # using YAArrow style:\n # pick the corner of the text bbox closest to annotated point.\n xpos = [(bbox.x0, 0), ((bbox.x0 + bbox.x1) / 2, 0.5), (bbox.x1, 1)]\n ypos = [(bbox.y0, 0), ((bbox.y0 + bbox.y1) / 2, 0.5), (bbox.y1, 1)]\n x, relposx = min(xpos, key=lambda v: abs(v[0] - x1))\n y, relposy = min(ypos, key=lambda v: abs(v[0] - y1))\n self._arrow_relpos = (relposx, relposy)\n r = np.hypot(y - y1, x - x1)\n shrink_pts = shrink * r / renderer.points_to_pixels(1)\n self.arrow_patch.shrinkA = self.arrow_patch.shrinkB = shrink_pts\n\n # adjust the starting point of the arrow relative to the textbox.\n # TODO : Rotation needs to be accounted.\n arrow_begin = bbox.p0 + bbox.size * self._arrow_relpos\n # The arrow is drawn from arrow_begin to arrow_end. It will be first\n # clipped by patchA and patchB. Then it will be shrunk by shrinkA and\n # shrinkB (in points). If patchA is not set, self.bbox_patch is used.\n self.arrow_patch.set_positions(arrow_begin, arrow_end)\n\n if "patchA" in arrowprops:\n patchA = arrowprops["patchA"]\n elif self._bbox_patch:\n patchA = self._bbox_patch\n elif self.get_text() == "":\n patchA = None\n else:\n pad = renderer.points_to_pixels(4)\n patchA = Rectangle(\n xy=(bbox.x0 - pad / 2, bbox.y0 - pad / 2),\n width=bbox.width + pad, height=bbox.height + pad,\n transform=IdentityTransform(), clip_on=False)\n self.arrow_patch.set_patchA(patchA)\n\n @artist.allow_rasterization\n def draw(self, renderer):\n # docstring inherited\n if renderer is not None:\n self._renderer = renderer\n if not self.get_visible() or not self._check_xy(renderer):\n return\n # Update text positions before `Text.draw` would, so that the\n # FancyArrowPatch is correctly positioned.\n self.update_positions(renderer)\n self.update_bbox_position_size(renderer)\n if self.arrow_patch is not None: # FancyArrowPatch\n if (self.arrow_patch.get_figure(root=False) is None and\n (fig := self.get_figure(root=False)) is not 
None):\n self.arrow_patch.set_figure(fig)\n self.arrow_patch.draw(renderer)\n # Draw text, including FancyBboxPatch, after FancyArrowPatch.\n # Otherwise, a wedge arrowstyle can land partly on top of the Bbox.\n Text.draw(self, renderer)\n\n def get_window_extent(self, renderer=None):\n # docstring inherited\n # This block is the same as in Text.get_window_extent, but we need to\n # set the renderer before calling update_positions().\n if not self.get_visible() or not self._check_xy(renderer):\n return Bbox.unit()\n if renderer is not None:\n self._renderer = renderer\n if self._renderer is None:\n self._renderer = self.get_figure(root=True)._get_renderer()\n if self._renderer is None:\n raise RuntimeError('Cannot get window extent without renderer')\n\n self.update_positions(self._renderer)\n\n text_bbox = Text.get_window_extent(self)\n bboxes = [text_bbox]\n\n if self.arrow_patch is not None:\n bboxes.append(self.arrow_patch.get_window_extent())\n\n return Bbox.union(bboxes)\n\n def get_tightbbox(self, renderer=None):\n # docstring inherited\n if not self._check_xy(renderer):\n return Bbox.null()\n return super().get_tightbbox(renderer)\n\n\n_docstring.interpd.register(Annotation=Annotation.__init__.__doc__)\n | .venv\Lib\site-packages\matplotlib\text.py | text.py | Python | 70,856 | 0.75 | 0.136118 | 0.060498 | react-lib | 635 | 2024-01-09T01:06:05.046012 | BSD-3-Clause | false | 4d718971ece296b4828952349a3e823a |
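The `update_positions` logic in the row above picks the text-bbox corner (or edge midpoint) closest to the annotated point when no explicit `arrowstyle` is supplied, recording the fractional position as `_arrow_relpos`. A minimal pure-Python sketch of that selection (the helper name `closest_anchor` and the flat `(x0, y0, x1, y1)` bbox tuple are illustrative, not matplotlib API):

```python
def closest_anchor(bbox, target):
    """Pick the text-bbox edge point nearest the annotated point,
    mirroring the corner selection in Annotation.update_positions."""
    x0, y0, x1, y1 = bbox
    tx, ty = target
    # candidates: left/center/right edges with their relative positions
    xpos = [(x0, 0.0), ((x0 + x1) / 2, 0.5), (x1, 1.0)]
    ypos = [(y0, 0.0), ((y0 + y1) / 2, 0.5), (y1, 1.0)]
    x, relposx = min(xpos, key=lambda v: abs(v[0] - tx))
    y, relposy = min(ypos, key=lambda v: abs(v[0] - ty))
    return (x, y), (relposx, relposy)
```

The returned `(relposx, relposy)` pair corresponds to what the real code stores in `self._arrow_relpos` and later uses to compute `arrow_begin`.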
from .artist import Artist\nfrom .backend_bases import RendererBase\nfrom .font_manager import FontProperties\nfrom .offsetbox import DraggableAnnotation\nfrom .path import Path\nfrom .patches import FancyArrowPatch, FancyBboxPatch\nfrom .textpath import ( # noqa: F401, reexported API\n TextPath as TextPath,\n TextToPath as TextToPath,\n)\nfrom .transforms import (\n Bbox,\n BboxBase,\n Transform,\n)\n\nfrom collections.abc import Callable, Iterable\nfrom typing import Any, Literal\nfrom .typing import ColorType, CoordsType\n\nclass Text(Artist):\n zorder: float\n def __init__(\n self,\n x: float = ...,\n y: float = ...,\n text: Any = ...,\n *,\n color: ColorType | None = ...,\n verticalalignment: Literal[\n "bottom", "baseline", "center", "center_baseline", "top"\n ] = ...,\n horizontalalignment: Literal["left", "center", "right"] = ...,\n multialignment: Literal["left", "center", "right"] | None = ...,\n fontproperties: str | Path | FontProperties | None = ...,\n rotation: float | Literal["vertical", "horizontal"] | None = ...,\n linespacing: float | None = ...,\n rotation_mode: Literal["default", "anchor"] | None = ...,\n usetex: bool | None = ...,\n wrap: bool = ...,\n transform_rotates_text: bool = ...,\n parse_math: bool | None = ...,\n antialiased: bool | None = ...,\n **kwargs\n ) -> None: ...\n def update(self, kwargs: dict[str, Any]) -> list[Any]: ...\n def get_rotation(self) -> float: ...\n def get_transform_rotates_text(self) -> bool: ...\n def set_rotation_mode(self, m: None | Literal["default", "anchor"]) -> None: ...\n def get_rotation_mode(self) -> Literal["default", "anchor"]: ...\n def set_bbox(self, rectprops: dict[str, Any]) -> None: ...\n def get_bbox_patch(self) -> None | FancyBboxPatch: ...\n def update_bbox_position_size(self, renderer: RendererBase) -> None: ...\n def get_wrap(self) -> bool: ...\n def set_wrap(self, wrap: bool) -> None: ...\n def get_color(self) -> ColorType: ...\n def get_fontproperties(self) -> FontProperties: ...\n def 
get_fontfamily(self) -> list[str]: ...\n def get_fontname(self) -> str: ...\n def get_fontstyle(self) -> Literal["normal", "italic", "oblique"]: ...\n def get_fontsize(self) -> float | str: ...\n def get_fontvariant(self) -> Literal["normal", "small-caps"]: ...\n def get_fontweight(self) -> int | str: ...\n def get_stretch(self) -> int | str: ...\n def get_horizontalalignment(self) -> Literal["left", "center", "right"]: ...\n def get_unitless_position(self) -> tuple[float, float]: ...\n def get_position(self) -> tuple[float, float]: ...\n def get_text(self) -> str: ...\n def get_verticalalignment(\n self,\n ) -> Literal["bottom", "baseline", "center", "center_baseline", "top"]: ...\n def get_window_extent(\n self, renderer: RendererBase | None = ..., dpi: float | None = ...\n ) -> Bbox: ...\n def set_backgroundcolor(self, color: ColorType) -> None: ...\n def set_color(self, color: ColorType) -> None: ...\n def set_horizontalalignment(\n self, align: Literal["left", "center", "right"]\n ) -> None: ...\n def set_multialignment(self, align: Literal["left", "center", "right"]) -> None: ...\n def set_linespacing(self, spacing: float) -> None: ...\n def set_fontfamily(self, fontname: str | Iterable[str]) -> None: ...\n def set_fontvariant(self, variant: Literal["normal", "small-caps"]) -> None: ...\n def set_fontstyle(\n self, fontstyle: Literal["normal", "italic", "oblique"]\n ) -> None: ...\n def set_fontsize(self, fontsize: float | str) -> None: ...\n def get_math_fontfamily(self) -> str: ...\n def set_math_fontfamily(self, fontfamily: str) -> None: ...\n def set_fontweight(self, weight: int | str) -> None: ...\n def set_fontstretch(self, stretch: int | str) -> None: ...\n def set_position(self, xy: tuple[float, float]) -> None: ...\n def set_x(self, x: float) -> None: ...\n def set_y(self, y: float) -> None: ...\n def set_rotation(self, s: float) -> None: ...\n def set_transform_rotates_text(self, t: bool) -> None: ...\n def set_verticalalignment(\n self, align: 
Literal["bottom", "baseline", "center", "center_baseline", "top"]\n ) -> None: ...\n def set_text(self, s: Any) -> None: ...\n def set_fontproperties(self, fp: FontProperties | str | Path | None) -> None: ...\n def set_usetex(self, usetex: bool | None) -> None: ...\n def get_usetex(self) -> bool: ...\n def set_parse_math(self, parse_math: bool) -> None: ...\n def get_parse_math(self) -> bool: ...\n def set_fontname(self, fontname: str | Iterable[str]) -> None: ...\n def get_antialiased(self) -> bool: ...\n def set_antialiased(self, antialiased: bool) -> None: ...\n\nclass OffsetFrom:\n def __init__(\n self,\n artist: Artist | BboxBase | Transform,\n ref_coord: tuple[float, float],\n unit: Literal["points", "pixels"] = ...,\n ) -> None: ...\n def set_unit(self, unit: Literal["points", "pixels"]) -> None: ...\n def get_unit(self) -> Literal["points", "pixels"]: ...\n def __call__(self, renderer: RendererBase) -> Transform: ...\n\nclass _AnnotationBase:\n xy: tuple[float, float]\n xycoords: CoordsType\n def __init__(\n self,\n xy,\n xycoords: CoordsType = ...,\n annotation_clip: bool | None = ...,\n ) -> None: ...\n def set_annotation_clip(self, b: bool | None) -> None: ...\n def get_annotation_clip(self) -> bool | None: ...\n def draggable(\n self, state: bool | None = ..., use_blit: bool = ...\n ) -> DraggableAnnotation | None: ...\n\nclass Annotation(Text, _AnnotationBase):\n arrowprops: dict[str, Any] | None\n arrow_patch: FancyArrowPatch | None\n def __init__(\n self,\n text: str,\n xy: tuple[float, float],\n xytext: tuple[float, float] | None = ...,\n xycoords: CoordsType = ...,\n textcoords: CoordsType | None = ...,\n arrowprops: dict[str, Any] | None = ...,\n annotation_clip: bool | None = ...,\n **kwargs\n ) -> None: ...\n @property\n def xycoords(\n self,\n ) -> CoordsType: ...\n @xycoords.setter\n def xycoords(\n self,\n xycoords: CoordsType,\n ) -> None: ...\n @property\n def xyann(self) -> tuple[float, float]: ...\n @xyann.setter\n def xyann(self, xytext: 
tuple[float, float]) -> None: ...\n def get_anncoords(\n self,\n ) -> CoordsType: ...\n def set_anncoords(\n self,\n coords: CoordsType,\n ) -> None: ...\n @property\n def anncoords(\n self,\n ) -> CoordsType: ...\n @anncoords.setter\n def anncoords(\n self,\n coords: CoordsType,\n ) -> None: ...\n def update_positions(self, renderer: RendererBase) -> None: ...\n # Drops `dpi` parameter from superclass\n def get_window_extent(self, renderer: RendererBase | None = ...) -> Bbox: ... # type: ignore[override]\n | .venv\Lib\site-packages\matplotlib\text.pyi | text.pyi | Other | 7,019 | 0.95 | 0.425414 | 0.022727 | react-lib | 931 | 2023-12-01T05:55:41.240573 | BSD-3-Clause | false | 8bd3abe46bce6ed433da4bf265178b4e |
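The `xycoords` property/setter pair typed in the stub above enforces, in the implementation, that an annotation point cannot itself use an offset-based coordinate system. A standalone sketch of that check (the wrapper name `validate_xycoords` is illustrative; the real logic lives in the `Annotation.xycoords` setter):

```python
def validate_xycoords(xycoords):
    """Reject offset-based coordinate specs, as the Annotation.xycoords
    setter does; pass anything else through unchanged."""
    def is_offset(s):
        return isinstance(s, str) and s.startswith("offset")
    if (isinstance(xycoords, tuple) and any(map(is_offset, xycoords))
            or is_offset(xycoords)):
        raise ValueError("xycoords cannot be an offset coordinate")
    return xycoords
```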
from collections import OrderedDict\nimport logging\nimport urllib.parse\n\nimport numpy as np\n\nfrom matplotlib import _text_helpers, dviread\nfrom matplotlib.font_manager import (\n FontProperties, get_font, fontManager as _fontManager\n)\nfrom matplotlib.ft2font import LoadFlags\nfrom matplotlib.mathtext import MathTextParser\nfrom matplotlib.path import Path\nfrom matplotlib.texmanager import TexManager\nfrom matplotlib.transforms import Affine2D\n\n_log = logging.getLogger(__name__)\n\n\nclass TextToPath:\n """A class that converts strings to paths."""\n\n FONT_SCALE = 100.\n DPI = 72\n\n def __init__(self):\n self.mathtext_parser = MathTextParser('path')\n self._texmanager = None\n\n def _get_font(self, prop):\n """\n Find the `FT2Font` matching font properties *prop*, with its size set.\n """\n filenames = _fontManager._find_fonts_by_props(prop)\n font = get_font(filenames)\n font.set_size(self.FONT_SCALE, self.DPI)\n return font\n\n def _get_hinting_flag(self):\n return LoadFlags.NO_HINTING\n\n def _get_char_id(self, font, ccode):\n """\n Return a unique id for the given font and character-code set.\n """\n return urllib.parse.quote(f"{font.postscript_name}-{ccode:x}")\n\n def get_text_width_height_descent(self, s, prop, ismath):\n fontsize = prop.get_size_in_points()\n\n if ismath == "TeX":\n return TexManager().get_text_width_height_descent(s, fontsize)\n\n scale = fontsize / self.FONT_SCALE\n\n if ismath:\n prop = prop.copy()\n prop.set_size(self.FONT_SCALE)\n width, height, descent, *_ = \\n self.mathtext_parser.parse(s, 72, prop)\n return width * scale, height * scale, descent * scale\n\n font = self._get_font(prop)\n font.set_text(s, 0.0, flags=LoadFlags.NO_HINTING)\n w, h = font.get_width_height()\n w /= 64.0 # convert from subpixels\n h /= 64.0\n d = font.get_descent()\n d /= 64.0\n return w * scale, h * scale, d * scale\n\n def get_text_path(self, prop, s, ismath=False):\n """\n Convert text *s* to path (a tuple of vertices and codes for\n 
matplotlib.path.Path).\n\n Parameters\n ----------\n prop : `~matplotlib.font_manager.FontProperties`\n The font properties for the text.\n s : str\n The text to be converted.\n ismath : {False, True, "TeX"}\n If True, use mathtext parser. If "TeX", use tex for rendering.\n\n Returns\n -------\n verts : list\n A list of arrays containing the (x, y) coordinates of the vertices.\n codes : list\n A list of path codes.\n\n Examples\n --------\n Create a list of vertices and codes from a text, and create a `.Path`\n from those::\n\n from matplotlib.path import Path\n from matplotlib.text import TextToPath\n from matplotlib.font_manager import FontProperties\n\n fp = FontProperties(family="Comic Neue", style="italic")\n verts, codes = TextToPath().get_text_path(fp, "ABC")\n path = Path(verts, codes, closed=False)\n\n Also see `TextPath` for a more direct way to create a path from a text.\n """\n if ismath == "TeX":\n glyph_info, glyph_map, rects = self.get_glyphs_tex(prop, s)\n elif not ismath:\n font = self._get_font(prop)\n glyph_info, glyph_map, rects = self.get_glyphs_with_font(font, s)\n else:\n glyph_info, glyph_map, rects = self.get_glyphs_mathtext(prop, s)\n\n verts, codes = [], []\n for glyph_id, xposition, yposition, scale in glyph_info:\n verts1, codes1 = glyph_map[glyph_id]\n verts.extend(verts1 * scale + [xposition, yposition])\n codes.extend(codes1)\n for verts1, codes1 in rects:\n verts.extend(verts1)\n codes.extend(codes1)\n\n # Make sure an empty string or one with nothing to print\n # (e.g. 
only spaces & newlines) will be valid/empty path\n if not verts:\n verts = np.empty((0, 2))\n\n return verts, codes\n\n def get_glyphs_with_font(self, font, s, glyph_map=None,\n return_new_glyphs_only=False):\n """\n Convert string *s* to vertices and codes using the provided ttf font.\n """\n\n if glyph_map is None:\n glyph_map = OrderedDict()\n\n if return_new_glyphs_only:\n glyph_map_new = OrderedDict()\n else:\n glyph_map_new = glyph_map\n\n xpositions = []\n glyph_ids = []\n for item in _text_helpers.layout(s, font):\n char_id = self._get_char_id(item.ft_object, ord(item.char))\n glyph_ids.append(char_id)\n xpositions.append(item.x)\n if char_id not in glyph_map:\n glyph_map_new[char_id] = item.ft_object.get_path()\n\n ypositions = [0] * len(xpositions)\n sizes = [1.] * len(xpositions)\n\n rects = []\n\n return (list(zip(glyph_ids, xpositions, ypositions, sizes)),\n glyph_map_new, rects)\n\n def get_glyphs_mathtext(self, prop, s, glyph_map=None,\n return_new_glyphs_only=False):\n """\n Parse mathtext string *s* and convert it to a (vertices, codes) pair.\n """\n\n prop = prop.copy()\n prop.set_size(self.FONT_SCALE)\n\n width, height, descent, glyphs, rects = self.mathtext_parser.parse(\n s, self.DPI, prop)\n\n if not glyph_map:\n glyph_map = OrderedDict()\n\n if return_new_glyphs_only:\n glyph_map_new = OrderedDict()\n else:\n glyph_map_new = glyph_map\n\n xpositions = []\n ypositions = []\n glyph_ids = []\n sizes = []\n\n for font, fontsize, ccode, ox, oy in glyphs:\n char_id = self._get_char_id(font, ccode)\n if char_id not in glyph_map:\n font.clear()\n font.set_size(self.FONT_SCALE, self.DPI)\n font.load_char(ccode, flags=LoadFlags.NO_HINTING)\n glyph_map_new[char_id] = font.get_path()\n\n xpositions.append(ox)\n ypositions.append(oy)\n glyph_ids.append(char_id)\n size = fontsize / self.FONT_SCALE\n sizes.append(size)\n\n myrects = []\n for ox, oy, w, h in rects:\n vert1 = [(ox, oy), (ox, oy + h), (ox + w, oy + h),\n (ox + w, oy), (ox, oy), (0, 0)]\n code1 
= [Path.MOVETO,\n Path.LINETO, Path.LINETO, Path.LINETO, Path.LINETO,\n Path.CLOSEPOLY]\n myrects.append((vert1, code1))\n\n return (list(zip(glyph_ids, xpositions, ypositions, sizes)),\n glyph_map_new, myrects)\n\n def get_glyphs_tex(self, prop, s, glyph_map=None,\n return_new_glyphs_only=False):\n """Convert the string *s* to vertices and codes using usetex mode."""\n # Mostly borrowed from pdf backend.\n\n dvifile = TexManager().make_dvi(s, self.FONT_SCALE)\n with dviread.Dvi(dvifile, self.DPI) as dvi:\n page, = dvi\n\n if glyph_map is None:\n glyph_map = OrderedDict()\n\n if return_new_glyphs_only:\n glyph_map_new = OrderedDict()\n else:\n glyph_map_new = glyph_map\n\n glyph_ids, xpositions, ypositions, sizes = [], [], [], []\n\n # Gather font information and do some setup for combining\n # characters into strings.\n for text in page.text:\n font = get_font(text.font_path)\n char_id = self._get_char_id(font, text.glyph)\n if char_id not in glyph_map:\n font.clear()\n font.set_size(self.FONT_SCALE, self.DPI)\n glyph_name_or_index = text.glyph_name_or_index\n if isinstance(glyph_name_or_index, str):\n index = font.get_name_index(glyph_name_or_index)\n font.load_glyph(index, flags=LoadFlags.TARGET_LIGHT)\n elif isinstance(glyph_name_or_index, int):\n self._select_native_charmap(font)\n font.load_char(\n glyph_name_or_index, flags=LoadFlags.TARGET_LIGHT)\n else: # Should not occur.\n raise TypeError(f"Glyph spec of unexpected type: "\n f"{glyph_name_or_index!r}")\n glyph_map_new[char_id] = font.get_path()\n\n glyph_ids.append(char_id)\n xpositions.append(text.x)\n ypositions.append(text.y)\n sizes.append(text.font_size / self.FONT_SCALE)\n\n myrects = []\n\n for ox, oy, h, w in page.boxes:\n vert1 = [(ox, oy), (ox + w, oy), (ox + w, oy + h),\n (ox, oy + h), (ox, oy), (0, 0)]\n code1 = [Path.MOVETO,\n Path.LINETO, Path.LINETO, Path.LINETO, Path.LINETO,\n Path.CLOSEPOLY]\n myrects.append((vert1, code1))\n\n return (list(zip(glyph_ids, xpositions, ypositions, 
sizes)),\n glyph_map_new, myrects)\n\n @staticmethod\n def _select_native_charmap(font):\n # Select the native charmap. (we can't directly identify it but it's\n # typically an Adobe charmap).\n for charmap_code in [\n 1094992451, # ADOBE_CUSTOM.\n 1094995778, # ADOBE_STANDARD.\n ]:\n try:\n font.select_charmap(charmap_code)\n except (ValueError, RuntimeError):\n pass\n else:\n break\n else:\n _log.warning("No supported encoding in font (%s).", font.fname)\n\n\ntext_to_path = TextToPath()\n\n\nclass TextPath(Path):\n """\n Create a path from the text.\n """\n\n def __init__(self, xy, s, size=None, prop=None,\n _interpolation_steps=1, usetex=False):\n r"""\n Create a path from the text. Note that it simply is a path,\n not an artist. You need to use the `.PathPatch` (or other artists)\n to draw this path onto the canvas.\n\n Parameters\n ----------\n xy : tuple or array of two float values\n Position of the text. For no offset, use ``xy=(0, 0)``.\n\n s : str\n The text to convert to a path.\n\n size : float, optional\n Font size in points. Defaults to the size specified via the font\n properties *prop*.\n\n prop : `~matplotlib.font_manager.FontProperties`, optional\n Font property. 
If not provided, will use a default\n `.FontProperties` with parameters from the\n :ref:`rcParams<customizing-with-dynamic-rc-settings>`.\n\n _interpolation_steps : int, optional\n (Currently ignored)\n\n usetex : bool, default: False\n Whether to use tex rendering.\n\n Examples\n --------\n The following creates a path from the string "ABC" with Helvetica\n font face; and another path from the latex fraction 1/2::\n\n from matplotlib.text import TextPath\n from matplotlib.font_manager import FontProperties\n\n fp = FontProperties(family="Helvetica", style="italic")\n path1 = TextPath((12, 12), "ABC", size=12, prop=fp)\n path2 = TextPath((0, 0), r"$\frac{1}{2}$", size=12, usetex=True)\n\n Also see :doc:`/gallery/text_labels_and_annotations/demo_text_path`.\n """\n # Circular import.\n from matplotlib.text import Text\n\n prop = FontProperties._from_any(prop)\n if size is None:\n size = prop.get_size_in_points()\n\n self._xy = xy\n self.set_size(size)\n\n self._cached_vertices = None\n s, ismath = Text(usetex=usetex)._preprocess_math(s)\n super().__init__(\n *text_to_path.get_text_path(prop, s, ismath=ismath),\n _interpolation_steps=_interpolation_steps,\n readonly=True)\n self._should_simplify = False\n\n def set_size(self, size):\n """Set the text size."""\n self._size = size\n self._invalid = True\n\n def get_size(self):\n """Get the text size."""\n return self._size\n\n @property\n def vertices(self):\n """\n Return the cached path after updating it if necessary.\n """\n self._revalidate_path()\n return self._cached_vertices\n\n @property\n def codes(self):\n """\n Return the codes\n """\n return self._codes\n\n def _revalidate_path(self):\n """\n Update the path if necessary.\n\n The path for the text is initially create with the font size of\n `.FONT_SCALE`, and this path is rescaled to other size when necessary.\n """\n if self._invalid or self._cached_vertices is None:\n tr = (Affine2D()\n .scale(self._size / text_to_path.FONT_SCALE)\n 
.translate(*self._xy))\n self._cached_vertices = tr.transform(self._vertices)\n self._cached_vertices.flags.writeable = False\n self._invalid = False\n | .venv\Lib\site-packages\matplotlib\textpath.py | textpath.py | Python | 13,254 | 0.95 | 0.133501 | 0.028125 | react-lib | 20 | 2024-06-08T22:39:50.827247 | MIT | false | 9593a739b0619c9f5360a6f2808204f3 |
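`TextToPath.get_text_path` above assembles the final outline by scaling each cached glyph path and translating it to its layout position (`verts1 * scale + [xposition, yposition]`, done with NumPy broadcasting). The same arithmetic in plain Python, with `place_glyph` as an illustrative name:

```python
def place_glyph(verts, scale, xpos, ypos):
    """Scale a glyph outline's vertices and translate them to their
    layout position, one glyph at a time, as get_text_path does."""
    return [(x * scale + xpos, y * scale + ypos) for x, y in verts]
```

In the real code `scale` is `fontsize / FONT_SCALE`, since every glyph path is cached at the fixed `FONT_SCALE` of 100 points.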
from matplotlib.font_manager import FontProperties\nfrom matplotlib.ft2font import FT2Font\nfrom matplotlib.mathtext import MathTextParser, VectorParse\nfrom matplotlib.path import Path\n\nimport numpy as np\n\nfrom typing import Literal\n\nclass TextToPath:\n FONT_SCALE: float\n DPI: float\n mathtext_parser: MathTextParser[VectorParse]\n def __init__(self) -> None: ...\n def get_text_width_height_descent(\n self, s: str, prop: FontProperties, ismath: bool | Literal["TeX"]\n ) -> tuple[float, float, float]: ...\n def get_text_path(\n self, prop: FontProperties, s: str, ismath: bool | Literal["TeX"] = ...\n ) -> list[np.ndarray]: ...\n def get_glyphs_with_font(\n self,\n font: FT2Font,\n s: str,\n glyph_map: dict[str, tuple[np.ndarray, np.ndarray]] | None = ...,\n return_new_glyphs_only: bool = ...,\n ) -> tuple[\n list[tuple[str, float, float, float]],\n dict[str, tuple[np.ndarray, np.ndarray]],\n list[tuple[list[tuple[float, float]], list[int]]],\n ]: ...\n def get_glyphs_mathtext(\n self,\n prop: FontProperties,\n s: str,\n glyph_map: dict[str, tuple[np.ndarray, np.ndarray]] | None = ...,\n return_new_glyphs_only: bool = ...,\n ) -> tuple[\n list[tuple[str, float, float, float]],\n dict[str, tuple[np.ndarray, np.ndarray]],\n list[tuple[list[tuple[float, float]], list[int]]],\n ]: ...\n def get_glyphs_tex(\n self,\n prop: FontProperties,\n s: str,\n glyph_map: dict[str, tuple[np.ndarray, np.ndarray]] | None = ...,\n return_new_glyphs_only: bool = ...,\n ) -> tuple[\n list[tuple[str, float, float, float]],\n dict[str, tuple[np.ndarray, np.ndarray]],\n list[tuple[list[tuple[float, float]], list[int]]],\n ]: ...\n\ntext_to_path: TextToPath\n\nclass TextPath(Path):\n def __init__(\n self,\n xy: tuple[float, float],\n s: str,\n size: float | None = ...,\n prop: FontProperties | None = ...,\n _interpolation_steps: int = ...,\n usetex: bool = ...,\n ) -> None: ...\n def set_size(self, size: float | None) -> None: ...\n def get_size(self) -> float | None: ...\n\n # These 
are read only... there actually are protections in the base class, so probably can be deleted...\n @property # type: ignore[misc]\n def vertices(self) -> np.ndarray: ... # type: ignore[override]\n @property # type: ignore[misc]\n def codes(self) -> np.ndarray: ... # type: ignore[override]\n | .venv\Lib\site-packages\matplotlib\textpath.pyi | textpath.pyi | Other | 2,529 | 0.95 | 0.189189 | 0.014706 | python-kit | 443 | 2024-04-28T17:30:39.970114 | GPL-3.0 | false | cbb841dc09d4143797940499f80d251f |
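The read-only `vertices` property in the stub reflects `TextPath`'s lazy caching: the path is built once at `FONT_SCALE` and rescaled only when the size changes. A minimal sketch of that invalidate-and-revalidate pattern (the class name `LazyScaledPath` is illustrative; the real `_revalidate_path` also translates by `self._xy`):

```python
class LazyScaledPath:
    """Cache scaled vertices; recompute only after set_size()."""
    FONT_SCALE = 100.0

    def __init__(self, base_verts, size):
        self._base = base_verts   # vertices built at FONT_SCALE
        self._size = size
        self._cache = None
        self._invalid = True

    def set_size(self, size):
        self._size = size
        self._invalid = True      # mark cache stale, rebuild on next access

    @property
    def vertices(self):
        if self._invalid or self._cache is None:
            s = self._size / self.FONT_SCALE
            self._cache = [(x * s, y * s) for x, y in self._base]
            self._invalid = False
        return self._cache


path = LazyScaledPath([(100, 200)], 12)
first = path.vertices         # computed at size 12
path.set_size(50)
second = path.vertices        # recomputed after invalidation
```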
from collections.abc import Callable, Sequence\nfrom typing import Any, Literal\n\nfrom matplotlib.axis import Axis\nfrom matplotlib.transforms import Transform\nfrom matplotlib.projections.polar import _AxisWrapper\n\nimport numpy as np\n\nclass _DummyAxis:\n __name__: str\n def __init__(self, minpos: float = ...) -> None: ...\n def get_view_interval(self) -> tuple[float, float]: ...\n def set_view_interval(self, vmin: float, vmax: float) -> None: ...\n def get_minpos(self) -> float: ...\n def get_data_interval(self) -> tuple[float, float]: ...\n def set_data_interval(self, vmin: float, vmax: float) -> None: ...\n def get_tick_space(self) -> int: ...\n\nclass TickHelper:\n axis: None | Axis | _DummyAxis | _AxisWrapper\n def set_axis(self, axis: Axis | _DummyAxis | _AxisWrapper | None) -> None: ...\n def create_dummy_axis(self, **kwargs) -> None: ...\n\nclass Formatter(TickHelper):\n locs: list[float]\n def __call__(self, x: float, pos: int | None = ...) -> str: ...\n def format_ticks(self, values: list[float]) -> list[str]: ...\n def format_data(self, value: float) -> str: ...\n def format_data_short(self, value: float) -> str: ...\n def get_offset(self) -> str: ...\n def set_locs(self, locs: list[float]) -> None: ...\n @staticmethod\n def fix_minus(s: str) -> str: ...\n\nclass NullFormatter(Formatter): ...\n\nclass FixedFormatter(Formatter):\n seq: Sequence[str]\n offset_string: str\n def __init__(self, seq: Sequence[str]) -> None: ...\n def set_offset_string(self, ofs: str) -> None: ...\n\nclass FuncFormatter(Formatter):\n func: Callable[[float, int | None], str]\n offset_string: str\n # Callable[[float, int | None], str] | Callable[[float], str]\n def __init__(self, func: Callable[..., str]) -> None: ...\n def set_offset_string(self, ofs: str) -> None: ...\n\nclass FormatStrFormatter(Formatter):\n fmt: str\n def __init__(self, fmt: str) -> None: ...\n\nclass StrMethodFormatter(Formatter):\n fmt: str\n def __init__(self, fmt: str) -> None: ...\n\nclass 
ScalarFormatter(Formatter):\n orderOfMagnitude: int\n format: str\n def __init__(\n self,\n useOffset: bool | float | None = ...,\n useMathText: bool | None = ...,\n useLocale: bool | None = ...,\n *,\n usetex: bool | None = ...,\n ) -> None: ...\n offset: float\n def get_usetex(self) -> bool: ...\n def set_usetex(self, val: bool) -> None: ...\n @property\n def usetex(self) -> bool: ...\n @usetex.setter\n def usetex(self, val: bool) -> None: ...\n def get_useOffset(self) -> bool: ...\n def set_useOffset(self, val: bool | float) -> None: ...\n @property\n def useOffset(self) -> bool: ...\n @useOffset.setter\n def useOffset(self, val: bool | float) -> None: ...\n def get_useLocale(self) -> bool: ...\n def set_useLocale(self, val: bool | None) -> None: ...\n @property\n def useLocale(self) -> bool: ...\n @useLocale.setter\n def useLocale(self, val: bool | None) -> None: ...\n def get_useMathText(self) -> bool: ...\n def set_useMathText(self, val: bool | None) -> None: ...\n @property\n def useMathText(self) -> bool: ...\n @useMathText.setter\n def useMathText(self, val: bool | None) -> None: ...\n def set_scientific(self, b: bool) -> None: ...\n def set_powerlimits(self, lims: tuple[int, int]) -> None: ...\n def format_data_short(self, value: float | np.ma.MaskedArray) -> str: ...\n def format_data(self, value: float) -> str: ...\n\nclass LogFormatter(Formatter):\n minor_thresholds: tuple[float, float]\n def __init__(\n self,\n base: float = ...,\n labelOnlyBase: bool = ...,\n minor_thresholds: tuple[float, float] | None = ...,\n linthresh: float | None = ...,\n ) -> None: ...\n def set_base(self, base: float) -> None: ...\n labelOnlyBase: bool\n def set_label_minor(self, labelOnlyBase: bool) -> None: ...\n def set_locs(self, locs: Any | None = ...) 
-> None: ...\n def format_data(self, value: float) -> str: ...\n def format_data_short(self, value: float) -> str: ...\n\nclass LogFormatterExponent(LogFormatter): ...\nclass LogFormatterMathtext(LogFormatter): ...\nclass LogFormatterSciNotation(LogFormatterMathtext): ...\n\nclass LogitFormatter(Formatter):\n def __init__(\n self,\n *,\n use_overline: bool = ...,\n one_half: str = ...,\n minor: bool = ...,\n minor_threshold: int = ...,\n minor_number: int = ...\n ) -> None: ...\n def use_overline(self, use_overline: bool) -> None: ...\n def set_one_half(self, one_half: str) -> None: ...\n def set_minor_threshold(self, minor_threshold: int) -> None: ...\n def set_minor_number(self, minor_number: int) -> None: ...\n def format_data_short(self, value: float) -> str: ...\n\nclass EngFormatter(ScalarFormatter):\n ENG_PREFIXES: dict[int, str]\n unit: str\n places: int | None\n sep: str\n def __init__(\n self,\n unit: str = ...,\n places: int | None = ...,\n sep: str = ...,\n *,\n usetex: bool | None = ...,\n useMathText: bool | None = ...,\n useOffset: bool | float | None = ...,\n ) -> None: ...\n def format_eng(self, num: float) -> str: ...\n\nclass PercentFormatter(Formatter):\n xmax: float\n decimals: int | None\n def __init__(\n self,\n xmax: float = ...,\n decimals: int | None = ...,\n symbol: str | None = ...,\n is_latex: bool = ...,\n ) -> None: ...\n def format_pct(self, x: float, display_range: float) -> str: ...\n def convert_to_pct(self, x: float) -> float: ...\n @property\n def symbol(self) -> str: ...\n @symbol.setter\n def symbol(self, symbol: str) -> None: ...\n\nclass Locator(TickHelper):\n MAXTICKS: int\n def tick_values(self, vmin: float, vmax: float) -> Sequence[float]: ...\n # Implementation accepts **kwargs, but is a no-op other than a warning\n # Typing as **kwargs would require each subclass to accept **kwargs for mypy\n def set_params(self) -> None: ...\n def __call__(self) -> Sequence[float]: ...\n def raise_if_exceeds(self, locs: 
Sequence[float]) -> Sequence[float]: ...\n def nonsingular(self, v0: float, v1: float) -> tuple[float, float]: ...\n def view_limits(self, vmin: float, vmax: float) -> tuple[float, float]: ...\n\nclass IndexLocator(Locator):\n offset: float\n def __init__(self, base: float, offset: float) -> None: ...\n def set_params(\n self, base: float | None = ..., offset: float | None = ...\n ) -> None: ...\n\nclass FixedLocator(Locator):\n nbins: int | None\n def __init__(self, locs: Sequence[float], nbins: int | None = ...) -> None: ...\n def set_params(self, nbins: int | None = ...) -> None: ...\n\nclass NullLocator(Locator): ...\n\nclass LinearLocator(Locator):\n presets: dict[tuple[float, float], Sequence[float]]\n def __init__(\n self,\n numticks: int | None = ...,\n presets: dict[tuple[float, float], Sequence[float]] | None = ...,\n ) -> None: ...\n @property\n def numticks(self) -> int: ...\n @numticks.setter\n def numticks(self, numticks: int | None) -> None: ...\n def set_params(\n self,\n numticks: int | None = ...,\n presets: dict[tuple[float, float], Sequence[float]] | None = ...,\n ) -> None: ...\n\nclass MultipleLocator(Locator):\n def __init__(self, base: float = ..., offset: float = ...) -> None: ...\n def set_params(self, base: float | None = ..., offset: float | None = ...) 
-> None: ...\n def view_limits(self, dmin: float, dmax: float) -> tuple[float, float]: ...\n\nclass _Edge_integer:\n step: float\n def __init__(self, step: float, offset: float) -> None: ...\n def closeto(self, ms: float, edge: float) -> bool: ...\n def le(self, x: float) -> float: ...\n def ge(self, x: float) -> float: ...\n\nclass MaxNLocator(Locator):\n default_params: dict[str, Any]\n def __init__(self, nbins: int | Literal["auto"] | None = ..., **kwargs) -> None: ...\n def set_params(self, **kwargs) -> None: ...\n def view_limits(self, dmin: float, dmax: float) -> tuple[float, float]: ...\n\nclass LogLocator(Locator):\n numticks: int | None\n def __init__(\n self,\n base: float = ...,\n subs: None | Literal["auto", "all"] | Sequence[float] = ...,\n *,\n numticks: int | None = ...,\n ) -> None: ...\n def set_params(\n self,\n base: float | None = ...,\n subs: Literal["auto", "all"] | Sequence[float] | None = ...,\n *,\n numticks: int | None = ...,\n ) -> None: ...\n\nclass SymmetricalLogLocator(Locator):\n numticks: int\n def __init__(\n self,\n transform: Transform | None = ...,\n subs: Sequence[float] | None = ...,\n linthresh: float | None = ...,\n base: float | None = ...,\n ) -> None: ...\n def set_params(\n self, subs: Sequence[float] | None = ..., numticks: int | None = ...\n ) -> None: ...\n\nclass AsinhLocator(Locator):\n linear_width: float\n numticks: int\n symthresh: float\n base: int\n subs: Sequence[float] | None\n def __init__(\n self,\n linear_width: float,\n numticks: int = ...,\n symthresh: float = ...,\n base: int = ...,\n subs: Sequence[float] | None = ...,\n ) -> None: ...\n def set_params(\n self,\n numticks: int | None = ...,\n symthresh: float | None = ...,\n base: int | None = ...,\n subs: Sequence[float] | None = ...,\n ) -> None: ...\n\nclass LogitLocator(MaxNLocator):\n def __init__(\n self, minor: bool = ..., *, nbins: Literal["auto"] | int = ...\n ) -> None: ...\n def set_params(self, minor: bool | None = ..., **kwargs) -> None: 
...\n @property\n def minor(self) -> bool: ...\n @minor.setter\n def minor(self, value: bool) -> None: ...\n\nclass AutoLocator(MaxNLocator):\n def __init__(self) -> None: ...\n\nclass AutoMinorLocator(Locator):\n ndivs: int\n def __init__(self, n: int | None = ...) -> None: ...\n\n__all__ = ('TickHelper', 'Formatter', 'FixedFormatter',\n 'NullFormatter', 'FuncFormatter', 'FormatStrFormatter',\n 'StrMethodFormatter', 'ScalarFormatter', 'LogFormatter',\n 'LogFormatterExponent', 'LogFormatterMathtext',\n 'LogFormatterSciNotation',\n 'LogitFormatter', 'EngFormatter', 'PercentFormatter',\n 'Locator', 'IndexLocator', 'FixedLocator', 'NullLocator',\n 'LinearLocator', 'LogLocator', 'AutoLocator',\n 'MultipleLocator', 'MaxNLocator', 'AutoMinorLocator',\n 'SymmetricalLogLocator', 'AsinhLocator', 'LogitLocator')\n
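The `Locator` stubs above only declare signatures; to make the tick-placement idea concrete, here is a simplified, self-contained sketch of what a `MultipleLocator`-style locator computes. This is an illustration of the concept under stated assumptions (ticks at every multiple of *base*, shifted by *offset*, inside the view interval), not matplotlib's actual implementation, and `multiple_locator_ticks` is a hypothetical helper name:

```python
import math

def multiple_locator_ticks(vmin, vmax, base=1.0, offset=0.0):
    """Sketch of the MultipleLocator idea: return every value of the form
    offset + n * base (n an integer) that lies within [vmin, vmax].

    Not matplotlib's implementation; for illustration only.
    """
    # Smallest n with offset + n * base >= vmin.
    start = math.ceil((vmin - offset) / base)
    # Largest n with offset + n * base <= vmax.
    stop = math.floor((vmax - offset) / base)
    return [offset + n * base for n in range(start, stop + 1)]
```

For example, `multiple_locator_ticks(0, 10, base=2.5)` yields ticks at 0, 2.5, 5, 7.5 and 10; the real `MultipleLocator` additionally snaps view limits and handles floating-point edge cases.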
"""\nMatplotlib includes a framework for arbitrary geometric transformations that is used to\ndetermine the final position of all elements drawn on the canvas.\n\nTransforms are composed into trees of `TransformNode` objects\nwhose actual value depends on their children. When the contents of\nchildren change, their parents are automatically invalidated. The\nnext time an invalidated transform is accessed, it is recomputed to\nreflect those changes. This invalidation/caching approach prevents\nunnecessary recomputations of transforms, and contributes to better\ninteractive performance.\n\nFor example, here is a graph of the transform tree used to plot data to the figure:\n\n.. graphviz:: /api/transforms.dot\n :alt: Diagram of transform tree from data to figure coordinates.\n\nThe framework can be used for both affine and non-affine\ntransformations. However, for speed, we want to use the backend\nrenderers to perform affine transformations whenever possible.\nTherefore, it is possible to perform just the affine or non-affine\npart of a transformation on a set of data. The affine is always\nassumed to occur after the non-affine. For any transform::\n\n full transform == non-affine part + affine part\n\nThe backends are not expected to handle non-affine transformations\nthemselves.\n\nSee the tutorial :ref:`transforms_tutorial` for examples\nof how to use transforms.\n"""\n\n# Note: There are a number of places in the code where we use `np.min` or\n# `np.minimum` instead of the builtin `min`, and likewise for `max`. 
This is\n# done so that `nan`s are propagated, instead of being silently dropped.\n\nimport copy\nimport functools\nimport itertools\nimport textwrap\nimport weakref\nimport math\n\nimport numpy as np\nfrom numpy.linalg import inv\n\nfrom matplotlib import _api\nfrom matplotlib._path import (\n affine_transform, count_bboxes_overlapping_bbox, update_path_extents)\nfrom .path import Path\n\nDEBUG = False\n\n\ndef _make_str_method(*args, **kwargs):\n """\n Generate a ``__str__`` method for a `.Transform` subclass.\n\n After ::\n\n class T:\n __str__ = _make_str_method("attr", key="other")\n\n ``str(T(...))`` will be\n\n .. code-block:: text\n\n {type(T).__name__}(\n {self.attr},\n key={self.other})\n """\n indent = functools.partial(textwrap.indent, prefix=" " * 4)\n def strrepr(x): return repr(x) if isinstance(x, str) else str(x)\n return lambda self: (\n type(self).__name__ + "("\n + ",".join([*(indent("\n" + strrepr(getattr(self, arg)))\n for arg in args),\n *(indent("\n" + k + "=" + strrepr(getattr(self, arg)))\n for k, arg in kwargs.items())])\n + ")")\n\n\nclass TransformNode:\n """\n The base class for anything that participates in the transform tree\n and needs to invalidate its parents or be invalidated. This includes\n classes that are not really transforms, such as bounding boxes, since some\n transforms depend on bounding boxes to compute their values.\n """\n\n # Invalidation may affect only the affine part. 
If the\n # invalidation was "affine-only", the _invalid member is set to\n # INVALID_AFFINE_ONLY\n\n # Possible values for the _invalid attribute.\n _VALID, _INVALID_AFFINE_ONLY, _INVALID_FULL = range(3)\n\n # Some metadata about the transform, used to determine whether an\n # invalidation is affine-only\n is_affine = False\n is_bbox = _api.deprecated("3.9")(_api.classproperty(lambda cls: False))\n\n pass_through = False\n """\n If pass_through is True, all ancestors will always be\n invalidated, even if 'self' is already invalid.\n """\n\n def __init__(self, shorthand_name=None):\n """\n Parameters\n ----------\n shorthand_name : str\n A string representing the "name" of the transform. The name carries\n no significance other than to improve the readability of\n ``str(transform)`` when DEBUG=True.\n """\n self._parents = {}\n # Initially invalid, until first computation.\n self._invalid = self._INVALID_FULL\n self._shorthand_name = shorthand_name or ''\n\n if DEBUG:\n def __str__(self):\n # either just return the name of this TransformNode, or its repr\n return self._shorthand_name or repr(self)\n\n def __getstate__(self):\n # turn the dictionary with weak values into a normal dictionary\n return {**self.__dict__,\n '_parents': {k: v() for k, v in self._parents.items()}}\n\n def __setstate__(self, data_dict):\n self.__dict__ = data_dict\n # turn the normal dictionary back into a dictionary with weak values\n # The extra lambda is to provide a callback to remove dead\n # weakrefs from the dictionary when garbage collection is done.\n self._parents = {\n k: weakref.ref(v, lambda _, pop=self._parents.pop, k=k: pop(k))\n for k, v in self._parents.items() if v is not None}\n\n def __copy__(self):\n other = copy.copy(super())\n # If `c = a + b; a1 = copy(a)`, then modifications to `a1` do not\n # propagate back to `c`, i.e. 
we need to clear the parents of `a1`.\n other._parents = {}\n # If `c = a + b; c1 = copy(c)`, then modifications to `a` also need to\n # be propagated to `c1`.\n for key, val in vars(self).items():\n if isinstance(val, TransformNode) and id(self) in val._parents:\n other.set_children(val) # val == getattr(other, key)\n return other\n\n def invalidate(self):\n """\n Invalidate this `TransformNode` and triggers an invalidation of its\n ancestors. Should be called any time the transform changes.\n """\n return self._invalidate_internal(\n level=self._INVALID_AFFINE_ONLY if self.is_affine else self._INVALID_FULL,\n invalidating_node=self)\n\n def _invalidate_internal(self, level, invalidating_node):\n """\n Called by :meth:`invalidate` and subsequently ascends the transform\n stack calling each TransformNode's _invalidate_internal method.\n """\n # If we are already more invalid than the currently propagated invalidation,\n # then we don't need to do anything.\n if level <= self._invalid and not self.pass_through:\n return\n self._invalid = level\n for parent in list(self._parents.values()):\n parent = parent() # Dereference the weak reference.\n if parent is not None:\n parent._invalidate_internal(level=level, invalidating_node=self)\n\n def set_children(self, *children):\n """\n Set the children of the transform, to let the invalidation\n system know which transforms can invalidate this transform.\n Should be called from the constructor of any transforms that\n depend on other transforms.\n """\n # Parents are stored as weak references, so that if the\n # parents are destroyed, references from the children won't\n # keep them alive.\n id_self = id(self)\n for child in children:\n # Use weak references so this dictionary won't keep obsolete nodes\n # alive; the callback deletes the dictionary entry. 
This is a\n # performance improvement over using WeakValueDictionary.\n ref = weakref.ref(\n self, lambda _, pop=child._parents.pop, k=id_self: pop(k))\n child._parents[id_self] = ref\n\n def frozen(self):\n """\n Return a frozen copy of this transform node. The frozen copy will not\n be updated when its children change. Useful for storing a previously\n known state of a transform where ``copy.deepcopy()`` might normally be\n used.\n """\n return self\n\n\nclass BboxBase(TransformNode):\n """\n The base class of all bounding boxes.\n\n This class is immutable; `Bbox` is a mutable subclass.\n\n The canonical representation is as two points, with no\n restrictions on their ordering. Convenience properties are\n provided to get the left, bottom, right and top edges and width\n and height, but these are not stored explicitly.\n """\n\n is_bbox = _api.deprecated("3.9")(_api.classproperty(lambda cls: True))\n is_affine = True\n\n if DEBUG:\n @staticmethod\n def _check(points):\n if isinstance(points, np.ma.MaskedArray):\n _api.warn_external("Bbox bounds are a masked array.")\n points = np.asarray(points)\n if any((points[1, :] - points[0, :]) == 0):\n _api.warn_external("Singular Bbox.")\n\n def frozen(self):\n return Bbox(self.get_points().copy())\n frozen.__doc__ = TransformNode.__doc__\n\n def __array__(self, *args, **kwargs):\n return self.get_points()\n\n @property\n def x0(self):\n """\n The first of the pair of *x* coordinates that define the bounding box.\n\n This is not guaranteed to be less than :attr:`x1` (for that, use\n :attr:`xmin`).\n """\n return self.get_points()[0, 0]\n\n @property\n def y0(self):\n """\n The first of the pair of *y* coordinates that define the bounding box.\n\n This is not guaranteed to be less than :attr:`y1` (for that, use\n :attr:`ymin`).\n """\n return self.get_points()[0, 1]\n\n @property\n def x1(self):\n """\n The second of the pair of *x* coordinates that define the bounding box.\n\n This is not guaranteed to be greater than 
:attr:`x0` (for that, use\n :attr:`xmax`).\n """\n return self.get_points()[1, 0]\n\n @property\n def y1(self):\n """\n The second of the pair of *y* coordinates that define the bounding box.\n\n This is not guaranteed to be greater than :attr:`y0` (for that, use\n :attr:`ymax`).\n """\n return self.get_points()[1, 1]\n\n @property\n def p0(self):\n """\n The first pair of (*x*, *y*) coordinates that define the bounding box.\n\n This is not guaranteed to be the bottom-left corner (for that, use\n :attr:`min`).\n """\n return self.get_points()[0]\n\n @property\n def p1(self):\n """\n The second pair of (*x*, *y*) coordinates that define the bounding box.\n\n This is not guaranteed to be the top-right corner (for that, use\n :attr:`max`).\n """\n return self.get_points()[1]\n\n @property\n def xmin(self):\n """The left edge of the bounding box."""\n return np.min(self.get_points()[:, 0])\n\n @property\n def ymin(self):\n """The bottom edge of the bounding box."""\n return np.min(self.get_points()[:, 1])\n\n @property\n def xmax(self):\n """The right edge of the bounding box."""\n return np.max(self.get_points()[:, 0])\n\n @property\n def ymax(self):\n """The top edge of the bounding box."""\n return np.max(self.get_points()[:, 1])\n\n @property\n def min(self):\n """The bottom-left corner of the bounding box."""\n return np.min(self.get_points(), axis=0)\n\n @property\n def max(self):\n """The top-right corner of the bounding box."""\n return np.max(self.get_points(), axis=0)\n\n @property\n def intervalx(self):\n """\n The pair of *x* coordinates that define the bounding box.\n\n This is not guaranteed to be sorted from left to right.\n """\n return self.get_points()[:, 0]\n\n @property\n def intervaly(self):\n """\n The pair of *y* coordinates that define the bounding box.\n\n This is not guaranteed to be sorted from bottom to top.\n """\n return self.get_points()[:, 1]\n\n @property\n def width(self):\n """The (signed) width of the bounding box."""\n points = 
self.get_points()\n return points[1, 0] - points[0, 0]\n\n @property\n def height(self):\n """The (signed) height of the bounding box."""\n points = self.get_points()\n return points[1, 1] - points[0, 1]\n\n @property\n def size(self):\n """The (signed) width and height of the bounding box."""\n points = self.get_points()\n return points[1] - points[0]\n\n @property\n def bounds(self):\n """Return (:attr:`x0`, :attr:`y0`, :attr:`width`, :attr:`height`)."""\n (x0, y0), (x1, y1) = self.get_points()\n return (x0, y0, x1 - x0, y1 - y0)\n\n @property\n def extents(self):\n """Return (:attr:`x0`, :attr:`y0`, :attr:`x1`, :attr:`y1`)."""\n return self.get_points().flatten() # flatten returns a copy.\n\n def get_points(self):\n raise NotImplementedError\n\n def containsx(self, x):\n """\n Return whether *x* is in the closed (:attr:`x0`, :attr:`x1`) interval.\n """\n x0, x1 = self.intervalx\n return x0 <= x <= x1 or x0 >= x >= x1\n\n def containsy(self, y):\n """\n Return whether *y* is in the closed (:attr:`y0`, :attr:`y1`) interval.\n """\n y0, y1 = self.intervaly\n return y0 <= y <= y1 or y0 >= y >= y1\n\n def contains(self, x, y):\n """\n Return whether ``(x, y)`` is in the bounding box or on its edge.\n """\n return self.containsx(x) and self.containsy(y)\n\n def overlaps(self, other):\n """\n Return whether this bounding box overlaps with the other bounding box.\n\n Parameters\n ----------\n other : `.BboxBase`\n """\n ax1, ay1, ax2, ay2 = self.extents\n bx1, by1, bx2, by2 = other.extents\n if ax2 < ax1:\n ax2, ax1 = ax1, ax2\n if ay2 < ay1:\n ay2, ay1 = ay1, ay2\n if bx2 < bx1:\n bx2, bx1 = bx1, bx2\n if by2 < by1:\n by2, by1 = by1, by2\n return ax1 <= bx2 and bx1 <= ax2 and ay1 <= by2 and by1 <= ay2\n\n def fully_containsx(self, x):\n """\n Return whether *x* is in the open (:attr:`x0`, :attr:`x1`) interval.\n """\n x0, x1 = self.intervalx\n return x0 < x < x1 or x0 > x > x1\n\n def fully_containsy(self, y):\n """\n Return whether *y* is in the open (:attr:`y0`, 
:attr:`y1`) interval.\n """\n y0, y1 = self.intervaly\n return y0 < y < y1 or y0 > y > y1\n\n def fully_contains(self, x, y):\n """\n Return whether ``x, y`` is in the bounding box, but not on its edge.\n """\n return self.fully_containsx(x) and self.fully_containsy(y)\n\n def fully_overlaps(self, other):\n """\n Return whether this bounding box overlaps with the other bounding box,\n not including the edges.\n\n Parameters\n ----------\n other : `.BboxBase`\n """\n ax1, ay1, ax2, ay2 = self.extents\n bx1, by1, bx2, by2 = other.extents\n if ax2 < ax1:\n ax2, ax1 = ax1, ax2\n if ay2 < ay1:\n ay2, ay1 = ay1, ay2\n if bx2 < bx1:\n bx2, bx1 = bx1, bx2\n if by2 < by1:\n by2, by1 = by1, by2\n return ax1 < bx2 and bx1 < ax2 and ay1 < by2 and by1 < ay2\n\n def transformed(self, transform):\n """\n Construct a `Bbox` by statically transforming this one by *transform*.\n """\n pts = self.get_points()\n ll, ul, lr = transform.transform(np.array(\n [pts[0], [pts[0, 0], pts[1, 1]], [pts[1, 0], pts[0, 1]]]))\n return Bbox([ll, [lr[0], ul[1]]])\n\n coefs = {'C': (0.5, 0.5),\n 'SW': (0, 0),\n 'S': (0.5, 0),\n 'SE': (1.0, 0),\n 'E': (1.0, 0.5),\n 'NE': (1.0, 1.0),\n 'N': (0.5, 1.0),\n 'NW': (0, 1.0),\n 'W': (0, 0.5)}\n\n def anchored(self, c, container):\n """\n Return a copy of the `Bbox` anchored to *c* within *container*.\n\n Parameters\n ----------\n c : (float, float) or {'C', 'SW', 'S', 'SE', 'E', 'NE', ...}\n Either an (*x*, *y*) pair of relative coordinates (0 is left or\n bottom, 1 is right or top), 'C' (center), or a cardinal direction\n ('SW', southwest, is bottom left, etc.).\n container : `Bbox`\n The box within which the `Bbox` is positioned.\n\n See Also\n --------\n .Axes.set_anchor\n """\n l, b, w, h = container.bounds\n L, B, W, H = self.bounds\n cx, cy = self.coefs[c] if isinstance(c, str) else c\n return Bbox(self._points +\n [(l + cx * (w - W)) - L,\n (b + cy * (h - H)) - B])\n\n def shrunk(self, mx, my):\n """\n Return a copy of the `Bbox`, shrunk by the 
factor *mx*\n in the *x* direction and the factor *my* in the *y* direction.\n The lower left corner of the box remains unchanged. Normally\n *mx* and *my* will be less than 1, but this is not enforced.\n """\n w, h = self.size\n return Bbox([self._points[0],\n self._points[0] + [mx * w, my * h]])\n\n def shrunk_to_aspect(self, box_aspect, container=None, fig_aspect=1.0):\n """\n Return a copy of the `Bbox`, shrunk so that it is as\n large as it can be while having the desired aspect ratio,\n *box_aspect*. If the box coordinates are relative (i.e.\n fractions of a larger box such as a figure) then the\n physical aspect ratio of that figure is specified with\n *fig_aspect*, so that *box_aspect* can also be given as a\n ratio of the absolute dimensions, not the relative dimensions.\n """\n if box_aspect <= 0 or fig_aspect <= 0:\n raise ValueError("'box_aspect' and 'fig_aspect' must be positive")\n if container is None:\n container = self\n w, h = container.size\n H = w * box_aspect / fig_aspect\n if H <= h:\n W = w\n else:\n W = h * fig_aspect / box_aspect\n H = h\n return Bbox([self._points[0],\n self._points[0] + (W, H)])\n\n def splitx(self, *args):\n """\n Return a list of new `Bbox` objects formed by splitting the original\n one with vertical lines at fractional positions given by *args*.\n """\n xf = [0, *args, 1]\n x0, y0, x1, y1 = self.extents\n w = x1 - x0\n return [Bbox([[x0 + xf0 * w, y0], [x0 + xf1 * w, y1]])\n for xf0, xf1 in itertools.pairwise(xf)]\n\n def splity(self, *args):\n """\n Return a list of new `Bbox` objects formed by splitting the original\n one with horizontal lines at fractional positions given by *args*.\n """\n yf = [0, *args, 1]\n x0, y0, x1, y1 = self.extents\n h = y1 - y0\n return [Bbox([[x0, y0 + yf0 * h], [x1, y0 + yf1 * h]])\n for yf0, yf1 in itertools.pairwise(yf)]\n\n def count_contains(self, vertices):\n """\n Count the number of vertices contained in the `Bbox`.\n Any vertices with a non-finite x or y value are ignored.\n\n 
Parameters\n ----------\n vertices : (N, 2) array\n """\n if len(vertices) == 0:\n return 0\n vertices = np.asarray(vertices)\n with np.errstate(invalid='ignore'):\n return (((self.min < vertices) &\n (vertices < self.max)).all(axis=1).sum())\n\n def count_overlaps(self, bboxes):\n """\n Count the number of bounding boxes that overlap this one.\n\n Parameters\n ----------\n bboxes : sequence of `.BboxBase`\n """\n return count_bboxes_overlapping_bbox(\n self, np.atleast_3d([np.array(x) for x in bboxes]))\n\n def expanded(self, sw, sh):\n """\n Construct a `Bbox` by expanding this one around its center by the\n factors *sw* and *sh*.\n """\n width = self.width\n height = self.height\n deltaw = (sw * width - width) / 2.0\n deltah = (sh * height - height) / 2.0\n a = np.array([[-deltaw, -deltah], [deltaw, deltah]])\n return Bbox(self._points + a)\n\n def padded(self, w_pad, h_pad=None):\n """\n Construct a `Bbox` by padding this one on all four sides.\n\n Parameters\n ----------\n w_pad : float\n Width pad\n h_pad : float, optional\n Height pad. 
Defaults to *w_pad*.\n\n """\n points = self.get_points()\n if h_pad is None:\n h_pad = w_pad\n return Bbox(points + [[-w_pad, -h_pad], [w_pad, h_pad]])\n\n def translated(self, tx, ty):\n """Construct a `Bbox` by translating this one by *tx* and *ty*."""\n return Bbox(self._points + (tx, ty))\n\n def corners(self):\n """\n Return the corners of this rectangle as an array of points.\n\n Specifically, this returns the array\n ``[[x0, y0], [x0, y1], [x1, y0], [x1, y1]]``.\n """\n (x0, y0), (x1, y1) = self.get_points()\n return np.array([[x0, y0], [x0, y1], [x1, y0], [x1, y1]])\n\n def rotated(self, radians):\n """\n Return the axes-aligned bounding box that bounds the result of rotating\n this `Bbox` by an angle of *radians*.\n """\n corners = self.corners()\n corners_rotated = Affine2D().rotate(radians).transform(corners)\n bbox = Bbox.unit()\n bbox.update_from_data_xy(corners_rotated, ignore=True)\n return bbox\n\n @staticmethod\n def union(bboxes):\n """Return a `Bbox` that contains all of the given *bboxes*."""\n if not len(bboxes):\n raise ValueError("'bboxes' cannot be empty")\n x0 = np.min([bbox.xmin for bbox in bboxes])\n x1 = np.max([bbox.xmax for bbox in bboxes])\n y0 = np.min([bbox.ymin for bbox in bboxes])\n y1 = np.max([bbox.ymax for bbox in bboxes])\n return Bbox([[x0, y0], [x1, y1]])\n\n @staticmethod\n def intersection(bbox1, bbox2):\n """\n Return the intersection of *bbox1* and *bbox2* if they intersect, or\n None if they don't.\n """\n x0 = np.maximum(bbox1.xmin, bbox2.xmin)\n x1 = np.minimum(bbox1.xmax, bbox2.xmax)\n y0 = np.maximum(bbox1.ymin, bbox2.ymin)\n y1 = np.minimum(bbox1.ymax, bbox2.ymax)\n return Bbox([[x0, y0], [x1, y1]]) if x0 <= x1 and y0 <= y1 else None\n\n\n_default_minpos = np.array([np.inf, np.inf])\n\n\nclass Bbox(BboxBase):\n """\n A mutable bounding box.\n\n Examples\n --------\n **Create from known bounds**\n\n The default constructor takes the boundary "points" ``[[xmin, ymin],\n [xmax, ymax]]``.\n\n >>> Bbox([[1, 1], [3, 
7]])\n Bbox([[1.0, 1.0], [3.0, 7.0]])\n\n Alternatively, a Bbox can be created from the flattened points array, the\n so-called "extents" ``(xmin, ymin, xmax, ymax)``\n\n >>> Bbox.from_extents(1, 1, 3, 7)\n Bbox([[1.0, 1.0], [3.0, 7.0]])\n\n or from the "bounds" ``(xmin, ymin, width, height)``.\n\n >>> Bbox.from_bounds(1, 1, 2, 6)\n Bbox([[1.0, 1.0], [3.0, 7.0]])\n\n **Create from collections of points**\n\n The "empty" object for accumulating Bboxs is the null bbox, which is a\n stand-in for the empty set.\n\n >>> Bbox.null()\n Bbox([[inf, inf], [-inf, -inf]])\n\n Adding points to the null bbox will give you the bbox of those points.\n\n >>> box = Bbox.null()\n >>> box.update_from_data_xy([[1, 1]])\n >>> box\n Bbox([[1.0, 1.0], [1.0, 1.0]])\n >>> box.update_from_data_xy([[2, 3], [3, 2]], ignore=False)\n >>> box\n Bbox([[1.0, 1.0], [3.0, 3.0]])\n\n Setting ``ignore=True`` is equivalent to starting over from a null bbox.\n\n >>> box.update_from_data_xy([[1, 1]], ignore=True)\n >>> box\n Bbox([[1.0, 1.0], [1.0, 1.0]])\n\n .. warning::\n\n It is recommended to always specify ``ignore`` explicitly. If not, the\n default value of ``ignore`` can be changed at any time by code with\n access to your Bbox, for example using the method `~.Bbox.ignore`.\n\n **Properties of the ``null`` bbox**\n\n .. note::\n\n The current behavior of `Bbox.null()` may be surprising as it does\n not have all of the properties of the "empty set", and as such does\n not behave like a "zero" object in the mathematical sense. 
We may\n change that in the future (with a deprecation period).\n\n The null bbox is the identity for intersections\n\n >>> Bbox.intersection(Bbox([[1, 1], [3, 7]]), Bbox.null())\n Bbox([[1.0, 1.0], [3.0, 7.0]])\n\n except with itself, where it returns the full space.\n\n >>> Bbox.intersection(Bbox.null(), Bbox.null())\n Bbox([[-inf, -inf], [inf, inf]])\n\n A union containing null will always return the full space (not the other\n set!)\n\n >>> Bbox.union([Bbox([[0, 0], [0, 0]]), Bbox.null()])\n Bbox([[-inf, -inf], [inf, inf]])\n """\n\n def __init__(self, points, **kwargs):\n """\n Parameters\n ----------\n points : `~numpy.ndarray`\n A (2, 2) array of the form ``[[x0, y0], [x1, y1]]``.\n """\n super().__init__(**kwargs)\n points = np.asarray(points, float)\n if points.shape != (2, 2):\n raise ValueError('Bbox points must be of the form '\n '"[[x0, y0], [x1, y1]]".')\n self._points = points\n self._minpos = _default_minpos.copy()\n self._ignore = True\n # it is helpful in some contexts to know if the bbox is a\n # default or has been mutated; we store the orig points to\n # support the mutated methods\n self._points_orig = self._points.copy()\n if DEBUG:\n ___init__ = __init__\n\n def __init__(self, points, **kwargs):\n self._check(points)\n self.___init__(points, **kwargs)\n\n def invalidate(self):\n self._check(self._points)\n super().invalidate()\n\n def frozen(self):\n # docstring inherited\n frozen_bbox = super().frozen()\n frozen_bbox._minpos = self.minpos.copy()\n return frozen_bbox\n\n @staticmethod\n def unit():\n """Create a new unit `Bbox` from (0, 0) to (1, 1)."""\n return Bbox([[0, 0], [1, 1]])\n\n @staticmethod\n def null():\n """Create a new null `Bbox` from (inf, inf) to (-inf, -inf)."""\n return Bbox([[np.inf, np.inf], [-np.inf, -np.inf]])\n\n @staticmethod\n def from_bounds(x0, y0, width, height):\n """\n Create a new `Bbox` from *x0*, *y0*, *width* and *height*.\n\n *width* and *height* may be negative.\n """\n return Bbox.from_extents(x0, y0, 
x0 + width, y0 + height)\n\n @staticmethod\n def from_extents(*args, minpos=None):\n """\n Create a new Bbox from *left*, *bottom*, *right* and *top*.\n\n The *y*-axis increases upwards.\n\n Parameters\n ----------\n left, bottom, right, top : float\n The four extents of the bounding box.\n minpos : float or None\n If this is supplied, the Bbox will have a minimum positive value\n set. This is useful when dealing with logarithmic scales and other\n scales where negative bounds result in floating point errors.\n """\n bbox = Bbox(np.reshape(args, (2, 2)))\n if minpos is not None:\n bbox._minpos[:] = minpos\n return bbox\n\n def __format__(self, fmt):\n return (\n 'Bbox(x0={0.x0:{1}}, y0={0.y0:{1}}, x1={0.x1:{1}}, y1={0.y1:{1}})'.\n format(self, fmt))\n\n def __str__(self):\n return format(self, '')\n\n def __repr__(self):\n return 'Bbox([[{0.x0}, {0.y0}], [{0.x1}, {0.y1}]])'.format(self)\n\n def ignore(self, value):\n """\n Set whether the existing bounds of the box should be ignored\n by subsequent calls to :meth:`update_from_data_xy`.\n\n value : bool\n - When ``True``, subsequent calls to `update_from_data_xy` will\n ignore the existing bounds of the `Bbox`.\n - When ``False``, subsequent calls to `update_from_data_xy` will\n include the existing bounds of the `Bbox`.\n """\n self._ignore = value\n\n def update_from_path(self, path, ignore=None, updatex=True, updatey=True):\n """\n Update the bounds of the `Bbox` to contain the vertices of the\n provided path. 
After updating, the bounds will have positive *width*\n and *height*; *x0* and *y0* will be the minimal values.\n\n Parameters\n ----------\n path : `~matplotlib.path.Path`\n ignore : bool, optional\n - When ``True``, ignore the existing bounds of the `Bbox`.\n - When ``False``, include the existing bounds of the `Bbox`.\n - When ``None``, use the last value passed to :meth:`ignore`.\n updatex, updatey : bool, default: True\n When ``True``, update the x/y values.\n """\n if ignore is None:\n ignore = self._ignore\n\n if path.vertices.size == 0:\n return\n\n points, minpos, changed = update_path_extents(\n path, None, self._points, self._minpos, ignore)\n\n if changed:\n self.invalidate()\n if updatex:\n self._points[:, 0] = points[:, 0]\n self._minpos[0] = minpos[0]\n if updatey:\n self._points[:, 1] = points[:, 1]\n self._minpos[1] = minpos[1]\n\n def update_from_data_x(self, x, ignore=None):\n """\n Update the x-bounds of the `Bbox` based on the passed in data. After\n updating, the bounds will have positive *width*, and *x0* will be the\n minimal value.\n\n Parameters\n ----------\n x : `~numpy.ndarray`\n Array of x-values.\n ignore : bool, optional\n - When ``True``, ignore the existing bounds of the `Bbox`.\n - When ``False``, include the existing bounds of the `Bbox`.\n - When ``None``, use the last value passed to :meth:`ignore`.\n """\n x = np.ravel(x)\n self.update_from_data_xy(np.column_stack([x, np.ones(x.size)]),\n ignore=ignore, updatey=False)\n\n def update_from_data_y(self, y, ignore=None):\n """\n Update the y-bounds of the `Bbox` based on the passed in data. 
After\n updating, the bounds will have positive *height*, and *y0* will be the\n minimal value.\n\n Parameters\n ----------\n y : `~numpy.ndarray`\n Array of y-values.\n ignore : bool, optional\n - When ``True``, ignore the existing bounds of the `Bbox`.\n - When ``False``, include the existing bounds of the `Bbox`.\n - When ``None``, use the last value passed to :meth:`ignore`.\n """\n y = np.ravel(y)\n self.update_from_data_xy(np.column_stack([np.ones(y.size), y]),\n ignore=ignore, updatex=False)\n\n def update_from_data_xy(self, xy, ignore=None, updatex=True, updatey=True):\n """\n Update the `Bbox` bounds based on the passed in *xy* coordinates.\n\n After updating, the bounds will have positive *width* and *height*;\n *x0* and *y0* will be the minimal values.\n\n Parameters\n ----------\n xy : (N, 2) array-like\n The (x, y) coordinates.\n ignore : bool, optional\n - When ``True``, ignore the existing bounds of the `Bbox`.\n - When ``False``, include the existing bounds of the `Bbox`.\n - When ``None``, use the last value passed to :meth:`ignore`.\n updatex, updatey : bool, default: True\n When ``True``, update the x/y values.\n """\n if len(xy) == 0:\n return\n\n path = Path(xy)\n self.update_from_path(path, ignore=ignore,\n updatex=updatex, updatey=updatey)\n\n @BboxBase.x0.setter\n def x0(self, val):\n self._points[0, 0] = val\n self.invalidate()\n\n @BboxBase.y0.setter\n def y0(self, val):\n self._points[0, 1] = val\n self.invalidate()\n\n @BboxBase.x1.setter\n def x1(self, val):\n self._points[1, 0] = val\n self.invalidate()\n\n @BboxBase.y1.setter\n def y1(self, val):\n self._points[1, 1] = val\n self.invalidate()\n\n @BboxBase.p0.setter\n def p0(self, val):\n self._points[0] = val\n self.invalidate()\n\n @BboxBase.p1.setter\n def p1(self, val):\n self._points[1] = val\n self.invalidate()\n\n @BboxBase.intervalx.setter\n def intervalx(self, interval):\n self._points[:, 0] = interval\n self.invalidate()\n\n @BboxBase.intervaly.setter\n def intervaly(self, 
interval):\n self._points[:, 1] = interval\n self.invalidate()\n\n @BboxBase.bounds.setter\n def bounds(self, bounds):\n l, b, w, h = bounds\n points = np.array([[l, b], [l + w, b + h]], float)\n if np.any(self._points != points):\n self._points = points\n self.invalidate()\n\n @property\n def minpos(self):\n """\n The minimum positive value in both directions within the Bbox.\n\n This is useful when dealing with logarithmic scales and other scales\n where negative bounds result in floating point errors, and will be used\n as the minimum extent instead of *p0*.\n """\n return self._minpos\n\n @minpos.setter\n def minpos(self, val):\n self._minpos[:] = val\n\n @property\n def minposx(self):\n """\n The minimum positive value in the *x*-direction within the Bbox.\n\n This is useful when dealing with logarithmic scales and other scales\n where negative bounds result in floating point errors, and will be used\n as the minimum *x*-extent instead of *x0*.\n """\n return self._minpos[0]\n\n @minposx.setter\n def minposx(self, val):\n self._minpos[0] = val\n\n @property\n def minposy(self):\n """\n The minimum positive value in the *y*-direction within the Bbox.\n\n This is useful when dealing with logarithmic scales and other scales\n where negative bounds result in floating point errors, and will be used\n as the minimum *y*-extent instead of *y0*.\n """\n return self._minpos[1]\n\n @minposy.setter\n def minposy(self, val):\n self._minpos[1] = val\n\n def get_points(self):\n """\n Get the points of the bounding box as an array of the form\n ``[[x0, y0], [x1, y1]]``.\n """\n self._invalid = 0\n return self._points\n\n def set_points(self, points):\n """\n Set the points of the bounding box directly from an array of the form\n ``[[x0, y0], [x1, y1]]``. 
No error checking is performed, as this\n method is mainly for internal use.\n """\n if np.any(self._points != points):\n self._points = points\n self.invalidate()\n\n def set(self, other):\n """\n Set this bounding box from the "frozen" bounds of another `Bbox`.\n """\n if np.any(self._points != other.get_points()):\n self._points = other.get_points()\n self.invalidate()\n\n def mutated(self):\n """Return whether the bbox has changed since init."""\n return self.mutatedx() or self.mutatedy()\n\n def mutatedx(self):\n """Return whether the x-limits have changed since init."""\n return (self._points[0, 0] != self._points_orig[0, 0] or\n self._points[1, 0] != self._points_orig[1, 0])\n\n def mutatedy(self):\n """Return whether the y-limits have changed since init."""\n return (self._points[0, 1] != self._points_orig[0, 1] or\n self._points[1, 1] != self._points_orig[1, 1])\n\n\nclass TransformedBbox(BboxBase):\n """\n A `Bbox` that is automatically transformed by a given\n transform. When either the child bounding box or transform\n changes, the bounds of this bbox will update accordingly.\n """\n\n def __init__(self, bbox, transform, **kwargs):\n """\n Parameters\n ----------\n bbox : `Bbox`\n transform : `Transform`\n """\n _api.check_isinstance(BboxBase, bbox=bbox)\n _api.check_isinstance(Transform, transform=transform)\n if transform.input_dims != 2 or transform.output_dims != 2:\n raise ValueError(\n "The input and output dimensions of 'transform' must be 2")\n\n super().__init__(**kwargs)\n self._bbox = bbox\n self._transform = transform\n self.set_children(bbox, transform)\n self._points = None\n\n __str__ = _make_str_method("_bbox", "_transform")\n\n def get_points(self):\n # docstring inherited\n if self._invalid:\n p = self._bbox.get_points()\n # Transform all four points, then make a new bounding box\n # from the result, taking care to make the orientation the\n # same.\n points = self._transform.transform(\n [[p[0, 0], p[0, 1]],\n [p[1, 0], p[0, 1]],\n 
[p[0, 0], p[1, 1]],\n [p[1, 0], p[1, 1]]])\n points = np.ma.filled(points, 0.0)\n\n xs = min(points[:, 0]), max(points[:, 0])\n if p[0, 0] > p[1, 0]:\n xs = xs[::-1]\n\n ys = min(points[:, 1]), max(points[:, 1])\n if p[0, 1] > p[1, 1]:\n ys = ys[::-1]\n\n self._points = np.array([\n [xs[0], ys[0]],\n [xs[1], ys[1]]\n ])\n\n self._invalid = 0\n return self._points\n\n if DEBUG:\n _get_points = get_points\n\n def get_points(self):\n points = self._get_points()\n self._check(points)\n return points\n\n def contains(self, x, y):\n # Docstring inherited.\n return self._bbox.contains(*self._transform.inverted().transform((x, y)))\n\n def fully_contains(self, x, y):\n # Docstring inherited.\n return self._bbox.fully_contains(*self._transform.inverted().transform((x, y)))\n\n\nclass LockableBbox(BboxBase):\n """\n A `Bbox` where some elements may be locked at certain values.\n\n When the child bounding box changes, the bounds of this bbox will update\n accordingly with the exception of the locked elements.\n """\n def __init__(self, bbox, x0=None, y0=None, x1=None, y1=None, **kwargs):\n """\n Parameters\n ----------\n bbox : `Bbox`\n The child bounding box to wrap.\n\n x0 : float or None\n The locked value for x0, or None to leave unlocked.\n\n y0 : float or None\n The locked value for y0, or None to leave unlocked.\n\n x1 : float or None\n The locked value for x1, or None to leave unlocked.\n\n y1 : float or None\n The locked value for y1, or None to leave unlocked.\n\n """\n _api.check_isinstance(BboxBase, bbox=bbox)\n super().__init__(**kwargs)\n self._bbox = bbox\n self.set_children(bbox)\n self._points = None\n fp = [x0, y0, x1, y1]\n mask = [val is None for val in fp]\n self._locked_points = np.ma.array(fp, float, mask=mask).reshape((2, 2))\n\n __str__ = _make_str_method("_bbox", "_locked_points")\n\n def get_points(self):\n # docstring inherited\n if self._invalid:\n points = self._bbox.get_points()\n self._points = np.where(self._locked_points.mask,\n points,\n 
self._locked_points)\n self._invalid = 0\n return self._points\n\n if DEBUG:\n _get_points = get_points\n\n def get_points(self):\n points = self._get_points()\n self._check(points)\n return points\n\n @property\n def locked_x0(self):\n """\n float or None: The value used for the locked x0.\n """\n if self._locked_points.mask[0, 0]:\n return None\n else:\n return self._locked_points[0, 0]\n\n @locked_x0.setter\n def locked_x0(self, x0):\n self._locked_points.mask[0, 0] = x0 is None\n self._locked_points.data[0, 0] = x0\n self.invalidate()\n\n @property\n def locked_y0(self):\n """\n float or None: The value used for the locked y0.\n """\n if self._locked_points.mask[0, 1]:\n return None\n else:\n return self._locked_points[0, 1]\n\n @locked_y0.setter\n def locked_y0(self, y0):\n self._locked_points.mask[0, 1] = y0 is None\n self._locked_points.data[0, 1] = y0\n self.invalidate()\n\n @property\n def locked_x1(self):\n """\n float or None: The value used for the locked x1.\n """\n if self._locked_points.mask[1, 0]:\n return None\n else:\n return self._locked_points[1, 0]\n\n @locked_x1.setter\n def locked_x1(self, x1):\n self._locked_points.mask[1, 0] = x1 is None\n self._locked_points.data[1, 0] = x1\n self.invalidate()\n\n @property\n def locked_y1(self):\n """\n float or None: The value used for the locked y1.\n """\n if self._locked_points.mask[1, 1]:\n return None\n else:\n return self._locked_points[1, 1]\n\n @locked_y1.setter\n def locked_y1(self, y1):\n self._locked_points.mask[1, 1] = y1 is None\n self._locked_points.data[1, 1] = y1\n self.invalidate()\n\n\nclass Transform(TransformNode):\n """\n The base class of all `TransformNode` instances that\n actually perform a transformation.\n\n All non-affine transformations should be subclasses of this class.\n New affine transformations should be subclasses of `Affine2D`.\n\n Subclasses of this class should override the following members (at\n minimum):\n\n - :attr:`input_dims`\n - :attr:`output_dims`\n - 
:meth:`transform`\n - :meth:`inverted` (if an inverse exists)\n\n The following attributes may be overridden if the default is unsuitable:\n\n - :attr:`is_separable` (defaults to True for 1D -> 1D transforms, False\n otherwise)\n - :attr:`has_inverse` (defaults to True if :meth:`inverted` is overridden,\n False otherwise)\n\n If the transform needs to do something non-standard with\n `matplotlib.path.Path` objects, such as adding curves\n where there were once line segments, it should override:\n\n - :meth:`transform_path`\n """\n\n input_dims = None\n """\n The number of input dimensions of this transform.\n Must be overridden (with integers) in the subclass.\n """\n\n output_dims = None\n """\n The number of output dimensions of this transform.\n Must be overridden (with integers) in the subclass.\n """\n\n is_separable = False\n """True if this transform is separable in the x- and y- dimensions."""\n\n has_inverse = False\n """True if this transform has a corresponding inverse transform."""\n\n def __init_subclass__(cls):\n # 1d transforms are always separable; we assume higher-dimensional ones\n # are not but subclasses can also directly set is_separable -- this is\n # verified by checking whether "is_separable" appears more than once in\n # the class's MRO (it appears once in Transform).\n if (sum("is_separable" in vars(parent) for parent in cls.__mro__) == 1\n and cls.input_dims == cls.output_dims == 1):\n cls.is_separable = True\n # Transform.inverted raises NotImplementedError; we assume that if this\n # is overridden then the transform is invertible but subclass can also\n # directly set has_inverse.\n if (sum("has_inverse" in vars(parent) for parent in cls.__mro__) == 1\n and hasattr(cls, "inverted")\n and cls.inverted is not Transform.inverted):\n cls.has_inverse = True\n\n def __add__(self, other):\n """\n Compose two transforms together so that *self* is followed by *other*.\n\n ``A + B`` returns a transform ``C`` so that\n ``C.transform(x) == 
B.transform(A.transform(x))``.\n """\n return (composite_transform_factory(self, other)\n if isinstance(other, Transform) else\n NotImplemented)\n\n # Equality is based on object identity for `Transform`s (so we don't\n # override `__eq__`), but some subclasses, such as TransformWrapper &\n # AffineBase, override this behavior.\n\n def _iter_break_from_left_to_right(self):\n """\n Return an iterator breaking down this transform stack from left to\n right recursively. If self == ((A, N), A) then the result will be an\n iterator which yields I : ((A, N), A), followed by A : (N, A),\n followed by (A, N) : (A), but not ((A, N), A) : I.\n\n This is equivalent to flattening the stack then yielding\n ``flat_stack[:i], flat_stack[i:]`` where i=0..(n-1).\n """\n yield IdentityTransform(), self\n\n @property\n def depth(self):\n """\n Return the number of transforms which have been chained\n together to form this Transform instance.\n\n .. note::\n\n For the special case of a Composite transform, the maximum depth\n of the two is returned.\n\n """\n return 1\n\n def contains_branch(self, other):\n """\n Return whether the given transform is a sub-tree of this transform.\n\n This routine uses transform equality to identify sub-trees, therefore\n in many situations it is object id which will be used.\n\n For the case where the given transform represents the whole\n of this transform, returns True.\n """\n if self.depth < other.depth:\n return False\n\n # check that a subtree is equal to other (starting from self)\n for _, sub_tree in self._iter_break_from_left_to_right():\n if sub_tree == other:\n return True\n return False\n\n def contains_branch_seperately(self, other_transform):\n """\n Return whether the given branch is a sub-tree of this transform on\n each separate dimension.\n\n A common use for this method is to identify if a transform is a blended\n transform containing an Axes' data transform. 
e.g.::\n\n x_isdata, y_isdata = trans.contains_branch_seperately(ax.transData)\n\n """\n if self.output_dims != 2:\n raise ValueError('contains_branch_seperately only supports '\n 'transforms with 2 output dimensions')\n # for a non-blended transform each separate dimension is the same, so\n # just return the appropriate shape.\n return (self.contains_branch(other_transform), ) * 2\n\n def __sub__(self, other):\n """\n Compose *self* with the inverse of *other*, cancelling identical terms\n if any::\n\n # In general:\n A - B == A + B.inverted()\n # (but see note regarding frozen transforms below).\n\n # If A "ends with" B (i.e. A == A' + B for some A') we can cancel\n # out B:\n (A' + B) - B == A'\n\n # Likewise, if B "starts with" A (B = A + B'), we can cancel out A:\n A - (A + B') == B'.inverted() == B'^-1\n\n Cancellation (rather than naively returning ``A + B.inverted()``) is\n important for multiple reasons:\n\n - It avoids floating-point inaccuracies when computing the inverse of\n B: ``B - B`` is guaranteed to cancel out exactly (resulting in the\n identity transform), whereas ``B + B.inverted()`` may differ by a\n small epsilon.\n - ``B.inverted()`` always returns a frozen transform: if one computes\n ``A + B + B.inverted()`` and later mutates ``B``, then\n ``B.inverted()`` won't be updated and the last two terms won't cancel\n out anymore; on the other hand, ``A + B - B`` will always be equal to\n ``A`` even if ``B`` is mutated.\n """\n # we only know how to do this operation if other is a Transform.\n if not isinstance(other, Transform):\n return NotImplemented\n for remainder, sub_tree in self._iter_break_from_left_to_right():\n if sub_tree == other:\n return remainder\n for remainder, sub_tree in other._iter_break_from_left_to_right():\n if sub_tree == self:\n if not remainder.has_inverse:\n raise ValueError(\n "The shortcut cannot be computed since 'other' "\n "includes a non-invertible component")\n return remainder.inverted()\n # if we have got this 
far, then there was no shortcut possible\n if other.has_inverse:\n return self + other.inverted()\n else:\n raise ValueError('It is not possible to compute transA - transB '\n 'since transB cannot be inverted and there is no '\n 'shortcut possible.')\n\n def __array__(self, *args, **kwargs):\n """Array interface to get at this Transform's affine matrix."""\n return self.get_affine().get_matrix()\n\n def transform(self, values):\n """\n Apply this transformation on the given array of *values*.\n\n Parameters\n ----------\n values : array-like\n The input values as an array of length :attr:`input_dims` or\n shape (N, :attr:`input_dims`).\n\n Returns\n -------\n array\n The output values as an array of length :attr:`output_dims` or\n shape (N, :attr:`output_dims`), depending on the input.\n """\n # Ensure that values is a 2d array (but remember whether\n # we started with a 1d or 2d array).\n values = np.asanyarray(values)\n ndim = values.ndim\n values = values.reshape((-1, self.input_dims))\n\n # Transform the values\n res = self.transform_affine(self.transform_non_affine(values))\n\n # Convert the result back to the shape of the input values.\n if ndim == 0:\n assert not np.ma.is_masked(res) # just to be on the safe side\n return res[0, 0]\n if ndim == 1:\n return res.reshape(-1)\n elif ndim == 2:\n return res\n raise ValueError(\n "Input values must have shape (N, {dims}) or ({dims},)"\n .format(dims=self.input_dims))\n\n def transform_affine(self, values):\n """\n Apply only the affine part of this transformation on the\n given array of values.\n\n ``transform(values)`` is always equivalent to\n ``transform_affine(transform_non_affine(values))``.\n\n In non-affine transformations, this is generally a no-op. 
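For example (illustrative sketch)::\n\n t = Affine2D().translate(1, 0) # purely affine\n # the non-affine stage passes the values through unchanged here\n t.transform_affine(t.transform_non_affine(np.array([[0.0, 0.0]])))\n # -> [[1., 0.]]\n\n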
In\n affine transformations, this is equivalent to\n ``transform(values)``.\n\n Parameters\n ----------\n values : array\n The input values as an array of length :attr:`input_dims` or\n shape (N, :attr:`input_dims`).\n\n Returns\n -------\n array\n The output values as an array of length :attr:`output_dims` or\n shape (N, :attr:`output_dims`), depending on the input.\n """\n return self.get_affine().transform(values)\n\n def transform_non_affine(self, values):\n """\n Apply only the non-affine part of this transformation.\n\n ``transform(values)`` is always equivalent to\n ``transform_affine(transform_non_affine(values))``.\n\n In non-affine transformations, this is generally equivalent to\n ``transform(values)``. In affine transformations, this is\n always a no-op.\n\n Parameters\n ----------\n values : array\n The input values as an array of length :attr:`input_dims` or\n shape (N, :attr:`input_dims`).\n\n Returns\n -------\n array\n The output values as an array of length :attr:`output_dims` or\n shape (N, :attr:`output_dims`), depending on the input.\n """\n return values\n\n def transform_bbox(self, bbox):\n """\n Transform the given bounding box.\n\n For smarter transforms including caching (a common requirement in\n Matplotlib), see `TransformedBbox`.\n """\n return Bbox(self.transform(bbox.get_points()))\n\n def get_affine(self):\n """Get the affine part of this transform."""\n return IdentityTransform()\n\n def get_matrix(self):\n """Get the matrix for the affine part of this transform."""\n return self.get_affine().get_matrix()\n\n def transform_point(self, point):\n """\n Return a transformed point.\n\n This function is only kept for backcompatibility; the more general\n `.transform` method is capable of transforming both a list of points\n and a single point.\n\n The point is given as a sequence of length :attr:`input_dims`.\n The transformed point is returned as a sequence of length\n :attr:`output_dims`.\n """\n if len(point) != self.input_dims:\n 
raise ValueError("The length of 'point' must be 'self.input_dims'")\n return self.transform(point)\n\n def transform_path(self, path):\n """\n Apply the transform to `.Path` *path*, returning a new `.Path`.\n\n In some cases, this transform may insert curves into the path\n that began as line segments.\n """\n return self.transform_path_affine(self.transform_path_non_affine(path))\n\n def transform_path_affine(self, path):\n """\n Apply the affine part of this transform to `.Path` *path*, returning a\n new `.Path`.\n\n ``transform_path(path)`` is equivalent to\n ``transform_path_affine(transform_path_non_affine(path))``.\n """\n return self.get_affine().transform_path_affine(path)\n\n def transform_path_non_affine(self, path):\n """\n Apply the non-affine part of this transform to `.Path` *path*,\n returning a new `.Path`.\n\n ``transform_path(path)`` is equivalent to\n ``transform_path_affine(transform_path_non_affine(path))``.\n """\n x = self.transform_non_affine(path.vertices)\n return Path._fast_from_codes_and_verts(x, path.codes, path)\n\n def transform_angles(self, angles, pts, radians=False, pushoff=1e-5):\n """\n Transform a set of angles anchored at specific locations.\n\n Parameters\n ----------\n angles : (N,) array-like\n The angles to transform.\n pts : (N, 2) array-like\n The points where the angles are anchored.\n radians : bool, default: False\n Whether *angles* are radians or degrees.\n pushoff : float\n For each point in *pts* and angle in *angles*, the transformed\n angle is computed by transforming a segment of length *pushoff*\n starting at that point and making that angle relative to the\n horizontal axis, and measuring the angle between the horizontal\n axis and the transformed segment.\n\n Returns\n -------\n (N,) array\n """\n # Must be 2D\n if self.input_dims != 2 or self.output_dims != 2:\n raise NotImplementedError('Only defined in 2D')\n angles = np.asarray(angles)\n pts = np.asarray(pts)\n _api.check_shape((None, 2), pts=pts)\n 
_api.check_shape((None,), angles=angles)\n if len(angles) != len(pts):\n raise ValueError("There must be as many 'angles' as 'pts'")\n # Convert to radians if desired\n if not radians:\n angles = np.deg2rad(angles)\n # Move a short distance away\n pts2 = pts + pushoff * np.column_stack([np.cos(angles),\n np.sin(angles)])\n # Transform both sets of points\n tpts = self.transform(pts)\n tpts2 = self.transform(pts2)\n # Calculate transformed angles\n d = tpts2 - tpts\n a = np.arctan2(d[:, 1], d[:, 0])\n # Convert back to degrees if desired\n if not radians:\n a = np.rad2deg(a)\n return a\n\n def inverted(self):\n """\n Return the corresponding inverse transformation.\n\n It holds ``x == self.inverted().transform(self.transform(x))``.\n\n The return value of this method should be treated as\n temporary. An update to *self* does not cause a corresponding\n update to its inverted copy.\n """\n raise NotImplementedError()\n\n\nclass TransformWrapper(Transform):\n """\n A helper class that holds a single child transform and acts\n equivalently to it.\n\n This is useful if a node of the transform tree must be replaced at\n run time with a transform of a different type. This class allows\n that replacement to correctly trigger invalidation.\n\n `TransformWrapper` instances must have the same input and output dimensions\n during their entire lifetime, so the child transform may only be replaced\n with another child transform of the same dimensions.\n """\n\n pass_through = True\n\n def __init__(self, child):\n """\n *child*: A `Transform` instance. 
This child may later\n be replaced with :meth:`set`.\n """\n _api.check_isinstance(Transform, child=child)\n super().__init__()\n self.set(child)\n\n def __eq__(self, other):\n return self._child.__eq__(other)\n\n __str__ = _make_str_method("_child")\n\n def frozen(self):\n # docstring inherited\n return self._child.frozen()\n\n def set(self, child):\n """\n Replace the current child of this transform with another one.\n\n The new child must have the same number of input and output\n dimensions as the current child.\n """\n if hasattr(self, "_child"): # Absent during init.\n self.invalidate()\n new_dims = (child.input_dims, child.output_dims)\n old_dims = (self._child.input_dims, self._child.output_dims)\n if new_dims != old_dims:\n raise ValueError(\n f"The input and output dims of the new child {new_dims} "\n f"do not match those of current child {old_dims}")\n self._child._parents.pop(id(self), None)\n\n self._child = child\n self.set_children(child)\n\n self.transform = child.transform\n self.transform_affine = child.transform_affine\n self.transform_non_affine = child.transform_non_affine\n self.transform_path = child.transform_path\n self.transform_path_affine = child.transform_path_affine\n self.transform_path_non_affine = child.transform_path_non_affine\n self.get_affine = child.get_affine\n self.inverted = child.inverted\n self.get_matrix = child.get_matrix\n # note we do not wrap other properties here since the transform's\n # child can be changed with WrappedTransform.set and so checking\n # is_affine and other such properties may be dangerous.\n\n self._invalid = 0\n self.invalidate()\n self._invalid = 0\n\n input_dims = property(lambda self: self._child.input_dims)\n output_dims = property(lambda self: self._child.output_dims)\n is_affine = property(lambda self: self._child.is_affine)\n is_separable = property(lambda self: self._child.is_separable)\n has_inverse = property(lambda self: self._child.has_inverse)\n\n\nclass AffineBase(Transform):\n """\n 
The base class of all affine transformations of any number of dimensions.\n """\n is_affine = True\n\n def __init__(self, *args, **kwargs):\n super().__init__(*args, **kwargs)\n self._inverted = None\n\n def __array__(self, *args, **kwargs):\n # optimises the access of the transform matrix vs. the superclass\n return self.get_matrix()\n\n def __eq__(self, other):\n if getattr(other, "is_affine", False) and hasattr(other, "get_matrix"):\n return (self.get_matrix() == other.get_matrix()).all()\n return NotImplemented\n\n def transform(self, values):\n # docstring inherited\n return self.transform_affine(values)\n\n def transform_affine(self, values):\n # docstring inherited\n raise NotImplementedError('Affine subclasses should override this '\n 'method.')\n\n def transform_non_affine(self, values):\n # docstring inherited\n return values\n\n def transform_path(self, path):\n # docstring inherited\n return self.transform_path_affine(path)\n\n def transform_path_affine(self, path):\n # docstring inherited\n return Path(self.transform_affine(path.vertices),\n path.codes, path._interpolation_steps)\n\n def transform_path_non_affine(self, path):\n # docstring inherited\n return path\n\n def get_affine(self):\n # docstring inherited\n return self\n\n\nclass Affine2DBase(AffineBase):\n """\n The base class of all 2D affine transformations.\n\n 2D affine transformations are performed using a 3x3 numpy array::\n\n a c e\n b d f\n 0 0 1\n\n This class provides the read-only interface. 
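The ``(a, b, c, d, e, f)`` values can be read back with `to_values`, e.g. (sketch)::\n\n Affine2D().translate(2, 3).to_values()\n # -> (1.0, 0.0, 0.0, 1.0, 2.0, 3.0)\n\n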
For a mutable 2D\n affine transformation, use `Affine2D`.\n\n Subclasses of this class will generally only need to override a\n constructor and `~.Transform.get_matrix` that generates a custom 3x3 matrix.\n """\n input_dims = 2\n output_dims = 2\n\n def frozen(self):\n # docstring inherited\n return Affine2D(self.get_matrix().copy())\n\n @property\n def is_separable(self):\n mtx = self.get_matrix()\n return mtx[0, 1] == mtx[1, 0] == 0.0\n\n def to_values(self):\n """\n Return the values of the matrix as an ``(a, b, c, d, e, f)`` tuple.\n """\n mtx = self.get_matrix()\n return tuple(mtx[:2].swapaxes(0, 1).flat)\n\n def transform_affine(self, values):\n mtx = self.get_matrix()\n if isinstance(values, np.ma.MaskedArray):\n tpoints = affine_transform(values.data, mtx)\n return np.ma.MaskedArray(tpoints, mask=np.ma.getmask(values))\n return affine_transform(values, mtx)\n\n if DEBUG:\n _transform_affine = transform_affine\n\n def transform_affine(self, values):\n # docstring inherited\n # The major speed trap here is just converting the\n # points to an array in the first place. 
If we can use\n # more arrays upstream, that should help here.\n if not isinstance(values, np.ndarray):\n _api.warn_external(\n f'A non-numpy array of type {type(values)} was passed in '\n f'for transformation, which results in poor performance.')\n return self._transform_affine(values)\n\n def inverted(self):\n # docstring inherited\n if self._inverted is None or self._invalid:\n mtx = self.get_matrix()\n shorthand_name = None\n if self._shorthand_name:\n shorthand_name = '(%s)-1' % self._shorthand_name\n self._inverted = Affine2D(inv(mtx), shorthand_name=shorthand_name)\n self._invalid = 0\n return self._inverted\n\n\nclass Affine2D(Affine2DBase):\n """\n A mutable 2D affine transformation.\n """\n\n def __init__(self, matrix=None, **kwargs):\n """\n Initialize an Affine transform from a 3x3 numpy float array::\n\n a c e\n b d f\n 0 0 1\n\n If *matrix* is None, initialize with the identity transform.\n """\n super().__init__(**kwargs)\n if matrix is None:\n # A bit faster than np.identity(3).\n matrix = IdentityTransform._mtx\n self._mtx = matrix.copy()\n self._invalid = 0\n\n _base_str = _make_str_method("_mtx")\n\n def __str__(self):\n return (self._base_str()\n if (self._mtx != np.diag(np.diag(self._mtx))).any()\n else f"Affine2D().scale({self._mtx[0, 0]}, {self._mtx[1, 1]})"\n if self._mtx[0, 0] != self._mtx[1, 1]\n else f"Affine2D().scale({self._mtx[0, 0]})")\n\n @staticmethod\n def from_values(a, b, c, d, e, f):\n """\n Create a new Affine2D instance from the given values::\n\n a c e\n b d f\n 0 0 1\n\n .\n """\n return Affine2D(\n np.array([a, c, e, b, d, f, 0.0, 0.0, 1.0], float).reshape((3, 3)))\n\n def get_matrix(self):\n """\n Get the underlying transformation matrix as a 3x3 array::\n\n a c e\n b d f\n 0 0 1\n\n .\n """\n if self._invalid:\n self._inverted = None\n self._invalid = 0\n return self._mtx\n\n def set_matrix(self, mtx):\n """\n Set the underlying transformation matrix from a 3x3 array::\n\n a c e\n b d f\n 0 0 1\n\n .\n """\n self._mtx = 
mtx\n self.invalidate()\n\n def set(self, other):\n """\n Set this transformation from the frozen copy of another\n `Affine2DBase` object.\n """\n _api.check_isinstance(Affine2DBase, other=other)\n self._mtx = other.get_matrix()\n self.invalidate()\n\n def clear(self):\n """\n Reset the underlying matrix to the identity transform.\n """\n # A bit faster than np.identity(3).\n self._mtx = IdentityTransform._mtx.copy()\n self.invalidate()\n return self\n\n def rotate(self, theta):\n """\n Add a rotation (in radians) to this transform in place.\n\n Returns *self*, so this method can easily be chained with more\n calls to :meth:`rotate`, :meth:`rotate_deg`, :meth:`translate`\n and :meth:`scale`.\n """\n a = math.cos(theta)\n b = math.sin(theta)\n mtx = self._mtx\n # Operating and assigning one scalar at a time is much faster.\n (xx, xy, x0), (yx, yy, y0), _ = mtx.tolist()\n # mtx = [[a -b 0], [b a 0], [0 0 1]] * mtx\n mtx[0, 0] = a * xx - b * yx\n mtx[0, 1] = a * xy - b * yy\n mtx[0, 2] = a * x0 - b * y0\n mtx[1, 0] = b * xx + a * yx\n mtx[1, 1] = b * xy + a * yy\n mtx[1, 2] = b * x0 + a * y0\n self.invalidate()\n return self\n\n def rotate_deg(self, degrees):\n """\n Add a rotation (in degrees) to this transform in place.\n\n Returns *self*, so this method can easily be chained with more\n calls to :meth:`rotate`, :meth:`rotate_deg`, :meth:`translate`\n and :meth:`scale`.\n """\n return self.rotate(math.radians(degrees))\n\n def rotate_around(self, x, y, theta):\n """\n Add a rotation (in radians) around the point (x, y) in place.\n\n Returns *self*, so this method can easily be chained with more\n calls to :meth:`rotate`, :meth:`rotate_deg`, :meth:`translate`\n and :meth:`scale`.\n """\n return self.translate(-x, -y).rotate(theta).translate(x, y)\n\n def rotate_deg_around(self, x, y, degrees):\n """\n Add a rotation (in degrees) around the point (x, y) in place.\n\n Returns *self*, so this method can easily be chained with more\n calls to :meth:`rotate`, 
:meth:`rotate_deg`, :meth:`translate`\n and :meth:`scale`.\n """\n # Cast to float to avoid wraparound issues with uint8's\n x, y = float(x), float(y)\n return self.translate(-x, -y).rotate_deg(degrees).translate(x, y)\n\n def translate(self, tx, ty):\n """\n Add a translation in place.\n\n Returns *self*, so this method can easily be chained with more\n calls to :meth:`rotate`, :meth:`rotate_deg`, :meth:`translate`\n and :meth:`scale`.\n """\n self._mtx[0, 2] += tx\n self._mtx[1, 2] += ty\n self.invalidate()\n return self\n\n def scale(self, sx, sy=None):\n """\n Add a scale in place.\n\n If *sy* is None, the same scale is applied in both the *x*- and\n *y*-directions.\n\n Returns *self*, so this method can easily be chained with more\n calls to :meth:`rotate`, :meth:`rotate_deg`, :meth:`translate`\n and :meth:`scale`.\n """\n if sy is None:\n sy = sx\n # explicit element-wise scaling is fastest\n self._mtx[0, 0] *= sx\n self._mtx[0, 1] *= sx\n self._mtx[0, 2] *= sx\n self._mtx[1, 0] *= sy\n self._mtx[1, 1] *= sy\n self._mtx[1, 2] *= sy\n self.invalidate()\n return self\n\n def skew(self, xShear, yShear):\n """\n Add a skew in place.\n\n *xShear* and *yShear* are the shear angles along the *x*- and\n *y*-axes, respectively, in radians.\n\n Returns *self*, so this method can easily be chained with more\n calls to :meth:`rotate`, :meth:`rotate_deg`, :meth:`translate`\n and :meth:`scale`.\n """\n rx = math.tan(xShear)\n ry = math.tan(yShear)\n mtx = self._mtx\n # Operating and assigning one scalar at a time is much faster.\n (xx, xy, x0), (yx, yy, y0), _ = mtx.tolist()\n # mtx = [[1 rx 0], [ry 1 0], [0 0 1]] * mtx\n mtx[0, 0] += rx * yx\n mtx[0, 1] += rx * yy\n mtx[0, 2] += rx * y0\n mtx[1, 0] += ry * xx\n mtx[1, 1] += ry * xy\n mtx[1, 2] += ry * x0\n self.invalidate()\n return self\n\n def skew_deg(self, xShear, yShear):\n """\n Add a skew in place.\n\n *xShear* and *yShear* are the shear angles along the *x*- and\n *y*-axes, respectively, in degrees.\n\n Returns 
*self*, so this method can easily be chained with more\n calls to :meth:`rotate`, :meth:`rotate_deg`, :meth:`translate`\n and :meth:`scale`.\n """\n return self.skew(math.radians(xShear), math.radians(yShear))\n\n\nclass IdentityTransform(Affine2DBase):\n """\n A special class that does one thing, the identity transform, in a\n fast way.\n """\n _mtx = np.identity(3)\n\n def frozen(self):\n # docstring inherited\n return self\n\n __str__ = _make_str_method()\n\n def get_matrix(self):\n # docstring inherited\n return self._mtx\n\n def transform(self, values):\n # docstring inherited\n return np.asanyarray(values)\n\n def transform_affine(self, values):\n # docstring inherited\n return np.asanyarray(values)\n\n def transform_non_affine(self, values):\n # docstring inherited\n return np.asanyarray(values)\n\n def transform_path(self, path):\n # docstring inherited\n return path\n\n def transform_path_affine(self, path):\n # docstring inherited\n return path\n\n def transform_path_non_affine(self, path):\n # docstring inherited\n return path\n\n def get_affine(self):\n # docstring inherited\n return self\n\n def inverted(self):\n # docstring inherited\n return self\n\n\nclass _BlendedMixin:\n """Common methods for `BlendedGenericTransform` and `BlendedAffine2D`."""\n\n def __eq__(self, other):\n if isinstance(other, (BlendedAffine2D, BlendedGenericTransform)):\n return (self._x == other._x) and (self._y == other._y)\n elif self._x == self._y:\n return self._x == other\n else:\n return NotImplemented\n\n def contains_branch_seperately(self, transform):\n return (self._x.contains_branch(transform),\n self._y.contains_branch(transform))\n\n __str__ = _make_str_method("_x", "_y")\n\n\nclass BlendedGenericTransform(_BlendedMixin, Transform):\n """\n A "blended" transform uses one transform for the *x*-direction, and\n another transform for the *y*-direction.\n\n This "generic" version can handle any given child transform in the\n *x*- and *y*-directions.\n """\n input_dims 
= 2\n output_dims = 2\n is_separable = True\n pass_through = True\n\n def __init__(self, x_transform, y_transform, **kwargs):\n """\n Create a new "blended" transform using *x_transform* to transform the\n *x*-axis and *y_transform* to transform the *y*-axis.\n\n You will generally not call this constructor directly but use the\n `blended_transform_factory` function instead, which can determine\n automatically which kind of blended transform to create.\n """\n Transform.__init__(self, **kwargs)\n self._x = x_transform\n self._y = y_transform\n self.set_children(x_transform, y_transform)\n self._affine = None\n\n @property\n def depth(self):\n return max(self._x.depth, self._y.depth)\n\n def contains_branch(self, other):\n # A blended transform cannot possibly contain a branch from two\n # different transforms.\n return False\n\n is_affine = property(lambda self: self._x.is_affine and self._y.is_affine)\n has_inverse = property(\n lambda self: self._x.has_inverse and self._y.has_inverse)\n\n def frozen(self):\n # docstring inherited\n return blended_transform_factory(self._x.frozen(), self._y.frozen())\n\n def transform_non_affine(self, values):\n # docstring inherited\n if self._x.is_affine and self._y.is_affine:\n return values\n x = self._x\n y = self._y\n\n if x == y and x.input_dims == 2:\n return x.transform_non_affine(values)\n\n if x.input_dims == 2:\n x_points = x.transform_non_affine(values)[:, 0:1]\n else:\n x_points = x.transform_non_affine(values[:, 0])\n x_points = x_points.reshape((len(x_points), 1))\n\n if y.input_dims == 2:\n y_points = y.transform_non_affine(values)[:, 1:]\n else:\n y_points = y.transform_non_affine(values[:, 1])\n y_points = y_points.reshape((len(y_points), 1))\n\n if (isinstance(x_points, np.ma.MaskedArray) or\n isinstance(y_points, np.ma.MaskedArray)):\n return np.ma.concatenate((x_points, y_points), 1)\n else:\n return np.concatenate((x_points, y_points), 1)\n\n def inverted(self):\n # docstring inherited\n return 
BlendedGenericTransform(self._x.inverted(), self._y.inverted())\n\n def get_affine(self):\n # docstring inherited\n if self._invalid or self._affine is None:\n if self._x == self._y:\n self._affine = self._x.get_affine()\n else:\n x_mtx = self._x.get_affine().get_matrix()\n y_mtx = self._y.get_affine().get_matrix()\n # We already know the transforms are separable, so we can skip\n # setting b and c to zero.\n mtx = np.array([x_mtx[0], y_mtx[1], [0.0, 0.0, 1.0]])\n self._affine = Affine2D(mtx)\n self._invalid = 0\n return self._affine\n\n\nclass BlendedAffine2D(_BlendedMixin, Affine2DBase):\n """\n A "blended" transform uses one transform for the *x*-direction, and\n another transform for the *y*-direction.\n\n This version is an optimization for the case where both child\n transforms are of type `Affine2DBase`.\n """\n\n is_separable = True\n\n def __init__(self, x_transform, y_transform, **kwargs):\n """\n Create a new "blended" transform using *x_transform* to transform the\n *x*-axis and *y_transform* to transform the *y*-axis.\n\n Both *x_transform* and *y_transform* must be 2D affine transforms.\n\n You will generally not call this constructor directly but use the\n `blended_transform_factory` function instead, which can determine\n automatically which kind of blended transform to create.\n """\n is_affine = x_transform.is_affine and y_transform.is_affine\n is_separable = x_transform.is_separable and y_transform.is_separable\n is_correct = is_affine and is_separable\n if not is_correct:\n raise ValueError("Both *x_transform* and *y_transform* must be 2D "\n "affine transforms")\n\n Transform.__init__(self, **kwargs)\n self._x = x_transform\n self._y = y_transform\n self.set_children(x_transform, y_transform)\n\n Affine2DBase.__init__(self)\n self._mtx = None\n\n def get_matrix(self):\n # docstring inherited\n if self._invalid:\n if self._x == self._y:\n self._mtx = self._x.get_matrix()\n else:\n x_mtx = self._x.get_matrix()\n y_mtx = self._y.get_matrix()\n # 
We already know the transforms are separable, so we can skip\n # setting b and c to zero.\n self._mtx = np.array([x_mtx[0], y_mtx[1], [0.0, 0.0, 1.0]])\n self._inverted = None\n self._invalid = 0\n return self._mtx\n\n\ndef blended_transform_factory(x_transform, y_transform):\n """\n Create a new "blended" transform using *x_transform* to transform\n the *x*-axis and *y_transform* to transform the *y*-axis.\n\n A faster version of the blended transform is returned for the case\n where both child transforms are affine.\n """\n if (isinstance(x_transform, Affine2DBase) and\n isinstance(y_transform, Affine2DBase)):\n return BlendedAffine2D(x_transform, y_transform)\n return BlendedGenericTransform(x_transform, y_transform)\n\n\nclass CompositeGenericTransform(Transform):\n """\n A composite transform formed by applying transform *a* then\n transform *b*.\n\n This "generic" version can handle any two arbitrary\n transformations.\n """\n pass_through = True\n\n def __init__(self, a, b, **kwargs):\n """\n Create a new composite transform that is the result of\n applying transform *a* then transform *b*.\n\n You will generally not call this constructor directly but write ``a +\n b`` instead, which will automatically choose the best kind of composite\n transform instance to create.\n """\n if a.output_dims != b.input_dims:\n raise ValueError("The output dimension of 'a' must be equal to "\n "the input dimensions of 'b'")\n self.input_dims = a.input_dims\n self.output_dims = b.output_dims\n\n super().__init__(**kwargs)\n self._a = a\n self._b = b\n self.set_children(a, b)\n\n def frozen(self):\n # docstring inherited\n self._invalid = 0\n frozen = composite_transform_factory(\n self._a.frozen(), self._b.frozen())\n if not isinstance(frozen, CompositeGenericTransform):\n return frozen.frozen()\n return frozen\n\n def _invalidate_internal(self, level, invalidating_node):\n # When the left child is invalidated at AFFINE_ONLY level and the right child is\n # non-affine, the 
composite transform is FULLY invalidated.\n if invalidating_node is self._a and not self._b.is_affine:\n level = Transform._INVALID_FULL\n super()._invalidate_internal(level, invalidating_node)\n\n def __eq__(self, other):\n if isinstance(other, (CompositeGenericTransform, CompositeAffine2D)):\n return self is other or (self._a == other._a\n and self._b == other._b)\n else:\n return False\n\n def _iter_break_from_left_to_right(self):\n for left, right in self._a._iter_break_from_left_to_right():\n yield left, right + self._b\n for left, right in self._b._iter_break_from_left_to_right():\n yield self._a + left, right\n\n def contains_branch_seperately(self, other_transform):\n # docstring inherited\n if self.output_dims != 2:\n raise ValueError('contains_branch_seperately only supports '\n 'transforms with 2 output dimensions')\n if self == other_transform:\n return (True, True)\n return self._b.contains_branch_seperately(other_transform)\n\n depth = property(lambda self: self._a.depth + self._b.depth)\n is_affine = property(lambda self: self._a.is_affine and self._b.is_affine)\n is_separable = property(\n lambda self: self._a.is_separable and self._b.is_separable)\n has_inverse = property(\n lambda self: self._a.has_inverse and self._b.has_inverse)\n\n __str__ = _make_str_method("_a", "_b")\n\n def transform_affine(self, values):\n # docstring inherited\n return self.get_affine().transform(values)\n\n def transform_non_affine(self, values):\n # docstring inherited\n if self._a.is_affine and self._b.is_affine:\n return values\n elif not self._a.is_affine and self._b.is_affine:\n return self._a.transform_non_affine(values)\n else:\n return self._b.transform_non_affine(self._a.transform(values))\n\n def transform_path_non_affine(self, path):\n # docstring inherited\n if self._a.is_affine and self._b.is_affine:\n return path\n elif not self._a.is_affine and self._b.is_affine:\n return self._a.transform_path_non_affine(path)\n else:\n return 
self._b.transform_path_non_affine(\n self._a.transform_path(path))\n\n def get_affine(self):\n # docstring inherited\n if not self._b.is_affine:\n return self._b.get_affine()\n else:\n return Affine2D(np.dot(self._b.get_affine().get_matrix(),\n self._a.get_affine().get_matrix()))\n\n def inverted(self):\n # docstring inherited\n return CompositeGenericTransform(\n self._b.inverted(), self._a.inverted())\n\n\nclass CompositeAffine2D(Affine2DBase):\n """\n A composite transform formed by applying transform *a* then transform *b*.\n\n This version is an optimization that handles the case where both *a*\n and *b* are 2D affines.\n """\n def __init__(self, a, b, **kwargs):\n """\n Create a new composite transform that is the result of\n applying `Affine2DBase` *a* then `Affine2DBase` *b*.\n\n You will generally not call this constructor directly but write ``a +\n b`` instead, which will automatically choose the best kind of composite\n transform instance to create.\n """\n if not a.is_affine or not b.is_affine:\n raise ValueError("'a' and 'b' must be affine transforms")\n if a.output_dims != b.input_dims:\n raise ValueError("The output dimension of 'a' must be equal to "\n "the input dimensions of 'b'")\n self.input_dims = a.input_dims\n self.output_dims = b.output_dims\n\n super().__init__(**kwargs)\n self._a = a\n self._b = b\n self.set_children(a, b)\n self._mtx = None\n\n @property\n def depth(self):\n return self._a.depth + self._b.depth\n\n def _iter_break_from_left_to_right(self):\n for left, right in self._a._iter_break_from_left_to_right():\n yield left, right + self._b\n for left, right in self._b._iter_break_from_left_to_right():\n yield self._a + left, right\n\n __str__ = _make_str_method("_a", "_b")\n\n def get_matrix(self):\n # docstring inherited\n if self._invalid:\n self._mtx = np.dot(\n self._b.get_matrix(),\n self._a.get_matrix())\n self._inverted = None\n self._invalid = 0\n return self._mtx\n\n\ndef composite_transform_factory(a, b):\n """\n Create 
a new composite transform that is the result of applying\n transform a then transform b.\n\n Shortcut versions of the blended transform are provided for the\n case where both child transforms are affine, or one or the other\n is the identity transform.\n\n Composite transforms may also be created using the '+' operator,\n e.g.::\n\n c = a + b\n """\n # check to see if any of a or b are IdentityTransforms. We use\n # isinstance here to guarantee that the transforms will *always*\n # be IdentityTransforms. Since TransformWrappers are mutable,\n # use of equality here would be wrong.\n if isinstance(a, IdentityTransform):\n return b\n elif isinstance(b, IdentityTransform):\n return a\n elif isinstance(a, Affine2D) and isinstance(b, Affine2D):\n return CompositeAffine2D(a, b)\n return CompositeGenericTransform(a, b)\n\n\nclass BboxTransform(Affine2DBase):\n """\n `BboxTransform` linearly transforms points from one `Bbox` to another.\n """\n\n is_separable = True\n\n def __init__(self, boxin, boxout, **kwargs):\n """\n Create a new `BboxTransform` that linearly transforms\n points from *boxin* to *boxout*.\n """\n _api.check_isinstance(BboxBase, boxin=boxin, boxout=boxout)\n\n super().__init__(**kwargs)\n self._boxin = boxin\n self._boxout = boxout\n self.set_children(boxin, boxout)\n self._mtx = None\n self._inverted = None\n\n __str__ = _make_str_method("_boxin", "_boxout")\n\n def get_matrix(self):\n # docstring inherited\n if self._invalid:\n inl, inb, inw, inh = self._boxin.bounds\n outl, outb, outw, outh = self._boxout.bounds\n x_scale = outw / inw\n y_scale = outh / inh\n if DEBUG and (x_scale == 0 or y_scale == 0):\n raise ValueError(\n "Transforming from or to a singular bounding box")\n self._mtx = np.array([[x_scale, 0.0, -inl*x_scale+outl],\n [ 0.0, y_scale, -inb*y_scale+outb],\n [ 0.0, 0.0, 1.0]],\n float)\n self._inverted = None\n self._invalid = 0\n return self._mtx\n\n\nclass BboxTransformTo(Affine2DBase):\n """\n `BboxTransformTo` is a transformation 
that linearly transforms points from\n the unit bounding box to a given `Bbox`.\n """\n\n is_separable = True\n\n def __init__(self, boxout, **kwargs):\n """\n Create a new `BboxTransformTo` that linearly transforms\n points from the unit bounding box to *boxout*.\n """\n _api.check_isinstance(BboxBase, boxout=boxout)\n\n super().__init__(**kwargs)\n self._boxout = boxout\n self.set_children(boxout)\n self._mtx = None\n self._inverted = None\n\n __str__ = _make_str_method("_boxout")\n\n def get_matrix(self):\n # docstring inherited\n if self._invalid:\n outl, outb, outw, outh = self._boxout.bounds\n if DEBUG and (outw == 0 or outh == 0):\n raise ValueError("Transforming to a singular bounding box.")\n self._mtx = np.array([[outw, 0.0, outl],\n [ 0.0, outh, outb],\n [ 0.0, 0.0, 1.0]],\n float)\n self._inverted = None\n self._invalid = 0\n return self._mtx\n\n\n@_api.deprecated("3.9")\nclass BboxTransformToMaxOnly(BboxTransformTo):\n """\n `BboxTransformToMaxOnly` is a transformation that linearly transforms points from\n the unit bounding box to a given `Bbox` with a fixed upper left of (0, 0).\n """\n def get_matrix(self):\n # docstring inherited\n if self._invalid:\n xmax, ymax = self._boxout.max\n if DEBUG and (xmax == 0 or ymax == 0):\n raise ValueError("Transforming to a singular bounding box.")\n self._mtx = np.array([[xmax, 0.0, 0.0],\n [ 0.0, ymax, 0.0],\n [ 0.0, 0.0, 1.0]],\n float)\n self._inverted = None\n self._invalid = 0\n return self._mtx\n\n\nclass BboxTransformFrom(Affine2DBase):\n """\n `BboxTransformFrom` linearly transforms points from a given `Bbox` to the\n unit bounding box.\n """\n is_separable = True\n\n def __init__(self, boxin, **kwargs):\n _api.check_isinstance(BboxBase, boxin=boxin)\n\n super().__init__(**kwargs)\n self._boxin = boxin\n self.set_children(boxin)\n self._mtx = None\n self._inverted = None\n\n __str__ = _make_str_method("_boxin")\n\n def get_matrix(self):\n # docstring inherited\n if self._invalid:\n inl, inb, inw, inh = 
self._boxin.bounds\n if DEBUG and (inw == 0 or inh == 0):\n raise ValueError("Transforming from a singular bounding box.")\n x_scale = 1.0 / inw\n y_scale = 1.0 / inh\n self._mtx = np.array([[x_scale, 0.0, -inl*x_scale],\n [ 0.0, y_scale, -inb*y_scale],\n [ 0.0, 0.0, 1.0]],\n float)\n self._inverted = None\n self._invalid = 0\n return self._mtx\n\n\nclass ScaledTranslation(Affine2DBase):\n """\n A transformation that translates by *xt* and *yt*, after *xt* and *yt*\n have been transformed by *scale_trans*.\n """\n def __init__(self, xt, yt, scale_trans, **kwargs):\n super().__init__(**kwargs)\n self._t = (xt, yt)\n self._scale_trans = scale_trans\n self.set_children(scale_trans)\n self._mtx = None\n self._inverted = None\n\n __str__ = _make_str_method("_t")\n\n def get_matrix(self):\n # docstring inherited\n if self._invalid:\n # A bit faster than np.identity(3).\n self._mtx = IdentityTransform._mtx.copy()\n self._mtx[:2, 2] = self._scale_trans.transform(self._t)\n self._invalid = 0\n self._inverted = None\n return self._mtx\n\n\nclass _ScaledRotation(Affine2DBase):\n """\n A transformation that applies rotation by *theta*, after transform by *trans_shift*.\n """\n def __init__(self, theta, trans_shift):\n super().__init__()\n self._theta = theta\n self._trans_shift = trans_shift\n self._mtx = None\n\n def get_matrix(self):\n if self._invalid:\n transformed_coords = self._trans_shift.transform([[self._theta, 0]])[0]\n adjusted_theta = transformed_coords[0]\n rotation = Affine2D().rotate(adjusted_theta)\n self._mtx = rotation.get_matrix()\n return self._mtx\n\n\nclass AffineDeltaTransform(Affine2DBase):\n r"""\n A transform wrapper for transforming displacements between pairs of points.\n\n This class is intended to be used to transform displacements ("position\n deltas") between pairs of points (e.g., as the ``offset_transform``\n of `.Collection`\s): given a transform ``t`` such that ``t =\n AffineDeltaTransform(t) + offset``, ``AffineDeltaTransform``\n satisfies 
``AffineDeltaTransform(a - b) == AffineDeltaTransform(a) -\n AffineDeltaTransform(b)``.\n\n This is implemented by forcing the offset components of the transform\n matrix to zero.\n\n This class is experimental as of 3.3, and the API may change.\n """\n\n pass_through = True\n\n def __init__(self, transform, **kwargs):\n super().__init__(**kwargs)\n self._base_transform = transform\n self.set_children(transform)\n\n __str__ = _make_str_method("_base_transform")\n\n def get_matrix(self):\n if self._invalid:\n self._mtx = self._base_transform.get_matrix().copy()\n self._mtx[:2, -1] = 0\n return self._mtx\n\n\nclass TransformedPath(TransformNode):\n """\n A `TransformedPath` caches a non-affine transformed copy of the\n `~.path.Path`. This cached copy is automatically updated when the\n non-affine part of the transform changes.\n\n .. note::\n\n Paths are considered immutable by this class. Any update to the\n path's vertices/codes will not trigger a transform recomputation.\n\n """\n def __init__(self, path, transform):\n """\n Parameters\n ----------\n path : `~.path.Path`\n transform : `Transform`\n """\n _api.check_isinstance(Transform, transform=transform)\n super().__init__()\n self._path = path\n self._transform = transform\n self.set_children(transform)\n self._transformed_path = None\n self._transformed_points = None\n\n def _revalidate(self):\n # only recompute if the invalidation includes the non_affine part of\n # the transform\n if (self._invalid == self._INVALID_FULL\n or self._transformed_path is None):\n self._transformed_path = \\n self._transform.transform_path_non_affine(self._path)\n self._transformed_points = \\n Path._fast_from_codes_and_verts(\n self._transform.transform_non_affine(self._path.vertices),\n None, self._path)\n self._invalid = 0\n\n def get_transformed_points_and_affine(self):\n """\n Return a copy of the child path, with the non-affine part of\n the transform already applied, along with the affine part of\n the path necessary to 
complete the transformation. Unlike\n :meth:`get_transformed_path_and_affine`, no interpolation will\n be performed.\n """\n self._revalidate()\n return self._transformed_points, self.get_affine()\n\n def get_transformed_path_and_affine(self):\n """\n Return a copy of the child path, with the non-affine part of\n the transform already applied, along with the affine part of\n the path necessary to complete the transformation.\n """\n self._revalidate()\n return self._transformed_path, self.get_affine()\n\n def get_fully_transformed_path(self):\n """\n Return a fully-transformed copy of the child path.\n """\n self._revalidate()\n return self._transform.transform_path_affine(self._transformed_path)\n\n def get_affine(self):\n return self._transform.get_affine()\n\n\nclass TransformedPatchPath(TransformedPath):\n """\n A `TransformedPatchPath` caches a non-affine transformed copy of the\n `~.patches.Patch`. This cached copy is automatically updated when the\n non-affine part of the transform or the patch changes.\n """\n\n def __init__(self, patch):\n """\n Parameters\n ----------\n patch : `~.patches.Patch`\n """\n # Defer to TransformedPath.__init__.\n super().__init__(patch.get_path(), patch.get_transform())\n self._patch = patch\n\n def _revalidate(self):\n patch_path = self._patch.get_path()\n # Force invalidation if the patch path changed; otherwise, let base\n # class check invalidation.\n if patch_path != self._path:\n self._path = patch_path\n self._transformed_path = None\n super()._revalidate()\n\n\ndef nonsingular(vmin, vmax, expander=0.001, tiny=1e-15, increasing=True):\n """\n Modify the endpoints of a range as needed to avoid singularities.\n\n Parameters\n ----------\n vmin, vmax : float\n The initial endpoints.\n expander : float, default: 0.001\n Fractional amount by which *vmin* and *vmax* are expanded if\n the original interval is too small, based on *tiny*.\n tiny : float, default: 1e-15\n Threshold for the ratio of the interval to the maximum 
absolute\n value of its endpoints. If the interval is smaller than\n this, it will be expanded. This value should be around\n 1e-15 or larger; otherwise the interval will be approaching\n the double precision resolution limit.\n increasing : bool, default: True\n If True, swap *vmin*, *vmax* if *vmin* > *vmax*.\n\n Returns\n -------\n vmin, vmax : float\n Endpoints, expanded and/or swapped if necessary.\n If either input is inf or NaN, or if both inputs are 0 or very\n close to zero, it returns -*expander*, *expander*.\n """\n\n if (not np.isfinite(vmin)) or (not np.isfinite(vmax)):\n return -expander, expander\n\n swapped = False\n if vmax < vmin:\n vmin, vmax = vmax, vmin\n swapped = True\n\n # Expand vmin, vmax to float: if they were integer types, they can wrap\n # around in abs (abs(np.int8(-128)) == -128) and vmax - vmin can overflow.\n vmin, vmax = map(float, [vmin, vmax])\n\n maxabsvalue = max(abs(vmin), abs(vmax))\n if maxabsvalue < (1e6 / tiny) * np.finfo(float).tiny:\n vmin = -expander\n vmax = expander\n\n elif vmax - vmin <= maxabsvalue * tiny:\n if vmax == 0 and vmin == 0:\n vmin = -expander\n vmax = expander\n else:\n vmin -= expander*abs(vmin)\n vmax += expander*abs(vmax)\n\n if swapped and not increasing:\n vmin, vmax = vmax, vmin\n return vmin, vmax\n\n\ndef interval_contains(interval, val):\n """\n Check, inclusively, whether an interval includes a given value.\n\n Parameters\n ----------\n interval : (float, float)\n The endpoints of the interval.\n val : float\n Value to check is within interval.\n\n Returns\n -------\n bool\n Whether *val* is within the *interval*.\n """\n a, b = interval\n if a > b:\n a, b = b, a\n return a <= val <= b\n\n\ndef _interval_contains_close(interval, val, rtol=1e-10):\n """\n Check, inclusively, whether an interval includes a given value, with the\n interval expanded by a small tolerance to admit floating point errors.\n\n Parameters\n ----------\n interval : (float, float)\n The endpoints of the interval.\n val : 
float\n Value to check is within interval.\n rtol : float, default: 1e-10\n Relative tolerance slippage allowed outside of the interval.\n For an interval ``[a, b]``, values\n ``a - rtol * (b - a) <= val <= b + rtol * (b - a)`` are considered\n inside the interval.\n\n Returns\n -------\n bool\n Whether *val* is within the *interval* (with tolerance).\n """\n a, b = interval\n if a > b:\n a, b = b, a\n rtol = (b - a) * rtol\n return a - rtol <= val <= b + rtol\n\n\ndef interval_contains_open(interval, val):\n """\n Check, excluding endpoints, whether an interval includes a given value.\n\n Parameters\n ----------\n interval : (float, float)\n The endpoints of the interval.\n val : float\n Value to check is within interval.\n\n Returns\n -------\n bool\n Whether *val* is within the *interval*.\n """\n a, b = interval\n return a < val < b or a > val > b\n\n\ndef offset_copy(trans, fig=None, x=0.0, y=0.0, units='inches'):\n """\n Return a new transform with an added offset.\n\n Parameters\n ----------\n trans : `Transform` subclass\n Any transform, to which offset will be applied.\n fig : `~matplotlib.figure.Figure`, default: None\n Current figure. It can be None if *units* are 'dots'.\n x, y : float, default: 0.0\n The offset to apply.\n units : {'inches', 'points', 'dots'}, default: 'inches'\n Units of the offset.\n\n Returns\n -------\n `Transform` subclass\n Transform with applied offset.\n """\n _api.check_in_list(['dots', 'points', 'inches'], units=units)\n if units == 'dots':\n return trans + Affine2D().translate(x, y)\n if fig is None:\n raise ValueError('For units of inches or points a fig kwarg is needed')\n if units == 'points':\n x /= 72.0\n y /= 72.0\n # Default units are 'inches'\n return trans + ScaledTranslation(x, y, fig.dpi_scale_trans)\n | .venv\Lib\site-packages\matplotlib\transforms.py | transforms.py | Python | 99,707 | 0.75 | 0.168948 | 0.065182 | awesome-app | 83 | 2023-11-23T01:30:32.502263 | GPL-3.0 | false | bd61e61da6d4572863e5722ffaa59ccc |
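With `transforms.py` complete above, the endpoint-sanitizing logic of `nonsingular` can be exercised in isolation. A minimal stdlib-only sketch (it mirrors, rather than imports, the Matplotlib implementation; `sys.float_info.min` stands in for `np.finfo(float).tiny`):

```python
import math
import sys

def nonsingular(vmin, vmax, expander=0.001, tiny=1e-15, increasing=True):
    """Expand or replace a degenerate (vmin, vmax) range, as in the file above."""
    # Non-finite endpoints: fall back to a symmetric default interval.
    if not (math.isfinite(vmin) and math.isfinite(vmax)):
        return -expander, expander
    swapped = False
    if vmax < vmin:
        vmin, vmax = vmax, vmin
        swapped = True
    vmin, vmax = float(vmin), float(vmax)
    maxabsvalue = max(abs(vmin), abs(vmax))
    if maxabsvalue < (1e6 / tiny) * sys.float_info.min:
        # Both endpoints are (nearly) zero.
        vmin, vmax = -expander, expander
    elif vmax - vmin <= maxabsvalue * tiny:
        # Interval too narrow relative to its magnitude: pad both ends.
        if vmax == 0 and vmin == 0:
            vmin, vmax = -expander, expander
        else:
            vmin -= expander * abs(vmin)
            vmax += expander * abs(vmax)
    if swapped and not increasing:
        vmin, vmax = vmax, vmin
    return vmin, vmax
```

`nonsingular(0, 0)` yields `(-0.001, 0.001)`, and a zero-width interval such as `(1, 1)` is padded to roughly `(0.999, 1.001)`.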
from .path import Path\nfrom .patches import Patch\nfrom .figure import Figure\nimport numpy as np\nfrom numpy.typing import ArrayLike\nfrom collections.abc import Iterable, Sequence\nfrom typing import Literal\n\nDEBUG: bool\n\nclass TransformNode:\n INVALID_NON_AFFINE: int\n INVALID_AFFINE: int\n INVALID: int\n is_bbox: bool\n # Implemented as a standard attr in base class, but functionally readonly and some subclasses implement as such\n @property\n def is_affine(self) -> bool: ...\n pass_through: bool\n def __init__(self, shorthand_name: str | None = ...) -> None: ...\n def __copy__(self) -> TransformNode: ...\n def invalidate(self) -> None: ...\n def set_children(self, *children: TransformNode) -> None: ...\n def frozen(self) -> TransformNode: ...\n\nclass BboxBase(TransformNode):\n is_bbox: bool\n is_affine: bool\n def frozen(self) -> Bbox: ...\n def __array__(self, *args, **kwargs): ...\n @property\n def x0(self) -> float: ...\n @property\n def y0(self) -> float: ...\n @property\n def x1(self) -> float: ...\n @property\n def y1(self) -> float: ...\n @property\n def p0(self) -> tuple[float, float]: ...\n @property\n def p1(self) -> tuple[float, float]: ...\n @property\n def xmin(self) -> float: ...\n @property\n def ymin(self) -> float: ...\n @property\n def xmax(self) -> float: ...\n @property\n def ymax(self) -> float: ...\n @property\n def min(self) -> tuple[float, float]: ...\n @property\n def max(self) -> tuple[float, float]: ...\n @property\n def intervalx(self) -> tuple[float, float]: ...\n @property\n def intervaly(self) -> tuple[float, float]: ...\n @property\n def width(self) -> float: ...\n @property\n def height(self) -> float: ...\n @property\n def size(self) -> tuple[float, float]: ...\n @property\n def bounds(self) -> tuple[float, float, float, float]: ...\n @property\n def extents(self) -> tuple[float, float, float, float]: ...\n def get_points(self) -> np.ndarray: ...\n def containsx(self, x: float) -> bool: ...\n def containsy(self, y: 
float) -> bool: ...\n def contains(self, x: float, y: float) -> bool: ...\n def overlaps(self, other: BboxBase) -> bool: ...\n def fully_containsx(self, x: float) -> bool: ...\n def fully_containsy(self, y: float) -> bool: ...\n def fully_contains(self, x: float, y: float) -> bool: ...\n def fully_overlaps(self, other: BboxBase) -> bool: ...\n def transformed(self, transform: Transform) -> Bbox: ...\n coefs: dict[str, tuple[float, float]]\n def anchored(\n self,\n c: tuple[float, float] | Literal['C', 'SW', 'S', 'SE', 'E', 'NE', 'N', 'NW', 'W'],\n container: BboxBase,\n ) -> Bbox: ...\n def shrunk(self, mx: float, my: float) -> Bbox: ...\n def shrunk_to_aspect(\n self,\n box_aspect: float,\n container: BboxBase | None = ...,\n fig_aspect: float = ...,\n ) -> Bbox: ...\n def splitx(self, *args: float) -> list[Bbox]: ...\n def splity(self, *args: float) -> list[Bbox]: ...\n def count_contains(self, vertices: ArrayLike) -> int: ...\n def count_overlaps(self, bboxes: Iterable[BboxBase]) -> int: ...\n def expanded(self, sw: float, sh: float) -> Bbox: ...\n def padded(self, w_pad: float, h_pad: float | None = ...) -> Bbox: ...\n def translated(self, tx: float, ty: float) -> Bbox: ...\n def corners(self) -> np.ndarray: ...\n def rotated(self, radians: float) -> Bbox: ...\n @staticmethod\n def union(bboxes: Sequence[BboxBase]) -> Bbox: ...\n @staticmethod\n def intersection(bbox1: BboxBase, bbox2: BboxBase) -> Bbox | None: ...\n\nclass Bbox(BboxBase):\n def __init__(self, points: ArrayLike, **kwargs) -> None: ...\n @staticmethod\n def unit() -> Bbox: ...\n @staticmethod\n def null() -> Bbox: ...\n @staticmethod\n def from_bounds(x0: float, y0: float, width: float, height: float) -> Bbox: ...\n @staticmethod\n def from_extents(*args: float, minpos: float | None = ...) 
-> Bbox: ...\n def __format__(self, fmt: str) -> str: ...\n def ignore(self, value: bool) -> None: ...\n def update_from_path(\n self,\n path: Path,\n ignore: bool | None = ...,\n updatex: bool = ...,\n updatey: bool = ...,\n ) -> None: ...\n def update_from_data_x(self, x: ArrayLike, ignore: bool | None = ...) -> None: ...\n def update_from_data_y(self, y: ArrayLike, ignore: bool | None = ...) -> None: ...\n def update_from_data_xy(\n self,\n xy: ArrayLike,\n ignore: bool | None = ...,\n updatex: bool = ...,\n updatey: bool = ...,\n ) -> None: ...\n @property\n def minpos(self) -> float: ...\n @property\n def minposx(self) -> float: ...\n @property\n def minposy(self) -> float: ...\n def get_points(self) -> np.ndarray: ...\n def set_points(self, points: ArrayLike) -> None: ...\n def set(self, other: Bbox) -> None: ...\n def mutated(self) -> bool: ...\n def mutatedx(self) -> bool: ...\n def mutatedy(self) -> bool: ...\n\nclass TransformedBbox(BboxBase):\n def __init__(self, bbox: Bbox, transform: Transform, **kwargs) -> None: ...\n def get_points(self) -> np.ndarray: ...\n\nclass LockableBbox(BboxBase):\n def __init__(\n self,\n bbox: BboxBase,\n x0: float | None = ...,\n y0: float | None = ...,\n x1: float | None = ...,\n y1: float | None = ...,\n **kwargs\n ) -> None: ...\n @property\n def locked_x0(self) -> float | None: ...\n @locked_x0.setter\n def locked_x0(self, x0: float | None) -> None: ...\n @property\n def locked_y0(self) -> float | None: ...\n @locked_y0.setter\n def locked_y0(self, y0: float | None) -> None: ...\n @property\n def locked_x1(self) -> float | None: ...\n @locked_x1.setter\n def locked_x1(self, x1: float | None) -> None: ...\n @property\n def locked_y1(self) -> float | None: ...\n @locked_y1.setter\n def locked_y1(self, y1: float | None) -> None: ...\n\nclass Transform(TransformNode):\n\n # Implemented as a standard attrs in base class, but functionally readonly and some subclasses implement as such\n @property\n def input_dims(self) -> 
int | None: ...\n @property\n def output_dims(self) -> int | None: ...\n @property\n def is_separable(self) -> bool: ...\n @property\n def has_inverse(self) -> bool: ...\n\n def __add__(self, other: Transform) -> Transform: ...\n @property\n def depth(self) -> int: ...\n def contains_branch(self, other: Transform) -> bool: ...\n def contains_branch_seperately(\n self, other_transform: Transform\n ) -> Sequence[bool]: ...\n def __sub__(self, other: Transform) -> Transform: ...\n def __array__(self, *args, **kwargs) -> np.ndarray: ...\n def transform(self, values: ArrayLike) -> np.ndarray: ...\n def transform_affine(self, values: ArrayLike) -> np.ndarray: ...\n def transform_non_affine(self, values: ArrayLike) -> ArrayLike: ...\n def transform_bbox(self, bbox: BboxBase) -> Bbox: ...\n def get_affine(self) -> Transform: ...\n def get_matrix(self) -> np.ndarray: ...\n def transform_point(self, point: ArrayLike) -> np.ndarray: ...\n def transform_path(self, path: Path) -> Path: ...\n def transform_path_affine(self, path: Path) -> Path: ...\n def transform_path_non_affine(self, path: Path) -> Path: ...\n def transform_angles(\n self,\n angles: ArrayLike,\n pts: ArrayLike,\n radians: bool = ...,\n pushoff: float = ...,\n ) -> np.ndarray: ...\n def inverted(self) -> Transform: ...\n\nclass TransformWrapper(Transform):\n pass_through: bool\n def __init__(self, child: Transform) -> None: ...\n def __eq__(self, other: object) -> bool: ...\n def frozen(self) -> Transform: ...\n def set(self, child: Transform) -> None: ...\n\nclass AffineBase(Transform):\n is_affine: Literal[True]\n def __init__(self, *args, **kwargs) -> None: ...\n def __eq__(self, other: object) -> bool: ...\n\nclass Affine2DBase(AffineBase):\n input_dims: Literal[2]\n output_dims: Literal[2]\n def frozen(self) -> Affine2D: ...\n def to_values(self) -> tuple[float, float, float, float, float, float]: ...\n\nclass Affine2D(Affine2DBase):\n def __init__(self, matrix: ArrayLike | None = ..., **kwargs) -> None: 
...\n @staticmethod\n def from_values(\n a: float, b: float, c: float, d: float, e: float, f: float\n ) -> Affine2D: ...\n def set_matrix(self, mtx: ArrayLike) -> None: ...\n def clear(self) -> Affine2D: ...\n def rotate(self, theta: float) -> Affine2D: ...\n def rotate_deg(self, degrees: float) -> Affine2D: ...\n def rotate_around(self, x: float, y: float, theta: float) -> Affine2D: ...\n def rotate_deg_around(self, x: float, y: float, degrees: float) -> Affine2D: ...\n def translate(self, tx: float, ty: float) -> Affine2D: ...\n def scale(self, sx: float, sy: float | None = ...) -> Affine2D: ...\n def skew(self, xShear: float, yShear: float) -> Affine2D: ...\n def skew_deg(self, xShear: float, yShear: float) -> Affine2D: ...\n\nclass IdentityTransform(Affine2DBase): ...\n\nclass _BlendedMixin:\n def __eq__(self, other: object) -> bool: ...\n def contains_branch_seperately(self, transform: Transform) -> Sequence[bool]: ...\n\nclass BlendedGenericTransform(_BlendedMixin, Transform):\n input_dims: Literal[2]\n output_dims: Literal[2]\n pass_through: bool\n def __init__(\n self, x_transform: Transform, y_transform: Transform, **kwargs\n ) -> None: ...\n @property\n def depth(self) -> int: ...\n def contains_branch(self, other: Transform) -> Literal[False]: ...\n @property\n def is_affine(self) -> bool: ...\n\nclass BlendedAffine2D(_BlendedMixin, Affine2DBase):\n def __init__(\n self, x_transform: Transform, y_transform: Transform, **kwargs\n ) -> None: ...\n\ndef blended_transform_factory(\n x_transform: Transform, y_transform: Transform\n) -> BlendedGenericTransform | BlendedAffine2D: ...\n\nclass CompositeGenericTransform(Transform):\n pass_through: bool\n def __init__(self, a: Transform, b: Transform, **kwargs) -> None: ...\n\nclass CompositeAffine2D(Affine2DBase):\n def __init__(self, a: Affine2DBase, b: Affine2DBase, **kwargs) -> None: ...\n @property\n def depth(self) -> int: ...\n\ndef composite_transform_factory(a: Transform, b: Transform) -> Transform: 
...\n\nclass BboxTransform(Affine2DBase):\n def __init__(self, boxin: BboxBase, boxout: BboxBase, **kwargs) -> None: ...\n\nclass BboxTransformTo(Affine2DBase):\n def __init__(self, boxout: BboxBase, **kwargs) -> None: ...\n\nclass BboxTransformToMaxOnly(BboxTransformTo): ...\n\nclass BboxTransformFrom(Affine2DBase):\n def __init__(self, boxin: BboxBase, **kwargs) -> None: ...\n\nclass ScaledTranslation(Affine2DBase):\n def __init__(\n self, xt: float, yt: float, scale_trans: Affine2DBase, **kwargs\n ) -> None: ...\n\nclass AffineDeltaTransform(Affine2DBase):\n def __init__(self, transform: Affine2DBase, **kwargs) -> None: ...\n\nclass TransformedPath(TransformNode):\n def __init__(self, path: Path, transform: Transform) -> None: ...\n def get_transformed_points_and_affine(self) -> tuple[Path, Transform]: ...\n def get_transformed_path_and_affine(self) -> tuple[Path, Transform]: ...\n def get_fully_transformed_path(self) -> Path: ...\n def get_affine(self) -> Transform: ...\n\nclass TransformedPatchPath(TransformedPath):\n def __init__(self, patch: Patch) -> None: ...\n\ndef nonsingular(\n vmin: float,\n vmax: float,\n expander: float = ...,\n tiny: float = ...,\n increasing: bool = ...,\n) -> tuple[float, float]: ...\ndef interval_contains(interval: tuple[float, float], val: float) -> bool: ...\ndef interval_contains_open(interval: tuple[float, float], val: float) -> bool: ...\ndef offset_copy(\n trans: Transform,\n fig: Figure | None = ...,\n x: float = ...,\n y: float = ...,\n units: Literal["inches", "points", "dots"] = ...,\n) -> Transform: ...\n\n\nclass _ScaledRotation(Affine2DBase):\n def __init__(self, theta: float, trans_shift: Transform) -> None: ...\n def get_matrix(self) -> np.ndarray: ...\n | .venv\Lib\site-packages\matplotlib\transforms.pyi | transforms.pyi | Other | 12,102 | 0.95 | 0.527859 | 0.009709 | awesome-app | 991 | 2024-12-05T20:51:33.878794 | MIT | false | 8f38445f0544a1a74db7c5a111a3dc96 |
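The stubs above preserve Matplotlib's composition convention: ``a + b`` means "apply *a*, then *b*", which for 2D affines reduces to the homogeneous matrix product ``b @ a`` (cf. ``CompositeAffine2D.get_matrix`` in the implementation, which computes ``np.dot(b_mtx, a_mtx)``). A plain-Python sketch of that ordering, with no Matplotlib dependency:

```python
def matmul3(m, n):
    # Row-major 3x3 matrix product m @ n.
    return [[sum(m[i][k] * n[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def apply_affine(m, x, y):
    # Apply a 3x3 affine matrix to a point in homogeneous coordinates.
    return (m[0][0] * x + m[0][1] * y + m[0][2],
            m[1][0] * x + m[1][1] * y + m[1][2])

scale2 = [[2.0, 0.0, 0.0], [0.0, 2.0, 0.0], [0.0, 0.0, 1.0]]   # a: scale by 2
shift1 = [[1.0, 0.0, 1.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]   # b: translate x by 1
# "a then b" corresponds to the product b @ a, not a @ b.
composed = matmul3(shift1, scale2)
```

Applying `composed` to `(1, 1)` scales first, then shifts: `(3.0, 2.0)`; reversing the product order would shift first and give `(4.0, 2.0)` instead.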
"""\nTyping support for Matplotlib\n\nThis module contains Type aliases which are useful for Matplotlib and potentially\ndownstream libraries.\n\n.. admonition:: Provisional status of typing\n\n The ``typing`` module and type stub files are considered provisional and may change\n at any time without a deprecation period.\n"""\nfrom collections.abc import Hashable, Sequence\nimport pathlib\nfrom typing import Any, Callable, Literal, TypeAlias, TypeVar, Union\n\nfrom . import path\nfrom ._enums import JoinStyle, CapStyle\nfrom .artist import Artist\nfrom .backend_bases import RendererBase\nfrom .markers import MarkerStyle\nfrom .transforms import Bbox, Transform\n\nRGBColorType: TypeAlias = tuple[float, float, float] | str\nRGBAColorType: TypeAlias = (\n str | # "none" or "#RRGGBBAA"/"#RGBA" hex strings\n tuple[float, float, float, float] |\n # 2 tuple (color, alpha) representations, not infinitely recursive\n # RGBColorType includes the (str, float) tuple, even for RGBA strings\n tuple[RGBColorType, float] |\n # (4-tuple, float) is odd, but accepted as the outer float overriding A of 4-tuple\n tuple[tuple[float, float, float, float], float]\n)\n\nColorType: TypeAlias = RGBColorType | RGBAColorType\n\nRGBColourType: TypeAlias = RGBColorType\nRGBAColourType: TypeAlias = RGBAColorType\nColourType: TypeAlias = ColorType\n\nLineStyleType: TypeAlias = str | tuple[float, Sequence[float]]\nDrawStyleType: TypeAlias = Literal["default", "steps", "steps-pre", "steps-mid",\n "steps-post"]\nMarkEveryType: TypeAlias = (\n None |\n int | tuple[int, int] | slice | list[int] |\n float | tuple[float, float] |\n list[bool]\n)\n\nMarkerType: TypeAlias = str | path.Path | MarkerStyle\nFillStyleType: TypeAlias = Literal["full", "left", "right", "bottom", "top", "none"]\nJoinStyleType: TypeAlias = JoinStyle | Literal["miter", "round", "bevel"]\nCapStyleType: TypeAlias = CapStyle | Literal["butt", "projecting", "round"]\n\nCoordsBaseType = Union[\n str,\n Artist,\n Transform,\n Callable[\n 
[RendererBase],\n Union[Bbox, Transform]\n ]\n]\nCoordsType = Union[\n CoordsBaseType,\n tuple[CoordsBaseType, CoordsBaseType]\n]\n\nRcStyleType: TypeAlias = (\n str |\n dict[str, Any] |\n pathlib.Path |\n Sequence[str | pathlib.Path | dict[str, Any]]\n)\n\n_HT = TypeVar("_HT", bound=Hashable)\nHashableList: TypeAlias = list[_HT | "HashableList[_HT]"]\n"""A nested list of Hashable values."""\n | .venv\Lib\site-packages\matplotlib\typing.py | typing.py | Python | 2,439 | 0.95 | 0.038462 | 0.045455 | python-kit | 810 | 2023-08-03T21:52:40.752965 | Apache-2.0 | false | d44fd8b85f74dd9d0ed7a6db7c198f27 |
"""\nThe classes here provide support for using custom classes with\nMatplotlib, e.g., those that do not expose the array interface but know\nhow to convert themselves to arrays. It also supports classes with\nunits and units conversion. Use cases include converters for custom\nobjects, e.g., a list of datetime objects, as well as for objects that\nare unit aware. We don't assume any particular units implementation;\nrather a units implementation must register with the Registry converter\ndictionary and provide a `ConversionInterface`. For example,\nhere is a complete implementation which supports plotting with native\ndatetime objects::\n\n import matplotlib.units as units\n import matplotlib.dates as dates\n import matplotlib.ticker as ticker\n import datetime\n\n class DateConverter(units.ConversionInterface):\n\n @staticmethod\n def convert(value, unit, axis):\n "Convert a datetime value to a scalar or array."\n return dates.date2num(value)\n\n @staticmethod\n def axisinfo(unit, axis):\n "Return major and minor tick locators and formatters."\n if unit != 'date':\n return None\n majloc = dates.AutoDateLocator()\n majfmt = dates.AutoDateFormatter(majloc)\n return units.AxisInfo(majloc=majloc, majfmt=majfmt, label='date')\n\n @staticmethod\n def default_units(x, axis):\n "Return the default unit for x or None."\n return 'date'\n\n # Finally we register our object type with the Matplotlib units registry.\n units.registry[datetime.date] = DateConverter()\n"""\n\nfrom decimal import Decimal\nfrom numbers import Number\n\nimport numpy as np\nfrom numpy import ma\n\nfrom matplotlib import cbook\n\n\nclass ConversionError(TypeError):\n pass\n\n\ndef _is_natively_supported(x):\n """\n Return whether *x* is of a type that Matplotlib natively supports or an\n array of objects of such types.\n """\n # Matplotlib natively supports all number types except Decimal.\n if np.iterable(x):\n # Assume lists are homogeneous as other functions in unit system.\n for thisx in x:\n if 
thisx is ma.masked:\n continue\n return isinstance(thisx, Number) and not isinstance(thisx, Decimal)\n else:\n return isinstance(x, Number) and not isinstance(x, Decimal)\n\n\nclass AxisInfo:\n """\n Information to support default axis labeling, tick labeling, and limits.\n\n An instance of this class must be returned by\n `ConversionInterface.axisinfo`.\n """\n def __init__(self, majloc=None, minloc=None,\n majfmt=None, minfmt=None, label=None,\n default_limits=None):\n """\n Parameters\n ----------\n majloc, minloc : Locator, optional\n Tick locators for the major and minor ticks.\n majfmt, minfmt : Formatter, optional\n Tick formatters for the major and minor ticks.\n label : str, optional\n The default axis label.\n default_limits : optional\n The default min and max limits of the axis if no data has\n been plotted.\n\n Notes\n -----\n If any of the above are ``None``, the axis will simply use the\n default value.\n """\n self.majloc = majloc\n self.minloc = minloc\n self.majfmt = majfmt\n self.minfmt = minfmt\n self.label = label\n self.default_limits = default_limits\n\n\nclass ConversionInterface:\n """\n The minimal interface for a converter to take custom data types (or\n sequences) and convert them to values Matplotlib can use.\n """\n\n @staticmethod\n def axisinfo(unit, axis):\n """Return an `.AxisInfo` for the axis with the specified units."""\n return None\n\n @staticmethod\n def default_units(x, axis):\n """Return the default unit for *x* or ``None`` for the given axis."""\n return None\n\n @staticmethod\n def convert(obj, unit, axis):\n """\n Convert *obj* using *unit* for the specified *axis*.\n\n If *obj* is a sequence, return the converted sequence. 
The output must\n be a sequence of scalars that can be used by the numpy array layer.\n """\n return obj\n\n\nclass DecimalConverter(ConversionInterface):\n """Converter for decimal.Decimal data to float."""\n\n @staticmethod\n def convert(value, unit, axis):\n """\n Convert Decimals to floats.\n\n The *unit* and *axis* arguments are not used.\n\n Parameters\n ----------\n value : decimal.Decimal or iterable\n Decimal or list of Decimal need to be converted\n """\n if isinstance(value, Decimal):\n return float(value)\n # value is Iterable[Decimal]\n elif isinstance(value, ma.MaskedArray):\n return ma.asarray(value, dtype=float)\n else:\n return np.asarray(value, dtype=float)\n\n # axisinfo and default_units can be inherited as Decimals are Numbers.\n\n\nclass Registry(dict):\n """Register types with conversion interface."""\n\n def get_converter(self, x):\n """Get the converter interface instance for *x*, or None."""\n # Unpack in case of e.g. Pandas or xarray object\n x = cbook._unpack_to_numpy(x)\n\n if isinstance(x, np.ndarray):\n # In case x in a masked array, access the underlying data (only its\n # type matters). If x is a regular ndarray, getdata() just returns\n # the array itself.\n x = np.ma.getdata(x).ravel()\n # If there are no elements in x, infer the units from its dtype\n if not x.size:\n return self.get_converter(np.array([0], dtype=x.dtype))\n for cls in type(x).__mro__: # Look up in the cache.\n try:\n return self[cls]\n except KeyError:\n pass\n try: # If cache lookup fails, look up based on first element...\n first = cbook._safe_first_finite(x)\n except (TypeError, StopIteration):\n pass\n else:\n # ... 
and avoid infinite recursion for pathological iterables for\n # which indexing returns instances of the same iterable class.\n if type(first) is not type(x):\n return self.get_converter(first)\n return None\n\n\nregistry = Registry()\nregistry[Decimal] = DecimalConverter()\n | .venv\Lib\site-packages\matplotlib\units.py | units.py | Python | 6,429 | 0.95 | 0.230769 | 0.075 | vue-tools | 885 | 2023-11-02T10:12:18.497859 | GPL-3.0 | false | 6e516d85658e71cf38cfd48e0816ad6f |
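The `units.py` record above resolves a converter by walking the data type's MRO and, failing that, recursing on the first element of a sequence. A stripped-down sketch of that lookup, assuming a toy `Converter` class in place of `ConversionInterface` and omitting the NumPy/masked-array branches:

```python
from decimal import Decimal

class Converter:
    """Toy stand-in for a ConversionInterface implementation."""
    def convert(self, value):
        return float(value)

class Registry(dict):
    """Map data types to converter instances, resolving through the MRO."""
    def get_converter(self, x):
        # Walk the class hierarchy so subclasses inherit their base's converter.
        for cls in type(x).__mro__:
            if cls in self:
                return self[cls]
        # Fall back to the first element of a sequence, guarding against
        # pathological iterables whose elements are the same iterable type
        # (which would recurse forever).
        if isinstance(x, (list, tuple)) and x:
            first = x[0]
            if type(first) is not type(x):
                return self.get_converter(first)
        return None

registry = Registry()
registry[Decimal] = Converter()

print(registry.get_converter(Decimal("1.5")))               # the Converter instance
print(registry.get_converter([Decimal("2"), Decimal("3")])) # found via first element
print(registry.get_converter(1.0))                          # None: floats are native
```

The MRO walk is what lets `registry[datetime.date]` (as in the docstring example above) also serve subclasses of `datetime.date` without extra registrations.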
from .artist import Artist\nfrom .axes import Axes\nfrom .backend_bases import FigureCanvasBase, Event, MouseEvent, MouseButton\nfrom .collections import LineCollection\nfrom .figure import Figure\nfrom .lines import Line2D\nfrom .patches import Polygon, Rectangle\nfrom .text import Text\n\nimport PIL.Image\n\nfrom collections.abc import Callable, Collection, Iterable, Sequence\nfrom typing import Any, Literal\nfrom numpy.typing import ArrayLike\nfrom .typing import ColorType\nimport numpy as np\n\nclass LockDraw:\n def __init__(self) -> None: ...\n def __call__(self, o: Any) -> None: ...\n def release(self, o: Any) -> None: ...\n def available(self, o: Any) -> bool: ...\n def isowner(self, o: Any) -> bool: ...\n def locked(self) -> bool: ...\n\nclass Widget:\n drawon: bool\n eventson: bool\n active: bool\n def set_active(self, active: bool) -> None: ...\n def get_active(self) -> None: ...\n def ignore(self, event) -> bool: ...\n\nclass AxesWidget(Widget):\n ax: Axes\n def __init__(self, ax: Axes) -> None: ...\n @property\n def canvas(self) -> FigureCanvasBase | None: ...\n def connect_event(self, event: Event, callback: Callable) -> None: ...\n def disconnect_events(self) -> None: ...\n\nclass Button(AxesWidget):\n label: Text\n color: ColorType\n hovercolor: ColorType\n def __init__(\n self,\n ax: Axes,\n label: str,\n image: ArrayLike | PIL.Image.Image | None = ...,\n color: ColorType = ...,\n hovercolor: ColorType = ...,\n *,\n useblit: bool = ...\n ) -> None: ...\n def on_clicked(self, func: Callable[[Event], Any]) -> int: ...\n def disconnect(self, cid: int) -> None: ...\n\nclass SliderBase(AxesWidget):\n orientation: Literal["horizontal", "vertical"]\n closedmin: bool\n closedmax: bool\n valmin: float\n valmax: float\n valstep: float | ArrayLike | None\n drag_active: bool\n valfmt: str\n def __init__(\n self,\n ax: Axes,\n orientation: Literal["horizontal", "vertical"],\n closedmin: bool,\n closedmax: bool,\n valmin: float,\n valmax: float,\n valfmt: str,\n 
dragging: Slider | None,\n valstep: float | ArrayLike | None,\n ) -> None: ...\n def disconnect(self, cid: int) -> None: ...\n def reset(self) -> None: ...\n\nclass Slider(SliderBase):\n slidermin: Slider | None\n slidermax: Slider | None\n val: float\n valinit: float\n track: Rectangle\n poly: Polygon\n hline: Line2D\n vline: Line2D\n label: Text\n valtext: Text\n def __init__(\n self,\n ax: Axes,\n label: str,\n valmin: float,\n valmax: float,\n *,\n valinit: float = ...,\n valfmt: str | None = ...,\n closedmin: bool = ...,\n closedmax: bool = ...,\n slidermin: Slider | None = ...,\n slidermax: Slider | None = ...,\n dragging: bool = ...,\n valstep: float | ArrayLike | None = ...,\n orientation: Literal["horizontal", "vertical"] = ...,\n initcolor: ColorType = ...,\n track_color: ColorType = ...,\n handle_style: dict[str, Any] | None = ...,\n **kwargs\n ) -> None: ...\n def set_val(self, val: float) -> None: ...\n def on_changed(self, func: Callable[[float], Any]) -> int: ...\n\nclass RangeSlider(SliderBase):\n val: tuple[float, float]\n valinit: tuple[float, float]\n track: Rectangle\n poly: Polygon\n label: Text\n valtext: Text\n def __init__(\n self,\n ax: Axes,\n label: str,\n valmin: float,\n valmax: float,\n *,\n valinit: tuple[float, float] | None = ...,\n valfmt: str | None = ...,\n closedmin: bool = ...,\n closedmax: bool = ...,\n dragging: bool = ...,\n valstep: float | ArrayLike | None = ...,\n orientation: Literal["horizontal", "vertical"] = ...,\n track_color: ColorType = ...,\n handle_style: dict[str, Any] | None = ...,\n **kwargs\n ) -> None: ...\n def set_min(self, min: float) -> None: ...\n def set_max(self, max: float) -> None: ...\n def set_val(self, val: ArrayLike) -> None: ...\n def on_changed(self, func: Callable[[tuple[float, float]], Any]) -> int: ...\n\nclass CheckButtons(AxesWidget):\n labels: list[Text]\n def __init__(\n self,\n ax: Axes,\n labels: Sequence[str],\n actives: Iterable[bool] | None = ...,\n *,\n useblit: bool = ...,\n 
label_props: dict[str, Any] | None = ...,\n frame_props: dict[str, Any] | None = ...,\n check_props: dict[str, Any] | None = ...,\n ) -> None: ...\n def set_label_props(self, props: dict[str, Any]) -> None: ...\n def set_frame_props(self, props: dict[str, Any]) -> None: ...\n def set_check_props(self, props: dict[str, Any]) -> None: ...\n def set_active(self, index: int, state: bool | None = ...) -> None: ... # type: ignore[override]\n def clear(self) -> None: ...\n def get_status(self) -> list[bool]: ...\n def get_checked_labels(self) -> list[str]: ...\n def on_clicked(self, func: Callable[[str | None], Any]) -> int: ...\n def disconnect(self, cid: int) -> None: ...\n\nclass TextBox(AxesWidget):\n label: Text\n text_disp: Text\n cursor_index: int\n cursor: LineCollection\n color: ColorType\n hovercolor: ColorType\n capturekeystrokes: bool\n def __init__(\n self,\n ax: Axes,\n label: str,\n initial: str = ...,\n *,\n color: ColorType = ...,\n hovercolor: ColorType = ...,\n label_pad: float = ...,\n textalignment: Literal["left", "center", "right"] = ...,\n ) -> None: ...\n @property\n def text(self) -> str: ...\n def set_val(self, val: str) -> None: ...\n def begin_typing(self) -> None: ...\n def stop_typing(self) -> None: ...\n def on_text_change(self, func: Callable[[str], Any]) -> int: ...\n def on_submit(self, func: Callable[[str], Any]) -> int: ...\n def disconnect(self, cid: int) -> None: ...\n\nclass RadioButtons(AxesWidget):\n activecolor: ColorType\n value_selected: str\n labels: list[Text]\n def __init__(\n self,\n ax: Axes,\n labels: Iterable[str],\n active: int = ...,\n activecolor: ColorType | None = ...,\n *,\n useblit: bool = ...,\n label_props: dict[str, Any] | Sequence[dict[str, Any]] | None = ...,\n radio_props: dict[str, Any] | None = ...,\n ) -> None: ...\n def set_label_props(self, props: dict[str, Any]) -> None: ...\n def set_radio_props(self, props: dict[str, Any]) -> None: ...\n def set_active(self, index: int) -> None: ...\n def clear(self) 
-> None: ...\n def on_clicked(self, func: Callable[[str | None], Any]) -> int: ...\n def disconnect(self, cid: int) -> None: ...\n\nclass SubplotTool(Widget):\n figure: Figure\n targetfig: Figure\n buttonreset: Button\n def __init__(self, targetfig: Figure, toolfig: Figure) -> None: ...\n\nclass Cursor(AxesWidget):\n visible: bool\n horizOn: bool\n vertOn: bool\n useblit: bool\n lineh: Line2D\n linev: Line2D\n background: Any\n needclear: bool\n def __init__(\n self,\n ax: Axes,\n *,\n horizOn: bool = ...,\n vertOn: bool = ...,\n useblit: bool = ...,\n **lineprops\n ) -> None: ...\n def clear(self, event: Event) -> None: ...\n def onmove(self, event: Event) -> None: ...\n\nclass MultiCursor(Widget):\n axes: Sequence[Axes]\n horizOn: bool\n vertOn: bool\n visible: bool\n useblit: bool\n vlines: list[Line2D]\n hlines: list[Line2D]\n def __init__(\n self,\n canvas: Any,\n axes: Sequence[Axes],\n *,\n useblit: bool = ...,\n horizOn: bool = ...,\n vertOn: bool = ...,\n **lineprops\n ) -> None: ...\n def connect(self) -> None: ...\n def disconnect(self) -> None: ...\n def clear(self, event: Event) -> None: ...\n def onmove(self, event: Event) -> None: ...\n\nclass _SelectorWidget(AxesWidget):\n onselect: Callable[[float, float], Any]\n useblit: bool\n background: Any\n validButtons: list[MouseButton]\n def __init__(\n self,\n ax: Axes,\n onselect: Callable[[float, float], Any] | None = ...,\n useblit: bool = ...,\n button: MouseButton | Collection[MouseButton] | None = ...,\n state_modifier_keys: dict[str, str] | None = ...,\n use_data_coordinates: bool = ...,\n ) -> None: ...\n def update_background(self, event: Event) -> None: ...\n def connect_default_events(self) -> None: ...\n def ignore(self, event: Event) -> bool: ...\n def update(self) -> None: ...\n def press(self, event: Event) -> bool: ...\n def release(self, event: Event) -> bool: ...\n def onmove(self, event: Event) -> bool: ...\n def on_scroll(self, event: Event) -> None: ...\n def on_key_press(self, event: 
Event) -> None: ...\n def on_key_release(self, event: Event) -> None: ...\n def set_visible(self, visible: bool) -> None: ...\n def get_visible(self) -> bool: ...\n def clear(self) -> None: ...\n @property\n def artists(self) -> tuple[Artist]: ...\n def set_props(self, **props) -> None: ...\n def set_handle_props(self, **handle_props) -> None: ...\n def add_state(self, state: str) -> None: ...\n def remove_state(self, state: str) -> None: ...\n\nclass SpanSelector(_SelectorWidget):\n snap_values: ArrayLike | None\n onmove_callback: Callable[[float, float], Any]\n minspan: float\n grab_range: float\n drag_from_anywhere: bool\n ignore_event_outside: bool\n def __init__(\n self,\n ax: Axes,\n onselect: Callable[[float, float], Any],\n direction: Literal["horizontal", "vertical"],\n *,\n minspan: float = ...,\n useblit: bool = ...,\n props: dict[str, Any] | None = ...,\n onmove_callback: Callable[[float, float], Any] | None = ...,\n interactive: bool = ...,\n button: MouseButton | Collection[MouseButton] | None = ...,\n handle_props: dict[str, Any] | None = ...,\n grab_range: float = ...,\n state_modifier_keys: dict[str, str] | None = ...,\n drag_from_anywhere: bool = ...,\n ignore_event_outside: bool = ...,\n snap_values: ArrayLike | None = ...,\n ) -> None: ...\n def new_axes(\n self,\n ax: Axes,\n *,\n _props: dict[str, Any] | None = ...,\n _init: bool = ...,\n ) -> None: ...\n def connect_default_events(self) -> None: ...\n @property\n def direction(self) -> Literal["horizontal", "vertical"]: ...\n @direction.setter\n def direction(self, direction: Literal["horizontal", "vertical"]) -> None: ...\n @property\n def extents(self) -> tuple[float, float]: ...\n @extents.setter\n def extents(self, extents: tuple[float, float]) -> None: ...\n\nclass ToolLineHandles:\n ax: Axes\n def __init__(\n self,\n ax: Axes,\n positions: ArrayLike,\n direction: Literal["horizontal", "vertical"],\n *,\n line_props: dict[str, Any] | None = ...,\n useblit: bool = ...,\n ) -> None: ...\n 
@property\n def artists(self) -> tuple[Line2D]: ...\n @property\n def positions(self) -> list[float]: ...\n @property\n def direction(self) -> Literal["horizontal", "vertical"]: ...\n def set_data(self, positions: ArrayLike) -> None: ...\n def set_visible(self, value: bool) -> None: ...\n def set_animated(self, value: bool) -> None: ...\n def remove(self) -> None: ...\n def closest(self, x: float, y: float) -> tuple[int, float]: ...\n\nclass ToolHandles:\n ax: Axes\n def __init__(\n self,\n ax: Axes,\n x: ArrayLike,\n y: ArrayLike,\n *,\n marker: str = ...,\n marker_props: dict[str, Any] | None = ...,\n useblit: bool = ...,\n ) -> None: ...\n @property\n def x(self) -> ArrayLike: ...\n @property\n def y(self) -> ArrayLike: ...\n @property\n def artists(self) -> tuple[Line2D]: ...\n def set_data(self, pts: ArrayLike, y: ArrayLike | None = ...) -> None: ...\n def set_visible(self, val: bool) -> None: ...\n def set_animated(self, val: bool) -> None: ...\n def closest(self, x: float, y: float) -> tuple[int, float]: ...\n\nclass RectangleSelector(_SelectorWidget):\n drag_from_anywhere: bool\n ignore_event_outside: bool\n minspanx: float\n minspany: float\n spancoords: Literal["data", "pixels"]\n grab_range: float\n def __init__(\n self,\n ax: Axes,\n onselect: Callable[[MouseEvent, MouseEvent], Any] | None = ...,\n *,\n minspanx: float = ...,\n minspany: float = ...,\n useblit: bool = ...,\n props: dict[str, Any] | None = ...,\n spancoords: Literal["data", "pixels"] = ...,\n button: MouseButton | Collection[MouseButton] | None = ...,\n grab_range: float = ...,\n handle_props: dict[str, Any] | None = ...,\n interactive: bool = ...,\n state_modifier_keys: dict[str, str] | None = ...,\n drag_from_anywhere: bool = ...,\n ignore_event_outside: bool = ...,\n use_data_coordinates: bool = ...,\n ) -> None: ...\n @property\n def corners(self) -> tuple[np.ndarray, np.ndarray]: ...\n @property\n def edge_centers(self) -> tuple[np.ndarray, np.ndarray]: ...\n @property\n def 
center(self) -> tuple[float, float]: ...\n @property\n def extents(self) -> tuple[float, float, float, float]: ...\n @extents.setter\n def extents(self, extents: tuple[float, float, float, float]) -> None: ...\n @property\n def rotation(self) -> float: ...\n @rotation.setter\n def rotation(self, value: float) -> None: ...\n @property\n def geometry(self) -> np.ndarray: ...\n\nclass EllipseSelector(RectangleSelector): ...\n\nclass LassoSelector(_SelectorWidget):\n verts: None | list[tuple[float, float]]\n def __init__(\n self,\n ax: Axes,\n onselect: Callable[[list[tuple[float, float]]], Any] | None = ...,\n *,\n useblit: bool = ...,\n props: dict[str, Any] | None = ...,\n button: MouseButton | Collection[MouseButton] | None = ...,\n ) -> None: ...\n\nclass PolygonSelector(_SelectorWidget):\n grab_range: float\n def __init__(\n self,\n ax: Axes,\n onselect: Callable[[ArrayLike, ArrayLike], Any] | None = ...,\n *,\n useblit: bool = ...,\n props: dict[str, Any] | None = ...,\n handle_props: dict[str, Any] | None = ...,\n grab_range: float = ...,\n draw_bounding_box: bool = ...,\n box_handle_props: dict[str, Any] | None = ...,\n box_props: dict[str, Any] | None = ...\n ) -> None: ...\n def onmove(self, event: Event) -> bool: ...\n @property\n def verts(self) -> list[tuple[float, float]]: ...\n @verts.setter\n def verts(self, xys: Sequence[tuple[float, float]]) -> None: ...\n\nclass Lasso(AxesWidget):\n useblit: bool\n background: Any\n verts: list[tuple[float, float]] | None\n line: Line2D\n callback: Callable[[list[tuple[float, float]]], Any]\n def __init__(\n self,\n ax: Axes,\n xy: tuple[float, float],\n callback: Callable[[list[tuple[float, float]]], Any],\n *,\n useblit: bool = ...,\n props: dict[str, Any] | None = ...,\n ) -> None: ...\n def onrelease(self, event: Event) -> None: ...\n def onmove(self, event: Event) -> None: ...\n | .venv\Lib\site-packages\matplotlib\widgets.pyi | widgets.pyi | Other | 15,370 | 0.95 | 0.293033 | 0.043103 | python-kit | 875 | 
2023-12-15T00:43:44.325902 | MIT | false | f6b8fdddf39c7f8f7c2c8b08b1b6edb5 |
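The `widgets.pyi` record above stubs a `LockDraw` class whose methods (`__call__`, `release`, `available`, `isowner`, `locked`) describe a single-owner lock that serializes widget access to the canvas. A plain-Python sketch matching those stubbed signatures (the internals here are an assumption inferred from the method names, not Matplotlib's actual implementation):

```python
class LockDraw:
    """Single-owner lock: one widget may hold the canvas at a time."""

    def __init__(self):
        self._owner = None

    def __call__(self, o):
        """Reserve the lock for *o*; fail if another object holds it."""
        if not self.available(o):
            raise ValueError("already locked by another owner")
        self._owner = o

    def release(self, o):
        """Release the lock held by *o*."""
        if not self.available(o):
            raise ValueError("you do not own this lock")
        self._owner = None

    def available(self, o):
        """Return whether the lock is free or already held by *o*."""
        return not self.locked() or self.isowner(o)

    def isowner(self, o):
        """Return whether *o* currently holds the lock."""
        return self._owner is o

    def locked(self):
        """Return whether any object holds the lock."""
        return self._owner is not None

# Two stand-in widgets contending for the canvas.
zoom, pan = object(), object()
lock = LockDraw()
lock(zoom)                    # the zoom tool grabs the canvas
print(lock.locked())          # True
print(lock.available(pan))    # False: another tool owns it
lock.release(zoom)
print(lock.available(pan))    # True
```

Identity comparison (`is`) rather than equality keeps ownership tied to the exact widget instance, which is why the stubbed API passes the owner object itself to every method.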
"""\nA python interface to Adobe Font Metrics Files.\n\nAlthough a number of other Python implementations exist, and may be more\ncomplete than this, it was decided not to go with them because they were\neither:\n\n1) copyrighted or used a non-BSD compatible license\n2) had too many dependencies and a free standing lib was needed\n3) did more than needed and it was easier to write afresh rather than\n figure out how to get just what was needed.\n\nIt is pretty easy to use, and has no external dependencies:\n\n>>> import matplotlib as mpl\n>>> from pathlib import Path\n>>> afm_path = Path(mpl.get_data_path(), 'fonts', 'afm', 'ptmr8a.afm')\n>>>\n>>> from matplotlib.afm import AFM\n>>> with afm_path.open('rb') as fh:\n... afm = AFM(fh)\n>>> afm.string_width_height('What the heck?')\n(6220.0, 694)\n>>> afm.get_fontname()\n'Times-Roman'\n>>> afm.get_kern_dist('A', 'f')\n0\n>>> afm.get_kern_dist('A', 'y')\n-92.0\n>>> afm.get_bbox_char('!')\n[130, -9, 238, 676]\n\nAs in the Adobe Font Metrics File Format Specification, all dimensions\nare given in units of 1/1000 of the scale factor (point size) of the font\nbeing used.\n"""\n\nfrom collections import namedtuple\nimport logging\nimport re\n\nfrom ._mathtext_data import uni2type1\n\n\n_log = logging.getLogger(__name__)\n\n\ndef _to_int(x):\n # Some AFM files have floats where we are expecting ints -- there is\n # probably a better way to handle this (support floats, round rather than\n # truncate). But I don't know what the best approach is now and this\n # change to _to_int should at least prevent Matplotlib from crashing on\n # these. JDH (2009-11-06)\n return int(float(x))\n\n\ndef _to_float(x):\n # Some AFM files use "," instead of "." 
as decimal separator -- this\n # shouldn't be ambiguous (unless someone is wicked enough to use "," as\n # thousands separator...).\n if isinstance(x, bytes):\n # Encoding doesn't really matter -- if we have codepoints >127 the call\n # to float() will error anyways.\n x = x.decode('latin-1')\n return float(x.replace(',', '.'))\n\n\ndef _to_str(x):\n return x.decode('utf8')\n\n\ndef _to_list_of_ints(s):\n s = s.replace(b',', b' ')\n return [_to_int(val) for val in s.split()]\n\n\ndef _to_list_of_floats(s):\n return [_to_float(val) for val in s.split()]\n\n\ndef _to_bool(s):\n if s.lower().strip() in (b'false', b'0', b'no'):\n return False\n else:\n return True\n\n\ndef _parse_header(fh):\n """\n Read the font metrics header (up to the char metrics) and returns\n a dictionary mapping *key* to *val*. *val* will be converted to the\n appropriate python type as necessary; e.g.:\n\n * 'False'->False\n * '0'->0\n * '-168 -218 1000 898'-> [-168, -218, 1000, 898]\n\n Dictionary keys are\n\n StartFontMetrics, FontName, FullName, FamilyName, Weight,\n ItalicAngle, IsFixedPitch, FontBBox, UnderlinePosition,\n UnderlineThickness, Version, Notice, EncodingScheme, CapHeight,\n XHeight, Ascender, Descender, StartCharMetrics\n """\n header_converters = {\n b'StartFontMetrics': _to_float,\n b'FontName': _to_str,\n b'FullName': _to_str,\n b'FamilyName': _to_str,\n b'Weight': _to_str,\n b'ItalicAngle': _to_float,\n b'IsFixedPitch': _to_bool,\n b'FontBBox': _to_list_of_ints,\n b'UnderlinePosition': _to_float,\n b'UnderlineThickness': _to_float,\n b'Version': _to_str,\n # Some AFM files have non-ASCII characters (which are not allowed by\n # the spec). Given that there is actually no public API to even access\n # this field, just return it as straight bytes.\n b'Notice': lambda x: x,\n b'EncodingScheme': _to_str,\n b'CapHeight': _to_float, # Is the second version a mistake, or\n b'Capheight': _to_float, # do some AFM files contain 'Capheight'? 
-JKS\n b'XHeight': _to_float,\n b'Ascender': _to_float,\n b'Descender': _to_float,\n b'StdHW': _to_float,\n b'StdVW': _to_float,\n b'StartCharMetrics': _to_int,\n b'CharacterSet': _to_str,\n b'Characters': _to_int,\n }\n d = {}\n first_line = True\n for line in fh:\n line = line.rstrip()\n if line.startswith(b'Comment'):\n continue\n lst = line.split(b' ', 1)\n key = lst[0]\n if first_line:\n # AFM spec, Section 4: The StartFontMetrics keyword\n # [followed by a version number] must be the first line in\n # the file, and the EndFontMetrics keyword must be the\n # last non-empty line in the file. We just check the\n # first header entry.\n if key != b'StartFontMetrics':\n raise RuntimeError('Not an AFM file')\n first_line = False\n if len(lst) == 2:\n val = lst[1]\n else:\n val = b''\n try:\n converter = header_converters[key]\n except KeyError:\n _log.error("Found an unknown keyword in AFM header (was %r)", key)\n continue\n try:\n d[key] = converter(val)\n except ValueError:\n _log.error('Value error parsing header in AFM: %s, %s', key, val)\n continue\n if key == b'StartCharMetrics':\n break\n else:\n raise RuntimeError('Bad parse')\n return d\n\n\nCharMetrics = namedtuple('CharMetrics', 'width, name, bbox')\nCharMetrics.__doc__ = """\n Represents the character metrics of a single character.\n\n Notes\n -----\n The fields do currently only describe a subset of character metrics\n information defined in the AFM standard.\n """\nCharMetrics.width.__doc__ = """The character width (WX)."""\nCharMetrics.name.__doc__ = """The character name (N)."""\nCharMetrics.bbox.__doc__ = """\n The bbox of the character (B) as a tuple (*llx*, *lly*, *urx*, *ury*)."""\n\n\ndef _parse_char_metrics(fh):\n """\n Parse the given filehandle for character metrics information and return\n the information as dicts.\n\n It is assumed that the file cursor is on the line behind\n 'StartCharMetrics'.\n\n Returns\n -------\n ascii_d : dict\n A mapping "ASCII num of the character" to 
`.CharMetrics`.\n name_d : dict\n A mapping "character name" to `.CharMetrics`.\n\n Notes\n -----\n This function is incomplete per the standard, but thus far parses\n all the sample afm files tried.\n """\n required_keys = {'C', 'WX', 'N', 'B'}\n\n ascii_d = {}\n name_d = {}\n for line in fh:\n # We are defensively letting values be utf8. The spec requires\n # ascii, but there are non-compliant fonts in circulation\n line = _to_str(line.rstrip()) # Convert from byte-literal\n if line.startswith('EndCharMetrics'):\n return ascii_d, name_d\n # Split the metric line into a dictionary, keyed by metric identifiers\n vals = dict(s.strip().split(' ', 1) for s in line.split(';') if s)\n # There may be other metrics present, but only these are needed\n if not required_keys.issubset(vals):\n raise RuntimeError('Bad char metrics line: %s' % line)\n num = _to_int(vals['C'])\n wx = _to_float(vals['WX'])\n name = vals['N']\n bbox = _to_list_of_floats(vals['B'])\n bbox = list(map(int, bbox))\n metrics = CharMetrics(wx, name, bbox)\n # Workaround: If the character name is 'Euro', give it the\n # corresponding character code, according to WinAnsiEncoding (see PDF\n # Reference).\n if name == 'Euro':\n num = 128\n elif name == 'minus':\n num = ord("\N{MINUS SIGN}") # 0x2212\n if num != -1:\n ascii_d[num] = metrics\n name_d[name] = metrics\n raise RuntimeError('Bad parse')\n\n\ndef _parse_kern_pairs(fh):\n """\n Return a kern pairs dictionary; keys are (*char1*, *char2*) tuples and\n values are the kern pair value. 
For example, a kern pairs line like\n ``KPX A y -50``\n\n will be represented as::\n\n d[ ('A', 'y') ] = -50\n\n """\n\n line = next(fh)\n if not line.startswith(b'StartKernPairs'):\n raise RuntimeError('Bad start of kern pairs data: %s' % line)\n\n d = {}\n for line in fh:\n line = line.rstrip()\n if not line:\n continue\n if line.startswith(b'EndKernPairs'):\n next(fh) # EndKernData\n return d\n vals = line.split()\n if len(vals) != 4 or vals[0] != b'KPX':\n raise RuntimeError('Bad kern pairs line: %s' % line)\n c1, c2, val = _to_str(vals[1]), _to_str(vals[2]), _to_float(vals[3])\n d[(c1, c2)] = val\n raise RuntimeError('Bad kern pairs parse')\n\n\nCompositePart = namedtuple('CompositePart', 'name, dx, dy')\nCompositePart.__doc__ = """\n Represents the information on a composite element of a composite char."""\nCompositePart.name.__doc__ = """Name of the part, e.g. 'acute'."""\nCompositePart.dx.__doc__ = """x-displacement of the part from the origin."""\nCompositePart.dy.__doc__ = """y-displacement of the part from the origin."""\n\n\ndef _parse_composites(fh):\n """\n Parse the given filehandle for composites information return them as a\n dict.\n\n It is assumed that the file cursor is on the line behind 'StartComposites'.\n\n Returns\n -------\n dict\n A dict mapping composite character names to a parts list. 
        The parts
        list is a list of `.CompositePart` entries describing the parts of
        the composite.

    Examples
    --------
    A composite definition line::

      CC Aacute 2 ; PCC A 0 0 ; PCC acute 160 170 ;

    will be represented as::

      composites['Aacute'] = [CompositePart(name='A', dx=0, dy=0),
                              CompositePart(name='acute', dx=160, dy=170)]

    """
    composites = {}
    for line in fh:
        line = line.rstrip()
        if not line:
            continue
        if line.startswith(b'EndComposites'):
            return composites
        vals = line.split(b';')
        cc = vals[0].split()
        name, _num_parts = cc[1], _to_int(cc[2])
        pccParts = []
        for s in vals[1:-1]:
            pcc = s.split()
            part = CompositePart(pcc[1], _to_float(pcc[2]), _to_float(pcc[3]))
            pccParts.append(part)
        composites[name] = pccParts

    raise RuntimeError('Bad composites parse')


def _parse_optional(fh):
    """
    Parse the optional fields for kern pair data and composites.

    Returns
    -------
    kern_data : dict
        A dict containing kerning information. May be empty.
        See `._parse_kern_pairs`.
    composites : dict
        A dict containing composite information. May be empty.
        See `._parse_composites`.
    """
    optional = {
        b'StartKernData': _parse_kern_pairs,
        b'StartComposites': _parse_composites,
    }

    d = {b'StartKernData': {},
         b'StartComposites': {}}
    for line in fh:
        line = line.rstrip()
        if not line:
            continue
        key = line.split()[0]

        if key in optional:
            d[key] = optional[key](fh)

    return d[b'StartKernData'], d[b'StartComposites']


class AFM:

    def __init__(self, fh):
        """Parse the AFM file in file object *fh*."""
        self._header = _parse_header(fh)
        self._metrics, self._metrics_by_name = _parse_char_metrics(fh)
        self._kern, self._composite = _parse_optional(fh)

    def get_bbox_char(self, c, isord=False):
        if not isord:
            c = ord(c)
        return self._metrics[c].bbox

    def string_width_height(self, s):
        """
        Return the string width (including kerning) and string height
        as a (*w*, *h*) tuple.
        """
        if not len(s):
            return 0, 0
        total_width = 0
        namelast = None
        miny = 1e9
        maxy = 0
        for c in s:
            if c == '\n':
                continue
            wx, name, bbox = self._metrics[ord(c)]

            total_width += wx + self._kern.get((namelast, name), 0)
            l, b, w, h = bbox
            miny = min(miny, b)
            maxy = max(maxy, b + h)

            namelast = name

        return total_width, maxy - miny

    def get_str_bbox_and_descent(self, s):
        """Return the string bounding box and the maximal descent."""
        if not len(s):
            return 0, 0, 0, 0, 0
        total_width = 0
        namelast = None
        miny = 1e9
        maxy = 0
        left = 0
        if not isinstance(s, str):
            s = _to_str(s)
        for c in s:
            if c == '\n':
                continue
            name = uni2type1.get(ord(c), f"uni{ord(c):04X}")
            try:
                wx, _, bbox = self._metrics_by_name[name]
            except KeyError:
                name = 'question'
                wx, _, bbox = self._metrics_by_name[name]
            total_width += wx + self._kern.get((namelast, name), 0)
            l, b, w, h = bbox
            left = min(left, l)
            miny = min(miny, b)
            maxy = max(maxy, b + h)

            namelast = name

        return left, miny, total_width, maxy - miny, -miny

    def get_str_bbox(self, s):
        """Return the string bounding box."""
        return self.get_str_bbox_and_descent(s)[:4]

    def get_name_char(self, c, isord=False):
        """Get the name of the character, i.e., ';' is 'semicolon'."""
        if not isord:
            c = ord(c)
        return self._metrics[c].name

    def get_width_char(self, c, isord=False):
        """
        Get the width of the character from the character metric WX field.
        """
        if not isord:
            c = ord(c)
        return self._metrics[c].width

    def get_width_from_char_name(self, name):
        """Get the width of the character from a type1 character name."""
        return self._metrics_by_name[name].width

    def get_height_char(self, c, isord=False):
        """Get the bounding box (ink) height of character *c* (space is 0)."""
        if not isord:
            c = ord(c)
        return self._metrics[c].bbox[-1]

    def get_kern_dist(self, c1, c2):
        """
        Return the kerning pair distance (possibly 0) for chars *c1* and *c2*.
        """
        name1, name2 = self.get_name_char(c1), self.get_name_char(c2)
        return self.get_kern_dist_from_name(name1, name2)

    def get_kern_dist_from_name(self, name1, name2):
        """
        Return the kerning pair distance (possibly 0) for chars
        *name1* and *name2*.
        """
        return self._kern.get((name1, name2), 0)

    def get_fontname(self):
        """Return the font name, e.g., 'Times-Roman'."""
        return self._header[b'FontName']

    @property
    def postscript_name(self):  # For consistency with FT2Font.
        return self.get_fontname()

    def get_fullname(self):
        """Return the font full name, e.g., 'Times-Roman'."""
        name = self._header.get(b'FullName')
        if name is None:  # use FontName as a substitute
            name = self._header[b'FontName']
        return name

    def get_familyname(self):
        """Return the font family name, e.g., 'Times'."""
        name = self._header.get(b'FamilyName')
        if name is not None:
            return name

        # FamilyName not specified so we'll make a guess
        name = self.get_fullname()
        extras = (r'(?i)([ -](regular|plain|italic|oblique|bold|semibold|'
                  r'light|ultralight|extra|condensed))+$')
        return re.sub(extras, '', name)

    @property
    def family_name(self):
        """The font family name, e.g., 'Times'."""
        return self.get_familyname()

    def get_weight(self):
        """Return the font weight, e.g., 'Bold' or 'Roman'."""
        return self._header[b'Weight']

    def get_angle(self):
        """Return the fontangle as float."""
        return self._header[b'ItalicAngle']

    def get_capheight(self):
        """Return the cap height as float."""
        return self._header[b'CapHeight']

    def get_xheight(self):
        """Return the xheight as float."""
        return self._header[b'XHeight']

    def get_underline_thickness(self):
        """Return the underline thickness as float."""
        return self._header[b'UnderlineThickness']

    def get_horizontal_stem_width(self):
        """
        Return the standard horizontal stem width as float, or *None* if
        not specified in AFM file.
        """
        return self._header.get(b'StdHW', None)

    def get_vertical_stem_width(self):
        """
        Return the standard vertical stem width as float, or *None* if
        not specified in AFM file.
        """
        return self._header.get(b'StdVW', None)

.venv\Lib\site-packages\matplotlib\_afm.py | _afm.py | Python | 16,692 | 0.95 | 0.167293 | 0.068027 | awesome-app | 225 | 2023-11-22T05:21:01.609622 | MIT | false | 86827f4c12bedc21045615da204bf687
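The width computation in `AFM.string_width_height` above sums each glyph's advance width (`wx`) and adds the kerning adjustment for each adjacent glyph pair. A minimal self-contained sketch of that accumulation, using invented toy metrics rather than a real AFM file (the glyph widths, bboxes, and the kern value below are assumptions for illustration only):

```python
# Toy metrics: ord(char) -> (wx, glyph_name, bbox=(l, b, w, h)).
# All numbers are invented for illustration, not from a real AFM file.
metrics = {
    ord('A'): (667, 'A', (14, 0, 640, 674)),
    ord('V'): (667, 'V', (16, 0, 635, 674)),
}
# A negative kern pulls the 'A'/'V' pair closer together.
kern = {('A', 'V'): -80}

def string_width_height(s):
    total_width = 0
    namelast = None
    miny, maxy = 1e9, 0
    for c in s:
        wx, name, (l, b, w, h) = metrics[ord(c)]
        # Advance width plus the kern adjustment for the (previous, current) pair.
        total_width += wx + kern.get((namelast, name), 0)
        miny = min(miny, b)
        maxy = max(maxy, b + h)
        namelast = name
    return total_width, maxy - miny

print(string_width_height('AV'))  # (1254, 674): 667 + (667 - 80) wide
```

The height is the span from the lowest bbox bottom to the highest bbox top across the string, which is why a descender-free string like `'AV'` reports exactly the cap height of its glyphs.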
# JavaScript template for HTMLWriter
JS_INCLUDE = """
<link rel="stylesheet"
href="https://maxcdn.bootstrapcdn.com/font-awesome/4.4.0/css/font-awesome.min.css">
<script language="javascript">
  function isInternetExplorer() {
    ua = navigator.userAgent;
    /* MSIE used to detect old browsers and Trident used to newer ones*/
    return ua.indexOf("MSIE ") > -1 || ua.indexOf("Trident/") > -1;
  }

  /* Define the Animation class */
  function Animation(frames, img_id, slider_id, interval, loop_select_id){
    this.img_id = img_id;
    this.slider_id = slider_id;
    this.loop_select_id = loop_select_id;
    this.interval = interval;
    this.current_frame = 0;
    this.direction = 0;
    this.timer = null;
    this.frames = new Array(frames.length);

    for (var i=0; i<frames.length; i++)
    {
      this.frames[i] = new Image();
      this.frames[i].src = frames[i];
    }
    var slider = document.getElementById(this.slider_id);
    slider.max = this.frames.length - 1;
    if (isInternetExplorer()) {
        // switch from oninput to onchange because IE <= 11 does not conform
        // with W3C specification. It ignores oninput and onchange behaves
        // like oninput. In contrast, Microsoft Edge behaves correctly.
        slider.setAttribute('onchange', slider.getAttribute('oninput'));
        slider.setAttribute('oninput', null);
    }
    this.set_frame(this.current_frame);
  }

  Animation.prototype.get_loop_state = function(){
    var button_group = document[this.loop_select_id].state;
    for (var i = 0; i < button_group.length; i++) {
        var button = button_group[i];
        if (button.checked) {
            return button.value;
        }
    }
    return undefined;
  }

  Animation.prototype.set_frame = function(frame){
    this.current_frame = frame;
    document.getElementById(this.img_id).src =
            this.frames[this.current_frame].src;
    document.getElementById(this.slider_id).value = this.current_frame;
  }

  Animation.prototype.next_frame = function()
  {
    this.set_frame(Math.min(this.frames.length - 1, this.current_frame + 1));
  }

  Animation.prototype.previous_frame = function()
  {
    this.set_frame(Math.max(0, this.current_frame - 1));
  }

  Animation.prototype.first_frame = function()
  {
    this.set_frame(0);
  }

  Animation.prototype.last_frame = function()
  {
    this.set_frame(this.frames.length - 1);
  }

  Animation.prototype.slower = function()
  {
    this.interval /= 0.7;
    if(this.direction > 0){this.play_animation();}
    else if(this.direction < 0){this.reverse_animation();}
  }

  Animation.prototype.faster = function()
  {
    this.interval *= 0.7;
    if(this.direction > 0){this.play_animation();}
    else if(this.direction < 0){this.reverse_animation();}
  }

  Animation.prototype.anim_step_forward = function()
  {
    this.current_frame += 1;
    if(this.current_frame < this.frames.length){
      this.set_frame(this.current_frame);
    }else{
      var loop_state = this.get_loop_state();
      if(loop_state == "loop"){
        this.first_frame();
      }else if(loop_state == "reflect"){
        this.last_frame();
        this.reverse_animation();
      }else{
        this.pause_animation();
        this.last_frame();
      }
    }
  }

  Animation.prototype.anim_step_reverse = function()
  {
    this.current_frame -= 1;
    if(this.current_frame >= 0){
      this.set_frame(this.current_frame);
    }else{
      var loop_state = this.get_loop_state();
      if(loop_state == "loop"){
        this.last_frame();
      }else if(loop_state == "reflect"){
        this.first_frame();
        this.play_animation();
      }else{
        this.pause_animation();
        this.first_frame();
      }
    }
  }

  Animation.prototype.pause_animation = function()
  {
    this.direction = 0;
    if (this.timer){
      clearInterval(this.timer);
      this.timer = null;
    }
  }

  Animation.prototype.play_animation = function()
  {
    this.pause_animation();
    this.direction = 1;
    var t = this;
    if (!this.timer) this.timer = setInterval(function() {
        t.anim_step_forward();
    }, this.interval);
  }

  Animation.prototype.reverse_animation = function()
  {
    this.pause_animation();
    this.direction = -1;
    var t = this;
    if (!this.timer) this.timer = setInterval(function() {
        t.anim_step_reverse();
    }, this.interval);
  }
</script>
"""


# Style definitions for the HTML template
STYLE_INCLUDE = """
<style>
.animation {
    display: inline-block;
    text-align: center;
}
input[type=range].anim-slider {
    width: 374px;
    margin-left: auto;
    margin-right: auto;
}
.anim-buttons {
    margin: 8px 0px;
}
.anim-buttons button {
    padding: 0;
    width: 36px;
}
.anim-state label {
    margin-right: 8px;
}
.anim-state input {
    margin: 0;
    vertical-align: middle;
}
</style>
"""


# HTML template for HTMLWriter
DISPLAY_TEMPLATE = """
<div class="animation">
  <img id="_anim_img{id}">
  <div class="anim-controls">
    <input id="_anim_slider{id}" type="range" class="anim-slider"
           name="points" min="0" max="1" step="1" value="0"
           oninput="anim{id}.set_frame(parseInt(this.value));">
    <div class="anim-buttons">
      <button title="Decrease speed" aria-label="Decrease speed" onclick="anim{id}.slower()">
          <i class="fa fa-minus"></i></button>
      <button title="First frame" aria-label="First frame" onclick="anim{id}.first_frame()">
          <i class="fa fa-fast-backward"></i></button>
      <button title="Previous frame" aria-label="Previous frame" onclick="anim{id}.previous_frame()">
          <i class="fa fa-step-backward"></i></button>
      <button title="Play backwards" aria-label="Play backwards" onclick="anim{id}.reverse_animation()">
          <i class="fa fa-play fa-flip-horizontal"></i></button>
      <button title="Pause" aria-label="Pause" onclick="anim{id}.pause_animation()">
          <i class="fa fa-pause"></i></button>
      <button title="Play" aria-label="Play" onclick="anim{id}.play_animation()">
          <i class="fa fa-play"></i></button>
      <button title="Next frame" aria-label="Next frame" onclick="anim{id}.next_frame()">
          <i class="fa fa-step-forward"></i></button>
      <button title="Last frame" aria-label="Last frame" onclick="anim{id}.last_frame()">
          <i class="fa fa-fast-forward"></i></button>
      <button title="Increase speed" aria-label="Increase speed" onclick="anim{id}.faster()">
          <i class="fa fa-plus"></i></button>
    </div>
    <form title="Repetition mode" aria-label="Repetition mode" action="#n" name="_anim_loop_select{id}"
          class="anim-state">
      <input type="radio" name="state" value="once" id="_anim_radio1_{id}"
             {once_checked}>
      <label for="_anim_radio1_{id}">Once</label>
      <input type="radio" name="state" value="loop" id="_anim_radio2_{id}"
             {loop_checked}>
      <label for="_anim_radio2_{id}">Loop</label>
      <input type="radio" name="state" value="reflect" id="_anim_radio3_{id}"
             {reflect_checked}>
      <label for="_anim_radio3_{id}">Reflect</label>
    </form>
  </div>
</div>


<script language="javascript">
  /* Instantiate the Animation class. */
  /* The IDs given should match those used in the template above. */
  (function() {{
    var img_id = "_anim_img{id}";
    var slider_id = "_anim_slider{id}";
    var loop_select_id = "_anim_loop_select{id}";
    var frames = new Array({Nframes});
    {fill_frames}

    /* set a timeout to make sure all the above elements are created before
       the object is initialized. */
    setTimeout(function() {{
        anim{id} = new Animation(frames, img_id, slider_id, {interval},
                                 loop_select_id);
    }}, 0);
  }})()
</script>
"""  # noqa: E501


INCLUDED_FRAMES = """
  for (var i=0; i<{Nframes}; i++){{
    frames[i] = "{frame_dir}/frame" + ("0000000" + i).slice(-7) +
                ".{frame_format}";
  }}
"""

.venv\Lib\site-packages\matplotlib\_animation_data.py | _animation_data.py | Python | 7,986 | 0.95 | 0.229008 | 0.046218 | node-utils | 493 | 2023-09-25T07:59:30.661430 | BSD-3-Clause | false | 7dc06d1887d4a2f2c3169bb7154edbfb
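The `INCLUDED_FRAMES` template builds frame filenames with a 7-digit zero-padded index via the JavaScript idiom `("0000000" + i).slice(-7)`. A short sketch of the equivalent naming in Python, useful when generating the frame images that the JS loop will later load (`frame_dir` and `frame_format` below are placeholder values standing in for the template's fields):

```python
frame_dir = "frames"    # assumed output directory, mirrors {frame_dir}
frame_format = "png"    # assumed image format, mirrors {frame_format}

def frame_name(i):
    # f"{i:07d}" zero-pads to 7 digits, matching ("0000000" + i).slice(-7).
    return f"{frame_dir}/frame{i:07d}.{frame_format}"

print(frame_name(0))   # frames/frame0000000.png
print(frame_name(42))  # frames/frame0000042.png
```

Frames written with this naming scheme sort lexicographically in playback order, which is why the fixed-width index is used rather than a bare integer.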
def blocking_input_loop(figure, event_names, timeout, handler):
    """
    Run *figure*'s event loop while listening to interactive events.

    The events listed in *event_names* are passed to *handler*.

    This function is used to implement `.Figure.waitforbuttonpress`,
    `.Figure.ginput`, and `.Axes.clabel`.

    Parameters
    ----------
    figure : `~matplotlib.figure.Figure`
    event_names : list of str
        The names of the events passed to *handler*.
    timeout : float
        If positive, the event loop is stopped after *timeout* seconds.
    handler : Callable[[Event], Any]
        Function called for each event; it can force an early exit of the event
        loop by calling ``canvas.stop_event_loop()``.
    """
    if figure.canvas.manager:
        figure.show()  # Ensure that the figure is shown if we are managing it.
    # Connect the events to the on_event function call.
    cids = [figure.canvas.mpl_connect(name, handler) for name in event_names]
    try:
        figure.canvas.start_event_loop(timeout)  # Start event loop.
    finally:  # Run even on exception like ctrl-c.
        # Disconnect the callbacks.
        for cid in cids:
            figure.canvas.mpl_disconnect(cid)

.venv\Lib\site-packages\matplotlib\_blocking_input.py | _blocking_input.py | Python | 1,224 | 0.95 | 0.333333 | 0.074074 | python-kit | 961 | 2023-08-29T05:04:11.995785 | GPL-3.0 | false | 6f3d025f40cece989d2b33d6f0a59da8
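The key safety property of `blocking_input_loop` is the `try`/`finally`: every callback connected before the loop starts is disconnected even if the loop exits via an exception (e.g. ctrl-c). A minimal sketch of that pattern using an invented `FakeCanvas` stand-in (its `mpl_connect`/`mpl_disconnect`/`start_event_loop` methods only mimic the shape of the real canvas API and are not matplotlib code):

```python
# FakeCanvas is a hypothetical stub for illustration; it simulates an
# event loop that is interrupted by ctrl-c (KeyboardInterrupt).
class FakeCanvas:
    def __init__(self):
        self.callbacks = {}   # cid -> (event name, handler)
        self._next_cid = 0

    def mpl_connect(self, name, handler):
        cid = self._next_cid
        self._next_cid += 1
        self.callbacks[cid] = (name, handler)
        return cid

    def mpl_disconnect(self, cid):
        self.callbacks.pop(cid, None)

    def start_event_loop(self, timeout):
        raise KeyboardInterrupt  # simulate ctrl-c during the loop

canvas = FakeCanvas()
events = []
cids = [canvas.mpl_connect(name, events.append)
        for name in ['button_press_event', 'key_press_event']]
try:
    try:
        canvas.start_event_loop(timeout=5)
    finally:  # runs even though the loop raised
        for cid in cids:
            canvas.mpl_disconnect(cid)
except KeyboardInterrupt:
    pass

print(canvas.callbacks)  # {} -- all callbacks disconnected despite the error
```

Without the `finally`, an interrupted loop would leave stale handlers connected to the canvas, and they would fire again on the next interaction.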