| repo | instance_id | problem_statement | merge_commit | base_commit |
|---|---|---|---|---|
python/cpython | python__cpython-105488 | # Inconsistent `repr` of `types.GenericAlias` with `ParamSpec`
The simplest repro:
```python
>>> type A[X, **Y] = None
>>> A[int, [str]]
A[int, [<class 'str'>]]
```
I think that it should be:
```python
>>> type A[X, **Y] = None
>>> A[int, [str]]
A[int, [str]]
```
We already had similar issues in the past: https://github.com/python/cpython/pull/102637
This happens somewhere in `ga_repr`, I will have a look.
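The stable part of the behaviour can be demonstrated with a plain `types.GenericAlias` (a minimal sketch; the nested-list repr is exactly the part under discussion, so it is printed rather than asserted):

```python
import types

# A plain GenericAlias stringifies type arguments by name:
alias = types.GenericAlias(list, (int,))
assert repr(alias) == "list[int]"

# The report is about arguments that are themselves lists (as produced by
# substituting a ParamSpec); on fixed versions their elements should get
# the same by-name treatment.
nested = types.GenericAlias(dict, (str, [int]))
print(repr(nested))
```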
<!-- gh-linked-prs -->
### Linked PRs
* gh-105488
* gh-106297
<!-- /gh-linked-prs -->
| eb7d6e7ad844955f9af88707d296e003c7ce4394 | e212618bafaa4f775502e3442de0affb80205b5e |
python/cpython | python__cpython-105482 | # Generate opcode metadata from bytecodes.c instead of opcode.py
We would ideally have bytecodes.c as the single source of truth about opcodes. So opcode.py and the code generated from it should be replaced by alternatives from the cases_generator, if we can.
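For context, the metadata in question is what `opcode.py` currently feeds into the tables exposed by `dis` (a small illustration of the data involved, not part of the proposal itself):

```python
import dis

# opcode.py currently provides the name<->number tables that dis exposes;
# the proposal is to generate the same data from bytecodes.c instead.
print(dis.opmap["RETURN_VALUE"])               # opcode number
print(dis.opname[dis.opmap["RETURN_VALUE"]])   # maps back to the name
```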
<!-- gh-linked-prs -->
### Linked PRs
* gh-105482
* gh-105506
* gh-105788
* gh-105791
* gh-105865
* gh-105913
* gh-105950
* gh-106673
* gh-106688
* gh-106758
* gh-106913
* gh-107276
* gh-107339
* gh-107534
* gh-107540
* gh-107561
* gh-107598
* gh-107866
* gh-107942
* gh-107971
* gh-108025
* gh-108079
* gh-108080
* gh-108367
<!-- /gh-linked-prs -->
| be2779c0cb54e56ea4bb9822cc7bb5c24e6a7af7 | 2211454fe210637ed7fabda12690dac6cc9a8149 |
python/cpython | python__cpython-135994 | # Remove deprecated `sre_*` modules
# Feature or enhancement
`sre_*` modules like `sre_constants`, `sre_compile`, and `sre_parse` were deprecated in `3.11` in https://github.com/python/cpython/commit/1be3260a90f16aae334d993aecf7b70426f98013
Our regular deprecation policy is that deprecated things can be removed in release N + 2, which here is `3.13`.
# Pitch
Let's remove them if there are no objections.
I guess it is safe to remove them for several reasons:
1. https://bugs.python.org/issue47152 clearly states that they were undocumented
2. There are now `re._parser`, `re._constants`, and `re._compiler` modules that are used instead
3. They were listed as deprecated in the "what's new" and the warning was pretty clear
The only argument against removing them:
1. The deprecation warning never says when they will be removed:
```python
>>> import sre_compile
<stdin>:1: DeprecationWarning: module 'sre_compile' is deprecated
```
I will send a PR once we settle it:
- Either with a better deprecation message
- Or with these modules removed
@vstinner @serhiy-storchaka what's your opinion?
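A sketch of the first option — a deprecation message that states the removal version (the helper name here is hypothetical, not an existing CPython API):

```python
import warnings

def _warn_deprecated_module(name, remove_in):
    # Hypothetical sketch of the "better deprecation message" option:
    # unlike the current warning, it says when removal will happen.
    warnings.warn(
        f"module {name!r} is deprecated; it will be removed in Python {remove_in}",
        DeprecationWarning,
        stacklevel=3,
    )

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    _warn_deprecated_module("sre_compile", "3.13")
```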
<!-- gh-linked-prs -->
### Linked PRs
* gh-135994
<!-- /gh-linked-prs -->
| 93809a918f3a2853e8a191b84ad844c722bee2f5 | 12ce16bc134a602d2ac8acde86ae69f70183cb9f |
python/cpython | python__cpython-105438 | # PEP 695: not all name collisions are tested
Right now there are two tests for name collisions of type parameters of type aliases and functions:
- https://github.com/python/cpython/blob/27c68a6d8f20090310450862c2c299bb7ba3c160/Lib/test/test_type_params.py#L11-L12
- https://github.com/python/cpython/blob/27c68a6d8f20090310450862c2c299bb7ba3c160/Lib/test/test_type_aliases.py#L11-L12
They only test that `A` and `**A` collide. But the PEP clearly states that `A` and `*A` collide as well.
I will send a PR.
I am really sorry that I have to put all my review comments to https://github.com/python/cpython/pull/103764 as separate issues, but I've missed my chance to properly review the PEP's implementation.
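The missing case can be checked via `compile()` so it fails safely on older interpreters too (a sketch; on pre-3.12 interpreters the same `SyntaxError` is raised simply because the PEP 695 syntax is unknown):

```python
# `A` and `*A` must collide. On 3.12+ this is rejected as a duplicate
# type parameter; before 3.12 the syntax itself does not parse, so
# compile() raises SyntaxError either way.
src = "type Alias[A, *A] = None"
try:
    compile(src, "<repro>", "exec")
except SyntaxError:
    collided = True
else:
    collided = False
```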
<!-- gh-linked-prs -->
### Linked PRs
* gh-105438
* gh-105452
<!-- /gh-linked-prs -->
| 76883af6bf28b7e810df172bd6762bf2cb64df08 | 18309ad94bb1ae0b092f34dc3fd54199876a6ebd |
python/cpython | python__cpython-105495 | # `subprocess.run(..., env={})` broken on Windows
<!--
If you're new to Python and you're not sure whether what you're experiencing is a bug, the CPython issue tracker is not
the right place to seek help. Consider the following options instead:
- reading the Python tutorial: https://docs.python.org/3/tutorial/
- posting in the "Users" category on discuss.python.org: https://discuss.python.org/c/users/7
- emailing the Python-list mailing list: https://mail.python.org/mailman/listinfo/python-list
- searching our issue tracker (https://github.com/python/cpython/issues) to see if
your problem has already been reported
-->
# Bug report
When passing an empty dictionary as `env` to `subprocess.run` (or `Popen`), I get the following error:
```
Python 3.11.3 (tags/v3.11.3:f3909b8, Apr 4 2023, 23:49:59) [MSC v.1934 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> import subprocess
>>> subprocess.run('c:\\msys64\\usr\\bin\\echo.exe', env={})
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "C:\Users\jhbaarnh\AppData\Local\Programs\Python\Python311\Lib\subprocess.py", line 548, in run
with Popen(*popenargs, **kwargs) as process:
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\jhbaarnh\AppData\Local\Programs\Python\Python311\Lib\subprocess.py", line 1024, in __init__
self._execute_child(args, executable, preexec_fn, close_fds,
File "C:\Users\jhbaarnh\AppData\Local\Programs\Python\Python311\Lib\subprocess.py", line 1509, in _execute_child
hp, ht, pid, tid = _winapi.CreateProcess(executable, args,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
OSError: [WinError 87] The parameter is incorrect
>>>
```
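A cross-platform workaround sketch, based on the assumption in this report that Windows' `CreateProcess` rejects a completely empty environment block (`minimal_env` is a hypothetical helper name):

```python
import os
import sys

def minimal_env():
    # Workaround sketch (assumption drawn from this report): on Windows,
    # keep at least SystemRoot in the environment; elsewhere an empty
    # mapping works fine.
    env = {}
    if sys.platform == "win32":
        env["SystemRoot"] = os.environ.get("SystemRoot", r"C:\Windows")
    return env
```

A caller would then write `subprocess.run(cmd, env=minimal_env())` instead of `env={}`.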
# Environment
<!-- Include as many relevant details as possible about the environment you experienced the bug in -->
- Reproduced on the following CPython versions, all native windows builds:
- vanilla python 3.11.3
- vanilla Python 3.9.6
- Intel Python 3.9.16
- I cannot reproduce the problem on any of the following:
- python 3.7.2, native Windows build and MSYS build (did not try newer MSYS)
- any linux
<!-- gh-linked-prs -->
### Linked PRs
* gh-105495
* gh-105700
* gh-105701
* gh-105742
* gh-105756
* gh-105757
<!-- /gh-linked-prs -->
| 4f7d3b602d47d61137e82145f601dccfe6f6cd3c | 2b90796be6959d5ef46b38c434a514fce25be971 |
python/cpython | python__cpython-105442 | # tokenize.py emits spurious NEWLINE if file ends on a comment without a newline
# Bug report
This issue, which was reported in https://bugs.python.org/issue35107#msg328884 for much older Python versions in the case where a file ended on a comment, has resurfaced for Python 3.12.
```
Python 3.12.0b2 (tags/v3.12.0b2:e6c0efa, Jun 6 2023, 15:37:48) [MSC v.1935 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> import tokenize
>>> import io
>>> tokenize.untokenize(tokenize.generate_tokens(io.StringIO('#').readline))
'#\n'
```
Note that the problem also arises in the following example:
```
>>> tokenize.untokenize(tokenize.generate_tokens(io.StringIO('a\n ').readline))
'a\n \n'
```
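For contrast, the roundtrip invariant does hold when the input ends with a newline (a minimal check):

```python
import io
import tokenize

# When the source already ends with a newline, tokenize/untokenize
# roundtrips exactly; the bug above is only about missing final newlines.
src = "a = 1\n"
tokens = list(tokenize.generate_tokens(io.StringIO(src).readline))
assert tokenize.untokenize(tokens) == src
```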
<!-- gh-linked-prs -->
### Linked PRs
* gh-105442
* gh-105444
<!-- /gh-linked-prs -->
| 7279fb64089abc73c03247fb8191082ee42a9671 | a24a780d937109a0982d807473ae410cc75b0e3b |
python/cpython | python__cpython-105443 | # PEP 695 is not tested to be `pickle`able
(or at least I cannot find these tests)
I think that testing the pickle result of functions, classes, and type aliases created with the new syntax is very important.
I propose to add these tests, PR is in the works.
<!-- gh-linked-prs -->
### Linked PRs
* gh-105443
* gh-105845
<!-- /gh-linked-prs -->
| 1af8251d9ec2f18e131c19ccf776fb9ec132c7a8 | 486b52a3158e0f64fc54efdfa34ed5437b3619f2 |
python/cpython | python__cpython-105432 | # `test_typing.NewTypeTests` can be improved
After https://github.com/python/cpython/pull/103764/files#diff-04d29c98076c2d6bb75921ea9becb26a862544d39b71db87b6e354c759b9305dR6452 `NewTypeTests` has two unused methods:
- `cleanup`, it is even provided in its parent class: `BaseTestCase`
- `teardown`, because it does not need to clean things up; tests with `-R 3:3` prove that
I will send a PR to clean things up.
<!-- gh-linked-prs -->
### Linked PRs
* gh-105432
* gh-105489
<!-- /gh-linked-prs -->
| 9d35a71a76cb033598ce136ea655d9e452fe3af0 | aa5b762bd3a3e837678cf7f9e1434c0f68208a0e |
python/cpython | python__cpython-105434 | # `typing._Immutable` is not used anymore
`class _Immutable` (https://github.com/python/cpython/blob/27c68a6d8f20090310450862c2c299bb7ba3c160/Lib/typing.py#L422-L431) is no longer used after https://github.com/python/cpython/pull/103764/files#diff-ddb987fca5f5df0c9a2f5521ed687919d70bb3d64eaeb8021f98833a2a716887L969.
```
» ag _Immutable
Lib/typing.py
419:class _Immutable:
```
Shall I remove it? I think that we don't provide any promises about `typing.py` internals.
<!-- gh-linked-prs -->
### Linked PRs
* gh-105434
* gh-105451
<!-- /gh-linked-prs -->
| 18309ad94bb1ae0b092f34dc3fd54199876a6ebd | e26d296984b2b6279231922ab0940d904aa6144e |
python/cpython | python__cpython-105408 | # Remove unused imports (June 2023)
From time to time, I run ``pyflakes`` to find unused imports in the Python code base. That's the issue for June 2023 :-)
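For readers without pyflakes at hand, a rough approximation of the check using only `ast` (a simplification: it ignores `__all__`, string annotations, and re-exports, which pyflakes handles properly):

```python
import ast

def unused_imports(source):
    # Collect imported names, then report those never referenced as a
    # Name node anywhere else in the module.
    tree = ast.parse(source)
    imported = []
    used = set()
    for node in ast.walk(tree):
        if isinstance(node, ast.Import):
            for alias in node.names:
                imported.append((alias.asname or alias.name).split(".")[0])
        elif isinstance(node, ast.ImportFrom):
            for alias in node.names:
                imported.append(alias.asname or alias.name)
        elif isinstance(node, ast.Name):
            used.add(node.id)
    return [name for name in imported if name not in used]
```

For example, `unused_imports("import os\nimport sys\nprint(sys.path)")` reports `["os"]`.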
<!-- gh-linked-prs -->
### Linked PRs
* gh-105408
* gh-105409
* gh-105410
* gh-105411
* gh-105554
<!-- /gh-linked-prs -->
| ae319e4b43dc0d9d94d36bdcdbc5443364398c29 | d3a0eacbf382288a487d40f16d63abfeb2125e1a |
python/cpython | python__cpython-105397 | # C API: Deprecate PyImport_ImportModuleNoBlock() function
The PyImport_ImportModuleNoBlock() function has been an alias of PyImport_ImportModule() since Python 3.3. I propose to deprecate it and schedule its removal in Python 3.15.
<!-- gh-linked-prs -->
### Linked PRs
* gh-105397
<!-- /gh-linked-prs -->
| 3e525d22128cf040b3fd164f52cc6ae20ca58455 | a5f23d411062f9f29f8a7d7ddefe60d5d8e17d2e |
python/cpython | python__cpython-105399 | # (3.12+) tokenize exception type is wrong
# Bug report
the documented exception type for the `tokenize.*` functions is `TokenError` -- however it appears they now raise `SyntaxError`: https://docs.python.org/3.12/library/tokenize.html#tokenize.TokenError
> Raised when either a docstring or expression that may be split over several lines is not completed anywhere in the file, for example:
```python
import io
import tokenize
list(tokenize.generate_tokens(io.StringIO('"""').readline))
```
```console
$ python3.11 t.py
Traceback (most recent call last):
File "/home/asottile/workspace/switch-microcontroller/t.py", line 4, in <module>
list(tokenize.generate_tokens(io.StringIO('"""').readline))
File "/usr/lib/python3.11/tokenize.py", line 465, in _tokenize
raise TokenError("EOF in multi-line string", strstart)
tokenize.TokenError: ('EOF in multi-line string', (1, 0))
```
and
```console
# python3.13 t.py
Traceback (most recent call last):
File "/y/pyupgrade/t.py", line 4, in <module>
list(tokenize.generate_tokens(io.StringIO('"""').readline))
File "/usr/lib/python3.13/tokenize.py", line 526, in _generate_tokens_from_c_tokenizer
for info in it:
File "<string>", line 1
"""
^
SyntaxError: unterminated triple-quoted string literal (detected at line 1)
```
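Code that has to run on both 3.11 and 3.12 can hedge by catching both types until documentation and behaviour agree (a defensive sketch):

```python
import io
import tokenize

def tokens_or_error(src):
    # Catch the documented TokenError *and* the SyntaxError that 3.12
    # raises per this report; return whichever happened.
    try:
        return list(tokenize.generate_tokens(io.StringIO(src).readline))
    except (tokenize.TokenError, SyntaxError) as exc:
        return exc
```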
# Your environment
- CPython versions tested on: c7bf74bacd
- Operating system and architecture: ubuntu 22.04 x86_64
<!--
You can freely edit this text. Remove any lines you believe are unnecessary.
-->
<!-- gh-linked-prs -->
### Linked PRs
* gh-105399
* gh-105439
* gh-105466
* gh-105472
<!-- /gh-linked-prs -->
| ffd26545509b706437ca38bd6db6108c1d0e2326 | 27c68a6d8f20090310450862c2c299bb7ba3c160 |
python/cpython | python__cpython-105388 | # PEP 683 (Immortal Objects): Implement Py_INCREF() as function call in limited C API 3.12
The implementation of Py_INCREF() and Py_DECREF() functions changed in Python 3.12 to implement PEP 683 (Immortal Objects). [Stable ABI sections of PEP 683](https://peps.python.org/pep-0683/#stable-abi) says:
> The implementation approach described in this PEP is compatible with extensions compiled to the stable ABI (with the exception of Accidental Immortality and Accidental De-Immortalizing). Due to the nature of the stable ABI, unfortunately, such extensions use versions of Py_INCREF(), etc. that directly modify the object’s ob_refcnt field. **This will invalidate all the performance benefits of immortal objects.**
I propose to change the limited C API implementation to use an opaque function call in the limited C API 3.12 and newer. It hides tricky implementation details: see issue #105059 and PR #105275 for a concrete example. It will also ease the hypothetical implementation of PEP 703 "NO GIL".
Converting Py_INCREF() and Py_DECREF() static inline functions to opaque function calls has an impact on performance. IMO the change is worth it. I didn't measure the impact on performance.
In Python 3.10 and 3.11, I modified the Python C API to abstract access to PyObject and PyVarObject with function calls, rather than accessing directly structure members: see issue #83754. Currently, it's not enforced, it's still possible to access directly structure members. This step is another step towards fully opaque PyObject and PyVarObject structure (remove their members from the public API).
<!-- gh-linked-prs -->
### Linked PRs
* gh-105388
* gh-105763
<!-- /gh-linked-prs -->
| b542972dc133973a7f0517aa1b61779271789111 | f3266c05b6186ab6d1db0799c06b8f76aefe7cf1 |
python/cpython | python__cpython-105384 | # urllib.request: Remove deprecated cafile, capath and cadefault parameters in Python 3.13
In Python 3.12, I removed *keyfile* and *certfile* parameters of most stdlib modules in issue #94172, but I forgot to remove *cafile* and *capath* of the ``urllib.request`` module.
I propose to remove the *cafile*, *capath* and *cadefault* parameters of ``urllib.request.urlopen()`` in Python 3.13. These parameters are deprecated since Python 3.6: the *context* parameter should be used instead. Using *cafile* and *capath* is less safe than using the *context* parameter.
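The migration for callers is mechanical (a sketch; the example URL and CA path are placeholders):

```python
import ssl

# Instead of urlopen(url, cafile=..., capath=..., cadefault=...), build the
# equivalent SSLContext and pass it as the *context* parameter.
context = ssl.create_default_context()  # system CA store
# context = ssl.create_default_context(cafile="/path/to/ca.pem")  # custom CA
# urllib.request.urlopen("https://example.org", context=context)
```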
<!-- gh-linked-prs -->
### Linked PRs
* gh-105384
<!-- /gh-linked-prs -->
| 2587b9f64eefde803a5e0b050171ad5f6654f31b | 94d5f9827da4bf4b1e61c134fe29904b2b92f124 |
python/cpython | python__cpython-105377 | # Remove deprecated logging.Logger.warn() method in Python 3.13
The logging.Logger.warn() method was deprecated in Python 3.3 by issue #57444 and commit 04d5bc00a219860c69ea17eaa633d3ab9917409f. This method is not documented and emits a DeprecationWarning since Python 3.3.
https://github.com/python/cpython/issues/57444#issuecomment-1093560206:
> That's deliberate. The original code (before incorporation into Python) had warn(), which was kept for backward compatibility. The docs refer to warning() because that's what everyone is supposed to use. The method names map to the lower case of the appropriate logging level name.
I propose to remove this logging.Logger.warn() method in Python 3.13.
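For affected code the migration is a pure rename from `warn()` to `warning()`; a small self-checking sketch (the handler class is just a capture helper for illustration):

```python
import logging

records = []

class _Capture(logging.Handler):
    def emit(self, record):
        records.append(record)

log = logging.getLogger("demo-warn-migration")
log.addHandler(_Capture())
log.setLevel(logging.WARNING)

# Migration: log.warn("...") becomes log.warning("...")
log.warning("deprecated interface used")
```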
<!-- gh-linked-prs -->
### Linked PRs
* gh-105377
* gh-106553
* gh-122775
* gh-122856
<!-- /gh-linked-prs -->
| 6c54e5d72166d012b52155cbf13af9e533290e06 | 221d703498d84e363070c89c28f839edceaac9b2 |
python/cpython | python__cpython-105412 | # Incorrect error handling for APIs that can raise exceptions
For example, in the collation callback two `str` objects (`string1` and `string2`) are created using `PyUnicode_FromStringAndSize`. Error handling should happen directly after each call to `PyUnicode_FromStringAndSize`:
https://github.com/python/cpython/blob/0cb6b9b0db5df6b3f902e86eb3d4a1e504afb851/Modules/_sqlite/connection.c#L1870-L1875
Other cases where error handling is not done immediately after the API has been used:
- [x] gh-105491
- [x] gh-105494
- [x] gh-105585
- [x] gh-105590
- [x] gh-105591
- [x] gh-105592
- [x] gh-105593
- [x] gh-105594
- [x] gh-105667
- [x] gh-105475
- [x] gh-105599
- [x] gh-105586
- [x] gh-105604
- [x] gh-105606
- [x] gh-105605
- [x] gh-105608
- [x] gh-105610
- [x] gh-105611
I might have missed some; I did not do a complete audit yet.
<!-- gh-linked-prs -->
### Linked PRs
* gh-105412
* gh-105440
* gh-105441
* gh-105475
* gh-105491
* gh-105494
* gh-105581
* gh-105582
* gh-105583
* gh-105584
* gh-105585
* gh-105586
* gh-105590
* gh-105591
* gh-105592
* gh-105593
* gh-105594
* gh-105596
* gh-105597
* gh-105598
* gh-105599
* gh-105600
* gh-105601
* gh-105604
* gh-105605
* gh-105608
* gh-105610
* gh-105611
* gh-105612
* gh-105613
* gh-105615
* gh-105642
* gh-105643
* gh-105644
* gh-105645
* gh-105646
* gh-105647
* gh-105648
* gh-105649
* gh-105650
* gh-105651
* gh-105659
* gh-105660
* gh-105661
* gh-105662
* gh-105663
* gh-105664
* gh-105665
* gh-105666
* gh-105667
* gh-105668
* gh-105669
* gh-105686
* gh-105710
* gh-105711
* gh-105720
* gh-105721
<!-- /gh-linked-prs -->
| a24a780d937109a0982d807473ae410cc75b0e3b | ffd26545509b706437ca38bd6db6108c1d0e2326 |
python/cpython | python__cpython-105715 | # locals() in a module or class scope comprehension does not include the iteration var(s) since 3.12b1
In Python 3.11, inside a comprehension, `locals()` was bound to the scope of the comprehension:
```
$ py -3.11 -c "print([locals() for name in 'abc'])"
[{'.0': <str_ascii_iterator object at 0x10472dc30>, 'name': 'c'}, {'.0': <str_ascii_iterator object at 0x10472dc30>, 'name': 'c'}, {'.0': <str_ascii_iterator object at 0x10472dc30>, 'name': 'c'}]
```
But since 3.12b1, `locals()` is `globals()`:
```
$ py -3.12 -c "print([locals() for name in 'abc'])"
[{'__name__': '__main__', '__doc__': None, '__package__': None, '__loader__': <class '_frozen_importlib.BuiltinImporter'>, '__spec__': None, '__annotations__': {}, '__builtins__': <module 'builtins' (built-in)>}, {'__name__': '__main__', '__doc__': None, '__package__': None, '__loader__': <class '_frozen_importlib.BuiltinImporter'>, '__spec__': None, '__annotations__': {}, '__builtins__': <module 'builtins' (built-in)>}, {'__name__': '__main__', '__doc__': None, '__package__': None, '__loader__': <class '_frozen_importlib.BuiltinImporter'>, '__spec__': None, '__annotations__': {}, '__builtins__': <module 'builtins' (built-in)>}]
$ py -3.11 -c "print([globals() for name in 'abc'])"
[{'__name__': '__main__', '__doc__': None, '__package__': None, '__loader__': <class '_frozen_importlib.BuiltinImporter'>, '__spec__': None, '__annotations__': {}, '__builtins__': <module 'builtins' (built-in)>}, {'__name__': '__main__', '__doc__': None, '__package__': None, '__loader__': <class '_frozen_importlib.BuiltinImporter'>, '__spec__': None, '__annotations__': {}, '__builtins__': <module 'builtins' (built-in)>}, {'__name__': '__main__', '__doc__': None, '__package__': None, '__loader__': <class '_frozen_importlib.BuiltinImporter'>, '__spec__': None, '__annotations__': {}, '__builtins__': <module 'builtins' (built-in)>}]
$ py -3.12 -c "print([locals() is globals() for name in 'abc'])"
[True, True, True]
```
May be related to #105256.
Discovered in [this job](https://github.com/jaraco/jaraco.clipboard/actions/runs/5107203578/jobs/9179898494), triggered by [this code](https://github.com/jaraco/jaraco.clipboard/blob/e81d86fb98c7e0d42fb160960cdd696b07b88952/conftest.py#L5-L9) (which I intend to update not to use `locals()`).
Is this change intentional?
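To probe a given interpreter without depending on where the snippet itself executes, the comparison can be run at module scope via `exec` (a minimal probe; the printed result differs by version as described above):

```python
# Run the comprehension at true module scope so the outcome doesn't depend
# on whether this snippet itself runs inside a function.
ns = {}
exec("result = [locals() is globals() for name in 'abc']", ns)
print(ns["result"])
```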
<!-- gh-linked-prs -->
### Linked PRs
* gh-105715
* gh-106470
<!-- /gh-linked-prs -->
| 104d7b760fed18055e4f04e5da3ca619e28bfc81 | 838406b4fc044c0b2f397c23275c69f16a76205b |
python/cpython | python__cpython-105348 | # enum.IntFlag (un)pickles incorrectly on Python 3.12 (and differently on Python 3.11)
# Bug report
Run the following code:
```
import enum
import pickle
class PxdEnum(enum.IntFlag):
    RANK_0 = 11
    RANK_1 = 37
    RANK_2 = 389
print(pickle.loads(pickle.dumps(PxdEnum.RANK_2)))
```
On Python 3.10.10 I get `PxdEnum.RANK_2`
On Python 3.11.2 I get `389` (which is probably also acceptable)
On Python 3.12 (0aaef83351473e8f4eb774f8f999bbe87a4866d7) I get
```
Traceback (most recent call last):
File "<path>/enumpickle.py", line 9, in <module>
print(pickle.loads(pickle.dumps(PxdEnum.RANK_2)))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: unsupported operand type(s) for |: 'PxdEnum' and 'NoneType'
```
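For contrast, canonical single-bit members of a stdlib `IntFlag` roundtrip on every version (a sketch; the failure above is specific to composite values like `389`):

```python
import pickle
import re

# re.RegexFlag is an enum.IntFlag; its single-bit members pickle by name
# and unpickle to the same singleton on all supported versions.
flag = re.IGNORECASE
assert pickle.loads(pickle.dumps(flag)) is flag
```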
# Your environment
Tested as described on Python 3.10, 3.11, and 3.12.
Linux
<!-- gh-linked-prs -->
### Linked PRs
* gh-105348
* gh-105519
* gh-105520
<!-- /gh-linked-prs -->
| 4ff5690e591b7d11cf11e34bf61004e2ea58ab3c | e822a676f1f3bef6c5413e9b856db481c08ac2a5 |
python/cpython | python__cpython-105325 | # python -m tokenize is broken when reading from stdin
```
./python.exe -m tokenize <<< 'f"{F():\n}"'
unexpected error: 'generator' object is not callable
Traceback (most recent call last):
File "/Users/pgalindo3/github/python/main/Lib/runpy.py", line 198, in _run_module_as_main
return _run_code(code, main_globals, None,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/pgalindo3/github/python/main/Lib/runpy.py", line 88, in _run_code
exec(code, run_globals)
File "/Users/pgalindo3/github/python/main/Lib/tokenize.py", line 532, in <module>
main()
File "/Users/pgalindo3/github/python/main/Lib/tokenize.py", line 498, in main
for token in tokens:
File "/Users/pgalindo3/github/python/main/Lib/tokenize.py", line 527, in _generate_tokens_from_c_tokenizer
for info in it:
TypeError: 'generator' object is not callable
```
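For reference, the public API expects a *readline callable*, which is what the traceback suggests the internal path stopped providing (a minimal working call):

```python
import io
import tokenize

# generate_tokens() takes a readline callable, not a generator; the bug
# above is an internal API being handed the wrong object type.
readline = io.StringIO("x = 1\n").readline
kinds = [tokenize.tok_name[tok.type] for tok in tokenize.generate_tokens(readline)]
```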
<!-- gh-linked-prs -->
### Linked PRs
* gh-105325
* gh-105330
<!-- /gh-linked-prs -->
| f04c16875b649e2c2b420eb146308d0206c7e527 | 677cf3974190f1f627ff63cc6c8777d7f390e47b |
python/cpython | python__cpython-105295 | # Incorrect call to `SSL_CTX_set_session_id_context` on client side SSL context
When initialising an SSLContext there is a call to `SSL_CTX_set_session_id_context()`:
    #define SID_CTX "Python"
    SSL_CTX_set_session_id_context(self->ctx, (const unsigned char *) SID_CTX,
                                   sizeof(SID_CTX));
    #undef SID_CTX
The openssl man pages state that `SSL_CTX_set_session_id_context` is a "server side only" operation:
https://www.openssl.org/docs/man1.0.2/man3/SSL_CTX_set_session_id_context.html
> SSL_CTX_set_session_id_context, SSL_set_session_id_context - set context within which session can be reused (server side only)
> The session id context becomes part of the session. The session id context is set by the SSL/TLS server. The SSL_CTX_set_session_id_context() and SSL_set_session_id_context() functions are therefore only useful on the server side.
In some circumstances, calling this on a client side socket can result in unexpected behavior. For example TLSv1.3 PSK: https://github.com/python/cpython/pull/103181#issuecomment-1493611344
The fix for this was originally part of another PR (https://github.com/python/cpython/pull/103181); @gpshead recommended creating a separate issue/PR.
<!-- gh-linked-prs -->
### Linked PRs
* gh-105295
<!-- /gh-linked-prs -->
| 21d98be42289369ccfbdcc38574cb9ab50ce1c02 | 490295d651d04ec3b3eff2a2cda7501191bad78a |
python/cpython | python__cpython-105294 | # add option to traceback.format_exception_only to recurse into exception groups
traceback.format_exception_only skips the backtrace and prints only the exception message. For exception groups it would be good to have an option to print the nested exceptions, recursively (just their types and message, no backtrace).
This is motivated by https://github.com/python/cpython/issues/104150.
<!-- gh-linked-prs -->
### Linked PRs
* gh-105294
<!-- /gh-linked-prs -->
| f4d8e10d0d0cc1ba0787d2350a699d9fb227a7cd | 92022d8416d9e175800b65c4d71d4e4fb47adcb0 |
python/cpython | python__cpython-107130 | # Link specifically to distutils migration advice in What's New
I think it would be useful to link directly to https://peps.python.org/pep-0632/#migration-advice in 3.12's What's New, since that section has more specific advice.
<!-- gh-linked-prs -->
### Linked PRs
* gh-107130
* gh-107160
<!-- /gh-linked-prs -->
| 7ca2d8e053e5fdbac4a9cec74701a1676c17a167 | b447e19e720e6781025432a40eb72b1cc93ac944 |
python/cpython | python__cpython-105287 | # Improve `typing.py` docstrings
The docstrings in `typing.py` could do with a spring clean. As well as several small grammatical errors, there are also several details which are out of date; very inconsistent indentation in code examples; and numerous other small things that could be improved.
<!-- gh-linked-prs -->
### Linked PRs
* gh-105287
* gh-105319
* gh-105322
* gh-105363
* gh-105416
* gh-105417
<!-- /gh-linked-prs -->
| f714aa2c29eae5cc4b5b54e4d83d83eebd5eac53 | 08756dbba647440803d2ba4545ba0ab2f0cdfe1c |
python/cpython | python__cpython-105281 | # `typing.Protocol` implementation means that `isinstance([], collections.abc.Mapping)` can sometimes evaluate to `True`
(In very specific circumstances)
# Bug report
Due to some changes that have been made to the implementation of `typing.Protocol` in Python 3.12, whether or not `isinstance([], collections.abc.Mapping)` evaluates to `True` or `False` now depends on precisely when and if garbage collection happens.
Minimal repro:
```pycon
>>> import gc, collections.abc, typing
>>> gc.disable()
>>> try:
...     class Foo(collections.abc.Mapping, typing.Protocol): pass
... except TypeError:
...     pass
...
>>> isinstance([], collections.abc.Mapping)
True
```
Why does this happen?! It's because `Foo` is an illegal class, so `TypeError` is raised during class initialisation here:
https://github.com/python/cpython/blob/ce558e69d4087dd3653207de78345fbb8a2c7835/Lib/typing.py#L1905-L1912
`TypeError` being raised here means that `Foo` never has `__protocol_attrs__` set on it. _But_, the error is raised _after_ class construction, meaning that if garbage collection happens at the wrong time, a reference to `Foo` is still retained in `collections.abc.Mapping.__subclasses__()`. This then creates problems in the `isinstance()` check, because of some behaviour in the `abc` module where it iterates through all of the subclasses of `collections.abc.Mapping` in order to determine whether an object is an instance of `collections.abc.Mapping`.
```pycon
>>> import gc, collections.abc, typing
>>> gc.disable()
>>> try:
...     class Foo(collections.abc.Mapping, typing.Protocol): pass
... except TypeError:
...     pass
...
>>> x = collections.abc.Mapping.__subclasses__()
>>> x
[<class 'collections.abc.MutableMapping'>, <class '__main__.Foo'>]
>>> x[-1].__protocol_attrs__
set()
```
The fix is to raise `TypeError` before the class has actually been created, rather than during class initialisation.
# Why does this matter?
This might seem like an absurdly specific bug report, but it would be good to see it fixed. It was causing bizarre test failures in the `typing_extensions` backport. The issue was that we did this in one test method:
```py
def test_protocols_bad_subscripts(self):
    T = TypeVar('T')
    S = TypeVar('S')
    with self.assertRaises(TypeError):
        class P(typing.Mapping[T, S], Protocol[T]): pass
```
...And that final `P` class _wasn't_ being cleaned up by the garbage collector immediately after that test having been run. That then caused an unrelated test elsewhere to fail due to the fact that `isinstance([], collections.abc.Mapping)` was evaluating to `True`, and it was _very_ hard to figure out why.
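The `__subclasses__()` mechanism at the heart of the failure can be observed directly (a sketch; `MutableMapping` is always present as a real subclass):

```python
import collections.abc

# ABC isinstance() checks consult Mapping's subclass tree, so a
# half-constructed class lingering in __subclasses__() can poison
# unrelated checks until it is garbage-collected.
subclasses = collections.abc.Mapping.__subclasses__()
assert any(cls.__name__ == "MutableMapping" for cls in subclasses)
```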
<!-- gh-linked-prs -->
### Linked PRs
* gh-105281
* gh-105318
<!-- /gh-linked-prs -->
| 08756dbba647440803d2ba4545ba0ab2f0cdfe1c | 058b96053563bb5c413dc081eb8cc0916516525c |
python/cpython | python__cpython-105350 | # 3.12.0b1: static declaration follows non-static when importing internal/pycore_dict.h
# Bug report
I have a change within a [project](https://github.com/p403n1x87/echion) that includes a native extension which fails to compile with 3.12.0b1:
```
In file included from echion/coremodule.cc:22:
In file included from ./echion/dict.h:7:
In file included from /Users/gabriele/.pyenv/versions/3.12.0b1/include/python3.12/internal/pycore_dict.h:13:
In file included from /Users/gabriele/.pyenv/versions/3.12.0b1/include/python3.12/internal/pycore_runtime.h:16:
In file included from /Users/gabriele/.pyenv/versions/3.12.0b1/include/python3.12/internal/pycore_global_objects.h:11:
/Users/gabriele/.pyenv/versions/3.12.0b1/include/python3.12/internal/pycore_gc.h:84:19: error: static declaration of 'PyObject_GC_IsFinalized' follows non-static declaration
static inline int _PyGC_FINALIZED(PyObject *op) {
^
/Users/gabriele/.pyenv/versions/3.12.0b1/include/python3.12/cpython/objimpl.h:85:30: note: expanded from macro '_PyGC_FINALIZED'
# define _PyGC_FINALIZED(o) PyObject_GC_IsFinalized(o)
^
/Users/gabriele/.pyenv/versions/3.12.0b1/include/python3.12/objimpl.h:209:17: note: previous declaration is here
PyAPI_FUNC(int) PyObject_GC_IsFinalized(PyObject *);
```
I didn't look too deep into what changed here, but my guess would be that the macro should drop the `_` prefix, i.e.
```
# define PyGC_FINALIZED(o) PyObject_GC_IsFinalized(o)
```
since it seems that `PyObject_GC_IsFinalized` is part of the API. The same code compiles fine with older versions of Python.
# Your environment
MacOS Ventura 13.3.1
Python 3.12.0b1
<!-- gh-linked-prs -->
### Linked PRs
* gh-105350
* gh-105362
* gh-107348
* gh-107350
<!-- /gh-linked-prs -->
| 8ddf0dd264acafda29dc587ab8393387bb9a76ab | 49fe2e4af7993c124b98589ee608ed6ba2cee8e6 |
python/cpython | python__cpython-105364 | # 3.12: tokenize adds a newline when it is not there
# Bug report
tokenize adds a concrete newline character when it is not present. This breaks any sort of roundtrip source generation and pycodestyle's end-of-file checker.
here's an example of a file containing a single byte (generated via `echo -n 1 > t.py`)
```console
# hd -c t.py
00000000 31 |1|
0000000 1
0000001
# python3.12 -m tokenize t.py
0,0-0,0: ENCODING 'utf-8'
1,0-1,1: NUMBER '1'
1,1-1,2: NEWLINE '\n'
2,0-2,0: ENDMARKER ''
# python3.11 -m tokenize t.py
0,0-0,0: ENCODING 'utf-8'
1,0-1,1: NUMBER '1'
1,1-1,2: NEWLINE ''
2,0-2,0: ENDMARKER ''
```
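For reference, the same behaviour can be reproduced in-process without a file (a minimal sketch, not from the original report), using `tokenize.generate_tokens` on a string with no trailing newline:

```python
import io
import tokenize

src = "1"  # one byte, no trailing newline (like `echo -n 1 > t.py`)
toks = list(tokenize.generate_tokens(io.StringIO(src).readline))
for tok in toks:
    print(tokenize.tok_name[tok.type], repr(tok.string))
```

On 3.11 the synthesized NEWLINE token has an empty string; on 3.12 it carries a concrete `'\n'`.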
# Your environment
- CPython versions tested on: 3.12 dbd7d7c8e1
- Operating system and architecture: ubuntu 22.04 LTS x86_64
cc @pablogsal
<!-- gh-linked-prs -->
### Linked PRs
* gh-105364
* gh-105367
<!-- /gh-linked-prs -->
| c0a6ed39343b6dc355607fbff108c515e6c103bf | 0202aa002e06acef9aa55ace0d939103df19cadd |
python/cpython | python__cpython-106378 | # comprehensions iterating over `locals()` may break under some trace funcs in 3.12
It is already the case in all recent Python versions that the lazy-update behavior of the dictionary returned from `locals()` causes weird frame-introspection-sensitive semantics around lazy iteration of `locals()`. E.g. this code will work fine:
```
def f(a, b):
ret = {}
for k, v in locals().items():
ret[k] = v
return ret
f(1, 2)
```
But run this under a tracing function that simply accesses `frame.f_locals`:
```
import sys
def tracefunc(frame, *a, **kw):
frame.f_locals
return tracefunc
sys.settrace(tracefunc)
f(1, 2)
```
And suddenly you instead get `RuntimeError: dictionary changed size during iteration`.
The reason is that accessing `frame.f_locals` triggers a lazy update of the cached locals dictionary on the frame from the fast locals array. This is the same dictionary returned by `locals()` (it doesn't create a copy), so the tracing function has the side effect of adding the keys `k` and `v` to the locals dictionary while it is still being lazily iterated over. Without the access of `frame.f_locals`, nothing triggers this locals-dict lazy update after the initial call to `locals()`, so the iteration is fine.
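One defensive pattern (a suggestion, not part of the report) is to snapshot the mapping before iterating, so a trace function touching `frame.f_locals` mid-loop can no longer resize the dict being iterated:

```python
import sys

def f(a, b):
    ret = {}
    # dict(locals()) takes a copy up front; later lazy updates of the
    # frame's cached locals dict cannot invalidate this iteration.
    for k, v in dict(locals()).items():
        if k != "ret":
            ret[k] = v
    return ret

def tracefunc(frame, *args):
    frame.f_locals  # triggers the lazy locals-dict update
    return tracefunc

sys.settrace(tracefunc)
try:
    result = f(1, 2)  # no RuntimeError with the snapshot
finally:
    sys.settrace(None)
print(result)  # {'a': 1, 'b': 2}
```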
This is a pre-existing problem with the semantics of `locals()` generally, and there are proposals to fix it, e.g. [PEP 667](https://peps.python.org/pep-0667/).
What is new in Python 3.12 is that PEP 709 means comprehensions can newly be exposed to this same issue. Consider this version of the above function:
```
def f(a, b):
    return {k: v for k, v in locals().items()}
```
Under PEP 709, `k` and `v` are now part of the locals of `f` (albeit isolated and only existing during the execution of the comprehension), which means this version of `f` is now subject to the same issue as the previous for-loop version, if run under a tracing func that accesses `f_locals`.
<!-- gh-linked-prs -->
### Linked PRs
* gh-106378
* gh-106471
<!-- /gh-linked-prs -->
| 13aefd175e3c04529251f175c23cb3ed88451fd0 | 104d7b760fed18055e4f04e5da3ca619e28bfc81 |
python/cpython | python__cpython-105241 | # Compiler warning on missing prototypes (Mac)
./Modules/readline.c:1018:16: warning: a function declaration without a prototype is deprecated in all versions of C [-Wstrict-prototypes]
on_startup_hook()
^
void
./Modules/readline.c:1033:18: warning: a function declaration without a prototype is deprecated in all versions of C [-Wstrict-prototypes]
on_pre_input_hook()
^
void
2 warnings generated.
<!-- gh-linked-prs -->
### Linked PRs
* gh-105241
<!-- /gh-linked-prs -->
| a9305b5e80e414b8b9b2bd366e96b43add662d70 | 44bb03f856c30f709bb983f9830eafe914a742aa |
python/cpython | python__cpython-105239 | # Cannot call `issubclass()` against `typing.Protocol` on Python 3.12
# Bug report
On Python 3.11, you can do this:
```pycon
>>> import typing
>>> issubclass(object, typing.Protocol)
False
```
But on Python 3.12 (following the backport of https://github.com/python/cpython/commit/c05c31db8c9dfd708b9857bb57f8e5f3ce40266d), this raises:
```pycon
>>> import typing
>>> issubclass(object, typing.Protocol)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "C:\Users\alexw\coding\cpython\Lib\typing.py", line 1797, in __subclasscheck__
raise TypeError(
TypeError: Instance and class checks can only be used with @runtime_checkable protocols
```
The new behaviour doesn't make sense. That `TypeError` should only be triggered if a user is calling `issubclass()` against a _subclass_ of `typing.Protocol`. `typing.Protocol` itself should be exempted.
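On an interpreter with the fix applied, the intended split can be sanity-checked like this (my illustration; `MyProto` is a made-up protocol):

```python
import typing

# typing.Protocol itself is exempt from the runtime_checkable requirement:
assert issubclass(object, typing.Protocol) is False

# ...but a non-@runtime_checkable *subclass* of Protocol still raises:
class MyProto(typing.Protocol):
    def meth(self) -> int: ...

def raises_type_error(cls, proto):
    try:
        issubclass(cls, proto)
    except TypeError:
        return True
    return False

print(raises_type_error(object, MyProto))  # True
```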
<!-- gh-linked-prs -->
### Linked PRs
* gh-105239
* gh-105316
<!-- /gh-linked-prs -->
| cdfb201bfa35b7c50de5099c6d9078c806851d98 | 69d1245685cf95ddc678633e978a56673da64865 |
python/cpython | python__cpython-105252 | # Off-by-one memory error in a string fastsearch since 3.11
# Bug report
This bug happens in [Objects/stringlib/fastsearch.h:589](https://github.com/python/cpython/blob/v3.11.3/Objects/stringlib/fastsearch.h#L589) while matching the last symbol. In some cases it causes crashes, but it's a bit hard to reproduce since, in order for this to happen, the last symbol must be the last one in its memory page and the next page must either not be read-accessible or have a different, non-contiguous address.
The simplest script that reproduces the bug for me is:
```
import mmap
def bug():
with open("file.tmp", "wb") as f:
# this is the smallest size that triggers bug for me
f.write(bytes(8388608))
with open("file.tmp", "rb") as f:
with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as fm:
with open("/proc/self/maps", "rt") as f:
print(f.read())
# this triggers bug
res = fm.find(b"fo")
if __name__ == "__main__":
bug()
```
But since the result of this script depends on a file system, kernel, and perhaps even a moon phase :smile: , here's a much more reliable way to reproduce it:
```
import mmap
def read_maps():
with open("/proc/self/maps", "rt") as f:
return f.read()
def bug():
prev_map = frozenset(read_maps().split('\n'))
new_map = None
for i in range(0, 2049):
# guard mmap
with mmap.mmap(0, 4096 * (i + 1), flags=mmap.MAP_PRIVATE | mmap.MAP_ANONYMOUS, prot=0) as guard:
with mmap.mmap(0, 8388608 + 4096 * i, flags=mmap.MAP_ANONYMOUS | mmap.MAP_PRIVATE, prot=mmap.PROT_READ) as fm:
new_map = frozenset(read_maps().split('\n'))
for diff in new_map.difference(prev_map):
print(diff)
prev_map = new_map
# this crashes
fm.find(b"fo")
print("---")
if __name__ == "__main__":
bug()
```
This causes the bug across all Linux environments that I've tried. It uses a trick with an inaccessible memory region to increase the chances of this bug happening, and no files, to speed it up.
Here's some extra info from GDB:
```
Program received signal SIGSEGV, Segmentation fault.
0x000055555570ba81 in stringlib_default_find (s=0x7ffff6a00000 "", n=8388608, p=0x7ffff745a3e0 "fo", m=2, maxcount=-1, mode=1)
at Objects/stringlib/fastsearch.h:589
589 if (!STRINGLIB_BLOOM(mask, ss[i+1])) {
(gdb) pipe info proc mappings | grep -A 1 -B 1 file.tmp
0x555555cb4000 0x555555d66000 0xb2000 0x0 rw-p [heap]
0x7ffff6a00000 0x7ffff7200000 0x800000 0x0 r--s /home/slava/src/cpython/python_bug/file.tmp
0x7ffff7400000 0x7ffff7600000 0x200000 0x0 rw-p
(gdb) p &ss[i]
$1 = 0x7ffff71fffff ""
(gdb) p &ss[i + 1]
$2 = 0x7ffff7200000 <error: Cannot access memory at address 0x7ffff7200000>
(gdb) p i
$3 = 8388606
(gdb) p ss
$4 = 0x7ffff6a00001 ""
(gdb) p s
$5 = 0x7ffff6a00000 ""
```
# Your environment
- CPython 3.11.3
- OS: Linux 6.1 (but it should be OS independent)
I've also tried a bit modified version of a script on OS X, and it crashes there as well.
cc @sweeneyde (since you are the author of d01dceb88b2ca6def8a2284e4c90f89a4a27823f and 6ddb09f35b922a3bbb59e408a3ca7636a6938468).
<!-- gh-linked-prs -->
### Linked PRs
* gh-105252
* gh-106708
* gh-106710
<!-- /gh-linked-prs -->
| ab86426a3472ab68747815299d390b213793c3d1 | 2d43beec225a0495ffa0344f961b99717e5f1106 |
python/cpython | python__cpython-105230 | # Remove superinstructions
Superinstructions offer a small speedup, but their existence complicates things, mainly instrumentation and specialization.
The tier2 optimizer is likely to be simpler without superinstructions.
[This experiment](https://github.com/faster-cpython/benchmarking-public/tree/main/results/bm-20230526-3.13.0a0-0087a08) shows that combining the most common superinstructions into single instructions can outperform superinstructions, rendering superinstructions pointless.
We have already removed almost all superinstructions in specialization. We should remove them altogether.
<!-- gh-linked-prs -->
### Linked PRs
* gh-105230
* gh-105326
* gh-105703
<!-- /gh-linked-prs -->
| 06893403668961fdbd5d9ece18162eb3470fc8dd | e8ecb9ee6bec03d0c4490f3e7f1524e56e2f6a0f |
python/cpython | python__cpython-105747 | # ``tp_dict`` slot of static builtin types is ``NULL`` in 3.12, without mention in the changelog or an alternative
PyObjC contains some functionality that needs to walk the entire MRO of classes and access the ``tp_dict`` slot. As of Python 3.12 beta 1 this code crashes due to the slot being ``NULL`` on static builtin types.
The code in PyObjC walks the MRO to implement ``tp_getattro`` for the proxy of Objective-C classes to replicate ``PyObject_GenericGetAttr`` with some customisations, as well as in the implementation of an alternative to ``super`` (again with customisations in the attribute resolution path). All code paths only need read-only access to the ``tp_dict`` of static builtin types, although some code needs to be able to update the ``tp_dict`` of classes created by PyObjC.
There is currently no documented alternative for directly accessing ``tp_dict``, and because of this I've changed PyObjC to use the private API ``_PyType_GetDict``.
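For readers unfamiliar with the pattern, the C code in question is the moral equivalent of this pure-Python lookup (illustrative only; at the Python level, `vars(cls)` exposes the same mapping that the ``tp_dict`` slot holds in C):

```python
def find_in_mro(cls, name):
    """Resolve `name` by scanning each class dict along the MRO,
    roughly what PyObject_GenericGetAttr does with tp_dict in C."""
    for base in cls.__mro__:
        ns = vars(base)  # read-only mappingproxy over the class dict
        if name in ns:
            return ns[name]
    raise AttributeError(name)

# bool does not define __add__ itself; the scan finds it on int.
assert find_in_mro(bool, "__add__") is int.__add__
```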
Some alternatives I've looked into:
* Use ``PyObject_GetAttrString(someType, "__dict__")``: This increases the complexity of the PyObjC code bases and requires changes to long stable code:
* Needs to introduce a different code path for accessing the ``__dict__`` slot of non-PyObjC subclasses
* Needs to switch to the abstract Mapping API instead of the PyDict API, which likely has a performance impact
* Changes to reference counting invariants in the code
* Use ``_PyType_GetDict`` on Python 3.12 or later: This works, but introduces a reliance on a private CPython API that might disappear at any moment (even micro releases), for example by no longer exporting the function from shared libraries.
Proposal: Rename ``_PyType_GetDict`` to either ``PyUnstable_Type_GetDict`` or ``PyType_GetDict``, while keeping the current interface (e.g. returning a borrowed reference and never raising an exception).
This is a follow-up from #105020
@ericsnowcurrently
<!-- gh-linked-prs -->
### Linked PRs
* gh-105747
* gh-106600
<!-- /gh-linked-prs -->
| a840806d338805fe74a9de01081d30da7605a29f | 3e23fa71f43fb225ca29a931644d1100e2f4d6b8 |
python/cpython | python__cpython-105215 | # Use named constants for MAKE_FUNCTION oparg
The oparg for MAKE_FUNCTION is a mask of various flags (e.g. 0x8 means it has a closure). These are currently hardcoded as integers; let's use named constants instead.
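As an illustration of the proposal (the constant names here are hypothetical; of the values, only `0x08` for the closure bit is stated above, while the others follow the CPython 3.11 bytecode convention), decoding the mask could look like:

```python
# Hypothetical named constants for the MAKE_FUNCTION oparg bits.
MAKE_FUNCTION_DEFAULTS = 0x01     # tuple of positional defaults on stack
MAKE_FUNCTION_KWDEFAULTS = 0x02   # dict of keyword-only defaults on stack
MAKE_FUNCTION_ANNOTATIONS = 0x04  # annotations on stack
MAKE_FUNCTION_CLOSURE = 0x08      # tuple of cells (closure) on stack

def describe_oparg(oparg):
    flags = {
        MAKE_FUNCTION_DEFAULTS: "defaults",
        MAKE_FUNCTION_KWDEFAULTS: "kwdefaults",
        MAKE_FUNCTION_ANNOTATIONS: "annotations",
        MAKE_FUNCTION_CLOSURE: "closure",
    }
    return [name for bit, name in flags.items() if oparg & bit]

print(describe_oparg(0x09))  # ['defaults', 'closure']
```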
<!-- gh-linked-prs -->
### Linked PRs
* gh-105215
<!-- /gh-linked-prs -->
| 44bb03f856c30f709bb983f9830eafe914a742aa | 41de54378d54f7ffc38f07db4219e80f48c4249e |
python/cpython | python__cpython-105672 | # C API documentation index is laid out incorrectly
The last two lines in this snippet should be indented further than the line above them, as they are more specific.
(Alternatively, they can be removed because they don't add much).
<img width="401" alt="image" src="https://github.com/python/cpython/assets/1055913/5bee1ab5-4a6f-45a7-b0fb-de2710002c10">
<!-- gh-linked-prs -->
### Linked PRs
* gh-105672
* gh-105782
* gh-105786
<!-- /gh-linked-prs -->
| d32e8d6070057eb7ad0eb2f9d9f1efab38b2cff4 | 4a113e24a38e2537570e4d694f8e0c01354904c4 |
python/cpython | python__cpython-105231 | # Backslash in f-string format specifier gets escape
# Bug report
Between Python 3.12 alpha 7 and beta 1 the behaviour of format specifiers containing a backslash (e.g. `\n`, `\u2603`) in f-strings changed.
Previously, the `__format__` method would get e.g. the newline character, but it now receives a backslash and an `n` character (i.e. the two-character string `"\n"`).
The following script demonstrates this:
```python3
# issue.py
class F:
def __format__(self, format_spec):
print("spec:", repr(format_spec))
return format_spec.join(["a", "b", "c"])
print("out: ", repr((f"{F():\n}")))
print("out: ", repr((f"{F():\u2603}")))
print("out: ", repr((format(F(), "\n"))))
print("out: ", repr((format(F(), "\u2603"))))
```
```console
$ python3.11 issue.py
spec: '\n'
out: 'a\nb\nc'
spec: '☃'
out: 'a☃b☃c'
spec: '\n'
out: 'a\nb\nc'
spec: '☃'
out: 'a☃b☃c'
```
```console
$ python3.12 issue.py
spec: '\\n'
out: 'a\\nb\\nc'
spec: '\\u2603'
out: 'a\\u2603b\\u2603c'
spec: '\n'
out: 'a\nb\nc'
spec: '☃'
out: 'a☃b☃c'
```
In both cases the `format()` function works correctly - I only encounter the issue with f-strings.
I suspect this is related to the PEP 701 changes to f-strings, but I couldn't find anything in the discussion of that PEP to suggest this was a deliberate change.
# Your environment
<!-- Include as many relevant details as possible about the environment you experienced the bug in -->
- CPython versions tested on: Python 3.11.2, 3.12.0alpha7, 3.12.0beta1
- Operating system and architecture: Ubuntu 20.04 (AMD64)
<!-- gh-linked-prs -->
### Linked PRs
* gh-105231
* gh-105234
<!-- /gh-linked-prs -->
| 41de54378d54f7ffc38f07db4219e80f48c4249e | 4bfa01b9d911ce9358cf1a453bee15554f8e4c07 |
python/cpython | python__cpython-105197 | # Clean up peg_generator: Remove data, scripts and maintain them elsewhere
Since the peg generator has reached a certain stability (I don't see us adding any major new features to the generator in the near future), I think it'd be best if we remove the data and scripts directories from `Tools/peg_generator` and leave the bare minimum needed in the CPython repo. This will remove the maintenance burden of trying to keep them up-to-date.
We can continue to maintain them on [the pegen repo](https://github.com/we-like-parsers/pegen) and use them there, if we need them.
cc @pablogsal
<!-- gh-linked-prs -->
### Linked PRs
* gh-105197
<!-- /gh-linked-prs -->
| a241003d048f33c9072d47217aa6e28beb7ac54f | c67121ac6bf8ee36d79de92ef68fc3fde178d2a3 |
python/cpython | python__cpython-105232 | # `importlib.resources.abc.TraversableResources` says it is deprecated in favour of itself
# Documentation
https://docs.python.org/3.12/library/importlib.resources.abc.html#importlib.resources.abc.TraversableResources says it is deprecated in favour of itself
it should be documented here https://docs.python.org/3.12/library/importlib.html#module-importlib.abc
introduced in https://github.com/python/cpython/commit/10b12dd92ab6e29b573706dd6adb3d8eb63ae81b#r115987571
cc @jaraco @hugovk
<!-- gh-linked-prs -->
### Linked PRs
* gh-105232
* gh-109363
<!-- /gh-linked-prs -->
| 6c0ddca409c1ed27b11c70386cd6c88be5d00115 | 44d9a71ea246e7c3fb478d9be62c16914be6c545 |
python/cpython | python__cpython-105185 | # C API functions with no return value cannot report errors
As discussed in https://github.com/capi-workgroup/problems/issues/20, there are functions in the C API that do not have a way to report error (void return type).
Some of them actually set the error indicator in some cases, and the only way to know is to check ``PyErr_Occurred()``. In other cases the functions currently do not have error cases, but the lack of return value is still a problem in case we want to evolve them in the future and need to report errors.
We will use this issue to fix the straightforward ones, while more complicated ones may spawn their own issues.
<!-- gh-linked-prs -->
### Linked PRs
* gh-105185
* gh-105218
* gh-105219
* gh-105220
* gh-105221
* gh-105222
* gh-105223
* gh-105233
<!-- /gh-linked-prs -->
| ee26ca13a129da8cf549409d0a1b2e892ff2b4ec | 77d25795862f19c6e3d647b76cfb10d5ce1f149c |
python/cpython | python__cpython-105183 | # C API: Remove PyEval_AcquireLock() and PyEval_InitThreads() functions
Since 3.7, ``PyEval_ThreadsInitialized()`` always returns non-zero and calling ``PyEval_InitThreads()`` is useless: ``Py_Initialize()`` now always creates the GIL. These functions were deprecated in Python 3.9 by PR #18892 (commit b4698ecfdb526e0a9f5fa6ef0f8e1d8cca500203) of issue #84058. What's New in Python 3.9:
```
* The :c:func:`PyEval_InitThreads` and :c:func:`PyEval_ThreadsInitialized`
functions are now deprecated and will be removed in Python 3.11. Calling
:c:func:`PyEval_InitThreads` now does nothing. The :term:`GIL` is initialized
by :c:func:`Py_Initialize()` since Python 3.7.
(Contributed by Victor Stinner in :issue:`39877`.)
```
Well, the removal missed Python 3.11.
``PyEval_AcquireLock()`` and ``PyEval_ReleaseLock()`` don't update the current thread state and were deprecated in Python 3.2 by issue #55122:
> These two functions are very low-level and no good to use outside the interpreter core (actually, inside the core I don't think they are useful either). We should deprecate them and recommend the thread-state aware functions instead. See issue #44960 for an example confusion.
The commit ebeb90339de196d97ee197afad5e4e4a7f672a50 adds this entry to What's New in Python 3.2:
```
* The misleading functions :c:func:`PyEval_AcquireLock()` and
:c:func:`PyEval_ReleaseLock()` have been officially deprecated. The
thread-state aware APIs (such as :c:func:`PyEval_SaveThread()`
and :c:func:`PyEval_RestoreThread()`) should be used instead.
```
I propose to now remove these 4 deprecated functions.
In 2020, I already proposed to remove the PyEval_AcquireLock() and PyEval_ReleaseLock() functions in https://bugs.python.org/issue39998, but I closed my issue when I discovered that they were part of the stable ABI. This is a way to keep these functions in the stable ABI while removing them from the C API.
<!-- gh-linked-prs -->
### Linked PRs
* gh-105183
<!-- /gh-linked-prs -->
| ec0082ca460f6b5eaf987536d28d6bc252322307 | 9ab587b7146618866cee52c220aecf7bd5b44b02 |
python/cpython | python__cpython-105173 | # Minor functools.lru_cache in-code documentation issue
# Bug report
The in-code documentation for the lru_cache function states:
> If *typed* is True, arguments of different types will be cached separately.
For example, f(3.0) and f(3) will be treated as distinct calls with
distinct results.
Specifically for 3.0 and 3, even if typed is False, they are still treated as distinct calls, as `int` is fast-tracked.
The online documentation states correctly:
> (Some types such as str and int may be cached separately even when typed is false.)
I know the docstrings are seldom used, but if they already exists, at least they should be accurate as IDEs sometimes show them. Simply changing the example to `f(3.0)` and `f(Decimal("3.0"))` should suffice.
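The fast path is easy to observe (a quick demonstration of the behaviour described above, not from the report):

```python
from functools import lru_cache

@lru_cache(maxsize=None)  # typed defaults to False
def f(x):
    return x

f(3)
f(3.0)
# Despite typed=False, 3 and 3.0 create two distinct cache entries,
# because a single int (or str) argument is fast-tracked as its own key.
print(f.cache_info())
```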
# Your environment
- CPython versions tested on: 3.13 (dev), applies to all supported CPython versions
- Operating system and architecture: Mac OS Ventura
<!-- gh-linked-prs -->
### Linked PRs
* gh-105173
<!-- /gh-linked-prs -->
| f332594dd47947612e1e5d2faf287930552a5110 | a99b9d911e0f8cb11b3436bdd8eb649b15d01a50 |
python/cpython | python__cpython-105177 | # Annotated assignment in a `match` block does not generate `SETUP_ANNOTATIONS`
# Bug report
Compiling the following code:
```
match 0:
case 0:
x: int = 1
```
generates
```
1 0 LOAD_CONST 0 (0)
2 2 LOAD_CONST 0 (0)
4 COMPARE_OP 2 (==)
6 POP_JUMP_IF_FALSE 12 (to 24)
3 8 LOAD_CONST 1 (1)
10 STORE_NAME 0 (x)
12 LOAD_NAME 1 (int)
14 LOAD_NAME 2 (__annotations__)
16 LOAD_CONST 2 ('x')
18 STORE_SUBSCR
20 LOAD_CONST 3 (None)
22 RETURN_VALUE
2 >> 24 LOAD_CONST 3 (None)
26 RETURN_VALUE
```
There is no `SETUP_ANNOTATIONS` opcode generated despite the `__annotations__` access later on. (Found via https://github.com/google/pytype/issues/1435 - for some reason this does not generate a runtime error, but it does cause issues for tools like pytype.)
# Your environment
cpython 3.10
<!-- gh-linked-prs -->
### Linked PRs
* gh-105177
* gh-105313
* gh-105314
<!-- /gh-linked-prs -->
| 69d1245685cf95ddc678633e978a56673da64865 | 06893403668961fdbd5d9ece18162eb3470fc8dd |
python/cpython | python__cpython-105187 | # generator throw delivered to incorrect generator under trace
# Bug report
The following demo code has a parent generator `main()` that delegates via `yield from` to a subgenerator `Inner` whose `throw()` turns the thrown exception into `StopIteration("good")`, i.e. a return value of "good":
```python
class Inner:
def send(self, v):
return None
def __iter__(self):
return self
def __next__(self):
return self.send(None)
def throw(self, *exc_info):
raise StopIteration("good")
def main():
return (yield from Inner())
def run(coro):
coro.send(None)
try:
coro.throw(Exception)
except StopIteration as e:
print(e.value)
run(main())
```
when run with `python -m trace`, the exception is delivered to the outer generator main() instead of being suppressed
```
graingert@conscientious ~/projects/cpython main ./python -m trace --count -C . demo.py
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "/home/graingert/projects/cpython/Lib/trace.py", line 741, in <module>
main()
File "/home/graingert/projects/cpython/Lib/trace.py", line 729, in main
t.runctx(code, globs, globs)
File "/home/graingert/projects/cpython/Lib/trace.py", line 451, in runctx
exec(cmd, globals, locals)
File "demo.py", line 26, in <module>
run(main())
File "demo.py", line 21, in run
coro.throw(Exception)
File "demo.py", line 16, in main
return (yield from Inner())
^^^^^^^^^^^^^^^^^^
Exception
✘ graingert@conscientious ~/projects/cpython main ± ./python demo.py
good
```
The problem also occurs with the equivalent generator syntax, e.g.:
```python
def Inner():
try:
while True:
yield None
except:
return "good"
```
# Your environment
- CPython versions tested on: c05c31db8c9dfd708b9857bb57f8e5f3ce40266d 3.12b1
- Operating system and architecture: Ubuntu 22.04 x86_64
see also
https://github.com/urllib3/urllib3/issues/3049#issuecomment-1570452951
https://github.com/nedbat/coveragepy/issues/1635
<!-- gh-linked-prs -->
### Linked PRs
* gh-105187
* gh-105378
<!-- /gh-linked-prs -->
| 601ae09f0c8eda213b9050892f5ce9b91f0aa522 | ee26ca13a129da8cf549409d0a1b2e892ff2b4ec |
python/cpython | python__cpython-105157 | # C API: Deprecate Py_UNICODE type
Python 3.12 removed the last batch of C functions related to the old ``Py_UNICODE`` type. But the type itself is still part of the C API. It is still used in some areas of the Python code base where the ``wchar_t`` type should be used instead.
I propose to deprecate the ``Py_UNICODE`` type and schedule its removal in Python 3.15. This type should not be used since Python 3.3: [PEP 393](https://peps.python.org/pep-0393/).
<!-- gh-linked-prs -->
### Linked PRs
* gh-105157
* gh-105158
* gh-105161
* gh-105180
* gh-105379
<!-- /gh-linked-prs -->
| 8ed705c083e8e5ff37649d998a8b1524ec921519 | f332594dd47947612e1e5d2faf287930552a5110 |
python/cpython | python__cpython-105149 | # Make _PyASTOptimizeState internal to ast_opt.c
ast_opt.c includes pycore_compile.h only for ``_PyASTOptimizeState``, which could be defined in ast_opt.c instead.
We need to change the signature of ``_PyAST_Optimize`` to take two integers (optimization level and flags) instead of a ``_PyASTOptimizeState`` pointer. This will slightly simplify the call sites of ``_PyAST_Optimize``, where this struct is created just to hold those two ints and pass it to ``_PyAST_Optimize``.
<!-- gh-linked-prs -->
### Linked PRs
* gh-105149
<!-- /gh-linked-prs -->
| f990bb8b2d8ee900fe4b0775399f6ef4ca61bb3f | dd29ae26f89ba7db596127b6eba83bb3a45c167b |
python/cpython | python__cpython-105147 | # Update links at end of Windows installer
The last page of the Windows installer suggests emailing python-list@python.org. This probably still works, but is better replaced with a link to discuss.python.org these days. Specifically, https://discuss.python.org/c/users/7 is the right category to link to.
May as well update the downloads link to point directly to the downloads page.
<!-- gh-linked-prs -->
### Linked PRs
* gh-105147
* gh-105167
* gh-105168
<!-- /gh-linked-prs -->
| ed86e14b1672f32f0a31d72070e93d361ee0e2b4 | f990bb8b2d8ee900fe4b0775399f6ef4ca61bb3f |
python/cpython | python__cpython-105154 | # C API: Remove legacy API functions to configure Python initialization
The following C API functions were deprecated in Python 3.11 by issue #88279:
* ``PySys_AddWarnOptionUnicode()``
* ``PySys_AddWarnOption()``
* ``PySys_AddXOption()``
* ``PySys_HasWarnOptions()``
* ``PySys_SetArgvEx()``
* ``PySys_SetArgv()``
* ``PySys_SetPath()``
* ``Py_SetPath()``
* ``Py_SetProgramName()``
* ``Py_SetPythonHome()``
* ``Py_SetStandardStreamEncoding()``
* ``_Py_SetProgramFullPath()``
IMO it's now time to remove them: the new PyConfig API is available since Python 3.8: https://docs.python.org/dev/c-api/init_config.html
<!-- gh-linked-prs -->
### Linked PRs
* gh-105154
* gh-105179
<!-- /gh-linked-prs -->
| 424049cc1117d66dfa86196ee5f694c15b46ac6c | 8ed705c083e8e5ff37649d998a8b1524ec921519 |
python/cpython | python__cpython-105152 | # Strange interaction between `typing.Protocol` and unrelated `isinstance()` checks
On `main`, an `isinstance()` check against `Sized` works just the same as it does on Python 3.11:
```pycon
>>> from typing import Sized
>>> isinstance(1, Sized)
False
```
However! If you first subclass `Sized` like this, `TypeError` is raised on that `isinstance()` check!
```pycon
>>> from typing import Sized, Protocol
>>> class Foo(Sized, Protocol): pass
...
>>> isinstance(1, Sized)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "C:\Users\alexw\coding\cpython\Lib\typing.py", line 1153, in __instancecheck__
return self.__subclasscheck__(type(obj))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\alexw\coding\cpython\Lib\typing.py", line 1428, in __subclasscheck__
return issubclass(cls, self.__origin__)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "<frozen abc>", line 123, in __subclasscheck__
File "C:\Users\alexw\coding\cpython\Lib\typing.py", line 1793, in __subclasscheck__
return super().__subclasscheck__(other)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "<frozen abc>", line 123, in __subclasscheck__
File "C:\Users\alexw\coding\cpython\Lib\typing.py", line 1876, in _proto_hook
raise TypeError("Instance and class checks can only be used with"
TypeError: Instance and class checks can only be used with @runtime_checkable protocols
```
This was originally reported by @vnmabus in https://github.com/python/typing_extensions/issues/207.
Note that (because of the `abc`-module cache), this _doesn't_ reproduce if you do an `isinstance()` check _before_ subclassing `typing.Sized`:
```pycon
>>> import typing
>>> isinstance(1, typing.Sized)
False
>>> class Foo(typing.Sized, typing.Protocol): pass
...
>>> isinstance(1, typing.Sized)
False
```
<!-- gh-linked-prs -->
### Linked PRs
* gh-105152
* gh-105159
* gh-105160
<!-- /gh-linked-prs -->
| c05c31db8c9dfd708b9857bb57f8e5f3ce40266d | df396b59af9d50892e5e30463300e8458cb84263 |
python/cpython | python__cpython-105141 | # remove unused argument of _PyErr_ChainStackItem
The argument of _PyErr_ChainStackItem is unused, undocumented and untested, so it should be removed.
<!-- gh-linked-prs -->
### Linked PRs
* gh-105141
<!-- /gh-linked-prs -->
| 60f8117d0c685c2923b7cb17b725b67cd41e8410 | ede89af605b1c0442353435ad22195c16274f65d |
python/cpython | python__cpython-105112 | # Remove deprecated Py_TRASHCAN_SAFE_BEGIN/ Py_TRASHCAN_SAFE_END
Py_TRASHCAN_SAFE_BEGIN/ Py_TRASHCAN_SAFE_END were [deprecated in 3.11](https://github.com/python/cpython/pull/27693/), to be replaced by Py_TRASHCAN_BEGIN/Py_TRASHCAN_END which were added in 3.8. The deprecated macros are flawed (can lead to a crash) so we should not delay their removal.
For more history, see https://mail.python.org/archives/list/python-dev@python.org/message/KWFX6XX3HMZBQ2BYBVL7G74AIOPWO66Y/.
<!-- gh-linked-prs -->
### Linked PRs
* gh-105112
<!-- /gh-linked-prs -->
| fbc9d0dbb22549bac2706f61f3ab631239d357b4 | 0430e97097a8f852aea21669e7f4203d028114f9 |
python/cpython | python__cpython-105108 | # C API: Remove deprecate PyEval_CallObject() function
The Python C API has many functions to call a function or a method: https://docs.python.org/dev/c-api/call.html
The **PyEval** variants were [deprecated in Python 3.9](https://github.com/python/cpython/issues/73734). I propose to now remove them:
* ``PyEval_CallObject()``
* ``PyEval_CallObjectWithKeywords()``
* ``PyEval_CallFunction()``
* ``PyEval_CallMethod()``
There are replacement functions like PyObject_Call() and PyObject_CallFunction() which have existed since at least Python 3.0 and so are backward compatible.
<!-- gh-linked-prs -->
### Linked PRs
* gh-105108
* gh-105181
<!-- /gh-linked-prs -->
| 579c41c10224a004c3e89ed9088771325c1c1a98 | adccff3b3f9fbdb58cb4b8fde92456e6dd078af0 |
python/cpython | python__cpython-105106 | # TypeError when using ctypes.BigEndianUnion nested in ctypes.BigEndianStructure on a little endian system (or vice versa)
# Bug report
The classes [ctypes.BigEndianUnion](https://docs.python.org/3/library/ctypes.html#ctypes.BigEndianUnion) and [ctypes.LittleEndianUnion](https://docs.python.org/3/library/ctypes.html#ctypes.LittleEndianUnion) were added in Python 3.11. When a `Union` is nested in a `Structure` class whose endianness is the opposite of the system's, the `Union` class can't be used, as shown below:
```py
# coding: utf-8
import sys
import ctypes as c
assert sys.byteorder == "little"
"""
struct MyStruct {
unsigned char a;
unsigned char b;
};
struct MainStruct {
union {
struct MyStruct my_struct;
short field_ab;
} my_union;
unsigned char foo;
};
"""
class MyStruct(c.BigEndianStructure):
_fields_ = [("a", c.c_byte), ("b", c.c_byte)]
class MyUnion(c.BigEndianUnion):
_fields_ = [("my_struct", MyStruct), ("ab", c.c_short)]
class MainStruct(c.BigEndianStructure):
_fields_ = [("my_union", MyUnion), ("foo", c.c_byte)]
```
The following traceback is given by the interpretor:
```
TypeError Traceback (most recent call last)
Cell In[1], line 28
24 class MyUnion(c.BigEndianUnion):
25 _fields_ = [("my_struct", MyStruct), ("ab", c.c_short)]
---> 28 class MainStruct(c.BigEndianStructure):
29 _fields_ = [("my_union", MyUnion), ("foo", c.c_byte)]
File ~\AppData\Local\Programs\Python\Python311\Lib\ctypes\_endian.py:31, in _swapped_meta.__setattr__(self, attrname, value)
29 typ = desc[1]
30 rest = desc[2:]
---> 31 fields.append((name, _other_endian(typ)) + rest)
32 value = fields
33 super().__setattr__(attrname, value)
File ~\AppData\Local\Programs\Python\Python311\Lib\ctypes\_endian.py:21, in _other_endian(typ)
19 if issubclass(typ, Structure):
20 return typ
---> 21 raise TypeError("This type does not support other endian: %s" % typ)
TypeError: This type does not support other endian: <class '__main__.MyUnion'>
```
# Your environment
- CPython versions tested on: Interpreter compiled from the main branch in debug mode and also Python 3.11 (coming from the official release)
- Operating system and architecture: Windows 10 x64
<!-- gh-linked-prs -->
### Linked PRs
* gh-105106
* gh-114204
* gh-114205
<!-- /gh-linked-prs -->
| 0b541f64c472976b2fee1ec9919bc7b02a798242 | 33b47a2c2853066b549f242065f6c2e12e18b33b |
python/cpython | python__cpython-105098 | # wave: Deprecate getmark(), setmark() and getmarkers() methods
The ``wave`` module has getmark(), setmark() and getmarkers() methods (in the Wave_read and Wave_write classes) which exist only for compatibility with the ``aifc`` module, which was removed in Python 3.13 (PR #104933). I propose to remove these methods.
* getmark() and setmark() always raise an exception, and getmarkers() always returns None; none of them do anything else, so these methods are not very useful.
* The Wave_write methods getmark(), setmark() and getmarkers() are not documented
* None of these methods are tested
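For illustration, a minimal sketch (guarded with `hasattr` since these stubs are slated for removal) showing that the methods do nothing useful; the in-memory buffer is just a stand-in for a real file:

```python
import io
import wave

buf = io.BytesIO()
w = wave.open(buf, "wb")
w.setnchannels(1)
w.setsampwidth(2)
w.setframerate(8000)

# On versions that still have these aifc-compatibility stubs:
if hasattr(w, "getmarkers"):
    assert w.getmarkers() is None   # always None, nothing else
    try:
        w.setmark(1, 0, b"start")   # always raises wave.Error
    except wave.Error:
        pass

w.writeframes(b"\x00\x00")
w.close()
```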
<!-- gh-linked-prs -->
### Linked PRs
* gh-105098
* gh-105136
* gh-105138
* gh-105155
<!-- /gh-linked-prs -->
| 03ad6624c2b4a30cccf6d5b7e2b9999e104444ae | 58a2e0981642dcddf49daa776ff68a43d3498cee |
python/cpython | python__cpython-105092 | # stable_abi.py --all fails on Windows
The `--all` command [from the devguide](https://devguide.python.org/developer-workflow/c-api/#adding-a-new-definition-to-the-limited-api) fails on Windows.
`--all` should only enable `unixy_check` on Unixy platforms.
<!-- gh-linked-prs -->
### Linked PRs
* gh-105092
* gh-105133
<!-- /gh-linked-prs -->
| 0656d23d82cd5b88e578a26c65dd4a64414c833b | b7aadb4583b040ddc8564896b91f4e5e571c82d6 |
python/cpython | python__cpython-105404 | # minimum_version mismatch
# Documentation
The documentation states "The SSL context created above will only allow TLSv1.2 and later..." However, the example code sets the minimum version to TLS 1.3 (not 1.2 as stated).
I believe [this line](https://github.com/python/cpython/blob/c39500db5254c9efee9937f951f58d87ee14cf2f/Doc/library/ssl.rst?plain=1#LL2687C20-L2687C20) should be changed to:
` >>> client_context.minimum_version = ssl.TLSVersion.TLSv1_2`
I am happy to create a pull request, if that is helpful.
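For context, the example with the proposed correction applied would read roughly like this (the variable name follows the docs; the context creation line here is a stand-in for the surrounding example):

```python
import ssl

client_context = ssl.create_default_context(ssl.Purpose.SERVER_AUTH)
# The prose says "TLSv1.2 and later", so the minimum should be 1.2, not 1.3:
client_context.minimum_version = ssl.TLSVersion.TLSv1_2
assert client_context.minimum_version == ssl.TLSVersion.TLSv1_2
```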
<!-- gh-linked-prs -->
### Linked PRs
* gh-105404
* gh-107038
* gh-107039
* gh-107040
* gh-107273
<!-- /gh-linked-prs -->
| e5252c6127ad788b39221982964f935bd1513028 | ae8b114c5b9d211f47bf521fcfdbf40ef72a1bac |
python/cpython | python__cpython-105228 | # test_create_directory_with_write in test_zipfile fails in AIX
test_create_directory_with_write in test_zipfile fails in AIX with the below message
======================================================================
FAIL: test_create_directory_with_write (test.test_zipfile.test_core.TestWithDirectory.test_create_directory_with_write)
Traceback (most recent call last):
File "/cpython/Lib/test/test_zipfile/test_core.py", line 2879, in test_create_directory_with_write
self.assertEqual(zinfo.external_attr, (mode << 16) | 0x10)
AssertionError: 1106051088 != 5401018384
----------------------------------------------------------------------
The reason is that on AIX the stat call returns an st_mode of 240700 (in octal) for a directory with mode 700, while Linux returns 40700. The extra "2" seems to be related to the journaled filesystem. So the mode should undergo a bitwise AND with 0xFFFF, as is done in **test_write_dir** in **test_core.py**, to stay in sync with **zinfo.external_attr**.
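A small sketch of the proposed masking; the 0o240700 value is the AIX st_mode from the report, and the 0xFFFF mask drops the AIX-specific high bits:

```python
aix_mode = 0o240700    # st_mode on AIX for a directory with mode 700
linux_mode = 0o40700   # st_mode on Linux for the same directory

# Masking with 0xFFFF removes the AIX-specific high bits:
assert aix_mode & 0xFFFF == linux_mode

# external_attr then matches the (mode << 16) | 0x10 form zipfile expects:
external_attr = ((aix_mode & 0xFFFF) << 16) | 0x10
assert external_attr == (linux_mode << 16) | 0x10
```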
<!-- gh-linked-prs -->
### Linked PRs
* gh-105228
* gh-114860
* gh-114861
<!-- /gh-linked-prs -->
| 4dbb198d279a06fed74ea4c38f93d658baf38170 | 0bf42dae7e73febc76ea96fd58af6b765a12b8a7 |
python/cpython | python__cpython-105056 | # RFE: Allow using setuptools and wheel wheels from a custom directory for tests
<!--
If you're new to Python and you're not sure whether what you're experiencing is a bug, the CPython issue tracker is not
the right place to seek help. Consider the following options instead:
- reading the Python tutorial: https://docs.python.org/3/tutorial/
- posting in the "Users" category on discuss.python.org: https://discuss.python.org/c/users/7
- emailing the Python-list mailing list: https://mail.python.org/mailman/listinfo/python-list
- searching our issue tracker (https://github.com/python/cpython/issues) to see if
your problem has already been reported
-->
# Bug report
In Fedora, we would like to use our wheels instead of the ones bundled/vendored in Lib/test.
Ideally, we could reuse the `--with-wheel-pkg-dir` configure option for this.
# Your environment
<!-- Include as many relevant details as possible about the environment you experienced the bug in -->
- CPython versions tested on: 3.12.0b1
- Operating system and architecture: Fedora Linux, all architectures
<!--
You can freely edit this text. Remove any lines you believe are unnecessary.
-->
This is an issue for https://github.com/python/cpython/pull/105056
<!-- gh-linked-prs -->
### Linked PRs
* gh-105056
* gh-105424
<!-- /gh-linked-prs -->
| bd98b65e974b7a1e086a51e7b55131582f7a0491 | cda1bd3c9d3b2cecdeeba0c498cd2df83fbdb535 |
python/cpython | python__cpython-105217 | # The help function shows incorrect signature for subclass
# Bug report
With the following class hierarchy:
```py
class A0:
def __new__(cls, *args, **kw):
return super().__new__(cls)
def __init__(self, *args, **kw):
super().__init__()
class A1(A0):
def __init__(self, a, b):
super().__init__()
self.a = a
self.b = b
class A2(A1):
c = None
```
`help(A2)` shows the wrong signature for instantiating A2 as `A2(*args, **kw)` instead of the expected `A2(a, b)`, despite the fact that it shows the correct signature for `__init__`:
```
Help on class A2 in module __main__:
class A2(A1)
| A2(*args, **kw)
|
| Method resolution order:
| A2
| A1
| A0
| builtins.object
|
| Data and other attributes defined here:
|
| c = None
|
| ----------------------------------------------------------------------
| Methods inherited from A1:
|
| __init__(self, a, b)
| Initialize self. See help(type(self)) for accurate signature.
|
| ----------------------------------------------------------------------
| Static methods inherited from A0:
|
| __new__(cls, *args, **kw)
| Create and return a new object. See help(type) for accurate signature.
|
...
```
Note that `help(A1)` works correctly and shows the correct signature as `A1(a, b)`:
```
Help on class A1 in module __main__:
class A1(A0)
| A1(a, b)
|
| Method resolution order:
| A1
| A0
| builtins.object
|
| Methods defined here:
|
| __init__(self, a, b)
| Initialize self. See help(type(self)) for accurate signature.
|
| ----------------------------------------------------------------------
| Static methods inherited from A0:
|
| __new__(cls, *args, **kw)
| Create and return a new object. See help(type) for accurate signature.
|
...
```
This doesn't seem to be an issue if `__new__` is not defined on A0, or if A1 redefines `__new__` with the same signature as `__init__`.
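The behaviour can be checked directly with `inspect.signature()`, which `help()` relies on; a minimal sketch of the working `A1` case from above:

```python
import inspect

class A0:
    def __new__(cls, *args, **kw):
        return super().__new__(cls)
    def __init__(self, *args, **kw):
        super().__init__()

class A1(A0):
    def __init__(self, a, b):
        super().__init__()
        self.a = a
        self.b = b

# A1 defines its own __init__, so its signature resolves correctly:
assert str(inspect.signature(A1)) == "(a, b)"
```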
# Your environment
<!-- Include as many relevant details as possible about the environment you experienced the bug in -->
- CPython versions tested on: 3.11.3+
- Operating system and architecture: Linux x86
<!-- gh-linked-prs -->
### Linked PRs
* gh-105217
* gh-105257
* gh-105274
<!-- /gh-linked-prs -->
| 9ad199ba36791711f596393ca9a20dbf118ef858 | 70dc2fb9732ba3848ad3ae511a9d3195b1378915 |
python/cpython | python__cpython-105078 | # `test_tkinter` fails on refleak run due to `support` confusion
# Bug report
When running the refleak test runner (`-R`), `test_tkinter` fails on the second run due to `AttributeError: module 'test.test_tkinter.support' has no attribute 'load_package_tests'`. This is because `test_tkinter` has its own `support` module, which when imported overwrites the `support` attribute set in `Lib/test/test_tkinter/__init__.py`. That module should instead import specific symbols from `test.support` and use them directly.
# Your environment
- CPython versions tested on: `main`, `3.12`
- Operating system and architecture: Windows, Linux, any other
<!-- gh-linked-prs -->
### Linked PRs
* gh-105078
* gh-105079
<!-- /gh-linked-prs -->
| 5454db4ace66018179f034fbffcea8d791d66a98 | d593074494b39a3d51e67a4e57189c530867a7ff |
python/cpython | python__cpython-105072 | # Add PyUnstable_Exc_PrepReraiseStar
As requested [here](https://github.com/capi-workgroup/problems/issues/40), it would help Cython to have the ``_PyExc_PrepReraiseStar`` function exposed in the C API. Since this is an implementation detail (of the ``except*`` construct), it can only be exposed in the unstable API.
As per the [dev guide](https://devguide.python.org/developer-workflow/c-api/#unstable-c-api), we can still do this in 3.12.
<!-- gh-linked-prs -->
### Linked PRs
* gh-105072
* gh-105095
* gh-105097
* gh-105105
<!-- /gh-linked-prs -->
| b7aadb4583b040ddc8564896b91f4e5e571c82d6 | bd98b65e974b7a1e086a51e7b55131582f7a0491 |
python/cpython | python__cpython-105070 | # Consider not consuming all the buffer in one go in the tokenizer module
It seems that some tools were relying on the implementation detail that the `readline`-like callable provided to functions in the `tokenize` module is called as tokens are emitted, rather than consumed in one go. Although this was never part of the contract and technically we don't need to change anything, calling it incrementally would make the implementation more efficient (we don't need to hold the entire input in memory at once) and we won't break these tools.
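For reference, a sketch of the `readline`-based API in question; some tools pass a callable like this and observe when it is invoked relative to token emission (the call counter here is only for illustration):

```python
import io
import tokenize

calls = []
src = io.StringIO("a = 1\nb = 2\n")

def readline():
    calls.append(src.tell())  # record each time the tokenizer asks for input
    return src.readline()

tokens = list(tokenize.generate_tokens(readline))
assert tokens[0].string == "a"
assert calls  # readline was invoked at least once
```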
<!-- gh-linked-prs -->
### Linked PRs
* gh-105070
* gh-105119
<!-- /gh-linked-prs -->
| 9216e69a87d16d871625721ed5a8aa302511f367 | 2ea34cfb3a21182b4d16f57dd6c1cfce46362fe2 |
python/cpython | python__cpython-105060 | # Documentation of `timeit.Timer.timeit` should mention that the return value depends on the timer
# Documentation
At the moment, it is as follows.
https://github.com/python/cpython/blob/cb4615fd43e678fe44e9aeb6a486475a05b492e7/Lib/timeit.py#L164-L169
The return value may not always be a float measured in seconds. If a different `timer` function is used:
```
timeit.timeit(timer=time.perf_counter_ns)
```
then it is an integer measured in nanoseconds.
Not entirely sure how to document it unambiguously, but perhaps it can be reworded to say something like: '… as a number measured in the time unit returned by the `timer` argument of the constructor (by default: a float measured in seconds).' Or similar?
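A quick check of the behaviour described above:

```python
import time
import timeit

# With the default timer (time.perf_counter), the result is float seconds:
assert isinstance(timeit.timeit("pass", number=100), float)

# With a nanosecond timer, the result is an int measured in nanoseconds:
assert isinstance(timeit.timeit("pass", timer=time.perf_counter_ns, number=100), int)
```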
<!-- gh-linked-prs -->
### Linked PRs
* gh-105060
* gh-108534
* gh-108535
<!-- /gh-linked-prs -->
| 7096a2be33619dc02c06a6dc30aac414a9eba462 | 1ac64237e6ce965064451ed57ae37271aeb9fbd3 |
python/cpython | python__cpython-105061 | # SyntaxError: unmatched ')' with whitespace in lambda?
# Bug report
Testing on tip of 3.12 which includes the fix for #105013 (thanks!), I get the following difference in behaviour with Python 3.11 vs tip of Python 3.12:
```python
import inspect
from hypothesis.internal.reflection import extract_lambda_source
from hypothesis.strategies import just
#from hypothesis.strategies import just, one_of
# This variant doesn't trigger a TypeError mid-callback, but both variants get the same final inspect error
#one_of_nested_strategy_with_filter = one_of(
# just(0),
# just(1),
# one_of(just(2), just(3), one_of(just(4), just(5), one_of(just(6), just(7)))),
#).filter(lambda x: x % 2 == 0)
#x = get_pretty_function_description(one_of_nested_strategy_with_filter)
#print(inspect.getsource(x))
one_of_nested_strategy_with_filter = (
just(0)
).filter(lambda x: x % 2 == 0)
x = extract_lambda_source(one_of_nested_strategy_with_filter)
```
With Python 3.12, I get:
```
Traceback (most recent call last):
File "/usr/lib/python3.12/inspect.py", line 1241, in getblock
for _token in tokens:
File "/usr/lib/python3.12/tokenize.py", line 450, in _tokenize
for token in _generate_tokens_from_c_tokenizer(source, extra_tokens=True):
File "/usr/lib/python3.12/tokenize.py", line 537, in _generate_tokens_from_c_tokenizer
for info in c_tokenizer.TokenizerIter(source, extra_tokens=extra_tokens):
File "<string>", line 1
).filter(lambda x: x % 2 == 0)
^
SyntaxError: unmatched ')'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/tmp/foo2.py", line 10, in <module>
x = extract_lambda_source(one_of_nested_strategy_with_filter)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.12/site-packages/hypothesis/internal/reflection.py", line 305, in extract_lambda_source
sig = inspect.signature(f)
^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.12/inspect.py", line 3326, in signature
return Signature.from_callable(obj, follow_wrapped=follow_wrapped,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.12/inspect.py", line 3070, in from_callable
return _signature_from_callable(obj, sigcls=cls,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.12/inspect.py", line 2484, in _signature_from_callable
raise TypeError('{!r} is not a callable object'.format(obj))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.12/site-packages/hypothesis/strategies/_internal/misc.py", line 39, in __repr__
suffix = "".join(
^^^^^^^^
File "/usr/lib/python3.12/site-packages/hypothesis/strategies/_internal/misc.py", line 40, in <genexpr>
f".{name}({get_pretty_function_description(f)})"
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.12/site-packages/hypothesis/internal/reflection.py", line 432, in get_pretty_function_description
return extract_lambda_source(f)
^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.12/site-packages/hypothesis/internal/reflection.py", line 312, in extract_lambda_source
source = inspect.getsource(f)
^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.12/inspect.py", line 1282, in getsource
lines, lnum = getsourcelines(object)
^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.12/inspect.py", line 1274, in getsourcelines
return getblock(lines[lnum:]), lnum + 1
^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.12/inspect.py", line 1248, in getblock
_, *_token_info = _token
^^^^^^
UnboundLocalError: cannot access local variable '_token' where it is not associated with a value
```
But with Python 3.11, I get:
```
Traceback (most recent call last):
File "/tmp/foo2.py", line 10, in <module>
x = extract_lambda_source(one_of_nested_strategy_with_filter)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.11/site-packages/hypothesis/internal/reflection.py", line 305, in extract_lambda_source
sig = inspect.signature(f)
^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.11/inspect.py", line 3279, in signature
return Signature.from_callable(obj, follow_wrapped=follow_wrapped,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.11/inspect.py", line 3027, in from_callable
return _signature_from_callable(obj, sigcls=cls,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.11/inspect.py", line 2447, in _signature_from_callable
raise TypeError('{!r} is not a callable object'.format(obj))
TypeError: just(0).filter(lambda x: <unknown>) is not a callable object
```
If I change the program to drop the whitespace, it works in 3.12 too:
```
import inspect
from hypothesis.internal.reflection import extract_lambda_source
from hypothesis.strategies import just
one_of_nested_strategy_with_filter = (just(0)).filter(lambda x: x % 2 == 0)
x = extract_lambda_source(one_of_nested_strategy_with_filter)
```
I noticed this with a test failure in priority (the output is huge, so just a snippet here)
```
ERROR collecting test/test_priority.py ____________________________________________________________________________________
/usr/lib/python3.12/inspect.py:1241: in getblock
for _token in tokens:
[...]
/usr/lib/python3.12/tokenize.py:537: in _generate_tokens_from_c_tokenizer
for info in c_tokenizer.TokenizerIter(source, extra_tokens=extra_tokens):
E File "<string>", line 1
E ).map(lambda blocked: (blocked, active_readme_streams_from_filter(blocked)))
E ^
E SyntaxError: unmatched ')'
c_tokenizer = <module '_tokenize' (built-in)>
extra_tokens = True
source = (').map(lambda blocked: (blocked, '
'active_readme_streams_from_filter(blocked)))\n'
[...]
ERROR test/test_priority.py - UnboundLocalError: cannot access local variable '_token' where it is not associated with a value
```
and a perhaps more useful test failure in hypothesis, which priority uses:
```
test_one_of_flattens_filter_branches_2 ____________________________________________________________________________________
[gw14] linux -- Python 3.12.0 /var/tmp/portage/dev-python/hypothesis-6.75.6/work/hypothesis-hypothesis-python-6.75.6/hypothesis-python-python3_12/install/usr/bin/python3.12
Traceback (most recent call last):
File "/usr/lib/python3.12/inspect.py", line 1241, in getblock
for _token in tokens:
File "/usr/lib/python3.12/tokenize.py", line 450, in _tokenize
for token in _generate_tokens_from_c_tokenizer(source, extra_tokens=True):
File "/usr/lib/python3.12/tokenize.py", line 537, in _generate_tokens_from_c_tokenizer
for info in c_tokenizer.TokenizerIter(source, extra_tokens=extra_tokens):
File "<string>", line 1
).filter(lambda x: x % 2 == 0)
^
SyntaxError: unmatched ')'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/var/tmp/portage/dev-python/hypothesis-6.75.6/work/hypothesis-hypothesis-python-6.75.6/hypothesis-python/tests/quality/test_discovery_ability.py", line 101, in run_test
runner.run()
File "/var/tmp/portage/dev-python/hypothesis-6.75.6/work/hypothesis-hypothesis-python-6.75.6/hypothesis-python-python3_12/install/usr/lib/python3.12/site-packages/hypothesis/internal/conjecture/engine.py", line 474, in run
self._run()
File "/var/tmp/portage/dev-python/hypothesis-6.75.6/work/hypothesis-hypothesis-python-6.75.6/hypothesis-python-python3_12/install/usr/lib/python3.12/site-packages/hypothesis/internal/conjecture/engine.py", line 880, in _run
self.generate_new_examples()
File "/var/tmp/portage/dev-python/hypothesis-6.75.6/work/hypothesis-hypothesis-python-6.75.6/hypothesis-python-python3_12/install/usr/lib/python3.12/site-packages/hypothesis/internal/conjecture/engine.py", line 684, in generate_new_examples
minimal_example = self.cached_test_function(
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/var/tmp/portage/dev-python/hypothesis-6.75.6/work/hypothesis-hypothesis-python-6.75.6/hypothesis-python-python3_12/install/usr/lib/python3.12/site-packages/hypothesis/internal/conjecture/engine.py", line 1065, in cached_test_function
self.test_function(data)
File "/var/tmp/portage/dev-python/hypothesis-6.75.6/work/hypothesis-hypothesis-python-6.75.6/hypothesis-python-python3_12/install/usr/lib/python3.12/site-packages/hypothesis/internal/conjecture/engine.py", line 209, in test_function
self.__stoppable_test_function(data)
File "/var/tmp/portage/dev-python/hypothesis-6.75.6/work/hypothesis-hypothesis-python-6.75.6/hypothesis-python-python3_12/install/usr/lib/python3.12/site-packages/hypothesis/internal/conjecture/engine.py", line 185, in __stoppable_test_function
self._test_function(data)
File "/var/tmp/portage/dev-python/hypothesis-6.75.6/work/hypothesis-hypothesis-python-6.75.6/hypothesis-python/tests/quality/test_discovery_ability.py", line 79, in test_function
value = data.draw(specifier)
^^^^^^^^^^^^^^^^^^^^
File "/var/tmp/portage/dev-python/hypothesis-6.75.6/work/hypothesis-hypothesis-python-6.75.6/hypothesis-python-python3_12/install/usr/lib/python3.12/site-packages/hypothesis/internal/conjecture/data.py", line 956, in draw
return strategy.do_draw(self)
^^^^^^^^^^^^^^^^^^^^^^
File "/var/tmp/portage/dev-python/hypothesis-6.75.6/work/hypothesis-hypothesis-python-6.75.6/hypothesis-python-python3_12/install/usr/lib/python3.12/site-packages/hypothesis/strategies/_internal/strategies.py", line 942, in do_draw
result = self.do_filtered_draw(data)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/var/tmp/portage/dev-python/hypothesis-6.75.6/work/hypothesis-hypothesis-python-6.75.6/hypothesis-python-python3_12/install/usr/lib/python3.12/site-packages/hypothesis/strategies/_internal/strategies.py", line 956, in do_filtered_draw
value = data.draw(self.filtered_strategy)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/var/tmp/portage/dev-python/hypothesis-6.75.6/work/hypothesis-hypothesis-python-6.75.6/hypothesis-python-python3_12/install/usr/lib/python3.12/site-packages/hypothesis/internal/conjecture/data.py", line 951, in draw
return strategy.do_draw(self)
^^^^^^^^^^^^^^^^^^^^^^
File "/var/tmp/portage/dev-python/hypothesis-6.75.6/work/hypothesis-hypothesis-python-6.75.6/hypothesis-python-python3_12/install/usr/lib/python3.12/site-packages/hypothesis/strategies/_internal/strategies.py", line 666, in do_draw
return data.draw(strategy)
^^^^^^^^^^^^^^^^^^^
File "/var/tmp/portage/dev-python/hypothesis-6.75.6/work/hypothesis-hypothesis-python-6.75.6/hypothesis-python-python3_12/install/usr/lib/python3.12/site-packages/hypothesis/internal/conjecture/data.py", line 951, in draw
return strategy.do_draw(self)
^^^^^^^^^^^^^^^^^^^^^^
File "/var/tmp/portage/dev-python/hypothesis-6.75.6/work/hypothesis-hypothesis-python-6.75.6/hypothesis-python-python3_12/install/usr/lib/python3.12/site-packages/hypothesis/strategies/_internal/strategies.py", line 532, in do_draw
data.mark_invalid(f"Aborted test because unable to satisfy {self!r}")
^^^^^^^^
File "/var/tmp/portage/dev-python/hypothesis-6.75.6/work/hypothesis-hypothesis-python-6.75.6/hypothesis-python-python3_12/install/usr/lib/python3.12/site-packages/hypothesis/strategies/_internal/misc.py", line 39, in __repr__
suffix = "".join(
^^^^^^^^
File "/var/tmp/portage/dev-python/hypothesis-6.75.6/work/hypothesis-hypothesis-python-6.75.6/hypothesis-python-python3_12/install/usr/lib/python3.12/site-packages/hypothesis/strategies/_internal/misc.py", line 40, in <genexpr>
f".{name}({get_pretty_function_description(f)})"
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/var/tmp/portage/dev-python/hypothesis-6.75.6/work/hypothesis-hypothesis-python-6.75.6/hypothesis-python-python3_12/install/usr/lib/python3.12/site-packages/hypothesis/internal/reflection.py", line 432, in get_pretty_function_description
return extract_lambda_source(f)
^^^^^^^^^^^^^^^^^^^^^^^^
File "/var/tmp/portage/dev-python/hypothesis-6.75.6/work/hypothesis-hypothesis-python-6.75.6/hypothesis-python-python3_12/install/usr/lib/python3.12/site-packages/hypothesis/internal/reflection.py", line 312, in extract_lambda_source
source = inspect.getsource(f)
^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.12/inspect.py", line 1282, in getsource
lines, lnum = getsourcelines(object)
^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.12/inspect.py", line 1274, in getsourcelines
return getblock(lines[lnum:]), lnum + 1
^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.12/inspect.py", line 1248, in getblock
_, *_token_info = _token
^^^^^^
UnboundLocalError: cannot access local variable '_token' where it is not associated with a value
```
See also https://github.com/python/cpython/issues/105013.
# Your environment
- CPython versions tested on: 3.11.3, tip of 3.12
- Operating system and architecture: Gentoo Linux, amd64
<!-- gh-linked-prs -->
### Linked PRs
* gh-105061
* gh-105120
<!-- /gh-linked-prs -->
| 70f315c2d6de87b0514ce16cc00a91a5b60a6098 | 9216e69a87d16d871625721ed5a8aa302511f367 |
python/cpython | python__cpython-105094 | # 3.12.0b1 includes backwards incompatible change to operation of `super()`
# Bug report
The optimisation to LOAD_SUPER_ATTR introduced by #104270 appears to have caused a backwards incompatibility.
The following code will reproduce the problem:
```
class MyInstance:
def __new__(cls, ptr, name, bases, attrs):
self = super().__new__(cls, name, bases, attrs)
print(f"{MyInstance=} {self=} {type(self)=}, {super(MyInstance, type(self))=}")
super(MyInstance, type(self)).__setattr__(self, "ptr", ptr)
return self
def __setattr__(self, name, value):
raise Exception()
class MyClass(MyInstance, type):
def __new__(cls, name):
self = super().__new__(cls, id(name), name, (MyInstance,), {})
return self
class1 = MyClass("Class1")
print(f"{class1.ptr=}")
class2 = MyClass("Class2")
print(f"{class2.ptr=}")
```
Under 3.12.0a7 (and previous stable Python versions going back to at least 3.7), this will succeed, outputting:
```
MyInstance=<class '__main__.MyInstance'> self=<class '__main__.Class1'> type(self)=<class '__main__.MyClass'>, super()=<super: <class 'MyInstance'>, <MyClass object>>
class1.ptr=4364457392
MyInstance=<class '__main__.MyInstance'> self=<class '__main__.Class2'> type(self)=<class '__main__.MyClass'>, super()=<super: <class 'MyInstance'>, <MyClass object>>
class2.ptr=4365761904
```
Under 3.12.0b1, it raises an error:
```
MyInstance=<class '__main__.MyInstance'> self=<class '__main__.Class1'> type(self)=<class '__main__.MyClass'>, super()=<super: <class 'MyInstance'>, <MyClass object>>
class1.ptr=4370144336
MyInstance=<class '__main__.MyInstance'> self=<class '__main__.Class2'> type(self)=<class '__main__.MyClass'>, super()=<super: <class 'MyInstance'>, <MyClass object>>
Traceback (most recent call last):
File "/Users/rkm/beeware/rubicon/objc/repro.py", line 24, in <module>
class2 = MyClass("Class2")
^^^^^^^^^^^^^^^^^
File "/Users/rkm/beeware/rubicon/objc/repro.py", line 16, in __new__
self = super().__new__(cls, id(name), name, (MyInstance,), {})
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/rkm/beeware/rubicon/objc/repro.py", line 7, in __new__
super().__setattr__(self, "ptr", ptr)
TypeError: expected 2 arguments, got 3
```
That is - the first `MyClass` instance can be instantiated; however, the second instance fails in mid-construction when invoking `__setattr__`. The state of the objects prior to the `__setattr__` invocation appear to be identical, but the `__setattr__` method behaves differently on the second invocation.
Git bisect has narrowed the cause down to #104270 (CC @carljm, @markshannon).
The reproduction case also fails if the call to `super`:
```
super(MyInstance, type(self)).__setattr__(self, "ptr", ptr)
```
is replaced with the simpler:
```
super().__setattr__(self, "ptr", ptr)
```
## Background
This test case has been extracted from [rubicon-objc](https://github.com/beeware/rubicon-objc), causing beeware/rubicon-objc#313. Rubicon is a wrapper around the Objective C runtime used by macOS and iOS; `MyInstance` is an analog of `ObjCInstance`, and `MyClass` is an analog of `ObjCClass`. `ObjCInstance` has an implementation of `__setattr__` to redirect attribute access to the underlying ObjC calls. However, during construction, `ObjCInstance` needs to store a `ptr` to the underlying ObjC instance. This isn't a valid ObjC attribute, so `super()` is used to access the underlying `__setattr__` implementation to set the attribute.
# Your environment
- CPython versions tested on: 3.7.9, 3.10.11, 3.12.0a7, 3.12.0b1
- Operating system and architecture: macOS ARM64
<!-- gh-linked-prs -->
### Linked PRs
* gh-105094
* gh-105117
<!-- /gh-linked-prs -->
| 68c75c31536e8c87901934f2d6da81f54f4334f9 | 49f90ba1eae56708b1894441418c13ad8e8ea9a8 |
python/cpython | python__cpython-105049 | # `datetime` documentation regarding ISO8601 reduced precision and extend representation
# Documentation
The documentation for [date.fromisoformat()](https://github.com/python/cpython/blob/b9e2d8076981b471952355b86f90064ce3941032/Doc/library/datetime.rst?plain=1#LL527C8-L527C8) and [datetime.fromisoformat()](https://github.com/python/cpython/blob/b9e2d8076981b471952355b86f90064ce3941032/Doc/library/datetime.rst?plain=1#L993) refers to these functions as being able to accept "any valid ISO 8601 format"; however, there are a couple of notable exceptions to this support beyond what is already mentioned in the documentation. I know that Python's date parsing code intentionally does not implement the full ISO 8601 specification for simplicity reasons, but I feel this could be better conveyed by the current documentation.
Below is a summary of ISO 8601 date stamps that are not supported by the `datetime` module.
* Reduced Precision Date Representations (ISO8601-1:2019§5.2.2.2)
* Extended Date Representations (ISO8601-1:2019§5.2.2.3)
* Ordinal Date Representations (ISO8601-1:2019§5.2.3): This is already mentioned in the documentation
* Local Time Representations with Decimal Fraction (ISO8601-1:2019§5.3.1.4): This is already mentioned in the documentation
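A small illustration of the first bullet (reduced precision), as observed on current versions:

```python
from datetime import date

# A complete calendar date parses fine:
assert date.fromisoformat("2023-05-27") == date(2023, 5, 27)

# A reduced-precision (year-month) representation, although valid
# ISO 8601, is rejected:
try:
    date.fromisoformat("2023-05")
except ValueError:
    reduced_precision_supported = False
else:
    reduced_precision_supported = True
assert not reduced_precision_supported
```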
<!-- gh-linked-prs -->
### Linked PRs
* gh-105049
* gh-114866
* gh-114867
<!-- /gh-linked-prs -->
| e9dab656380ec03d628979975646748330b76b9b | 6d7ad57385e6c18545f19714b8f520644d305715 |
python/cpython | python__cpython-105115 | # `tp_bases` of `object` is `NULL`: undocumented or unintentional behavior change in 3.12?
# Documentation
See https://github.com/python/cpython/pull/103912#issuecomment-1559779144 for full context.
Specific problem:
**It appears `tp_bases` of `object` was an empty tuple in the past, but is now `NULL`, could that be? Is that intentional?**
If intentional, could it please be documented? I think the behavior change is likely to trip up a fair number of people.
I looked in https://github.com/python/cpython/blob/edd0cb8e77d7b65f5a9c2c69dc81f9c4514878d5/Doc/whatsnew/3.12.rst but don't see `tp_bases` being mentioned there.
I also verified that the `tp_bases == nullptr` patch is still needed with current 3.12 HEAD (edd0cb8e77d7b65f5a9c2c69dc81f9c4514878d5).
<!-- gh-linked-prs -->
### Linked PRs
* gh-105115
* gh-105124
<!-- /gh-linked-prs -->
| 7be667dfafa2465df6342d72dca9c1f82dd830d0 | 26e7bbf66e93ee7c94b6e007ec7b2d769c2ced92 |
python/cpython | python__cpython-105022 | # Tokenizer produces different output on Windows on py312 for ends of files
# Bug report
If you copy and paste the following code into a `repro.py` file and run `python -m tokenize` on it on a Windows machine, the output is different on 3.12/3.13 compared to what it was on 3.11 (the file ends with a single newline):
```py
foo = 'bar'
spam = 'eggs'
```
On Python 3.11, on Windows, the output is this:
```
> python -m tokenize cpython/repro.py
0,0-0,0: ENCODING 'utf-8'
1,0-1,3: NAME 'foo'
1,4-1,5: OP '='
1,6-1,11: STRING "'bar'"
1,11-1,13: NEWLINE '\r\n'
2,0-2,4: NAME 'spam'
2,5-2,6: OP '='
2,7-2,13: STRING "'eggs'"
2,13-2,15: NEWLINE '\r\n'
3,0-3,0: ENDMARKER ''
```
On Python 3.13 (@ https://github.com/python/cpython/commit/6e62eb2e70a9f2a8099735989a58e8c8cfb4a2f2) on Windows, however, the output is this:
```
> python -m tokenize repro.py
0,0-0,0: ENCODING 'utf-8'
1,0-1,3: NAME 'foo'
1,4-1,5: OP '='
1,6-1,11: STRING "'bar'"
1,11-1,12: NEWLINE '\n'
2,0-2,4: NAME 'spam'
2,5-2,6: OP '='
2,7-2,13: STRING "'eggs'"
2,13-2,14: NEWLINE '\n'
3,0-3,1: NL '\n'
4,0-4,0: ENDMARKER ''
```
There appear to be two changes here:
- All the NEWLINE tokens now have `\n` values, whereas on Python 3.11 they all had `\r\n` values
- There is an additional NL token at the end, immediately before the ENDMARKER token.
As discussed in https://github.com/PyCQA/pycodestyle/issues/1142, this appears to be Windows-specific, and _may_ be the cause of a large number of spurious W391 errors from the pycodestyle linting tool. (W391 dictates that there should be one, and only one, newline at the end of a file.) The pycodestyle tool is included in flake8, meaning that test failures in pycodestyle can cause test failures for other flake8 plugins. (All tests for [the flake8-pyi plugin](https://github.com/PyCQA/flake8-pyi), for example, currently fail on Python 3.13 on Windows.)
# Your environment
```
Python 3.13.0a0 (heads/main:6e62eb2e70, May 27 2023, 14:00:13) [MSC v.1932 64 bit (AMD64)] on win32
```
<!-- gh-linked-prs -->
### Linked PRs
* gh-105022
* gh-105023
* gh-105030
* gh-105041
<!-- /gh-linked-prs -->
| 86d8f489359b8f6cc15006bdcbd70521ce621fbb | 6e62eb2e70a9f2a8099735989a58e8c8cfb4a2f2 |
python/cpython | python__cpython-105021 | # Possible regression in Python3.12 tokenizer
# Bug report
I have encountered a possible regression while I was testing pyrsistent with python3.12_beta1, see https://github.com/tobgu/pyrsistent/issues/275. Tests fail with following error:
```
/usr/lib/python3.12/site-packages/_pytest/python.py:617: in _importtestmodule
mod = import_path(self.path, mode=importmode, root=self.config.rootpath)
/usr/lib/python3.12/site-packages/_pytest/pathlib.py:564: in import_path
importlib.import_module(module_name)
/usr/lib/python3.12/importlib/__init__.py:90: in import_module
return _bootstrap._gcd_import(name[level:], package, level)
<frozen importlib._bootstrap>:1293: in _gcd_import
???
<frozen importlib._bootstrap>:1266: in _find_and_load
???
<frozen importlib._bootstrap>:1237: in _find_and_load_unlocked
???
<frozen importlib._bootstrap>:841: in _load_unlocked
???
/usr/lib/python3.12/site-packages/_pytest/assertion/rewrite.py:178: in exec_module
exec(co, module.__dict__)
tests/hypothesis_vector_test.py:48: in <module>
PVectorAndLists = st.lists(st.builds(RefCountTracker)).map(
/usr/lib/python3.12/site-packages/hypothesis/strategies/_internal/strategies.py:347: in map
if is_identity_function(pack):
/usr/lib/python3.12/site-packages/hypothesis/internal/reflection.py:638: in is_identity_function
return bool(re.fullmatch(r"lambda (\w+): \1", get_pretty_function_description(f)))
/usr/lib/python3.12/site-packages/hypothesis/internal/reflection.py:432: in get_pretty_function_description
return extract_lambda_source(f)
/usr/lib/python3.12/site-packages/hypothesis/internal/reflection.py:312: in extract_lambda_source
source = inspect.getsource(f)
/usr/lib/python3.12/inspect.py:1274: in getsource
lines, lnum = getsourcelines(object)
/usr/lib/python3.12/inspect.py:1266: in getsourcelines
return getblock(lines[lnum:]), lnum + 1
/usr/lib/python3.12/inspect.py:1241: in getblock
for _token in tokens:
/usr/lib/python3.12/tokenize.py:451: in _tokenize
for token in _generate_tokens_from_c_tokenizer(source, extra_tokens=True):
/usr/lib/python3.12/tokenize.py:542: in _generate_tokens_from_c_tokenizer
for info in c_tokenizer.TokenizerIter(source, extra_tokens=extra_tokens):
E File "<string>", line 1
E lambda l: (l, pvector(l)))
E ^
E SyntaxError: unmatched ')'
```
It is triggered for this code (see https://github.com/tobgu/pyrsistent/blob/cc90f3e2b339653fde0df422aaf3ccdb3fc1225d/tests/hypothesis_vector_test.py#L48-L49 )
```python
PVectorAndLists = st.lists(st.builds(RefCountTracker)).map(
lambda l: (l, pvector(l)))
```
If I concatenate it into one line, everything works. I also tried to backport the fix for https://github.com/python/cpython/issues/104866 as it sounded relevant, but it didn't solve the issue.
# Your environment
- Gentoo `~amd64`
- Python 3.12.0b1
<!-- gh-linked-prs -->
### Linked PRs
* gh-105021
* gh-105032
<!-- /gh-linked-prs -->
| 3a5be878be6f89ee98d0ef9a1274e6a9d9ccbc37 | b225c08de889d2bf070e6c981c5f386cf06d961c |
python/cpython | python__cpython-107014 | # pathlib.PurePath.relative_to(walk_up=True) mishandles '..' components
`pathlib.PurePath.relative_to(other, walk_up=True)` doesn't handle '..' segments in its *other* argument correctly:
```python
>>> from pathlib import PurePath, Path
>>> Path.cwd()
PosixPath('/home/barney/projects/cpython')
>>> PurePath('a/b').relative_to('a/..', walk_up=True)
PurePosixPath('../b')
# expected: ValueError (ideal world: 'a/b')
>>> PurePath('a/b').relative_to('a/../..', walk_up=True)
PurePosixPath('../../b')
# expected: ValueError (ideal world: 'cpython/a/b')
```
`PurePath` objects do not know the current working directory, nor can they safely eliminate `..` segments without resolving symlinks, so I think raising `ValueError` is the only reasonable thing to do.
<!-- gh-linked-prs -->
### Linked PRs
* gh-107014
* gh-107315
<!-- /gh-linked-prs -->
| e7e6e4b035f51ab4a962b45a957254859f264f4f | 6d5b6e71c87fca7c5c26f5dd8f325087962215cc |
python/cpython | python__cpython-104999 | # Optimize joining of pathlib.PurePath() arguments.
In Python 3.12, when multiple arguments are given to `PurePath()`, the initialiser calls `os.path.join()` to join them. This is reasonably slow. For Python 3.13 we can make it faster by:
1. Deferring joining of arguments until strictly needed
2. (Maybe) re-implementing `os.path.join()`, as pathlib did before #95450.
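Idea 1 can be sketched like this (illustrative class and attribute names, not pathlib's actual code):

```python
import os.path
from functools import cached_property

class LazyJoinPath:
    """Sketch of deferred joining: keep the constructor arguments as-is
    and only pay for os.path.join() when the string form is first needed."""

    def __init__(self, *args):
        self._raw_paths = [os.fspath(a) for a in args]

    @cached_property
    def _str(self):
        # joining is deferred until here, then cached by cached_property
        return os.path.join(*self._raw_paths) if self._raw_paths else "."

    def __str__(self):
        return self._str
```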
<!-- gh-linked-prs -->
### Linked PRs
* gh-104999
* gh-105483
* gh-105484
<!-- /gh-linked-prs -->
| ffeaec7e60c88d585deacb10264ba7a96e5e52df | f5df347fcf5fe029edbe6bf274da0f4880401852 |
python/cpython | python__cpython-104993 | # Remove deprecated unittest.TestProgram.usageExit
# Feature or enhancement
The `unittest.TestProgram.usageExit` method was deprecated in Python 3.11 and scheduled for removal in 3.13.
The method wasn't documented. The deprecation was documented in the [changelog](https://docs.python.org/3/whatsnew/changelog.html?highlight=usageexit#id86) but not in What's New.
In addition to removal in 3.13, I suggest adding it to the "What's New in 3.11" > Deprecations:
* https://docs.python.org/3.13/whatsnew/3.11.html#deprecated
And to "What's New in 3.12" > "Pending Removal in Python 3.13":
* https://docs.python.org/3.13/whatsnew/3.12.html#pending-removal-in-python-3-13
# Previous discussion
<!--
New features to Python should first be discussed elsewhere before creating issues on GitHub,
for example in the "ideas" category (https://discuss.python.org/c/ideas/6) of discuss.python.org,
or the python-ideas mailing list (https://mail.python.org/mailman3/lists/python-ideas.python.org/).
Use this space to post links to the places where you have already discussed this feature proposal:
-->
* GH-67048
<!--
You can freely edit this text. Remove any lines you believe are unnecessary.
-->
<!-- gh-linked-prs -->
### Linked PRs
* gh-104993
* gh-104994
* gh-104995
* gh-105009
* gh-105010
* gh-105036
<!-- /gh-linked-prs -->
| b225c08de889d2bf070e6c981c5f386cf06d961c | 897e716d03d559a10dd5015ecb501ceb98955f3a |
python/cpython | python__cpython-104986 | # starargs and kwargs are removed from ClassDef and Call ast nodes
# Documentation
The docs for the [ClassDef](https://docs.python.org/3/library/ast.html#ast.ClassDef) and [Call](https://docs.python.org/3/library/ast.html#ast.Call) AST nodes list two fields named `starargs` and `kwargs` which were removed in this [commit](https://github.com/python/cpython/commit/025e9ebd0a0a19f50ca83af6ada0ac65be1fa2a1#diff-aa47c9a4ecdd799c2932afef25673bf9c33ec9ef75f1bc6054b924bea602c747L87).
Thank you @isidentical for your help to find this.
<!-- gh-linked-prs -->
### Linked PRs
* gh-104986
* gh-104987
* gh-104988
<!-- /gh-linked-prs -->
| 61c1d6760facbc172a58512cad46148f587b4da1 | 3fdb55c48291a459fb1e33edb5140ec0383222df |
python/cpython | python__cpython-104985 | # buildbots fail because of test_peg_generator
`test_peg_generator` fails because of some of the changes in #104798 (newlines removed from `TokenInfo`) paired with a behavior change (regarding those same newlines in `TokenInfo`) in #104975.
<!-- gh-linked-prs -->
### Linked PRs
* gh-104985
<!-- /gh-linked-prs -->
| 95f1b1fef777254a45559c0348e80185df3634ff | 61c1d6760facbc172a58512cad46148f587b4da1 |
python/cpython | python__cpython-104980 | # Consider emitting buffered DEDENT tokens on the last line
In Python 3.12, porting the tokenizer to use the C tokenizer underneath to support PEP 701 has now a documented change in [docs.python.org/3.12/whatsnew/3.12.html#changes-in-the-python-api](https://docs.python.org/3.12/whatsnew/3.12.html#changes-in-the-python-api):
> Some final DEDENT tokens are now emitted within the bounds of the input. This means that for a file containing 3 lines, the old version of the tokenizer returned a DEDENT token in line 4 whilst the new version returns the token in line 3.
Apparently, this negatively affects some formatting tools (see https://github.com/PyCQA/pycodestyle/issues/1142). Let's consider what options we have and see if we can fix this without adding a lot of maintenance burden to the C tokenizer or slowing everything down.
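The location of that final DEDENT can be probed directly; the reported `start` line is exactly what changed between versions:

```python
import io
import tokenize

# Two source lines; the question is whether the closing DEDENT is
# reported on line 2 (within the input) or line 3 (one past the end).
src = "if True:\n    pass\n"
tokens = list(tokenize.generate_tokens(io.StringIO(src).readline))
dedents = [t for t in tokens if t.type == tokenize.DEDENT]
for t in dedents:
    print(t.start)
```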
<!-- gh-linked-prs -->
### Linked PRs
* gh-104980
* gh-105000
<!-- /gh-linked-prs -->
| 46b52e6e2bda51d0b89a64ee36ce2d305a7409f3 | 402ee5a68b306b489b782478ab96e8e3b913587a |
python/cpython | python__cpython-104975 | # The lines in tokens from `tokenize.generate_tokens` incorrectly indicate multiple lines.
The line attribute in tokens returned by `tokenize.generate_tokens` incorrectly indicate multiple lines. The tokens should have an invariant that using the `.start` and `.end` attributes to index into the `.line` attribute will produce the `.string` attribute.
tokbug.py:
```python
import io
import sys
import tokenize
SOURCE = r"""
a + \
b
"""
print(sys.version)
readline = io.StringIO(SOURCE).readline
for tok in tokenize.generate_tokens(readline):
correct = (tok.string) == (tok.line[tok.start[1]: tok.end[1]])
print(tok, "" if correct else "<*****!!!")
```
Run with 3.12.0a7:
```
% /usr/local/pyenv/pyenv/versions/3.12.0a7/bin/python tokbug.py
3.12.0a7 (main, Apr 5 2023, 05:51:58) [Clang 14.0.3 (clang-1403.0.22.14.1)]
TokenInfo(type=62 (NL), string='\n', start=(1, 0), end=(1, 1), line='\n')
TokenInfo(type=1 (NAME), string='a', start=(2, 0), end=(2, 1), line='a + \\\n')
TokenInfo(type=54 (OP), string='+', start=(2, 2), end=(2, 3), line='a + \\\n')
TokenInfo(type=1 (NAME), string='b', start=(3, 0), end=(3, 1), line='b\n')
TokenInfo(type=4 (NEWLINE), string='\n', start=(3, 1), end=(3, 2), line='b\n')
TokenInfo(type=0 (ENDMARKER), string='', start=(4, 0), end=(4, 0), line='')
```
Run with 3.12.0b1:
```
% /usr/local/pyenv/pyenv/versions/3.12.0b1/bin/python tokbug.py
3.12.0b1 (main, May 23 2023, 16:19:59) [Clang 14.0.3 (clang-1403.0.22.14.1)]
TokenInfo(type=65 (NL), string='\n', start=(1, 0), end=(1, 1), line='\n')
TokenInfo(type=1 (NAME), string='a', start=(2, 0), end=(2, 1), line='a + \\\n')
TokenInfo(type=55 (OP), string='+', start=(2, 2), end=(2, 3), line='a + \\\n')
TokenInfo(type=1 (NAME), string='b', start=(3, 0), end=(3, 1), line='a + \\\nb\n') <*****!!!
TokenInfo(type=4 (NEWLINE), string='\n', start=(3, 1), end=(3, 2), line='a + \\\nb\n') <*****!!!
TokenInfo(type=0 (ENDMARKER), string='', start=(4, 0), end=(4, 0), line='')
```
Related to #104825? cc @pablogsal
<!-- gh-linked-prs -->
### Linked PRs
* gh-104975
* gh-104982
<!-- /gh-linked-prs -->
| 3fdb55c48291a459fb1e33edb5140ec0383222df | 2cb445635e99d4401949cabebd373288cfdd0138 |
python/cpython | python__cpython-104956 | # Signature for `__release_buffer__` is incorrect
Currently the signature for PEP-688's `__release_buffer__` method is as follows:
```
>>> bytearray.__release_buffer__.__text_signature__
'($self, /)'
```
This is incorrect; the method takes a single argument that according to the PEP should be called `buffer`.
<!-- gh-linked-prs -->
### Linked PRs
* gh-104956
* gh-104973
<!-- /gh-linked-prs -->
| 6e1eccdcce5ea3bf1ef9d326d20ef9df21262c6b | 6c81d7572edbe3a5800b1128e55a2dcef03cc13c |
python/cpython | python__cpython-104948 | # pathlib.PureWindowsPath comparison results vary between Windows and Posix
In #31691 I switched `pathlib.PureWindowsPath` comparisons to use `os.path.normcase()` rather than `str.lower()`. This is probably a mistake, as @eryksun [points out](https://github.com/python/cpython/issues/104484#issuecomment-1560419031):
> The inconsistency is with `ntpath.normcase()` on Windows. It's probably for the best if the pure comparison methods revert to using `str.lower()` for the sake of consistency, not only with `glob()` and `match()`, but also with using `PureWindowsPath` on POSIX. Maybe platform-dependent comparisons could be implemented on `Path`.
>
> On Windows, `ntpath.normcase()` is based on `LCMapStringEx()`. It turns out that this function implements a case mapping for some non-BMP characters. WinAPI [`CompareStringOrdinal()`](https://learn.microsoft.com/en-us/windows/win32/api/stringapiset/nf-stringapiset-comparestringordinal), on the other hand, has no case mapping for non-BMP characters, which is consistent with Microsoft's filesystems. Thus I'd prefer for a platform-dependent comparison to use `CompareStringOrdinal()` instead of `LCMapStringEx()`.
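The pure-path comparison itself is case-insensitive on every platform; the question raised here is only *which* case mapping is used. The divergence the issue describes appears for certain non-BMP characters, which this minimal illustration does not exercise:

```python
from pathlib import PureWindowsPath, PurePosixPath

# PureWindowsPath compares case-insensitively regardless of host OS...
assert PureWindowsPath("C:/Users/Foo") == PureWindowsPath("c:/users/foo")

# ...while PurePosixPath comparisons stay case-sensitive.
assert PurePosixPath("/usr/Foo") != PurePosixPath("/usr/foo")
```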
<!-- gh-linked-prs -->
### Linked PRs
* gh-104948
* gh-104990
<!-- /gh-linked-prs -->
| ad0be361c9922a918c7c3eaf83e1d8f2b019279c | 060277d96bf4ba86df8e4d65831a8cbdfeb51fc5 |
python/cpython | python__cpython-104945 | # Remove mentions of old Python versions in `typing.NamedTuple` docstring
# Documentation
This came up in this [PR](https://github.com/python/cpython/pull/104891#discussion_r1205942031)
In `typing.NamedTuple` there are some mentions to Python 3.5 & 3.6 which can be removed:
https://github.com/python/cpython/blob/bd1b6228d132b8e9836fe352cd8dca2b6c1bd98c/Lib/typing.py#L2712
https://github.com/python/cpython/blob/bd1b6228d132b8e9836fe352cd8dca2b6c1bd98c/Lib/typing.py#L2729-L2731
<!-- gh-linked-prs -->
### Linked PRs
* gh-104945
* gh-104962
* gh-104963
<!-- /gh-linked-prs -->
| 46857d0b2a2ac6aeb6dcce2bf2c92ddf4abe7496 | 2cf04e455d8f087bd08cd1d43751007b5e41b3c5 |
python/cpython | python__cpython-104939 | # Runtime-checkable protocols are broken on py312 (the sequel)
# Bug report
On 3.8-3.11, all subclasses of `typing.Generic` were guaranteed to have an `_is_protocol` class attribute, which is used as an internal marker:
https://github.com/python/cpython/blob/76873ca6b1ad1a1eb9518f0ff7fc594ec96d0a65/Lib/typing.py#L1790
Two places in `typing.py` rely on all subclasses of `Generic` having this marker:
https://github.com/python/cpython/blob/dbc8216f4c00ea40b0c2d3ca487e5afeb4b0e0b1/Lib/typing.py#L1895-L1897
https://github.com/python/cpython/blob/dbc8216f4c00ea40b0c2d3ca487e5afeb4b0e0b1/Lib/typing.py#L2062-L2064
However, on Python 3.12 (due to the implementation of PEP-695), subclasses of `Generic` no longer have this marker:
```pycon
>>> class Foo[T]: ...
...
>>> Foo._is_protocol
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
AttributeError: type object 'Foo' has no attribute '_is_protocol'
```
This leads to `AttributeError` being raised in two situations where it shouldn't be:
```pycon
Python 3.13.0a0 (heads/main:1080c4386d, May 25 2023, 13:11:38) [MSC v.1932 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> from typing import Protocol, runtime_checkable
>>> @runtime_checkable
... class Foo[T]: ...
...
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "C:\Users\alexw\coding\cpython\Lib\typing.py", line 2062, in runtime_checkable
if not issubclass(cls, Generic) or not cls._is_protocol:
^^^^^^^^^^^^^^^^
AttributeError: type object 'Foo' has no attribute '_is_protocol'
>>> @runtime_checkable
... class HasX(Protocol):
... x: int
...
>>> class Bar[T]:
... x: T
... def __init__(self, x):
... self.x = x
...
>>> isinstance(Bar(42), HasX)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "C:\Users\alexw\coding\cpython\Lib\typing.py", line 1810, in __instancecheck__
if super().__instancecheck__(instance):
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\alexw\coding\cpython\Lib\abc.py", line 119, in __instancecheck__
return _abc_instancecheck(cls, instance)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\alexw\coding\cpython\Lib\typing.py", line 1794, in __subclasscheck__
return super().__subclasscheck__(other)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\alexw\coding\cpython\Lib\abc.py", line 123, in __subclasscheck__
return _abc_subclasscheck(cls, subclass)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\alexw\coding\cpython\Lib\typing.py", line 1897, in _proto_hook
issubclass(other, Generic) and other._is_protocol):
^^^^^^^^^^^^^^^^^^
AttributeError: type object 'Bar' has no attribute '_is_protocol'
```
Cc. @JelleZijlstra for PEP-695
<!-- gh-linked-prs -->
### Linked PRs
* gh-104939
* gh-104941
<!-- /gh-linked-prs -->
| 2b7027d0b2ee2e102a24a0da27d01b8221f9351c | 77d7ec5aa978db05cfb6f83e7624ca195065ce11 |
python/cpython | python__cpython-104926 | # "sendIng a read()able" as error message in /Lib/http/client.py
# Bug report
In /Lib/http/client.py lines [1027](https://github.com/python/cpython/blob/278030a17d6889731d91a55e9f70cbec0b07dee8/Lib/http/client.py#L1027) and [1057](https://github.com/python/cpython/blob/278030a17d6889731d91a55e9f70cbec0b07dee8/Lib/http/client.py#L1057), the error message is `"sendIng a read()able"`, which looks like the result of an overzealous string replacement.
I am conscious that this is extremely minor, but somehow it bugs me and I felt like sharing 😅
<!-- gh-linked-prs -->
### Linked PRs
* gh-104926
* gh-104970
* gh-104971
<!-- /gh-linked-prs -->
| 6c81d7572edbe3a5800b1128e55a2dcef03cc13c | 7fc542c88dc8a09d71006a6240943407b83229d0 |
python/cpython | python__cpython-104910 | # Split opcodes into micro-ops
See https://github.com/faster-cpython/ideas/issues/592. This project can be parallelized, and I could use help!
Note that initially we don't have a tier-2 interpreter or instruction format defined yet, but the first stage of splitting doesn't require that.
### Remaining non-viable ops
With usage stats from the latest weekly pyperformance run (not all are uop candidates):
```
956,835,460 YIELD_VALUE
490,121,077 SEND_GEN (specialization of SEND)
429,220,003 JUMP_FORWARD
410,623,920 EXTENDED_ARG
390,537,200 JUMP_BACKWARD_NO_INTERRUPT
250,483,954 RETURN_GENERATOR
239,793,181 CALL_LIST_APPEND (specialization of CALL)
168,297,543 CALL_KW
162,922,780 FOR_ITER_GEN (specialization of FOR_ITER)
157,442,920 CALL_PY_WITH_DEFAULTS (specialization of CALL)
145,986,780 BINARY_SUBSCR_GETITEM (specialization of BINARY_SUBSCR)
135,636,840 STORE_FAST_LOAD_FAST
83,118,452 MAKE_CELL
74,149,898 CALL_FUNCTION_EX
68,587,076 CALL_ALLOC_AND_ENTER_INIT (specialization of CALL)
49,897,724 STORE_ATTR_WITH_HINT (specialization of STORE_ATTR)
49,846,886 LOAD_ATTR_PROPERTY (specialization of LOAD_ATTR)
8,224,500 RERAISE
6,000,000 END_ASYNC_FOR
5,801,385 BEFORE_WITH
2,892,780 RAISE_VARARGS
1,850,040 IMPORT_FROM
1,813,620 IMPORT_NAME
240 CLEANUP_THROW
120 BEFORE_ASYNC_WITH
ENTER_EXECUTOR
LOAD_ATTR_GETATTRIBUTE_OVERRIDDEN (specialization of LOAD_ATTR)
```
<!-- gh-linked-prs -->
### Linked PRs
* gh-104910
* gh-105748
* gh-106677
* gh-106678
* gh-107444
* gh-109316
* gh-109943
* gh-110025
* gh-110317
* gh-110560
<!-- /gh-linked-prs -->
| df396b59af9d50892e5e30463300e8458cb84263 | fbc9d0dbb22549bac2706f61f3ab631239d357b4 |
python/cpython | python__cpython-104899 | # os.PathLike is missing __slots__
[Per](https://github.com/python/cpython/pull/104810#discussion_r1204475255) @AlexWaygood:
> All the ABCs in `collections.abc` deliberately define `__slots__` for this very reason: so that other classes can subclass those ABCs and still be able to use the `__slots__` machinery to block the creation of instance dictionaries.
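The effect can be seen with any `collections.abc` ABC: because the base defines `__slots__ = ()`, a slotted subclass's instances really have no `__dict__`, whereas an `os.PathLike` subclass (lacking `__slots__` on the base) always grew one:

```python
from collections.abc import Hashable

class SlottedKey(Hashable):
    # This only blocks the instance dict because Hashable (like all
    # collections.abc ABCs) defines __slots__ = ().
    __slots__ = ("value",)

    def __init__(self, value):
        self.value = value

    def __hash__(self):
        return hash(self.value)

k = SlottedKey(42)
assert not hasattr(k, "__dict__")
```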
<!-- gh-linked-prs -->
### Linked PRs
* gh-104899
* gh-105073
<!-- /gh-linked-prs -->
| bd1b6228d132b8e9836fe352cd8dca2b6c1bd98c | fea8632ec69d160a11b8ec506900c14989952bc1 |
python/cpython | python__cpython-104887 | # Remove deprecated configparser.LegacyInterpolation class
# Feature or enhancement
The `configparser.LegacyInterpolation` class is undocumented, was deprecated in the docstring since Python 3.2, deprecated with a warning since Python 3.11, and scheduled for removal in 3.13.
# Previous discussion
* https://github.com/python/cpython/issues/90765
<!-- gh-linked-prs -->
### Linked PRs
* gh-104887
<!-- /gh-linked-prs -->
| 3f9c60f51ef820937e7e0f95f45e63fa0ae21e6c | 0242e9a57aa87ed0b5cac526f65631c654a39054 |
python/cpython | python__cpython-105026 | # socket.getblocking() documentation lists opposite equivalence
# Documentation
The [documentation](https://docs.python.org/3/library/socket.html#socket.socket.getblocking) for `socket.getblocking()` states:
> This is equivalent to checking socket.gettimeout() == 0.
But that's the opposite of what it means. According to the [source code](https://github.com/python/cpython/blob/main/Modules/socketmodule.c), `getblocking()` returns `False` if and only if `gettimeout()` returns `0.0`.
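In other words, the actual relationship is `getblocking() == (gettimeout() != 0)`, which is easy to verify:

```python
import socket

s = socket.socket()
try:
    s.setblocking(False)             # non-blocking mode
    assert s.gettimeout() == 0.0     # timeout is zero...
    assert s.getblocking() is False  # ...and getblocking() is False

    s.setblocking(True)              # blocking mode
    assert s.gettimeout() is None    # no timeout at all...
    assert s.getblocking() is True   # ...and getblocking() is True
finally:
    s.close()
```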
<!-- gh-linked-prs -->
### Linked PRs
* gh-105026
* gh-105283
* gh-105284
<!-- /gh-linked-prs -->
| 5a5ed7a3e616a372f054a1dd2e9a31ba32a87a67 | 9a90c9ace2ed878715107bf4ae39e5967d7c931f |
python/cpython | python__cpython-104881 | # TypeAliasType.__module__ can segfault
```
>>> ns = {}
>>> exec("type A = int", ns, ns)
>>> ns["A"].__module__
zsh: segmentation fault ./python.exe
```
I will propose a fix.
<!-- gh-linked-prs -->
### Linked PRs
* gh-104881
* gh-104890
<!-- /gh-linked-prs -->
| fe77a99fc8b549a8bf9ccbc5485fe5ea9bcf47b9 | 1497607a8e99f1103c40368dd5f9057f0146a520 |
python/cpython | python__cpython-104877 | # Remove deprecated turtle.RawTurtle.settiltangle method
# Feature or enhancement
The `turtle.RawTurtle.settiltangle` method was deprecated in the docs since Python 3.1 and with a warning since 3.11, and is scheduled for removal in 3.13.
# Previous discussion
* https://github.com/python/cpython/pull/29618
<!-- gh-linked-prs -->
### Linked PRs
* gh-104877
<!-- /gh-linked-prs -->
| 10c45838e1de47ef57708c71e3d9c2ddb78d493d | 705e387dd81b971cb1ee5727da54adfb565f61d0 |
python/cpython | python__cpython-104875 | # Document `NewType.__supertype__`
`typing.NewType` has always had an attribute `__supertype__` pointing to its supertype. It is widely used in the wild (e.g. https://github.com/pydantic/pydantic/blob/00158878cb1c6218cd58ee62248db56c65e4172c/pydantic/_internal/_typing_extra.py#L153, https://github.com/quora/pyanalyze/blob/bd7f520adc2d8b098be657dfa514d1433bea3b0c/pyanalyze/annotations.py#L456), but not documented. Let's document it and commit to leaving it stable.
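A minimal illustration of the attribute:

```python
from typing import NewType

UserId = NewType("UserId", int)
assert UserId.__supertype__ is int  # points back at the wrapped type

# Nested NewTypes chain through __supertype__ one level at a time.
AdminId = NewType("AdminId", UserId)
assert AdminId.__supertype__ is UserId
```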
<!-- gh-linked-prs -->
### Linked PRs
* gh-104875
* gh-104906
* gh-104907
<!-- /gh-linked-prs -->
| 41768a2bd3a8f57e6ce4e4ae9cab083b69817ec1 | dbcdbf1814b3df4c4e08e525b03384376598217d |
python/cpython | python__cpython-104878 | # Add `typing.get_protocol_members` and `typing.is_protocol`
#103160 added an attribute `__protocol_attrs__` that holds the names of all protocol attributes:
```
>>> from typing import Protocol
>>> class X(Protocol):
... a: int
... def b(self) -> str: pass
...
>>> X.__protocol_attrs__
{'b', 'a'}
```
This is useful for code that needs to extract the names included in a Protocol at runtime. Previously, this was quite difficult, as you had to look at the class's `__dict__` and `__annotations__` directly and exclude a long list of internal attributes (e.g. https://github.com/quora/pyanalyze/blob/bd7f520adc2d8b098be657dfa514d1433bea3b0c/pyanalyze/checker.py#L428).
However, currently `__protocol_attrs__` is an undocumented private attribute. I think we should either document it or add an introspection helper like `typing.get_protocol_attrs()` that exposes it.
<!-- gh-linked-prs -->
### Linked PRs
* gh-104878
<!-- /gh-linked-prs -->
| fc8037d84c5f886849a05ec993dd0f79a356d372 | ba516e70c6d156dc59dede35b6fc3db0151780a5 |
python/cpython | python__cpython-104870 | # inspect.getsourcelines() is buggy on 3.12
```
% cat gsl.py
import inspect
def test_list_add(self):
def capybara() -> None:
assert_is_value(
[x] + [y],
z,
)
assert_is_value(
# in the binop implementation
a,
)
print(inspect.getsourcelines(test_list_add))
% python3.11 gsl.py
(['def test_list_add(self):\n', ' def capybara() -> None:\n', ' assert_is_value(\n', ' [x] + [y],\n', ' z,\n', ' )\n', ' assert_is_value(\n', ' # in the binop implementation\n', ' a,\n', ' )\n'], 3)
% ~/py/cpython/python.exe gsl.py
(['def test_list_add(self):\n', ' def capybara() -> None:\n', ' assert_is_value(\n', ' [x] + [y],\n', ' z,\n', ' )\n', ' assert_is_value(\n', ' # in the binop implementation\n'], 3)
% ~/py/cpython/python.exe -V
Python 3.12.0b1+
```
On 3.12, the last two lines (`a, )`) are not returned as part of the source lines. When I tried to minify by removing the first assert_is_value() call, it instead returned the print() line as part of the function.
This looks related to the tokenizer, cc @pablogsal @lysnikolaou.
<!-- gh-linked-prs -->
### Linked PRs
* gh-104870
* gh-104872
<!-- /gh-linked-prs -->
| c90a862cdcf55dc1753c6466e5fa4a467a13ae24 | 9d457e115447b2079a1f66950d3c76cb77febf38 |
python/cpython | python__cpython-124340 | # allow_abbrev=False does not work for single-dash long options
# Bug report
The following code gets different results on Python 3.6.9 and Python 3.8.10:
```Python
import argparse
if __name__ == "__main__":
parser = argparse.ArgumentParser(
allow_abbrev=False
)
parser.add_argument('-on', "--output-name", type=str, help="output")
args, _ = parser.parse_known_args(["-on", "name", "-o", "dir"])
print(args)
```
## python3.6.9
```Shell
[chenf@b12e0231:/mnt/ssd/chenf/projects/test_hhb_2.4/debug/cmd_config]$ python3 --version
Python 3.6.9
[chenf@b12e0231:/mnt/ssd/chenf/projects/test_hhb_2.4/debug/cmd_config]$ python3 ~/test.py
Namespace(output_name='name')
```
## python3.8.10
```Shell
[chenf@b11b0623:/mnt/ssd/chenf/project/hhb_test_2.4/cmd_config]$ python3 --version
Python 3.8.10
[chenf@b11b0623:/mnt/ssd/chenf/project/hhb_test_2.4/cmd_config]$ python3 ~/test.py
Namespace(output_name='dir')
```
## Debug
I tried to debug the code and found some clues.
I set `allow_abbrev=False` to disable prefix matching, but it works in Python 3.6.9 and fails in Python 3.8.10.
When I dive into `argparse.py`, I find that `_parse_optional` honors `allow_abbrev` in Python 3.6.9 but not in Python 3.8.10, as follows:
### in python3.6.9 argparse.py
```Python
def _parse_optional(self, arg_string):
....
if self.allow_abbrev:
# search through all possible prefixes of the option string
# and all actions in the parser for possible interpretations
option_tuples = self._get_option_tuples(arg_string)
# if multiple actions match, the option string was ambiguous
if len(option_tuples) > 1:
options = ', '.join([option_string
for action, option_string, explicit_arg in option_tuples])
args = {'option': arg_string, 'matches': options}
msg = _('ambiguous option: %(option)s could match %(matches)s')
self.error(msg % args)
# if exactly one action matched, this segmentation is good,
# so return the parsed action
elif len(option_tuples) == 1:
option_tuple, = option_tuples
return option_tuple
....
```
### in python3.8.10 argparse.py
```Python
def _parse_optional(self, arg_string):
....
# search through all possible prefixes of the option string
# and all actions in the parser for possible interpretations
option_tuples = self._get_option_tuples(arg_string)
# if multiple actions match, the option string was ambiguous
if len(option_tuples) > 1:
options = ', '.join([option_string
for action, option_string, explicit_arg in option_tuples])
args = {'option': arg_string, 'matches': options}
msg = _('ambiguous option: %(option)s could match %(matches)s')
self.error(msg % args)
# if exactly one action matched, this segmentation is good,
# so return the parsed action
elif len(option_tuples) == 1:
option_tuple, = option_tuples
return option_tuple
....
```
<!-- gh-linked-prs -->
### Linked PRs
* gh-124340
* gh-124749
* gh-124750
<!-- /gh-linked-prs -->
| 49e105f9488de18d3d92948232fcbd956cbe0c6e | d08c7888229e78533648191dfe42e2d2d3ecea25 |
python/cpython | python__cpython-105047 | # Certain Tkinter tests fail on Tk 8.7 due to TIP 577
One of the broader effects of [TIP 577](https://core.tcl-lang.org/tips/doc/trunk/tip/577.md) on Tk 8.7 (and to a greater degree under Tcl 9.0 than Tcl 8.7) is that it makes widget commands/options/etc. more consistent about using empty string as the reserved “none” value, rather than e.g. `"none"` or `-1`. python/cpython#103685 was one instance of this.
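Under Tk 8.7 several error messages gain an `or ""` clause; one version-tolerant approach for tests (an illustrative sketch, not necessarily CPython's eventual fix) is an `assertRaisesRegex` pattern that accepts both the 8.6 and 8.7 wordings:

```python
import re

# Tk 8.6 omits the 'or ""' clause; Tk 8.7 includes it.
pattern = r'expected floating-point number( or "")? but got "abcd"'

old_msg = 'expected floating-point number but got "abcd"'
new_msg = 'expected floating-point number or "" but got "abcd"'
assert re.search(pattern, old_msg)
assert re.search(pattern, new_msg)
```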
Several options already accepted the empty string under Tk 8.6, but this was not reflected in the `bad option…` error messages. Tk 8.7 corrects this, which breaks several Tkinter tests with hardcoded error messages:
```
======================================================================
FAIL: test_place_configure_relheight (test.test_tkinter.test_geometry_managers.PlaceTest.test_place_configure_relheight)
----------------------------------------------------------------------
_tkinter.TclError: expected floating-point number or "" but got "abcd"
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/Users/user/git/cpython/Lib/test/test_tkinter/test_geometry_managers.py", line 426, in test_place_configure_relheight
with self.assertRaisesRegex(TclError, 'expected floating-point number '
AssertionError: "expected floating-point number but got "abcd"" does not match "expected floating-point number or "" but got "abcd""
======================================================================
FAIL: test_place_configure_relwidth (test.test_tkinter.test_geometry_managers.PlaceTest.test_place_configure_relwidth)
----------------------------------------------------------------------
_tkinter.TclError: expected floating-point number or "" but got "abcd"
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/Users/user/git/cpython/Lib/test/test_tkinter/test_geometry_managers.py", line 414, in test_place_configure_relwidth
with self.assertRaisesRegex(TclError, 'expected floating-point number '
AssertionError: "expected floating-point number but got "abcd"" does not match "expected floating-point number or "" but got "abcd""
======================================================================
FAIL: test_configure_overrelief (test.test_tkinter.test_widgets.ButtonTest.test_configure_overrelief)
----------------------------------------------------------------------
Traceback (most recent call last):
File "/Users/user/git/cpython/Lib/test/test_tkinter/widget_tests.py", line 448, in test_configure_overrelief
self.checkReliefParam(widget, 'overrelief')
File "/Users/user/git/cpython/Lib/test/test_tkinter/widget_tests.py", line 161, in checkReliefParam
self.checkInvalidParam(widget, name, 'spam',
File "/Users/user/git/cpython/Lib/test/test_tkinter/widget_tests.py", line 63, in checkInvalidParam
self.assertEqual(str(cm.exception), errmsg)
AssertionError: 'bad relief "spam": must be flat, groove, raised, ridge, solid, sunken, or ""' != 'bad relief "spam": must be flat, groove, raised, ridge, solid, or sunken'
- bad relief "spam": must be flat, groove, raised, ridge, solid, sunken, or ""
? -------
+ bad relief "spam": must be flat, groove, raised, ridge, solid, or sunken
? +++
======================================================================
FAIL: test_configure_overrelief (test.test_tkinter.test_widgets.CheckbuttonTest.test_configure_overrelief)
----------------------------------------------------------------------
Traceback (most recent call last):
File "/Users/user/git/cpython/Lib/test/test_tkinter/widget_tests.py", line 448, in test_configure_overrelief
self.checkReliefParam(widget, 'overrelief')
File "/Users/user/git/cpython/Lib/test/test_tkinter/widget_tests.py", line 161, in checkReliefParam
self.checkInvalidParam(widget, name, 'spam',
File "/Users/user/git/cpython/Lib/test/test_tkinter/widget_tests.py", line 63, in checkInvalidParam
self.assertEqual(str(cm.exception), errmsg)
AssertionError: 'bad relief "spam": must be flat, groove, raised, ridge, solid, sunken, or ""' != 'bad relief "spam": must be flat, groove, raised, ridge, solid, or sunken'
- bad relief "spam": must be flat, groove, raised, ridge, solid, sunken, or ""
? -------
+ bad relief "spam": must be flat, groove, raised, ridge, solid, or sunken
? +++
======================================================================
FAIL: test_configure_proxyrelief (test.test_tkinter.test_widgets.PanedWindowTest.test_configure_proxyrelief)
----------------------------------------------------------------------
Traceback (most recent call last):
File "/Users/user/git/cpython/Lib/test/test_tkinter/support.py", line 94, in newtest
test(self)
File "/Users/user/git/cpython/Lib/test/test_tkinter/test_widgets.py", line 1245, in test_configure_proxyrelief
self.checkReliefParam(widget, 'proxyrelief')
File "/Users/user/git/cpython/Lib/test/test_tkinter/widget_tests.py", line 161, in checkReliefParam
self.checkInvalidParam(widget, name, 'spam',
File "/Users/user/git/cpython/Lib/test/test_tkinter/widget_tests.py", line 63, in checkInvalidParam
self.assertEqual(str(cm.exception), errmsg)
AssertionError: 'bad relief "spam": must be flat, groove, raised, ridge, solid, sunken, or ""' != 'bad relief "spam": must be flat, groove, raised, ridge, solid, or sunken'
- bad relief "spam": must be flat, groove, raised, ridge, solid, sunken, or ""
? -------
+ bad relief "spam": must be flat, groove, raised, ridge, solid, or sunken
? +++
======================================================================
FAIL: test_configure_overrelief (test.test_tkinter.test_widgets.RadiobuttonTest.test_configure_overrelief)
----------------------------------------------------------------------
Traceback (most recent call last):
File "/Users/user/git/cpython/Lib/test/test_tkinter/widget_tests.py", line 448, in test_configure_overrelief
self.checkReliefParam(widget, 'overrelief')
File "/Users/user/git/cpython/Lib/test/test_tkinter/widget_tests.py", line 161, in checkReliefParam
self.checkInvalidParam(widget, name, 'spam',
File "/Users/user/git/cpython/Lib/test/test_tkinter/widget_tests.py", line 63, in checkInvalidParam
self.assertEqual(str(cm.exception), errmsg)
AssertionError: 'bad relief "spam": must be flat, groove, raised, ridge, solid, sunken, or ""' != 'bad relief "spam": must be flat, groove, raised, ridge, solid, or sunken'
- bad relief "spam": must be flat, groove, raised, ridge, solid, sunken, or ""
? -------
+ bad relief "spam": must be flat, groove, raised, ridge, solid, or sunken
? +++
```
In a few other cases, the empty string has become a valid input, so tests which expect an exception to be raised now fail instead:
```
======================================================================
FAIL: test_configure_underline (test.test_tkinter.test_widgets.ButtonTest.test_configure_underline)
----------------------------------------------------------------------
Traceback (most recent call last):
File "/Users/user/git/cpython/Lib/test/test_tkinter/widget_tests.py", line 418, in test_configure_underline
self.checkIntegerParam(widget, 'underline', 0, 1, 10)
File "/Users/user/git/cpython/Lib/test/test_tkinter/widget_tests.py", line 77, in checkIntegerParam
self.checkInvalidParam(widget, name, '',
File "/Users/user/git/cpython/Lib/test/test_tkinter/widget_tests.py", line 60, in checkInvalidParam
with self.assertRaises(tkinter.TclError) as cm:
AssertionError: TclError not raised
======================================================================
FAIL: test_configure_underline (test.test_tkinter.test_widgets.CheckbuttonTest.test_configure_underline)
----------------------------------------------------------------------
Traceback (most recent call last):
File "/Users/user/git/cpython/Lib/test/test_tkinter/widget_tests.py", line 418, in test_configure_underline
self.checkIntegerParam(widget, 'underline', 0, 1, 10)
File "/Users/user/git/cpython/Lib/test/test_tkinter/widget_tests.py", line 77, in checkIntegerParam
self.checkInvalidParam(widget, name, '',
File "/Users/user/git/cpython/Lib/test/test_tkinter/widget_tests.py", line 60, in checkInvalidParam
with self.assertRaises(tkinter.TclError) as cm:
AssertionError: TclError not raised
======================================================================
FAIL: test_configure_underline (test.test_tkinter.test_widgets.LabelTest.test_configure_underline)
----------------------------------------------------------------------
Traceback (most recent call last):
File "/Users/user/git/cpython/Lib/test/test_tkinter/widget_tests.py", line 418, in test_configure_underline
self.checkIntegerParam(widget, 'underline', 0, 1, 10)
File "/Users/user/git/cpython/Lib/test/test_tkinter/widget_tests.py", line 77, in checkIntegerParam
self.checkInvalidParam(widget, name, '',
File "/Users/user/git/cpython/Lib/test/test_tkinter/widget_tests.py", line 60, in checkInvalidParam
with self.assertRaises(tkinter.TclError) as cm:
AssertionError: TclError not raised
======================================================================
FAIL: test_configure_underline (test.test_tkinter.test_widgets.MenubuttonTest.test_configure_underline)
----------------------------------------------------------------------
Traceback (most recent call last):
File "/Users/user/git/cpython/Lib/test/test_tkinter/widget_tests.py", line 418, in test_configure_underline
self.checkIntegerParam(widget, 'underline', 0, 1, 10)
File "/Users/user/git/cpython/Lib/test/test_tkinter/widget_tests.py", line 77, in checkIntegerParam
self.checkInvalidParam(widget, name, '',
File "/Users/user/git/cpython/Lib/test/test_tkinter/widget_tests.py", line 60, in checkInvalidParam
with self.assertRaises(tkinter.TclError) as cm:
AssertionError: TclError not raised
======================================================================
FAIL: test_configure_underline (test.test_tkinter.test_widgets.OptionMenuTest.test_configure_underline)
----------------------------------------------------------------------
Traceback (most recent call last):
File "/Users/user/git/cpython/Lib/test/test_tkinter/widget_tests.py", line 418, in test_configure_underline
self.checkIntegerParam(widget, 'underline', 0, 1, 10)
File "/Users/user/git/cpython/Lib/test/test_tkinter/widget_tests.py", line 77, in checkIntegerParam
self.checkInvalidParam(widget, name, '',
File "/Users/user/git/cpython/Lib/test/test_tkinter/widget_tests.py", line 60, in checkInvalidParam
with self.assertRaises(tkinter.TclError) as cm:
AssertionError: TclError not raised
======================================================================
FAIL: test_configure_underline (test.test_tkinter.test_widgets.RadiobuttonTest.test_configure_underline)
----------------------------------------------------------------------
Traceback (most recent call last):
File "/Users/user/git/cpython/Lib/test/test_tkinter/widget_tests.py", line 418, in test_configure_underline
self.checkIntegerParam(widget, 'underline', 0, 1, 10)
File "/Users/user/git/cpython/Lib/test/test_tkinter/widget_tests.py", line 77, in checkIntegerParam
self.checkInvalidParam(widget, name, '',
File "/Users/user/git/cpython/Lib/test/test_tkinter/widget_tests.py", line 60, in checkInvalidParam
with self.assertRaises(tkinter.TclError) as cm:
AssertionError: TclError not raised
```
<!-- gh-linked-prs -->
### Linked PRs
* gh-105047
* gh-120824
* gh-120864
* gh-120865
<!-- /gh-linked-prs -->
| fba73698240660d9154b6917b87dd333d6fb8284 | deb39b5fef12741d99619f6d25f1af287de883d0 |
python/cpython | python__cpython-105005 | # test_venv spams our AddressSanitizer CI with 5mb of test logs
`test_venv` is spamming our AddressSanitizer CI runs with 5 MB (50,000 lines) of LeakSanitizer reports, likely because it does not pass the `ASAN_OPTIONS` environment variable through to the child Python processes it spawns.
If the test isn't useful under AddressSanitizer, we should just skip it on ASAN runs. Otherwise, we need to make sure the relevant environment is plumbed through.
This makes AddressSanitizer logs next to impossible to read through GitHub's CI interface in a browser; you have to download the logs to make sense of them.
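One way to avoid losing the sanitizer configuration is to copy the parent environment when spawning child interpreters. A minimal sketch of the idea (the `detect_leaks=0` default here is purely illustrative, not what the test suite actually uses):

```python
import os
import subprocess
import sys

# Copy the parent environment so sanitizer settings survive into the child;
# setdefault() keeps any ASAN_OPTIONS the CI job already exported.
env = os.environ.copy()
env.setdefault("ASAN_OPTIONS", "detect_leaks=0")

proc = subprocess.run(
    [sys.executable, "-c", "import os; print(os.environ.get('ASAN_OPTIONS', ''))"],
    env=env,
    capture_output=True,
    text=True,
)
print(proc.stdout.strip())
```

With this, the child interpreter sees the same `ASAN_OPTIONS` as the parent instead of falling back to LeakSanitizer's defaults.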
<!-- gh-linked-prs -->
### Linked PRs
* gh-105005
* gh-105006
<!-- /gh-linked-prs -->
| a17f160376955d369c8d332e1b1a90a6e18c852a | 46b52e6e2bda51d0b89a64ee36ce2d305a7409f3 |
python/cpython | python__cpython-104838 | # CI blocked: threading causing random tests to alter the execution environment and fail
Presumably caused by https://github.com/python/cpython/pull/104754?
<!-- gh-linked-prs -->
### Linked PRs
* gh-104838
<!-- /gh-linked-prs -->
| 4b56e56c495de58425ae3db5f4d8183127ee990b | 7f963bfc79a515dc9822ebddbfb1b5927d2dda09 |
python/cpython | python__cpython-104836 | # Remove unittest's deprecated getTestCaseNames, makeSuite, findTestCases
# Feature or enhancement
Prior to Python 3.11, `unittest`'s `getTestCaseNames`, `makeSuite` and `findTestCases` were commented:
> these functions should be considered obsolete
They were deprecated with warnings in 3.11 and scheduled for removal in 3.13.
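For reference, each removed helper has a direct `unittest.TestLoader` replacement. A quick sketch of the migration (the test class and method names here are illustrative):

```python
import unittest

class MyTests(unittest.TestCase):
    def test_one(self):
        self.assertTrue(True)

loader = unittest.TestLoader()

# unittest.getTestCaseNames(MyTests, 'test')  ->  TestLoader.getTestCaseNames()
names = loader.getTestCaseNames(MyTests)

# unittest.makeSuite(MyTests)  ->  TestLoader.loadTestsFromTestCase()
suite = loader.loadTestsFromTestCase(MyTests)

# unittest.findTestCases(module)  ->  TestLoader.loadTestsFromModule(module)

print(list(names))             # ['test_one']
print(suite.countTestCases())  # 1
```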
# Previous discussion
* gh-50096
<!-- gh-linked-prs -->
### Linked PRs
* gh-104836
<!-- /gh-linked-prs -->
| b1cb30ec8639e4e65f62e8f6cd44e979640431c8 | ded5f1f287674ad404ddd042745433388dc073a5 |
python/cpython | python__cpython-104846 | # Token lines in the tokenize module always end in new line
As a consequence of the changes to the Python tokenizer post-PEP 701, the `line` attribute now always ends in a newline character. We should remove the trailing newline, as it adds no information and wasn't there before.
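A small way to inspect the behaviour (whether a trailing `'\n'` shows up depends on the interpreter version, so this just prints the `.line` attribute of each token):

```python
import io
import tokenize

src = "x = 1"  # note: no trailing newline in the source
tokens = list(tokenize.generate_tokens(io.StringIO(src).readline))

# Every token's `line` attribute carries the full logical source line;
# after the fix it should match the source exactly, without an added '\n'.
for tok in tokens:
    print(tokenize.tok_name[tok.type], repr(tok.line))
```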
<!-- gh-linked-prs -->
### Linked PRs
* gh-104846
* gh-104850
* gh-104880
<!-- /gh-linked-prs -->
| c8cf9b42eb2bfbd4c3e708ec28d32430248a1d7a | c45701e9ef004a523ebb28f3be902b3cf2cf7a9b |
python/cpython | python__cpython-104892 | # Strange import errors with Python 3.12 on Windows
I tried to test our project with Python 3.12 beta 1 on Windows but everything failed. After some debugging I noticed that module imports seem to fail when modules aren't on my C-drive:
```
C:\Users\peke>echo print(1) > test312.py
C:\Users\peke>py -3.12 -c "import test312"
1
C:\Users\peke>e:
E:\>echo print(1) > test312.py
E:\>py -3.12 -c "import test312"
Traceback (most recent call last):
File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'test312'
```
No problems with earlier Python versions:
```
E:\>py -3.11 -c "import test312"
1
```
Not sure whether it matters, but I'm running Windows in VirtualBox, and that E-drive is mapped to a directory on the Linux host.
<!-- gh-linked-prs -->
### Linked PRs
* gh-104892
* gh-104905
<!-- /gh-linked-prs -->
| 6031727a37c6003f78e3b0c7414a0a214855dd08 | 087c1a6539eac330372a8b759bf3e2d472f6148a |
python/cpython | python__cpython-104813 | # Subinterpreter Pending Calls Never Run
A while back we moved the pending calls to `PyInterpreterState`. Then in 2020 we made it so pending calls would only run in the main thread of the main interpreter. Doing so renders the per-interpreter pending calls state pointless and prevents us from making use of pending calls for subinterpreters.
<!-- gh-linked-prs -->
### Linked PRs
* gh-104813
* gh-105752
* gh-105761
<!-- /gh-linked-prs -->
| 757b402ea1c2c6b925a55a08fd844b065b6e082f | 4e80082723b768df124f77d2b73b3ba6b584a735 |
python/cpython | python__cpython-104816 | # Remove deprecated webbrowser.MacOSX class
# Feature or enhancement
The `webbrowser.MacOSX` class was deprecated in Python 3.11 and scheduled for removal in 3.13.
Let's remove it. PR to follow.
# Previous discussion
* GH-86421
<!-- gh-linked-prs -->
### Linked PRs
* gh-104816
<!-- /gh-linked-prs -->
| 5ab4bc05c459f25c9d1dcb9b20f10bbd7e6eae5e | afa759fb800be416f69e3e9c9b3efe68006316f5 |
python/cpython | python__cpython-104805 | # Allow detecting Dev Drive on Windows
Windows just announced a new [Dev Drive](https://learn.microsoft.com/en-us/windows/dev-drive/) feature, optimised for high I/O scenarios such as build and test. It also works as a very clear signal that the user is a developer and is doing developer-like tasks.
We should add a function to allow querying whether a specific path is on a Dev Drive. The API is relatively low level, and cannot currently be used from Python, but would allow Python apps to detect when the user is operating on a Dev Drive (e.g. installing or compiling something on one), or choose or offer a more performant temporary or cache location than the user directory.
(For a variety of mostly compatibility reasons, there's no way for Windows to redirect `%TEMP%` onto a Dev Drive, but apps that are aware of it can do it for themselves.)
<!-- gh-linked-prs -->
### Linked PRs
* gh-104805
* gh-105054
<!-- /gh-linked-prs -->
| bfd20d257e4ad16a25f4bac0ea4dbb719cdf6bc7 | e92ac0a741b125f1cffe8c07b054d1dea7b0a05a |
python/cpython | python__cpython-104828 | # PEP-695: Potentially breaking changes made to `__match_args__` attributes of AST nodes
Consider the following script:
```py
import ast
def test(node):
match node:
case ast.FunctionDef("foo", ast.arguments(args=[ast.arg("bar")])):
print('matched! :)')
case _:
print("Didn't match :(")
source = ast.parse("def foo(bar): pass")
node = source.body[0]
assert isinstance(node, ast.FunctionDef)
test(node)
```
Running this script on 3.11 gets you this output:
```
>python repro.py
matched! :)
```
Running this script on CPython `main`, however, gets you this output:
```
>python repro.py
Didn't match :(
```
The reason for this is that the implementation of PEP-695 (new in Python 3.12) added a number of new AST nodes to Python, and as a result, the `__match_args__` attributes of `ast.FunctionDef`, `ast.AsyncFunctionDef` and `ast.ClassDef` are all different on 3.12 compared to what they were on 3.11.
<details>
<summary>`__match_args__` attributes on 3.11:</summary>
```pycon
>>> import ast
>>> for node in ast.ClassDef, ast.FunctionDef, ast.AsyncFunctionDef:
... print(node.__match_args__)
...
('name', 'bases', 'keywords', 'body', 'decorator_list')
('name', 'args', 'body', 'decorator_list', 'returns', 'type_comment')
('name', 'args', 'body', 'decorator_list', 'returns', 'type_comment')
```
</details>
<details>
<summary>`__match_args__` attributes on 3.12:</summary>
```pycon
>>> import ast
>>> for node in ast.ClassDef, ast.FunctionDef, ast.AsyncFunctionDef:
... print(node.__match_args__)
...
('name', 'type_params', 'bases', 'keywords', 'body', 'decorator_list')
('name', 'type_params', 'args', 'body', 'decorator_list', 'returns', 'type_comment')
('name', 'type_params', 'args', 'body', 'decorator_list', 'returns', 'type_comment')
```
</details>
This feels like it has the potential to be quite a breaking change for people using pattern-matching to parse ASTs. It would probably be okay if `type_params` had been added as the final item in the `__match_args__` tuples, but at the moment it comes in second place.
Cc. @JelleZijlstra for PEP-695. Also curious if @brandtbucher has any thoughts (for pattern-matching expertise) or @isidentical (for `ast`-module expertise).
<!-- gh-linked-prs -->
### Linked PRs
* gh-104828
* gh-104834
* gh-104974
* gh-105213
* gh-105846
* gh-105862
<!-- /gh-linked-prs -->
| ba73473f4c18ba4cf7ab18d84d94a47d2d37a0c5 | 6e1eccdcce5ea3bf1ef9d326d20ef9df21262c6b |
python/cpython | python__cpython-104827 | # Cannot use multiple inheritance with `collections.abc.Buffer` and `typing.Protocol`
Various stdlib classes are treated as protocols by type checkers, but are actually ABCs at runtime (for performance reasons). Examples include `contextlib.AbstractContextManager` and `collections.abc.Iterable`. These classes are special-cased in `typing.py` to allow for multiple inheritance with `typing.Protocol`, so that the interface can be extended:
```pycon
>>> from contextlib import AbstractContextManager
>>> from typing import Protocol
>>> class Foo(AbstractContextManager, Protocol):
... def extra_method(self) -> None: ...
...
>>>
```
`collections.abc.Buffer` is a new-in-3.12 class that, like `AbstractContextManager` and `Iterable`, is an ABC at runtime but will be treated by type checkers as if it were a `Protocol`. However, multiple inheritance with `collections.abc.Buffer` and `typing.Protocol` currently fails:
```pycon
>>> class Bar(Buffer, Protocol):
... def extra_method(self) -> None: ...
...
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "C:\Users\alexw\coding\cpython\Lib\abc.py", line 106, in __new__
cls = super().__new__(mcls, name, bases, namespace, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\alexw\coding\cpython\Lib\typing.py", line 1916, in __init_subclass__
raise TypeError('Protocols can only inherit from other'
TypeError: Protocols can only inherit from other protocols, got <class 'collections.abc.Buffer'>
```
I think `Buffer` should be special-cased in the same way as `AbstractContextManager` and `Iterable`. It needs to be added to this mapping, I think:
https://github.com/python/cpython/blob/ddb14859535ab8091381b9d0baf32dbe245b5e65/Lib/typing.py#L1740-L1746
Cc. @JelleZijlstra for PEP-688
<!-- gh-linked-prs -->
### Linked PRs
* gh-104827
* gh-104841
<!-- /gh-linked-prs -->
| c0ab7d401c736c37bf4462eef7c7d69fef8fab93 | 4b56e56c495de58425ae3db5f4d8183127ee990b |