| id | repo | title | body | labels | priority | severity |
|---|---|---|---|---|---|---|
2,631,555,914 | ui | [bug]: error using sidebar on mobile | ### Describe the bug
## Error:
When the sidebar is opened, it returns an error.



## Solution:
As the Sheet component uses the Dialog under the hood, I believe that just adding a `SheetTitle` (optionally wrapped in `VisuallyHidden`) would fix it.

### Affected component/components
Sidebar
### How to reproduce
1. Create a SideBar component
2. Access the page at a mobile resolution
3. Open the SideBar
### Codesandbox/StackBlitz link
_No response_
### Logs
```bash
`DialogContent` requires a `DialogTitle` for the component to be accessible for screen reader users.
If you want to hide the `DialogTitle`, you can wrap it with our VisuallyHidden component.
For more information, see https://radix-ui.com/primitives/docs/components/dialog
```
### System Info
```bash
Windows 11 Pro
Brave
```
### Before submitting
- [X] I've made research efforts and searched the documentation
- [X] I've searched for existing issues | bug | low | Critical |
2,631,589,298 | vscode | Conditional breakpoint is not being hit with source-mapped TypeScript variables in VSCode |
Type: <b>Bug</b>
Conditional breakpoints in VSCode are not being hit when set on TypeScript variables that are source-mapped to slightly different names in the generated JavaScript code. However, these breakpoints work correctly in the browser debuggers (both Firefox and Chrome).
Reproduction repository: https://github.com/Luke265/vscode-cond-brk-issue
Tested on Windows 10 and Ubuntu 24 (fresh install)
VS Code version: Code 1.95.1 (65edc4939843c90c34d61f4ce11704f09d3e5cb6, 2024-10-31T05:14:54.222Z)
OS version: Windows_NT x64 10.0.19045
Modes:
<details>
<summary>System Info</summary>
|Item|Value|
|---|---|
|CPUs|Intel(R) Core(TM) i7-4790K CPU @ 4.00GHz (8 x 3999)|
|GPU Status|2d_canvas: enabled<br>canvas_oop_rasterization: enabled_on<br>direct_rendering_display_compositor: disabled_off_ok<br>gpu_compositing: enabled<br>multiple_raster_threads: enabled_on<br>opengl: enabled_on<br>rasterization: enabled<br>raw_draw: disabled_off_ok<br>skia_graphite: disabled_off<br>video_decode: enabled<br>video_encode: enabled<br>vulkan: disabled_off<br>webgl: enabled<br>webgl2: enabled<br>webgpu: enabled<br>webnn: disabled_off|
|Load (avg)|undefined|
|Memory (System)|31.95GB (19.51GB free)|
|Process Argv|--crash-reporter-id 30abc7bb-2472-4e0a-aa9d-4f4bba929bd9|
|Screen Reader|no|
|VM|0%|
</details><details><summary>Extensions (1)</summary>
Extension|Author (truncated)|Version
---|---|---
vscode-firefox-debug|fir|2.9.10
</details><details>
<summary>A/B Experiments</summary>
```
vsliv368:30146709
vspor879:30202332
vspor708:30202333
vspor363:30204092
vscod805cf:30301675
binariesv615:30325510
vsaa593:30376534
py29gd2263:31024239
c4g48928:30535728
azure-dev_surveyone:30548225
962ge761:30959799
pythongtdpath:30769146
pythonnoceb:30805159
asynctok:30898717
pythonmypyd1:30879173
h48ei257:31000450
pythontbext0:30879054
cppperfnew:31000557
dsvsc020:30976470
pythonait:31006305
dsvsc021:30996838
bdiig495:31013172
dvdeprecation:31068756
dwnewjupytercf:31046870
newcmakeconfigv2:31071590
impr_priority:31102340
nativerepl1:31139838
refactort:31108082
pythonrstrctxt:31112756
cf971741:31144450
iacca1:31171482
notype1:31157159
5fd0e150:31155592
dwcopilot:31170013
```
</details>
<!-- generated by issue reporter --> | feature-request,debug | low | Critical |
2,631,592,260 | langchain | Error when I try Cohere with Bedrock | ### Checked other resources
- [X] I added a very descriptive title to this issue.
- [X] I searched the LangChain documentation with the integrated search.
- [X] I used the GitHub search to find a similar question and didn't find it.
- [X] I am sure that this is a bug in LangChain rather than my code.
- [X] The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).
### Example Code
This code
```
from langchain_aws import BedrockLLM
llm = BedrockLLM( region_name="us-east-1", model_id="cohere.command-r-plus-v1:0")
llm.invoke(input="What is the recipe of mayonnaise?")
```
gave the following stack trace.
### Error Message and Stack Trace (if applicable)
ERROR:root:Error raised by bedrock service: An error occurred (ValidationException) when calling the InvokeModel operation: Malformed input request: #: extraneous key [prompt] is not permitted, please reformat your input and try again.
---------------------------------------------------------------------------
ValidationException Traceback (most recent call last)
Cell In[27], line 5
1 from langchain_aws import BedrockLLM
3 llm = BedrockLLM( region_name="us-east-1", model_id="cohere.command-r-plus-v1:0")
----> 5 llm.invoke(input="What is the recipe of mayonnaise?")
File [~/Library/Caches/pypoetry/virtualenvs/ai-retail-2HZsdhoR-py3.11/lib/python3.11/site-packages/langchain_core/language_models/llms.py:390](http://localhost:8888/lab/workspaces/auto-e/tree/~/Library/Caches/pypoetry/virtualenvs/ai-retail-2HZsdhoR-py3.11/lib/python3.11/site-packages/langchain_core/language_models/llms.py#line=389), in BaseLLM.invoke(self, input, config, stop, **kwargs)
380 def invoke(
381 self,
382 input: LanguageModelInput,
(...)
386 **kwargs: Any,
387 ) -> str:
388 config = ensure_config(config)
389 return (
--> 390 self.generate_prompt(
391 [self._convert_input(input)],
392 stop=stop,
393 callbacks=config.get("callbacks"),
394 tags=config.get("tags"),
395 metadata=config.get("metadata"),
396 run_name=config.get("run_name"),
397 run_id=config.pop("run_id", None),
398 **kwargs,
399 )
400 .generations[0][0]
401 .text
402 )
File [~/Library/Caches/pypoetry/virtualenvs/ai-retail-2HZsdhoR-py3.11/lib/python3.11/site-packages/langchain_core/language_models/llms.py:755](http://localhost:8888/lab/workspaces/auto-e/tree/~/Library/Caches/pypoetry/virtualenvs/ai-retail-2HZsdhoR-py3.11/lib/python3.11/site-packages/langchain_core/language_models/llms.py#line=754), in BaseLLM.generate_prompt(self, prompts, stop, callbacks, **kwargs)
747 def generate_prompt(
748 self,
749 prompts: list[PromptValue],
(...)
752 **kwargs: Any,
753 ) -> LLMResult:
754 prompt_strings = [p.to_string() for p in prompts]
--> 755 return self.generate(prompt_strings, stop=stop, callbacks=callbacks, **kwargs)
File [~/Library/Caches/pypoetry/virtualenvs/ai-retail-2HZsdhoR-py3.11/lib/python3.11/site-packages/langchain_core/language_models/llms.py:950](http://localhost:8888/lab/workspaces/auto-e/tree/~/Library/Caches/pypoetry/virtualenvs/ai-retail-2HZsdhoR-py3.11/lib/python3.11/site-packages/langchain_core/language_models/llms.py#line=949), in BaseLLM.generate(self, prompts, stop, callbacks, tags, metadata, run_name, run_id, **kwargs)
935 if (self.cache is None and get_llm_cache() is None) or self.cache is False:
936 run_managers = [
937 callback_manager.on_llm_start(
938 self._serialized,
(...)
948 )
949 ]
--> 950 output = self._generate_helper(
951 prompts, stop, run_managers, bool(new_arg_supported), **kwargs
952 )
953 return output
954 if len(missing_prompts) > 0:
File [~/Library/Caches/pypoetry/virtualenvs/ai-retail-2HZsdhoR-py3.11/lib/python3.11/site-packages/langchain_core/language_models/llms.py:792](http://localhost:8888/lab/workspaces/auto-e/tree/~/Library/Caches/pypoetry/virtualenvs/ai-retail-2HZsdhoR-py3.11/lib/python3.11/site-packages/langchain_core/language_models/llms.py#line=791), in BaseLLM._generate_helper(self, prompts, stop, run_managers, new_arg_supported, **kwargs)
790 for run_manager in run_managers:
791 run_manager.on_llm_error(e, response=LLMResult(generations=[]))
--> 792 raise e
793 flattened_outputs = output.flatten()
794 for manager, flattened_output in zip(run_managers, flattened_outputs):
File [~/Library/Caches/pypoetry/virtualenvs/ai-retail-2HZsdhoR-py3.11/lib/python3.11/site-packages/langchain_core/language_models/llms.py:779](http://localhost:8888/lab/workspaces/auto-e/tree/~/Library/Caches/pypoetry/virtualenvs/ai-retail-2HZsdhoR-py3.11/lib/python3.11/site-packages/langchain_core/language_models/llms.py#line=778), in BaseLLM._generate_helper(self, prompts, stop, run_managers, new_arg_supported, **kwargs)
769 def _generate_helper(
770 self,
771 prompts: list[str],
(...)
775 **kwargs: Any,
776 ) -> LLMResult:
777 try:
778 output = (
--> 779 self._generate(
780 prompts,
781 stop=stop,
782 # TODO: support multiple run managers
783 run_manager=run_managers[0] if run_managers else None,
784 **kwargs,
785 )
786 if new_arg_supported
787 else self._generate(prompts, stop=stop)
788 )
789 except BaseException as e:
790 for run_manager in run_managers:
File [~/Library/Caches/pypoetry/virtualenvs/ai-retail-2HZsdhoR-py3.11/lib/python3.11/site-packages/langchain_core/language_models/llms.py:1502](http://localhost:8888/lab/workspaces/auto-e/tree/~/Library/Caches/pypoetry/virtualenvs/ai-retail-2HZsdhoR-py3.11/lib/python3.11/site-packages/langchain_core/language_models/llms.py#line=1501), in LLM._generate(self, prompts, stop, run_manager, **kwargs)
1499 new_arg_supported = inspect.signature(self._call).parameters.get("run_manager")
1500 for prompt in prompts:
1501 text = (
-> 1502 self._call(prompt, stop=stop, run_manager=run_manager, **kwargs)
1503 if new_arg_supported
1504 else self._call(prompt, stop=stop, **kwargs)
1505 )
1506 generations.append([Generation(text=text)])
1507 return LLMResult(generations=generations)
File [~/Library/Caches/pypoetry/virtualenvs/ai-retail-2HZsdhoR-py3.11/lib/python3.11/site-packages/langchain_aws/llms/bedrock.py:1206](http://localhost:8888/lab/workspaces/auto-e/tree/~/Library/Caches/pypoetry/virtualenvs/ai-retail-2HZsdhoR-py3.11/lib/python3.11/site-packages/langchain_aws/llms/bedrock.py#line=1205), in BedrockLLM._call(self, prompt, stop, run_manager, **kwargs)
1200 run_manager.on_llm_end(
1201 LLMResult(generations=[all_generations], llm_output=llm_output)
1202 )
1204 return completion
-> 1206 text, tool_calls, llm_output = self._prepare_input_and_invoke(
1207 prompt=prompt, stop=stop, run_manager=run_manager, **kwargs
1208 )
1209 if run_manager is not None:
1210 run_manager.on_llm_end(
1211 LLMResult(generations=[[Generation(text=text)]], llm_output=llm_output)
1212 )
File [~/Library/Caches/pypoetry/virtualenvs/ai-retail-2HZsdhoR-py3.11/lib/python3.11/site-packages/langchain_aws/llms/bedrock.py:840](http://localhost:8888/lab/workspaces/auto-e/tree/~/Library/Caches/pypoetry/virtualenvs/ai-retail-2HZsdhoR-py3.11/lib/python3.11/site-packages/langchain_aws/llms/bedrock.py#line=839), in BedrockBase._prepare_input_and_invoke(self, prompt, system, messages, stop, run_manager, **kwargs)
838 if run_manager is not None:
839 run_manager.on_llm_error(e)
--> 840 raise e
842 if stop is not None:
843 text = enforce_stop_tokens(text, stop)
File [~/Library/Caches/pypoetry/virtualenvs/ai-retail-2HZsdhoR-py3.11/lib/python3.11/site-packages/langchain_aws/llms/bedrock.py:826](http://localhost:8888/lab/workspaces/auto-e/tree/~/Library/Caches/pypoetry/virtualenvs/ai-retail-2HZsdhoR-py3.11/lib/python3.11/site-packages/langchain_aws/llms/bedrock.py#line=825), in BedrockBase._prepare_input_and_invoke(self, prompt, system, messages, stop, run_manager, **kwargs)
823 request_options["trace"] = "ENABLED"
825 try:
--> 826 response = self.client.invoke_model(**request_options)
828 (
829 text,
830 tool_calls,
(...)
833 stop_reason,
834 ) = LLMInputOutputAdapter.prepare_output(provider, response).values()
836 except Exception as e:
File [~/Library/Caches/pypoetry/virtualenvs/ai-retail-2HZsdhoR-py3.11/lib/python3.11/site-packages/botocore/client.py:569](http://localhost:8888/lab/workspaces/auto-e/tree/~/Library/Caches/pypoetry/virtualenvs/ai-retail-2HZsdhoR-py3.11/lib/python3.11/site-packages/botocore/client.py#line=568), in ClientCreator._create_api_method.<locals>._api_call(self, *args, **kwargs)
565 raise TypeError(
566 f"{py_operation_name}() only accepts keyword arguments."
567 )
568 # The "self" in this scope is referring to the BaseClient.
--> 569 return self._make_api_call(operation_name, kwargs)
File [~/Library/Caches/pypoetry/virtualenvs/ai-retail-2HZsdhoR-py3.11/lib/python3.11/site-packages/botocore/client.py:1023](http://localhost:8888/lab/workspaces/auto-e/tree/~/Library/Caches/pypoetry/virtualenvs/ai-retail-2HZsdhoR-py3.11/lib/python3.11/site-packages/botocore/client.py#line=1022), in BaseClient._make_api_call(self, operation_name, api_params)
1019 error_code = error_info.get("QueryErrorCode") or error_info.get(
1020 "Code"
1021 )
1022 error_class = self.exceptions.from_code(error_code)
-> 1023 raise error_class(parsed_response, operation_name)
1024 else:
1025 return parsed_response
ValidationException: An error occurred (ValidationException) when calling the InvokeModel operation: Malformed input request: #: extraneous key [prompt] is not permitted, please reformat your input and try again.
### Description
I'm just trying to get Cohere Command R+ to work! I need it to summarize text and to be used in `create_stuff_documents_chain`.
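The error suggests that Bedrock rejects the `prompt` key for this model: Cohere Command R / R+ expose a chat-style API that expects a `message` field, while `BedrockLLM` builds a legacy text-completion payload. A minimal sketch of the suspected payload mismatch (the helper below is hypothetical, not LangChain code):

```python
import json

# Hypothetical helper contrasting the two Cohere payload shapes on Bedrock:
# legacy Command models accept "prompt", while Command R / R+ expect "message".
def build_cohere_body(model_id: str, text: str) -> dict:
    if "command-r" in model_id:
        return {"message": text}  # chat-style request body
    return {"prompt": text, "max_tokens": 256}  # legacy completion body

body = build_cohere_body("cohere.command-r-plus-v1:0", "What is the recipe of mayonnaise?")
print(json.dumps(body))  # {"message": "What is the recipe of mayonnaise?"}
```

If that is the cause, a chat wrapper such as `ChatBedrock` from `langchain_aws` may be the intended entry point for this model rather than `BedrockLLM`.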
### System Info
System Information
------------------
> OS: Darwin
> OS Version: Darwin Kernel Version 23.5.0: Wed May 1 20:12:58 PDT 2024; root:xnu-10063.121.3~5/RELEASE_ARM64_T6000
> Python Version: 3.11.9 (v3.11.9:de54cf5be3, Apr 2 2024, 07:12:50) [Clang 13.0.0 (clang-1300.0.29.30)]
Package Information
-------------------
> langchain_core: 0.3.12
> langchain: 0.3.3
> langchain_community: 0.3.2
> langsmith: 0.1.136
> langchain_aws: 0.2.2
> langchain_text_splitters: 0.3.0
> langgraph: 0.2.38
Optional packages not installed
-------------------------------
> langserve
Other Dependencies
------------------
> aiohttp: 3.10.10
> async-timeout: Installed. No version info available.
> boto3: 1.35.43
> dataclasses-json: 0.6.7
> httpx: 0.27.2
> jsonpatch: 1.33
> langgraph-checkpoint: 2.0.1
> langgraph-sdk: 0.1.33
> numpy: 1.26.4
> orjson: 3.10.7
> packaging: 24.1
> pydantic: 2.9.2
> pydantic-settings: 2.6.0
> PyYAML: 6.0.2
> requests: 2.32.3
> requests-toolbelt: 1.0.0
> SQLAlchemy: 2.0.36
> tenacity: 8.5.0
> typing-extensions: 4.12.2
| 🤖:bug | low | Critical |
2,631,603,084 | godot | .NET Export Ignores Solution Path and Incorrectly Appends Assembly Name for Projects in Subdirectories | ### Tested versions
Reproducible in: 4.3.stable.mono.official [77dcf97d8]
### System information
Godot v4.3.stable.mono - Windows 10.0.22631 - Vulkan (Mobile) - dedicated Radeon RX 580 Series (Advanced Micro Devices, Inc.; 26.20.13002.133) - AMD Ryzen 5 2600 Six-Core Processor (12 Threads)
### Issue description
During project export, Godot's .NET integration incorrectly constructs the solution file path for C# projects that exist as part of a larger solution. The export process appears to ignore the configured solution path and instead tries to find a solution file named after the assembly name, whereas normal project execution (F5) works correctly with the specified solution path.
The export process fails with the error:
```
This project contains C# files but no solution file was found at the following path:
C:/Repos/my-game/src/MyGame.Server/../../MyGame.sln/MyGame.Server.sln
```
The export process incorrectly:
1. Ignores the specified solution filename in the Solution Directory setting
2. Attempts to find a solution file named after the assembly name
3. Appends this assumed filename to the solution directory path
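The broken path in the error message is consistent with the following construction; this is a schematic reproduction in Python, not Godot's actual export code:

```python
import posixpath

# Schematic reproduction (not Godot's actual C# export code): the exporter
# treats the configured "Solution Directory" value as a directory and appends
# "<AssemblyName>.sln", yielding exactly the path from the error message.
def build_solution_path(project_dir, solution_dir_setting, assembly_name):
    return posixpath.join(project_dir, solution_dir_setting, assembly_name + ".sln")

path = build_solution_path(
    "C:/Repos/my-game/src/MyGame.Server", "../../MyGame.sln", "MyGame.Server"
)
print(path)
# C:/Repos/my-game/src/MyGame.Server/../../MyGame.sln/MyGame.Server.sln
```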
### Steps to reproduce
1. Create a new Godot 4.3 C# project in a subdirectory of a larger solution
2. Configure project settings:
```
Assembly Name: "MyGame.Server"
Solution Directory: "../../MyGame.sln"
```
3. Verify that the project runs correctly with F5
4. Attempt to export the project using any preset
5. Observe the error about not finding the solution file
### Workaround Steps
You can work around this issue by either:
1. Creating a server-specific solution in the Godot project directory:
```bash
cd src/MyGame.Server
dotnet new sln -n MyGame.Server
dotnet sln add MyGame.Server.csproj
dotnet sln add ../../src/MyGame.Shared/MyGame.Shared.csproj
dotnet sln add ../../src/MyGame.Shared.Godot/MyGame.Shared.Godot.csproj
```
2. Or creating a symbolic link in the Godot project directory:
```bash
cd src/MyGame.Server
mklink MyGame.Server.sln ..\..\MyGame.sln
```
Both workarounds are suboptimal as they:
- Introduce duplicate solution files or symbolic links
- Complicate source control and build processes
- May cause confusion for other developers
### Minimal reproduction project (MRP)
Not at the moment. | topic:dotnet | low | Critical |
2,631,613,738 | pytorch | DISABLED test_rng (__main__.TestCompilerBisector) | Platforms: linux, rocm, slow
This test was disabled because it is failing in CI. See [recent examples](https://hud.pytorch.org/flakytest?name=test_rng&suite=TestCompilerBisector&limit=100) and the most recent trunk [workflow logs](https://github.com/pytorch/pytorch/runs/32449134122).
Over the past 3 hours, it has been determined flaky in 11 workflow(s) with 22 failures and 11 successes.
**Debugging instructions (after clicking on the recent samples link):**
DO NOT ASSUME THINGS ARE OKAY IF THE CI IS GREEN. We now shield flaky tests from developers so CI will thus be green but it will be harder to parse the logs.
To find relevant log snippets:
1. Click on the workflow logs linked above
2. Click on the Test step of the job so that it is expanded. Otherwise, the grepping will not work.
3. Grep for `test_rng`
4. There should be several instances run (as flaky tests are rerun in CI) from which you can study the logs.
Test file path: `dynamo/test_compiler_bisector.py`
cc @clee2000 @ezyang @chauhang @penguinwu @voznesenskym @EikanWang @jgong5 @Guobing-Chen @XiaobingSuper @zhuhaozhe @blzheng @wenzhe-nrv @jiayisunx @chenyang78 @kadeng @amjames | triaged,module: flaky-tests,skipped,oncall: pt2,module: dynamo | medium | Critical |
2,631,619,911 | tauri | [bug] Tauri unilaterally overrides MACOSX_DEPLOYMENT_TARGET, causes sys crates to spuriously rebuild | ### Describe the bug
On macOS, there's a special env var `MACOSX_DEPLOYMENT_TARGET` that implicitly configures the system's C compiler, linker, and opts in to or out of breaking changes in macOS. Objects built for different deployment targets may be incompatible. The `cc` crate [knows this](https://github.com/rust-lang/cc-rs/blob/7a786c5e3dc149f32e565360dd8c3e43748592f5/src/lib.rs#L3766) and emits `cargo:rerun-if-env-changed=MACOSX_DEPLOYMENT_TARGET`, which invalidates cache of sys crates whenever `cargo build` sees a different value of this variable.
Unfortunately, something in Tauri is trying to override `MACOSX_DEPLOYMENT_TARGET` to `10.13`.
This causes two bugs:
1. Even if I set `MACOSX_DEPLOYMENT_TARGET` env var to a different value, the project is built for 10.13 anyway.
2. The override causes `cargo` to invalidate cache of sys crates, and rebuild `objc-sys` and all of `wry`, `tauri`, and its plugins, every time `cargo tauri dev` sees any file change!
> Dirty objc-sys v0.3.5: the env variable MACOSX_DEPLOYMENT_TARGET changed
I'm not sure where Cargo is getting another value from. I suspect that `rust-analyzer` is building the project in a way that doesn't support Tauri's way of overriding `MACOSX_DEPLOYMENT_TARGET`, or perhaps there's just a race condition in Cargo that compares an unset value from before the build start with a value set during the build.
The end result is that `cargo tauri dev` is painfully slow.
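The rebuild loop in bug 2 can be modeled as follows; this is a simplified illustration of how `rerun-if-env-changed` fingerprinting behaves, not cargo's real implementation:

```python
# Simplified model (not cargo's actual code) of rerun-if-env-changed: the
# build fingerprint records the env value seen at build time, and any
# mismatch on the next build marks the crate dirty and forces a rebuild.
def is_dirty(stored_env: dict, current_env: dict) -> bool:
    return any(current_env.get(name) != value for name, value in stored_env.items())

# A build outside `cargo tauri dev` (e.g. rust-analyzer) sees no override:
fingerprint = {"MACOSX_DEPLOYMENT_TARGET": None}
# `cargo tauri dev` injects the 10.13 override, so the crate looks dirty:
print(is_dirty(fingerprint, {"MACOSX_DEPLOYMENT_TARGET": "10.13"}))  # True
# With a stable value across both builds, nothing would be invalidated:
print(is_dirty({"MACOSX_DEPLOYMENT_TARGET": "10.13"},
               {"MACOSX_DEPLOYMENT_TARGET": "10.13"}))  # False
```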
### Reproduction
To reproduce, modify `src/main.rs` to add:
```rust
compile_error!(env!("MACOSX_DEPLOYMENT_TARGET"));
```
and run:
```bash
unset MACOSX_DEPLOYMENT_TARGET
cargo tauri dev
```
The expected result:
```text
error: environment variable `MACOSX_DEPLOYMENT_TARGET` not defined at compile time
```
The actual result:
```text
error: 10.13
```
----
Second test:
```bash
export MACOSX_DEPLOYMENT_TARGET=11.0
cargo tauri dev
```
The expected result:
```text
error: 11.0
```
The actual result:
```text
error: 10.13
```
### Expected behavior
_No response_
### Full `tauri info` output
```text
[✔] Environment
- OS: Mac OS 15.1.0 arm64 (X64)
✔ Xcode Command Line Tools: installed
✔ rustc: 1.82.0 (f6e511eec 2024-10-15)
✔ cargo: 1.82.0 (8f40fc59f 2024-08-21)
✔ rustup: 1.27.1 (54dd3d00f 2024-04-24)
✔ Rust toolchain: stable-aarch64-apple-darwin (environment override by RUSTUP_TOOLCHAIN)
- node: 23.1.0
- yarn: 1.22.22
- npm: 10.9.0
[-] Packages
- tauri 🦀: 2.0.6
- tauri-build 🦀: 2.0.2
- wry 🦀: 0.46.3
- tao 🦀: 0.30.5
- tauri-cli 🦀: 2.0.4
[-] Plugins
- tauri-plugin-shell 🦀: 2.0.2
[-] App
- build-type: bundle
- CSP: unset
- frontendDist: ../src
```
### Stack trace
_No response_
### Additional context
_No response_ | type: bug,platform: macOS | low | Critical |
2,631,701,476 | yt-dlp | SoundCloud Download Liked Tracks | ### DO NOT REMOVE OR SKIP THE ISSUE TEMPLATE
- [X] I understand that I will be **blocked** if I *intentionally* remove or skip any mandatory\* field
### Checklist
- [X] I'm requesting a site-specific feature
- [X] I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
- [X] I've checked that all provided URLs are playable in a browser with the same IP and same login details
- [X] I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
- [X] I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
- [ ] I've read about [sharing account credentials](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#are-you-willing-to-share-account-details-if-needed) and I'm willing to share it if required
### Region
_No response_
### Example URLs
https://soundcloud.com/you/likes
https://soundcloud.com/you/history
https://soundcloud.com/you/albums
https://soundcloud.com/you/following
https://soundcloud.com/you/library
### Provide a description that is worded well enough to be understood
When logged in as a valid user, add support for downloading a user's own liked tracks, albums, history, following, etc.
### Provide verbose output that clearly demonstrates the problem
- [X] Run **your** yt-dlp command with **-vU** flag added (`yt-dlp -vU <your command line>`)
- [ ] If using API, add `'verbose': True` to `YoutubeDL` params instead
- [X] Copy the WHOLE output (starting with `[debug] Command-line config`) and insert it below
### Complete Verbose Output
```shell
[debug] Command-line config: ['--cookies-from-browser', 'firefox', 'https://soundcloud.com/you/likes', '-vU']
[debug] Encodings: locale UTF-8, fs utf-8, pref UTF-8, out utf-8, error utf-8, screen utf-8
[debug] yt-dlp version master@2024.11.04.003339 from yt-dlp/yt-dlp-master-builds (linux_exe)
[debug] Python 3.11.10 (CPython x86_64 64bit) - Linux-6.6.54-1.qubes.fc37.x86_64-x86_64-with (OpenSSL 3.1.7 3 Sep 2024)
[debug] exe versions: ffmpeg 6.1.2 (fdk,setts), ffprobe 6.1.2
[debug] Optional libraries: Cryptodome-3.21.0, brotli-1.1.0, certifi-2024.08.30, curl_cffi-0.7.1, mutagen-1.47.0, requests-2.32.3, secretstorage-3.3.3, sqlite3-3.44.2, urllib3-2.2.3, websockets-13.1
[debug] Proxy map: {}
Extracting cookies from firefox
[debug] Extracting cookies from: "/home/user/.mozilla/firefox/330e5w6j.default-release/cookies.sqlite"
Extracted 18 cookies from firefox
[debug] Request Handlers: urllib, requests, websockets, curl_cffi
[debug] Loaded 1838 extractors
[debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp-master-builds/releases/latest
Latest version: master@2024.11.04.003339 from yt-dlp/yt-dlp-master-builds
yt-dlp is up to date (master@2024.11.04.003339 from yt-dlp/yt-dlp-master-builds)
[soundcloud:user] Verifying login token...
[soundcloud:user] Logging in
[soundcloud:user] Extracting URL: https://soundcloud.com/you/likes
[soundcloud:user] you: Downloading user info
ERROR: [soundcloud:user] Unable to download JSON metadata: HTTP Error 404: Not Found (caused by <HTTPError 404: Not Found>)
File "yt_dlp/extractor/common.py", line 742, in extract
File "yt_dlp/extractor/soundcloud.py", line 855, in _real_extract
File "yt_dlp/extractor/soundcloud.py", line 107, in _call_api
File "yt_dlp/extractor/common.py", line 1152, in download_content
File "yt_dlp/extractor/common.py", line 1112, in download_handle
File "yt_dlp/extractor/common.py", line 962, in _download_webpage_handle
File "yt_dlp/extractor/common.py", line 911, in _request_webpage
File "yt_dlp/extractor/common.py", line 898, in _request_webpage
File "yt_dlp/YoutubeDL.py", line 4162, in urlopen
File "yt_dlp/networking/common.py", line 117, in send
File "yt_dlp/networking/_helper.py", line 208, in wrapper
File "yt_dlp/networking/common.py", line 340, in send
File "yt_dlp/networking/_requests.py", line 365, in _send
yt_dlp.networking.exceptions.HTTPError: HTTP Error 404: Not Found
```
| site-enhancement,triage | low | Critical |
2,631,705,740 | TypeScript | When using typeof for the interface, it is hoped that the jsdoc comments of the expression can inherit typeof by default | ### 🔍 Search Terms
"typeof","jsdoc"
### ✅ Viability Checklist
- [x] This wouldn't be a breaking change in existing TypeScript/JavaScript code
- [x] This wouldn't change the runtime behavior of existing JavaScript code
- [x] This could be implemented without emitting different JS based on the types of the expressions
- [x] This isn't a runtime feature (e.g. library functionality, non-ECMAScript syntax with JavaScript output, new syntax sugar for JS, etc.)
- [x] This isn't a request to add a new utility type: https://github.com/microsoft/TypeScript/wiki/No-New-Utility-Types
- [x] This feature would agree with the rest of our Design Goals: https://github.com/Microsoft/TypeScript/wiki/TypeScript-Design-Goals
### ⭐ Suggestion
```
/** JSDoc1 */
const test = { a: 0 }
declare module 'vue' {
interface ComponentCustomProperties {
/** JSDoc2 */
testInInterface: typeof test,
}
}
const test2 = {
/** JSDoc3 */
test
}
```
I hope that when JSDoc2 or JSDoc3 is not written explicitly, `testInInterface` and `test2.test` can inherit the JSDoc1 comment from `const test`.
### 📃 Motivating Example
Interfaces support declaration merging, which allows developers to extend interfaces defined in a component library. In front-end development, this is often used to extend Vue's global components, directives, and properties.
Taking a Vue component as an example, both its global and local registration schemes currently lack JSDoc comments.
- global
```
import vueComponentWithJsDoc from './test.vue'
declare module 'vue' {
interface GlobalComponents {
vueComponentWithoutJsDoc: typeof vueComponentWithJsDoc
}
}
```
- local
```
import vueComponentWithJsDoc from './test.vue'
import { defineComponent } from 'vue'
export default defineComponent({
components: { vueComponentWithoutJsDoc: vueComponentWithJsDoc }
})
```
When using components in the template, both of the above schemes lack JSDoc comments
```
<template>
<vue-component-without-js-doc />
</template>
```
### 💻 Use Cases
1. JSDoc comments can be carried over when declaration-merging an interface
2. The type obtained via `typeof` currently loses the JSDoc annotation
3. Avoids having to manually copy a JSDoc comment onto each interface field
| Suggestion,Help Wanted,Experience Enhancement | low | Minor |
2,631,742,060 | godot | The custom mouse image set in the project prompts a memory leak when the game exits. | ### Tested versions
Setting a custom mouse image in the project settings triggers a memory-leak error when the game exits. Reproducible in the following versions:
v4.2.1.stable.official [b09f793f5]
v4.3.stable.official [77dcf97d8]
v4.4.dev3.official [f4af8201b]
### System information
Windows 11 - Godot v4.3.stable.official [77dcf97d8] (Forward+)
### Issue description
The following content was machine-translated; my English is not very good, so I apologize and hope you can understand. When I set a custom mouse image in the project settings and then, during the game, click the window's close button or call get_tree().quit(), the following error appears in the console:
```
ERROR: 1 RID allocations of type 'N10RendererRD14TextureStorage7TextureE' were leaked at exit.
ERROR: Parameter "RenderingServer::get_singleton()" is null.
at: ~CompressedTexture2D (scene/resources/compressed_texture.cpp:464)
WARNING: 2 RIDs of type "Texture" were leaked.
at: finalize (servers/rendering/rendering_device.cpp:5829)
```
The project in which I reproduced this error is an empty project, and the mouse texture uses the built-in default icon.svg. I am not a professional developer. After I removed the custom mouse image setting, the error no longer appeared, and I don't know how to solve this problem.
### Steps to reproduce
Set a custom mouse image in the project settings, then run the game, and exit through the window's close button or by using the get_tree().quit() function.
### Minimal reproduction project (MRP)
"N/A" | bug,platform:windows,needs testing,topic:input | low | Critical |
2,631,744,406 | pytorch | [RFC]: Adding Triton Backend for Aten operators | # [RFC] Aten Operators in Triton for Multi-backend support
## Abstract
This RFC discusses
1. the benefits and challenges of developing dispatch functions for Aten operators in Triton.
2. a practice in adding Triton backend functions to Aten operators.
## Motivation
Pytorch now has 2600+ entries in [native_functions.yaml](https://github.com/pytorch/pytorch/blob/14fc6b70ea61f18eeae40d96cdd1e035d74d6e62/aten/src/ATen/native/native_functions.yaml) (as of this RFC), about 600 more than was reported in this [post](https://dev-discuss.pytorch.org/t/where-do-the-2000-pytorch-operators-come-from-more-than-you-wanted-to-know/373). The large number of operators poses a challenge for GPU vendors and other hardware manufacturers who want to add a new backend for pytorch: an op-by-op implementation is needed for comprehensive operator support.
There have been efforts to decompose pytorch operators into a smaller set, or to lower the pytorch Aten IR into other IRs with smaller operator sets, such as FuncTorch, PrimTorch and torch inductor. However, these methods either work in a trace-and-compile manner or are too slow to run in eager mode; they sit above the pytorch dispatcher and are very different from the eager execution mode.
We propose an alternative solution to the problem: writing operators in a language with multi-backend support. In short, it is a [Write once, compile anywhere](https://en.wikipedia.org/wiki/Write_once,_compile_anywhere) approach. Kernels are written once and compiled into different binaries by different compiler backends. This solution is orthogonal to the ones listed above. Instead of reducing the number of operators by decomposing them into a small set, it offloads multi-backend support to the compiler. Since it integrates into pytorch at the Aten dispatcher, it works seamlessly with eager execution.
With a higher abstraction level (tile-oriented, CTA-level programming) and decent performance, Triton is getting more attention from deep learning developers. It is used in torch inductor as the code generation language, and also in many LLM training/inference libraries (lightllm, vllm, unsloth) and kernel libraries (Liger Kernels).
In addition, Triton is open-source and supports multiple backends. Thus, it is getting more support from accelerator manufacturers, since supporting Triton makes their device a more attractive platform to developers and the industry as well.
Though Triton has been widely used to develop custom operators, there have not been many attempts to write Triton kernels for ***standard*** operators such as pointwise operations, reductions, normalizations, and GEMMs. These standard kernels have been studied extensively and may already be implemented in dedicated libraries written in platform-specific programming languages. But developing these standard operators in Triton saves GPU manufacturers much effort in supporting the wide variety of Aten operators, especially those without a comprehensive and highly optimized software stack.
Ezyang previously made a [proof-of-concept](https://github.com/pytorch/pytorch/issues/62661) showing that Triton can be used to develop kernels for Aten operators.
We created [FlagGems](https://github.com/FlagOpen/FlagGems), a high-performance general operator library implemented in Triton.
## Background
### Handwritten kernels vs compiler generated kernels
Torch inductor lowers torch IR into an internal IR with a very small set of operators and generates Triton code for a wide variety of operators that are composed of pointwise, reduction and scan operations. We choose to create a library of handwritten kernels with the following considerations.
1. **Coverage**: There are some operators that are not easily generated by torch-inductor, for example, sort, argsort, unique and topk. Handwritten kernels have broader coverage than generated ones.
2. **Performance**: Handwritten kernels may have better performance than compiler-generated kernels. Since inductor performs analysis and fusion at the `IRNode` and `SchedulerNode` levels, where higher-level semantics are lost, some optimizations requiring non-obvious mathematical transformations cannot be applied, for example, the online softmax normalizer and flash attention. We can apply these optimizations to handwritten kernels.
3. **Flexibility**: In addition, it is more flexible to optimize handwritten kernels than to optimize a full `torch.compile` stack. `torch.compile` has several components, such as operation decomposition, fusion, code generation and config selection, some of which are shared in the compilation of many operators, so modifying those components would affect multiple operators. While handwritten kernels may also have some shared components, it is relatively easy to make changes to only some operators or patterns.
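The online softmax normalizer mentioned in the performance point above illustrates the kind of non-obvious mathematical transformation a handwritten kernel can exploit. A minimal pure-Python sketch (not FlagGems code): the running maximum `m` and running normalizer `d` are updated in a single pass, rescaling `d` whenever the maximum grows.

```python
import math

def online_softmax(xs):
    """One-pass softmax: maintain a running maximum m and a running
    normalizer d = sum(exp(x - m)), rescaling d when m grows.
    Illustrative sketch, not a FlagGems API."""
    m = float("-inf")  # running maximum
    d = 0.0            # running sum of exp(x - m)
    for x in xs:
        m_new = max(m, x)
        d = d * math.exp(m - m_new) + math.exp(x - m_new)
        m = m_new
    return [math.exp(x - m) / d for x in xs]
```

A two-pass softmax would first scan for the maximum and then sum the exponentials; the one-pass form is what makes fusing softmax with surrounding work (as in flash attention) possible.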
## Design

### Overview
The overview of FlagGems is shown above. FlagGems consists of the kernels (jit functions, actually) written in Triton and several components that allow them to work with DL frameworks and device APIs.
1. **Pytorch Integration** describes how FlagGems integrates into pytorch.
2. **Wrapper** describes the structure of the wrapper code above the launch of jit functions, and the challenges to the efficiency and portability of wrapper code.
3. **Jit function** discusses how to write efficient jit functions and how to efficiently find reasonable kernel configs. This part also shows how we deal with Triton lang's limitations on writing jit functions for arbitrarily ranked tensors.
4. **Triton runtime** discusses how to reduce runtime overhead of Triton.
5. **Triton compilers** describes how FlagGems work with multiple backends.
### Pytorch Integration
FlagGems integrates into pytorch by registering wrapper functions into the dispatcher. Since the project was started before `torch.library.custom_op` came into being, the low-level API `torch.Library.impl` is used.
1. We create a wrapper function with the same signature as the corresponding ATen native function's signature.
2. The wrapper function is registered into pytorch's dispatcher with `torch.Library` APIs, overwriting existing implementations for cuda backend.
For standard operators, we register our implementations following some rules to work with **autograd**:
1. For a native function with backend-specific implementations, we override the 'CUDA' backend; the backward implementation defined in Aten is retained.
2. For a native function without backend-specific implementations and with only a `CompositeExplicitAutograd` implementation, we register a function for the 'CUDA' backend, which has higher priority than `CompositeExplicitAutograd`. The backward implementation defined in Aten is also retained.
3. For a native function without backend-specific implementations and with only `CompositeImplicitAutograd`, which has no explicitly defined backward pass (its backward pass is composed of the backward passes of the operators used in its implementation), we have to provide both the forward pass and the backward pass, which are wrapped into a `torch.autograd.Function` and registered with the `AutogradCUDA` key.
The rule of thumb is: only register with the CUDA or AutogradCUDA key, and only use the latter when we want (or have) to make our backward pass the one used by autograd.
To work with **torch.compile**, we need to make sure that overriding the dispatch does not break the dispatch function for FakeTensor. We currently find that Aten operators whose `AutogradCUDA` implementation is added or overridden fail to work correctly with graph tracing in dynamo.
For custom ops, we follow the [Python custom op guide](https://pytorch.org/tutorials/advanced/python_custom_ops.html#python-custom-ops-tutorial) in pytorch.
### Multi-backend Support
FlagGems depends on Triton, but the Triton package differs across device types. For example, when working on AMD GPUs, FlagGems needs to work with the Triton and torch builds for ROCm devices.
To support multiple backends **without** forking the project per backend, there are some requirements.
1. The dispatch key to register may differ between backends. A method for device and backend detection is required to get the desired dispatch key to register.
2. The APIs used in wrapper functions should be device-agnostic. For example, APIs for creating new empty tensors with specified device or switching device context. We may use device agnostic python runtime APIs in the future, as the [RFC](https://github.com/pytorch/pytorch/issues/128403) proposed, or wrap our own.
An alternative method is to introduce a second-layer dispatcher and leave dispatching to different devices to it.
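As a rough illustration of the second-layer dispatcher idea, here is a minimal registry in plain Python (the names `register` and `dispatch` are hypothetical, not FlagGems APIs): each backend registers its implementation once, and a single entry point routes by device type.

```python
# Minimal second-layer dispatcher sketch: per-backend implementations
# are registered under an (op, device_type) key and looked up at call
# time. Names are illustrative, not the FlagGems API.
_REGISTRY = {}

def register(op, device_type):
    def deco(fn):
        _REGISTRY[(op, device_type)] = fn
        return fn
    return deco

def dispatch(op, device_type, *args):
    fn = _REGISTRY.get((op, device_type))
    if fn is None:
        raise NotImplementedError(f"{op} has no {device_type} backend")
    return fn(*args)

@register("add", "cuda")
def _add_cuda(a, b):
    return a + b  # stand-in for a Triton kernel launch
```

A real implementation would derive `device_type` from the input tensors and fall back to a default backend where appropriate.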
### Wrapper
In many cases, wrappers are simple. A conventional wrapper performs the following tasks:
1. input argument checking and error handling;
2. meta data inference and allocation of outputs;
3. device context switch and call to the jit functions;
4. return of the results.
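The four tasks above can be sketched as a wrapper skeleton in plain Python. The `Tensor` class, `device_guard` context manager, and the loop standing in for the kernel launch are toy stand-ins, not FlagGems code:

```python
from dataclasses import dataclass, field
from contextlib import contextmanager

# Toy stand-ins so the skeleton runs; in FlagGems these would be
# torch tensors and a real Triton kernel launch.
@dataclass
class Tensor:
    data: list
    device: str = "cuda:0"

@contextmanager
def device_guard(device):
    yield  # real code would switch the active device here

def add_wrapper(x: Tensor, y: Tensor) -> Tensor:
    # 1. input argument checking and error handling
    if len(x.data) != len(y.data):
        raise ValueError("shape mismatch")
    if x.device != y.device:
        raise ValueError("device mismatch")
    # 2. meta data inference and allocation of outputs
    out = Tensor([0.0] * len(x.data), x.device)
    # 3. device context switch and call to the "kernel"
    with device_guard(x.device):
        for i in range(len(x.data)):  # kernel stand-in
            out.data[i] = x.data[i] + y.data[i]
    # 4. return of the results
    return out
```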
But these tasks (except for the call to the jit function) may also have significant overhead, especially for operations on small tensors, mainly for the following reasons.
1. When the kernel does not support arbitrary strided layouts, we need to copy the input tensor to a contiguous one, which introduces extra data copying;
2. Meta data inference of outputs may be expensive. Though the device, dtype and shape of outputs are well defined for most operators, the stride orders of the outputs are flexible. Selecting the best stride order of outputs can be expensive, especially when implemented in Python.
3. Selecting the best algorithm may be expensive. Some wrappers may call different jit functions according to input sizes, data layout, contiguity, etc.
Although there are workarounds to bypass the overhead, for example, using CUDAGraph, we are working on reducing the per-invocation CPU overhead, since performance in eager execution matters.
In addition, we may add another layer of indirection by providing a series of wrappers that do not necessarily follow the signature of Aten functions, which serves as the interface of a framework-neutral math library.
### Code Reuse in Jit functions
Developing Triton jit functions is basically kernel programming. We discuss code reuse in developing jit functions in FlagGems below.
Pointwise operations, which seem to be the simplest operations to write in Triton and always serve as the first example in Triton programming, are, however, hard to make flexible with respect to the input tensor's rank and layout, mainly due to the limitations of Triton lang's features.
1. **It is not easy to write a single Triton jit function that supports arbitrarily-ranked torch tensors with strided layouts**. CUDA kernels for some ATen operators support arbitrarily-ranked torch tensors either by making the inputs contiguous beforehand or by capturing tensor strides of up to 25 ranks via CUDA lambda capture. Enforcing contiguous inputs or copying them as contiguous can also be done in Python. However, a Triton jit function has no similar mechanism for capturing tensors' strides.
2. **Triton does not support self-defined structs or static arrays as parameters to jit functions**. So we add a runtime [code generation mechanism](https://github.com/FlagOpen/FlagGems?tab=readme-ov-file#automatic-codegen) to generate Triton code for pointwise operators according to the rank of the input tensors. Many other operators like pad, gather/scatter and kron have similar issues.
3. **Triton jit function has no support for higher order functions**. While all the pointwise operations have similar logic, there is no way to pass the computation on scalars as a lambda into Triton jit function as a parameter. That's another reason why we use code generation for pointwise operators.
Here's an example of how to use it.
```Python
@pointwise_dynamic(promotion_methods=[(0, 1, "INT_TO_FLOAT")])
@triton.jit
def true_div_func(x, y):
return x / y
```
The function decorated by `pointwise_dynamic` supports arbitrary rank, arbitrary strides and broadcasting.
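The runtime code generation idea can be illustrated with plain string templating: given a rank, emit source code that unrolls the per-dimension index/stride arithmetic, then `exec` it. This sketch emits plain Python rather than Triton source, and the function names are illustrative:

```python
def make_offset_fn(rank):
    """Generate a rank-specialized function that computes a linear
    offset from per-dimension indices and strides -- the same
    specialization trick FlagGems uses to emit Triton source for a
    given input rank (this sketch emits plain Python instead)."""
    assert rank >= 1
    args = ", ".join(f"i{d}, s{d}" for d in range(rank))
    body = " + ".join(f"i{d} * s{d}" for d in range(rank))
    src = f"def offset({args}):\n    return {body}\n"
    namespace = {}
    exec(src, namespace)  # compile the specialized source at runtime
    return namespace["offset"]
```

Because the rank is baked into the generated source, the inner function needs no loops or variable-length containers, which is exactly what Triton's restrictions require.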
There is a similar/related solution to this problem in xformers. It works by adding an extension to the syntax and parser of the Triton language, so as to support [vararg](https://github.com/facebookresearch/xformers/blob/68b7fd14df5eb1d2558c52842b4206a14d2d20e9/xformers/triton/vararg_kernel.py#L135). The parameters or variables with the `VAR_ARGS_ARRAY` annotation are unrolled by the extension to instantiate the desired jit function. This solution requires the user to write templates and instantiate them explicitly. However, it does not support varargs of constexprs, and does not consider the problem of supporting higher-order functions.
Here's an example of how to use it.
```Python
@triton.jit
def weighted_sumN(
output_ptr,
a_ptr: "VAR_ARGS_ARRAY", # type: ignore # noqa: F821
b: "VAR_ARGS_ARRAY", # type: ignore # noqa: F821
BLOCK_SIZE: tl.constexpr,
):
# Weighted sum, where the weights are on CPU
offset = tl.arange(0, BLOCK_SIZE)
output = tl.zeros([BLOCK_SIZE], tl.float32)
for i in range(len(a_ptr)):
output = output + tl.load(a_ptr[i] + offset) * b[i]
tl.store(output_ptr + offset, output)
...
kernel = unroll_varargs(weighted_sumN, N=NUM_INPUTS)
kernel[(1,)](output, *a, *b_list, BLOCK_SIZE=32)
```
There are also other opportunities for code reuse in developing a library of operators, for example, common patterns for finding the best configs. But we do not cover them here.
### Triton Compile
Triton can be used in an aot-compile or jit-compile manner, though it is mainly used in a jit-compile manner. The pros and cons of the aot and jit ways of using Triton in operator libraries are:
**aot**
Pros:
1. faster cold launch since there is no autotuning at runtime.
2. no dependency on python runtime, better for deployment.
Cons:
1. large compiled artifact and longer build time;
2. extra development of a build system including enumerating all possible kernels of each Triton jit function, tuning of jit functions, packaging of all the generated kernels and a dispatcher to select kernels at runtime.
**jit**
Pros:
1. simpler workflow for development and packaging;
Cons:
1. slow warmup when autotuning is involved.
2. dependency on python runtime.
We currently choose jit-compile for simpler development and packaging. When revisiting this topic from the deployment perspective, we may try some aot-based methods.
**Related projects:**
1. [aottriton](https://github.com/ROCm/aotriton/tree/main): a project by ROCm that employs aot compilation. It re-implements an almost full-fledged build system and a runtime to use Triton in an aot manner to build an operator library.
2. [AOTInductor](https://pytorch.org/tutorials/recipes/torch_export_aoti_python.html): a project in pytorch to compile exported models Ahead-of-Time, to a shared library that can be run in a non-Python environment.
### Triton Runtime
There have been several issues and improvements concerning Triton's runtime for jit functions.
1. [[FRONTEND][RFC] Low latency kernel launching](https://github.com/triton-lang/triton/pull/3503)
2. [Faster jit](https://github.com/triton-lang/triton/pull/3648)
3. [Even faster jit](https://github.com/triton-lang/triton/pull/3649)
#### JIT Runtime
The main reason for slow jit runtime is that Triton launches all the jit functions in a boxed way. Instead of simply calling an ordinary function, calling a jit function involves the following tasks at **each invocation**, which are slow.
1. **Parameter binding and routing**. The signature of the jit function is analyzed beforehand, but parameters received at runtime must be bound to the parameters defined in the signature, which are then classified and routed to different sets. Some are constexpr parameters or compilation parameters (e.g. num_warps, num_stages, num_ctas) while others are parameters to the compiled kernel.
2. **Cache key computation**. The Triton runtime extracts features from input parameters to specialize kernels, by checking the input parameters' dtypes, values, or the divisibility of integers and pointers by some predefined values, for example, 16.
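As a toy illustration of cache key computation (not Triton's actual implementation), each argument can contribute its type plus, for integers, divisibility and equals-one flags, so that arguments leading to the same specialization map to the same compiled kernel:

```python
def specialization_key(args, divisor=16):
    """Toy version of a jit cache key: each argument contributes its
    type plus, for integers (standing in for sizes and pointer
    addresses), whether it is divisible by `divisor` and whether it
    equals 1. Illustrative only."""
    key = []
    for a in args:
        if isinstance(a, int):
            key.append(("int", a % divisor == 0, a == 1))
        elif isinstance(a, float):
            key.append(("fp32",))
        else:
            key.append((type(a).__name__,))
    return tuple(key)
```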
Other features like heuristics, the autotuner and hooks make the jit runtime more complicated. Recent refactoring of the jit runtime has reduced the runtime overhead, but there is still room to optimize runtime performance.
#### LibEntry
We also have a [faster runtime (LibEntry)](https://github.com/FlagOpen/FlagGems/blob/master/src/flag_gems/utils/libentry.py) in FlagGems. We observe that going through the autotuning and heuristics logic is compulsory for a jit function even if the kernel is already compiled. To reduce this extra overhead, we install a fast-track lookup function, called LibEntry, at the entrance of each Triton jit function. LibEntry caches a mapping from input parameters to compiled kernels for each jit function. If the LibEntry cache is hit, the jit function skips ahead and runs the saved kernel, bypassing the intermediate Autotuner and Heuristics. Otherwise, control falls back to the original Triton runtime.

There are also other projects working on persisting the results of the Autotuner (an in-memory cache) to disk, to be restored and reused in later runs.
We are working on combining these methods to reduce runtime overhead and reuse tuning results.
**Related projects:**
1. [triton-dejavu](https://github.com/IBM/triton-dejavu): a project by IBM that reduces the autotune overhead of Triton jit functions by storing and restoring autotuner results in JSON files, which avoids auto-tuning on every run.
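Persisting tuner results, in the spirit of triton-dejavu, can be sketched as a JSON round-trip keyed by a string form of the specialization key (the file layout here is an assumption, not triton-dejavu's actual format):

```python
import json, os, tempfile

def save_configs(path, configs):
    # configs: mapping from cache-key string to best-config dict
    with open(path, "w") as f:
        json.dump(configs, f)

def load_configs(path):
    if not os.path.exists(path):
        return {}  # first run: nothing cached yet
    with open(path) as f:
        return json.load(f)
```

On startup, the library would seed the in-memory autotuner cache from the loaded configs, so only genuinely new specializations trigger tuning.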
## Summary
In this RFC we discuss the benefits and challenges of developing Aten operators in Triton, and our practice in FlagGems.
Benefits:
1. Triton has a higher level of abstraction and is easy to learn;
2. Triton compiler delivers decent performance;
3. Triton has native multi-backend support and is gaining more support from many GPU manufacturers.
Challenges:
1. Triton is mainly used in a just-in-time fashion, and its compiler and runtime depend on Python. So it is not straightforward to create a library of Triton jit functions without a dependency on Python (especially when it is used in deployment).
2. Triton lang has limited features, making it harder to reuse code in kernel development;
3. Triton's runtime has high overhead;
4. Implementing wrapper functions in Python has higher runtime overhead than in C++;
5. It takes some effort to override Aten operators' dispatch function in python and make it work with other Pytorch subsystems, for example torch.autograd and torch.compile.
Our goal is to develop a high-performance operator library in triton with multi-backend support in a single-source manner, and to integrate it into Pytorch by using it in implementing Aten operators.
cc @chauhang @penguinwu @voznesenskym @EikanWang @jgong5 @Guobing-Chen @XiaobingSuper @zhuhaozhe @blzheng @wenzhe-nrv @jiayisunx @ipiszy @yf225 @chenyang78 @kadeng @muchulee8 @ColinPeppler @amjames @desertfire @aakhundov @ezyang | triaged,oncall: pt2,module: inductor | low | Critical |
2,631,756,463 | PowerToys | The window pinning feature should add a pin icon at the top of the window | ### Description of the new feature / enhancement
The current window pinning feature can only display a border around the window, but the border significantly affects visibility. I hope an option can be added to change the style of the pinned window so that it only displays an image in the upper-right or upper-left corner of the window. For example, I could set a thumbtack image to appear in the upper-right corner.

### Scenario when this would be used?
When you need a window to stay on top but don't want the on-top indicator to be too prominent, this feature would be just right.
### Supporting information
_No response_ | Needs-Triage | low | Minor |
2,631,775,459 | pytorch | `torch.compile` doesn't support `set` arguments | ### 🐛 Describe the bug
```python
"""Demonstrate error with `torch.compile` with `set` argument.
pip install torch@https://download.pytorch.org/whl/nightly/cu124/torch-2.6.0.dev20241103%2Bcu124-cp310-cp310-linux_x86_64.whl pytorch-triton@https://download.pytorch.org/whl/nightly/pytorch_triton-3.1.0%2Bcf34004b8a-cp310-cp310-linux_x86_64.whl
"""
import torch
@torch.compile
def func_using_set() -> torch.Tensor:
with torch.random.fork_rng(devices={torch.device("cuda:0")}):
return torch.randn((1, 2))
func_using_set()
```
```
InternalTorchDynamoError: NotImplementedError: argument of type: <class 'set'>
from user code:
File "/tmp/ipykernel_765512/1017228397.py", line 6, in func_using_set
with torch.random.fork_rng(devices={torch.device("cuda:0")}):
```
cc: @ezyang
### Versions
PyTorch version: 2.6.0.dev20241103+cu124
Is debug build: False
CUDA used to build PyTorch: 12.4
Versions of relevant libraries:
[pip3] pytorch-triton==3.1.0+cf34004b8a
[pip3] torch==2.6.0.dev20241103+cu124
cc @ezyang @chauhang @penguinwu @voznesenskym @EikanWang @jgong5 @Guobing-Chen @XiaobingSuper @zhuhaozhe @blzheng @wenzhe-nrv @jiayisunx @chenyang78 @kadeng @amjames | triaged,oncall: pt2,module: dynamo | low | Critical |
2,631,793,231 | go | x/net/html: ParseFragment out of memory on specially crafted input | ### Go version
go1.23.1 linux/amd64
### Output of `go env` in your module/workspace:
```shell
GO111MODULE='auto'
GOARCH='amd64'
GOBIN=''
GOCACHE='/home/oof/.cache/go-build'
GOENV='/home/oof/.config/go/env'
GOEXE=''
GOEXPERIMENT=''
GOFLAGS=''
GOHOSTARCH='amd64'
GOHOSTOS='linux'
GOINSECURE=''
GOMODCACHE='/home/oof/.asdf/installs/golang/1.23.1/packages/pkg/mod'
GONOPROXY=''
GONOSUMDB=''
GOOS='linux'
GOPATH='/home/oof/.asdf/installs/golang/1.23.1/packages'
GOPRIVATE=''
GOPROXY='https://proxy.golang.org,direct'
GOROOT='/home/oof/.asdf/installs/golang/1.23.1/go'
GOSUMDB='sum.golang.org'
GOTMPDIR=''
GOTOOLCHAIN='auto'
GOTOOLDIR='/home/oof/.asdf/installs/golang/1.23.1/go/pkg/tool/linux_amd64'
GOVCS=''
GOVERSION='go1.23.1'
GODEBUG=''
GOTELEMETRY='local'
GOTELEMETRYDIR='/home/oof/.config/go/telemetry'
GCCGO='gccgo'
GOAMD64='v1'
AR='ar'
CC='gcc'
CXX='g++'
CGO_ENABLED='1'
GOMOD=''
GOWORK=''
CGO_CFLAGS='-O2 -g'
CGO_CPPFLAGS=''
CGO_CXXFLAGS='-O2 -g'
CGO_FFLAGS='-O2 -g'
CGO_LDFLAGS='-O2 -g'
PKG_CONFIG='pkg-config'
GOGCCFLAGS='-fPIC -m64 -pthread -Wl,--no-gc-sections -fmessage-length=0 -ffile-prefix-map=/tmp/go-build1768793625=/tmp/go-build -gno-record-gcc-switches'
```
### What did you do?
Hi!
I originally reported this as a security issue, but this wasn't categorized as such, so I am just going to paste the original report here:
```
Summary: Out-Of-Memory (OOM) in net/html in golang
Program: Google VRP
URL: https://github.com/golang/net/
Vulnerability type: Denial of Service (DoS)
Details
An attacker can cause Out-Of-Memory by passing a maliciously crafted input to html.ParseFragment. The input is this string: "<svg><head><title><select><input>" which makes the program consume all of the system's available memory.
Here is an example program which demonstrates this vulnerability:
package main
import (
	"golang.org/x/net/html"
	"strings"
)

func main() {
	html.ParseFragment(strings.NewReader("<svg><head><title><select><input>"), nil)
}
My golang version is go version go1.23.1 linux/amd64 and I am using the v0.30.0 version of the net library.
This vulnerability does not occur in html.Parse, only in html.ParseFragment which I found a bit odd.
Attack scenario
An attacker can use this vulnerability to cause degradation in performance and Denial-Of-Service if said attacker can deliver malicious input to html.ParseFragment . The effects of this vulnerability are basically the same as any other OOM bug. I uploaded all the files which I played around with as files.zip.
```
### What did you see happen?
This causes an out-of-memory condition when parsing a specially crafted input.
### What did you expect to see?
The program should parse the input or fail with an error, not consume all of the system's memory. | NeedsInvestigation | low | Critical |
2,631,801,118 | deno | Can't type anything into prompt asking for read access to TERM | Tested Versions: 2.0.1, 2.0.4
OS: macOS Sequoia 15.2
Tested shells: ZSH, bash, sh, fish
Tested terminal emulators: iTerm2, wezTerm, Ghostty, Apple Terminal
Absolutely no keyboard input is accepted, including `CTRL + C`
Passing the `--allow-all` flag works and the program can be run normally
My best guess is this might have something to do with macOS secure input, but I'm not entirely certain.
Screenshot:
<img width="665" alt="Screenshot 2024-11-03 at 23 14 46" src="https://github.com/user-attachments/assets/e577a7e3-79a4-4aa0-968a-2b9ea276f5bd">
| needs info | low | Minor |
2,631,805,068 | PowerToys | Screen cleaning mode | ### Description of the new feature / enhancement
The screen should turn off while the laptop is being cleaned, and then pressing a key would turn the screen back on.
### Scenario when this would be used?
If someone wants to clean their screen without turning their laptop off, this would help.
### Supporting information
_No response_ | Needs-Triage | low | Minor |
2,631,824,835 | deno | Confirm `deno fmt` in well know directories | `deno fmt` is a pretty dangerous command as it will format all the files in a given directory (and sub directories). By default, it will update all files in place. Other tools, such as `cargo fmt` check for the existence of configuration files (such as a `Cargo.toml`), but Deno doesn't require any such files.
If `deno fmt` is run from the root directory, it could cause significant damage to a users file system, and it appears there was one such instance of this that was reported on the Discord channel.
The `deno fmt` command should ask for confirmation if run in well known (Suspicious) directories that are likely not Deno directories. In particular, if running in a tty environment, deno should ask the user to confirm in the following directories:
- `~/.`
- `~/Downloads`
- `~/Documents`
- `~/Desktop`
- `/`
- `~/git`
Thoughts? | deno fmt,needs discussion | low | Minor |
2,631,828,340 | PowerToys | Low disk/volume space threshold control | ### Description of the new feature / enhancement
Windows will set a drive spacebar to red once it reaches 10% left on a drive.
This made sense in the days of 100GB drives for example as 10% meant only 10 gb left .
Now with people having 2 TB or even 4TB drives , 10% would mean 200GB or 400GB left and you end up with an annoyed RED space left bar.
Windows provides a way to completely Turn off this RED warning; but what would be more practical would be to be able to set a Threshold by % of by MB/GB left. And thus control when the drive actually goes RED.
### Scenario when this would be used?
It is a permanent or ongoing setting that could be controlled by having Powertoys loaded; Quite simply you turn on, and you now control the level at which the drive goes RED; better yet set multiple thresholds like 10% Left = Yellow, 3% Left = RED or something like that.
### Supporting information
https://www.elevenforum.com/t/drive-capacity-icon-color.10734/

| Needs-Triage | low | Minor |
2,631,847,754 | flutter | Running `flutter build web` in CI/CD is abnormally slow and sometimes even fails with `Target dart2js failed: ProcessException: Process exited abnormally with exit code -9` | ### Steps to reproduce
1. Set up a new flutter project and enable web
2. Write a dockerfile to try and build for web in it (ours posted below)
3. Hook it up to a CI runner and have it try to build the docker image
Example Dockerfile:
```
#######################
## Build Container ##
#######################
FROM fischerscode/flutter:stable AS build
# make sure this matches the ENV APP_DIRECTORY in the deploy container
ENV APP_DIRECTORY=/home/flutter/app
# Set the working directory
RUN mkdir $APP_DIRECTORY && chown flutter:flutter $APP_DIRECTORY
WORKDIR $APP_DIRECTORY
# Copy the code into the container
COPY --chown=flutter:flutter . $APP_DIRECTORY
# must use flutter user from base image
USER flutter
# Ensure a clean start for the build
RUN flutter clean
# Fetch the dependencies
RUN flutter pub get
# Build the release version of the app
RUN flutter build web --no-pub --release --verbose
########################
## Deploy Container ##
########################
# Using nginx web server to serve the built out dart webapp files
FROM nginx:alpine
ENV APP_DIRECTORY=/home/flutter/app
# Copy the build artifacts from the build step
COPY --from=build $APP_DIRECTORY/build/web /usr/share/nginx/html
# Copy the nginx configuration file into a usable directory for Nginx
COPY nginx.conf /etc/nginx/nginx.conf
EXPOSE 8000
CMD ["nginx", "-g", "daemon off;"]
```
Example Bitbucket Pipelines step:
```
- step:
name: Put docker image into DO registry
image: alpine/doctl:1.27.13 # comes with doctl, kubectl, and helm
script:
- doctl registry login --expiry-seconds 3600
- export IMAGE_NAME=$IMAGE_TAG_NAME:$BITBUCKET_COMMIT
- docker build -t $IMAGE_NAME $BITBUCKET_CLONE_DIR
- docker push $IMAGE_NAME
services:
- docker
```
### Expected results
When the flutter build web command is run locally, when it gets to the `executing: /home/flutter/sdk/bin/cache/dart-sdk/bin/dart ...` steps, it takes about 2 minutes to complete. Sometimes a little more.
When attempting to build the docker image locally, it is noticeably slower, but still completes consistently in around 5 minutes or less.
It would be reasonable to assume the same behaviour when placed into an CI/CD environment.
### Actual results
However once run in a CI/CD environment it is astonishingly slow to build. Sometimes (rarely) it will complete as expected, but most of the time it will run until about the 20min mark and fail with the error `Target dart2js failed: ProcessException: Process exited abnormally with exit code -9`
Logs from the most recent fail are in the logs section below.
It is always the second compilation task that hangs and eventually fails.
I thought it might be an issue with the base container and have tried multiple others including one of my own where I install the flutter SDK myself. Same result every time.
### Code sample
<details open><summary>Code sample</summary>
```dart
[Paste your code here]
```
</details>
### Screenshots or Video
<details open>
<summary>Screenshots / Video demonstration</summary>
[Upload media here]
</details>
### Logs
<details open><summary>Logs</summary>
```console
#7 [build 7/7] RUN flutter build web --release --verbose
#7 sha256:0aa857e78c93c50d581467f644efc99e22866dec7f5c0676801ad8fab6b87de0
#7 0.299 [ +19 ms] Unable to locate an Android SDK.
#7 0.302 [ +4 ms] executing: uname -m
#7 0.305 [ +2 ms] Exit code 0 from: uname -m
#7 0.306 [ ] x86_64
#7 0.332 [ +26 ms] Artifact Instance of 'AndroidGenSnapshotArtifacts' is not required, skipping update.
#7 0.332 [ ] Artifact Instance of 'AndroidInternalBuildArtifacts' is not required, skipping update.
#7 0.332 [ ] Artifact Instance of 'IOSEngineArtifacts' is not required, skipping update.
#7 0.333 [ ] Artifact Instance of 'FlutterWebSdk' is not required, skipping update.
#7 0.333 [ ] Artifact Instance of 'LegacyCanvasKitRemover' is not required, skipping update.
#7 0.336 [ +2 ms] Artifact Instance of 'WindowsEngineArtifacts' is not required, skipping update.
#7 0.336 [ ] Artifact Instance of 'MacOSEngineArtifacts' is not required, skipping update.
#7 0.336 [ ] Artifact Instance of 'LinuxEngineArtifacts' is not required, skipping update.
#7 0.337 [ ] Artifact Instance of 'LinuxFuchsiaSDKArtifacts' is not required, skipping update.
#7 0.337 [ ] Artifact Instance of 'MacOSFuchsiaSDKArtifacts' is not required, skipping update.
#7 0.337 [ ] Artifact Instance of 'FlutterRunnerSDKArtifacts' is not required, skipping update.
#7 0.337 [ ] Artifact Instance of 'FlutterRunnerDebugSymbols' is not required, skipping update.
#7 0.391 [ +50 ms] Artifact Instance of 'MaterialFonts' is not required, skipping update.
#7 0.391 [ ] Artifact Instance of 'GradleWrapper' is not required, skipping update.
#7 0.391 [ ] Artifact Instance of 'AndroidGenSnapshotArtifacts' is not required, skipping update.
#7 0.391 [ ] Artifact Instance of 'AndroidInternalBuildArtifacts' is not required, skipping update.
#7 0.391 [ ] Artifact Instance of 'IOSEngineArtifacts' is not required, skipping update.
#7 0.394 [ +5 ms] Downloading Web SDK...
#7 0.535 [ +140 ms] Content https://storage.googleapis.com/flutter_infra_release/flutter/db49896cf25ceabc44096d5f088d86414e05a7aa/flutter-web-sdk.zip md5 hash: 5QTlQmhstSpSyKOuGdiQCQ==
#7 3.503 [+2968 ms] Downloading Web SDK... (completed in 3.1s)
#7 3.505 [ +1 ms] executing: unzip -o -q /home/flutter/sdk/bin/cache/downloads/storage.googleapis.com/flutter_infra_release/flutter/db49896cf25ceabc44096d5f088d86414e05a7aa/flutter-web-sdk.zip -d /home/flutter/sdk/bin/cache/flutter_web_sdk
#7 5.634 [+2128 ms] Exit code 0 from: unzip -o -q /home/flutter/sdk/bin/cache/downloads/storage.googleapis.com/flutter_infra_release/flutter/db49896cf25ceabc44096d5f088d86414e05a7aa/flutter-web-sdk.zip -d /home/flutter/sdk/bin/cache/flutter_web_sdk
#7 5.653 [ +19 ms] Artifact Instance of 'FlutterSdk' is not required, skipping update.
#7 5.653 [ ] Artifact Instance of 'WindowsEngineArtifacts' is not required, skipping update.
#7 5.653 [ ] Artifact Instance of 'MacOSEngineArtifacts' is not required, skipping update.
#7 5.653 [ ] Artifact Instance of 'LinuxEngineArtifacts' is not required, skipping update.
#7 5.653 [ ] Artifact Instance of 'LinuxFuchsiaSDKArtifacts' is not required, skipping update.
#7 5.653 [ ] Artifact Instance of 'MacOSFuchsiaSDKArtifacts' is not required, skipping update.
#7 5.654 [ ] Artifact Instance of 'FlutterRunnerSDKArtifacts' is not required, skipping update.
#7 5.654 [ ] Artifact Instance of 'FlutterRunnerDebugSymbols' is not required, skipping update.
#7 5.654 [ ] Artifact Instance of 'IosUsbArtifacts' is not required, skipping update.
#7 5.654 [ ] Artifact Instance of 'IosUsbArtifacts' is not required, skipping update.
#7 5.654 [ ] Artifact Instance of 'IosUsbArtifacts' is not required, skipping update.
#7 5.654 [ ] Artifact Instance of 'IosUsbArtifacts' is not required, skipping update.
#7 5.654 [ ] Artifact Instance of 'IosUsbArtifacts' is not required, skipping update.
#7 5.654 [ ] Artifact Instance of 'FontSubsetArtifacts' is not required, skipping update.
#7 5.654 [ ] Artifact Instance of 'PubDependencies' is not required, skipping update.
#7 5.716 [ +61 ms] Skipping pub get: version match.
#7 5.826 [ +109 ms] Found plugin path_provider at /home/flutter/.pub-cache/hosted/pub.dev/path_provider-2.1.5/
#7 5.830 [ +4 ms] Found plugin path_provider_android at /home/flutter/.pub-cache/hosted/pub.dev/path_provider_android-2.2.12/
#7 5.832 [ +2 ms] Found plugin path_provider_foundation at /home/flutter/.pub-cache/hosted/pub.dev/path_provider_foundation-2.4.0/
#7 5.834 [ +2 ms] Found plugin path_provider_linux at /home/flutter/.pub-cache/hosted/pub.dev/path_provider_linux-2.2.1/
#7 5.836 [ +2 ms] Found plugin path_provider_windows at /home/flutter/.pub-cache/hosted/pub.dev/path_provider_windows-2.3.0/
#7 5.966 [ +129 ms] Found plugin path_provider at /home/flutter/.pub-cache/hosted/pub.dev/path_provider-2.1.5/
#7 5.967 [ +1 ms] Found plugin path_provider_android at /home/flutter/.pub-cache/hosted/pub.dev/path_provider_android-2.2.12/
#7 5.974 [ +6 ms] Found plugin path_provider_foundation at /home/flutter/.pub-cache/hosted/pub.dev/path_provider_foundation-2.4.0/
#7 5.975 [ +1 ms] Found plugin path_provider_linux at /home/flutter/.pub-cache/hosted/pub.dev/path_provider_linux-2.2.1/
#7 5.977 [ +1 ms] Found plugin path_provider_windows at /home/flutter/.pub-cache/hosted/pub.dev/path_provider_windows-2.3.0/
#7 6.053 [ +76 ms] Found plugin path_provider at /home/flutter/.pub-cache/hosted/pub.dev/path_provider-2.1.5/
#7 6.054 [ +1 ms] Found plugin path_provider_android at /home/flutter/.pub-cache/hosted/pub.dev/path_provider_android-2.2.12/
#7 6.055 [ +1 ms] Found plugin path_provider_foundation at /home/flutter/.pub-cache/hosted/pub.dev/path_provider_foundation-2.4.0/
#7 6.056 [ ] Found plugin path_provider_linux at /home/flutter/.pub-cache/hosted/pub.dev/path_provider_linux-2.2.1/
#7 6.058 [ +1 ms] Found plugin path_provider_windows at /home/flutter/.pub-cache/hosted/pub.dev/path_provider_windows-2.3.0/
#7 6.086 [ +28 ms] Generating /home/flutter/app/android/app/src/main/java/io/flutter/plugins/GeneratedPluginRegistrant.java
#7 6.150 [ +64 ms] Found plugin path_provider at /home/flutter/.pub-cache/hosted/pub.dev/path_provider-2.1.5/
#7 6.151 [ ] Found plugin path_provider_android at /home/flutter/.pub-cache/hosted/pub.dev/path_provider_android-2.2.12/
#7 6.152 [ ] Found plugin path_provider_foundation at /home/flutter/.pub-cache/hosted/pub.dev/path_provider_foundation-2.4.0/
#7 6.152 [ ] Found plugin path_provider_linux at /home/flutter/.pub-cache/hosted/pub.dev/path_provider_linux-2.2.1/
#7 6.154 [ +1 ms] Found plugin path_provider_windows at /home/flutter/.pub-cache/hosted/pub.dev/path_provider_windows-2.3.0/
#7 6.173 [ +19 ms] generated_plugin_registrant.dart not found. Skipping.
#7 6.174 [ +1 ms] Compiling lib/main.dart for the Web...
#7 6.178 [ +3 ms] Initializing file store
#7 6.206 [ +28 ms] web_entrypoint: Starting due to {InvalidatedReasonKind.inputChanged: The following inputs have updated contents: /home/flutter/sdk/packages/flutter_tools/lib/src/build_system/targets/web.dart}
#7 6.207 [ ] Skipping target: gen_localizations
#7 6.209 [ +1 ms] invalidated build due to missing files: /home/flutter/app/web/*/index.html, /home/flutter/app/web/flutter_bootstrap.js
#7 6.209 [ ] web_templated_files: Starting due to {InvalidatedReasonKind.buildKeyChanged: The target build key changed., InvalidatedReasonKind.inputMissing: The following inputs were missing: /home/flutter/app/web/*/index.html,/home/flutter/app/web/flutter_bootstrap.js, InvalidatedReasonKind.inputChanged: The following inputs have updated contents: /home/flutter/sdk/bin/internal/engine.version}
#7 6.210 [ +1 ms] web_static_assets: Starting due to {InvalidatedReasonKind.inputChanged: The following inputs have updated contents: /home/flutter/sdk/bin/internal/engine.version}
#7 6.244 [ +33 ms] web_static_assets: Complete
#7 6.259 [ +13 ms] Warning: In index.html:37: Local variable for "serviceWorkerVersion" is deprecated. Use "{{flutter_service_worker_version}}" template token instead. See https://docs.flutter.dev/platform-integration/web/initialization for more details.
#7 6.259 [ +1 ms] Warning: In index.html:46: "FlutterLoader.loadEntrypoint" is deprecated. Use "FlutterLoader.load" instead. See https://docs.flutter.dev/platform-integration/web/initialization for more details.
#7 6.263 [ +4 ms] web_templated_files: Complete
#7 6.292 [ +29 ms] Found plugin path_provider at /home/flutter/.pub-cache/hosted/pub.dev/path_provider-2.1.5/
#7 6.293 [ ] Found plugin path_provider_android at /home/flutter/.pub-cache/hosted/pub.dev/path_provider_android-2.2.12/
#7 6.294 [ ] Found plugin path_provider_foundation at /home/flutter/.pub-cache/hosted/pub.dev/path_provider_foundation-2.4.0/
#7 6.295 [ ] Found plugin path_provider_linux at /home/flutter/.pub-cache/hosted/pub.dev/path_provider_linux-2.2.1/
#7 6.296 [ +1 ms] Found plugin path_provider_windows at /home/flutter/.pub-cache/hosted/pub.dev/path_provider_windows-2.3.0/
#7 6.318 [ +22 ms] web_entrypoint: Complete
#7 6.319 [ ] dart2js: Starting due to {}
#7 6.323 [ +4 ms] executing: /home/flutter/sdk/bin/cache/dart-sdk/bin/dart --disable-dart-dev /home/flutter/sdk/bin/cache/dart-sdk/bin/snapshots/dart2js.dart.snapshot --platform-binaries=/home/flutter/sdk/bin/cache/flutter_web_sdk/kernel --invoker=flutter_tool -Ddart.vm.product=true -DFLUTTER_WEB_AUTO_DETECT=false -DFLUTTER_WEB_USE_SKIA=true -DFLUTTER_WEB_CANVASKIT_URL=https://www.gstatic.com/flutter-canvaskit/db49896cf25ceabc44096d5f088d86414e05a7aa/ --native-null-assertions --no-source-maps -o /home/flutter/app/.dart_tool/flutter_build/92612a589e9112770d9cb63dc07cd59d/app.dill --packages=.dart_tool/package_config.json --cfe-only /home/flutter/app/.dart_tool/flutter_build/92612a589e9112770d9cb63dc07cd59d/main.dart
#7 32.94 [+26614 ms] Compiled 49,774,772 input bytes (41,258,382 characters source) to 37,810,784 kernel bytes in 26.1 seconds using 734.734 MB of memory
#7 32.94 Dart file .dart_tool/flutter_build/92612a589e9112770d9cb63dc07cd59d/main.dart compiled to dill: .dart_tool/flutter_build/92612a589e9112770d9cb63dc07cd59d/app.dill.
#7 32.94 [ +1 ms] executing: /home/flutter/sdk/bin/cache/dart-sdk/bin/dart --disable-dart-dev /home/flutter/sdk/bin/cache/dart-sdk/bin/snapshots/dart2js.dart.snapshot --platform-binaries=/home/flutter/sdk/bin/cache/flutter_web_sdk/kernel --invoker=flutter_tool -Ddart.vm.product=true -DFLUTTER_WEB_AUTO_DETECT=false -DFLUTTER_WEB_USE_SKIA=true -DFLUTTER_WEB_CANVASKIT_URL=https://www.gstatic.com/flutter-canvaskit/db49896cf25ceabc44096d5f088d86414e05a7aa/ --native-null-assertions --no-source-maps -O4 -o /home/flutter/app/.dart_tool/flutter_build/92612a589e9112770d9cb63dc07cd59d/main.dart.js /home/flutter/app/.dart_tool/flutter_build/92612a589e9112770d9cb63dc07cd59d/app.dill
#7 1165.3 [+1132374 ms] Persisting file store
#7 1165.3 [ +3 ms] Done persisting file store
#7 1165.3 [ +5 ms] Target dart2js failed: ProcessException: Process exited abnormally with exit code -9:
#7 1165.3
#7 1165.3 Command: /home/flutter/sdk/bin/cache/dart-sdk/bin/dart --disable-dart-dev /home/flutter/sdk/bin/cache/dart-sdk/bin/snapshots/dart2js.dart.snapshot --platform-binaries=/home/flutter/sdk/bin/cache/flutter_web_sdk/kernel --invoker=flutter_tool -Ddart.vm.product=true -DFLUTTER_WEB_AUTO_DETECT=false -DFLUTTER_WEB_USE_SKIA=true -DFLUTTER_WEB_CANVASKIT_URL=https://www.gstatic.com/flutter-canvaskit/db49896cf25ceabc44096d5f088d86414e05a7aa/ --native-null-assertions --no-source-maps -O4 -o /home/flutter/app/.dart_tool/flutter_build/92612a589e9112770d9cb63dc07cd59d/main.dart.js /home/flutter/app/.dart_tool/flutter_build/92612a589e9112770d9cb63dc07cd59d/app.dill
#7 1165.3 #0 RunResult.throwException (package:flutter_tools/src/base/process.dart:122:5)
#7 1165.3 #1 _DefaultProcessUtils.run (package:flutter_tools/src/base/process.dart:370:19)
#7 1165.3 <asynchronous suspension>
#7 1165.3 #2 Dart2JSTarget.build (package:flutter_tools/src/build_system/targets/web.dart:207:5)
#7 1165.3 <asynchronous suspension>
#7 1165.3 #3 _BuildInstance._invokeInternal (package:flutter_tools/src/build_system/build_system.dart:875:9)
#7 1165.3 <asynchronous suspension>
#7 1165.3 #4 Future.wait.<anonymous closure> (dart:async/future.dart:534:21)
#7 1165.3 <asynchronous suspension>
#7 1165.3 #5 _BuildInstance.invokeTarget (package:flutter_tools/src/build_system/build_system.dart:813:32)
#7 1165.3 <asynchronous suspension>
#7 1165.3 #6 Future.wait.<anonymous closure> (dart:async/future.dart:534:21)
#7 1165.3 <asynchronous suspension>
#7 1165.3 #7 _BuildInstance.invokeTarget (package:flutter_tools/src/build_system/build_system.dart:813:32)
#7 1165.3 <asynchronous suspension>
#7 1165.3 #8 FlutterBuildSystem.build (package:flutter_tools/src/build_system/build_system.dart:635:16)
#7 1165.3 <asynchronous suspension>
#7 1165.3 #9 WebBuilder.buildWeb (package:flutter_tools/src/web/compile.dart:92:34)
#7 1165.3 <asynchronous suspension>
#7 1165.3 #10 BuildWebCommand.runCommand (package:flutter_tools/src/commands/build_web.dart:230:5)
#7 1165.3 <asynchronous suspension>
#7 1165.3 #11 FlutterCommand.run.<anonymous closure> (package:flutter_tools/src/runner/flutter_command.dart:1408:27)
#7 1165.3 <asynchronous suspension>
#7 1165.3 #12 AppContext.run.<anonymous closure> (package:flutter_tools/src/base/context.dart:153:19)
#7 1165.3 <asynchronous suspension>
#7 1165.3 #13 CommandRunner.runCommand (package:args/command_runner.dart:212:13)
#7 1165.3 <asynchronous suspension>
#7 1165.3 #14 FlutterCommandRunner.runCommand.<anonymous closure> (package:flutter_tools/src/runner/flutter_command_runner.dart:420:9)
#7 1165.3 <asynchronous suspension>
#7 1165.3 #15 AppContext.run.<anonymous closure> (package:flutter_tools/src/base/context.dart:153:19)
#7 1165.3 <asynchronous suspension>
#7 1165.3 #16 FlutterCommandRunner.runCommand (package:flutter_tools/src/runner/flutter_command_runner.dart:364:5)
#7 1165.3 <asynchronous suspension>
#7 1165.3 #17 run.<anonymous closure>.<anonymous closure> (package:flutter_tools/runner.dart:130:9)
#7 1165.3 <asynchronous suspension>
#7 1165.3 #18 AppContext.run.<anonymous closure> (package:flutter_tools/src/base/context.dart:153:19)
#7 1165.3 <asynchronous suspension>
#7 1165.3 #19 main (package:flutter_tools/executable.dart:93:3)
#7 1165.3 <asynchronous suspension>
#7 1165.3
#7 1165.3 [ +1 ms] Compiling lib/main.dart for the Web... (completed in 1159.1s)
#7 1165.3 [ ] "flutter web" took 1,165,006ms.
#7 1165.3 [ +4 ms] Error: Failed to compile application for the Web.
#7 1165.3 [ ]
#7 1165.3 #0 throwToolExit (package:flutter_tools/src/base/common.dart:10:3)
#7 1165.3 #1 WebBuilder.buildWeb (package:flutter_tools/src/web/compile.dart:129:7)
#7 1165.3 <asynchronous suspension>
#7 1165.3 #2 BuildWebCommand.runCommand (package:flutter_tools/src/commands/build_web.dart:230:5)
#7 1165.3 <asynchronous suspension>
#7 1165.3 #3 FlutterCommand.run.<anonymous closure> (package:flutter_tools/src/runner/flutter_command.dart:1408:27)
#7 1165.3 <asynchronous suspension>
#7 1165.3 #4 AppContext.run.<anonymous closure> (package:flutter_tools/src/base/context.dart:153:19)
#7 1165.3 <asynchronous suspension>
#7 1165.3 #5 CommandRunner.runCommand (package:args/command_runner.dart:212:13)
#7 1165.3 <asynchronous suspension>
#7 1165.3 #6 FlutterCommandRunner.runCommand.<anonymous closure> (package:flutter_tools/src/runner/flutter_command_runner.dart:420:9)
#7 1165.3 <asynchronous suspension>
#7 1165.3 #7 AppContext.run.<anonymous closure> (package:flutter_tools/src/base/context.dart:153:19)
#7 1165.3 <asynchronous suspension>
#7 1165.3 #8 FlutterCommandRunner.runCommand (package:flutter_tools/src/runner/flutter_command_runner.dart:364:5)
#7 1165.3 <asynchronous suspension>
#7 1165.3 #9 run.<anonymous closure>.<anonymous closure> (package:flutter_tools/runner.dart:130:9)
#7 1165.3 <asynchronous suspension>
#7 1165.3 #10 AppContext.run.<anonymous closure> (package:flutter_tools/src/base/context.dart:153:19)
#7 1165.3 <asynchronous suspension>
#7 1165.3 #11 main (package:flutter_tools/executable.dart:93:3)
#7 1165.3 <asynchronous suspension>
#7 1165.3
#7 1165.3
#7 1165.3 [ +1 ms] Running 1 shutdown hook
#7 1165.3 [ ] Shutdown hooks complete
#7 1165.3 [ ] exiting with code 1
#7 ERROR: process "/bin/sh -c flutter build web --release --verbose" did not complete successfully: exit code: 1
------
> [build 7/7] RUN flutter build web --release --verbose:
------
process "/bin/sh -c flutter build web --release --verbose" did not complete successfully: exit code: 1
```
</details>
### Flutter Doctor output
<details open><summary>Doctor output</summary>
I don't have the doctor output from the public base container I linked, but here is the output from my custom container where I install flutter from the git repo:
```console
#11 [8/8] RUN flutter doctor -v
#11 sha256:e76d3f9c3fae78f5af74e3e8824fa611e2fc9a81081ab2ad099b778a0cb6a97c
#11 0.471 Downloading Material fonts... 340ms
#11 0.895 Downloading Gradle Wrapper... 10ms
#11 0.922 Downloading package sky_engine... 91ms
#11 1.086 Downloading package flutter_gpu... 9ms
#11 1.110 Downloading flutter_patched_sdk tools... 216ms
#11 1.431 Downloading flutter_patched_sdk_product tools... 177ms
#11 1.713 Downloading linux-x64 tools... 1,478ms
#11 4.018 Downloading linux-x64/font-subset tools... 81ms
#11 4.294 [!] Flutter (Channel [user-branch], 3.24.4, on Debian GNU/Linux 12 (bookworm) 5.15.0-1071-aws, locale en_US)
#11 4.294 ! Flutter version 3.24.4 on channel [user-branch] at /home/flutter/sdk
#11 4.294 Currently on an unknown channel. Run `flutter channel` to switch to an official channel.
#11 4.294 If that doesn't fix the issue, reinstall Flutter by following instructions at https://flutter.dev/setup.
#11 4.294 ! Upstream repository unknown source is not the same as FLUTTER_GIT_URL
#11 4.294 • FLUTTER_GIT_URL = https://github.com/flutter/flutter.git
#11 4.294 • Framework revision 603104015d (11 days ago), 2024-10-24 08:01:25 -0700
#11 4.294 • Engine revision db49896cf2
#11 4.295 • Dart version 3.5.4
#11 4.295 • DevTools version 2.37.3
#11 4.295 • If those were intentional, you can disregard the above warnings; however it is recommended to use "git" directly to perform update checks and upgrades.
#11 4.295
#11 4.297 [✗] Android toolchain - develop for Android devices
#11 4.297 ✗ Unable to locate Android SDK.
#11 4.297 Install Android Studio from: https://developer.android.com/studio/index.html
#11 4.297 On first launch it will assist you in installing the Android SDK components.
#11 4.298 (or visit https://flutter.dev/to/linux-android-setup for detailed instructions).
#11 4.298 If the Android SDK has been installed to a custom location, please use
#11 4.298 `flutter config --android-sdk` to update to that location.
#11 4.298
#11 4.298
#11 4.298 [✗] Chrome - develop for the web (Cannot find Chrome executable at google-chrome)
#11 4.298 ! Cannot find Chrome. Try setting CHROME_EXECUTABLE to a Chrome executable.
#11 4.298
#11 4.299 [✗] Linux toolchain - develop for Linux desktop
#11 4.299 ✗ clang++ is required for Linux development.
#11 4.299 It is likely available from your distribution (e.g.: apt install clang), or can be downloaded from https://releases.llvm.org/
#11 4.299 ✗ CMake is required for Linux development.
#11 4.299 It is likely available from your distribution (e.g.: apt install cmake), or can be downloaded from https://cmake.org/download/
#11 4.299 ✗ ninja is required for Linux development.
#11 4.299 It is likely available from your distribution (e.g.: apt install ninja-build), or can be downloaded from https://github.com/ninja-build/ninja/releases
#11 4.300 ✗ pkg-config is required for Linux development.
#11 4.300 It is likely available from your distribution (e.g.: apt install pkg-config), or can be downloaded from https://www.freedesktop.org/wiki/Software/pkg-config/
#11 4.300
#11 4.300 [!] Android Studio (not installed)
#11 4.300 • Android Studio not found; download from https://developer.android.com/studio/index.html
#11 4.300 (or visit https://flutter.dev/to/linux-android-setup for detailed instructions).
#11 4.300
#11 4.303 [✓] Connected device (1 available)
#11 4.303 • Linux (desktop) • linux • linux-x64 • Debian GNU/Linux 12 (bookworm) 5.15.0-1071-aws
#11 4.303
#11 4.358 [✓] Network resources
#11 4.358 • All expected network resources are available.
#11 4.358
#11 4.359 ! Doctor found issues in 5 categories.
```
</details>
| tool,platform-web,a: build,P3,team-tool | medium | Critical |
2,631,895,333 | godot | ready() signal is called instantly from the Editor when loading the scene if connected via the Node tab's Signals menu | ### Tested versions
Reproducible in: v4.3.stable.official [77dcf97d8]
### System information
Godot v4.3.stable - Fedora Linux 41 (KDE Plasma) - Wayland - Vulkan (Forward+) - dedicated AMD Radeon RX 6600 (RADV NAVI23) - 12th Gen Intel(R) Core(TM) i5-12600KF (16 Threads)
### Issue description
When connecting the ready signal from the Node tab, the signal is called when loading the scene _via the editor_.
This leads to instantly running the connected function. It does not matter if the node is the scene's owner or a child, if it has a script, or whether it has a tool annotation or not.
If connected to methods like queue_free(), the editor will crash once it attempts to delete the corresponding node.
The editor will likely not remove any data from the scene. But if the scene is remembered as the most recent scene, it can leave the editor in a loop of starting -> loading the scene with its signals -> running the connected method -> crashing.
The only solution is to either remove the signals from the .tscn file or change the last scene the editor "remembers".
I don't think this is the expected behavior, because it affects non-tool and script-less Nodes too. Fixing this would, for example, make it possible to create scenes with particles that auto-emit and then free themselves, without having to add scripts that aren't otherwise necessary.
https://github.com/user-attachments/assets/f497ace1-6b9a-4096-9da5-b7f524f6bd3e
### Steps to reproduce
1. Connect the ready signal to any node from the Editor's Node tab.
2. Close and reopen the scene. The connected function should run instantly.
### Minimal reproduction project (MRP)
The included reproduction project has 2 scenes: visibility_example and crash_example.
I recommend loading visibility_example first, as crash_example will make the editor stuck in the previously mentioned loop.
visibility_example only has a Label and TextureRect. The TextureRect has its visibility set to false (you can check the value in the tscn file), so it should be hidden. However, once the editor loads it, it'll trigger the ready signal, connected to that Node's show() method.
crash_example contains a node that will call queue_free() once the editor loads it, triggering the error.
[editor_signals_test.zip](https://github.com/user-attachments/files/17614057/editor_signals_test.zip)
| bug,topic:editor,needs testing | low | Critical |
2,631,915,450 | rust | Unwarned Trait recursion causes stack overflow | ### Code
```rs
pub trait Dog: Animal {}

pub trait Animal {
    fn speak(&self) {
        println!("Hey.");
    }
}

impl<D> Animal for D where D: Dog {
    fn speak(&self) {
        self.speak()
    }
}

pub struct Labrador;

impl Dog for Labrador {}

fn main() {
    Labrador.speak();
}
```
### Current output
#### Errors
```
Exited with signal 6 (SIGABRT): abort program
```
#### Standard Error
```
Compiling playground v0.0.1 (/playground)
Finished `dev` profile [unoptimized + debuginfo] target(s) in 0.57s
Running `target/debug/playground`
thread 'main' has overflowed its stack
fatal runtime error: stack overflow
```
#### Standard Output
### Desired output
```
warning: function cannot return without recursing
 --> src/main.rs:4:5
  |
4 |     fn speak(&self) {
  |     ^^^^^^^^^^^^^^^ cannot return without recursing
5 |         self.speak();
  |         ------------ recursive call site
  |
```
### Rationale and extra context
This class of error is properly caught and warned against in the simplest case: calling a trait method from within that same trait method. You can see this in the above code by replacing the blanket implementation with:
```rs
impl Animal for Labrador {
    fn speak(&self) {
        self.speak()
    }
}
```
Which results in the warning listed above in the "desired output".
What appears to be happening is that, because `Dog` is required to be `Animal`, calling `self.speak()` from within the blanket implementation is calling the `Animal` method on `Dog` - which takes precedence over calling the trait method we are currently in.
This code snippet is already seemingly tautological, and it isn't clear how the compiler is reasoning about this. A type implementing `Dog` *must* be an `Animal`, yet `Labrador` cannot have the `Animal` trait unless it is also a `Dog`!
### Other cases
_No response_
### Rust Version
I've been using rust playground for this: https://play.rust-lang.org/?version=stable&mode=debug&edition=2021&gist=e62693e73ead0c65f1b471f8eca6a43a
Using rust 1.82.0
### Anything else?
_No response_ | C-enhancement,A-lints,T-compiler,L-unconditional_recursion,L-false-negative | low | Critical |
2,631,980,670 | vscode | Ability to Temporarily Lock Folders in Workspace | <!-- ⚠️⚠️ Do Not Delete This! feature_request_template ⚠️⚠️ -->
<!-- Please read our Rules of Conduct: https://opensource.microsoft.com/codeofconduct/ -->
<!-- Please search existing issues to avoid creating duplicates. -->
<!-- Describe the feature you'd like. -->
It would be helpful to have a feature that allows users to temporarily lock project folders they aren’t actively working on within a workspace. The locked folders could display a pop-up notification each time a user attempts to access them, preventing accidental edits and helping to maintain focus on the active folder. This feature would be particularly useful for developers working in large workspaces with multiple projects.
A lock option should be included in the options list for a folder. | feature-request,file-explorer | medium | Major |
2,632,003,387 | node | Port/Duplicate async_hooks tests to use AsyncLocalStore | Shortly after https://github.com/nodejs/node/pull/55552 was merged, I noticed that this resulted in losing a lot of tests for `AsyncLocalStorage`.
Before that change, `AsyncLocalStorage` was based on async hooks, so the many existing tests for async hooks implicitly verified that `AsyncLocalStorage` does what it is expected to do.
But since https://github.com/nodejs/node/pull/55552, `AsyncLocalStorage` and async hooks are independent.
I don't think we have to duplicate them all, but there are some special cases, at least in the HTTP area (e.g. [here](https://github.com/nodejs/node/blob/32ff100bfa0c0f4ff72fd2228bc64b1d7ec62557/lib/_http_agent.js#L534-L542)), which might require follow-ups.
fyi @nodejs/diagnostics
Edit 21.11.: corrected link to PR
| diag-agenda,async_local_storage | low | Major |
2,632,003,628 | flutter | [Impeller] Camera Preview Not Displayed on Galaxy S9+ Model | ### Steps to reproduce
Recently, on the Galaxy S9+ model, the app screen appeared black, so I upgraded the Flutter version to 3.27.0-1.0.pre.329.
From reviewing other GitHub issues, it appears that the Galaxy S9+ has incomplete support for Vulkan. This issue was resolved in version 3.27.0.
Now, while the app screen appears, packages that use the camera still don’t display the preview.
The issue does not occur if I run the app with the following command:
flutter run --no-enable-impeller
I have attached an example using the camera package.
### Expected results
The camera preview should be displayed.
### Actual results
The camera preview is not displayed.
### Code sample
<details open><summary>Code sample</summary>
```dart
import 'dart:io';
import 'package:flutter/material.dart';
import 'package:camera/camera.dart';

Future<void> main() async {
  WidgetsFlutterBinding.ensureInitialized();
  final cameras = await availableCameras();
  final firstCamera = cameras.first;
  runApp(
    MaterialApp(
      theme: ThemeData.dark(),
      home: MyApp(
        camera: firstCamera,
      ),
    ),
  );
}

class MyApp extends StatefulWidget {
  const MyApp({
    super.key,
    required this.camera,
  });

  final CameraDescription camera;

  @override
  MyAppState createState() => MyAppState();
}

class MyAppState extends State<MyApp> {
  late CameraController _controller;
  late Future<void> _initializeControllerFuture;

  @override
  void initState() {
    super.initState();
    _controller = CameraController(
      widget.camera,
      ResolutionPreset.medium,
    );
    _initializeControllerFuture = _controller.initialize();
  }

  @override
  void dispose() {
    _controller.dispose();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(title: const Text('Camera Screen')),
      body: Column(
        children: [
          FutureBuilder<void>(
            future: _initializeControllerFuture,
            builder: (context, snapshot) {
              if (snapshot.connectionState == ConnectionState.done) {
                return CameraPreview(_controller);
              } else {
                return const Center(child: CircularProgressIndicator());
              }
            },
          ),
        ],
      ),
      floatingActionButton: FloatingActionButton(
        onPressed: () async {
          try {
            await _initializeControllerFuture;
            final image = await _controller.takePicture();
            if (!context.mounted) return;
            await Navigator.of(context).push(
              MaterialPageRoute(
                builder: (context) => DisplayPictureScreen(
                  imagePath: image.path,
                ),
              ),
            );
          } catch (e) {
            print(e);
          }
        },
        child: const Icon(Icons.camera_alt),
      ),
    );
  }
}

class DisplayPictureScreen extends StatelessWidget {
  final String imagePath;

  const DisplayPictureScreen({super.key, required this.imagePath});

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(title: const Text('Captured Image')),
      body: Image.file(File(imagePath)),
    );
  }
}
</details>
### Screenshots or Video
<details open>
<summary>Screenshots / Video demonstration</summary>
[Upload media here]
</details>
### Logs
<details open><summary>Logs</summary>
```console
[Paste your logs here]
```
</details>
### Flutter Doctor output
<details open><summary>Doctor output</summary>
```console
Doctor summary (to see all details, run flutter doctor -v):
[√] Flutter (Channel master, 3.27.0-1.0.pre.329, on Microsoft Windows [Version 10.0.22631.4317], locale ko-KR)
[√] Windows Version (11 Pro 64-bit, 23H2, 2009)
[√] Android toolchain - develop for Android devices (Android SDK version 35.0.0-rc2)
[√] Chrome - develop for the web
[√] Visual Studio - develop Windows apps (Visual Studio Build Tools 2022 17.5.0)
[√] Android Studio (version 2023.2)
[√] VS Code (version 1.95.1)
[√] Connected device (5 available)
[√] Network resources
```
</details>
| e: device-specific,platform-android,engine,p: camera,package,c: rendering,P2,e: impeller,team-engine,triaged-engine | low | Major |
2,632,009,023 | TypeScript | The generated .d.ts file is missing jsdoc comments | ### 🔍 Search Terms
".d.ts","jsdoc"
### ✅ Viability Checklist
- [x] This wouldn't be a breaking change in existing TypeScript/JavaScript code
- [x] This wouldn't change the runtime behavior of existing JavaScript code
- [x] This could be implemented without emitting different JS based on the types of the expressions
- [x] This isn't a runtime feature (e.g. library functionality, non-ECMAScript syntax with JavaScript output, new syntax sugar for JS, etc.)
- [x] This isn't a request to add a new utility type: https://github.com/microsoft/TypeScript/wiki/No-New-Utility-Types
- [x] This feature would agree with the rest of our Design Goals: https://github.com/Microsoft/TypeScript/wiki/TypeScript-Design-Goals
### ⭐ Suggestion
When generating .d.ts, if the referenced type is not exported, take one of the following actions:
- Emit a compile error indicating that necessary types have not been exported
- Automatically generate non-exported types into the .d.ts
- Maintain the existing implementation and only add jsdoc comments (preferred)
Here is a simple example
1. I wrote a simple library 1
```
// myLib/index.d.ts
type FilterString<T> = {
  [K in keyof T as T[K] extends string ? never : K]: T[K];
}
export function define<T>(p: T): FilterString<T>;
```
2. Reference library 1 in library 2
```
import { define } from 'myLib'
export const test = define({
  /** jsdoc 1 */
  a: 5,
  /** jsdoc 2 */
  b: '',
})
```
3. Compile library 2 to generate .d.ts
```
// without jsdoc: type FilterString without export
export declare const test: {
  a: number; // Don't let jsdoc be lost (preferred)
};
```
```
// with jsdoc: type FilterString with export
// Alternatively, generate the FilterString type into .d.ts as well
export declare const test: import("myLib").FilterString<{
  /** jsdoc 1 */
  a: number;
  /** jsdoc 2 */
  b: string;
}>;
```
### 📃 Motivating Example
When writing libraries, maintain consistent JSDoc annotations between the development and release environments
### 💻 Use Cases
1. When writing libraries, maintain consistent JSDoc annotations between the development and release environments
2. I only discovered that JSDoc was missing when it was released
3. Manually find the type that has not been exported and add it to the export
| Suggestion,Help Wanted,Experience Enhancement | low | Critical |
2,632,014,213 | neovim | startup autocmd event documentation is inconsistent with how it behaves | ### Problem
Frequently there is no `ChanInfo` event.
Not sure if it's a doc/code/spec/cockpit issue.
In develop.txt there's (twice)
```
- Call |nvim_set_client_info()| after connecting, so users and plugins can
detect the UI by handling the |ChanInfo| event. This avoids the need for ...
```
But with `nvim-tui` the `ChanInfo` event is intermittent. With both `neovide`
and `goneovim` the event never happens.
At https://neovim.io/doc/user/starting.html#--embed there's
```
--embed ... Waits for the ... nvim_ui_attach() before sourcing startup files
... The client can do other requests before nvim_ui_attach (e.g. nvim_get_api_info
```
`ChanOpen` event is never seen. Might be my specific usage/environment.
In `autocmd.txt`
```
UIEnter
After a UI connects via |nvim_ui_attach()|,
or after builtin TUI is started.
```
```
ChanInfo
State of channel changed, for instance the
client of a RPC channel described itself.
```
### Steps to reproduce
Observations using the script below
With `nvim-tui` either
1. `UIEnter` without a later `ChanInfo`
In `UIEnter` handler `nvim_get_chan_info()` `client` value is populated.
2. `UIEnter` followed by `ChanInfo`
In `UIEnter` handler `nvim_get_chan_info()` `client` value is empty dict.
In `ChanInfo` `vim.v.event.info` is populated as expected.
With `neovide` never see `ChanInfo`
1. In `UIEnter` handler `nvim_get_chan_info()` value is populated.
```lua
vim.api.nvim_create_autocmd("ChanInfo", {
  callback = function(ev)
    vim.print(string.format("ChanInfo: ev: %s", vim.inspect(ev, {newline=""})))
    vim.print(string.format("ChanInfo: v:event info: %s", vim.inspect(vim.v.event.info)))
  end
})
vim.api.nvim_create_autocmd("UIEnter", {
  callback = function(ev)
    vim.print(string.format("UIEnter: ev: %s", vim.inspect(ev, {newline=""})))
    vim.print(string.format("UIEnter: v:event: %s", vim.inspect(vim.v.event, {newline=""})))
    vim.print(string.format("UIEnter: chan info: %s", vim.inspect(
      vim.api.nvim_get_chan_info(vim.v.event.chan), {newline=""})))
  end
})
```
### Expected behavior
More accurate descriptions; maybe in conjunction with code changes
(@bfredl follow up on #31057)
### Nvim version (nvim -v)
https://github.com/neovim/neovim/commit/0da4d89558a05fb86186253e778510cfd859caea
### Vim (not Nvim) behaves the same?
NA
### Operating system/version
ubuntu
### Terminal name/version
gnome-terminal
### $TERM environment variable
xterm-256color
### Installation
make install | bug,gui,startup,events | low | Major |
2,632,096,523 | pytorch | There is a bug in torch.cat for non-standard boolean | ### 🐛 Describe the bug
```python
import torch
import numpy as np
# params
lower_value = 5
upper_value = 10
input_shape = [1024,1,1024,1024] # input_shape = [1024,1,1024,1023]
# create numpy array
input_array_0 = np.random.uniform(lower_value, upper_value, input_shape).astype(dtype="int8")
input_array_1 = np.random.uniform(lower_value, upper_value, input_shape).astype(dtype="int8")
# convert to torch.tensor
input_tensor_0 = torch.from_numpy(input_array_0).to(device="cuda").view(bool)
input_tensor_1 = torch.from_numpy(input_array_1).to(device="cuda").view(bool)
input_tensors = []
input_tensors.append(input_tensor_0)
input_tensors.append(input_tensor_1)
# invoke torch.cat
out = torch.cat(input_tensors, dim=1)
print(input_tensor_0.untyped_storage()[0:5])
print(input_tensor_1.untyped_storage()[0:5])
print(out.untyped_storage()[0:5])
```
When `input_shape = [1024,1,1024,1023]`, the contents of `out.untyped_storage()` are values between 5 and 10, i.e. non-standard boolean bytes. However, when `input_shape = [1024,1,1024,1024]`, the contents of `out.untyped_storage()` are all standard boolean values (0 or 1).
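To make "non-standard boolean" concrete without needing torch or a GPU, here is a NumPy-only sketch of the same distinction (the variable names are illustrative, not part of the repro): `view(bool)` merely reinterprets the int8 buffer, so the original bytes survive, while a real conversion normalizes everything to 0/1.

```python
import numpy as np

# Raw int8 bytes in the 5..10 range, as in the repro above.
raw = np.array([5, 0, 7], dtype=np.int8)

# view(bool) only reinterprets the existing buffer: the bytes stay 5 and 7,
# which is exactly what "non-standard boolean" refers to here.
reinterpreted = raw.view(bool)

# astype(bool) performs an actual conversion, normalizing to canonical 0/1 bytes.
normalized = raw.astype(bool)

print(reinterpreted.tobytes())  # b'\x05\x00\x07' -- original bytes kept
print(normalized.tobytes())     # b'\x01\x00\x01' -- canonical boolean bytes
```

Converting the repro's tensors with something like `tensor != 0` instead of `.view(bool)` (an assumed workaround, not verified against this exact bug) would hand `torch.cat` only canonical boolean bytes.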


### Versions
torch 2.3.0 + cu121
cc @albanD | triaged,module: boolean tensor,module: python frontend,module: edge cases | low | Critical |
2,632,098,476 | TypeScript | auto-import with `package.json#imports` does not take `moduleResolution` into account | ### 🔎 Search Terms
"subpath", "imports", "package.json", "moduleResolution"
### 🕗 Version & Regression Information
- This changed in commit or PR #55015
- This is the behavior in every version I tried, and I reviewed the FAQ for entries about imports, completions
### ⏯ Playground Link
[alan910127/ts-60405-repro](https://github.com/alan910127/ts-60405-repro)
### 💻 Code
```jsonc
// package.json
{
"imports": {
"#*": "./src/*"
},
"devDependencies": {
"typescript": "5.6.3"
}
}
```
```jsonc
// tsconfig.json
{
"compilerOptions": {
"moduleResolution": "Bundler"
}
}
```
```ts
// src/foo/bar/baz.ts
export function baz() {}
```
```ts
// src/one/two/three.ts
// actual - import { baz } from "#foo/bar/baz.js";
// expected - import { baz } from "#foo/bar/baz";
```
### 🙁 Actual behavior
When auto-importing the function `baz` (either through completion or code action), the typescript language server will insert an import statement with a `.js` file extension.
```ts
import { baz } from "#foo/bar/baz.js";
```
### 🙂 Expected behavior
Since we are using `"moduleResolution": "Bundler"`, it should insert an import statement without the `.js` file extension.
```ts
import { baz } from "#foo/bar/baz";
```
### Additional information about the issue
I've seen a similar issue, #59200, but I cannot reproduce it with the same setup. Therefore, I think my issue might be related to the subpath imports.
2,632,131,845 | kubernetes | Inconsistency of Partitions in StatefulSets with StartOrdinal Feature | ### What happened?
StatefulSet with
```
Replicas 9
StartOrdinal 2
Partition 5
```
we will get
[2,3,4,5,6] current revision
[7,8,9,10] updated revision
### What did you expect to happen?
If I understand the partition correctly, per the definition
https://github.com/kubernetes/kubernetes/blob/3036d107a0ee4855b992e9f49eded88e0a739734/staging/src/k8s.io/api/apps/v1/types.go#L117-L122
I will get
[2,3,4] current revision
[5,6,7,8,9,10] updated revision
Or we could define partition as the number of pods kept at the current revision.
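The two readings can be contrasted with a small sketch (plain Python, illustrative only, not controller code). Under the observed behavior the partition cutoff is offset by the start ordinal; under the documented reading it is an absolute ordinal cutoff:

```python
def split(replicas, start, partition, offset_by_start):
    """Split pod ordinals into (current, updated) revisions.

    offset_by_start=True models the observed behavior (cutoff at
    start + partition); False models the reading of the API docs
    (cutoff at the absolute ordinal `partition`).
    """
    ordinals = list(range(start, start + replicas))
    cutoff = start + partition if offset_by_start else partition
    current = [o for o in ordinals if o < cutoff]
    updated = [o for o in ordinals if o >= cutoff]
    return current, updated

# Observed: partition counted from the start ordinal
assert split(9, 2, 5, True) == ([2, 3, 4, 5, 6], [7, 8, 9, 10])
# Documented reading: partition as an absolute ordinal cutoff
assert split(9, 2, 5, False) == ([2, 3, 4], [5, 6, 7, 8, 9, 10])
```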
### How can we reproduce it (as minimally and precisely as possible)?
1. Create a StatefulSet with replicas 9 and start ordinal 2
```
apiVersion: apps/v1
kind: StatefulSet
metadata:
creationTimestamp: "2024-09-12T03:06:27Z"
generation: 1
name: ss2
resourceVersion: "23988"
uid: d556aa7c-c4da-4676-8caa-4b32db831c37
spec:
podManagementPolicy: OrderedReady
replicas: 9
revisionHistoryLimit: 10
selector:
matchLabels:
baz: blah
foo: bar
serviceName: test
ordinals:
start: 2
template:
metadata:
labels:
baz: blah
foo: bar
spec:
containers:
- image: xxxx
imagePullPolicy: IfNotPresent
name: nginx
resources: {}
updateStrategy:
rollingUpdate:
maxUnavailable: 1
partition: 0
type: RollingUpdate
```
2. Update the StatefulSet and set partition to 5
### Anything else we need to know?
_No response_
### Kubernetes version
<details>
```console
$ kubectl version
# paste output here
Client Version: v1.29.3
Kustomize Version: v5.0.4-0.20230601165947-6ce0bf390ce3
Server Version: v1.28.7
```
</details>
### Cloud provider
<details>
kind
</details>
### OS version
<details>
```console
# On Linux:
$ cat /etc/os-release
# paste output here
$ uname -a
# paste output here
# On Windows:
C:\> wmic os get Caption, Version, BuildNumber, OSArchitecture
# paste output here
```
</details>
### Install tools
<details>
</details>
### Container runtime (CRI) and version (if applicable)
<details>
</details>
### Related plugins (CNI, CSI, ...) and versions (if applicable)
<details>
KCM
</details>
| kind/bug,sig/apps,needs-triage | low | Minor |
2,632,164,597 | PowerToys | Updates info relese notes opens even clicked off from settings | ### Microsoft PowerToys version
0.85.1
### Installation method
PowerToys auto-update
### Running as admin
Yes
### Area(s) with issue?
General
### Steps to reproduce
I dont know.
### ✔️ Expected Behavior
I don't need to see update notes weeks after the latest update. When the "Show notifications" and "Show the release notes" options are turned off, the window should not open when PowerToys is started, either at startup or manually.
### ❌ Actual Behavior
When I start my computer and PowerToys auto-starts as administrator, the release notes window opens even though "Show notifications" and "Show the release notes" are turned off, and even after unticking the boxes in the updates section.
### Other Software
_No response_ | Issue-Bug,Needs-Triage | low | Minor |
2,632,176,734 | PowerToys | Shortcut key “chords” sharing the first “chord” | ### Description of the new feature / enhancement
Key chords are great for binding related stuff to the same ROOT key e.g.
- `Windows + S, +S`: pause track
- `Windows + S, +A`: prev track
- `Windows + S, +D`: next track
> Also note that adding this second dimension frees other keys and allows reusing the second chord for similar actions but in other contexts: A/D keys can be used for left/right whatever...
You CAN do this, but the initial chord can trigger Windows shortcuts, e.g. the Windows Search pop-up appears whenever I press `Windows + S, +S`
I tried setting `Windows + S` to disabled but then you cannot bind the other chords:

### Scenario when this would be used?
Showcased in my example, I used them all the time also e.g. for git in vscode:
- ALT+G,S (stage)
- ALT+G,R (revert)
- ...
| Needs-Triage | low | Minor |
2,632,291,906 | langchain | RedisVectorStore add_texts uses wrong function parameter for ids | ### Checked other resources
- [X] I added a very descriptive title to this issue.
- [X] I searched the LangChain documentation with the integrated search.
- [X] I used the GitHub search to find a similar question and didn't find it.
- [X] I am sure that this is a bug in LangChain rather than my code.
- [X] The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).
### Example Code
The RedisVectorStore implementation uses a different name for ids, namely 'keys'. In all other vector stores and implementations it is 'ids'.
```python
vector_store = RedisVectorStore(EMBEDDING_MODEL, config=config)
doc = Document(
"Some text",
id="unique_id1"
)
added_ids = vector_store.add_documents([doc]) # does not work, ids are not used!
added_ids2 = vector_store.add_documents([doc], keys=[doc.id]) # works, since RedisVectorStore is looking for 'keys' instead of 'ids'
```
### Error Message and Stack Trace (if applicable)
_No response_
### Description
The RedisVectorStore implementation uses a different name for ids, namely 'keys'. In all other vector stores and implementations it is 'ids'. This seems wrong, and it's only possible to set your own ids if you inspect the code and pass 'keys' instead.
### System Info
System Information
------------------
> OS: Darwin
> OS Version: Darwin Kernel Version 23.6.0: Wed Jul 31 20:49:46 PDT 2024; root:xnu-10063.141.1.700.5~1/RELEASE_ARM64_T8103
> Python Version: 3.11.9 (main, Apr 2 2024, 08:25:04) [Clang 15.0.0 (clang-1500.3.9.4)]
Package Information
-------------------
> langchain_core: 0.3.14
> langchain: 0.3.6
> langchain_community: 0.3.4
> langsmith: 0.1.137
> langchain_redis: 0.1.1
> langchain_text_splitters: 0.3.1
Optional packages not installed
-------------------------------
> langgraph
> langserve
Other Dependencies
------------------
> aiohttp: 3.10.10
> async-timeout: Installed. No version info available.
> dataclasses-json: 0.6.7
> httpx: 0.27.2
> httpx-sse: 0.4.0
> jsonpatch: 1.33
> numpy: 1.26.4
> orjson: 3.10.10
> packaging: 24.1
> pydantic: 2.9.2
> pydantic-settings: 2.6.0
> python-ulid: 2.7.0
> PyYAML: 6.0.2
> redisvl: 0.3.5
> requests: 2.32.3
> requests-toolbelt: 1.0.0
> SQLAlchemy: 2.0.36
> tenacity: 8.5.0
> typing-extensions: 4.12.2 | Ɑ: vector store | low | Critical |
2,632,292,876 | PowerToys | Shortcut key “chords” for activating apps | ### Description of the new feature / enhancement
The keyboard manager can map “chords” so I wonder why there is no option to bind them for app activation e.g. Text Extractor.
As explained in https://github.com/microsoft/PowerToys/issues/35741, “chords” are very useful for related actions that share the first “chord”.
For example the mouse utilities could be:
- Windows +K, F (find)
- Windows +K, H (highlight)
- Windows +K, C (crosshair)
> Notice that, currently, you cannot share the same root key and expect it to override the windows shortcut https://github.com/microsoft/PowerToys/issues/35741
### Scenario when this would be used?
The current workaround is to bind an app to a regular shortcut (maybe an obscure one) and then use the keyboard mapper to bind a chord to that.
This would make the process much simpler and streamline the user experience + show the proper keys in the dashboard etc.
IMO, I would give it low priority, certainly lower than https://github.com/microsoft/PowerToys/issues/35741.
It depends on the complexity of implementing this enhancement; the workaround is not that complicated, and there are not that many apps.
> But for now this issue can be referenced, used for discussion and so on!
| Needs-Triage | low | Minor |
2,632,306,028 | storybook | [Bug]: 'viteConfigFile' automigration gets triggered for `@storybook/experimental-nextjs-vite` | ### Describe the bug
When running `npx storybook@latest upgrade` in a Storybook project that uses `@storybook/experimental-nextjs-vite`, the `viteConfigFile` automigration is automatically triggered. This isn't necessary, because a dedicated `vite.config.ts` file isn't required for this framework.
### Reproduction link
-
### Reproduction steps
_No response_
### System
-
### Additional context
_No response_ | bug,nextjs,automigrations | low | Critical |
2,632,348,291 | vscode | Editable Refactor Preview | # Problem Statement
Current refactor preview is very limited when used with SOTA models, meaning that we have to do a lot of extra work after the edits are applied to let users post-edit partially correct changes and keep a good code state in their workspace.
The UX for this is always subpar and requires users to learn specific ways to interact with different stages, rather than being able to perform a single action.
We believe that with modern ML models, which provide large multi-file changes, it should be possible to let users quickly apply and modify the ML changes without struggle, and the UX should allow that.
In that sense, making the refactor preview editable would allow the user to make all their changes directly within the bounds of the suggestion, with the added benefit of gathering telemetry for the providers (which can easily collect the better expected response).
## Discussion points
### Why not just create your own view
This is not that straightforward. We want to keep the interaction model as it is in the refactor preview, where the user can cheaply make changes to the model on the right side without breaking their workspace, then apply the changes once happy with the final result (or, even better, apply them partially).
Note that we can't create editable virtual files from the extension that could be used for this case; otherwise we could build a similar RP experience with the multi diff view and a custom panel.
### Why make the RP editable?
With ML in the game, possibly providing multifile changes, users might not be confident in the direct solution provided by the tool. Therefore simple acceptance/rejection is not an option. In that case its very helpful to be able to tweak specific parts of the edit and apply the result afterwards.
In this case we would be able to allow users to accept more helpful ML suggestions, and allow them to tweak it at the source of the change, rather than post factum, with tedious and potentially unhelpful UX.
### How is this helpful to ML?
Allowing the user to apply corrections to the edits right at the source, and not post factum, allows for better tracking of content rewrites. This can feed back into the training/fine-tuning sets to improve the content further, and it provides a clear boundary between **where the changes to the ML edit end** and **where the unrelated user edits start**.
# Possible solution
- Allow tools to pass in a specified flag in the `vscode.workspace.applyEdit()` call to trigger this behavior
- Remove the possibility to select/deselect edits in a single file (either directly, or after user made the first edit to the file)
- Allow the virtual file TextModel to be editable for RP when requested (note that the factor impacting this is the implicit Readonly on the simple text model)
- Apply all changes that user made
- Preferably as a sequence over the original file to correctly track all the user changes
- Alternatively just grab the right hand side files and apply them whole over the original files
**Happy to help bring this in to production** :) | feature-request,workspace-edit | low | Minor |
2,632,350,605 | excalidraw | add an equal distribution feature | I would like to add an equal distribution feature, similar to the one in PowerPoint. For example, placing 5 blocks at once, and then clicking on equal distribution will make the spacing between them uniform. | More information needed / cannot reproduce | low | Minor |
2,632,419,158 | flutter | TextField doesn't receive input in focused state above PlatformView in Firefox | ### Steps to reproduce
1. Run flutter app for web: `flutter run -d web-server --web-renderer canvaskit`
2. Open the URL in Firebox: `lib/main.dart is being served at http://localhost:XXXXX`
3. Open dialog with a tab on the `FloatingActionButton`
4. Click into the textfield and try to add input
Firefox Version: 132.0 (aarch64)
### Expected results
The textfield receives the input from the keyboard.
### Actual results
The text field doesn't receive the input, even though it received focus.
A super weird behavior is that the text field does receive the input when switching tabs: see video
### Code sample
<details open><summary>Code sample</summary>
```dart
import 'package:flutter/material.dart';
import 'package:web/web.dart' as web;
void main() {
runApp(App());
}
class App extends StatelessWidget {
const App({super.key});
@override
Widget build(BuildContext context) {
return MaterialApp(
home: Scaffold(
body: Builder(
builder: (context) {
return Stack(
children: [
Positioned.fill(child: NativeMap()),
Center(
child: FloatingActionButton(
child: Icon(Icons.open_in_browser),
onPressed: () {
showDialog<void>(
context: context,
barrierDismissible: false,
builder: (context) => Dialog(
child: SizedBox(
height: 300,
width: 200,
child: Padding(
padding: const EdgeInsets.all(20.0),
child: Column(
children: [
IconButton(
onPressed: () {
Navigator.of(context).pop();
},
icon: Icon(Icons.close),
),
TextField(
decoration: InputDecoration(
contentPadding: EdgeInsets.all(20),
),
),
],
),
),
),
),
);
},
),
),
],
);
},
),
),
);
}
}
class NativeMap extends StatelessWidget {
const NativeMap({super.key});
@override
Widget build(BuildContext context) {
// this could also be GoogleMaps, Maplibre, etc.
return HtmlElementView.fromTagName(
tagName: 'div',
isVisible: true,
onElementCreated: _onElementCreated,
);
}
void _onElementCreated(Object element) {
(element as web.HTMLElement)
..style.width = '100%'
..style.height = '100%'
..style.backgroundColor = '#FF0000';
}
}
```
</details>
### Screenshots or Video
<details open>
<summary>Screenshots / Video demonstration</summary>
https://github.com/user-attachments/assets/0732ccb3-157b-41f1-8e19-1d4378824574
</details>
### Logs
<details open><summary>Logs</summary>
```console
...
```
</details>
### Flutter Doctor output
<details open><summary>Doctor output</summary>
```console
[✓] Flutter (Channel stable, 3.24.3, on macOS 14.6.1 23G93 darwin-arm64 (Rosetta), locale de-DE)
• Flutter version 3.24.3 on channel stable at /Users/stefanschaller/fvm/versions/3.24.3
• Upstream repository https://github.com/flutter/flutter.git
• Framework revision 2663184aa7 (vor 8 Wochen), 2024-09-11 16:27:48 -0500
• Engine revision 36335019a8
• Dart version 3.5.3
• DevTools version 2.37.3
[✓] Android toolchain - develop for Android devices (Android SDK version 34.0.0)
• Android SDK at /Users/stefanschaller/Library/Android/sdk
• Platform android-34, build-tools 34.0.0
• Java binary at: /Users/stefanschaller/Applications/Android Studio.app/Contents/jbr/Contents/Home/bin/java
• Java version OpenJDK Runtime Environment (build 21.0.3+-79915917-b509.11)
• All Android licenses accepted.
[✓] Xcode - develop for iOS and macOS (Xcode 16.1)
• Xcode at /Applications/Xcode.app/Contents/Developer
• Build 16B40
• CocoaPods version 1.15.2
[✓] Chrome - develop for the web
• Chrome at /Applications/Google Chrome.app/Contents/MacOS/Google Chrome
[✓] Android Studio (version 2024.2)
• Android Studio at /Users/stefanschaller/Applications/Android Studio.app/Contents
• Flutter plugin can be installed from:
🔨 https://plugins.jetbrains.com/plugin/9212-flutter
• Dart plugin can be installed from:
🔨 https://plugins.jetbrains.com/plugin/6351-dart
• Java version OpenJDK Runtime Environment (build 21.0.3+-79915917-b509.11)
[✓] IntelliJ IDEA Ultimate Edition (version 2024.2.4)
• IntelliJ at /Users/stefanschaller/Applications/IntelliJ IDEA Ultimate.app
• Flutter plugin version 82.0.3
• Dart plugin version 242.22855.32
[✓] Connected device (4 available)
• Stefan‘s iPhone (mobile) • 00008120-000858943610A01E • ios • iOS 18.0 22A3354
• macOS (desktop) • macos • darwin-arm64 • macOS 14.6.1 23G93 darwin-arm64 (Rosetta)
• Mac Designed for iPad (desktop) • mac-designed-for-ipad • darwin • macOS 14.6.1 23G93 darwin-arm64 (Rosetta)
• Chrome (web) • chrome • web-javascript • Google Chrome 130.0.6723.92
[✓] Network resources
• All expected network resources are available.
• No issues found!
```
</details>
| a: text input,a: platform-views,p: maps,platform-web,package,has reproducible steps,browser: firefox,P2,team-web,triaged-web,fyi-text-input,found in release: 3.24,found in release: 3.27 | low | Major |
2,632,518,348 | PowerToys | Workspaces doesn't recognize firefox developer edtion | ### Microsoft PowerToys version
0.85.1
### Installation method
WinGet
### Running as admin
Yes
### Area(s) with issue?
Workspaces
### Steps to reproduce
Try to create a new workspace

Open firefox developer edition while in capture mode in order to add it to workspace:

Default firefox version is added

### ✔️ Expected Behavior
Workspace with firefox developer edition
### ❌ Actual Behavior
Workspace with regular firefox edition
### Other Software
_No response_ | Issue-Bug,Needs-Triage,Product-Workspaces | low | Minor |
2,632,524,220 | go | proposal: slices, sort: randomize sort order of equal elements in tests | ### Proposal Details
Go's (non-stable) sort functions don't guarantee a particular ordering for equal elements. However, since the order of equal elements has stayed the same across a number of releases, tests tend to accumulate accidental dependencies on the ordering of equal elements.
In Google's codebase we noticed that a large number of tests depended on the order of equal elements when the sorting algorithm was changed to pdqsort in April 2022 (http://go.dev/cl/371574). We needed to fix all of these to be able to release pdqsort internally. Recently http://go.dev/cl/624295 changed how equal elements are ordered. In the 2½ years since pdqsort was introduced, a few tests have already started to depend on the order of equal elements again.
This is a classic example of [Hyrum's Law](http://hyrumslaw.com)
> With a sufficient number of users of an API,
> it does not matter what you promise in the contract:
> all observable behaviors of your system
> will be depended on by somebody.
Fixing tests when the order of equal elements changes is a lot of work and makes upgrading to a new Go version harder. It's also surprisingly hard to tell whether the order of equal elements actually matters (i.e. whether a stable sort should have been used) or whether it was depended upon accidentally. Both problems can be mitigated sufficiently by randomizing the order of equal elements on each test run. That way, tests that depend on the order of equal elements give a very early signal of the problem.
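The mitigation can be illustrated with a short sketch (Python here purely for illustration; the proposal itself targets Go's `slices` and `sort` packages). Shuffling the input before a stable sort randomizes the relative order of equal elements, so a test that asserts a particular tie order fails nondeterministically instead of silently passing:

```python
import random

def sort_with_randomized_ties(items, key):
    """Shuffle first, then stable-sort: equal elements end up in a
    random relative order, surfacing accidental order dependence."""
    shuffled = items[:]
    random.shuffle(shuffled)
    return sorted(shuffled, key=key)

pairs = [("a", 1), ("b", 1), ("c", 1)]
out = sort_with_randomized_ties(pairs, key=lambda p: p[1])
# The keys are still sorted...
assert [p[1] for p in out] == [1, 1, 1]
# ...but a test asserting out == pairs would now fail most of the time.
assert sorted(p[0] for p in out) == ["a", "b", "c"]
```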
Google already did this for C++ some time ago and it's working very well:
- https://libcxx.llvm.org/DesignDocs/UnspecifiedBehaviorRandomization.html
- https://reviews.llvm.org/D96946
- https://danlark.org/2022/04/20/changing-stdsort-at-googles-scale-and-beyond/ | Proposal | low | Major |
2,632,530,638 | rust | How do I run a single test of libstd with --nocapture ? | It is unclear to me how to run a single test of libstd using --nocapture. I've tried `--verbose` and `--test-args --nocapture` without any luck.
Using
```shell
--test-args `-- --nocapture`
```
reveals that `./x.py` passes the test runner a `--quiet` flag. | T-bootstrap,C-bug,E-needs-investigation | low | Major |
2,632,547,973 | electron | Resize window doesn't work on Wayland | ### Preflight Checklist
- [x] I have read the [Contributing Guidelines](https://github.com/electron/electron/blob/main/CONTRIBUTING.md) for this project.
- [x] I agree to follow the [Code of Conduct](https://github.com/electron/electron/blob/main/CODE_OF_CONDUCT.md) that this project adheres to.
- [x] I have searched the [issue tracker](https://www.github.com/electron/electron/issues) for a bug report that matches the one I want to file, without success.
### Electron Version
33.0.2
### What operating system(s) are you using?
Other Linux
### Operating System Version
Arch Linux, Gnome 47.1
### What arch are you using?
x64
### Last Known Working Electron version
_No response_
### Expected Behavior

Works when running with x
### Actual Behavior

Ignores pointer when running with wayland
### Testcase Gist URL
_No response_
### Additional Information
_No response_ | platform/linux,bug :beetle:,has-repro-comment,33-x-y | medium | Critical |
2,632,548,448 | vscode | vscode takes too long to start with a white page |
## System Info
* Code: 1.95.1 (65edc4939843c90c34d61f4ce11704f09d3e5cb6)
* OS: linux(5.15.0-124-generic)
* CPUs: 11th Gen Intel(R) Core(TM) i7-11800H @ 2.30GHz(16 x 2940)
* Memory(System): 31.07 GB(25.13GB free)
* Memory(Process): 232.02 MB working set(139.79MB private, 0.91MB shared)
* VM(likelihood): 0%
* Initial Startup: true
* Has 0 other windows
* Screen Reader Active: false
* Empty Workspace: false
## Performance Marks
| What | Duration | Process | Info |
| -------------------------------------------------------------- | -------- | ------------------------- | ---------------------------------------------------------- |
| start => app.isReady | 120 | [main] | initial startup: true |
| nls:start => nls:end | 0 | [main] | initial startup: true |
| import(main.js) | 227 | [main] | initial startup: true |
| run main.js | 159 | [main] | initial startup: true |
| start crash reporter | 3 | [main] | initial startup: true |
| serve main IPC handle | 1 | [main] | initial startup: true |
| create window | 39 | [main] | initial startup: true, state: 0ms, widget: 36ms, show: 0ms |
| app.isReady => window.loadUrl() | 119 | [main] | initial startup: true |
| window.loadUrl() => begin to import(workbench.desktop.main.js) | 27539 | [main->renderer] | NewWindow |
| import(workbench.desktop.main.js) | 474 | [renderer] | cached data: NO |
| wait for window config | 3 | [renderer] | - |
| init storage (global & workspace) | 16 | [renderer] | - |
| init workspace service | 30 | [renderer] | - |
| register extensions & spawn extension host | 316 | [renderer] | - |
| restore viewlet | 13 | [renderer] | workbench.view.explorer |
| restore panel | 0 | [renderer] | - |
| restore & resolve visible editors | 121 | [renderer] | 1: workbench.editors.files.fileEditorInput |
| create workbench contributions | 24 | [renderer] | 72 blocking startup |
| overall workbench load | 262 | [renderer] | - |
| workbench ready | 28552 | [main->renderer] | - |
| renderer ready | 778 | [renderer] | - |
| shared process connection ready | 318 | [renderer->sharedprocess] | - |
| extensions registered | 28831 | [renderer] | - |
## Extension Activation Stats
| Extension | Eager | Load Code | Call Activate | Finish Activate | Event | By |
| ------------------------ | ----- | --------- | ------------- | --------------- | ----------------- | ------------------------ |
| vscode.git | true | 75 | 7 | 59 | * | vscode.git |
| vscode.git-base | true | 4 | 1 | 0 | * | vscode.git |
| vscode.github | true | 25 | 1 | 5 | * | vscode.github |
| vscode.debug-auto-launch | false | 2 | 0 | 0 | onStartupFinished | vscode.debug-auto-launch |
| vscode.emmet | false | 19 | 11 | 0 | onLanguage | vscode.emmet |
| vscode.merge-conflict | false | 17 | 3 | 5 | onStartupFinished | vscode.merge-conflict |
## Terminal Stats
| Name | Timestamp | Delta | Total |
| --------------------------------------- | ------------- | ----- | ----- |
| code/terminal/willGetTerminalBackend | 1730719399863 | 0 | 0 |
| code/terminal/didGetTerminalBackend | 1730719399863 | 0 | 0 |
| code/terminal/willReconnect | 1730719399863 | 0 | 0 |
| code/terminal/willGetTerminalLayoutInfo | 1730719399863 | 0 | 0 |
| code/terminal/didGetTerminalLayoutInfo | 1730719399910 | 47 | 47 |
| code/terminal/didReconnect | 1730719399911 | 1 | 48 |
| code/terminal/willReplay | 1730719399911 | 0 | 48 |
| code/terminal/didReplay | 1730719399911 | 0 | 48 |
| code/terminal/willGetPerformanceMarks | 1730719399911 | 0 | 48 |
| code/terminal/didGetPerformanceMarks | 1730719399916 | 5 | 53 |
## Workbench Contributions Blocking Restore
* Total (LifecyclePhase.Starting): 36 (13ms)
* Total (LifecyclePhase.Ready): 42 (11ms)
| Name | Timestamp | Delta | Total |
| --------------------------------------------------------------------------------------------------- | ------------- | ----- | ----- |
| code/willCreateWorkbenchContribution/1/workbench.contrib.navigableContainerManager | 1730719399266 | 0 | 0 |
| code/didCreateWorkbenchContribution/1/workbench.contrib.navigableContainerManager | 1730719399267 | 1 | 1 |
| code/willCreateWorkbenchContribution/1/workbench.contrib.extensionPoints | 1730719399267 | 0 | 1 |
| code/didCreateWorkbenchContribution/1/workbench.contrib.extensionPoints | 1730719399267 | 0 | 1 |
| code/willCreateWorkbenchContribution/1/workbench.contrib.viewsExtensionHandler | 1730719399267 | 0 | 1 |
| code/didCreateWorkbenchContribution/1/workbench.contrib.viewsExtensionHandler | 1730719399267 | 0 | 1 |
| code/willCreateWorkbenchContribution/1/workbench.contrib.preferencesActions | 1730719399267 | 0 | 1 |
| code/didCreateWorkbenchContribution/1/workbench.contrib.preferencesActions | 1730719399270 | 3 | 4 |
| code/willCreateWorkbenchContribution/1/workbench.contrib.preferences | 1730719399270 | 0 | 4 |
| code/didCreateWorkbenchContribution/1/workbench.contrib.preferences | 1730719399271 | 1 | 5 |
| code/willCreateWorkbenchContribution/1/workbench.contrib.notebook | 1730719399271 | 0 | 5 |
| code/didCreateWorkbenchContribution/1/workbench.contrib.notebook | 1730719399271 | 0 | 5 |
| code/willCreateWorkbenchContribution/1/workbench.contrib.cellContentProvider | 1730719399271 | 0 | 5 |
| code/didCreateWorkbenchContribution/1/workbench.contrib.cellContentProvider | 1730719399272 | 1 | 6 |
| code/willCreateWorkbenchContribution/1/workbench.contrib.cellInfoContentProvider | 1730719399272 | 0 | 6 |
| code/didCreateWorkbenchContribution/1/workbench.contrib.cellInfoContentProvider | 1730719399272 | 0 | 6 |
| code/willCreateWorkbenchContribution/1/workbench.contrib.notebookMetadataContentProvider | 1730719399272 | 0 | 6 |
| code/didCreateWorkbenchContribution/1/workbench.contrib.notebookMetadataContentProvider | 1730719399272 | 0 | 6 |
| code/willCreateWorkbenchContribution/1/workbench.contrib.registerCellSchemas | 1730719399272 | 0 | 6 |
| code/didCreateWorkbenchContribution/1/workbench.contrib.registerCellSchemas | 1730719399273 | 1 | 7 |
| code/willCreateWorkbenchContribution/1/workbench.contrib.chatResolver | 1730719399273 | 0 | 7 |
| code/didCreateWorkbenchContribution/1/workbench.contrib.chatResolver | 1730719399273 | 0 | 7 |
| code/willCreateWorkbenchContribution/1/workbench.contrib.chatExtensionPointHandler | 1730719399273 | 0 | 7 |
| code/didCreateWorkbenchContribution/1/workbench.contrib.chatExtensionPointHandler | 1730719399274 | 1 | 8 |
| code/willCreateWorkbenchContribution/1/workbench.contrib.chatViewsWelcomeHandler | 1730719399274 | 0 | 8 |
| code/didCreateWorkbenchContribution/1/workbench.contrib.chatViewsWelcomeHandler | 1730719399274 | 0 | 8 |
| code/willCreateWorkbenchContribution/1/workbench.contrib.explorerViewletViews | 1730719399274 | 0 | 8 |
| code/didCreateWorkbenchContribution/1/workbench.contrib.explorerViewletViews | 1730719399276 | 2 | 10 |
| code/willCreateWorkbenchContribution/1/workbench.contrib.textFileEditorTracker | 1730719399276 | 0 | 10 |
| code/didCreateWorkbenchContribution/1/workbench.contrib.textFileEditorTracker | 1730719399276 | 0 | 10 |
| code/willCreateWorkbenchContribution/1/workbench.contrib.textFileSaveErrorHandler | 1730719399276 | 0 | 10 |
| code/didCreateWorkbenchContribution/1/workbench.contrib.textFileSaveErrorHandler | 1730719399276 | 0 | 10 |
| code/willCreateWorkbenchContribution/1/workbench.contrib.fileUriLabel | 1730719399276 | 0 | 10 |
| code/didCreateWorkbenchContribution/1/workbench.contrib.fileUriLabel | 1730719399276 | 0 | 10 |
| code/willCreateWorkbenchContribution/1/workbench.contrib.dirtyFilesIndicator | 1730719399276 | 0 | 10 |
| code/didCreateWorkbenchContribution/1/workbench.contrib.dirtyFilesIndicator | 1730719399277 | 1 | 11 |
| code/willCreateWorkbenchContribution/1/workbench.contrib.replacePreviewContentProvider | 1730719399277 | 0 | 11 |
| code/didCreateWorkbenchContribution/1/workbench.contrib.replacePreviewContentProvider | 1730719399277 | 0 | 11 |
| code/willCreateWorkbenchContribution/1/workbench.contrib.searchEditor | 1730719399277 | 0 | 11 |
| code/didCreateWorkbenchContribution/1/workbench.contrib.searchEditor | 1730719399277 | 0 | 11 |
| code/willCreateWorkbenchContribution/1/workbench.contrib.mergeEditorResolver | 1730719399277 | 0 | 11 |
| code/didCreateWorkbenchContribution/1/workbench.contrib.mergeEditorResolver | 1730719399277 | 0 | 11 |
| code/willCreateWorkbenchContribution/1/workbench.contrib.multiDiffEditorResolver | 1730719399277 | 0 | 11 |
| code/didCreateWorkbenchContribution/1/workbench.contrib.multiDiffEditorResolver | 1730719399277 | 0 | 11 |
| code/willCreateWorkbenchContribution/1/workbench.contrib.scmMultiDiffSourceResolver | 1730719399277 | 0 | 11 |
| code/didCreateWorkbenchContribution/1/workbench.contrib.scmMultiDiffSourceResolver | 1730719399277 | 0 | 11 |
| code/willCreateWorkbenchContribution/1/workbench.contrib.webviewPanel | 1730719399277 | 0 | 11 |
| code/didCreateWorkbenchContribution/1/workbench.contrib.webviewPanel | 1730719399277 | 0 | 11 |
| code/willCreateWorkbenchContribution/1/workbench.contrib.complexCustomWorkingCopyEditorHandler | 1730719399277 | 0 | 11 |
| code/didCreateWorkbenchContribution/1/workbench.contrib.complexCustomWorkingCopyEditorHandler | 1730719399277 | 0 | 11 |
| code/willCreateWorkbenchContribution/1/terminalMain | 1730719399277 | 0 | 11 |
| code/didCreateWorkbenchContribution/1/terminalMain | 1730719399277 | 0 | 11 |
| code/willCreateWorkbenchContribution/1/workbench.contrib.remoteLabel | 1730719399277 | 0 | 11 |
| code/didCreateWorkbenchContribution/1/workbench.contrib.remoteLabel | 1730719399277 | 0 | 11 |
| code/willCreateWorkbenchContribution/1/workbench.contrib.remoteInvalidWorkspaceDetector | 1730719399277 | 0 | 11 |
| code/didCreateWorkbenchContribution/1/workbench.contrib.remoteInvalidWorkspaceDetector | 1730719399277 | 0 | 11 |
| code/willCreateWorkbenchContribution/1/workbench.contrib.remoteStatusIndicator | 1730719399277 | 0 | 11 |
| code/didCreateWorkbenchContribution/1/workbench.contrib.remoteStatusIndicator | 1730719399278 | 1 | 12 |
| code/willCreateWorkbenchContribution/1/workbench.contrib.listContext | 1730719399278 | 0 | 12 |
| code/didCreateWorkbenchContribution/1/workbench.contrib.listContext | 1730719399278 | 0 | 12 |
| code/willCreateWorkbenchContribution/1/workbench.contrib.dialogHandler | 1730719399278 | 0 | 12 |
| code/didCreateWorkbenchContribution/1/workbench.contrib.dialogHandler | 1730719399278 | 0 | 12 |
| code/willCreateWorkbenchContribution/1/workbench.contrib.nativeWorkingCopyBackupTracker | 1730719399278 | 0 | 12 |
| code/didCreateWorkbenchContribution/1/workbench.contrib.nativeWorkingCopyBackupTracker | 1730719399279 | 1 | 13 |
| code/willCreateWorkbenchContribution/1/workbench.contrib.localTerminalBackend | 1730719399279 | 0 | 13 |
| code/didCreateWorkbenchContribution/1/workbench.contrib.localTerminalBackend | 1730719399279 | 0 | 13 |
| code/willCreateWorkbenchContribution/1/workbench.contrib.userDataSyncServices | 1730719399279 | 0 | 13 |
| code/didCreateWorkbenchContribution/1/workbench.contrib.userDataSyncServices | 1730719399279 | 0 | 13 |
| code/willCreateWorkbenchContribution/1/workbench.contrib.partsSplash | 1730719399279 | 0 | 13 |
| code/didCreateWorkbenchContribution/1/workbench.contrib.partsSplash | 1730719399279 | 0 | 13 |
| code/willCreateWorkbenchContribution/1/workbench.contrib.chatMovedViewWelcomeView | 1730719399279 | 0 | 13 |
| code/didCreateWorkbenchContribution/1/workbench.contrib.chatMovedViewWelcomeView | 1730719399279 | 0 | 13 |
| code/willCreateWorkbenchContribution/2/workbench.contrib.extensionUrlBootstrapHandler | 1730719399279 | 0 | 13 |
| code/didCreateWorkbenchContribution/2/workbench.contrib.extensionUrlBootstrapHandler | 1730719399279 | 0 | 13 |
| code/willCreateWorkbenchContribution/2/workbench.contrib.textInputActionsProvider | 1730719399279 | 0 | 13 |
| code/didCreateWorkbenchContribution/2/workbench.contrib.textInputActionsProvider | 1730719399280 | 1 | 14 |
| code/willCreateWorkbenchContribution/2/workbench.contrib.editorAutoSave | 1730719399280 | 0 | 14 |
| code/didCreateWorkbenchContribution/2/workbench.contrib.editorAutoSave | 1730719399280 | 0 | 14 |
| code/willCreateWorkbenchContribution/2/workbench.contrib.editorStatus | 1730719399280 | 0 | 14 |
| code/didCreateWorkbenchContribution/2/workbench.contrib.editorStatus | 1730719399280 | 0 | 14 |
| code/willCreateWorkbenchContribution/2/workbench.contrib.untitledTextEditorWorkingCopyEditorHandler | 1730719399280 | 0 | 14 |
| code/didCreateWorkbenchContribution/2/workbench.contrib.untitledTextEditorWorkingCopyEditorHandler | 1730719399280 | 0 | 14 |
| code/willCreateWorkbenchContribution/2/workbench.contrib.dynamicEditorConfigurations | 1730719399280 | 0 | 14 |
| code/didCreateWorkbenchContribution/2/workbench.contrib.dynamicEditorConfigurations | 1730719399281 | 1 | 15 |
| code/willCreateWorkbenchContribution/2/workbench.contrib.textMateTokenizationInstantiator | 1730719399281 | 0 | 15 |
| code/didCreateWorkbenchContribution/2/workbench.contrib.textMateTokenizationInstantiator | 1730719399281 | 0 | 15 |
| code/willCreateWorkbenchContribution/2/workbench.contrib.treeSitterTokenizationInstantiator | 1730719399281 | 0 | 15 |
| code/didCreateWorkbenchContribution/2/workbench.contrib.treeSitterTokenizationInstantiator | 1730719399282 | 1 | 16 |
| code/willCreateWorkbenchContribution/2/workbench.contrib.notebookChatContribution | 1730719399282 | 0 | 16 |
| code/didCreateWorkbenchContribution/2/workbench.contrib.notebookChatContribution | 1730719399282 | 0 | 16 |
| code/willCreateWorkbenchContribution/2/workbench.contrib.notebookClipboard | 1730719399282 | 0 | 16 |
| code/didCreateWorkbenchContribution/2/workbench.contrib.notebookClipboard | 1730719399282 | 0 | 16 |
| code/willCreateWorkbenchContribution/2/workbench.contrib.notebook.multiCursorUndoRedo | 1730719399282 | 0 | 16 |
| code/didCreateWorkbenchContribution/2/workbench.contrib.notebook.multiCursorUndoRedo | 1730719399282 | 0 | 16 |
| code/willCreateWorkbenchContribution/2/workbench.contrib.markerListProvider | 1730719399282 | 0 | 16 |
| code/didCreateWorkbenchContribution/2/workbench.contrib.markerListProvider | 1730719399282 | 0 | 16 |
| code/willCreateWorkbenchContribution/2/workbench.contrib.notebookUndoRedo | 1730719399282 | 0 | 16 |
| code/didCreateWorkbenchContribution/2/workbench.contrib.notebookUndoRedo | 1730719399283 | 1 | 17 |
| code/willCreateWorkbenchContribution/2/workbench.contrib.notebookEditorManager | 1730719399283 | 0 | 17 |
| code/didCreateWorkbenchContribution/2/workbench.contrib.notebookEditorManager | 1730719399283 | 0 | 17 |
| code/willCreateWorkbenchContribution/2/workbench.contrib.notebookLanguageSelectorScoreRefine | 1730719399283 | 0 | 17 |
| code/didCreateWorkbenchContribution/2/workbench.contrib.notebookLanguageSelectorScoreRefine | 1730719399283 | 0 | 17 |
| code/willCreateWorkbenchContribution/2/workbench.contrib.simpleNotebookWorkingCopyEditorHandler | 1730719399283 | 0 | 17 |
| code/didCreateWorkbenchContribution/2/workbench.contrib.simpleNotebookWorkingCopyEditorHandler | 1730719399283 | 0 | 17 |
| code/willCreateWorkbenchContribution/2/workbench.contrib.toolsExtensionPointHandler | 1730719399283 | 0 | 17 |
| code/didCreateWorkbenchContribution/2/workbench.contrib.toolsExtensionPointHandler | 1730719399283 | 0 | 17 |
| code/willCreateWorkbenchContribution/2/workbench.contrib.interactiveDocument | 1730719399283 | 0 | 17 |
| code/didCreateWorkbenchContribution/2/workbench.contrib.interactiveDocument | 1730719399284 | 1 | 18 |
| code/willCreateWorkbenchContribution/2/workbench.contrib.replWorkingCopyEditorHandler | 1730719399284 | 0 | 18 |
| code/didCreateWorkbenchContribution/2/workbench.contrib.replWorkingCopyEditorHandler | 1730719399284 | 0 | 18 |
| code/willCreateWorkbenchContribution/2/workbench.contrib.replDocument | 1730719399284 | 0 | 18 |
| code/didCreateWorkbenchContribution/2/workbench.contrib.replDocument | 1730719399284 | 0 | 18 |
| code/willCreateWorkbenchContribution/2/workbench.contrib.fileEditorWorkingCopyEditorHandler | 1730719399284 | 0 | 18 |
| code/didCreateWorkbenchContribution/2/workbench.contrib.fileEditorWorkingCopyEditorHandler | 1730719399284 | 0 | 18 |
| code/willCreateWorkbenchContribution/2/workbench.contrib.bulkEditPreview | 1730719399284 | 0 | 18 |
| code/didCreateWorkbenchContribution/2/workbench.contrib.bulkEditPreview | 1730719399284 | 0 | 18 |
| code/willCreateWorkbenchContribution/2/workbench.contrib.searchEditorWorkingCopyEditorHandler | 1730719399284 | 0 | 18 |
| code/didCreateWorkbenchContribution/2/workbench.contrib.searchEditorWorkingCopyEditorHandler | 1730719399284 | 0 | 18 |
| code/willCreateWorkbenchContribution/2/comments.input.contentProvider | 1730719399284 | 0 | 18 |
| code/didCreateWorkbenchContribution/2/comments.input.contentProvider | 1730719399284 | 0 | 18 |
| code/willCreateWorkbenchContribution/2/workbench.contrib.trustedDomainsFileSystemProvider | 1730719399284 | 0 | 18 |
| code/didCreateWorkbenchContribution/2/workbench.contrib.trustedDomainsFileSystemProvider | 1730719399284 | 0 | 18 |
| code/willCreateWorkbenchContribution/2/workbench.contrib.externalUriResolver | 1730719399284 | 0 | 18 |
| code/didCreateWorkbenchContribution/2/workbench.contrib.externalUriResolver | 1730719399284 | 0 | 18 |
| code/willCreateWorkbenchContribution/2/workbench.contrib.showPortCandidate | 1730719399284 | 0 | 18 |
| code/didCreateWorkbenchContribution/2/workbench.contrib.showPortCandidate | 1730719399284 | 0 | 18 |
| code/willCreateWorkbenchContribution/2/workbench.contrib.tunnelFactory | 1730719399284 | 0 | 18 |
| code/didCreateWorkbenchContribution/2/workbench.contrib.tunnelFactory | 1730719399284 | 0 | 18 |
| code/willCreateWorkbenchContribution/2/workbench.contrib.editorFeaturesInstantiator | 1730719399284 | 0 | 18 |
| code/didCreateWorkbenchContribution/2/workbench.contrib.editorFeaturesInstantiator | 1730719399284 | 0 | 18 |
| code/willCreateWorkbenchContribution/2/workbench.contrib.startupPageEditorResolver | 1730719399284 | 0 | 18 |
| code/didCreateWorkbenchContribution/2/workbench.contrib.startupPageEditorResolver | 1730719399284 | 0 | 18 |
| code/willCreateWorkbenchContribution/2/workbench.contrib.userDataProfiles | 1730719399284 | 0 | 18 |
| code/didCreateWorkbenchContribution/2/workbench.contrib.userDataProfiles | 1730719399285 | 1 | 19 |
| code/willCreateWorkbenchContribution/2/workbench.contrib.localHistoryTimeline | 1730719399285 | 0 | 19 |
| code/didCreateWorkbenchContribution/2/workbench.contrib.localHistoryTimeline | 1730719399286 | 1 | 20 |
| code/willCreateWorkbenchContribution/2/workbench.contrib.workspaceTrustRequestHandler | 1730719399286 | 0 | 20 |
| code/didCreateWorkbenchContribution/2/workbench.contrib.workspaceTrustRequestHandler | 1730719399286 | 0 | 20 |
| code/willCreateWorkbenchContribution/2/workbench.contrib.accessibilityStatus | 1730719399286 | 0 | 20 |
| code/didCreateWorkbenchContribution/2/workbench.contrib.accessibilityStatus | 1730719399286 | 0 | 20 |
| code/willCreateWorkbenchContribution/2/extensionAccessibilityHelpDialogContribution | 1730719399286 | 0 | 20 |
| code/didCreateWorkbenchContribution/2/extensionAccessibilityHelpDialogContribution | 1730719399286 | 0 | 20 |
| code/willCreateWorkbenchContribution/2/workbench.contrib.updateExperimentalSettingsDefaults | 1730719399286 | 0 | 20 |
| code/didCreateWorkbenchContribution/2/workbench.contrib.updateExperimentalSettingsDefaults | 1730719399290 | 4 | 24 |
| code/willCreateWorkbenchContribution/2/workbench.contrib.nativeRemoteConnectionFailureNotification | 1730719399290 | 0 | 24 |
| code/didCreateWorkbenchContribution/2/workbench.contrib.nativeRemoteConnectionFailureNotification | 1730719399290 | 0 | 24 |
| code/willCreateWorkbenchContribution/2/workbench.contrib.linuxAccessibility | 1730719399290 | 0 | 24 |
| code/didCreateWorkbenchContribution/2/workbench.contrib.linuxAccessibility | 1730719399290 | 0 | 24 |
| code/willCreateWorkbenchContribution/2/workbench.contrib.linuxSelectionClipboardPastePreventer | 1730719399290 | 0 | 24 |
| code/didCreateWorkbenchContribution/2/workbench.contrib.linuxSelectionClipboardPastePreventer | 1730719399290 | 0 | 24 |
| code/willCreateWorkbenchContribution/2/workbench.contrib.remoteTelemetryEnablementUpdater | 1730719399290 | 0 | 24 |
| code/didCreateWorkbenchContribution/2/workbench.contrib.remoteTelemetryEnablementUpdater | 1730719399290 | 0 | 24 |
| code/willCreateWorkbenchContribution/2/workbench.contrib.remoteEmptyWorkbenchPresentation | 1730719399290 | 0 | 24 |
| code/didCreateWorkbenchContribution/2/workbench.contrib.remoteEmptyWorkbenchPresentation | 1730719399290 | 0 | 24 |
| code/willCreateWorkbenchContribution/2/workbench.chat.installEntitlement | 1730719399290 | 0 | 24 |
| code/didCreateWorkbenchContribution/2/workbench.chat.installEntitlement | 1730719399290 | 0 | 24 |
## Raw Perf Marks: main
```
Name Timestamp Delta Total
code/timeOrigin 1730719370711.675 0 0
code/didStartMain 1730719370938 226.324951171875 226.324951171875
code/willLoadMainBundle 1730719370711 -227 -0.675048828125
code/didLoadMainBundle 1730719370938 227 226.324951171875
code/willStartCrashReporter 1730719370940 2 228.324951171875
code/didStartCrashReporter 1730719370943 3 231.324951171875
code/mainAppReady 1730719371058 115 346.324951171875
code/willGenerateNls 1730719371058 0 346.324951171875
code/didGenerateNls 1730719371058 0 346.324951171875
code/willLoadNls 1730719371058 0 346.324951171875
code/didLoadNls 1730719371065 7 353.324951171875
code/registerFilesystem/file 1730719371090 25 378.324951171875
code/registerFilesystem/vscode-userdata 1730719371091 1 379.324951171875
code/didRunMainBundle 1730719371097 6 385.324951171875
code/willStartMainServer 1730719371110 13 398.324951171875
code/didStartMainServer 1730719371111 1 399.324951171875
code/willCreateCodeWindow 1730719371136 25 424.324951171875
code/willRestoreCodeWindowState 1730719371137 1 425.324951171875
code/didRestoreCodeWindowState 1730719371137 0 425.324951171875
code/willCreateCodeBrowserWindow 1730719371138 1 426.324951171875
code/didCreateCodeBrowserWindow 1730719371174 36 462.324951171875
code/didCreateCodeWindow 1730719371175 1 463.324951171875
code/willOpenNewWindow 1730719371177 2 465.324951171875
```
## Raw Perf Marks: localPtyHost
```
Name Timestamp Delta Total
```
## Raw Perf Marks: renderer
```
Name Timestamp Delta Total
code/timeOrigin 1730719371177 0 0
code/didStartRenderer 1730719398712 27535 27535
code/willWaitForWindowConfig 1730719398712 0 27535
code/didWaitForWindowConfig 1730719398715 3 27538
code/willShowPartsSplash 1730719398715 0 27538
code/didShowPartsSplash 1730719398716 1 27539
code/willLoadWorkbenchMain 1730719398716 0 27539
code/didLoadWorkbenchMain 1730719399190 474 28013
code/registerFilesystem/file 1730719399193 3 28016
code/registerFilesystem/vscode-userdata 1730719399194 1 28017
code/willInitWorkspaceService 1730719399195 1 28018
code/willInitStorage 1730719399196 1 28019
code/didInitStorage 1730719399212 16 28035
code/willInitUserConfiguration 1730719399216 4 28039
code/didInitUserConfiguration 1730719399219 3 28042
code/willInitWorkspaceConfiguration 1730719399219 0 28042
code/didInitWorkspaceConfiguration 1730719399224 5 28047
code/didInitWorkspaceService 1730719399225 1 28048
code/willStartWorkbench 1730719399228 3 28051
code/LifecyclePhase/Ready 1730719399230 2 28053
code/willCreateWorkbenchContributions/1 1730719399266 36 28089
code/willCreateWorkbenchContribution/1/workbench.contrib.navigableContainerManager 1730719399266 0 28089
code/didCreateWorkbenchContribution/1/workbench.contrib.navigableContainerManager 1730719399267 1 28090
code/willCreateWorkbenchContribution/1/workbench.contrib.extensionPoints 1730719399267 0 28090
code/didCreateWorkbenchContribution/1/workbench.contrib.extensionPoints 1730719399267 0 28090
code/willCreateWorkbenchContribution/1/workbench.contrib.viewsExtensionHandler 1730719399267 0 28090
code/didCreateWorkbenchContribution/1/workbench.contrib.viewsExtensionHandler 1730719399267 0 28090
code/willCreateWorkbenchContribution/1/workbench.contrib.preferencesActions 1730719399267 0 28090
code/didCreateWorkbenchContribution/1/workbench.contrib.preferencesActions 1730719399270 3 28093
code/willCreateWorkbenchContribution/1/workbench.contrib.preferences 1730719399270 0 28093
code/registerFilesystem/vscode 1730719399271 1 28094
code/didCreateWorkbenchContribution/1/workbench.contrib.preferences 1730719399271 0 28094
code/willCreateWorkbenchContribution/1/workbench.contrib.notebook 1730719399271 0 28094
code/didCreateWorkbenchContribution/1/workbench.contrib.notebook 1730719399271 0 28094
code/willCreateWorkbenchContribution/1/workbench.contrib.cellContentProvider 1730719399271 0 28094
code/didCreateWorkbenchContribution/1/workbench.contrib.cellContentProvider 1730719399272 1 28095
code/willCreateWorkbenchContribution/1/workbench.contrib.cellInfoContentProvider 1730719399272 0 28095
code/didCreateWorkbenchContribution/1/workbench.contrib.cellInfoContentProvider 1730719399272 0 28095
code/willCreateWorkbenchContribution/1/workbench.contrib.notebookMetadataContentProvider 1730719399272 0 28095
code/didCreateWorkbenchContribution/1/workbench.contrib.notebookMetadataContentProvider 1730719399272 0 28095
code/willCreateWorkbenchContribution/1/workbench.contrib.registerCellSchemas 1730719399272 0 28095
code/didCreateWorkbenchContribution/1/workbench.contrib.registerCellSchemas 1730719399273 1 28096
code/willCreateWorkbenchContribution/1/workbench.contrib.chatResolver 1730719399273 0 28096
code/didCreateWorkbenchContribution/1/workbench.contrib.chatResolver 1730719399273 0 28096
code/willCreateWorkbenchContribution/1/workbench.contrib.chatExtensionPointHandler 1730719399273 0 28096
code/didCreateWorkbenchContribution/1/workbench.contrib.chatExtensionPointHandler 1730719399274 1 28097
code/willCreateWorkbenchContribution/1/workbench.contrib.chatViewsWelcomeHandler 1730719399274 0 28097
code/didCreateWorkbenchContribution/1/workbench.contrib.chatViewsWelcomeHandler 1730719399274 0 28097
code/willCreateWorkbenchContribution/1/workbench.contrib.explorerViewletViews 1730719399274 0 28097
code/didCreateWorkbenchContribution/1/workbench.contrib.explorerViewletViews 1730719399276 2 28099
code/willCreateWorkbenchContribution/1/workbench.contrib.textFileEditorTracker 1730719399276 0 28099
code/didCreateWorkbenchContribution/1/workbench.contrib.textFileEditorTracker 1730719399276 0 28099
code/willCreateWorkbenchContribution/1/workbench.contrib.textFileSaveErrorHandler 1730719399276 0 28099
code/didCreateWorkbenchContribution/1/workbench.contrib.textFileSaveErrorHandler 1730719399276 0 28099
code/willCreateWorkbenchContribution/1/workbench.contrib.fileUriLabel 1730719399276 0 28099
code/didCreateWorkbenchContribution/1/workbench.contrib.fileUriLabel 1730719399276 0 28099
code/willCreateWorkbenchContribution/1/workbench.contrib.dirtyFilesIndicator 1730719399276 0 28099
code/didCreateWorkbenchContribution/1/workbench.contrib.dirtyFilesIndicator 1730719399277 1 28100
code/willCreateWorkbenchContribution/1/workbench.contrib.replacePreviewContentProvider 1730719399277 0 28100
code/didCreateWorkbenchContribution/1/workbench.contrib.replacePreviewContentProvider 1730719399277 0 28100
code/willCreateWorkbenchContribution/1/workbench.contrib.searchEditor 1730719399277 0 28100
code/didCreateWorkbenchContribution/1/workbench.contrib.searchEditor 1730719399277 0 28100
code/willCreateWorkbenchContribution/1/workbench.contrib.mergeEditorResolver 1730719399277 0 28100
code/didCreateWorkbenchContribution/1/workbench.contrib.mergeEditorResolver 1730719399277 0 28100
code/willCreateWorkbenchContribution/1/workbench.contrib.multiDiffEditorResolver 1730719399277 0 28100
code/didCreateWorkbenchContribution/1/workbench.contrib.multiDiffEditorResolver 1730719399277 0 28100
code/willCreateWorkbenchContribution/1/workbench.contrib.scmMultiDiffSourceResolver 1730719399277 0 28100
code/didCreateWorkbenchContribution/1/workbench.contrib.scmMultiDiffSourceResolver 1730719399277 0 28100
code/willCreateWorkbenchContribution/1/workbench.contrib.webviewPanel 1730719399277 0 28100
code/didCreateWorkbenchContribution/1/workbench.contrib.webviewPanel 1730719399277 0 28100
code/willCreateWorkbenchContribution/1/workbench.contrib.complexCustomWorkingCopyEditorHandler 1730719399277 0 28100
code/didCreateWorkbenchContribution/1/workbench.contrib.complexCustomWorkingCopyEditorHandler 1730719399277 0 28100
code/willCreateWorkbenchContribution/1/terminalMain 1730719399277 0 28100
code/didCreateWorkbenchContribution/1/terminalMain 1730719399277 0 28100
code/willCreateWorkbenchContribution/1/workbench.contrib.remoteLabel 1730719399277 0 28100
code/didCreateWorkbenchContribution/1/workbench.contrib.remoteLabel 1730719399277 0 28100
code/willCreateWorkbenchContribution/1/workbench.contrib.remoteInvalidWorkspaceDetector 1730719399277 0 28100
code/didCreateWorkbenchContribution/1/workbench.contrib.remoteInvalidWorkspaceDetector 1730719399277 0 28100
code/willCreateWorkbenchContribution/1/workbench.contrib.remoteStatusIndicator 1730719399277 0 28100
code/didCreateWorkbenchContribution/1/workbench.contrib.remoteStatusIndicator 1730719399278 1 28101
code/willCreateWorkbenchContribution/1/workbench.contrib.listContext 1730719399278 0 28101
code/didCreateWorkbenchContribution/1/workbench.contrib.listContext 1730719399278 0 28101
code/willCreateWorkbenchContribution/1/workbench.contrib.dialogHandler 1730719399278 0 28101
code/didCreateWorkbenchContribution/1/workbench.contrib.dialogHandler 1730719399278 0 28101
code/willCreateWorkbenchContribution/1/workbench.contrib.nativeWorkingCopyBackupTracker 1730719399278 0 28101
code/didCreateWorkbenchContribution/1/workbench.contrib.nativeWorkingCopyBackupTracker 1730719399279 1 28102
code/willCreateWorkbenchContribution/1/workbench.contrib.localTerminalBackend 1730719399279 0 28102
code/didCreateWorkbenchContribution/1/workbench.contrib.localTerminalBackend 1730719399279 0 28102
code/willCreateWorkbenchContribution/1/workbench.contrib.userDataSyncServices 1730719399279 0 28102
code/didCreateWorkbenchContribution/1/workbench.contrib.userDataSyncServices 1730719399279 0 28102
code/willCreateWorkbenchContribution/1/workbench.contrib.partsSplash 1730719399279 0 28102
code/didCreateWorkbenchContribution/1/workbench.contrib.partsSplash 1730719399279 0 28102
code/willCreateWorkbenchContribution/1/workbench.contrib.chatMovedViewWelcomeView 1730719399279 0 28102
code/didCreateWorkbenchContribution/1/workbench.contrib.chatMovedViewWelcomeView 1730719399279 0 28102
code/didCreateWorkbenchContributions/1 1730719399279 0 28102
code/willCreateWorkbenchContributions/2 1730719399279 0 28102
code/willCreateWorkbenchContribution/2/workbench.contrib.extensionUrlBootstrapHandler 1730719399279 0 28102
code/didCreateWorkbenchContribution/2/workbench.contrib.extensionUrlBootstrapHandler 1730719399279 0 28102
code/willCreateWorkbenchContribution/2/workbench.contrib.textInputActionsProvider 1730719399279 0 28102
code/didCreateWorkbenchContribution/2/workbench.contrib.textInputActionsProvider 1730719399280 1 28103
code/willCreateWorkbenchContribution/2/workbench.contrib.editorAutoSave 1730719399280 0 28103
code/didCreateWorkbenchContribution/2/workbench.contrib.editorAutoSave 1730719399280 0 28103
code/willCreateWorkbenchContribution/2/workbench.contrib.editorStatus 1730719399280 0 28103
code/didCreateWorkbenchContribution/2/workbench.contrib.editorStatus 1730719399280 0 28103
code/willCreateWorkbenchContribution/2/workbench.contrib.untitledTextEditorWorkingCopyEditorHandler 1730719399280 0 28103
code/didCreateWorkbenchContribution/2/workbench.contrib.untitledTextEditorWorkingCopyEditorHandler 1730719399280 0 28103
code/willCreateWorkbenchContribution/2/workbench.contrib.dynamicEditorConfigurations 1730719399280 0 28103
code/didCreateWorkbenchContribution/2/workbench.contrib.dynamicEditorConfigurations 1730719399281 1 28104
code/willCreateWorkbenchContribution/2/workbench.contrib.textMateTokenizationInstantiator 1730719399281 0 28104
code/didCreateWorkbenchContribution/2/workbench.contrib.textMateTokenizationInstantiator 1730719399281 0 28104
code/willCreateWorkbenchContribution/2/workbench.contrib.treeSitterTokenizationInstantiator 1730719399281 0 28104
code/didCreateWorkbenchContribution/2/workbench.contrib.treeSitterTokenizationInstantiator 1730719399282 1 28105
code/willCreateWorkbenchContribution/2/workbench.contrib.notebookChatContribution 1730719399282 0 28105
code/didCreateWorkbenchContribution/2/workbench.contrib.notebookChatContribution 1730719399282 0 28105
code/willCreateWorkbenchContribution/2/workbench.contrib.notebookClipboard 1730719399282 0 28105
code/didCreateWorkbenchContribution/2/workbench.contrib.notebookClipboard 1730719399282 0 28105
code/willCreateWorkbenchContribution/2/workbench.contrib.notebook.multiCursorUndoRedo 1730719399282 0 28105
code/didCreateWorkbenchContribution/2/workbench.contrib.notebook.multiCursorUndoRedo 1730719399282 0 28105
code/willCreateWorkbenchContribution/2/workbench.contrib.markerListProvider 1730719399282 0 28105
code/didCreateWorkbenchContribution/2/workbench.contrib.markerListProvider 1730719399282 0 28105
code/willCreateWorkbenchContribution/2/workbench.contrib.notebookUndoRedo 1730719399282 0 28105
code/didCreateWorkbenchContribution/2/workbench.contrib.notebookUndoRedo 1730719399283 1 28106
code/willCreateWorkbenchContribution/2/workbench.contrib.notebookEditorManager 1730719399283 0 28106
code/didCreateWorkbenchContribution/2/workbench.contrib.notebookEditorManager 1730719399283 0 28106
code/willCreateWorkbenchContribution/2/workbench.contrib.notebookLanguageSelectorScoreRefine 1730719399283 0 28106
code/didCreateWorkbenchContribution/2/workbench.contrib.notebookLanguageSelectorScoreRefine 1730719399283 0 28106
code/willCreateWorkbenchContribution/2/workbench.contrib.simpleNotebookWorkingCopyEditorHandler 1730719399283 0 28106
code/didCreateWorkbenchContribution/2/workbench.contrib.simpleNotebookWorkingCopyEditorHandler 1730719399283 0 28106
code/willCreateWorkbenchContribution/2/workbench.contrib.toolsExtensionPointHandler 1730719399283 0 28106
code/didCreateWorkbenchContribution/2/workbench.contrib.toolsExtensionPointHandler 1730719399283 0 28106
code/willCreateWorkbenchContribution/2/workbench.contrib.interactiveDocument 1730719399283 0 28106
code/didCreateWorkbenchContribution/2/workbench.contrib.interactiveDocument 1730719399284 1 28107
code/willCreateWorkbenchContribution/2/workbench.contrib.replWorkingCopyEditorHandler 1730719399284 0 28107
code/didCreateWorkbenchContribution/2/workbench.contrib.replWorkingCopyEditorHandler 1730719399284 0 28107
code/willCreateWorkbenchContribution/2/workbench.contrib.replDocument 1730719399284 0 28107
code/didCreateWorkbenchContribution/2/workbench.contrib.replDocument 1730719399284 0 28107
code/willCreateWorkbenchContribution/2/workbench.contrib.fileEditorWorkingCopyEditorHandler 1730719399284 0 28107
code/didCreateWorkbenchContribution/2/workbench.contrib.fileEditorWorkingCopyEditorHandler 1730719399284 0 28107
code/willCreateWorkbenchContribution/2/workbench.contrib.bulkEditPreview 1730719399284 0 28107
code/didCreateWorkbenchContribution/2/workbench.contrib.bulkEditPreview 1730719399284 0 28107
code/willCreateWorkbenchContribution/2/workbench.contrib.searchEditorWorkingCopyEditorHandler 1730719399284 0 28107
code/didCreateWorkbenchContribution/2/workbench.contrib.searchEditorWorkingCopyEditorHandler 1730719399284 0 28107
code/willCreateWorkbenchContribution/2/comments.input.contentProvider 1730719399284 0 28107
code/didCreateWorkbenchContribution/2/comments.input.contentProvider 1730719399284 0 28107
code/willCreateWorkbenchContribution/2/workbench.contrib.trustedDomainsFileSystemProvider 1730719399284 0 28107
code/registerFilesystem/trustedDomains 1730719399284 0 28107
code/didCreateWorkbenchContribution/2/workbench.contrib.trustedDomainsFileSystemProvider 1730719399284 0 28107
code/willCreateWorkbenchContribution/2/workbench.contrib.externalUriResolver 1730719399284 0 28107
code/didCreateWorkbenchContribution/2/workbench.contrib.externalUriResolver 1730719399284 0 28107
code/willCreateWorkbenchContribution/2/workbench.contrib.showPortCandidate 1730719399284 0 28107
code/didCreateWorkbenchContribution/2/workbench.contrib.showPortCandidate 1730719399284 0 28107
code/willCreateWorkbenchContribution/2/workbench.contrib.tunnelFactory 1730719399284 0 28107
code/didCreateWorkbenchContribution/2/workbench.contrib.tunnelFactory 1730719399284 0 28107
code/willCreateWorkbenchContribution/2/workbench.contrib.editorFeaturesInstantiator 1730719399284 0 28107
code/didCreateWorkbenchContribution/2/workbench.contrib.editorFeaturesInstantiator 1730719399284 0 28107
code/willCreateWorkbenchContribution/2/workbench.contrib.startupPageEditorResolver 1730719399284 0 28107
code/didCreateWorkbenchContribution/2/workbench.contrib.startupPageEditorResolver 1730719399284 0 28107
code/willCreateWorkbenchContribution/2/workbench.contrib.userDataProfiles 1730719399284 0 28107
code/didCreateWorkbenchContribution/2/workbench.contrib.userDataProfiles 1730719399285 1 28108
code/willCreateWorkbenchContribution/2/workbench.contrib.localHistoryTimeline 1730719399285 0 28108
code/registerFilesystem/vscode-local-history 1730719399286 1 28109
code/didCreateWorkbenchContribution/2/workbench.contrib.localHistoryTimeline 1730719399286 0 28109
code/willCreateWorkbenchContribution/2/workbench.contrib.workspaceTrustRequestHandler 1730719399286 0 28109
code/didCreateWorkbenchContribution/2/workbench.contrib.workspaceTrustRequestHandler 1730719399286 0 28109
code/willCreateWorkbenchContribution/2/workbench.contrib.accessibilityStatus 1730719399286 0 28109
code/didCreateWorkbenchContribution/2/workbench.contrib.accessibilityStatus 1730719399286 0 28109
code/willCreateWorkbenchContribution/2/extensionAccessibilityHelpDialogContribution 1730719399286 0 28109
code/didCreateWorkbenchContribution/2/extensionAccessibilityHelpDialogContribution 1730719399286 0 28109
code/willCreateWorkbenchContribution/2/workbench.contrib.updateExperimentalSettingsDefaults 1730719399286 0 28109
code/didCreateWorkbenchContribution/2/workbench.contrib.updateExperimentalSettingsDefaults 1730719399290 4 28113
code/willCreateWorkbenchContribution/2/workbench.contrib.nativeRemoteConnectionFailureNotification 1730719399290 0 28113
code/didCreateWorkbenchContribution/2/workbench.contrib.nativeRemoteConnectionFailureNotification 1730719399290 0 28113
code/willCreateWorkbenchContribution/2/workbench.contrib.linuxAccessibility 1730719399290 0 28113
code/didCreateWorkbenchContribution/2/workbench.contrib.linuxAccessibility 1730719399290 0 28113
code/willCreateWorkbenchContribution/2/workbench.contrib.linuxSelectionClipboardPastePreventer 1730719399290 0 28113
code/didCreateWorkbenchContribution/2/workbench.contrib.linuxSelectionClipboardPastePreventer 1730719399290 0 28113
code/willCreateWorkbenchContribution/2/workbench.contrib.remoteTelemetryEnablementUpdater 1730719399290 0 28113
code/didCreateWorkbenchContribution/2/workbench.contrib.remoteTelemetryEnablementUpdater 1730719399290 0 28113
code/willCreateWorkbenchContribution/2/workbench.contrib.remoteEmptyWorkbenchPresentation 1730719399290 0 28113
code/didCreateWorkbenchContribution/2/workbench.contrib.remoteEmptyWorkbenchPresentation 1730719399290 0 28113
code/willCreateWorkbenchContribution/2/workbench.chat.installEntitlement 1730719399290 0 28113
code/didCreateWorkbenchContribution/2/workbench.chat.installEntitlement 1730719399290 0 28113
code/didCreateWorkbenchContributions/2 1730719399290 0 28113
code/willCreatePart/workbench.parts.titlebar 1730719399292 2 28115
code/didCreatePart/workbench.parts.titlebar 1730719399311 19 28134
code/willCreatePart/workbench.parts.banner 1730719399311 0 28134
code/didCreatePart/workbench.parts.banner 1730719399311 0 28134
code/willCreatePart/workbench.parts.activitybar 1730719399311 0 28134
code/didCreatePart/workbench.parts.activitybar 1730719399317 6 28140
code/willCreatePart/workbench.parts.sidebar 1730719399317 0 28140
code/didCreatePart/workbench.parts.sidebar 1730719399318 1 28141
code/willCreatePart/workbench.parts.editor 1730719399318 0 28141
code/willCreateTextFileEditorControl 1730719399331 13 28154
code/didCreateTextFileEditorControl 1730719399338 7 28161
code/willSetInputToTextFileEditor 1730719399338 0 28161
code/didCreatePart/workbench.parts.editor 1730719399353 15 28176
code/willCreatePart/workbench.parts.panel 1730719399353 0 28176
code/didCreatePart/workbench.parts.panel 1730719399356 3 28179
code/willCreatePart/workbench.parts.auxiliarybar 1730719399356 0 28179
code/didCreatePart/workbench.parts.auxiliarybar 1730719399357 1 28180
code/willCreatePart/workbench.parts.statusbar 1730719399357 0 28180
code/didCreatePart/workbench.parts.statusbar 1730719399358 1 28181
code/didRemovePartsSplash 1730719399369 11 28192
code/willRestoreEditors 1730719399369 0 28192
code/willRestoreViewlet 1730719399373 4 28196
code/willResolveTextFileEditorModel 1730719399384 11 28207
code/restoreEditors/editorGroupsReady 1730719399386 2 28209
code/didRestoreViewlet 1730719399386 0 28209
code/restoreEditors/editorsToOpenResolved 1730719399387 1 28210
code/willRegisterExplorerViews 1730719399387 0 28210
code/willResolveExplorer 1730719399397 10 28220
code/didRegisterExplorerViews 1730719399399 2 28222
code/willLoadExtensions 1730719399453 54 28276
code/restoreEditors/editorGroupsRestored 1730719399490 37 28313
code/didRestoreEditors 1730719399490 0 28313
code/didStartWorkbench 1730719399490 0 28313
code/LifecyclePhase/Restored 1730719399490 0 28313
code/willCreateWorkbenchContributions/3 1730719399490 0 28313
code/willConnectSharedProcess 1730719399491 1 28314
code/didResolveExplorer 1730719399570 79 28393
code/willHandleExtensionPoints 1730719399727 157 28550
code/willHandleExtensionPoint/jsonValidation 1730719399727 0 28550
code/didHandleExtensionPoint/jsonValidation 1730719399727 0 28550
code/willHandleExtensionPoint/colors 1730719399727 0 28550
code/didHandleExtensionPoint/colors 1730719399728 1 28551
code/willHandleExtensionPoint/semanticTokenScopes 1730719399728 0 28551
code/didHandleExtensionPoint/semanticTokenScopes 1730719399728 0 28551
code/willHandleExtensionPoint/languages 1730719399728 0 28551
code/didHandleExtensionPoint/languages 1730719399734 6 28557
code/willHandleExtensionPoint/snippets 1730719399734 0 28557
code/didHandleExtensionPoint/snippets 1730719399734 0 28557
code/willHandleExtensionPoint/taskDefinitions 1730719399734 0 28557
code/didHandleExtensionPoint/taskDefinitions 1730719399735 1 28558
code/willHandleExtensionPoint/commands 1730719399735 0 28558
code/didHandleExtensionPoint/commands 1730719399736 1 28559
code/willHandleExtensionPoint/submenus 1730719399736 0 28559
code/didHandleExtensionPoint/submenus 1730719399736 0 28559
code/willHandleExtensionPoint/menus 1730719399736 0 28559
code/didHandleExtensionPoint/menus 1730719399741 5 28564
code/willHandleExtensionPoint/configurationDefaults 1730719399741 0 28564
code/didHandleExtensionPoint/configurationDefaults 1730719399741 0 28564
code/willHandleExtensionPoint/configuration 1730719399741 0 28564
code/didHandleExtensionPoint/configuration 1730719399753 12 28576
code/willHandleExtensionPoint/viewsContainers 1730719399753 0 28576
code/didHandleExtensionPoint/viewsContainers 1730719399756 3 28579
code/willHandleExtensionPoint/views 1730719399756 0 28579
code/didHandleExtensionPoint/views 1730719399757 1 28580
code/willHandleExtensionPoint/keybindings 1730719399757 0 28580
code/didHandleExtensionPoint/keybindings 1730719399762 5 28585
code/willHandleExtensionPoint/themes 1730719399762 0 28585
code/didHandleExtensionPoint/themes 1730719399762 0 28585
code/willHandleExtensionPoint/iconThemes 1730719399762 0 28585
code/didHandleExtensionPoint/iconThemes 1730719399762 0 28585
code/willHandleExtensionPoint/grammars 1730719399762 0 28585
code/didHandleExtensionPoint/grammars 1730719399764 2 28587
code/willHandleExtensionPoint/notebooks 1730719399764 0 28587
code/didHandleExtensionPoint/notebooks 1730719399764 0 28587
code/willHandleExtensionPoint/notebookRenderer 1730719399764 0 28587
code/didHandleExtensionPoint/notebookRenderer 1730719399765 1 28588
code/willHandleExtensionPoint/debuggers 1730719399765 0 28588
code/didHandleExtensionPoint/debuggers 1730719399765 0 28588
code/willHandleExtensionPoint/breakpoints 1730719399765 0 28588
code/didHandleExtensionPoint/breakpoints 1730719399765 0 28588
code/willHandleExtensionPoint/customEditors 1730719399765 0 28588
code/didHandleExtensionPoint/customEditors 1730719399765 0 28588
code/willHandleExtensionPoint/terminal 1730719399765 0 28588
code/didHandleExtensionPoint/terminal 1730719399765 0 28588
code/willHandleExtensionPoint/terminalQuickFixes 1730719399765 0 28588
code/didHandleExtensionPoint/terminalQuickFixes 1730719399765 0 28588
code/willHandleExtensionPoint/problemPatterns 1730719399765 0 28588
code/didHandleExtensionPoint/problemPatterns 1730719399766 1 28589
code/willHandleExtensionPoint/problemMatchers 1730719399766 0 28589
code/didHandleExtensionPoint/problemMatchers 1730719399766 0 28589
code/willHandleExtensionPoint/viewsWelcome 1730719399766 0 28589
code/didHandleExtensionPoint/viewsWelcome 1730719399766 0 28589
code/willHandleExtensionPoint/authentication 1730719399766 0 28589
code/didHandleExtensionPoint/authentication 1730719399766 0 28589
code/willHandleExtensionPoint/continueEditSession 1730719399766 0 28589
code/didHandleExtensionPoint/continueEditSession 1730719399766 0 28589
code/willHandleExtensionPoint/codeActions 1730719399766 0 28589
code/didHandleExtensionPoint/codeActions 1730719399766 0 28589
code/didHandleExtensionPoints 1730719399766 0 28589
code/didLoadExtensions 1730719399769 3 28592
code/didConnectSharedProcess 1730719399809 40 28632
code/terminal/willGetTerminalBackend 1730719399863 54 28686
code/terminal/didGetTerminalBackend 1730719399863 0 28686
code/terminal/willReconnect 1730719399863 0 28686
code/terminal/willGetTerminalLayoutInfo 1730719399863 0 28686
code/terminal/didGetTerminalLayoutInfo 1730719399910 47 28733
code/terminal/didReconnect 1730719399911 1 28734
code/terminal/willReplay 1730719399911 0 28734
code/terminal/didReplay 1730719399911 0 28734
code/terminal/willGetPerformanceMarks 1730719399911 0 28734
code/terminal/didGetPerformanceMarks 1730719399916 5 28739
```
## Raw Perf Marks: localExtHost
```
Name Timestamp Delta Total
code/timeOrigin 1730719399526.485 0 0
code/fork/start 1730719399731 204.514892578125 204.514892578125
code/willLoadNls 1730719399731 0 204.514892578125
code/didLoadNls 1730719399738 7 211.514892578125
code/extHost/willConnectToRenderer 1730719399885 147 358.514892578125
code/extHost/didConnectToRenderer 1730719399886 1 359.514892578125
code/extHost/didWaitForInitData 1730719399918 32 391.514892578125
code/extHost/didCreateServices 1730719399920 2 393.514892578125
code/extHost/willWaitForConfig 1730719399931 11 404.514892578125
code/extHost/didWaitForConfig 1730719399944 13 417.514892578125
code/extHost/didInitAPI 1730719399965 21 438.514892578125
code/extHost/didInitProxyResolver 1730719399969 4 442.514892578125
code/extHost/ready 1730719399969 0 442.514892578125
code/extHost/willLoadExtensionCode/vscode.git-base 1730719399980 11 453.514892578125
code/extHost/didLoadExtensionCode/vscode.git-base 1730719399983 3 456.514892578125
code/extHost/willActivateExtension/vscode.git-base 1730719399984 1 457.514892578125
code/extHost/didActivateExtension/vscode.git-base 1730719399985 1 458.514892578125
code/extHost/willLoadExtensionCode/vscode.git 1730719399987 2 460.514892578125
code/extHost/didLoadExtensionCode/vscode.git 1730719400062 75 535.514892578125
code/extHost/willLoadExtensionCode/vscode.github 1730719400063 1 536.514892578125
code/extHost/didLoadExtensionCode/vscode.github 1730719400087 24 560.514892578125
code/extHost/willActivateExtension/vscode.git 1730719400088 1 561.514892578125
code/extHost/willActivateExtension/vscode.github 1730719400116 28 589.514892578125
code/extHost/didActivateExtension/vscode.github 1730719400132 16 605.514892578125
code/extHost/didActivateExtension/vscode.git 1730719400154 22 627.514892578125
```
## Resource Timing Stats
| Name | Duration |
| ------------------------------------------------------------------------ | ------------------ |
| https://default.exp-tas.com/vscode/ab | 237.59999999962747 |
| https://az764295.vo.msecnd.net/extensions/chat.json | 2.599999999627471 |
| blob:vscode-file://vscode-app/7b1600d9-25f2-40b4-ba05-46bbdde7c3c9 | -3 |
| https://marketplace.visualstudio.com/_apis/public/gallery/extensionquery | 355.5 |
| https://marketplace.visualstudio.com/_apis/public/gallery/extensionquery | 216.20000000018626 |
1. :warning: Make sure to **attach** these files from your *home*-directory: :warning:
- `prof-u2ocV3P7.main.cpuprofile.txt`
[prof-u2ocV3P7.main.cpuprofile.txt](https://github.com/user-attachments/files/17617381/prof-u2ocV3P7.main.cpuprofile.txt)
| bug,freeze-slow-crash-leak | low | Critical |
2,632,619,639 | PowerToys | Advanced Paste Feature Request | ### Description of the new feature / enhancement
**Hi,**
I suggest adding scripting capabilities to manipulate copied text and paste the processed result.
The user can add as many scripts as they want, each with its own title.
When the user triggers "Advanced Paste", they can choose which script to apply to the copied text.
A script can be Python or any scripting language, structured as one main function which accepts one argument; the user can add any sub-functions according to the processing logic.
### Scenario when this would be used?
**example:**
I often need to copy text that includes many hyphens and paste it without them, and this is a repeated task.
Usually I copy the text, paste it into Notepad, remove the hyphens, and then copy it again.
> input: DD-5555-22
> output: DD555522
I could also create a script to split the input at a certain delimiter and extract the portion I need.
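A minimal sketch of what such a user script could look like, assuming the scripting host passes the clipboard text to a single `main` function and pastes whatever it returns (the function names and contract here are hypothetical, not an existing PowerToys API):

```python
def remove_hyphens(text: str) -> str:
    # Sub-function with the actual processing logic.
    return text.replace("-", "")

def extract_field(text: str, delimiter: str = "-", index: int = 1) -> str:
    # Another possible script: split at a delimiter and keep one portion.
    return text.split(delimiter)[index]

def main(text: str) -> str:
    # Entry point: receives the copied text, returns the text to paste.
    return remove_hyphens(text)
```

With the example input above, `main("DD-5555-22")` returns `DD555522`.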
### Supporting information
_No response_
**Many thanks for the continued support and upgrades** | Needs-Triage | low | Minor |
2,632,626,500 | TypeScript | Does not find .mts file when doing extensionless import and Bundler | ### 🔎 Search Terms
mts bundler
### 🕗 Version & Regression Information
- This is the behavior in every version I tried, and I reviewed the FAQ for entries about Bundler
### ⏯ Playground Link
cannot test moduleResolution in the playground
### 💻 Code
tsconfig.json
```
{
  "compilerOptions": {
    "moduleResolution": "Bundler"
  }
}
```
file.mts
```ts
export const hi = 1
```
file2.mts
```ts
import {hi} from './file'
```
### 🙁 Actual behavior
fails to find `./file`
### 🙂 Expected behavior
should resolve just like a bundler, so it should find the `.mts` file
### Additional information about the issue
_No response_ | Suggestion,Awaiting More Feedback | low | Minor |
2,632,630,800 | opencv | drawContours LINE_4 vs LINE_8 | ### System Information
OpenCV v4.1.1
Raspbian Buster
Python v3.7.3
### Detailed description
Good day,
Drawing a contour using cv2.drawContours(image, [contours], 0, 255, lineType=cv2.LINE_4) produces a contour with 8-connectivity, and vice versa.
See the screenshots:
- using lineType=cv2.LINE_4

- using lineType=cv2.LINE_8

The behavior is swapped.
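To illustrate the difference independently of OpenCV: a 4-connected line only ever steps horizontally or vertically, while an 8-connected (Bresenham) line may also step diagonally and therefore uses fewer pixels. A quick sketch of both rasterizations (standard Bresenham-style logic, not OpenCV's actual implementation):

```python
def line8(x0, y0, x1, y1):
    """8-connected (Bresenham) line: diagonal steps allowed."""
    pts = [(x0, y0)]
    dx, dy = abs(x1 - x0), abs(y1 - y0)
    sx = 1 if x1 > x0 else -1
    sy = 1 if y1 > y0 else -1
    err = dx - dy
    x, y = x0, y0
    while (x, y) != (x1, y1):
        e2 = 2 * err
        if e2 > -dy:   # step in x...
            err -= dy
            x += sx
        if e2 < dx:    # ...and possibly also in y (diagonal step)
            err += dx
            y += sy
        pts.append((x, y))
    return pts

def line4(x0, y0, x1, y1):
    """4-connected line: each step moves in x or y, never both."""
    pts = [(x0, y0)]
    dx, dy = abs(x1 - x0), abs(y1 - y0)
    sx = 1 if x1 > x0 else -1
    sy = 1 if y1 > y0 else -1
    err = dx - dy
    x, y = x0, y0
    while (x, y) != (x1, y1):
        if 2 * err > -dy:  # step in x OR y, never diagonally
            err -= dy
            x += sx
        else:
            err += dx
            y += sy
        pts.append((x, y))
    return pts
```

A 4-connected line from (0,0) to (3,5) has dx+dy+1 = 9 pixels, while the 8-connected one has max(dx,dy)+1 = 6, which matches what the screenshots show, just with the lineType labels swapped.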
Regards
### Steps to reproduce
```python
# Get points of straw fragment
blob_points = cv2.findNonZero(blob_intensity_msk)
# Get blob convex hull
hull_points = cv2.convexHull(blob_points)
# Initialize convex hull mask
hull_mask = np.zeros_like(blob_intensity_msk)
# Draw convex hull
cv2.drawContours(hull_mask, [hull_points], 0, 255, lineType=cv2.LINE_4)
```
### Issue submission checklist
- [X] I report the issue, it's not a question
- [X] I checked the problem with documentation, FAQ, open issues, forum.opencv.org, Stack Overflow, etc and have not found any solution
- [X] I updated to the latest OpenCV version and the issue is still there
- [X] There is reproducer code and related data files (videos, images, onnx, etc) | bug,category: imgproc,incomplete | low | Minor |
2,632,656,258 | react | [DevTools Bug]: Settings / Components / Hide components where... - need to be set on each reload. | ### Website or app
https://react.dev/
### Repro steps
_When filtering components under "Hide components where..." you need to modify the filter each time you reload the page and/or hide/show the Chrome DevTools. Otherwise the filter doesn't take effect and the hidden components still show._
**For example:**
Name matches Anonymous
Works fine the first time you enter it - all Anonymous components are hidden.
Reload the page and the filter no longer applies. You need to open: Settings / Components / Hide components where... and modify the entry and it works again:
**For example:**
Name matches Anonymous*
Or toggle OFF - close the dialog - open the dialog and toggle ON - Now Anonymous is hidden again.
<img width="820" alt="Screenshot 2024-11-04 at 13 10 37" src="https://github.com/user-attachments/assets/7d7bf89f-d801-4605-9100-140036e67589">
Using 6.0.1-c7c68ef842 of the extension
Chrome Version 130.0.6723.92 (Official Build) (arm64)
### How often does this bug happen?
Every time
### DevTools package (automated)
_No response_
### DevTools version (automated)
_No response_
### Error message (automated)
_No response_
### Error call stack (automated)
_No response_
### Error component stack (automated)
_No response_
### GitHub query string (automated)
_No response_ | Type: Bug,Status: Unconfirmed,Component: Developer Tools | low | Critical |
2,632,690,416 | bitcoin | intermittent issue in wallet_upgradewallet.py: AssertionError: bdb magic does not match bdb btree magic | https://cirrus-ci.com/task/5232872305459200:
```bash
067] node0 2024-11-04T11:58:48.064759Z [httpworker.0] [src/wallet/wallet.h:936] [WalletLogPrintf] [default wallet] m_address_book.size() = 2
[06:58:49.067] test 2024-11-04T11:58:48.065000Z TestFramework (INFO): Can upgrade to HD
[06:58:49.067] test 2024-11-04T11:58:48.066000Z TestFramework (ERROR): Assertion failed
[06:58:49.067] Traceback (most recent call last):
[06:58:49.067] File "/ci_container_base/test/functional/test_framework/test_framework.py", line 132, in main
[06:58:49.067] self.run_test()
[06:58:49.067] File "/ci_container_base/ci/scratch/build-x86_64-pc-linux-gnu/test/functional/wallet_upgradewallet.py", line 226, in run_test
[06:58:49.067] orig_kvs = dump_bdb_kv(node_master_wallet)
[06:58:49.067] File "/ci_container_base/test/functional/test_framework/bdb.py", line 142, in dump_bdb_kv
[06:58:49.067] dump_meta_page(pages[INNER_META_PAGE])
[06:58:49.067] File "/ci_container_base/test/functional/test_framework/bdb.py", line 100, in dump_meta_page
[06:58:49.067] assert magic == BTREE_MAGIC, 'bdb magic does not match bdb btree magic'
[06:58:49.067] AssertionError: bdb magic does not match bdb btree magic
[06:58:49.067] test 2024-11-04T11:58:48.067000Z TestFramework (DEBUG): Closing down network thread
```
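For context, the failing check in `test_framework/bdb.py` essentially reads a 32-bit magic out of the metadata page and compares it against the btree magic. A simplified sketch (offsets follow the usual BDB meta-page layout of 8-byte LSN then 4-byte pgno before the magic; treat this as an approximation of the real parser, not its exact code):

```python
import struct

BTREE_MAGIC = 0x053162  # Berkeley DB btree file magic

def check_meta_page(page: bytes) -> int:
    # LSN (8 bytes) + pgno (4 bytes) come first; the magic sits at offset 12.
    magic = struct.unpack("<I", page[12:16])[0]
    assert magic == BTREE_MAGIC, "bdb magic does not match bdb btree magic"
    return magic
```

So the assertion firing intermittently suggests the wallet file being dumped was not (yet) a well-formed btree database at the moment the test read it.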
Looking at all the logs in #30798, this seems to be a different issue. | CI failed | low | Critical |
2,632,710,510 | rust | cargo build crash after move common.rs file to subdirectory | <!--
Thank you for finding an Internal Compiler Error! 🧊 If possible, try to provide
a minimal verifiable example. You can read "Rust Bug Minimization Patterns" for
how to create smaller examples.
http://blog.pnkfx.org/blog/2019/11/18/rust-bug-minimization-patterns/
-->
### Code
```Rust
<code>
```
### Meta
<!--
If you're using the stable version of the compiler, you should also check if the
bug also exists in the beta or nightly versions.
-->
`rustc --version --verbose`:
```
rustc 1.82.0 (f6e511eec 2024-10-15)
binary: rustc
commit-hash: f6e511eec7342f59a25f7c0534f1dbea00d01b14
commit-date: 2024-10-15
host: x86_64-unknown-linux-gnu
release: 1.82.0
LLVM version: 19.1.1
```
### Error output
```
error: the compiler unexpectedly panicked. this is a bug.
note: we would appreciate a bug report: https://github.com/rust-lang/rust/issues/new?labels=C-bug%2C+I-ICE%2C+T-compiler&template=ice.md
note: rustc 1.82.0 (f6e511eec 2024-10-15) running on x86_64-unknown-linux-gnu
note: compiler flags: --crate-type bin -C embed-bitcode=no -C debuginfo=2 -C incremental=[REDACTED]
note: some of the compiler flags provided by cargo are hidden
query stack during panic:
#0 [evaluate_obligation] evaluating trait selection obligation `axum::error_handling::HandleError<tower::timeout::Timeout<axum::routing::route::Route>, server::error::handle_error, (http::method::Method, http::uri::Uri)>: tower_service::Service<http::request::Request<axum_core::body::Body>>`
#1 [normalize_canonicalized_projection_ty] normalizing `<tower::util::map_request::MapRequest<tower::util::map_err::MapErr<tower::util::map_response::MapResponse<axum::error_handling::HandleError<tower::timeout::Timeout<axum::routing::route::Route>, server::error::handle_error, (http::method::Method, http::uri::Uri)>, <http::response::Response<axum_core::body::Body> as axum_core::response::into_response::IntoResponse>::into_response>, <core::convert::Infallible as core::convert::Into<core::convert::Infallible>>::into>, {closure@axum::routing::route::Route::layer<tower::builder::ServiceBuilder<tower_layer::stack::Stack<tower::timeout::layer::TimeoutLayer, tower_layer::stack::Stack<axum::error_handling::HandleErrorLayer<server::error::handle_error, (http::method::Method, http::uri::Uri)>, tower_layer::identity::Identity>>>, core::convert::Infallible>::{closure#0}}> as tower_service::Service<http::request::Request<axum_core::body::Body>>>::Response`
#2 [try_normalize_generic_arg_after_erasing_regions] normalizing `<tower::util::map_request::MapRequest<tower::util::map_err::MapErr<tower::util::map_response::MapResponse<axum::error_handling::HandleError<tower::timeout::Timeout<axum::routing::route::Route>, server::error::handle_error, (http::method::Method, http::uri::Uri)>, <http::response::Response<axum_core::body::Body> as axum_core::response::into_response::IntoResponse>::into_response>, <core::convert::Infallible as core::convert::Into<core::convert::Infallible>>::into>, {closure@axum::routing::route::Route::layer<tower::builder::ServiceBuilder<tower_layer::stack::Stack<tower::timeout::layer::TimeoutLayer, tower_layer::stack::Stack<axum::error_handling::HandleErrorLayer<server::error::handle_error, (http::method::Method, http::uri::Uri)>, tower_layer::identity::Identity>>>, core::convert::Infallible>::{closure#0}}> as tower::util::ServiceExt<http::request::Request<axum_core::body::Body>>>::map_response<<<tower::util::map_request::MapRequest<tower::util::map_err::MapErr<tower::util::map_response::MapResponse<axum::error_handling::HandleError<tower::timeout::Timeout<axum::routing::route::Route>, server::error::handle_error, (http::method::Method, http::uri::Uri)>, <http::response::Response<axum_core::body::Body> as axum_core::response::into_response::IntoResponse>::into_response>, <core::convert::Infallible as core::convert::Into<core::convert::Infallible>>::into>, {closure@axum::routing::route::Route::layer<tower::builder::ServiceBuilder<tower_layer::stack::Stack<tower::timeout::layer::TimeoutLayer, tower_layer::stack::Stack<axum::error_handling::HandleErrorLayer<server::error::handle_error, (http::method::Method, http::uri::Uri)>, tower_layer::identity::Identity>>>, core::convert::Infallible>::{closure#0}}> as tower_service::Service<http::request::Request<axum_core::body::Body>>>::Response as axum_core::response::into_response::IntoResponse>::into_response, http::response::Response<axum_core::body::Body>>`
#3 [collect_and_partition_mono_items] collect_and_partition_mono_items
end of query stack
there was a panic while trying to force a dep node
try_mark_green dep node stack:
#0 adt_sized_constraint(thread 'rustc' panicked at compiler/rustc_metadata/src/rmeta/def_path_hash_map.rs:23:54:
called `Option::unwrap()` on a `None` value
```
<!--
Include a backtrace in the code block by setting `RUST_BACKTRACE=1` in your
environment. E.g. `RUST_BACKTRACE=1 cargo build`.
-->
<details><summary><strong>Backtrace</strong></summary>
<p>
```
0: rust_begin_unwind
1: core::panicking::panic_fmt
2: core::panicking::panic
3: core::option::unwrap_failed
4: <rustc_query_system::dep_graph::dep_node::DepNode as rustc_middle::dep_graph::dep_node::DepNodeExt>::extract_def_id
5: rustc_interface::callbacks::dep_node_debug
6: <rustc_query_system::dep_graph::dep_node::DepNode as core::fmt::Debug>::fmt
7: core::fmt::write
8: <&std::io::stdio::Stderr as std::io::Write>::write_fmt
9: std::io::stdio::_eprint
10: rustc_query_system::dep_graph::graph::print_markframe_trace::<rustc_middle::dep_graph::DepsType>
11: <rustc_query_system::dep_graph::graph::DepGraphData<rustc_middle::dep_graph::DepsType>>::try_mark_previous_green::<rustc_query_impl::plumbing::QueryCtxt>
12: <rustc_query_system::dep_graph::graph::DepGraphData<rustc_middle::dep_graph::DepsType>>::try_mark_previous_green::<rustc_query_impl::plumbing::QueryCtxt>
13: <rustc_query_system::dep_graph::graph::DepGraphData<rustc_middle::dep_graph::DepsType>>::try_mark_previous_green::<rustc_query_impl::plumbing::QueryCtxt>
14: <rustc_query_system::dep_graph::graph::DepGraphData<rustc_middle::dep_graph::DepsType>>::try_mark_previous_green::<rustc_query_impl::plumbing::QueryCtxt>
15: <rustc_query_system::dep_graph::graph::DepGraphData<rustc_middle::dep_graph::DepsType>>::try_mark_previous_green::<rustc_query_impl::plumbing::QueryCtxt>
16: <rustc_query_system::dep_graph::graph::DepGraphData<rustc_middle::dep_graph::DepsType>>::try_mark_previous_green::<rustc_query_impl::plumbing::QueryCtxt>
17: rustc_query_system::query::plumbing::try_execute_query::<rustc_query_impl::DynamicConfig<rustc_query_system::query::caches::DefaultCache<rustc_type_ir::canonical::Canonical<rustc_middle::ty::context::TyCtxt, rustc_middle::ty::ParamEnvAnd<rustc_middle::ty::predicate::Predicate>>, rustc_middle::query::erase::Erased<[u8; 2]>>, false, false, false>, rustc_query_impl::plumbing::QueryCtxt, true>
18: <rustc_trait_selection::traits::fulfill::FulfillProcessor as rustc_data_structures::obligation_forest::ObligationProcessor>::process_obligation
19: <rustc_data_structures::obligation_forest::ObligationForest<rustc_trait_selection::traits::fulfill::PendingPredicateObligation>>::process_obligations::<rustc_trait_selection::traits::fulfill::FulfillProcessor>
20: rustc_traits::normalize_projection_ty::normalize_canonicalized_projection_ty
[... omitted 3 frames ...]
21: <rustc_trait_selection::traits::query::normalize::QueryNormalizer as rustc_type_ir::fold::FallibleTypeFolder<rustc_middle::ty::context::TyCtxt>>::try_fold_ty
22: <&rustc_middle::ty::list::RawList<(), rustc_middle::ty::generic_args::GenericArg> as rustc_type_ir::fold::TypeFoldable<rustc_middle::ty::context::TyCtxt>>::try_fold_with::<rustc_trait_selection::traits::query::normalize::QueryNormalizer>
23: <rustc_trait_selection::traits::query::normalize::QueryNormalizer as rustc_type_ir::fold::FallibleTypeFolder<rustc_middle::ty::context::TyCtxt>>::try_fold_ty
24: <&rustc_middle::ty::list::RawList<(), rustc_middle::ty::generic_args::GenericArg> as rustc_type_ir::fold::TypeFoldable<rustc_middle::ty::context::TyCtxt>>::try_fold_with::<rustc_trait_selection::traits::query::normalize::QueryNormalizer>
25: <rustc_trait_selection::traits::query::normalize::QueryNormalizer as rustc_type_ir::fold::FallibleTypeFolder<rustc_middle::ty::context::TyCtxt>>::try_fold_ty
26: <rustc_traits::normalize_erasing_regions::provide::{closure#0} as core::ops::function::FnOnce<(rustc_middle::ty::context::TyCtxt, rustc_middle::ty::ParamEnvAnd<rustc_middle::ty::generic_args::GenericArg>)>>::call_once
[... omitted 1 frame ...]
27: <rustc_middle::ty::normalize_erasing_regions::NormalizeAfterErasingRegionsFolder as rustc_type_ir::fold::TypeFolder<rustc_middle::ty::context::TyCtxt>>::fold_ty
28: rustc_monomorphize::collector::collect_items_rec::{closure#0}
29: rustc_monomorphize::collector::collect_items_rec
30: rustc_monomorphize::collector::collect_items_rec
31: rustc_monomorphize::collector::collect_items_rec
32: rustc_monomorphize::collector::collect_items_rec
33: rustc_monomorphize::collector::collect_items_rec
34: rustc_monomorphize::collector::collect_items_rec
35: rustc_monomorphize::collector::collect_items_rec
36: rustc_monomorphize::collector::collect_items_rec
37: rustc_monomorphize::collector::collect_items_rec
38: rustc_monomorphize::collector::collect_items_rec
39: rustc_monomorphize::collector::collect_items_rec
40: rustc_monomorphize::collector::collect_items_rec
41: rustc_monomorphize::collector::collect_items_rec
42: rustc_monomorphize::collector::collect_items_rec
43: rustc_monomorphize::collector::collect_items_rec
44: rustc_monomorphize::collector::collect_items_rec
45: rustc_monomorphize::collector::collect_items_rec
46: rustc_monomorphize::collector::collect_items_rec
47: rustc_monomorphize::collector::collect_items_rec
48: rustc_monomorphize::collector::collect_items_rec
49: rustc_monomorphize::collector::collect_items_rec
50: rustc_monomorphize::collector::collect_items_rec
51: rustc_monomorphize::collector::collect_items_rec
52: rustc_monomorphize::collector::collect_items_rec
53: rustc_monomorphize::collector::collect_items_rec
54: rustc_monomorphize::collector::collect_items_rec
55: rustc_monomorphize::collector::collect_items_rec
56: rustc_monomorphize::collector::collect_items_rec
57: rustc_monomorphize::collector::collect_items_rec
58: rustc_monomorphize::collector::collect_items_rec
59: rustc_monomorphize::collector::collect_items_rec
60: rustc_monomorphize::collector::collect_items_rec
61: rustc_monomorphize::collector::collect_items_rec
62: rustc_monomorphize::partitioning::collect_and_partition_mono_items
[... omitted 2 frames ...]
63: <rustc_codegen_llvm::LlvmCodegenBackend as rustc_codegen_ssa::traits::backend::CodegenBackend>::codegen_crate
64: <rustc_interface::queries::Linker>::codegen_and_build_linker
65: rustc_interface::interface::run_compiler::<core::result::Result<(), rustc_span::ErrorGuaranteed>, rustc_driver_impl::run_compiler::{closure#0}>::{closure#1}
```
</p>
</details>
| I-ICE,T-compiler,A-incr-comp,C-bug,S-needs-repro | low | Critical |
2,632,716,762 | ui | [feat]: add support for microsoft rush monorepo | ### Feature description
Shadcn CLI does not support projects which are managed by [@microsoft/rush](https://rushjs.io/) mono repo.
Rush mono repo could be configured to use either `npm`, `yarn` or `pnpm` but it has [its own way of managing dependencies](https://rushjs.io/pages/developer/modifying_package_json/#rush-add)
_(i.e. `npm install`, `yarn add` or `pnpm add` are not used directly.)_
**Problem:**
Because Rush is managing the dependencies differently, regardless of the Rush package manager config, none of the following Shadcn CLI commands works:
```
npx shadcn@latest add button
# or
npx shadcn@latest add button
# or
pnpm dlx shadcn@latest add button
```
Running the Shadcn CLI to add a component causes an error at the `installing dependencies` step:
using npm

using pnpm

### Proposed Solution
The Shadcn CLI should detect whether the project is managed by Rush, and use the [Rush CLI](https://rushjs.io/pages/developer/modifying_package_json/#rush-add) to install dependencies.
(Shadcn CLI currently supports `npm`, `yarn`, `pnpm` and `bun` )
### Affected component/components
CLI
### Additional Context
Additional details here...
### Before submitting
- [X] I've made research efforts and searched the documentation
- [X] I've searched for existing issues and PRs | area: request | low | Critical |
2,632,717,100 | ui | [bug]: Cannot find module 'next-themes/dist/types' or its corresponding type declarations. | ### Describe the bug
Hello,
I have upgraded my project to next 15.0.3-canary.5 and "next-themes": "^0.4.1",
and now when I run `pnpm run build` it fails with the error below.
**It works with "next-themes": "^0.3.0".**
```
./src/components/theme-provider.tsx:5:41
Type error: Cannot find module 'next-themes/dist/types' or its corresponding type declarations.

  3 | import * as React from "react";
  4 | import { ThemeProvider as NextThemesProvider } from "next-themes";
> 5 | import { type ThemeProviderProps } from "next-themes/dist/types";
    |                                         ^
  6 |
  7 | export function ThemeProvider({ children, ...props }: ThemeProviderProps) {
  8 |   return <NextThemesProvider {...props}>{children}</NextThemesProvider>;
```
I am using simple code like in the examples:
```tsx
"use client";

import * as React from "react";
import { ThemeProvider as NextThemesProvider } from "next-themes";
import { type ThemeProviderProps } from "next-themes/dist/types";

export function ThemeProvider({ children, ...props }: ThemeProviderProps) {
  return <NextThemesProvider {...props}>{children}</NextThemesProvider>;
}
```
I am using the latest version of "next-themes": "^0.4.1",
### Affected component/components
Theme
### How to reproduce
Just run
`pnpm run build`
### Codesandbox/StackBlitz link
_No response_
### Logs
_No response_
### System Info
```bash
Chrome latest version
```
### Before submitting
- [X] I've made research efforts and searched the documentation
- [X] I've searched for existing issues | bug | low | Critical |
2,632,721,085 | godot | Polygon2d preview settings unaccessible(mobile) | ### Tested versions
4.3Stable
### System information
Godot editor 4 (mobile version from PlayStore)
### Issue description
The window in the Polygon2D preview can be moved from side to side or enlarged; however, the actual settings can't be accessed. I can't slide up or open the settings menu in any way at all.
### Steps to reproduce
Either make the word settings be a link to open the settings menu, or just allow the window to slide up to reach it.
### Minimal reproduction project (MRP)
Here I have no idea what you need | bug,platform:android,topic:editor,topic:2d | low | Minor |
2,632,785,967 | PowerToys | Accessibility: NVDA's mouse tracking broken in PowerToys | ### Microsoft PowerToys version
0.85.1
### Installation method
GitHub, PowerToys auto-update
### Running as admin
None
### Area(s) with issue?
Welcome / PowerToys Tour window
### Steps to reproduce
1. Run NVDA
2. Press `NVDA+m` to turn on mouse tracking
3. Move the mouse inside the PowerToys window.
### ✔️ Expected Behavior
Report text under the mouse or something like that
### ❌ Actual Behavior
The mouse object is always a pane and does not navigate between the actual objects.

Strangely Accessibility Insights For Windows and the narrator are fine.
### Other Software
NVDA alpha 34485 | Issue-Bug,Needs-Triage,Needs-Team-Response | low | Critical |
2,632,796,946 | storybook | [Tracking]: Component Testing Enhancements 🔧 | ## 🧑🤝🧑 Who: @JReinhold, @ndelangen and @ghengeveld
This is a tracking issue for the Component Testing Enhancements 🔧 project. The purpose of this issue is to keep track of the overall status of the project and its tasks, and to plan everything around it. This project is a continuation of #29088 and is related to #29529 and #29530.
# ⚠️ Problem
There were a few unresolved items in #29088 that we want to work on to improve the Component Testing experience. Furthermore, we're continuously receiving feedback on the new feature that we want to address.
# Tasks
```[tasklist]
#### Must Haves
- [ ] https://github.com/storybookjs/storybook/issues/29745
- [ ] https://github.com/storybookjs/storybook/issues/29796
- [ ] https://github.com/storybookjs/storybook/issues/29797
- [ ] https://github.com/storybookjs/storybook/issues/29613
- [ ] #29771
- [ ] #29772
- [ ] #29775
- [ ] #29776
- [ ] #29778
- [ ] #29779
- [ ] https://github.com/storybookjs/storybook/issues/29686
- [ ] https://github.com/storybookjs/addon-svelte-csf/issues/213
- [ ] ~https://github.com/storybookjs/storybook/issues/26842~
- [ ] https://github.com/storybookjs/storybook/issues/29703
- [ ] https://github.com/storybookjs/storybook/issues/29784
- [ ] https://github.com/storybookjs/storybook/issues/29349
- [ ] https://github.com/storybookjs/storybook/issues/29647
- [ ] https://github.com/storybookjs/storybook/issues/28890
- [ ] https://github.com/storybookjs/storybook/issues/29652
- [ ] https://github.com/storybookjs/storybook/issues/29785
- [ ] https://github.com/storybookjs/storybook/issues/29741
- [ ] https://github.com/storybookjs/storybook/issues/29742
- [ ] https://github.com/storybookjs/storybook/issues/29650
- [ ] https://github.com/storybookjs/storybook/issues/29767
- [ ] https://github.com/storybookjs/storybook/issues/29794
- [ ] https://github.com/storybookjs/storybook/issues/29799
- [ ] https://github.com/storybookjs/storybook/issues/30009
```
```[tasklist]
### Expandable Scope
- [ ] https://github.com/storybookjs/storybook/issues/29653
- [ ] https://github.com/storybookjs/storybook/issues/25073
- [ ] https://github.com/storybookjs/storybook/issues/29651
- [ ] https://github.com/storybookjs/storybook/issues/29572
- [ ] https://github.com/storybookjs/storybook/issues/29602
- [ ] #29773
- [ ] #29774
- [ ] #29780
- [ ] #29781
- [ ] #29782
- [ ] https://github.com/storybookjs/storybook/issues/29567
- [ ] https://github.com/storybookjs/storybook/issues/29575
- [ ] #29777
- [ ] #29783
```
```[tasklist]
### Needs Triage
- [ ] https://github.com/storybookjs/storybook/issues/29812
- [ ] https://github.com/storybookjs/storybook/issues/28954
- [ ] https://github.com/storybookjs/storybook/issues/28896
- [ ] https://github.com/storybookjs/storybook/issues/28898
- [ ] https://github.com/storybookjs/storybook/issues/28891
- [ ] https://github.com/storybookjs/storybook/issues/28897
- [ ] Add Storybook to the [Vitest Ecosystem CI](https://github.com/vitest-dev/vitest-ecosystem-ci)
- [ ] Tech debt: Combine `TestManager` and `VitestManager`
- [ ] https://github.com/storybookjs/storybook/issues/30010
- [ ] https://github.com/storybookjs/storybook/issues/30011
- [ ] https://github.com/storybookjs/storybook/issues/30028
- [ ] https://github.com/storybookjs/storybook/issues/30048
- [ ] https://github.com/storybookjs/storybook/issues/30053
- [ ] https://github.com/storybookjs/storybook/issues/30087
``` | Tracking,addon: test | low | Minor |
2,632,829,738 | vscode | Null editorGhostText.foreground color in default high contrast themes | <!-- ⚠️⚠️ Do Not Delete This! bug_report_template ⚠️⚠️ -->
Does this issue occur when all extensions are disabled?: Yes
- VS Code Version: 1.95.1
- OS Version: Darwin arm64 24.0.0
I am writing an extension that uses the theme color `editorGhostText.foreground`, and I would expect it to be visible in all built-in themes. However, their values are transparent when I switch the theme to either `Dark High Contrast` or `Light High Contrast`. I found they are explicitly set that way [here](https://github.com/microsoft/vscode/blob/e37672f7d7a7e55064cad4d0b11e88f8d92ba7ac/src/vs/editor/common/core/editorColorRegistry.ts#L67); I'm curious why. | bug,ghost-text | low | Critical |
2,632,838,992 | react | TestUtils.Simulate alternatives | Hi team.
We're using Simulate from 'react-dom/test-utils' in our package, which is a dependency for other packages. However, Simulate is deprecated, and it's recommended to use @testing-library/react's fireEvent (https://react.dev/warnings/react-dom-test-utils). The problem is that we need to support React starting from version ^16.8, but @testing-library/react requires React ^18.0.0 as a peer dependency. Given this constraint, do you have any recommendations for what we should use as an alternative to Simulate? | Status: Unconfirmed | low | Minor |
2,632,876,431 | PowerToys | [Image Resizer] - The ability to apply multiple presets to a single image | ### Description of the new feature / enhancement
Often I find myself needing to create multiple sizes of a particular image. Image Resizer has made this fairly straightforward, as my most common sizes are saved as presets and I can just click on the size I need. However, there are multiple occasions where I need to resize an image several times, so that I have different size variations.
It would be useful to be able to choose to resize an image and apply multiple presets to the resize, so that Image Resizer generates an image for each of the presets from a single base image.
### Scenario when this would be used?
A number of companies use images in multiple sizes; e.g., Apple has the concept of 2x and 3x images, which are the same image at double and triple the size respectively.
Similarly, when developing for web, favicons and images used for SEO and for PWAs are normally the same image, but an apple touch icon is 180x180, iPad is 167x167, Android is 192x192 whereas a favicon is 32x32 or 16x16. Having the ability to create an image for each of those sizes, stored as a preset, from a single source image would be beneficial.
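A rough sketch of the computation I have in mind, expanding one source image across a preset list (the preset tuples and the proportional-scaling rule here are my own illustration, not Image Resizer's actual preset format):

```python
def apply_presets(src_w, src_h, presets):
    """Compute output sizes for one source image across many presets.

    Each preset is (name, width, height); None for one dimension means
    "scale proportionally". Illustrative only, not Image Resizer's
    actual configuration.
    """
    results = {}
    for name, w, h in presets:
        if w is None:
            w = round(src_w * h / src_h)   # keep aspect ratio from height
        elif h is None:
            h = round(src_h * w / src_w)   # keep aspect ratio from width
        results[name] = (w, h)
    return results

# Example preset list matching the favicon use case above
favicon_presets = [
    ("apple-touch", 180, 180),
    ("android", 192, 192),
    ("favicon-32", 32, 32),
    ("favicon-16", 16, 16),
]
```

Running this once per source image would yield every size variation in a single pass, which is the batch behavior being requested.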
### Supporting information
There are a number of sites that provide the functionality specifically for favicons (such as https://realfavicongenerator.net/), but it would be useful to be able to do this in a wider context, such as being able to generate images for App development. | Needs-Triage | low | Minor |
2,632,909,456 | langchain | Tool Calling Fails with Error with Azure Search for Agentic RAG | ### Checked other resources
- [X] I added a very descriptive title to this issue.
- [X] I searched the [LangGraph](https://langchain-ai.github.io/langgraph/)/LangChain documentation with the integrated search.
- [X] I used the GitHub search to find a similar question and didn't find it.
- [X] I am sure that this is a bug in LangGraph/LangChain rather than my code.
- [X] I am sure this is better as an issue [rather than a GitHub discussion](https://github.com/langchain-ai/langgraph/discussions/new/choose), since this is a LangGraph bug and not a design question.
### Example Code
```python
# Tool creation
embeddings = AzureOpenAIEmbeddings(model=embedding_model_name,azure_deployment="text-embedding-3-large")
# Create the vector store
vector_store = AzureSearch(
azure_search_endpoint=azure_search_endpoint,
azure_search_key=azure_search_key,
index_name=search_index_name,
embedding_function=embeddings.embed_query,
)
retriever_ = vector_store.as_retriever()  # returns a retriever backed by AzureSearch
retrieve_policy_document = create_retriever_tool(
retriever=retriever_,
name="retriever",
description="Search and Return tool for HR Related questions and Leave Policy in NewYork.",)
# Tool binding to agent
vector_store = create_vector_store()
self.retrieval_tool = create_vector_store_tool(vector_store)
# Bind the retriever tool to the LLM
self.model_tool = self.llm.bind_tools([self.retrieval_tool])
response = self.model_tool.invoke(messages)
# LangGraph workflow
workflow = StateGraph(AgentState)
# Add nodes to the workflow
workflow.add_node("agent", self.agent)
retrieve_node = ToolNode([self.retrieval_tool])
workflow.add_node("retrieve", retrieve_node)
workflow.add_node("rewrite", self.rewrite)
workflow.add_node("generate", self.generate)
# Define the workflow edges
workflow.add_edge(START, "agent")
# Conditional edges based on the agent's decision
workflow.add_conditional_edges(
"agent",
tools_condition,
{
"tools": "retrieve",
END: END,
},
)
# Conditional edges after retrieval
workflow.add_conditional_edges(
"retrieve",
self.grade_documents,
{
"generate": "generate",
"rewrite": "rewrite",
},
)
workflow.add_edge("generate", END)
workflow.add_edge("rewrite", "agent")
```
### Error Message and Stack Trace (if applicable)
```shell
The tool call is not performed for the above code example; instead, the response comes directly from the LLM.
When I change the tool name as below, it returns invalid tool call errors.
retrieve_policy_document = create_retriever_tool(
retriever=retriever_,
name="retriever_policy_documents",
description="Search and Return tool for HR Related questions and Leave Policy in NewYork.",
)
================================== Ai Message ==================================
Invalid Tool Calls:
python (call_MnPE9xzRf5aedp2j0z6kIA6w)
Call ID: call_MnPE9xzRf5aedp2j0z6kIA6w
Error: Function python arguments:
// Here's an example implementation of the retriever_policy_documents function
// that searches for HR related questions and leave policies in New York.
functions.retriever_policy_documents = ({ query }) => {
const documents = [
{
title: "Extended Leave Policy",
content: "Employees may request an extended leave of absence for medical reasons or to care for a family member. The leave may be unpaid, but job protection is provided under the Family and Medical Leave Act (FMLA).",
category: "Leave Policy",
location: "New York"
},
{
title: "FMLA",
content: "The Family and Medical Leave Act (FMLA) provides eligible employees with up to 12 weeks of unpaid leave per year for certain family and medical reasons. To be eligible, employees must have worked for their employer for at least 12 months and have worked at least 1,250 hours during the previous 12 months.",
category: "HR Policy",
location: "New York"
},
{
title: "Disability Leave",
content: "Employees who are unable to work due to a disability may be eligible for disability leave under the Americans with Disabilities Act (ADA) and the New York State Human Rights Law. The length of the leave and the amount of pay may vary depending on the circumstances.",
category: "Leave Policy",
location: "New York"
}
];
const results = documents.filter(doc => {
const queryWords = query.toLowerCase().split(" ");
const docWords = doc.title.toLowerCase().split(" ");
return queryWords.every(word => docWords.includes(word));
});
return results;
};
are not valid JSON. Received JSONDecodeError Expecting value: line 1 column 1 (char 0)
For troubleshooting, visit: https://python.langchain.com/docs/troubleshooting/errors/OUTPUT_PARSING_FAILURE
Args:
// Here's an example implementation of the retriever_policy_documents function
// that searches for HR related questions and leave policies in New York.
functions.retriever_policy_documents = ({ query }) => {
const documents = [
{
title: "Extended Leave Policy",
content: "Employees may request an extended leave of absence for medical reasons or to care for a family member. The leave may be unpaid, but job protection is provided under the Family and Medical Leave Act (FMLA).",
category: "Leave Policy",
location: "New York"
},
{
title: "FMLA",
content: "The Family and Medical Leave Act (FMLA) provides eligible employees with up to 12 weeks of unpaid leave per year for certain family and medical reasons. To be eligible, employees must have worked for their employer for at least 12 months and have worked at least 1,250 hours during the previous 12 months.",
category: "HR Policy",
location: "New York"
},
{
title: "Disability Leave",
content: "Employees who are unable to work due to a disability may be eligible for disability leave under the Americans with Disabilities Act (ADA) and the New York State Human Rights Law. The length of the leave and the amount of pay may vary depending on the circumstances.",
category: "Leave Policy",
location: "New York"
}
];
const results = documents.filter(doc => {
const queryWords = query.toLowerCase().split(" ");
const docWords = doc.title.toLowerCase().split(" ");
return queryWords.every(word => docWords.includes(word));
});
return results;
};
None
```
### Description
Here are the libraries.
```
langchain 0.3.6
langchain-anthropic 0.2.3
langchain-cli 0.0.31
langchain-community 0.3.4
langchain-core 0.3.14
langchain-openai 0.2.3
langchain-text-splitters 0.3.0
langchainhub 0.1.21
langdetect 1.0.9
langgraph 0.2.38
langgraph-checkpoint 2.0.1
langgraph-checkpoint-postgres 2.0.1
langgraph-sdk 0.1.33
```
Tool calling does not work with the Azure OpenAI & Azure Search combination. My use case requires providing responses based on existing documentation loaded into Azure Search. I attempted to perform agentic RAG following this example:
https://langchain-ai.github.io/langgraph/tutorials/rag/langgraph_agentic_rag/
The retriever step was not called; instead, the response came from the LLM directly.
How do I enable debugging in LangGraph to show the execution of steps and the input and output of the workflow execution?
Full code can be found below.
https://github.com/kishorekkota/agentic_app/blob/main/agentic_rag/azure_search.py
https://github.com/kishorekkota/agentic_app/blob/main/agentic_rag/rag_assistant.py
### System Info
System Information
------------------
> OS: Darwin
> OS Version: Darwin Kernel Version 24.1.0: Thu Oct 10 21:03:15 PDT 2024; root:xnu-11215.41.3~2/RELEASE_ARM64_T6000
> Python Version: 3.12.7 (main, Oct 1 2024, 02:05:46) [Clang 16.0.0 (clang-1600.0.26.3)]
Package Information
-------------------
> langchain_core: 0.3.14
> langchain: 0.3.6
> langchain_community: 0.3.4
> langsmith: 0.1.133
> langchain_anthropic: 0.2.3
> langchain_cli: 0.0.31
> langchain_openai: 0.2.3
> langchain_text_splitters: 0.3.0
> langchainhub: 0.1.21
> langgraph: 0.2.38
> langserve: 0.3.0
Other Dependencies
------------------
> aiohttp: 3.10.9
> anthropic: 0.36.0
> async-timeout: Installed. No version info available.
> dataclasses-json: 0.6.7
> defusedxml: 0.7.1
> fastapi: 0.115.2
> gitpython: 3.1.43
> gritql: 0.1.5
> httpx: 0.27.2
> httpx-sse: 0.4.0
> jsonpatch: 1.33
> langgraph-checkpoint: 2.0.1
> langgraph-sdk: 0.1.33
> langserve[all]: Installed. No version info available.
> numpy: 1.26.4
> openai: 1.52.0
> orjson: 3.10.7
> packaging: 24.1
> pydantic: 2.9.2
> pydantic-settings: 2.5.2
> PyYAML: 6.0.2
> requests: 2.32.3
> requests-toolbelt: 1.0.0
> SQLAlchemy: 2.0.35
> sse-starlette: 1.8.2
> tenacity: 8.5.0
> tiktoken: 0.8.0
> tomlkit: 0.12.5
> typer[all]: Installed. No version info available.
> types-requests: 2.32.0.20241016
> typing-extensions: 4.12.2
> uvicorn: 0.23.2 | 🤖:bug | low | Critical |
2,632,921,102 | vscode | Manual Updates |
Type: <b>Feature Request</b>
As a user, when I select "Manual" as my update mode, I want to check to SEE if there are updates, AND MANUALLY apply them IF I CHOOSE TO.
STEPS:
Click Settings icon
click Application
Find "Mode" dropdown
Select Manual
Click Settings icon
Click "Check for Updates"
When the check is complete, you will see that a restart is needed to apply the update. However, since I know this update has a bug, I don't want to apply it, and there is, seemingly, no way to "cancel" applying this update.
VS Code version: Code 1.94.2 (384ff7382de624fb94dbaf6da11977bba1ecd427, 2024-10-09T16:08:44.566Z)
OS version: Windows_NT x64 10.0.22631
Modes:
<!-- generated by issue reporter --> | feature-request,install-update | low | Critical |
2,632,935,303 | bitcoin | Mempool leak through the eviction policy | A spy can see whether a transaction exists in a target node's mempool before the node INV-announces it (overcoming the trickle protection).
0. Fill all the inbound slots of the target (up to 117 by default), and make sure your connections are not protected from eviction (e.g., delay block delivery to the target, etc).
1. Relay txA to the peer from connection CSpy.
2. Issue more connections to the target (say, 100 more).
3. If CSpy was not evicted while others were, it's likely CSpy [was protected](https://github.com/bitcoin/bitcoin/blob/6463117a29294f6ddc9fafecfd1e9023956cc41b/src/node/eviction.cpp#L192) because it provided a **new** transaction txA.
----------
It's quite hard to pull off due to a lot of noise.
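For illustration, the eviction protection could key off membership in a bounded window of recently received transactions, rather than off having delivered the single newest one. A rough sketch (the capacity and API are made up):

```python
from collections import OrderedDict

class RecentTxWindow:
    """Last-N set of transaction ids seen from peers (illustrative)."""

    def __init__(self, capacity=1000):
        self.capacity = capacity
        self._txids = OrderedDict()

    def add(self, txid):
        self._txids.pop(txid, None)          # refresh position if re-seen
        self._txids[txid] = None
        if len(self._txids) > self.capacity:
            self._txids.popitem(last=False)  # drop the oldest entry

    def __contains__(self, txid):
        return txid in self._txids
```

A peer would be protected if any of its recently relayed txids is still inside the window, which is much noisier for a spy to probe than the current "provided the newest transaction" signal.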
A potential solution is to base protection on a rolling window of recent transactions (say, the last 1,000) instead of only the very latest ones. | Brainstorming,Privacy | low | Major |
2,632,960,843 | godot | Godot crashes randomly | ### Tested versions
4.3.stable.official
### System information
Linux Mint 22 X86_64 - Vulkan 1.3.289 - Forward Mobile - Using Device #0: AMD - AMD Radeon R3 Graphics (RADV STONEY)
### Issue description
I am using a low-end laptop for Godot projects: 4 GB of RAM, AMD A4-9125 with Radeon R3 2C+2G. This might just be a hardware issue, but it doesn't crash when I use it on Windows.
```
================================================================
handle_crash: Program crashed with signal 11
Engine version: Godot Engine v4.3.stable.official (77dcf97d82cbfe4e4615475fa52ca03da645dbd8)
Dumping the backtrace. Please include this when reporting the bug to the project developer.
[1] /lib/x86_64-linux-gnu/libc.so.6(+0x45320) [0x715a12045320] (??:0)
[2] /home/user/Desktop/Godot4.3.x86_64() [0x4113ecb] (??:0)
[3] /home/user/Desktop/Godot4.3.x86_64() [0x7d7abe] (??:0)
[4] /home/user/Desktop/Godot4.3.x86_64() [0x68756c] (??:0)
[5] /home/user/Desktop/Godot4.3.x86_64() [0x232ba5f] (??:0)
[6] /home/user/Desktop/Godot4.3.x86_64() [0x232ba0e] (??:0)
[7] /home/user/Desktop/Godot4.3.x86_64() [0x232ba0e] (??:0)
[8] /home/user/Desktop/Godot4.3.x86_64() [0x232ba0e] (??:0)
[9] /home/user/Desktop/Godot4.3.x86_64() [0x232ba0e] (??:0)
[10] /home/user/Desktop/Godot4.3.x86_64() [0x232ba0e] (??:0)
[11] /home/user/Desktop/Godot4.3.x86_64() [0x232ba0e] (??:0)
[12] /home/user/Desktop/Godot4.3.x86_64() [0x232ba0e] (??:0)
[13] /home/user/Desktop/Godot4.3.x86_64() [0x232ba0e] (??:0)
[14] /home/user/Desktop/Godot4.3.x86_64() [0x232ba0e] (??:0)
[15] /home/user/Desktop/Godot4.3.x86_64() [0x232ba0e] (??:0)
[16] /home/user/Desktop/Godot4.3.x86_64() [0x232ba0e] (??:0)
[17] /home/user/Desktop/Godot4.3.x86_64() [0x232ba0e] (??:0)
[18] /home/user/Desktop/Godot4.3.x86_64() [0x232ba0e] (??:0)
[19] /home/user/Desktop/Godot4.3.x86_64() [0x232ba0e] (??:0)
[20] /home/user/Desktop/Godot4.3.x86_64() [0x232ba0e] (??:0)
[21] /home/user/Desktop/Godot4.3.x86_64() [0x232ba0e] (??:0)
[22] /home/user/Desktop/Godot4.3.x86_64() [0x232ba0e] (??:0)
[23] /home/user/Desktop/Godot4.3.x86_64() [0x232ba0e] (??:0)
[24] /home/user/Desktop/Godot4.3.x86_64() [0x232ba0e] (??:0)
[25] /home/user/Desktop/Godot4.3.x86_64() [0x232ba0e] (??:0)
[26] /home/user/Desktop/Godot4.3.x86_64() [0x232ba0e] (??:0)
[27] /home/user/Desktop/Godot4.3.x86_64() [0x232ba0e] (??:0)
[28] /home/user/Desktop/Godot4.3.x86_64() [0x232c8b3] (??:0)
[29] /home/user/Desktop/Godot4.3.x86_64() [0x23b5028] (??:0)
[30] /home/user/Desktop/Godot4.3.x86_64() [0x4202b7] (??:0)
[31] /lib/x86_64-linux-gnu/libc.so.6(+0x2a1ca) [0x715a1202a1ca] (??:0)
[32] /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0x8b) [0x715a1202a28b] (??:0)
[33] /home/user/Desktop/Godot4.3.x86_64() [0x43d44a] (??:0)
-- END OF BACKTRACE --
================================================================
```
### Steps to reproduce
It closes by itself at random times, even when I don't do anything; I leave it running and it just crashes.
### Minimal reproduction project (MRP)
It happens with every project,
but here's one of them on GitHub:
https://github.com/R4Qabdi/outermension | bug,needs testing,crash | low | Critical |
2,632,993,487 | ollama | langchain-python-rag-document not working | ### What is the issue?
I downloaded the example [langchain-python-rag-document](https://github.com/ollama/ollama/tree/main/examples/langchain-python-rag-document) and tried to execute it using a fresh virtual environment and installing requirements.txt.
No matter the Python version I used (3.9, 3.10, 3.11 and 3.12), the installation with pip always produces the following error:
```
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
chromadb 0.4.5 requires pydantic<2.0,>=1.9, but you have pydantic 2.9.2 which is incompatible.
fastapi 0.99.1 requires pydantic!=1.8,!=1.8.1,<2.0.0,>=1.7.4, but you have pydantic 2.9.2 which is incompatible.
tensorflow-macos 2.13.0 requires typing-extensions<4.6.0,>=3.6.6, but you have typing-extensions 4.12.2 which is incompatible.
```
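Until the example's requirements.txt is updated, one possible workaround (an untested assumption on my part; the version bounds are copied from the resolver errors above) might be a pip constraints file:

```
# constraints.txt -- bounds copied from the resolver errors above (untested)
pydantic>=1.9,<2.0
typing-extensions>=3.6.6,<4.6.0
```

installed with `pip install -r requirements.txt -c constraints.txt`.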
And if I try to execute the example, I get this error:
```
Traceback (most recent call last):
File ".../main.py", line 33, in <module>
vectorstore = Chroma.from_documents(documents=all_splits, embedding=GPT4AllEmbeddings())
File ".../venv/lib/python3.9/site-packages/pydantic/main.py", line 212, in __init__
validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
File ".../venv/lib/python3.9/site-packages/langchain_community/embeddings/gpt4all.py", line 40, in validate_environment
values["client"] = Embed4All(
TypeError: __init__() got an unexpected keyword argument 'model_name'
```
Additionally, I get some warnings at execution time (before it raises the TypeError) saying that I should import from `langchain_community` instead of `langchain`.
### OS
macOS
### GPU
Apple
### CPU
Apple
### Ollama version
0.3.14 | bug | low | Critical |
2,633,003,400 | flutter | Add support for applying delta/factor transformations for TextTheme height, letter and word spacing | ### Use case
When switching from the default material font, the height, letter spacing, and word spacing often all need to be adjusted slightly. Doing this for each text style is annoying.
### Proposal
The text theme has the `apply` method, which does bulk operations and even includes support for delta/factor adjustments for font size. It should also support height, letter spacing, and word spacing. | c: new feature,framework,f: material design,waiting for PR to land (fixed),P3,team-design,triaged-design | low | Minor |
2,633,045,683 | deno | `deno fmt` consumes all available memory | Version: Deno 2.0.4
`deno --version`
```
deno 2.0.4 (stable, release, x86_64-unknown-linux-gnu)
v8 12.9.202.13-rusty
typescript 5.6.2
```
## Overview
When formatting the following JavaScript code, `deno fmt` consumes all available memory.
```js
let a={b:{c:1,d:()=>{{switch(1){default:if(1){if(1){}else{ABCDEFGHIJKLMNOPQR=123456789012}}else{ABCDEFGHIJKLMNOPQR=123456789012}}}}}}
```
**NOTE**: I recommend limiting memory usage when actually trying this out, otherwise your system may freeze.
## Additional information
Curiously, if I reduce the number of characters in the `ABCDEFGHIJKLMNOPQR=123456789012` part, `deno fmt` works well.
```diff
--- a/fmt-works.js
+++ b/fmt-oom.js
@@ -1 +1 @@
-let a={b:{c:1,d:()=>{{switch(1){default:if(1){if(1){}else{ABCDEFGHIJKLMNOPQ=123456789012}}else{ABCDEFGHIJKLMNOPQR=123456789012}}}}}}
+let a={b:{c:1,d:()=>{{switch(1){default:if(1){if(1){}else{ABCDEFGHIJKLMNOPQR=123456789012}}else{ABCDEFGHIJKLMNOPQR=123456789012}}}}}}
```
```sh
podman run --rm -it --memory 4g -v ./fmt-works.js:/fmt-works.js "docker.io/denoland/deno:2.0.4" fmt /fmt-works.js
/fmt-works.js
Checked 1 file
podman run --rm -it --memory 4g -v ./fmt-oom.js:/fmt-oom.js "docker.io/denoland/deno:2.0.4" fmt /fmt-oom.js
# OOM occurred!
```
| bug,deno fmt,needs investigation | low | Minor |
2,633,062,986 | deno | Vite Project Conflicts with Deno and Other Package Managers | Version: Deno 2.0.4
OS: Fedora
## Vite Project Conflicts with Deno and Other Package Managers
When you run a Vite project, the default port is set to `http://localhost:5173/`. When you try to run another Vite project while the first one is still running, it automatically sets the port to `http://localhost:5174/`.
However, when running a Vite project with Deno, it does not detect that another Vite project started with a different package manager is already running, resulting in both projects **using the same port**.
I’ve tested various package managers, and they work as expected, but Deno seems to overlap on the same port.
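For reference, the increment-on-conflict probing I'd expect can be sketched like this (a hypothetical illustration of the behavior, not Vite's or Deno's actual code):

```python
import socket

def find_free_port(start=5173, host="127.0.0.1", attempts=10):
    """Probe ports starting at `start` and return the first one that
    can actually be bound. This mirrors the increment-on-conflict
    behavior Vite shows (5173 taken -> 5174)."""
    for port in range(start, start + attempts):
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            try:
                s.bind((host, port))
            except OSError:
                continue  # port busy: try the next one
            return port
    raise RuntimeError("no free port in range")
```

The bug here is that the Deno-run dev server appears to skip (or win) this probe and reuses a port that is already taken.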
## Steps to reproduce the issue:
1. Create a vite project with pnpm (or any other package manager):
```sh
pnpm create vite my-app -- --template vue
cd my-app
pnpm install
pnpm run dev
```
2. Create a vite project with deno:
```sh
deno run -A npm:create-vite@latest my-app2 -- --template vue
cd my-app2
deno install
deno task dev
```
3. **Run Both Projects Simultaneously**: Run both projects at the same time, and they will end up on the same port.

| needs investigation | low | Minor |
2,633,066,831 | vscode | Editor GPU: Handle the case where glyphs do not fit into standard atlas page slabs | Repro:
1. macOS (or a high DPI monitor preferably)
1. Enable editor gpu acceleration
2. Zoom in a lot, eventually things will all break as glyphs won't fit into the atlas pages
Related: https://github.com/microsoft/vscode/issues/232979 | bug,editor-gpu | low | Minor |
2,633,077,032 | PowerToys | Enhancement for "Crop and Lock" - Pixel Perfection | ### Description of the new feature / enhancement
In addition to showing a crosshair over the hovered area, create a circular/square preview of where the cursor is, visually magnified without any form of smoothing, sharpening, or anti-aliasing, to allow for more precise selection of an area, and accept arrow-key input to move the cursor more precisely.
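A sketch of the magnification I mean: plain nearest-neighbor scaling, where each source pixel becomes an unsmoothed factor x factor block (purely illustrative, not PowerToys code):

```python
def magnify(pixels, factor):
    """Nearest-neighbor upscale: every source pixel becomes a
    factor x factor block, with no smoothing or anti-aliasing,
    so individual pixels stay crisp in the zoomed preview."""
    out = []
    for row in pixels:
        # repeat each pixel `factor` times horizontally...
        scaled = [p for p in row for _ in range(factor)]
        # ...then repeat the whole row `factor` times vertically
        for _ in range(factor):
            out.append(list(scaled))
    return out
```

Rendering the window around the cursor through something like this would make individual pixels easy to distinguish and click precisely.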
### Scenario when this would be used?
1. In situations where a game or app has a side bar menu and someone is using PowerToys to display the app in a demo-type situation where navigating the sidebar could lead to people leaving the intended environment.
2. In situations within a chat application where you can open multiple windows of the app but not hide the sidebar, and you want to have dedicated windows for individual chats without the option to accidentally switch channels.
### Supporting information
ShareX has a similar option for capturing more precise screenshots.
[Image of the process in action.](https://github.com/user-attachments/assets/ca0ac1f9-dbf6-45a6-a2b3-84491037af65)
| Needs-Triage | low | Minor |
2,633,080,077 | godot | [XR][GODOT4.3] godot4.3 openxr doesn't run properly on steamvr, it doesn't render, but the camera can be controlled! | ### Tested versions
v4.3.stable.mono.official [77dcf97d8]
### System information
Windows 11 Godot v4.3.stable.mono.official [77dcf97d8] Vulkan Forward+, PICO VR NEO3, INTEL ARC A770 GRPHICS CARD
### Issue description
c# code
```
using Godot;
using System;
public partial class Main : Node3D
{
private XRInterface _xrInterface;
public void OnStart(){
GD.Print("SESSION START");
}
public override void _Ready()
{
base._Ready();
_xrInterface = (OpenXRInterface)XRServer.FindInterface("OpenXR");
if(_xrInterface != null && _xrInterface.IsInitialized())
{
_xrInterface.Connect("session_begun",new Callable(this,"OnStart"));
GD.Print("OpenXR initialized successfully");
// Turn off v-sync!
DisplayServer.WindowSetVsyncMode(DisplayServer.VSyncMode.Disabled);
var vp = GetViewport();
// Change our main viewport to output to the HMD
vp.UseXR = true;
var camera = GetNode<XRCamera3D>("World/XRPlayer/XRCamera3D");
if (RenderingServer.GetRenderingDevice() != null){
GD.Print("render "+vp.UseXR.ToString());
// vp.VrsMode = Viewport.VrsModeEnum.XR;
}
else if ((int)ProjectSettings.GetSetting("xr/openxr/foveation_level") == 0)
GD.PushWarning("OpenXR: Recommend setting Foveation level to High in Project Settings");
}
else
{
GD.Print("OpenXR not initialized, please check if your headset is connected");
}
}
}
```
After creating a basic OpenXR environment in Godot 4 and opening it in SteamVR, it stays on the waiting screen. Then I find an error in SteamVR; it goes like this:
-----------------
Failed to create sync texture. Ensure application was built using DXGI 1.1 or later (i.e. Call CreateDXGIFactory1).!!!!
ComposeLayerProjection: failed to submit view 0: VRCompositorError_SharedTexturesNotSupported
-----------------
What should I do to fix this?
### Steps to reproduce
The code is from the official docs:
https://docs.godotengine.org/en/stable/tutorials/xr/setting_up_xr.html
### Minimal reproduction project (MRP)
[xrtest.zip](https://github.com/user-attachments/files/17619939/xrtest.zip)
output:
```
Godot Engine v4.3.stable.mono.official.77dcf97d8 - https://godotengine.org
OpenXR: Running on OpenXR runtime: SteamVR/OpenXR 2.8.6
OpenXR: XrGraphicsRequirementsVulkan2KHR:
- minApiVersionSupported: 1.0.0
- maxApiVersionSupported: 1.2.0
Vulkan 1.3.289 - Forward+ - Using Device #0: Intel - Intel(R) Arc(TM) A770 Graphics
OpenXR initialized successfully
```
| bug,needs testing,topic:xr | low | Critical |
2,633,087,824 | rust | Fix argument splitting in compiletest | We should fix how arguments are split in compiletest; this involves multiple places:
- The custom diff tool
- `//@ compile-flags` and such
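For the splitting semantics themselves, shell-style tokenization is a solved problem; a quick illustration using Python's `shlex` (compiletest itself is Rust, where e.g. the `shlex` crate plays the same role; the crate choice is just a suggestion, not a decision made here):

```python
import shlex

flags = 'link-args="-L /path/with spaces" --cfg feature="foo bar"'

naive = flags.split()        # what compiletest effectively does today
proper = shlex.split(flags)  # shell-style: quotes group words together

# naive tears the quoted arguments apart:
#   ['link-args="-L', '/path/with', 'spaces"', '--cfg', 'feature="foo', 'bar"']
# proper keeps them intact:
#   ['link-args=-L /path/with spaces', '--cfg', 'feature=foo bar']
```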
These usually involve converting a user-provided escaped string into `Command::new()` / `Command::arg()` calls. Argument splitting in compiletest is currently very naive, usually just splitting on whitespace, so we should probably use a proper library to handle the splitting. | T-bootstrap,E-medium,C-bug,A-compiletest | low | Minor |
2,633,179,098 | pytorch | Illegal Memory Access With `torch.compile` | ### 🐛 Describe the bug
We encountered an illegal memory access issue with `torch.compile` and a customized torch library operator.
Here's one minimal example to reproduce:
```python
import torch
@torch.library.custom_op("bug_report::foo", mutates_args=("a", "b"))
def foo(a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
a.add_(1.)
b.add_(1.)
return a + b
@foo.register_fake
def _(a, b):
return a.new_empty(a.shape)
@torch.compile
def bar(a, b):
return foo(a, b)
if __name__ == "__main__":
buf = torch.ones(2, 128 * 1024 * 1024, device="cuda")
a = buf[1]
b = buf[0]
output = bar(a, b)
print(buf)
print(output)
print("finish")
```
We tested with docker image `pytorch/pytorch:2.5.1-cuda12.4-cudnn9-devel` on H100, the error message:
```console
$ python bug.py
Traceback (most recent call last):
File "/workspace/bug.py", line 21, in <module>
output = bar(a, b)
^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/torch/_dynamo/eval_frame.py", line 465, in _fn
return fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "/workspace/bug.py", line 13, in bar
@torch.compile
File "/opt/conda/lib/python3.11/site-packages/torch/_dynamo/eval_frame.py", line 632, in _fn
return fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/torch/_functorch/aot_autograd.py", line 1100, in forward
return compiled_fn(full_args)
^^^^^^^^^^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/torch/_functorch/_aot_autograd/runtime_wrappers.py", line 321, in runtime_wrapper
all_outs = call_func_at_runtime_with_args(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/torch/_functorch/_aot_autograd/utils.py", line 124, in call_func_at_runtime_with_args
out = normalize_as_list(f(args))
^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/torch/_functorch/_aot_autograd/runtime_wrappers.py", line 667, in inner_fn
outs = compiled_fn(args)
^^^^^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/torch/_functorch/_aot_autograd/runtime_wrappers.py", line 488, in wrapper
return compiled_fn(runtime_args)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/torch/_inductor/codecache.py", line 1478, in __call__
return self.current_callable(inputs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/torch/_inductor/utils.py", line 1977, in run
return model(new_inputs)
^^^^^^^^^^^^^^^^^
File "/tmp/torchinductor_wkong/rw/crwnqh2wuf7heopkntphn7irtlcc6m7jm2fpcyp3w5sf5diwtvhg.py", line 112, in call
triton_poi_fused_0.run(arg1_1, buf0, 268435456, grid=grid(268435456), stream=stream0)
File "/opt/conda/lib/python3.11/site-packages/torch/_inductor/runtime/triton_heuristics.py", line 836, in run
self.autotune_to_one_config(*args, grid=grid, **kwargs)
File "/opt/conda/lib/python3.11/site-packages/torch/_inductor/runtime/triton_heuristics.py", line 729, in autotune_to_one_config
timings = self.benchmark_all_configs(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/torch/_inductor/runtime/triton_heuristics.py", line 704, in benchmark_all_configs
timings = {
^
File "/opt/conda/lib/python3.11/site-packages/torch/_inductor/runtime/triton_heuristics.py", line 705, in <dictcomp>
launcher: self.bench(launcher, *args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/torch/_inductor/runtime/triton_heuristics.py", line 675, in bench
return benchmarker.benchmark_gpu(kernel_call, rep=40, fast_flush=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/torch/_inductor/runtime/benchmarking.py", line 66, in wrapper
return fn(self, *args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/torch/_inductor/runtime/benchmarking.py", line 201, in benchmark_gpu
return self.triton_do_bench(_callable, **kwargs, return_mode="median")
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/triton/testing.py", line 107, in do_bench
di.synchronize()
File "/opt/conda/lib/python3.11/site-packages/torch/cuda/__init__.py", line 954, in synchronize
return torch._C._cuda_synchronize()
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
RuntimeError: CUDA error: an illegal memory access was encountered
CUDA kernel errors might be asynchronously reported at some other API call, so the stacktrace below might be incorrect.
For debugging consider passing CUDA_LAUNCH_BLOCKING=1
Compile with `TORCH_USE_CUDA_DSA` to enable device-side assertions.
```
### Versions
```console
$ python collect_env.py
Collecting environment information...
PyTorch version: 2.5.1+cu124
Is debug build: False
CUDA used to build PyTorch: 12.4
ROCM used to build PyTorch: N/A
OS: Ubuntu 22.04.4 LTS (x86_64)
GCC version: (Ubuntu 11.4.0-1ubuntu1~22.04) 11.4.0
Clang version: Could not collect
CMake version: version 3.30.5
Libc version: glibc-2.35
Python version: 3.11.10 | packaged by conda-forge | (main, Oct 16 2024, 01:27:36) [GCC 13.3.0] (64-bit runtime)
Python platform: Linux-5.15.0-88-generic-x86_64-with-glibc2.35
Is CUDA available: True
CUDA runtime version: 12.4.131
CUDA_MODULE_LOADING set to: LAZY
GPU models and configuration: GPU 0: NVIDIA H100 80GB HBM3
Nvidia driver version: 565.57.01
cuDNN version: Could not collect
HIP runtime version: N/A
MIOpen runtime version: N/A
Is XNNPACK available: True
CPU:
Architecture: x86_64
CPU op-mode(s): 32-bit, 64-bit
Address sizes: 46 bits physical, 57 bits virtual
Byte Order: Little Endian
CPU(s): 32
On-line CPU(s) list: 0-31
Vendor ID: GenuineIntel
Model name: Intel(R) Xeon(R) Silver 4314 CPU @ 2.40GHz
CPU family: 6
Model: 106
Thread(s) per core: 2
Core(s) per socket: 16
Socket(s): 1
Stepping: 6
CPU max MHz: 3400.0000
CPU min MHz: 800.0000
BogoMIPS: 4800.00
Flags: fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss ht tm pbe syscall nx pdpe1gb rdtscp lm constant_tsc art arch_perfmon pebs bts rep_good nopl
xtopology nonstop_tsc cpuid aperfmperf pni pclmulqdq dtes64 monitor ds_cpl vmx smx est tm2 ssse3 sdbg fma cx16 xtpr pdcm pcid dca sse4_1 sse4_2 x2apic movbe popcnt tsc_deadline_timer aes xsave avx f16c rdrand lahf_lm abm 3dnowprefetch c
puid_fault epb cat_l3 invpcid_single intel_ppin ssbd mba ibrs ibpb stibp ibrs_enhanced tpr_shadow vnmi flexpriority ept vpid ept_ad fsgsbase tsc_adjust bmi1 avx2 smep bmi2 erms invpcid cqm rdt_a avx512f avx512dq rdseed adx smap avx512ifm
a clflushopt clwb intel_pt avx512cd sha_ni avx512bw avx512vl xsaveopt xsavec xgetbv1 xsaves cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local split_lock_detect wbnoinvd dtherm ida arat pln pts hwp hwp_act_window hwp_epp hwp_pkg_req avx51
2vbmi umip pku ospke avx512_vbmi2 gfni vaes vpclmulqdq avx512_vnni avx512_bitalg tme avx512_vpopcntdq la57 rdpid fsrm md_clear pconfig flush_l1d arch_capabilities
Virtualization: VT-x
L1d cache: 768 KiB (16 instances)
L1i cache: 512 KiB (16 instances)
L2 cache: 20 MiB (16 instances)
L3 cache: 24 MiB (1 instance)
NUMA node(s): 2
NUMA node0 CPU(s): 0-7,16-23
NUMA node1 CPU(s): 8-15,24-31
Vulnerability Gather data sampling: Mitigation; Microcode
Vulnerability Itlb multihit: Not affected
Vulnerability L1tf: Not affected
Vulnerability Mds: Not affected
Vulnerability Meltdown: Not affected
Vulnerability Mmio stale data: Mitigation; Clear CPU buffers; SMT vulnerable
Vulnerability Retbleed: Not affected
Vulnerability Spec rstack overflow: Not affected
Vulnerability Spec store bypass: Mitigation; Speculative Store Bypass disabled via prctl
Vulnerability Spectre v1: Mitigation; usercopy/swapgs barriers and __user pointer sanitization
Vulnerability Spectre v2: Vulnerable, IBPB: disabled, STIBP: disabled, PBRSB-eIBRS: Vulnerable
Vulnerability Srbds: Not affected
Vulnerability Tsx async abort: Not affected
Versions of relevant libraries:
[pip3] numpy==2.1.2
[pip3] nvidia-cublas-cu12==12.4.5.8
[pip3] nvidia-cuda-cupti-cu12==12.4.127
[pip3] nvidia-cuda-nvrtc-cu12==12.4.127
[pip3] nvidia-cuda-runtime-cu12==12.4.127
[pip3] nvidia-cudnn-cu12==9.1.0.70
[pip3] nvidia-cufft-cu12==11.2.1.3
[pip3] nvidia-curand-cu12==10.3.5.147
[pip3] nvidia-cusolver-cu12==11.6.1.9
[pip3] nvidia-cusparse-cu12==12.3.1.170
[pip3] nvidia-nccl-cu12==2.21.5
[pip3] nvidia-nvjitlink-cu12==12.4.127
[pip3] nvidia-nvtx-cu12==12.4.127
[pip3] optree==0.13.0
[pip3] torch==2.5.1+cu124
[pip3] torchaudio==2.5.1+cu124
[pip3] torchelastic==0.2.2
[pip3] torchvision==0.20.1+cu124
[pip3] triton==3.1.0
[conda] numpy 2.1.2 py311h71ddf71_0 conda-forge
[conda] nvidia-cublas-cu12 12.4.5.8 pypi_0 pypi
[conda] nvidia-cuda-cupti-cu12 12.4.127 pypi_0 pypi
[conda] nvidia-cuda-nvrtc-cu12 12.4.127 pypi_0 pypi
[conda] nvidia-cuda-runtime-cu12 12.4.127 pypi_0 pypi
[conda] nvidia-cudnn-cu12 9.1.0.70 pypi_0 pypi
[conda] nvidia-cufft-cu12 11.2.1.3 pypi_0 pypi
[conda] nvidia-curand-cu12 10.3.5.147 pypi_0 pypi
[conda] nvidia-cusolver-cu12 11.6.1.9 pypi_0 pypi
[conda] nvidia-cusparse-cu12 12.3.1.170 pypi_0 pypi
[conda] nvidia-nccl-cu12 2.21.5 pypi_0 pypi
[conda] nvidia-nvjitlink-cu12 12.4.127 pypi_0 pypi
[conda] nvidia-nvtx-cu12 12.4.127 pypi_0 pypi
[conda] optree 0.13.0 pypi_0 pypi
[conda] torch 2.5.1+cu124 pypi_0 pypi
[conda] torchaudio 2.5.1+cu124 pypi_0 pypi
[conda] torchelastic 0.2.2 pypi_0 pypi
[conda] torchvision 0.20.1+cu124 pypi_0 pypi
[conda] triton 3.1.0 pypi_0 pypi
```
cc @ezyang @gchanan @zou3519 @kadeng @msaroufim @anjali411 @chauhang @penguinwu @voznesenskym @EikanWang @jgong5 @Guobing-Chen @XiaobingSuper @zhuhaozhe @blzheng @wenzhe-nrv @jiayisunx @ipiszy @yf225 @chenyang78 @muchulee8 @ColinPeppler @amjames @desertfire @aakhundov @BoyuanFeng @bdhirsh | high priority,triaged,module: custom-operators,module: library,oncall: pt2,module: inductor | medium | Critical |
2,633,192,126 | vscode | Editor GPU: Lines that get decorations after initial render should be hidden in the canvas | Repro:
1. Enable editor gpu acceleration
2. Open a file with links in comments
3. Reload window, 🐛 notice that link lines are shown and then get bolder

| bug,editor-gpu | low | Minor |
2,633,207,100 | kubernetes | container_memory_working_set_bytes shows previous container memory | ### What happened?
As raised at https://github.com/prometheus-operator/kube-prometheus/issues/2522, the `container_memory_working_set_bytes` metric shows memory from a killed container instance that is no longer running, which prevents using it to determine the "real" memory usage.
In my case, the last memory reading from the previous container instance persists for 4:30 minutes; @vladmalynych reports it is only around 3 minutes in his case.
### What did you expect to happen?
To only show data from the **current** running container.
### How can we reproduce it (as minimally and precisely as possible)?
Trigger a container restart (OOMKilled or Eviction) and check the resulting metric.
### Anything else we need to know?
_No response_
### Kubernetes version
<details>
```console
Client Version: v1.31.2
Kustomize Version: v5.4.2
Server Version: v1.31.1-gke.1678000 (Also happening in 1.29.8-gke.1211000)
```
</details>
### Cloud provider
<details>
GKE
</details>
### OS version
<details>
```console
# On Linux:
$ cat /etc/os-release
PRETTY_NAME="Ubuntu 22.04.5 LTS"
NAME="Ubuntu"
VERSION_ID="22.04"
VERSION="22.04.5 LTS (Jammy Jellyfish)"
VERSION_CODENAME=jammy
ID=ubuntu
ID_LIKE=debian
HOME_URL="https://www.ubuntu.com/"
SUPPORT_URL="https://help.ubuntu.com/"
BUG_REPORT_URL="https://bugs.launchpad.net/ubuntu/"
PRIVACY_POLICY_URL="https://www.ubuntu.com/legal/terms-and-policies/privacy-policy"
UBUNTU_CODENAME=jammy
$ uname -a
Linux gke-*** 5.15.0-1067-gke #73-Ubuntu SMP Sat Aug 31 04:29:32 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux
```
</details>
### Install tools
<details>
</details>
### Container runtime (CRI) and version (if applicable)
<details>
</details>
### Related plugins (CNI, CSI, ...) and versions (if applicable)
<details>
</details>
| kind/bug,sig/instrumentation,needs-triage | low | Critical |
2,633,212,526 | excalidraw | Whitelist wishlist | Hi there,
Thanks for everything you're doing to maintain this tool; I use it every day and I'm grateful for its simplicity and speed.
Could I please have the following URLs whitelisted so I can use them in context within my lessons:
- https://docs.getmoto.org
- https://eu-west-2.console.aws.amazon.com
- https://docs.aws.amazon.com/
- https://docs.python.org
- https://realpython.com/
Many thanks in advance.
| whitelist | low | Minor |
2,633,212,731 | PowerToys | Can't add more than one additional PC to mouse without borders | ### Microsoft PowerToys version
0.85.1
### Installation method
Microsoft Store
### Running as admin
Yes
### Area(s) with issue?
Mouse Without Borders
### Steps to reproduce
I've tried in standard and administrator mode. I have no issues adding a second PC to the host PC, but as soon as I try to add a third PC to the host PC, it replaces the second PC. The mouse without borders works awesome with two PCs, but I need 4. I can't even get 3.
### ✔️ Expected Behavior
There is a layout that matches my monitor, with one PC in each quadrant, but no matter how I arrange the devices I still can't add a third PC. I want to be able to move my mouse cursor from one machine to the next in a 4-quadrant grid that matches my monitor configuration.
### ❌ Actual Behavior
When I add a third PC to the host (after successfully adding a second), the new PC overwrites the previously added one. The configuration only ever holds the host plus one additional PC — never the host plus three, even though the four-quadrant layout clearly intends up to four machines. I expect to be able to add the other three PCs via their security keys and have my mouse and keyboard automatically follow the pointer to whichever screen it moves to (as the old Synergy app did), but in practice any newly added PC replaces the last one, so I can never get beyond two machines.
### Other Software
_No response_ | Issue-Bug,Needs-Triage | low | Minor |
2,633,226,104 | langchain | [Bug][langchain_mistralai] KeyError: 'choices' in streaming response when using Vertex AI Model Garden Mistral integration | ### Checked other resources
- [X] I added a very descriptive title to this issue.
- [X] I searched the LangChain documentation with the integrated search.
- [X] I used the GitHub search to find a similar question and didn't find it.
- [X] I am sure that this is a bug in LangChain rather than my code.
- [X] The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).
### Example Code
The following code:
```python
from langchain_google_vertexai.model_garden_maas.mistral import VertexModelGardenMistral
llm = VertexModelGardenMistral(model_name="mistral-large@2407", project=gcp_project_name, location="europe-west4", streaming=True, safe_mode=True)
llm.invoke("hello")
```
### Error Message and Stack Trace (if applicable)
## Printed chunks from the _stream() and _astream() functions in langchain_mistralai/chat_models.py:
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'role': 'assistant', 'content': ''}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': 'Hello'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': '!'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': ' How'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': ' can'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': ' I'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': ' assist'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': ' you'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': ' today'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': '?'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': ' Let'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': ' me'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': ' know'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': ' if'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': ' you'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': ' have'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': ' any'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': ' questions'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': ' or'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': ' topics'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': ' you'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': "'"}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': 'd'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': ' like'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': ' to'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': ' discuss'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': '.'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': ' '}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': '😊'}, 'finish_reason': None, 'logprobs': None}]}
{'id': 'cd77b9748c4e4009ab29dcf89d20c38f', 'object': 'chat.completion.chunk', 'created': 1730735114, 'model': 'mistral-large', 'choices': [{'index': 0, 'delta': {'content': ''}, 'finish_reason': 'stop', 'logprobs': None}], 'usage': {'prompt_tokens': 5, 'total_tokens': 33, 'completion_tokens': 28}}
**{'usage': {'output_tokens': 0}}**
## Error message
File ".../assistant-chatbot.py", line 110, in <module>
llm.invoke("hello")
File ".../langchain_core/language_models/chat_models.py", line 286, in invoke
self.generate_prompt(
File ".../langchain_core/language_models/chat_models.py", line 786, in generate_prompt
return self.generate(prompt_messages, stop=stop, callbacks=callbacks, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File ".../langchain_core/language_models/chat_models.py", line 643, in generate
raise e
File ".../langchain_core/language_models/chat_models.py", line 633, in generate
self._generate_with_cache(
File ".../langchain_core/language_models/chat_models.py", line 851, in _generate_with_cache
result = self._generate(
^^^^^^^^^^^^^^^
File ".../langchain_mistralai/chat_models.py", line 533, in _generate
return generate_from_stream(stream_iter)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File ".../langchain_core/language_models/chat_models.py", line 90, in generate_from_stream
generation += list(stream)
^^^^^^^^^^^^
File ".../langchain_mistralai/chat_models.py", line 591, in _stream
if len(chunk["choices"]) == 0:
~~~~~^^^^^^^^^^^
KeyError: 'choices'
### Description
**Description**
I encountered a bug while using the Mistral AI model through Google Cloud's Vertex AI Model Garden integration with LangChain. The issue occurs in the streaming implementation when handling chunks from the model's response.
**Current Behavior**
When streaming responses from the Mistral model through Vertex AI, the code fails with a KeyError when trying to access the "choices" key. Looking at the actual stream of chunks, I can see that most chunks have the following structure:
```python
{
'id': 'cd77b9748c4e4009ab29dcf89d20c38f',
'object': 'chat.completion.chunk',
'created': 1730735114,
'model': 'mistral-large',
'choices': [{'index': 0, 'delta': {'content': 'some_content'}, 'finish_reason': None, 'logprobs': None}]
}
```
However, the last chunk in the stream has a different structure:
```python
{'usage': {'output_tokens': 0}}
```
This last chunk is causing the KeyError because it doesn't contain the 'choices' key that the code expects. The current implementation in **_stream** and **_astream** methods:
```python
if len(chunk["choices"]) == 0:
continue
```
assumes this key always exists, which causes the code to crash when processing the final usage statistics chunk.
**Proposed Solution**
The issue can be fixed by safely accessing the "choices" key using the `get()` method with a default empty list:
```python
if len(chunk.get("choices", [])) == 0:
continue
```
This change would:
1. Safely handle the final usage statistics chunk that doesn't have a "choices" key
2. Maintain the same logic for the content chunks that do have choices
3. Make the streaming implementation more resilient to variations in chunk structure
4. Prevent KeyError exceptions from breaking the stream
**Impact**
This issue affects users who are using the Mistral model through Vertex AI's Model Garden with streaming enabled. The bug prevents successful completion of streaming responses, even though the model itself is working correctly and generating proper responses.
**Additional Context**
The full stream of chunks shows that the model is working correctly and generating proper responses - the issue is purely in the handling of the final usage statistics chunk in the LangChain integration code.
### System Info
System Information
------------------
> OS: Linux
> OS Version: #1 SMP Debian 5.10.226-1 (2024-10-03)
> Python Version: 3.11.9 (main, Jul 29 2024, 12:51:52) [GCC 10.2.1 20210110]
Package Information
-------------------
> langchain_core: 0.3.15
> langchain: 0.3.7
> langchain_community: 0.3.5
> langsmith: 0.1.128
> langchain_google_genai: 2.0.0
> langchain_google_vertexai: 2.0.7
> langchain_huggingface: 0.1.0
> langchain_mistralai: 0.2.1
> langchain_openai: 0.2.0
> langchain_text_splitters: 0.3.0
> langgraph: 0.2.44 | 🤖:bug | low | Critical |
2,633,245,277 | pytorch | Inductor should run fallbacks without Autograd-related dispatch keys | Inductor's charter is that it gets passed code that runs on Tensors, without needing to worry about the interaction with Tensor subsystems. This means that when it runs fallbacks (by invoking the operator through the PyTorch dispatcher), it shouldn't need to run the Autograd key, the ADInplaceOrView key, and possibly more (AOTDispatcher handles both of these).
Some of these might be relatively expensive (from an overhead perspective), especially due to https://github.com/pytorch/pytorch/issues/139521 and registrations from Python torch.library.custom_op.
cc @ezyang @chauhang @penguinwu @voznesenskym @EikanWang @jgong5 @Guobing-Chen @XiaobingSuper @zhuhaozhe @blzheng @wenzhe-nrv @jiayisunx @ipiszy @yf225 @chenyang78 @kadeng @muchulee8 @ColinPeppler @amjames @desertfire @aakhundov | triaged,oncall: pt2,module: inductor | low | Minor |
2,633,246,106 | TypeScript | 'Move to a new file' operation reformats code (typescript) | <!-- ⚠️⚠️ Do Not Delete This! bug_report_template ⚠️⚠️ -->
<!-- Please read our Rules of Conduct: https://opensource.microsoft.com/codeofconduct/ -->
<!-- 🕮 Read our guide about submitting issues: https://github.com/microsoft/vscode/wiki/Submitting-Bugs-and-Suggestions -->
<!-- 🔎 Search existing issues to avoid creating duplicates. -->
<!-- 🧪 Test using the latest Insiders build to see if your issue has already been fixed: https://code.visualstudio.com/insiders/ -->
<!-- 💡 Instead of creating your report here, use 'Report Issue' from the 'Help' menu in VS Code to pre-fill useful information. -->
<!-- 🔧 Launch with `code --disable-extensions` to check. -->
Does this issue occur when all extensions are disabled?: Yes
<!-- 🪓 If you answered No above, use 'Help: Start Extension Bisect' from Command Palette to try to identify the cause. -->
<!-- 📣 Issues caused by an extension need to be reported directly to the extension publisher. The 'Help > Report Issue' dialog can assist with this. -->
- VS Code Version: 1.95.0
When I refactor my TypeScript code using the 'Move to a new file' operation, it reformats the selected code instead of keeping it as it was.
For example, it
- removes empty lines between functions
- removes line breaks in function parameters | Needs More Info | low | Critical |
2,633,325,194 | go | x/tools/go/analysis/passes/lostcancel: add new context functions and make it configurable | ### Go version
go version go1.23.2 linux/amd64
### Output of `go env` in your module/workspace:
```shell
-
```
### What did you do?
run nogo
### What did you see happen?
Analyzer `lostcancel` does not check whether the `cancel` function returned by the newer `context` functions (`WithCancelCause`, `WithDeadlineCause`, `WithTimeoutCause`) is called. It is also not possible to configure the analyzer with additional import paths to check.
### What did you expect to see?
The `lostcancel` analyzer checks these new functions too. It also has a configuration flag to specify additional import paths for `context`-like packages. Another flag could specify extra functions to check, though this is trickier in general, since the cancel function may not be returned as the second value the way it is for the `context` functions. | Tools | low | Major |
2,633,359,401 | rust | Use `git diff --no-index` as the default diff in compiletest | Since not long ago, compiletest supports custom diffing tools in the config.
This is cool, but in practice probably not very used since it requires configuring it.
I think that compiletest should, when `git` is available, use `git diff --no-index` by default as the diff.
The default git diff is quite OK, and many people (me included) configure git to have an even prettier one. This way, people get a nice diff by default without configuring anything. | C-enhancement,T-bootstrap,A-compiletest,A-test-infra | low | Minor |
2,633,388,921 | kubernetes | Warnings on missing pull secrets can be confusing | ### What happened?
A previous enhancement has added a warning event when a pull secret is missing: https://github.com/kubernetes/kubernetes/pull/117927
If the image has pulled successfully because either another pull secret did the job, or the image doesn't need auth, the warning is distracting.
As an example, this pod is running perfectly fine and has no problems, and yet when described, has had 11k warning events emitted over the last 9 days.
```
Warning FailedToRetrieveImagePullSecret 114s (x11399 over 9d) kubelet Unable to retrieve some image pull secrets (sa-integration); attempting to pull the image may not succeed.
```
We have been including multiple pull secrets in our pulls because it's more flexible to point to a few that may exist than to have to very exactly map each pod to a pull secret. This change means we will get warnings even when everything is working exactly as we wanted it to.
### What did you expect to happen?
No warnings if an image is pulled successfully.
### How can we reproduce it (as minimally and precisely as possible)?
Create a pod or service account that references a missing pull secret that is not required to pull the image.
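For concreteness, a pod spec of the shape that triggers the warning might look like this (names and image are hypothetical; `sa-integration` is the missing secret from the event above, while the image pulls fine via the other secret or without auth):

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: pull-secret-demo
spec:
  imagePullSecrets:
    - name: sa-integration   # missing in this namespace -> warning event
    - name: registry-creds   # present; or the image may need no auth at all
  containers:
    - name: app
      image: registry.example.com/app:latest
```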
### Anything else we need to know?
My suggestion would be to only emit that warning if the image cannot be pulled (for example, reaches ImagePullBackOff), or to include the detail about the missing secret in the existing image pull failure events.
### Kubernetes version
<details>
```console
Client Version: v1.29.4
Kustomize Version: v5.0.4-0.20230601165947-6ce0bf390ce3
Server Version: v1.29.8+632b078
```
</details>
### Cloud provider
<details>
n/a
</details>
### OS version
<details>
```console
n/a
```
</details>
### Install tools
<details>
n/a
</details>
### Container runtime (CRI) and version (if applicable)
<details>
n/a
</details>
### Related plugins (CNI, CSI, ...) and versions (if applicable)
<details>
n/a
</details>
| kind/bug,priority/important-soon,sig/node,triage/accepted | low | Critical |
2,633,390,325 | flutter | Migrate 10 popular non-discontinued plugins that reference v1 android embedding. Round 2 | Help the flutter plugin ecosystem by putting up pull requests to the top 10 still not migrated plugins. If a plugin only has references in example app code skip it (and add another from the list). If it violates googles open source contribution license rules skip it (and add another from the list).
Workflow defined in [go/flutter-android-v1-embedding-deprecation](http://goto.google.com/flutter-android-v1-embedding-deprecation)
Each of the following plugins uses a license that allows our contribution, and references v1 Android embedding class(es) in non-example code.
- https://github.com/oddbit/flutter_facebook_app_events/issues/373
- https://github.com/oddbit/flutter_facebook_app_events/pull/388
- https://github.com/aakashkondhalkar/external_path/issues/11
- https://github.com/aakashkondhalkar/external_path/pull/13
- https://github.com/pinkfish/flutter_native_timezone/issues/67
- https://github.com/pinkfish/flutter_native_timezone/pull/71
- https://github.com/inway/flutter_ringtone_player/issues/81
- https://github.com/inway/flutter_ringtone_player/pull/86
- https://github.com/adaptant-labs/flutter_windowmanager/issues/38
- https://github.com/BestBurning/platform_device_id/issues/49
- https://github.com/flutter-webrtc/flutter-webrtc/issues/1648
- https://github.com/mono0926/barcode_scan2/issues/90
- https://github.com/joutvhu/open_file_plus/issues/19
- https://github.com/GeekyAnts/external_app_launcher/issues/38
Continuation of https://github.com/flutter/flutter/issues/157807 | platform-android,P2,team-android,triaged-android | low | Minor |
2,633,395,496 | vscode | Add right click and drag to scroll horizontally or vertically | <!-- ⚠️⚠️ Do Not Delete This! feature_request_template ⚠️⚠️ -->
<!-- Please read our Rules of Conduct: https://opensource.microsoft.com/codeofconduct/ -->
<!-- Please search existing issues to avoid creating duplicates. -->
<!-- Describe the feature you'd like. -->
I'd like to be able to right click and drag, either vertically or horizontally in order to scroll vertically or horizontally in a file I'm editing. Similar to how you can right click and drag to navigate in Figma.
| feature-request,editor-scrollbar | low | Minor |
2,633,410,366 | rust | `rust-lld` is using wrong zlib at runtime | After building rust 1.81.0 from sources, it configures with
```
-- Found ZLIB: /home/harmen/spack/opt/spack/linux-ubuntu23.10-zen2/clang-16.0.6/zlib-ng-2.1.6-xidpmzycb2jufhr6doa7anltbcyhu263/lib/libz.so (found version "1.3.0")
```
builds, and installs
```
<prefix>/lib/rustlib/x86_64-unknown-linux-gnu/bin/rust-lld
```
which lists
```
libz.so.1
```
among the DT_NEEDED libraries, but due to a lacking RPATH, at runtime the wrong `libz.so.1` is used:
```
$ ldd lib/rustlib/x86_64-unknown-linux-gnu/bin/rust-lld | grep libz
libz.so.1 => /lib/x86_64-linux-gnu/libz.so.1 (0x000072120213b000)
```
This should have been the original zlib that CMake linked everything against.
Normally I would avoid this issue by passing `-DCMAKE_INSTALL_RPATH_USE_LINK_PATH:BOOL=ON`, but I don't see how I can pass it, since Rust's bootstrap script invokes CMake with a fixed set of arguments.
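For reference, this is the equivalent setting I would normally put in a CMake cache or toolchain file (shown only to illustrate the intent; bootstrap currently offers no hook to inject it):

```cmake
# Embed the directories of link-time libraries into the installed binaries'
# RPATH, so rust-lld resolves the same libz.so.1 it was linked against.
set(CMAKE_INSTALL_RPATH_USE_LINK_PATH ON)
```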
---
I noticed a related issue with binaries from https://static.rust-lang.org/dist/rust-1.81.0-x86_64-unknown-linux-gnu.tar.gz: both `libLLVM.so.*` and `rust-lld` depend on system `libz.so`, which makes the bootstrap binaries less self-contained than necessary. | A-linkage,T-bootstrap,C-bug | low | Minor |
2,633,473,882 | tensorflow | Unable to install TensorFlow: No matching distribution found for TensorFlow! | When trying to install TensorFlow via pip, I encounter an error stating that no matching distribution can be found. The command I used and the error message are as follows:
```
C:\Users\Enes> pip install tensorflow
ERROR: Could not find a version that satisfies the requirement tensorflow (from versions: none)
ERROR: No matching distribution found for tensorflow
```
Operating System: Windows 10
Python Version: (Python 3.13.0)
I am currently using Python 3.13. Could this be related to compatibility with this specific Python version?
Could you provide guidance on how to resolve this issue, or suggest any compatible alternatives? | stat:awaiting tensorflower,type:feature,type:build/install,subtype:windows | low | Critical |
2,633,500,522 | pytorch | [triton 3.2] test_convolution2 | See https://github.com/pytorch/pytorch/pull/139206
python inductor/test_select_algorithm.py -k test_convolution2
Master issue tracker: https://github.com/pytorch/pytorch/issues/139175
cc @ezyang @chauhang @penguinwu @voznesenskym @EikanWang @jgong5 @Guobing-Chen @XiaobingSuper @zhuhaozhe @blzheng @wenzhe-nrv @jiayisunx @ipiszy @yf225 @chenyang78 @kadeng @muchulee8 @ColinPeppler @amjames @desertfire @aakhundov @int3 @davidberard98 @nmacchioni @embg @peterbell10 | triaged,oncall: pt2,module: inductor,upstream triton | low | Minor |
2,633,501,697 | pytorch | [triton 3.2] test_custom_scan_op_dynamic_shapes_cuda | See https://github.com/pytorch/pytorch/pull/139206
python inductor/test_torchinductor_dynamic_shapes.py -k test_custom_scan_op_dynamic_shapes_cuda
Master issue tracker: https://github.com/pytorch/pytorch/issues/139175
cc @ezyang @chauhang @penguinwu @voznesenskym @EikanWang @jgong5 @Guobing-Chen @XiaobingSuper @zhuhaozhe @blzheng @wenzhe-nrv @jiayisunx @ipiszy @yf225 @chenyang78 @kadeng @muchulee8 @ColinPeppler @amjames @desertfire @aakhundov | triaged,oncall: pt2,module: inductor | low | Minor |
2,633,525,041 | vscode | Find and Replace does not work properly on notebooks | <!-- ⚠️⚠️ Do Not Delete This! bug_report_template ⚠️⚠️ -->
<!-- Please read our Rules of Conduct: https://opensource.microsoft.com/codeofconduct/ -->
<!-- 🕮 Read our guide about submitting issues: https://github.com/microsoft/vscode/wiki/Submitting-Bugs-and-Suggestions -->
<!-- 🔎 Search existing issues to avoid creating duplicates. -->
<!-- 🧪 Test using the latest Insiders build to see if your issue has already been fixed: https://code.visualstudio.com/insiders/ -->
<!-- 💡 Instead of creating your report here, use 'Report Issue' from the 'Help' menu in VS Code to pre-fill useful information. -->
<!-- 🔧 Launch with `code --disable-extensions` to check. -->
Does this issue occur when all extensions are disabled?: Yes/No
<!-- 🪓 If you answered No above, use 'Help: Start Extension Bisect' from Command Palette to try to identify the cause. -->
<!-- 📣 Issues caused by an extension need to be reported directly to the extension publisher. The 'Help > Report Issue' dialog can assist with this. -->
- VS Code Version:
- OS Version:
Steps to Reproduce:
1. I copy some mathematical notes containing LaTeX from ChatGPT. I change "\\[" and "\\]" to "$$" and "\\(" and "\\)" to "$" so the math is properly formatted in Markdown. I do this with Find and Replace, replacing all occurrences at once. After the latest update this no longer works: it corrupts the notebook.
| bug,notebook-find | low | Critical |
2,633,576,515 | pytorch | [Tracker] complex128 tests are disabled for all foreach internally | ### 🐛 Describe the bug
This issue tracks the fact that we disable all foreach tests for complex128 as a remediation for flaky tests internally. This should be fixed once we have more information about the flakiness.
### Versions
main, after this PR: https://github.com/pytorch/pytorch/pull/139649
cc @ptrblck @msaroufim @mruberry @ZainRizvi @crcrpar @mcarilli | module: cuda,module: tests,triaged,module: mta | low | Critical |
2,633,608,497 | go | x/build/devapp: automate deployment of devapps/owners, internal/gophers changes | The devapp/owners and internal/gophers packages host data about the project's teams and maintainers.
Currently, manual redeployment of devapp (by the person on the interrupt rotation) is necessary to reflect changes to these data.
Some options to consider:
* automate the deployment: deploy on push (like x/website)
* keep the data outside the main devapp (like x/wiki (gerrit), x/telemetry/config (module), anything that reads from go.dev/dl (datastore), ...)
| Builders,NeedsInvestigation | low | Minor |
2,633,636,098 | opencv | Single picture camera distortion estimation | ### Describe the feature and motivation
Hello,
In practice, camera distortion in the VFX industry is estimated from a single image that covers the entire view.
This is, however, not possible with OpenCV.
I wanted to have a go at implementing this paper, but I have lost the ability to read more advanced maths in non-coded form.
I was wondering if someone could implement this paper in OpenCV:
https://arxiv.org/pdf/2403.01263
### Additional context
https://arxiv.org/pdf/2403.01263
### Additional reasoning for single-picture calibration in the context of machine vision
There is a small change in lens distortion during focusing, and zooming obviously changes the lens distortion as well.
If calibration can be done from a single image, it gives better results and also lets you build a map of how zoom/focus affect camera distortion, which could allow the use of cheaper cameras that can zoom in over more expensive cameras that can't zoom in but have higher clarity and resolution.
### Additional rambling
This doesn't have to be run every frame.
You can create an ST map once and then use that to distort/redistort.
| feature,category: calib3d,GSoC | low | Major |
2,633,682,045 | deno | Syntax aware coverage ignore directives | Relates to #16626
In addition to "// deno-coverage-ignore-[file|next|start|stop]" it would be useful to implement "syntax aware" directives.
In the first instance, I would propose the following two directives be added, which I believe would be commonly used.
1. "// deno-coverage-ignore-branch" for ignoring specific branches, primarily of an if or switch statement. A motivating example would be to ignore unreachable branches which are used for static `never` type assertions.
2. "// deno-coverage-ignore-construct" (open to naming suggestions) which would ignore the next syntactic construct, such as an entire function declaration, loop, if/else if/else statement, etc. A motivating example for this would be to ignore the internals of a debug logging utility function.
Both of these can be replaced with "// deno-coverage-ignore-[start|stop]" but would be easier to code review and more convenient as "syntax aware" directives. | suggestion,testing | low | Critical |
2,633,693,403 | next.js | Generated files on disk have different file hash than Webpack-reported assets | ### Link to the code that reproduces this issue
https://github.com/mauron85/nextjs-webpack-assets
### To Reproduce
1. create plain nextjs app or pull example from https://github.com/mauron85/nextjs-webpack-assets
2. add simple asset log plugin
```js
// next.config.js
module.exports = {
webpack: (config, { webpack }) => {
config.plugins.push({
apply: (compiler) => {
compiler.hooks.compilation.tap("LogFinalFilenamesPlugin", (compilation) => {
compilation.hooks.processAssets.tap(
{
name: "LogFinalFilenamesPlugin",
stage: webpack.Compilation.PROCESS_ASSETS_STAGE_SUMMARIZE,
},
() => {
const assetNames = Object.keys(compilation.assets);
console.log("Final asset filenames:", assetNames);
}
);
});
},
});
return config;
},
};
```
3. run npm run build
4. compare file names in .next\static\chunks\pages\[sessionId] vs what plugin reported
### Current vs. Expected behavior
When building a Next.js application, the assets listed by Webpack (such as those logged by plugins like FileListPlugin or available in Webpack stats) have different hashes in their filenames compared to the files actually written to the .next/static folder on disk. This creates a mismatch between the assets reported by Webpack and those available on disk after the build process completes, impacting other Webpack plugins and causing bugs such as https://github.com/microsoft/azure-devops-symbols/issues/253
The filenames for assets reported by Webpack should match those generated on disk in the .next/static folder, including any appended hash values for cache busting.
### Provide environment information
```bash
Operating System:
Platform: win32
Arch: x64
Version: Windows 11 Enterprise
Available memory (MB): 32436
Available CPU cores: 8
Binaries:
Node: 20.17.0
npm: 10.8.2
Yarn: N/A
pnpm: N/A
Relevant Packages:
next: 15.0.2 // Latest available version is detected (15.0.2).
eslint-config-next: 15.0.2
react: 19.0.0-rc-02c0e824-20241028
react-dom: 19.0.0-rc-02c0e824-20241028
typescript: 5.6.3
Next.js Config:
output: N/A
```
### Which area(s) are affected? (Select all that apply)
create-next-app
### Which stage(s) are affected? (Select all that apply)
next build (local)
### Additional context
"The mismatch between filenames on disk and Webpack asset names is causing issues with other Webpack plugins. For instance, when using AzureDevOpsSymbolsPlugin, this discrepancy results in duplicated sourceMappingURL entries in production builds, rendering them unusable across all browsers." | create-next-app,bug | low | Critical |
2,633,729,697 | neovim | ext_cmdline will cause hit-enter messages on pattern not found during search | ### Problem
Probably linked to the same issues as `cmdheight=0`, but weird, since the message area should not be affected by ext_cmdline.
It can be observed in this [issue](https://github.com/smilhey/ed-cmd.nvim/issues/5)
### Steps to reproduce
nvim --clean
```lua
vim.ui_attach(vim.api.nvim_create_namespace("test"), { ext_cmdline = true }, function() end)
```
Run `:luafile %`.
You can check by searching for anything not present in the buffer and hitting enter (or by using the cmdwindow to visualize the editing).
### Expected behavior
The message area should handle its business as usual I guess
### Nvim version (nvim -v)
NVIM v0.11.0-dev-1092+g0da4d89558
### Vim (not Nvim) behaves the same?
no
### Operating system/version
fedora 40
### Terminal name/version
kitty
### $TERM environment variable
tmux-256color
### Installation
build from repo | bug,ui-extensibility,messages | low | Minor |
2,633,730,760 | kubernetes | Counter.WithContext: data race | ### What happened?
I ran integration tests with race detection enabled (https://github.com/kubernetes/kubernetes/pull/116980).
k8s.io/kubernetes/test/integration/storageversionmigrator failed with a data race (https://prow.k8s.io/view/gs/kubernetes-ci-logs/pr-logs/pull/116980/pull-kubernetes-integration/1853468672903876608):
```
Write at 0x00c000514d20 by goroutine 326096:
k8s.io/component-base/metrics.(*Counter).WithContext()
/home/prow/go/src/k8s.io/kubernetes/staging/src/k8s.io/component-base/metrics/counter.go:110 +0xb5e
k8s.io/apiserver/pkg/audit.ObserveEvent()
/home/prow/go/src/k8s.io/kubernetes/staging/src/k8s.io/apiserver/pkg/audit/metrics.go:89 +0xb40
k8s.io/apiserver/pkg/endpoints/filters.processAuditEvent()
/home/prow/go/src/k8s.io/kubernetes/staging/src/k8s.io/apiserver/pkg/endpoints/filters/audit.go:197 +0xb3f
k8s.io/apiserver/pkg/server.DefaultBuildHandlerChain.WithAudit.func6.1()
/home/prow/go/src/k8s.io/kubernetes/staging/src/k8s.io/apiserver/pkg/endpoints/filters/audit.go:113 +0x6fe
runtime.deferreturn()
/usr/local/go/src/runtime/panic.go:605 +0x5d
net/http.HandlerFunc.ServeHTTP()
/usr/local/go/src/net/http/server.go:2220 +0x47
k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1()
/home/prow/go/src/k8s.io/kubernetes/staging/src/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:84 +0x23c
net/http.HandlerFunc.ServeHTTP()
/usr/local/go/src/net/http/server.go:2220 +0x47
k8s.io/apiserver/pkg/server.DefaultBuildHandlerChain.TrackCompleted.trackCompleted.func27.deferwrap1()
/home/prow/go/src/k8s.io/kubernetes/staging/src/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:103 +0x6f
runtime.deferreturn()
/usr/local/go/src/runtime/panic.go:605 +0x5d
net/http.HandlerFunc.ServeHTTP()
/usr/local/go/src/net/http/server.go:2220 +0x47
k8s.io/apiserver/pkg/endpoints/filters.withAuthentication.func1()
/home/prow/go/src/k8s.io/kubernetes/staging/src/k8s.io/apiserver/pkg/endpoints/filters/authentication.go:123 +0xd01
net/http.HandlerFunc.ServeHTTP()
/usr/local/go/src/net/http/server.go:2220 +0x47
k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1()
/home/prow/go/src/k8s.io/kubernetes/staging/src/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:94 +0x4b5
net/http.HandlerFunc.ServeHTTP()
/usr/local/go/src/net/http/server.go:2220 +0x47
k8s.io/apiserver/pkg/server.DefaultBuildHandlerChain.WithWarningRecorder.func11()
/home/prow/go/src/k8s.io/kubernetes/staging/src/k8s.io/apiserver/pkg/endpoints/filters/warning.go:35 +0x11d
net/http.HandlerFunc.ServeHTTP()
/usr/local/go/src/net/http/server.go:2220 +0x47
k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP.func1()
/home/prow/go/src/k8s.io/kubernetes/staging/src/k8s.io/apiserver/pkg/server/filters/timeout.go:115 +0xd3
Previous write at 0x00c000514d20 by goroutine 326108:
k8s.io/component-base/metrics.(*Counter).WithContext()
/home/prow/go/src/k8s.io/kubernetes/staging/src/k8s.io/component-base/metrics/counter.go:110 +0xb5e
k8s.io/apiserver/pkg/audit.ObserveEvent()
/home/prow/go/src/k8s.io/kubernetes/staging/src/k8s.io/apiserver/pkg/audit/metrics.go:89 +0xb40
k8s.io/apiserver/pkg/endpoints/filters.processAuditEvent()
/home/prow/go/src/k8s.io/kubernetes/staging/src/k8s.io/apiserver/pkg/endpoints/filters/audit.go:197 +0xb3f
k8s.io/apiserver/pkg/server.DefaultBuildHandlerChain.WithAudit.func6.1()
/home/prow/go/src/k8s.io/kubernetes/staging/src/k8s.io/apiserver/pkg/endpoints/filters/audit.go:113 +0x6fe
runtime.deferreturn()
/usr/local/go/src/runtime/panic.go:605 +0x5d
net/http.HandlerFunc.ServeHTTP()
/usr/local/go/src/net/http/server.go:2220 +0x47
k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1()
/home/prow/go/src/k8s.io/kubernetes/staging/src/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:84 +0x23c
net/http.HandlerFunc.ServeHTTP()
/usr/local/go/src/net/http/server.go:2220 +0x47
k8s.io/apiserver/pkg/server.DefaultBuildHandlerChain.TrackCompleted.trackCompleted.func27.deferwrap1()
/home/prow/go/src/k8s.io/kubernetes/staging/src/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:103 +0x6f
runtime.deferreturn()
/usr/local/go/src/runtime/panic.go:605 +0x5d
net/http.HandlerFunc.ServeHTTP()
/usr/local/go/src/net/http/server.go:2220 +0x47
k8s.io/apiserver/pkg/endpoints/filters.withAuthentication.func1()
/home/prow/go/src/k8s.io/kubernetes/staging/src/k8s.io/apiserver/pkg/endpoints/filters/authentication.go:123 +0xd01
net/http.HandlerFunc.ServeHTTP()
/usr/local/go/src/net/http/server.go:2220 +0x47
k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1()
/home/prow/go/src/k8s.io/kubernetes/staging/src/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:94 +0x4b5
net/http.HandlerFunc.ServeHTTP()
/usr/local/go/src/net/http/server.go:2220 +0x47
k8s.io/apiserver/pkg/server.DefaultBuildHandlerChain.WithWarningRecorder.func11()
/home/prow/go/src/k8s.io/kubernetes/staging/src/k8s.io/apiserver/pkg/endpoints/filters/warning.go:35 +0x11d
net/http.HandlerFunc.ServeHTTP()
/usr/local/go/src/net/http/server.go:2220 +0x47
k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP.func1()
/home/prow/go/src/k8s.io/kubernetes/staging/src/k8s.io/apiserver/pkg/server/filters/timeout.go:115 +0xd3
```
### What did you expect to happen?
No race.
### How can we reproduce it (as minimally and precisely as possible)?
Run the integration test with `go test -race`.
### Anything else we need to know?
This was introduced in https://github.com/kubernetes/kubernetes/pull/119949 three weeks ago.
/sig instrumentation
/cc @rexagod
### Kubernetes version
master (soon 1.32)
### Cloud provider
n/a
### OS version
<details>
```console
# On Linux:
$ cat /etc/os-release
# paste output here
$ uname -a
# paste output here
# On Windows:
C:\> wmic os get Caption, Version, BuildNumber, OSArchitecture
# paste output here
```
</details>
### Install tools
<details>
</details>
### Container runtime (CRI) and version (if applicable)
<details>
</details>
### Related plugins (CNI, CSI, ...) and versions (if applicable)
<details>
</details>
| kind/bug,sig/instrumentation,triage/accepted | low | Critical |
2,633,745,320 | ui | [feat]: Support i18n translations for all components | ### Feature description
Some components contain hard-coded English strings; it should be fine to expose these strings as props.

From:
```tsx
<span className="sr-only">Close</span>
```
To:
```tsx
<span className="sr-only">{closeLabel}</span>
```
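A minimal sketch of the prop-with-fallback pattern behind the change above (the `closeLabel` name and `DialogLabels` shape are illustrative, not shadcn/ui's actual API); defaulting to the current English literal keeps existing call sites working unchanged:

```typescript
// Hypothetical prop shape; `closeLabel` is an illustration of the request,
// not an existing shadcn/ui prop.
interface DialogLabels {
  closeLabel?: string;
}

// Fall back to the current hard-coded English literal so the change
// is non-breaking for callers that pass nothing.
function resolveCloseLabel(labels: DialogLabels): string {
  return labels.closeLabel ?? "Close";
}
```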
### Affected component/components
dialog,pagination,etc...
### Additional Context
Please use `"react/jsx-no-literals": "error"` to prevent this issue from happening again.
### Before submitting
- [X] I've made research efforts and searched the documentation
- [X] I've searched for existing issues and PRs | area: request | low | Critical |
2,633,780,766 | excalidraw | Images not loading after restart | After restarting my PC, the images in my whiteboard won't load. | support | low | Minor |
2,633,808,354 | rust | Linking failed with `f16`: `undefined symbol: __floatuntihf` on Ubuntu 20.04 | ### Problem
I'm getting a linking failure on the following program on an Ubuntu 20.04 with x86_64:
```rust
#![feature(f16)]
fn main() {
let _f = (256u128 + 0) as f16;
}
```
```bash
$ cargo +nightly-2024-11-03 --version --verbose
cargo 1.84.0-nightly (031049782 2024-11-01)
release: 1.84.0-nightly
commit-hash: 0310497822a7a673a330a5dd068b7aaa579a265e
commit-date: 2024-11-01
host: x86_64-unknown-linux-gnu
libgit2: 1.8.1 (sys:0.19.0 vendored)
libcurl: 8.9.0-DEV (sys:0.4.74+curl-8.9.0 vendored ssl:OpenSSL/1.1.1w)
ssl: OpenSSL 1.1.1w 11 Sep 2023
os: Ubuntu 20.4.0 (focal) [64-bit]
```
```bash
$ cargo +nightly-2024-11-03 build
Compiling float_ex v0.1.0 (/home/ubuntu/examples/tmp/float_ex)
error: linking with `cc` failed: exit status: 1
|
= note: LC_ALL="C" PATH="/home/ubuntu/.rustup/toolchains/nightly-2024-11-03-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/bin:/home/ubuntu/.rustup/toolchains/nightly-2024-11-03-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/bin/self-contained:/home/ubuntu/.vscode-server/cli/servers/Stable-65edc4939843c90c34d61f4ce11704f09d3e5cb6/server/bin/remote-cli:/home/ubuntu/.elan/bin:/home/ubuntu/.opam/4.13.1+options/bin:/home/ubuntu/.local/bin:/home/ubuntu/bin:/home/ubuntu/.cargo/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/home/ubuntu/.dotnet/tools:/home/ubuntu/git/aws-viewer-for-cbmc:/home/ubuntu/git/kani/scripts:/home/ubuntu/git/kissat/build:/home/ubuntu/git/cadical/build:/home/ubuntu/git/cryptominisat/build:/home/ubuntu/git/smack-deps/corral:/home/ubuntu/sources/dafny:/home/ubuntu/git/aeneas/charon/bin:/home/ubuntu/git/aeneas/bin:/home/ubuntu/git/aws-viewer-for-cbmc:/home/ubuntu/git/kani/scripts:/home/ubuntu/git/kissat/build:/home/ubuntu/git/cadical/build:/home/ubuntu/git/cryptominisat/build:/home/ubuntu/git/smack-deps/corral:/home/ubuntu/sources/dafny:/home/ubuntu/git/aeneas/charon/bin:/home/ubuntu/git/aeneas/bin" VSLANG="1033" "cc" "-m64" "/tmp/rustcRMryV9/symbols.o" "/home/ubuntu/examples/tmp/float_ex/target/debug/deps/float_ex-554be11968b54f52.1upxk7wgb5be1ozir5i3pcti1.rcgu.o" "/home/ubuntu/examples/tmp/float_ex/target/debug/deps/float_ex-554be11968b54f52.54olmofqxnpy7pq9qy9o2t30g.rcgu.o" "/home/ubuntu/examples/tmp/float_ex/target/debug/deps/float_ex-554be11968b54f52.ch6aaog2yzci5htmyt918ewsf.rcgu.o" "/home/ubuntu/examples/tmp/float_ex/target/debug/deps/float_ex-554be11968b54f52.ci5nx2nl5wy1uyl56j4qhrd42.rcgu.o" "/home/ubuntu/examples/tmp/float_ex/target/debug/deps/float_ex-554be11968b54f52.cs3lb7ej2s3cvd9jn4tnp3blj.rcgu.o" "/home/ubuntu/examples/tmp/float_ex/target/debug/deps/float_ex-554be11968b54f52.626egnrzk7ba7cgmeqeckoth3.rcgu.o" "-Wl,--as-needed" "-Wl,-Bstatic" 
"/home/ubuntu/.rustup/toolchains/nightly-2024-11-03-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/lib/libstd-af54019eb310b9f4.rlib" "/home/ubuntu/.rustup/toolchains/nightly-2024-11-03-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/lib/libpanic_unwind-e370a2465b65d07c.rlib" "/home/ubuntu/.rustup/toolchains/nightly-2024-11-03-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/lib/libobject-8f87839a255517a7.rlib" "/home/ubuntu/.rustup/toolchains/nightly-2024-11-03-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/lib/libmemchr-16ad4ab169e45609.rlib" "/home/ubuntu/.rustup/toolchains/nightly-2024-11-03-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/lib/libaddr2line-28d1468ba8d7b07b.rlib" "/home/ubuntu/.rustup/toolchains/nightly-2024-11-03-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/lib/libgimli-fa4f9f6976fb516c.rlib" "/home/ubuntu/.rustup/toolchains/nightly-2024-11-03-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/lib/librustc_demangle-cbd1d7b1bf5aede2.rlib" "/home/ubuntu/.rustup/toolchains/nightly-2024-11-03-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/lib/libstd_detect-f1683a071fd1db0a.rlib" "/home/ubuntu/.rustup/toolchains/nightly-2024-11-03-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/lib/libhashbrown-9972421ad1fb0fa7.rlib" "/home/ubuntu/.rustup/toolchains/nightly-2024-11-03-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/lib/librustc_std_workspace_alloc-dc54626494bdf80e.rlib" "/home/ubuntu/.rustup/toolchains/nightly-2024-11-03-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/lib/libminiz_oxide-b64145080abdbf70.rlib" "/home/ubuntu/.rustup/toolchains/nightly-2024-11-03-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/lib/libadler-6b915e9383d4c977.rlib" 
"/home/ubuntu/.rustup/toolchains/nightly-2024-11-03-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/lib/libunwind-32ca42ea576f8b75.rlib" "/home/ubuntu/.rustup/toolchains/nightly-2024-11-03-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/lib/libcfg_if-1fa702f5e51b232e.rlib" "/home/ubuntu/.rustup/toolchains/nightly-2024-11-03-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/lib/liblibc-799d520624f8b2e0.rlib" "/home/ubuntu/.rustup/toolchains/nightly-2024-11-03-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/lib/liballoc-1af26327f6fe922d.rlib" "/home/ubuntu/.rustup/toolchains/nightly-2024-11-03-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/lib/librustc_std_workspace_core-395a38b8e0851c9b.rlib" "/home/ubuntu/.rustup/toolchains/nightly-2024-11-03-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/lib/libcore-d453bab70303062c.rlib" "/home/ubuntu/.rustup/toolchains/nightly-2024-11-03-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/lib/libcompiler_builtins-9ec1a83853f387f8.rlib" "-Wl,-Bdynamic" "-lgcc_s" "-lutil" "-lrt" "-lpthread" "-lm" "-ldl" "-lc" "-B/home/ubuntu/.rustup/toolchains/nightly-2024-11-03-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/bin/gcc-ld" "-fuse-ld=lld" "-Wl,--eh-frame-hdr" "-Wl,-z,noexecstack" "-L" "/home/ubuntu/.rustup/toolchains/nightly-2024-11-03-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/lib" "-o" "/home/ubuntu/examples/tmp/float_ex/target/debug/deps/float_ex-554be11968b54f52" "-Wl,--gc-sections" "-pie" "-Wl,-z,relro,-z,now" "-nodefaultlibs"
= note: rust-lld: error: undefined symbol: __floatuntihf
>>> referenced by main.rs:0 (src/main.rs:0)
>>> /home/ubuntu/examples/tmp/float_ex/target/debug/deps/float_ex-554be11968b54f52.ch6aaog2yzci5htmyt918ewsf.rcgu.o:(float_ex::main::hdd9d458336af3037)
>>> did you mean: __floatuntidf
>>> defined in: /usr/lib/gcc/x86_64-linux-gnu/9/../../../x86_64-linux-gnu/libgcc_s.so.1
collect2: error: ld returned 1 exit status
error: could not compile `float_ex` (bin "float_ex") due to 1 previous error
```
### Steps
_No response_
### Possible Solution(s)
_No response_
### Notes
_No response_
### Version
_No response_ | A-linkage,C-bug,requires-nightly,T-libs,F-f16_and_f128 | low | Critical |
2,633,854,506 | PowerToys | FancyZones Layout Editor | ### Description of the new feature / enhancement
Thank you so much for such a great tool. I feel that the fewer shortcut keys a user has to memorize for PowerToys, the better. The more integrated things are into the standard Windows experience, the more user-friendly they are. It would be really awesome if you could reach the FancyZones Layout Editor from the context menu that appears when right-clicking a window's title bar, or possibly by right-clicking the maximize/minimize/restore button.
Thank you for your time.
### Scenario when this would be used?
Any time I would like to use the FancyZones Layout Editor.
### Supporting information
Seems to just make sense to make it more user friendly and increase the efficiency of the workflow by giving more convenient access to the FancyZones Layout Editor. | Needs-Triage | low | Minor |
2,633,861,856 | tensorflow | TFLite Interpreter `experimental_preserve_all_tensors` yields different output | ### 1. System information
- OS Platform and Distribution: Pop!_OS 22.04 LTS
- TensorFlow installation: pip package
- TensorFlow library: 2.17.0
### 2. Code
```
import tensorflow as tf
import numpy as np
from ai_edge_litert.interpreter import Interpreter
IMAGE_SHAPE = (10, 10, 1)
IMAGE_SIZE = IMAGE_SHAPE[0]*IMAGE_SHAPE[1]*IMAGE_SHAPE[2]
tf.random.set_seed(0)
np.random.seed(0)
tf.keras.utils.set_random_seed(0)
inp_layer = tf.keras.layers.InputLayer(shape=IMAGE_SHAPE, batch_size=1)
conv_layer = tf.keras.layers.Conv2D(4, (3, 3), strides=(2, 2), padding="same", use_bias=True, activation="relu")
model = tf.keras.models.Sequential([
inp_layer,
conv_layer,
])
w = conv_layer.get_weights()
k = w[0]
b = w[1]
k[:, :, :, 0] = 0
k[:, :, :, 2] = 0
b = np.zeros(shape=w[1].shape)
print("kernel:\n", k)
print("bias:\n", b)
w[0] = k
w[1] = b
conv_layer.set_weights(w)
def representative_dataset():
for _ in range(10):
data = np.random.rand(1, *IMAGE_SHAPE)
yield [data.astype(np.float32)]
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8
tflite_quant_model = converter.convert()
#filename = "example.tflite"
#with open(filename, 'wb') as f:
# f.write(tflite_quant_model)
#interp = Interpreter(filename)
#exper = Interpreter(filename, experimental_preserve_all_tensors=True)
interp = Interpreter(model_content=tflite_quant_model)
exper = Interpreter(model_content=tflite_quant_model, experimental_preserve_all_tensors=True)
inp=np.zeros(IMAGE_SIZE, dtype=np.int8)
arr = [62, 63, 64, 72, 73, 74, 82, 83, 84]
for i in arr:
inp[i] = i
t = tf.constant(inp, shape=IMAGE_SHAPE, dtype=tf.int8)
print("input:\n", t, "\n")
def alloc_and_run(interpreter):
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()
interpreter.set_tensor(input_details[0]['index'], [t])
interpreter.invoke()
return interpreter.get_tensor(output_details[0]['index'])
out_interp = alloc_and_run(interp)
out_exper = alloc_and_run(exper)
print("out_interp:\n", out_interp, "\n")
print("out_exper:\n", out_exper, "\n")
assert(tf.reduce_all(tf.equal(out_interp, out_exper)))
```
### 3. Failure after conversion
Conversion is successful, but interpreter and experimental interpreter yield a different result:
```
out_interp:
[[[[-128 -19 -128 -68]
[-128 -19 -128 -68]
[-128 -19 -128 -68]
[-128 -19 -128 -68]
[-128 -32 -128 -128]]
[[-128 -19 -128 -68]
[-128 -19 -128 -68]
[-128 -19 -128 -68]
[-128 -19 -128 -68]
[-128 -32 -128 -128]]
[[-128 -23 -128 -47]
[-128 28 -128 -61]
[-128 15 -128 -78]
[-128 -19 -128 -68]
[-128 -32 -128 -128]]
[[-128 -12 -128 -5]
[-128 57 -128 -34]
[-128 10 -128 -98]
[-128 -19 -128 -68]
[-128 -32 -128 -128]]
[[-128 -123 -128 -75]
[-128 -128 -128 -66]
[-128 -128 -128 -120]
[-128 -117 -128 -83]
[-128 -128 -128 -128]]]]
out_exper:
[[[[-128 -19 -128 -68]
[-128 -19 -128 -68]
[-128 -19 -128 -68]
[-128 -19 -128 -68]
[-128 -32 -128 -128]]
[[-128 -19 -128 -68]
[-128 -19 -128 -68]
[-128 -19 -128 -68]
[-128 -19 -128 -68]
[-128 -32 -128 -128]]
[[-128 -23 -128 -47]
[-128 28 -128 -61]
[-128 15 -128 -78]
[-128 -19 -128 -68]
[-128 -32 -128 -128]]
[[-128 -12 -128 -5]
[-128 57 -128 -33]
[-128 10 -128 -98]
[-128 -19 -128 -68]
[-128 -32 -128 -128]]
[[-128 -123 -128 -75]
[-128 -128 -128 -66]
[-128 -128 -128 -120]
[-128 -117 -128 -83]
[-128 -128 -128 -128]]]]
Traceback (most recent call last):
File "/home/sri/riptools/scratch5/example.py", line 79, in <module>
assert(tf.reduce_all(tf.equal(out_interp, out_exper)))
AssertionError
```
The difference here is:
`out_interp[0][3][1][3] = -33`
`out_exper[0][3][1][3] = -34`
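Not a fix, but a hedged workaround for the final assertion in the repro script: since the divergence is a single quantization step (one LSB), comparing with a one-step tolerance avoids the brittle exact-equality check while still catching larger discrepancies. A sketch using the diverging values from the output above:

```python
import numpy as np

# Sketch: compare two quantized outputs with a one-LSB tolerance instead
# of exact equality. Values taken from the diverging row above.
out_interp = np.array([-128, 57, -128, -34], dtype=np.int8)
out_exper = np.array([-128, 57, -128, -33], dtype=np.int8)

# Widen to int32 before subtracting to avoid int8 overflow on the diff.
diff = np.abs(out_interp.astype(np.int32) - out_exper.astype(np.int32))
assert diff.max() <= 1  # tolerate a single quantization step
```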
This "off-by-one" may occur with other sized input tensors and larger non-zero inputs/weights as well. | stat:awaiting tensorflower,comp:lite,TFLiteConverter,2.17 | medium | Critical |
2,633,882,519 | PowerToys | Switchbar doesn't work with searching with run or typing links through run | ### Microsoft PowerToys version
0.86.0
### Installation method
PowerToys auto-update
### Running as admin
Yes
### Area(s) with issue?
PowerToys Run
### Steps to reproduce
Download Switchbar and set it up; after setting it as the default browser, try opening a web page or searching with PowerToys Run.
### ✔️ Expected Behavior
I expected it to open the browser selector, just like any other link
### ❌ Actual Behavior
It opened switchbar settings, the same result as opening the app's .exe
### Other Software
[Switchbar](https://switchbar.com/) 19.3.0 | Issue-Bug,Needs-Triage | low | Minor |
2,633,903,576 | flutter | [go_router] hero widgets do not reactivate after navigating back from a sub route right after closing a dialog | ### What package does this bug report belong to?
go_router
### What target platforms are you seeing this bug on?
Android, Web
### Have you already upgraded your packages?
Yes
### Dependency versions
<details><summary>pubspec.lock</summary>
```lock
# Generated by pub
# See https://dart.dev/tools/pub/glossary#lockfile
packages:
  async:
    dependency: transitive
    description:
      name: async
      sha256: "947bfcf187f74dbc5e146c9eb9c0f10c9f8b30743e341481c1e2ed3ecc18c20c"
      url: "https://pub.dev"
    source: hosted
    version: "2.11.0"
  boolean_selector:
    dependency: transitive
    description:
      name: boolean_selector
      sha256: "6cfb5af12253eaf2b368f07bacc5a80d1301a071c73360d746b7f2e32d762c66"
      url: "https://pub.dev"
    source: hosted
    version: "2.1.1"
  characters:
    dependency: transitive
    description:
      name: characters
      sha256: "04a925763edad70e8443c99234dc3328f442e811f1d8fd1a72f1c8ad0f69a605"
      url: "https://pub.dev"
    source: hosted
    version: "1.3.0"
  clock:
    dependency: transitive
    description:
      name: clock
      sha256: cb6d7f03e1de671e34607e909a7213e31d7752be4fb66a86d29fe1eb14bfb5cf
      url: "https://pub.dev"
    source: hosted
    version: "1.1.1"
  collection:
    dependency: transitive
    description:
      name: collection
      sha256: ee67cb0715911d28db6bf4af1026078bd6f0128b07a5f66fb2ed94ec6783c09a
      url: "https://pub.dev"
    source: hosted
    version: "1.18.0"
  cupertino_icons:
    dependency: "direct main"
    description:
      name: cupertino_icons
      sha256: ba631d1c7f7bef6b729a622b7b752645a2d076dba9976925b8f25725a30e1ee6
      url: "https://pub.dev"
    source: hosted
    version: "1.0.8"
  fake_async:
    dependency: transitive
    description:
      name: fake_async
      sha256: "511392330127add0b769b75a987850d136345d9227c6b94c96a04cf4a391bf78"
      url: "https://pub.dev"
    source: hosted
    version: "1.3.1"
  flutter:
    dependency: "direct main"
    description: flutter
    source: sdk
    version: "0.0.0"
  flutter_lints:
    dependency: "direct dev"
    description:
      name: flutter_lints
      sha256: "5398f14efa795ffb7a33e9b6a08798b26a180edac4ad7db3f231e40f82ce11e1"
      url: "https://pub.dev"
    source: hosted
    version: "5.0.0"
  flutter_test:
    dependency: "direct dev"
    description: flutter
    source: sdk
    version: "0.0.0"
  flutter_web_plugins:
    dependency: transitive
    description: flutter
    source: sdk
    version: "0.0.0"
  go_router:
    dependency: "direct main"
    description:
      name: go_router
      sha256: "5a7419238fe5ed0b6bcf62f2eec8153caf12929dcebe05a891d530bf4b4ef52e"
      url: "https://pub.dev"
    source: hosted
    version: "14.4.0"
  leak_tracker:
    dependency: transitive
    description:
      name: leak_tracker
      sha256: "3f87a60e8c63aecc975dda1ceedbc8f24de75f09e4856ea27daf8958f2f0ce05"
      url: "https://pub.dev"
    source: hosted
    version: "10.0.5"
  leak_tracker_flutter_testing:
    dependency: transitive
    description:
      name: leak_tracker_flutter_testing
      sha256: "932549fb305594d82d7183ecd9fa93463e9914e1b67cacc34bc40906594a1806"
      url: "https://pub.dev"
    source: hosted
    version: "3.0.5"
  leak_tracker_testing:
    dependency: transitive
    description:
      name: leak_tracker_testing
      sha256: "6ba465d5d76e67ddf503e1161d1f4a6bc42306f9d66ca1e8f079a47290fb06d3"
      url: "https://pub.dev"
    source: hosted
    version: "3.0.1"
  lints:
    dependency: transitive
    description:
      name: lints
      sha256: "3315600f3fb3b135be672bf4a178c55f274bebe368325ae18462c89ac1e3b413"
      url: "https://pub.dev"
    source: hosted
    version: "5.0.0"
  logging:
    dependency: transitive
    description:
      name: logging
      sha256: c8245ada5f1717ed44271ed1c26b8ce85ca3228fd2ffdb75468ab01979309d61
      url: "https://pub.dev"
    source: hosted
    version: "1.3.0"
  matcher:
    dependency: transitive
    description:
      name: matcher
      sha256: d2323aa2060500f906aa31a895b4030b6da3ebdcc5619d14ce1aada65cd161cb
      url: "https://pub.dev"
    source: hosted
    version: "0.12.16+1"
  material_color_utilities:
    dependency: transitive
    description:
      name: material_color_utilities
      sha256: f7142bb1154231d7ea5f96bc7bde4bda2a0945d2806bb11670e30b850d56bdec
      url: "https://pub.dev"
    source: hosted
    version: "0.11.1"
  meta:
    dependency: transitive
    description:
      name: meta
      sha256: bdb68674043280c3428e9ec998512fb681678676b3c54e773629ffe74419f8c7
      url: "https://pub.dev"
    source: hosted
    version: "1.15.0"
  path:
    dependency: transitive
    description:
      name: path
      sha256: "087ce49c3f0dc39180befefc60fdb4acd8f8620e5682fe2476afd0b3688bb4af"
      url: "https://pub.dev"
    source: hosted
    version: "1.9.0"
  sky_engine:
    dependency: transitive
    description: flutter
    source: sdk
    version: "0.0.99"
  source_span:
    dependency: transitive
    description:
      name: source_span
      sha256: "53e943d4206a5e30df338fd4c6e7a077e02254531b138a15aec3bd143c1a8b3c"
      url: "https://pub.dev"
    source: hosted
    version: "1.10.0"
  stack_trace:
    dependency: transitive
    description:
      name: stack_trace
      sha256: "73713990125a6d93122541237550ee3352a2d84baad52d375a4cad2eb9b7ce0b"
      url: "https://pub.dev"
    source: hosted
    version: "1.11.1"
  stream_channel:
    dependency: transitive
    description:
      name: stream_channel
      sha256: ba2aa5d8cc609d96bbb2899c28934f9e1af5cddbd60a827822ea467161eb54e7
      url: "https://pub.dev"
    source: hosted
    version: "2.1.2"
  string_scanner:
    dependency: transitive
    description:
      name: string_scanner
      sha256: "556692adab6cfa87322a115640c11f13cb77b3f076ddcc5d6ae3c20242bedcde"
      url: "https://pub.dev"
    source: hosted
    version: "1.2.0"
  term_glyph:
    dependency: transitive
    description:
      name: term_glyph
      sha256: a29248a84fbb7c79282b40b8c72a1209db169a2e0542bce341da992fe1bc7e84
      url: "https://pub.dev"
    source: hosted
    version: "1.2.1"
  test_api:
    dependency: transitive
    description:
      name: test_api
      sha256: "5b8a98dafc4d5c4c9c72d8b31ab2b23fc13422348d2997120294d3bac86b4ddb"
      url: "https://pub.dev"
    source: hosted
    version: "0.7.2"
  vector_math:
    dependency: transitive
    description:
      name: vector_math
      sha256: "80b3257d1492ce4d091729e3a67a60407d227c27241d6927be0130c98e741803"
      url: "https://pub.dev"
    source: hosted
    version: "2.1.4"
  vm_service:
    dependency: transitive
    description:
      name: vm_service
      sha256: "5c5f338a667b4c644744b661f309fb8080bb94b18a7e91ef1dbd343bed00ed6d"
      url: "https://pub.dev"
    source: hosted
    version: "14.2.5"
sdks:
  dart: ">=3.5.4 <4.0.0"
  flutter: ">=3.19.0"
```
</details>
### Steps to reproduce
1. Create a route with a sub route, both containing Hero widgets with the same tag
2. Go to the sub route from the root route
3. Show a dialog in the sub route with an await, then `context.go` to the previous route after awaiting the dialog
4. Close the dialog
### Expected results
The hero widget in the root route to be visible
### Actual results
The hero widget is invisible.
FYI, if you wait until the dialog transition is finished and then navigate to the parent route, the issue does not happen.
You can also see how it should behave if you call `context.pop()` instead of `go` after closing the dialog.
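As an illustration only (not part of the original report), a sketch of the workaround hinted at above — delaying the navigation until the dialog's exit transition has finished. The explicit delay duration is an assumption roughly matching the default dialog transition:

```dart
// Inside Two's onPressed handler (sketch, not the reporter's code).
onPressed: () async {
  await showDialog(
    context: context,
    builder: (context) => const AlertDialog(title: Text('Hi')),
  );
  if (!context.mounted) return;
  // Assumption: ~200 ms covers the default dialog exit transition,
  // so the Hero machinery is idle again before we navigate.
  await Future.delayed(const Duration(milliseconds: 200));
  if (!context.mounted) return;
  context.go('/');
},
```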
### Code sample
<details open><summary>Code sample</summary>
```dart
import 'package:flutter/material.dart';
import 'package:go_router/go_router.dart';

void main() => runApp(const MyApp());

final router = GoRouter(routes: [
  GoRoute(
    path: '/',
    builder: (context, state) => One(),
    routes: [
      GoRoute(
        path: 'two',
        builder: (context, state) => Two(),
      ),
    ],
  ),
]);

class MyApp extends StatelessWidget {
  const MyApp({super.key});

  @override
  Widget build(BuildContext context) {
    return MaterialApp.router(
      routerConfig: router,
      title: 'Flutter Demo',
      debugShowCheckedModeBanner: false,
      theme: ThemeData(
        colorSchemeSeed: Colors.blue,
      ),
    );
  }
}

class One extends StatelessWidget {
  const One({super.key});

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(),
      body: Center(
        child: Column(
          mainAxisAlignment: MainAxisAlignment.center,
          children: [
            Hero(
              tag: 'tag',
              child: Icon(Icons.abc, size: 100),
            ),
            const Text(
              'one',
            ),
            ElevatedButton(
              child: Text('Go to two'),
              onPressed: () async {
                context.go('/two');
              },
            ),
          ],
        ),
      ),
    );
  }
}

class Two extends StatelessWidget {
  const Two({super.key});

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(),
      body: Center(
        child: Column(
          mainAxisAlignment: MainAxisAlignment.center,
          children: [
            Hero(
              tag: 'tag',
              child: Icon(Icons.abc, size: 100),
            ),
            const Text(
              'two',
            ),
            ElevatedButton(
              child: Text('Go back'),
              onPressed: () async {
                await showDialog(
                  context: context,
                  builder: (context) {
                    return AlertDialog(
                      title: Text('Hi'),
                      actions: [
                        TextButton(
                          onPressed: () => context.pop(),
                          child: Text('close'),
                        ),
                      ],
                    );
                  },
                );
                if (!context.mounted) return;
                context.go('/');
              },
            ),
          ],
        ),
      ),
    );
  }
}
```
</details>
### Screenshots or Videos
<details open>
<summary>Screenshots / Video demonstration</summary>
https://github.com/user-attachments/assets/990fda9a-392f-4ac1-bb61-7ce40e2bce38
</details>
### Logs
<details open><summary>Logs</summary>
```console
EMPTY
```
</details>
### Flutter Doctor output
<details open><summary>Doctor output</summary>
```console
[!] Flutter (Channel stable, 3.24.4, on Microsoft Windows [Version 10.0.22631.4391], locale en-US)
• Flutter version 3.24.4 on channel stable at C:\Users\arina\scoop\apps\fvm\current\default
! Warning: `flutter` on your path resolves to C:\Users\arina\scoop\persist\fvm\versions\stable\bin\flutter, which is not inside your current Flutter SDK
checkout at C:\Users\arina\scoop\apps\fvm\current\default. Consider adding C:\Users\arina\scoop\apps\fvm\current\default\bin to the front of your path.
! Warning: `dart` on your path resolves to C:\Users\arina\scoop\persist\fvm\versions\stable\bin\dart, which is not inside your current Flutter SDK
checkout at C:\Users\arina\scoop\apps\fvm\current\default. Consider adding C:\Users\arina\scoop\apps\fvm\current\default\bin to the front of your path.
• Upstream repository https://github.com/flutter/flutter.git
• Framework revision 603104015d (11 days ago), 2024-10-24 08:01:25 -0700
• Engine revision db49896cf2
• Dart version 3.5.4
• DevTools version 2.37.3
• If those were intentional, you can disregard the above warnings; however it is recommended to use "git" directly to perform update checks and upgrades.
[✓] Windows Version (Installed version of Windows is version 10 or higher)
[✓] Android toolchain - develop for Android devices (Android SDK version 34.0.0)
• Android SDK at C:\Users\arina\AppData\Local\Android\sdk
• Platform android-34, build-tools 34.0.0
• Java binary at: C:\Program Files\Android\Android Studio\jbr\bin\java
• Java version OpenJDK Runtime Environment (build 17.0.10+0--11572160)
• All Android licenses accepted.
[✓] Chrome - develop for the web
• Chrome at C:\Program Files\Google\Chrome\Application\chrome.exe
[✓] Visual Studio - develop Windows apps (Visual Studio Community 2022 17.9.1)
• Visual Studio at C:\Program Files\Microsoft Visual Studio\2022\Community
• Visual Studio Community 2022 version 17.9.34616.47
• Windows 10 SDK version 10.0.22621.0
[✓] Android Studio (version 2023.3)
• Android Studio at C:\Program Files\Android\Android Studio
• Flutter plugin can be installed from:
🔨 https://plugins.jetbrains.com/plugin/9212-flutter
• Dart plugin can be installed from:
🔨 https://plugins.jetbrains.com/plugin/6351-dart
• Java version OpenJDK Runtime Environment (build 17.0.10+0--11572160)
[✓] Connected device (4 available)
• CPH2581 (mobile) • ac8372b7 • android-arm64 • Android 14 (API 34)
• Windows (desktop) • windows • windows-x64 • Microsoft Windows [Version 10.0.22631.4391]
• Chrome (web) • chrome • web-javascript • Google Chrome 130.0.6723.92
• Edge (web) • edge • web-javascript • Microsoft Edge 130.0.2849.68
[✓] Network resources
• All expected network resources are available.
! Doctor found issues in 1 category.
```
</details>
| a: animation,package,has reproducible steps,P1,p: go_router,team-go_router,triaged-go_router,found in release: 3.24,found in release: 3.27 | medium | Critical |
2,633,935,271 | godot | Error assigning boolean properties of typed class instance | ### Tested versions
Godot Engine v4.3.stable.official.77dcf97d8
Godot Engine v4.4.dev3.official.f4af8201b
### System information
Godot v4.4.dev3 - Debian GNU/Linux 12 (bookworm) 12 on X11 - X11 display driver, Multi-window, 1 monitor - Vulkan (Forward+) - dedicated NVIDIA GeForce RTX 4080 SUPER (nvidia) - AMD Ryzen 9 5900X 12-Core Processor (24 threads)
### Issue description
This code does not compile:
```gdscript
extends Node3D

func _ready() -> void:
    var material := ORMMaterial3D.new()
    material.transparency = true
```
The editor shows this error:
```
Line 5:Cannot assign a value of type "bool" as "BaseMaterial3D.Transparency".
Line 5:Value of type "bool" cannot be assigned to a variable of type "BaseMaterial3D.Transparency".
```
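As an illustration only (not part of the original report): the error message indicates that `transparency` is typed as the enum `BaseMaterial3D.Transparency` rather than `bool`, so on the typed instance an enum assignment should type-check. The constant name is taken from the Godot class reference:

```gdscript
extends Node3D

func _ready() -> void:
    var material := ORMMaterial3D.new()
    # Assign an enum value instead of a bool; TRANSPARENCY_ALPHA is a
    # constant of the BaseMaterial3D.Transparency enum.
    material.transparency = BaseMaterial3D.TRANSPARENCY_ALPHA
```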
Changing it to this, it works:

```gdscript
extends Node3D

func _ready() -> void:
    var material = ORMMaterial3D.new()
    material.transparency = true
```
This code has the same problem:
```gdscript
func _ready() -> void:
    var mesh_instance := MeshInstance3D.new()
    mesh_instance.cast_shadow = false
```
Assignment of other non-boolean properties works without problems when using the typed instance, e.g. this works:

```gdscript
func _ready() -> void:
    var mesh_instance := MeshInstance3D.new()
    mesh_instance.position = Vector3.ZERO
```
### Steps to reproduce
1. Create a blank new project
2. Add a Node3D
3. Attach a script
4. Add the above method
5. Error appears right away in editor (even before attempting to run project)
### Minimal reproduction project (MRP)
[new-game-project.zip](https://github.com/user-attachments/files/17624231/new-game-project.zip)
| enhancement,discussion,topic:gdscript,topic:editor | low | Critical |