2074770615 | The headers of the NMDC website don't match the products
See screen shot. It throws me off when the text shifts & people tend to struggle to find the submission portal in the products tab.
Website
Product portals
Did you determine if this mattered & could/should be done? @pkalita-lbl
IMO I'm not sure exactly what we'd do here.
The unavoidable fact is that we're talking about headers on two different websites. We are conscious of keeping the content of the menus the same, but as far as the design goes it's hard to divorce the menu from the site it is a part of. For example:
The portal site (https://data.microbiomedata.org/) needs to accommodate the login/logout/user info. That will never be a part of the WordPress site (https://microbiomedata.org/).
The WordPress site is based around a fixed-width layout. All the content including the header is constrained to a fixed width which is why you get whitespace to the left of the logo and to the right of the Contact menu. The portal site is based on a full-width layout. That's why there is no whitespace to the left of the logo there or to the right of the login/logout/user info. It would be odd to have a fixed-width header above full-width content or vice-versa.
Each site uses a different set of fonts, and the header of each site reflects that.
Those are just some of the obvious differences between the designs of the two sites. There's more subtle stuff, too, that varies between them that's not really worth getting into. The bottom line is: they're never going to be pixel-for-pixel identical.
So what are the complaints?
the text shifts
Mainly that's driven by points 1 and 2 above, especially point 2. Because you're going between a fixed-width layout and a full-width layout, the degree of shifting you get is mostly determined simply by how wide your browser window is.
people tend to struggle to find the submission portal in the products tab
I'm not quite sure what to say about that. Do you have suggestions?
I think I was referring to text size and font. Not position. Can we make it bigger on the products?
For finding the "submission portal" button... @ljohnson09 I think you had some feedback on this, in an issue? Do you recall?
We could try making the font size in the portal header bigger across the board and see if that's better. But I would be extremely wary about making the font size bigger for only the product links. I think that would look unusual.
the font size bigger for only the product links
no... I mean make them all match...
About, Product, Community, Resources, Contact ... so they catch the eye better.
Okay let's try a larger font size in the portal header as a path forward.
@mslarae13 We had a couple of users from the user interviews that had trouble getting back to the SP after it rerouted them to the DP after ORCiD log in. But eventually people were able to find the products tab - I think they were mostly thrown off by the new portal.
@ljohnson09 this is fixed! https://github.com/microbiomedata/nmdc-server/pull/1275
Will be in prod at the end of the month
| gharchive/issue | 2024-01-10T16:48:18 | 2025-04-01T06:44:56.948617 | {
"authors": [
"ljohnson09",
"mslarae13",
"pkalita-lbl"
],
"repo": "microbiomedata/submission-schema",
"url": "https://github.com/microbiomedata/submission-schema/issues/172",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1124433534 | Upgrade Chakra-UI to latest version
Upgrade Chakra-UI to latest version and test/check for any negative side effects/errors.
The following error is received on updating Chakra UI (and its dependencies) to the latest version:
./node_modules/framer-motion/dist/es/components/AnimatePresence/index.mjs
Can't import the named export 'Children' from non EcmaScript module (only default export is available)
This problem is documented here - mjs is not supported by default in CRA v4. This has been fixed in v5.
We can either upgrade to CRA v5, use Framer Motion v4 (although this seems unwise as it is a dependency of Chakra UI), or attempt to patch/override the webpack config to add support for mjs.
| gharchive/issue | 2022-02-04T17:17:32 | 2025-04-01T06:44:56.951160 | {
"authors": [
"microbit-robert"
],
"repo": "microbit-foundation/python-editor-next",
"url": "https://github.com/microbit-foundation/python-editor-next/issues/558",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1116073381 | Search translatable strings
Add and use search translatable strings.
See #440.
What happens to untranslated strings in another language (e.g., French)? FormatJS falls back to using the string id with a very loud console error.
| gharchive/pull-request | 2022-01-27T11:04:56 | 2025-04-01T06:44:56.952251 | {
"authors": [
"microbit-robert"
],
"repo": "microbit-foundation/python-editor-next",
"url": "https://github.com/microbit-foundation/python-editor-next/pull/480",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1361622862 | When connecting via USB, disconnecting will cause the browser page to crash.
Scratch3 - micro:bit more: when connecting to a micro:bit via USB, disconnecting it (either by selecting "Disconnect" in the interface or simply unplugging the micro:bit from the computer) will crash the current page.
Refreshing the page will bring you back to Scratch3, but may result in the last few steps in your work not being saved.
Latest version of Google Chrome
Firmware version V0.2.3
micro:bit board version V2.00
The serial port was incorrectly closed.
This was fixed at 94f37a5f38f361a25a7363e8414efe7a42d89766 and released Release 0.2.5 · microbit-more/mbit-more-v2.
| gharchive/issue | 2022-09-05T08:59:44 | 2025-04-01T06:44:56.954882 | {
"authors": [
"Evolution-detector",
"yokobond"
],
"repo": "microbit-more/mbit-more-v2",
"url": "https://github.com/microbit-more/mbit-more-v2/issues/19",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1982035426 | Check the .gitignore file
Maybe set it up to remove R history files as those really shouldn't be in the repo.
Added .Rhistory to our .gitignore file to resolve this
| gharchive/issue | 2023-11-07T19:06:17 | 2025-04-01T06:44:56.956010 | {
"authors": [
"dacb",
"microbroman"
],
"repo": "microbroman/Nitrogenous-Fate-Project",
"url": "https://github.com/microbroman/Nitrogenous-Fate-Project/issues/2",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
937200722 | Configure continuous integration
... for at least building and pushing the container image to the microcks organization on quay.io
Now done. Container image is pushed to https://quay.io/repository/microcks/microcks-hub?tab=tags
| gharchive/issue | 2021-07-05T15:34:54 | 2025-04-01T06:44:56.958862 | {
"authors": [
"lbroudoux"
],
"repo": "microcks/hub.microcks.io",
"url": "https://github.com/microcks/hub.microcks.io/issues/4",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
159927040 | Notification "microG Services Framework Proxy" has stopped."
Hello!
I get regularely a notification window with this message:
"microG Services Framework Proxy" has stopped."
Question:
How can I start troubleshooting this notification by means of identifying the trigger of it?
THX
Please use the catlog App - filter errors - open App after crash immediately and post the last 5 pages.
Hello!
I have started Catlog w/o any filters but collected the attached file.
In this file I have identified this info:
06-15 01:51:30.099 W/ActivityManager( 929): Scheduling restart of crashed service com.google.android.gsf/org.microg.gms.gcm.PushRegisterProxy in 1000ms
However, I cannot identify what is causing this.
Could you please assist?
THX
2016-06-15-00-06-28.txt
Update:
I think I've identified the relevant lines in the log documenting the error:
06-15 01:51:28.236 W/ActivityManager( 929): Permission Denial: Accessing service ComponentInfo{com.google.android.gms/org.microg.gms.gcm.PushRegisterService} from pid=9789, uid=10062 requires com.google.android.c2dm.permission.RECEIVE
06-15 01:51:28.237 E/AndroidRuntime( 9789): FATAL EXCEPTION: IntentService[GsfGcmRegisterProxy]
06-15 01:51:28.237 E/AndroidRuntime( 9789): Process: com.google.android.gsf, PID: 9789
06-15 01:51:28.237 E/AndroidRuntime( 9789): java.lang.SecurityException: Not allowed to start service Intent { act=com.google.android.c2dm.intent.REGISTER pkg=com.google.android.gms (has extras) } without permission com.google.android.c2dm.permission.RECEIVE
06-15 01:51:28.237 E/AndroidRuntime( 9789): at android.app.ContextImpl.startServiceCommon(ContextImpl.java:1747)
06-15 01:51:28.237 E/AndroidRuntime( 9789): at android.app.ContextImpl.startService(ContextImpl.java:1724)
06-15 01:51:28.237 E/AndroidRuntime( 9789): at android.content.ContextWrapper.startService(ContextWrapper.java:522)
06-15 01:51:28.237 E/AndroidRuntime( 9789): at org.microg.gms.gcm.PushRegisterProxy.onHandleIntent(PushRegisterProxy.java:32)
06-15 01:51:28.237 E/AndroidRuntime( 9789): at android.app.IntentService$ServiceHandler.handleMessage(IntentService.java:65)
06-15 01:51:28.237 E/AndroidRuntime( 9789): at android.os.Handler.dispatchMessage(Handler.java:102)
06-15 01:51:28.237 E/AndroidRuntime( 9789): at android.os.Looper.loop(Looper.java:135)
06-15 01:51:28.237 E/AndroidRuntime( 9789): at android.os.HandlerThread.run(HandlerThread.java:61)
Try if reinstalling microg Services Framework Proxy fixes it.
I have executed the following steps:
Remove package "microG Services Framework Proxy" v0.1.0
Reboot
Installation of "microG Services Framework Proxy" from F-Droid
Validating that all restrictions for "microG Services Framework Proxy" in XPrivacy are off
Validating "microG Settings" - Self-Check -> no errors reported
As for now the error message is not displayed anymore.
I'll keep this issue updated when observing the device for next 48h or close the issue accordingly.
Had this some time before. The order of installing components should be:
microG GMS (provides the c2dm service and adds the permission to the system)
microG GSF proxy (docks to the provided c2dm permission)
Play Store (docks to microG GMS and provides the licensing service and permission)
Apps requiring Google LVL must be installed after the Play Store, or the BIND permission will not exist ... crash!
If you uninstall one part that hosts a permission, you have to reinstall the dependent parts in order (I had to a few weeks ago). This also applies to Google's apps.
It looks like uninstalling/reinstalling 'framework proxy' did the trick for me. Thanks.
| gharchive/issue | 2016-06-13T11:17:49 | 2025-04-01T06:44:56.978469 | {
"authors": [
"74cmonty",
"Catfriend1",
"jeekajoo",
"mar-v-in"
],
"repo": "microg/android_packages_apps_GmsCore",
"url": "https://github.com/microg/android_packages_apps_GmsCore/issues/142",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
} |
1729662741 | Not able to access trace from within ErrorWebExceptionHandler
Using:
io.micrometer#micrometer-core;1.10.6
io.micrometer#micrometer-tracing;1.0.4
io.micrometer#micrometer-tracing-bridge-otel;1.0.4
and
spring-boot-starter-webflux:3.0.6
The trace is not accessible neither within DefaultErrorWebExceptionHandler nor in custom handler that implements ErrorWebExceptionHandler;
Note: I enabled Hooks.enableAutomaticContextPropagation(); in my application class.
This sounds like a duplicate of https://github.com/spring-projects/spring-boot/issues/35604 / https://github.com/spring-projects/spring-framework/issues/30013.
| gharchive/issue | 2023-05-28T20:41:10 | 2025-04-01T06:44:56.991947 | {
"authors": [
"chechoochoo",
"shakuzen"
],
"repo": "micrometer-metrics/tracing",
"url": "https://github.com/micrometer-metrics/tracing/issues/269",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
752353829 | quickmaps navigation updates
quickmaps sub-router implemented
routerlinks updated to include query params
sidenav styling tweaked
e2e tests updated
Passes e2e tests locally but fails on the server.
May be due to a difference between prod and dev builds.
Will try to fix on Monday :-)
Managed to get the e2e tests to fail locally now, so will try to get that fixed...
Woop woop, tests pass!
🥇
| gharchive/pull-request | 2020-11-27T16:16:01 | 2025-04-01T06:44:57.011128 | {
"authors": [
"bgsandan",
"jon571"
],
"repo": "micronutrientsupport/micronutrient-support-tool",
"url": "https://github.com/micronutrientsupport/micronutrient-support-tool/pull/92",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
197168379 | Enhancement/opentracing
This add open tracing capabilities with zipkin to the user service
Will add a noop tracer if the zipkin address is not set.
resolves #29
Coverage decreased (-1.1%) to 23.252% when pulling b1e6e3f74493a1d62f0a5133ff24ca1af12edac4 on enhancement/opentracing into 1d394b6feaa0c7c029ed6482bb6716ceaae96d41 on master.
Coverage decreased (-1.1%) to 23.252% when pulling aec8297dc74633caf158c49fbb84636d395d90d3 on enhancement/opentracing into 1d394b6feaa0c7c029ed6482bb6716ceaae96d41 on master.
Coverage decreased (-1.1%) to 23.252% when pulling 6bc65ab9fd43a905a050906a026eae8847fed979 on enhancement/opentracing into 1d394b6feaa0c7c029ed6482bb6716ceaae96d41 on master.
| gharchive/pull-request | 2016-12-22T12:52:48 | 2025-04-01T06:44:57.061018 | {
"authors": [
"coveralls",
"jasonrichardsmith"
],
"repo": "microservices-demo/user",
"url": "https://github.com/microservices-demo/user/pull/30",
"license": "apache-2.0",
"license_type": "permissive",
"license_source": "bigquery"
} |
1979734544 | how-to-run code instruction link is broken
The how-to-run code instruction link is broken. The issue can be reproduced by visiting: https://microsoft.github.io/AI-For-Beginners/etc/how-to-run
fixed #362
| gharchive/issue | 2023-11-06T17:57:04 | 2025-04-01T06:44:57.090185 | {
"authors": [
"cutePanda123",
"leestott"
],
"repo": "microsoft/AI-For-Beginners",
"url": "https://github.com/microsoft/AI-For-Beginners/issues/259",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1395779336 | Universial Code Error for MS Extensions
Since today I get this error in my pipelines running against a v21 container.
Error: RunPipeline action failed. Error: The remote server returned an error: (500) Internal Server Error. Sorry, we can’t sign you in because the following apps don’t meet our Universal Code requirement : _Exclude_AnonymizedDataSharing_ by Microsoft _Exclude_APIV1_ by Microsoft _Exclude_APIV2_ by Microsoft _Exclude_Bank Deposits by Microsoft _Exclude_Connectivity Apps by Microsoft _Exclude_Email Logging Using Graph API by Microsoft _Exclude_Microsoft Dynamics 365 - SmartList by Microsoft _Exclude_PlanConfiguration_ by Microsoft _Exclude_ReportLayouts by Microsoft AMC Banking 365 Fundamentals by Microsoft Base Application by Microsoft Business Central Cloud Migration - Previous Release by Microsoft Business Central Cloud Migration - Previous Release (US) by Microsoft Business Central Intelligent Cloud by Microsoft Ceridian Payroll by Microsoft Company Hub by Microsoft Email - Outlook REST API by Microsoft Email - SMTP API by Microsoft Email - SMTP Connector by Microsoft Envestnet Yodlee Bank Feeds by Microsoft Image Analyzer by Microsoft Import of QuickBooks Payroll Files by Microsoft Intelligent Cloud Base by Microsoft Late Payment Prediction by Microsoft Migration of QuickBooks Data by Microsoft Payment Links to PayPal by Microsoft Recommended Apps by Microsoft Sales and Inventory Forecast by Microsoft Send remittance advice by email by Microsoft Send To Email Printer by Microsoft Shopify Connector by Microsoft Simplified Bank Statement Import by Microsoft System Application by Microsoft Troubleshoot FA Ledger Entries by Microsoft Universal Print Integration by Microsoft You must either upgrade the apps to meet the requirement or license the non-Universal Code module that grants an exception to the requirement. For more information, see https://aka.ms/bcUniversalCode. 
Stacktrace:
at Compile-AppInBcContainer, C:\Users\runneradmin\AppData\Local\Temp\711feb78-ccd0-441b-9ca9-a3c34d2a7af4\BcContainerHelper\AppHandling\Compile-AppInNavContainer.ps1: line 433
at <ScriptBlock>, C:\Users\runneradmin\AppData\Local\Temp\711feb78-ccd0-441b-9ca9-a3c34d2a7af4\BcContainerHelper\AppHandling\Run-AlPipeline.ps1: line 649
at <ScriptBlock>, C:\Users\runneradmin\AppData\Local\Temp\711feb78-ccd0-441b-9ca9-a3c34d2a7af4\BcContainerHelper\AppHandling\Run-AlPipeline.ps1: line 1450
at <ScriptBlock>, C:\Users\runneradmin\AppData\Local\Temp\711feb78-ccd0-441b-9ca9-a3c34d2a7af4\BcContainerHelper\AppHandling\Run-AlPipeline.ps1: line 1106
at <ScriptBlock>, C:\Users\runneradmin\AppData\Local\Temp\711feb78-ccd0-441b-9ca9-a3c34d2a7af4\BcContainerHelper\AppHandling\Run-AlPipeline.ps1: line 1098
at <ScriptBlock>, C:\Users\runneradmin\AppData\Local\Temp\711feb78-ccd0-441b-9ca9-a3c34d2a7af4\BcContainerHelper\AppHandling\Run-AlPipeline.ps1: line 747
at Run-AlPipeline, C:\Users\runneradmin\AppData\Local\Temp\711feb78-ccd0-441b-9ca9-a3c34d2a7af4\BcContainerHelper\AppHandling\Run-AlPipeline.ps1: line 712
at <ScriptBlock>, D:\a\_actions\microsoft\AL-Go-Actions\v2.0\RunPipeline\RunPipeline.ps1: line 279
at <ScriptBlock>, D:\a\_temp\653bcdae-b3e3-4726-a736-0cfda11c2265.ps1: line 1
at <ScriptBlock>, <No file>: line 1
Is there any setting I am missing? I would have expected that the MS Extensions either follow universal code or are excluded by default.
I also changed to the .bclicense file format for the license I am using, but again, I would not expect this to cause a problem like this.
It shouldn't - could you email the license you are using to freddyk at microsoft dot com?
Just sent.
And I just re-ran the pipeline on 20.5 artifacts without problems.
It looks like we have a problem with the apps in the database not being signed.
To fix this, all apps needs to be uninstalled and re-installed - but that is a memory consuming and lengthy process.
We are working on fixing this in the database asap.
Just posted an announcement on Yammer describing this problem.
https://www.yammer.com/dynamicsnavdev/#/Threads/show?threadId=1944761703292928
| gharchive/issue | 2022-10-04T07:17:45 | 2025-04-01T06:44:57.093939 | {
"authors": [
"StefanMaron",
"freddydk"
],
"repo": "microsoft/AL-Go",
"url": "https://github.com/microsoft/AL-Go/issues/221",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
944399418 | [Event Request] codeunit 99000834 "Purch. Line-Reserve" - OnBeforeTransferReservationEntry
We would like to add a new event OnBeforeTransferReservationEntry in the following procedure:
procedure TransferPurchLineToItemJnlLine(var PurchLine: Record "Purchase Line"; var ItemJnlLine: Record "Item Journal Line"; TransferQty: Decimal; var CheckApplToItemEntry: Boolean): Decimal
.
.
.
if OldReservEntry."Item Tracking" <> OldReservEntry."Item Tracking"::None then begin
OldReservEntry.TestField("Appl.-to Item Entry");
CreateReservEntry.SetApplyToEntryNo(OldReservEntry."Appl.-to Item Entry");
CheckApplToItemEntry := false;
end;
end;
//A109F0006.event.ns
IsHandled := false;
OnBeforeTransferReservationEntry(OldReservEntry, PurchLine, ItemJnlLine, IsHandled);
if not IsHandled then
//A109F0006.event.ne
TransferQty := CreateReservEntry.TransferReservEntry(DATABASE::"Item Journal Line",
ItemJnlLine."Entry Type".AsInteger(), ItemJnlLine."Journal Template Name",
ItemJnlLine."Journal Batch Name", 0, ItemJnlLine."Line No.",
ItemJnlLine."Qty. per Unit of Measure", OldReservEntry, TransferQty);
until (ReservEngineMgt.NEXTRecord(OldReservEntry) = 0) or (TransferQty = 0);
CheckApplToItemEntry := CheckApplToItemEntry and NotFullyReserved;
end;
exit(TransferQty);
end;
Event Signature:
[IntegrationEvent(false, false)]
local procedure OnBeforeTransferReservationEntry(var ReservationEntry: Record "Reservation Entry"; PurchaseLine: Record "Purchase Line"; ItemJournalLine: Record "Item Journal Line"; var IsHandled: Boolean);
begin
//A109F0006.Event
end;
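For context, a sketch of how an extension could subscribe to the requested event once published (the codeunit number and the business rule shown are hypothetical):

```al
codeunit 50100 "Purch. Reserve Subscriber"
{
    // Hypothetical subscriber: skip the standard reservation-entry transfer
    // for purchase lines matching a custom condition and handle it ourselves.
    [EventSubscriber(ObjectType::Codeunit, Codeunit::"Purch. Line-Reserve", 'OnBeforeTransferReservationEntry', '', false, false)]
    local procedure HandleOnBeforeTransferReservationEntry(var ReservationEntry: Record "Reservation Entry"; PurchaseLine: Record "Purchase Line"; ItemJournalLine: Record "Item Journal Line"; var IsHandled: Boolean)
    begin
        if PurchaseLine."Drop Shipment" then
            IsHandled := true; // custom transfer logic would run here instead
    end;
}
```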
Thanks for reporting this. We agree, and we’ll publish a fix asap, either in an update for the current version or in the next major release. Please do not reply to this, as we do not monitor closed issues. If you have follow-up questions or requests, please create a new issue where you reference this one.
| gharchive/issue | 2021-07-14T12:55:12 | 2025-04-01T06:44:57.096921 | {
"authors": [
"Duizy05",
"JesperSchulz"
],
"repo": "microsoft/ALAppExtensions",
"url": "https://github.com/microsoft/ALAppExtensions/issues/13237",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
954588635 | Replace options fields by enum in SII fields
For the Canary Islands SII, we need new options in the "Invoice Type" and "Special Schema Code" fields in all involved tables. I suggest replacing the type "Option" with "enum".
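For illustration, a minimal before/after sketch of the suggested change (field number, object names, and values here are hypothetical; the real SII fields differ):

```al
// Before: a closed Option field that partner extensions cannot extend
field(10750; "Invoice Type"; Option)
{
    OptionMembers = "F1 Invoice","F2 Simplified Invoice";
}

// After: an extensible enum, so new values (e.g. for the Canary Islands) can be added
field(10750; "Invoice Type"; Enum "SII Invoice Type")
{
}

// The enum itself:
enum 10750 "SII Invoice Type"
{
    Extensible = true;

    value(0; "F1 Invoice") { Caption = 'F1 Invoice'; }
    value(1; "F2 Simplified Invoice") { Caption = 'F2 Simplified Invoice'; }
}
```

A partner could then add values via an enumextension without touching the base tables.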
Here is list of available SII related enums:
Which version do you use? Seems we have everything you may need for your purpose?
I am using the latest published Oncloud version 18.3.27240.28196.
What I'm seeing is that in tables 36, 38, 112, ... the fields are still of type Option. I need them to be defined with their corresponding enums.
Please be aware that we do not implement new enums in the production version. We only do it for the next major release (=19.0).
I'll keep it in mind. Thanks for everything
| gharchive/issue | 2021-07-28T08:02:12 | 2025-04-01T06:44:57.100135 | {
"authors": [
"AlexanderYakunin",
"operrui"
],
"repo": "microsoft/ALAppExtensions",
"url": "https://github.com/microsoft/ALAppExtensions/issues/13596",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1010574198 | [Request for Protected vars] page 5703 "Location Card"
protected var
//A109F0006.Event18.03 protected vars
[InDataSet]
ReceiptBinCodeEnable: Boolean;
[InDataSet]
ShipmentBinCodeEnable: Boolean;
[InDataSet]
UseADCSEnable: Boolean;
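Once these variables are protected, a page extension could adjust them directly; for example (extension id/name and the rule shown are hypothetical):

```al
pageextension 50110 "Location Card Ext" extends "Location Card"
{
    trigger OnAfterGetCurrRecord()
    begin
        // Hypothetical rule: keep the receipt bin code editable regardless of setup
        ReceiptBinCodeEnable := true;
    end;
}
```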
Thanks for reporting this. We agree, and we’ll publish a fix asap, either in an update for the current version or in the next major release. Please do not reply to this, as we do not monitor closed issues. If you have follow-up questions or requests, please create a new issue where you reference this one.
| gharchive/issue | 2021-09-29T07:05:46 | 2025-04-01T06:44:57.101581 | {
"authors": [
"Duizy1971",
"JesperSchulz"
],
"repo": "microsoft/ALAppExtensions",
"url": "https://github.com/microsoft/ALAppExtensions/issues/14508",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1064644183 | [Event Request] Application “Tests-Marketing” Codeunit 134626 "Person and Company Contacts"
[Event Request]Application “Tests-Marketing”, 134626 "Person and Company Contacts"
Could you please insert the event OnTestInitialize at the beginning of the procedure Initialize?
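A sketch of what the requested event could look like in the test codeunit (the existing Initialize body is elided):

```al
// In codeunit 134626 "Person and Company Contacts":
local procedure Initialize()
begin
    OnTestInitialize(); // new event, raised before any other initialization
    // ...existing initialization code...
end;

[IntegrationEvent(false, false)]
local procedure OnTestInitialize()
begin
end;
```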
Thanks for reporting this. We agree, and we’ll publish a fix asap, either in an update for the current version or in the next major release. Please do not reply to this, as we do not monitor closed issues. If you have follow-up questions or requests, please create a new issue where you reference this one.
| gharchive/issue | 2021-11-26T15:51:28 | 2025-04-01T06:44:57.103529 | {
"authors": [
"JesperSchulz",
"ttaeufer"
],
"repo": "microsoft/ALAppExtensions",
"url": "https://github.com/microsoft/ALAppExtensions/issues/15358",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1195775821 | [Event Request] codeunit 12153 SubcontractingPricesMgt - OnRoutingPricelistCostAfterFilters
In codeunit 12153 SubcontractingPricesMgt please add the following event:
procedure RoutingPricelistCost(var InSubcPrices: Record "Subcontractor Prices"; WorkCenter: Record "Work Center"; var DirUnitCost: Decimal; var IndirCostPct: Decimal; var OvhdRate: Decimal; var UnitCost: Decimal; var UnitCostCalculation: Option Time,Unit; QtyUoM: Decimal; ProdQtyPerUom: Decimal; QtyBase: Decimal)
begin
PricelistQtyPerUOM := 0;
PricelistQty := 0;
PricelistCost := 0;
DirectCost := 0;
PricelistUOM := '';
UnitCostCalculation := WorkCenter."Unit Cost Calculation";
IndirCostPct := WorkCenter."Indirect Cost %";
OvhdRate := WorkCenter."Overhead Rate";
if WorkCenter."Specific Unit Cost" then
DirUnitCost := (UnitCost - OvhdRate) / (1 + IndirCostPct / 100)
else begin
DirUnitCost := WorkCenter."Direct Unit Cost";
UnitCost := WorkCenter."Unit Cost";
end;
if InSubcPrices."Start Date" = 0D then
InSubcPrices."Start Date" := WorkDate;
SubcontractorPrices.Reset();
SubcontractorPrices.SetRange("Vendor No.", InSubcPrices."Vendor No.");
SubcontractorPrices.SetFilter("Work Center No.", '%1|%2', InSubcPrices."Work Center No.", '');
SubcontractorPrices.SetRange("Standard Task Code", InSubcPrices."Standard Task Code");
SubcontractorPrices.SetFilter("Item No.", '%1|%2', InSubcPrices."Item No.", '');
SubcontractorPrices.SetRange("Start Date", 0D, InSubcPrices."Start Date");
SubcontractorPrices.SetFilter("End Date", '>=%1|%2', InSubcPrices."Start Date", 0D);
// New event
OnRoutingPricelistCostAfterFilters(SubcontractorPrices, InSubcPrices, WorkCenter);
// New event
if SubcontractorPrices.FindLast() then begin
if SubcontractorPrices."Unit of Measure Code" = InSubcPrices."Unit of Measure Code" then begin
PricelistQtyPerUOM := ProdQtyPerUom;
PricelistQty := QtyUoM;
PricelistUOM := SubcontractorPrices."Unit of Measure Code";
end else
GetUOMPrice(InSubcPrices."Item No.", QtyBase);
...
[IntegrationEvent(false, false)]
local procedure OnRoutingPricelistCostAfterFilters(var SubcontractorPrices: Record "Subcontractor Prices"; var InSubcPrices: Record "Subcontractor Prices"; WorkCenter: Record "Work Center")
begin
end;
Thank you
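For context, a hypothetical subscriber consuming the requested event to add an extra filter (the custom field referenced is an assumption, e.g. one added by an extension):

```al
[EventSubscriber(ObjectType::Codeunit, Codeunit::SubcontractingPricesMgt, 'OnRoutingPricelistCostAfterFilters', '', false, false)]
local procedure HandleOnRoutingPricelistCostAfterFilters(var SubcontractorPrices: Record "Subcontractor Prices"; var InSubcPrices: Record "Subcontractor Prices"; WorkCenter: Record "Work Center")
begin
    // Hypothetical: restrict the price lookup by a custom field added in an extension
    SubcontractorPrices.SetRange("My Price Group Code", InSubcPrices."My Price Group Code");
end;
```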
Thanks for reporting this. We agree, and we’ll publish a fix asap, either in an update for the current version or in the next major release. Please do not reply to this, as we do not monitor closed issues. If you have follow-up questions or requests, please create a new issue where you reference this one.
| gharchive/issue | 2022-04-07T09:31:44 | 2025-04-01T06:44:57.106117 | {
"authors": [
"JesperSchulz",
"Vittorio-Panigoni"
],
"repo": "microsoft/ALAppExtensions",
"url": "https://github.com/microsoft/ALAppExtensions/issues/17130",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1253440175 | [Event Request] Codeunit 113 Vend. Entry-Edit - procedure OnRun - OnBeforeValidateExportedPaymentFile
Please add an Event OnBeforeValidateExportedPaymentFile in OnRun in Codeunit 113 Vend. Entry-Edit
trigger OnRun()
begin
VendLedgEntry := Rec;
VendLedgEntry.LockTable();
VendLedgEntry.Find();
VendLedgEntry."On Hold" := "On Hold";
if VendLedgEntry.Open then begin
VendLedgEntry."Due Date" := "Due Date";
DtldVendLedgEntry.SetCurrentKey("Vendor Ledger Entry No.");
DtldVendLedgEntry.SetRange("Vendor Ledger Entry No.", VendLedgEntry."Entry No.");
OnRunOnBeforeDtldVendLedgEntryModifyAll(Rec, DtldVendLedgEntry, VendLedgEntry);
DtldVendLedgEntry.ModifyAll("Initial Entry Due Date", "Due Date");
VendLedgEntry."Pmt. Discount Date" := "Pmt. Discount Date";
VendLedgEntry."Applies-to ID" := "Applies-to ID";
VendLedgEntry.Validate("Payment Method Code", "Payment Method Code");
VendLedgEntry.Validate("Remaining Pmt. Disc. Possible", "Remaining Pmt. Disc. Possible");
VendLedgEntry."Pmt. Disc. Tolerance Date" := "Pmt. Disc. Tolerance Date";
VendLedgEntry.Validate("Max. Payment Tolerance", "Max. Payment Tolerance");
VendLedgEntry.Validate("Accepted Payment Tolerance", "Accepted Payment Tolerance");
VendLedgEntry.Validate("Accepted Pmt. Disc. Tolerance", "Accepted Pmt. Disc. Tolerance");
VendLedgEntry.Validate("Amount to Apply", "Amount to Apply");
VendLedgEntry.Validate("Applying Entry", "Applying Entry");
VendLedgEntry.Validate("Applies-to Ext. Doc. No.", "Applies-to Ext. Doc. No.");
VendLedgEntry.Validate("Message to Recipient", "Message to Recipient");
VendLedgEntry.Validate("Recipient Bank Account", "Recipient Bank Account");
end;
IsHandled := false;
OnBeforeValidateExportedPaymentFile(VendLedgEntry, Rec, IsHandled); <-- NEW EVENT
if not IsHandled then
    VendLedgEntry.Validate("Exported to Payment File", "Exported to Payment File");
VendLedgEntry.Validate("Creditor No.", "Creditor No.");
VendLedgEntry.Validate("Payment Reference", "Payment Reference");
OnBeforeVendLedgEntryModify(VendLedgEntry, Rec);
VendLedgEntry.TestField("Entry No.", "Entry No.");
VendLedgEntry.Modify();
OnRunOnAfterVendLedgEntryMofidy(VendLedgEntry);
Rec := VendLedgEntry;
end;
[IntegrationEvent(false, false)]
local procedure OnBeforeValidateExportedPaymentFile(var VendLedgEntry: Record "Vendor Ledger Entry"; FromVendLedgEntry: Record "Vendor Ledger Entry"; var IsHandled: Boolean)
begin
end;
What are you trying to achieve with this? There is no validation trigger for this field. If you want to rewrite this value, there are
OnBeforeVendLedgEntryModify and OnRunOnAfterVendLedgEntryMofidy; can't you use those instead?
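For reference, a sketch of using the suggested existing event instead (the override rule shown is purely hypothetical):

```al
[EventSubscriber(ObjectType::Codeunit, Codeunit::"Vend. Entry-Edit", 'OnBeforeVendLedgEntryModify', '', false, false)]
local procedure HandleOnBeforeVendLedgEntryModify(var VendLedgEntry: Record "Vendor Ledger Entry"; FromVendLedgEntry: Record "Vendor Ledger Entry")
begin
    // Hypothetical: rewrite the field after the standard validation has run
    VendLedgEntry."Exported to Payment File" := FromVendLedgEntry."Exported to Payment File";
end;
```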
Thanks for the suggestion
| gharchive/issue | 2022-05-31T07:22:42 | 2025-04-01T06:44:57.109487 | {
"authors": [
"MH1612",
"Stepanenko-Vadim"
],
"repo": "microsoft/ALAppExtensions",
"url": "https://github.com/microsoft/ALAppExtensions/issues/18373",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1426804131 | [Event Request] Table 99000754 Work Center - procedure GetBinCodeForFlushingMethod - OnGetBinCodeForFlushingMethodOnElse
Please add an Event OnGetBinCodeForFlushingMethodOnElse in Procedure GetBinCodeForFlushingMethod in Table 99000754 Work Center
[Event Request] Table 99000754 Work Center - procedure OnGetBinCodeForFlushingMethodOnElse
var
Result: Code[20]; // <-- ADDED
begin
if not UseFlushingMethod then
exit("From-Production Bin Code");
case FlushingMethod of
FlushingMethod::Manual,
FlushingMethod::"Pick + Forward",
FlushingMethod::"Pick + Backward":
exit("To-Production Bin Code");
FlushingMethod::Forward,
FlushingMethod::Backward:
exit("Open Shop Floor Bin Code");
else
begin
OnGetBinCodeForFlushingMethodOnElse(Rec, FlushingMethod, Result); // <-- NEW EVENT
exit(Result);
end;
end;
end;
[IntegrationEvent(false,false)]
local procedure OnGetBinCodeForFlushingMethodOnElse(WorkCenter: Record "Work Center"; FlushingMethod: Enum "Flushing Method"; var Result: Code[20])
begin
end;
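For context, a hypothetical subscriber for the requested event, covering flushing methods added via enumextension (the fallback bin shown is an assumption):

```al
[EventSubscriber(ObjectType::Table, Database::"Work Center", 'OnGetBinCodeForFlushingMethodOnElse', '', false, false)]
local procedure HandleOnGetBinCodeForFlushingMethodOnElse(WorkCenter: Record "Work Center"; FlushingMethod: Enum "Flushing Method"; var Result: Code[20])
begin
    // Hypothetical: custom flushing methods fall back to the open shop floor bin
    Result := WorkCenter."Open Shop Floor Bin Code";
end;
```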
Thanks for reporting this. We agree, and we’ll publish a fix asap, either in an update for the current version or in the next major release. Please do not reply to this, as we do not monitor closed issues. If you have follow-up questions or requests, please create a new issue where you reference this one.
| gharchive/issue | 2022-10-28T07:17:20 | 2025-04-01T06:44:57.111496 | {
"authors": [
"AlexanderYakunin",
"MH1612"
],
"repo": "microsoft/ALAppExtensions",
"url": "https://github.com/microsoft/ALAppExtensions/issues/20775",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1625432061 | Table 5077 Segment Line - new function GetComments
Please add new function in Segment Line table.
procedure GetComments(var InterLogEntryCommentLine: Record "Inter. Log Entry Comment Line" temporary)
begin
InterLogEntryCommentLine.DeleteAll();
if TempInterLogEntryCommentLine.FindSet() then
repeat
InterLogEntryCommentLine := TempInterLogEntryCommentLine;
InterLogEntryCommentLine.Insert();
until TempInterLogEntryCommentLine.Next() = 0;
end;
Thank you!
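For context, a sketch of how the requested procedure could be consumed (the caller's procedure name and the use of the Comment field are assumptions):

```al
procedure ShowSegmentLineComments(var SegmentLine: Record "Segment Line")
var
    TempInterLogEntryCommentLine: Record "Inter. Log Entry Comment Line" temporary;
begin
    // Copy the segment line's buffered comments into a local temporary record
    SegmentLine.GetComments(TempInterLogEntryCommentLine);
    if TempInterLogEntryCommentLine.FindSet() then
        repeat
            Message(TempInterLogEntryCommentLine.Comment);
        until TempInterLogEntryCommentLine.Next() = 0;
end;
```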
@rvicenteADV could you please elaborate why this procedure should be added and would you be willing to contribute this enhancement (including a test) to the BaseApp by yourself?
@JesperSchulz is this some improvement that could be added via the BaseApp Contribution Pilot?
Hi,
I need to assign TempInterLogEntryCommentLine; it might also be valid to change it to protected. Let me know if I need to do a test.
Thank you.
| gharchive/issue | 2023-03-15T12:27:51 | 2025-04-01T06:44:57.114096 | {
"authors": [
"pri-kise",
"rvicenteADV"
],
"repo": "microsoft/ALAppExtensions",
"url": "https://github.com/microsoft/ALAppExtensions/issues/22576",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2082087078 | [Bug]: Codeunit 9852 Assignment of Code[30] to Code[20] Permission Conflicts Overview
Describe the issue
Table 5555 "Permission Conflicts Overview" has field: field(1; PermissionSetID; Code[20])
Table 2000000250 "Metadata Permission Set" has field: "Role ID"; Code[30].
In Codeunit 9852 "Effective Permissions Mgt." field with Code[30] is assigned to field with Code[20] in line 184.
This leads to an error e.g. by opening page "Permission Conflicts Overview".
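The failure mode can be sketched outside AL (in AL the usual fix is CopyStr with MaxStrLen). Here is a minimal, illustrative Python sketch — the helper is hypothetical, only the field lengths come from the issue:

```python
# Illustrative only: assigning a 30-character "Role ID" (Code[30]) into a
# 20-character PermissionSetID field (Code[20]) must either fail or truncate.
def assign_code(value: str, max_len: int, truncate: bool = False) -> str:
    if len(value) > max_len:
        if not truncate:
            # Mirrors the runtime error raised when the page is opened
            raise ValueError(f"length {len(value)} exceeds Code[{max_len}]")
        return value[:max_len]  # truncating fix, analogous to CopyStr(.., 1, MaxStrLen(..))
    return value

role_id = "A" * 30  # a permission set ID longer than 20 characters
print(assign_code(role_id, 20, truncate=True))  # first 20 characters only
```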
Expected behavior
Page should be opened, even if the field is filled with more than 20 characters.
Steps to reproduce
Open Page 9802 "Permission Sets", use Action "Show Permission Conflicts Overview".
Additional context
@KuMs2019 do you have an example for an permissionset with this error?
Hello!
I think, when opening the page, all permission sets are checked and handled.
The first error is in this line with following variables:
I created a support request, since we have the same issue for our BC SaaS customer that use BC23.2
@KuMs2019 if a customer is impacted then you should create a support request via the common support channels.
This repository is only for extensibility requests, not for bug reports.
Ok, sorry, will do my best next time.
@pri-kise, thanks for creating a support request. The repository is indeed only for Business Central extensibility requests or code contributions towards the 1st party Business Central apps, hence closing issue.
| gharchive/issue | 2024-01-15T13:56:11 | 2025-04-01T06:44:57.119829 | {
"authors": [
"JesperSchulz",
"KuMs2019",
"pri-kise"
],
"repo": "microsoft/ALAppExtensions",
"url": "https://github.com/microsoft/ALAppExtensions/issues/25712",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
449155045 | [EventRequest] Codeunit 1535 Approvals Mgmt. - Approval Administrators
We would appreciate a publisher that we can subscribe to so we have other means of deciding who is an approval administrator (team leaders).
We would like the 'CheckUserAsApprovalAdministrator' to be modified:
(Also note we added ApprovalEntry as parameter to the function).
LOCAL PROCEDURE CheckUserAsApprovalAdministrator@116(ApprovalEntry@11095585 : Record 454);
VAR
UserSetup@1000 : Record 91;
IsHandled@11095586 : Boolean;
BEGIN
///// START
OnCheckUserAsApprovalAdministrator(ApprovalEntry, IsHandled);
IF IsHandled THEN
BEGIN
EXIT;
END;
///// END
UserSetup.GET(USERID);
UserSetup.TESTFIELD("Approval Administrator");
END;
Function 'ApproveSelectedApprovalRequest' add approval entry as parameter.
LOCAL PROCEDURE ApproveSelectedApprovalRequest@5(VAR ApprovalEntry@1000 : Record 454);
BEGIN
IF ApprovalEntry.Status <> ApprovalEntry.Status::Open THEN
ERROR(ApproveOnlyOpenRequestsErr);
IF ApprovalEntry."Approver ID" <> USERID THEN
///// START
//CheckUserAsApprovalAdministrator;
CheckUserAsApprovalAdministrator(ApprovalEntry);
///// END
ApprovalEntry.VALIDATE(Status,ApprovalEntry.Status::Approved);
ApprovalEntry.MODIFY(TRUE);
OnApproveApprovalRequest(ApprovalEntry);
END;
Function 'RejectSelectedApprovalRequest' add approval entry as parameter.
LOCAL PROCEDURE RejectSelectedApprovalRequest@2(VAR ApprovalEntry@1000 : Record 454);
BEGIN
IF ApprovalEntry.Status <> ApprovalEntry.Status::Open THEN
ERROR(RejectOnlyOpenRequestsErr);
IF ApprovalEntry."Approver ID" <> USERID THEN
///// START
//CheckUserAsApprovalAdministrator;
CheckUserAsApprovalAdministrator(ApprovalEntry);
///// END
OnRejectApprovalRequest(ApprovalEntry);
ApprovalEntry.GET(ApprovalEntry."Entry No.");
ApprovalEntry.VALIDATE(Status,ApprovalEntry.Status::Rejected);
ApprovalEntry.MODIFY(TRUE);
END;
And the new publisher:
[Integration]
LOCAL PROCEDURE OnCheckUserAsApprovalAdministrator@11095589(ApprovalEntry@11095585 : Record 454;VAR IsHandled@11095586 : Boolean);
BEGIN
END;
This subscriber was introduced in BC14, however missing in BC16. Will create a new request.
| gharchive/issue | 2019-05-28T09:15:35 | 2025-04-01T06:44:57.122998 | {
"authors": [
"lv-janpieter"
],
"repo": "microsoft/ALAppExtensions",
"url": "https://github.com/microsoft/ALAppExtensions/issues/2605",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
456932680 | [Request for External] Codeunit 1201-"Process Data Exch." SetField method
Hi,
We need to use "SetField" method in "codeunit 1201" from our extension. It's not local and not external as well. Could you please make it external?
Thanks in advance!
Hi muratkarabek. I'm afraid we cannot make this function External because it is declared as unsafe due to insertion into a system table. It looks like you should find another way.
Hi again. After taking another look at the system table used in the function you asked to be made external, I reconsidered: it is safe and will be exposed as external.
| gharchive/issue | 2019-06-17T13:23:50 | 2025-04-01T06:44:57.124773 | {
"authors": [
"RomanZanizdra",
"muratkarabek"
],
"repo": "microsoft/ALAppExtensions",
"url": "https://github.com/microsoft/ALAppExtensions/issues/2860",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
585144999 | [Event Request]Application “Tests-Data Exchange”, 139154 "Incoming Doc. To Data Exch.UT"
[Event Request]Application “Tests-Data Exchange”, 139154 "Incoming Doc. To Data Exch.UT"
Could you please insert the OnTestInitialize Event at the beginning of Initialize procedure:
Thanks for reporting this. We agree, and we’ll publish a fix asap, either in an update for the current version or in the next major release. We will update this issue with information about availability.
| gharchive/issue | 2020-03-20T15:27:07 | 2025-04-01T06:44:57.126722 | {
"authors": [
"bc-ghost",
"ttaeufer"
],
"repo": "microsoft/ALAppExtensions",
"url": "https://github.com/microsoft/ALAppExtensions/issues/6517",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
589404355 | [Event Request] Codeunit 99000831 Reservation Engine Mgt. - function Function InitRecordset2
Dear,
I request for below Integration
Definition of Integration as below
thanks & regards
Ashwini Tripathi
Thanks for reporting this. We agree, and we’ll publish a fix asap, either in an update for the current version or in the next major release. We will update this issue with information about availability.
| gharchive/issue | 2020-03-27T20:46:20 | 2025-04-01T06:44:57.128874 | {
"authors": [
"ashwinitripathi",
"bc-ghost"
],
"repo": "microsoft/ALAppExtensions",
"url": "https://github.com/microsoft/ALAppExtensions/issues/6629",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
593977029 | Gen. Journal Line "Recurring Method" should be Enum
Hello,
Our customer needs to extend the field "Recurring Method" in "Gen. Journal Line", this is critical.
But this field is still an Option; could you please transform the field into an Enum?
Versions of AL Language: 5.0.254558
Business Central: Version: CA Business Central 16.0 (Platform 16.0.11233.12078 + Application 16.0.11240.11946)
Thanks for reporting this. We agree, and we’ll publish a fix asap, either in an update for the current version or in the next major release. We will update this issue with information about availability.
| gharchive/issue | 2020-04-04T18:05:22 | 2025-04-01T06:44:57.131278 | {
"authors": [
"Drakonian",
"bc-ghost"
],
"repo": "microsoft/ALAppExtensions",
"url": "https://github.com/microsoft/ALAppExtensions/issues/6719",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
613355707 | DESCryptoServiceProvider
Dear Microsoft Team,
Can DESCryptoServiceProvider be added as module?
I see digipoort (my other Pull Request) got mixed up in this, even though I used a separate branch. I will try to figure out how I can ignore those changes for this Pull Request...
Hi Angela! Sorry about the delay on looking into this PR! I will get to it within the next week!
Sorry, review has gotten delayed a little more. But we'll be on it soon!
Hi @PredragMaricic , can you review it again? Thanks in advance.
| gharchive/pull-request | 2020-05-06T14:16:34 | 2025-04-01T06:44:57.133006 | {
"authors": [
"JesperSchulz",
"angela2389"
],
"repo": "microsoft/ALAppExtensions",
"url": "https://github.com/microsoft/ALAppExtensions/pull/6964",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
442460446 | [.NET] GIF support
Automatically generated task. Any updates to the title and body will be overwritten. Do NOT modify the labels.
.NET implementation of #2871
WPF renderer is confirmed to NOT support GIFs, as tested by myself using the WPF visualizer
| gharchive/issue | 2019-05-09T22:21:43 | 2025-04-01T06:44:57.134424 | {
"authors": [
"andrewleader"
],
"repo": "microsoft/AdaptiveCards",
"url": "https://github.com/microsoft/AdaptiveCards/issues/2874",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
692343211 | Teams: Media element support
Currently the Media element is not supported in Microsoft Teams. This work tracks adding that support.
Link to feature: https://adaptivecards.productboard.com/feature-board/planning/features/5300350
The Adaptive Cards SDK supports this. Teams is currently working on supporting media elements.
| gharchive/issue | 2020-09-03T20:51:54 | 2025-04-01T06:44:57.135958 | {
"authors": [
"productboard-ac",
"rahulamlekar"
],
"repo": "microsoft/AdaptiveCards",
"url": "https://github.com/microsoft/AdaptiveCards/issues/4732",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2558882319 | Remove Super assessment from CIEM workbook
PR Checklist
[ ] Explain your changes, so people looking at the PR know what and why, the code changes are the how.
[ ] Validate your changes using one or more of the testing methods.
If adding or updating templates:
[ ] post a screenshot of templates and/or gallery changes
[ ] ensure your template has a corresponding gallery entry in the gallery folder
[ ] If you are adding a new template, add your team and template/gallery file(s) to the CODEOWNERS file. CODEOWNERS entries should be teams, not individuals
[ ] ensure all steps have meaningful names
[ ] ensure all parameters and grid columns have display names set so they can be localized
[ ] ensure that parameters id values are unique or they will fail PR validation (parameter ids are used for localization)
[ ] ensure that steps names are unique or they will fail PR validation (step names are used for localization)
[ ] grep /subscription/ and ensure that your parameters don't have any hardcoded resourceIds or they will fail PR validation
[ ] remove fallbackResourceIds and fromTemplateId fields from your template workbook or they will fail PR validation
@microsoft-github-policy-service agree [company="Microsoft"]
@microsoft-github-policy-service agree
@microsoft-github-policy-service agree company="Microsoft"
@microsoft-github-policy-service agree
@microsoft-github-policy-service agree company=Microsoft
| gharchive/pull-request | 2024-10-01T10:57:57 | 2025-04-01T06:44:57.143719 | {
"authors": [
"MorBriskerMs"
],
"repo": "microsoft/Application-Insights-Workbooks",
"url": "https://github.com/microsoft/Application-Insights-Workbooks/pull/2791",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
787146495 | Adding calibration and registration tools to examples.
Fixes
Description of the changes:
Adding python code leveraging OpenCV to calibrate any given camera.
Adding python code to perform registration (extrinsics) between two cameras, also reports registration error.
Adding example data for users.
Adding a README.md to describe how to use the tools
Adding three different Charuco templates for users, with example data using the mid size one.
Output a json with calibration and registration data between cameras.
Before submitting a Pull Request:
[x] I reviewed CONTRIBUTING.md
[x] I built my changes locally
[x] I ran the unit tests
[x] I ran the functional tests with a device
[x] I ran the performance tests with a device
I tested changes on:
[x] Windows
[x] Linux
hi, great to see this, a much needed function!
I wonder what's the accuracy like? Is the calibrated parameters close to the factory calibration?
hi, great to see this, a much needed function!
I wonder what's the accuracy like? Is the calibrated parameters close to the factory calibration?
@seigeweapon The accuracy of the calibration from this tool is as good, if not better than the factory calibration. You can use this tool to calibrate a device, and then use the tools in the examples/depth_eval_tools folder to evaluate the calibration quality for the depth sensor.
hi, great to see this, a much needed function!
I wonder what's the accuracy like? Is the calibrated parameters close to the factory calibration?
Yes, this calibration is as good as factory.
/azp run
| gharchive/pull-request | 2021-01-15T19:36:57 | 2025-04-01T06:44:57.155107 | {
"authors": [
"jaygullapalli",
"not-the-programmer",
"resuther-msft",
"seigeweapon"
],
"repo": "microsoft/Azure-Kinect-Sensor-SDK",
"url": "https://github.com/microsoft/Azure-Kinect-Sensor-SDK/pull/1475",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
2565147745 | [Juniper] - [QFX5120] - 22H2/23H2
Ensure you are utilizing the latest version of the tool from the release page
Please fill in the following:
Organization (As you want it to appear on the website): Juniper Networks
Device series: QFX5120 series
Speeds supported (GbE): 10, 25, 40, 100
Minimum firmware version requirement: 23.4R2.13
Link to device datasheet: https://www.juniper.net/content/dam/www/assets/datasheets/us/en/switches/qfx5120-ethernet-switch-datasheet.pdf
After the tool execution successfully, there will be five files:
PDF File
YAML File
Log File
PCAP file
Switch Configurations (Please Remove Sensitive Data!)
Please zip and upload these here to submit for review!
QFX5120_Series.zip
Submission approved
#sign-off
| gharchive/issue | 2024-10-03T23:24:14 | 2025-04-01T06:44:57.159581 | {
"authors": [
"DC-TME",
"bkablawi"
],
"repo": "microsoft/AzureStackHCI-Network-Switch-Validation",
"url": "https://github.com/microsoft/AzureStackHCI-Network-Switch-Validation/issues/57",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
488109796 | Unable to sort all the disks in sequence on each columns
Storage Explorer Version: 1.10.0
Platform/OS: Windows 10
Architecture: x64
Commit: f06deb7a6
Regression From: Not a regression
Steps to reproduce:
Expand 'Disks' node -> Open one resource group.
Create multiple disks in it -> Click any column name like 'Disk Name'.
Check whether all disks can be sorted in sequence or not.
Expect Experience:
All disks can be sorted by the clicked column.
Actual Experience:
Unable to sort all the disks in sequence.
This issue has been fixed on the build master/20190917.7. So we close it.
| gharchive/issue | 2019-09-02T10:05:51 | 2025-04-01T06:44:57.164152 | {
"authors": [
"v-xianya",
"v-xuanzh"
],
"repo": "microsoft/AzureStorageExplorer",
"url": "https://github.com/microsoft/AzureStorageExplorer/issues/1763",
"license": "CC-BY-4.0",
"license_type": "permissive",
"license_source": "github-api"
} |
523421384 | Can't add one new Azure account when enabling MSAL on Linux
Storage Explorer Version: 1.11.0
Build: 20191115.3
Branch: hotfix/1.11.1
Platform/OS: Linux Ubuntu 18.04
Architecture: x64
Regression From: Not a regression
Steps to reproduce:
Launch Storage Explorer -> Sign in.
Open Settings panel -> Application -> Sign-in -> Enable MSAL -> Restart Storage Explorer.
Make sure an Azure account already exists in sign-in browser -> Then add another Azure account to sign in.
Check the result.
Expect Experience:
Pop up a password box.
Actual Experience:
No password box pops up.
Directly complete authenticate.
The Azure account that already exists in sign-in browser is added to Storage Explorer.
More Info:
This issue also reproduces on 'use device code flow sign-in'.
@v-limu , can you explain to me more about what you saw that was unexpected?
If you have already signed into your browser, then you won't need to enter your password again. Additionally, if you'd like to sign in with an account not already signed into your browser, then you can choose "Use another account":
Hi @MRayermannMSFT
The unexpected result occurs when using another account to sign into browser.
Steps to reproduce:
Make sure an Azure account already exists in sign-in browser .
Use another account -> Type a new Azure account name.
Check the result.
Expect Experience:
Authenticate for the new account.
Actual Experience:
No authentication for the new account.
@v-limu , are both accounts MSAs? Or in other words, are both accounts personal accounts or are they work accounts?
This issue doesn't reproduce on Linux using build 20191119.3. So we close it.
| gharchive/issue | 2019-11-15T11:38:53 | 2025-04-01T06:44:57.172995 | {
"authors": [
"MRayermannMSFT",
"v-limu"
],
"repo": "microsoft/AzureStorageExplorer",
"url": "https://github.com/microsoft/AzureStorageExplorer/issues/2265",
"license": "CC-BY-4.0",
"license_type": "permissive",
"license_source": "github-api"
} |
617796777 | Copying Azure blob container folders with colon in name results in name with %3A instead
In an Azure container I have a folder called "sr:season:66441". If I drag and drop that folder to my local Windows machine, that folder ends up with this name: sr%3Aseason%3A66441
This is because ':' is not a valid char for folder or file names on Windows. So it's been encoded.
I know. Couple of questions:
Can't we replace it with a better character?
Is this still a limitation if I run the app on Linux?
@zmarty %3A is the URI encoding value for :. Currently URI encoding is the scheme we use for encoding chars which are invalid for your OS. For Linux, : won't be encoded. You'll see it appear as normal, since on Linux it is not invalid to have it in file name/path.
We will not be changing away from using URI encoding.
Does that all make sense?
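The encoding behavior the maintainer describes can be reproduced with Python's standard library (illustrative only; this is not Storage Explorer's actual implementation):

```python
from urllib.parse import quote, unquote

name = "sr:season:66441"
encoded = quote(name, safe="")   # percent-encode every reserved character
print(encoded)                   # sr%3Aseason%3A66441 — what appears as the folder name on Windows
print(unquote(encoded) == name)  # True — decoding recovers the original blob folder name
```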
Closing due to lack of response.
| gharchive/issue | 2020-05-13T22:33:02 | 2025-04-01T06:44:57.176228 | {
"authors": [
"MRayermannMSFT",
"jinglouMSFT",
"zmarty"
],
"repo": "microsoft/AzureStorageExplorer",
"url": "https://github.com/microsoft/AzureStorageExplorer/issues/2971",
"license": "CC-BY-4.0",
"license_type": "permissive",
"license_source": "github-api"
} |
1159329113 | The context view in the split tab is inconsistent with its original tab if the current context view of the original tab is not the 'Active blobs (default)'
Storage Explorer Version: 1.24.0-dev
Build Number: 20220303.7
Branch: main
Platform/OS: Windows 10/Linux Ubuntu 20.04/MacOS Monterey 12.1 (Apple M1 Pro)
Architecture ia32\x64
How Found: From running test cases
Regression From: Previous release (1.22.1)
Steps to Reproduce
Expand one non-ADLS Gen2 storage account -> Blob Containers.
Open one blob container -> Change the context view to 'Active and soft deleted blobs'.
Split the tab -> Check whether the context view is consistent between the two tabs.
Expected Experience
The context view is consistent between the two tabs.
Actual Experience
The context view is inconsistent between the two tabs.
Update:
The items in the editor keep consistent.
Verified this issue on the build main/20220315.1. Fixed.
| gharchive/issue | 2022-03-04T07:22:18 | 2025-04-01T06:44:57.181715 | {
"authors": [
"v-xianya"
],
"repo": "microsoft/AzureStorageExplorer",
"url": "https://github.com/microsoft/AzureStorageExplorer/issues/5502",
"license": "CC-BY-4.0",
"license_type": "permissive",
"license_source": "github-api"
} |
2001627431 | Fail to restore file share snapshot with error 'Unexpected Cancel'
Storage Explorer Version: 1.33.0-dev (96)
Build Number: 20231119.2
Branch: main
Platform/OS: Windows 10/Linux Ubuntu 20.04/MacOS Sonoma 14.0(Apple M1 Pro)
Architecture: x64/x64/arm64
How Found: From running test cases
Regression From: Previous release (1.32.1)
Steps to Reproduce
Expand one storage account -> File Shares.
Open one file share -> Upload a file.
Right click the file share -> Click 'Create Snapshot'.
Switch to snapshots view -> Open the file share snapshot.
Select the file -> Click 'Restore Snapshot'.
Check whether succeeds to restore snapshot.
Expected Experience
Succeed to restore snapshot.
Actual Experience
Fail to restore snapshot.
A fix has been merged.
Verified this issue on the build 20231121.2. Fixed. So, we are going to close it.
| gharchive/issue | 2023-11-20T08:28:40 | 2025-04-01T06:44:57.186772 | {
"authors": [
"JasonYeMSFT",
"v-kellyluo"
],
"repo": "microsoft/AzureStorageExplorer",
"url": "https://github.com/microsoft/AzureStorageExplorer/issues/7525",
"license": "CC-BY-4.0",
"license_type": "permissive",
"license_source": "github-api"
} |
2162971721 | The action 'Undelete Selected' is disabled when selecting more than one soft deleted blob snapshot
Storage Explorer Version: 1.33.0 (97)
Build Number: 20240229.8
Branch: rel/1.33.0
Platform/OS: Windows 10/Linux Ubuntu 22.04/MacOS Sonoma 14.3(Apple M1 Pro)
Architecture: x64/x64/x64
How Found: From running test cases
Regression From: Previous release (1.31.2)
Steps to Reproduce
Expand one storage account -> Blob Containers.
Create a blob container -> Upload one blob.
Create two snapshots for the blob -> Switch to the snapshots view.
Soft delete the two blob snapshots -> Switch to 'Active and soft deleted blobs' context view.
Select the two soft deleted blob snapshots -> Right click the selection.
Click 'Undelete'.
Check whether the action 'Undelete Selected' is enabled.
Expected Experience
The action 'Undelete Selected' is enabled.
Actual Experience
The action 'Undelete Selected' is disabled.
This is by design for non-HNS blobs. There is no way to undelete an individual snapshot; undeleting any deleted snapshot will undelete all the snapshots.
| gharchive/issue | 2024-03-01T09:24:18 | 2025-04-01T06:44:57.191884 | {
"authors": [
"JasonYeMSFT",
"v-xianya"
],
"repo": "microsoft/AzureStorageExplorer",
"url": "https://github.com/microsoft/AzureStorageExplorer/issues/7781",
"license": "CC-BY-4.0",
"license_type": "permissive",
"license_source": "github-api"
} |
1697403054 | Workspace Owners and Researchers receive a browser Error and cannot see workspaces
When a user is assigned workspace Researcher or Owner, along with being a member of TRE Users, they receive the error below:
(when the user is assigned TRE Admin the error disappears)
Uh oh!
This area encountered an error that we can't recover from. Please check your configuration and refresh.
Further debugging details can be found in the browser console.
The Browser console has this output:
TypeError: Cannot read properties of undefined (reading 'length')
at WC (ResourceContextMenu.tsx:202:24)
at Sa (react-dom.production.min.js:167:137)
at _l (react-dom.production.min.js:290:337)
at bc (react-dom.production.min.js:280:389)
at vc (react-dom.production.min.js:280:320)
at gc (react-dom.production.min.js:280:180)
at ic (react-dom.production.min.js:271:88)
at oc (react-dom.production.min.js:268:429)
at k (scheduler.production.min.js:13:203)
at MessagePort.R (scheduler.production.min.js:14:128)
ps @ react-dom.production.min.js:189
i.componentDidCatch.r.callback @ react-dom.production.min.js:190
Bi @ react-dom.production.min.js:144
wl @ react-dom.production.min.js:261
bl @ react-dom.production.min.js:260
yl @ react-dom.production.min.js:259
(anonymous) @ react-dom.production.min.js:283
Cc @ react-dom.production.min.js:281
oc @ react-dom.production.min.js:270
k @ scheduler.production.min.js:13
R @ scheduler.production.min.js:14
GenericErrorBoundary.tsx:21 UNHANDLED EXCEPTION TypeError: Cannot read properties of undefined (reading 'length')
at WC (ResourceContextMenu.tsx:202:24)
at Sa (react-dom.production.min.js:167:137)
at _l (react-dom.production.min.js:290:337)
at bc (react-dom.production.min.js:280:389)
at vc (react-dom.production.min.js:280:320)
at gc (react-dom.production.min.js:280:180)
at ic (react-dom.production.min.js:271:88)
at oc (react-dom.production.min.js:268:429)
at k (scheduler.production.min.js:13:203)
at MessagePort.R (scheduler.production.min.js:14:128) {componentStack: '\n at WC (https://ixxxxx.cloudapp.a…pp.azure.com/static/js/main.aba16646.js:2:691445)'}
value @ GenericErrorBoundary.tsx:21
i.componentDidCatch.r.callback @ react-dom.production.min.js:190
Bi @ react-dom.production.min.js:144
wl @ react-dom.production.min.js:261
bl @ react-dom.production.min.js:260
yl @ react-dom.production.min.js:259
(anonymous) @ react-dom.production.min.js:283
Cc @ react-dom.production.min.js:281
oc @ react-dom.production.min.js:270
k @ scheduler.production.min.js:13
R @ scheduler.production.min.js:14
@LizaShak @tamirkamara @marrobi Any advice would be appreciated
Is this on a specific screen or at the root of the TRE UI?
It's on the home page/root.
A quick write up of some thoughts on this.
The error appears to be caused by the fact that the API is not returning availableUpgrades in some situations, for some workspaces. This causes the UI to throw an exception when it accesses that property.
There are two PRs associated with availableUpgrades:
https://github.com/microsoft/AzureTRE/pull/3234 (the API changes)
https://github.com/microsoft/AzureTRE/pull/3387 (the UI changes)
The API changes essentially make sure that enrich_resource_with_available_upgrades is called for every Workspace resource that the API returns, but there is a situation where we return workspaces without this enrichment.
In this except handler: https://github.com/microsoft/AzureTRE/blob/main/api_app/api/routes/workspaces.py#L66
The workspaces returned by line 80 (https://github.com/microsoft/AzureTRE/blob/main/api_app/api/routes/workspaces.py#L80) will be missing the availableUpgrades property because they have not been passed to enrich_resource_with_available_upgrades.
The fix is most likely adding an additional call to enrich_resource_with_available_upgrades between lines 79 and 80 of this except block, much the same way as line 63 of the try block does.
(the above is just a theory, and not based upon seeing this issue occur)
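A minimal sketch of the proposed fix (Python; the function and property names come from the issue, but the surrounding code is simplified and hypothetical — the real API code lives in `api_app/api/routes/workspaces.py`):

```python
def enrich_resource_with_available_upgrades(resource: dict) -> None:
    # Simplified stand-in: the real function computes upgrade info for the resource.
    # Here we only guarantee the property exists so the UI never reads undefined.
    resource.setdefault("availableUpgrades", [])

def get_workspaces_fallback(raw_workspaces: list) -> list:
    # The except-branch path: enrich each workspace before returning it,
    # mirroring what the try-branch already does on its workspaces.
    workspaces = [dict(ws) for ws in raw_workspaces]
    for workspace in workspaces:
        enrich_resource_with_available_upgrades(workspace)
    return workspaces

result = get_workspaces_fallback([{"id": "ws1"}])
print(result[0]["availableUpgrades"])  # → []
```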
@tamirkamara
Env background: upgraded to v 11
When a new user is granted Workspace Owner or Researcher, the TRE home page is inaccessible as described above
@martinpeck I managed to reproduce, both of your theories were right :)
Yay! That's good to know!
| gharchive/issue | 2023-05-05T10:45:23 | 2025-04-01T06:44:57.212528 | {
"authors": [
"MoSidiIC",
"marrobi",
"martinpeck",
"tamirkamara",
"yuvalyaron"
],
"repo": "microsoft/AzureTRE",
"url": "https://github.com/microsoft/AzureTRE/issues/3489",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1146734123 | CI checks core image version update
Fixes #1359
What is being addressed
We tag core images with a value from a version file. If the application is updated but the version file isn't, then it might cause failures after the merge to main.
The reason is that in PR validation we deploy a new environment, but on main we use a preexisting one. If the version file isn't changed, the existing container running in the environment won't get replaced, and that can produce errors in the E2E tests.
How is this addressed
Add a check in the CI that will enforce updating the version file when the respective application is updated.
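Such a guard can be sketched as a small script run against the PR's changed-files list (hypothetical; the actual workflow implementation in the linked run differs):

```python
def version_bump_required(changed_files, app_dir, version_file):
    """Return True if files under app_dir changed but its version file did not."""
    app_changed = any(
        f.startswith(app_dir) and f != version_file for f in changed_files
    )
    return app_changed and version_file not in changed_files

# The API code changed but its version file was not touched -> CI should fail.
print(version_bump_required(["api_app/main.py"], "api_app/", "api_app/_version.py"))  # True
# The version file was bumped alongside the change -> CI passes.
print(version_bump_required(["api_app/main.py", "api_app/_version.py"],
                            "api_app/", "api_app/_version.py"))  # False
```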
You can see a run result here: https://github.com/microsoft/AzureTRE/runs/5286484296?check_suite_focus=true
@damoodamoo, @eladiw - you're tagged as reviewers as you were on different ends of this issue - one made a change in the api (with no version update) and the second had tests fail after merge to main.
@damoodamoo There might be a reason why they chose not to go down the SHA route before, and I didn't want to make too many changes.
It might be something to do with the API displaying a "nice" version so that will remain.
| gharchive/pull-request | 2022-02-22T10:34:20 | 2025-04-01T06:44:57.216026 | {
"authors": [
"tamirkamara"
],
"repo": "microsoft/AzureTRE",
"url": "https://github.com/microsoft/AzureTRE/pull/1360",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
255637514 | QnAMakerDialog: Provide ability to override behaviour for no match scenario
I want my bot to be able to take a different course of action if the QnAMakerDialog does not find a matching answer. For example, not being able to find an answer might result in an automatic transfer of the chat to a human agent. I believe this to be a small change to QnAMakerDialog.js, as follows. All I did was extract the sending of the message in the no-match case into its own method.
QnAMakerDialog.prototype.invokeAnswer = function (session, recognizeResult, threshold, noMatchMessage) {
    var qnaMakerResult = recognizeResult;
    session.privateConversationData.qnaFeedbackUserQuestion = session.message.text;
    if (qnaMakerResult.score >= threshold && qnaMakerResult.answers.length > 0) {
        if (this.isConfidentAnswer(qnaMakerResult) || this.qnaMakerTools == null) {
            this.respondFromQnAMakerResult(session, qnaMakerResult);
            this.defaultWaitNextMessage(session, qnaMakerResult);
        }
        else {
            this.qnaFeedbackStep(session, qnaMakerResult);
        }
    }
    else {
        this.noMatch(session, noMatchMessage, qnaMakerResult);
    }
};
QnAMakerDialog.prototype.noMatch = function (session, noMatchMessage, qnaMakerResult) {
    session.send(noMatchMessage);
    this.defaultWaitNextMessage(session, qnaMakerResult);
};
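The extraction above enables the classic template-method override pattern: subclass the dialog and replace only the no-match behaviour. A language-agnostic sketch of that pattern (in Python, even though the dialog above is JavaScript; all names here are illustrative, not Bot Framework APIs):

```python
# Sketch of the override pattern the proposed noMatch extraction enables.
# Class and method names are illustrative, not the real SDK surface.

class QnADialog:
    def invoke_answer(self, answer, threshold, no_match_message):
        if answer and answer["score"] >= threshold:
            return f"answer: {answer['text']}"
        return self.no_match(no_match_message)

    def no_match(self, no_match_message):
        # Default behaviour: just send the configured message.
        return no_match_message


class HandoffQnADialog(QnADialog):
    def no_match(self, no_match_message):
        # Custom behaviour, e.g. transfer the chat to a human agent.
        return "transferring you to a human agent"


bot = HandoffQnADialog()
assert bot.invoke_answer({"score": 0.9, "text": "hi"}, 0.5, "no match") == "answer: hi"
assert bot.invoke_answer(None, 0.5, "no match") == "transferring you to a human agent"
```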
Hi All,
We too have a similar requirement for "no match found" scenarios. It would be helpful if this could be included as a virtual/abstract method in QnAMakerDialog, which would give the consuming bot service dialog the flexibility to override it with custom logic. Please let us know once this is included so we can get the new NuGet packages.
Thanks, Venu Madhav Deevi.
Thank you for opening an issue against the Bot Framework Cognitive Services repo. As part of the Bot Framework v4 release, we’ve moved all the Luis and QnA work to platform SDK repo. We will continue to support and offer maintenance updates via this repo.
From now on, https://github.com/Microsoft/botframework-sdk repo will be used as hub, with pointers to all the different SDK languages, tools and samples repos.
As part of this restructuring, we are closing all tickets in this repo.
For defects or feature requests, please create a new issue in the Bot Framework repo found here:
https://github.com/Microsoft/botframework-sdk/issues
For product behavior, how-to, or general understanding questions, please use Stackoverflow.
https://stackoverflow.com/search?q=bot+framework
Thank you.
The Bot Framework Team
| gharchive/issue | 2017-09-06T15:04:31 | 2025-04-01T06:44:57.221111 | {
"authors": [
"cleemullins",
"dvm-2k1",
"gliddell"
],
"repo": "microsoft/BotBuilder-CognitiveServices",
"url": "https://github.com/microsoft/BotBuilder-CognitiveServices/issues/50",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
776431027 | Two buttons not displaying in herocard
Hi,
I used the template below, but two buttons are not displaying; only one button is visible. Could you please suggest a solution?
[HeroCard
title = Big Buck Bunny
subtitle = by the Blender Institute
text = Big Buck Bunny (code-named Peach) is a short computer-animated comedy film by the Blender Institute
buttons = ${cardActionTemplate('openUrl', 'Study more', 'https://docs.microsoft.com/en-us/composer/how-to-send-cards#videocard')}
buttons = ${cardActionTemplate('openUrl', 'media', 'https://docs.microsoft.com/en-us/composer/how-to-send-cards#videocard')}
]
Thanks.....
You need to create an array. Try this:
[HeroCard
title = Big Buck Bunny
subtitle = by the Blender Institute
text = Big Buck Bunny (code-named Peach) is a short computer-animated comedy film by the Blender Institute
buttons = ${[ cardActionTemplate('openUrl', 'Study more', 'https://docs.microsoft.com/en-us/composer/how-to-send-cards#videocard'), cardActionTemplate('openUrl', 'media', 'https://docs.microsoft.com/en-us/composer/how-to-send-cards#videocard') ]}
]
This renders:
Thanks, it's working.
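For reference, a rough sketch of the attachment the array-valued template expands to. Field names follow the Bot Framework HeroCard/CardAction schema, but the exact expansion is done by the LG runtime, so treat this as an approximation:

```python
# Illustrative sketch: the key point is that `buttons` must be a single
# array. Repeating `buttons =` in the template keeps only one value,
# which is why only one button rendered.

def hero_card(title, subtitle, text, buttons):
    return {
        "contentType": "application/vnd.microsoft.card.hero",
        "content": {
            "title": title,
            "subtitle": subtitle,
            "text": text,
            "buttons": [
                {"type": t, "title": label, "value": url}
                for (t, label, url) in buttons
            ],
        },
    }

url = "https://docs.microsoft.com/en-us/composer/how-to-send-cards#videocard"
card = hero_card(
    "Big Buck Bunny",
    "by the Blender Institute",
    "Big Buck Bunny (code-named Peach) is a short computer-animated comedy film",
    [("openUrl", "Study more", url), ("openUrl", "media", url)],
)
assert len(card["content"]["buttons"]) == 2
```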
| gharchive/issue | 2020-12-30T12:23:07 | 2025-04-01T06:44:57.226536 | {
"authors": [
"BeheraMadan",
"joshgummersall"
],
"repo": "microsoft/BotFramework-Composer",
"url": "https://github.com/microsoft/BotFramework-Composer/issues/5422",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
786052569 | You should be able to remove a bot from the list shown at startup
It would be good to be able, within Composer, to remove projects from the list of recent projects.
@sangwoohaan will you look into this feature request? It should be rather trivial to implement but will need some UX design.
Ok will take a look into this.
@sangwoohaan : Any updates?
Sorry @daveta for the delay, @emivers8 @xiyangdesign should be able to support this! Could you guys take a look at this feature request?
will do! @sangwoohaan self-assigned.
The design solution is to have a contextual menu (vertical dots). The icon (vertical dots) appears when you hover over or click on the row. The context menu includes:
Save as a new project
Remove from list
Reveal in explorer
(Removing "UX design" label since the design is ready.)
| gharchive/issue | 2021-01-14T14:51:58 | 2025-04-01T06:44:57.230653 | {
"authors": [
"a-b-r-o-w-n",
"daveta",
"irwinwilliams",
"mewa1024",
"sangwoohaan",
"xiyangdesign"
],
"repo": "microsoft/BotFramework-Composer",
"url": "https://github.com/microsoft/BotFramework-Composer/issues/5526",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1715890131 | microphone Speech recognition error within iframe component
Edit: This issue occurs only within iframe component.
Is it an issue related to Adaptive Cards?
No
Is this an accessibility issue?
No
What version of Web Chat are you using?
Latest production
Which distribution are you using Web Chat from?
Bundle (webchat.js)
Which hosting environment does this issue primarily affect?
Web apps
Which browsers and platforms do the issue happened?
Browser: Edge (latest), Browser: Chrome (latest)
Which area does this issue affect?
Speech
What is the public URL for the website?
No response
Please describe the bug
I created a webchat panel with microphone button with:
const webchatOptions = {
    ... // some parameters
    webSpeechPonyfillFactory: window.WebChat.createBrowserWebSpeechPonyfillFactory()
};
The webchat is embedded in my web app within an <iframe> component.
The web app has microphone permissions, but when I click the microphone button within the iframe I get a speech recognition error:
"Error: Speech recognition is not supported"
It only happens within the iframe component (not when the webchat is open in full screen).
More details:
In the browser settings, the website had microphone permissions. Also tried to reset the permissions and the issue persisted.
The error was reproduced in Chrome and Edge, but the microphone worked in Safari
The error was reproduced in "https" url as well as local run ("localhost:...")
the iframe has the attribute allow="microphone *" and I added a Permissions-Policy header:
app.use((req, res, next) => {
    res.setHeader('Permissions-Policy', 'microphone=*');
    next();
});
The microphone button used to work before
Do you see any errors in console log?
SpeechRecognitionErrorEvent
Error: Speech recognition is not supported
How to reproduce the issue?
click on the mic button within the iframe and look at the console.
Also, after clicking the button twice the webchat panel freezes
The web app is private so we can't supply URL.
What do you expect?
We expect that clicking the microphone button once will turn the microphone icon red and will type the speech in the webchat panel.
What actually happened?
After the first click on the mic button, nothing happens on the screen and the following error appears in the console:
After the second click, the following error appears in the console and the webchat panel becomes a blank white screen:
Do you have any screenshots or recordings to repro the issue?
No response
Adaptive Card JSON
No response
Additional context
No response
I am looking at this now.
I made an HTML page that uses the Web Speech API the way Web Chat uses it. Surprisingly, it works in an IFRAME. However, I see Web Chat failing when it is in an IFRAME under the same page. I didn't apply any sandbox attributes to either IFRAME. So, it's a good repro.
Going to investigate a bit more.
It seems when the IFRAME is pointing to a website outside of the origin of the hosting page, speech recognition will not be allowed.
Confirmed. This is not an issue on Web Chat.
The reason why Chrome/Edge says "not-allowed" is that the IFRAME origin is not the same as the host origin. Thus, developers need to explicitly specify <iframe allow="microphone"> to enable microphone access across origins.
If allow="microphone" is not added, a cross-origin IFRAME will have the microphone disabled and error out with "not-allowed".
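To make the two moving parts concrete, here is an illustrative sketch (my own, not from the thread; the embedded origin is hypothetical) of a host page whose cross-origin IFRAME is allowed to use the microphone, using Python's stdlib HTTP server to set the header:

```python
# Two things must line up for a cross-origin IFRAME to get mic access:
# the `allow` attribute on the IFRAME, and (if you set one) a
# Permissions-Policy header that does not block the microphone.
import http.server

# Hypothetical embedded origin, for illustration only.
EMBED_PAGE = """
<iframe src="https://chat.example.com/webchat"
        allow="microphone https://chat.example.com"></iframe>
"""

class Handler(http.server.SimpleHTTPRequestHandler):
    def end_headers(self):
        # Allow self and the named embedded origin to request the microphone.
        self.send_header(
            "Permissions-Policy",
            'microphone=(self "https://chat.example.com")')
        super().end_headers()

assert 'allow="microphone' in EMBED_PAGE
```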
| gharchive/issue | 2023-05-18T16:06:30 | 2025-04-01T06:44:57.243272 | {
"authors": [
"Noam-Microsoft",
"compulim"
],
"repo": "microsoft/BotFramework-WebChat",
"url": "https://github.com/microsoft/BotFramework-WebChat/issues/4731",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1953477479 | Integrate fast track test infrastructure with monthly/daily pipelines
This item will be released with Azure Linux 3.0.
Completed
| gharchive/issue | 2023-10-20T04:06:47 | 2025-04-01T06:44:57.244596 | {
"authors": [
"eric-desrochers",
"pokushwaha"
],
"repo": "microsoft/CBL-Mariner",
"url": "https://github.com/microsoft/CBL-Mariner/issues/6526",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1275420752 | toolchain: Add ability to partially rehydrate from upstream repos
Merge Checklist
All boxes should be checked before merging the PR (just tick any boxes which don't apply to this PR)
[x] The toolchain has been rebuilt successfully (or no changes were made to it)
[x] The toolchain/worker package manifests are up-to-date
[x] Any updated packages successfully build (or no packages were changed)
[x] Packages depending on static components modified in this PR (Golang, *-static subpackages, etc.) have had their Release tag incremented.
[x] Package tests (%check section) have been verified with RUN_CHECK=y for existing SPEC files, or added to new SPEC files
[x] All package sources are available
[x] cgmanifest files are up-to-date and sorted (./cgmanifest.json, ./toolkit/tools/cgmanifest.json, ./toolkit/scripts/toolchain/cgmanifest.json, .github/workflows/cgmanifest.json)
[x] LICENSE-MAP files are up-to-date (./SPECS/LICENSES-AND-NOTICES/data/licenses.json, ./SPECS/LICENSES-AND-NOTICES/LICENSES-MAP.md, ./SPECS/LICENSES-AND-NOTICES/LICENSE-EXCEPTIONS.PHOTON)
[x] All source files have up-to-date hashes in the *.signatures.json files
[x] sudo make go-tidy-all and sudo make go-test-coverage pass
[x] Documentation has been updated to match any changes to the build system
[x] Ready to merge
Summary
Add ability to partially rehydrate the toolchain from upstream repos, leveraging existing incremental toolchain build tooling.
Change Log
Makefile: Add ALLOW_TOOLCHAIN_DOWNLOAD_FAIL flag with a default of n
toolchain.mk: Add staging directory for downloaded RPMs, as well as a download manifest to track which RPMs are rehydrated
toolchain.mk: Add partially-rehydrate-toolchain-from-repo target, which attempts to download as many toolchain RPMs as it can from PACKAGE_REPO_URLS
toolchain.mk: Update/correct various comments
list_toolchain_specs.sh: Fix handling of two-argument spec build instructions in toolchain scripts, as well as double entries
build_official_toolchain_rpms.sh: Copy rehydrated RPMs into the chroot's RPM folders
Does this affect the toolchain?
YES
Test Methodology
Local toolchain delta builds
Pipeline builds
Latest commit validated in pipeline build 206181
| gharchive/pull-request | 2022-06-17T20:15:54 | 2025-04-01T06:44:57.253621 | {
"authors": [
"oliviacrain"
],
"repo": "microsoft/CBL-Mariner",
"url": "https://github.com/microsoft/CBL-Mariner/pull/3185",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
985334315 | Enable creating C# projections based on multiple input winmd's and docs
TODO
[ ] Test with multiple metadata inputs
[ ] https://github.com/microsoft/win32metadata/pull/629 completed and the new package consumed as part of this PR.
❓ Which metadata should be brought in automatically by CsWin32? win32metadata is the only one right now. Should we start including others? Should we remove win32metadata and leave no metadata inputs by default to better support folks who only want to generate projections for their own APIs?
For now, we should continue to pull in win32metadata but support adding NuGet packages to the project that pull in additional metadata. Project templates can add additional references to metadata as well as prepopulate NativeMethods.txt to improve their out-of-box experience.
| gharchive/pull-request | 2021-09-01T16:05:38 | 2025-04-01T06:44:57.265162 | {
"authors": [
"AArnott",
"mikebattista"
],
"repo": "microsoft/CsWin32",
"url": "https://github.com/microsoft/CsWin32/pull/386",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
930075211 | Question about bucket
Hello, I am reading the zero2.py code. I learned from DDP's paper that the bucketing strategy pays much attention to the gradient reducing order. As you know, the reducing order must be the same across all processes; otherwise, AllReduce contents might mismatch, resulting in an incorrect reduction result or a coredump. However, the autograd engine is a parallel graph execution engine, which could cause gradients to become ready in a different order across processes. DDP leverages parameter-to-bucket mapping, the index inside each bucket, and the index between buckets together to make sure of the strict order when calling AllReduce. As for DeepSpeed, I am wondering how it handles the bucket? Thanks a lot.
@huangyanjuner The gradient order is different across processes only if the forward graph is different across processes. Autograd computes the gradients in a sequential and deterministic order as long as the training is strictly data parallel, i.e., all processes take the exact same code path. In this scenario, the gradient order will be consistent. This is the underlying assumption in DeepSpeed ZeRO-2. We do not currently support cases where two different data-parallel processes take different code paths in ZeRO-2, as we have not seen cases where this is necessary for the multi-billion-parameter training enabled by ZeRO-2. Please do let us know if you have scenarios where this is necessary.
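The key consequence of that assumption can be shown in a few lines. This is an illustrative sketch (plain Python, not DeepSpeed's actual code): bucket assignment is derived purely from the fixed parameter order, so every data-parallel rank reduces buckets in the same sequence regardless of when autograd happens to finish each gradient.

```python
# Sketch: group parameters into buckets by a fixed capacity, in a fixed
# order. Because the layout depends only on the order and sizes, two ranks
# running the same model produce identical bucket layouts, so their
# allreduce calls match one-for-one.

def assign_buckets(param_sizes, bucket_cap):
    buckets, current, current_size = [], [], 0
    for idx, size in enumerate(param_sizes):
        if current and current_size + size > bucket_cap:
            buckets.append(current)
            current, current_size = [], 0
        current.append(idx)
        current_size += size
    if current:
        buckets.append(current)
    return buckets

sizes = [10, 20, 5, 40, 15]
rank0 = assign_buckets(sizes, bucket_cap=30)
rank1 = assign_buckets(sizes, bucket_cap=30)
assert rank0 == rank1 == [[0, 1], [2], [3], [4]]
```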
@huangyanjuner, please re-open if not resolved.
| gharchive/issue | 2021-06-25T11:06:31 | 2025-04-01T06:44:57.267961 | {
"authors": [
"huangyanjuner",
"jeffra",
"samyam"
],
"repo": "microsoft/DeepSpeed",
"url": "https://github.com/microsoft/DeepSpeed/issues/1186",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
1074669650 | CPUAdam does not find CUDA
Discussed in https://github.com/microsoft/DeepSpeed/discussions/1619
Originally posted by javier-alvarez December 8, 2021
2021-12-08T15:12:02Z INFO Switching optimizer to DeepSpeedCPUAdam
No CUDA runtime is found, using CUDA_HOME='/usr/local/cuda'
[stderr]Traceback (most recent call last):
[stderr] File "InnerEyePrivate/ML/runner.py", line 57, in
[stderr] main()
[stderr] File "InnerEyePrivate/ML/runner.py", line 53, in main
[stderr] post_cross_validation_hook=runner.default_post_cross_validation_hook)
[stderr] File "/mnt/azureml/cr/j/cfa5340abb4d4a3abec1a3ec4d8e39a6/exe/wd/innereye-deeplearning/InnerEye/ML/runner.py", line 442, in run
[stderr] return runner.run()
[stderr] File "/mnt/azureml/cr/j/cfa5340abb4d4a3abec1a3ec4d8e39a6/exe/wd/innereye-deeplearning/InnerEye/ML/runner.py", line 219, in run
[stderr] self.run_in_situ(azure_run_info)
[stderr] File "/mnt/azureml/cr/j/cfa5340abb4d4a3abec1a3ec4d8e39a6/exe/wd/innereye-deeplearning/InnerEye/ML/runner.py", line 398, in run_in_situ
[stderr] self.ml_runner.run()
[stderr] File "/mnt/azureml/cr/j/cfa5340abb4d4a3abec1a3ec4d8e39a6/exe/wd/innereye-deeplearning/InnerEye/ML/run_ml.py", line 327, in run
[stderr] num_nodes=self.azure_config.num_nodes)
[stderr] File "/mnt/azureml/cr/j/cfa5340abb4d4a3abec1a3ec4d8e39a6/exe/wd/innereye-deeplearning/InnerEye/ML/model_training.py", line 263, in model_train
[stderr] trainer.fit(lightning_model, datamodule=data_module)
[stderr] File "/azureml-envs/azureml_5602df82e8a46f1160ede9218ecc0c87/lib/python3.7/site-packages/pytorch_lightning/trainer/trainer.py", line 460, in fit
[stderr] self._run(model)
[stderr] File "/azureml-envs/azureml_5602df82e8a46f1160ede9218ecc0c87/lib/python3.7/site-packages/pytorch_lightning/trainer/trainer.py", line 717, in _run
[stderr] self.accelerator.setup(self, model) # note: this sets up self.lightning_module
[stderr] File "/azureml-envs/azureml_5602df82e8a46f1160ede9218ecc0c87/lib/python3.7/site-packages/pytorch_lightning/accelerators/cpu.py", line 39, in setup
[stderr] return super().setup(trainer, model)
[stderr] File "/azureml-envs/azureml_5602df82e8a46f1160ede9218ecc0c87/lib/python3.7/site-packages/pytorch_lightning/accelerators/accelerator.py", line 92, in setup
[stderr] self.setup_optimizers(trainer)
[stderr] File "/azureml-envs/azureml_5602df82e8a46f1160ede9218ecc0c87/lib/python3.7/site-packages/pytorch_lightning/accelerators/accelerator.py", line 375, in setup_optimizers
[stderr] trainer=trainer, model=self.lightning_module
[stderr] File "/azureml-envs/azureml_5602df82e8a46f1160ede9218ecc0c87/lib/python3.7/site-packages/pytorch_lightning/plugins/training_type/training_type_plugin.py", line 190, in init_optimizers
[stderr] return trainer.init_optimizers(model)
[stderr] File "/azureml-envs/azureml_5602df82e8a46f1160ede9218ecc0c87/lib/python3.7/site-packages/pytorch_lightning/trainer/optimizers.py", line 34, in init_optimizers
[stderr] optim_conf = model.configure_optimizers()
[stderr] File "/mnt/azureml/cr/j/cfa5340abb4d4a3abec1a3ec4d8e39a6/exe/wd/innereye-deeplearning/InnerEye/ML/SSL/lightning_modules/simclr_module.py", line 68, in configure_optimizers
[stderr] deepspeed_optim = DeepSpeedCPUAdam(params, lr=self.learning_rate, weight_decay=self.weight_decay)
[stderr] File "/azureml-envs/azureml_5602df82e8a46f1160ede9218ecc0c87/lib/python3.7/site-packages/deepspeed/ops/adam/cpu_adam.py", line 83, in init
[stderr] self.ds_opt_adam = CPUAdamBuilder().load()
[stderr] File "/azureml-envs/azureml_5602df82e8a46f1160ede9218ecc0c87/lib/python3.7/site-packages/deepspeed/ops/op_builder/builder.py", line 370, in load
[stderr] return self.jit_load(verbose)
[stderr] File "/azureml-envs/azureml_5602df82e8a46f1160ede9218ecc0c87/lib/python3.7/site-packages/deepspeed/ops/op_builder/builder.py", line 385, in jit_load
[stderr] assert_no_cuda_mismatch()
[stderr] File "/azureml-envs/azureml_5602df82e8a46f1160ede9218ecc0c87/lib/python3.7/site-packages/deepspeed/ops/op_builder/builder.py", line 97, in assert_no_cuda_mismatch
[stderr] f"Installed CUDA version {sys_cuda_version} does not match the "
[stderr]Exception: Installed CUDA version 10.2 does not match the version torch was compiled with 11.1, unable to compile cuda/cpp extensions without a matching cuda version.
[stderr]
https://github.com/microsoft/InnerEye-DeepLearning/pull/611/files
Any ideas why this does not find CUDA 11? It installs pytorch 1.8 and cuda 11 with conda
Thanks!
Hi @javier-alvarez,
Do you have interactive access to the machine you're running on here? If so, can you show me the results of ds_report?
[stderr]Exception: Installed CUDA version 10.2 does not match the version torch was compiled with 11.1, unable to compile cuda/cpp extensions without a matching cuda version.
The above error is thrown as a safety precaution: what DeepSpeed observes is that nvcc reports CUDA 10.2 but the installed version of torch was compiled with CUDA 11.1. DeepSpeed uses nvcc to compile some of our custom C++/CUDA ops at runtime; if the versions of nvcc and torch don't align, the ops will not run properly.
We pick up the nvcc path from torch.utils.cpp_extension.CUDA_HOME; if this path isn't the correct one for your environment, there might be issues.
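A simplified, illustrative sketch of the comparison behind the exception (DeepSpeed's real check lives in deepspeed/ops/op_builder/builder.py and may differ in detail):

```python
# Sketch of why DeepSpeed raises here: it compares the CUDA version
# reported by nvcc (found via CUDA_HOME) against the CUDA version torch
# was compiled with, on a major.minor basis.

def cuda_versions_match(nvcc_version: str, torch_cuda_version: str) -> bool:
    nvcc_mm = nvcc_version.split(".")[:2]
    torch_mm = torch_cuda_version.split(".")[:2]
    return nvcc_mm == torch_mm

# The failing configuration from the traceback:
assert not cuda_versions_match("10.2", "11.1")
# After installing a matching toolkit (e.g. cudatoolkit-dev=11.1.1):
assert cuda_versions_match("11.1", "11.1")
```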
I was able to reproduce the issue with your conda environment; after adding cudatoolkit-dev=11.1.1 to your conda dependencies, it seems to be resolved on my side.
This fixed the issue. I have changed the Azure ML image to:
"mcr.microsoft.com/azureml/openmpi4.1.0-cuda11.1-cudnn8-ubuntu18.04"
It looks like having both CUDA 10.2 and 11.1 installed does not work.
| gharchive/issue | 2021-12-08T17:52:47 | 2025-04-01T06:44:57.285220 | {
"authors": [
"javier-alvarez",
"jeffra"
],
"repo": "microsoft/DeepSpeed",
"url": "https://github.com/microsoft/DeepSpeed/issues/1620",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
1116117119 | [BUG] NotImplementedError: There were no tensor arguments to this function
Describe the bug
Cannot init int8 model for inference
To Reproduce
from transformers import AutoTokenizer, AutoModelForCausalLM, pipeline
import deepspeed, torch
device = 1
torch.cuda.set_device(device)
model_name = 'EleutherAI/gpt-neo-125M'
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, pad_token_id=tokenizer.eos_token_id).eval()
deepspeed.init_inference(model, dtype=torch.int8, replace_method='auto')
NotImplementedError: There were no tensor arguments to this function (e.g., you passed an empty list of Tensors), but no fallback function is registered for schema aten::_cat. This usually means that this function requires a non-empty list of Tensors, or that you (the operator writer) forgot to register a fallback function. Available functions are [CPU, CUDA, QuantizedCPU, BackendSelect, Named, ADInplaceOrView, AutogradOther, AutogradCPU, AutogradCUDA, AutogradXLA, UNKNOWN_TENSOR_TYPE_ID, AutogradMLC, AutogradHPU, AutogradNestedTensor, AutogradPrivateUse1, AutogradPrivateUse2, AutogradPrivateUse3, Tracer, Autocast, Batched, VmapMode].
CPU: registered at /pytorch/build/aten/src/ATen/RegisterCPU.cpp:16286 [kernel]
CUDA: registered at /pytorch/build/aten/src/ATen/RegisterCUDA.cpp:20674 [kernel]
QuantizedCPU: registered at /pytorch/build/aten/src/ATen/RegisterQuantizedCPU.cpp:1025 [kernel]
BackendSelect: fallthrough registered at /pytorch/aten/src/ATen/core/BackendSelectFallbackKernel.cpp:3 [backend fallback]
Named: registered at /pytorch/aten/src/ATen/core/NamedRegistrations.cpp:7 [backend fallback]
ADInplaceOrView: fallthrough registered at /pytorch/aten/src/ATen/core/VariableFallbackKernel.cpp:60 [backend fallback]
AutogradOther: registered at /pytorch/torch/csrc/autograd/generated/VariableType_2.cpp:9928 [autograd kernel]
AutogradCPU: registered at /pytorch/torch/csrc/autograd/generated/VariableType_2.cpp:9928 [autograd kernel]
AutogradCUDA: registered at /pytorch/torch/csrc/autograd/generated/VariableType_2.cpp:9928 [autograd kernel]
AutogradXLA: registered at /pytorch/torch/csrc/autograd/generated/VariableType_2.cpp:9928 [autograd kernel]
UNKNOWN_TENSOR_TYPE_ID: registered at /pytorch/torch/csrc/autograd/generated/VariableType_2.cpp:9928 [autograd kernel]
AutogradMLC: registered at /pytorch/torch/csrc/autograd/generated/VariableType_2.cpp:9928 [autograd kernel]
AutogradHPU: registered at /pytorch/torch/csrc/autograd/generated/VariableType_2.cpp:9928 [autograd kernel]
AutogradNestedTensor: registered at /pytorch/torch/csrc/autograd/generated/VariableType_2.cpp:9928 [autograd kernel]
AutogradPrivateUse1: registered at /pytorch/torch/csrc/autograd/generated/VariableType_2.cpp:9928 [autograd kernel]
AutogradPrivateUse2: registered at /pytorch/torch/csrc/autograd/generated/VariableType_2.cpp:9928 [autograd kernel]
AutogradPrivateUse3: registered at /pytorch/torch/csrc/autograd/generated/VariableType_2.cpp:9928 [autograd kernel]
Tracer: registered at /pytorch/torch/csrc/autograd/generated/TraceType_2.cpp:9621 [kernel]
Autocast: registered at /pytorch/aten/src/ATen/autocast_mode.cpp:259 [kernel]
Batched: registered at /pytorch/aten/src/ATen/BatchingRegistrations.cpp:1019 [backend fallback]
VmapMode: fallthrough registered at /pytorch/aten/src/ATen/VmapModeRegistrations.cpp:33 [backend fallback]
ds_report output
--------------------------------------------------
DeepSpeed C++/CUDA extension op report
--------------------------------------------------
NOTE: Ops not installed will be just-in-time (JIT) compiled at
runtime if needed. Op compatibility means that your system
meet the required dependencies to JIT install the op.
--------------------------------------------------
JIT compiled ops requires ninja
ninja .................. [OKAY]
--------------------------------------------------
op name ................ installed .. compatible
--------------------------------------------------
cpu_adam ............... [NO] ....... [OKAY]
cpu_adagrad ............ [NO] ....... [OKAY]
fused_adam ............. [NO] ....... [OKAY]
fused_lamb ............. [NO] ....... [OKAY]
sparse_attn ............ [NO] ....... [OKAY]
transformer ............ [NO] ....... [OKAY]
stochastic_transformer . [NO] ....... [OKAY]
[WARNING] async_io requires the dev libaio .so object and headers but these were not found.
[WARNING] If libaio is already installed (perhaps from source), try setting the CFLAGS and LDFLAGS environment variables to where it can be found.
async_io ............... [NO] ....... [NO]
transformer_inference .. [NO] ....... [OKAY]
utils .................. [NO] ....... [OKAY]
quantizer .............. [NO] ....... [OKAY]
--------------------------------------------------
DeepSpeed general environment info:
torch install path ............... ['/home/stardust/anaconda3/lib/python3.8/site-packages/torch']
torch version .................... 1.9.0+cu111
torch cuda version ............... 11.1
nvcc version ..................... 11.4
deepspeed install path ........... ['/home/stardust/anaconda3/lib/python3.8/site-packages/deepspeed']
deepspeed info ................... 0.5.8, unknown, unknown
deepspeed wheel compiled w. ...... torch 1.9, cuda 11.1
Try changing the following around line 161 in deepspeed/runtime/weight_quantizer.py:
else:
    for plcy in replace_policies:
        _ = plcy(None)  # line added
        policy.update({plcy._orig_layer_class: (quantize_fn, plcy)})
I have encountered the same problem. Have you found a solution yet?
| gharchive/issue | 2022-01-27T11:51:02 | 2025-04-01T06:44:57.289969 | {
"authors": [
"HoBeedzc",
"callzhang",
"gsujankumar"
],
"repo": "microsoft/DeepSpeed",
"url": "https://github.com/microsoft/DeepSpeed/issues/1730",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
1012849439 | [WIP] [zero_to_fp32] fix to handle world_size
This PR is trying to address specifically https://github.com/microsoft/DeepSpeed/issues/1317#issuecomment-926998777
as my original fix used a hardcoded alignment to 4, but the correct one is 2*world_size
While at it I simplified flatten_dense_tensors_aligned a bit. hope it's ok.
Trying to unravel the padding in zero2 w/o the optimizer object looks impossible.
e.g. if I use a tiny model from the test with 57 params and 2 GPUs,
zero2 pads with 3, ending up with 60 params, and then splits it into 2x30, so this aligns to 2*world_size.
But then before saving it, it looks up a different type of padding, self.groups_padding, which is totally different from the NCCL padding, so when it saves the single_partition_of_fp32_groups partitions it actually saves these 2 sizes: [30, 29].
So when trying to reconstruct, I end up with wanting 57 params and having 59, and that gap is impossible to work out.
I suspect there is a bug somewhere there, as I think it should save [30, 27] instead, but it sort of works when DS reconstructs it using the complex logic.
The workaround I use is to basically align both numbers to 2*world_size and only then compare that they are the same.
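The alignment arithmetic in that workaround is easy to sketch (names here are mine, not DeepSpeed's):

```python
# Sketch of the alignment used in the workaround: pad a flat parameter
# count up to a multiple of 2 * world_size before comparing counts.

def align_to(numel: int, alignment: int) -> int:
    remainder = numel % alignment
    return numel if remainder == 0 else numel + (alignment - remainder)

world_size = 2
alignment = 2 * world_size  # 4

# The 57-param example from above: zero2 pads to 60, then splits 2 x 30.
assert align_to(57, alignment) == 60
assert align_to(60, alignment) // world_size == 30

# The workaround: align both the wanted (57) and available (59) counts
# before comparing, so the spurious mismatch disappears.
assert align_to(57, alignment) == align_to(59, alignment) == 60
```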
Please correct me if I'm wrong, but I didn't see that the saved single_partition_of_fp32_groups ever actually includes any padding. If it does, that would explain why other users sometimes have an issue using this script with zero2, because the padding would completely mess up the weights. I tried hard to write a test that would break, but I couldn't; I always get the exact same model back.
in Z2 I simply concatenate all single_partition_of_fp32_groups and then re-shape into params.
@tjruwase, am I missing something here? The partitioning plus double padding code is so complex.
@stas00, thanks for doing the grueling work. No, you are not missing anything. The two padding logics evolved independently and we were not careful to avoid their conflicts. I will look into simplifying the padding logic by for starters padding exactly once, rather than multiple times for different constraints.
| gharchive/pull-request | 2021-10-01T03:44:25 | 2025-04-01T06:44:57.295514 | {
"authors": [
"stas00",
"tjruwase"
],
"repo": "microsoft/DeepSpeed",
"url": "https://github.com/microsoft/DeepSpeed/pull/1422",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
626013277 | Support loading and saving ZeRO checkpoints with changing DP degree
Support loading/saving ZeRO checkpoints with changing data parallelism degree and GPU counts. This will be useful for finetuning and continued training on a different number of GPUs.
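An illustrative sketch (plain Python, not DeepSpeed's implementation) of why changing DP degree needs checkpoint support: the fp32 partitions saved by N ranks must be stitched back into one flat buffer and re-split for M ranks.

```python
# Sketch: concatenate old per-rank partitions, drop any old padding,
# re-pad so the flat buffer divides evenly, and split for the new ranks.

def repartition(partitions, total_numel, new_world_size):
    flat = [x for part in partitions for x in part][:total_numel]
    pad = (-len(flat)) % new_world_size
    flat = flat + [0.0] * pad
    chunk = len(flat) // new_world_size
    return [flat[i * chunk:(i + 1) * chunk] for i in range(new_world_size)]

# A model with 6 elements saved from 2 ranks, reloaded on 3 ranks:
old = [[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]]
new = repartition(old, total_numel=6, new_world_size=3)
assert new == [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
```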
This is just for ZeRO stage 2 right? Should we add an assert somewhere if that it's not supported for ZeRO stage 1?
@jeffra There is nothing specific to ZeRO-2 in these changes, and so we could easily add them to ZeRO-1. But I thought we wanted to retire the old ZeRO-1 implementation, which is why I left it out. Is that no longer the case?
@tjruwase, I think we're planning to support ZeRO-1 still? We should probably have some discussions to figure out a good plan here though. Perhaps merging the implementations so that as we add features like this it's easier to support both stage 1 and 2. One example where ZeRO-1 might still be useful are cases where we need gradient accumulation, ZeRO-2 doesn't support this.
I agree. We still want ZeRO-1 where we want to use gradient accumulation. This is especially true for the pipeline parallelism implementation that @ShadenSmith is working on.
Hello, is this feature still effective in the latest DeepSpeed 0.7.0+ versions? I think it's useful in cloud training, where
GPU counts as well as DP degree might change quite often. When I tried it in version 0.7.0 with ZeRO-1, going from 8 GPUs to 16 GPUs, there was an error suggesting that the other 8 GPUs couldn't find the corresponding ZeRO checkpoint.
@kisseternity, you are correct. Unfortunately, we disabled this feature because it turned out not be robust. We are working on a more general replacement.
Hi, is any replacement available for this now? Would appreciate any pointers!
| gharchive/pull-request | 2020-05-27T20:37:32 | 2025-04-01T06:44:57.300389 | {
"authors": [
"jeffra",
"kisseternity",
"kkteru",
"samyam",
"tjruwase"
],
"repo": "microsoft/DeepSpeed",
"url": "https://github.com/microsoft/DeepSpeed/pull/240",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
1400725415 | Fix GPT Neo-X multi-gpu inference
Fix https://github.com/microsoft/DeepSpeed/issues/2293
Using _transpose with mp_size before mp_replace.qkv_copy causes wrong key/query/value weights on each GPU.
Checked on 2xA6000 with EleutherAI/gpt-neox-20b:
Deepspeed is a new, free, and open source tool for analyzing and visualizing the performance of your computer. It is a cross-platform tool that can be used on Windows, Mac OS X, and Linux. It is a graphical tool that can be used to analyze the performance of your computer. It can be used to analyze the performance of your computer. It can be used to analyze the performance of your computer. It can be used to analyze the performance of your computer. It can be used to analyze
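A toy illustration of the failure mode this PR fixes (not the real tensor layout): with a fused QKV weight stored interleaved per attention head, each rank must take whole heads; reordering for mp_size before the per-rank copy hands each GPU the wrong mixture of q/k/v slices.

```python
# Toy model: the fused QKV weight as a flat list of per-head slices,
# interleaved as [q0, k0, v0, q1, k1, v1, ...]. All names are illustrative.
heads = 4
mp_size = 2
fused = [f"{part}{h}" for h in range(heads) for part in ("q", "k", "v")]

# Correct: split by whole heads, so rank r gets heads [r*2, r*2+1].
per_rank = heads // mp_size
correct = [fused[r * per_rank * 3:(r + 1) * per_rank * 3] for r in range(mp_size)]
assert correct[0] == ["q0", "k0", "v0", "q1", "k1", "v1"]

# Wrong: first regroup globally into [all q, all k, all v], then slice
# contiguously -- rank 0 ends up with all the queries and half the keys.
regrouped = [x for part in ("q", "k", "v") for x in fused if x.startswith(part)]
wrong = [regrouped[r * len(fused) // mp_size:(r + 1) * len(fused) // mp_size]
         for r in range(mp_size)]
assert wrong[0] == ["q0", "q1", "q2", "q3", "k0", "k1"]
```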
Thanks @andrewchernyh
This actually solves the problem, thank you for the PR! :)
| gharchive/pull-request | 2022-10-07T07:13:49 | 2025-04-01T06:44:57.302680 | {
"authors": [
"RezaYazdaniAminabadi",
"andrewchernyh"
],
"repo": "microsoft/DeepSpeed",
"url": "https://github.com/microsoft/DeepSpeed/pull/2401",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
1225211901 | IDStorageQueue::EnqueueRequest() and multithreading
Can IDStorageQueue::EnqueueRequest() be called on the same queue from multiple threads at the same time, or should requests be enqueued from one thread at a time? Developer guidance does not make this clear.
Yes, it can be called from multiple threads.
Can we get some documentation anywhere to that effect? People won't be searching closed GitHub issues for answers.
| gharchive/issue | 2022-05-04T10:55:18 | 2025-04-01T06:44:57.304323 | {
"authors": [
"damyanp",
"sherief"
],
"repo": "microsoft/DirectStorage",
"url": "https://github.com/microsoft/DirectStorage/issues/12",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
405900949 | Would like example of using a string key in C++ please
I noticed that the paper says:
"Note that keys are not part of the Faster hash index, unlike many traditional designs, which provides two benefits:
• It reduces the in-memory footprint of the hash index, allowing
us to retain it entirely in memory.
• It separates user data and index metadata, which allows us to
mix and match the hash index with different record allocators."
And also the benchmark-dir/README.md says:
"The output of YCSB's "basic" driver is verbose. A typical line looks like:
INSERT usertable user5575651532496486335 [ field1='...' ... ]
To speed up file ingestion, our basic YCSB benchmark assumes that the input
file consists only of the 8-byte-integer portion of the key--e.g.:
5575651532496486335"
Does FASTER support string keys? And if so, is there any example C++ code showing string key manipulation?
@matthewbrookes added a testcase that shows how to handle variable length keys at https://github.com/microsoft/FASTER/pull/128. The same example can be used to handle string keys in C++.
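As a small illustration of the preprocessing that the benchmark README describes (a hypothetical helper, not part of the FASTER API): the textual key emitted by YCSB's "basic" driver can be reduced to its 8-byte-integer portion before ingestion:

```python
def ycsb_key_to_int(key: str) -> int:
    """Strip the textual prefix (e.g. 'user') and return the numeric portion,
    checking that it fits in an unsigned 8-byte integer."""
    digits = "".join(ch for ch in key if ch.isdigit())
    value = int(digits)
    if value >= 2**64:
        raise ValueError("key does not fit in an unsigned 8-byte integer")
    return value

assert ycsb_key_to_int("user5575651532496486335") == 5575651532496486335
```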
| gharchive/issue | 2019-02-01T22:36:57 | 2025-04-01T06:44:57.320231 | {
"authors": [
"badrishc",
"simonhf"
],
"repo": "microsoft/FASTER",
"url": "https://github.com/microsoft/FASTER/issues/88",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
980743049 | To parse summary acks and nacks, summarizer should use the "data" field instead of "content" field.
Currently, scribe uses the "contents" field to populate summary ack and nack payload. However, this is incorrect and we should use the "data" field instead (just like join/leave). PR #7231 updated the scribe code to start populating "data" field. Now summarizer should be updated as well.
To keep backward compatibility, scribe currently fills both "data" and "contents" fields. Once all clients are updated, scribe will stop populating the "contents" field completely.
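A backward-compatible reader during the transition period could look roughly like this (hypothetical helper name and message shape, not the actual FluidFramework API):

```python
import json

def summary_ack_payload(message: dict):
    """Prefer the new 'data' field; fall back to the legacy 'contents'
    field for messages written by older scribe versions."""
    raw = message.get("data")
    if raw is not None:
        # 'data' may arrive as a JSON-encoded string or an already-parsed object.
        return json.loads(raw) if isinstance(raw, str) else raw
    return message.get("contents")
```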
| gharchive/issue | 2021-08-27T00:08:29 | 2025-04-01T06:44:57.321713 | {
"authors": [
"jatgarg",
"tanviraumi"
],
"repo": "microsoft/FluidFramework",
"url": "https://github.com/microsoft/FluidFramework/issues/7264",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1569107093 | Automation: Main Next Integrate
Manually created main/next merge.
Thanks @tylerbutler!! :)
@sonalideshpandemsft Happy to help! Can you handle the fast forward?
@sonalideshpandemsft Nvm, I was able to do it!
It says 1de364124ff95522c6caa0a0382dbeba884820dc - not something we can merge. Might be because this branch is from your fork?
merged, nvm. thanks!! 😄
| gharchive/pull-request | 2023-02-03T02:26:16 | 2025-04-01T06:44:57.323879 | {
"authors": [
"sonalideshpandemsft",
"tylerbutler"
],
"repo": "microsoft/FluidFramework",
"url": "https://github.com/microsoft/FluidFramework/pull/13983",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1681539482 | fix labeler for tree2 move
How to contribute to this repo.
Guidelines for Pull Requests.
The sections included below are suggestions for what you may want to include.
Feel free to remove or alter parts of this template that do not offer value for your specific change.
Description
Fixes the auto labeler after a move of the SharedTree repo.
this was fixed by https://github.com/microsoft/FluidFramework/pull/15248
| gharchive/pull-request | 2023-04-24T15:26:08 | 2025-04-01T06:44:57.326118 | {
"authors": [
"Abe27342",
"taylorsw04"
],
"repo": "microsoft/FluidFramework",
"url": "https://github.com/microsoft/FluidFramework/pull/15244",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
708787882 | Interpreting magnetometer data
Hello!
I'm reading the magnetometer values from the Hololens 2, but I can't figure out what they are supposed to mean or if my readings are erroneous.
The documentation doesn't state any unit of measurement or meaning for these values.
When I tested it, the values were [464, -198, 483], and after turning 180 degrees they were [456, -223, 463].
What do these values mean, and how are they to be processed?
Thank you!
Best regards,
Max
Hey Max,
have you had any update regarding data extraction from the IMU sensor?
Hi @ozgunkaratas
Through other channels, I received the following explanation:
They *think* the units for the returned values are micro-teslas.
[...]
I'm guessing that the HL2 only uses the magnetometer to sense relative heading. So it's probably not calibrated to do more than that. For example, the research API does not give any extrinsic calibration data on how the magnetometer is precisely oriented to the coordinate frames of the other sensors or cameras. This would also explain why they aren't 100% sure of the units – because they might only be looking at how the direction of the magnetic field changes with HL2 orientation (and not using the absolute field strength).
Hi @max-krichenbauer
Thank you for your update. I was able to compare the HL magnetometer data with my external IMU sensor, which outputs field-strength information on 3 axes in units of Tesla, and the HL and the IMU do not match each other whatsoever. I think you are right in the sense that this is used for sensing relative heading.
Hey @max-krichenbauer
Did you find a way to get a degree value or something like that from the magnetometer outputs? It is hard to find anything on the internet about using the magnetometer values of the HoloLens...
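As a rough, hypothetical illustration (not HoloLens API code): if the device is held level and the raw readings are treated as uncalibrated field components, a relative heading can be derived with atan2. Without tilt compensation and hard/soft-iron calibration, only *changes* in heading are meaningful:

```python
import math

def heading_degrees(mx: float, my: float) -> float:
    """Relative heading in [0, 360), assuming the sensor is held level
    (no tilt compensation) and ignoring hard/soft-iron calibration."""
    return math.degrees(math.atan2(my, mx)) % 360.0

# Using the sample readings from this thread: absolute values are not
# trustworthy, but the difference between two readings gives a relative turn.
h1 = heading_degrees(464.0, -198.0)
h2 = heading_degrees(456.0, -223.0)
relative_turn = (h2 - h1) % 360.0
```

Note that with the readings quoted above, the computed turn is nowhere near 180 degrees, which is consistent with the suspicion in this thread that the sensor is uncalibrated.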
Hi
Hi Max,
Please, how did you get the magnetometer values?
Best regards
| gharchive/issue | 2020-09-25T09:24:19 | 2025-04-01T06:44:57.337390 | {
"authors": [
"ilyassebel",
"made3",
"max-krichenbauer",
"ozgunkaratas"
],
"repo": "microsoft/HoloLens2ForCV",
"url": "https://github.com/microsoft/HoloLens2ForCV/issues/30",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1654863738 | refactor center onMessage, and create UT
Linked GitHub issue ID: #
Pull Request Checklist
[x] Tests for the changes have been added (for bug fixes / features)
[x] Code compiles correctly and all tests pass.
[x] I've read the contributing guide and followed the recommended practices.
[ ] Wikis or README have been reviewed and added / updated if needed (for bug fixes / features)
Does this introduce a breaking change?
If this introduces a breaking change for Hydra Lab users, please describe the impact and migration path.
[ ] Yes
[x] No
Pull Request Description
A few words to explain your changes: refactor center onMessage, and create UT
How you tested it
Please make sure the change is tested, you can test it by adding UTs, do local test and share the screenshots, etc.
Please check the type of change your PR introduces:
[ ] Bugfix
[ ] Feature
[ ] Build related changes
[x] Refactoring (no functional changes, no api changes)
[ ] Code style update (formatting, renaming) or Documentation content changes
[ ] Other (please describe):
Feature UI screenshots or Technical design diagrams
If this is a relatively large or complex change, kick it off by drawing the tech design with PlantUML and explaining why you chose the solution you did and what alternatives you considered, etc...
@zhou9584 could you help review this PR when it's a convenient time for you? Thanks!
And thank you so much for helping resolve the UT issue.
LGTM for the rest.
| gharchive/pull-request | 2023-04-05T03:15:45 | 2025-04-01T06:44:57.344139 | {
"authors": [
"hydraxman",
"olivershen-wow"
],
"repo": "microsoft/HydraLab",
"url": "https://github.com/microsoft/HydraLab/pull/411",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
593810886 | I never could make the library run any simple model. The session always exploded
After installing LightGBM, I was never able to run even a simple model. The R session always aborts after I press Enter to fit the model, even for the very simple examples provided in the demo section. I have two MacBook Pros with different specs and operating systems, but I couldn't use the library on either of them. On a server running Linux, though, it was simple to make it work properly.
> sessionInfo()
R version 3.6.3 (2020-02-29)
Platform: x86_64-apple-darwin15.6.0 (64-bit)
Running under: macOS Catalina 10.15.2
Matrix products: default
BLAS: /System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/libBLAS.dylib
LAPACK: /Library/Frameworks/R.framework/Versions/3.6/Resources/lib/libRlapack.dylib
locale:
[1] en_US.UTF-8/en_US.UTF-8/en_US.UTF-8/C/en_US.UTF-8/en_US.UTF-8
attached base packages:
[1] stats graphics grDevices utils datasets methods base
other attached packages:
[1] lightgbm_2.3.1 R6_2.4.1
loaded via a namespace (and not attached):
[1] compiler_3.6.3 tools_3.6.3 data.table_1.12.6 jsonlite_1.6.1
@danielmarcelino Could you provide us with a bit more information?
How did you install LightGBM's R package? Looking for the specific commands you ran and which compiler you used (if you know it).
If you start from a completely clean session, do you still see this issue?
I see you're running version 2.3.1. Is it possible for you to upgrade to the current version on master? We have had a lot of activity on the R package since 2.3.1 was released in November and it's possible that we've already fixed the issue you're encountering.
Hi @jameslamb, thanks for asking.
Yes, the issue always happens, even using a completely new session. But after you asked those questions, I realised that even though I downloaded the latest folder, which contains a description file stating that the current version is 2.3.2, the version my R loads after install is still 2.3.1. The following is the installation log. Am I doing anything wrong?
daniels-MacBook-Pro-2:build marcelino$ cmake ..
-- The C compiler identification is AppleClang 11.0.3.11030032
-- The CXX compiler identification is AppleClang 11.0.3.11030032
-- Check for working C compiler: /Library/Developer/CommandLineTools/usr/bin/cc
-- Check for working C compiler: /Library/Developer/CommandLineTools/usr/bin/cc - works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Detecting C compile features
-- Detecting C compile features - done
-- Check for working CXX compiler: /Library/Developer/CommandLineTools/usr/bin/c++
-- Check for working CXX compiler: /Library/Developer/CommandLineTools/usr/bin/c++ - works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Found OpenMP_C: -Xclang -fopenmp (found version "3.1")
-- Found OpenMP_CXX: -Xclang -fopenmp (found version "3.1")
-- Found OpenMP: TRUE (found version "3.1")
-- Performing Test MM_PREFETCH
-- Performing Test MM_PREFETCH - Success
-- Using _mm_prefetch
-- Performing Test MM_MALLOC
-- Performing Test MM_MALLOC - Success
-- Using _mm_malloc
-- Configuring done
-- Generating done
-- Build files have been written to: /Users/marcelino/LightGBM/build
Scanning dependencies of target lightgbm
Scanning dependencies of target _lightgbm
[ 6%] Building CXX object CMakeFiles/_lightgbm.dir/src/application/application.cpp.o
[ 6%] Building CXX object CMakeFiles/lightgbm.dir/src/main.cpp.o
[ 6%] Building CXX object CMakeFiles/_lightgbm.dir/src/boosting/boosting.cpp.o
[ 6%] Building CXX object CMakeFiles/lightgbm.dir/src/application/application.cpp.o
[ 8%] Building CXX object CMakeFiles/lightgbm.dir/src/boosting/boosting.cpp.o
[ 9%] Building CXX object CMakeFiles/lightgbm.dir/src/boosting/gbdt.cpp.o
[ 11%] Building CXX object CMakeFiles/_lightgbm.dir/src/boosting/gbdt.cpp.o
[ 12%] Building CXX object CMakeFiles/_lightgbm.dir/src/boosting/gbdt_model_text.cpp.o
[ 14%] Building CXX object CMakeFiles/lightgbm.dir/src/boosting/gbdt_model_text.cpp.o
[ 16%] Building CXX object CMakeFiles/_lightgbm.dir/src/boosting/gbdt_prediction.cpp.o
[ 17%] Building CXX object CMakeFiles/lightgbm.dir/src/boosting/gbdt_prediction.cpp.o
[ 19%] Building CXX object CMakeFiles/lightgbm.dir/src/boosting/prediction_early_stop.cpp.o
[ 20%] Building CXX object CMakeFiles/_lightgbm.dir/src/boosting/prediction_early_stop.cpp.o
[ 22%] Building CXX object CMakeFiles/_lightgbm.dir/src/io/bin.cpp.o
[ 24%] Building CXX object CMakeFiles/lightgbm.dir/src/io/bin.cpp.o
[ 25%] Building CXX object CMakeFiles/_lightgbm.dir/src/io/config.cpp.o
[ 27%] Building CXX object CMakeFiles/lightgbm.dir/src/io/config.cpp.o
[ 29%] Building CXX object CMakeFiles/lightgbm.dir/src/io/config_auto.cpp.o
[ 30%] Building CXX object CMakeFiles/_lightgbm.dir/src/io/config_auto.cpp.o
[ 32%] Building CXX object CMakeFiles/_lightgbm.dir/src/io/dataset.cpp.o
[ 33%] Building CXX object CMakeFiles/lightgbm.dir/src/io/dataset.cpp.o
[ 35%] Building CXX object CMakeFiles/lightgbm.dir/src/io/dataset_loader.cpp.o
[ 37%] Building CXX object CMakeFiles/_lightgbm.dir/src/io/dataset_loader.cpp.o
[ 38%] Building CXX object CMakeFiles/lightgbm.dir/src/io/file_io.cpp.o
[ 40%] Building CXX object CMakeFiles/_lightgbm.dir/src/io/file_io.cpp.o
[ 41%] Building CXX object CMakeFiles/_lightgbm.dir/src/io/json11.cpp.o
[ 43%] Building CXX object CMakeFiles/lightgbm.dir/src/io/json11.cpp.o
[ 45%] Building CXX object CMakeFiles/lightgbm.dir/src/io/metadata.cpp.o
[ 46%] Building CXX object CMakeFiles/_lightgbm.dir/src/io/metadata.cpp.o
[ 48%] Building CXX object CMakeFiles/_lightgbm.dir/src/io/parser.cpp.o
[ 50%] Building CXX object CMakeFiles/lightgbm.dir/src/io/parser.cpp.o
[ 51%] Building CXX object CMakeFiles/_lightgbm.dir/src/io/tree.cpp.o
[ 53%] Building CXX object CMakeFiles/lightgbm.dir/src/io/tree.cpp.o
[ 54%] Building CXX object CMakeFiles/lightgbm.dir/src/metric/dcg_calculator.cpp.o
[ 56%] Building CXX object CMakeFiles/_lightgbm.dir/src/metric/dcg_calculator.cpp.o
[ 58%] Building CXX object CMakeFiles/_lightgbm.dir/src/metric/metric.cpp.o
[ 59%] Building CXX object CMakeFiles/lightgbm.dir/src/metric/metric.cpp.o
[ 61%] Building CXX object CMakeFiles/lightgbm.dir/src/network/linker_topo.cpp.o
[ 62%] Building CXX object CMakeFiles/_lightgbm.dir/src/network/linker_topo.cpp.o
[ 64%] Building CXX object CMakeFiles/lightgbm.dir/src/network/linkers_mpi.cpp.o
[ 66%] Building CXX object CMakeFiles/lightgbm.dir/src/network/linkers_socket.cpp.o
[ 67%] Building CXX object CMakeFiles/_lightgbm.dir/src/network/linkers_mpi.cpp.o
[ 69%] Building CXX object CMakeFiles/lightgbm.dir/src/network/network.cpp.o
[ 70%] Building CXX object CMakeFiles/_lightgbm.dir/src/network/linkers_socket.cpp.o
[ 72%] Building CXX object CMakeFiles/_lightgbm.dir/src/network/network.cpp.o
[ 74%] Building CXX object CMakeFiles/_lightgbm.dir/src/objective/objective_function.cpp.o
[ 75%] Building CXX object CMakeFiles/lightgbm.dir/src/objective/objective_function.cpp.o
[ 77%] Building CXX object CMakeFiles/lightgbm.dir/src/treelearner/data_parallel_tree_learner.cpp.o
[ 79%] Building CXX object CMakeFiles/lightgbm.dir/src/treelearner/feature_parallel_tree_learner.cpp.o
[ 80%] Building CXX object CMakeFiles/_lightgbm.dir/src/treelearner/data_parallel_tree_learner.cpp.o
[ 82%] Building CXX object CMakeFiles/_lightgbm.dir/src/treelearner/feature_parallel_tree_learner.cpp.o
[ 83%] Building CXX object CMakeFiles/_lightgbm.dir/src/treelearner/gpu_tree_learner.cpp.o
[ 85%] Building CXX object CMakeFiles/lightgbm.dir/src/treelearner/gpu_tree_learner.cpp.o
[ 87%] Building CXX object CMakeFiles/_lightgbm.dir/src/treelearner/serial_tree_learner.cpp.o
[ 88%] Building CXX object CMakeFiles/lightgbm.dir/src/treelearner/serial_tree_learner.cpp.o
[ 90%] Building CXX object CMakeFiles/lightgbm.dir/src/treelearner/tree_learner.cpp.o
[ 91%] Building CXX object CMakeFiles/lightgbm.dir/src/treelearner/voting_parallel_tree_learner.cpp.o
[ 93%] Building CXX object CMakeFiles/_lightgbm.dir/src/treelearner/tree_learner.cpp.o
[ 95%] Building CXX object CMakeFiles/_lightgbm.dir/src/treelearner/voting_parallel_tree_learner.cpp.o
[ 96%] Building CXX object CMakeFiles/_lightgbm.dir/src/c_api.cpp.o
[ 98%] Linking CXX executable ../lightgbm
[ 98%] Built target lightgbm
[100%] Linking CXX shared library ../lib_lightgbm.so
[100%] Built target _lightgbm
Thanks! This looks exactly like our recommended approach to installing the R package right now, and I don't see any obvious issues in the logs.
Can you share the exact R code that you're running which is resulting in the error? I can kind of see it in your first screenshot, but I want to be sure there aren't other things you've run in the console in that session (just trying to rule things out).
I believe we haven't tested with Catalina yet, so maybe there is something Catalina-specific that is causing this.
I want to find the issue but I also want to be sure you can get back to work...do you have gcc available? (you can check by running gcc --version)
If you do, you could try rebuilding with gcc? On my Mac, I do this in a terminal (per our docs)
export CXX=g++-8
export CC=gcc-8
Rscript build_r.R
I use g++-8 because that's the version that I have. You may need to change the command above to whatever version you have.
Sure, thanks for replying. Please consider that I upgraded from Mojave to Catalina. Since then I've seen many Mac users saying they can't compile a C program on a Mac after upgrading to Catalina. Perhaps Apple just made our lives more difficult.
daniels-MacBook-Pro-2:~ marcelino$ gcc --version
Configured with: --prefix=/Library/Developer/CommandLineTools/usr --with-gxx-include-dir=/Library/Developer/CommandLineTools/SDKs/MacOSX.sdk/usr/include/c++/4.2.1
Apple clang version 11.0.3 (clang-1103.0.32.29)
Target: x86_64-apple-darwin19.2.0
Thread model: posix
InstalledDir: /Library/Developer/CommandLineTools/usr/bin
daniels-MacBook-Pro-2:~ marcelino$
The reproduction example is this:
library(lightgbm)
data(agaricus.train, package = "lightgbm")
train <- agaricus.train
dtrain <- lgb.Dataset(train$data, label = train$label)
data(agaricus.test, package = "lightgbm")
test <- agaricus.test
dtest <- lgb.Dataset.create.valid(dtrain, test$data, label = test$label)
params <- list(objective = "regression", metric = "l2")
valids <- list(test = dtest)
model <- lgb.train(
params = params
, data = dtrain
, nrounds = 10L
, valids = valids
, min_data = 1L
, learning_rate = 1.0
, early_stopping_rounds = 5L
)
After rebuilding with the following parameters the problem was apparently solved. I'll do some testing tomorrow, but at least it worked now. Thanks.
export CXX=g++-9
export CC=gcc-9
Rscript build_r.R
@danielmarcelino thanks very much! My best guess is that we have a Catalina-specific issue 😬 .
I'm glad that gcc is working for you! I'll need to experiment on Catalina and see if I can find the issue. Thanks very much for your bug report.
@danielmarcelino have you been having issues with other C++ projects on your Mac? A friend of mine said he's been unable to build R packages with C++ code on Catalina for a bit, and this fixed it: https://stackoverflow.com/questions/59071881/problems-with-c-and-gems-on-osx-catalina/59072909#59072909
Yes, I had. I saw some error messages while installing the Rcpp package, for instance. But the issue with LightGBM was more subtle, since it compiled and installed perfectly yet still didn't work. Thanks a lot for your guidance here.
@jameslamb Hmm, seems that it is a general Catalina problem. Nothing can be done from LightGBM side. Can we close this issue?
@danielmarcelino I'm going to close this issue for now, since it does seem like an issue with Catalina and since you were able to use g++ to get a successful installation.
I do have an experimental setup that ignores CMake and uses the toolchain commonly used by CRAN, and a few who've tested it have reported success on Catalina. It's here if you want to try: https://github.com/jameslamb/LightGBM/pull/15
./build-cran-package.sh
R CMD INSTALL lightgbm_2.3.2.tar.gz
It is VERY experimental and not something we're officially supporting yet, but when we eventually add it here it may help.
^ @StrikerRUS I'm mentioning this just to document that I think any incompatibility with Catalina might be limited to the installation path that uses CMake (Rscript build_r.R) and doesn't mean that the package we prepare for CRAN will fail for Catalina users.
| gharchive/issue | 2020-04-04T11:48:30 | 2025-04-01T06:44:57.359472 | {
"authors": [
"StrikerRUS",
"danielmarcelino",
"jameslamb"
],
"repo": "microsoft/LightGBM",
"url": "https://github.com/microsoft/LightGBM/issues/2970",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
696004391 | Monotone constraint with Quantile distribution does not work correctly
I tried the quantile distribution while a monotone constraint is set, and it looks like it is not implemented correctly. The resulting prediction is not monotone; see the example and image below.
How you are using LightGBM?
Python package
Environment info
Operating System: Linux Debian 10.5 (x86-64)
Python version: Python 3.7.3rc1
LightGBM version or commit hash: lightgbm==2.3.1
Reproducible example
# prepare data
import numpy as np
np.random.seed(1)
def f(x):
"""The function to predict."""
return x * np.sin(x)
X = np.atleast_2d(np.random.uniform(0, 10.0, size=100)).T
X = X.astype(np.float32)
y = f(X).ravel()
dy = 1.5 + 1.0 * np.random.random(y.shape)
noise = np.random.normal(0, dy)
y += noise
y = y.astype(np.float32)
xx = np.atleast_2d(np.linspace(0, 10, 100)).T
xx = xx.astype(np.float32)
# prepare plot function
import matplotlib.pyplot as plt
def plot_prediction_quantile(xx, fxx, xx_label, X, y, y_pred,ylim, title, y_upper=None, y_lower=None,
confidence_label=None):
fig = plt.figure()
plt.plot(xx, fxx, 'g:', label=xx_label)
if (X is not None) and (y is not None):
plt.plot(X, y, 'b.', markersize=10, label=u'Observations')
plt.plot(xx, y_pred, 'r-', label=u'Prediction')
if (y_upper is not None) and (y_lower is not None):
plt.plot(xx, y_upper, 'k-')
plt.plot(xx, y_lower, 'k-')
plt.fill(np.concatenate([xx, xx[::-1]]),
np.concatenate([y_upper, y_lower[::-1]]),
alpha=.5, fc='b', ec='None', label=confidence_label)
plt.xlabel('$x$')
plt.ylabel('$f(x)$')
plt.ylim(ylim)
plt.legend(loc='upper left')
plt.title(title)
plt.show()
# prepare lightgbm
from lightgbm import LGBMRegressor
lgb_params = {
'n_jobs': 1,
'max_depth': 5,
'min_data_in_leaf': 3,
'n_estimators': 100,
'learning_rate': 0.1,
'colsample_bytree': 0.9,
'boosting_type': 'gbdt',
'monotone_constraints': -1
}
lgb_no_monotonicity = LGBMRegressor(objective='quantile', alpha=0.4, **lgb_params)
lgb_no_monotonicity .fit(X, y)
y_no_monotonicity_lgb = lgb_no_monotonicity.predict(xx)
plot_prediction_quantile(xx, f(xx), r'$f(x)$', X, y, y_no_monotonicity_lgb, [min(y)-0.2*min(y), max(y)+0.2*max(y)], "LightGBM Quantile (quantile_alpha=0.4) with monotone constraint -1 - Monotonicity constraint violated!")
Result plot
Steps to reproduce
run from console using
python example.py
see the resulting image
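Instead of eyeballing the plot, the violation can also be detected programmatically (a NumPy sketch, independent of LightGBM internals): with `monotone_constraints = -1`, predictions over an increasing feature grid should be non-increasing:

```python
import numpy as np

def violates_decreasing_constraint(y_pred, tol=1e-12):
    """True if predictions ever increase along an increasing feature grid,
    i.e. the monotone-decreasing constraint (-1) is violated."""
    return bool(np.any(np.diff(np.asarray(y_pred).ravel()) > tol))

assert not violates_decreasing_constraint([3.0, 2.0, 2.0, 1.5])
assert violates_decreasing_constraint([3.0, 2.0, 2.5, 1.5])
```

Running this on `y_no_monotonicity_lgb` from the example above returns True, confirming the constraint is violated.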
Yes, for objective functions with RenewTreeOutput, the monotone constraint will be broken.
There are no good solutions so far, as this line-search process optimizes different leaves independently.
I don't think this is a bug; I will create a PR to warn the user when using them together.
Also, ping @CharlesAuguste for future possible solutions.
I am not familiar with how the quantile objective function works, but I will take a look!
I gave this some thought, and one way I can see this working would be:
Reset all constraints before RenewTreeOutput is being called;
Every time a leaf output is updated by RenewTreeOutput, update the constraints of the contiguous leaves using a function like GoUpToFindLeavesToUpdate (https://github.com/microsoft/LightGBM/blob/master/src/treelearner/monotone_constraints.hpp#L234);
Introduce constraints on the output of PercentileFun (https://github.com/microsoft/LightGBM/blob/master/src/objective/regression_objective.hpp#L18) so the new output will follow the constraints.
I don't have any theoretical guarantee that this would work, but it seems like a reasonable procedure to me. Any thoughts @guolinke ?
@CharlesAuguste thank you so much!
Can't we just update the leaf outputs after RenewTreeOutput?
theoretically, the post-fix solution cannot learn the "optimal" tree structure, as we don't consider monotone constraints during tree growth.
But RenewTreeOutput is also a post-fix solution, since the tree structure is learned with a different objective.
@guolinke yes updating leaf outputs after RenewTreeOutput should work the same as far as I understand it. I can give that a try in the coming days, and we'll see how that works!
@CharlesAuguste was this ever fixed?
Unfortunately I haven't fixed it, and I am not able to spend time on it right now. I am sorry about that.
was this ever fixed?
Thanks @alisoltanisobh ! Looking at #3380, the PR that resulted in this issue being automatically closed, I don't think so.
I've renamed this to "support monotone constraints with quantile distribution" and added it to #2302, where we track other feature requests for this project.
| gharchive/issue | 2020-09-08T16:13:54 | 2025-04-01T06:44:57.372236 | {
"authors": [
"CharlesAuguste",
"alisoltanisobh",
"cah-autoit",
"guolinke",
"jameslamb",
"maurever"
],
"repo": "microsoft/LightGBM",
"url": "https://github.com/microsoft/LightGBM/issues/3371",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
999473876 | CUDA CI jobs are broken in master
Initially reported in https://github.com/microsoft/LightGBM/pull/4606#issuecomment-921374740.
docker: Error response from daemon: OCI runtime create failed: container_linux.go:380: starting container process caused: process_linux.go:545: container init caused: Running hook #0:: error running hook: exit status 1, stdout: , stderr: nvidia-container-cli: initialization error: nvml error: driver not loaded: unknown.
Full logs:
2021-09-15T23:12:00.6215072Z Can't find any online and idle self-hosted runner in the current repository, account/organization and enterprise that matches the required labels: 'self-hosted , linux'
2021-09-15T23:12:00.7256411Z Waiting for a self-hosted runner to pickup this job...
2021-09-15T23:12:47.4157030Z Job is about to start running on the runner: nv6-01 (repository)
2021-09-15T23:12:52.0695982Z Current runner version: '2.282.0'
2021-09-15T23:12:52.0700468Z Runner name: 'nv6-01'
2021-09-15T23:12:52.0701070Z Runner group name: 'Default'
2021-09-15T23:12:52.0702313Z Machine name: 'nv6-01'
2021-09-15T23:12:52.0705689Z ##[group]GITHUB_TOKEN Permissions
2021-09-15T23:12:52.0706967Z Actions: write
2021-09-15T23:12:52.0707553Z Checks: write
2021-09-15T23:12:52.0708038Z Contents: write
2021-09-15T23:12:52.0708549Z Deployments: write
2021-09-15T23:12:52.0709026Z Discussions: write
2021-09-15T23:12:52.0709541Z Issues: write
2021-09-15T23:12:52.0710017Z Metadata: read
2021-09-15T23:12:52.0710512Z Packages: write
2021-09-15T23:12:52.0710989Z PullRequests: write
2021-09-15T23:12:52.0711583Z RepositoryProjects: write
2021-09-15T23:12:52.0712180Z SecurityEvents: write
2021-09-15T23:12:52.0712745Z Statuses: write
2021-09-15T23:12:52.0713361Z ##[endgroup]
2021-09-15T23:12:52.0716281Z Prepare workflow directory
2021-09-15T23:12:52.1453868Z Prepare all required actions
2021-09-15T23:12:52.1463663Z Getting action download info
2021-09-15T23:12:52.4152015Z Download action repository 'actions/checkout@v1' (SHA:50fbc622fc4ef5163becd7fab6573eac35f8462e)
2021-09-15T23:12:52.8677653Z ##[group]Run sudo apt-get update
2021-09-15T23:12:52.8678549Z [36;1msudo apt-get update[0m
2021-09-15T23:12:52.8679176Z [36;1msudo apt-get install --no-install-recommends -y \[0m
2021-09-15T23:12:52.8679846Z [36;1m apt-transport-https \[0m
2021-09-15T23:12:52.8680400Z [36;1m ca-certificates \[0m
2021-09-15T23:12:52.8680840Z [36;1m curl \[0m
2021-09-15T23:12:52.8681232Z [36;1m git \[0m
2021-09-15T23:12:52.8681647Z [36;1m gnupg-agent \[0m
2021-09-15T23:12:52.8682251Z [36;1m software-properties-common[0m
2021-09-15T23:12:52.8683135Z [36;1mcurl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -[0m
2021-09-15T23:12:52.8684271Z [36;1msudo add-apt-repository "deb [arch=amd64] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable" -y[0m
2021-09-15T23:12:52.8685440Z [36;1mcurl -sL https://nvidia.github.io/nvidia-docker/gpgkey | sudo apt-key add -[0m
2021-09-15T23:12:52.8686789Z [36;1mcurl -sL https://nvidia.github.io/nvidia-docker/$(. /etc/os-release;echo $ID$VERSION_ID)/nvidia-docker.list | sudo tee /etc/apt/sources.list.d/nvidia-docker.list[0m
2021-09-15T23:12:52.8687869Z [36;1msudo apt-get update[0m
2021-09-15T23:12:52.8688484Z [36;1msudo apt-get install --no-install-recommends -y \[0m
2021-09-15T23:12:52.8689106Z [36;1m containerd.io \[0m
2021-09-15T23:12:52.8689561Z [36;1m docker-ce \[0m
2021-09-15T23:12:52.8690033Z [36;1m docker-ce-cli \[0m
2021-09-15T23:12:52.8690531Z [36;1m nvidia-docker2[0m
2021-09-15T23:12:52.8691062Z [36;1msudo chmod a+rw /var/run/docker.sock[0m
2021-09-15T23:12:52.8691625Z [36;1msudo systemctl restart docker[0m
2021-09-15T23:12:52.8713272Z shell: /bin/bash -e {0}
2021-09-15T23:12:52.8713839Z env:
2021-09-15T23:12:52.8714263Z github_actions: true
2021-09-15T23:12:52.8714687Z os_name: linux
2021-09-15T23:12:52.8715092Z task: cuda
2021-09-15T23:12:52.8715517Z conda_env: test-env
2021-09-15T23:12:52.8715964Z ##[endgroup]
2021-09-15T23:12:52.9693936Z Hit:1 http://azure.archive.ubuntu.com/ubuntu bionic InRelease
2021-09-15T23:12:52.9695849Z Get:2 http://azure.archive.ubuntu.com/ubuntu bionic-updates InRelease [88.7 kB]
2021-09-15T23:12:52.9697851Z Get:3 http://azure.archive.ubuntu.com/ubuntu bionic-backports InRelease [74.6 kB]
2021-09-15T23:12:52.9912762Z Get:4 https://nvidia.github.io/libnvidia-container/stable/ubuntu18.04/amd64 InRelease [1484 B]
2021-09-15T23:12:52.9917711Z Hit:5 https://download.docker.com/linux/ubuntu bionic InRelease
2021-09-15T23:12:52.9970738Z Get:6 https://nvidia.github.io/nvidia-container-runtime/stable/ubuntu18.04/amd64 InRelease [1481 B]
2021-09-15T23:12:53.0093847Z Hit:7 https://nvidia.github.io/nvidia-docker/ubuntu18.04/amd64 InRelease
2021-09-15T23:12:53.0317450Z Hit:8 http://packages.microsoft.com/repos/azurecore bionic InRelease
2021-09-15T23:12:53.1211953Z Hit:9 http://security.ubuntu.com/ubuntu bionic-security InRelease
2021-09-15T23:12:53.7836628Z Fetched 166 kB in 1s (230 kB/s)
2021-09-15T23:12:54.7220539Z Reading package lists...
2021-09-15T23:12:54.8237085Z Reading package lists...
2021-09-15T23:12:54.9680864Z Building dependency tree...
2021-09-15T23:12:54.9685268Z Reading state information...
2021-09-15T23:12:55.0854263Z ca-certificates is already the newest version (20210119~18.04.1).
2021-09-15T23:12:55.0856865Z curl is already the newest version (7.58.0-2ubuntu3.15).
2021-09-15T23:12:55.0858003Z git is already the newest version (1:2.17.1-1ubuntu0.9).
2021-09-15T23:12:55.0859277Z software-properties-common is already the newest version (0.96.24.32.14).
2021-09-15T23:12:55.0861009Z apt-transport-https is already the newest version (1.6.14).
2021-09-15T23:12:55.0862242Z gnupg-agent is already the newest version (2.2.4-1ubuntu1.4).
2021-09-15T23:12:55.0863256Z The following packages were automatically installed and are no longer required:
2021-09-15T23:12:55.0864652Z linux-azure-5.4-cloud-tools-5.4.0-1031
2021-09-15T23:12:55.0865956Z linux-azure-5.4-cloud-tools-5.4.0-1032
2021-09-15T23:12:55.0867645Z linux-azure-5.4-cloud-tools-5.4.0-1034
2021-09-15T23:12:55.0868834Z linux-azure-5.4-cloud-tools-5.4.0-1035
2021-09-15T23:12:55.0869926Z linux-azure-5.4-cloud-tools-5.4.0-1036
2021-09-15T23:12:55.0871050Z linux-azure-5.4-cloud-tools-5.4.0-1039
2021-09-15T23:12:55.0872126Z linux-azure-5.4-cloud-tools-5.4.0-1040
2021-09-15T23:12:55.0874428Z linux-azure-5.4-cloud-tools-5.4.0-1041
2021-09-15T23:12:55.0876302Z linux-azure-5.4-cloud-tools-5.4.0-1044
2021-09-15T23:12:55.0878167Z linux-azure-5.4-cloud-tools-5.4.0-1046
2021-09-15T23:12:55.0879615Z linux-azure-5.4-cloud-tools-5.4.0-1047
2021-09-15T23:12:55.0881095Z linux-azure-5.4-cloud-tools-5.4.0-1048
2021-09-15T23:12:55.0882602Z linux-azure-5.4-cloud-tools-5.4.0-1051
2021-09-15T23:12:55.0884289Z linux-azure-5.4-cloud-tools-5.4.0-1055 linux-azure-5.4-headers-5.4.0-1031
2021-09-15T23:12:55.0886185Z linux-azure-5.4-headers-5.4.0-1032 linux-azure-5.4-headers-5.4.0-1034
2021-09-15T23:12:55.0890916Z linux-azure-5.4-headers-5.4.0-1035 linux-azure-5.4-headers-5.4.0-1036
2021-09-15T23:12:55.0892481Z linux-azure-5.4-headers-5.4.0-1039 linux-azure-5.4-headers-5.4.0-1040
2021-09-15T23:12:55.0893947Z linux-azure-5.4-headers-5.4.0-1041 linux-azure-5.4-headers-5.4.0-1044
2021-09-15T23:12:55.0895312Z linux-azure-5.4-headers-5.4.0-1046 linux-azure-5.4-headers-5.4.0-1047
2021-09-15T23:12:55.0896688Z linux-azure-5.4-headers-5.4.0-1048 linux-azure-5.4-headers-5.4.0-1051
2021-09-15T23:12:55.0898029Z linux-azure-5.4-headers-5.4.0-1055 linux-azure-5.4-tools-5.4.0-1031
2021-09-15T23:12:55.0899298Z linux-azure-5.4-tools-5.4.0-1032 linux-azure-5.4-tools-5.4.0-1034
2021-09-15T23:12:55.0900548Z linux-azure-5.4-tools-5.4.0-1035 linux-azure-5.4-tools-5.4.0-1036
2021-09-15T23:12:55.0901796Z linux-azure-5.4-tools-5.4.0-1039 linux-azure-5.4-tools-5.4.0-1040
2021-09-15T23:12:55.0903047Z linux-azure-5.4-tools-5.4.0-1041 linux-azure-5.4-tools-5.4.0-1044
2021-09-15T23:12:55.0904384Z linux-azure-5.4-tools-5.4.0-1046 linux-azure-5.4-tools-5.4.0-1047
2021-09-15T23:12:55.0906346Z linux-azure-5.4-tools-5.4.0-1048 linux-azure-5.4-tools-5.4.0-1051
2021-09-15T23:12:55.0907412Z linux-azure-5.4-tools-5.4.0-1055
2021-09-15T23:12:55.0908160Z Use 'sudo apt autoremove' to remove them.
2021-09-15T23:12:55.1315489Z 0 upgraded, 0 newly installed, 0 to remove and 67 not upgraded.
2021-09-15T23:12:55.1941550Z Warning: apt-key output should not be parsed (stdout is not a terminal)
2021-09-15T23:12:55.3033324Z OK
2021-09-15T23:12:55.9840762Z Hit:1 http://azure.archive.ubuntu.com/ubuntu bionic InRelease
2021-09-15T23:12:55.9864618Z Get:2 http://azure.archive.ubuntu.com/ubuntu bionic-updates InRelease [88.7 kB]
2021-09-15T23:12:55.9866401Z Get:3 http://azure.archive.ubuntu.com/ubuntu bionic-backports InRelease [74.6 kB]
2021-09-15T23:12:56.0088056Z Get:4 https://nvidia.github.io/libnvidia-container/stable/ubuntu18.04/amd64 InRelease [1484 B]
2021-09-15T23:12:56.0103947Z Hit:5 https://download.docker.com/linux/ubuntu bionic InRelease
2021-09-15T23:12:56.0140909Z Get:6 https://nvidia.github.io/nvidia-container-runtime/stable/ubuntu18.04/amd64 InRelease [1481 B]
2021-09-15T23:12:56.0176024Z Hit:7 http://packages.microsoft.com/repos/azurecore bionic InRelease
2021-09-15T23:12:56.0193559Z Hit:8 https://nvidia.github.io/nvidia-docker/ubuntu18.04/amd64 InRelease
2021-09-15T23:12:56.1341058Z Hit:9 http://security.ubuntu.com/ubuntu bionic-security InRelease
2021-09-15T23:12:58.7911047Z Fetched 166 kB in 1s (228 kB/s)
2021-09-15T23:12:59.7421970Z Reading package lists...
2021-09-15T23:12:59.8363920Z Warning: apt-key output should not be parsed (stdout is not a terminal)
2021-09-15T23:12:59.9533374Z OK
2021-09-15T23:12:59.9943407Z deb https://nvidia.github.io/libnvidia-container/stable/ubuntu18.04/$(ARCH) /
2021-09-15T23:12:59.9944794Z #deb https://nvidia.github.io/libnvidia-container/experimental/ubuntu18.04/$(ARCH) /
2021-09-15T23:12:59.9946139Z deb https://nvidia.github.io/nvidia-container-runtime/stable/ubuntu18.04/$(ARCH) /
2021-09-15T23:12:59.9947885Z #deb https://nvidia.github.io/nvidia-container-runtime/experimental/ubuntu18.04/$(ARCH) /
2021-09-15T23:12:59.9949152Z deb https://nvidia.github.io/nvidia-docker/ubuntu18.04/$(ARCH) /
2021-09-15T23:13:00.0858628Z Hit:1 http://azure.archive.ubuntu.com/ubuntu bionic InRelease
2021-09-15T23:13:00.0860384Z Get:2 http://azure.archive.ubuntu.com/ubuntu bionic-updates InRelease [88.7 kB]
2021-09-15T23:13:00.0871844Z Get:3 http://azure.archive.ubuntu.com/ubuntu bionic-backports InRelease [74.6 kB]
2021-09-15T23:13:00.1094525Z Get:4 https://nvidia.github.io/libnvidia-container/stable/ubuntu18.04/amd64 InRelease [1484 B]
2021-09-15T23:13:00.1106033Z Hit:5 https://download.docker.com/linux/ubuntu bionic InRelease
2021-09-15T23:13:00.1144289Z Get:6 https://nvidia.github.io/nvidia-container-runtime/stable/ubuntu18.04/amd64 InRelease [1481 B]
2021-09-15T23:13:00.1194299Z Hit:7 https://nvidia.github.io/nvidia-docker/ubuntu18.04/amd64 InRelease
2021-09-15T23:13:00.1203912Z Hit:8 http://packages.microsoft.com/repos/azurecore bionic InRelease
2021-09-15T23:13:00.3105003Z Hit:9 http://security.ubuntu.com/ubuntu bionic-security InRelease
2021-09-15T23:13:00.8956958Z Fetched 166 kB in 1s (230 kB/s)
2021-09-15T23:13:01.8442490Z Reading package lists...
2021-09-15T23:13:01.9464317Z Reading package lists...
2021-09-15T23:13:02.0984029Z Building dependency tree...
2021-09-15T23:13:02.0989655Z Reading state information...
2021-09-15T23:13:02.2110377Z containerd.io is already the newest version (1.4.9-1).
2021-09-15T23:13:02.2111411Z docker-ce-cli is already the newest version (5:20.10.8~3-0~ubuntu-bionic).
2021-09-15T23:13:02.2112329Z docker-ce is already the newest version (5:20.10.8~3-0~ubuntu-bionic).
2021-09-15T23:13:02.2113146Z nvidia-docker2 is already the newest version (2.6.0-1).
2021-09-15T23:13:02.2113885Z The following packages were automatically installed and are no longer required:
2021-09-15T23:13:02.2152205Z Use 'sudo apt autoremove' to remove them.
2021-09-15T23:13:02.2548533Z 0 upgraded, 0 newly installed, 0 to remove and 67 not upgraded.
2021-09-15T23:13:11.2838582Z ##[group]Run sudo rm -rf $GITHUB_WORKSPACE
2021-09-15T23:13:11.2839209Z sudo rm -rf $GITHUB_WORKSPACE
2021-09-15T23:13:11.2859604Z shell: /bin/bash -e {0}
2021-09-15T23:13:11.2859946Z env:
2021-09-15T23:13:11.2860287Z github_actions: true
2021-09-15T23:13:11.2860627Z os_name: linux
2021-09-15T23:13:11.2860950Z task: cuda
2021-09-15T23:13:11.2861283Z conda_env: test-env
2021-09-15T23:13:11.2861643Z ##[endgroup]
2021-09-15T23:13:11.3816298Z ##[group]Run actions/checkout@v1
2021-09-15T23:13:11.3816706Z with:
2021-09-15T23:13:11.3817038Z fetch-depth: 5
2021-09-15T23:13:11.3817410Z submodules: true
2021-09-15T23:13:11.3817758Z clean: true
2021-09-15T23:13:11.3818061Z env:
2021-09-15T23:13:11.3818375Z github_actions: true
2021-09-15T23:13:11.3818727Z os_name: linux
2021-09-15T23:13:11.3819029Z task: cuda
2021-09-15T23:13:11.3819409Z conda_env: test-env
2021-09-15T23:13:11.3819755Z ##[endgroup]
2021-09-15T23:13:11.7329559Z Syncing repository: microsoft/LightGBM
2021-09-15T23:13:11.7455807Z ##[command]git version
2021-09-15T23:13:11.7745974Z git version 2.17.1
2021-09-15T23:13:11.7856545Z ##[command]git init "/home/guoke/actions-runner/_work/LightGBM/LightGBM"
2021-09-15T23:13:11.7884021Z Initialized empty Git repository in /home/guoke/actions-runner/_work/LightGBM/LightGBM/.git/
2021-09-15T23:13:11.7898055Z ##[command]git remote add origin https://github.com/microsoft/LightGBM
2021-09-15T23:13:11.7924687Z ##[command]git config gc.auto 0
2021-09-15T23:13:11.7949418Z ##[command]git config --get-all http.https://github.com/microsoft/LightGBM.extraheader
2021-09-15T23:13:11.8017128Z ##[command]git -c http.extraheader="AUTHORIZATION: basic ***" fetch --tags --prune --progress --no-recurse-submodules --depth=5 origin +refs/heads/*:refs/remotes/origin/* +refs/pull/4606/merge:refs/remotes/pull/4606/merge
2021-09-15T23:13:12.3992361Z remote: Enumerating objects: 5037, done.
2021-09-15T23:13:12.4092898Z remote: Counting objects: 100% (5037/5037), done.
2021-09-15T23:13:12.9135045Z remote: Compressing objects: 100% (3137/3137), done.
2021-09-15T23:13:13.2571731Z remote: Total 5037 (delta 3613), reused 2717 (delta 1748), pack-reused 0
2021-09-15T23:13:13.2594979Z Receiving objects: 100% (5037/5037), 7.28 MiB | 21.17 MiB/s, done.
2021-09-15T23:13:13.3307109Z Resolving deltas: 100% (3613/3613), done.
2021-09-15T23:13:13.4165516Z From https://github.com/microsoft/LightGBM
2021-09-15T23:13:13.4167000Z * [new branch] ci/qemu-tests -> origin/ci/qemu-tests
2021-09-15T23:13:13.4201968Z * [new branch] ci/static-security -> origin/ci/static-security
2021-09-15T23:13:13.4205512Z * [new branch] feat/faster-r-cmake -> origin/feat/faster-r-cmake
2021-09-15T23:13:13.4206657Z * [new branch] fix/network-setup -> origin/fix/network-setup
2021-09-15T23:13:13.4207722Z * [new branch] fix/r-segfaults -> origin/fix/r-segfaults
2021-09-15T23:13:13.4208820Z * [new branch] fix/split-comparison -> origin/fix/split-comparison
2021-09-15T23:13:13.4209925Z * [new branch] function-hints -> origin/function-hints
2021-09-15T23:13:13.4211123Z * [new branch] master -> origin/master
2021-09-15T23:13:13.4212096Z * [new branch] mingw_link -> origin/mingw_link
2021-09-15T23:13:13.4213181Z * [new branch] r/info-setter-getter -> origin/r/info-setter-getter
2021-09-15T23:13:13.4214272Z * [new ref] refs/pull/4606/merge -> pull/4606/merge
2021-09-15T23:13:13.4215115Z * [new tag] stable -> stable
2021-09-15T23:13:13.4215907Z * [new tag] v1 -> v1
2021-09-15T23:13:13.4216660Z * [new tag] v2.0 -> v2.0
2021-09-15T23:13:13.4217423Z * [new tag] v2.0.10 -> v2.0.10
2021-09-15T23:13:13.4218206Z * [new tag] v2.0.11 -> v2.0.11
2021-09-15T23:13:13.4218982Z * [new tag] v2.0.12 -> v2.0.12
2021-09-15T23:13:13.4219759Z * [new tag] v2.0.3 -> v2.0.3
2021-09-15T23:13:13.4220530Z * [new tag] v2.0.4 -> v2.0.4
2021-09-15T23:13:13.4221289Z * [new tag] v2.0.5 -> v2.0.5
2021-09-15T23:13:13.4222064Z * [new tag] v2.0.6 -> v2.0.6
2021-09-15T23:13:13.4222834Z * [new tag] v2.0.7 -> v2.0.7
2021-09-15T23:13:13.4223602Z * [new tag] v2.0.8 -> v2.0.8
2021-09-15T23:13:13.4224375Z * [new tag] v2.1.0 -> v2.1.0
2021-09-15T23:13:13.4225146Z * [new tag] v2.1.1 -> v2.1.1
2021-09-15T23:13:13.4225902Z * [new tag] v2.1.2 -> v2.1.2
2021-09-15T23:13:13.4226674Z * [new tag] v2.2.0 -> v2.2.0
2021-09-15T23:13:13.4227446Z * [new tag] v2.2.1 -> v2.2.1
2021-09-15T23:13:13.4228201Z * [new tag] v2.2.2 -> v2.2.2
2021-09-15T23:13:13.4228970Z * [new tag] v2.2.3 -> v2.2.3
2021-09-15T23:13:13.4229729Z * [new tag] v2.3.0 -> v2.3.0
2021-09-15T23:13:13.4230500Z * [new tag] v2.3.1 -> v2.3.1
2021-09-15T23:13:13.4231272Z * [new tag] v3.0.0 -> v3.0.0
2021-09-15T23:13:13.4232051Z * [new tag] v3.0.0rc1 -> v3.0.0rc1
2021-09-15T23:13:13.4232843Z * [new tag] v3.1.0 -> v3.1.0
2021-09-15T23:13:13.4233832Z * [new tag] v3.1.1 -> v3.1.1
2021-09-15T23:13:13.4234602Z * [new tag] v3.2.0 -> v3.2.0
2021-09-15T23:13:13.4235373Z * [new tag] v3.2.1 -> v3.2.1
2021-09-15T23:13:13.4322671Z ##[command]git checkout --progress --force refs/remotes/pull/4606/merge
2021-09-15T23:13:13.5349242Z Note: checking out 'refs/remotes/pull/4606/merge'.
2021-09-15T23:13:13.5349677Z
2021-09-15T23:13:13.5351286Z You are in 'detached HEAD' state. You can look around, make experimental
2021-09-15T23:13:13.5352012Z changes and commit them, and you can discard any commits you make in this
2021-09-15T23:13:13.5352709Z state without impacting any branches by performing another checkout.
2021-09-15T23:13:13.5353130Z
2021-09-15T23:13:13.5353619Z If you want to create a new branch to retain commits you create, you may
2021-09-15T23:13:13.5354542Z do so (now or later) by using -b with the checkout command again. Example:
2021-09-15T23:13:13.5354953Z
2021-09-15T23:13:13.5355504Z git checkout -b <new-branch-name>
2021-09-15T23:13:13.5355822Z
2021-09-15T23:13:13.5356540Z HEAD is now at 198bff4 Merge 4cf98afaf4583a39b311d495e48edf66230ee132 into 54facc4d727812075b0c90e5506e521938953b25
2021-09-15T23:13:13.5364228Z ##[command]git submodule sync
2021-09-15T23:13:13.5585117Z ##[command]git -c http.https://github.com.extraheader="AUTHORIZATION: basic ***" submodule update --init --force --depth=5
2021-09-15T23:13:13.5779104Z Submodule 'include/boost/compute' (https://github.com/boostorg/compute) registered for path 'external_libs/compute'
2021-09-15T23:13:13.5780949Z Submodule 'eigen' (https://gitlab.com/libeigen/eigen.git) registered for path 'external_libs/eigen'
2021-09-15T23:13:13.5783676Z Submodule 'external_libs/fast_double_parser' (https://github.com/lemire/fast_double_parser.git) registered for path 'external_libs/fast_double_parser'
2021-09-15T23:13:13.5786185Z Submodule 'external_libs/fmt' (https://github.com/fmtlib/fmt.git) registered for path 'external_libs/fmt'
2021-09-15T23:13:13.5825779Z Cloning into '/home/guoke/actions-runner/_work/LightGBM/LightGBM/external_libs/compute'...
2021-09-15T23:13:14.3409109Z Cloning into '/home/guoke/actions-runner/_work/LightGBM/LightGBM/external_libs/eigen'...
2021-09-15T23:13:15.2444503Z Cloning into '/home/guoke/actions-runner/_work/LightGBM/LightGBM/external_libs/fast_double_parser'...
2021-09-15T23:13:15.8402851Z Cloning into '/home/guoke/actions-runner/_work/LightGBM/LightGBM/external_libs/fmt'...
2021-09-15T23:13:16.6976810Z Submodule path 'external_libs/compute': checked out '36c89134d4013b2e5e45bc55656a18bd6141995a'
2021-09-15T23:13:18.9323150Z From https://gitlab.com/libeigen/eigen
2021-09-15T23:13:18.9324592Z * branch 8ba1b0f41a7950dc3e1d4ed75859e36c73311235 -> FETCH_HEAD
2021-09-15T23:13:19.0821158Z Submodule path 'external_libs/eigen': checked out '8ba1b0f41a7950dc3e1d4ed75859e36c73311235'
2021-09-15T23:13:20.1265187Z From https://github.com/lemire/fast_double_parser
2021-09-15T23:13:20.1267120Z * branch ace60646c02dc54c57f19d644e49a61e7e7758ec -> FETCH_HEAD
2021-09-15T23:13:20.1596501Z Submodule path 'external_libs/fast_double_parser': checked out 'ace60646c02dc54c57f19d644e49a61e7e7758ec'
2021-09-15T23:13:21.4918292Z From https://github.com/fmtlib/fmt
2021-09-15T23:13:21.4919724Z * branch cc09f1a6798c085c325569ef466bcdcffdc266d4 -> FETCH_HEAD
2021-09-15T23:13:21.5610942Z Submodule path 'external_libs/fmt': checked out 'cc09f1a6798c085c325569ef466bcdcffdc266d4'
2021-09-15T23:13:21.5800916Z ##[group]Run export ROOT_DOCKER_FOLDER=/LightGBM
2021-09-15T23:13:21.5801643Z export ROOT_DOCKER_FOLDER=/LightGBM
2021-09-15T23:13:21.5802099Z cat > docker.env <<EOF
2021-09-15T23:13:21.5802513Z GITHUB_ACTIONS=true
2021-09-15T23:13:21.5802881Z OS_NAME=linux
2021-09-15T23:13:21.5803251Z COMPILER=gcc
2021-09-15T23:13:21.5803602Z TASK=cuda
2021-09-15T23:13:21.5803938Z METHOD=source
2021-09-15T23:13:21.5804493Z CONDA_ENV=test-env
2021-09-15T23:13:21.5804877Z PYTHON_VERSION=3.7
2021-09-15T23:13:21.5805329Z BUILD_DIRECTORY=$ROOT_DOCKER_FOLDER
2021-09-15T23:13:21.5805798Z LGB_VER=$(head -n 1 VERSION.txt)
2021-09-15T23:13:21.5806185Z EOF
2021-09-15T23:13:21.5806587Z cat > docker-script.sh <<EOF
2021-09-15T23:13:21.5807057Z export CONDA=\$HOME/miniconda
2021-09-15T23:13:21.5807526Z export PATH=\$CONDA/bin:\$PATH
2021-09-15T23:13:21.5807984Z nvidia-smi
2021-09-15T23:13:21.5808429Z $ROOT_DOCKER_FOLDER/.ci/setup.sh || exit -1
2021-09-15T23:13:21.5808944Z $ROOT_DOCKER_FOLDER/.ci/test.sh || exit -1
2021-09-15T23:13:21.5809347Z EOF
2021-09-15T23:13:21.5810165Z docker run --env-file docker.env -v "$GITHUB_WORKSPACE":"$ROOT_DOCKER_FOLDER" --rm --gpus all "nvcr.io/nvidia/cuda:11.4.0-devel" /bin/bash $ROOT_DOCKER_FOLDER/docker-script.sh
2021-09-15T23:13:21.5829304Z shell: /bin/bash -e {0}
2021-09-15T23:13:21.5829653Z env:
2021-09-15T23:13:21.5829977Z github_actions: true
2021-09-15T23:13:21.5830337Z os_name: linux
2021-09-15T23:13:21.5830642Z task: cuda
2021-09-15T23:13:21.5830991Z conda_env: test-env
2021-09-15T23:13:21.5831340Z ##[endgroup]
2021-09-15T23:13:26.0322704Z docker: Error response from daemon: OCI runtime create failed: container_linux.go:380: starting container process caused: process_linux.go:545: container init caused: Running hook #0:: error running hook: exit status 1, stdout: , stderr: nvidia-container-cli: initialization error: nvml error: driver not loaded: unknown.
2021-09-15T23:13:26.0359071Z ##[error]Process completed with exit code 125.
2021-09-15T23:13:26.0388144Z Cleaning up orphan processes
Neither reinstallation (sudo apt-get install --reinstall --no-install-recommends -y) of all docker-related packages, nor pinning the versions of libnvidia-container1 and nvidia-container-toolkit as suggested on the internet, helped.
https://github.com/microsoft/LightGBM/blob/54facc4d727812075b0c90e5506e521938953b25/.github/workflows/cuda.yml#L54-L58
I guess the first thing to try is to update the NVIDIA driver and reboot the machine.
Fixed via #4621.
| gharchive/issue | 2021-09-17T15:23:22 | 2025-04-01T06:44:57.389872 | {
"authors": [
"StrikerRUS"
],
"repo": "microsoft/LightGBM",
"url": "https://github.com/microsoft/LightGBM/issues/4610",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
1788698893 | Providing a minimal runnable C++ example code
Summary
Right now, there's ZERO guidance on how to use the C/C++ API; not even one runnable example can be found online.
There's 1 code example (shown below), but it's totally outdated and cannot compile.
Motivation
It's useful because some people might need to integrate LightGBM directly into their C++ code.
The example code should show how to LoadData, set GBM config, train, predict an input, and free memory. It can either use the C++ code directly, or use the exposed C API shown here: https://lightgbm.readthedocs.io/en/latest/C-API.html
Description
Below is an example program that is now outdated and can't compile. I'd like to fix that.
References
#include <LightGBM/config.h>
#include <LightGBM/dataset_loader.h>
#include <LightGBM/boosting.h>
#include <LightGBM/objective_function.h>
#include <LightGBM/metric.h>
#include <LightGBM/utils/common.h>
#include <iostream>
#include <random>
#include <algorithm>
int main()
{
    /* create example dataset */
    std::random_device rd;
    std::mt19937 gen(rd());

    // one random generator for every class
    std::vector<std::normal_distribution<>> dists = {
        std::normal_distribution<>(0, 1),
        std::normal_distribution<>(10, 1)};

    /* create raw data */
    const int numSamples = 5000;
    const int numFeats = 2;
    const int numClasses = static_cast<int>(dists.size());
    std::cout << "Num classes: " << numClasses << std::endl;

    // labels
    std::vector<float> labels(numSamples);
    for (int i = 0; i < numSamples; i++)
        labels[i] = i % numClasses;

    std::vector< std::vector<double> > features(numSamples);
    for (int i = 0; i < numSamples; i++)
    {
        features[i].resize(numFeats);
        for (int j = 0; j < numFeats; j++)
        {
            const auto lbl = static_cast<int>(labels[i]);
            features[i][j] = dists[lbl](gen);
        }
    }

    // prepare sample data
    std::vector< std::vector<double> > sampleData(numFeats);
    for (int i = 0; i < numSamples; i++)
    {
        for (int j = 0; j < numFeats; j++)
            sampleData[j].push_back(features[i][j]);
    }

    /** Load dataset **/
    LightGBM::Config config;
    config.num_class = numClasses;
    config.max_bin = 255;
    config.verbosity = 10;
    std::unique_ptr<LightGBM::Dataset> dset;
    LightGBM::DatasetLoader loader(config, nullptr, numClasses, nullptr);
    dset.reset( loader.ConstructFromSampleData(sampleData, numSamples, numSamples) );
    for (int i = 0; i < numSamples; ++i)
    {
        const int thread_id = 0;
        dset->PushOneRow(thread_id, i, features[i]);
    }
    dset->FinishLoad();

    // check bins
    for (int j = 0; j < numFeats; j++)
    {
        const auto nbins = dset->FeatureAt(j)->bin_mapper()->num_bin();
        std::cout << "Feat " << numFeats << std::endl;
        std::cout << " " << dset->FeatureAt(j)->bin_mapper()->BinToValue(0) << " ";
        std::cout << " " << dset->FeatureAt(j)->bin_mapper()->BinToValue(nbins-2) << " ";
        std::cout << std::endl;
    }

    if (!dset->SetFloatField("label", labels.data(), numSamples)) {
        std::cout << "Error setting label" << std::endl;
        return -1;
    }

    /** Prepare boosting **/
    LightGBM::BoostingConfig boostConfig;
    boostConfig.num_iterations = 100;
    boostConfig.bagging_freq = 1;
    boostConfig.bagging_fraction = 0.5;
    boostConfig.num_class = numClasses;

    // tree params
    boostConfig.tree_config.min_data_in_leaf = 10;
    boostConfig.tree_config.num_leaves = 16;
    //boostConfig.tree_config.min_sum_hessian_in_leaf = 0;

    LightGBM::ObjectiveConfig objConfig;
    objConfig.num_class = numClasses;
    // objConfig.label_gain.clear();
    // objConfig.label_gain.resize(numClasses, 1.0);
    auto *objFunc = LightGBM::ObjectiveFunction::CreateObjectiveFunction("multiclass", objConfig);
    objFunc->Init(dset->metadata(), dset->num_data());

    LightGBM::MetricConfig metricConfig;
    metricConfig.num_class = numClasses;
    std::vector< std::unique_ptr<LightGBM::Metric> > trainMetrics;
    auto metric = std::unique_ptr<LightGBM::Metric>(
        LightGBM::Metric::CreateMetric("multi_logloss", metricConfig));
    metric->Init(dset->metadata(), dset->num_data());
    trainMetrics.push_back(std::move(metric));

    auto *booster = LightGBM::Boosting::CreateBoosting(LightGBM::BoostingType::kGBDT, nullptr);
    booster->Init(&boostConfig, nullptr, objFunc,
                  LightGBM::Common::ConstPtrInVectorWrapper<LightGBM::Metric>(trainMetrics));
    booster->ResetTrainingData(&boostConfig, dset.get(), objFunc,
                               LightGBM::Common::ConstPtrInVectorWrapper<LightGBM::Metric>(trainMetrics));
    // booster->AddValidDataset(dset.get(), LightGBM::Common::ConstPtrInVectorWrapper<LightGBM::Metric>(trainMetrics));

    for (int i = 0; i < boostConfig.num_iterations; i++)
    {
        std::cout << "Iteration " << (i+1) << std::endl;
        auto scores = booster->GetEvalAt(0);
        for (auto &v : scores)
            std::cout << "Score: " << v << std::endl;
        if (booster->TrainOneIter(nullptr, nullptr, false))
        {
            std::cout << "Breaking.." << std::endl;
            break;
        }
    }
    booster->SetNumIterationForPred(0); // predict with all trees

    /** Predict training data **/
    std::vector<int> predictedClass(numSamples);
    for (int i = 0; i < numSamples; i++)
    {
        auto predVec = booster->PredictRaw(features[i].data());
        const auto predMax = std::max_element(predVec.begin(), predVec.end());
        predictedClass[i] = std::distance(predVec.begin(), predMax);
    }

    // compute error
    double err = 0;
    for (int i = 0; i < numSamples; i++)
    {
        if (predictedClass[i] != labels[i])
        {
            err++;
        }
    }
    err /= labels.size();
    std::cout << "Training error: " << err << std::endl;
    return 0;
}
I would like to second this request for C++ example code. I'm currently writing a Go wrapper for lightgbm, which I hope to open source. I have several things working (loading an existing model from file, making predictions for a file or a single row, writing a model to file), but I cannot figure out how to progress beyond one iteration and export all the trees. I tried looking through the CLI code, but couldn't figure out what was going on.
| gharchive/issue | 2023-07-05T03:31:22 | 2025-04-01T06:44:57.399581 | {
"authors": [
"AlgoKris",
"lehuyduc"
],
"repo": "microsoft/LightGBM",
"url": "https://github.com/microsoft/LightGBM/issues/5957",
"license": "mit",
"license_type": "permissive",
"license_source": "bigquery"
} |
951000460 | Add R resources for lessons 05 and 06
Added .Rmd, .ipynb and encouRage.png for lesson 5
Added .Rmd, .ipynb, dplyr_wrangling.png and unruly_data.jpg for lesson 6
hi @revodavid would you review this lovely PR please?
| gharchive/pull-request | 2021-07-22T19:55:47 | 2025-04-01T06:44:57.401476 | {
"authors": [
"R-icntay",
"jlooper"
],
"repo": "microsoft/ML-For-Beginners",
"url": "https://github.com/microsoft/ML-For-Beginners/pull/230",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1439578689 | Can't get Tracking State [UE5]
Hello,
as you can see from this issue I have made the switch to OpenXR. After some testing around, I notice that the tracking state of my QR-Code is allways set to "Tracking" even if it's not visible.
Unreal Engine 5.0.3
Visual Studio 2022, 17.3.1
Hololens 2, 20348.1522
OpenXR 1.1.14
This seems to be by design: the QRCode is tracked from a node in the spatial graph. When a QRCode is identified, a node is created and located with xrLocateSpace. Even if the QRCode is taken off the wall or not in view, xrLocateSpace can still find the location of a node in the graph provided the tracking system has enough data.
However, the UpdateTrackedGeometry event will not fire unless the QR code is currently being seen by the HoloLens. So if you look away or remove the QR code from the wall, the tracking system won't try to locate the QRCode and it will persist at its last-known location.
The recommendation from the team that wrote the QR package is to ignore QRCodes that have not updated for some time:
https://learn.microsoft.com/en-us/windows/mixed-reality/develop/advanced-concepts/qr-code-tracking-overview#managing-qr-code-data
You can do this by checking GetLastUpdateTimestamp on a cached TrackedGeometry, or if you're doing this in blueprint, it may be easier to cache your own GetGameTimeInSeconds on each update, since the tracked geometry is using FPlatformTime::Seconds() which isn't exposed to BP.
Okay got it, thanks a lot!
I solved this issue by simply ataching a retriggerable Delay, with Set UI Visibilty to Update Tracked Geometry:
I would suggest removing the function "Get Tracking State" or add a small note telling that it isn't used anymore, to prevent future issues.
| gharchive/issue | 2022-11-08T06:31:40 | 2025-04-01T06:44:57.408116 | {
"authors": [
"Tim-Potratz",
"fieldsJacksonG"
],
"repo": "microsoft/Microsoft-OpenXR-Unreal",
"url": "https://github.com/microsoft/Microsoft-OpenXR-Unreal/issues/88",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1975392396 | Laconfig.json file is missing in the latest MPARR Collector.zip
The laconfig.json file is missing in the latest collector zip folder. I am also not able to find that file anywhere else within the repository. Is that file needed to run this, or has it been replaced in the latest scripts?
The laconfig.json file is created by the MPARR_Setup.ps1 script; if you run it correctly, the file is created.
Thanks Sebastian.
I got that through.
Best regards
| gharchive/issue | 2023-11-03T03:15:27 | 2025-04-01T06:44:57.414016 | {
"authors": [
"ProfKaz",
"adnanm365"
],
"repo": "microsoft/Microsoft-Purview-Advanced-Rich-Reports-MPARR-Collector",
"url": "https://github.com/microsoft/Microsoft-Purview-Advanced-Rich-Reports-MPARR-Collector/issues/16",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
610502960 | No offer/answer messages are created
Describe the bug
This is in relation to : https://github.com/microsoft/MixedReality-WebRTC/issues/303#issuecomment-621698775
I've created two scenarios, one using the basic node-dss (from a fresh build and project import) and the second from my custom signaling service, which was working before I pulled yesterday.
Currently any call to StartConnection(), CreateOffer() or CreateAnswer() doesn't generate any messages, and the OnSDPOfferReadyToSend/OnSdpAnswerReadyToSend aren't being called.
To Reproduce
Steps to reproduce the behavior:
Pull & build latest Mr-Webrtc master
Open the generated Unity project, set up the video conferencing scene to use the node-dss server that's running (I added a few debug logs)
Hit play
No message is generated
Expected behavior
Previously on step 4 with the basic node-dss example, a message would be generated.
Using a build from the release branch works well with both the node-dss signaling, and my custom websocket signaling.
Environment
Platform: Unity Editor
Unity version: 2018.4
I have not tried, but if I follow exactly your steps then I do not expect anything to be generated.
Messages are only generated when a session starts to be negotiated with StartConnection(). This doesn't happen automatically at Unity startup unless you are in the very specific (and implementation-defined) case where: your PeerConnection component is initialized first by Unity, AND the Signaler is initialized next, AND the peer connection has finished connecting, AND the various tracks are initialized after that and raise a RenegotiationNeeded event, AND you have AutoCreateOfferOnRenegotiationNeeded = true. Any other deviation will fail to automatically generate an offer. And since the order in which Unity starts components is implementation-dependent, there is no expectation that this works. You can force the order of the scripts, but I wouldn't rely on that if I were you; this is all too brittle.
Now assuming that you are indeed calling StartConnection(), for example via the "CreateOffer" button in the Unity scene, then I followed those steps:
Start the node-dss server after using set DEBUG=dss* to see requests
git clone --recursive https://github.com/microsoft/MixedReality-WebRTC/
Open the Visual Studio solution, build Debug x64
Open Unity, load the VideoChatDemo scene
Configure the NodeDssSignaler with the correct local/remote peer IDs
Press Play
Start TestAppUWP
Configure the signaler with the correct local/remote peer IDs
Press the "CreateOffer" button in the VideoChatDemo scene
As expected, I do get some offer and answer, and the connection establishes. I can see the 2 video tracks in TestAppUWP.
After investigating further I found that my signaler's own Update() method was hiding the inherited one; perhaps the base Update() method could be made virtual?
Either way, the hidden method was the one handling the task dequeues. Closing the issue.
@HyperLethalVector I am not sure what base class you are talking about because Signaler already has a virtual Update() method.
Strange, it wasn't on my end but after a revert it was. I'll blame me for it (don't code at 1am folks!). Either way I was hiding the original method, which was causing the issue I posted.
Thanks anyway for entertaining my silly mistake.
| gharchive/issue | 2020-05-01T01:41:14 | 2025-04-01T06:44:57.447460 | {
"authors": [
"HyperLethalVector",
"djee-ms"
],
"repo": "microsoft/MixedReality-WebRTC",
"url": "https://github.com/microsoft/MixedReality-WebRTC/issues/314",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
690136974 | Camera constraints on HoloLens 1 not working
Describe the bug
When setting video capture constraints in WebcamSource manually in a HoloLens 1 build, no video is shown at the remote peer and the application crashes when activating Mixed Reality Capture. When capture mode is set to Automatic, everything works fine on HoloLens 1. On HoloLens 2 there is no problem with setting video capture constraints manually, video is always showing up on the remote peer.
All constraints we are setting are based on this Microsoft documentation: https://docs.microsoft.com/en-us/windows/mixed-reality/locatable-camera
I did then take a look into WebcamSource script and manually set the constraints according to automatic mode in WebcamSource script, but the video still didn't show up. I have also tried to only set one of the values, like width = 1280 and set the other values to zero, because the comment in WebcamSource script says "Avoid constraining the framerate".
I was unsure which video profile I should set for HoloLens 1, because as far as I understand, there are no camera profiles for HoloLens 1. I set it to VideoProfileKind.VideoRecording like in WebcamSource script automatic mode, but also tried to leave it blank.
I tried different combinations of constraints, with and without setting VideoProfileKind, but only got the video showing up at remote peer with WebcamSource capture format set to Automatic.
To Reproduce
Steps to reproduce the behavior:
Set WebcamSource capture format to Manual and set constraints to one of the following combinations:
width = 1280, height = 720, framerate = 30, video profile kind = Video Conferencing
width = 1280, height = any, framerate = any, video profile kind = any
Deploy on HoloLens 1 and establish a call
See that there is no video shown at remote peer
Enable Mixed Reality Capture and see app is crashing
Expected behavior
HoloLens 1 video should show up at remote peer when call is established and WebcamSource mode is set to Manual with correct constraints based on this documentation: https://docs.microsoft.com/en-us/windows/mixed-reality/locatable-camera.
Environment
MR-WebRTC v2.0.0 (via Unity Package Manager)
Unity 2018.4.26f1
UWP
x86
HoloLens 1
VideoConferencing is empty on HL1, 1280x720@30 is in VideoRecording instead. Can you check if (1280, 720, 30, VideoRecording) and (1280, any, any, VideoRecording) work?
Passing an unspecified video profile should work, this looks like a bug.
Sorry for the late response.
(1280, 720, 30, VideoRecording) and (1280, any, any, VideoRecording) both don't work, no video is shown at remote peer and the app crashes when enabling MRC.
| gharchive/issue | 2020-09-01T13:19:45 | 2025-04-01T06:44:57.455646 | {
"authors": [
"dl4mmers",
"fibann"
],
"repo": "microsoft/MixedReality-WebRTC",
"url": "https://github.com/microsoft/MixedReality-WebRTC/issues/563",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
479718594 | Nullptr in Overlap.SolverUpdate after Upgrade from RC2 to G1
Overview
After I updated my personal project from RC2 to GA, I am getting NullPtr every frame when I hit "Play" in my sample scene
Repro steps
Unzip the project Assets folder located here, move it to a new test project
Open the project, then open the BouncyBall/SampleScene.unity scene, and then hit play
You will get a large number of errors like:
NullReferenceException: Object reference not set to an instance of an object
Microsoft.MixedReality.Toolkit.Utilities.Solvers.Overlap.SolverUpdate () (at Assets/MixedRealityToolkit.SDK/Features/Utilities/Solvers/Overlap.cs:13)
Microsoft.MixedReality.Toolkit.Utilities.Solvers.Solver.SolverUpdateEntry () (at Assets/MixedRealityToolkit.SDK/Features/Utilities/Solvers/Solver.cs:245)
Microsoft.MixedReality.Toolkit.Utilities.Solvers.SolverHandler.LateUpdate () (at Assets/MixedRealityToolkit.SDK/Features/Utilities/Solvers/SolverHandler.cs:326)
This project contains my old project (all my files are in the BouncyBall folder), plus the new MRTK GA release (in the other MRTK folders). All MRTK files have been updated to their newer version, except PinchSlider.cs
Expected Behavior
The expectation is to not get any errors.
Unity Editor Version
2018.4.4f1
Mixed Reality Toolkit Release Version
GA
Assigning to Troy since he has been modifying solvers recently.
I'll pitch in and investigate while @Troy-Ferrell works on #5572
| gharchive/issue | 2019-08-12T15:45:05 | 2025-04-01T06:44:57.459422 | {
"authors": [
"davidkline-ms",
"julenka"
],
"repo": "microsoft/MixedRealityToolkit-Unity",
"url": "https://github.com/microsoft/MixedRealityToolkit-Unity/issues/5604",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
701462896 | Device Manager prefabs not initialized for Quest
Describe the bug
When using the foundations package, a pair of prefabs are listed as missing due to import errors, resulting in an improper configuration for Quest. Need to make sure these prefabs are linked correctly when using the "Integrate Oculus Integration for Unity Modules" dropdown option.
Root-causing the issue: the prefabs' identifiers get screwed up when the OculusIntegration package is imported; need to mitigate errors that result from this.
| gharchive/issue | 2020-09-14T22:02:57 | 2025-04-01T06:44:57.460956 | {
"authors": [
"RogPodge"
],
"repo": "microsoft/MixedRealityToolkit-Unity",
"url": "https://github.com/microsoft/MixedRealityToolkit-Unity/issues/8548",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1056591891 | Can't find python environment
This issue has been moved from a ticket on Developer Community.
I installed python 3 64-bit (3.9.5) in the installer.
However, VS can't find the environment.
And the virtual environment tab says I don't have python installed.
I tried to uninstall and reinstall. Nothing different.
Original Comments
Feedback Bot on 11/8/2021, 01:12 AM:
We have directed your feedback to the appropriate engineering team for further evaluation. The team will review the feedback and notify you about the next steps.
Original Solutions
(no solutions)
I'm not able to repro this through the environments window. Can you please provide repro steps?
| gharchive/issue | 2021-11-17T20:15:01 | 2025-04-01T06:44:57.471180 | {
"authors": [
"AdamYoblick",
"vsfeedback"
],
"repo": "microsoft/PTVS",
"url": "https://github.com/microsoft/PTVS/issues/6791",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
1067652970 | pykinect support?
I know this is ancient, but pretty much all links for pykinect are dead (docs/code repo/etc)
Anything listed here is still available somewhere?
Any pointers would be appreciated.
This is really old (2015!). I believe this was supported in an older branch. Please see PTVS/Python/Product/PyKinect at release/15.9 · microsoft/PTVS (github.com)
Yeah, it's been gone for a long time now. We should probably remove it from the wiki tho.
| gharchive/issue | 2021-11-30T20:36:52 | 2025-04-01T06:44:57.473424 | {
"authors": [
"AdamYoblick",
"int19h",
"virgilm"
],
"repo": "microsoft/PTVS",
"url": "https://github.com/microsoft/PTVS/issues/6803",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
1613762359 | Fix Text function handling of GUID values
GUID-to-text conversion shouldn't be done based on .NET rules. This change updates the Text implementation to ignore the second argument for GUID values.
✅ No public API change.
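For illustration, the intended rule is that a GUID value always renders in its canonical lowercase hyphenated form, and any format-string argument is ignored. A minimal Python stand-in (the function name and signature are illustrative, not the actual Power Fx implementation):

```python
import uuid

def text_of_guid(value, format_spec=None):
    # Sketch of the rule: for GUID values the second (format) argument
    # to Text() is ignored; the canonical lowercase form is returned.
    return str(value)  # uuid.UUID stringifies as lowercase, hyphenated

g = uuid.UUID("BCC13D15-9720-4CC4-8371-EA74A274741E")
# The format string has no effect on GUID values:
assert text_of_guid(g, "#,##0.00") == text_of_guid(g)
```

This mirrors the change described above: conversion no longer follows .NET format-specifier rules for GUIDs.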
| gharchive/pull-request | 2023-03-07T16:09:26 | 2025-04-01T06:44:57.474581 | {
"authors": [
"CarlosFigueiraMSFT",
"LucGenetier"
],
"repo": "microsoft/Power-Fx",
"url": "https://github.com/microsoft/Power-Fx/pull/1150",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1587033724 | Grid Customizer not working
I have copied the exact sample for the Power Apps Grid custom control and it executes the custom code but no visible UI changes occur on text or currency. Please advise next steps?
@WhiteyC
Can you clarify which sample you are referring to?
Is it the one mentioned here Customized editable grid?
With the sample code from here: https://github.com/microsoft/PowerApps-Samples/tree/master/component-framework/PowerAppsGridCustomizerControl
I think next steps are to use the steps here to debug and understand the issue better Debug code components
If you identify a specific issue, please let us know.
Yes, I am using the Grid Customizer Control. I have added a `debugger;` statement in the `init` method and that triggers fine. However, the cell renderer does nothing.
export const cellRendererOverrides: CellRendererOverrides = {
    ["Text"]: (props, col) => {
        // Render all text cells in green font
        return <Label style={{ color: 'green' }}>{props.formattedValue}</Label>;
    }
};
public init(
context: ComponentFramework.Context<IInputs>,
notifyOutputChanged: () => void,
state: ComponentFramework.Dictionary
): void {
debugger;
const eventName = context.parameters.EventName.raw;
if (eventName) {
const paOneGridCustomizer: PAOneGridCustomizer = { cellRendererOverrides, cellEditorOverrides };
(context as any).factory.fireEvent(eventName, paOneGridCustomizer);
}
}
The control builds fine and deploys fine. Added to Power apps grid control as a customizer.
I've seen the same. It started suddenly, probably because of a Platform update last Sunday night.
It happended that the control I was developing worked, and a few minutes later, without any change, it stopped working.
I have also older controls which worked before: all of them stopped working since then.
Good to see I'm not the only one going through this. It's the latest update on the Power Apps Grid control.
@WhiteyC & @brasov2de
Adding @HemantGaur
It sounds like you are describing an issue that is larger than this specific sample. This sounds like a platform level issue rather than something we can address by changing this sample code. Am I understanding you correctly?
In this case, it is important that you contact technical support and report the issue. We don't have the resources to troubleshoot this with you.
If there is an issue with a specific sample, then we do want to get that sample fixed. Otherwise, this isn't the appropriate place to look for technical support.
I notified @HemantGaur and there seems to be a corresponding known issue in production.
Not yet sure if the fix is deployed to all regions at this point.
Since there is nothing actionable related to this sample code, I'm going to close this issue.
@JimDaly is there any info about when fix will be released?
@morgutrin / @WhiteyC / @brasov2de This fix will be available by late March for most geos.
Can I ask where you found this information in regards to a fix?
@WhiteyC
@jasongre is the program manager for this feature.
Thanks Jim. That's not giving me any confidence in MS support as I have a case raised on this and they have not given me any information of this kind. Not helpful when you are trying to earn a living with this product and have deadlines.
Hello everyone, I am having a problem with the control customizer and the column filters. Let me explain: when I filter some of the columns, very often the funnel icon and the clear-filters option disappear from the column header. And even when they don't disappear, clearing the filters does not bring all the results back. In other words, it is as if the filter only applies the first time and the grid keeps that filter until I refresh the page.
Any help on this would be very welcome. This problem is driving me crazy and is going to force me to remove a control that has taken me many hours of development.
Thanks.
| gharchive/issue | 2023-02-16T05:16:17 | 2025-04-01T06:44:57.486958 | {
"authors": [
"JimDaly",
"WhiteyC",
"brasov2de",
"jasongre",
"mianfriH",
"morgutrin"
],
"repo": "microsoft/PowerApps-Samples",
"url": "https://github.com/microsoft/PowerApps-Samples/issues/381",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2341223818 | [Feature]: Support Model Driven Application
Is your feature request related to a problem? Please describe.
The preview version of Power Apps Test Engine only supports canvas applications. Extend the Test Engine support to also include the ability to write tests for Model Driven applications so that they can be included in testing of Power Platform solutions
Describe the solution you'd like
Provide the ability to:
Navigate between pages
Navigate between view and detail pages
Execute actions like Save and Save & Close
Query the JavaScript Object Model of the page and create Power Fx variables that represent the current state
Allow updates to the Power Fx variables and update the JavaScript object model
Describe alternatives you've considered
No response
Additional context?
No response
Yes! This is a game changer.
Thanks @filcole for the feedback. This is something we are actively looking as it is a natural evolution of Introduce the Extensibility Framework
The plan is to make use of XRM SDK and wrap existing JavaScript Object Model to a provide Low Code reprensentation of grids and forms that can be interacted with via Test Engine:
GridEntity - To get grid view for current grid
GridRow - To use code like gridContext.getGrid().getRows()[0].data.entity
For the kits that the Power CAT tools team creates and maintains we have many Model Driven Applications that we need to also test. The proposed extensions to support MDA address a key need for our test automation evolution.
@Grant-Archibald-MS we were preparing to spend a rather large effort to use the older EasyRepro approach with our MDA solutions; however, I see a lot of commits around the MDA functionality. Is it close enough to warrant pausing for some weeks? I know you cannot give specific dates, but can you give a general scope: before the end of the month? The year?
@Grant-Archibald-MS
Hi there. We are also interested in an update regarding Test Engine for model-driven apps. We were using EasyRepro too before, in combination with the SpecFlow library implemented by Capgemini (which is heavily based upon EasyRepro). But both libraries have not been updated for more than 2 years. And with all the changes that have been made to the Power Platform GUI design, and also the fact that EasyRepro is still based on Selenium 2, we decided to stop putting any effort into UI tests based upon the SpecFlow library and EasyRepro.
@sdhsynsci Model Driven Application support is progressing and if you wanted early look at what is working we have integration branch, However this is using a build from source strategy using the MIT licence not included as new features of pac test run.
@pvillads as owner of the Test Engine and the release schedule for updates to the pac test run command.
@JoehannusApg thanks for the interest in automated testing of Model Driven Applications using Test Engine and it is an active area of interest for the team to work towards covering a range of MDA scenarios across entitylist, entityrecord and custom pages.
Looking at the architecture, it is based on Playwright and takes advantage of Power Fx as the test authoring method. It also has a number of features around authentication, handling cases like MFA and execution in the context of a CI/CD process.
As an early adopter of the Test Engine, we are using its custom page support to author our tests for the CoE Kit. We covered some of this during our recent community office hours https://www.linkedin.com/feed/update/urn:li:activity:7262550764605104138/ (slides 6 - 9), which discuss our ALM process and how we are using Test Engine for our testing work
@Grant-Archibald-MS @pvillads
Can you give an estimate on a possible Public Preview date? And perhaps even what you are aiming at for a possible General Availability Date?
| gharchive/issue | 2024-06-07T22:02:54 | 2025-04-01T06:44:57.496776 | {
"authors": [
"Grant-Archibald-MS",
"JoehannusApg",
"filcole",
"sdhsynsci"
],
"repo": "microsoft/PowerApps-TestEngine",
"url": "https://github.com/microsoft/PowerApps-TestEngine/issues/342",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2090670886 | Square Business (Independent Publisher) issue - Can't sign in
Api Name - shared_squarebusinessip
Bug description -
I created a new application as suggested and defined the OAuth redirect URL to https://global.consent.azure-apim.net/redirect
However, when I try to log in from Power Automate to use the connector I'm getting: Unable to find client by that client_id
Is this a security bug? (Y/N) Y
What is the severity of this bug? High
@alvarohv did you ever figure this out? Having the same issue.
| gharchive/issue | 2024-01-19T14:21:40 | 2025-04-01T06:44:57.499213 | {
"authors": [
"alvarohv",
"jayseet"
],
"repo": "microsoft/PowerPlatformConnectors",
"url": "https://github.com/microsoft/PowerPlatformConnectors/issues/3203",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
542882132 | Sideloading UWP project from VS
Summary of the new feature/enhancement
If I'm adding a new "C++/WinRT Core App" project, I can build and deploy it from Visual Studio using the standard workflow, making it trivial to debug and test UWP-only APIs. We want similar functionality for our runner as well.
Sorry, what exactly is not working correctly now?
@bzoz I'm working on #1020 atm and this method throws https://github.com/microsoft/PowerToys/commit/dfb3a999a8bb95266c2896aad25a19bfe5ff4142#diff-589e5a77526dd919d6c88fed0c9ff4daR151 from the "desktop" context w/o deployment.
how is this getting implemented? Why is this a UWP project?
https://docs.microsoft.com/en-us/windows/uwp/design/shell/tiles-and-notifications/send-local-toast-desktop
https://docs.microsoft.com/en-us/windows/uwp/design/shell/tiles-and-notifications/send-local-toast-desktop-cpp-wrl
xref #696
@crutkas
Yes, sending a simple toast notification could be done from a desktop context, though I want to be able to test scenarios where we're getting activated from a notification's "Update" button press in a UWP context, assuming we'll have such functionality in the first place.
I don't have a clear picture of the whole updating process yet, so maybe we should discuss what we want in #696!
Suspending this work for now.
we'll reopen once MSIX is back up as a tracking item
| gharchive/issue | 2019-12-27T13:51:15 | 2025-04-01T06:44:57.503712 | {
"authors": [
"bzoz",
"crutkas",
"enricogior",
"yuyoyuppe"
],
"repo": "microsoft/PowerToys",
"url": "https://github.com/microsoft/PowerToys/issues/1021",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
860723894 | [Usable][FancyZones Editor - Edit layout]: Text 'Distance to highlight adjacent zones' gets overlapped with the increment and decrement arrow keys of edit field when focus navigates to it.
Microsoft PowerToys version
"0.35.0"
Running as admin
[ ] Yes
Area(s) with issue?
FancyZones Editor
Steps to reproduce
Test Environment:
OS: Windows 10 Build 21354.1
App version: 0.35.0
App name: PowerToys
Screen Reader: Narrator
Tool: Accessibility Insight for Windows
Repro Steps:
Launch 'PowerToys' application.
Application will get open with default list item 'General' selected.
Navigate to 'FancyZones' at the left side of the pane and activate it.
Now navigate to 'Launch layout editor button and activate it.
FancyZones editor window will get open.
Navigate to 'Edit layout' button of any layout (say Grid) under Custom and activate it.
Edit layout dialog gets open.
Press 'Tab' key to navigate to the 'Distance to highlight adjacent zones' edit field.
Observe the issue.
User Impact: Priority-3
Keyboard users of the application will be impacted, as the text 'Distance to highlight adjacent zones' gets hidden behind the increment and decrement arrow keys.
✔️ Expected Behavior
Text 'Distance to highlight adjacent zones' should not overlap with the increment and decrement arrow keys of the edit field when focus navigates to it.
❌ Actual Behavior
Text 'Distance to highlight adjacent zones' gets overlapped with the increment and decrement arrow keys of edit field when focus navigates to it.
Other Software
No response
@Priyanka-Chauhan123 @crutkas this is default behavior of the NumberBox control as defined by the WinUI team
@niels9001 Yes, but for the above field i.e. 'show space around zones' it is not getting cropped. Refer below snippet for the same.
True, but that's because it's not using the Header property of the NumberBox but a separate CheckBox with increased spacing between the two.
So in short, the issue is that if a Header property is used on a NumberBox that the spinners are overlapping with the Header itself. We should flag this with the WinUI team/repo @crutkas ?
Outdated due to recent UX updates.
| gharchive/issue | 2021-04-18T17:57:12 | 2025-04-01T06:44:57.513184 | {
"authors": [
"Priyanka-Chauhan123",
"niels9001"
],
"repo": "microsoft/PowerToys",
"url": "https://github.com/microsoft/PowerToys/issues/10809",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
942830896 | Screen setup change should adapt Video Conference Mute toolbar position
Microsoft PowerToys version
0.36.0
Running as admin
[X] Yes
Area(s) with issue?
Video Conference Mute
Steps to reproduce
Configure screen in low resolution
Enable Video Conference Mute
Configure Toolbar in "Bottom Center"
Configure screen in high resolution
✔️ Expected Behavior
Toolbar appears "Bottom Center"
❌ Actual Behavior
Toolbar appears in the middle of the screen
Other Software
No response
Refs #10860
Oliver Kopp oliver.kopp@mbition.io, Mercedes-Benz AG on behalf of MBition GmbH.
Imprint
Thanks for the feedback but VCM is moving to maintenance only in the 0.67 timeframe. Our team will only directly address critical bugs, security and accessibility issues.
We'll accept community PRs for enhancements.
| gharchive/issue | 2021-07-13T06:28:46 | 2025-04-01T06:44:57.518218 | {
"authors": [
"crutkas",
"koppor"
],
"repo": "microsoft/PowerToys",
"url": "https://github.com/microsoft/PowerToys/issues/12346",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1033999683 | Power Rename
Description of the new feature / enhancement
Ability to be able to add a prefix or suffix i.e., prepend or append to a file name.
Scenario when this would be used?
There are many times when it is necessary to add a word, phrase, date, etc. to the beginning or at the end of a file name.
Supporting information
No response
Please see if this helps https://github.com/microsoft/PowerToys/issues/13865#issuecomment-944878846
| gharchive/issue | 2021-10-22T23:00:47 | 2025-04-01T06:44:57.520266 | {
"authors": [
"franky920920",
"kumar007git"
],
"repo": "microsoft/PowerToys",
"url": "https://github.com/microsoft/PowerToys/issues/13966",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1156953137 | Thumbnail preview failed
Microsoft PowerToys version
0.56.1
Running as admin
[X] Yes
Area(s) with issue?
PDF Thumbnail, SVG Thumbnail
Steps to reproduce
Thumbnail preview is available in previous versions, but not in recent versions.
PowerToysReport\File Explorer\Logs:
[2022-03-02 01:57:02.477240] [p-6900] [t-5124] [error] unApply of HKCU\Software\Classes\CLSID{BCC13D15-9720-4CC4-8371-EA74A274741E}\Implemented Categories{62C8FE65-4EBB-45E7-B440-6E39B2CDBF29}\Default:: RegOpenKeyExW failed: ᅬ솨쀄ᅭᄇᄏᄉ퓌ᄌᄊᄄ샤ᅫᅣᄐᄀᆪ
[2022-03-02 01:57:02.477278] [p-6900] [t-5124] [error] unApply of HKCU\Software\Classes\CLSID{BCC13D15-9720-4CC4-8371-EA74A274741E}\InprocServer32\Default:F:\Program Files\PowerToys\modules\FileExplorerPreview\PowerToys.PdfThumbnailProvider.comhost.dll: RegOpenKeyExW failed: ᅬ솨쀄ᅭᄇᄏᄉ퓌ᄌᄊᄄ샤ᅫᅣᄐᄀᆪ
[2022-03-02 01:57:02.477293] [p-6900] [t-5124] [error] unApply of HKCU\Software\Classes\CLSID{BCC13D15-9720-4CC4-8371-EA74A274741E}\InprocServer32\Assembly:PowerToys.PdfThumbnailProvider, Version=v0.56.1.0, Culture=neutral: RegOpenKeyExW failed: ᅬ솨쀄ᅭᄇᄏᄉ퓌ᄌᄊᄄ샤ᅫᅣᄐᄀᆪ
✔️ Expected Behavior
can preview pdf/svg Thumbnails.
❌ Actual Behavior
Previously previewable thumbnails work fine, but new files do not preview thumbnails.
Other Software
No response
We've got some errors there, interesting. Could you please send the full /bugreport ?
PowerToysReport_2022-03-02-21-44-50.zip
Same problem: SVG & PDF thumbnails in File Explorer doesn't work.
btw, I have seen https://github.com/microsoft/PowerToys/issues/20210 https://github.com/microsoft/PowerToys/issues/20031 https://github.com/microsoft/PowerToys/issues/19277 https://github.com/microsoft/PowerToys/issues/20547 , but has no answer.
Microsoft PowerToys version
0.62.1
Other Software
Windows10
Excuse me, is anyone there?🤔
I've given up on getting a solution to this issue.
/dup https://github.com/microsoft/PowerToys/issues/20695
| gharchive/issue | 2022-03-02T09:57:45 | 2025-04-01T06:44:57.529274 | {
"authors": [
"MYGXT",
"TaicEart",
"cinnamon-msft",
"jaimecbernardo"
],
"repo": "microsoft/PowerToys",
"url": "https://github.com/microsoft/PowerToys/issues/16706",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
594543508 | Fancy Zones: Screen Chooser Menu for Zone Editor
OS: Windows 10 Pro 64bit,
PowerToys: 0.16.0
The current method of choosing which screen to set up FancyZones for in a multi-screen layout is effectively hidden. There is no visible UI element (such as a tab chooser or pulldown menu) to even indicate that multiple screens are supported. The only hint is the option for "Show zones on all monitors while dragging a window".
Proposed technical implementation details
Maintain the current behaviour (the zone editor will target the monitor where the centre of the PowerToys window resides?), but have this set the DEFAULT monitor in the Zone editor when it opens.
Most desirable: Full multi-screen aware zone editor, with symmetric mirroring options for L/R and Up/Down screens etc.
Next best: In the zone editor-
A physical screen chooser (using, say- image picker, tabs, or pulldown menu)
An option to create a mirrored clone of the current zone layout, with either Left/Right or Top/Bottom mirroring. Pairs stay linked; editing one layout in the mirror automatically updates the other.
@incansvl Agreed. It is actually part of @crutkas draft spec for the FancyZones editor: https://github.com/microsoft/PowerToys/blob/dev/crutkas/FzEditor2Spec/doc/specs/FancyZoneEditorV2.md
Thanks. Lots of good stuff in the draft I can see. It doesn't include mirror pairs, but I think something similar has been raised by multiple people, so hopefully it will be considered for inclusion.
I do directly call out in. Is this what you mean by monitor pairs or is this something else?
4.1.1. FZ Editor Dialog window
Editor has selectable monitor layout that mimics Settings dialog interaction model and look/feel.
I'm going to close this out as tracked in #1032 unless you comment otherwise @incansvl
Sorry I don't understand this comment-
I do directly call out in.
Regarding "monitor pairs" I mean situations like this-
I have 3 monitors in a PLP setup. One monitor is rotated 90deg clockwise, and the other 90deg anticlockwise (to have the narrowest bezels next to the central screen).
If I set up a zone layout on, say, screen 2 (left) that is not symmetrical left-to-right, then to achieve the same layout on screen 3 I need the layout flipped (mirrored) left-to-right. This is much easier to manage if the two screens are edited as one mirrored pair, rather than having to hand-edit two layouts as mirrored copies of each other, which is basically impossible without pixel-accurate layout design, and even then is a pain to do twice.
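For reference, the left-to-right mirroring itself is simple arithmetic over the zone rectangles. A minimal Python sketch (the tuple layout is illustrative and not FancyZones' actual data model):

```python
def mirror_horizontal(zones, monitor_width):
    """Flip a list of zone rects left-to-right on a monitor.

    Each zone is (x, y, width, height); a mirrored zone keeps its size
    and vertical position, with x reflected about the monitor's width.
    """
    return [(monitor_width - x - w, y, w, h) for (x, y, w, h) in zones]
```

Mirroring twice returns the original layout, which is the property that keeps a linked pair consistent.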
Closing this since we are implementing multi-monitor support https://github.com/microsoft/PowerToys/pull/6562
| gharchive/issue | 2020-04-05T16:31:04 | 2025-04-01T06:44:57.536809 | {
"authors": [
"crutkas",
"enricogior",
"incansvl",
"niels9001"
],
"repo": "microsoft/PowerToys",
"url": "https://github.com/microsoft/PowerToys/issues/1945",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1387312342 | Window snap function
Description of the new feature / enhancement
Snapping function when moving and resizing windows
Scenario when this would be used?
When I want to use the desktop more efficiently, I need to place the windows so that they do not overlap each other.
Supporting information
No response
We already have a feature called FancyZones. How would this differ from FancyZones?
/needinfo
@gohikan, do you want windows that magnetize to edges or snap? If DisplayFusion does what you want, why not use that?
Snapping to zones, FancyZones with a grid layout would be the way to do it.
/needinfo
else is would be a duplicate of #8
| gharchive/issue | 2022-09-27T08:07:02 | 2025-04-01T06:44:57.540006 | {
"authors": [
"Aaron-Junker",
"Gohikan",
"crutkas"
],
"repo": "microsoft/PowerToys",
"url": "https://github.com/microsoft/PowerToys/issues/20878",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1570664362 | Locksmith
Description of the new feature / enhancement
Locksmith could be set to ALWAYS use Administrator privileges
Scenario when this would be used?
Always when showing what is using a file from the File Explorer sub-menu
Supporting information
I have yet to see that anything is using any file I have ever tried to use Locksmith with
/dup https://github.com/microsoft/PowerToys/issues/25154
| gharchive/issue | 2023-02-04T00:36:11 | 2025-04-01T06:44:57.541892 | {
"authors": [
"Webweweave",
"cinnamon-msft"
],
"repo": "microsoft/PowerToys",
"url": "https://github.com/microsoft/PowerToys/issues/23836",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1615952875 | Task Manager won't snap to FancyZones
Microsoft PowerToys version
0.66.0
Installation method
GitHub
Running as admin
No
Area(s) with issue?
FancyZones
Steps to reproduce
I created a custom FancyZones layout on my secondary monitor, intending to use it to quickly organize all of my PC statistics applications (Task Manager, GlassWire, etc.). Discord works just fine: when holding the left Shift key, dragging the window around reveals the FancyZones I set up. However, when performing the same action with Task Manager, they don't appear at all, and releasing the mouse where they would be also has no effect, so it's not just that the visual layer is invisible; the actual function is not being triggered for some reason.
✔️ Expected Behavior
I was expecting Task Manager to snap to the FancyZone I created.
❌ Actual Behavior
The window was dragged around completely normally, as if I had never installed FancyZones. No default features I had before FancyZones were actively hindered or broken by trying to do it, the result was simply that FancyZones did not trigger upon holding the designated modifier (Shift) while dragging the window.
Other Software
Microsoft Task Manager (Windows 10)
AltDrag (probably not relevant; I'm not using the feature of holding Alt, just dragging normally from the title bar, so it shouldn't be related; I just thought I should mention it on the off chance that it has some impact)
Discord (the example application where the feature actually does work as intended)
Task manager runs elevated. You need to run powertoys as admin to do this.
| gharchive/issue | 2023-03-08T21:16:13 | 2025-04-01T06:44:57.545632 | {
"authors": [
"benjaminrdeutsch",
"crutkas"
],
"repo": "microsoft/PowerToys",
"url": "https://github.com/microsoft/PowerToys/issues/24680",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
UPPER/lowercase flip of highlighted text
Description of the new feature / enhancement
I need a function that will flip the UPPER/lower case of highlighted text. Being able to call it from a right-click context menu would really help with my disability. Not a software person.
Scenario when this would be used?
My use of Windows taskbar's virtual keyboard is troublesome to say the least. It doesn't play well with others, especially Facebook. It very often misses my case-changing causing a lot of retyping where a case-flip function would correct things instantly.
Supporting information
No response
/dup #907
| gharchive/issue | 2023-03-29T07:00:30 | 2025-04-01T06:44:57.547650 | {
"authors": [
"beachgrrl",
"stefansjfw"
],
"repo": "microsoft/PowerToys",
"url": "https://github.com/microsoft/PowerToys/issues/25091",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1718144792 | Protractor Tool
Description of the new feature / enhancement
Add a protractor tool
Angle of the corner nearest the mouse cursor (corner determined by color differences), or;
Angle between mouse cursor and lines drawn
it would be nice to learn
I drew an example by hand lol:
(I wrote using translation, forgive any typos)
Thanks
Scenario when this would be used?
May be useful for some UI and UX designers
Supporting information
No response
Would love to see this implemented! I'm a desktop GUI programmer and deal with situations all the time which would benefit from a solution like this.
The way I think it could work great in terms of UX is to have it be a 3 point selection process, eg:
Define centre point with a click. This defines the "location" of the on-screen protractor, ie: its centre.
Select the first point to measure angle from. This point would be some radial point anywhere around the centre point defined in 1
Same as 2, but now we're measuring the angle (degree and rad output would be super nice) from point 2 to point 3, centred about the point that was defined in 1.
Some nice UX ideas while selecting the points would be to have a line being drawn from the radial centre (point 1) to where you are tagging the next point. Then when mousing over to select point 3, having a low-opacity arc being drawn between the two measurement points 2 and 3.
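The math behind the three-point flow above comes down to two atan2 calls. A sketch (TypeScript, with a hypothetical Point type; not actual PowerToys code):

```typescript
interface Point { x: number; y: number; }

// Angle at `center` between the rays center->a and center->b.
// Returns both radians and degrees, always as the smaller (<= 180 deg) angle.
function angleBetween(center: Point, a: Point, b: Point) {
  const t1 = Math.atan2(a.y - center.y, a.x - center.x);
  const t2 = Math.atan2(b.y - center.y, b.x - center.x);
  let rad = Math.abs(t2 - t1);
  if (rad > Math.PI) rad = 2 * Math.PI - rad; // wrap to the smaller angle
  return { rad, deg: (rad * 180) / Math.PI };
}
```

Screen coordinates have y growing downward, but that only mirrors the angle's direction, not its magnitude.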
If there's anything obvious I can do to help with this feature, let me know.
The amount of folded up sticky notes this feature would save me is insane
This would be great as an enhancement of the screen ruler function
I would also greatly appreciate a protractor, it would be very useful
| gharchive/issue | 2023-05-20T11:36:17 | 2025-04-01T06:44:57.553725 | {
"authors": [
"DHaak93",
"JulesBellamy",
"TArda34",
"avanbrenen",
"stacygaudreau"
],
"repo": "microsoft/PowerToys",
"url": "https://github.com/microsoft/PowerToys/issues/26084",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
1795433257 | Play a sound when something is copied into the clipboard
Description of the new feature / enhancement
Play a sound when something is copied into the clipboard.
Better if giving an option to customize the sound.
Scenario when this would be used?
By providing audio cues for the clipboard, you immediately gain the ability to confirm whether a copy succeeded. For example, in a remote desktop scenario, copying from the remote machine to the local clipboard may not always succeed.
Also, you can tell if a program is secretly changing your clipboard.
Plus, if you are using an older laptop (mostly company issued) where keys are not super sensitive, CTRL+C may not always register; having an audio cue can save people lots of unwanted pastes.
Supporting information
No response
/feedbackhub
I suggest adding a sound notification when pasting too, preferably emitting distinct sounds for each action, so that it is easy to identify which action (copying/pasting) was performed.
For accessibility purposes, add an option to enable a pop-up notification for each action.
| gharchive/issue | 2023-07-09T14:48:25 | 2025-04-01T06:44:57.556801 | {
"authors": [
"DeXtmL",
"crutkas",
"ozzyjr"
],
"repo": "microsoft/PowerToys",
"url": "https://github.com/microsoft/PowerToys/issues/27309",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
2359328651 | Add support of Azure Open AI for Paste with ai
Description of the new feature / enhancement
Add support for Azure OpenAI instances deployed in Azure. This would require configuring an endpoint (*.openai.azure.com) and a "deployment name" in addition to the API key.
Scenario when this would be used?
When you are using a deployed version of Azure OpenAI and possibly don't have an OpenAI Pro license. This would also enable pay-as-you-go billing.
Supporting information
https://learn.microsoft.com/en-us/azure/ai-services/openai/
To my knowledge the existing OpenAI SDKs should also support Azure OpenAI instances with the corresponding configuration, so this might be an easy improvement to ship.
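The main mechanical difference from plain OpenAI is the request URL and the auth header. A hedged sketch of the URL shape (the pattern follows the Azure OpenAI REST docs; the resource, deployment name, and api-version value below are placeholders):

```typescript
// Chat-completions URL for an Azure OpenAI deployment:
//   https://<resource>.openai.azure.com/openai/deployments/<deployment>/...
function azureChatUrl(resource: string, deployment: string, apiVersion: string): string {
  return (
    `https://${resource}.openai.azure.com` +
    `/openai/deployments/${encodeURIComponent(deployment)}` +
    `/chat/completions?api-version=${encodeURIComponent(apiVersion)}`
  );
}
```

Azure also authenticates with an api-key request header rather than the Authorization: Bearer header used by openai.com, which is the other setting such a feature would need to expose.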
/dup #32960
| gharchive/issue | 2024-06-18T08:57:26 | 2025-04-01T06:44:57.559628 | {
"authors": [
"MichaMican",
"htcfreek"
],
"repo": "microsoft/PowerToys",
"url": "https://github.com/microsoft/PowerToys/issues/33425",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
802270215 | Update Issue from 0.29.3
ℹ Computer information
PowerToys version: 0.29.3 upgrading to 0.31.1
PowerToy Utility:
Running PowerToys as Admin: Yes
Windows build number: 19042.746
📝 Provide detailed reproduction steps (if any)
PT Asks for an update, so I run it
it fails to update saying it cannot uninstall the previous version
Starts the install, and then asks me to locate the PowertoysBoostrapInstaller-0.29.3.msi file to uninstall and fails because I have no idea where that should be
✔️ Expected result
Uninstall previous version and install new version
❌ Actual result
Fails to uninstall previous version and update to new version
📷 Screenshots
@wylel
download and run https://github.com/microsoft/PowerToys/releases/download/v0.29.0/PowerToysSetup-0.29.0-x64.exe
it should offer you the option to uninstall it.
When I try to launch it, it pops up for a split second then closes. No prompt or anything to do.
@wylel
open a Command Prompt as administrator and run, from the folder where PowerToysSetup-0.29.0-x64.exe is:
> PowerToysSetup-0.29.0-x64.exe --log_level debug
it will create a log file in the same folder.
powertoys-boostrapper-msi-0.29.0.log:
=== Verbose logging started: 2/5/2021 10:43:03 Build type: SHIP UNICODE 5.00.10011.00 Calling process: C:\Users\user\Downloads\PowerToysSetup-0.29.0-x64.exe ===
MSI (c) (FC:E0) [10:43:03:383]: Font created. Charset: Req=0, Ret=0, Font: Req=MS Shell Dlg, Ret=MS Shell Dlg
MSI (c) (FC:E0) [10:43:03:383]: Font created. Charset: Req=0, Ret=0, Font: Req=MS Shell Dlg, Ret=MS Shell Dlg
MSI (c) (FC:44) [10:43:03:399]: Resetting cached policy values
MSI (c) (FC:44) [10:43:03:399]: Machine policy value 'Debug' is 0
MSI (c) (FC:44) [10:43:03:399]: ******* RunEngine:
******* Product: C:\Windows\Installer\17c00114.msi
******* Action:
******* CommandLine: **********
MSI (c) (FC:44) [10:43:03:400]: Note: 1: 2203 2: C:\Windows\Installer\17c00114.msi 3: -2147287038
MSI (c) (FC:44) [10:43:03:400]: MainEngineThread is returning 2
=== Verbose logging stopped: 2/5/2021 10:43:03 ===
powertoys-boostrapper-0.29.0.log:
[D][05-02-21-10:43:03] PowerToys Bootstrapper is launched!
noFullUI: false
silent: false
no_start_pt: false
skip_dotnet_install: false
log_level: debug
install_dir: C:\Program Files\PowerToys
extract_msi: false
[E][05-02-21-10:43:03] Detected a newer v0.29.3 version => launching its installer
@wylel
sorry, I meant 0.29.3 and pasted 0.29.0.
The log for 0.29.3 is exactly the same?
@wylel
extract the MSI from the EXE:
> PowerToysSetup-0.29.3-x64.exe --extract_msi
you may need it when running the Microsoft uninstall troubleshooter
https://download.microsoft.com/download/7/E/9/7E9188C0-2511-4B01-8B4E-0A641EC2F600/MicrosoftProgram_Install_and_Uninstall.meta.diagcab
Got it fixed with that. Used that to uninstall, then installed the new version. Thanks
| gharchive/issue | 2021-02-05T15:46:11 | 2025-04-01T06:44:57.572333 | {
"authors": [
"enricogior",
"wylel"
],
"repo": "microsoft/PowerToys",
"url": "https://github.com/microsoft/PowerToys/issues/9521",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
783710425 | [Run][New Plugin] VSCode Workspaces/Remote machines
Summary of the Pull Request
What is this about:
So I use VSCode a lot (VSCode Insiders and VSCode stable).
One of my biggest productivity problems right now is finding my projects without opening VSCode or accessing them through the taskbar (e.g. Local, WSL, SSH or Codespaces).
So I had an idea for searching previously opened workspaces and/or remote machines in VSCode from the PowerToys launcher.
Previously opened workspaces
Vscode Remote machines
What is include in the PR:
Search previously opened VSCode workspaces
Search for remote machines configured in the VSCode SSH config file
How does someone test / validate:
Open a workspace or configure a remote machine in VSCode
Search for the workspace/remote machine in PowerToys Run
Quality Checklist
[x] Linked issue: #3547
[x] Communication: I've discussed this with core contributors in the issue.
[x] Tests: Added/updated and all pass
[x] Installer: Added/updated and all pass
[ ] Localization: All end user facing strings can be localized
[ ] Docs: Added/ updated
[x] Binaries: Any new files are added to WSX / YML
[ ] No new binaries
[x] YML for signing for new binaries
[x] WXS for installer for new binaries
Contributor License Agreement (CLA)
A CLA must be signed. If not, go over here and sign the CLA.
@ricardosantos9521
Can you please update the PR description based on the updated template. Thank you.
@ricardosantos9521, @enricogior
I think the plugin should follow the new naming convention. (Issue #9003.)
Microsoft.Plugin.VSCodeWorkspaces - > Microsoft.PowerToys.Run.Plugin.VSCodeWorkspaces
@ricardosantos9521
the core team is proposing that the community plugins use the Community.PowerToys.Run.Plugin namespace to distinguish them from the core plugins that the team takes responsibility to maintain.
Sorry for the late change request, but we just started accepting plugins that were not planned and we don't have the workflow in place yet.
No problem ;)
@enricogior
Imo Community.PowerToys.Run.Plugin namespace is a bit strange because PTRun is from MS. Imo Microsoft.PowerToys.Run.CommunityPlugin namespace would make more sense.
@ricardosantos9521
is the PR ready for review?
Yes please!
@ricardosantos9521
to enable localization, a config file is needed, take a look at this PR https://github.com/microsoft/PowerToys/pull/9295/files
@enricogior I think it's done. Is there a way to test localization locally?
@ricardosantos9521
Is there a way to test localization locally?
Unfortunately it's not possible.
| gharchive/pull-request | 2021-01-11T21:15:54 | 2025-04-01T06:44:57.594084 | {
"authors": [
"enricogior",
"htcfreek",
"ricardosantos9521"
],
"repo": "microsoft/PowerToys",
"url": "https://github.com/microsoft/PowerToys/pull/9050",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
821692707 | Play as client in UE4 causes a crash
When I run in Standalone Game mode, it's OK. But when I run as "Play as client", the UE4 editor crashes. Does the project not support multiplayer games?
LoginId:bc3ae57140518551c43e929e5e12a838
EpicAccountId:99c302e51f984c72a3a213d54546ea8d
Unhandled Exception: EXCEPTION_ACCESS_VIOLATION reading address 0x0000000000000354
UE4Editor_AkAudio!UAkComponent::GetPosition() [D:\Projects\myproject\main\UnrealEngine\Engine\Plugins\Wwise\Source\AkAudio\Private\AkComponent.cpp:883]
UE4Editor_ProjectAcoustics!AAcousticsSpace::LoadAceFile() [D:\Projects\myproject\main\UnrealEngine\Engine\Plugins\ProjectAcoustics\Source\ProjectAcoustics\Private\AcousticsSpace.cpp:209]
UE4Editor_ProjectAcoustics!AAcousticsSpace::BeginPlay() [D:\Projects\myproject\main\UnrealEngine\Engine\Plugins\ProjectAcoustics\Source\ProjectAcoustics\Private\AcousticsSpace.cpp:66]
UE4Editor_Engine!AActor::DispatchBeginPlay() [D:\Projects\myproject\main\UnrealEngine\Engine\Source\Runtime\Engine\Private\Actor.cpp:3524]
UE4Editor_Engine!AWorldSettings::NotifyBeginPlay() [D:\Projects\myproject\main\UnrealEngine\Engine\Source\Runtime\Engine\Private\WorldSettings.cpp:247]
UE4Editor_Engine!AGameStateBase::HandleBeginPlay() [D:\Projects\myproject\main\UnrealEngine\Engine\Source\Runtime\Engine\Private\GameStateBase.cpp:205]
UE4Editor_Engine!UWorld::BeginPlay() [D:\Projects\myproject\main\UnrealEngine\Engine\Source\Runtime\Engine\Private\World.cpp:4347]
UE4Editor_Engine!UGameInstance::StartPlayInEditorGameInstance() [D:\Projects\myproject\main\UnrealEngine\Engine\Source\Runtime\Engine\Private\GameInstance.cpp:489]
UE4Editor_UnrealEd!UEditorEngine::CreateInnerProcessPIEGameInstance() [D:\Projects\myproject\main\UnrealEngine\Engine\Source\Editor\UnrealEd\Private\PlayLevel.cpp:2930]
UE4Editor_UnrealEd!UEditorEngine::OnLoginPIEComplete_Deferred() [D:\Projects\myproject\main\UnrealEngine\Engine\Source\Editor\UnrealEd\Private\PlayLevel.cpp:1503]
UE4Editor_UnrealEd!UEditorEngine::CreateNewPlayInEditorInstance() [D:\Projects\myproject\main\UnrealEngine\Engine\Source\Editor\UnrealEd\Private\PlayLevel.cpp:1750]
UE4Editor_UnrealEd!UEditorEngine::StartPlayInEditorSession() [D:\Projects\myproject\main\UnrealEngine\Engine\Source\Editor\UnrealEd\Private\PlayLevel.cpp:2700]
UE4Editor_UnrealEd!UEditorEngine::StartQueuedPlaySessionRequestImpl() [D:\Projects\myproject\main\UnrealEngine\Engine\Source\Editor\UnrealEd\Private\PlayLevel.cpp:1116]
UE4Editor_UnrealEd!UEditorEngine::StartQueuedPlaySessionRequest() [D:\Projects\myproject\main\UnrealEngine\Engine\Source\Editor\UnrealEd\Private\PlayLevel.cpp:1019]
UE4Editor_UnrealEd!UEditorEngine::Tick() [D:\Projects\myproject\main\UnrealEngine\Engine\Source\Editor\UnrealEd\Private\EditorEngine.cpp:1623]
UE4Editor_UnrealEd!UUnrealEdEngine::Tick() [D:\Projects\myproject\main\UnrealEngine\Engine\Source\Editor\UnrealEd\Private\UnrealEdEngine.cpp:426]
UE4Editor!FEngineLoop::Tick() [D:\Projects\myproject\main\UnrealEngine\Engine\Source\Runtime\Launch\Private\LaunchEngineLoop.cpp:4855]
UE4Editor!GuardedMain() [D:\Projects\myproject\main\UnrealEngine\Engine\Source\Runtime\Launch\Private\Launch.cpp:169]
UE4Editor!WinMain() [D:\Projects\myproject\main\UnrealEngine\Engine\Source\Runtime\Launch\Private\Windows\LaunchWindows.cpp:269]
UE4Editor!__scrt_common_main_seh() [d:\agent\_work\63\s\src\vctools\crt\vcstartup\src\startup\exe_common.inl:288]
kernel32
ntdll
I found the error: the Wwise spatial audio listener returns a null pointer. The UE version used is 4.26; 4.25 has the same error.
I modified the code in the picture above: when it returns null, I return the camera position instead. But the error is still thrown, because FirstPlayerController returns null. Finally, in that case, I return the zero vector instead.
And I want to know: although the way I modified it avoids the crash, is it a good way that obeys the original meaning of the function AAcousticsSpace::GetListenerPosition? If it is not, how should I modify it?
This looks like a timing issue. The AcousticsSpace object is initialized before you get either an AKListener or a Camera, or even a world it looks like. You can either delay the initialization of the AcousticsSpace until after you have spawned your player actor, or you can leave your solution to return 0, or negative infinity, or some other obviously default value.
@MikeChemi If I return the zero vector, will it affect the later sound propagation (I mean, will the player not get the correct 3D sound effect)?
Returning 0 will produce incorrect results only for the duration that you are returning 0. It is worth checking that non-0 values are returned after your player has spawned into the scene. Said another way, if you're always returning 0, that's bad. But if it's only for a short time while your game is loading, that's no problem.
We've released an update since this issue was reported. Please try it out and reopen this issue if you are still having problems.
| gharchive/issue | 2021-03-04T02:37:50 | 2025-04-01T06:44:57.599256 | {
"authors": [
"MicalKarl",
"MikeChemi"
],
"repo": "microsoft/ProjectAcoustics",
"url": "https://github.com/microsoft/ProjectAcoustics/issues/51",
"license": "CC-BY-4.0",
"license_type": "permissive",
"license_source": "github-api"
} |
682872550 | Using ReSettings doesn't appear to work
I have coded a custom class like the wiki example provided for using ReSettings and I get an exception:
No applicable method 'LengthGreaterThan' exists in type 'Utils'
I'm coding in VisualStudio & C#. Not sure how to get it to work.
The class is:
namespace ValidationFunction
{
    public static class Utils
    {
        public static bool LengthGreaterThan(string s, int len)
        {
            return s.Length > len;
        }
    }
}
It's setup as
var reSettingsWithCustomTypes = new ReSettings { CustomTypes = new Type[] { typeof(Utils) } };
var bre = new RulesEngine.RulesEngine(workflowRules.ToArray(), null, reSettingsWithCustomTypes);
The rule contains: "Expression": "input1.CompanyName != null && Utils.LengthGreaterThan(input1.CompanyName, 6) == true"
It may have been my input, but after playing with it more I got it to work. Thanks!
On Saturday, August 22, 2020, Abbas Cyclewala wrote:
@bjsapir Can you share a sample input for this?
Hi @bjsapir , it is possible that it happened because your input is a dynamic object and the property you tried accessing was null.
It has something to do with how we convert Dynamic objects to typed objects.
In case of null, we assume the type as object.
In case you want to handle null in inputs you can modify you Utils class as follows:
public static class Utils
{
    public static bool LengthGreaterThan(object s, int len)
    {
        if (s is string)
        {
            return ((string)s).Length > len;
        }
        return false;
    }
}
or you could try typecasting it in the expression before passing it
"Expression": "input1.CompanyName != null && Utils.LengthGreaterThan((string)(input1.CompanyName), 6) == true"
I'll have an opportunity to test more next week to confirm. Thank you for the suggestions.
@bjsapir we have added test case for typecasting
https://github.com/microsoft/RulesEngine/blob/6b66162e561b5447c4c3500f167f8823c3c7cbbe/test/RulesEngine.UnitTest/TestData/rules5.json#L10
https://github.com/microsoft/RulesEngine/blob/6b66162e561b5447c4c3500f167f8823c3c7cbbe/test/RulesEngine.UnitTest/BusinessRuleEngineTest.cs#L236-L256
@bjsapir Closing this issue. Feel free to reopen it if you still face any problems.
| gharchive/issue | 2020-08-20T16:22:38 | 2025-04-01T06:44:57.612620 | {
"authors": [
"abbasc52",
"bjsapir"
],
"repo": "microsoft/RulesEngine",
"url": "https://github.com/microsoft/RulesEngine/issues/39",
"license": "MIT",
"license_type": "permissive",
"license_source": "github-api"
} |
954025729 | CSS : Misaligned text when viewed through mobile browsers.
Bug:
There's a slight misalignment in the text stating Open Source with TypeScript. This is visible only on Mobile Browsers.
Actual:
The text is left-aligned. Which makes it an odd one out when scrolling through.
Expected:
The text is expected to be center-aligned. So that it stays in-line with other similar elements.
Platforms tested:
Android
Chrome 92.0.4515.115 and Firefox 90.1.2
Screenshots:
Possible solutions:
There are a few ways to resolve this issue, but I think making this alignment change with a narrow or exact selector scope would be bad. So I've made a change that won't affect any future CSS changes.
Created a PR here #1965
| gharchive/issue | 2021-07-27T15:51:45 | 2025-04-01T06:44:57.654030 | {
"authors": [
"code-reaper08"
],
"repo": "microsoft/TypeScript-Website",
"url": "https://github.com/microsoft/TypeScript-Website/issues/1964",
"license": "CC-BY-4.0",
"license_type": "permissive",
"license_source": "github-api"
} |
728516323 | TS 4.1 & Gatsby & Yarn update
Figured I may as well bite the bullet and just do a big dep update all over the show.
Looks like the update to yarn somehow ignores the order of the packages to be built - https://github.com/yarnpkg/berry/pull/1986
Fixed a few bugs while thinking about the CI
| gharchive/pull-request | 2020-10-23T20:51:20 | 2025-04-01T06:44:57.655580 | {
"authors": [
"orta"
],
"repo": "microsoft/TypeScript-Website",
"url": "https://github.com/microsoft/TypeScript-Website/pull/1275",
"license": "CC-BY-4.0",
"license_type": "permissive",
"license_source": "github-api"
} |
479321699 | Create Compatibility with Object.fromEntries() with ES5
Search Terms
Object.fromEntries()
Suggestion
I'd like to see Object.fromEntries() available as a function when compiling to browser-compatible ES5.
Use Cases
Using Object.fromEntries() when targeting ES5 alongside the friendly Object.entries()
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object/fromEntries
Examples
Here is the polyfill from TC39 https://github.com/tc39/proposal-object-from-entries/blob/master/polyfill.js
function ObjectFromEntries(iter) {
const obj = {};
for (const pair of iter) {
if (Object(pair) !== pair) {
throw new TypeError('iterable for fromEntries should yield objects');
}
// Consistency with Map: contract is that entry has "0" and "1" keys, not
// that it is an array or iterable.
const { '0': key, '1': val } = pair;
Object.defineProperty(obj, key, {
configurable: true,
enumerable: true,
writable: true,
value: val,
});
}
return obj;
}
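A way to use this today without touching the global is a typed "ponyfill" that takes the pair array Object.entries produces. A sketch that stays within ES5 syntax after compilation (no downlevel iteration needed):

```typescript
// ES5-friendly fromEntries over an array of [key, value] pairs.
function fromEntries<V>(pairs: ReadonlyArray<readonly [string, V]>): Record<string, V> {
  const obj: Record<string, V> = {};
  for (let i = 0; i < pairs.length; i++) {
    obj[pairs[i][0]] = pairs[i][1];
  }
  return obj;
}
```

If you only need the type declaration, adding "es2019.object" to lib makes Object.fromEntries type-check while you ship the runtime polyfill yourself.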
Checklist
My suggestion meets these guidelines:
[x] This wouldn't be a breaking change in existing TypeScript/JavaScript code
[x] This wouldn't change the runtime behavior of existing JavaScript code
[x] This could be implemented without emitting different JS based on the types of the expressions
[x] This isn't a runtime feature (e.g. library functionality, non-ECMAScript syntax with JavaScript output, etc.)
[x] This feature would agree with the rest of TypeScript's Design Goals.
My Questions
Is this feature left out intentionally?
A guide/where to look when contributing to adding a feature like this?
[x] This isn't a runtime feature (e.g. library functionality, non-ECMAScript syntax with JavaScript output, etc.)
A polyfill for a function is a runtime feature. See #26087
Okay, so it is left out on purpose.
Thank you for your response :)
Object.fromEntries() is not supported in Microsoft Edge. So is there any replacement for this?
| gharchive/issue | 2019-08-11T01:18:46 | 2025-04-01T06:44:57.662620 | {
"authors": [
"HrshPtl",
"jcalz",
"waynevanson"
],
"repo": "microsoft/TypeScript",
"url": "https://github.com/microsoft/TypeScript/issues/32803",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
596162838 | Find all references counts references for static member and regular member with same name
From https://github.com/microsoft/vscode/issues/94429
TypeScript Version: 3.9-beta
Search Terms:
find all references
f12
Code
For the code:
class Y{
foo(){}
}
class X extends Y{
static foo(){}
}
Run find all references on either instance of foo
Bug:
Two references are returned for both cases of foo. There should only be a single reference (itself) in both of these cases:
TS Server trace:
[Trace - 21:25:49.728] <semantic> Sending request: references (65). Response expected: yes. Current queue length: 0
Arguments: {
"file": "/Users/matb/projects/san/add.js",
"line": 5,
"offset": 12
}
[Trace - 21:25:49.730] <semantic> Response received: references (65). Request took 2 ms. Success: true
Result: {
"refs": [
{
"file": "/Users/matb/projects/san/add.js",
"start": {
"line": 2,
"offset": 2
},
"end": {
"line": 2,
"offset": 5
},
"contextStart": {
"line": 2,
"offset": 2
},
"contextEnd": {
"line": 2,
"offset": 9
},
"lineText": "\tfoo(){}",
"isWriteAccess": true,
"isDefinition": true
},
{
"file": "/Users/matb/projects/san/add.js",
"start": {
"line": 5,
"offset": 9
},
"end": {
"line": 5,
"offset": 12
},
"contextStart": {
"line": 5,
"offset": 2
},
"contextEnd": {
"line": 5,
"offset": 16
},
"lineText": "\tstatic foo(){}",
"isWriteAccess": true,
"isDefinition": true
}
],
"symbolName": "foo",
"symbolStartOffset": 9,
"symbolDisplayString": "(method) X.foo(): void"
}
Absolutely bizarre.
Thanks for the PR @ShuiRuTian, and thank you @sheetalkamat for helping guide the PR to completion!
| gharchive/issue | 2020-04-07T21:27:40 | 2025-04-01T06:44:57.668372 | {
"authors": [
"DanielRosenwasser",
"mjbvz"
],
"repo": "microsoft/TypeScript",
"url": "https://github.com/microsoft/TypeScript/issues/37830",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
622959846 | Contextually type array literals as tuples when used in readonly contexts
Search Terms
array tuple argument inference
readonly array contextual tuple inference
Suggestion
The following example currently produces an error, since the type of ["abc", 100] is inferred as (string | number)[], which means that TS is inferred as readonly (string | number)[].
function call<TS extends readonly unknown[]>(
sources: TS,
func: (...args: TS) => number,
): number {
return func(...sources);
}
call(["abc", 100], (str, num) => {
// ERROR: Property 'length' does not exist on type 'string | number'.
return str.length + num;
// ^^^^^^
})
Instead, we can type ["abc", 100] as [string, number] automatically by observing that it's being used in a readonly context (parameter sources: TS where TS: readonly unknown[]).
Use Cases
Well-typed tuples in contexts like the call function above.
Some workarounds already exist, but they aren't ideal.
Unsatisfying Workarounds
Add Typing Annotations
With an explicit type annotation, call<[string, number]>(["abc", 100], func) can be made to work, but this can be tedious or difficult if the values in the tuple have complex types.
Add as const
With an explicit as const, call(["abc", 100] as const, func) can be made to work, but this can sometimes do "too much": it now causes "abc" and 100 to be literally-typed. If they were object values, their fields would become readonly even if that wasn't desired. It also requires extra effort on the part of a library's users, which could cause libraries to shun this approach due to an apparent lack of ergonomics.
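To make "too much" concrete, here is a small illustration of what as const does to the element types (ordinary TypeScript behavior, independent of this proposal):

```typescript
// `as const` gives precise tuple typing, but also freezes and
// literal-types everything inside:
const pair = [{ name: "abc" }, 100] as const;
// type: readonly [{ readonly name: "abc" }, 100]
// pair[0].name = "xyz"; // error: 'name' is a read-only property
// pair[1] has the literal type 100, not number.

// Without `as const`, mutability is preserved but the tuple shape is lost:
const plain = [{ name: "abc" }, 100];
// type: ({ name: string } | number)[]
```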
Add lots of overloads
Manually adding overloads
function call<T1>(sources: [T1], func: (a1: T1) => number): number;
function call<T1, T2>(sources: [T1, T2], func: (a1: T1, a2: T2) => number): number;
function call<T1, T2, T3>(sources: [T1, T2, T3], func: (a1: T1, a2: T2, a3: T3) => number): number;
function call<T1, T2, T3, T4>(sources: [T1, T2, T3, T4], func: (a1: T1, a2: T2, a3: T3, a4: T4) => number): number;
This approach is repetitive, incomplete (always could need more tuple arguments), and deals poorly with highly-generic code (essentially, requiring the same piecemeal approach for all generic callers).
Examples
With this feature, all of the following can type-check, without needing any annotations at use-sites:
function computeArrAny<TS extends readonly unknown[]>(
sources: TS,
func: (...args: TS) => number,
): number {
return func(...sources);
}
computeArrAny(["abc", 123], (str, num) => {
return str.length + num;
});
function computeArrStr<TS extends readonly string[]>(
sources: TS,
func: (...args: TS) => number,
): number {
return func(...sources);
}
computeArrStr(["abc", "def"], (str1, str2) => {
return str1.length + str2.length;
});
function computeArrTup<TS extends readonly [string, number]>(
sources: TS,
func: (...args: TS) => number,
): number {
return func(...sources);
}
computeArrTup(["abc", 123], (str, num) => {
return str.length + num;
});
Breaking Change?
By restricting to readonly contexts (i.e. where the contextual type extends readonly unknown[]), the number of breaking changes is minimal. In non-generic cases, the tuple type will immediately "decay" to its current array type, with no user-facing change.
In generic contexts, tuples will generally only become more-precise. There are a few cases that may behave differently:
function ditto<T extends readonly unknown[]>(arr: T): T {
return arr;
}
// OLD x: number[]
// NEW x: [number, number, number]
const x = ditto([1, 2, 3]);
It's unclear whether this pattern appears in real-world code, or if the new typing is problematic (if the array is used mutably, it may behave incorrectly; if the array is used in a read-only manner, there is no change in behavior). My prototype in the TypeScript codebase didn't find any regressions in the compiler tests.
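To make the mutation caveat concrete, a sketch of how the more precise tuple type could drift from runtime reality. Note that under the proposal `x` would be inferred as `[number, number, number]`, while today's compiler infers `number[]`, so the snippet compiles either way:

```typescript
function ditto<T extends readonly unknown[]>(arr: T): T {
  return arr;
}

const x = ditto([1, 2, 3]); // proposed: [number, number, number]; today: number[]

// Tuples still expose push, so mutation is allowed, but a tuple type
// would then claim length 3 while the runtime length is 4.
x.push(4);
console.log(x.length);
```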
Requiring that the array constraint be readonly drastically limits the fallout of this change, since it's uncommon (even for generic functions) to use a readonly array constraint.
Checklist
My suggestion meets these guidelines:
[x] This wouldn't be a breaking change in existing TypeScript/JavaScript code (see caveats above)
[x] This wouldn't change the runtime behavior of existing JavaScript code
[x] This could be implemented without emitting different JS based on the types of the expressions
[x] This isn't a runtime feature (e.g. library functionality, non-ECMAScript syntax with JavaScript output, etc.)
[x] This feature would agree with the rest of TypeScript's Design Goals.
I'm also interested in contributing this change. I have a working prototype implementation.
@ahejlsberg thoughts on doing this as part of your current tuple work? Seems compelling
@RyanCavanaugh
@ahejlsberg thoughts on doing this as part of your current tuple work? Seems compelling
This may be a breaking change if somebody uses an existing inferred array type in the computation of other types, for example:
const a = [1, 2, 3]
function b (param: typeof a) { ... }
// Works now, but will produce an error after this change
const c = b([1, 2, 3, 4])
So, the better way is to add a new compiler option that will make type checker infer types as narrow as possible: https://github.com/microsoft/TypeScript/issues/38831
@awerlogus you seem to be misunderstanding this proposal: your example will continue to work exactly the same. This only affects tuples that are contextually typed in a readonly context. Since typeof a is still number[], the criteria for the new behavior don't apply.
See the section on breaking changes for the places that are affected. Non-generic code will never show any difference in behavior.
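A minimal sketch of why that example is unaffected: without a readonly contextual type, the literal still infers as number[], and plain array types place no length constraint on callers:

```typescript
const a = [1, 2, 3];          // inferred as number[], before and after the proposal
function b(param: typeof a) { // param: number[]
  return param.length;
}
const c = b([1, 2, 3, 4]);    // still fine: number[] accepts any length
console.log(c);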
We already support inferring tuple types when the contextual type is a tuple type or a union that includes at least one tuple type. The latter means that you can generally just union with [] to get the effect you want:
function call<TS extends readonly unknown[] | []>(
sources: TS,
func: (...args: TS) => number,
): number {
return func(...sources);
}
So, I think we already have what you need.
That's a bit "hacky" but it does solve this particular problem!
| gharchive/issue | 2020-05-22T05:26:06 | 2025-04-01T06:44:57.683185 | {
"authors": [
"Nathan-Fenner",
"RyanCavanaugh",
"ahejlsberg",
"awerlogus"
],
"repo": "microsoft/TypeScript",
"url": "https://github.com/microsoft/TypeScript/issues/38727",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |
945919994 | Provide more metadata for collapse span
Suggestion
🔍 Search Terms
imports, spans, metadata
✅ Viability Checklist
My suggestion meets these guidelines:
[x] This wouldn't be a breaking change in existing TypeScript/JavaScript code
[x] This wouldn't change the runtime behavior of existing JavaScript code
[x] This could be implemented without emitting different JS based on the types of the expressions
[x] This isn't a runtime feature (e.g. library functionality, non-ECMAScript syntax with JavaScript output, new syntax sugar for JS, etc.)
[x] This feature would agree with the rest of TypeScript's Design Goals.
⭐ Suggestion
Provide some information in the span so that editors can distinguish what exactly the code is.
📃 Motivating Example
import a from 'a'
import b from 'b'
// ...
import z from 'z'
// collapse all imports by default.
// controlled by an editor option.
const mycode = 'here'
💻 Use Cases
We currently have a span that allows users to collapse all import statements.
I thought we could also add an editor option to control whether import statements are collapsed by default.
So, editors may need some additional info to distinguish the spans.
Oh. I just found we have
export const enum OutliningSpanKind {
/** Single or multi-line comments */
Comment = "comment",
/** Sections marked by '// #region' and '// #endregion' comments */
Region = "region",
/** Declarations and expressions */
Code = "code",
/** Contiguous blocks of import declarations */
Imports = "imports"
}
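On the editor side, a hedged sketch of how a client could use this kind to implement "collapse imports by default". The OutliningSpan shape below is simplified from the real services type, which also carries fields such as textSpan, hintSpan, and bannerText:

```typescript
// Simplified stand-in for the protocol's outlining span response.
type OutliningSpanKind = "comment" | "region" | "code" | "imports";

interface OutliningSpan {
  kind: OutliningSpanKind;
}

// An editor option decides whether the "imports" spans start collapsed:
function spansToCollapseByDefault(
  spans: OutliningSpan[],
  collapseImportsByDefault: boolean,
): OutliningSpan[] {
  return collapseImportsByDefault
    ? spans.filter((s) => s.kind === "imports")
    : [];
}
```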
So, we may not need to add anything on the TypeScript side.
| gharchive/issue | 2021-07-16T03:55:42 | 2025-04-01T06:44:57.689494 | {
"authors": [
"Kingwl"
],
"repo": "microsoft/TypeScript",
"url": "https://github.com/microsoft/TypeScript/issues/45062",
"license": "Apache-2.0",
"license_type": "permissive",
"license_source": "github-api"
} |