The dataset preview below has 15 columns. Column schema (dtype, and value range or number of distinct string classes):

| Column | Dtype | Range / classes |
|---|---|---|
| Unnamed: 0 | int64 | 0 - 832k |
| id | float64 | 2.49B - 32.1B |
| type | string | 1 class (IssuesEvent) |
| created_at | string | length 19 |
| repo | string | length 7 - 112 |
| repo_url | string | length 36 - 141 |
| action | string | 3 classes |
| title | string | length 1 - 744 |
| labels | string | length 4 - 574 |
| body | string | length 9 - 211k |
| index | string | 10 classes |
| text_combine | string | length 96 - 211k |
| label | string | 2 classes (process, non_process) |
| text | string | length 96 - 188k |
| binary_label | int64 | 0 - 1 |
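As a quick orientation, here is a minimal sketch of how a dump with this schema could be loaded and sanity-checked with pandas. The file name `github_issues_sample.csv` is a placeholder for whatever export this preview was taken from, and the assumption that `binary_label` is simply the 0/1 encoding of `label` (`process` = 1, `non_process` = 0) is inferred from the sample records below.

```python
# Minimal inspection sketch for a dump with the schema above.
# "github_issues_sample.csv" is a hypothetical file name; point it at the real export.
import pandas as pd

df = pd.read_csv("github_issues_sample.csv")

# Shape and dtypes should mirror the schema table.
print(df.shape)
print(df.dtypes)

# Label distribution: "process" vs "non_process".
print(df["label"].value_counts())

# Assumption (inferred from the sample rows): binary_label == 1 exactly when label == "process".
assert ((df["label"] == "process").astype(int) == df["binary_label"]).all()

# Example query: titles of process-labelled issues.
print(df.loc[df["binary_label"] == 1, ["repo", "title"]].head())
```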
Sample records:

**Unnamed: 0:** 104,225 · **id:** 13,044,070,542 · **type:** IssuesEvent · **created_at:** 2020-07-29 03:30:25
**repo:** filecoin-project/slate · **repo_url:** https://api.github.com/repos/filecoin-project/slate · **action:** closed
**title:** GLRenderer: Start mapping coordinates to actual country location key names
**labels:** Design Feature For the public
**body:**
# Purpose
<img width="1770" alt="Screen Shot 2020-07-22 at 5 14 48 AM" src="https://user-images.githubusercontent.com/310223/88175139-4b9b4f00-cbda-11ea-890e-b72fea50155c.png">
* We use a JSON file that provides coordinates.
* It would be nice to map coordinate positions to `country` keywords so when we want to start showing real time connections between `Slate` clients we are setup for it.
# Deliverable
* A JavaScript map helping connect the `coordinates` to real countries and/or cities.

**index:** 1.0
**text_combine:**
GLRenderer: Start mapping coordinates to actual country location key names - # Purpose
<img width="1770" alt="Screen Shot 2020-07-22 at 5 14 48 AM" src="https://user-images.githubusercontent.com/310223/88175139-4b9b4f00-cbda-11ea-890e-b72fea50155c.png">
* We use a JSON file that provides coordinates.
* It would be nice to map coordinate positions to `country` keywords so when we want to start showing real time connections between `Slate` clients we are setup for it.
# Deliverable
* A JavaScript map helping connect the `coordinates` to real countries and/or cities.

**label:** non_process
**text:**
glrenderer start mapping coordinates to actual country location key names purpose img width alt screen shot at am src we use a json file that provides coordinates it would be nice to map coordinate positions to country keywords so when we want to start showing real time connections between slate clients we are setup for it deliverable a javascript map helping connect the coordinates to real countries and or cities

**binary_label:** 0

**Unnamed: 0:** 317,717 · **id:** 23,685,671,350 · **type:** IssuesEvent · **created_at:** 2022-08-29 05:57:52
**repo:** UoaWDCC/NZCSA-Frontend · **repo_url:** https://api.github.com/repos/UoaWDCC/NZCSA-Frontend · **action:** closed
**title:** [Documentation] EventDetail, EventGrid and SponsorGrid
**labels:** Type: Documentation
**body:**
**Describe the task that needs to be done.**
Document the EventDetail, EventGrid and SponsorGrid using jsdoc and comment any methods.
In the js doc, must include the use of the file, and what is in the input props if its applicable.
**Describe how a solution to your proposed task might look like (and any alternatives considered).**
*fill in this please*
**Notes**

**index:** 1.0
**text_combine:**
[Documentation] EventDetail, EventGrid and SponsorGrid - **Describe the task that needs to be done.**
Document the EventDetail, EventGrid and SponsorGrid using jsdoc and comment any methods.
In the js doc, must include the use of the file, and what is in the input props if its applicable.
**Describe how a solution to your proposed task might look like (and any alternatives considered).**
*fill in this please*
**Notes**

**label:** non_process
**text:**
eventdetail eventgrid and sponsorgrid describe the task that needs to be done document the eventdetail eventgrid and sponsorgrid using jsdoc and comment any methods in the js doc must include the use of the file and what is in the input props if its applicable describe how a solution to your proposed task might look like and any alternatives considered fill in this please notes

**binary_label:** 0

**Unnamed: 0:** 9,852 · **id:** 30,665,396,424 · **type:** IssuesEvent · **created_at:** 2023-07-25 17:52:35
**repo:** pulumi/pulumi · **repo_url:** https://api.github.com/repos/pulumi/pulumi · **action:** opened
**title:** `pulumi new` equivalent in automation API
**labels:** kind/enhancement needs-triage area/automation-api
**body:**
## Hello!
<!-- Please leave this section as-is, it's designed to help others in the community know how to interact with our GitHub issues. -->
- Vote on this issue by adding a 👍 reaction
- If you want to implement this feature, comment to let us know (we'll work with you on design, scheduling, etc.)
## Issue details
As part of working on a Backstage plugin @dirien noticed we don't have an equivalent of `pulumi new` exposed in the automation API. To work around this he had to `os exec` the CLI directly.
### Affected area/feature
Automation API

**index:** 1.0
**text_combine:**
`pulumi new` equivalent in automation API - ## Hello!
<!-- Please leave this section as-is, it's designed to help others in the community know how to interact with our GitHub issues. -->
- Vote on this issue by adding a 👍 reaction
- If you want to implement this feature, comment to let us know (we'll work with you on design, scheduling, etc.)
## Issue details
As part of working on a Backstage plugin @dirien noticed we don't have an equivalent of `pulumi new` exposed in the automation API. To work around this he had to `os exec` the CLI directly.
### Affected area/feature
Automation API

**label:** non_process
**text:**
pulumi new equivalent in automation api hello vote on this issue by adding a 👍 reaction if you want to implement this feature comment to let us know we ll work with you on design scheduling etc issue details as part of working on a backstage plugin dirien noticed we don t have an equivalent of pulumi new exposed in the automation api to work around this he had to os exec the cli directly affected area feature automation api

**binary_label:** 0

**Unnamed: 0:** 234,774 · **id:** 19,256,733,882 · **type:** IssuesEvent · **created_at:** 2021-12-09 12:08:24
**repo:** vmware-tanzu/community-edition · **repo_url:** https://api.github.com/repos/vmware-tanzu/community-edition · **action:** opened
**title:** Fix AWS E2E test pipeline failures
**labels:** kind/test-release area/release-eng
**body:**
Recently TCE’s AWS E2E test pipelines have been failing and I thought it might be just a flaky issue, but looks like it’s not
Interestingly it was also initially only seen in recent AWS management + workload cluster E2E test runs -
https://github.com/vmware-tanzu/community-edition/actions/runs/1552245009
https://github.com/vmware-tanzu/community-edition/actions/runs/1550338695
https://github.com/vmware-tanzu/community-edition/actions/runs/1539924396
The crux of the problem is quota for Elastic IPs (public IPs) needed for NAT Gateways (AWS resource), which can be understood from the error message popping up a lot in the diagnostics data (zip) from the failed pipelines -
```
failed to create one or more IP addresses for NAT gateways: failed to allocate Elastic IP: AddressLimitExceeded: The maximum number of addresses has been reached.\n\tstatus code: 400, request id: b9fa4619-3f8b-475d-a546-923075f6fb6a
```
which shows up in the pipeline logs as something like
```
Error: unable to wait for cluster and get the cluster kubeconfig: error waiting for cluster to be provisioned (this may take a few minutes): cluster creation failed, reason:'NatGatewaysReconciliationFailed', message:'3 of 8 completed'
```
Note the `NatGatewaysReconciliationFailed` error
Getting more into the problem - apparently only 5 elastic IPs are allowed per region by default - https://docs.aws.amazon.com/vpc/latest/userguide/amazon-vpc-limits.html#vpc-limits-eips . This is in terms of quotas. But it’s adjustable and we can get more quota
My guess on what went wrong is - many pipelines ran in parallel - AWS standalone cluster pipelines, AWS management + workload cluster pipelines. Given a cluster (standalone / management / workload), I think it needs one NAT Gateway, which needs one elastic IP. So I think the parallelism for too many pipelines gets restricted due to the above quota on the elastic IP
Interestingly, I noticed 3 elastic IPs lying around in the AWS account we use for E2E test pipelines. I think they were not cleaned up. I was wondering how the others got cleaned up, something to check. But the aws nuke config does not mention cleaning up elastic IP, and when I deleted NAT gateway (after associating it with an elastic IP I created), it deleted the NAT gateway only but not the elastic IP. But yeah, somehow some elastic IPs are left out in the account
And all our E2E test pipelines use the same AWS region - `us-east-2`, so the total elastic IP we can use is only 5 (as per the current quota) across the pipelines
https://github.com/vmware-tanzu/community-edition/blob/73783c978aae5605e2e642697430c6befa9cbbda/test/aws/cluster-config.yaml#L5
https://github.com/vmware-tanzu/community-edition/blob/73783c978aae5605e2e642697430c6befa9cbbda/test/aws/cluster-config.yaml#L2
Given that 3 elastic IPs are already lying around, I think when the AWS pipelines try to get more IPs while creating NAT gateways, they fail. Like, when a commit is pushed, I think two AWS pipelines run for each commit - standalone cluster and management + workload cluster, for example, for this commit -
https://github.com/vmware-tanzu/community-edition/commit/b5d53c19a3c8f67cabd8c078a640176cc792536e
the two AWS pipelines are -
standalone cluster - https://github.com/vmware-tanzu/community-edition/actions/runs/1552245007
management + workload cluster - https://github.com/vmware-tanzu/community-edition/actions/runs/1552245009
This is according to the AWS E2E GitHub workflows -
https://github.com/vmware-tanzu/community-edition/blob/73783c978aae5605e2e642697430c6befa9cbbda/.github/workflows/e2e-aws-standalone-cluster.yaml#L3-L6
https://github.com/vmware-tanzu/community-edition/blob/73783c978aae5605e2e642697430c6befa9cbbda/.github/workflows/e2e-aws-management-and-workload-cluster.yaml#L3-L6
The standalone cluster E2E usually seems to run fast and gets it’s NAT Gateway for the cluster with one Elastic (Public) IP
The management cluster + workload cluster E2E test runs fast too and also gets NAT Gateway for the management cluster with one Elastic (Public) IP
Now with existing 3 IPs lying around and 2 new IPs (from the above tests), total 5 IPs have been created and 2 are being used
Now in management cluster + workload cluster E2E test, workload cluster creation happens and it fails, as it cannot create a NAT Gateway which needs an Elastic IP (public IP) and the quota has been reached. I think this has been happening for quite some time now, like, last few runs - for the recent commits, you can see that either the standalone cluster E2E test fails with NAT Gateway issue, or management + workload cluster E2E test fails with NAT Gateway issue, depending on which E2E test uses up the 2 Elastic IPs first which is available in quota, and if they use it fast and clean it up fast too, then other one can use it
https://github.com/vmware-tanzu/community-edition/actions/runs/1526817483 - fail , https://github.com/vmware-tanzu/community-edition/actions/runs/1526817480 - pass
https://github.com/vmware-tanzu/community-edition/actions/runs/1532904134 - fail, https://github.com/vmware-tanzu/community-edition/actions/runs/1532904138 - pass
https://github.com/vmware-tanzu/community-edition/actions/runs/1533964631 - fail, https://github.com/vmware-tanzu/community-edition/actions/runs/1533964621 - pass
https://github.com/vmware-tanzu/community-edition/actions/runs/1535806421 - fail, https://github.com/vmware-tanzu/community-edition/actions/runs/1535806430 - pass
https://github.com/vmware-tanzu/community-edition/actions/runs/1535902764 - fail, https://github.com/vmware-tanzu/community-edition/actions/runs/1535902766 - pass
https://github.com/vmware-tanzu/community-edition/actions/runs/1539924396 - fail, https://github.com/vmware-tanzu/community-edition/actions/runs/1539924399 - pass
https://github.com/vmware-tanzu/community-edition/actions/runs/1550338695 - fail, https://github.com/vmware-tanzu/community-edition/actions/runs/1550338699 - pass
https://github.com/vmware-tanzu/community-edition/actions/runs/1552245009 - fail, https://github.com/vmware-tanzu/community-edition/actions/runs/1552245007 - pass

**index:** 1.0
**text_combine:**
Fix AWS E2E test pipeline failures - Recently TCE’s AWS E2E test pipelines have been failing and I thought it might be just a flaky issue, but looks like it’s not
Interestingly it was also initially only seen in recent AWS management + workload cluster E2E test runs -
https://github.com/vmware-tanzu/community-edition/actions/runs/1552245009
https://github.com/vmware-tanzu/community-edition/actions/runs/1550338695
https://github.com/vmware-tanzu/community-edition/actions/runs/1539924396
The crux of the problem is quota for Elastic IPs (public IPs) needed for NAT Gateways (AWS resource), which can be understood from the error message popping up a lot in the diagnostics data (zip) from the failed pipelines -
```
failed to create one or more IP addresses for NAT gateways: failed to allocate Elastic IP: AddressLimitExceeded: The maximum number of addresses has been reached.\n\tstatus code: 400, request id: b9fa4619-3f8b-475d-a546-923075f6fb6a
```
which shows up in the pipeline logs as something like
```
Error: unable to wait for cluster and get the cluster kubeconfig: error waiting for cluster to be provisioned (this may take a few minutes): cluster creation failed, reason:'NatGatewaysReconciliationFailed', message:'3 of 8 completed'
```
Note the `NatGatewaysReconciliationFailed` error
Getting more into the problem - apparently only 5 elastic IPs are allowed per region by default - https://docs.aws.amazon.com/vpc/latest/userguide/amazon-vpc-limits.html#vpc-limits-eips . This is in terms of quotas. But it’s adjustable and we can get more quota
My guess on what went wrong is - many pipelines ran in parallel - AWS standalone cluster pipelines, AWS management + workload cluster pipelines. Given a cluster (standalone / management / workload), I think it needs one NAT Gateway, which needs one elastic IP. So I think the parallelism for too many pipelines gets restricted due to the above quota on the elastic IP
Interestingly, I noticed 3 elastic IPs lying around in the AWS account we use for E2E test pipelines. I think they were not cleaned up. I was wondering how the others got cleaned up, something to check. But the aws nuke config does not mention cleaning up elastic IP, and when I deleted NAT gateway (after associating it with an elastic IP I created), it deleted the NAT gateway only but not the elastic IP. But yeah, somehow some elastic IPs are left out in the account
And all our E2E test pipelines use the same AWS region - `us-east-2`, so the total elastic IP we can use is only 5 (as per the current quota) across the pipelines
https://github.com/vmware-tanzu/community-edition/blob/73783c978aae5605e2e642697430c6befa9cbbda/test/aws/cluster-config.yaml#L5
https://github.com/vmware-tanzu/community-edition/blob/73783c978aae5605e2e642697430c6befa9cbbda/test/aws/cluster-config.yaml#L2
Given that 3 elastic IPs are already lying around, I think when the AWS pipelines try to get more IPs while creating NAT gateways, they fail. Like, when a commit is pushed, I think two AWS pipelines run for each commit - standalone cluster and management + workload cluster, for example, for this commit -
https://github.com/vmware-tanzu/community-edition/commit/b5d53c19a3c8f67cabd8c078a640176cc792536e
the two AWS pipelines are -
standalone cluster - https://github.com/vmware-tanzu/community-edition/actions/runs/1552245007
management + workload cluster - https://github.com/vmware-tanzu/community-edition/actions/runs/1552245009
This is according to the AWS E2E GitHub workflows -
https://github.com/vmware-tanzu/community-edition/blob/73783c978aae5605e2e642697430c6befa9cbbda/.github/workflows/e2e-aws-standalone-cluster.yaml#L3-L6
https://github.com/vmware-tanzu/community-edition/blob/73783c978aae5605e2e642697430c6befa9cbbda/.github/workflows/e2e-aws-management-and-workload-cluster.yaml#L3-L6
The standalone cluster E2E usually seems to run fast and gets it’s NAT Gateway for the cluster with one Elastic (Public) IP
The management cluster + workload cluster E2E test runs fast too and also gets NAT Gateway for the management cluster with one Elastic (Public) IP
Now with existing 3 IPs lying around and 2 new IPs (from the above tests), total 5 IPs have been created and 2 are being used
Now in management cluster + workload cluster E2E test, workload cluster creation happens and it fails, as it cannot create a NAT Gateway which needs an Elastic IP (public IP) and the quota has been reached. I think this has been happening for quite some time now, like, last few runs - for the recent commits, you can see that either the standalone cluster E2E test fails with NAT Gateway issue, or management + workload cluster E2E test fails with NAT Gateway issue, depending on which E2E test uses up the 2 Elastic IPs first which is available in quota, and if they use it fast and clean it up fast too, then other one can use it
https://github.com/vmware-tanzu/community-edition/actions/runs/1526817483 - fail , https://github.com/vmware-tanzu/community-edition/actions/runs/1526817480 - pass
https://github.com/vmware-tanzu/community-edition/actions/runs/1532904134 - fail, https://github.com/vmware-tanzu/community-edition/actions/runs/1532904138 - pass
https://github.com/vmware-tanzu/community-edition/actions/runs/1533964631 - fail, https://github.com/vmware-tanzu/community-edition/actions/runs/1533964621 - pass
https://github.com/vmware-tanzu/community-edition/actions/runs/1535806421 - fail, https://github.com/vmware-tanzu/community-edition/actions/runs/1535806430 - pass
https://github.com/vmware-tanzu/community-edition/actions/runs/1535902764 - fail, https://github.com/vmware-tanzu/community-edition/actions/runs/1535902766 - pass
https://github.com/vmware-tanzu/community-edition/actions/runs/1539924396 - fail, https://github.com/vmware-tanzu/community-edition/actions/runs/1539924399 - pass
https://github.com/vmware-tanzu/community-edition/actions/runs/1550338695 - fail, https://github.com/vmware-tanzu/community-edition/actions/runs/1550338699 - pass
https://github.com/vmware-tanzu/community-edition/actions/runs/1552245009 - fail, https://github.com/vmware-tanzu/community-edition/actions/runs/1552245007 - pass

**label:** non_process
**text:**
fix aws test pipeline failures recently tce’s aws test pipelines have been failing and i thought it might be just a flaky issue but looks like it’s not interestingly it was also initially only seen in recent aws management workload cluster test runs the crux of the problem is quota for elastic ips public ips needed for nat gateways aws resource which can be understood from the error message popping up a lot in the diagnostics data zip from the failed pipelines failed to create one or more ip addresses for nat gateways failed to allocate elastic ip addresslimitexceeded the maximum number of addresses has been reached n tstatus code request id which shows up in the pipeline logs as something like error unable to wait for cluster and get the cluster kubeconfig error waiting for cluster to be provisioned this may take a few minutes cluster creation failed reason natgatewaysreconciliationfailed message of completed note the natgatewaysreconciliationfailed error getting more into the problem apparently only elastic ips are allowed per region by default this is in terms of quotas but it’s adjustable and we can get more quota my guess on what went wrong is many pipelines ran in parallel aws standalone cluster pipelines aws management workload cluster pipelines given a cluster standalone management workload i think it needs one nat gateway which needs one elastic ip so i think the parallelism for too many pipelines gets restricted due to the above quota on the elastic ip interestingly i noticed elastic ips lying around in the aws account we use for test pipelines i think they were not cleaned up i was wondering how the others got cleaned up something to check but the aws nuke config does not mention cleaning up elastic ip and when i deleted nat gateway after associating it with an elastic ip i created it deleted the nat gateway only but not the elastic ip but yeah somehow some elastic ips are left out in the account and all our test pipelines use the same aws region us east so the total elastic ip we can use is only as per the current quota across the pipelines given that elastic ips are already lying around i think when the aws pipelines try to get more ips while creating nat gateways they fail like when a commit is pushed i think two aws pipelines run for each commit standalone cluster and management workload cluster for example for this commit the two aws pipelines are standalone cluster management workload cluster this is according to the aws github workflows the standalone cluster usually seems to run fast and gets it’s nat gateway for the cluster with one elastic public ip the management cluster workload cluster test runs fast too and also gets nat gateway for the management cluster with one elastic public ip now with existing ips lying around and new ips from the above tests total ips have been created and are being used now in management cluster workload cluster test workload cluster creation happens and it fails as it cannot create a nat gateway which needs an elastic ip public ip and the quota has been reached i think this has been happening for quite some time now like last few runs for the recent commits you can see that either the standalone cluster test fails with nat gateway issue or management workload cluster test fails with nat gateway issue depending on which test uses up the elastic ips first which is available in quota and if they use it fast and clean it up fast too then other one can use it fail pass fail pass fail pass fail pass fail pass fail pass fail pass fail pass

**binary_label:** 0

**Unnamed: 0:** 4,894 · **id:** 7,764,056,230 · **type:** IssuesEvent · **created_at:** 2018-06-01 18:49:39
**repo:** syndesisio/syndesis · **repo_url:** https://api.github.com/repos/syndesisio/syndesis · **action:** closed
**title:** Populate the UX design tracker
**labels:** cat/process group/uxd prio/p1
**body:**
## This is a...
<!-- Check one of the following options with "x" -->
<pre><code>
[ ] Feature request
[ ] Regression (a behavior that used to work and stopped working in a new release)
[ ] Bug report <!-- Please search GitHub for a similar issue or PR before submitting -->
[x] Documentation issue or request
</code></pre>
## The problem
Based on #2033 discussions, UX, engineering, and project management agree that going forward, UX will no longer use the Github PR process for publishing designs. Instead, we will track our designs using the "UX tracker" tool gaining popularity with the UXD team.
Our UX tracker is located in our old syndesis-ux rep: https://syndesisio.github.io/syndesis-ux/ We've repurposed those Git pages to serve up this tracker.
## Expected behavior
This will be the single location for all designs moving forward. The links will point to InVision files that contain the most up-to-date designs.
At first, however, many of the links will go to existing PRs. As we move away from the PR process, newly added designs will link to InVision pages. Anyone with the public link to the InVision will be able to add comments directly to the InVision document thereby capturing all feedback in one place (i.e., you don't need an InVision license).
## Screenshot


**index:** 1.0
**text_combine:**
Populate the UX design tracker - ## This is a...
<!-- Check one of the following options with "x" -->
<pre><code>
[ ] Feature request
[ ] Regression (a behavior that used to work and stopped working in a new release)
[ ] Bug report <!-- Please search GitHub for a similar issue or PR before submitting -->
[x] Documentation issue or request
</code></pre>
## The problem
Based on #2033 discussions, UX, engineering, and project management agree that going forward, UX will no longer use the Github PR process for publishing designs. Instead, we will track our designs using the "UX tracker" tool gaining popularity with the UXD team.
Our UX tracker is located in our old syndesis-ux rep: https://syndesisio.github.io/syndesis-ux/ We've repurposed those Git pages to serve up this tracker.
## Expected behavior
This will be the single location for all designs moving forward. The links will point to InVision files that contain the most up-to-date designs.
At first, however, many of the links will go to existing PRs. As we move away from the PR process, newly added designs will link to InVision pages. Anyone with the public link to the InVision will be able to add comments directly to the InVision document thereby capturing all feedback in one place (i.e., you don't need an InVision license).
## Screenshot


**label:** process
**text:**
populate the ux design tracker this is a feature request regression a behavior that used to work and stopped working in a new release bug report documentation issue or request the problem based on discussions ux engineering and project management agree that going forward ux will no longer use the github pr process for publishing designs instead we will track our designs using the ux tracker tool gaining popularity with the uxd team our ux tracker is located in our old syndesis ux rep we ve repurposed those git pages to serve up this tracker expected behavior this will be the single location for all designs moving forward the links will point to invision files that contain the most up to date designs at first however many of the links will go to existing prs as we move away from the pr process newly added designs will link to invision pages anyone with the public link to the invision will be able to add comments directly to the invision document thereby capturing all feedback in one place i e you don t need an invision license screenshot

**binary_label:** 1

**Unnamed: 0:** 685,779 · **id:** 23,467,034,792 · **type:** IssuesEvent · **created_at:** 2022-08-16 17:47:40
**repo:** larsiusprime/tdrpg-bugs · **repo_url:** https://api.github.com/repos/larsiusprime/tdrpg-bugs · **action:** closed
**title:** The Markos/Ketta Bad Ending still plays 1 second of the sad song, then loads the normal ending music.
**labels:** bug DQ CORE Cutscene 1 Please Verify Priority HIGH
**body:**
I thought this had been fixed a few times in DQold. How does this keep popping up? :P

**index:** 1.0
**text_combine:**
The Markos/Ketta Bad Ending still plays 1 second of the sad song, then loads the normal ending music. - I thought this had been fixed a few times in DQold. How does this keep popping up? :P

**label:** non_process
**text:**
the markos ketta bad ending still plays second of the sad song then loads the normal ending music i thought this had been fixed a few times in dqold how does this keep popping up p

**binary_label:** 0

**Unnamed: 0:** 121,362 · **id:** 25,956,354,672 · **type:** IssuesEvent · **created_at:** 2022-12-18 09:27:34
**repo:** altkennyh2l/upptime · **repo_url:** https://api.github.com/repos/altkennyh2l/upptime · **action:** opened
**title:** 🛑 Code-server (C-take) is down
**labels:** status code-server-c-take
**body:**
In [`2a72f4c`](https://github.com/altkennyh2l/upptime/commit/2a72f4c7ce6f0c58e031b0f1a3968c0df4445e96
), Code-server (C-take) (https://code.kinokonoko.io) was **down**:
- HTTP code: 0
- Response time: 0 ms

**index:** 1.0
**text_combine:**
🛑 Code-server (C-take) is down - In [`2a72f4c`](https://github.com/altkennyh2l/upptime/commit/2a72f4c7ce6f0c58e031b0f1a3968c0df4445e96
), Code-server (C-take) (https://code.kinokonoko.io) was **down**:
- HTTP code: 0
- Response time: 0 ms

**label:** non_process
**text:**
🛑 code server c take is down in code server c take was down http code response time ms

**binary_label:** 0

**Unnamed: 0:** 16,026 · **id:** 20,188,242,898 · **type:** IssuesEvent · **created_at:** 2022-02-11 01:21:03
**repo:** savitamittalmsft/WAS-SEC-TEST · **repo_url:** https://api.github.com/repos/savitamittalmsft/WAS-SEC-TEST · **action:** opened
**title:** Enforce password-less or Multi-factor Authentication (MFA)
**labels:** WARP-Import WAF FEB 2021 Security Performance and Scalability Capacity Management Processes Security & Compliance Authentication and authorization
**body:**
<a href="https://docs.microsoft.com/azure/architecture/framework/security/design-identity-authentication#use-passwordless-authentication">Enforce password-less or Multi-factor Authentication (MFA)</a>
<p><b>Why Consider This?</b></p>
Attack methods have evolved to the point where passwords alone cannot reliably protect an account. Modern authentication solutions including password-less and multi-factor authentication increase security posture through strong authentication.
<p><b>Context</b></p>
<p><span>With modern authentication and security features in Azure AD, that basic password should be supplemented or replaced with more secure authentication methods. Each organization has different needs when it comes to authentication. Microsoft offers the following three passwordless authentication options that integrate with Azure Active Directory (Azure AD):</span></p><ul style="list-style-type:disc"><li value="1" style="text-indent: 0px;"><span>Windows Hello for Business</span></li><li value="2" style="margin-right: 0px;text-indent: 0px;"><span>Microsoft Authenticator app</span></li><li value="3" style="margin-right: 0px;text-indent: 0px;"><span>FIDO2 security keys</span></li></ul><p style="margin-right: 0px;"><span>It's recommended to follow a 4-stage plan to become password-less:</span></p><ul style="list-style-type:disc"><li value="1" style="margin-right: 0px;text-indent: 0px;"><span>Develop password replacement offering</span></li><li value="2" style="margin-right: 0px;text-indent: 0px;"><span>Reduce user-visible password surface area</span></li><li value="3" style="margin-right: 0px;text-indent: 0px;"><span>Transition into password-less deployment</span></li><li value="4" style="margin-right: 0px;text-indent: 0px;"><span>Eliminate passwords from the identity directory</span></li></ul>
<p><b>Suggested Actions</b></p>
<p><span>Develop a password-less strategy that requires MFA for all users without significantly impacting operations."nbsp; </span></p>
<p><b>Learn More</b></p>
<p><a href="https://docs.microsoft.com/en-us/windows/security/identity-protection/hello-for-business/passwordless-strategy" target="_blank"><span>Password-less strategy</span></a><span /></p>

**index:** 1.0
**text_combine:**
Enforce password-less or Multi-factor Authentication (MFA) - <a href="https://docs.microsoft.com/azure/architecture/framework/security/design-identity-authentication#use-passwordless-authentication">Enforce password-less or Multi-factor Authentication (MFA)</a>
<p><b>Why Consider This?</b></p>
Attack methods have evolved to the point where passwords alone cannot reliably protect an account. Modern authentication solutions including password-less and multi-factor authentication increase security posture through strong authentication.
<p><b>Context</b></p>
<p><span>With modern authentication and security features in Azure AD, that basic password should be supplemented or replaced with more secure authentication methods. Each organization has different needs when it comes to authentication. Microsoft offers the following three passwordless authentication options that integrate with Azure Active Directory (Azure AD):</span></p><ul style="list-style-type:disc"><li value="1" style="text-indent: 0px;"><span>Windows Hello for Business</span></li><li value="2" style="margin-right: 0px;text-indent: 0px;"><span>Microsoft Authenticator app</span></li><li value="3" style="margin-right: 0px;text-indent: 0px;"><span>FIDO2 security keys</span></li></ul><p style="margin-right: 0px;"><span>It's recommended to follow a 4-stage plan to become password-less:</span></p><ul style="list-style-type:disc"><li value="1" style="margin-right: 0px;text-indent: 0px;"><span>Develop password replacement offering</span></li><li value="2" style="margin-right: 0px;text-indent: 0px;"><span>Reduce user-visible password surface area</span></li><li value="3" style="margin-right: 0px;text-indent: 0px;"><span>Transition into password-less deployment</span></li><li value="4" style="margin-right: 0px;text-indent: 0px;"><span>Eliminate passwords from the identity directory</span></li></ul>
<p><b>Suggested Actions</b></p>
<p><span>Develop a password-less strategy that requires MFA for all users without significantly impacting operations."nbsp; </span></p>
<p><b>Learn More</b></p>
<p><a href="https://docs.microsoft.com/en-us/windows/security/identity-protection/hello-for-business/passwordless-strategy" target="_blank"><span>Password-less strategy</span></a><span /></p>

**label:** process
**text:**
enforce password less or multi factor authentication mfa why consider this attack methods have evolved to the point where passwords alone cannot reliably protect an account modern authentication solutions including password less and multi factor authentication increase security posture through strong authentication context with modern authentication and security features in azure ad that basic password should be supplemented or replaced with more secure authentication methods each organization has different needs when it comes to authentication microsoft offers the following three passwordless authentication options that integrate with azure active directory azure ad windows hello for business microsoft authenticator app security keys it s recommended to follow a stage plan to become password less develop password replacement offering reduce user visible password surface area transition into password less deployment eliminate passwords from the identity directory suggested actions develop a password less strategy that requires mfa for all users without significantly impacting operations nbsp learn more password less strategy

**binary_label:** 1

**Unnamed: 0:** 84,350 · **id:** 15,720,887,523 · **type:** IssuesEvent · **created_at:** 2021-03-29 01:31:24
**repo:** ghuangsnl/spring-boot · **repo_url:** https://api.github.com/repos/ghuangsnl/spring-boot · **action:** opened
**title:** CVE-2021-21351 (High) detected in xstream-1.4.11.1.jar
**labels:** security vulnerability
**body:**
## CVE-2021-21351 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>xstream-1.4.11.1.jar</b></p></summary>
<p>XStream is a serialization library from Java objects to XML and back.</p>
<p>Library home page: <a href="http://x-stream.github.io">http://x-stream.github.io</a></p>
<p>Path to vulnerable library: spring-boot/spring-boot-project/spring-boot-dependencies/build/local-m2-repository/com/thoughtworks/xstream/xstream/1.4.11.1/xstream-1.4.11.1.jar</p>
<p>
Dependency Hierarchy:
- :x: **xstream-1.4.11.1.jar** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
XStream is a Java library to serialize objects to XML and back again. In XStream before version 1.4.16, there is a vulnerability may allow a remote attacker to load and execute arbitrary code from a remote host only by manipulating the processed input stream. No user is affected, who followed the recommendation to setup XStream's security framework with a whitelist limited to the minimal required types. If you rely on XStream's default blacklist of the Security Framework, you will have to use at least version 1.4.16.
<p>Publish Date: 2021-03-23
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-21351>CVE-2021-21351</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: High
- User Interaction: None
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/x-stream/xstream/security/advisories/GHSA-hrcp-8f3q-4w2c">https://github.com/x-stream/xstream/security/advisories/GHSA-hrcp-8f3q-4w2c</a></p>
<p>Release Date: 2021-03-23</p>
<p>Fix Resolution: com.thoughtworks.xstream:xstream:1.4.16</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)

**index:** True
**text_combine:**
CVE-2021-21351 (High) detected in xstream-1.4.11.1.jar - ## CVE-2021-21351 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>xstream-1.4.11.1.jar</b></p></summary>
<p>XStream is a serialization library from Java objects to XML and back.</p>
<p>Library home page: <a href="http://x-stream.github.io">http://x-stream.github.io</a></p>
<p>Path to vulnerable library: spring-boot/spring-boot-project/spring-boot-dependencies/build/local-m2-repository/com/thoughtworks/xstream/xstream/1.4.11.1/xstream-1.4.11.1.jar</p>
<p>
Dependency Hierarchy:
- :x: **xstream-1.4.11.1.jar** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
XStream is a Java library to serialize objects to XML and back again. In XStream before version 1.4.16, there is a vulnerability may allow a remote attacker to load and execute arbitrary code from a remote host only by manipulating the processed input stream. No user is affected, who followed the recommendation to setup XStream's security framework with a whitelist limited to the minimal required types. If you rely on XStream's default blacklist of the Security Framework, you will have to use at least version 1.4.16.
<p>Publish Date: 2021-03-23
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-21351>CVE-2021-21351</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: High
- User Interaction: None
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/x-stream/xstream/security/advisories/GHSA-hrcp-8f3q-4w2c">https://github.com/x-stream/xstream/security/advisories/GHSA-hrcp-8f3q-4w2c</a></p>
<p>Release Date: 2021-03-23</p>
<p>Fix Resolution: com.thoughtworks.xstream:xstream:1.4.16</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)

**label:** non_process
**text:**
cve high detected in xstream jar cve high severity vulnerability vulnerable library xstream jar xstream is a serialization library from java objects to xml and back library home page a href path to vulnerable library spring boot spring boot project spring boot dependencies build local repository com thoughtworks xstream xstream xstream jar dependency hierarchy x xstream jar vulnerable library vulnerability details xstream is a java library to serialize objects to xml and back again in xstream before version there is a vulnerability may allow a remote attacker to load and execute arbitrary code from a remote host only by manipulating the processed input stream no user is affected who followed the recommendation to setup xstream s security framework with a whitelist limited to the minimal required types if you rely on xstream s default blacklist of the security framework you will have to use at least version publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required high user interaction none scope changed impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution com thoughtworks xstream xstream step up your open source security game with whitesource

**binary_label:** 0

**Unnamed: 0:** 9,286 · **id:** 12,305,047,324 · **type:** IssuesEvent · **created_at:** 2020-05-11 21:38:40
**repo:** MicrosoftDocs/azure-devops-docs · **repo_url:** https://api.github.com/repos/MicrosoftDocs/azure-devops-docs · **action:** closed
**title:** Override variables
**labels:** Pri1 devops-cicd-process/tech devops/prod product-question
**body:**
I'm using two variable groups in my azure devOps pipeline, configured with yml file: Common-group and Dev-group. I have same variable in each of them. I want to override variable from Common-group with the one from Dev-group. I am able to do it within the deployment job, but not with the regular job. What may be the explanation to such difference?
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 5aeeaace-1c5b-a51b-e41f-f25b806155b8
* Version Independent ID: fd7ff690-b2e4-41c7-a342-e528b911c6e1
* Content: [Deployment jobs - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/deployment-jobs?view=azure-devops)
* Content Source: [docs/pipelines/process/deployment-jobs.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/master/docs/pipelines/process/deployment-jobs.md)
* Product: **devops**
* Technology: **devops-cicd-process**
* GitHub Login: @juliakm
* Microsoft Alias: **jukullam**

**index:** 1.0
**text_combine:**
Override variables - I'm using two variable groups in my azure devOps pipeline, configured with yml file: Common-group and Dev-group. I have same variable in each of them. I want to override variable from Common-group with the one from Dev-group. I am able to do it within the deployment job, but not with the regular job. What may be the explanation to such difference?
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 5aeeaace-1c5b-a51b-e41f-f25b806155b8
* Version Independent ID: fd7ff690-b2e4-41c7-a342-e528b911c6e1
* Content: [Deployment jobs - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/deployment-jobs?view=azure-devops)
* Content Source: [docs/pipelines/process/deployment-jobs.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/master/docs/pipelines/process/deployment-jobs.md)
* Product: **devops**
* Technology: **devops-cicd-process**
* GitHub Login: @juliakm
* Microsoft Alias: **jukullam**

**label:** process
**text:**
override variables i m using two variable groups in my azure devops pipeline configured with yml file common group and dev group i have same variable in each of them i want to override variable from common group with the one from dev group i am able to do it within the deployment job but not with the regular job what may be the explanation to such difference document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source product devops technology devops cicd process github login juliakm microsoft alias jukullam

**binary_label:** 1

**Unnamed: 0:** 23,960 · **id:** 3,871,226,866 · **type:** IssuesEvent · **created_at:** 2016-04-11 08:56:53
**repo:** selinasee/JCHR4IV6WIWXKAUKECK6744D · **repo_url:** https://api.github.com/repos/selinasee/JCHR4IV6WIWXKAUKECK6744D · **action:** closed
**title:** GKzAzeOHMx8HGPFOi3cqP8gBmbeNt7Z858zJbccIVcOHaV6D8P9SIoQ94WBUd7e38AeOmiSDPn8HX2WrYaZ5FyForw7bI1eiLPKUAadRG+ry+h3tkZWVEyRivFem9xFOvQ0A8CNhzKK+srp9qycFphHfW1tZhGEYVbcPwSloqxA=
**labels:** design
**body:**
yzd3rXPoBoe3hqVHhe4ab2CggjqGGzXlSA8/neWRXYjudNb84F227i6j6KxhECvH0ACIjHeZVLIsxZN2jM0Ys43U9arYpwgXKIKT453fjyp//STBZDYBHs0XCS0/+7ZM/xKsN/oKbetCyILjBuUv5ZatRAm0F+jisOeRLUNkjSDPNTCRTqSDLOaR8vbtd2UeWJaQu191yrphlKyASfWVEuh4Ka7k9DQz1COVrLoRtRXx+JeRvG05Kmh8xYeQ00qpBvD+LJI7c3BsqhT4nFneKXCRj9D78g7OYOye0LRXaJ8yNtx+7Wzha8DWv2KOWWqPPSwedRLqdb+L+z8NPOvpHyIfOavzqXEUMi45lM58FCmBcRbYwN5eRZ9zfVbxYYvdTv5R4VoIpmbZW+61vTbpWf7vkmIdIcCetV4YLdks6mU6Vgb2T7ypbgbRgwMC8eI0RRtlNh/LkRPcgGMaRRmawZySn9IpIa+/EDomp9b6Cw1SF6RDLoBXdGlUDRGi/+wX+0QDwRX/RrhiIwX19NTduriC45rn3MG6WPPlLYPSeJFSF6RDLoBXdGlUDRGi/+wXGhg3ApvU2jsUh4isMZjuHUS1iKkXqlpKBp/S27ff6oG1Og6cxNLRABpyfxMbZQt5zCWES1RAQkacnGSZaZfSRylZATmpG/G+m+Vs+r9Eaw2KHMO06eTvPsEtdXe5wi90VaURAiOmEs2skQ7k+ZviGTMW/8GX97zButbxNzNlp4ZN+JriXlMLWVQ3Lgq7QIhYKUDwP0St5jHRYt7UmofcN2cVB6gm5yWotcSzzf08bMxkc2CNAvHIDDCfqmgMKoQGjQRxxrGUD3UpnnKV9wzwmF845tvSqp4QHto5ca/DUt1rmG17+oMkPvf37loWMjXaZRN9US4tKWiNlf+Jvl9C+yhMehfeSIxq+joXVsl9p4k=

**index:** 1.0
**text_combine:**
GKzAzeOHMx8HGPFOi3cqP8gBmbeNt7Z858zJbccIVcOHaV6D8P9SIoQ94WBUd7e38AeOmiSDPn8HX2WrYaZ5FyForw7bI1eiLPKUAadRG+ry+h3tkZWVEyRivFem9xFOvQ0A8CNhzKK+srp9qycFphHfW1tZhGEYVbcPwSloqxA= - yzd3rXPoBoe3hqVHhe4ab2CggjqGGzXlSA8/neWRXYjudNb84F227i6j6KxhECvH0ACIjHeZVLIsxZN2jM0Ys43U9arYpwgXKIKT453fjyp//STBZDYBHs0XCS0/+7ZM/xKsN/oKbetCyILjBuUv5ZatRAm0F+jisOeRLUNkjSDPNTCRTqSDLOaR8vbtd2UeWJaQu191yrphlKyASfWVEuh4Ka7k9DQz1COVrLoRtRXx+JeRvG05Kmh8xYeQ00qpBvD+LJI7c3BsqhT4nFneKXCRj9D78g7OYOye0LRXaJ8yNtx+7Wzha8DWv2KOWWqPPSwedRLqdb+L+z8NPOvpHyIfOavzqXEUMi45lM58FCmBcRbYwN5eRZ9zfVbxYYvdTv5R4VoIpmbZW+61vTbpWf7vkmIdIcCetV4YLdks6mU6Vgb2T7ypbgbRgwMC8eI0RRtlNh/LkRPcgGMaRRmawZySn9IpIa+/EDomp9b6Cw1SF6RDLoBXdGlUDRGi/+wX+0QDwRX/RrhiIwX19NTduriC45rn3MG6WPPlLYPSeJFSF6RDLoBXdGlUDRGi/+wXGhg3ApvU2jsUh4isMZjuHUS1iKkXqlpKBp/S27ff6oG1Og6cxNLRABpyfxMbZQt5zCWES1RAQkacnGSZaZfSRylZATmpG/G+m+Vs+r9Eaw2KHMO06eTvPsEtdXe5wi90VaURAiOmEs2skQ7k+ZviGTMW/8GX97zButbxNzNlp4ZN+JriXlMLWVQ3Lgq7QIhYKUDwP0St5jHRYt7UmofcN2cVB6gm5yWotcSzzf08bMxkc2CNAvHIDDCfqmgMKoQGjQRxxrGUD3UpnnKV9wzwmF845tvSqp4QHto5ca/DUt1rmG17+oMkPvf37loWMjXaZRN9US4tKWiNlf+Jvl9C+yhMehfeSIxq+joXVsl9p4k=

**label:** non_process
**text:**
ry xksn l wx g m vs zvigtmw yhmehfesixq

**binary_label:** 0

**Unnamed: 0:** 17,599 · **id:** 5,446,523,338 · **type:** IssuesEvent · **created_at:** 2017-03-07 10:51:29
**repo:** tijhuisingenieurs/TijhuisArcGisTools · **repo_url:** https://api.github.com/repos/tijhuisingenieurs/TijhuisArcGisTools · **action:** opened
**title:** Ophalen en wegschrijven van T-geometrien onderbrengen bij geometrien
**labels:** Code improvement
**body:**
In elke tool staat nu veel dubbele code voor het omzetten van de arcGIS shapes naar collections en visa versa

**index:** 1.0
**text_combine:**
Ophalen en wegschrijven van T-geometrien onderbrengen bij geometrien - In elke tool staat nu veel dubbele code voor het omzetten van de arcGIS shapes naar collections en visa versa

**label:** non_process
**text:**
ophalen en wegschrijven van t geometrien onderbrengen bij geometrien in elke tool staat nu veel dubbele code voor het omzetten van de arcgis shapes naar collections en visa versa

**binary_label:** 0

**Unnamed: 0:** 430,465 · **id:** 30,185,563,630 · **type:** IssuesEvent · **created_at:** 2023-07-04 11:52:10
**repo:** FishNetMigration/creesysr · **repo_url:** https://api.github.com/repos/FishNetMigration/creesysr · **action:** opened
**title:** build vignette
**labels:** documentation help wanted
**body:**
Create a general vignette to identify the required steps in the analysis. Eventually this will serve as a general manual and/or long form documentation of the package.
As this is intended to be a working document and serve generally as a road map toward development, I encourage rough notes and placeholders to be included as we go. Any documentation is better than no documentation.

**index:** 1.0
**text_combine:**
build vignette - Create a general vignette to identify the required steps in the analysis. Eventually this will serve as a general manual and/or long form documentation of the package.
As this is intended to be a working document and serve generally as a road map toward development, I encourage rough notes and placeholders to be included as we go. Any documentation is better than no documentation.

**label:** non_process
**text:**
build vignette create a general vignette to identify the required steps in the analysis eventually this will serve as a general manual and or long form documentation of the package as this is intended to be a working document and serve generally as a road map toward development i encourage rough notes and placeholders to be included as we go any documentation is better than no documentation

**binary_label:** 0

**Unnamed: 0:** 523,847 · **id:** 15,190,879,778 · **type:** IssuesEvent · **created_at:** 2021-02-15 18:47:33
**repo:** GoogleCloudPlatform/golang-samples · **repo_url:** https://api.github.com/repos/GoogleCloudPlatform/golang-samples · **action:** opened
**title:** internal/cloudrunci: TestGcloud failed
**labels:** flakybot: issue priority: p1 type: bug
**body:**
This test failed!
To configure my behavior, see [the Flaky Bot documentation](https://github.com/googleapis/repo-automation-bots/tree/master/packages/flakybot).
If I'm commenting on this issue too often, add the `flakybot: quiet` label and
I will stop commenting.
---
commit: b04230748b537c11a84101d5dd20e507ed377524
buildURL: [Build Status](https://source.cloud.google.com/results/invocations/9db1634e-d305-4340-b316-0cfdc8e3e756), [Sponge](http://sponge2/9db1634e-d305-4340-b316-0cfdc8e3e756)
status: failed
<details><summary>Test output</summary><br><pre>2021/02/15 18:22:22 Attempt #1: Running: operation [id-token]...
2021/02/15 18:22:22 Attempt #1: Executing: operation [id-token]: /tmp/google-cloud-sdk/bin/gcloud: --quiet auth print-identity-token
2021/02/15 18:22:22 Running: success...
2021/02/15 18:22:22 Executing: success: /tmp/google-cloud-sdk/bin/gcloud: help
success: Error Output
###
python2: can't open file '/tmp/google-cloud-sdk/lib/gcloud.py': [Errno 2] No such file or directory
###
gcloud_test.go:43: gcloud: success: "exit status 2"
gcloud_test.go:48: gcloud: got (python2: can't open file '/tmp/google-cloud-sdk/lib/gcloud.py': [Errno 2] No such file or directory
), want (gcloud - manage Google Cloud Platform resources and developer workflow)</pre></details>

**index:** 1.0
**text_combine:**
internal/cloudrunci: TestGcloud failed - This test failed!
To configure my behavior, see [the Flaky Bot documentation](https://github.com/googleapis/repo-automation-bots/tree/master/packages/flakybot).
If I'm commenting on this issue too often, add the `flakybot: quiet` label and
I will stop commenting.
---
commit: b04230748b537c11a84101d5dd20e507ed377524
buildURL: [Build Status](https://source.cloud.google.com/results/invocations/9db1634e-d305-4340-b316-0cfdc8e3e756), [Sponge](http://sponge2/9db1634e-d305-4340-b316-0cfdc8e3e756)
status: failed
<details><summary>Test output</summary><br><pre>2021/02/15 18:22:22 Attempt #1: Running: operation [id-token]...
2021/02/15 18:22:22 Attempt #1: Executing: operation [id-token]: /tmp/google-cloud-sdk/bin/gcloud: --quiet auth print-identity-token
2021/02/15 18:22:22 Running: success...
2021/02/15 18:22:22 Executing: success: /tmp/google-cloud-sdk/bin/gcloud: help
success: Error Output
###
python2: can't open file '/tmp/google-cloud-sdk/lib/gcloud.py': [Errno 2] No such file or directory
###
gcloud_test.go:43: gcloud: success: "exit status 2"
gcloud_test.go:48: gcloud: got (python2: can't open file '/tmp/google-cloud-sdk/lib/gcloud.py': [Errno 2] No such file or directory
), want (gcloud - manage Google Cloud Platform resources and developer workflow)</pre></details>

**label:** non_process
**text:**
internal cloudrunci testgcloud failed this test failed to configure my behavior see if i m commenting on this issue too often add the flakybot quiet label and i will stop commenting commit buildurl status failed test output attempt running operation attempt executing operation tmp google cloud sdk bin gcloud quiet auth print identity token running success executing success tmp google cloud sdk bin gcloud help success error output can t open file tmp google cloud sdk lib gcloud py no such file or directory gcloud test go gcloud success exit status gcloud test go gcloud got can t open file tmp google cloud sdk lib gcloud py no such file or directory want gcloud manage google cloud platform resources and developer workflow

**binary_label:** 0

**Unnamed: 0:** 67,204 · **id:** 27,750,707,611 · **type:** IssuesEvent · **created_at:** 2023-03-15 20:29:22
**repo:** hashicorp/terraform-provider-aws · **repo_url:** https://api.github.com/repos/hashicorp/terraform-provider-aws · **action:** closed
**title:** [Enhancement]: AWS Elasticache User IAM Auth
**labels:** enhancement service/elasticache
**body:**
### Description
AWS have [added functionality to Elasticache](https://aws.amazon.com/about-aws/whats-new/2022/11/amazon-elasticache-supports-iam-authentication-redis-clusters/) users allowing IAM authentication with a token that does not require setting up passwords.
The current resource only allows for the optional `passwords` or `no_password_required` arguments which do not allow for an authentication type to be set.
A new argument that allows for the selection of an authentication type with the 3 currently allowed options (password, no-password-required, iam) would allow for the use of this new functionality.
### Affected Resource(s) and/or Data Source(s)
* aws_elasticache_user
### Potential Terraform Configuration
```terraform
resource "aws_elasticache_user" "test" {
user_id = "testUserId"
user_name = "testUserName"
access_string = "on ~app::* -@all +@read +@hash +@bitmap +@geo -setbit -bitfield -hset -hsetnx -hmset -hincrby -hincrbyfloat -hdel -bitop -geoadd -georadius -georadiusbymember"
engine = "REDIS"
auth_type = "iam"
}
```
### References
[AWS announcement](https://aws.amazon.com/about-aws/whats-new/2022/11/amazon-elasticache-supports-iam-authentication-redis-clusters/)
[AWS IAM auth documentation](https://docs.aws.amazon.com/AmazonElastiCache/latest/red-ug/auth-iam.html)
### Would you like to implement a fix?
_No response_

**index:** 1.0
**text_combine:**
[Enhancement]: AWS Elasticache User IAM Auth - ### Description
AWS have [added functionality to Elasticache](https://aws.amazon.com/about-aws/whats-new/2022/11/amazon-elasticache-supports-iam-authentication-redis-clusters/) users allowing IAM authentication with a token that does not require setting up passwords.
The current resource only allows for the optional `passwords` or `no_password_required` arguments which do not allow for an authentication type to be set.
A new argument that allows for the selection of an authentication type with the 3 currently allowed options (password, no-password-required, iam) would allow for the use of this new functionality.
### Affected Resource(s) and/or Data Source(s)
* aws_elasticache_user
### Potential Terraform Configuration
```terraform
resource "aws_elasticache_user" "test" {
user_id = "testUserId"
user_name = "testUserName"
access_string = "on ~app::* -@all +@read +@hash +@bitmap +@geo -setbit -bitfield -hset -hsetnx -hmset -hincrby -hincrbyfloat -hdel -bitop -geoadd -georadius -georadiusbymember"
engine = "REDIS"
auth_type = "iam"
}
```
### References
[AWS announcement](https://aws.amazon.com/about-aws/whats-new/2022/11/amazon-elasticache-supports-iam-authentication-redis-clusters/)
[AWS IAM auth documentation](https://docs.aws.amazon.com/AmazonElastiCache/latest/red-ug/auth-iam.html)
### Would you like to implement a fix?
_No response_

**label:** non_process
**text:**
aws elasticache user iam auth description aws have users allowing iam authentication with a token that does not require setting up passwords the current resource only allows for the optional passwords or no password required arguments which do not allow for an authentication type to be set a new argument that allows for the selection of an authentication type with the currently allowed options password no password required iam would allow for the use of this new functionality affected resource s and or data source s aws elasticache user potential terraform configuration terraform resource aws elasticache user test user id testuserid user name testusername access string on app all read hash bitmap geo setbit bitfield hset hsetnx hmset hincrby hincrbyfloat hdel bitop geoadd georadius georadiusbymember engine redis auth type iam references would you like to implement a fix no response

**binary_label:** 0

**Unnamed: 0:** 9,744 · **id:** 12,733,815,407 · **type:** IssuesEvent · **created_at:** 2020-06-25 12:57:29
**repo:** prisma/prisma-client-js · **repo_url:** https://api.github.com/repos/prisma/prisma-client-js · **action:** opened
**title:** Escaping parameters of `queryRaw` and `executeRaw` to prevent SQL injections
**labels:** kind/feature process/candidate team/typescript
**body:**
## Problem
As of today, parametrized queries executed through `queryRaw` or `executeRaw` do not escape parameters to prevent from SQL injections. This can lead users to forget about this and open serious vulnerabilities in their software.
This is problematic as:
* Not mentioned in Prisma's documentation which can let people assume it's done by default
* Not the standard practice in most SQL clients (e.g. [Node-Postgres](https://node-postgres.com/features/queries#Parameterized%20query), [Knex.js](http://knexjs.org/#Raw-Bindings), [Sequelize](https://sequelize.org/master/manual/raw-queries.html#replacements)...
* Done whenever using the [tagged template call form](https://www.prisma.io/docs/reference/tools-and-interfaces/prisma-client/raw-database-access#tagged-templates)
## Suggested solution
Ensure `queryRaw` and `executeRaw` do escape parameters before sending those to the engine.
Update documentation to reflect how Prisma protects from SQL injections
## Alternatives
I don't think there's any.
## Additional context
Mentioned #727 that there could be confusion between escaped and unescaped ways to make queries.

**index:** 1.0
**text_combine:**
Escaping parameters of `queryRaw` and `executeRaw` to prevent SQL injections - ## Problem
As of today, parametrized queries executed through `queryRaw` or `executeRaw` do not escape parameters to prevent from SQL injections. This can lead users to forget about this and open serious vulnerabilities in their software.
This is problematic as:
* Not mentioned in Prisma's documentation which can let people assume it's done by default
* Not the standard practice in most SQL clients (e.g. [Node-Postgres](https://node-postgres.com/features/queries#Parameterized%20query), [Knex.js](http://knexjs.org/#Raw-Bindings), [Sequelize](https://sequelize.org/master/manual/raw-queries.html#replacements)...
* Done whenever using the [tagged template call form](https://www.prisma.io/docs/reference/tools-and-interfaces/prisma-client/raw-database-access#tagged-templates)
## Suggested solution
Ensure `queryRaw` and `executeRaw` do escape parameters before sending those to the engine.
Update documentation to reflect how Prisma protects from SQL injections
## Alternatives
I don't think there's any.
## Additional context
Mentioned in #727 that there could be confusion between escaped and unescaped ways to make queries.
|
process
|
escaping parameters of queryraw and executeraw to prevent sql injections problem as of today parametrized queries executed through queryraw or executeraw do not escape parameters to prevent from sql injections this can lead users to forget about this and open serious vulnerabilities in their software this is problematic as not mentioned in prisma s documentation which can let people assume it s done by default not the standard practice in most sql clients e g done whenever using the suggested solution ensure queryraw and executeraw do escape parameters before sending those to the engine update documentation to reflect how prisma protects from sql injections alternatives i don t think there s any additional context mentioned that there could be confusion between escaped and unescaped ways to make queries
| 1
|
73,443
| 14,074,926,158
|
IssuesEvent
|
2020-11-04 08:14:11
|
HydrolienF/Formiko
|
https://api.github.com/repos/HydrolienF/Formiko
|
opened
|
position of Object on a Case
|
code reorganization graphics new content
|
Arrange things so that the positioning of elements on a tile adapts to their sizes, their number, etc.
A single element should be centered. (Or positioned according to its direction.)
Several elements should avoid hiding each other as much as possible:
- The small ones can be placed on top (the simplest would be to sort the GCreature list before displaying it).
Logically, if there is one big element and lots of small ones, the big one should be in a corner and the small ones on the two other free sides.
We can try to do something with the centers of the different hitboxes.
The centers of the different hitboxes should be:
- as close as possible to the center without the hitboxes touching;
- otherwise, as far away from each other as possible.
They should also be at least 10% away from the border of the tile. (That way, big insects will still be centered on the tile they are on.)
With 2 elements it is easy to know whether the hitboxes will have to touch: assume they are as far apart as possible and compute whether they touch by taking the coordinates of the point of each that is closest to the other.
|
1.0
|
position of Object on a Case - Arrange things so that the positioning of elements on a tile adapts to their sizes, their number, etc.
A single element should be centered. (Or positioned according to its direction.)
Several elements should avoid hiding each other as much as possible:
- The small ones can be placed on top (the simplest would be to sort the GCreature list before displaying it).
Logically, if there is one big element and lots of small ones, the big one should be in a corner and the small ones on the two other free sides.
We can try to do something with the centers of the different hitboxes.
The centers of the different hitboxes should be:
- as close as possible to the center without the hitboxes touching;
- otherwise, as far away from each other as possible.
They should also be at least 10% away from the border of the tile. (That way, big insects will still be centered on the tile they are on.)
With 2 elements it is easy to know whether the hitboxes will have to touch: assume they are as far apart as possible and compute whether they touch by taking the coordinates of the point of each that is closest to the other.
|
non_process
|
position of object on a case s arangé pour que le positionement des élément sur une case sous adaptable par rapport a leurs tailles leur nombres etc seul éléments doit etre centré ou doit etre positionné en rapport avec sa dirrection plusieurs élément doivent au maximum évité de ce cacher les petits peuvent être au dessus le plus simple serait de trié le gcreature avant de l afficher logiquement si on a un gros éléments et plein de petit il faudrait que le gros soit dans un coin et les petits sur les autres coté libre on peu essayer de faire qqchose avec les centres des différentes hitbox les centres des différente hitbox doivent etre le plus proche possible du centre sans que les hitbox ce touche sinon le plus loin possible les un des autres ils doivent aussi être a minimum de la bordure de la case comme ca les gros insectes seront encore centré sur la case ou ils sont on peut facilement savoir si les hitbox vont devoir ce touché avec éléments on les suppose le plus loin possible est on calcul si il ce touche ou pas en récupérant les coordoné de leur point le plus proche de l autre
| 0
|
102,200
| 31,858,057,439
|
IssuesEvent
|
2023-09-15 08:54:58
|
moby/buildkit
|
https://api.github.com/repos/moby/buildkit
|
closed
|
Consider migrating from Alpine to Debian, Ubuntu, or Wolfi for reproducible builds
|
area/reproducible-builds
|
Reproducible builds is hard with Alpine, as Alpine does not keep old packages: https://gitlab.alpinelinux.org/alpine/abuild/-/issues/9996
Debian, Ubuntu, and Wolfi are more suitable for reproducible builds, as they keep old packages:
- https://snapshot-cloudflare.debian.org/
- http://snapshot.ubuntu.com/ubuntu/20230904T000000Z/dists/jammy/Release
- https://www.chainguard.dev/unchained/reproducing-chainguards-reproducible-image-builds
Wolfi currently does not support armv7, s390x, ppc64le, and riscv64 though:
- https://github.com/wolfi-dev/os/issues/5440
|
1.0
|
Consider migrating from Alpine to Debian, Ubuntu, or Wolfi for reproducible builds - Reproducible builds is hard with Alpine, as Alpine does not keep old packages: https://gitlab.alpinelinux.org/alpine/abuild/-/issues/9996
Debian, Ubuntu, and Wolfi are more suitable for reproducible builds, as they keep old packages:
- https://snapshot-cloudflare.debian.org/
- http://snapshot.ubuntu.com/ubuntu/20230904T000000Z/dists/jammy/Release
- https://www.chainguard.dev/unchained/reproducing-chainguards-reproducible-image-builds
Wolfi currently does not support armv7, s390x, ppc64le, and riscv64 though:
- https://github.com/wolfi-dev/os/issues/5440
|
non_process
|
consider migrating from alpine to debian ubuntu or wolfi for reproducible builds reproducible builds is hard with alpine as alpine does not keep old packages debian ubuntu and wolfi are more suitable for reproducible builds as they keep old packages wolfi currently does not support and though
| 0
|
1,185
| 3,687,738,973
|
IssuesEvent
|
2016-02-25 09:48:54
|
cliffparnitzky/TinyMceFontAwesome
|
https://api.github.com/repos/cliffparnitzky/TinyMceFontAwesome
|
closed
|
tl_layout conflict
|
Improvement ⚙ - Processed
|
The DCA config of this module and [contao-supertheme](https://github.com/comolo/contao-supertheme) are somehow interfering!
`// Add external_js + external_scss
$GLOBALS['TL_DCA']['tl_layout']['palettes']['default'] = str_replace(
array(',script;', 'stylesheet,external;', 'stylesheet,external,'),
array(',script,external_js;', 'stylesheet,external,external_scss;', 'stylesheet,external,external_scss,'),
$GLOBALS['TL_DCA']['tl_layout']['palettes']['default']
);`
interferes with
`$GLOBALS['TL_DCA']['tl_layout']['palettes']['default'] = str_replace('external', 'external,tinyMceFontAwesome', $GLOBALS['TL_DCA']['tl_layout']['palettes']['default']);`
Thus, only the fontawesome-input is visible and the external_js field is hidden/overwritten.
Any proposals for a local dcaconfig change for this issue?
contao 3.5.6
"comolo/contao-supertheme": ">=2.2.5.0,<2.3-dev",
"cliffparnitzky/tiny-mce-font-awesome": ">=2.0.0.0,<2.1-dev",
|
1.0
|
tl_layout conflict - The DCA config of this module and [contao-supertheme](https://github.com/comolo/contao-supertheme) are somehow interfering!
`// Add external_js + external_scss
$GLOBALS['TL_DCA']['tl_layout']['palettes']['default'] = str_replace(
array(',script;', 'stylesheet,external;', 'stylesheet,external,'),
array(',script,external_js;', 'stylesheet,external,external_scss;', 'stylesheet,external,external_scss,'),
$GLOBALS['TL_DCA']['tl_layout']['palettes']['default']
);`
interferes with
`$GLOBALS['TL_DCA']['tl_layout']['palettes']['default'] = str_replace('external', 'external,tinyMceFontAwesome', $GLOBALS['TL_DCA']['tl_layout']['palettes']['default']);`
Thus, only the fontawesome-input is visible and the external_js field is hidden/overwritten.
Any proposals for a local dcaconfig change for this issue?
contao 3.5.6
"comolo/contao-supertheme": ">=2.2.5.0,<2.3-dev",
"cliffparnitzky/tiny-mce-font-awesome": ">=2.0.0.0,<2.1-dev",
|
process
|
tl layout conflict the dca config of this module and are somehow interfering add external js external scss globals str replace array script stylesheet external stylesheet external array script external js stylesheet external external scss stylesheet external external scss globals interferes with globals str replace external external tinymcefontawesome globals thus only the fontawesome input is visible and the external js field is hidden overwritten any proposals for a local dcaconfig change for this issue contao comolo contao supertheme dev cliffparnitzky tiny mce font awesome dev
| 1
|
16,768
| 21,943,623,456
|
IssuesEvent
|
2022-05-23 20:58:55
|
ORNL-AMO/AMO-Tools-Desktop
|
https://api.github.com/repos/ORNL-AMO/AMO-Tools-Desktop
|
opened
|
Weather Data Help Text
|
Process Cooling
|
Add to help text:
Some text about how to get to weather data and how to find what file to download:
https://energyefficiency.ornl.gov/tools-training/
|
1.0
|
Weather Data Help Text - Add to help text:
Some text about how to get to weather data and how to find what file to download:
https://energyefficiency.ornl.gov/tools-training/
|
process
|
weather data help text add to help text some text about how to get to weather data and how to find what file to download
| 1
|
12,020
| 14,738,494,090
|
IssuesEvent
|
2021-01-07 04:55:47
|
kdjstudios/SABillingGitlab
|
https://api.github.com/repos/kdjstudios/SABillingGitlab
|
closed
|
Client complaining they cannot apply payment in SAB portal
|
anc-ops anc-process anp-1 ant-support
|
In GitLab by @kdjstudios on Jun 6, 2018, 08:56
**Submitted by:** "djoseph" <denise.joseph@answernet.com>
**Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/2018-06-06-51661/conversation
**Server:** Internal
**Client/Site:** Toronto
**Account:** Multi
**Issue:**
Short Description of Problem: Client complaining they cannot
apply payment in SAB portal
Detail Description of problem:
Apotex research and Century 21 President Realty account
administrators could not apply payment via SAB portal. Resent
portal link for both clients again and only Century 21 President
realty was able to apply their payment. Apotex account
administrator still could not apply payment and TMG Builders
accountant called advising he too could not apply payment.
Resent the link couple times to him and he still could not apply
payment. Can you please advise what else we can do? or can their
link be reset and resent so they can register again? Please
look into this for us.
|
1.0
|
Client complaining they cannot apply payment in SAB portal - In GitLab by @kdjstudios on Jun 6, 2018, 08:56
**Submitted by:** "djoseph" <denise.joseph@answernet.com>
**Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/2018-06-06-51661/conversation
**Server:** Internal
**Client/Site:** Toronto
**Account:** Multi
**Issue:**
Short Description of Problem: Client complaining they cannot
apply payment in SAB portal
Detail Description of problem:
Apotex research and Century 21 President Realty account
administrators could not apply payment via SAB portal. Resent
portal link for both clients again and only Century 21 President
realty was able to apply their payment. Apotex account
administrator still could not apply payment and TMG Builders
accountant called advising he too could not apply payment.
Resent the link couple times to him and he still could not apply
payment. Can you please advise what else we can do? or can their
link be reset and resent so they can register again? Please
look into this for us.
|
process
|
client complaining they cannot apply payment in sab portal in gitlab by kdjstudios on jun submitted by djoseph helpdesk server internal client site toronoto account multi issue short description of problem client complaining they cannot apply payment in sab portal detail description of problem apotex research and century president realty account administrators could not apply payment via sab portal resent portal link for both clients again and only century president realty was able to apply their payment apotex account administrator still could not apply payment and tmg builders accountant called advising he too could not apply payment resent the link couple times to him and he still could not apply payment can you please advise what else we can do or can their link be reset and resent so they can register again please look into this for us
| 1
|
73,266
| 32,009,114,304
|
IssuesEvent
|
2023-09-21 16:42:21
|
MicrosoftDocs/azure-docs
|
https://api.github.com/repos/MicrosoftDocs/azure-docs
|
closed
|
Source vs Target slot confusion at "Swap operation steps"
|
app-service/svc triaged cxp doc-enhancement Pri1
|
Hi,
I'm trying to understand what happens during a swap operation, and I noticed something that seems to be a logical error (likely a typo) in the documentation.
---
1. Apply the following settings from the target slot (for example, the production slot) **to** all instances of **the source slot**:
(...)
Any of these cases trigger all instances in the **source slot to restart**. (...)
2. Wait for every instance in the >> **target slot** << to complete its restart. If any instance fails to restart, the swap operation reverts all changes to the source slot and stops the operation.
---
My problem is with the >> << part of the second point: isn't that supposed to be **source slot**? I don't understand why the _target slot_ (production) would be restarting; it shouldn't, as it is the production slot and we don't want downtime while swapping.
I think it should be **source slot**, and if it would be that, it would make perfect sense to me.
Could you check this, please?
I'm looking forward to the answer.
Thanks!
_PS: if I am right, I highly recommend a thorough re-review of this part of the document, because it may contain other errors, too._
---
#### Document Details
⚠ *Do not edit this section. It is required for learn.microsoft.com ➟ GitHub issue linking.*
* ID: f6e09089-1ae2-8943-5ce2-9d48f458c81f
* Version Independent ID: ba780cba-f604-b0a4-a81a-23c7d2384762
* Content: [Set up staging environments - Azure App Service](https://learn.microsoft.com/en-us/azure/app-service/deploy-staging-slots?tabs=portal)
* Content Source: [articles/app-service/deploy-staging-slots.md](https://github.com/MicrosoftDocs/azure-docs/blob/main/articles/app-service/deploy-staging-slots.md)
* Service: **app-service**
* GitHub Login: @cephalin
* Microsoft Alias: **cephalin**
|
1.0
|
Source vs Target slot confusion at "Swap operation steps" - Hi,
I'm trying to understand what happens during a swap operation, and I noticed something that seems to be a logical error (likely a typo) in the documentation.
---
1. Apply the following settings from the target slot (for example, the production slot) **to** all instances of **the source slot**:
(...)
Any of these cases trigger all instances in the **source slot to restart**. (...)
2. Wait for every instance in the >> **target slot** << to complete its restart. If any instance fails to restart, the swap operation reverts all changes to the source slot and stops the operation.
---
My problem is with the >> << part of the second point: isn't that supposed to be **source slot**? I don't understand why the _target slot_ (production) would be restarting; it shouldn't, as it is the production slot and we don't want downtime while swapping.
I think it should be **source slot**, and if it would be that, it would make perfect sense to me.
Could you check this, please?
I'm looking forward to the answer.
Thanks!
_PS: if I am right, I highly recommend a thorough re-review of this part of the document, because it may contain other errors, too._
---
#### Document Details
⚠ *Do not edit this section. It is required for learn.microsoft.com ➟ GitHub issue linking.*
* ID: f6e09089-1ae2-8943-5ce2-9d48f458c81f
* Version Independent ID: ba780cba-f604-b0a4-a81a-23c7d2384762
* Content: [Set up staging environments - Azure App Service](https://learn.microsoft.com/en-us/azure/app-service/deploy-staging-slots?tabs=portal)
* Content Source: [articles/app-service/deploy-staging-slots.md](https://github.com/MicrosoftDocs/azure-docs/blob/main/articles/app-service/deploy-staging-slots.md)
* Service: **app-service**
* GitHub Login: @cephalin
* Microsoft Alias: **cephalin**
|
non_process
|
source vs target slot confusion at swap operation steps hi i m trying to understand what happens during a swap operation and i noticed something what seems to be a logical error to me likely a typo in the documentation apply the following settings from the target slot for example the production slot to all instances of the source slot any of these cases trigger all instances in the source slot to restart wait for every instance in the target slot to complete its restart if any instance fails to restart the swap operation reverts all changes to the source slot and stops the operation my problem is with the part on the second point isn t that supposed to be source slot i don t understand why is the target slot production restarting — it shouldn t as it is the production slot and we don t want downtime while swapping i think it should be source slot and if it would be that it would make perfect sense to me could you check this please i m looking forward to the answer thanks ps if i was right i highly recommend a through re review of this part of the document because it may contain other errors too document details ⚠ do not edit this section it is required for learn microsoft com ➟ github issue linking id version independent id content content source service app service github login cephalin microsoft alias cephalin
| 0
|
78,545
| 10,065,619,964
|
IssuesEvent
|
2019-07-23 11:22:04
|
quantumblacklabs/kedro
|
https://api.github.com/repos/quantumblacklabs/kedro
|
closed
|
Add complete Kedro plugin example
|
Issue: Feature Request Priority: Medium Status: Approved Type: Documentation
|
## Description
I'm trying to create a Kedro plugin and it took me a long time to figure out that a plugin is a _separate_ package that exists outside of any Kedro project. I kept modifying 'setup.py' within my Kedro project, following the JSON example in the documentation, only to find that the `kedro to_json` command didn't do anything. In retrospect it now makes sense that kedro plugins are separate python packages, but this is not obvious to new users from the documentation.
## Context
New users would benefit from a more complete Kedro plugin example.
## Possible Implementation
Add more detail to the Kedro plugin tutorial. Also add a complete plugin example as its own repository, similar to the kedro-examples repo.
## Checklist
Include labels so that we can categorise your issue:
- [ ] Add a "Component" label to the issue
- [ ] Add a "Priority" label to the issue
|
1.0
|
Add complete Kedro plugin example - ## Description
I'm trying to create a Kedro plugin and it took me a long time to figure out that a plugin is a _separate_ package that exists outside of any Kedro project. I kept modifying 'setup.py' within my Kedro project, following the JSON example in the documentation, only to find that the `kedro to_json` command didn't do anything. In retrospect it now makes sense that kedro plugins are separate python packages, but this is not obvious to new users from the documentation.
## Context
New users would benefit from a more complete Kedro plugin example.
## Possible Implementation
Add more detail to the Kedro plugin tutorial. Also add a complete plugin example as its own repository, similar to the kedro-examples repo.
## Checklist
Include labels so that we can categorise your issue:
- [ ] Add a "Component" label to the issue
- [ ] Add a "Priority" label to the issue
|
non_process
|
add complete kedro plugin example description i m trying to create a kedro plugin and it took me a long time to figure out that a plugin is a separate package that exists outside of any kedro project i kept modifying setup py within my kedro project following the json example in the documentation only to find that the kedro to json command didn t do anything in retrospect it now makes sense that kedro plugins are separate python packages but this is not obvious to new users from the documentation context new users would benefit from a more complete kedro plugin example possible implementation add more detail to the kedro plugin tutorial also add a complete plugin example as its own repository similar to the kedro examples repo checklist include labels so that we can categorise your issue add a component label to the issue add a priority label to the issue
| 0
|
6,116
| 8,982,037,096
|
IssuesEvent
|
2019-01-31 00:18:52
|
cypress-io/cypress
|
https://api.github.com/repos/cypress-io/cypress
|
closed
|
Comment on each issue that went into the release after publishing Cypress version
|
difficulty: 1️⃣ process: release stage: in progress type: chore
|
We need to comment on each issue that was closed for release with a comment that says "Version X.Y.Z has been published" and maybe links to the Changelog and any other information.
Inspiration: semantic release bot here:
- comment and label `release` https://github.com/bahmutov/cypress-dark/issues/29
<img width="723" alt="screen shot 2019-01-23 at 3 46 34 pm" src="https://user-images.githubusercontent.com/2212006/51635986-169b3700-1f26-11e9-8ba8-1c08c9df2f7b.png">
|
1.0
|
Comment on each issue that went into the release after publishing Cypress version - We need to comment on each issue that was closed for release with a comment that says "Version X.Y.Z has been published" and maybe links to the Changelog and any other information.
Inspiration: semantic release bot here:
- comment and label `release` https://github.com/bahmutov/cypress-dark/issues/29
<img width="723" alt="screen shot 2019-01-23 at 3 46 34 pm" src="https://user-images.githubusercontent.com/2212006/51635986-169b3700-1f26-11e9-8ba8-1c08c9df2f7b.png">
|
process
|
comment on each issue that went into the release after publishing cypress version we need to comment on each issue that was closed for release with a comment that says version x y z has been published and maybe links to the changelog and any other information inspiration semantic release bot here comment and label release img width alt screen shot at pm src
| 1
|
410,352
| 11,986,577,111
|
IssuesEvent
|
2020-04-07 19:34:24
|
aol/moloch
|
https://api.github.com/repos/aol/moloch
|
closed
|
Feature Request: Collapsible Navbars (especially Connections Tab
|
enhancement medium priority viewer
|
can we please have collapsible navbars, especially for the connections tab?
yay @31453
|
1.0
|
Feature Request: Collapsible Navbars (especially Connections Tab) - can we please have collapsible navbars, especially for the connections tab?
yay @31453
|
non_process
|
feature request collapsible navbars especially connections tab can we please have collapsible navbars especially for the connections tab yay
| 0
|
12,728
| 15,099,194,388
|
IssuesEvent
|
2021-02-08 01:46:31
|
e4exp/paper_manager_abstract
|
https://api.github.com/repos/e4exp/paper_manager_abstract
|
closed
|
Learning Transferable Visual Models From Natural Language Supervision
|
2021 Classification Computer Vision Natural Language Processing Pretraining Zero Shot _read_later
|
* https://cdn.openai.com/papers/Learning_Transferable_Visual_Models_From_Natural_Language.pdf
* 2021
State-of-the-art computer vision systems are trained to predict a predetermined set of object categories.
This restricted form of supervision limits their generality and usability, since labeled data is needed to specify any other visual concept.
Learning directly from raw text about images is a promising alternative that leverages a much broader source of supervision.
We demonstrate that the simple pre-training task of predicting which caption goes with which image, on a dataset of 400 million (image, text) pairs collected from the internet, is an efficient and scalable way to learn SOTA image representations from scratch.
After pre-training, the model can be transferred zero-shot to downstream tasks by using natural language to reference the visual concepts it has learned.
We study the performance of this approach by benchmarking on more than 30 different existing computer vision datasets, spanning tasks such as OCR, action recognition in videos, geo-localization, and many kinds of fine-grained object classification.
The model transfers non-trivially to most tasks and is often competitive with a fully supervised baseline without needing any dataset-specific training.
For example, it matches the accuracy of the original ResNet-50 on zero-shot ImageNet without using any of the 1.28 million training examples it was trained on.
|
1.0
|
Learning Transferable Visual Models From Natural Language Supervision - * https://cdn.openai.com/papers/Learning_Transferable_Visual_Models_From_Natural_Language.pdf
* 2021
State-of-the-art computer vision systems are trained to predict a predetermined set of object categories.
This restricted form of supervision limits their generality and usability, since labeled data is needed to specify any other visual concept.
Learning directly from raw text about images is a promising alternative that leverages a much broader source of supervision.
We demonstrate that the simple pre-training task of predicting which caption goes with which image, on a dataset of 400 million (image, text) pairs collected from the internet, is an efficient and scalable way to learn SOTA image representations from scratch.
After pre-training, the model can be transferred zero-shot to downstream tasks by using natural language to reference the visual concepts it has learned.
We study the performance of this approach by benchmarking on more than 30 different existing computer vision datasets, spanning tasks such as OCR, action recognition in videos, geo-localization, and many kinds of fine-grained object classification.
The model transfers non-trivially to most tasks and is often competitive with a fully supervised baseline without needing any dataset-specific training.
For example, it matches the accuracy of the original ResNet-50 on zero-shot ImageNet without using any of the 1.28 million training examples it was trained on.
|
process
|
learning transferable visual models from natural language supervision 最新のコンピュータビジョンシステムは、あらかじめ決められたオブジェクトのカテゴリーを予測するように訓練されています。 このような限定された形のスーパービジョンでは、他の視覚概念を特定するためにラベル付けされたデータが必要となるため、その汎用性と有用性が制限されています。 画像についての生のテキストから直接学習することは、はるかに広範なスーパービジョンのソースを活用する有望な代替手段である。 我々は、 (画像、テキスト)のデータセット上で、どのキャプションがどの画像の年齢に合うかを予測するという単純な事前学習タスクが、sota画像表現をスクラッチから学習する効率的でスケーラブルな方法であることを実証した。 事前学習の後、自然言語を用いて学習した視覚概念を参照することで、下流のタスクにモデルをゼロショットで移行させることができます。 我々はこのアプローチの性能を、ocr、動画のアクション認識、ジオローカリゼーション、 ・ヴィジョンのデータセットでベンチマークを行うことによって研究している。 このモデルは、ほとんどのタスクに非自 主的に適用され、データセット固有のトレーニングを必要とせず、完全に教師付きのベースラインと競合することがよくあります。 例えば、 、imagenetゼロショット上の元のresnet 。
| 1
|
559,369
| 16,557,182,241
|
IssuesEvent
|
2021-05-28 15:10:11
|
grafana/grafana
|
https://api.github.com/repos/grafana/grafana
|
closed
|
Prometheus: When Loki data source connected, querying doesn't work
|
datasource/Prometheus effort/small help wanted onboarding priority/nice-to-have type/bug
|
**What happened**:
Using Loki as Prometheus data source doesn't work:

It used to work in the past.
**What you expected to happen**:
Possibility to query Loki with Prometheus data source type.
**How to reproduce it (as minimally and precisely as possible)**:
1. Set up Loki as Prometheus data source

2. Try to run the query - it throws error
**Environment**:
- Grafana version: 7.5.4
- Data source type & version: Prometheus
|
1.0
|
Prometheus: When Loki data source connected, querying doesn't work - **What happened**:
Using Loki as Prometheus data source doesn't work:

It used to work in the past.
**What you expected to happen**:
Possibility to query Loki with Prometheus data source type.
**How to reproduce it (as minimally and precisely as possible)**:
1. Set up Loki as Prometheus data source

2. Try to run the query - it throws error
**Environment**:
- Grafana version: 7.5.4
- Data source type & version: Prometheus
|
non_process
|
prometheus when loki data source connected querying doesn t work what happened using loki as prometheus data source doesn t work it used to work in the past what you expected to happen possibility to query loki with prometheus data source type how to reproduce it as minimally and precisely as possible set up loki as prometheus data source try to run the query it throws error environment grafana version data source type version prometheus
| 0
|
130,601
| 10,617,867,553
|
IssuesEvent
|
2019-10-12 22:37:32
|
IntellectualSites/FastAsyncWorldEdit-1.13
|
https://api.github.com/repos/IntellectualSites/FastAsyncWorldEdit-1.13
|
opened
|
ArrayIndexOutOfBoundsException: 0
|
Requires Testing
|
# Bug report for FastAsyncWorldEdit 1.13.2
<!--- If you are using 1.13 or 1.13.1 consider updating to 1.13.2 before raising an issue -->
<!--- In order to create a valid issue report you have to follow this template. -->
<!--- Remove this template if making a suggestion or asking a question. -->
<!--- Incomplete reports might be marked as invalid. -->
**[REQUIRED] FastAsyncWorldEdit Version Number:**
FastAsyncWorldEdit-bukkit-1.13.179
**[REQUIRED] Spigot/Paper Version Number:**
paper build 637
**Links to config.yml and config-legacy.yml file:**
https://pastebin.com/XuhJEDjM
https://pastebin.com/6qLz3Ckh
**[REQUIRED] Description of the problem:**
Not sure what causes this error to actually happen; it seems to be related to how you type a command.
Error: https://pastebin.com/DbTXviGU
**Plugins being used on the server:**
FastAsyncWorldEdit, WorldEdit, PlotSquared, PlugMan, SimpleAPI, SimplePets, BuycraftX, LuckPerms, LibsDisguises, ProtocolLib
**How to replicate**:
Not quite sure.
**Checklist**:
<!--- Make sure you've completed the following steps (put an "X" between of brackets): -->
- [X] I included all information required in the sections above
- [X] I made sure there are no duplicates of this report [(Use Search)](https://github.com/IntellectualSites/FastAsyncWorldEdit-1.13/issues?utf8=%E2%9C%93&q=is%3Aissue)
- [X] I made sure I am using an up-to-date version of [FastAsyncWorldEdit for 1.13.2](https://ci.athion.net/job/FastAsyncWorldEdit-Breaking/)
- [X] I made sure the bug/error is not caused by any other plugin
|
1.0
|
ArrayIndexOutOfBoundsException: 0 - # Bug report for FastAsyncWorldEdit 1.13.2
<!--- If you are using 1.13 or 1.13.1 consider updating to 1.13.2 before raising an issue -->
<!--- In order to create a valid issue report you have to follow this template. -->
<!--- Remove this template if making a suggestion or asking a question. -->
<!--- Incomplete reports might be marked as invalid. -->
**[REQUIRED] FastAsyncWorldEdit Version Number:**
FastAsyncWorldEdit-bukkit-1.13.179
**[REQUIRED] Spigot/Paper Version Number:**
paper build 637
**Links to config.yml and config-legacy.yml file:**
https://pastebin.com/XuhJEDjM
https://pastebin.com/6qLz3Ckh
**[REQUIRED] Description of the problem:**
Not sure what causes this error to actually happen; it seems to be related to how you type a command.
Error: https://pastebin.com/DbTXviGU
**Plugins being used on the server:**
FastAsyncWorldEdit, WorldEdit, PlotSquared, PlugMan, SimpleAPI, SimplePets, BuycraftX, LuckPerms, LibsDisguises, ProtocolLib
**How to replicate**:
Not quite sure.
**Checklist**:
<!--- Make sure you've completed the following steps (put an "X" between of brackets): -->
- [X] I included all information required in the sections above
- [X] I made sure there are no duplicates of this report [(Use Search)](https://github.com/IntellectualSites/FastAsyncWorldEdit-1.13/issues?utf8=%E2%9C%93&q=is%3Aissue)
- [X] I made sure I am using an up-to-date version of [FastAsyncWorldEdit for 1.13.2](https://ci.athion.net/job/FastAsyncWorldEdit-Breaking/)
- [X] I made sure the bug/error is not caused by any other plugin
|
non_process
|
arrayindexoutofboundsexception bug report for fastasyncworldedit fastasyncworldedit version number fastasyncworldedit bukkit spigot paper version number paper build links to config yml and config legacy yml file description of the problem not sure what causes this error to actually happens seems to be related with how you type a command error plugins being used on the server fastasyncworldedit worldedit plotsquared plugman simpleapi simplepets buycraftx luckperms libsdisguises protocollib how to replicate not quite sure checklist i included all information required in the sections above i made sure there are no duplicates of this report i made sure i am using an up to date version of i made sure the bug error is not caused by any other plugin
| 0
|
19,983
| 26,462,185,846
|
IssuesEvent
|
2023-01-16 18:48:48
|
nion-software/nionswift
|
https://api.github.com/repos/nion-software/nionswift
|
opened
|
Transpose/flip operations should be easier
|
type - enhancement stage - planning level - easy f - processing
|
Right now they're mixed into the transform menu item which also includes flips. Making them separate would make them easier for the user; and allow them to better apply to collections (i.e. flipping the collection or the data? applying transpose to each item in a collection? etc.)
|
1.0
|
Transpose/flip operations should be easier - Right now they're mixed into the transform menu item which also includes flips. Making them separate would make them easier for the user; and allow them to better apply to collections (i.e. flipping the collection or the data? applying transpose to each item in a collection? etc.)
|
process
|
transpose flip operations should be easier right now they re mixed into the transform menu item which also includes flips making them separate would make them easier for the user and allow them to better apply to collections i e flipping the collection or the data applying transpose to each item in a collection etc
| 1
|
16,895
| 22,197,103,040
|
IssuesEvent
|
2022-06-07 07:58:34
|
q191201771/lal
|
https://api.github.com/repos/q191201771/lal
|
closed
|
rtmp config: what exactly does gop_num do
|
#Question *In process
|
```
"gop_num": 0, //. RTMP拉流的GOP缓存数量,加速流打开时间,但是可能增加延时
//. 如果为0,则不使用缓存发送
```
文档中说明可以加快流打开速度。
请问有推荐的值吗?比如8h8g服务器推荐的值是 8 - 16。
|
1.0
|
rtmp config: what exactly does gop_num do - ```
"gop_num": 0, //. Number of GOPs cached for RTMP pull streams; speeds up stream start-up but may add latency
//. If 0, no cache is used when sending
```
The documentation says this can speed up stream start-up.
Is there a recommended value? For example, would 8 - 16 be recommended for an 8h8g server?
|
process
|
rtmp 配置 gop num的具体作用 gop num rtmp拉流的gop缓存数量,加速流打开时间,但是可能增加延时 ,则不使用缓存发送 文档中说明可以加快流打开速度。 请问有推荐的值吗? 。
| 1
|
3,491
| 9,669,066,797
|
IssuesEvent
|
2019-05-21 16:26:45
|
Azure/azure-sdk
|
https://api.github.com/repos/Azure/azure-sdk
|
closed
|
API Proposal: Add DeviceStream APIs to IoT Hub Device SDK (C)
|
architecture
|
# Overview
Azure IoT Hub device streams facilitate the creation of secure bi-directional TCP tunnels for a variety of cloud-to-device communication scenarios. A device stream is mediated by an IoT Hub streaming endpoint which acts as a proxy between your device and service endpoints. This setup is especially useful when devices are behind a network firewall or reside inside of a private network. As such, IoT Hub device streams help address customers' need to reach IoT devices in a firewall-friendly manner and without the need to broadly opening up incoming or outgoing network firewall ports. Using IoT Hub device streams, devices remain secure and will only need to open up outbound TCP connections to IoT hub's streaming endpoint over port 443. Once a stream is established, the service-side and device-side applications will each have programmatic access to a WebSocket client object to send and receive raw bytes to one another. The reliability and ordering guarantees provided by this tunnel is on par with TCP.
# Device Stream Workflow
A device stream is initiated when the service requests to connect to a device by providing its device ID. This workflow particularly fits into a client/server communication model, including SSH and RDP, where a user intends to remotely connect to the SSH or RDP server running on the device using an SSH or RDP client program.
The device stream creation process involves a negotiation between the device, service, IoT hub's main and streaming endpoints. While IoT hub's main endpoint orchestrates the creation of a device stream, the streaming endpoint handles the traffic that flows between the service and device.
## Device stream creation flow
Programmatic creation of a device stream using the SDK involves the following steps:
1. The device application registers a callback in advance to be notified of when a new device stream is initiated to the device. This step typically takes place when the device boots up and connects to IoT Hub.
2. The service-side program initiates a device stream when needed by providing the device ID (not the IP address).
3. IoT hub notifies the device-side program by invoking the callback registered in step 1. The device may accept or reject the stream initiation request. This logic can be specific to your application scenario. If the stream request is rejected by the device, IoT Hub informs the service accordingly; otherwise, the steps below follow.
4. The device creates a secure outbound TCP connection to the streaming endpoint over port 443 and upgrades the connection to a WebSocket. The URL of the streaming endpoint as well as the credentials to use to authenticate are both provided to the device by IoT Hub as part of the request sent in step 3.
5. The service is notified of the result of device accepting the stream and proceeds to create its own WebSocket client to the streaming endpoint. Similarly, it receives the streaming endpoint URL and authentication information from IoT Hub.
In the handshake process above:
* The handshake process must complete within 60 seconds (step 2 through 5), otherwise the handshake would fail with a timeout and the service will be notified accordingly.
* After the stream creation flow above completes, the streaming endpoint will act as a proxy and will transfer traffic between the service and the device over their respective WebSockets.
* Device and service both need outbound connectivity to IoT Hub's main endpoint as well as the streaming endpoint over port 443. The URL of these endpoints is available on Overview tab on the IoT Hub's portal.
* The reliability and ordering guarantees of an established stream is on par with TCP.
* All connections to IoT Hub and streaming endpoint use TLS and are encrypted.
## Termination Flow
An established stream terminates when either of the TCP connections to the gateway are disconnected (by the service or device). This can take place voluntarily by closing the WebSocket on either the device or service programs, or involuntarily in case of a network connectivity timeout or process failure. Upon termination of either device or service's connection to the streaming endpoint, the other TCP connection will also be (forcefully) terminated and the service and device are responsible to re-create the stream, if needed.
# SDK Availability
Two sides of each stream (on the device and service side) use the IoT Hub SDK to establish the tunnel. During public preview, customers can choose from the following SDK languages:
* The C and C# SDK's support device streams on the device side.
* The Node.js and C# SDK support device streams on the service side.
# Sample Code (C)
This echo sample demonstrates programmatic use of device streams to send and receive bytes between service and device applications. To execute the scenario end to end, use the corresponding C# or Node.js sample for the service side.
## Device Sample Code (C)
Full sample project [here](https://github.com/Azure/azure-iot-sdk-c/tree/public-preview/iothub_client/samples/iothub_client_c2d_streaming_sample)
Creating the device client using your device connection string and desired protocol (existing APIs):
```C
// Used to initialize IoTHub SDK subsystem
(void)IoTHub_Init();
IOTHUB_DEVICE_CLIENT_HANDLE device_handle;
// Create the iothub handle here
device_handle = IoTHubDeviceClient_CreateFromConnectionString(connectionString, protocol);
```
Listening for device stream requests (new APIs):
```C
if (IoTHubDeviceClient_SetStreamRequestCallback(device_handle, streamRequestCallback, NULL) != IOTHUB_CLIENT_OK)
{
(void)printf("Failed setting the stream request callback");
}
else
{
do
{
if (g_uws_client_handle != NULL)
{
uws_client_dowork(g_uws_client_handle);
}
ThreadAPI_Sleep(100);
} while (g_continueRunning);
}
```
Handling the stream request callback (new APIs):
```C
static DEVICE_STREAM_C2D_RESPONSE* streamRequestCallback(DEVICE_STREAM_C2D_REQUEST* stream_request, void* context)
{
(void)context;
(void)printf("Received stream request (%s)\r\n", stream_request->name);
g_uws_client_handle = create_websocket_client(stream_request);
return stream_c2d_response_create(stream_request, true);
}
```
# New API Surface added (C)
Reference to existing IoTHub client APIs [(link)](https://docs.microsoft.com/en-us/azure/iot-hub/iot-c-sdk-ref/iothub-client-h)
## IoTHub Device SDK additions
```C
typedef struct DEVICE_STREAM_C2D_REQUEST_TAG
{
/**
* @brief Name of the stream. This is a null-terminated string.
*/
char* name;
/**
* @brief Websockets URL to connect to the streaming gateway. This is a null-terminated string.
*/
char* url;
/**
* @brief Authorization token to be provided to the streaming gateway upon connection. This is a null-terminated string.
*/
char* authorization_token;
/**
* @brief Request ID used to correlate requests and responses. Do not modify its value.
*/
char* request_id;
} DEVICE_STREAM_C2D_REQUEST;
typedef struct DEVICE_STREAM_C2D_RESPONSE_TAG
{
/**
* @brief Indicates if the stream request was accepted or rejected by the local endpoint. Use true to accept, or false to reject.
*/
bool accept;
/**
* @brief Request ID used to correlate requests and responses. Do not modify its value.
*/
char* request_id;
} DEVICE_STREAM_C2D_RESPONSE;
/**
* @brief Callback invoked for new cloud-to-device stream requests.
* @param request Contains the basic information to connect to the streaming gateway, as well as optional custom data provided by the originating endpoint.
* @param context User-defined context, as provided in the call to *_SetStreamRequestCallback.
* @return An instance of DEVICE_STREAM_C2D_RESPONSE indicating if the stream request is accepted or rejected.
*/
typedef DEVICE_STREAM_C2D_RESPONSE* (*DEVICE_STREAM_C2D_REQUEST_CALLBACK)(DEVICE_STREAM_C2D_REQUEST* request, void* context);
DEVICE_STREAM_C2D_RESPONSE* stream_c2d_response_create(DEVICE_STREAM_C2D_REQUEST* request, bool accept);
IOTHUB_CLIENT_RESULT IoTHubDeviceClient_SetStreamRequestCallback(IOTHUB_DEVICE_CLIENT_HANDLE iotHubClientHandle, DEVICE_STREAM_C2D_REQUEST_CALLBACK streamRequestCallback, void* context);
IOTHUB_CLIENT_RESULT IoTHubModuleClient_SetStreamRequestCallback(IOTHUB_MODULE_CLIENT_HANDLE iotHubModuleClientHandle, DEVICE_STREAM_C2D_REQUEST_CALLBACK streamRequestCallback, void* context);
```
|
1.0
|
API Proposal: Add DeviceStream APIs to IoT Hub Device SDK (C) - # Overview
Azure IoT Hub device streams facilitate the creation of secure bi-directional TCP tunnels for a variety of cloud-to-device communication scenarios. A device stream is mediated by an IoT Hub streaming endpoint which acts as a proxy between your device and service endpoints. This setup is especially useful when devices are behind a network firewall or reside inside of a private network. As such, IoT Hub device streams help address customers' need to reach IoT devices in a firewall-friendly manner and without the need to broadly opening up incoming or outgoing network firewall ports. Using IoT Hub device streams, devices remain secure and will only need to open up outbound TCP connections to IoT hub's streaming endpoint over port 443. Once a stream is established, the service-side and device-side applications will each have programmatic access to a WebSocket client object to send and receive raw bytes to one another. The reliability and ordering guarantees provided by this tunnel is on par with TCP.
# Device Stream Workflow
A device stream is initiated when the service requests to connect to a device by providing its device ID. This workflow particularly fits into a client/server communication model, including SSH and RDP, where a user intends to remotely connect to the SSH or RDP server running on the device using an SSH or RDP client program.
The device stream creation process involves a negotiation between the device, service, IoT hub's main and streaming endpoints. While IoT hub's main endpoint orchestrates the creation of a device stream, the streaming endpoint handles the traffic that flows between the service and device.
## Device stream creation flow
Programmatic creation of a device stream using the SDK involves the following steps:
1. The device application registers a callback in advance to be notified of when a new device stream is initiated to the device. This step typically takes place when the device boots up and connects to IoT Hub.
2. The service-side program initiates a device stream when needed by providing the device ID (not the IP address).
3. IoT hub notifies the device-side program by invoking the callback registered in step 1. The device may accept or reject the stream initiation request. This logic can be specific to your application scenario. If the stream request is rejected by the device, IoT Hub informs the service accordingly; otherwise, the steps below follow.
4. The device creates a secure outbound TCP connection to the streaming endpoint over port 443 and upgrades the connection to a WebSocket. The URL of the streaming endpoint as well as the credentials to use to authenticate are both provided to the device by IoT Hub as part of the request sent in step 3.
5. The service is notified of the result of device accepting the stream and proceeds to create its own WebSocket client to the streaming endpoint. Similarly, it receives the streaming endpoint URL and authentication information from IoT Hub.
In the handshake process above:
* The handshake process must complete within 60 seconds (step 2 through 5), otherwise the handshake would fail with a timeout and the service will be notified accordingly.
* After the stream creation flow above completes, the streaming endpoint will act as a proxy and will transfer traffic between the service and the device over their respective WebSockets.
* Device and service both need outbound connectivity to IoT Hub's main endpoint as well as the streaming endpoint over port 443. The URL of these endpoints is available on Overview tab on the IoT Hub's portal.
* The reliability and ordering guarantees of an established stream is on par with TCP.
* All connections to IoT Hub and streaming endpoint use TLS and are encrypted.
## Termination Flow
An established stream terminates when either of the TCP connections to the gateway are disconnected (by the service or device). This can take place voluntarily by closing the WebSocket on either the device or service programs, or involuntarily in case of a network connectivity timeout or process failure. Upon termination of either device or service's connection to the streaming endpoint, the other TCP connection will also be (forcefully) terminated and the service and device are responsible to re-create the stream, if needed.
# SDK Availability
Two sides of each stream (on the device and service side) use the IoT Hub SDK to establish the tunnel. During public preview, customers can choose from the following SDK languages:
* The C and C# SDK's support device streams on the device side.
* The Node.js and C# SDK support device streams on the service side.
# Sample Code (C)
This echo sample demonstrates programmatic use of device streams to send and receive bytes between service and device applications. To execute the scenario end to end, use the corresponding C# or Node.js sample for the service side.
## Device Sample Code (C)
Full sample project [here](https://github.com/Azure/azure-iot-sdk-c/tree/public-preview/iothub_client/samples/iothub_client_c2d_streaming_sample)
Creating the device client using your device connection string and desired protocol (existing APIs):
```C
// Used to initialize IoTHub SDK subsystem
(void)IoTHub_Init();
IOTHUB_DEVICE_CLIENT_HANDLE device_handle;
// Create the iothub handle here
device_handle = IoTHubDeviceClient_CreateFromConnectionString(connectionString, protocol);
```
Listening for device stream requests (new APIs):
```C
if (IoTHubDeviceClient_SetStreamRequestCallback(device_handle, streamRequestCallback, NULL) != IOTHUB_CLIENT_OK)
{
(void)printf("Failed setting the stream request callback");
}
else
{
do
{
if (g_uws_client_handle != NULL)
{
uws_client_dowork(g_uws_client_handle);
}
ThreadAPI_Sleep(100);
} while (g_continueRunning);
}
```
Handling the stream request callback (new APIs):
```C
static DEVICE_STREAM_C2D_RESPONSE* streamRequestCallback(DEVICE_STREAM_C2D_REQUEST* stream_request, void* context)
{
(void)context;
(void)printf("Received stream request (%s)\r\n", stream_request->name);
g_uws_client_handle = create_websocket_client(stream_request);
return stream_c2d_response_create(stream_request, true);
}
```
# New API Surface added (C)
Reference to existing IoTHub client APIs [(link)](https://docs.microsoft.com/en-us/azure/iot-hub/iot-c-sdk-ref/iothub-client-h)
## IoTHub Device SDK additions
```C
typedef struct DEVICE_STREAM_C2D_REQUEST_TAG
{
/**
* @brief Name of the stream. This is a null-terminated string.
*/
char* name;
/**
* @brief Websockets URL to connect to the streaming gateway. This is a null-terminated string.
*/
char* url;
/**
* @brief Authorization token to be provided to the streaming gateway upon connection. This is a null-terminated string.
*/
char* authorization_token;
/**
* @brief Request ID used to correlate requests and responses. Do not modify its value.
*/
char* request_id;
} DEVICE_STREAM_C2D_REQUEST;
typedef struct DEVICE_STREAM_C2D_RESPONSE_TAG
{
/**
* @brief Indicates if the stream request was accepted or rejected by the local endpoint. Use true to accept, or false to reject.
*/
bool accept;
/**
* @brief Request ID used to correlate requests and responses. Do not modify its value.
*/
char* request_id;
} DEVICE_STREAM_C2D_RESPONSE;
/**
* @brief Callback invoked for new cloud-to-device stream requests.
* @param request Contains the basic information to connect to the streaming gateway, as well as optional custom data provided by the originating endpoint.
* @param context User-defined context, as provided in the call to *_SetStreamRequestCallback.
* @return An instance of DEVICE_STREAM_C2D_RESPONSE indicating if the stream request is accepted or rejected.
*/
typedef DEVICE_STREAM_C2D_RESPONSE* (*DEVICE_STREAM_C2D_REQUEST_CALLBACK)(DEVICE_STREAM_C2D_REQUEST* request, void* context);
DEVICE_STREAM_C2D_RESPONSE* stream_c2d_response_create(DEVICE_STREAM_C2D_REQUEST* request, bool accept);
IOTHUB_CLIENT_RESULT IoTHubDeviceClient_SetStreamRequestCallback(IOTHUB_DEVICE_CLIENT_HANDLE iotHubClientHandle, DEVICE_STREAM_C2D_REQUEST_CALLBACK streamRequestCallback, void* context);
IOTHUB_CLIENT_RESULT IoTHubModuleClient_SetStreamRequestCallback(IOTHUB_MODULE_CLIENT_HANDLE iotHubModuleClientHandle, DEVICE_STREAM_C2D_REQUEST_CALLBACK streamRequestCallback, void* context);
```
|
non_process
|
api proposal add devicestream apis to iot hub device sdk c overview azure iot hub device streams facilitate the creation of secure bi directional tcp tunnels for a variety of cloud to device communication scenarios a device stream is mediated by an iot hub streaming endpoint which acts as a proxy between your device and service endpoints this setup is especially useful when devices are behind a network firewall or reside inside of a private network as such iot hub device streams help address customers need to reach iot devices in a firewall friendly manner and without the need to broadly opening up incoming or outgoing network firewall ports using iot hub device streams devices remain secure and will only need to open up outbound tcp connections to iot hub s streaming endpoint over port once a stream is established the service side and device side applications will each have programmatic access to a websocket client object to send and receive raw bytes to one another the reliability and ordering guarantees provided by this tunnel is on par with tcp device stream workflow a device stream is initiated when the service requests to connect to a device by providing its device id this workflow particularly fits into a client server communication model including ssh and rdp where a user intends to remotely connect to the ssh or rdp server running on the device using an ssh or rdp client program the device stream creation process involves a negotiation between the device service iot hub s main and streaming endpoints while iot hub s main endpoint orchestrates the creation of a device stream the streaming endpoint handles the traffic that flows between the service and device device stream creation flow programmatic creation of a device stream using the sdk involves the following steps the device application registers a callback in advance to be notified of when a new device stream is initiated to the device this step typically takes place when the device boots up and connects to iot hub the service side program initiates a device stream when needed by providing the device id not the ip address iot hub notifies the device side program by invoking the callback registered in step the device may accept or reject the stream initiation request this logic can be specific to your application scenario if the stream request is rejected by the device iot hub informs the service accordingly otherwise the steps below follow the device creates a secure outbound tcp connection to the streaming endpoint over port and upgrades the connection to a websocket the url of the streaming endpoint as well as the credentials to use to authenticate are both provided to the device by iot hub as part of the request sent in step the service is notified of the result of device accepting the stream and proceeds to create its own websocket client to the streaming endpoint similarly it receives the streaming endpoint url and authentication information from iot hub in the handshake process above the handshake process must complete within seconds step through otherwise the handshake would fail with a timeout and the service will be notified accordingly after the stream creation flow above completes the streaming endpoint will act as a proxy and will transfer traffic between the service and the device over their respective websockets device and service both need outbound connectivity to iot hub s main endpoint as well as the streaming endpoint over port the url of these endpoints is available on overview tab on the iot hub s portal 
the reliability and ordering guarantees of an established stream is on par with tcp all connections to iot hub and streaming endpoint use tls and are encrypted termination flow an established stream terminates when either of the tcp connections to the gateway are disconnected by the service or device this can take place voluntarily by closing the websocket on either the device or service programs or involuntarily in case of a network connectivity timeout or process failure upon termination of either device or service s connection to the streaming endpoint the other tcp connection will also be forcefully terminated and the service and device are responsible to re create the stream if needed sdk availability two sides of each stream on the device and service side use the iot hub sdk to establish the tunnel during public preview customers can choose from the following sdk languages the c and c sdk s support device streams on the device side the node js and c sdk support device streams on the service side sample code c this echo sample demonstrates programmatic use of device streams to send and receive bytes between service and device applications to execute the scenario end to end use the corresponding c or node js sample for the service side device sample code c full sample project creating the device client using your device connection string and desired protocol existing apis c used to initialize iothub sdk subsystem void iothub init iothub device client handle device handle create the iothub handle here device handle iothubdeviceclient createfromconnectionstring connectionstring protocol listening for device stream requests new apis c if iothubdeviceclient setstreamrequestcallback device handle streamrequestcallback null iothub client ok void printf failed setting the stream request callback else do if g uws client handle null uws client dowork g uws client handle threadapi sleep while g continuerunning handling the stream request callback new apis c static device stream response streamrequestcallback device stream request stream request void context void context void printf received stream request s r n stream request name g uws client handle create websocket client stream request return stream response create stream request true new api surface added c reference to existing iothub client apis iothub device sdk additions c typedef struct device stream request tag brief name of the stream this is a null terminated string char name brief websockets url to connect to the streaming gateway this is a null terminated string char url brief authorization token to be provided to the streaming gateway upon connection this is a null terminated string char authorization token brief request id used to correlate requests and responses do not modify its value char request id device stream request typedef struct device stream response tag brief indicates if the stream request was accepted or rejected by the local endpoint use true to accept or false to reject bool accept brief request id used to correlate requests and responses do not modify its value char request id device stream response brief callback invoked for new cloud to device stream requests param request contains the basic information to connect to the streaming gateway as well as optional custom data provided by the originating endpoint param context user defined context as provided in the call to setstreamrequestcallback return an instance of device stream response indicating if the stream request is accepted or rejected typedef device 
stream response device stream request callback device stream request request void context device stream response stream response create device stream request request bool accept iothub client result iothubdeviceclient setstreamrequestcallback iothub device client handle iothubclienthandle device stream request callback streamrequestcallback void context iothub client result iothubmoduleclient setstreamrequestcallback iothub module client handle iothubmoduleclienthandle device stream request callback streamrequestcallback void context
| 0
|
1,921
| 4,757,526,322
|
IssuesEvent
|
2016-10-24 16:50:27
|
mozilla/tofino
|
https://api.github.com/repos/mozilla/tofino
|
closed
|
Consider having main process provide UAS with Node.js server directly
|
backend:main-process
|
Node.js has the ability to provide a socket or server to a child process via an IPC channel. We might care to use this ability to start the UAS from Tofino.
The big win is that the Tofino main process can arrange to start a server and bind it to any port/interface it cares to, independently of the UAS process that services requests. That means we don't have to muck around with port-scanning in the UAS process, nor do we have to find a way to message the port back to the main process.
The big cost is that the UAS is now heavily dependent on a Node.js-specific piece of functionality.
We might achieve something similar by providing the UAS process an extra open FD when it starts, specifically for it to message its port back to the main process. This is more general than using Node.js's implementation.
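For illustration only (not part of the original issue): the extra-FD alternative described above can be sketched in Python, used here just to show the OS-level handoff rather than the Node.js-specific API. The script name `uas_child.py` and the argument convention are hypothetical. The parent binds the listening socket itself and passes the open descriptor to the child, so there is no port-scanning in the child and no need to message a port back.
```
import socket
import subprocess
import sys

# Parent side: bind the listening socket here, then hand the open file
# descriptor to the child instead of letting the child pick its own port.
listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.bind(("127.0.0.1", 0))   # port 0: the kernel chooses a free port
listener.listen()
print("parent bound port", listener.getsockname()[1])

# pass_fds keeps the descriptor open across exec (POSIX only); the child
# receives the numeric descriptor as a command-line argument.
subprocess.Popen(
    [sys.executable, "uas_child.py", str(listener.fileno())],
    pass_fds=(listener.fileno(),),
)
```
The hypothetical `uas_child.py` would rebuild a socket object from the inherited descriptor with `socket.socket(fileno=int(sys.argv[1]))` and accept connections on it.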
|
1.0
|
Consider having main process provide UAS with Node.js server directly - Node.js has the ability to provide a socket or server to a child process via an IPC channel. We might care to use this ability to start the UAS from Tofino.
The big win is that the Tofino main process can arrange to start a server and bind it to any port/interface it cares to, independently of the UAS process that services requests. That means we don't have to muck around with port-scanning in the UAS process, nor do we have to find a way to message the port back to the main process.
The big cost is that the UAS is now heavily dependent on a Node.js-specific piece of functionality.
We might achieve something similar by providing the UAS process an extra open FD when it starts, specifically for it to message its port back to the main process. This is more general than using Node.js's implementation.
|
process
|
consider having main process provide uas with node js server directly node js has the ability to provide a socket or server to a child process via an ipc channel we might care to use this ability to start the uas from tofino the big win is that the tofino main process can arrange to start a server and bind it to any port interface it cares to independently of the uas process that services requests that means we don t have to much around with port scanning in the uas process nor do we have to find a way to message the port back to the main process the big cost is that the uas is now heavily dependent on a node js specific piece of functionality we might achieve something similar by providing the uas process an extra open fd when it starts specifically for it to message its port back to the main process this is more general than using node js s implementation
| 1
|
18,574
| 24,556,336,041
|
IssuesEvent
|
2022-10-12 16:09:35
|
GoogleCloudPlatform/fda-mystudies
|
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
|
closed
|
[Android] App is crashing in a scenario for the 'Text choice' response type
|
Bug P0 Android Process: Fixed Process: Tested dev
|
Steps:
1. Sign up or sign in to the mobile app
2. Enroll to the study
3. Click on text choice question with other option
4. Select the 'Other' option for text choice response type
5. Click on Next button
6. Click on the Back button and observe
A/R:- App is crashing in above scenario
E/R:- App should not crash in any screen
https://user-images.githubusercontent.com/60500517/191533174-467d4521-18c8-4a21-b357-36104afabbfd.mp4
|
2.0
|
[Android] App is crashing in a scenario for the 'Text choice' response type - Steps:
1. Sign up or sign in to the mobile app
2. Enroll to the study
3. Click on text choice question with other option
4. Select the 'Other' option for text choice response type
5. Click on Next button
6. Click on the Back button and observe
A/R:- App is crashing in above scenario
E/R:- App should not crash in any screen
https://user-images.githubusercontent.com/60500517/191533174-467d4521-18c8-4a21-b357-36104afabbfd.mp4
|
process
|
app is crashing in a scenario for the text choice responce type steps sign up or sign in to the mobile app enroll to the study click on text choice question with other option select the other option for text choice response type click on next button click on the back button and observe a r app is crashing in above scenario e r app should not crash in any screen
| 1
|
4,715
| 7,552,482,461
|
IssuesEvent
|
2018-04-19 00:33:13
|
UnbFeelings/unb-feelings-docs
|
https://api.github.com/repos/UnbFeelings/unb-feelings-docs
|
closed
|
[Não Conformidade] Inexistência do Plano de Medição
|
Desenvolvimento Processo Qualidade invalid
|
@UnbFeelings/process
Perante critérios definidos para as [Auditorias](https://github.com/UnbFeelings/unb-feelings-GQA/wiki/Crit%C3%A9rios-de-Avalia%C3%A7%C3%A3o-e-T%C3%A9cnicas-de-Auditoria) fora auditado os [Testes de Software](https://github.com/UnbFeelings/unb-feelings-GQA/wiki/Auditoria-de-Testes-de-Software---Ciclo-1) da equipe de *Desenvolvimento* seguindo os padrões definidos no Plano de Medição proposto pela equipe de *Processo*.
### Descrição
Após a realização de um Checklist no repositório do produto fora notado que alguns testes estão sendo feitos porém sem o direcionamento do Plano de Medição feito pela equipe de Processo, já que o mesmo não existe.
### Recomendação
Para que o produto em desenvolvimento seja feito com qualidade é preciso que um Plano de Medição associado seja feito e acompanhado pela equipe de Processo.
#### Detalhes
**Auditor**: Matheus Figueiredo
**Técnica de Audição**: Checklist
**Data da Audição**: 16/04/2018
|
1.0
|
[Não Conformidade] Inexistência do Plano de Medição - @UnbFeelings/process
Perante critérios definidos para as [Auditorias](https://github.com/UnbFeelings/unb-feelings-GQA/wiki/Crit%C3%A9rios-de-Avalia%C3%A7%C3%A3o-e-T%C3%A9cnicas-de-Auditoria) fora auditado os [Testes de Software](https://github.com/UnbFeelings/unb-feelings-GQA/wiki/Auditoria-de-Testes-de-Software---Ciclo-1) da equipe de *Desenvolvimento* seguindo os padrões definidos no Plano de Medição proposto pela equipe de *Processo*.
### Descrição
Após a realização de um Checklist no repositório do produto fora notado que alguns testes estão sendo feitos porém sem o direcionamento do Plano de Medição feito pela equipe de Processo, já que o mesmo não existe.
### Recomendação
Para que o produto em desenvolvimento seja feito com qualidade é preciso que um Plano de Medição associado seja feito e acompanhado pela equipe de Processo.
#### Detalhes
**Auditor**: Matheus Figueiredo
**Técnica de Audição**: Checklist
**Data da Audição**: 16/04/2018
|
process
|
inexistência do plano de medição unbfeelings process perante critérios definidos para as fora auditado os da equipe de desenvolvimento seguindo os padrões definidos no plano de medição proposto pela equipe de processo descrição após a realização de um checklist no repositório do produto fora notado que alguns testes estão sendo feitos porém sem o direcionamento do plano de medição feito pela equipe de processo já que o mesmo não existe recomendação para que o produto em desenvolvimento seja feito com qualidade é preciso que um plano de medição associado seja feito e acompanhado pela equipe de processo detalhes auditor matheus figueiredo técnica de audição checklist data da audição
| 1
|
20,505
| 27,167,377,223
|
IssuesEvent
|
2023-02-17 16:21:36
|
MicrosoftDocs/azure-devops-docs
|
https://api.github.com/repos/MicrosoftDocs/azure-devops-docs
|
closed
|
condition check for main is improperly written
|
doc-bug Pri1 azure-devops-pipelines/svc azure-devops-pipelines-process/subsvc
|
The (condition: contains(variables['build.sourceBranch'], 'refs/heads/main')) step in 'Pipeline behavior when build is canceled' should read as (condition: eq(variables['build.sourceBranch'], 'refs/heads/main')), since the task is meant to run only on the main branch. With contains(), the step will also run on branches named main-***, which is not the desired behavior. The follow-up steps in the article already reference that validation step via eq() rather than contains().
:) thanks
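As an aside (not part of the article being reported on), the difference is easy to see with plain Python stand-ins for the pipeline expression functions: the `in` operator behaves like contains() and `==` behaves like eq().
```
refs = [
    "refs/heads/main",        # the only ref that should run the step
    "refs/heads/main-hotfix", # over-matched by a contains() check
]
for ref in refs:
    # contains()-style check vs eq()-style check
    print(ref, "contains:", "refs/heads/main" in ref, "eq:", ref == "refs/heads/main")
```
Only the equality check restricts the step to refs/heads/main itself.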
---
#### Document Details
⚠ *Do not edit this section. It is required for learn.microsoft.com ➟ GitHub issue linking.*
* ID: 21e5cee4-eaae-3a96-db91-540ac759e83a
* Version Independent ID: 9bdc837c-ffe0-d999-f922-f3a5debc7f92
* Content: [Conditions - Azure Pipelines](https://learn.microsoft.com/en-us/azure/devops/pipelines/process/conditions?view=azure-devops&tabs=yaml%2Cstages)
* Content Source: [docs/pipelines/process/conditions.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/main/docs/pipelines/process/conditions.md)
* Service: **azure-devops-pipelines**
* Sub-service: **azure-devops-pipelines-process**
* GitHub Login: @juliakm
* Microsoft Alias: **jukullam**
|
1.0
|
condition check for main is improperly written -
The (condition: contains(variables['build.sourceBranch'], 'refs/heads/main')) step in 'Pipeline behavior when build is canceled' should read as (condition: eq(variables['build.sourceBranch'], 'refs/heads/main')), since the task is meant to run only on the main branch. With contains(), the step will also run on branches named main-***, which is not the desired behavior. The follow-up steps in the article already reference that validation step via eq() rather than contains().
:) thanks
---
#### Document Details
⚠ *Do not edit this section. It is required for learn.microsoft.com ➟ GitHub issue linking.*
* ID: 21e5cee4-eaae-3a96-db91-540ac759e83a
* Version Independent ID: 9bdc837c-ffe0-d999-f922-f3a5debc7f92
* Content: [Conditions - Azure Pipelines](https://learn.microsoft.com/en-us/azure/devops/pipelines/process/conditions?view=azure-devops&tabs=yaml%2Cstages)
* Content Source: [docs/pipelines/process/conditions.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/main/docs/pipelines/process/conditions.md)
* Service: **azure-devops-pipelines**
* Sub-service: **azure-devops-pipelines-process**
* GitHub Login: @juliakm
* Microsoft Alias: **jukullam**
|
process
|
condition check for main is improperly written condition contains variables refs heads main step in pipeline behavior when build is canceled should read as condition eq variables refs heads main since the task is meant to only run on main branch however it will run on branches that contain main which is not a desired task as well the follow up steps proceed to call out that validation step via eq over contains thanks document details ⚠ do not edit this section it is required for learn microsoft com ➟ github issue linking id eaae version independent id content content source service azure devops pipelines sub service azure devops pipelines process github login juliakm microsoft alias jukullam
| 1
|
168,910
| 26,711,328,271
|
IssuesEvent
|
2023-01-28 00:36:07
|
devssa/onde-codar-em-salvador
|
https://api.github.com/repos/devssa/onde-codar-em-salvador
|
closed
|
[REACT] [PROGRAMACAO/ARQUITETURA] [REMOTO] Pessoa Desenvolvedora (React) na [LIBER CAPITAL]
|
JAVASCRIPT GIT REST REACT REMOTO APIs TESTES AUTOMATIZADOS DESENVOLVIMENTO WEB METODOLOGIAS ÁGEIS HELP WANTED styled compnents DESIGN SYSTEM ARQUITETURA DE SOFTWARE PROGRAMACAO Stale
|
<!--
==================================================
POR FAVOR, SÓ POSTE SE A VAGA FOR PARA SALVADOR E CIDADES VIZINHAS!
Use: "Desenvolvedor Front-end" ao invés de
"Front-End Developer" \o/
Exemplo: `[JAVASCRIPT] [MYSQL] [NODE.JS] Desenvolvedor Front-End na [NOME DA EMPRESA]`
==================================================
-->
## Descrição da vaga
- Desenvolver novas features e customizações em nossa plataforma de negociação de recebíveis;
- Escrever código de qualidade (e refatorar quando necessário) seguindo boas práticas de desenvolvimento (ex: clean code);
- Implementar interfaces utilizando React seguindo protótipos de alta fidelidade;
- Ajudar a manter e incrementar nossa biblioteca de componentes;
- Participar de um time multidisciplinar ajudando a encontrar as melhores soluções para os usuários e o negócio.
## Local
- Remoto
## Benefícios
- Muito aprendizado;
- Ambiente cheio de pessoas inteligentes e colaborativas;
- Universidade Corporativa;
- Regime de trabalho CLT com horário flexível/ Possibilidade de atuação home office (com auxílio home);
- VA ou VR;
- Plano de saúde;
- Plano odontológico;
- Seguro de vida;
- Gympass;
- Zenklub;
- Bolsa de estudos, de acordo com a política específica.
## Requisitos
**Obrigatórios:**
- Conhecimento sólido em JavaScript;
- Experiência em desenvolvimento de aplicações web utilizando React;
- Familiaridade com consumo de APIs REST;
- Conhecimento sólido em versionamento de código (Git);
- Experiência com metodologias de desenvolvimento ágeis;
- Conhecimento em alguma ferramenta de testes automatizados (React Testing Library, Enzyme, Jest);
- Conhecimento em estilização de componentes utilizando Styled-components (CSS-in-JS);
- Conhecimento ou vontade de aprender sobre Design systems.
## Contratação
- a combinar
## Nossa empresa
- Fintech de antecipação de recebíveis que está revolucionando o mercado financeiro.
## Como se candidatar
- [Clique aqui para se candidatar](https://hipsters.jobs/job/18379/pessoa-desenvolvedora-react/)
|
1.0
|
[REACT] [PROGRAMACAO/ARQUITETURA] [REMOTO] Pessoa Desenvolvedora (React) na [LIBER CAPITAL] - <!--
==================================================
POR FAVOR, SÓ POSTE SE A VAGA FOR PARA SALVADOR E CIDADES VIZINHAS!
Use: "Desenvolvedor Front-end" ao invés de
"Front-End Developer" \o/
Exemplo: `[JAVASCRIPT] [MYSQL] [NODE.JS] Desenvolvedor Front-End na [NOME DA EMPRESA]`
==================================================
-->
## Descrição da vaga
- Desenvolver novas features e customizações em nossa plataforma de negociação de recebíveis;
- Escrever código de qualidade (e refatorar quando necessário) seguindo boas práticas de desenvolvimento (ex: clean code);
- Implementar interfaces utilizando React seguindo protótipos de alta fidelidade;
- Ajudar a manter e incrementar nossa biblioteca de componentes;
- Participar de um time multidisciplinar ajudando a encontrar as melhores soluções para os usuários e o negócio.
## Local
- Remoto
## Benefícios
- Muito aprendizado;
- Ambiente cheio de pessoas inteligentes e colaborativas;
- Universidade Corporativa;
- Regime de trabalho CLT com horário flexível/ Possibilidade de atuação home office (com auxílio home);
- VA ou VR;
- Plano de saúde;
- Plano odontológico;
- Seguro de vida;
- Gympass;
- Zenklub;
- Bolsa de estudos, de acordo com a política específica.
## Requisitos
**Obrigatórios:**
- Conhecimento sólido em JavaScript;
- Experiência em desenvolvimento de aplicações web utilizando React;
- Familiaridade com consumo de APIs REST;
- Conhecimento sólido em versionamento de código (Git);
- Experiência com metodologias de desenvolvimento ágeis;
- Conhecimento em alguma ferramenta de testes automatizados (React Testing Library, Enzyme, Jest);
- Conhecimento em estilização de componentes utilizando Styled-components (CSS-in-JS);
- Conhecimento ou vontade de aprender sobre Design systems.
## Contratação
- a combinar
## Nossa empresa
- Fintech de antecipação de recebíveis que está revolucionando o mercado financeiro.
## Como se candidatar
- [Clique aqui para se candidatar](https://hipsters.jobs/job/18379/pessoa-desenvolvedora-react/)
|
non_process
|
pessoa desenvolvedora react na por favor só poste se a vaga for para salvador e cidades vizinhas use desenvolvedor front end ao invés de front end developer o exemplo desenvolvedor front end na descrição da vaga desenvolver novas features e customizações em nossa plataforma de negociação de recebíveis escrever código de qualidade e refatorar quando necessário seguindo boas práticas de desenvolvimento ex clean code implementar interfaces utilizando react seguindo protótipos de alta fidelidade ajudar a manter e incrementar nossa biblioteca de componentes participar de um time multidisciplinar ajudando a encontrar as melhores soluções para os usuários e o negócio local remoto benefícios muito aprendizado ambiente cheio de pessoas inteligentes e colaborativas universidade corporativa regime de trabalho clt com horário flexível possibilidade de atuação home office com auxílio home va ou vr plano de saúde plano odontológico seguro de vida gympass zenklub bolsa de estudos de acordo com a política específica requisitos obrigatórios conhecimento sólido em javascript experiência em desenvolvimento de aplicações web utilizando react familiaridade com consumo de apis rest conhecimento sólido em versionamento de código git experiência com metodologias de desenvolvimento ágeis conhecimento em alguma ferramenta de testes automatizados react testing library enzyme jest conhecimento em estilização de componentes utilizando styled components css in js conhecimento ou vontade de aprender sobre design systems contratação a combinar nossa empresa fintech de antecipação de recebíveis que está revolucionando o mercado financeiro como se candidatar
| 0
|
1,031
| 3,489,256,924
|
IssuesEvent
|
2016-01-03 18:39:32
|
kerubistan/kerub
|
https://api.github.com/repos/kerubistan/kerub
|
opened
|
start vm-monitoring processes
|
component:data processing component:virtualization enhancement priority: high
|
Start the processes to monitor the VM and update VM dynamic data
|
1.0
|
start vm-monitoring processes - Start the processes to monitor the VM and update VM dynamic data
|
process
|
start vm monitoring processes start the processes to monitor the vm and update vm dynamic data
| 1
|
74,449
| 3,440,110,480
|
IssuesEvent
|
2015-12-14 13:06:45
|
Itseez/opencv
|
https://api.github.com/repos/Itseez/opencv
|
closed
|
Android native camera fails on 2.4.9 and 3.0.0
|
affected: master auto-transferred bug category: android priority: normal
|
Transferred from http://code.opencv.org/issues/3681
```
|| Anders Modén on 2014-05-09 07:12
|| Priority: Normal
|| Affected: branch 'master' (3.0-dev)
|| Category: android
|| Tracker: Bug
|| Difficulty: Hard
|| PR:
|| Platform: ARM / Android
```
Android native camera fails on 2.4.9 and 3.0.0
-----------
```
Nexxus 7 2013 model fails to activate native camera on 2.4.9 and 3.0.0 latest
applyProperties: failed setPreviewTexture call; ....
and
initCameraConnect: startPreview() fails.
Just use Tutorial Sample 1 and go into native mode
If you want to contact me I can help you debug it.
```
History
-------
##### Anders Modén on 2014-05-09 07:13
```
It should be 2014 model
This is the output from the native camera app as well
05-09 09:28:20.452: D/OpenCV::camera(1932): CvCapture_Android::CvCapture_Android(0)
05-09 09:28:20.452: D/OpenCV::camera(1932): Library name: libopencv_java.so
05-09 09:28:20.452: D/OpenCV::camera(1932): Library base address: 0x74eca000
05-09 09:28:20.462: V/threaded_app(1932): WindowFocusChanged: 0x7615a3c0 -- 1
05-09 09:28:20.472: D/OpenCV::camera(1932): Libraries folder found: /data/app-lib/Saab.Combitech.AR.SDK-1/
05-09 09:28:20.472: D/OpenCV::camera(1932): CameraWrapperConnector::connectToLib: folderPath=/data/app-lib/Saab.Combitech.AR.SDK-1/
05-09 09:28:20.472: E/OpenCV::camera(1932): ||libnative_camera_r2.3.3.so
05-09 09:28:20.472: E/OpenCV::camera(1932): ||libnative_camera_r4.0.3.so
05-09 09:28:20.472: E/OpenCV::camera(1932): ||libnative_camera_r4.1.1.so
05-09 09:28:20.472: E/OpenCV::camera(1932): ||libnative_camera_r4.2.0.so
05-09 09:28:20.472: E/OpenCV::camera(1932): ||libnative_camera_r2.2.0.so
05-09 09:28:20.472: E/OpenCV::camera(1932): ||libnative_camera_r3.0.1.so
05-09 09:28:20.472: E/OpenCV::camera(1932): ||libnative_camera_r4.0.0.so
05-09 09:28:20.472: E/OpenCV::camera(1932): ||libnative_camera_r4.3.0.so
05-09 09:28:20.472: E/OpenCV::camera(1932): ||libnative_camera_r4.4.0.so
05-09 09:28:20.472: D/OpenCV::camera(1932): try to load library 'libnative_camera_r4.4.0.so'
05-09 09:28:20.482: D/OpenCV::camera(1932): Loaded library '/data/app-lib/Saab.Combitech.AR.SDK-1/libnative_camera_r4.4.0.so'
05-09 09:28:20.482: D/OpenCV_NativeCamera(1932): CameraHandler::initCameraConnect(0x75596609, 0, 0x76169c68, 0x0)
05-09 09:28:20.482: D/OpenCV_NativeCamera(1932): Current process name for camera init: Saab.Combitech.AR.SDK
05-09 09:28:20.932: D/OpenCV_NativeCamera(1932): Instantiated new CameraHandler (0x75596609, 0x76169c68)
05-09 09:28:20.932: I/OpenCV_NativeCamera(1932): initCameraConnect: [antibanding=auto;antibanding-values=off,60hz,50hz,auto;auto-exposure-lock=false;auto-exposure-lock-supported=true;auto-whitebalance-lock=false;auto-whitebalance-lock-supported=true;effect=none;effect-values=none,mono,negative,solarize,sepia,posterize,whiteboard,blackboard,aqua;exposure-compensation=0;exposure-compensation-step=0.166667;flash-mode=off;flash-mode-values=off;focal-length=2.95;focus-areas=(0,0,0,0,0);focus-distances=Infinity,Infinity,Infinity;focus-mode=auto;focus-mode-values=infinity,auto,macro,continuous-video,continuous-picture;horizontal-view-angle=63.8164;jpeg-quality=90;jpeg-thumbnail-height=240;jpeg-thumbnail-quality=90;jpeg-thumbnail-size-values=512x288,480x288,256x154,432x288,320x240,176x144,0x0;jpeg-thumbnail-width=320;max-exposure-compensation=12;max-num-detected-faces-hw=0;max-num-detected-faces-sw=0;max-num-focus-areas=1;max-num-metering-areas=1;max-zoom=99;metering-areas=(0,0,0,0,0);min-exposure-compensation=-12;picture-format=jpeg;picture-format-values=jpeg;pic
05-09 09:28:20.932: D/OpenCV_NativeCamera(1932): Supported Cameras: (null)
05-09 09:28:20.932: D/OpenCV_NativeCamera(1932): Supported Picture Sizes: 2592x1944,2048x1536,1920x1080,1600x1200,1280x768,1280x720,1024x768,800x600,800x480,720x480,640x480,352x288,320x240,176x144
05-09 09:28:20.932: D/OpenCV_NativeCamera(1932): Supported Picture Formats: jpeg
05-09 09:28:20.932: D/OpenCV_NativeCamera(1932): Supported Preview Sizes: 1920x1080,1280x768,1280x720,1024x768,800x600,800x480,720x480,640x480,352x288,320x240,176x144
05-09 09:28:20.932: D/OpenCV_NativeCamera(1932): Supported Preview Formats: yuv420p,yuv420sp,
05-09 09:28:20.932: D/OpenCV_NativeCamera(1932): Supported Preview Frame Rates: 15,24,30
05-09 09:28:20.932: D/OpenCV_NativeCamera(1932): Supported Thumbnail Sizes: 512x288,480x288,256x154,432x288,320x240,176x144,0x0
05-09 09:28:20.932: D/OpenCV_NativeCamera(1932): Supported Whitebalance Modes: auto,incandescent,fluorescent,warm-fluorescent,daylight,cloudy-daylight,twilight,shade
05-09 09:28:20.932: D/OpenCV_NativeCamera(1932): Supported Effects: none,mono,negative,solarize,sepia,posterize,whiteboard,blackboard,aqua
05-09 09:28:20.932: D/OpenCV_NativeCamera(1932): Supported Scene Modes: auto,landscape,snow,beach,sunset,night,portrait,sports,steadyphoto,candlelight,fireworks,party,night-portrait,theatre,action
05-09 09:28:20.932: D/OpenCV_NativeCamera(1932): Supported Focus Modes: infinity,auto,macro,continuous-video,continuous-picture
05-09 09:28:20.932: D/OpenCV_NativeCamera(1932): Supported Antibanding Options: off,60hz,50hz,auto
05-09 09:28:20.932: D/OpenCV_NativeCamera(1932): Supported Flash Modes: off
05-09 09:28:20.932: D/OpenCV_NativeCamera(1932): initCameraConnect: autofocus is set to mode "continuous-video"
05-09 09:28:20.942: D/OpenCV_NativeCamera(1932): initCameraConnect: preview format is set to yuv420sp
05-09 09:28:20.942: D/OpenCV_NativeCamera(1932): initCameraConnect: preview format is set to 640x480
05-09 09:28:20.942: D/OpenCV_NativeCamera(1932): Starting preview
05-09 09:28:21.012: D/OpenCV_NativeCamera(1932): Preview started successfully
05-09 09:28:21.012: E/OpenCV::camera(1932): calling (*pGetPropertyC)(0x76165f68, 2)
05-09 09:28:21.012: D/OpenCV_NativeCamera(1932): CameraHandler::getProperty(2)
05-09 09:28:21.012: D/OpenCV_NativeCamera(1932): CameraHandler::setProperty(0, 640.000000)
05-09 09:28:21.022: I/OpenCV_NativeCamera(1932): Params before set: [antibanding=auto;antibanding-values=off,60hz,50hz,auto;auto-exposure-lock=false;auto-exposure-lock-supported=true;auto-whitebalance-lock=false;auto-whitebalance-lock-supported=true;effect=none;effect-values=none,mono,negative,solarize,sepia,posterize,whiteboard,blackboard,aqua;exposure-compensation=0;exposure-compensation-step=0.166667;flash-mode=off;flash-mode-values=off;focal-length=2.95;focus-areas=(0,0,0,0,0);focus-distances=Infinity,Infinity,Infinity;focus-mode=continuous-video;focus-mode-values=infinity,auto,macro,continuous-video,continuous-picture;horizontal-view-angle=63.8164;jpeg-quality=90;jpeg-thumbnail-height=240;jpeg-thumbnail-quality=90;jpeg-thumbnail-size-values=512x288,480x288,256x154,432x288,320x240,176x144,0x0;jpeg-thumbnail-width=320;max-exposure-compensation=12;max-num-detected-faces-hw=0;max-num-detected-faces-sw=0;max-num-focus-areas=1;max-num-metering-areas=1;max-zoom=99;metering-areas=(0,0,0,0,0);min-exposure-compensation=-12;picture-format=jpeg;picture-format-val
05-09 09:28:21.022: I/OpenCV_NativeCamera(1932): Params after set: [antibanding=auto;antibanding-values=off,60hz,50hz,auto;auto-exposure-lock=false;auto-exposure-lock-supported=true;auto-whitebalance-lock=false;auto-whitebalance-lock-supported=true;effect=none;effect-values=none,mono,negative,solarize,sepia,posterize,whiteboard,blackboard,aqua;exposure-compensation=0;exposure-compensation-step=0.166667;flash-mode=off;flash-mode-values=off;focal-length=2.95;focus-areas=(0,0,0,0,0);focus-distances=Infinity,Infinity,Infinity;focus-mode=continuous-video;focus-mode-values=infinity,auto,macro,continuous-video,continuous-picture;horizontal-view-angle=63.8164;jpeg-quality=90;jpeg-thumbnail-height=240;jpeg-thumbnail-quality=90;jpeg-thumbnail-size-values=512x288,480x288,256x154,432x288,320x240,176x144,0x0;jpeg-thumbnail-width=320;max-exposure-compensation=12;max-num-detected-faces-hw=0;max-num-detected-faces-sw=0;max-num-focus-areas=1;max-num-metering-areas=1;max-zoom=99;metering-areas=(0,0,0,0,0);min-exposure-compensation=-12;picture-format=jpeg;picture-format-valu
05-09 09:28:21.022: D/OpenCV_NativeCamera(1932): CameraHandler::setProperty(1, 480.000000)
05-09 09:28:21.022: I/OpenCV_NativeCamera(1932): Params before set: [antibanding=auto;antibanding-values=off,60hz,50hz,auto;auto-exposure-lock=false;auto-exposure-lock-supported=true;auto-whitebalance-lock=false;auto-whitebalance-lock-supported=true;effect=none;effect-values=none,mono,negative,solarize,sepia,posterize,whiteboard,blackboard,aqua;exposure-compensation=0;exposure-compensation-step=0.166667;flash-mode=off;flash-mode-values=off;focal-length=2.95;focus-areas=(0,0,0,0,0);focus-distances=Infinity,Infinity,Infinity;focus-mode=continuous-video;focus-mode-values=infinity,auto,macro,continuous-video,continuous-picture;horizontal-view-angle=63.8164;jpeg-quality=90;jpeg-thumbnail-height=240;jpeg-thumbnail-quality=90;jpeg-thumbnail-size-values=512x288,480x288,256x154,432x288,320x240,176x144,0x0;jpeg-thumbnail-width=320;max-exposure-compensation=12;max-num-detected-faces-hw=0;max-num-detected-faces-sw=0;max-num-focus-areas=1;max-num-metering-areas=1;max-zoom=99;metering-areas=(0,0,0,0,0);min-exposure-compensation=-12;picture-format=jpeg;picture-format-val
05-09 09:28:21.022: I/OpenCV_NativeCamera(1932): Params after set: [antibanding=auto;antibanding-values=off,60hz,50hz,auto;auto-exposure-lock=false;auto-exposure-lock-supported=true;auto-whitebalance-lock=false;auto-whitebalance-lock-supported=true;effect=none;effect-values=none,mono,negative,solarize,sepia,posterize,whiteboard,blackboard,aqua;exposure-compensation=0;exposure-compensation-step=0.166667;flash-mode=off;flash-mode-values=off;focal-length=2.95;focus-areas=(0,0,0,0,0);focus-distances=Infinity,Infinity,Infinity;focus-mode=continuous-video;focus-mode-values=infinity,auto,macro,continuous-video,continuous-picture;horizontal-view-angle=63.8164;jpeg-quality=90;jpeg-thumbnail-height=240;jpeg-thumbnail-quality=90;jpeg-thumbnail-size-values=512x288,480x288,256x154,432x288,320x240,176x144,0x0;jpeg-thumbnail-width=320;max-exposure-compensation=12;max-num-detected-faces-hw=0;max-num-detected-faces-sw=0;max-num-focus-areas=1;max-num-metering-areas=1;max-zoom=99;metering-areas=(0,0,0,0,0);min-exposure-compensation=-12;picture-format=jpeg;picture-format-valu
05-09 09:28:21.022: I/GizmoSDK(1932): Camera initialized at resolution 640x480
05-09 09:28:21.022: D/GizmoSDK(1932): APP_CMD_INIT_WINDOW
05-09 09:28:21.022: D/OpenCV_NativeCamera(1932): CameraHandler::applyProperties()
05-09 09:28:28.550: I/dalvikvm(1932): threadid=3: reacting to signal 3
05-09 09:28:28.580: I/dalvikvm(1932): Wrote stack traces to '/data/anr/traces.txt'
05-09 09:28:31.022: E/OpenCV_NativeCamera(1932): applyProperties: failed setPreviewTexture call; camera might not work correctly
05-09 09:28:31.022: D/OpenCV_NativeCamera(1932): Starting preview
05-09 09:28:31.022: D/OpenCV_NativeCamera(1932): Preview started successfully
05-09 09:28:31.032: E/OpenCV::camera(1932): calling (*pGetPropertyC)(0x76165f68, 0)
05-09 09:28:31.032: D/OpenCV_NativeCamera(1932): CameraHandler::getProperty(0)
05-09 09:28:31.032: E/OpenCV::camera(1932): calling (*pGetPropertyC)(0x76165f68, 1)
05-09 09:28:31.032: D/OpenCV_NativeCamera(1932): CameraHandler::getProperty(1)
05-09 09:28:32.103: V/threaded_app(1932): WindowFocusChanged: 0x7615a3c0 -- 0
05-09 09:29:00.281: V/threaded_app(1932): Pause: 0x7615a3c0
```
##### Dmitry Retinskiy on 2014-05-29 06:27
```
Hi Alexander,
could you check this and say your opinion?
Thanks.
- Assignee changed from Anders Modén to Alexander Smorkalov
```
##### Dmitry Retinskiy on 2014-05-30 11:28
```
- Status changed from New to Open
```
|
1.0
|
Android native camera fails on 2.4.9 and 3.0.0 - Transferred from http://code.opencv.org/issues/3681
```
|| Anders Modén on 2014-05-09 07:12
|| Priority: Normal
|| Affected: branch 'master' (3.0-dev)
|| Category: android
|| Tracker: Bug
|| Difficulty: Hard
|| PR:
|| Platform: ARM / Android
```
Android native camera fails on 2.4.9 and 3.0.0
-----------
```
Nexxus 7 2013 model fails to activate native camera on 2.4.9 and 3.0.0 latest
applyProperties: failed setPreviewTexture call; ....
and
initCameraConnect: startPreview() fails.
Just use Tutorial Sample 1 and go into native mode
If you want to contact me I can help you debug it.
```
History
-------
##### Anders Modén on 2014-05-09 07:13
```
It should be 2014 model
This is the output from the native camera app as well
05-09 09:28:20.452: D/OpenCV::camera(1932): CvCapture_Android::CvCapture_Android(0)
05-09 09:28:20.452: D/OpenCV::camera(1932): Library name: libopencv_java.so
05-09 09:28:20.452: D/OpenCV::camera(1932): Library base address: 0x74eca000
05-09 09:28:20.462: V/threaded_app(1932): WindowFocusChanged: 0x7615a3c0 -- 1
05-09 09:28:20.472: D/OpenCV::camera(1932): Libraries folder found: /data/app-lib/Saab.Combitech.AR.SDK-1/
05-09 09:28:20.472: D/OpenCV::camera(1932): CameraWrapperConnector::connectToLib: folderPath=/data/app-lib/Saab.Combitech.AR.SDK-1/
05-09 09:28:20.472: E/OpenCV::camera(1932): ||libnative_camera_r2.3.3.so
05-09 09:28:20.472: E/OpenCV::camera(1932): ||libnative_camera_r4.0.3.so
05-09 09:28:20.472: E/OpenCV::camera(1932): ||libnative_camera_r4.1.1.so
05-09 09:28:20.472: E/OpenCV::camera(1932): ||libnative_camera_r4.2.0.so
05-09 09:28:20.472: E/OpenCV::camera(1932): ||libnative_camera_r2.2.0.so
05-09 09:28:20.472: E/OpenCV::camera(1932): ||libnative_camera_r3.0.1.so
05-09 09:28:20.472: E/OpenCV::camera(1932): ||libnative_camera_r4.0.0.so
05-09 09:28:20.472: E/OpenCV::camera(1932): ||libnative_camera_r4.3.0.so
05-09 09:28:20.472: E/OpenCV::camera(1932): ||libnative_camera_r4.4.0.so
05-09 09:28:20.472: D/OpenCV::camera(1932): try to load library 'libnative_camera_r4.4.0.so'
05-09 09:28:20.482: D/OpenCV::camera(1932): Loaded library '/data/app-lib/Saab.Combitech.AR.SDK-1/libnative_camera_r4.4.0.so'
05-09 09:28:20.482: D/OpenCV_NativeCamera(1932): CameraHandler::initCameraConnect(0x75596609, 0, 0x76169c68, 0x0)
05-09 09:28:20.482: D/OpenCV_NativeCamera(1932): Current process name for camera init: Saab.Combitech.AR.SDK
05-09 09:28:20.932: D/OpenCV_NativeCamera(1932): Instantiated new CameraHandler (0x75596609, 0x76169c68)
05-09 09:28:20.932: I/OpenCV_NativeCamera(1932): initCameraConnect: [antibanding=auto;antibanding-values=off,60hz,50hz,auto;auto-exposure-lock=false;auto-exposure-lock-supported=true;auto-whitebalance-lock=false;auto-whitebalance-lock-supported=true;effect=none;effect-values=none,mono,negative,solarize,sepia,posterize,whiteboard,blackboard,aqua;exposure-compensation=0;exposure-compensation-step=0.166667;flash-mode=off;flash-mode-values=off;focal-length=2.95;focus-areas=(0,0,0,0,0);focus-distances=Infinity,Infinity,Infinity;focus-mode=auto;focus-mode-values=infinity,auto,macro,continuous-video,continuous-picture;horizontal-view-angle=63.8164;jpeg-quality=90;jpeg-thumbnail-height=240;jpeg-thumbnail-quality=90;jpeg-thumbnail-size-values=512x288,480x288,256x154,432x288,320x240,176x144,0x0;jpeg-thumbnail-width=320;max-exposure-compensation=12;max-num-detected-faces-hw=0;max-num-detected-faces-sw=0;max-num-focus-areas=1;max-num-metering-areas=1;max-zoom=99;metering-areas=(0,0,0,0,0);min-exposure-compensation=-12;picture-format=jpeg;picture-format-values=jpeg;pic
05-09 09:28:20.932: D/OpenCV_NativeCamera(1932): Supported Cameras: (null)
05-09 09:28:20.932: D/OpenCV_NativeCamera(1932): Supported Picture Sizes: 2592x1944,2048x1536,1920x1080,1600x1200,1280x768,1280x720,1024x768,800x600,800x480,720x480,640x480,352x288,320x240,176x144
05-09 09:28:20.932: D/OpenCV_NativeCamera(1932): Supported Picture Formats: jpeg
05-09 09:28:20.932: D/OpenCV_NativeCamera(1932): Supported Preview Sizes: 1920x1080,1280x768,1280x720,1024x768,800x600,800x480,720x480,640x480,352x288,320x240,176x144
05-09 09:28:20.932: D/OpenCV_NativeCamera(1932): Supported Preview Formats: yuv420p,yuv420sp,
05-09 09:28:20.932: D/OpenCV_NativeCamera(1932): Supported Preview Frame Rates: 15,24,30
05-09 09:28:20.932: D/OpenCV_NativeCamera(1932): Supported Thumbnail Sizes: 512x288,480x288,256x154,432x288,320x240,176x144,0x0
05-09 09:28:20.932: D/OpenCV_NativeCamera(1932): Supported Whitebalance Modes: auto,incandescent,fluorescent,warm-fluorescent,daylight,cloudy-daylight,twilight,shade
05-09 09:28:20.932: D/OpenCV_NativeCamera(1932): Supported Effects: none,mono,negative,solarize,sepia,posterize,whiteboard,blackboard,aqua
05-09 09:28:20.932: D/OpenCV_NativeCamera(1932): Supported Scene Modes: auto,landscape,snow,beach,sunset,night,portrait,sports,steadyphoto,candlelight,fireworks,party,night-portrait,theatre,action
05-09 09:28:20.932: D/OpenCV_NativeCamera(1932): Supported Focus Modes: infinity,auto,macro,continuous-video,continuous-picture
05-09 09:28:20.932: D/OpenCV_NativeCamera(1932): Supported Antibanding Options: off,60hz,50hz,auto
05-09 09:28:20.932: D/OpenCV_NativeCamera(1932): Supported Flash Modes: off
05-09 09:28:20.932: D/OpenCV_NativeCamera(1932): initCameraConnect: autofocus is set to mode "continuous-video"
05-09 09:28:20.942: D/OpenCV_NativeCamera(1932): initCameraConnect: preview format is set to yuv420sp
05-09 09:28:20.942: D/OpenCV_NativeCamera(1932): initCameraConnect: preview format is set to 640x480
05-09 09:28:20.942: D/OpenCV_NativeCamera(1932): Starting preview
05-09 09:28:21.012: D/OpenCV_NativeCamera(1932): Preview started successfully
05-09 09:28:21.012: E/OpenCV::camera(1932): calling (*pGetPropertyC)(0x76165f68, 2)
05-09 09:28:21.012: D/OpenCV_NativeCamera(1932): CameraHandler::getProperty(2)
05-09 09:28:21.012: D/OpenCV_NativeCamera(1932): CameraHandler::setProperty(0, 640.000000)
05-09 09:28:21.022: I/OpenCV_NativeCamera(1932): Params before set: [antibanding=auto;antibanding-values=off,60hz,50hz,auto;auto-exposure-lock=false;auto-exposure-lock-supported=true;auto-whitebalance-lock=false;auto-whitebalance-lock-supported=true;effect=none;effect-values=none,mono,negative,solarize,sepia,posterize,whiteboard,blackboard,aqua;exposure-compensation=0;exposure-compensation-step=0.166667;flash-mode=off;flash-mode-values=off;focal-length=2.95;focus-areas=(0,0,0,0,0);focus-distances=Infinity,Infinity,Infinity;focus-mode=continuous-video;focus-mode-values=infinity,auto,macro,continuous-video,continuous-picture;horizontal-view-angle=63.8164;jpeg-quality=90;jpeg-thumbnail-height=240;jpeg-thumbnail-quality=90;jpeg-thumbnail-size-values=512x288,480x288,256x154,432x288,320x240,176x144,0x0;jpeg-thumbnail-width=320;max-exposure-compensation=12;max-num-detected-faces-hw=0;max-num-detected-faces-sw=0;max-num-focus-areas=1;max-num-metering-areas=1;max-zoom=99;metering-areas=(0,0,0,0,0);min-exposure-compensation=-12;picture-format=jpeg;picture-format-val
05-09 09:28:21.022: I/OpenCV_NativeCamera(1932): Params after set: [antibanding=auto;antibanding-values=off,60hz,50hz,auto;auto-exposure-lock=false;auto-exposure-lock-supported=true;auto-whitebalance-lock=false;auto-whitebalance-lock-supported=true;effect=none;effect-values=none,mono,negative,solarize,sepia,posterize,whiteboard,blackboard,aqua;exposure-compensation=0;exposure-compensation-step=0.166667;flash-mode=off;flash-mode-values=off;focal-length=2.95;focus-areas=(0,0,0,0,0);focus-distances=Infinity,Infinity,Infinity;focus-mode=continuous-video;focus-mode-values=infinity,auto,macro,continuous-video,continuous-picture;horizontal-view-angle=63.8164;jpeg-quality=90;jpeg-thumbnail-height=240;jpeg-thumbnail-quality=90;jpeg-thumbnail-size-values=512x288,480x288,256x154,432x288,320x240,176x144,0x0;jpeg-thumbnail-width=320;max-exposure-compensation=12;max-num-detected-faces-hw=0;max-num-detected-faces-sw=0;max-num-focus-areas=1;max-num-metering-areas=1;max-zoom=99;metering-areas=(0,0,0,0,0);min-exposure-compensation=-12;picture-format=jpeg;picture-format-valu
05-09 09:28:21.022: D/OpenCV_NativeCamera(1932): CameraHandler::setProperty(1, 480.000000)
05-09 09:28:21.022: I/OpenCV_NativeCamera(1932): Params before set: [antibanding=auto;antibanding-values=off,60hz,50hz,auto;auto-exposure-lock=false;auto-exposure-lock-supported=true;auto-whitebalance-lock=false;auto-whitebalance-lock-supported=true;effect=none;effect-values=none,mono,negative,solarize,sepia,posterize,whiteboard,blackboard,aqua;exposure-compensation=0;exposure-compensation-step=0.166667;flash-mode=off;flash-mode-values=off;focal-length=2.95;focus-areas=(0,0,0,0,0);focus-distances=Infinity,Infinity,Infinity;focus-mode=continuous-video;focus-mode-values=infinity,auto,macro,continuous-video,continuous-picture;horizontal-view-angle=63.8164;jpeg-quality=90;jpeg-thumbnail-height=240;jpeg-thumbnail-quality=90;jpeg-thumbnail-size-values=512x288,480x288,256x154,432x288,320x240,176x144,0x0;jpeg-thumbnail-width=320;max-exposure-compensation=12;max-num-detected-faces-hw=0;max-num-detected-faces-sw=0;max-num-focus-areas=1;max-num-metering-areas=1;max-zoom=99;metering-areas=(0,0,0,0,0);min-exposure-compensation=-12;picture-format=jpeg;picture-format-val
05-09 09:28:21.022: I/OpenCV_NativeCamera(1932): Params after set: [antibanding=auto;antibanding-values=off,60hz,50hz,auto;auto-exposure-lock=false;auto-exposure-lock-supported=true;auto-whitebalance-lock=false;auto-whitebalance-lock-supported=true;effect=none;effect-values=none,mono,negative,solarize,sepia,posterize,whiteboard,blackboard,aqua;exposure-compensation=0;exposure-compensation-step=0.166667;flash-mode=off;flash-mode-values=off;focal-length=2.95;focus-areas=(0,0,0,0,0);focus-distances=Infinity,Infinity,Infinity;focus-mode=continuous-video;focus-mode-values=infinity,auto,macro,continuous-video,continuous-picture;horizontal-view-angle=63.8164;jpeg-quality=90;jpeg-thumbnail-height=240;jpeg-thumbnail-quality=90;jpeg-thumbnail-size-values=512x288,480x288,256x154,432x288,320x240,176x144,0x0;jpeg-thumbnail-width=320;max-exposure-compensation=12;max-num-detected-faces-hw=0;max-num-detected-faces-sw=0;max-num-focus-areas=1;max-num-metering-areas=1;max-zoom=99;metering-areas=(0,0,0,0,0);min-exposure-compensation=-12;picture-format=jpeg;picture-format-valu
05-09 09:28:21.022: I/GizmoSDK(1932): Camera initialized at resolution 640x480
05-09 09:28:21.022: D/GizmoSDK(1932): APP_CMD_INIT_WINDOW
05-09 09:28:21.022: D/OpenCV_NativeCamera(1932): CameraHandler::applyProperties()
05-09 09:28:28.550: I/dalvikvm(1932): threadid=3: reacting to signal 3
05-09 09:28:28.580: I/dalvikvm(1932): Wrote stack traces to '/data/anr/traces.txt'
05-09 09:28:31.022: E/OpenCV_NativeCamera(1932): applyProperties: failed setPreviewTexture call; camera might not work correctly
05-09 09:28:31.022: D/OpenCV_NativeCamera(1932): Starting preview
05-09 09:28:31.022: D/OpenCV_NativeCamera(1932): Preview started successfully
05-09 09:28:31.032: E/OpenCV::camera(1932): calling (*pGetPropertyC)(0x76165f68, 0)
05-09 09:28:31.032: D/OpenCV_NativeCamera(1932): CameraHandler::getProperty(0)
05-09 09:28:31.032: E/OpenCV::camera(1932): calling (*pGetPropertyC)(0x76165f68, 1)
05-09 09:28:31.032: D/OpenCV_NativeCamera(1932): CameraHandler::getProperty(1)
05-09 09:28:32.103: V/threaded_app(1932): WindowFocusChanged: 0x7615a3c0 -- 0
05-09 09:29:00.281: V/threaded_app(1932): Pause: 0x7615a3c0
```
##### Dmitry Retinskiy on 2014-05-29 06:27
```
Hi Alexander,
could you check this and say your opinion?
Thanks.
- Assignee changed from Anders Modén to Alexander Smorkalov
```
##### Dmitry Retinskiy on 2014-05-30 11:28
```
- Status changed from New to Open
```
|
non_process
|
android native camera fails on and transferred from anders modén on priority normal affected branch master dev category android tracker bug difficulty hard pr platform arm android android native camera fails on and nexxus model fails to activate native camera on and latest applyproperties failed setpreviewtexture call and initcameraconnect startpreview fails just use tutorial sample and go into native mode if you want to cantact me i can help you debug it history anders modén on it should be model this is the output from the native camera app as well d opencv camera cvcapture android cvcapture android d opencv camera library name libopencv java so d opencv camera library base address v threaded app windowfocuschanged d opencv camera libraries folder found data app lib saab combitech ar sdk d opencv camera camerawrapperconnector connecttolib folderpath data app lib saab combitech ar sdk e opencv camera libnative camera so e opencv camera libnative camera so e opencv camera libnative camera so e opencv camera libnative camera so e opencv camera libnative camera so e opencv camera libnative camera so e opencv camera libnative camera so e opencv camera libnative camera so e opencv camera libnative camera so d opencv camera try to load library libnative camera so d opencv camera loaded library data app lib saab combitech ar sdk libnative camera so d opencv nativecamera camerahandler initcameraconnect d opencv nativecamera current process name for camera init saab combitech ar sdk d opencv nativecamera instantiated new camerahandler i opencv nativecamera initcameraconnect antibanding auto antibanding values off auto auto exposure lock false auto exposure lock supported true auto whitebalance lock false auto whitebalance lock supported true effect none effect values none mono negative solarize sepia posterize whiteboard blackboard aqua exposure compensation exposure compensation step flash mode off flash mode values off focal length focus areas focus distances infinity infinity infinity focus mode auto focus mode values infinity auto macro continuous video continuous picture horizontal view angle jpeg quality jpeg thumbnail height jpeg thumbnail quality jpeg thumbnail size values jpeg thumbnail width max exposure compensation max num detected faces hw max num detected faces sw max num focus areas max num metering areas max zoom metering areas min exposure compensation picture format jpeg picture format values jpeg pic d opencv nativecamera supported cameras null d opencv nativecamera supported picture sizes d opencv nativecamera supported picture formats jpeg d opencv nativecamera supported preview sizes d opencv nativecamera supported preview formats d opencv nativecamera supported preview frame rates d opencv nativecamera supported thumbnail sizes d opencv nativecamera supported whitebalance modes auto incandescent fluorescent warm fluorescent daylight cloudy daylight twilight shade d opencv nativecamera supported effects none mono negative solarize sepia posterize whiteboard blackboard aqua d opencv nativecamera supported scene modes auto landscape snow beach sunset night portrait sports steadyphoto candlelight fireworks party night portrait theatre action d opencv nativecamera supported focus modes infinity auto macro continuous video continuous picture d opencv nativecamera supported antibanding options off auto d opencv nativecamera supported flash modes off d opencv nativecamera initcameraconnect autofocus is set to mode continuous video d opencv nativecamera initcameraconnect preview 
format is set to d opencv nativecamera initcameraconnect preview format is set to d opencv nativecamera starting preview d opencv nativecamera preview started successfully e opencv camera calling pgetpropertyc d opencv nativecamera camerahandler getproperty d opencv nativecamera camerahandler setproperty i opencv nativecamera params before set antibanding auto antibanding values off auto auto exposure lock false auto exposure lock supported true auto whitebalance lock false auto whitebalance lock supported true effect none effect values none mono negative solarize sepia posterize whiteboard blackboard aqua exposure compensation exposure compensation step flash mode off flash mode values off focal length focus areas focus distances infinity infinity infinity focus mode continuous video focus mode values infinity auto macro continuous video continuous picture horizontal view angle jpeg quality jpeg thumbnail height jpeg thumbnail quality jpeg thumbnail size values jpeg thumbnail width max exposure compensation max num detected faces hw max num detected faces sw max num focus areas max num metering areas max zoom metering areas min exposure compensation picture format jpeg picture format val i opencv nativecamera params after set antibanding auto antibanding values off auto auto exposure lock false auto exposure lock supported true auto whitebalance lock false auto whitebalance lock supported true effect none effect values none mono negative solarize sepia posterize whiteboard blackboard aqua exposure compensation exposure compensation step flash mode off flash mode values off focal length focus areas focus distances infinity infinity infinity focus mode continuous video focus mode values infinity auto macro continuous video continuous picture horizontal view angle jpeg quality jpeg thumbnail height jpeg thumbnail quality jpeg thumbnail size values jpeg thumbnail width max exposure compensation max num detected faces hw max num detected faces sw max num focus areas max num metering areas max zoom metering areas min exposure compensation picture format jpeg picture format valu d opencv nativecamera camerahandler setproperty i opencv nativecamera params before set antibanding auto antibanding values off auto auto exposure lock false auto exposure lock supported true auto whitebalance lock false auto whitebalance lock supported true effect none effect values none mono negative solarize sepia posterize whiteboard blackboard aqua exposure compensation exposure compensation step flash mode off flash mode values off focal length focus areas focus distances infinity infinity infinity focus mode continuous video focus mode values infinity auto macro continuous video continuous picture horizontal view angle jpeg quality jpeg thumbnail height jpeg thumbnail quality jpeg thumbnail size values jpeg thumbnail width max exposure compensation max num detected faces hw max num detected faces sw max num focus areas max num metering areas max zoom metering areas min exposure compensation picture format jpeg picture format val i opencv nativecamera params after set antibanding auto antibanding values off auto auto exposure lock false auto exposure lock supported true auto whitebalance lock false auto whitebalance lock supported true effect none effect values none mono negative solarize sepia posterize whiteboard blackboard aqua exposure compensation exposure compensation step flash mode off flash mode values off focal length focus areas focus distances infinity infinity infinity focus mode continuous video focus 
mode values infinity auto macro continuous video continuous picture horizontal view angle jpeg quality jpeg thumbnail height jpeg thumbnail quality jpeg thumbnail size values jpeg thumbnail width max exposure compensation max num detected faces hw max num detected faces sw max num focus areas max num metering areas max zoom metering areas min exposure compensation picture format jpeg picture format valu i gizmosdk camera initialized at resolution d gizmosdk app cmd init window d opencv nativecamera camerahandler applyproperties i dalvikvm threadid reacting to signal i dalvikvm wrote stack traces to data anr traces txt e opencv nativecamera applyproperties failed setpreviewtexture call camera might not work correctly d opencv nativecamera starting preview d opencv nativecamera preview started successfully e opencv camera calling pgetpropertyc d opencv nativecamera camerahandler getproperty e opencv camera calling pgetpropertyc d opencv nativecamera camerahandler getproperty v threaded app windowfocuschanged v threaded app pause dmitry retinskiy on hi alexander could you check this and say your opinion thanks assignee changed from anders modén to alexander smorkalov dmitry retinskiy on status changed from new to open
| 0
|
16,801
| 22,045,017,539
|
IssuesEvent
|
2022-05-29 23:21:32
|
qgis/QGIS
|
https://api.github.com/repos/qgis/QGIS
|
closed
|
"Split Vector Layer" processing algorithm: the FILE_TYPE parameter is not optional while it should be and it doesn't respect the "Default output vector layer extension" setting
|
Processing Bug
|
### What is the bug or the crash?
When the Split Vector Layer algorithm was ported from Python to C++ with https://github.com/qgis/QGIS/pull/36372 by @alexbruy, a supposedly optional `FILE_TYPE` advanced parameter was added to the algorithm ([[needs-docs] add optional parameter for output file type to the vector split algorithm](https://github.com/qgis/QGIS/commit/b99fe4e51f2cd79ad8922e765ad4d7125216259e)).
It seems to me the `FILE_TYPE` parameter is actually not optional.
In fact, when the algorithm is called from the Python console without such parameter, then the following error is displayed:
`processing.run("native:splitvectorlayer", {'INPUT':'C:\\path\\to\\input_layer','FIELD':'ID','OUTPUT':'C:\\path\\to\\output\\folder'})`
```
...
_core.QgsProcessingException: Unable to execute algorithm
Incorrect parameter value for FILE_TYPE
```
Setting the FILE_TYPE parameter to NULL, the algorithm runs, but the output file format is always GPKG regardless of the "Default output vector layer extension" (Processing->General) setting (e.g. 'shp'), which the algorithm should respect by design (behaviour introduced by @nyalldawson with https://github.com/qgis/QGIS/pull/31973).
### Steps to reproduce the issue
- open the "Split Vector Layer" processing algorithm and see that the "Output file type" (`FILE_TYPE`) advanced parameter is not optional
or
- try to run the algorithm from the Python console (using `processing.run("native:splitvectorlayer", {'INPUT':'C:\\path\\to\\input_layer','FIELD':'ID','OUTPUT':'C:\\path\\to\\output\\folder'})`) and see that the algorithm doesn't run and an error is displayed
or
- set the "Default output vector layer extension" setting to 'shp', run the algorithm from the Python console (using `processing.run("native:splitvectorlayer", {'INPUT':'C:\\path\\to\\input_layer','FIELD':'ID','FILE_TYPE':NULL,'OUTPUT':'C:\\path\\to\\output\\folder'})`) and see that the algorithm output vector layers are in the GPKG format instead of SHP format.
### Versions
QGIS 3.24.1 - QGIS 3.22.5
### Supported QGIS version
- [X] I'm running a supported QGIS version according to the roadmap.
### New profile
- [X] I tried with a new QGIS profile
### Additional context
_No response_
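A minimal sketch (not taken from the report) of working around the failure from the Python console by always passing FILE_TYPE explicitly; the paths are the report's placeholders and the value 0 is an assumed enum index, so check the algorithm help on your QGIS build for the actual format mapping.
```
# Run inside the QGIS Python console, where the processing module is available.
import processing

params = {
    "INPUT": "C:\\path\\to\\input_layer",
    "FIELD": "ID",
    # Assumed enum index for the desired output format; the FILE_TYPE value
    # to format mapping is not verified here.
    "FILE_TYPE": 0,
    "OUTPUT": "C:\\path\\to\\output\\folder",
}
result = processing.run("native:splitvectorlayer", params)
print(result)
```
The point is simply that FILE_TYPE is supplied, matching the report's finding that omitting it raises "Incorrect parameter value for FILE_TYPE".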
|
1.0
|
"Split Vector Layer" processing algorithm: the FILE_TYPE parameter is not optional while it should be and it doesn't respect the "Default output vector layer extension" setting - ### What is the bug or the crash?
When the Split Vector Layer algorithm was ported from Python to C++ with https://github.com/qgis/QGIS/pull/36372 by @alexbruy, a supposedly optional `FILE_TYPE` advanced parameter was added to the algorithm ([[needs-docs] add optional parameter for output file type to the vector split algorithm](https://github.com/qgis/QGIS/commit/b99fe4e51f2cd79ad8922e765ad4d7125216259e)).
It seems to me the `FILE_TYPE` parameter is actually not optional.
In fact, when the algorithm is called from the Python console without such parameter, then the following error is displayed:
`processing.run("native:splitvectorlayer", {'INPUT':'C:\\path\\to\\input_layer','FIELD':'ID','OUTPUT':'C:\\path\\to\\output\\folder'})`
```
...
_core.QgsProcessingException: Unable to execute algorithm
Incorrect parameter value for FILE_TYPE
```
Setting the FILE_TYPE parameter to NULL, the algorithm runs, but the output file format is always GPKG regardless of the "Default output vector layer extension" (Processing->General) setting (e.g. 'shp'), which the algorithm should respect by design (behaviour introduced by @nyalldawson with https://github.com/qgis/QGIS/pull/31973).
### Steps to reproduce the issue
- open the "Split Vector Layer" processing algorithm and see that the "Output file type" (`FILE_TYPE`) advanced parameter is not optional
or
- try to run the algorithm from the Python console (using `processing.run("native:splitvectorlayer", {'INPUT':'C:\\path\\to\\input_layer','FIELD':'ID','OUTPUT':'C:\\path\\to\\output\\folder'})`) and see that the algorithm doesn't run and an error is displayed
or
- set the "Default output vector layer extension" setting to 'shp', run the algorithm from the Python console (using `processing.run("native:splitvectorlayer", {'INPUT':'C:\\path\\to\\input_layer','FIELD':'ID','FILE_TYPE':NULL,'OUTPUT':'C:\\path\\to\\output\\folder'})`) and see that the algorithm output vector layers are in the GPKG format instead of SHP format.
### Versions
QGIS 3.24.1 - QGIS 3.22.5
### Supported QGIS version
- [X] I'm running a supported QGIS version according to the roadmap.
### New profile
- [X] I tried with a new QGIS profile
### Additional context
_No response_
|
process
|
split vector layer processing algorithm the file type parameter is not optional while it should be and it doesn t respect the default output vector layer extension setting what is the bug or the crash when the split vector layer algorithm was ported from python to c with by alexbruy a supposedly optional file type advanced parameter was added to the algorithm add optional parameter for output file type to the vector split algorithm it seems to me the file type parameter is actually not optional in fact when the algorithm is called from the python console without such parameter then the following error is displayed processing run native splitvectorlayer input c path to input layer field id output c path to output folder core qgsprocessingexception unable to execute algorithm incorrect parameter value for file type setting the file type parameter to null the algorithms runs but the output files format is always gpkg regardless of the default output vector layer extension processing general setting e g shp as it should be by design behaviour introduced by nyalldawson with steps to reproduce the issue open the split vector layer processing algorithm and see that the output file type file type advanced parameter is not optional or try to run the algorithm from the python console using processing run native splitvectorlayer input c path to input layer field id output c path to output folder and see that the algorithm doesn t run and an error is displayed or set the default output vector layer extension setting to shp run the algorithm from the python console using processing run native splitvectorlayer input c path to input layer field id file type null output c path to output folder and see that the algorithm output vector layers are in the gpkg format instead of shp format versions qgis qgis supported qgis version i m running a supported qgis version according to the roadmap new profile i tried with a new qgis profile additional context no response
| 1
|
455,230
| 13,113,762,661
|
IssuesEvent
|
2020-08-05 06:19:14
|
StrangeLoopGames/EcoIssues
|
https://api.github.com/repos/StrangeLoopGames/EcoIssues
|
opened
|
[0.9.0 develop-22] Chat is broken
|
Priority: Critical Priority: High
|
- press 'C', chat is empty:
```
NullReferenceException: Object reference not set to an instance of an object.
at Assets.UI.Scripts.Utilities.SparseScrollRect.Init (Assets.UI.Scripts.Utilities.SparseScrollRect+CreateAtIndexFunc createAtIndex) [0x00000] in <00000000000000000000000000000000>:0
at Assets.UI.Scripts.ChatLogUI.<OnEnable>b__10_0 () [0x00000] in <00000000000000000000000000000000>:0
at UnityEngine.LowLevel.PlayerLoopSystem+UpdateFunction.Invoke () [0x00000] in <00000000000000000000000000000000>:0
at UnityEngine.Object.Instantiate[T] (T original) [0x00000] in <00000000000000000000000000000000>:0
at UI.UIManager.Open (System.String guiName, Eco.Shared.Serialization.BSONObject bson, UI.UILayer layer) [0x00000] in <00000000000000000000000000000000>:0
at UI.UIManager.Toggle (System.String guiName, Eco.Shared.Serialization.BSONObject bson, UI.UILayer layer) [0x00000] in <00000000000000000000000000000000>:0
at UI.UIManager.Toggle (System.String guiName, UI.UILayer layer) [0x00000] in <00000000000000000000000000000000>:0
at UI.ActionbarStackUI.Update () [0x00000] in <00000000000000000000000000000000>:0
UnityEngine.Object:Instantiate(T)
UI.UIManager:Open(String, BSONObject, UILayer)
UI.UIManager:Toggle(String, BSONObject, UILayer)
UI.UIManager:Toggle(String, UILayer)
UI.ActionbarStackUI:Update()
```
|
2.0
|
[0.9.0 develop-22] Chat is broken - - press 'C', chat is empty:
```
NullReferenceException: Object reference not set to an instance of an object.
at Assets.UI.Scripts.Utilities.SparseScrollRect.Init (Assets.UI.Scripts.Utilities.SparseScrollRect+CreateAtIndexFunc createAtIndex) [0x00000] in <00000000000000000000000000000000>:0
at Assets.UI.Scripts.ChatLogUI.<OnEnable>b__10_0 () [0x00000] in <00000000000000000000000000000000>:0
at UnityEngine.LowLevel.PlayerLoopSystem+UpdateFunction.Invoke () [0x00000] in <00000000000000000000000000000000>:0
at UnityEngine.Object.Instantiate[T] (T original) [0x00000] in <00000000000000000000000000000000>:0
at UI.UIManager.Open (System.String guiName, Eco.Shared.Serialization.BSONObject bson, UI.UILayer layer) [0x00000] in <00000000000000000000000000000000>:0
at UI.UIManager.Toggle (System.String guiName, Eco.Shared.Serialization.BSONObject bson, UI.UILayer layer) [0x00000] in <00000000000000000000000000000000>:0
at UI.UIManager.Toggle (System.String guiName, UI.UILayer layer) [0x00000] in <00000000000000000000000000000000>:0
at UI.ActionbarStackUI.Update () [0x00000] in <00000000000000000000000000000000>:0
UnityEngine.Object:Instantiate(T)
UI.UIManager:Open(String, BSONObject, UILayer)
UI.UIManager:Toggle(String, BSONObject, UILayer)
UI.UIManager:Toggle(String, UILayer)
UI.ActionbarStackUI:Update()
```
|
non_process
|
chat is broken press c chat is empty nullreferenceexception object reference not set to an instance of an object at assets ui scripts utilities sparsescrollrect init assets ui scripts utilities sparsescrollrect createatindexfunc createatindex in at assets ui scripts chatlogui b in at unityengine lowlevel playerloopsystem updatefunction invoke in at unityengine object instantiate t original in at ui uimanager open system string guiname eco shared serialization bsonobject bson ui uilayer layer in at ui uimanager toggle system string guiname eco shared serialization bsonobject bson ui uilayer layer in at ui uimanager toggle system string guiname ui uilayer layer in at ui actionbarstackui update in unityengine object instantiate t ui uimanager open string bsonobject uilayer ui uimanager toggle string bsonobject uilayer ui uimanager toggle string uilayer ui actionbarstackui update
| 0
|
22,318
| 30,880,979,851
|
IssuesEvent
|
2023-08-03 17:34:02
|
metabase/metabase
|
https://api.github.com/repos/metabase/metabase
|
closed
|
[MLv2] Join LHS/RHS columns have less temporal unit options that
|
.Backend .metabase-lib .Team/QueryProcessor :hammer_and_wrench:
|
`available-temporal-buckets` returns fewer options for join condition LHS/RHS columns than the MLv1 version offers
<details>
<summary>MLv1</summary>
<img width="376" alt="CleanShot 2023-08-03 at 18 31 17@2x" src="https://github.com/metabase/metabase/assets/17258145/91e7e794-6025-41db-8cf8-161ad8776f22">
</details>
<details>
<summary>MLv2</summary>
<img width="518" alt="CleanShot 2023-08-03 at 18 30 37@2x" src="https://github.com/metabase/metabase/assets/17258145/23078cb9-6641-478a-9261-2dfb45fe0133">
</details>
|
1.0
|
[MLv2] Join LHS/RHS columns have less temporal unit options that - `available-temporal-buckets` returns fewer options for join condition LHS/RHS columns than the MLv1 version offers
<details>
<summary>MLv1</summary>
<img width="376" alt="CleanShot 2023-08-03 at 18 31 17@2x" src="https://github.com/metabase/metabase/assets/17258145/91e7e794-6025-41db-8cf8-161ad8776f22">
</details>
<details>
<summary>MLv2</summary>
<img width="518" alt="CleanShot 2023-08-03 at 18 30 37@2x" src="https://github.com/metabase/metabase/assets/17258145/23078cb9-6641-478a-9261-2dfb45fe0133">
</details>
|
process
|
join lhs rhs columns have less temporal unit options that available temporal buckets returns less options for join condition lhs rhs columns than we have on the version img width alt cleanshot at src img width alt cleanshot at src
| 1
|
242,920
| 26,277,869,820
|
IssuesEvent
|
2023-01-07 01:22:28
|
murthy1979/hackazon
|
https://api.github.com/repos/murthy1979/hackazon
|
closed
|
CVE-2017-16654 (High) detected in symfony/intl-v2.6.1 - autoclosed
|
security vulnerability
|
## CVE-2017-16654 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>symfony/intl-v2.6.1</b></p></summary>
<p>A PHP replacement layer for the C intl extension that also provides access to the localization data of the ICU library.</p>
<p>
Dependency Hierarchy:
- symfony/form-v2.5.7 (Root Library)
- :x: **symfony/intl-v2.6.1** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/murthy1979/hackazon/commit/7a5c1fb6205b5dacb816770c95cda9299805eb02">7a5c1fb6205b5dacb816770c95cda9299805eb02</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
An issue was discovered in Symfony before 2.7.38, 2.8.31, 3.2.14, 3.3.13, 3.4-BETA5, and 4.0-BETA5. The Intl component includes various bundle readers that are used to read resource bundles from the local filesystem. The read() methods of these classes use a path and a locale to determine the language bundle to retrieve. The locale argument value is commonly retrieved from untrusted user input (like a URL parameter). An attacker can use this argument to navigate to arbitrary directories via the dot-dot-slash attack, aka Directory Traversal.
<p>Publish Date: 2018-08-06
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2017-16654>CVE-2017-16654</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2017-16654">https://nvd.nist.gov/vuln/detail/CVE-2017-16654</a></p>
<p>Release Date: 2018-08-06</p>
<p>Fix Resolution: 2.7.38,2.8.31,3.2.14,3.3.13,3.4-BETA5,4.0-BETA5</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2017-16654 (High) detected in symfony/intl-v2.6.1 - autoclosed - ## CVE-2017-16654 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>symfony/intl-v2.6.1</b></p></summary>
<p>A PHP replacement layer for the C intl extension that also provides access to the localization data of the ICU library.</p>
<p>
Dependency Hierarchy:
- symfony/form-v2.5.7 (Root Library)
- :x: **symfony/intl-v2.6.1** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/murthy1979/hackazon/commit/7a5c1fb6205b5dacb816770c95cda9299805eb02">7a5c1fb6205b5dacb816770c95cda9299805eb02</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
An issue was discovered in Symfony before 2.7.38, 2.8.31, 3.2.14, 3.3.13, 3.4-BETA5, and 4.0-BETA5. The Intl component includes various bundle readers that are used to read resource bundles from the local filesystem. The read() methods of these classes use a path and a locale to determine the language bundle to retrieve. The locale argument value is commonly retrieved from untrusted user input (like a URL parameter). An attacker can use this argument to navigate to arbitrary directories via the dot-dot-slash attack, aka Directory Traversal.
<p>Publish Date: 2018-08-06
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2017-16654>CVE-2017-16654</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2017-16654">https://nvd.nist.gov/vuln/detail/CVE-2017-16654</a></p>
<p>Release Date: 2018-08-06</p>
<p>Fix Resolution: 2.7.38,2.8.31,3.2.14,3.3.13,3.4-BETA5,4.0-BETA5</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve high detected in symfony intl autoclosed cve high severity vulnerability vulnerable library symfony intl a php replacement layer for the c intl extension that also provides access to the localization data of the icu library dependency hierarchy symfony form root library x symfony intl vulnerable library found in head commit a href found in base branch master vulnerability details an issue was discovered in symfony before and the intl component includes various bundle readers that are used to read resource bundles from the local filesystem the read methods of these classes use a path and a locale to determine the language bundle to retrieve the locale argument value is commonly retrieved from untrusted user input like a url parameter an attacker can use this argument to navigate to arbitrary directories via the dot dot slash attack aka directory traversal publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with mend
| 0
|
16,644
| 21,709,381,981
|
IssuesEvent
|
2022-05-10 12:39:55
|
camunda/zeebe
|
https://api.github.com/repos/camunda/zeebe
|
closed
|
Refactor ProtocolFactory to only generate positive values for objects of type long
|
kind/toil team/process-automation area/test
|
**Description**
The records generated by the protocol factory currently contain randomly generated long values. Since the factory is most likely going to be used for exporter-related tests, this can break things, as we often expect these values to be timestamps. Negative values are invalid for timestamps.
It would have been better if these types carried that kind of information (e.g. instead of longs, we use `Instant`), but this is an acceptable workaround as we don't dramatically reduce the range of values.
|
1.0
|
Refactor ProtocolFactory to only generate positive values for objects of type long - **Description**
The records generated by the protocol factory currently contain randomly generated long values. Since the factory is most likely going to be used for exporter-related tests, this can break things, as we often expect these values to be timestamps. Negative values are invalid for timestamps.
It would have been better if these types carried that kind of information (e.g. instead of longs, we use `Instant`), but this is an acceptable workaround as we don't dramatically reduce the range of values.
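The factory itself is Java; as a language-neutral sketch of the idea (assuming nothing about Zeebe's actual implementation), constraining randomly drawn 64-bit values to the non-negative range looks like this in Python:
```python
# Sketch only: a Python stand-in for the Java factory's intended behaviour.
import random

LONG_MAX = 2**63 - 1  # signed 64-bit upper bound, matching Java's long range


def random_timestamp_like_long(rng: random.Random) -> int:
    # Draw only from [0, LONG_MAX] so timestamp-style fields are never negative.
    return rng.randrange(0, LONG_MAX + 1)


rng = random.Random(42)                 # seeded for reproducible test records
value = random_timestamp_like_long(rng)
assert 0 <= value <= LONG_MAX
```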
|
process
|
refactor protocolfactory to only generate positive values for objects of type long description the records generated by the protocol factory currently generated random long values since the factory is most likely going to be used for exporter related tests this can break some things down as we often expect these values to be timestamps negative values are invalid for timestamps it would have been better if these types carried that kind of information e g instead of longs we use instant but this is an acceptable workaround as we don t dramatically reduce the range of values
| 1
|
200,327
| 7,005,881,341
|
IssuesEvent
|
2017-12-19 05:18:31
|
bethlakshmi/GBE2
|
https://api.github.com/repos/bethlakshmi/GBE2
|
closed
|
Getting to Ordering Acts is almost secret
|
enhancement Low Priority
|
I think this every time I have to work with it. It was a last-minute add-on last year and it kind of shows. If I didn't know how to get to the shows, I'd never figure it out. Let's revisit the dashboard on this after BurlExpo10. Also, when you hit the 'submit' button after ordering the acts, you're taken back to your home page instead of shown the results of the submit, which is odd.
|
1.0
|
Getting to Ordering Acts is almost secret - I think this every time I have to work with it. It was a last-minute add-on last year and it kind of shows. If I didn't know how to get to the shows, I'd never figure it out. Let's revisit the dashboard on this after BurlExpo10. Also, when you hit the 'submit' button after ordering the acts, you're taken back to your home page instead of shown the results of the submit, which is odd.
|
non_process
|
getting to ordering acts is almost secret i think this every time i have to work with it it was a last minute add on last year and it kind of shows if i didn t know how to get to the shows i d never figure it out let s revisit the dashboard on this after also when you hit the submit button after ordering the acts you re taken back to your home page instead of shown the results of the submit which is odd
| 0
|
73,255
| 15,253,595,655
|
IssuesEvent
|
2021-02-20 08:25:48
|
gsylvie/madness
|
https://api.github.com/repos/gsylvie/madness
|
closed
|
CVE-2014-0114 High Severity Vulnerability detected by WhiteSource - autoclosed
|
security vulnerability
|
## CVE-2014-0114 - High Severity Vulnerability
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>commons-beanutils-1.7.0.jar</b></p></summary>
<p>The Java language provides Reflection and Introspection APIs (see the java.lang.reflect and java.beans packages in the JDK Javadocs). However, these APIs can be quite complex to understand and utilize. The BeanUtils component provides easy-to-use wrappers around these capabilities</p>
<p>path: /root/.m2/repository/commons-beanutils/commons-beanutils/1.7.0/commons-beanutils-1.7.0.jar</p>
<p>
<p>Library home page: <a href=http://jakarta.apache.org/commons/beanutils/>http://jakarta.apache.org/commons/beanutils/</a></p>
Dependency Hierarchy:
- esapi-2.1.0.jar (Root Library)
- commons-configuration-1.5.jar
- commons-digester-1.8.jar
- :x: **commons-beanutils-1.7.0.jar** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Apache Commons BeanUtils, as distributed in lib/commons-beanutils-1.8.0.jar in Apache Struts 1.x through 1.3.10 and in other products requiring commons-beanutils through 1.9.2, does not suppress the class property, which allows remote attackers to "manipulate" the ClassLoader and execute arbitrary code via the class parameter, as demonstrated by the passing of this parameter to the getClass method of the ActionForm object in Struts 1.
<p>Publish Date: 2014-04-30
<p>URL: <a href=https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2014-0114>CVE-2014-0114</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics not available</p>
</p>
</details>
<p></p>
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://issues.apache.org/jira/browse/BEANUTILS-463">https://issues.apache.org/jira/browse/BEANUTILS-463</a></p>
<p>Release Date: 2014-05-24</p>
<p>Fix Resolution: Upgrade to version 1.9.2 or greater</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2014-0114 High Severity Vulnerability detected by WhiteSource - autoclosed - ## CVE-2014-0114 - High Severity Vulnerability
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>commons-beanutils-1.7.0.jar</b></p></summary>
<p>The Java language provides Reflection and Introspection APIs (see the java.lang.reflect and java.beans packages in the JDK Javadocs). However, these APIs can be quite complex to understand and utilize. The BeanUtils component provides easy-to-use wrappers around these capabilities</p>
<p>path: /root/.m2/repository/commons-beanutils/commons-beanutils/1.7.0/commons-beanutils-1.7.0.jar</p>
<p>
<p>Library home page: <a href=http://jakarta.apache.org/commons/beanutils/>http://jakarta.apache.org/commons/beanutils/</a></p>
Dependency Hierarchy:
- esapi-2.1.0.jar (Root Library)
- commons-configuration-1.5.jar
- commons-digester-1.8.jar
- :x: **commons-beanutils-1.7.0.jar** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Apache Commons BeanUtils, as distributed in lib/commons-beanutils-1.8.0.jar in Apache Struts 1.x through 1.3.10 and in other products requiring commons-beanutils through 1.9.2, does not suppress the class property, which allows remote attackers to "manipulate" the ClassLoader and execute arbitrary code via the class parameter, as demonstrated by the passing of this parameter to the getClass method of the ActionForm object in Struts 1.
<p>Publish Date: 2014-04-30
<p>URL: <a href=https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2014-0114>CVE-2014-0114</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics not available</p>
</p>
</details>
<p></p>
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://issues.apache.org/jira/browse/BEANUTILS-463">https://issues.apache.org/jira/browse/BEANUTILS-463</a></p>
<p>Release Date: 2014-05-24</p>
<p>Fix Resolution: Upgrade to version 1.9.2 or greater</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve high severity vulnerability detected by whitesource autoclosed cve high severity vulnerability vulnerable library commons beanutils jar the java language provides reflection and introspection apis see the java lang reflect and java beans packages in the jdk javadocs however these apis can be quite complex to understand and utilize the beanutils component provides easy to use wrappers around these capabilities path root repository commons beanutils commons beanutils commons beanutils jar library home page a href dependency hierarchy esapi jar root library commons configuration jar commons digester jar x commons beanutils jar vulnerable library vulnerability details apache commons beanutils as distributed in lib commons beanutils jar in apache struts x through and in other products requiring commons beanutils through does not suppress the class property which allows remote attackers to manipulate the classloader and execute arbitrary code via the class parameter as demonstrated by the passing of this parameter to the getclass method of the actionform object in struts publish date url a href cvss score details base score metrics not available suggested fix type upgrade version origin a href release date fix resolution upgrade to version or greater step up your open source security game with whitesource
| 0
|
12,746
| 15,107,263,603
|
IssuesEvent
|
2021-02-08 15:14:20
|
alphagov/govuk-frontend
|
https://api.github.com/repos/alphagov/govuk-frontend
|
closed
|
Release v3.11.0 🚀
|
process 🕔 hours
|
## What
Release v3.11.0 of GOV.UK Frontend, following our release process.
## Why
To publish cookie banner component
## Who needs to know about this
Developer
## Done when
- [ ] the package has been published on npm
- [ ] the release has been created on GitHub
- [ ] the release has been communicated on Slack
- [ ] the sprint board has been updated
## Next
- [Update the Design System to use Frontend v3.10.0](https://github.com/alphagov/govuk-design-system/issues/1468)
- [Update the Prototype Kit to use Frontend v3.10.0](https://github.com/alphagov/govuk-prototype-kit/issues/980)
|
1.0
|
Release v3.11.0 🚀 - ## What
Release v3.11.0 of GOV.UK Frontend, following our release process.
## Why
To publish cookie banner component
## Who needs to know about this
Developer
## Done when
- [ ] the package has been published on npm
- [ ] the release has been created on GitHub
- [ ] the release has been communicated on Slack
- [ ] the sprint board has been updated
## Next
- [Update the Design System to use Frontend v3.10.0](https://github.com/alphagov/govuk-design-system/issues/1468)
- [Update the Prototype Kit to use Frontend v3.10.0](https://github.com/alphagov/govuk-prototype-kit/issues/980)
|
process
|
release 🚀 what release of gov uk frontend following our release process why to publish cookie banner component who needs to know about this developer done when the package has been published on npm the release has been created on github the release has been communicated on slack the sprint board has been updated next
| 1
|
396,080
| 11,701,248,594
|
IssuesEvent
|
2020-03-06 19:15:59
|
cop4934-fall19-group32/Project-32
|
https://api.github.com/repos/cop4934-fall19-group32/Project-32
|
closed
|
Restrict Level Select Panning
|
Improvement LevelSelect Priority:Low
|
In the level select screen, users can pan forever. This is undesired.
- [x] Dynamically limit the range of the pan effect so that at least one node must be in a certain range
|
1.0
|
Restrict Level Select Panning - In the level select screen, users can pan forever. This is undesired.
- [x] Dynamically limit the range of the pan effect so that at least one node must be in a certain range
|
non_process
|
restrict level select panning in the level select screen users can pan forever this is undesired dynamically limit the range of the pan effect so that at least one node must be in a certain range
| 0
|
202,104
| 15,255,965,693
|
IssuesEvent
|
2021-02-20 18:12:36
|
hibernate/hibernate-reactive
|
https://api.github.com/repos/hibernate/hibernate-reactive
|
closed
|
Test lazy fetching of @ElementCollection
|
good first issue testing
|
Follows: #44
So far we only have tests for the eager case but we need to check that it also work with `Session#fetch`.
|
1.0
|
Test lazy fetching of @ElementCollection - Follows: #44
So far we only have tests for the eager case but we need to check that it also work with `Session#fetch`.
|
non_process
|
test lazy fetching of elementcollection follows so far we only have tests for the eager case but we need to check that it also work with session fetch
| 0
|
158,975
| 12,441,154,792
|
IssuesEvent
|
2020-05-26 13:14:11
|
ansible/awx
|
https://api.github.com/repos/ansible/awx
|
closed
|
Inventory Source Details
|
component:ui_next priority:high state:needs_revision state:needs_test type:enhancement type:feature
|
##### ISSUE TYPE
- Feature Idea
##### SUMMARY
Link to mockup: https://tower-mockups.testing.ansible.com/patternfly/inventories/inventories-sources-detail/
This is going to be similar to projects in that there are lots of different types of sources all with different fields.
|
1.0
|
Inventory Source Details - ##### ISSUE TYPE
- Feature Idea
##### SUMMARY
Link to mockup: https://tower-mockups.testing.ansible.com/patternfly/inventories/inventories-sources-detail/
This is going to be similar to projects in that there are lots of different types of sources all with different fields.
|
non_process
|
inventory source details issue type feature idea summary link to mockup this is going to be similar to projects in that there are lots of different types of sources all with different fields
| 0
|
646,518
| 21,051,067,801
|
IssuesEvent
|
2022-03-31 20:36:03
|
model-bakers/model_bakery
|
https://api.github.com/repos/model-bakers/model_bakery
|
closed
|
AttributeError: 'ManyToManyRel' object has no attribute 'has_default'
|
bug high priority
|
After update from 1.3.2 to 1.3.3 started getting exception from title.
Sorry I didn't debug properly this issue and can't say why this is happening but my best guess would be because of this change https://github.com/model-bakers/model_bakery/compare/1.3.2...1.3.3#diff-e5857deb915e241f429a0c118e89e06a3388d3ce1466e3aa4b960b7055172b6dL322
## Expected behavior
```
Baker.get_fields() 1.3.2 version
(
<django.db.models.fields.AutoField: id>,
<django.db.models.fields.related.ForeignKey: group>,
<django.db.models.fields.related.ManyToManyField: service_lines>,
)
```
## Actual behavior
```
Baker.get_fields() 1.3.3 version
{
<django.db.models.fields.AutoField: id>,
<django.db.models.fields.related.ForeignKey: group>,
<django.db.models.fields.related.ManyToManyField: service_lines>,
<ManyToManyRel: myapp.foo1>, # I guess it not suppose to be here
}
```
And as a result of new element from Baker.get_fields()
```
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/python3.8/site-packages/model_bakery/baker.py:89: in make
return [
/python3.8/site-packages/model_bakery/baker.py:90: in <listcomp>
baker.make(
/model_bakery/baker.py:324: in make
return self._make(**params)
/model_bakery/baker.py:371: in _make
self.model_attrs[field.name] = self.generate_value(
> if field.has_default() and field.name not in self.rel_fields:
E AttributeError: 'ManyToManyRel' object has no attribute 'has_default'
/model_bakery/baker.py:566: AttributeError
```
## Reproduction Steps
I don't think that model that I use has any custom behavior and it's just because of how Baker.get_fields() works in new 1.3.3 version
Models that I use anyway
```python
class Foo(models.Model):
slug = models.SlugField("Service line slug", unique=True, max_length=150)
name = models.CharField("Service line name", max_length=150, null=True)
class Foo1(models.Model):
bars = models.ManyToManyField("myapp.Bar")
class Bar(models.Model):
foo = models.ManyToManyField("myapp.Foo", related_name="foos")
baker.make("core.Bar", _quantity=3, slug=cycle(["1", "2", "3"]), _fill_optional=True)
```
### Versions
Python: 3.8.10
Django: 2.2.24
Model Bakery: 1.3.3
|
1.0
|
AttributeError: 'ManyToManyRel' object has no attribute 'has_default' - After update from 1.3.2 to 1.3.3 started getting exception from title.
Sorry I didn't debug properly this issue and can't say why this is happening but my best guess would be because of this change https://github.com/model-bakers/model_bakery/compare/1.3.2...1.3.3#diff-e5857deb915e241f429a0c118e89e06a3388d3ce1466e3aa4b960b7055172b6dL322
## Expected behavior
```
Baker.get_fields() 1.3.2 version
(
<django.db.models.fields.AutoField: id>,
<django.db.models.fields.related.ForeignKey: group>,
<django.db.models.fields.related.ManyToManyField: service_lines>,
)
```
## Actual behavior
```
Baker.get_fields() 1.3.3 version
{
<django.db.models.fields.AutoField: id>,
<django.db.models.fields.related.ForeignKey: group>,
<django.db.models.fields.related.ManyToManyField: service_lines>,
<ManyToManyRel: myapp.foo1>, # I guess it not suppose to be here
}
```
And as a result of new element from Baker.get_fields()
```
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/python3.8/site-packages/model_bakery/baker.py:89: in make
return [
/python3.8/site-packages/model_bakery/baker.py:90: in <listcomp>
baker.make(
/model_bakery/baker.py:324: in make
return self._make(**params)
/model_bakery/baker.py:371: in _make
self.model_attrs[field.name] = self.generate_value(
> if field.has_default() and field.name not in self.rel_fields:
E AttributeError: 'ManyToManyRel' object has no attribute 'has_default'
/model_bakery/baker.py:566: AttributeError
```
## Reproduction Steps
I don't think that model that I use has any custom behavior and it's just because of how Baker.get_fields() works in new 1.3.3 version
Models that I use anyway
```python
class Foo(models.Model):
slug = models.SlugField("Service line slug", unique=True, max_length=150)
name = models.CharField("Service line name", max_length=150, null=True)
class Foo1(models.Model):
bars = models.ManyToManyField("myapp.Bar")
class Bar(models.Model):
foo = models.ManyToManyField("myapp.Foo", related_name="foos")
baker.make("core.Bar", _quantity=3, slug=cycle(["1", "2", "3"]), _fill_optional=True)
```
### Versions
Python: 3.8.10
Django: 2.2.24
Model Bakery: 1.3.3
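A minimal sketch of the kind of filtering that avoids the reverse-relation objects described above, assuming Django's `ForeignObjectRel` base class (which `ManyToManyRel` derives from); this is an illustration, not model_bakery's actual fix:
```python
# Sketch only -- not model_bakery's actual fix.
from django.db.models.fields.reverse_related import ForeignObjectRel


def concrete_fields(model):
    """Return only real model fields, skipping reverse relations such as
    ManyToManyRel, which have no has_default() method."""
    return [
        f for f in model._meta.get_fields()
        if not isinstance(f, ForeignObjectRel) and hasattr(f, "has_default")
    ]
```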
|
non_process
|
attributeerror manytomanyrel object has no attribute has default after update from to started getting exception from title sorry i didn t debug properly this issue and can t say why this is happening but my best guess would be because of this change expected behavior baker get fields version actual behavior baker get fields version i guess it not suppose to be here and as a result of new element from baker get fields site packages model bakery baker py in make return site packages model bakery baker py in baker make model bakery baker py in make return self make params model bakery baker py in make self model attrs self generate value if field has default and field name not in self rel fields e attributeerror manytomanyrel object has no attribute has default model bakery baker py attributeerror reproduction steps i don t think that model that i use has any custom behavior and it s just because of how baker get fields works in new version models that i use anyway python class foo models model slug models slugfield service line slug unique true max length name models charfield service line name max length null true class models model bars models manytomanyfield myapp bar class bar models model foo models manytomanyfield myapp foo related name foos baker make core bar quantity slug cycle fill optional true versions python django model bakery
| 0
|
10,149
| 13,044,162,561
|
IssuesEvent
|
2020-07-29 03:47:33
|
tikv/tikv
|
https://api.github.com/repos/tikv/tikv
|
closed
|
UCP: Migrate scalar function `Version` from TiDB
|
challenge-program-2 component/coprocessor difficulty/easy sig/coprocessor
|
## Description
Port the scalar function `Version` from TiDB to coprocessor.
## Score
* 50
## Mentor(s)
* @sticnarf
## Recommended Skills
* Rust programming
## Learning Materials
Already implemented expressions ported from TiDB
- https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr)
- https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr)
|
2.0
|
UCP: Migrate scalar function `Version` from TiDB -
## Description
Port the scalar function `Version` from TiDB to coprocessor.
## Score
* 50
## Mentor(s)
* @sticnarf
## Recommended Skills
* Rust programming
## Learning Materials
Already implemented expressions ported from TiDB
- https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr)
- https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr)
|
process
|
ucp migrate scalar function version from tidb description port the scalar function version from tidb to coprocessor score mentor s sticnarf recommended skills rust programming learning materials already implemented expressions ported from tidb
| 1
|
9,519
| 12,499,389,740
|
IssuesEvent
|
2020-06-01 20:04:03
|
qgis/QGIS
|
https://api.github.com/repos/qgis/QGIS
|
closed
|
Add texture shading algorithms to the processing toolbox
|
Feature Request Feedback Processing
|
I came across this set of algorithms which create texture shading:
https://app.box.com/v/textureshading/folder/122668213
Here are some examples:
http://www.shadedrelief.com/texture_shading/
The code appears to be under BSD:
https://app.box.com/v/textureshading/file/540913016576
|
1.0
|
Add texture shading algorithms to the processing toolbox - I came across this set of algorithms which create texture shading:
https://app.box.com/v/textureshading/folder/122668213
Here are some examples:
http://www.shadedrelief.com/texture_shading/
The code appears to be under BSD:
https://app.box.com/v/textureshading/file/540913016576
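A skeletal sketch of how such an algorithm could be exposed in the Processing toolbox, assuming the standard `QgsProcessingAlgorithm` API; the class name, parameter names and the omitted shading math are all illustrative:
```python
# Sketch of a Processing-toolbox wrapper; the texture-shading computation
# itself is left as a placeholder.
from qgis.core import (
    QgsProcessingAlgorithm,
    QgsProcessingParameterRasterLayer,
    QgsProcessingParameterRasterDestination,
)


class TextureShadingAlgorithm(QgsProcessingAlgorithm):
    def name(self):
        return 'textureshading'          # illustrative algorithm id

    def displayName(self):
        return 'Texture shading'

    def createInstance(self):
        return TextureShadingAlgorithm()

    def initAlgorithm(self, config=None):
        self.addParameter(QgsProcessingParameterRasterLayer('INPUT', 'Input DEM'))
        self.addParameter(QgsProcessingParameterRasterDestination('OUTPUT', 'Texture shaded'))

    def processAlgorithm(self, parameters, context, feedback):
        dem = self.parameterAsRasterLayer(parameters, 'INPUT', context)
        out_path = self.parameterAsOutputLayer(parameters, 'OUTPUT', context)
        # ... the fractional-Laplacian texture-shading maths would go here ...
        feedback.pushInfo(f'Would texture-shade {dem.name()} into {out_path}')
        return {'OUTPUT': out_path}
```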
|
process
|
add texture shading algorithms to the processing toolbox i came across this set of algorithms which create texture shading here are some examples the code appears to be under bsd
| 1
|
641,657
| 20,831,568,988
|
IssuesEvent
|
2022-03-19 14:30:01
|
kubernetes/ingress-nginx
|
https://api.github.com/repos/kubernetes/ingress-nginx
|
closed
|
Unable to get REAL-IP (Client IP) from Kubernetes Ingress
|
needs-kind needs-triage needs-priority
|
I have followed the guideline
set the values from Helm
```yaml
## Set external traffic policy to: "Local" to preserve source IP on providers supporting it.
## Ref: https://kubernetes.io/docs/tutorials/services/source-ip/#source-ip-for-services-with-typeloadbalancer
externalTrafficPolicy: "Local"
```
Enable the real-ip from configmap
```yaml
enable-real-ip: "true"
```
# But it's getting the Kubernetes cluster IP, which I do not want.
```
x-real-ip | 10.42.0.34
-- | --
x-forwarded-for | 10.42.0.34
```
### However, I have tried to enable the proxy protocol, but no luck; it is still not working.
|
1.0
|
Unable to get REAL-IP (Client IP) from Kubernetes Ingress - I have followed the guideline
set the values from Helm
```yaml
## Set external traffic policy to: "Local" to preserve source IP on providers supporting it.
## Ref: https://kubernetes.io/docs/tutorials/services/source-ip/#source-ip-for-services-with-typeloadbalancer
externalTrafficPolicy: "Local"
```
Enable the real-ip from configmap
```yaml
enable-real-ip: "true"
```
# But it's getting the Kubernetes cluster IP, which I do not want.
```
x-real-ip | 10.42.0.34
-- | --
x-forwarded-for | 10.42.0.34
```
### However, I have tried to enable the proxy protocol, but no luck; it is still not working.
|
non_process
|
unable to get real ip client ip from kubernetes ingress i have follow the guideline set the values from helm yaml set external traffic policy to local to preserve source ip on providers supporting it ref externaltrafficpolicy local enable the real ip from configmap yaml enable real ip true but it s getting kubernetes clustered ip which i do not want x real ip x forwarded for however i have tried to enable proxy protocol but no luck is not working
| 0
|
108,400
| 16,773,147,021
|
IssuesEvent
|
2021-06-14 17:12:46
|
LibraryOfCongress/concordia
|
https://api.github.com/repos/LibraryOfCongress/concordia
|
closed
|
Address the flagged dependencies raised by dependabot
|
security
|
Both yargs-parser and url-regex a need updates to the package. Check the alerts dependabot raised and update.
https://github.com/LibraryOfCongress/concordia/security/dependabot
Highest priority will include dependencies that heightened our security vulnerability. All others, create tickets and build into other sprints.
Dependabot also created automated pull request, let's review on if this is needed. We will always need to do research if the upgrade will affect the application or build before deploying.
|
True
|
Address the flagged dependencies raised by dependabot - Both yargs-parser and url-regex a need updates to the package. Check the alerts dependabot raised and update.
https://github.com/LibraryOfCongress/concordia/security/dependabot
Highest priority will include dependencies that heightened our security vulnerability. All others, create tickets and build into other sprints.
Dependabot also created automated pull request, let's review on if this is needed. We will always need to do research if the upgrade will affect the application or build before deploying.
|
non_process
|
address the flagged dependencies raised by dependabot both yargs parser and url regex a need updates to the package check the alerts dependabot raised and update highest priority will include dependencies that heightened our security vulnerability all others create tickets and build into other sprints dependabot also created automated pull request let s review on if this is needed we will always need to do research if the upgrade will affect the application or build before deploying
| 0
|
2,219
| 5,068,749,582
|
IssuesEvent
|
2016-12-24 22:30:09
|
racerxdl/open-satellite-project
|
https://api.github.com/repos/racerxdl/open-satellite-project
|
closed
|
Epiphany Core Optimized APT Decoder
|
enhancement Epiphany Processor
|
We need to have a optimized APT Decoder for Epiphany Processor to be able to use a Parallella Board as a receiver station.
Depends on #3 and #7
|
1.0
|
Epiphany Core Optimized APT Decoder - We need to have a optimized APT Decoder for Epiphany Processor to be able to use a Parallella Board as a receiver station.
Depends on #3 and #7
|
process
|
epiphany core optimized apt decoder we need to have a optimized apt decoder for epiphany processor to be able to use a parallella board as a receiver station depends on and
| 1
|
18,563
| 24,555,734,907
|
IssuesEvent
|
2022-10-12 15:43:42
|
GoogleCloudPlatform/fda-mystudies
|
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
|
closed
|
[iOS] Notifications issue
|
Bug P1 iOS Process: Fixed Process: Tested QA Process: Tested dev
|
1. if we get activities added notification > After clicking the notification activities screen is not getting refreshed
2. In the mobile app, go to the activities list screen > Minimize the app > Pause the particular study > after getting the notification click on paused notification > Then Resume the study in SB and publish the updates > after receiving the notification click on the Notification and observe > Participant is navigating to the Study activities list screen > Participant should be in the study list screen
|
3.0
|
[iOS] Notifications issue - 1. if we get activities added notification > After clicking the notification activities screen is not getting refreshed
2. In the mobile app, go to the activities list screen > Minimize the app > Pause the particular study > after getting the notification click on paused notification > Then Resume the study in SB and publish the updates > after receiving the notification click on the Notification and observe > Participant is navigating to the Study activities list screen > Participant should be in the study list screen
|
process
|
notifications issue if we get activities added notification after clicking the notification activities screen is not getting refreshed in the mobile app go to the activities list screen minimize the app pause the particular study after getting the notification click on paused notification then resume the study in sb and publish the updates after receiving the notification click on the notification and observe participant is navigating to the study activities list screen participant should be in the study list screen
| 1
|
2,902
| 5,888,186,824
|
IssuesEvent
|
2017-05-17 09:30:08
|
CERNDocumentServer/cds
|
https://api.github.com/repos/CERNDocumentServer/cds
|
closed
|
ffmpeg: better error reporting
|
avc_processing enhancement in progress
|
The `CalledProcessError` raised by the `subprocess` package only signals the existence of an error, but gives no details about it. We should capture a command's output and provide useful error messages.
|
1.0
|
ffmpeg: better error reporting - The `CalledProcessError` raised by the `subprocess` package only signals the existence of an error, but gives no details about it. We should capture a command's output and provide useful error messages.
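A minimal sketch of the idea, capturing a command's stderr so the raised error carries details; the ffmpeg invocation shown is a placeholder, not the project's actual pipeline:
```python
# Sketch only; the ffmpeg arguments below are placeholders.
import subprocess


def run_with_captured_error(cmd):
    """Run a command and raise a RuntimeError that includes its stderr,
    instead of a bare CalledProcessError with no detail."""
    proc = subprocess.run(cmd, capture_output=True, text=True)
    if proc.returncode != 0:
        raise RuntimeError(
            f"{cmd[0]} exited with {proc.returncode}:\n{proc.stderr.strip()}"
        )
    return proc.stdout


# Hypothetical invocation:
# run_with_captured_error(["ffmpeg", "-i", "input.mp4", "output.webm"])
```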
|
process
|
ffmpeg better error reporting the calledprocesserror raised from the subprocess package only provides the existense of an error but not details about it we should capture a command s output and provide useful error messages
| 1
|
248,736
| 7,935,444,166
|
IssuesEvent
|
2018-07-09 05:14:37
|
magda-io/magda
|
https://api.github.com/repos/magda-io/magda
|
closed
|
Ask a question form on Pixel
|
priority: medium
|
### Problem description
I can't see the exit arrow and can't click on the email or name form fields. The description form field is fine.
The feedback button hides the 'send' button, but that should be fixed once it's removed.
### Problem reproduction steps
Try it on a Pixel (Unable to reproduce with Samsung)
### Screenshot / Design / File reference

|
1.0
|
Ask a question form on Pixel - ### Problem description
I can't see the exit arrow and can't click on the email or name form fields. The description form field is fine.
The feedback button hides the 'send' button, but that should be fixed once it's removed.
### Problem reproduction steps
Try it on a Pixel (Unable to reproduce with Samsung)
### Screenshot / Design / File reference

|
non_process
|
ask a question form on pixel problem description i can t see the exit arrow and can t click on the email or name form fields the description form field is fine the feedback button hides the send button but that should be fixed once it s removed problem reproduction steps try it on a pixel unable to reproduce with samsung screenshot design file reference
| 0
|
63,794
| 14,656,783,988
|
IssuesEvent
|
2020-12-28 14:11:27
|
fu1771695yongxie/ChromeAppHeroes
|
https://api.github.com/repos/fu1771695yongxie/ChromeAppHeroes
|
opened
|
CVE-2020-11023 (Medium) detected in jquery-3.3.1.min.js
|
security vulnerability
|
## CVE-2020-11023 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-3.3.1.min.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.3.1/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/3.3.1/jquery.min.js</a></p>
<p>Path to dependency file: ChromeAppHeroes/backup/076-listen1/076-listen1/listen1_chrome_extension-2.11.0/listen1_chrome_extension-2.11.0/listen1.html</p>
<p>Path to vulnerable library: ChromeAppHeroes/backup/076-listen1/076-listen1/listen1_chrome_extension-2.11.0/listen1_chrome_extension-2.11.0/js/vendor/jquery-3.3.1.min.js,ChromeAppHeroes/backup/076-listen1/listen1_chrome_extension-2.11.0/listen1_chrome_extension-2.11.0/js/vendor/jquery-3.3.1.min.js</p>
<p>
Dependency Hierarchy:
- :x: **jquery-3.3.1.min.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/fu1771695yongxie/ChromeAppHeroes/commit/a3be6da29cfc020405737642bde09ea664b3993e">a3be6da29cfc020405737642bde09ea664b3993e</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In jQuery versions greater than or equal to 1.0.3 and before 3.5.0, passing HTML containing <option> elements from untrusted sources - even after sanitizing it - to one of jQuery's DOM manipulation methods (i.e. .html(), .append(), and others) may execute untrusted code. This problem is patched in jQuery 3.5.0.
<p>Publish Date: 2020-04-29
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-11023>CVE-2020-11023</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-11023">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-11023</a></p>
<p>Release Date: 2020-04-29</p>
<p>Fix Resolution: jquery - 3.5.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2020-11023 (Medium) detected in jquery-3.3.1.min.js - ## CVE-2020-11023 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-3.3.1.min.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.3.1/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/3.3.1/jquery.min.js</a></p>
<p>Path to dependency file: ChromeAppHeroes/backup/076-listen1/076-listen1/listen1_chrome_extension-2.11.0/listen1_chrome_extension-2.11.0/listen1.html</p>
<p>Path to vulnerable library: ChromeAppHeroes/backup/076-listen1/076-listen1/listen1_chrome_extension-2.11.0/listen1_chrome_extension-2.11.0/js/vendor/jquery-3.3.1.min.js,ChromeAppHeroes/backup/076-listen1/listen1_chrome_extension-2.11.0/listen1_chrome_extension-2.11.0/js/vendor/jquery-3.3.1.min.js</p>
<p>
Dependency Hierarchy:
- :x: **jquery-3.3.1.min.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/fu1771695yongxie/ChromeAppHeroes/commit/a3be6da29cfc020405737642bde09ea664b3993e">a3be6da29cfc020405737642bde09ea664b3993e</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In jQuery versions greater than or equal to 1.0.3 and before 3.5.0, passing HTML containing <option> elements from untrusted sources - even after sanitizing it - to one of jQuery's DOM manipulation methods (i.e. .html(), .append(), and others) may execute untrusted code. This problem is patched in jQuery 3.5.0.
<p>Publish Date: 2020-04-29
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-11023>CVE-2020-11023</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-11023">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-11023</a></p>
<p>Release Date: 2020-04-29</p>
<p>Fix Resolution: jquery - 3.5.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve medium detected in jquery min js cve medium severity vulnerability vulnerable library jquery min js javascript library for dom operations library home page a href path to dependency file chromeappheroes backup chrome extension chrome extension html path to vulnerable library chromeappheroes backup chrome extension chrome extension js vendor jquery min js chromeappheroes backup chrome extension chrome extension js vendor jquery min js dependency hierarchy x jquery min js vulnerable library found in head commit a href found in base branch master vulnerability details in jquery versions greater than or equal to and before passing html containing elements from untrusted sources even after sanitizing it to one of jquery s dom manipulation methods i e html append and others may execute untrusted code this problem is patched in jquery publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution jquery step up your open source security game with whitesource
| 0
|
1,517
| 4,108,951,949
|
IssuesEvent
|
2016-06-06 17:51:10
|
dotnet/corefx
|
https://api.github.com/repos/dotnet/corefx
|
opened
|
SIGABRT_ASSERT_System.Native.so!SystemNative_ForkAndExecProcess
|
bug reliability System.Diagnostics.Process
|
**The notes in this bug refer to the Ubuntu.14.04 dump [rc3-24202-00_004C](https://rapreqs.blob.core.windows.net/sschaab/BodyPart_669c030d-366e-481c-8f12-0e82c88679f2?sv=2015-04-05&sr=b&sig=yodf0SNKqzeSzh2ZReKVeGouiJC8UhD%2B504JC9qrI1s%3D&st=2016-06-03T22%3A16%3A16Z&se=2017-06-03T22%3A16%3A16Z&sp=r). Other dumps are available if needed.**
STOP_REASON:
SIGABRT
FAULT_SYMBOL:
System.Native.so!SystemNative_ForkAndExecProcess
FAILURE_HASH:
SIGABRT_System.Native.so!SystemNative_ForkAndExecProcess
FAULT_STACK:
libc.so.6!__GI_raise
libc.so.6!__GI_abort
libc.so.6!__assert_fail_base
libc.so.6!UNKNOWN
System.Native.so!SystemNative_ForkAndExecProcess
System.Runtime.Extensions.dll!DomainBoundILStubClass.IL_STUB_PInvoke(System.String, Byte**, Byte**, System.String, Int32, Int32, Int32, Int32 ByRef, Int32 ByRef, Int32 ByRef, Int32 ByRef)
System.Diagnostics.Process.dll!Interop+Sys.ForkAndExecProcess(System.String, System.String[], System.String[], System.String, Boolean, Boolean, Boolean, Int32 ByRef, Int32 ByRef, Int32 ByRef, Int32 ByRef)
System.Diagnostics.Process.dll!System.Diagnostics.Process.StartCore(System.Diagnostics.ProcessStartInfo)
System.Diagnostics.Process.dll!System.Diagnostics.Process.Start()
System.Diagnostics.Process.Tests.dll!System.Diagnostics.Tests.ProcessWaitingTests.SingleProcess_TryWaitMultipleTimesBeforeCompleting()
rc3-24202-00_004C.exe!stress.generated.UnitTests.UT0()
stress.execution.dll!stress.execution.UnitTest.Execute()
stress.execution.dll!stress.execution.DedicatedThreadWorkerStrategy.RunWorker(stress.execution.ITestPattern, System.Threading.CancellationToken)
stress.execution.dll!stress.execution.DedicatedThreadWorkerStrategy+<>c__DisplayClass1_0.<SpawnWorker>b__0()
System.Private.CoreLib.ni.dll!System.Threading.Tasks.Task.Execute()
System.Private.CoreLib.ni.dll!System.Threading.ExecutionContext.Run(System.Threading.ExecutionContext, System.Threading.ContextCallback, System.Object)
System.Private.CoreLib.ni.dll!System.Threading.Tasks.Task.ExecuteWithThreadLocal(System.Threading.Tasks.Task ByRef)
System.Private.CoreLib.ni.dll!System.Threading.Tasks.Task.ExecuteEntry(Boolean)
System.Private.CoreLib.ni.dll!System.Threading.ExecutionContext.Run(System.Threading.ExecutionContext, System.Threading.ContextCallback, System.Object)
libcoreclr.so!CallDescrWorkerInternal
libcoreclr.so!CallDescrWorkerWithHandler(CallDescrData*, int)
libcoreclr.so!MethodDescCallSite::CallTargetWorker(unsigned long const*)
libcoreclr.so!MethodDescCallSite::Call(unsigned long const*)
libcoreclr.so!ThreadNative::KickOffThread_Worker(void*)
libcoreclr.so!ManagedThreadBase_DispatchInner(ManagedThreadCallState*)
libcoreclr.so!ManagedThreadBase_DispatchMiddle(ManagedThreadCallState*)
libcoreclr.so!ManagedThreadBase_DispatchOuter(ManagedThreadCallState*)::$_6::operator()(ManagedThreadBase_DispatchOuter(ManagedThreadCallState*)::TryArgs*) const::{lambda(Param*)#1}::operator()(Param*) const
libcoreclr.so!ManagedThreadBase_DispatchOuter(ManagedThreadCallState*)::$_6::operator()(ManagedThreadBase_DispatchOuter(ManagedThreadCallState*)::TryArgs*) const
libcoreclr.so!ManagedThreadBase_DispatchOuter(ManagedThreadCallState*)
libcoreclr.so!ManagedThreadBase_FullTransitionWithAD(ADID, void (*)(void*), void*, UnhandledExceptionLocation)
libcoreclr.so!ManagedThreadBase::KickOff(ADID, void (*)(void*), void*)
libcoreclr.so!ThreadNative::KickOffThread(void*)
libcoreclr.so!Thread::intermediateThreadProc(void*)
libcoreclr.so!CorUnix::CPalThread::ThreadEntry(void*)
libpthread.so.0!start_thread
libc.so.6!__clone
FAULT_THREAD:
thread #1: tid = 53100, 0x00007f00ab935cc9 libc.so.6`__GI_raise(sig=6) + 57 at raise.c:56, name = 'corerun', stop reason = signal SIGABRT
**Looking at the code for frame 4 (pal_process.cpp line 151) it looks like the call to fork() returns -1 so it asserts with the message "fork() failed."**
(lldb) fr s 4
frame #4: 0x00007effd1df2cb5 System.Native.so`::SystemNative_ForkAndExecProcess(filename="/home/DotNetBot/dotnetbuild/work/b36360ef-8151-4402-8438-48a63dd3588d/Work/d643c218-6748-4941-8172-9696a99dcf7b/Exec/execution/corerun", argv=0x00007eff87d5ebc0, envp=0x00007eff87d61380, cwd=0x0000000000000000, redirectStdin=0, redirectStdout=0, redirectStderr=0, childPid=0x00007f000aff9d30, stdinFd=0x00007f000aff9d28, stdoutFd=0x00007f000aff9d20, stderrFd=0x00007f000aff9d18) + 645 at pal_process.cpp:151
(lldb) fr v -D1
(const char *) filename = 0x00007eff87d62220 "/home/DotNetBot/dotnetbuild/work/b36360ef-8151-4402-8438-48a63dd3588d/Work/d643c218-6748-4941-8172-9696a99dcf7b/Exec/execution/corerun"
(char *const *) argv = 0x00007eff87d5ebc0
(char *const *) envp = 0x00007eff87d61380
(const char *) cwd = 0x0000000000000000
(int32_t) redirectStdin = 0
(int32_t) redirectStdout = 0
(int32_t) redirectStderr = 0
(int32_t *) childPid = 0x00007f000aff9d30
(int32_t *) stdinFd = 0x00007f000aff9d28
(int32_t *) stdoutFd = 0x00007f000aff9d20
(int32_t *) stderrFd = 0x00007f000aff9d18
(int) success = 1
(int [2]) stdinFds = ([0] = -1, [1] = -1)
(int [2]) stdoutFds = ([0] = -1, [1] = -1)
(int [2]) stderrFds = ([0] = -1, [1] = -1)
(int [2]) waitForChildToExecPipe = ([0] = 162, [1] = 164)
(int) processId = -1
// Fork the child process
if ((processId = fork()) == -1)
{
assert(false && "fork() failed.");
success = false;
goto done;
}
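The errno from the failed fork() is not captured in these notes, so the root cause here is an assumption: on Linux, fork() most often fails with EAGAIN (the per-user process/thread limit RLIMIT_NPROC or the system-wide pid limit has been reached, plausible for a stress run that keeps spawning corerun children) or with ENOMEM. A minimal, illustrative Python sketch (not part of the stress harness or the product code) for checking which limit a test machine is hitting:

```python
# Illustrative diagnostic only -- not part of the stress harness or the product code.
import errno
import os
import resource

def diagnose_fork_failure():
    """Try one fork() and report the usual reasons it can fail on Linux."""
    try:
        pid = os.fork()
    except OSError as exc:
        causes = {
            errno.EAGAIN: "process/thread limit reached (check RLIMIT_NPROC and pid_max)",
            errno.ENOMEM: "insufficient memory for the child process",
        }
        soft, hard = resource.getrlimit(resource.RLIMIT_NPROC)
        print("fork failed: errno=%d (%s); RLIMIT_NPROC soft=%s hard=%s"
              % (exc.errno, causes.get(exc.errno, "unknown cause"), soft, hard))
        return None
    if pid == 0:
        os._exit(0)      # child: exit immediately
    os.waitpid(pid, 0)   # parent: reap the child
    return pid

if __name__ == "__main__":
    diagnose_fork_failure()
```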
|
1.0
|
SIGABRT_ASSERT_System.Native.so!SystemNative_ForkAndExecProcess - **The notes in this bug refer to the Ubuntu.14.04 dump [rc3-24202-00_004C](https://rapreqs.blob.core.windows.net/sschaab/BodyPart_669c030d-366e-481c-8f12-0e82c88679f2?sv=2015-04-05&sr=b&sig=yodf0SNKqzeSzh2ZReKVeGouiJC8UhD%2B504JC9qrI1s%3D&st=2016-06-03T22%3A16%3A16Z&se=2017-06-03T22%3A16%3A16Z&sp=r). Other dumps are available if needed.**
STOP_REASON:
SIGABRT
FAULT_SYMBOL:
System.Native.so!SystemNative_ForkAndExecProcess
FAILURE_HASH:
SIGABRT_System.Native.so!SystemNative_ForkAndExecProcess
FAULT_STACK:
libc.so.6!__GI_raise
libc.so.6!__GI_abort
libc.so.6!__assert_fail_base
libc.so.6!UNKNOWN
System.Native.so!SystemNative_ForkAndExecProcess
System.Runtime.Extensions.dll!DomainBoundILStubClass.IL_STUB_PInvoke(System.String, Byte**, Byte**, System.String, Int32, Int32, Int32, Int32 ByRef, Int32 ByRef, Int32 ByRef, Int32 ByRef)
System.Diagnostics.Process.dll!Interop+Sys.ForkAndExecProcess(System.String, System.String[], System.String[], System.String, Boolean, Boolean, Boolean, Int32 ByRef, Int32 ByRef, Int32 ByRef, Int32 ByRef)
System.Diagnostics.Process.dll!System.Diagnostics.Process.StartCore(System.Diagnostics.ProcessStartInfo)
System.Diagnostics.Process.dll!System.Diagnostics.Process.Start()
System.Diagnostics.Process.Tests.dll!System.Diagnostics.Tests.ProcessWaitingTests.SingleProcess_TryWaitMultipleTimesBeforeCompleting()
rc3-24202-00_004C.exe!stress.generated.UnitTests.UT0()
stress.execution.dll!stress.execution.UnitTest.Execute()
stress.execution.dll!stress.execution.DedicatedThreadWorkerStrategy.RunWorker(stress.execution.ITestPattern, System.Threading.CancellationToken)
stress.execution.dll!stress.execution.DedicatedThreadWorkerStrategy+<>c__DisplayClass1_0.<SpawnWorker>b__0()
System.Private.CoreLib.ni.dll!System.Threading.Tasks.Task.Execute()
System.Private.CoreLib.ni.dll!System.Threading.ExecutionContext.Run(System.Threading.ExecutionContext, System.Threading.ContextCallback, System.Object)
System.Private.CoreLib.ni.dll!System.Threading.Tasks.Task.ExecuteWithThreadLocal(System.Threading.Tasks.Task ByRef)
System.Private.CoreLib.ni.dll!System.Threading.Tasks.Task.ExecuteEntry(Boolean)
System.Private.CoreLib.ni.dll!System.Threading.ExecutionContext.Run(System.Threading.ExecutionContext, System.Threading.ContextCallback, System.Object)
libcoreclr.so!CallDescrWorkerInternal
libcoreclr.so!CallDescrWorkerWithHandler(CallDescrData*, int)
libcoreclr.so!MethodDescCallSite::CallTargetWorker(unsigned long const*)
libcoreclr.so!MethodDescCallSite::Call(unsigned long const*)
libcoreclr.so!ThreadNative::KickOffThread_Worker(void*)
libcoreclr.so!ManagedThreadBase_DispatchInner(ManagedThreadCallState*)
libcoreclr.so!ManagedThreadBase_DispatchMiddle(ManagedThreadCallState*)
libcoreclr.so!ManagedThreadBase_DispatchOuter(ManagedThreadCallState*)::$_6::operator()(ManagedThreadBase_DispatchOuter(ManagedThreadCallState*)::TryArgs*) const::{lambda(Param*)#1}::operator()(Param*) const
libcoreclr.so!ManagedThreadBase_DispatchOuter(ManagedThreadCallState*)::$_6::operator()(ManagedThreadBase_DispatchOuter(ManagedThreadCallState*)::TryArgs*) const
libcoreclr.so!ManagedThreadBase_DispatchOuter(ManagedThreadCallState*)
libcoreclr.so!ManagedThreadBase_FullTransitionWithAD(ADID, void (*)(void*), void*, UnhandledExceptionLocation)
libcoreclr.so!ManagedThreadBase::KickOff(ADID, void (*)(void*), void*)
libcoreclr.so!ThreadNative::KickOffThread(void*)
libcoreclr.so!Thread::intermediateThreadProc(void*)
libcoreclr.so!CorUnix::CPalThread::ThreadEntry(void*)
libpthread.so.0!start_thread
libc.so.6!__clone
FAULT_THREAD:
thread #1: tid = 53100, 0x00007f00ab935cc9 libc.so.6`__GI_raise(sig=6) + 57 at raise.c:56, name = 'corerun', stop reason = signal SIGABRT
**Looking at the code for frame 4 (pal_process.cpp line 151) it looks like the call to fork() returns -1 so it asserts with the message "fork() failed."**
(lldb) fr s 4
frame #4: 0x00007effd1df2cb5 System.Native.so`::SystemNative_ForkAndExecProcess(filename="/home/DotNetBot/dotnetbuild/work/b36360ef-8151-4402-8438-48a63dd3588d/Work/d643c218-6748-4941-8172-9696a99dcf7b/Exec/execution/corerun", argv=0x00007eff87d5ebc0, envp=0x00007eff87d61380, cwd=0x0000000000000000, redirectStdin=0, redirectStdout=0, redirectStderr=0, childPid=0x00007f000aff9d30, stdinFd=0x00007f000aff9d28, stdoutFd=0x00007f000aff9d20, stderrFd=0x00007f000aff9d18) + 645 at pal_process.cpp:151
(lldb) fr v -D1
(const char *) filename = 0x00007eff87d62220 "/home/DotNetBot/dotnetbuild/work/b36360ef-8151-4402-8438-48a63dd3588d/Work/d643c218-6748-4941-8172-9696a99dcf7b/Exec/execution/corerun"
(char *const *) argv = 0x00007eff87d5ebc0
(char *const *) envp = 0x00007eff87d61380
(const char *) cwd = 0x0000000000000000
(int32_t) redirectStdin = 0
(int32_t) redirectStdout = 0
(int32_t) redirectStderr = 0
(int32_t *) childPid = 0x00007f000aff9d30
(int32_t *) stdinFd = 0x00007f000aff9d28
(int32_t *) stdoutFd = 0x00007f000aff9d20
(int32_t *) stderrFd = 0x00007f000aff9d18
(int) success = 1
(int [2]) stdinFds = ([0] = -1, [1] = -1)
(int [2]) stdoutFds = ([0] = -1, [1] = -1)
(int [2]) stderrFds = ([0] = -1, [1] = -1)
(int [2]) waitForChildToExecPipe = ([0] = 162, [1] = 164)
(int) processId = -1
// Fork the child process
if ((processId = fork()) == -1)
{
assert(false && "fork() failed.");
success = false;
goto done;
}
|
process
|
sigabrt assert system native so systemnative forkandexecprocess the notes in this bug refer to the ubuntu dump other dumps are available if needed stop reason sigabrt fault symbol system native so systemnative forkandexecprocess failure hash sigabrt system native so systemnative forkandexecprocess fault stack libc so gi raise libc so gi abort libc so assert fail base libc so unknown system native so systemnative forkandexecprocess system runtime extensions dll domainboundilstubclass il stub pinvoke system string byte byte system string byref byref byref byref system diagnostics process dll interop sys forkandexecprocess system string system string system string system string boolean boolean boolean byref byref byref byref system diagnostics process dll system diagnostics process startcore system diagnostics processstartinfo system diagnostics process dll system diagnostics process start system diagnostics process tests dll system diagnostics tests processwaitingtests singleprocess trywaitmultipletimesbeforecompleting exe stress generated unittests stress execution dll stress execution unittest execute stress execution dll stress execution dedicatedthreadworkerstrategy runworker stress execution itestpattern system threading cancellationtoken stress execution dll stress execution dedicatedthreadworkerstrategy c b system private corelib ni dll system threading tasks task execute system private corelib ni dll system threading executioncontext run system threading executioncontext system threading contextcallback system object system private corelib ni dll system threading tasks task executewiththreadlocal system threading tasks task byref system private corelib ni dll system threading tasks task executeentry boolean system private corelib ni dll system threading executioncontext run system threading executioncontext system threading contextcallback system object libcoreclr so calldescrworkerinternal libcoreclr so calldescrworkerwithhandler calldescrdata int libcoreclr so methoddesccallsite calltargetworker unsigned long const libcoreclr so methoddesccallsite call unsigned long const libcoreclr so threadnative kickoffthread worker void libcoreclr so managedthreadbase dispatchinner managedthreadcallstate libcoreclr so managedthreadbase dispatchmiddle managedthreadcallstate libcoreclr so managedthreadbase dispatchouter managedthreadcallstate operator managedthreadbase dispatchouter managedthreadcallstate tryargs const lambda param operator param const libcoreclr so managedthreadbase dispatchouter managedthreadcallstate operator managedthreadbase dispatchouter managedthreadcallstate tryargs const libcoreclr so managedthreadbase dispatchouter managedthreadcallstate libcoreclr so managedthreadbase fulltransitionwithad adid void void void unhandledexceptionlocation libcoreclr so managedthreadbase kickoff adid void void void libcoreclr so threadnative kickoffthread void libcoreclr so thread intermediatethreadproc void libcoreclr so corunix cpalthread threadentry void libpthread so start thread libc so clone fault thread thread tid libc so gi raise sig at raise c name corerun stop reason signal sigabrt looking at the code for frame pal process cpp line it looks like the call to fork returns so it asserts with the message fork failed lldb fr s frame system native so systemnative forkandexecprocess filename home dotnetbot dotnetbuild work work exec execution corerun argv envp cwd redirectstdin redirectstdout redirectstderr childpid stdinfd stdoutfd stderrfd at pal process cpp lldb fr v const char 
filename home dotnetbot dotnetbuild work work exec execution corerun char const argv char const envp const char cwd t redirectstdin t redirectstdout t redirectstderr t childpid t stdinfd t stdoutfd t stderrfd int success int stdinfds int stdoutfds int stderrfds int waitforchildtoexecpipe int processid fork the child process if processid fork assert false fork failed success false goto done
| 1
|
178,738
| 30,000,360,617
|
IssuesEvent
|
2023-06-26 08:55:11
|
equinor/ert
|
https://api.github.com/repos/equinor/ert
|
opened
|
Implement localization/regularization that is built into the `fit` procedure of ensemble-based assimilation
|
need-design needs-discussion
|
Localization is a type of regularization (usually informed by position in time and space).
It is well known that regularization works best when built directly into the objective function of the `fit`, both when `H` is known and when it must be learnt.
This is in contrast to adaptive localization and distance-based localization.
Likely, a better approach is to
- [ ] Implement L1/L2 or elastic regularization for unstructured parameters
- [ ] Implement the Ensemble Information Filter for structured parameters
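For the first list item above, a minimal illustrative sketch of an elastic (L1/L2) penalty term, written as plain NumPy and independent of ert's internals, that such a fit would add to its data-mismatch objective:

```python
import numpy as np

def elastic_net_penalty(params, alpha=1.0, l1_ratio=0.5):
    """Elastic-net term: alpha * (l1_ratio * ||p||_1 + 0.5 * (1 - l1_ratio) * ||p||_2^2)."""
    p = np.asarray(params, dtype=float)
    l1 = np.abs(p).sum()              # sparsity-inducing L1 part
    l2 = 0.5 * float(np.dot(p, p))    # ridge (L2) part
    return alpha * (l1_ratio * l1 + (1.0 - l1_ratio) * l2)

def penalized_objective(residuals, params, alpha=1.0, l1_ratio=0.5):
    """Data mismatch plus the regularization term that the fit would minimize."""
    r = np.asarray(residuals, dtype=float)
    return 0.5 * float(np.dot(r, r)) + elastic_net_penalty(params, alpha, l1_ratio)
```

Setting `l1_ratio=1.0` reduces this to a pure L1 penalty and `l1_ratio=0.0` to a pure L2 (ridge) penalty; how such a penalty is weighted and minimized inside an ensemble update is exactly the design question this issue raises.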
|
1.0
|
Implement localization/regularization that is built into the `fit` procedure of ensemble-based assimilation - Localization is a type of regularization (usually informed by position in time and space).
It is well known that regularization works best when built directly into the objective function of the `fit`, both when `H` is known and when it must be learnt.
This is in contrast to adaptive localization and distance-based localization.
Likely, a better approach is to
- [ ] Implement L1/L2 or elastic regularization for unstructured parameters
- [ ] Implement the Ensemble Information Filter for structured parameters
|
non_process
|
implement localization regularization that is built in to the fit procedure of ensemble based assimilation localization is a type of usually informed on position in time and space regularization it is well known that regularization works best when directly built in to the objective function of the fit both when h is known and must be learnt this is in contrast to adaptive localization and distance based localization likely a better approach is to implement or elastic regularization for unstructured parameters implement the ensemble information filter for structured parameters
| 0
|
21,707
| 30,205,164,101
|
IssuesEvent
|
2023-07-05 08:56:46
|
bazelbuild/bazel
|
https://api.github.com/repos/bazelbuild/bazel
|
closed
|
Should JavaInfo.transitive_compile_time_jars contain the ijar?
|
type: support / not a bug (process) team-Rules-Java untriaged
|
### Description of the bug:
JavaInfo.transitive_compile_time_jars contains the compileJar passed in in the JavaInfo constructor. This was unexpected as I interpreted the documentation to say that it contains the jars that were used to compile this particular jar.
From documentation:
> Returns the transitive set of Jars required to build the target.
This to me implies it was the set of Jars needed to build the target, but not the target itself.
Anyway, not sure if this is a bug or if it is working as expected. If it is working as expected, the docs should probably be updated to be more clear like "Returns transitive set of Jars needed when this target is used as a compile-time dependency" or similar.
### What's the simplest, easiest way to reproduce this bug? Please provide a minimal example if possible.
Create a java target. Make an aspect that outputs the JavaInfo.transitive_compile_time_jars. Notice that it includes the ijar.
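For reference, a minimal aspect along these lines might look like the following (Starlark, Python-like syntax; the file name and aspect name are made up for illustration):

```python
# defs.bzl (illustrative name)
def _print_compile_jars_impl(target, ctx):
    # Print every jar in the provider; the target's own compile-time jar (the ijar/hjar)
    # shows up in this list as well, which is the behavior this issue asks about.
    if JavaInfo in target:
        for jar in target[JavaInfo].transitive_compile_time_jars.to_list():
            print(jar.path)
    return []

print_compile_jars = aspect(
    implementation = _print_compile_jars_impl,
)
```

Applying it with e.g. `bazel build //some:java_target --aspects=//:defs.bzl%print_compile_jars` prints the full depset, including the target's own ijar.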
### Which operating system are you running Bazel on?
Windows
### What is the output of `bazel info release`?
release 6.0.0
### If `bazel info release` returns `development version` or `(@non-git)`, tell us how you built Bazel.
_No response_
### What's the output of `git remote get-url origin; git rev-parse master; git rev-parse HEAD` ?
_No response_
### Is this a regression? If yes, please try to identify the Bazel commit where the bug was introduced.
_No response_
### Have you found anything relevant by searching the web?
_No response_
### Any other information, logs, or outputs that you want to share?
somewhat related to #8124
|
1.0
|
Should JavaInfo.transitive_compile_time_jars contain the ijar? - ### Description of the bug:
JavaInfo.transitive_compile_time_jars contains the compileJar passed in in the JavaInfo constructor. This was unexpected as I interpreted the documentation to say that it contains the jars that were used to compile this particular jar.
From documentation:
> Returns the transitive set of Jars required to build the target.
This to me implies it was the set of Jars needed to build the target, but not the target itself.
Anyway, not sure if this is a bug or if it is working as expected. If it is working as expected, the docs should probably be updated to be more clear like "Returns transitive set of Jars needed when this target is used as a compile-time dependency" or similar.
### What's the simplest, easiest way to reproduce this bug? Please provide a minimal example if possible.
Create a java target. Make an aspect that outputs the JavaInfo.transitive_compile_time_jars. Notice that it includes the ijar.
### Which operating system are you running Bazel on?
Windows
### What is the output of `bazel info release`?
release 6.0.0
### If `bazel info release` returns `development version` or `(@non-git)`, tell us how you built Bazel.
_No response_
### What's the output of `git remote get-url origin; git rev-parse master; git rev-parse HEAD` ?
_No response_
### Is this a regression? If yes, please try to identify the Bazel commit where the bug was introduced.
_No response_
### Have you found anything relevant by searching the web?
_No response_
### Any other information, logs, or outputs that you want to share?
somewhat related to #8124
|
process
|
should javainfo transitive compile time jars contain the ijar description of the bug javainfo transitive compile time jars contains the compilejar passed in in the javainfo constructor this was unexpected as i interpreted the documentation to say that it contains the jars that were used to compile this particular jar from documentation returns the transitive set of jars required to build the target this to me implies it was the set of jars needed to build the target but not the target itself anyway not sure if this is a bug or if it is working as expected if it is working as expected the docs should probably be updated to be more clear like returns transitive set of jars needed when this target is used as a compile time dependency or similar what s the simplest easiest way to reproduce this bug please provide a minimal example if possible create a java target make an aspect that outputs the javainfo transitive compile time jars notice that it includes the ijar which operating system are you running bazel on windows what is the output of bazel info release release if bazel info release returns development version or non git tell us how you built bazel no response what s the output of git remote get url origin git rev parse master git rev parse head no response is this a regression if yes please try to identify the bazel commit where the bug was introduced no response have you found anything relevant by searching the web no response any other information logs or outputs that you want to share somewhat related to
| 1
|
1,435
| 4,003,546,554
|
IssuesEvent
|
2016-05-12 01:02:51
|
metabase/metabase
|
https://api.github.com/repos/metabase/metabase
|
closed
|
Picking a date column for sorting results in "day" granularity sort order
|
Bug Correctness Priority/P1 Query Processor
|
When selecting a date column in the sort picker we incorrectly include "day" as the unit, causing rows to be sorted by day but sometimes not sorted correctly within each day.
Note that clicking a table header doesn't cause the unit to be included, so it sorts correctly. This can be used as a workaround until we fix this issue.
|
1.0
|
Picking a date column for sorting results in "day" granularity sort order - When selecting a date column in the sort picker we incorrectly include "day" as the unit, causing rows to be sorted by day but sometimes not sorted correctly within each day.
Note that clicking a table header doesn't cause the unit to be included, so it sorts correctly. This can be used as a workaround until we fix this issue.
|
process
|
picking a date column for sorting results in day granularity sort order when selecting a date column in the sort picker we incorrectly include day as the unit causing rows to be sorted by day but sometimes not sorted correctly within each day note that clicking a table header doesn t cause the unit to be included so it sorts correctly this can be used as a workaround until we fix this issue
| 1
|
4,600
| 7,448,110,992
|
IssuesEvent
|
2018-03-28 14:26:40
|
rogerthat-platform/plugin-rogerthat-control-center
|
https://api.github.com/repos/rogerthat-platform/plugin-rogerthat-control-center
|
closed
|
Add a configuration for embedded apps under build settings
|
priority_minor process_wontfix type_feature
|
See https://github.com/rogerthat-platform/rogerthat-android-client/issues/333.
Also https://github.com/rogerthat-platform/rogerthat-backend/issues/416 is related.
Should we update it completely or just add support for this property under build settings in build.yaml files?
|
1.0
|
Add a configuration for embedded apps under build settings - See https://github.com/rogerthat-platform/rogerthat-android-client/issues/333.
Also https://github.com/rogerthat-platform/rogerthat-backend/issues/416 is related.
Should we update it completely or just add support for this property under build settings in build.yaml files?
|
process
|
add a configuration for embedded apps under build settings see also is related should we update it completely or just add the support for this property under build settings in build yaml files
| 1
|
300,230
| 22,647,349,291
|
IssuesEvent
|
2022-07-01 10:00:21
|
phukiendienthoaigiare/Qua-Tang-Doanh-Nghiep-3-trong-1
|
https://api.github.com/repos/phukiendienthoaigiare/Qua-Tang-Doanh-Nghiep-3-trong-1
|
opened
|
Steps for Choosing Meaningful, Cost-Effective Corporate Tet Gifts
|
documentation help wanted good first issue question
|
It's time to set up meaningful corporate Tet gifts to show how much you value your employees, partners, and customers! But how do you choose a suitable year-end gift while keeping costs under control? The steps for choosing year-end corporate gifts shared below will hopefully offer you plenty of appealing suggestions.

Steps for Choosing Year-End Corporate Gifts

Choosing a meaningful corporate Tet gift can be a challenge! If you are wondering how to pick a year-end company gift that makes an impression, follow the suggested steps.

But first, you need to identify the recipient's tastes, weigh them against the company budget, and strike a balance between thoughtfulness and usefulness. All of this keeps the gift looking professional.

[Image: Steps for Choosing Meaningful, Cost-Effective Corporate Tet Gifts]

Just invest the time and care in choosing gifts by following the steps below, and you will get past every limit of price and taste.

Step 1: Know the recipient's taste

Learning the basics of the recipient's preferences is considered the biggest challenge in choosing corporate Tet gifts. Naturally, none of us wants to give someone a gift they won't like! So the first step is to consider the recipient's preferences.

Think about your relationship with the employee or customer. Pull together what you know about them: hobbies, interests, passions, and what keeps them engaged with you. This will help you choose the perfect corporate gift, whether it is a practical gift, a gift for a connoisseur, or even just a company greeting card.

Step 2: Set the budget

You don't need a huge budget to have great corporate Tet gifts. Review your planned budget so you can control how much you can spend on gifts for your customers and employees. Knowing the budget lets you choose the best year-end company gifts with a sensible level of investment.

A company's year-end gift is not just a gesture; it is also a smart investment and business move! Gifts for employees help maintain engagement and overall job satisfaction. And gifts for customers keep your company top of mind, building the brand and reaching new customer segments in the future.

Step 3: Personalize the gift

A corporate Tet gift that is too generic and undifferentiated is pointless. How do you avoid this? Add a personalized detail! Everyone likes seeing their name on something (except, of course, a bill to pay...). And when it comes to company year-end gifts, personalization goes a long way toward making the recipient feel recognized and special.

If you are looking for personalized corporate Tet gifts, look no further than the range of year-end gifts from Phụ Kiện Điện Thoại Giá Rẻ, with free printing or engraving of a logo or a personal/brand name.

Additional advice

Extra tip: misspelling someone's name in an email can be annoying but is still forgivable. Getting their name wrong on a gift, however, is not. Instead of warm feelings, the recipient will feel embarrassed or think you were careless, and the gift will backfire and miss its purpose.

"It's the thought that counts!" That's the old saying about gifts, isn't it? Well, it still holds. There is no truly quick way to choose and send the right company gifts to your customers and employees; you need to think it over a little. Putting the company logo on any corporate Tet gift is also a worthwhile option. It is all about the intent, the marketing strategy, and the brand development the gift needs to carry. Done right, corporate gifts leave a lasting impression on the recipient. So when you are unsure which meaningful corporate Tet gift to pick, take the time to weigh the steps above and you will reap the rewards!

Top 7 corporate gifts most used during Tet

1. AirPods Pro earphones
- Brand-new, unique, compact in-ear design
- Super-powerful H1 chip with near-zero-latency digital audio processing
- Built-in Active Noise Cancellation
- Up to 4.5 hours of listening with noise cancelling on and 5 hours with it off
- Up to 24 hours of listening when used with the charging case
- Fast charging and Qi-standard wireless charging for convenient recharging
- Keeps the earphones safe in light rain or sweat
- 100% factory sealed.

[Image: AirPods Pro earphones]
Reference price: https://phukiendienthoaigiare.com/product/tai-nghe-nhet-tai-in-ear-bluetooth-tws-5-0-jacqueline-app-mien-phi-in-logo-doanh-nghiep/

2. 10W wireless charging stand with 2-in-1 night light - Recci L07
- Modern design in an elegant white finish
- 10W/7.5W/5W wireless charging compatible with all phones on the market
- Auto cut-off once the battery is full, plus smart over-heat and overload protection
- Doubles as a phone stand for extra convenience
- Night light with 3 brightness levels and a warm, eye-friendly color temperature
- Safety certifications: CE, FCC, RoHS

[Image: 10W wireless charging stand with 2-in-1 night light - Recci L07]
Reference price: https://phukiendienthoaigiare.com/product/de-sac-khong-day-10w-kiem-den-ngu-2-trong-1-recci-l07/

3. Recci W03 mini Bluetooth speaker
- Extremely compact design with a strap, easy to carry.
- Bluetooth 5.0 with a stable connection within 10 m
- 3W output and great sound from a new-generation 40 mm driver
- 360-degree speaker technology spreads sound in every direction
- 500 mAh battery for up to 3.5 hours of music.
- Supports a 3.5 mm AUX connection

[Image: Recci W03 mini Bluetooth speaker]
Reference price: https://phukiendienthoaigiare.com/product/loa-bluetooth-mini-recci-w03/

4. PB1006 10,000 mAh power bank with QC3.0 fast charging
- PB1006 power bank, 10,000 mAh capacity, compatible with all phones currently on the market
- QC3.0 fast charging, fully charges an iPhone 11 in just 30 minutes
- Ultra-thin, lightweight design in many fashionable colors
- Auto cut-off once the battery is full, plus smart over-heat and overload protection
- Can charge 2 devices at the same time
- Safety certifications: RoHS/CE/MSDS
- Suitable as a corporate gift
- Custom company logo on request (MOQ = 100 pcs)

[Image: PB1006 10,000 mAh power bank with QC3.0 fast charging]
Reference price: https://phukiendienthoaigiare.com/product/pin-sac-du-phong-10000mah-sac-nhanh-qc3-0-pb1006/

5. V18 smart band
- V18 smart band with a compact, sporty design
- Heart-rate, SpO2 and sleep tracking...
- Easy notifications for calls and messages
- Music playback and camera control
- A wide range of sport modes for comfortable training
- Up to 15 days of battery life, IP67 water resistance

[Image: V18 smart band]
Reference price: https://phukiendienthoaigiare.com/product/vong-deo-tay-thong-minh-v18/

6. GS302 premium 3-in-1 tech gift set
- 10,000 mAh power bank with PD + QC3.0 18W fast charging
- QC3.0 PD 18W fast charger
- PD 18W fast-charging cable
- Safety certifications: CE/FCC/RoHS
- Certificate of origin: CO
- Suitable as a corporate gift, employee gift, promotional gift...

[Image: GS302 premium 3-in-1 tech gift set]
Reference price: https://phukiendienthoaigiare.com/product/set-qua-tang-cong-nghe-cao-cap-3-in-1-gs302/

7. GS401 4-in-1 corporate gift box for Tet 2022
- 10,000 mAh power bank with PD + QC3.0 18W fast charging
- Fast-charging cable
- Phone stand
- True wireless Bluetooth earphones
- Officially imported and covered by the manufacturer's warranty
- Safety certifications: CE/FCC/RoHS
- Certificate of origin: CO
- Suitable as a corporate gift, partner gift, conference gift, exhibition gift, employee gift, promotional gift...

[Image: GS401 4-in-1 corporate gift box for Tet 2022]
Reference price: https://phukiendienthoaigiare.com/product/qua-tang-doanh-nghiep-4-in-1-hop-qua-tet-2022-gs401/

Pick one of the corporate Tet gift items from Phụ Kiện Điện Thoại's trusted stock if you want your gifting campaign to be economical and successful. Many other products are updated regularly on the website so you can choose easily.

Contact the Hotline if you need advice or more details; our team is ready to serve around the clock.

The post "Steps for Choosing Meaningful, Cost-Effective Corporate Tet Gifts" (https://phukiendienthoaigiare.com/cac-buoc-lua-chon-qua-tang-doanh-nghiep-tet-y-nghia-va-tiet-kiem-chi-phi/) appeared first on Phụ kiện điện thoại (https://phukiendienthoaigiare.com).
|
1.0
|
Steps for Choosing Meaningful, Cost-Effective Corporate Tet Gifts - It's time to set up meaningful corporate Tet gifts to show how much you value your employees, partners, and customers! But how do you choose a suitable year-end gift while keeping costs under control? The steps for choosing year-end corporate gifts shared below will hopefully offer you plenty of appealing suggestions.

Steps for Choosing Year-End Corporate Gifts

Choosing a meaningful corporate Tet gift can be a challenge! If you are wondering how to pick a year-end company gift that makes an impression, follow the suggested steps.

But first, you need to identify the recipient's tastes, weigh them against the company budget, and strike a balance between thoughtfulness and usefulness. All of this keeps the gift looking professional.

[Image: Steps for Choosing Meaningful, Cost-Effective Corporate Tet Gifts]

Just invest the time and care in choosing gifts by following the steps below, and you will get past every limit of price and taste.

Step 1: Know the recipient's taste

Learning the basics of the recipient's preferences is considered the biggest challenge in choosing corporate Tet gifts. Naturally, none of us wants to give someone a gift they won't like! So the first step is to consider the recipient's preferences.

Think about your relationship with the employee or customer. Pull together what you know about them: hobbies, interests, passions, and what keeps them engaged with you. This will help you choose the perfect corporate gift, whether it is a practical gift, a gift for a connoisseur, or even just a company greeting card.

Step 2: Set the budget

You don't need a huge budget to have great corporate Tet gifts. Review your planned budget so you can control how much you can spend on gifts for your customers and employees. Knowing the budget lets you choose the best year-end company gifts with a sensible level of investment.

A company's year-end gift is not just a gesture; it is also a smart investment and business move! Gifts for employees help maintain engagement and overall job satisfaction. And gifts for customers keep your company top of mind, building the brand and reaching new customer segments in the future.

Step 3: Personalize the gift

A corporate Tet gift that is too generic and undifferentiated is pointless. How do you avoid this? Add a personalized detail! Everyone likes seeing their name on something (except, of course, a bill to pay...). And when it comes to company year-end gifts, personalization goes a long way toward making the recipient feel recognized and special.

If you are looking for personalized corporate Tet gifts, look no further than the range of year-end gifts from Phụ Kiện Điện Thoại Giá Rẻ, with free printing or engraving of a logo or a personal/brand name.

Additional advice

Extra tip: misspelling someone's name in an email can be annoying but is still forgivable. Getting their name wrong on a gift, however, is not. Instead of warm feelings, the recipient will feel embarrassed or think you were careless, and the gift will backfire and miss its purpose.

"It's the thought that counts!" That's the old saying about gifts, isn't it? Well, it still holds. There is no truly quick way to choose and send the right company gifts to your customers and employees; you need to think it over a little. Putting the company logo on any corporate Tet gift is also a worthwhile option. It is all about the intent, the marketing strategy, and the brand development the gift needs to carry. Done right, corporate gifts leave a lasting impression on the recipient. So when you are unsure which meaningful corporate Tet gift to pick, take the time to weigh the steps above and you will reap the rewards!

Top 7 corporate gifts most used during Tet

1. AirPods Pro earphones
- Brand-new, unique, compact in-ear design
- Super-powerful H1 chip with near-zero-latency digital audio processing
- Built-in Active Noise Cancellation
- Up to 4.5 hours of listening with noise cancelling on and 5 hours with it off
- Up to 24 hours of listening when used with the charging case
- Fast charging and Qi-standard wireless charging for convenient recharging
- Keeps the earphones safe in light rain or sweat
- 100% factory sealed.

[Image: AirPods Pro earphones]
Reference price: https://phukiendienthoaigiare.com/product/tai-nghe-nhet-tai-in-ear-bluetooth-tws-5-0-jacqueline-app-mien-phi-in-logo-doanh-nghiep/

2. 10W wireless charging stand with 2-in-1 night light - Recci L07
- Modern design in an elegant white finish
- 10W/7.5W/5W wireless charging compatible with all phones on the market
- Auto cut-off once the battery is full, plus smart over-heat and overload protection
- Doubles as a phone stand for extra convenience
- Night light with 3 brightness levels and a warm, eye-friendly color temperature
- Safety certifications: CE, FCC, RoHS

[Image: 10W wireless charging stand with 2-in-1 night light - Recci L07]
Reference price: https://phukiendienthoaigiare.com/product/de-sac-khong-day-10w-kiem-den-ngu-2-trong-1-recci-l07/

3. Recci W03 mini Bluetooth speaker
- Extremely compact design with a strap, easy to carry.
- Bluetooth 5.0 with a stable connection within 10 m
- 3W output and great sound from a new-generation 40 mm driver
- 360-degree speaker technology spreads sound in every direction
- 500 mAh battery for up to 3.5 hours of music.
- Supports a 3.5 mm AUX connection

[Image: Recci W03 mini Bluetooth speaker]
Reference price: https://phukiendienthoaigiare.com/product/loa-bluetooth-mini-recci-w03/

4. PB1006 10,000 mAh power bank with QC3.0 fast charging
- PB1006 power bank, 10,000 mAh capacity, compatible with all phones currently on the market
- QC3.0 fast charging, fully charges an iPhone 11 in just 30 minutes
- Ultra-thin, lightweight design in many fashionable colors
- Auto cut-off once the battery is full, plus smart over-heat and overload protection
- Can charge 2 devices at the same time
- Safety certifications: RoHS/CE/MSDS
- Suitable as a corporate gift
- Custom company logo on request (MOQ = 100 pcs)

[Image: PB1006 10,000 mAh power bank with QC3.0 fast charging]
Reference price: https://phukiendienthoaigiare.com/product/pin-sac-du-phong-10000mah-sac-nhanh-qc3-0-pb1006/

5. V18 smart band
- V18 smart band with a compact, sporty design
- Heart-rate, SpO2 and sleep tracking...
- Easy notifications for calls and messages
- Music playback and camera control
- A wide range of sport modes for comfortable training
- Up to 15 days of battery life, IP67 water resistance

[Image: V18 smart band]
Reference price: https://phukiendienthoaigiare.com/product/vong-deo-tay-thong-minh-v18/

6. GS302 premium 3-in-1 tech gift set
- 10,000 mAh power bank with PD + QC3.0 18W fast charging
- QC3.0 PD 18W fast charger
- PD 18W fast-charging cable
- Safety certifications: CE/FCC/RoHS
- Certificate of origin: CO
- Suitable as a corporate gift, employee gift, promotional gift...

[Image: GS302 premium 3-in-1 tech gift set]
Reference price: https://phukiendienthoaigiare.com/product/set-qua-tang-cong-nghe-cao-cap-3-in-1-gs302/

7. GS401 4-in-1 corporate gift box for Tet 2022
- 10,000 mAh power bank with PD + QC3.0 18W fast charging
- Fast-charging cable
- Phone stand
- True wireless Bluetooth earphones
- Officially imported and covered by the manufacturer's warranty
- Safety certifications: CE/FCC/RoHS
- Certificate of origin: CO
- Suitable as a corporate gift, partner gift, conference gift, exhibition gift, employee gift, promotional gift...

[Image: GS401 4-in-1 corporate gift box for Tet 2022]
Reference price: https://phukiendienthoaigiare.com/product/qua-tang-doanh-nghiep-4-in-1-hop-qua-tet-2022-gs401/

Pick one of the corporate Tet gift items from Phụ Kiện Điện Thoại's trusted stock if you want your gifting campaign to be economical and successful. Many other products are updated regularly on the website so you can choose easily.

Contact the Hotline if you need advice or more details; our team is ready to serve around the clock.

The post "Steps for Choosing Meaningful, Cost-Effective Corporate Tet Gifts" (https://phukiendienthoaigiare.com/cac-buoc-lua-chon-qua-tang-doanh-nghiep-tet-y-nghia-va-tiet-kiem-chi-phi/) appeared first on Phụ kiện điện thoại (https://phukiendienthoaigiare.com).
|
non_process
|
các bước lựa chọn quà tặng doanh nghiệp tết ý nghĩa và tiết kiệm chi phí đã đến lúc bạn cần set up quà tặng doanh nghiệp tết ý nghĩa để cho thấy bạn trân trọng nhân viên và đối tác khách hàng của mình như thế nào nhưng làm thế nào để bạn chọn được món quà tặng cuối năm phù hợp và tối ưu được chi phí một số bước lựa chọn quà tặng doanh nghiệp cuối năm được chia sẻ dưới đây hy vọng sẽ có nhiều gợi ý hấp dẫn với bạn các bước lựa chọn quà tặng doanh nghiệp cuối năm lựa chọn quà tặng doanh nghiệp tết ý nghĩa có thể là một thử thách và nếu bạn đang băn khoăn không biết làm thế nào để chọn một món quà cuối năm cho công ty gây ấn tượng thì hãy theo dõi các bước được gợi ý nhưng trước tiên bạn phải biết định vị được sở thích của người nhận cân đối với ngân sách công ty và cân bằng giữa sự chu đáo và tính hữu ích tất cả sẽ đảm bảo được tính chuyên nghiệp của món quà các bước lựa chọn quà tặng doanh nghiệp tết ý nghĩa và tiết kiệm chi phí chỉ cần dành thời gian đầu tư và sự nhiệt tâm trong khâu chọn quà theo các bước dưới đây chắc chắn bạn sẽ vượt qua mọi giới hạn về giá cá và sở thích bước nhận biết được “gu” của người nhận quà tìm hiểu và nắm được cơ bản về sở thích của người nhận quà được cho là thách thức lớn nhất của quá trình chọn quà tặng doanh nghiệp tết hẳn nhiên rằng chúng ta đều không muốn tặng họ một món quà mà họ sẽ không thích vì vậy bước đầu tiên là xem xét sở thích của người nhận hãy suy nghĩ về mối quan hệ của bạn với nhân viên hoặc khách hàng tổng hợp những gì bạn biết về họ – sở thích mối quan tâm niềm đam mê và điều gì thúc đẩy họ gắn kết với bạn điều này sẽ giúp bạn chọn được món quà doanh nghiệp hoàn hảo dù đó là món quà tiện ích món quà dành cho người sành ăn chơi hay thậm chí chỉ là tấm thiệp chúc mừng của công ty bước xác định ngân sách dự chi bạn không cần phải đầu tư ngân sách khủng để có những món quà tặng doanh nghiệp tết tuyệt vời hãy xem xét ngân sách dự chi của bạn để kiểm soát được số tiền bạn có thể chi trả khi tặng quà cho khách hàng và nhân viên của mình nắm được ngân sách sẽ cho phép bạn chọn những món quà cuối năm công ty tốt nhất với sự đầu tư một cách hợp lý quà tặng cuối năm của công ty không chỉ là một hành động – mà còn là một bước đầu tư và kinh doanh thông minh quà tặng cho nhân viên giúp duy trì sự gắn bó và hài lòng trong công việc nói chung và quà tặng khách hàng giúp giữ cho công ty của bạn luôn được chú trọng để xây dựng thương hiệu và giới thiệu đến tệp khách hàng mới hơn trong tương lai bước xây dựng quà tặng mang tính cá nhân hóa sẽ thật vô nghĩa nếu món quà tặng doanh nghiệp tết của bạn quá phổ thông và không có sự khác biệt làm thế nào để bạn tránh được điều này bạn có thể thêm một chi tiết mang tính cá nhân hóa mọi người đều thích nhìn thấy tên của họ trên một cái gì đó tất nhiên là ngoại trừ hóa đơn cần thanh toán… và khi nói đến quà tặng cuối năm của công ty yếu tố cá nhân hóa có thể đi một chặng đường dài để khiến người nhận cảm thấy được công nhận và đặc biệt nếu bạn đang tìm kiếm những món quà tặng doanh nghiệp tết cá nhân hóa thì không đâu khác ngoài hàng loạt quà tặng cuối năm được miễn phí in khắc logo tên cá nhân thương hiệu của phụ kiện điện thoại giá rẻ lời khuyên bổ sung mẹo bổ sung ghi sai tên ai đó trong email có thể gây khó chịu nhưng vẫn có thể tha thứ tuy nhiên sai tên của họ trên một món quà là điều không nên thay vì có những cảm giác ấm áp người nhận sẽ cảm thấy xấu hổ hoặc nghĩ rằng bạn bất cẩn và món quà đó sẽ phản tác dụng và không thực hiện đúng mục tiêu tặng quà ‘ý nghĩa mới là quan trọng ’ đó là câu thành ngữ lâu đời khi 
nói về quà tặng phải không vâng nó vẫn đúng không có cách nào thực sự nhanh chóng để chọn và gửi những món quà phù hợp của công ty cho khách hàng và nhân viên của bạn bạn cần phải suy nghĩ và cân nhắc một chút về nó dán logo của công ty lên bất kỳ món quà tặng doanh nghiệp tết cũng là giải pháp đáng giá đó là tất cả về chủ ý chiến lược marketing phát triển thương hiệu cần phải có trên món quà tặng khi thực hiện đúng quà tặng doanh nghiệp sẽ để lại ấn tượng lâu dài cho người nhận vì vậy khi bạn đang phân vân tìm quà tặng doanh nghiệp tết ý nghĩa hãy dành thời gian để cân nhắc các bước lựa chọn trên để gặt hái được thành quả top quà tặng doanh nghiệp được sử dụng nhiều nhất trong dịp tết tai nghe airpod pro thiết kế kiểu dáng in ear nhỏ gọn hoàn toàn mới và độc đáo chip siêu mạnh mẽ khả năng xử lý âm thanh kỹ thuật số với độ trễ gần như bằng không tích hợp công nghệ chống ồn chủ động active noise cancellation thời gian nghe nhạc khi bật chống ồn đến giờ và giờ khi tắt chống ồn khi dùng cùng hộp sạc có thể sử dụng để nghe nhạc đến hỗ trợ sạc nhanh chóng sạc không dây chuẩn qi tiện lợi khi sạc lại khả năng bảo vệ tai nghe an toàn dưới mưa nhỏ hoặc mồ hôi sản phẩm nguyên seal tai nghe airpod pro tham khảo giá chi tiết link đế sạc không dây kiêm đèn ngủ trong – recci thiết kế hiện đại với sắc trắng sang trọng sạc không dây tương thích với tất cả các thiết bị điện thoại đang có trên thị trường tính năng tự động ngắt sau khi sạc đầy pin cùng khả năng bảo vệ quá nhiệt quá tải thông minh có thể dùng làm giá đỡ điện thoại tiện ích khi sử dung đèn ngủ cấp ánh sáng nhiệt độ màu ấm bảo vệ mắt chứng nhận an toàn ce fcc rohs đế sạc không dây kiêm đèn ngủ trong – recci tham khảo giá chi tiết link loa bluetooth mini recci loa bluetooth mini recci thiết kế cực kỳ nhỏ gọn kèm dây đeo dễ dàng mang theo kết nối bluetooth kết nối ổn định trong phạm vi m công suất chất âm hoàn hảo với màng loa thế hệ mới công nghệ loa độ âm thanh lan tỏa theo mọi hướng dung lượng pin thời gian nghe nhạc lên đến giờ hỗ trợ kết nối cổng aux loa bluetooth mini recci tham khảo giá chi tiết link pin sạc dự phòng sạc nhanh pin sạc dự phòng dung lượng tương thích với tất cả các thiết bị điện thoại hiện có trên thị trường sạc nhanh sạc đầy điện thoại iphone chỉ trong vòng phút thiết kế siêu mỏng gọn nhẹ nhiều màu sắc thời trang dễ dàng lựa chọn tính năng tự động ngắt sau khi sạc đầy pin cùng khả năng bảo vệ quá nhiệt quá tải thông minh có thể sạc đồng thời thiết bị cùng lúc chứng nhận an toàn rohs ce msds phù hợp làm quà tặng doanh nghiệp tùy chỉnh logo doanh nghiệp theo yêu cầu moq pin sạc dự phòng sạc nhanh tham khảo giá chi tiết link vòng đeo tay thông minh vòng đeo tay thông minh với thiết kếnhỏ gọn năng động đo nhịp tim theo dõi giấc ngủ… dễ dàng nhận thông báo các cuộc gọi tin nhắn điều khiển phát nhạc chụp ảnh đa dạng chế độ thể thao thoải mái luyện tập thời lượng pin lên đến ngày chỉ số chống nước vòng đeo tay thông minh tham khảo giá chi tiết link set quà tặng công nghệ cao cấp pin sạc dự phòng sạc nhanh pd củ sạc nhanh pd cáp sạc nhanh pd chứng nhận an toàn ce fcc rohs chứng nhận xuất xứ hàng hóa co phù hợp làm quà tặng doanh nghiệp quà tặng nhân viên quà khuyến mãi… set quà tặng công nghệ cao cấp tham khảo giá chi tiết link quà tặng doanh nghiệp in hộp quà tết pin sạc dự phòng sạc nhanh pd cáp sạc nhanh đế giữ điện thoại tai nghe bluetooth true wireless nhập khẩu bảo hành chính hãng chứng nhận an toàn ce fcc rohs chứng nhận xuất xứ hàng hóa co phù hợp làm quà tặng doanh nghiệp quà tặng đối tác quà tặng khách hàng hội nghị quà 
tặng triển lãm quà tặng nhân viên quà khuyến mãi… quà tặng doanh nghiệp in hộp quà tết tham khảo giá chi tiết link lựa chọn ngay một trong những vật phẩm quà tặng doanh nghiệp tết trong kho hàng uy tín của phụ kiện điện thoại nếu bạn muốn chiến dịch tặng quà của mình tiết kiệm và thành công nhé còn rất nhiều sản phẩm khác được cập nhật thường xuyên trên website để bạn dễ chọn liên hệ hotline nếu bạn cần tư vấn hoặc tìm hiểu thông tin chi tiết đội ngũ của chúng tôi luôn sẵn sàng phục vụ the post appeared first on
| 0
|
446,464
| 31,478,294,196
|
IssuesEvent
|
2023-08-30 12:19:16
|
lampepfl/dotty
|
https://api.github.com/repos/lampepfl/dotty
|
closed
|
Bad url link on `dotty.epfl.ch/index.html`
|
itype:bug area:documentation area:doctool
|
## Compiler version
Nightly build
## Minimized code

There is an extra `_doc/` in the URL of the `Contributing` link
## Output
Nothing when clicking on the link
## Expectation
Should throw an error, and redirect to `https://dotty.epfl.ch/docs/contributing/index.html`
|
1.0
|
Bad url link on `dotty.epfl.ch/index.html` - ## Compiler version
Nightly build
## Minimized code

There is an extra `_doc/` in the URL of the `Contributing` link
## Output
Nothing when clicking on the link
## Expectation
Should throw an error, and redirect to `https://dotty.epfl.ch/docs/contributing/index.html`
|
non_process
|
bad url link on dotty epfl ch index html compiler version nightly build minimized code there is a extra doc in the url of contributing link output nothing when clicking on the link expectation should throw an error and redirect to
| 0
|
9,371
| 12,374,183,610
|
IssuesEvent
|
2020-05-19 00:46:24
|
allinurl/goaccess
|
https://api.github.com/repos/allinurl/goaccess
|
closed
|
VIRTUAL HOSTS Issue in realtime html
|
log-processing
|
I have multiple sites on an Apache server, and in the real-time HTML report I also want to display the virtual host name in the NOT FOUND URLS panel.
That way I can decide which host is having the 404 issue.
I tried the command below:
$ awk '$8=$1$8' access.log | goaccess -a -
But it gives an error.
Then I tried it with date and time formats:
$ sudo awk '$8=$1$8' /var/log/apache2/access.log | goaccess -a - --log-format='%v %h %^[%d:%t %^] "%r" %s %b "%R" "%u"' --date-format='%d/%b/%Y' --time-format=%T
But the output is blank.
So how do I achieve this in real time, both in the HTML report and on the terminal?
|
1.0
|
VIRTUAL HOSTS Issue in realtime html - I have multiple sites on an Apache server, and in the real-time HTML report I also want to display the virtual host name in the NOT FOUND URLS panel.
That way I can decide which host is having the 404 issue.
I tried the command below:
$ awk '$8=$1$8' access.log | goaccess -a -
But it gives an error.
Then I tried it with date and time formats:
$ sudo awk '$8=$1$8' /var/log/apache2/access.log | goaccess -a - --log-format='%v %h %^[%d:%t %^] "%r" %s %b "%R" "%u"' --date-format='%d/%b/%Y' --time-format=%T
But the output is blank.
So how do I achieve this in real time, both in the HTML report and on the terminal?
|
process
|
virtual hosts issue in realtime html i have multiple sites on apache server so in real time html in not found urls i want to also display virtual host name so i can decide which host is having issue i tried below command awk access log goaccess a but gives error then tried it with date time sudo awk var log access log goaccess a log format v h r s b r u date format d b y time format t but output is blank so how do i achieve this in realtime and both on terminal also
| 1
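For the virtual-host requirement described in the record above, a minimal Python sketch of the preprocessing step the reporter is attempting with awk; it assumes one access log per virtual host, and the host names and file paths below are placeholders, not part of the original report:
```python
import sys
from pathlib import Path

# Placeholder mapping of virtual hosts to their Apache access logs; adjust to the real setup.
VHOST_LOGS = {
    "site-one.example": Path("/var/log/apache2/site-one-access.log"),
    "site-two.example": Path("/var/log/apache2/site-two-access.log"),
}

def emit_vhost_prefixed_lines(out=sys.stdout) -> None:
    """Write '<vhost> <original line>' so the virtual host appears as the first token (%v)."""
    for vhost, log_path in VHOST_LOGS.items():
        with log_path.open(encoding="utf-8", errors="replace") as handle:
            for line in handle:
                out.write(f"{vhost} {line}")

if __name__ == "__main__":
    emit_vhost_prefixed_lines()
```
The output would then be piped into goaccess with the %v-first `--log-format` quoted in the second command of the report above.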
|
389,467
| 11,502,506,010
|
IssuesEvent
|
2020-02-12 19:14:26
|
poissonconsulting/dbflobr
|
https://api.github.com/repos/poissonconsulting/dbflobr
|
closed
|
use check_pk in save_flobs and save_all_flobs
|
Difficulty: 1 Novice Effort: 1 Low Priority: 3 Medium Type: Enhancement
|
currently check occurs inside the functions...better to use check_pk as developed for import functions.
|
1.0
|
use check_pk in save_flobs and save_all_flobs - currently check occurs inside the functions...better to use check_pk as developed for import functions.
|
non_process
|
use check pk in save flobs and save all flobs currently check occurs inside the functions better to use check pk as developed for import functions
| 0
|
22,164
| 30,707,924,376
|
IssuesEvent
|
2023-07-27 07:45:35
|
pingcap/tiflash
|
https://api.github.com/repos/pingcap/tiflash
|
closed
|
Interpreter & Executor & MPP unit test framework
|
type/enhancement help wanted good first issue type/testing component/mpp component/coprocessor component/compute
|
## Enhancement
1. Since there is no unit test for the Interpreter module, Interpreter test framework aims to supplement the content of the unit test, improve the stability of the module, and find more hidden bugs at home.
2. TiFlash planner needs to rely on unit tests to ensure the quality and progress of development during the refactoring process.
3. User can mock input columns and feed them into the "Execution streams(BlockInputStreams in TiFlash)", then they can test executors(except MPP related logic) of TiFlash locally.
4. Further more, MPP related tests will be supported.
5. When developing more features in the future, good unit testing can be used as the basis for development to further improve the efficiency and quality of code development.
## Non MPP tests development progress:
- [x] #4610
- [x] #4632
- [x] #4706
- [x] #4742
- [x] #4788
- [x] #4911
- [x] #4858
- [x] #5041
- [x] #5021
- [x] #5262
- [x] #5295
- [x] #5243
- [x] #5347
- [ ] #5345
- [x] #5432
- [ ] #5469
- [x] #5510
- [x] #5534
- [x] #5419
- [x] #5543
- [x] #5305
- [x] #5889 add test log when failed.
- [x] #5957
- [x] #5787
- [x] #5895
- [x] #5915
- [x] Test more types of executors[WIP].
- [ ] Support more expressions to test.
- [ ] More user friendly exception throw.
- [x] Check the legality and correctness of user input.
- [x] Add some easy to use template to write tests.
- [x] When result is wrong, print more user friendly message to help debug.
- [x] Add some built-in input columns for user to generate a bunch of tests.
- [x] Generate columns with different types and size. #5743
- [ ] Generate columns following different data distribution.
- [x] Support regression test like this https://github.com/risinglightdb/sqlplannertest-rs. Here is the detail https://github.com/pingcap/tiflash/issues/6676
- [x] A document to demonstrate how to use the test framework.
- [ ] Bind test framework with google bench.
- [x] New executor should implement Serializer. New BlockInputStream should implement appendInfo().
- [ ] Improve the speed of tests.
- [ ] A test generator like sqlsmith.(Just for fun)
- [x] #6160
- [x] #6561
- [x] #7059
## MPP tests development progress:
- [x] #5369
- [x] #5450
- [x] #5573
- [x] Orchestrate MPP Tasks.
- [x] Get task status of each TiFlash service. #5732
- [x] Mock Storage. #5573
- [x] #5732
- [ ] Write TPCH queries in the framework!
## Run with storage layer
- [x] MockStorage with StorageDeltaMerge. #6561
- [x] Support push down filter test #6573
**Feel free to ask me anything about the Test Framework.**
|
1.0
|
Interpreter & Executor & MPP unit test framework - ## Enhancement
1. Since there is no unit test for the Interpreter module, Interpreter test framework aims to supplement the content of the unit test, improve the stability of the module, and find more hidden bugs at home.
2. TiFlash planner needs to rely on unit tests to ensure the quality and progress of development during the refactoring process.
3. User can mock input columns and feed them into the "Execution streams(BlockInputStreams in TiFlash)", then they can test executors(except MPP related logic) of TiFlash locally.
4. Further more, MPP related tests will be supported.
5. When developing more features in the future, good unit testing can be used as the basis for development to further improve the efficiency and quality of code development.
## Non MPP tests development progress:
- [x] #4610
- [x] #4632
- [x] #4706
- [x] #4742
- [x] #4788
- [x] #4911
- [x] #4858
- [x] #5041
- [x] #5021
- [x] #5262
- [x] #5295
- [x] #5243
- [x] #5347
- [ ] #5345
- [x] #5432
- [ ] #5469
- [x] #5510
- [x] #5534
- [x] #5419
- [x] #5543
- [x] #5305
- [x] #5889 add test log when failed.
- [x] #5957
- [x] #5787
- [x] #5895
- [x] #5915
- [x] Test more types of executors[WIP].
- [ ] Support more expressions to test.
- [ ] More user friendly exception throw.
- [x] Check the legality and correctness of user input.
- [x] Add some easy to use template to write tests.
- [x] When result is wrong, print more user friendly message to help debug.
- [x] Add some built-in input columns for user to generate a bunch of tests.
- [x] Generate columns with different types and size. #5743
- [ ] Generate columns following different data distribution.
- [x] Support regression test like this https://github.com/risinglightdb/sqlplannertest-rs. Here is the detail https://github.com/pingcap/tiflash/issues/6676
- [x] A document to demonstrate how to use the test framework.
- [ ] Bind test framework with google bench.
- [x] New executor should implement Serializer. New BlockInputStream should implement appendInfo().
- [ ] Improve the speed of tests.
- [ ] A test generator like sqlsmith.(Just for fun)
- [x] #6160
- [x] #6561
- [x] #7059
## MPP tests development progress:
- [x] #5369
- [x] #5450
- [x] #5573
- [x] Orchestrate MPP Tasks.
- [x] Get task status of each TiFlash service. #5732
- [x] Mock Storage. #5573
- [x] #5732
- [ ] Write TPCH queries in the framework!
## Run with storage layer
- [x] MockStorage with StorageDeltaMerge. #6561
- [x] Support push down filter test #6573
**Feel free to ask me anything about the Test Framework.**
|
process
|
interpreter executor mpp unit test framework enhancement since there is no unit test for the interpreter module interpreter test framework aims to supplement the content of the unit test improve the stability of the module and find more hidden bugs at home tiflash planner needs to rely on unit tests to ensure the quality and progress of development during the refactoring process user can mock input columns and feed them into the execution streams blockinputstreams in tiflash then they can test executors except mpp related logic of tiflash locally further more mpp related tests will be supported when developing more features in the future good unit testing can be used as the basis for development to further improve the efficiency and quality of code development non mpp tests development progress add test log when failed test more types of executors support more expressions to test more user friendly exception throw check the legality and correctness of user input add some easy to use template to write tests when result is wrong print more user friendly message to help debug add some built in input columns for user to generate a bunch of tests generate columns with different types and size generate columns following different data distribution support regression test like this here is the detail a document to demonstrate how to use the test framework bind test framework with google bench new executor should implement serializer new blockinputstream should implement appendinfo improve the speed of tests a test generator like sqlsmith just for fun mpp tests development progress orchestrate mpp tasks get task status of each tiflash service mock storage write tpch queries in the framework run with storage layer mockstorage with storagedeltamerge support push down filter test feel free to ask me anything about the test framework
| 1
|
96,416
| 20,015,845,771
|
IssuesEvent
|
2022-02-01 11:59:51
|
RasaHQ/rasa
|
https://api.github.com/repos/RasaHQ/rasa
|
closed
|
implement train chunk for core featurizers
|
type:enhancement :sparkles: area:rasa-oss :ferris_wheel: research:scaling-ml-codebase
|
part of https://github.com/RasaHQ/rasa/issues/6836
core featurizers should create smth like `TrainingDataChunk` files in nlu
|
1.0
|
implement train chunk for core featurizers - part of https://github.com/RasaHQ/rasa/issues/6836
core featurizers should create smth like `TrainingDataChunk` files in nlu
|
non_process
|
implement train chunk for core featurizers part of core featurizers should create smth like trainingdatachunk files in nlu
| 0
|
17,305
| 23,122,143,216
|
IssuesEvent
|
2022-07-27 23:04:08
|
medic/cht-core
|
https://api.github.com/repos/medic/cht-core
|
opened
|
Release 3.15.0-FR-offline-user-replace
|
Type: Internal process
|
When development is ready to begin on a [Feature Release](https://docs.communityhealthtoolkit.org/core/releases/feature_releases/#release-names), an engineer on the appropriate Care Team or Allies should be nominated as a Release Engineer. They will be responsible for making sure the following tasks are followed, though not necessarily doing the work themselves.
# Planning
- [ ] Create a GH Milestone for the release.
- [ ] Add all the issues to be worked on to the Milestone.
- [ ] Have an actual named deployment and specific end user that will be testing this Feature Release. They need to test in production, on the latest version. No speculative Feature Releases.
- [ ] Assign an engineer as Release Engineer for this release.
# Development
- [ ] Create a new release branch in `cht-core` from the most recent release and call it `<major>.<minor>.<patch>-FR-<FEATURE-NAME>`. If latest is `3.15.0` and the feature is to "allow movies to be uploaded", call it `3.15.0-FR-movie-upload`. Done before the release so all PRs can be set to merge to this branch, and not to `master`.
- [ ] Set the version number in `package.json` and `package-lock.json` and submit a PR. The easiest way to do this is to use `npm --no-git-tag-version version <feature-release>`.
- [ ] Ensure QA is briefed and is partnering with the Trio to ensure early and often checks of the feature are on track to be of production quality from the start.
# Releasing
This is an iterative process and it's assumed there will be multiple numbered releases throughout development of the Feature Release.
- [ ] Build a beta named `<major>.<minor>.<patch>-FR-<FEATURE-NAME>-beta.1` by pushing a git tag and when CI completes successfully notify the QA team that it's ready for release testing. If an updated Feature Release is needed, increment the last `1` by calling it `<major>.<minor>.<patch>-FR-<FEATURE-NAME>-beta.2` etc.
# Close-out
- [ ] Validate with the actual end user that this Feature Release delivers a quantifiable improvement. If yes, plan on adding the feature to the next minor release by creating a new ticket to merge the code to `master`. If no, we leave the code dead in this branch, never to be merged to `master`, but still loved all the same.
- [ ] Mark this issue "done" and close the Milestone.
|
1.0
|
Release 3.15.0-FR-offline-user-replace - When development is ready to begin on a [Feature Release](https://docs.communityhealthtoolkit.org/core/releases/feature_releases/#release-names), an engineer on the appropriate Care Team or Allies should be nominated as a Release Engineer. They will be responsible for making sure the following tasks are followed, though not necessarily doing the work themselves.
# Planning
- [ ] Create a GH Milestone for the release.
- [ ] Add all the issues to be worked on to the Milestone.
- [ ] Have an actual named deployment and specific end user that will be testing this Feature Release. They need to test in production, on the latest version. No speculative Feature Releases.
- [ ] Assign an engineer as Release Engineer for this release.
# Development
- [ ] Create a new release branch in `cht-core` from the most recent release and call it `<major>.<minor>.<patch>-FR-<FEATURE-NAME>`. If latest is `3.15.0` and the feature is to "allow movies to be uploaded", call it `3.15.0-FR-movie-upload`. Done before the release so all PRs can be set to merge to this branch, and not to `master`.
- [ ] Set the version number in `package.json` and `package-lock.json` and submit a PR. The easiest way to do this is to use `npm --no-git-tag-version version <feature-release>`.
- [ ] Ensure QA is briefed and is partnering with the Trio to ensure early and often checks of the feature are on track to be of production quality from the start.
# Releasing
This is an iterative process and it's assumed there will be multiple numbered releases throughout development of the Feature Release.
- [ ] Build a beta named `<major>.<minor>.<patch>-FR-<FEATURE-NAME>-beta.1` by pushing a git tag and when CI completes successfully notify the QA team that it's ready for release testing. If an updated Feature Release is needed, increment the last `1` by calling it `<major>.<minor>.<patch>-FR-<FEATURE-NAME>-beta.2` etc.
# Close-out
- [ ] Validate with the actual end user that this Feature Release delivers a quantifiable improvement. If yes, plan on adding the feature to the next minor release by creating a new ticket to merge the code to `master`. If no, we leave the code dead in this branch, never to be merged to `master`, but still loved all the same.
- [ ] Mark this issue "done" and close the Milestone.
|
process
|
release fr offline user replace when development is ready to begin on a an engineer on the appropriate care team or allies should be nominated as a release engineer they will be responsible for making sure the following tasks are followed though not necessarily doing the work themselves planning create a gh milestone for the release add all the issues to be worked on to the milestone have an actual named deployment and specific end user that will be testing this feature release they need to test in production on the latest version no speculative feature releases assign an engineer as release engineer for this release development create a new release branch in cht core from the most recent release and call it fr if latest is and the feature is to allow movies to be uploaded call it fr movie upload done before the release so all prs can be set to merge to this branch and not to master set the version number in package json and package lock json and submit a pr the easiest way to do this is to use npm no git tag version version ensure qa is briefed and is partnering with the trio to ensure early and often checks of the feature are on track to be of production quality from the start releasing this is an iterative process and it s assumed there will be multiple numbered releases throughout development of the feature release build a beta named fr beta by pushing a git tag and when ci completes successfully notify the qa team that it s ready for release testing if an updated feature release is needed increment the last by calling it fr beta etc close out validate with the actual end user that this feature release delivers a quantifiable improvement if yes plan on adding the feature to the next minor release by creating a new ticket to merge the code to master if no we leave the code dead in this branch never to be merged to master but still loved all the same mark this issue done and close the milestone
| 1
|
21,480
| 3,899,898,041
|
IssuesEvent
|
2016-04-18 00:48:28
|
lnsp/tea
|
https://api.github.com/repos/lnsp/tea
|
opened
|
Add standard operators
|
needs-docs needs-tests
|
Add all operators from the specification to the Standard Runtime Environment.
## High priority
- [ ] add
- [ ] subtract
- [ ] multiply
- [ ] divide
|
1.0
|
Add standard operators - Add all operators from the specification to the Standard Runtime Environment.
## High priority
- [ ] add
- [ ] subtract
- [ ] multiply
- [ ] divide
|
non_process
|
add standard operators add all operators from the specification to the standard runtime environment high priority add subtract multiply divide
| 0
|
16,505
| 21,508,814,180
|
IssuesEvent
|
2022-04-28 00:32:46
|
RobertCraigie/prisma-client-py
|
https://api.github.com/repos/RobertCraigie/prisma-client-py
|
closed
|
Scalar relational fields are not included in `create_many` input
|
kind/improvement process/candidate level/intermediate priority/high
|
<!--
Thanks for helping us improve Prisma Client Python! 🙏 Please follow the sections in the template and provide as much information as possible about your problem, e.g. by enabling additional logging output.
See https://prisma-client-py.readthedocs.io/en/stable/reference/logging/ for how to enable additional logging output.
-->
## Bug description
<!-- A clear and concise description of what the bug is. -->
The following query should be supported.
```py
await client.profile.create_many(
data=[
{'user_id': 'a', 'description': 'Foo'},
{'user_id': 'b', 'description': 'Foo 2'},
],
)
```
## How to reproduce
<!--
Steps to reproduce the behavior:
1. Go to '...'
2. Change '....'
3. Run '....'
4. See error
-->
Run pyright on the above query
## Expected behavior
<!-- A clear and concise description of what you expected to happen. -->
No errors
## Prisma information
<!-- Your Prisma schema, Prisma Client Python queries, ...
Do not include your database credentials when sharing your Prisma schema! -->
Internal test schema
## Environment & setup
<!-- In which environment does the problem occur -->
- OS: <!--[e.g. Mac OS, Windows, Debian, CentOS, ...]--> Mac OS
- Database: <!--[PostgreSQL, MySQL, MariaDB or SQLite]--> PostgreSQL
- Python version: <!--[Run `python -V` to see your Python version]--> 3.9.9
|
1.0
|
Scalar relational fields are not included in `create_many` input - <!--
Thanks for helping us improve Prisma Client Python! 🙏 Please follow the sections in the template and provide as much information as possible about your problem, e.g. by enabling additional logging output.
See https://prisma-client-py.readthedocs.io/en/stable/reference/logging/ for how to enable additional logging output.
-->
## Bug description
<!-- A clear and concise description of what the bug is. -->
The following query should be supported.
```py
await client.profile.create_many(
data=[
{'user_id': 'a', 'description': 'Foo'},
{'user_id': 'b', 'description': 'Foo 2'},
],
)
```
## How to reproduce
<!--
Steps to reproduce the behavior:
1. Go to '...'
2. Change '....'
3. Run '....'
4. See error
-->
Run pyright on the above query
## Expected behavior
<!-- A clear and concise description of what you expected to happen. -->
No errors
## Prisma information
<!-- Your Prisma schema, Prisma Client Python queries, ...
Do not include your database credentials when sharing your Prisma schema! -->
Internal test schema
## Environment & setup
<!-- In which environment does the problem occur -->
- OS: <!--[e.g. Mac OS, Windows, Debian, CentOS, ...]--> Mac OS
- Database: <!--[PostgreSQL, MySQL, MariaDB or SQLite]--> PostgreSQL
- Python version: <!--[Run `python -V` to see your Python version]--> 3.9.9
|
process
|
scalar relational fields are not included in create many input thanks for helping us improve prisma client python 🙏 please follow the sections in the template and provide as much information as possible about your problem e g by enabling additional logging output see for how to enable additional logging output bug description the following query should be supported py await client profile create many data user id a description foo user id b description foo how to reproduce steps to reproduce the behavior go to change run see error run pyright on the above query expected behavior no errors prisma information your prisma schema prisma client python queries do not include your database credentials when sharing your prisma schema internal test schema environment setup os mac os database postgresql python version
| 1
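A self-contained sketch of the call the record above says should type-check, assuming a generated prisma-client-py client for a schema where `Profile` exposes the scalar foreign-key column `user_id` alongside its `user` relation; model and field names are taken from the query in the report, everything else is placeholder setup:
```python
import asyncio

from prisma import Prisma  # generated prisma-client-py client

async def main() -> None:
    client = Prisma()
    await client.connect()
    try:
        # Scalar relational field `user_id` used directly in create_many input,
        # mirroring the query from the bug report above.
        await client.profile.create_many(
            data=[
                {"user_id": "a", "description": "Foo"},
                {"user_id": "b", "description": "Foo 2"},
            ],
        )
    finally:
        await client.disconnect()

if __name__ == "__main__":
    asyncio.run(main())
```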
|
5,875
| 8,698,758,433
|
IssuesEvent
|
2018-12-05 00:52:32
|
googleapis/google-cloud-java
|
https://api.github.com/repos/googleapis/google-cloud-java
|
closed
|
javadoc_test CI build does not run mvn javadoc target
|
priority: p2 type: process
|
https://circleci.com/gh/GoogleCloudPlatform/google-cloud-java/8882?utm_campaign=vcs-integration-link&utm_medium=referral&utm_source=github-build-link should have failed on making the javadoc (https://github.com/GoogleCloudPlatform/google-cloud-java/pull/3459 fixes the javadoc error)
We should ensure that the CI build for javadoc_test does indeed run the javadoc target and throw an exception if there's an error.
This is an annoying issue for when google-cloud-java is being released, and the javadoc target is actually run.
|
1.0
|
javadoc_test CI build does not run mvn javadoc target - https://circleci.com/gh/GoogleCloudPlatform/google-cloud-java/8882?utm_campaign=vcs-integration-link&utm_medium=referral&utm_source=github-build-link should have failed on making the javadoc (https://github.com/GoogleCloudPlatform/google-cloud-java/pull/3459 fixes the javadoc error)
We should ensure that the CI build for javadoc_test does indeed run the javadoc target and throw an exception if there's an error.
This is an annoying issue for when google-cloud-java is being released, and the javadoc target is actually run.
|
process
|
javadoc test ci build does not run mvn javadoc target should have failed on making the javadoc fixes the javadoc error we should ensure that the ci build for javadoc test does indeed run the javadoc target and throw an exception if there s an error this is an annoying issue for when google cloud java is being released and the javadoc target is actually run
| 1
|
13,768
| 16,527,480,589
|
IssuesEvent
|
2021-05-26 22:25:39
|
googleapis/google-cloud-go
|
https://api.github.com/repos/googleapis/google-cloud-go
|
closed
|
bigtable: run tests against live bigtable for continuous test runs
|
api: bigtable type: process
|
today the bigtable tests are running under emulation for the continuous runs. As some tests require live bigtable, we should use the live instance for continuous.
|
1.0
|
bigtable: run tests against live bigtable for continuous test runs - today the bigtable tests are running under emulation for the continuous runs. As some tests require live bigtable, we should use the live instance for continuous.
|
process
|
bigtable run tests against live bigtable for continuous test runs today the bigtable tests are running under emulation for the continuous runs as some tests require live bigtable we should use the live instance for continuous
| 1
|
19,777
| 26,160,536,974
|
IssuesEvent
|
2022-12-31 12:55:20
|
apache/arrow-rs
|
https://api.github.com/repos/apache/arrow-rs
|
opened
|
Release `object_store` `YYYY` (next release after`0.5.2`)
|
development-process object-store
|
Follow on from https://github.com/apache/arrow-rs/issues/3229
* Planned Release Candidate: TBD
* Planned Release and Publish to crates.io: TBD
Items:
- [ ] Update changelog and readme:
- [ ] Create release candidate:
- [ ] Release candidate approved:
- [ ] Release to crates.io:
|
1.0
|
Release `object_store` `YYYY` (next release after`0.5.2`) - Follow on from https://github.com/apache/arrow-rs/issues/3229
* Planned Release Candidate: TBD
* Planned Release and Publish to crates.io: TBD
Items:
- [ ] Update changelog and readme:
- [ ] Create release candidate:
- [ ] Release candidate approved:
- [ ] Release to crates.io:
|
process
|
release object store yyyy next release after follow on from planned release candidate tbd planned release and publish to crates io tbd items update changelog and readme create release candidate release candidate approved release to crates io
| 1
|
405,015
| 11,866,041,361
|
IssuesEvent
|
2020-03-26 02:24:12
|
sonia-auv/hardware-utility-auv8
|
https://api.github.com/repos/sonia-auv/hardware-utility-auv8
|
closed
|
Unwanted sleep in RS485 thread
|
Priority: High Type: Bug
|
## Expected Behavior
Unread data should be removed and the serial communication should continue.
## Current Behavior
If an unused command is received, it should be removed after a certain amount of time.
## Possible Solution
Comment thread.sleep in function send.packet()
|
1.0
|
Unwanted sleep in RS485 thread - ## Expected Behavior
Unread data should be removed and the serial communication should continue.
## Current Behavior
If an unused command is received, it should be removed after a certain amount of time.
## Possible Solution
Comment thread.sleep in function send.packet()
|
non_process
|
unwanted sleep in thread expected behavior unread data should be revome and the serial communication should continue current behavior if an unsued command is recevied it should be revomed after a certain amount of time possible solution comment thread sleep in function send packet
| 0
|
56,042
| 6,499,302,542
|
IssuesEvent
|
2017-08-22 20:56:02
|
Lycanite/LycanitesMobs
|
https://api.github.com/repos/Lycanite/LycanitesMobs
|
closed
|
[Mounts] Non-Swimming Mounts Stuck In Water
|
bug testing
|
If you ride a mount that cannot swim into water they become stuck and you have to dismount and push them out the water or let them wander out on their own.
|
1.0
|
[Mounts] Non-Swimming Mounts Stuck In Water - If you ride a mount that cannot swim into water they become stuck and you have to dismount and push them out the water or let them wander out on their own.
|
non_process
|
non swimming mounts stuck in water if you ride a mount that cannot swim into water they become stuck and you have to dismount and push them out the water or let them wander out on their own
| 0
|
69,931
| 17,929,137,499
|
IssuesEvent
|
2021-09-10 06:42:21
|
tensorflow/tensorflow
|
https://api.github.com/repos/tensorflow/tensorflow
|
closed
|
Hwloc mirror - download issue
|
stat:awaiting response stat:awaiting tensorflower type:build/install stalled subtype: raspberry pi
|
I am trying to cross-compile TF for Raspberry Pi via Docker container - as described in the documentation. Unfortunately, it breaks with the following message:
ERROR: /workspace/tensorflow/core/BUILD:2432:1: no such package '@hwloc//': java.io.IOException: Error downloading [http://mirror.tensorflow.org/download.open-mpi.org/release/hwloc/v2.0/hwloc-2.0.3.tar.gz, https://download.open-mpi.org/release/hwloc/v2.0/hwloc-2.0.3.tar.gz] to /home/dmitry/tensorflow/bazel-ci_build-cache/.cache/bazel/_bazel_dmitry/eab0d61a99b6696edb3d2aff87b585e8/external/hwloc/hwloc-2.0.3.tar.gz: Tried to reconnect at offset 6,391,630 but server didn't support it and referenced by '//tensorflow/core:lib_internal_impl'
ERROR: Analysis of target '//tensorflow/tools/pip_package:build_pip_package' failed; build aborted: no such package '@hwloc//': java.io.IOException: Error downloading [http://mirror.tensorflow.org/download.open-mpi.org/release/hwloc/v2.0/hwloc-2.0.3.tar.gz, https://download.open-mpi.org/release/hwloc/v2.0/hwloc-2.0.3.tar.gz] to /home/dmitry/tensorflow/bazel-ci_build-cache/.cache/bazel/_bazel_dmitry/eab0d61a99b6696edb3d2aff87b585e8/external/hwloc/hwloc-2.0.3.tar.gz: Tried to reconnect at offset 6,391,630 but server didn't support it
Fixes tried with no luck:
- remove a semi-downloaded package from Bazel cache
- download it from the original site and put to a cache
- download from a mirror with Chrome (breaks in two parts)
**System information**
TF_BUILD_INFO = {
container_type: "pi-python3",
command: "tensorflow/tools/ci_build/pi/build_raspberry_pi.sh PI_ONE",
source_HEAD: "c407b045b8802f9eded430ef48be18cd85e4788c",
source_remote_origin: "https://github.com/tensorflow/tensorflow.git",
OS: "Linux",
kernel: "4.15.0-54-generic",
architecture: "x86_64",
processor: "Intel(R) Core(TM) i7-8550U CPU @ 1.80GHz",
processor_count: "8",
memory_total: "16305540 kB",
swap_total: "16657404 kB",
Bazel_version: "Build label: 0.24.1",
Java_version: "1.8.0_222-ea",
Python_version: "2.7.6",
gpp_version: "g++ (Ubuntu 4.8.4-2ubuntu1~14.04.4) 4.8.4",
swig_version: "",
NVIDIA_driver_version: "418.56",
CUDA_device_count: "0",
CUDA_device_names: "",
CUDA_toolkit_version: ""
}
**Provide the exact sequence of commands / steps that you executed before running into the problem**
CI_DOCKER_EXTRA_PARAMS="-e CI_BUILD_PYTHON=python3 -e CROSSTOOL_PYTHON_INCLUDE_PATH=/usr/include/python3.4" tensorflow/tools/ci_build/ci_build.sh PI-PYTHON3 tensorflow/tools/ci_build/pi/build_raspberry_pi.sh PI_ONE
|
1.0
|
Hwloc mirror - download issue - I am trying to cross-compile TF for Raspberry Pi via Docker container - as described in the documentation. Unfortunately, it breaks with the following message:
ERROR: /workspace/tensorflow/core/BUILD:2432:1: no such package '@hwloc//': java.io.IOException: Error downloading [http://mirror.tensorflow.org/download.open-mpi.org/release/hwloc/v2.0/hwloc-2.0.3.tar.gz, https://download.open-mpi.org/release/hwloc/v2.0/hwloc-2.0.3.tar.gz] to /home/dmitry/tensorflow/bazel-ci_build-cache/.cache/bazel/_bazel_dmitry/eab0d61a99b6696edb3d2aff87b585e8/external/hwloc/hwloc-2.0.3.tar.gz: Tried to reconnect at offset 6,391,630 but server didn't support it and referenced by '//tensorflow/core:lib_internal_impl'
ERROR: Analysis of target '//tensorflow/tools/pip_package:build_pip_package' failed; build aborted: no such package '@hwloc//': java.io.IOException: Error downloading [http://mirror.tensorflow.org/download.open-mpi.org/release/hwloc/v2.0/hwloc-2.0.3.tar.gz, https://download.open-mpi.org/release/hwloc/v2.0/hwloc-2.0.3.tar.gz] to /home/dmitry/tensorflow/bazel-ci_build-cache/.cache/bazel/_bazel_dmitry/eab0d61a99b6696edb3d2aff87b585e8/external/hwloc/hwloc-2.0.3.tar.gz: Tried to reconnect at offset 6,391,630 but server didn't support it
Fixes tried with no luck:
- remove a semi-downloaded package from Bazel cache
- download it from the original site and put to a cache
- download from a mirror with Chrome (breaks in two parts)
**System information**
TF_BUILD_INFO = {
container_type: "pi-python3",
command: "tensorflow/tools/ci_build/pi/build_raspberry_pi.sh PI_ONE",
source_HEAD: "c407b045b8802f9eded430ef48be18cd85e4788c",
source_remote_origin: "https://github.com/tensorflow/tensorflow.git",
OS: "Linux",
kernel: "4.15.0-54-generic",
architecture: "x86_64",
processor: "Intel(R) Core(TM) i7-8550U CPU @ 1.80GHz",
processor_count: "8",
memory_total: "16305540 kB",
swap_total: "16657404 kB",
Bazel_version: "Build label: 0.24.1",
Java_version: "1.8.0_222-ea",
Python_version: "2.7.6",
gpp_version: "g++ (Ubuntu 4.8.4-2ubuntu1~14.04.4) 4.8.4",
swig_version: "",
NVIDIA_driver_version: "418.56",
CUDA_device_count: "0",
CUDA_device_names: "",
CUDA_toolkit_version: ""
}
**Provide the exact sequence of commands / steps that you executed before running into the problem**
CI_DOCKER_EXTRA_PARAMS="-e CI_BUILD_PYTHON=python3 -e CROSSTOOL_PYTHON_INCLUDE_PATH=/usr/include/python3.4" tensorflow/tools/ci_build/ci_build.sh PI-PYTHON3 tensorflow/tools/ci_build/pi/build_raspberry_pi.sh PI_ONE
|
non_process
|
hwloc mirror download issue i am trying to cross compile tf for raspberry pi via docker container as described in documentation unfortunately it breaks with the following message error workspace tensorflow core build no such package hwloc java io ioexception error downloading to home dmitry tensorflow bazel ci build cache cache bazel bazel dmitry external hwloc hwloc tar gz tried to reconnect at offset but server didn t support it and referenced by tensorflow core lib internal impl error analysis of target tensorflow tools pip package build pip package failed build aborted no such package hwloc java io ioexception error downloading to home dmitry tensorflow bazel ci build cache cache bazel bazel dmitry external hwloc hwloc tar gz tried to reconnect at offset but server didn t support it fixes tried with no luck remove a semi downloaded package from bazel cache download it from the original site and put to a cache download from a mirror with chrome breaks in two parts system information tf build info container type pi command tensorflow tools ci build pi build raspberry pi sh pi one source head source remote origin os linux kernel generic architecture processor intel r core tm cpu processor count memory total kb swap total kb bazel version build label java version ea python version gpp version g ubuntu swig version nvidia driver version cuda device count cuda device names cuda toolkit version provide the exact sequence of commands steps that you executed before running into the problem ci docker extra params e ci build python e crosstool python include path usr include tensorflow tools ci build ci build sh pi tensorflow tools ci build pi build raspberry pi sh pi one
| 0
|
244,309
| 20,622,383,004
|
IssuesEvent
|
2022-03-07 18:43:51
|
awslabs/smithy-rs
|
https://api.github.com/repos/awslabs/smithy-rs
|
opened
|
Validate benchmarks reliability
|
question testing server
|
Since #1230 we run a benchmark using wrk against `origin/main` and any pull request and calculate the deviation.
We need to validate if the benchmark is reliable when run by the Github action runner. If it is not reliable we can
* Run the benchmark on hardware we own
* Change the benchmark to count instruction instead of using time
|
1.0
|
Validate benchmarks reliability - Since #1230 we run a benchmark using wrk against `origin/main` and any pull request and calculate the deviation.
We need to validate if the benchmark is reliable when run by the Github action runner. If it is not reliable we can
* Run the benchmark on hardware we own
* Change the benchmark to count instruction instead of using time
|
non_process
|
validate benchmarks reliability since we run a benchmark using wrk against origin main and any pull request and calculate the deviation we need to validate if the benchmark is reliable when run by the github action runner if it is not reliable we can run the benchmark on hardware we own change the benchmark to count instruction instead of using time
| 0
|
4,047
| 6,976,553,152
|
IssuesEvent
|
2017-12-12 11:29:31
|
DevExpress/testcafe-hammerhead
|
https://api.github.com/repos/DevExpress/testcafe-hammerhead
|
opened
|
Hammerhead does not inject own staff to pages with 'text/plain' content type
|
SYSTEM: resource processing TYPE: bug
|
It means that testcafe will hung after redirect (or perform any action) to such page .
|
1.0
|
Hammerhead does not inject own staff to pages with 'text/plain' content type - It means that testcafe will hung after redirect (or perform any action) to such page .
|
process
|
hammerhead does not inject own staff to pages with text plain content type it means that testcafe will hung after redirect or perform any action to such page
| 1
|
444,193
| 12,807,688,988
|
IssuesEvent
|
2020-07-03 12:03:27
|
kubernetes/kubeadm
|
https://api.github.com/repos/kubernetes/kubeadm
|
closed
|
kubeadm serves kube-scheduler and kube-controller metrics insecurely
|
area/ecosystem area/security priority/important-longterm
|
kubeadm serves kube-scheduler and kube-controller manager metrics insecurely outside of localhost, as reported here:
https://kubernetes.slack.com/archives/C2P1JHS2E/p1593237397449300
i need to double check this myself, but it feels like our --bind-address=127.0.0.1 is not sufficient to disable that.
for example:
`curl http://public-ip:10252/metrics`
flag refs:
https://kubernetes.io/docs/reference/command-line-tools-reference/kube-controller-manager/
https://kubernetes.io/docs/reference/command-line-tools-reference/kube-scheduler/
|
1.0
|
kubeadm serves kube-scheduler and kube-controller metrics insecurely - kubeadm serves kube-scheduler and kube-controller manager metrics insecurely outside of localhost, as reported here:
https://kubernetes.slack.com/archives/C2P1JHS2E/p1593237397449300
i need to double check this myself, but it feels like our --bind-address=127.0.0.1 is not sufficient to disable that.
for example:
`curl http://public-ip:10252/metrics`
flag refs:
https://kubernetes.io/docs/reference/command-line-tools-reference/kube-controller-manager/
https://kubernetes.io/docs/reference/command-line-tools-reference/kube-scheduler/
|
non_process
|
kubeadm serves kube scheduler and kube controller metrics insecurely kubeadm serves kube scheduler and kube controller manager metrics insecurely outside of localhost as reported here i need to double check this myself but it feels like our bind address is not sufficient to disable that for example curl flag refs
| 0
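A small probe equivalent to the `curl http://public-ip:10252/metrics` check in the record above, for confirming from outside the node whether the controller-manager and scheduler metrics endpoints answer on a routable address; the node address is a placeholder, and port 10251 for kube-scheduler is an assumption alongside the 10252 quoted in the report:
```python
import urllib.error
import urllib.request

NODE_ADDRESS = "203.0.113.10"  # placeholder: the node's public/routable IP
PORTS = {
    "kube-controller-manager": 10252,  # port quoted in the report above
    "kube-scheduler": 10251,           # assumed insecure metrics port
}

def probe_metrics(address: str, timeout: float = 3.0) -> None:
    """Report whether each component's /metrics endpoint answers on the given address."""
    for component, port in PORTS.items():
        url = f"http://{address}:{port}/metrics"
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                print(f"{component}: {url} answered HTTP {resp.status} -- metrics are exposed")
        except (urllib.error.URLError, OSError) as exc:
            print(f"{component}: {url} not reachable ({exc})")

if __name__ == "__main__":
    probe_metrics(NODE_ADDRESS)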
|
9,507
| 12,494,097,511
|
IssuesEvent
|
2020-06-01 10:30:50
|
hjaremko/io-rpg
|
https://api.github.com/repos/hjaremko/io-rpg
|
closed
|
Reorganize project
|
enhancement process improvement wontfix
|
- [x] remove any logic from entities
- [ ] implement all interfaces in services
- [ ] reinvent interfaces for using in services
- [x] add id to mapping
|
1.0
|
Reorganize project - - [x] remove any logic from entities
- [ ] implement all interfaces in services
- [ ] reinvent interfaces for using in services
- [x] add id to mapping
|
process
|
reorganize project remove any logic from entities implement all interfaces in services reinvent interfaces for using in services add id to mapping
| 1
|
20,161
| 26,714,122,237
|
IssuesEvent
|
2023-01-28 09:03:25
|
googleapis/google-cloud-php-service-control
|
https://api.github.com/repos/googleapis/google-cloud-php-service-control
|
closed
|
Your .repo-metadata.json file has a problem 🤒
|
type: process repo-metadata: lint
|
You have a problem with your .repo-metadata.json file:
Result of scan 📈:
* release_level must be equal to one of the allowed values in .repo-metadata.json
* api_shortname field missing from .repo-metadata.json
☝️ Once you address these problems, you can close this issue.
### Need help?
* [Schema definition](https://github.com/googleapis/repo-automation-bots/blob/main/packages/repo-metadata-lint/src/repo-metadata-schema.json): lists valid options for each field.
* [API index](https://github.com/googleapis/googleapis/blob/master/api-index-v1.json): for gRPC libraries **api_shortname** should match the subdomain of an API's **hostName**.
* Reach out to **go/github-automation** if you have any questions.
|
1.0
|
Your .repo-metadata.json file has a problem 🤒 - You have a problem with your .repo-metadata.json file:
Result of scan 📈:
* release_level must be equal to one of the allowed values in .repo-metadata.json
* api_shortname field missing from .repo-metadata.json
☝️ Once you address these problems, you can close this issue.
### Need help?
* [Schema definition](https://github.com/googleapis/repo-automation-bots/blob/main/packages/repo-metadata-lint/src/repo-metadata-schema.json): lists valid options for each field.
* [API index](https://github.com/googleapis/googleapis/blob/master/api-index-v1.json): for gRPC libraries **api_shortname** should match the subdomain of an API's **hostName**.
* Reach out to **go/github-automation** if you have any questions.
|
process
|
your repo metadata json file has a problem 🤒 you have a problem with your repo metadata json file result of scan 📈 release level must be equal to one of the allowed values in repo metadata json api shortname field missing from repo metadata json ☝️ once you address these problems you can close this issue need help lists valid options for each field for grpc libraries api shortname should match the subdomain of an api s hostname reach out to go github automation if you have any questions
| 1
|
271,008
| 20,618,831,494
|
IssuesEvent
|
2022-03-07 15:35:15
|
18F/gsa-small-business-experience
|
https://api.github.com/repos/18F/gsa-small-business-experience
|
closed
|
Update repository README
|
documentation development
|
- [x] Purpose and brief project description
- [ ] Links to (private) google drive locations
- [x] Local development instructions
- [x] Link to original federalist template, etc
|
1.0
|
Update repository README - - [x] Purpose and brief project description
- [ ] Links to (private) google drive locations
- [x] Local development instructions
- [x] Link to original federalist template, etc
|
non_process
|
update repository readme purpose and brief project description links to private google drive locations local development instructions link to original federalist template etc
| 0
|
184,186
| 14,971,694,170
|
IssuesEvent
|
2021-01-27 21:33:54
|
sandflow/ttconv
|
https://api.github.com/repos/sandflow/ttconv
|
closed
|
SCC reader documentation
|
documentation
|
Document internal architecture and data flow of the SCC reader as a file under `doc`
|
1.0
|
SCC reader documentation - Document internal architecture and data flow of the SCC reader as a file under `doc`
|
non_process
|
scc reader documentation document internal architecture and data flow of the scc reader as a file under doc
| 0
|
2,451
| 5,231,289,650
|
IssuesEvent
|
2017-01-30 01:12:47
|
pathikrit/better-files
|
https://api.github.com/repos/pathikrit/better-files
|
closed
|
Wrap over java.nio types instead of exposing them directly
|
feature Scala Platform Process
|
This is in response to https://contributors.scala-lang.org/t/adding-better-files-to-the-scala-platform
In regards to cross platform support it would be ideal if better-files wrapped over `java.nio` instead of exposing them directly. This would make it much easier to support better files API over different platforms which have differing capabilities and designs when it comes to IO support.
|
1.0
|
Wrap over java.nio types instead of exposing them directly - This is in response to https://contributors.scala-lang.org/t/adding-better-files-to-the-scala-platform
In regards to cross platform support it would be ideal if better-files wrapped over `java.nio` instead of exposing them directly. This would make it much easier to support better files API over different platforms which have differing capabilities and designs when it comes to IO support.
|
process
|
wrap over java nio types instead of exposing them directly this is in response to in regards to cross platform support it would be ideal if better files wrapped over java nio instead of exposing them directly this would make it much easier to support better files api over different platforms which have differing capabilities and designs when it comes to io support
| 1
|
44,717
| 18,168,314,530
|
IssuesEvent
|
2021-09-27 16:54:06
|
elastic/kibana
|
https://api.github.com/repos/elastic/kibana
|
closed
|
[Expressions] Implement partial results demo plugin
|
Feature:ExpressionLanguage loe:days Team:AppServicesSv impact:low
|
Part of #84051.
## Requirements
- Implement a demo plugin showing usage of the partial results.
- Provide functional tests to cover the partial results feature.
|
1.0
|
[Expressions] Implement partial results demo plugin - Part of #84051.
## Requirements
- Implement a demo plugin showing usage of the partial results.
- Provide functional tests to cover the partial results feature.
|
non_process
|
implement partial results demo plugin part of requirements implement a demo plugin showing usage of the partial results provide functional tests to cover the partial results feature
| 0
|
15,720
| 19,862,938,978
|
IssuesEvent
|
2022-01-22 04:41:20
|
ooi-data/RS01SLBS-LJ01A-11-OPTAAC103-streamed-optaa_sample
|
https://api.github.com/repos/ooi-data/RS01SLBS-LJ01A-11-OPTAAC103-streamed-optaa_sample
|
opened
|
🛑 Processing failed: TypeError
|
process
|
## Overview
`TypeError` found in `processing_task` task during run ended on 2022-01-22T04:41:19.396264.
## Details
Flow name: `RS01SLBS-LJ01A-11-OPTAAC103-streamed-optaa_sample`
Task name: `processing_task`
Error type: `TypeError`
Error message: 'NoneType' object is not subscriptable
<details>
<summary>Traceback</summary>
```
Traceback (most recent call last):
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/pipeline.py", line 157, in processing
process_dataset(
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/__init__.py", line 147, in process_dataset
append_to_zarr(mod_ds, store, enc, logger=logger)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/__init__.py", line 337, in append_to_zarr
existing_zarr = zarr.open_group(store, mode='a')
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/hierarchy.py", line 1190, in open_group
return Group(store, read_only=read_only, cache_attrs=cache_attrs,
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/hierarchy.py", line 116, in __init__
meta_bytes = store[mkey]
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/fsspec/mapping.py", line 135, in __getitem__
result = self.fs.cat(k)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/fsspec/asyn.py", line 91, in wrapper
return sync(self.loop, func, *args, **kwargs)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/fsspec/asyn.py", line 71, in sync
raise return_result
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/fsspec/asyn.py", line 25, in _runner
result[0] = await coro
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/fsspec/asyn.py", line 402, in _cat
raise ex
File "/srv/conda/envs/notebook/lib/python3.9/asyncio/tasks.py", line 442, in wait_for
return await fut
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/s3fs/core.py", line 887, in _cat_file
resp = await self._call_s3(
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/s3fs/core.py", line 280, in _call_s3
err = translate_boto_error(err)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/s3fs/errors.py", line 142, in translate_boto_error
code = error.response["Error"].get("Code")
TypeError: 'NoneType' object is not subscriptable
```
</details>
|
1.0
|
🛑 Processing failed: TypeError - ## Overview
`TypeError` found in `processing_task` task during run ended on 2022-01-22T04:41:19.396264.
## Details
Flow name: `RS01SLBS-LJ01A-11-OPTAAC103-streamed-optaa_sample`
Task name: `processing_task`
Error type: `TypeError`
Error message: 'NoneType' object is not subscriptable
<details>
<summary>Traceback</summary>
```
Traceback (most recent call last):
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/pipeline.py", line 157, in processing
process_dataset(
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/__init__.py", line 147, in process_dataset
append_to_zarr(mod_ds, store, enc, logger=logger)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/__init__.py", line 337, in append_to_zarr
existing_zarr = zarr.open_group(store, mode='a')
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/hierarchy.py", line 1190, in open_group
return Group(store, read_only=read_only, cache_attrs=cache_attrs,
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/hierarchy.py", line 116, in __init__
meta_bytes = store[mkey]
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/fsspec/mapping.py", line 135, in __getitem__
result = self.fs.cat(k)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/fsspec/asyn.py", line 91, in wrapper
return sync(self.loop, func, *args, **kwargs)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/fsspec/asyn.py", line 71, in sync
raise return_result
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/fsspec/asyn.py", line 25, in _runner
result[0] = await coro
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/fsspec/asyn.py", line 402, in _cat
raise ex
File "/srv/conda/envs/notebook/lib/python3.9/asyncio/tasks.py", line 442, in wait_for
return await fut
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/s3fs/core.py", line 887, in _cat_file
resp = await self._call_s3(
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/s3fs/core.py", line 280, in _call_s3
err = translate_boto_error(err)
File "/srv/conda/envs/notebook/lib/python3.9/site-packages/s3fs/errors.py", line 142, in translate_boto_error
code = error.response["Error"].get("Code")
TypeError: 'NoneType' object is not subscriptable
```
</details>
|
process
|
🛑 processing failed typeerror overview typeerror found in processing task task during run ended on details flow name streamed optaa sample task name processing task error type typeerror error message nonetype object is not subscriptable traceback traceback most recent call last file srv conda envs notebook lib site packages ooi harvester processor pipeline py line in processing process dataset file srv conda envs notebook lib site packages ooi harvester processor init py line in process dataset append to zarr mod ds store enc logger logger file srv conda envs notebook lib site packages ooi harvester processor init py line in append to zarr existing zarr zarr open group store mode a file srv conda envs notebook lib site packages zarr hierarchy py line in open group return group store read only read only cache attrs cache attrs file srv conda envs notebook lib site packages zarr hierarchy py line in init meta bytes store file srv conda envs notebook lib site packages fsspec mapping py line in getitem result self fs cat k file srv conda envs notebook lib site packages fsspec asyn py line in wrapper return sync self loop func args kwargs file srv conda envs notebook lib site packages fsspec asyn py line in sync raise return result file srv conda envs notebook lib site packages fsspec asyn py line in runner result await coro file srv conda envs notebook lib site packages fsspec asyn py line in cat raise ex file srv conda envs notebook lib asyncio tasks py line in wait for return await fut file srv conda envs notebook lib site packages core py line in cat file resp await self call file srv conda envs notebook lib site packages core py line in call err translate boto error err file srv conda envs notebook lib site packages errors py line in translate boto error code error response get code typeerror nonetype object is not subscriptable
| 1
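A note on the record above: the traceback fails inside `translate_boto_error` because `error.response` can be `None` for connection-level failures, so subscripting it raises the `TypeError`. Below is a minimal defensive sketch of the kind of guard that avoids this; the helper is illustrative only and is not the actual s3fs source.
```python
# Hypothetical defensive variant of an error-translation helper.
# Assumption: boto errors usually carry a `response` dict with an "Error" key,
# but some failures (e.g. connection-level errors) leave `response` as None.
def translate_boto_error(error):
    response = getattr(error, "response", None) or {}
    code = response.get("Error", {}).get("Code")
    if code in ("NoSuchKey", "404"):
        return FileNotFoundError(str(error))
    if code in ("AccessDenied", "403"):
        return PermissionError(str(error))
    # Fall back to the original exception when it cannot be classified.
    return error
```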
|
20,171
| 26,727,677,483
|
IssuesEvent
|
2023-01-29 22:30:08
|
evidence-dev/evidence
|
https://api.github.com/repos/evidence-dev/evidence
|
opened
|
.README files for all packages and sites
|
enhancement dev-process
|
Make it easier for open source contributors to navigate and understand the monorepo by including a readme in the root of each package and site.
|
1.0
|
.README files for all packages and sites - Make it easier for open source contributors to navigate and understand the monorepo by including a readme in the root of each package and site.
|
process
|
readme files for all packages and sites make it easier for open source contributors to navigate and understand the monorepo by including a readme in the root of each package and site
| 1
|
15,026
| 18,740,284,462
|
IssuesEvent
|
2021-11-04 12:52:16
|
streamnative/pulsar-flink
|
https://api.github.com/repos/streamnative/pulsar-flink
|
closed
|
Unable to serialize a case class with Option type fields
|
type/bug platform/data-processing
|
I'm currently trying to use Pulsar as a sink, and when serializing a Scala case class with an Option type field I'm getting the following error:
`org.apache.pulsar.shade.org.apache.avro.SchemaParseException: Illegal character in: None$`
Case class I'm using for testing:
```
case class Car(
name: String,
model: Option[Long]
)
```
And this is how I'm currently setting up the Pulsar sink:
```
val serializer = new PulsarSerializationSchemaWrapper.Builder[Car](AvroSer.of(classOf[Car]))
.usePojoMode(classOf[Car], RecordSchemaType.AVRO)
.build()
val clientConfigurationData = new ClientConfigurationData()
clientConfigurationData.setServiceUrl(serviceUrl)
val sink = new FlinkPulsarSink[Car](
adminUrl,
Optional.of(topic),
clientConfigurationData,
new Properties(),
serializer,
PulsarSinkSemantic.AT_LEAST_ONCE
)
```
I've tried a few different ways, but they all seem to throw some sort of error. If I just remove the Option field, the code just works. I wonder if there is any way around this issue.
Thanks in advance.
|
1.0
|
Unable to serialize a case class with Option type fields - I'm currently trying to use Pulsar as a sink, and when serializing a Scala case class with an Option type field I'm getting the following error:
`org.apache.pulsar.shade.org.apache.avro.SchemaParseException: Illegal character in: None$`
Case class I'm using for testing:
```
case class Car(
name: String,
model: Option[Long]
)
```
And this is how I'm currently setting up the Pulsar sink:
```
val serializer = new PulsarSerializationSchemaWrapper.Builder[Car](AvroSer.of(classOf[Car]))
.usePojoMode(classOf[Car], RecordSchemaType.AVRO)
.build()
val clientConfigurationData = new ClientConfigurationData()
clientConfigurationData.setServiceUrl(serviceUrl)
val sink = new FlinkPulsarSink[Car](
adminUrl,
Optional.of(topic),
clientConfigurationData,
new Properties(),
serializer,
PulsarSinkSemantic.AT_LEAST_ONCE
)
```
I've tried a few different ways, but they all seem to throw some sort of error. If I just remove the Option field, the code just works. I wonder if there is any way around this issue.
Thanks in advance.
|
process
|
unable to serialize a case class with option type fields i m currently trying to use pulsar as a sink and when serializing a scala case class with a option type field i m getting the following error org apache pulsar shade org apache avro schemaparseexception illegal character in none case class i m using for testing case class car name string model option and this is how i m currently setting up the pulsar sink val serializer new pulsarserializationschemawrapper builder avroser of classof usepojomode classof recordschematype avro build val clientconfigurationdata new clientconfigurationdata clientconfigurationdata setserviceurl serviceurl val sink new flinkpulsarsink adminurl optional of topic clientconfigurationdata new properties serializer pulsarsinksemantic at least once i ve tried a few different ways but them all seem to throw some sort of error if i just remove the option field the code just works i wonder if there is any way around this issue thanks in advance
| 1
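For context on the record above: Avro usually models an optional field as a union with `"null"`, which is what a Scala `Option[Long]` needs to map to. The short Python sketch below uses the third-party `fastavro` package; the schema and values are made up for illustration and this is not the connector's code.
```python
from fastavro import parse_schema
from fastavro.validation import validate

car_schema = {
    "type": "record",
    "name": "Car",
    "fields": [
        {"name": "name", "type": "string"},
        # Optional field: a union of null and long, defaulting to null.
        {"name": "model", "type": ["null", "long"], "default": None},
    ],
}

parsed = parse_schema(car_schema)

# Both a present and an absent `model` validate against the union type.
assert validate({"name": "corolla", "model": 2020}, parsed)
assert validate({"name": "corolla", "model": None}, parsed)
```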
|
666,978
| 22,394,307,023
|
IssuesEvent
|
2022-06-17 10:51:03
|
redhat-developer/odo
|
https://api.github.com/repos/redhat-developer/odo
|
closed
|
add support for `command` field in `container` components
|
kind/bug priority/High
|
https://github.com/devfile/registry/pull/102#issuecomment-1077744174
Odo is not able to use devfiles that have a `container` component with a custom `command`. The problem is that the current odo implementation overrides the default process in the deployment, which means that supervisord is not started. When supervisord is not running, odo is not able to execute any commands in the container.
A quick fix is for odo to ignore `command` and `args` fields in container components.
This has been done in https://github.com/redhat-developer/odo/issues/5620
The proper fix is going to be to keep using `supervisord` as pid1, and execute the `command` with `args` "on the side" as a separate process.
/kind bug
/priority high
|
1.0
|
add support for `command` field in `container` components - https://github.com/devfile/registry/pull/102#issuecomment-1077744174
Odo is not able to use devfiles that have a `container` component with a custom `command`. The problem is that the current odo implementation overrides the default process in the deployment, which means that supervisord is not started. When supervisord is not running, odo is not able to execute any commands in the container.
A quick fix is for odo to ignore `command` and `args` fields in container components.
This has been done in https://github.com/redhat-developer/odo/issues/5620
The proper fix is going to be to keep using `supervisord` as pid1, and execute the `command` with `args` "on the side" as a separate process.
/kind bug
/priority high
|
non_process
|
add support for command field in container components odo is not able to use devfiles that have container component with a custom command problem is that in the current odo implementation it overrides the default process in deployment which means that supervisord is not started when supervisord is not running odo is not able to execute any commands in the container a quick fix is for odo to ignore command and args fields in container components this has been done in the proper fix is going to be to keep using supervisord as and execute the commmand with args on the side as a separate process kind bug priority high
| 0
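The fix sketched in the issue above (keep `supervisord` as PID 1 and run the devfile `command`/`args` as a separate process) can be illustrated with a rough Python sketch. It is purely illustrative: odo itself is written in Go, and the command and function names here are placeholders.
```python
import subprocess


def start_side_process(command, args):
    # Launch the container component's custom command without replacing the
    # current process, so the supervisor keeps running as PID 1.
    return subprocess.Popen([command, *args])


if __name__ == "__main__":
    side = start_side_process("/bin/sh", ["-c", "echo hello from the side process"])
    # Placeholder for the supervisor's own work loop; in a real container this
    # would be supervisord handling the exec'd commands.
    side.wait()
```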
|
3,945
| 6,886,300,159
|
IssuesEvent
|
2017-11-21 18:58:51
|
mcellteam/neuropil_tools
|
https://api.github.com/repos/mcellteam/neuropil_tools
|
opened
|
Data model and data template
|
processor spine_head_ana
|
longer term - establish method for maintaining and updating data as tool improves
|
1.0
|
Data model and data template - longer term - establish method for maintaining and updating data as tool improves
|
process
|
data model and data template longer term establish method for maintaining and updating data as tool improves
| 1
|
13,521
| 16,058,094,358
|
IssuesEvent
|
2021-04-23 08:36:55
|
prisma/prisma
|
https://api.github.com/repos/prisma/prisma
|
opened
|
Error: Error in migration engine. Reason: [C:\Users\runneradmin\.cargo\git\checkouts\quaint-9f01e008b9a89c14\8196f13\src\connector\result_set\result_row.rs:59:64] index out of bounds: the len is 1 but the index is 10
|
bug/1-repro-available kind/bug process/candidate team/migrations
|
<!-- If required, please update the title to be clear and descriptive -->
Command: `prisma db push --preview-feature`
Version: `2.21.2`
Binary Version: `e421996c87d5f3c8f7eeadd502d4ad402c89464d`
Report: https://prisma-errors.netlify.app/report/13234
OS: `x64 win32 10.0.19042`
JS Stacktrace:
```
Error: Error in migration engine.
Reason: [C:\Users\runneradmin\.cargo\git\checkouts\quaint-9f01e008b9a89c14\8196f13\src\connector\result_set\result_row.rs:59:64] index out of bounds: the len is 1 but the index is 10
Please create an issue with your `schema.prisma` at
https://github.com/prisma/prisma/issues/new
at ChildProcess.<anonymous> (C:\Users\Lenovo\AppData\Roaming\npm\node_modules\prisma\build\index.js:55668:23)
at ChildProcess.emit (events.js:315:20)
at ChildProcess.EventEmitter.emit (domain.js:486:12)
at Process.ChildProcess._handle.onexit (internal/child_process.js:277:12)
```
Rust Stacktrace:
```
[C:\Users\runneradmin\.cargo\git\checkouts\quaint-9f01e008b9a89c14\8196f13\src\connector\result_set\result_row.rs:59:64] index out of bounds: the len is 1 but the index is 10
```
|
1.0
|
Error: Error in migration engine. Reason: [C:\Users\runneradmin\.cargo\git\checkouts\quaint-9f01e008b9a89c14\8196f13\src\connector\result_set\result_row.rs:59:64] index out of bounds: the len is 1 but the index is 10 - <!-- If required, please update the title to be clear and descriptive -->
Command: `prisma db push --preview-feature`
Version: `2.21.2`
Binary Version: `e421996c87d5f3c8f7eeadd502d4ad402c89464d`
Report: https://prisma-errors.netlify.app/report/13234
OS: `x64 win32 10.0.19042`
JS Stacktrace:
```
Error: Error in migration engine.
Reason: [C:\Users\runneradmin\.cargo\git\checkouts\quaint-9f01e008b9a89c14\8196f13\src\connector\result_set\result_row.rs:59:64] index out of bounds: the len is 1 but the index is 10
Please create an issue with your `schema.prisma` at
https://github.com/prisma/prisma/issues/new
at ChildProcess.<anonymous> (C:\Users\Lenovo\AppData\Roaming\npm\node_modules\prisma\build\index.js:55668:23)
at ChildProcess.emit (events.js:315:20)
at ChildProcess.EventEmitter.emit (domain.js:486:12)
at Process.ChildProcess._handle.onexit (internal/child_process.js:277:12)
```
Rust Stacktrace:
```
[C:\Users\runneradmin\.cargo\git\checkouts\quaint-9f01e008b9a89c14\8196f13\src\connector\result_set\result_row.rs:59:64] index out of bounds: the len is 1 but the index is 10
```
|
process
|
error error in migration engine reason index out of bounds the len is but the index is command prisma db push preview feature version binary version report os js stacktrace error error in migration engine reason index out of bounds the len is but the index is please create an issue with your schema prisma at at childprocess c users lenovo appdata roaming npm node modules prisma build index js at childprocess emit events js at childprocess eventemitter emit domain js at process childprocess handle onexit internal child process js rust stacktrace index out of bounds the len is but the index is
| 1
|
10,866
| 13,638,560,523
|
IssuesEvent
|
2020-09-25 09:34:01
|
googleapis/google-cloud-dotnet
|
https://api.github.com/repos/googleapis/google-cloud-dotnet
|
closed
|
Some Firestore conformance tests are currently ignored
|
api: firestore type: process
|
We need to fix the conformance tests, then unskip them.
|
1.0
|
Some Firestore conformance tests are currently ignored - We need to fix the conformance tests, then unskip them.
|
process
|
some firestore conformance tests are currently ignored we need to fix the conformance tests then unskip them
| 1
|
36,496
| 2,800,062,825
|
IssuesEvent
|
2015-05-13 07:31:53
|
ceylon/ceylon.language
|
https://api.github.com/repos/ceylon/ceylon.language
|
closed
|
two bugs in JS metamodel
|
BUG high priority JS runtime
|
Given:
```ceylon
import ceylon.language.meta {
type
}
shared [K+] sequence<K>(K+ ret) => ret;
shared void run2() {
[String+] k = [ "hi", "hello" ];
value s = `sequence<String>`;
print(s.type);
print(type(s.apply(*k)));
}
```
The first line prints `<null>`, which is clearly wrong. The second line blows up with:
Not enough arguments to function. Expected 1 but got only 0
@chochos would you take a look, please?
|
1.0
|
two bugs in JS metamodel - Given:
```ceylon
import ceylon.language.meta {
type
}
shared [K+] sequence<K>(K+ ret) => ret;
shared void run2() {
[String+] k = [ "hi", "hello" ];
value s = `sequence<String>`;
print(s.type);
print(type(s.apply(*k)));
}
```
The first line prints `<null>`, which is clearly wrong. The second line blows up with:
Not enough arguments to function. Expected 1 but got only 0
@chochos would you take a look, please?
|
non_process
|
two bugs in js metamodel given ceylon import ceylon language meta type shared sequence k ret ret shared void k value s sequence print s type print type s apply k the first line prints which is clearly wrong the second line blows up with not enough arguments to function expected but got only chochos would you take a look please
| 0
|
406,124
| 27,552,540,021
|
IssuesEvent
|
2023-03-07 15:48:51
|
NotCobaltDragon/TP1_PWM_AD
|
https://api.github.com/repos/NotCobaltDragon/TP1_PWM_AD
|
opened
|
app.c -> déclaration de fct -> updatestate
|
documentation
|
https://github.com/NotCobaltDragon/TP1_PWM_AD/blob/03475c8462621ea0cf134c0922264265bb3e668b/firmware/src/app.c#L159
Nothing is wrong, but it would be good to have a header comment block above the function
see your colleagues' comment: https://github.com/Ali-Z0/MINF-TP1-CM-AZ/issues/9 => same type of remark
|
1.0
|
app.c -> déclaration de fct -> updatestate - https://github.com/NotCobaltDragon/TP1_PWM_AD/blob/03475c8462621ea0cf134c0922264265bb3e668b/firmware/src/app.c#L159
Nothing is wrong, but it would be good to have a header comment block above the function
see your colleagues' comment: https://github.com/Ali-Z0/MINF-TP1-CM-AZ/issues/9 => same type of remark
|
non_process
|
app c déclaration de fct updatestate rien de faux mais il serait bien d avoir une cartouche au dessus de fonction voir commentaire de vos collègues même type de remarque
| 0
|
56,600
| 14,078,456,656
|
IssuesEvent
|
2020-11-04 13:35:58
|
themagicalmammal/android_kernel_samsung_a3xelte
|
https://api.github.com/repos/themagicalmammal/android_kernel_samsung_a3xelte
|
opened
|
CVE-2017-16526 (High) detected in linuxv3.10
|
security vulnerability
|
## CVE-2017-16526 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxv3.10</b></p></summary>
<p>
<p>Linux kernel source tree</p>
<p>Library home page: <a href=https://github.com/torvalds/linux.git>https://github.com/torvalds/linux.git</a></p>
<p>Found in HEAD commit: <a href="https://github.com/themagicalmammal/android_kernel_samsung_a3xelte/commit/ac11c9631a8abeed315b67913aab3ba7a400aef3">ac11c9631a8abeed315b67913aab3ba7a400aef3</a></p>
<p>Found in base branch: <b>cosmic-experimental-1.6</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>android_kernel_samsung_a3xelte/drivers/uwb/uwbd.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
drivers/uwb/uwbd.c in the Linux kernel before 4.13.6 allows local users to cause a denial of service (general protection fault and system crash) or possibly have unspecified other impact via a crafted USB device.
<p>Publish Date: 2017-11-04
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-16526>CVE-2017-16526</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2017-16526">https://nvd.nist.gov/vuln/detail/CVE-2017-16526</a></p>
<p>Release Date: 2017-11-04</p>
<p>Fix Resolution: 4.13.6</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2017-16526 (High) detected in linuxv3.10 - ## CVE-2017-16526 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxv3.10</b></p></summary>
<p>
<p>Linux kernel source tree</p>
<p>Library home page: <a href=https://github.com/torvalds/linux.git>https://github.com/torvalds/linux.git</a></p>
<p>Found in HEAD commit: <a href="https://github.com/themagicalmammal/android_kernel_samsung_a3xelte/commit/ac11c9631a8abeed315b67913aab3ba7a400aef3">ac11c9631a8abeed315b67913aab3ba7a400aef3</a></p>
<p>Found in base branch: <b>cosmic-experimental-1.6</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>android_kernel_samsung_a3xelte/drivers/uwb/uwbd.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
drivers/uwb/uwbd.c in the Linux kernel before 4.13.6 allows local users to cause a denial of service (general protection fault and system crash) or possibly have unspecified other impact via a crafted USB device.
<p>Publish Date: 2017-11-04
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-16526>CVE-2017-16526</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2017-16526">https://nvd.nist.gov/vuln/detail/CVE-2017-16526</a></p>
<p>Release Date: 2017-11-04</p>
<p>Fix Resolution: 4.13.6</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve high detected in cve high severity vulnerability vulnerable library linux kernel source tree library home page a href found in head commit a href found in base branch cosmic experimental vulnerable source files android kernel samsung drivers uwb uwbd c vulnerability details drivers uwb uwbd c in the linux kernel before allows local users to cause a denial of service general protection fault and system crash or possibly have unspecified other impact via a crafted usb device publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
| 0
|
16,605
| 21,659,274,017
|
IssuesEvent
|
2022-05-06 17:19:15
|
GoogleCloudPlatform/python-docs-samples
|
https://api.github.com/repos/GoogleCloudPlatform/python-docs-samples
|
opened
|
Enable codeql for python-docs-samples
|
type: process priority: p1
|
This repository does not have code scanning enabled, we should have it enabled. There are other dependency alerts enabled but not for code scanning.
|
1.0
|
Enable codeql for python-docs-samples - This repository does not have code scanning enabled, we should have it enabled. There are other dependency alerts enabled but not for code scanning.
|
process
|
enable codeql for python docs samples this repository does not have code scanning enabled we should have it enabled there are other dependency alerts enabled but not for code scanning
| 1
|
6,468
| 9,546,666,841
|
IssuesEvent
|
2019-05-01 20:37:17
|
pytorch/pytorch
|
https://api.github.com/repos/pytorch/pytorch
|
closed
|
CUDA tensors sent over multiprocessing channel don't synchronize
|
module: multiprocessing
|
@VitalyFedyunin noticed that when he stress tests sending CUDA tensors over multiprocessing, on occasion the tensor contents get corrupted. We further observed that inserting a synchronization before sending a CUDA tensor, we avoid the corruption. The guess is that we are not properly synchronizing CUDA tensors that get sent to other processes, and CUDA kernels queued on another process don't end up on the same stream as the originating one. This needs verification, however.
Vitaly, if you want to fill this issue in with more details, please do!
|
1.0
|
CUDA tensors sent over multiprocessing channel don't synchronize - @VitalyFedyunin noticed that when he stress tests sending CUDA tensors over multiprocessing, on occasion the tensor contents get corrupted. We further observed that inserting a synchronization before sending a CUDA tensor, we avoid the corruption. The guess is that we are not properly synchronizing CUDA tensors that get sent to other processes, and CUDA kernels queued on another process don't end up on the same stream as the originating one. This needs verification, however.
Vitaly, if you want to fill this issue in with more details, please do!
|
process
|
cuda tensors sent over multiprocessing channel don t synchronize vitalyfedyunin noticed that when he stress tests sending cuda tensors over multiprocessing on occasion the tensor contents get corrupted we further observed that inserting a synchronization before sending a cuda tensor we avoid the corruption the guess is that we are not properly synchronizing cuda tensors that get sent to other processes and cuda kernels queued on another process don t end up on the same stream as the originating one this needs verification however vitaly if you want to fill this issue in with more details please do
| 1
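A minimal sketch of the workaround described in the report above, i.e. synchronizing the producing stream before handing a CUDA tensor to another process. It assumes PyTorch on a CUDA-capable machine; the tensor size and queue setup are arbitrary illustration.
```python
import torch
import torch.multiprocessing as mp


def consumer(q):
    y = q.get()
    print(y.sum().item())  # expected: 2048.0


if __name__ == "__main__":
    mp.set_start_method("spawn")
    q = mp.Queue()
    c = mp.Process(target=consumer, args=(q,))
    c.start()

    x = torch.ones(1024, device="cuda")
    y = x * 2                 # kernel queued on the producer's CUDA stream
    torch.cuda.synchronize()  # the workaround: finish the kernel before sending
    q.put(y)

    c.join()  # keep the producer alive while the child uses the shared memory
```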
|