Column summary:

| Column | Dtype / kind | Stats |
|---|---|---|
| Unnamed: 0 | int64 | 0 to 832k |
| id | float64 | 2.49B to 32.1B |
| type | stringclasses | 1 value |
| created_at | stringlengths | 19 to 19 |
| repo | stringlengths | 7 to 112 |
| repo_url | stringlengths | 36 to 141 |
| action | stringclasses | 3 values |
| title | stringlengths | 1 to 744 |
| labels | stringlengths | 4 to 574 |
| body | stringlengths | 9 to 211k |
| index | stringclasses | 10 values |
| text_combine | stringlengths | 96 to 211k |
| label | stringclasses | 2 values |
| text | stringlengths | 96 to 188k |
| binary_label | int64 | 0 to 1 |
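The per-column summary above reports minimum/maximum string lengths (`stringlengths`) and distinct-value counts (`stringclasses`). These statistics can be reproduced with pandas; the sketch below uses a hypothetical two-row sample (column names follow the schema, row values are invented for illustration):

```python
import pandas as pd

# Hypothetical two-row sample; column names follow the schema, values are invented.
df = pd.DataFrame({
    "created_at": ["2022-03-24 23:12:24", "2016-08-27 21:40:09"],
    "action": ["closed", "opened"],
    "index": ["process", "non_process"],
})

# "stringlengths" stats: min and max string length of a column
lengths = df["created_at"].str.len()
print(lengths.min(), lengths.max())  # 19 19

# "stringclasses" stats: number of distinct values in a column
print(df["action"].nunique())  # 2
```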
Unnamed: 0: 16,295
id: 20,926,103,226
type: IssuesEvent
created_at: 2022-03-24 23:12:24
repo: metabase/metabase
repo_url: https://api.github.com/repos/metabase/metabase
action: closed
title: Error when filtering date by current quarter in Postgres
labels: Type:Bug Priority:P2 Database/Postgres Querying/Processor Querying/MBQL .Limitation Querying/Parameters & Variables .Reproduced
body:
**Describe the bug** When filtering by current quarter on a Postgres database the following error appears: `ERROR: invalid input syntax for type interval: "1 quarter" Position: 1550` **Logs** [metabase.log](https://github.com/metabase/metabase/files/8119658/metabase.log) **To Reproduce** Steps to reproduce the behavior: 1. Create any regular question 2. Create a filter using a date and filter by current quarter **Expected behavior** Filter by current quarter is actually filtering instead of giving an error **Screenshots** <img width="486" alt="image" src="https://user-images.githubusercontent.com/159423/155204725-6acc9ae0-3c18-4970-8b7a-04bb77719bf0.png"> **Metabase Diagnostic Info** ```json { "browser-info": { "language": "en-US", "platform": "MacIntel", "userAgent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/98.0.4758.102 Safari/537.36", "vendor": "Google Inc." }, "system-info": { "file.encoding": "UTF-8", "java.runtime.name": "OpenJDK Runtime Environment", "java.runtime.version": "11.0.14.1+1", "java.vendor": "Eclipse Adoptium", "java.vendor.url": "https://adoptium.net/", "java.version": "11.0.14.1", "java.vm.name": "OpenJDK 64-Bit Server VM", "java.vm.version": "11.0.14.1+1", "os.name": "Linux", "os.version": "4.9.43-17.39.amzn1.x86_64", "user.language": "en", "user.timezone": "GMT" }, "metabase-info": { "databases": [ "postgres" ], "hosting-env": "unknown", "application-database": "postgres", "application-database-details": { "database": { "name": "PostgreSQL", "version": "10.14" }, "jdbc-driver": { "name": "PostgreSQL JDBC Driver", "version": "42.2.23" } }, "run-mode": "prod", "version": { "date": "2022-02-17", "tag": "v0.42.1", "branch": "release-x.42.x", "hash": "629f4de" }, "settings": { "report-timezone": "US/Pacific" } } } ``` **Severity** Only way to work around this right now is to create native queries. We use quarterly queries quite often. 
:arrow_down: Please click the :+1: reaction instead of leaving a `+1` or `update?` comment
label: 1.0
index: process
text:
error when filtering date by current quarter in postgres describe the bug when filtering by current quarter on a postgres database the following error appears error invalid input syntax for type interval quarter position logs to reproduce steps to reproduce the behavior create any regular question create a filter using a date and filter by current quarter expected behavior filter by current quarter is actually filtering instead of giving an error screenshots img width alt image src metabase diagnostic info json browser info language en us platform macintel useragent mozilla macintosh intel mac os x applewebkit khtml like gecko chrome safari vendor google inc system info file encoding utf java runtime name openjdk runtime environment java runtime version java vendor eclipse adoptium java vendor url java version java vm name openjdk bit server vm java vm version os name linux os version user language en user timezone gmt metabase info databases postgres hosting env unknown application database postgres application database details database name postgresql version jdbc driver name postgresql jdbc driver version run mode prod version date tag branch release x x hash settings report timezone us pacific severity only way to work around this right now is to create native queries we use quarterly queries quite often arrow down please click the reaction instead of leaving a or update comment
binary_label: 1
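In the sample records, `binary_label` mirrors the `index` field: 1 for `process`, 0 for `non_process`. A minimal sketch of that mapping, assuming it holds across the whole dataset:

```python
def to_binary_label(index_value: str) -> int:
    # Assumed mapping: 1 for "process", 0 for anything else (e.g. "non_process")
    return 1 if index_value == "process" else 0

print([to_binary_label(x) for x in ["process", "non_process", "non_process"]])  # [1, 0, 0]
```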
Unnamed: 0: 29,243
id: 4,479,841,579
type: IssuesEvent
created_at: 2016-08-27 21:40:09
repo: mapbox/mapbox-gl-native
repo_url: https://api.github.com/repos/mapbox/mapbox-gl-native
action: closed
title: Test setting style properties to functions
labels: iOS macOS runtime styling tests
body:
The generated unit tests for style property setters and getters only exercise constant values, not functions. We should also test roundtripping functions too. That would’ve caught #6161. /cc @frederoni
label: 1.0
index: non_process
text:
test setting style properties to functions the generated unit tests for style property setters and getters only exercise constant values not functions we should also test roundtripping functions too that would’ve caught cc frederoni
binary_label: 0
Unnamed: 0: 54,205
id: 23,200,297,113
type: IssuesEvent
created_at: 2022-08-01 20:41:57
repo: cityofaustin/atd-data-tech
repo_url: https://api.github.com/repos/cityofaustin/atd-data-tech
action: closed
title: Panel Review Digital Scoring
labels: Type: Feature Impact: 2-Major Service: Apps Need: 1-Must Have Workgroup: SMO Project: Public-Private Partnership P3 Management
body:
### Judging Criteria - Quality/Appropriateness - Desirable - Feasible - Viable - Community Priorities ### Reviewer Rating - Yes - No - Not Sure - N/A - [x] create new object: `request_object` - similar to SSPR Rate Competencies (radio buttons) - `Overall rating`: green, yellow, red - multi-choice - `Questions` - rich text - `Final Comments` - rich text
label: 1.0
index: non_process
text:
panel review digital scoring judging criteria quality appropriateness desirable feasible viable community priorities reviewer rating yes no not sure n a create new object request object similar to sspr rate competencies radio buttons overall rating green yellow red multi choice questions rich text final comments rich text
binary_label: 0
Unnamed: 0: 3,970
id: 6,901,277,252
type: IssuesEvent
created_at: 2017-11-25 04:34:58
repo: Great-Hill-Corporation/quickBlocks
repo_url: https://api.github.com/repos/Great-Hill-Corporation/quickBlocks
action: closed
title: ethslurp range management
labels: apps-all status-inprocess tools-all type-bug
body:
When we run: ./ethslurp -b:4186279:-1 0x63c8c29af409bd31ec7ddeea58ff14f21e8980b0 (Notice that I put -1 on purpose) How it considers the -1?? How we deal with upper range if it is not higher than lower? Do we slurp everything?
label: 1.0
index: process
text:
ethslurp range management when we run ethslurp b notice that i put on purpose how it considers the how we deal with upper range if it is not higher than lower do we slurp everything
binary_label: 1
Unnamed: 0: 4,417
id: 7,299,829,713
type: IssuesEvent
created_at: 2018-02-26 21:25:59
repo: UKHomeOffice/dq-aws-transition
repo_url: https://api.github.com/repos/UKHomeOffice/dq-aws-transition
action: closed
title: Inventory ACL Processing
labels: DQ Data Ingest DQ Data Pipeline DQ Tranche 1 DQ Tranche 2 Production SSM processing
body:
Document configuration. - [ ] Get Python version - [ ] Check for custom monitoring setup - [ ] Check scheduled tasks/crontab - [ ] Check local users/groups - [ ] Check directory structure - [ ] Check jobs - [ ] job1 - [ ] job2 - [ ] Check local scripts - [ ] Compare scripts on Gitlab and servers and note differences - [ ] script1 - [ ] script2 - [ ] Note anything else ### Acceptance criteria - [ ] All the above completed
label: 1.0
index: process
text:
inventory acl processing document configuration get python version check for custom monitoring setup check scheduled tasks crontab check local users groups check directory structure check jobs check local scripts compare scripts on gitlab and servers and note differences note anything else acceptance criteria all the above completed
binary_label: 1
Unnamed: 0: 52,060
id: 27,359,741,759
type: IssuesEvent
created_at: 2023-02-27 15:08:20
repo: playcanvas/engine
repo_url: https://api.github.com/repos/playcanvas/engine
action: opened
title: Be able to set quality targets for assets/projects
labels: performance requires: discussion
body:
This is quite a broad topic as there are many areas that would need to be considered and is not limited to the engine The holistic goal would be to have different textures/materials and rendering settings based on the performance or hardware of the target device. For example, a low end mobile would use smaller textures and less material channels than a high end device. Materials seem to be more difficult to manage issue where having variants per quality level would help?
label: True
index: non_process
text:
be able to set quality targets for assets projects this is quite a broad topic as there are many areas that would need to be considered and is not limited to the engine the holistic goal would be to have different textures materials and rendering settings based on the performance or hardware of the target device for example a low end mobile would use smaller textures and less material channels than a high end device materials seem to be more difficult to manage issue where having variants per quality level would help
binary_label: 0
Unnamed: 0: 3,417
id: 2,685,396,256
type: IssuesEvent
created_at: 2015-03-29 23:53:13
repo: PlayWithMagic/PlayWithMagic
repo_url: https://api.github.com/repos/PlayWithMagic/PlayWithMagic
action: closed
title: Workup a product home page
labels: bug Design
body:
I've got an initial product home page built here: http://playwithmagic.github.io/PlayWithMagic/ ...it still has some broken links. I'll fix `em in the morning.
label: 1.0
index: non_process
text:
workup a product home page i ve got an initial product home page built here it still has some broken links i ll fix em in the morning
binary_label: 0
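The `text` field looks like a normalized concatenation of `title` and `body`: URLs removed, everything lowercased, and runs of punctuation and digits collapsed to single spaces. The exact pipeline is not given; the sketch below is an approximation that reproduces the `text` of the record above (it would diverge on records where Unicode apostrophes survive in the real data):

```python
import re

def clean(title: str, body: str) -> str:
    # Assumed pipeline: join title and body, drop URLs, lowercase,
    # then replace every run of non-letters with a single space.
    s = f"{title} - {body}"
    s = re.sub(r"http\S+", " ", s)   # strip URLs
    s = s.lower()
    s = re.sub(r"[^a-z]+", " ", s)   # punctuation and digits become spaces
    return s.strip()

title = "Workup a product home page"
body = ("I've got an initial product home page built here: "
        "http://playwithmagic.github.io/PlayWithMagic/ ...it still has some "
        "broken links. I'll fix `em in the morning.")
assert clean(title, body) == (
    "workup a product home page i ve got an initial product home page "
    "built here it still has some broken links i ll fix em in the morning"
)
```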
Unnamed: 0: 196,413
id: 6,927,728,916
type: IssuesEvent
created_at: 2017-12-01 00:20:04
repo: DMS-Aus/Roam
repo_url: https://api.github.com/repos/DMS-Aus/Roam
action: opened
title: Image widget doesn't set empty validation correctly
labels: bug :( priority/mid
body:
Setting the image widget will still show the red * for required even though it has been set.
label: 1.0
index: non_process
text:
image widget doesn t set empty validation correctly setting the image widget will still show the red for required even though it has been set
binary_label: 0
Unnamed: 0: 738
id: 4,148,278,764
type: IssuesEvent
created_at: 2016-06-15 10:21:40
repo: durhamatletico/durhamatletico-cms
repo_url: https://api.github.com/repos/durhamatletico/durhamatletico-cms
action: opened
title: Suggested UI for Summer Cup
labels: high priority information architecture
body:
Thinking about a simple registration procedure. 1. We tell captain to go to durhamatletico.com/..... 2. She's given a choice of English or Spanish for the form 3. She's given the following information: Cost is $250 per team To register a team/country, the captain should pay $50 deposit with balance due at first game Captain is given the following prompts: - Name? - Birthdate? - Phone and email? - ZIP code? - Name of team? - Does she have enough players? Does she want us to help populate her team? After submitting info, an email is sent to captain with confirmation and a link to make a payment. The email should be bilingual and include an option for the captain to make a cash deposit with David. Doing it this way, we will just collect captain's info. We can capture info from other players on site, with the tablet. (One issue will be speed -- it will probably take two minutes to capture info on each player. I could bring a laptop as backup during the first week. I'll be able to tether it to the tablet.) What do you think?
label: 1.0
index: non_process
text:
suggested ui for summer cup thinking about a simple registration procedure we tell captain to go to durhamatletico com she s given a choice of english or spanish for the form she s given the following information cost is per team to register a team country the captain should pay deposit with balance due at first game captain is given the following prompts name birthdate phone and email zip code name of team does she have enough players does she want us to help populate her team after submitting info an email is sent to captain with confirmation and a link to make a payment the email should be bilingual and include an option for the captain to make a cash deposit with david doing it this way we will just collect captain s info we can capture info from other players on site with the tablet one issue will be speed it will probably take two minutes to capture info on each player i could bring a laptop as backup during the first week i ll be able to tether it to the tablet what do you think
binary_label: 0
Unnamed: 0: 58,170
id: 6,576,378,415
type: IssuesEvent
created_at: 2017-09-11 19:36:26
repo: PowerShell/PowerShell
repo_url: https://api.github.com/repos/PowerShell/PowerShell
action: closed
title: Build is green if an MSI could not be produced due to e.g. a WiX compilation error
labels: Area-Test
body:
[Here](https://ci.appveyor.com/project/PowerShell/powershell/build/6.0.0-beta.6-5027/artifacts) is an example of a build that is green but should have failed because it could not produce an MSI due to a WiX compilation error. The WiX compilation error can also not be found in the log.
label: 1.0
index: non_process
text:
build is green if an msi could not be produced due to e g a wix compilation error is an example of a build that is green but should have failed because it could not produce an msi due to a wix compilation error the wix compilation error can also not be found in the log
binary_label: 0
Unnamed: 0: 34,625
id: 30,231,795,734
type: IssuesEvent
created_at: 2023-07-06 07:30:17
repo: spring-projects/spring-modulith
repo_url: https://api.github.com/repos/spring-projects/spring-modulith
action: closed
title: Cannot resolve Spring Framework dependencies
labels: in: infrastructure type: question resolution: invalid
body:
``` spring-modulith\spring-modulith-examples\spring-modulith-example-full>mvn package [INFO] Scanning for projects... [INFO] [INFO] -------------------------< org.springframework.modulith:spring-modulith-example-full >-------------------------- [INFO] Building Spring Modulith - Examples - Full Example 1.0.0-SNAPSHOT [INFO] from pom.xml [INFO] ----------------------------------------------------[ jar ]----------------------------------------------------- Downloading from spring-milestone: https://repo.spring.io/milestone/org/apache/tomcat/embed/tomcat-embed-core/10.1.10/tomcat-embed-core-10.1.10.pom Downloading from central: https://repo.maven.apache.org/maven2/org/apache/tomcat/embed/tomcat-embed-core/10.1.10/tomcat-embed-core-10.1.10.pom Downloaded from central: https://repo.maven.apache.org/maven2/org/apache/tomcat/embed/tomcat-embed-core/10.1.10/tomcat-embed-core-10.1.10.pom (1.8 kB at 11 kB/s) Downloading from spring-milestone: https://repo.spring.io/milestone/org/springframework/boot/spring-boot-starter-actuator/3.1.1/spring-boot-starter-actuator-3.1.1.pom ...... 
Downloaded from central: https://repo.maven.apache.org/maven2/com/structurizr/structurizr-export/1.8.3/structurizr-export-1.8.3.pom (1.7 kB at 66 kB/s) [INFO] ---------------------------------------------------------------------------------------------------------------- [INFO] BUILD FAILURE [INFO] ---------------------------------------------------------------------------------------------------------------- [INFO] Total time: 28.010 s [INFO] Finished at: 2023-06-27T13:20:26+05:30 [INFO] ---------------------------------------------------------------------------------------------------------------- [ERROR] Failed to execute goal on project spring-modulith-example-full: Could not resolve dependencies for project org.springframework.modulith:spring-modulith-example-full:jar:1.0.0-SNAPSHOT: Failed to collect dependencies at org.springframework.boot:spring-boot-starter-data-jpa:jar:3.1.1 -> org.springframework:spring-aspects:jar:6.0.10: Failed to read artifact descriptor for org.springframework:spring-aspects:jar:6.0.10: The following artifacts could not be resolved: org.springframework:spring-aspects:pom:6.0.10 (absent): Could not transfer artifact org.springframework:spring-aspects:pom:6.0.10 from/to central (https://repo.maven.apache.org/maven2): C:\Users\nagku\.m2\repository\org\springframework\spring-aspects\6.0.10\spring-aspects-6.0.10.pom.632571530660644704.tmp -> C:\Users\nagku\.m2\repository\org\springframework\spring-aspects\6.0.10\spring-aspects-6.0.10.pom -> [Help 1] [ERROR] [ERROR] To see the full stack trace of the errors, re-run Maven with the '-e' switch [ERROR] Re-run Maven using the '-X' switch to enable verbose output [ERROR] [ERROR] For more information about the errors and possible solutions, please read the following articles: [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException C:\temp\spring-modulith\spring-modulith-examples\spring-modulith-example-full>mvn upgrade [INFO] Scanning for projects... 
[INFO] [INFO] -------------------------< org.springframework.modulith:spring-modulith-example-full >-------------------------- [INFO] Building Spring Modulith - Examples - Full Example 1.0.0-SNAPSHOT [INFO] from pom.xml [INFO] ----------------------------------------------------[ jar ]----------------------------------------------------- [INFO] ---------------------------------------------------------------------------------------------------------------- [INFO] BUILD FAILURE [INFO] ---------------------------------------------------------------------------------------------------------------- [INFO] Total time: 0.629 s [INFO] Finished at: 2023-06-27T13:21:08+05:30 [INFO] ---------------------------------------------------------------------------------------------------------------- [ERROR] Unknown lifecycle phase "upgrade". You must specify a valid lifecycle phase or a goal in the format <plugin-prefix>:<goal> or <plugin-group-id>:<plugin-artifact-id>[:<plugin-version>]:<goal>. Available lifecycle phases are: pre-clean, clean, post-clean, validate, initialize, generate-sources, process-sources, generate-resources, process-resources, compile, process-classes, generate-test-sources, process-test-sources, generate-test-resources, process-test-resources, test-compile, process-test-classes, test, prepare-package, package, pre-integration-test, integration-test, post-integration-test, verify, install, deploy, pre-site, site, post-site, site-deploy, wrapper. -> [Help 1] [ERROR] [ERROR] To see the full stack trace of the errors, re-run Maven with the '-e' switch [ERROR] Re-run Maven using the '-X' switch to enable verbose output [ERROR] [ERROR] For more information about the errors and possible solutions, please read the following articles: [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/LifecyclePhaseNotFoundException ```
label: 1.0
index: non_process
text:
cannot resolve spring framework dependencies spring modulith spring modulith examples spring modulith example full mvn package scanning for projects building spring modulith examples full example snapshot from pom xml downloading from spring milestone downloading from central downloaded from central kb at kb s downloading from spring milestone downloaded from central kb at kb s build failure total time s finished at failed to execute goal on project spring modulith example full could not resolve dependencies for project org springframework modulith spring modulith example full jar snapshot failed to collect dependencies at org springframework boot spring boot starter data jpa jar org springframework spring aspects jar failed to read artifact descriptor for org springframework spring aspects jar the following artifacts could not be resolved org springframework spring aspects pom absent could not transfer artifact org springframework spring aspects pom from to central c users nagku repository org springframework spring aspects spring aspects pom tmp c users nagku repository org springframework spring aspects spring aspects pom to see the full stack trace of the errors re run maven with the e switch re run maven using the x switch to enable verbose output for more information about the errors and possible solutions please read the following articles c temp spring modulith spring modulith examples spring modulith example full mvn upgrade scanning for projects building spring modulith examples full example snapshot from pom xml build failure total time s finished at unknown lifecycle phase upgrade you must specify a valid lifecycle phase or a goal in the format or available lifecycle phases are pre clean clean post clean validate initialize generate sources process sources generate resources process resources compile process classes generate test sources process test sources generate test resources process test resources test compile process test classes test prepare 
package package pre integration test integration test post integration test verify install deploy pre site site post site site deploy wrapper to see the full stack trace of the errors re run maven with the e switch re run maven using the x switch to enable verbose output for more information about the errors and possible solutions please read the following articles
binary_label: 0
Unnamed: 0: 12,233
id: 14,743,658,156
type: IssuesEvent
created_at: 2021-01-07 14:14:06
repo: kdjstudios/SABillingGitlab
repo_url: https://api.github.com/repos/kdjstudios/SABillingGitlab
action: closed
title: Site 118 - SAB CC Issue
labels: anc-process anp-1 ant-bug ant-child/secondary has attachment
body:
In GitLab by @kdjstudios on Sep 6, 2019, 16:12 **Submitted by:** Amanda Jennings" <amanda.jennings@answernet.com> **Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/2019-09-06-52892/conversation **Server:** INternal **Client/Site:** 118 **Account:** 84600 **Issue:** I would like to look into this CC issue. This is acct 118-SR 84600 – Get Interactive Media. We did not get payment since it was decline so I called client and he is arguing me. The client has told me that he contacted the bank and the bank shows us never trying to run card etc. First we tried running in SAB, it said decline. So we tried again in the CC processing site they declined too (the report side) these are the errors we got. This one was from CC processing site: Account Number: XXXX0290 Expiration: 0323 Auth code: Card Holder: Michael LaPorte Invoice: 118-082920190943 Payment media: Status code: Declined Amount: 225.00 Trans Type: Capture Trans Date: 08-29-2019 Trans Time: 09:43:00 Transaction Code: This one was from CC processing site: Account Number: XXXX0290 Expiration: 0323 Auth code: Card Holder: Michael LaPorte Invoice: 118-082920190951 Payment media: Status code: Declined Amount: 225.00 Trans Type: Capture Trans Date: 08-29-2019 Trans Time: 09:51:25 Transaction Code: This is the one from SAB side: Account Number: 0290 Expiration: 03/2023 Auth code: Card Holder: Michael LaPorte Invoice: SAB-bad188c1-27eb-4334-a8c3-9fdfec8eb4ae Payment media: VISA Status code: Declined Amount: 225.00 Trans Type: Capture Trans Date: 08-29-2019 Trans Time: 13:42:12 Transaction Code: 0 Here is what SAB shows under Payment history: ![image](/uploads/a7c009289bca14e053f6d3cae58c3736/image.png) And this shows under Failed CC Transactions: - it shows successful but declined everywhere else! ![image](/uploads/e8e3918a979b1f9344ad4350ecb3337f/image.png)
1.0
Site 118 - SAB CC Issue - In GitLab by @kdjstudios on Sep 6, 2019, 16:12 **Submitted by:** Amanda Jennings" <amanda.jennings@answernet.com> **Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/2019-09-06-52892/conversation **Server:** INternal **Client/Site:** 118 **Account:** 84600 **Issue:** I would like to look into this CC issue. This is acct 118-SR 84600 – Get Interactive Media. We did not get payment since it was decline so I called client and he is arguing me. The client has told me that he contacted the bank and the bank shows us never trying to run card etc. First we tried running in SAB, it said decline. So we tried again in the CC processing site they declined too (the report side) these are the errors we got. This one was from CC processing site: Account Number: XXXX0290 Expiration: 0323 Auth code: Card Holder: Michael LaPorte Invoice: 118-082920190943 Payment media: Status code: Declined Amount: 225.00 Trans Type: Capture Trans Date: 08-29-2019 Trans Time: 09:43:00 Transaction Code: This one was from CC processing site: Account Number: XXXX0290 Expiration: 0323 Auth code: Card Holder: Michael LaPorte Invoice: 118-082920190951 Payment media: Status code: Declined Amount: 225.00 Trans Type: Capture Trans Date: 08-29-2019 Trans Time: 09:51:25 Transaction Code: This is the one from SAB side: Account Number: 0290 Expiration: 03/2023 Auth code: Card Holder: Michael LaPorte Invoice: SAB-bad188c1-27eb-4334-a8c3-9fdfec8eb4ae Payment media: VISA Status code: Declined Amount: 225.00 Trans Type: Capture Trans Date: 08-29-2019 Trans Time: 13:42:12 Transaction Code: 0 Here is what SAB shows under Payment history: ![image](/uploads/a7c009289bca14e053f6d3cae58c3736/image.png) And this shows under Failed CC Transactions: - it shows successful but declined everywhere else! ![image](/uploads/e8e3918a979b1f9344ad4350ecb3337f/image.png)
process
site sab cc issue in gitlab by kdjstudios on sep submitted by amanda jennings helpdesk server internal client site account issue i would like to look into this cc issue this is acct sr – get interactive media we did not get payment since it was decline so i called client and he is arguing me the client has told me that he contacted the bank and the bank shows us never trying to run card etc first we tried running in sab it said decline so we tried again in the cc processing site they declined too the report side these are the errors we got this one was from cc processing site account number expiration auth code card holder michael laporte invoice payment media status code declined amount trans type capture trans date trans time transaction code this one was from cc processing site account number expiration auth code card holder michael laporte invoice payment media status code declined amount trans type capture trans date trans time transaction code this is the one from sab side account number expiration auth code card holder michael laporte invoice sab payment media visa status code declined amount trans type capture trans date trans time transaction code here is what sab shows under payment history uploads image png and this shows under failed cc transactions it shows successful but declined everywhere else uploads image png
1
229,952
25,402,123,049
IssuesEvent
2022-11-22 12:54:38
elastic/cloudbeat
https://api.github.com/repos/elastic/cloudbeat
opened
Implement an AWS EC2 fetcher
8.7 Candidate Team:Cloud Security Posture
**Motivation** Create a fetcher to collect the EC2 resources to be evaluated by OPA. **Definition of done** What needs to be completed at the end of this task - [] use defsec 3rd party to fetch the data from the AWS account. - [] The required EC2 data is collected. - [] Fetcher's implementation is provider agnostic. **Related tasks/epics**
True
Implement an AWS EC2 fetcher - **Motivation** Create a fetcher to collect the EC2 resources to be evaluated by OPA. **Definition of done** What needs to be completed at the end of this task - [] use defsec 3rd party to fetch the data from the AWS account. - [] The required EC2 data is collected. - [] Fetcher's implementation is provider agnostic. **Related tasks/epics**
non_process
implement an aws fetcher motivation create a fetcher to collect the resources to be evaluated by opa definition of done what needs to be completed at the end of this task use defsec party to fetch the data from the aws account the required data is collected fetcher s implementation is provider agnostic related tasks epics
0
432,141
30,269,761,844
IssuesEvent
2023-07-07 14:29:50
aeon-toolkit/aeon
https://api.github.com/repos/aeon-toolkit/aeon
opened
[DOC] Add example to InceptionTime docstring for classification and regression
documentation deep learning
### Describe the issue linked to the documentation there are no inception time docstring examples ### Suggest a potential alternative/fix something like this @hadifawaz1999 ? Examples -------- >>> from aeon.classification.deep_learning.fcn import FCNClassifier >>> from aeon.datasets import load_unit_test >>> X_train, y_train = load_unit_test(split="train", return_X_y=True) >>> X_test, y_test = load_unit_test(split="test", return_X_y=True) >>> fcn = FCNClassifier(n_epochs=20,batch_size=4) # doctest: +SKIP >>> fcn.fit(X_train, y_train) # doctest: +SKIP FCNClassifier(...)
1.0
[DOC] Add example to InceptionTime docstring for classification and regression - ### Describe the issue linked to the documentation there are no inception time docstring examples ### Suggest a potential alternative/fix something like this @hadifawaz1999 ? Examples -------- >>> from aeon.classification.deep_learning.fcn import FCNClassifier >>> from aeon.datasets import load_unit_test >>> X_train, y_train = load_unit_test(split="train", return_X_y=True) >>> X_test, y_test = load_unit_test(split="test", return_X_y=True) >>> fcn = FCNClassifier(n_epochs=20,batch_size=4) # doctest: +SKIP >>> fcn.fit(X_train, y_train) # doctest: +SKIP FCNClassifier(...)
non_process
add example to inceptiontime docstring for classification and regression describe the issue linked to the documentation there are no inception time docstring examples suggest a potential alternative fix something like this examples from aeon classification deep learning fcn import fcnclassifier from aeon datasets import load unit test x train y train load unit test split train return x y true x test y test load unit test split test return x y true fcn fcnclassifier n epochs batch size doctest skip fcn fit x train y train doctest skip fcnclassifier
0
20,033
26,517,284,097
IssuesEvent
2023-01-18 21:58:50
MicrosoftDocs/azure-devops-docs
https://api.github.com/repos/MicrosoftDocs/azure-devops-docs
closed
Support expressions for agent `demands`
doc-enhancement devops/prod Pri1 devops-cicd-process/tech
If we are supporting this: ``` demands: - Agent.OS -equals Linux # Check if Agent.OS == Linux ``` can we also support expressions like this? ``` demands: - in(agent.name, 'Agent1', 'Agent2', 'Agent3', 'Agent4') ``` --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: e7541ee6-d2bb-84c0-fead-1aa8ee7d2372 * Version Independent ID: 5cf7c51e-37e1-6c67-e6c6-80262c4eb662 * Content: [Demands - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/demands?view=azure-devops&tabs=yaml#feedback) * Content Source: [docs/pipelines/process/demands.md](https://github.com/MicrosoftDocs/vsts-docs/blob/master/docs/pipelines/process/demands.md) * Product: **devops** * Technology: **devops-cicd-process** * GitHub Login: @steved0x * Microsoft Alias: **sdanie**
1.0
Support expressions for agent `demands` - If we are supporting this: ``` demands: - Agent.OS -equals Linux # Check if Agent.OS == Linux ``` can we also support expressions like this? ``` demands: - in(agent.name, 'Agent1', 'Agent2', 'Agent3', 'Agent4') ``` --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: e7541ee6-d2bb-84c0-fead-1aa8ee7d2372 * Version Independent ID: 5cf7c51e-37e1-6c67-e6c6-80262c4eb662 * Content: [Demands - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/demands?view=azure-devops&tabs=yaml#feedback) * Content Source: [docs/pipelines/process/demands.md](https://github.com/MicrosoftDocs/vsts-docs/blob/master/docs/pipelines/process/demands.md) * Product: **devops** * Technology: **devops-cicd-process** * GitHub Login: @steved0x * Microsoft Alias: **sdanie**
process
support expressions for agent demands if we are supporting this demands agent os equals linux check if agent os linux can we also support expressions like this demands in agent name document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id fead version independent id content content source product devops technology devops cicd process github login microsoft alias sdanie
1
26,149
6,755,404,852
IssuesEvent
2017-10-24 00:17:44
jascam/CodePlexFoo
https://api.github.com/repos/jascam/CodePlexFoo
closed
Create Example: CSVstoServerDocument
bug CodePlexMigrationInitiated impact: Low
This sample demonstrates how to use the ServerDocument class to extract information from a VSTO customized Word document or Excel Workbook; and also how to programmatically add / remove VSTO customizations. #### Migrated CodePlex Work Item Details CodePlex Work Item ID: '2710' Vote count: '1'
1.0
Create Example: CSVstoServerDocument - This sample demonstrates how to use the ServerDocument class to extract information from a VSTO customized Word document or Excel Workbook; and also how to programmatically add / remove VSTO customizations. #### Migrated CodePlex Work Item Details CodePlex Work Item ID: '2710' Vote count: '1'
non_process
create example csvstoserverdocument this sample demonstrates how to use the serverdocument class to extract information from a vsto customized word document or excel workbook and also how to programmatically add remove vsto customizations migrated codeplex work item details codeplex work item id vote count
0
223,941
17,146,018,374
IssuesEvent
2021-07-13 14:40:36
geosolutions-it/MapStore2
https://api.github.com/repos/geosolutions-it/MapStore2
opened
Updated User Guide with Theme and Edit Style options
C124-BRGM-2021-SUPPORT Documentation User Guide
Update of the following sections: * **Application Context** section with #6986 * **Managing Layer Settings** with #6879 and #6967
1.0
Updated User Guide with Theme and Edit Style options - Update of the following sections: * **Application Context** section with #6986 * **Managing Layer Settings** with #6879 and #6967
non_process
updated user guide with theme and edit style options update of the following sections application context section with managing layer settings with and
0
18,910
24,848,829,807
IssuesEvent
2022-10-26 18:10:21
apache/arrow-rs
https://api.github.com/repos/apache/arrow-rs
closed
failed to pass test cases while enabling feature chrono-tz
bug development-process
```bash cargo test --features chrono-tz ``` ```bash failures: ---- compute::kernels::cast::tests::test_cast_timestamp_to_string stdout ---- [arrow/src/compute/kernels/cast.rs:3852] &array = PrimitiveArray<Timestamp(Millisecond, None)> [ 1997-05-19T00:00:00.005, 2018-12-25T00:00:00.001, null, ] thread 'compute::kernels::cast::tests::test_cast_timestamp_to_string' panicked at 'assertion failed: `(left == right)` left: `"1997-05-19 00:00:00.005"`, right: `"1997-05-19 00:00:00.005 +00:00"`', arrow/src/compute/kernels/cast.rs:3856:9 ---- compute::kernels::cast::tests::test_timestamp_cast_utf8 stdout ---- thread 'compute::kernels::cast::tests::test_timestamp_cast_utf8' panicked at 'assertion failed: `(left == right)` left: `StringArray [ "1970-01-01 20:30:00 +10:00", null, "1970-01-02 09:58:59 +10:00", ]`, right: `StringArray [ "1970-01-01 20:30:00", null, "1970-01-02 09:58:59", ]`', arrow/src/compute/kernels/cast.rs:5762:9 ---- csv::writer::tests::test_export_csv_timestamps stdout ---- thread 'csv::writer::tests::test_export_csv_timestamps' panicked at 'assertion failed: `(left == right)` left: `Some("c1,c2\n2019-04-18T20:54:47.378000000+10:00,2019-04-18T10:54:47.378000000+00:00\n2021-10-30T17:59:07.000000000+11:00,2021-10-30T06:59:07.000000000+00:00\n")`, right: `Some("c1,c2\n2019-04-18T20:54:47.378000000+10:00,2019-04-18T10:54:47.378000000\n2021-10-30T17:59:07.000000000+11:00,2021-10-30T06:59:07.000000000\n")`', arrow/src/csv/writer.rs:652:9 ---- csv::writer::tests::test_write_csv stdout ---- thread 'csv::writer::tests::test_write_csv' panicked at 'assertion failed: `(left == right)` left: `"c1,c2,c3,c4,c5,c6,c7\nLorem ipsum dolor sit amet,123.564532,3,true,,00:20:34,cupcakes\nconsectetur adipiscing elit,,2,false,2019-04-18T10:54:47.378000000+00:00,06:51:20,cupcakes\nsed do eiusmod tempor,-556132.25,1,,2019-04-18T02:45:55.555000000+00:00,23:46:03,foo\nLorem ipsum dolor sit amet,123.564532,3,true,,00:20:34,cupcakes\nconsectetur adipiscing elit,,2,false,2019-04-18T10:54:47.378000000+00:00,06:51:20,cupcakes\nsed do eiusmod tempor,-556132.25,1,,2019-04-18T02:45:55.555000000+00:00,23:46:03,foo\n"`, right: `"c1,c2,c3,c4,c5,c6,c7\nLorem ipsum dolor sit amet,123.564532,3,true,,00:20:34,cupcakes\nconsectetur adipiscing elit,,2,false,2019-04-18T10:54:47.378000000,06:51:20,cupcakes\nsed do eiusmod tempor,-556132.25,1,,2019-04-18T02:45:55.555000000,23:46:03,foo\nLorem ipsum dolor sit amet,123.564532,3,true,,00:20:34,cupcakes\nconsectetur adipiscing elit,,2,false,2019-04-18T10:54:47.378000000,06:51:20,cupcakes\nsed do eiusmod tempor,-556132.25,1,,2019-04-18T02:45:55.555000000,23:46:03,foo\n"`', arrow/src/csv/writer.rs:546:9 failures: compute::kernels::cast::tests::test_cast_timestamp_to_string compute::kernels::cast::tests::test_timestamp_cast_utf8 csv::writer::tests::test_export_csv_timestamps csv::writer::tests::test_write_csv test result: FAILED. 751 passed; 4 failed; 0 ignored; 0 measured; 0 filtered out; finished in 1.21s ```
1.0
failed to pass test cases while enabling feature chrono-tz - ```bash cargo test --features chrono-tz ``` ```bash failures: ---- compute::kernels::cast::tests::test_cast_timestamp_to_string stdout ---- [arrow/src/compute/kernels/cast.rs:3852] &array = PrimitiveArray<Timestamp(Millisecond, None)> [ 1997-05-19T00:00:00.005, 2018-12-25T00:00:00.001, null, ] thread 'compute::kernels::cast::tests::test_cast_timestamp_to_string' panicked at 'assertion failed: `(left == right)` left: `"1997-05-19 00:00:00.005"`, right: `"1997-05-19 00:00:00.005 +00:00"`', arrow/src/compute/kernels/cast.rs:3856:9 ---- compute::kernels::cast::tests::test_timestamp_cast_utf8 stdout ---- thread 'compute::kernels::cast::tests::test_timestamp_cast_utf8' panicked at 'assertion failed: `(left == right)` left: `StringArray [ "1970-01-01 20:30:00 +10:00", null, "1970-01-02 09:58:59 +10:00", ]`, right: `StringArray [ "1970-01-01 20:30:00", null, "1970-01-02 09:58:59", ]`', arrow/src/compute/kernels/cast.rs:5762:9 ---- csv::writer::tests::test_export_csv_timestamps stdout ---- thread 'csv::writer::tests::test_export_csv_timestamps' panicked at 'assertion failed: `(left == right)` left: `Some("c1,c2\n2019-04-18T20:54:47.378000000+10:00,2019-04-18T10:54:47.378000000+00:00\n2021-10-30T17:59:07.000000000+11:00,2021-10-30T06:59:07.000000000+00:00\n")`, right: `Some("c1,c2\n2019-04-18T20:54:47.378000000+10:00,2019-04-18T10:54:47.378000000\n2021-10-30T17:59:07.000000000+11:00,2021-10-30T06:59:07.000000000\n")`', arrow/src/csv/writer.rs:652:9 ---- csv::writer::tests::test_write_csv stdout ---- thread 'csv::writer::tests::test_write_csv' panicked at 'assertion failed: `(left == right)` left: `"c1,c2,c3,c4,c5,c6,c7\nLorem ipsum dolor sit amet,123.564532,3,true,,00:20:34,cupcakes\nconsectetur adipiscing elit,,2,false,2019-04-18T10:54:47.378000000+00:00,06:51:20,cupcakes\nsed do eiusmod tempor,-556132.25,1,,2019-04-18T02:45:55.555000000+00:00,23:46:03,foo\nLorem ipsum dolor sit amet,123.564532,3,true,,00:20:34,cupcakes\nconsectetur adipiscing elit,,2,false,2019-04-18T10:54:47.378000000+00:00,06:51:20,cupcakes\nsed do eiusmod tempor,-556132.25,1,,2019-04-18T02:45:55.555000000+00:00,23:46:03,foo\n"`, right: `"c1,c2,c3,c4,c5,c6,c7\nLorem ipsum dolor sit amet,123.564532,3,true,,00:20:34,cupcakes\nconsectetur adipiscing elit,,2,false,2019-04-18T10:54:47.378000000,06:51:20,cupcakes\nsed do eiusmod tempor,-556132.25,1,,2019-04-18T02:45:55.555000000,23:46:03,foo\nLorem ipsum dolor sit amet,123.564532,3,true,,00:20:34,cupcakes\nconsectetur adipiscing elit,,2,false,2019-04-18T10:54:47.378000000,06:51:20,cupcakes\nsed do eiusmod tempor,-556132.25,1,,2019-04-18T02:45:55.555000000,23:46:03,foo\n"`', arrow/src/csv/writer.rs:546:9 failures: compute::kernels::cast::tests::test_cast_timestamp_to_string compute::kernels::cast::tests::test_timestamp_cast_utf8 csv::writer::tests::test_export_csv_timestamps csv::writer::tests::test_write_csv test result: FAILED. 751 passed; 4 failed; 0 ignored; 0 measured; 0 filtered out; finished in 1.21s ```
process
failed to pass test cases while enabling feature chrono tz bash cargo test features chrono tz bash failures compute kernels cast tests test cast timestamp to string stdout array primitivearray null thread compute kernels cast tests test cast timestamp to string panicked at assertion failed left right left right arrow src compute kernels cast rs compute kernels cast tests test timestamp cast stdout thread compute kernels cast tests test timestamp cast panicked at assertion failed left right left stringarray null right stringarray null arrow src compute kernels cast rs csv writer tests test export csv timestamps stdout thread csv writer tests test export csv timestamps panicked at assertion failed left right left some n right some n arrow src csv writer rs csv writer tests test write csv stdout thread csv writer tests test write csv panicked at assertion failed left right left nlorem ipsum dolor sit amet true cupcakes nconsectetur adipiscing elit false cupcakes nsed do eiusmod tempor foo nlorem ipsum dolor sit amet true cupcakes nconsectetur adipiscing elit false cupcakes nsed do eiusmod tempor foo n right nlorem ipsum dolor sit amet true cupcakes nconsectetur adipiscing elit false cupcakes nsed do eiusmod tempor foo nlorem ipsum dolor sit amet true cupcakes nconsectetur adipiscing elit false cupcakes nsed do eiusmod tempor foo n arrow src csv writer rs failures compute kernels cast tests test cast timestamp to string compute kernels cast tests test timestamp cast csv writer tests test export csv timestamps csv writer tests test write csv test result failed passed failed ignored measured filtered out finished in
1
6,577
9,660,217,293
IssuesEvent
2019-05-20 15:02:33
nodejs/node
https://api.github.com/repos/nodejs/node
closed
Fix documentation for `childProcess.kill(signal)`: `signal` can be a number
child_process doc good first issue
With [`childProcess.kill(signal)`](https://nodejs.org/api/child_process.html#child_process_subprocess_kill_signal), `signal` is documented as being a `string`. However in the code, `signal` can be [either a string](https://github.com/nodejs/node/blob/908292cf1f551c614a733d858528ffb13fb3a524/lib/internal/child_process.js#L435) or [a number](https://github.com/nodejs/node/blob/908292cf1f551c614a733d858528ffb13fb3a524/lib/internal/util.js#L220).
1.0
Fix documentation for `childProcess.kill(signal)`: `signal` can be a number - With [`childProcess.kill(signal)`](https://nodejs.org/api/child_process.html#child_process_subprocess_kill_signal), `signal` is documented as being a `string`. However in the code, `signal` can be [either a string](https://github.com/nodejs/node/blob/908292cf1f551c614a733d858528ffb13fb3a524/lib/internal/child_process.js#L435) or [a number](https://github.com/nodejs/node/blob/908292cf1f551c614a733d858528ffb13fb3a524/lib/internal/util.js#L220).
process
fix documentation for childprocess kill signal signal can be a number with signal is documented as being a string however in the code signal can be or
1
13,275
15,758,909,363
IssuesEvent
2021-03-31 07:20:23
scikit-learn/scikit-learn
https://api.github.com/repos/scikit-learn/scikit-learn
closed
Weighted variance computation for sparse data is not numerically stable
Bug Moderate help wanted module:linear_model module:preprocessing
This issue was discovered when adding tests for #19527 (currently marked XFAIL). Here is minimal reproduction case using the underlying private API: https://gist.github.com/ogrisel/bd2cf3350fff5bbd5a0899fa6baf3267 The results are the following (macOS / arm64 / Python 3.9.1 / Cython 0.29.21 / clang 11.0.1): ``` ## dtype=float64 _incremental_mean_and_var [100.] [0.] csr_mean_variance_axis0 [100.] [-2.18040566e-11] incr_mean_variance_axis0 csr [100.] [-2.18040566e-11] csc_mean_variance_axis0 [100.] [-2.18040566e-11] incr_mean_variance_axis0 csc [100.] [-2.18040566e-11] ## dtype=float32 _incremental_mean_and_var [100.00000577] [3.32692735e-11] csr_mean_variance_axis0 [99.99997] [0.00123221] incr_mean_variance_axis0 csr [99.99997] [0.00123221] csc_mean_variance_axis0 [99.99997] [0.00123221] incr_mean_variance_axis0 csc [99.99997] [0.00123221] ``` So the `sklearn.utils.extmath._incremental_mean_and_var` function for dense numpy arrays is numerically stable both in float64 and float32 (~1e-11 is much less then `np.finfo(np.float32).eps`), but the sparse counterparts, either incremental are not are all wrong in the same way. So the gist above should be adapted to write a new series of new tests for these Cython functions and the fix will probably involve adapting the algorithm implemented in `sklearn.utils.extmath._incremental_mean_and_var` to the sparse case. Note: there is another issue opened for the numerical stability of `StandardScaler`: #5602 / #11549 but it is related to the computation of the (unweighted) mean in incremental model (in `partial_fit`) vs full batch mode (in `fit`).
1.0
Weighted variance computation for sparse data is not numerically stable - This issue was discovered when adding tests for #19527 (currently marked XFAIL). Here is minimal reproduction case using the underlying private API: https://gist.github.com/ogrisel/bd2cf3350fff5bbd5a0899fa6baf3267 The results are the following (macOS / arm64 / Python 3.9.1 / Cython 0.29.21 / clang 11.0.1): ``` ## dtype=float64 _incremental_mean_and_var [100.] [0.] csr_mean_variance_axis0 [100.] [-2.18040566e-11] incr_mean_variance_axis0 csr [100.] [-2.18040566e-11] csc_mean_variance_axis0 [100.] [-2.18040566e-11] incr_mean_variance_axis0 csc [100.] [-2.18040566e-11] ## dtype=float32 _incremental_mean_and_var [100.00000577] [3.32692735e-11] csr_mean_variance_axis0 [99.99997] [0.00123221] incr_mean_variance_axis0 csr [99.99997] [0.00123221] csc_mean_variance_axis0 [99.99997] [0.00123221] incr_mean_variance_axis0 csc [99.99997] [0.00123221] ``` So the `sklearn.utils.extmath._incremental_mean_and_var` function for dense numpy arrays is numerically stable both in float64 and float32 (~1e-11 is much less then `np.finfo(np.float32).eps`), but the sparse counterparts, either incremental are not are all wrong in the same way. So the gist above should be adapted to write a new series of new tests for these Cython functions and the fix will probably involve adapting the algorithm implemented in `sklearn.utils.extmath._incremental_mean_and_var` to the sparse case. Note: there is another issue opened for the numerical stability of `StandardScaler`: #5602 / #11549 but it is related to the computation of the (unweighted) mean in incremental model (in `partial_fit`) vs full batch mode (in `fit`).
process
weighted variance computation for sparse data is not numerically stable this issue was discovered when adding tests for currently marked xfail here is minimal reproduction case using the underlying private api the results are the following macos python cython clang dtype incremental mean and var csr mean variance incr mean variance csr csc mean variance incr mean variance csc dtype incremental mean and var csr mean variance incr mean variance csr csc mean variance incr mean variance csc so the sklearn utils extmath incremental mean and var function for dense numpy arrays is numerically stable both in and is much less then np finfo np eps but the sparse counterparts either incremental are not are all wrong in the same way so the gist above should be adapted to write a new series of new tests for these cython functions and the fix will probably involve adapting the algorithm implemented in sklearn utils extmath incremental mean and var to the sparse case note there is another issue opened for the numerical stability of standardscaler but it is related to the computation of the unweighted mean in incremental model in partial fit vs full batch mode in fit
1
22,636
31,885,101,005
IssuesEvent
2023-09-16 21:17:23
bitfocus/companion-module-requests
https://api.github.com/repos/bitfocus/companion-module-requests
opened
Lightshark LS CORE/LS 1 feedback/Sync
NOT YET PROCESSED
- [ ] **I have researched the list of existing Companion modules and requests and have determined this has not yet been requested** Lightshark LS Core/LS 1, LightShark Software: Sync/dynamic/feedback Has anyone had any luck with getting Sync Executors feedback on the Lightshark instance? I have control over the Executors which is amazing but not the feedback. I can only see port 8000 for incoming within companion but not the outgoing port 9000. <img width="1400" alt="Screenshot 2023-09-16 at 2 25 26 pm" src="https://github.com/bitfocus/companion-module-requests/assets/65983295/a9adcb8b-ea0d-4e6d-b089-20deaeebcfa3"> <img width="1401" alt="Screenshot 2023-09-16 at 2 25 56 pm" src="https://github.com/bitfocus/companion-module-requests/assets/65983295/44aafc40-f84e-4672-a00c-84332a51fb99"> [Lightshark OSC.pdf](https://github.com/bitfocus/companion-module-requests/files/12641583/Lightshark.OSC.pdf)
1.0
Lightshark LS CORE/LS 1 feedback/Sync - - [ ] **I have researched the list of existing Companion modules and requests and have determined this has not yet been requested** Lightshark LS Core/LS 1, LightShark Software: Sync/dynamic/feedback Has anyone had any luck with getting Sync Executors feedback on the Lightshark instance? I have control over the Executors which is amazing but not the feedback. I can only see port 8000 for incoming within companion but not the outgoing port 9000. <img width="1400" alt="Screenshot 2023-09-16 at 2 25 26 pm" src="https://github.com/bitfocus/companion-module-requests/assets/65983295/a9adcb8b-ea0d-4e6d-b089-20deaeebcfa3"> <img width="1401" alt="Screenshot 2023-09-16 at 2 25 56 pm" src="https://github.com/bitfocus/companion-module-requests/assets/65983295/44aafc40-f84e-4672-a00c-84332a51fb99"> [Lightshark OSC.pdf](https://github.com/bitfocus/companion-module-requests/files/12641583/Lightshark.OSC.pdf)
process
lightshark ls core ls feedback sync i have researched the list of existing companion modules and requests and have determined this has not yet been requested lightshark ls core ls lightshark software sync dynamic feedback has anyone had any luck with getting sync executors feedback on the lightshark instance i have control over the executors which is amazing but not the feedback i can only see port for incoming within companion but not the outgoing port img width alt screenshot at pm src img width alt screenshot at pm src
1
171,895
14,347,093,445
IssuesEvent
2020-11-29 04:58:06
swordily/SPERMAPYTHON
https://api.github.com/repos/swordily/SPERMAPYTHON
reopened
When is the update?
documentation
I don't want to and won't update this cheat; I have other projects such as SPERMAWARE, SPERMASENSE, SPERMALOADER. In Python I can do much less than in C++. All the basic functions are there, along with auto-updating for the latest CS:GO update; whoever wants to add functions, download the sources and add them yourself, I won't be adding anything.
1.0
When is the update? - I don't want to and won't update this cheat; I have other projects such as SPERMAWARE, SPERMASENSE, SPERMALOADER. In Python I can do much less than in C++. All the basic functions are there, along with auto-updating for the latest CS:GO update; whoever wants to add functions, download the sources and add them yourself, I won't be adding anything.
non_process
when is the update i do not want to and will not update this cheat i have other projects such as spermaware spermasense spermaloader in python i can do much less than in c all the basic functions are there along with auto updating for the latest csgo update whoever wants to add functions download the sources and add them yourself i will not be adding anything
0
9,699
12,701,028,447
IssuesEvent
2020-06-22 17:23:02
prisma/prisma
https://api.github.com/repos/prisma/prisma
closed
JSON field not compatible with built-in query batching
2.1.0-dev.52 bug/2-confirmed kind/bug process/candidate
<!-- Thanks for helping us improve Prisma! 🙏 Please follow the sections in the template and provide as much information as possible about your problem, e.g. by setting the `DEBUG="*"` environment variable and enabling additional logging output in Prisma Client. Learn more about writing proper bug reports here: https://pris.ly/d/bug-reports --> ## Bug description #1470 introduced batching/dataloader pattern for `findOne`. It appears this is not compatible with `Json` fields for some reason (specifically the nullable `Json?` field) I can confirm this is most likely the case because the exact same code succeeds when there's only one `findOne` call. ## How to reproduce Consider table `A` with a column of `Json?` type. Imagine we have the following code: ``` const ids = [1, 2, ...]; const items = await Promise.all(ids.map(id => prisma.A.findOne({where: { id }}))); ``` If the ids array has length 1, it will succeed. If the ids array has length > 1, it will fail with the following error: ``` SyntaxError: Unexpected end of JSON input at JSON.parse (<anonymous>) at IncomingMessage.<anonymous> (engine-core/dist/h1client.js:48:1) at IncomingMessage.emit (events.js:322:22) at IncomingMessage.EventEmitter.emit (domain.js:482:12) at endReadableNT (_stream_readable.js:1187:12) at processTicksAndRejections (internal/process/task_queues.js:84:21) ``` **Note** At this point I have to restart the entire server, subsequent queries of any kind continue to fail. ## Expected behaviour It should not crash. ## Prisma information Notably with `log: ["query"]` it doesn't even have a chance to output the query before crashing. ## Environment & setup OS: Latest Mac OS Database: Postgres 9.6 Prisma Version: 2.1.0-dev.52 Node.JS Version: 12.16.3
1.0
JSON field not compatible with built-in query batching - <!-- Thanks for helping us improve Prisma! 🙏 Please follow the sections in the template and provide as much information as possible about your problem, e.g. by setting the `DEBUG="*"` environment variable and enabling additional logging output in Prisma Client. Learn more about writing proper bug reports here: https://pris.ly/d/bug-reports --> ## Bug description #1470 introduced batching/dataloader pattern for `findOne`. It appears this is not compatible with `Json` fields for some reason (specifically the nullable `Json?` field) I can confirm this is most likely the case because the exact same code succeeds when there's only one `findOne` call. ## How to reproduce Consider table `A` with a column of `Json?` type. Imagine we have the following code: ``` const ids = [1, 2, ...]; const items = await Promise.all(ids.map(id => prisma.A.findOne({where: { id }}))); ``` If the ids array has length 1, it will succeed. If the ids array has length > 1, it will fail with the following error: ``` SyntaxError: Unexpected end of JSON input at JSON.parse (<anonymous>) at IncomingMessage.<anonymous> (engine-core/dist/h1client.js:48:1) at IncomingMessage.emit (events.js:322:22) at IncomingMessage.EventEmitter.emit (domain.js:482:12) at endReadableNT (_stream_readable.js:1187:12) at processTicksAndRejections (internal/process/task_queues.js:84:21) ``` **Note** At this point I have to restart the entire server, subsequent queries of any kind continue to fail. ## Expected behaviour It should not crash. ## Prisma information Notably with `log: ["query"]` it doesn't even have a chance to output the query before crashing. ## Environment & setup OS: Latest Mac OS Database: Postgres 9.6 Prisma Version: 2.1.0-dev.52 Node.JS Version: 12.16.3
process
json field not compatible with built in query batching thanks for helping us improve prisma 🙏 please follow the sections in the template and provide as much information as possible about your problem e g by setting the debug environment variable and enabling additional logging output in prisma client learn more about writing proper bug reports here bug description introduced batching dataloader pattern for findone it appears this is not compatible with json fields for some reason specifically the nullable json field i can confirm this is most likely the case because the exact same code succeeds when there s only one findone call how to reproduce consider table a with a column of json type imagine we have the following code const ids const items await promise all ids map id prisma a findone where id if the ids array has length it will succeed if the ids array has length it will fail with the following error syntaxerror unexpected end of json input at json parse at incomingmessage engine core dist js at incomingmessage emit events js at incomingmessage eventemitter emit domain js at endreadablent stream readable js at processticksandrejections internal process task queues js note at this point i have to restart the entire server subsequent queries of any kind continue to fail expected behaviour it should not crash prisma information notably with log it doesn t even have a chance to output the query before crashing environment setup os latest mac os database postgres prisma version dev node js version
1
7,350
10,482,995,270
IssuesEvent
2019-09-24 13:08:10
OI-wiki/OI-wiki
https://api.github.com/repos/OI-wiki/OI-wiki
closed
bzoj 题目链接问题
中优先级 / P2 需要处理 / Need Processing 需要帮助 / help wanted 需要讨论 / discussion
最近 bzoj 官方宣称因为题库维护原因要暂时下线半个月。 考虑到国庆期间网站访问量较大,且站内 bzoj 题目较多,失效链接可能会对用户体验感造成较大影响。 目前一种较为可行的解决方案是将所有 bzoj 题目链接更换为 [OI Archive](https://oi-archive.wa-am.com/) 上的对应链接。 欢迎各位讨论这个问题~
1.0
bzoj 题目链接问题 - 最近 bzoj 官方宣称因为题库维护原因要暂时下线半个月。 考虑到国庆期间网站访问量较大,且站内 bzoj 题目较多,失效链接可能会对用户体验感造成较大影响。 目前一种较为可行的解决方案是将所有 bzoj 题目链接更换为 [OI Archive](https://oi-archive.wa-am.com/) 上的对应链接。 欢迎各位讨论这个问题~
process
bzoj 题目链接问题 最近 bzoj 官方宣称因为题库维护原因要暂时下线半个月。 考虑到国庆期间网站访问量较大,且站内 bzoj 题目较多,失效链接可能会对用户体验感造成较大影响。 目前一种较为可行的解决方案是将所有 bzoj 题目链接更换为 上的对应链接。 欢迎各位讨论这个问题
1
158,599
13,737,163,444
IssuesEvent
2020-10-05 12:49:42
UnBArqDsw/2020.1_G2_TCLDL
https://api.github.com/repos/UnBArqDsw/2020.1_G2_TCLDL
closed
BackEnd Dynamic Diagrams
documentation dynamic-2
**_Issue_ type** DOC X - Issue title [Documentation] **Description** make the dynamic diagrams for the entire BackEnd structure and choose the most appropriate diagrams for each feature. Examples of diagrams to be made: - Sequence Diagram - Communication Diagram - Activity Diagram - State Diagram **Screenshots** **Tasks** - [ ] reference the documentation - [ ] make all the diagrams cited in the description of the issue - [ ] look for more possibles diagrams that could be done for this module - [ ] make the artifacts available here tracked **Acceptance Criteria** - [ ] Requirements tracking - [ ] Versioning - [ ] Documentation on ghpages
1.0
BackEnd Dynamic Diagrams - **_Issue_ type** DOC X - Issue title [Documentation] **Description** make the dynamic diagrams for the entire BackEnd structure and choose the most appropriate diagrams for each feature. Examples of diagrams to be made: - Sequence Diagram - Communication Diagram - Activity Diagram - State Diagram **Screenshots** **Tasks** - [ ] reference the documentation - [ ] make all the diagrams cited in the description of the issue - [ ] look for more possibles diagrams that could be done for this module - [ ] make the artifacts available here tracked **Acceptance Criteria** - [ ] Requirements tracking - [ ] Versioning - [ ] Documentation on ghpages
non_process
backend dynamic diagrams issue type doc x issue title description make the dynamic diagrams for the entire backend structure and choose the most appropriate diagrams for each feature examples of diagrams to be made sequence diagram communication diagram activity diagram state diagram screenshots tasks reference the documentation make all the diagrams cited in the description of the issue look for more possibles diagrams that could be done for this module make the artifacts available here tracked acceptance criteria requirements tracking versioning documentation on ghpages
0
13,038
15,384,894,067
IssuesEvent
2021-03-03 05:33:18
GoogleCloudPlatform/fda-mystudies
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
closed
[iOS] Delete account > Incorrect pop-up is displayed and unable to delete the app account
Bug P0 Process: Fixed Process: Tested QA Process: Tested dev iOS
Steps: 1. Enroll into the studies 2. Navigate to My account 3. Click on Delete my account 4. Proceed to delete the account 5. Observe the pop-up Actual: Incorrect pop-up is displayed and unable to delete the app account Expected: Pop-up should not be displayed and data retention should be ON by default and account should be deleted successfully Instance: QA and Dev ![unnamed (1)](https://user-images.githubusercontent.com/60386291/109520738-8f402a80-7ad2-11eb-9602-0461e8d9d52b.png)
3.0
[iOS] Delete account > Incorrect pop-up is displayed and unable to delete the app account - Steps: 1. Enroll into the studies 2. Navigate to My account 3. Click on Delete my account 4. Proceed to delete the account 5. Observe the pop-up Actual: Incorrect pop-up is displayed and unable to delete the app account Expected: Pop-up should not be displayed and data retention should be ON by default and account should be deleted successfully Instance: QA and Dev ![unnamed (1)](https://user-images.githubusercontent.com/60386291/109520738-8f402a80-7ad2-11eb-9602-0461e8d9d52b.png)
process
delete account incorrect pop up is displayed and unable to delete the app account steps enroll into the studies navigate to my account click on delete my account proceed to delete the account observe the pop up actual incorrect pop up is displayed and unable to delete the app account expected pop up should not be displayed and data retention should be on by default and account should be deleted successfully instance qa and dev
1
26,998
6,813,026,409
IssuesEvent
2017-11-06 07:19:07
BTDF/DeploymentFramework
https://api.github.com/repos/BTDF/DeploymentFramework
closed
Auto config of FILE adapter paths should be skipped when IncludeMessagingBindings is false
bug CodePlexMigrationInitiated General Impact: Low Release 5.0
Auto config of FILE adapter paths should be skipped when IncludeMessagingBindings is false #### This work item was migrated from CodePlex CodePlex work item ID: '6884' Assigned to: 'tfabraham' Vote count: '0'
1.0
Auto config of FILE adapter paths should be skipped when IncludeMessagingBindings is false - Auto config of FILE adapter paths should be skipped when IncludeMessagingBindings is false #### This work item was migrated from CodePlex CodePlex work item ID: '6884' Assigned to: 'tfabraham' Vote count: '0'
non_process
auto config of file adapter paths should be skipped when includemessagingbindings is false auto config of file adapter paths should be skipped when includemessagingbindings is false this work item was migrated from codeplex codeplex work item id assigned to tfabraham vote count
0
44,384
5,625,591,010
IssuesEvent
2017-04-04 19:46:24
phetsims/expression-exchange
https://api.github.com/repos/phetsims/expression-exchange
opened
Error: Assertion failed: homeScreenIcon has invalid aspect ratio: ...
type:automated-testing type:bug
Run the sim with `?stringTest=xss` to get this error. ``` Uncaught Error: Assertion failed: homeScreenIcon has invalid aspect ratio: 8.728259048257373 Error: Assertion failed: homeScreenIcon has invalid aspect ratio: 8.728259048257373 at window.assertions.assertFunction (https://bayes.colorado.edu/continuous-testing/snapshot-1491331930059/assert/js/assert.js:21:13) at validateIconSize (https://bayes.colorado.edu/continuous-testing/snapshot-1491331930059/joist/js/Screen.js?bust=1491333288707:119:15) at EEVariablesScreen.Screen [as constructor] (https://bayes.colorado.edu/continuous-testing/snapshot-1491331930059/joist/js/Screen.js?bust=1491333288707:75:5) at new EEVariablesScreen (https://bayes.colorado.edu/continuous-testing/snapshot-1491331930059/expression-exchange/js/variables/EEVariablesScreen.js?bust=1491333288707:37:12) at https://bayes.colorado.edu/continuous-testing/snapshot-1491331930059/expression-exchange/js/expression-exchange-main.js?bust=1491333288707:49:9 at window.phetLaunchSimulation (https://bayes.colorado.edu/continuous-testing/snapshot-1491331930059/joist/js/SimLauncher.js?bust=1491333288707:50:11) at doneLoadingImages (https://bayes.colorado.edu/continuous-testing/snapshot-1491331930059/joist/js/SimLauncher.js?bust=1491333288707:60:18) at Object.launch (https://bayes.colorado.edu/continuous-testing/snapshot-1491331930059/joist/js/SimLauncher.js?bust=1491333288707:114:9) at https://bayes.colorado.edu/continuous-testing/snapshot-1491331930059/expression-exchange/js/expression-exchange-main.js?bust=1491333288707:43:15 at Object.execCb (https://bayes.colorado.edu/continuous-testing/snapshot-1491331930059/sherpa/lib/require-2.1.11.js:1650:25) Approximately 4/4/2017, 12:52:10 PM ```
1.0
Error: Assertion failed: homeScreenIcon has invalid aspect ratio: ... - Run the sim with `?stringTest=xss` to get this error. ``` Uncaught Error: Assertion failed: homeScreenIcon has invalid aspect ratio: 8.728259048257373 Error: Assertion failed: homeScreenIcon has invalid aspect ratio: 8.728259048257373 at window.assertions.assertFunction (https://bayes.colorado.edu/continuous-testing/snapshot-1491331930059/assert/js/assert.js:21:13) at validateIconSize (https://bayes.colorado.edu/continuous-testing/snapshot-1491331930059/joist/js/Screen.js?bust=1491333288707:119:15) at EEVariablesScreen.Screen [as constructor] (https://bayes.colorado.edu/continuous-testing/snapshot-1491331930059/joist/js/Screen.js?bust=1491333288707:75:5) at new EEVariablesScreen (https://bayes.colorado.edu/continuous-testing/snapshot-1491331930059/expression-exchange/js/variables/EEVariablesScreen.js?bust=1491333288707:37:12) at https://bayes.colorado.edu/continuous-testing/snapshot-1491331930059/expression-exchange/js/expression-exchange-main.js?bust=1491333288707:49:9 at window.phetLaunchSimulation (https://bayes.colorado.edu/continuous-testing/snapshot-1491331930059/joist/js/SimLauncher.js?bust=1491333288707:50:11) at doneLoadingImages (https://bayes.colorado.edu/continuous-testing/snapshot-1491331930059/joist/js/SimLauncher.js?bust=1491333288707:60:18) at Object.launch (https://bayes.colorado.edu/continuous-testing/snapshot-1491331930059/joist/js/SimLauncher.js?bust=1491333288707:114:9) at https://bayes.colorado.edu/continuous-testing/snapshot-1491331930059/expression-exchange/js/expression-exchange-main.js?bust=1491333288707:43:15 at Object.execCb (https://bayes.colorado.edu/continuous-testing/snapshot-1491331930059/sherpa/lib/require-2.1.11.js:1650:25) Approximately 4/4/2017, 12:52:10 PM ```
non_process
error assertion failed homescreenicon has invalid aspect ratio run the sim with stringtest xss to get this error uncaught error assertion failed homescreenicon has invalid aspect ratio error assertion failed homescreenicon has invalid aspect ratio at window assertions assertfunction at validateiconsize at eevariablesscreen screen at new eevariablesscreen at at window phetlaunchsimulation at doneloadingimages at object launch at at object execcb approximately pm
0
18,061
24,068,081,225
IssuesEvent
2022-09-17 19:41:39
lynnandtonic/nestflix.fun
https://api.github.com/repos/lynnandtonic/nestflix.fun
closed
Add The Corpse Danced at Midnight from "Murder, She Wrote" (Screenshots and Poster Added)
suggested title in process
Please add as much of the following info as you can: Title: "The Corpse Danced at Midnight" Type (film/tv show): film: 80's thriller Film or show in which it appears: Murder, She Wrote Is the parent film/show streaming anywhere? Yes - Peacock About when in the parent film/show does it appear? Ep. 1x05 - "Hooray for Homicide" Actual footage of the film/show can be seen (yes/no)? Yes Timestamp: 27:20 - 27:57 Synopsis: Based on the 1984 bestselling murder-mystery by J. B. Fletcher Producer: Ross Haley (originally Jerry Lydecker with Lydecker Productions) Director: Ross Haley Writer: Allan Gebhart Cast: Eve Crystal, Scott Bennett "Fun" Fact <FYI - This is spoiler territory for the episode.>: The "gore and sex-filled" film was never finished <like 'Daddy's Boy' from "Unbreakable Kimmy Schmidt"> due to the murder of original producer, Jerry Lydecker, by the film's leading lady, Eve Crystal, and the subsequent cover-up by director, Ross Haley.
1.0
Add The Corpse Danced at Midnight from "Murder, She Wrote" (Screenshots and Poster Added) - Please add as much of the following info as you can: Title: "The Corpse Danced at Midnight" Type (film/tv show): film: 80's thriller Film or show in which it appears: Murder, She Wrote Is the parent film/show streaming anywhere? Yes - Peacock About when in the parent film/show does it appear? Ep. 1x05 - "Hooray for Homicide" Actual footage of the film/show can be seen (yes/no)? Yes Timestamp: 27:20 - 27:57 Synopsis: Based on the 1984 bestselling murder-mystery by J. B. Fletcher Producer: Ross Haley (originally Jerry Lydecker with Lydecker Productions) Director: Ross Haley Writer: Allan Gebhart Cast: Eve Crystal, Scott Bennett "Fun" Fact <FYI - This is spoiler territory for the episode.>: The "gore and sex-filled" film was never finished <like 'Daddy's Boy' from "Unbreakable Kimmy Schmidt"> due to the murder of original producer, Jerry Lydecker, by the film's leading lady, Eve Crystal, and the subsequent cover-up by director, Ross Haley.
process
add the corpse danced at midnight from murder she wrote screenshots and poster added please add as much of the following info as you can title the corpse danced at midnight type film tv show film s thriller film or show in which it appears murder she wrote is the parent film show streaming anywhere yes peacock about when in the parent film show does it appear ep hooray for homicide actual footage of the film show can be seen yes no yes timestamp synopsis based on the bestselling murder mystery by j b fletcher producer ross haley originally jerry lydecker with lydecker productions director ross haley writer allan gebhart cast eve crystal scott bennett fun fact the gore and sex filled film was never finished due to the murder of original producer jerry lydecker by the film s leading lady eve crystal and the subsequent cover up by director ross haley
1
471,508
13,578,700,116
IssuesEvent
2020-09-20 09:11:54
DimensionDev/Maskbook
https://api.github.com/repos/DimensionDev/Maskbook
closed
[Bug] Signature Verification State should be removed
Kind: UI Priority: P2 (Important) Type: Bug
# Bug Report ## Environment ### System - [x] Windows - OS Version: All - [x] Mac OS X - OS Version: All - [x] Linux - Linux Distribution: All - OS Version: All ### Platform/Browser - [x] Chrome - Maskbook Version: - Browser Version: - [x] Firefox - Maskbook Version: - Browser Version: - [x] Android - Maskbook Version: - Android Version: - [x] iOS - Maskbook Version: - iOS Version: ### Build Variant - Where do you get Maskbook? - [x] Store - [ ] ZIP - [ ] Self-Compiled - Build Commit: /* Optionally attach a Commit ID, if it is from an pre-release branch head */ ## Bug Info All `Signature Verification State(Verified/Not Verified)` info should be removed since the related design is deprecated. The red boxed content as below: <img width="599" alt="Snipaste_2020-09-16_15-26-05" src="https://user-images.githubusercontent.com/7972003/93305957-ccb94000-f831-11ea-9610-340021261ea0.png"> ### Actual Behavior /* What happened? */ There should not exist any "Signature" related content now.
1.0
[Bug] Signature Verification State should be removed - # Bug Report ## Environment ### System - [x] Windows - OS Version: All - [x] Mac OS X - OS Version: All - [x] Linux - Linux Distribution: All - OS Version: All ### Platform/Browser - [x] Chrome - Maskbook Version: - Browser Version: - [x] Firefox - Maskbook Version: - Browser Version: - [x] Android - Maskbook Version: - Android Version: - [x] iOS - Maskbook Version: - iOS Version: ### Build Variant - Where do you get Maskbook? - [x] Store - [ ] ZIP - [ ] Self-Compiled - Build Commit: /* Optionally attach a Commit ID, if it is from an pre-release branch head */ ## Bug Info All `Signature Verification State(Verified/Not Verified)` info should be removed since the related design is deprecated. The red boxed content as below: <img width="599" alt="Snipaste_2020-09-16_15-26-05" src="https://user-images.githubusercontent.com/7972003/93305957-ccb94000-f831-11ea-9610-340021261ea0.png"> ### Actual Behavior /* What happened? */ There should not exist any "Signature" related content now.
non_process
signature verification state should be removed bug report environment system windows os version all mac os x os version all linux linux distribution all os version all platform browser chrome maskbook version browser version firefox maskbook version browser version android maskbook version android version ios maskbook version ios version build variant where do you get maskbook store zip self compiled build commit optionally attach a commit id if it is from an pre release branch head bug info all signature verification state verified not verified info should be removed since the related design is deprecated the red boxed content as below img width alt snipaste src actual behavior what happened there should not exist any signature related content now
0
819,848
30,752,921,220
IssuesEvent
2023-07-28 21:14:34
Rothamsted-Ecoinformatics/farm_rothamsted
https://api.github.com/repos/Rothamsted-Ecoinformatics/farm_rothamsted
closed
Design Entity: rename Number of plots per block
Experiment Module Priority 1: Must do (Essential) Data standardisation and accuracy
On the design entity please can we rename 'Number of plots per block' as 'Number of main plots per block'. Requested by Suzanne in order to standardise the naming conventions. Marked as must do for that reason.
1.0
Design Entity: rename Number of plots per block - On the design entity please can we rename 'Number of plots per block' as 'Number of main plots per block'. Requested by Suzanne in order to standardise the naming conventions. Marked as must do for that reason.
non_process
design entity rename number of plots per block on the design entity please can we rename number of plots per block as number of main plots per block requested by suzanne in order to standardise the naming conventions marked as must do for that reason
0
10,036
13,044,161,504
IssuesEvent
2020-07-29 03:47:23
tikv/tikv
https://api.github.com/repos/tikv/tikv
closed
UCP: Migrate scalar function `AddDateDatetimeDecimal` from TiDB
challenge-program-2 component/coprocessor difficulty/easy sig/coprocessor
## Description Port the scalar function `AddDateDatetimeDecimal` from TiDB to coprocessor. ## Score * 50 ## Mentor(s) * @breeswish ## Recommended Skills * Rust programming ## Learning Materials Already implemented expressions ported from TiDB - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr) - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr)
2.0
UCP: Migrate scalar function `AddDateDatetimeDecimal` from TiDB - ## Description Port the scalar function `AddDateDatetimeDecimal` from TiDB to coprocessor. ## Score * 50 ## Mentor(s) * @breeswish ## Recommended Skills * Rust programming ## Learning Materials Already implemented expressions ported from TiDB - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr) - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr)
process
ucp migrate scalar function adddatedatetimedecimal from tidb description port the scalar function adddatedatetimedecimal from tidb to coprocessor score mentor s breeswish recommended skills rust programming learning materials already implemented expressions ported from tidb
1
19,733
26,084,916,225
IssuesEvent
2022-12-26 00:34:21
devssa/onde-codar-em-salvador
https://api.github.com/repos/devssa/onde-codar-em-salvador
closed
ANALISTA DE SISTEMA na [ATAKAREJO]
SALVADOR BANCO DE DADOS UML MODELAGEM DE PROCESSOS HELP WANTED Stale
<!-- ================================================== POR FAVOR, SÓ POSTE SE A VAGA FOR PARA SALVADOR E CIDADES VIZINHAS! Use: "Desenvolvedor Front-end" ao invés de "Front-End Developer" \o/ Exemplo: `[JAVASCRIPT] [MYSQL] [NODE.JS] Desenvolvedor Front-End na [NOME DA EMPRESA]` ================================================== --> ## Descrição da vaga - Atender o segundo nível de atendimento a chamado de sistemas, soluções, sugestões, etc - Elaborar atendimento remoto e presencial - Diagnosticar falhas nos sistemas RMS, RM, SOCIN, BLCM, ENGEMAN e outros - Orientar os usuários sobre procedimentos operacionais padrões e melhores práticas para uso dos sistemas - Monitorar os chamados aos fornecedores - Controlar os acessos aos sistemas - Garantir o cumprimento dos requisitos dos clientes (SLA) acordado pela empresa - Acompanhamento de chamados junto a fornecedores - Elaborar apresentações executivas sobre os projetos - Realizar testes unitários e integrados em sistema - Dentre outras atividades ## Local - Salvador ## Requisitos **Obrigatórios:** - Superior completo na área de Tecnologia - Experiência anterior na função - Experiência no segmento de varejo - Conhecimento em UML, Banco de Dados, Mapeamento de Processos **Desejáveis:** - Desejável pós-graduação em Engenharia de softwares/ Desenvolvimento de software **Diferenciais:** - Experiência com RMS e E-Connect (SOCIN) ## Contratação - a combinar ## Nossa empresa - O Grupo Atakarejo é uma empresa brasileira do ramo de supermercados com sede em Salvador, Bahia. ## Como se candidatar - [Clique aqui para se candidatar](https://jobs.kenoby.com/atakarejo/job/analista-de-sistema/5e4c018bbb61864768315631)
1.0
ANALISTA DE SISTEMA na [ATAKAREJO] - <!-- ================================================== POR FAVOR, SÓ POSTE SE A VAGA FOR PARA SALVADOR E CIDADES VIZINHAS! Use: "Desenvolvedor Front-end" ao invés de "Front-End Developer" \o/ Exemplo: `[JAVASCRIPT] [MYSQL] [NODE.JS] Desenvolvedor Front-End na [NOME DA EMPRESA]` ================================================== --> ## Descrição da vaga - Atender o segundo nível de atendimento a chamado de sistemas, soluções, sugestões, etc - Elaborar atendimento remoto e presencial - Diagnosticar falhas nos sistemas RMS, RM, SOCIN, BLCM, ENGEMAN e outros - Orientar os usuários sobre procedimentos operacionais padrões e melhores práticas para uso dos sistemas - Monitorar os chamados aos fornecedores - Controlar os acessos aos sistemas - Garantir o cumprimento dos requisitos dos clientes (SLA) acordado pela empresa - Acompanhamento de chamados junto a fornecedores - Elaborar apresentações executivas sobre os projetos - Realizar testes unitários e integrados em sistema - Dentre outras atividades ## Local - Salvador ## Requisitos **Obrigatórios:** - Superior completo na área de Tecnologia - Experiência anterior na função - Experiência no segmento de varejo - Conhecimento em UML, Banco de Dados, Mapeamento de Processos **Desejáveis:** - Desejável pós-graduação em Engenharia de softwares/ Desenvolvimento de software **Diferenciais:** - Experiência com RMS e E-Connect (SOCIN) ## Contratação - a combinar ## Nossa empresa - O Grupo Atakarejo é uma empresa brasileira do ramo de supermercados com sede em Salvador, Bahia. ## Como se candidatar - [Clique aqui para se candidatar](https://jobs.kenoby.com/atakarejo/job/analista-de-sistema/5e4c018bbb61864768315631)
process
analista de sistema na por favor só poste se a vaga for para salvador e cidades vizinhas use desenvolvedor front end ao invés de front end developer o exemplo desenvolvedor front end na descrição da vaga atender o segundo nível de atendimento a chamado de sistemas soluções sugestões etc elaborar atendimento remoto e presencial diagnosticar falhas nos sistemas rms rm socin blcm engeman e outros orientar os usuários sobre procedimentos operacionais padrões e melhores práticas para uso dos sistemas monitorar os chamados aos fornecedores controlar os acessos aos sistemas garantir o cumprimento dos requisitos dos clientes sla acordado pela empresa acompanhamento de chamados junto a fornecedores elaborar apresentações executivas sobre os projetos realizar testes unitários e integrados em sistema dentre outras atividades local salvador requisitos obrigatórios superior completo na área de tecnologia experiência anterior na função experiência no segmento de varejo conhecimento em uml banco de dados mapeamento de processos desejáveis desejável pós graduação em engenharia de softwares desenvolvimento de software diferenciais experiência com rms e e connect socin contratação a combinar nossa empresa o grupo atakarejo é uma empresa brasileira do ramo de supermercados com sede em salvador bahia como se candidatar
1
15,400
19,591,951,577
IssuesEvent
2022-01-05 13:56:47
RobertCraigie/prisma-client-py
https://api.github.com/repos/RobertCraigie/prisma-client-py
closed
Increase default HTTP timeout
kind/improvement process/candidate
## Problem <!-- A clear and concise description of what the problem is. Ex. I'm always frustrated when [...] --> Currently, `httpx.ReadTimeout` errors can be encountered easily (#197). ## Suggested solution <!-- A clear and concise description of what you want to happen. --> We should increase the default timeout, however I do not know by how much. We should look into how Prisma handles this.
1.0
Increase default HTTP timeout - ## Problem <!-- A clear and concise description of what the problem is. Ex. I'm always frustrated when [...] --> Currently, `httpx.ReadTimeout` errors can be encountered easily (#197). ## Suggested solution <!-- A clear and concise description of what you want to happen. --> We should increase the default timeout, however I do not know by how much. We should look into how Prisma handles this.
process
increase default http timeout problem currently httpx readtimeout errors can be encountered easily suggested solution we should increase the default timeout however i do not know by how much we should look into how prisma handles this
1
15,245
19,183,454,042
IssuesEvent
2021-12-04 20:11:46
ethereum/EIPs
https://api.github.com/repos/ethereum/EIPs
closed
Yellow Paper and Potential EIP Licensing Violation
type: Meta type: EIP1 (Process) stale
The current editors ask all authors to publish their EIPs under public domain. However, given that many EIPs have to reference yellow paper, which is licensed under CC-BY-SA, it simply means many EIPs cannot be published, or published EIPs commit a license violation: > If you remix, transform, or build upon the material, you must distribute your contributions under the same license as the original. Part of the EIP might still be able to be public domain. However, those materials that built upon yellow paper must be published under the same license -- CC-BY-SA. Compared with the practice in this repo, the ECIP repository is much more friendly to all types of [permissive licenses](https://github.com/ethereumclassic/ECIPs/blob/master/_specs/ecip-1000.md#ecip-licensing), which we may want to learn from. Another solution is to ask Gavin, and all contributors to yellow paper, to re-publish it in public domain.
1.0
Yellow Paper and Potential EIP Licensing Violation - The current editors ask all authors to publish their EIPs under public domain. However, given that many EIPs have to reference yellow paper, which is licensed under CC-BY-SA, it simply means many EIPs cannot be published, or published EIPs commit a license violation: > If you remix, transform, or build upon the material, you must distribute your contributions under the same license as the original. Part of the EIP might still be able to be public domain. However, those materials that built upon yellow paper must be published under the same license -- CC-BY-SA. Compared with the practice in this repo, the ECIP repository is much more friendly to all types of [permissive licenses](https://github.com/ethereumclassic/ECIPs/blob/master/_specs/ecip-1000.md#ecip-licensing), which we may want to learn from. Another solution is to ask Gavin, and all contributors to yellow paper, to re-publish it in public domain.
process
yellow paper and potential eip licensing violation the current editors ask all authors to publish their eips under public domain however given that many eips have to reference yellow paper which is licensed under cc by sa it simply means many eips cannot be published or published eips commit a license violation if you remix transform or build upon the material you must distribute your contributions under the same license as the original part of the eip might still be able to be public domain however those materials that built upon yellow paper must be published under the same license cc by sa compared with the practice in this repo the ecip repository is much more friendly to all types of which we may want to learn from another solution is to ask gavin and all contributors to yellow paper to re publish it in public domain
1
18,540
10,252,668,904
IssuesEvent
2019-08-21 09:29:52
EduSemensati/github-security-on-github
https://api.github.com/repos/EduSemensati/github-security-on-github
opened
WS-2018-0236 (Medium) detected in mem-1.1.0.tgz
security vulnerability
## WS-2018-0236 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>mem-1.1.0.tgz</b></p></summary> <p>Memoize functions - An optimization used to speed up consecutive function calls by caching the result of calls with identical input</p> <p>Library home page: <a href="https://registry.npmjs.org/mem/-/mem-1.1.0.tgz">https://registry.npmjs.org/mem/-/mem-1.1.0.tgz</a></p> <p>Path to dependency file: /github-security-on-github/package.json</p> <p>Path to vulnerable library: /tmp/git/github-security-on-github/node_modules/npm/node_modules/mem/package.json</p> <p> Dependency Hierarchy: - npm-6.11.1.tgz (Root Library) - libnpx-10.2.0.tgz - yargs-11.0.0.tgz - os-locale-2.1.0.tgz - :x: **mem-1.1.0.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/EduSemensati/github-security-on-github/commit/af410b96e9422cb271b6d0afc106a507d8cc4839">af410b96e9422cb271b6d0afc106a507d8cc4839</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In nodejs-mem before version 4.0.0 there is a memory leak due to old results not being removed from the cache despite reaching maxAge. Exploitation of this can lead to exhaustion of memory and subsequent denial of service.
<p>Publish Date: 2019-05-30 <p>URL: <a href=https://bugzilla.redhat.com/show_bug.cgi?id=1623744>WS-2018-0236</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>5.5</b>)</summary> <p> Base Score Metrics not available</p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://bugzilla.redhat.com/show_bug.cgi?id=1623744">https://bugzilla.redhat.com/show_bug.cgi?id=1623744</a></p> <p>Release Date: 2019-05-30</p> <p>Fix Resolution: 4.0.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
WS-2018-0236 (Medium) detected in mem-1.1.0.tgz - ## WS-2018-0236 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>mem-1.1.0.tgz</b></p></summary> <p>Memoize functions - An optimization used to speed up consecutive function calls by caching the result of calls with identical input</p> <p>Library home page: <a href="https://registry.npmjs.org/mem/-/mem-1.1.0.tgz">https://registry.npmjs.org/mem/-/mem-1.1.0.tgz</a></p> <p>Path to dependency file: /github-security-on-github/package.json</p> <p>Path to vulnerable library: /tmp/git/github-security-on-github/node_modules/npm/node_modules/mem/package.json</p> <p> Dependency Hierarchy: - npm-6.11.1.tgz (Root Library) - libnpx-10.2.0.tgz - yargs-11.0.0.tgz - os-locale-2.1.0.tgz - :x: **mem-1.1.0.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/EduSemensati/github-security-on-github/commit/af410b96e9422cb271b6d0afc106a507d8cc4839">af410b96e9422cb271b6d0afc106a507d8cc4839</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In nodejs-mem before version 4.0.0 there is a memory leak due to old results not being removed from the cache despite reaching maxAge. Exploitation of this can lead to exhaustion of memory and subsequent denial of service.
<p>Publish Date: 2019-05-30 <p>URL: <a href=https://bugzilla.redhat.com/show_bug.cgi?id=1623744>WS-2018-0236</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>5.5</b>)</summary> <p> Base Score Metrics not available</p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://bugzilla.redhat.com/show_bug.cgi?id=1623744">https://bugzilla.redhat.com/show_bug.cgi?id=1623744</a></p> <p>Release Date: 2019-05-30</p> <p>Fix Resolution: 4.0.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
ws medium detected in mem tgz ws medium severity vulnerability vulnerable library mem tgz memoize functions an optimization used to speed up consecutive function calls by caching the result of calls with identical input library home page a href path to dependency file github security on github package json path to vulnerable library tmp git github security on github node modules npm node modules mem package json dependency hierarchy npm tgz root library libnpx tgz yargs tgz os locale tgz x mem tgz vulnerable library found in head commit a href vulnerability details in nodejs mem before version there is a memory leak due to old results not being removed from the cache despite reaching maxage exploitation of this can lead to exhaustion of memory and subsequent denial of service publish date url a href cvss score details base score metrics not available suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
0
153,566
13,518,845,815
IssuesEvent
2020-09-15 00:16:58
Descent098/pystall
https://api.github.com/repos/Descent098/pystall
closed
Documentation improvements
documentation
- [x] Update resource library list - [x] Add info for debian and windows resource classes
1.0
Documentation improvements - - [x] Update resource library list - [x] Add info for debian and windows resource classes
non_process
documentation improvements update resource library list add info for debian and windows resource classes
0
14,281
17,170,829,948
IssuesEvent
2021-07-15 04:00:39
checkstyle/checkstyle
https://api.github.com/repos/checkstyle/checkstyle
closed
Inconsistent `TYPE_ARGUMENT` AST
approved breaking compatibility
The `TYPE_ARGUMENT` AST is not consistent when it is inside of a `TYPE_UPPER_BOUNDS` AST: ```bash ➜ src cat Test.java import java.util.List; public class Test<T> { List<String> myList; List<T> myOtherList; List<String> listMethod1() { return null; } List<T> listMethod2() { return null; } public <E extends Enum<E>> void m22() {} } ➜ src javac Test.java ➜ src java -jar checkstyle-8.44-SNAPSHOT-all.jar -t Test.java IMPORT -> import [1:0] |--DOT -> . [1:16] | |--DOT -> . [1:11] | | |--IDENT -> java [1:7] | | `--IDENT -> util [1:12] | `--IDENT -> List [1:17] `--SEMI -> ; [1:21] CLASS_DEF -> CLASS_DEF [3:0] |--MODIFIERS -> MODIFIERS [3:0] | `--LITERAL_PUBLIC -> public [3:0] |--LITERAL_CLASS -> class [3:7] |--IDENT -> Test [3:13] |--TYPE_PARAMETERS -> TYPE_PARAMETERS [3:17] | |--GENERIC_START -> < [3:17] | |--TYPE_PARAMETER -> TYPE_PARAMETER [3:18] | | `--IDENT -> T [3:18] | `--GENERIC_END -> > [3:19] `--OBJBLOCK -> OBJBLOCK [3:21] |--LCURLY -> { [3:21] |--VARIABLE_DEF -> VARIABLE_DEF [4:4] | |--MODIFIERS -> MODIFIERS [4:4] | |--TYPE -> TYPE [4:4] | | |--IDENT -> List [4:4] | | `--TYPE_ARGUMENTS -> TYPE_ARGUMENTS [4:8] | | |--GENERIC_START -> < [4:8] | | |--TYPE_ARGUMENT -> TYPE_ARGUMENT [4:9] // no TYPE node | | | `--IDENT -> String [4:9] | | `--GENERIC_END -> > [4:15] | |--IDENT -> myList [4:17] | `--SEMI -> ; [4:23] |--VARIABLE_DEF -> VARIABLE_DEF [5:4] | |--MODIFIERS -> MODIFIERS [5:4] | |--TYPE -> TYPE [5:4] | | |--IDENT -> List [5:4] | | `--TYPE_ARGUMENTS -> TYPE_ARGUMENTS [5:8] | | |--GENERIC_START -> < [5:8] | | |--TYPE_ARGUMENT -> TYPE_ARGUMENT [5:9] // no TYPE node | | | `--IDENT -> T [5:9] | | `--GENERIC_END -> > [5:10] | |--IDENT -> myOtherList [5:12] | `--SEMI -> ; [5:23] |--METHOD_DEF -> METHOD_DEF [7:4] | |--MODIFIERS -> MODIFIERS [7:4] | |--TYPE -> TYPE [7:4] | | |--IDENT -> List [7:4] | | `--TYPE_ARGUMENTS -> TYPE_ARGUMENTS [7:8] | | |--GENERIC_START -> < [7:8] | | |--TYPE_ARGUMENT -> TYPE_ARGUMENT [7:9] // no TYPE node | | | `--IDENT -> String [7:9] | | 
`--GENERIC_END -> > [7:15] | |--IDENT -> listMethod1 [7:17] | |--LPAREN -> ( [7:28] | |--PARAMETERS -> PARAMETERS [7:29] | |--RPAREN -> ) [7:29] | `--SLIST -> { [7:31] | |--LITERAL_RETURN -> return [8:8] | | |--EXPR -> EXPR [8:15] | | | `--LITERAL_NULL -> null [8:15] | | `--SEMI -> ; [8:19] | `--RCURLY -> } [9:4] |--METHOD_DEF -> METHOD_DEF [10:4] | |--MODIFIERS -> MODIFIERS [10:4] | |--TYPE -> TYPE [10:4] | | |--IDENT -> List [10:4] | | `--TYPE_ARGUMENTS -> TYPE_ARGUMENTS [10:8] | | |--GENERIC_START -> < [10:8] | | |--TYPE_ARGUMENT -> TYPE_ARGUMENT [10:9] // no TYPE node | | | `--IDENT -> T [10:9] | | `--GENERIC_END -> > [10:10] | |--IDENT -> listMethod2 [10:12] | |--LPAREN -> ( [10:23] | |--PARAMETERS -> PARAMETERS [10:24] | |--RPAREN -> ) [10:24] | `--SLIST -> { [10:26] | |--LITERAL_RETURN -> return [11:8] | | |--EXPR -> EXPR [11:15] | | | `--LITERAL_NULL -> null [11:15] | | `--SEMI -> ; [11:19] | `--RCURLY -> } [12:4] |--METHOD_DEF -> METHOD_DEF [14:4] | |--MODIFIERS -> MODIFIERS [14:4] | | `--LITERAL_PUBLIC -> public [14:4] | |--TYPE_PARAMETERS -> TYPE_PARAMETERS [14:11] | | |--GENERIC_START -> < [14:11] | | |--TYPE_PARAMETER -> TYPE_PARAMETER [14:12] | | | |--IDENT -> E [14:12] | | | `--TYPE_UPPER_BOUNDS -> extends [14:14] | | | |--IDENT -> Enum [14:22] | | | `--TYPE_ARGUMENTS -> TYPE_ARGUMENTS [14:26] | | | |--GENERIC_START -> < [14:26] | | | |--TYPE_ARGUMENT -> TYPE_ARGUMENT [14:27] | | | | `--TYPE -> TYPE [14:27] // TYPE node is inconsistent | | | | `--IDENT -> E [14:27] | | | `--GENERIC_END -> > [14:28] | | `--GENERIC_END -> > [14:29] | |--TYPE -> TYPE [14:31] | | `--LITERAL_VOID -> void [14:31] | |--IDENT -> m22 [14:36] | |--LPAREN -> ( [14:39] | |--PARAMETERS -> PARAMETERS [14:40] | |--RPAREN -> ) [14:40] | `--SLIST -> { [14:42] | `--RCURLY -> } [14:43] `--RCURLY -> } [16:0 ```
True
Inconsistent `TYPE_ARGUMENT` AST - The `TYPE_ARGUMENT` AST is not consistent when it is inside of a `TYPE_UPPER_BOUNDS` AST: ```bash ➜ src cat Test.java import java.util.List; public class Test<T> { List<String> myList; List<T> myOtherList; List<String> listMethod1() { return null; } List<T> listMethod2() { return null; } public <E extends Enum<E>> void m22() {} } ➜ src javac Test.java ➜ src java -jar checkstyle-8.44-SNAPSHOT-all.jar -t Test.java IMPORT -> import [1:0] |--DOT -> . [1:16] | |--DOT -> . [1:11] | | |--IDENT -> java [1:7] | | `--IDENT -> util [1:12] | `--IDENT -> List [1:17] `--SEMI -> ; [1:21] CLASS_DEF -> CLASS_DEF [3:0] |--MODIFIERS -> MODIFIERS [3:0] | `--LITERAL_PUBLIC -> public [3:0] |--LITERAL_CLASS -> class [3:7] |--IDENT -> Test [3:13] |--TYPE_PARAMETERS -> TYPE_PARAMETERS [3:17] | |--GENERIC_START -> < [3:17] | |--TYPE_PARAMETER -> TYPE_PARAMETER [3:18] | | `--IDENT -> T [3:18] | `--GENERIC_END -> > [3:19] `--OBJBLOCK -> OBJBLOCK [3:21] |--LCURLY -> { [3:21] |--VARIABLE_DEF -> VARIABLE_DEF [4:4] | |--MODIFIERS -> MODIFIERS [4:4] | |--TYPE -> TYPE [4:4] | | |--IDENT -> List [4:4] | | `--TYPE_ARGUMENTS -> TYPE_ARGUMENTS [4:8] | | |--GENERIC_START -> < [4:8] | | |--TYPE_ARGUMENT -> TYPE_ARGUMENT [4:9] // no TYPE node | | | `--IDENT -> String [4:9] | | `--GENERIC_END -> > [4:15] | |--IDENT -> myList [4:17] | `--SEMI -> ; [4:23] |--VARIABLE_DEF -> VARIABLE_DEF [5:4] | |--MODIFIERS -> MODIFIERS [5:4] | |--TYPE -> TYPE [5:4] | | |--IDENT -> List [5:4] | | `--TYPE_ARGUMENTS -> TYPE_ARGUMENTS [5:8] | | |--GENERIC_START -> < [5:8] | | |--TYPE_ARGUMENT -> TYPE_ARGUMENT [5:9] // no TYPE node | | | `--IDENT -> T [5:9] | | `--GENERIC_END -> > [5:10] | |--IDENT -> myOtherList [5:12] | `--SEMI -> ; [5:23] |--METHOD_DEF -> METHOD_DEF [7:4] | |--MODIFIERS -> MODIFIERS [7:4] | |--TYPE -> TYPE [7:4] | | |--IDENT -> List [7:4] | | `--TYPE_ARGUMENTS -> TYPE_ARGUMENTS [7:8] | | |--GENERIC_START -> < [7:8] | | |--TYPE_ARGUMENT -> TYPE_ARGUMENT [7:9] // no TYPE node 
| | | `--IDENT -> String [7:9] | | `--GENERIC_END -> > [7:15] | |--IDENT -> listMethod1 [7:17] | |--LPAREN -> ( [7:28] | |--PARAMETERS -> PARAMETERS [7:29] | |--RPAREN -> ) [7:29] | `--SLIST -> { [7:31] | |--LITERAL_RETURN -> return [8:8] | | |--EXPR -> EXPR [8:15] | | | `--LITERAL_NULL -> null [8:15] | | `--SEMI -> ; [8:19] | `--RCURLY -> } [9:4] |--METHOD_DEF -> METHOD_DEF [10:4] | |--MODIFIERS -> MODIFIERS [10:4] | |--TYPE -> TYPE [10:4] | | |--IDENT -> List [10:4] | | `--TYPE_ARGUMENTS -> TYPE_ARGUMENTS [10:8] | | |--GENERIC_START -> < [10:8] | | |--TYPE_ARGUMENT -> TYPE_ARGUMENT [10:9] // no TYPE node | | | `--IDENT -> T [10:9] | | `--GENERIC_END -> > [10:10] | |--IDENT -> listMethod2 [10:12] | |--LPAREN -> ( [10:23] | |--PARAMETERS -> PARAMETERS [10:24] | |--RPAREN -> ) [10:24] | `--SLIST -> { [10:26] | |--LITERAL_RETURN -> return [11:8] | | |--EXPR -> EXPR [11:15] | | | `--LITERAL_NULL -> null [11:15] | | `--SEMI -> ; [11:19] | `--RCURLY -> } [12:4] |--METHOD_DEF -> METHOD_DEF [14:4] | |--MODIFIERS -> MODIFIERS [14:4] | | `--LITERAL_PUBLIC -> public [14:4] | |--TYPE_PARAMETERS -> TYPE_PARAMETERS [14:11] | | |--GENERIC_START -> < [14:11] | | |--TYPE_PARAMETER -> TYPE_PARAMETER [14:12] | | | |--IDENT -> E [14:12] | | | `--TYPE_UPPER_BOUNDS -> extends [14:14] | | | |--IDENT -> Enum [14:22] | | | `--TYPE_ARGUMENTS -> TYPE_ARGUMENTS [14:26] | | | |--GENERIC_START -> < [14:26] | | | |--TYPE_ARGUMENT -> TYPE_ARGUMENT [14:27] | | | | `--TYPE -> TYPE [14:27] // TYPE node is inconsistent | | | | `--IDENT -> E [14:27] | | | `--GENERIC_END -> > [14:28] | | `--GENERIC_END -> > [14:29] | |--TYPE -> TYPE [14:31] | | `--LITERAL_VOID -> void [14:31] | |--IDENT -> m22 [14:36] | |--LPAREN -> ( [14:39] | |--PARAMETERS -> PARAMETERS [14:40] | |--RPAREN -> ) [14:40] | `--SLIST -> { [14:42] | `--RCURLY -> } [14:43] `--RCURLY -> } [16:0 ```
non_process
inconsistent type argument ast the type argument ast is not consistent when it is inside of a type upper bounds ast bash ➜ src cat test java import java util list public class test list mylist list myotherlist list return null list return null public void ➜ src javac test java ➜ src java jar checkstyle snapshot all jar t test java import import dot dot ident java ident util ident list semi class def class def modifiers modifiers literal public public literal class class ident test type parameters type parameters generic start type parameter type parameter ident t generic end objblock objblock lcurly variable def variable def modifiers modifiers type type ident list type arguments type arguments generic start type argument type argument no type node ident string generic end ident mylist semi variable def variable def modifiers modifiers type type ident list type arguments type arguments generic start type argument type argument no type node ident t generic end ident myotherlist semi method def method def modifiers modifiers type type ident list type arguments type arguments generic start type argument type argument no type node ident string generic end ident lparen parameters parameters rparen slist literal return return expr expr literal null null semi rcurly method def method def modifiers modifiers type type ident list type arguments type arguments generic start type argument type argument no type node ident t generic end ident lparen parameters parameters rparen slist literal return return expr expr literal null null semi rcurly method def method def modifiers modifiers literal public public type parameters type parameters generic start type parameter type parameter ident e type upper bounds extends ident enum type arguments type arguments generic start type argument type argument type type type node is inconsistent ident e generic end generic end type type literal void void ident lparen parameters parameters rparen slist rcurly rcurly
0
2,374
5,172,669,406
IssuesEvent
2017-01-18 14:14:09
openvstorage/volumedriver
https://api.github.com/repos/openvstorage/volumedriver
closed
package dependency centos 7 15.11
process_duplicate type_bug
_From @hoanhdo on December 13, 2016 4:32_ I run install openvstorage in centos 7 15.11 yum install --nogpgcheck --enablerepo=fc22 librbd1 librados2 gcc volumedriver-server -y. This is my log. ``` --> Finished Dependency Resolution Error: Package: 1:librbd1-0.94.5-1.el7.x86_64 (base) Requires: libboost_thread-mt.so.1.53.0()(64bit) Available: boost-thread-1.53.0-26.el7.x86_64 (base) libboost_thread-mt.so.1.53.0()(64bit) Available: boost-thread-1.57.0-6.el7.centos.x86_64 (openvstorage) Not found Available: boost-thread-1.57.0-6.fc22.i686 (fc22) Not found Error: Package: 1:librbd1-0.94.5-1.el7.x86_64 (base) Requires: libboost_system-mt.so.1.53.0()(64bit) Available: boost-system-1.53.0-26.el7.x86_64 (base) libboost_system-mt.so.1.53.0()(64bit) Removing: boost-system-1.57.0-6.el7.centos.x86_64 (@openvstorage) Not found Updated By: boost-system-1.57.0-6.fc22.x86_64 (fc22) Not found Error: Package: volumedriver-server-6.0.0.0-1.x86_64 (openvstorage) Requires: libzmq.so.4()(64bit) Available: zeromq-4.0.5-1.fc22.x86_64 (fc22) libzmq.so.4()(64bit) Installed: zeromq-4.1.4-5.el7.x86_64 (@epel) ~libzmq.so.5()(64bit) Error: Package: 1:librados2-0.94.5-1.el7.x86_64 (base) Requires: libboost_system-mt.so.1.53.0()(64bit) Available: boost-system-1.53.0-26.el7.x86_64 (base) libboost_system-mt.so.1.53.0()(64bit) Removing: boost-system-1.57.0-6.el7.centos.x86_64 (@openvstorage) Not found Updated By: boost-system-1.57.0-6.fc22.x86_64 (fc22) Not found Error: Package: volumedriver-base-6.0.0.0-1.x86_64 (openvstorage) Requires: libzmq.so.4()(64bit) Available: zeromq-4.0.5-1.fc22.x86_64 (fc22) libzmq.so.4()(64bit) Installed: zeromq-4.1.4-5.el7.x86_64 (@epel) ~libzmq.so.5()(64bit) Error: Package: 1:librados2-0.94.5-1.el7.x86_64 (base) Requires: libboost_system-mt.so.1.53.0()(64bit) Available: boost-system-1.53.0-26.el7.x86_64 (base) libboost_system-mt.so.1.53.0()(64bit) Installed: boost-system-1.57.0-6.el7.centos.x86_64 (@openvstorage) Not found Available: boost-system-1.57.0-6.fc22.i686 
(fc22) Not found Error: Package: volumedriver-server-6.0.0.0-1.x86_64 (openvstorage) Requires: liburcu-bp.so.1()(64bit) Removing: userspace-rcu-0.7.16-1.el7.x86_64 (@epel) liburcu-bp.so.1()(64bit) Updated By: userspace-rcu-0.8.5-1.fc22.x86_64 (fc22) ~liburcu-bp.so.2()(64bit) Error: Package: 1:librados2-0.94.5-1.el7.x86_64 (base) Requires: libboost_thread-mt.so.1.53.0()(64bit) Available: boost-thread-1.53.0-26.el7.x86_64 (base) libboost_thread-mt.so.1.53.0()(64bit) Available: boost-thread-1.57.0-6.el7.centos.x86_64 (openvstorage) Not found Available: boost-thread-1.57.0-6.fc22.i686 (fc22) Not found Error: Package: 1:librbd1-0.94.5-1.el7.x86_64 (base) Requires: libboost_system-mt.so.1.53.0()(64bit) Available: boost-system-1.53.0-26.el7.x86_64 (base) libboost_system-mt.so.1.53.0()(64bit) Installed: boost-system-1.57.0-6.el7.centos.x86_64 (@openvstorage) Not found Available: boost-system-1.57.0-6.fc22.i686 (fc22) Not found Error: Package: volumedriver-base-6.0.0.0-1.x86_64 (openvstorage) Requires: liburcu-bp.so.1()(64bit) Removing: userspace-rcu-0.7.16-1.el7.x86_64 (@epel) liburcu-bp.so.1()(64bit) Updated By: userspace-rcu-0.8.5-1.fc22.x86_64 (fc22) ~liburcu-bp.so.2()(64bit) Error: Package: volumedriver-server-6.0.0.0-1.x86_64 (openvstorage) Requires: liburcu-cds.so.1()(64bit) Removing: userspace-rcu-0.7.16-1.el7.x86_64 (@epel) liburcu-cds.so.1()(64bit) Updated By: userspace-rcu-0.8.5-1.fc22.x86_64 (fc22) ~liburcu-cds.so.2()(64bit) Error: Package: volumedriver-base-6.0.0.0-1.x86_64 (openvstorage) Requires: liburcu-cds.so.1()(64bit) Removing: userspace-rcu-0.7.16-1.el7.x86_64 (@epel) liburcu-cds.so.1()(64bit) Updated By: userspace-rcu-0.8.5-1.fc22.x86_64 (fc22) ~liburcu-cds.so.2()(64bit) You could try using --skip-broken to work around the problem You could try running: rpm -Va --nofiles --nodigest ``` _Copied from original issue: openvstorage/framework#1263_
1.0
package dependency centos 7 15.11 - _From @hoanhdo on December 13, 2016 4:32_ I run install openvstorage in centos 7 15.11 yum install --nogpgcheck --enablerepo=fc22 librbd1 librados2 gcc volumedriver-server -y. This is my log. ``` --> Finished Dependency Resolution Error: Package: 1:librbd1-0.94.5-1.el7.x86_64 (base) Requires: libboost_thread-mt.so.1.53.0()(64bit) Available: boost-thread-1.53.0-26.el7.x86_64 (base) libboost_thread-mt.so.1.53.0()(64bit) Available: boost-thread-1.57.0-6.el7.centos.x86_64 (openvstorage) Not found Available: boost-thread-1.57.0-6.fc22.i686 (fc22) Not found Error: Package: 1:librbd1-0.94.5-1.el7.x86_64 (base) Requires: libboost_system-mt.so.1.53.0()(64bit) Available: boost-system-1.53.0-26.el7.x86_64 (base) libboost_system-mt.so.1.53.0()(64bit) Removing: boost-system-1.57.0-6.el7.centos.x86_64 (@openvstorage) Not found Updated By: boost-system-1.57.0-6.fc22.x86_64 (fc22) Not found Error: Package: volumedriver-server-6.0.0.0-1.x86_64 (openvstorage) Requires: libzmq.so.4()(64bit) Available: zeromq-4.0.5-1.fc22.x86_64 (fc22) libzmq.so.4()(64bit) Installed: zeromq-4.1.4-5.el7.x86_64 (@epel) ~libzmq.so.5()(64bit) Error: Package: 1:librados2-0.94.5-1.el7.x86_64 (base) Requires: libboost_system-mt.so.1.53.0()(64bit) Available: boost-system-1.53.0-26.el7.x86_64 (base) libboost_system-mt.so.1.53.0()(64bit) Removing: boost-system-1.57.0-6.el7.centos.x86_64 (@openvstorage) Not found Updated By: boost-system-1.57.0-6.fc22.x86_64 (fc22) Not found Error: Package: volumedriver-base-6.0.0.0-1.x86_64 (openvstorage) Requires: libzmq.so.4()(64bit) Available: zeromq-4.0.5-1.fc22.x86_64 (fc22) libzmq.so.4()(64bit) Installed: zeromq-4.1.4-5.el7.x86_64 (@epel) ~libzmq.so.5()(64bit) Error: Package: 1:librados2-0.94.5-1.el7.x86_64 (base) Requires: libboost_system-mt.so.1.53.0()(64bit) Available: boost-system-1.53.0-26.el7.x86_64 (base) libboost_system-mt.so.1.53.0()(64bit) Installed: boost-system-1.57.0-6.el7.centos.x86_64 (@openvstorage) Not found Available: 
boost-system-1.57.0-6.fc22.i686 (fc22) Not found Error: Package: volumedriver-server-6.0.0.0-1.x86_64 (openvstorage) Requires: liburcu-bp.so.1()(64bit) Removing: userspace-rcu-0.7.16-1.el7.x86_64 (@epel) liburcu-bp.so.1()(64bit) Updated By: userspace-rcu-0.8.5-1.fc22.x86_64 (fc22) ~liburcu-bp.so.2()(64bit) Error: Package: 1:librados2-0.94.5-1.el7.x86_64 (base) Requires: libboost_thread-mt.so.1.53.0()(64bit) Available: boost-thread-1.53.0-26.el7.x86_64 (base) libboost_thread-mt.so.1.53.0()(64bit) Available: boost-thread-1.57.0-6.el7.centos.x86_64 (openvstorage) Not found Available: boost-thread-1.57.0-6.fc22.i686 (fc22) Not found Error: Package: 1:librbd1-0.94.5-1.el7.x86_64 (base) Requires: libboost_system-mt.so.1.53.0()(64bit) Available: boost-system-1.53.0-26.el7.x86_64 (base) libboost_system-mt.so.1.53.0()(64bit) Installed: boost-system-1.57.0-6.el7.centos.x86_64 (@openvstorage) Not found Available: boost-system-1.57.0-6.fc22.i686 (fc22) Not found Error: Package: volumedriver-base-6.0.0.0-1.x86_64 (openvstorage) Requires: liburcu-bp.so.1()(64bit) Removing: userspace-rcu-0.7.16-1.el7.x86_64 (@epel) liburcu-bp.so.1()(64bit) Updated By: userspace-rcu-0.8.5-1.fc22.x86_64 (fc22) ~liburcu-bp.so.2()(64bit) Error: Package: volumedriver-server-6.0.0.0-1.x86_64 (openvstorage) Requires: liburcu-cds.so.1()(64bit) Removing: userspace-rcu-0.7.16-1.el7.x86_64 (@epel) liburcu-cds.so.1()(64bit) Updated By: userspace-rcu-0.8.5-1.fc22.x86_64 (fc22) ~liburcu-cds.so.2()(64bit) Error: Package: volumedriver-base-6.0.0.0-1.x86_64 (openvstorage) Requires: liburcu-cds.so.1()(64bit) Removing: userspace-rcu-0.7.16-1.el7.x86_64 (@epel) liburcu-cds.so.1()(64bit) Updated By: userspace-rcu-0.8.5-1.fc22.x86_64 (fc22) ~liburcu-cds.so.2()(64bit) You could try using --skip-broken to work around the problem You could try running: rpm -Va --nofiles --nodigest ``` _Copied from original issue: openvstorage/framework#1263_
process
package dependency centos from hoanhdo on december i run install openvstorage in centos yum install nogpgcheck enablerepo gcc volumedriver server y this is my log finished dependency resolution error package base requires libboost thread mt so available boost thread base libboost thread mt so available boost thread centos openvstorage not found available boost thread not found error package base requires libboost system mt so available boost system base libboost system mt so removing boost system centos openvstorage not found updated by boost system not found error package volumedriver server openvstorage requires libzmq so available zeromq libzmq so installed zeromq epel libzmq so error package base requires libboost system mt so available boost system base libboost system mt so removing boost system centos openvstorage not found updated by boost system not found error package volumedriver base openvstorage requires libzmq so available zeromq libzmq so installed zeromq epel libzmq so error package base requires libboost system mt so available boost system base libboost system mt so installed boost system centos openvstorage not found available boost system not found error package volumedriver server openvstorage requires liburcu bp so removing userspace rcu epel liburcu bp so updated by userspace rcu liburcu bp so error package base requires libboost thread mt so available boost thread base libboost thread mt so available boost thread centos openvstorage not found available boost thread not found error package base requires libboost system mt so available boost system base libboost system mt so installed boost system centos openvstorage not found available boost system not found error package volumedriver base openvstorage requires liburcu bp so removing userspace rcu epel liburcu bp so updated by userspace rcu liburcu bp so error package volumedriver server openvstorage requires liburcu cds so removing userspace rcu epel liburcu cds so updated by userspace rcu 
liburcu cds so error package volumedriver base openvstorage requires liburcu cds so removing userspace rcu epel liburcu cds so updated by userspace rcu liburcu cds so you could try using skip broken to work around the problem you could try running rpm va nofiles nodigest copied from original issue openvstorage framework
1
330,859
24,280,917,130
IssuesEvent
2022-09-28 17:19:06
markbaaijens/rpmusicserver
https://api.github.com/repos/markbaaijens/rpmusicserver
closed
If hostname or ip of rpms does not show up, mention to connect a display and keyboard to the pi (readme)
documentation good first issue wait for author
Normally, no keyboard or display is needed to install rpms to a Pi. But sometimes, when the Pi is not appearing to the network, some help in the form of a keyboard/display is needed to analyse the problem, This should be documented in the readme. ## Use case As a user, I want to know what to do when the Pi does not appear in the network. ## Tasks (changes in Italic): (1) * if hostname `rpms` is not found (after reboot router): * reboot pi (best done by rpms web-interface) * _if rpms still cannot be found, it's possible that the pi is unable to connect to the network:_ * _connect a keyboard and display to the pi to troubleshoot the issue directly from the device_ (2) * command `nmap 192.168.x.* -p 9002 --open` * must be * `nmap $(echo "$(hostname -I | cut -d"." -f1-3).*") -p 22 --open` (3) * find a device which has port 9002 open (that will be the Pi running RPMS) * must be * find a device which has port 22 open (that might be the Pi running RPMS) (4) * remove line: * fill for x your personal subnet-number; use hostname -I | awk '{print $1}' to retrieve that info
1.0
If hostname or ip of rpms does not show up, mention to connect a display and keyboard to the pi (readme) - Normally, no keyboard or display is needed to install rpms to a Pi. But sometimes, when the Pi is not appearing to the network, some help in the form of a keyboard/display is needed to analyse the problem, This should be documented in the readme. ## Use case As a user, I want to know what to do when the Pi does not appear in the network. ## Tasks (changes in Italic): (1) * if hostname `rpms` is not found (after reboot router): * reboot pi (best done by rpms web-interface) * _if rpms still cannot be found, it's possible that the pi is unable to connect to the network:_ * _connect a keyboard and display to the pi to troubleshoot the issue directly from the device_ (2) * command `nmap 192.168.x.* -p 9002 --open` * must be * `nmap $(echo "$(hostname -I | cut -d"." -f1-3).*") -p 22 --open` (3) * find a device which has port 9002 open (that will be the Pi running RPMS) * must be * find a device which has port 22 open (that might be the Pi running RPMS) (4) * remove line: * fill for x your personal subnet-number; use hostname -I | awk '{print $1}' to retrieve that info
non_process
if hostname or ip of rpms does not show up mention to connect a display and keyboard to the pi readme normally no keyboard or display is needed to install rpms to a pi but sometimes when the pi is not appearing to the network some help in the form of a keyboard display is needed to analyse the problem this should be documented in the readme use case as a user i want to know what to do when the pi does not appear in the network tasks changes in italic if hostname rpms is not found after reboot router reboot pi best done by rpms web interface if rpms still cannot be found it s possible that the pi is unable to connect to the network connect a keyboard and display to the pi to troubleshoot the issue directly from the device command nmap x p open must be nmap echo hostname i cut d p open find a device which has port open that will be the pi running rpms must be find a device which has port open that might be the pi running rpms remove line fill for x your personal subnet number use hostname i awk print to retrieve that info
0
22,217
30,768,106,381
IssuesEvent
2023-07-30 14:59:09
km4ack/73Linux
https://api.github.com/repos/km4ack/73Linux
closed
73Linux unistalls
in process
ran script 73 linux installed and I only installed Conky just to try the interface out. I rebooted and ran HamRadio>73 Linux and asked if I wanted to update and click yes It removed 73 linux from the menu structure. I am running git script again to see if it fixes it. running on fresh install of mint on Fujitsu Laptop
1.0
73Linux unistalls - ran script 73 linux installed and I only installed Conky just to try the interface out. I rebooted and ran HamRadio>73 Linux and asked if I wanted to update and click yes It removed 73 linux from the menu structure. I am running git script again to see if it fixes it. running on fresh install of mint on Fujitsu Laptop
process
unistalls ran script linux installed and i only installed conky just to try the interface out i rebooted and ran hamradio linux and asked if i wanted to update and click yes it removed linux from the menu structure i am running git script again to see if it fixes it running on fresh install of mint on fujitsu laptop
1
7,561
10,680,476,084
IssuesEvent
2019-10-21 21:30:55
JustBru00/RenamePlugin
https://api.github.com/repos/JustBru00/RenamePlugin
reopened
Use with Anvil / Lock names behind permissions
Processing Question
Hi there I'm currently trying to figure out a way to restrict my players from naming items certain names and lock them behind permissions. For example I want to make some custom elytra skins (which already work) but I don't want my players to be able to rename their elytra to "red elytra" to unlock the skin without first being given the permission to do this. Is this possible with this plugin?
1.0
Use with Anvil / Lock names behind permissions - Hi there I'm currently trying to figure out a way to restrict my players from naming items certain names and lock them behind permissions. For example I want to make some custom elytra skins (which already work) but I don't want my players to be able to rename their elytra to "red elytra" to unlock the skin without first being given the permission to do this. Is this possible with this plugin?
process
use with anvil lock names behind permissions hi there i m currently trying to figure out a way to restrict my players from naming items certain names and lock them behind permissions for example i want to make some custom elytra skins which already work but i don t want my players to be able to rename their elytra to red elytra to unlock the skin without first being given the permission to do this is this possible with this plugin
1
18,788
24,692,320,174
IssuesEvent
2022-10-19 09:26:55
TUM-Dev/NavigaTUM
https://api.github.com/repos/TUM-Dev/NavigaTUM
closed
[Entry] [8120.01.101]: Koordinate bearbeiten
entry webform delete-after-processing
Hallo, ich möchte diese Koordinate zum Roomfinder hinzufügen: ``` "8120.01.101": {coords: {lat: 48.26546490139094, lon: 11.672389219881723}},```
1.0
[Entry] [8120.01.101]: Koordinate bearbeiten - Hallo, ich möchte diese Koordinate zum Roomfinder hinzufügen: ``` "8120.01.101": {coords: {lat: 48.26546490139094, lon: 11.672389219881723}},```
process
koordinate bearbeiten hallo ich möchte diese koordinate zum roomfinder hinzufügen coords lat lon
1
3,757
3,548,174,014
IssuesEvent
2016-01-20 13:22:29
MISP/MISP
https://api.github.com/repos/MISP/MISP
opened
Display orgs in Filter event index
enhancement usability
Display org name instead of org ID in the filter for events, both in the search box to compose the filter and on the list of results.
True
Display orgs in Filter event index - Display org name instead of org ID in the filter for events, both in the search box to compose the filter and on the list of results.
non_process
display orgs in filter event index display org name instead of org id in the filter for events both in the search box to compose the filter and on the list of results
0
1,750
4,442,698,915
IssuesEvent
2016-08-19 14:21:47
sysown/proxysql
https://api.github.com/repos/sysown/proxysql
closed
Allows multiple query rewrite
ADMIN QUERY PROCESSOR
In the current implementation of Query_Processor and `mysql_query_rules` , a query is rewritten only at the end of the analysis of the rules of `mysql_query_rules`. This leads to the fact that a query can only be rewritten once. To allow multiple rewrites a lot of code needs to changed, because after each re-write we need to recalculate: - original query - digest
1.0
Allows multiple query rewrite - In the current implementation of Query_Processor and `mysql_query_rules` , a query is rewritten only at the end of the analysis of the rules of `mysql_query_rules`. This leads to the fact that a query can only be rewritten once. To allow multiple rewrites a lot of code needs to changed, because after each re-write we need to recalculate: - original query - digest
process
allows multiple query rewrite in the current implementation of query processor and mysql query rules a query is rewritten only at the end of the analysis of the rules of mysql query rules this leads to the fact that a query can only be rewritten once to allow multiple rewrites a lot of code needs to changed because after each re write we need to recalculate original query digest
1
950
3,413,047,277
IssuesEvent
2015-12-06 11:23:48
pwittchen/ReactiveSensors
https://api.github.com/repos/pwittchen/ReactiveSensors
opened
Release 0.1.0
release process
**Initial release notes**: - removed `filterSensorChanged()` method from `ReactiveSensorEvent` class - removed `filterAccuracyChanged()` method from `ReactiveSensorEvent` class - created `ReactiveSensorFilter` class - added `filterSensorChanged()` method to `ReactiveSensorFilter` class - added `filterAccuracyChanged()` method to `ReactiveSensorFilter` class - added sample app in Kotlin - improved sample apps - added Static Code Analysis - updated documentation in `README.md` file **Things to do**: TBD.
1.0
Release 0.1.0 - **Initial release notes**: - removed `filterSensorChanged()` method from `ReactiveSensorEvent` class - removed `filterAccuracyChanged()` method from `ReactiveSensorEvent` class - created `ReactiveSensorFilter` class - added `filterSensorChanged()` method to `ReactiveSensorFilter` class - added `filterAccuracyChanged()` method to `ReactiveSensorFilter` class - added sample app in Kotlin - improved sample apps - added Static Code Analysis - updated documentation in `README.md` file **Things to do**: TBD.
process
release initial release notes removed filtersensorchanged method from reactivesensorevent class removed filteraccuracychanged method from reactivesensorevent class created reactivesensorfilter class added filtersensorchanged method to reactivesensorfilter class added filteraccuracychanged method to reactivesensorfilter class added sample app in kotlin improved sample apps added static code analysis updated documentation in readme md file things to do tbd
1
1,946
2,678,634,751
IssuesEvent
2015-03-26 12:18:43
stan-dev/stan
https://api.github.com/repos/stan-dev/stan
closed
fix types in check_equal so clients don't need to cast
Code cleanup Feature
There's a problem with issue #1405 triggering compiler style warnings that are due to a problem in `check_equal`. Those calls to `check_equal` should not be triggering warnings.
1.0
fix types in check_equal so clients don't need to cast - There's a problem with issue #1405 triggering compiler style warnings that are due to a problem in `check_equal`. Those calls to `check_equal` should not be triggering warnings.
non_process
fix types in check equal so clients don t need to cast there s a problem with issue triggering compiler style warnings that are due to a problem in check equal those calls to check equal should not be triggering warnings
0
138,611
11,208,765,384
IssuesEvent
2020-01-06 08:49:31
Huskehhh/FakeBlock
https://api.github.com/repos/Huskehhh/FakeBlock
closed
Feature request: Message formats and localization
enhancement implemented-needs-testing
Please provide configuration options to remove formatting codes from all messages sent to game and console. Alternatively, a language file, which would also provide localization opportunities. Note that PREFIX should be configurable, as well as message content. For example: [21:18:18] [Server thread/INFO]: ^[[0;30;22m[^[[0;36;1mFakeBlock^[[0;30;22m] ^[[0;31;1m'xyzzy' has been deleted^[[m prefix: cyan/aqua message: red
1.0
Feature request: Message formats and localization - Please provide configuration options to remove formatting codes from all messages sent to game and console. Alternatively, a language file, which would also provide localization opportunities. Note that PREFIX should be configurable, as well as message content. For example: [21:18:18] [Server thread/INFO]: ^[[0;30;22m[^[[0;36;1mFakeBlock^[[0;30;22m] ^[[0;31;1m'xyzzy' has been deleted^[[m prefix: cyan/aqua message: red
non_process
feature request message formats and localization please provide configuration options to remove formatting codes from all messages sent to game and console alternatively a language file which would also provide localization opportunities note that prefix should be configurable as well as message content for example xyzzy has been deleted m prefix cyan aqua message red
0
30,727
8,583,011,778
IssuesEvent
2018-11-13 18:31:50
mozilla-mobile/android-components
https://api.github.com/repos/mozilla-mobile/android-components
opened
Components with shouldPublish=false are not getting build at all anymore
🏗️ build 🤖 automation
I just noticed that components with shouldPublish=false are not getting build at all anymore. 😨 Now with `printModules` filtering those out for the release they are gone from all our decision tasks.
1.0
Components with shouldPublish=false are not getting build at all anymore - I just noticed that components with shouldPublish=false are not getting build at all anymore. 😨 Now with `printModules` filtering those out for the release they are gone from all our decision tasks.
non_process
components with shouldpublish false are not getting build at all anymore i just noticed that components with shouldpublish false are not getting build at all anymore 😨 now with printmodules filtering those out for the release they are gone from all our decision tasks
0
128,689
27,313,005,009
IssuesEvent
2023-02-24 13:42:38
sourcegraph/sourcegraph
https://api.github.com/repos/sourcegraph/sourcegraph
opened
insights: some group by insights have a live preview but no data after fill
bug team/code-insights
example queries: `file:readme \s`, `file:test \s` ``` query executor: type:commit file:readme count:99999999 repo:^(github\.com/golang/go)$ content:output.extra(\s -> $repo) query work handler: fork:yes archived:yes patterntype:literal type:commit file:readme count:99999999 repo:^(github\.com/golang/go)$ content:output.extra(\s -> $repo) ``` maybe something to do with using `\s` and `file`? just need to confirm what's happening there such that we get data on the live preview but not the backfill
1.0
insights: some group by insights have a live preview but no data after fill - example queries: `file:readme \s`, `file:test \s` ``` query executor: type:commit file:readme count:99999999 repo:^(github\.com/golang/go)$ content:output.extra(\s -> $repo) query work handler: fork:yes archived:yes patterntype:literal type:commit file:readme count:99999999 repo:^(github\.com/golang/go)$ content:output.extra(\s -> $repo) ``` maybe something to do with using `\s` and `file`? just need to confirm what's happening there such that we get data on the live preview but not the backfill
non_process
insights some group by insights have a live preview but no data after fill example queries file readme s file test s query executor type commit file readme count repo github com golang go content output extra s repo query work handler fork yes archived yes patterntype literal type commit file readme count repo github com golang go content output extra s repo maybe something to do with using s and file just need to confirm what s happening there such that we get data on the live preview but not the backfill
0
412,667
27,866,944,482
IssuesEvent
2023-03-21 10:54:37
acdh-oeaw/apis-bibsonomy
https://api.github.com/repos/acdh-oeaw/apis-bibsonomy
opened
Add Docstrings
enhancement Documentation
Most of the code currently does not include any docstrings, this should be updated.
1.0
Add Docstrings - Most of the code currently does not include any docstrings, this should be updated.
non_process
add docstrings most of the code currently does not include any docstrings this should be updated
0
326,004
9,941,409,278
IssuesEvent
2019-07-03 11:32:14
eJourn-al/eJournal
https://api.github.com/repos/eJourn-al/eJournal
opened
Speedup grading process
Priority: Medium Status: Available Type: Enhancement Workload: Medium
**Describe the solution you'd like** The grading process can be sped up. Some ideas: - On publish grade / publish grade and comment, move to the next student. - Upon approval of all entries in a journal. move to the next student journal.
1.0
Speedup grading process - **Describe the solution you'd like** The grading process can be sped up. Some ideas: - On publish grade / publish grade and comment, move to the next student. - Upon approval of all entries in a journal. move to the next student journal.
non_process
speedup grading process describe the solution you d like the grading process can be sped up some ideas on publish grade publish grade and comment move to the next student upon approval of all entries in a journal move to the next student journal
0
237,948
18,174,051,982
IssuesEvent
2021-09-27 23:35:48
teknologi-umum/bot
https://api.github.com/repos/teknologi-umum/bot
closed
prepare for hacktoberfest
documentation
- [x] create CONTRIBUTING.md - [ ] create pull request template - PR without issue would be rejected - [x] put hacktoberfest tag - [ ] put a few hacktoberfest guides on readme such as: - https://twitter.com/sudo_navendu/status/1437456596473303042 - https://www.digitalocean.com/community/tutorials/hacktoberfest-contributor-s-guide-how-to-find-and-contribute-to-open-source-projects
1.0
prepare for hacktoberfest - - [x] create CONTRIBUTING.md - [ ] create pull request template - PR without issue would be rejected - [x] put hacktoberfest tag - [ ] put a few hacktoberfest guides on readme such as: - https://twitter.com/sudo_navendu/status/1437456596473303042 - https://www.digitalocean.com/community/tutorials/hacktoberfest-contributor-s-guide-how-to-find-and-contribute-to-open-source-projects
non_process
prepare for hacktoberfest create contributing md create pull request template pr without issue would be rejected put hacktoberfest tag put a few hacktoberfest guides on readme such as
0
63,372
15,590,029,695
IssuesEvent
2021-03-18 08:51:20
appsmithorg/appsmith
https://api.github.com/repos/appsmithorg/appsmith
closed
[Bug] Table row height doesn't get saved
Bug High Needs Triaging Table Widget UI Building Pod Widgets
1. User selects a table row height. 2. The row height doesn't get saved in the deployed mode
1.0
[Bug] Table row height doesn't get saved - 1. User selects a table row height. 2. The row height doesn't get saved in the deployed mode
non_process
table row height doesn t get saved user selects a table row height the row height doesn t get saved in the deployed mode
0
2,503
5,277,587,386
IssuesEvent
2017-02-07 03:56:00
rubberduck-vba/Rubberduck
https://api.github.com/repos/rubberduck-vba/Rubberduck
closed
Name .. As ... not seen as using a var
bug feature-inspections parse-tree-processing
> Variable 's2' is not used VBAProject.Module1 ````vb Sub testRDUsed() Dim s1 As String, s2 As String s1 = "test.xls" s2 = s1 & ".bak" Name s1 As s2 End Sub ````
1.0
Name .. As ... not seen as using a var - > Variable 's2' is not used VBAProject.Module1 ````vb Sub testRDUsed() Dim s1 As String, s2 As String s1 = "test.xls" s2 = s1 & ".bak" Name s1 As s2 End Sub ````
process
name as not seen as using a var variable is not used vbaproject vb sub testrdused dim as string as string test xls bak name as end sub
1
40,211
10,473,126,456
IssuesEvent
2019-09-23 11:55:04
microsoft/appcenter
https://api.github.com/repos/microsoft/appcenter
closed
Building an iOS app with Xcode 11 fails with IDEFoundationErrorDomain Code=1 "IPA processing failed"
build
When I try to build an iOS app using Xcode 11, the build fails and the following error appears in the build output log: ``` Error Domain=IDEFoundationErrorDomain Code=1 "IPA processing failed" UserInfo={NSLocalizedDescription=IPA processing failed} ** EXPORT FAILED ** ##[error]Error: /usr/bin/xcodebuild failed with return code: 70 ``` This error wasn't present when my selected Xcode version was 10.3. What should I do in order to fix this?
1.0
Building an iOS app with Xcode 11 fails with IDEFoundationErrorDomain Code=1 "IPA processing failed" - When I try to build an iOS app using Xcode 11, the build fails and the following error appears in the build output log: ``` Error Domain=IDEFoundationErrorDomain Code=1 "IPA processing failed" UserInfo={NSLocalizedDescription=IPA processing failed} ** EXPORT FAILED ** ##[error]Error: /usr/bin/xcodebuild failed with return code: 70 ``` This error wasn't present when my selected Xcode version was 10.3. What should I do in order to fix this?
non_process
building an ios app with xcode fails with idefoundationerrordomain code ipa processing failed when i try to build an ios app using xcode the build fails and the following error appears in the build output log error domain idefoundationerrordomain code ipa processing failed userinfo nslocalizeddescription ipa processing failed export failed error usr bin xcodebuild failed with return code this error wasn t present when my selected xcode version was what should i do in order to fix this
0
3,202
6,262,550,907
IssuesEvent
2017-07-15 11:54:24
coala/teams
https://api.github.com/repos/coala/teams
closed
Docs Team Member application: Mariatta
process/approved
# Bio My name is Mariatta, and I'm extra super duper special because: - At the time of applying, I'm the only woman in coala maintainers team. - I'm on IMDb http://www.imdb.com/name/nm7641957/ Other documenting experiences: - Improving documentation for CPython, Python.org, Python Developer's Guide. # coala Contributions so far - documentation - review prs - accept prs - helped onboard newcomers and new contributors # Road to the Future - continue working on docs
1.0
Docs Team Member application: Mariatta - # Bio My name is Mariatta, and I'm extra super duper special because: - At the time of applying, I'm the only woman in coala maintainers team. - I'm on IMDb http://www.imdb.com/name/nm7641957/ Other documenting experiences: - Improving documentation for CPython, Python.org, Python Developer's Guide. # coala Contributions so far - documentation - review prs - accept prs - helped onboard newcomers and new contributors # Road to the Future - continue working on docs
process
docs team member application mariatta bio my name is mariatta and i m extra super duper special because at the time of applying i m the only woman in coala maintainers team i m on imdb other documenting experiences improving documentation for cpython python org python developer s guide coala contributions so far documentation review prs accept prs helped onboard newcomers and new contributors road to the future continue working on docs
1
144,082
19,268,775,143
IssuesEvent
2021-12-10 01:15:13
mpulsemobile/doccano
https://api.github.com/repos/mpulsemobile/doccano
opened
CVE-2021-37701 (High) detected in tar-4.4.1.tgz
security vulnerability
## CVE-2021-37701 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tar-4.4.1.tgz</b></p></summary> <p>tar for node</p> <p>Library home page: <a href="https://registry.npmjs.org/tar/-/tar-4.4.1.tgz">https://registry.npmjs.org/tar/-/tar-4.4.1.tgz</a></p> <p> Dependency Hierarchy: - webpack-dev-server-3.2.1.tgz (Root Library) - chokidar-2.0.3.tgz - fsevents-1.2.4.tgz - node-pre-gyp-0.10.0.tgz - :x: **tar-4.4.1.tgz** (Vulnerable Library) </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The npm package "tar" (aka node-tar) before versions 4.4.16, 5.0.8, and 6.1.7 has an arbitrary file creation/overwrite and arbitrary code execution vulnerability. node-tar aims to guarantee that any file whose location would be modified by a symbolic link is not extracted. This is, in part, achieved by ensuring that extracted directories are not symlinks. Additionally, in order to prevent unnecessary stat calls to determine whether a given path is a directory, paths are cached when directories are created. This logic was insufficient when extracting tar files that contained both a directory and a symlink with the same name as the directory, where the symlink and directory names in the archive entry used backslashes as a path separator on posix systems. The cache checking logic used both `\` and `/` characters as path separators, however `\` is a valid filename character on posix systems. By first creating a directory, and then replacing that directory with a symlink, it was thus possible to bypass node-tar symlink checks on directories, essentially allowing an untrusted tar file to symlink into an arbitrary location and subsequently extracting arbitrary files into that location, thus allowing arbitrary file creation and overwrite. 
Additionally, a similar confusion could arise on case-insensitive filesystems. If a tar archive contained a directory at `FOO`, followed by a symbolic link named `foo`, then on case-insensitive file systems, the creation of the symbolic link would remove the directory from the filesystem, but _not_ from the internal directory cache, as it would not be treated as a cache hit. A subsequent file entry within the `FOO` directory would then be placed in the target of the symbolic link, thinking that the directory had already been created. These issues were addressed in releases 4.4.16, 5.0.8 and 6.1.7. The v3 branch of node-tar has been deprecated and did not receive patches for these issues. If you are still using a v3 release we recommend you update to a more recent version of node-tar. If this is not possible, a workaround is available in the referenced GHSA-9r2w-394v-53qc. <p>Publish Date: 2021-08-31 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-37701>CVE-2021-37701</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.6</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/npm/node-tar/security/advisories/GHSA-9r2w-394v-53qc">https://github.com/npm/node-tar/security/advisories/GHSA-9r2w-394v-53qc</a></p> <p>Release Date: 2021-08-31</p> <p>Fix Resolution: tar - 4.4.16, 5.0.8, 6.1.7</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"tar","packageVersion":"4.4.1","packageFilePaths":[],"isTransitiveDependency":true,"dependencyTree":"webpack-dev-server:3.2.1;chokidar:2.0.3;fsevents:1.2.4;node-pre-gyp:0.10.0;tar:4.4.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"tar - 4.4.16, 5.0.8, 6.1.7","isBinary":true}],"baseBranches":[],"vulnerabilityIdentifier":"CVE-2021-37701","vulnerabilityDetails":"The npm package \"tar\" (aka node-tar) before versions 4.4.16, 5.0.8, and 6.1.7 has an arbitrary file creation/overwrite and arbitrary code execution vulnerability. node-tar aims to guarantee that any file whose location would be modified by a symbolic link is not extracted. This is, in part, achieved by ensuring that extracted directories are not symlinks. Additionally, in order to prevent unnecessary stat calls to determine whether a given path is a directory, paths are cached when directories are created. This logic was insufficient when extracting tar files that contained both a directory and a symlink with the same name as the directory, where the symlink and directory names in the archive entry used backslashes as a path separator on posix systems. The cache checking logic used both `\\` and `/` characters as path separators, however `\\` is a valid filename character on posix systems. 
By first creating a directory, and then replacing that directory with a symlink, it was thus possible to bypass node-tar symlink checks on directories, essentially allowing an untrusted tar file to symlink into an arbitrary location and subsequently extracting arbitrary files into that location, thus allowing arbitrary file creation and overwrite. Additionally, a similar confusion could arise on case-insensitive filesystems. If a tar archive contained a directory at `FOO`, followed by a symbolic link named `foo`, then on case-insensitive file systems, the creation of the symbolic link would remove the directory from the filesystem, but _not_ from the internal directory cache, as it would not be treated as a cache hit. A subsequent file entry within the `FOO` directory would then be placed in the target of the symbolic link, thinking that the directory had already been created. These issues were addressed in releases 4.4.16, 5.0.8 and 6.1.7. The v3 branch of node-tar has been deprecated and did not receive patches for these issues. If you are still using a v3 release we recommend you update to a more recent version of node-tar. If this is not possible, a workaround is available in the referenced GHSA-9r2w-394v-53qc.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-37701","cvss3Severity":"high","cvss3Score":"8.6","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Changed","C":"High","UI":"Required","AV":"Local","I":"High"},"extraData":{}}</REMEDIATE> -->
True
CVE-2021-37701 (High) detected in tar-4.4.1.tgz - ## CVE-2021-37701 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tar-4.4.1.tgz</b></p></summary> <p>tar for node</p> <p>Library home page: <a href="https://registry.npmjs.org/tar/-/tar-4.4.1.tgz">https://registry.npmjs.org/tar/-/tar-4.4.1.tgz</a></p> <p> Dependency Hierarchy: - webpack-dev-server-3.2.1.tgz (Root Library) - chokidar-2.0.3.tgz - fsevents-1.2.4.tgz - node-pre-gyp-0.10.0.tgz - :x: **tar-4.4.1.tgz** (Vulnerable Library) </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The npm package "tar" (aka node-tar) before versions 4.4.16, 5.0.8, and 6.1.7 has an arbitrary file creation/overwrite and arbitrary code execution vulnerability. node-tar aims to guarantee that any file whose location would be modified by a symbolic link is not extracted. This is, in part, achieved by ensuring that extracted directories are not symlinks. Additionally, in order to prevent unnecessary stat calls to determine whether a given path is a directory, paths are cached when directories are created. This logic was insufficient when extracting tar files that contained both a directory and a symlink with the same name as the directory, where the symlink and directory names in the archive entry used backslashes as a path separator on posix systems. The cache checking logic used both `\` and `/` characters as path separators, however `\` is a valid filename character on posix systems. 
By first creating a directory, and then replacing that directory with a symlink, it was thus possible to bypass node-tar symlink checks on directories, essentially allowing an untrusted tar file to symlink into an arbitrary location and subsequently extracting arbitrary files into that location, thus allowing arbitrary file creation and overwrite. Additionally, a similar confusion could arise on case-insensitive filesystems. If a tar archive contained a directory at `FOO`, followed by a symbolic link named `foo`, then on case-insensitive file systems, the creation of the symbolic link would remove the directory from the filesystem, but _not_ from the internal directory cache, as it would not be treated as a cache hit. A subsequent file entry within the `FOO` directory would then be placed in the target of the symbolic link, thinking that the directory had already been created. These issues were addressed in releases 4.4.16, 5.0.8 and 6.1.7. The v3 branch of node-tar has been deprecated and did not receive patches for these issues. If you are still using a v3 release we recommend you update to a more recent version of node-tar. If this is not possible, a workaround is available in the referenced GHSA-9r2w-394v-53qc. <p>Publish Date: 2021-08-31 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-37701>CVE-2021-37701</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.6</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/npm/node-tar/security/advisories/GHSA-9r2w-394v-53qc">https://github.com/npm/node-tar/security/advisories/GHSA-9r2w-394v-53qc</a></p> <p>Release Date: 2021-08-31</p> <p>Fix Resolution: tar - 4.4.16, 5.0.8, 6.1.7</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"tar","packageVersion":"4.4.1","packageFilePaths":[],"isTransitiveDependency":true,"dependencyTree":"webpack-dev-server:3.2.1;chokidar:2.0.3;fsevents:1.2.4;node-pre-gyp:0.10.0;tar:4.4.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"tar - 4.4.16, 5.0.8, 6.1.7","isBinary":true}],"baseBranches":[],"vulnerabilityIdentifier":"CVE-2021-37701","vulnerabilityDetails":"The npm package \"tar\" (aka node-tar) before versions 4.4.16, 5.0.8, and 6.1.7 has an arbitrary file creation/overwrite and arbitrary code execution vulnerability. node-tar aims to guarantee that any file whose location would be modified by a symbolic link is not extracted. This is, in part, achieved by ensuring that extracted directories are not symlinks. Additionally, in order to prevent unnecessary stat calls to determine whether a given path is a directory, paths are cached when directories are created. This logic was insufficient when extracting tar files that contained both a directory and a symlink with the same name as the directory, where the symlink and directory names in the archive entry used backslashes as a path separator on posix systems. The cache checking logic used both `\\` and `/` characters as path separators, however `\\` is a valid filename character on posix systems. 
By first creating a directory, and then replacing that directory with a symlink, it was thus possible to bypass node-tar symlink checks on directories, essentially allowing an untrusted tar file to symlink into an arbitrary location and subsequently extracting arbitrary files into that location, thus allowing arbitrary file creation and overwrite. Additionally, a similar confusion could arise on case-insensitive filesystems. If a tar archive contained a directory at `FOO`, followed by a symbolic link named `foo`, then on case-insensitive file systems, the creation of the symbolic link would remove the directory from the filesystem, but _not_ from the internal directory cache, as it would not be treated as a cache hit. A subsequent file entry within the `FOO` directory would then be placed in the target of the symbolic link, thinking that the directory had already been created. These issues were addressed in releases 4.4.16, 5.0.8 and 6.1.7. The v3 branch of node-tar has been deprecated and did not receive patches for these issues. If you are still using a v3 release we recommend you update to a more recent version of node-tar. If this is not possible, a workaround is available in the referenced GHSA-9r2w-394v-53qc.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-37701","cvss3Severity":"high","cvss3Score":"8.6","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Changed","C":"High","UI":"Required","AV":"Local","I":"High"},"extraData":{}}</REMEDIATE> -->
non_process
cve high detected in tar tgz cve high severity vulnerability vulnerable library tar tgz tar for node library home page a href dependency hierarchy webpack dev server tgz root library chokidar tgz fsevents tgz node pre gyp tgz x tar tgz vulnerable library vulnerability details the npm package tar aka node tar before versions and has an arbitrary file creation overwrite and arbitrary code execution vulnerability node tar aims to guarantee that any file whose location would be modified by a symbolic link is not extracted this is in part achieved by ensuring that extracted directories are not symlinks additionally in order to prevent unnecessary stat calls to determine whether a given path is a directory paths are cached when directories are created this logic was insufficient when extracting tar files that contained both a directory and a symlink with the same name as the directory where the symlink and directory names in the archive entry used backslashes as a path separator on posix systems the cache checking logic used both and characters as path separators however is a valid filename character on posix systems by first creating a directory and then replacing that directory with a symlink it was thus possible to bypass node tar symlink checks on directories essentially allowing an untrusted tar file to symlink into an arbitrary location and subsequently extracting arbitrary files into that location thus allowing arbitrary file creation and overwrite additionally a similar confusion could arise on case insensitive filesystems if a tar archive contained a directory at foo followed by a symbolic link named foo then on case insensitive file systems the creation of the symbolic link would remove the directory from the filesystem but not from the internal directory cache as it would not be treated as a cache hit a subsequent file entry within the foo directory would then be placed in the target of the symbolic link thinking that the directory had already been created 
these issues were addressed in releases and the branch of node tar has been deprecated and did not receive patches for these issues if you are still using a release we recommend you update to a more recent version of node tar if this is not possible a workaround is available in the referenced ghsa publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution tar isopenpronvulnerability false ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree webpack dev server chokidar fsevents node pre gyp tar isminimumfixversionavailable true minimumfixversion tar isbinary true basebranches vulnerabilityidentifier cve vulnerabilitydetails the npm package tar aka node tar before versions and has an arbitrary file creation overwrite and arbitrary code execution vulnerability node tar aims to guarantee that any file whose location would be modified by a symbolic link is not extracted this is in part achieved by ensuring that extracted directories are not symlinks additionally in order to prevent unnecessary stat calls to determine whether a given path is a directory paths are cached when directories are created this logic was insufficient when extracting tar files that contained both a directory and a symlink with the same name as the directory where the symlink and directory names in the archive entry used backslashes as a path separator on posix systems the cache checking logic used both and characters as path separators however is a valid filename character on posix systems by first creating a directory and then replacing that directory with a symlink it was thus possible to bypass node tar symlink 
checks on directories essentially allowing an untrusted tar file to symlink into an arbitrary location and subsequently extracting arbitrary files into that location thus allowing arbitrary file creation and overwrite additionally a similar confusion could arise on case insensitive filesystems if a tar archive contained a directory at foo followed by a symbolic link named foo then on case insensitive file systems the creation of the symbolic link would remove the directory from the filesystem but not from the internal directory cache as it would not be treated as a cache hit a subsequent file entry within the foo directory would then be placed in the target of the symbolic link thinking that the directory had already been created these issues were addressed in releases and the branch of node tar has been deprecated and did not receive patches for these issues if you are still using a release we recommend you update to a more recent version of node tar if this is not possible a workaround is available in the referenced ghsa vulnerabilityurl
0
4,317
7,204,049,429
IssuesEvent
2018-02-06 11:15:18
decidim/decidim
https://api.github.com/repos/decidim/decidim
closed
When changing process config instead of updating a new process is created.
lot: core space: processes type: bug
<!-- 1. Please check if an issue already exists so there are no duplicates 2. Fill out the whole template so we have a good overview on the issue 3. Do not remove any section of the template. If something is not applicable leave it empty but leave it in the Issue 4. Please follow the template, otherwise we'll have to ask you to update it --> # This is a Bug Report When changing a Process's #### :tophat: Description For bug reports: * What went wrong? When changing a Process's general information content (probably when changing the slug), instead of changing the data Decidim creates a new process, maintaining the old one. * What did you expect should have happened? Either: a) the process is updated and/or b) if there are fields that cannot really be changed, it should be impossible to modify those fields. * What was the config you used? Decidim v.0.9 at meta.decidim * ***Decidim deployment where you found the issue***: meta.decidim * ***URL to reproduce the error***: enter admin panel and try changing a process's general information
1.0
When changing process config instead of updating a new process is created. - <!-- 1. Please check if an issue already exists so there are no duplicates 2. Fill out the whole template so we have a good overview on the issue 3. Do not remove any section of the template. If something is not applicable leave it empty but leave it in the Issue 4. Please follow the template, otherwise we'll have to ask you to update it --> # This is a Bug Report When changing a Process's #### :tophat: Description For bug reports: * What went wrong? When changing a Process's general information content (probably when changing the slug), instead of changing the data Decidim creates a new process, maintaining the old one. * What did you expect should have happened? Either: a) the process is updated and/or b) if there are fields that cannot really be changed, it should be impossible to modify those fields. * What was the config you used? Decidim v.0.9 at meta.decidim * ***Decidim deployment where you found the issue***: meta.decidim * ***URL to reproduce the error***: enter admin panel and try changing a process's general information
process
when changing process config instead of updating a new process is created please check if an issue already exists so there are no duplicates fill out the whole template so we have a good overview on the issue do not remove any section of the template if something is not applicable leave it empty but leave it in the issue please follow the template otherwise we ll have to ask you to update it this is a bug report when changing a process s tophat description for bug reports what went wrong when changing a process s general information content probably when changing the slug instead of changing the data decidim creates a new process maintaining the old one what did you expect should have happened either a the process is updated and or b if there are fields that cannot really be changed it should be impossible to modify those fields what was the config you used decidim v at meta decidim decidim deployment where you found the issue meta decidim url to reproduce the error enter admin panel and try changing a process s general information
1
18,729
24,622,733,821
IssuesEvent
2022-10-16 05:27:50
pyanodon/pybugreports
https://api.github.com/repos/pyanodon/pybugreports
closed
Dependency Loop detected with Py post processing and Clusterio
mod:pypostprocessing postprocess-fail compatibility
### Mod source Factorio Mod Portal ### Which mod are you having an issue with? - [ ] pyalienlife - [ ] pyalternativeenergy - [ ] pycoalprocessing - [ ] pyfusionenergy - [ ] pyhightech - [ ] pyindustry - [ ] pypetroleumhandling - [X] pypostprocessing - [ ] pyrawores ### Operating system >=Windows 10 ### What kind of issue is this? - [ ] Compatibility - [ ] Locale (names, descriptions, unknown keys) - [ ] Graphical - [X] Crash - [ ] Progression - [ ] Balance - [ ] Pypostprocessing failure - [ ] Other ### What is the problem? Mod compatibility crash between py mods and clusterio. ### Steps to reproduce 1. Add all the Py mods. 2. Add clusterio lib+sub space storage mod 3. crash at py post processing ### Additional context ![image](https://user-images.githubusercontent.com/8072869/196013087-0a22ac30-733f-4630-88b9-ac49fcee9e62.png) It doesn't seem it's just this mod. There seem to be several other mods that cause this dependency loop. I removed clusterio and added a bunch of other mods. similar issue. ### Log file _No response_
2.0
Dependency Loop detected with Py post processing and Clusterio - ### Mod source Factorio Mod Portal ### Which mod are you having an issue with? - [ ] pyalienlife - [ ] pyalternativeenergy - [ ] pycoalprocessing - [ ] pyfusionenergy - [ ] pyhightech - [ ] pyindustry - [ ] pypetroleumhandling - [X] pypostprocessing - [ ] pyrawores ### Operating system >=Windows 10 ### What kind of issue is this? - [ ] Compatibility - [ ] Locale (names, descriptions, unknown keys) - [ ] Graphical - [X] Crash - [ ] Progression - [ ] Balance - [ ] Pypostprocessing failure - [ ] Other ### What is the problem? Mod compatibility crash between py mods and clusterio. ### Steps to reproduce 1. Add all the Py mods. 2. Add clusterio lib+sub space storage mod 3. crash at py post processing ### Additional context ![image](https://user-images.githubusercontent.com/8072869/196013087-0a22ac30-733f-4630-88b9-ac49fcee9e62.png) It doesn't seem it's just this mod. There seem to be several other mods that cause this dependency loop. I removed clusterio and added a bunch of other mods. similar issue. ### Log file _No response_
process
dependency loop detected with py post processing and clusterio mod source factorio mod portal which mod are you having an issue with pyalienlife pyalternativeenergy pycoalprocessing pyfusionenergy pyhightech pyindustry pypetroleumhandling pypostprocessing pyrawores operating system windows what kind of issue is this compatibility locale names descriptions unknown keys graphical crash progression balance pypostprocessing failure other what is the problem mod compatibility crash between py mods and clusterio steps to reproduce add all the py mods add clusterio lib sub space storage mod crash at py post processing additional context it doesn t seem it s just this mod there seem to be several other mods that cause this dependency loop i removed clusterio and added a bunch of other mods similar issue log file no response
1
166,389
20,718,484,453
IssuesEvent
2022-03-13 01:53:23
michaeldotson/babys-first-rails
https://api.github.com/repos/michaeldotson/babys-first-rails
opened
CVE-2022-21831 (High) detected in activestorage-5.2.2.gem
security vulnerability
## CVE-2022-21831 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>activestorage-5.2.2.gem</b></p></summary> <p>Attach cloud and local files in Rails applications.</p> <p>Library home page: <a href="https://rubygems.org/gems/activestorage-5.2.2.gem">https://rubygems.org/gems/activestorage-5.2.2.gem</a></p> <p>Path to dependency file: /babys-first-rails/Gemfile.lock</p> <p>Path to vulnerable library: /var/lib/gems/2.3.0/cache/activestorage-5.2.2.gem</p> <p> Dependency Hierarchy: - rails-5.2.2.gem (Root Library) - :x: **activestorage-5.2.2.gem** (Vulnerable Library) </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Activestorage versions prior to 7.0.2.3, 6.1.4.7, 6.0.4.7, 5.2.6.3 are vulnerable to code injection. This vulnerability impacts applications that use Active Storage with the image_processing processing in addition to the mini_magick back end for image_processing.. <p>Publish Date: 2021-12-11 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-21831>CVE-2022-21831</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/advisories/GHSA-w749-p3v6-hccq">https://github.com/advisories/GHSA-w749-p3v6-hccq</a></p> <p>Release Date: 2021-12-11</p> <p>Fix Resolution: activestorage - 5.2.6.3,6.0.4.7,6.1.4.7,7.0.2.3</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2022-21831 (High) detected in activestorage-5.2.2.gem - ## CVE-2022-21831 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>activestorage-5.2.2.gem</b></p></summary> <p>Attach cloud and local files in Rails applications.</p> <p>Library home page: <a href="https://rubygems.org/gems/activestorage-5.2.2.gem">https://rubygems.org/gems/activestorage-5.2.2.gem</a></p> <p>Path to dependency file: /babys-first-rails/Gemfile.lock</p> <p>Path to vulnerable library: /var/lib/gems/2.3.0/cache/activestorage-5.2.2.gem</p> <p> Dependency Hierarchy: - rails-5.2.2.gem (Root Library) - :x: **activestorage-5.2.2.gem** (Vulnerable Library) </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Activestorage versions prior to 7.0.2.3, 6.1.4.7, 6.0.4.7, 5.2.6.3 are vulnerable to code injection. This vulnerability impacts applications that use Active Storage with the image_processing processing in addition to the mini_magick back end for image_processing.. <p>Publish Date: 2021-12-11 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-21831>CVE-2022-21831</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/advisories/GHSA-w749-p3v6-hccq">https://github.com/advisories/GHSA-w749-p3v6-hccq</a></p> <p>Release Date: 2021-12-11</p> <p>Fix Resolution: activestorage - 5.2.6.3,6.0.4.7,6.1.4.7,7.0.2.3</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve high detected in activestorage gem cve high severity vulnerability vulnerable library activestorage gem attach cloud and local files in rails applications library home page a href path to dependency file babys first rails gemfile lock path to vulnerable library var lib gems cache activestorage gem dependency hierarchy rails gem root library x activestorage gem vulnerable library vulnerability details activestorage versions prior to are vulnerable to code injection this vulnerability impacts applications that use active storage with the image processing processing in addition to the mini magick back end for image processing publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution activestorage step up your open source security game with whitesource
0
325,385
27,873,276,646
IssuesEvent
2023-03-21 14:40:28
wazuh/wazuh-qa
https://api.github.com/repos/wazuh/wazuh-qa
closed
Missing resources when retrieving the ruleset active configuration
team/qa target/4.5.0 dev-testing subteam/qa-main level/task type/test
| Target version | Related issue | Related PR | |--------------------|--------------------|-----------------| |4.5.0|https://github.com/wazuh/wazuh/issues/15996|https://github.com/wazuh/wazuh/pull/16000| <!-- Important: No section may be left blank. If not, delete it directly (in principle only Steps to reproduce could be left blank in case of not proceeding, although there are always exceptions). --> ## Description This PR aims to fix a bug in Analysisd that made it report an incomplete set of rules and decoders. ## Proposed checks <!-- Indicate through a list of checkboxes the suggested checks to be carried out by the QA tester --> - [x] Check that missing rules (https://github.com/wazuh/wazuh/issues/15725) are now reported by the API. - [x] Check that missing decoders (https://github.com/wazuh/wazuh/issues/15716) are now reported by the API. - [x] Check that all rules in XML files are reported by the API (check the functional test at https://github.com/wazuh/wazuh/pull/16000) ## Steps to reproduce Run the API endpoints `/manager/configuration/analysis/decoders` and `/manager/configuration/analysis/rules`, and check the result. ## Expected results The API is expected to report all rules and decoders defined in the XML files. However, child decoders are renamed by their parents' names. *** ### Status - [x] In progress (@juliamagan) - [x] Pending Review - [x] QA manager approved (@jmv74211) - [x] Development team leader approved (@vikman90)
2.0
Missing resources when retrieving the ruleset active configuration - | Target version | Related issue | Related PR | |--------------------|--------------------|-----------------| |4.5.0|https://github.com/wazuh/wazuh/issues/15996|https://github.com/wazuh/wazuh/pull/16000| <!-- Important: No section may be left blank. If not, delete it directly (in principle only Steps to reproduce could be left blank in case of not proceeding, although there are always exceptions). --> ## Description This PR aims to fix a bug in Analysisd that made it report an incomplete set of rules and decoders. ## Proposed checks <!-- Indicate through a list of checkboxes the suggested checks to be carried out by the QA tester --> - [x] Check that missing rules (https://github.com/wazuh/wazuh/issues/15725) are now reported by the API. - [x] Check that missing decoders (https://github.com/wazuh/wazuh/issues/15716) are now reported by the API. - [x] Check that all rules in XML files are reported by the API (check the functional test at https://github.com/wazuh/wazuh/pull/16000) ## Steps to reproduce Run the API endpoints `/manager/configuration/analysis/decoders` and `/manager/configuration/analysis/rules`, and check the result. ## Expected results The API is expected to report all rules and decoders defined in the XML files. However, child decoders are renamed by their parents' names. *** ### Status - [x] In progress (@juliamagan) - [x] Pending Review - [x] QA manager approved (@jmv74211) - [x] Development team leader approved (@vikman90)
non_process
missing resources when retrieving the ruleset active configuration target version related issue related pr description this pr aims to fix a bug in analysisd that made it report an incomplete set of rules and decoders proposed checks check that missing rules are now reported by the api check that missing decoders are now reported by the api check that all rules in xml files are reported by the api check the functional test at steps to reproduce run the api endpoints manager configuration analysis decoders and manager configuration analysis rules and check the result expected results the api is expected to report all rules and decoders defined in the xml files however child decoders are renamed by their parents names status in progress juliamagan pending review qa manager approved development team leader approved
0
652,265
21,526,976,628
IssuesEvent
2022-04-28 19:30:29
KA-Huis/space-management
https://api.github.com/repos/KA-Huis/space-management
opened
Create policy for Group model
Task Priority: High
## Description Some actions are only permitted for certain users. In order to make sure unauthorised users cannot perform restricted actions we need to analyse the policy business rules for the `Group` model. ## Technical tasks - [ ] Create policy class - [ ] Create unit test that asserts the current happy flows ## Automated testing Unit tests are the best option to validate the workings of all the policy actions. They will make sure all actions are correctly according to the business requirements.
1.0
Create policy for Group model - ## Description Some actions are only permitted for certain users. In order to make sure unauthorised users cannot perform restricted actions we need to analyse the policy business rules for the `Group` model. ## Technical tasks - [ ] Create policy class - [ ] Create unit test that asserts the current happy flows ## Automated testing Unit tests are the best option to validate the workings of all the policy actions. They will make sure all actions are correctly according to the business requirements.
non_process
create policy for group model description some actions are only permitted for certain users in order to make sure unauthorised users cannot perform restricted actions we need to analyse the policy business rules for the group model technical tasks create policy class create unit test that asserts the current happy flows automated testing unit tests are the best option to validate the workings of all the policy actions they will make sure all actions are correctly according to the business requirements
0
9,510
12,497,536,626
IssuesEvent
2020-06-01 16:39:41
MicrosoftDocs/azure-devops-docs
https://api.github.com/repos/MicrosoftDocs/azure-devops-docs
closed
DependsOn for steps
Pri1 devops-cicd-process/tech devops/prod product-question
Is it possible to add dependencies to steps? For example I want to run publish .pdb only if build step was succeeded. They both in same job. Or for example I want to publish .sql files only if any of them exists in specific folder. All what I found now is "in(variables['Agent.JobStatus'], 'Succeeded', 'SucceededWithIssues')" that refer to whole JobStatus. And continueOnError: true - that warn me but not allow to refer to it. --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: 77c58a78-a567-e99a-9eb7-62dddd1b90b6 * Version Independent ID: 680a79bc-11de-39fc-43e3-e07dc762db18 * Content: [Expressions - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/expressions?view=azure-devops#dependencies) * Content Source: [docs/pipelines/process/expressions.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/master/docs/pipelines/process/expressions.md) * Product: **devops** * Technology: **devops-cicd-process** * GitHub Login: @juliakm * Microsoft Alias: **jukullam**
1.0
DependsOn for steps - Is it possible to add dependencies to steps? For example I want to run publish .pdb only if build step was succeeded. They both in same job. Or for example I want to publish .sql files only if any of them exists in specific folder. All what I found now is "in(variables['Agent.JobStatus'], 'Succeeded', 'SucceededWithIssues')" that refer to whole JobStatus. And continueOnError: true - that warn me but not allow to refer to it. --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: 77c58a78-a567-e99a-9eb7-62dddd1b90b6 * Version Independent ID: 680a79bc-11de-39fc-43e3-e07dc762db18 * Content: [Expressions - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/expressions?view=azure-devops#dependencies) * Content Source: [docs/pipelines/process/expressions.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/master/docs/pipelines/process/expressions.md) * Product: **devops** * Technology: **devops-cicd-process** * GitHub Login: @juliakm * Microsoft Alias: **jukullam**
process
dependson for steps is it possible to add dependencies to steps for example i want to run publish pdb only if build step was succeeded they both in same job or for example i want to publish sql files only if any of them exists in specific folder all what i found now is in variables succeeded succeededwithissues that refer to whole jobstatus and continueonerror true that warn me but not allow to refer to it document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source product devops technology devops cicd process github login juliakm microsoft alias jukullam
1
62,767
8,639,607,794
IssuesEvent
2018-11-23 20:11:15
codebuddies/codebuddies
https://api.github.com/repos/codebuddies/codebuddies
closed
Encourage using `git cz` (commitizen) by telling contributors about it in docs.codebuddies.org
backlog claimed documentation
The staging branch now supports commitizen: http://commitizen.github.io/cz-cli/ This means that after `git add`, instead of `git commit`, I can type `git cz` and be walked through some prompts that'll package a better looking commit message. Proposal: 1) Update docs.codebuddies.org (github.com/codebuddiesdotorg/documentation) to encourage users to use `git cz` 2) Make an announcement about it in the #cb-code channel on Slack We obviously won't reject PRs if they don't follow cz, but `git cz` can help improve the quality of our commit messages.
1.0
Encourage using `git cz` (commitizen) by telling contributors about it in docs.codebuddies.org - The staging branch now supports commitizen: http://commitizen.github.io/cz-cli/ This means that after `git add`, instead of `git commit`, I can type `git cz` and be walked through some prompts that'll package a better looking commit message. Proposal: 1) Update docs.codebuddies.org (github.com/codebuddiesdotorg/documentation) to encourage users to use `git cz` 2) Make an announcement about it in the #cb-code channel on Slack We obviously won't reject PRs if they don't follow cz, but `git cz` can help improve the quality of our commit messages.
non_process
encourage using git cz commitizen by telling contributors about it in docs codebuddies org the staging branch now supports commitizen this means that after git add instead of git commit i can type git cz and be walked through some prompts that ll package a better looking commit message proposal update docs codebuddies org github com codebuddiesdotorg documentation to encourage users to use git cz make an announcement about it in the cb code channel on slack we obviously won t reject prs if they don t follow cz but git cz can help improve the quality of our commit messages
0
178,989
13,807,253,682
IssuesEvent
2020-10-11 21:14:13
OllisGit/OctoPrint-SpoolManager
https://api.github.com/repos/OllisGit/OctoPrint-SpoolManager
closed
Feature Request: Select columns to be shown in overview
status: markedForAutoClose status: waitingForTestFeedback type: enhancement
Requesting a settings feature, to select the columns which should be shown in the overview. In my case I would like to hide the Note column and instead want to show the temperature column, because this value would be much more interesting for me.
1.0
Feature Request: Select columns to be shown in overview - Requesting a settings feature, to select the columns which should be shown in the overview. In my case I would like to hide the Note column and instead want to show the temperature column, because this value would be much more interesting for me.
non_process
feature request select columns to be shown in overview requesting a settings feature to select the columns which should be shown in the overview in my case i would like to hide the note column and instead want to show the temperature column because this value would be much more interesting for me
0
13,062
15,395,003,923
IssuesEvent
2021-03-03 18:38:33
kubernetes/minikube
https://api.github.com/repos/kubernetes/minikube
closed
minikube v1.18.0 version dirty
kind/bug kind/process priority/important-soon
this is the script that runs the release https://github.com/medyagh/minikube/blob/ec61815d60f66a6e4f6353030a40b12362557caa/hack/jenkins/release_build_and_upload.sh#L46 ``` $ minikube version minikube version: v1.18.0 commit: ec61815d60f66a6e4f6353030a40b12362557caa-dirty ```
1.0
minikube v1.18.0 version dirty - this is the script that runs the release https://github.com/medyagh/minikube/blob/ec61815d60f66a6e4f6353030a40b12362557caa/hack/jenkins/release_build_and_upload.sh#L46 ``` $ minikube version minikube version: v1.18.0 commit: ec61815d60f66a6e4f6353030a40b12362557caa-dirty ```
process
minikube version dirty this is the script that runs the release minikube version minikube version commit dirty
1
119
2,550,838,502
IssuesEvent
2015-02-01 23:44:27
dalehenrich/filetree
https://api.github.com/repos/dalehenrich/filetree
closed
FileUrl deprecated in Pharo3.0 ... finally removed in Pharo4.0
in process
FileTreeUrl subclasses FileUrl ... presumably need to convert to use ZnUrl
1.0
FileUrl deprecated in Pharo3.0 ... finally removed in Pharo4.0 - FileTreeUrl subclasses FileUrl ... presumably need to convert to use ZnUrl
process
fileurl deprecated in finally removed in filetreeurl subclasses fileurl presumably need to convert to use znurl
1
12,628
15,016,021,288
IssuesEvent
2021-02-01 09:03:34
GoogleCloudPlatform/fda-mystudies
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
closed
[PM] User details page > Search bar > 'No records found' text should be displayed when user searches with invalid input
Bug P2 Participant manager Process: Fixed Process: Tested QA Process: Tested dev
AR : User details page > Search bar > Empty section is displayed when a user searches with invalid input ER : User details page > Search bar > 'No records found' text should be displayed when a user searches with invalid input [Note : Issue should be fixed for Add new admin, Admin details page, and Edit admin details page] ![search21](https://user-images.githubusercontent.com/71445210/105741143-fa409380-5f5f-11eb-9cac-c4630a9a5d8d.png)
3.0
[PM] User details page > Search bar > 'No records found' text should be displayed when user searches with invalid input - AR : User details page > Search bar > Empty section is displayed when a user searches with invalid input ER : User details page > Search bar > 'No records found' text should be displayed when a user searches with invalid input [Note : Issue should be fixed for Add new admin, Admin details page, and Edit admin details page] ![search21](https://user-images.githubusercontent.com/71445210/105741143-fa409380-5f5f-11eb-9cac-c4630a9a5d8d.png)
process
user details page search bar no records found text should be displayed when user searches with invalid input ar user details page search bar empty section is displayed when a user searches with invalid input er user details page search bar no records found text should be displayed when a user searches with invalid input
1
109,642
23,803,435,410
IssuesEvent
2022-09-03 17:05:09
DS-13-Dev-Team/DS13
https://api.github.com/repos/DS-13-Dev-Team/DS13
closed
leapers seem to get stunned at the end of a leap, regardless of whether they hit someone or not.
Bug Type: Code
#### Description of issue Leapers always get stunned at the end of the leap. Whilst probably intentional for a miss, hitting a player doesn't change this #### Difference between expected and actual behavior reward goof leaps #### Steps to reproduce be leaper leap at a person
1.0
leapers seem to get stunned at the end of a leap, regardless of whether they hit someone or not. - #### Description of issue Leapers always get stunned at the end of the leap. Whilst probably intentional for a miss, hitting a player doesn't change this #### Difference between expected and actual behavior reward goof leaps #### Steps to reproduce be leaper leap at a person
non_process
leapers seem to get stunned at the end of a leap regardless of whether they hit someone or not description of issue leapers always get stunned at the end of the leap whilst probably intentional for a miss hitting a player doesn t change this difference between expected and actual behavior reward goof leaps steps to reproduce be leaper leap at a person
0
77
2,528,591,011
IssuesEvent
2015-01-22 05:21:37
sysown/proxysql-0.2
https://api.github.com/repos/sysown/proxysql-0.2
opened
Print "DEBUG" in print_version() if compiled with DEBUG option
ADMIN AUTHENTICATION CACHE CONNECTION POOL DEBUG development GLOBAL QUERY PROCESSOR
This applies to all modules. If the module is compiled with DEBUG , this needs to be clear in print_version() . This applies to the current implemented modules: - MySQL Thread Handler - ProxySQL Admin - MySQL Authentication - Query Processor - Query Cache There should be a compatibility between ProxySQL (main process) and modules with regards to DEBUG
1.0
Print "DEBUG" in print_version() if compiled with DEBUG option - This applies to all modules. If the module is compiled with DEBUG , this needs to be clear in print_version() . This applies to the current implemented modules: - MySQL Thread Handler - ProxySQL Admin - MySQL Authentication - Query Processor - Query Cache There should be a compatibility between ProxySQL (main process) and modules with regards to DEBUG
process
print debug in print version if compiled with debug option this applies to all modules if the module is compiled with debug this needs to be clear in print version this applies to the current implemented modules mysql thread handler proxysql admin mysql authentication query processor query cache there should be a compatibility between proxysql main process and modules with regards to debug
1
54,128
11,196,359,585
IssuesEvent
2020-01-03 09:53:23
azl397985856/leetcode
https://api.github.com/repos/azl397985856/leetcode
closed
[Daily Question] - 2019-08-26 - 1122. Relative Sort Array
Daily Question Easy LeetCode stale
Given two arrays, arr1 and arr2: the elements of arr2 are distinct, and every element of arr2 also appears in arr1. Sort the elements of arr1 so that the relative ordering of items in arr1 is the same as in arr2. Elements that do not appear in arr2 should be placed at the end of arr1 in ascending order.   Example: Input: arr1 = [2,3,1,3,2,4,6,7,9,2,19], arr2 = [2,1,4,3,9,6] Output: [2,2,2,1,4,3,3,9,6,7,19]   Constraints: arr1.length, arr2.length <= 1000 0 <= arr1[i], arr2[i] <= 1000 The elements arr2[i] of arr2 are distinct Every element arr2[i] of arr2 appears in arr1 Source: LeetCode (力扣) Link: https://leetcode-cn.com/problems/relative-sort-array Copyright belongs to LeetCode. Commercial reprints require official authorization; non-commercial reprints must credit the source.
1.0
[Daily Question] - 2019-08-26 - 1122. Relative Sort Array - Given two arrays, arr1 and arr2: the elements of arr2 are distinct, and every element of arr2 also appears in arr1. Sort the elements of arr1 so that the relative ordering of items in arr1 is the same as in arr2. Elements that do not appear in arr2 should be placed at the end of arr1 in ascending order.   Example: Input: arr1 = [2,3,1,3,2,4,6,7,9,2,19], arr2 = [2,1,4,3,9,6] Output: [2,2,2,1,4,3,3,9,6,7,19]   Constraints: arr1.length, arr2.length <= 1000 0 <= arr1[i], arr2[i] <= 1000 The elements arr2[i] of arr2 are distinct Every element arr2[i] of arr2 appears in arr1 Source: LeetCode (力扣) Link: https://leetcode-cn.com/problems/relative-sort-array Copyright belongs to LeetCode. Commercial reprints require official authorization; non-commercial reprints must credit the source.
label: non_process
daily question relative sort array you are given two arrays and the elements of are distinct every element of appears in sort the elements of so that the relative order of items matches the relative order in elements that do not appear in should be placed at the end in ascending order example input output constraints length length the elements of are distinct every element of appears in source leetcode link copyright belongs to leetcode network for commercial reprints please contact the official authorization for non commercial reprints please cite the source
binary_label: 0
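The record above is LeetCode 1122 (Relative Sort Array). As a hedged illustration — not part of the dataset — the task it describes can be sketched in a few lines of Python:

```python
from collections import Counter

def relative_sort(arr1, arr2):
    # Emit arr2's elements in order, each repeated as many times as it
    # occurs in arr1, then append the leftover elements of arr1 in
    # ascending order.
    counts = Counter(arr1)
    ordered = []
    for x in arr2:
        ordered.extend([x] * counts.pop(x))
    ordered.extend(sorted(counts.elements()))
    return ordered
```

On the example from the record, `relative_sort([2,3,1,3,2,4,6,7,9,2,19], [2,1,4,3,9,6])` returns `[2,2,2,1,4,3,3,9,6,7,19]`, matching the stated output.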
Unnamed: 0: 3395
id: 6516771590
type: IssuesEvent
created_at: 2017-08-27 14:11:30
repo: P0cL4bs/WiFi-Pumpkin
repo_url: https://api.github.com/repos/P0cL4bs/WiFi-Pumpkin
action: closed
title: Pumpking-proxy + [all plugins inject page ]
labels: enhancement in process priority solved
## What's the problem (or question)? All works fine accept that the pumpkin-proxy is not working. I cannot see logs and pictures. Traffic seems to go fine from the client side perspective. My plan is to inject the beef hook, but it's just not working. Also I can see a lot of typo's in the ini files. Especially regarding the iptables stuff. I manual changed these and even applied it outside the app, but it did not help anyway. iptables -t nat -F iptables -F iptables --policy INPUT ACCEPT iptables --policy FORWARD ACCEPT iptables --policy OUTPUT ACCEPT iptables -A FORWARD -i eth0 --out-interface wlan1 -j ACCEPT iptables -A INPUT --in-interface wlan1 -j ACCEPT iptables -t nat -A PREROUTING -p tcp --destination-port 80 -j REDIRECT --to-port 8080 iptables -t nat -A POSTROUTING --out-interface eth0 -j MASQUERADE iptables -A OUTPUT --out-interface eth0 -j ACCEPT If I configure the WLAN AP-IP (ex. 10.0.0.1) as the proxy server on the client side, I receive an error on the page regarding : HttpExepction('Invalid HTTP request........(expected: relative, got: absolute) I assume the proxy is running on my AP. I can see the port is also running on 8080. Have you experienced this before? BR, Thierry #### Please tell us details about your environment. * Card wireless adapters name (please check if support AP/mode): Alfa 036H * Version used tool: 0.8.5 * Virtual Machine (yes or no and which): Live machine * Operating System and version: Kali 2.0 rolling with latest updates
index: 1.0
Pumpking-proxy + [all plugins inject page ] - ## What's the problem (or question)? All works fine accept that the pumpkin-proxy is not working. I cannot see logs and pictures. Traffic seems to go fine from the client side perspective. My plan is to inject the beef hook, but it's just not working. Also I can see a lot of typo's in the ini files. Especially regarding the iptables stuff. I manual changed these and even applied it outside the app, but it did not help anyway. iptables -t nat -F iptables -F iptables --policy INPUT ACCEPT iptables --policy FORWARD ACCEPT iptables --policy OUTPUT ACCEPT iptables -A FORWARD -i eth0 --out-interface wlan1 -j ACCEPT iptables -A INPUT --in-interface wlan1 -j ACCEPT iptables -t nat -A PREROUTING -p tcp --destination-port 80 -j REDIRECT --to-port 8080 iptables -t nat -A POSTROUTING --out-interface eth0 -j MASQUERADE iptables -A OUTPUT --out-interface eth0 -j ACCEPT If I configure the WLAN AP-IP (ex. 10.0.0.1) as the proxy server on the client side, I receive an error on the page regarding : HttpExepction('Invalid HTTP request........(expected: relative, got: absolute) I assume the proxy is running on my AP. I can see the port is also running on 8080. Have you experienced this before? BR, Thierry #### Please tell us details about your environment. * Card wireless adapters name (please check if support AP/mode): Alfa 036H * Version used tool: 0.8.5 * Virtual Machine (yes or no and which): Live machine * Operating System and version: Kali 2.0 rolling with latest updates
label: process
pumpking proxy what s the problem or question all works fine accept that the pumpkin proxy is not working i cannot see logs and pictures traffic seems to go fine from the client side perspective my plan is to inject the beef hook but it s just not working also i can see a lot of typo s in the ini files especially regarding the iptables stuff i manual changed these and even applied it outside the app but it did not help anyway iptables t nat f iptables f iptables policy input accept iptables policy forward accept iptables policy output accept iptables a forward i out interface j accept iptables a input in interface j accept iptables t nat a prerouting p tcp destination port j redirect to port iptables t nat a postrouting out interface j masquerade iptables a output out interface j accept if i configure the wlan ap ip ex as the proxy server on the client side i receive an error on the page regarding httpexepction invalid http request expected relative got absolute i assume the proxy is running on my ap i can see the port is also running on have you experienced this before br thierry please tell us details about your environment card wireless adapters name please check if support ap mode alfa version used tool virtual machine yes or no and which live machine operating system and version kali rolling with latest updates
binary_label: 1
Unnamed: 0: 147419
id: 19522792339
type: IssuesEvent
created_at: 2021-12-29 22:21:56
repo: swagger-api/swagger-codegen
repo_url: https://api.github.com/repos/swagger-api/swagger-codegen
action: opened
title: WS-2018-0125 (Medium) detected in jackson-core-2.4.5.jar, jackson-core-2.6.4.jar
labels: security vulnerability
## WS-2018-0125 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>jackson-core-2.4.5.jar</b>, <b>jackson-core-2.6.4.jar</b></p></summary> <p> <details><summary><b>jackson-core-2.4.5.jar</b></p></summary> <p>Core Jackson abstractions, basic JSON streaming API implementation</p> <p>Library home page: <a href="https://github.com/FasterXML/jackson">https://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: /samples/client/petstore/scala/build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-core/2.4.5/6fb96728ee26edb19fe329d94f3bd4df1a97652a/jackson-core-2.4.5.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-core/2.4.5/6fb96728ee26edb19fe329d94f3bd4df1a97652a/jackson-core-2.4.5.jar</p> <p> Dependency Hierarchy: - swagger-core-1.5.8.jar (Root Library) - jackson-dataformat-yaml-2.4.5.jar - :x: **jackson-core-2.4.5.jar** (Vulnerable Library) </details> <details><summary><b>jackson-core-2.6.4.jar</b></p></summary> <p>Core Jackson abstractions, basic JSON streaming API implementation</p> <p>Library home page: <a href="https://github.com/FasterXML/jackson-core">https://github.com/FasterXML/jackson-core</a></p> <p>Path to dependency file: /samples/client/petstore/java/jersey1/build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-core/2.6.4/27d3a9f7bbdcf72d93c9b2da7017e39551bfa9fb/jackson-core-2.6.4.jar,/aches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-core/2.6.4/27d3a9f7bbdcf72d93c9b2da7017e39551bfa9fb/jackson-core-2.6.4.jar</p> <p> Dependency Hierarchy: - jackson-databind-2.6.4.jar (Root Library) - :x: **jackson-core-2.6.4.jar** (Vulnerable Library) </details> <p>Found in HEAD commit: <a 
href="https://github.com/swagger-api/swagger-codegen/commit/4b7a8d7d7384aa6a27d6309c35ade0916edae7ed">4b7a8d7d7384aa6a27d6309c35ade0916edae7ed</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> OutOfMemoryError when writing BigDecimal In Jackson Core before version 2.7.7. When enabled the WRITE_BIGDECIMAL_AS_PLAIN setting, Jackson will attempt to write out the whole number, no matter how large the exponent. <p>Publish Date: 2016-08-25 <p>URL: <a href=https://github.com/FasterXML/jackson-core/issues/315>WS-2018-0125</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: Low </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/FasterXML/jackson-core/releases/tag/jackson-core-2.7.7">https://github.com/FasterXML/jackson-core/releases/tag/jackson-core-2.7.7</a></p> <p>Release Date: 2016-08-25</p> <p>Fix Resolution: com.fasterxml.jackson.core:jackson-core:2.7.7</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-core","packageVersion":"2.4.5","packageFilePaths":["/samples/client/petstore/scala/build.gradle"],"isTransitiveDependency":true,"dependencyTree":"io.swagger:swagger-core:1.5.8;com.fasterxml.jackson.dataformat:jackson-dataformat-yaml:2.4.5;com.fasterxml.jackson.core:jackson-core:2.4.5","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-core:2.7.7","isBinary":false},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-core","packageVersion":"2.6.4","packageFilePaths":["/samples/client/petstore/java/jersey1/build.gradle"],"isTransitiveDependency":true,"dependencyTree":"com.fasterxml.jackson.core:jackson-databind:2.6.4;com.fasterxml.jackson.core:jackson-core:2.6.4","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-core:2.7.7","isBinary":false}],"baseBranches":["master"],"vulnerabilityIdentifier":"WS-2018-0125","vulnerabilityDetails":"OutOfMemoryError when writing BigDecimal In Jackson Core before version 2.7.7.\nWhen enabled the WRITE_BIGDECIMAL_AS_PLAIN setting, Jackson will attempt to write out the whole number, no matter how large the 
exponent.","vulnerabilityUrl":"https://github.com/FasterXML/jackson-core/issues/315","cvss3Severity":"medium","cvss3Score":"5.3","cvss3Metrics":{"A":"Low","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
index: True
WS-2018-0125 (Medium) detected in jackson-core-2.4.5.jar, jackson-core-2.6.4.jar - ## WS-2018-0125 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>jackson-core-2.4.5.jar</b>, <b>jackson-core-2.6.4.jar</b></p></summary> <p> <details><summary><b>jackson-core-2.4.5.jar</b></p></summary> <p>Core Jackson abstractions, basic JSON streaming API implementation</p> <p>Library home page: <a href="https://github.com/FasterXML/jackson">https://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: /samples/client/petstore/scala/build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-core/2.4.5/6fb96728ee26edb19fe329d94f3bd4df1a97652a/jackson-core-2.4.5.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-core/2.4.5/6fb96728ee26edb19fe329d94f3bd4df1a97652a/jackson-core-2.4.5.jar</p> <p> Dependency Hierarchy: - swagger-core-1.5.8.jar (Root Library) - jackson-dataformat-yaml-2.4.5.jar - :x: **jackson-core-2.4.5.jar** (Vulnerable Library) </details> <details><summary><b>jackson-core-2.6.4.jar</b></p></summary> <p>Core Jackson abstractions, basic JSON streaming API implementation</p> <p>Library home page: <a href="https://github.com/FasterXML/jackson-core">https://github.com/FasterXML/jackson-core</a></p> <p>Path to dependency file: /samples/client/petstore/java/jersey1/build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-core/2.6.4/27d3a9f7bbdcf72d93c9b2da7017e39551bfa9fb/jackson-core-2.6.4.jar,/aches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-core/2.6.4/27d3a9f7bbdcf72d93c9b2da7017e39551bfa9fb/jackson-core-2.6.4.jar</p> <p> Dependency Hierarchy: - jackson-databind-2.6.4.jar (Root Library) - :x: **jackson-core-2.6.4.jar** 
(Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/swagger-api/swagger-codegen/commit/4b7a8d7d7384aa6a27d6309c35ade0916edae7ed">4b7a8d7d7384aa6a27d6309c35ade0916edae7ed</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> OutOfMemoryError when writing BigDecimal In Jackson Core before version 2.7.7. When enabled the WRITE_BIGDECIMAL_AS_PLAIN setting, Jackson will attempt to write out the whole number, no matter how large the exponent. <p>Publish Date: 2016-08-25 <p>URL: <a href=https://github.com/FasterXML/jackson-core/issues/315>WS-2018-0125</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: Low </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/FasterXML/jackson-core/releases/tag/jackson-core-2.7.7">https://github.com/FasterXML/jackson-core/releases/tag/jackson-core-2.7.7</a></p> <p>Release Date: 2016-08-25</p> <p>Fix Resolution: com.fasterxml.jackson.core:jackson-core:2.7.7</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-core","packageVersion":"2.4.5","packageFilePaths":["/samples/client/petstore/scala/build.gradle"],"isTransitiveDependency":true,"dependencyTree":"io.swagger:swagger-core:1.5.8;com.fasterxml.jackson.dataformat:jackson-dataformat-yaml:2.4.5;com.fasterxml.jackson.core:jackson-core:2.4.5","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-core:2.7.7","isBinary":false},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-core","packageVersion":"2.6.4","packageFilePaths":["/samples/client/petstore/java/jersey1/build.gradle"],"isTransitiveDependency":true,"dependencyTree":"com.fasterxml.jackson.core:jackson-databind:2.6.4;com.fasterxml.jackson.core:jackson-core:2.6.4","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-core:2.7.7","isBinary":false}],"baseBranches":["master"],"vulnerabilityIdentifier":"WS-2018-0125","vulnerabilityDetails":"OutOfMemoryError when writing BigDecimal In Jackson Core before version 2.7.7.\nWhen enabled the WRITE_BIGDECIMAL_AS_PLAIN setting, Jackson will attempt to write out the whole number, no matter how large the 
exponent.","vulnerabilityUrl":"https://github.com/FasterXML/jackson-core/issues/315","cvss3Severity":"medium","cvss3Score":"5.3","cvss3Metrics":{"A":"Low","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
label: non_process
ws medium detected in jackson core jar jackson core jar ws medium severity vulnerability vulnerable libraries jackson core jar jackson core jar jackson core jar core jackson abstractions basic json streaming api implementation library home page a href path to dependency file samples client petstore scala build gradle path to vulnerable library home wss scanner gradle caches modules files com fasterxml jackson core jackson core jackson core jar home wss scanner gradle caches modules files com fasterxml jackson core jackson core jackson core jar dependency hierarchy swagger core jar root library jackson dataformat yaml jar x jackson core jar vulnerable library jackson core jar core jackson abstractions basic json streaming api implementation library home page a href path to dependency file samples client petstore java build gradle path to vulnerable library home wss scanner gradle caches modules files com fasterxml jackson core jackson core jackson core jar aches modules files com fasterxml jackson core jackson core jackson core jar dependency hierarchy jackson databind jar root library x jackson core jar vulnerable library found in head commit a href found in base branch master vulnerability details outofmemoryerror when writing bigdecimal in jackson core before version when enabled the write bigdecimal as plain setting jackson will attempt to write out the whole number no matter how large the exponent publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution com fasterxml jackson core jackson core isopenpronvulnerability false ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree io 
swagger swagger core com fasterxml jackson dataformat jackson dataformat yaml com fasterxml jackson core jackson core isminimumfixversionavailable true minimumfixversion com fasterxml jackson core jackson core isbinary false packagetype java groupid com fasterxml jackson core packagename jackson core packageversion packagefilepaths istransitivedependency true dependencytree com fasterxml jackson core jackson databind com fasterxml jackson core jackson core isminimumfixversionavailable true minimumfixversion com fasterxml jackson core jackson core isbinary false basebranches vulnerabilityidentifier ws vulnerabilitydetails outofmemoryerror when writing bigdecimal in jackson core before version nwhen enabled the write bigdecimal as plain setting jackson will attempt to write out the whole number no matter how large the exponent vulnerabilityurl
binary_label: 0
Unnamed: 0: 14873
id: 18282415874
type: IssuesEvent
created_at: 2021-10-05 06:12:31
repo: prisma/prisma
repo_url: https://api.github.com/repos/prisma/prisma
action: closed
title: MDBI: Show a helpful error message when developers try to re-introspect a MongoDB database
labels: process/candidate team/migrations topic: mongodb kind/subtask
Perhaps linked to https://github.com/prisma/prisma/issues/9585 where people can upvote and discuss this feature.
index: 1.0
MDBI: Show a helpful error message when developers try to re-introspect a MongoDB database - Perhaps linked to https://github.com/prisma/prisma/issues/9585 where people can upvote and discuss this feature.
label: process
mdbi show a helpful error message when developers try to re introspect a mongodb database perhaps linked to where people can upvote and discuss this feature
binary_label: 1
Unnamed: 0: 6803
id: 9942448952
type: IssuesEvent
created_at: 2019-07-03 13:55:21
repo: googleapis/google-cloud-python
repo_url: https://api.github.com/repos/googleapis/google-cloud-python
action: reopened
title: Firestore: 'test_watch_document' systest flakes
labels: api: firestore flaky testing type: process
```python _____________________________ test_watch_document ______________________________ client = <google.cloud.firestore_v1beta1.client.Client object at 0x7f0d928d09b0> cleanup = <built-in method append of list object at 0x7f0d90008bc8> def test_watch_document(client, cleanup): db = client doc_ref = db.collection(u'users').document( u'alovelace' + unique_resource_id()) # Initial setting doc_ref.set({ u'first': u'Jane', u'last': u'Doe', u'born': 1900 }) sleep(1) # Setup listener def on_snapshot(docs, changes, read_time): on_snapshot.called_count += 1 on_snapshot.called_count = 0 doc_ref.on_snapshot(on_snapshot) # Alter document doc_ref.set({ u'first': u'Ada', u'last': u'Lovelace', u'born': 1815 }) sleep(1) for _ in range(10): if on_snapshot.called_count == 1: return sleep(1) if on_snapshot.called_count != 1: raise AssertionError( "Failed to get exactly one document change: count: " + > str(on_snapshot.called_count)) E AssertionError: Failed to get exactly one document change: count: 2 tests/system.py:836: AssertionError ```
index: 1.0
Firestore: 'test_watch_document' systest flakes - ```python _____________________________ test_watch_document ______________________________ client = <google.cloud.firestore_v1beta1.client.Client object at 0x7f0d928d09b0> cleanup = <built-in method append of list object at 0x7f0d90008bc8> def test_watch_document(client, cleanup): db = client doc_ref = db.collection(u'users').document( u'alovelace' + unique_resource_id()) # Initial setting doc_ref.set({ u'first': u'Jane', u'last': u'Doe', u'born': 1900 }) sleep(1) # Setup listener def on_snapshot(docs, changes, read_time): on_snapshot.called_count += 1 on_snapshot.called_count = 0 doc_ref.on_snapshot(on_snapshot) # Alter document doc_ref.set({ u'first': u'Ada', u'last': u'Lovelace', u'born': 1815 }) sleep(1) for _ in range(10): if on_snapshot.called_count == 1: return sleep(1) if on_snapshot.called_count != 1: raise AssertionError( "Failed to get exactly one document change: count: " + > str(on_snapshot.called_count)) E AssertionError: Failed to get exactly one document change: count: 2 tests/system.py:836: AssertionError ```
label: process
firestore test watch document systest flakes python test watch document client cleanup def test watch document client cleanup db client doc ref db collection u users document u alovelace unique resource id initial setting doc ref set u first u jane u last u doe u born sleep setup listener def on snapshot docs changes read time on snapshot called count on snapshot called count doc ref on snapshot on snapshot alter document doc ref set u first u ada u last u lovelace u born sleep for in range if on snapshot called count return sleep if on snapshot called count raise assertionerror failed to get exactly one document change count str on snapshot called count e assertionerror failed to get exactly one document change count tests system py assertionerror
binary_label: 1
Unnamed: 0: 255040
id: 21895967755
type: IssuesEvent
created_at: 2022-05-20 08:39:13
repo: biocore/empress
repo_url: https://api.github.com/repos/biocore/empress
action: opened
title: Race condition / itermittent failure in one of the barplot tests
labels: bug testing
Noticed while working on #521 -- [this particular test](https://github.com/biocore/empress/blob/bf6f755c5187e543ff13c2c624b153fb97918b23/tests/test-barplots.js#L353-L359) was failing (it was expecting `216` coordinates, but only saw `72`). Adding log statements during debugging seemed to cause the test to stop failing, and the test kept passing even after I removed the logs again? Since I don't think #521 touches any of the code impacting this check, I suspected at first that it was a race condition, but now I'm not sure about that. In any case, we can always comment this test out if it becomes a problem, since it's not super important.
index: 1.0
Race condition / itermittent failure in one of the barplot tests - Noticed while working on #521 -- [this particular test](https://github.com/biocore/empress/blob/bf6f755c5187e543ff13c2c624b153fb97918b23/tests/test-barplots.js#L353-L359) was failing (it was expecting `216` coordinates, but only saw `72`). Adding log statements during debugging seemed to cause the test to stop failing, and the test kept passing even after I removed the logs again? Since I don't think #521 touches any of the code impacting this check, I suspected at first that it was a race condition, but now I'm not sure about that. In any case, we can always comment this test out if it becomes a problem, since it's not super important.
label: non_process
race condition itermittent failure in one of the barplot tests noticed while working on was failing it was expecting coordinates but only saw adding log statements during debugging seemed to cause the test to stop failing and the test kept passing even after i removed the logs again since i don t think touches any of the code impacting this check i suspected at first that it was a race condition but now i m not sure about that in any case we can always comment this test out if it becomes a problem since it s not super important
binary_label: 0
Unnamed: 0: 48130
id: 7375177743
type: IssuesEvent
created_at: 2018-03-13 23:04:56
repo: aurelia/skeleton-navigation
repo_url: https://api.github.com/repos/aurelia/skeleton-navigation
action: closed
title: error TS2307: Cannot find module 'aurelia-router' - document update required
labels: documentation
**I'm submitting a bug report** - **Aurelia Skeleton Version** skeleton-typescript - **Framework Version:** 1.0.6 (latest) **Please tell us about your environment:** - **Operating System:** ubuntu docker image on virtual box on windows 10 - **Node Version:** 6.7.0 - **NPM Version:** 3.10.3 - **JSPM OR Webpack AND Version** JSPM 0.16.46 - **Browser:** N/A - **Language:** typeScript **Current behavior:** Documentation at http://aurelia.io/hub.html#/doc/article/aurelia/framework/latest/setup-jspm/2 doesn't mention to run **typings install** before **gulp watch**. **Expected/desired behavior:** Running 'typings install' prevents both build errors and the app not loading correctly in the browser. - **What is the expected behavior?** - **What is the motivation / use case for changing the behavior?** An hour lost time scratching head \* 1000 people (guess at # using this version of aurelia) = over a months saved world wide dev time. Every little helps!
index: 1.0
error TS2307: Cannot find module 'aurelia-router' - document update required - **I'm submitting a bug report** - **Aurelia Skeleton Version** skeleton-typescript - **Framework Version:** 1.0.6 (latest) **Please tell us about your environment:** - **Operating System:** ubuntu docker image on virtual box on windows 10 - **Node Version:** 6.7.0 - **NPM Version:** 3.10.3 - **JSPM OR Webpack AND Version** JSPM 0.16.46 - **Browser:** N/A - **Language:** typeScript **Current behavior:** Documentation at http://aurelia.io/hub.html#/doc/article/aurelia/framework/latest/setup-jspm/2 doesn't mention to run **typings install** before **gulp watch**. **Expected/desired behavior:** Running 'typings install' prevents both build errors and the app not loading correctly in the browser. - **What is the expected behavior?** - **What is the motivation / use case for changing the behavior?** An hour lost time scratching head \* 1000 people (guess at # using this version of aurelia) = over a months saved world wide dev time. Every little helps!
label: non_process
error cannot find module aurelia router document update required i m submitting a bug report aurelia skeleton version skeleton typescript framework version latest please tell us about your environment operating system ubuntu docker image on virtual box on windows node version npm version jspm or webpack and version jspm browser n a language typescript current behavior documentation at doesn t mention to run typings install before gulp watch expected desired behavior running typings install prevents both build errors and the app not loading correctly in the browser what is the expected behavior what is the motivation use case for changing the behavior an hour lost time scratching head people guess at using this version of aurelia over a months saved world wide dev time every little helps
binary_label: 0
Unnamed: 0: 5756
id: 8598512359
type: IssuesEvent
created_at: 2018-11-15 22:01:58
repo: HumanCellAtlas/dcp-community
repo_url: https://api.github.com/repos/HumanCellAtlas/dcp-community
action: closed
title: Community members may escalate decisions for RFCs
labels: rfc-process
**Notes from PM F2F (9/25/18):** There was a desire (strongly expressed by @briandoconnor) for the escalation process to not be limited to the Author in the case of technical RFCs. For example, any community member may escalate an _Architecture++_ decision (approval or disapproval) to _PM++_ for oversight when there is disagreement.
index: 1.0
Community members may escalate decisions for RFCs - **Notes from PM F2F (9/25/18):** There was a desire (strongly expressed by @briandoconnor) for the escalation process to not be limited to the Author in the case of technical RFCs. For example, any community member may escalate an _Architecture++_ decision (approval or disapproval) to _PM++_ for oversight when there is disagreement.
label: process
community members may escalate decisions for rfcs notes from pm there was a desire strongly expressed by briandoconnor for the escalation process to not be limited to the author in the case of technical rfcs for example any community member may escalate an architecture decision approval or disapproval to pm for oversight when there is disagreement
binary_label: 1
Unnamed: 0: 12225
id: 14743310960
type: IssuesEvent
created_at: 2021-01-07 13:43:44
repo: kdjstudios/SABillingGitlab
repo_url: https://api.github.com/repos/kdjstudios/SABillingGitlab
action: closed
title: Site 062 Client unable to apply payment on through portal and expiry date not being accepted.
labels: anc-process anp-1 ant-support has attachment
In GitLab by @kdjstudios on Aug 7, 2019, 11:32 **Submitted by:** "Denise Joseph" <denise.joseph@answernet.com> **Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/8968504 **Server:** Internal **Client/Site:** Toronto **Account:** Platform Properties **Issue:** Our contact Tania of Platform Properties advised she attempted to pay her invoice through our portal but due to the expiry date on the card which is still valid. The expiry date was 03/21. Can you please look into this for us and update us. Thank you in advance for your time and assistance. ![image](/uploads/0fa6fa9674694afe7e080dac5043b20c/image.png)
index: 1.0
Site 062 Client unable to apply payment on through portal and expiry date not being accepted. - In GitLab by @kdjstudios on Aug 7, 2019, 11:32 **Submitted by:** "Denise Joseph" <denise.joseph@answernet.com> **Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/8968504 **Server:** Internal **Client/Site:** Toronto **Account:** Platform Properties **Issue:** Our contact Tania of Platform Properties advised she attempted to pay her invoice through our portal but due to the expiry date on the card which is still valid. The expiry date was 03/21. Can you please look into this for us and update us. Thank you in advance for your time and assistance. ![image](/uploads/0fa6fa9674694afe7e080dac5043b20c/image.png)
label: process
site client unable to apply payment on through portal and expiry date not being accepted in gitlab by kdjstudios on aug submitted by denise joseph helpdesk server internal client site toronto account platform properties issue our contact tania of platform properties advised she attempted to pay her invoice through our portal but due to the expiry date on the card which is still valid the expiry date was can you please look into this for us and update us thank you in advance for your time and assistance uploads image png
binary_label: 1
Unnamed: 0: 15091
id: 18800022872
type: IssuesEvent
created_at: 2021-11-09 05:48:54
repo: googleapis/nodejs-automl
repo_url: https://api.github.com/repos/googleapis/nodejs-automl
action: closed
title: Tables PredictionAPI: should perform single prediction failed
labels: type: process api: automl flakybot: issue flakybot: flaky
Note: #496 was also for this test, but it was closed more than 10 days ago. So, I didn't mark it flaky. ---- commit: b6a22c7f0def0c8567ea54f68f32d2ad202d385c buildURL: [Build Status](https://source.cloud.google.com/results/invocations/e24eb91d-cdc0-4f45-9b78-5945da27eb7b), [Sponge](http://sponge2/e24eb91d-cdc0-4f45-9b78-5945da27eb7b) status: failed <details><summary>Test output</summary><br><pre>Command failed: node tables/predict.v1beta1.js "cdpe-automl-tests" "us-central1" "TBL7473655411900416000" '[{"numberValue":39},{"stringValue":"technician"},{"stringValue":"married"},{"stringValue":"secondary"},{"stringValue":"no"},{"numberValue":52},{"stringValue":"no"},{"stringValue":"no"},{"stringValue":"cellular"},{"numberValue":12},{"stringValue":"aug"},{"numberValue":96},{"numberValue":2},{"numberValue":-1},{"numberValue":0},{"stringValue":"unknown"}]' 4 DEADLINE_EXCEEDED: Deadline exceeded Error: Command failed: node tables/predict.v1beta1.js "cdpe-automl-tests" "us-central1" "TBL7473655411900416000" '[{"numberValue":39},{"stringValue":"technician"},{"stringValue":"married"},{"stringValue":"secondary"},{"stringValue":"no"},{"numberValue":52},{"stringValue":"no"},{"stringValue":"no"},{"stringValue":"cellular"},{"numberValue":12},{"stringValue":"aug"},{"numberValue":96},{"numberValue":2},{"numberValue":-1},{"numberValue":0},{"stringValue":"unknown"}]' 4 DEADLINE_EXCEEDED: Deadline exceeded at checkExecSyncError (child_process.js:635:11) at execSync (child_process.js:671:15) at exec (test/automlTablesPredict.v1beta1.test.js:32:21) at Context.<anonymous> (test/automlTablesPredict.v1beta1.test.js:55:20) at processImmediate (internal/timers.js:461:21)</pre></details>
1.0
Tables PredictionAPI: should perform single prediction failed - Note: #496 was also for this test, but it was closed more than 10 days ago. So, I didn't mark it flaky. ---- commit: b6a22c7f0def0c8567ea54f68f32d2ad202d385c buildURL: [Build Status](https://source.cloud.google.com/results/invocations/e24eb91d-cdc0-4f45-9b78-5945da27eb7b), [Sponge](http://sponge2/e24eb91d-cdc0-4f45-9b78-5945da27eb7b) status: failed <details><summary>Test output</summary><br><pre>Command failed: node tables/predict.v1beta1.js "cdpe-automl-tests" "us-central1" "TBL7473655411900416000" '[{"numberValue":39},{"stringValue":"technician"},{"stringValue":"married"},{"stringValue":"secondary"},{"stringValue":"no"},{"numberValue":52},{"stringValue":"no"},{"stringValue":"no"},{"stringValue":"cellular"},{"numberValue":12},{"stringValue":"aug"},{"numberValue":96},{"numberValue":2},{"numberValue":-1},{"numberValue":0},{"stringValue":"unknown"}]' 4 DEADLINE_EXCEEDED: Deadline exceeded Error: Command failed: node tables/predict.v1beta1.js "cdpe-automl-tests" "us-central1" "TBL7473655411900416000" '[{"numberValue":39},{"stringValue":"technician"},{"stringValue":"married"},{"stringValue":"secondary"},{"stringValue":"no"},{"numberValue":52},{"stringValue":"no"},{"stringValue":"no"},{"stringValue":"cellular"},{"numberValue":12},{"stringValue":"aug"},{"numberValue":96},{"numberValue":2},{"numberValue":-1},{"numberValue":0},{"stringValue":"unknown"}]' 4 DEADLINE_EXCEEDED: Deadline exceeded at checkExecSyncError (child_process.js:635:11) at execSync (child_process.js:671:15) at exec (test/automlTablesPredict.v1beta1.test.js:32:21) at Context.<anonymous> (test/automlTablesPredict.v1beta1.test.js:55:20) at processImmediate (internal/timers.js:461:21)</pre></details>
process
tables predictionapi should perform single prediction failed note was also for this test but it was closed more than days ago so i didn t mark it flaky commit buildurl status failed test output command failed node tables predict js cdpe automl tests us deadline exceeded deadline exceeded error command failed node tables predict js cdpe automl tests us deadline exceeded deadline exceeded at checkexecsyncerror child process js at execsync child process js at exec test automltablespredict test js at context test automltablespredict test js at processimmediate internal timers js
1
57,112
8,140,611,417
IssuesEvent
2018-08-20 21:43:52
appium/appium
https://api.github.com/repos/appium/appium
closed
Appium.io: Add links to GitHub source documents
Documentation
To simplify the process of reporting issues and putting in PR's to fix documentation issues, we should add links in appium.io that link back to the source .md file
1.0
Appium.io: Add links to GitHub source documents - To simplify the process of reporting issues and putting in PR's to fix documentation issues, we should add links in appium.io that link back to the source .md file
non_process
appium io add links to github source documents to simplify the process of reporting issues and putting in pr s to fix documentation issues we should add links in appium io that link back to the source md file
0
6,695
9,813,710,027
IssuesEvent
2019-06-13 08:37:44
opengeospatial/CityGML-3.0CM
https://api.github.com/repos/opengeospatial/CityGML-3.0CM
closed
Do We Need Dedicated Discussions of Specific Features of the Conceptual Model?
SWG Process
Should we put effort into thinking about the model first or should we work with experimental implementations to better understand the practical details?
1.0
Do We Need Dedicated Discussions of Specific Features of the Conceptual Model? - Should we put effort into thinking about the model first or should we work with experimental implementations to better understand the practical details?
process
do we need dedicated discussions of specific features of the conceptual model should we put effort into thinking about the model first or should we work with experimental implementations to better understand the practical details
1
8,268
11,429,670,732
IssuesEvent
2020-02-04 08:31:52
threefoldtech/zos
https://api.github.com/repos/threefoldtech/zos
closed
logs: container logs processing with promtail
process_duplicate type_feature
This will make it easier to debug 0-db, and user container logs (minio and other workloads)
1.0
logs: container logs processing with promtail - This will make it easier to debug 0-db, and user container logs (minio and other workloads)
process
logs container logs processing with promtail this will make it easier to debug db and user container logs minio and other workloads
1
39,042
10,281,924,349
IssuesEvent
2019-08-26 09:43:33
ShaikASK/Testing
https://api.github.com/repos/ShaikASK/Testing
opened
Educational Details screen: The screen name displayed as “Educational Details Details” whereas actual form name given in design web forms as “Education Details”.
Defect P2 Release #5 Build #52
Steps To Replicate : 1.Launch the URL 2.Sign in as HR admin user’ 3.Go to Design webform 4.Update BGV Education Sub webform Name to Education Details and save it 5.Create a New Hire and initiate it 6.Sign in as candidate 7.Sign the “Offer Letter” 8.Navigate to Common Details webform and click on Next button 9.Navigate to “BGV” webform 10.Click on+ icon displayed beside Education Details 11.Navigate to popup window ,check the title of popup window Experienced Behaviour : Observed that The screen name displayed as “Educational Details Details” whereas actual form name given in design web forms as “Education Details”. Expected Behaviour : Ensure that it should display same name provided from design webform screen
1.0
Educational Details screen: The screen name displayed as “Educational Details Details” whereas actual form name given in design web forms as “Education Details”. - Steps To Replicate : 1.Launch the URL 2.Sign in as HR admin user’ 3.Go to Design webform 4.Update BGV Education Sub webform Name to Education Details and save it 5.Create a New Hire and initiate it 6.Sign in as candidate 7.Sign the “Offer Letter” 8.Navigate to Common Details webform and click on Next button 9.Navigate to “BGV” webform 10.Click on+ icon displayed beside Education Details 11.Navigate to popup window ,check the title of popup window Experienced Behaviour : Observed that The screen name displayed as “Educational Details Details” whereas actual form name given in design web forms as “Education Details”. Expected Behaviour : Ensure that it should display same name provided from design webform screen
non_process
educational details screen the screen name displayed as “educational details details” whereas actual form name given in design web forms as “education details” steps to replicate launch the url sign in as hr admin user’ go to design webform update bgv education sub webform name to education details and save it create a new hire and initiate it sign in as candidate sign the “offer letter” navigate to common details webform and click on next button navigate to “bgv” webform click on icon displayed beside education details navigate to popup window check the title of popup window experienced behaviour observed that the screen name displayed as “educational details details” whereas actual form name given in design web forms as “education details” expected behaviour ensure that it should display same name provided from design webform screen
0
338,020
30,277,354,120
IssuesEvent
2023-07-07 21:06:16
unifyai/ivy
https://api.github.com/repos/unifyai/ivy
reopened
Fix creation_ops.test_torch_full
PyTorch Frontend Sub Task Failing Test
| | | |---|---| |jax|<a href="https://github.com/unifyai/ivy/actions/runs/5490329014/jobs/10005627811"><img src=https://img.shields.io/badge/-success-success></a> |numpy|<a href="https://github.com/unifyai/ivy/actions/runs/5490329014/jobs/10005627811"><img src=https://img.shields.io/badge/-success-success></a> |tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/5490329014/jobs/10005627811"><img src=https://img.shields.io/badge/-success-success></a> |torch|<a href="https://github.com/unifyai/ivy/actions/runs/5490329014/jobs/10005627811"><img src=https://img.shields.io/badge/-success-success></a> |paddle|<a href="https://github.com/unifyai/ivy/actions/runs/5490329014/jobs/10005627811"><img src=https://img.shields.io/badge/-failure-red></a>
1.0
Fix creation_ops.test_torch_full - | | | |---|---| |jax|<a href="https://github.com/unifyai/ivy/actions/runs/5490329014/jobs/10005627811"><img src=https://img.shields.io/badge/-success-success></a> |numpy|<a href="https://github.com/unifyai/ivy/actions/runs/5490329014/jobs/10005627811"><img src=https://img.shields.io/badge/-success-success></a> |tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/5490329014/jobs/10005627811"><img src=https://img.shields.io/badge/-success-success></a> |torch|<a href="https://github.com/unifyai/ivy/actions/runs/5490329014/jobs/10005627811"><img src=https://img.shields.io/badge/-success-success></a> |paddle|<a href="https://github.com/unifyai/ivy/actions/runs/5490329014/jobs/10005627811"><img src=https://img.shields.io/badge/-failure-red></a>
non_process
fix creation ops test torch full jax a href src numpy a href src tensorflow a href src torch a href src paddle a href src
0
250,280
27,066,429,888
IssuesEvent
2023-02-14 01:03:07
DevOps-PM-PGDip-2022-2023/easybuggy4django.old
https://api.github.com/repos/DevOps-PM-PGDip-2022-2023/easybuggy4django.old
opened
CVE-2018-20677 (Medium) detected in bootstrap-3.3.7.min.js
security vulnerability
## CVE-2018-20677 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>bootstrap-3.3.7.min.js</b></p></summary> <p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.7/js/bootstrap.min.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.7/js/bootstrap.min.js</a></p> <p>Path to dependency file: /templates/base.html</p> <p>Path to vulnerable library: /templates/base.html</p> <p> Dependency Hierarchy: - :x: **bootstrap-3.3.7.min.js** (Vulnerable Library) <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In Bootstrap before 3.4.0, XSS is possible in the affix configuration target property. <p>Publish Date: 2019-01-09 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2018-20677>CVE-2018-20677</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-20677">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-20677</a></p> <p>Release Date: 2019-01-09</p> <p>Fix Resolution: Bootstrap - v3.4.0;NorDroN.AngularTemplate - 0.1.6;Dynamic.NET.Express.ProjectTemplates - 0.8.0;dotnetng.template - 1.0.0.4;ZNxtApp.Core.Module.Theme - 1.0.9-Beta;JMeter - 5.0.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2018-20677 (Medium) detected in bootstrap-3.3.7.min.js - ## CVE-2018-20677 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>bootstrap-3.3.7.min.js</b></p></summary> <p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.7/js/bootstrap.min.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.7/js/bootstrap.min.js</a></p> <p>Path to dependency file: /templates/base.html</p> <p>Path to vulnerable library: /templates/base.html</p> <p> Dependency Hierarchy: - :x: **bootstrap-3.3.7.min.js** (Vulnerable Library) <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In Bootstrap before 3.4.0, XSS is possible in the affix configuration target property. <p>Publish Date: 2019-01-09 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2018-20677>CVE-2018-20677</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-20677">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-20677</a></p> <p>Release Date: 2019-01-09</p> <p>Fix Resolution: Bootstrap - v3.4.0;NorDroN.AngularTemplate - 0.1.6;Dynamic.NET.Express.ProjectTemplates - 0.8.0;dotnetng.template - 1.0.0.4;ZNxtApp.Core.Module.Theme - 1.0.9-Beta;JMeter - 5.0.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve medium detected in bootstrap min js cve medium severity vulnerability vulnerable library bootstrap min js the most popular front end framework for developing responsive mobile first projects on the web library home page a href path to dependency file templates base html path to vulnerable library templates base html dependency hierarchy x bootstrap min js vulnerable library found in base branch master vulnerability details in bootstrap before xss is possible in the affix configuration target property publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution bootstrap nordron angulartemplate dynamic net express projecttemplates dotnetng template znxtapp core module theme beta jmeter step up your open source security game with mend
0
814,957
30,531,398,016
IssuesEvent
2023-07-19 14:29:30
telerik/kendo-ui-core
https://api.github.com/repos/telerik/kendo-ui-core
opened
Search icon not aligned in the filter menu since svg icons introduced.
Bug SEV: Low C: Grid Priority 1 FP: Unplanned
### Bug report Input is missing `k-input-icon` class. See [the HTML spec](https://github.com/telerik/kendo-themes/blob/develop/packages/html/src/searchbox/searchbox.spec.tsx#L92) for reference. ### Reproduction of the problem Dojo: https://dojo.telerik.com/ojowuMEy ![image](https://github.com/telerik/kendo-ui-core/assets/46747298/e9ef7e3d-bd5e-44a6-98c2-8c8a54d92844) ### Expected/desired behavior Search icon shall be aligned in the filter menu. ### Environment * **Kendo UI version:** 2023.1.314 * **jQuery version:** [all] * **Browser:** [all]
1.0
Search icon not aligned in the filter menu since svg icons introduced. - ### Bug report Input is missing `k-input-icon` class. See [the HTML spec](https://github.com/telerik/kendo-themes/blob/develop/packages/html/src/searchbox/searchbox.spec.tsx#L92) for reference. ### Reproduction of the problem Dojo: https://dojo.telerik.com/ojowuMEy ![image](https://github.com/telerik/kendo-ui-core/assets/46747298/e9ef7e3d-bd5e-44a6-98c2-8c8a54d92844) ### Expected/desired behavior Search icon shall be aligned in the filter menu. ### Environment * **Kendo UI version:** 2023.1.314 * **jQuery version:** [all] * **Browser:** [all]
non_process
search icon not aligned in the filter menu since svg icons introduced bug report input is missing k input icon class see for reference reproduction of the problem dojo expected desired behavior search icon shall be aligned in the filter menu environment kendo ui version jquery version browser
0
793,242
27,987,767,632
IssuesEvent
2023-03-26 21:50:17
NCAR/wrfcloud
https://api.github.com/repos/NCAR/wrfcloud
opened
Reinstate color contoured wind speed products
priority: high type: new feature alert: NEED MORE DEFINITION component: graphics component: NWP components
## Describe the New Feature ## With the addition of the wind direction products, the wind speed contour products were removed. It may still be worthwhile to have these products available. The wind speed can be derived using the wrf python package, including 10m and 3D wind speeds, color contoured, and listed on the forecast viewer. ### Acceptance Testing ### *List input data types and sources.* *Describe tests required for new functionality.* ### Time Estimate ### *Estimate the amount of work required here.* *Issues should represent approximately 1 to 3 days of work.* ### Sub-Issues ### Consider breaking the new feature down into sub-issues. - [ ] *Add a checkbox for each sub-issue here.* ### Relevant Deadlines ### *List relevant project deadlines here or state NONE.* ## Define the Metadata ## ### Assignee ### - [ ] Select **engineer(s)** or **no engineer** required - [ ] Select **scientist(s)** or **no scientist** required ### Labels ### - [ ] Select **component(s)** - [ ] Select **priority** ### Projects and Milestone ### - [ ] Select **Project** - [ ] Select **Milestone** as the next official version or **Backlog of Development Ideas** ## New Feature Checklist ## - [ ] Complete the issue definition above, including the **Time Estimate** and **Funding source**. - [ ] Fork this repository or create a branch of **develop**. Branch name: `feature_<Issue Number>/<Description>` - [ ] Complete the development and test your changes. - [ ] Add/update log messages for easier debugging. - [ ] Add/update tests. - [ ] Add/update documentation. - [ ] Push local changes to GitHub. - [ ] Submit a pull request to merge into **develop**. Pull request: `feature <Issue Number> <Description>` - [ ] Define the pull request metadata, as permissions allow. Select: **Reviewer(s)**, **Project**, and **Development** issue Select: **Milestone** as the next official version - [ ] Iterate until the reviewer(s) accept and merge your changes. - [ ] Delete your fork or branch. 
- [ ] Close this issue.
1.0
Reinstate color contoured wind speed products - ## Describe the New Feature ## With the addition of the wind direction products, the wind speed contour products were removed. It may still be worthwhile to have these products available. The wind speed can be derived using the wrf python package, including 10m and 3D wind speeds, color contoured, and listed on the forecast viewer. ### Acceptance Testing ### *List input data types and sources.* *Describe tests required for new functionality.* ### Time Estimate ### *Estimate the amount of work required here.* *Issues should represent approximately 1 to 3 days of work.* ### Sub-Issues ### Consider breaking the new feature down into sub-issues. - [ ] *Add a checkbox for each sub-issue here.* ### Relevant Deadlines ### *List relevant project deadlines here or state NONE.* ## Define the Metadata ## ### Assignee ### - [ ] Select **engineer(s)** or **no engineer** required - [ ] Select **scientist(s)** or **no scientist** required ### Labels ### - [ ] Select **component(s)** - [ ] Select **priority** ### Projects and Milestone ### - [ ] Select **Project** - [ ] Select **Milestone** as the next official version or **Backlog of Development Ideas** ## New Feature Checklist ## - [ ] Complete the issue definition above, including the **Time Estimate** and **Funding source**. - [ ] Fork this repository or create a branch of **develop**. Branch name: `feature_<Issue Number>/<Description>` - [ ] Complete the development and test your changes. - [ ] Add/update log messages for easier debugging. - [ ] Add/update tests. - [ ] Add/update documentation. - [ ] Push local changes to GitHub. - [ ] Submit a pull request to merge into **develop**. Pull request: `feature <Issue Number> <Description>` - [ ] Define the pull request metadata, as permissions allow. Select: **Reviewer(s)**, **Project**, and **Development** issue Select: **Milestone** as the next official version - [ ] Iterate until the reviewer(s) accept and merge your changes. 
- [ ] Delete your fork or branch. - [ ] Close this issue.
non_process
reinstate color contoured wind speed products describe the new feature with the addition of the wind direction products the wind speed contour products were removed it may still be worthwhile to have these products available the wind speed can be derived using the wrf python package including and wind speeds color contoured and listed on the forecast viewer acceptance testing list input data types and sources describe tests required for new functionality time estimate estimate the amount of work required here issues should represent approximately to days of work sub issues consider breaking the new feature down into sub issues add a checkbox for each sub issue here relevant deadlines list relevant project deadlines here or state none define the metadata assignee select engineer s or no engineer required select scientist s or no scientist required labels select component s select priority projects and milestone select project select milestone as the next official version or backlog of development ideas new feature checklist complete the issue definition above including the time estimate and funding source fork this repository or create a branch of develop branch name feature complete the development and test your changes add update log messages for easier debugging add update tests add update documentation push local changes to github submit a pull request to merge into develop pull request feature define the pull request metadata as permissions allow select reviewer s project and development issue select milestone as the next official version iterate until the reviewer s accept and merge your changes delete your fork or branch close this issue
0
15,414
19,600,373,851
IssuesEvent
2022-01-06 00:08:05
GoogleCloudPlatform/python-docs-samples
https://api.github.com/repos/GoogleCloudPlatform/python-docs-samples
closed
composer.rest.get_client_id_test: test_get_client_id failed
priority: p2 type: process api: composer samples flakybot: issue
Note: #6949 was also for this test, but it was closed more than 10 days ago. So, I didn't mark it flaky. ---- commit: 5ceff2520ecced7e1284f5ce0592b93dafcf3817 buildURL: [Build Status](https://source.cloud.google.com/results/invocations/63899ce4-0591-4f40-81b5-ddef69dd0b95), [Sponge](http://sponge2/63899ce4-0591-4f40-81b5-ddef69dd0b95) status: failed <details><summary>Test output</summary><br><pre>Traceback (most recent call last): File "/workspace/composer/rest/get_client_id_test.py", line 29, in test_get_client_id get_client_id(PROJECT, COMPOSER_LOCATION, COMPOSER_ENVIRONMENT) File "/workspace/composer/rest/get_client_id.py", line 49, in get_client_id composer_version = environment_data["config"]["softwareConfig"]["imageVersion"] KeyError: 'config'</pre></details>
1.0
composer.rest.get_client_id_test: test_get_client_id failed - Note: #6949 was also for this test, but it was closed more than 10 days ago. So, I didn't mark it flaky. ---- commit: 5ceff2520ecced7e1284f5ce0592b93dafcf3817 buildURL: [Build Status](https://source.cloud.google.com/results/invocations/63899ce4-0591-4f40-81b5-ddef69dd0b95), [Sponge](http://sponge2/63899ce4-0591-4f40-81b5-ddef69dd0b95) status: failed <details><summary>Test output</summary><br><pre>Traceback (most recent call last): File "/workspace/composer/rest/get_client_id_test.py", line 29, in test_get_client_id get_client_id(PROJECT, COMPOSER_LOCATION, COMPOSER_ENVIRONMENT) File "/workspace/composer/rest/get_client_id.py", line 49, in get_client_id composer_version = environment_data["config"]["softwareConfig"]["imageVersion"] KeyError: 'config'</pre></details>
process
composer rest get client id test test get client id failed note was also for this test but it was closed more than days ago so i didn t mark it flaky commit buildurl status failed test output traceback most recent call last file workspace composer rest get client id test py line in test get client id get client id project composer location composer environment file workspace composer rest get client id py line in get client id composer version environment data keyerror config
1
9,040
12,130,107,973
IssuesEvent
2020-04-23 00:30:40
GoogleCloudPlatform/python-docs-samples
https://api.github.com/repos/GoogleCloudPlatform/python-docs-samples
closed
remove gcp-devrel-py-tools from appengine/standard/firebase/firenotes/backend/requirements-test.txt
priority: p2 remove-gcp-devrel-py-tools type: process
remove gcp-devrel-py-tools from appengine/standard/firebase/firenotes/backend/requirements-test.txt
1.0
remove gcp-devrel-py-tools from appengine/standard/firebase/firenotes/backend/requirements-test.txt - remove gcp-devrel-py-tools from appengine/standard/firebase/firenotes/backend/requirements-test.txt
process
remove gcp devrel py tools from appengine standard firebase firenotes backend requirements test txt remove gcp devrel py tools from appengine standard firebase firenotes backend requirements test txt
1
17,285
23,094,061,555
IssuesEvent
2022-07-26 17:43:22
dotnet/runtime
https://api.github.com/repos/dotnet/runtime
closed
Process.ProcessName throws for services.exe when accessed from a non-admin user
area-System.Diagnostics.Process regression-from-last-release
`Process.ProcessName` throws when the process is an admin process (like "services.exe") and the current app is running as a low privilege user. This is a regression from .NET 6, and it breaks `Microsoft.Extensions.Hosting.WindowsServices` when running as a non-admin user. This could possibly be a regression from #59672. ### Repro Steps 1. Build a console app as below: ```xml <Project Sdk="Microsoft.NET.Sdk"> <PropertyGroup> <OutputType>Exe</OutputType> <TargetFramework>net7.0</TargetFramework> <ImplicitUsings>enable</ImplicitUsings> <Nullable>enable</Nullable> </PropertyGroup> </Project> ``` ```C# using System.Diagnostics; namespace ConsoleApp1 { internal class Program { const int ServicesProcessId = 756; static void Main(string[] args) { var services = Process.GetProcessById(ServicesProcessId); Console.WriteLine(services.ProcessName); } } } ``` 2. Look in Task Manager for the `services.exe` process and replace the ServiceProcessId with the process ID for your current machine. 3. Run the app as you, it prints ``` services ``` 4. Run the app as a non-admin, low privilege user ``` Unhandled exception. System.InvalidOperationException: Process has exited, so the requested information is not available. at System.Diagnostics.Process.get_ProcessName() at ConsoleApp1.Program.Main(String[] args) in C:\Users\eerhardt\source\repos\WindowsService1\ConsoleApp1\Program.cs:line 12 ``` 5. Change the `TargetFramework` to `net6.0`, and run as the same non-admin, low privilege user: ``` services ``` ### Original Report I come back to the issue #67093, which is closed. The error still occurs with preview.4 and todays nightly build, but only on Windows Server 2016, not on my local Windows 11 developer machine. The error is: ``` Application: CloudManagementTool.WorkerService.exe CoreCLR Version: 7.0.22.27203 .NET Version: 7.0.0-preview.5.22272.3 Description: The process was terminated due to an unhandled exception. 
Exception Info: System.TypeInitializationException: The type initializer for 'CloudManagementTool.WorkerService.WorkerService' threw an exception. ---> System.InvalidOperationException: Process has exited, so the requested information is not available. at System.Diagnostics.Process.get_ProcessName() at Microsoft.Extensions.Hosting.WindowsServices.WindowsServiceHelpers.IsWindowsService() at ConsoleApp1.Program.<Main>(String[] args) in D:\Test\Program.cs:line 13 ``` A reproducing sample program is quite short, but it only crashes on Windows Server 2016 and when registered and executed as a service. Starting the same code directly from the console works fine. ``` public class Program { public static async Task Main(string[] args) { var isService = WindowsServiceHelpers.IsWindowsService(); if (isService) { // ... ``` This issue becomes critical now, because it will make the 7.0 version unsable for windows services on Windows Server 2016.
1.0
Process.ProcessName throws for services.exe when accessed from a non-admin user - `Process.ProcessName` throws when the process is an admin process (like "services.exe") and the current app is running as a low privilege user. This is a regression from .NET 6, and it breaks `Microsoft.Extensions.Hosting.WindowsServices` when running as a non-admin user. This could possibly be a regression from #59672. ### Repro Steps 1. Build a console app as below: ```xml <Project Sdk="Microsoft.NET.Sdk"> <PropertyGroup> <OutputType>Exe</OutputType> <TargetFramework>net7.0</TargetFramework> <ImplicitUsings>enable</ImplicitUsings> <Nullable>enable</Nullable> </PropertyGroup> </Project> ``` ```C# using System.Diagnostics; namespace ConsoleApp1 { internal class Program { const int ServicesProcessId = 756; static void Main(string[] args) { var services = Process.GetProcessById(ServicesProcessId); Console.WriteLine(services.ProcessName); } } } ``` 2. Look in Task Manager for the `services.exe` process and replace the ServiceProcessId with the process ID for your current machine. 3. Run the app as you, it prints ``` services ``` 4. Run the app as a non-admin, low privilege user ``` Unhandled exception. System.InvalidOperationException: Process has exited, so the requested information is not available. at System.Diagnostics.Process.get_ProcessName() at ConsoleApp1.Program.Main(String[] args) in C:\Users\eerhardt\source\repos\WindowsService1\ConsoleApp1\Program.cs:line 12 ``` 5. Change the `TargetFramework` to `net6.0`, and run as the same non-admin, low privilege user: ``` services ``` ### Original Report I come back to the issue #67093, which is closed. The error still occurs with preview.4 and todays nightly build, but only on Windows Server 2016, not on my local Windows 11 developer machine. 
The error is: ``` Application: CloudManagementTool.WorkerService.exe CoreCLR Version: 7.0.22.27203 .NET Version: 7.0.0-preview.5.22272.3 Description: The process was terminated due to an unhandled exception. Exception Info: System.TypeInitializationException: The type initializer for 'CloudManagementTool.WorkerService.WorkerService' threw an exception. ---> System.InvalidOperationException: Process has exited, so the requested information is not available. at System.Diagnostics.Process.get_ProcessName() at Microsoft.Extensions.Hosting.WindowsServices.WindowsServiceHelpers.IsWindowsService() at ConsoleApp1.Program.<Main>(String[] args) in D:\Test\Program.cs:line 13 ``` A reproducing sample program is quite short, but it only crashes on Windows Server 2016 and when registered and executed as a service. Starting the same code directly from the console works fine. ``` public class Program { public static async Task Main(string[] args) { var isService = WindowsServiceHelpers.IsWindowsService(); if (isService) { // ... ``` This issue becomes critical now, because it will make the 7.0 version unsable for windows services on Windows Server 2016.
process
process processname throws for services exe when accessed from a non admin user process processname throws when the process is an admin process like services exe and the current app is running as a low privilege user this is a regression from net and it breaks microsoft extensions hosting windowsservices when running as a non admin user this could possibly be a regression from repro steps build a console app as below xml exe enable enable c using system diagnostics namespace internal class program const int servicesprocessid static void main string args var services process getprocessbyid servicesprocessid console writeline services processname look in task manager for the services exe process and replace the serviceprocessid with the process id for your current machine run the app as you it prints services run the app as a non admin low privilege user unhandled exception system invalidoperationexception process has exited so the requested information is not available at system diagnostics process get processname at program main string args in c users eerhardt source repos program cs line change the targetframework to and run as the same non admin low privilege user services original report i come back to the issue which is closed the error still occurs with preview and todays nightly build but only on windows server not on my local windows developer machine the error is application cloudmanagementtool workerservice exe coreclr version net version preview description the process was terminated due to an unhandled exception exception info system typeinitializationexception the type initializer for cloudmanagementtool workerservice workerservice threw an exception system invalidoperationexception process has exited so the requested information is not available at system diagnostics process get processname at microsoft extensions hosting windowsservices windowsservicehelpers iswindowsservice at program string args in d test program cs line a reproducing sample program 
is quite short but it only crashes on windows server and when registered and executed as a service starting the same code directly from the console works fine public class program public static async task main string args var isservice windowsservicehelpers iswindowsservice if isservice this issue becomes critical now because it will make the version unusable for windows services on windows server
1
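The record above describes a process-name query that throws when the target process is more privileged than the caller. A minimal sketch of the defensive pattern (not the .NET runtime's actual fix — `ProcessInfo` and `safe_process_name` are hypothetical names for illustration) is to wrap the failing query and fall back instead of crashing the host:

```python
class ProcessInfo:
    """Toy stand-in (hypothetical) for a process handle whose name query
    can fail for privileged processes, as in the report above."""

    def __init__(self, pid, name, accessible=True):
        self.pid = pid
        self._name = name
        self._accessible = accessible

    @property
    def process_name(self):
        if not self._accessible:
            # Mirrors the InvalidOperationException in the report
            raise PermissionError("requested information is not available")
        return self._name


def safe_process_name(proc, fallback="<unknown>"):
    """Return the process name if readable, else a fallback, so callers
    (e.g. a service-detection helper) are not taken down by the query."""
    try:
        return proc.process_name
    except PermissionError:
        return fallback
```

Callers such as the `IsWindowsService`-style check in the report could then degrade gracefully when the name is unreadable rather than propagating the exception.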
6,222
3,352,793,781
IssuesEvent
2015-11-18 00:38:03
hypatia-engine/hypatia
https://api.github.com/repos/hypatia-engine/hypatia
opened
Branch cleanup
code review request documentation enhancement no programming required
Please offer an elaboration as to what each branch is, along with possible links/resources; we'll get a pull request going to fix these branches up, merging or discarding them.
1.0
Branch cleanup - Please offer an elaboration as to what each branch is, along with possible links/resources; we'll get a pull request going to fix these branches up, merging or discarding them.
non_process
branch cleanup please offer an elaboration as to what each branch is possible links resources we ll get a pull request going fixing these branches up merging them or discarding them
0
206,517
16,044,764,304
IssuesEvent
2021-04-22 12:26:45
geosolutions-it/geonode
https://api.github.com/repos/geosolutions-it/geonode
closed
documentation is missing for "Install GDAL for Development"
Documentation
**Description:** In order to install GeoNode for development, there will be a step to install GDAL for development too. The referred link: https://training.geonode.geo-solutions.it/005_dev_workshop/004_devel_env/gdal_install.html is outdated and doesn't lead to a functioning installation for either GeoNode or GDAL. This step will be blocking issue #420 **Solution:** To install GDAL for development so that it can serve GeoNode and document these steps to be served at the official GeoNode documentation **Notes:** The format/methodology followed in geo-solutions slides to install GDAL can be used in the newly created documentation. e.g. installing GDAL using virtualenv
1.0
documentation is missing for "Install GDAL for Development" - **Description:** In order to install GeoNode for development, there will be a step to install GDAL for development too. The referred link: https://training.geonode.geo-solutions.it/005_dev_workshop/004_devel_env/gdal_install.html is outdated and doesn't lead to a functioning installation for either GeoNode or GDAL. This step will be blocking issue #420 **Solution:** To install GDAL for development so that it can serve GeoNode and document these steps to be served at the official GeoNode documentation **Notes:** The format/methodology followed in geo-solutions slides to install GDAL can be used in the newly created documentation. e.g. installing GDAL using virtualenv
non_process
documentation is missing for install gdal for development description in order to install geonode for development there will be a step to install gdal for development too the referred link is outdated and doesn t lead to a functioning installation neither for geonode nor for gdal this step will be blocking issue solution to install gdal for development so that it can serve geonode and document these steps to be served at the official geonode documentation notes the format methodology followed in geo solutions slides to install gdal can be used in the newly created documentation e g installing gdal using virtualenv
0
15,032
18,755,095,548
IssuesEvent
2021-11-05 09:42:57
prisma/prisma
https://api.github.com/repos/prisma/prisma
opened
Error: [introspection-engine\connectors\sql-introspection-connector\src\re_introspection.rs:378:14] Could not find relation field Transaction on model Account.
bug/1-repro-available kind/bug process/candidate topic: error reporting team/migrations
<!-- If required, please update the title to be clear and descriptive --> This could be related to the recent relation lifting refactoring. Command: `prisma db pull` Version: `3.4.0` Binary Version: `1c9fdaa9e2319b814822d6dbfd0a69e1fcc13a85` Report: https://prisma-errors.netlify.app/report/13562 OS: `x64 win32 10.0.22000` JS Stacktrace: ``` Error: [introspection-engine\connectors\sql-introspection-connector\src\re_introspection.rs:378:14] Could not find relation field Transaction on model Account. at ChildProcess.<anonymous> (D:\work\open_source\financeNext\node_modules\prisma\build\index.js:45755:30) at ChildProcess.emit (events.js:376:20) at Process.ChildProcess._handle.onexit (internal/child_process.js:277:12) ``` Rust Stacktrace: ``` 0: <unknown> 1: <unknown> 2: <unknown> 3: <unknown> 4: <unknown> 5: <unknown> 6: <unknown> 7: <unknown> 8: <unknown> 9: <unknown> 10: <unknown> 11: <unknown> 12: <unknown> 13: <unknown> 14: <unknown> 15: <unknown> 16: <unknown> 17: <unknown> 18: <unknown> 19: <unknown> 20: <unknown> 21: <unknown> 22: <unknown> 23: <unknown> 24: <unknown> 25: <unknown> 26: BaseThreadInitThunk 27: RtlUserThreadStart ```
1.0
Error: [introspection-engine\connectors\sql-introspection-connector\src\re_introspection.rs:378:14] Could not find relation field Transaction on model Account. - <!-- If required, please update the title to be clear and descriptive --> This could be related to the recent relation lifting refactoring. Command: `prisma db pull` Version: `3.4.0` Binary Version: `1c9fdaa9e2319b814822d6dbfd0a69e1fcc13a85` Report: https://prisma-errors.netlify.app/report/13562 OS: `x64 win32 10.0.22000` JS Stacktrace: ``` Error: [introspection-engine\connectors\sql-introspection-connector\src\re_introspection.rs:378:14] Could not find relation field Transaction on model Account. at ChildProcess.<anonymous> (D:\work\open_source\financeNext\node_modules\prisma\build\index.js:45755:30) at ChildProcess.emit (events.js:376:20) at Process.ChildProcess._handle.onexit (internal/child_process.js:277:12) ``` Rust Stacktrace: ``` 0: <unknown> 1: <unknown> 2: <unknown> 3: <unknown> 4: <unknown> 5: <unknown> 6: <unknown> 7: <unknown> 8: <unknown> 9: <unknown> 10: <unknown> 11: <unknown> 12: <unknown> 13: <unknown> 14: <unknown> 15: <unknown> 16: <unknown> 17: <unknown> 18: <unknown> 19: <unknown> 20: <unknown> 21: <unknown> 22: <unknown> 23: <unknown> 24: <unknown> 25: <unknown> 26: BaseThreadInitThunk 27: RtlUserThreadStart ```
process
error could not find relation field transaction on model account this could be related to the recent relation lifting refactoring command prisma db pull version binary version report os js stacktrace error could not find relation field transaction on model account at childprocess d work open source financenext node modules prisma build index js at childprocess emit events js at process childprocess handle onexit internal child process js rust stacktrace basethreadinitthunk rtluserthreadstart
1
12,920
15,294,220,244
IssuesEvent
2021-02-24 02:00:00
carterclark/GroceryStore
https://api.github.com/repos/carterclark/GroceryStore
closed
Remove a member
Business Process
If a valid id is received, the corresponding member is removed; the system would need the member’s id for this purpose. Only one member is removed when this functionality is invoked.
1.0
Remove a member - If a valid id is received, the corresponding member is removed; the system would need the member’s id for this purpose. Only one member is removed when this functionality is invoked.
process
remove a member if a valid id is received the corresponding member is removed the system would need the member’s id for this purpose only one member is removed when this functionality is invoked
1
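The business rule in the record above (given a valid id, remove exactly one member; otherwise remove none) can be sketched directly. The data shape and function name below are assumptions for illustration, not the repository's actual API:

```python
def remove_member(members, member_id):
    """Remove exactly one member matching member_id.

    `members` is a dict mapping id -> member record (hypothetical shape).
    Returns the removed member, or None when the id is not valid;
    in that case the collection is left unchanged.
    """
    return members.pop(member_id, None)
```

Using `dict.pop` with a default guarantees the "only one member is removed" invariant: a hit removes that single entry, a miss touches nothing.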
21,467
29,501,363,222
IssuesEvent
2023-06-02 22:18:28
mrdoob/three.js
https://api.github.com/repos/mrdoob/three.js
closed
TAARenderPass: Pass over-brightens scene, shows occasional black frames
Examples Post-processing
**Describe the bug** When using TAARenderPass, certain valid settings result in incorrect behavior: - `scene.background` — any non-black background will cause the final scene to over-brighten during TAA accumulation - `pass.sampleLevel` — any non-zero level will cause a single-frame flash of black when the accumulation phase begins **To Reproduce** Steps to reproduce the behavior: 1. Open `webgl_postprocessing_taa` locally 2. Add a background color like `scene.background = new THREE.Color( 0xEE5555 );` 3. Increase TAASampleLevel (any level > 0) Observe (1) the scene brightens when it stops moving and begins accumulating, and (2) the scene flashes to black for a single frame at that time. It may be easier to see the flash when the background is kept black. **Expected behavior** TAARenderPass should be compatible with non-black scene backgrounds, and should not show a black frame while accumulating. **Screenshots** If applicable, add screenshots to help explain your problem (drag and drop the image). **Platform:** - Device: Desktop - OS: macOS 12.4 - Browser: Chrome 103.0.5060.53 - Three.js version: r142
1.0
TAARenderPass: Pass over-brightens scene, shows occasional black frames - **Describe the bug** When using TAARenderPass, certain valid settings result in incorrect behavior: - `scene.background` — any non-black background will cause the final scene to over-brighten during TAA accumulation - `pass.sampleLevel` — any non-zero level will cause a single-frame flash of black when the accumulation phase begins **To Reproduce** Steps to reproduce the behavior: 1. Open `webgl_postprocessing_taa` locally 2. Add a background color like `scene.background = new THREE.Color( 0xEE5555 );` 3. Increase TAASampleLevel (any level > 0) Observe (1) the scene brightens when it stops moving and begins accumulating, and (2) the scene flashes to black for a single frame at that time. It may be easier to see the flash when the background is kept black. **Expected behavior** TAARenderPass should be compatible with non-black scene backgrounds, and should not show a black frame while accumulating. **Screenshots** If applicable, add screenshots to help explain your problem (drag and drop the image). **Platform:** - Device: Desktop - OS: macOS 12.4 - Browser: Chrome 103.0.5060.53 - Three.js version: r142
process
taarenderpass pass over brightens scene shows occasional black frames describe the bug when using taarenderpass certain valid settings result in incorrect behavior scene background — any non black background will cause the final scene to over brighten during taa accumulation pass samplelevel — any non zero level will cause a single frame flash of black when the accumulation phase begins to reproduce steps to reproduce the behavior open webgl postprocessing taa locally add a background color like scene background new three color increase taasamplelevel any level observe the scene brightens when it stops moving and begins accumulating and the scene flashes to black for a single frame at that time it may be easier to see the flash when the background is kept black expected behavior taarenderpass should be compatible with non black scene backgrounds and should not show a black frame while accumulating screenshots if applicable add screenshots to help explain your problem drag and drop the image platform device desktop os macos browser chrome three js version
1
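One plausible mechanism for the single black frame reported above — offered as a sketch, not three.js's actual TAARenderPass code — is a temporal-accumulation buffer that is cleared to black and shown to the screen before the first sample has been blended in:

```python
def accumulate(frames):
    """Running average of per-frame samples (each frame reduced to one
    luminance value for illustration). Returns what would be displayed
    after each step. If the display reads the buffer before the first
    sample is blended, the cleared value (0.0 = black) leaks out as a
    one-frame flash.
    """
    accum = 0.0           # accumulation buffer cleared to black
    shown = [accum]       # displayed before any sample arrives -> black frame
    for n, f in enumerate(frames, start=1):
        accum += (f - accum) / n   # incremental mean over n samples
        shown.append(accum)
    return shown
```

Under this model the fix is to either skip presenting the buffer until the first sample lands, or seed the buffer with the first rendered frame; a similar weighting mistake (adding a non-black background into only some terms of the average) would account for the over-brightening.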
281,813
21,315,438,741
IssuesEvent
2022-04-16 07:27:44
yusufaine/pe
https://api.github.com/repos/yusufaine/pe
opened
[docs-dg] formatting with page breaks
severity.VeryLow type.DocumentationBug
Small nit, similar to the UG, the content is really well-written but could be improved by using page breaks to avoid issues like these: ![image.png](https://raw.githubusercontent.com/yusufaine/pe/main/files/8d2080b8-0f19-4d49-ba3a-5d734cf950e4.png) ![image.png](https://raw.githubusercontent.com/yusufaine/pe/main/files/62290def-7659-4924-a1c3-ff3c31dac561.png) <!--session: 1650093397954-d3d4204c-1ddc-4073-87ff-1884e509bbb4--> <!--Version: Web v3.4.2-->
1.0
[docs-dg] formatting with page breaks - Small nit, similar to the UG, the content is really well-written but could be improved by using page breaks to avoid issues like these: ![image.png](https://raw.githubusercontent.com/yusufaine/pe/main/files/8d2080b8-0f19-4d49-ba3a-5d734cf950e4.png) ![image.png](https://raw.githubusercontent.com/yusufaine/pe/main/files/62290def-7659-4924-a1c3-ff3c31dac561.png) <!--session: 1650093397954-d3d4204c-1ddc-4073-87ff-1884e509bbb4--> <!--Version: Web v3.4.2-->
non_process
formatting with page breaks small nit similar to the ug the content is really well written but could be improved by using page breaks to avoid issues like these
0
21,164
28,137,849,295
IssuesEvent
2023-04-01 15:45:22
FOLIO-FSE/folio_migration_tools
https://api.github.com/repos/FOLIO-FSE/folio_migration_tools
closed
In the task configuration, rename NeverUpdateHridSettings (True/False) to UpdateHridSettings (True/False)
Simplify migration process
To avoid double negations, consider renaming the configuration NeverUpdateHridSettings (True/False) to UpdateHridSettings (True/False). Note that this change will "reverse" the configuration, i.e. in cases where it was True it will need to be changed to False and vice versa.
1.0
In the task configuration, rename NeverUpdateHridSettings (True/False) to UpdateHridSettings (True/False) - To avoid double negations, consider renaming the configuration NeverUpdateHridSettings (True/False) to UpdateHridSettings (True/False). Note that this change will "reverse" the configuration, i.e. in cases where it was True it will need to be changed to False and vice versa.
process
in the task configuration rename neverupdatehridsettings true false to updatehridsettings true false to avoid double negations consider renaming the configuration neverupdatehridsettings true false to updatehridsettings true false note that this change will reverse the configuration i e in cases where it was true it will need to be changed to false and vice versa
1
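The rename above "reverses" the flag, so existing task configurations need their boolean inverted, not just the key renamed. A minimal migration sketch (key names taken from the record; the function itself is hypothetical):

```python
def migrate_task_config(config):
    """Migrate the renamed flag in a task-configuration mapping.

    The double-negative NeverUpdateHridSettings becomes the positive
    UpdateHridSettings, so the value must be inverted: True -> False
    and vice versa. Configs without the old key are returned unchanged.
    """
    new = dict(config)
    if "NeverUpdateHridSettings" in new:
        new["UpdateHridSettings"] = not new.pop("NeverUpdateHridSettings")
    return new
```
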
156,728
19,903,458,257
IssuesEvent
2022-01-25 10:20:07
sultanabubaker/octopus-master
https://api.github.com/repos/sultanabubaker/octopus-master
closed
CVE-2019-20922 (High) detected in handlebars-4.0.10.tgz - autoclosed
security vulnerability
## CVE-2019-20922 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>handlebars-4.0.10.tgz</b></p></summary> <p>Handlebars provides the power necessary to let you build semantic templates effectively with no frustration</p> <p>Library home page: <a href="https://registry.npmjs.org/handlebars/-/handlebars-4.0.10.tgz">https://registry.npmjs.org/handlebars/-/handlebars-4.0.10.tgz</a></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/handlebars/package.json,/start-preset-idea/node_modules/handlebars/package.json</p> <p> Dependency Hierarchy: - octopus-start-preset-idea-0.0.4.tgz (Root Library) - :x: **handlebars-4.0.10.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/sultanabubaker/octopus-master/commit/fe409ca2bb6102addf56e0caf3b48bb9726d71f3">fe409ca2bb6102addf56e0caf3b48bb9726d71f3</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Handlebars before 4.4.5 allows Regular Expression Denial of Service (ReDoS) because of eager matching. The parser may be forced into an endless loop while processing crafted templates. This may allow attackers to exhaust system resources. 
<p>Publish Date: 2020-09-30 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-20922>CVE-2019-20922</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://www.npmjs.com/advisories/1300">https://www.npmjs.com/advisories/1300</a></p> <p>Release Date: 2020-09-30</p> <p>Fix Resolution: handlebars - 4.4.5</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"handlebars","packageVersion":"4.0.10","packageFilePaths":["/package.json"],"isTransitiveDependency":true,"dependencyTree":"octopus-start-preset-idea:0.0.4;handlebars:4.0.10","isMinimumFixVersionAvailable":true,"minimumFixVersion":"handlebars - 4.4.5","isBinary":false}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2019-20922","vulnerabilityDetails":"Handlebars before 4.4.5 allows Regular Expression Denial of Service (ReDoS) because of eager matching. The parser may be forced into an endless loop while processing crafted templates. 
This may allow attackers to exhaust system resources.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-20922","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
True
CVE-2019-20922 (High) detected in handlebars-4.0.10.tgz - autoclosed - ## CVE-2019-20922 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>handlebars-4.0.10.tgz</b></p></summary> <p>Handlebars provides the power necessary to let you build semantic templates effectively with no frustration</p> <p>Library home page: <a href="https://registry.npmjs.org/handlebars/-/handlebars-4.0.10.tgz">https://registry.npmjs.org/handlebars/-/handlebars-4.0.10.tgz</a></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/handlebars/package.json,/start-preset-idea/node_modules/handlebars/package.json</p> <p> Dependency Hierarchy: - octopus-start-preset-idea-0.0.4.tgz (Root Library) - :x: **handlebars-4.0.10.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/sultanabubaker/octopus-master/commit/fe409ca2bb6102addf56e0caf3b48bb9726d71f3">fe409ca2bb6102addf56e0caf3b48bb9726d71f3</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Handlebars before 4.4.5 allows Regular Expression Denial of Service (ReDoS) because of eager matching. The parser may be forced into an endless loop while processing crafted templates. This may allow attackers to exhaust system resources. 
<p>Publish Date: 2020-09-30 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-20922>CVE-2019-20922</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://www.npmjs.com/advisories/1300">https://www.npmjs.com/advisories/1300</a></p> <p>Release Date: 2020-09-30</p> <p>Fix Resolution: handlebars - 4.4.5</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"handlebars","packageVersion":"4.0.10","packageFilePaths":["/package.json"],"isTransitiveDependency":true,"dependencyTree":"octopus-start-preset-idea:0.0.4;handlebars:4.0.10","isMinimumFixVersionAvailable":true,"minimumFixVersion":"handlebars - 4.4.5","isBinary":false}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2019-20922","vulnerabilityDetails":"Handlebars before 4.4.5 allows Regular Expression Denial of Service (ReDoS) because of eager matching. The parser may be forced into an endless loop while processing crafted templates. 
This may allow attackers to exhaust system resources.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-20922","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
non_process
cve high detected in handlebars tgz autoclosed cve high severity vulnerability vulnerable library handlebars tgz handlebars provides the power necessary to let you build semantic templates effectively with no frustration library home page a href path to dependency file package json path to vulnerable library node modules handlebars package json start preset idea node modules handlebars package json dependency hierarchy octopus start preset idea tgz root library x handlebars tgz vulnerable library found in head commit a href found in base branch master vulnerability details handlebars before allows regular expression denial of service redos because of eager matching the parser may be forced into an endless loop while processing crafted templates this may allow attackers to exhaust system resources publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution handlebars isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree octopus start preset idea handlebars isminimumfixversionavailable true minimumfixversion handlebars isbinary false basebranches vulnerabilityidentifier cve vulnerabilitydetails handlebars before allows regular expression denial of service redos because of eager matching the parser may be forced into an endless loop while processing crafted templates this may allow attackers to exhaust system resources vulnerabilityurl
0
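The vulnerability class in the CVE record above (ReDoS via eager matching) can be demonstrated generically with a nested-quantifier regular expression. The pattern below is a textbook illustration, not handlebars' actual grammar:

```python
import re

# Classic catastrophically-backtracking pattern: the nested quantifiers
# let the engine try exponentially many ways to split the run of 'a's
# before concluding that a trailing 'b' cannot match.
EVIL = re.compile(r"^(a+)+$")

def matches(s):
    """True if s consists only of one or more 'a' characters."""
    return EVIL.fullmatch(s) is not None
```

On a small input like `"aaab"` the failure is instant, but a non-matching input such as `"a" * 30 + "b"` forces roughly 2^30 backtracking attempts in a backtracking engine, which is the resource-exhaustion behavior the advisory describes.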