Dataset schema (15 columns):

| column | dtype | range / distinct values |
|---|---|---|
| Unnamed: 0 | int64 | 0 to 832k |
| id | float64 | 2.49B to 32.1B |
| type | string | 1 class |
| created_at | string | length 19 |
| repo | string | length 7 to 112 |
| repo_url | string | length 36 to 141 |
| action | string | 3 classes |
| title | string | length 1 to 744 |
| labels | string | length 4 to 574 |
| body | string | length 9 to 211k |
| index | string | 10 classes |
| text_combine | string | length 96 to 211k |
| label | string | 2 classes |
| text | string | length 96 to 188k |
| binary_label | int64 | 0 to 1 |

Sample rows:
**Row 255,263** | id 27,484,830,019 | IssuesEvent | created_at: 2023-03-04 01:23:35
- repo: panasalap/linux-4.1.15 (https://api.github.com/repos/panasalap/linux-4.1.15)
- action: opened
- title: CVE-2018-10902 (High) detected in linux179e72b561d3d331c850e1a5779688d7a7de5246
- labels: security vulnerability
- body:
## CVE-2018-10902 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux179e72b561d3d331c850e1a5779688d7a7de5246</b></summary>
<p>
<p>Linux kernel stable tree mirror</p>
<p>Library home page: <a href=https://github.com/gregkh/linux.git>https://github.com/gregkh/linux.git</a></p>
<p>Found in base branch: <b>master</b></p>
</details>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/sound/core/rawmidi.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/sound/core/rawmidi.c</b>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
It was found that the raw midi kernel driver does not protect against concurrent access which leads to a double realloc (double free) in snd_rawmidi_input_params() and snd_rawmidi_output_status() which are part of snd_rawmidi_ioctl() handler in rawmidi.c file. A malicious local attacker could possibly use this for privilege escalation.
<p>Publish Date: 2018-08-21
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2018-10902>CVE-2018-10902</a></p>
</p>
</details>
<p></p>
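To make the race concrete, here is a deliberately buggy C++ sketch of the pattern described above (hypothetical userspace code with made-up names, not the kernel driver): two threads resize a shared buffer without a lock, so both can observe and then free the same old allocation.

```cpp
// Deliberately racy illustration (undefined behavior by design).
// Two unsynchronized resizers can both read the same old pointer and
// both free it -- the double-free pattern described above for
// snd_rawmidi_ioctl(). All names here are hypothetical.
#include <algorithm>
#include <cstddef>
#include <cstring>
#include <thread>

struct Buffer {
    char        *data = nullptr;
    std::size_t  size = 0;
};

Buffer g_buf;  // shared, unprotected state

void resize_unlocked(std::size_t new_size) {
    char *fresh = new char[new_size]();
    std::memcpy(fresh, g_buf.data, std::min(g_buf.size, new_size));
    delete[] g_buf.data;  // a second racing thread can free the same block
    g_buf.data = fresh;
    g_buf.size = new_size;
}

int main() {
    g_buf.data = new char[16]();
    g_buf.size = 16;
    std::thread a(resize_unlocked, std::size_t{32});
    std::thread b(resize_unlocked, std::size_t{64});
    a.join();
    b.join();
    delete[] g_buf.data;
    return 0;
}
```

Per the advisory the fix shipped in v4.18-rc6; the description above implies it adds the missing synchronization around these resize paths.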
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2018-10902">http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2018-10902</a></p>
<p>Release Date: 2018-08-21</p>
<p>Fix Resolution: v4.18-rc6</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
- index: True
- text_combine: title + body (verbatim duplicate of the title and body above)
- label: non_process
- text (preprocessed):
cve high detected in cve high severity vulnerability vulnerable library linux kernel stable tree mirror library home page a href found in base branch master vulnerable source files sound core rawmidi c sound core rawmidi c vulnerability details it was found that the raw midi kernel driver does not protect against concurrent access which leads to a double realloc double free in snd rawmidi input params and snd rawmidi output status which are part of snd rawmidi ioctl handler in rawmidi c file a malicious local attacker could possibly use this for privilege escalation publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with mend
- binary_label: 0

**Row 21,113** | id 28,076,306,028 | IssuesEvent | created_at: 2023-03-30 00:07:43
- repo: allinurl/goaccess (https://api.github.com/repos/allinurl/goaccess)
- action: closed
- title: nginx log_json format
- labels: bug log/date/time format log-processing JSON
- body:
# cat goaccess.conf
```
time-format %T
date-format %Y-%m-%d
log-format %^:"~h{,}",%^:"%h",%^:%^,%^:%^,%^:"%dT%t+%^",%^:"%m",%^:"%U",%^:%^,%^:"%R",%^:%^,%^:"%T",%^:"%U",%^:%^,%^:%^
```
# grep log_format nginx.conf
`log_format log_json '{"accessIpList":"$proxy_add_x_forwarded_for","clientIp":"$remote_addr","cookie":"$http_cookie","httpHost":"$host","timestamp":"$time_iso8601","method":"$request_method","url":"$request_uri","status":$status,"httpReferer":"$http_referer","bodyBytesSent":$body_bytes_sent,"requestTime":"$request_time","ua":"$http_user_agent","totalBytesSent":$bytes_sent,"serverIp":"$server_addr"}';`
# sed -n '15079,15081p' access.log-20230314
```
{"accessIpList":"113.24.224.111, 113.96.58.76","clientIp":"113.24.224.111","cookie":"-","httpHost":"hxooe.ooef.cn","timestamp":"2023-03-13T11:04:02+08:00","method":"GET","url":"/?source=bd&plan=LPP-mb%E3%80%902.16%2Bocpc%2Bjingpin%E3%80%91&unit=mb%E3%80%902.16%2Bocpc%2Bjingpin%E3%80%91&keyword=95skinskxw95-2.0","status":200,"httpReferer":"http://www.baidu.com/s?wd=feed9cc","bodyBytesSent":37434,"requestTime":"0.000","ua":"Mozilla/5.0 (Windows NT 6.1; WOW64; rv:43.0) Gecko/20100101 Firefox/43.0","totalBytesSent":37698,"serverIp":"172.16.0.3"}
{"accessIpList":"223.102.162.140, 113.96.58.94","clientIp":"223.102.162.140","cookie":"-","httpHost":"hxooe.ooef.cn","timestamp":"2023-03-13T11:04:05+08:00","method":"GET","url":"/?source=baidu&plan=csgokaixiang-yd&unit=csgokaixiang&keyword=csgo%E5%BC%80%E7%AE%B1%E7%BD%91&e_creative=66212592778&e_keywordid=509204241374&e_keywordid2=509204241374&bd_vid=11374240533623204504","status":200,"httpReferer":"https://m.baidu.com/baidu.php?url=0f00000uEDLSpLgiCVWyeVzxwpkDjakngDUkkgGaywMa2-ASWFrpUlrEUvFdwdnGQQdxZRQSTWDadtpnFVjf-CCB_08Cy56tnxKxs5P-iElkBv0mKBFwj-C-YEu6gYqprCuHdb7SwhuH3XvNUjyoWdwjfDqfr5rRAJIGMxdCMGv06N238hjCz2FDfqlle5tfpRjr8A0u9fnks8ZW26GB0IAdpwTI.DY_iNI7YQnpx2Ao01hCmZRDsdTbZXPvap7Q7erQKdkYymhcmqH8d-OfqMPMYgITDHljDnXGLpzOCX1hlqJIZ0lp4BA5Rej4e_5otrZuu9tOZjex5jld3x5ksSEzselSJXyAuQPKOYFmlXMFgbLUtMW3erJXgVHC3ZHgmhPdhOFuvurzEW3qMN5QvGYTjGo_ywWIuEzyIhH_o3S5m_Hj_TIIhHkLeVv-muCyrz1Fkz20.U1Yk0ZDqdXUNOUBY8TydknUhE252dV5r0A7bTgbqmLPMUeSJ1SHPS0K9uZ7Y5Hc0TA-W5H00IjdWTvIEElo54nL30A-V5HcznsKM5gIzm6KdpHY0TA-b5Hck0APGujYkPjf0Ugfqn10srNt1nj03g1nvnj7xnH0kPdtznjRk0AVG5H00TMfqPWR0mhbqnW0dg1csP-tdnjPxP10Yg1csnj0zg1csnjm10AdW5HDsnNtknj7xn1msnNtknjDLg1csPHD0TgKGujYs0Z7Wpyfqn0KzuLw9u1Ys0Aq9pyfquhNhuhw-ujTVP1ubuBY3rHnLQyubuAcVP19-PvFhuAwbn1P90A7B5HKxn0K-ThTqn0KsTjYs0A4vTjYsQW0snj0snj0s0AdYTjYs0AwbUL0qn0KzpWYk0Aw-IWdLpgP-0AuY5Hc0TA6qn0KET1Yz0AFL5Hf0UMfqnfK1XWY1nWKxnH0snfK9TdqGuAnquj0VmhwbX0KGuAnqiDFK0ZKCXgIC5HDsrjKxnWnYn0KspZw45fKYmgFMugfqPWPxn7tdnj00IZN15H6znH6LPjczPjTYn1RLPH6zn1f0ThNkIjYkPWT3PWTvPWnYnWnY0A-YUHYvPHR1P6K8IZws5Hn0mv6qUZNxTZPxmgKs0Zwd5H6zPHfdP1b0T1dhnA79uW7hmH9hPAfdmW-h0ZwV5H00mvmqnfKzmWYk0AkdpvbqnfKWUMw85ycknWbYrj7WgvPsT6K1TL0qn6K1TL0z5H00IZws5Hn0UZN15HP9m1ckuWfkn17BnWTvPhf0UZN1IjYvPjK-rHc3mfK_IyPY5HmYnAR4nWbz0ZPWuHYs0A7sT7q1pyfqmWDzrHf3nyn0UZNxpywW5g-h0Zwzmyw-nHYs0Zwzmyw-nWYs0AwYTjYz0A7bmvk9TLnqnW60myw35H00TvwogLmqnfKLpHYY0Au1mv9VujYz0Zwb5H00uMcqnWD0TLcqnHfY0AF-TvsqQHD0UZNLIZcqnWcYn1RznjcsPjnznHfknH6zP6KdThsqpZwYTjCEQ1wBuAcYmiYVmWDzrHf3nyn8mvqVQsK1pyfqmHf3uAmsrjR3P10vrAn4r0KWTvYqwbFArHNKwHDzfWD3f1cvPsK9m1Yk0ZK85H00TydY5H00Tyd15H00uANYgvPsmHY1n0KlIjYs0AdWgvuzUvYqn7tsg1Kxn7ts0Aw9UMNBuNqsUA78pyw15HT3nWbvP1FxP16zrHmsn-tsg1Kxn0Ksmgwxuhk9u1Ys0APY5HcsPH0zc1R1g1csn10zc1IxnW0knjTWn7tznjDsPansg1csnH0zc1m0uAPGujYs0ANYpyfqQHD0mgPsmvnqn0KdTA-8mvnqn0KkUymqn0KhmLNY5H00pgPWUjYs0A7buhk9u1Yk0Akhm1Ys0AwWmvfqnYmYPbDkPWckPWc1n1cLP1FAfWKjfW9DwHm3n1NanjKtn0KYTh7buHYs0AFbpyfqrHn4fW7KPDujwbfzfbujwRRznHbdwjPKrHm4P1TLfYc0IvuzUvYqnH0zP1bdPyY0UvnqnfKBIjYs0Aq9IZTqn0KEIjYk0AqzTZfqnBnsc1nWninznj01n1ckPjR1c1nYnj0Wn1fsna3sn1n4n1mWnznkc10WQinsQWDsrjTsPBnsQW6dnj0snankc10Wna3snj0snj00mh78pv7Wm1YknWDWnHcs0Z7xIWYsQWf3g108njPxna3sn-tsQWDdg108njKxn7tsQW0sg100mMPxTZFEuA-b5H00ThqGuhk9u1Ys0ZFYmy-b5fKWIWY0pgPxmLK95H00mL0qn0K-TLfqn0KWThnqPjmdPHf&ai=0_2012005920_1_0&word=&qid=a48df08587068c98&bdrank=0&rank=1&sourceid=111&placeid=1&sht=1027955m&shh=m.baidu.com&ck=3037.2.353.129.345.269.0.0.209.0.0&us=0.0.0.0.0.0.0.20108","bodyBytesSent":37434,"requestTime":"0.000","ua":"Mozilla/5.0 (Linux; Android 10; SPN-AL00 Build/HUAWEISPN-AL00; wv) AppleWebKit/537.36 (KHTML, like Gecko) Version/4.0 Chrome/88.0.4324.93 Mobile Safari/537.36 Mobads","totalBytesSent":37698,"serverIp":"172.16.0.3"}
{"accessIpList":"unknown, 61.240.129.30, 113.96.58.75","clientIp":"","cookie":"AGL_USER_ID=667c7854-fe5c-4b16-9cd4-c5c7fc71ab38; 53gid2=11656049891000; 53revisit=1674112818794; Hm_lvt_df5192b95cbbcc58820974673e44eab5=1676460860,1676872173; Hm_lvt_2ed30596ae2989f61a29cebc04960edd=1676460860,1676872173","httpHost":"hxooe.ooef.cn","timestamp":"2023-03-13T11:04:05+08:00","method":"GET","url":"/?source=baidu&plan=LPP-PC%E3%80%902.6%2Bocpc%2Bzonghecihui%E3%80%91&unit=PC%E3%80%902.6%2Bocpc%2Bzonghecihui%E3%80%91&keyword=kxwz&e_creative=71440802679&e_keywordid=554265605137&bd_vid=8973616287845308454","status":200,"httpReferer":"https://www.baidu.com/baidu.php?url=0f00000uEDLSpLgiCMzPScAsu8qqdZEnJxAvzEgxnAPgq3x7BFHnhzoLQPVu6imvSyB48APIoTuWj8_hdMJta2FPeSLOz8X8_MUx_iW5oPP_1mPwkHa9wwURCUkw6Y-xDuROruj72GVH62Cd12zUs0mWgkKJrnN9BevAbIkpWfosSLPHQYQmUx_1xKLHf1kPtJSGhrmM_QoR6BMEHQvBxLn6ZG48.7D_iNI7YQnpxQ6l5mMgzfI3LmFCR_g_3_ZgZwuBmOgSUtvxtLs3x5W9vymIqhHzlTrHdIjW9s45g9IhJHaSo0Oj4tTrz1jEol3cMYA51xj9tSrZuu9Lvu5QGs45NYrrZjbqX7vpGnQl_jt5MkseQnrSEovGYTjGohrLC3rZJM-9hi1fxqay5AE_eQP-hHGtION5QvGYTjGo_HAeIvurMIYvUJS5B_HZalWjDh9zs34-9h9m3S5Hvu70.U1Yk0ZDqIMP1pv-8TeSJ1SHPS0Kspynqn0KY5USJ1SHPSPyS0A-V5HDYPWD0u1dEuZCk0ZNG5yF9pywd0ZKGujYknsKWpyfqn1cL0AdY5HnznW9xn1DzP7tknjDLg1csPH7xnH0krNt1PW0kg1bkP160pvbqn0KzIjYLnW00mhbqnW01g1csP7tdnjn0UynqnH0kg1DsnNtknjDLg1csPH7xnH0krNtsg100TgKGujYs0Z7Wpyfqn0KzuLw9u1Ys0A7B5HKxn0K-ThTqn0KsTjYs0A4vTjYsQW0snj0snj0s0AdYTjYs0AwbUL0qn0KzpWYs0Aw-IWdsmsKhIjYs0ZKC5H00ULnqn0KBI1Ykn0K8IjYs0ZPl5fK9TdqGuAnqTZnVmhwbX0KGuAnqiDF70ZKCIZbq0Zw9ThI-IjY1nNt1nHFxr0KYIgnqnWnvnHfdnH03njbLPWTzrjmdP6Kzug7Y5HDvP16vP1mvn104P100pgwV5H00UMwYTjYk0APC5ykdgLK10Zwd5gRvrjc4PjDs0ZnquWbLnHNbuHfLuj-hPhwhn0KYUHYknj61P10L0APh5HD0ThcqnfK_IyVG5HDs0AP8IA3qXhq_mv4-I7qWTZc0TLPs5HD0TLPsnWYk0ZwYTjYk0AkdT1YYmymdmhfYrHmkujmknhR10AkdTLfqPWfsuHbzP1R0UZNWIjYvPjK-rHc4n0K1mvRqPsK_INqGuAnqXym0IZF9uARk5H00IZF9uARz5H00uZws5HD0mywWUA71T1Ys0ZIG5Hf0uMPWpAdb5Hc0IAfqPW6zrHfkn0KhTWYznfK1TWYkPHR0mhN1UjYVnfK_IgIYTWYzn1Rsn1mknHfYnWDznHfdP1mL0AN3TA-b5Hcsnj04n-tznjDkPHIxnW0krjm4g1csnWDLnNtznjn4nHNxnW0vnjDYg1csPWnkrNtznjm4P1FxnW0LP1DLg1csrHDsn-tznjbdP100IgF_5y9YIZK1rBtEXA-9Uv9dmi4lUvs8mvqVQhP8Qvw-IA7GUjmsQ1R4nWbkQh9YUys0Tv-b5yNbuWDsPAnznj0snyRvPvn0mLPV5H6dfbPjPWRkn1b1PYFDwWR0mynqnfKsUWYk0Z7VIjYs0Z7VT1Ys0Aw-I7qWTADqn0KlIjYz0AdWgvuzUvYqn7tsg1Kxn7tsg1Kxn0Kbmy4dmhNxTAk9Uh-bT1Y3nHckrHDdg16knWD3nHKxn7tsg1Kxn7ts0ZK9I7qhUA7M5H00uAPGujYs0ANYpyfqQHD0mgPsmvnqn0KdTA-8mvnqn0KkUymqn0KhmLNY5H00pgPWUjYs0A7buhk9u1Yk0Akhm1Ys0AwWmvfq0Zwzmyw-5H00mhwGujdDwWRdrH0YwW0dPjF7nYPAP10vn1cYwWD3PjfzPHPjr0KEm1Yk0AFY5H00Uv7YI1Ys0AqY5HD0ULFsIjYzc10Wnznkc1czrHnvnjTvnWcWPWcsnanvnW0sQW0snj0snan1c1DWnanVc108nHfdnj0Yc108rjRsnj0sc1DWnansQW0snj0sn0KBmy4omyPW5H0Wn0K3TLwd5HfYnHR3rjfs0Z7xIWYsQWm1g108njKxna3sn7tsQWRvg108nj7xn7tsQW0kg100mMPxTZFEuA-b5H00ThqGuhk9u1Ys0ZFYmy-b5fKWIWY0pgPxmLK95H00mL0qn0K-TLfqn0KWThnqnHcsrH6&us=newvui&xst=mWdDwWRdrH0YwW0dPjF7nYPAP10vn1cYwWD3PjfzPHPjr0715HDLnHfdPjbsPj6LPWfvPjT4rHbvg1DYPW7xn07L5USJ1SHPSPyS0gDqIMP1pv-8TeSJ1SHPS07d5HfYnHR3rjfs0gfqnHmLrjmLPWm1nf7VTHYs0W0aQf7Wpjdhmdqsms7_IHYk0yP85gGEUAP8ugwxmLKz0HfYnHcdPH6Yns&word=&ck=974.7.77.598.495.676.588.2790&shh=www.baidu.com&sht=zolcnet_cpr&wd=&bc=110101","bodyBytesSent":37434,"requestTime":"0.000","ua":"Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/86.0.4240.198 Safari/537.36","totalBytesSent":37698,"serverIp":"172.16.0.3"}
```
error:
```
==44923== GoAccess 1.7.1 crashed by Sig 11
==44923==
==44923== VALUES AT CRASH POINT
==44923==
==44923== FILE: access.log-20230314
==44923== Line number: 15080
==44923== Invalid data: 604
==44923== Piping: 0
==44923==
==44923== STACK TRACE:
==44923==
==44923== 0 goaccess(sigsegv_handler+0x14f) [0x40f54f]
==44923== 1 /lib64/libpthread.so.0(+0x141d0) [0x7f4f39ea71d0]
==44923== 2 /lib64/libc.so.6(cfree+0x52) [0x7f4f39d41f32]
==44923== 3 goaccess() [0x42a12f]
==44923== 4 goaccess(pre_process_log+0x81) [0x42b351]
==44923== 5 goaccess(parse_log+0x152) [0x42bdd2]
==44923== 6 goaccess(main+0x2a9) [0x40a039]
==44923== 7 /lib64/libc.so.6(__libc_start_main+0xf2) [0x7f4f39cdd202]
==44923== 8 goaccess(_start+0x2e) [0x40c1fe]
==44923==
==44923== Please report it by opening an issue on GitHub:
==44923== https://github.com/allinurl/goaccess/issues
```
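A hedged workaround sketch while the parser crash is investigated (an assumption, not a documented GoAccess feature): flatten the JSON lines into a simple space-delimited log with a small filter, then point GoAccess at that with a correspondingly simple log-format. Naive string scanning only, no JSON library; the key names are taken from the log_format above, and everything else (the program name, the chosen fields) is hypothetical.

```cpp
// Naive flattener for the nginx log_json lines shown above: pulls a
// few fields out of each JSON line and prints them space-separated,
// so GoAccess can be fed a plain delimited log instead. It assumes
// the exact key names and quoting seen in the sample log.
#include <iostream>
#include <string>

// Return the value of "key":"value" (or a bare number) from one line,
// or "-" if the key is missing.
std::string field(const std::string &line, const std::string &key) {
    const std::string needle = "\"" + key + "\":";
    std::string::size_type pos = line.find(needle);
    if (pos == std::string::npos) return "-";
    pos += needle.size();
    if (pos < line.size() && line[pos] == '"') {   // quoted string value
        std::string::size_type end = line.find('"', pos + 1);
        if (end == std::string::npos) return "-";
        return line.substr(pos + 1, end - pos - 1);
    }
    std::string::size_type end = line.find_first_of(",}", pos);  // bare number
    return line.substr(pos, end - pos);
}

int main() {
    std::string line;
    while (std::getline(std::cin, line)) {
        std::cout << field(line, "clientIp") << ' '
                  << field(line, "timestamp") << ' '
                  << field(line, "method") << ' '
                  << field(line, "url") << ' '
                  << field(line, "status") << ' '
                  << field(line, "bodyBytesSent") << '\n';
    }
    return 0;
}
```

Usage would be something like `./flatten < access.log-20230314 > flat.log`; the point is to take the fragile quoting out of the GoAccess format string entirely.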
- index: 1.0
- text_combine: title + body (verbatim duplicate of the title and body above)
- label: process
- text (preprocessed):
nginx log json format cat goaccess conf time format t date format y m d log format h h dt t m u r t u grep log format nginx conf log format log json accessiplist proxy add x forwarded for clientip remote addr cookie http cookie httphost host timestamp time method request method url request uri status status httpreferer http referer bodybytessent body bytes sent requesttime request time ua http user agent totalbytessent bytes sent serverip server addr sed n access log accessiplist clientip cookie httphost hxooe ooef cn timestamp method get url source bd plan lpp mb unit mb keyword status httpreferer windows nt rv gecko firefox totalbytessent serverip accessiplist clientip cookie httphost hxooe ooef cn timestamp method get url source baidu plan csgokaixiang yd unit csgokaixiang keyword csgo bc ae bd e creative e keywordid e bd vid status httpreferer linux android spn build huaweispn wv applewebkit khtml like gecko version chrome mobile safari mobads totalbytessent serverip accessiplist unknown clientip cookie agl user id hm lvt hm lvt httphost hxooe ooef cn timestamp method get url source baidu plan lpp pc unit pc keyword kxwz e creative e keywordid bd vid status httpreferer windows nt applewebkit khtml like gecko chrome safari totalbytessent serverip error goaccess crashed by sig values at crash point file access log line number invalid data piping stack trace goaccess sigsegv handler libpthread so libc so cfree goaccess goaccess pre process log goaccess parse log goaccess main libc so libc start main goaccess start please report it by opening an issue on github
- binary_label: 1

**Row 16,964** | id 22,327,641,908 | IssuesEvent | created_at: 2022-06-14 12:06:40
- repo: ropensci/software-review-meta (https://api.github.com/repos/ropensci/software-review-meta)
- action: closed
- title: Add itdepends as part of a reviewer's report
- labels: automation process
- body:
https://github.com/jimhester/itdepends looks like it will be a useful tool for assessing package dependencies. It would be a good part of a future workflow where editors and reviewers get an automated report on the package. (Along with other parts of currently stalled projects https://github.com/ropenscilabs/pkgreviewr and https://github.com/ropenscilabs/launchboat)
- index: 1.0
- text_combine: title + body (verbatim duplicate of the title and body above)
- label: process
- text (preprocessed):
add itdepends as part of a reviewer s report looks like it will be a useful tool for assessing package dependencies it would be a good part of a future workflow where editors and reviewers get an automated report on the the package along with other parts of currently stalled projects and
- binary_label: 1

**Row 8,621** | id 11,776,375,169 | IssuesEvent | created_at: 2020-03-16 13:09:32
- repo: Arch666Angel/mods (https://api.github.com/repos/Arch666Angel/mods)
- action: closed
- title: [BUG] Butchering Biter Queens
- labels: Angels Bio Processing Bug
- body:
**Describe the bug**
When butchering biter queens (tested with small, medium and big), the recipe promises raw meat and a percent chance of crystals. After trying in Creative Mod and in my playthrough, with something like a thousand queens just to check, only raw meat is output.
**To Reproduce**
I'm using the a344b8fc1227a64449431112f6439204827cfa7d version (just after bio rebalancing).
1. Put down a butchery building (I suggest with speed modules to be quick).
2. Give the butchery a huge amount of biter queens and an inserter to output the results.
3. No crystals - only raw meat.
**On another note:**
- Alien spores, after processing the polluted fish water, can't be voided. FNEI shows the void recipe, but no building. Tried with both voiders - unsuccessful. Is this intended?
- The new bio rebalancing is excellently done. I created a build outputting 1 raw speed & 1 raw productivity per second, and the number of bio buildings needed is quite appropriate for the mod pack - so I guess it is a win.
- index: 1.0
- text_combine: title + body (verbatim duplicate of the title and body above)
- label: process
- text (preprocessed):
butchering biter queens describe the bug when butchering biter queens tested with small medium and big the recipe promises raw meat and a percent chance of crystals after trying in creative mod and in my playthrough with something like a thousand queens just to check just raw meat is outputted to reproduce i m using the version just after bio rebalancing put a butchery building i suggest with speed modules to be quick give the butchery huge amount of biter queens and inserter to output the results no crystals only raw meat on another note alien spores after processing the polluted fish water can t be voided fnei shows the void recipe but no building tried with both voiders unsuccessful is it intended the new bio rebalancing is excellently done created a build for outputting raw speed raw productivity per second and the needed bio buildings is quite appropriate for the mod pack so i guess it is a win
- binary_label: 1

**Row 141** | id 2,575,871,718 | IssuesEvent | created_at: 2015-02-12 03:23:22
- repo: dominikwilkowski/bronzies (https://api.github.com/repos/dominikwilkowski/bronzies)
- action: closed
- title: rework the js
- labels: In process
- body:
JS needs to be rewritten as it's a terrible bus job right now and not really very solid.
- index: 1.0
- text_combine: title + body (verbatim duplicate of the title and body above)
- label: process
- text (preprocessed):
rework the js js needs to be rewritten as it s a terrible bus job right now and not really very solid
- binary_label: 1

**Row 8,674** | id 11,807,872,825 | IssuesEvent | created_at: 2020-03-19 12:19:45
- repo: MicrosoftDocs/azure-docs (https://api.github.com/repos/MicrosoftDocs/azure-docs)
- action: closed
- title: No AzureAutomation folder after install the agent on Windows
- labels: Pri2 automation/svc process-automation/subsvc
- body:
Hello,
After installing the agent on my environment (Windows 10.0.18362.0), I didn't find the folder C:\Program Files\Microsoft Monitoring Agent\Agent\AzureAutomation. I tried several ways, but still couldn't find the AzureAutomation subdirectory; is there anything I missed in the steps?
Or can you direct me on how to manually install the hybrid runbook worker?
Thanks
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 7b29372c-7bd9-7da2-4cff-9afbb432bccf
* Version Independent ID: 66ce101d-d21b-3fdf-be70-7f9cadc1570e
* Content: [Azure Automation Windows Hybrid Runbook Worker](https://docs.microsoft.com/en-us/azure/automation/automation-windows-hrw-install#feedback)
* Content Source: [articles/automation/automation-windows-hrw-install.md](https://github.com/Microsoft/azure-docs/blob/master/articles/automation/automation-windows-hrw-install.md)
* Service: **automation**
* Sub-service: **process-automation**
* GitHub Login: @MGoedtel
* Microsoft Alias: **magoedte**
- index: 1.0
- text_combine: title + body (verbatim duplicate of the title and body above)
- label: process
- text (preprocessed):
no azureautomation folder after install the agent on windows hello after install the agent on my environment windows i didn t find such a folder c program files microsoft monitoring agent agent azureautomation i tried several ways but still didn t find the azureautomation subdirectory is there anything i missed in the steps or can you direct me how to manually install hybrid runbook worker thanks document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source service automation sub service process automation github login mgoedtel microsoft alias magoedte
- binary_label: 1

**Row 16,205** | id 20,731,733,094 | IssuesEvent | created_at: 2022-03-14 10:04:10
- repo: damb/scdetect (https://api.github.com/repos/damb/scdetect)
- action: closed
- title: Memory issue
- labels: bug processing
- body:
I got this error message while executing another example.
This time the amplitude and magnitude calculations were off.
Specs:
24 hours of data
8 detectors
24 templates
1 station
3 channels
```
10:58:51 [debug] [detector-08] Start processing detection (time=2019-07-30T01:53:24.226292Z, associated_results=3) ...
10:58:51 [debug] Found 67 origins.
10:58:51 [debug] Leaving ::done
10:58:51 [info] Waiting for record thread
10:58:51 [debug] Unload plugin 'PostgreSQL database driver'
=================================================================
==19975==ERROR: LeakSanitizer: detected memory leaks
Direct leak of 3840 byte(s) in 16 object(s) allocated from:
#0 0x7fd3af9c5587 in operator new(unsigned long) ../../../../src/libsanitizer/asan/asan_new_delete.cc:104
#1 0x55f3e77b1956 in std::unique_ptr<Seiscomp::GenericRecord, std::default_delete<Seiscomp::GenericRecord> > Seiscomp::detect::util::make_unique<Seiscomp::GenericRecord, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, Seiscomp::Core::Time, double>(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, Seiscomp::Core::Time&&, double&&) /home/maria/work/projects/seiscomp/src/extras/scdetect/src/apps/cc/processing/detail/../../util/memory.h:16
#2 0x55f3e77b0ba9 in Seiscomp::detect::processing::detail::InterpolateGaps::fillGap(Seiscomp::detect::processing::StreamState&, Seiscomp::Record const*, Seiscomp::Core::TimeSpan const&, double, unsigned long) /home/maria/work/projects/seiscomp/src/extras/scdetect/src/apps/cc/processing/detail/gap_interpolate.cpp:77
#3 0x55f3e77b0450 in Seiscomp::detect::processing::detail::InterpolateGaps::handleGap(Seiscomp::detect::processing::StreamState&, Seiscomp::Record const*, boost::intrusive_ptr<Seiscomp::NumericArray<double> >&) /home/maria/work/projects/seiscomp/src/extras/scdetect/src/apps/cc/processing/detail/gap_interpolate.cpp:50
#4 0x55f3e77b5287 in Seiscomp::detect::processing::WaveformProcessor::store(Seiscomp::Record const*) /home/maria/work/projects/seiscomp/src/extras/scdetect/src/apps/cc/processing/waveform_processor.cpp:116
#5 0x55f3e7736ce4 in Seiscomp::detect::detector::Detector::store(Seiscomp::Record const*) /home/maria/work/projects/seiscomp/src/extras/scdetect/src/apps/cc/detector/detector.cpp:464
#6 0x55f3e77b495c in Seiscomp::detect::processing::WaveformProcessor::feed(Seiscomp::Record const*) /home/maria/work/projects/seiscomp/src/extras/scdetect/src/apps/cc/processing/waveform_processor.cpp:60
#7 0x55f3e7596a84 in Seiscomp::detect::Application::handleRecord(Seiscomp::Record*) /home/maria/work/projects/seiscomp/src/extras/scdetect/src/apps/cc/app.cpp:587
#8 0x7fd3af5dfe60 in Seiscomp::Client::StreamApplication::dispatch(Seiscomp::Core::BaseObject*) /home/maria/work/projects/seiscomp/src/base/common/libs/seiscomp/client/streamapplication.cpp:151
#9 0x55f3e7595ac1 in Seiscomp::detect::Application::dispatch(Seiscomp::Core::BaseObject*) /home/maria/work/projects/seiscomp/src/extras/scdetect/src/apps/cc/app.cpp:533
#10 0x7fd3af5b16a9 in Seiscomp::Client::Application::processEvent() /home/maria/work/projects/seiscomp/src/base/common/libs/seiscomp/client/application.cpp:1298
#11 0x7fd3af5b1526 in Seiscomp::Client::Application::run() /home/maria/work/projects/seiscomp/src/base/common/libs/seiscomp/client/application.cpp:1274
#12 0x7fd3af5dfd0e in Seiscomp::Client::StreamApplication::run() /home/maria/work/projects/seiscomp/src/base/common/libs/seiscomp/client/streamapplication.cpp:110
#13 0x55f3e759517d in Seiscomp::detect::Application::run() /home/maria/work/projects/seiscomp/src/extras/scdetect/src/apps/cc/app.cpp:495
#14 0x7fd3ae4d750d in Seiscomp::System::Application::exec() /home/maria/work/projects/seiscomp/src/base/common/libs/seiscomp/system/application.cpp:1099
#15 0x7fd3ae4d7451 in Seiscomp::System::Application::operator()() /home/maria/work/projects/seiscomp/src/base/common/libs/seiscomp/system/application.cpp:1079
#16 0x55f3e77a633c in main /home/maria/work/projects/seiscomp/src/extras/scdetect/src/apps/cc/main.cpp:11
#17 0x7fd3ad1480b2 in __libc_start_main (/lib/x86_64-linux-gnu/libc.so.6+0x240b2)
Indirect leak of 24576 byte(s) in 16 object(s) allocated from:
#0 0x7fd3af9c5587 in operator new(unsigned long) ../../../../src/libsanitizer/asan/asan_new_delete.cc:104
#1 0x55f3e756cfd8 in __gnu_cxx::new_allocator<double>::allocate(unsigned long, void const*) /usr/include/c++/9/ext/new_allocator.h:114
#2 0x55f3e756c085 in std::allocator_traits<std::allocator<double> >::allocate(std::allocator<double>&, unsigned long) /usr/include/c++/9/bits/alloc_traits.h:444
#3 0x55f3e756b2d3 in std::_Vector_base<double, std::allocator<double> >::_M_allocate(unsigned long) /usr/include/c++/9/bits/stl_vector.h:343
#4 0x55f3e7569719 in void std::vector<double, std::allocator<double> >::_M_realloc_insert<double>(__gnu_cxx::__normal_iterator<double*, std::vector<double, std::allocator<double> > >, double&&) /usr/include/c++/9/bits/vector.tcc:440
#5 0x55f3e75679c6 in void std::vector<double, std::allocator<double> >::emplace_back<double>(double&&) /usr/include/c++/9/bits/vector.tcc:121
#6 0x55f3e7565d83 in std::vector<double, std::allocator<double> >::push_back(double&&) /usr/include/c++/9/bits/stl_vector.h:1201
#7 0x55f3e776b09f in std::back_insert_iterator<std::vector<double, std::allocator<double> > >::operator=(double&&) /usr/include/c++/9/bits/stl_iterator.h:522
#8 0x7fd3ae47f0f3 in std::back_insert_iterator<std::vector<double, std::allocator<double> > > std::transform<double const*, std::back_insert_iterator<std::vector<double, std::allocator<double> > >, convert<double, double> >(double const*, double const*, std::back_insert_iterator<std::vector<double, std::allocator<double> > >, convert<double, double>) (/home/maria/seiscomp/lib/libseiscomp_core.so.15+0xd4b0f3)
#9 0x7fd3ae47dcb7 in void convertArray<std::vector<double, std::allocator<double> >, double>(std::vector<double, std::allocator<double> >&, int, double const*) (/home/maria/seiscomp/lib/libseiscomp_core.so.15+0xd49cb7)
#10 0x7fd3ae47c5a3 in Seiscomp::ArrayFactory::Create(Seiscomp::Array::DataType, Seiscomp::Array::DataType, int, void const*) /home/maria/work/projects/seiscomp/src/base/common/libs/seiscomp/core/arrayfactory.cpp:209
#11 0x7fd3ae4b4c46 in Seiscomp::GenericRecord::setData(int, void const*, Seiscomp::Array::DataType) /home/maria/work/projects/seiscomp/src/base/common/libs/seiscomp/core/genericrecord.cpp:152
#12 0x55f3e77b0dad in Seiscomp::detect::processing::detail::InterpolateGaps::fillGap(Seiscomp::detect::processing::StreamState&, Seiscomp::Record const*, Seiscomp::Core::TimeSpan const&, double, unsigned long) /home/maria/work/projects/seiscomp/src/extras/scdetect/src/apps/cc/processing/detail/gap_interpolate.cpp:88
#13 0x55f3e77b0450 in Seiscomp::detect::processing::detail::InterpolateGaps::handleGap(Seiscomp::detect::processing::StreamState&, Seiscomp::Record const*, boost::intrusive_ptr<Seiscomp::NumericArray<double> >&) /home/maria/work/projects/seiscomp/src/extras/scdetect/src/apps/cc/processing/detail/gap_interpolate.cpp:50
#14 0x55f3e77b5287 in Seiscomp::detect::processing::WaveformProcessor::store(Seiscomp::Record const*) /home/maria/work/projects/seiscomp/src/extras/scdetect/src/apps/cc/processing/waveform_processor.cpp:116
#15 0x55f3e7736ce4 in Seiscomp::detect::detector::Detector::store(Seiscomp::Record const*) /home/maria/work/projects/seiscomp/src/extras/scdetect/src/apps/cc/detector/detector.cpp:464
#16 0x55f3e77b495c in Seiscomp::detect::processing::WaveformProcessor::feed(Seiscomp::Record const*) /home/maria/work/projects/seiscomp/src/extras/scdetect/src/apps/cc/processing/waveform_processor.cpp:60
#17 0x55f3e7596a84 in Seiscomp::detect::Application::handleRecord(Seiscomp::Record*) /home/maria/work/projects/seiscomp/src/extras/scdetect/src/apps/cc/app.cpp:587
#18 0x7fd3af5dfe60 in Seiscomp::Client::StreamApplication::dispatch(Seiscomp::Core::BaseObject*) /home/maria/work/projects/seiscomp/src/base/common/libs/seiscomp/client/streamapplication.cpp:151
#19 0x55f3e7595ac1 in Seiscomp::detect::Application::dispatch(Seiscomp::Core::BaseObject*) /home/maria/work/projects/seiscomp/src/extras/scdetect/src/apps/cc/app.cpp:533
#20 0x7fd3af5b16a9 in Seiscomp::Client::Application::processEvent() /home/maria/work/projects/seiscomp/src/base/common/libs/seiscomp/client/application.cpp:1298
#21 0x7fd3af5b1526 in Seiscomp::Client::Application::run() /home/maria/work/projects/seiscomp/src/base/common/libs/seiscomp/client/application.cpp:1274
#22 0x7fd3af5dfd0e in Seiscomp::Client::StreamApplication::run() /home/maria/work/projects/seiscomp/src/base/common/libs/seiscomp/client/streamapplication.cpp:110
#23 0x55f3e759517d in Seiscomp::detect::Application::run() /home/maria/work/projects/seiscomp/src/extras/scdetect/src/apps/cc/app.cpp:495
#24 0x7fd3ae4d750d in Seiscomp::System::Application::exec() /home/maria/work/projects/seiscomp/src/base/common/libs/seiscomp/system/application.cpp:1099
#25 0x7fd3ae4d7451 in Seiscomp::System::Application::operator()() /home/maria/work/projects/seiscomp/src/base/common/libs/seiscomp/system/application.cpp:1079
#26 0x55f3e77a633c in main /home/maria/work/projects/seiscomp/src/extras/scdetect/src/apps/cc/main.cpp:11
#27 0x7fd3ad1480b2 in __libc_start_main (/lib/x86_64-linux-gnu/libc.so.6+0x240b2)
Indirect leak of 640 byte(s) in 16 object(s) allocated from:
#0 0x7fd3af9c5587 in operator new(unsigned long) ../../../../src/libsanitizer/asan/asan_new_delete.cc:104
#1 0x7fd3ae47c554 in Seiscomp::ArrayFactory::Create(Seiscomp::Array::DataType, Seiscomp::Array::DataType, int, void const*) /home/maria/work/projects/seiscomp/src/base/common/libs/seiscomp/core/arrayfactory.cpp:208
#2 0x7fd3ae4b4c46 in Seiscomp::GenericRecord::setData(int, void const*, Seiscomp::Array::DataType) /home/maria/work/projects/seiscomp/src/base/common/libs/seiscomp/core/genericrecord.cpp:152
#3 0x55f3e77b0dad in Seiscomp::detect::processing::detail::InterpolateGaps::fillGap(Seiscomp::detect::processing::StreamState&, Seiscomp::Record const*, Seiscomp::Core::TimeSpan const&, double, unsigned long) /home/maria/work/projects/seiscomp/src/extras/scdetect/src/apps/cc/processing/detail/gap_interpolate.cpp:88
#4 0x55f3e77b0450 in Seiscomp::detect::processing::detail::InterpolateGaps::handleGap(Seiscomp::detect::processing::StreamState&, Seiscomp::Record const*, boost::intrusive_ptr<Seiscomp::NumericArray<double> >&) /home/maria/work/projects/seiscomp/src/extras/scdetect/src/apps/cc/processing/detail/gap_interpolate.cpp:50
#5 0x55f3e77b5287 in Seiscomp::detect::processing::WaveformProcessor::store(Seiscomp::Record const*) /home/maria/work/projects/seiscomp/src/extras/scdetect/src/apps/cc/processing/waveform_processor.cpp:116
#6 0x55f3e7736ce4 in Seiscomp::detect::detector::Detector::store(Seiscomp::Record const*) /home/maria/work/projects/seiscomp/src/extras/scdetect/src/apps/cc/detector/detector.cpp:464
#7 0x55f3e77b495c in Seiscomp::detect::processing::WaveformProcessor::feed(Seiscomp::Record const*) /home/maria/work/projects/seiscomp/src/extras/scdetect/src/apps/cc/processing/waveform_processor.cpp:60
#8 0x55f3e7596a84 in Seiscomp::detect::Application::handleRecord(Seiscomp::Record*) /home/maria/work/projects/seiscomp/src/extras/scdetect/src/apps/cc/app.cpp:587
#9 0x7fd3af5dfe60 in Seiscomp::Client::StreamApplication::dispatch(Seiscomp::Core::BaseObject*) /home/maria/work/projects/seiscomp/src/base/common/libs/seiscomp/client/streamapplication.cpp:151
#10 0x55f3e7595ac1 in Seiscomp::detect::Application::dispatch(Seiscomp::Core::BaseObject*) /home/maria/work/projects/seiscomp/src/extras/scdetect/src/apps/cc/app.cpp:533
#11 0x7fd3af5b16a9 in Seiscomp::Client::Application::processEvent() /home/maria/work/projects/seiscomp/src/base/common/libs/seiscomp/client/application.cpp:1298
#12 0x7fd3af5b1526 in Seiscomp::Client::Application::run() /home/maria/work/projects/seiscomp/src/base/common/libs/seiscomp/client/application.cpp:1274
#13 0x7fd3af5dfd0e in Seiscomp::Client::StreamApplication::run() /home/maria/work/projects/seiscomp/src/base/common/libs/seiscomp/client/streamapplication.cpp:110
#14 0x55f3e759517d in Seiscomp::detect::Application::run() /home/maria/work/projects/seiscomp/src/extras/scdetect/src/apps/cc/app.cpp:495
#15 0x7fd3ae4d750d in Seiscomp::System::Application::exec() /home/maria/work/projects/seiscomp/src/base/common/libs/seiscomp/system/application.cpp:1099
#16 0x7fd3ae4d7451 in Seiscomp::System::Application::operator()() /home/maria/work/projects/seiscomp/src/base/common/libs/seiscomp/system/application.cpp:1079
#17 0x55f3e77a633c in main /home/maria/work/projects/seiscomp/src/extras/scdetect/src/apps/cc/main.cpp:11
#18 0x7fd3ad1480b2 in __libc_start_main (/lib/x86_64-linux-gnu/libc.so.6+0x240b2)
SUMMARY: AddressSanitizer: 29056 byte(s) leaked in 48 allocation(s).
```
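The direct leak originates in the GenericRecord created via util::make_unique in InterpolateGaps::fillGap(); the indirect leaks are the array payload attached through setData(). A hedged guess at the general shape of such a leak, with hypothetical names (this is not the actual scdetect code): a unique_ptr's ownership is given up via release() and nothing ever adopts the raw pointer.

```cpp
// Sketch of a make_unique allocation leaking because ownership is
// released but never adopted. Names are hypothetical; compile with
// -fsanitize=address to see LeakSanitizer report the leaky variant.
#include <memory>
#include <string>
#include <vector>

struct Record {
    std::string id;
    std::vector<double> samples;  // corresponds to the "indirect" leaks
};

void consume(const Record *) { /* inspects the record, never deletes it */ }

void fill_gap_leaky() {
    auto rec = std::make_unique<Record>();
    rec->samples.assign(1024, 0.0);
    consume(rec.release());  // ownership dropped: nothing ever deletes rec
}

void fill_gap_fixed() {
    auto rec = std::make_unique<Record>();
    rec->samples.assign(1024, 0.0);
    consume(rec.get());      // ownership stays with the unique_ptr
}                            // rec (and its samples) freed here

int main() {
    fill_gap_leaky();  // LeakSanitizer reports a direct + indirect leak
    fill_gap_fixed();  // clean
    return 0;
}
```

If the consumer is meant to take ownership (for example through an intrusive_ptr, as the stack trace suggests), the fix is to hand over a correctly reference-counted pointer rather than a released raw one.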
- index: 1.0
- text_combine: title + body (verbatim duplicate of the title and body above)
#13 0x7fd3af5dfd0e in Seiscomp::Client::StreamApplication::run() /home/maria/work/projects/seiscomp/src/base/common/libs/seiscomp/client/streamapplication.cpp:110
#14 0x55f3e759517d in Seiscomp::detect::Application::run() /home/maria/work/projects/seiscomp/src/extras/scdetect/src/apps/cc/app.cpp:495
#15 0x7fd3ae4d750d in Seiscomp::System::Application::exec() /home/maria/work/projects/seiscomp/src/base/common/libs/seiscomp/system/application.cpp:1099
#16 0x7fd3ae4d7451 in Seiscomp::System::Application::operator()() /home/maria/work/projects/seiscomp/src/base/common/libs/seiscomp/system/application.cpp:1079
#17 0x55f3e77a633c in main /home/maria/work/projects/seiscomp/src/extras/scdetect/src/apps/cc/main.cpp:11
#18 0x7fd3ad1480b2 in __libc_start_main (/lib/x86_64-linux-gnu/libc.so.6+0x240b2)
SUMMARY: AddressSanitizer: 29056 byte(s) leaked in 48 allocation(s).
|
process
|
memory issue i got this error message while executing another example this time the amplitude and magnitude calculations were off specs hours of data detectors templates station channels start processing detection time associated results found origins leaving done waiting for record thread unload plugin postgresql database driver error leaksanitizer detected memory leaks direct leak of byte s in object s allocated from in operator new unsigned long src libsanitizer asan asan new delete cc in std unique ptr seiscomp detect util make unique std allocator const std basic string std allocator const std basic string std allocator const std basic string std allocator const seiscomp core time double std basic string std allocator const std basic string std allocator const std basic string std allocator const std basic string std allocator const seiscomp core time double home maria work projects seiscomp src extras scdetect src apps cc processing detail util memory h in seiscomp detect processing detail interpolategaps fillgap seiscomp detect processing streamstate seiscomp record const seiscomp core timespan const double unsigned long home maria work projects seiscomp src extras scdetect src apps cc processing detail gap interpolate cpp in seiscomp detect processing detail interpolategaps handlegap seiscomp detect processing streamstate seiscomp record const boost intrusive ptr home maria work projects seiscomp src extras scdetect src apps cc processing detail gap interpolate cpp in seiscomp detect processing waveformprocessor store seiscomp record const home maria work projects seiscomp src extras scdetect src apps cc processing waveform processor cpp in seiscomp detect detector detector store seiscomp record const home maria work projects seiscomp src extras scdetect src apps cc detector detector cpp in seiscomp detect processing waveformprocessor feed seiscomp record const home maria work projects seiscomp src extras scdetect src apps cc processing waveform processor cpp in seiscomp detect application handlerecord seiscomp record home maria work projects seiscomp src extras scdetect src apps cc app cpp in seiscomp client streamapplication dispatch seiscomp core baseobject home maria work projects seiscomp src base common libs seiscomp client streamapplication cpp in seiscomp detect application dispatch seiscomp core baseobject home maria work projects seiscomp src extras scdetect src apps cc app cpp in seiscomp client application processevent home maria work projects seiscomp src base common libs seiscomp client application cpp in seiscomp client application run home maria work projects seiscomp src base common libs seiscomp client application cpp in seiscomp client streamapplication run home maria work projects seiscomp src base common libs seiscomp client streamapplication cpp in seiscomp detect application run home maria work projects seiscomp src extras scdetect src apps cc app cpp in seiscomp system application exec home maria work projects seiscomp src base common libs seiscomp system application cpp in seiscomp system application operator home maria work projects seiscomp src base common libs seiscomp system application cpp in main home maria work projects seiscomp src extras scdetect src apps cc main cpp in libc start main lib linux gnu libc so indirect leak of byte s in object s allocated from in operator new unsigned long src libsanitizer asan asan new delete cc in gnu cxx new allocator allocate unsigned long void const usr include c ext new allocator h in std allocator traits 
allocate std allocator unsigned long usr include c bits alloc traits h in std vector base m allocate unsigned long usr include c bits stl vector h in void std vector m realloc insert gnu cxx normal iterator double usr include c bits vector tcc in void std vector emplace back double usr include c bits vector tcc in std vector push back double usr include c bits stl vector h in std back insert iterator operator double usr include c bits stl iterator h in std back insert iterator std transform convert double const double const std back insert iterator convert home maria seiscomp lib libseiscomp core so in void convertarray double std vector int double const home maria seiscomp lib libseiscomp core so in seiscomp arrayfactory create seiscomp array datatype seiscomp array datatype int void const home maria work projects seiscomp src base common libs seiscomp core arrayfactory cpp in seiscomp genericrecord setdata int void const seiscomp array datatype home maria work projects seiscomp src base common libs seiscomp core genericrecord cpp in seiscomp detect processing detail interpolategaps fillgap seiscomp detect processing streamstate seiscomp record const seiscomp core timespan const double unsigned long home maria work projects seiscomp src extras scdetect src apps cc processing detail gap interpolate cpp in seiscomp detect processing detail interpolategaps handlegap seiscomp detect processing streamstate seiscomp record const boost intrusive ptr home maria work projects seiscomp src extras scdetect src apps cc processing detail gap interpolate cpp in seiscomp detect processing waveformprocessor store seiscomp record const home maria work projects seiscomp src extras scdetect src apps cc processing waveform processor cpp in seiscomp detect detector detector store seiscomp record const home maria work projects seiscomp src extras scdetect src apps cc detector detector cpp in seiscomp detect processing waveformprocessor feed seiscomp record const home maria work projects seiscomp src extras scdetect src apps cc processing waveform processor cpp in seiscomp detect application handlerecord seiscomp record home maria work projects seiscomp src extras scdetect src apps cc app cpp in seiscomp client streamapplication dispatch seiscomp core baseobject home maria work projects seiscomp src base common libs seiscomp client streamapplication cpp in seiscomp detect application dispatch seiscomp core baseobject home maria work projects seiscomp src extras scdetect src apps cc app cpp in seiscomp client application processevent home maria work projects seiscomp src base common libs seiscomp client application cpp in seiscomp client application run home maria work projects seiscomp src base common libs seiscomp client application cpp in seiscomp client streamapplication run home maria work projects seiscomp src base common libs seiscomp client streamapplication cpp in seiscomp detect application run home maria work projects seiscomp src extras scdetect src apps cc app cpp in seiscomp system application exec home maria work projects seiscomp src base common libs seiscomp system application cpp in seiscomp system application operator home maria work projects seiscomp src base common libs seiscomp system application cpp in main home maria work projects seiscomp src extras scdetect src apps cc main cpp in libc start main lib linux gnu libc so indirect leak of byte s in object s allocated from in operator new unsigned long src libsanitizer asan asan new delete cc in seiscomp arrayfactory create seiscomp array 
datatype seiscomp array datatype int void const home maria work projects seiscomp src base common libs seiscomp core arrayfactory cpp in seiscomp genericrecord setdata int void const seiscomp array datatype home maria work projects seiscomp src base common libs seiscomp core genericrecord cpp in seiscomp detect processing detail interpolategaps fillgap seiscomp detect processing streamstate seiscomp record const seiscomp core timespan const double unsigned long home maria work projects seiscomp src extras scdetect src apps cc processing detail gap interpolate cpp in seiscomp detect processing detail interpolategaps handlegap seiscomp detect processing streamstate seiscomp record const boost intrusive ptr home maria work projects seiscomp src extras scdetect src apps cc processing detail gap interpolate cpp in seiscomp detect processing waveformprocessor store seiscomp record const home maria work projects seiscomp src extras scdetect src apps cc processing waveform processor cpp in seiscomp detect detector detector store seiscomp record const home maria work projects seiscomp src extras scdetect src apps cc detector detector cpp in seiscomp detect processing waveformprocessor feed seiscomp record const home maria work projects seiscomp src extras scdetect src apps cc processing waveform processor cpp in seiscomp detect application handlerecord seiscomp record home maria work projects seiscomp src extras scdetect src apps cc app cpp in seiscomp client streamapplication dispatch seiscomp core baseobject home maria work projects seiscomp src base common libs seiscomp client streamapplication cpp in seiscomp detect application dispatch seiscomp core baseobject home maria work projects seiscomp src extras scdetect src apps cc app cpp in seiscomp client application processevent home maria work projects seiscomp src base common libs seiscomp client application cpp in seiscomp client application run home maria work projects seiscomp src base common libs seiscomp client application cpp in seiscomp client streamapplication run home maria work projects seiscomp src base common libs seiscomp client streamapplication cpp in seiscomp detect application run home maria work projects seiscomp src extras scdetect src apps cc app cpp in seiscomp system application exec home maria work projects seiscomp src base common libs seiscomp system application cpp in seiscomp system application operator home maria work projects seiscomp src base common libs seiscomp system application cpp in main home maria work projects seiscomp src extras scdetect src apps cc main cpp in libc start main lib linux gnu libc so summary addresssanitizer byte s leaked in allocation s
| 1
|
368,119
| 25,777,242,716
|
IssuesEvent
|
2022-12-09 13:03:17
|
BojarLab/glycowork
|
https://api.github.com/repos/BojarLab/glycowork
|
closed
|
more example code please...
|
documentation
|
Dear Glycowork,
Thank you for your software. I have been trying to work through this so I can predict the binding capacity of some lectins from their sequence using lectin_oracle_flex; however, the example code in https://bojarlab.github.io/glycowork/examples.html#example2 is not enough for me to work with in order to do it myself.
Would you kindly expand your example for the lectin section?
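To make the request concrete, here is the shape of the example I am hoping the docs could show; note that `predict_binding` below is a placeholder name I invented, not the real glycowork API:
```python
# Hypothetical sketch only: predict_binding is a placeholder name,
# NOT the real glycowork API. The docs would ideally show the actual
# calls for loading and running lectin_oracle_flex.
from typing import List

def predict_binding(model, lectin_seq: str, glycans: List[str]) -> List[float]:
    """Placeholder: score one lectin sequence against a list of glycans."""
    return [0.0 for _ in glycans]  # stand-in for real model inference

lectin_seq = "MKTAYIAKQR"                      # example lectin sequence fragment
glycans = ["Gal(b1-4)GlcNAc", "Man(a1-3)Man"]  # example glycan strings

model = None  # placeholder for the loaded lectin_oracle_flex model
for glycan, score in zip(glycans, predict_binding(model, lectin_seq, glycans)):
    print(glycan, score)
```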
kind regards,
Peter Thorpe
|
1.0
|
more example code please... - Dear Glycowork,
Thank you for your software. I have been trying to work through this so I can predict the binding capacity of some lectins from their sequence using lectin_oracle_flex; however, the example code in https://bojarlab.github.io/glycowork/examples.html#example2 is not enough for me to work with in order to do it myself.
Would you kindly expand your example for the lectin section?
kind regards,
Peter Thorpe
|
non_process
|
more example code please dear glycowork thank you for your software i have been trying to work through this so i can predict the binding capacity of some lectins from their sequence using lectin oracle flex however the example code in is not enough for me to work with in order to do it myself would you kindly expand your example for the lectin section kind regards peter thorpe
| 0
|
7,232
| 10,371,896,462
|
IssuesEvent
|
2019-09-08 23:45:03
|
kubeflow/testing
|
https://api.github.com/repos/kubeflow/testing
|
closed
|
Define labels to indicate approximate effort needed to complete an issue
|
area/engprod kind/process priority/p1
|
We'd like a standard way to indicate how much effort an issue will take. This will help us do better planning.
It doesn't look like GitHub has a way to specify effort. But one way we can do this is by defining labels.
As a starting point, here's an initial set of labels:
effort/1-day
effort/3-day
effort/5-day
effort/2-weeks
effort/2-weeks+
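A sketch of how the set could be created in one pass via the GitHub REST API (the repo name and token below are placeholders):
```python
# Sketch: bulk-create the effort labels via the GitHub REST API.
# REPO and TOKEN are placeholders; run once per repository.
import requests

REPO = "kubeflow/testing"   # placeholder target repo
TOKEN = "ghp_placeholder"   # placeholder personal access token
LABELS = ["effort/1-day", "effort/3-day", "effort/5-day",
          "effort/2-weeks", "effort/2-weeks+"]

for name in LABELS:
    r = requests.post(
        f"https://api.github.com/repos/{REPO}/labels",
        headers={"Authorization": f"token {TOKEN}"},
        json={"name": name, "color": "c2e0c6"},  # any 6-digit hex color
    )
    print(name, r.status_code)  # 201 = created, 422 = already exists
```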
|
1.0
|
Define labels to indicate approximate effort needed to complete an issue - We'd like a standard way to indicate how much effort an issue will take. This will help us do better planning.
It doesn't look like GitHub has a way to specify effort. But one way we can do this is by defining labels.
As a starting point, here's an initial set of labels:
effort/1-day
effort/3-day
effort/5-day
effort/2-weeks
effort/2-weeks+
|
process
|
define labels to indicate approximate effort needed to complete an issue we d like a standard way to indicate how much effort an issue will take this will help us do better planning it doesn t look like github has a way to specify effort but one way we can do this is by defining labels as a starting point here s an initial set of labels effort day effort day effort day effort weeks effort weeks
| 1
|
16,869
| 22,149,902,296
|
IssuesEvent
|
2022-06-03 15:42:59
|
hashgraph/hedera-json-rpc-relay
|
https://api.github.com/repos/hashgraph/hedera-json-rpc-relay
|
closed
|
Add rpc call test suite for simple testing
|
enhancement P1 process
|
### Problem
Currently most endpoint testing is done manually through `curl` or MetaMask.
We need a quick way to run a collection of calls against a hosted instance - local or deployed
### Solution
Utilize postman (newman CLI) flow to run multiple queries against a hosted relay instance.
- Add transaction submission logic using js-sdk
See [Mirror Node Web3 acceptance test](https://github.com/hashgraph/hedera-mirror-node/tree/main/docs/web3#acceptance-tests) as inspiration.
For a postman.json file see the [Mirror Node web3 postman.json](https://github.com/hashgraph/hedera-mirror-node/blob/main/hedera-mirror-web3/postman.json) for inspiration.
### Alternatives
Add a ts test suite with a CLI option to run
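For reference, a minimal smoke test of this kind could be as simple as posting standard JSON-RPC calls at a hosted relay (the endpoint URL below is a placeholder):
```python
# Minimal JSON-RPC smoke test against a hosted relay instance.
# ENDPOINT is a placeholder; extend METHODS as needed.
import requests

ENDPOINT = "http://localhost:7546"   # placeholder relay URL
METHODS = ["eth_chainId", "eth_blockNumber", "eth_gasPrice"]

for i, method in enumerate(METHODS, start=1):
    payload = {"jsonrpc": "2.0", "id": i, "method": method, "params": []}
    resp = requests.post(ENDPOINT, json=payload, timeout=10).json()
    assert "result" in resp, f"{method} failed: {resp}"
    print(method, "->", resp["result"])
```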
|
1.0
|
Add rpc call test suite for simple testing - ### Problem
Currently most endpoint testing is done manually through `curl` or MetaMask.
We need a quick way to run a collection of calls against a hosted instance - local or deployed
### Solution
Utilize postman (newman CLI) flow to run multiple queries against a hosted relay instance.
- Add transaction submission logic using js-sdk
See [Mirror Node Web3 acceptance test](https://github.com/hashgraph/hedera-mirror-node/tree/main/docs/web3#acceptance-tests) as inspiration.
For a postman.json file see the [Mirror Node web3 postman.json](https://github.com/hashgraph/hedera-mirror-node/blob/main/hedera-mirror-web3/postman.json) for inspiration.
### Alternatives
Add a ts test suite with a CLI option to run
|
process
|
add rpc call test suite for simple testing problem currently most endpoint testing is done manually through curl or metamask we need a quick way to run a collection of calls against a hosted instance local or deployed solution utilize postman newman cli flow to run multiple queries against a hosted relay instance add transaction submission logic using js sdk see as inspiration for a postman json file see the for inspiration alternatives add a ts test suite with a cli option to run
| 1
|
408
| 2,850,438,349
|
IssuesEvent
|
2015-05-31 15:42:34
|
K0zka/kerub
|
https://api.github.com/repos/K0zka/kerub
|
closed
|
fill host capabilities with detected data
|
component:data processing enhancement priority: high
|
when performing host discovery, check if dmidecode is installed; if installed, run it and get all data
if not installed and the host is dedicated, install it first
if not installed and the host is not dedicated, leave the records empty
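A sketch of the detection step (Python here purely for illustration; the names and flow are my assumptions, not kerub code):
```python
# Illustration of the host-capability discovery step described above:
# check for dmidecode, run it if present, otherwise leave the data empty.
import shutil
import subprocess

def read_dmi_data() -> str:
    """Return raw dmidecode output, or an empty string if unavailable."""
    if shutil.which("dmidecode") is None:
        return ""  # not installed: dedicated hosts would install it first
    # dmidecode typically needs root; a failed run also yields empty data
    result = subprocess.run(["dmidecode"], capture_output=True, text=True)
    return result.stdout if result.returncode == 0 else ""

print(read_dmi_data()[:200])  # first 200 chars of the hardware inventory
```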
|
1.0
|
fill host capabilities with detected data - when performing host discovery, check if dmidecode is installed; if installed, run it and get all data
if not installed and the host is dedicated, install it first
if not installed and the host is not dedicated, leave the records empty
|
process
|
fill host capabilities with detected data when performing host discovery check if dmidecode is installed if installed run it and get all data if not installed and the host is dedicated install it first if not installed and the host is not dedicated leave the records empty
| 1
|
6,143
| 9,013,820,636
|
IssuesEvent
|
2019-02-05 20:34:51
|
grow/grow
|
https://api.github.com/repos/grow/grow
|
opened
|
Future of Grow Survey
|
process ux
|
Hello Grow Community!
We are currently in the discovery phase of building a new version of Grow and we need your help! Please support us by sharing your experiences thus far as well as ideas for the future of Grow.
The survey includes 3 simple questions and will take 5 minutes or less: [go.blinkk.com/feb-grow-survey](https://goo.gl/forms/xL3Rz2ilwLIGgjFC3).
We look forward to reading your responses and sharing updates with you as we progress.
Thank you in advance!
|
1.0
|
Future of Grow Survey - Hello Grow Community!
We are currently in the discovery phase of building a new version of Grow and we need your help! Please support us by sharing your experiences thus far as well as ideas for the future of Grow.
The survey includes 3 simple questions and will take 5 minutes or less: [go.blinkk.com/feb-grow-survey](https://goo.gl/forms/xL3Rz2ilwLIGgjFC3).
We look forward to reading your responses and sharing updates with you as we progress.
Thank you in advance!
|
process
|
future of grow survey hello grow community we are currently in the discovery phase of building a new version of grow and we need your help please support us by sharing your experiences thus far as well as ideas for the future of grow the survey includes simple questions and will take minutes or less we look forward to reading your responses and sharing updates with you as we progress thank you in advance
| 1
|
25,473
| 4,330,983,994
|
IssuesEvent
|
2016-07-26 21:48:39
|
CompEvol/beast2
|
https://api.github.com/repos/CompEvol/beast2
|
closed
|
Logger.openLogFile() occasionally fails with null pointer exception
|
Could not reproduce CRITICAL priority defect
|
This has come up a few times today at the TTB workshop with users on OS X. A fix seems to be to tell BEAST to overwrite log files (on the command line or from the launch window), but this problem seems to occur even when no previous log file exists. I haven't been able to reproduce it myself on Linux.
|
1.0
|
Logger.openLogFile() occasionally fails with null pointer exception - This has come up a few times today at the TTB workshop with users on OS X. A fix seems to be to tell BEAST to overwrite log files (on the command line or from the launch window), but this problem seems to occur even when no previous log file exists. I haven't been able to reproduce it myself on Linux.
|
non_process
|
logger openlogfile occasionally fails with null pointer exception this has come up a few times today at the ttb workshop with users on os x a fix seems to be to tell beast to overwrite log files on the command line or from the launch window but this problem seems to occur even when no previous log file exists i haven t been able to reproduce it myself on linux
| 0
|
3,067
| 6,051,575,351
|
IssuesEvent
|
2017-06-13 00:36:33
|
hashicorp/packer
|
https://api.github.com/repos/hashicorp/packer
|
closed
|
pp/vagrant-cloud: large files are read in to memory
|
bug post-processor/atlas post-processor/vagrant-cloud
|
Hi,
I'm trying to switch from the `atlas` post processor to `vagrant-cloud` to upload a vagrant box.
This was working fine:
```json
"post-processors": [
[
{
"type": "vagrant",
"vagrantfile_template": "templates/vagrantfile.tpl",
"keep_input_artifact": true,
"output": "build/esss-devenv-7.3.box"
},
{
"type": "atlas",
"artifact": "esss/devenv-7.3",
"artifact_type": "vagrant.box",
"metadata": {
"description": "{{user `description`}}",
"provider": "virtualbox",
"version": "{{user `devenv_version`}}"
}
}
]
]
```
This doesn't work:
```json
"post-processors": [
[
{
"type": "vagrant",
"vagrantfile_template": "templates/vagrantfile.tpl",
"output": "build/esss-devenv-7.3.box"
},
{
"type": "vagrant-cloud",
"access_token": "{{user `vagrantcloud_token`}}",
"box_tag": "esss/devenv-7.3",
"no_release": "true",
"version": "{{user `devenv_version`}}",
"version_description": "{{user `description`}}"
}
]
]
```
It raises the error:
* Post-processor failed: unexpected EOF
See the gist url below for the full debug log.
- Packer version: Packer v1.0.0
- Host platform: CentOS Linux release 7.3.1611
- Debug log: https://gist.github.com/beenje/71a711b0587d18714abebd417f691158
- Packer template used: https://gist.github.com/beenje/568aabe7881c649ca7baec4514d89e79
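For illustration of the buffering-versus-streaming difference (Python used as a stand-in, since I have not dug into Packer's Go code; the upload URL is a placeholder):
```python
# Illustration only (Packer itself is Go): streaming an upload keeps
# memory flat, while .read() pulls the whole box file into RAM first.
import requests

URL = "https://vagrantcloud.example/upload"   # placeholder upload URL
BOX = "build/esss-devenv-7.3.box"

# Problematic pattern: the entire file is materialized in memory.
# requests.put(URL, data=open(BOX, "rb").read())

# Streaming pattern: requests reads the file object in chunks instead.
with open(BOX, "rb") as f:
    resp = requests.put(URL, data=f)
print(resp.status_code)
```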
|
2.0
|
pp/vagrant-cloud: large files are read in to memory - Hi,
I'm trying to switch from the `atlas` post processor to `vagrant-cloud` to upload a vagrant box.
This was working fine:
```json
"post-processors": [
[
{
"type": "vagrant",
"vagrantfile_template": "templates/vagrantfile.tpl",
"keep_input_artifact": true,
"output": "build/esss-devenv-7.3.box"
},
{
"type": "atlas",
"artifact": "esss/devenv-7.3",
"artifact_type": "vagrant.box",
"metadata": {
"description": "{{user `description`}}",
"provider": "virtualbox",
"version": "{{user `devenv_version`}}"
}
}
]
]
```
This doesn't work:
```json
"post-processors": [
[
{
"type": "vagrant",
"vagrantfile_template": "templates/vagrantfile.tpl",
"output": "build/esss-devenv-7.3.box"
},
{
"type": "vagrant-cloud",
"access_token": "{{user `vagrantcloud_token`}}",
"box_tag": "esss/devenv-7.3",
"no_release": "true",
"version": "{{user `devenv_version`}}",
"version_description": "{{user `description`}}"
}
]
]
```
It raises the error:
* Post-processor failed: unexpected EOF
See the gist url below for the full debug log.
- Packer version: Packer v1.0.0
- Host platform: CentOS Linux release 7.3.1611
- Debug log: https://gist.github.com/beenje/71a711b0587d18714abebd417f691158
- Packer template used: https://gist.github.com/beenje/568aabe7881c649ca7baec4514d89e79
|
process
|
pp vagrant cloud large files are read in to memory hi i m trying to switch from the atlas post processor to vagrant cloud to upload a vagrant box this was working fine json post processors type vagrant vagrantfile template templates vagrantfile tpl keep input artifact true output build esss devenv box type atlas artifact esss devenv artifact type vagrant box metadata description user description provider virtualbox version user devenv version this doesn t work json post processors type vagrant vagrantfile template templates vagrantfile tpl output build esss devenv box type vagrant cloud access token user vagrantcloud token box tag esss devenv no release true version user devenv version version description user description it raises the error post processor failed unexpected eof see the gist url below for the full debug log packer version packer host platform centos linux release debug log packer template used
| 1
|
7,163
| 10,310,453,132
|
IssuesEvent
|
2019-08-29 15:11:52
|
heim-rs/heim
|
https://api.github.com/repos/heim-rs/heim
|
closed
|
process::Process::parent method
|
A-process C-enhancement O-linux O-macos O-windows
|
Should be as simple as a combination of `Process::parent_pid` and `Process::get`.
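Roughly the same shape as this psutil analogue in Python (shown only as an analogue, since heim itself is Rust):
```python
# Analogue of the proposed method using psutil (heim itself is Rust):
# parent() is effectively Process(parent_pid) built from the current pid.
import psutil

proc = psutil.Process()               # current process
parent = psutil.Process(proc.ppid())  # "parent_pid" + "get" combined
print(parent.pid, parent.name())
```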
|
1.0
|
process::Process::parent method - Should be as simple as a combination of `Process::parent_pid` and `Process::get`.
|
process
|
process process parent method should be as simple as a combination of process parent pid and process get
| 1
|
3,308
| 6,412,042,599
|
IssuesEvent
|
2017-08-08 01:27:58
|
triplea-game/triplea
|
https://api.github.com/repos/triplea-game/triplea
|
opened
|
Glitch (Bug) Reporting - Where? Who?
|
discussion type: process
|
Following up from testing group forum thread: https://forums.triplea-game.org/topic/268/volunteers-needed-early-release-testing-group/13
The question is raised: where to report game glitches?
Requirements from a game dev point of view:
* We need glitches tracked in GitHub issues. Why GitHub issues? Kindly take this as a given: in short, devs need bugs to be tracked in bug tracking software; forums are not bug tracking software.
## Reasons to report all bugs in github issues first and only
* Least process overhead, no moving bugs from forum to github issues
* All bugs tracked in central location, easier de-duping against open bugs, single way to categorize track bugs
* Github.com has invested a lot in their new user registration process; linking to a single URL for reporting bugs is very useful! Alternatively, we'd need to tell users how to register for forum and where to post bugs. There would be some duplication in documentation (one for devs, another for normal users); we would likely duplicate that documentation in the game and on the website for normal users.
* Some bugs could get lost: the process depends on bugs moving by hand from forum to github issues; humans are fallible, get sick, etc. It's especially important IMO for an open source project like this to cut out any labor requirements possible.
* Ideally devs would not need to watch two locations for bug reports. Feature requests IMO are a bit different, since it is a bit of a different task and audience to build something new compared to fixing or even triaging something broken.
## Reasons to report all bugs to forums first
* Meet players where they are: we do not want end-users to be watching github issues; it does not 'filter' correctly for them; the topics can be hard to follow.
* Bugs are curated by devs before being moved to github issues: higher quality bug reports (on the flip side, we might get lazy about this, we may have extra round trips as additional devs see the new report on github issues and ask questions)
* The forum is a funnel for user engagement with TripleA. There is simplicity in having all users report to forum.
If we could easily assign, close, tag and filter forum posts like we can with bug tracking software, I think then there would be little argument for posting bugs to github issues at all. Failing that though, I think the reasons for keeping github issues to bug tracking are compelling. I may be missing something, or under-valuing forum reporting; please weigh in if you think so, or if you generally agree it is most efficient and even easier to funnel all bugs to github issues. We could also assess a hybrid where devs and testers report to github issues (our "secret" bug tracking location), and everyone else goes to forum.
Thinking about it, I think technically our process *is* to have all bugs go to github issues, and that was established. IIRC there was a lot of noise and we got derailed. It's also easy to confuse features/bugs, and to argue the difference between the two. Perhaps this has gotten more confusing by having feature requirements recently moving to forums.
For any bug reports in forums, I would be curious why people are posting bugs there. Regardless, let's revisit and clarify this decision; I think the pro/con list is pretty compelling, though then again I may be missing something important.
|
1.0
|
Glitch (Bug) Reporting - Where? Who? - Following up from testing group forum thread: https://forums.triplea-game.org/topic/268/volunteers-needed-early-release-testing-group/13
The question is raised: where to report game glitches?
Requirements from a game dev point of view:
* We need glitches tracked in GitHub issues. Why GitHub issues? Kindly take this as a given: in short, devs need bugs to be tracked in bug tracking software; forums are not bug tracking software.
## Reasons to report all bugs in github issues first and only
* Least process overhead, no moving bugs from forum to github issues
* All bugs tracked in central location, easier de-duping against open bugs, single way to categorize track bugs
* Github.com has invested a lot in their new user registration process; linking to a single URL for reporting bugs is very useful! Alternatively, we'd need to tell users how to register for forum and where to post bugs. There would be some duplication in documentation (one for devs, another for normal users); we would likely duplicate that documentation in the game and on the website for normal users.
* Some bugs could get lost: the process depends on bugs moving by hand from forum to github issues; humans are fallible, get sick, etc. It's especially important IMO for an open source project like this to cut out any labor requirements possible.
* Ideally devs would not need to watch two locations for bug reports. Feature requests IMO are a bit different, since it is a bit of a different task and audience to build something new compared to fixing or even triaging something broken.
## Reasons to report all bugs to forums first
* Meet players where they are: we do not want end-users to be watching github issues; it does not 'filter' correctly for them; the topics can be hard to follow.
* Bugs are curated by devs before being moved to github issues: higher quality bug reports (on the flip side, we might get lazy about this, we may have extra round trips as additional devs see the new report on github issues and ask questions)
* The forum is a funnel for user engagement with TripleA. There is simplicity in having all users report to forum.
If we could easily assign, close, tag and filter forum posts like we can with bug tracking software, I think then there would be little argument for posting bugs to github issues at all. Failing that though, I think the reasons for keeping github issues to bug tracking are compelling. I may be missing something, or under-valuing forum reporting; please weigh in if you think so, or if you generally agree it is most efficient and even easier to funnel all bugs to github issues. We could also assess a hybrid where devs and testers report to github issues (our "secret" bug tracking location), and everyone else goes to forum.
Thinking about it, I think technically our process *is* to have all bugs go to github issues, and that was established. IIRC there was a lot of noise and we got derailed. It's also easy to confuse features/bugs, and to argue the difference between the two. Perhaps this has gotten more confusing by having feature requirements recently moving to forums.
For any bug reports in forums, I would be curious why people are posting bugs there. Regardless, let's revisit and clarify this decision; I think the pro/con list is pretty compelling, though then again I may be missing something important.
|
process
|
glitch bug reporting where who following up from testing group forum thread the question is raised where to report game glitches requirements from a game dev point of view we need glitches tracked in github issues why github issues kindly take this as a given in short devs need bugs to be tracked in bug tracking software forums are not bug tracking software reasons to report all bugs in github issues first and only least process overhead no moving bugs from forum to github issues all bugs tracked in central location easier de duping against open bugs single way to categorize track bugs github com has invested a lot in their new user registration process linking to a single url for reporting bugs is very useful alternatively we d need to tell users how to register for forum and where to post bugs there would be some duplication in documentation one for devs another for normal users we would likely duplicate that documentation in the game and on the website for normal users some bugs could get lost the process depends on bugs moving by hand from forum to github issues humans are fallible get sick etc it s especially important imo for an open source project like this to cut out any labor requirements possible ideally devs would not need to watch two locations for bug reports feature requests imo are a bit different since it is a bit of a different task and audience to build something new compared to fixing or even triaging something broken reasons to report all bugs to forums first meet players where they are we do not want end users to be watching github issues it does not filter correctly for them the topics can be hard to follow bugs are curated by devs before being moved to github issues higher quality bug reports on the flip side we might get lazy about this we may have extra round trips as additional devs see the new report on github issues and ask questions the forum is a funnel for user engagement with triplea there is simplicity in having all users report to forum if we could easily assign close tag and filter forum posts like we can with bug tracking software i think then there would be little argument for posting bugs to github issues at all failing that though i think the reasons for keeping github issues to bug tracking are compelling i may be missing something or under valuing forum reporting please weigh in if you think so or if you generally agree it is most efficient and even easier to funnel all bugs to github issues we could also assess a hybrid where devs and testers report to github issues our secret bug tracking location and everyone else goes to forum thinking about it i think technically our process is to have all bugs go to github issues and that was established iirc there was a lot of noise and we got derailed it s also easy to confuse features bugs and to argue the difference between the two perhaps this has gotten more confusing by having feature requirements recently moving to forums for any bug reports in forums i would be curious why people are posting bugs there regardless let s revisit and clarify this decision i think the pro con list is pretty compelling though then again i may be missing something important
| 1
|
12,867
| 8,729,308,041
|
IssuesEvent
|
2018-12-10 19:52:38
|
careytews/graph-ui
|
https://api.github.com/repos/careytews/graph-ui
|
closed
|
CVE-2017-16137 Medium Severity Vulnerability detected by WhiteSource
|
security vulnerability
|
## CVE-2017-16137 - Medium Severity Vulnerability
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>debug-0.7.4.tgz</b></p></summary>
<p>small debugging utility</p>
<p>path: /tmp/git/graph-ui/node_modules/debug/package.json</p>
<p>
<p>Library home page: <a href=http://registry.npmjs.org/debug/-/debug-0.7.4.tgz>http://registry.npmjs.org/debug/-/debug-0.7.4.tgz</a></p>
Dependency Hierarchy:
- grunt-contrib-qunit-0.7.0.tgz (Root Library)
- grunt-lib-phantomjs-0.6.0.tgz
- phantomjs-1.9.20.tgz
- extract-zip-1.5.0.tgz
- :x: **debug-0.7.4.tgz** (Vulnerable Library)
<p>Found in commit: <a href="https://github.com/careytews/graph-ui/commit/293867c5bd68f89eb8e53e168ae02f2f2eb2f41c">293867c5bd68f89eb8e53e168ae02f2f2eb2f41c</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The debug module is vulnerable to regular expression denial of service when untrusted user input is passed into the o formatter. It takes around 50k characters to block for 2 seconds making this a low severity issue.
<p>Publish Date: 2018-06-07
<p>URL: <a href=https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2017-16137>CVE-2017-16137</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nodesecurity.io/advisories/534">https://nodesecurity.io/advisories/534</a></p>
<p>Release Date: 2017-09-27</p>
<p>Fix Resolution: Version 2.x.x: Update to version 2.6.9 or later.
Version 3.x.x: Update to version 3.1.0 or later.</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2017-16137 Medium Severity Vulnerability detected by WhiteSource - ## CVE-2017-16137 - Medium Severity Vulnerability
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>debug-0.7.4.tgz</b></p></summary>
<p>small debugging utility</p>
<p>path: /tmp/git/graph-ui/node_modules/debug/package.json</p>
<p>
<p>Library home page: <a href=http://registry.npmjs.org/debug/-/debug-0.7.4.tgz>http://registry.npmjs.org/debug/-/debug-0.7.4.tgz</a></p>
Dependency Hierarchy:
- grunt-contrib-qunit-0.7.0.tgz (Root Library)
- grunt-lib-phantomjs-0.6.0.tgz
- phantomjs-1.9.20.tgz
- extract-zip-1.5.0.tgz
- :x: **debug-0.7.4.tgz** (Vulnerable Library)
<p>Found in commit: <a href="https://github.com/careytews/graph-ui/commit/293867c5bd68f89eb8e53e168ae02f2f2eb2f41c">293867c5bd68f89eb8e53e168ae02f2f2eb2f41c</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The debug module is vulnerable to regular expression denial of service when untrusted user input is passed into the o formatter. It takes around 50k characters to block for 2 seconds making this a low severity issue.
<p>Publish Date: 2018-06-07
<p>URL: <a href=https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2017-16137>CVE-2017-16137</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nodesecurity.io/advisories/534">https://nodesecurity.io/advisories/534</a></p>
<p>Release Date: 2017-09-27</p>
<p>Fix Resolution: Version 2.x.x: Update to version 2.6.9 or later.
Version 3.x.x: Update to version 3.1.0 or later.</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve medium severity vulnerability detected by whitesource cve medium severity vulnerability vulnerable library debug tgz small debugging utility path tmp git graph ui node modules debug package json library home page a href dependency hierarchy grunt contrib qunit tgz root library grunt lib phantomjs tgz phantomjs tgz extract zip tgz x debug tgz vulnerable library found in commit a href vulnerability details the debug module is vulnerable to regular expression denial of service when untrusted user input is passed into the o formatter it takes around characters to block for seconds making this a low severity issue publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution version x x update to version or later version x x update to version or later step up your open source security game with whitesource
| 0
|
19,366
| 25,496,202,578
|
IssuesEvent
|
2022-11-27 18:00:46
|
python/cpython
|
https://api.github.com/repos/python/cpython
|
closed
|
multiprocessing.set_start_method force argument is not documented
|
type-feature docs 3.11 3.10 3.9 3.8 3.7 expert-multiprocessing
|
BPO | [47184](https://bugs.python.org/issue47184)
--- | :---
Nosy | @johnthagen, @dignissimus
PRs | <li>python/cpython#32339</li>
<sup>*Note: these values reflect the state of the issue at the time it was migrated and might not reflect the current state.*</sup>
<details><summary>Show more details</summary><p>
GitHub fields:
```python
assignee = None
closed_at = None
created_at = <Date 2022-03-31.17:42:20.592>
labels = ['3.7', '3.8', '3.9', '3.10', '3.11', 'type-feature', 'docs']
title = 'multiprocessing.set_start_method force argument is not documented'
updated_at = <Date 2022-04-05.19:12:05.366>
user = 'https://github.com/johnthagen'
```
bugs.python.org fields:
```python
activity = <Date 2022-04-05.19:12:05.366>
actor = 'sam_ezeh'
assignee = 'docs@python'
closed = False
closed_date = None
closer = None
components = ['Documentation']
creation = <Date 2022-03-31.17:42:20.592>
creator = 'John Hagen'
dependencies = []
files = []
hgrepos = []
issue_num = 47184
keywords = ['patch']
message_count = 2.0
messages = ['416451', '416805']
nosy_count = 3.0
nosy_names = ['docs@python', 'John Hagen', 'sam_ezeh']
pr_nums = ['32339']
priority = 'normal'
resolution = None
stage = 'patch review'
status = 'open'
superseder = None
type = 'enhancement'
url = 'https://bugs.python.org/issue47184'
versions = ['Python 3.7', 'Python 3.8', 'Python 3.9', 'Python 3.10', 'Python 3.11']
```
</p></details>
<!-- gh-linked-prs -->
### Linked PRs
* gh-99820
<!-- /gh-linked-prs -->
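For anyone landing here, a minimal demonstration of the undocumented `force` argument (POSIX assumed, where both "spawn" and "fork" start methods are available):
```python
# Demonstrates the undocumented force= argument of set_start_method
# (POSIX assumed: "fork" is not available on Windows).
import multiprocessing as mp

if __name__ == "__main__":
    mp.set_start_method("spawn")
    try:
        mp.set_start_method("fork")           # second call normally fails
    except RuntimeError as exc:
        print("without force:", exc)           # "context has already been set"
    mp.set_start_method("fork", force=True)    # force= overrides the old context
    print("with force:", mp.get_start_method())  # -> fork
```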
|
1.0
|
multiprocessing.set_start_method force argument is not documented - BPO | [47184](https://bugs.python.org/issue47184)
--- | :---
Nosy | @johnthagen, @dignissimus
PRs | <li>python/cpython#32339</li>
<sup>*Note: these values reflect the state of the issue at the time it was migrated and might not reflect the current state.*</sup>
<details><summary>Show more details</summary><p>
GitHub fields:
```python
assignee = None
closed_at = None
created_at = <Date 2022-03-31.17:42:20.592>
labels = ['3.7', '3.8', '3.9', '3.10', '3.11', 'type-feature', 'docs']
title = 'multiprocessing.set_start_method force argument is not documented'
updated_at = <Date 2022-04-05.19:12:05.366>
user = 'https://github.com/johnthagen'
```
bugs.python.org fields:
```python
activity = <Date 2022-04-05.19:12:05.366>
actor = 'sam_ezeh'
assignee = 'docs@python'
closed = False
closed_date = None
closer = None
components = ['Documentation']
creation = <Date 2022-03-31.17:42:20.592>
creator = 'John Hagen'
dependencies = []
files = []
hgrepos = []
issue_num = 47184
keywords = ['patch']
message_count = 2.0
messages = ['416451', '416805']
nosy_count = 3.0
nosy_names = ['docs@python', 'John Hagen', 'sam_ezeh']
pr_nums = ['32339']
priority = 'normal'
resolution = None
stage = 'patch review'
status = 'open'
superseder = None
type = 'enhancement'
url = 'https://bugs.python.org/issue47184'
versions = ['Python 3.7', 'Python 3.8', 'Python 3.9', 'Python 3.10', 'Python 3.11']
```
</p></details>
<!-- gh-linked-prs -->
### Linked PRs
* gh-99820
<!-- /gh-linked-prs -->
|
process
|
multiprocessing set start method force argument is not documented bpo nosy johnthagen dignissimus prs python cpython note these values reflect the state of the issue at the time it was migrated and might not reflect the current state show more details github fields python assignee none closed at none created at labels title multiprocessing set start method force argument is not documented updated at user bugs python org fields python activity actor sam ezeh assignee docs python closed false closed date none closer none components creation creator john hagen dependencies files hgrepos issue num keywords message count messages nosy count nosy names pr nums priority normal resolution none stage patch review status open superseder none type enhancement url versions linked prs gh
| 1
|
71,245
| 30,840,927,278
|
IssuesEvent
|
2023-08-02 10:34:45
|
r-geoflow/geoflow
|
https://api.github.com/repos/r-geoflow/geoflow
|
opened
|
Add rule to check if a style is available with same name as the entity id
|
enhancement technicality:MINOR software:geoserver action:geosapi-publish-ogc-services
|
When there is no style associated in the 'Data' column, before choosing the "generic" GeoServer style, we first check whether GeoServer already has a style named after the entity id.
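A sketch of the lookup against the GeoServer REST styles endpoint (Python for illustration only, geoflow itself is R; base URL and credentials are placeholders):
```python
# Sketch of the rule (geoflow itself is R; values below are placeholders):
# use a style named after the entity id when GeoServer has one,
# otherwise fall back to the "generic" style.
import requests

GEOSERVER = "http://localhost:8080/geoserver"  # placeholder base URL
AUTH = ("admin", "geoserver")                   # placeholder credentials

def pick_style(entity_id: str) -> str:
    r = requests.get(f"{GEOSERVER}/rest/styles/{entity_id}.json", auth=AUTH)
    return entity_id if r.status_code == 200 else "generic"

print(pick_style("my_entity_id"))
```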
|
1.0
|
Add rule to check if a style is available with same name as the entity id - When there is no style associated in the 'Data' column, before choosing the "generic" GeoServer style, we first check whether GeoServer already has a style named after the entity id.
|
non_process
|
add rule to check if a style is available with same name as the entity id when there is no style associated in the data column before choosing the generic geoserver style we first check whether geoserver already has a style named after the entity id
| 0
|
13,737
| 16,490,289,444
|
IssuesEvent
|
2021-05-25 02:02:14
|
fluent/fluent-bit
|
https://api.github.com/repos/fluent/fluent-bit
|
closed
|
[windows] freeze on http output when connection fails
|
Stale work-in-process
|
## Bug Report
**Describe the bug**
I read log files and send them out to http. Fluentbit freezes when either
1) the receiving webservice disappears for some time and comes back. The freeze occurs after some of the queued messages have been successfully delivered
2) the receiving webservice is not available at all. After 10 messages fluentbit freezes and does not continue to queue
**log**
[2020/05/06 13:57:54] [debug] [input:tail:tail.0] file=C:\Users\al\devel\ARBEIT\fluentd\POB.log read=1618 lines=1
[2020/05/06 13:57:55] [debug] [task] created task=0000028F489A0290 id=14 OK
[2020/05/06 13:57:55] [error] [io] TCP connection failed: 127.0.0.1:7000 (Unknown error)
[2020/05/06 13:57:55] [error] [output:http:http.0] no upstream connections available to 127.0.0.1:7000
[2020/05/06 13:57:55] [debug] [retry] new retry created for task_id=12 attemps=1
[2020/05/06 13:57:55] [ warn] [engine] failed to flush chunk '15260-1588766272.671583700.flb', retry in 9 seconds: task_id=12, input=tail.0 > output=http.0
[2020/05/06 13:57:55] [error] [io] TCP connection failed: 127.0.0.1:7000 (Unknown error)
[2020/05/06 13:57:55] [error] [output:http:http.0] no upstream connections available to 127.0.0.1:7000
[2020/05/06 13:57:55] [debug] [retry] re-using retry for task_id=1 attemps=2
[2020/05/06 13:57:55] [ warn] [engine] failed to flush chunk '15260-1588766261.671370100.flb', retry in 62 seconds: task_id=1, input=tail.0 > output=http.0
[2020/05/06 13:57:55] [debug] [input:tail:tail.0] file=C:\Users\al\devel\ARBEIT\fluentd\POB.log collect event
[2020/05/06 13:57:55] [debug] [input:tail:tail.0] file=C:\Users\al\devel\ARBEIT\fluentd\POB.log read=1618 lines=1
[2020/05/06 13:57:56] [debug] [task] created task=0000028F4ADB8DD0 id=15 OK
[2020/05/06 13:57:56] [error] [io] TCP connection failed: 127.0.0.1:7000 (Unknown error)
[2020/05/06 13:57:56] [error] [output:http:http.0] no upstream connections available to 127.0.0.1:7000
[2020/05/06 13:57:56] [debug] [retry] new retry created for task_id=13 attemps=1
[2020/05/06 13:57:56] [ warn] [engine] failed to flush chunk '15260-1588766273.665188200.flb', retry in 9 seconds: task_id=13, input=tail.0 > output=http.0
[2020/05/06 13:57:56] [error] [io] TCP connection failed: 127.0.0.1:7000 (Unknown error)
[2020/05/06 13:57:56] [error] [output:http:http.0] no upstream connections available to 127.0.0.1:7000
[2020/05/06 13:57:56] [debug] [retry] re-using retry for task_id=2 attemps=2
[2020/05/06 13:57:56] [ warn] [engine] failed to flush chunk '15260-1588766262.427955800.flb', retry in 62 seconds: task_id=2, input=tail.0 > output=http.0
[2020/05/06 13:57:56] [debug] [input:tail:tail.0] file=C:\Users\al\devel\ARBEIT\fluentd\POB.log collect event
[2020/05/06 13:57:56] [debug] [input:tail:tail.0] file=C:\Users\al\devel\ARBEIT\fluentd\POB.log read=1618 lines=1
[2020/05/06 13:57:57] [debug] [task] created task=0000028F4ADB8D30 id=16 OK
here is the freeze
system: windows
**To Reproduce**
- configure a simple log reader without parser or filter
- forward to http
- disable the receiving webservice
- wait for some messages
- fluentbit freezes
**Your Environment**
latest github from 05.05.2020
windows 10
**config**
```
[SERVICE]
# Flush
# =====
# Set an interval of seconds before to flush records to a destination
Flush 1
# Daemon
# ======
# Instruct Fluent Bit to run in foreground or background mode.
Daemon Off
# Log_Level
# =========
# Set the verbosity level of the service, values can be:
#
# - error
# - warning
# - info
# - debug
# - trace
#
# By default 'info' is set, that means it includes 'error' and 'warning'.
Log_Level trace
# Parsers_File
# ============
# Specify an optional 'Parsers' configuration file
Parsers_File parsers.conf
Plugins_File plugins.conf
# HTTP Server
# ===========
# Enable/Disable the built-in HTTP Server for metrics
HTTP_Server Off
HTTP_Listen 0.0.0.0
HTTP_Port 2020
[INPUT]
Name tail
Path C:\Users\al\devel\ARBEIT\fluentd\*POB*.log
path_key FILEPATH
Refresh_Interval 10
tag pob
Buffer_Chunk_Size 40k
Buffer_Max_Size 400k
Mem_Buf_Limit 10M
[OUTPUT]
Name http
Match pob
Host 127.0.0.1
Port 7000
URI /api
Format json
Retry_Limit False
```
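To reproduce without a real webservice, a throwaway receiver like this (port matches the config above) can be started and stopped to simulate the outage:
```python
# Throwaway receiver for the repro: accepts fluent-bit's POSTs on /api
# and returns 200. Stop and restart it to simulate the webservice outage.
from http.server import BaseHTTPRequestHandler, HTTPServer

class Sink(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        print(f"{self.path}: {len(body)} bytes")
        self.send_response(200)
        self.end_headers()

HTTPServer(("127.0.0.1", 7000), Sink).serve_forever()
```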
|
1.0
|
[windows] freeze on http output when connection fails - ## Bug Report
**Describe the bug**
I read log files and send them out to http. Fluentbit freezes when either
1) the receiving webservice disappears for some time and comes back. The freeze occurs after some of the queued messages have been successfully delivered
2) the receiving webservice is not available at all. After 10 messages fluentbit freezes and does not continue to queue
**log**
[2020/05/06 13:57:54] [debug] [input:tail:tail.0] file=C:\Users\al\devel\ARBEIT\fluentd\POB.log read=1618 lines=1
[2020/05/06 13:57:55] [debug] [task] created task=0000028F489A0290 id=14 OK
[2020/05/06 13:57:55] [error] [io] TCP connection failed: 127.0.0.1:7000 (Unknown error)
[2020/05/06 13:57:55] [error] [output:http:http.0] no upstream connections available to 127.0.0.1:7000
[2020/05/06 13:57:55] [debug] [retry] new retry created for task_id=12 attemps=1
[2020/05/06 13:57:55] [ warn] [engine] failed to flush chunk '15260-1588766272.671583700.flb', retry in 9 seconds: task_id=12, input=tail.0 > output=http.0
[2020/05/06 13:57:55] [error] [io] TCP connection failed: 127.0.0.1:7000 (Unknown error)
[2020/05/06 13:57:55] [error] [output:http:http.0] no upstream connections available to 127.0.0.1:7000
[2020/05/06 13:57:55] [debug] [retry] re-using retry for task_id=1 attemps=2
[2020/05/06 13:57:55] [ warn] [engine] failed to flush chunk '15260-1588766261.671370100.flb', retry in 62 seconds: task_id=1, input=tail.0 > output=http.0
[2020/05/06 13:57:55] [debug] [input:tail:tail.0] file=C:\Users\al\devel\ARBEIT\fluentd\POB.log collect event
[2020/05/06 13:57:55] [debug] [input:tail:tail.0] file=C:\Users\al\devel\ARBEIT\fluentd\POB.log read=1618 lines=1
[2020/05/06 13:57:56] [debug] [task] created task=0000028F4ADB8DD0 id=15 OK
[2020/05/06 13:57:56] [error] [io] TCP connection failed: 127.0.0.1:7000 (Unknown error)
[2020/05/06 13:57:56] [error] [output:http:http.0] no upstream connections available to 127.0.0.1:7000
[2020/05/06 13:57:56] [debug] [retry] new retry created for task_id=13 attemps=1
[2020/05/06 13:57:56] [ warn] [engine] failed to flush chunk '15260-1588766273.665188200.flb', retry in 9 seconds: task_id=13, input=tail.0 > output=http.0
[2020/05/06 13:57:56] [error] [io] TCP connection failed: 127.0.0.1:7000 (Unknown error)
[2020/05/06 13:57:56] [error] [output:http:http.0] no upstream connections available to 127.0.0.1:7000
[2020/05/06 13:57:56] [debug] [retry] re-using retry for task_id=2 attemps=2
[2020/05/06 13:57:56] [ warn] [engine] failed to flush chunk '15260-1588766262.427955800.flb', retry in 62 seconds: task_id=2, input=tail.0 > output=http.0
[2020/05/06 13:57:56] [debug] [input:tail:tail.0] file=C:\Users\al\devel\ARBEIT\fluentd\POB.log collect event
[2020/05/06 13:57:56] [debug] [input:tail:tail.0] file=C:\Users\al\devel\ARBEIT\fluentd\POB.log read=1618 lines=1
[2020/05/06 13:57:57] [debug] [task] created task=0000028F4ADB8D30 id=16 OK
here is the freeze
system: windows
**To Reproduce**
- configure a simple log reader without parser or filter
- forward to http
- disable the receiving webservice
- wait for some messages
- fluentbit freezes
**Your Environment**
latest GitHub build from 05.05.2020
windows 10
**config**
```
[SERVICE]
# Flush
# =====
# Set an interval of seconds before to flush records to a destination
Flush 1
# Daemon
# ======
# Instruct Fluent Bit to run in foreground or background mode.
Daemon Off
# Log_Level
# =========
# Set the verbosity level of the service, values can be:
#
# - error
# - warning
# - info
# - debug
# - trace
#
# By default 'info' is set, that means it includes 'error' and 'warning'.
Log_Level trace
# Parsers_File
# ============
# Specify an optional 'Parsers' configuration file
Parsers_File parsers.conf
Plugins_File plugins.conf
# HTTP Server
# ===========
# Enable/Disable the built-in HTTP Server for metrics
HTTP_Server Off
HTTP_Listen 0.0.0.0
HTTP_Port 2020
[INPUT]
Name tail
Path C:\Users\al\devel\ARBEIT\fluentd\*POB*.log
path_key FILEPATH
Refresh_Interval 10
tag pob
Buffer_Chunk_Size 40k
Buffer_Max_Size 400k
Mem_Buf_Limit 10M
[OUTPUT]
Name http
Match pob
Host 127.0.0.1
Port 7000
URI /api
Format json
Retry_Limit False
```
|
process
|
freeze on http output when connection fails bug report describe the bug i read log files and send them out to http fluentbit freezes when either the receiving webservice disappears for some time and comes back the freeze occurrs after some of the queued messages have been successfully delivered the receiving webservice is not available at all after messages fluentbit freezes and does not continue to queue log file c users al devel arbeit fluentd pob log read lines created task id ok tcp connection failed unknown error no upstream connections available to new retry created for task id attemps failed to flush chunk flb retry in seconds task id input tail output http tcp connection failed unknown error no upstream connections available to re using retry for task id attemps failed to flush chunk flb retry in seconds task id input tail output http file c users al devel arbeit fluentd pob log collect event file c users al devel arbeit fluentd pob log read lines created task id ok tcp connection failed unknown error no upstream connections available to new retry created for task id attemps failed to flush chunk flb retry in seconds task id input tail output http tcp connection failed unknown error no upstream connections available to re using retry for task id attemps failed to flush chunk flb retry in seconds task id input tail output http file c users al devel arbeit fluentd pob log collect event file c users al devel arbeit fluentd pob log read lines created task id ok here is the freeze system windows to reproduce configure a simple log reader without parser filter forward to http disable the receiving webservice wait some messages fluentbit freezes your environment latest github from windows config flush set an interval of seconds before to flush records to a destination flush daemon instruct fluent bit to run in foreground or background mode daemon off log level set the verbosity level of the service values can be error warning info debug trace by default info is set that means it includes error and warning log level trace parsers file specify an optional parsers configuration file parsers file parsers conf plugins file plugins conf http server enable disable the built in http server for metrics http server off http listen http port name tail path c users al devel arbeit fluentd pob log path key filepath refresh interval tag pob buffer chunk size buffer max size mem buf limit name http match pob host port uri api format json retry limit false
| 1
|
225,930
| 24,918,627,163
|
IssuesEvent
|
2022-10-30 17:53:15
|
rsoreq/sqlmap
|
https://api.github.com/repos/rsoreq/sqlmap
|
reopened
|
CVE-2016-10735 (Medium) detected in bootstrap-3.3.0.min.js
|
security vulnerability
|
## CVE-2016-10735 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>bootstrap-3.3.0.min.js</b></p></summary>
<p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.0/js/bootstrap.min.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.0/js/bootstrap.min.js</a></p>
<p>Path to dependency file: /data/html/index.html</p>
<p>Path to vulnerable library: /data/html/index.html</p>
<p>
Dependency Hierarchy:
- :x: **bootstrap-3.3.0.min.js** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In Bootstrap 3.x before 3.4.0 and 4.x-beta before 4.0.0-beta.2, XSS is possible in the data-target attribute, a different vulnerability than CVE-2018-14041.
<p>Publish Date: 2019-01-09
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2016-10735>CVE-2016-10735</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2016-10735">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2016-10735</a></p>
<p>Release Date: 2019-01-09</p>
<p>Fix Resolution: bootstrap - 3.4.0, 4.0.0-beta.2</p>
</p>
</details>
<p></p>
|
True
|
CVE-2016-10735 (Medium) detected in bootstrap-3.3.0.min.js - ## CVE-2016-10735 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>bootstrap-3.3.0.min.js</b></p></summary>
<p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.0/js/bootstrap.min.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.0/js/bootstrap.min.js</a></p>
<p>Path to dependency file: /data/html/index.html</p>
<p>Path to vulnerable library: /data/html/index.html</p>
<p>
Dependency Hierarchy:
- :x: **bootstrap-3.3.0.min.js** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In Bootstrap 3.x before 3.4.0 and 4.x-beta before 4.0.0-beta.2, XSS is possible in the data-target attribute, a different vulnerability than CVE-2018-14041.
<p>Publish Date: 2019-01-09
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2016-10735>CVE-2016-10735</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2016-10735">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2016-10735</a></p>
<p>Release Date: 2019-01-09</p>
<p>Fix Resolution: bootstrap - 3.4.0, 4.0.0-beta.2</p>
</p>
</details>
<p></p>
|
non_process
|
cve medium detected in bootstrap min js cve medium severity vulnerability vulnerable library bootstrap min js the most popular front end framework for developing responsive mobile first projects on the web library home page a href path to dependency file data html index html path to vulnerable library data html index html dependency hierarchy x bootstrap min js vulnerable library found in base branch master vulnerability details in bootstrap x before and x beta before beta xss is possible in the data target attribute a different vulnerability than cve publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution bootstrap beta
| 0
|
13,483
| 16,016,634,096
|
IssuesEvent
|
2021-04-20 16:49:23
|
tdwg/dwc
|
https://api.github.com/repos/tdwg/dwc
|
reopened
|
verbatimLabel
|
Class - MaterialSample Process - dismissed Process - need evidence for demand Term - add
|
Was https://code.google.com/p/darwincore/issues/detail?id=124
Submitter: Hannu Saarenmaa
Justification: In the first phase of the digitisation process we try to capture everything "as is". Interpretation should follow from that.
Definition: The full, verbatim text from the specimen label.
Comment: There are various verbatim fields in Darwin Core already, but they do not capture everything.
Refines:
Has Domain: Separators for line and different labels are needed. They need to be something that cannot possibly be present in label texts, such as $ and §.
Has Range:
Replaces:
ABCD 2.06:
Oct 6, 2011 comment #1 wixner
I second that. At GBIF we had created our own term for this and it would be lovely to reuse a dwc term instead:
http://rs.gbif.org/extension/gbif/1.0/typesandspecimen.xml#verbatimLabel
Sep 23, 2013 comment #4 gtuco.btuco
I would like to promote the adoption of this term. To do so, I will need a stronger proposal demonstrating the need to share this information - that is, that independent groups, organizations, projects have the same need and can reach a consensus proposal about how the term should be used.
Sep 23, 2013 comment #5 gkamp76
verbatimLabel information, capturing all labels as they appear with the specimen, is essential for preserving the original information before subsequent interpretation takes place. It is in fact, one of the simpler tasks (aside from handwriting interpretation) for relatively untrained data entry workers to do. Any future interpretations of the data from the verbatimLabel can then be compared as political boundaries change, shortening or changing of collector information with subsequent publication, any number of interpretations may need to ultimately refer back to the original source: the verbatimLabel.
The question I would propose is if you are talking about all labels, what do you really mean? Would this include specimen identifier labels? Determination labels? Type labels? Loan labels? The latter are often removed when a loan is returned. What constitutes the "original verbatimLabel" information? At the time of recording, having all of this information in one place (and if photographed, all are easily included) could be helpful as future workers realize, for example, that the attribution of one person as a determiner was incorrect given the date and taxon in question, and it was actually someone else with similar initials and family name.
Sep 23, 2013 comment #6 gtuco.btuco
It might be a good idea to circulate the proposal on tdwg-content and see if a community can be built around and support the addition of this concept.
|
2.0
|
verbatimLabel - Was https://code.google.com/p/darwincore/issues/detail?id=124
Submitter: Hannu Saarenmaa
Justification: In the first phase of the digitisation process we try to capture everything "as is". Interpretation should follow from that.
Definition: The full, verbatim text from the specimen label.
Comment: There are various verbatim fields in Darwin Core already, but they do not capture everything.
Refines:
Has Domain: Separators for line and different labels are needed. They need to be something that cannot possibly be present in label texts, such as $ and §.
Has Range:
Replaces:
ABCD 2.06:
Oct 6, 2011 comment #1 wixner
I second that. At GBIF we had created our own term for this and it would be lovely to reuse a dwc term instead:
http://rs.gbif.org/extension/gbif/1.0/typesandspecimen.xml#verbatimLabel
Sep 23, 2013 comment #4 gtuco.btuco
I would like to promote the adoption of this term. To do so, I will need a stronger proposal demonstrating the need to share this information - that is, that independent groups, organizations, projects have the same need and can reach a consensus proposal about how the term should be used.
Sep 23, 2013 comment #5 gkamp76
verbatimLabel information, capturing all labels as they appear with the specimen, is essential for preserving the original information before subsequent interpretation takes place. It is in fact, one of the simpler tasks (aside from handwriting interpretation) for relatively untrained data entry workers to do. Any future interpretations of the data from the verbatimLabel can then be compared as political boundaries change, shortening or changing of collector information with subsequent publication, any number of interpretations may need to ultimately refer back to the original source: the verbatimLabel.
The question I would propose is if you are talking about all labels, what do you really mean? Would this include specimen identifier labels? Determination labels? Type labels? Loan labels? The latter are often removed when a loan is returned. What constitutes the "original verbatimLabel" information? At the time of recording, having all of this information in one place (and if photographed, all are easily included) could be helpful as future workers realize, for example, that the attribution of one person as a determiner was incorrect given the date and taxon in question, and it was actually someone else with similar initials and family name.
Sep 23, 2013 comment #6 gtuco.btuco
It might be a good idea to circulate the proposal on tdwg-content and see if a community can be built around and support the addition of this concept.
|
process
|
verbatimlabel was submitter hannu saarenmaa justification in the first phase of the digitisation process we try to capture everything as is interpretation should follow from that definition the full verbatim text from the specimen label comment there are various verbatim fields in darwin core already but they do not capture everything refines has domain separators for line and different labels are needed they need to be something that cannot possibly be present in label texts such as and § has range replaces abcd oct comment wixner i second that at gbif we had created our own term for this and it would be lovely to reuse a dwc term instead sep comment gtuco btuco i would like to promote the adoption of this term to do so i will need a stronger proposal demonstrating the need to share this information that is that independent groups organizations projects have the same need and can reach a consensus proposal about how the term should be used sep comment verbatimlabel information capturing all labels as they appear with the specimen is essential for preserving the original information before subsequent interpretation takes place it is in fact one of the simpler tasks aside from handwriting interpretation for relatively untrained data entry workers to do any future interpretations of the data from the verbatimlabel can then be compared as political boundaries change shortening or changing of collector information with subsequent publication any number of interpretations may need to ultimately refer back to the original source the verbatimlabel the question i would propose is if you are talking about all labels what do you really mean would this include specimen identifer labels determination labels type labels loan labels the latter are often removed when a loan is returned what constitutes the original verbatimlabel information at the time of recording having all of this information in one place and if photographed all are easily included could be helpful as future workers realize for example that the attribution of one person as a determiner was incorrect given the date and taxon in question and it was actually someone else with similar initials and family name sep comment gtuco btuco it might be a good idea to circulate the proposal on tdwg content and see if a community can be built around and support the addition of this concept
| 1
|
22,532
| 31,681,746,246
|
IssuesEvent
|
2023-09-08 00:50:03
|
FasterXML/jackson-core
|
https://api.github.com/repos/FasterXML/jackson-core
|
closed
|
Add configurable limit for the maximum number of bytes/chars of content to parse before failing
|
2.16 processing-limits
|
(note: part of #637)
Jackson 2.15 included a few processing limits that can be applied to limit processing for "too big content"; first focusing on general nesting depth and max. length of individual tokens.
While this is a good first step, it also makes sense to offer a simple way to limit the maximum content allowed to be read in total -- typically a maximum document size, but in the case of line-delimited input, maximum streaming content.
The reasoning for the addition of such a feature is that although users can -- if they must -- implement this at a yet lower level (a length-limited `InputStream`, for example), there are some benefits to the Jackson streaming component offering this:
1. Less work for the user (obviously), better accessibility leading to wider adoption and helping against possible DoS vectors
2. Better integration via a well-defined exception type common to constraints violations (`StreamConstraintsException`)
3. More reliable limits when a shared implementation is used (i.e. less likely that users/devs implement faulty limit checks)
Note, too, that this feature significantly improves the usefulness (or right now, lack thereof) of #863 to combine per-token limits with overall limits.
NOTE: the default setting for this limit should, however, be left as "unlimited": using anything else is likely to break some processing somewhere.
Limit has to be defined as 64-bit `long` (not `int`); default value to use then is likely `Long.MAX_VALUE`.
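The lower-level workaround mentioned above could look like the following sketch; Python is used here purely for illustration, and the wrapper is not part of any Jackson API (in Java this role would be played by a length-limited `InputStream`):
```python
# Sketch of a length-limited stream wrapper: raises once more than
# max_bytes have been read. Illustrative only; not a Jackson API.
import io

class LengthLimitedReader(io.RawIOBase):
    def __init__(self, raw, max_bytes):
        self._raw = raw
        self._remaining = max_bytes

    def readable(self):
        return True

    def readinto(self, buf):
        n = self._raw.readinto(buf)
        if n:
            self._remaining -= n
            if self._remaining < 0:
                raise ValueError("input exceeds configured maximum length")
        return n

# Usage: wrap the source before handing it to a parser.
limited = io.BufferedReader(LengthLimitedReader(io.BytesIO(b'{"k": 1}'), 1024))
print(limited.read())  # a payload over 1024 bytes would raise instead
```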
|
1.0
|
Add configurable limit for the maximum number of bytes/chars of content to parse before failing - (note: part of #637)
Jackson 2.15 included a few processing limits that can be applied to limit processing for "too big content"; first focusing on general nesting depth and max. length of individual tokens.
While this is a good first step, it also makes sense to offer a simple way to limit the maximum content allowed to be read in total -- typically a maximum document size, but in the case of line-delimited input, maximum streaming content.
The reasoning for the addition of such a feature is that although users can -- if they must -- implement this at a yet lower level (a length-limited `InputStream`, for example), there are some benefits to the Jackson streaming component offering this:
1. Less work for the user (obviously), better accessibility leading to wider adoption and helping against possible DoS vectors
2. Better integration via a well-defined exception type common to constraints violations (`StreamConstraintsException`)
3. More reliable limits when a shared implementation is used (i.e. less likely that users/devs implement faulty limit checks)
Note, too, that this feature significantly improves the usefulness (or right now, lack thereof) of #863 to combine per-token limits with overall limits.
NOTE: the default setting for this limit should, however, be left as "unlimited": using anything else is likely to break some processing somewhere.
Limit has to be defined as 64-bit `long` (not `int`); default value to use then is likely `Long.MAX_VALUE`.
|
process
|
add configurable limit for the maximum number of bytes chars of content to parse before failing note part of jackson included a few processing limits that can be applied to limit processing for too big content first focusing on general nesting depth and max length of individual tokens while this is good first step it also makes sense to offer a simple way to limit maximum content in total allowed to be read typically a maximum document size but in case of line delimited input maximum streaming content the reasoning for addition of such feature is that although users can if they must implement this at yet lower level length limited inputstream for example there are some benefits from jackson streaming component offering this less work for user obviously better accessibility leading to wider adoption and helping against possible dos vectors better integration via well defined exception type common to constraints violations streamconstraintsexception more reliable limits when shared implementation used i e less like users devs implement faulty limits checks note too that this feature significantly improves usefulness or right now lack thereof of to combine per token limits with overall limits note the default setting for this limits should however be left as unlimited using anything else is likely to break some processing somewhere limit has to be defined as bit long not int default value to use then is likely long max value
| 1
|
287,498
| 24,834,320,197
|
IssuesEvent
|
2022-10-26 07:35:59
|
cockroachdb/cockroach
|
https://api.github.com/repos/cockroachdb/cockroach
|
opened
|
roachtest: backup/2TB/n10cpu4 failed
|
C-test-failure O-robot O-roachtest branch-master release-blocker
|
roachtest.backup/2TB/n10cpu4 [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestNightlyGceBazel/7136814?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestNightlyGceBazel/7136814?buildTab=artifacts#/backup/2TB/n10cpu4) on master @ [1b1c8da55be48c174b7b370b305f42622546209f](https://github.com/cockroachdb/cockroach/commits/1b1c8da55be48c174b7b370b305f42622546209f):
```
test artifacts and logs in: /artifacts/backup/2TB/n10cpu4/run_1
(test_impl.go:291).Fatal: output in run_063207.327473943_n1_workload_fixtures_import_bank: ./workload fixtures import bank --db=bank --payload-bytes=10240 --csv-server http://localhost:8081 --seed=1 --ranges=0 --rows=65104166 {pgurl:1} returned: COMMAND_PROBLEM: exit status 1
```
<p>Parameters: <code>ROACHTEST_cloud=gce</code>
, <code>ROACHTEST_cpu=4</code>
, <code>ROACHTEST_encrypted=true</code>
, <code>ROACHTEST_fs=ext4</code>
, <code>ROACHTEST_localSSD=true</code>
, <code>ROACHTEST_ssd=0</code>
</p>
<details><summary>Help</summary>
<p>
See: [roachtest README](https://github.com/cockroachdb/cockroach/blob/master/pkg/cmd/roachtest/README.md)
See: [How To Investigate \(internal\)](https://cockroachlabs.atlassian.net/l/c/SSSBr8c7)
</p>
</details>
<details><summary>Same failure on other branches</summary>
<p>
- #90678 roachtest: backup/2TB/n10cpu4 failed [C-test-failure O-roachtest O-robot T-disaster-recovery branch-release-22.1 release-blocker]
- #80030 roachtest: backup/2TB/n10cpu4 failed [C-test-failure O-roachtest O-robot T-disaster-recovery]
</p>
</details>
/cc @cockroachdb/disaster-recovery
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*backup/2TB/n10cpu4.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
|
2.0
|
roachtest: backup/2TB/n10cpu4 failed - roachtest.backup/2TB/n10cpu4 [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestNightlyGceBazel/7136814?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestNightlyGceBazel/7136814?buildTab=artifacts#/backup/2TB/n10cpu4) on master @ [1b1c8da55be48c174b7b370b305f42622546209f](https://github.com/cockroachdb/cockroach/commits/1b1c8da55be48c174b7b370b305f42622546209f):
```
test artifacts and logs in: /artifacts/backup/2TB/n10cpu4/run_1
(test_impl.go:291).Fatal: output in run_063207.327473943_n1_workload_fixtures_import_bank: ./workload fixtures import bank --db=bank --payload-bytes=10240 --csv-server http://localhost:8081 --seed=1 --ranges=0 --rows=65104166 {pgurl:1} returned: COMMAND_PROBLEM: exit status 1
```
<p>Parameters: <code>ROACHTEST_cloud=gce</code>
, <code>ROACHTEST_cpu=4</code>
, <code>ROACHTEST_encrypted=true</code>
, <code>ROACHTEST_fs=ext4</code>
, <code>ROACHTEST_localSSD=true</code>
, <code>ROACHTEST_ssd=0</code>
</p>
<details><summary>Help</summary>
<p>
See: [roachtest README](https://github.com/cockroachdb/cockroach/blob/master/pkg/cmd/roachtest/README.md)
See: [How To Investigate \(internal\)](https://cockroachlabs.atlassian.net/l/c/SSSBr8c7)
</p>
</details>
<details><summary>Same failure on other branches</summary>
<p>
- #90678 roachtest: backup/2TB/n10cpu4 failed [C-test-failure O-roachtest O-robot T-disaster-recovery branch-release-22.1 release-blocker]
- #80030 roachtest: backup/2TB/n10cpu4 failed [C-test-failure O-roachtest O-robot T-disaster-recovery]
</p>
</details>
/cc @cockroachdb/disaster-recovery
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*backup/2TB/n10cpu4.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
|
non_process
|
roachtest backup failed roachtest backup with on master test artifacts and logs in artifacts backup run test impl go fatal output in run workload fixtures import bank workload fixtures import bank db bank payload bytes csv server seed ranges rows pgurl returned command problem exit status parameters roachtest cloud gce roachtest cpu roachtest encrypted true roachtest fs roachtest localssd true roachtest ssd help see see same failure on other branches roachtest backup failed roachtest backup failed cc cockroachdb disaster recovery
| 0
|
6,385
| 9,459,651,178
|
IssuesEvent
|
2019-04-17 09:02:29
|
log2timeline/plaso
|
https://api.github.com/repos/log2timeline/plaso
|
closed
|
file_entry can be None and raises exception
|
bug preprocessing
|
https://github.com/log2timeline/plaso/blob/bc6314896bb2f5cfa4c319972abfadfef3f9c441/plaso/preprocessors/linux.py#L194
```
Processing started.
Traceback (most recent call last):
File "/usr/bin/log2timeline.py", line 83, in <module>
if not Main():
File "/usr/bin/log2timeline.py", line 69, in Main
tool.ExtractEventsFromSources()
File "/usr/lib/python2.7/dist-packages/plaso/cli/log2timeline_tool.py", line 415, in ExtractEventsFromSources
self._PreprocessSources(extraction_engine)
File "/usr/lib/python2.7/dist-packages/plaso/cli/extraction_tool.py", line 188, in _PreprocessSources
resolver_context=self._resolver_context)
File "/usr/lib/python2.7/dist-packages/plaso/engine/engine.py", line 273, in PreprocessSources
self.knowledge_base)
File "/usr/lib/python2.7/dist-packages/plaso/preprocessors/manager.py", line 311, in RunPlugins
artifacts_registry, knowledge_base, searcher, file_system)
File "/usr/lib/python2.7/dist-packages/plaso/preprocessors/manager.py", line 149, in CollectFromFileSystem
knowledge_base, artifact_definition, searcher, file_system)
File "/usr/lib/python2.7/dist-packages/plaso/preprocessors/interface.py", line 82, in Collect
source.separator)
File "/usr/lib/python2.7/dist-packages/plaso/preprocessors/interface.py", line 135, in _ParsePathSpecification
self._ParseFileEntry(knowledge_base, file_entry)
File "/usr/lib/python2.7/dist-packages/plaso/preprocessors/linux.py", line 194, in _ParseFileEntry
if file_entry.link:
AttributeError: 'NoneType' object has no attribute 'link'
```
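A guard of roughly this shape at the line referenced above would avoid the AttributeError. This is a sketch of the kind of fix, not the actual plaso patch:
```python
# Sketch of a None guard for _ParseFileEntry in preprocessors/linux.py
# (illustrative, not the actual patch): the searcher can hand back no
# file entry for an unresolvable path specification.
def _ParseFileEntry(self, knowledge_base, file_entry):
    if file_entry is None:
        return
    if file_entry.link:
        ...  # existing link-handling logic continues unchanged
```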
|
1.0
|
file_entry can be None and raises exception - https://github.com/log2timeline/plaso/blob/bc6314896bb2f5cfa4c319972abfadfef3f9c441/plaso/preprocessors/linux.py#L194
```
Processing started.
Traceback (most recent call last):
File "/usr/bin/log2timeline.py", line 83, in <module>
if not Main():
File "/usr/bin/log2timeline.py", line 69, in Main
tool.ExtractEventsFromSources()
File "/usr/lib/python2.7/dist-packages/plaso/cli/log2timeline_tool.py", line 415, in ExtractEventsFromSources
self._PreprocessSources(extraction_engine)
File "/usr/lib/python2.7/dist-packages/plaso/cli/extraction_tool.py", line 188, in _PreprocessSources
resolver_context=self._resolver_context)
File "/usr/lib/python2.7/dist-packages/plaso/engine/engine.py", line 273, in PreprocessSources
self.knowledge_base)
File "/usr/lib/python2.7/dist-packages/plaso/preprocessors/manager.py", line 311, in RunPlugins
artifacts_registry, knowledge_base, searcher, file_system)
File "/usr/lib/python2.7/dist-packages/plaso/preprocessors/manager.py", line 149, in CollectFromFileSystem
knowledge_base, artifact_definition, searcher, file_system)
File "/usr/lib/python2.7/dist-packages/plaso/preprocessors/interface.py", line 82, in Collect
source.separator)
File "/usr/lib/python2.7/dist-packages/plaso/preprocessors/interface.py", line 135, in _ParsePathSpecification
self._ParseFileEntry(knowledge_base, file_entry)
File "/usr/lib/python2.7/dist-packages/plaso/preprocessors/linux.py", line 194, in _ParseFileEntry
if file_entry.link:
AttributeError: 'NoneType' object has no attribute 'link'
```
|
process
|
file entry can be none and raises exception processing started traceback most recent call last file usr bin py line in if not main file usr bin py line in main tool extracteventsfromsources file usr lib dist packages plaso cli tool py line in extracteventsfromsources self preprocesssources extraction engine file usr lib dist packages plaso cli extraction tool py line in preprocesssources resolver context self resolver context file usr lib dist packages plaso engine engine py line in preprocesssources self knowledge base file usr lib dist packages plaso preprocessors manager py line in runplugins artifacts registry knowledge base searcher file system file usr lib dist packages plaso preprocessors manager py line in collectfromfilesystem knowledge base artifact definition searcher file system file usr lib dist packages plaso preprocessors interface py line in collect source separator file usr lib dist packages plaso preprocessors interface py line in parsepathspecification self parsefileentry knowledge base file entry file usr lib dist packages plaso preprocessors linux py line in parsefileentry if file entry link attributeerror nonetype object has no attribute link
| 1
|
6,711
| 9,818,410,849
|
IssuesEvent
|
2019-06-13 19:10:36
|
meumobi/sitebuilder
|
https://api.github.com/repos/meumobi/sitebuilder
|
opened
|
log when remote media info doesn't recognize mime type
|
monitoring process-remote-media
|
### Expected behaviour
Tell us what should happen
### Actual behaviour
Tell us what happens instead
### Steps to reproduce
1.
2.
3.
### Expected responses
- Why it happens
- How to fix it
- How to test
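For what the title asks, the check could be as small as the sketch below; the function name is illustrative, not sitebuilder's actual code:
```python
# Illustrative sketch: warn when a remote media URL yields no recognizable
# MIME type. Names are hypothetical, not sitebuilder's actual helpers.
import logging
import mimetypes

def detect_mime(url):
    mime, _ = mimetypes.guess_type(url)
    if mime is None:
        logging.warning("unrecognized MIME type for remote media: %s", url)
    return mime

print(detect_mime("http://example.com/video.mp4"))   # 'video/mp4'
print(detect_mime("http://example.com/media_blob"))  # logs a warning, returns None
```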
|
1.0
|
log when remote media info doesn't recognize mime type - ### Expected behaviour
Tell us what should happen
### Actual behaviour
Tell us what happens instead
### Steps to reproduce
1.
2.
3.
### Expected responses
- Why it happens
- How to fix it
- How to test
|
process
|
log when remote media info don t recognize mime type expected behaviour tell us what should happen actual behaviour tell us what happens instead steps to reproduce expected responses why it happens how to fix it how to test
| 1
|
1,584
| 4,175,295,176
|
IssuesEvent
|
2016-06-21 16:24:57
|
kerubistan/kerub
|
https://api.github.com/repos/kerubistan/kerub
|
opened
|
start vm: filter by host capabilities
|
component:data processing enhancement
|
Do not try to start a VM on a host that does not have
- required hardware resources (like the svm instructions for KVM)
- required software packages installed
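Illustratively, the scheduler-side check could be a simple predicate like the sketch below; the host fields and feature names are hypothetical, not kerub's actual model:
```python
# Hypothetical capability filter: keep only hosts that satisfy both the
# hardware-feature and the installed-package requirements.
def eligible_hosts(hosts, required_features, required_packages):
    return [
        host for host in hosts
        if required_features <= host["cpu_features"]        # e.g. {"svm"} for KVM on AMD
        and required_packages <= host["installed_packages"]
    ]

hosts = [
    {"name": "h1", "cpu_features": {"svm"}, "installed_packages": {"qemu-kvm", "libvirt"}},
    {"name": "h2", "cpu_features": set(), "installed_packages": {"qemu-kvm"}},
]
print(eligible_hosts(hosts, {"svm"}, {"qemu-kvm"}))  # only h1 qualifies
```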
|
1.0
|
start vm: filter by host capabilities - Do not try to start a VM on a host that does not have
- required hardware resources (like the svm instructions for KVM)
- required software packages installed
|
process
|
start vm filter by host capabilities do not try to start a vm on a host that does not have required hardware resources like the svm instructions for kvm required software packages installed
| 1
|
16,507
| 21,510,735,369
|
IssuesEvent
|
2022-04-28 04:01:49
|
dita-ot/dita-ot
|
https://api.github.com/repos/dita-ot/dita-ot
|
closed
|
Remove support for delayed conref domain
|
priority/medium preprocess enhancement
|
## Description
Remove support for [delayed conref domain](https://docs.oasis-open.org/dita/dita/v1.3/errata02/os/complete/part3-all-inclusive/langRef/containers/delayconref-d.html). It's only used by Eclipse Help plug-in and has very little use. Removing support for the domain will simplify code and reduced code maintenance burden.
|
1.0
|
Remove support for delayed conref domain - ## Description
Remove support for [delayed conref domain](https://docs.oasis-open.org/dita/dita/v1.3/errata02/os/complete/part3-all-inclusive/langRef/containers/delayconref-d.html). It's only used by Eclipse Help plug-in and has very little use. Removing support for the domain will simplify code and reduced code maintenance burden.
|
process
|
remove support for delayed conref domain description remove support for it s only used by eclipse help plug in and has very little use removing support for the domain will simplify code and reduced code maintenance burden
| 1
|
674,834
| 23,067,296,680
|
IssuesEvent
|
2022-07-25 14:53:39
|
pycaret/pycaret
|
https://api.github.com/repos/pycaret/pycaret
|
closed
|
[BUG]: Can't train with data frequency of 30 minutes
|
bug time_series priority_medium
|
### pycaret version checks
- [X] I have checked that this issue has not already been reported [here](https://github.com/pycaret/pycaret/issues).
- [X] I have confirmed this bug exists on the [latest version](https://github.com/pycaret/pycaret/releases) of pycaret.
- [ ] I have confirmed this bug exists on the develop branch of pycaret (pip install -U git+https://github.com/pycaret/pycaret.git@develop).
### Issue Description
Can't train most models when freq='30T'.
### Reproducible Example
```python
import numpy as np
import pandas as pd
from pycaret.time_series import TSForecastingExperiment
index = pd.date_range(start='2020-10-01 00:00:00', end='2020-10-01 23:30:00', freq='30T')
np.random.seed(5005)
df = pd.DataFrame({
'index': index,
'target': np.random.randint(0, 255, index.shape[0], dtype=int),
}).set_index('index')
df
exp = TSForecastingExperiment()
s = exp.setup(
data=df,
target='target',
fh=1,
fold=3,
session_id=5005
)
exp.create_model('lr_cds_dt')
```
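Note that the example input is a single day of half-hour data, i.e. 48 rows, so the 1381 in the inconsistent-samples error below cannot come straight from the raw series. A quick check, as a sketch:
```python
# Sanity check: the reproducible example yields 48 half-hour timestamps.
import pandas as pd

index = pd.date_range(start='2020-10-01 00:00:00',
                      end='2020-10-01 23:30:00', freq='30T')
print(len(index))  # 48
```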
### Expected Behavior
Model to be trained.
### Actual Results
```python-traceback
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
Input In [1], in <cell line: 26>()
16 exp = TSForecastingExperiment()
18 s = exp.setup(
19 data=df,
20 target='target',
(...)
23 session_id=5005
24 )
---> 26 exp.create_model('lr_cds_dt')
File ~\anaconda3\envs\pycaret-ts\lib\site-packages\pycaret\time_series\forecasting\oop.py:1718, in TSForecastingExperiment.create_model(self, estimator, fold, round, cross_validation, fit_kwargs, experiment_custom_tags, verbose, **kwargs)
1623 """
1624 This function trains and evaluates the performance of a given estimator
1625 using cross validation. The output of this function is a score grid with
(...)
1713
1714 """
1716 self._check_setup_ran()
-> 1718 return super().create_model(
1719 estimator=estimator,
1720 fold=fold,
1721 round=round,
1722 cross_validation=cross_validation,
1723 fit_kwargs=fit_kwargs,
1724 experiment_custom_tags=experiment_custom_tags,
1725 verbose=verbose,
1726 **kwargs,
1727 )
File ~\anaconda3\envs\pycaret-ts\lib\site-packages\pycaret\internal\pycaret_experiment\supervised_experiment.py:1472, in _SupervisedExperiment.create_model(self, estimator, fold, round, cross_validation, predict, fit_kwargs, groups, refit, probability_threshold, experiment_custom_tags, verbose, system, add_to_model_list, X_train_data, y_train_data, metrics, display, **kwargs)
1469 return model, model_fit_time
1470 return model
-> 1472 model, model_fit_time, model_results, avgs_dict = self._create_model_with_cv(
1473 model,
1474 data_X,
1475 data_y,
1476 fit_kwargs,
1477 round,
1478 cv,
1479 groups,
1480 metrics,
1481 refit,
1482 system,
1483 display,
1484 )
1486 # end runtime
1487 runtime_end = time.time()
File ~\anaconda3\envs\pycaret-ts\lib\site-packages\pycaret\time_series\forecasting\oop.py:1932, in TSForecastingExperiment._create_model_with_cv(self, model, data_X, data_y, fit_kwargs, round, cv, groups, metrics, refit, system, display)
1930 self.logger.info("Finalizing model")
1931 with io.capture_output():
-> 1932 pipeline_with_model.fit(y=data_y, X=data_X, **fit_kwargs)
1933 model_fit_end = time.time()
1934 model_fit_time = np.array(model_fit_end - model_fit_start).round(2)
File ~\anaconda3\envs\pycaret-ts\lib\site-packages\sktime\forecasting\base\_base.py:139, in BaseForecaster.fit(self, y, X, fh)
134 self._update_y_X(y_inner, X_inner)
136 # checks and conversions complete, pass to inner fit
137 #####################################################
--> 139 self._fit(y=y_inner, X=X_inner, fh=fh)
141 # this should happen last
142 self._is_fitted = True
File ~\anaconda3\envs\pycaret-ts\lib\site-packages\sktime\forecasting\compose\_pipeline.py:245, in ForecastingPipeline._fit(self, y, X, fh)
243 name, forecaster = self.steps[-1]
244 f = clone(forecaster)
--> 245 f.fit(y, X, fh)
246 self.steps_[-1] = (name, f)
248 return self
File ~\anaconda3\envs\pycaret-ts\lib\site-packages\sktime\forecasting\base\_base.py:139, in BaseForecaster.fit(self, y, X, fh)
134 self._update_y_X(y_inner, X_inner)
136 # checks and conversions complete, pass to inner fit
137 #####################################################
--> 139 self._fit(y=y_inner, X=X_inner, fh=fh)
141 # this should happen last
142 self._is_fitted = True
File ~\anaconda3\envs\pycaret-ts\lib\site-packages\sktime\forecasting\compose\_pipeline.py:396, in TransformedTargetForecaster._fit(self, y, X, fh)
394 name, forecaster = self.steps[-1]
395 f = clone(forecaster)
--> 396 f.fit(y, X, fh)
397 self.steps_[-1] = (name, f)
398 return self
File ~\anaconda3\envs\pycaret-ts\lib\site-packages\sktime\forecasting\base\_base.py:139, in BaseForecaster.fit(self, y, X, fh)
134 self._update_y_X(y_inner, X_inner)
136 # checks and conversions complete, pass to inner fit
137 #####################################################
--> 139 self._fit(y=y_inner, X=X_inner, fh=fh)
141 # this should happen last
142 self._is_fitted = True
File ~\anaconda3\envs\pycaret-ts\lib\site-packages\pycaret\containers\models\time_series.py:2558, in BaseCdsDtForecaster._fit(self, y, X, fh)
2536 def _fit(self, y, X=None, fh=None):
2537 self._forecaster = TransformedTargetForecaster(
2538 [
2539 (
(...)
2556 ]
2557 )
-> 2558 self._forecaster.fit(y=y, X=X, fh=fh)
2559 self._cutoff = self._forecaster.cutoff
2561 # this should happen last
File ~\anaconda3\envs\pycaret-ts\lib\site-packages\sktime\forecasting\base\_base.py:139, in BaseForecaster.fit(self, y, X, fh)
134 self._update_y_X(y_inner, X_inner)
136 # checks and conversions complete, pass to inner fit
137 #####################################################
--> 139 self._fit(y=y_inner, X=X_inner, fh=fh)
141 # this should happen last
142 self._is_fitted = True
File ~\anaconda3\envs\pycaret-ts\lib\site-packages\sktime\forecasting\compose\_pipeline.py:390, in TransformedTargetForecaster._fit(self, y, X, fh)
388 for step_idx, name, transformer in self._iter_transformers():
389 t = clone(transformer)
--> 390 y = t.fit_transform(y, X)
391 self.steps_[step_idx] = (name, t)
393 # fit forecaster
File ~\anaconda3\envs\pycaret-ts\lib\site-packages\sktime\transformations\base.py:445, in BaseTransformer.fit_transform(self, X, y, Z)
442 X = _handle_alias(X, Z)
443 # Non-optimized default implementation; override when a better
444 # method is possible for a given algorithm.
--> 445 return self.fit(X, y).transform(X, y)
File ~\anaconda3\envs\pycaret-ts\lib\site-packages\sktime\transformations\base.py:242, in BaseTransformer.fit(self, X, y, Z)
237 X_inner, y_inner = self._convert_X_y(X, y)
239 # todo: uncomment this once Z is completely gone
240 # self._fit(X=X_inner, y=y_inner)
241 # less robust workaround until then
--> 242 self._fit(X_inner, y_inner)
244 self._is_fitted = True
245 return self
File ~\anaconda3\envs\pycaret-ts\lib\site-packages\sktime\transformations\series\detrend\_detrend.py:108, in Detrender._fit(self, X, y)
106 forecaster = clone(self.forecaster)
107 # note: the y in the transformer is exogeneous in the forecaster, i.e., X
--> 108 self.forecaster_ = forecaster.fit(y=X, X=y)
109 # multivariate
110 elif isinstance(X, pd.DataFrame):
File ~\anaconda3\envs\pycaret-ts\lib\site-packages\sktime\forecasting\base\_base.py:139, in BaseForecaster.fit(self, y, X, fh)
134 self._update_y_X(y_inner, X_inner)
136 # checks and conversions complete, pass to inner fit
137 #####################################################
--> 139 self._fit(y=y_inner, X=X_inner, fh=fh)
141 # this should happen last
142 self._is_fitted = True
File ~\anaconda3\envs\pycaret-ts\lib\site-packages\sktime\forecasting\trend.py:188, in PolynomialTrendForecaster._fit(self, y, X, fh)
185 X = np.arange(n_timepoints).reshape(-1, 1)
187 # fit regressor
--> 188 self.regressor_.fit(X, y)
189 return self
File ~\anaconda3\envs\pycaret-ts\lib\site-packages\sklearn\pipeline.py:394, in Pipeline.fit(self, X, y, **fit_params)
392 if self._final_estimator != "passthrough":
393 fit_params_last_step = fit_params_steps[self.steps[-1][0]]
--> 394 self._final_estimator.fit(Xt, y, **fit_params_last_step)
396 return self
File ~\anaconda3\envs\pycaret-ts\lib\site-packages\sklearn\linear_model\_base.py:662, in LinearRegression.fit(self, X, y, sample_weight)
658 n_jobs_ = self.n_jobs
660 accept_sparse = False if self.positive else ["csr", "csc", "coo"]
--> 662 X, y = self._validate_data(
663 X, y, accept_sparse=accept_sparse, y_numeric=True, multi_output=True
664 )
666 if sample_weight is not None:
667 sample_weight = _check_sample_weight(sample_weight, X, dtype=X.dtype)
File ~\anaconda3\envs\pycaret-ts\lib\site-packages\sklearn\base.py:581, in BaseEstimator._validate_data(self, X, y, reset, validate_separately, **check_params)
579 y = check_array(y, **check_y_params)
580 else:
--> 581 X, y = check_X_y(X, y, **check_params)
582 out = X, y
584 if not no_val_X and check_params.get("ensure_2d", True):
File ~\anaconda3\envs\pycaret-ts\lib\site-packages\sklearn\utils\validation.py:981, in check_X_y(X, y, accept_sparse, accept_large_sparse, dtype, order, copy, force_all_finite, ensure_2d, allow_nd, multi_output, ensure_min_samples, ensure_min_features, y_numeric, estimator)
964 X = check_array(
965 X,
966 accept_sparse=accept_sparse,
(...)
976 estimator=estimator,
977 )
979 y = _check_y(y, multi_output=multi_output, y_numeric=y_numeric)
--> 981 check_consistent_length(X, y)
983 return X, y
File ~\anaconda3\envs\pycaret-ts\lib\site-packages\sklearn\utils\validation.py:332, in check_consistent_length(*arrays)
330 uniques = np.unique(lengths)
331 if len(uniques) > 1:
--> 332 raise ValueError(
333 "Found input variables with inconsistent numbers of samples: %r"
334 % [int(l) for l in lengths]
335 )
ValueError: Found input variables with inconsistent numbers of samples: [1381, 47]
```
### Installed Versions
<details>
System:
python: 3.8.13 (default, Mar 28 2022, 06:59:08) [MSC v.1916 64 bit (AMD64)]
executable: C:\Users\Renan\anaconda3\envs\pycaret-ts\python.exe
machine: Windows-10-10.0.19044-SP0
Python dependencies:
pip: 21.2.2
setuptools: 61.2.0
pycaret: 3.0.0
sklearn: 1.0.2
sktime: 0.10.1
statsmodels: 0.13.2
numpy: 1.21.6
scipy: 1.7.3
pandas: 1.4.2
matplotlib: 3.5.1
plotly: 5.6.0
joblib: 1.0.1
numba: 0.55.1
mlflow: Not installed
lightgbm: 3.3.2
xgboost: Not installed
pmdarima: 1.8.5
tbats: Installed but version unavailable
prophet: Not installed
tsfresh: Not installed
</details>
|
1.0
|
[BUG]: Can't train with data frequency of 30 minutes - ### pycaret version checks
- [X] I have checked that this issue has not already been reported [here](https://github.com/pycaret/pycaret/issues).
- [X] I have confirmed this bug exists on the [latest version](https://github.com/pycaret/pycaret/releases) of pycaret.
- [ ] I have confirmed this bug exists on the develop branch of pycaret (pip install -U git+https://github.com/pycaret/pycaret.git@develop).
### Issue Description
Can't train most models when freq='30T'.
### Reproducible Example
```python
import numpy as np
import pandas as pd
from pycaret.time_series import TSForecastingExperiment
index = pd.date_range(start='2020-10-01 00:00:00', end='2020-10-01 23:30:00', freq='30T')
np.random.seed(5005)
df = pd.DataFrame({
'index': index,
'target': np.random.randint(0, 255, index.shape[0], dtype=int),
}).set_index('index')
df
exp = TSForecastingExperiment()
s = exp.setup(
data=df,
target='target',
fh=1,
fold=3,
session_id=5005
)
exp.create_model('lr_cds_dt')
```
### Expected Behavior
Model to be trained.
### Actual Results
```python-traceback
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
Input In [1], in <cell line: 26>()
16 exp = TSForecastingExperiment()
18 s = exp.setup(
19 data=df,
20 target='target',
(...)
23 session_id=5005
24 )
---> 26 exp.create_model('lr_cds_dt')
File ~\anaconda3\envs\pycaret-ts\lib\site-packages\pycaret\time_series\forecasting\oop.py:1718, in TSForecastingExperiment.create_model(self, estimator, fold, round, cross_validation, fit_kwargs, experiment_custom_tags, verbose, **kwargs)
1623 """
1624 This function trains and evaluates the performance of a given estimator
1625 using cross validation. The output of this function is a score grid with
(...)
1713
1714 """
1716 self._check_setup_ran()
-> 1718 return super().create_model(
1719 estimator=estimator,
1720 fold=fold,
1721 round=round,
1722 cross_validation=cross_validation,
1723 fit_kwargs=fit_kwargs,
1724 experiment_custom_tags=experiment_custom_tags,
1725 verbose=verbose,
1726 **kwargs,
1727 )
File ~\anaconda3\envs\pycaret-ts\lib\site-packages\pycaret\internal\pycaret_experiment\supervised_experiment.py:1472, in _SupervisedExperiment.create_model(self, estimator, fold, round, cross_validation, predict, fit_kwargs, groups, refit, probability_threshold, experiment_custom_tags, verbose, system, add_to_model_list, X_train_data, y_train_data, metrics, display, **kwargs)
1469 return model, model_fit_time
1470 return model
-> 1472 model, model_fit_time, model_results, avgs_dict = self._create_model_with_cv(
1473 model,
1474 data_X,
1475 data_y,
1476 fit_kwargs,
1477 round,
1478 cv,
1479 groups,
1480 metrics,
1481 refit,
1482 system,
1483 display,
1484 )
1486 # end runtime
1487 runtime_end = time.time()
File ~\anaconda3\envs\pycaret-ts\lib\site-packages\pycaret\time_series\forecasting\oop.py:1932, in TSForecastingExperiment._create_model_with_cv(self, model, data_X, data_y, fit_kwargs, round, cv, groups, metrics, refit, system, display)
1930 self.logger.info("Finalizing model")
1931 with io.capture_output():
-> 1932 pipeline_with_model.fit(y=data_y, X=data_X, **fit_kwargs)
1933 model_fit_end = time.time()
1934 model_fit_time = np.array(model_fit_end - model_fit_start).round(2)
File ~\anaconda3\envs\pycaret-ts\lib\site-packages\sktime\forecasting\base\_base.py:139, in BaseForecaster.fit(self, y, X, fh)
134 self._update_y_X(y_inner, X_inner)
136 # checks and conversions complete, pass to inner fit
137 #####################################################
--> 139 self._fit(y=y_inner, X=X_inner, fh=fh)
141 # this should happen last
142 self._is_fitted = True
File ~\anaconda3\envs\pycaret-ts\lib\site-packages\sktime\forecasting\compose\_pipeline.py:245, in ForecastingPipeline._fit(self, y, X, fh)
243 name, forecaster = self.steps[-1]
244 f = clone(forecaster)
--> 245 f.fit(y, X, fh)
246 self.steps_[-1] = (name, f)
248 return self
File ~\anaconda3\envs\pycaret-ts\lib\site-packages\sktime\forecasting\base\_base.py:139, in BaseForecaster.fit(self, y, X, fh)
134 self._update_y_X(y_inner, X_inner)
136 # checks and conversions complete, pass to inner fit
137 #####################################################
--> 139 self._fit(y=y_inner, X=X_inner, fh=fh)
141 # this should happen last
142 self._is_fitted = True
File ~\anaconda3\envs\pycaret-ts\lib\site-packages\sktime\forecasting\compose\_pipeline.py:396, in TransformedTargetForecaster._fit(self, y, X, fh)
394 name, forecaster = self.steps[-1]
395 f = clone(forecaster)
--> 396 f.fit(y, X, fh)
397 self.steps_[-1] = (name, f)
398 return self
File ~\anaconda3\envs\pycaret-ts\lib\site-packages\sktime\forecasting\base\_base.py:139, in BaseForecaster.fit(self, y, X, fh)
134 self._update_y_X(y_inner, X_inner)
136 # checks and conversions complete, pass to inner fit
137 #####################################################
--> 139 self._fit(y=y_inner, X=X_inner, fh=fh)
141 # this should happen last
142 self._is_fitted = True
File ~\anaconda3\envs\pycaret-ts\lib\site-packages\pycaret\containers\models\time_series.py:2558, in BaseCdsDtForecaster._fit(self, y, X, fh)
2536 def _fit(self, y, X=None, fh=None):
2537 self._forecaster = TransformedTargetForecaster(
2538 [
2539 (
(...)
2556 ]
2557 )
-> 2558 self._forecaster.fit(y=y, X=X, fh=fh)
2559 self._cutoff = self._forecaster.cutoff
2561 # this should happen last
File ~\anaconda3\envs\pycaret-ts\lib\site-packages\sktime\forecasting\base\_base.py:139, in BaseForecaster.fit(self, y, X, fh)
134 self._update_y_X(y_inner, X_inner)
136 # checks and conversions complete, pass to inner fit
137 #####################################################
--> 139 self._fit(y=y_inner, X=X_inner, fh=fh)
141 # this should happen last
142 self._is_fitted = True
File ~\anaconda3\envs\pycaret-ts\lib\site-packages\sktime\forecasting\compose\_pipeline.py:390, in TransformedTargetForecaster._fit(self, y, X, fh)
388 for step_idx, name, transformer in self._iter_transformers():
389 t = clone(transformer)
--> 390 y = t.fit_transform(y, X)
391 self.steps_[step_idx] = (name, t)
393 # fit forecaster
File ~\anaconda3\envs\pycaret-ts\lib\site-packages\sktime\transformations\base.py:445, in BaseTransformer.fit_transform(self, X, y, Z)
442 X = _handle_alias(X, Z)
443 # Non-optimized default implementation; override when a better
444 # method is possible for a given algorithm.
--> 445 return self.fit(X, y).transform(X, y)
File ~\anaconda3\envs\pycaret-ts\lib\site-packages\sktime\transformations\base.py:242, in BaseTransformer.fit(self, X, y, Z)
237 X_inner, y_inner = self._convert_X_y(X, y)
239 # todo: uncomment this once Z is completely gone
240 # self._fit(X=X_inner, y=y_inner)
241 # less robust workaround until then
--> 242 self._fit(X_inner, y_inner)
244 self._is_fitted = True
245 return self
File ~\anaconda3\envs\pycaret-ts\lib\site-packages\sktime\transformations\series\detrend\_detrend.py:108, in Detrender._fit(self, X, y)
106 forecaster = clone(self.forecaster)
107 # note: the y in the transformer is exogeneous in the forecaster, i.e., X
--> 108 self.forecaster_ = forecaster.fit(y=X, X=y)
109 # multivariate
110 elif isinstance(X, pd.DataFrame):
File ~\anaconda3\envs\pycaret-ts\lib\site-packages\sktime\forecasting\base\_base.py:139, in BaseForecaster.fit(self, y, X, fh)
134 self._update_y_X(y_inner, X_inner)
136 # checks and conversions complete, pass to inner fit
137 #####################################################
--> 139 self._fit(y=y_inner, X=X_inner, fh=fh)
141 # this should happen last
142 self._is_fitted = True
File ~\anaconda3\envs\pycaret-ts\lib\site-packages\sktime\forecasting\trend.py:188, in PolynomialTrendForecaster._fit(self, y, X, fh)
185 X = np.arange(n_timepoints).reshape(-1, 1)
187 # fit regressor
--> 188 self.regressor_.fit(X, y)
189 return self
File ~\anaconda3\envs\pycaret-ts\lib\site-packages\sklearn\pipeline.py:394, in Pipeline.fit(self, X, y, **fit_params)
392 if self._final_estimator != "passthrough":
393 fit_params_last_step = fit_params_steps[self.steps[-1][0]]
--> 394 self._final_estimator.fit(Xt, y, **fit_params_last_step)
396 return self
File ~\anaconda3\envs\pycaret-ts\lib\site-packages\sklearn\linear_model\_base.py:662, in LinearRegression.fit(self, X, y, sample_weight)
658 n_jobs_ = self.n_jobs
660 accept_sparse = False if self.positive else ["csr", "csc", "coo"]
--> 662 X, y = self._validate_data(
663 X, y, accept_sparse=accept_sparse, y_numeric=True, multi_output=True
664 )
666 if sample_weight is not None:
667 sample_weight = _check_sample_weight(sample_weight, X, dtype=X.dtype)
File ~\anaconda3\envs\pycaret-ts\lib\site-packages\sklearn\base.py:581, in BaseEstimator._validate_data(self, X, y, reset, validate_separately, **check_params)
579 y = check_array(y, **check_y_params)
580 else:
--> 581 X, y = check_X_y(X, y, **check_params)
582 out = X, y
584 if not no_val_X and check_params.get("ensure_2d", True):
File ~\anaconda3\envs\pycaret-ts\lib\site-packages\sklearn\utils\validation.py:981, in check_X_y(X, y, accept_sparse, accept_large_sparse, dtype, order, copy, force_all_finite, ensure_2d, allow_nd, multi_output, ensure_min_samples, ensure_min_features, y_numeric, estimator)
964 X = check_array(
965 X,
966 accept_sparse=accept_sparse,
(...)
976 estimator=estimator,
977 )
979 y = _check_y(y, multi_output=multi_output, y_numeric=y_numeric)
--> 981 check_consistent_length(X, y)
983 return X, y
File ~\anaconda3\envs\pycaret-ts\lib\site-packages\sklearn\utils\validation.py:332, in check_consistent_length(*arrays)
330 uniques = np.unique(lengths)
331 if len(uniques) > 1:
--> 332 raise ValueError(
333 "Found input variables with inconsistent numbers of samples: %r"
334 % [int(l) for l in lengths]
335 )
ValueError: Found input variables with inconsistent numbers of samples: [1381, 47]
```
### Installed Versions
<details>
System:
python: 3.8.13 (default, Mar 28 2022, 06:59:08) [MSC v.1916 64 bit (AMD64)]
executable: C:\Users\Renan\anaconda3\envs\pycaret-ts\python.exe
machine: Windows-10-10.0.19044-SP0
Python dependencies:
pip: 21.2.2
setuptools: 61.2.0
pycaret: 3.0.0
sklearn: 1.0.2
sktime: 0.10.1
statsmodels: 0.13.2
numpy: 1.21.6
scipy: 1.7.3
pandas: 1.4.2
matplotlib: 3.5.1
plotly: 5.6.0
joblib: 1.0.1
numba: 0.55.1
mlflow: Not installed
lightgbm: 3.3.2
xgboost: Not installed
pmdarima: 1.8.5
tbats: Installed but version unavailable
prophet: Not installed
tsfresh: Not installed
</details>
|
non_process
|
can t train with data frequency of minutes pycaret version checks i have checked that this issue has not already been reported i have confirmed this bug exists on the of pycaret i have confirmed this bug exists on the develop branch of pycaret pip install u git issue description can t train most models when freq reproducible example python import numpy as np import pandas as pd from pycaret time series import tsforecastingexperiment index pd date range start end freq np random seed df pd dataframe index index target np random randint index shape dtype int set index index df exp tsforecastingexperiment s exp setup data df target target fh fold session id exp create model lr cds dt expected behavior model to be trained actual results python traceback valueerror traceback most recent call last input in in exp tsforecastingexperiment s exp setup data df target target session id exp create model lr cds dt file envs pycaret ts lib site packages pycaret time series forecasting oop py in tsforecastingexperiment create model self estimator fold round cross validation fit kwargs experiment custom tags verbose kwargs this function trains and evaluates the performance of a given estimator using cross validation the output of this function is a score grid with self check setup ran return super create model estimator estimator fold fold round round cross validation cross validation fit kwargs fit kwargs experiment custom tags experiment custom tags verbose verbose kwargs file envs pycaret ts lib site packages pycaret internal pycaret experiment supervised experiment py in supervisedexperiment create model self estimator fold round cross validation predict fit kwargs groups refit probability threshold experiment custom tags verbose system add to model list x train data y train data metrics display kwargs return model model fit time return model model model fit time model results avgs dict self create model with cv model data x data y fit kwargs round cv groups metrics refit system display end runtime runtime end time time file envs pycaret ts lib site packages pycaret time series forecasting oop py in tsforecastingexperiment create model with cv self model data x data y fit kwargs round cv groups metrics refit system display self logger info finalizing model with io capture output pipeline with model fit y data y x data x fit kwargs model fit end time time model fit time np array model fit end model fit start round file envs pycaret ts lib site packages sktime forecasting base base py in baseforecaster fit self y x fh self update y x y inner x inner checks and conversions complete pass to inner fit self fit y y inner x x inner fh fh this should happen last self is fitted true file envs pycaret ts lib site packages sktime forecasting compose pipeline py in forecastingpipeline fit self y x fh name forecaster self steps f clone forecaster f fit y x fh self steps name f return self file envs pycaret ts lib site packages sktime forecasting base base py in baseforecaster fit self y x fh self update y x y inner x inner checks and conversions complete pass to inner fit self fit y y inner x x inner fh fh this should happen last self is fitted true file envs pycaret ts lib site packages sktime forecasting compose pipeline py in transformedtargetforecaster fit self y x fh name forecaster self steps f clone forecaster f fit y x fh self steps name f return self file envs pycaret ts lib site packages sktime forecasting base base py in baseforecaster fit self y x fh self update y x y inner x inner checks and 
conversions complete pass to inner fit self fit y y inner x x inner fh fh this should happen last self is fitted true file envs pycaret ts lib site packages pycaret containers models time series py in basecdsdtforecaster fit self y x fh def fit self y x none fh none self forecaster transformedtargetforecaster self forecaster fit y y x x fh fh self cutoff self forecaster cutoff this should happen last file envs pycaret ts lib site packages sktime forecasting base base py in baseforecaster fit self y x fh self update y x y inner x inner checks and conversions complete pass to inner fit self fit y y inner x x inner fh fh this should happen last self is fitted true file envs pycaret ts lib site packages sktime forecasting compose pipeline py in transformedtargetforecaster fit self y x fh for step idx name transformer in self iter transformers t clone transformer y t fit transform y x self steps name t fit forecaster file envs pycaret ts lib site packages sktime transformations base py in basetransformer fit transform self x y z x handle alias x z non optimized default implementation override when a better method is possible for a given algorithm return self fit x y transform x y file envs pycaret ts lib site packages sktime transformations base py in basetransformer fit self x y z x inner y inner self convert x y x y todo uncomment this once z is completely gone self fit x x inner y y inner less robust workaround until then self fit x inner y inner self is fitted true return self file envs pycaret ts lib site packages sktime transformations series detrend detrend py in detrender fit self x y forecaster clone self forecaster note the y in the transformer is exogeneous in the forecaster i e x self forecaster forecaster fit y x x y multivariate elif isinstance x pd dataframe file envs pycaret ts lib site packages sktime forecasting base base py in baseforecaster fit self y x fh self update y x y inner x inner checks and conversions complete pass to inner fit self fit y y inner x x inner fh fh this should happen last self is fitted true file envs pycaret ts lib site packages sktime forecasting trend py in polynomialtrendforecaster fit self y x fh x np arange n timepoints reshape fit regressor self regressor fit x y return self file envs pycaret ts lib site packages sklearn pipeline py in pipeline fit self x y fit params if self final estimator passthrough fit params last step fit params steps self final estimator fit xt y fit params last step return self file envs pycaret ts lib site packages sklearn linear model base py in linearregression fit self x y sample weight n jobs self n jobs accept sparse false if self positive else x y self validate data x y accept sparse accept sparse y numeric true multi output true if sample weight is not none sample weight check sample weight sample weight x dtype x dtype file envs pycaret ts lib site packages sklearn base py in baseestimator validate data self x y reset validate separately check params y check array y check y params else x y check x y x y check params out x y if not no val x and check params get ensure true file envs pycaret ts lib site packages sklearn utils validation py in check x y x y accept sparse accept large sparse dtype order copy force all finite ensure allow nd multi output ensure min samples ensure min features y numeric estimator x check array x accept sparse accept sparse estimator estimator y check y y multi output multi output y numeric y numeric check consistent length x y return x y file envs pycaret ts lib site packages sklearn 
utils validation py in check consistent length arrays uniques np unique lengths if len uniques raise valueerror found input variables with inconsistent numbers of samples r valueerror found input variables with inconsistent numbers of samples installed versions system python default mar executable c users renan envs pycaret ts python exe machine windows python dependencies pip setuptools pycaret sklearn sktime statsmodels numpy scipy pandas matplotlib plotly joblib numba mlflow not installed lightgbm xgboost not installed pmdarima tbats installed but version unavailable prophet not installed tsfresh not installed
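A cleaned-up reconstruction of the reproducible example embedded above; the concrete dates, seed, and forecast horizon are illustrative assumptions, since the normalized text stripped the original values:
```python
import numpy as np
import pandas as pd
from pycaret.time_series import TSForecastingExperiment

# Minute-frequency index; the failure is specific to sub-daily frequencies.
index = pd.date_range(start="2022-01-01", end="2022-01-08", freq="min")
np.random.seed(42)
df = pd.DataFrame(
    {"target": np.random.randint(0, 100, index.shape[0], dtype=int)},
    index=index,
)

exp = TSForecastingExperiment()
exp.setup(data=df, target="target", fh=60, fold=3, session_id=42)
# Fails inside the detrender's internal LinearRegression with:
# ValueError: Found input variables with inconsistent numbers of samples
exp.create_model("lr_cds_dt")
```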
| 0
|
16,145
| 20,425,286,221
|
IssuesEvent
|
2022-02-24 02:42:18
|
qgis/QGIS
|
https://api.github.com/repos/qgis/QGIS
|
closed
|
SAGA tools do not work
|
Feedback stale Processing Bug
|
### What is the bug or the crash?
When I try to use any SAGA tool, such as Fill Sinks (Wang & Liu), I get the error message below.
Also, when I search for a SAGA tool in the Processing Toolbox, every SAGA tool appears in light gray (which usually means it is disabled or inactive).
### Steps to reproduce the issue
1. SAGA
2. Terrain Analysis - Hydrology
3. Fill Sinks (Wang & Liu)
4.
The following layers were not correctly generated.
• C:/Users/Y14135/AppData/Local/Temp/processing_DjLFpd/008a1a423b3540bca3ae1fcb2947a969/WSHED.sdat
• C:/Users/Y14135/AppData/Local/Temp/processing_DjLFpd/19ac58f6934b419a807c7a4465ba49e1/FILLED.sdat
• C:/Users/Y14135/AppData/Local/Temp/processing_DjLFpd/3af7e91cc6e84836aba1950499d55380/FDIR.sdat
You can check the 'Log Messages Panel' in QGIS main window to find more information about the execution of the algorithm.
### Versions
QGIS version
3.16.15-Hannover
QGIS code revision
e7fdad6431
Compiled against Qt
5.11.2
Running against Qt
5.11.2
Compiled against GDAL/OGR
3.1.4
Running against GDAL/OGR
3.1.4
Compiled against GEOS
3.8.1-CAPI-1.13.3
Running against GEOS
3.8.1-CAPI-1.13.3
Compiled against SQLite
3.29.0
Running against SQLite
3.29.0
PostgreSQL Client Version
11.5
SpatiaLite Version
4.3.0
QWT Version
6.1.3
QScintilla2 Version
2.10.8
Compiled against PROJ
6.3.2
Running against PROJ
Rel. 6.3.2, May 1st, 2020
OS Version
Windows 10 (10.0)
Active python plugins
AusMap;
geomelwatershed-master;
db_manager;
MetaSearch;
processing
### Supported QGIS version
- [X] I'm running a supported QGIS version according to the roadmap.
### New profile
- [X] I tried with a new QGIS profile
### Additional context
_No response_
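A first diagnostic, sketched for the QGIS 3.x Python console (only the documented processing registry API is assumed): grayed-out tools usually mean the SAGA provider is registered but inactive, which this check makes visible:
```python
from qgis.core import QgsApplication

# Print every Processing provider and whether it is active; look for "saga".
for provider in QgsApplication.processingRegistry().providers():
    state = "active" if provider.isActive() else "inactive"
    print(provider.id(), provider.name(), state)
```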
|
1.0
|
SAGA tools do not work - ### What is the bug or the crash?
When I try to use any SAGA tool, such as Fill Sinks (Wang & Liu), I get the error message below.
Also, when I search for a SAGA tool in the Processing Toolbox, every SAGA tool appears in light gray (which usually means it is disabled or inactive).
### Steps to reproduce the issue
1. SAGA
2. Terrain Analysis - Hydrology
3. Fill Sinks (Wang & Liu)
4.
The following layers were not correctly generated.
• C:/Users/Y14135/AppData/Local/Temp/processing_DjLFpd/008a1a423b3540bca3ae1fcb2947a969/WSHED.sdat
• C:/Users/Y14135/AppData/Local/Temp/processing_DjLFpd/19ac58f6934b419a807c7a4465ba49e1/FILLED.sdat
• C:/Users/Y14135/AppData/Local/Temp/processing_DjLFpd/3af7e91cc6e84836aba1950499d55380/FDIR.sdat
You can check the 'Log Messages Panel' in QGIS main window to find more information about the execution of the algorithm.
### Versions
QGIS version
3.16.15-Hannover
QGIS code revision
e7fdad6431
Compiled against Qt
5.11.2
Running against Qt
5.11.2
Compiled against GDAL/OGR
3.1.4
Running against GDAL/OGR
3.1.4
Compiled against GEOS
3.8.1-CAPI-1.13.3
Running against GEOS
3.8.1-CAPI-1.13.3
Compiled against SQLite
3.29.0
Running against SQLite
3.29.0
PostgreSQL Client Version
11.5
SpatiaLite Version
4.3.0
QWT Version
6.1.3
QScintilla2 Version
2.10.8
Compiled against PROJ
6.3.2
Running against PROJ
Rel. 6.3.2, May 1st, 2020
OS Version
Windows 10 (10.0)
Active python plugins
AusMap;
geomelwatershed-master;
db_manager;
MetaSearch;
processing
### Supported QGIS version
- [X] I'm running a supported QGIS version according to the roadmap.
### New profile
- [X] I tried with a new QGIS profile
### Additional context
_No response_
|
process
|
saga tool do not work what is the bug or the crash when i try to use every saga tool such as fill sink wang liu etc i got a error message below and when i search a saga tool using processing toolbox every saga tool is turned light gray color usually it means that disabled or inactive steps to reproduce the issue saga terrain analysis hydrology fill sink wang liu the following layers were not correctly generated • c users appdata local temp processing djlfpd wshed sdat • c users appdata local temp processing djlfpd filled sdat • c users appdata local temp processing djlfpd fdir sdat you can check the log messages panel in qgis main window to find more information about the execution of the algorithm versions qgis version hannover qgis code revision compiled against qt running against qt compiled against gdal ogr running against gdal ogr compiled against geos capi running against geos capi compiled against sqlite running against sqlite postgresql client version spatialite version qwt version version compiled against proj running against proj rel may os version windows active python plugins ausmap geomelwatershed master db manager metasearch processing supported qgis version i m running a supported qgis version according to the roadmap new profile i tried with a new qgis profile additional context no response
| 1
|
8,986
| 12,100,383,668
|
IssuesEvent
|
2020-04-20 13:43:28
|
ComposableWeb/poolbase
|
https://api.github.com/repos/ComposableWeb/poolbase
|
opened
|
[💥FEAT](keywords) image recognition for main image
|
enhancement epic: processing
|
**Feature request? Please describe.**
Derive additional keywords via image-recognition AI by analyzing the "main" image.
**Acceptance Criteria - Describe the solution you'd like**
A clear and concise description of what you want to happen in bullet points:
* ...
**Related issues**
#1
And any other context or screenshots about the feature request here.
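A minimal sketch of one way to implement this, assuming Google Cloud Vision as the backend; the issue names no provider, so the client choice, function name, and score threshold below are all illustrative:
```python
from google.cloud import vision

def keywords_from_main_image(image_url: str, min_score: float = 0.7) -> list[str]:
    """Return candidate keywords for a remotely hosted 'main' image."""
    client = vision.ImageAnnotatorClient()
    image = vision.Image()
    image.source.image_uri = image_url
    response = client.label_detection(image=image)
    # Keep only reasonably confident labels as additional keywords.
    return [
        label.description.lower()
        for label in response.label_annotations
        if label.score >= min_score
    ]
```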
|
1.0
|
[💥FEAT](keywords) image recognition for main image - **Feature request? Please describe.**
Derive additional keywords via image-recognition AI by analyzing the "main" image.
**Acceptance Criteria - Describe the solution you'd like**
A clear and concise description of what you want to happen in bullet points:
* ...
**Related issues**
#1
And any other context or screenshots about the feature request here.
|
process
|
keywords image recognition for main image feature request please describe derive additional keywords from image recognition ai by analyzing main image acceptance criteria describe the solution you d like a clear and concise description of what you want to happen in bullet points related issues and any other context or screenshots about the feature request here
| 1
|
556,350
| 16,482,422,059
|
IssuesEvent
|
2021-05-24 13:32:00
|
Redocly/openapi-cli
|
https://api.github.com/repos/Redocly/openapi-cli
|
closed
|
paths-kebab-case rule isn't checking for snake_case
|
Cat Priority: Low Type: Bug
|
**Describe the bug**
The `paths-kebab-case` rule doesn't forbid snake_case paths; it only seems to detect camelCase.
**To Reproduce**
Steps to reproduce the behavior:
1. Add snake_case paths.
2. Lint the spec.
**Expected behavior**
The `paths-kebab-case` rule detects the snake_case paths and shows an error or warning.
**Logs**
N/a
**OpenAPI definition**
N/a
**`openapi-cli` Version(s)**
1.0.0-beta.39
**`Node.js` Version(s)**
v12.19.0
**Additional context**
None.
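For reference, a sketch of the expected rule semantics in plain Python (not openapi-cli's actual implementation): a kebab-case segment may contain only lowercase letters, digits, and hyphens, so snake_case must fail alongside camelCase:
```python
import re

# Lowercase alphanumeric runs separated by single hyphens.
KEBAB_SEGMENT = re.compile(r"^[a-z0-9]+(-[a-z0-9]+)*$")

def is_kebab_case_path(path: str) -> bool:
    segments = [s for s in path.strip("/").split("/") if s]
    # Path parameters such as {userId} are conventionally exempt.
    return all(s.startswith("{") or KEBAB_SEGMENT.match(s) for s in segments)

assert is_kebab_case_path("/user-profiles")
assert not is_kebab_case_path("/user_profiles")  # snake_case must be flagged
assert not is_kebab_case_path("/userProfiles")   # camelCase must be flagged
```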
|
1.0
|
paths-kebab-case rule isn't checking for snake_case - **Describe the bug**
The `paths-kebab-case` rule doesn't forbid snake_case paths; it only seems to detect camelCase.
**To Reproduce**
Steps to reproduce the behavior:
1. Add snake_case paths.
2. Lint the spec.
**Expected behavior**
The `paths-kebab-case` rule detects the snake_case paths and shows an error or warning.
**Logs**
N/a
**OpenAPI definition**
N/a
**`openapi-cli` Version(s)**
1.0.0-beta.39
**`Node.js` Version(s)**
v12.19.0
**Additional context**
None.
|
non_process
|
paths kebab case rule isn t checking for snake case describe the bug the paths kebab case rule doesn t forbid snake case paths it only seems to detect camelcase to reproduce steps to reproduce the behavior add snake case paths lint the spec expected behavior the paths kebab case rule detects the snake case paths and shows an error or warning logs n a openapi definition n a openapi cli version s beta node js version s additional context none
| 0
|
4,129
| 7,086,156,216
|
IssuesEvent
|
2018-01-11 13:41:57
|
rogerthat-platform/rogerthat-backend
|
https://api.github.com/repos/rogerthat-platform/rogerthat-backend
|
closed
|
Apps with a custom home screen
|
process_duplicate type_feature
|
- [ ] Add `home_branding` to `FriendTO`, `ServiceIdentity` and the service panels
- Add the following functionalities to the rogerthat JS api and rogerthat CDV plugin:
- [ ] Read news items
- [ ] Receive callbacks when badges (messaging/news) are updated.
- [ ] Document the changes
___
Apps:
- Use homescreen_style "branding" in `build.yaml`
- After registration, show progress bar until main service, all js embeddings and home branding are available.
- Implement api to read news items from inside a branding
- Security: return all news items to the main service. Other services should only receive their news items.
- Triggering of badge number callbacks
___
CC:
- Setting to be able to configure the homescreen_style (shouldn't be possible without a main service)
|
1.0
|
Apps with a custom home screen - - [ ] Add `home_branding` to `FriendTO`, `ServiceIdentity` and the service panels
- Add the following functionalities to the rogerthat JS api and rogerthat CDV plugin:
- [ ] Read news items
- [ ] Receive callbacks when badges (messaging/news) are updated.
- [ ] Document the changes
___
Apps:
- Use homescreen_style "branding" in `build.yaml`
- After registration, show progress bar until main service, all js embeddings and home branding are available.
- Implement api to read news items from inside a branding
- Security: return all news items to the main service. Other services should only receive their news items.
- Triggering of badge number callbacks
___
CC:
- Setting to be able to configure the homescreen_style (shouldn't be possible without a main service)
|
process
|
apps with a custom home screen add home branding to friendto serviceidentity and the service panels add the following functionalities to the rogerthat js api and rogerthat cdv plugin read news items receive callbacks when badges messaging news are updated document the changes apps use homescreen style branding in build yaml after registration show progress bar until main service all js embeddings and home branding are available implement api to read news items from inside a branding security return all news items to the main service other services should only receive their news items triggering of badge number callbacks cc setting to be able to configure the homescreen style shouldn t be possible without a main service
| 1
|
95
| 2,535,611,106
|
IssuesEvent
|
2015-01-26 04:11:35
|
simaya/simaya
|
https://api.github.com/repos/simaya/simaya
|
closed
|
List of user categories standardized in the application
|
Must Fix #4 newProcess
|
Please add these, with correct spelling, in the format
namakategori / deskripsi / identitas / digitidentitas (category name / description / identity / identity digit count)
- PNS / Pegawai Negeri Sipil (Civil Servant) / NIP / 18
- Jabatan Politik / Political position without an identity number / - / -
|
1.0
|
List of user categories standardized in the application - Please add these, with correct spelling, in the format
namakategori / deskripsi / identitas / digitidentitas (category name / description / identity / identity digit count)
- PNS / Pegawai Negeri Sipil (Civil Servant) / NIP / 18
- Jabatan Politik / Political position without an identity number / - / -
|
process
|
daftar kategori pengguna yang dibakukan dalam aplikasi silakan ditambahkan dengan ejaan yang tepat dengan format namakategori deskripsi identitas digitidentitas pns pegawai negeri sipil nip jabatan politik jabatan politik tanpa nomor identitas
| 1
|
52,579
| 7,775,775,433
|
IssuesEvent
|
2018-06-05 05:03:52
|
esi/esi-issues
|
https://api.github.com/repos/esi/esi-issues
|
closed
|
characters/{character_id}/assets/ returns is_blueprint_copy
|
documentation question
|
# Extra field is_blueprint_copy returned
The documentation does not say that is_blueprint_copy will be returned, and the field is only present when the asset is a blueprint copy.
## Routes
- `GET /v1/characters/{character_id}/assets/ `
## Resolution
Update documentation. Or remove is_blueprint_copy from returned results.
# Checklist
Check all boxes that apply to this issue:
- [x] Inconsistency description is provided
- [x] Inconsistent route(s) are provided
- [ ] Resolution [will require a version bump](https://esi.github.io/esi-issues/breaking_changes)
- [x] Resolution [does not require a version bump](https://esi.github.io/esi-issues/breaking_changes)
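For consumers, the practical consequence is that absence of the field must be read as false; a small sketch (the surrounding asset fields are assumed for illustration):
```python
def is_blueprint_copy(asset: dict) -> bool:
    # The field is only present when the asset actually is a blueprint copy,
    # so a missing key means "not a copy" rather than "unknown".
    return asset.get("is_blueprint_copy", False)

assert is_blueprint_copy({"item_id": 1, "is_blueprint_copy": True})
assert not is_blueprint_copy({"item_id": 2})  # field omitted for non-copies
```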
|
1.0
|
characters/{character_id}/assets/ returns is_blueprint_copy - # Extra field is_blueprint_copy returned
The documentation does not say that is_blueprint_copy will be returned, and the field is only present when the asset is a blueprint copy.
## Routes
- `GET /v1/characters/{character_id}/assets/ `
## Resolution
Update documentation. Or remove is_blueprint_copy from returned results.
# Checklist
Check all boxes that apply to this issue:
- [x] Inconsistency description is provided
- [x] Inconsistent route(s) are provided
- [ ] Resolution [will require a version bump](https://esi.github.io/esi-issues/breaking_changes)
- [x] Resolution [does not require a version bump](https://esi.github.io/esi-issues/breaking_changes)
|
non_process
|
characters character id assets returns is blueprint copy extra field is blueprint copy returned the documentation does not say that is blueprint copy will be returned and it is only there if it is a blueprint copy routes get characters character id assets resolution update documentation or remove is blueprint copy from returned results checklist check all boxes that apply to this issue inconsistency description is provided inconsistent route s are provided resolution resolution
| 0
|
372,428
| 11,014,277,618
|
IssuesEvent
|
2019-12-04 22:22:02
|
ampproject/amphtml
|
https://api.github.com/repos/ampproject/amphtml
|
closed
|
[amp-carousel] Carousel (v2) skips slides if container has relative or decimal width
|
P1: High Priority Type: Bug UI: Component: amp-carousel WG: ui-and-a11y
|
Hello,
In the latest version (1911121900560), the carousel (v2) skips every second slide when navigating with the previous/next buttons if layout=responsive is used and its wrapper has a relative or fractional width.
```html
<!doctype html>
<html ⚡>
<head>
<meta charset="utf-8">
<!-- doesn't work -->
<script async src="https://cdn.ampproject.org/rtv/001911121900560/v0.js"></script>
<script async custom-element="amp-carousel" src="https://cdn.ampproject.org/rtv/001911121900560/v0/amp-carousel-0.2.js"></script>
<!-- works -->
<!-- <script async src="https://cdn.ampproject.org/rtv/001911070201440/v0.js"></script>-->
<!-- <script async custom-element="amp-carousel" src="https://cdn.ampproject.org/rtv/001911070201440/v0/amp-carousel-0.2.js"></script>-->
<!-- any other relative value than 100% or non integer values, f.e. 235.5px, 80% or 60em -->
<style amp-custom>
.wrapper {
width: 235.5px;
}
</style>
</head>
<body>
<div class="wrapper">
<amp-carousel width="360" height="240" layout="responsive" type="slides">
<amp-img src="https://via.placeholder.com/360x240.png?text=1"
width="3"
height="2"
layout="responsive"
alt="1">
</amp-img>
<amp-img src="https://via.placeholder.com/360x240.png?text=2"
width="3"
height="2"
layout="responsive"
alt="2">
</amp-img>
<amp-img src="https://via.placeholder.com/360x240.png?text=3"
width="3"
height="2"
layout="responsive"
alt="3">
</amp-img>
<amp-img src="https://via.placeholder.com/360x240.png?text=4"
width="3"
height="2"
layout="responsive"
alt="4">
</amp-img>
<amp-img src="https://via.placeholder.com/360x240.png?text=5"
width="3"
height="2"
layout="responsive"
alt="5">
</amp-img>
<amp-img src="https://via.placeholder.com/360x240.png?text=6"
width="3"
height="2"
layout="responsive"
alt="6">
</amp-img>
<amp-img src="https://via.placeholder.com/360x240.png?text=7"
width="3"
height="2"
layout="responsive"
alt="7">
</amp-img>
</amp-carousel>
</div>
</body>
</html>
```
Maybe that's a subpixel problem. Integer pixel values (e.g. 200px) seem to work. Version 1 of the carousel is not affected.
Thanks for your help!
//cc @dritter | @thecoconutdream | @dennisp93
|
1.0
|
[amp-carousel] Carousel (v2) skips slides if container has relative or decimal width - Hello,
In the latest version (1911121900560), the carousel (v2) skips every second slide when navigating with the previous/next buttons if layout=responsive is used and its wrapper has a relative or fractional width.
```html
<!doctype html>
<html ⚡>
<head>
<meta charset="utf-8">
<!-- doesn't work -->
<script async src="https://cdn.ampproject.org/rtv/001911121900560/v0.js"></script>
<script async custom-element="amp-carousel" src="https://cdn.ampproject.org/rtv/001911121900560/v0/amp-carousel-0.2.js"></script>
<!-- works -->
<!-- <script async src="https://cdn.ampproject.org/rtv/001911070201440/v0.js"></script>-->
<!-- <script async custom-element="amp-carousel" src="https://cdn.ampproject.org/rtv/001911070201440/v0/amp-carousel-0.2.js"></script>-->
<!-- any other relative value than 100% or non integer values, f.e. 235.5px, 80% or 60em -->
<style amp-custom>
.wrapper {
width: 235.5px;
}
</style>
</head>
<body>
<div class="wrapper">
<amp-carousel width="360" height="240" layout="responsive" type="slides">
<amp-img src="https://via.placeholder.com/360x240.png?text=1"
width="3"
height="2"
layout="responsive"
alt="1">
</amp-img>
<amp-img src="https://via.placeholder.com/360x240.png?text=2"
width="3"
height="2"
layout="responsive"
alt="2">
</amp-img>
<amp-img src="https://via.placeholder.com/360x240.png?text=3"
width="3"
height="2"
layout="responsive"
alt="3">
</amp-img>
<amp-img src="https://via.placeholder.com/360x240.png?text=4"
width="3"
height="2"
layout="responsive"
alt="4">
</amp-img>
<amp-img src="https://via.placeholder.com/360x240.png?text=5"
width="3"
height="2"
layout="responsive"
alt="5">
</amp-img>
<amp-img src="https://via.placeholder.com/360x240.png?text=6"
width="3"
height="2"
layout="responsive"
alt="6">
</amp-img>
<amp-img src="https://via.placeholder.com/360x240.png?text=7"
width="3"
height="2"
layout="responsive"
alt="7">
</amp-img>
</amp-carousel>
</div>
</body>
</html>
```
Maybe that's a subpixel problem. Integer pixel values (e.g. 200px) seem to work. Version 1 of the carousel is not affected.
Thanks for your help!
//cc @dritter | @thecoconutdream | @dennisp93
|
non_process
|
carousel skips slides if container has relative or decimal width hello in the latest version the carousel skips every second slide when using the back and forth buttons when layout responsive is used and its wrapper has a relative width html script async src script async custom element amp carousel src script async src script async custom element amp carousel src wrapper width amp img src width height layout responsive alt amp img src width height layout responsive alt amp img src width height layout responsive alt amp img src width height layout responsive alt amp img src width height layout responsive alt amp img src width height layout responsive alt amp img src width height layout responsive alt maybe that s a subpixel problem integer pixel values e g seem to work version of the carousel is not affected thanks for your help cc dritter thecoconutdream
| 0
|
149,837
| 5,729,537,134
|
IssuesEvent
|
2017-04-21 06:35:12
|
CovertJaguar/Railcraft
|
https://api.github.com/repos/CovertJaguar/Railcraft
|
closed
|
Crash on version 10.1.0, MC 1.10.2.
|
bug cannot reproduce needs verification priority-high
|
Same error with Forge 2.2017 and 3.2281.
Time: 4/21/17 10:51 AM
Description: There was a severe problem during mod loading that has caused the game to fail
net.minecraftforge.fml.common.LoaderExceptionModCrash: Caught exception from Railcraft (railcraft)
Caused by: java.lang.NoSuchFieldError: CHEST
at mods.railcraft.common.carts.RailcraftCarts.<clinit>(RailcraftCarts.java:55)
at mods.railcraft.common.core.RailcraftConfig.loadCarts(RailcraftConfig.java:410)
at mods.railcraft.common.core.RailcraftConfig.preInit(RailcraftConfig.java:160)
at mods.railcraft.common.core.Railcraft.preInit(Railcraft.java:167)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)......
|
1.0
|
Crash on version 10.1.0, MC 1.10.2. - Same error with Forge 2.2017 and 3.2281.
Time: 4/21/17 10:51 AM
Description: There was a severe problem during mod loading that has caused the game to fail
net.minecraftforge.fml.common.LoaderExceptionModCrash: Caught exception from Railcraft (railcraft)
Caused by: java.lang.NoSuchFieldError: CHEST
at mods.railcraft.common.carts.RailcraftCarts.<clinit>(RailcraftCarts.java:55)
at mods.railcraft.common.core.RailcraftConfig.loadCarts(RailcraftConfig.java:410)
at mods.railcraft.common.core.RailcraftConfig.preInit(RailcraftConfig.java:160)
at mods.railcraft.common.core.Railcraft.preInit(Railcraft.java:167)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)......
|
non_process
|
crash for the version ,mc same error for forge and 。 time am description there was a severe problem during mod loading that has caused the game to fail net minecraftforge fml common loaderexceptionmodcrash caught exception from railcraft railcraft caused by java lang nosuchfielderror chest at mods railcraft common carts railcraftcarts railcraftcarts java at mods railcraft common core railcraftconfig loadcarts railcraftconfig java at mods railcraft common core railcraftconfig preinit railcraftconfig java at mods railcraft common core railcraft preinit railcraft java at sun reflect nativemethodaccessorimpl native method at sun reflect nativemethodaccessorimpl invoke unknown source at sun reflect delegatingmethodaccessorimpl invoke unknown source at java lang reflect method invoke unknown source
| 0
|
178,005
| 14,657,713,962
|
IssuesEvent
|
2020-12-28 16:12:27
|
iterative/shtab
|
https://api.github.com/repos/iterative/shtab
|
closed
|
tag stable or beta
|
documentation
|
as per https://github.com/tldr-pages/tldr-python-client/pull/135#issuecomment-751740867, `shtab` is still classified as `Development Status :: 3 - Alpha`. It's probably time to change to `Development Status :: 5 - Production/Stable` or at least `Development Status :: 4 - Beta`
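The change itself is a one-line trove-classifier edit; a sketch of the relevant `setup()` excerpt (surrounding arguments are illustrative, not copied from shtab's actual packaging files):
```python
from setuptools import setup

setup(
    name="shtab",
    # ... other metadata elided ...
    classifiers=[
        # was: "Development Status :: 3 - Alpha"
        "Development Status :: 4 - Beta",  # or "Development Status :: 5 - Production/Stable"
    ],
)
```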
|
1.0
|
tag stable or beta - as per https://github.com/tldr-pages/tldr-python-client/pull/135#issuecomment-751740867, `shtab` is still classified as `Development Status :: 3 - Alpha`. It's probably time to change to `Development Status :: 5 - Production/Stable` or at least `Development Status :: 4 - Beta`
|
non_process
|
tag stable or beta as per shtab is still classified as development status alpha it s probably time to change to development status production stable or at least development status beta
| 0
|
27,945
| 13,459,550,553
|
IssuesEvent
|
2020-09-09 12:26:00
|
bbc/simorgh
|
https://api.github.com/repos/bbc/simorgh
|
closed
|
Upgrade to Styled Components v5
|
performance technical-work ws-engage
|
**Is your feature request related to a problem? Please describe.**
Upgrade to the latest release of Styled Components v5. It is non-breaking, and compared to v4 it comes with:
- 26% smaller bundle size (16.2kB vs. 12.42kB min+gzip)
- 22% faster client-side mounting
- 26% faster updating of dynamic styles
- 45% faster server-side rendering
**Describe the solution you'd like**
Simorgh:
- Update `styled-components` to the latest version.
- Update `jest-styled-components` to the latest version.
- Check for regressions.
Psammead:
- Update `styled-components` to the latest version.
- Update `jest-styled-components` to the latest version.
- Update `psammead-test-helpers` to use the latest `jest-styled-components`.
- Check for regressions.
**Describe alternatives you've considered**
n/a
**Testing notes**
[Tester to complete]
Dev insight: Will Cypress tests be required or are unit tests sufficient? Will there be any potential regression? etc
- [ ] This feature is expected to need manual testing.
**Additional context**
- https://medium.com/styled-components/announcing-styled-components-v5-beast-mode-389747abd987
|
True
|
Upgrade to Styled Components v5 - **Is your feature request related to a problem? Please describe.**
Upgrade to the latest release of Styled Components v5. It is non-breaking, and compared to v4 it comes with:
- 26% smaller bundle size (16.2kB vs. 12.42kB min+gzip)
- 22% faster client-side mounting
- 26% faster updating of dynamic styles
- 45% faster server-side rendering
**Describe the solution you'd like**
Simorgh:
- Update `styled-components` to the latest version.
- Update `jest-styled-components` to the latest version.
- Check for regressions.
Psammead:
- Update `styled-components` to the latest version.
- Update `jest-styled-components` to the latest version.
- Update `psammead-test-helpers` to use the latest `jest-styled-components`.
- Check for regressions.
**Describe alternatives you've considered**
n/a
**Testing notes**
[Tester to complete]
Dev insight: Will Cypress tests be required or are unit tests sufficient? Will there be any potential regression? etc
- [ ] This feature is expected to need manual testing.
**Additional context**
- https://medium.com/styled-components/announcing-styled-components-v5-beast-mode-389747abd987
|
non_process
|
upgrade to styled components is your feature request related to a problem please describe upgrade to the latest release of styled components it is non breaking and compared to it comes with smaller bundle size vs min gzip faster client side mounting faster updating of dynamic styles faster server side rendering describe the solution you d like simorgh update styled components to the latest version update jest styled components to the latest version check for regressions psammead update styled components to the latest version update jest styled components to the latest version update psammead test helpers to use the latest jest styled components check for regressions describe alternatives you ve considered n a testing notes dev insight will cypress tests be required or are unit tests sufficient will there be any potential regression etc this feature is expected to need manual testing additional context
| 0
|
52,393
| 27,544,226,770
|
IssuesEvent
|
2023-03-07 10:36:44
|
NethermindEth/nethermind
|
https://api.github.com/repos/NethermindEth/nethermind
|
closed
|
TxPool further improvements
|
performance good first issue
|
It seems that we made quite significant optimization here: https://github.com/NethermindEth/nethermind/pull/4702
I added a metric: PendingTransactionsWithExpensiveFiltering. Ideally, we should filter out as many transactions as possible before we reach this metric.
One potential idea:
When our TxPool is full we shouldn't accept Tx which are worse (in terms of GasPrice) than our worse transaction in TxPool. We're doing something like this in FeeTooLowTxFilter: https://github.com/NethermindEth/nethermind/blob/master/src/Nethermind/Nethermind.TxPool/TxPool.cs#L119
However, it should be implemented without accessing the state. If we access the state, we lose the benefits of the optimization.
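A sketch of the proposed filter in Python for brevity (not Nethermind's actual C# API; all names are illustrative): when the pool is at capacity, compare the incoming gas price against the pool's current minimum, which touches no account state:
```python
def should_accept(incoming_gas_price: int, pooled_gas_prices: list[int], capacity: int) -> bool:
    """Pre-state filter: reject txs that cannot beat the worst pooled tx."""
    if len(pooled_gas_prices) < capacity:
        return True  # pool not full; defer to the usual (stateful) checks
    # min() over in-memory gas prices needs no nonce/balance lookups,
    # so the earlier state-access optimization is preserved.
    return incoming_gas_price > min(pooled_gas_prices)
```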
|
True
|
TxPool further improvements - It seems that we made quite significant optimization here: https://github.com/NethermindEth/nethermind/pull/4702
I added a metric: PendingTransactionsWithExpensiveFiltering. Ideally, we should filter out as many transactions as possible before we reach this metric.
One potential idea:
When our TxPool is full we shouldn't accept Tx which are worse (in terms of GasPrice) than our worse transaction in TxPool. We're doing something like this in FeeTooLowTxFilter: https://github.com/NethermindEth/nethermind/blob/master/src/Nethermind/Nethermind.TxPool/TxPool.cs#L119
However, it should be implemented without accessing the state. If we access the state, we lose the benefits of the optimization.
|
non_process
|
txpool further improvements it seems that we made quite significant optimization here i added metric pendingtransactionswithexpensivefiltering ideally we should filter as many transactions as possible before we reach this metric one potential idea when our txpool is full we shouldn t accept tx which are worse in terms of gasprice than our worse transaction in txpool we re doing something like this in feetoolowtxfilter however it should be implemented without accessing the state if we re accessing state we won t have benefits of optimization
| 0
|
16,067
| 20,206,382,869
|
IssuesEvent
|
2022-02-11 20:55:51
|
oasis-tcs/csaf
|
https://api.github.com/repos/oasis-tcs/csaf
|
closed
|
Comment Resolution Log CSDPR01 to CS01
|
csaf 2.0 email oasis_tc_process CSDPR01_feedback non_material CS01
|
# Comment Resolution Log
The table summarizes the comments that were received during the 30-day public review of the committee specification draft "[Common Security Advisory Framework Version 2.0](https://docs.oasis-open.org/csaf/csaf/v2.0/csd01/csaf-v2.0-csd01.html)" and their resolution. Comments came to the csaf-comment and the CSAF TC mailing lists. [The public review started 14 August 2021 at 00:00 UTC and ended 12 September 2021 at 23:59 UTC](https://www.oasis-open.org/2021/08/13/invitation-to-comment-on-common-security-advisory-framework-v2-0/).
A status of "Completed" in the Disposition column indicates that the editors have implemented the changes on which the TC decided, which are outlined in the Resolution column. It also is a hyperlink to the GitHub commit notice.
The item number is a hyperlink to the e-mail on the csaf-comment or CSAF TC mailing lists.
The following abbreviations are used:
* SH= Stefan Hagen
* TS = Thomas Schmidt
| Item # | Date | Commenter | Description | Date acknowledged | Resolution | Disposition |
| ------------- | ------------- | ------------- | ------------- | ------------- | ------------- | ------------- |
| [1](https://lists.oasis-open.org/archives/csaf-comment/202108/msg00001.html) | 2021-08-24 | Eliot Lear | [Section 7.1.6: Redirects](https://github.com/oasis-tcs/csaf/issues/342) | 2021-08-25 | Discussed at TC call | No change required |
| [2](https://lists.oasis-open.org/archives/csaf-comment/202108/msg00001.html) | 2021-08-24 | Eliot Lear | [Section 5.1 filename - #343](https://github.com/oasis-tcs/csaf/issues/343) | 2021-08-25 | Discussed at TC call | No change required |
| [3](https://lists.oasis-open.org/archives/csaf-comment/202108/msg00001.html) | 2021-08-24 | Eliot Lear | [Should Section 7 be a part of THIS specification or a separate specification? - #344](https://github.com/oasis-tcs/csaf/issues/344) | 2021-08-25 | Discussed at TC call | No change required |
| [4](https://lists.oasis-open.org/archives/csaf-comment/202108/msg00001.html) | 2021-08-24 | Eliot Lear | [Section 7.1.19: OpenPGP signatures? - #345](https://github.com/oasis-tcs/csaf/issues/345) | 2021-08-26 | Discussed at TC call | No change required |
| [5](https://lists.oasis-open.org/archives/csaf-comment/202109/msg00000.html) | 2021-09-01 | Christian Keil | [Correct links in status section - these point to OpenC2 instead of CSAF TC #354](https://github.com/oasis-tcs/csaf/issues/354) | 2021-09-08 | Discussed at TC call | [Completed](https://github.com/oasis-tcs/csaf/pull/362/commits/b2c3e6fe7d0e292309cd3253bb3d10ef53eb09d8) |
| [6](https://lists.oasis-open.org/archives/csaf-comment/202109/msg00000.html) | 2021-09-01 | Christian Keil | [Extend to product identification helpers to mitigate name / vendor changes? #355](https://github.com/oasis-tcs/csaf/issues/355) | 2021-09-08 | Discussed at TC call | No change required |
| [7](https://lists.oasis-open.org/archives/csaf-comment/202109/msg00000.html) | 2021-09-01 | Christian Keil | [Extend documentation for Notes Type to better guide producers? #356](https://github.com/oasis-tcs/csaf/issues/356) | 2021-09-08 | Discussed at TC call | No change required |
| [8](https://lists.oasis-open.org/archives/csaf-comment/202109/msg00000.html) | 2021-09-01 | Christian Keil | [Consider adding Product Status options for vendors to express that there will be no further investigation #357](https://github.com/oasis-tcs/csaf/issues/357) | 2021-09-08 | Discussed at TC call | No change required |
| [9](https://lists.oasis-open.org/archives/csaf-comment/202109/msg00000.html) | 2021-09-01 | Christian Keil | [Further constraints for semantic version application? #358](https://github.com/oasis-tcs/csaf/issues/358) | 2021-09-08 | Discussed at TC call | No change required |
| [10](https://lists.oasis-open.org/archives/csaf/202109/msg00000.html) | 2021-09-08 | Duncan Sparrell | [Terms legacy, end-of-life, and end-of-support #363](https://github.com/oasis-tcs/csaf/issues/363) | 2021-09-28 | Discussed at TC call | No change for version 2.0 required |
| [11](https://lists.oasis-open.org/archives/csaf-comment/202109/msg00001.html) | 2021-09-10 | Patrick Fuller | [Call for assisted input technology to mitigate complexity and support automation #359](https://github.com/oasis-tcs/csaf/issues/359) | 2021-09-11 | Discussed at TC call | No change required |
## Evaluation of Feedback
The editors consider the above public comments, as well as other more editorial feedback documented in issue [#360](https://github.com/oasis-tcs/csaf/issues/360) and classified/considered per pull request [#362](https://github.com/oasis-tcs/csaf/issues/362), as **Non-Material** per the OASIS TC process.
A [motion has been issued to the TC mailing list by Stefan Hagen on 2021-10-10](https://lists.oasis-open.org/archives/csaf/202110/msg00000.html) to promote the resulting revised work products to CS01 including non-material changes only.
To ease verification by anyone and to support the administration, a separate release candidate archive containing the 4 standards track work products will be created and linked to this issue, as well as noted in the motion to the list.
|
1.0
|
Comment Resolution Log CSDPR01 to CS01 - # Comment Resolution Log
The table summarizes the comments that were received during the 30-day public review of the committee specification draft "[Common Security Advisory Framework Version 2.0](https://docs.oasis-open.org/csaf/csaf/v2.0/csd01/csaf-v2.0-csd01.html)" and their resolution. Comments came to the csaf-comment and the CSAF TC mailing lists. [The public review started 14 August 2021 at 00:00 UTC and ended 12 September 2021 at 23:59 UTC](https://www.oasis-open.org/2021/08/13/invitation-to-comment-on-common-security-advisory-framework-v2-0/).
A status of "Completed" in the Disposition column indicates that the editors have implemented the changes on which the TC decided, which are outlined in the Resolution column. It also is a hyperlink to the GitHub commit notice.
The item number is a hyperlink to the e-mail on the csaf-comment or CSAF TC mailing lists.
The following abbreviations are used:
* SH= Stefan Hagen
* TS = Thomas Schmidt
| Item # | Date | Commenter | Description | Date acknowledged | Resolution | Disposition |
| ------------- | ------------- | ------------- | ------------- | ------------- | ------------- | ------------- |
| [1](https://lists.oasis-open.org/archives/csaf-comment/202108/msg00001.html) | 2021-08-24 | Eliot Lear | [Section 7.1.6: Redirects](https://github.com/oasis-tcs/csaf/issues/342) | 2021-08-25 | Discussed at TC call | No change required |
| [2](https://lists.oasis-open.org/archives/csaf-comment/202108/msg00001.html) | 2021-08-24 | Eliot Lear | [Section 5.1 filename - #343](https://github.com/oasis-tcs/csaf/issues/343) | 2021-08-25 | Discussed at TC call | No change required |
| [3](https://lists.oasis-open.org/archives/csaf-comment/202108/msg00001.html) | 2021-08-24 | Eliot Lear | [Should Section 7 be a part of THIS specification or a separate specification? - #344](https://github.com/oasis-tcs/csaf/issues/344) | 2021-08-25 | Discussed at TC call | No change required |
| [4](https://lists.oasis-open.org/archives/csaf-comment/202108/msg00001.html) | 2021-08-24 | Eliot Lear | [Section 7.1.19: OpenPGP signatures? - #345](https://github.com/oasis-tcs/csaf/issues/345) | 2021-08-26 | Discussed at TC call | No change required |
| [5](https://lists.oasis-open.org/archives/csaf-comment/202109/msg00000.html) | 2021-09-01 | Christian Keil | [Correct links in status section - these point to OpenC2 instead of CSAF TC #354](https://github.com/oasis-tcs/csaf/issues/354) | 2021-09-08 | Discussed at TC call | [Completed](https://github.com/oasis-tcs/csaf/pull/362/commits/b2c3e6fe7d0e292309cd3253bb3d10ef53eb09d8) |
| [6](https://lists.oasis-open.org/archives/csaf-comment/202109/msg00000.html) | 2021-09-01 | Christian Keil | [Extend to product identification helpers to mitigate name / vendor changes? #355](https://github.com/oasis-tcs/csaf/issues/355) | 2021-09-08 | Discussed at TC call | No change required |
| [7](https://lists.oasis-open.org/archives/csaf-comment/202109/msg00000.html) | 2021-09-01 | Christian Keil | [Extend documentation for Notes Type to better guide producers? #356](https://github.com/oasis-tcs/csaf/issues/356) | 2021-09-08 | Discussed at TC call | No change required |
| [8](https://lists.oasis-open.org/archives/csaf-comment/202109/msg00000.html) | 2021-09-01 | Christian Keil | [Consider adding Product Status options for vendors to express that there will be no further investigation #357](https://github.com/oasis-tcs/csaf/issues/357) | 2021-09-08 | Discussed at TC call | No change required |
| [9](https://lists.oasis-open.org/archives/csaf-comment/202109/msg00000.html) | 2021-09-01 | Christian Keil | [Further constraints for semantic version application? #358](https://github.com/oasis-tcs/csaf/issues/358) | 2021-09-08 | Discussed at TC call | No change required |
| [10](https://lists.oasis-open.org/archives/csaf/202109/msg00000.html) | 2021-09-08 | Duncan Sparrell | [Terms legacy, end-of-life, and end-of-support #363](https://github.com/oasis-tcs/csaf/issues/363) | 2021-09-28 | Discussed at TC call | No change for version 2.0 required |
| [11](https://lists.oasis-open.org/archives/csaf-comment/202109/msg00001.html) | 2021-09-10 | Patrick Fuller | [Call for assisted input technology to mitigate complexity and support automation #359](https://github.com/oasis-tcs/csaf/issues/359) | 2021-09-11 | Discussed at TC call | No change required |
## Evaluation of Feedback
The editors consider the above public comments, as well as other more editorial feedback documented in issue [#360](https://github.com/oasis-tcs/csaf/issues/360) and classified/considered per pull request [#362](https://github.com/oasis-tcs/csaf/issues/362), as **Non-Material** per the OASIS TC process.
A [motion has been issued to the TC mailing list by Stefan Hagen on 2021-10-10](https://lists.oasis-open.org/archives/csaf/202110/msg00000.html) to promote the resulting revised work products to CS01 including non-material changes only.
To ease verification by anyone and to support the administration, a separate release candidate archive containing the 4 standards track work products will be created and linked to this issue, as well as noted in the motion to the list.
|
process
|
comment resolution log to comment resolution log the table summarizes the comments that were received during the day public review of the committee specification draft and their resolution comments came to the csaf comment and the csaf tc mailing lists a status of completed in the disposition column indicates that the editors have implemented the changes on which the tc decided which are outlined in the resolution column it also is a hyperlink to the github commit notice the item number is a hyperlink to the e mail on the csaf comment or csaf tc mailing lists the following abbreviations are used sh stefan hagen ts thomas schmidt item date commenter description date acknowledged resolution disposition eliot lear discussed at tc call no change required eliot lear discussed at tc call no change required eliot lear discussed at tc call no change required eliot lear discussed at tc call no change required christian keil discussed at tc call christian keil discussed at tc call no change required christian keil discussed at tc call no change required christian keil discussed at tc call no change required christian keil discussed at tc call no change required duncan sparrell discussed at tc call no change for version required patrick fuller discussed at tc call no change required evaluation of feedback the editors consider above public comments as well as other more editorial feedback documented in issue and classified considered per pull request as non material per oasis tc process a to promote the resulting revised work products to including non material changes only to ease verification by anyone and to support the administration a separate release candidate archive containing the standards track work products will be created and linked to this issue as well as noted in the motion to the list
| 1
|
7,806
| 10,025,205,956
|
IssuesEvent
|
2019-07-17 01:09:21
|
userfrosting/UserFrosting
|
https://api.github.com/repos/userfrosting/UserFrosting
|
closed
|
Update to latest AdminLTE
|
Breaking Change compatibility up-for-grabs
|
> I had a quick look and even between 2.3.6 (which comes with UF4) and 2.4.10 there are quite a number of changes. There's something like 3 years between these releases which in our world is/feels like ages
|
True
|
Update to latest AdminLTE - > I had a quick look and even between 2.3.6 (which comes with UF4) and 2.4.10 there are quite a number of changes. There's something like 3 years between these releases which in our world is/feels like ages
|
non_process
|
update to latest adminlte i had a quick look and even between which comes with and there are quite a number of changes there s something like years between these releases which in our world is feels like ages
| 0
|
463,292
| 13,262,631,346
|
IssuesEvent
|
2020-08-20 22:13:15
|
internetarchive/openlibrary
|
https://api.github.com/repos/internetarchive/openlibrary
|
closed
|
Python 3: get_sorted_editions() is not working on books/OL42679M
|
Lead: @cclauss Priority: 2 Theme: Upgrade to Python 3 Type: Bug
|
<!-- What problem are we solving? What does the experience look like today? What are the symptoms? -->
Fixed by internetarchive/infogami#107
Look at the editions of The wit and wisdom of Mark Twain on Python 3. http://staging.openlibrary.org/books/OL42679M
I have put debug code all around [get_sorted_editions()](https://github.com/internetarchive/openlibrary/search?q=get_sorted_editions) but have not found the fault.
When this URL is opened, [openlibrary/templates/type/edition/view.html](https://github.com/internetarchive/openlibrary/blob/master/openlibrary/templates/type/edition/view.html#L13-L15) calls [get_sorted_editions()](https://github.com/internetarchive/openlibrary/blob/master/openlibrary/plugins/upstream/models.py#L665)
Even when I redefine it as `def get_sorted_editions(): return ['a', 'b', 'c']`, an empty list `[]` is returned!
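One classic Python 2 to 3 failure mode that produces exactly this "the list looks empty" symptom is the single-use iterators now returned by `filter()`/`map()`; this is only a hypothesis for illustration (the actual fix landed in internetarchive/infogami#107):
```python
editions = filter(None, ["a", "b", "c"])  # Python 3: a lazy, single-use iterator
print(list(editions))  # ['a', 'b', 'c'] -- the first pass consumes it
print(list(editions))  # [] -- a second pass sees nothing; Python 2 returned a reusable list
```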
### Evidence / Screenshot (if possible)
<img width="961" alt="Screenshot 2020-07-29 at 17 31 10" src="https://user-images.githubusercontent.com/3709715/88821066-74b37500-d1c2-11ea-903d-d01d3fc14df7.png">
### Relevant url?
<!-- `https://openlibrary.org/...` -->
http://localhost:8080/books/OL42679M
https://github.com/internetarchive/openlibrary/blob/master/openlibrary/plugins/upstream/models.py#L665
called from https://github.com/internetarchive/openlibrary/blob/master/openlibrary/templates/type/edition/view.html#L14 returns 8 editions on Python 2 but no editions on Python 3.
### Steps to Reproduce (Using changes in #3637 and internetarchive/infogami#104)
<!-- What steps caused you to find the bug? -->
1. docker-compose down ; INFOGAMI=local PYENV_VERSION=3.8.5 docker-compose up -d ; docker-compose logs -f --tail=10 web
2. Browser: `http://localhost:8080/books/OL42679M ` # no editions shown
3. Repeat 1. on PYENV_VERSION=2.7.6
4. Browser: `http://localhost:8080/books/OL42679M` # 8 editions shown
<!-- What actually happened after these steps? What did you expect to happen? -->
* Actual: No editions
* Expected: 8 editions
### Details
- **Logged in (Y/N)?**
- **Browser type/version?**
- **Operating system?**
- **Environment (prod/dev/local)?** prod
<!-- If not sure, put prod -->
### Proposal & Constraints
<!-- What is the proposed solution / implementation? Is there a precedent of this approach succeeding elsewhere? -->
### Related files
<!-- Files related to this issue; this is super useful for new contributors who might want to help! If you're not sure, leave this blank; a maintainer will add them. -->
### Stakeholders
<!-- @ tag stakeholders of this bug -->
|
1.0
|
Python 3: get_sorted_editions() is not working on books/OL42679M - <!-- What problem are we solving? What does the experience look like today? What are the symptoms? -->
Fixed by internetarchive/infogami#107
Look at the editions of The wit and wisdom of Mark Twain on Python 3. http://staging.openlibrary.org/books/OL42679M
I have put debug code all around [get_sorted_editions()](https://github.com/internetarchive/openlibrary/search?q=get_sorted_editions) but have not found the fault.
When this URL is opened, [openlibrary/templates/type/edition/view.html](https://github.com/internetarchive/openlibrary/blob/master/openlibrary/templates/type/edition/view.html#L13-L15) calls [get_sorted_editions()](https://github.com/internetarchive/openlibrary/blob/master/openlibrary/plugins/upstream/models.py#L665)
Even when I redefine it as `def get_sorted_editions(): return ['a', 'b', 'c']`, an empty list `[]` is returned!
### Evidence / Screenshot (if possible)
<img width="961" alt="Screenshot 2020-07-29 at 17 31 10" src="https://user-images.githubusercontent.com/3709715/88821066-74b37500-d1c2-11ea-903d-d01d3fc14df7.png">
### Relevant url?
<!-- `https://openlibrary.org/...` -->
http://localhost:8080/books/OL42679M
https://github.com/internetarchive/openlibrary/blob/master/openlibrary/plugins/upstream/models.py#L665
called from https://github.com/internetarchive/openlibrary/blob/master/openlibrary/templates/type/edition/view.html#L14 returns 8 editions on Python 2 but no editions on Python 3.
### Steps to Reproduce (Using changes in #3637 and internetarchive/infogami#104)
<!-- What steps caused you to find the bug? -->
1. docker-compose down ; INFOGAMI=local PYENV_VERSION=3.8.5 docker-compose up -d ; docker-compose logs -f --tail=10 web
2. Browser: `http://localhost:8080/books/OL42679M ` # no editions shown
3. Repeat 1. on PYENV_VERSION=2.7.6
4. Browser: `http://localhost:8080/books/OL42679M` # 8 editions shown
<!-- What actually happened after these steps? What did you expect to happen? -->
* Actual: No editions
* Expected: 8 editions
### Details
- **Logged in (Y/N)?**
- **Browser type/version?**
- **Operating system?**
- **Environment (prod/dev/local)?** prod
<!-- If not sure, put prod -->
### Proposal & Constraints
<!-- What is the proposed solution / implementation? Is there a precedent of this approach succeeding elsewhere? -->
### Related files
<!-- Files related to this issue; this is super useful for new contributors who might want to help! If you're not sure, leave this blank; a maintainer will add them. -->
### Stakeholders
<!-- @ tag stakeholders of this bug -->
|
non_process
|
python get sorted editions is not working on books fixed by internetarchive infogami look at the editions of the wit and wisdom of mark twain on python i have put debug code all around but have not found the fault when this url is opened calls even when i def get sorted editions return an empty list is getting returned evidence screenshot if possible img width alt screenshot at src relevant url called from returns editions on python but no editions on python steps to reproduce using changes in and internetarchive infogami docker compose down infogami local pyenv version docker compose up d docker compose logs f tail web browser no editions shown repeat on pyenv version browser editions shown actual no editions expected editions details logged in y n browser type version operating system environment prod dev local prod proposal constraints related files stakeholders
| 0
|
11,342
| 14,165,118,510
|
IssuesEvent
|
2020-11-12 06:36:29
|
microsoft/react-native-windows
|
https://api.github.com/repos/microsoft/react-native-windows
|
opened
|
Release RNW 0.64
|
Area: Release Process Discussion
|
**Legend**
- [ ] ⁉ Needs driver
- [ ] Work not started
- [ ] 🏃‍♂️ Work in progress
- [x] Work completed
## Milestones
- [ ] **11/25:** Release 0.64.0-preview.1
- [ ] **12/9:** Changes in 0.64-stable require triage
- [ ] **TBD (early 2021):** Release
## Highlights (WIP):
- Easy Hermes Opt-In
- Binary distribution improvements
- React 17
- Improved API parity
- Overhaul of Websocket implementation
- Supports border on Text
- Platform.Version
- Too many reliability and performance improvements to mention
## Checklist
**Before Preview**
- [ ] Draft GitHub release notes from commit log (NickGerleman)
- [ ] Send mail to the team for additions/changes to release notes (NickGerleman)
- [ ] Promote canary build to preview using [wiki instructions](https://github.com/microsoft/react-native-windows/wiki/How-to-promote-a-release) (NickGerleman)
- [ ] ⁉ Smoke test of functionality (???)
- Some discussion here https://github.com/microsoft/react-native-windows/issues/5326
- [ ] Poke at RNTester
- [ ] Test new C++ app with run-windows
- [ ] Test new C++ app with VS
- [ ] Test new C# app with run-windows
- [ ] Test new C# app with VS
- [ ] Test new C++ app with Hermes
- [ ] Push build to 0.63-stable branch and publish (NickGerleman )
- [ ] Update GitHub release notes to use manually curated notes instead of a changelog (NickGerleman)
- [ ] Post release notes internally (NickGerleman)
-----
**After Preview**
- [ ] Send mail to the team reminding of dates + restrictions (NickGerleman)
- [ ] Echo release notes into announcement/Tweet (stmoy)
- [ ] Move most issues targeting 0.63 to 0.64 (chrisglein)
- [ ] Add "Blocking" label to any known showstopper regressions (chrisglein)
- [ ] Remind engineers to update API docs (chrisglein)
- [ ] ⁉ Drive validation of community modules (stmoy/vmoroz?)
- [ ] ⁉ Test updated gallery app (stmoy?)
- [ ] ⁉ Test updated samples (jonthysell?)
- [ ] Integrate patch releases for React Native (NickGerleman)
- [ ] Modify CODEOWNERS in 0.63-stable to require changes go through @microsoft/react-native-windows-backport-triage (NickGerleman)
- [ ] Send reminder mail to the team about backport triage (NickGerleman)
-----
**Before Release**
- [ ] Ensure any community typing changes happen (rectified95?)
- [ ] Test samples against new version (jonthysell)
- [ ] ⁉ Do a pass on API Docs (Team effort again? Need driver)
- [ ] Promote `latest` build to `legacy` using [wiki instructions](https://github.com/microsoft/react-native-windows/wiki/How-to-promote-a-release) (NickGerleman)
- [ ] Ensure all issues marked with "Blocking Label" are fixed (chrisglein)
- [ ] Ensure any backport-triage items needed in 0.63-stable are added (khalef1)
-----
**Release**
- [ ] Update preview release notes with any changes from cherry-picked PRs (NickGerleman)
- [ ] Update sample repos to new version (jonthysell)
- [ ] Smoke test of functionality (???)
- [ ] Poke at RNTester
- [ ] Test new C++ app with run-windows
- [ ] Test new C++ app with VS
- [ ] Test new C# app with run-windows
- [ ] Test new C# app with VS
- [ ] Test new C++ app with Hermes
- [ ] Promote `preview` build to `latest` using [wiki instructions](https://github.com/microsoft/react-native-windows/wiki/How-to-promote-a-release) (NickGerleman)
- [ ] Update GitHub release notes to use manually curated notes instead of a changelog (NickGerleman)
- [ ] Flip docs site to 0.63 (kikisaints)
- [ ] Send out internal release announcement (NGerlem)
- [ ] Send out external release announcement (kikisaints)
|
1.0
|
Release RNW 0.64 - **Legend**
- [ ] ⁉ Needs driver
- [ ] Work not started
- [ ] 🏃‍♂️ Work in progress
- [x] Work completed
## Milestones
- [ ] **11/25:** Release 0.64.0-preview.1
- [ ] **12/9:** Changes in 0.64-stable require triage
- [ ] **TBD (early 2021):** Release
## Highlights (WIP):
- Easy Hermes Opt-In
- Binary distribution improvements
- React 17
- Improved API parity
- Overhaul of Websocket implementation
- Supports border on Text
- Platform.Version
- Too many reliability and performance improvements to mention
## Checklist
**Before Preview**
- [ ] Draft GitHub release notes from commit log (NickGerleman)
- [ ] Send mail to the team for additions/changes to release notes (NickGerleman)
- [ ] Promote canary build to preview using [wiki instructions](https://github.com/microsoft/react-native-windows/wiki/How-to-promote-a-release) (NickGerleman)
- [ ] ⁉ Smoke test of functionality (???)
- Some discussion here https://github.com/microsoft/react-native-windows/issues/5326
- [ ] Poke at RNTester
- [ ] Test new C++ app with run-windows
- [ ] Test new C++ app with VS
- [ ] Test new C# app with run-windows
- [ ] Test new C# app with VS
- [ ] Test new C++ app with Hermes
- [ ] Push build to 0.63-stable branch and publish (NickGerleman)
- [ ] Update GitHub release notes to use manually curated notes instead of a changelog (NickGerleman)
- [ ] Post release notes internally (NickGerleman)
-----
**After Preview**
- [ ] Send mail to the team reminding of dates + restrictions (NickGerleman)
- [ ] Echo release notes into announcement/Tweet (stmoy)
- [ ] Move most issues targeting 0.63 to 0.64 (chrisglein)
- [ ] Add "Blocking" label to any known showstopper regressions (chrisglein)
- [ ] Remind engineers to update API docs (chrisglein)
- [ ] ⁉ Drive validation of community modules (stmoy/vmoroz?)
- [ ] ⁉ Test updated gallery app (stmoy?)
- [ ] ⁉ Test updated samples (jonthysell?)
- [ ] Integrate patch releases for React Native (NickGerleman)
- [ ] Modify CODEOWNERS in 0.63-stable to require changes go through @microsoft/react-native-windows-backport-triage (NickGerleman)
- [ ] Send reminder mail to the team about backport triage (NickGerleman)
-----
**Before Release**
- [ ] Ensure any community typing changes happen (rectified95?)
- [ ] Test samples against new version (jonthysell)
- [ ] ⁉ Do a pass on API Docs (Team effort again? Need driver)
- [ ] Promote `latest` build to `legacy` using [wiki instructions](https://github.com/microsoft/react-native-windows/wiki/How-to-promote-a-release) (NickGerleman)
- [ ] Ensure all issues marked with "Blocking Label" are fixed (chrisglein)
- [ ] Ensure any backport-triage items needed in 0.63-stable are added (khalef1)
-----
**Release**
- [ ] Update preview release notes with any changes from cherry-picked PRs (NickGerleman)
- [ ] Update sample repos to new version (jonthysell)
- [ ] Smoke test of functionality (???)
- [ ] Poke at RNTester
- [ ] Test new C++ app with run-windows
- [ ] Test new C++ app with VS
- [ ] Test new C# app with run-windows
- [ ] Test new C# app with VS
- [ ] Test new C++ app with Hermes
- [ ] Promote `preview` build to `latest` using [wiki instructions](https://github.com/microsoft/react-native-windows/wiki/How-to-promote-a-release) (NickGerleman)
- [ ] Update GitHub release notes to use manually curated notes instead of a changelog (NickGerleman)
- [ ] Flip docs site to 0.63 (kikisaints)
- [ ] Send out internal release announcement (NGerlem)
- [ ] Send out external release announcement (kikisaints)
|
process
|
release rnw legend ⁉ needs driver work not started 🏃♂️ work in progress work completed milestones release preview changes in stable require triage tbd early release highlights wip easy hermes opt in binary distribution improvements react improved api parity overhaul of websocket implementation supports border on text platform version too many reliability and performance improvements to mention checklist before preview draft github release notes from commit log nickgerleman send mail to the team for additions changes to release notes nickgerleman promote canary build to preview using nickgerleman ⁉ smoke test of functionality some discussion here poke at rntester test new c app with run windows test new c app with vs test new c app with run windows test new c app with vs test new c app with hermes push build to stable branch and publish nickgerleman update github release notes to use manually curated notes instead of a changelog nickgerleman post release notes internally nickgerleman after preview send mail to the team reminding of dates restrictions nickgerleman echo release notes into announcement tweet stmoy move most issues targeting to chrisglein add blocking label to any known showstopper regressions chrisglein remind engineers to update api docs chrisglein ⁉ drive validation of community modules stmoy vmoroz ⁉ test updated gallery app stmoy ⁉ test updated samples jonthysell integrate patch releases for react native nickgerleman modify codeowners in stable to require changes go through microsoft react native windows backport triage nickgerleman send reminder mail to the team about backport triage nickgerleman before release ensure any community typing changes happen test samples against new version jonthysell ⁉ do a pass on api docs team effort again need driver promote latest build to legacy using nickgerleman ensure all issues marked with blocking label are fixed chrisglein ensure any backport triage items needed in stable are added release update preview release notes with any changes from cherry picked prs nickgerleman update sample repos to new version jonthysell smoke test of functionality poke at rntester test new c app with run windows test new c app with vs test new c app with run windows test new c app with vs test new c app with hermes promote preview build to latest using nickgerleman update github release notes to use manually curated notes instead of a changelog nickgerleman flip docs site to kikisaints send out internal release announcement ngerlem send out external release announcement kikisaints
| 1
|
51,389
| 3,012,837,142
|
IssuesEvent
|
2015-07-29 03:08:55
|
codefordc/districthousing
|
https://api.github.com/repos/codefordc/districthousing
|
closed
|
Feature Request: Contact Identification
|
priority
|
Stacey J. wants to create three categories for contacts listed.
- Emergency Contact
- Reference
- General Contact
This should be required and be a multi-select or checkboxes (as your programming brain desires). Anyone entered must be at least one of these, and could be more than one. Not a priority.
|
1.0
|
Feature Request: Contact Identification - Stacey J. wants to create three categories for contacts listed.
- Emergency Contact
- Reference
- General Contact
This should be required and be a multi-select or checkboxes (as your programming brain desires). Anyone entered must be at least one of these, and could be more than one. Not a priority.
|
non_process
|
feature request contact identification stacey j wants to create three categories for contacts listed emergency contact reference general contact this should be required and be a multi select or checkboxes as your programming brain desires anyone entered must be at least one of these and could be more than one not a priority
| 0
|
83,283
| 24,026,618,438
|
IssuesEvent
|
2022-09-15 12:04:38
|
atc0005/check-vmware
|
https://api.github.com/repos/atc0005/check-vmware
|
closed
|
Timeout occurred for `Build codebase using Makefile all recipe` GHAW job
|
bug builds CI
|
The current timeout is explicitly set to 40 minutes:
https://github.com/atc0005/check-vmware/blob/7bb33565e5bea66db51d3546b6267fe608ec17ff/.github/workflows/lint-and-build-using-make.yml#L56-L60
This was sufficient for Go 1.17 and earlier, but Go 1.19 (and presumably Go 1.18) builds are taking longer to complete.
Command executed: `make all`
Last output:
```
Removing object files and cached files ...
Removing any existing release assets
Building release assets for windows x86 ...
Building check_vmware_tools 386 binary
Generating check_vmware_tools checksum file
Building check_vmware_vcpus 386 binary
Generating check_vmware_vcpus checksum file
Building check_vmware_vhw 386 binary
Generating check_vmware_vhw checksum file
Building check_vmware_hs2ds2vms 386 binary
Generating check_vmware_hs2ds2vms checksum file
Building check_vmware_datastore_space 386 binary
Generating check_vmware_datastore_space checksum file
Building check_vmware_datastore_performance 386 binary
Generating check_vmware_datastore_performance checksum file
Building check_vmware_snapshots_age 386 binary
Generating check_vmware_snapshots_age checksum file
Building check_vmware_snapshots_count 386 binary
Generating check_vmware_snapshots_count checksum file
Building check_vmware_snapshots_size 386 binary
Generating check_vmware_snapshots_size checksum file
Building check_vmware_rps_memory 386 binary
Generating check_vmware_rps_memory checksum file
Building check_vmware_host_memory 386 binary
Generating check_vmware_host_memory checksum file
Building check_vmware_host_cpu 386 binary
Generating check_vmware_host_cpu checksum file
Building check_vmware_vm_power_uptime 386 binary
Generating check_vmware_vm_power_uptime checksum file
Building check_vmware_disk_consolidation 386 binary
Generating check_vmware_disk_consolidation checksum file
Building check_vmware_question 386 binary
Generating check_vmware_question checksum file
Building check_vmware_alarms 386 binary
Generating check_vmware_alarms checksum file
Building check_vmware_vm_backup_via_ca 386 binary
Generating check_vmware_vm_backup_via_ca checksum file
Completed build tasks for windows x86
Building release assets for windows x64 ...
Building check_vmware_tools amd64 binary
Generating check_vmware_tools checksum file
Building check_vmware_vcpus amd64 binary
Generating check_vmware_vcpus checksum file
Building check_vmware_vhw amd64 binary
Generating check_vmware_vhw checksum file
Building check_vmware_hs2ds2vms amd64 binary
Generating check_vmware_hs2ds2vms checksum file
Building check_vmware_datastore_space amd64 binary
Generating check_vmware_datastore_space checksum file
Building check_vmware_datastore_performance amd64 binary
Generating check_vmware_datastore_performance checksum file
Building check_vmware_snapshots_age amd64 binary
Generating check_vmware_snapshots_age checksum file
Building check_vmware_snapshots_count amd64 binary
Generating check_vmware_snapshots_count checksum file
Building check_vmware_snapshots_size amd64 binary
Generating check_vmware_snapshots_size checksum file
Building check_vmware_rps_memory amd64 binary
Generating check_vmware_rps_memory checksum file
Building check_vmware_host_memory amd64 binary
Generating check_vmware_host_memory checksum file
Building check_vmware_host_cpu amd64 binary
Generating check_vmware_host_cpu checksum file
Building check_vmware_vm_power_uptime amd64 binary
Generating check_vmware_vm_power_uptime checksum file
Building check_vmware_disk_consolidation amd64 binary
Generating check_vmware_disk_consolidation checksum file
Building check_vmware_question amd64 binary
Generating check_vmware_question checksum file
Building check_vmware_alarms amd64 binary
Generating check_vmware_alarms checksum file
Building check_vmware_vm_backup_via_ca amd64 binary
Generating check_vmware_vm_backup_via_ca checksum file
Completed build tasks for windows x64
Completed all build tasks for windows
Building release assets for linux x86 ...
Building check_vmware_tools 386 binary
Generating check_vmware_tools checksum file
Building check_vmware_vcpus 386 binary
Generating check_vmware_vcpus checksum file
Building check_vmware_vhw 386 binary
Generating check_vmware_vhw checksum file
Building check_vmware_hs2ds2vms 386 binary
Generating check_vmware_hs2ds2vms checksum file
Building check_vmware_datastore_space 386 binary
Generating check_vmware_datastore_space checksum file
Building check_vmware_datastore_performance 386 binary
Generating check_vmware_datastore_performance checksum file
Building check_vmware_snapshots_age 386 binary
Generating check_vmware_snapshots_age checksum file
Building check_vmware_snapshots_count 386 binary
Generating check_vmware_snapshots_count checksum file
Building check_vmware_snapshots_size 386 binary
Generating check_vmware_snapshots_size checksum file
Building check_vmware_rps_memory 386 binary
Generating check_vmware_rps_memory checksum file
Building check_vmware_host_memory 386 binary
Generating check_vmware_host_memory checksum file
Building check_vmware_host_cpu 386 binary
Generating check_vmware_host_cpu checksum file
Building check_vmware_vm_power_uptime 386 binary
Generating check_vmware_vm_power_uptime checksum file
Building check_vmware_disk_consolidation 386 binary
Generating check_vmware_disk_consolidation checksum file
Building check_vmware_question 386 binary
Generating check_vmware_question checksum file
Building check_vmware_alarms 386 binary
Generating check_vmware_alarms checksum file
Building check_vmware_vm_backup_via_ca 386 binary
Generating check_vmware_vm_backup_via_ca checksum file
Completed build tasks for linux x86
Building release assets for linux x64 ...
Building check_vmware_tools amd64 binary
Generating check_vmware_tools checksum file
Building check_vmware_vcpus amd64 binary
Generating check_vmware_vcpus checksum file
Building check_vmware_vhw amd64 binary
Generating check_vmware_vhw checksum file
Building check_vmware_hs2ds2vms amd64 binary
Generating check_vmware_hs2ds2vms checksum file
Building check_vmware_datastore_space amd64 binary
Generating check_vmware_datastore_space checksum file
Building check_vmware_datastore_performance amd64 binary
Generating check_vmware_datastore_performance checksum file
Building check_vmware_snapshots_age amd64 binary
Generating check_vmware_snapshots_age checksum file
Building check_vmware_snapshots_count amd64 binary
Generating check_vmware_snapshots_count checksum file
Building check_vmware_snapshots_size amd64 binary
Generating check_vmware_snapshots_size checksum file
Building check_vmware_rps_memory amd64 binary
Generating check_vmware_rps_memory checksum file
Building check_vmware_host_memory amd64 binary
Generating check_vmware_host_memory checksum file
Building check_vmware_host_cpu amd64 binary
Generating check_vmware_host_cpu checksum file
Building check_vmware_vm_power_uptime amd64 binary
Generating check_vmware_vm_power_uptime checksum file
Building check_vmware_disk_consolidation amd64 binary
Generating check_vmware_disk_consolidation checksum file
Building check_vmware_question amd64 binary
```
We will need to adjust the timeout to a value that consistently succeeds.
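For reference, a minimal sketch of the kind of change this implies in the workflow step; the 60-minute value below is an assumption to validate against real build times, not a measured requirement:
```yaml
# .github/workflows/lint-and-build-using-make.yml (sketch)
- name: Build codebase using Makefile all recipe
  run: make all
  timeout-minutes: 60  # raised from 40; pick a value that consistently passes
```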
|
1.0
|
Timeout occurred for `Build codebase using Makefile all recipe` GHAW job - The current timeout is explicitly set to 40 minutes:
https://github.com/atc0005/check-vmware/blob/7bb33565e5bea66db51d3546b6267fe608ec17ff/.github/workflows/lint-and-build-using-make.yml#L56-L60
This was sufficient for Go 1.17 and earlier, but Go 1.19 (and presumably Go 1.18) builds are taking longer to complete.
Command executed: `make all`
Last output:
```
Removing object files and cached files ...
Removing any existing release assets
Building release assets for windows x86 ...
Building check_vmware_tools 386 binary
Generating check_vmware_tools checksum file
Building check_vmware_vcpus 386 binary
Generating check_vmware_vcpus checksum file
Building check_vmware_vhw 386 binary
Generating check_vmware_vhw checksum file
Building check_vmware_hs2ds2vms 386 binary
Generating check_vmware_hs2ds2vms checksum file
Building check_vmware_datastore_space 386 binary
Generating check_vmware_datastore_space checksum file
Building check_vmware_datastore_performance 386 binary
Generating check_vmware_datastore_performance checksum file
Building check_vmware_snapshots_age 386 binary
Generating check_vmware_snapshots_age checksum file
Building check_vmware_snapshots_count 386 binary
Generating check_vmware_snapshots_count checksum file
Building check_vmware_snapshots_size 386 binary
Generating check_vmware_snapshots_size checksum file
Building check_vmware_rps_memory 386 binary
Generating check_vmware_rps_memory checksum file
Building check_vmware_host_memory 386 binary
Generating check_vmware_host_memory checksum file
Building check_vmware_host_cpu 386 binary
Generating check_vmware_host_cpu checksum file
Building check_vmware_vm_power_uptime 386 binary
Generating check_vmware_vm_power_uptime checksum file
Building check_vmware_disk_consolidation 386 binary
Generating check_vmware_disk_consolidation checksum file
Building check_vmware_question 386 binary
Generating check_vmware_question checksum file
Building check_vmware_alarms 386 binary
Generating check_vmware_alarms checksum file
Building check_vmware_vm_backup_via_ca 386 binary
Generating check_vmware_vm_backup_via_ca checksum file
Completed build tasks for windows x86
Building release assets for windows x64 ...
Building check_vmware_tools amd64 binary
Generating check_vmware_tools checksum file
Building check_vmware_vcpus amd64 binary
Generating check_vmware_vcpus checksum file
Building check_vmware_vhw amd64 binary
Generating check_vmware_vhw checksum file
Building check_vmware_hs2ds2vms amd64 binary
Generating check_vmware_hs2ds2vms checksum file
Building check_vmware_datastore_space amd64 binary
Generating check_vmware_datastore_space checksum file
Building check_vmware_datastore_performance amd64 binary
Generating check_vmware_datastore_performance checksum file
Building check_vmware_snapshots_age amd64 binary
Generating check_vmware_snapshots_age checksum file
Building check_vmware_snapshots_count amd64 binary
Generating check_vmware_snapshots_count checksum file
Building check_vmware_snapshots_size amd64 binary
Generating check_vmware_snapshots_size checksum file
Building check_vmware_rps_memory amd64 binary
Generating check_vmware_rps_memory checksum file
Building check_vmware_host_memory amd64 binary
Generating check_vmware_host_memory checksum file
Building check_vmware_host_cpu amd64 binary
Generating check_vmware_host_cpu checksum file
Building check_vmware_vm_power_uptime amd64 binary
Generating check_vmware_vm_power_uptime checksum file
Building check_vmware_disk_consolidation amd64 binary
Generating check_vmware_disk_consolidation checksum file
Building check_vmware_question amd64 binary
Generating check_vmware_question checksum file
Building check_vmware_alarms amd64 binary
Generating check_vmware_alarms checksum file
Building check_vmware_vm_backup_via_ca amd64 binary
Generating check_vmware_vm_backup_via_ca checksum file
Completed build tasks for windows x64
Completed all build tasks for windows
Building release assets for linux x86 ...
Building check_vmware_tools 386 binary
Generating check_vmware_tools checksum file
Building check_vmware_vcpus 386 binary
Generating check_vmware_vcpus checksum file
Building check_vmware_vhw 386 binary
Generating check_vmware_vhw checksum file
Building check_vmware_hs2ds2vms 386 binary
Generating check_vmware_hs2ds2vms checksum file
Building check_vmware_datastore_space 386 binary
Generating check_vmware_datastore_space checksum file
Building check_vmware_datastore_performance 386 binary
Generating check_vmware_datastore_performance checksum file
Building check_vmware_snapshots_age 386 binary
Generating check_vmware_snapshots_age checksum file
Building check_vmware_snapshots_count 386 binary
Generating check_vmware_snapshots_count checksum file
Building check_vmware_snapshots_size 386 binary
Generating check_vmware_snapshots_size checksum file
Building check_vmware_rps_memory 386 binary
Generating check_vmware_rps_memory checksum file
Building check_vmware_host_memory 386 binary
Generating check_vmware_host_memory checksum file
Building check_vmware_host_cpu 386 binary
Generating check_vmware_host_cpu checksum file
Building check_vmware_vm_power_uptime 386 binary
Generating check_vmware_vm_power_uptime checksum file
Building check_vmware_disk_consolidation 386 binary
Generating check_vmware_disk_consolidation checksum file
Building check_vmware_question 386 binary
Generating check_vmware_question checksum file
Building check_vmware_alarms 386 binary
Generating check_vmware_alarms checksum file
Building check_vmware_vm_backup_via_ca 386 binary
Generating check_vmware_vm_backup_via_ca checksum file
Completed build tasks for linux x86
Building release assets for linux x64 ...
Building check_vmware_tools amd64 binary
Generating check_vmware_tools checksum file
Building check_vmware_vcpus amd64 binary
Generating check_vmware_vcpus checksum file
Building check_vmware_vhw amd64 binary
Generating check_vmware_vhw checksum file
Building check_vmware_hs2ds2vms amd64 binary
Generating check_vmware_hs2ds2vms checksum file
Building check_vmware_datastore_space amd64 binary
Generating check_vmware_datastore_space checksum file
Building check_vmware_datastore_performance amd64 binary
Generating check_vmware_datastore_performance checksum file
Building check_vmware_snapshots_age amd64 binary
Generating check_vmware_snapshots_age checksum file
Building check_vmware_snapshots_count amd64 binary
Generating check_vmware_snapshots_count checksum file
Building check_vmware_snapshots_size amd64 binary
Generating check_vmware_snapshots_size checksum file
Building check_vmware_rps_memory amd64 binary
Generating check_vmware_rps_memory checksum file
Building check_vmware_host_memory amd64 binary
Generating check_vmware_host_memory checksum file
Building check_vmware_host_cpu amd64 binary
Generating check_vmware_host_cpu checksum file
Building check_vmware_vm_power_uptime amd64 binary
Generating check_vmware_vm_power_uptime checksum file
Building check_vmware_disk_consolidation amd64 binary
Generating check_vmware_disk_consolidation checksum file
Building check_vmware_question amd64 binary
```
We will need to adjust the timeout to a value that consistently succeeds.
|
non_process
|
timeout occurred for build codebase using makefile all recipe ghaw job the current timeout is explicitly set to minutes this was sufficient for go and earlier but go and presumably go builds are taking longer to complete command executed make all last output removing object files and cached files removing any existing release assets building release assets for windows building check vmware tools binary generating check vmware tools checksum file building check vmware vcpus binary generating check vmware vcpus checksum file building check vmware vhw binary generating check vmware vhw checksum file building check vmware binary generating check vmware checksum file building check vmware datastore space binary generating check vmware datastore space checksum file building check vmware datastore performance binary generating check vmware datastore performance checksum file building check vmware snapshots age binary generating check vmware snapshots age checksum file building check vmware snapshots count binary generating check vmware snapshots count checksum file building check vmware snapshots size binary generating check vmware snapshots size checksum file building check vmware rps memory binary generating check vmware rps memory checksum file building check vmware host memory binary generating check vmware host memory checksum file building check vmware host cpu binary generating check vmware host cpu checksum file building check vmware vm power uptime binary generating check vmware vm power uptime checksum file building check vmware disk consolidation binary generating check vmware disk consolidation checksum file building check vmware question binary generating check vmware question checksum file building check vmware alarms binary generating check vmware alarms checksum file building check vmware vm backup via ca binary generating check vmware vm backup via ca checksum file completed build tasks for windows building release assets for windows building check vmware tools binary generating check vmware tools checksum file building check vmware vcpus binary generating check vmware vcpus checksum file building check vmware vhw binary generating check vmware vhw checksum file building check vmware binary generating check vmware checksum file building check vmware datastore space binary generating check vmware datastore space checksum file building check vmware datastore performance binary generating check vmware datastore performance checksum file building check vmware snapshots age binary generating check vmware snapshots age checksum file building check vmware snapshots count binary generating check vmware snapshots count checksum file building check vmware snapshots size binary generating check vmware snapshots size checksum file building check vmware rps memory binary generating check vmware rps memory checksum file building check vmware host memory binary generating check vmware host memory checksum file building check vmware host cpu binary generating check vmware host cpu checksum file building check vmware vm power uptime binary generating check vmware vm power uptime checksum file building check vmware disk consolidation binary generating check vmware disk consolidation checksum file building check vmware question binary generating check vmware question checksum file building check vmware alarms binary generating check vmware alarms checksum file building check vmware vm backup via ca binary generating check vmware vm backup via ca checksum file completed build tasks for windows 
completed all build tasks for windows building release assets for linux building check vmware tools binary generating check vmware tools checksum file building check vmware vcpus binary generating check vmware vcpus checksum file building check vmware vhw binary generating check vmware vhw checksum file building check vmware binary generating check vmware checksum file building check vmware datastore space binary generating check vmware datastore space checksum file building check vmware datastore performance binary generating check vmware datastore performance checksum file building check vmware snapshots age binary generating check vmware snapshots age checksum file building check vmware snapshots count binary generating check vmware snapshots count checksum file building check vmware snapshots size binary generating check vmware snapshots size checksum file building check vmware rps memory binary generating check vmware rps memory checksum file building check vmware host memory binary generating check vmware host memory checksum file building check vmware host cpu binary generating check vmware host cpu checksum file building check vmware vm power uptime binary generating check vmware vm power uptime checksum file building check vmware disk consolidation binary generating check vmware disk consolidation checksum file building check vmware question binary generating check vmware question checksum file building check vmware alarms binary generating check vmware alarms checksum file building check vmware vm backup via ca binary generating check vmware vm backup via ca checksum file completed build tasks for linux building release assets for linux building check vmware tools binary generating check vmware tools checksum file building check vmware vcpus binary generating check vmware vcpus checksum file building check vmware vhw binary generating check vmware vhw checksum file building check vmware binary generating check vmware checksum file building check vmware datastore space binary generating check vmware datastore space checksum file building check vmware datastore performance binary generating check vmware datastore performance checksum file building check vmware snapshots age binary generating check vmware snapshots age checksum file building check vmware snapshots count binary generating check vmware snapshots count checksum file building check vmware snapshots size binary generating check vmware snapshots size checksum file building check vmware rps memory binary generating check vmware rps memory checksum file building check vmware host memory binary generating check vmware host memory checksum file building check vmware host cpu binary generating check vmware host cpu checksum file building check vmware vm power uptime binary generating check vmware vm power uptime checksum file building check vmware disk consolidation binary generating check vmware disk consolidation checksum file building check vmware question binary will need to adjust the timeout to a value that consistently is successful
| 0
|
115,013
| 14,674,294,529
|
IssuesEvent
|
2020-12-30 15:01:42
|
department-of-veterans-affairs/va.gov-cms
|
https://api.github.com/repos/department-of-veterans-affairs/va.gov-cms
|
opened
|
Tree test with CMS users - planning
|
Core Application Team Design Needs refining Research
|
_As a UX researcher, I want to test the proposed IA structure with CMS users, so that we can evaluate the findability of topics in the CMS._
**AC:**
- [ ] Tree test scenarios are defined.
- [ ] Tree test study is set up (tool: Treejack).
- [ ] Participants are selected.
- [ ] Tree test invitation is sent to participants.
|
1.0
|
Tree test with CMS users - planning - _As a UX researcher, I want to test the proposed IA structure with CMS users, so that we can evaluate the findability of topics in the CMS._
**AC:**
- [ ] Tree test scenarios are defined.
- [ ] Tree test study is set up (tool: Treejack).
- [ ] Participants are selected.
- [ ] Tree test invitation is sent to participants.
|
non_process
|
tree test with cms users planning as a ux researcher i want to test the proposed ia structure with cms users so that we can evaluate the findability of topics in the cms ac tree test scenarios are defined tree test study is set up tool treejack participants are selected tree test invitation is sent to participants
| 0
|
55,844
| 11,471,381,591
|
IssuesEvent
|
2020-02-09 10:51:53
|
joomla/joomla-cms
|
https://api.github.com/repos/joomla/joomla-cms
|
closed
|
[4.0] media manager return and correction
|
J4 Issue No Code Attached Yet
|
### Is your feature request related to a problem? Please describe.
For now, the media manager design isn't effective:
1. The full-width option is really a problem for usability (a slider or an input at 800px isn't good for the eyes, lol)
2. The option order is a problem
3. There is no button to save a new image version
### Describe the solution you'd like
I think using the right column is good for large screens,
e.g.

The order (for the common user, most-used options first) would be:
For crop
Aspect ratio
Width
Height
Quality
X-axis (a good option, but not common)
Y-axis (a good option, but not common)
For resize
Width
Height
Quality
=> maybe add a size report?
For rotate
Angle
Quality
I think it's really important to add a "Save as a new image" button, to avoid confusion and for a better workflow
|
1.0
|
[4.0] media manager return and correction - ### Is your feature request related to a problem? Please describe.
For now, the media manager design isn't effective:
1. The full-width option is really a problem for usability (a slider or an input at 800px isn't good for the eyes, lol)
2. The option order is a problem
3. There is no button to save a new image version
### Describe the solution you'd like
I think using the right column is good for large screens,
e.g.

The order (for the common user, most-used options first) would be:
For crop
Aspect ratio
Width
Height
Quality
X-axis (a good option, but not common)
Y-axis (a good option, but not common)
For resize
Width
Height
Quality
=> maybe add a size report?
For rotate
Angle
Quality
I think it's really important to add a "Save as a new image" button, to avoid confusion and for a better workflow
|
non_process
|
media manager return and correction is your feature request related to a problem please describe for now the media manager design isn t effective the full width option is really a problem for usability a slider or an input at isn t good for the eyes lol the option order is a problem there is no button to save a new image version describe the solution you d like i think using the right column is good for large screens e g the order for the common user most used options first would be for crop aspect ratio width height quality x axis a good option but not common y axis a good option but not common for resize width height quality maybe add a size report for rotate angle quality i think it s really important to add a save as a new image button to avoid confusion and for a better workflow
| 0
|
183,015
| 14,926,992,008
|
IssuesEvent
|
2021-01-24 13:48:50
|
do02reen24/Doreen-Chat
|
https://api.github.com/repos/do02reen24/Doreen-Chat
|
opened
|
Setting up an alias
|
📝 documentation 🦟 bug
|
Because the project was created with CRA, freely changing the build configuration was not possible.
While looking into how to apply an alias with TypeScript, I learned about a library called `craco` and applied it, but it kept printing the message below.
```bash
The following changes are being made to your tsconfig.json file:
- compilerOptions.paths must not be set (aliased imports are not supported)
```
While looking for how other people solved this, I found [a link](https://medium.com/@gustavograeff1998/absolute-imports-with-create-react-app-typescript-e87878cab65b) via [this thread](https://github.com/timarney/react-app-rewired/issues/375).
Using the method described in that blog post, I was able to apply the alias.
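For reference, my understanding of that workaround is to keep the path mappings in a separate file that react-scripts does not rewrite, reference it from `tsconfig.json` via `"extends": "./tsconfig.paths.json"`, and let craco map the same aliases for webpack (e.g. via the `craco-alias` plugin). A minimal sketch; the `@components` alias name is hypothetical:
```json
{
  "compilerOptions": {
    "baseUrl": "./src",
    "paths": {
      "@components/*": ["components/*"]
    }
  }
}
```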
|
1.0
|
Setting up an alias - Because the project was created with CRA, freely changing the build configuration was not possible.
While looking into how to apply an alias with TypeScript, I learned about a library called `craco` and applied it, but it kept printing the message below.
```bash
The following changes are being made to your tsconfig.json file:
- compilerOptions.paths must not be set (aliased imports are not supported)
```
While looking for how other people solved this, I found [a link](https://medium.com/@gustavograeff1998/absolute-imports-with-create-react-app-typescript-e87878cab65b) via [this thread](https://github.com/timarney/react-app-rewired/issues/375).
Using the method described in that blog post, I was able to apply the alias.
|
non_process
|
setting up an alias because the project was created with cra freely changing the build configuration was not possible while looking into how to apply an alias with typescript i learned about a library called craco and applied it but it kept printing the message below bash the following changes are being made to your tsconfig json file compileroptions paths must not be set aliased imports are not supported while looking for how other people solved this i found a link via this thread using the method described in that blog post i was able to apply the alias
| 0
|
11,765
| 14,596,338,007
|
IssuesEvent
|
2020-12-20 15:26:41
|
AlexsLemonade/refinebio-examples
|
https://api.github.com/repos/AlexsLemonade/refinebio-examples
|
closed
|
If Github actions rendering fails, we should know about it
|
before going live process
|
Related to:
https://github.com/AlexsLemonade/refinebio-examples/issues/312#issuecomment-713026482
Github actions rendering with `build-all` failed three times without us knowing about it because of a weird docker build fail that didn't really stop the docker building.
We don't want failures to happen quietly, so we need to have a system that sends us an alert if the render fails. Spell check sends alerts when it fails, but perhaps this is because it's required? - I'm not sure how this works; need to look into it.
### Proposed solution(s)
I saw an example for [sending emails](https://github.com/spandanpal22/sending-alert-github-actions/blob/master/.github/workflows/send.yml) but it seemed to require a lot of `SECRETS` and setup.
A slack notification might be the way to go? But what channel?
https://github.com/marketplace/actions/slack-notify
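For reference, a minimal sketch of a failure-notification step with that marketplace action (rtCamp/action-slack-notify), assuming a single `SLACK_WEBHOOK` secret is configured; the message text is a placeholder:
```yaml
# Appended to the render job's steps (sketch)
- name: Notify Slack on render failure
  if: failure()  # runs only when an earlier step in this job failed
  uses: rtCamp/action-slack-notify@v2
  env:
    SLACK_WEBHOOK: ${{ secrets.SLACK_WEBHOOK }}
    SLACK_COLOR: ${{ job.status }}
    SLACK_MESSAGE: "refinebio-examples render failed on ${{ github.ref }}"
```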
|
1.0
|
If Github actions rendering fails, we should know about it - Related to:
https://github.com/AlexsLemonade/refinebio-examples/issues/312#issuecomment-713026482
Github actions rendering with `build-all` failed three times without us knowing about it because of a weird docker build fail that didn't really stop the docker building.
We don't want failures to happen quietly, so we need to have a system that sends us an alert if the render fails. Spell check sends alerts when it fails, but perhaps this is because it's required? - I'm not sure how this works; need to look into it.
### Proposed solution(s)
I saw an example for [sending emails](https://github.com/spandanpal22/sending-alert-github-actions/blob/master/.github/workflows/send.yml) but it seemed to require a lot of `SECRETS` and setup.
A slack notification might be the way to go? But what channel?
https://github.com/marketplace/actions/slack-notify
|
process
|
if github actions rendering fails we should know about it related to github actions rendering with build all failed three times without us knowing about it because of a weird docker build fail that didn t really stop the docker building we don t want failures to happen quietly so we need to have a system that sends us an alert if the render fails spell check sends alerts when it fails but perhaps this is because it s required i m not sure how this works need to look into it proposed solution s i saw an example for but it seemed to require a lot of secrets and setup a slack notification might be the way to go but what channel
| 1
|
269,692
| 28,960,250,398
|
IssuesEvent
|
2023-05-10 01:26:48
|
praneethpanasala/linux
|
https://api.github.com/repos/praneethpanasala/linux
|
reopened
|
CVE-2023-1382 (Medium) detected in linuxv4.19
|
Mend: dependency security vulnerability
|
## CVE-2023-1382 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxv4.19</b></p></summary>
<p>
<p>Linux kernel source tree</p>
<p>Library home page: <a href=https://github.com/torvalds/linux.git>https://github.com/torvalds/linux.git</a></p>
<p>Found in HEAD commit: <a href="https://api.github.com/repos/praneethpanasala/linux/commits/d80c4f847c91020292cb280132b15e2ea147f1a3">d80c4f847c91020292cb280132b15e2ea147f1a3</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/net/tipc/topsrv.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/net/tipc/topsrv.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
A data race flaw was found in the Linux kernel, between where con is allocated and con->sock is set. This issue leads to a NULL pointer dereference when accessing con->sock->sk in net/tipc/topsrv.c in the tipc protocol in the Linux kernel.
<p>Publish Date: 2023-04-19
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2023-1382>CVE-2023-1382</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>4.7</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: High
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.linuxkernelcves.com/cves/CVE-2023-1382">https://www.linuxkernelcves.com/cves/CVE-2023-1382</a></p>
<p>Release Date: 2023-04-19</p>
<p>Fix Resolution: v4.19.268,v5.4.226,v5.10.157,v5.15.81,v6.0.11</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2023-1382 (Medium) detected in linuxv4.19 - ## CVE-2023-1382 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxv4.19</b></p></summary>
<p>
<p>Linux kernel source tree</p>
<p>Library home page: <a href=https://github.com/torvalds/linux.git>https://github.com/torvalds/linux.git</a></p>
<p>Found in HEAD commit: <a href="https://api.github.com/repos/praneethpanasala/linux/commits/d80c4f847c91020292cb280132b15e2ea147f1a3">d80c4f847c91020292cb280132b15e2ea147f1a3</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/net/tipc/topsrv.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/net/tipc/topsrv.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
A data race flaw was found in the Linux kernel, between where con is allocated and con->sock is set. This issue leads to a NULL pointer dereference when accessing con->sock->sk in net/tipc/topsrv.c in the tipc protocol in the Linux kernel.
<p>Publish Date: 2023-04-19
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2023-1382>CVE-2023-1382</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>4.7</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: High
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.linuxkernelcves.com/cves/CVE-2023-1382">https://www.linuxkernelcves.com/cves/CVE-2023-1382</a></p>
<p>Release Date: 2023-04-19</p>
<p>Fix Resolution: v4.19.268,v5.4.226,v5.10.157,v5.15.81,v6.0.11</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve medium detected in cve medium severity vulnerability vulnerable library linux kernel source tree library home page a href found in head commit a href found in base branch master vulnerable source files net tipc topsrv c net tipc topsrv c vulnerability details a data race flaw was found in the linux kernel between where con is allocated and con sock is set this issue leads to a null pointer dereference when accessing con sock sk in net tipc topsrv c in the tipc protocol in the linux kernel publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity high privileges required low user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with mend
| 0
|
17,390
| 10,697,074,486
|
IssuesEvent
|
2019-10-23 15:48:43
|
MassTransit/MassTransit
|
https://api.github.com/repos/MassTransit/MassTransit
|
closed
|
Dependent Endpoints
|
servicebus
|
If a receive endpoint is dependent upon another endpoint being created (such as a dead-letter queue, etc.), allow that dependency to be specified so that when endpoints are started, the dependent ones wait until the other is configured/created before creating itself.
Otherwise, service startup can fail on initial launch due to non-existent entities.
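For reference, a minimal sketch of the general idea in Python (MassTransit itself is C#; the endpoint names and this event-based mechanism are illustrative assumptions, not the library's API):
```python
import asyncio

async def start_endpoint(name, started, deps=()):
    # Wait for every declared dependency to be created before creating this one
    await asyncio.gather(*(started[d].wait() for d in deps))
    print(f"creating endpoint {name}")  # configure/create the queue here
    started[name].set()                 # signal dependents that we now exist

async def main():
    started = {n: asyncio.Event() for n in ("dead-letter", "orders")}
    await asyncio.gather(
        start_endpoint("dead-letter", started),
        start_endpoint("orders", started, deps=("dead-letter",)),
    )

asyncio.run(main())
```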
|
1.0
|
Dependent Endpoints - If a receive endpoint is dependent upon another endpoint being created (such as a dead-letter queue, etc.), allow that dependency to be specified so that when endpoints are started, the dependent ones wait until the other is configured/created before creating itself.
Otherwise, service startup can fail on initial launch due to non-existent entities.
|
non_process
|
dependent endpoints if a receive endpoint is dependent upon another endpoint being created such as a dead letter queue etc allow that dependency to be specified so that when endpoints are started the dependent ones wait until the other is configured created before creating itself otherwise service startup can fail on initial launch due to non existent entities
| 0
|
418,151
| 12,194,193,902
|
IssuesEvent
|
2020-04-29 15:27:26
|
anmedio/junost
|
https://api.github.com/repos/anmedio/junost
|
closed
|
NodeJS -> Functional programming -> Functional Programming in JavaScript
|
grade: nodejs priority: low type: new feature 🚀
|
Add new book:
"Luis Atencio - Functional Programming in JavaScript"
"How to improve your JavaScript programs using functional techniques."
___
https://www.manning.com/books/functional-programming-in-javascript
|
1.0
|
NodeJS -> Functional programming -> Functional Programming in JavaScript - Add new book:
"Luis Atencio - Functional Programming in JavaScript"
"How to improve your JavaScript programs using functional techniques."
___
https://www.manning.com/books/functional-programming-in-javascript
|
non_process
|
nodejs functional programming functional programming in javascript add new book luis atencio functional programming in javascript how to improve your javascript programs using functional techniques
| 0
|
50,427
| 13,527,852,283
|
IssuesEvent
|
2020-09-15 15:56:02
|
jgeraigery/salt
|
https://api.github.com/repos/jgeraigery/salt
|
opened
|
CVE-2015-9251 (Medium) detected in jquery-1.9.1.js
|
security vulnerability
|
## CVE-2015-9251 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-1.9.1.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.9.1/jquery.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.9.1/jquery.js</a></p>
<p>Path to vulnerable library: salt/doc/_themes/saltstack/static/js/vendor/jquery-1.9.1.js</p>
<p>
Dependency Hierarchy:
- :x: **jquery-1.9.1.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/jgeraigery/salt/commit/1362ce577176c220b7b67ea2b8560e112474f1c9">1362ce577176c220b7b67ea2b8560e112474f1c9</a></p>
<p>Found in base branch: <b>develop</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
jQuery before 3.0.0 is vulnerable to Cross-site Scripting (XSS) attacks when a cross-domain Ajax request is performed without the dataType option, causing text/javascript responses to be executed.
<p>Publish Date: 2018-01-18
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2015-9251>CVE-2015-9251</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2015-9251">https://nvd.nist.gov/vuln/detail/CVE-2015-9251</a></p>
<p>Release Date: 2018-01-18</p>
<p>Fix Resolution: jQuery - v3.0.0</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"JavaScript","packageName":"jquery","packageVersion":"1.9.1","isTransitiveDependency":false,"dependencyTree":"jquery:1.9.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"jQuery - v3.0.0"}],"vulnerabilityIdentifier":"CVE-2015-9251","vulnerabilityDetails":"jQuery before 3.0.0 is vulnerable to Cross-site Scripting (XSS) attacks when a cross-domain Ajax request is performed without the dataType option, causing text/javascript responses to be executed.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2015-9251","cvss3Severity":"medium","cvss3Score":"6.1","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Changed","C":"Low","UI":"Required","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2015-9251 (Medium) detected in jquery-1.9.1.js - ## CVE-2015-9251 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-1.9.1.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.9.1/jquery.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.9.1/jquery.js</a></p>
<p>Path to vulnerable library: salt/doc/_themes/saltstack/static/js/vendor/jquery-1.9.1.js</p>
<p>
Dependency Hierarchy:
- :x: **jquery-1.9.1.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/jgeraigery/salt/commit/1362ce577176c220b7b67ea2b8560e112474f1c9">1362ce577176c220b7b67ea2b8560e112474f1c9</a></p>
<p>Found in base branch: <b>develop</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
jQuery before 3.0.0 is vulnerable to Cross-site Scripting (XSS) attacks when a cross-domain Ajax request is performed without the dataType option, causing text/javascript responses to be executed.
<p>Publish Date: 2018-01-18
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2015-9251>CVE-2015-9251</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2015-9251">https://nvd.nist.gov/vuln/detail/CVE-2015-9251</a></p>
<p>Release Date: 2018-01-18</p>
<p>Fix Resolution: jQuery - v3.0.0</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"JavaScript","packageName":"jquery","packageVersion":"1.9.1","isTransitiveDependency":false,"dependencyTree":"jquery:1.9.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"jQuery - v3.0.0"}],"vulnerabilityIdentifier":"CVE-2015-9251","vulnerabilityDetails":"jQuery before 3.0.0 is vulnerable to Cross-site Scripting (XSS) attacks when a cross-domain Ajax request is performed without the dataType option, causing text/javascript responses to be executed.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2015-9251","cvss3Severity":"medium","cvss3Score":"6.1","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Changed","C":"Low","UI":"Required","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
|
non_process
|
cve medium detected in jquery js cve medium severity vulnerability vulnerable library jquery js javascript library for dom operations library home page a href path to vulnerable library salt doc themes saltstack static js vendor jquery js dependency hierarchy x jquery js vulnerable library found in head commit a href found in base branch develop vulnerability details jquery before is vulnerable to cross site scripting xss attacks when a cross domain ajax request is performed without the datatype option causing text javascript responses to be executed publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution jquery isopenpronvulnerability true ispackagebased true isdefaultbranch true packages vulnerabilityidentifier cve vulnerabilitydetails jquery before is vulnerable to cross site scripting xss attacks when a cross domain ajax request is performed without the datatype option causing text javascript responses to be executed vulnerabilityurl
| 0
|
21,782
| 30,294,975,325
|
IssuesEvent
|
2023-07-09 18:44:10
|
The-Data-Alchemists-Manipal/MindWave
|
https://api.github.com/repos/The-Data-Alchemists-Manipal/MindWave
|
closed
|
[Image Processing]: Virtual Quiz Game
|
image-processing
|
This develops a virtual quiz game where the user is provided with 4 options, and no keyboard or mouse click is needed to select the answer.
Using a finger, the user can just select the correct answer and it's done.
Tech stack used: Python, OpenCV, CVZone, MediaPipe
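For reference, a minimal sketch of the point-to-answer idea with that stack; the detection threshold, option boxes, and selection rule below are assumptions, not the project's actual values:
```python
import cv2
from cvzone.HandTrackingModule import HandDetector  # cvzone wraps MediaPipe Hands

detector = HandDetector(detectionCon=0.8, maxHands=1)
# Hypothetical screen regions for the 4 answer options: (x1, y1, x2, y2)
OPTIONS = {"A": (50, 100, 250, 200), "B": (300, 100, 500, 200),
           "C": (50, 250, 250, 350), "D": (300, 250, 500, 350)}

cap = cv2.VideoCapture(0)
while True:
    ok, img = cap.read()
    if not ok:
        break
    hands, img = detector.findHands(img)  # detects hands and draws landmarks
    if hands:
        x, y = hands[0]["lmList"][8][:2]  # landmark 8 = index fingertip
        for name, (x1, y1, x2, y2) in OPTIONS.items():
            if x1 < x < x2 and y1 < y < y2:
                print(f"Selected option {name}")  # the game would score it here
    cv2.imshow("Quiz", img)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```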
|
1.0
|
[Image Processing]: Virtual Quiz Game - This develops a virtual quiz game where the user is provided with 4 options, and no keyboard or mouse click is needed to select the answer.
Using a finger, the user can just select the correct answer and it's done.
Tech stack used: Python, OpenCV, CVZone, MediaPipe
|
process
|
virtual quiz game this develops a virtual quiz game where the user is provided with options and no keyboard or mouse click is needed to select the answer using a finger the user can just select the correct answer and it s done tech stack used python opencv cvzone mediapipe
| 1
|
6,348
| 9,398,651,268
|
IssuesEvent
|
2019-04-08 12:48:33
|
linnovate/root
|
https://api.github.com/repos/linnovate/root
|
reopened
|
document: tasks from office document
|
2.0.6.1 Fixed More Info Process bug
|
go to document
open new document
click on tasks and click on Manage tasks
You'll get to the Tasks from Documents screen
open some new tasks
select a task and click the star on the left side of the screen
click on sort by star
result: sorting does not work
in document click on Manage tasks

open some new tasks
select a task and click the star on the left side of the screen

click on sort by star
result: sorting does not work

|
1.0
|
document: tasks from office document - go to document
open new document
click on tasks and click on Manage tasks
You'll get to the Tasks from Documents screen
open some new tasks
select a task and click the star on the left side of the screen
click on sort by star
result: sorting does not work
in document click on Manage tasks

open some new tasks
select a task and click the star on the left side of the screen

click on sort by star
result: sorting does not work

|
process
|
document tasks from office document go to document open new document click on tasks and click on manage tasks you ll get to the tasks from documents screen open some new tasks select a task and click the star on the left side of the screen click on sort by star result sorting does not work in document click on manage tasks open some new tasks select a task and click the star on the left side of the screen click on sort by star result sorting does not work
| 1
|
19,113
| 25,166,792,722
|
IssuesEvent
|
2022-11-10 21:40:11
|
fluent/fluent-bit
|
https://api.github.com/repos/fluent/fluent-bit
|
closed
|
in_tail: Ignore_Older does not work properly on Windows
|
work-in-process
|
## Bug Report
**Describe the bug**
The configuration option `Ignore_Older` for `in_tail` has no effect. It still registers files older than what is configured.
**To Reproduce**
*Windows only*
If you add a new file that matches the Path pattern in your `in_tail` config, you will see while debugging the application that [this](https://github.com/fluent/fluent-bit/blob/master/plugins/in_tail/tail_file.h#L46) line does not *"return the file modification time in seconds since epoch"* as expected, but rather an 18-digit long LDAP timestamp.
**Expected behavior**
The `Ignore_Older` configuration should work. The function mentioned above should return a Unix epoch timestamp for Windows.
**Your Environment**
<!--- Include as many relevant details about the environment you experienced the bug in -->
* Version used: 2.0.0
* Configuration:
```
[SERVICE]
flush 5
log_level debug
[INPUT]
name tail
path Path/To/Folder/*.log
ignore_older 60
[OUTPUT]
name stdout
match *
```
* Environment name and version (e.g. Kubernetes? What version?): N/A
* Server type and version: N/A
* Operating System and version: Any Windows (tried on 10 and Windows Server 2019)
* Filters and plugins: in_tail
**Additional context**
<!--- How has this issue affected you? What are you trying to accomplish? -->
<!--- Providing context helps us come up with a solution that is most useful in the real world -->
This issue has caused fluent-bit on the server to open all the log files from ages ago until it gets an error `Too many files open`. Then it can't open the files I actually want it to open and my logs don't get forwarded.
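For context, the 18-digit value is a Windows FILETIME-style count of 100-nanosecond intervals since 1601-01-01. A sketch of the conversion the reporter expects, in Python for illustration (the actual fix would live in fluent-bit's C code):
```python
# Seconds between the Windows epoch (1601-01-01) and the Unix epoch (1970-01-01).
EPOCH_DIFF_SECONDS = 11_644_473_600

def filetime_to_unix(filetime: int) -> int:
    """Convert a Windows FILETIME value (100-ns ticks since 1601) to Unix seconds."""
    return filetime // 10_000_000 - EPOCH_DIFF_SECONDS

# Example with a hypothetical 18-digit timestamp.
print(filetime_to_unix(133105905000000000))  # ~2022-10-18 in Unix seconds
```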
|
1.0
|
in_tail: Ignore_Older does not work properly on Windows - ## Bug Report
**Describe the bug**
The configuration option `Ignore_Older` for `in_tail` has no effect. It still registers files older than what is configured.
**To Reproduce**
*Windows only*
If you add a new file that matches the Path pattern in your `in_tail` config, you will see while debugging the application that [this](https://github.com/fluent/fluent-bit/blob/master/plugins/in_tail/tail_file.h#L46) line does not *"return the file modification time in seconds since epoch"* as expected, but rather an 18-digit long LDAP timestamp.
**Expected behavior**
The `Ignore_Older` configuration should work. The function mentioned above should return a Unix epoch timestamp for Windows.
**Your Environment**
<!--- Include as many relevant details about the environment you experienced the bug in -->
* Version used: 2.0.0
* Configuration:
```
[SERVICE]
flush 5
log_level debug
[INPUT]
name tail
path Path/To/Folder/*.log
ignore_older 60
[OUTPUT]
name stdout
match *
```
* Environment name and version (e.g. Kubernetes? What version?): N/A
* Server type and version: N/A
* Operating System and version: Any Windows (tried on 10 and Windows Server 2019)
* Filters and plugins: in_tail
**Additional context**
<!--- How has this issue affected you? What are you trying to accomplish? -->
<!--- Providing context helps us come up with a solution that is most useful in the real world -->
This issue has caused fluent-bit on the server to open all the log files from ages ago until it gets an error `Too many files open`. Then it can't open the files I actually want it to open and my logs don't get forwarded.
|
process
|
in tail ignore older does not work properly on windows bug report describe the bug the configuration option ignore older for in tail has no effect it still registers files older than what is configured to reproduce windows only if you add a new file that matches the path pattern in your in tail config you will see while debugging the application that line does not return the file modification time in seconds since epoch as expected but rather an digit long ldap timestamp expected behavior the ignore older configuration should work the function mentioned above should return a unix epoch timestamp for windows your environment version used configuration flush log level debug name tail path path to folder log ignore older name stdout match environment name and version e g kubernetes what version n a server type and version n a operating system and version any windows tried on and windows server filters and plugins in tail additional context this issue has caused fluent bit on the server to open all the log files from ages ago until it gets an error too many files open then it can t open the files i actually want it to open and my logs don t get forwarded
| 1
|
17,779
| 23,705,879,874
|
IssuesEvent
|
2022-08-30 01:00:35
|
GoogleCloudPlatform/java-docs-samples
|
https://api.github.com/repos/GoogleCloudPlatform/java-docs-samples
|
closed
|
compute/cloud-client/src/main/java/compute/CreateInstanceFromTemplate.java has no test and is broken
|
type: process api: compute samples
|
This snippet is never tested anywhere: compute/cloud-client/src/main/java/compute/CreateInstanceFromTemplate.java
When I tried to run it, I got:
```
{
"error": {
"code": 400,
"message": "Invalid value for field 'resource.disks[0].initializeParams.diskType': 'pd-balanced'. The URL is malformed.",
"errors": [
{
"message": "Invalid value for field 'resource.disks[0].initializeParams.diskType': 'pd-balanced'. The URL is malformed.",
"domain": "global",
"reason": "invalid"
}
]
}
}
```
Please update the sample with a test.
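The error message suggests the sample passes a bare disk type name where the API expects a resource URL. A small sketch of the expected format (the project and zone values are placeholders; verify against the current Compute API docs):
```python
def disk_type_url(project: str, zone: str, disk_type: str) -> str:
    """Build the resource URL expected by initializeParams.diskType."""
    return (
        f"https://www.googleapis.com/compute/v1/projects/{project}"
        f"/zones/{zone}/diskTypes/{disk_type}"
    )

# A bare "pd-balanced" is rejected as a malformed URL; the full form works.
print(disk_type_url("my-project", "us-central1-a", "pd-balanced"))
```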
|
1.0
|
compute/cloud-client/src/main/java/compute/CreateInstanceFromTemplate.java has no test and is broken - This snippet is never tested anywhere: compute/cloud-client/src/main/java/compute/CreateInstanceFromTemplate.java
When I tried to run it, I got:
```
{
"error": {
"code": 400,
"message": "Invalid value for field 'resource.disks[0].initializeParams.diskType': 'pd-balanced'. The URL is malformed.",
"errors": [
{
"message": "Invalid value for field 'resource.disks[0].initializeParams.diskType': 'pd-balanced'. The URL is malformed.",
"domain": "global",
"reason": "invalid"
}
]
}
}
```
Please update the sample with a test.
|
process
|
compute cloud client src main java compute createinstancefromtemplate java has no test and is broken this snippet is never tested anywhere compute cloud client src main java compute createinstancefromtemplate java when i tried to run it i got error code message invalid value for field resource disks initializeparams disktype pd balanced the url is malformed errors message invalid value for field resource disks initializeparams disktype pd balanced the url is malformed domain global reason invalid please update the sample with a test
| 1
|
277,038
| 30,594,519,596
|
IssuesEvent
|
2023-07-21 20:23:18
|
justunsix/automatetheboringstuff-py-tests
|
https://api.github.com/repos/justunsix/automatetheboringstuff-py-tests
|
opened
|
CVE-2023-25670 (High) detected in tensorflow-2.11.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
|
Mend: dependency security vulnerability
|
## CVE-2023-25670 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tensorflow-2.11.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl</b></p></summary>
<p>TensorFlow is an open source machine learning framework for everyone.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/42/24/830571895f0927fe205a23309b136520c7914921420bd1e81aff1da47bb1/tensorflow-2.11.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl">https://files.pythonhosted.org/packages/42/24/830571895f0927fe205a23309b136520c7914921420bd1e81aff1da47bb1/tensorflow-2.11.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl</a></p>
<p>Path to dependency file: /src/project/data-science/requirements.txt</p>
<p>Path to vulnerable library: /src/project/data-science/requirements.txt</p>
<p>
Dependency Hierarchy:
- :x: **tensorflow-2.11.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/justunsix/automatetheboringstuff-py-tests/commit/92e57f9e81da15812523bf929f8ad33bdae5e967">92e57f9e81da15812523bf929f8ad33bdae5e967</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
TensorFlow is an open source platform for machine learning. Versions prior to 2.12.0 and 2.11.1 have a null pointer error in QuantizedMatMulWithBiasAndDequantize with MKL enabled. A fix is included in TensorFlow version 2.12.0 and version 2.11.1.
<p>Publish Date: 2023-03-25
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2023-25670>CVE-2023-25670</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/tensorflow/tensorflow/security/advisories/GHSA-49rq-hwc3-x77w">https://github.com/tensorflow/tensorflow/security/advisories/GHSA-49rq-hwc3-x77w</a></p>
<p>Release Date: 2023-03-24</p>
<p>Fix Resolution: tensorflow - 2.11.1,2.12.0, tensorflow-cpu - 2.11.1,2.12.0, tensorflow-gpu - 2.11.1,2.12.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2023-25670 (High) detected in tensorflow-2.11.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl - ## CVE-2023-25670 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tensorflow-2.11.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl</b></p></summary>
<p>TensorFlow is an open source machine learning framework for everyone.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/42/24/830571895f0927fe205a23309b136520c7914921420bd1e81aff1da47bb1/tensorflow-2.11.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl">https://files.pythonhosted.org/packages/42/24/830571895f0927fe205a23309b136520c7914921420bd1e81aff1da47bb1/tensorflow-2.11.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl</a></p>
<p>Path to dependency file: /src/project/data-science/requirements.txt</p>
<p>Path to vulnerable library: /src/project/data-science/requirements.txt</p>
<p>
Dependency Hierarchy:
- :x: **tensorflow-2.11.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/justunsix/automatetheboringstuff-py-tests/commit/92e57f9e81da15812523bf929f8ad33bdae5e967">92e57f9e81da15812523bf929f8ad33bdae5e967</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
TensorFlow is an open source platform for machine learning. Versions prior to 2.12.0 and 2.11.1 have a null pointer error in QuantizedMatMulWithBiasAndDequantize with MKL enabled. A fix is included in TensorFlow version 2.12.0 and version 2.11.1.
<p>Publish Date: 2023-03-25
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2023-25670>CVE-2023-25670</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/tensorflow/tensorflow/security/advisories/GHSA-49rq-hwc3-x77w">https://github.com/tensorflow/tensorflow/security/advisories/GHSA-49rq-hwc3-x77w</a></p>
<p>Release Date: 2023-03-24</p>
<p>Fix Resolution: tensorflow - 2.11.1,2.12.0, tensorflow-cpu - 2.11.1,2.12.0, tensorflow-gpu - 2.11.1,2.12.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve high detected in tensorflow manylinux whl cve high severity vulnerability vulnerable library tensorflow manylinux whl tensorflow is an open source machine learning framework for everyone library home page a href path to dependency file src project data science requirements txt path to vulnerable library src project data science requirements txt dependency hierarchy x tensorflow manylinux whl vulnerable library found in head commit a href found in base branch main vulnerability details tensorflow is an open source platform for machine learning versions prior to and have a null point error in quantizedmatmulwithbiasanddequantize with mkl enabled a fix is included in tensorflow version and version publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution tensorflow tensorflow cpu tensorflow gpu step up your open source security game with mend
| 0
|
7,963
| 11,146,835,193
|
IssuesEvent
|
2019-12-23 10:47:11
|
AcademySoftwareFoundation/OpenCue
|
https://api.github.com/repos/AcademySoftwareFoundation/OpenCue
|
opened
|
Update docs/conf.py version information
|
process
|
Implement an automated process so that docs/conf.py derives version information from existing information elsewhere in the repository, such as VERSION.in.
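A minimal sketch of what docs/conf.py could do, assuming VERSION.in sits at the repository root and contains a bare version string:
```python
import os

# docs/conf.py is assumed to live one level below the repository root.
_repo_root = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))

with open(os.path.join(_repo_root, "VERSION.in")) as f:
    version = f.read().strip()

# Sphinx distinguishes the short X.Y version from the full release string.
release = version
```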
|
1.0
|
Update docs/conf.py version information - Implement an automated process so that docs/conf.py derives version information from existing information elsewhere in the repository, such as VERSION.in.
|
process
|
update docs conf py version information implement an automated process so that docs conf py derives version information from existing information elsewhere in the repository such as version in
| 1
|
335,743
| 24,478,234,349
|
IssuesEvent
|
2022-10-08 13:10:26
|
veracioux/tuterm
|
https://api.github.com/repos/veracioux/tuterm
|
closed
|
Create cheatsheet for tuterm commands
|
documentation good first issue hacktoberfest
|
Create a cheatsheet that can be used with the [`cheat`](https://github.com/cheat/cheat) program that:
- Shows the basic ways the `tuterm` command can be invoked
- Shows the most useful shell (API) functions that can be used within tuterm scripts (they are already documented in the [man page](https://github.com/veracioux/tuterm/blob/master/docs/tuterm.1))
|
1.0
|
Create cheatsheet for tuterm commands - Create a cheatsheet that can be used with the [`cheat`](https://github.com/cheat/cheat) program that:
- Shows the basic ways the `tuterm` command can be invoked
- Shows the most useful shell (API) functions that can be used within tuterm scripts (they are already documented in the [man page](https://github.com/veracioux/tuterm/blob/master/docs/tuterm.1))
|
non_process
|
create cheatsheet for tuterm commands create a cheatsheet that can be used with the program that shows the basic ways the tuterm command can be invoked shows the most useful shell api functions that can be used within tuterm scripts they are already documented in the
| 0
|
92,635
| 11,694,194,008
|
IssuesEvent
|
2020-03-06 03:09:42
|
ChilkGames/RPG-Luces
|
https://api.github.com/repos/ChilkGames/RPG-Luces
|
opened
|
Historia
|
Design Documentation Investigation
|
Think up a pearl-necklace outline for a basic story, plus its characters
- [ ] Pearl necklace
- [ ] Characters
|
1.0
|
Historia - Think up a pearl-necklace outline for a basic story, plus its characters
- [ ] Pearl necklace
- [ ] Characters
|
non_process
|
historia think up a pearl necklace outline for a basic story plus its characters pearl necklace characters
| 0
|
2,330
| 5,142,618,923
|
IssuesEvent
|
2017-01-12 13:52:23
|
jimbrown75/Permit-Vision-Enhancements
|
https://api.github.com/repos/jimbrown75/Permit-Vision-Enhancements
|
opened
|
Option to issue permit in field must occur for every issue, and not just first issue
|
bug High Priority Must Fix Verified by PTW Process Lead
|
Unable to issue in field after initial issue. The option to issue a permit in the field is only available on the initial issue. Once a permit is suspended, the option no longer exists for subsequent issues.
|
1.0
|
Option to issue permit in field must occur for every issue, and not just first issue - Unable to issue in field after initial issue. The option to issue a permit in the field is only available on the initial issue. Once a permit is suspended, the option no longer exists for subsequent issues.
|
process
|
option to issue permit in field must occur for every issue and not just first issue unable to issue in field after initial issue the option to issue a permit in the field is only available on the initial issue once a permit is suspended the option no longer exists for subsequent issues
| 1
|
8,896
| 11,991,608,399
|
IssuesEvent
|
2020-04-08 08:40:05
|
prisma/prisma-client-js
|
https://api.github.com/repos/prisma/prisma-client-js
|
closed
|
Invalid `include` query incorrectly accepted by type system
|
bug/2-confirmed kind/bug process/candidate
|
Schema:
```prisma
generator client {
provider = "prisma-client-js"
}
datasource db {
provider = "postgresql"
url = env("DATABASE_URL")
}
model api_keys {
allowed_ips String[]
created_at DateTime
created_by_id Int?
hidden Boolean @default(false)
id Int @default(autoincrement()) @id
key String
updated_at DateTime
user_id Int? @unique
@@index([key], name: "index_api_keys_on_key")
}
```
The following query is allowed by TypeScript but shouldn't be:
```ts
import { PrismaClient } from '@prisma/client'
// or const { PrismaClient } = require('@prisma/client')
const prisma = new PrismaClient()
prisma.api_keys.findMany({ first: 1, include: { user: true } }).then(x => {
console.log(x)
prisma.disconnect()
})
```

Note: I ran into this error after I've changed my Prisma schema. Initially I actually **had** a relation called `user` based on which I've written the Prisma Client query.
|
1.0
|
Invalid `include` query incorrectly accepted by type system - Schema:
```prisma
generator client {
provider = "prisma-client-js"
}
datasource db {
provider = "postgresql"
url = env("DATABASE_URL")
}
model api_keys {
allowed_ips String[]
created_at DateTime
created_by_id Int?
hidden Boolean @default(false)
id Int @default(autoincrement()) @id
key String
updated_at DateTime
user_id Int? @unique
@@index([key], name: "index_api_keys_on_key")
}
```
The following query is allowed by TypeScript but shouldn't be:
```ts
import { PrismaClient } from '@prisma/client'
// or const { PrismaClient } = require('@prisma/client')
const prisma = new PrismaClient()
prisma.api_keys.findMany({ first: 1, include: { user: true } }).then(x => {
console.log(x)
prisma.disconnect()
})
```

Note: I ran into this error after I've changed my Prisma schema. Initially I actually **had** a relation called `user` based on which I've written the Prisma Client query.
|
process
|
invalid include query incorrectly accepted by type system schema prisma generator client provider prisma client js datasource db provider postgresql url env database url model api keys allowed ips string created at datetime created by id int hidden boolean default false id int default autoincrement id key string updated at datetime user id int unique index name index api keys on key the following query is allowed by typescript but shouldn t be ts import prismaclient from prisma client or const prismaclient require prisma client const prisma new prismaclient prisma api keys findmany first include user true then x console log x prisma disconnect note i ran into this error after i ve changed my prisma schema initially i actually had a relation called user based on which i ve written the prisma client query
| 1
|
16,396
| 21,180,187,445
|
IssuesEvent
|
2022-04-08 07:08:06
|
acdh-oeaw/abcd-db
|
https://api.github.com/repos/acdh-oeaw/abcd-db
|
closed
|
parser for references
|
Data Processing
|
In the [Events](https://abcd.acdh-dev.oeaw.ac.at/archiv/event/) table, the fields "Anmerkungen Literatur", "Anmerkungen Text", "Anmerkungen ..." reference entries in the [Work](https://abcd.acdh-dev.oeaw.ac.at/archiv/work/) table, for example [here](https://abcd.acdh-dev.oeaw.ac.at/archiv/event/detail/185806165):
> 65/129, 264/248, 1131/46 (report erroneously headed »16. Mai«)
Here the number BEFORE the `/` should refer to an entry in the Work table, i.e. `65` -> [Lit.0065](https://abcd.acdh-dev.oeaw.ac.at/archiv/work/detail/48)
ToDo:
Write a function that
* detects, e.g. via regex, all literature references in the entries of the "Event" table
* tries to resolve each such reference to an "Ordnungsnummer" (ordinal number) in the Work table
* returns the result as, e.g., a Python dict
```python
{
"event_id": [
reference_id_1, reference_id_2, reference_id.3
]
}
```
i.e.
```python
{
"185806165": [
"Lit.00065",
"Lit.00264",
"Lit.01131"
],
"185806237": [
"Lit.00065",
"Lit.00034",
"Lit.00213",
],
...
}
```
The data can be downloaded as described [here](https://github.com/acdh-oeaw/abcd-db/wiki)
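A sketch of such a function; the regex and the zero-padding width are assumptions inferred from the examples above:
```python
import re

# Matches "65/129"-style references; the number before "/" is the
# Ordnungsnummer that should resolve to an entry in the Work table.
REF_PATTERN = re.compile(r"(\d+)/\d+")

def resolve_references(events: dict) -> dict:
    """Map each event id to the Work entries ("Lit.00065"-style) it references."""
    return {
        event_id: [f"Lit.{int(n):05d}" for n in REF_PATTERN.findall(text)]
        for event_id, text in events.items()
    }

print(resolve_references({"185806165": "65/129, 264/248, 1131/46"}))
# {'185806165': ['Lit.00065', 'Lit.00264', 'Lit.01131']}
```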
|
1.0
|
parser for references - In the [Events](https://abcd.acdh-dev.oeaw.ac.at/archiv/event/) table, the fields "Anmerkungen Literatur", "Anmerkungen Text", "Anmerkungen ..." reference entries in the [Work](https://abcd.acdh-dev.oeaw.ac.at/archiv/work/) table, for example [here](https://abcd.acdh-dev.oeaw.ac.at/archiv/event/detail/185806165):
> 65/129, 264/248, 1131/46 (report erroneously headed »16. Mai«)
Here the number BEFORE the `/` should refer to an entry in the Work table, i.e. `65` -> [Lit.0065](https://abcd.acdh-dev.oeaw.ac.at/archiv/work/detail/48)
ToDo:
Write a function that
* detects, e.g. via regex, all literature references in the entries of the "Event" table
* tries to resolve each such reference to an "Ordnungsnummer" (ordinal number) in the Work table
* returns the result as, e.g., a Python dict
```python
{
"event_id": [
reference_id_1, reference_id_2, reference_id.3
]
}
```
i.e.
```python
{
"185806165": [
"Lit.00065",
"Lit.00264",
"Lit.01131"
],
"185806237": [
"Lit.00065",
"Lit.00034",
"Lit.00213",
],
...
}
```
The data can be downloaded as described [here](https://github.com/acdh-oeaw/abcd-db/wiki)
|
process
|
parser for references in the table in the fields anmerkungen literatur anmerkungen text anmerkungen entries in the table are referenced for example report erroneously headed mai here the number before the should refer to an entry in the work table i e todo write a function that e g via regex detects all literature references in the entries of the event table tries to resolve each such reference to an ordnungsnummer of the work table returns the result as e g a python dict python event id reference id reference id reference id i e python lit lit lit lit lit lit the data can be downloaded as described
| 1
|
21,264
| 28,438,605,314
|
IssuesEvent
|
2023-04-15 16:19:36
|
sulton-max/profile.todoapp
|
https://api.github.com/repos/sulton-max/profile.todoapp
|
opened
|
Create Authentication Processing Service
|
processing
|
# The Ask
Develop an Authentication Processing Service
# How to Complete this Task
Here are some steps to complete this deliverable.
- Create service contract
- Implement the following features (a contract sketch follows below)
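A minimal sketch of what the contract could look like, expressed as a Python interface; the method names are placeholders, since the feature list is not spelled out in this issue:
```python
from abc import ABC, abstractmethod

class IAuthenticationProcessingService(ABC):
    """Hypothetical contract; the real members depend on the agreed feature list."""

    @abstractmethod
    def authenticate(self, username: str, password: str) -> str:
        """Validate credentials and return an access token."""

    @abstractmethod
    def refresh(self, refresh_token: str) -> str:
        """Exchange a refresh token for a new access token."""
```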
|
1.0
|
Create Authentication Processing Service - # The Ask
Develop an Authentication Processing Service
# How to Complete this Task
Here are some steps to complete this deliverable.
- Create service contract
- Implement the following features
|
process
|
create authentication processing service the ask develop an authentication processing service how to complete this task here s some steps to complete this deliverable create service contract implement following features
| 1
|
7,191
| 10,330,663,762
|
IssuesEvent
|
2019-09-02 15:14:25
|
rubberduck-vba/Rubberduck
|
https://api.github.com/repos/rubberduck-vba/Rubberduck
|
closed
|
Rubberduck stuck on "Pending" Status.. Sub menu disabled
|
bug parse-tree-processing
|

Rubberduck is not working properly.
Sub menus are disabled (greyed out).
The "Pending" button is stuck.
Thank you in advance!
|
1.0
|
Rubberduck stuck on "Pending" Status.. Sub menu disabled -
Rubberduck is not working properly.
Sub menus are disabled (greyed out).
The "Pending" button is stuck.
Thank you in advance!
|
process
|
rubberduck stuck on pending status sub menu disabled rubberduck is not working properly sub menus are disabled greyed out pending button is stuck thank you in advance
| 1
|
2,533
| 5,290,468,357
|
IssuesEvent
|
2017-02-08 19:58:12
|
GoogleCloudPlatform/google-cloud-java
|
https://api.github.com/repos/GoogleCloudPlatform/google-cloud-java
|
closed
|
Convert to manually-triggered release process
|
release process
|
For the last several releases, the release process hasn't run properly in Travis, and I have had to run various components manually. It's really hard to figure out what happened and didn't happen during the release process because there is no messaging. So, I would like to change the release scripts (specifically https://github.com/GoogleCloudPlatform/google-cloud-java/blob/master/utilities/after_success.sh ) so that they are designed to be run from the command line by a team member instead of by Travis. I intend to make the process a one-shot command so that only two steps are required in the happy case: 1. start the script, 2. verify it succeeded.
I am filing this as an issue to get any comments anyone has about this.
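A sketch of the one-shot shape described above, in Python for illustration; the step names and commands are placeholders, not the contents of after_success.sh:
```python
import subprocess
import sys

def run(step: str, cmd: list) -> None:
    """Run one release step with visible messaging, so failures are obvious."""
    print(f"==> {step}: {' '.join(cmd)}")
    subprocess.run(cmd, check=True)

def release() -> None:
    run("build", ["mvn", "clean", "install"])        # placeholder step
    run("deploy", ["mvn", "deploy", "-DskipTests"])  # placeholder step
    print("Release succeeded.")

if __name__ == "__main__":
    try:
        release()
    except subprocess.CalledProcessError as exc:
        sys.exit(f"Release failed at: {exc.cmd}")
```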
|
1.0
|
Convert to manually-triggered release process - For the last several releases, the release process hasn't run properly in Travis, and I have had to run various components manually. It's really hard to figure out what happened and didn't happen during the release process because there is no messaging. So, I would like to change the release scripts (specifically https://github.com/GoogleCloudPlatform/google-cloud-java/blob/master/utilities/after_success.sh ) so that they are designed to be run from the command line by a team member instead of by Travis. I intend to make the process a one-shot command so that only two steps are required in the happy case: 1. start the script, 2. verify it succeeded.
I am filing this as an issue to get any comments anyone has about this.
|
process
|
convert to manually triggered release process for the last several releases the release process hasn t run properly in travis and i have had to run various components manually it s really hard to figure out what happened and didn t happen during the release process because there is no messaging so i would like to change the release scripts specifically so that they are designed to be run from the command line by a team member instead of by travis i intend to make the process a one shot command so that only two steps are required in the happy case start the script verify it succeeded i am filing this as an issue to get any comments anyone has about this
| 1
|
84,650
| 10,551,844,464
|
IssuesEvent
|
2019-10-03 14:06:19
|
kgrzybek/modular-monolith-with-ddd
|
https://api.github.com/repos/kgrzybek/modular-monolith-with-ddd
|
closed
|
IUnitOfWork is a part of domain?
|
design discussion
|
Why is [IUnitOfWork](https://github.com/kgrzybek/modular-monolith-with-ddd/blob/master/src/BuildingBlocks/Domain/IUnitOfWork.cs) part of the domain?
It is used only in the Application parts, and the implementation is in Infrastructure.
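Expressed as a sketch (Python stand-ins for the C# types), the suggestion is that the abstraction could live next to the application code that consumes it, with only the adapter in Infrastructure; `db_context` here is a hypothetical dependency:
```python
from abc import ABC, abstractmethod

# Application layer: the port that command handlers depend on.
class IUnitOfWork(ABC):
    @abstractmethod
    def commit(self) -> int:
        """Persist all pending changes; return the number of affected rows."""

# Infrastructure layer: the adapter that actually talks to the database.
class UnitOfWork(IUnitOfWork):
    def __init__(self, db_context):
        self._db = db_context

    def commit(self) -> int:
        return self._db.save_changes()
```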
|
1.0
|
IUnitOfWork is a part of domain? - Why is [IUnitOfWork](https://github.com/kgrzybek/modular-monolith-with-ddd/blob/master/src/BuildingBlocks/Domain/IUnitOfWork.cs) part of the domain?
It is used only in the Application parts, and the implementation is in Infrastructure.
|
non_process
|
iunitofwork is a part of domain why is a part of domain it is used only in application parts and implementation is in infrastructure
| 0
|
14,279
| 17,259,967,197
|
IssuesEvent
|
2021-07-22 05:46:44
|
ltechkorea/inference_results_v1.0
|
https://api.github.com/repos/ltechkorea/inference_results_v1.0
|
opened
|
[ BUG ] BERT: `generate_engine` failed
|
bug natural language processing
|
<!--
Please add the relevant category to the label.
-->
## **Describe the bug**
> TensorRT Engine build failure
- `make generate_engine` failed.
### **Screenshots or Logs**
If applicable, add screenshots to help explain your problem.
```
time docker run --gpus "device=7" --rm -w /work \
-v /home/jay/work/inference-v1.0/closed/LTechKorea:/work -v /home/jay:/mnt//home/jay \
--cap-add SYS_ADMIN --cap-add SYS_TIME \
-e NVIDIA_VISIBLE_DEVICES=0,1,2,3,4,5,6,7 \
--shm-size=32gb \
-v /etc/timezone:/etc/timezone:ro -v /etc/localtime:/etc/localtime:ro \
--security-opt apparmor=unconfined --security-opt seccomp=unconfined \
--name mlperf-inference-jay-1 -h mlperf-inference-jay --add-host mlperf-inference-jay:127.0.0.1 \
--user 1001:1001 --net host --device /dev/fuse \
-v /opt/data/scratch.mlperf_inference:/opt/data/scratch.mlperf_inference -v /opt/data/Dataset:/opt/data/Dataset \
-e MLPERF_SCRATCH_PATH=/opt/data/scratch.mlperf_inference \
-e HOST_HOSTNAME=ltech-gpu10 \
-e LD_LIBRARY_PATH=:/usr/local/cuda/lib64:/usr/lib/x86_64-linux-gnu:/home/jay/work/inference-v1.0/closed/LTechKorea/build/inference/loadgen/build:/usr/local/cuda-11.1/targets/x86_64-linux/lib/ \
-t \
mlperf-inference:jay make generate_engines RUN_ARGS="--benchmarks=bert --scenarios=Offline --config_ver=default --test_mode=PerformanceOnly --fast"
[2021-07-21 22:35:31,947 __init__.py:255 INFO] Running command: CUDA_VISIBILE_ORDER=PCI_BUS_ID nvidia-smi --query-gpu=gpu_name,pci.device_id,uuid --format=csv
[2021-07-21 22:35:35,848 main.py:701 INFO] Detected System ID: V100S-PCIE-32GBx1
[2021-07-21 22:35:35,860 main.py:529 INFO] Using config files: configs/bert/Offline/config.json
[2021-07-21 22:35:35,861 __init__.py:341 INFO] Parsing config file configs/bert/Offline/config.json ...
[2021-07-21 22:35:35,862 main.py:542 INFO] Processing config "V100S-PCIE-32GBx1_bert_Offline"
[2021-07-21 22:35:35,930 main.py:82 INFO] Building engines for bert benchmark in Offline scenario...
[2021-07-21 22:35:35,931 main.py:102 INFO] Building GPU engine for V100S-PCIE-32GBx1_bert_Offline
[2021-07-21 22:35:38,490 bert_var_seqlen.py:63 INFO] Using workspace size: 7,516,192,768
[2021-07-21 22:35:52,402 __init__.py:255 INFO] Running command: CUDA_VISIBILE_ORDER=PCI_BUS_ID nvidia-smi --query-gpu=gpu_name,pci.device_id,uuid --format=csv
[TensorRT] WARNING: Tensor DataType is determined at build time for tensors not marked as input or output.
Replacing l0_fc_qkv with small-tile GEMM plugin, with fairshare cache size 120.
#assertionsrc/smallTileGEMMPlugin.cu,588
Traceback (most recent call last):
File "code/main.py", line 703, in <module>
main(main_args, system)
File "code/main.py", line 634, in main
launch_handle_generate_engine(*_gen_args, **_gen_kwargs)
File "code/main.py", line 62, in launch_handle_generate_engine
raise RuntimeError("Building engines failed!")
RuntimeError: Building engines failed!
Makefile:613: recipe for target 'generate_engines' failed
make: *** [generate_engines] Error 1
docker run --gpus "device=7" --rm -w /work -v -v /home/jay:/mnt//home/jay 0.07s user 0.04s system 0% cpu 45.377 total
```
## **Expected behavior**
> A clear and concise description of what you expected to happen.
- TensorRT engine builds successfully
## **Possible Solution**
1. 1st solution
2. 2nd solution
## **Additional context**
> Add any other context about the problem here.
- Additional information
- Additional information
|
1.0
|
[ BUG ] BERT: `generate_engine` failed - <!--
Please add the relevant category to the label.
-->
## **Describe the bug**
> TensorRT Engine build failure
- `make generate_engine` failed.
### **Screenshots or Logs**
If applicable, add screenshots to help explain your problem.
```
time docker run --gpus "device=7" --rm -w /work \
-v /home/jay/work/inference-v1.0/closed/LTechKorea:/work -v /home/jay:/mnt//home/jay \
--cap-add SYS_ADMIN --cap-add SYS_TIME \
-e NVIDIA_VISIBLE_DEVICES=0,1,2,3,4,5,6,7 \
--shm-size=32gb \
-v /etc/timezone:/etc/timezone:ro -v /etc/localtime:/etc/localtime:ro \
--security-opt apparmor=unconfined --security-opt seccomp=unconfined \
--name mlperf-inference-jay-1 -h mlperf-inference-jay --add-host mlperf-inference-jay:127.0.0.1 \
--user 1001:1001 --net host --device /dev/fuse \
-v /opt/data/scratch.mlperf_inference:/opt/data/scratch.mlperf_inference -v /opt/data/Dataset:/opt/data/Dataset \
-e MLPERF_SCRATCH_PATH=/opt/data/scratch.mlperf_inference \
-e HOST_HOSTNAME=ltech-gpu10 \
-e LD_LIBRARY_PATH=:/usr/local/cuda/lib64:/usr/lib/x86_64-linux-gnu:/home/jay/work/inference-v1.0/closed/LTechKorea/build/inference/loadgen/build:/usr/local/cuda-11.1/targets/x86_64-linux/lib/ \
-t \
mlperf-inference:jay make generate_engines RUN_ARGS="--benchmarks=bert --scenarios=Offline --config_ver=default --test_mode=PerformanceOnly --fast"
[2021-07-21 22:35:31,947 __init__.py:255 INFO] Running command: CUDA_VISIBILE_ORDER=PCI_BUS_ID nvidia-smi --query-gpu=gpu_name,pci.device_id,uuid --format=csv
[2021-07-21 22:35:35,848 main.py:701 INFO] Detected System ID: V100S-PCIE-32GBx1
[2021-07-21 22:35:35,860 main.py:529 INFO] Using config files: configs/bert/Offline/config.json
[2021-07-21 22:35:35,861 __init__.py:341 INFO] Parsing config file configs/bert/Offline/config.json ...
[2021-07-21 22:35:35,862 main.py:542 INFO] Processing config "V100S-PCIE-32GBx1_bert_Offline"
[2021-07-21 22:35:35,930 main.py:82 INFO] Building engines for bert benchmark in Offline scenario...
[2021-07-21 22:35:35,931 main.py:102 INFO] Building GPU engine for V100S-PCIE-32GBx1_bert_Offline
[2021-07-21 22:35:38,490 bert_var_seqlen.py:63 INFO] Using workspace size: 7,516,192,768
[2021-07-21 22:35:52,402 __init__.py:255 INFO] Running command: CUDA_VISIBILE_ORDER=PCI_BUS_ID nvidia-smi --query-gpu=gpu_name,pci.device_id,uuid --format=csv
[TensorRT] WARNING: Tensor DataType is determined at build time for tensors not marked as input or output.
Replacing l0_fc_qkv with small-tile GEMM plugin, with fairshare cache size 120.
#assertionsrc/smallTileGEMMPlugin.cu,588
Traceback (most recent call last):
File "code/main.py", line 703, in <module>
main(main_args, system)
File "code/main.py", line 634, in main
launch_handle_generate_engine(*_gen_args, **_gen_kwargs)
File "code/main.py", line 62, in launch_handle_generate_engine
raise RuntimeError("Building engines failed!")
RuntimeError: Building engines failed!
Makefile:613: recipe for target 'generate_engines' failed
make: *** [generate_engines] Error 1
docker run --gpus "device=7" --rm -w /work -v -v /home/jay:/mnt//home/jay 0.07s user 0.04s system 0% cpu 45.377 total
```
## **Expected behavior**
> A clear and concise description of what you expected to happen.
- TensorRT engine builds successfully
## **Possible Solution**
1. 1st solution
2. 2nd solution
## **Additional context**
> Add any other context about the problem here.
- Additional information
- Additional information
|
process
|
bert generate engine failed please add the relevant category to the label describe the bug tensorrt engine build failure make generate engine failed screenshots or logs if applicable add screenshots to help explain your problem time docker run gpus device rm w work v home jay work inference closed ltechkorea work v home jay mnt home jay cap add sys admin cap add sys time e nvidia visible devices shm size v etc timezone etc timezone ro v etc localtime etc localtime ro security opt apparmor unconfined security opt seccomp unconfined name mlperf inference jay h mlperf inference jay add host mlperf inference jay user net host device dev fuse v opt data scratch mlperf inference opt data scratch mlperf inference v opt data dataset opt data dataset e mlperf scratch path opt data scratch mlperf inference e host hostname ltech e ld library path usr local cuda usr lib linux gnu home jay work inference closed ltechkorea build inference loadgen build usr local cuda targets linux lib t mlperf inference jay make generate engines run args benchmarks bert scenarios offline config ver default test mode performanceonly fast running command cuda visibile order pci bus id nvidia smi query gpu gpu name pci device id uuid format csv detected system id pcie using config files configs bert offline config json parsing config file configs bert offline config json processing config pcie bert offline building engines for bert benchmark in offline scenario building gpu engine for pcie bert offline using workspace size running command cuda visibile order pci bus id nvidia smi query gpu gpu name pci device id uuid format csv warning tensor datatype is determined at build time for tensors not marked as input or output replacing fc qkv with small tile gemm plugin with fairshare cache size assertionsrc smalltilegemmplugin cu traceback most recent call last file code main py line in main main args system file code main py line in main launch handle generate engine gen args gen kwargs file code main py line in launch handle generate engine raise runtimeerror building engines failed runtimeerror building engines failed makefile recipe for target generate engines failed make error docker run gpus device rm w work v v home jay mnt home jay user system cpu total expected behavior a clear and concise description of what you expected to happen tensorrt engine builds successfully possible solution solution solution additional context add any other context about the problem here additional information additional information
| 1
|
59,707
| 12,013,373,137
|
IssuesEvent
|
2020-04-10 08:42:17
|
home-assistant/brands
|
https://api.github.com/repos/home-assistant/brands
|
closed
|
Philips TV is missing brand images
|
domain-missing has-codeowner
|
## The problem
The Philips TV integration does not have brand images in
this repository.
We recently started this Brands repository, to create a centralized storage of all brand-related images. These images are used on our website and the Home Assistant frontend.
The following images are missing and would ideally be added:
- `src/philips_js/icon.png`
- `src/philips_js/logo.png`
- `src/philips_js/icon@2x.png`
- `src/philips_js/logo@2x.png`
For image specifications and requirements, please see [README.md](https://github.com/home-assistant/brands/blob/master/README.md).
## Updating the documentation repository
Our documentation repository already has a logo for this integration, however, it does not meet the image requirements of this new Brands repository.
If adding images to this repository, please open up a PR to the documentation repository as well, removing the `logo: philips.png` line from this file:
<https://github.com/home-assistant/home-assistant.io/blob/current/source/_integrations/philips_js.markdown>
**Note**: The documentation PR needs to be opened against the `current` branch.
**Note2**: Please leave the actual logo file in the documentation repository. It will be cleaned up differently.
## Additional information
For more information about this repository, read the [README.md](https://github.com/home-assistant/brands/blob/master/README.md) file of this repository. It contains information on how this repository works, and image specification and requirements.
## Codeowner mention
Hi there, @elupus! Mind taking a look at this issue as it is with an integration (philips_js) you are listed as a [codeowner](https://github.com/home-assistant/core/blob/dev/homeassistant/components/philips_js/manifest.json) for? Thanks!
Resolving this issue is not limited to codeowners! If you want to help us out, feel free to resolve this issue! Thanks already!
|
1.0
|
Philips TV is missing brand images -
## The problem
The Philips TV integration does not have brand images in
this repository.
We recently started this Brands repository, to create a centralized storage of all brand-related images. These images are used on our website and the Home Assistant frontend.
The following images are missing and would ideally be added:
- `src/philips_js/icon.png`
- `src/philips_js/logo.png`
- `src/philips_js/icon@2x.png`
- `src/philips_js/logo@2x.png`
For image specifications and requirements, please see [README.md](https://github.com/home-assistant/brands/blob/master/README.md).
## Updating the documentation repository
Our documentation repository already has a logo for this integration, however, it does not meet the image requirements of this new Brands repository.
If adding images to this repository, please open up a PR to the documentation repository as well, removing the `logo: philips.png` line from this file:
<https://github.com/home-assistant/home-assistant.io/blob/current/source/_integrations/philips_js.markdown>
**Note**: The documentation PR needs to be opened against the `current` branch.
**Note2**: Please leave the actual logo file in the documentation repository. It will be cleaned up differently.
## Additional information
For more information about this repository, read the [README.md](https://github.com/home-assistant/brands/blob/master/README.md) file of this repository. It contains information on how this repository works, and image specification and requirements.
## Codeowner mention
Hi there, @elupus! Mind taking a look at this issue as it is with an integration (philips_js) you are listed as a [codeowner](https://github.com/home-assistant/core/blob/dev/homeassistant/components/philips_js/manifest.json) for? Thanks!
Resolving this issue is not limited to codeowners! If you want to help us out, feel free to resolve this issue! Thanks already!
|
non_process
|
philips tv is missing brand images the problem the philips tv integration does not have brand images in this repository we recently started this brands repository to create a centralized storage of all brand related images these images are used on our website and the home assistant frontend the following images are missing and would ideally be added src philips js icon png src philips js logo png src philips js icon png src philips js logo png for image specifications and requirements please see updating the documentation repository our documentation repository already has a logo for this integration however it does not meet the image requirements of this new brands repository if adding images to this repository please open up a pr to the documentation repository as well removing the logo philips png line from this file note the documentation pr needs to be opened against the current branch please leave the actual logo file in the documentation repository it will be cleaned up differently additional information for more information about this repository read the file of this repository it contains information on how this repository works and image specification and requirements codeowner mention hi there elupus mind taking a look at this issue as it is with an integration philips js you are listed as a for thanks resolving this issue is not limited to codeowners if you want to help us out feel free to resolve this issue thanks already
| 0
|
20,794
| 27,540,997,535
|
IssuesEvent
|
2023-03-07 08:33:09
|
ExpertSDR3/ExpertSDR3-BUG-TRACKER
|
https://api.github.com/repos/ExpertSDR3/ExpertSDR3-BUG-TRACKER
|
closed
|
ExpertSDR3 v1.0.3 Beta: audio record playback problem (on Windows 10 with SunSDR2DX)
|
bug in process
|
I detected this bug during the recent ARRL DX SSB contest.
Both TX1 and TX2 are active. TX1 is for run, TX2 is for calling back stations spotted in the dx cluster.
I play recorded CQ message in TX1.
I click on a spot in N1MM or directly in the TX2 band spectrum of the ExpertSDR TX2 spectrum view to put the spot into the N1MM second entry window and jump TX2 to the frequency (SO2R setup).
This action immediately stops the playback of the CQ message in TX1.
Similarly, if I'm speaking through MIC1 on TX1 and click something on the view of TX2, it stops transmitting my voice in TX1.
This is only in SSB mode, on CW it works correctly.
Expected behavior: if I click something in the view of TX2, it mustn't affect the transmission in TX1.
|
1.0
|
ExpertSDR3 v1.0.3 Beta: audio record playback problem (on Windows 10 with SunSDR2DX) - I detected this bug during the recent ARRL DX SSB contest.
Both TX1 and TX2 are active. TX1 is for run, TX2 is for calling back stations spotted in the dx cluster.
I play recorded CQ message in TX1.
I click on a spot in N1MM or directly in the TX2 band spectrum of the ExpertSDR TX2 spectrum view to put the spot into the N1MM second entry window and jump TX2 to the frequency (SO2R setup).
This action immediately stops the playback of the CQ message in TX1.
Similarly, if I'm speaking through MIC1 on TX1 and click something on the view of TX2, it stops transmitting my voice in TX1.
This is only in SSB mode, on CW it works correctly.
Expected behavior: if I click something in the view of TX2, it mustn't affect the transmission in TX1.
|
process
|
beta audio record playback problem on windows with i detected this bug during the recent arrl dx ssb contest both and are active is for run is for calling back stations spotted in the dx cluster i play recorded cq message in i click on a spot in or directly in the band spectrum of the expertsdr spectrum view to put the spot into the second entry window and jump to the frequency setup this action is immediately stops the playback cq message in similarly if i m speaking through on and click something on the view of it stops transmitting my voice in this is only in ssb mode on cw it works correctly expected behavior if i click something in the view of it mustn t affect the transmission in
| 1
|
17,672
| 23,502,403,872
|
IssuesEvent
|
2022-08-18 09:33:34
|
apache/arrow-rs
|
https://api.github.com/repos/apache/arrow-rs
|
closed
|
Unsigned Arrays Fail to Roundtrip Through Parquet
|
bug development-process
|
**Describe the bug**
<!--
A clear and concise description of what the bug is.
-->
Since #1682 the parquet reader will ignore the signedness of the embedded arrow schema and always return unsigned integer arrays. This might be a bug in the writer, as it doesn't appear to be setting the ConvertedType/LogicalType
**To Reproduce**
<!--
Steps to reproduce the behavior:
-->
Attempt to roundtrip an unsigned integer array
**Expected behavior**
<!--
A clear and concise description of what you expected to happen.
-->
The values should roundtrip correctly
**Additional context**
<!--
Add any other context about the problem here.
-->
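For illustration, here is the invariant a roundtrip should preserve, written against pyarrow (the report itself concerns the Rust implementation):
```python
import pyarrow as pa
import pyarrow.parquet as pq

table = pa.table({"x": pa.array([1, 2, 3], type=pa.uint32())})
pq.write_table(table, "roundtrip.parquet")

read_back = pq.read_table("roundtrip.parquet")
# The signedness of the column must survive the roundtrip.
assert read_back.schema.field("x").type == pa.uint32()
```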
|
1.0
|
Unsigned Arrays Fail to Roundtrip Through Parquet - **Describe the bug**
<!--
A clear and concise description of what the bug is.
-->
Since #1682 the parquet reader will ignore the signedness of the embedded arrow schema and always return unsigned integer arrays. This might be a bug in the writer, as it doesn't appear to be setting the ConvertedType/LogicalType
**To Reproduce**
<!--
Steps to reproduce the behavior:
-->
Attempt to roundtrip an unsigned integer array
**Expected behavior**
<!--
A clear and concise description of what you expected to happen.
-->
The values should roundtrip correctly
**Additional context**
<!--
Add any other context about the problem here.
-->
|
process
|
unsigned arrays fail to roundtrip through parquet describe the bug a clear and concise description of what the bug is since the parquet reader will ignore the signedness of the embedded arrow schema and always return unsigned integer arrays this might be a bug in the writer as it doesn t appear to be setting the convertedtype logicaltype to reproduce steps to reproduce the behavior attempt to roundtrip an unsigned integer array expected behavior a clear and concise description of what you expected to happen the values should roundtrip correctly additional context add any other context about the problem here
| 1
|
244,197
| 26,373,668,128
|
IssuesEvent
|
2023-01-11 23:16:41
|
phytomichael/KSA
|
https://api.github.com/repos/phytomichael/KSA
|
opened
|
WS-2016-7112 (Medium) detected in spring-context-3.1.1.RELEASE.jar
|
security vulnerability
|
## WS-2016-7112 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>spring-context-3.1.1.RELEASE.jar</b></p></summary>
<p>Spring Framework Parent</p>
<p>Path to dependency file: /ksa/ksa/ksa-web-root/ksa-web/pom.xml</p>
<p>Path to vulnerable library: /root/.m2/repository/org/springframework/spring-context/3.1.1.RELEASE/spring-context-3.1.1.RELEASE.jar,/root/.m2/repository/org/springframework/spring-context/3.1.1.RELEASE/spring-context-3.1.1.RELEASE.jar,/root/.m2/repository/org/springframework/spring-context/3.1.1.RELEASE/spring-context-3.1.1.RELEASE.jar,/root/.m2/repository/org/springframework/spring-context/3.1.1.RELEASE/spring-context-3.1.1.RELEASE.jar,/root/.m2/repository/org/springframework/spring-context/3.1.1.RELEASE/spring-context-3.1.1.RELEASE.jar,/root/.m2/repository/org/springframework/spring-context/3.1.1.RELEASE/spring-context-3.1.1.RELEASE.jar,/root/.m2/repository/org/springframework/spring-context/3.1.1.RELEASE/spring-context-3.1.1.RELEASE.jar,/root/.m2/repository/org/springframework/spring-context/3.1.1.RELEASE/spring-context-3.1.1.RELEASE.jar,/root/.m2/repository/org/springframework/spring-context/3.1.1.RELEASE/spring-context-3.1.1.RELEASE.jar,/2/repository/org/springframework/spring-context/3.1.1.RELEASE/spring-context-3.1.1.RELEASE.jar,/2/repository/org/springframework/spring-context/3.1.1.RELEASE/spring-context-3.1.1.RELEASE.jar,/ksa/ksa-web-root/ksa-web/target/ROOT/WEB-INF/lib/spring-context-3.1.1.RELEASE.jar,/root/.m2/repository/org/springframework/spring-context/3.1.1.RELEASE/spring-context-3.1.1.RELEASE.jar,/root/.m2/repository/org/springframework/spring-context/3.1.1.RELEASE/spring-context-3.1.1.RELEASE.jar,/root/.m2/repository/org/springframework/spring-context/3.1.1.RELEASE/spring-context-3.1.1.RELEASE.jar,/root/.m2/repository/org/springframework/spring-context/3.1.1.RELEASE/spring-context-3.1.1.RELEASE.jar,/root/.m2/repository/org/springframework/spring-context/3.1.1.RELEASE/spring-context-3.1.1.RELEASE.jar,/root/.m2/repository/org/springframework/spring-context/3.1.1.RELEASE/spring-context-3.1.1.RELEASE.jar,/root/.m2/repository/org/springframework/spring-context/3.1.1.RELEASE/spring-context-3.1.1.RELEASE.jar,/root/.m2/repository/org/springframework/spring-context/3.1.1.RELEASE/spring-context-3.1.1.RELEASE.jar,/root/.m2/repository/org/springframework/spring-context/3.1.1.RELEASE/spring-context-3.1.1.RELEASE.jar,/root/.m2/repository/org/springframework/spring-context/3.1.1.RELEASE/spring-context-3.1.1.RELEASE.jar,/root/.m2/repository/org/springframework/spring-context/3.1.1.RELEASE/spring-context-3.1.1.RELEASE.jar,/root/.m2/repository/org/springframework/spring-context/3.1.1.RELEASE/spring-context-3.1.1.RELEASE.jar,/root/.m2/repository/org/springframework/spring-context/3.1.1.RELEASE/spring-context-3.1.1.RELEASE.jar,/root/.m2/repository/org/springframework/spring-context/3.1.1.RELEASE/spring-context-3.1.1.RELEASE.jar,/root/.m2/repository/org/springframework/spring-context/3.1.1.RELEASE/spring-context-3.1.1.RELEASE.jar,/root/.m2/repository/org/springframework/spring-context/3.1.1.RELEASE/spring-context-3.1.1.RELEASE.jar,/root/.m2/repository/org/springframework/spring-context/3.1.1.RELEASE/spring-context-3.1.1.RELEASE.jar,/root/.m2/repository/org/springframework/spring-context/3.1.1.RELEASE/spring-context-3.1.1.RELEASE.jar,/2/repository/org/springframework/spring-context/3.1.1.RELEASE/spring-context-3.1.1.RELEASE.jar,/2/repository/org/springframework/spring-context/3.1.1.RELEASE/spring-context-3.1.1.RELEASE.jar,/root/.m2/repository/org/springframework/spring-context/3.1.1.RELEASE/spring-context-3.1.1.RELEASE.jar,/root/.m2/repository/org/springframework/spring-context/3.1.1.RELEASE/spring-context-3.1.1.RELEASE.jar,/root/.m2/repository/org/springframework/spring-context/3.1.1.RELEASE/
spring-context-3.1.1.RELEASE.jar,/root/.m2/repository/org/springframework/spring-context/3.1.1.RELEASE/spring-context-3.1.1.RELEASE.jar,/ksa/ksa/ksa-web-root/ksa-web/target/ROOT/WEB-INF/lib/spring-context-3.1.1.RELEASE.jar,/root/.m2/repository/org/springframework/spring-context/3.1.1.RELEASE/spring-context-3.1.1.RELEASE.jar,/root/.m2/repository/org/springframework/spring-context/3.1.1.RELEASE/spring-context-3.1.1.RELEASE.jar,/root/.m2/repository/org/springframework/spring-context/3.1.1.RELEASE/spring-context-3.1.1.RELEASE.jar</p>
<p>
Dependency Hierarchy:
- :x: **spring-context-3.1.1.RELEASE.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/phytomichael/KSA/commit/bda81888904e8e992056fa1c451e21d4d805f2cf">bda81888904e8e992056fa1c451e21d4d805f2cf</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In Spring Framework, versions 3.0.0.RELEASE through 3.2.17.RELEASE, 4.0.0.RELEASE through 4.2.7.RELEASE and 4.3.0.RELEASE through 4.3.1.RELEASE are vulnerable to Stack-based Buffer Overflow, which allows an authenticated attacker to crash the application when giving CronSequenceGenerator a reversed range in the “minutes” or “hours” fields.
<p>Publish Date: 2021-09-23
<p>URL: <a href=https://github.com/spring-projects/spring-framework/commit/e431624e8472b3b53d1a0c4528bf736c612f1fd9>WS-2016-7112</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>4.9</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: High
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2016-07-14</p>
<p>Fix Resolution: 3.2.18.RELEASE</p>
</p>
</details>
<p></p>
***
<!-- REMEDIATE-OPEN-PR-START -->
- [ ] Check this box to open an automated fix PR
<!-- REMEDIATE-OPEN-PR-END -->
|
True
|
WS-2016-7112 (Medium) detected in spring-context-3.1.1.RELEASE.jar - ## WS-2016-7112 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>spring-context-3.1.1.RELEASE.jar</b></p></summary>
<p>Spring Framework Parent</p>
<p>Path to dependency file: /ksa/ksa/ksa-web-root/ksa-web/pom.xml</p>
<p>Path to vulnerable library: /root/.m2/repository/org/springframework/spring-context/3.1.1.RELEASE/spring-context-3.1.1.RELEASE.jar,/2/repository/org/springframework/spring-context/3.1.1.RELEASE/spring-context-3.1.1.RELEASE.jar,/ksa/ksa-web-root/ksa-web/target/ROOT/WEB-INF/lib/spring-context-3.1.1.RELEASE.jar,/ksa/ksa/ksa-web-root/ksa-web/target/ROOT/WEB-INF/lib/spring-context-3.1.1.RELEASE.jar</p>
<p>
Dependency Hierarchy:
- :x: **spring-context-3.1.1.RELEASE.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/phytomichael/KSA/commit/bda81888904e8e992056fa1c451e21d4d805f2cf">bda81888904e8e992056fa1c451e21d4d805f2cf</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In Spring Framework, versions 3.0.0.RELEASE through 3.2.17.RELEASE, 4.0.0.RELEASE through 4.2.7.RELEASE and 4.3.0.RELEASE through 4.3.1.RELEASE are vulnerable to Stack-based Buffer Overflow, which allows an authenticated attacker to crash the application when giving CronSequenceGenerator a reversed range in the “minutes” or “hours” fields.
<p>Publish Date: 2021-09-23
<p>URL: <a href=https://github.com/spring-projects/spring-framework/commit/e431624e8472b3b53d1a0c4528bf736c612f1fd9>WS-2016-7112</a></p>
</p>
</details>
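For illustration, a reversed range here means a cron field whose start is greater than its end, e.g. `55-5` in the minutes field. Below is a minimal Python sketch, assuming wrap-around semantics, of how such a field can be expanded; it is illustrative only and not the actual Spring patch:
```python
def expand_cron_field(field, low, high):
    """Expand a numeric cron field such as "5", "5-10", or a reversed
    range like "55-5" (minutes). Sketch only: wrapping the reversed
    range around the field's domain is one way to handle it; the real
    fix lives in Spring's CronSequenceGenerator.
    """
    if "-" not in field:
        return [int(field)]
    start, end = (int(p) for p in field.split("-"))
    if start <= end:
        return list(range(start, end + 1))
    # Reversed range: wrap around the domain instead of producing an
    # empty or unbounded expansion.
    return list(range(start, high + 1)) + list(range(low, end + 1))

print(expand_cron_field("55-5", 0, 59))  # [55..59] followed by [0..5]
```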
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>4.9</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: High
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2016-07-14</p>
<p>Fix Resolution: 3.2.18.RELEASE</p>
</p>
</details>
<p></p>
***
<!-- REMEDIATE-OPEN-PR-START -->
- [ ] Check this box to open an automated fix PR
<!-- REMEDIATE-OPEN-PR-END -->
|
non_process
|
ws medium detected in spring context release jar ws medium severity vulnerability vulnerable library spring context release jar spring framework parent path to dependency file ksa ksa ksa web root ksa web pom xml path to vulnerable library root repository org springframework spring context release spring context release jar repository org springframework spring context release spring context release jar ksa ksa web root ksa web target root web inf lib spring context release jar ksa ksa ksa web root ksa web target root web inf lib spring context release jar dependency hierarchy x spring context release jar vulnerable library found in head commit a href vulnerability details in spring framework versions release through release release through release and release through release are vulnerable to stack based buffer overflow which allows an authenticated attacker to crash the application when giving cronsequencegenerator a reversed range in the “minutes” or “hours” fields publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required high user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version release date fix resolution release check this box to open an automated fix pr
| 0
|
8,668
| 11,802,639,302
|
IssuesEvent
|
2020-03-18 22:00:46
|
phokz/mod-auth-external
|
https://api.github.com/repos/phokz/mod-auth-external
|
opened
|
Add runtime tests to Travis CI, in addition to our existing build tests
|
process-item
|
We have tests to make sure code changes build properly on Mac, Windows, and Linux.
It would be helpful to add tests for the actual functionality. Example:
- add mod_authnz_external to apache
- set up an example authenticator/authorizer
- set example configuration directives in apache conf files
- run curl on it several times
|
1.0
|
Add runtime tests to Travis CI, in addition to our existing build tests - We have tests to make sure code changes build properly on Mac, Windows, and Linux.
It would be helpful to add tests for the actual functionality. Example:
- add mod_authnz_external to apache
- set up an example authenticator/authorizer
- set example configuration directives in apache conf files
- run curl on it several times
|
process
|
add runtime tests to travis ci in addition to our existing build tests we have tests to make sure code changes build properly on mac windows and linux it would be helpful to add tests for the actual functionality example add mod authnz external to apache set up an example authenticator authorizer set example configuration directives in apache conf files run curl on it several times
| 1
|
1,857
| 4,679,929,645
|
IssuesEvent
|
2016-10-08 00:34:38
|
nodejs/node
|
https://api.github.com/repos/nodejs/node
|
closed
|
child_process.exec fails non-deterministically
|
child_process question
|
node version 6.7
mac OSX
using child_process exec, running this command:
`cd /Users/t_millal/WebstormProjects/blah/existing-dir && git add . && git add -A && git commit -am "temp commit [at]Fri Oct 07 2016 12:50:43 GMT-0700 (PDT)"`
sometimes it succeeds, sometimes it doesn't, for no good reason
my code is:
```
const cp = require('child_process');

function makeGitCommit(cb) {
    // because we will be deleting files...let's just add everything to this commit
    cp.exec('cd ' + __dirname + ' && git add . && git add -A && ' +
        'git commit -am "temp commit [at]' + new Date() + '"', function (err, stdout, stderr) {
            if (String(err).match(/error/i) || String(stdout).match(/error/i) || String(stderr).match(/error/i)) {
                console.error('Error => ', err.stack || err);
                console.error('Stdout => ', stdout);
                console.error('Stderr => ', stderr);
                return cb(err || new Error('Placeholder error...' + stdout + stderr));
            }
            cb(null);
        });
}
```
as you can see, `__dirname` is the Node.js `__dirname`, which almost certainly exists, unless I am deleting my own source code, and I have verified that I am not doing that. So hopefully this convinces you that the directory exists and that it has a .git repo initialized.
here is the error I get:
```
ccsg0sqq32:crucible-poc-discovery t_millal$ node auto-discover-models.js
Error => Error: Command failed: cd /Users/t_millal/WebstormProjects/autodesk/crucible-poc-discovery && git add . && git add -A && git commit -am "temp commit [at]Fri Oct 07 2016 12:50:43 GMT-0700 (PDT)"
at ChildProcess.exithandler (child_process.js:206:12)
at emitTwo (events.js:106:13)
at ChildProcess.emit (events.js:191:7)
at maybeClose (internal/child_process.js:877:16)
at Socket.<anonymous> (internal/child_process.js:334:11)
at emitOne (events.js:96:13)
at Socket.emit (events.js:188:7)
at Pipe._handle.close [as _onclose] (net.js:493:12)
Stdout => On branch master
Your branch is ahead of 'origin/master' by 17 commits.
(use "git push" to publish your local commits)
nothing to commit, working tree clean
Stderr =>
Error: Command failed: cd /Users/t_millal/WebstormProjects/autodesk/crucible-poc-discovery && git add . && git add -A && git commit -am "temp commit [at]Fri Oct 07 2016 12:50:43 GMT-0700 (PDT)"
at ChildProcess.exithandler (child_process.js:206:12)
at emitTwo (events.js:106:13)
at ChildProcess.emit (events.js:191:7)
at maybeClose (internal/child_process.js:877:16)
at Socket.<anonymous> (internal/child_process.js:334:11)
at emitOne (events.js:96:13)
at Socket.emit (events.js:188:7)
at Pipe._handle.close [as _onclose] (net.js:493:12)
```
why would it fail sometimes and succeed sometimes? Any ideas?
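The captured stdout above ("nothing to commit, working tree clean") points at a likely cause: `git commit` exits nonzero when there is nothing to commit, and `child_process.exec` reports any nonzero exit status as an error. A minimal Python sketch demonstrating only that exit-code behavior, not a fix for the JavaScript above:
```python
import subprocess
import tempfile

# git commit exits nonzero when there is nothing to commit, and
# child_process.exec treats any nonzero exit status as a command
# failure. subprocess.run shows the same exit-code behavior.
repo = tempfile.mkdtemp()
subprocess.run(["git", "init", repo], check=True)
result = subprocess.run(
    ["git", "-C", repo, "commit", "-am", "temp commit"],
    capture_output=True, text=True,
)
print(result.returncode)  # nonzero: a clean tree makes commit "fail"
print(result.stdout)      # message wording varies by git version
```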
|
1.0
|
child_process.exec fails non-deterministically - node version 6.7
mac OSX
using child_process exec, running this command:
`cd /Users/t_millal/WebstormProjects/blah/existing-dir && git add . && git add -A && git commit -am "temp commit [at]Fri Oct 07 2016 12:50:43 GMT-0700 (PDT)"`
sometimes it succeeds, sometimes it doesn't, for no good reason
my code is:
```
const cp = require('child_process');

function makeGitCommit(cb) {
    // because we will be deleting files...let's just add everything to this commit
    cp.exec('cd ' + __dirname + ' && git add . && git add -A && ' +
        'git commit -am "temp commit [at]' + new Date() + '"', function (err, stdout, stderr) {
            if (String(err).match(/error/i) || String(stdout).match(/error/i) || String(stderr).match(/error/i)) {
                console.error('Error => ', err.stack || err);
                console.error('Stdout => ', stdout);
                console.error('Stderr => ', stderr);
                return cb(err || new Error('Placeholder error...' + stdout + stderr));
            }
            cb(null);
        });
}
```
as you can see, `__dirname` is the Node.js `__dirname`, which almost certainly exists, unless I am deleting my own source code, and I have verified that I am not doing that. So hopefully this convinces you that the directory exists and that it has a .git repo initialized.
here is the error I get:
```
ccsg0sqq32:crucible-poc-discovery t_millal$ node auto-discover-models.js
Error => Error: Command failed: cd /Users/t_millal/WebstormProjects/autodesk/crucible-poc-discovery && git add . && git add -A && git commit -am "temp commit [at]Fri Oct 07 2016 12:50:43 GMT-0700 (PDT)"
at ChildProcess.exithandler (child_process.js:206:12)
at emitTwo (events.js:106:13)
at ChildProcess.emit (events.js:191:7)
at maybeClose (internal/child_process.js:877:16)
at Socket.<anonymous> (internal/child_process.js:334:11)
at emitOne (events.js:96:13)
at Socket.emit (events.js:188:7)
at Pipe._handle.close [as _onclose] (net.js:493:12)
Stdout => On branch master
Your branch is ahead of 'origin/master' by 17 commits.
(use "git push" to publish your local commits)
nothing to commit, working tree clean
Stderr =>
Error: Command failed: cd /Users/t_millal/WebstormProjects/autodesk/crucible-poc-discovery && git add . && git add -A && git commit -am "temp commit [at]Fri Oct 07 2016 12:50:43 GMT-0700 (PDT)"
at ChildProcess.exithandler (child_process.js:206:12)
at emitTwo (events.js:106:13)
at ChildProcess.emit (events.js:191:7)
at maybeClose (internal/child_process.js:877:16)
at Socket.<anonymous> (internal/child_process.js:334:11)
at emitOne (events.js:96:13)
at Socket.emit (events.js:188:7)
at Pipe._handle.close [as _onclose] (net.js:493:12)
```
why would it fail sometimes and succeed sometimes? Any ideas?
|
process
|
child process exec fails non deterministically node version mac osx using child process exec running this command cd users t millal webstormprojects blah existing dir git add git add a git commit am temp commit fri oct gmt pdt sometimes it succeeds sometimes it doesn t for no good reason my code is function makegitcommit cb because we will be deleting files let s just add everything to this commit cp exec cd dirname git add git add a git commit am temp commit new date function err stdout stderr if string err match error i string stdout match error i string stderr match error i console error error err stack err console error stdout stdout console error stderr stderr return cb err new error placeholder error stdout stderr cb null as you can see dirname is the node js dirname which almost certainly should exist unless i am deleting my own source code and i have verified that i am not doing that so hopefully this convinces you that the directory exists and that it has a git repo inited here is the error i get crucible poc discovery t millal node auto discover models js error error command failed cd users t millal webstormprojects autodesk crucible poc discovery git add git add a git commit am temp commit fri oct gmt pdt at childprocess exithandler child process js at emittwo events js at childprocess emit events js at maybeclose internal child process js at socket internal child process js at emitone events js at socket emit events js at pipe handle close net js stdout on branch master your branch is ahead of origin master by commits use git push to publish your local commits nothing to commit working tree clean stderr error command failed cd users t millal webstormprojects autodesk crucible poc discovery git add git add a git commit am temp commit fri oct gmt pdt at childprocess exithandler child process js at emittwo events js at childprocess emit events js at maybeclose internal child process js at socket internal child process js at emitone events js at socket emit events js at pipe handle close net js why would it fail sometimes and succeed sometimes any ideas
| 1
|
94,400
| 11,869,130,795
|
IssuesEvent
|
2020-03-26 10:24:48
|
PostHog/posthog
|
https://api.github.com/repos/PostHog/posthog
|
closed
|
WIP Built out demo site - potential redesign
|
Design/UX
|
***Thank*** you for your feature request - we love each and every one :)
**Is your feature request related to a problem? Please describe.**
The demo site is useful, but we have seen users run into some problems with its general unintuitiveness, and we want to ease onboarding and encourage its use in production.
**Describe the solution you'd like**
A potential redesign with prebuilt actions (other than pageviews), funnels, users, etc. This should then allow a user to reverse engineer how actions, funnels, and trends are built.
**Describe alternatives you've considered**
Increased video coverage of use cases, or links to the wiki from the site itself (will create a separate issue). We should potentially discuss removal of the demo site if it creates problems, but that would perhaps create a bigger barrier to production use.
|
1.0
|
WIP Built out demo site - potential redesign - ***Thank*** you for your feature request - we love each and every one :)
**Is your feature request related to a problem? Please describe.**
The demo site is useful, but we have seen users run into some problems with its general unintuitiveness, and we want to ease onboarding and encourage its use in production.
**Describe the solution you'd like**
A potential redesign with prebuilt actions (other than pageviews), funnels, users, etc. This should then allow a user to reverse engineer how actions, funnels, and trends are built.
**Describe alternatives you've considered**
Increased video coverage of use cases, or links to the wiki from the site itself (will create a separate issue). We should potentially discuss removal of the demo site if it creates problems, but that would perhaps create a bigger barrier to production use.
|
non_process
|
wip built out demo site potential redesign thank you for your feature request we love each and every one is your feature request related to a problem please describe the demo site is useful but we have seen users run into some problems with it s general unintuitiveness and want to ease onboarding encourage it s use in production describe the solution you d like a potential redesign with prebuilt actions other than pageviews funnels users etc this should then allow a user to reverse engineer how actions funnels and trends are built describe alternatives you ve considered increased video coverage of use cases or links to wiki from site itself will create a separate issue we should potentially discuss removal of demo site if creates problems but perhaps will create a bigger barrier to production use
| 0
|
13,213
| 15,685,099,878
|
IssuesEvent
|
2021-03-25 10:47:51
|
geneontology/go-ontology
|
https://api.github.com/repos/geneontology/go-ontology
|
closed
|
GO:0075202 modulation of symbiont penetration hypha formation for entry into host
|
multi-species process
|
GO:0075202 modulation of symbiont penetration hypha formation for entry into host | GO:0075203 positive regulation of symbiont penetration hypha formation for entry into host
I am wondering why GO:0075202 is 'modulation' rather than 'regulation?
|
1.0
|
GO:0075202 modulation of symbiont penetration hypha formation for entry into host - GO:0075202 modulation of symbiont penetration hypha formation for entry into host | GO:0075203 positive regulation of symbiont penetration hypha formation for entry into host
I am wondering why GO:0075202 is 'modulation' rather than 'regulation'?
|
process
|
go modulation of symbiont penetration hypha formation for entry into host go modulation of symbiont penetration hypha formation for entry into host go positive regulation of symbiont penetration hypha formation for entry into host i am wondering why go is modulation rather than regulation
| 1
|
20,881
| 27,699,045,886
|
IssuesEvent
|
2023-03-14 06:10:50
|
ExpertSDR3/ExpertSDR3-BUG-TRACKER
|
https://api.github.com/repos/ExpertSDR3/ExpertSDR3-BUG-TRACKER
|
closed
|
EESDR3 start up CW filter abnormal
|
bug in process
|
EESDR3 1.0.3 beta
Windows 11 64-bit
SunSDR2DX
When I start EESDR3 up in CW, the initial filter appears to be abnormal. For example, if I had the filter set to 500Hz bandwidth when I launch EESDR3, it shows 500Hz, but the audio suggests that the filter setting is something narrower. If I click on the 500Hz bandwidth button, nothing changes, but if I switch to some other bandwidth such as 250Hz and revert to 500Hz, I can then hear the normal sound of the filter. Once I do this, the problem does not show up again until I shut down EESDR3 and relaunch it. Even using the soft-power on/off cycle will not bring the problem back until a relaunch of EESDR3.
This behaviour has been confirmed by other testers as well.
https://user-images.githubusercontent.com/13930571/222334055-cab06d89-e041-4a2b-9c23-1237810f2d21.mp4
|
1.0
|
EESDR3 start up CW filter abnormal - EESDR3 1.0.3 beta
Windows 11 64-bit
SunSDR2DX
When I start EESDR3 up in CW, the initial filter appears to be abnormal. For example, if I had the filter set to 500Hz bandwidth when I launch EESDR3, it shows 500Hz, but the audio suggests that the filter setting is something narrower. If I click on the 500Hz bandwidth button, nothing changes, but if I switch to some other bandwidth such as 250Hz and revert to 500Hz, I can then hear the normal sound of the filter. Once I do this, the problem does not show up again until I shut down EESDR3 and relaunch it. Even using the soft-power on/off cycle will not bring the problem back until a relaunch of EESDR3.
This behaviour has been confirmed by other testers as well.
https://user-images.githubusercontent.com/13930571/222334055-cab06d89-e041-4a2b-9c23-1237810f2d21.mp4
|
process
|
start up cw filter abnormal beta windows bit when i start up in cw the initial filter appears to be abnormal for example if i had the filter set to bandwidth when i launch it shows but the audio suggest that the filter setting is something narrower if i click on the bandwidth button nothing changes but if i switch to some other bandwidth such as and revert to i can then hear the normal sound of the filter once i do this the problem does not show up until i shut down and re launch it even using the soft power on off cycle will not bring the problem back until a relaunch of this behaviour has been confirmed by other testers as well
| 1
|
272,247
| 20,739,339,866
|
IssuesEvent
|
2022-03-14 16:16:13
|
louking/rrwebapp
|
https://api.github.com/repos/louking/rrwebapp
|
opened
|
docs: add full workflow description of how race result import works
|
documentation
|
* results are imported to managedresult table
* edit participants is used to pick and choose results for which the names aren't exactly matched
* options are given for each matching runner who was a "member" on the race date (including grace period)
* tabulate takes accepted results and puts them into the raceresult table (see the sketch after this list)
* standings uses results from the raceresult table
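A minimal sketch of the tabulate step described above, using plain dicts; the managedresult/raceresult names and the accepted flag follow the issue's wording, while the record fields are assumptions for illustration:
```python
# Plain-dict sketch of the tabulate step: the managedresult/raceresult
# names come from the workflow above; the record fields themselves are
# hypothetical.
managedresult = [
    {"name": "Jane Doe", "time": 1234, "accepted": True},
    {"name": "ambiguous name", "time": 1500, "accepted": False},
]

def tabulate(managed):
    """Copy only accepted results into the raceresult table."""
    return [r for r in managed if r["accepted"]]

raceresult = tabulate(managedresult)
print(raceresult)  # standings would be computed from raceresult only
```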
|
1.0
|
docs: add full workflow description of how race result import works - * results are imported to managedresult table
* edit participants is used to pick and choose results for which the names aren't exactly matched
* options are given for each matching runner who was a "member" on the race date (including grace period)
* tabulate takes accepted results and puts them into the raceresult table
* standings uses results from the raceresult table
|
non_process
|
docs add full workflow description of how race result import works results are imported to managedresult table edit participants is used to pick and choose results for which the names aren t exactly matched options are given for each matching runner who was a member on the race date including grace period tabulate takes accepted results and puts into the raceresult table standings uses results from the raceresult table
| 0
|
435,306
| 12,534,081,701
|
IssuesEvent
|
2020-06-04 18:46:11
|
SETI/pds-opus
|
https://api.github.com/repos/SETI/pds-opus
|
closed
|
Bugs when parsing limit= field
|
A-Bug B-OPUS Django Effort 3 Easy Priority 4 Useful
|
This error occurs once at 2020-05-14 20:11:51.017195.
Traceback (most recent call last):
File "/opus/p3venv/lib/python3.7/site-packages/django/core/handlers/exception.py", line 34, in inner
response = get_response(request)
File "/opus/p3venv/lib/python3.7/site-packages/django/core/handlers/base.py", line 115, in _get_response
response = self.process_exception_by_middleware(e, request)
File "/opus/p3venv/lib/python3.7/site-packages/django/core/handlers/base.py", line 113, in _get_response
response = wrapped_callback(request, *callback_args, **callback_kwargs)
File "/opus/p3venv/lib/python3.7/site-packages/django/views/decorators/cache.py", line 44, in _wrapped_view_func
response = view_func(request, *args, **kwargs)
File "/opus/src/pds-opus/opus/application/apps/results/views.py", line 355, in api_get_data
api_code=api_code)
File "/opus/src/pds-opus/opus/application/apps/results/views.py", line 1258, in get_search_results_chunk
return error_return(404, HTTP404_BAD_LIMIT(limit, request))
NameError: name 'HTTP404_BAD_LIMIT' is not defined
---
Log entries missing
========================
This error occurs once at 2020-05-14 15:44:47.533108.
get_search_results_chunk: Unable to parse limit ""
Internal Server Error: /opus/api/images/thumb.json
Traceback (most recent call last):
File "/opus/src/pds-opus/opus/application/apps/results/views.py", line 1252, in get_search_results_chunk
limit = int(limit)
ValueError: invalid literal for int() with base 10: '""'
During handling of the above exception, another exception occurred:
---
IP: 70.93.137.48
2020-05-14 15:44:47-07:00 /opus/api/images/thumb.json?planet=Uranus&imageType=frame&RINGGEOringcenterphase1=55&RINGGEOringcenterphase2=85&unit-RINGGEOringcenterphase=degrees&cols=opusid,mission,instrument,RINGGEOringcenterphase,time1,wavelength1,wavelength2&limit=%22%22
========================
This error occurs once at 2020-05-14 15:44:59.141622.
get_search_results_chunk: Unable to parse limit all
Internal Server Error: /opus/api/images/thumb.json
Traceback (most recent call last):
File "/opus/src/pds-opus/opus/application/apps/results/views.py", line 1252, in get_search_results_chunk
limit = int(limit)
ValueError: invalid literal for int() with base 10: 'all'
During handling of the above exception, another exception occurred:
---
IP: 70.93.137.48
2020-05-14 15:44:59-07:00 /opus/api/images/thumb.json?planet=Uranus&imageType=frame&RINGGEOringcenterphase1=55&RINGGEOringcenterphase2=85&unit-RINGGEOringcenterphase=degrees&cols=opusid,mission,instrument,RINGGEOringcenterphase,time1,wavelength1,wavelength2&limit=all
========================
This error occurs once at 2020-05-14 15:45:16.095536.
get_search_results_chunk: Unable to parse limit count
Internal Server Error: /opus/api/images/thumb.json
Traceback (most recent call last):
File "/opus/src/pds-opus/opus/application/apps/results/views.py", line 1252, in get_search_results_chunk
limit = int(limit)
ValueError: invalid literal for int() with base 10: 'count'
During handling of the above exception, another exception occurred:
---
IP: 70.93.137.48
2020-05-14 15:45:16-07:00 /opus/api/images/thumb.json?planet=Uranus&imageType=frame&RINGGEOringcenterphase1=55&RINGGEOringcenterphase2=85&unit-RINGGEOringcenterphase=degrees&cols=opusid,mission,instrument,RINGGEOringcenterphase,time1,wavelength1,wavelength2&limit=count
========================
This error occurs once at 2020-05-14 20:11:51.016518.
get_search_results_chunk: Unable to parse limit none
Internal Server Error: /opus/api/data.json
Traceback (most recent call last):
File "/opus/src/pds-opus/opus/application/apps/results/views.py", line 1252, in get_search_results_chunk
limit = int(limit)
ValueError: invalid literal for int() with base 10: 'none'
During handling of the above exception, another exception occurred:
---
Log entries missing
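A minimal sketch of the defensive parsing that would avoid both failure modes logged above, the ValueError on non-numeric input and the NameError from the undefined HTTP404_BAD_LIMIT constant; the helper name, defaults, and messages are hypothetical rather than the project's actual fix:
```python
def parse_limit(raw, default=100, maximum=10000):
    """Parse a user-supplied limit= query value defensively.

    Returns (limit, error): error is None on success, otherwise a
    message suitable for a 400/404 response. Hypothetical helper, not
    the actual pds-opus fix.
    """
    if raw is None or raw == '':
        return default, None
    try:
        limit = int(raw)
    except (TypeError, ValueError):
        return None, f'Bad limit "{raw}": not an integer'
    if not 0 <= limit <= maximum:
        return None, f'Bad limit {limit}: out of range 0..{maximum}'
    return limit, None

# Values seen in the logs above all become clean errors, not crashes:
for raw in ('""', 'all', 'count', 'none', '50'):
    print(raw, parse_limit(raw))
```
Calling something like this before any query work keeps values such as `""`, `all`, `count`, or `none` from raising inside `get_search_results_chunk`.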
|
1.0
|
Bugs when parsing limit= field - This error occurs once at 2020-05-14 20:11:51.017195.
Traceback (most recent call last):
File "/opus/p3venv/lib/python3.7/site-packages/django/core/handlers/exception.py", line 34, in inner
response = get_response(request)
File "/opus/p3venv/lib/python3.7/site-packages/django/core/handlers/base.py", line 115, in _get_response
response = self.process_exception_by_middleware(e, request)
File "/opus/p3venv/lib/python3.7/site-packages/django/core/handlers/base.py", line 113, in _get_response
response = wrapped_callback(request, *callback_args, **callback_kwargs)
File "/opus/p3venv/lib/python3.7/site-packages/django/views/decorators/cache.py", line 44, in _wrapped_view_func
response = view_func(request, *args, **kwargs)
File "/opus/src/pds-opus/opus/application/apps/results/views.py", line 355, in api_get_data
api_code=api_code)
File "/opus/src/pds-opus/opus/application/apps/results/views.py", line 1258, in get_search_results_chunk
return error_return(404, HTTP404_BAD_LIMIT(limit, request))
NameError: name 'HTTP404_BAD_LIMIT' is not defined
---
Log entries missing
========================
This error occurs once at 2020-05-14 15:44:47.533108.
get_search_results_chunk: Unable to parse limit ""
Internal Server Error: /opus/api/images/thumb.json
Traceback (most recent call last):
File "/opus/src/pds-opus/opus/application/apps/results/views.py", line 1252, in get_search_results_chunk
limit = int(limit)
ValueError: invalid literal for int() with base 10: '""'
During handling of the above exception, another exception occurred:
---
IP: 70.93.137.48
2020-05-14 15:44:47-07:00 /opus/api/images/thumb.json?planet=Uranus&imageType=frame&RINGGEOringcenterphase1=55&RINGGEOringcenterphase2=85&unit-RINGGEOringcenterphase=degrees&cols=opusid,mission,instrument,RINGGEOringcenterphase,time1,wavelength1,wavelength2&limit=%22%22
========================
This error occurs once at 2020-05-14 15:44:59.141622.
get_search_results_chunk: Unable to parse limit all
Internal Server Error: /opus/api/images/thumb.json
Traceback (most recent call last):
File "/opus/src/pds-opus/opus/application/apps/results/views.py", line 1252, in get_search_results_chunk
limit = int(limit)
ValueError: invalid literal for int() with base 10: 'all'
During handling of the above exception, another exception occurred:
---
IP: 70.93.137.48
2020-05-14 15:44:59-07:00 /opus/api/images/thumb.json?planet=Uranus&imageType=frame&RINGGEOringcenterphase1=55&RINGGEOringcenterphase2=85&unit-RINGGEOringcenterphase=degrees&cols=opusid,mission,instrument,RINGGEOringcenterphase,time1,wavelength1,wavelength2&limit=all
========================
This error occurs once at 2020-05-14 15:45:16.095536.
get_search_results_chunk: Unable to parse limit count
Internal Server Error: /opus/api/images/thumb.json
Traceback (most recent call last):
File "/opus/src/pds-opus/opus/application/apps/results/views.py", line 1252, in get_search_results_chunk
limit = int(limit)
ValueError: invalid literal for int() with base 10: 'count'
During handling of the above exception, another exception occurred:
---
IP: 70.93.137.48
2020-05-14 15:45:16-07:00 /opus/api/images/thumb.json?planet=Uranus&imageType=frame&RINGGEOringcenterphase1=55&RINGGEOringcenterphase2=85&unit-RINGGEOringcenterphase=degrees&cols=opusid,mission,instrument,RINGGEOringcenterphase,time1,wavelength1,wavelength2&limit=count
========================
This error occurs once at 2020-05-14 20:11:51.016518.
get_search_results_chunk: Unable to parse limit none
Internal Server Error: /opus/api/data.json
Traceback (most recent call last):
File "/opus/src/pds-opus/opus/application/apps/results/views.py", line 1252, in get_search_results_chunk
limit = int(limit)
ValueError: invalid literal for int() with base 10: 'none'
During handling of the above exception, another exception occurred:
---
Log entries missing
|
non_process
|
bugs when parsing limit field this error occurs once at traceback most recent call last file opus lib site packages django core handlers exception py line in inner response get response request file opus lib site packages django core handlers base py line in get response response self process exception by middleware e request file opus lib site packages django core handlers base py line in get response response wrapped callback request callback args callback kwargs file opus lib site packages django views decorators cache py line in wrapped view func response view func request args kwargs file opus src pds opus opus application apps results views py line in api get data api code api code file opus src pds opus opus application apps results views py line in get search results chunk return error return bad limit limit request nameerror name bad limit is not defined log entries missing this error occurs once at get search results chunk unable to parse limit internal server error opus api images thumb json traceback most recent call last file opus src pds opus opus application apps results views py line in get search results chunk limit int limit valueerror invalid literal for int with base during handling of the above exception another exception occurred ip opus api images thumb json planet uranus imagetype frame unit ringgeoringcenterphase degrees cols opusid mission instrument ringgeoringcenterphase limit this error occurs once at get search results chunk unable to parse limit all internal server error opus api images thumb json traceback most recent call last file opus src pds opus opus application apps results views py line in get search results chunk limit int limit valueerror invalid literal for int with base all during handling of the above exception another exception occurred ip opus api images thumb json planet uranus imagetype frame unit ringgeoringcenterphase degrees cols opusid mission instrument ringgeoringcenterphase limit all this error occurs once at get search results chunk unable to parse limit count internal server error opus api images thumb json traceback most recent call last file opus src pds opus opus application apps results views py line in get search results chunk limit int limit valueerror invalid literal for int with base count during handling of the above exception another exception occurred ip opus api images thumb json planet uranus imagetype frame unit ringgeoringcenterphase degrees cols opusid mission instrument ringgeoringcenterphase limit count this error occurs once at get search results chunk unable to parse limit none internal server error opus api data json traceback most recent call last file opus src pds opus opus application apps results views py line in get search results chunk limit int limit valueerror invalid literal for int with base none during handling of the above exception another exception occurred log entries missing
| 0
|
585,710
| 17,515,736,536
|
IssuesEvent
|
2021-08-11 06:16:33
|
GoogleContainerTools/skaffold
|
https://api.github.com/repos/GoogleContainerTools/skaffold
|
closed
|
skaffold init shows a list of warnings for templated image names.
|
kind/bug help wanted good first issue priority/p3 area/logging area/init dogfood/helm
|
Skaffold init shows a bunch of warnings if the image name is templated, which is kind of annoying
```
[kritis:make_changes_to_helm_charts]$skaffold init
WARN[0000] Ignoring image referenced by digest: [gcr.io/kritis-tutorial/java-with-vulnz@sha256:358687cfd3ec8e1dfeb2bf51b5110e4e16f6df71f64fba01986f720b2fcba68a]
WARN[0000] Couldn't parse image [gcr.io/{{ .Project }}/java-with-vulnz@sha256:358687cfd3ec8e1dfeb2bf51b5110e4e16f6df71f64fba01986f720b2fcba68a]: invalid reference format
WARN[0000] Couldn't parse image [gcr.io/{{ .Project }}/java-with-vulnz@sha256:358687cfd3ec8e1dfeb2bf51b5110e4e16f6df71f64fba01986f720b2fcba68a]: invalid reference format
WARN[0000] Couldn't parse image [gcr.io/{{ .Project }}/java-with-vulnz@sha256:358687cfd3ec8e1dfeb2bf51b5110e4e16f6df71f64fba01986f720b2fcba68a]: invalid reference format
WARN[0000] Couldn't parse image [gcr.io/{{ .Project }}/java-with-vulnz@sha256:358687cfd3ec8e1dfeb2bf51b5110e4e16f6df71f64fba01986f720b2fcba68a]: invalid reference format
WARN[0000] Couldn't parse image [gcr.io/{{ .Project }}/java-with-vulnz@sha256:358687cfd3ec8e1dfeb2bf51b5110e4e16f6df71f64fba01986f720b2fcba68a]: invalid reference format
WARN[0000] Couldn't parse image [gcr.io/{{ .Project }}/java-with-vulnz@sha256:358687cfd3ec8e1dfeb2bf51b5110e4e16f6df71f64fba01986f720b2fcba68a]: invalid reference format
WARN[0000] Couldn't parse image [gcr.io/{{ .Project }}/java-with-vulnz@sha256:358687cfd3ec8e1dfeb2bf51b5110e4e16f6df71f64fba01986f720b2fcba68a]: invalid reference format
WARN[0000] Couldn't parse image [gcr.io/{{ .Project }}/nginx-digest-whitelist@sha256:56e0af16f4a9d2401d3f55bc8d214d519f070b5317512c87568603f315a8be72]: invalid reference format
WARN[0000] Couldn't parse image [gcr.io/{{ .Project }}/nginx-digest-whitelist@sha256:56e0af16f4a9d2401d3f55bc8d214d519f070b5317512c87568603f315a8be72]: invalid reference format
WARN[0000] Couldn't parse image [gcr.io/{{ .Project }}/nginx-digest-whitelist:latest]: invalid reference format
WARN[0000] Couldn't parse image [gcr.io/{{ .Project }}/nginx-no-digest-breakglass:latest]: invalid reference format
WARN[0000] Couldn't parse image [gcr.io/{{ .Project }}/nginx-no-digest:latest]: invalid reference format
WARN[0000] Couldn't parse image [gcr.io/{{ .Project }}/acceptable-vulnz@sha256:2a81797428f5cab4592ac423dc3049050b28ffbaa3dd11000da942320f9979b6]: invalid reference format
WARN[0000] Couldn't parse image [gcr.io/{{ .Project }}/acceptable-vulnz@sha256:2a81797428f5cab4592ac423dc3049050b28ffbaa3dd11000da942320f9979b6]: invalid reference format
? Choose the builder to build image gcr.io/kritis-project/kritis-server [Use arrows to move, space to select, type to filter]
> Docker (deploy/Dockerfile)
Docker (deploy/Dockerfile_resolve)
Docker (deploy/kritis-gcb-signer/Dockerfile)
Docker (deploy/kritis-int-test/Dockerfile)
Docker (helm-hooks/Dockerfile)
Docker (helm-release/Dockerfile)
Docker (vendor/golang.org/x/net/http2/Dockerfile)
None (image not built from these sources)
```
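A minimal sketch of one way this noise could be suppressed: skip names containing Go-template placeholders (they can never parse as image references) and warn at most once per distinct image string. Python is used for illustration only; skaffold itself is Go, and the function below is hypothetical:
```python
import re

GO_TEMPLATE = re.compile(r"\{\{.*?\}\}")

def should_warn(image, already_warned):
    """Return True if a parse warning should be emitted for image.

    Hypothetical sketch: templated names are skipped outright, and
    repeated warnings for the same string are deduplicated.
    """
    if GO_TEMPLATE.search(image):
        return False
    if image in already_warned:
        return False
    already_warned.add(image)
    return True

seen = set()
for img in ["gcr.io/{{ .Project }}/java-with-vulnz", "gcr.io/x/y", "gcr.io/x/y"]:
    print(img, should_warn(img, seen))  # False, True, False
```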
|
1.0
|
skaffold init shows a list of warnings for templated image names. - Skaffold init shows a bunch of warnings if the image name is templated, which is kind of annoying
```
[kritis:make_changes_to_helm_charts]$skaffold init
WARN[0000] Ignoring image referenced by digest: [gcr.io/kritis-tutorial/java-with-vulnz@sha256:358687cfd3ec8e1dfeb2bf51b5110e4e16f6df71f64fba01986f720b2fcba68a]
WARN[0000] Couldn't parse image [gcr.io/{{ .Project }}/java-with-vulnz@sha256:358687cfd3ec8e1dfeb2bf51b5110e4e16f6df71f64fba01986f720b2fcba68a]: invalid reference format
WARN[0000] Couldn't parse image [gcr.io/{{ .Project }}/java-with-vulnz@sha256:358687cfd3ec8e1dfeb2bf51b5110e4e16f6df71f64fba01986f720b2fcba68a]: invalid reference format
WARN[0000] Couldn't parse image [gcr.io/{{ .Project }}/java-with-vulnz@sha256:358687cfd3ec8e1dfeb2bf51b5110e4e16f6df71f64fba01986f720b2fcba68a]: invalid reference format
WARN[0000] Couldn't parse image [gcr.io/{{ .Project }}/java-with-vulnz@sha256:358687cfd3ec8e1dfeb2bf51b5110e4e16f6df71f64fba01986f720b2fcba68a]: invalid reference format
WARN[0000] Couldn't parse image [gcr.io/{{ .Project }}/java-with-vulnz@sha256:358687cfd3ec8e1dfeb2bf51b5110e4e16f6df71f64fba01986f720b2fcba68a]: invalid reference format
WARN[0000] Couldn't parse image [gcr.io/{{ .Project }}/java-with-vulnz@sha256:358687cfd3ec8e1dfeb2bf51b5110e4e16f6df71f64fba01986f720b2fcba68a]: invalid reference format
WARN[0000] Couldn't parse image [gcr.io/{{ .Project }}/java-with-vulnz@sha256:358687cfd3ec8e1dfeb2bf51b5110e4e16f6df71f64fba01986f720b2fcba68a]: invalid reference format
WARN[0000] Couldn't parse image [gcr.io/{{ .Project }}/nginx-digest-whitelist@sha256:56e0af16f4a9d2401d3f55bc8d214d519f070b5317512c87568603f315a8be72]: invalid reference format
WARN[0000] Couldn't parse image [gcr.io/{{ .Project }}/nginx-digest-whitelist@sha256:56e0af16f4a9d2401d3f55bc8d214d519f070b5317512c87568603f315a8be72]: invalid reference format
WARN[0000] Couldn't parse image [gcr.io/{{ .Project }}/nginx-digest-whitelist:latest]: invalid reference format
WARN[0000] Couldn't parse image [gcr.io/{{ .Project }}/nginx-no-digest-breakglass:latest]: invalid reference format
WARN[0000] Couldn't parse image [gcr.io/{{ .Project }}/nginx-no-digest:latest]: invalid reference format
WARN[0000] Couldn't parse image [gcr.io/{{ .Project }}/acceptable-vulnz@sha256:2a81797428f5cab4592ac423dc3049050b28ffbaa3dd11000da942320f9979b6]: invalid reference format
WARN[0000] Couldn't parse image [gcr.io/{{ .Project }}/acceptable-vulnz@sha256:2a81797428f5cab4592ac423dc3049050b28ffbaa3dd11000da942320f9979b6]: invalid reference format
? Choose the builder to build image gcr.io/kritis-project/kritis-server [Use arrows to move, space to select, type to filter]
> Docker (deploy/Dockerfile)
Docker (deploy/Dockerfile_resolve)
Docker (deploy/kritis-gcb-signer/Dockerfile)
Docker (deploy/kritis-int-test/Dockerfile)
Docker (helm-hooks/Dockerfile)
Docker (helm-release/Dockerfile)
Docker (vendor/golang.org/x/net/http2/Dockerfile)
None (image not built from these sources)
```
|
non_process
|
skaffold init shows a list of warnings for templated image names skaffold init shows a bunch of warnings if the image name is templated which is kind of annoying skaffold init warn ignoring image referenced by digest warn couldn t parse image invalid reference format warn couldn t parse image invalid reference format warn couldn t parse image invalid reference format warn couldn t parse image invalid reference format warn couldn t parse image invalid reference format warn couldn t parse image invalid reference format warn couldn t parse image invalid reference format warn couldn t parse image invalid reference format warn couldn t parse image invalid reference format warn couldn t parse image invalid reference format warn couldn t parse image invalid reference format warn couldn t parse image invalid reference format warn couldn t parse image invalid reference format warn couldn t parse image invalid reference format choose the builder to build image gcr io kritis project kritis server docker deploy dockerfile docker deploy dockerfile resolve docker deploy kritis gcb signer dockerfile docker deploy kritis int test dockerfile docker helm hooks dockerfile docker helm release dockerfile docker vendor golang org x net dockerfile none image not built from these sources
| 0
|
8,551
| 11,726,836,304
|
IssuesEvent
|
2020-03-10 15:04:51
|
MicrosoftDocs/vsts-docs
|
https://api.github.com/repos/MicrosoftDocs/vsts-docs
|
closed
|
Runtime parameters not working as described
|
Pri1 devops-cicd-process/tech devops/prod doc-bug
|
Please see #6588. This issue was added so readers may find that issue from this page.
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 790318bb-8220-3241-4ca7-73351074492f
* Version Independent ID: db1da9db-3694-779b-17aa-1ed67fcecf86
* Content: [Use runtime and type-safe parameters - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/runtime-parameters?view=azure-devops#feedback)
* Content Source: [docs/pipelines/process/runtime-parameters.md](https://github.com/MicrosoftDocs/vsts-docs/blob/master/docs/pipelines/process/runtime-parameters.md)
* Product: **devops**
* Technology: **devops-cicd-process**
* GitHub Login: @juliakm
* Microsoft Alias: **jukullam**
|
1.0
|
Runtime parameters not working as described -
Please see #6588. This issue was added so readers may find that issue from this page.
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 790318bb-8220-3241-4ca7-73351074492f
* Version Independent ID: db1da9db-3694-779b-17aa-1ed67fcecf86
* Content: [Use runtime and type-safe parameters - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/runtime-parameters?view=azure-devops#feedback)
* Content Source: [docs/pipelines/process/runtime-parameters.md](https://github.com/MicrosoftDocs/vsts-docs/blob/master/docs/pipelines/process/runtime-parameters.md)
* Product: **devops**
* Technology: **devops-cicd-process**
* GitHub Login: @juliakm
* Microsoft Alias: **jukullam**
|
process
|
runtime parameters not working as described please see this issue added so readers my find that issue from this page document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source product devops technology devops cicd process github login juliakm microsoft alias jukullam
| 1
|
14,854
| 18,248,719,764
|
IssuesEvent
|
2021-10-01 22:57:18
|
googleapis/nodejs-document-ai
|
https://api.github.com/repos/googleapis/nodejs-document-ai
|
closed
|
promote library to GA
|
type: process api: documentai
|
Package name: **@google-cloud/documentai**
Current release: **beta**
Proposed release: **GA**
## Instructions
Check the lists below, adding tests / documentation as required. Once all the "required" boxes are ticked, please create a release and close this issue.
## Required
- [ ] 28 days elapsed since last beta release with new API surface
- [ ] Server API is GA
- [ ] Package API is stable, and we can commit to backward compatibility
- [ ] All dependencies are GA
## Optional
- [ ] Most common / important scenarios have descriptive samples
- [ ] Public manual methods have at least one usage sample each (excluding overloads)
- [ ] Per-API README includes a full description of the API
- [ ] Per-API README contains at least one “getting started” sample using the most common API scenario
- [ ] Manual code has been reviewed by API producer
- [ ] Manual code has been reviewed by a DPE responsible for samples
- [ ] 'Client Libraries' page is added to the product documentation in 'APIs & Reference' section of the product's documentation on Cloud Site
|
1.0
|
promote library to GA - Package name: **@google-cloud/documentai**
Current release: **beta**
Proposed release: **GA**
## Instructions
Check the lists below, adding tests / documentation as required. Once all the "required" boxes are ticked, please create a release and close this issue.
## Required
- [ ] 28 days elapsed since last beta release with new API surface
- [ ] Server API is GA
- [ ] Package API is stable, and we can commit to backward compatibility
- [ ] All dependencies are GA
## Optional
- [ ] Most common / important scenarios have descriptive samples
- [ ] Public manual methods have at least one usage sample each (excluding overloads)
- [ ] Per-API README includes a full description of the API
- [ ] Per-API README contains at least one “getting started” sample using the most common API scenario
- [ ] Manual code has been reviewed by API producer
- [ ] Manual code has been reviewed by a DPE responsible for samples
- [ ] 'Client Libraries' page is added to the product documentation in 'APIs & Reference' section of the product's documentation on Cloud Site
|
process
|
promote library to ga package name google cloud documentai current release beta proposed release ga instructions check the lists below adding tests documentation as required once all the required boxes are ticked please create a release and close this issue required days elapsed since last beta release with new api surface server api is ga package api is stable and we can commit to backward compatibility all dependencies are ga optional most common important scenarios have descriptive samples public manual methods have at least one usage sample each excluding overloads per api readme includes a full description of the api per api readme contains at least one “getting started” sample using the most common api scenario manual code has been reviewed by api producer manual code has been reviewed by a dpe responsible for samples client libraries page is added to the product documentation in apis reference section of the product s documentation on cloud site
| 1
|
4,249
| 7,187,161,165
|
IssuesEvent
|
2018-02-02 03:19:02
|
Great-Hill-Corporation/quickBlocks
|
https://api.github.com/repos/Great-Hill-Corporation/quickBlocks
|
closed
|
cacheMan --fix should use --check first to see if the thing needs to be fixed
|
monitors-cacheMan status-inprocess type-enhancement
|
As it is, --fix fixes caches that don't need fixing. Because of all the copying, this takes far longer than simply checking the cache first, which takes almost no time. A check-before-fix flow is sketched below.
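A minimal sketch of that check-before-fix flow in Python; check_cache and rebuild_cache are hypothetical stand-ins for the cacheMan internals (which are C++), not its actual API:
```python
import os

def check_cache(path):
    # Hypothetical stand-in for `cacheMan --check`: a cheap, read-only
    # validation (here just "file exists and is non-empty").
    return os.path.exists(path) and os.path.getsize(path) > 0

def rebuild_cache(path):
    # Hypothetical stand-in for `cacheMan --fix`: the expensive rebuild
    # that does all the copying.
    with open(path, "wb") as handle:
        handle.write(b"rebuilt")

def fix_cache(path):
    # The requested flow: run the fast check first and only rebuild
    # when the check actually fails.
    if check_cache(path):
        return False  # cache is healthy, skip the costly rebuild
    rebuild_cache(path)
    return True
```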
|
1.0
|
cacheMan --fix should use --check first to see if the thing needs to be fixed - As it is, --fix fixes caches that don't need fixing. Because of all the copying, this takes far longer than simply checking the cache first, which takes almost no time.
|
process
|
cacheman fix should use check first to see if the thing needs to be fixed as it is fix fixes caches that don t need fixing this takes way longer than simply checking the cache first which takes almost no time it s because of all the copying
| 1
|
11,178
| 13,957,695,313
|
IssuesEvent
|
2020-10-24 08:11:30
|
alexanderkotsev/geoportal
|
https://api.github.com/repos/alexanderkotsev/geoportal
|
opened
|
MT: Harvesting
|
Geoportal Harvesting process MT - Malta
|
Dear Angelo,
I trust this message finds you well. Can you kindly perform a harvest on the Maltese CSW at your convenience, as we made some changes and need to check the outcome? Thanks in advance for your help, and on behalf of the Maltese Inspire Team we wish you a Happy Christmas and a Prosperous New Year.
Regards,
Rene
|
1.0
|
MT: Harvesting - Dear Angelo,
I trust this message finds you well. Can you kindly perform a harvest on the Maltese CSW at your convenience, as we made some changes and need to check the outcome? Thanks in advance for your help, and on behalf of the Maltese Inspire Team we wish you a Happy Christmas and a Prosperous New Year.
Regards,
Rene
|
process
|
mt harvesting dear angelo i trust this message finds you well can you kindly perform a harvest on the maltese csw at your convenience as we did some changes and need to check the outcome thanks in advance for your help and on behalf of the maltese inspire team we wish you a happy christmas and a prosperous new year regards rene
| 1
|
15,372
| 19,554,642,179
|
IssuesEvent
|
2022-01-03 07:18:01
|
GoogleCloudPlatform/fda-mystudies
|
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
|
closed
|
[android][ios] app behavior when no network connection
|
Feature request P2 iOS Android Process: Track 3
|
Can the app display an error message when there's no network connection (e.g. in airplane mode)? Currently, the app renders as if it's working normally; however, if the user tries to interact with it, the app will appear not to respond. It's confusing for the user. @zohrehj
|
1.0
|
[android][ios] app behavior when no network connection - Can the app display an error message when there's no network connection (e.g. in airplane mode)? Currently, the app renders as if it's working normally; however, if the user tries to interact with it, the app will appear not to respond. It's confusing for the user. @zohrehj
|
process
|
app behavior when no network connection can the app display an error message when there s no network connection e g in airplane mode currently the app renders as if it s working normally however if the user tries to interact with it the app will appear to not respond it s confusing for the user zohrehj
| 1
|
12,339
| 14,882,878,050
|
IssuesEvent
|
2021-01-20 12:32:46
|
qgis/QGIS
|
https://api.github.com/repos/qgis/QGIS
|
closed
|
Different layer input parameter in single mode and batch mode in processing script
|
Bug Feedback Processing
|
For example:
https://gis.stackexchange.com/questions/384625/different-layer-name-in-batchmode-in-qgis-processing-script
The problem is that the layer parameter differs between single mode and batch mode. In single mode, `parameters[input_layer]` is the path of the source layer, but in batch mode it is `layer.name()`
Single mode:
```
QGIS version: 3.10.14-A Coruna
QGIS code revision: 8374282d2a
Qt version: 5.11.2
GDAL version: 3.1.4
GEOS version: 3.8.1-CAPI-1.13.3
PROJ version: Rel. 6.3.2, May 1st, 2020
Processing algorithm…
Algorithm 'My Script' starting…
Input parameters:
{ 'INPUT' : 'dbname=\'podoli_nad_bobruvkou_724271\' ....top.secret....', 'OUTPUT' : 'TEMPORARY_OUTPUT' }
kabelaz_e315491b_f638_465a_9e63_77d21b048262
CRS is EPSG:5514
Execution completed in 0.30 seconds
Results:
{'OUTPUT': 'Output_layer_e95a209a_90c0_49d4_8b89_07cefc930e62'}
Loading resulting layers
Algorithm 'My Script' finished
```
Batch mode:
```
Processing algorithm 1/2…
Algorithm My Script starting…
Input parameters:
{'INPUT': 'Kabeláž',
'OUTPUT': <QgsProcessingOutputLayerDefinition {'sink':C:/Users/001-PC/Desktop/fff1.gpkg, 'createOptions': {}}>}
Kabeláž
CRS is EPSG:5514
Algorithm My Script correctly executed…
Execution completed in 0.32 seconds
Results:
{'OUTPUT': 'C:/Users/001-PC/Desktop/fff1.gpkg'}
Loading resulting layers
Processing algorithm 2/2…
Algorithm My Script starting…
Input parameters:
{'INPUT': 'nosic_sb',
'OUTPUT': <QgsProcessingOutputLayerDefinition {'sink':C:/Users/001-PC/Desktop/fff2.gpkg, 'createOptions': {}}>}
nosic_sb
CRS is
Algorithm My Script correctly executed…
Execution completed in 0.16 seconds
Results:
{'OUTPUT': 'C:/Users/001-PC/Desktop/fff2.gpkg'}
Loading resulting layers
Batch execution completed in 0.59 seconds
```
The difference is between `kabelaz_e315491b_f638_465a_9e63_77d21b048262` and `Kabeláž`.
In QGIS 3.16 this difference does not exist: the values (`parameters['INPUT']`) are equal in single mode and batch mode.
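One possible workaround sketch, assuming the goal is a value that is stable across both modes: resolve the parameter to a layer object and use its source URI instead of the raw `parameters['INPUT']` string. `parameterAsVectorLayer` and `invalidSourceError` are existing QgsProcessingAlgorithm methods; the wrapper itself is hypothetical:
```python
from qgis.core import QgsProcessingException

def resolve_input_layer(alg, parameters, context):
    """Resolve alg.INPUT to a map layer in a mode-independent way.

    Hypothetical helper: parameterAsVectorLayer accepts both a layer
    name (what batch mode passes here) and a source string/path (what
    single mode passes), so downstream code can rely on layer.source()
    rather than on the raw parameters value.
    """
    layer = alg.parameterAsVectorLayer(parameters, alg.INPUT, context)
    if layer is None:
        raise QgsProcessingException(
            alg.invalidSourceError(parameters, alg.INPUT))
    return layer
```
With a helper like this, the pushInfo call in the testing script below could log `layer.source()` and see the same value in single and batch mode.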
Testing script:
```
# -*- coding: utf-8 -*-
"""
***************************************************************************
* *
* This program is free software; you can redistribute it and/or modify *
* it under the terms of the GNU General Public License as published by *
* the Free Software Foundation; either version 2 of the License, or *
* (at your option) any later version. *
* *
***************************************************************************
"""
from qgis.PyQt.QtCore import QCoreApplication
from qgis.core import (QgsProcessing,
                       QgsFeatureSink,
                       QgsProcessingException,
                       QgsProcessingAlgorithm,
                       QgsProcessingParameterFeatureSource,
                       QgsProcessingParameterFeatureSink)
from qgis import processing


class ExampleProcessingAlgorithm(QgsProcessingAlgorithm):
    """
    This is an example algorithm that takes a vector layer and
    creates a new identical one.

    It is meant to be used as an example of how to create your own
    algorithms and explain methods and variables used to do it. An
    algorithm like this will be available in all elements, and there
    is no need for additional work.

    All Processing algorithms should extend the QgsProcessingAlgorithm
    class.
    """

    # Constants used to refer to parameters and outputs. They will be
    # used when calling the algorithm from another algorithm, or when
    # calling from the QGIS console.

    INPUT = 'INPUT'
    OUTPUT = 'OUTPUT'

    def tr(self, string):
        """
        Returns a translatable string with the self.tr() function.
        """
        return QCoreApplication.translate('Processing', string)

    def createInstance(self):
        return ExampleProcessingAlgorithm()

    def name(self):
        """
        Returns the algorithm name, used for identifying the algorithm. This
        string should be fixed for the algorithm, and must not be localised.
        The name should be unique within each provider. Names should contain
        lowercase alphanumeric characters only and no spaces or other
        formatting characters.
        """
        return 'myscript'

    def displayName(self):
        """
        Returns the translated algorithm name, which should be used for any
        user-visible display of the algorithm name.
        """
        return self.tr('My Script')

    def group(self):
        """
        Returns the name of the group this algorithm belongs to. This string
        should be localised.
        """
        return self.tr('Example scripts')

    def groupId(self):
        """
        Returns the unique ID of the group this algorithm belongs to. This
        string should be fixed for the algorithm, and must not be localised.
        The group id should be unique within each provider. Group id should
        contain lowercase alphanumeric characters only and no spaces or other
        formatting characters.
        """
        return 'examplescripts'

    def shortHelpString(self):
        """
        Returns a localised short helper string for the algorithm. This string
        should provide a basic description about what the algorithm does and the
        parameters and outputs associated with it.
        """
        return self.tr("Example algorithm short description")

    def initAlgorithm(self, config=None):
        """
        Here we define the inputs and output of the algorithm, along
        with some other properties.
        """
        # We add the input vector features source. It can have any kind of
        # geometry.
        self.addParameter(
            QgsProcessingParameterFeatureSource(
                self.INPUT,
                self.tr('Input layer'),
                [QgsProcessing.TypeVectorAnyGeometry]
            )
        )

        # We add a feature sink in which to store our processed features (this
        # usually takes the form of a newly created vector layer when the
        # algorithm is run in QGIS).
        self.addParameter(
            QgsProcessingParameterFeatureSink(
                self.OUTPUT,
                self.tr('Output layer')
            )
        )

    def processAlgorithm(self, parameters, context, feedback):
        """
        Here is where the processing itself takes place.
        """
        feedback.pushInfo(str(parameters['INPUT']))

        # Retrieve the feature source and sink. The 'dest_id' variable is used
        # to uniquely identify the feature sink, and must be included in the
        # dictionary returned by the processAlgorithm function.
        source = self.parameterAsSource(
            parameters,
            self.INPUT,
            context
        )

        # If source was not found, throw an exception to indicate that the algorithm
        # encountered a fatal error. The exception text can be any string, but in this
        # case we use the pre-built invalidSourceError method to return a standard
        # helper text for when a source cannot be evaluated
        if source is None:
            raise QgsProcessingException(self.invalidSourceError(parameters, self.INPUT))

        (sink, dest_id) = self.parameterAsSink(
            parameters,
            self.OUTPUT,
            context,
            source.fields(),
            source.wkbType(),
            source.sourceCrs()
        )

        # Send some information to the user
        feedback.pushInfo('CRS is {}'.format(source.sourceCrs().authid()))

        # If sink was not created, throw an exception to indicate that the algorithm
        # encountered a fatal error. The exception text can be any string, but in this
        # case we use the pre-built invalidSinkError method to return a standard
        # helper text for when a sink cannot be evaluated
        if sink is None:
            raise QgsProcessingException(self.invalidSinkError(parameters, self.OUTPUT))

        # Compute the number of steps to display within the progress bar and
        # get features from source
        total = 100.0 / source.featureCount() if source.featureCount() else 0
        features = source.getFeatures()

        for current, feature in enumerate(features):
            # Stop the algorithm if cancel button has been clicked
            if feedback.isCanceled():
                break

            # Add a feature in the sink
            sink.addFeature(feature, QgsFeatureSink.FastInsert)

            # Update the progress bar
            feedback.setProgress(int(current * total))

        # To run another Processing algorithm as part of this algorithm, you can use
        # processing.run(...). Make sure you pass the current context and feedback
        # to processing.run to ensure that all temporary layer outputs are available
        # to the executed algorithm, and that the executed algorithm can send feedback
        # reports to the user (and correctly handle cancellation and progress reports!)
        if False:
            buffered_layer = processing.run("native:buffer", {
                'INPUT': dest_id,
                'DISTANCE': 1.5,
                'SEGMENTS': 5,
                'END_CAP_STYLE': 0,
                'JOIN_STYLE': 0,
                'MITER_LIMIT': 2,
                'DISSOLVE': False,
                'OUTPUT': 'memory:'
            }, context=context, feedback=feedback)['OUTPUT']

        # Return the results of the algorithm. In this case our only result is
        # the feature sink which contains the processed features, but some
        # algorithms may return multiple feature sinks, calculated numeric
        # statistics, etc. These should all be included in the returned
        # dictionary, with keys matching the feature corresponding parameter
        # or output names.
        return {self.OUTPUT: dest_id}
```
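A possible workaround (a sketch of mine, not part of the original report): rather than logging the raw `parameters['INPUT']` value, whose form differs between single and batch mode in QGIS 3.10, resolve the parameter to a layer object and log properties of the resolved layer. `parameterAsVectorLayer()` is part of the `QgsProcessingAlgorithm` API; the snippet below is meant as a drop-in for the start of `processAlgorithm()` in the script above.
```
# Sketch only: drop-in for the start of processAlgorithm() in the script
# above. Resolving the parameter to a layer object yields values that do
# not depend on whether the algorithm runs in single or batch mode.
def processAlgorithm(self, parameters, context, feedback):
    layer = self.parameterAsVectorLayer(parameters, self.INPUT, context)
    if layer is not None:
        # name() and source() come from the resolved QgsMapLayer, so they
        # are stable regardless of how the parameter value was supplied
        feedback.pushInfo('Layer name: {}'.format(layer.name()))
        feedback.pushInfo('Layer source: {}'.format(layer.source()))
    # ... continue with parameterAsSource()/parameterAsSink() as in the
    # original script
```
The script can then be exercised in both modes from the Python console with `processing.run('script:myscript', {...})` (assuming the default `script` provider id) to confirm that the logged values match.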
|
1.0
|
Different layer input parameter in single mode and batch mode in processing script - For example:
https://gis.stackexchange.com/questions/384625/different-layer-name-in-batchmode-in-qgis-processing-script
The problem is the different layer parameter in single mode and batch mode. In single mode `parameters['INPUT']` is the path of the source layer, but in batch mode it is `layer.name()`.
Single mode:
```
QGIS version: 3.10.14-A Coruna
QGIS code revision: 8374282d2a
Qt version: 5.11.2
GDAL version: 3.1.4
GEOS version: 3.8.1-CAPI-1.13.3
PROJ version: Rel. 6.3.2, May 1st, 2020
Processing algorithm…
Algorithm 'My Script' starting…
Input parameters:
{ 'INPUT' : 'dbname=\'podoli_nad_bobruvkou_724271\' ....top.secret....', 'OUTPUT' : 'TEMPORARY_OUTPUT' }
kabelaz_e315491b_f638_465a_9e63_77d21b048262
CRS is EPSG:5514
Execution completed in 0.30 seconds
Results:
{'OUTPUT': 'Output_layer_e95a209a_90c0_49d4_8b89_07cefc930e62'}
Loading resulting layers
Algorithm 'My Script' finished
```
Batch mode:
```
Processing algorithm 1/2…
Algorithm My Script starting…
Input parameters:
{'INPUT': 'Kabeláž',
'OUTPUT': <QgsProcessingOutputLayerDefinition {'sink':C:/Users/001-PC/Desktop/fff1.gpkg, 'createOptions': {}}>}
Kabeláž
CRS is EPSG:5514
Algorithm My Script correctly executed…
Execution completed in 0.32 seconds
Results:
{'OUTPUT': 'C:/Users/001-PC/Desktop/fff1.gpkg'}
Loading resulting layers
Processing algorithm 2/2…
Algorithm My Script starting…
Input parameters:
{'INPUT': 'nosic_sb',
'OUTPUT': <QgsProcessingOutputLayerDefinition {'sink':C:/Users/001-PC/Desktop/fff2.gpkg, 'createOptions': {}}>}
nosic_sb
CRS is
Algorithm My Script correctly executed…
Execution completed in 0.16 seconds
Results:
{'OUTPUT': 'C:/Users/001-PC/Desktop/fff2.gpkg'}
Loading resulting layers
Batch execution completed in 0.59 seconds
```
The difference is between `kabelaz_e315491b_f638_465a_9e63_77d21b048262` and `Kabeláž`.
In QGIS 3.16 this difference does not exist: the values (`parameters['INPUT']`) are equal in single mode and batch mode.
Testing script:
```
# -*- coding: utf-8 -*-
"""
***************************************************************************
* *
* This program is free software; you can redistribute it and/or modify *
* it under the terms of the GNU General Public License as published by *
* the Free Software Foundation; either version 2 of the License, or *
* (at your option) any later version. *
* *
***************************************************************************
"""
from qgis.PyQt.QtCore import QCoreApplication
from qgis.core import (QgsProcessing,
QgsFeatureSink,
QgsProcessingException,
QgsProcessingAlgorithm,
QgsProcessingParameterFeatureSource,
QgsProcessingParameterFeatureSink)
from qgis import processing
class ExampleProcessingAlgorithm(QgsProcessingAlgorithm):
"""
This is an example algorithm that takes a vector layer and
creates a new identical one.
It is meant to be used as an example of how to create your own
algorithms and explain methods and variables used to do it. An
algorithm like this will be available in all elements, and there
is no need for additional work.
All Processing algorithms should extend the QgsProcessingAlgorithm
class.
"""
# Constants used to refer to parameters and outputs. They will be
# used when calling the algorithm from another algorithm, or when
# calling from the QGIS console.
INPUT = 'INPUT'
OUTPUT = 'OUTPUT'
def tr(self, string):
"""
Returns a translatable string with the self.tr() function.
"""
return QCoreApplication.translate('Processing', string)
def createInstance(self):
return ExampleProcessingAlgorithm()
def name(self):
"""
Returns the algorithm name, used for identifying the algorithm. This
string should be fixed for the algorithm, and must not be localised.
The name should be unique within each provider. Names should contain
lowercase alphanumeric characters only and no spaces or other
formatting characters.
"""
return 'myscript'
def displayName(self):
"""
Returns the translated algorithm name, which should be used for any
user-visible display of the algorithm name.
"""
return self.tr('My Script')
def group(self):
"""
Returns the name of the group this algorithm belongs to. This string
should be localised.
"""
return self.tr('Example scripts')
def groupId(self):
"""
Returns the unique ID of the group this algorithm belongs to. This
string should be fixed for the algorithm, and must not be localised.
The group id should be unique within each provider. Group id should
contain lowercase alphanumeric characters only and no spaces or other
formatting characters.
"""
return 'examplescripts'
def shortHelpString(self):
"""
Returns a localised short helper string for the algorithm. This string
should provide a basic description about what the algorithm does and the
parameters and outputs associated with it.
"""
return self.tr("Example algorithm short description")
def initAlgorithm(self, config=None):
"""
Here we define the inputs and output of the algorithm, along
with some other properties.
"""
# We add the input vector features source. It can have any kind of
# geometry.
self.addParameter(
QgsProcessingParameterFeatureSource(
self.INPUT,
self.tr('Input layer'),
[QgsProcessing.TypeVectorAnyGeometry]
)
)
# We add a feature sink in which to store our processed features (this
# usually takes the form of a newly created vector layer when the
# algorithm is run in QGIS).
self.addParameter(
QgsProcessingParameterFeatureSink(
self.OUTPUT,
self.tr('Output layer')
)
)
def processAlgorithm(self, parameters, context, feedback):
"""
Here is where the processing itself takes place.
"""
feedback.pushInfo(str(parameters['INPUT']))
# Retrieve the feature source and sink. The 'dest_id' variable is used
# to uniquely identify the feature sink, and must be included in the
# dictionary returned by the processAlgorithm function.
source = self.parameterAsSource(
parameters,
self.INPUT,
context
)
# If source was not found, throw an exception to indicate that the algorithm
# encountered a fatal error. The exception text can be any string, but in this
# case we use the pre-built invalidSourceError method to return a standard
# helper text for when a source cannot be evaluated
if source is None:
raise QgsProcessingException(self.invalidSourceError(parameters, self.INPUT))
(sink, dest_id) = self.parameterAsSink(
parameters,
self.OUTPUT,
context,
source.fields(),
source.wkbType(),
source.sourceCrs()
)
# Send some information to the user
feedback.pushInfo('CRS is {}'.format(source.sourceCrs().authid()))
# If sink was not created, throw an exception to indicate that the algorithm
# encountered a fatal error. The exception text can be any string, but in this
# case we use the pre-built invalidSinkError method to return a standard
# helper text for when a sink cannot be evaluated
if sink is None:
raise QgsProcessingException(self.invalidSinkError(parameters, self.OUTPUT))
# Compute the number of steps to display within the progress bar and
# get features from source
total = 100.0 / source.featureCount() if source.featureCount() else 0
features = source.getFeatures()
for current, feature in enumerate(features):
# Stop the algorithm if cancel button has been clicked
if feedback.isCanceled():
break
# Add a feature in the sink
sink.addFeature(feature, QgsFeatureSink.FastInsert)
# Update the progress bar
feedback.setProgress(int(current * total))
# To run another Processing algorithm as part of this algorithm, you can use
# processing.run(...). Make sure you pass the current context and feedback
# to processing.run to ensure that all temporary layer outputs are available
# to the executed algorithm, and that the executed algorithm can send feedback
# reports to the user (and correctly handle cancellation and progress reports!)
if False:
buffered_layer = processing.run("native:buffer", {
'INPUT': dest_id,
'DISTANCE': 1.5,
'SEGMENTS': 5,
'END_CAP_STYLE': 0,
'JOIN_STYLE': 0,
'MITER_LIMIT': 2,
'DISSOLVE': False,
'OUTPUT': 'memory:'
}, context=context, feedback=feedback)['OUTPUT']
# Return the results of the algorithm. In this case our only result is
# the feature sink which contains the processed features, but some
# algorithms may return multiple feature sinks, calculated numeric
# statistics, etc. These should all be included in the returned
# dictionary, with keys matching the feature corresponding parameter
# or output names.
return {self.OUTPUT: dest_id}
```
|
process
|
different layer input parameter in singe mode and batchmode in processing script for example problem is in different layer parameter in single mode and batchmode in single mode is parameters path of source layer but in batch mode is layer name single mode qgis version a coruna qgis code revision qt version gdal version geos version capi proj version rel may processing algorithm… algorithm my script starting… input parameters input dbname podoli nad bobruvkou top secret output temporary output kabelaz crs is epsg execution completed in seconds results output output layer loading resulting layers algorithm my script finished batch mode processing algorithm … algorithm my script starting… input parameters input kabeláž output kabeláž crs is epsg algorithm my script correctly executed… execution completed in seconds results output c users pc desktop gpkg loading resulting layers processing algorithm … algorithm my script starting… input parameters input nosic sb output nosic sb crs is algorithm my script correctly executed… execution completed in seconds results output c users pc desktop gpkg loading resulting layers batch execution completed in seconds different is kabelaz and kabeláž in qgis this different not exists values parameter are equal in single and batchmode testing script coding utf this program is free software you can redistribute it and or modify it under the terms of the gnu general public license as published by the free software foundation either version of the license or at your option any later version from qgis pyqt qtcore import qcoreapplication from qgis core import qgsprocessing qgsfeaturesink qgsprocessingexception qgsprocessingalgorithm qgsprocessingparameterfeaturesource qgsprocessingparameterfeaturesink from qgis import processing class exampleprocessingalgorithm qgsprocessingalgorithm this is an example algorithm that takes a vector layer and creates a new identical one it is meant to be used as an example of how to create your own algorithms and explain methods and variables used to do it an algorithm like this will be available in all elements and there is not need for additional work all processing algorithms should extend the qgsprocessingalgorithm class constants used to refer to parameters and outputs they will be used when calling the algorithm from another algorithm or when calling from the qgis console input input output output def tr self string returns a translatable string with the self tr function return qcoreapplication translate processing string def createinstance self return exampleprocessingalgorithm def name self returns the algorithm name used for identifying the algorithm this string should be fixed for the algorithm and must not be localised the name should be unique within each provider names should contain lowercase alphanumeric characters only and no spaces or other formatting characters return myscript def displayname self returns the translated algorithm name which should be used for any user visible display of the algorithm name return self tr my script def group self returns the name of the group this algorithm belongs to this string should be localised return self tr example scripts def groupid self returns the unique id of the group this algorithm belongs to this string should be fixed for the algorithm and must not be localised the group id should be unique within each provider group id should contain lowercase alphanumeric characters only and no spaces or other formatting characters return examplescripts def shorthelpstring self 
returns a localised short helper string for the algorithm this string should provide a basic description about what the algorithm does and the parameters and outputs associated with it return self tr example algorithm short description def initalgorithm self config none here we define the inputs and output of the algorithm along with some other properties we add the input vector features source it can have any kind of geometry self addparameter qgsprocessingparameterfeaturesource self input self tr input layer we add a feature sink in which to store our processed features this usually takes the form of a newly created vector layer when the algorithm is run in qgis self addparameter qgsprocessingparameterfeaturesink self output self tr output layer def processalgorithm self parameters context feedback here is where the processing itself takes place feedback pushinfo str parameters retrieve the feature source and sink the dest id variable is used to uniquely identify the feature sink and must be included in the dictionary returned by the processalgorithm function source self parameterassource parameters self input context if source was not found throw an exception to indicate that the algorithm encountered a fatal error the exception text can be any string but in this case we use the pre built invalidsourceerror method to return a standard helper text for when a source cannot be evaluated if source is none raise qgsprocessingexception self invalidsourceerror parameters self input sink dest id self parameterassink parameters self output context source fields source wkbtype source sourcecrs send some information to the user feedback pushinfo crs is format source sourcecrs authid if sink was not created throw an exception to indicate that the algorithm encountered a fatal error the exception text can be any string but in this case we use the pre built invalidsinkerror method to return a standard helper text for when a sink cannot be evaluated if sink is none raise qgsprocessingexception self invalidsinkerror parameters self output compute the number of steps to display within the progress bar and get features from source total source featurecount if source featurecount else features source getfeatures for current feature in enumerate features stop the algorithm if cancel button has been clicked if feedback iscanceled break add a feature in the sink sink addfeature feature qgsfeaturesink fastinsert update the progress bar feedback setprogress int current total to run another processing algorithm as part of this algorithm you can use processing run make sure you pass the current context and feedback to processing run to ensure that all temporary layer outputs are available to the executed algorithm and that the executed algorithm can send feedback reports to the user and correctly handle cancellation and progress reports if false buffered layer processing run native buffer input dest id distance segments end cap style join style miter limit dissolve false output memory context context feedback feedback return the results of the algorithm in this case our only result is the feature sink which contains the processed features but some algorithms may return multiple feature sinks calculated numeric statistics etc these should all be included in the returned dictionary with keys matching the feature corresponding parameter or output names return self output dest id
| 1
|
15,254
| 19,190,521,317
|
IssuesEvent
|
2021-12-05 22:38:57
|
km4ack/pi-build
|
https://api.github.com/repos/km4ack/pi-build
|
closed
|
YAAC won't install Bullseye SOLVED
|
bug in process
|
The dependency openjdk-8-dbg needs to be changed to openjdk-10-dbg in the install code. See this [line](https://github.com/km4ack/pi-build/blob/master/functions/additional.function#L403).
For now, simply running:
`sudo apt install openjdk-10-dbg`
will allow YAAC to run.
|
1.0
|
YAAC won't install Bullseye SOLVED - The dependency openjdk-8-dbg needs to be changed to openjdk-10-dbg in the install code. See this [line](https://github.com/km4ack/pi-build/blob/master/functions/additional.function#L403).
For now, simply running:
`sudo apt install openjdk-10-dbg`
will allow YAAC to run.
|
process
|
yaac won t install bullseye solved the dependency openjdk dbg needs to be changed to openjdk dbg in the install code see this for now simply running sudo apt install openjdk dbg will allow yaac to run
| 1
|
2,311
| 5,132,756,601
|
IssuesEvent
|
2017-01-11 00:15:31
|
AffiliateWP/AffiliateWP
|
https://api.github.com/repos/AffiliateWP/AffiliateWP
|
reopened
|
Store an affiliate's unpaid earnings in the affiliates_wp_affiliates DB table
|
batch-processing enhancement
|
PR: #1899
The `affiliates_wp_affiliates` database table has an `earnings` column, which holds the total `paid` earnings of an affiliate. It would be useful to also store an affiliate's `unpaid` earnings.
For example, with our leaderboard add-on we can list affiliates based on their earnings. Customers have requested to show only an affiliate's unpaid earnings, or a combination of both unpaid and paid. This could be possible by looping through every referral based on the status, but it would be much easier/more efficient to store this amount in the affiliates table.
I'm sure there are also other use-cases where this would be useful. One that comes to mind is replacing the use of `affwp_get_affiliate_unpaid_earnings()` on the affiliate dashboard. We can get the value directly from the affiliate without having to work it out each time.
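To make the trade-off concrete, here is a minimal illustration in plain Python (using a hypothetical in-memory schema, not AffiliateWP's actual tables or API) of why a stored unpaid-earnings value beats recomputing it from referrals on every request:
```
# Illustration only; 'referrals' and 'affiliates' are hypothetical
# in-memory stand-ins for the database tables.

def unpaid_earnings_by_scan(referrals, affiliate_id):
    # Recomputing on every call scans all referrals: O(n) per lookup.
    return sum(r['amount'] for r in referrals
               if r['affiliate_id'] == affiliate_id
               and r['status'] == 'unpaid')

def unpaid_earnings_stored(affiliates, affiliate_id):
    # With an 'unpaid_earnings' column kept up to date whenever a
    # referral changes status, a lookup is a single row read: O(1).
    return affiliates[affiliate_id]['unpaid_earnings']
```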
|
1.0
|
Store an affiliate's unpaid earnings in the affiliates_wp_affiliates DB table - PR: #1899
The `affiliates_wp_affiliates` database table has an `earnings` column, which holds the total `paid` earnings of an affiliate. It would be useful to also store an affiliate's `unpaid` earnings.
For example, with our leaderboard add-on we can list affiliates based on their earnings. Customers have requested to show only an affiliate's unpaid earnings, or a combination of both unpaid and paid. This could be possible by looping through every referral based on the status, but it would be much easier/more efficient to store this amount in the affiliates table.
I'm sure there are also other use-cases where this would be useful. One that comes to mind is replacing the use of `affwp_get_affiliate_unpaid_earnings()` on the affiliate dashboard. We can get the value directly from the affiliate without having to work it out each time.
|
process
|
store an affiliate s unpaid earnings in the affiliates wp affiliates db table pr the affiliates wp affiliates database table shows an earnings column this is the total paid earnings of an affiliate it would be useful to also store an affiliate s unpaid earnings for example with our leaderboard add on we can list affiliates based on their earnings customers have requested to show only an affiliate s unpaid earnings or a combination of both unpaid and paid this could be possible by looping through every referral based on the status but it would be much easier more efficient to store this amount in the affiliates table i m sure there are also other use cases where this would be useful one that comes to mind is replacing the use of affwp get affiliate unpaid earnings on the affiliate dashboard we can get the value directly from the affiliate without having to work it out each time
| 1
|
182,650
| 14,145,368,747
|
IssuesEvent
|
2020-11-10 17:38:47
|
istio/istio.io
|
https://api.github.com/repos/istio/istio.io
|
closed
|
[istio.io testing] Need test for Secure Gateways
|
area/test and release kind/docs kind/enhancement lifecycle/needs-triage
|
**Describe the feature request**
We need to develop a test for the [Secure Gateways](https://preliminary.istio.io/docs/tasks/traffic-management/ingress/secure-ingress-sds/) task in order to provide automated testing of the Istio.io website for future releases.
Example tests and the documentation for the framework can be found here: https://github.com/istio/istio/tree/master/tests/integration/istioio
A YouTube video describing the usage of the framework can be found here: https://www.youtube.com/watch?v=3y-z8NaVwr0
Please reach out to the Test and Release channel on the [Istio Slack](https://discuss.istio.io/t/istio-slack-channel/1527) or [Test and Release Discuss](https://discuss.istio.io/c/test-and-release) with any questions.
|
1.0
|
[istio.io testing] Need test for Secure Gateways - **Describe the feature request**
We need to develop a test for the [Secure Gateways](https://preliminary.istio.io/docs/tasks/traffic-management/ingress/secure-ingress-sds/) task in order to provide automated testing of the Istio.io website for future releases.
Example tests and the documentation for the framework can be found here: https://github.com/istio/istio/tree/master/tests/integration/istioio
A YouTube video describing the usage of the framework can be found here: https://www.youtube.com/watch?v=3y-z8NaVwr0
Please reach out to the Test and Release channel on the [Istio Slack](https://discuss.istio.io/t/istio-slack-channel/1527) or [Test and Release Discuss](https://discuss.istio.io/c/test-and-release) with any questions.
|
non_process
|
need test for secure gateways describe the feature request we need to develop a test for the task in order to provide automated testing of the istio io website for future releases example tests and the documentation for the framework can be found here a youtube video describing the usage of the framework can be found here please reach out to the test and release channel on the or with any questions
| 0
|
503,895
| 14,601,221,392
|
IssuesEvent
|
2020-12-21 08:19:20
|
hchiam/slides
|
https://api.github.com/repos/hchiam/slides
|
closed
|
snap positions
|
bug enhancement priority
|
**Single line:**
- [x] can snap to title/top position
- [x] can snap to center of slide
- [x] center-align text
**Multi-line:**
- [x] can snap to center of slide
- [ ] ~left-align text~
|
1.0
|
snap positions - **Single line:**
- [x] can snap to title/top position
- [x] can snap to center of slide
- [x] center-align text
**Multi-line:**
- [x] can snap to center of slide
- [ ] ~left-align text~
|
non_process
|
snap positions single line can snap to title top position can snap to center of slide center align text multi line can snap to center of slide left align text
| 0
|
172,020
| 21,031,016,732
|
IssuesEvent
|
2022-03-31 01:01:16
|
Tim-sandbox/WebGoat-8.1
|
https://api.github.com/repos/Tim-sandbox/WebGoat-8.1
|
opened
|
CVE-2022-22950 (Medium) detected in spring-expression-5.3.9.jar
|
security vulnerability
|
## CVE-2022-22950 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>spring-expression-5.3.9.jar</b></p></summary>
<p>Spring Expression Language (SpEL)</p>
<p>Library home page: <a href="https://github.com/spring-projects/spring-framework">https://github.com/spring-projects/spring-framework</a></p>
<p>Path to dependency file: /webgoat-integration-tests/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/springframework/spring-expression/5.3.9/spring-expression-5.3.9.jar,/home/wss-scanner/.m2/repository/org/springframework/spring-expression/5.3.9/spring-expression-5.3.9.jar,/home/wss-scanner/.m2/repository/org/springframework/spring-expression/5.3.9/spring-expression-5.3.9.jar,/home/wss-scanner/.m2/repository/org/springframework/spring-expression/5.3.9/spring-expression-5.3.9.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-validation-2.5.4.jar (Root Library)
- spring-boot-starter-2.5.4.jar
- spring-boot-2.5.4.jar
- spring-context-5.3.9.jar
- :x: **spring-expression-5.3.9.jar** (Vulnerable Library)
<p>Found in base branch: <b>develop</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In Spring Framework versions 5.3.0 - 5.3.16 and older unsupported versions, it is possible for a user to provide a specially crafted SpEL expression that may cause a denial of service condition
<p>Publish Date: 2022-01-11
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-22950>CVE-2022-22950</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.4</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://tanzu.vmware.com/security/cve-2022-22950">https://tanzu.vmware.com/security/cve-2022-22950</a></p>
<p>Release Date: 2022-01-11</p>
<p>Fix Resolution: org.springframework:spring-expression:5.3.17</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"org.springframework","packageName":"spring-expression","packageVersion":"5.3.9","packageFilePaths":["/webgoat-integration-tests/pom.xml"],"isTransitiveDependency":true,"dependencyTree":"org.springframework.boot:spring-boot-starter-validation:2.5.4;org.springframework.boot:spring-boot-starter:2.5.4;org.springframework.boot:spring-boot:2.5.4;org.springframework:spring-context:5.3.9;org.springframework:spring-expression:5.3.9","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.springframework:spring-expression:5.3.17","isBinary":false}],"baseBranches":["develop"],"vulnerabilityIdentifier":"CVE-2022-22950","vulnerabilityDetails":"In Spring Framework versions 5.3.0 - 5.3.16 and older unsupported versions, it is possible for a user to provide a specially crafted SpEL expression that may cause a denial of service condition","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-22950","cvss3Severity":"medium","cvss3Score":"5.4","cvss3Metrics":{"A":"Low","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"Required","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2022-22950 (Medium) detected in spring-expression-5.3.9.jar - ## CVE-2022-22950 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>spring-expression-5.3.9.jar</b></p></summary>
<p>Spring Expression Language (SpEL)</p>
<p>Library home page: <a href="https://github.com/spring-projects/spring-framework">https://github.com/spring-projects/spring-framework</a></p>
<p>Path to dependency file: /webgoat-integration-tests/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/springframework/spring-expression/5.3.9/spring-expression-5.3.9.jar,/home/wss-scanner/.m2/repository/org/springframework/spring-expression/5.3.9/spring-expression-5.3.9.jar,/home/wss-scanner/.m2/repository/org/springframework/spring-expression/5.3.9/spring-expression-5.3.9.jar,/home/wss-scanner/.m2/repository/org/springframework/spring-expression/5.3.9/spring-expression-5.3.9.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-validation-2.5.4.jar (Root Library)
- spring-boot-starter-2.5.4.jar
- spring-boot-2.5.4.jar
- spring-context-5.3.9.jar
- :x: **spring-expression-5.3.9.jar** (Vulnerable Library)
<p>Found in base branch: <b>develop</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In Spring Framework versions 5.3.0 - 5.3.16 and older unsupported versions, it is possible for a user to provide a specially crafted SpEL expression that may cause a denial of service condition
<p>Publish Date: 2022-01-11
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-22950>CVE-2022-22950</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.4</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://tanzu.vmware.com/security/cve-2022-22950">https://tanzu.vmware.com/security/cve-2022-22950</a></p>
<p>Release Date: 2022-01-11</p>
<p>Fix Resolution: org.springframework:spring-expression:5.3.17</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"org.springframework","packageName":"spring-expression","packageVersion":"5.3.9","packageFilePaths":["/webgoat-integration-tests/pom.xml"],"isTransitiveDependency":true,"dependencyTree":"org.springframework.boot:spring-boot-starter-validation:2.5.4;org.springframework.boot:spring-boot-starter:2.5.4;org.springframework.boot:spring-boot:2.5.4;org.springframework:spring-context:5.3.9;org.springframework:spring-expression:5.3.9","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.springframework:spring-expression:5.3.17","isBinary":false}],"baseBranches":["develop"],"vulnerabilityIdentifier":"CVE-2022-22950","vulnerabilityDetails":"In Spring Framework versions 5.3.0 - 5.3.16 and older unsupported versions, it is possible for a user to provide a specially crafted SpEL expression that may cause a denial of service condition","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-22950","cvss3Severity":"medium","cvss3Score":"5.4","cvss3Metrics":{"A":"Low","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"Required","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
|
non_process
|
cve medium detected in spring expression jar cve medium severity vulnerability vulnerable library spring expression jar spring expression language spel library home page a href path to dependency file webgoat integration tests pom xml path to vulnerable library home wss scanner repository org springframework spring expression spring expression jar home wss scanner repository org springframework spring expression spring expression jar home wss scanner repository org springframework spring expression spring expression jar home wss scanner repository org springframework spring expression spring expression jar dependency hierarchy spring boot starter validation jar root library spring boot starter jar spring boot jar spring context jar x spring expression jar vulnerable library found in base branch develop vulnerability details in spring framework versions and older unsupported versions it is possible for a user to provide a specially crafted spel expression that may cause a denial of service condition publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact none integrity impact low availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org springframework spring expression isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree org springframework boot spring boot starter validation org springframework boot spring boot starter org springframework boot spring boot org springframework spring context org springframework spring expression isminimumfixversionavailable true minimumfixversion org springframework spring expression isbinary false basebranches vulnerabilityidentifier cve vulnerabilitydetails in spring framework versions and older unsupported versions it is possible for a user to provide a specially crafted spel expression that may cause a denial of service condition vulnerabilityurl
| 0
|
13,537
| 16,067,677,994
|
IssuesEvent
|
2021-04-23 22:18:52
|
Azure/azure-event-hubs-java
|
https://api.github.com/repos/Azure/azure-event-hubs-java
|
closed
|
Provide a way to checkpoint by partitionId, sequence number or offset
|
EventProcessorHost enhancement
|
## Actual Behavior
1. `PartitionContext` only has 2 public methods, `checkpoint()` and `checkpoint(EventData)`
## Expected Behavior
1. Provide a method `checkpoint(partitionId, offset, sequenceNumber)`
## Versions
- azure-eventhubs-eph 1.0.0
|
1.0
|
Provide a way to checkpoint by partitionId, sequence number or offset - ## Actual Behavior
1. `PartitionContext` only has 2 public methods, `checkpoint()` and `checkpoint(EventData)`
## Expected Behavior
1. Provide a method `checkpoint(partitionId, offset, sequenceNumber)`
## Versions
- azure-eventhubs-eph 1.0.0
|
process
|
provide to a way to checkpoint by partitionid sequence number or offset actual behavior partitioncontext only has public method checkpoint and checkpoint eventdata expected behavior provide method checkpoint partitionid offset sequencenumber versions azure eventhubs eph
| 1
|
14,503
| 17,604,346,964
|
IssuesEvent
|
2021-08-17 15:16:50
|
qgis/QGIS-Documentation
|
https://api.github.com/repos/qgis/QGIS-Documentation
|
closed
|
[feature][processing] Complete random raster algorithm collection #2 (Request in QGIS)
|
Processing Alg 3.14
|
### Request for documentation
From pull request QGIS/qgis#36130
Author: @root676
QGIS version: 3.14
**[feature][processing] Complete random raster algorithm collection #2**
### PR Description:
## Description
This PR refactors and completes the recently added work on **random number raster layer creation algorithms** (see #35835) and reworks the single algorithm implementation proposed in [#36065](https://github.com/qgis/QGIS/pull/36065) to a **base-algorithm solution** which avoids duplicate code.
With the new algorithms aimed towards distribution-based random number creation, QGIS reaches the same level of functionality as current ArcGIS random raster creation tools. In total, the PR adds the following algorithms (the normal and uniform raster layer creation algorithms are refactored to be in line with the new naming scheme):
1. Create random raster layer (binomial distribution)
2. Create random raster layer (exponential distribution)
3. Create random raster layer (gamma distribution)
4. Create random raster layer (negative binomial distribution)
5. Create random raster layer (normal distribution)
6. Create random raster layer (poisson distribution)
7. Create random raster layer (uniform distribution)
Tests have been added for the algorithms data type choices.
I'm looking forward to comments/questions/reviews/etc.!
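For doc writers, the sketch below shows how one of the new algorithms could be invoked from the QGIS Python console. The `native:createrandomnormalrasterlayer` id follows the provider naming used by this PR, but the parameter keys (`EXTENT`, `PIXEL_SIZE`, `MEAN`, `STDDEV` and so on) are assumptions of mine; confirm them with `processing.algorithmHelp()` before documenting.
```
from qgis import processing

# Print the authoritative parameter list first; the keys used below are
# assumptions and may differ from the actual algorithm definition.
processing.algorithmHelp('native:createrandomnormalrasterlayer')

result = processing.run('native:createrandomnormalrasterlayer', {
    'EXTENT': '0,1000,0,1000',    # xmin,xmax,ymin,ymax
    'TARGET_CRS': 'EPSG:3857',
    'PIXEL_SIZE': 10,
    'MEAN': 0.0,                  # assumed key for the distribution mean
    'STDDEV': 1.0,                # assumed key for the standard deviation
    'OUTPUT': 'TEMPORARY_OUTPUT',
})
print(result['OUTPUT'])
```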
### Commits tagged with [need-docs] or [FEATURE]
|
1.0
|
[feature][processing] Complete random raster algorithm collection #2 (Request in QGIS) - ### Request for documentation
From pull request QGIS/qgis#36130
Author: @root676
QGIS version: 3.14
**[feature][processing] Complete random raster algorithm collection #2**
### PR Description:
## Description
This PR refactors and completes the recently added work on **random number raster layer creation algorithms** (see #35835) and reworks the single algorithm implementation proposed in [#36065](https://github.com/qgis/QGIS/pull/36065) to a **base-algorithm solution** which avoids duplicate code.
With the new algorithms aimed towards distribution-based random number creation, QGIS reaches the same level of functionality as current ArcGIS random raster creation tools. In total, the PR adds the following algorithms (the normal and uniform raster layer creation algorithms are refactored to be in line with the new naming scheme):
1. Create random raster layer (binomial distribution)
2. Create random raster layer (exponential distribution)
3. Create random raster layer (gamma distribution)
4. Create random raster layer (negative binomial distribution)
5. Create random raster layer (normal distribution)
6. Create random raster layer (poisson distribution)
7. Create random raster layer (uniform distribution)
Tests have been added for the algorithms data type choices.
I'm looking forward to comments/questions/reviews/etc.!
### Commits tagged with [need-docs] or [FEATURE]
|
process
|
complete random raster algorithm collection request in qgis request for documentation from pull request qgis qgis author qgis version complete random raster algorithm collection pr description description this pr refactors and completes the recently added work on random number raster layer creation algorithms see and reworks the single algorithm implementation proposed in to a base algorithm solution which avoids duplicate code with the new algorithms aimed towards distribution based random number creation qgis reaches to the same level of functionality as current arcgis random raster creation tools in total the pr adds the following algorithms normal and uniform raster layer creation algs are refactored to be in line the new naming scheme create random raster layer binomial distribution create random raster layer exponential distribution create random raster layer gamma distribution create random raster layer negative binomial distribution create random raster layer normal distribution create random raster layer poisson distribution create random raster layer uniform distribution tests have been added for the algorithms data type choices i m looking forward to comments questions reviews etc commits tagged with or
| 1
|
3,984
| 6,912,422,180
|
IssuesEvent
|
2017-11-28 11:55:20
|
elastic/beats
|
https://api.github.com/repos/elastic/beats
|
closed
|
Add timezone to the exported event
|
:Processors enhancement libbeat meta
|
There are a few requests to export the timezone in order to be able to find the local timestamp. For that, we think the best option would be to also export the timezone (optionally) in the event, and make sure the @timestamp is always sent in UTC.
TODOs:
- [x] Create `add_locale` processor to add timezone in the event https://github.com/elastic/beats/pull/3902
- [x] Configure the format of the timezone in the `add_locale` processor. For example `America/Curacao` or `CEST`
|
1.0
|
Add timezone to the exported event - There are a few requests to export the timezone in order to be able to find the local timestamp. For that, we think the best option would be to also export the timezone (optionally) in the event, and make sure the @timestamp is always sent in UTC.
TODOs:
- [x] Create `add_locale` processor to add timezone in the event https://github.com/elastic/beats/pull/3902
- [x] Configure the format of the timezone in the `add_locale` processor. For example `America/Curacao` or `CEST`
|
process
|
add timezone to the exported event there are a few requests to export timezone in order to be able to find the local timestamp for that we think the best option would be to export also the timezone optionally in the event and make sure the timestamp is always sent in utc todos create add locale processor to add timezone in the event configure the format of the timezone in the add locale processor for example america curacao or cest
| 1
|
107,238
| 4,295,378,427
|
IssuesEvent
|
2016-07-19 06:46:36
|
InfiniteFlightAirportEditing/Airports
|
https://api.github.com/repos/InfiniteFlightAirportEditing/Airports
|
closed
|
EPWA
|
Being Redone Priority 2 - 10k+
|
I'm going to do a complete Re-Do of this one. It was requested by a member of the forums. #
|
1.0
|
EPWA - I'm going to do a complete Re-Do of this one. It was requested by a member of the forums. #
|
non_process
|
epwa i m going to do a complete re do of this one it was requested by a member of the forums
| 0
|
7,019
| 10,168,347,674
|
IssuesEvent
|
2019-08-07 20:34:48
|
shirou/gopsutil
|
https://api.github.com/repos/shirou/gopsutil
|
closed
|
undefined: "golang.org/x/sys/windows".PROCESS_QUERY_LIMITED_INFORMATION
|
os:windows package:process
|
**Describe the bug**
Error when compiling with Go 1.12:
`undefined: "golang.org/x/sys/windows".PROCESS_QUERY_LIMITED_INFORMATION`
**To Reproduce**
```go
package main
import _ "github.com/shirou/gopsutil/process"
func main() {}
```
**Expected behavior**
No errors
**Environment (please complete the following information):**
- [X] Windows: `Microsoft Windows [Version 6.1.7601]`
**Notes**
The constant was recently added to golang.org/x/sys (https://github.com/golang/sys/commit/79a91cf218c425e1f4d48cd960839b45dd4d065f), but is not yet available in the current stable version (Go 1.12).
Is this package supposed to work with Go 1.12? Otherwise please close this issue and I'll wait for Go 1.13.
|
1.0
|
undefined: "golang.org/x/sys/windows".PROCESS_QUERY_LIMITED_INFORMATION - **Describe the bug**
Error when compiling with Go 1.12:
`undefined: "golang.org/x/sys/windows".PROCESS_QUERY_LIMITED_INFORMATION`
**To Reproduce**
```go
package main
import _ "github.com/shirou/gopsutil/process"
func main() {}
```
**Expected behavior**
No errors
**Environment (please complete the following information):**
- [X] Windows: `Microsoft Windows [Version 6.1.7601]`
**Notes**
The constant was recently added to golang.org/x/sys (https://github.com/golang/sys/commit/79a91cf218c425e1f4d48cd960839b45dd4d065f), but is not yet available in the current stable version (Go 1.12).
Is this package supposed to work with Go 1.12? Otherwise please close this issue and I'll wait for Go 1.13.
|
process
|
undefined golang org x sys windows process query limited information describe the bug error when compiling with go undefined golang org x sys windows process query limited information to reproduce go package main import github com shirou gopsutil process func main expected behavior no errors environment please complete the following information windows microsoft windows notes the constant was recently added to go but is not yet available in the current stable version go is this package supposed to work with go otherwise please close this issue and i ll wait for go
| 1
|
9,898
| 12,906,110,366
|
IssuesEvent
|
2020-07-15 00:33:15
|
dotnet/runtime
|
https://api.github.com/repos/dotnet/runtime
|
closed
|
Test failure: System.ServiceProcess.Tests.ServiceBaseTests.TestOnStartWithArgsThenStop (expected: 6, Actual: 0)
|
area-System.ServiceProcess blocking-outerloop test bug
|
### Build
https://dev.azure.com/dnceng/public/_build/results?buildId=720801&view=ms.vss-test-web.build-test-results-tab&runId=22327000
### Message
```
System.AggregateException : One or more errors occurred. (Assert.Equal() Failure\r\nExpected: 6\r\nActual: 0) (The operation requested for service '6d2e6215-600a-4f31-a5a8-e6ab01a20de9' has not been completed within the specified time interval.)\r\n---- Assert.Equal() Failure\r\nExpected: 6\r\nActual: 0\r\n---- System.ServiceProcess.TimeoutException : The operation requested for service '6d2e6215-600a-4f31-a5a8-e6ab01a20de9' has not been completed within the specified time interval.
```
### Stack trace
```
----- Inner Stack Trace #1 (Xunit.Sdk.EqualException) -----
at System.ServiceProcess.Tests.ServiceBaseTests.TestOnStartWithArgsThenStop() in /_/src/libraries/System.ServiceProcess.ServiceController/tests/ServiceBaseTests.cs:line 96
----- Inner Stack Trace #2 (System.ServiceProcess.TimeoutException) -----
at System.ServiceProcess.ServiceController.WaitForStatus(ServiceControllerStatus desiredStatus, TimeSpan timeout) in /_/src/libraries/System.ServiceProcess.ServiceController/src/System/ServiceProcess/ServiceController.cs:line 905
at System.ServiceProcess.Tests.TestServiceInstaller.StopService() in /_/src/libraries/System.ServiceProcess.ServiceController/tests/System.ServiceProcess.ServiceController.TestService/TestServiceInstaller.cs:line 154
at System.ServiceProcess.Tests.TestServiceInstaller.RemoveService() in /_/src/libraries/System.ServiceProcess.ServiceController/tests/System.ServiceProcess.ServiceController.TestService/TestServiceInstaller.cs:line 119
at System.ServiceProcess.Tests.TestServiceProvider.DeleteTestServices() in /_/src/libraries/System.ServiceProcess.ServiceController/tests/TestServiceProvider.cs:line 135
at System.ServiceProcess.Tests.ServiceBaseTests.Dispose() in /_/src/libraries/System.ServiceProcess.ServiceController/tests/ServiceBaseTests.cs:line 218
at ReflectionAbstractionExtensions.DisposeTestClass(ITest test, Object testClass, IMessageBus messageBus, ExecutionTimer timer, CancellationTokenSource cancellationTokenSource) in C:\Dev\xunit\xunit\src\xunit.execution\Extensions\ReflectionAbstractionExtensions.cs:line 79
```
|
1.0
|
Test failure: System.ServiceProcess.Tests.ServiceBaseTests.TestOnStartWithArgsThenStop (expected: 6, Actual: 0) - ### Build
https://dev.azure.com/dnceng/public/_build/results?buildId=720801&view=ms.vss-test-web.build-test-results-tab&runId=22327000
### Message
```
System.AggregateException : One or more errors occurred. (Assert.Equal() Failure\r\nExpected: 6\r\nActual: 0) (The operation requested for service '6d2e6215-600a-4f31-a5a8-e6ab01a20de9' has not been completed within the specified time interval.)\r\n---- Assert.Equal() Failure\r\nExpected: 6\r\nActual: 0\r\n---- System.ServiceProcess.TimeoutException : The operation requested for service '6d2e6215-600a-4f31-a5a8-e6ab01a20de9' has not been completed within the specified time interval.
```
### Stack trace
```
----- Inner Stack Trace #1 (Xunit.Sdk.EqualException) -----
at System.ServiceProcess.Tests.ServiceBaseTests.TestOnStartWithArgsThenStop() in /_/src/libraries/System.ServiceProcess.ServiceController/tests/ServiceBaseTests.cs:line 96
----- Inner Stack Trace #2 (System.ServiceProcess.TimeoutException) -----
at System.ServiceProcess.ServiceController.WaitForStatus(ServiceControllerStatus desiredStatus, TimeSpan timeout) in /_/src/libraries/System.ServiceProcess.ServiceController/src/System/ServiceProcess/ServiceController.cs:line 905
at System.ServiceProcess.Tests.TestServiceInstaller.StopService() in /_/src/libraries/System.ServiceProcess.ServiceController/tests/System.ServiceProcess.ServiceController.TestService/TestServiceInstaller.cs:line 154
at System.ServiceProcess.Tests.TestServiceInstaller.RemoveService() in /_/src/libraries/System.ServiceProcess.ServiceController/tests/System.ServiceProcess.ServiceController.TestService/TestServiceInstaller.cs:line 119
at System.ServiceProcess.Tests.TestServiceProvider.DeleteTestServices() in /_/src/libraries/System.ServiceProcess.ServiceController/tests/TestServiceProvider.cs:line 135
at System.ServiceProcess.Tests.ServiceBaseTests.Dispose() in /_/src/libraries/System.ServiceProcess.ServiceController/tests/ServiceBaseTests.cs:line 218
at ReflectionAbstractionExtensions.DisposeTestClass(ITest test, Object testClass, IMessageBus messageBus, ExecutionTimer timer, CancellationTokenSource cancellationTokenSource) in C:\Dev\xunit\xunit\src\xunit.execution\Extensions\ReflectionAbstractionExtensions.cs:line 79
```
|
process
|
test failure system serviceprocess tests servicebasetests testonstartwithargsthenstop expected actual build message system aggregateexception one or more errors occurred assert equal failure r nexpected r nactual the operation requested for service has not been completed within the specified time interval r n assert equal failure r nexpected r nactual r n system serviceprocess timeoutexception the operation requested for service has not been completed within the specified time interval stack trace inner stack trace xunit sdk equalexception at system serviceprocess tests servicebasetests testonstartwithargsthenstop in src libraries system serviceprocess servicecontroller tests servicebasetests cs line inner stack trace system serviceprocess timeoutexception at system serviceprocess servicecontroller waitforstatus servicecontrollerstatus desiredstatus timespan timeout in src libraries system serviceprocess servicecontroller src system serviceprocess servicecontroller cs line at system serviceprocess tests testserviceinstaller stopservice in src libraries system serviceprocess servicecontroller tests system serviceprocess servicecontroller testservice testserviceinstaller cs line at system serviceprocess tests testserviceinstaller removeservice in src libraries system serviceprocess servicecontroller tests system serviceprocess servicecontroller testservice testserviceinstaller cs line at system serviceprocess tests testserviceprovider deletetestservices in src libraries system serviceprocess servicecontroller tests testserviceprovider cs line at system serviceprocess tests servicebasetests dispose in src libraries system serviceprocess servicecontroller tests servicebasetests cs line at reflectionabstractionextensions disposetestclass itest test object testclass imessagebus messagebus executiontimer timer cancellationtokensource cancellationtokensource in c dev xunit xunit src xunit execution extensions reflectionabstractionextensions cs line
| 1
|
217,199
| 16,681,891,233
|
IssuesEvent
|
2021-06-08 01:36:27
|
outline/outline
|
https://api.github.com/repos/outline/outline
|
closed
|
[Development OAuth] Cannot add `http://*` URL as redirect URL
|
documentation
|
**To Reproduce**
Steps to reproduce the behavior:
1. Follow the instructions in the README file.
2. In the Slack OAuth settings, the redirect URL cannot be set.
**Expected behavior**
* Need to update the development instructions.
**Screenshots**

**Outline (please complete the following information):**
- Install: local development
- Version: `main` branch
|
1.0
|
[Development OAuth] Cannot add `http://*` URL as redirect URL - **To Reproduce**
Steps to reproduce the behavior:
1. Follow the instructions in the README file.
2. In the Slack OAuth settings, the redirect URL cannot be set.
**Expected behavior**
* Need to update the development instructions.
**Screenshots**

**Outline (please complete the following information):**
- Install: local development
- Version: `main` branch
|
non_process
|
cannot add url as redirect url to reproduce steps to reproduce the behavior follow the instruction in readme file in slack oauth setting cannot set redirect url expected behavior need to update the development instruction screenshots outline please complete the following information install local development version main branch
| 0